Dolly 2.0: The World’s First Truly Open Instruction-Tuned LLM


Dolly 2.0 is a 12-billion-parameter language model derived from the EleutherAI Pythia model family.

It was trained exclusively on a new dataset of high-quality, human-generated instruction-following records.

This dataset was crowdsourced from Databricks employees.

Databricks is open-sourcing Dolly 2.0 in its entirety, including the training code, the dataset, and the model weights, all licensed for commercial use.

    This means that any organization can create, own, and customize powerful LLMs that can talk to people, without paying for API access or sharing data with third parties.

Dolly 2.0 is freely available to researchers and developers and has been applied to a variety of natural language processing tasks, such as question answering, text completion, and language translation.
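Because the weights are openly released, the model can be run locally rather than called through a paid API. The sketch below shows one plausible way to load and prompt it, assuming the weights are published on the Hugging Face Hub under the identifier databricks/dolly-v2-12b and that the transformers and torch packages are installed; the exact pipeline options may differ depending on your hardware and library versions.

```python
# Minimal sketch: loading the open Dolly 2.0 weights locally.
# Assumes the Hugging Face Hub model id "databricks/dolly-v2-12b"
# and sufficient GPU memory for a 12B-parameter model.
import torch
from transformers import pipeline

# trust_remote_code is required because the model repository ships a
# custom instruction-following text-generation pipeline with the weights.
generate = pipeline(
    "text-generation",
    model="databricks/dolly-v2-12b",
    torch_dtype=torch.bfloat16,
    trust_remote_code=True,
    device_map="auto",
)

# A question-answering style instruction, one of the tasks the model was tuned for.
response = generate("Explain what instruction tuning is in two sentences.")
print(response[0]["generated_text"])
```

Because no data leaves the machine running the model, this pattern also illustrates the point above about not sharing data with third parties.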
