GPT-NeoX-20B: An Open-Source Autoregressive Language Model

Resource type
Preprint
Authors/contributors
Sid Black, Stella Biderman, Eric Hallahan, Quentin Anthony, Leo Gao, Laurence Golding, Horace He, Connor Leahy, Kyle McDonell, Jason Phang, Michael Pieler, USVSN Sai Prashanth, Shivanshu Purohit, Laria Reynolds, Jonathan Tow, Ben Wang, Samuel Weinbach
Title
GPT-NeoX-20B: An Open-Source Autoregressive Language Model
Abstract
We introduce GPT-NeoX-20B, a 20 billion parameter autoregressive language model trained on the Pile, whose weights will be made freely and openly available to the public through a permissive license. It is, to the best of our knowledge, the largest dense autoregressive model that has publicly available weights at the time of submission. In this work, we describe GPT-NeoX-20B's architecture and training and evaluate its performance on a range of language-understanding, mathematics, and knowledge-based tasks. We find that GPT-NeoX-20B is a particularly powerful few-shot reasoner and gains far more in performance when evaluated five-shot than similarly sized GPT-3 and FairSeq models. We open-source the training and evaluation code, as well as the model weights, at https://github.com/EleutherAI/gpt-neox.
Repository
arXiv
Archive ID
arXiv:2204.06745
Date
2022-04-14
Accessed
2024-02-24, 17:43
Short Title
GPT-NeoX-20B
Library Catalogue
Extra
arXiv:2204.06745 [cs]
Citation
Black, S., Biderman, S., Hallahan, E., Anthony, Q., Gao, L., Golding, L., He, H., Leahy, C., McDonell, K., Phang, J., Pieler, M., Prashanth, U. S., Purohit, S., Reynolds, L., Tow, J., Wang, B., & Weinbach, S. (2022). GPT-NeoX-20B: An Open-Source Autoregressive Language Model (arXiv:2204.06745). arXiv. https://doi.org/10.48550/arXiv.2204.06745
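Note
The abstract above states that the model weights are released openly. As a minimal sketch of how one might load them, the snippet below uses the Hugging Face transformers library; the "EleutherAI/gpt-neox-20b" identifier, the half-precision setting, and the toy few-shot prompt are assumptions for illustration, not details taken from this record.

# Minimal sketch (assumptions noted above): load the released GPT-NeoX-20B
# checkpoint via Hugging Face transformers and run a short few-shot prompt.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "EleutherAI/gpt-neox-20b"  # assumed Hub identifier for the released weights

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # fp16 roughly halves memory use (~40 GB for 20B params)
    device_map="auto",          # requires the `accelerate` package; spreads layers across devices
)

# A small few-shot style prompt, echoing the five-shot evaluation setting mentioned in the abstract.
prompt = "Q: What is 7 + 5?\nA: 12\nQ: What is 9 + 3?\nA:"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=5, do_sample=False)
print(tokenizer.decode(output[0], skip_special_tokens=True))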