mirror of https://github.com/osmarks/nanogpt-experiments.git, synced 2024-11-10 20:09:58 +00:00
experimenting with badges, and discord link to start specifically. issues sometimes feel a little too heavy
This commit is contained in:
parent 7f74652843
commit 6dab32c003
@@ -1,6 +1,8 @@
 # nanoGPT
 
+[![](https://dcbadge.vercel.app/api/server/3zy8kqD9Cp?compact=true&style=flat)](https://discord.gg/3zy8kqD9Cp)
+
 ![nanoGPT](assets/nanogpt.jpg)
 
 The simplest, fastest repository for training/finetuning medium-sized GPTs. It is a rewrite of [minGPT](https://github.com/karpathy/minGPT) that prioritizes teeth over education. Still under active development, but currently the file `train.py` reproduces GPT-2 (124M) on OpenWebText, running on a single 8XA100 40GB node in 38 hours of training. The code itself is plain and readable: `train.py` is a ~300-line boilerplate training loop and `model.py` a ~300-line GPT model definition, which can optionally load the GPT-2 weights from OpenAI. That's it.
 
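The README paragraph in this diff mentions that `model.py` can optionally load the GPT-2 weights from OpenAI. A minimal sketch of what that looks like in practice, assuming nanoGPT's `GPT.from_pretrained` classmethod and its `generate` helper (both require `torch`, and the weight download goes through HuggingFace `transformers`; exact signatures may differ across revisions of the repo):

```python
# Minimal sketch: load the OpenAI GPT-2 (124M) weights via nanoGPT's model.py
# and sample a few tokens. Assumes model.py is on the import path and that
# torch + transformers are installed (from_pretrained fetches the checkpoint
# through HuggingFace).
import torch
from model import GPT

model = GPT.from_pretrained('gpt2')  # the 124M-parameter checkpoint
model.eval()

idx = torch.zeros((1, 1), dtype=torch.long)  # dummy start token (id 0)
out = model.generate(idx, max_new_tokens=20, temperature=0.8, top_k=200)
print(out.shape)  # torch.Size([1, 21])
```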