docs: simplify dependencies installation
Adds a `pip install ...` command that installs all necessary dependencies in one step, while retaining the original per-dependency notes. Also adds a quick description of `tqdm`.
commit eeac8732b9
parent 7fe4a099ad

README.md
@@ -11,15 +11,19 @@ Because the code is so simple, it is very easy to hack to your needs, train new
 
 ## install
 
+```
+pip install torch numpy transformers datasets tiktoken wandb tqdm
+```
+
 Dependencies:
 
 - [pytorch](https://pytorch.org) <3
 - [numpy](https://numpy.org/install/) <3
-- `pip install transformers` for huggingface transformers <3 (to load GPT-2 checkpoints)
-- `pip install datasets` for huggingface datasets <3 (if you want to download + preprocess OpenWebText)
-- `pip install tiktoken` for OpenAI's fast BPE code <3
-- `pip install wandb` for optional logging <3
-- `pip install tqdm` <3
+- `transformers` for huggingface transformers <3 (to load GPT-2 checkpoints)
+- `datasets` for huggingface datasets <3 (if you want to download + preprocess OpenWebText)
+- `tiktoken` for OpenAI's fast BPE code <3
+- `wandb` for optional logging <3
+- `tqdm` for progress bars <3
 
 ## quick start
 
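For reference, here is a minimal sanity-check script (not part of this commit) that confirms the dependencies from the new `pip install` line are importable and working. The GPT-2 encoding name and the tiny `tqdm` loop are illustrative assumptions for a quick smoke test, not code from the repo:

```python
# quick post-install sanity check for the dependencies listed above
import torch
import numpy as np
import transformers
import datasets
import tiktoken
import wandb  # optional logging; only imported here, never initialized
from tqdm import tqdm

print("torch", torch.__version__, "| cuda available:", torch.cuda.is_available())
print("numpy", np.__version__)

# tiktoken: round-trip a string through the GPT-2 BPE encoding
enc = tiktoken.get_encoding("gpt2")
tokens = enc.encode("hello world")
assert enc.decode(tokens) == "hello world"
print("gpt2 BPE tokens:", tokens)

# tqdm: the progress bars mentioned in the dependency notes
for _ in tqdm(range(3), desc="smoke test"):
    pass
```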