mirror of https://github.com/osmarks/nanogpt-experiments.git synced 2024-11-10 20:09:58 +00:00

add note about windows and pytorch 2.0 and torch compile in general

This commit is contained in:
Andrej Karpathy 2023-01-12 02:17:52 +00:00
parent bb49751439
commit 7f51d17977


@@ -153,6 +153,10 @@ Results
- Actually reproduce GPT-2 results and have clean configs that reproduce the result. It was estimated ~3 years ago that the training cost of 1.5B model was ~$50K (?). Sounds a bit too high.
# acknowledgements
## troubleshooting
- Note that by default this repo uses PyTorch 2.0 (i.e. `torch.compile`). This is fairly new and experimental, and not yet available on all platforms (e.g. Windows). If you're running into related error messages, try disabling it by adding the `--compile=False` flag. This will slow down the code but at least it will run.
## acknowledgements
Thank you [Lambda labs](https://lambdalabs.com) for supporting the training costs of nanoGPT experiments.