nanogpt-experiments/config
Mirror of https://github.com/osmarks/nanogpt-experiments.git, synced 2024-12-21 07:30:28 +00:00
eval_gpt2_large.py, eval_gpt2_medium.py, eval_gpt2_xl.py, eval_gpt2.py (2022-12-28): adding a lightweight configurator that may be a terrible mistake lol. also adding configs to evaluate the baseline GPT-2 versions released by OpenAI on OWT. we have some ways to go to match those numbers atm
finetune_shakespeare.py (2023-01-02): rename compile_model to compile, shorter; version 2 stragglers
train_gpt2.py (2023-02-03): include launch command too. anyone should be able to do this now
train_shakespeare_char.py (2023-01-14): a bit better settings... for a single GPU at least. these settings would fry a simple CPU though, I think
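
The "lightweight configurator" referenced in the commit messages above treats each of these files as a plain Python script of variable assignments that train.py executes to override its default hyperparameters. A minimal sketch of what one of the eval configs plausibly looks like, following the upstream nanoGPT convention (the exact names and values here are assumptions, not a verbatim copy of this repo's file):

    # sketch of a config in the style of config/eval_gpt2.py (illustrative values)
    # evaluate OpenAI's released GPT-2 checkpoint on OpenWebText, no training
    init_from = 'gpt2'   # load the pretrained 124M GPT-2 weights
    eval_only = True     # run the evaluation loop once and exit
    eval_iters = 500     # more iterations give a less noisy loss estimate
    batch_size = 8
    wandb_log = False    # no need to log a pure evaluation run

In the upstream nanoGPT workflow such a file is passed on the command line, e.g. python train.py config/eval_gpt2.py, with any further --key=value flags applied as overrides after it; the same pattern presumably holds for this mirror.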