readme update
README.md CHANGED
@@ -10,4 +10,19 @@ pinned: false
license: mit
---

# GPT from scratch

This repo contains code to train a GPT from scratch. The dataset is taken from the [RedPajama-Data-1T sample](https://huggingface.co/datasets/togethercomputer/RedPajama-Data-1T-Sample); only a subset of this sample is used for training. The transformer implementation is similar to [LitGPT](https://github.com/Lightning-AI/lit-gpt).
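
The repo's own preprocessing is not shown in this README; as a rough sketch of where the data comes from, the sample set can be pulled with the Hugging Face `datasets` library. The split and field names below follow the dataset card and are assumptions as far as this repo's code is concerned.

```python
# Illustrative only - loading the RedPajama 1T *sample* with Hugging Face `datasets`;
# the repo's own preprocessing/tokenization may differ.
from datasets import load_dataset

# Recent `datasets` versions may additionally require trust_remote_code=True
# for script-based datasets such as this one.
sample = load_dataset("togethercomputer/RedPajama-Data-1T-Sample", split="train")

print(len(sample))               # number of documents in the sample
print(sample[0]["text"][:200])   # raw text of the first document (assumed "text" field)
```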

The trained model has about 160M parameters; the final training loss was 3.2154.
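
The exact hyperparameters are not listed in this README, so the numbers below are assumptions chosen only to show how a parameter count in the 160M range comes about for a decoder-only transformer.

```python
# Back-of-the-envelope parameter count for a GPT-style decoder.
# All hyperparameters here are assumed for illustration, not read from the repo.
n_layer, n_embd, vocab_size = 12, 896, 50_304

attn_params  = 4 * n_embd ** 2          # QKV projections + output projection
mlp_params   = 8 * n_embd ** 2          # 4x MLP expansion: up- and down-projection
block_params = attn_params + mlp_params
embed_params = vocab_size * n_embd      # token embedding (assumed tied with the LM head)

total = n_layer * block_params + embed_params
print(f"~{total / 1e6:.0f}M parameters")  # ~161M with these assumed values
```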
The training details can be found in the attached notebooks. The initial training was stopped when the loss was around 4.

Using that checkpoint, training was resumed and stopped once the loss went below 3.5.
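
The checkpoint and resume mechanics live in the notebooks rather than in this README; a minimal PyTorch sketch of the pattern, with a stand-in model and an assumed file name, would look like this.

```python
# Hypothetical resume-from-checkpoint sketch; the file name, model and optimizer
# below are stand-ins, not the repo's actual code.
import torch
import torch.nn as nn

model = nn.Linear(768, 768)          # stand-in for the GPT model
optimizer = torch.optim.AdamW(model.parameters(), lr=3e-4)

# --- saved at the end of the first run (loss ~4) ---
torch.save({"model": model.state_dict(),
            "optimizer": optimizer.state_dict(),
            "step": 10_000}, "checkpoint.pt")

# --- later: restore weights and optimizer state, then continue training
#     from the saved step until the loss drops below 3.5 ---
ckpt = torch.load("checkpoint.pt", map_location="cpu")
model.load_state_dict(ckpt["model"])
optimizer.load_state_dict(ckpt["optimizer"])
start_step = ckpt["step"]
```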

GitHub link: https://github.com/mkthoma/gpt_from_scratch