historical note? (#5)
Commit: b52b0a468db214f480f9480b4ba23cc323974bce
README.md CHANGED
@@ -3,7 +3,7 @@ language: en
 license: mit
 ---
 
-# OpenAI GPT
+# OpenAI GPT 1
 
 ## Table of Contents
 - [Model Details](#model-details)
@@ -19,7 +19,7 @@ license: mit
 
 ## Model Details
 
-**Model Description:** `openai-gpt` is a transformer-based language model created and released by OpenAI. The model is a causal (unidirectional) transformer pre-trained using language modeling on a large corpus with long range dependencies.
+**Model Description:** `openai-gpt` (a.k.a. "GPT-1") is the first transformer-based language model created and released by OpenAI. The model is a causal (unidirectional) transformer pre-trained using language modeling on a large corpus with long range dependencies.
 
 - **Developed by:** Alec Radford, Karthik Narasimhan, Tim Salimans, Ilya Sutskever. See [associated research paper](https://cdn.openai.com/research-covers/language-unsupervised/language_understanding_paper.pdf) and [GitHub repo](https://github.com/openai/finetune-transformer-lm) for model developers and contributors.
 - **Model Type:** Transformer-based language model
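For context on the model this diff documents: below is a minimal usage sketch, not part of the change above, showing how the `openai-gpt` checkpoint can be loaded as a causal language model. It assumes the Hugging Face `transformers` library (with a PyTorch backend) is installed; the prompt string is an arbitrary example.

```python
# Minimal sketch: load the openai-gpt checkpoint described above as a
# causal (unidirectional) language model and sample a continuation.
# Assumes `pip install transformers torch`; not part of the diff itself.
from transformers import pipeline, set_seed

generator = pipeline("text-generation", model="openai-gpt")
set_seed(42)  # make the sampled continuation reproducible

# The model completes the prompt left to right, one token at a time.
outputs = generator("Hello, I'm a language model,", max_length=30, num_return_sequences=1)
print(outputs[0]["generated_text"])
```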