Update README.md
README.md CHANGED
@@ -13,7 +13,7 @@ tags:
 <p align="center">
   <a href="https://gpt-oss.com"><strong>Try gpt-oss</strong></a> ·
   <a href="https://cookbook.openai.com/topic/gpt-oss"><strong>Guides</strong></a> ·
-  <a href="https://
+  <a href="https://arxiv.org/abs/2508.10925"><strong>Model card</strong></a> ·
   <a href="https://openai.com/index/introducing-gpt-oss/"><strong>OpenAI blog</strong></a>
 </p>
 
@@ -166,3 +166,17 @@ The gpt-oss models are excellent for:
 Both gpt-oss models can be fine-tuned for a variety of specialized use cases.
 
 This smaller model `gpt-oss-20b` can be fine-tuned on consumer hardware, whereas the larger [`gpt-oss-120b`](https://huggingface.co/openai/gpt-oss-120b) can be fine-tuned on a single H100 node.
+
+# Citation
+
+```bibtex
+@misc{openai2025gptoss120bgptoss20bmodel,
+      title={gpt-oss-120b & gpt-oss-20b Model Card},
+      author={OpenAI},
+      year={2025},
+      eprint={2508.10925},
+      archivePrefix={arXiv},
+      primaryClass={cs.CL},
+      url={https://arxiv.org/abs/2508.10925},
+}
+```

