Update README.md
README.md CHANGED
```diff
@@ -24,10 +24,14 @@ The Cogito v2 LLMs are instruction tuned generative models. All models are relea
 - This model is trained in over 30 languages and supports a context length of 128k.
 
 # Evaluations
+Here is the model performance on some standard industry benchmarks.
+
 <p align="left">
   <img src="images/cogito-v2-405b-benchmarks.png" alt="Logo" width="90%">
 </p>
 
+For detailed evaluations, please refer to the [Blog Post](https://www.deepcogito.com/research/cogito-v2-preview).
+
 # Usage
 Here is a snippet below for usage with Transformers:
 
```
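The Transformers snippet referenced in the README falls outside this hunk. As a rough sketch only, assuming a Hugging Face model id such as `deepcogito/cogito-v2-preview-llama-405B` (a placeholder, not taken from this diff), loading and prompting the model might look like this:

```python
# Minimal sketch of Transformers usage; the model id is an assumed placeholder
# and is not part of the diff above.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "deepcogito/cogito-v2-preview-llama-405B"  # assumption, adjust as needed

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # large checkpoint: bf16 plus device_map to shard across GPUs
    device_map="auto",
)

# Build a chat prompt with the tokenizer's chat template, then generate a reply.
messages = [{"role": "user", "content": "Give me a short introduction to LLMs."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output_ids = model.generate(input_ids, max_new_tokens=256)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```

The `device_map="auto"` and bf16 settings are chosen because a checkpoint at this scale will not fit on a single accelerator; exact loading options depend on the available hardware.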