Update README.md
README.md
CHANGED
@@ -311,6 +311,8 @@ model-index:
**Note:** Check out my new, updated models: [German_Semantic_V3](https://huggingface.co/aari1995/German_Semantic_V3) and [V3b](https://huggingface.co/aari1995/German_Semantic_V3b)!

+This model creates German embeddings for semantic use cases.
+
This is a [sentence-transformers](https://www.SBERT.net) model: it maps sentences & paragraphs to a 1024-dimensional dense vector space and can be used for tasks like clustering or semantic search.
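
For illustration, here is a minimal usage sketch with the sentence-transformers library; the repository name below is an assumption based on this card, so substitute the actual model ID:

```python
# Minimal sketch: encode German sentences and compare them.
# The model ID below is assumed for illustration; replace it with the real repository name.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("aari1995/German_Semantic_STS_V2")

sentences = [
    "Das ist ein Beispielsatz.",
    "Dies ist ein ähnlicher Satz.",
]

# Each sentence is mapped to a 1024-dimensional dense vector.
embeddings = model.encode(sentences)
print(embeddings.shape)  # (2, 1024)

# Cosine similarity can then drive semantic search or clustering.
print(util.cos_sim(embeddings[0], embeddings[1]))
```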
Special thanks to [deepset](https://huggingface.co/deepset/) for providing the gBERT-large model, and to [Philip May](https://huggingface.co/philipMay) for the translation of the dataset and chats about the topic.