Update README.md

README.md CHANGED
@@ -120,16 +120,14 @@ print(result)
 If you want to use Hugging Face `transformers` to generate text, you can do something like this.
 
 ```py
-from transformers import
-
-
-
-
-
-
-
-outputs = model.generate(**inputs, max_new_tokens=20)
-print(tokenizer.decode(outputs[0], skip_special_tokens=True))
+from transformers import pipeline
+
+messages = [
+    {"role": "system", "content": "You are a pirate chatbot who always responds in pirate speak!"},
+    {"role": "user", "content": "Who are you?"},
+]
+chatbot = pipeline("text-generation", model="mistralai/Mistral-7B-Instruct-v0.3")
+chatbot(messages)
 ```
 
 ## Limitations
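For reference, the `messages` list in the new example follows the chat format that `transformers` text-generation pipelines accept: a plain Python list of dicts, each with a `role` (`system`, `user`, or `assistant`) and a `content` string. A minimal sketch of that structure, using a hypothetical validation helper (not part of `transformers`, shown only to make the expected schema explicit):

```python
# Chat messages use the role/content schema accepted by transformers pipelines.
messages = [
    {"role": "system", "content": "You are a pirate chatbot who always responds in pirate speak!"},
    {"role": "user", "content": "Who are you?"},
]

# Hypothetical helper: each message must have exactly the keys
# "role" and "content", with a recognized role value.
def is_valid_chat(msgs):
    return all(
        set(m) == {"role", "content"} and m["role"] in {"system", "user", "assistant"}
        for m in msgs
    )

print(is_valid_chat(messages))  # True
```

Passing a list shaped like this (rather than a raw prompt string) is what lets the pipeline apply the model's chat template before generation.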