# Green 7B

Green is an open-source long-context language model based on Mistral.
## Usage Example
```python
from transformers import AutoModelForCausalLM, AutoTokenizer
import torch

model_name = "Abigail45/Green"

# Load the tokenizer and the model in half precision,
# sharding across available devices automatically
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    torch_dtype=torch.float16,
    device_map="auto",
)

# Generate a completion for a short prompt
prompt = "Write a short poem about green forests."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=150)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```
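Since Green targets long contexts, inputs can still exceed whatever window you run it with. One common workaround is to split the token ids into overlapping windows and process each window separately. Below is a minimal sketch of that idea; `chunk_tokens` is a hypothetical helper (not part of the model or the `transformers` API), and the window/overlap sizes are illustrative:

```python
def chunk_tokens(token_ids, window=4096, overlap=256):
    """Split a list of token ids into overlapping windows.

    Hypothetical helper for illustration only: each chunk holds at most
    `window` ids, and consecutive chunks share `overlap` ids so that
    context carries across chunk boundaries.
    """
    if window <= overlap:
        raise ValueError("window must be larger than overlap")
    step = window - overlap
    chunks = []
    for start in range(0, len(token_ids), step):
        chunks.append(token_ids[start:start + window])
        if start + window >= len(token_ids):
            break  # last window already covers the tail
    return chunks
```

Each chunk can then be fed to `model.generate` (or scored) independently, at the cost of some redundant computation in the overlapped regions.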
## Model tree for Abigail45/Green

- Base model: mistralai/Mistral-7B-v0.3
- Finetuned: mistralai/Mistral-7B-Instruct-v0.3
## Evaluation results

| Metric | Dataset | Value (self-reported) |
|---|---|---|
| ROUGE-L (50k context) | long-range-arena | 45.670 |
| Exact Match (50k) | long-range-arena | 62.340 |
| MMLU (0-shot, 50k context) | cais/mmlu | 72.450 |
| ARC-Challenge (25-shot) | cais/mmlu | 78.920 |
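The ROUGE-L scores above are F-measures built on the longest common subsequence between a generated answer and a reference. As a reference for how such a score is computed, here is a minimal pure-Python sketch (whitespace tokenization, no stemming; production harnesses such as the `evaluate` library's `rouge` metric differ in these details, so numbers will not match the table exactly):

```python
def lcs_length(a, b):
    """Length of the longest common subsequence, via classic DP."""
    dp = [[0] * (len(b) + 1) for _ in range(len(a) + 1)]
    for i, x in enumerate(a, 1):
        for j, y in enumerate(b, 1):
            dp[i][j] = dp[i - 1][j - 1] + 1 if x == y else max(dp[i - 1][j], dp[i][j - 1])
    return dp[-1][-1]

def rouge_l_f1(prediction, reference):
    """ROUGE-L F1 over whitespace tokens (illustrative sketch)."""
    p_toks, r_toks = prediction.split(), reference.split()
    lcs = lcs_length(p_toks, r_toks)
    if lcs == 0:
        return 0.0
    precision = lcs / len(p_toks)   # fraction of prediction covered by the LCS
    recall = lcs / len(r_toks)      # fraction of reference covered by the LCS
    return 2 * precision * recall / (precision + recall)
```

Exact Match, by contrast, is simply the fraction of predictions that equal the reference string after normalization.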