Green 7B

Green is an open-source, long-context 7B language model based on Mistral.

πŸ”§ Usage Example

from transformers import AutoModelForCausalLM, AutoTokenizer
import torch

model_name = "Abigail45/Green"

# Load the tokenizer and the model in half precision, letting
# Accelerate place the layers across available devices.
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    torch_dtype=torch.float16,
    device_map="auto"
)

# Tokenize a prompt, generate up to 150 new tokens, and decode the result.
prompt = "Write a short poem about green forests."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=150)
print(tokenizer.decode(output[0], skip_special_tokens=True))
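
Because the card describes Green as a long-context model, the same API can be used to process longer inputs such as a full document. The sketch below is illustrative only: the local file `article.txt`, the prompt wording, and the sampling settings (`temperature`, `top_p`) are assumptions, not values recommended by this card.

# Hypothetical long-context example: summarize a local document (article.txt is a placeholder file).
long_prompt = (
    "Summarize the following article in three sentences:\n\n"
    + open("article.txt").read()
)
inputs = tokenizer(long_prompt, return_tensors="pt").to(model.device)
output = model.generate(
    **inputs,
    max_new_tokens=300,
    do_sample=True,    # sample instead of greedy decoding
    temperature=0.7,   # illustrative values, not tuned for this model
    top_p=0.9,
)
print(tokenizer.decode(output[0], skip_special_tokens=True))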