Update README.md
    	
README.md CHANGED

@@ -129,8 +129,10 @@ Note that:
 You can generate code using the `transformers` library as follows:
 
 ```python
-
-
+from transformers import AutoModelForCausalLM, AutoTokenizer
+
+tokenizer = AutoTokenizer.from_pretrained('replit/replit-code-v1-3b', trust_remote_code=True)
+model = AutoModelForCausalLM.from_pretrained('replit/replit-code-v1-3b', trust_remote_code=True)
 
 x = tokenizer.encode('def fibonacci(n): ', return_tensors='pt')
 y = model.generate(x, max_length=100, do_sample=True, top_p=0.95, top_k=4, temperature=0.2, num_return_sequences=1, eos_token_id=tokenizer.eos_token_id)
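With the import and model-loading lines added, the README snippet runs end to end, but the generated tensor `y` still has to be decoded back into text. A minimal follow-up sketch, continuing the snippet from the diff above (the decode call is standard `transformers` API and is not part of this change):

```python
# Continues the README snippet: `tokenizer` and `y` are assumed to be
# defined as in the diff above.
# Decode the first returned sequence of token ids back into source text.
generated_code = tokenizer.decode(y[0], skip_special_tokens=True)
print(generated_code)
```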