hrithiksagar-tih committed
Commit 615c8a9 · verified · 1 Parent(s): 852965e

Added requirements file for transformers

Files changed (1):
  1. README.md +23 -0
README.md CHANGED
@@ -175,6 +175,29 @@ We provide an example code to run `InternVL3-78B` using `transformers`.
 
 > Please use transformers>=4.37.2 to ensure the model works normally.
 
+- Use this requirements.txt file:
+```
+# Note: PyTorch and FlashAttention must be installed separately.
+# See installation instructions.
+
+# Core ML/DL Libraries
+transformers>=4.37.2
+accelerate
+einops
+timm
+
+# Utilities
+pyyaml
+Pillow
+tqdm
+ninja
+wheel
+packaging
+
+# pip3 install torch torchvision --index-url https://download.pytorch.org/whl/cu126
+# pip install flash-attn --no-build-isolation
+```
+
 ### Model Loading
 
 #### 16-bit (bf16 / fp16)
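
The commented-out lines at the bottom of the added file imply an install order: PyTorch before flash-attn, since FlashAttention compiles against the already-installed torch. A minimal sketch of that sequence, assuming a CUDA 12.6 environment and that the file is saved as `requirements.txt` (the filename and CUDA version are assumptions, not stated in the diff):

```bash
# Hypothetical install sequence implied by the comments in requirements.txt.
# Assumes CUDA 12.6; swap the index URL for other CUDA builds.

# 1. PyTorch first: flash-attn builds against the installed torch.
pip3 install torch torchvision --index-url https://download.pytorch.org/whl/cu126

# 2. Core libraries and utilities from the requirements file.
pip install -r requirements.txt

# 3. FlashAttention last; --no-build-isolation lets the build see the
#    already-installed torch, and ninja/wheel/packaging speed up compilation.
pip install flash-attn --no-build-isolation

# 4. Sanity check: the README requires transformers>=4.37.2.
python -c "import transformers; print(transformers.__version__)"
```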