ariG23498 (HF Staff) committed Commit 7ee56a6 · verified · 1 Parent(s): e6992d0

Upload inclusionAI_Ring-flash-linear-2.0-128k_1.txt with huggingface_hub

inclusionAI_Ring-flash-linear-2.0-128k_1.txt ADDED
@@ -0,0 +1,27 @@
+ ```CODE:
+ # Load model directly
+ from transformers import AutoModelForCausalLM
+ model = AutoModelForCausalLM.from_pretrained("inclusionAI/Ring-flash-linear-2.0-128k", trust_remote_code=True, torch_dtype="auto")
+ ```
+
+ ERROR:
+ Traceback (most recent call last):
+   File "/tmp/inclusionAI_Ring-flash-linear-2.0-128k_1fNCpm3.py", line 16, in <module>
+     model = AutoModelForCausalLM.from_pretrained("inclusionAI/Ring-flash-linear-2.0-128k", trust_remote_code=True, torch_dtype="auto")
+   File "/tmp/.cache/uv/environments-v2/7a001057f9cd0d8c/lib/python3.13/site-packages/transformers/models/auto/auto_factory.py", line 586, in from_pretrained
+     model_class = get_class_from_dynamic_module(
+         class_ref, pretrained_model_name_or_path, code_revision=code_revision, **hub_kwargs, **kwargs
+     )
+   File "/tmp/.cache/uv/environments-v2/7a001057f9cd0d8c/lib/python3.13/site-packages/transformers/dynamic_module_utils.py", line 604, in get_class_from_dynamic_module
+     final_module = get_cached_module_file(
+         repo_id,
+         ...<8 lines>...
+         repo_type=repo_type,
+     )
+   File "/tmp/.cache/uv/environments-v2/7a001057f9cd0d8c/lib/python3.13/site-packages/transformers/dynamic_module_utils.py", line 427, in get_cached_module_file
+     modules_needed = check_imports(resolved_module_file)
+   File "/tmp/.cache/uv/environments-v2/7a001057f9cd0d8c/lib/python3.13/site-packages/transformers/dynamic_module_utils.py", line 260, in check_imports
+     raise ImportError(
+         ...<2 lines>...
+     )
+ ImportError: This modeling file requires the following packages that were not found in your environment: fla. Run `pip install fla`
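
A possible remedy, sketched below under one assumption: the missing `fla` module is typically provided by the flash-linear-attention package, so installing that project before retrying the load should satisfy the custom modeling code's import check (the traceback's suggested `pip install fla` simply echoes the import name and may not map to the same PyPI project).

```
# Sketch of a fix, assuming the `fla` module comes from the
# flash-linear-attention project:
#   pip install flash-linear-attention
# then retry the original load.
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained(
    "inclusionAI/Ring-flash-linear-2.0-128k",
    trust_remote_code=True,   # the repo ships custom modeling code that imports `fla`
    torch_dtype="auto",       # take the dtype from the checkpoint config
)
```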