ariG23498 (HF Staff) committed
Commit 4861d5b · verified · 1 Parent(s): 54c9ea4

Upload deepseek-ai_DeepSeek-V3.2-Exp_1.txt with huggingface_hub

Files changed (1):
  1. deepseek-ai_DeepSeek-V3.2-Exp_1.txt +37 -0
deepseek-ai_DeepSeek-V3.2-Exp_1.txt ADDED
@@ -0,0 +1,37 @@
+ CODE:
+ ```
+ # Load model directly
+ from transformers import AutoModelForCausalLM
+ model = AutoModelForCausalLM.from_pretrained("deepseek-ai/DeepSeek-V3.2-Exp", torch_dtype="auto")
+ ```
+
+ ERROR:
+ Traceback (most recent call last):
+   File "/tmp/.cache/uv/environments-v2/ca40b4f14047fc5a/lib/python3.13/site-packages/transformers/models/auto/configuration_auto.py", line 1360, in from_pretrained
+     config_class = CONFIG_MAPPING[config_dict["model_type"]]
+                    ~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^
+   File "/tmp/.cache/uv/environments-v2/ca40b4f14047fc5a/lib/python3.13/site-packages/transformers/models/auto/configuration_auto.py", line 1048, in __getitem__
+     raise KeyError(key)
+ KeyError: 'deepseek_v32'
+
+ During handling of the above exception, another exception occurred:
+
+ Traceback (most recent call last):
+   File "/tmp/deepseek-ai_DeepSeek-V3.2-Exp_1PWAFqQ.py", line 16, in <module>
+     model = AutoModelForCausalLM.from_pretrained("deepseek-ai/DeepSeek-V3.2-Exp", torch_dtype="auto")
+   File "/tmp/.cache/uv/environments-v2/ca40b4f14047fc5a/lib/python3.13/site-packages/transformers/models/auto/auto_factory.py", line 549, in from_pretrained
+     config, kwargs = AutoConfig.from_pretrained(
+                      ~~~~~~~~~~~~~~~~~~~~~~~~~~^
+         pretrained_model_name_or_path,
+         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
+         ...<4 lines>...
+         **kwargs,
+         ^^^^^^^^^
+     )
+     ^
+   File "/tmp/.cache/uv/environments-v2/ca40b4f14047fc5a/lib/python3.13/site-packages/transformers/models/auto/configuration_auto.py", line 1362, in from_pretrained
+     raise ValueError(
+     ...<8 lines>...
+     )
+ ValueError: The checkpoint you are trying to load has model type `deepseek_v32` but Transformers does not recognize this architecture. This could be because of an issue with the checkpoint, or because your version of Transformers is out of date.
+
+ You can update Transformers with the command `pip install --upgrade transformers`. If this does not work, and the checkpoint is very new, then there may not be a release version that supports this model yet. In this case, you can get the most up-to-date code by installing Transformers from source with the command `pip install git+https://github.com/huggingface/transformers.git`.