Improve model card: Add paper and GitHub links, fill citation

#4
by nielsr HF Staff - opened
Files changed (1)
  1. README.md +26 -4
README.md CHANGED
@@ -1,11 +1,17 @@
 ---
-license: mit
 base_model:
-- inclusionAI/Ling-flash-base-2.0
-pipeline_tag: text-generation
+- inclusionAI/Ling-flash-base-2.0
 library_name: transformers
+license: mit
+pipeline_tag: text-generation
 ---
 
+# Ring-flash-2.0
+
+This model is presented in the paper [Every Step Evolves: Scaling Reinforcement Learning for Trillion-Scale Thinking Model](https://huggingface.co/papers/2510.18855).
+
+The official code repository is available at: [https://github.com/inclusionAI/Ring-V2](https://github.com/inclusionAI/Ring-V2).
+
 <p align="center">
 <img src="https://mdn.alipayobjects.com/huamei_qa8qxu/afts/img/A*4QxcQrBlTiAAAAAAQXAAAAgAemJ7AQ/original" width="100"/>
 <p>
@@ -264,8 +270,24 @@ More usage can be found [here](https://docs.sglang.ai/basic_usage/send_request.h
 
 ### Finetuning
 
-We recommend you to use [Llama-Factory](https://github.com/hiyouga/LLaMA-Factory) to [finetune Ring](https://github.com/inclusionAI/Ling-V2/blob/main/docs/llamafactory_finetuning.md).
+We recommend you to use [Llama-Factory](https://github.com/hiyouga/LLaMA-Factory) to [finetune Ring](https://github.com/inclusionAI/Ling-V2/blob/main/docs/llamafactory_finetuning.md).
 
 ## License
 
 This code repository is licensed under [the MIT License](https://github.com/inclusionAI/Ring-V2/blob/master/LICENSE).
+
+## Citation
+
+If you find our work helpful, feel free to give us a cite.
+
+```bibtex
+@article{lingteam2025everystep,
+  title={Every Step Evolves: Scaling Reinforcement Learning for Trillion-Scale Thinking Model},
+  author={Ling Team and Anqi Shen and Baihui Li and Bin Hu and Bin Jing and Cai Chen and Chao Huang and Chao Zhang and Chaokun Yang and Cheng Lin and Chengyao Wen and Congqi Li and Deng Zhao and Dingbo Yuan and Donghai You and Fagui Mao and Fanzhuang Meng and Feng Xu and Guojie Li and Guowei Wang and Hao Dai and Haonan Zheng and Hong Liu and Jia Guo and Jiaming Liu and Jian Liu and Jianhao Fu and Jiannan Shi and Jianwen Wang and Jianxin Lai and Jin Yang and Jun Mei and Jun Zhou and Junbo Zhao and Junping Zhao and Kuan Xu and Le Su and Lei Chen and Li Tang and Liang Jiang and Liangcheng Fu and Lianhao Xu and Linfeng Shi and Lisha Liao and Longfei Zheng and Meng Li and Mingchun Chen and Qi Zuo and Qiang Cheng and Qianggang Cao and Qitao Shi and Quanrui Guo and Senlin Zhu and Shaofei Wang and Shaomian Zheng and Shuaicheng Li and Shuwei Gu and Chen, Siba and Wu, Tao and Zhang, Tao and Zhang, Tianyu and Zhou, Tianyu and Bie, Tiwei and Yang, Tongkai and Hong, Wang and Ren, Wang and Chen, Weihua and Yu, Wenbo and Zheng, Wengang and Wang, Xiangchun and Yan, Xiaodong and Wan, Xiaopei and Zhao, Xin and Kong, Xinyu and Tang, Xinyu and Han, Xudong and Wang, Xudong and Yang, Xuemin and Hu, Xueyu and Zhang, Yalin and Sun, Yan and Shan, Yicheng and Wang, Yilong and Xu, Yingying and Liu, Yongkang and Guo, Yongzhen and Wang, Yuanyuan and Yan, Yuchen and Wang, Yuefan and Guo, Yuhong and Li, Zehuan and Xu, Zhankai and Li, Zhe and Zhang, Zhenduo and Gui, Zhengke and Pan, Zhenxuan and Huang, Zhenyu and Lan, Zhenzhong and Ding, Zhiqiang and Zhang, Zhiqiang and Li, Zhixun and Liu, Zhizhen and Wang, Zihao and Wen, Zujie},
+  year={2025},
+  eprint={2510.18855},
+  archivePrefix={arXiv},
+  primaryClass={cs.LG},
+  url={https://arxiv.org/abs/2510.18855},
+}
+```