Update README.md
README.md CHANGED
@@ -13,12 +13,12 @@ The eCeLLM-M model is instruction-tuned from the large base models [Mistral-7B I

 ## Citation
 ```bibtex
-@
-
-
-
-
-
-
+@inproceedings{
+peng2024ecellm,
+title={eCe{LLM}: Generalizing Large Language Models for E-commerce from Large-scale, High-quality Instruction Data},
+author={Bo Peng and Xinyi Ling and Ziru Chen and Huan Sun and Xia Ning},
+booktitle={Forty-first International Conference on Machine Learning},
+year={2024},
+url={https://openreview.net/forum?id=LWRI4uPG2X}
 }
 ```