---
tags:
- chemistry
---

# ChemGPT 19M
ChemGPT is based on the GPT-Neo model and was introduced in the paper [Neural Scaling of Deep Chemical Models](https://chemrxiv.org/engage/chemrxiv/article-details/627bddd544bdd532395fb4b5).

## Model description
ChemGPT is a transformer model for generative molecular modeling, pretrained on the PubChem10M dataset.

## Intended uses & limitations

### How to use
You can use this model directly with the 🤗 Transformers library.
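
For example, here is a minimal sketch of loading the checkpoint and sampling a molecule. The Hub repository ID `ncfrey/ChemGPT-19M` and the single-token SELFIES prompt are assumptions and may need to be adjusted for this checkpoint:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumed Hub ID for this checkpoint; adjust if the repository is named differently.
checkpoint = "ncfrey/ChemGPT-19M"

tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForCausalLM.from_pretrained(checkpoint)

# ChemGPT operates on SELFIES tokens; here we seed generation with a single
# (assumed) carbon token and sample a short continuation.
inputs = tokenizer("[C]", return_tensors="pt")
outputs = model.generate(**inputs, max_length=64, do_sample=True, top_k=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```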

### Limitations and bias
This model was trained on a subset of molecules from PubChem. You can use this model to generate molecules, but it is primarily intended for investigating the effects of pre-training and fine-tuning on downstream datasets.

## Training data
PubChem10M, a dataset of SMILES strings from PubChem, available via [DeepChem](https://deepchemdata.s3-us-west-1.amazonaws.com/datasets/pubchem_10m.txt.zip).
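
If you want to inspect the data, the archive can be fetched and read directly. The sketch below assumes the zip contains a single plain-text file with one SMILES string per line:

```python
import io
import zipfile

import requests

url = "https://deepchemdata.s3-us-west-1.amazonaws.com/datasets/pubchem_10m.txt.zip"

# Download the archive (large; ~10M SMILES strings) and open it in memory.
archive = zipfile.ZipFile(io.BytesIO(requests.get(url).content))

# Assumed layout: one text file inside the zip, one SMILES string per line.
with archive.open(archive.namelist()[0]) as f:
    for _ in range(3):
        print(f.readline().decode("utf-8").strip())
```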

## Training procedure

### Preprocessing
SMILES strings were converted to SELFIES using version 1.0.4 of the SELFIES library.
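
As an illustration, the conversion can be reproduced with the `selfies` package (the example molecule below is arbitrary):

```python
import selfies as sf  # the card above specifies selfies==1.0.4

# Round-trip a SMILES string through its SELFIES representation.
smiles = "CCO"                    # ethanol, chosen only as an example
selfies_str = sf.encoder(smiles)  # e.g. "[C][C][O]"
roundtrip = sf.decoder(selfies_str)

print(selfies_str, roundtrip)
```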

### Pretraining
See code in the [LitMatter repository](https://github.com/ncfrey/litmatter/blob/main/lit_models/lit_chemgpt.py).
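
That repository contains the actual pretraining pipeline. As a rough, generic sketch only (not the LitMatter code), fine-tuning the checkpoint on a downstream set of SELFIES strings with 🤗 Transformers might look like the following; the Hub ID and the `selfies.txt` file are placeholders:

```python
from datasets import load_dataset
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

checkpoint = "ncfrey/ChemGPT-19M"  # assumed Hub ID, as above
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForCausalLM.from_pretrained(checkpoint)
if tokenizer.pad_token is None:
    tokenizer.pad_token = tokenizer.eos_token

# "selfies.txt" is a placeholder file with one SELFIES string per line.
dataset = load_dataset("text", data_files={"train": "selfies.txt"})
tokenized = dataset.map(
    lambda batch: tokenizer(batch["text"], truncation=True, max_length=128),
    batched=True,
    remove_columns=["text"],
)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="chemgpt-finetuned", num_train_epochs=1),
    train_dataset=tokenized["train"],
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
```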

### BibTeX entry and citation info
```
@article{frey_soklaski_axelrod_samsi_gomez-bombarelli_coley_gadepally_2022,
  place={Cambridge},
  title={Neural Scaling of Deep Chemical Models},
  DOI={10.26434/chemrxiv-2022-3s512},
  journal={ChemRxiv},
  publisher={Cambridge Open Engage},
  author={Frey, Nathan and Soklaski, Ryan and Axelrod, Simon and Samsi, Siddharth and Gomez-Bombarelli, Rafael and Coley, Connor and Gadepally, Vijay},
  year={2022}
}
```

```
Frey, Nathan, Ryan Soklaski, Simon Axelrod, Siddharth Samsi, Rafael Gomez-Bombarelli, Connor Coley, and Vijay Gadepally.
"Neural Scaling of Deep Chemical Models." ChemRxiv (2022).
```

This content is a preprint and has not been peer-reviewed.