ahmadAlrabghi/al_tadmoreyyah_model
This model is a fine-tuned version of aubmindlab/aragpt2-large, trained on the altadmoriyyah dataset.
Model description
More information needed
Intended uses & limitations
More information needed
Training and evaluation data
More information needed
Training procedure
Training hyperparameters
The following hyperparameters were used during training (a minimal sketch reproducing them in code follows the list):
- learning_rate: 0.0002
- train_batch_size: 2
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 4
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 40
- num_epochs: 15
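For reference, a minimal sketch of a Hugging Face `TrainingArguments` configuration mirroring the settings above; the output directory is a placeholder, and the actual training script may have differed in other details.

```python
from transformers import TrainingArguments

# Sketch of the hyperparameters listed above; output_dir is a hypothetical path.
training_args = TrainingArguments(
    output_dir="al_tadmoreyyah_model",
    learning_rate=2e-4,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=2,  # gives an effective train batch size of 4
    num_train_epochs=15,
    lr_scheduler_type="linear",
    warmup_steps=40,
    seed=42,
)
```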
Training results
Framework versions
- PEFT 0.12.0
- Transformers 4.44.2
- Pytorch 2.4.0+cu121
- Datasets 2.21.0
- Tokenizers 0.19.1
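Since the checkpoint is a PEFT adapter on top of aubmindlab/aragpt2-large, a minimal inference sketch might look like the following. The repo ids are taken from this card; the use of `trust_remote_code` and the prompt text are assumptions, not confirmed by the original setup.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

# Load the base AraGPT2-large model and tokenizer.
# trust_remote_code may be required for AraGPT2-large's custom modeling code (assumption).
base = AutoModelForCausalLM.from_pretrained("aubmindlab/aragpt2-large", trust_remote_code=True)
tokenizer = AutoTokenizer.from_pretrained("aubmindlab/aragpt2-large")

# Apply this PEFT adapter on top of the base model.
model = PeftModel.from_pretrained(base, "ahmadAlrabghi/al_tadmoreyyah_model")

prompt = "..."  # replace with an Arabic prompt
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=100)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```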
This project serves as an experimental step in building models aimed at preserving Arab-Islamic heritage using artificial intelligence. The fine-tuning of the AraGPT2-large model on the text "Al-Tadmuriyah" by Shaykh al-Islam Ibn Taymiyyah represents the beginning of exploring how language models can enhance our understanding and appreciation of classical texts.
Insha'Allah, I will continue working on other projects in this domain to further contribute to this field.
You can find more details about this model in this GitHub repo: https://github.com/AhmadAlmustadi/al_tadmreyyah_model