Paper: Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer (arXiv:1910.10683)
This is a multi-task t5-base model trained for question answering and answer-aware question generation.
For question generation, the answer span is highlighted within the text with special highlight tokens (`<hl>`) and the input is prefixed with `generate question: `. For QA, the input is formatted as `question: question_text context: context_text </s>`.
You can play with the model using the Inference API. Here's how you can use it:
For QG:
`generate question: <hl> 42 <hl> is the answer to life, the universe and everything. </s>`
For QA:
`question: What is 42 context: 42 is the answer to life, the universe and everything. </s>`
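To run the model locally, a minimal sketch using the Transformers library could look like the following. The model ID is a placeholder (this card does not state it); replace it with this repository's actual ID.

```python
# Minimal sketch: load the model and run the QG and QA examples above.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "t5-base-qa-qg-hl"  # placeholder -- use this repository's model ID
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

def run(input_text: str) -> str:
    # Encode the prefixed input, generate, and decode the output text.
    input_ids = tokenizer(input_text, return_tensors="pt").input_ids
    output_ids = model.generate(input_ids, max_length=64)
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)

# Question generation: highlight the answer span with <hl> tokens.
print(run("generate question: <hl> 42 <hl> is the answer to life, the universe and everything. </s>"))

# Question answering: provide the question and the context.
print(run("question: What is 42 context: 42 is the answer to life, the universe and everything. </s>"))
```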
For more details see this repo.