Derek Thomas committed
Commit cc20464 · 1 Parent(s): 6a149f6

End of training

README.md ADDED
@@ -0,0 +1,91 @@
+ ---
+ license: apache-2.0
+ tags:
+ - generated_from_trainer
+ metrics:
+ - rouge
+ - bleu
+ model-index:
+ - name: t5-end2end-question-generation
+   results: []
+ ---
+
+ <!-- This model card has been generated automatically according to the information the Trainer had access to. You
+ should probably proofread and complete it, then remove this comment. -->
+
+ # t5-end2end-question-generation
+
+ This model is a fine-tuned version of [t5-base](https://huggingface.co/t5-base) on the None dataset.
+ It achieves the following results on the evaluation set:
+ - Loss: 1.3875
+ - Rouge1: 29.8409
+ - Rouge2: 15.2583
+ - Rougel: 25.4802
+ - Rougelsum: 28.8023
+ - Gen Len: 18.9971
+ - Bleu: 1.8149
+ - Bleu 0: 71.9158
+ - Bleu 1: 46.3975
+ - Bleu 2: 31.3479
+ - Bleu 3: 20.236
+
+ ## Model description
+
+ More information needed
+
+ ## Intended uses & limitations
+
+ More information needed
+
+ ## Training and evaluation data
+
+ More information needed
+
+ ## Training procedure
+
+ ### Training hyperparameters
+
+ The following hyperparameters were used during training:
+ - learning_rate: 5e-05
+ - train_batch_size: 8
+ - eval_batch_size: 8
+ - seed: 42
+ - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
+ - lr_scheduler_type: linear
+ - num_epochs: 5
+
+ ### Training results
+
+ | Training Loss | Epoch | Step | Validation Loss | Rouge1 | Rouge2 | Rougel | Rougelsum | Gen Len | Bleu | Bleu 0 | Bleu 1 | Bleu 2 | Bleu 3 |
+ |:-------------:|:-----:|:-----:|:---------------:|:-------:|:-------:|:-------:|:---------:|:-------:|:------:|:-------:|:-------:|:-------:|:-------:|
+ | 1.4252 | 0.21 | 500 | 1.4638 | 29.5937 | 14.6438 | 25.1309 | 28.5076 | 18.9990 | 1.7595 | 70.9726 | 44.8789 | 29.8013 | 18.9402 |
+ | 1.3591 | 0.42 | 1000 | 1.4619 | 29.4017 | 14.7271 | 25.1139 | 28.3406 | 19.0 | 1.7286 | 70.9671 | 45.415 | 30.2413 | 19.1132 |
+ | 1.426 | 0.64 | 1500 | 1.4313 | 29.9163 | 15.0542 | 25.5098 | 28.852 | 19.0 | 1.8109 | 71.924 | 46.0312 | 30.8421 | 19.6842 |
+ | 1.5525 | 0.85 | 2000 | 1.4177 | 30.0353 | 15.2661 | 25.6495 | 28.9867 | 19.0 | 1.8387 | 72.1696 | 46.3888 | 31.1768 | 20.1203 |
+ | 1.5035 | 1.06 | 2500 | 1.4185 | 29.7649 | 15.1864 | 25.4353 | 28.738 | 19.0 | 1.7868 | 71.9618 | 46.6091 | 31.4797 | 20.209 |
+ | 1.4294 | 1.27 | 3000 | 1.4138 | 29.5473 | 14.877 | 25.1373 | 28.5195 | 18.9990 | 1.7516 | 71.3163 | 45.6707 | 30.7404 | 19.6335 |
+ | 1.4336 | 1.49 | 3500 | 1.4058 | 29.9003 | 15.213 | 25.4924 | 28.8375 | 19.0 | 1.799 | 71.8573 | 46.2609 | 31.2086 | 20.0675 |
+ | 1.4434 | 1.7 | 4000 | 1.3978 | 30.0046 | 15.2722 | 25.6091 | 28.9496 | 18.9990 | 1.839 | 72.2448 | 46.6283 | 31.463 | 20.2921 |
+ | 1.4285 | 1.91 | 4500 | 1.3984 | 30.0478 | 15.1083 | 25.4469 | 28.9337 | 18.9990 | 1.8247 | 71.6695 | 45.7508 | 30.7813 | 19.7828 |
+ | 1.3926 | 2.12 | 5000 | 1.3982 | 30.0837 | 15.4009 | 25.6203 | 29.0334 | 18.9990 | 1.8237 | 72.2626 | 46.662 | 31.5043 | 20.2789 |
+ | 1.369 | 2.33 | 5500 | 1.3980 | 29.9042 | 15.1828 | 25.4962 | 28.8323 | 18.9990 | 1.8064 | 71.8783 | 46.1411 | 31.0047 | 19.9691 |
+ | 1.3577 | 2.55 | 6000 | 1.3936 | 29.9335 | 15.2821 | 25.5855 | 28.9161 | 19.0 | 1.8099 | 71.8881 | 46.3101 | 31.3396 | 20.3185 |
+ | 1.3636 | 2.76 | 6500 | 1.3908 | 29.9512 | 15.2434 | 25.5476 | 28.9224 | 18.9995 | 1.8242 | 71.9772 | 46.3212 | 31.2688 | 20.1704 |
+ | 1.3799 | 2.97 | 7000 | 1.3900 | 29.9393 | 15.1658 | 25.4702 | 28.8729 | 18.9971 | 1.8055 | 71.9431 | 46.1286 | 30.9969 | 19.9389 |
+ | 1.3318 | 3.18 | 7500 | 1.3934 | 29.7982 | 15.132 | 25.3908 | 28.7333 | 18.9995 | 1.7908 | 71.7081 | 46.1832 | 31.1416 | 20.1409 |
+ | 1.3208 | 3.4 | 8000 | 1.3928 | 29.9378 | 15.1421 | 25.4586 | 28.8793 | 19.0 | 1.8258 | 71.7795 | 45.969 | 30.9173 | 19.9664 |
+ | 1.3135 | 3.61 | 8500 | 1.3888 | 29.9264 | 15.2179 | 25.5529 | 28.875 | 19.0 | 1.8363 | 71.9537 | 46.2706 | 31.2245 | 20.2624 |
+ | 1.323 | 3.82 | 9000 | 1.3868 | 29.8749 | 15.2251 | 25.4639 | 28.7949 | 18.9971 | 1.812 | 71.6918 | 46.1503 | 31.0437 | 19.9965 |
+ | 1.3325 | 4.03 | 9500 | 1.3868 | 29.8804 | 15.2658 | 25.4848 | 28.8238 | 18.9971 | 1.8105 | 71.9146 | 46.3617 | 31.2842 | 20.1447 |
+ | 1.296 | 4.24 | 10000 | 1.3882 | 29.941 | 15.28 | 25.5209 | 28.9109 | 18.9971 | 1.817 | 71.994 | 46.3801 | 31.216 | 20.0596 |
+ | 1.3027 | 4.46 | 10500 | 1.3883 | 29.8492 | 15.2017 | 25.4398 | 28.7911 | 18.9971 | 1.7994 | 71.8366 | 46.0939 | 30.9953 | 19.9115 |
+ | 1.3046 | 4.67 | 11000 | 1.3880 | 29.8538 | 15.2605 | 25.4897 | 28.8236 | 18.9971 | 1.8136 | 71.9285 | 46.3689 | 31.2969 | 20.1728 |
+ | 1.294 | 4.88 | 11500 | 1.3875 | 29.8409 | 15.2583 | 25.4802 | 28.8023 | 18.9971 | 1.8149 | 71.9158 | 46.3975 | 31.3479 | 20.236 |
+
+
+ ### Framework versions
+
+ - Transformers 4.27.1
+ - Pytorch 1.13.1+cu116
+ - Datasets 2.10.1
+ - Tokenizers 0.13.2
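
The card above records metrics and hyperparameters but leaves the usage sections empty. As a minimal sketch of how such a checkpoint is typically loaded with the Transformers APIs listed under "Framework versions": the repository id below is an assumption inferred from the committer and model name, and the "generate questions:" prompt prefix is the common end-to-end question-generation convention for T5 models, not something this commit specifies.

```python
# Minimal usage sketch, not part of the commit.
# Assumptions: the repo id (inferred from committer + model name) and the
# "generate questions:" prefix (usual end-to-end QG convention for T5).
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

model_id = "derek-thomas/t5-end2end-question-generation"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

text = "generate questions: Python is a programming language created by Guido van Rossum."
inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=512)
ids = model.generate(**inputs, max_length=64, num_beams=4)
print(tokenizer.decode(ids[0], skip_special_tokens=True))
```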
generation_config.json ADDED
@@ -0,0 +1,6 @@
+ {
+   "decoder_start_token_id": 0,
+   "eos_token_id": 1,
+   "pad_token_id": 0,
+   "transformers_version": "4.27.1"
+ }
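
This small file supplies the decoding defaults that `model.generate()` falls back to. As an illustration (not from the commit) of how these values map onto the Transformers `GenerationConfig` API, available since roughly v4.25 and recorded here as 4.27.1:

```python
# Sketch: recreating generation_config.json via the GenerationConfig API.
from transformers import GenerationConfig

gen_config = GenerationConfig(
    decoder_start_token_id=0,  # T5 starts decoding from <pad> (id 0)
    eos_token_id=1,            # </s>
    pad_token_id=0,            # <pad>
)
# save_pretrained writes a generation_config.json like the one above
# (the transformers_version field is filled in automatically on save).
gen_config.save_pretrained("t5-end2end-question-generation")
```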
logs/events.out.tfevents.1679323036.a4e382f4421a.1876.2 CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:90e8b2956f3e66ffcda9fe11dc5f949b0b6557125956d56f6880a472e5cecd1c
- size 26319
+ oid sha256:72d5f3e7a34a4611f8b3da7e92ead20d628ba55ae2cbd38c5f663942091bb0ac
+ size 26673
pytorch_model.bin CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:d2d958c17b469e232b6e417b69fd0da96be314d951bde4ff12e0403395f4423a
+ oid sha256:0da0516f0ba012abd3dd72f588051578774c3e9faa662c6791dc094a21178b77
  size 891702929
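
The binary artifacts above are stored in Git LFS, so the diffs touch only the three-line pointer files (spec version, object hash, byte size), not the weights themselves. Here the oid changes while the size stays at 891702929 bytes, consistent with retrained weights of the same architecture. A small illustrative parser for this pointer format (a hypothetical helper, not part of the repo):

```python
# Sketch: parsing a Git LFS pointer file like the ones diffed above.
def parse_lfs_pointer(text: str) -> dict:
    """Split each 'key value' line of an LFS pointer into a dict entry."""
    fields = {}
    for line in text.strip().splitlines():
        key, _, value = line.partition(" ")
        fields[key] = value
    return fields

pointer = """version https://git-lfs.github.com/spec/v1
oid sha256:0da0516f0ba012abd3dd72f588051578774c3e9faa662c6791dc094a21178b77
size 891702929"""
print(parse_lfs_pointer(pointer)["size"])  # 891702929
```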
special_tokens_map.json ADDED
@@ -0,0 +1,107 @@
+ {
+   "additional_special_tokens": [
+     "<extra_id_0>",
+     "<extra_id_1>",
+     "<extra_id_2>",
+     "<extra_id_3>",
+     "<extra_id_4>",
+     "<extra_id_5>",
+     "<extra_id_6>",
+     "<extra_id_7>",
+     "<extra_id_8>",
+     "<extra_id_9>",
+     "<extra_id_10>",
+     "<extra_id_11>",
+     "<extra_id_12>",
+     "<extra_id_13>",
+     "<extra_id_14>",
+     "<extra_id_15>",
+     "<extra_id_16>",
+     "<extra_id_17>",
+     "<extra_id_18>",
+     "<extra_id_19>",
+     "<extra_id_20>",
+     "<extra_id_21>",
+     "<extra_id_22>",
+     "<extra_id_23>",
+     "<extra_id_24>",
+     "<extra_id_25>",
+     "<extra_id_26>",
+     "<extra_id_27>",
+     "<extra_id_28>",
+     "<extra_id_29>",
+     "<extra_id_30>",
+     "<extra_id_31>",
+     "<extra_id_32>",
+     "<extra_id_33>",
+     "<extra_id_34>",
+     "<extra_id_35>",
+     "<extra_id_36>",
+     "<extra_id_37>",
+     "<extra_id_38>",
+     "<extra_id_39>",
+     "<extra_id_40>",
+     "<extra_id_41>",
+     "<extra_id_42>",
+     "<extra_id_43>",
+     "<extra_id_44>",
+     "<extra_id_45>",
+     "<extra_id_46>",
+     "<extra_id_47>",
+     "<extra_id_48>",
+     "<extra_id_49>",
+     "<extra_id_50>",
+     "<extra_id_51>",
+     "<extra_id_52>",
+     "<extra_id_53>",
+     "<extra_id_54>",
+     "<extra_id_55>",
+     "<extra_id_56>",
+     "<extra_id_57>",
+     "<extra_id_58>",
+     "<extra_id_59>",
+     "<extra_id_60>",
+     "<extra_id_61>",
+     "<extra_id_62>",
+     "<extra_id_63>",
+     "<extra_id_64>",
+     "<extra_id_65>",
+     "<extra_id_66>",
+     "<extra_id_67>",
+     "<extra_id_68>",
+     "<extra_id_69>",
+     "<extra_id_70>",
+     "<extra_id_71>",
+     "<extra_id_72>",
+     "<extra_id_73>",
+     "<extra_id_74>",
+     "<extra_id_75>",
+     "<extra_id_76>",
+     "<extra_id_77>",
+     "<extra_id_78>",
+     "<extra_id_79>",
+     "<extra_id_80>",
+     "<extra_id_81>",
+     "<extra_id_82>",
+     "<extra_id_83>",
+     "<extra_id_84>",
+     "<extra_id_85>",
+     "<extra_id_86>",
+     "<extra_id_87>",
+     "<extra_id_88>",
+     "<extra_id_89>",
+     "<extra_id_90>",
+     "<extra_id_91>",
+     "<extra_id_92>",
+     "<extra_id_93>",
+     "<extra_id_94>",
+     "<extra_id_95>",
+     "<extra_id_96>",
+     "<extra_id_97>",
+     "<extra_id_98>",
+     "<extra_id_99>"
+   ],
+   "eos_token": "</s>",
+   "pad_token": "<pad>",
+   "unk_token": "<unk>"
+ }
tokenizer.json ADDED
The diff for this file is too large to render. See raw diff
 
tokenizer_config.json ADDED
@@ -0,0 +1,111 @@
+ {
+   "additional_special_tokens": [
+     "<extra_id_0>",
+     "<extra_id_1>",
+     "<extra_id_2>",
+     "<extra_id_3>",
+     "<extra_id_4>",
+     "<extra_id_5>",
+     "<extra_id_6>",
+     "<extra_id_7>",
+     "<extra_id_8>",
+     "<extra_id_9>",
+     "<extra_id_10>",
+     "<extra_id_11>",
+     "<extra_id_12>",
+     "<extra_id_13>",
+     "<extra_id_14>",
+     "<extra_id_15>",
+     "<extra_id_16>",
+     "<extra_id_17>",
+     "<extra_id_18>",
+     "<extra_id_19>",
+     "<extra_id_20>",
+     "<extra_id_21>",
+     "<extra_id_22>",
+     "<extra_id_23>",
+     "<extra_id_24>",
+     "<extra_id_25>",
+     "<extra_id_26>",
+     "<extra_id_27>",
+     "<extra_id_28>",
+     "<extra_id_29>",
+     "<extra_id_30>",
+     "<extra_id_31>",
+     "<extra_id_32>",
+     "<extra_id_33>",
+     "<extra_id_34>",
+     "<extra_id_35>",
+     "<extra_id_36>",
+     "<extra_id_37>",
+     "<extra_id_38>",
+     "<extra_id_39>",
+     "<extra_id_40>",
+     "<extra_id_41>",
+     "<extra_id_42>",
+     "<extra_id_43>",
+     "<extra_id_44>",
+     "<extra_id_45>",
+     "<extra_id_46>",
+     "<extra_id_47>",
+     "<extra_id_48>",
+     "<extra_id_49>",
+     "<extra_id_50>",
+     "<extra_id_51>",
+     "<extra_id_52>",
+     "<extra_id_53>",
+     "<extra_id_54>",
+     "<extra_id_55>",
+     "<extra_id_56>",
+     "<extra_id_57>",
+     "<extra_id_58>",
+     "<extra_id_59>",
+     "<extra_id_60>",
+     "<extra_id_61>",
+     "<extra_id_62>",
+     "<extra_id_63>",
+     "<extra_id_64>",
+     "<extra_id_65>",
+     "<extra_id_66>",
+     "<extra_id_67>",
+     "<extra_id_68>",
+     "<extra_id_69>",
+     "<extra_id_70>",
+     "<extra_id_71>",
+     "<extra_id_72>",
+     "<extra_id_73>",
+     "<extra_id_74>",
+     "<extra_id_75>",
+     "<extra_id_76>",
+     "<extra_id_77>",
+     "<extra_id_78>",
+     "<extra_id_79>",
+     "<extra_id_80>",
+     "<extra_id_81>",
+     "<extra_id_82>",
+     "<extra_id_83>",
+     "<extra_id_84>",
+     "<extra_id_85>",
+     "<extra_id_86>",
+     "<extra_id_87>",
+     "<extra_id_88>",
+     "<extra_id_89>",
+     "<extra_id_90>",
+     "<extra_id_91>",
+     "<extra_id_92>",
+     "<extra_id_93>",
+     "<extra_id_94>",
+     "<extra_id_95>",
+     "<extra_id_96>",
+     "<extra_id_97>",
+     "<extra_id_98>",
+     "<extra_id_99>"
+   ],
+   "eos_token": "</s>",
+   "extra_ids": 100,
+   "model_max_length": 512,
+   "pad_token": "<pad>",
+   "special_tokens_map_file": null,
+   "tokenizer_class": "T5Tokenizer",
+   "unk_token": "<unk>"
+ }
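
Both tokenizer files carry T5's 100 `<extra_id_*>` sentinel tokens (the span-corruption placeholders from pretraining) alongside the standard `</s>`, `<pad>`, and `<unk>` tokens, with `extra_ids: 100` and `model_max_length: 512` inherited from t5-base. A small sanity-check sketch (an illustration, not part of the commit; it loads the base checkpoint, whose tokenizer should match the files above):

```python
# Sketch: verifying the special tokens declared in the two files above.
from transformers import AutoTokenizer

tok = AutoTokenizer.from_pretrained("t5-base")
print(tok.eos_token, tok.pad_token, tok.unk_token)  # </s> <pad> <unk>
print(len(tok.additional_special_tokens))           # 100 sentinel tokens
print(tok.additional_special_tokens[0])             # <extra_id_0>
print(tok.model_max_length)                         # 512
```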