Commit `743445d` (parent: `1eb2cbf`) by Joint Laboratory of HIT and iFLYTEK Research (HFL): Update README.md
README.md (changed):

license: "apache-2.0"
---

## Chinese BERT with Whole Word Masking

To further accelerate Chinese natural language processing, we provide **Chinese pre-trained BERT with Whole Word Masking**.

**[Pre-Training with Whole Word Masking for Chinese BERT](https://arxiv.org/abs/1906.08101)**
Yiming Cui, Wanxiang Che, Ting Liu, Bing Qin, Ziqing Yang, Shijin Wang, Guoping Hu
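As a rough illustration (not the authors' implementation), whole word masking masks every WordPiece sub-token of a sampled word together, instead of masking sub-tokens independently. The sketch below assumes the standard WordPiece convention where a `##` prefix marks a continuation piece:

```python
# Illustrative sketch of whole word masking (WWM), assuming WordPiece
# tokens where a "##" prefix marks a continuation of the previous word.

def group_subwords(tokens):
    """Group WordPiece tokens into whole words ('##' marks a continuation)."""
    words = []
    for tok in tokens:
        if tok.startswith("##") and words:
            words[-1].append(tok)
        else:
            words.append([tok])
    return words

def whole_word_mask(tokens, word_index, mask_token="[MASK]"):
    """Replace every sub-token of the chosen whole word with the mask token."""
    masked = []
    for i, word in enumerate(group_subwords(tokens)):
        if i == word_index:
            masked.extend([mask_token] * len(word))
        else:
            masked.extend(word)
    return masked

# A word split into three pieces is masked as a unit, not piece by piece.
tokens = ["the", "man", "went", "to", "phil", "##am", "##mon"]
print(whole_word_mask(tokens, 4))
# ['the', 'man', 'went', 'to', '[MASK]', '[MASK]', '[MASK]']
```

In piece-level masking, only one of `phil`, `##am`, `##mon` might be masked, letting the model recover it from the other pieces; masking the whole word removes that shortcut.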