Modeling Analysis of Modern Chinese Quantitative Word Constructions Based on Graph Neural Networks

By: Jianping Xu¹
¹Tianjin University of Finance and Economics Pearl River College, Tianjin, 300000, China

Abstract

As traditional media transitions to converged media, labeling technology for modern Chinese becomes increasingly important: labeling modern Chinese text not only facilitates organization and categorization, but also supports more accurate search and recommendation services for Internet users. This paper constructs a global classification model for multi-label text by combining a graph convolutional neural network (GCN), a multi-head attention mechanism, and a BERT pre-trained model, in order to model quantitative word constructions in modern Chinese. On the experimental datasets, adding BiGRU or BERT to the GCN significantly improves classification accuracy, and the BERT+GCN model proposed here outperforms the BiGRU+GCN model, verifying the effectiveness of the proposed text classification model. Moreover, the proposed method outperforms all other compared models on four datasets, improving on the BiHAM model by 1.31%, 0.98%, 1.44%, and 0.50% on Ohsumed, MR, R52, and R8, respectively. Applying the model shows that both “length” and “length and short” can collocate with quantifiers, that the two share some common collocating quantifiers, and that both exhibit two collocation positions with quantifiers, though the former shows this feature more prominently. This paper provides an analytical path for modeling quantity word constructions in modern Chinese.
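To make the abstract's architecture concrete, the sketch below shows one plausible way BERT-derived node features, graph convolution, and multi-head attention can be combined for multi-label classification. It is a minimal illustration, not the paper's implementation: the class names (`GCNLayer`, `BertGCNClassifier`), hidden sizes, and the use of an identity adjacency in the toy example are all assumptions made for demonstration.

```python
import torch
import torch.nn as nn


class GCNLayer(nn.Module):
    """One graph convolution in the Kipf-Welling style: H' = relu(A_hat @ H @ W)."""

    def __init__(self, in_dim: int, out_dim: int):
        super().__init__()
        self.linear = nn.Linear(in_dim, out_dim)

    def forward(self, x: torch.Tensor, adj_norm: torch.Tensor) -> torch.Tensor:
        # adj_norm: normalized adjacency matrix with self-loops, shape [N, N]
        return torch.relu(self.linear(adj_norm @ x))


class BertGCNClassifier(nn.Module):
    """Hypothetical pipeline: BERT node features -> 2 GCN layers -> multi-head attention -> label logits."""

    def __init__(self, bert_dim=768, hidden_dim=256, num_labels=52, num_heads=4):
        super().__init__()
        self.gcn1 = GCNLayer(bert_dim, hidden_dim)
        self.gcn2 = GCNLayer(hidden_dim, hidden_dim)
        self.attn = nn.MultiheadAttention(hidden_dim, num_heads, batch_first=True)
        self.classifier = nn.Linear(hidden_dim, num_labels)

    def forward(self, node_feats: torch.Tensor, adj_norm: torch.Tensor) -> torch.Tensor:
        # node_feats: document/word node embeddings (e.g. BERT [CLS] vectors), shape [N, bert_dim]
        h = self.gcn1(node_feats, adj_norm)
        h = self.gcn2(h, adj_norm)
        # self-attention over the graph nodes, treated as a single "sequence" of length N
        h_attn, _ = self.attn(h.unsqueeze(0), h.unsqueeze(0), h.unsqueeze(0))
        return self.classifier(h_attn.squeeze(0))  # per-node multi-label logits


# Toy usage: random node features and an identity adjacency stand in for real BERT
# embeddings and a real document/word co-occurrence graph.
num_nodes = 5
feats = torch.randn(num_nodes, 768)
adj = torch.eye(num_nodes)
logits = BertGCNClassifier()(feats, adj)
print(logits.shape)  # torch.Size([5, 52])
```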