Comparative Study of Inductive Graph Neural Network Models for Text Classification
EasyChair Preprint 9135, 5 pages • Date: October 26, 2022

Abstract
Among the many methods proposed for text classification, transformers and graph neural networks (GNNs) have gained popularity recently. GNN-based models are either transductive or inductive. Transductive models such as TextGCN fail to scale to larger datasets because they convert the entire corpus into a single graph. Inductive models were introduced to address this: they convert each individual document into its own graph, which is then fed to the model for classification. In this paper, a comparative study of three inductive GNN models, namely TextING, In-GCN, and In-GAT, is presented. The study shows that In-GAT gave better results than the other two models. It is also shown that the message-passing mechanism does not have an effect on model performance, and that the entropy loss value depends on the size of the dataset and the model used.

Keyphrases: entropy, gated graph recurrent unit, graph attention network, graph convolutional network, inductive model, text classification
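To make the inductive setup described above concrete, the following is a minimal sketch (not the paper's implementation) of converting a single document into a word co-occurrence graph and classifying it with a small graph attention network, assuming PyTorch and PyTorch Geometric. The window size, embedding dimension, layer widths, and the `document_to_graph` and `InGAT` names are illustrative choices, not values or code from the paper.

```python
# Illustrative sketch of an inductive GNN text classifier:
# each document becomes its own graph, so unseen documents can be
# classified without rebuilding a corpus-level graph.
import torch
import torch.nn.functional as F
from torch_geometric.data import Data
from torch_geometric.nn import GATConv, global_mean_pool


def document_to_graph(tokens, vocab, window_size=3):
    """Build a per-document graph: nodes are the document's unique words,
    edges connect words that co-occur within a sliding window."""
    words = sorted(set(tokens))
    idx = {w: i for i, w in enumerate(words)}
    edges = set()
    for i, w in enumerate(tokens):
        for j in range(i + 1, min(i + window_size, len(tokens))):
            u, v = idx[w], idx[tokens[j]]
            if u != v:
                edges.add((u, v))
                edges.add((v, u))  # undirected: add both directions
    edge_index = torch.tensor(sorted(edges), dtype=torch.long).t().contiguous()
    # Node features are vocabulary indices; they are embedded inside the model.
    x = torch.tensor([vocab[w] for w in words], dtype=torch.long)
    return Data(x=x, edge_index=edge_index)


class InGAT(torch.nn.Module):
    """Small GAT classifier operating on per-document graphs."""

    def __init__(self, vocab_size, emb_dim, hidden, num_classes, heads=4):
        super().__init__()
        self.emb = torch.nn.Embedding(vocab_size, emb_dim)
        self.gat1 = GATConv(emb_dim, hidden, heads=heads)
        self.gat2 = GATConv(hidden * heads, hidden, heads=1)
        self.out = torch.nn.Linear(hidden, num_classes)

    def forward(self, data):
        x = self.emb(data.x)
        x = F.elu(self.gat1(x, data.edge_index))
        x = F.elu(self.gat2(x, data.edge_index))
        # Single-document sketch: every node belongs to the same graph.
        batch = torch.zeros(x.size(0), dtype=torch.long)
        x = global_mean_pool(x, batch)  # read out one vector per document
        return self.out(x)


# Usage: classify one unseen document (the inductive setting).
vocab = {"graph": 0, "neural": 1, "networks": 2, "classify": 3, "text": 4}
doc = ["graph", "neural", "networks", "classify", "text"]
model = InGAT(vocab_size=len(vocab), emb_dim=16, hidden=8, num_classes=2)
logits = model(document_to_graph(doc, vocab))
loss = F.cross_entropy(logits, torch.tensor([1]))  # cross-entropy training loss
```

Swapping the `GATConv` layers for `GCNConv` or a gated graph recurrent unit would give GCN- or TextING-style variants of the same per-document pipeline.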