SentiBERT: Pre-training Language Model Combining Sentiment Information

Pre-training language models on large-scale unsupervised corpora has attracted growing attention from researchers in natural language processing. Existing models mainly extract the semantic and structural features of text during the pre-training stage. Aiming at sentiment tasks and complex emotional features, this paper proposes a pre-training method focused on learning sentiment features, built on the recent pre-training language model BERT (bidirectional encoder representations from transformers). In the further pre-training stage, the paper improves BERT's pre-training task with the help of a sentiment dictionary.
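To make the dictionary-guided step concrete, the following is a minimal sketch (not the authors' released code) of how masking could be biased toward sentiment words: tokens that appear in a sentiment lexicon are replaced by the mask token and their polarity is kept as a prediction target. The lexicon contents, the decision to mask every lexicon hit, and the -100 "ignore" convention are illustrative assumptions.

from transformers import BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")

# Hypothetical sentiment dictionary mapping word -> polarity label (1 = positive, 0 = negative).
SENTIMENT_LEXICON = {"great": 1, "awful": 0, "boring": 0, "delightful": 1}

def mask_sentiment_words(text):
    """Tokenize text and mask tokens found in the sentiment lexicon."""
    tokens = tokenizer.tokenize(text)
    input_tokens, polarity_labels = [], []
    for tok in tokens:
        if tok in SENTIMENT_LEXICON:
            input_tokens.append(tokenizer.mask_token)      # hide the sentiment word
            polarity_labels.append(SENTIMENT_LEXICON[tok])  # keep its polarity as the target
        else:
            input_tokens.append(tok)
            polarity_labels.append(-100)                    # ignored by the training loss
    return input_tokens, polarity_labels

print(mask_sentiment_words("The plot was boring but the acting was delightful."))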

At the same time, this paper uses a context-based word sentiment prediction task that classifies the sentiment of the masked words, so that the model acquires a textual representation biased towards sentiment features. Finally, fine-tuning is performed on small labeled data sets. Experimental results show that, compared with the original BERT model, accuracy on sentiment tasks improves by about 1 percentage point, and stronger results are achieved on small training sets.
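A minimal sketch of the context-based word sentiment prediction head, under the same assumptions as above: BERT encodes the partially masked sentence and a linear classifier predicts the polarity of each masked sentiment word; positions labeled -100 are skipped by the loss. The two-class output size and the use of a single linear layer are assumptions, not details from the paper.

import torch.nn as nn
from transformers import BertModel

class WordSentimentHead(nn.Module):
    """BERT encoder plus a per-token classifier for masked-word polarity."""
    def __init__(self, model_name="bert-base-uncased", num_polarities=2):
        super().__init__()
        self.bert = BertModel.from_pretrained(model_name)
        self.classifier = nn.Linear(self.bert.config.hidden_size, num_polarities)

    def forward(self, input_ids, attention_mask, polarity_labels):
        hidden = self.bert(input_ids=input_ids,
                           attention_mask=attention_mask).last_hidden_state
        logits = self.classifier(hidden)                  # polarity logits per token
        loss = nn.functional.cross_entropy(               # -100 positions are ignored
            logits.view(-1, logits.size(-1)),
            polarity_labels.view(-1),
            ignore_index=-100,
        )
        return loss, logits

After this further pre-training step, the same encoder can be fine-tuned on a small labeled sentiment data set in the usual BERT fashion.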
