RMHAN: Random Multi-Hierarchical Attention Network with RAG-LLM-Based Sentiment Analysis Using Text Reviews.
Social media networks currently produce large quantities of user-generated data. To understand people's views and sentiment tendencies toward a commodity or an event in a timely manner, it is essential to conduct sentiment analysis (SA) on the opinions users express. Longer texts contain varied content, and the correlations among their words are more complicated than in short texts. To bridge this gap, a random multi-hierarchical attention network (RMHAN) is introduced for SA on text reviews. First, the input review is passed to Bidirectional Encoder Representations from Transformers (BERT) tokenization, which breaks the text into individual tokens, yielding output-1. In parallel, the input review is passed to a retrieval-augmented generation large language model (RAG-LLM) for recognizing, translating, predicting, or generating text or additional content, yielding output-2. The tokenized words are then passed to the feature extraction phase. Finally, SA is performed with RMHAN, a combination of random multimodel deep learning (RMDL) and a hierarchical attention network (HAN), whose layers are modified using a Taylor network with a forward methodology. RMHAN achieves an accuracy of 91.90%, a precision of 91.70%, and an F1-score of 89.10%. [ABSTRACT FROM AUTHOR]
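To illustrate the attention-pooling idea at the core of a hierarchical attention network (the HAN component named above), the following is a minimal pure-Python sketch; the function names `softmax` and `attend` and the toy vectors are illustrative assumptions, not the paper's implementation, which additionally combines RMDL and Taylor-modified layers.

```python
import math

def softmax(scores):
    # Numerically stable softmax over raw attention scores.
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def attend(vectors, scores):
    # Attention pooling: a weighted sum of word vectors, where the
    # weights are the softmax of per-word relevance scores. HAN applies
    # this once over words (sentence vector) and again over sentences
    # (document vector).
    weights = softmax(scores)
    dim = len(vectors[0])
    return [sum(w * v[d] for w, v in zip(weights, vectors)) for d in range(dim)]

# Toy example: three 2-d "word" vectors with hypothetical relevance scores.
words = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
sentence_vec = attend(words, [2.0, 0.5, 1.0])
```

In a full HAN, the scores themselves come from a learned context vector rather than being supplied by hand; this sketch only shows the pooling step.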
Copyright of International Journal of Computational Intelligence & Applications is the property of World Scientific Publishing Company and its content may not be copied or emailed to multiple sites without the copyright holder's express written permission. Additionally, content may not be used with any artificial intelligence tools or machine learning technologies. However, users may print, download, or email articles for individual use. This abstract may be abridged. No warranty is given about the accuracy of the copy. Users should refer to the original published version of the material for the full abstract. (Copyright applies to all Abstracts.)