Journal of Artificial Intelligence and Data Science, Volume 1, Issue 2

Performance Trade-Off for Bert Based Multi-Domain Multilingual Chatbot Architectures

Authors : Davut Emre TAŞAR, Şükrü OZAN, Seçilay KUTAL, Oğuzhan ÖLMEZ, Semih GÜLÜM, Fatih AKCA, Ceren BELHAN
Pages : 144-149
Publication Date : 2021-12-30
Article Type : Research Paper
Abstract : Text classification is a natural language processing (NLP) problem that aims to classify previously unseen texts. In this study, the Bidirectional Encoder Representations from Transformers (BERT) architecture is preferred for text classification. The classification is aimed explicitly at a chatbot that can give automated responses to website visitors' queries. A single BERT model is trained to replace multiple separate models for different chatbots on a server, reducing the need for RAM and storage. Moreover, since a pre-trained multilingual BERT model is preferred, the system further reduces resource requirements and handles multiple chatbots in multiple languages simultaneously. The model determines a class for a given input text. The classes correspond to specific answers in a database, from which the bot selects an answer and replies. For multiple chatbots, a special masking operation is performed to restrict the selection to the answer bank of the corresponding chatbot. We tested the proposed model on 13 simultaneous classification problems over a data set in three different languages, Turkish, English, and German, with 333 classes. We report the accuracies for the individually trained models and the proposed model, together with the savings in system resources.
Keywords : BERT, classification, chatbot, memory gain, multi-domain, multilingual
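
The masking operation described in the abstract can be illustrated with a short, hypothetical sketch. This is not the authors' code; the chatbot names, class-index assignments, and function names below are assumptions for illustration only. The idea is that a single shared classifier produces logits over all 333 classes, and classes belonging to other chatbots are suppressed before selecting the answer.

    # Hypothetical sketch of per-chatbot answer masking (not the authors' code).
    import torch

    NUM_CLASSES = 333  # total answer classes shared across all chatbots (from the abstract)

    # Illustrative mapping: each chatbot owns a subset of the shared class indices.
    chatbot_class_ids = {
        "bot_tr_example": [0, 1, 2, 3],
        "bot_en_example": [4, 5, 6],
        # ... one entry per chatbot
    }

    def select_answer(logits: torch.Tensor, bot_name: str) -> int:
        """Pick the best class for `bot_name`, ignoring classes owned by other chatbots.

        `logits` is the shared model's output of shape (NUM_CLASSES,).
        """
        mask = torch.full_like(logits, float("-inf"))
        ids = torch.tensor(chatbot_class_ids[bot_name])
        mask[ids] = 0.0  # keep only this chatbot's classes
        return int(torch.argmax(logits + mask))

    # Usage (assuming a fine-tuned multilingual BERT classifier and tokenizer):
    # logits = model(**tokenizer(text, return_tensors="pt")).logits[0]
    # answer_class = select_answer(logits, "bot_en_example")

Because the additive -inf mask removes other chatbots' classes from the argmax, one multilingual model can serve every chatbot while each bot still replies only from its own answer bank.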


* There may have been changes to the journal, article, conference, book, preprint, etc. information. Therefore, it is advisable to follow the information on the official page of the source. The information here is shared for informational purposes only. IAD is not responsible for incorrect or missing information.

