A Prompting-based Encoder-Decoder Approach to Intent Recognition and Slot Filling

Authors:

Panagiotis Tassias, Ion Androutsopoulos, Themos Stafylakis
Collaborators:

Omilia – Conversational Intelligence, Athens, Greece

Publication Date

November 2022

In recent years, there has been increasing interest in developing advanced conversational agents that help users accomplish specific goals. Natural Language Understanding (NLU), a subfield of Natural Language Processing, lies at the core of these task-oriented dialogue systems. In this thesis, we experiment with different ways of tackling NLU problems, focusing on the sub-tasks of Intent Recognition and Slot Filling. Through experiments on the publicly available ATIS and SNIPS datasets, we confirm that when explicit slot label alignment is available, fine-tuning large Language Models such as BERT remains the gold-standard approach. For Slot Filling, however, this method is often infeasible in real-world settings because human-annotated BIO tags are unavailable, and it also performs poorly in few-shot settings, where labeled data is limited. To overcome these limitations, we propose an encoder-decoder approach that incorporates the concept of prompting. Specifically, we use the T5 Language Model together with natural language templates that the model is prompted to fill in with the relevant information. This method achieves 98% intent accuracy and a 95.9% slot micro-F1 score on the SNIPS dataset. More importantly, it yields substantial performance improvements in few-shot settings and adapts well to new intents and domains, compared to a counterpart that does not use prompts.
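To make the template-filling idea more concrete, the sketch below shows how such a prompting setup could look with the Hugging Face transformers library. The slot names, template wording, prompt format, and the `t5-base` checkpoint are illustrative assumptions, not the exact configuration used in the thesis.

```python
# Minimal sketch of prompting a T5 encoder-decoder for slot filling,
# assuming Hugging Face transformers; templates and slot names are
# hypothetical examples in the spirit of the ATIS flight domain.
from transformers import T5ForConditionalGeneration, T5Tokenizer

model_name = "t5-base"  # placeholder; the thesis fine-tunes T5 on NLU data
tokenizer = T5Tokenizer.from_pretrained(model_name)
model = T5ForConditionalGeneration.from_pretrained(model_name)

utterance = "book a flight from athens to boston on friday"

# One natural language template per slot type; the decoder is prompted
# to complete each template with the relevant span (or e.g. "none").
templates = {
    "departure_city": "The departure city is",
    "arrival_city": "The arrival city is",
    "date": "The date of the flight is",
}

for slot, template in templates.items():
    # Concatenate the utterance and the template into a single prompt.
    prompt = f"utterance: {utterance} prompt: {template}"
    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=8)
    value = tokenizer.decode(outputs[0], skip_special_tokens=True)
    print(f"{slot}: {value}")
```

Because the targets are free-form text rather than per-token BIO tags, a setup along these lines needs no token-level alignment, which is what makes the approach attractive when such annotations are unavailable.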