Question Answering NLU (QANLU) is an approach that maps the NLU task into question answering, leveraging pre-trained question-answering models to perform well in few-shot settings. For example, instead of training an intent classifier or a slot tagger, we can ask the model intent- and slot-related questions in natural language:
```
Context : I'm looking for a cheap flight to Boston.

Question: Is the user looking to book a flight?
Answer  : Yes

Question: Is the user asking about departure time?
Answer  : No

Question: What price is the user looking for?
Answer  : cheap

Question: Where is the user flying from?
Answer  : (empty)
```
Thus, by asking questions for each intent and slot in natural language, we can effectively construct an NLU hypothesis. For more details, please read the paper: *Language Model is All You Need: Natural Language Understanding as Question-Answering*.
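To get a feel for the querying step in code, here is a minimal sketch, assuming the Hugging Face `transformers` library and a publicly available SQuAD 2.0 checkpoint (`deepset/roberta-base-squad2`, chosen here for illustration; it is not necessarily the model used by QANLU). Since extractive QA models answer by selecting spans, one way to make yes/no intent questions answerable is to prepend the candidate answers to the context:

```python
from transformers import pipeline

# Any extractive QA model fine-tuned on SQuAD 2.0 works here; this checkpoint
# is an assumption for illustration, not the one used in the QANLU paper.
qa = pipeline("question-answering", model="deepset/roberta-base-squad2")

# Prepending "yes. no." gives the span-extraction model something to select
# when answering boolean intent questions.
context = "yes. no. I'm looking for a cheap flight to Boston."

questions = [
    "Is the user looking to book a flight?",
    "What price is the user looking for?",
    "Where is the user flying from?",
]

for question in questions:
    # handle_impossible_answer=True lets the model return an empty answer
    # when the slot is absent from the utterance (the SQuAD 2.0 "no answer" case).
    result = qa(question=question, context=context, handle_impossible_answer=True)
    print(f"{question} -> {result['answer']!r} (score {result['score']:.2f})")
```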
This repository contains code to transform MultiATIS++ NLU data (i.e., utterances and intent / slot annotations) into SQuAD 2.0 format question-answering data that can be used by QANLU. MultiATIS++ includes the original English version of ATIS and translations into eight additional languages: German, Spanish, French, Japanese, Hindi, Portuguese, Turkish, and Chinese.
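As a rough sketch of what that transformation looks like (the slot-to-question templates, helper function, and annotated example below are hypothetical and not the repository's actual code; only the output structure follows the real SQuAD 2.0 schema):

```python
import json

def to_squad2(utterance, slots, slot_questions, qid_prefix="atis"):
    """Convert one slot-annotated utterance into a SQuAD 2.0 paragraph."""
    qas = []
    for i, (slot, question) in enumerate(slot_questions.items()):
        value = slots.get(slot)
        if value is not None:
            answer_start = utterance.index(value)  # character offset into the context
            qas.append({
                "id": f"{qid_prefix}-{i}",
                "question": question,
                "is_impossible": False,
                "answers": [{"text": value, "answer_start": answer_start}],
            })
        else:
            # SQuAD 2.0 keeps unanswerable questions and flags them
            # instead of dropping them.
            qas.append({
                "id": f"{qid_prefix}-{i}",
                "question": question,
                "is_impossible": True,
                "answers": [],
            })
    return {"context": utterance, "qas": qas}

# Example slot annotations and question templates (made up for illustration).
utterance = "I'm looking for a cheap flight to Boston."
slots = {"cost_relative": "cheap", "toloc.city_name": "Boston"}
slot_questions = {
    "cost_relative": "What price is the user looking for?",
    "toloc.city_name": "Where is the user flying to?",
    "fromloc.city_name": "Where is the user flying from?",
}

paragraph = to_squad2(utterance, slots, slot_questions)
print(json.dumps({"data": [{"title": "MultiATIS++", "paragraphs": [paragraph]}],
                  "version": "v2.0"}, indent=2))
```

Note that absent slots (here, `fromloc.city_name`) become questions marked `is_impossible`, which is exactly what distinguishes SQuAD 2.0 from SQuAD 1.1 and lets the QA model learn to return empty answers for slots that do not appear in the utterance.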