- AAAI 2020: A considerable part of the success experienced by voice-controlled virtual assistants (VVAs) is due to the emotional and personalized experience they deliver, with humor being a key component in providing an engaging interaction. In this paper we describe methods used to improve the joke skill of a VVA through personalization. The first method, based on traditional NLP techniques, is robust and scalable.
- AAAI 2020: Knowledge distillation is typically conducted by training a small model (the student) to mimic a large and cumbersome model (the teacher). The idea is to compress the knowledge from the teacher by using its output probabilities as soft labels to optimize the student. However, when the teacher is considerably large, there is no guarantee that the internal knowledge of the teacher will be transferred into …
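The soft-label objective mentioned above is conventionally implemented as a KL divergence between temperature-softened teacher and student distributions (the standard Hinton-style recipe, not necessarily this paper's specific method). A minimal NumPy sketch, with illustrative function names:

```python
import numpy as np

def softmax(logits, T=1.0):
    """Temperature-scaled softmax over the last axis."""
    z = logits / T
    z = z - z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, T=2.0):
    """KL divergence KL(teacher || student) over softened distributions.

    The teacher's output probabilities act as soft labels; a higher
    temperature T exposes more of the relative probabilities the
    teacher assigns to incorrect classes.
    """
    p_t = softmax(teacher_logits, T)
    p_s = softmax(student_logits, T)
    return float(np.sum(p_t * (np.log(p_t) - np.log(p_s))))

# A student whose logits match the teacher's incurs zero loss.
teacher = np.array([4.0, 1.0, -2.0])
assert abs(distillation_loss(teacher.copy(), teacher)) < 1e-9
```

In practice this term is combined with the usual cross-entropy on hard labels, weighted by a mixing coefficient.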
- AAAI 2020: Task-oriented dialog agents provide a natural language interface for users to complete their goal. Dialog State Tracking (DST), which is often a core component of these systems, tracks the system's understanding of the user's goal throughout the conversation. To enable accurate multi-domain DST, the model needs to encode dependencies between past utterances and slot semantics and understand the dialog context …
- AAAI 2020: Machine Reading Comprehension (MRC) for question answering (QA), which aims to answer a question given the relevant context passages, is an important way to test the ability of intelligent systems to understand human language. Multiple-Choice QA (MCQA) is one of the most difficult tasks in MRC because it often requires more advanced reading comprehension skills such as logical reasoning, summarization, …
- AAAI 2020: Conversation structure is useful both for understanding the nature of conversation dynamics and for providing features for many downstream applications such as summarization of conversations. In this work, we define the problem of conversation structure modeling as identifying the parent utterance(s) to which each utterance in the conversation responds. Previous work usually took a pair of utterances …
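The pairwise formulation this abstract alludes to can be sketched as scoring each utterance against every earlier candidate and linking it to the best-scoring one. The word-overlap scorer and function names below are illustrative stand-ins, not the paper's model:

```python
def jaccard(a, b):
    """Toy pairwise scorer: word-overlap (Jaccard) similarity of two utterances."""
    wa, wb = set(a.lower().split()), set(b.lower().split())
    return len(wa & wb) / len(wa | wb) if wa | wb else 0.0

def predict_parents(utterances):
    """For each utterance, predict its parent as the earlier utterance
    with the highest pairwise score; the first utterance is the root.
    """
    parents = [None]  # the opening utterance has no parent
    for i in range(1, len(utterances)):
        scores = [jaccard(utterances[i], utterances[j]) for j in range(i)]
        parents.append(max(range(i), key=lambda j: scores[j]))
    return parents

thread = [
    "Has anyone tried the new build?",
    "Yes, the new build fixes the login bug.",
    "Unrelated: lunch at noon?",
    "The login bug is gone for me too.",
]
print(predict_parents(thread))
```

A learned model would replace the Jaccard scorer with a trained utterance-pair encoder, but the greedy argmax-over-candidates structure is the same.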