- ACL-IJCNLP 2021. The product reviews summarization task aims to automatically produce a short summary for a set of reviews of a given product. Such summaries are expected to aggregate a range of different opinions in a concise, coherent and informative manner. This challenging task gives rise to two shortcomings in existing work. First, summarizers tend to favor generic content that appears in reviews for many different
- ACL-IJCNLP 2021. The behavior of deep neural networks can be inconsistent between different versions. Regressions during a model update are a common cause of concern that often outweighs the benefits in accuracy or efficiency gain. This work focuses on quantifying, reducing, and analyzing regression errors in NLP model updates. Using negative flip rate as a regression measure, we show that regression has a prevalent presence
- NAACL 2021 Workshop on Visually Grounded Interaction and Language (ViGIL); ACL Findings 2022. Interactive robots navigating photo-realistic environments need to be trained to effectively leverage and handle the dynamic nature of dialogue in addition to the challenges underlying vision-and-language navigation (VLN). In this paper, we present VISITRON, a multi-modal Transformer-based navigator better suited to the interactive regime inherent to Cooperative Vision-and-Dialog Navigation (CVDN). VISITRON
- ACL-IJCNLP 2021. A commonly observed problem with state-of-the-art abstractive summarization models is that the generated summaries can be factually inconsistent with the input documents. The fact that automatic summarization may produce plausible-sounding yet inaccurate summaries is a major concern that limits its wide application. In this paper we present an approach to address factual consistency in summarization
- ACL Findings 2021. In Natural Language Understanding (NLU), to facilitate Cross-Lingual Transfer Learning (CLTL), especially CLTL between distant languages, we integrate CLTL with Machine Translation (MT), and thereby propose a novel CLTL model named Translation Aided Language Learner (TALL). TALL is constructed as a standard transformer, where the encoder is a pre-trained multilingual language model. The training of TALL
Related content
- July 14, 2022. To become the interface for the Internet of things, conversational agents will need to learn on their own. Alexa has already started down that path.
- July 13, 2022. Four MIT professors are the recipients of the inaugural call for research projects.
- July 13, 2022. Allowing separate tasks to converge on their own schedules and using knowledge distillation to maintain performance improves accuracy.
- July 11, 2022. The SCOT science team used lessons from the past — and improved existing tools — to contend with “a peak that lasted two years”.
- July 08, 2022. Industry track chair and Amazon principal research scientist Rashmi Gangadharaiah on trends in industry papers and the challenges of building practical dialogue systems.
- July 08, 2022. New model sets a new standard in accuracy while enabling 60-fold speedups.