- ACL 2023: In real-world systems, an important requirement for model updates is to avoid regressions in user experience caused by flips of previously correct classifications to incorrect ones. Multiple techniques for this have been proposed in the recent literature. In this paper, we apply one such technique, focal distillation, to model updates in a goal-oriented dialog system and assess its usefulness in practice.
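The abstract above names focal distillation as the regression-mitigation technique. A minimal sketch of the general idea, assuming the common formulation in which the distillation term is up-weighted on samples the old model classified correctly (the function name, weights `base_w`/`focus_w`, and scale `alpha` are illustrative, not taken from the paper):

```python
import numpy as np

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def focal_distillation_loss(new_logits, old_logits, labels,
                            alpha=1.0, base_w=0.0, focus_w=1.0):
    """Cross-entropy for the new model plus a KL distillation term that is
    focused (up-weighted) on samples the old model got right, discouraging
    flips of previously correct predictions."""
    p_new = softmax(new_logits)
    p_old = softmax(old_logits)
    n = len(labels)
    ce = -np.log(p_new[np.arange(n), labels]).mean()
    old_correct = (old_logits.argmax(-1) == labels).astype(float)
    w = base_w + focus_w * old_correct          # focal per-sample weights
    kl = (p_old * (np.log(p_old) - np.log(p_new))).sum(-1)
    return ce + alpha * (w * kl).mean()
```

With `base_w=0`, samples the old model got wrong contribute no distillation signal, so the new model is free to improve on them while being anchored to the old model's correct predictions.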
- Mitigating the burden of redundant datasets via batch-wise unique samples and frequency-aware losses (ACL 2023): Datasets used to train deep learning models in industrial settings often exhibit skewed distributions, with some samples repeated a large number of times. This paper presents a simple yet effective solution to reduce the burden of repeated computation on redundant datasets. Our approach eliminates duplicates at the batch level without altering the data distribution observed by the model, making…
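The core mechanism described in this abstract, collapsing duplicates within a batch while keeping the effective data distribution unchanged, can be sketched as follows. This is a simplified illustration under the assumption that the loss of each unique sample is re-weighted by its duplicate count (the helper names are hypothetical):

```python
import numpy as np

def dedup_batch(batch):
    """Collapse exact duplicates within a batch into unique samples plus
    per-sample frequencies. Insertion order of first occurrences is kept."""
    counts = {}
    for x in batch:
        counts[x] = counts.get(x, 0) + 1
    samples = list(counts)
    freqs = np.array([counts[s] for s in samples], dtype=float)
    return samples, freqs

def frequency_aware_loss(per_sample_losses, freqs):
    """Weight each unique sample's loss by its duplicate count, so the
    result equals the mean loss over the original redundant batch."""
    return float((freqs * per_sample_losses).sum() / freqs.sum())
```

Because the frequency weights reproduce the mean over the redundant batch exactly, forward/backward passes are only paid once per unique sample while the model still "sees" the original distribution.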
- The Web Conference Workshop on Interactive and Scalable Information Retrieval Methods for eCommerce (ISIR-eCom), 2023: Product search for online shopping should be season-aware, i.e., presenting seasonally relevant products to customers. In this paper, we propose a simple yet effective solution to improve seasonal relevance in product search by incorporating seasonality into language models for semantic matching. We first identify seasonal queries and products by analyzing implicit seasonal contexts through time-series…
- ACL 2023: Large language models (LLMs) are known to memorize significant portions of their training data. Parts of this memorized content have been shown to be extractable by simply querying the model, which poses a privacy risk. We present a novel approach which uses prompt-tuning to control the extraction rates of memorized content in LLMs. We present two prompt training strategies to increase and decrease extraction…
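The prompt-tuning mechanism this abstract relies on prepends trainable continuous vectors to the input embeddings while the LLM's own weights stay frozen; only those prompt vectors are updated toward the chosen objective (here, raising or lowering extraction rates). A minimal sketch of the prepending step, with stand-in dimensions and no real LLM (all names and sizes are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
d_model, n_prompt, seq_len = 16, 4, 10

# Trainable soft-prompt vectors: the ONLY parameters updated during tuning;
# the frozen LLM's weights are never touched.
soft_prompt = rng.normal(scale=0.02, size=(n_prompt, d_model))

def embed_with_prompt(token_embeddings, prompt):
    """Prepend the soft prompt to the embedded input tokens, so the model
    attends over [prompt; tokens] at every layer."""
    return np.concatenate([prompt, token_embeddings], axis=0)

tokens = rng.normal(size=(seq_len, d_model))   # stand-in for an embedded query
inputs = embed_with_prompt(tokens, soft_prompt)
```

Training then backpropagates the extraction-control objective into `soft_prompt` alone, which is what makes the approach cheap enough to attach or remove per deployment.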
- The Web Conference 2023: This paper investigates the cross-lingual temporal knowledge graph reasoning problem, which aims to facilitate reasoning on Temporal Knowledge Graphs (TKGs) in low-resource languages by transferring knowledge from TKGs in high-resource ones. The ability to distill knowledge across TKGs in different languages becomes increasingly crucial, in light of the unsatisfying performance of existing reasoning methods on those severely…
Related content
- November 18, 2021: In Conversation Mode, Alexa detects device-directed speech without the need for the wake word.
- November 08, 2021: Combining elastic weight consolidation and data mixing yields better trade-offs between performance on old and new tasks.
- November 05, 2021: Natural-language understanding and question answering are areas of focus, with additional topics ranging from self-learning to text summarization.
- November 04, 2021: Amazon's Georgiana Dinu on current challenges in machine translation.
- October 11, 2021: Take a behind-the-scenes look at the unique challenges the engineering teams faced, and how they used scientific research to drive fundamental innovation.
- October 04, 2021: University team application deadline is October 31, 2021.