-
CIKM 2023: Many public pre-trained word embeddings have been shown to encode different types of biases. Embeddings are often obtained by training on large pre-existing corpora, so the resulting biases can reflect unfair representations in the original data. Bias in this scenario is a challenging problem, since current mitigation techniques require knowing and understanding the existing biases in the …
-
Interspeech 2023: Neural text-to-speech systems are often optimized with L1/L2 losses, which make strong assumptions about the distribution of the target data space. To improve on those assumptions, normalizing flows and diffusion probabilistic models were recently proposed as alternatives. In this paper, we compare traditional L1/L2-based approaches to diffusion- and flow-based approaches for the tasks of prosody and …
-
ACL 2023 Workshop on Learning with Small Data: We investigate and refine denoising methods for the NER task on data that potentially contains extremely noisy labels from multiple sources. In this paper, we first summarize all possible noise types and noise-generation schemes, based on which we build a thorough evaluation system. We then pinpoint the bottleneck of current state-of-the-art denoising methods using our evaluation system. Correspondingly, we propose …
-
CIKM 2023: In conversational AI assistants, spoken-language-understanding (SLU) models are part of a complex pipeline composed of several modules working in harmony. Hence, an update to the SLU model must ensure improvements not only in model-specific metrics but also in the overall conversational assistant. Specifically, the impact on user-interaction quality metrics must be factored in, while integrating interactions with distal modules …
-
AutoML Conference 2023: Large language models (LLMs) have achieved considerable results on natural-language-understanding tasks. However, their sheer size causes large memory consumption or high latency at inference time, which renders deployment in hardware-constrained applications challenging. Neural architecture search (NAS) has been demonstrated to be a promising framework for automatically designing efficient neural network architectures. …