- IEEE Journal on Emerging and Selected Topics in Circuits and Systems (JETCAS), 2019: Large-scale machine learning (ML) systems such as the Alexa automatic speech recognition (ASR) system continue to improve with increasing amounts of manually transcribed training data. Instead of scaling manual transcription to impractical levels, we utilize semi-supervised learning (SSL) to learn acoustic models (AM) from the vast firehose of untranscribed audio data. Learning an AM from 1 million hours …
- NAACL 2019: Executable semantic parsing is the task of converting natural language utterances into logical forms that can be directly used as queries to get a response. We build a transfer learning framework for executable semantic parsing. We show that the framework is effective for Question Answering (Q&A) as well as for Spoken Language Understanding (SLU). We further investigate the case where a parser on a new …
- EMNLP 2019 Workshop on Neural Generation and Translation: Data availability is a bottleneck during early stages of development of new capabilities for intelligent artificial agents. We investigate the use of text generation techniques to augment the training data of a popular commercial artificial agent across categories of functionality, with the goal of faster development of new functionality. We explore a variety of encoder-decoder generative models for synthetic …
- NeurIPS 2019 Workshop on Conversational AI: An automated metric to evaluate dialogue quality is vital for optimizing data-driven dialogue management. The common approach of relying on explicit user feedback during a conversation is intrusive and sparse. Current models to estimate user satisfaction use limited feature sets and employ annotation schemes with limited generalizability to conversations spanning multiple domains. To address these gaps, …
- ICML 2019: In this paper, we investigate novel quantization approaches to reduce the memory and computational footprint of deep neural network (DNN) based keyword spotters (KWS). We propose a new method for KWS offline and online quantization, which we call dynamic quantization, where we quantize DNN weight matrices column-wise, using each column's exact individual min-max range, and the DNN layers' inputs and outputs …
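The column-wise min-max idea from the ICML entry above can be sketched in a few lines. This is a minimal illustration under assumptions not stated in the abstract (8-bit unsigned integers, an affine per-column mapping, and the function names are all hypothetical), not the paper's actual implementation:

```python
import numpy as np

def quantize_columns(W: np.ndarray, n_bits: int = 8):
    """Quantize each column of W using that column's exact min-max range."""
    levels = 2 ** n_bits - 1
    w_min = W.min(axis=0)   # per-column minimum
    w_max = W.max(axis=0)   # per-column maximum
    # Per-column scale; guard against constant columns (zero range).
    scale = np.where(w_max > w_min, (w_max - w_min) / levels, 1.0)
    q = np.round((W - w_min) / scale).astype(np.uint8)
    return q, scale, w_min

def dequantize_columns(q, scale, w_min):
    """Recover a float approximation of the original matrix."""
    return q.astype(np.float32) * scale + w_min

rng = np.random.default_rng(0)
W = rng.standard_normal((4, 3)).astype(np.float32)
q, scale, w_min = quantize_columns(W)
W_hat = dequantize_columns(q, scale, w_min)
max_err = np.abs(W - W_hat).max()
```

Because each column gets its own range, the rounding error is bounded by half of that column's scale, which is the advantage over a single min-max range shared by the whole matrix.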