Customer-obsessed science
- November 6, 2025: A new approach to reducing carbon emissions reveals previously hidden emission “hotspots” within value chains, helping organizations make more detailed and dynamic decisions about their future carbon footprints.
Featured news
- U2BigData 2024: This paper introduces a Context-Aware and User Intent-Aware follow-up Question Generation (CA-UIA-QG) method in multi-turn conversational settings. Our CA-UIA-QG model is designed to simultaneously consider the evolving context of a conversation and identify user intent. By integrating these aspects, it generates relevant follow-up questions, which can better mimic user behavior and align well with users …
- 2024: Abductive reasoning is the process of making educated guesses to provide explanations for observations. Although many applications require the use of knowledge for explanations, the utilization of abductive reasoning in conjunction with structured knowledge, such as a knowledge graph, remains largely unexplored. To fill this gap, this paper introduces the task of complex logical hypothesis generation …
- EPTC 2024: Plastic encapsulation is a key feature of System-in-Package (SiP) technology, as it provides robust mechanical protection and structural support for all the electronic components enclosed within the package. This allows a highly compact design with minimal component-to-component spacing without compromising long-term reliability and performance. However, as the density and complexity of SiP modules continue …
- 2024: In-context learning (ICL) adapts Large Language Models (LLMs) to new tasks without requiring any parameter updates, using only a few annotated examples as input. In this work, we investigate selective annotation for ICL, where there is a limited budget for annotating examples, similar to low-budget active learning (AL). Although uncertainty-based selection is unreliable with little annotated data, we present COVERICL …
- LoG 2024: Graph clustering on text-attributed graphs (TAGs), i.e., graphs that include natural language text as additional node information, is typically performed using graph neural networks (GNNs), which forgo the raw text in favor of embeddings. While GNN methods ensure scalability and effectively leverage graph topology, text attributes contain rich information that can be exploited using large language models (LLMs) …
Collaborations
Whether you're a faculty member or a student, there are a number of ways you can engage with Amazon.