Too much of product information: Don’t worry, let’s look for evidence!
2023
Product question answering (PQA) aims to provide instant responses to customer questions posted on shopping message boards, social media, brand websites, and retail stores. In this paper, we propose a distantly supervised solution that answers customer questions using product information. Auto-answering questions from product information poses two main challenges: (i) labelled data is not readily available and (ii) lengthy product information requires attending to various parts of the text to answer the question. To this end, we first propose a novel distant-supervision-based NLI model to prepare training data without any manual effort. To deal with lengthy context, we factorize answer generation into two sub-problems: given product information, the model first extracts evidence spans relevant to the question, and then leverages those spans to generate the answer. Further, we propose two novelties in the fine-tuning approach: (i) we jointly fine-tune the model for both tasks in an end-to-end manner and show that this outperforms standard multitask fine-tuning; (ii) we introduce an auxiliary contrastive loss for evidence extraction. We show that the combination of these two ideas achieves an absolute improvement of 6% in accuracy (human evaluation) over baselines.
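To make the joint objective concrete, here is a minimal, illustrative sketch of how evidence extraction, answer generation, and an auxiliary contrastive loss might be combined into a single end-to-end training signal. All module names, dimensions, and the InfoNCE-style formulation of the contrastive term are assumptions for illustration, not the paper's exact architecture.

```python
# Illustrative sketch (not the paper's implementation): joint fine-tuning of
# evidence extraction and answer generation with an auxiliary contrastive loss.
import torch
import torch.nn as nn
import torch.nn.functional as F

class JointPQAModel(nn.Module):
    """Toy stand-in for a shared encoder feeding two heads."""
    def __init__(self, hidden=64, vocab=1000):
        super().__init__()
        self.span_scorer = nn.Linear(hidden, 1)    # scores each candidate evidence span
        self.generator = nn.Linear(hidden, vocab)  # stand-in for a seq2seq decoder head

    def forward(self, span_reprs, answer_reprs):
        span_logits = self.span_scorer(span_reprs).squeeze(-1)  # (batch, n_spans)
        token_logits = self.generator(answer_reprs)              # (batch, ans_len, vocab)
        return span_logits, token_logits

def contrastive_evidence_loss(span_reprs, evidence_mask, temperature=0.1):
    """InfoNCE-style auxiliary loss: pull representations of gold-evidence spans
    together and away from non-evidence spans (illustrative formulation)."""
    # Mean of gold-evidence span representations acts as the positive anchor.
    anchor = (span_reprs * evidence_mask.unsqueeze(-1)).sum(1) / evidence_mask.sum(1, keepdim=True).clamp(min=1)
    sims = F.cosine_similarity(anchor.unsqueeze(1), span_reprs, dim=-1) / temperature  # (batch, n_spans)
    log_probs = F.log_softmax(sims, dim=-1)
    # Maximize probability mass assigned to the gold-evidence spans.
    pos_log_prob = (log_probs * evidence_mask).sum(1) / evidence_mask.sum(1).clamp(min=1)
    return -pos_log_prob.mean()

# Toy end-to-end step with random data (in the full model, span/answer
# representations would come from a shared pretrained encoder).
batch, n_spans, ans_len, hidden, vocab = 2, 8, 5, 64, 1000
model = JointPQAModel(hidden, vocab)
span_reprs = torch.randn(batch, n_spans, hidden, requires_grad=True)
answer_reprs = torch.randn(batch, ans_len, hidden)
evidence_mask = torch.zeros(batch, n_spans)
evidence_mask[:, :2] = 1.0                          # first two spans are gold evidence
gold_answer = torch.randint(0, vocab, (batch, ans_len))

span_logits, token_logits = model(span_reprs, answer_reprs)
extraction_loss = F.binary_cross_entropy_with_logits(span_logits, evidence_mask)
generation_loss = F.cross_entropy(token_logits.view(-1, vocab), gold_answer.view(-1))
loss = generation_loss + extraction_loss + contrastive_evidence_loss(span_reprs, evidence_mask)
loss.backward()
print(float(loss))
```

The key design point the sketch illustrates is that all three terms are summed into one loss and backpropagated together, so evidence extraction and answer generation are optimized jointly rather than as separate multitask objectives.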