We are excited to announce 22 recipients from the AWS Machine Learning Research Awards (now part of Amazon Research Awards) 2020 Q1/Q2 call-for-proposal cycles. The recipients, representing 21 universities in six countries, aim to develop open-source tools and conduct research that benefits the ML community at large, or to create impactful research using AWS ML solutions such as Amazon SageMaker, AWS AI Services, and Apache MXNet on AWS. We are also pleased to work with universities in China, Hong Kong, and Taiwan for the first time.
The following are the 2020 Q1/Q2 recipients:
| Recipient | University | Research Title | Award Year |
|---|---|---|---|
| Andrea Vedaldi | University of Oxford | Large-Scale Understanding of Self-Supervised Image Feature Representation Learning | 2020 Q1 |
| Chao Zhang | Georgia Institute of Technology | Enabling Pre-Trained Language Models for Open, Low-Resource Information Extraction | 2020 Q1 |
| Gennady Pekhimenko | University of Toronto | Efficient DNN Training at Scale: From Algorithms to Hardware | 2020 Q1 |
| Irwin King | The Chinese University of Hong Kong | Graph Neural Networks for Learning on Heterogeneous Graphs: Methods, Applications, and Tools | 2020 Q1 |
| Jason Hong | Carnegie Mellon University | Organizing Crowd Workers to Categorize Bias in ML Systems with Bias Bounties | 2020 Q1 |
| Jiawei Han | University of Illinois at Urbana-Champaign | Empower Heterogeneous Information Network with Label Efficient Graph Representation Learning | 2020 Q1 |
| Jonathan Tamir | The University of Texas at Austin | AI-driven Magnetic Resonance Imaging for Same-Day Point-of-Care Imaging and Diagnosis | 2020 Q1 |
| Michael Bronstein | Imperial College London | Geometric Deep Learning Model for Functional Protein Design | 2020 Q1 |
| Peng Gong | Tsinghua University | 21st Century Seasonal to Annual Global Land Cover and Land Use Dynamics: A Spatial-Temporal Cube Reconstruction Approach Using Amazon Web Services | 2020 Q1 |
| Ying Ding | The University of Texas at Austin | I-RadioDiagno: Human-Centered AI Medical Imaging Diagnosis Tool | 2020 Q1 |
| Yun-Nung (Vivian) Chen | National Taiwan University | Towards Robust Spoken Language Understanding | 2020 Q1 |
| Zhaoran Wang | Northwestern University | Provable Deep Reinforcement Learning in Real World: Efficient Exploration, Model-Based Learning, and Sim2Real | 2020 Q1 |
| Zhou Yu | University of California, Davis | Knowledge Augmented Dialog Systems with Computational and Sample Efficiency | 2020 Q1 |
| Anna Korhonen | University of Cambridge | AI-assisted Functional Genomics | 2020 Q2 |
| Hung-yi Lee | National Taiwan University | Speech Processing Decathlon | 2020 Q2 |
| Jonathan P. How | Massachusetts Institute of Technology | Fast Adaptation via Meta-Learning in Multiagent Reinforcement Learning | 2020 Q2 |
| Liang Zhao | Emory University | Distributed Large-scale Graph Deep Learning by Gradient-free Optimization | 2020 Q2 |
| Noah Snavely | Cornell University | Joint Reasoning over Images, Language, and 3D | 2020 Q2 |
| Qi (Rose) Yu | University of California, San Diego | Deep Relational Forecasting for Dynamic Graphs | 2020 Q2 |
| Xia Ning | The Ohio State University | Synthesizability-Guided Molecular Graph Generation via Deep Learning | 2020 Q2 |
| Xiangxiang Zeng | Hunan University | Graph Neural Networks for Drug Repositioning | 2020 Q2 |
| Yong Yu | Shanghai Jiao Tong University | Exploring Interaction Models on Graph Data and their Applications on Recommender Systems | 2020 Q2 |
Since 2017, the AWS Machine Learning Research Awards (MLRA) program has supported more than 190 research projects from 80 schools and research institutes in 15 countries, on topics such as ML algorithms, computer vision, natural language processing, medical research, neuroscience, social science, physics, and robotics.
In August 2020, MLRA and Amazon Research Awards (ARA) merged to make it easier for academic researchers to apply through a single submission system and a centrally managed website.
ARA is reviewing the proposals from the 2020 call for proposals, and decision letters will be sent out next month. Congratulations to the new recipients! We look forward to supporting your research.