TL;DR: We propose controllable counterfactuals (CoCo) to evaluate dialogue state tracking (DST) models on novel scenarios, revealing significant performance drops of up to 30.8% for state-of-the-art DST models. Using CoCo for…
We are proud to announce the 2020 winners of our Salesforce AI Research Grant. Each of our winners will receive a $50K grant to advance their work and help us shape the future of AI.
TL;DR: We find that current self-supervised learning approaches suffer from poor visual grounding and receive improper supervisory signals when trained on complex scene images. We introduce CAST to improve visual grounding during…
This year marks the 34th annual conference on Neural Information Processing Systems ([NeurIPS](https://neurips.cc/)), reimagined for the first time ever in a fully virtual format. NeurIPS is a leading conference in the area…
TL;DR: We propose a new semi-supervised learning method which achieves state-of-the-art performance by learning jointly-evolved class probabilities and image representations. What are the existing semi-supervised learning methods? Semi-supervised learning aims to leverage…
This year marks the 24th annual Empirical Methods in Natural Language Processing (EMNLP) conference, reimagined for the first time ever in a fully virtual format. EMNLP is a leading conference in the area…
In recent years, the natural language processing (NLP) community has seen the development of increasingly powerful language models [1, 2], capable of generating textual output that is indistinguishable from human-written text. This includes…
TL;DR: We theoretically analyze differentiable architecture search (DARTS) to understand the role and impact of skip connections, which inspires a new method for Neural Architecture Search (NAS) using group-structured sparse gates…