Draw Me a Flower: Processing and Grounding Abstraction in Natural Language

Royi Lachmy, Valentina Pyatkin, Avshalom Manevich, Reut Tsarfaty

Research output: Contribution to journal › Article › peer-review

Abstract

Abstraction is a core tenet of human cognition and communication. When composing natural language instructions, humans naturally evoke abstraction to convey complex procedures in an efficient and concise way. Yet, interpreting and grounding abstraction expressed in NL has not yet been systematically studied in NLP, with no accepted benchmarks specifically eliciting abstraction in NL. In this work, we set the foundation for a systematic study of processing and grounding abstraction in NLP. First, we deliver a novel abstraction elicitation method and present HEXAGONS, a 2D instruction-following game. Using HEXAGONS we collected over 4k naturally occurring visually-grounded instructions rich with diverse types of abstractions. From these data, we derive an instruction-to-execution task and assess different types of neural models. Our results show that contemporary models and modeling practices are substantially inferior to human performance, and that model performance is inversely correlated with the level of abstraction, showing less satisfying performance on higher levels of abstraction. These findings are consistent across models and setups, confirming that abstraction is a challenging phenomenon deserving further attention and study in NLP/AI research.

Original language: English
Pages (from-to): 1341-1356
Number of pages: 16
Journal: Transactions of the Association for Computational Linguistics
Volume: 10
DOIs
State: Published - 28 Nov 2022
Externally published: Yes

Bibliographical note

Publisher Copyright:
© 2022 Association for Computational Linguistics.
