TY - JOUR
T1 - Draw Me a Flower
T2 - Processing and Grounding Abstraction in Natural Language
AU - Lachmy, Royi
AU - Pyatkin, Valentina
AU - Manevich, Avshalom
AU - Tsarfaty, Reut
N1 - Publisher Copyright:
© 2022 Association for Computational Linguistics.
PY - 2022/11/28
Y1 - 2022/11/28
N2 - Abstraction is a core tenet of human cognition and communication. When composing natural language instructions, humans naturally evoke abstraction to convey complex procedures in an efficient and concise way. Yet, interpreting and grounding abstraction expressed in NL has not yet been systematically studied in NLP, with no accepted benchmarks specifically eliciting abstraction in NL. In this work, we set the foundation for a systematic study of processing and grounding abstraction in NLP. First, we deliver a novel abstraction elicitation method and present HEXAGONS, a 2D instruction-following game. Using HEXAGONS we collected over 4k naturally occurring visually-grounded instructions rich with diverse types of abstractions. From these data, we derive an instruction-to-execution task and assess different types of neural models. Our results show that contemporary models and modeling practices are substantially inferior to human performance, and that model performance is inversely correlated with the level of abstraction, showing less satisfying performance on higher levels of abstraction. These findings are consistent across models and setups, confirming that abstraction is a challenging phenomenon deserving further attention and study in NLP/AI research.
AB - Abstraction is a core tenet of human cognition and communication. When composing natural language instructions, humans naturally evoke abstraction to convey complex procedures in an efficient and concise way. Yet, interpreting and grounding abstraction expressed in NL has not yet been systematically studied in NLP, with no accepted benchmarks specifically eliciting abstraction in NL. In this work, we set the foundation for a systematic study of processing and grounding abstraction in NLP. First, we deliver a novel abstraction elicitation method and present HEXAGONS, a 2D instruction-following game. Using HEXAGONS we collected over 4k naturally occurring visually-grounded instructions rich with diverse types of abstractions. From these data, we derive an instruction-to-execution task and assess different types of neural models. Our results show that contemporary models and modeling practices are substantially inferior to human performance, and that model performance is inversely correlated with the level of abstraction, showing less satisfying performance on higher levels of abstraction. These findings are consistent across models and setups, confirming that abstraction is a challenging phenomenon deserving further attention and study in NLP/AI research.
UR - http://www.scopus.com/inward/record.url?scp=85143087325&partnerID=8YFLogxK
U2 - 10.1162/tacl_a_00522
DO - 10.1162/tacl_a_00522
M3 - Article
AN - SCOPUS:85143087325
SN - 2307-387X
VL - 10
SP - 1341
EP - 1356
JO - Transactions of the Association for Computational Linguistics
JF - Transactions of the Association for Computational Linguistics
ER -