Toward Zero-Shot Instruction Following

Renze Lou, Wenpeng Yin

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

This work proposes a challenging yet more realistic setting for zero-shot cross-task generalization: zero-shot instruction following, which assumes the existence of a paragraph-style task definition but no demonstrations. To better learn the task supervision from the definition, we propose two strategies: first, automatically identifying the critical sentences in the definition; second, a ranking objective that forces the model to generate the gold outputs with higher probability when those critical parts are highlighted in the definition. The joint effect of the two strategies yields state-of-the-art performance on the SUPER-NATURALINSTRUCTIONS benchmark (Wang et al., 2022b).
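
To make the ranking objective mentioned in the abstract concrete, the following is a minimal sketch (not the authors' released code; the function names, the hinge formulation, the margin value, and the use of PyTorch are assumptions for illustration). It compares the sequence-level log-likelihood of the gold output under two encodings of the task definition, one with the critical sentences highlighted and one without, and penalizes the model when the highlighted version is not preferred by at least a margin.

import torch.nn.functional as F

def sequence_log_prob(logits, target_ids, pad_id):
    # Sum of per-token log-probabilities of the gold output, ignoring padding.
    # logits: (batch, seq_len, vocab); target_ids: (batch, seq_len)
    log_probs = F.log_softmax(logits, dim=-1)
    token_lp = log_probs.gather(-1, target_ids.unsqueeze(-1)).squeeze(-1)
    mask = (target_ids != pad_id).float()
    return (token_lp * mask).sum(dim=-1)

def ranking_loss(logits_highlighted, logits_plain, target_ids, pad_id, margin=1.0):
    # Hinge-style ranking term: the gold output should be at least `margin`
    # more likely (in log space) when the critical sentences of the definition
    # are highlighted than when the plain definition is used.
    lp_high = sequence_log_prob(logits_highlighted, target_ids, pad_id)
    lp_plain = sequence_log_prob(logits_plain, target_ids, pad_id)
    return F.relu(margin - (lp_high - lp_plain)).mean()

In practice, such a ranking term would typically be added to the standard cross-entropy loss on the gold output; the exact weighting and the way critical sentences are selected and marked are specified in the paper itself.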

Original language: English (US)
Title of host publication: EACL 2024 - 18th Conference of the European Chapter of the Association for Computational Linguistics, Proceedings of the Student Research Workshop
Editors: Neele Falk, Sara Papi, Mike Zhang
Publisher: Association for Computational Linguistics (ACL)
Pages: 50-60
Number of pages: 11
ISBN (Electronic): 9798891760905
State: Published - 2024
Event: 18th Conference of the European Chapter of the Association for Computational Linguistics, EACL 2024 - Student Research Workshop, SRW 2024 - St. Julian's, Malta
Duration: Mar 21 2024 - Mar 22 2024

Publication series

Name: EACL 2024 - 18th Conference of the European Chapter of the Association for Computational Linguistics, Proceedings of the Student Research Workshop

Conference

Conference: 18th Conference of the European Chapter of the Association for Computational Linguistics, EACL 2024 - Student Research Workshop, SRW 2024
Country/Territory: Malta
City: St. Julian's
Period: 3/21/24 - 3/22/24

All Science Journal Classification (ASJC) codes

  • Computational Theory and Mathematics
  • Software
  • Linguistics and Language
