Knowledge Distillation on Cross-Modal Adversarial Reprogramming for Data-Limited Attribute Inference

Quan Li, Lingwei Chen, Shixiong Jing, Dinghao Wu

Research output: Chapter in Book/Report/Conference proceeding (Conference contribution)

4 Scopus citations

Abstract

Social media generates a rich source of text data with intrinsic user attributes (e.g., age, gender), where different parties benefit from disclosing them. Attribute inference can be cast as a text classification problem, which, however, suffers from a scarcity of labeled data. To address this challenge, we propose a data-limited learning model that distills knowledge on adversarial reprogramming of a visual transformer (ViT) for attribute inference. Not only does this novel cross-modal model transfer the powerful learning capability of ViT, but it also leverages unlabeled texts to reduce the demand for labeled data. Experiments on social media datasets demonstrate the state-of-the-art performance of our model on data-limited attribute inference.
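The abstract stops at a high-level description of the approach. As a rough illustration only, the PyTorch sketch below shows one plausible way to realize cross-modal adversarial reprogramming with a knowledge-distillation objective: token embeddings are tiled into a synthetic image, a learnable additive program perturbs it before it enters a frozen ViT, and a distillation loss lets unlabeled texts contribute soft targets. Everything here (the names CrossModalReprogram and distill_loss, the patch layout, the frozen_vit interface, the temperature T and mixing weight alpha) is an assumption for illustration, not the authors' implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class CrossModalReprogram(nn.Module):
    """Hypothetical sketch of cross-modal adversarial reprogramming:
    token embeddings are tiled into a synthetic image, a learnable
    additive "program" perturbs it, and a frozen ViT classifies the
    result; only the embedding, program, and label map are trained."""

    def __init__(self, frozen_vit, vocab_size, num_attrs,
                 img_size=224, patch=16, vit_classes=1000):
        super().__init__()
        self.vit = frozen_vit.eval()          # assumed: image -> (B, vit_classes)
        for p in self.vit.parameters():
            p.requires_grad_(False)           # the ViT itself is never updated
        self.patch = patch
        self.grid = img_size // patch         # patches per image side
        # Each token id becomes one 3 x patch x patch image patch.
        self.tok_embed = nn.Embedding(vocab_size, 3 * patch * patch)
        # Learnable adversarial program added to the synthetic image.
        self.program = nn.Parameter(torch.zeros(3, img_size, img_size))
        # Map the ViT's source-label logits to the target attributes.
        self.label_map = nn.Linear(vit_classes, num_attrs)

    def forward(self, token_ids):             # token_ids: (B, L), L <= grid**2
        B, L = token_ids.shape
        patches = self.tok_embed(token_ids)   # (B, L, 3*p*p)
        n = self.grid ** 2
        patches = F.pad(patches, (0, 0, 0, n - L))   # pad to a full patch grid
        img = patches.view(B, self.grid, self.grid, 3, self.patch, self.patch)
        img = img.permute(0, 3, 1, 4, 2, 5)          # (B, 3, gy, py, gx, px)
        img = img.reshape(B, 3, self.grid * self.patch, self.grid * self.patch)
        img = torch.tanh(img + self.program)         # keep pixels bounded
        return self.label_map(self.vit(img))         # target-attribute logits

def distill_loss(student_logits, teacher_logits, labels=None, T=2.0, alpha=0.5):
    """Standard knowledge-distillation objective: KL between softened
    teacher/student distributions, so unlabeled texts (labels=None)
    still provide a training signal via the teacher's soft targets."""
    soft = F.kl_div(F.log_softmax(student_logits / T, dim=-1),
                    F.softmax(teacher_logits / T, dim=-1),
                    reduction="batchmean") * (T * T)
    if labels is None:                        # unlabeled batch: soft loss only
        return soft
    return alpha * soft + (1 - alpha) * F.cross_entropy(student_logits, labels)
```

A pretrained torchvision ViT (e.g., torchvision.models.vit_b_16) could serve as frozen_vit in this sketch; the teacher producing teacher_logits would be whatever stronger or label-supervised model the distillation setup assumes.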

Original language: English (US)
Title of host publication: ACM Web Conference 2023 - Companion of the World Wide Web Conference, WWW 2023
Publisher: Association for Computing Machinery, Inc
Pages: 65-68
Number of pages: 4
ISBN (Electronic): 9781450394161
DOIs
State: Published - Apr 30 2023
Event: 32nd Companion of the ACM World Wide Web Conference, WWW 2023 - Austin, United States
Duration: Apr 30 2023 - May 4 2023

Publication series

Name: ACM Web Conference 2023 - Companion of the World Wide Web Conference, WWW 2023

Conference

Conference: 32nd Companion of the ACM World Wide Web Conference, WWW 2023
Country/Territory: United States
City: Austin
Period: 4/30/23 - 5/4/23

All Science Journal Classification (ASJC) codes

  • Computer Networks and Communications
  • Software
