Training transitive and commutative multimodal transformers with LoReTTa.
In: Advances in Neural Information Processing Systems (NeurIPS 2023), 10-16 December 2023, New Orleans. La Jolla, California, USA: Neural Information Processing Systems (NeurIPS), 2023. 14 pp.
Training multimodal foundation models is challenging due to the limited availability of multimodal datasets. While many public datasets pair images with text, few combine images with audio or text with audio. Even rarer are datasets that align all three modalities at once. Critical domains such as healthcare, infrastructure, or transportation are particularly affected by missing modalities. This makes it difficult to integrate all modalities into a large pre-trained neural network that can be used out-of-the-box or fine-tuned for different downstream tasks. We introduce LoReTTa (Linking mOdalities with a tRansitive and commutativE pre-Training sTrAtegy) to address this understudied problem. Our self-supervised framework unifies causal modeling and masked modeling with the rules of commutativity and transitivity. This allows us to transition within and between modalities. As a result, our pre-trained models are better at exploring the true underlying joint probability distribution. Given a dataset containing only the disjoint combinations (A, B) and (B, C), LoReTTa can model the relation A <-> C with A <-> B <-> C. In particular, we show that a transformer pre-trained with LoReTTa can handle any mixture of modalities at inference time, including the never-seen pair (A, C) and the triplet (A, B, C). We extensively evaluate our approach on a synthetic, medical, and reinforcement learning dataset. Across different domains, our universal multimodal transformer consistently outperforms strong baselines such as GPT, BERT, and CLIP on tasks involving the missing modality tuple.
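The abstract's core idea is that commutativity (a paired sample can be serialized in any modality order) and transitivity (a missing link A <-> C can be traversed through the shared modality B) together let a single causal transformer cover modality pairs it never saw aligned. The paper does not publish this as an API, so the following is a minimal, hypothetical sketch of how such training sequences could be assembled; `toy_generate` stands in for the pre-trained model's autoregressive generation step, and all names are illustrative assumptions, not the authors' implementation.

```python
import itertools

def commutative_orderings(sample):
    """All modality orderings of a paired sample: (A, B) yields the
    concatenated sequences [A; B] and [B; A]. Each modality is a token list."""
    return [list(itertools.chain.from_iterable(perm))
            for perm in itertools.permutations(sample)]

def transitive_chain(a_tokens, generate):
    """Link A to C through the shared modality B (A -> B -> C):
    condition on A, generate pseudo-tokens for B, then generate C
    from the synthetic B. `generate` is a stand-in for the model."""
    b_pseudo = generate(a_tokens, target="B")  # hypothetical generator call
    c_pseudo = generate(b_pseudo, target="C")
    return a_tokens + b_pseudo + c_pseudo

def toy_generate(context, target):
    """Toy stand-in for a causal multimodal transformer's generate step."""
    return [f"{target}{i}" for i in range(2)]

pair_ab = (["a0", "a1"], ["b0"])
print(commutative_orderings(pair_ab))
# -> [['a0', 'a1', 'b0'], ['b0', 'a0', 'a1']]
print(transitive_chain(["a0", "a1"], toy_generate))
# -> ['a0', 'a1', 'B0', 'B1', 'C0', 'C1']
```

In this reading, the commutative orderings augment the next-token-prediction data for the observed pairs (A, B) and (B, C), while the transitive chains synthesize training signal for the never-seen pair (A, C).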
Publication type
Article: Conference contribution
ISSN (print) / ISBN
1049-5258
Conference Title
Advances in Neural Information Processing Systems
Conference Date
10-16 December 2023
Conference Location
New Orleans
Citation details
Pages: 14
Publisher
Neural Information Processing Systems (NeurIPS)
Publishing Place
10010 North Torrey Pines Rd, La Jolla, California 92037, USA
Grants
Helmholtz Association under the joint research school "Munich School for Data Science (MUDS)"