
BARTSmiles: Generative Masked Language Models for Molecular Representations


Bibliographic Details
Published in: Journal of Chemical Information and Modeling, 2024-08, Vol. 64 (15), p. 5832-5843
Main Authors: Chilingaryan, Gayane, Tamoyan, Hovhannes, Tevosyan, Ani, Babayan, Nelly, Hambardzumyan, Karen, Navoyan, Zaven, Aghajanyan, Armen, Khachatrian, Hrant, Khondkaryan, Lusine
Format: Article
Language: English
Description
Summary: We discover a robust self-supervised pretraining strategy tailored to molecular representations for generative masked language models through a series of in-depth ablations. Using this pretraining strategy, we train BARTSmiles, a BART-like model trained with an order of magnitude more compute than previous self-supervised molecular representations. In-depth evaluations show that BARTSmiles consistently outperforms other self-supervised representations across classification, regression, and generation tasks, setting a new state of the art on eight tasks. We then show that, when applied to the molecular domain, the BART objective learns representations that implicitly encode our downstream tasks of interest. For example, by selecting seven neurons from a frozen BARTSmiles, we can obtain a model whose performance is within two percentage points of the fully fine-tuned model on the ClinTox task. Lastly, we show that standard attribution interpretability methods, when applied to BARTSmiles, highlight certain substructures that chemists use to explain specific properties of molecules. The code and pretrained model are publicly available.
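The few-neuron probing result in the summary can be illustrated with a minimal sketch: given frozen, pooled encoder representations of molecules, select a handful of neuron dimensions and fit a linear classifier on them. Everything below is a hypothetical stand-in, not the paper's actual procedure: the random feature matrix replaces real BARTSmiles embeddings, the synthetic labels replace ClinTox annotations, and the univariate selection criterion is only one plausible way to pick seven neurons.

```python
# Hedged sketch of a "seven-neuron probe" on frozen embeddings.
# X stands in for pooled BARTSmiles encoder representations (one row per molecule);
# y stands in for ClinTox-style binary labels. Both are synthetic here.
import numpy as np
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 1024))                 # placeholder frozen embeddings
y = (X[:, 7] + 0.5 * X[:, 42] > 0).astype(int)    # placeholder binary labels

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

# Pick seven neuron dimensions by a simple univariate criterion
# (the paper's own selection method may differ), then fit a linear probe.
selector = SelectKBest(mutual_info_classif, k=7).fit(X_tr, y_tr)
probe = LogisticRegression(max_iter=1000).fit(selector.transform(X_tr), y_tr)

auc = roc_auc_score(y_te, probe.predict_proba(selector.transform(X_te))[:, 1])
print(f"ROC-AUC of 7-neuron probe: {auc:.3f}")
```

On real frozen embeddings, a probe like this tests how much task-relevant signal a few individual neurons already carry, which is the sense in which the representations "implicitly encode" downstream tasks.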
ISSN: 1549-9596
1549-960X
DOI: 10.1021/acs.jcim.4c00512