
Molecular Graph Enhanced Transformer for Retrosynthesis Prediction

By Kelong Mao, Peilin Zhao, Tingyang Xu, Yu Rong, Xi Xiao, Junzhou Huang

Posted 06 Mar 2020
bioRxiv DOI: 10.1101/2020.03.05.979773

Given the enormous number of possible synthetic routes in chemistry, retrosynthesis prediction remains a challenge for researchers. Recently, retrosynthesis prediction has been formulated as a Machine Translation (MT) task: since each molecule can be represented as a Simplified Molecular-Input Line-Entry System (SMILES) string, retrosynthesis is treated as translating the product string into the reactant strings. However, MT models applied to SMILES data usually ignore the natural atomic connectivity and topology of molecules. To impose more chemically plausible constraints on atom representation learning and thereby improve performance, in this paper we propose a Graph Enhanced Transformer (GET) framework, which exploits both the sequential and the graph-structured information of molecules. Four different GET designs are proposed, each fusing the SMILES representations with atom embeddings learned from our improved Graph Neural Network (GNN). Empirical results show that our model significantly outperforms the vanilla Transformer model in test accuracy.
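The core idea — computing atom embeddings with a GNN over the molecular graph and fusing them into the SMILES token embeddings before the Transformer encoder — can be illustrated with a minimal sketch. This is not the authors' code: the mean-aggregation message-passing update and the additive fusion below are simplifying assumptions standing in for the paper's improved GNN and its four fusion designs.

```python
def gnn_atom_embeddings(features, adjacency, rounds=1):
    """Toy message-passing GNN: each round, every atom adds the mean of its
    neighbors' embeddings to its own. features: per-atom feature vectors;
    adjacency: per-atom neighbor index lists (the molecular graph)."""
    h = [list(f) for f in features]
    dim = len(features[0])
    for _ in range(rounds):
        new_h = []
        for i, neighbors in enumerate(adjacency):
            agg = [0.0] * dim
            for j in neighbors:
                for d in range(dim):
                    agg[d] += h[j][d]
            if neighbors:
                agg = [a / len(neighbors) for a in agg]
            # self-embedding plus aggregated neighbor message
            new_h.append([h[i][d] + agg[d] for d in range(dim)])
        h = new_h
    return h

def fuse(token_embeddings, atom_embeddings, atom_token_positions):
    """Additive fusion: add each atom's GNN embedding onto the embedding of
    the SMILES token that names that atom (non-atom tokens are untouched)."""
    fused = [list(t) for t in token_embeddings]
    for atom_idx, pos in enumerate(atom_token_positions):
        for d in range(len(fused[pos])):
            fused[pos][d] += atom_embeddings[atom_idx][d]
    return fused
```

For example, ethanol ("CCO") has atoms C–C–O with adjacency `[[1], [0, 2], [1]]`; with one-hot element features, one round of message passing gives each carbon a different embedding depending on whether it neighbors the oxygen, which a SMILES-only model cannot see directly from the token identities alone.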

Download data

  • Downloaded 609 times
