Deep Learning-based Ligand Design using Shared Latent Implicit Fingerprints from Collaborative Filtering

By Raghuram Srinivas, Niraj Verma, Eric C. Larson, Elfi Kraka

Posted 20 Nov 2020
bioRxiv DOI: 10.1101/2020.11.18.389213

In their previous work, Srinivas et al. showed that implicit fingerprints capture ligands and proteins in a shared latent space, typically for the purpose of virtual screening with collaborative filtering models applied to known bioactivity data. In this work, we extend these implicit fingerprints/descriptors using deep learning techniques to translate latent descriptors into discrete representations of molecules (SMILES), without explicitly optimizing for chemical properties. This allows the design of new compounds based upon the latent representation of nearby proteins, thereby encoding drug-like properties, including binding affinities to known proteins. The implicit descriptor method does not require any fingerprint similarity search, which makes the method free of any bias arising from the empirical nature of the fingerprint models (Srinivas et al., 2018). We evaluate the properties of the novel compounds generated by our approach using physical properties of drug-like molecules and chemical complexity. Additionally, we assess the reliability of the biological activity of the new compounds by employing models of protein-ligand interaction, which assists in estimating the potential binding affinity of the designed compounds. We find that the generated compounds exhibit properties of chemically feasible compounds and are likely to be excellent binders to known proteins. Furthermore, we analyze the diversity of the generated compounds using the Tanimoto distance and conclude that the set is broadly diverse.
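As a minimal illustration of the diversity analysis mentioned in the abstract (not the authors' code; RDKit, the Morgan fingerprint settings, and the example SMILES are assumptions), the following sketch computes the mean pairwise Tanimoto distance over a small set of generated molecules:

# Sketch: estimating diversity of generated SMILES via pairwise Tanimoto
# distance on Morgan fingerprints, using RDKit.
from itertools import combinations
from rdkit import Chem
from rdkit.Chem import AllChem
from rdkit.DataStructs import TanimotoSimilarity

# Hypothetical generated molecules; in the paper these would be SMILES
# decoded from the latent implicit fingerprints.
generated_smiles = ["CCO", "c1ccccc1O", "CC(=O)Nc1ccc(O)cc1"]

mols = [Chem.MolFromSmiles(s) for s in generated_smiles]
fps = [AllChem.GetMorganFingerprintAsBitVect(m, radius=2, nBits=2048) for m in mols]

# Tanimoto distance = 1 - Tanimoto similarity; a higher mean distance
# indicates a more diverse set of generated compounds.
distances = [1.0 - TanimotoSimilarity(a, b) for a, b in combinations(fps, 2)]
print(f"Mean pairwise Tanimoto distance: {sum(distances) / len(distances):.3f}")

In this toy setting, a mean distance near 1.0 would indicate structurally dissimilar compounds, while a value near 0.0 would indicate near-duplicates.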

Download data

  • Downloaded 316 times
  • Download rankings, all-time:
    • Site-wide: 133,841
    • In bioinformatics: 10,048
  • Year to date:
    • Site-wide: 157,018
  • Since beginning of last month:
    • Site-wide: 161,294
