
Rxivist combines preprints from bioRxiv with data from Twitter to help you find the papers being discussed in your field. Currently indexing 57,910 bioRxiv papers from 266,458 authors.

RamaNet: Computational De Novo Protein Design using a Long Short-Term Memory Generative Adversarial Neural Network

By Sari Sabban, Mikhail Markovsky

Posted 14 Jun 2019
bioRxiv DOI: 10.1101/671552

The ability to perform de novo protein design would allow researchers to expand the pool and variety of available proteins. By designing synthetic structures computationally, they can make available more structures than exist in the Protein Data Bank, design structures that are not found in nature, or direct the design of proteins toward a specific desired structure. While some researchers attempt to design proteins from first physical and thermodynamic principles, we decided to attempt de novo protein design statistically, using machine learning to build a model with a long short-term memory (LSTM) generative adversarial neural network architecture. The LSTM-based GAN model used the Φ and Ψ angles of each residue from an augmented dataset of helical protein structures only. Though the network's output structures were not perfect, they were idealised and evaluated post-prediction: the bad structures were filtered out and the adequate structures kept. The results were successful in developing a logical, rigid, compact, helical protein backbone topology. This backbone topology was then used to computationally design side chains that should allow the final protein to fold into the designed structure.
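The post-prediction filtering step described above can be sketched as a Ramachandran-based check: since the model is trained only on helical structures, a generated backbone is "adequate" when most of its (Φ, Ψ) pairs fall in the α-helical region of the Ramachandran plot. The region bounds and the acceptance threshold below are illustrative assumptions, not values taken from the paper.

```python
# Hypothetical filter for GAN-generated backbones, represented as
# sequences of per-residue (phi, psi) angles in degrees.
# The alpha-helix windows below are assumed, not from the paper.
HELIX_PHI = (-100.0, -30.0)  # assumed phi window for alpha helices
HELIX_PSI = (-80.0, -5.0)    # assumed psi window for alpha helices

def is_helical(phi, psi):
    """True if a single residue's angles lie in the helical region."""
    return (HELIX_PHI[0] <= phi <= HELIX_PHI[1]
            and HELIX_PSI[0] <= psi <= HELIX_PSI[1])

def filter_backbones(backbones, min_helical_fraction=0.8):
    """Keep generated backbones whose residues are mostly helical.

    `backbones` is a list of structures, each a list of (phi, psi)
    tuples, one per residue, as a generator network might emit them.
    """
    kept = []
    for angles in backbones:
        frac = sum(is_helical(p, s) for p, s in angles) / len(angles)
        if frac >= min_helical_fraction:
            kept.append(angles)
    return kept

# An ideal alpha helix (phi ~ -57, psi ~ -47) passes; sheet-like
# angles fail and are filtered out.
ideal = [(-57.0, -47.0)] * 20
sheet = [(-120.0, 130.0)] * 20
print(len(filter_backbones([ideal, sheet])))  # prints 1
```

In the paper's pipeline the kept backbones go on to side-chain design; a real filter would also score rigidity and compactness, which this sketch omits.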

Download data

  • Downloaded 359 times
  • Download rankings, all-time:
    • Site-wide: 25,424 out of 57,910
    • In bioinformatics: 3,500 out of 5,899
  • Year to date:
    • Site-wide: 6,335 out of 57,910
  • Since beginning of last month:
    • Site-wide: 4,215 out of 57,910

[Charts: downloads over time; distribution of downloads per paper, site-wide]
