Monte Carlo Sampling of Protein Folding by Combining an All-Atom Physics-Based Model with a Native State Bias

By Yong Wang, Pengfei Tian, Wouter Boomsma, Kresten Lindorff-Larsen

Posted 03 Jul 2018
bioRxiv DOI: 10.1101/361527 (published DOI: 10.1021/acs.jpcb.8b06335)

Energy landscape theory suggests that native interactions are a major determinant of the folding mechanism of a protein. Thus, structure-based (Go) models have, aided by coarse-graining techniques, shown great success in capturing the mechanisms of protein folding and conformational changes. In certain cases, however, non-native interactions and atomic details are also essential to describe the protein dynamics, prompting the development of a variety of structure-based models that include non-native interactions and differentiate between different types of attractive potentials. Here, we describe an all-protein-atom hybrid model, termed ProfasiGo, that integrates an implicit-solvent, all-atom, physics-based model (called Profasi) and a structure-based Go potential, and its implementation in two software packages (PHAISTOS and ProFASi) that are developed for Monte Carlo sampling of protein molecules. We apply the ProfasiGo model to study the folding free energy landscapes of four topologically similar proteins, one of which can be folded by the simplified potential Profasi, and two of which have been folded by explicit-solvent, all-atom molecular dynamics simulations with the CHARMM22* force field. Our results reveal that the hybrid ProfasiGo model is able to capture many of the details present in the physics-based potentials, while retaining the advantages of Go models for efficient sampling and for guiding simulations toward the native state. We expect that the model will be widely applicable to study the folding of more complex proteins, to study conformational dynamics, and for integration with experimental data.
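The core idea of such a hybrid model, combining a transferable physics-based energy with a scaled native-state (Go-type) bias inside a Metropolis Monte Carlo loop, can be sketched in a few lines. The sketch below is a toy illustration only: the energy functions, the coupling constant `kappa`, and the dihedral-angle representation are stand-ins invented for this example, not the actual ProfasiGo terms or the PHAISTOS/ProFASi APIs.

```python
import math
import random

def physics_energy(angles):
    """Toy stand-in for a transferable physics-based potential (e.g. Profasi)."""
    return sum(0.5 * math.cos(3.0 * a) for a in angles)

def go_energy(angles, native):
    """Toy Go-type bias: harmonic restraint toward native dihedral angles."""
    return sum((a - n) ** 2 for a, n in zip(angles, native))

def hybrid_energy(angles, native, kappa=0.2):
    """Hybrid energy: physics term plus a scaled native-state bias.

    `kappa` (a hypothetical coupling constant) tunes how strongly the
    Go bias steers sampling toward the native state.
    """
    return physics_energy(angles) + kappa * go_energy(angles, native)

def metropolis_mc(native, n_steps=5000, beta=1.0, step=0.3, seed=1):
    """Metropolis Monte Carlo over a list of dihedral angles."""
    rng = random.Random(seed)
    angles = [rng.uniform(-math.pi, math.pi) for _ in native]
    energy = hybrid_energy(angles, native)
    for _ in range(n_steps):
        # Propose a local move: perturb one randomly chosen angle.
        i = rng.randrange(len(angles))
        trial = list(angles)
        trial[i] += rng.uniform(-step, step)
        e_trial = hybrid_energy(trial, native)
        # Metropolis criterion: always accept downhill moves,
        # accept uphill moves with Boltzmann probability.
        if e_trial <= energy or rng.random() < math.exp(-beta * (e_trial - energy)):
            angles, energy = trial, e_trial
    return angles, energy

# Hypothetical 4-angle "protein" with an assumed native conformation.
native = [0.5, -1.0, 1.2, 0.0]
final_angles, final_energy = metropolis_mc(native)
```

In a real implementation the Monte Carlo moves operate on all protein atoms and the two energy terms are evaluated by the respective packages; the sketch only shows how the bias enters the acceptance criterion through the combined energy.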



