Rxivist combines preprints from bioRxiv with data from Twitter to help you find the papers being discussed in your field. Currently indexing 73,219 bioRxiv papers from 318,728 authors.

Scalable approximate Bayesian inference for particle tracking data

By Ruoxi Sun, Liam Paninski

Posted 05 Mar 2018
bioRxiv DOI: 10.1101/276253

Many important datasets in physics, chemistry, and biology consist of noisy sequences of images of multiple moving overlapping particles. In many cases, the observed particles are indistinguishable, leading to unavoidable uncertainty about nearby particles' identities. Exact Bayesian inference is intractable in this setting, and previous approximate Bayesian methods scale poorly. Non-Bayesian approaches that output a single "best" estimate of the particle tracks (thus discarding important uncertainty information) are therefore dominant in practice. Here we propose a flexible and scalable amortized approach for Bayesian inference on this task. We introduce a novel neural network method to approximate the (intractable) filter-backward-sample-forward algorithm for Bayesian inference in this setting. By varying the simulated training data for the network, we can perform inference on a wide variety of data types. This approach is therefore highly flexible and improves on the state of the art in terms of accuracy; provides uncertainty estimates about the particle locations and identities; and has a test runtime that scales linearly as a function of the data length and number of particles, thus enabling Bayesian inference in arbitrarily large particle tracking datasets.
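The abstract describes amortized inference: a network is trained on simulated tracking data so that, at test time, approximate posterior inference is a single forward pass whose cost grows linearly with the sequence length. The sketch below is a minimal illustration of that idea for a single particle in one dimension; it is not the authors' architecture. The random-walk simulator, the bidirectional GRU encoder, the Gaussian posterior head, and all noise scales are assumptions made for the example.

```python
# Illustrative sketch (not the method from the paper): amortized posterior
# inference for one 1D particle. A network trained on simulated tracks maps
# noisy observations to a per-time-step Gaussian over the true positions.
import torch
import torch.nn as nn

T = 50                 # time steps per simulated track (assumed)
SIGMA_MOTION = 0.1     # motion-noise scale (assumed for the example)
SIGMA_OBS = 0.5        # observation-noise scale (assumed for the example)

def simulate_tracks(batch):
    """Simulate Gaussian random-walk tracks and noisy observations of them."""
    steps = SIGMA_MOTION * torch.randn(batch, T, 1)
    x = torch.cumsum(steps, dim=1)              # latent positions
    y = x + SIGMA_OBS * torch.randn_like(x)     # noisy observations
    return x, y

class AmortizedPosterior(nn.Module):
    """Bidirectional GRU that outputs a Gaussian (mean, log-variance) per step."""
    def __init__(self, hidden=64):
        super().__init__()
        self.rnn = nn.GRU(1, hidden, batch_first=True, bidirectional=True)
        self.head = nn.Linear(2 * hidden, 2)

    def forward(self, y):
        h, _ = self.rnn(y)
        mu, logvar = self.head(h).chunk(2, dim=-1)
        return mu, logvar

net = AmortizedPosterior()
opt = torch.optim.Adam(net.parameters(), lr=1e-3)

for step in range(2000):
    x, y = simulate_tracks(batch=32)
    mu, logvar = net(y)
    # Gaussian negative log-likelihood of the true positions under the
    # network's approximate posterior; training data is purely simulated.
    loss = (0.5 * ((x - mu) ** 2 / logvar.exp() + logvar)).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()

# At test time, a single forward pass (linear in T) yields posterior means
# and per-time-step uncertainty estimates.
x_test, y_test = simulate_tracks(batch=1)
mu, logvar = net(y_test)
print(mu.shape, logvar.exp().sqrt().shape)
```

Because the only test-time cost is one pass through the network, runtime scales linearly with the track length, which is the scaling property the abstract highlights; handling multiple indistinguishable, overlapping particles is the harder part addressed in the paper itself.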

Download data

  • Downloaded 850 times
  • Download rankings:
    • All-time, site-wide: 11,271 out of 73,219
    • All-time, in bioinformatics: 1,879 out of 7,136
    • Year to date, site-wide: 21,190 out of 73,219
    • Since beginning of last month, site-wide: 21,190 out of 73,219

Charts (not shown): Altmetric data; downloads over time; distribution of downloads per paper, site-wide.