
Rxivist combines preprints from bioRxiv with data from Twitter to help you find the papers being discussed in your field. Currently indexing 57,812 bioRxiv papers from 266,109 authors.

Active learning reveals underlying decision strategies

By Paula Parpart, Eric Schulz, Maarten Speekenbrink, Bradley C Love

Posted 25 Dec 2017
bioRxiv DOI: 10.1101/239558

One key question is whether people rely on frugal heuristics or full-information strategies when making preference decisions. We propose a novel method, model-based active learning, to determine whether people conform more to a rank-based heuristic (Take-The-Best) or a weight-based full-information strategy (logistic regression). Our method goes beyond traditional model comparison techniques by using information theory to characterize each model's predictions for how decision makers should actively sample information. These analyses capture how sampling affects learning and how learning affects decisions on subsequent trials. We develop and test model-based active learning algorithms for both Take-The-Best and logistic regression. Our findings reveal that people largely follow a weight-based learning strategy rather than a rank-based strategy, even in cases where their preference decisions are better predicted by the Take-The-Best heuristic. This suggests that people may have more refined knowledge than their preference decisions reveal, knowledge that their information sampling behavior can uncover. We argue that model-based active learning is an effective and sensitive method for model selection that expands the basis for model comparison.
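The two strategies the abstract contrasts can be sketched in a few lines. This is a minimal, hedged illustration, not the paper's implementation: the cue names, cue ordering, and weights below are invented for the example, and the point is only that a rank-based rule (decide on the first discriminating cue) and a weight-based rule (integrate all cues) can reach different decisions on the same pair of options.

```python
import math

def take_the_best(option_a, option_b, cue_order):
    """Rank-based heuristic: inspect cues in order of validity and
    decide on the first cue that discriminates between the options."""
    for cue in cue_order:
        if option_a[cue] != option_b[cue]:
            return "A" if option_a[cue] > option_b[cue] else "B"
    return "guess"  # no cue discriminates

def logistic_choice(option_a, option_b, weights):
    """Weight-based full-information strategy: sum the weighted cue
    differences and pass the total through a logistic function."""
    diff = sum(w * (option_a[c] - option_b[c]) for c, w in weights.items())
    p_a = 1.0 / (1.0 + math.exp(-diff))
    return "A" if p_a > 0.5 else "B"

# Two options described by binary cues (illustrative values only)
a = {"cue1": 1, "cue2": 0, "cue3": 1}
b = {"cue1": 1, "cue2": 1, "cue3": 0}

# Take-The-Best stops at cue2, the first discriminating cue, and picks B;
# logistic regression weighs cue3 most heavily and picks A.
print(take_the_best(a, b, ["cue1", "cue2", "cue3"]))                      # -> B
print(logistic_choice(a, b, {"cue1": 0.5, "cue2": 0.8, "cue3": 1.2}))     # -> A
```

The disagreement between the two outputs is exactly the kind of case the paper exploits: when choice data alone are ambiguous, the models still make distinct predictions about which cue a learner should sample next.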

Download data

  • Downloaded 384 times
  • Download rankings:
    • All-time:
      • Site-wide: 23,794 out of 57,812
      • In animal behavior and cognition: 291 out of 895
    • Year to date:
      • Site-wide: 38,125 out of 57,812
    • Since beginning of last month:
      • Site-wide: 29,900 out of 57,812


[Chart: Downloads over time]

[Chart: Distribution of downloads per paper, site-wide]
