
Joe Austerweil

(Sorry, you have to do a little work: replace "underscore", "lastname", "at", and "dot" with _, austerweil, @, and ., respectively. I really hate spam.)

I am Joe Austerweil, an Assistant Professor at Brown University in the Department of Cognitive, Linguistic, and Psychological Sciences. I am the Principal Investigator of the Austerweil Laboratory.

Short Research Philosophy

As a computational cognitive psychologist, I explore questions at the intersection of perception and higher-level cognition. I use recent advances in statistics and computer science to formulate ideal learner models of how people solve these problems, and then test the models' predictions with traditional behavioral experiments. Ideal learner models help us understand the knowledge people use to solve problems, because that knowledge must be made explicit before the model can reproduce human behavior. This method both yields novel machine learning methods and uncovers new psychological principles.
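To make the method concrete, here is a toy ideal learner in Python: Bayesian inference over a small, hand-picked hypothesis space. The hypotheses and numbers are illustrative placeholders rather than a model from my laboratory; the point is only that a learner's knowledge (its hypothesis space, prior, and sampling assumptions) has to be written down explicitly before the model can predict behavior.

# A toy ideal learner over a tiny, hand-specified hypothesis space.
# Each hypothesis is a candidate concept, represented by the set of
# items (its extension) that it generates uniformly at random.
hypotheses = {
    "even": {2, 4, 6, 8, 10},
    "powers_of_two": {2, 4, 8},
    "multiples_of_five": {5, 10},
}
prior = {h: 1.0 / len(hypotheses) for h in hypotheses}  # uniform prior

def likelihood(data, extension):
    # P(data | h) under strong sampling: each observation is drawn
    # uniformly at random from the hypothesis's extension.
    if not all(x in extension for x in data):
        return 0.0
    return (1.0 / len(extension)) ** len(data)

def posterior(data):
    # Bayes' rule: P(h | data) is proportional to P(data | h) * P(h).
    scores = {h: prior[h] * likelihood(data, ext)
              for h, ext in hypotheses.items()}
    z = sum(scores.values())
    return {h: s / z for h, s in scores.items()}

data = [2, 4]
for h, p in sorted(posterior(data).items(), key=lambda kv: -kv[1]):
    print(h, round(p, 3))

With data = [2, 4], the smallest consistent hypothesis ("powers_of_two") receives most of the posterior mass, an instance of the size principle that emerges only because the sampling assumption was made explicit.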

Academic history

Brown University (2007), Sc.B. in Applied Mathematics-Computer Science (with honors)
University of California, Berkeley (2011), M.A. in Statistics
University of California, Berkeley (2012), Ph.D. in Psychology

Upcoming Presentations and Workshops

Harvard University. October 29, 2015

NYU. November 12, 2015


In Fall 2015, I am teaching Human and Machine Learning (CLPS 1211).

I will be teaching Human Cognition (CLPS 0200) and Core Topics in Cognition (CLPS 2200).

Current highlighted papers:

Joshua Abbott, Joseph Austerweil, and Thomas Griffiths (in press). Random walks on semantic networks can resemble optimal foraging. Psychological Review. [Preprint PDF]

Joseph Austerweil and Thomas Griffiths (2013). A nonparametric Bayesian framework for constructing flexible feature representations. Psychological Review, 120(4), 817-851. [DOI]

Highlighted code tools:

My laboratory is developing open-source tools for fast (GPU-based, using OpenCL) and easy (usable from Python) inference in Bayesian nonparametric models. The current methods are available on GitHub.
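For a rough flavor of the model family these tools target (and of the framework in the 2013 paper above), here is a minimal, CPU-only NumPy sketch that samples a binary object-by-feature matrix from the Indian Buffet Process prior. This is a plain illustration for readers new to these models, not the lab's OpenCL implementation; see the repository for the actual code.

import numpy as np

def sample_ibp(num_objects, alpha, rng=None):
    # Draw Z ~ IBP(alpha): rows are objects, columns are latent features.
    rng = np.random.default_rng() if rng is None else rng
    columns = []  # one list of 0/1 assignments per latent feature
    for n in range(num_objects):
        # Object n+1 keeps an existing feature with probability m_k / (n + 1),
        # where m_k counts earlier objects that have it ("rich get richer").
        for col in columns:
            col.append(int(rng.random() < sum(col) / (n + 1)))
        # It then samples Poisson(alpha / (n + 1)) brand-new features.
        for _ in range(rng.poisson(alpha / (n + 1))):
            columns.append([0] * n + [1])
    if not columns:
        return np.zeros((num_objects, 0), dtype=int)
    return np.array(columns, dtype=int).T

Z = sample_ibp(num_objects=8, alpha=2.0, rng=np.random.default_rng(0))
print(Z)

The nonparametric part is that the number of feature columns is inferred from the data rather than fixed in advance; the GPU tools in the repository accelerate inference in models built on priors like this one.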

Last Updated October 18, 2015