Bayesian Networks, Multi-Belief Networks and Belief Polarization

Bayesian networks are a type of probabilistic graphical model that uses Bayesian inference for probability computations. They have seen widespread use in a number of fields, including computer science, cognitive science, and philosophy, and they offer a valuable tool for representing the complex inferential structure of beliefs. Notably, Bayesian networks provide a setting in which rational belief polarization can naturally arise: the beliefs of agents move apart, even when the agents are ideally rational and update on precisely the same evidence. My research has focused on studying this phenomenon both between pairs of agents and, in a social setting, on networks of many agents.
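
One standard way to see how polarization can arise is a small network in which agents are uncertain not only about a hypothesis but also about the reliability of their evidential source. The sketch below is purely illustrative and not drawn from the work described above; all probabilities are assumed for the example.

```python
# Minimal sketch of rational belief polarization in a three-node
# Bayesian network H -> E <- R (hypothesis, evidence report, source
# reliability). All numbers here are illustrative assumptions.

# P(report "H is true" | H, R): a reliable source tracks the truth;
# an unreliable one is mildly anti-correlated with it.
LIKELIHOOD = {
    # (H, reliable): P(report | H, R)
    (1, True): 0.9, (0, True): 0.1,
    (1, False): 0.3, (0, False): 0.7,
}

def posterior_h(prior_h, prior_reliable):
    """P(H=1 | report) for an agent with the given priors,
    obtained by summing out R and applying Bayes' theorem."""
    def likelihood(h):
        return (prior_reliable * LIKELIHOOD[(h, True)]
                + (1 - prior_reliable) * LIKELIHOOD[(h, False)])
    joint_true = prior_h * likelihood(1)
    joint_false = (1 - prior_h) * likelihood(0)
    return joint_true / (joint_true + joint_false)

# Two ideally rational agents with the same prior on H see the same
# report, but differ in their priors about the source's reliability.
agent_a = posterior_h(prior_h=0.5, prior_reliable=0.9)  # trusts source
agent_b = posterior_h(prior_h=0.5, prior_reliable=0.2)  # distrusts source
print(agent_a, agent_b)  # A moves above 0.5, B moves below it
```

Conditioning on the same evidence pushes the two posteriors in opposite directions, so the agents' beliefs move apart without either violating Bayesian norms.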

The Significance of Haag’s Theorem in the Foundations of Quantum Field Theory

Haag's theorem is traditionally viewed as a no-go theorem for one of the most widely used and successful techniques in quantum field theory: the interaction picture and its attendant methods of perturbation theory. This raises numerous questions about the success of the interaction picture, the role of axiomatic methods and no-go theorems, and the proper ways to do quantum field theory. Attitudes about how to respond vary widely among philosophers, physicists, and mathematicians. My research, with Marian Gilton and Chris Mitsch, focuses on understanding the significance of Haag's theorem within the foundations of quantum field theory.

Reinforcement Learning and Evolution with Invention

Bargaining games have played a prominent role in modeling the evolution of social conventions. However, most game-theoretic work has assumed that agents must choose from a predetermined set of strategies. A more naturalistic choice would be to allow agents to invent and reinforce new strategies. My research has studied evolution and reinforcement learning for bargaining games in which agents can invent new strategies, and in which unsuccessful strategies can die off or be forgotten.
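
A minimal way to model invention is a Hoppe-urn dynamic (a Pólya urn with a mutator): with some fixed weight, an agent draws a brand-new strategy rather than one it has already reinforced, and discounting lets unsuccessful strategies fade away. The sketch below is illustrative only; the pie size, invention weight, and forgetting rate are assumptions for the example, not parameters from the research described above.

```python
import random

# Hoppe-urn style reinforcement with invention for a simple demand
# game: two agents each demand a share of a pie, and each is paid its
# demand only if the two demands are compatible. All parameters below
# are illustrative assumptions.

PIE = 10
MU = 1.0  # weight on the "invent a new strategy" option

def payoff(d1, d2):
    # each side gets its demand if the demands are compatible
    return (d1, d2) if d1 + d2 <= PIE else (0, 0)

def draw(urn):
    """Sample a demand from an urn {demand: weight}; with probability
    proportional to MU, invent a demand not yet in the urn."""
    total = sum(urn.values()) + MU
    r = random.uniform(0, total)
    for demand, w in urn.items():
        if r < w:
            return demand
        r -= w
    novel = [d for d in range(PIE + 1) if d not in urn]
    return random.choice(novel) if novel else random.choice(list(urn))

def simulate(rounds=5000, forget=0.001, seed=0):
    random.seed(seed)
    urns = [{}, {}]
    for _ in range(rounds):
        d = [draw(urns[0]), draw(urns[1])]
        pay = payoff(d[0], d[1])
        for i in range(2):
            urns[i][d[i]] = urns[i].get(d[i], 0.0) + pay[i]
            # discounting: strategies that stop earning payoff decay
            # and are eventually dropped from the urn entirely
            urns[i] = {s: w * (1 - forget)
                       for s, w in urns[i].items() if w > 1e-6}
    return urns
```

After many rounds, most of each urn's weight typically concentrates on a small set of mutually compatible demands, even though no strategy set was specified in advance.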

The Epistemology of Models and Simulations in High Energy Physics

Computer models and simulations play an increasingly important role in scientific research, yet their epistemic credentials raise difficult questions. My research has focused on understanding the epistemic role of models and simulations in high energy physics. I have also investigated the inferences we can draw from so-called sloppy models, and their relation to scientific realism. Sloppy models depend on a large number of parameters, yet their behavior is highly insensitive to the vast majority of parameter combinations.

Vector Boson Pair Production at the ATLAS Detector

My previous research in particle physics centered on the analysis of data from the ATLAS detector at the Large Hadron Collider (LHC) at the European Organization for Nuclear Research (CERN). I worked on precision measurements of cross-sections for the production of WW and WZ boson pairs, and on setting limits on anomalous triple gauge couplings (such couplings would be an indication of physics beyond the Standard Model). I focused on two types of particle decays that required very different treatments: those in which the decay products merged into a single large-radius, highly boosted jet, and those in which the decay products separated into two distinct jets. I also participated in the trigger upgrade for the second run of the LHC. Throughout, I used tools from data analysis and statistics, as well as computational modeling using Monte Carlo methods.

As a member of the ATLAS collaboration, I am technically listed as a co-author of several hundred papers. Below is a selection of papers to which I made substantial personal contributions.