Variable Selection and Parameter Tuning for BART Modeling in the Fragile Families Challenge
Date
2019-09
Authors
Carnegie, Nicole Bohme; Wu, James
Journal Title
Socius: Sociological Research for a Dynamic World
Abstract
Our goal for the Fragile Families Challenge was to develop a hands-off approach that could be applied in many settings to identify relationships that theory-based models might miss. Data processing was our first and most time-consuming task, particularly handling missing values. Our second task was to reduce the number of variables for modeling, and we compared several techniques for variable selection: the least absolute shrinkage and selection operator (LASSO), regression with a horseshoe prior, Bayesian generalized linear models, and Bayesian additive regression trees (BART). We found minimal differences in final performance across variable selection methods. We proceeded with BART for modeling because it requires minimal assumptions, permits great flexibility in fitting response surfaces, and had served us well in previous black-box modeling competitions. In addition, BART allows for probabilistic statements about predictions and other inferences, an advantage over most machine learning algorithms. A drawback of BART, however, is that it is often difficult to identify or characterize individual predictors that strongly influence the outcome variable.
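As an illustration of the variable-selection step described in the abstract, a minimal Python sketch using scikit-learn's LassoCV is given below. The data, dimensions, and pipeline settings are placeholders rather than the authors' actual analysis, and the final BART fits in the paper would typically use a dedicated BART package (such as the R packages BART or dbarts) rather than scikit-learn.

```python
# Illustrative sketch only (not the authors' code): cross-validated LASSO
# variable selection, one of the approaches compared in the abstract.
# X and y below are synthetic placeholders; the Fragile Families data are
# not reproduced here.
import numpy as np
from sklearn.impute import SimpleImputer
from sklearn.linear_model import LassoCV
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 200))                     # placeholder predictors
y = X[:, 0] - 2 * X[:, 3] + rng.normal(size=500)    # placeholder outcome

# Impute missing values, standardize, then fit a LASSO with 5-fold CV.
pipeline = make_pipeline(
    SimpleImputer(strategy="median"),
    StandardScaler(),
    LassoCV(cv=5),
)
pipeline.fit(X, y)

# Predictors with nonzero coefficients are retained for downstream modeling.
lasso = pipeline.named_steps["lassocv"]
selected = np.flatnonzero(lasso.coef_)
print(f"{selected.size} of {X.shape[1]} predictors selected:", selected)
```

The retained predictors would then be passed to the downstream model; the abstract reports that the choice among LASSO, horseshoe regression, Bayesian GLMs, and BART for this screening step made little difference to final performance.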
Citation
Carnegie, Nicole Bohme, and James Wu. "Variable Selection and Parameter Tuning for BART Modeling in the Fragile Families Challenge." Socius: Sociological Research for a Dynamic World 5 (January 2019): 2378023119825886. doi:10.1177/2378023119825886.
Creative Commons license
Except where otherwise noted, this item's license is described as: © This manuscript version is made available under the CC BY-NC 4.0 license.