This page is still under construction.
Project Leads: David Madigan (Columbia University and Rutgers University), David D. Lewis (David D. Lewis Consulting)
Active Developers: Alex Genkin (original architect and developer), Shenzhi Li
Past Developers: Bing Bai, Dmitriy Fradkin, Michael Hollander, Vladimir Menkov
Contact Address:
This is the permanent home page for the open source Bayesian logistic regression packages BBR, BMR, and BXR. There are currently six programs in the B*R family. All six programs were released by David Madigan of Rutgers University in 2007 under the MIT X License, an open source license which allows the programs to be modified by anyone and used for essentially any purpose. (See the text of the license in the source code files for details.)
Active development is now focused on BXRtrain and BXRclassify, though bug fixes to the BBR and BMR versions will be made when possible. Most users will want to use BXRtrain and BXRclassify, but for completeness we summarize the differences among the family members here.
BBRtrain: Trains binary (i.e. 2 class) logistic regression models from labeled data. All of its capabilities are also present in BXRtrain, except for:
We hope to eventually add these capabilities to BXRtrain, with the exception of representation transformations (which are better handled outside learning software) and threshold tuning (which is handled in a more flexible way in BXRtrain/BXRclassify). Note that along with some earlier versions of BBRtrain we provided a Perl script for computing bootstrap estimates of the variance of model coefficients. It turns out that bootstrapping is not an accurate way (even asymptotically) to estimate that variance when the Laplace prior is used, so we have removed that script.
Project page: BBR and BMR Project at Google Code
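The Bayesian approach used here places a prior on the coefficients, and with the Laplace prior the MAP estimate is equivalent to L1-regularized (lasso) logistic regression. As a rough illustration of that model family (not of the B*R programs themselves), here is a minimal sketch using scikit-learn; the synthetic data and the regularization strength C are made up for the example.

    # Minimal sketch: binary logistic regression with a Laplace (L1) prior,
    # the kind of model BBRtrain fits via MAP estimation.
    # Uses scikit-learn as a stand-in; data and C value are illustrative only.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 50))          # 200 labeled examples, 50 features
    y = (X[:, 0] - 0.5 * X[:, 1] + rng.normal(size=200) > 0).astype(int)

    # penalty='l1' corresponds to a Laplace prior on the coefficients;
    # smaller C means a sharper prior (stronger shrinkage toward zero).
    model = LogisticRegression(penalty="l1", solver="liblinear", C=1.0)
    model.fit(X, y)

    print("nonzero coefficients:", np.count_nonzero(model.coef_))

A practical consequence of the Laplace prior, visible in the sketch, is that many coefficients are driven exactly to zero, giving a sparse model.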
BBRclassify: Applies models trained by BBRtrain to new data. Capabilities currently not present in BXRclassify:
We hope to eventually include the group labeling/hierarchical modeling capability in BXRclassify.
Project page: BBR and BMR Project at Google Code
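Applying a trained binary model comes down to taking the inner product of the model's coefficients with an example's feature vector and passing the result through the logistic function, then comparing the probability to a decision threshold. The sketch below illustrates this in plain Python with hypothetical coefficients and feature ids; it does not parse the actual BBR/BXR model file format.

    # Minimal sketch of what "applying" a binary logistic model means:
    # score a sparse feature vector and convert the score to a probability.
    # Coefficients and feature ids are hypothetical, for illustration only.
    import math

    intercept = -0.3
    coefficients = {1: 0.8, 4: -1.2, 7: 0.5}      # feature id -> weight
    document = {1: 2.0, 7: 1.0, 9: 3.0}           # feature id -> value

    score = intercept + sum(w * document.get(f, 0.0)
                            for f, w in coefficients.items())
    probability = 1.0 / (1.0 + math.exp(-score))  # logistic link

    threshold = 0.5                               # tunable decision threshold
    print(probability, "positive" if probability >= threshold else "negative")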
BMRtrain: Trains polytomous (2+ class) logistic regression models. Capabilities not present in BXRtrain:
The representation transformations will not be added to BXRtrain.
Project page: BBR and BMR Project at Google Code
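In the polytomous case each class gets its own coefficient vector, and class probabilities come from the softmax of the per-class linear scores. The following sketch shows that model form with made-up weights; it illustrates multinomial logistic regression in general, not BMRtrain's internals or file formats.

    # Minimal sketch of the polytomous (multinomial) logistic model:
    # each class k has its own coefficient vector beta_k, and
    # P(y = k | x) is the softmax of the per-class linear scores.
    # The weights below are made up for illustration.
    import numpy as np

    beta = np.array([[ 0.2, -1.0,  0.5],    # class 0 coefficients
                     [-0.4,  0.9,  0.1],    # class 1 coefficients
                     [ 0.0,  0.0,  0.0]])   # class 2 as reference (all zeros)
    x = np.array([1.0, 2.0, -1.0])          # one example's feature vector

    scores = beta @ x
    probs = np.exp(scores - scores.max())   # subtract max for numerical stability
    probs /= probs.sum()
    print(probs, "predicted class:", probs.argmax())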
BMRclassify: Applies models produced by BMRtrain to new data. Capabilities not present in BXRclassify:
The cosine normalization capability will not be added to BXRclassify.
Project page: BBR and BMR Project at Google Code
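Cosine normalization rescales each example's feature vector to unit Euclidean length before scoring, a common preprocessing step in text categorization. A minimal stand-alone sketch follows; it operates on a hypothetical feature-id-to-value dictionary rather than on BMR's input format.

    # Minimal sketch of cosine normalization: rescale a sparse feature vector
    # to unit Euclidean length before it is scored by a model.
    import math

    def cosine_normalize(features):
        norm = math.sqrt(sum(v * v for v in features.values()))
        if norm == 0.0:
            return dict(features)             # leave all-zero vectors unchanged
        return {f: v / norm for f, v in features.items()}

    print(cosine_normalize({1: 3.0, 4: 4.0}))  # -> {1: 0.6, 4: 0.8}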
BXRtrain: Trains both binary and polytomous logistic regression models, and provides a range of new capabilities relative to BBRtrain and BMRtrain, particularly with respect to ease of use.
Project page: BXR Project at Google Code
BXRclassify: Applies models produced by BXRtrain to new data, and provides a range of new capabilities relative to BBRclassify and BMRclassify, particularly with respect to ease of use.
Project page: BXR Project at Google Code
Acknowledgments: Research and development on BBR, BMR, and BXR was supported by the KD-D group, via National Science Foundation grant EIA-0087022 ("Monitoring Message Streams") to DIMACS at Rutgers University. Additional support came from the National Science Foundation through grants DMS-0113236 (ITR program) and DMS-0505599 to Rutgers University, the Agency for Healthcare Research and Quality through grant 5R01HS011609-04 to the University of Illinois at Chicago, and the U.S. Army Research Laboratory - Human Research Engineering Directorate (ARL-HRED) under Cooperative Agreement Number W911NF-07-2-0079 with Columbia College in Chicago, IL.
We gratefully acknowledge suggestions and encouragement from Andrei Anghelescu, Steve Benson, Aynur Dayanik, Susana Eyheramendy, Dmitriy Fradkin, Navendu Garg, Paul Kantor, Sathiya Keerthi, Paul Komarek, Justin Langseth, Vladimir Menkov, Fred Roberts, Cun-Hui Zhang, and all the users of BBR, BMR, and BXR who have written us.