PyCP: An Open-Source Conformal Predictions Toolkit

Conference/Journal
Springer, Berlin, Heidelberg
Authors
Vineeth N. Balasubramanian, Aaron Baker, Matthew Yanez, Shayok Chakraborty, Sethuraman Panchanathan
Abstract
The Conformal Predictions framework is a game-theoretic approach to reliable machine learning that provides a methodology for obtaining error calibration in both classification and regression settings. The framework combines principles of transductive inference, algorithmic randomness, and hypothesis testing to provide guaranteed error calibration in online settings (and calibration in offline settings supported by empirical studies). As the framework is being increasingly used in a variety of machine learning ...
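The abstract's description of the framework can be illustrated with a small sketch. This is not PyCP's actual API (which is not shown in this excerpt) but a generic transductive conformal classifier: each candidate label is added to the training bag, nonconformity scores are computed for every example, and the resulting p-value is compared against the significance level epsilon. The nearest-neighbor nonconformity score is a toy choice for illustration.

```python
# Generic sketch of transductive conformal prediction (not PyCP's API).
# A candidate label is kept in the prediction set when its p-value exceeds
# the significance level epsilon, yielding the 1 - epsilon error guarantee
# in the online setting.

def nonconformity(i, bag):
    """Toy nonconformity score for example i in the augmented bag:
    distance to the nearest other example with the same label."""
    xi, yi = bag[i]
    dists = [abs(xi - xj) for j, (xj, yj) in enumerate(bag)
             if j != i and yj == yi]
    return min(dists) if dists else float("inf")

def conformal_predict(x, labels, train, epsilon=0.1):
    """Return the prediction set for test point x at significance epsilon."""
    prediction_set = []
    for y in labels:
        bag = train + [(x, y)]  # transductive step: augment with candidate
        scores = [nonconformity(i, bag) for i in range(len(bag))]
        a_test = scores[-1]
        # p-value: fraction of scores at least as nonconforming as the test's
        p = sum(1 for a in scores if a >= a_test) / len(scores)
        if p > epsilon:
            prediction_set.append(y)
    return prediction_set

train = [(0.0, "a"), (0.1, "a"), (1.0, "b"), (1.1, "b")]
print(conformal_predict(0.05, ["a", "b"], train, epsilon=0.2))  # → ['a']
```

At a stricter significance level the prediction set can contain several labels (or be empty); the calibration guarantee is that the true label is excluded with probability at most epsilon.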