Abstract. Venn Predictors (VPs) are machine learning algorithms that can provide well-calibrated multiprobability outputs for their predictions. The main drawback of Venn Predictors is their computational inefficiency, especially on large datasets. In this work, we propose an Inductive Venn Predictor (IVP) that overcomes the computational inefficiency of the original Venn Prediction framework. Each VP is defined by a taxonomy that separates the data into categories. We develop an IVP with a taxonomy derived from a multiclass Support Vector Machine (SVM), and we compare our method with other probabilistic methods for SVMs, namely Platt's method, SVM Binning, and SVM with Isotonic Regression. We show that these methods do not always provide well-calibrated outputs, whereas our IVP is guaranteed to be well calibrated under the i.i.d. assumption.
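To make the abstract's key ingredients concrete, the following is a minimal sketch of inductive Venn prediction. It is illustrative only: a toy nearest-centroid classifier stands in for the paper's multiclass SVM, and the taxonomy simply assigns each example the label the underlying classifier predicts. All function names (`centroids`, `taxonomy`, `venn_predict`) are hypothetical, not from the paper.

```python
import math
from collections import Counter

def centroid(points):
    # Component-wise mean of a list of equal-length tuples.
    n = len(points)
    return tuple(sum(p[i] for p in points) / n for i in range(len(points[0])))

def centroids(X, y):
    # One centroid per class; a stand-in for training the multiclass SVM.
    labels = sorted(set(y))
    return {c: centroid([x for x, yi in zip(X, y) if yi == c]) for c in labels}

def taxonomy(x, cents):
    # Category = label of the nearest centroid (the paper instead uses the
    # multiclass SVM's predicted class to define categories).
    return min(cents, key=lambda c: math.dist(x, cents[c]))

def venn_predict(x, X_cal, y_cal, cents):
    """Multiprobability output for test example x.

    For each assumed label of x, the test example is placed in its taxonomy
    category together with the calibration examples falling in that category,
    and the empirical label frequencies of the category are reported.
    """
    labels = sorted(cents)
    cat = taxonomy(x, cents)
    in_cat = [yi for xi, yi in zip(X_cal, y_cal) if taxonomy(xi, cents) == cat]
    rows = {}
    for assumed in labels:
        counts = Counter(in_cat)
        counts[assumed] += 1  # include the test example under the assumed label
        n = sum(counts.values())
        rows[assumed] = {c: counts[c] / n for c in labels}
    # The interval [lower[c], upper[c]] across rows is the multiprobability
    # prediction for label c.
    lower = {c: min(rows[a][c] for a in labels) for c in labels}
    upper = {c: max(rows[a][c] for a in labels) for c in labels}
    return rows, lower, upper

# Usage on two well-separated toy classes:
X_tr = [(0, 0), (0, 1), (1, 0), (5, 5), (5, 6), (6, 5)]
y_tr = [0, 0, 0, 1, 1, 1]
X_cal = [(0.2, 0.1), (0.1, 0.8), (5.2, 5.1), (5.8, 5.2)]
y_cal = [0, 0, 1, 1]
cents = centroids(X_tr, y_tr)
rows, lo, hi = venn_predict((0.3, 0.3), X_cal, y_cal, cents)
```

The "inductive" aspect is that the underlying model is trained once on a proper training set and the calibration set is categorized once, so predicting a new example does not require retraining, which is the source of the computational savings over the original (transductive) Venn Prediction framework.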