In this paper, a new classifier design methodology, confidence-based classifier design, is proposed for building classifiers with controlled confidence. The methodology is guided by two optimal classification theories: a new theory for designing optimal classifiers with controlled error rates, and Chow's optimal classification theory for designing optimal classifiers with controlled conditional error. It also exploits the probability-preserving and probability-ordering properties of current well-developed classifiers, calibrating their output scores to conditional errors or error rates. A sample can thus be either classified or rejected according to the classifier's output score, and the resulting classifier achieves reasonable performance even when it is not an optimal solution. An example implementation of the methodology using support vector machines (SVMs) is presented. The empirical cumulative distribution function method is used to estimate error rates from the output scores of a trained SVM. Furthermore, a new dynamic bin-width allocation method, which adapts to the underlying probabilities, is proposed to estimate the sample conditional error. The experimental results clearly demonstrate the efficacy of the proposed classifier design methodology.
(c) 2006 Pattern Recognition Society. Published by Elsevier Ltd. All rights reserved.
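As a rough illustration of the score-calibration idea summarized above (a minimal sketch, not the paper's actual algorithm; the function names, toy data, and the simple prefix-error estimator are all invented for this example), one can estimate, from a held-out calibration set, the empirical error rate of the samples whose scores exceed each candidate threshold, and then classify or reject new samples against the threshold that meets a target error rate:

```python
# Hypothetical sketch of score-to-error-rate calibration with a reject
# option. Assumes we already have calibration-set scores from a trained
# classifier (e.g. an SVM) and a flag for whether each was classified
# correctly; none of these names come from the paper.

def calibrate_threshold(scores, correct, target_error):
    """Return the smallest score threshold such that the accepted set
    (samples with score >= threshold) has an empirical error rate
    <= target_error, or None if no threshold satisfies it."""
    # Walk candidates from the highest score downward, tracking the
    # cumulative error rate among accepted samples.
    pairs = sorted(zip(scores, correct), reverse=True)
    errors = 0
    best = None
    for accepted, (score, is_correct) in enumerate(pairs, start=1):
        if not is_correct:
            errors += 1
        if errors / accepted <= target_error:
            best = score  # lowering the threshold still meets the target
    return best

def classify_or_reject(score, threshold):
    """Classify a sample only when its score clears the calibrated
    threshold; otherwise reject it (Chow-style reject option)."""
    if threshold is not None and score >= threshold:
        return "classify"
    return "reject"

# Toy usage: 6 calibration samples, target error rate 25% on accepted set.
t = calibrate_threshold(
    scores=[0.9, 0.8, 0.7, 0.6, 0.5, 0.4],
    correct=[True, True, True, False, True, False],
    target_error=0.25,
)
print(t)                            # -> 0.5
print(classify_or_reject(0.6, t))   # -> classify
print(classify_or_reject(0.4, t))   # -> reject
```

The paper's empirical-CDF and dynamic bin-width estimators are more refined than this prefix-error count, but the accept/reject decision has the same shape: map scores to estimated errors on held-out data, then act only on samples whose estimated error is acceptable.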