Classical classifier design methods generally pursue higher accuracy under the assumption that all misclassifications carry the same cost and that each class contains approximately the same number of samples.
The learning of the Backpropagation Neural Network (BPNN) aims at minimizing the classification error, usually assuming that all samples incur equal cost when misclassified.
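To make the equal-cost assumption concrete, the following minimal sketch (not the paper's algorithm) contrasts the standard uniform-cost squared-error objective used in BPNN training with a cost-weighted variant in which each sample's error is scaled by the misclassification cost of its true class; the `class_cost` values and the weighting scheme are illustrative assumptions only.

```python
import numpy as np

def uniform_cost_error(y_true, y_pred):
    """Standard BPNN objective: every misclassification counts equally."""
    return 0.5 * np.sum((y_true - y_pred) ** 2)

def cost_weighted_error(y_true, y_pred, class_cost):
    """Cost-sensitive variant (a sketch, not the paper's method):
    the squared error of each sample is scaled by the assumed
    misclassification cost of its true class."""
    true_cls = np.argmax(y_true, axis=1)                      # index of true class
    per_sample = 0.5 * np.sum((y_true - y_pred) ** 2, axis=1) # per-sample error
    return np.sum(class_cost[true_cls] * per_sample)

# Illustrative two-class example: misclassifying class 0 is assumed 5x as costly.
class_cost = np.array([5.0, 1.0])
y_true = np.array([[1, 0], [0, 1], [1, 0]], dtype=float)   # one-hot labels
y_pred = np.array([[0.2, 0.8], [0.1, 0.9], [0.9, 0.1]])    # network outputs

print(uniform_cost_error(y_true, y_pred))             # treats all errors alike
print(cost_weighted_error(y_true, y_pred, class_cost))  # penalizes costly misses more
```

Under the uniform objective, the two kinds of misclassification contribute identically; the weighted objective pushes training toward avoiding the errors that are assumed to be more expensive.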