In this note, we discuss the boosting algorithm AdaBoost and identify two of its main drawbacks: it cannot be used in the boosting-by-filtering framework, and it is not noise resistant. To address these drawbacks, we propose a modification of AdaBoost's weighting system. First, we prove that the new algorithm is indeed a boosting algorithm, under the condition that the sequence of advantages generated by the weak learning algorithm with respect to the modified distributions is monotonically decreasing. Second, we show how our algorithm can be used in the boosting-by-filtering framework, unlike AdaBoost, which can only be used in the boosting-by-subsampling framework. Finally, we show that our boosting algorithm is a statistical query algorithm and is therefore robust to noise in a certain sense, a property that AdaBoost does not seem to have.
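For reference, the weighting system being modified is the standard AdaBoost reweighting rule, which multiplicatively increases the weight of misclassified examples and decreases that of correctly classified ones. The following is a minimal sketch of one round of that standard update (not the modification proposed here); the helper name `adaboost_weight_update` is hypothetical and used only for illustration.

```python
import math

def adaboost_weight_update(weights, predictions, labels):
    """One round of the standard AdaBoost reweighting rule (illustrative sketch).

    weights:     current distribution over examples (nonnegative, sums to 1)
    predictions: weak hypothesis outputs in {-1, +1}
    labels:      true labels in {-1, +1}
    """
    # Weighted error of the weak hypothesis under the current distribution.
    eps = sum(w for w, p, y in zip(weights, predictions, labels) if p != y)
    # Hypothesis weight; small error yields a large alpha.
    alpha = 0.5 * math.log((1 - eps) / eps)
    # Scale misclassified examples up and correct ones down, then renormalize
    # so the weights again form a probability distribution.
    new = [w * math.exp(-alpha * p * y)
           for w, p, y in zip(weights, predictions, labels)]
    z = sum(new)
    return [w / z for w in new], alpha
```

After one update, the misclassified examples carry exactly half of the total weight, which is what forces the next weak hypothesis to attend to the previous round's mistakes.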