To address the low linear-combination efficiency of base classifiers and the excessive attention paid to hard examples in the Adaptive Boosting (AdaBoost) algorithm, two improved algorithms based on margin theory were proposed: sample Weight and Parameterization of Improved AdaBoost (WPIAda) and sample Weight and Parameterization of Improved AdaBoost-Multitude (WPIAda.M). Firstly, both WPIAda and WPIAda.M divided the updating of sample weights into four cases, increasing the weights of samples whose margins changed from positive to negative so as to suppress the negative movement of the margin and reduce the number of samples with a margin of zero. Secondly, WPIAda.M provided a new method for computing the coefficients of the base classifiers according to their error rates and the distribution of sample weights, thereby improving the combination efficiency of the base classifiers. On 10 UCI datasets, compared with algorithms such as WLDF_Ada (dfAda), skAda and SWA-Adaboost (swaAda), WPIAda and WPIAda.M reduced the test error by 7.46 and 7.64 percentage points on average, respectively, and increased the Area Under Curve (AUC) by 11.65 and 11.92 percentage points, respectively. Experimental results show that WPIAda and WPIAda.M can effectively reduce the attention paid to hard examples, that WPIAda.M can integrate base classifiers more efficiently, and that both algorithms therefore further improve classification performance.
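
A minimal sketch may help make the margin-guided weight update concrete. The abstract does not specify all four weight-update cases, the exact boosting factor, or WPIAda.M's coefficient formula, so the `flipped` rule's factor of 1.5 and the standard AdaBoost coefficient below are hypothetical placeholders; only the overall AdaBoost skeleton and the rule "increase the weight of a sample whose margin moves from positive to negative" come from the text.

```python
# Hypothetical sketch of a margin-guided AdaBoost weight update. This is NOT
# the authors' exact WPIAda/WPIAda.M procedure; only the "boost samples whose
# margin turns from positive to negative" idea is taken from the abstract.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def margin_guided_adaboost(X, y, n_rounds=50, eps=1e-10):
    """y must take values in {-1, +1}. Returns base classifiers and coefficients."""
    n = len(y)
    w = np.full(n, 1.0 / n)      # sample weight distribution
    f = np.zeros(n)              # running ensemble score on the training set
    prev_margin = np.zeros(n)    # margin y * f(x) from the previous round
    learners, alphas = [], []
    for _ in range(n_rounds):
        stump = DecisionTreeClassifier(max_depth=1).fit(X, y, sample_weight=w)
        h = stump.predict(X)
        err = np.sum(w[h != y])
        if err >= 0.5:           # base classifier no better than chance: stop
            break
        # Standard AdaBoost coefficient; WPIAda.M would instead derive this
        # from both the error rate and the weight distribution (details not
        # given in the abstract).
        alpha = 0.5 * np.log((1 - err + eps) / (err + eps))
        f += alpha * h
        margin = y * f
        # Standard exponential reweighting ...
        w *= np.exp(-alpha * y * h)
        # ... plus an extra boost (the factor 1.5 is an arbitrary assumption)
        # for samples whose margin just moved from positive to negative.
        flipped = (prev_margin > 0) & (margin < 0)
        w[flipped] *= 1.5
        w /= w.sum()
        prev_margin = margin
        learners.append(stump)
        alphas.append(alpha)
    return learners, alphas
```

Under these assumptions the extra reweighting step is the only deviation from plain AdaBoost, which keeps the sketch self-contained while illustrating where the four-case margin logic would plug in.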