In this paper, we propose a new method, an adaptive group lasso with least angle regression selection, to improve high-dimensional linear models in exploratory data analysis. Adding excessive variables to a model can have severe consequences: when a model contains many variables, some of them are likely to be strongly correlated, yet explanatory variables should ideally not exhibit strong relationships among themselves. This issue, known as multicollinearity, can significantly distort the interpretation of results by causing notable variation between models. It is well known that proper handling of outliers is essential in data analysis, and even more so in variable selection, since many variable selection methods rely on small differences in model quality, or on statistics such as significance levels computed from model parameters. The adaptive group lasso is an attractive method: it enjoys the oracle property, it is a convex penalty method, and it can be extended to some high-dimensional semiparametric models. It performs better than the lasso, elastic net, ordinary least squares, and ridge regression in various settings, particularly for large column dimensions and big group sizes. Moreover, the adaptive group lasso with least angle regression selection algorithm is robust to parameter selection and attains a smaller variance inflation factor, a smaller mean squared error, and a larger coefficient of determination.
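As a concrete illustration of the multicollinearity diagnostic mentioned above, the following sketch computes the variance inflation factor (VIF) of each explanatory variable directly from its definition, VIF_j = 1 / (1 - R_j^2), where R_j^2 is the coefficient of determination from regressing column j on the remaining columns. This is a minimal NumPy example with synthetic data; the function name `vif` and the simulated design matrix are illustrative assumptions, not part of the proposed method.

```python
import numpy as np

def vif(X):
    """Return the variance inflation factor of each column of X.

    VIF_j = 1 / (1 - R_j^2), where R_j^2 comes from the auxiliary
    least-squares regression of column j on the other columns.
    """
    n, p = X.shape
    out = np.empty(p)
    for j in range(p):
        y = X[:, j]
        others = np.delete(X, j, axis=1)
        # Include an intercept in the auxiliary regression.
        A = np.column_stack([np.ones(n), others])
        beta, *_ = np.linalg.lstsq(A, y, rcond=None)
        resid = y - A @ beta
        r2 = 1.0 - resid.var() / y.var()
        out[j] = 1.0 / (1.0 - r2)
    return out

# Synthetic example (assumed data): x2 is nearly collinear with x1.
rng = np.random.default_rng(0)
x1 = rng.normal(size=200)
x2 = x1 + 0.05 * rng.normal(size=200)   # strongly correlated with x1
x3 = rng.normal(size=200)               # independent of the others
X = np.column_stack([x1, x2, x3])
print(vif(X))  # the first two VIFs are large, the third is near 1
```

A common rule of thumb flags VIF values above 5 or 10 as evidence of problematic multicollinearity, which is the quantity the proposed algorithm is claimed to reduce.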