Vector autoregressions (VARs) and their many variants are standard models in economic and financial research owing to their usefulness for forecasting, data analysis and inference. These strengths stem from their ability to accommodate multiple variables and lags, which, however, comes at the cost of a rapidly growing number of parameters to estimate. High-dimensional models with many variables and lags are therefore difficult to estimate, leading to omitted variables, information biases and a loss of potential forecasting power. Traditionally, the literature has resorted to factor analysis and, especially, to Bayesian methods to overcome this problem. This paper explores machine learning regularization methods as an alternative to these traditional approaches for forecasting and impulse response analysis. We find that regularization structures, which allow for high-dimensional models, outperform standard Bayesian methods in nowcasting and forecasting. Moreover, the impulse responses are robust across the different regularization structures and consistent with economic theory and evidence. Regarding the choice of regularization structure, an elementwise penalty performs best in nowcasting and is the most computationally efficient, whereas a componentwise penalty performs best in forecasting and under cross-validation.
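
As a rough illustration of the kind of elementwise regularization discussed above, the sketch below fits a small VAR(p) equation by equation with an L1 (lasso) penalty on each coefficient. The simulated data, the penalty value, and the use of scikit-learn are assumptions made purely for demonstration; this is not the paper's implementation or its componentwise variant.

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
k, p, T = 5, 2, 200                                  # variables, lags, observations

# Simulate a stable VAR(p) purely to have data to fit (illustrative values).
A = [(0.4 if j == 0 else 0.15) * np.eye(k) + 0.05 * rng.standard_normal((k, k))
     for j in range(p)]
Y = np.zeros((T, k))
for t in range(p, T):
    Y[t] = sum(A[j] @ Y[t - 1 - j] for j in range(p)) + rng.standard_normal(k)

# Regressor matrix X_t = [y_{t-1}', ..., y_{t-p}'] stacked over t = p, ..., T-1.
X = np.hstack([Y[p - 1 - j:T - 1 - j] for j in range(p)])
Z = Y[p:]

# Elementwise L1 penalty: every coefficient is shrunk individually, so the
# system can be estimated as k independent lasso regressions.
B = np.vstack([Lasso(alpha=0.1, fit_intercept=False).fit(X, Z[:, i]).coef_
               for i in range(k)])
print("Estimated [A_1 ... A_p] block matrix shape:", B.shape)   # (k, k * p)
```

A componentwise structure would instead penalize whole blocks of lag coefficients jointly (e.g. via a group penalty), which requires a group-lasso style solver rather than equation-by-equation lasso fits.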