The canopy height model (CHM) is a representation of the height of the top of vegetation above the surrounding ground level. It is crucial for the extraction of various forest characteristics, for instance, timber stock estimation and forest growth measurement. There are different ways of obtaining vegetation height, such as ground-based observations or the interpretation of remote sensing images. The severe downside of field measurement is its cost and acquisition difficulty; therefore, utilizing remote sensing data is, in many cases, preferable. The enormous advances in computer vision over the previous decades have provided various methods for satellite imagery analysis. In this work, we developed a canopy height evaluation workflow that uses only RGB and NIR (near-infrared) bands of very high spatial resolution (investigated on WorldView-2 satellite bands). Leveraging typical data from airplane-based LiDAR (Light Detection and Ranging), we trained a deep neural network to predict the vegetation height. The proposed approach is less expensive than the commonly used drone measurements, and the predictions have a higher spatial resolution (less than 5 m) than the vast majority of studies using satellite data (usually more than 30 m). The experiments, conducted in Russian boreal forests, demonstrated a strong correlation between the predictions and LiDAR-derived measurements. Moreover, we tested the generated CHM as a supplementary feature in the species classification task. Among the different input data combinations and training approaches, we achieved a mean absolute error of 2.4 m using a U-Net with an Inception-ResNet-v2 encoder, a high-resolution RGB image, the near-infrared band, and ArcticDEM. The obtained results show promising opportunities for advanced forestry analysis and management. We also developed an easy-to-use, open-access solution for solving these tasks based on the approaches discussed in the study; it operates on a cloud-free composite orthophotomap provided by Mapbox via a tile-based map service.
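As an illustration of the model configuration named in the abstract, the sketch below assembles a U-Net with an Inception-ResNet-v2 encoder for pixel-wise canopy height regression. It is a minimal sketch, not the authors' implementation: the segmentation_models_pytorch library, the five-band input stack (RGB, NIR, ArcticDEM), the 256×256 patch size, the Adam optimizer, and the L1 objective are all assumptions chosen to mirror the inputs and the MAE metric quoted above.

```python
# Hypothetical sketch only; the paper does not specify its implementation.
import torch
import segmentation_models_pytorch as smp

# U-Net with the encoder reported in the abstract, adapted to a
# five-channel input (R, G, B, NIR, ArcticDEM elevation) and a
# single-channel regression output (canopy height in meters).
model = smp.Unet(
    encoder_name="inceptionresnetv2",
    encoder_weights=None,      # assumption: pretraining is not stated in the abstract
    in_channels=5,
    classes=1,
    activation=None,           # raw regression output, no final activation
)

criterion = torch.nn.L1Loss()  # L1 loss corresponds to the quoted MAE metric (2.4 m)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

# One illustrative training step on randomly generated 256x256 patches,
# standing in for stacked satellite bands and LiDAR-derived CHM targets.
x = torch.randn(2, 5, 256, 256)
y = torch.rand(2, 1, 256, 256) * 30.0  # heights in meters
loss = criterion(model(x), y)
loss.backward()
optimizer.step()
```

In a real pipeline the random tensors would be replaced by co-registered WorldView-2 patches and rasterized LiDAR canopy heights; the patch size and optimizer settings here are placeholders.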