The integration of large amounts of distributed generation into distribution networks is limited by voltage and current constraints. Reactive power-based voltage control concepts (e.g., volt/var control with distributed generators) can partly mitigate the voltage rise caused by generators. As a result, the network hosting capacity can be increased, and costly network reinforcement can be avoided or postponed. However, this works only for voltage-constrained feeders (as opposed to current-constrained feeders). Because low voltage networks are only sparsely monitored, it is important to be able to classify feeders according to their expected constraint in order to avoid the risk of overloading. The main purpose of this paper is to investigate to what extent the hosting capacity constraint (voltage or current) of low voltage feeders can be predicted on the basis of a large network data set. Two machine-learning techniques have been implemented and compared: clustering (unsupervised) and classification (supervised). At first glance, the overall performance of the clustering and classification algorithms might appear rather poor, reflecting the diversity of real low voltage feeders. A detailed analysis, however, shows that the benefit of the classification is significant.
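
To make the comparison between the two techniques concrete, the sketch below contrasts unsupervised clustering with supervised classification on synthetic feeder data. The feeder features (length, conductor ampacity, customer count), the labeling rule, and the choice of k-means and a random forest are illustrative assumptions only; the paper does not specify these algorithms or features, and no real network data is used here.

```python
# Minimal sketch: clustering vs. classification of LV feeders by expected
# hosting-capacity constraint. All features and labels are synthetic and
# hypothetical, not the data set analysed in the paper.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
n = 1000

# Hypothetical per-feeder features: length [km], conductor ampacity [A],
# number of connected customers.
length = rng.uniform(0.1, 2.0, n)
ampacity = rng.uniform(100, 400, n)
customers = rng.integers(5, 120, n)
X = np.column_stack([length, ampacity, customers])

# Hypothetical ground truth: long feeders tend to be voltage-constrained (1),
# feeders with low ampacity tend to be current-constrained (0).
y = (length * 120 > ampacity * rng.uniform(0.8, 1.2, n)).astype(int)

# Unsupervised: partition feeders into two groups without using labels.
clusters = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
# Cluster ids are arbitrary, so score against both label assignments.
cluster_acc = max(accuracy_score(y, clusters),
                  accuracy_score(y, 1 - clusters))

# Supervised: train a classifier on labeled feeders, evaluate on held-out ones.
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = RandomForestClassifier(random_state=0).fit(X_tr, y_tr)
clf_acc = accuracy_score(y_te, clf.predict(X_te))

print(f"clustering agreement with labels: {cluster_acc:.2f}")
print(f"classification accuracy:          {clf_acc:.2f}")
```

As in the paper's framing, the supervised approach can exploit known constraint labels where they exist, while the unsupervised approach must recover the voltage/current split from feeder characteristics alone; on diverse real feeders both can look weak in aggregate even when the classification is operationally useful.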