Automatic and accurate thorax disease diagnosis in Chest X-ray (CXR) images plays an essential role in computer-aided clinical analysis. However, noisy imaging regions and the visual similarity between diseased areas and their surroundings make precise analysis of thoracic disease a challenging problem. In this study, we propose a novel knowledge-guided deep zoom neural network (KGZNet), a data-driven model. Our approach leverages prior medical knowledge to guide its training process, since thoracic diseases are typically confined to the lung regions. We also employ weakly supervised learning (WSL) to search for finer regions without using annotated samples. Each scale is handled by its own classification sub-network. The KGZNet starts from the global image and iteratively generates discriminative regions from coarse to fine: each finer-scale sub-network takes as input an amplified discriminative region attended by the previous scale, in a recurrent manner. Specifically, we first train a robust modified U-Net model for lung segmentation and extract the lung area from the original CXR image through the Lung Region Generator. Then, guided by the attention heatmap, we obtain a finer discriminative lesion region from the lung region image via the Lesion Region Generator. Lastly, the most discriminative features are fused, and complementary feature information across scales is learned for the final disease prediction. Extensive experiments demonstrate that our method effectively leverages discriminative region information and significantly outperforms other state-of-the-art methods on the thoracic disease recognition task.
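The attention-guided zooming step described above can be sketched as follows: an attention heatmap from the coarser scale is thresholded to a bounding box, and the attended region is cropped and amplified before being fed to the finer-scale sub-network. The function name, the fixed-fraction thresholding scheme, and the nearest-neighbour resizing below are illustrative assumptions, not the paper's exact implementation.

```python
import numpy as np

def attention_crop(image, heatmap, thresh_ratio=0.5, out_size=224):
    """Sketch of heatmap-guided region zooming (assumed details):
    threshold the attention map, take the bounding box of the
    activated area, crop it, and amplify it to the next scale's
    input size."""
    h, w = image.shape[:2]
    hh, hw = heatmap.shape
    # Upsample the coarse heatmap to image resolution (nearest neighbour).
    ys = np.arange(h) * hh // h
    xs = np.arange(w) * hw // w
    hm = heatmap[np.ix_(ys, xs)]
    # Binarize: keep locations above a fraction of the peak response.
    mask = hm >= thresh_ratio * hm.max()
    rows, cols = np.where(mask)
    r0, r1 = rows.min(), rows.max() + 1
    c0, c1 = cols.min(), cols.max() + 1
    crop = image[r0:r1, c0:c1]
    # Amplify the attended region to the finer sub-network's input size.
    ch, cw = crop.shape[:2]
    yi = np.arange(out_size) * ch // out_size
    xi = np.arange(out_size) * cw // out_size
    return crop[np.ix_(yi, xi)], (r0, r1, c0, c1)
```

In the full pipeline this step would be applied twice: once with the lung segmentation mask (Lung Region Generator) and once with the classifier's attention heatmap (Lesion Region Generator), each crop feeding the next classification sub-network.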