Adding texture to a line drawing is an important step in the production of comics and illustrations. Garment drawings in particular often exhibit large deformations with self-occlusions, so deforming texture patterns is essential for representing realistic garment designs. However, this step is currently done manually and requires significant effort by experts. A possible approach is to infer 3D surface geometry and then apply texture to the 3D surfaces, but deep creases are difficult to represent this way. In this paper, we introduce a "neurally-guided" optimization system that automatically deforms and directly maps 2D texture patterns onto 2D line drawings, bypassing 3D geometry. First, we build a deep neural network that estimates local transformation matrices of texture patterns, called neural guidance, from line drawings. Second, we build a 2D triangle mesh over the garment and deform the mesh to obtain texture coordinates by integrating the local transformations. Our algorithm is effective and easy to integrate into existing drawing systems. We provide several examples that demonstrate the efficiency of our system over previous methods and illustrate its versatility.

CCS CONCEPTS
• Computing methodologies → Texturing; Shape modeling; Artificial intelligence.