Evasion attacks on deep neural networks (DNNs) use artificial data to manipulate common neural network layers (e.g., convolution operations) into producing higher losses. This allows targets to evade detection and/or classification across a wide range of DNNs without requiring a backdoor attack or knowledge of the specific architecture used. Most existing work on evasion attacks has focused on planar images (e.g., photographs, satellite imagery) under relatively consistent lighting conditions. More recent work has recognised the need to create patterns that are more easily printed or that work in diverse lighting environments. Here, we build printable evasion patterns for fabric clothing to highlight the risks to autonomous systems and to provide data for future adversarial training. These novel evasion attacks target soft-body human stakeholders, with patterns designed to account for body rotation, fabric stretch, printability, and lighting variations. We show that the patterns are effective and robust across different human poses. This poses a significant threat to the safety of autonomous vehicles, and adversarial training should consider this new area.
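The core mechanism described above (perturbing the input so the network's loss increases, causing misdetection or misclassification) can be illustrated with a minimal gradient-sign sketch. This is an illustrative assumption, not the method of this work: it uses a hand-rolled logistic classifier in place of a DNN, and the weights, input, and step size `eps` are arbitrary.

```python
import numpy as np

# Minimal sketch of a gradient-based evasion attack (FGSM-style).
# The logistic "model" and all parameters are illustrative stand-ins
# for a real DNN; they are assumptions, not the paper's actual setup.

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def loss(x, w, b, y):
    """Binary cross-entropy loss for input x with true label y."""
    p = sigmoid(w @ x + b)
    return -(y * np.log(p) + (1.0 - y) * np.log(1.0 - p))

def grad_x(x, w, b, y):
    """Gradient of the loss with respect to the *input* x."""
    p = sigmoid(w @ x + b)
    return (p - y) * w

def fgsm_perturb(x, w, b, y, eps=0.1):
    """Step in the sign of the input gradient to *increase* the loss."""
    return x + eps * np.sign(grad_x(x, w, b, y))

rng = np.random.default_rng(0)
w = rng.normal(size=8)   # hypothetical model weights
b = 0.0
x = rng.normal(size=8)   # hypothetical clean input
y = 1.0                  # true label

x_adv = fgsm_perturb(x, w, b, y)
# The perturbed input yields a strictly higher loss than the original,
# which is the evasion objective described in the text.
assert loss(x_adv, w, b, y) > loss(x, w, b, y)
```

A physical evasion pattern, as studied here, pursues the same loss-maximisation objective but must additionally survive printing, fabric deformation, and lighting changes, which this toy example does not model.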