Convolutional neural networks (CNNs) are often too computationally demanding for mobile devices. Offloading part of the computation lowers this burden: the initial convolutional layers are processed on a smartphone, the resulting high-dimensional features are transmitted, and the later layers are processed in the cloud, at the edge, or on another device. To improve this process, we propose Dynamic Switch, a convolutional subnetwork that enables anywhere-splittable CNNs with multi-rate feature compression using a single set of network parameters. We achieve 90% feature compression with at most 3% accuracy loss for MobileNet and MSDNet on the ImageNet dataset, and at most 4.58% accuracy loss on the CIFAR-100 dataset with MSDNet, ResNet-18, MobileNet/MobileNetV2, and ShuffleNet/ShuffleNetV2.
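To illustrate the kind of intermediate-feature compression the split-computing setting relies on, the sketch below shows a hypothetical scheme (not the paper's Dynamic Switch): the device-side layers emit a float32 feature map, which is compressed before transmission by keeping only the highest-energy channels and quantizing them to 8 bits. The function names, the channel-selection heuristic, and the keep ratio are all illustrative assumptions; the combination of 4x quantization with a 40% channel keep ratio is simply one way to reach roughly 90% size reduction.

```python
import numpy as np

def compress_features(feats, keep_ratio=0.4):
    """Hypothetical compressor: keep the highest-energy channels of a
    (C, H, W) float32 feature map and quantize them to uint8."""
    c = feats.shape[0]
    k = max(1, int(c * keep_ratio))
    # Rank channels by total absolute activation (illustrative heuristic).
    energy = np.abs(feats).reshape(c, -1).sum(axis=1)
    idx = np.sort(np.argsort(energy)[-k:])
    kept = feats[idx]
    # Affine 8-bit quantization over the kept channels.
    lo, hi = float(kept.min()), float(kept.max())
    scale = (hi - lo) / 255.0 or 1.0
    q = np.round((kept - lo) / scale).astype(np.uint8)
    return q, idx, lo, scale

def decompress_features(q, idx, lo, scale, full_shape):
    """Server side: dequantize and scatter channels back; dropped
    channels are filled with zeros."""
    out = np.zeros(full_shape, dtype=np.float32)
    out[idx] = q.astype(np.float32) * scale + lo
    return out

rng = np.random.default_rng(0)
feats = rng.standard_normal((64, 28, 28)).astype(np.float32)
q, idx, lo, scale = compress_features(feats, keep_ratio=0.4)
ratio = 1.0 - q.nbytes / feats.nbytes  # ~0.90 size reduction
rec = decompress_features(q, idx, lo, scale, feats.shape)
```

In a real deployment the server-side layers would consume `rec` in place of the original features; Dynamic Switch differs in that a single set of parameters supports splitting at any layer and at multiple compression rates.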