Given a set of constraints F and a user-defined weight function W on the assignment space, the problem of constrained sampling is to sample satisfying assignments of F conditioned on W. Constrained sampling is a fundamental problem with applications in probabilistic reasoning, synthesis, and software and hardware testing. Consequently, sampling has been the subject of intense theoretical and practical investigation over the years. Despite this, a gap remains between theory and practice. In particular, there has been significant progress in the development of sampling techniques when W is the uniform distribution, but such techniques fail to handle general weight functions W. Furthermore, we are often interested in Σ₁¹ formulas, i.e., G(X) := ∃Y F(X, Y) for some F; typically, the variables Y are introduced as auxiliary variables during the encoding of constraints into F. In this context, one wonders whether it is possible to design sampling techniques whose runtime performance is agnostic to the underlying weight distribution and that can handle Σ₁¹ formulas. The primary contribution of this work is a novel technique, called WAPS, for sampling over Σ₁¹ formulas whose runtime is agnostic to W. WAPS is based on our recently discovered connection between knowledge compilation and uniform sampling. WAPS proceeds by compiling F into a well-studied compiled form, d-DNNF, which allows sampling operations to be conducted in time linear in the size of the compiled form. We demonstrate that WAPS can significantly outperform the existing state-of-the-art weighted and projected sampler, WeightGen, by up to three orders of magnitude in runtime, achieving a geometric-mean speedup of 296× and solving 564 more instances out of 773. The distribution generated by WAPS is statistically indistinguishable from that generated by an ideal weighted and projected sampler. Furthermore, WAPS is almost oblivious to the number of samples requested.
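The key algorithmic claim above is that, once F is compiled into a d-DNNF, weighted sampling reduces to one bottom-up weight-annotation pass followed by a top-down pass per sample, each linear in the size of the compiled form. The sketch below illustrates that idea only; it is not the WAPS implementation, and the Node class and the annotate, sample, and lit_weight names are hypothetical. It assumes a smooth d-DNNF (the children of every OR node mention the same variables) and a weight function W given as a product of literal weights.

```python
# Minimal sketch (not the authors' code) of weighted sampling over a d-DNNF.
# Assumes a smooth d-DNNF and a literal-weight function lit_weight.
import random

class Node:
    def __init__(self, kind, children=None, literal=None):
        self.kind = kind          # 'AND', 'OR', 'LIT', or 'TRUE'
        self.children = children or []
        self.literal = literal    # signed integer for 'LIT' nodes
        self.weight = None        # filled in by annotate()

def annotate(node, lit_weight):
    """Bottom-up pass: total weight of the assignments represented by each node."""
    if node.weight is not None:
        return node.weight
    if node.kind == 'TRUE':
        node.weight = 1.0
    elif node.kind == 'LIT':
        node.weight = lit_weight(node.literal)
    elif node.kind == 'AND':
        # Decomposable AND: children share no variables, so weights multiply.
        node.weight = 1.0
        for c in node.children:
            node.weight *= annotate(c, lit_weight)
    else:
        # Deterministic OR: children represent disjoint sets, so weights add.
        node.weight = sum(annotate(c, lit_weight) for c in node.children)
    return node.weight

def sample(node, assignment):
    """Top-down pass: draw one assignment with probability proportional to its weight."""
    if node.kind == 'LIT':
        assignment.append(node.literal)
    elif node.kind == 'AND':
        for c in node.children:
            sample(c, assignment)
    elif node.kind == 'OR':
        # Pick a child with probability proportional to its annotated weight.
        r = random.uniform(0, node.weight)
        acc = 0.0
        for c in node.children:
            acc += c.weight
            if r <= acc:
                sample(c, assignment)
                break
    return assignment
```

Under these assumptions, annotate is run once on the root; each subsequent call to sample then draws an assignment with probability proportional to its weight, in time at most linear in the size of the d-DNNF, which is consistent with the near-obliviousness to the number of samples noted above.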