Recent years have witnessed great progress in the development of deep neural networks (DNNs), which has led to growing interest in deploying DNNs in resource-constrained settings such as network-edge and edge-cloud environments. To enable efficient DNN inference, numerous acceleration approaches and specialized platforms have been developed. The flexibility and diverse capabilities offered by these approaches and platforms result in large design spaces with complex trade-offs for DNN deployment. Relevant objectives in these trade-offs include inference accuracy, latency, throughput, memory requirements, and energy consumption. Tools that can effectively assist designers in deriving efficient DNN configurations for specific deployment scenarios are therefore needed. In this work, we present a design space exploration framework for this purpose. In the proposed framework, DNNs are represented as dataflow graphs using a lightweight-dataflow-based modeling tool, and schedules (strategies for managing processing resources across different DNN tasks) are likewise modeled in a formal, abstract form using dataflow methods. The dataflow-based application and schedule representations are integrated systematically with a multiobjective particle swarm optimization (PSO) strategy, which enables efficient evaluation of implementation trade-offs and derivation of Pareto fronts of alternative deployment configurations. Experimental results on different DNN architectures demonstrate the effectiveness of the proposed framework in exploring design spaces for DNN deployment.
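To make the role of the multiobjective PSO component more concrete, the following is a minimal sketch, not the authors' implementation: it evolves hypothetical deployment configurations (quantization bit-width and batch size) against placeholder latency and accuracy-loss models, and maintains an external archive of non-dominated solutions as an approximate Pareto front. In the actual framework, the objective evaluations would instead come from the dataflow-based application and schedule models.

```python
"""Illustrative multiobjective PSO loop for DNN deployment design space exploration.

The configuration encoding and the analytical objective models are hypothetical
placeholders used only to show the archive-based Pareto-front derivation.
"""
import random

N_PARTICLES, N_ITERS, DIM = 20, 50, 2
LOW, HIGH = [2.0, 1.0], [16.0, 64.0]  # bit-width in [2, 16], batch size in [1, 64]


def evaluate(x):
    """Return (latency, accuracy_loss) for a configuration -- placeholder models."""
    bits, batch = x
    latency = 0.1 * batch + 0.5 * bits        # higher precision and larger batches cost latency
    acc_loss = max(0.0, 16.0 - bits) * 0.5    # aggressive quantization costs accuracy
    return (latency, acc_loss)


def dominates(a, b):
    """True if objective vector a Pareto-dominates b (minimization)."""
    return all(ai <= bi for ai, bi in zip(a, b)) and any(ai < bi for ai, bi in zip(a, b))


def update_archive(archive, candidate):
    """Maintain an external archive of non-dominated (position, objectives) pairs."""
    _, obj = candidate
    if any(dominates(o, obj) for _, o in archive):
        return
    archive[:] = [(p, o) for p, o in archive if not dominates(obj, o)]
    archive.append(candidate)


random.seed(0)
pos = [[random.uniform(LOW[d], HIGH[d]) for d in range(DIM)] for _ in range(N_PARTICLES)]
vel = [[0.0] * DIM for _ in range(N_PARTICLES)]
pbest = [(p[:], evaluate(p)) for p in pos]
archive = []
for pb in pbest:
    update_archive(archive, pb)

for _ in range(N_ITERS):
    for i in range(N_PARTICLES):
        leader = random.choice(archive)[0]    # global guide drawn from the Pareto archive
        for d in range(DIM):
            r1, r2 = random.random(), random.random()
            vel[i][d] = (0.5 * vel[i][d]
                         + 1.5 * r1 * (pbest[i][0][d] - pos[i][d])
                         + 1.5 * r2 * (leader[d] - pos[i][d]))
            pos[i][d] = min(HIGH[d], max(LOW[d], pos[i][d] + vel[i][d]))
        obj = evaluate(pos[i])
        if dominates(obj, pbest[i][1]):
            pbest[i] = (pos[i][:], obj)
        update_archive(archive, (pos[i][:], obj))

# Report the approximated Pareto front of deployment configurations.
for p, o in sorted(archive, key=lambda t: t[1]):
    print(f"bits={p[0]:5.2f}  batch={p[1]:5.1f}  latency={o[0]:6.2f}  acc_loss={o[1]:5.2f}")
```

The archive-based leader selection shown here is one common way to extend single-objective PSO to multiple objectives; the specific PSO variant and objective models used in the framework may differ.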