The Internet of Things (IoT) now permeates daily life, providing measurement and data-collection tools that inform everyday decisions. Millions of sensors and devices continuously generate data and exchange information over complex networks that support machine-to-machine communication and monitor and control critical smart-world infrastructure. Edge computing has emerged as a new paradigm for meeting the demands of the IoT and localized computation, alleviating escalating resource congestion. In contrast to the well-known cloud computing model, edge computing migrates computation and storage to the edge of the network, close to end users. Compute nodes distributed across the network can thus offload computational pressure from centralized data centers and significantly reduce the latency of message exchanges. Moreover, this distributed architecture balances network traffic and avoids traffic spikes in the IoT network, reducing latency between edge/cloud servers and end users and shortening response times for real-time IoT applications compared with traditional cloud services. In this article, we present a comprehensive survey of how edge computing can improve the performance of IoT networks. We classify edge computing approaches into groups according to their architecture and study their performance by comparing network latency, bandwidth occupancy, energy consumption, and overhead. Through a systematic introduction of the concept of edge computing, its typical application scenarios, the current state of research, and its key technologies, we conclude that edge computing is still at an early stage of development. Many problems remain to be solved in practical deployments, including optimizing edge computing performance, security, interoperability, and intelligent edge operation and management services.