The cloud native approach is rapidly transforming how applications are developed and operated, turning monolithic applications into microservice applications and allowing teams to release faster, increase reliability, and expedite operations by taking full advantage of cloud resources and their elasticity. At the same time, "fog computing" is emerging, bringing the cloud towards the edge, near the end user, in order to increase privacy, improve resource efficiency, and reduce latency. Combining these two trends, however, proves difficult because of four fundamental disconnects between the cloud native paradigm and fog computing. This article identifies these disconnects and proposes a fog native architecture, along with a set of design patterns, to take full advantage of the fog. Central to this approach is turning microservice applications into microservice workflows, constructed dynamically by the system using an intent-based approach that takes into account factors such as user requirements, request location, and the available infrastructure and microservices. The architecture introduces a novel softwarized fog mesh facilitating inter-microservice connectivity, external communication, and end-user aggregation. Our evaluation analyses the impact of distributing microservice-based applications over a fog ecosystem, illustrating how CPU and network latency and application metrics affect the perceived Quality of Service of fog native workflows compared to the cloud. The results show that the fog can offer superior application performance under the right conditions.