High throughput experimentation is a growing and evolving field that allows researchers to execute dozens to several thousands of experiments per day with relatively limited resources. Through miniaturization, typically combined with a high degree of automation and the use of digital data tools, such workflows can run many parallel reactions or experiments at a time. High throughput experimentation also requires fast analytical techniques capable of generating critically important analytical data at a pace that keeps up with the increased rate of experimentation. As traditional techniques usually do not deliver the speed required, unique approaches are needed to enable these workflows to function as designed. This review covers the recent developments (2019‐2020) in this field and is intended to give a comprehensive overview of the current “state‐of‐the‐art.”