A new metaheuristic can be developed from scratch, by modifying an existing metaheuristic, or by hybridizing several metaheuristics. This work presents a new metaheuristic, the extended stochastic coati optimizer (ESCO), developed by extending the recently proposed coati optimization algorithm (COA). ESCO expands the number of searches and references used in COA. It also implements a stochastic process that lets each unit choose which searches it will perform, unlike COA, which splits the population into two fixed groups, each performing its own strategy. ESCO runs three sequential phases in every iteration, and in each phase a unit chooses between two options. Its guided search uses three references: the global best unit, a randomly selected unit, and a randomly generated point within the search space. In this work, ESCO is challenged to solve 23 classic benchmark functions and is compared with five recent metaheuristics: the guided pelican algorithm (GPA), the puzzle optimization algorithm (POA), GSO, the average subtraction-based optimizer (ASBO), and the coati optimization algorithm (COA). The results show the superiority of ESCO: it outperforms GPA, POA, GSO, ASBO, and COA in solving 13, 21, 23, 16, and 13 functions, respectively. The investigation indicates that the multiple-search approach is more effective than the single-search approach.
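The core mechanism described above can be sketched in a few lines of Python: each unit passes through three sequential phases per iteration, stochastically picks one of two moves in each phase, and is guided by one of three references (the global best unit, a randomly selected unit, or a random point in the search space). The specific update rule, phase-to-reference mapping, and greedy acceptance used here are illustrative assumptions, not the paper's exact equations.

```python
import random

def sphere(x):
    # Classic benchmark function: f(x) = sum(x_i^2), minimum 0 at the origin.
    return sum(v * v for v in x)

def esco_sketch(f, dim=5, pop=20, iters=200, lo=-10.0, hi=10.0, seed=1):
    """Illustrative sketch of ESCO's stochastic multi-search idea.

    Assumptions (not from the paper): the step rule `toward`, the 0.5
    choice probability per phase, and greedy acceptance of candidates.
    """
    rng = random.Random(seed)
    X = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(pop)]
    best = min(X, key=f)

    def toward(x, ref):
        # Guided move: random step from x toward a reference (assumed rule).
        return [xi + rng.random() * (ri - xi) for xi, ri in zip(x, ref)]

    for _ in range(iters):
        for i in range(pop):
            x = X[i]
            # Three sequential phases; in each, the unit stochastically
            # chooses one of two search options (the core ESCO idea).
            for phase in range(3):
                if rng.random() < 0.5:
                    ref = best                       # reference 1: global best unit
                elif phase == 1:
                    ref = X[rng.randrange(pop)]      # reference 2: random unit
                else:
                    ref = [rng.uniform(lo, hi) for _ in range(dim)]  # reference 3: random point
                cand = [min(hi, max(lo, v)) for v in toward(x, ref)]
                if f(cand) < f(x):                   # greedy acceptance (assumed)
                    x = cand
            X[i] = x
            if f(x) < f(best):
                best = x
    return best, f(best)

best, val = esco_sketch(sphere)
```

With this stochastic per-unit choice, every unit mixes exploitation (moving toward the global best) with exploration (moving toward a random unit or random point), rather than being locked into one fixed strategy as in COA.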