For a number of years, research carried out at several centres has demonstrated the potential benefits of 100‐m‐scale models for a range of meteorological phenomena. More recently, some meteorological services have started to consider seriously the operational implementation of practical hectometric models. Many, but by no means all, of the applications are likely to relate to urban areas, where the enhanced resolution has obvious benefits. This article is concerned with the issues that need to be addressed to bridge the gap between research at 100‐m scales and practical operational models. We highlight a number of key issues and suggest important avenues for future development. An overarching issue is the high computational cost of these models. Although some ideas to reduce this cost are presented, it will always be a serious constraint. This means that the benefits of these models over lower‐resolution ones, or over other techniques for generating high‐resolution forecasts, will need to be clearly understood, as will the trade‐offs with resolution. We discuss issues with model dynamical cores and physics–dynamics coupling. There are a number of challenges around model parameterisations: some of the traditional problems (e.g., convection) become easier, but new ones (e.g., around surface parameterisations) appear. Observational data at these scales present a challenge, and novel types of observations will need to be considered. Data assimilation will be needed for short‐range forecasts, but there is currently little experience of it at these scales, although some of the likely issues are clear. An ensemble approach will be essential in many cases (e.g., convection), but research is needed into ensembles at these scales, and significant work on post‐processing systems is required to make the best use of models at these grid lengths.
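As a rough indication of why this cost is so constraining, a commonly quoted back‐of‐envelope scaling (given here only as an illustration, not a result of this article) is that refining the horizontal grid length $\Delta x$ by a factor $r$ multiplies the number of columns by $r^{2}$ and, through the CFL condition, shortens the time step $\Delta t$ by roughly the same factor, so that

$$
\mathrm{cost} \;\propto\; \frac{1}{\Delta x^{2}\,\Delta t} \;\propto\; r^{3}.
$$

Moving from a 1‐km to a 100‐m grid length ($r = 10$) therefore implies a cost increase of order $10^{3}$, before any accompanying increase in vertical resolution, domain size, or ensemble size.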