Developing for tabletops places special requirements on interface programming frameworks: managing parallel input, discovering devices, handling device equivalence, and describing combined interactions. We analyse these issues and describe the solutions adopted in IntuiKit, a model-based framework aimed at making the design and development of post-WIMP user interfaces more accessible. Some solutions follow directly from the support of multimodality, while others are more specific to multiple-touch input. We illustrate these features through examples developed in several tabletop projects, including an application aimed at improving collaboration between air traffic controllers.