More than a decade after the term was introduced by Jeff Howe in 2006 [33], crowdsourcing seems to be considered by many as a, if not "the," universal means to solve virtually any kind of problem, online and offline, that requires sustained human involvement. We see it used to motivate employees to engage with less rewarding work routines, attract the best possible ideas to boost innovation, enhance artificial intelligence algorithms, and support ambitious social and entrepreneurial initiatives. Achieving a goal by collecting contributions from many individuals has a long tradition, reaching far beyond the most recent developments in the digital world. In fact, some of the most successful exemplars of the "wisdom of the crowds" in modern times, for instance Wikipedia, predate Howe's article. And yet, with the rise of social networks, smart mobile devices, and online platforms, the phenomenon has found a new dimension in terms of scale and achievements: it routinely mobilises very large groups of people in a relatively short period of time, helping organisations from tech and government to the military and marketing improve the ways they operate, decide, and engage with the world. The crowdsourcing landscape is as diverse as its applications. It includes paid microtask platforms such as Amazon's Mechanical Turk and CrowdFlower (now Figure Eight), alongside online labour marketplaces such as TaskRabbit and UpWork, and open innovation contests in the style of Kaggle and Innocentive. It features volunteer citizen science systems such as Galaxy Zoo and SciStarter, and games with a purpose (GWAPs) such as Phrase Detectives and EyeWire.
Putting aside the principled differences between these forms of crowdsourcing, which make them from the outset amenable only to specific types of problems, the success of any crowdsourcing endeavour depends on the ability of the "requester" (the person or institution reaching out to the crowd) to benefit from crowd outputs and to attract a critical mass of contributors. The framing conditions can be challenging as well: to be effective, crowdsourcing needs to provide a real alternative to existing solutions to the problem the requester is trying to solve, in terms of quality, timeliness, and cost. Getting this mixture right is not trivial, as it requires insight into a wide array of subjects, from artificial intelligence and data analysis to user experience design and behavioural sciences [21, 22]. Aligning the motivations of the crowd with the goals of the requester, and finding the right mix of rewards to drive purposeful crowd engagement, are particularly critical. Existing crowdsourcing platforms already provide some level of support: whether paid microtasks, challenges, or citizen science projects, each of these forms of crowdsourcing makes some basic assumptions about what drives people to contribute to their "open call" [33], and offers sometimes bespoke incentives to encourage a certain style of behaviour. For example, people register on Mechanical Turk primarily for f...