A general model has yet to be established for Crowdsourced Design (cDesign). As a contribution to the development of such a general model, this paper proposes a cDesign framework to support the creation of crowdsourced design activities. Within the cDesign framework, the effective evaluation of design quality is identified as a key component: it not only enables the creative activities of a large, virtual workforce to be leveraged, but is also fundamental to almost all iterative optimisation processes. This paper reports an experimental investigation into two different crowdsourced design evaluation approaches: free evaluation and 'Crowdsourced Design Evaluation Criteria' (cDEC). The results are benchmarked against a 'manual' evaluation carried out by a panel of experienced designers. The results suggest that the cDEC approach produces design rankings that correlate strongly with the judgements of the expert panel. The paper concludes that the cDEC assessment methodology demonstrates how crowdsourcing can be used effectively to evaluate, as well as generate, new design solutions.
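To make the benchmarking idea concrete, the sketch below compares a crowd-derived design ranking with an expert-panel ranking using Spearman's rank correlation. This is an illustrative assumption, not the paper's stated procedure: the design identifiers, the scores, and the choice of Spearman's rho are hypothetical placeholders standing in for whatever aggregation and correlation measure the study actually used.

```python
# Illustrative sketch (assumed, not taken from the paper): comparing a
# crowd-derived design ranking against an expert-panel ranking using
# Spearman's rank correlation. All design IDs and scores are hypothetical.
from scipy.stats import spearmanr

# Hypothetical aggregate scores for six design submissions.
expert_scores = {"D1": 8.5, "D2": 6.0, "D3": 9.1, "D4": 4.2, "D5": 7.3, "D6": 5.5}
crowd_scores  = {"D1": 7.9, "D2": 6.4, "D3": 8.8, "D4": 3.9, "D5": 7.0, "D6": 6.1}

designs = sorted(expert_scores)               # fixed ordering of design IDs
expert = [expert_scores[d] for d in designs]
crowd = [crowd_scores[d] for d in designs]

# spearmanr ranks the raw scores internally, so scores can be passed directly;
# rho close to 1 would indicate strong agreement between crowd and experts.
rho, p_value = spearmanr(expert, crowd)
print(f"Spearman rho = {rho:.2f} (p = {p_value:.3f})")
```

A rank-based measure is used here (rather than, say, Pearson correlation on raw scores) because the claim being examined concerns the ordering of designs, not agreement on absolute score values.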