The rate of change in the world is increasing in both scope and magnitude, driven by rapidly developing digital technologies. The challenge for performance measurement and management (PMM) in the digital era is twofold: first, constant change in the external environment is compelling PMM to become more dynamic; second, organisations have to deal with different varieties and volumes of data to create competitive advantage. The aim of this paper is to explore how PMM models and practices should be renovated to be resilient and to reflect advances in digital economies. A literature review of the state of the art was conducted, covering the issues faced by organisations in digital economies and their relevance to PMM. A case study was then conducted to explore practitioners' perceptions of dealing with these issues and to understand how they are changing their PMM. The key findings from the study are: (1) organisations should refocus their measurement efforts to incorporate evaluation of their performance over a wider network involving various stakeholders; (2) organisations need to understand how technological developments could create competitive advantage through their strategy and deploy it through relevant positivistic and behavioural measures.
Managed Learning Environments (MLEs) are relatively new to higher education institutions (HEIs), even though over 90% of institutions in the higher and further education sector are currently engaged in some kind of MLE development activity (University of Brighton 2003). However, when it comes to assessing the performance of an MLE, no universally recognised frameworks for evaluating MLEs in HEIs are currently discussed in the literature. This paper advances a general systemic framework for evaluating MLEs based on Checkland's Soft Systems Methodology (SSM) and reports on the first stages of our attempt to evaluate the MLE at Manchester Metropolitan University, involving the team developing the system and the stakeholders concerned. Two of the three iterations of this research have been completed and, whilst a full account is outside the scope of this paper, they have found that SSM has coped with the criteria demanded of the evaluation framework within its context. Once a stakeholder analysis had been completed, the criteria for evaluating an MLE, based on the stakeholders' requirements, emerged. These iterations have tentatively concluded that, in contextualising SSM to the evaluation requirements of an MLE in a UK HEI, the measures of performance suggested by SSM need to be adjusted. The final iteration will check this outcome.