Abstract—Given the central role that software development plays in the delivery and application of information technology, managers are increasingly focusing on process improvement in the software development area. This demand has spurred the provision of a number of new and/or improved approaches to software development, with perhaps the most prominent being object-orientation (OO). In addition, the focus on process improvement has increased the demand for software measures, or metrics, with which to manage the process. The need for such metrics is particularly acute when an organization is adopting a new technology for which established practices have yet to be developed. This research addresses these needs through the development and implementation of a new suite of metrics for OO design. Metrics developed in previous research, while contributing to the field's understanding of software development processes, have generally been subject to serious criticisms, including the lack of a theoretical base. Following Wand and Weber, the theoretical base chosen for the metrics was the ontology of Bunge. Six design metrics are developed, and then analytically evaluated against Weyuker's proposed set of measurement principles. An automated data collection tool was then developed and implemented to collect an empirical sample of these metrics at two field sites in order to demonstrate their feasibility and suggest ways in which managers may use these metrics for process improvement.
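Two of the six design metrics in the published suite, Depth of Inheritance Tree (DIT) and Number of Children (NOC), can be illustrated with a small sketch. The class hierarchy below is hypothetical, chosen only to exercise the two metrics; a production data collection tool would of course work over real source code rather than live Python classes.

```python
# Toy illustration of two inheritance-based OO design metrics, DIT and
# NOC, computed over a made-up class hierarchy via Python introspection.

class Shape: ...
class Polygon(Shape): ...
class Ellipse(Shape): ...
class Triangle(Polygon): ...
class Rectangle(Polygon): ...

def dit(cls: type) -> int:
    """Depth of Inheritance Tree: longest path from cls up to a root class."""
    parents = [b for b in cls.__bases__ if b is not object]
    return 0 if not parents else 1 + max(dit(p) for p in parents)

def noc(cls: type) -> int:
    """Number of Children: count of immediate subclasses of cls."""
    return len(cls.__subclasses__())

print(dit(Triangle), noc(Polygon))  # Triangle is two levels below Shape
```

High DIT values flag classes that inherit many ancestors' behavior (harder to predict), while high NOC flags classes whose interface many others depend on; both are candidate levers for the process-improvement uses the abstract describes.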
The burden of organizational learning surrounding software process innovations (SPIs)---and complex organizational technologies in general---creates a "knowledge barrier" that inhibits diffusion. Attewell (Attewell, P. 1992. Technology diffusion and organizational learning: the case of business computing. Organ. Sci. 3(1) 1--19.) has suggested that many organizations will defer adoption until knowledge barriers have been sufficiently lowered; however, this leaves open the question of which organizations should be more likely to innovate, even in the face of high knowledge barriers. It is proposed here that organizations will innovate in the presence of knowledge barriers when the burden of organizational learning is effectively lower, either because much of the required know-how already exists within the organization, or because such knowledge can be acquired more easily or more economically. Specifically, it is hypothesized that organizations will have a greater propensity to initiate and sustain the assimilation of SPIs when they have a greater scale of activities over which learning costs can be spread (learning-related scale), more extensive existing knowledge related to the focal innovation (related knowledge), and a greater diversity of technical knowledge and activities (diversity). An empirical study using data on the assimilation of object-oriented programming languages (OOPLs) by 608 information technology organizations strongly confirmed the importance of the three hypothesized factors in explaining the assimilation of OOPLs.
Keywords: software process innovation, adoption, diffusion, assimilation, organizational learning, object-oriented programming, object orientation, disk-based survey
Because of network externalities, the success of a software product may depend in part on its installed base and its conformance to industry standards. This research builds a hedonic model to determine the effects of network externalities, standards, intrinsic features and a time trend on microcomputer spreadsheet software prices. When data for a sample of products during the 1987--1992 time period were analyzed using this model, four main results emerged: 1) Network externalities, as measured by the size of a product's installed base, significantly increased the price of spreadsheet products: a one percent increase in a product's installed base was associated with a 0.75% increase in its price. 2) Products which adhered to the dominant standard, the Lotus menu tree interface, commanded prices which were higher by an average of 46%. 3) Although nominal prices increased slightly during this time period, quality-adjusted prices declined by an average of 16% per year. 4) The hedonic model was found to be a good predictor of actual market prices, despite the fact that it was originally estimated using list prices. Several variations of the model were examined, and, while the qualitative findings were robust, the precise estimates of the coefficients varied somewhat depending on the sample of products examined, the weighting of the observations and the functional form used in estimation, suggesting that the use of hedonic methods in this domain is subject to a number of limitations due, inter alia, to the potential for strategic pricing by vendors.
Keywords: network externalities, standards, pricing, software, hedonic regression, information technology
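The elasticity interpretation in finding 1 follows from estimating the hedonic model in log-log form, where the coefficient on log installed base directly gives the percentage price response. The sketch below simulates hypothetical data with the abstract's headline effects baked in and recovers them by ordinary least squares; the data, variable names, and simulation parameters are all illustrative, not the study's actual sample or specification.

```python
# Sketch of a log-log hedonic price regression. Data are simulated so
# that the true installed-base elasticity is 0.75 and the standard-
# conformance premium is 46%, matching the abstract's headline results.
import numpy as np

rng = np.random.default_rng(0)
n = 200

log_base = rng.normal(12.0, 1.0, n)               # log(installed base)
lotus_std = rng.integers(0, 2, n).astype(float)   # 1 if Lotus-compatible
year = rng.integers(0, 6, n).astype(float)        # years since 1987

# log(price) with elasticity 0.75, a 46% standard premium, and a
# quality-adjusted decline of roughly 16% per year, plus noise.
log_price = (1.0 + 0.75 * log_base + np.log(1.46) * lotus_std
             - 0.16 * year + rng.normal(0, 0.05, n))

# Recover the coefficients by OLS.
X = np.column_stack([np.ones(n), log_base, lotus_std, year])
beta, *_ = np.linalg.lstsq(X, log_price, rcond=None)

elasticity = beta[1]            # ~0.75: 1% more base -> ~0.75% higher price
premium = np.exp(beta[2]) - 1   # ~0.46: premium for the dominant standard
print(round(elasticity, 2), round(premium, 2))
```

Because a binary regressor's coefficient in a log-linear model is a log price ratio, the percentage premium is `exp(beta) - 1`, which is why the 46% figure is not read off the coefficient directly.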
Practitioners have expressed concern over their inability to accurately estimate costs associated with software development. This concern has become even more pressing as costs associated with development continue to increase. As a result, considerable research attention is now directed at gaining a better understanding of the software-development process as well as constructing and evaluating software cost estimating tools. This paper evaluates four of the most popular algorithmic models used to estimate software costs (SLIM, COCOMO, Function Points, and ESTIMACS). Data on 15 large completed business data-processing projects were collected and used to test the accuracy of the models' ex post effort estimation. One important result was that Albrecht's Function Points effort estimation model was validated by the independent data provided in this study [3]. The models not developed in business data-processing environments showed significant need for calibration. As models of the software-development process, all of the models tested failed to sufficiently reflect the underlying factors affecting productivity. Further research will be required to develop understanding in this area.
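Of the four models evaluated, COCOMO is the simplest to state in closed form: Basic COCOMO estimates effort as a power function of size, Effort = a · KLOC^b, with published coefficients per development mode (Boehm 1981). The sketch below uses those published constants; the 32-KLOC project is a made-up example, not one of the study's 15 projects.

```python
# Basic COCOMO effort equation, Effort = a * KLOC**b (person-months),
# with Boehm's (1981) published coefficients for the three modes.

COCOMO_MODES = {
    "organic":       (2.4, 1.05),
    "semi-detached": (3.0, 1.12),
    "embedded":      (3.6, 1.20),
}

def basic_cocomo_effort(kloc: float, mode: str = "organic") -> float:
    """Estimated effort in person-months for a project of `kloc` KLOC."""
    a, b = COCOMO_MODES[mode]
    return a * kloc ** b

effort = basic_cocomo_effort(32.0, "organic")
print(f"{effort:.1f} person-months")
```

The study's calibration finding is visible in this form: an uncalibrated model applies coefficients fitted to one environment (here, Boehm's original project sample) to another, so models developed outside business data processing systematically mis-estimate until `a` and `b` are refit to local data.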
This paper presents theoretical work that builds a suite of metrics for object-oriented design. In particular, these metrics are based upon measurement theory and are also informed by the insights of experienced object-oriented software developers. In evaluating these metrics against a standard set of criteria, they are found to both (a) perform relatively well, and (b) suggest some ways in which the object-oriented approach may differ in terms of desirable or necessary design features from more traditional approaches. In order for object-oriented software production to fulfill its promise in moving software development and maintenance from the current 'craft' environment into something more closely resembling conventional engineering, it will require metrics of the process to aid the software management, project planning and project evaluation functions. While software metrics are a generally desirable feature in any software environment, they are of special importance in the object-oriented approach, since it represents a non-trivial technological change for the organization. The metrics presented in this paper are the first steps in a project aimed at measuring and evaluating the use of object-oriented design principles in organizations.