Context. There is a widespread belief in both SE and other branches of science that experience helps professionals to improve their performance. However, cases have been reported where experience not only has no positive influence but sometimes even degrades professionals' performance. Aim. Determine whether years of experience influence programmer performance. Method. We have analysed 10 quasi-experiments executed both in academia with graduate and postgraduate students and in industry with professionals. The experimental task was to apply ITLD to two experimental problems and then measure external code quality and programmer productivity. Results. Programming experience gained in industry does not appear to have any effect whatsoever on quality and productivity. Overall programming experience gained in academia does tend to have a positive influence on programmer performance. These two findings may be related to the fact that, as opposed to deliberate practice, routine practice does not appear to lead to improved performance. Experience in the use of productivity tools, such as testing frameworks and IDEs, also has positive effects. Conclusion. Years of experience are a poor predictor of programmer performance. Academic background and specialized knowledge of task-related aspects appear to be rather good predictors.
Context. Requirements elicitation is a highly communicative activity in which human interactions play a critical role. A number of analyst characteristics or skills may influence elicitation process effectiveness. Aim. Study the influence of analyst problem domain knowledge on elicitation effectiveness. Method. We executed a controlled experiment with post-graduate students. The experimental task was to elicit requirements using an open interview and consolidate the elicited information immediately afterwards. We used four different problem domains about which students had different levels of knowledge. Two of the domains were used in the experiment, whereas the other two were used in an internal replication of the experiment; that is, we repeated the experiment with the same subjects but with different domains. Results. Analyst problem domain knowledge has a small but statistically significant effect on the effectiveness of the requirements elicitation activity. The interviewee has a large and statistically significant positive influence, as do general training in requirements activities and interview experience. Conclusion. During early contacts with the customer, a key factor is the interviewee; however, training in tasks related to requirements elicitation and knowledge of the problem domain help requirements analysts to be more effective.

1 INTRODUCTION

Requirements elicitation, that is, seeking, capturing and consolidating requirements, is a core activity of any requirements engineering process [1] and has a direct influence on software quality [2]. Requirements elicitation depends on intensive communication between users and analysts in order to gather the right information [3]. Human interactions play an important role in this context. On the one hand, customers should be able to interact and communicate their needs to analysts.
On the other hand, analysts should be able to draw out and grasp the necessary domain information from customers. The effectiveness of requirements engineering activities is believed to partially depend on the participating individuals [4]. It has been observed that interview effectiveness can vary significantly depending on interviewer skills, probably because proficiency affects the course of the questioning [5]. As a result, elicitation strongly depends on the individual doing the interviewing [6]. Similar effects have been identified in brainstorming [4] and in the use of other elicitation techniques [7]. Several personal attributes may have a bearing on the effectiveness of any requirements-related task: experience [8]
Context. Nowadays there is a great deal of uncertainty surrounding the effects of experience on Requirements Engineering (RE). There is a widespread idea that experience improves analyst performance. However, there are empirical studies that demonstrate the exact opposite. Aim. Determine whether experience influences requirements analyst performance. Method. Quasi-experiments run with students and professionals. The experimental task was to elicit requirements using the open interview technique, immediately followed by the consolidation of the elicited information, in domains with which the analysts were and were not familiar. Results. In unfamiliar domains, interview, requirements, development, and professional experience do not influence analyst effectiveness. In familiar domains, effectiveness varies depending on the type of experience. Interview experience has a strong positive effect, whereas professional experience has a moderate negative effect. Requirements experience appears to have a moderately positive effect; however, the statistical power of the analysis is insufficient to confirm this point. Development experience has no effect either way. Conclusion. Experience affects analyst effectiveness differently depending on the problem domain type (familiar, unfamiliar). Generally, experience does not account for all the observed variability, which means there are other influential factors.