It has been 50 years since the term "software engineering" was coined at a NATO conference in 1968. The field should be relatively mature by now, with most established universities covering core software engineering topics in their Computer Science programs and others offering specialized degrees. However, many practitioners still lament a lack of skills in new software engineering hires. With the industry's growing demand for software engineers, this apparent gap becomes more and more pronounced. One corporate strategy to address it is for the industry to develop supplementary training programs before the hiring process, which could also help companies screen viable candidates. In this paper, we report on our experiences and lessons learned in conducting a summer school program aimed at screening new graduates, introducing them to core skills relevant to the organization and industry, and assessing their attitudes toward mastering those skills before the hiring process begins. Our experience suggests that such initiatives can be mutually beneficial for new hires and companies alike. We support this insight with pre- and post-training data collected from the participants during the first edition of the summer school and a follow-up questionnaire conducted a year later with the participants, 50% of whom were hired by the company shortly after the summer school. CCS CONCEPTS • Social and professional topics → Computing Education. KEYWORDS: software engineering education, software engineering training, software engineering summer school, hiring practices for software professionals
Purpose—In this paper, the methodology for defining a tailored Technology Management Framework (TMF) for a large-scale defense enterprise (HAVELSAN) is explained briefly, and a selected subset of the TMF components in the company is discussed in more detail. How technologies are managed in an enterprise depends primarily on the size of the company and its organizational structure. Methodology—The main aim of this paper is the identification and analysis of the factors influencing the manner in which technologies are managed in large enterprises, with a particular focus on the defense industry. Findings—The concepts investigated and the methodologies developed in this study are primarily based on the case of a technology management framework within a defense technology enterprise. Conclusion—The methods for defining a newly introduced Technology Management Framework (TMF) will help to enhance knowledge about the development of technology management methods and technology management tools in relation to each other.
Özet (translated from Turkish)—Information systems today generate far larger volumes of data than in the past, and storing and analyzing this data poses significant resource problems. The systems needed to store, process, and analyze Big Data must run faster and consume less energy than current systems; otherwise, very high costs and long data analysis times result. In this study, a cluster was built from single-board mini personal computers, container-based virtualization was deployed on it, and big data algorithms were tested. Within this scope, the execution and effectiveness of the MapReduce operations that form the basis of such big data systems were investigated on specially designed ARM processor clusters. Single-board mini computers with ARM processors are inexpensive, consume little energy, and have low carbon emissions. Their suitability for clustering, cloud computing, multiprocessing, parallel processing, and big data applications was also observed. Using container-based virtualization on single-board computer hardware is a previously untested approach, and using process isolation for the worker nodes of a MapReduce application is likewise a new practice. Anahtar Kelimeler (Keywords)—single-board computer cluster, virtualization, big data, MapReduce, Hadoop. Study on MapReduce Operations Creating Cloud with Single Board Computer. Abstract—Information systems nowadays hold much larger data than in the past, and the storage and analysis of this data suffers from a serious lack of resources. Storing, processing, and analyzing big data requires systems that work faster and consume less energy than current systems; otherwise, much greater costs and data analysis times will be faced. In this study, a cluster of single board computers was created, and process-isolation (operating-system-level) virtualization was successfully run on it to experiment with big data algorithms.
In this context, the MapReduce operations that form the basis of big data systems were executed on specially designed ARM-architecture mini-supercomputer clusters. Single board computers with ARM processors are cost-effective and have low energy consumption and low carbon emissions. Their suitability for clustering, cloud computing, multiprocessing, parallel processing, and big data applications was also observed. Container virtualization on single board computers is a previously untested approach, and using process isolation for MapReduce worker nodes is another new practice.
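The MapReduce model the abstract refers to can be illustrated with a minimal sketch. This is a single-machine simulation of the map, shuffle/sort, and reduce phases for word counting, not the authors' Hadoop-on-ARM setup; all function names here are illustrative, not from the paper.

```python
# Minimal single-machine sketch of the MapReduce model (word count).
# Illustrative only: the paper runs real MapReduce jobs on ARM clusters.
from itertools import groupby
from operator import itemgetter

def map_phase(document):
    """Map: emit a (word, 1) pair for every word in the input document."""
    for word in document.split():
        yield (word.lower(), 1)

def reduce_phase(pairs):
    """Shuffle/sort pairs by key, then reduce: sum counts per distinct word."""
    counts = {}
    for word, group in groupby(sorted(pairs, key=itemgetter(0)), key=itemgetter(0)):
        counts[word] = sum(count for _, count in group)
    return counts

documents = ["big data needs big systems", "data systems"]
pairs = [pair for doc in documents for pair in map_phase(doc)]
word_counts = reduce_phase(pairs)  # e.g. word_counts["big"] == 2
```

In a real Hadoop deployment, the map and reduce functions run as tasks distributed across worker nodes; the cited study's contribution is hosting those worker nodes in isolated containers on ARM single board computers.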