Web services solutions are increasingly being adopted in enterprise systems. However, ensuring the quality of service of Web services applications remains a costly and complicated performance engineering task. New challenges include limited control over the consumers of a service, unforeseeable operational scenarios, and vastly different XML payloads, which make existing manual performance analysis and benchmarking methods difficult to apply effectively. This paper describes an approach for generating customized benchmark suites for Web services applications from a software architecture description, following a Model Driven Architecture (MDA) approach. We provide a performance-tailored version of the UML 2.0 Testing Profile so that architects can model a flexible and reusable load testing architecture, including test data, in a standards-compatible way. We extended our MDABench [27] tool with a Web service performance testing "cartridge" associated with the tailored testing profile. The new cartridge generates a load testing suite and an automatic performance measurement infrastructure. Best practices in Web service testing are embodied in the cartridge and inherited by the generated code. This greatly reduces the effort needed for Web service performance benchmarking while remaining fully MDA-compatible. We illustrate the approach using a case study on the Apache Axis platform.
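The abstract does not show the generated code itself, but to make the idea concrete, the following is a minimal hand-written sketch of the kind of multi-threaded load-test driver such a cartridge might emit for an Axis 1.x service. The endpoint URL, namespace, `echoString` operation, and thread/call counts are hypothetical placeholders, not taken from the paper; MDABench's actual generated suite and measurement infrastructure are considerably richer than this.

```java
import java.net.URL;
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import javax.xml.namespace.QName;
import org.apache.axis.client.Call;
import org.apache.axis.client.Service;

// Hypothetical sketch of a generated load-test driver; not actual MDABench output.
public class EchoLoadTest {
    // Placeholder endpoint and operation -- assumptions for illustration only.
    private static final String ENDPOINT =
            "http://localhost:8080/axis/services/EchoService";
    private static final QName OPERATION =
            new QName("http://example.org/echo", "echoString");
    private static final int THREADS = 10;          // concurrent virtual clients
    private static final int CALLS_PER_THREAD = 100;

    public static void main(String[] args) throws Exception {
        final List<Long> latencies =
                Collections.synchronizedList(new ArrayList<Long>());
        Thread[] workers = new Thread[THREADS];
        long start = System.currentTimeMillis();
        for (int i = 0; i < THREADS; i++) {
            workers[i] = new Thread(new Runnable() {
                public void run() {
                    try {
                        // Axis 1.x dynamic invocation: one Call object per worker.
                        Call call = (Call) new Service().createCall();
                        call.setTargetEndpointAddress(new URL(ENDPOINT));
                        call.setOperationName(OPERATION);
                        for (int j = 0; j < CALLS_PER_THREAD; j++) {
                            long t0 = System.currentTimeMillis();
                            call.invoke(new Object[] { "payload-" + j });
                            latencies.add(System.currentTimeMillis() - t0);
                        }
                    } catch (Exception e) {
                        e.printStackTrace();
                    }
                }
            });
            workers[i].start();
        }
        for (Thread w : workers) {
            w.join();
        }
        long elapsed = System.currentTimeMillis() - start;
        long sum = 0;
        for (long l : latencies) {
            sum += l;
        }
        // Report the simple aggregate metrics a measurement harness might collect.
        System.out.printf("%d calls in %d ms, avg latency %.1f ms, throughput %.1f req/s%n",
                latencies.size(), elapsed, (double) sum / latencies.size(),
                latencies.size() * 1000.0 / elapsed);
    }
}
```

In the approach described above, a driver of this shape would be produced by the cartridge from the tailored UML 2.0 Testing Profile model rather than written by hand, with the workload parameters and test data coming from the model.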
Performance benchmarks are domain-specific applications specialized to a particular set of technologies and platforms. Developing a benchmark application requires mapping performance-specific domain concepts to an implementation and producing complex technology- and platform-specific code. Domain-Specific Modeling (DSM) promises to bridge the gap between application domains and implementations by allowing designers to specify solutions in domain-specific abstractions and semantics through Domain-Specific Languages (DSLs), so that a final implementation can be generated automatically from high-level models. The modeling and task-automation benefits of this approach usually justify its upfront cost. This paper employs a DSM-based approach to create a new DSL, DSLBench, for benchmark generation. DSLBench and its associated code generation facilities allow a complete, deployable benchmark application for performance testing to be designed and generated from a high-level model. DSLBench is implemented using the Microsoft Domain Specific Language toolkit and is integrated into Visual Studio 2005 Team Suite as a plug-in, providing additional modeling capabilities for performance testing. We illustrate the approach using a case study based on .NET and C#.