Application profiling tools are the instruments used to measure software performance at the function and application levels. Without careful configuration of the tools and the environment, invalid results are readily obtained. The errors may not become obvious if a large, complex application is profiled before simpler validations are attempted. A set of four simple, synthetic reference applications was used to validate configurations for profiling under x86_64 Linux. Results from one validated configuration and examples of observed invalid results are presented. While validation results for specific versions of software quickly lose value, this exercise demonstrates how future configurations can be validated and shows the kinds of errors that may recur.

* Specific computer hardware and software products are identified in this report to support reproducibility of results. Such identification does not imply recommendation or endorsement by the National Institute of Standards and Technology, nor does it imply that the products identified are necessarily the best available for the purpose.
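The four reference applications themselves are not reproduced here. As an illustration only, the following hypothetical sketch shows the general shape of such a synthetic reference application: two functions burn CPU time in a known 3:1 ratio, so a profiler's self-time attribution can be checked against ground truth. The function names, iteration counts, and use of the GCC/Clang noinline attribute are assumptions for the sketch, not details from the report.

```cpp
// Hypothetical synthetic reference application for profiler validation
// (an illustration, not one of the four applications from the report).
// work_a and work_b burn CPU time in a known 3:1 ratio, so a correctly
// configured sampling profiler should attribute roughly 75 % and 25 %
// of self-time to them.
#include <cstdint>
#include <cstdio>

// noinline (GCC/Clang) keeps each function visible as a distinct
// profile entry; volatile defeats loop elimination.
__attribute__((noinline)) static uint64_t work_a(uint64_t n)
{
    volatile uint64_t acc = 0;
    for (uint64_t i = 0; i < 3 * n; ++i) acc += i;  // ~75 % of the work
    return acc;
}

__attribute__((noinline)) static uint64_t work_b(uint64_t n)
{
    volatile uint64_t acc = 0;
    for (uint64_t i = 0; i < n; ++i) acc += i;      // ~25 % of the work
    return acc;
}

int main()
{
    const uint64_t n = 500000000;  // scale to yield enough profiler samples
    uint64_t sink = work_a(n) + work_b(n);
    std::printf("%llu\n", (unsigned long long)sink);  // keep results live
    return 0;
}
```

Running a program like this under, for example, perf record and comparing the reported self-time percentages against the designed 3:1 ratio is one way to detect a misconfigured tool or environment.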
Programmers routinely omit run-time safety checks from applications because they assume that such checks would degrade performance. The simplest example is the use of arrays or array-like data structures that do not enforce the constraint that indices must be within bounds. This report documents an attempt to measure the performance penalty incurred by two different implementations of bounds checking in C and C++ using a simple benchmark and a desktop PC with a modern superscalar CPU. The benchmark consisted of a loop that wrote to array elements in sequential order. With this configuration, relative to the best performance observed for any access method in C or C++, a mean degradation of only (0.881 ± 0.009) % was measured for a standard bounds-checking access method in C++. This case study showed the need for further work to develop and refine measurement methods and to perform more comparisons of this type. Comparisons across different use cases, configurations, programming languages, and environments are needed to determine under what circumstances (if any) the performance advantage of unchecked access is sufficient to outweigh its negative consequences for security and software quality.
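The report's exact benchmark harness and access methods are not shown here. As a rough sketch of the kind of comparison described, the following assumes std::vector, contrasting unchecked operator[] against at(), one standard bounds-checking access method in C++ (at() throws std::out_of_range on an invalid index). The element count, timing approach, and output format are illustrative choices, not the report's.

```cpp
// Minimal sketch of a sequential-write benchmark comparing unchecked and
// bounds-checked access.  The report's actual harness may differ.
#include <chrono>
#include <cstddef>
#include <cstdio>
#include <vector>

// Time a callable and return elapsed wall-clock milliseconds.
template <typename F>
static double time_ms(F &&body)
{
    auto t0 = std::chrono::steady_clock::now();
    body();
    auto t1 = std::chrono::steady_clock::now();
    return std::chrono::duration<double, std::milli>(t1 - t0).count();
}

int main()
{
    const std::size_t n = 100000000;
    std::vector<int> v(n);

    // Unchecked: operator[] performs no bounds check.
    double unchecked = time_ms([&] {
        for (std::size_t i = 0; i < n; ++i) v[i] = (int)i;
    });

    // Checked: at() verifies the index and throws on failure.
    double checked = time_ms([&] {
        for (std::size_t i = 0; i < n; ++i) v.at(i) = (int)i;
    });

    std::printf("operator[]: %.1f ms\nat():       %.1f ms\n",
                unchecked, checked);
    return 0;
}
```

A real measurement of a sub-1 % effect would also require repeated trials, a reported uncertainty, and attention to what the optimizer does to each loop (for example, an auto-vectorized unchecked loop compared against a checked one).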
Application profiling tools are the instruments used to measure software performance at the function and application levels. The most powerful measurement method available in application profiling tools today is sampling-based profiling, in which a potentially unmodified application is interrupted on some event to collect data about what it was doing when the interrupt occurred. It is well known that sampling introduces statistical uncertainty that must be taken into account when interpreting results; however, the factors affecting the variability have not been well studied. In attempting to validate two previously published analytical estimates, we obtained negative results. Furthermore, we found that the variability is strongly influenced by at least one factor, self-time fragmentation, that cannot be determined from the data yielded by sampling alone. We investigate this and other factors and conclude with recommendations for obtaining valid estimates of uncertainty under the conditions that exist in practice.
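Self-time fragmentation can be pictured with a hypothetical construction (not the paper's experimental setup): two variants of a function accumulate the same total self-time, but one does so in a single contiguous burst while the other scatters it into many short fragments interleaved with calls to a callee. A sampling profiler reports the same totals for both, which is why fragmentation cannot be recovered from the sampled data alone.

```cpp
// Hypothetical illustration of self-time fragmentation (not the paper's
// test programs).  Both variants give the caller the same total self-time
// and the same total callee time; only the interleaving differs, and that
// interleaving is invisible in the sampled totals.
#include <cstdint>
#include <cstdio>

__attribute__((noinline)) static uint64_t g(uint64_t n)  // the callee
{
    volatile uint64_t acc = 0;
    for (uint64_t i = 0; i < n; ++i) acc += i;
    return acc;
}

// Contiguous: all self-time in one block, then one call to g.
__attribute__((noinline)) static uint64_t f_contiguous(uint64_t n)
{
    volatile uint64_t acc = 0;
    for (uint64_t i = 0; i < 1000 * n; ++i) acc += i;  // one burst
    return acc + g(1000 * n);
}

// Fragmented: the same self-time split into 1000 short fragments,
// each interleaved with a call to g.
__attribute__((noinline)) static uint64_t f_fragmented(uint64_t n)
{
    volatile uint64_t acc = 0;
    uint64_t sink = 0;
    for (int k = 0; k < 1000; ++k) {
        for (uint64_t i = 0; i < n; ++i) acc += i;     // short fragment
        sink += g(n);
    }
    return acc + sink;
}

int main()
{
    const uint64_t n = 100000;
    std::printf("%llu\n", (unsigned long long)
                (f_contiguous(n) + f_fragmented(n)));
    return 0;
}
```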