We have developed a procedure for fitting experimental and simulated X-ray reflectivity and diffraction data to automate and quantify the characterization of thin-film structures. The optimization method employed is a type of genetic algorithm called 'Differential Evolution'. The method converges rapidly to the global minimum of an error function in parameter space even when many local minima are present in addition to the global minimum. We show how to estimate the pointwise errors of the optimized parameters, and how to determine whether the model adequately represents the structure. Given suitable data, the procedure can fit some tens of adjustable parameters.
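The abstract above names Differential Evolution as the optimizer. A minimal sketch of the standard DE/rand/1/bin scheme is shown below; this is an illustration of the general algorithm, not the authors' fitting code, and the Rastrigin test function (many local minima, global minimum 0 at the origin) merely stands in for a reflectivity error function.

```python
import numpy as np

rng = np.random.default_rng(0)

def rastrigin(x):
    # many local minima; global minimum 0 at x = 0
    return 10 * len(x) + sum(xi**2 - 10 * np.cos(2 * np.pi * xi) for xi in x)

def differential_evolution(cost, bounds, pop_size=20, F=0.8, CR=0.9, gens=300):
    """Minimal DE/rand/1/bin: mutate, binomial crossover, greedy selection."""
    lo, hi = np.array(bounds, dtype=float).T
    dim = len(bounds)
    pop = lo + rng.random((pop_size, dim)) * (hi - lo)
    costs = np.array([cost(x) for x in pop])
    for _ in range(gens):
        for i in range(pop_size):
            # pick three distinct population members other than i
            others = [j for j in range(pop_size) if j != i]
            a, b, c = pop[rng.choice(others, 3, replace=False)]
            mutant = np.clip(a + F * (b - c), lo, hi)
            # binomial crossover; force at least one mutant component
            mask = rng.random(dim) < CR
            mask[rng.integers(dim)] = True
            trial = np.where(mask, mutant, pop[i])
            t_cost = cost(trial)
            if t_cost <= costs[i]:  # greedy selection
                pop[i], costs[i] = trial, t_cost
    best = int(np.argmin(costs))
    return pop[best], costs[best]

x_best, c_best = differential_evolution(rastrigin, [(-5.12, 5.12)] * 2)
```

Despite the many local minima, the population-based search reliably reaches the global basin, which is the property the abstract relies on.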
A new metric, Word Maturity, estimates the development by individual students of knowledge of every word in a large corpus. The metric is constructed by Latent Semantic Analysis modeling of word knowledge as a function of the reading that a simulated learner has done, and is calibrated by its developing closeness in information content to that of a simulated literate adult. Individual human learner knowledge is aligned with the simulation by adaptive testing. Evidence of accuracy, example applications to vocabulary assessment, teaching, and reading research, properties of the metric, and a conjecture about its possible wider importance are described. This article introduces a new metric for the development of reading-comprehension-oriented word knowledge. In brief, a computational simulation based on Latent Semantic Analysis (LSA) first creates learning trajectories for each unique orthographic word form in a text corpus that is representative in size and content of the lifetime reading of a literate adult. A metric-based adaptive test is used to align individual reader knowledge with the simulation. The technology can then estimate the degree of knowledge of every individual word for individual readers relative to that of an average literate adult. We begin with a brief discussion of the need for and value of such a metric, comparing it with the current state of the art in vocabulary assessment. Next come sections on LSA and how it is used to create the new metric, including a review of evidence that LSA does the things that the metric requires. Correspondence should be sent to Thomas K. Landauer, Pearson Knowledge Technologies, 4940 Pearl East Circle, Suite 200, Boulder, CO 80301.
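The core LSA machinery the abstract refers to is a truncated singular value decomposition of a term-document matrix, which places words in a low-dimensional semantic space. The sketch below illustrates that mechanism only; the toy corpus and the choice of k are illustrative, whereas the article's simulation uses a lifetime-sized reading corpus.

```python
import numpy as np

# toy corpus; the article's simulation uses a corpus representative of
# the lifetime reading of a literate adult
docs = [
    "the cat sat on the mat",
    "the dog sat on the log",
    "cats and dogs are animals",
    "animals like the cat eat food",
]
vocab = sorted({w for d in docs for w in d.split()})
idx = {w: i for i, w in enumerate(vocab)}

# term-document count matrix
X = np.zeros((len(vocab), len(docs)))
for j, d in enumerate(docs):
    for w in d.split():
        X[idx[w], j] += 1

# LSA: truncated SVD keeps the k strongest latent dimensions
U, s, Vt = np.linalg.svd(X, full_matrices=False)
k = 2  # illustrative; real LSA models use a few hundred dimensions
word_vecs = U[:, :k] * s[:k]  # word representations in latent space

def similarity(w1, w2):
    """Cosine similarity between two words in the latent space."""
    a, b = word_vecs[idx[w1]], word_vecs[idx[w2]]
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
```

Running the decomposition on progressively larger slices of the corpus yields the per-word learning trajectories the metric is built from.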
The environment considered in this research is a massively multiplayer online gaming (MMOG) environment. Each user controls an avatar (an image that represents and is manipulated by the user) in a virtual world and interacts with other users. An important aspect of an MMOG is maintaining a fair environment among users (i.e., not giving an unfair advantage to users with faster connections or more powerful computers). The experience (positive or negative) a user has with the MMOG environment depends on how quickly the game world responds to the user's actions. This study focuses on scaling the system based on demand while maintaining an environment that guarantees fairness. Consider an environment where a main server (MS) controls the state of the virtual world. If performance falls below acceptable standards, the MS can off-load calculations to secondary servers (SSs). An SS is a user's computer that is converted into a server. Four heuristics are proposed for determining the number of SSs, which users are converted to SSs, and how users are assigned to the SSs and the MS. The goal of the heuristics is to provide a "fair" environment for all users and to be "robust" against the uncertainty in the number of new players that may join a given system configuration. The heuristics are evaluated and compared by simulation.
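To make the SS scheme concrete, the sketch below shows one hypothetical heuristic of the kind described: convert the most capable users into SSs, then assign each remaining player to the least-loaded server relative to its capacity. The capacities, the MS capacity, and the least-normalized-load rule are all illustrative assumptions, not the paper's four heuristics.

```python
def assign_players(capacities, num_ss, players, ms_capacity=2.0):
    """Hypothetical greedy SS-selection and player-assignment sketch.

    capacities: compute power of each user's machine (arbitrary units).
    Returns (ss_ids, assignment); server slot 0 is the main server (MS),
    slots 1..num_ss are the secondary servers (SSs).
    """
    # convert the num_ss most capable users into secondary servers
    ranked = sorted(range(len(capacities)), key=lambda i: -capacities[i])
    ss_ids = ranked[:num_ss]
    servers = [("MS", ms_capacity)] + [("SS", capacities[i]) for i in ss_ids]
    load = [0.0] * len(servers)
    assignment = {}
    for p in players:
        if p in ss_ids:
            continue  # a user converted to an SS hosts itself
        # least normalized load keeps response times even across servers,
        # which is one way to pursue the fairness goal described above
        s = min(range(len(servers)), key=lambda j: load[j] / servers[j][1])
        load[s] += 1.0
        assignment[p] = s
    return ss_ids, assignment

ss_ids, assignment = assign_players(
    capacities=[1.0, 3.0, 2.0, 0.5, 1.5], num_ss=2, players=range(5))
```

With these illustrative numbers, users 1 and 2 (the two most capable machines) become SSs, and the remaining players spread across the MS and both SSs.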
The creation of a virtual world environment (VWE) has significant costs, such as maintenance of server rooms, server administration, and customer service. The initial development cost is not the only factor to consider; factors such as the popularity of a VWE and unexpected technical problems during and after launch can affect its final cost and success. The capacity of servers in a client/server VWE is hard to scale and cannot adjust quickly to peaks in demand while maintaining the required response time. To handle these peaks in demand, we propose employing users' computers as secondary servers, which allows the performance of the VWE to support an increase in users. In this study, we develop and evaluate five static heuristics that implement a secondary-server scheme to reduce the time taken to compute the state of the VWE. The heuristics determine the number of heterogeneous secondary servers, which players are converted into secondary servers, and how players are assigned to the secondary servers. A lower bound on performance is derived to evaluate the results of the heuristics.
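The abstract mentions a derived performance lower bound. As one illustration of the general idea (not the paper's actual derivation), a standard makespan-style bound says the state computation cannot finish faster than the total work divided by the aggregate capacity of all servers, nor faster than the largest single task on the fastest server. The work units and capacities below are illustrative.

```python
def makespan_lower_bound(work_per_player, capacities):
    """Hypothetical sketch of a simple scheduling lower bound.

    work_per_player: computation each player contributes to the VWE state.
    capacities: processing rates of the MS and candidate SSs.
    """
    total_work = sum(work_per_player)
    total_capacity = sum(capacities)
    # no schedule beats perfectly balanced parallel work...
    balanced = total_work / total_capacity
    # ...nor the largest indivisible task run on the fastest server
    longest = max(work_per_player) / max(capacities)
    return max(balanced, longest)
```

A heuristic whose measured computation time approaches such a bound is close to the best any assignment could achieve, which is how a bound of this kind is typically used for evaluation.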