Analyzing stakeholder needs and transforming them into requirements is an important early step in the systems engineering lifecycle [1]. In regulated industries, important technical requirements can be found in state and federal laws and regulations. Casino gaming is one such industry. This paper analyzes South Dakota and Nevada slot machine regulations and applies automated natural language processing to extract and analyze technical requirements derived from them. First, each part of speech (POS) in the regulations is identified. From this, the important adjective and noun keywords and keyword combinations are extracted using the Rapid Automatic Keyword Extraction (RAKE) algorithm [2]. Next, slot machine requirements are extracted from the gaming laws, many of which lack the word “shall”. To do this, a 12-rule pattern-matching algorithm that applies phrase substitutions and identifies leader-subordinate paragraph headings is applied to the slot machine gaming rules. This approach successfully extracts nearly all of the slot machine technical and operations requirements, though it fails to separate compound requirements, which account for approximately 3% of the total. Then, after stemming and stopping the regulations, a Naïve Bayes model for identifying functional requirements is constructed from the South Dakota regulations and applied to the Nevada regulations. This model predicts the Nevada functional product requirements from amongst the full set of extracted requirements with 87.5% accuracy. Finally, using a modified version of the Dice similarity metric in which the word counts are weighted by term frequency-inverse document frequency (TF-IDF) scores, the South Dakota requirement most similar to each Nevada requirement is determined. The paired South Dakota and Nevada requirements are then assessed for equivalency and relatedness using systems engineering expertise. Using the geometric mean of sensitivity and specificity as a scoring metric, the pairing algorithm is, at its optimum, 96.1% accurate in identifying equivalent requirements between the two sets of regulations and 82.0% accurate in identifying related requirements.
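Below is a minimal Python sketch of the requirement-pairing step, showing one plausible way a TF-IDF-weighted Dice similarity could be computed; the function names, tokenization, and the exact weighting scheme are assumptions for illustration, not the paper's implementation.

```python
from collections import Counter
import math

def tfidf_weights(tokenized_docs):
    """Per-document TF-IDF weights for a list of tokenized requirements."""
    df = Counter()
    for doc in tokenized_docs:
        df.update(set(doc))
    n = len(tokenized_docs)
    weights = []
    for doc in tokenized_docs:
        tf = Counter(doc)
        weights.append({t: tf[t] * math.log(n / df[t]) for t in tf})
    return weights

def weighted_dice(wa, wb):
    """Dice similarity with raw word counts replaced by TF-IDF weights."""
    shared = set(wa) & set(wb)
    numerator = 2.0 * sum(min(wa[t], wb[t]) for t in shared)
    denominator = sum(wa.values()) + sum(wb.values())
    return numerator / denominator if denominator else 0.0

# Toy example: pair one "Nevada" requirement with its best "South Dakota" match
sd = [["slot", "machine", "shall", "display", "payout"],
      ["machine", "shall", "record", "credit", "meter"]]
nv = [["device", "must", "display", "payout", "percentage"]]
w_all = tfidf_weights(sd + nv)
w_sd, w_nv = w_all[:len(sd)], w_all[len(sd):]
best_match = max(range(len(sd)), key=lambda i: weighted_dice(w_sd[i], w_nv[0]))
```

Under this scheme, the South Dakota requirement with the highest weighted Dice score against a given Nevada requirement would be taken as its closest match.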
OBJECTIVES: This study sought to provide population-based estimates of drug-using arrestees in the 185 largest US cities. METHODS: A prevalence model for drug-using arrestees was developed by relating selected social indicators (from 1990 census data) and drug use rates (from Drug Use Forecasting program data) via logistic regression analysis. RESULTS: It was estimated that in 1990, across the 185 cities, about 925,000 arrestees used cocaine, 317,000 used opiates, 213,000 used amphetamines, 389,000 were drug injectors, and 1,296,000 used an illicit drug. CONCLUSIONS: This approach represents a cost-efficient method for prevalence estimation based on empirically demonstrable relationships between social indicators and drug use rates.
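As a rough illustration of this estimation strategy, the hedged Python sketch below fits a binomial (logistic) regression of city-level drug-use rates on placeholder social indicators and projects the fitted rates onto all cities; the data, number of indicators, and variable names are invented for illustration and are not the study's actual inputs.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)

# Placeholder data for the subset of cities with observed arrestee drug tests
X_obs = sm.add_constant(rng.random((24, 3)))   # census-style social indicators
tested = rng.integers(200, 600, size=24)       # arrestees tested per city
positive = rng.binomial(tested, 0.4)           # arrestees testing positive

# Logistic (binomial) regression of drug-use rates on the indicators
fit = sm.GLM(np.column_stack([positive, tested - positive]), X_obs,
             family=sm.families.Binomial()).fit()

# Project fitted rates onto all 185 cities and scale by arrest counts
X_all = sm.add_constant(rng.random((185, 3)))
arrests = rng.integers(5_000, 60_000, size=185)
estimated_users = (fit.predict(X_all) * arrests).sum()
```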
This paper describes an empirical analysis of optimized portfolios and safe return rates across multiple investment time horizons using Telser’s Safety-First method. The analysis uses thirty years of historical monthly data for 81 different Fidelity® mutual funds and a blended money market fund rate. The Fidelity® funds represent a wide variety of investment factors, strategies and asset types, including bonds, stocks, commodities and convertible securities. A large synthetic return dataset was generated from this data by a Monte Carlo random walk using cointegrated bootstrapping of investment returns and yields. Portfolio optimization was then performed on this synthetic dataset for safety factors varying from 60% to 99% and time horizons varying from one month to ten years. Results from portfolio analyses include the following: 1) there are no risk-free investments available to Fidelity® mutual fund investors, as even money market funds have risk due to yield fluctuations, 2) optimized portfolios are sensitive to both investment time horizons and safety factor confidence levels, 3) conservative, short-term investors are better off leaving their money in a money market fund than investing in securities, 4) optimized portfolios for longer term, more aggressive investors consist of a blend of both value and growth equities, and 5) the funds most often represented in optimized portfolios are those that have the best risk/reward ratios, although this rule is not universal. Two practical applications of this optimization approach are also presented.
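A minimal Python sketch of the Safety-First selection step is given below; the fund return distributions, disaster level, safety factor, and grid search are assumptions used only to show the mechanics of Telser's criterion (maximize expected return subject to a cap on the probability of falling below a minimum acceptable return), not the paper's actual optimization.

```python
import numpy as np

rng = np.random.default_rng(1)
# Placeholder horizon returns for three hypothetical funds (rows = scenarios)
sims = rng.normal(loc=[0.06, 0.08, 0.02],
                  scale=[0.15, 0.25, 0.01],
                  size=(10_000, 3))

disaster_level = 0.0   # minimum acceptable horizon return (assumption)
alpha = 0.05           # 95% safety factor: P(return < disaster) must be <= 5%

best_w, best_mean = None, -np.inf
# Coarse grid search over long-only weights summing to one
for w1 in np.linspace(0, 1, 21):
    for w2 in np.linspace(0, 1 - w1, 21):
        w = np.array([w1, w2, 1 - w1 - w2])
        port = sims @ w
        if np.mean(port < disaster_level) <= alpha and port.mean() > best_mean:
            best_w, best_mean = w, port.mean()

# best_w is the admissible portfolio with the highest expected return,
# or None if no portfolio satisfies the safety constraint
print(best_w, best_mean)
```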
This paper describes a methodology for estimating safe withdrawal rates during retirement that is based on a retiree’s age, risk tolerance and investment strategy, and then provides results obtained from using that methodology. The estimates are generated by a three-step process. In the first step, Monte Carlo simulations of future inflation rates, 10-year treasury rates, corporate bond rates (AAA and BAA), the S&P 500 index values and S&P 500 dividend yields are performed. In the second step, portfolio composition and withdrawal rate combinations are evaluated against each of the Monte Carlo simulations in order to calculate portfolio longevity likelihood tables, which are tables that show the likelihood that a portfolio will survive a certain number of years for a given withdrawal rate. In the third and final step, portfolio longevity tables are compared with standard mortality tables in order to estimate the likelihood that the portfolio outlasts the retiree. This three-step approach was then applied using two models fitted to over 100 years of monthly historical data: a Monte Carlo random walk with step sizes determined by bootstrapping, and an ARIMA/GARCH regression model. The end result was estimates of the likelihood of portfolio survival to mortality for over 500,000 retiree age/sex/portfolio/withdrawal rate combinations, each combination supported by at least 10,000 Monte Carlo economic simulation points per model. These results are analyzed to predict safe withdrawal rates and portfolio composition strategies appropriate for the late 2021 economic environment.
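The hedged Python sketch below illustrates the second step, computing a portfolio longevity (survival) curve for one portfolio composition and withdrawal rate from simulated returns; the return distribution and parameters are placeholders rather than the paper's fitted random walk or ARIMA/GARCH models.

```python
import numpy as np

rng = np.random.default_rng(2)
n_sims, years = 10_000, 40
# Placeholder inflation-adjusted portfolio returns (one row per simulation)
real_returns = rng.normal(0.05, 0.12, size=(n_sims, years))

withdrawal_rate = 0.04                   # annual withdrawal, in real terms
balances = np.ones(n_sims)               # normalized $1 starting portfolio
survival = np.zeros(years)               # fraction of paths still solvent each year

for t in range(years):
    balances = balances * (1 + real_returns[:, t]) - withdrawal_rate
    balances = np.where(balances > 0, balances, 0.0)
    survival[t] = np.mean(balances > 0)

# survival[t] approximates P(portfolio lasts more than t+1 years) for this
# composition/withdrawal combination; comparing such a curve against a
# mortality table yields the chance that the portfolio outlasts the retiree.
```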
This report investigates the distribution of abalone data collected off the coast of Tasmania, Australia, in 1994. Analysis of this data is used to determine whether the dataset includes unusual groups that may be indicative of the abalone population collapse that has occurred in Tasmania since the data were collected. As a byproduct, some of the abalone population distribution data that was reportedly collected in (Nash, 1994) but published neither in (Nash, 1994) nor in the UCI data repository (Nash, 1995) has been reconstituted and made available in the appendix.