Hotspot mapping is a popular analytical technique used to help identify where to target police and crime reduction resources. In essence, hotspot mapping is a basic form of crime prediction, relying on retrospective data to identify areas with high concentrations of crime and hence where policing and other crime reduction resources should be deployed. A number of different mapping techniques are used for identifying crime hotspots: point mapping, thematic mapping of geographic areas (e.g. Census areas), spatial ellipses, grid thematic mapping, and kernel density estimation (KDE). Several research studies have discussed the use of these methods for identifying hotspots of crime, usually on the basis of their ease of use and their ability to spatially interpret the location, size, shape, and orientation of clusters of crime incidents. Yet surprisingly, very little research has compared how accurately hotspot mapping techniques predict where crimes will occur in the future. This research uses crime data for a period before a fixed date (that has already passed) to generate hotspot maps and tests their accuracy for predicting where crimes occurred next. Hotspot mapping accuracy is compared in relation to the mapping technique used to identify concentrations of crime events (thematic mapping of Census Output Areas, spatial ellipses, grid thematic mapping, and KDE) and by crime type; four crime types are compared (burglary, street crime, theft from vehicles, and theft of vehicles). The results indicate that the predictive abilities of crime hotspot maps differ between techniques and by crime type. KDE was the technique that consistently outperformed the others, while hotspot maps of street crime were consistently better at predicting future street crime than the hotspot maps of the other crime types were at predicting their respective crimes.
The research offers a benchmark for comparative research on other techniques and other crime types, including comparisons with advanced spatial analysis techniques and prediction mapping methods. Understanding how well hotspot mapping predicts spatial patterns of crime, and how different mapping methods compare, will help to better inform their application in practice.
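As a rough illustration of how such predictive accuracy can be measured, the sketch below fits a KDE surface to simulated "historical" crime points, flags the top-density cells as the predicted hotspot, and scores how many simulated "future" crimes fall inside it. All data, the 10% threshold, and the hit-rate/area-share accuracy measure are illustrative assumptions, not the study's actual protocol.

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(42)

# Hypothetical "historical" crimes clustered around two hotspots
historical = np.vstack([
    rng.normal([2.0, 2.0], 0.3, size=(150, 2)),
    rng.normal([7.0, 7.0], 0.3, size=(150, 2)),
])
# Hypothetical "future" crimes drawn from the same process
future = np.vstack([
    rng.normal([2.0, 2.0], 0.3, size=(50, 2)),
    rng.normal([7.0, 7.0], 0.3, size=(50, 2)),
])

# Fit a KDE surface to the historical points (scipy expects shape (d, n))
kde = gaussian_kde(historical.T)

# Flag the top 10% of grid cells by density as the predicted hotspot
xs, ys = np.meshgrid(np.linspace(0, 10, 50), np.linspace(0, 10, 50))
density = kde(np.vstack([xs.ravel(), ys.ravel()]))
threshold = np.quantile(density, 0.90)

# Hit rate: share of future crimes falling inside the predicted hotspot
hit_rate = float(np.mean(kde(future.T) >= threshold))
hotspot_share = 0.10  # fraction of the study area flagged as hotspot
pai = hit_rate / hotspot_share  # a simple prediction accuracy index
print(hit_rate, pai)
```

Because both samples come from the same process, nearly all future points land in the predicted hotspot; comparing such scores across mapping techniques is the kind of benchmarking the abstract describes.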
A methodologically sound systematic review is characterized by transparency, replicability, and clear inclusion criteria. However, little attention has been paid to reporting the details of interrater reliability (IRR) when multiple coders are used to make decisions at various points in the screening and data extraction stages of a study. Prior research has noted the paucity of information on IRR, including the number of coders involved, at what stages and how IRR tests were conducted, and how disagreements were resolved. This article examines and reflects on the human factors that affect decision-making in systematic reviews by reporting on three IRR tests, conducted at three different points in the screening process, for two distinct reviews. Results of the two studies are discussed in the context of interrater and intrarater reliability in terms of the accuracy, precision, and reliability of the coding behavior of multiple coders. Findings indicated that coding behavior changes both between and within individuals over time, emphasizing the importance of conducting regular and systematic interrater and intrarater reliability tests, especially when multiple coders are involved, to ensure consistency and clarity at the screening and coding stages. Implications for good practice when screening and coding for systematic reviews are discussed.
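IRR between two screeners is often summarised with a chance-corrected agreement statistic such as Cohen's kappa. The sketch below is a minimal pure-Python implementation applied to hypothetical include/exclude screening decisions; the abstract does not specify which statistic the study itself used.

```python
from collections import Counter

def cohen_kappa(coder_a, coder_b):
    """Cohen's kappa: agreement between two coders, corrected for chance."""
    assert len(coder_a) == len(coder_b)
    n = len(coder_a)
    # Observed agreement: fraction of items where the coders agree
    observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    # Expected chance agreement from each coder's marginal label rates
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    expected = sum(freq_a[k] * freq_b.get(k, 0) for k in freq_a) / n ** 2
    return (observed - expected) / (1 - expected)

# Hypothetical include/exclude decisions for ten abstracts
a = ["inc", "inc", "exc", "exc", "inc", "exc", "exc", "inc", "exc", "exc"]
b = ["inc", "inc", "exc", "exc", "exc", "exc", "exc", "inc", "exc", "inc"]
kappa = cohen_kappa(a, b)
print(round(kappa, 3))
```

Here the coders agree on 8 of 10 items, but chance agreement is 0.52 given their label rates, so kappa is noticeably lower than raw agreement. Running such a test at several screening stages, as the article does, can reveal drift in coding behavior over time.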
Background
Many local authorities in England and Wales have reduced street lighting at night to save money and reduce carbon emissions. There is no evidence to date on whether these reductions impact on public health. We quantified the effect of four street lighting adaptation strategies (switch off, part-night lighting, dimming and white light) on casualties and crime in England and Wales.

Methods
Observational study based on analysis of geographically coded police data on road traffic collisions and crime in 62 local authorities. Conditional Poisson models were used to analyse longitudinal changes in the counts of night-time collisions occurring on affected roads during 2000–2013, and crime within census Middle Super Output Areas during 2010–2013. Effect estimates were adjusted for regional temporal trends in casualties and crime.

Results
There was no evidence that any street lighting adaptation strategy was associated with a change in collisions at night. There was significant statistical heterogeneity in the effects on crime estimated at police force level. Overall, there was no evidence for an association between the aggregate count of crime and switch off (RR 0.11; 95% CI 0.01 to 2.75) or part-night lighting (RR 0.96; 95% CI 0.86 to 1.06). There was weak evidence for a reduction in the aggregate count of crime with dimming (RR 0.84; 95% CI 0.70 to 1.02) and white light (RR 0.89; 95% CI 0.77 to 1.03).

Conclusions
This study found little evidence of harmful effects of switch off, part-night lighting, dimming, or changes to white light/LEDs on road collisions or crime in England and Wales.
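The rate ratios quoted above (e.g. RR 0.96; 95% CI 0.86 to 1.06) compare event rates between adapted and unadapted conditions. The sketch below shows how such a ratio and a Wald-type 95% confidence interval can be computed from raw counts and exposure times; the figures are hypothetical and this is not the study's conditional Poisson model, which additionally adjusts for regional trends.

```python
import math

def rate_ratio_ci(events_a, time_a, events_b, time_b):
    """Rate ratio (group a vs b) with a 95% Wald CI on the log scale."""
    rr = (events_a / time_a) / (events_b / time_b)
    # Standard error of log(RR) for Poisson counts
    se_log = math.sqrt(1 / events_a + 1 / events_b)
    lo = math.exp(math.log(rr) - 1.96 * se_log)
    hi = math.exp(math.log(rr) + 1.96 * se_log)
    return rr, lo, hi

# Hypothetical counts: 96 crimes over 100 area-years in adapted areas
# versus 100 crimes over 100 area-years in comparison areas
rr, lo, hi = rate_ratio_ci(96, 100, 100, 100)
print(round(rr, 2), round(lo, 2), round(hi, 2))
```

Since the interval spans 1.0, such a result would be read as no clear evidence of an effect, mirroring the interpretation of the part-night lighting estimate above.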
In the United Kingdom, data on individual police-recorded crimes have been made openly available to the public via the police.uk website since 2011. To protect the location privacy of victims, these data are obfuscated using geomasking techniques that reduce their spatial accuracy. This paper examines the spatial accuracy of the police.uk data to determine at what level(s) of spatial resolution, if any, they are suitable for analysis in the context of theory testing and falsification, evaluation research, or crime analysis. Police.uk data are compared to police recorded data for one large metropolitan police force, and spatial accuracy is quantified for four different levels of geography across five crime types. Hypotheses regarding systematic errors are tested using appropriate statistical approaches, including methods of maximum likelihood. Finally, a "best-fit" statistical model is presented to explain the error and to correct it. The implications of the findings for researchers using the police.uk data for spatial analysis are discussed.
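A simple way to quantify the error a geomask introduces is the straight-line displacement between each true recorded location and its published (snapped) counterpart. The sketch below uses hypothetical coordinates in metres; the paper's actual comparison uses recorded police data across several geographic levels and crime types.

```python
import math
import statistics

def displacement(true_pt, masked_pt):
    """Straight-line error introduced by snapping a crime to a mask point."""
    return math.dist(true_pt, masked_pt)

# Hypothetical true locations vs. police.uk-style snapped points (metres)
true_points   = [(100, 200), (310, 95), (560, 430), (720, 610)]
masked_points = [(150, 180), (290, 140), (600, 400), (700, 650)]

errors = [displacement(t, m) for t, m in zip(true_points, masked_points)]
mean_error = statistics.mean(errors)
print(round(mean_error, 1))
```

Summaries of such displacements (mean, median, share exceeding a unit boundary) indicate the coarsest geography at which the masked points can still be aggregated without systematic misallocation.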
The illegal treatment and trade of waste is an international problem that is widely assumed to be both evolving and growing. For emergent forms of criminality such as this, data are often in scarce supply, making them difficult to study and, subsequently, to understand. In this paper we introduce the methodological concept of script analysis to support a more objective assessment and understanding of illegal waste activity. This involves using crime scripts in two ways: to help identify data requirements, and as a tool for analysing illegal waste processes. We illustrate the utility of this methodology using waste electrical and electronic equipment (WEEE). In doing so, we argue that this approach elicits a specific, focused account of what illegal activity has occurred and nests it within the wider context of the waste management system. We anticipate that this methodology will provide academics and practitioners with a means of enhancing the investigation, detection and prevention of illegal waste activity.
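A crime script can be represented as an ordered sequence of stages, each with associated data requirements, which makes gaps in the available evidence explicit. The sketch below uses hypothetical stage names and data fields for illegal WEEE handling; they are illustrative assumptions, not the paper's actual script.

```python
from dataclasses import dataclass, field

@dataclass
class Stage:
    """One step in a crime script, with the records needed to analyse it."""
    name: str
    data_needed: list[str] = field(default_factory=list)

# Hypothetical script for an illegal WEEE export process
script = [
    Stage("collection", ["source premises", "carrier licence"]),
    Stage("consolidation", ["storage site", "waste transfer notes"]),
    Stage("export", ["port records", "shipping manifests"]),
    Stage("disposal", ["destination country", "treatment records"]),
]

# Identify stages whose required data are missing from available records
available = {"port records", "shipping manifests"}
gaps = [s.name for s in script if not set(s.data_needed) <= available]
print(gaps)
```

Walking the script stage by stage in this way serves both purposes named in the abstract: it enumerates the data an investigation still needs, and it situates each observed illegal act within the wider waste management process.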