2014
DOI: 10.1109/tdsc.2013.21
A Large-Scale Study of the Time Required to Compromise a Computer System

Cited by 31 publications (31 citation statements)
References 38 publications
“…the configuration values regarding attacks have been preset. Such quantitative data was collected from various sources, including surveys and studies such as (Holm 2014; Jonsson and Olovsson 1997), but also from public vulnerability databases such as the US National Vulnerability Database (NVD) and the China National Vulnerability Database of Information Security (CNNVD). Therefore, security expertise is not required from users.…”
Section: Reference Model Language and Cyber Security Analysis Tool
confidence: 99%
“…For instance, the effort required to discover a zero-day vulnerability was measured on a scale of work days [41]. For such data, an analysis using the Akaike Information Criterion [42] was conducted to find the distribution best suited to modeling the dataset, with the tested distributions selected based on choices and findings from a previous study involving time-to-compromise [43]. If no distribution was a good fit, linear interpolation was used to derive a custom CDF.…”
Section: Likelihood Of Successful Attack As a Function Of Time Spent
confidence: 99%
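The AIC-based selection described in that statement can be illustrated with a short sketch. The candidate distributions and the use of scipy below are illustrative assumptions, not the cited authors' actual code:

```python
# Sketch: pick the distribution that best fits time-to-compromise data by AIC.
import numpy as np
from scipy import stats

def best_fit_by_aic(samples, candidates=("expon", "gamma", "lognorm", "weibull_min")):
    """Fit each candidate by maximum likelihood; return (name, params, aic) with the lowest AIC."""
    best = None
    for name in candidates:
        dist = getattr(stats, name)
        params = dist.fit(samples)           # MLE estimates: (shape(s)..., loc, scale)
        loglik = np.sum(dist.logpdf(samples, *params))
        aic = 2 * len(params) - 2 * loglik   # Akaike Information Criterion
        if best is None or aic < best[2]:
            best = (name, params, aic)
    return best

# Synthetic example: "days until compromise" drawn from a gamma distribution.
data = stats.gamma.rvs(a=2.0, scale=5.0, size=500, random_state=0)
print(best_fit_by_aic(data))
```

A lower AIC rewards goodness of fit while penalizing each extra estimated parameter, which is why the log-likelihood is offset by twice the parameter count.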
“…Given this lack of knowledge, a reasonable choice might be to choose the statistical model that best fits the time-to-compromise of a system in general. For this purpose, we performed an analysis of which distribution is best fit for modeling the time from installation of a system to compromise [40] of that system (a dataset of all malware incidents across 260,000 computers over three years) [43]. We also performed such distribution fitting for time-to-compromise observations from an international cyber defense exercise involving more than 100 participants [45].…”
Section: Likelihood Of Successful Attack As a Function Of Time Spent
confidence: 99%
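When none of the parametric candidates fits well, the custom CDF mentioned above can be approximated by linearly interpolating the empirical CDF of the observations. A minimal sketch, assuming hypothetical observation values in days:

```python
# Sketch: custom CDF for time-to-compromise via linear interpolation of the empirical CDF.
import numpy as np

def interpolated_cdf(samples):
    """Return a function t -> P(time-to-compromise <= t), interpolated between observations."""
    xs = np.sort(np.asarray(samples, dtype=float))
    ps = np.arange(1, len(xs) + 1) / len(xs)     # empirical CDF value at each sorted sample
    return lambda t: np.interp(t, xs, ps, left=0.0, right=1.0)

# Hypothetical observations: days from system installation to compromise.
days_to_compromise = [1.5, 3.0, 3.2, 7.8, 12.0, 30.5, 44.0]
cdf = interpolated_cdf(days_to_compromise)
print(cdf(10.0))    # estimated probability of compromise within 10 days
```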
“…The focus of this work is on intrusion detection with generation of an attack dataset from real-time traffic and IDS logs, exported outside the victim network. Various approaches to intrusion detection have been discussed, such as mathematical models like the chi-square test and Hidden Markov Model based IDS [6]; frameworks like random forest based analysis [7]; architectures like NICE [8] and multilevel intrusion detection [9]; and methods like multivariate correlation analysis [10] and the Poisson distribution [11]. Commencing with the survey, we first take a tour of digital forensics and the pros and cons of its techniques.…”
Section: Related Work
confidence: 99%
“…For the low cost of an IDS, the parallel string matching scheme is useful to reduce the strain on memory in the context of parallel string matching engines [19]. The time to compromise a system follows a Poisson process, which depends heavily on the number of intrusions and on the system-level security on both the client and the server side [11].…”
Section: Related Work
confidence: 99%
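As a rough illustration of that Poisson-process view, the waiting time until the first compromise in a homogeneous Poisson process with rate λ is exponentially distributed with mean 1/λ. The rate value below is an assumption for illustration only:

```python
# Sketch: time to first compromise under a homogeneous Poisson process of intrusions.
import numpy as np

rng = np.random.default_rng(0)
lam = 0.2                                    # assumed rate: 0.2 successful intrusions per day

# Inter-arrival (and thus first-arrival) times of a Poisson process are exponential with mean 1/lam.
times = rng.exponential(scale=1.0 / lam, size=10_000)

print("mean days to compromise:", times.mean())              # approx. 1/lam = 5 days
print("P(compromise within 7 days):", (times <= 7).mean())   # approx. 1 - exp(-lam * 7)
```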