2016 2nd International Conference on Contemporary Computing and Informatics (IC3I)
DOI: 10.1109/ic3i.2016.7917965
Big data: A review of analytics methods & techniques

Cited by 13 publications (2 citation statements)
References 18 publications
“…Some techniques have been developed that can be used to analyze datasets and provide some insight into the data. Various analytical techniques are available, including data mining software, data analysis tools, visualization/dashboard tools, machine learning, deep learning, gradual approaches, cloud computing, IoT, data stream processing, and intelligent analysis [24]. Storing and processing large data is a problem, and some innovations have been made to overcome these technological and processing challenges. For example, grid computing is used to manage high-volume data, cloud computing is used to handle high-speed and high-volume data, and open-source cost-reduction and virtualization technology reduces the time to test and implement while enhancing processing speeds [28].…”
Section: Results
confidence: 99%
“…Big data is a term used to describe sets of data that come in several forms or structures, arrive at extremely high speeds, and cannot be processed successfully by traditional database management systems [1] [2]. Zhou et al. [3] argued that by the end of 2015 the overall data volume would surpass 7.9 zettabytes (ZB), reaching 35 ZB by 2020.…”
Section: Introduction
confidence: 99%