Analyzing 3 TB Field Measurement Data Set
Keywords: big data, data analysis, machine learning, artificial intelligence, deep learning

Abstract
This article describes the use of intelligent algorithms for analysing
field measurement data. The main focus is on describing the general
workflow and practical issues when the amount of data is "big"
and typical data analysis methods for small data cannot be used. When
the amount of data is tens of terabytes, it no longer fits into
computer memory. Data visualization is also challenging: visualization
tools can render only a small fraction of the data on the screen,
and visually inspecting the whole dataset is not meaningful. The
data is simply too big. Thus, new approaches to studying data are needed
where the data is processed automatically in calculation clusters
without manual human work. The basic idea of data mining is to gradually
reduce the amount of data with various techniques until the final
data contains only the information relevant to the research question,
in a form compact enough that inspecting it manually is a
rational use of time.
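
As an illustration of this step-by-step, out-of-core reduction, the sketch below reads a large measurement file in fixed-size chunks and keeps only per-chunk summary statistics, so memory use stays bounded regardless of file size. The file name, column name, and chunk size are illustrative assumptions, not details taken from the article.

    # Minimal sketch of out-of-core data reduction (assumed CSV layout,
    # illustrative names only).
    import pandas as pd

    CHUNK_ROWS = 1_000_000  # rows held in memory at a time

    summaries = []
    for chunk in pd.read_csv("measurements.csv", usecols=["signal"],
                             chunksize=CHUNK_ROWS):
        s = chunk["signal"]
        # Reduce each chunk to a few statistics; only these compact
        # summaries are retained for later analysis.
        summaries.append({"n": len(s), "mean": s.mean(),
                          "min": s.min(), "max": s.max()})

    reduced = pd.DataFrame(summaries)
    print(reduced.describe())

The same pattern scales out naturally: each node of a calculation cluster can reduce its own share of the files, and only the small summary tables are collected for inspection.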
Copyright (c) 2017 Jukka Aho, Tero Frondelius
This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.