General Overview Of Data Analytics
With the massive growth in technology, the use of data is also increasing day by day, driven by social networks and cloud computing. Large amounts of data were being produced with no systematic way to handle it, so data analytics was introduced. Data analytics applies fundamental principles to analyse big and complex sets of data. It has two prominent methods: quantitative and qualitative research.
Quantitative research: This approach gathers numerical data, typically through structured surveys and measurements.
Qualitative research: This approach gathers non-numerical data through observations and interviews.
There are four types of data analytics: descriptive, diagnostic, predictive and prescriptive; a small sketch contrasting two of them follows this list.
- Descriptive analytics answers what happened, based on past data.
- Diagnostic analytics compares current data against historical data to work out why something went wrong.
- Predictive analytics estimates what is likely to happen next.
- Prescriptive analytics recommends what actions to take to avoid future problems.
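As a rough illustration of the first and third of these, the sketch below summarises a small, invented monthly sales series (descriptive) and then fits a simple linear trend to estimate the next value (predictive). The figures and column names are made up for the example.

```python
# Minimal sketch: descriptive vs predictive analytics on invented data.
import numpy as np
import pandas as pd

# Hypothetical monthly sales figures (made up for illustration).
sales = pd.Series([120, 135, 128, 150, 162, 171],
                  index=pd.period_range("2023-01", periods=6, freq="M"),
                  name="sales")

# Descriptive analytics: what happened? Summarise the past data.
print(sales.describe())

# Predictive analytics: what can happen next?
# Fit a straight-line trend and extrapolate one month ahead.
x = np.arange(len(sales))
slope, intercept = np.polyfit(x, sales.values, 1)
print(f"Naive forecast for the next month: {slope * len(sales) + intercept:.1f}")
```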
Data analytics can be carried out with different methods on different platforms. A big data analytics pipeline collects data flowing in from large datasets and turns it into useful information; a minimal sketch of these steps follows the list.
- Collection of Data: The available data is gathered from different sources.
- Storage of Data: All the information obtained is stored so that it can be analysed according to the organisation's requirements.
- Processing: Structured, unstructured and semi-structured data is converted into a common format by matching records and running algorithms over them; the processed output feeds the next stage.
- Visualisation: The processed results are visualised so that insights can be read accurately and analysed more easily.
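The sketch below walks through those four stages on a tiny, invented dataset. The records, file names and column names are assumptions made for illustration; a real pipeline would collect from databases, logs or APIs and store data in a proper warehouse rather than a local CSV file.

```python
# Minimal sketch of the four stages: collect, store, process, visualise.
import pandas as pd
import matplotlib.pyplot as plt

# 1. Collection: gather raw records from a source (here, an in-memory list).
raw_records = [
    {"region": "north", "orders": 14, "revenue": 1200.0},
    {"region": "south", "orders": 9,  "revenue":  870.0},
    {"region": "north", "orders": 11, "revenue":  990.0},
    {"region": "east",  "orders": 17, "revenue": 1510.0},
]

# 2. Storage: persist the collected data so it can be reused later.
df = pd.DataFrame(raw_records)
df.to_csv("collected_data.csv", index=False)

# 3. Processing: load the stored data and reshape it into a usable form.
stored = pd.read_csv("collected_data.csv")
summary = stored.groupby("region")[["orders", "revenue"]].sum()

# 4. Visualisation: present the processed result for easier analysis.
summary["revenue"].plot(kind="bar", title="Revenue by region")
plt.tight_layout()
plt.savefig("revenue_by_region.png")
```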
Software commonly used for data analytics includes Hadoop, Talend and Cloudera.
- Hadoop: an open source framework used to store and process large datasets at scale across clusters of commodity hardware, tolerating hardware failures.
- Talend: an open source product that helps with data integration and management.
- Cloudera: a platform used for the management and analysis of data, whether on premises or in the cloud.
History
Analytics emerged in the 1950s, when tools were developed that could capture data, identify patterns and process information faster than a human could. This early era is often referred to as Analytics 1.0. It was characterised by small, structured, internal data processed in batches, which could take months and yielded limited information; more time was spent collecting and preparing the data than analysing it. This era lasted for about half a century, from the mid-1950s until big data was introduced in 2009. Big data marked the next version of analytics, known as Analytics 2.0. With its arrival, new technologies and processes were introduced to speed up turning data into useful insight. Big data relies on tools such as Hadoop for analysis and NoSQL databases for data storage and manipulation. The current era, Analytics 3.0, introduced machine learning, which delivers real-time insights and results.
Future Scope
The use of data is clearly increasing day by day, which creates both challenges and opportunities for data analytics. Demand keeps rising for capturing data, organising it and analysing it to support better decisions.