Big Data: Five Fundamental Concepts
Big Data technologies: A few months ago we told you what Big Data is, what its uses are and what types there are. We also introduced you to the so-called ‘4 Vs’ (volume, velocity, variety and veracity). In this post we will go a little further and delve into some fundamental concepts for understanding how data management works.
1. Massive Data – Big Data Technologies
Processing and analyzing massive amounts of data requires specialized software, much of it open source. There are many tools out there, but most are built on the Hadoop Distributed File System (HDFS), a portable, scalable, distributed file system.
HDFS is written in Java for Hadoop, a framework that allows applications to work with thousands of nodes and petabytes of data.
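HDFS’s core idea, splitting a file into large blocks and replicating each block across several machines, can be illustrated with a short, self-contained sketch. Everything here (block size, node names, replication factor) is a toy assumption for readability; real HDFS uses blocks of around 128 MB and a default replication factor of 3.

```python
# Conceptual sketch of HDFS-style block splitting and replica placement.
# Block size and node names are illustrative, not real HDFS defaults.

BLOCK_SIZE = 4          # real HDFS uses ~128 MB blocks; 4 bytes keeps the demo small
REPLICATION = 2         # HDFS defaults to 3 replicas
NODES = ["node-1", "node-2", "node-3"]

def split_into_blocks(data: bytes, block_size: int = BLOCK_SIZE):
    """Cut the file contents into fixed-size blocks."""
    return [data[i:i + block_size] for i in range(0, len(data), block_size)]

def place_blocks(blocks, nodes=NODES, replication=REPLICATION):
    """Assign each block to `replication` nodes, round-robin style."""
    placement = {}
    for idx, block in enumerate(blocks):
        replicas = [nodes[(idx + r) % len(nodes)] for r in range(replication)]
        placement[idx] = {"data": block, "nodes": replicas}
    return placement

file_contents = b"petabytes of data"
blocks = split_into_blocks(file_contents)
layout = place_blocks(blocks)
```

Because every block lives on more than one node, the loss of a single machine does not lose data, and reads can be served by whichever replica is closest.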
2. Real-Time or Fast Data
By Real-Time or Fast Data we mean the ability to obtain data in real time, that is, at the same moment it is generated. Data transmission can occur thousands of times per second.
In addition to the high frequency of data entry, Fast Data also has to do with the ability to process this data and make decisions based on it in the shortest possible time.
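As a rough illustration of this pattern, the sketch below makes a decision on each event the moment it arrives, using only a short sliding window of recent values rather than the full history. The window size, threshold and readings are invented for the example.

```python
from collections import deque
import statistics

# Toy Fast Data pattern: flag anomalous values on arrival, keeping
# only a small sliding window in memory instead of the full stream.

class AnomalyDetector:
    def __init__(self, window_size: int = 5, threshold: float = 3.0):
        self.window = deque(maxlen=window_size)   # recent values only
        self.threshold = threshold                # deviation multiplier (illustrative)

    def process(self, value: float) -> bool:
        """Return True if the new value deviates sharply from the recent window."""
        alert = False
        if len(self.window) >= 2:
            mean = statistics.mean(self.window)
            stdev = statistics.pstdev(self.window)
            if stdev > 0 and abs(value - mean) > self.threshold * stdev:
                alert = True
        self.window.append(value)
        return alert

detector = AnomalyDetector()
readings = [10, 12, 11, 13, 12, 50, 11]
alerts = [detector.process(r) for r in readings]
```

The key design choice is that memory and latency stay constant per event, which is what makes decisions possible “in the shortest possible time” regardless of how long the stream runs.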
3. NoSQL databases
NoSQL (“not only SQL”) encompasses a large class of database management systems characterized by the fact that they do not require fixed structures such as tables. Instead, they rely on other storage models such as key-value stores, column-family stores or graphs.
Unlike traditional storage models, NoSQL can handle larger volumes of data and avoids bottlenecks by scaling horizontally. In addition, it typically runs on commodity hardware, which keeps machine costs down.
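A minimal sketch can make the key-value model concrete. The class, key names and sharding scheme below are illustrative assumptions, not the API of any real NoSQL database; hash-partitioning keys across shards mimics how such systems scale out to avoid bottlenecks.

```python
import zlib

class ShardedKeyValueStore:
    """Toy key-value store that hash-partitions keys across shards,
    mimicking how NoSQL systems scale horizontally."""

    def __init__(self, num_shards: int = 3):
        self.shards = [{} for _ in range(num_shards)]

    def _shard_for(self, key: str) -> dict:
        # Stable hash routes each key to the same shard every time.
        return self.shards[zlib.crc32(key.encode()) % len(self.shards)]

    def put(self, key: str, value):
        self._shard_for(key)[key] = value

    def get(self, key: str, default=None):
        return self._shard_for(key).get(key, default)

store = ShardedKeyValueStore()
# Unlike rows in a relational table, values need no fixed schema.
store.put("user:1", {"name": "Ana", "visits": 42})
store.put("user:2", {"name": "Luis", "tags": ["vip"]})
```

Because each key is routed by its hash, adding shards spreads both storage and request load across machines, which is the horizontal scaling described above.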
4. Data Analytics
A fundamental part of working with data is Data Analytics, the process of examining data sets with the aim of drawing conclusions about the information they contain.
Analytics allows companies to personalize their services and products. As a result, Data Analytics has shortened decision times in companies, in addition to supporting commercial strategy.
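As a toy example of this process, the snippet below summarizes a series of monthly sales figures (made up for the example) to support a simple decision:

```python
import statistics

# Illustrative Data Analytics sketch: examine a data series and
# draw a conclusion from it. The figures are invented for the example.

monthly_sales = [120, 135, 128, 160, 155, 170]

summary = {
    "total": sum(monthly_sales),
    "mean": statistics.mean(monthly_sales),
    "trend": "up" if monthly_sales[-1] > monthly_sales[0] else "down",
}
```

Real analytics pipelines apply the same idea, aggregate, then conclude, at far larger scale and with far richer models.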
5. Cloud Computing
The cloud is key to this kind of work, since it allows large volumes of information to be processed. It is also a high-performance approach that does not require installing specific hardware.
Cloud Computing is, in short, an affordable, fast, convenient, accessible and secure model that more and more companies are adopting.
What Does Small Data Refer To?
One way to define small data is by contrasting it with big data. The latter is defined by the so-called three Vs:
- data volume
- data generation speed
- variety of data formats
Small data, on the other hand, is about small concrete details, extracted through direct observation of customers.
In fact, according to a Campaigner survey, two-thirds of marketers believe smaller, more segmented data provides better insight for marketing.
That is why Roger Dooley, a specialist in business strategy, backs Lindstrom’s claim that the most important current trend is not mobile or social media, but small data.
This is why it is necessary to get to know customers better, directly and individually. This applies in both online and offline environments, since data can be obtained from any context.
Small data: the value of details
The pioneer of the small data concept is Martin Lindstrom, author of the book “Small Data: The Tiny Clues That Uncover Huge Trends”. In it, Lindstrom lays out a new philosophy of data analysis that has since reshaped business management.
This philosophy, which he called “small data”, was born from his experience as a business advisor and market researcher for the Lego company. Lindstrom explains how Lego bounced back from a rough patch thanks to “little facts” he discovered on a visit to the home of an 11-year-old boy.
From that moment on, and thanks to that piece of small data, Lego changed its focus and revived its sales. The key to its success was the closer attention it paid to its child consumers.