Science & Technology

Challenges of Analyzing the Huge Amount of Data these Days



In the maddening crowd of worldwide computing, our dependence on huge databases keeps increasing. On the World Wide Web, we generate and analyse humongous amounts of data every second. The data generated is useful for various purposes across organisations, companies, institutions, agencies, etc. Pradipta helps us understand some of the ways huge data is being used for the common human good. He lists ways of doing it via Cloud Computing, Big Data and Hadoop, to name a few. The challenge of analysing these data is equally enormous. Here’s an exclusive report in the Thursday TechGuru, a weekly column, in Different Truths.

In today’s fast-moving world, the amount of data being generated every day is enormous.

Think of it. In a few seconds, how many search queries does Google handle?

How many mobile calls and text messages are being made?

How many posts, shares, and likes do Facebook or Twitter handle? And so on.

When you arrive at the total amount of data being generated in a day, you will find it is humongous: billions of gigabytes (a gigabyte, or GB, is a measure of computer data storage capacity roughly equivalent to 1 billion bytes, a byte being a unit of storage capable of holding a single character). The amount of data continues to increase at an exponential rate.
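As a rough back-of-the-envelope illustration of that scale (the daily figure below is purely hypothetical, chosen only to show how the units add up, not a measurement), a few lines of Python make the arithmetic concrete:

# A rough scale check. The daily figure is hypothetical, for illustration only.
BYTES_PER_GB = 10**9                       # a gigabyte is roughly 1 billion bytes
daily_gigabytes = 2_000_000_000            # assume "a couple of billion gigabytes" a day
daily_bytes = daily_gigabytes * BYTES_PER_GB
print(f"{daily_bytes:.1e} bytes per day")  # ~2.0e+18 bytes, i.e. roughly 2 exabytes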

The data generated is useful for various purposes across organisations, companies, institutions, agencies, etc.

Here are some diverse examples, where data analysis is important.

Using the information kept in social networks like Facebook, marketing agencies learn how people respond to their campaigns, promotions, and other advertising media.

Using data from patients’ medical histories, hospitals are providing better and quicker services.

To reduce and prevent crime, crime analysts analyse people’s data to identify trends and make recommendations based on their observations.

Through analysis and computer mapping, crime analysts play a crucial role in helping law enforcement agencies quantify, evaluate, and respond to the changing landscape of criminal activity in their jurisdictions. The list of ways in which data analysis helps in our day-to-day life is very long.

Now, coming to the technical part: how do we do this?

Addressing big data is a challenging and time-demanding task that requires a large computational infrastructure to ensure successful data processing and analysis.

To do that, here are some of the technical frameworks and approaches:

Cloud Computing: It is a powerful technology for performing massive-scale and complex computing, available on demand. Cloud computing has become a highly demanded service or utility because of its high computing power, low cost, high performance, scalability, accessibility, and availability. In cloud computing, the word cloud (also phrased as “the cloud”) is used as a metaphor for “the Internet”, so the phrase cloud computing means “a type of Internet-based computing”, where different services, such as servers, storage, and applications, are delivered to an organisation’s computers and devices through the Internet.

Big Data: It refers to the extreme volume of data, the wide variety of types of data, and the velocity at which the data must be processed.

Hadoop: It is an open-source framework and set of tools for processing very large data sets; Hadoop and big data go hand in hand.
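To give a flavour of how Hadoop’s programming model works, here is a minimal, self-contained Python sketch of the classic word-count example: a map step emits (word, 1) pairs, a shuffle step groups the pairs by word, and a reduce step sums the counts. The sample sentences and function names are purely illustrative assumptions; this is not Hadoop itself, only a local imitation of the idea.

# A tiny, self-contained illustration of the MapReduce idea behind Hadoop:
# map, then shuffle (group by key), then reduce. Runs locally in plain Python
# and only demonstrates the programming model, not Hadoop itself.

from collections import defaultdict

def map_phase(lines):
    # Map step: emit a (word, 1) pair for every word in every line.
    for line in lines:
        for word in line.lower().split():
            yield word, 1

def shuffle_phase(pairs):
    # Shuffle step: group all the 1s by their key (the word).
    grouped = defaultdict(list)
    for word, count in pairs:
        grouped[word].append(count)
    return grouped

def reduce_phase(grouped):
    # Reduce step: sum the counts for each word.
    return {word: sum(counts) for word, counts in grouped.items()}

sample = [
    "big data needs big infrastructure",
    "hadoop processes big data",
]
print(reduce_phase(shuffle_phase(map_phase(sample))))
# {'big': 3, 'data': 2, 'needs': 1, 'infrastructure': 1, 'hadoop': 1, 'processes': 1}

In a real deployment, Hadoop runs the same map and reduce logic across many machines, with the framework handling the shuffle, fault tolerance, and storage of the data on its distributed file system.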

Of course, there is a lot more to this. This is just a small glimpse to give you a very brief idea of how enormous amounts of data are generated in today’s world, the technological challenges of analysing them, and how people and organisations are handling this large body of data with ever-evolving technical solutions.

It’s mind-boggling!

©Pradipta Roy

Pix sourced by author from the Net.

 

