Ever since we entered the world of “trendy” in SCM class, from lean to reshoring and now big data, I’ve taken an interest in challenging these hip new words, questioning everything they entail while assessing their effectiveness. This week is no different: I want to play devil’s advocate against big data and get the reader to look at this topic from a different perspective.
First, we need to establish a definition for this concept; why is this data big? Big data is a software engineering term describing data sets that grow so large that they become difficult to process with regular database management tools [1]. As companies realized they could use data collected from external stakeholders (consumers, suppliers, etc.), they developed new means to capture and process these growing volumes of data and factor them into their decision-making processes.
These success stories have led other companies to pursue similar initiatives, but they have not been as successful. Why is that? The data is out there for anyone who needs it to capture; how can Amazon gain a competitive advantage while others are either losing money or not generating the ROI they expected? Based on my research, I narrowed this down to three main reasons:
It’s not magic
As Teradata’s Stephen Brobst cleverly stated, “New technologies are often perceived as silver bullets that will solve all problems” [3]. This topic has been heavily discussed in the past, and Nicholas Carr caused quite a stir with his “IT Doesn’t Matter” article. He has a point, though: one cannot keep doing things the same way, invest in a million-dollar technology that everyone else is using, and just sit and wait for it to pay off. There must be sound reasoning behind the investment, and an awareness that it may not yield returns until after a series of changes in different areas (culture, processes, technical expertise, to name a few). The entire change cycle could take years before positive results appear. Furthermore, the investment should not be made for the sake of having the latest and most expensive technology; the right tools are needed to handle, and make the most of, what big data brings.
You need the right people
Part of these tools is people. Technologies that can process big data, such as Hadoop, MapReduce, and NoSQL, are far more difficult to master than traditional database-oriented tools. The experts who know these technologies are scarce, and those receiving formal training in schools don’t learn enough, so companies’ demand is not being met by what the market has to offer. Companies have even created new job roles, such as the data scientist, who focuses on studying and reading this data to interpret what of value it is telling them. What good is all this data if you don’t have the right tools to read it or the people to analyze it?
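To make the contrast with traditional database tools concrete, here is a minimal sketch of the MapReduce programming model mentioned above, using the canonical word-count example in plain Python (the data and function names are illustrative, not taken from any actual Hadoop API):

```python
from collections import defaultdict

def map_phase(records):
    """Emit a (word, 1) pair for every word in every record."""
    for record in records:
        for word in record.lower().split():
            yield word, 1

def shuffle(pairs):
    """Group emitted values by key, as the framework does between phases."""
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def reduce_phase(grouped):
    """Sum the counts collected for each word."""
    return {word: sum(counts) for word, counts in grouped.items()}

# Hypothetical records, e.g. free-text notes from a supply chain system.
records = ["late shipment", "late delivery", "on time delivery"]
counts = reduce_phase(shuffle(map_phase(records)))
print(counts["late"])      # 2
print(counts["delivery"])  # 2
```

On a real Hadoop cluster the map and reduce functions run distributed across many machines and the shuffle is handled by the framework; the mental shift from writing declarative SQL queries to decomposing a problem into these functions is part of why the skills are scarce.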
You need the right processes
The other part of the aforementioned tools is processes. The decision to invest should be carefully reviewed, and questions such as “for which workloads in our processes is this technology most efficient?” should be answered. The entire environment should be designed so that the selected technologies work well with the existing resources. It is also important to assess whether
This is not to say that big data is a bad idea; rather, I believe companies do not understand the investments necessary for it to succeed. My recommendation would be to practice smart investing: go beyond technology and consider people with the right skill sets along with the right processes. Companies should think about the value big data can bring to their business when technology is aligned with strategy, instead of focusing solely on the technology itself.
On that note, I ask the reader: Is big data here to stay?
References:
by Elisa Taymes