Big data has the potential to transform humankind – from helping us cure diseases to simplifying and streamlining our lives. Now, let us not get into the debatable aspects of our increasingly digital universe but focus on the art of the possible.
Having digitized our entire lives, as documented in the digital universe study, we have moved on to machines.
With the help of devices, wearables, sensors and systems, we’ve begun collecting, connecting and disseminating information, thoughts, and experiences in ways unimaginable just a few years back.
Most organizations are looking for ways to ride the waves of big data and transform their domain – be it customer experiences, processes, systems or plain and simple time management. All big data projects need a roadmap, a recipe that can iteratively move you toward your goal. After wading through a wide array of literature, I was able to simplify the process into three steps: an exploration phase followed by optimization that leads to true transformation.
These iterative steps can help guide your projects, irrespective of domain, in an easy-to-execute format.
Explore:
All big data projects start with a question; data scientists will not touch anything without a good one, and the first question is often exploratory. It can be as generic as:
- “Is our customer experience good enough to keep customers coming back?”
- “How can we transform the customer experience so that customers become staunch evangelists for our brand?”
- or a much more specific question: “How much more would a buyer spend if we kept him or her on the site for two more minutes?”
Whether the domain is customer experience, process improvement or IT management, businesses start with a question to explore, then build on the answers iteratively until insights stand out from the noise.
The explore phase requires taking inventory of all data assets and cataloging every bit of information available in its natural and original form. Then, construct a sandbox to play with the information, build models, create prototypes, develop hypotheses and test them. The exploration enables business and data scientists to get questions answered to a sufficient level of detail. This drives action and helps optimize the system.
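As a toy illustration of the sandbox idea, here is a minimal sketch in Python of testing the "two more minutes" question from above. The dataset, field names and threshold are entirely hypothetical; a real exploration would pull from the cataloged data assets and use proper statistical tests.

```python
# Hypothetical sandbox data: web sessions with time on site and spend.
sessions = [
    {"minutes_on_site": 3, "spend": 12.50},
    {"minutes_on_site": 5, "spend": 18.00},
    {"minutes_on_site": 8, "spend": 25.40},
    {"minutes_on_site": 2, "spend": 9.99},
    {"minutes_on_site": 11, "spend": 31.75},
    {"minutes_on_site": 6, "spend": 22.10},
]

def average_spend(rows):
    """Mean spend across a group of sessions."""
    return sum(r["spend"] for r in rows) / len(rows)

# Split on a candidate threshold -- a modeling choice to probe, not a given.
threshold = 6
short = [r for r in sessions if r["minutes_on_site"] < threshold]
long_ = [r for r in sessions if r["minutes_on_site"] >= threshold]

lift = average_spend(long_) - average_spend(short)
print(f"Average spend lift for longer sessions: {lift:.2f}")
```

A result like this only raises the next question for iteration: is longer time on site causing higher spend, or are engaged buyers simply staying longer?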
Optimize:
The primary function of this phase is to operationalize efficiently. Having identified the drivers of business impact in terms of people, processes and technologies, the business scales what it has learned and optimizes its systems to encourage the right behaviors, patterns and habits. The goal is to find better, more efficient ways to get things done.
This is done by drilling down on the exploratory question with a qualifier. For instance:
- “Are we satisfying the right customer need in the best possible way for the customer?”
But the process does not stop here.
To build a sustainable competitive advantage, businesses have to transform the workforce, organization, and industry by institutionalizing the explore-optimize process. This is the secret sauce for building sustainable systems that consistently deliver and raise the bar for the industry, establishing innovation beyond competition.
Transform:
To get to the next level, individuals, organizations, and industries need a plan, a roadmap to navigate the unknown waters of time. True transformation comes from building a living system of improvement that leverages insights to continuously iron away the wrinkles and flatten the seams one item at a time, answering questions like,
- “Are we done with transforming our customer experiences or have we created a new need?”
The improvements, adjacencies and optimizations will no longer be one-offs, but part of an organic system that iteratively raises the bar.
This is the aspiration. It takes a sustained effort on the part of individuals, organizations, and industries to come to fruition.
The overall big data endeavor might appear daunting at first, but chip away at it in small chunks and big innovations become possible rather quickly. The three-step process is a small attempt to simplify the strategy for moving forward. It is not very different from the software development lifecycle, the services lifecycle or Six Sigma processes: the principles remain the same.
A recent whitepaper examines how data science, IT and big data can come together to build a data lake that helps you chip away at your goals in small chunks. Data scientists, IT staff and database administrators alike can get a lot out of it.