The Liberating “What If” Analytics Cycle

By Bill Schmarzo, CTO, Dell EMC Services (aka “Dean of Big Data”) | May 20, 2013

Ah, the anguish of not knowing the “right” answers.  Organizations struggle with the process of determining the “right” answers, resulting in wasted debates and divisive arguments over whose answers are more right.  There is even a name for this debilitating process – “analysis paralysis” – where each side of the argument brings forth its own factoids and anecdotal observations to justify its “right” answer.  However, the concepts of experimentation and instrumentation can liberate organizations from this “analysis paralysis” by providing a way out – a way forward that leads to action instead of more debate, more frustration, and yet more analysis paralysis.

For many organizations, the concepts of experimentation and instrumentation are a bit foreign.  Internet companies (such as Yahoo, Google, Facebook, and Amazon) and direct marketing organizations have ingrained these two concepts into their analytics and customer engagement processes.  They have leveraged experimentation and instrumentation to free up organizational thinking – to explore new ideas and test “hunches” freely – but in a scientific manner that yields solid evidence and new organizational learning.

Let’s understand how your organization can embrace these same concepts as part of your big data strategy.  Let’s start by defining these key terms:

  • Experimentation is defined as the act, process, practice, or an instance of making experiments, where an experiment is a test, trial, or tentative procedure; an act or operation for the purpose of discovering something unknown or of testing a principle, supposition, etc.[1]
  • Instrumentation is defined as the art and science of measurement and control of process variables within a production or manufacturing area.[2]

Taken together, these two concepts can liberate organizations that are suffering from analysis paralysis – struggling when they are not certain which decision to make.  Should I increase prices 10% or decrease prices 10%?  Should I use this ad and messaging, or that ad and messaging?  Should I offer promotion A or promotion B?

Taken together, these two concepts can also power the creative “What If” thinking that is necessary as organizations look to embrace big data.  The “What If” Analytics Cycle can advance the organization’s understanding of the business potential of new sources of structured and unstructured data – internal and external to the organization – coupled with advanced analytics and data science methodologies (see Figure 1 below).


Figure 1:  The “What If” Empowerment Cycle

I like to call this the “empowerment” cycle because it empowers organizations to freely debate different ideas without having to worry ahead of time about which ideas are right.  Consequently, organizations can embrace an environment of experimentation that encourages the free flow of new ideas.  Organizations can let the tests tell them which ideas are “right,” rather than letting the most persuasive debater or most senior person in the room make that determination.  It empowers the organization to challenge conventional thinking, and it empowers the creative thinking that can surface potentially worthy ideas.  No longer do you have to spend time debating whose idea is right.  Instead, put the ideas to the test and let the data tell you!

Let’s walk through an example of how one would leverage the “What If” Empowerment Cycle:

  • Step 1:  Develop the hypothesis or theory that you want to test.  For example, I believe that my target audience will respond more favorably to Offer A, while my colleague believes that Offer B is more attractive to our target audience.
  • Step 2:  Create an experiment (e.g., a test environment with corresponding test cases) that can prove or disprove the hypothesis.  We’d also want to be clear about the metrics we would use to measure the test results (e.g., click-through rate, store traffic, sales).  In this example, we’d create tests for three test cases: Offer A, Offer B, and a Control Group.  We’d employ data science methodologies to select our test and control audiences, and ensure that other potential variables are held constant during the test (same time of day, same audience characteristics, same channel, same timeframe, etc.); the code sketch after this list shows one simple way to do this.
  • Step 3:  Instrument all test cases in order to measure the results of the test.  In this example, we’d want to ensure that each of the three test cases was appropriately “tagged” and that we were capturing all the relevant data to determine who responded to which offers, who didn’t respond, and what the results of their responses were.
  • Step 4:  Execute the tests.  For our example, we’d now start the tests; capture, assemble, and integrate the relevant data; and then conclude the test.
  • Step 5:  Learn from and quantify the test results, and move on to the next test.  Finally, we’d look at the results of the tests, examine who clicked on which ads, determine the final results, and declare a winner.  And, more importantly, we’d then move on to the next test.
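To make Steps 2, 3, and 5 concrete, here is a minimal sketch in Python of the Offer A versus Offer B test.  Everything in it is a hypothetical stand-in – the cell names, helper functions, and simulated response rates are assumptions for illustration, not tooling from any actual campaign system – but it shows the three mechanical pieces of the cycle: deterministic assignment of the audience into test cells, tagging every impression so responses can be attributed, and a simple two-proportion z-test that lets the data, not the loudest debater, declare the winner.

```python
# Hypothetical sketch of the "What If" test cycle; all names and rates are
# illustrative assumptions, not tooling from an actual campaign system.
import hashlib
import math

# Step 2: define the test cells -- two offers plus a control group.
CELLS = ["offer_a", "offer_b", "control"]

def assign_cell(user_id: str) -> str:
    """Deterministically hash a user into one cell, so the same person
    always sees the same offer for the life of the test."""
    bucket = int(hashlib.sha256(user_id.encode()).hexdigest(), 16) % len(CELLS)
    return CELLS[bucket]

# Step 3: "instrument" the test -- tag every impression with its cell so
# each response (click) can be attributed to the offer that drove it.
events = []  # in practice this would be a logging/analytics pipeline

def log_impression(user_id: str, clicked: bool) -> None:
    events.append({"user": user_id,
                   "cell": assign_cell(user_id),
                   "clicked": clicked})

# Step 5: quantify the results per cell and test for significance.
def cell_results(cell: str):
    """Return (clicks, impressions) for one test cell."""
    rows = [e for e in events if e["cell"] == cell]
    return sum(e["clicked"] for e in rows), len(rows)

def z_score(clicks_a, n_a, clicks_b, n_b) -> float:
    """Two-proportion z-test: is Offer A's click rate really different
    from Offer B's, or is the gap just noise?"""
    p_a, p_b = clicks_a / n_a, clicks_b / n_b
    pooled = (clicks_a + clicks_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

if __name__ == "__main__":
    import random
    random.seed(42)
    # Simulated campaign with hypothetical true response rates per cell.
    true_rates = {"offer_a": 0.060, "offer_b": 0.045, "control": 0.030}
    for i in range(30_000):
        uid = f"user-{i}"
        log_impression(uid, random.random() < true_rates[assign_cell(uid)])

    for cell in CELLS:
        clicks, n = cell_results(cell)
        print(f"{cell}: {clicks}/{n} = {clicks / n:.3%} click-through rate")

    z = z_score(*cell_results("offer_a"), *cell_results("offer_b"))
    print(f"Offer A vs. Offer B z-score: {z:.2f} "
          f"(|z| > 1.96 is significant at roughly the 95% level)")
```

Note the design choice in Step 2 of the sketch: hashing the user ID, rather than picking a cell at random on every impression, is what keeps the test clean – each person always lands in the same cell, so every response can be attributed to exactly one offer.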

The beauty of an organization that understands and embraces the experimentation and instrumentation cycle is that it no longer has to choose between the two decisions up front – it can test both and see which one works out best.  This “What If” Analytics Cycle leverages experimentation and instrumentation to empower the organization to freely explore and test new ideas, and to get moving rather than getting bogged down in “analysis paralysis.”  In fact, I’d argue that big data is the antidote to “analysis paralysis,” because it gives organizations the data, tools, and methodologies to test ideas, learn from those tests, and move on.

 


[1] http://dictionary.reference.com/browse/experiment

[2] http://en.wikipedia.org/wiki/Instrumentation

About Bill Schmarzo


CTO, Dell EMC Services (aka “Dean of Big Data”)

Bill Schmarzo, author of “Big Data: Understanding How Data Powers Big Business” and “Big Data MBA: Driving Business Strategies with Data Science,” is responsible for setting the strategy and defining the Big Data service offerings and capabilities for the Dell EMC Services Big Data Practice. As the CTO for the Big Data Practice, he is responsible for working with organizations to help them identify where and how to start their big data journeys. He has written several white papers, is an avid blogger, and is a frequent speaker on the use of Big Data and data science to power an organization’s key business initiatives. He is a University of San Francisco School of Management (SOM) Executive Fellow, where he teaches the “Big Data MBA” course. Bill was ranked the #15 Big Data Influencer by Onalytica.

Bill has over three decades of experience in data warehousing, BI and analytics. Bill authored Dell EMC’s Vision Workshop methodology that links an organization’s strategic business initiatives with their supporting data and analytic requirements, and co-authored with Ralph Kimball a series of articles on analytic applications. Bill has served on The Data Warehouse Institute’s faculty as the head of the analytic applications curriculum.

Previously, Bill was the vice president of Analytics at Yahoo where he was responsible for the development of Yahoo’s Advertiser and Website analytics products, including the delivery of “actionable insights” through a holistic user experience. Before that, Bill oversaw the Analytic Applications business unit at Business Objects, including the development, marketing and sales of their industry-defining analytic applications.

Bill holds a master’s degree in Business Administration from the University of Iowa and a Bachelor of Science degree in Mathematics, Computer Science and Business Administration from Coe College.
