There are a lot of articles discussing minimal viable products, and there are many legitimate reasons why they exist. However, I believe another approach is required: minimal viable tests.
Many people endeavoring to launch a new product or feature into the market begin by developing a small product that does not have all the bells and whistles of a full-fledged offering, but instead tries to solve the core problem. That approach still requires building a solution, however lightweight, and those looking to bring a solution to market are often unable to trim down the number of features, creating a lot of bloat when trying to affirm the viability of their solution.
A minimal viable test, however, requires no solution – there is no bloat. All you’re doing is testing whether a problem exists, validating or debunking assumptions, and learning along the way. Your objective is to find out where the points of friction lie in a proposed approach to a problem.
Let’s break it down.
At Element Analytics, at our founding, we believed that data scientists could unlock a step-function improvement in industrial operational performance. But this was a hypothesis, built on many assumptions. An MVP approach would have had us develop a data science product around some specific use case built on these assumptions. Instead, the minimal viable test process led us in another direction:
First, we listed out all of our assumptions: (1) did industrial companies focused on operations even want data science driven solutions, (2) were these companies comfortable with giving away their data for such a solution, (3) were data scientists interested in pursuing industrial operational improvements, and (4) could data science work provide value to an industrial organization?
Rather than formulating a solution hypothesis, we decided to test these assumptions. In our experiment, we quickly and easily found a company willing to provide its data and a data scientist willing to work on that data, and his work did provide some value. We knew there was value, but what also surfaced was the incredible amount of friction involved in getting operational data ready for data scientists. We then began developing a set of minimal viable tests around that friction.
In this process, we never came up with solutions, just ways to uncover problems and validate assumptions. This gave us much greater visibility into where we would build MVPs, which served as hypotheses for solutions to those problems.
Taking a scientific-method approach to solution development is more methodical, but as a result we know that what we’re building is the right thing, and we avoid spending many cycles redoing what we merely believed was right. A focus on building the right solution, instead of hacking a solution together to see what sticks, is especially needed for B2B companies. A methodical, problem/solution-focused approach enables transformational innovation and more effective MVP sprints.