Building enterprise products for an unknown tomorrow

The only certainty about the future is uncertainty. Change has been constant ever since the evolution of mankind, but what differentiates this era from foregone ones is the velocity at which change happens. In an earlier blog post, I shared my thoughts on how customers’ needs and outcomes are increasingly becoming moving targets – an unacknowledged challenge of building enterprise products. A product built today is at huge risk of becoming obsolete or irrelevant tomorrow. A Product Manager cannot succeed by building products for static needs; it is necessary to think ahead, think future, and think audacious. Your product cannot be a lottery ticket; there should be something definitive about the future.

Unfortunately, a Product Manager is not Nostradamus and cannot predict the future. Nevertheless, a Product Manager can anticipate the needs and customers of tomorrow by developing a thorough understanding of how markets evolve, how technologies evolve, and how customers’ behaviors and challenges evolve – and accordingly ensure that the new product can scale and adapt to the needs, markets, and outcomes of tomorrow.

Please read my earlier post on moving targets before proceeding further; it asserts the necessity of building customer insights and a scalable product architecture to tackle the challenge of customers’ needs and outcomes being moving targets.

A Product Manager has to build customer insights through experiments and by observing customers in their natural habitat – immersing in their business; assimilating their business processes, problems, and challenges; and not just listening to what they say but reading between the lines to understand what they did not say. Customers might not be able to articulate the business challenges they will face in the future. Based on trends affecting the product and a general understanding of customers’ business environments, the Product Manager should anticipate customers’ requirements and ensure that the new product will optimally address the requirements of tomorrow. The Product Manager can do so by looking outside the boundaries of existing customers and establishing a generalized view of how the market evolves as external factors influence technologies and customer behaviors. One plausible way to understand the future is to comprehend what caused the present to diverge from the past, and use that as a reference to anticipate what will cause the future to diverge from the present.

 

Why look into the future?

Focusing on future needs and identifying them helps the Product Manager anticipate the customers and needs of tomorrow. Accordingly, the Product Manager can conceptualize a product architecture that is scalable for those future needs.

What I have noticed is that the fundamental need does not change much; what changes are the scale and the outcome. Meanwhile, regulatory and economic factors either accelerate or decelerate demand for the need – e.g., France’s plan to ban all petrol and diesel vehicles by 2040.

  1. Scale – As adoption of the product increases, customers start demanding more, and there will be a need for vertical scaling of certain parameters – more processing power, more storage, etc.
  2. Outcome – Technology evolution facilitates delivering a differential outcome – e.g., AI/machine learning is drastically changing how certain needs (autonomous vehicles, fraud detection) are addressed efficiently and effectively. Uber and Airbnb address classic needs but deliver differential outcomes by embracing technology.

Every product has certain scale parameters, and every scale parameter has an expiry date. Product architecture plays a crucial role in determining whether those parameters can scale beyond their initial limits. Some parameters have a soft limit, i.e., they can scale beyond their initial range without requiring too much rework. Consider SaaS products: demand for a SaaS product can rise from 1,000 customers to 1 million customers very quickly once product-market fit is established. With cloud providers like AWS, Azure, etc., scaling a SaaS product is rarely difficult: SaaS companies can deploy and distribute multiple instances of their products globally, and cloud providers offer efficient load balancers to manage those instances. Therefore, it is sufficient to build a SaaS product for a few thousand customers and scale it as demand arises. However, the scale parameters of some other products (mostly HW products) have a hard limit. Increasing them requires a lot of rework and sends the Product Manager, architects, and developers back to the drawing board. What we should essentially weigh is the cost of revisiting the drawing board. There is also a counter-argument: what if customers never require the scale that the product delivers? True – we would then not be adopting lean development methodologies; we would be wasting resources on something customers never require. It is for the Product Manager to make those trade-offs.
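The soft-limit versus hard-limit distinction can be made concrete with a toy model. This is only a sketch – all numbers, names, and thresholds below are illustrative assumptions of mine, not figures from any real product: a soft-limit parameter scales by adding instances behind a load balancer, while a hard-limit parameter forces a redesign once demand exceeds its ceiling.

```python
# Toy model of soft- vs. hard-limit scale parameters (all numbers illustrative).

def instances_needed(customers: int, capacity_per_instance: int) -> int:
    """Soft limit: scale horizontally by adding instances behind a load balancer."""
    return -(-customers // capacity_per_instance)  # ceiling division

def requires_redesign(demand: int, hard_limit: int) -> bool:
    """Hard limit: demand beyond the ceiling sends the team back to the drawing board."""
    return demand > hard_limit

# A SaaS product built for a few thousand users scales by deployment, not redesign:
print(instances_needed(1_000_000, 5_000))  # → 200
# A hypothetical HW appliance rated for 10 Gbps cannot absorb 40 Gbps without rework:
print(requires_redesign(40, 10))           # → True
```

The trade-off the paragraph describes is then the cost of the extra instances (soft limit) versus the cost of the redesign (hard limit).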

Building a new product involves a lot of decision-making. Certain decisions are irreversible, or reversing them costs a lot; other decisions are always reversible. Decisions involving scale parameters with hard limits are irreversible, while decisions involving scale parameters with soft limits are reversible. The Product Manager has to make both kinds of decisions while building the new product. However, certain irreversible decisions depend on anticipating how customer requirements will evolve. Irreversible decisions cannot be made irrationally; they are taken thoughtfully after analyzing all possible risks. For well-informed decision-making, the Product Manager should develop deeper insights about customers to understand how their future requirements might evolve. Developing customer insights is like unearthing deep truths about customers that the customers themselves might not have acknowledged directly.

“Get closer than ever to your customers. So close that you tell them what they need well before they realize it themselves.” – Steve Jobs

Considering that the lifetime of an HW product or a complex SW product could be at least five years, with a possible extension of support for a couple more, anticipating how the future might affect the new product is crucial to ensuring that it is scalable for the precise future needs of target customers. Furthermore, the longer the product stays relevant in the market, the better the ROI: the incremental cost of building an additional HW unit is minimal (economies of scale are achieved by selling more), and for SW products the incremental cost of an additional copy is almost zero. Building scalable products that can sell more for a longer duration therefore yields better revenues with higher margins – and developing better customer insights lets us do so while adhering to lean practices, without wasting resources unnecessarily.

The other factor that necessitates looking at the future is deriving a threat matrix: what newer technologies, economic policies, regulatory policies, etc., could pose a threat to the new product in the near future? Let me pick a familiar domain – technology. While I was building a new product (an HW appliance, back in 2013), I could anticipate two threats: (1) the impact of white-labeled network products with entirely software-driven architectures and (2) the impact of virtualization. In addition, there was (3) a threat from a regulatory body on the issue of net neutrality. While building the new product, I had to identify all possible threats, outline their probability of occurrence, and simultaneously identify what factors could cause or mitigate them. The cause-and-effect relationship between those factors and the corresponding threats – which helps the Product Manager consciously assess the probability of occurrence – is elaborated in detail later in this post.

Figure – Threat matrix of virtualization in service provider network

I earlier indicated three possible threats to the new product. I used one of them to draw the threat matrix below, assessing the impact of virtualization on the proposed new product. Virtualization is a classic example of addressing the same old need with a differential outcome: customers adopt virtualization for the newer outcomes it brings – cost savings, flexibility to deploy services as and when required, and efficient, effective use of available resources. There are two possible scenarios:

i) ISP customers adopt virtualization at various insertion points in their network beyond the DC (Data Center).

ii) ISP customers do not adopt virtualization in their network; virtualization remains restricted to the DC.

The subsequent section of this eBook elaborates how to analyze the factors that could contribute to the various scenarios outlined in the threat matrix. A higher probability of customers not adopting virtualization poses no threat to the new product, while a higher probability of customers adopting virtualization poses a serious threat to its commercial success. The remaining two scenarios pose no immediate threat or relief.
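The scenario reasoning above can be sketched as a simple expected-impact calculation. The probabilities and impact scores below are placeholders of mine, not figures from the original 2013 analysis: each scenario’s threat level is its estimated probability times the impact it would have on the new product.

```python
# Illustrative threat matrix: threat = probability x impact (placeholder numbers).
scenarios = {
    "ISPs adopt virtualization beyond the DC": {"probability": 0.6, "impact": 9},
    "Virtualization stays restricted to the DC": {"probability": 0.4, "impact": 2},
}

def threat_score(scenario: dict) -> float:
    """Expected impact of a scenario on the new product."""
    return scenario["probability"] * scenario["impact"]

# Rank scenarios so the most serious threat gets monitored most closely.
ranked = sorted(scenarios.items(), key=lambda kv: threat_score(kv[1]), reverse=True)
for name, s in ranked:
    print(f"{name}: {threat_score(s):.1f}")
```

The point of the exercise is not the exact numbers but the ranking: as the causal factors discussed later shift the probabilities, the ranking tells the Product Manager which scenario deserves attention now.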

How to understand the future?

Understanding the future is tantamount to diligently anticipating the following:

  • How do customers’ needs evolve?
  • How do technologies evolve?
  • How do markets evolve?
  • Who are the customers of tomorrow?
  • What are the customer needs of tomorrow?

While I speak extensively on the above items in subsequent sections, what is important is to understand the exhaustive list of factors that influence how customer needs, technologies, and markets evolve. Understanding those factors helps the Product Manager determine the causal effects, while proactively monitoring those causal factors helps ascertain how the future might unfold.

Customer needs, technologies, and markets do not evolve overnight; they tend to evolve at a linear pace. However, certain forces at play can culminate together and suddenly push the evolution of customer needs, technologies, and markets onto a trajectory resembling a hockey stick. Especially for high-tech products, Clayton M. Christensen has clearly outlined that when the performance of a new technology outpaces the older technology, it gains adoption. Similar to performance, the Product Manager has to identify several such factors that drive the evolution of new markets and new needs, bringing in a new normal that completely replaces the older way of doing things. The first digital camera was invented in 1975; why did it gain acceptance only in the late 1990s and early 2000s? What caused the technology to replace older film cameras 25 years after its invention? It is always essential to look at those elements. Imagine someone building a film camera in the late 1990s – even one with awesome features would have been a sure recipe for disaster. History can help provoke our thoughts: Product Managers of film camera products could have used the data to anticipate the threat and take corrective action, while Product Managers of digital cameras should have used it to understand how to accelerate performance and bring digital photography products to market faster. We are always on the cusp of major technological changes; a structured analysis is required to differentiate fad from reality – to determine which major technologies are poised to become real and what factors could push them into the mainstream market for wide adoption – and then to evaluate the impact on the new product from the perspectives of both threat and opportunity.

Let me pick another example more relevant to today’s world: AI. AI is a vast area with varying levels of intelligence according to the use-cases it intends to address. In a broad sense, scientists and architects working on AI are contemplating replicating the neural systems of the human brain to build intelligent systems that can learn and adapt on their own, just as humans do. However, to build such systems, we broadly need two things:

  1. Huge processing power at an affordable cost
  2. Availability of huge data and corresponding big data systems to retrieve, store, model, process, and act upon that data in a fraction of a second.

The industry is making huge progress on both (1) and (2). However, whether that progress is sufficient purely depends on the AI systems we are building. When we build a new product that either embraces AI or discards it as hype, we should have clear logic behind the choice instead of merely abiding by analyst reports or intuition. We have to analyze the kind of progress (1) and (2) are making and what factors could further accelerate or decelerate it, which ultimately determines whether AI is hype or reality. Such analysis can also throw light on the likely timeline for AI to become reality. Accordingly, we can either determine the threats AI could pose to the new product if we discard it as hype, or determine the right time to build the new product embracing AI. The following three categories outline a broad classification of AI products:

  • Artificial Narrow Intelligence (ANI) – Specializes in a specific task
  • Artificial General Intelligence (AGI) – Matches the capabilities of a human brain
  • Artificial Super Intelligence (ASI) – Exceeds the capabilities of a human brain; it is tough to fathom the exact potential of ASI

Figure – Growth rate of computing systems

Now that I have indicated the various categories of AI and the dependency of those AI systems on (1) processing power and (2) availability of data and the ability to process and act upon it, let us look at the above picture to understand the evolution of computing systems. Clearly, computing systems that can mimic the human brain at an affordable cost will probably arrive around 2030. If we are looking at building AGI systems, we then know when it is the right time to start building them. Simultaneously, we can anticipate that AGI systems pose no threat to ANI systems at least until 2030. After 2030, however, there could be intelligent systems that do much more than play chess or drive cars. It is equally essential to undergo a similar analysis to understand when the big data systems required for AGI will actually evolve – before 2030 or later? We should always perform such analyses to understand both the threats and the opportunities that related emerging technologies present to the new product.
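The timing argument above can be reduced to back-of-the-envelope arithmetic. Every constant below is my own rough assumption for illustration (the brain’s computing equivalent, the 2015 baseline, and the doubling period are not figures from the referenced chart): if affordable compute doubles roughly every 18 months, we can estimate when $1,000 of hardware crosses the brain-equivalent threshold.

```python
import math

# Back-of-the-envelope timing estimate; every constant below is a rough
# assumption of mine, not data from the original post or figure.
BRAIN_CPS = 1e16        # assumed calculations/sec equivalent of a human brain
START_YEAR = 2015
START_CPS = 1e13        # assumed calculations/sec available per $1,000 in 2015
DOUBLING_YEARS = 1.5    # assumed doubling period of compute per dollar

# Solve START_CPS * 2^(years / DOUBLING_YEARS) = BRAIN_CPS for `years`.
doublings = math.log2(BRAIN_CPS / START_CPS)
years = doublings * DOUBLING_YEARS
print(f"Crossover around {START_YEAR + years:.0f}")  # → Crossover around 2030
```

The value of this kind of sketch is sensitivity, not precision: change the doubling period or the baseline and the crossover year moves, which is exactly the causal-factor monitoring the post advocates.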

While trying to understand the impact of a technology – whether as threat or opportunity over a definite timeline – it is essential to do structured analysis as shown above. Such analysis is possible only if we understand the dependencies that underpin the evolution of various technologies. I have focused exclusively on technology here; nevertheless, the Product Manager should also consider regulation, customer behaviors, the purchasing power of customers, the economy, etc., while anticipating how the future unfolds and determining how all those factors will influence the evolution and acceptance of a technology. Companies such as Kodak have gone into oblivion because they could not anticipate the threat that digital photography posed to their products. Such analysis could have given Kodak clear foresight into when digital photography would be ready for the mainstream market and what factors would aid its adoption; accordingly, Kodak could have switched gears to embrace digital photography. We now know that self-driving cars will eventually become a reality. But what about a decade ago, when the self-driving car initiative was still nascent? Was it possible to identify the factors that could make self-driving cars a reality and anticipate an approximate timeline? I presume traditional car companies have done such analysis to evaluate the threat matrix. Sometimes it helps to look back at history to derive meaningful insights that connect us with the past in order to comprehend the future.

The quantum of change that will occur in the future is much higher than what we have seen in the past; indeed, the change that has transpired in the last decade exceeds that of the previous five decades. However, looking into the past will definitely help the Product Manager connect the dots – to anticipate how smaller changes can combine and what factors could bring them together into a bigger form. In today’s world, technology is one of the biggest drivers of change: it has altered customer behaviors, markets, etc., and no industry or product is immune to technological advancement. One way of anticipating the future is to foresee changes in the technology landscape and understand how they could impact existing markets or create new ones, change customer behaviors, or create new needs. Identifying factors related to technology is therefore an ideal start to understanding the future.

History offers meaningful insights about the past that can help us construct foresight about the future. When some people are able to make decisions in split seconds, they can do so because they quickly reflect on their past learnings – they are actively looking back at history. Here are some meaningful quotes on how connecting with the past can help us comprehend the future:

“Study the past if you would define the future.” – Confucius

“History is a light that illuminates the past, and a key that unlocks the door to the future.” – Runoko Rashidi

“History is important because it teaches us about the past and by learning about the past, you come to understand the present so that you make educated decisions about the future.” – Richard Mead

As these thought leaders outline, history offers many lessons for understanding what led to the present state and what could lead us to the future. To understand the future, it is always important to understand the present by explicitly connecting it back to the past.

Understanding cause and effect

Any transformation in customer behaviors, markets, or technology causes a paradigm shift. Understanding cause and effect means estimating the quantum of such a shift by thoroughly anticipating all causal factors and analyzing how, and when, those factors could cause the shift. No change is independent; smaller changes correlate and coalesce into something bigger – X -> Y -> Z -> BIG CHANGE, i.e., a paradigm shift. There are always elements acting as catalysts that combine smaller changes (‘X’, ‘Y’, and ‘Z’) into a bigger one. The smaller changes need not combine linearly or sequentially – the structure can sometimes be a complex tree – but for simplicity I chose a linear model. Along with identifying the smaller changes, the Product Manager also has to identify the catalyst that can combine them to spur a bigger change. The evolution of technology, markets, and customer needs has so many connected pieces that the Product Manager must identify those pieces, and what connects them, in order to anticipate bigger changes.

There are two ways to do this:

1.    Bottom-up

  • Identify smaller changes and later anticipate what connects those smaller changes to coalesce into something bigger
  • Product Manager has to be all ears and eyes to spot signs signifying smaller changes and use scenario analysis to anticipate how those smaller changes could culminate into something bigger

2.    Top-down

  • Anticipate a potential bigger change and work backward to identify what smaller changes could combine to cause it
  • Analysts provide lots of information on the possibility of bigger changes and trends; their data can be a probable source of truth in this scenario
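The X -> Y -> Z chain and the bottom-up check can be made concrete with a small dependency graph. The graph contents here are illustrative placeholders: each bigger change lists the smaller changes it depends on, and a bottom-up pass tests whether all prerequisites of a paradigm shift have already been observed.

```python
# Illustrative causal-chain model: a bigger change is "ready" only when all of
# its smaller prerequisite changes (transitively) have been observed.
DEPENDS_ON = {
    "paradigm shift": ["Z"],
    "Z": ["Y"],
    "Y": ["X"],
    "X": [],
}

def prerequisites(change: str) -> set:
    """All transitive smaller changes that must be in place for `change`."""
    out = set()
    for p in DEPENDS_ON.get(change, []):
        out.add(p)
        out |= prerequisites(p)
    return out

def ready(change: str, observed: set) -> bool:
    """Bottom-up check: has every smaller prerequisite change occurred?"""
    return prerequisites(change) <= observed

print(ready("paradigm shift", {"X", "Y"}))       # → False: Z has not happened yet
print(ready("paradigm shift", {"X", "Y", "Z"}))  # → True: the pieces have coalesced
```

The top-down approach is the same graph traversed in reverse: start from the anticipated bigger change and enumerate `prerequisites(...)` to know which smaller signs to watch for.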

Nowadays, analysts do a fantastic job of predicting how customer needs, technologies, and markets will evolve. The Product Manager can rely on analyst information, but instead of overly depending on it, the Product Manager should try to understand what could cause those predictions to come true. Below, I attempt to outline a thought process on how virtual reality could enter the mainstream market by 2020.

Virtual reality is predicted to be a huge market by 2020. First, let us understand why virtual reality has not entered the mass market today:

  • Is the technology not affordable? Is the technology not mature?
  • Is there no relevant and appropriate use of virtual reality technology?
  • Has the virtual reality ecosystem not evolved completely?

Then understand what stops virtual reality from entering the mass market today and how those gaps could be bridged, propelling virtual reality into the mainstream market by 2020:

  • How can someone ensure affordability of virtual reality technology?
  • How can the technology mature? Can affordability and maturity of the technology be good enough factors for adoption of the technology in B2C space?
  • What would be the appropriate use of virtual reality that can attract a mass market?
  • In which segments would there be demand for VR devices? The existence of what business drivers would cause demand for virtual reality products in those market segments (particularly B2B)?

I did a similar analysis for the earlier example I picked to draw the threat matrix. We analyzed what stops customers from adopting virtualization in service provider networks and drew the following observations (reflecting the scenario as of 2013, not today):

  • Lack of use-cases
  • Inability of virtualized products to meet performance requirements of customers
  • Lack of products to orchestrate, manage and load balance the traffic to virtualized instances and
  • Lack of clear and tangible advantage over HW appliances

We later analyzed what factors would bridge the above gaps and increase adoption of virtualization in service provider networks. We concluded that (3) could not hold ground: when there are even remote signs of customers adopting virtualized products, companies building products to address (3) will automatically mushroom. The primary factor blocking adoption was performance: without improvements in performance – ultimately, the ability to do more processing on a single CPU core – it is tough to meet customers’ performance requirements. Conceptualizing use-cases requires a discovery process with customers to understand their business environments and the challenges virtualization can tackle. To do so, there is a need for a tangible product; a mere whiteboard discussion will not yield results. Therefore, we attempted to create an MVP version of a virtualized product (i.e., a software appliance running within a virtualized environment). The existence of a real product helps articulate value while allowing customers to experiment with it and derive real use-cases.

Even though analysts will outline when technologies such as big data, IoT, self-driving cars, virtual reality, etc., will reach the mainstream market, the Product Manager should independently assess how those technologies will get there and in which market segments. The idea is to assess what factors will make the technology affordable and usable, and which segments will contribute to its demand. Accordingly, Product Managers can evolve their products to capture a majority of the predicted growth. The Product Manager should always be inquisitive and curious, constantly asking ‘WHY?’ with an insatiable quest to unravel the enigmatic future of markets, technologies, and customers.

The ultimate goal is to create a mental map of all possible futures and then identify the factors that determine the likelihood of each. The biggest responsibility of the Product Manager is to narrow those possibilities down to the one future most likely to occur, based on identifying the causal factors most likely to materialize. The fundamental idea is that we should not leave anything to chance.
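The narrowing-down step can be sketched as picking the scenario whose causal factors are most likely to materialize. The scenario names and probabilities below are placeholders, and treating the factors as independent is a deliberate simplification: multiply the likelihood of each causal factor behind a candidate future and keep the future with the highest combined likelihood.

```python
from math import prod

# Placeholder causal-factor probabilities for each candidate future; the
# scenario names and numbers are illustrative, not from the original analysis.
futures = {
    "virtualization goes mainstream": [0.8, 0.7],  # e.g. performance parity, real use-cases
    "HW appliances persist":          [0.5, 0.4],  # e.g. perf gap remains, weak ecosystem
}

def likelihood(factor_probs: list) -> float:
    # Simplifying assumption: the causal factors are independent.
    return prod(factor_probs)

most_likely = max(futures, key=lambda f: likelihood(futures[f]))
print(most_likely)  # → virtualization goes mainstream
```

As the monitored causal factors strengthen or weaken, updating the probabilities re-ranks the futures – which is how the mental map stays current instead of leaving the outcome to chance.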

[1] Source: http://waitbutwhy.com/2015/01/artificial-intelligence-revolution-1.html?utm_source=share&utm_medium=twitter&utm_campaign=sm_share
