
Building enterprise products for an unknown tomorrow

The only certainty about the future is uncertainty. Change has been constant ever since the evolution of humankind, but what differentiates this era from those gone by is the velocity at which change happens. In an earlier blog post, I shared my thoughts on how customers’ needs and outcomes are increasingly becoming moving targets – an often unacknowledged challenge of building enterprise products. A product built today is at huge risk of becoming obsolete or irrelevant tomorrow. A Product Manager cannot succeed by building products for static needs; there is a necessity to think ahead, think future, and think audacious. Your product cannot be a lottery ticket; there should be something definitive about the future you are building for.

Unfortunately, a Product Manager is not Nostradamus and cannot predict the future. Nevertheless, a Product Manager can anticipate the needs and customers of tomorrow by developing a thorough understanding of how markets evolve, how technologies evolve, and how customers’ behaviors and challenges evolve – and accordingly ensure that the new product can scale and adapt to the needs of tomorrow, the markets of tomorrow, and the outcomes of tomorrow.

Please read my earlier post on moving targets before proceeding further; it asserts the necessity of building customer insights and a scalable product architecture to tackle the challenge of customers’ needs and outcomes being moving targets.

How does THE FUTURE unfold?

Choose any product (e.g., the digital camera, Airbnb) or category (e.g., the smartphone) that has caused major disruption. What we realize is that the fundamental need addressed by those products did not change much. The need to capture an image has not changed, the need for accommodation while on business or leisure travel has not changed, and the need for communication has not changed. What has changed is the scale and the outcome at which the new-age products addressed those needs, leading to new business models and creating a new normal. Enterprise products are no different. Meanwhile, regulation and microeconomic and macroeconomic factors contribute to either accelerating or decelerating the demand for a need, e.g., France’s plan to ban all petrol and diesel vehicles by 2040.

  • Scale – As adoption increases, users demand more functionality, which is seldom possible to deliver without additional processing power, storage, etc. In addition to an increase in the scale of specific attributes, additional needs and challenges arise as adoption of the product grows. More connected IoT devices, for instance, create challenges in securing those devices and a need for operational simplicity in managing them. These are needs that arise at the periphery as adoption grows, and ecosystems evolve accordingly with the addition of new players.
  • Outcome – Technology evolution creates immense possibilities for delivering a differentiated outcome that was not possible before; e.g., AI/machine learning helps address specific needs such as autonomous driving and fraud detection more effectively and efficiently. New-age companies like Uber and Airbnb, while serving a classic need, created a new normal by delivering a differentiated outcome through technology.

 

Product scale

Every product has specific scale parameters, and every scale parameter has an expiry date. Product architecture plays a crucial role in determining whether those parameters can scale beyond their initial limits. Some of them have a soft limit, i.e., the ability to scale beyond their initial range without much rework. Consider SaaS products: demand for a SaaS product can quickly rise from 1,000 customers to 1 million customers soon after product-market fit is established. With cloud providers like AWS and Azure, auto-scaling of SaaS products is not a distant dream. SaaS companies can deploy and distribute multiple instances of their products globally, and cloud providers offer efficient load balancers to manage those instances. Therefore, it is sufficient to build a SaaS product for a few thousand customers and scale it on demand. However, the scale parameters of some other products (mostly HW products) have a hard limit. Increasing them requires a lot of rework, compelling the Product Manager, architects, and developers to go back to the drawing board. What we should primarily consider is the cost of revisiting the drawing board. There could also be a counter-argument: what if customers never require the scale the product delivers? True, we would then not be following lean development methodologies, and we would be wasting resources on something customers may never demand. It is for the Product Manager to make those trade-offs. Ideally, I would recommend first determining product-market fit before making any decision on scaling; the rule of thumb is to nail the product before scaling it. Nevertheless, it is essential to determine the scale requirements immediately after reaching product-market fit but before adoption increases.
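To make the soft-limit versus hard-limit trade-off concrete, here is a minimal Python sketch. The tenant and instance numbers (TENANTS_PER_INSTANCE, MAX_INSTANCES_PER_REGION) are hypothetical placeholders rather than benchmarks; the point is only that a soft limit is crossed by adding instances behind a load balancer, while a hard limit sends everyone back to the drawing board.

```python
import math

# Hypothetical scale parameters for a SaaS product; illustrative numbers only.
TENANTS_PER_INSTANCE = 2_000      # soft limit: handled by adding instances
MAX_INSTANCES_PER_REGION = 500    # hard limit: beyond this, re-architecture is needed


def instances_needed(tenants: int) -> int:
    """Instances required if the product scales horizontally."""
    return math.ceil(tenants / TENANTS_PER_INSTANCE)


def scale_assessment(tenants: int) -> str:
    needed = instances_needed(tenants)
    if needed <= MAX_INSTANCES_PER_REGION:
        return f"{tenants:,} tenants -> {needed} instances (soft limit: scale out)"
    return (f"{tenants:,} tenants -> {needed} instances exceeds the "
            f"{MAX_INSTANCES_PER_REGION}-instance hard limit: back to the drawing board")


if __name__ == "__main__":
    for demand in (1_000, 100_000, 1_000_000, 5_000_000):
        print(scale_assessment(demand))
```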

Scale is not limited to specific attributes or parameters of a product. The Product Manager should foresee opportunities and challenges triggered by large-scale adoption of a product. An increase in adoption opens new opportunities and alters the existing ecosystem. The adoption of digital cameras paved the way for photo editors, photo-sharing and hosting solutions, and photo viewers. In most cases, the ecosystem evolves on its own, further increasing adoption of the product. Otherwise, the Product Manager has to strategize explicitly to strengthen the ecosystem.

The majority of products operate in an ecosystem. Ironically, not all products in an ecosystem capture the same value. As the ecosystem evolves with the addition of new players, value shifts. The Product Manager has to be conscious of where the shift is occurring and the possible implications for the product (s)he owns.

 

Product Outcome

The emergence of new technologies creates enormous possibilities for new outcomes. Anticipate all possible emerging technologies related to the product and comprehend the new outcomes those technologies could deliver. The Product Manager then has to evaluate the following:

  1. What technologies are likely to enter the mainstream market
  2. What outcomes could those technologies deliver

I have a blog post pending on the ‘Outcome thinking canvas’. The canvas elaborates on what new outcomes an emerging technology delivers and what would cause customers to embrace the new outcome or stay with the status quo. Accordingly, assess the risk to the new product in the future and strategize mitigation plans.

Virtualization was one of the significant trends gaining attention in the industry while I was building an HW appliance (back in 2013). I drew the adoption matrix below to assess the impact of virtualization on the proposed new product. There were two possible scenarios:

  1. ISP customers adopt virtualization at various insertion points in their network beyond the DC (Data Center).
  2. ISP customers do not adopt virtualization in their network; virtualization remains restricted to the DC.

Figure – Adoption matrix of virtualization in the service provider

Using the technology canvas elaborated later in this post, I identified what was stopping ISPs from adopting virtualization, what could cause ISPs to adopt virtualization in the future, and when virtualization would be ripe for mass adoption. A high probability of customers not adopting virtualization poses no threat to the new product, while a high probability of customers adopting virtualization poses a severe threat to its commercial success. The remaining two scenarios pose neither an immediate threat nor relief. To comprehend and ascertain the most probable outcomes, it is essential to rationalize what technology could define the future.
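The matrix logic above can be captured in a few lines. The sketch below is illustrative only: the scenarios and probability labels would come from the canvas analysis rather than from code, and threat_level is a hypothetical helper for a hardware (non-virtualized) product.

```python
# Minimal sketch of the adoption matrix described above (illustrative only).

def threat_level(customers_adopt: bool, probability: str) -> str:
    """Map an adoption scenario and its likelihood to a threat rating
    for a hardware (non-virtualized) appliance."""
    if customers_adopt and probability == "high":
        return "severe threat - commercial success of the HW appliance at risk"
    if not customers_adopt and probability == "high":
        return "no threat - HW appliance remains relevant"
    return "no immediate threat or relief - keep monitoring"


if __name__ == "__main__":
    for adopt in (True, False):
        for prob in ("high", "low"):
            scenario = ("ISPs adopt virtualization beyond DC" if adopt
                        else "virtualization stays restricted to DC")
            print(f"{scenario:40s} | probability={prob:4s} | {threat_level(adopt, prob)}")
```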

 

What defines THE FUTURE?

Technology has always been a catalyst determining how the future evolves and will continue to be so. While the mid-90s were defined by the Internet and the mid-2000s by mobile, the era starting in the mid-2010s could be defined by Artificial Intelligence. Technology advancements always had, and will continue to have, far-reaching implications for how markets evolve and how customer needs and behaviors evolve. Faster technology advancements put products at risk of becoming irrelevant sooner. Since technology is continuously at the epicenter of creating a new normal in markets and customer behaviors, it is only appropriate for Product Managers to forecast or anticipate which technologies could define the next decade. A plethora of technologies emerge every day, and only a few enter the mainstream market. But why do certain technologies alone succeed, and what factors contribute to their success?

The enigma of technology emergence

There is always an enigma behind why a few technologies successfully emerge while many others eventually fade away. Even among technologies that successfully emerge, some take longer than others to enter the mainstream market.

  • The first digital camera was invented in 1975. Why did it gain acceptance only in the late 1990s and early 2000s? What caused the technology to replace older film cameras 25 years after its invention?
  • Why was the smartphone one of the fastest-adopted technologies?
  • Why did Google Glass, the Segway, and the Amazon Fire Phone fail?
  • The first AI program was created in the 1950s. However, it is only recently that AI has been adopted and successfully applied to several problems. What caused the slow rate of adoption, and what would enable the creation of an AI system that could completely replicate a human brain?

There is a necessity to comprehend the factors that catapult certain technologies into the mainstream market and to identify whether there is a pattern that can help us predict or forecast the technologies that will emerge to define THE FUTURE. The digital camera, even though invented in 1975, gained acceptance only in the late 1990s and early 2000s. What caused it to replace older film cameras 25 years after its invention? Probably improvements in image quality and the proliferation of PCs to store and process images. It is always essential to comprehend the factors that could accelerate the adoption of an emerging technology. Imagine someone building a film camera in the late 1990s: even with impressive features, it would have been a sure recipe for disaster. History can be helpful to provoke our thoughts. Product Managers of film camera products could have used the data to anticipate the threat and take corrective action; Product Managers of digital cameras should have used the data to identify the factors that could accelerate the adoption of digital photography. We are always on the cusp of major technological changes. A structured analysis is required to differentiate fad from reality, to analyze which technologies are poised to become a reality and how, and later to evaluate the impact on the new product from the perspective of both threat and opportunity.

Let me pick a contemporary example: AI (Artificial Intelligence). AI is a vast domain with several tiers of intelligence according to the use-cases it intends to address. The following three categories outline a broad classification of AI products.

  • Artificial Narrow Intelligence (ANI) – Specializes in a specific task
  • Artificial General Intelligence (AGI) – Matches the capabilities of a human brain
  • Artificial Super Intelligence (ASI) – Exceeds the capabilities of a human brain. It is tough to fathom the exact potential of ASI.

Scientists and architects working on AI have embarked on an audacious attempt to build intelligent systems (AGI) that can learn and adapt on their own just as humans do, or even exceed human capabilities by replicating the neural systems of the human brain. What is the realistic possibility of building such an AGI system, and when could it happen? To build AGI systems that behave like humans, we have to make huge strides in AI algorithms, which in turn depend on two other factors.

  1. Huge processing power at an affordable cost
  2. Availability of huge volumes of data and corresponding big data systems to retrieve, store, model, process, and act upon that data within microseconds.

The industry is making huge progress on both (1) and (2). However, whether that progress is sufficient depends purely on the computing requirements of the AGI systems we are building. We have to analyze the kind of progress (1) and (2) are making and what factors could further accelerate or decelerate it, which eventually determines whether AGI systems are hype or reality. Such analysis can also throw light on how long it could take for AGI systems to become a reality. Accordingly, we can either determine the threats an AGI system could pose to ANI systems if we are dismissing it as hype, or determine when it is possible to build a new product that behaves as humans do.

There is a clear indication that computing systems that can mimic a human brain at an affordable cost will probably evolve around 2030. If we are looking at building AGI systems, we then know when it is feasible to build them successfully. Simultaneously, we can also anticipate that there is no real threat from AGI systems to ANI systems at least until 2030. However, after 2030 there could be intelligent systems that do much more than play chess and drive cars, probably replacing white-collar jobs. It is equally essential to undergo a similar analysis to understand when the big data systems required for AGI will evolve. Will that happen before 2030 or later? We should always perform such analysis to comprehend which emerging technologies could enter the mainstream market and when, and accordingly identify both the threats and the opportunities confronting the new product.
Figure – The growth rate of computing systems
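As a rough illustration of the kind of analysis behind such a figure, here is a back-of-the-envelope Python sketch. Every number in it is an assumption made purely for illustration: a Moore's-law-like doubling of compute per dollar every two years, a much-debated estimate of about 1e16 operations per second for the human brain, and a $10,000 affordability budget. With these toy numbers the crossover lands in the early 2030s, broadly consistent with the ~2030 indication above.

```python
# Back-of-the-envelope extrapolation of affordable compute.
# All constants below are illustrative assumptions, not measured facts.
BRAIN_OPS_PER_SEC = 1e16          # assumed brain-equivalent throughput
OPS_PER_DOLLAR_2013 = 1e9         # assumed affordable compute per dollar in 2013
DOUBLING_PERIOD_YEARS = 2.0       # Moore's-law-like doubling assumption
BUDGET_DOLLARS = 10_000           # what counts as "affordable"

year, ops_per_dollar = 2013, OPS_PER_DOLLAR_2013
while ops_per_dollar * BUDGET_DOLLARS < BRAIN_OPS_PER_SEC:
    year += 1
    ops_per_dollar *= 2 ** (1 / DOUBLING_PERIOD_YEARS)

# With these toy numbers the loop exits in the early 2030s.
print(f"Brain-scale compute becomes affordable around {year} under these assumptions")
```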

Unraveling the enigma of technology emergence

The earlier analysis concluded that it is not possible to build an AGI system before 2030. Nevertheless, computing systems are not the only dependency for the advancement of AGI systems. Moreover, a decade is too long a horizon to make a prediction and blindly stick to it. We need to keep a close watch on whether computing systems are growing as expected by identifying the dependencies that determine their progress.

The conceptualization of a wide range of ANI use-cases and the successful adoption and proliferation of related products such as Tesla, Alexa, and Nest will accelerate the demand for more advanced AI use-cases. The initial euphoria of AI products gaining customer acceptance lures investors into making huge bets on AI companies, thereby increasing investments, fueling further innovation in computing systems and other AI infrastructure, and making it feasible to harness additional use-cases that eluded humanity earlier. It is a cyclic reaction: initial adoption of ANI systems by B2B and B2C customers fuels more investment, leading to further technology advancements and, in turn, better AI products through improvements in computing systems and the HW/SW entities related to AI.

Initial success fuels more investments, and additional investments beget further progress. The cycle keeps turning until success or progress decelerates, or investments slow down. For better prediction, we (Product Managers) should be able to anticipate what could break, accelerate, or decelerate the cycle. Possible decelerators: technology improvements do not happen as predicted, an economic slowdown hampers the inflow of money, or enterprise customers stop investing in AI devices. On the contrary, check what could accelerate the cycle: killer AI use-cases for B2B and B2C that make AI products indispensable. Killer use-cases ensure that AI products will be the last category to take a hit when either B2B or B2C customers decide to cut down their spending. Meanwhile, keep collecting exhaustive data that indicates how the AI adoption cycle is performing. Below are a few snippets of relevant data: i) the phenomenal increase in AI funding over the last five years, and ii) the increase in the share value of NVIDIA.
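One way to make tracking the cycle concrete is to fold several signals into a single momentum score. The sketch below uses made-up indicator values, weights, and a threshold purely for illustration; real inputs would come from funding databases, market data, and analyst reports.

```python
# Minimal sketch: combine several adoption signals into one momentum score.
# Indicator values and weights are made up for illustration only.
indicators = {
    # name: (normalized year-over-year growth, weight)
    "ai_funding": (0.45, 0.3),
    "ai_startups": (0.30, 0.2),
    "b2b_adoption": (0.20, 0.3),
    "compute_improvement": (0.35, 0.2),
}

momentum = sum(growth * weight for growth, weight in indicators.values())
verdict = "accelerating" if momentum > 0.25 else "decelerating or flat"
print(f"weighted momentum score: {momentum:.2f} -> adoption cycle looks {verdict}")
```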

 

We can continue looking at other relevant data: i) adoption of AI products (both B2B and B2C), ii) the increasing number of AI startups, iii) technology improvements, and iv) acquisitions of AI companies. It is essential to evaluate each data point in correlation with the others and connect the dots to predict the progress of AI. Conduct such analysis for all emerging technologies related to the new product using the model below. I have identified five major areas that can either impede or advance the emergence of any technology.

Figure – Emerging technology canvas

 

AI was an example to illustrate a structured approach to analyzing whether an emerging technology is likely to enter the mainstream market. More generally, we can examine the potential of an emerging technology under the following five broad parameters (a small scoring sketch follows the list).

  1. Technical – Is the technology not yet mature, and what is stopping it from reaching its desired level of performance? Building AGI systems is not possible until computing systems with the required performance levels are built.
  2. Customer behavior – Does the technology require a change in customer behavior? Virtual reality does; the technology, even though exciting, expects customers to experience the world in a way they are not accustomed to.
  3. Regulation – Does the technology have to undergo any regulatory compliance, e.g., autonomous vehicles? The existence of red tape can have a negative impact on technology emergence.
  4. Economy – Is wider adoption of the technology dependent on the economic state of its target segments? Products built with emerging technology for segments at the bottom of the pyramid should give a lot of consideration to economic factors.
  5. Lack of use-cases and ecosystem – The ecosystem is critical for the wider adoption of certain technologies; e.g., autonomous vehicles depend on the presence of charging stations. The adoption of HDTV was delayed because of the absence of an ecosystem: a lack of high-definition cameras, archaic broadcast standards, and older production and post-production infrastructure delayed its emergence. Most important is the presence of killer use-cases that entice customers to migrate to the newer technology. Is the outcome delivered by the technology good enough for customers to discard the status quo and embrace new ways of doing things?
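As referenced above, here is a toy Python sketch that scores a technology against the five canvas parameters. The scores and the VR example values are hypothetical; the canvas itself is a qualitative exercise, and the sketch only shows how the five dimensions can be compared side by side to spot the weakest one.

```python
# Toy scoring sketch for the five canvas parameters (0 = major blocker, 1 = no blocker).
# The example technology and its scores are hypothetical.
CANVAS_DIMENSIONS = ("technical", "customer_behavior", "regulation",
                     "economy", "use_cases_and_ecosystem")


def readiness(scores: dict[str, float]) -> float:
    """Average readiness across the five dimensions; the weakest one is
    usually the dimension to watch."""
    return sum(scores[d] for d in CANVAS_DIMENSIONS) / len(CANVAS_DIMENSIONS)


virtual_reality_example = {
    "technical": 0.6,
    "customer_behavior": 0.3,   # asks users to experience the world differently
    "regulation": 0.9,
    "economy": 0.5,
    "use_cases_and_ecosystem": 0.4,
}

print(f"overall readiness: {readiness(virtual_reality_example):.2f}")
print(f"weakest dimension: {min(virtual_reality_example, key=virtual_reality_example.get)}")
```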

 

Connecting the dots

Technology is undeniably the protagonist of our discussion, but technology alone does not define THE FUTURE; it is not a one-way street. There is a symbiotic relationship among the evolution of technology, markets, and customers, and how they coalesce together defines THE FUTURE. Understanding the future is tantamount to anticipating:

  • Who are customers of tomorrow?
  • What are the customer needs of tomorrow?

This is done by diligently analyzing the following, not independently but in correlation with each other:

  • How do customers’ needs evolve?
  • How do technologies evolve?
  • How do markets evolve?

Customer needs, technologies, and markets do not evolve overnight; mostly they evolve at a linear pace. However, there are certain forces at play that culminate together, suddenly pushing the evolution of customer needs, technologies, and markets onto a trajectory resembling a hockey stick. Especially for high-tech products, Clayton M. Christensen has clearly outlined that when the performance of a new technology outpaces the older technology, it gains adoption. Similar to performance, the Product Manager has to identify several such factors that would result in the evolution of new markets, new technologies, and new needs, thereby bringing in a new normal that completely replaces the older way of doing things.
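One common (and admittedly simplified) way to picture the hockey stick is a logistic S-curve: adoption looks almost flat for years and then accelerates sharply once the curve passes its midpoint. The parameters below are illustrative, not fitted to any real market.

```python
import math

# Minimal logistic (S-curve) sketch of how adoption can look linear for years
# and then turn into a hockey stick. Midpoint and steepness are illustrative.

def adoption(year: float, midpoint: float = 10.0, steepness: float = 0.8) -> float:
    """Fraction of the market adopting by `year` (between 0 and 1)."""
    return 1 / (1 + math.exp(-steepness * (year - midpoint)))


for year in range(0, 21, 2):
    share = adoption(year)
    bar = "#" * int(share * 40)   # crude text chart of the curve
    print(f"year {year:2d}: {share:5.2f} {bar}")
```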

In an earlier blog post, I provided models to understand how customers’ needs evolve by categorizing a need as dormant or emergent. Comprehending the demand drivers of a need, and the factors making it feasible to address a dormant need, should help the Product Manager anticipate how the demand drivers will evolve in the future, which existing demand drivers will become extinct, which new demand drivers will surface, and which emerging technologies could deliver better outcomes. Similarly, for technology, I have provided a model to anticipate which emerging technologies will most likely reach the mainstream market and what new outcomes they could deliver. Now identify how emerging technologies and emerging needs could amalgamate to create emerging markets.

Figure – Product Canvas – Building for the future

The prospects of higher scale or newer outcomes in future are the causal effect of the interplay of how markets evolve, how technologies evolve and how needs evolve (or rather how the demand drivers of the need evolve). Product Manager has to assess how smaller changes in each of them could lead to the next significant change that defines the future.

 

Understanding causal-effect

Any transformation in customer behaviors, markets, or technology will cause a paradigm shift. Understanding cause and effect means estimating the quantum of change by thoroughly anticipating all causal factors and analyzing how and when those factors can cause the paradigm shift. No change is independent. There is always a correlation between smaller changes that coalesce into something bigger – X → Y → Z → BIG CHANGE, i.e., a paradigm shift. There are always some elements acting as catalysts that combine smaller changes (X, Y, and Z) into a more significant change. The smaller changes need not combine linearly or sequentially; the combination is often as complicated as a DNA structure. Along with identifying the smaller changes, the Product Manager should also identify the catalysts that connect those smaller changes to spur a more significant change. The evolution of technology, markets, and customer needs has lots of interconnected pieces. The Product Manager has to identify those pieces and understand what connects them in order to anticipate the more significant changes.
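A tiny sketch of this chaining idea follows, assuming a deliberately simplified model in which each smaller change contributes to the big change only when its catalyst is present. The chain, the catalysts, and the Link structure are illustrative placeholders, not a prescribed method.

```python
from dataclasses import dataclass

# Illustrative model: smaller changes chain together (X -> Y -> Z -> BIG CHANGE),
# and each link counts only when its catalyst is present.

@dataclass
class Link:
    change: str            # the smaller change
    catalyst: str          # what connects it to the next change
    catalyst_present: bool


chain = [
    Link("cheaper sensors", "component supply at consumer prices", True),
    Link("cameras in every phone", "mobile broadband rollout", True),
    Link("mass photo sharing", "social platforms reaching scale", False),
]

reached = []
for link in chain:
    if not link.catalyst_present:
        print(f"chain stalls before '{link.change}' - missing catalyst: {link.catalyst}")
        break
    reached.append(link.change)
else:
    print("all catalysts present -> the paradigm shift becomes plausible")

print("smaller changes in place so far:", " -> ".join(reached) or "none")
```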

There are two ways to do this:

  1. Bottom-up
    • Identify smaller changes and later anticipate what connects those smaller changes to coalesce into something bigger.
    • Product Manager has to be all ears and eyes to spot signs signifying smaller changes and use scenario analysis to anticipate how those smaller changes could culminate in something bigger.
  2. Top-down
    • Anticipate a potentially significant change and work backward to identify what minor changes could combine to cause it.
    • Analysts provide quality data on trends that might dominate the future. Their data can be a probable source of truth.

Analysts do a fantastic job of predicting how customer needs, technologies, and markets will evolve in the future. Nevertheless, do not be blindly swayed by the phenomenal optimism of analysts. The Product Manager can rely on analyst information, but instead of overly depending on it, the Product Manager should try to rationalize the predictions by identifying what elements or factors could cause them to come true. I attempted to rationalize what would cause virtual reality to enter the mainstream market by 2020.

Virtual reality is supposed to be an enormous market by 2020. Firstly, let us understand why virtual reality has not entered the mainstream market today.

  • Is technology not affordable? Is the technology not mature?
  • Is there not a relevant and appropriate use of virtual reality technology?
  • Has the virtual reality ecosystem not evolved completely?

Understanding what stops virtual reality from entering the mainstream market today will help the Product Manager identify the gaps that, once bridged, could propel virtual reality into the mainstream market by 2020.

  • How can someone ensure affordability of virtual reality technology?
  • How can the technology mature? Can affordability and maturity of the technology be good enough factors for adoption of the technology in B2C space?
  • What would be the appropriate use of virtual reality that can attract a mass market?
  • In which segments would there be demand for VR devices? The existence of which business drivers would create demand for virtual reality products in those market segments (particularly B2B)?

While I was building the new product (an HW appliance, back in 2013), NFV (Network Function Virtualization) was an emerging technology holding a lot of promise (especially in the Service Provider market). NFV was predicted to become a billion-dollar market within 3 to 5 years. Naturally, we wanted a piece of that growing market by building a virtual appliance. However, I attempted a rational analysis to understand the real possibility of NFV gaining significant adoption in service provider networks. We analyzed what was stopping customers from adopting virtualization in service provider networks and drew the following observations about what was preventing adoption. These observations reflect the scenario as of 2013, not as of today.

  1. Lack of use-cases
  2. The inability of virtualized products to meet performance requirements of customers
  3. Lack of products to orchestrate, manage, and load balance traffic to virtualized instances
  4. Lack of clear and tangible advantage over HW appliances

We later analyzed which factors would allow these gaps to be bridged, increasing the adoption of virtualization in service provider networks. We concluded that (3) would not remain a blocker for long: at even faint signs of customers adopting virtualized products, companies building products to address (3) would automatically burgeon. The primary aspect blocking adoption was performance. Without performance improvements that eventually allow more processing on a single CPU core, it is tough to accelerate the adoption of virtualization. Performance improvement was on an increasing trajectory, and it was merely a matter of time before it reached the desired levels. Meanwhile, we decided to use that time to focus on conceptualizing use-cases. Conceptualizing use-cases requires a discovery process with customers to understand their business environments and the challenges that virtualization can tackle. Such an exercise needs a tangible product; without anything substantial, a mere whiteboard discussion will not yield results. Therefore, we attempted to create an MVP version of the virtualized product (i.e., a software appliance running within a virtualized environment). The existence of a real product can demonstrate value while allowing customers to experiment with it to derive real use-cases.

Let us look at another scenario: there is expected to be demand for luxury cars in China with a higher growth rate over the next five years.

  • Why do we foresee an increase in demand for high-end automobiles in the next five years? Probably an increase in per-capita income or the emergence of a new breed of customers into a higher income group.
  • Why do we foresee that per-capita income will go higher? Probably a change in economic policies will spur growth and increase the number of high-net-worth individuals.
  • Why do we anticipate that economic policies will spur growth? Probably a new government policy to create investment in infrastructure and increase exports through manufacturing.

Even though analysts will outline when technologies such as Big Data, IoT, self-driving cars, and virtual reality will reach the mainstream market, the Product Manager should independently assess how those technologies will attain mainstream adoption and in which market segments. The idea is to determine what factors will make the technology affordable and usable and which customer segments will contribute to demand for those technologies. Accordingly, Product Managers can evolve their products to capture a majority of the predicted growth. A Product Manager should always be inquisitive and curious, continually asking ‘WHY?’ with an insatiable quest to unravel the mysterious future of markets, technologies, and customers.

The ultimate goal is to create a mental map of all possible futures and then identify the factors that determine the likelihood of each of those possibilities. The primary responsibility of the Product Manager is to narrow the possibilities down to the one most likely to occur, based on identifying the corresponding causal factors that are most likely to materialize. The fundamental idea is that we should not leave anything to chance.

 

Design thinking and customer insights

Customers are not future thinkers. It is not possible for them to imagine a world that does not exist yet. Even if asked to do so, existing business challenges, technology constraints, and biases from existing products will hamper their thinking. Nothing is worse than relying on customers to understand how their business challenges will evolve in the future. It is the responsibility of Product Managers to obtain such profound insights into their customers that they can foresee what their customers might require well before the customers do. Developing customer insights is like unearthing those profound truths about customers that customers themselves might not have acknowledged directly.

The Product Manager has to build customer insights by observing customers in their natural habitat, immersing in their business, and assimilating their business processes, problems, and challenges. The Product Manager should also build insights by not just listening to what customers say but reading between the lines to understand what they did not say. Customers might not be able to articulate the business challenges they will face in the future. Based on trends affecting the product and a general understanding of the customers’ business environment, the Product Manager should anticipate the customers’ requirements of tomorrow and ensure that the new product will optimally address them. The Product Manager does so by looking outside the boundaries of existing customers and trying to establish a generalized view of how the market evolves because of changes in external factors influencing technologies and customer behaviors, leveraging the ‘product canvas – building for the future.’ Any attempt to build the canvas without gaining customer insights is futile, and there is no better way to obtain customer insights than embracing design thinking.

Please drop your thoughts or experiences on how you managed to build enterprise products for an unknown future.

[1] Source: http://waitbutwhy.com/2015/01/artificial-intelligence-revolution-1.html?utm_source=share&utm_medium=twitter&utm_campaign=sm_share

Appreciate your thoughts or opinions