Part of the reason for the rise is that buyers are reaching for larger targets. Kony projects its topline will grow to $120m in 2020. Quickbase, Nintex and Mendix were all nearing or above $100m in their recent sales.
Source: High on low-code
This is a draft excerpt from a book I’m working on, tentatively titled The Business Bottleneck. If you’re interested in the footnotes, leaving a comment, and the further evolution of the book, check out the Google Doc for it. Also, the previous excerpt, “The Finance Bottleneck.”
Digital transformation is a fancy term for customer innovation and operational excellence that drive financial results. John Rymer & Jeffrey Hammond, Forrester, Feb 2019.
The traditional approach to corporate strategy is a poor fit for this new type of digital-driven business and software development. Having worked in corporate strategy, I find that fitting its function to an innovation-led business is difficult. If strategy is done on annual cycles, predicting and prescribing what the business should be doing over the next 12 months, it's a poor match for the weekly learning you get from a small batch process. Traditionally, strategy defines where a company focuses: which market, which part of the market, what types of products, how products are sold, and, therefore, how money should be allocated. The strategy group also suggests mergers and acquisitions, M&A, that can improve their plans and the existing business. If you think of a company as a portfolio of businesses, the strategy group is constantly assessing each business in that portfolio to figure out whether the company should buy, sell, or hold.
The dominant strategy we care about here goes under the name “digital transformation.” Sort of. The idea that you should use software as a way of doing business isn’t new. A strategy group might define new markets and channels where software can be used: all those retail omnichannel combinations, new partnerships in open banking APIs, and new products. They also might suggest businesses to shut down, or, more likely divest to other companies and private equity firms, but that’s one of the less spoken about parts of strategy: no one likes the hand that pulls the guillotine cord.
First, pardon a bit of strategy-splaining. Having a model of what strategy is, however, is a helpful baseline for discussing how strategy needs to change to realize all these “digital transformation” dreams. Also, I find that few people have a good grasp of what strategy is, or of what I think it should be.
I like to think of all “markets” as flows of cash, big tubes full of money going from point A to point B. For the most part, this is money from a buyer’s wallet flowing to a merchant. A good strategy figures out how to grab as much of that cash as possible, either by being the end-point (the merchant), reducing costs (the buyer), or doing a person-in-the-middle attack to grab some of that cash. That cash grabbing is often called “participating in the market.”
When it comes to defining new directions companies can take, “payments” is a good example. We all participate in that market. Payments is one of the more precise names for a market: tools people use to, well, pay for things.
First, you need to wrap your head around the payments industry. This largely means looking at cashless transactions, because using cash requires no payment tool. “Most transactions around the world are still conducted in cash,” The Economist explains, “However, its share is falling rapidly, from 89% in 2013 to 77% [in 2019].” There’s still a lot of cash used, oddly, in the US, but that’s changing quickly, especially in Asia. In China, for example, The Economist goes on, “digital payments rose from 4% of all payments in 2012 to 34% in 2017.” That’s a lot of cash shifting and now shooting through the payments tube. So, let’s agree that “payments” is a growing, important market that we’d like to “participate” in.
There are two basic participants here:
“Strategy,” then, is (1.) deciding to participate in these markets, and, (2.) the exact way these companies should participate, how they grab money from those tubes of cash. Defining and nailing strategy, then, is key to success and survival. For example, an estimated 3.3 trillion dollars flowed through the credit card tube of money in 2016. As new ways of processing payments gain share, they grab more and more from that huge tube of cash. Clearly, this threatens the existing credit card companies, all of whom are coming up with new ways to defend their existing businesses and new payment methods.
As an example of a general strategy for incumbents, a recent McKinsey report on payments concludes:
The pace of digital disruption is accelerating across all components of the GTB value chain, placing traditional business models at risk. If they fail to pursue these disruptive technologies, banks could become laggards servicing less lucrative portions of the value chain as digital attackers address the friction points. To avoid this fate, banks must embrace digitized transaction banking with a goal of eliminating discrepancies, simplifying payments reconciliation, and streamlining infrastructure to operate profitably at lower price points. They must take proactive strategic steps to leverage their current favorable market position, or watch new market entrants pass them by.
Now, how you actually put all that into practice is what strategy is. Each company and industry has its own peccadilloes. The reason McKinsey puts out all those fine charts is to do the pre-sales work of getting companies to invite them in and ask, “yes, but how?”
We came to the realization that, ultimately, we are a technology company operating in the financial-services business. So, we asked ourselves where we could learn about being a best-in-class technology company. The answer was not other banks, but real tech firms.
This type of thinking has gone on for years, but change in large organizations has been glacial. If you search for the phrase “digital transformation” you’ll find daily sponsored posts on tech news sites preaching this, as they so often say, “imperative.” They’re long on blood-curdling pronouncements and short on explaining what to actually do.
We’re all tired of this facile, digital genuflection. But maybe it’s still needed.
If surveys and sentiment are any indication, digital strategies are not being rolled out broadly across organizations, as one survey, below, suggests. It shows that the part of the business that creates the actual thing being sold, product design and development, is being neglected:
As with all averages, this means that half of the firms are doing better…and half of them worse. Curiously, IT is getting most of the attention here: as I say, the IT bottleneck is fixed. My anecdotes-as-data studies match up with the attention customer service is getting: as many of my examples here show, like the Orange one, early digital transformation applications focus on moving people from call centers to apps. And, indeed, “improving customer experience” is one of the top goals of most app work I see.
But it drops off after that. There’s plenty of room for improvement and much work to be done by strategy groups to direct and decide digital strategy. Let’s look at a two-part toolkit for how they might do it:
Changing enterprise strategy is costly and risky. Done too early, and you deliver perfectly on a vision but are unable to scale to more customers: the mainstream is not yet “ready.” Done too late, and you’re in a battle to win back customers, often with price cutting death spirals and comically disingenuous brand changes: you don’t have time for actual business innovation, so you put lipstick on discount pigs.
An innovation strategy relies on knowing the right time to enter the market. You need a strategy tool to continually sense and time the market. Like all useful strategy tools, it not only tells you when to change, but also when to stay the same, and how to prioritize funding and action. Based on our experience in the technology industry, we suggest starting with a simple model based on numerous tech market disruptions and market shifts. This model is Horace Dediu’s analysis of the post-2007 PC market. 2007, of course, is the year the iPhone was introduced. I’m not sure what to call it, but the lack of a label doesn’t detract from its utility. Let’s call it The Dediu Cliff:
To detect when a market is shifting, Dediu’s model emphasizes looking beyond your current definition of your market. In the PC market, this meant looking at mobile devices in addition to desktops and laptops. Microsoft Windows and x86 manufacturers had long locked down the definition and structure of the PC market. Analyst firms like IDC tracked the market based on that definition and attempted disruptors like Linux desktop aspirants competed on those terms.
When the iPhone and Android were introduced in 2007, the definition of the PC market changed without much of anyone noticing. In a short 10 years, these “phones” came to dominate the “PC” market by all measures that mattered: time spent staring at the screen, profits, share increases, corporate stability and high growth, and customer joy. Meanwhile, traditional PCs were seen mostly as workhorses, commodities like pens and copy machines bought on refresh cycles with little regard to differentiation.
Making your own charts will often require some art. For example, another way to look at the PC market changing is to look at screen time per device, that is, how much time people spend on each device:
You have to find the type of data that fits your industry and the types of trends you’re looking to base strategy on. Those trends could be core assumptions that drive how your daily business functions. For example, many insurance businesses are still based on talking with an agent. So, in the insurance industry, you might chart online vs. offline browsing and buying:
While more gradual than Dediu’s PC market chart, this slope will still allow you to track trends. Clearly, some companies aren’t paying attention to that cliff: as the Gartner L2 research goes on to say, once people look to go from quote to purchase, only 38% of insurance companies allow for that purchase online.
Gaining this understanding of shifts in the very definition of your market is key. Ideally, you want to create the shift. If not, you want to enter the market once the shift is validated, as early as possible, even if the new entrant has single digit market share. Deploying your corporate resources (time, attention, and money) often takes multiple years despite the “overnight success” myths of startups.
Timing is everything. Nailing that, per industry, is fraught, especially in highly regulated industries like banking, insurance, pharmaceuticals, and other markets that can use regulations to, uh, artificially bolster barriers to entry. Don’t think that high barriers to entry will save you, though: Netflix managed to wreak havoc in the cable industry, pushing top telcos even further into being dumb pipes and moving them to massive content acquisitions to compete.
I suggest the following general tactics to keep from falling off The Dediu Cliff:
We’ll take a look at each of these, and then expand on how the third is generalized into your core innovation function.
Measuring what your customer thinks about you is difficult. Metrics like NPS and churn give you trailing indicators of satisfaction, but they won’t tell you when your customers’ expectations, and, thus, the market, are changing.
You need to understand how your customers spend their time and money, and what “problems” they’re “solving” each day. For most strategy groups, getting this hands-on is too expensive and not in their skill set. Frameworks like Jobs to Be Done and customer journey mapping can systematize this research. And, as we’ll see below, using a small batch process to implement your applications lets you direct strategy by observing what your customers actually do day-to-day as they interact with your business.
In the ever challenging retail world, The Home Depot has managed to prosper by knowing their customer in detail. The company’s omnichannel strategy provides an example. Customers expect “omnichannel” options in retail, the ability to order products online, buy them in-store, order online but pick-up in-store, return items from online in-store…you get the idea. Accomplishing all of those tasks seems simple from the outside, but integrating all of those inventory, supply-chain, and payment systems is extremely difficult. Nonetheless, as Forrester has documented, The Home Depot’s concerted, hard fought work to get better at software is delivering on their omnichannel strategy: “[a]s of fiscal year 2018, The Home Depot customers pick up approximately 50% of all online orders in the store” and a 28% growth in online sales.
Advances in this business have been fueled by intimate knowledge of The Home Depot’s customers and in-store staff, gained by actually observing and talking with them. “Every week, my product and design teams are in people’s homes or [at] customer job sites, where we are bringing in a lot of real-time insights from the customers,” Prat Vemana, The Home Depot’s Chief Digital Officer, said at the time.
The company focuses on customer journeys, the full, end-to-end process of customers thinking of, researching, browsing, acquiring, installing, and then using a product. For example, to hone in on improving the experience of buying appliances, the product team working on this application spent hours in stores studying how customers bought appliances. They also spent time with customers at home to see how they browsed appliance options. And the team traveled with delivery drivers to see how the appliances are installed.
Here, we see a company getting to know their customer and their problems intimately. This leads to new insights and opportunities to improve the buying experience. In the appliances example, the team learned that customers often wanted to see the actual appliance and would waste time trying to figure out how they could see it in person. So, the team added a feature to show which stores had the appliances they were interested in, thus keeping the customer engaged and moving them along the sales process.
Spanning all these parts of the customer journey gives the team research-driven insights into how to deliver on The Home Depot’s omnichannel strategy. As customers increasingly start research on their phone or in social media, go in-store to browse, order online, pick up in-store, have items delivered, and so forth, many industries are figuring out their own types of omnichannel strategies.
All of those different combinations and changing options will be a fog to strategy groups unless they start to get to know their customers better. As Allianz’s Firuzan Iscan puts it: “When we think from the customer perspective, most of our customers are hybrid customers. They are starting in online, and they prefer an offline purchasing experience. So that’s why when we consider the journey end to end, we need to always take care of online and offline moments of this journey. We cannot just focus on online or offline.”
The level of study done at The Home Depot may seem absurd for the strategy team to do. Getting out of the office may seem like a lot of effort, but the days spent doing it will give you a deep, ongoing understanding of what your customers are doing, how you’re fulfilling their needs, and how you can better their overall journey with you to keep their loyalty and sell more to them. Also, it’s a good excuse to get out of beige cubicle farms and dreary conference rooms. Maybe you can even expense some lunches!
As we’ll see, when the product teams building these applications are put in place, strategy teams will have a rich source of this customer information. In the meantime, if you’re working on strategy, you’d be wise to fill that gap however you can. We’ll discuss one method next, listening to those people yelling and screaming doom and disruption.
In Western mythos, Cassandra was cursed to always have 100% accurate prophecies but never be believed. For those of us in the tech industry, cloud computing birthed many Cassandras. Now, in 2019, the success of public cloud is indisputable. The on-premises market for hardware and software is forever changed. Few believed that a “bookseller” would do much here, or that Microsoft could reinvent itself as an infrastructure provider, turning around a company that was easily dismissed in the post-iPhone era.
Despite this, as far back as 2007, early Cassandras were pointing out that software developers were using AWS in increasing numbers. Early on, RedMonk made the case that developers were the kingmakers of enterprise IT spend. And, if you tracked developer choice, you’d see that developers were choosing cloud. More Cassandras emerged over the years as cloud market share grew. Traditional companies heard these Cassandras, some eventually acting on the prophecies.
Finally, traditional companies took the threat seriously, but as Charles Fitzgerald wickedly chronicled, it was too late. As his chart above shows, entering the public cloud market at this stage would cost hundreds of billions of dollars, each year, to catch up. The traditional companies in the infrastructure market failed to sense and act on The Cliff early enough – and these were tech companies, those outfits that are supposed to outmaneuver and outsmart the market!
Now, don’t take this to mean that these barriers to entry are insurmountable. Historically, almost every tech leader has been disrupted. That’s what happened in this market. There’s no reason to think that cloud providers are immune. We just don’t know when and how they’ll succumb to new competitors or, like Microsoft, have to reinvent themselves. What’s important, rather, is for these companies to properly sense and respond to that threat.
There’s similar, though, rearview mirror oriented, stories in many industries. TK( listing or summarizing one in a non-tech company would sure be cool here ).
To consider Cassandras, you need a disciplined process that looks at year over year trends, primarily how your customers spend their time and money. Mary Meeker’s annual slide buffet is a good example: where are your customers spending their time? RedMonk’s analysis of developers is another example. A single point in time Cassandra is not helpful, but a Cassandra that reports at regular intervals gives you a good read on momentum and when your market shifts.
Finally, putting together your own Dediu Cliff can self-Cassandraize you. Doing this can be tricky as you need to imagine what your market will look like – or several scenarios. You’ll need to combine multiple market share numbers from industry analysts into a Cliff chart, updating it quarterly. Having managed such a chart, I can say it’s exhilarating (especially if someone else does the tedious work!) but can be disheartening when quarter by quarter you’re filed into an email inbox labeled “Cassandras.”
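If your market-share numbers live in analyst spreadsheets, assembling the Cliff chart itself is only a few lines of code. Here’s a minimal sketch in Python, with entirely made-up shipment numbers: it recomputes the incumbent category’s share under a redefined market (the old category plus the new entrant’s category), quarter by quarter, producing the declining line you’d plot and update:

```python
# Sketch: a "Dediu Cliff" view built by redefining the market.
# All numbers below are illustrative, not real market data.

quarters = ["2013Q1", "2013Q2", "2013Q3", "2013Q4"]

# Unit shipments (millions) under the *old* market definition...
pc_units = [80, 78, 79, 82]          # desktops + laptops
# ...and for the category that redefines the market.
mobile_units = [350, 380, 410, 450]  # phones + tablets

def combined_share(incumbent, entrant):
    """Incumbent's share of the *redefined* market, per quarter."""
    return [round(i / (i + e), 3) for i, e in zip(incumbent, entrant)]

pc_share = combined_share(pc_units, mobile_units)
for q, s in zip(quarters, pc_share):
    print(q, f"{s:.1%}")  # the cliff: share falls quarter over quarter
```

Swap in real analyst figures each quarter; the signal worth watching is the slope of that share line across several scenarios of market definition, not any single data point.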
Thus far, our methods for sensing the market have been research methods, even “assume no friction” methods. Let’s look at the final method, one that relies on actually doing work, and then at how it expands into the core of the new type of strategy, breaking The Business Bottleneck.
The best way to understand and call market shifts is to actually be in the market, both as a customer and a producer. Being a customer might be difficult if you’re, for example, manufacturing tractors, but for many businesses being a customer is possible. It means more than trying your competitor’s products. To the point of tracking market redefinition, you want to focus on the Jobs to Be Done, problems customers are solving, and try new ways of solving those problems. If this sounds like it’s getting close to the end goal of innovation, it’s because it is: but doing it in a smaller, lower cost and lower risk way.
For example, if you’re in the utility business, become a customer of in-home IoT devices and learn how that technology can be used to steal your customer relationship, further pushing your business into a commodity position. In the PC market, some executives at PC companies made it a point of pride to never have tried, or “understood” the appeal of, small screens – that kind of willful, proud ignorance isn’t helpful when you’re trying to be innovative.
You need to know the benefits of new technologies, but also the suffering your products cause. There’s a story that management at US car manufacturers were typically given a company car and free mechanical service during the day while their cars were parked in the company lot. As a consequence, they didn’t know first-hand how low quality affected the cars. As Nassim Taleb would put it, they didn’t have any skin in the game…and they lost the game. Regularly put your skin in the game: rent a car, file an insurance claim, fill out your own expenses, travel in coach, and eat at your in-store delis.
Key to trying new things is being curious, not only in finding these things, but in thinking up new products to improve on them and solve the problems you are, now, experiencing first-hand.
The goal of trying new things is to experiment with new products, using them to direct your strategy and way of doing business. If you have the capability to test new products, you can systematically sense changes in market definition. Tech companies regularly float new ideas as test products to sense customer appetite and, thus, market redefinitions. If you’ve ever used an alpha, beta, or invite-only app, you’ve played a part in this process. These are experiments, ways the company tries new things. We laud companies like Google for their innovation successes, but we easily forget the long list of failed experiments. The website killedbygoogle.com catalogs 171 products that Google killed. Not all of these were “experiments”; some were long-running products that were killed off. Nonetheless, once Google sensed that an experiment wasn’t viable or a product no longer valid, it killed the thing and moved on.
When it comes to trying things, we must be very careful about the semantics of “failure.” Usually, “failure” is bad, but when it comes to trying new things, “failure” is better thought of as “learning.” When you fail at something, you’ve learned something that doesn’t work. Feeling your way through foggy, frenetic market shifts requires tireless learning. So, in fact, “failing” is often the fastest way to success. You just need a safe, disciplined system to continually learn.
Innovation requires failure. There are few guarantees that all that failure will lead to success, but without trying new things, you’ll never succeed at creating new businesses and preventing disruption. Historically, the problem with strategy has been the long feedback cycles required to tell you if your strategy “worked.”
First, budgets are allocated annually, meaning your strategy cycle is annual as well. Worse, to front-load the budget cycle, you need to figure out your strategy even earlier. Most of the time, this means the genesis of your current strategy was two, even three years ago. The innovation and business rollout cycles at most organizations are huge. TK( some long roll out figure). It can be even worse: five years, if not ten, in many military projects. Clearly, in “fast moving markets,” to use the cliché, that kind of idea-to-market timespan is damaging. Competing against companies that have shorter loops is key for organizations now. As one pharmacy executive put it, taking six months to release competitive features isn’t much use if Amazon can release them in two months.
Your first instinct might be to start trying many new things, creating an incubation program as a type of beta-factory of your own. The intention is good, but the risks and costs are too high for most large organizations. Learning-as-failure is expensive and can look downright stupid and irresponsible to shareholders. Instead, you need a less costly, lower-risk way to fail than throwing a bunch of things at the wall and seeing what sticks.
Many organizations use what we’ll call the small batch cycle. This is a feedback loop that relies on four simple steps:
This is, essentially, the scientific method. The lean startup method and, later, lean design has adapted this model to software development. This same loop can be applied “above the code” to strategy. This is how you can use failure-as-learning to create validated strategy and, then, start innovating like a tech company.
As described above, due to long cycles, most corporate strategy is theoretical, at worst PowerPoint arts and crafts with cut-and-pasting from a few web searches. The implementation details can become dicey, and then there’s seeing if customers will actually buy and use the product. In short, until the first customer buys and uses the “strategy,” you’re carrying the risk of wasting all your budget and time on it, often a year or more.
That risk might pay off, or it might not. Not knowing either way is why it’s a risk. A type of corporate “double up to catch up” mentality adds to the risk as well. Because the timeline is so long, the budget so high, and the risk of failure so large, managers will often seek the biggest bang possible to make the business case’s ROI “work.” Taking on a year’s time and $10m budget must have a significant pay off. But with such high expectations, the risk increases because more must be done, and done well. And yet, the potential downside is even higher as well.
This risky mentality has been unavoidable in business for the most part – building factories, laying phone lines, manufacturing, etc. require all sorts of up-front spending and planning. Now, however, when your business relies on software, you can avoid these constraints and better control the risks. Done well, software costs relatively little and is incredibly malleable. It’s, as they say, “agile.” You just need to connect the agile nature of software to strategy. Let’s look at an example.
As an energy company, Duke Energy has plenty of strategizing to do around issues like disintermediation from IoT devices, deregulation, power needs for electric vehicles, and improving customer experience and energy conservation. Duke has a couple years of experience being cloud-native, getting far enough along to open up an 83,000-square-foot labs building housing 400 employees working in product teams.
They’re applying the mechanics of small batches and agile software to their strategy creation. “Journey teams” are used to test out strategies before going through the full-blown, annual planning process. “They’re small product-type teams led by design thinkers that help them really map out that new [strategic] journey and then identify [what] are the big assumptions,” Duke’s John Mitchell explained. Once identified, the journey teams test those assumptions, quickly proving or disproving the strategy’s viability.
Mitchell gives a recent example: labor is a huge part of the operating costs for a nuclear power plant, so optimizing how employees spend their time can increase profits and reduce the time it takes to address issues. For safety and compliance reasons, employees work in teams of five on each job in the plant, typically scheduled in hour-long blocks. Often, the teams finish in much less than an hour, creating spare capacity that could be used on another job.
If Duke could more quickly, in near real-time, move those teams to new jobs they could optimize each person’s time. “So the idea was, ‘How can we use technology?’” Mitchell explains. “What if we had an RFID chip on all of our workers? Not to ‘Big Brother’ check in on them,” he quickly clarifies, but to better allocate the spare capacity of thousands of people. Sounds promising, for sure.
Not so fast though, Mitchell says: “You need to validate, will that [approach] work? Will RFID actually work in the plant?” In a traditional strategy cycle, he goes on, “[You’d] order a thousand of these things, assuming the idea was good.” Instead, Duke took a validated strategy approach. As Mitchell says, they instead thought, “let’s order one, let’s take it out there and see if it actually works in plant environment.” And, more importantly, can you actually put in place the networking and software needed: “Can we get the data back in real time? What do we do with data?” The journey team tested out the core strategic theories before the company invested time and money into a longer-term project and set of risks.
Key to all this, of course, is putting these journey teams in place and making sure they have the tools needed to safely and realistically test out these prototypes. “[T]he journey team would have enough, you know, a very small amount of support from a software engineer and designer to do a prototype,” Mitchell explains. “[H]opefully, a lot of the assumptions can be validated by going out and talking to people,” he goes on, “and, in some cases there’s a prototype to be taken out and validated. And, again, it’s not a paper prototype—unless you can get away with it—[it’s] working software.”
Once the strategic assumptions are validated (or invalidated), the entire company has a lot more confidence in the corporate strategy. “Once they … validate [the strategy],” Mitchell explains, “you’ve convinced me—the leader, board, whatever—that you know [what] you’re talking about.”
With software, as I laid out in Monolithic Transformation, the key ways to execute the loop are short release cycles, smaller amounts of code in each release, and the infrastructure capabilities to reliably reverse changes and maintain stability if things go wrong.
These IT changes lead directly to positive business outcomes. Using a small batch cycle increases the design quality and cost savings of application design, directly improving your business. First, the shorter, more empirical, customer-centered cycles mean you better match what your customers actually want to do with your software. Second, because your software’s features are driven by what customers actually do, you avoid overspending on your software by putting in more features than are actually needed.
For example, The Home Depot kept close to customers and “found that by testing with users early in the first two months, it could save six months of development time on features and functionality that customers wouldn’t use.” That’s four months of net time and money saved, and software whose functionality better matches what customers want.
As you mature, these capabilities lead to even wider abilities to experiment with new features like A/B testing, further honing down the best way to match what your software does to how your customers want to use it, and, thus, engage with your business. TK( quick example would be nice here ).
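As a sketch of what an A/B readout involves, here’s a minimal two-proportion z-test in Python. The conversion counts are invented for illustration; real A/B tooling layers randomization, sample-size planning, and guardrail metrics on top of this basic arithmetic:

```python
# Sketch: a minimal A/B test readout for a feature experiment.
# Conversion counts below are made up for illustration.
import math

def ab_test(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: does variant B's conversion rate
    differ from A's at the 95% confidence level?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)          # pooled rate
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    return p_a, p_b, z, abs(z) > 1.96                 # 1.96 ≈ 95% two-sided

p_a, p_b, z, significant = ab_test(conv_a=120, n_a=2400, conv_b=156, n_b=2400)
print(f"A: {p_a:.1%}  B: {p_b:.1%}  z={z:.2f}  significant={significant}")
```

With these hypothetical numbers, variant B converts at 6.5% versus A’s 5.0%, and the test flags the difference as significant, which is the kind of small, fast evidence that feeds back into strategy.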
Software is the reason we call tech companies tech companies. They rely on software to run, even to define, their business. Thus, it’s TK( maybe? ) software strategy that we need to look at next.
All businesses have one core strategy: to stay alive. They do this by constantly offering new reasons for people to buy from them and, crucially, stay with them. Over the last decade, traditional businesses have been freaked by competitors that are figuring out better offerings and stealing those customers. The super-clever among these competitors innovate entirely new business models: hourly car rentals, next day delivery, short term insurance for jackets, paying for that jacket with your phone, banks with only your iPhone as a branch, incorporating real-time weather information into your reinsurance risk analysis.
In the majority (maybe all) of these cases, surviving and innovating is done well with small business and software development cycles. The two work hand-in-hand and are ineffective without each other. I’d urge you to think of them as the same thing. Instead of using PowerPoint and Machiavellian meeting tactics as their tools, business development and strategy now use software.
You innovate by systematically failing weekly, over and over, until you find the thing people will buy and the best way to deliver it. We’ve known this for a long time and enshrined it in processes like The Lean Startup, Jobs to Be Done, agile development and DevOps, and disruption theory. While these processes are known and proven, they’ve hit several bottlenecks in the rest of the organization. In the past, we had IT bottlenecks. Now we have what I’ve been thinking of as The Business Bottleneck. There are several of them. Let’s start by looking at the first, and thus most pressingly damaging, one: the bottleneck that cuts off business health and innovation before it even starts, finance.
Most software development finance is done wrong and damages the business. Finance seeks to be accurate and predictable, and it works on annual cycles. That is not at all what business and software development are like.
Software development is a chaotic, unpredictable activity. We’ve known this for decades but we willfully ignore it, like the advice to floss each day. Mark Schwartz has a clever take on the Standish software project failure reports. Since the numbers in these reports stay the same each year, basically, the chart below shows that software is difficult and that we’re not getting much better at it:
What this implies, though, is something even more wickedly true: it’s not that these projects failed, it’s that we had false hopes. In fact, the red and yellow in the original chart show that software performs consistently with its true nature. Let me rework the chart to show this:
What this second version illustrates is that the time and budget it takes to get software right can’t be predicted with any useful accuracy. The only useful prediction is that you’ll be wrong. We call it software engineering, or even more accurately “development,” because it’s not scientific. Science seeks to describe reality, to be precise and correct, to discover truths that can be repeated. Software isn’t like that at all. There’s little science to what software organizations do; there’s just the engineering mentality of doing what works with the time and budget we have.
What’s more, business development is chaotic as well. Who knows what new business idea, what exact feature will work and be valuable to customers? Worse, there is no science behind business innovation – it’s all trial and error, constantly trying to both sense and shape what people and businesses will buy and at what price. Add in competitors doing the same, suppliers gasping for air in their own chaos quicksand, governments regulating, and culture changing people’s tastes, and it’s all a swirling cipher.
In each case, the only hope is rigorously using a system of exploration and refining. In business, you can study all the charts and McKinsey PDFs you want, but until you actually experiment by putting a product out there, seeing what demand and pricing is, and how your competitors will respond, you know nothing. The same is true for software.
Each domain has tools for this exploration. I’m less familiar with business development, and only know the Jobs to Be Done tool. This tool studies customer behaviors to discover what products they actually will spend money on, to find the “job” they hire your company to solve, and then change the business to profit from that knowledge.
The discovery cycle in software follows a simple recipe: you reduce your release cycle down to a week and use a theory-driven design process to constantly explore and react to customer preferences. You’re looking to find the best way to implement a specific feature in the UI to maximize revenue and customer satisfaction. That is, to achieve whatever “business value” you’re after. It has many names and diagrams, but I call this process the “small batch cycle.”
For example, Orange used this cycle when perfecting its customer billing app. Orange wanted to reduce traffic to call centers, lowering costs but also driving up customer satisfaction (who wants to call a call center?). By following a small batch cycle, the company found that its customers only wanted to see the last two months’ worth of bills and their current data usage. That drove 50% of the customer base to use the app, reducing reliance on costly call centers and improving customer satisfaction.
These business and software tools start with the actual customers, the people doing the buying, and use these people as the raw materials and lab to run experiments. The results of these experiments are used to validate, or more often invalidate, theories of what the business should be and do. That’s a whole other story, and the subject of my previous book, Monolithic Transformation.
We were going to talk about finance, though, weren’t we?
Finance likes certainty – forecasts, plans, commits, and smooth lines. But if you’re working in the chaos of business and software development, you can’t commit to much. The only certainty is that you’ll know something valuable once you get out there and experiment. At first all you’ll learn is that your idea was wrong. In this process, failure is as valuable as success. Knowing what doesn’t work, a failure, is the path to finding what does work, a success. You keep trying new things until you find success. To finish the absurd truth: failure creates success.
Software organizations can reliably deliver this type of learning each week. The same is true for business development. We’ve known this for decades, and many organizations have used it as their core differentiation engine.
But finance doesn’t work in these clever terms. “What the hell do you mean ‘failure creates success’? How do I put that in a spreadsheet?” we can hear the SVP of Finance saying, “Get the hell out of this conference room. You’re insane.”
Instead, when it comes to software development, finance focuses only on costs. These are easy to know: the costs of staff, the costs of their tools, and the costs of the data centers to run their software. Business development has similarly easy-to-know costs: salary, tools, travel, etc.
When you’re developing new businesses and software, it’s impossible to know the most important number: revenue. Without that number, knowing if costs are good or bad is difficult. You can estimate revenue and, more likely, you can wish-timate it. You can declare that you’re going to have 10% of your total addressable market (TAM). You can just declare – ahem, assume – that you’re chasing a $9bn market opportunity. Over time, once you’ve discovered and developed your business, you can start to use models like consumer spending vs. GDP growth, or the effect of weather and political instability on the global reinsurance market. And, sure, that works as a static model so long as nothing ever changes in your industry.
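To illustrate why a “wish-timate” feels rigorous but isn’t, here’s the entire calculation behind a typical top-down revenue claim. Every input is declared rather than measured (the numbers below are invented, echoing the $9bn / 10% figures in the text), which is exactly the problem:

```python
# A "wish-timate": declare a market size and a share, then multiply.
# Both inputs are assumptions, not measurements.
tam = 9_000_000_000    # "$9bn market opportunity" -- declared, not discovered
assumed_share = 0.10   # "we'll capture 10% of the TAM" -- also declared
revenue_estimate = tam * assumed_share
print(f"${revenue_estimate / 1e6:,.0f}m")  # $900m: precise-looking, but baseless
```

The output has three significant figures and zero evidence behind it, which is why small-batch discovery, not spreadsheet precision, is the only way to firm the number up.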
For software development, things are even worse when it comes to revenue. No one really tells IT what the revenue targets are. When IT is asked to make budgets, it’s rarely involved in revenue planning, nor given revenue targets. Of course, as laid out here, these targets in new businesses can’t be known with much precision. This pushes IT to just focus on costs. The problem here, as Mark Schwartz points out in all of his books, is that cost is meaningless if you don’t know the “value” you’re trying to achieve. You might try to do something “cheaply,” but without the context of revenue, you have no idea what “cheap” is. If the business ends up making $15m, is $1m cheap? If it ends up making $180m, is $5m cheap? Would it have been better to spend $10m if it meant $50m more in revenue?
IT is rarely involved in the strategic conversations that narrow down to a revenue number. Nor is it in meetings about the more useful, but abstract, notion of “business value.” So, IT is left with just one number to work with: cost. This means it focuses on getting “a good buy” regardless of what’s being bought. Eventually, this just means cutting costs, building up a “debt” of work that should have been done but was “too expensive” at the time. This creates slow-moving, or completely stalled out, IT.
A rental car company can’t introduce hourly rentals because the back office systems are a mess and take 12 months to modify – but, boy, you sure got a good buy! A reinsurance company can’t integrate daily weather reports into its analytics to reassess its risk profile and adjust its portfolio because the connection between simple weather APIs and rock-solid mainframe processing is slow – but, sister, we sure did get a good buy on those MIPS! A bank can’t be the first in its market to add Apple Pay support because the payments processing system takes a year to integrate with, not to mention the governance changes needed to work with a new clearinghouse, and then there’s fraud detection – but, hoss, we reduced IT costs by $5m last year – another great buy!
Worse than shooting yourself in the foot is having someone else shoot you in the foot. As one pharmacy executive put it, taking six months to release competitive features isn’t much use if Amazon can release them in two months. But, hey! Our software development processes cost a third less than the industry averages!
Business development is the same, just with different tools and people who wear wing-tips instead of toe-shoes. Hopefully you’re realizing that the distinction between business and software development is unhelpful – they’re the same thing.
So, when finance tries to assign a revenue number, it will be wrong. When you’re innovating, you can’t know that number, and IT certainly isn’t going to know it. No one knows the business value that you’re going to create: you have to first discover it, and then figure out how to deliver it profitably.
As is well known, the problem here is the long cycle that finance follows: at least a year. At that scope, the prediction, discovery, and certainty cycle is sloppy. You learn only once a year, maybe with indicators each quarter of how it’s going. But, you don’t really adjust the finance numbers: they don’t get smarter, more accurate, as you learn more each week. It’s not like you can go get board approval each week for the new numbers. It takes two weeks just to get the colors and alignment of all those slides right. And all that pre-wiring – don’t even get me started!
In business and software development, each week when you release your software you get smarter. While we could tag shipping containers with RFID tags to track them more accurately, we learn that we can’t actually collect and use that data – instead, it’s more practical to have people just enter the tracking information at each port, which means the software needs to be really good. People don’t actually want to use those expensive to create and maintain infotainment screens in cars, they want to use their phones – cars are just really large iPhone accessories. When buying a dishwasher, customers actually want to come to your store to touch and feel them, but first they want to do all their research ahead of time, and then buy the dishwasher on an app in the store instead of talking with a clerk.
These kinds of results seem obvious in hindsight, but business development people failed their way to those successes. And, as you can imagine, strategy and finance assumptions made 12 to 18 months ago that drove business cases often seem comical in hindsight.
A smaller cycle means you can fail faster, getting smarter each time. For finance, this means frequently adjusting the numbers instead of sticking to the annual estimates. Your numbers get better, more accurate over time. The goal is to make the numbers adjust to reality as you discover it, as you fail your way to success, getting a better idea of what customers want, what they’ll pay, and how you can defend against competition.
Some companies are lucky to just ignore finance and business models. They burn venture capital funding as fuel to rocket towards stability and profitability. Uber is a big test of this model – will it become a viable business model (profitable), or will it turn out that all that VC money was just subsidizing a bad business model? Amazon is a positive example here, over the past 20 years cash-as-rocket-fuel launched them to boatloads of profit.
Most organizations prefer less expensive, less risky methods. In these organizations, what I see are programs that institutionalize these failure-driven cycles. They create new governance and financing models that enforce smaller business cycles, allowing business and software development to take work in small batches. Allianz, for example, used 100-day cycles to discover and validate new businesses. Instead of one chance every 365 days to get it right, they have three, almost four. As each week goes by, they get smarter, there’s less waste and risk, and finance gets more accurate. If their business theory is validated, the new business is graduated from the lab and integrated back into the relevant line of business. The Home Depot, Thales, Allstate, and many others institutionalize similar practices.
Each of these cycles gives the business the chance to validate and invalidate assumptions. It gives finance more certainty, more precision, and, thus, fewer errors and less risk when it comes to the numbers. Finance might even be able to come up with a revenue number that’s real. That understanding makes funding business and software development less risky: you have ongoing health checks on the viability of the financial investment. You know when to stop throwing good money after bad when you’ve invalidated your business idea. Or, you can change your assumptions and try again: maybe no one really wants to rent cars by the hour, maybe they want scooters, or maybe they just want a bus pass.
With a steady flow of business development learning, you can start making growth decisions. If you validate that you can track a team of nuclear power plant workers better with RFID badges, thus directing them to new jobs more quickly and reducing costly downtime, you can increase your confidence that spending millions of dollars to do it for all plant workers will pay off. You see similar small experiments leading to massive investments in omnichannel programs at places like Dick’s Sporting Goods and The Home Depot.
Finance has to get involved in this fail-to-success cycle. Otherwise, business and software development will constantly be driven to be the cheapest provider. We saw how this generally works out with the outsourcing craze of my youth. Seeking to be the cheapest, or, synonymously, the “most cost effective,” option ends up saving money but paralyzing present and future innovation.
The problem isn’t that IT is too expensive, or can’t prove out a business case. As the Gartner study above shows, the problem is that most financing models we use to gate and rate business and software development are a poor fit. That needs to be fixed; finance needs to innovate. I’ve seen some techniques here and there, but nothing that’s widely accepted and used. And, certainly, when I hear about finance pushing back on IT business cases, it’s symptomatic of a disconnect between IT investment and corporate finance.
Businesses can certainly survive and even thrive. The small, failure-to-success learning cycles used by business and software developers work, are well known, and can be done by any organization that wills it. Those bottlenecks are broken. Finance is the next bottleneck to solve for.
I don’t really know how to fix it. Maybe you do!
After finance, for another time, my old friends: corporate strategy. And if you peer past that blizzard of pre-wired slides and pivot tables, you can see just past the edges of the next bottleneck, that mysterious cabal called “The C-Suite.” Let’s start with strategy first.
It confirms that more and more end-user organizations are deploying systems and applications to the cloud, including replacing on-premises systems with SaaS. IT may or may not be driving these shifts.
An impressive 72% of millennials are more likely to be loyal to a brand that responds to feedback on social media.
As an old, I too would like to not use the phone.
Source: Please Hold
Here, you see a shift in intentions to use containers, a pretty large one: fewer people are planning to use them. To me, containers are mostly useful for custom written software, not business application workloads.
So, several years ago, containers seemed like a cheaper VMware strategy where you just generically throw your apps in and reduce costs.
But, that doesn’t really work. Apps have to be container friendly, plus, you know, you have to manage your new container orchestration thing – figure out kubernetes. Even that has only been a thing (an option a generic IT team would know about and find viable enough to consider) for about the past year.
(I mean, maybe, if you soften the idea that kubernetes is a platform for building platforms and just think of it as a platform for running apps, that is, a platform. I don’t know what the fuck is going on in that definitional-wrangling space.)
These companies, I’d theorize, then, had the wrong assumptions, investigated container usage, and realized it wasn’t what they were expecting.
Containers are for running custom written software. If you don’t want to do that, they’re probably not useful to you.
As an important side-note, let’s assume use here means penetration, which is to say respondents use at least one container, possibly as few as one. That means overall usage could be a tiny percentage of their total workloads – or big. We have no way of knowing how many containers they use. Not a big deal if you’re interested, as here, in y/y trends, that is, growth. That’s what investors want to see.
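The penetration-versus-share distinction is easy to see with a toy calculation. The firm counts below are made up; the point is that a survey can truthfully report “half of firms use containers” while containers run a sliver of total workloads:

```python
# Penetration vs. workload share, with invented numbers.
firms = [
    {"total_workloads": 1_000, "container_workloads": 5},   # dabbling
    {"total_workloads": 800,   "container_workloads": 0},
    {"total_workloads": 1_200, "container_workloads": 10},  # dabbling
    {"total_workloads": 500,   "container_workloads": 0},
]
# Penetration: fraction of firms running at least one container.
penetration = sum(f["container_workloads"] > 0 for f in firms) / len(firms)
# Share: fraction of all workloads that are containers.
share = sum(f["container_workloads"] for f in firms) / sum(f["total_workloads"] for f in firms)
print(f"penetration: {penetration:.0%}, workload share: {share:.2%}")
# penetration: 50%, workload share: 0.43%
```

Same data, two very different headlines, which is why year-over-year penetration trends are useful for spotting growth but say little about how much is actually running in containers.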
Equally important and enlightening here, as always, is to look at the demographics:
I don’t know what the n is, the total number of respondents. There’s a good chunk of what I’d call “enterprise”: 10,000+ employees, and lots from finance. Let’s say around half? Spending-wise, education usually doesn’t spend much (or write that much custom software?), and “Technology” typically less. Tech companies usually don’t buy shit, they DIY it, choosing to spend on their own people instead of vendors – they are, after all, technology companies, they think.
Also, it’s worth weighting this all by how few insurance and retail companies are in the respondent base: they have tons of custom written software, the stuff you’d put in containers.
So, you’ve really got two very different surveys and conclusions going on here, split by two different markets: those who write and run their own software (mostly large organizations) and those who don’t (mostly smaller organizations).
You see the general conclusion in the footnote: 10,000+ people companies (who have a good chance of writing and running their own software) already use containers and plan on using more.
Anyhow, half of respondents are small and mid-sized companies, plus those tech companies that don’t spend. So, spending-wise, selling containers is probably mostly a large enterprise play because that’s where they get used and paid for. The rest of the companies, likely, want SaaS and security.
Check out the rest of the report. It covers much, much more than the container neck of the woods.
‘“Overall, Common Sense said teens are more likely to view social media as a good thing in their emotional lives. For example, 16 percent said using social media makes them feel less depressed and 25 percent said they feel less lonely, compared to 3 percent who said social media use made them feel more depressed or lonely. The report states that even though teen social media use has skyrocketed in six years, “teens are no more likely to report having a negative reaction to social media on any of these (emotional well-being) measures today than they were in 2012.”’
Original source: New Common Sense Media survey finds more positives than negatives in teen use of social media
“Forty-seven (47%) percent of them said improving performance/availability was one of the main reasons for leveraging multiple infrastructure environments.”
Original source: The Trouble with Cloud “Repatriation”
“Serverless is growing, and fast. Several key adoption metrics are 2x what they were last year. And not just with smaller companies; the enterprise is adopting serverless technologies for critical workloads just as rapidly.”
Original source: Serverless survey
“the massive online retailer once again posted its largest quarterly profit in history — $2.5 billion for the quarter — on the back of two businesses that were afterthoughts just a few years ago: Amazon Web Services, its cloud computing unit, as well as its fast-growing advertising business.”
Good charts, too.
Original source: This is the Amazon everyone should have feared — and it has nothing to do with its retail business
Original source: Two great tastes that taste great together
Original source: Technology Spending by the Numbers: From Public Cloud to Security Measures – TheStreet
“The percentage of people who think the internet is good for society is shrinking. Roughly 70 percent of American adults who use the internet believe it’s mostly good for society, down from 76 percent in 2014, Pew found.”
“About 64 percent of online adults over 65 say the internet has been “a mostly good thing for society,” Pew wrote. In 2014, that number was 78 percent.”
Original source: Is the internet good or bad for society? Americans are having a tougher time deciding.
Organizations talk a lot about transformation, but their actions don’t always back it up:
“To find out the state of digital transformation, we surveyed 1,600 business and IT decision-makers in North American and European enterprises. The answer? Sorry, I’m afraid. As you can see from the picture below, 21% of firms think their transformation is dusted and done. Really? Done? And another 22% are investigating or not transforming at all. And while 56% of firms are transforming, their level of investment and scope of transformation are still mostly small. For example, only 34% of banks and insurers are even bothering to transform marketing and only 45% are transforming customer care — a too-small percentage given consumers’ mass adoption of mobile devices.
“Why? As one respondent put it, “It’s a war between old-school technophobe leaders and the technology innovation that represents a completely different way of doing business.”
Original source: The Sorry State of Digital Transformation in 2018
Nice chart, template for plan to modernize code.
Original source: Migrations: the sole scalable fix to tech debt.
It’s got a quadrant chart.
Original source: Your Omnichannel Roadmap
“Founded in 2007, Dropbox epitomizes the freemium go-to-market. Dropbox has grown from 0 to 500 million users over that time period. 2% of those users convert to paid and pay an average of $9.33 per month. 90% of revenue originates through self serve channels – an astounding figure for company that generated more than $1B in revenue last year.”
Original source: Dropbox S-1 Analysis – The King of Freemium
“In addition, stagnant wages had implications for limiting demand growth. In our sector analysis, we found weak demand dampened productivity growth through other channels than investment, such as economies of scale and a subsector mix shift.”
Original source: Why isn’t digital fixing the productivity puzzle?
“In Denmark, childbearing creates a 20 percent gender wage gap.”
Original source: The gender wage gap is really a child care penalty
Now that’s how to chart.
Original source: All the president’s tweets
“fully 50% of the 872 respondents said their company is giving a ‘green light’ for IT spending. That was the highest reading since 2007, and 13 basis points higher than the average survey response for the month of November for the previous five years”
Original source: Green light for IT spending in 2018
I assume this is across distros, and includes use of just the open source stack.
The downward spiral (driven by budgeting, risk-aversion, and ROI-think) that makes legacy IT bloom like algae in a stagnant creek, from Chris Tofts:
With a limited budget for maintenance or improvement, how will it be allocated to the various systems managed by IT? Remember that in having to justify the spend at all, the primary need is to demonstrate business impact and – equally – guarantee that there is no risk to continued operations.
Inevitably, depending on organisational perspective, there are essentially three underlying approaches. First, maximise the number of systems that have been updated: demonstrate lots of work has taken place. Next, minimise the risk that any update will fail: have no impact on the ongoing organisation. Finally, maximise the apparent impact on the direct customers for IT systems – improve the immediate return to the business.
If the organisation maximises the number of systems updated then the clear imperative is to choose systems that are easy (cheap) to update. The systems that are cheap to update are invariably the ones with the least difference between in-use and current. In other words, the systems that were updated during the last round of updates. So the organisation will choose to improve those systems just beyond some minimum obsolescence criteria and until all of the budget is spent.
And then, you get bi-modal infrastructure:
It’s hard to say what the fix is beyond “don’t do that.” Perhaps a good rule of thumb is to attack the hard, risky stuff first.
Getting The Business to pay attention to legacy like they do cash-losses is also an interesting gambit:
As boring as it sounds, if organisations had to carry technical debt on their books – just like they carry the value of their brand on their assets – then, finally, they might understand both their exposure and necessary spend on their critical IT assets.
As covered by Axios, in a report from IAM/PwC. As noted in the notes below the chart, these figures are based on a sub-set of the market, 20 advertising outfits. No doubt, they represent a huge part of revenue however. It’s hard to imagine that there’s many more millions in podcast advertising.
Also as highlighted by Sara Fischer:
Edison Research and Triton Digital estimates 98 million U.S. adults listen to podcasts.
Each year, Mary Meeker and team put together the Internet Trends report that draws together an ever-growing collection of charts and analysis about the state of our Internet-driven world, from the latest companies to industry and economic impact. Over the years, the report has gone on to include analysis of markets like China and India. Being a production of the Kleiner Perkins Caufield & Byers venture capital firm, the focus is typically on new technologies and the corresponding business opportunities: you know, stuff like “millennials like using their smartphones” and the proliferation of smartphones and Internet globally. These reports are good for more than just numbers-gawking; they can also give some quantitative analysis of new technology innovations in various industries. The consumer and advertising space consumes much of this business analysis, but, for example, in this year’s report there’s an interesting analysis of health-care and transportation (bike sharing in China!). For enterprises out there, it may seem to over-index on startups and small companies, but that doesn’t detract from the value of the ideas for any organization looking to do some good, old-fashioned “digital transformation.”
Normally, I’d post my notebook things here, but the Pivotal blog overlords wanted to put this in on the Pivotal blog, so check it out there.
A nice way of explaining Amazon’s success in charts, e.g., as compared to Wal-Mart:
Just thinking aloud without any analysis, it seems like Amazon is an example of how difficult, long, and confounding doing continual innovation as your business is. Many companies claim to be innovation-driven, but most can just eke out those “incremental innovations” and basic Porterian strategy: they improve costs, enter adjacent markets, and grow their share of existing TAMs, all the while fending off competitors.
Amazon, on the other hand, has had decades of trying new business models mostly in existing businesses (retail), but also plenty of new business models (most notably public cloud, smart phones and tablets, streaming video and music, and whatever voice + machine learning is).
All that said, to avoid the Halo Effect, it’s important to admit that many companies tried and died here…not to mention many of the retailers who Amazon is troubling – Wal-Mart has had several goes at “digital” and is in the midst of another transformation-by-acquisitions. Amazon, no doubt, has had many lucky breaks.
This isn’t to dismiss any lessons learned from Amazon. There’s one main conclusion, though: any large organization that hopes to live a long time needs to first continually figure out if it’s in an innovation/disrupting market and, if it is, buckle up and get ready for a few decades of running in an innovation mode instead of a steady-state/profit-reaping mode.
Another lesson is that the finances of innovation make little sense and will always be weird: you have to just hustle away those nattering whatnots who want to apply steady-state financial analysis to your efforts.
You can throw out the cashflow-model chaff, but really, you just have to get the financial analysts to put down their pivot tables and have faith that you’ll figure it out. You’re going to be losing lots of money and will likely fail. You’ll be doing those anti-Buffett moves that confound normals.
In this second mode you’re guided by an innovation mindset: you have to be paranoid, you have to learn every day what your customers and competitors are doing, and do new things that bring in new cash. You have to try.
After all these years, print media still struggles versus the Internet. This long piece on how the travel magazine industry has been suffering covers many great topics. I suspect much of the analysis is the same for all of print media.
One of the problems is the new set of demands on writers in that field:
There is the pain point of figuring out an internal work flow that functions across platforms. Journalists, writers, and content creators often have specialized skillsets, so asking one to write a story, create a listicle, take photos, and film compelling videos about a trip is a major challenge.
“We just started working more efficiently that way and it really, it’s painful to integrate digital and print,” said Guzmán. “The plays are different, the workloads are different, the story ideation is different. In doing this, there’s this huge cultural shift that is exciting and difficult.”
And, then, even after suffering through all that “cultural shift,” the results are often disappointing:
“The iPad was just going to be this Jesus of magazines and I never really quite believed that because I knew how challenging it was to rejigger the content to fit that format,” said Frank, who oversaw Travel + Leisure’s digital strategy in the early 2010s. “Having just gone through the process of signing up and downloading a magazine, it took forever and was buggy and it just wasn’t necessarily a great solution. I never really bought the gospel that the tablet was going to be our savior. But we did it. I mean, we created a great app and it was beautiful. It won awards, but knowing what the usership was is a little disheartening.”
And, as ever, there’s the tense line between blaming “most readers are dumb” and “rivals are evil” when it comes to what’s to blame:
“I could have written the greatest travel story ever known, and it would not have gotten on the cover of the traffic oriented site because a Swedish bikini teen saved a kitten from a tree; which is going to be more popular?”
Let them watch cats.
Still, as the article opens up with, it’s the old Curse of Web 2.0 – former readers, now just travelers – writing the useful content in the form of reviews on TripAdvisor and such:
“In general, people don’t read a review and make a decision,” said Barbara Messing, chief marketing officer of TripAdvisor. “Consumers will read six to eight reviews. They might dig in a certain characteristic that they are interested in, maybe they really are interested in what the quality of the beach is, or maybe they are really interested in whether it’s kid friendly or not kid friendly. In general, people will hone in on the characteristics of something that’s most important to them, find that answer on TripAdvisor, get that most recent insights, check out the photos, check the forums, and really be able to make an informed decision of whether something is right for them. I think that the notion that people could rely on the wisdom of the crowd and the wisdom of individuals to their detriment, I just think that’s false, and I don’t think the reality is that is going to happen.”
There’s also some M&A history of trading various assets like Lonely Planet, Zagat, and Frommer’s back and forth as different management figures out what to do with them.
As ever, I’m no expert on the media industry. It seems like the core issue is that “the Internet” is so much more efficient at the Job to be Done for travel (as outlined by the TripAdvisor exec above) that the cost structure and business process from print magazines is not only inefficient, but unneeded. Those magazines are now over-serving (and thus, over-spending) with a worse product.
While the quality of TripAdvisor (and Yelp, for example) reviews is infinitely worse than glossy magazines, since there’s an infinite supply of crappy reviews with the occasional helpful one mixed in…it sort of more than evens out in favor of Swedish bikini cat rescuers. Plus, digital advertising has so much more spend (and, overall, industry profit, if only by sheer volume if not margin) – it must be because it’s better at making the advertisers money and because it creates a larger market.
Thus far, it seems like the large banks are fending off digital disruption, perhaps embracing some of it on their own. The Economist takes a look:
The dramatic surge in PE activity is primarily due to the ever-deepening pool of financial buyers. In the history of the industry, there have never been more tech-focused buyout shops that have had access to more capital, collectively, than right now. New firms have popped up while existing ones have put even more money to work in the tech industry, which is becoming even more ‘target rich’ as it ages. For instance, both Clearlake Capital and TA Associates announced as many deals in Q1 2017 as each of the firms would typically print in an entire year. Additionally, both Vista Equity Partners and Thoma Bravo averaged almost two transactions per month in Q1, if we include deals done by their portfolio companies as well.
From Brenon at 451, and with some charts too:
That chart is in millions, i.e., 260m in 2016; the write-up on Quartz is a little wonky in that respect.
From IDC: “Annually, shipments of traditional PCs slipped to 260 million units, down 5.7% from 2015.”
Based on the S-1 filings from the business, a $3.7B price implies a 17.3x enterprise value/trailing twelve month revenue multiple, which is 41% higher than the next nearest acquisition, Salesforce/Demandware. There’s no comparable pricing event in the M&A market in the last 10 years.
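A quick back-of-envelope check on that multiple (using only the figures quoted above, not the actual S-1 numbers): a $3.7B enterprise value at 17.3x trailing-twelve-month revenue implies roughly $214M in TTM revenue, which squares with the "nine-figure revenues" mentioned below.

```python
# Implied trailing-twelve-month revenue from the deal price and the
# EV/TTM revenue multiple quoted in the commentary above.
enterprise_value = 3.7e9   # $3.7B purchase price
multiple = 17.3            # EV / TTM revenue multiple

implied_ttm_revenue = enterprise_value / multiple
print(f"Implied TTM revenue: ${implied_ttm_revenue / 1e6:.0f}M")  # ~$214M
```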
The Borg plucked the company mere days before it was expected to float on the stock market, an event expected to raise around US$1.4bn for a portion of the company.
While AppDynamics could point to over 2,000 customers and nine-figure revenues, it also had rather a lot of red ink to deal with. That’s Cisco’s problem now, as it will make AppDynamics a software business unit in its internet of things and applications business.
If Congress enacted such a deal, of course, only a fraction of the $2.6 trillion would reach shareholders. It’s important to note that much of the UFE is not actually in cash; it’s invested in overseas plants or provides working capital for foreign subsidiaries. At press time, specifics of a plan hadn’t emerged, and figuring out which assets will ultimately get taxed, and at what rate, will be thorny. But based on Trump’s earlier proposal and on past holidays, investing pros estimate that about 40% of the UFE, or around $1 trillion, will come back to the U.S.—and that companies would net at least $850 billion after taxes.
Tech and health care companies would get most of that.
I think most people believe that cash would be used in stock buybacks and dividends to raise share prices and give cash to investors. Trump would probably want it for creating new jobs, and it could be used for domestic acquisitions.
Lawrence Hecht has some brief commentary on Gartner buying CEB for $2.6bn – Lawrence takes out the $700m in debt from the actual deal value of $3.3bn. I don’t really know CEB too well.
Also see the official press release.
Here’s some share price performance, snipping out the time around the acquisition announcement (it goes up, of course):
For the year, Gartner estimated shipments at 269.717 million, down 6.2 per cent year-on-year, with each of the major manufacturers except Dell reporting falling sales.
Gartner says high-end PCs are doing well, but of course, are a smaller market:
There have been innovative form factors, like 2-in-1s and thin and light notebooks, as well as technology improvements, such as longer battery life. This high end of the market has grown fast, led by engaged PC users who put high priority on PCs. However, the market driven by PC enthusiasts is not big enough to drive overall market growth.
There may be less volume, but it’d be nice to know how that affects profits in the notoriously slim-margin PC business.
Meanwhile, on overall, global IT spend:
Companies are due to splash $3.5tr (£2.87tr) on IT this year, globally, although that growth forecast is down from the previous projection of three per cent.
See some more commentary on that forecast.
451 Research estimated this week the application container segment reached a robust $762 million in 2016 and is forecast to grow at a 40-percent compound rate over the next four years to $2.7 billion.
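The quoted growth rate roughly checks out: going from $762M in 2016 to $2.7B four years later implies a compound annual growth rate of about 37%, close to the "40-percent compound rate" 451 cites (the exact figure depends on rounding in their published numbers).

```python
# Sanity check on 451's container forecast: $762M growing to $2.7B
# over four years implies a CAGR of (2700/762)^(1/4) - 1.
start_m, end_m, years = 762, 2700, 4

cagr = (end_m / start_m) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")  # ~37%, near the quoted 40%
```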
And, on usage, from an April/May 2016 survey:
451 Research’s Voice of the Enterprise: Software-Defined Infrastructure Workloads and Key Projects survey conducted in April and May 2016 showed that of the roughly 25% of enterprises we surveyed who use containers, 34% were in broad implementation of production applications and 28% had begun initial implementation of production applications with containers.
I’m somewhat suspicious that there’s $762m in container software and services sales, but who knows, really?
I haven’t read through their entire cloud enabling technologies market sizing yet, from Dec 2016 (basically, private cloud software and services, and things used by *aaS vendors, not the actual public cloud services, which are another market), which is more than just containers. That market is pegged at $23bn in 2016, going to $39bn in 2020:
More on 451’s blog.
According to one study:
“Headcounts may not rise significantly, but look for IT organizations to spend more on talent, especially managers and developers who can lead the transition to cloud, mobility, and big data solutions,” Computer Economics says in its executive summary.
n=”over 200 executives.”