More on “grim” automation – Notebook

A few weeks back my review of two “the robots are taking over” books came out over on The New Stack. Here are some responses, and also some highlights from a McKinsey piece on automation.

Don’t call it “automation”

From John Allspaw:

There is much more to this topic. Nick Carr’s book, The Glass Cage, has a different perspective. The ramifications of new technology (don’t call it automation) are notoriously difficult to predict, and what we think are forgone conclusions (unemployment of truck drivers even though the tech for self-driving cars needs to see much more diversity of conditions before it can get to the 99%+ accuracy) are not.

Lisanne Bainbridge, in her seminal 1983 paper “Ironies of Automation,” outlines what is still true today.

From that paper:

This paper suggests that the increased interest in human factors among engineers reflects the irony that the more advanced a control system is, so the more crucial may be the contribution of the human operator.

When things go wrong, humans are needed:

To take over and stabilize the process requires manual control skills, to diagnose the fault as a basis for shut down or recovery requires cognitive skills.

But their skills may have deteriorated:

Unfortunately, physical skills deteriorate when they are not used, particularly the refinements of gain and timing. This means that a formerly experienced operator who has been monitoring an automated process may now be an inexperienced one. If he takes over he may set the process into oscillation. He may have to wait for feedback, rather than controlling by open-loop, and it will be difficult for him to interpret whether the feedback shows that there is something wrong with the system or more simply that he has misjudged his control action.

There’s a good case made not only for the need for humans, but for keeping humans fully trained and involved in the process to handle error states.
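Bainbridge’s point about gain and timing maps directly onto basic feedback control. Here’s a toy sketch (my illustration, not from the paper) of a first-order process under proportional control: a well-judged gain settles the process on its setpoint, while a misjudged gain produces exactly the kind of oscillation she describes.

```python
# Toy model: an operator (or controller) repeatedly nudges a process
# toward a setpoint. The only tuning knob is the gain on each correction.
def run(gain, steps=12, setpoint=1.0):
    state, history = 0.0, []
    for _ in range(steps):
        error = setpoint - state
        state += gain * error  # the corrective action
        history.append(round(state, 3))
    return history

print(run(gain=0.5))  # well-judged: converges smoothly to the setpoint
print(run(gain=1.9))  # misjudged: overshoots and oscillates around it
```

An operator who hasn’t touched the controls in months is, in effect, running with an uncalibrated gain.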

Hiring not abating

Vinnie, the author of one of the books I reviewed, left a comment on the review, noting:

For the book, I interviewed practitioners in 50 different work settings – accounting, advertising, manufacturing, garbage collection, wineries etc. Each one of them told me where automation is maturing, where it is not, how expensive it is etc. The litmus test to me is are they stopping the hiring of human talent – and I heard NO over and over again even for jobs for which automation tech has been available for decades – UPC scanners in groceries, ATMs in banking, kiosks and bunch of other tech in postal service. So, instead of panicking about catastrophic job losses we should be taking a more gradualist approach and moving people who do repeated tasks all day long and move them into more creative, dexterous work or moving them to other jobs.

I think Avent’s worry is that the approach won’t be gradual and that, as a society, we won’t be able to change norms, laws, and “work” fast enough.

McKinsey

For more context, check out McKinsey’s overview of their own study and analysis, from a 2015 McKinsey Quarterly article:

The jobs don’t disappear, they change:

Our results to date suggest, first and foremost, that a focus on occupations is misleading. Very few occupations will be automated in their entirety in the near or medium term. Rather, certain activities are more likely to be automated, requiring entire business processes to be transformed, and jobs performed by people to be redefined, much like the bank teller’s job was redefined with the advent of ATMs.

Further:

our research suggests that as many as 45 percent of the activities individuals are paid to perform can be automated by adapting currently demonstrated technologies… fewer than 5 percent of occupations can be entirely automated using current technology. However, about 60 percent of occupations could have 30 percent or more of their constituent activities automated.

Most work is boring:

Capabilities such as creativity and sensing emotions are core to the human experience and also difficult to automate. The amount of time that workers spend on activities requiring these capabilities, though, appears to be surprisingly low. Just 4 percent of the work activities across the US economy require creativity at a median human level of performance. Similarly, only 29 percent of work activities require a median human level of performance in sensing emotion.

So, as Vinnie also suggests, you can automate all that stuff and have people focus on the “creative” things, e.g.:

Financial advisors, for example, might spend less time analyzing clients’ financial situations, and more time understanding their needs and explaining creative options. Interior designers could spend less time taking measurements, developing illustrations, and ordering materials, and more time developing innovative design concepts based on clients’ desires.

Companies want more from offshore IT, likely leading to more on-shore IT growth

The most recent offshoring survey from Horses for Sources suggests that companies will have less use for traditional IT outsourcing.

When it comes to IT services and BPO, it’s no longer about “location, location, location”, it’s now all about “skills, skills, skills”.

Instead of “commodity” capabilities (things like password resets, routine programming changes, etc.), companies want more highly-skilled, innovative capabilities. Either offshorers need to provide this, or companies will in-source those skills.

Because offshorers typically don’t focus on such “open-ended” roles, analysis of the survey suggests they’ll have less business, at least new business:

aspirations for offshore use between the 2014 and 2017 State of the Industry studies, we see a significant drop, right across the board, with plans to offshore services.

And:

an increasing majority of customers of traditional shared services and outsourcing feel they have wrung most of the juice offshore has to offer from their existing operations, and aren’t looking to increase offshore investments.

Given the large volume of IT offshoring companies do, and how this outsourcing tends to control and limit IT capabilities, paying attention to these trends can help you predict the ongoing “nature of IT” in large organizations.

This fits the complaining about offshoring and outsourcing that I hear from almost all software teams in large organizations.

To me, this reads as “yes, we need to refocus IT to help us create and refine new business models.” You know, “digital transformation,” “cloud native,” and all that.

Source: “Offshore has become Walmart as Outsourcing becomes more like Amazon”

Linux killed Sun?

For the “Sun: WTF?” files:

Gerstner questioned whether three or four years from now any proprietary version of Unix, such as Sun’s Solaris, will have a leading market position.

One of the more popular theories for the decline of Sun is that they accepted Linux way, way too late. As a counter-example, there’s IBM saying that somewhere around 2006 you’d see the steep decline of the Unix market, including Solaris, of course.

If I ever get around to writing that book on Sun, a chart showing server OS market-share from 2000 to 2016 would pair well with that quote.

If you’ve read Stephen’s fine book, The New Kingmakers, you may recall this relevant passage:

In 2001, IBM publicly committed to spending $1 billion on Linux. To put this in context, that figure represented 1.2% of the company’s revenue that year and a fifth of its entire 2001 R&D spend. Between porting its own applications to Linux and porting Linux to its hardware platforms, IBM, one of the largest commercial technology vendors on the planet, was pouring a billion dollars into the ecosystem around an operating system originally written by a Finnish graduate student that no single entity — not even IBM — could ever own. By the time IBM invested in the technology, Linux was already the product of years of contributions from individual developers and businesses all over the world.

How did this investment pan out? A year later, Bill Zeitler, head of IBM’s server group, claimed that they’d made almost all of that money back. “We’ve recouped most of it in the first year in sales of software and systems. We think it was money well spent. Almost all of it, we got back.”

Source: IBM to spend $1 billion on Linux in 2001 – CNET

Talend IPOs

The open source data integration (basically, evolved ETL) company Talend IPO’ed this week. It’s a ten-year-old company with a huge French tie-in. Interesting all around. Here are some details on them:

  • “1,300 customers include Air France, Citi, and General Electric.” That’s way up from 400 back in 2009, seven years ago.
  • In 2015 “Talend generated a total revenue of $76 million. Its subscription revenue grew 39% year over year, representing $62.7 million of the total. The company isn’t profitable: it reported a net loss of $22 million for 2015.”
  • “…much of that [loss] thanks to the $49 million it spent on sales and marketing,” according to Julie Bort.
  • “Subscription revenue rose 27% to $63m while service fees stayed flat at $13m,” according to Matt Aslett.
  • It looks like the IPO performed well, up ~50% from the opening price.
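As an aside, the two subscription growth figures quoted above don’t quite agree. Working backwards (my arithmetic, not from either source), they imply different 2014 baselines, so the two writers are presumably measuring slightly different things:

```python
# Implied 2014 subscription revenue ($m), reversed out of each quote.
print(round(62.7 / 1.39, 1))  # ~45.1, implied by "grew 39% ... $62.7 million"
print(round(63.0 / 1.27, 1))  # ~49.6, implied by "rose 27% to $63m"
```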

TAM Time

By this point, I’m sure Talend messes around in other TAMs, but way back when I used to follow the business intelligence and big data market more closely, I recall that much of the growth – though small in TAM – was in ETL. People always like to gussy it up as “data integration”: sure thing, hoss.

That still seems to be the case, as spelled out in a recent Magic Quadrant for the space (courtesy of the big dog in the space, Informatica):

Gartner estimates that the data integration tool market was worth approximately $2.4 billion in constant currency at the end of 2014, an increase of 6.9% from 2013. The growth rate is above the average for the enterprise software market as a whole, as data integration capability continues to be considered of critical importance for addressing the diversity of problems and emerging requirements. A projected five-year compound annual growth rate of approximately 7.7% will bring the total to more than $3.4 billion by 2019

In comparison, here’s the same from the 2011 MQ:

Gartner estimates that the data integration tools market amounted to $1.63 billion at the end of 2010, an increase of 20.5% from 2009. The market continues to demonstrate healthy growth, and we expect a year-on-year increase of approximately 15% in 2011. A projected five-year compound annual growth rate of approximately 11.4% will bring the total to $2.79 billion by 2015.
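If you want to sanity-check those projections, a CAGR is just compound growth. A quick sketch, using the figures from the two quotes above:

```python
def project(base_bn, cagr, years):
    """Compound a market size forward at a constant annual growth rate."""
    return base_bn * (1 + cagr) ** years

print(round(project(2.4, 0.077, 5), 2))   # ~3.48: "more than $3.4 billion by 2019"
print(round(project(1.63, 0.114, 5), 2))  # ~2.8: the projected "$2.79 billion by 2015"
```

Also worth noting: the 2011 MQ projected $2.79bn by 2015, while the 2014 MQ pegs the market at just $2.4bn at the end of 2014 – growth came in well under that earlier forecast.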

Meanwhile, check out Carl Lehmann’s recent overview of Informatica and the general data integration market, and Matt Aslett’s coverage of Talend’s IPO plans back in June.

OpenStack Summit 2016 Talks

The OpenStack Summit is in Austin this year, finally! So, I of course submitted several talks. Go over and vote for them – I think that does something helpful, who the hell knows?

Here’s the talks:

I’ll be at the Summit regardless, but it’d sure be dandy to do some of the above too.

New DevOps Column at The Register

I started a new column at The Register on the topic of DevOps. I used the first column to lay out the case that DevOps is a thing and to baseline how much adoption there currently is (enough, but not a lot – a “glass almost half full” type of situation). I was surprised by how many comments it kicked up!

Next up, I’ll try to pick key concepts and explain them, along with best and worst practices for adoption of those concepts. Or whatever else pops up to fill 800 words. Tell me if you have any ideas!

(You may recall I had a brief column at The Register back when I was at 451 Research.)

The Problem with PaaS Market-sizing

Figuring out the market for PaaS has always been difficult. At the moment, I tend to estimate it at $20-25bn sometime in the future (5-10 years from now?), based on the model of converting the existing middleware and application development market. Sizing this market has been something of an annual bugbear for me across my time at Dell doing cloud strategy, at 451 Research covering cloud, and now at Pivotal.

A bias against private PaaS

This number is in contrast to the numbers you usually see from analysts, in the single-digit billions. Most analysts think of PaaS only as public PaaS, tracking just Force.com, Heroku, and parts of AWS, Azure, and Google. This is mostly due, I think, to historical reasons: several years ago “private cloud” was seen as goofy and made-up, and I’ve found that many analysts still view it as such. Thus, their models started off covering just public PaaS and have largely remained so.

I was once a “public cloud bigot” myself, but having worked more closely with large organizations over the past five years, I now see that much of the spending on PaaS is on private PaaS. Indeed, if you look at the history of Pivotal Cloud Foundry, we didn’t start making major money until we gave customers what they wanted to buy: a private PaaS platform. The current product/market fit for PaaS in large organizations, then, seems to be private PaaS.

(Of course, I’d suggest a wording change: when you end up running your own PaaS you actually end up running your own cloud and, thus, end up with a cloud platform.)

How much do you have budgeted?

With this premise – that people want private PaaS – I then look at existing middleware and application development market-sizes. Recently, I’ve collected some figures for that:

  • IDC’s Application Development forecast puts the application development market (which includes ALM tools and platforms) at $24bn in 2015, growing to $30bn in 2019. The commentary notes that the influence of PaaS will drive much growth here.
  • Recently from Ovum: “Ovum forecasts the global spend on middleware software is expected to grow at a compound annual growth rate (CAGR) of 8.8 percent between 2014 and 2019, amounting to $US22.8 billion by end of 2019.”
  • And there’s my old pull from a Goldman Sachs report that drew on Gartner figures, where middleware is $24bn in 2015 (that’s from a Dec 2014 forecast).

When dealing with large numbers like this and so much speculation, I prefer ranges. Thus, the PaaS TAM I tend to use nowadays is something like “it’s going after a $20-25bn market, you know, over the next 5 to 10 years.” That is, the pot of current money PaaS is looking to convert is somewhere in that range. That’s the amount of money organizations are currently willing to spend on this type of thing (middleware and application development), so it’s a good estimate of how much they’ll spend on a new type of this thing (PaaS) to help solve the same problems.
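To make that arithmetic explicit, here’s a minimal sketch of the range logic, using the analyst figures from the bullets above. Treating middleware and application development as one heavily-overlapping pot, rather than summing them, is my reading of the estimate:

```python
# Analyst figures quoted above, in $bn.
figures_bn = {
    "app dev, 2015 (IDC)": 24.0,
    "middleware, 2019 (Ovum)": 22.8,
    "middleware, 2015 (Gartner via Goldman Sachs)": 24.0,
}

low, high = min(figures_bn.values()), max(figures_bn.values())
print(f"convertible pot today: roughly ${low}-{high}bn")  # ~$23-24bn

# Hedging that down and out for speculation gives the "$20-25bn over the
# next 5 to 10 years" range used above.
```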

Things get slightly dicey depending on whether you include databases, ALM tools, and the underlying virtualization and infrastructure software: some PaaSes include some, none, or all of these in their products. Databases are a huge market (~$40bn), as is virtualization (~$4.5bn). The other ancillary buckets are pretty small, relatively. I don’t think “PaaS” eats too much database, but probably some “virtualization.”

So, if you accept that PaaS is both public and private PaaS and that it’s going after the middleware and appdev market, it’s a lot more than a few billion dollars.

(Ironic-clipart from my favorite source, geralt.)

The media doesn’t know what they’re talking about w/r/t Yahoo, a study in i-banker rhetoric

The notion that some in the media – who usually have no specific knowledge about Yahoo – have recklessly put forward that Yahoo is “unfixable” and that it should be simply “chopped up” and handed over for nothing to private equity or strategics is insulting to all long-term public shareholders.

This presentation is an example of many things we discuss on Software Defined Talk around large, struggling companies and the way they’re covered. Among other rhetorical highlights:

  • How they make their case
  • Their use of visuals and charts
  • The informal nature of their language, e.g., they use the word “stuff” frequently
  • Their citations, e.g., citing themselves (I always love a good “Source: Me!”) and citing “Google Images”

These things, in my view, are neither good nor bad: I’m more interested in the study of the rhetoric, which I find fascinating in investment banker documents/presentations like this.

Not only that, it’s a classic “Word doc accidentally printed in landscape.” The investment community can’t help themselves.

As another note, there’s no need to be such a parenthetical dick, below, to prove the point of a poor M&A history; just let the outcomes speak for themselves, rather than knocking the people who did the deals.

[Slide from the presentation, annotating Yahoo’s acquisition track record]

They actually do a better job in the very next slide, but that kind of pettiness doesn’t really help their argument. (Their argument is: she’s acquiring her friends.)

This is a type of reverse halo effect: we assume that tree-standing goofiness has something to do with the business – an ad hominem attack. But I think most billionaires probably have pictures of themselves in trees, wearing those silly glove shoes, roasting their own coffee, only eating meat they kill themselves, or any number of other affectations that have nothing to do with profit-making, good or bad.

Solving the conundrums of our father’s strategies

So here we are, as of this writing a good twenty-nine years after the “hatchet job,” and Kodak has declared bankruptcy. The once-humming factories are literally being blown up, and the company’s brand, which Interbrand had valued at $14.8 billion in 2001, fell off its list of the top one hundred brands in 2008, with a value of only $3.3 billion. It really bothered me that the future was so visible in 1980 at Kodak, and yet the will to do anything about it did not seem to be there. I asked Gunther recently why, when he saw the shifts coming so clearly, he did not battle harder to convince the company to take more forceful action. He looked at me with some surprise. “He asked me my opinion,” he said, “and I gave it to him. What he did beyond that point was up to him.” Which is entirely characteristic of scientists like Gunther. They may see the future clearly, but are often not interested in or empowered to lead the charge for change. Why do I know this story so well? He happens to be my father. —The End of Competitive Advantage, Rita McGrath.

You don’t get a sudden, personal turn like that in business books much. It evoked one of the latent ideas in my head: much of my interest in “business” and “strategy” comes from my dad’s all-too-typical career at IBM in the 80s and 90s.

Sometime in the early 80s – or even late 70s? – my dad started working at IBM in Austin on the factory floor, making printed circuit boards I believe. He’d tell me that he’d work the late shift, third shift, and at 6am stop by 7-11 with his buddies to get a six pack and wait in the parking lot of the Poodle Dog bar for it to open at 8.

He moved up to management, and eventually into planning and forecasting. All for hardware. I remember he got really excited in the late 80s when he got a plotter at home so he could work on foils, that is, transparencies. We call these “slides” now: you can still get that battlefield twinkle out of old IBM’ers’ eyes if you say “foils.”

Eventually, he lived the dictum of “I’ve Been Moved” and went up to the Research Triangle for a few years, right before IBM divested his part of the company, selling it to Multek (at least he got to return to Austin).

As you can guess, his job changed from a long-term one where his company had baseball fields and family fun days (where we now have an outdoor mall, The Domain) to the usual transient, productivity-harvesting job. He moved over to Polycom eventually, where he spent the rest of his career helping manage planning and shipping, on late-night phone calls to Thailand manufacturers.

In addition to always having computers around – IBM PCs, of course! – there was always this background sense of how a large tech company evolves and operates. At the time, I don’t think I paid much attention to it, but it’s a handy reference now that I spend most of my time focused on the business of tech.