2017 Cloud Foundry Application Runtime Survey – Highlights

There’s a new survey out from the Cloud Foundry Foundation, looking at the users of Cloud Foundry. Here’s some highlights and notes:

  • Another ClearPath joint, n=735.
  • It’s important to keep in mind that this covers all distributions of Cloud Foundry, including open source (no vendor involved).
  • “The percentage of user respondents who require over three months
    per app drops from 51 percent to 18 percent after deploying Cloud Foundry Application Runtime”
  • “…while the percentage of user respondents who require less than a week climbs from 16 percent to 46 percent.”
  • “Nearly half (49 percent) of Cloud Foundry Application Runtime users are large enterprises ($1+ billion annual revenue).”
  • This chart is hard to read, but it shows a reduction in time to deploy across various time periods (before/after release chart).
  • Uptake is early, but there are definitely mature users: “A plurality of Cloud Foundry Application Runtime users (61 percent) describe their deployments as somewhere in the early stages—trial, PoC, evaluation, or a partial integration into specific business units. Meanwhile, 39 percent have deployed Cloud Foundry Application Runtime more broadly across their company, from total integration in specific business groups to company-wide deployment.”
  • “Comcast, for example has more than 1500 developers using Cloud Foundry Application Runtime daily. Home Depot reports more than 2500 developers.”
  • “Comcast has seen between 50 percent and 75 percent improvement in productivity.”
  • “Half of Cloud Foundry Application Runtime users are currently using containers, such as Docker or rkt, with another 35 percent evaluating or deploying containers.”
  • Container management – there’s a wide variety of tools that people use for container orchestration, including DIY (14%). There’s a lot of interest in having CF do it: “Nearly three-quarters (71 percent) of Cloud Foundry Application Runtime users currently using or evaluating containers are interested in adding container orchestration and management to their Cloud Foundry Application Runtime environment.” Hence, validating the Cloud Foundry Container Runtime.
  • Of course, the surveyed are already CF users, so they’re biased/driven by what they know.
  • Almost half of respondents say that getting started with CF is difficult. But people end up liking it: “An overwhelming majority of users (83 percent) would recommend Cloud Foundry Application Runtime to a colleague, including 60 percent who would do so strongly.”
  • “As more companies roll out Cloud Foundry Application Runtime more broadly, the footprint continues to grow. Currently, 46 percent of users have more than 10 apps deployed on Cloud Foundry Application Runtime, including 18 percent with over 100 (and eight percent with over 500).” 4% have over 1,000 apps.
  • CF’s uses: “The primary use is for microservices (54 percent), followed by websites (38 percent), internal business applications (31 percent), Software-as-a-Service (SaaS) (27 percent) and legacy software (eight percent).”
  • Validating multi-cloud: “60 percent say this is very important, and another 30 percent describe it as somewhat important.” Meanwhile, 53% are using more than one type of IaaS.

Private cloud: avoiding an existential crisis

451 Research’s data points suggest that some workloads are likely to remain on private cloud regardless of any disruptor’s attack. And even with hungry cloud providers eyeing private workloads, growth is likely to continue across all cloud models, not just public cloud.

Whole bunch of survey numbers tryin’ to figure out how many workloads will stay on private cloud.

Source: Private cloud: avoiding an existential crisis

Podcast market estimated at over $220m

As covered by Axios, in a report from IAB/PwC. As noted in the notes below the chart, these figures are based on a sub-set of the market, 20 advertising outfits. No doubt they represent a huge part of the revenue, however. It’s hard to imagine that there’s many more millions in podcast advertising beyond that.

Also as highlighted by Sara Fischer:

Edison Research and Triton Digital estimate 98 million U.S. adults listen to podcasts.

Link

451’s container orchestration usage survey – Notebook


As part of CoreOS’s conference this week, 451 put out a sponsored study on container orchestration. It’s been much cited and is free, so it’s worth taking a look. Here’s my highlights and notes:

  • Leadgen yourself over to CoreOS to get a copy of the report.
  • This report is really more of a “container orchestration usage” report than much about “hybrid cloud.”
  • Demographics:
    • “We surveyed 201 enterprise IT decision-makers in April and May 2017. This was not a survey of developers; rather, we received responses from those in C-level and director-level positions, including CISO, CTO, CIO, director of IT, IT Ops and DevOps, and VPs and managers of IT.”
    • All from the US
    • “All of our survey respondents came from organizations using application containers, and all were familiar with their organization’s use of containers.” – This survey, then, tells you what people who’re already using containers are doing, not what the entire market is thinking and planning on.
    • “A significant slice of the survey respondents represented large enterprises.”
  • Organizations are hoping to use containers for “[a] ‘leapfrog’ effect, whereby containers are viewed as a way to skip adoption of other technologies, was tested, and a majority of respondents think Kubernetes and other container management and orchestration software is sufficient to replace both private clouds and PaaS.”
  • Obviously I’m biased, being at Pivotal, but the question here is “to do what?” As we like to say around here, you’re going to end up with a platform. People need a “platform” on top of that raw IaaS, and as things like Istio show (not to mention Pivotal’s ongoing momentum), the lower levels aren’t cutting the mustard.
  • There’s an ongoing semantic argument about what “PaaS” means to be mindful of, as well: in contexts like these, the term is often taken to mean “that old stuff, before, like 2009.” At the very least, as with Gartner’s PaaS Magic Quadrant, the phrase often means “only in the public cloud.” Again, the point is: if you’re developing and running software you need an application development, middleware, and services platform. Call it whatever you like, but make sure you have it. It’s highly likely that these “whatever you want to call ‘PaaS’ PaaSes” will run on top of and with container orchestration layers, for example, as Cloud Foundry does and is doing.
  • That said, it’s not uncommon for me to encounter people in organizations who really do have a “just the containers, and maybe some Kubernetes” mind-set in the planning phase of their cloud-native stuff. Of course, they frequently end up needing more.
  • Back to the survey: keeping in mind that all respondents were already using containers (or at least committed to doing so, I think), ~27% had “initial” production container use, ~25% of respondents had “broad” containers in production. So, if you were being happy-path, you’d say “over half of respondents have containers in production.”
  • In a broader survey (where, presumably, not every enterprise was already using containers), of 300+ enterprises, production container use was: 19% in initial production, 8% were in broad production implementation.
  • Nonetheless, 451 has been tracking steady, high growth in container usage for the past few years, putting the container market at $1.1bn in 2017, growing to $2.7bn by 2020.
  • As the report says, it’s more interesting to see what benefits users actually find once they’re using the technology. Their original desires often turn out to be just puppy-love notions compared to what they value after actual usage:

  • Interesting note on lock-in: “Given that avoiding vendor lock-in is generally a priority for organizations, it might seem surprising that it was not ranked higher as an advantage since much of the container software used today is open source… However, our respondents for this study were users of containers, and may have assumed that the technology would be open source and, thus, lock-in less of a concern.” (There’s a whole separate report from Gartner on lock-in that I’ll take a look at, and, of course, some 140 character level analysis.)
  • On marketshare, rated by usage, not revenue:

  • On that note, it’s easy to misread the widely quoted finding of “[n]early three-quarters (71 percent) of respondents indicated they are using Kubernetes” as meaning only Kubernetes. Actually, people are using many of them at once. The report clarifies this: “The fact that almost 75% of organizations reported using Kubernetes while the same group also reported significant use of other container management and orchestration software is evidence of a mixed market.”

As one last piece of context, one of the more recent Gartner surveys for container usage puts usage at around 18%, with 4% of that being “significant production use”:


Of course, looking at more specialized slices of the market finds higher usage.

This early in the container market, it’s good to read surveys closely: sample sizes will be small and selective, and most respondents will only have used containers for a short while. But there’s good stuff in this survey; it’s definitely worth looking at and using.

Figuring out fixing federal government IT – Notebook

In the US, we love arm-chair strategizing government IT, in particular federal IT. Getting your arms around “the problem” is near impossible.

What do we think is wrong, exactly?

As citizens, our perceptions seem to be that government IT has poor user experience, none at all (there’s no app to do things, you have to go to an office to fill something out, etc.), and that it costs too much. More wonky takes are that there’s not enough data provided, nor insights generated by that data to drive better decision making.

When I’ve spoken with government IT people, their internal needs revolve around increasing (secure) communication, using more modern “white-collar” tools (from simply upgrading their copies of Office, to moving to G Suite/Office 365 suites, or just file sharing), and addressing the citizen perceptions (bringing down costs, making sure the software, whether custom made or “off the shelf,” has better customer experiences).

Is it so hard, really?

It’s also easy to think that government is a special snow-flake, but, really, they have mostly the same problems as any large organization. As highlighted below, the contracting, procurement, and governance processes are more onerous in government IT, and the profile of “legacy” systems is perhaps higher and, worse, more of a drag down into the muck.

From my conversations, one of the main barriers to change is systemic inertia, seemingly driven by avoidance of risk and an overall lack of motivation to do anything. This lack of motivation is likely driven by the lack of competition: unlike in the private sector, there’s no other government to go to, so there’s no fear of losing “business,” so why care to change or make things better?

Anyhow, here’s a notebook of federal government IT.

“Legacy”

  • “92 percent of Federal IT managers say it’s urgent for their agency to modernize legacy applications, citing the largest driving factors as security issues (42 percent), time required to manage and/or maintain systems (36 percent), and inflexibility and integration issues (31 percent)” from an Accenture sponsored 2015 survey of “150 Federal IT managers familiar with their agency’s applications portfolio”
  • There’s a large pool of legacy IT, though not as large as you might think: ~60% of portfolios are from before 2010 (https://www.gartner.com/document/3604417).
  • That said, the same report says that ~25% of portfolios are pre-1999, with 5% from the 1980’s.
  • On spending: “The government has been reporting that 75 to 80 percent of the federal IT budget is spent on running legacy (or existing) systems.”
  • But, actually, that’s pretty normal: “That may sound alarming to those who aren’t familiar with the inner-workings of a large IT organization. However, the percentage is in-line with the industry average. Gartner says the average distribution of IT spending between run, grow and transform activities — across all industries — is 70 percent, 19 percent and 11 percent respectively. Those numbers have been consistent over the past decade.”
  • However, the spending items above are from Compuware’s CEO, who’s clearly interested in continuing legacy spending, mostly on mainframes.

Priorities

Source: “2017 CIO Agenda: A Government Perspective,” Rick Howard, Gartner, Feb. 2017.

Other notes:

  • In the same survey, data & analytics skills are the leading talent gap, with security coming in second. Everything else is in the single digits.
  • Why care about data? On simply providing it (and, you know, the harder job of producing it), the UN e-Government survey says “Making data available online for free also allows the public – and various civil society organizations – to reuse and remix them for any purpose. This can potentially lead to innovation and new or improved services, new understanding and ideas. It can also raise awareness of governments’ actions to realize all the SDGs, thus allowing people to keep track and contribute to those efforts.”
  • And, on analytics: “Combining transparency of information with Big Data analytics has a growing potential. It can help track service delivery and lead to gains in efficiency. It can also provide governments with the necessary tools to focus on prevention rather than reaction, notably in the area of disaster risk management.”
  • Reducing compliance and overall “bureaucracy” is always a problem. My benchmark case is an 18F project that reduced the paperwork time (ATO) down from 9-14 months to 3 days.

The workloads – what’s the IT do?

  • And, while it’s for the Australian government, check out a good profile of the kinds of basic services, and, therefore, applications that agencies need, e.g.: booking citizenship process appointments, getting permits to open businesses, and facilitating the procurement process.
  • If you think about many of the business services governments do, it’s workflow process: someone submits a request, multiple people have to check and correlate the data submitted, and then someone has to approve the request. This is a core, ubiquitous thing handled by enterprise software and, in theory, shouldn’t be that big of a deal. But, you know, it usually is. SaaS offerings are a great fit for this, you’d hope.

The problems: the usual old process, expensive COTS, contractors, compliance

  • If you accept that much of government IT is simple workflow management, much of the improvement in the quality and costs of government IT would likely come from shifting off custom, older IT to highly commoditized, cheap (and usually faster evolving, and more secure) SaaS-based services.
  • Jennifer Pahlka: “When you consider that much of what ails government today is the use of custom development at high cost when a commodity product is readily and cheaply available, we must acknowledge that agile is one useful doctrine, not the doctrine. “
  • So, if you do the old “IT – SaaS = what?” you suck out a lot of resources (money, attention, etc.) by moving from janky, expensive COTS systems (and all the infrastructure and operations support needed to run them). You can both cut these costs (fire people, shut down systems), and then reallocate resources (people, time, and money) to better customizing software. Then, this gets you back to “agile,” which I always read as “software development.”
  • In my experience, government IT has the same opportunities as most companies, taking on a more “agile” approach to IT. This means doing smaller, faster to release batches, with smaller, more focused, “all in teams.” Again, the same thing as most large organizations.
  • An older survey (sponsored by Red Hat): “Just 13% of respondents in a recent MeriTalk/Accenture survey of 152 US Federal IT managers believed they could ‘develop and deploy new systems as fast as the mission requires.’”
  • Mikey Dickerson, 2014: “We’ll break that up by discouraging government contracts that are multibillion-dollar and take years to deliver. HealthCare.gov would have been difficult to roll out piecemeal, but if you, a contractor, have to deliver some smaller thing in four to six weeks while the system is being constructed, you’ll act differently.”
  • Government contractors and procurement are a larger problem in government IT, though. The structure of how business is done with third parties, and the related procurement and compliance red-tape causes problems, and, as put by Andrew McMahon, it creates “a procurement process that has become more important than the outcome.”
  • While there’s “too much” red-tape, in general we want a huge amount of transparency and oversight into government work. In the US, we don’t really trust the government to work efficiently. This becomes frustratingly ironic and circular, then, if your position is that all of that oversight and compliance is a huge part of the inefficiency.
  • As put by one government CIO: “Government agencies, therefore, place a business value on ‘optics’—how something appears to the observant public. In an oversight environment that is quick to assign blame, government is highly risk averse (i.e., it places high business value on things that mitigate risk)…. the compliance requirements are not an obstacle, but rather an expression of a deeper business need that the team must still address.”

Success story

  • Tom Cochran: “While running technology for Obama’s WhiteHouse.gov, open-source solutions enabled our team to deliver projects on budget and up to 75% faster than alternative proprietary-software options. More than anything, open-source technology allows governments to utilize a large ecosystem of developers, which enhances innovation and collaboration while driving down the cost to taxpayers.”
  • As with “agile,” it’s important to not put all your eggs-of-hope in one basket on the topic of open source. My theory is that for many large organizations, simply doing something new and different, upgrading – open or not – will improve your IT situation:
  • While open source has different cost dynamic, I’d suggest that simply switching to new software to get the latest features and mindset that the software imbues gives you a boost. Open source, when picked well, will come with that community and an ongoing focus on updates: older software that has long been abandoned by the community and vendors will stall out and become stale, open or not.
  • One example of success, from Pivotal-land, is the IRS’s modernization of reporting on delinquent taxes. It moved from a costly, low customer-service-quality telephone-based system to an online approach. As I overuse in most of my talks, they applied a leaner, more “agile” approach to designing the software and now “taxpayers have initiated over 400,000 sessions and made over $100M in payments after viewing their balance.”

If you’re really into this kind of thing, you should come to our free Pivotal workshop day in D.C., on June 7th. Mark Heckler and I will be going over how to apply “cloud-native” thinking, practices, and technologies to the custom written software portion of all this. Also, I’ll be speaking at a MeetUp later that day on the overall hopes and dreams of cloud-native, DevOps, and all that “agile” stuff.

The eternal battle for OpenStack’s soul will conclude in three years. Again – My May Register column

“What types of clouds are running OpenStack?” OpenStack Survey, 2017.

My column at The Register this month looks at the state of OpenStack. As Matt Asay better headlined it “CIOs may not want to build private clouds, but they are, anyway.”

Check out the piece!

IT’s usefulness is improving, but there’s plenty of room to fix the meatware, Surveys – Highlights

It’s another survey about business/IT alignment. Who knows how accurate these leadgen PDFs are, but why not? This one is of “646 CIOs and other IT leaders and 200 line of business leaders.” Some summaries from Minda Zetlin:

When LOB leaders were asked about the role their companies’ CIOs play, 41 percent said the CIO is a strategic advisor who identifies business needs and opportunities and proposes technology to address them. Another 22 percent said the CIO is a consultant who provides advice about technology and service providers when asked.

But 10 percent said their CIO was a “roadblock” who raises so many obstacles and objections to new technology that projects are difficult to complete. And another 9 percent said the CIO was a “rogue player,” with IT making technology decisions on its own, and creating visibility and transparency challenges.

Meanwhile, 36 percent of LOB leaders and 31 percent of IT leaders believe other departments “see IT as an obstacle.” And 58 percent of IT leaders but only 13 percent of LOB leaders agreed with the statement, “IT gets scapegoated by other departments when they miss their own goals.”

This seems better than the usual (kind of out of date) scare chart I used to use, from a multi-year Cutter survey:

There’s still, as ever, plenty of room to improve business/IT alignment.

Speaking of that, also in that IDG/CIO Magazine survey, there’s a weird mismatch between the perception of The Business and IT about what IT does:

What does The Business want anyway?

Meanwhile, Vinnie quotes a Gartner survey of 388 CEOs:

  • Almost twice as many CEOs are intent on building up in-house technology and digital capabilities as those who plan on outsourcing it (57 percent and 29 percent, respectively).
  • Forty-seven percent of CEOs are directed by their board of directors to make rapid progress in digital business transformation, and 56 percent said that their digital improvements have already delivered profits.
  • 33 percent of CEOs measure digital revenue.

Point being: The Business wants IT to matter and be core to how their organizations evolve. They want programmable businesses. Here’s some examples from another summary of that Gartner survey:

Although a significant number of CEOs still mention eCommerce, more of them align new IT infrastructure investments to advanced commercial activities – such as digital product and service innovation, exploring the Internet of Things (IoT), or adopting digital platforms and associated supplier ecosystems.

According to the Gartner assessment, some CEOs have already advanced their digital business agenda – 20 percent of CEOs are now taking a digital-first approach to business development. “This might mean, for example, creating the first version of a new business process or in the form of a mobile app,” said Mr. Raskino.

Furthermore, 22 percent are applying digital business technologies to their traditional processes. That’s where the product, service and business models are being changed, and the new digital capabilities that support those are becoming core competencies.

There’s demand there, the final result of “the consumerization of enterprise IT,” as we used to crow about. IT needs to catch up on its abilities to do more than “just keep the lights on” or there’ll be a donkey apocalypse out there.

You see people like Comcast doing this catching-up, very rapidly. The good news is that the software and hardware is easy. It’s the meatware that’s the problem.

Link

On-premise IT holding steady around 65% of enterprise workloads – Highlights


One of the more common questions I’ve had over the years is: “but, surely, everyone is just in the public cloud, right?” I remember having a non-productive debate with a room full of Forrester analysts back in about 2012 where they were going on and on about on-premise IT being dead. There was much talk about electricity outlets. To be fair, the analysts were somewhat split, but the public cloud folks were adamant. You can see this same sentiment from analysts (including, before around 2011, myself!) in things like how long it’s taken to write about private PaaS (e.g., the PaaS Magic Quadrant has only covered public PaaS since inception).

Along these lines, the Uptime Institute has some survey numbers out. Here’s some highlights:

Some 65% of enterprise workloads reside in enterprise owned and operated data centers—a number that has remained stable since 2014, the report found. Meanwhile, 22% of such workloads are deployed in colocation or multi-tenant data center providers, and 13% are deployed in the cloud, the survey found….

On-prem solutions remain dominant in the enterprise due to massive growth in business critical applications and data for digital transformation, Uptime Institute said.

Public cloud workload penetration:

Some 95% of IT professionals said they had migrated critical applications and IT infrastructure to the cloud over the past year, according to another recent survey from SolarWinds.

Budgets:

That survey also found that nearly half of enterprises were still dedicating at least 70% of their yearly budget to traditional, on-premise applications, potentially pointing to growing demand for a hybrid infrastructure….

Nearly 75% of companies’ data center budgets increased or stayed consistent in 2017, compared to 2016, the survey found.

Metrics, KPIs, and what organizations are focusing on (uptime):

More than 90% of data center and IT professionals surveyed said they believe their corporate management is more concerned about outages now than they were a year ago. And while 90% of organizations conduct root cause analysis of an IT outage, only 60% said that they measure the cost of downtime as a business metric, the report found.

Demographics: “responses from more than 1,000 data center and IT professionals worldwide.”

Pretty much all Pivotal Cloud Foundry customers run “private cloud.” Many of them want to move to public cloud in a “multi-cloud” (I can’t make myself say “hybrid cloud”) fashion, or mostly public cloud, over the next five to ten years. That’s why we support all the popular public clouds. Most of them are doing plenty of things in public cloud now – though not anywhere near “a whole lotta” – and there are, of course, outliers.

This does bring up a nuanced but important point: I didn’t check out the types of workloads in the survey. I’d suspect that many of the on-premises workloads are packaged software. There’s no doubt plenty of custom written applications running on-premises – even the majority of them, per my experience with the Pivotal customer base. However, I’d still suspect that proportionally more custom written applications are running in the public cloud than other workloads. Just think of all the mobile apps and marketing apps out there.

Also, see some qualitative statements from CIO types.

So, the idea that it’s all public cloud in enterprise IT, thus far, is sort of like, you know: ¯\_(ツ)_/¯

Reactions to Cloudera’s IPO, prospects – Notebook

There’s lots of opinions on Cloudera’s IPO today. Here’s some that I’ve collected in my notebook.

Not valued high enough?

Despite the share-price being up 20% at close, some negative commentary focuses on their valuation dropping from Intel’s funding round, e.g., from Brenon at 451:

The chipmaker paid up for the privilege, putting a ‘quadra unicorn’ valuation of $4.1bn on Cloudera. Altogether, Cloudera raised more than $1bn from private market investors, making the $225m raised from public market investors seem almost like lunch money.

And then there’s the small matter of valuation. In its debut, Cloudera is only worth about half of what Intel thought it was worth when it made its bet.

The counter-point goes a little something like this (as pointed out by Derrick Harris):

“Much has been made of the huge valuation of that Intel-led round, but that’s all misguided noise,” according to IPO Candy, a website founded by Kris Tuttle, the director of research at Soundview Technology Group. “Intel didn’t make the investment for a financial return so the valuation isn’t relevant.”

Back in 2014, Intel was still smarting from missing the shift to mobile computing and Big Data was a favorite as the next big thing. The Santa Clara chip giant’s bet was placed chasing a strategic return, not so much banking a direct return on investment.

You know, all of this is a little bit of ¯\_(ツ)_/¯. As I recall, Facebook’s IPO was all wiggly-woggly. If Cloudera makes a lot of money, gets bought for a lot of money, etc., no one will care to remember, just like with Facebook. Success is the best deodorant.

Their business, finances

Also from 451, earlier this month, a profile of their business:

Cloudera is nearly one-third bigger than Hortonworks, recording $261m in sales in its most recent fiscal year compared with $184m for Hortonworks. Both are growing at roughly 50%.

Since 2008, the company has grown steadily. As of January 31, it reports more than 1,000 customers. However, Cloudera is currently emphasizing and banking its success on what it calls the Global 8,000, which are the largest enterprises worldwide. The company notes that its number of Global 8,000 customers increased from 255 as of January 31, 2015, to 381 as of January 31, 2016, and 495 as of January 31. For the year ended January 31, the Global 8,000 represented 73% of Cloudera’s total revenue, while a further 10% of total sales came from the public sector. The company reports 1,470 fulltime employees as of January 31, a slight increase from its headcount of 1,140 the prior year.

More from Katie Roof at TechCrunch:

Cloudera’s market cap is now about $2.3 billion, significantly less than the $4.1 billion valuation Intel gave in 2014. This increasingly common phenomenon is now nicknamed a “down round IPO.”

In an interview with TechCrunch, CEO Tom Reilly insisted that this was not a problem for the company because of the “growth prospects ahead of us.” If it performs well in the stock market, it could ultimately achieve the $4 billion-plus value. Square, which went public in 2015 at half its private market valuation, has since seen its share prices more than double.

(Side-note: comparisons of companies, Square and Cloudera, that have nothing to do with each other except being “tech” – and Square is payment processing, not “pure tech,” at that! – drive me a bit crazy, as listeners know.)

And a quick revenue/spend write-up from her:

Cloudera’s revenue is growing, totaling $261 million for the fiscal year that ended in January. The company brought in $166 million at the same time last year.

Losses were $186.32 million, down from $203 million in the same period the year before.

And, according to Jonathan Vanian: “Cloudera spent $203 million on sales and marketing in its latest fiscal year, up 26% from the previous year.”

TAM

I don’t really follow this space well enough anymore to quickly figure out the TAM: I suspect Cloudera operates in several data and BI related ones.

Cloudera isn’t only Hadoop, but 451 put the Hadoop market at $1.3b in 2016, growing to $4.4b in 2020, with a CAGR of 38.3% between 2015 & 2020.
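As a quick sanity check on that compound-growth math, here’s a minimal sketch; the `cagr` helper is my own, not from 451’s report, and their 38.3% figure is computed over 2015–2020, so a 2016-to-2020 run comes out a bit lower:

```python
# Sanity-checking 451's Hadoop-market growth figures, assuming simple
# annual compounding. The $1.3bn (2016) and $4.4bn (2020) numbers are
# from the report as quoted above.
def cagr(start_value, end_value, years):
    """Compound annual growth rate between two values over `years` periods."""
    return (end_value / start_value) ** (1 / years) - 1

# 2016 -> 2020 is four compounding periods.
print(f"Implied CAGR 2016-2020: {cagr(1.3, 4.4, 4):.1%}")
```

That works out to roughly 36%, plausibly consistent with the quoted 38.3% over the slightly longer 2015–2020 window.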

If you throw data warehousing, BI, analytics, and an injection of the mega-databases TAM together, you get a really big TAM, anyhow. Keep in mind though that one of the traps of (definitionally orthodox) disruptors in this space is lowering the TAM of their respective markets, a la Red Hat in operating systems. I don’t get the sense that Cloudera is on that game plan, but others in the market might be.

Buyers’ plans & needs

With respect to what people would do with Cloudera and others in this space (including Pivotal), here’s a good ranking of the information infrastructure priorities Gartner recently found in enterprises:


Also of public/private cloud interest from the summary of that survey: “Based on survey responses, plans for on-premises deployments for production uses of data will drop from today’s 45% to 14% in 2018.”

Looking forward

People in the tech industry care a great deal about IPOs like this. We’re all curious what The Market’s read on enterprise IT business models is, both for our own benefit and for a general sense of the health of the sector. There are also usually people you know at the company, so “yay” for people you know.

One day isn’t long enough to tell anything, though; cf., in a completely different space, that Facebook debut weirdness. People got all excited about Cisco buying AppDynamics because it seemed to show some “healthy” signs that the money people valued this kind of software/SaaS.

At any rate, people still seem to love the Big Data and such. From Cloudera’s CEO, Tom Reilly: “We’re competing with IBM and Watson, so our customers seeing the strength of our finances allows us to do more.” Think of all the free marketing!

And, Mike Olson (original CEO) adds:

The ensuing years have been remarkable. Our company has grown with the market. The original technology has morphed almost beyond recognition, adding real-time, SQL, streaming, machine learning capabilities and more. That’s driven adoption among some of the very biggest enterprises on the planet. They’re running a huge variety of applications, solving a wide variety of critical business problems.

Our early bet has proven correct: Data is changing the world. In applications like fraud detection and prevention, securing networks against cyberattacks and optimizing fleet performance in logistics and trucking, we’re delivering value. We’re helping to address big social challenges, improving patient outcomes in healthcare and helping law enforcement find and shut down human trafficking networks.

Against that background, an IPO takes on a more appropriate scale. We started Cloudera because we believe that data makes things that are impossible today, possible tomorrow. There’s more data coming, and there are plenty of impossible things to work on. Our journey is only well begun.

I admittedly don’t know Cloudera’s business model too well, but my sense is that they align well with the “have something to sell” model that many open source companies in the enterprise space forget to put in place.

Are tech H-1B visas actually that big of a deal? How do we even evaluate the question?



Over the decades, the number of H-1B workers allowed into the US each year has grown. With the 1998 update, the visa cap lifted to 115,000. In 2000, the limit was boosted again, this time up to 195,000. That year, the law was also tweaked so that renewals no longer counted toward the cap. In 2004, the cap was reset to 65,000, but an exemption was added for 20,000 students graduating from US institutions with master’s degrees. Exemptions were also added for workers affiliated with academic institutions, which can include schools and teaching hospitals. According to Ron Hira, a professor of Public Policy at Howard University who has studied the H-1B issue and testified about it before the Senate, the actual number of visas handed out each year has been around 135,000 over the last five years. Link

There’s a good rant on the relative importance of all of this in last week’s Political Gabfest. While we “on-shore” workers in the tech industry may see that 135,000 as a threat to our cashflow, it’s a drop in the bucket of employment in America. As Adam Davidson argues well, therefore, worrying about H-1B visas should be pretty low on the list of ways to set more people up with good jobs:

The question of H-1B visas has rhetorical importance far beyond its actual economic relevance. The unemployment rate for computer and mathematical occupations is, currently, 2.1 per cent. This is what economists consider full employment, meaning that pretty much everyone who wants a job has a job or is in a brief hiatus between positions. The number of jobs in those fields is growing fast—by about twelve per cent a year—and the number of qualified workers is not growing enough to catch up. In short, the plight of computer professionals is on few people’s list of urgent concerns…. According to the Bureau of Labor Statistics, ten thousand computer professionals start a new job every working day. In this context, the eighty-five thousand foreigners given H-1B visas each year represent little more than statistical noise.

He goes off on a political jag after this, saying that the H-1B discussion is a proxy for “fear of brown people,” which certainly has appeal to leftist people like myself. There’s a business question here, too, though: are H-1B visas a good idea and why? Are they ethical and effective?

What types of jobs?

Also, there’s some interesting analysis of the types of jobs H-1B visas are used for. Mostly, it’s jobs at outsourcing firms:

But it’s how H-1B visas are being used by applicants that’s really changed. Data from the 2016 batch of H-1B petitions show that the top 10 sponsors of H-1B visa workers in the US are all corporations with large outsourcing businesses: Indian companies like Infosys, Tata, and Wipro, which pioneered the business, and US-based firms like IBM, Accenture, and Cognizant, which saw the success of the Indian contractors and began offering their own competing outsourcing programs. Those 10 firms have more workers currently employed through the program than the next 90 companies combined, a group that includes all of America’s largest tech companies and banks.

So, the discussion about H-1B visas in tech is, in bulk, about the 60,000+ jobs in IT outsourcing. This is in addition to the estimated 1.7m off-shore jobs already in outsourcing.

In theory, most of these are “lower value” jobs where you’re more operating IT (help-desks, managing the daily operations of enterprise applications) rather than creating it (like programming). Anecdotally, there’s still programming running around in there, esp. when it comes to modernizing applications. The going theory is that you can’t just slot in workers on higher-value IT work like writing custom software.

How do you think about all this?

There’s an odd ethical vs. business-sense argument scurrying about as well that I’ve never seen addressed. One, you’d think you’d be happy that the H-1B visa worker is getting work. By nature of accepting the job and uprooting themselves, it must be good for them: or, at least, better than the alternatives. Also, if it’s actually cheaper to get the same services/output from an H-1B visa worker, why would you pay more for a “native” worker? On the other hand, it’s equally confusing to figure out what companies “owe” the workers they’re firing in favor of H-1B visa workers.

Tech companies like to skirt all that by talking about how “we have to hire from a global pool,” which is fine if you’re hiring an individual with unique skills. However, the divide between outsourcing firms and tech companies suggests that the bulk of H-1B visa hires in tech are not for the super-unique AI experts who may not live on-shore. Then again, it’s insulting to even think that way: why would I value one set of people over another in any context?

Businesses say they’re not satisfied

However we figure out how to talk about this, it’s clear from surveys that companies are dodgy on the value of outsourcing. As I summarized in some HfS work recently:

Outsourcers too often do exactly what the contract (from five to ten years ago) says instead of helping you innovate and keep the business growing. Itʼs little wonder that in a recent study, more than 75% of senior executives said they want to replace their legacy outsourcers because those providers are so unwilling to change to new models.


If we take Adam Davidson’s perspective, it’s not really even a problem worth thinking about (versus all the other hair-on-fire issue we have). However, when it comes to outsourcing (which I’ve shifted to because so many H1-B visa workers end up at outsourcing firms), it’s clear that we could be doing much better.