Coté

It takes a village to make a king, not just developers

This is an example of a weird use of “developers are the new kingmakers”:

There is a simple rule: developers rule. You run the global economy, right? Not politicians, not CEOs - developers are the ones running this global economy. What aspect of your life is not getting more digital? Everything, sports, entertainment, social experience or health, everything is becoming more digital. It’s a foundational aspect of all economy and human experience. We’re just seeing everything become a computer, replacing industries like oil that defined geopolitics for five decades. It’s now silicon and the technology supply chains that it enables.

Here, you have a computer hardware (chips) company saying that developers are the kingmakers and, I guess, the continual power-source of the kings. I didn’t watch the keynote, I’m just taking the idea in that text and commenting on it. Maybe Intel is on about something else. But I want to look at this idea of “developers as kingmakers,” apart from whatever is going on at Intel.

A Runaway Idea

I’ve recently been uncomfortable with how much people latch onto the idea of developers as king-makers. This is despite my having worked at RedMonk, the firm that was the main source of that idea back in the 2000s. Stephen’s book has several case studies of developers playing huge roles in technologies “winning.” If you work in any business that works with developers, then: YES!

BUT. For what feels like over a decade now, the importance of developers has gotten way overblown, misused to oversimplify how things actually work.

Related to the above quote, let’s look at the hardware case. This is an area I have firsthand experience with from when I did corporate strategy at Dell, covering software and cloud. Dell was super interested in the importance of developers. I did many, many reports, slides, big meetings, and studies about developers. Dell is still very interested in developers! They’ve obviously funded that work an incredible amount more than when I was there.

Hardware without software is useless

Here is where it starts with a hardware company: Hardware without software is useless, it just sits there.

So, someone - more likely, some software company - has to write and provide/sell software to run on that hardware.

How is software made? Well, people program it. These people are…developers. But they’re not the only ones involved.

Yes, and: there are also product managers; testers; platform engineers/DevOps engineers; ops/infrastructure people who either run it or run the labs and equipment that everyone uses; designers; general managers of business units that trickle down strategy from the corporate layer; the corporate layer itself, which is making all sorts of choices about which technology to use and not use (see the Adobe, Apple, and Facebook examples below); and then there are legal and regulatory requirements.

Also, nowadays, you have to do a lot to manage culture and overall worker happiness. If you read through the past several years of developer productivity metrics studies, you’ll realize that management’s ongoing “programming” of corporate culture is a critical part of the success of software as well.

On the buyer/user side: there are the people who select which software to buy; the people who procure it; the people who install and integrate it; the people who administer it on an ongoing basis; and then the business side at the buyer that determines how it’s used and for how long. Also, there’s all sorts of cost/benefit analysis: is it worthwhile to upgrade our old stuff, or, like, do anything new?

With respect to hardware, there are strategic decisions like running on your own hardware (private cloud, edge) or in the public cloud. There are also just people who decide they like AMD versus Intel versus whatever. You have security people who will decide which hardware and software to get based on the reputation of the vendors - real and perceived. You have computer makers and hyperscalers (Dell, Apple, Facebook) who are building hardware and deciding which chips and other things go in there, or just licensing(?) ARM and making their own. Maybe they don’t like Intel’s pricing, so for two years they’ll go with AMD just to stick it to Intel. Maybe they choose Nvidia because, like, it’s the best technology. Perhaps there are licensing and patent barriers that force people into choices.

(I don’t think there’s much choice in storage - it’s all pretty much the same, right? A commodity. I’m pretty sure developers don’t king-make hard drives. Sure, they gave SSDs an early boost, but, like, SSDs are just better, regardless of who’s deciding to buy them. You could argue that for the SSD market to exist, you needed the early developer market to buy the expensive SSDs before there was enough scale to build more and lower the prices…but…really? I don’t have the numbers for this, but I feel like if you took some contemporary estimates of the developer population, reduced it to those who could decide to buy SSDs, and figured out the spend, it’d be pretty small.)

Then there’s also marketing - especially brand marketing - and lower-level marketing: the people who put together giant keynotes and events. You’ve got the people who write documentation, manuals, and training, staff Stack Exchange, and garden not only developer communities, but also the communities for all the roles above.

You have people who manage channels and partners (at Apple, someone has to manage the relationship with all those phone stores and carriers to get favorable deals and make it all possible - if you could only buy iPhones directly from Apple…how would that have worked out? And see the partner/channel hijinks in RIA below!).

And so on. And so forth.

Are the needs of developers taken into account for all of this? Sure. The hardware has to work with the software. Your developers need to understand it, sometimes even like it all. If the organization wants to use Kubernetes, your hardware needs to work well with Kubernetes and container-based architectures. These can be make or break considerations.

But so are all the other considerations and roles above. Often, developers have to fit their skills, desires, and work into the constraints that this whole system drives. It’s certainly not the case that there’s some kind of “great developer of history” theory of IT.

Developers are required, but so are all the other people, roles, and strategic considerations above.

Sure, you can have “indie” developers that collapse all these roles into one or two skin-bound entities, but all of those non-developer functions need to happen.

And, further-sure, if you’re talking about products and services that you sell to developers, then developers have a key, often king-making role…just as any customer for a product or service does.

There are many technologies that have won - or lost - because of non-developers.

This must especially be true lower down the stack, that is, with hardware. I feel like decisions about which software to support, the amount of brand management, marketing, design of the hardware, pricing, and getting your supply chain right have much more influence over the success of hardware companies than developers.

In other words, you should make good hardware that is sold at a reasonable price that solves the problem the buyer has. (I should put that in an airport business book!)

The Story of Flash

Here is a case that I lived through: Rich Internet Applications. In the 2000s, using HTML (web pages) for the kinds of applications we use now was magic. When Gmail came out, few had ever seen an application like that work in the browser. It was hard to make rich, GUI-like applications for the web.

Meanwhile, browser plugins had been getting around this problem by not using HTML. Chief among them was Flash, from Macromedia, which was acquired by Adobe. As Flash evolved past delivering delightful cartoons with limited user interaction, it allowed you to have desktop-like applications (GUIs) in the browser. I’ve used many of them, especially enterprise software systems.

People - users - didn’t always like them, but then some people did like them (and some people nowadays have nostalgia eyes for them). As ever with software, it wasn’t the technology (Flash), it was how well you did the screens, “UX,” whatever.

In the mid-2000s, there was a long game/war between major software vendors to win the emerging smart phone market, which actually meant winning the new PC platform. Microsoft wanted a version of Windows on smart phones. Adobe wanted a version of Flash on there. Sun wanted Java on there. There was also BlackBerry! Winning the smart phone market was incredibly important because…well…look at the value of Apple today!

Apple shocked everyone (well, I think? Me at least) when it launched the iPhone. Adobe tried to get Apple interested in Flash. They developed a whole other framework called AIR that was, like, Flash++ (that might have been after Apple’s rejection of Flash, I forget). They made partnerships with other phone manufacturers, streaming services (I think Netflix streaming used Flash originally, for a long time, right?), and so forth. But, at some point, Steve Jobs wrote a memo where he was, in summary, all like “Flash sucks.”

And that was it for Adobe! Did a bunch of developers petition Steve Jobs to do that? I don’t know the history, but I’m guessing not. I’m guessing Apple (1) strategically was all “why would we let someone else own part of our fully vertical stack, are you insane?”, and (2) there always seemed to be a super-weird relationship between the two companies.

Also, people argued that Flash was just not good enough for the task. It was technically inadequate. You could say “developers decided that,” but they would have decided that because it was the bad option. Imagine the case where, instead, developers chose Flash even though it was the worse option. They would have king-made Flash, but made the wrong choice. We wouldn’t be celebrating that.

Oh, and also: maybe it was just licensing. Maybe Apple and the handset carriers didn’t want to pay Adobe to include Flash in their stack. Flash could have been awesome - developers could have been clamoring for it! - but if the handset providers (with respect to Adobe, in strategy terms, the partners/channels) refused to pay Adobe and shipped something else instead…well, then you have Android (right?). As mentioned above, for Apple, it was an obvious strategic choice to make their own platform, regardless of what developers wanted.

Adobe tried to find traction for Flash and AIR for many more years, but eventually it just sort of died off.

Here is a sort-of contrast to the Flash story. Meanwhile! Facebook was trying to figure out how to do mobile apps. At one point Mark Zuckerberg got really into HTML5 as their app framework. The idea was that you should use the open web for apps. Strategically, this is similar to Apple rejecting Flash: using the open web meant Facebook had very few ties and dependencies to any one company. This is, you know, very different than how Facebook and the entire web act now - a closed-off web with gates made of network effects and user preference.

(You can also see, in recent history, why it was bad for Facebook to have dependencies on Apple. Apple changed how tracking for advertising was done a year or so back, and this caused all sorts of problems for Facebook and others who’d been using those capabilities to track and target ads. I think Facebook figured it out, but, like, bummer on their cashflow!)

Eventually, Facebook was like, nope, we’re making native iPhone and Android apps. They still have HTML ones as well.

Here, you can see some developers contributing to this, but like, a handful of developers at Facebook, probably - not, you know, the millions of developers out there in the world. Also, I’m pretty sure it was a strategic choice.

Yeah, but…

First, I could be missing a major point in the smart phone cases above. Once Apple put its own app platform in place, it needed developers to actually write apps for it. And, like: yes! But, let’s go back to that laundry list of what goes into making software. Apple needed the software-producing industry to make apps that ran on the iPhone…not just developers.

The container wars are another thing I don’t have sorted out in this “developers are just part of it all” thinking. Developers can play a large role in the success of technology. I’ve seen this first hand recently with the rise of Docker and then Kubernetes. But…to me, this is something more of a perception and business strategy “war.”

The actual usage of containers in production is incredibly low compared to how much we talk about “cloud native” and Kubernetes.

By 2025, 15% of enterprise applications will run in a container environment, up from 5% in 2022, hampered by application backlog, technical debt, skills availability and cultural change.

–From a recent “Gartner Forecast Analysis: Container Management (Software and Cloud Services), Worldwide.”

Also, the market (how much money people pay) for container management is incredibly small as well. And then there’s the bizarre thing where the Kubernetes people keep saying that Kubernetes is not for developers: you know, a platform for building platforms and all that.

I don’t have this whole container wars thing figured out, but I suspect there’s a lot more going on than just developer preference.

There’s a lot in play: the cost of Kubernetes (free to download and install “the hard way”); the brand-glow of Google (and then Red Hat jumping in and acting as a kingmaker and channel for it); the missteps of competitors (usually understandable without reverse Halo Effect vision, just like Adobe and Facebook above); the price of alternatives (Pivotal Cloud Foundry is/was expensive compared to free; OpenStack sort of lost the plot); Docker’s strategy waffling and flameout; HashiCorp somehow not getting a foothold in platforms despite its foothold in other areas; and then all the existing VM-based “platforms” from the public cloud vendors to VMware (where I work), Microsoft, etc.

Also, there’s endless castle intrigue and rumors about people doing plain old bickering, acting on grudges, and all that petty bullshit that drives history. I didn’t see any of that firsthand, so I have no idea - but I’d love to know!

Again, we see an incredibly complex system where the part one role plays - developers - is important, but no more important than the other roles.

(Oh, and then, how about OpenStack as a case from the other direction: what made that not work out? Further, was it the developers in telcos that made, and continue to make, OpenStack successful there? Or maybe OpenStack is just fine?)

Perhaps you could say that developers can be one of the early gatekeepers of a technology. They can decide whether the technology gets past the early parts of the diffusion of innovations curve. But, whether early on or further along the curve (especially the fat part in the middle), so can every other role in the big system.

Pricing and procurement can be a king-maker: if the software is expensive and hard to buy, you’ll have problems…well, not exactly…you could hire an enterprise sales force and sales engineers that can support this long process, do the marketing and product management needed to justify a high price, and then increase the prices to support the whole, mostly self-inflicted hassle of enterprise sales. That all works perfectly in the ERP world: witness Oracle, Salesforce, SAP, etc.

(Another side case that’d be fun to ponder: when I was at RedMonk, SAP cared a great deal about developers. Did all that effort matter? Was it make-or-break for SAP’s continued success? SAP was also heavily aligned with Adobe’s RIA stuff above.)

It takes a lot more than cheese to make cheese enchiladas

To say that “developers rule” is like saying that the cheese makers rule when it comes to cheese enchiladas, or even queso. The cheese is important, but the decision process and hands-on work that leads up to putting a chip in queso is so complex, and involves so many people and, most importantly, user/eater preferences, that to say the cheese maker rules is weird. Maybe the kingmaker is the proper mixture of spices, getting a ripe avocado, or deciding whether to add chili con carne (and is it ground beef or brisket?) - I like a lot of pico de gallo myself. But, then, procurement-wise, sometimes you don’t have the time and money, so you just melt some Velveeta and you’re perfectly satisfied - convenience is the kingmaker. (Is the developer the queso eater here, or the maker? I’ve lost track.)

Similarly, there are so many things, people, decisions, tricks of fate, grudges (I’ve heard!), etc. involved in making, buying, and using software that to assign an outsized amount of importance to developers is…weird.

Many of the people and roles involved in the software lifecycle can play a pivotal part in it - developers certainly can! But people making strategic choices can also have make-or-break control (Steve Jobs and Mark Zuckerberg above, or the strategic choices they make at Concur to not update their software, which are, I assume based on their software’s UX, very much in favor of controlling costs and driving profit since they have mental lock-in in their market - from what I can tell).

A designer who comes up with the best way to do a workflow for a user can have a make-or-break effect - they could use “dark patterns” to drive revenue that keeps the company alive, allowing the developers, and everyone else, to keep doing their job. Someone in billing could just decide that you have to call (on the phone!) to cancel your subscription.

Despite all that, it’s still a big, important idea

At the time, the point Stephen was making was incredibly important because most people didn’t think about developers as having much of any role. Sure, there was the mythical idea of a garage full of app developers, but from what I recall, caring about developers wasn’t a big enough part of the conversation.

It was weird. You had that whole “developers! developers! developers!” thing. And Linux and the broader open source world were driven by developers (as cataloged by Stephen).

But, still, thinking about developers strategically was only done by a select few, like Microsoft. Now, most everyone considers the developer. And this is fantastic! YES TO THAT. We do, after all, need software.

I just think, you know, let’s focus on all the parts, not overly on just one: developers.

So, going back to the original quote at the top, I think what he means is “software.” Software is, like, pretty important nowadays.

And, sure, developers program software, but there are so many other factors and roles involved in the life-cycle of software that also play a critical role (see above!) that, you know, developers, just like the cheese maker, are only one part of making a good bowl of queso.

Speaking of developers…


Next week, on October 17th, a whole passel of my teammates and I are hosting an online SpringOne Tour. It’s free to attend, of course. If you can’t make it to one of our in-person events, check this one out. There are over 20 talks you can choose from, including mine on platforms.

If you’re a programmer - especially a Java programmer - or doing anything with cloud native apps, there’s something in it for you. Register for free, and check it out on October 17th.


Back when I was king-making.

Logoff

I got access to the ChatGPT conversation feature this weekend. It is pretty amazing, really. I’ve used it for my D&D messing around, and a little for other things: I’m thinking through a method of collaborative storytelling that could make it a better ChatDM. And, with the conversation thing, you could do it while walking the dog, washing the dishes, etc. I still don’t think it could do D&D combat, but I might be able to finally figure out how to get it to be fun.

@cote@hachyderm.io, @cote@cote.io, @cote, https://proven.lol/a60da7, @cote@social.lol