Companies that lose billions have a hard time being successful

How are all these unprofitable companies sustaining high valuations:

Bending reality today has three elements: a vision, fast growth, and financing.

But:

A few firms other than Amazon have defied the odds. Over the past 20 years Las Vegas Sands, a casino firm, Royal Caribbean, a cruise-line company, and Micron Technology, a chip-maker, each lost $1bn or more for two consecutive years and went on to prosper. But the chances of success are slim. Of the current members of the Russell 1000 index, since 1997 only 37 have lost $1bn or more for at least two years in a row. Of these, 21 still lose money.

Source: Schumpeter: Firms that burn up $1bn a year are sexy but statistically doomed

In finance, large banks seem to be fast followers, not disruption victims

Eventually every advisor will be a robo-advisor, which means there will be convergence.

Without some market-share numbers, it’s tough to tell if the banking startups are making a dent against incumbent banks. Josh Brown suggests that banks are quick to catch up and have nullified any lead that companies like Wealthfront could have made:

It wasn’t long before the weaker B2C robo-advisors folded, the middling players were acquired and the incumbents launched their own competing platforms. The miscalculation on the part of the disruptors may have been the idea that they had years of lead time to scale up their assets before the lumbering giants of the industry would be able to fight back. Turns out they only had months, not years. Charles Schwab and Vanguard launched their own versions of the service and the mindshare / market share battle was joined.

Checks out

From what I see out there, banks are quick to adapt and adopt new ideas into their businesses. While they’re beset with endless legacy IT and technical debt, they churn ahead nonetheless, e.g.:

While past performance is no guarantee of future results, and even though all the company’s results cannot be entirely attributed to BBVA’s digital transformation plan, so far many signs are encouraging. The number of BBVA’s digital customers increased by 68% from 2011 to 2014, reaching 8.4 million in mid-2014, of which 3.6 million were active mobile users.

Acquisition isn’t always failure, or victory

On the narrative-framing side, it’s easy to frame a startup being acquired as a “failure” and a victory for incumbents. That’s not always the case, and such framing suggests a zero-sum view of innovation in industries. Acquisitions can have winners and losers: as with valuing anything, like real estate, the valuation could be wrong and favor either the buyer or the seller.

However, in the ideal case of an acquisition, it makes strategic sense for the buyer to spend their time and money that way instead of trying to innovate on its own. For the startup being acquired, they’re usually near the end of their gamble of sacrificing profit in favor of innovation and growth, and need someone to bring them into the black.

Link

Vanguard’s thinking on microservices

Breaking up the monolith with good, old-fashioned OO-think:

Instead, Vanguard has begun a journey to break apart our monolithic legacy systems piece-by-piece by replacing them with microservices over time. With a microservices architecture, we remove the business logic and data logic from our applications and replace it with a set of re-usable modules of code that are built and deployed as independent entities. We then complement this architecture by chunking out our user interfaces into modular purpose-built components.

De-coupling for stability and resiliency, among other things:

This service-based approach to application architecture provides a variety of advantages over the jumble of code that defines a non-modular monolithic application. First, services reduce redundancy by making sure there is only one copy of application logic for a given capability – regardless of how many applications leverage that logic. In the long run, this leads to lower development costs and increases speed to market. Second, since these services are deployed independently and built in a resilient manner, outages in one area of an application are less likely to bring down an entire system. In some instances, several of our services can be down without our clients being aware of a loss in functionality thanks to the ability of our applications to automatically react to a service that isn’t available. Finally, services enable our applications to scale easier. The marriage of cloud and services means we can quickly spin up infrastructure to handle surges in the number of transactions we need to handle without needing to scale up an entire application.
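To make the resiliency point concrete, here’s a minimal sketch (not Vanguard’s actual code; the service URL and fallback payload are hypothetical) of what “automatically react to a service that isn’t available” can look like from the calling application’s side: wrap the service call and fall back to a degraded default instead of failing the whole page.

```python
import json
import urllib.error
import urllib.request

# Hypothetical internal endpoint for a re-usable "balances" service; not a real Vanguard API.
BALANCES_SERVICE_URL = "https://services.example.internal/balances/v1/accounts/42"

def get_balances(timeout_seconds: float = 0.5) -> dict:
    """Call the balances service; degrade gracefully instead of failing the whole page."""
    try:
        with urllib.request.urlopen(BALANCES_SERVICE_URL, timeout=timeout_seconds) as resp:
            return json.load(resp)
    except OSError:
        # Covers connection errors and timeouts (urllib.error.URLError is an OSError).
        # The caller renders a placeholder instead of taking the entire application down.
        return {"available": False, "balances": []}
```

Real implementations typically layer circuit breakers and caching on top of this, but the decoupling idea is the same: each caller owns its own degraded behavior.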

Vanguard CIO: Why we’re on a journey to evolve to a microservices architecture

How JPMC is making IT more innovative with PaaS, public and private


A good, pretty long overview of JPMorgan Chase’s plans for doing cloud with a PaaS focus. Some highlights.

More than just private-IaaS and DIY-platforms:

Like most large U.S. banks, JPMorgan Chase has had some version of a private cloud for years, with virtualized servers, storage and networks that can be shared in a flexible way throughout the organization.

The bank is upgrading its private cloud to “platform as a service” — in other words, the cloud service will manage the infrastructure (servers, storage, and networks), so that developers don’t have to worry about that stuff.
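As an illustration of what “the cloud service will manage the infrastructure” means from a developer’s seat, here’s a generic, hypothetical sketch (not JPMorgan Chase’s stack): the app binds to whatever port the platform hands it through the environment and leaves servers, storage, and networking to the platform.

```python
import os
from http.server import BaseHTTPRequestHandler, HTTPServer

class HelloHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write(b"hello from the platform\n")

if __name__ == "__main__":
    # A PaaS typically injects the port via the environment; the developer never
    # provisions the servers, storage, or network the process runs on.
    port = int(os.environ.get("PORT", "8080"))
    HTTPServer(("0.0.0.0", port), HelloHandler).serve_forever()
```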

On the multi-/hybrid-cloud thing:

By the second half of 2017, the bank plans to run proprietary applications on the public cloud. At the same time, it’s building a new, modern internal cloud, code-named Gaia.

While “hybrid-cloud” has been tedious vendor-marketing-drivel over the past ten years, pretty much all of the large organizations I work with at Pivotal have exactly this approach. Public, private, whatever: we want to do it all.

Shifting their emphasis to innovation:

“We aren’t looking to decrease the amount of money the firm is spending on technology. We’re looking to change the mix between run-the-bank costs versus innovation investment,” he said. “We’ve got to continue to be really aggressive in reducing the run-the-bank costs and do it in a very thoughtful way to maintain the existing technology base in the most efficient way possible.” …Dollars saved by using lower-cost cloud infrastructure and platforms will be reinvested in technology, he said.

On appreciating the scale of “large organizations,” which drives their very real challenges in adopting new ways of running IT:

The bank has 43,000 employees in IT; almost 19,000 are developers.

Good luck having the “we have no process by design” process with that setup.

On security, there’s a nice, almost syllogistic re-framing of “cloud security” here:

For years, banks have worried about using the public cloud out of security concerns and fears of what their regulators will say. Ever since the 2013 Target data breach, in which hackers stole card information from 40 million customers by breaking into the computers of an air conditioning company Target used, regulators have strongly urged banks to carefully vet and monitor all third parties, with a specific focus on security.

“We’re spending a significant amount of time to ensure that any applications we choose to run on a public cloud will have the same level of security and controls as those run internally,” Deasy said.

Most notable corporate security breaches over the years have involved on-premises IT (like the HVAC example above). The point is not to make sure that “cloud is as secure as [all that on-prem IT that’s been the source of most security problems in the past],” but to make sure that all IT has a rigorous approach to security. “Cloud” isn’t the security problem; doing a shitty job at security is the security problem.

Source: Unexpected Champion of Public Clouds: JPMorgan CIO Dana Deasy, Penny Crosman, American Banker

Autonomy quarter stuffing

When Autonomy was negotiating a sale to an end user, but couldn’t close the sale by quarter’s end, Egan would approach the resellers on or near the last day of the quarter, saying the deal was nearly done. Egan coaxed the resellers to buy Autonomy software by paying them hefty commissions. The resellers could then sell the software to a specified end user – but Autonomy maintained control of the deals and handled negotiations with the end user without the resellers’ aid. There’s no way these transactions could be revenue.

Link

Allianz now deploying to production in minutes

By changing its development practices and investing in a private cloud platform as a service, there have been clear benefits to the business. “Historically it would take two or three days for a deployment to go to production, with lots of manual production. Now with the apps in the garages we can do it on the basis of Cloud Foundry within minutes.”

Source: Allianz app deployment goes from ‘days to minutes’ with PaaS and agile practices

Adding more individual controls to debit cards

Some interesting ideas to improve debit (and, one’d presume, credit) cards with software-driven features:

CardGuard provides additional protection against fraud, since customers are able turn their debit card “on” or “off.” When the card is “off,” no withdrawals or purchases will be approved, with the exception of previously authorized or recurring transactions. Additionally, transaction controls can be set according to location, meaning transactions attempted outside of the geographic parameters set by the customer will be declined.

Also:

With CardGuard, customers are able to better manage their spending by establishing limits for debit card purchases based on the amount of the transaction. Additional controls can be set to manage spending in different categories by enabling or disabling transactions for certain merchant groups, such as gas, grocery or retail stores.
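To make the mechanics concrete, here’s a small, hypothetical sketch of how controls like these could be evaluated at authorization time: the card’s on/off switch, a geographic allow-list, a per-transaction amount cap, and blocked merchant categories each get a veto. None of this is First National Bank’s actual implementation; the field names and example values are made up.

```python
from dataclasses import dataclass, field

@dataclass
class CardControls:
    card_enabled: bool = True                                          # the "on"/"off" switch
    allowed_regions: set = field(default_factory=lambda: {"US-TX"})    # hypothetical geographic parameters
    max_amount: float = 500.00                                         # per-transaction spending limit
    blocked_categories: set = field(default_factory=lambda: {"gas"})   # merchant groups turned off

def authorize(controls: CardControls, amount: float, region: str,
              category: str, preauthorized: bool = False) -> bool:
    """Return True only if the transaction passes every customer-set control."""
    if not controls.card_enabled and not preauthorized:
        return False  # card is "off"; only previously authorized/recurring transactions go through
    if region not in controls.allowed_regions:
        return False  # outside the customer's geographic parameters
    if amount > controls.max_amount:
        return False  # over the per-transaction limit
    if category in controls.blocked_categories:
        return False  # merchant group disabled by the customer
    return True

# A $60 in-state grocery purchase is approved; a gas purchase is declined.
controls = CardControls()
print(authorize(controls, 60.0, "US-TX", "grocery"))  # True
print(authorize(controls, 40.0, "US-TX", "gas"))      # False
```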

Source: First National Bank app comes with debit card controls

How Asset Managers Can Succeed with Advanced Analytics

A nice overview of what you’d use analytics for in investing:

Armed with these advanced techniques, digitally forward asset managers can gain a significant information advantage over peers who rely mainly on traditional data sources and analytical practices. They can crunch through vast quantities of data; scour video and satellite imagery to gauge a retailer’s Black Friday prospects; extract insights from social media, texts, and e-mail to divine market sentiment; and parse a CEO’s comments during an earnings call to estimate the potential impact on the next quarter’s results. They can discern how unexpected weather disruptions might affect their portfolio, and even disprove long-held beliefs about how markets work. Smart, dynamic investment technology also helps managers assess their own performance to see whether they may be making the right decisions at the wrong times, buying too late, or listening to “influencers” who push them in the wrong direction.
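As a toy illustration of the “parse a CEO’s comments during an earnings call” idea, here’s a hypothetical, keyword-based sentiment score over a transcript. Real asset managers would use trained NLP models and far richer data; the cue-word lists and sample remarks below are made up, but the shape of the signal extraction is similar.

```python
# Toy sentiment scoring for earnings-call remarks; the cue-word lists are illustrative only.
POSITIVE = {"growth", "record", "strong", "exceeded", "momentum"}
NEGATIVE = {"headwinds", "decline", "miss", "weak", "impairment"}

def sentiment_score(transcript: str) -> float:
    """Return a score in [-1, 1]: +1 if all cue words are positive, -1 if all are negative."""
    words = [w.strip(".,!?").lower() for w in transcript.split()]
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    total = pos + neg
    return 0.0 if total == 0 else (pos - neg) / total

ceo_remarks = "We saw record growth this quarter despite currency headwinds."
print(sentiment_score(ceo_remarks))  # ~0.33: two positive cues, one negative
```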

There’s also a good overview of how to introduce new approaches like the above into the organization without big-bang projects, which are likely to fail:

In experimenting with new technologies, firms should prioritize a small number of targeted initiatives that can deliver immediate tangible benefits in a focused, resource-­constrained way. In doing so, they should resist two temptations: the “esoteric science experiment,” whose focus is so narrow that the initiative can’t yield scalable results; and the “big bang rollout,” whose scope is so ambitious that it has a daunting price tag and takes too long to demonstrate value.

Source: How Asset Managers Can Succeed with Advanced Analytics