HSBC’s Google Cloud use

A brief note, from William Fellows at 451, on HSBC’s use of Google Cloud’s big data/analytical services:

They have a lot of data, and it's only growing:

6PB in 2014, 77PB in 2015 and 93PB in 2016

What they use it for:

In addition to anti-money-laundering workloads (identification and reducing false positives), it is also migrating other machine-learning workloads to GCP, including finance liquidity reporting (six hours to six minutes), risk analytics (raise compute utilization from 10% to actual units consumed), risk reporting and valuation services (rapid provisioning of compute power instead of on-premises grid).

As I highlighted over the weekend, it seems like incumbent banks are doing pretty well with all this digital disruption stuff.

Source: HSBC taps Google Compute Platform for Hadoop, is ‘cloud first’ for ML and big data

Pivotal Conversations: Bringing Agility to Enterprise Data Workflows, with Sina Sojoodi

The summary:

This week we talk about how organizations are increasingly looking to improve how they use data, and the workflows around data, to innovate in their business. As we discuss with our guest, Sina Sojoodi, more than the usual ideas about “big data” and “machine learning,” we talk about the practical uses of data workflows like insurance claims handling and retail optimization. In many large, successful organizations the stacks to support all this processing are aging and not providing the agility businesses want. Of course, as you can guess, we have some suggestions for how to fix those problems, and also how to start thinking about data workflows differently. We also cover some recent news, mostly around Google Cloud Next and Pivotal’s recent momentum announcement.

Check out the SoundCloud page, or download the MP3 directly.

How Asset Managers Can Succeed with Advanced Analytics

A nice overview of what you’d use analytics for in investing:

Armed with these advanced techniques, digitally forward asset managers can gain a significant information advantage over peers who rely mainly on traditional data sources and analytical practices. They can crunch through vast quantities of data; scour video and satellite imagery to gauge a retailer’s Black Friday prospects; extract insights from social media, texts, and e-mail to divine market sentiment; and parse a CEO’s comments during an earnings call to estimate the potential impact on the next quarter’s results. They can discern how unexpected weather disruptions might affect their portfolio, and even disprove long-held beliefs about how markets work. Smart, dynamic investment technology also helps managers assess their own performance to see whether they may be making the right decisions at the wrong times, buying too late, or listening to “influencers” who push them in the wrong direction.

There’s also a good overview of how to introduce new approaches like the above into the organization without them becoming Big Bang projects, which are likely to fail:

In experimenting with new technologies, firms should prioritize a small number of targeted initiatives that can deliver immediate tangible benefits in a focused, resource-­constrained way. In doing so, they should resist two temptations: the “esoteric science experiment,” whose focus is so narrow that the initiative can’t yield scalable results; and the “big bang rollout,” whose scope is so ambitious that it has a daunting price tag and takes too long to demonstrate value.

Source: How Asset Managers Can Succeed with Advanced Analytics

Pivotal puts PaaS in the spotlight

[So specific!] Paul Maritz, the former long-time Microsoft executive and VMware CEO who now heads Pivotal, says there’s a lack of an application development platform with built-in capabilities to manage and analyze large quantities of data geared toward the enterprise. Pivotal is a “new platform for a new era,” he says.

Source: Pivotal puts PaaS in the spotlight