“Enterprises need to think about data traffic patterns in their organizations, Vincent said, and recognize when the traffic no longer flows through a central point (whether public cloud or private cloud) and ready their corporate networks for a whole new traffic flow as part of their digital transformation.”
Original source: With Emerging Technology Comes Emerging Data Problems
“Further complicating matters, a large amount of customer data still lives in departmental silos, with sales, marketing, and customer service each supplying separate customer experiences. Those databases can easily grow stale or become inconsistent, since customer information owned by one department is often not shared with others. Keeping it up to date is an even more formidable task –– even a monthly update isn’t always frequent enough to keep up with the important life changes that can impact marketing decisions.”
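The staleness problem described above can be sketched in a few lines: if each department keeps its own copy of a customer record, a freshness check across silos surfaces the conflicts. Everything here (the field names, the `stale_fields` helper, the sample data) is hypothetical and purely illustrative, not from the article.

```python
from datetime import date

# Hypothetical departmental copies of the same customer record.
silos = {
    "sales":     {"email": "ana@example.com",     "city": "Austin", "updated": date(2017, 1, 5)},
    "marketing": {"email": "ana@example.com",     "city": "Denver", "updated": date(2016, 9, 2)},
    "service":   {"email": "ana@old.example.com", "city": "Austin", "updated": date(2016, 3, 14)},
}

def newest_record(silos):
    """Treat the most recently updated copy as the tentative source of truth."""
    return max(silos.values(), key=lambda rec: rec["updated"])

def stale_fields(silos):
    """List (department, field) pairs that disagree with the newest copy."""
    truth = newest_record(silos)
    return [
        (dept, field)
        for dept, rec in silos.items()
        for field in ("email", "city")
        if rec[field] != truth[field]
    ]

print(stale_fields(silos))  # marketing's city and service's email are stale
```

Even this toy version shows why a monthly batch sync falls short: the conflicts only become visible when the copies are compared, and by then a marketing decision may already have been made on the stale value.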
Original source: The Travel Industry’s Data Dilemma: Turning Insights into Action
“In every case in which a CIO or other executive has driven or authorized substantial investments in service-based database infrastructure, changes in DBA roles have followed. As two financial industry executives put it at a conference in Jersey City and re:Invent, respectively, their DBAs are all being moved to doing more generic DevOps-style roles, roles that involve more architecture and engineering than traditional database administration. This is the logical outcome of a scenario in which making a database fault-tolerant with 6 copies across three availability zones with continuous backup is now merely a product feature instead of a full time job or jobs.”
Original source: Whither the DBA
‘Datical automatically examines SQL scripts created by developers and aligns them with a common object model. “We create a package so you have an immutable artifact that goes from development to test to production just like your app code,” Reeves said. The software checks for inefficiencies, such as the use of multiple indices or joins, and flags them before changing the schema…. Datical’s containerized image can be run with Concourse as part of a testing pipeline to enable application development teams to push application and database changes through the release cycle at the same time. The companies will cross-sell each other’s products, although the arrangement isn’t exclusive, Reeves said.’
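The kind of pre-deploy check described can be pictured as a lint pass over SQL change scripts. The sketch below is not Datical’s actual engine; the rule (flag a script that adds more than one index to the same table) is a made-up stand-in for the real heuristics, which work against a common object model rather than regexes.

```python
import re

def flag_index_heavy_scripts(sql_script):
    """Return table names that receive more than one new index in this script.

    A crude, illustrative stand-in for the inefficiency checks described:
    real tools parse the SQL and compare it against an object model.
    """
    counts = {}
    for match in re.finditer(r"CREATE\s+INDEX\s+\w+\s+ON\s+(\w+)", sql_script, re.IGNORECASE):
        table = match.group(1).lower()
        counts[table] = counts.get(table, 0) + 1
    return [table for table, n in counts.items() if n > 1]

script = """
CREATE INDEX idx_orders_date ON orders (order_date);
CREATE INDEX idx_orders_cust ON orders (customer_id);
CREATE INDEX idx_items_sku ON items (sku);
"""
print(flag_index_heavy_scripts(script))  # flags 'orders'
```

The point of packaging such checks into the pipeline, as the quote says, is that the schema change becomes an immutable artifact that is vetted before it ever touches production, just like application code.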
Link to original
A brief note, from William Fellows at 451, on HSBC’s use of Google Cloud’s big data/analytical services:
They have a lot of data, and it’s only growing:
6PB in 2014, 77PB in 2015 and 93PB in 2016
What they use it for:
In addition to anti-money-laundering workloads (identification and reducing false positives), it is also migrating other machine-learning workloads to GCP, including finance liquidity reporting (six hours to six minutes), risk analytics (raise compute utilization from 10% to actual units consumed), risk reporting and valuation services (rapid provisioning of compute power instead of on-premises grid).
As I highlighted over the weekend, it seems like incumbent banks are doing pretty well with all this digital disruption stuff.
Source: HSBC taps Google Compute Platform for Hadoop, is ‘cloud first’ for ML and big data
This week we talk about how organizations are increasingly looking to improve how they use data, and the workflows around data, to innovate in their business. As we discuss with our guest, Sina Sojoodi, more than the usual ideas about “big data” and “machine learning,” we talk about the practical uses of data workflows like insurance claims handling and retail optimization. In many large, successful organizations the stacks to support all this processing are aging and not providing the agility businesses want. Of course, as you can guess, we have some suggestions for how to fix those problems, and also how to start thinking about data workflows differently. We also cover some recent news, mostly around Google Cloud Next and Pivotal’s recent momentum announcement.
Check out the SoundCloud page, or download the MP3 directly.
A nice overview of where you’d use analytics in investing:
Armed with these advanced techniques, digitally forward asset managers can gain a significant information advantage over peers who rely mainly on traditional data sources and analytical practices. They can crunch through vast quantities of data; scour video and satellite imagery to gauge a retailer’s Black Friday prospects; extract insights from social media, texts, and e-mail to divine market sentiment; and parse a CEO’s comments during an earnings call to estimate the potential impact on the next quarter’s results. They can discern how unexpected weather disruptions might affect their portfolio, and even disprove long-held beliefs about how markets work. Smart, dynamic investment technology also helps managers assess their own performance to see whether they may be making the right decisions at the wrong times, buying too late, or listening to “influencers” who push them in the wrong direction.
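The “parse a CEO’s comments” idea reduces, at its very simplest, to scoring text against a sentiment lexicon. A toy sketch follows; the word lists, the scoring rule, and the sample remark are all invented for illustration, and real systems use trained models rather than hand-picked word sets.

```python
# Toy lexicon-based sentiment scorer for earnings-call language.
# The lexicon and the scoring rule are invented for illustration only.
POSITIVE = {"growth", "strong", "exceeded", "record", "momentum"}
NEGATIVE = {"headwinds", "decline", "miss", "weak", "churn"}

def sentiment_score(text):
    """Score = (positive hits - negative hits) / total lexicon hits, in [-1, 1]."""
    words = text.lower().replace(",", " ").replace(".", " ").split()
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    matched = pos + neg
    return 0.0 if matched == 0 else (pos - neg) / matched

remark = "We saw strong growth and record margins, despite some headwinds in Europe."
print(sentiment_score(remark))  # net-positive remark scores above zero
```

The interesting engineering is everything this sketch omits: negation, domain-specific vocabulary, and tying the score back to the next quarter’s results, which is where the information advantage the excerpt describes would actually come from.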
There’s also a good overview of how to introduce new approaches like the above into the organization without them becoming Big Bang projects that are likely to fail:
In experimenting with new technologies, firms should prioritize a small number of targeted initiatives that can deliver immediate tangible benefits in a focused, resource-constrained way. In doing so, they should resist two temptations: the “esoteric science experiment,” whose focus is so narrow that the initiative can’t yield scalable results; and the “big bang rollout,” whose scope is so ambitious that it has a daunting price tag and takes too long to demonstrate value.
Source: How Asset Managers Can Succeed with Advanced Analytics
Watch over the next 2 years as the Big 4 of BI (IBM, Oracle, SAP, and MicroStrategy) battle disruption from well-funded upstarts. Folks like SAS and Teradata will have to pick sides.
More: QLIK Rising: As DATA Soars, BMO Cheers New Analytics Approach