Even the world's biggest asset manager is struggling to crack the data conundrum.
Firms eager to integrate traditional and alternative data into their investment processes, hoping to outperform their peers, are struggling with data intake and processing.
"The amount of headache and heartache that we spend on this is unbelievable," Amer Bisat, BlackRock's head of sovereign and emerging markets alpha portfolios, said at a New York conference on Monday. "It remains a holy grail. It remains something that’s more a promise than a reality."
Bisat said his team spends time on two pieces of the data puzzle. First, they work to classify data from disparate, ever-increasing sources to make sure it's structured properly.
"We're spending a lot of time on this," Bisat said.
Then comes the hard part: "How do you use that data analytically in a way that can actually generate alpha?" he asked. That's "complicated work."
"Some of my colleagues are much more excited about this than I am," Bisat said. "I think we’re still way too early."
About a quarter of BlackRock's 14,000 employees work as technologists, and the firm spends about $1 billion annually on technology and data, Chief Financial Officer Gary Shedlin said in December.
BlackRock isn't the only major financial institution struggling to get data into a usable format and then use it to augment the investment process.
At JPMorgan, the largest US bank, there are thousands of databases that still need to be cleaned and made usable before AI or machine-learning techniques can be fully unleashed, according to co-president Daniel Pinto, who spoke with Business Insider earlier this year on the sidelines of the World Economic Forum in Davos, Switzerland.
For decades, banks were at the forefront of data collection, hoovering up information about stock and bond trades, credit-card transactions, and mortgage loans. But for most of that time, firms were content to take in the data and store it, with few spending much time thinking about how it might be retrieved or compared to other datasets nestled in other parts of the firm.
According to a July 2016 McKinsey article, about half of the time spent by employees in finance and insurance goes to collecting and processing data. That, together with the sheer volume of data the industry handles, makes it one of the areas most ripe for disruption, according to the consultancy.
At Credit Suisse, the bank has focused on ensuring that any data that gets fed into AI tools is of the highest possible quality, according to the Swiss firm's chief technology officer, Laura Barrowman. For an AI-based tool to be efficient, the data it analyzes needs to be complete and accurate. While that may seem like a basic request, Barrowman said it's a critical one and not easily achievable in a company the size of Credit Suisse.
"Making sure that your basics are right is a fundamental for everything," said Barrowman, also speaking on the sidelines of the Davos event.