Craig Federighi, Senior Vice President of Software Engineering for Apple Inc., discusses the Siri desktop assistant for macOS Sierra at the company's Worldwide Developers Conference in San Francisco, California, U.S., June 13, 2016. Reuters

Apple bought Lattice Data, a company that uses an artificial intelligence (AI) based inference engine to transform unstructured “dark” data into structured, more usable information.

TechCrunch reported Saturday, citing sources, that the tech giant paid around $200 million for the acquisition.

When asked about the deal, Apple neither confirmed nor denied it.

“Apple buys smaller technology companies from time to time and we generally do not discuss our purpose or plans,” an Apple spokesperson told the publication.


What is Dark Data?

There is a massive amount of data produced globally every year – 4.4 zettabytes in 2013 by a conservative estimate. That figure is expected to grow tenfold by 2020, and according to IBM, most of the world's data was produced in the last two years, TechCrunch reported.

Around 80 percent of this data is unstructured, or “dark,” and therefore cannot be used for processing and analytics. Lattice uses machine-learning algorithms to sort the data, organize it and turn it into usable information. Without labeling and organization the data is effectively useless, but once properly structured it unlocks a latent value. It could then be put to various uses, from policing and crime solving, such as finding digital traces of human trafficking and child pornography, to medical research.
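As a rough illustration of the kind of transformation described above, structuring “dark” data means pulling machine-readable fields out of free text. The sketch below is a toy example using simple pattern matching, not Lattice Data's actual inference engine, which relies on far more sophisticated machine learning; the sample text and field names are invented for illustration.

```python
import re

# Unstructured ("dark") input: free text with no labels or schema.
text = "Contact Jane Doe at jane.doe@example.com or call 555-0199."

# Structured output: named fields extracted into a record that
# downstream analytics could actually query.
record = {
    "emails": re.findall(r"[\w.+-]+@[\w-]+\.[\w.]+", text),
    "phones": re.findall(r"\b\d{3}-\d{4}\b", text),
}

print(record)
```

A real dark-data system replaces the hand-written patterns with learned statistical models, but the input/output shape is the same: raw text in, labeled records out.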

Apple’s interest in the technology is well thought out. The company’s voice assistant, Siri, was the first in a line of conversational voice assistants on smartphones. In fact, it had a head start on every other company in the field. But over the years, while Apple was busy integrating the voice assistant across its devices, other companies moved ahead with their own versions of the technology.

In recent years many companies have revealed their own AI-based voice assistants, starting with Amazon’s Alexa, after which Google launched its Google Assistant and HTC its Sense Companion. Samsung released its Bixby voice assistant in March on its Galaxy S8 flagship, and Microsoft is expected to reveal an improved AI-based version of Cortana soon.

All these assistants, which came after Siri, have an advantage over it since they are AI-based and evolve according to usage.

Most importantly, Bixby, the assistant from Apple’s major rival Samsung, has a feature called Bixby Vision. Bixby is integrated into the Galaxy S8’s camera and gallery and is connected to the internet. It uses AI to identify what the user is pointing the camera at and pulls in contextual information from the internet.


With the iPhone 8 expected to launch this year, and a Siri-based speaker expected to be unveiled at WWDC, Apple needs to improve Siri to match the other voice assistants. The acquisition of Lattice Data suggests the company could be working on a similar feature for Siri, especially since Lattice has the technology to scan and make sense of images and text.

While Apple, as usual, is silent on the subject, a new version of Siri is expected to be launched at WWDC 2017.