Enterprises today amass huge amounts of data, but end up using only a small sliver of it for actual analysis. Companies collect hundreds of billions of data points per customer, yet use just 1-3% of them to unlock insights. Naturally, the question arises: what’s happening with the other 97-99% of that data? What risks and opportunities are we missing? Adobe Analytics may be able to help us find out.
At 6:30 PT this morning, Adobe announced the next generation of “Virtual Analyst” in Adobe Analytics. It’s an AI assistant that analyzes vast amounts of data and recommends what to examine further.
They say it’s “designed for the non-quant, to help them perform tasks of a data scientist.”
Essentially, the theory behind it is this:
Normally, we look for known-knowns in our data. We have a target in mind (page views, bounce rate, etc.), then we go to our data for the number.
If and when we get through all that, we then move to the known-unknowns. Why has cart abandonment moved above X? Why did X video go viral?
Very rarely, however, do we make time for unknown-unknowns. These are things we don’t even know to look for: the lottery-ticket insights you might uncover once in a blue moon, if ever. But who has time to wade through hundreds of billions of data points in search of insights that may not even exist?
Enter AI Virtual Analyst
The idea is that this AI tool will surface things I may be interested in, in order to spur deeper-level analysis. With adaptive learning, it comes to understand business contexts and how each user consumes data. Think Spotify recommendations, but for marketers: the tool recognizes what a given user will be interested in based on what they’ve already looked at, and also recommends important data the user might otherwise not notice.
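Adobe hasn’t published how the recommender works, but a minimal sketch of this Spotify-style idea is content-based scoring: weight topic tags by how often the user has already explored them, then rank candidate insights by those weights. All names and data below are hypothetical.

```python
from collections import Counter

def recommend_insights(viewed, candidates, top_n=3):
    """Rank candidate insights by tag overlap with the user's viewing history.

    viewed: list of tag sets the user has already explored.
    candidates: dict mapping insight name -> set of descriptive tags.
    """
    # Weight each tag by how often it appears in the user's history.
    tag_weights = Counter(tag for tags in viewed for tag in tags)
    scored = {
        name: sum(tag_weights[tag] for tag in tags)
        for name, tags in candidates.items()
    }
    # Highest-scoring insights first (stable sort keeps ties in input order).
    return sorted(scored, key=scored.get, reverse=True)[:top_n]

history = [{"checkout", "mobile"}, {"checkout", "promo"}]
candidates = {
    "cart_abandonment_spike": {"checkout", "mobile"},
    "video_views_surge": {"video", "social"},
    "promo_code_errors": {"checkout", "promo"},
}
print(recommend_insights(history, candidates, top_n=2))
```

A user who keeps exploring checkout metrics gets checkout-related insights surfaced first, while unrelated candidates sink to the bottom.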
John Bates, Director of Product Management at Adobe Analytics, relates this to the early days of astronomy: “People had a single goal: seeing more, further, deeper. Similar to data now, there was so much more than meets the eye.” Fifty years into the future, who knows how many new secrets we’ll have unearthed from the 97-99% of data currently going unused.
What it does
The two main goals of this tool are automation and personalization. This is the first time Adobe will bake AI directly into the interface, to better understand the preferences and tastes of individual users.
The amount of data produced in the world increases at a double-digit rate year-on-year, yet the amount of time we have to analyze it remains constant. Thus, the proportion of data we can analyze is decreasing.
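A quick back-of-the-envelope illustration of that squeeze (the growth rate and capacity figures here are made up, not Adobe’s): if analysis capacity stays fixed while data volume compounds, the fraction we can analyze shrinks every year.

```python
def analyzable_fraction(initial_volume, capacity, growth_rate, years):
    """Fraction of data we can analyze each year when volume compounds
    at `growth_rate` but analysis capacity stays constant."""
    return [
        capacity / (initial_volume * (1 + growth_rate) ** t)
        for t in range(years)
    ]

# Hypothetical numbers: we can analyze 3 units out of 100 today,
# and data volume grows 25% year-on-year.
for year, frac in enumerate(analyzable_fraction(100, 3, 0.25, 4)):
    print(f"year {year}: {frac:.1%}")
```

Even a modest double-digit growth rate cuts the analyzable share from 3% to under 2% within three years.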
By using AI, we can work at unprecedented scale. Adobe’s assistant constantly analyzes all trending data at a macro level. As the user clicks through the displayed insights, the system fans out with more detailed metrics. While it doesn’t automatically analyze all the data (which would drastically slow it down), it does automatically analyze the trending data and uses that as a bridge to what’s worth tapping into further. According to Bates, “it’s comprehensive; it really covers the rest of the 97-99% of data collected.”
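Adobe hasn’t detailed its algorithm, but a common way to surface “trending” anomalies at a macro level is a simple z-score check: flag any metric whose latest value deviates sharply from its own recent history, and let the user drill down from there. A minimal sketch with hypothetical metrics:

```python
import statistics

def surface_anomalies(metrics, threshold=3.0):
    """Flag metrics whose latest value deviates sharply from recent history.

    metrics: dict mapping metric name -> list of daily values (latest last).
    Returns names whose latest value sits more than `threshold` standard
    deviations from the mean of the preceding values.
    """
    flagged = []
    for name, series in metrics.items():
        history, latest = series[:-1], series[-1]
        mean = statistics.fmean(history)
        stdev = statistics.stdev(history)
        # Skip flat series (stdev 0) to avoid dividing by zero.
        if stdev and abs(latest - mean) / stdev > threshold:
            flagged.append(name)
    return flagged

metrics = {
    "page_views": [1000, 1020, 980, 1010, 1005],        # steady
    "cart_abandonment": [0.20, 0.21, 0.19, 0.20, 0.35],  # sudden jump
}
print(surface_anomalies(metrics))
```

Only the metric with the sudden jump gets surfaced; the steady one stays quiet, which is what keeps this cheap enough to run continuously across many metrics.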
From our perspective, the two biggest pieces of this are the breadth of insights and the user personalization. In the past, you had to manually create the metrics you wanted alerts for. Now, AI can automatically analyze the data and push out insights it thinks will be useful to you, without you having to explicitly request them. It’s not uncovering “new” data, but using data in new ways. Again, the power of digging into the unknown-unknowns could be what propels us to the next level of growth.