Welcome to the invisible future. Here, we are perpetually surrounded by connected devices, from the 77 percent of us who own a smartphone to the 90 percent of us who own an IoT device. This boom in connected ownership has slowly created an invisible marketing layer, resulting in a purchasable halo around every object and moment.
To put this in perspective, it’s been noted that 90 percent of the world’s data was generated in the past two years. The resulting data layer represents a transformative moment in the ongoing battle for brands to reach consumers; it’s an immense, yet highly complex, opportunity for marketers. Winning in this space means creating value for consumers in the moment, before they realize that moment is there.
Daniel Kahneman, the Nobel Prize-winning behavioral economist, said in his 2010 TED Talk, “The Riddle of Experience vs. Memory”: “When we think of the future, we don’t think of a future normally as experiences, we think of the future as anticipated memories.” This view is important to keep in mind as we look at how our devices are evolving and how brands may use these tools to reach consumers in this new space. Our focus will be on mobile, given the ownership rate above and the fact that consumers spend, on average, three hours and 35 minutes a day on their smartphones.
"Create anticipated memories through the new invisible marketing layer"
The latest version of Android, Pie, includes a key to this new future of anticipation: the App Actions and Slices frameworks. These two tools were created to help users ‘get to their next task more quickly’ and to help developers re-engage those users with their apps.
Essentially, the Android OS is moving from predicting the next app a user wants to open to predicting the next action that they want to take. By combining Actions and Slices, the user can get direct access to either booking a car or buying movie tickets with very little input through the search bar. In the same vein of helping users ‘get to their next task more quickly’, Google’s Assistant also offers Routines, a way to create a program of multiple tasks initiated by a phrase like ‘Good Morning’. In all instances we see how predictive anticipation is enabled through technology, specifically in the Google-verse, by on-device machine learning.
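To make the Android side a bit more concrete, here is a minimal sketch of how a developer might declare an App Action, assuming a hypothetical ride-booking app. The intent name comes from Google’s built-in intents catalog; the deep-link scheme and parameter names are purely illustrative:

```xml
<!-- actions.xml: an illustrative sketch, not a real app's configuration -->
<actions>
    <!-- Map the built-in "book a taxi" intent to the app's deep link -->
    <action intentName="actions.intent.CREATE_TAXI_RESERVATION">
        <fulfillment urlTemplate="exampleride://book{?dropoff}">
            <!-- Fill the drop-off address from the user's spoken or typed request -->
            <parameter-mapping
                intentParameter="taxiReservation.dropoffLocation.streetAddress"
                urlParameter="dropoff" />
        </fulfillment>
    </action>
</actions>
```

When a user’s query matches that intent, the system can surface the app’s action directly in the search bar, which is the ‘booking a car with very little input’ flow described above.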
And for Apple fans, iOS 12’s new Shortcuts and Suggestions features enable Siri to perform functions similar to Android’s App Actions and Slices and Google Assistant’s Routines. Users will be able to create their own Shortcuts for activities like calling up a train schedule at the same time each day or logging a completed workout to their favorite app. Shortcuts are powered by the User Activity API and the new Intents API; it’s the latter that tells the Siri system how you use an app so that it can make relevant suggestions to you based on patterns.
Developers will be able to create the same kind of near-direct intent access in iOS as in Android, replacing multiple rounds of button pushing. Technology analyst Rene Ritchie calls these new iOS capabilities a sign that we’ve moved from the Pull Interface (lots of selecting, button pushing, and hunting to get where you want to go) to the Push Interface (where what you want comes to find you only when you need it).
But how long will it take for these experiences to become mainstream? iOS users will likely be the first to see these new tools: VentureBeat reported last month that it took 11 months for iOS 11 to be installed on 85 percent of eligible devices. Projecting that adoption rate onto iOS 12, most people whose phones can run the new OS will have installed it by the end of next summer.
Android is a different animal altogether. While Pie was released in August, it was initially available only on Pixel devices. Given the fragmented Android ecosystem, it may take more than a year for Pie to see meaningful adoption.
So, it’s fair to say that mass audiences are anywhere from one to two years away from seeing this functionality on their devices. While that may seem like a long way off, marketing plans for 2019 are likely locked in, and plans for 2020 will probably start to be solidified over the next six months.
For brands, the question is two-fold: first, how do we make our apps this predictive, and second, how do we make our other consumer touchpoints just as anticipatory? Both questions require some serious consideration of purpose, as in why would a consumer expect a brand to provide this type of predictive service in the first place?

Would it seem natural, or even come to be expected, that a laundry detergent brand could read my calendar, see that I am hosting a dinner party, and then ask me the next morning if I need help removing that wine stain from the tablecloth? Maybe, if it knew that the morning after the last dinner party with the same guests produced a flurry of searches, posts, and texts about wine stain removal. It goes without saying that the consumer would need to opt in to every input used to make this kind of recommendation. If consumers value the utility of this type of service, the data exchange might be worth it.
As these new tools are baked into more consumer experiences, consumers may simply come to expect this level of predictive convenience from brands and platforms. Building for this future now could cement the bond between brand and consumer and, in Kahneman’s words, create ‘anticipated memories’ through the new invisible marketing layer.