With everyone jumping onto the social Stories train, the next step is to differentiate - to make your Stories offering more appealing in order to boost usage.
Facebook has been working on a range of different tools on this front, including its various music options, which take advantage of the platform's music licensing deals. And now, Google's looking to use its advanced technical capacity to take its AR effects to the next level, with new tools set to roll out in YouTube Stories, which first launched in November last year.

As detailed on the Google Research blog, the new AR effects will utilize improved "anchoring" processes to make them more realistic, and more responsive to real-world cues and movement. And interestingly, Google's improved AR system "doesn't rely on additional depth input, so it can also be applied to pre-recorded videos". That could see it used across a wider range of inputs and sources.
Google has been able to improve the speed and responsiveness of its AR tools by using TensorFlow Lite, which reduces the computational load. That also means Google's effects are able to respond and correct faster, leaving them less susceptible to issues like camera imperfections or extreme lighting conditions.
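Google hasn't published the exact pipeline it's using here, but the general pattern of running a lightweight model on-device through the TensorFlow Lite interpreter looks something like the sketch below. To be clear, the model file name, input size and output layout are assumptions for illustration only - this is not Google's actual YouTube Stories code.

```python
# Minimal sketch of on-device inference with the TensorFlow Lite interpreter.
# The model file, input size and output layout are illustrative assumptions,
# not Google's published YouTube Stories pipeline.
import numpy as np
import tensorflow as tf

# Load a (hypothetical) face landmark model converted to TFLite.
interpreter = tf.lite.Interpreter(model_path="face_landmarks.tflite")
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

def predict_landmarks(frame_rgb):
    """Run one camera frame through the lightweight model.

    frame_rgb: HxWx3 uint8 array, e.g. a video frame cropped to the face.
    Returns the raw landmark tensor produced by the model.
    """
    h, w = input_details[0]["shape"][1:3]
    # Resize and normalise to the model's assumed float32 [0, 1] input.
    resized = tf.image.resize(frame_rgb[np.newaxis, ...].astype(np.float32), (h, w)) / 255.0
    interpreter.set_tensor(input_details[0]["index"], resized.numpy())
    interpreter.invoke()
    return interpreter.get_tensor(output_details[0]["index"])
```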
"The end result of these efforts empowers a user experience with convincing, realistic selfie AR effects in YouTube, ARCore, and other clients by:
- Simulating light reflections via environmental mapping for realistic rendering of glasses
- Natural lighting by casting virtual object shadows onto the face mesh
- Modelling face occlusions to hide virtual object parts behind a face, e.g. virtual glasses."

Yes, those are virtual glasses - an AR effect.
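Google's post doesn't go into the rendering code behind that last point, but the occlusion idea itself is straightforward: rasterize the estimated face mesh into a depth buffer, then only draw the parts of the virtual object that sit closer to the camera than the face surface. Below is a minimal, purely illustrative sketch of that compositing step - the function and array names are assumptions, not Google's API.

```python
# Conceptual sketch of face-occlusion compositing (not Google's actual renderer).
# Assumes we already have per-pixel depth for the tracked face mesh and for the
# virtual object (e.g. glasses), plus the object's rendered RGBA layer.
import numpy as np

def composite_with_occlusion(frame_rgb, object_rgba, object_depth, face_depth):
    """Overlay a virtual object on the frame, hiding parts behind the face.

    frame_rgb:   HxWx3 camera image
    object_rgba: HxWx4 rendered virtual object (alpha = coverage)
    object_depth, face_depth: HxW depth maps in the same units
                              (np.inf where nothing was rendered)
    """
    # An object pixel is visible only where it is closer to the camera than
    # the face surface - e.g. the temple arms of virtual glasses disappear
    # behind the head, while the lenses in front of the eyes stay visible.
    visible = object_depth < face_depth
    alpha = object_rgba[..., 3:4] / 255.0 * visible[..., np.newaxis]
    return (frame_rgb * (1 - alpha) + object_rgba[..., :3] * alpha).astype(np.uint8)
```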
How, exactly, Google will use these in YouTube Stories - as in which specific effects it's looking to add - is not clear at this stage. YouTube Stories is only available to creators with over 10,000 subscribers, so it remains fairly limited, and difficult to test (at least for me). In the post, however, Google does note that the new YouTube Stories creator effects are coming soon.
Will that get people more interested in YouTube Stories?
It's interesting to note the differing platform approaches in this regard - YouTube, for example, despite initially saying that users would only see Stories from creators they follow within the app, has actually been populating its Stories bar with a wide range of Stories, both from channels you follow and those you don't.

That may or may not increase click-throughs - but the real trick, as noted, is in keeping users around after they've gone to check out a story.
Both Facebook/Instagram and YouTube have significant capacity advantages in this regard, with the ability to "out-tech" smaller competitors like Snapchat with advanced machine learning effects and models.
Snapchat knows this, which is why it's been taking a different approach to its effects, rolling out new, immersive experiences like its Black History Month virtual art gallery and, most recently, interactive effects which respond to visual cues, built around Game of Thrones imagery at SXSW.
Snapchat has always outperformed its larger rivals on the AR innovation front, and new initiatives like this could be the way forward, building more expansive AR "worlds", as opposed to simple masks.
In that respect, maybe Google's already missed the boat with these new tools - maybe, despite advanced AR capacity, users won't be as interested, and are already looking to the next virtual thing.
Time will tell, of course, but you have to wonder whether YouTube Stories has come too late in the game to become a significant consideration, even with these improved AR options.
You can read more about Google's new AR tools on the Google Research blog.