Snap's Crowd Surf experiment with in-house AI technology



The Crowd Surf idea is a novelty that several startups have tried to build whole apps around: it turns everyone's quick video clips from a concert into a complete, watchable music video.

Snap experimented with Crowd Surf at pop singer Lorde's performance at San Francisco's Outside Lands music festival.

Crowd Surf runs on Snap's in-house AI: proprietary machine learning technology that binds together Snaps submitted to Snapchat's "Our Story," using geolocation and timestamps to merge the audio into a semi-seamless video.

The technology detects when many people are recording the same musical performance at the same time, then stitches the different clips together over the song playing in the background, giving you control over which vantage point to watch from across the entire footage.
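Snap hasn't published how Crowd Surf works internally, but the description above — grouping clips by geolocation and timestamp, then stitching overlapping footage into one timeline — can be sketched roughly. The following is a hypothetical illustration (the `Clip` class, distance threshold, and stitching rule are all assumptions, not Snap's actual method):

```python
from dataclasses import dataclass
from math import radians, sin, cos, asin, sqrt

@dataclass
class Clip:
    user: str
    start: float      # recording start, seconds since epoch
    duration: float   # seconds
    lat: float
    lon: float

def haversine_m(a: Clip, b: Clip) -> float:
    """Great-circle distance between two clips' locations, in meters."""
    dlat, dlon = radians(b.lat - a.lat), radians(b.lon - a.lon)
    h = sin(dlat / 2) ** 2 + cos(radians(a.lat)) * cos(radians(b.lat)) * sin(dlon / 2) ** 2
    return 6371000 * 2 * asin(sqrt(h))

def same_event(a: Clip, b: Clip, max_dist_m: float = 300.0) -> bool:
    """Two clips plausibly cover the same performance if they were recorded
    close together and their time windows overlap."""
    overlap = min(a.start + a.duration, b.start + b.duration) - max(a.start, b.start)
    return overlap > 0 and haversine_m(a, b) <= max_dist_m

def stitch(clips: list[Clip]) -> list[tuple[str, float, float]]:
    """Order overlapping clips into one timeline: at each cut, switch to the
    next clip, trimmed to start where the previous one left off."""
    timeline, cursor = [], None
    for c in sorted(clips, key=lambda c: c.start):
        start = c.start if cursor is None else max(c.start, cursor)
        end = c.start + c.duration
        if end > start:
            timeline.append((c.user, start, end))
            cursor = end
    return timeline
```

In practice a system like this would also need audio fingerprinting to align clips whose device clocks disagree; the timestamp-only cut above is the simplest possible version.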

For the video to turn out well, many people need to be taking steady Snaps, which in theory means the technology could be useful at most audio-centered events, such as concerts and speeches.

The company has confirmed plans to make the technology available at more public events in the near future, including concerts and notable public speeches.

Snap encourages users to submit their Snaps to Snapchat's "Our Story" feature when sharing, so they can appear in the curated Our Stories, in search results, and on the new Snap Map.