3 Trends from NAB 2017


The NAB Show (National Association of Broadcasters) took place during the last week of April in Las Vegas, and FMAV was there! We walked the tradeshow floor and thought we'd share some of what we learned.

Virtual Reality was Everywhere!

As a new medium, VR is certainly here to stay. While VR's "honeymoon" phase may be over, companies are now starting to figure out its real practical uses. Where VR really shines is in showcasing an environment: no other medium lets you feel what it's like to be somewhere, like the top of a mountain, without physically travelling there. VR is also an "empathy engine," letting you empathize with another person and their experiences better than other technologies do.

360 Live Streaming is Gaining Momentum

While it's not yet commonplace to have headsets like the Gear VR or Google Daydream in every household, viewership for 360 live streams has shown continued growth and boosts attendance at the events where it's implemented. Currently, 360 live streaming is focused on the mobile market, where most users view streams in a windowed mode (holding the phone up and moving it around) and seem to really enjoy that experience. HEVC looks like it is becoming the codec of choice for 360 live streaming, which is little surprise given that it can maintain comparable video quality at roughly half the bitrate of H.264.
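As a rough illustration of that bitrate savings, here is a minimal sketch (assuming ffmpeg with the libx265 encoder is installed; the filenames and CRF value are placeholders, not a recommended production setting) that re-encodes an H.264 360 clip to HEVC:

import subprocess

# Re-encode an H.264 (AVC) 360 clip to HEVC (H.265) with ffmpeg.
# CRF 28 with x265 is commonly treated as roughly comparable in quality
# to CRF 23 with x264, which is where the "similar quality at about half
# the bitrate" rule of thumb comes from.
subprocess.run([
    "ffmpeg",
    "-i", "360_stream_h264.mp4",   # source 360 clip (placeholder filename)
    "-c:v", "libx265",             # HEVC video encoder
    "-preset", "medium",
    "-crf", "28",
    "-c:a", "copy",                # keep the original audio track untouched
    "360_stream_hevc.mp4",
], check=True)

In practice the exact savings depend on the content and encoder settings, but the same clip encoded this way typically lands near half the size of its H.264 source at similar perceived quality.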

Machine Learning for Video Asset Management

Many companies are building technology that makes video footage "searchable" via machine learning. It works like this: a user uploads large amounts of video footage to the cloud, and when they need a clip of something specific, instead of scrubbing through footage or hoping the right metadata is attached to the clips, they search with a keyword like "water." Through machine learning and AI image recognition, every video, and every section within a video, that shows or mentions water is surfaced. The AI can also analyze the footage and flag whether it is happy or sad, high energy or low energy. The result is less time scrubbing through footage to find the clips editors need; a rough sketch of the keyword-search idea follows below.
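To make the idea concrete, here is a toy sketch in Python. It assumes an image-recognition pass has already produced labels for sampled frames; the clip names, timestamps, and labels below are made-up stand-ins for whatever a real vision service would return, not any particular vendor's API.

from collections import defaultdict

# Hypothetical output of an image-recognition pass over sampled frames:
# (clip filename, timestamp in seconds, labels detected in that frame).
frame_labels = [
    ("interview_a.mov",  12.0, ["person", "office"]),
    ("broll_lake.mov",    3.5, ["water", "mountain"]),
    ("broll_lake.mov",   41.0, ["water", "boat"]),
    ("keynote.mov",     220.0, ["stage", "crowd"]),
]

# Build an inverted index: keyword -> list of (clip, timestamp) hits.
index = defaultdict(list)
for clip, timestamp, labels in frame_labels:
    for label in labels:
        index[label].append((clip, timestamp))

# A search for "water" now jumps straight to the relevant clips and timecodes
# instead of the editor scrubbing through hours of footage.
for clip, timestamp in index["water"]:
    print(f"{clip} @ {timestamp:.1f}s")

The real systems on the show floor layer speech-to-text, sentiment analysis, and cloud storage on top of this basic lookup, but the core payoff is the same: keyword in, timecoded clips out.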

We hope you enjoyed reading this! Let us know your thoughts in the comments below.
