There was a lot of buzz about AI at last year’s NAB Show, but the conversations we were having with organizations centered on what AI "is" and how it works in the context of media workflows. Fast forward to April 2024, and many visitors to the Moments Lab booth were armed with a list of requirements — they had a much higher level of understanding and knew exactly where in their ecosystems they wanted to deploy AI solutions.
The explosion of consumer-focused tools like ChatGPT and Midjourney really helped to mature the market, and what we saw this year in Las Vegas was a mix of media industry decision-makers and end-users excited to dive into the details of how they can practically put AI to work.
Broadcasters are overwhelmingly looking to eradicate monotonous and time-consuming tasks such as manual media logging. And they want to enable editors to work faster by tapping into AI features that speed up the tedious process of combing through rushes. Leveraging the latest AI developments to improve the overall MAM search experience is also key. Organizations see higher content ROI when both internal and external users can easily find what they’re looking for.
The organizations that Moments Lab works with are on an endless quest to maximize audience engagement by quickly publishing new content – be it fresh from the field, repurposed, or unearthed from their archives to capitalize on a viral moment. Being able to quickly find the right moments in your video footage to build a story has never been more important. This is where MXT-1.5’s new Sequences Detection and Sound Bites features really excel.
One of my favorite quotes from NAB came from a top European broadcaster during a demo:
“It takes us half a day to source content for a 5-minute video. With your AI it seems we could have a rough cut in minutes!”
Another executive asked us to officially confirm that there were no humans secretly generating the Sound Bites in the background.
Ultimately, we want to enable content creators to know exactly what’s inside their media — be it a single press conference or a vast video archive — and understand at a glance what is most important. Resource constraints are ever-present in this industry and there are some major events on the horizon, notably the Paris Olympics and US Presidential Elections, which media companies appear to be covering without the typical injection of extra staff. With the help of AI, editors, producers, and journalists can pinpoint the shots they need extremely quickly to build their stories.
A subject heard across the halls at this year’s NAB Show was that of tightening budgets and the increasing need to closely manage resources.
Helping broadcasters to accurately estimate ROI on their content for the benefit of their CFOs can be the key to unlocking new opportunities and getting project budgets approved. Our clients tell us that 10,000 hours of archive made available for sale can amount to roughly USD 1 million in potential annual revenue.
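To make that figure concrete, here is a minimal back-of-the-envelope sketch. It assumes a simple linear projection from the single data point quoted above (10,000 hours → USD 1M/year); the per-hour rate and the `projected_annual_revenue` helper are illustrative assumptions, not a Moments Lab pricing model.

```python
# Back-of-the-envelope archive ROI, based on the client-reported figure:
# ~USD 1M potential annual revenue from 10,000 hours of sale-ready archive.
ARCHIVE_HOURS = 10_000
ANNUAL_REVENUE_USD = 1_000_000

# Implied rate: USD 100 per archive hour per year (assumption, single data point).
REVENUE_PER_HOUR = ANNUAL_REVENUE_USD / ARCHIVE_HOURS

def projected_annual_revenue(hours_indexed: float,
                             rate: float = REVENUE_PER_HOUR) -> float:
    """Linear projection; real returns vary with content demand and rights."""
    return hours_indexed * rate

# e.g. digitizing and indexing 2,500 hours of archive
print(projected_annual_revenue(2_500))  # -> 250000.0
```

A linear model like this is only a starting point for a CFO-ready business case; actual returns depend on rights clearance, content demand, and how discoverable the indexed material is.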
Sinclair EVP and CFO Lucy Rutishauser addressed ROI at the Devoncroft Executive Summit. She made the point that in large organizations, it’s simply not possible for managers to project accurate ROI without visibility of teams beyond their own. We should expect to see CFOs increasingly becoming involved in vendor conversations alongside the functional teams. At Moments Lab we’re already helping our clients to build out CFO-ready business cases.
I’ve said it before and I’ll say it again: many media organizations and production houses have decades of unindexed archive content gathering dust in storage, when it could be harnessed for new, much-needed revenue streams. In Vegas we saw a renewed push by some major players to kick off large digitization and indexing projects. The sense of urgency is no doubt in part spurred by the dwindling lifespan of VTRs. Broadcasters know they have limited time left to secure their heritage.
Part of the reason why significant digitization projects are now viable is related to the lower cost of using AI to index content at scale. We’ve enabled cost efficiencies in our multimodal and generative AI indexing technology through the use of low energy consumption models. And we’ve partnered with CIS Group and Glookast in the Americas to provide the market with a much-needed end-to-end media digitization and indexing solution.
When it comes to repurposing and reusing archive content, the possibilities are endless, especially now that AI technology is able to break down the contents of old tapes and tell you in seconds exactly what shots, moments, and key quotes they contain.
These insights are priceless when quickly building short video stories for social and cultivating a highly engaged community, but they’re also gold for growing a bank of collateral stories — much like those that exist in the Marvel Cinematic Universe. Media companies that hold the rights to iconic characters and universes have a gateway to creating sub-stories for their various platforms, maximizing the ROI on their IP.
Based on the uptake of and enthusiasm for next-gen AI in media workflows, and on where the technology is heading, I expect some very impressive use cases will take center stage at NAB Show 2025. Structural changes to media workflows are to be expected, especially on the creative side, and these will strongly impact how vendors demonstrate added value.
The current advancements in AI can’t be compared to past technology upgrades like the SD-to-HD evolution or the SDI-to-IP transition. GenAI will transform how we build content. Organizations that focus on creativity and implementing bold workflows will reap the rewards.
Missed Moments Lab at NAB Show 2024? Catch up on our latest product announcements here.