Moments Lab, the AI video understanding and search company, is pleased to announce its public API, which lets organizations enrich their media assets with powerful metadata that can be integrated into any software.
Moments Lab’s AI indexing technology, MXT, generates time-coded, searchable metadata on media files, capturing what’s happening in every moment of a video and describing it as a human would. It can identify people, logos, shot types, landmarks, and speech, and highlight the most impactful quotes in a transcript.
The new API enables organizations to better discover their content and gain statistical insights, supporting needs such as content search, insight generation, curation, and recommendations.
“When we first released MXT, many people asked us how they could embed our video understanding technology within their own product, and leverage the data to boost content discoverability or serve other use cases,” said Frederic Petitpont, Moments Lab co-founder and CTO. “Our API is here to enable them to do just that, unlocking enhanced searchability, contextual insights, and new ROI opportunities.”
Organizations make their files available to Moments Lab, where they are temporarily stored and analyzed by MXT; the generated metadata is then sent to an existing DAM, MAM, CMS, or other software.
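To make that workflow concrete, the sketch below shows what an integration could look like in Python. The endpoint paths, field names, and polling pattern are illustrative assumptions only, not the documented API; refer to api.momentslab.com for the actual contract.

```python
# Illustrative sketch of the ingest-and-deliver workflow described above.
# The endpoints, payload fields, and polling flow are assumptions for
# illustration; consult api.momentslab.com for the real API contract.
import time
import requests

API_BASE = "https://api.momentslab.com/v1"   # hypothetical base URL
API_KEY = "YOUR_API_KEY"                      # credentials issued by Moments Lab
HEADERS = {"Authorization": f"Bearer {API_KEY}"}

def index_media(source_url: str) -> dict:
    """Submit a media file for MXT indexing and wait for its metadata."""
    # 1. Point Moments Lab at the file (it is stored temporarily for analysis).
    job = requests.post(
        f"{API_BASE}/index",                  # hypothetical endpoint
        headers=HEADERS,
        json={"media_url": source_url},
    ).json()

    # 2. Poll until MXT has finished generating the time-coded metadata.
    while True:
        status = requests.get(
            f"{API_BASE}/index/{job['id']}", headers=HEADERS
        ).json()
        if status["state"] == "done":
            return status["metadata"]
        time.sleep(10)

metadata = index_media("https://example.com/footage/interview.mp4")

# 3. Push the result into your own DAM, MAM, or CMS (placeholder URL).
requests.post("https://dam.example.com/assets/123/metadata", json=metadata)
```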
Metadata generated by other AI indexing tools is typically delivered in formats that are difficult to read or cannot be edited, leaving no opportunity for evaluation. MXT produces readable, text-based metadata that’s fully editable within the Moments Lab platform before being sent to existing software, ensuring portability, accuracy, and transparency.
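As an illustration of what readable, time-coded metadata can look like, the sketch below models a single indexed moment as a plain Python dictionary. The field names are hypothetical and only meant to convey the idea of editable, text-based output, not the actual MXT schema.

```python
# Hypothetical example of time-coded, human-readable metadata for one
# moment of a video; the field names are illustrative, not the MXT schema.
moment = {
    "start": "00:01:12.400",
    "end": "00:01:27.900",
    "description": "A reporter interviews a cyclist at the finish line "
                   "of a city race, with a sponsor logo visible behind them.",
    "people": ["reporter", "cyclist"],
    "logos": ["(sponsor logo)"],
    "shot_type": "medium close-up",
    "transcript_quote": "Crossing that line felt unreal.",
}

# Because the metadata is plain text, it can be reviewed and corrected
# before it is delivered to a DAM, MAM, or CMS.
moment["description"] = moment["description"].replace("city race", "charity race")
```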
Developers can access all relevant documentation at the MXT API developer website, api.momentslab.com.
For more information, please visit momentslab.com/products/just-index or contact us.