The Smart Media City Pilot used the Next Generation Meta Operating System (NEMO Meta OS) framework to push the boundaries of live media capture and user involvement at a live outdoor race event. It provided an effective broadcast, analysis, and production solution, and sought to enhance user engagement by offering a Personalized Content Delivery solution through a dedicated app.
During the race, spectators and selected runners captured media content using smartphones, tablets, and 360° cameras, where available along the running circuit. The incoming content underwent automated processing, annotation, and rendering, with AI/ML models running partially on the devices and partially at the edge. A curated selection of this content was then broadcast in real time (production control was handled remotely from Madrid, Spain), for example via social media, based on the location of the leading runners and notable race events, as identified through automated and user-provided annotations.
Spectators of the pilot race enhanced their contributions and interacted with other users in response to specific race incidents. The trial emphasized real-time processing and rendering of user-generated content, leveraging Federated Learning (FL) hosted across IoT nodes (smartphones), edge devices, and cloud infrastructure. AI models were trained to recognize Racing Bib Numbers (RBNs), enabling better identification of runners and their precise positioning within each video stream.
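To illustrate the Federated Learning pattern described above, the sketch below shows the classic FedAvg scheme: each client (e.g. a smartphone or edge node) trains on its private data locally, and the server aggregates only the resulting model weights, weighted by local dataset size. This is a minimal illustration, not the pilot's actual implementation; the logistic-regression task and all function names are assumptions for the example.

```python
import numpy as np

def local_update(weights, data, labels, lr=0.1, epochs=5):
    """One client's local training step: plain gradient descent for
    logistic regression on its private data (which never leaves the device)."""
    w = weights.copy()
    for _ in range(epochs):
        preds = 1.0 / (1.0 + np.exp(-data @ w))        # sigmoid predictions
        grad = data.T @ (preds - labels) / len(labels)  # mean log-loss gradient
        w -= lr * grad
    return w

def fed_avg(global_w, client_datasets):
    """Server-side round: collect each client's updated weights and
    average them, weighted by how many samples each client holds."""
    updates, sizes = [], []
    for data, labels in client_datasets:
        updates.append(local_update(global_w, data, labels))
        sizes.append(len(labels))
    return np.average(updates, axis=0, weights=np.array(sizes, dtype=float))

# Simulated deployment: three clients with private, locally generated data.
rng = np.random.default_rng(0)
true_w = np.array([1.0, -2.0])
clients = []
for _ in range(3):
    X = rng.normal(size=(50, 2))
    y = (X @ true_w > 0).astype(float)
    clients.append((X, y))

w = np.zeros(2)
for _ in range(10):          # ten communication rounds
    w = fed_avg(w, clients)
```

In a real deployment the averaging step would run in the cloud or at the edge, with clients exchanging weight deltas over the network rather than sharing raw media or labels.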
A video of the pilot is available at: https://youtu.be/SKpcN5UUjJs?si=vUQa7wV4z27q2cJ0
