DAIQUIRI - AI for live sports reporting
DAIQUIRI — Data & Artificial Intelligence for Quantified Reporting In sport — was a two-year imec.icon research project (2019–2021) funded by imec and Agentschap Innoveren & Ondernemen. The consortium developed AI algorithms and a sensor data platform to turn athlete wearable and sensor data into narrative elements for live sports broadcasting. The project was demonstrated in professional cyclocross and elite hockey.
Sensor data that never reached the audience
During professional sports events, athletes generate large volumes of data through wearables and sensors — heart rate, power output, speed, positioning. This data was used primarily for coaching and performance analysis. It rarely reached the broadcast or the audience. The gap was not in the data itself but in the translation: turning raw sensor streams into narrative elements that commentators, content editors, and viewers could use in real time. No platform existed to bridge that gap at production speed.

An imec.icon research consortium
DAIQUIRI was an imec.icon research project funded by imec and Agentschap Innoveren & Ondernemen, running from October 2019 to September 2021. The consortium brought together experts in sports event capturing, sensor data platforms, editorial tools, and interactive user experiences. The project defined four outcomes: optimising sensor connections and data quality to cut data-stream sizes by 30%; enabling insight-driven data capturing with real-time dynamic visualisation; developing AI algorithms that generate different types of story snippets from sensor data; and unlocking storytelling techniques that complement traditional reporting.
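The 30% stream-size target in the first outcome is a bandwidth goal rather than a published technique. As a purely illustrative Python sketch (the function and the decimation strategy are assumptions, not the project's method), even naive thinning of an oversampled stream lands in that range:

```python
def thin(samples, drop_every=3):
    """Keep two of every three readings: a roughly 33% smaller stream.
    Illustrative only; a real pipeline would use smarter filtering
    or compression that preserves data quality."""
    return [s for i, s in enumerate(samples) if (i + 1) % drop_every != 0]

readings = list(range(300))          # stand-in for 300 raw sensor readings
thinned = thin(readings)
print(len(thinned) / len(readings))  # 0.666..., i.e. about a 33% reduction
```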
From sensor stream to story snippet
The core technical challenge was building a scalable data workflow that could ingest IoT data from athletes and equipment, process it through AI enrichment, and output structured narrative elements, all within the latency constraints of live broadcasting. The AI layer addressed four challenges: data overload, which meant filtering signal from noise in high-frequency sensor streams; sensor-video matching, which synchronised data points with broadcast footage; dynamic captioning, which generated contextual text overlays; and multi-modal story generation, which combined data, video, and text into coherent segments for different output formats.
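The project's pipeline itself is not public, but a minimal sketch makes the shape of such a workflow concrete. Everything in the Python below is an assumption standing in for the real system: the SensorReading type, the moving-average smoothing, the fixed video offset, and the caption format simply illustrate the filtering, sensor-video matching, and captioning steps named above.

```python
from dataclasses import dataclass
from statistics import mean

@dataclass
class SensorReading:
    athlete_id: str
    timestamp_ms: int  # sensor clock, in milliseconds
    heart_rate: int    # beats per minute
    power_watts: int

def smooth(readings, window=5):
    """Tame data overload: moving-average filter over a noisy stream."""
    smoothed = []
    for i, r in enumerate(readings):
        chunk = readings[max(0, i - window + 1): i + 1]
        smoothed.append(SensorReading(
            r.athlete_id,
            r.timestamp_ms,
            round(mean(c.heart_rate for c in chunk)),
            round(mean(c.power_watts for c in chunk)),
        ))
    return smoothed

def to_broadcast_time(reading, video_offset_ms):
    """Sensor-video matching: shift a sensor timestamp onto the video timeline."""
    return reading.timestamp_ms + video_offset_ms

def caption(reading):
    """Dynamic captioning: render an enriched data point as an overlay string."""
    return f"{reading.athlete_id}: {reading.heart_rate} bpm at {reading.power_watts} W"

# Toy stream: 20 readings at 10 Hz for one rider.
stream = [SensorReading("rider_7", t * 100, 160 + t % 6, 340 + t % 25)
          for t in range(20)]

for r in smooth(stream)[-3:]:
    print(to_broadcast_time(r, video_offset_ms=1_500), caption(r))
```

In a live setting each stage would run incrementally over a message stream rather than over in-memory lists, but the three functions map directly onto the challenges above; multi-modal story generation would then assemble such captions with the matching video segments.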
Cyclocross and elite hockey
The platform was demonstrated in two live sports scenarios: professional cyclocross and elite hockey. Sensor data and insights gathered by the platform were turned into templates for real-time visualisation, which fed a content creation dashboard. Media professionals used the dashboard to add data-driven insights to their coverage as the action unfolded.
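What a template handed to such a dashboard might contain is easiest to show as data. The schema below is hypothetical, not DAIQUIRI's published format; it only illustrates how an insight could arrive as a structured, ready-to-place element rather than a raw sensor value.

```python
import json

# Hypothetical story-snippet payload; all field names are assumptions,
# not the project's actual schema.
snippet = {
    "type": "dynamic_caption",
    "sport": "cyclocross",
    "athlete": "rider_7",
    "video_timecode": "00:42:17.320",
    "metrics": {"heart_rate_bpm": 178, "power_watts": 412},
    "text": "rider_7 attacks the climb at 412 W",
    "valid_for_seconds": 10,  # live overlays go stale quickly
}

print(json.dumps(snippet, indent=2))
```

A fixed set of such templates would let editors drop verified, pre-formatted insights into coverage without touching the underlying sensor feed.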

