All production is handled by a UK broadcast facility
As SailGP concludes its 2021-22 season this weekend in San Francisco, it continues to prove how quickly new remote production workflows can evolve. On-site presence is minimal: camera signals, audio signals and heaps of data are sent from the boats to shore via RF, then carried over SMPTE ST 2110 transport halfway around the world to a production team at Timeline's Ealing Broadcast Centre in the UK.
“It seems like a long time since we started this season in Bermuda, and the world is a little different than it was a year ago,” said Warren Jones, Chief Technology Officer, SailGP. “It was difficult for us to move around at the time and do what we had to do, and our remote production was essential for us.”
Jones says the production team was lucky because there were no legacy systems, workflows or contracts to manage. Eight national teams compete on identical F50 foiling catamarans capable of reaching speeds of up to 60 mph, and the boats and the athletes who sail them carry sensors that collect and transmit tons of data for each competition.
“We had a blank sheet of paper, so the question was: how do we want to do it?” he recalls. “Do we want to do what we did two years ago for the America’s Cup? Or do we want to look ahead and have something scalable for the next 10 or 15 years?”
The result is a workflow in which most people, including the umpires, work from a control room at Timeline’s London facility. SMPTE ST 2110 is an essential technical piece, allowing the team to carry signals to Timeline from anywhere in the world while also providing a return path so the output can be monitored at the race site.
“We send the finished world feed to the race site, where it plays through the hospitality areas, our media center and then onto the big screens,” Jones explains. “Nothing is produced on site.”
With the exception of cameras, microphones and Riedel Bolero intercoms, all equipment is located in London, including the augmented-reality graphics engines, which play an important role in visualizing all the data. That data is captured on board via pressure sensors, gyroscopes, GPS and other technologies.
“We have 30,000 data points per boat, and eventually we’ll have something like 40 billion data points,” Jones says of a data rate of over 15,000 messages every 500 ms. “We will know everything, including the stresses on the foils and the hulls, and the pressure on the trampoline. We are lucky to have a partner like Oracle and the way they handle data. When you see the database and the number of lines of code in it, it’s amazing.”
All data goes via RF from the boats to the shore and via fiber to the Oracle servers in London. It’s the basis for almost everything that helps tell the racing story visually through graphics and other elements. And all data is transmitted in 180 ms.
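The quoted figures imply a substantial sustained message rate. A quick back-of-the-envelope check, assuming the rates above are uniform (an assumption, not a detail from SailGP), might look like:

```python
# Rough throughput check based on the figures quoted in the article.
# Assumption (not from the source): the rate is sustained and uniform.

MESSAGES_PER_WINDOW = 15_000   # "over 15,000 messages every 500 ms"
WINDOW_S = 0.5                 # the 500 ms measurement window
LATENCY_S = 0.180              # quoted boat-to-London transmission time

messages_per_second = MESSAGES_PER_WINDOW / WINDOW_S

# Messages still "in flight" at any instant, given the quoted latency.
in_flight = messages_per_second * LATENCY_S

print(f"{messages_per_second:,.0f} messages/s")        # 30,000 messages/s
print(f"~{in_flight:,.0f} messages in flight end to end")  # ~5,400
```

In other words, at that rate roughly 5,400 messages are traversing the RF-and-fiber path at any given moment.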
“There’s a program called Oracle Stream Analytics where we have pre-determined templates to define about a thousand different metrics that we use,” Jones explains. “We use a Kafka bridge to get the metrics into our augmented reality graphics, 2D graphics, sales app, and any third party or wherever we want to display that information.”
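The bridge Jones describes amounts to consuming telemetry messages from a Kafka topic and filtering each one down to the pre-determined metrics a given consumer (AR graphics, 2D graphics, the sales app) subscribes to. A minimal sketch of that filtering step, with an entirely hypothetical topic name, message schema and metric names (SailGP's actual schema is not public):

```python
"""Sketch of the metrics-selection step behind a Kafka bridge.
The field names, topic name and sample values below are illustrative
assumptions, not SailGP's or Oracle's actual schema."""

import json


def extract_metrics(raw: bytes, wanted: set) -> dict:
    """Parse one telemetry message and keep only the requested metrics."""
    record = json.loads(raw)
    return {key: value for key, value in record.items() if key in wanted}


# Metrics a graphics overlay might subscribe to (hypothetical names).
GRAPHICS_METRICS = {"boat_id", "speed_kts", "heading_deg", "foil_load_n"}

# A real bridge would poll a broker, e.g. with the kafka-python client:
#   from kafka import KafkaConsumer
#   consumer = KafkaConsumer("boat-telemetry", bootstrap_servers="broker:9092")
#   for msg in consumer:
#       push_to_graphics(extract_metrics(msg.value, GRAPHICS_METRICS))
# Here a single hand-built sample message stands in for the stream:
sample_msg = json.dumps({
    "boat_id": "GBR",
    "speed_kts": 48.2,
    "heading_deg": 215,
    "rudder_angle": 3.1,   # present in the message but not subscribed to
    "foil_load_n": 9100,
}).encode()

metrics = extract_metrics(sample_msg, GRAPHICS_METRICS)
print(metrics)
```

The same `extract_metrics` call with a different `wanted` set would serve each downstream consumer, which matches the one-source, many-destinations shape Jones describes.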
Joining the data on its journey to Timeline are video signals from two cameras on each boat, one camera in a helicopter, cameras on three chase boats and two ENG cameras.
“All of these cameras go through our network in London, where the show is produced,” Jones explains. “Then we distribute them from London to our partners at Sky, CBS, Canal Plus and Fox Sports in Australia.”