
Why is it super-important?


Because at events using many cameras, without this type of setup there would be jarring visual differences when switching from one camera/view to another during the broadcast.


Why wasn't it jarring in early Super Bowl games?


There was even more color correction happening then - cameras were worse and more analog! It just was not being controlled from offsite, instead there was a dedicated room and engineer in the broadcast truck doing camera configuration + color correction.

The usual technique was to start by holding up a color card on the stage/floor, then use a vectorscope[1] to get all the dots to line up in the right places. The same was then done with a waveform monitor for exposure. During the event, there would be fine-tuning by eye as things drifted out of line.

[1] https://en.wikipedia.org/wiki/Vectorscope#/media/File:PAL_Ve...
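The color-card lineup described above can be sketched numerically: each patch's RGB maps to a U/V chroma point, and "getting the dots to line up" means each measured point falls on its known target. This is a minimal illustration using BT.601 coefficients; the patch values and tolerance are illustrative, not broadcast spec, and `misalignment` is a made-up helper name.

```python
import math

def rgb_to_uv(r, g, b):
    # BT.601: Y = 0.299R + 0.587G + 0.114B; U, V are scaled color differences
    y = 0.299 * r + 0.587 * g + 0.114 * b
    u = 0.492 * (b - y)
    v = 0.877 * (r - y)
    return u, v

# Reference 75% color-bar patches (R, G, B in 0..1) -- the vectorscope targets
reference = {
    "yellow":  (0.75, 0.75, 0.0),
    "cyan":    (0.0, 0.75, 0.75),
    "green":   (0.0, 0.75, 0.0),
    "magenta": (0.75, 0.0, 0.75),
    "red":     (0.75, 0.0, 0.0),
    "blue":    (0.0, 0.0, 0.75),
}

def misalignment(camera_rgb, tolerance=0.02):
    """Return the patches whose measured U/V dot is off its target."""
    off = {}
    for name, ref_rgb in reference.items():
        tu, tv = rgb_to_uv(*ref_rgb)           # target dot
        cu, cv = rgb_to_uv(*camera_rgb[name])  # this camera's measured dot
        err = math.hypot(cu - tu, cv - tv)
        if err > tolerance:
            off[name] = err
    return off

# A camera whose red patch reads slightly warm: only "red" gets flagged,
# telling the engineer which direction to trim.
camera = dict(reference)
camera["red"] = (0.80, 0.02, 0.0)
print(misalignment(camera))
```

The same per-patch error would drive the knob adjustments; a perfectly matched camera returns an empty dict.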

PS: You can also see modern vectorscope / waveform monitor images in this photo from the cyanview blog. Look for the black and white X-ray looking things on the screens. https://www.cyanview.com/wp-content/uploads/2022/10/20221006...


They used around 150 cameras for the last Super Bowl. Most of them were Sony studio cameras, controlled with Sony remotes to ensure perfect alignment. But now, they've added a lot of specialty cameras: probably 4 or 8 pylons, each equipped with 2 to 4 cameras, plus drones, handheld mirrorless cameras, mini high-speed cameras, and a few other mini-cams for PoV (Point of View) shots.

Last year, they even had a mini-cam inside the cars driving from the Bellagio to the stadium, controlled remotely over cellular. An Elixir process ran on a RIO in the car to manage the camera and connect to a cloud server, while the remote panel was linked to the same server to complete the connection. All three ran Elixir code, with the cloud server acting as a simple data relay.
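The relay topology described here (a camera-side process and a remote panel both dialing out to a cloud server that simply forwards bytes between them) can be sketched as follows. The real system ran Elixir on a Cyanview RIO; this Python version is purely illustrative, and the names and the single-pair design are assumptions.

```python
# Minimal sketch of "the cloud server acting as a simple data relay":
# two peers dial in, and the relay pipes bytes between them both ways.
import socket
import threading

def run_relay(host="127.0.0.1", port=0):
    """Accept exactly two peers and forward bytes between them.
    Returns the port the relay is listening on."""
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind((host, port))
    srv.listen(2)

    def serve():
        a, _ = srv.accept()  # e.g. the camera-side process in the car
        b, _ = srv.accept()  # e.g. the remote control panel

        def pipe(src, dst):
            # Forward until the source closes, then close the other side
            while chunk := src.recv(4096):
                dst.sendall(chunk)
            dst.close()

        threading.Thread(target=pipe, args=(a, b), daemon=True).start()
        threading.Thread(target=pipe, args=(b, a), daemon=True).start()

    threading.Thread(target=serve, daemon=True).start()
    return srv.getsockname()[1]

# Usage: both endpoints connect outbound to the relay (so neither needs
# a public address over cellular) and talk through it.
port = run_relay()
camera = socket.create_connection(("127.0.0.1", port))
panel = socket.create_connection(("127.0.0.1", port))
panel.sendall(b"gain +1")   # panel sends a control command
print(camera.recv(4096))    # camera side receives it via the relay
```

The outbound-only connections are the point of the design: both ends can sit behind NAT on a cellular link, and the relay never interprets the traffic.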

If you want the green of the grass on all the pylon cameras to match your main production cameras, adjustments are a must. And with outdoor stadiums, this is a constant task: lighting conditions change throughout the day, even when a cloud moves across the sky. When night falls, video engineers are working non-stop to keep everything perfectly aligned with the main cameras.


Fewer cameras, lower resolution, poor color rendering even with one camera anyway?


And less color fidelity on receiving displays I would imagine!


I don't think he's saying that the end this tech serves (televised professional sports) is super important, only that televising professional sports and other events requiring similar camera work depends on doing this kind of thing.


If you've ever watched a poorly-produced porno where the colors change with every camera cut, you'll know why....



