
> With enough lines, enough brightness, and a high-enough refresh rate, it may become possible to have a display that can artificially emulate the features of a CRT -- including phosphor persistence and blooming and focus issues and power supply sag and everything else, with interlacing. AFAICT, we aren't there yet.

To truly do that, you need to display over 20 million frames a second.

Why?

True analog video didn't capture frames; each pixel was transmitted / recorded at the instant it was captured. This becomes clear when watching shows like Mr. Rogers on an LCD: when the camera pans, the walls look slanted. (This never happened on a CRT, because the CRT redrew each line with the same time offset it was captured with, so the skew cancelled out.) On a progressive display the whole deinterlaced frame is shown at a single instant, so the fact that the top of the image was captured before the bottom becomes visible. I wouldn't even expect a 60i -> 60p deinterlacer to correct it.
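Rough back-of-the-envelope for where a number like 20 million comes from (assuming NTSC timing and the common 13.5 MHz sampling clock; sample the line more finely and the per-sample rate climbs toward 20 million), plus the pan-skew math:

    # Rough NTSC timing arithmetic (assumed: 525 total lines, 59.94 fields/s,
    # 858 samples per line at the common 13.5 MHz clock -- illustrative only).
    FIELD_RATE = 59.94                              # fields per second (60i)
    LINES_PER_FRAME = 525                           # both fields combined
    LINE_RATE = LINES_PER_FRAME * FIELD_RATE / 2    # ~15,734 lines per second
    SAMPLES_PER_LINE = 858
    SAMPLE_RATE = LINE_RATE * SAMPLES_PER_LINE      # ~13.5 million samples per second

    print(f"line rate:   {LINE_RATE:,.0f} lines/s")
    print(f"sample rate: {SAMPLE_RATE / 1e6:.1f} million samples/s")

    # Why panning skews on a progressive display: the bottom of a field is
    # captured ~1/60 s after the top.  A pan of, say, 300 px/s (made-up number)
    # shifts the bottom lines ~5 px relative to the top lines of the same field.
    PAN_SPEED = 300                                 # pixels per second
    print(f"skew across one field: {PAN_SPEED / FIELD_RATE:.1f} px")

Displaying each of those samples at its own moment in time is what drives the frame rate into the tens of millions.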

That being said, I don't want to emulate a CRT:

- I want a deinterlacer that can figure out how to make the (cough) best image possible so deinterlacing artifacts aren't noticeable. (Unless I slow down the video / look at stills.)

- I want some kind of machine-learning algorithm that can handle the fact that the top of the picture was captured slightly before the bottom of the picture; then generate a 120p or a 240p video.
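Not the ML model itself, but a minimal sketch of the retiming problem such a model would have to solve: every scanline has its own capture timestamp, and 120p/240p output means resampling each line independently onto a common target time. The blending below is plain linear interpolation on made-up data, purely to illustrate the timing model; a real implementation would use motion compensation.

    import numpy as np

    FIELD_RATE = 59.94      # 60i: one field every ~1/60 s
    H, W = 480, 640         # full frame size (fields line-doubled for simplicity)

    def line_capture_time(field_index, line):
        """Approximate capture time of one scanline within a given field."""
        return field_index / FIELD_RATE + (line / H) / FIELD_RATE

    def synthesize_frame(fields, target_time):
        """Crude per-line retiming: for each output line, blend the two fields
        whose capture times for that line bracket target_time."""
        out = np.zeros((H, W), dtype=np.float32)
        for y in range(H):
            times = np.array([line_capture_time(k, y) for k in range(len(fields))])
            k = int(np.clip(np.searchsorted(times, target_time) - 1, 0, len(fields) - 2))
            a = (target_time - times[k]) / (times[k + 1] - times[k])
            out[y] = (1 - a) * fields[k][y] + a * fields[k + 1][y]
        return out

    # Example: synthesize 240p-rate output frames from a short run of (fake) fields.
    rng = np.random.default_rng(0)
    fields = [rng.random((H, W), dtype=np.float32) for _ in range(4)]
    frames_240 = [synthesize_frame(fields, t / 240.0) for t in range(4, 12)]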

CRTs had a look that wasn't completely natural; it was pleasant, like old tube amplifiers and tube-based mixers, but it isn't something that I care to reproduce.



I definitely understand you very well, and I agree.

Please allow me to restate my intent: with enough angular resolution (our eyes have limits), and enough brightness and refresh rate, we may be able to get close to how watching television was once perceived.

And to clarify: I don't propose completely chasing the beam with OLED, but rather emulation of the CRT that includes the appearance of interlaced video (whose fields are each an as-it-happens scan of the continuously changing scene in front of the analog camera that captured it), the scan lines that resulted, and the persistence and softness that allowed it to be perceived as well as it once was.
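For concreteness, here's a minimal sketch of that kind of emulation on a hypothetical high-refresh panel (the 480 Hz rate and the phosphor decay constant are placeholders, not claims about real hardware): each subframe, the newly "scanned" band of the current field is drawn at full brightness into the correct interlaced rows, and everything else decays exponentially, so interlacing, scan lines, and persistence all fall out of one model.

    import numpy as np

    FIELD_RATE   = 60.0     # incoming interlaced fields per second
    PANEL_RATE   = 480.0    # hypothetical panel refresh: 8 subframes per field
    DECAY_TAU_MS = 1.5      # placeholder phosphor persistence time constant
    H, W = 480, 640

    def emulate_field(field, parity, screen):
        """Yield one panel subframe per PANEL_RATE tick while `field` is scanned.
        `field` is (H//2, W); `parity` picks odd/even rows; `screen` is the
        persistent (H, W) phosphor state carried across fields."""
        subframes = int(PANEL_RATE / FIELD_RATE)
        decay = np.exp(-(1000.0 / PANEL_RATE) / DECAY_TAU_MS)
        for s in range(subframes):
            screen *= decay                               # phosphor afterglow
            lo = s * (H // 2) // subframes                # band the "beam" sweeps
            hi = (s + 1) * (H // 2) // subframes          #   during this subframe
            for line in range(lo, hi):
                screen[2 * line + parity] = field[line]   # freshly scanned line
            yield screen.copy()

    # Example: run alternating odd/even fields of noise through the emulator.
    rng = np.random.default_rng(1)
    screen = np.zeros((H, W), dtype=np.float32)
    subframes_out = []
    for k in range(4):
        field = rng.random((H // 2, W), dtype=np.float32)
        subframes_out.extend(emulate_field(field, parity=k % 2, screen=screen))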

In this way, panning in an unmodified Mr. Rogers video works with a [future] modern display, sports games and rocket launches are perceived largely as they were instead of as a series of frames, and so on. This process doesn't have to be perfect; it just needs to be close enough that it looks the ~same (largely no better, nor any worse) as it once did.

My completely hypothetical method may differ rather drastically in approach from what you wish to accomplish, and that difference is something that I think is perfectly OK.

These approaches aren't exclusive of each other. There can be more than one.

And it seems that both of our approaches rely on the faithful preservation of the existing (interlaced, analog, real-time!) video: once that information is discarded in favor of something that merely looks good today, future improvements for any particular video (whether in display technology, in deinterlacing/scaler technology, or both) become largely impossible.

In order to reach either desired result, we really need the interlaced analog source (or something as close to it as possible), not the dodgy transfers that are so common today.


Some screens will scan out top to bottom at 60 Hz and mostly avoid that skew. If you took an OLED that does that, and added another mechanism to black out lines after 3 ms, you'd have a pretty good match to the timing of the incoming signal.
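Not a claim about any particular panel, but the timing rule being described is simple enough to sketch: with a 60 Hz top-to-bottom scanout and a ~3 ms per-line persistence window, at any instant only a rolling band of rows (roughly 3/16.7, about 18% of the screen) is lit.

    import numpy as np

    REFRESH_HZ = 60.0     # scanout rate matched to the incoming 60i signal
    PERSIST_S  = 0.003    # each row stays lit ~3 ms after the scanout passes it
    ROWS       = 1080     # arbitrary panel height for illustration

    def lit_mask(t):
        """Boolean mask of which rows are lit at absolute time t (seconds)."""
        frame_period = 1.0 / REFRESH_HZ
        row_times = np.arange(ROWS) / ROWS * frame_period   # when scanout hits each row
        age = (t - row_times) % frame_period                # time since last hit (wraps)
        return age < PERSIST_S

    mask = lit_mask(0.008)
    print(f"{mask.sum()} of {ROWS} rows lit (~{mask.mean():.0%} of the screen)")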


I don't want that kind of flicker, though. I just want the image to look good:

> I want a deinterlacer that can figure out how to make the (cough) best image possible so deinterlacing artifacts aren't noticeable.

> I want some kind of machine-learning algorithm that can handle the fact that the top of the picture was captured slightly before the bottom of the picture; then generate a 120p or a 240p video.

---

If we want to black out lines after 3 ms, we might as well bring these back: https://www.youtube.com/watch?v=ms8uu0zeU88 ("Video projectors used to be ridiculously cool", by Technology Connections.)


Okay, so you're not interested in the flicker versus persistence tradeoffs.

In that case you just need a 60Hz-synced scanout, and you can get screens that do that right now. That will beat any machine learning stabilizer.



