
I get how a film buff might care, and agree the original version should be available, but isn’t there space for people who just want to see the story but experience it with modern levels of image quality? The technical details of the technology of a given era are definitely interesting to some people, but if I were, say, the writer or someone else on the creative rather than technical side of a film, I might find that those technical limitations make the story less accessible to people used to more modern technology and image quality.


What does "modern levels of image quality" mean in this context?

The article is about AI upscaling "True Lies", which was shot on 35mm film. 35mm provides a very high level of detail -- about equivalent in resolution to a 4k digital picture. We're not talking about getting an old VHS tape to look decent on your TV here.

The differences in quality between 35mm film and 4k digital are really more qualitative than quantitative -- things like dynamic range and film grain. But lighting and dynamic range are just as much directorial choices as script, story, or any other aspect of a film. It's a visual medium, after all.

Is the goal to have all old movies have the same, flatly lit streaming "content" look that's so ubiquitous today?

I think the argument against "isn’t there space for people who just want to see the story but experience it with modern levels of image quality" is that such a space is ahistorical -- it's a space for someone who doesn't want to engage with the fact that things were different in the (not even very distant) past, and (at the risk of sounding a bit pretentious) it breeds an intellectually lazy and small-minded culture.


The problem with that is that the content is usually shot with a certain definition in mind. Unless you refilm certain scenes from scratch, they can end up looking weird in higher definition, simply because certain tricks rely on low definition/poor quality, or because you get a mismatch between old VFX and new resolution, for example.

It's a widespread issue with the emulation of old games that were made for really low-resolution screens with different aspect ratios and slow hardware, especially early 3D/2D combinations like Final Fantasy, and games that relied on janky analog video output to draw their effects.


For a specific simple example: multiple Star Trek TV series were shot with the assumption that SDTV resolution would hide all the rough edges of props and fake displays. Watch them in (non-remastered) HD and suddenly it's very obvious how much of the set is painted plywood and cardboard.


One somewhat funny example of this is in the first ST:TNG episode "Encounter at Farpoint". In one shot, the captain asks Data a question, and the camera turns to him to show him standing from his seat at the conn and answering. At the bottom of the screen, it's plainly visible (in the new Blu-Ray version) that a patch of extra carpet is under the edge of the seat. It was probably put there to level the seat or something. At the time, this was ignored, because on a standard SDTV screen, the edges are all rounded, so the very edge of the frame isn't normally visible.

Another thing that's plainly obvious in TNG's remastered version is all the black cardboard placed over the display screens in the back of the bridge, to block glare from lights. In SDTV, this wasn't noticeable because the quality was so bad.


Actually, I would expect AI upscaling of SDTV to perform better in this case. It would semantically assume the props were real and extrapolate them as such.
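(Not from the thread, just an illustration of what that kind of upscaling step looks like in practice: a minimal sketch using OpenCV's dnn_superres module with a pretrained EDSR model. The model file name and frame paths are assumptions; the point is only that the model synthesizes plausible detail rather than recovering what the camera never resolved, which is exactly why it might render a cardboard prop as a "real" surface.)

    # Minimal sketch (assumption-laden): upscale one SD frame with a
    # pretrained super-resolution model via OpenCV's dnn_superres module.
    # Requires opencv-contrib-python and a separately downloaded
    # EDSR_x4.pb model file (hypothetical local paths below).
    import cv2

    sr = cv2.dnn_superres.DnnSuperResImpl_create()
    sr.readModel("EDSR_x4.pb")      # path to the pretrained model (assumed)
    sr.setModel("edsr", 4)          # algorithm name and 4x upscale factor

    frame = cv2.imread("sd_frame.png")   # e.g. a 480p frame grab (assumed filename)
    upscaled = sr.upsample(frame)        # synthesizes plausible detail; it cannot
                                         # recover texture the camera never captured
    cv2.imwrite("sd_frame_4x.png", upscaled)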


For anything that's not just "grab a camera and shoot the movie", the format it is shot in is absolutely taken into account. I don't think you can separate the story from how the image is captured.


One perspective:

'Film buff' responses are common to every major change in technology and society. People highly invested in the old way have an understandably conservative reaction - wait! slow down! what happens to all these old values?! They look for and find flaws, confirming their fears (a confirmation bias) and supporting their argument to slow down.

They are right that some values will be lost; hopefully much more will be gained. The existence of flaws in beta / first-generation applications doesn't correlate with future success.

Also, they unknowingly mislead by reasoning with what is also an old sales disinformation technique: list the positive values of Option A and compare them to Option B; B, being a different product, inevitably differs from A's design and strengths and loses the comparison. The comparison misleads us because it omits B's concept and the strengths where it is superior to A; with a new technology, those strengths aren't even all known yet - in this case, we can see B's far superior resolution and cleaner image. We also don't know what creative, artistic uses people will come up with - for example, maybe it can be used to blend two very different kinds of films together.

These things happen with political and social issues too. It's just another form of the second step in what every innovator experiences: 'first they laugh at you, then they tell you it violates the orthodoxy, then they say they knew it all along'.


Feels like people have had over 20 years to move on from the narcissistic butchery Lucas did to Star Wars IV, but it seems we're still at Step 2.

Maybe ... just maybe ... Step 2 is where it stops sometimes because it was a bad idea and did make the films worse.



