
I'm a 4K HDR fanboy, but IMO most movies shot on 35mm film should remain in 1080p. 70mm film looks great in 4K though (2001, Nolan movies, etc).

It's a shame there is no 1080p HDR content. 35mm movies would benefit a lot more from higher color bit depth than from higher resolution. Does anyone know if this is a codec limitation?



I think 35mm->4K scans can look amazing. There's a lot going on with film-to-digital scanning, though. The film stock itself has a lot to do with it, from the conditions it has been stored in to the actual type of stock. The choice of physical scanner also makes a difference: single-pass realtime captures vs. slow-speed triple-flashed scans. Single pass is usually done on a CMOS-type sensor, whereas triple-flash scans are typically done on a CCD sensor. There's a whole list of things involved that makes one 35mm->4K transfer unlike the next.

Typically, you as a consumer will be receiving an H.264 or H.265 stream that your cable box or streaming device decodes. These have all been updated with levels/profiles to allow for things like 3D, 4K, HDR, and 10-bit encodes. The shoehorn used to shove new features into these formats has been put to heavy use.
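If you're curious what your device is actually being handed, ffprobe will report the codec, bit depth, and transfer function in one shot. A rough sketch in Python (assumes ffprobe is on your PATH; "movie.mkv" is just a placeholder):

    # Ask ffprobe what the video stream actually is.
    import json
    import subprocess

    def probe_video(path):
        out = subprocess.run(
            ["ffprobe", "-v", "error", "-select_streams", "v:0",
             "-show_entries",
             "stream=codec_name,width,height,pix_fmt,color_transfer,color_primaries",
             "-of", "json", path],
            capture_output=True, text=True, check=True).stdout
        s = json.loads(out)["streams"][0]
        # HEVC + a 10-bit pixel format + the smpte2084 (PQ) transfer is the
        # usual signature of an HDR10 stream; HLG shows up as arib-std-b67.
        print(s["codec_name"], s["width"], s["height"], s["pix_fmt"],
              s.get("color_transfer"), s.get("color_primaries"))

    probe_video("movie.mkv")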


35mm -> 4K can look amazing in terms of color, but I've yet to see a film where the added detail is not mostly film grain (exaggerated by sharpening). Do you have a good example?


That was the point about the type of film stock used. The film grain is why pretty much everyone will wince when you ask them to do a 16mm->4K. Hell, 16mm->HD was rough.

I don't have any examples on hand. But with the large amount of time I've found myself with over the past year to watch a lot of content, I've noticed that even some of the last episodics to be shot on film (2005-2010 range) have very noticeable grain. I have also seen some features from the 90s that looked really clean in comparison. Lots of things go into that: how much the camera department "cared", what film stock was used, what film processor was used, etc. Towards the end of mass film production, there were fewer and fewer labs left. During the heyday, the soups used in processing were in constant use. As demand dropped, the soups kind of stagnated, especially in shops that were financially strapped.

Film is fickle to be sure.


Another point to consider is that many 4K 35mm movies are actually upscaled from 1080p or 2K scans.

Here's a good list to check whether a film is real 4K or not.

https://www.digiraw.com/DVD-4K-Bluray-ripping-service/4K-UHD...

I've now remembered that the X-Men films from 20 years ago did look pretty good in 4K HDR. And apparently they are "real 4K". Also the 4K remaster of The Fifth Element.


Sorry, I was assuming we were talking about true film scans, not uprezzing. Uprez outputs can look disastrous, as people tend to use a heavy amount of noise reduction. Bad temporal noise filters look so bad when you can see residual artifacts from 4-5 frames earlier. *shudders* Or someone who did this to content that had a 3:2 cadence and now wants to restore the original frame rate, even though the cadence is no longer detectable by filters. Select fields it is! Ugh, too many flashbacks popping off.
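For anyone who hasn't had the pleasure: 2:3 pulldown spreads 4 film frames over 10 video fields, so two of every five video frames are a mix of two different film frames. A toy sketch of the pattern (purely illustrative, no real video I/O):

    # 2:3 pulldown: 4 film frames -> 10 fields -> 5 interlaced video frames.
    film_frames = ["A", "B", "C", "D"]
    pulldown = [2, 3, 2, 3]            # fields contributed by each film frame

    fields = []
    for frame, n in zip(film_frames, pulldown):
        fields.extend([frame] * n)     # A A B B B C C D D D

    video_frames = [tuple(fields[i:i + 2]) for i in range(0, len(fields), 2)]
    print(video_frames)                # AA, BB, BC, CD, DD

The BC and CD frames are the "dirty" ones; once a noise filter has smeared detail across fields, that pattern becomes nearly impossible to detect automatically.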


Not quite what you asked, but very related: http://yedlin.net/ResDemo/


This is awesome. Thanks for sharing.


But I like film grain. :) Barring exaggerated sharpening, it's closer to how it looks in a cinema film projection.


1080p HDR does exist (video games, mostly), but the lack of it is basically tied to the fact that there are no consumer TVs that can do HDR but not 4K.


> 70mm film looks great in 4K though (2001, Nolan movies, etc).

Best to leave modern productions (like Nolan) out of the conversation, IMO. It can give an unrealistic picture of what resolution is achievable. Take Lawrence of Arabia for example: it's possibly the paradigmatic example of a 70mm film, and yet in most scenes the resolution is no better than an upscaled 1080p. This is due to a combination of problems: film stock degrades relatively quickly and was generally of lower quality 50+ years ago than today, and focus pulling was not precise enough to take full advantage of the available resolving power of 70mm film.

Unfortunately this is also true for many scenes in 2001: A Space Odyssey. Take a look at these comparisons here: https://caps-a-holic.com/c.php?go=1&a=0&d1=12517&d2=12509&s1... You're looking at an upscaled 1080p film versus native 4k (tonemapped to have roughly comparable colors on SDR screens). There's little if any more detail in the 4k image.

On the other hand, 35mm film certainly can have details visible only in 4k, especially more recent stuff. Check out this comparison for Casino (1995), the difference is unbelievable: https://caps-a-holic.com/c.php?d1=13389&d2=13388&s1=134076&s...

I agree with you about HDR though. It's certainly the most important improvement that's come up in the last decade, much more than the resolution increase.

> It's a shame there is no 1080p HDR content. 35mm movies would benefit a lot more from higher color bit depth than from higher resolution. Does anyone know if this is a codec limitation?

Yes and no. The standard Blu-ray format (which uses H.264) is not able to accommodate HDR or 4K, so it wasn't possible until the release of the UHD Blu-ray standard. UHD players support 4K, HDR, and the HEVC codec instead of H.264. HEVC (aka H.265) can support HDR at any resolution, but since you're releasing a "UHD" disc anyway that only people with supporting players can use, you may as well put it on there in 4K. Better for marketing, probably. So it's a bit complicated. If you check bittorrent trackers, you'll probably be able to find 1080p films with HDR (encoded with HEVC), but no one in the industry releases films that way.
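To the codec point: nothing stops you from making a 1080p HDR10 HEVC file yourself. A rough sketch with ffmpeg/libx265 (assumes the source is already 10-bit BT.2020/PQ; the mastering-display and light-level numbers are just example values):

    import subprocess

    # HDR10 metadata passed straight through to the x265 encoder.
    x265_params = ":".join([
        "colorprim=bt2020",
        "transfer=smpte2084",      # PQ transfer function
        "colormatrix=bt2020nc",
        "master-display=G(13250,34500)B(7500,3000)R(34000,16000)"
        "WP(15635,16450)L(10000000,1)",
        "max-cll=1000,400",        # example MaxCLL / MaxFALL
    ])

    subprocess.run([
        "ffmpeg", "-i", "source_1080p_pq.mov",
        "-c:v", "libx265", "-pix_fmt", "yuv420p10le",
        "-x265-params", x265_params,
        "-c:a", "copy",
        "hdr10_1080p.mkv",
    ], check=True)

Most HDR-capable players handle a file like that fine; you just won't see it on a retail disc.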

(Also, the higher bit depth is not quite as important as the extended dynamic range.)
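For the curious, the PQ curve (SMPTE ST 2084) is what maps nits to code values across that extended range. A quick back-of-envelope using the standard constants, showing roughly how many code values land between 100 and 200 nits at 8 vs 10 bits:

    # PQ (SMPTE ST 2084) inverse EOTF: nits -> normalized signal value [0,1].
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

    def pq_encode(nits):
        y = nits / 10000.0
        return ((c1 + c2 * y**m1) / (1 + c3 * y**m1)) ** m2

    for bits in (8, 10):
        span = (pq_encode(200) - pq_encode(100)) * 2**bits
        print(f"{bits}-bit: ~{span:.0f} code values between 100 and 200 nits")

At 8 bits the whole 0-10,000 nit range gets only 256 steps, which is where banding comes from; the wider range and the extra bits go hand in hand.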



