This post is a guest contribution from Michael Fett.
If you walk into an electronics store to buy a new television, you might be surprised to find that most of the sets on display are now 4K Ultra High Definition televisions. You might be jumping up and down with joy that a new technology for films and television shows is upon us, with the thought that even greater detail can be pulled from your favorite films of the past to make them look better than ever.
The first major film from the past to test the new 4K UHD (and HDR) technology is upon us.
Ghostbusters is the first major film of the ’80s to get the upgrade to 4K UHD Blu-ray. Now, on paper, that sounds great: Ghostbusters had already received a 4K scan for the Mastered in 4K series of regular Blu-ray releases, so the master was ready for a later 4K UHD disc. Yet can a film shot in the ’80s really benefit that much from a format that gives us four times the pixels, in a sense presenting the film in four times the detail?
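For readers curious where "four times the pixels" comes from, it is simple arithmetic comparing the standard 1080p Blu-ray frame to the 3840×2160 UHD frame. A quick sketch:

```python
# Pixel counts: 1080p Blu-ray vs. 4K UHD (3840x2160)
hd_pixels = 1920 * 1080    # 2,073,600 pixels per frame
uhd_pixels = 3840 * 2160   # 8,294,400 pixels per frame

# UHD doubles the resolution in each dimension, so the total
# pixel count is exactly four times that of 1080p.
print(uhd_pixels / hd_pixels)  # → 4.0
```

Note that more pixels on the disc only help if the source negative actually holds that much detail, which is the crux of the grain debate below.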
Last night I got to put that question to the test on two different 4K UHD televisions: one a regular Samsung UHD TV, the other a Samsung 9000 series SUHD TV. Opinions at the viewing party were mixed, but there was general agreement on a few points.
There is no question that this release provides the best colors, sharpness, and detail ever seen in a Ghostbusters release, especially when special effects are on the screen. However, there is another effect, one that can spark debate among average viewers and even the most seasoned videoholics with a deep understanding of film: the enhancement of the grain in the picture.
The grain in the picture is natural and is supposed to be there. In the case of this Ghostbusters release, though, the grain is enhanced to the point where, at times, especially in bright sequences, it acted like a layer of dust over the image. I questioned that layer so much that I pulled out the regular Blu-ray release for comparison. Sure, the movie was quite grainy on that release too, but nowhere near as much as on the 4K UHD. That left some mixed emotions in the room.
Some of the videoholics loved this release and some were disappointed. Those who loved it take the view that the more grain the better, because the grain is supposed to be there. Those who were disappointed liked some aspects of the upgrade, but felt that the over-enhanced grain took away from the viewing experience despite the visual improvements.
For myself, I leaned toward the disappointed camp, with mixed emotions. On one hand, I felt the over-enhanced grain took away from the viewing experience; on the other, I was relieved to have finally found a format where I no longer need to upgrade most of my older films from the ’90s and further back. In most instances the Blu-ray will provide the best picture you will be able to see, with some minor exceptions: a future format that does not compress the sound and video, and films shot on 70mm (65mm negative) or larger stock, which hold enough detail to genuinely benefit from a 4K release. Beyond that, though, whether an older film benefits from 4K really depends on the quality of the film stock it was shot on. Many videoholic reviews say Ghostbusters was shot on extremely cheap stock for its day compared to what it could have used. That said, a lot of the films I like from that era and older had cheaper budgets than Ghostbusters.
No producer back then could have predicted 4K televisions or had them in mind, and to be honest, sometimes the low-quality film stock adds to the enjoyment. Am I going to buy older films on 4K UHD? Sure, but I am going to be very selective from this point forward. Do not get me wrong: on newer films like Peanuts, The Revenant, and Ender’s Game this format is just jaw-dropping, but with most older films a wall has been hit, which leaves me sad and happy at the same time.