Availability aside (Blu-ray rips are a thing), most people can't tell the difference between Full HD and 4K at all, at least in moving pictures[1]. I doubt bitrate makes much difference on top of that, as long as you start from some reasonable value.
I seriously can't tell the difference between a very low quality YIFY rip and a proper Blu-ray. If you freeze-frame, they both look bad; when they're moving, they both look great. I've done this as an experiment multiple times, and it's like judging wine: there's a threshold you need to pass, but beyond that you quickly run into diminishing returns.
[1] BTW, most movies are still mastered, or partially mastered (the SFX), at 1080p, and even when they're true 4K you get a high-quality downscale to 1080p for free. In fact, most 4K releases are still upscaled from 1080p masters.
I'd rather see 60–144 Hz before any resolution increase above 1080p, and maybe 4K.
I think lower quality rips show themselves a bit more on high-quality playback devices, but I generally avoid low-quality releases purely out of snobbery, so I could be wrong.
Not that I'm aware of. The only high-FPS movie I know of is Billy Lynn's Long Halftime Walk by Ang Lee, which was shot at 120fps. The Hobbit was shot at 48fps.
Interpolation and frame rate are two different things.
Care to give an argument beyond "looks garbage"? I think people reported that The Hobbit looked weird, but that's likely because we're used to 24fps on a subconscious level.