The amount of drama about AI-based upscaling seems disproportionate. I know framing it in terms of AI and hallucinated pixels makes it sound unnatural, but graphics rendering already relies on countless hacks and approximations.
Even without modern deep-learning-based "AI", it's not like the pixels you see from traditional rendering pipelines were all artisanal and curated.
> AI upscaling is equivalent to lowering bitrate of compressed video.
When I was a kid, people had dozens of CDs with movies, while pretty much nobody had DVDs. DVD was simply too expensive, while Xvid let you compress an entire movie onto a single CD while keeping good quality. Of course the original DVD release would have been better, but we were too poor, and watching ten movies at 80% quality beat watching one movie at 100% quality.
DLSS lets you effectively quadruple FPS with minimal subjective quality impact. Of course a natively rendered image would be better, but most people are simply too poor to buy a gaming rig that runs the newest games at 4K 120 FPS on maximum settings. You can keep arguing as much as you want that the natively rendered image is better, but unless you send me money for a new PC, I'll keep using DLSS.
> I am certainly not going to celebrate the reduction in image quality
What about perceived image quality? If you are just playing the game, the chances of you noticing anything (unless you crank the upscaling up to the maximum) are near zero.
The contentious part, from what I can tell, is the overhead of hallucinating these pixels, on cards that also cost a lot more than the previous generation for otherwise minimal gains outside of DLSS.
Some [0] are seeing a 20 to 30% drop in actual rendered frames when activating DLSS, and since input latency tracks those real frames, it rises correspondingly.
There are still games where it could be a decent tradeoff (racing or flight simulators? Infinite Nikki?), but it's definitely not a no-brainer.
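To make the tradeoff concrete, here's a rough back-of-the-envelope sketch. The specific numbers (60 FPS native, a 25% render-rate drop, a 4x frame multiplier) are illustrative assumptions picked to match the figures mentioned above, not benchmarks, and the latency model is deliberately simplistic (it ignores the extra buffered frame that frame generation typically adds):

```python
# Back-of-the-envelope math for the frame-generation tradeoff.
# All inputs are illustrative assumptions, not measured benchmarks.

def frame_gen_tradeoff(native_fps, render_drop, multiplier):
    """Return (displayed_fps, input_latency_ms) with frame generation on.

    native_fps:  FPS without frame generation
    render_drop: fraction of real frames lost to the generation overhead
    multiplier:  total frames shown per real rendered frame
    """
    real_fps = native_fps * (1 - render_drop)   # fewer real frames rendered
    displayed_fps = real_fps * multiplier       # smoother image on screen
    input_latency_ms = 1000 / real_fps          # input only advances on real frames
    return displayed_fps, input_latency_ms

# Example: 60 FPS native, 25% overhead, 4x frame generation.
shown, latency = frame_gen_tradeoff(60, 0.25, 4)
print(shown, latency)  # 180 FPS shown, ~22.2 ms per real frame (vs ~16.7 ms natively)
```

So under these assumed numbers the screen shows three times as many frames, while the game actually responds about 33% more slowly, which is why fast-paced shooters feel it more than racing or flight sims.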