Says who?
Regardless, consider that even in a five-year span - a very long time in technology years - there's only so much you can do to make graphics look better. We're not going from a SNES to a Nintendo 64 anymore. We're not even going from a Nintendo 64 to a GameCube. If you set the hypothetical "perfect endgame" of video game graphics as perfect realism, indistinguishable from filmed footage, we're never going to get there. At best, we can take the small remaining distance between that realism and the very best of what a PS4 or an Xbox One or a cutting-edge gaming PC can put out and keep dividing it in half. We're never going to get "there", to that hypothetical endgame, but we can keep getting closer while never quite reaching it. There's a mathematical term for this - an asymptote, where you keep halving the remaining distance and never quite arrive - but that's as far as I'll go, because I was told there would be no math.
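(Okay, a tiny bit of math, for the curious. This is a rough sketch: the starting gap d is just an abstract stand-in for how far today's best graphics are from perfect realism, not anything anyone can actually measure.)

\[
\text{remaining gap after } n \text{ generations} = \frac{d}{2^n}, \qquad \frac{d}{2^n} > 0 \text{ for every finite } n, \qquad \lim_{n \to \infty} \frac{d}{2^n} = 0
\]

Each new generation halves whatever gap is left, so the gap gets arbitrarily small, but no finite number of generations ever makes it zero - which is exactly that "always closer, never there" behavior.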
What hardware makers can do, and what they're honestly endeavoring to do from generation to generation now, is not to make what's onscreen look more real, but to allow more to be done at any one time. Five years of progress makes processors and other components faster, cheaper, smaller, and less power-hungry, which lets you put higher-grade components in your hardware, and/or more of those components - and that's the goal of new hardware every 4-6 years or so. The goal of the NX - or the PS5, or the Xbox... Two? I dunno, you get the idea - isn't to make its graphics hugely better and more realistic, because again, there's only so much progress to be made there over the current iterations. Rather, the goal is to build hardware that allows more of those high-end graphics and other processes and computations to all take place at once, which constitutes a significant graphical improvement in its own way.