The better question you should be asking is why everyone else needs 300GB of disk space that transfers at over 7GB/s, 32 CPU cores at 6GHz, a video card worth $3000, and RAM sizes measured in three-digit gigabytes just for a fucking game?
At some point I have to wonder if the reason we have so much computing power is so we can use that computing power.
If you are serious (I do game development for a living and work on graphical assets daily, so this seems evident to me, but I totally understand it can be arcane stuff), it's simply that they chose a stylized graphical style that avoids a lot of the costly details you generally find in high-end games.
They use low-poly models, as far as I know there are no baked lightmaps (these are pretty expensive but are mandatory in a lot of engines if you want realistic shadows on highly detailed environments), and their shader materials probably use very simple and low-resolution maps.
All these things decrease the asset footprint by orders of magnitude.
If you want to look in more detail, you can compare similar rendering in Unity. Taking two Unity examples, you can compare:
- 'Chop Chop', a game using a similar rendering style: https://www.youtube.com/watch?v=GGTTHOpUQDE. If you take the pig and its environment shown in the video and go into the GitHub repository, you can see they only use one texture map: an albedo map.
All the models (pig + environment) weigh about 6 MB of textures and 350 kB of models, and that's enough for the full main character and an environment.
- a 'realistic PBR workflow gun asset' on the Asset Store (chosen randomly, but it seems nice, realistic, and contains only the gun, so we can see the download size): https://assetstore.unity.com/packages/3d/props/guns/free-fps.... The workflow needs 6 maps (there are 7 here, but you generally only use either a normal map or a height map). The pack weighs 35 MB. And that's only the gun; you still lack a full character holding it and the environment. A rough size comparison is sketched below.
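To put rough numbers on that footprint gap, here is a minimal back-of-the-envelope sketch; the resolutions, map count, and RGBA8 format are my own assumptions for illustration, not values taken from either asset:

```python
# Rough uncompressed texture-memory estimate (assumed RGBA8, 4 bytes per pixel).
def texture_mb(resolution, maps=1, bytes_per_pixel=4):
    return resolution * resolution * bytes_per_pixel * maps / (1024 ** 2)

stylized = texture_mb(1024, maps=1)  # assumed: a single low-res albedo map
pbr = texture_mb(4096, maps=6)       # assumed: albedo, normal, metallic, roughness, AO, emissive at 4K

print(f"stylized: ~{stylized:.0f} MB, PBR: ~{pbr:.0f} MB, ratio: ~{pbr / stylized:.0f}x")
# stylized: ~4 MB, PBR: ~384 MB, ratio: ~96x
```

On-disk sizes are much smaller once compression is applied, but the ratio stays roughly the same, which is why a stylized asset can fit in a few megabytes while a realistic PBR one needs tens of megabytes.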
While I really like Zelda, even with stylized graphics the game looks a bit outdated to me. The cel-shaded characters are fluid and pretty, but the low-resolution textures and low-poly models bother me a bit, especially in the environments. The art direction is really good, but technically I can only think they are held back by the hardware.
As a game developer, I totally want to use all the resources I know I can find on the target hardware. Trust me, even today there are lots of features game designers dream of putting in games and can't, because computing resources are still limited ^^. Do games NEED them to be fun? Of course not, but COULD they be fun experiences? I think yes :)
I absolutely am serious; a lot of games and software in general today demand far more system resources than they have any reasonable right to.
Don't give me "but the textures!" and the like either; optimize that stuff better instead. Whether it's Windows 10/11 or Call of Duty or Elite: Dangerous or Chrome or whatever strikes your fancy, software today has no business demanding the resources it does.
Lest we forget, the hardware we can buy today would have been considered supercomputers just a few years ago. You want to tell me that will choke and croak just doing mundane stuff like playing games or browsing the internet?
Well, they were given that right by the users, who spend a lot of money on these system resources and ask for games that are as beautiful and complex as we can make them (not all users; I'm not the last to spend time on old-school games, but a significant and heavy-spending portion of them).
Business is exactly why most games don't spend an enormous budget on optimization today. It's not a requirement for the great majority of customers, and it quickly becomes time- and cost-heavy, so the return on investment is pretty low.
Yes, I think that even with an infinite optimization budget, today's triple-A realistic rendering simply could not run in real time on a computer that's too old.
I also think it would really add value if background applications like Teams/Slack/Discord were less resource-heavy, since they are open but not the main focus; but when you play a high-end video game, it makes sense to consider it your main reason for using your computer at that time :)
If simulating and rendering a complete, complex, interactive, realistic-but-imaginary world at today's achievable level of detail seems mundane to you, it is far from seeming that way to me :)
No opinion about browsers and OSes; today's games do a lot more stuff valuable to most users than those of yesterday. I don't know enough about the modern value of OSes and browsers, except that empirically they do seem to crash a lot less than 20 years ago, but they also spy a lot more on me :)
The priority of an AAA game developer is to provide as much graphic fidelity as possible for a specific compute budget, not to consume the least compute for a specific graphic fidelity. If they "optimize that stuff better", the outcome wouldn't (and shouldn't!) be lower usage of system resources but rather fitting in even more graphic detail while still capping out all resources.
They do obviously have the reasonable right to demand all the system resources that are available, because a game is usually an immersive experience that is the only important thing running on the system at that time, and the only purpose of those greatly increased system resources is to be used for gains in visual quality - there's no reason to not try and use all of that compute power of what would have been considered supercomputers just a few years ago.
> You want to tell me that will choke and croak just doing mundane stuff like playing games or browsing the internet?
The fact that you're comparing browsing the internet with playing AAA games speaks volumes. Browsers are capable of making insane amounts of optimizations because the "geometry" of a website is (mostly) completely static: there's no physics, there are no sounds, there's no AI running client side, there's no game logic, etc. This means they get to cache 90% of the view and only update the changed portions of the screen.
Contrast that with a game, which has the entire view of the 3D world changing every 16ms when the user moves their mouse, has thousands of physical interactions happening (most likely at a higher framerate), is buffering and mixing sounds in a 3D world, is animating and loading large 3D assets in real-time, is creating photo realistic lighting in real-time, is handling all game logic and AI client side, etc. It becomes clear that the two fields, while both difficult in their own ways, don't overlap very much. Of course AAA games take a super computer to run. It's doing all that in 16ms, sometimes 7ms!
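For reference, those per-frame budgets fall directly out of the target frame rate; a trivial sketch (the frame rates below are just common targets, not tied to any particular game):

```python
# Per-frame time budget at a given target frame rate.
def frame_budget_ms(fps):
    return 1000.0 / fps

for fps in (30, 60, 144):
    print(f"{fps} fps -> {frame_budget_ms(fps):.1f} ms per frame for simulation, physics, audio and rendering")
# 30 fps -> 33.3 ms, 60 fps -> 16.7 ms, 144 fps -> 6.9 ms
```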
Plus, if you don't care about all the visual fidelity and stuff, most games allow you to turn a ton of that off. Games have never been mundane: whether we're talking about the original Tetris or the remastered version of The Last of Us, they are pushing the boundaries of the hardware they run on to the limit to achieve incredible immersive experiences.
Not only that! They have also increasingly helped improve state-of-the-art rendering in offline renderers! We're seeing the improvements that games have made to achieve real-time photorealistic rendering slowly make their way to large Hollywood studios. This allows the movies we watch to have higher-fidelity CG, because the artists have quicker iteration times. And it reduces the compute load required for these massive CG scenes, since they are using more optimized rendering techniques. Saving money, and our environment.
Lest we forget, these "mundane" games have led to huge breakthroughs in all sorts of fields because of their willingness to push the boundaries of our machines to see what's truly possible. As opposed to 90% of the software created today which runs orders of magnitude slower than it needs to because people can't or don't know how to write efficient software.
Because this hardware is for a different experience. Zelda is nice, but the style has its limits. It's kinda like asking why Disney invests hundreds of millions of dollars into Marvel and Star Wars movies when you could also make a cheaper but polished animated movie for a fraction of that price. It's simply not the same.
We need this power because companies need to keep selling us new stuff, and also because developers nowadays can't optimize their games too much, since their managers make them do ten times the work, in half the time, and for one third of the pay they had in the 1990s.
Gaming has been a massive driver of hardware for a very long time, in a way that can always be looked at as unneeded. We passed the point of more compute for the sake of compute a while ago. The neat thing is that there are still new things to do with it. Real-time path tracing will be the next thing, as well as moving more compute over to the GPU. We don't need it, but it will open new possibilities. And it seems less wasteful than running ten copies of Chrome to support a few desktop apps.
The new Zelda demonstrates something that’s been true of every console generation. They are a fixed platform and the later games are always considerably better at utilising the hardware.
Yeah, just look at FromSoft's output for the PS4 for example. Bloodborne came out very early, and while it looks great, it doesn't hold a candle to Elden Ring.
Though gameplay-wise I have to say I prefer Bloodborne. Elden Ring has too much stuff in it (crafting, gathering, tons of cookie-cutter dungeons, and too many easy boss fights) for me. Bloodborne is very stripped down and devoid of fluff. I can sort of keep the whole game in my head, and I love that about a game.
I doubt anyone is asking that question, since none of what you said is true. This just seems like the rant of someone wanting to play recent games and not wanting to upgrade their computer. If that's the case, just say that instead of whining about a trend that's been going on for 40 years.
Furthermore, why does a game released in 2023, in development since 2018, run like utter garbage on hardware with said specs, which is orders of magnitude more performant than last year's specs ...