> Fortnite, Risk of Rain 2, Valheim, and Deep Rock Galactic
Those games are low poly but NOT visually simplistic. Memory tends to play tricks and we remember older games looking better than they actually did, so we imagine lower polygons = 2005-era tech. The games you've mentioned run advanced and heavyweight fragment and vertex shaders to create a specific look (the cartoony style in Fortnite, the million effects on screen a dozen levels into a Risk of Rain run, etc.).
They might have the same poly count as Old School RuneScape (but not really, as the models are actually quite complex), yet everything else is 20 years ahead of that game, tech- and complexity-wise.
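To make "heavyweight shaders for a specific look" a bit more concrete: even the simplest ingredient of a cartoony style, cel-shaded lighting, is extra per-pixel work stacked on top of a plain Lambert shade. A rough sketch of just the banding math (Python for readability; real games do this per-pixel in HLSL/GLSL and layer outlines, rim light, and post effects on top):

```python
import numpy as np

def toon_shade(normal, light_dir, bands=3):
    # Standard Lambert term...
    n = normal / np.linalg.norm(normal)
    l = light_dir / np.linalg.norm(light_dir)
    ndotl = max(float(np.dot(n, l)), 0.0)
    # ...quantized into a few flat bands, which is what reads as "cartoony".
    return np.floor(ndotl * bands) / bands

print(toon_shade(np.array([0.0, 1.0, 0.2]), np.array([0.3, 1.0, 0.0])))
```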
Also not sure what your friend's PC is like, because a 5-year-old PC can play all of those games with ease, though perhaps not at max settings. A 1080 Ti from 2017 can run RoR2 at ~190 fps at 1440p. https://youtu.be/TdfE3n8YLYo
The effects punch below their weight in those games, though. I like to call it "Unity Syndrome", but it applies to any widely-adopted engine.
Well-made video games focus on the experience of playing them. Visuals, audio, setting, gameplay, user interfaces: they're all made with the same goal.
In a fast action game, you'll want menus to get out of the way quickly, dialogue that can be delivered while the player is moving, particle effects designed more like fireworks than sparklers, etc.
In a slow-paced story game, you'll have more leeway to let players stop and smell the roses. You'll want to pay attention to different details, make cues last longer, etc.
Open-world games need more attention to dynamic level of detail and story progressions. The list goes on.
When people wrote their own engines, these assumptions were baked in from the start and the engine was developed and tweaked according to the game being made. When you shoehorn your idea into a general-purpose off-the-shelf solution, you end up making more compromises on things like performance and verisimilitude.
You can see it in the default shaders/effects that many modern budget games use, but my favorite example of this is actually The Witcher. The first game in that series used BioWare's Aurora engine, which was designed to simulate d20 games like Dungeons & Dragons.
It's highly relevant. Devs aren't obligated to optimize for your friend's extremely weak computer. They may well be optimizing for a median gaming PC instead.
It's completely irrelevant when the conversation is "is this game poorly optimized?" - nobody ever talked about "obligation", that's a strawman you pulled out of nowhere.
The actual topic of conversation is "what games are poorly optimized". "Poorly optimized" means "making bad use of available resources" - which is irrespective of the amount of resources available.
You can make the argument that the devs are making the business decision of intentionally leaving their games poorly-optimized because they don't think that that'll recoup the cost of optimization (which is likely what's happening) - but that still makes those games poorly optimized, by definition.
Optimization isn't a one size fits all thing. Poorly optimized for very low end hardware, sure, I have no trouble believing that. That's not the same thing as being poorly optimized in general.
And realistically, even much older games were not known for running well on old computers without dedicated GPUs.
That's why I loved that The Witness devs released an update for the game with improved support for my crappy old integrated laptop GPU, even though it didn't meet the game's hardware requirements.
> They may well be optimizing for a median gaming PC instead.
This is how you get games that could run well if the user had more control over model LODs, post-processing effects, and even render scale, but the developers/project management didn't care.
What's worse, a lot of modern games have a great ability to scale back and run on lower-end hardware when necessary, but the companies behind them only care about that ability when it comes to getting them running on the Switch or a similarly constrained hardware environment that would still let them rake in more cash.
And outside of particular hacks (messing with config files or using untrusted utilities), users are often left powerless because a few configuration variables weren't exposed to them for whatever reason.
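For what it's worth, a knob like render scale is usually trivial to expose; here's a rough sketch of the idea (hypothetical names, not any particular engine's API):

```python
from dataclasses import dataclass

@dataclass
class GraphicsSettings:
    render_scale: float = 1.0   # 0.5 = render the 3D scene at half resolution, then upscale
    lod_bias: int = 0           # positive values switch to coarser model LODs sooner
    ssao: bool = True           # performance-heavy post-processing toggle

def internal_resolution(output_w: int, output_h: int, s: GraphicsSettings) -> tuple[int, int]:
    # The scene renders at this smaller resolution and gets upscaled to the output;
    # the UI is typically still drawn at native resolution.
    return max(1, int(output_w * s.render_scale)), max(1, int(output_h * s.render_scale))

settings = GraphicsSettings(render_scale=0.7, ssao=False)
print(internal_resolution(1920, 1080, settings))  # -> (1344, 756)
```

The knob itself is cheap; whether it ever shows up in the options menu is a product decision.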
That's actually worse than Electron apps, which are typically badly optimized by default (platform overhead): it's very much like a developer in an enterprise setting choosing to go for the N+1 query pattern by looping over data in the app and issuing a DB call for each iteration, and everyone actually being okay with it.
Except for the people who actually don't want their software/game to run slowly just because they cannot afford to throw unreasonable amounts of hardware resources at the problem the devs (and whoever is telling them what to do) inflicted upon them.
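For anyone who hasn't run into it, the N+1 pattern mentioned above looks roughly like this (a minimal Python sketch with a hypothetical `db.query(sql, params)` handle, not any real codebase):

```python
def load_order_totals_n_plus_one(db, customer_ids):
    # 1 query for the list + N queries inside the loop: the classic N+1.
    totals = {}
    for cid in customer_ids:
        rows = db.query("SELECT SUM(amount) FROM orders WHERE customer_id = ?", (cid,))
        totals[cid] = rows[0][0]
    return totals

def load_order_totals_batched(db, customer_ids):
    # Same result in a single round trip.
    placeholders = ",".join("?" * len(customer_ids))
    rows = db.query(
        f"SELECT customer_id, SUM(amount) FROM orders "
        f"WHERE customer_id IN ({placeholders}) GROUP BY customer_id",
        tuple(customer_ids),
    )
    return {cid: total for cid, total in rows}
```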
The best counterexample to this is probably e-sports titles, which are optimized for stable frame times because it actually matters to the developers, or games like Skyrim that expose some of the engine internals to users, so modders can choose what matters to them.
That said, many developers don't really consider it worth the effort to put much thought into the options menu, and sometimes don't even gate performance-intensive post-processing like SSAO behind an option that can be toggled on/off.
In other cases, they might not have the necessary skillset to use a profiler properly and recognize what is particularly badly optimized, especially for smaller indie projects.
> Memory tends to play tricks and we remember older games looking better than they actually did
We compared what League of Legends, Valorant, and Inscryption look like now with what those other games look like now. There's no rose-tinted glasses involved - this is an apples-to-apples comparison.
> Those games you've mention run on advanced and heavyweight fragment and vertex shaders to create a specific look
If that same look is being created in a far more performant manner by other games, then that means that the game is poorly optimized.
> cartoony graphics in Fortnite
Same effect class as Valorant and League, with lower visual fidelity, and worse performance.
> there's a million effects on screen a dozen levels in a Risk of Rain game
Risk of Rain is incredibly laggy in the menu, with zero mobs on screen and a single small scene as the background.
> yet everything else is 20 years ahead of that game tech and complexity wise
Again, see Valorant, LoL, and Inscryption.
> Also not sure what your friend's PC is like, because a 5 year old PC can play all of those games with ease, though perhaps not at max settings.
Core i3-5015U with integrated Intel HD Graphics 5500[1] - so, 7 years old. Yet it can still run League and Valorant at >60 fps with default settings, and Inscryption and Dota 2 at 20-30 fps with reduced settings - meanwhile, Fortnite, DRG, RoR2, and Valheim are all slideshows with every setting turned all the way down.
The claim that "these new games are just so much more involved than older games" simply doesn't hold up against the reality that there are recently-released games that simultaneously look better and perform better than these examples.
My lived experience, my understanding of computer graphics, and knowledge of things like the GTA Online incident[2] strongly indicate that this line of reasoning is incorrect.
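(For context, the GTA Online incident[2] was a load-time bug that was accidentally quadratic: every sscanf call over the item catalogue effectively re-ran strlen across the whole remaining buffer, and items were then de-duplicated by linearly scanning everything seen so far. A rough Python reconstruction of the pattern, not the actual code:)

```python
def strlen_like(s: str, start: int) -> int:
    # Stand-in for C strlen: walks to the end of the buffer on every call.
    n = start
    while n < len(s):
        n += 1
    return n - start

def parse_items_quadratic(blob: str) -> list[str]:
    seen = []
    pos = 0
    while pos < len(blob):
        # Quadratic #1: measuring the entire remaining buffer for every token.
        _ = strlen_like(blob, pos)
        end = blob.find(",", pos)
        if end == -1:
            end = len(blob)
        token = blob[pos:end].strip()
        # Quadratic #2: de-dup by scanning a flat list of everything seen so far.
        if token and token not in seen:
            seen.append(token)
        pos = end + 1
    return seen
```

The community fix amounted to "tokenize once and de-dup with a hash map", which reportedly cut load times by around 70% - which is the kind of thing I mean by "poorly optimized".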
That computer doesn't even meet Risk of Rain's min specs, so of course it's gonna run like that. League of Legends and Valorant don't have nearly the same amount of CPU-bound gameplay logic: minimal AI, dedicated servers, and fewer characters on screen all work in LoL and Valorant's favor. RoR2 is P2P, has a lot of AI, tons of bullets, large environments, and a small development team, which gives it different requirements. Riot has a lot of resources to throw at performance that a small developer can't.
>If that same look is being created in a far more performant manner by other games, then that means that the game is poorly optimized.
LMFAO. Games are just visual looks now. Good to know. Would you like a spot on r/gaming? You sound like you'd fit right in there. Let's start with the obvious:
League of Legends, Valorant: 2500 employees at Riot Games
Inscryption: A single developer
Risk of Rain 2: 5 employees
Games don't run well or badly just for the lulz. Focusing on running on your shitty-ass laptop is about the last thing on a developer's mind when trying to both ship and make enough money for the next game. Riot can afford to do everything for it to run better on crap PCs because they have so many players that tapping into the low, low end is worth money to them.
Additionally, there is infinitely more to a game than just "haha cartoon graphics go boom". LoL is extremely simple in terms of mechanics, and so is DotA: a camera pointed down at a very simple map that never changes, where trees that can fall are about the most complex interaction that happens. Compare that with RoR, Valheim, and DRG, which procedurally generate complex environments in full 3D with large view distances. Add to this various features (destructibility, which brings its own set of view-culling issues as well as more complex algorithms and data structures), and yes, your shitbox cannot run them well. Fortnite is "so simple" that it has 100 animated pawns active at the same time at the start of a match, a gigantic map with equally large view distances, structure building, thousands of assets, and a much more complex rendering pipeline (which, no, does not look like League of Legends, despite what you'd like to pretend).
> My lived experience, my understanding of computer graphics, and knowledge of things like the GTA Online incident[2] strongly indicates that this line of reasoning is incorrect.
With none of the due respect: your lived experience takes about a single, barely valid factor into account, your understanding of computer graphics seems to be about that of a first-year college student, and your knowledge of things like the GTA Online incident is a single event. "Looking better" means absolutely shit. You may prefer the visual result, but no matter your metric, I can make you a game that will look better _and_ run like absolute crap. Is it polycount? Sure, I'll make a 50-million-poly character model. Textures? Here come the 16K textures, baby. Lighting? Have I told you about our lord and savior, CPU-run ray tracing?