>>As much as we love exciting new features, we also want to see people create games on the full spectrum of devices for everyone to enjoy.
This is one of the attitudes of the Godot team I appreciate the most. It might be easy for people in more developed nations to upgrade their hardware every few years, but there are people still playing games on computers from 2002 and earlier. I used to know a player who ran an old MMORPG on a computer he aimed a table fan at to keep it cool. The whole casing was open, it was kinda funny to look at, and the hardware had been a birthday gift more than 10 years earlier. He played that old MMORPG because newer games wouldn't even start on that thing. Most people who played that MMO were in the same boat; it was one of the very few games they could run.
The requirements for some of the games coming out these days are sometimes so insane that a lot of people around the world are unable to play them. I always found it funny how so much developer time was spent supporting IE6 because a small percentage of people were unable to upgrade their browsers, yet when it comes to gaming, all bets are off and you're now expected to spend a few grand a year upgrading your computer to play newer games. And don't get me started on the bandwidth costs of some of the new games.
Optimization levels of many newer games are terrible.
Low-poly, visually simplistic games like Fortnite, Risk of Rain 2, Valheim, and Deep Rock Galactic barely run on a friend's computer (that was made only 5 years ago). Visually more complex games like League of Legends run buttery smooth on the exact same hardware.
(ironically, he says that Valorant, made by a non-Epic company, apparently runs significantly better than Fortnite, made by Epic - even though both are using the Epic-made Unreal Engine)
> Fortnite, Risk of Rain 2, Valheim, and Deep Rock Galactic
Those games are low poly but NOT visually simplistic. Memory tends to play tricks and we remember older games looking better than they actually did, so we imagine lower polygons = 2005-tech game. Those games you've mentioned run on advanced and heavyweight fragment and vertex shaders to create a specific look (cartoony graphics in Fortnite, there's a million effects on screen a dozen levels in a Risk of Rain game, etc.).
They might have the same poly count as Old School Runescape (but not really, as the models are actually quite complex), yet everything else is 20 years ahead of that game tech and complexity wise.
Also not sure what your friend's PC is like, because a 5 year old PC can play all of those games with ease, though perhaps not at max settings. A 1080 ti from 2017 can run RoR2 at ~190 fps at 1440p. https://youtu.be/TdfE3n8YLYo
The effects punch below their weight in those games, though. I like to call it "Unity Syndrome", but it applies to any widely-adopted engine.
Well-made video games focus on the experience of playing them. Visuals, audio, setting, gameplay, user interfaces, they're all made with the same goal.
In a fast action game, you'll want menus to get out of the way quickly, dialogue that can be delivered while the player is moving, particle effects designed more like fireworks than sparklers, etc.
In a slow-paced story game, you'll have more leeway to let players stop and smell the roses. You'll want to pay attention to different details, make cues last longer, etc.
Open-world games need more attention to dynamic level of detail and story progressions. The list goes on.
When people wrote their own engines, these assumptions were baked in from the start and the engine was developed and tweaked according to the game being made. When you shoehorn your idea into a general-purpose off-the-shelf solution, you end up making more compromises on things like performance and verisimilitude.
You can see it in the default shaders/effects that many modern budget games use, but my favorite example of this is actually The Witcher. The first game in that series used Bioware's Aurora engine, which was designed to simulate d20 games like Dungeons & Dragons.
It's highly relevant. Devs aren't obligated to optimize for your friend's extremely weak computer. They may well be optimizing for a median gaming PC instead.
It's completely irrelevant when the conversation is "is this game poorly optimized?" - nobody ever talked about "obligation", that's a strawman you pulled out of nowhere.
The actual topic of conversation is "what games are poorly optimized". "Poorly optimized" means "making bad use of available resources" - which is irrespective of the amount of resources available.
You can make the argument that the devs are making the business decision of intentionally leaving their games poorly-optimized because they don't think that that'll recoup the cost of optimization (which is likely what's happening) - but that still makes those games poorly optimized, by definition.
Optimization isn't a one size fits all thing. Poorly optimized for very low end hardware, sure, I have no trouble believing that. That's not the same thing as being poorly optimized in general.
And realistically, even much older games were not known for running well without dedicated GPUs on old computers.
That's why I loved The Witness devs, who released an update for the game with improved support for my crappy old integrated laptop GPU, even though it didn't meet the hardware requirements of the game.
> They may well be optimizing for a median gaming PC instead.
This is how you get games that could run well if the user had more control over model LODs, post-processing effects, and even render scale, but the developers/project management didn't care.
What's worse, a lot of modern games are perfectly capable of scaling back to run on lower-end hardware when necessary, but the companies behind them only care about that ability when it comes to getting them running on the Switch or similarly constrained hardware, since that still lets them rake in more cash.
And outside of particular hacks (messing with config files, or using untrusted utilities), the users are often left powerless because a few configuration variables weren't exposed to them for whatever reason.
That's actually worse than Electron apps, which are typically badly optimized by default (platform overhead): it's very much like a developer in an enterprise setting choosing the N+1 pattern, looping over data in the app and doing a DB call for each iteration, and everyone being okay with it.
Except for the people who actually don't want their software/game to run slow, just because they cannot afford to throw unreasonable amounts of hardware resources at the problem the devs (and whoever is telling them what to do) inflicted upon them.
Best counterexample to this is probably e-sports titles that are optimized for stable frame times because it actually matters to the developers, or games like Skyrim that expose some of the engine internals to the users, so modders can choose what matters to them.
That said, many developers don't really consider it worth the effort to put much thought into the options menu, and sometimes don't even gate performance-intensive post-processing like SSAO behind options that can be toggled on/off.
In other cases, they might not have the necessary skillset to use a profiler properly and recognize what is particularly badly optimized, especially for smaller indie projects.
> Memory tends to play tricks and we remember older games looking better than they actually did
We compared what League of Legends, Valorant, and Inscryption look like now with what those other games look like now. There's no rose-tinted glasses involved - this is an apples-to-apples comparison.
> Those games you've mentioned run on advanced and heavyweight fragment and vertex shaders to create a specific look
If that same look is being created in a far more performant manner by other games, then that means that the game is poorly optimized.
> cartoony graphics in Fortnite
Same effect class as Valorant and League, with lower visual fidelity, and worse performance.
> there's a million effects on screen a dozen levels in a Risk of Rain game
Risk of Rain is incredibly laggy in the menu, with zero mobs on screen and a single small scene as the background.
> yet everything else is 20 years ahead of that game tech and complexity wise
Again, see Valorant, LoL, and Inscryption.
> Also not sure what your friend's PC is like, because a 5 year old PC can play all of those games with ease, though perhaps not at max settings.
Core i3-5015U with Intel integrated HD 5500[1] - so, 7 years old. Yet it can still run League and Valorant at >60 fps with default settings, and Inscryption and Dota 2 at 20-30 fps with reduced settings - meanwhile, Fortnite, DRG, RoR2, and Valheim are all slideshows with every setting turned all the way down.
The claim that "these new games are just so much more involved than older games" simply doesn't hold up against the reality that there are recently-released games that look better and perform better simultaneously than these examples.
My lived experience, my understanding of computer graphics, and my knowledge of things like the GTA Online incident[2] strongly indicate that this line of reasoning is incorrect.
That computer doesn't even meet Risk of Rain's minimum specs, so of course it's gonna run like that. League of Legends and Valorant don't have nearly the same amount of CPU-bound gameplay logic: minimal AI, dedicated servers, and fewer characters on screen are all advantages for LoL and Valorant. RoR2 is P2P, has a lot of AI, tons of bullets, large environments, and a small development team, which gives it different requirements. Riot has a lot of resources to focus on performance that a small developer can't.
>If that same look is being created in a far more performant manner by other games, then that means that the game is poorly optimized.
LMFAO. Games are just visual looks now. Good to know. Would you like a spot on r/gaming? You sound like you'd fit right in there. Let's start with the obvious:
League of Legends, Valorant: 2500 employees at Riot Games
Inscryption: A single developer
Risk of Rain 2: 5 employees
Games don't run well or badly just for the lulz. Focusing on running on your shitty-ass laptop is about the last thing on a developer's mind when they're trying to both ship and make enough money for the next game. Riot can afford to do everything it takes to run better on crap PCs because they have so many players that tapping into the low, low end is worth money to them.
Additionally, there is infinitely more to a game than just "haha cartoon graphics go boom". LoL is extremely simple in terms of mechanics, and so is DotA: a camera pointed down at a very simple map that never changes, where trees that can fall are about the most complex interaction. Compare that with RoR, Valheim, and DRG, which procedurally generate complex environments in full 3D with large view distances. Add various features on top (destructibility, which brings its own set of view-culling issues as well as more complex algorithms and data structures), and yes, your shitbox cannot run them well. Fortnite is "so simple" that it only has 100 animated pawns at once at the start of a match, a gigantic map with equally large view distances, structure building, thousands of assets, and a much more complex rendering pipeline (which, no, does not look like League of Legends, despite all you'd like to pretend).
> My lived experience, my understanding of computer graphics, and my knowledge of things like the GTA Online incident[2] strongly indicate that this line of reasoning is incorrect.
With none of the due respect: your lived experience takes about a single, barely valid factor into account, your understanding of computer graphics seems to be about that of a first-year college student, and your knowledge of things like the GTA Online incident is a single event. "Looking better" means absolutely shit. You may prefer the visual result, but no matter your metric, I can make you a game that will look better _and_ run like absolute crap. Is it polycount? Sure, I'll make a 50-million-poly character model. Textures? Here come the 16K textures, baby. Lighting? Have I told you about our lord and savior CPU-run ray tracing?
Fortnite is not “low poly” other than the fact that it uses polygons like nearly every other 3D game. The character models, level geometry, vfx, etc are all incredibly polished. League of Legends is fine, but it’s so much simpler it’s bizarre you would think it’s the more visually complex game.
On top of that, the fact Fortnite has a destructible world with 100 players means it’s also one of the most computationally demanding games… and yet it can run on four year old mobile phones. It’s a technical masterpiece, and anyone saying otherwise doesn’t know what they’re talking about.
Not gonna excuse Fortnite since I've personally had terrible experiences with it on top-end hardware, but from what little of it I was able to play it has much larger environments & draw distances to contend with. The tight lines of sight Valorant & other tactical shooters have can be an advantage when it comes to rendering optimizations.
The most common symptom I encounter on indie titles in particular is loading stutter. You can tell when something big is about to happen or you'll see a new enemy type because the game just locks up for a short but perceivable amount of time.
The 2nd Ori game was actually entirely unplayable on my system until I moved it to an SSD due to how it loads assets (don't know the details, but that's my assumption). I couldn't beat the time trials since it'd lock up mid-air due to the amount of map coverage and mess up my flow.
You might need to understand what system requirements are. "Recommended" or "minimum" hardware is what the developers have tested on and what they provide support for. Some games are simply never designed for whole hardware classes, like laptops. If you complain that a game does not run on hardware that doesn't meet the "minimum" requirements, you're complaining that 1 > 0 is true. The only reasonable response from the devs is to refund the money spent on the game.
I think prebaked lighting plays a huge role here - the maps in Valorant are statically lit, small, confined spaces, and probably all the lighting and shadows are prebaked, save for the characters. Late-2000s hardware could already make games look amazing with this level of power, combined with good art direction.
All the games you mentioned in your post have dynamic levels, and must calculate all their lighting on the fly.
Valorant has a solid engineering team and invests heavily in optimization; I can't speak for the others. I could be misremembering, but there was an interesting bug featured in an Unreal profiling case study where a Fortnite font was eating ~2ms of (or a similarly obscene amount of) CPU budget.
DRG should run fine on a gaming PC from five years ago I think (a mid-range GPU from then would be like a GTX 1060).
Actually, I think basically all of these games should run fine on that sort of hardware, as long as the quality isn't turned up high. Did they just build a really cheap machine or something?
Yes, that's a low-end processor. No, it doesn't matter, because the fact is that there are several other games that look better and run much better on the same hardware.
Optimization certainly varies across games, but when you're working on an iGPU, you're probably playing on a particularly low level of quality that yeah, developers don't spend a lot of time optimizing for, because relatively few of their intended customers are running on a crappy old integrated GPU. That doesn't necessarily mean that the game is "unoptimized" as a whole though, it could be just that it's unoptimized for very low end systems, because they focused more on better performance at higher quality settings.
In Deep Rock Galactic's case, while it's relatively low poly, it's still quite a pretty game, it's really not that barebones, so I wouldn't expect it to run well on such a weak CPU and GPU, and to expect otherwise is silly.
If you had read it, you probably would have actually responded to the points I made. You did not respond to the points that I made. Therefore, it's more likely that you simply did not read it.
> your complaint is still just silly
The classic "dismiss the argument without actually making a logical point".
> That doesn't necessarily mean that the game is "unoptimized" as a whole though
I never said "unoptimized" - don't put words in my mouth. I said "poorly optimized" - and if a game makes poor use of its resources on low-end hardware when there are other games that make better use (under similar economic constraints), and there are a significant number of actual players on lower-end hardware, then yes, it's poorly-optimized.
> it's still quite a pretty game, it's really not that barebones, so I wouldn't expect it to run well on such a weak CPU and GPU, and to expect otherwise is silly
I've already responded to this argument with the case example of Valorant. Not only is the expectation not "silly", but there's, again, an existing counter-example. Expecting programs to run on hardware that is capable of running them is not unreasonable at all.
The people who think that it's ok to waste the vast amount of a computer's potential are part of the problem of ever-more-bloated software that's wasting power and contributing to carbon emissions through poor use of available resources.
I won't dispute that some companies are known for making particularly optimized games that function well even on very low end hardware, including Riot and Valve. That doesn't mean everything else counts as poorly optimized, though, and certainly it's made easier for a company like Riot when they're huge and the game is intended to live as a GaaS for a very long time.
Your reasoning amounts to, "a few companies manage to do X extremely well, everyone else counts as a poor effort". Do I really need to explain why this doesn't make sense?
Not to mention, at which point in PC gaming history were you able to play most new games at a decent framerate on a 7-year old PC with no dedicated GPU and a CPU that was weak even for its time?
Even seemingly simple games are clocking in at 100GB+ of disk space. I think in terms of performance, many games are setting the floor at the Nintendo Switch or the Steam Deck, often ripping out features to get them to run on those platforms (Civ 6, the 2K games, etc.).
This is one reason I really admired Valheim (~1GB). Though even that game had CPU issues (also, it's an indie title, so...).
I'm aware of large AAA games taking up that much space (Doom Eternal, Red Dead Redemption 2) but they could hardly be considered "simple". Do you have examples of what you're talking about?
NBA2k is over 100GB. Sports games seem to do this a lot. It’s obviously due to the art assets that AAA roll out but there’s gotta be some way to trim the fat or at least do it incrementally.
> a computer he aimed a table fan at to keep it cool. The whole casing was open
That brings me back, I had to do the same to play Starcraft in the early 00s in summer since our house had no air conditioning and CPUs were passively cooled. I was drenched in sweat but I wasn't going to give up my games, dammit! The interesting thing is the symptom was just a crash to desktop, not a full system shutdown like now (for safety reasons I assume).
I played with Godot back in the early 3.x timeframe (3.1? 3.2?) and then fell away and spent some time with Unity. But between some of Unity's missteps and Godot getting dotnet 6 support it is time to give it another go, plus a bunch of nice changes to gdscript that might make it something I'm okay with using (functions are first class citizens now unlike before where you just passed the name of the function as a string which always felt incredibly gross).
Though based on the Discord conversation there are still a lot of bumps even in the beta, so I would not expect smooth sailing yet, but let's see if it is any good.
For anyone interested, the main documentation on their site is still pointing at 3.x. If you want the docs for 4.0 start here
Yeah it is part of why I'm going to start with GDScript instead of jumping straight to dotnet 6 (which is what initially got me thinking about trying the new version).
I may still jump to c# but I wanna give GDScript a fair shake first. Plus the docs/tutorials tend to be more plentiful for GDScript.
>Plus the docs/tutorials tend to be more plentiful for GDScript.
I quite like reading tutorials in GDScript and trying to convert them to C#, I feel like it strikes a good mix between being told what to do while also getting to go off and explore the engine myself and reinforcing that same learning.
This is also true, and I'd probably do some of that after learning the transliteration (at least some of the existing documentation like the intro to 2d tutorial give examples in both).
Well, GDScript and its node tree are great for early-days prototyping, even if you decide to redo it later in another fashion. So getting that under your belt is definitely not a bad idea.
HN hug of death? godotengine.org is offline for me at the moment. Regardless, really looking forward to trying it out! Here it is on the wayback machine:
Their web hosting has been pretty awful the last month or two. Multiple days of downtime for critical pieces. I know they get it free from tuxfamily but this certainly feels like they're getting what they pay for.
Fantastic news. I'm really looking forward to Godot 4.0 exposing more of the ENet wrapper into GDScript etc. I've been writing a game server in Erlang and very much looking forward to offering ENet as an option in addition to WebSockets!
Whoa, that's interesting -- does ENet have a good integration with Erlang, or something like that? I'm interested in both Elixir/Erlang and Godot, but haven't seen an opportunity to use them together previously. I'd love to hear more details about this project of yours, and how the Godot 4.0 release impacts it.
Someone made an effort to do a full protocol port to Erlang and I have a friendly fork I’m maintaining that makes it work better with the C version as used by Godot. Seems to work with unreliable/reliable/unsequenced packets but never tested in anger, much less production!
I don’t want to derail the thread but I have a link to my GitHub in my profile - I’m working on a few things:
1. the aforementioned Erlang ENet fork
2. My generic Erlang game server called Overworld
3. A client plugin for Godot that generates a lot of the boilerplate code needed to encode/decode messages between Godot and Overworld (protobuf used for serialization)
4. An implementation of a subset of GDScript called Gdminus written in Erlang, including a simple interpreter. The idea being to have a familiar language for Godot programmers to be able to script Overworld someday- or more generally as a toy white space significant language that runs in the BEAM.
Honestly they’re all toys right now but happy to chat more via email!
Early pandemic I was playing around with making a network shooter in godot but I got hung up on a good way to make a shared lib for the client/server "projects". Anyone solve that or have tips? It seemed like I either needed to make a mono repo and toggle the build for client=true but I really wanted to make 3 repos, client, server, and game-core-lib. Would love tips/guides if you have them!
In my experience, you’re better off having a single codebase with client/server/shared code and using either the equivalent of ifdefs or conditional blocks gating execution for these states.
Unreal uses a different architecture where the codebase allows you to define who owns what, but it’s ultimately the exact same thing, and more confusingly done. Further, they have some actor states which are never actually used.
Unsurprisingly, Unreal’s networking model is unpopular. They also struggle with an architecture that has been unreliable by design since the late 90s, but now I’m digressing.
One codebase, split execution. If you use this traditional model, you can also avoid shipping server binaries to users if you so desire.
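For what it's worth, here's a minimal C sketch of that gating idea (DEDICATED_SERVER and the stub functions are purely illustrative, not any engine's real API):

    /* "One codebase, split execution": build the server with -DDEDICATED_SERVER. */
    #include <stdio.h>

    static void simulate_world(double dt) { printf("simulate %.3f\n", dt); } /* shared */

    #ifdef DEDICATED_SERVER
    static void replicate_state(void)   { printf("replicate\n"); }       /* server-only */
    #else
    static void render_frame(double dt) { printf("render %.3f\n", dt); } /* client-only:
                                            not even compiled into the server binary */
    #endif

    static void game_tick(double dt)
    {
        simulate_world(dt);      /* identical shared path in both builds */
    #ifdef DEDICATED_SERVER
        replicate_state();       /* the server binary ships without renderer code */
    #else
        render_frame(dt);
    #endif
    }

    int main(void)
    {
        game_tick(1.0 / 60.0);
        return 0;
    }

The shared simulation code stays identical at the source level, and the server build simply never contains the client-only symbols, which is how you avoid shipping them.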
For reference, I am the author of Planimeter Game Engine 2D.
I also was doing the same but ultimately put it down to wait for Godot 4 (and its niceties such as occlusion culling - I had performance issues with quake 1 maps loaded up through godot). Do you have a repo?
I had networking etc running fine with client/server architecture but I would like to see how others have achieved this.
Mine was a bashed-together repo from some example tutorial. I was trying to build a paintball game, so most of this was actually interest in exploring solid-body projectiles vs. hitscan. I was able to get 10 players with 10 balls per second running reasonably smoothly without any crazy optimization, so that was cool. When I hit the wall of the shared lib, that's when I gave up further exploration.
Yeah you can easily compartmentalize client/server with ifdefs but it's pretty hideous. Do serious shops on MP titles stick with UnrealNet? I find it performs poorly even with some of the bandaids like Replication Graph.
Unreal's default multiplayer architecture is unreliable by design and Tim Sweeney's whitepaper on its design from 1999, IIRC, talks about it in plain language. It's dramatically different than the Quake family of engines which reliably replicates entity states. The fact that he does not think of clients as being capable of having accurate replicable state is jaw dropping and appalling to read. It's, well, like reading a dissertation of provably wrong statements.
Serious shops hack around the fact that Unreal uses a variable timestep. Because it uses a variable timestep and uses an RPC design that does not align state changes with frametimes, you, by definition, cannot make say, a reliable first-person shooter. Every game that uses Unreal must use cone projection and proximity hacks on top of the RPC system to attempt to make a reliable first person shooter, because you cannot guarantee the state of actor positions and inputs coming from the same frame of execution.
Beautiful game engine though. Absolutely unreliable, though.
If you ever wondered why FPS games on modern versions of the Unreal Engine had multiplayer that felt so bad, it's because the engine isn't designed from the ground up to be accurately replicated to clients.
The design issue is systemic, so it may never be fixed, and Unreal engineers have decided it's too difficult or laborious to tackle.
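To make the variable-versus-fixed timestep point concrete, here's a rough sketch of the classic fixed-timestep accumulator loop (the now_seconds/tick/render names are placeholders, not any engine's API): inputs and state changes always land on whole simulation ticks, so replication can refer to a tick number instead of an arbitrary frame time.

    #define _POSIX_C_SOURCE 199309L
    #include <stdio.h>
    #include <time.h>

    static double now_seconds(void)
    {
        struct timespec ts;
        clock_gettime(CLOCK_MONOTONIC, &ts);
        return ts.tv_sec + ts.tv_nsec / 1e9;
    }

    static void tick(int n)          { printf("sim tick %d\n", n); }     /* deterministic step */
    static void render(double alpha) { (void)alpha; /* interpolate between ticks here */ }

    int main(void)
    {
        const double dt = 1.0 / 60.0;      /* simulation always advances in 1/60 s steps */
        double accumulator = 0.0, previous = now_seconds();
        int n = 0;

        while (n < 180) {                  /* run a few seconds, just for the example */
            double current = now_seconds();
            accumulator += current - previous;
            previous = current;

            while (accumulator >= dt) {    /* consume elapsed time in fixed steps */
                tick(n++);
                accumulator -= dt;
            }
            render(accumulator / dt);      /* rendering interpolates, never mutates state */
        }
        return 0;
    }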
Yeah, everything about early id Tech is just, without exaggeration, decades ahead of Unreal. Of course, sex sells, so since Unreal renders fairly well, for this reason alone I think it attracts amateurs.
I think Godot wants to do the same thing, and I'm sure it'll work, but the authors are just like the rest of the open source community in that they don't actually work on features meaningful to most games.
Back to Unreal networking: what really sucks is that you can't easily just avoid using it by using your own networking subsystem. Because features like default actor movement are tightly bound to the RPC system, and concepts like network visibility are directly integrated with actors, you may unintentionally end up screwing up what needs to be networked to clients unless you very closely match what the existing networking code does.
It's sophisticated enough that I suspect there isn't a single company or individual who has done this, because that too would be far more effort than it might be worth.
It's so gross to me that you immediately cannot use things like the replication graph if you decide you want your game to support split screen. IIRC, that was a technical limitation for some reason. I could be misremembering. It's just disgusting.
Interesting. I know that Frost Giant (heir apparent to the Starcraft/Warcraft part of Blizzard) has said that they're building their own gameplay logic engine and networking stack on top of UE5 for their RTS; I wonder if what you're saying is part of why.
UE is the last engine I'd use for RTS, though people have done wonders in spite of it -- check out the tech behind HŌRU game for some impressive examples.
I think writing their own physics engine might be the wrong way to go here. I understand that Bullet leaves a lot to be desired, but my instinct is that the complexity from their own engine will leave a lot of edge-case bugs that need to be ironed out over time, and that games using their own physics engine will suffer from a lot more quirks in the meantime.
A few years ago, as someone who has worked on physics engines for robotics simulations, I would have agreed with you. But now a lot of people are doing greenfield physics engines for games and having it work out pretty well. There's a ton of established academic and conference literature in the area now and it's not nearly as scary as it used to be.
For example, Horizon: Forbidden West uses a custom physics engine that started out as one of the core dev's fun side projects: https://github.com/jrouwe/JoltPhysics
Physics engines (at least game quality physics engines) are starting to drift in to "solved problem" territory and there's enough literature now that you can get something reasonable going yourself after doing some weekend reading.
Edit to add: Godot has had its own engine available for a long time, so it's not a totally new effort. It's a heavy refactor and a large improvement but the bones for this were laid years ago so some of that technical debt you're describing has already been paid down.
I've briefly looked into physics engines and they seem to require a lot of trade-offs. If a rock meets a hard place, what should happen? Gamers have seen the hilarity that can ensue. Designing your own engine allows you to make those trade-offs with the end goal in mind. You can do things like set global force or speed limits, because you know your game's design and the appropriate limits.
Lots of simulators for robotics use ODE or Bullet, actually, and they're quite good for lots of things that you might want to simulate. But there are some people who would argue that they aren't the best for it, especially for simulating legged locomotion, because of the way physical constraints such as joint limits are enforced (via Lagrange multipliers). A popular alternative formulation is Featherstone's Method[1], which is becoming more widely available and goes by other names sometimes; for example, Rust's `rapier` physics engine refers to this type of dynamics model as "reduced coordinates" (they don't yet support it, but it's on their roadmap). Featherstone's method is becoming more popular but it's still not widely used in a lot of physics engines because most physics engines are targeting games and Featherstone's doesn't always have the best performance characteristics, instead favoring physical fidelity. Bullet can run using Featherstone's Method, by the way. One of the few FOSS engines I know of that supports it out of the box.
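Roughly, the two formulations being contrasted look like this (standard textbook forms, not any particular engine's equations). With maximal coordinates, every body keeps its full 6-DOF state x and joints become constraints enforced through Lagrange multipliers lambda:

    M \ddot{x} = f_{ext} + J^{\top} \lambda,   subject to   J \dot{x} = 0

With reduced (generalized) coordinates a la Featherstone, the state is the joint variables q themselves, so the joint constraints hold by construction:

    H(q) \ddot{q} + C(q, \dot{q}) \dot{q} + g(q) = \tau

The second form can't drift apart at the joints, which is part of why roboticists like it for legged systems, but its data structures and per-step costs differ from the box-and-constraint pipelines most game physics engines are tuned for.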
Additionally, there are people in robotics who like to say that simulators are like lightsabers, you haven't become a true Jedi until you've built your own, so there's a lot of home grown physics simulators in robotics.
One of the benefits of 4.0 is also the modularity of it with GDExtension. The major parts of the engine (including the physics) can be swapped with replacements without the need to recompile the entire engine. I'd usually say that is a long shot for community run projects, but even Bevy engine has community made extensions for separate physics engines.
Forgetting about extensions, though, I see your point and almost agree, but Godot has shown that they will put in the work to improve their project, even if that means removing features like they did with visual scripting. Their physics engine will definitely be rough at first, but based on their past work, I believe they are willing and able to maintain it.
I reckon they're thinking long term here though, if they don't change it now then likely the only time they could change it is when a theoretical Godot 5.0 comes out in who knows how many years.
It sounds like it's got bugs that need fixing either way, so at least for their own engine they're not reliant on anyone else to get those fixes out (or have to use hacky workarounds as would most likely be the actual fix).
From everything I heard the existing physics engine was already buggy/hard to work with, which is harder to fix because either they had to fork it and then maintain compat themselves or ensure all their changes got accepted upstream. At some point it can easily make sense to just go your own way, and I can't fault them for deciding to do so.
For the elf/linux target, I hope the build containers have a really, really old glibc (dunno which version it is) and the "-static-libgcc" and "-static-libstdc++" are defaults.
I would like that too. But games need to load system libs (which are linked against a glibc), and games are not fully libdl-ized (dlopen/dlsym/dlclose). Additionally, gcc's static libstdc++ does not have a "libdl" mode, or even worse: it seems to link against internal glibc symbols. The glibc "should" be libdl-clean, minus a few C runtime services, I guess.
Basically, game binaries cannot be "pure" and "simple" ELF64 - in other words, they cannot load with any libc/ELF runtime (musl/glibc/bionic?/etc.) unless the right symbol/version is there, because of the static libstdc++ (and some libgcc services too).
They would have to fork libstdc++ and the third-party libs and fully libdl-ize them (in theory, it is easy but brutal work).
The C++ ABI is a nightmare, glibc symbol versioning has run amok, and the devs of those components seem to carefully ignore the fact that games have been available on elf/linux for 10 years.
The part of a game which is "code" is not "large": it is "small" - unless a game has really little data, in which case it is a small game, and if it is a small game...
Then, on elf/linux systems, you have "alternatives" and serious ABI instability, due mostly, but not only, to GNU symbol versioning (it is pathological in glibc, or it could be a scam). Game binaries cannot expect _their_ alternatives to be there; that's why they have to rely only on the very core video game libs and the SysV ABI (ELF). The only way to properly "deal" with ABI stuff (modules, symbols, and their versions) on elf/linux is to dynamically load as much as possible from the system: libdl with only the 3 symbols dlopen/dlsym/dlclose.
Of course, a game should statically link as much as it can (for instance libm), but it still needs to dynamically load the video game core libs. Those core libs may be linked against musl (hopefully they have their libdl now), glibc, bionic (if it has an ELF loader), etc. If you want fully static binaries, you will need a full ELF loader inside your static binaries. You cannot create such static binaries with the glibc - it is not supported by the set of glibc libs as far as I know (I wonder if that isn't conveniently left undone on purpose to "break" the usage of alternative ELF/C runtimes) - and if I recall properly you can with musl.
The video game core libs on an elf/linux system are:
- wayland: the window system code is static in the game binaries, but for key symbol resolution the game must dynamically load the libxkbcommon library for the xkb client state machine with the user configuration (you may not have the "right" xkb data files for a user). Basically, you feed wayland keycodes to the xkb state machine and, based on user-specific configuration files, you get out the right key symbol (some games have a mode to bypass key symbol resolution and work directly with keycodes).
- x11: the usage now is to dynamically load the client xcb libs and libxkbcommon-x11 (do not use libX11).
- android has wayland, but I don't think it uses xkb (no libxkbcommon(-x11) lib).
- GPU: vulkan by design requires dynamic loading, and a GL fallback requires dynamically loading the targeted GL libs (some games could do 100% CPU rendering into window system surfaces using a presentation interface).
- sound: you only need to dynamically load the alsa-lib (libasound), as software mixers (dmix/pulseaudio/pipewire/jack/etc.) are hidden behind the alsa-lib. (I don't know about android, but alsa has a mobile-phone-specific configuration interface; I don't think it is stable yet though.)
- joypads: this is linux-specific, using the interface of the /dev/input/eventN files.
In theory, games should be sets of pure and simple ELF64 binaries (with the least amount of relocation types), libdl-ing everything from the system. It means no C runtime main() function, but the SysV ABI entry point, which is basically a main() anyway. To deal with cross-platform support (elf/linux|doz|fruitOS|etc.) and platform-specific fallbacks (on the elf/linux platform: wayland->x11, vulkan->gl->cpu, etc.), proper compile-time and runtime tables of functions will usually do the trick.
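As a concrete illustration of that "only dlopen/dlsym/dlclose" pattern, here's a minimal sketch that loads alsa-lib at runtime instead of linking against it (add -ldl on older glibc; error handling kept to a bare minimum):

    #include <dlfcn.h>
    #include <stdio.h>

    int main(void)
    {
        /* Resolve the system's alsa-lib at runtime: no link-time dependency. */
        void *alsa = dlopen("libasound.so.2", RTLD_NOW | RTLD_LOCAL);
        if (!alsa) {
            fprintf(stderr, "no alsa-lib here, fall back to another audio backend\n");
            return 1;
        }

        /* snd_asoundlib_version() is a real alsa-lib symbol returning its version
           string; it is resolved by name here instead of being linked against. */
        const char *(*alsa_version)(void) =
            (const char *(*)(void))dlsym(alsa, "snd_asoundlib_version");
        if (alsa_version)
            printf("loaded alsa-lib %s at runtime\n", alsa_version());

        dlclose(alsa);
        return 0;
    }

The same pattern applies to libxkbcommon, the xcb libs, and the GL libs mentioned above: the binary only ever hard-depends on the three libdl symbols.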
Reality check: libstdc++, some libgcc services, and many third-party libs are not libdl-ized. The mitigation is to use the "-static-libgcc" and "-static-libstdc++" compiler/linker options and link against an extremely, EXTREMELY old glibc (which is what godot does). It seems libstdc++ will link against glibc internal symbols if linking happens against a glibc on the build system.
NOTES: for i18n games with "full" text input, an input method should be provided by the game itself, as you cannot expect a specific input method (if any) to be installed on the user's system. A GUI toolkit is "above" the window system and subject to user alternatives (some may prefer Qt, others GTK or Enlightenment, or neither of them), so for the same reason as before, a game must package its own GUI toolkit.
Ofc, if you code in plain and simple C, a lot of those issues can be worked around much more easily.
I think the main difference is that GDNative only had access to Godot's scripting API, so it could only manipulate the same stuff that GDScript or C# already could. So in 3.x, if you wanted to manipulate engine internals, you had to write "engine extensions", which had to be compiled with the engine itself. But with GDExtension I think you're supposed to be able to do _everything_ that "engine extensions" could do, but without having to re-compile Godot itself.
I've been building my game on the Godot 4 alphas and the improvements in networking and rendering have more than made up for any instability or keeping up with changes. That said, more stability will be welcome and a focus on bug fixing instead of feature proposals will be key to a strong 4.0 release.
haven't used Godot in a couple years but it's good to see the progress on their tile system for 2D games! it was terrible last I checked, yet pretty crucial for making many kinds of simple 2D games
It would be nice if we brought the rule back where it's against HN guidelines to post submissions for every new version of some software. Every Godot thread ends up with the same comments posted, none of them particularly interesting or insightful. And in this particular case, it's not even an official release; it's just a beta!
I disagree. I want to see more actual software on Hacker News. This is a space for "hackers," is it not?
While Godot is fairly established, there are up-and-comers in software development, and one of the few ways I will ever know about these people and their projects is through Show HNs and product update submissions.
It may be dozens of releases or years before I even hear about a project, and this type of comment clearly comes from a place of not knowing at all what it is like to work so hard on something, nor knowing at all how to promote a product.
People have an aversion to promoting and advertising, but I want to see "WAYWO?"!
I'd far rather see that than political nonsense, bullshit tech opinion articles, and news completely unrelated to the hacker or business space.
Edit: Further, with respect to Godot, the authors continually make more progress on the codebase and there is a lot to talk about. Not just specifically with Godot and their prioritization of software features, but how the developers and contributors are having an impact on the hobbyist and independent developer scene.
I have gripes with the space as it currently is, and I know I'm not the only one. I want to read those opinions here. If you don't like it, don't upvote it.
For example, why have they in the past prioritized their own programming language? Decades old game engine codebases have rich features like material sounds, and fully integrated multiplayer features, but almost no open source game engines feature these things. Instead, they all focus on shallow flashy features like PBR workflows. I want to talk about those things.
Another comment here mentions a custom physics engine that is being introduced. That's interesting! And further discussion is warranted over whether or not that is something that developers care about! What about other features like native split screen support? There's so much in this space to discuss.
Godot has a large number of integrated multiplayer features from low level primitives to higher level interfaces. It also has support for p2p with webrtc.
* I can't find the proposal, but we, V-Sekai, call those physical materials or sound materials. Like a piece of concrete has different friction and a different sound. The worst part is it requires the Godot Engine 4 artists to conform to like 300 materials.
This isn't merely some milestone. This is a 2-3 year major rewrite of the engine with a rewritten render pipeline/engine, a heavily updated scripting engine+language, and a lot more. People (I admit, myself among them) have been chomping at the bit to see Beta 1 where the API/etc stability is guaranteed barring major bugs forcing them to revert things.
Godot 4 entering beta is quite an important thing, since it's an update that has been in the works for a couple of years now and adds lots of new capabilities to Godot.
This comment makes no sense. If it was some random alpha release of Godot (there have been more than a dozen) then sure, but this is the first beta release, which means it's now somewhat stable. That's a big deal, considering how much rewriting has been going on.
I actually agree with the idea of restraint, but this is a colossal release with literally dozens of major highly anticipated features and a huge load of bug fixes and performance optimizations. Given the development time frame and resources this is a spectacular delivery. Also highly recommend checking out the code base directly as the team have done a great job of keeping the sources well organized and clear.