I've been writing a metaverse client in Rust for almost five years now, which is too long.[1]
Someone else set out to do something similar in C#/Unity and had something going in less than two years.
This is discouraging.
Ecosystem problems:
The Rust 3D game dev user base is tiny.
Nobody ever wrote an AAA title in Rust. Nobody has really pushed the performance issues.
I find myself having to break too much new ground, trying to get things to work that others doing first-person shooters should have solved years ago.
The lower levels are buggy and have a lot of churn
The stack I use is Rend3/Egui/Winit/Wgpu/Vulkan. Except for Vulkan, they've all had hard to find bugs.
There just aren't enough users to wring out the bugs.
Also, too many different crates want to own the event loop.
These crates also get "refactored" every few months, with breaking API changes, which breaks the stack for months at a time until everyone gets back in sync.
Language problems:
Back-references are difficult
"A owns B, and B can find A" is a frequently needed pattern, and one that's hard to express in Rust. It can be done with Rc and Arc, but it's a bit unwieldy to set up and adds run-time overhead.
There are three common workarounds:
- Architect the data structures so that you don't need back-references. This is a clean solution but is hard. Sometimes it won't work at all.
- Put everything in a Vec and use indices as references. This has most of the problems of raw pointers, except that you can't get memory corruption outside the Vec. You lose most of Rust's safety. When I've had to chase down difficult bugs in crates written by others, three times it's been due to errors in this workaround.
- Use "unsafe". Usually bad. On the two occasions I've had to use a debugger on Rust code, it's been because someone used "unsafe" and botched it.
Rust needs a coherent way to do single ownership with back references. I've made some proposals on this, but they require much more checking machinery at compile time and better design. Basic concept: works like "Rc::Weak" and "upgrade", with compile-time checking for overlapping upgrade scopes to ensure no "upgrade" ever fails.
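The Rc::Weak workaround mentioned above can be sketched as follows (a minimal illustration with made-up names, not code from the post; the runtime-failing `upgrade` at the end is exactly the case the proposed compile-time checking would rule out):

```rust
use std::cell::RefCell;
use std::rc::{Rc, Weak};

// Illustrative names: Parent owns its Children strongly; each Child
// holds only a Weak back-reference, so there is no ownership cycle.
struct Parent {
    children: RefCell<Vec<Rc<Child>>>,
}

struct Child {
    parent: Weak<Parent>,
}

// Returns whether the back-reference upgrades before and after the
// parent is dropped.
fn back_reference_demo() -> (bool, bool) {
    let parent = Rc::new(Parent { children: RefCell::new(Vec::new()) });
    let child = Rc::new(Child { parent: Rc::downgrade(&parent) });
    parent.children.borrow_mut().push(Rc::clone(&child));

    let while_alive = child.parent.upgrade().is_some();
    drop(parent); // strong count hits zero; Weak pointers go stale
    let after_drop = child.parent.upgrade().is_some();
    (while_alive, after_drop)
}

fn main() {
    assert_eq!(back_reference_demo(), (true, false));
}
```

This works, but every traversal from child to parent pays for an `upgrade` check and an `Option` unwrap, which is the run-time overhead the post complains about.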
"Is-a" relationships are difficult
Rust traits are not objects. Traits cannot have associated data. Nor are they a good mechanism for constructing object hierarchies. People keep trying to do that, though, and the results are ugly.
I saw a good talk, though I don't remember the name, that went over the array-index approach. It correctly pointed out that by then, you're basically recreating your own pointers without any of the guarantees Rust, or even C++ smart pointers, provide.
pcwalton 3 hours ago [-]
But Unity game objects are the same way: you allocate them when they spawn into the scene, and you deallocate them when they despawn. Accessing them after you destroyed them throws an exception. This is exactly the same as entity IDs! The GC doesn't buy you much, other than memory safety, which you can get in other ways (e.g. generational indices, like Bevy does).
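For readers unfamiliar with the technique, a generational index can be sketched in a few lines (an illustration of the idea with hypothetical names, not Bevy's actual implementation): a handle is only valid while its generation matches the slot's, so a stale handle to a recycled slot is detected instead of silently aliasing the new occupant.

```rust
#[derive(Clone, Copy, PartialEq, Debug)]
struct Handle {
    index: usize,
    generation: u32,
}

struct Arena<T> {
    slots: Vec<(u32, Option<T>)>, // (generation, value)
}

impl<T> Arena<T> {
    fn new() -> Self {
        Arena { slots: Vec::new() }
    }

    fn insert(&mut self, value: T) -> Handle {
        // Reuse a free slot if one exists, else append a new one.
        for (i, slot) in self.slots.iter_mut().enumerate() {
            if slot.1.is_none() {
                slot.1 = Some(value);
                return Handle { index: i, generation: slot.0 };
            }
        }
        self.slots.push((0, Some(value)));
        Handle { index: self.slots.len() - 1, generation: 0 }
    }

    fn remove(&mut self, h: Handle) -> Option<T> {
        let slot = self.slots.get_mut(h.index)?;
        if slot.0 != h.generation {
            return None;
        }
        slot.0 += 1; // bump the generation: all old handles go stale
        slot.1.take()
    }

    fn get(&self, h: Handle) -> Option<&T> {
        let slot = self.slots.get(h.index)?;
        if slot.0 == h.generation { slot.1.as_ref() } else { None }
    }
}

fn main() {
    let mut arena = Arena::new();
    let a = arena.insert("enemy");
    arena.remove(a);
    let b = arena.insert("pickup"); // reuses the same slot...
    assert_eq!(b.index, a.index);
    assert!(arena.get(a).is_none()); // ...but the stale handle is rejected
    assert_eq!(arena.get(b), Some(&"pickup"));
}
```

This is the "use-after-free throws instead of corrupting memory" behavior described above, achieved without a GC.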
_bin_ 2 hours ago [-]
But in rust you have to fight the borrow checker a lot, and sometimes concede, with complex referential stuff. I say this as someone who writes a good bit of rust and enjoys doing so.
pcwalton 2 hours ago [-]
I just don't, and even less often with game logic which tends to be rather simple in terms of the data structures needed. In my experience, the ownership and borrowing rules are in no way an impediment to game development. That doesn't invalidate your experience, of course, but it doesn't match mine.
charlotte-fyi 1 hour ago [-]
The dependency injection framework provided by Bevy also sidesteps a lot of the problems with borrow checking that users might otherwise run into, and encourages writing data-oriented code that is generally favorable to borrow checking anyway.
_bin_ 51 minutes ago [-]
This is a valid point. I've played a little with Bevy and liked it. I have also not written a triple-A game in Rust, with any engine, but I'm extrapolating the mess that might show up once you have to start using lots of other libraries; Bevy isn't really a batteries-included engine so this probably becomes necessary. Doubly so if e.g. you generate bindings to the C++ physics library you've already licensed and work with.
These are all solvable problems, but in reality, it's very hard to write a good business case for being the one to solve them. Most of the cost accrues to you and most of the benefit to the commons. Unless a corporate actor decides to write a major new engine in Rust or use Bevy as the base for the same, or unless a whole lot of indie devs and part-time hackers arduously work all this out, it's not worth the trouble if you're approaching it from the perspective of a studio with severe limitations on both funding and time.
pcwalton 46 minutes ago [-]
Thankfully my studio has given me time to be able to submit a lot of upstream code to Bevy. I do agree that there's a bootstrapping problem here and I'm glad that I'm in a situation where I can help out. I'm not the only one; there are a handful of startups and small studios that are doing the same.
jokethrowaway 1 hour ago [-]
Given my experience with Bevy this doesn't happen very often, if ever.
The only challenge is not having an ecosystem with ready made everything like you do in "batteries included" frameworks.
You are basically building a game engine and a game at the same time.
We need a commercial engine in Rust or a decade of OSS work. But what features will be considered standard in Unreal Engine 2035?
dundarious 2 hours ago [-]
Yes, but regarding use of uninitialized/freed memory, neither GC nor memory safety really helps. Both "only" help with incidental, unintentional, small-scale violations.
janalsncm 3 hours ago [-]
> These crates also get "refactored" every few months, with breaking API changes
I am dealing with similar issues in npm now, as someone who is touching Node dev again. The number of deprecations drives me nuts. Seems like I’m on a treadmill of updating APIs just to have the same functionality as before.
christophilus 2 hours ago [-]
I’ve found the key to the JS ecosystem is to be very picky about what dependencies you use. I’ve got a number of vanilla Bun projects that only depend on TypeScript (and that is only a dev dependency).
It’s not always possible to be so minimal, but I view every dependency as lugging around a huge lurking liability, so the benefit it brings had better far outweigh that big liability.
So far, I’ve only had one painful dependency upgrade in 5 years, and that was Tailwind 3-4. It wasn’t too painful, but it was painful enough to make me glad it’s not a regular occurrence.
molszanski 1 hour ago [-]
Hmmm.. strange. Don’t have issues like that. Can you show us your package json?
schneems 2 hours ago [-]
I wish for ecosystems that would let maintainers ship deprecations with auto-fixing lint rules.
photonthug 2 hours ago [-]
Yeah, not only is the structure of business workflows often resistant to mature software dev workflows, developers themselves increasingly lack the discipline, skills, or interest in backwards compatibility or good initial designs anyway. Add to this the trend that fast-changing software is actually a decent strategy to keep LLMs befuddled, and it's probably going to become an unofficial standard way to maintain support contracts.
On that subject, ironically, code gen by AI for AI-related work is often the least reliable due to fast churn. Langchain is a good example of this and also kind of funny: they suggest/integrate gritql for deterministic code transforms rather than using AI directly: https://python.langchain.com/docs/versions/v0_3/.
Overall, mastering things like gritql, ast-grep, and CST tools for code transforms still pays off. For large code bases, no matter how good AI gets, it is probably better to have it use formal/deterministic tools like these rather than trust it with code transformations directly and just hope for the best.
I occasionally notice libraries or frameworks including OpenRewrite rules in their releases. I've never tried it, though!
harles 3 hours ago [-]
I’ve found such changes can actually be a draw at first. “Hey look, progress and activity!”. Doubly so as a primarily C++ dev frustrated with legacy choices in stl. But as you and others point out, living with these changes is a huge pain.
Charon77 52 minutes ago [-]
> A owns B, and B can find A
I think you should think less like Java/C# and more like database.
If you have a Comment object that has a parent object, you need to store the parent as a 'reference', because you can't put the entire parent object inside each comment.
So I'll probably use Box here to refer to the parent.
aystatic 21 minutes ago [-]
?? the whole point of Box<T> is to be an owning reference, you can’t have multiple children refer to the same parent object if you use a Box
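To illustrate the distinction with hypothetical types (not code from the thread): a `Box` field would force each comment to own its own parent outright, whereas `Rc` lets several comments share one.

```rust
use std::rc::Rc;

struct Post {
    title: String,
}

// With `parent: Box<Post>`, constructing a second comment would require
// a second Post -- a Box cannot be shared. Rc gives shared,
// reference-counted ownership of a single parent.
struct Comment {
    parent: Rc<Post>,
}

fn main() {
    let post = Rc::new(Post { title: String::from("hello") });
    let c1 = Comment { parent: Rc::clone(&post) };
    let c2 = Comment { parent: Rc::clone(&post) };

    // One Post, three strong handles: `post`, `c1.parent`, `c2.parent`.
    assert_eq!(Rc::strong_count(&post), 3);
    assert_eq!(c1.parent.title, c2.parent.title);
}
```

If the Post in turn owned its Comments, the back-pointers would need to be Weak instead, or the reference-count cycle would leak.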
echelon 4 hours ago [-]
We've got another one on our end. It's much more to do with Bevy than Rust, though. And I wonder if we would have felt the same if we had chosen Fyrox.
> Migration - Bevy is young and changes quickly.
We were writing an animation system in Bevy and were hit by the painful upgrade cycle twice. And the issues we had to deal with were runtime failures, not build time failures. It broke the large libraries we were using, like space_editor, until point releases and bug fixes could land. We ultimately decided to migrate to Three.js.
> The team decided to invest in an experiment. I would pick three core features and see how difficult they would be to implement in Unity.
This is exactly what we did! We feared a total migration, but we decided to see if we could implement the features in JavaScript within three weeks. Turns out Three.js got us significantly farther than Bevy, much more rapidly.
pcwalton 4 hours ago [-]
> We were writing an animation system in Bevy and were hit by the painful upgrade cycle twice.
I definitely sympathize with the frustration around the churn--I feel it too and regularly complain upstream--but I should mention that Bevy didn't really have anything production-quality for animation until I landed the animation graph in Bevy 0.15. So sticking with a compatible API wasn't really an option: if you don't have arbitrary blending between animations and opt-in additive blending then you can't really ship most 3D games.
pcwalton 4 hours ago [-]
> Nobody has really pushed the performance issues.
This is clearly false. The Bevy performance improvements that I and the rest of the team landed in 0.16 speak for themselves [1]: 3x faster rendering on our test scenes and excellent performance compared to other popular engines. It may be true that little work is being done on rend3, but please don't claim that there isn't work being done in other parts of the ecosystem.
I read the original post as saying that no one has pushed the engine to the extent a completed AAA game would in order to uncover performance issues, not that performance is bad or that Bevy devs haven’t worked hard on it.
sapiogram 4 hours ago [-]
Wonderful work!
...although the fact that a 3x speed improvement was available kind of proves their point, even if it may be slightly out of date.
pcwalton 4 hours ago [-]
Most game engines other than the latest in-house AAA engines are leaving comparable levels of performance on the table on scenes that really benefit from GPU-driven rendering (that's not to say all scenes, of course). A Google search for [Unity drawcall optimization] will show how important it is. GPU-driven rendering allows developers to avoid having to do all that optimization manually, which is a huge benefit.
12_throw_away 6 hours ago [-]
More than anything else, this sounds like a good lesson in why commercial game engines have taken over most of game dev. There are so many things you have to do to make a game, but they're mostly quite common and have lots of off-the-shelf solutions.
That is, any sufficiently mature indie game project will end up implementing an informally specified, ad hoc, bug-ridden implementation of Unity (... or just use the informally specified, ad hoc and bug-ridden game engine called "Unity")
pcwalton 3 hours ago [-]
> More than anything else, this sounds like a good lesson in why commercial game engines have taken over most of game dev. There are so many things you have to do to make a game, but they're mostly quite common and have lots of off-the-shelf solutions.
> That is, any sufficiently mature indie game project will end up implementing an informally specified, ad hoc, bug-ridden implementation of Unity (... or just use the informally specified, ad hoc and bug-ridden game engine called "Unity")
But using Bevy isn't writing your own game engine. Bevy is 400k lines of code that does quite a lot. Using Bevy right now is more like taking a game engine and filling in some missing bits. While this is significantly more effort than using Unity, it's an order of magnitude less work than writing your own game engine from scratch.
doctorpangloss 6 hours ago [-]
And yet, if making your own game engine makes it intellectually stimulating enough to actually make and ship a game, usually for near free, going 10x slower is still better than going at a speed of zero.
spullara 5 hours ago [-]
I would bet that if you want to build a game engine and not the game, the game itself is probably not that compelling. Could still break out, like Minecraft, but if someone has an amazing game idea I would think they would want to ship it as fast as possible.
qustrolabe 5 hours ago [-]
If anything, making your own game engine makes the process more frustrating and time-consuming, and leads to burnout quicker than ever, especially when your initial goal was just to make a game but instead you're stuck figuring out your own render pipeline or inventing some other wheel. I have a headache just from thinking that at some point in engine development a person would have to spend literal weeks figuring out export to Android with proper signing and all, when, again, all they wanted was to just make a game.
lolinder 4 hours ago [-]
This seems entirely subjective, most importantly hinging on this part here: "all they wanted is to just make a game".
If you just want to make a game, yes, absolutely just go for Unity, for the same reason why if you just want to ship a CRUD app you should just use an established batteries-included web framework. But indie game developers come in all shapes and some of them don't just want to make a game, some of them actually do enjoy owning every part of the stack. People write their own OSes for fun, is it so hard to believe that people (who aren't you) might enjoy the process of building a game engine?
turtledragonfly 4 hours ago [-]
Speaking as someone who has made their own game engine for their indie game: it really depends on the game, and on the developer's personality and goals. I think you're probably right for the majority of cases, since the majority of games people want to make are reasonably well-served by general-purpose game engines.
But part of the thing that attracted me to the game I'm making is that it would be hard to make in a standard cookie-cutter way. The novelty of the systems involved is part of the appeal, both to me and (ideally) to my customers. If/when I get some of those (:
mjr00 5 hours ago [-]
> And yet, if making your own game engine makes it intellectually stimulating enough to actually make and ship a game, usually for near free, going 10x slower is still better than going at a speed of zero.
Generally, I've seen the exact opposite. People who code their own engines tend to get sucked into the engine and forget that they're supposed to be shipping a game. (I say this as someone who has coded their own engine, multiple times, and ended up not shipping a game--though I had a lot of fun working on the engine.)
The problem is that the fun, cool parts about building your own game engine are vastly outnumbered by the boring parts: supporting level and save data loading/storage, content pipelines, supporting multiple input devices and things like someone plugging in an XBox controller while the game is running and switching all the input symbols to the new input device in real time, supporting various display resolutions and supporting people plugging in new displays while the game is running, and writing something that works on PC/mobile/Switch(2)/XBox/Playstation... all solved problems, none of which are particularly intellectually stimulating to solve correctly.
If someone's finances depend on shipping a game that makes money, there's really no question that you should use Unity or Unreal. Maybe Godot but even that's a stretch. There's a small handful of indie custom game engine success stories, including some of my favorites like The Witness and Axiom Verge, but those are exceptions rather than the rule. And Axiom Verge notably had to be deeply reworked to get a Switch release, because it's built on MonoGame.
pornel 3 hours ago [-]
Indeed there are people who want to make games, and there are people who think they want to make games, but want to make game engines (I'm speaking from experience, having both shipped games and keeping a junk drawer of unreleased game engines).
Shipping a playable game involves so so many things beyond enjoyable programming bits that it's an entirely different challenge.
I think it's telling that there are more Rust game engines than games written in Rust.
CooCooCaCha 4 hours ago [-]
My experience is the opposite. Plenty of intellectual stimulation comes from actually making the game. Designing and refining gameplay mechanics, level design, writing shaders, etc.
What really drags you down in games is iteration speed. It can be fun making your own game engine at first, but after a while you just want the damn thing to work so you can try out new ideas.
palata 5 hours ago [-]
I really like Rust as a replacement for C++, especially given that C++ seems to become crazier every year. When reasonable, nowadays I always use Rust instead of C++.
But for the vast majority of projects, I believe that C++ is not the right language, meaning that Rust isn't, either.
I feel like many people choose Rust because is sounds like it's more efficient, a bit as if people went for C++ instead of a JVM language "because the JVM is slow" (spoiler: it is not) or for C instead of C++ because "it's faster" (spoiler: it probably doesn't matter for your project).
It's a bit like choosing Gentoo "because it's faster" (or worse, because it "sounds cool"). If that's the only reason, it's probably a bad choice (disclaimer: I use and love Gentoo).
lolinder 4 hours ago [-]
I have a personal-use app that has a hot loop that (after extensive optimization) runs for about a minute on a low-powered VPS to compute a result. I started in Java and then optimized the heck out of it with the JVM's (and IntelliJ's) excellent profiling tools. It took one day to eliminate all excess allocations. When I was confident I couldn't optimize the algorithm any further on the JVM I realized that what I'd boiled it down to looked an awful lot like Rust code, so I thought why not, let's rewrite it in Rust. I took another day to rewrite it all.
The result was not statistically different in performance than my Java implementation. Each took the same amount of time to complete. This surprised me, so I made triply sure that I was using the right optimization settings.
Lesson learned: Java is easy to get started with out of the box, memory safe, battle tested, and the powerful JIT means that if warmup times are a negligible factor in your usage patterns your Java code can later be optimized to be equivalent in performance to a Rust implementation.
internetter 3 hours ago [-]
I'd rather write rust than java, personally
noisy_boy 16 minutes ago [-]
If I have all the time in the world, sure. When I'm racing against a deadline, I don't want to wrestle with the borrow checker too. Sure, its objections help with the long-term quality of the code and reduce bugs, but that's hard to justify to a manager/process driven by Agile and Sprints. It's quite possible that an experienced Rust dev can be very productive, but there aren't tons of those going around.
Java has the stigma of ClassFactoryGeneratorFactory sticking to it like a nasty smell but that's not how the language makes you write things. I write Java professionally and it is as readable as any other language. You can write clean, straightforward and easy to reason code without much friction. It's a great general purpose language.
lolinder 3 hours ago [-]
I'd have said the same thing 10 years ago (or, I would have if I were comparing 10-year-old Java with modern Rust), but Java these days is actually pretty ergonomic. Rust's borrow checker balances out the ML-style niceties to bring it down to about Java's level for me, depending on the application.
vips7L 1 hour ago [-]
I’d rather write Java than Rust, personally
wffurr 4 hours ago [-]
>> a bit as if people went for C++ instead of a JVM language "because the JVM is slow" (spoiler: it is not)
The OP is doing game development. It’s possible to write a performant game in Java but you end up fighting the garbage collector the whole way and can’t use much library code because it’s just not written for predictable performance.
palata 4 hours ago [-]
I didn't mean that the OP should use Java. BTW the OP does not use C++, but Rust.
This said, they moved to Unity, which is C#, which is garbage collected, right?
jayd16 4 hours ago [-]
C# also has "Value Types" which can be stack allocated and passed by value. They're used extensively in game dev.
vips7L 1 hour ago [-]
Hopefully that changes once Java releases their value types.
elabajaba 3 hours ago [-]
The core Unity engine is C++ that you can't access, but all Unity games are written in C#.
neonsunset 4 hours ago [-]
C#/.NET has a huge feature area for low-level/hands-on memory manipulation, which is highly relevant to gamedev.
wyager 2 hours ago [-]
I write a lot of Rust, but as you say, it's basically a vastly improved version of C++. C++ is not always the right move!
For all my personal projects, I use a mix of Haskell and Rust, which I find covers 99% of the product domains I work in.
Ultra-low level (FPGA gateware): Haskell. The Clash compiler backend lets you compile (non-recursive) Haskell code directly to FPGA. I use this for audio codecs, IO expanders, and other gateware stuff.
Very low-level (MMUless microcontroller hard-realtime) to medium-level (graphics code, audio code): Rust dominates here
High-level (have an MMU, OS, and desktop levels of RAM, not sensitive to ~0.1ms GC pauses): Haskell becomes a lot easier to productively crank out "business logic" without worrying about memory management. If you need to specify high-level logic, implement a web server, etc. it's more productive than Rust for that type of thing.
Both languages have a lot of conceptual overlap (ADTs, constrained parametric types, etc.), so being familiar with one provides some degree of cross-training for the other.
jandrewrogers 3 hours ago [-]
> C instead of C++ because "it's faster" (spoiler: it probably doesn't matter for your project)
If your C is faster than your C++ then something has gone horribly wrong. C++ has been faster than C for a long time. C++ is about as fast as it gets for a systems language.
haberman 2 hours ago [-]
> C++ has been faster than C for a long time.
What is your basis for this claim? C and C++ are both built on essentially the same memory and execution model. There is a significant set of programs that are valid C and C++ both -- surely you're not suggesting that merely compiling them as C++ will make them faster?
There's basically no performance technique available in C++ that is not also available in C. I don't think it's meaningful to call one faster than the other.
mawww 1 hour ago [-]
C and C++ do have very different memory models, C essentially follows the "types are a way to decode memory" model while C++ has an actual object model where accessing memory using the wrong type is UB and objects have actual lifetimes. Not that this would necessarily lead to performance differences.
When people claim C++ to be faster than C, that is usually understood as C++ provides tools that makes writing fast code easier than C, not that the fastest possible implementation in C++ is faster than the fastest possible implementation in C, which is trivially false as in both cases the fastest possible implementation is the same unmaintainable soup of inline assembly.
The typical example used to claim C++ is faster than C is sorting, where C due to its lack of templates and overloading needs `qsort` to work with void pointers and a pointer to function, making it very hard on the optimiser, when C++'s `std::sort` gets the actual types it works on and can directly inline the comparator, making the optimiser work easier.
ryao 50 minutes ago [-]
Try putting objects into two linked lists in C using sys/queue.h and in C++ using the STL. Try sorting the linked lists. You will find C outperforms C++. That is because C’s data structures are intrusive, such that you do not have external nodes pointing to the objects to cause an extra random memory access. The C++ STL requires an externally allocated node that points to the object in at least one of the data structures, since only 1 container can manage the object lifetimes to be able to concatenate its node with the object as part of the allocation. If you wish to avoid having object lifetimes managed by containers, things will become even slower, because now both data structures will have an extra random memory access for every object. This is not even considering the extra allocations and deallocations needed for the external nodes.
That said, external comparators are a weakness of generic C library functions. I once manually inlined them in some performance critical code using the C preprocessor:
In my experience, templates usually cause a lot of bloat that slows things down. Sure, in microbenchmarks it always looks good to specialize everything at compile time; whether this is what you want in a larger project is a different question. And a C compiler can also specialize a sort routine for your types just fine. It just needs to be able to look into it, i.e. it does not work for qsort from the libc. I agree with your point that C++ comes with fast implementations of algorithms out of the box. In C you need to assemble a toolbox yourself. But once you have done this, I see no downside.
krapht 2 hours ago [-]
I know you're going to reply with "BUT MY PREPROCESSOR", but template specialization is a big win and improvement (see qsort vs std::sort).
ryao 48 minutes ago [-]
I have used the preprocessor to avoid this sort of slowdown in the past in a binary search function:
The performance gain comes not from eliminating the function overhead, but enabling conditional move instructions to be used in the comparator, which eliminates a pipeline hazard on each loop iteration. There is some gain from eliminating the function overhead, but it is tiny in comparison to eliminating the pipeline hazard.
That said, C++ has its weaknesses too, particularly in its typical data structures, its excessive use of dynamic memory allocation and its exception handling. I gave an example here:
That's not the most general case, but it's better than I expected.
ryao 1 hour ago [-]
I doubt that because C++ encourages heavy use of dynamic memory allocations and data structures with external nodes. C encourages intrusive data structures, which eliminate many of the dynamic memory allocations done in C++. You can do intrusive data structures in C++ too, but it clashes with the object-oriented idea of encapsulation, since an intrusive data structure touches fields of the objects inside it. I have never heard of someone modifying a class definition just to add objects of that class to a linked list, for example, yet that is what is needed if you want to use intrusive data structures.
While I do not doubt some C++ code uses intrusive data structures, I doubt very much of it does. Meanwhile, C code using <sys/queue.h> uses intrusive lists as if they were second nature. C code using <sys/tree.h> from libbsd uses intrusive trees as if they were second nature. There is also the intrusive AVL trees from libuutil on systems that use ZFS and there are plenty of other options for such trees, as they are the default way of doing things in C. In any case, you see these intrusive data structures used all over C code and every time one is used, it is a performance win over the idiomatic C++ way of doing things, since it skips an allocation that C++ would otherwise do.
The use of intrusive data structures also can speed up operations on data structures in ways that are simply not possible with idiomatic C++. If you place the node and key in the same cache line, you can get two memory fetches for the price of one when sorting and searching. You might even see decent performance even if they are not in the same cache line, since the hardware prefetcher can predict the second memory access when the key and node are in the same object, while the extra memory access to access a key in a C++ STL data structure is unpredictable because it goes to an entirely different place in memory.
You could say if you have the C++ STL allocate the objects, you can avoid this, but you can only do that for 1 data structure. If you want the object to be in multiple data structures (which is extremely common in C code that I have seen), you are back to inefficient search/traversal. Your object lifetime also becomes tied to that data structure, so you must be certain in advance that you will never want to use it outside of that data structure or else you must do at a minimum, another memory allocation and some copies, that are completely unnecessary in C.
Exception handling in C++ also can silently kill performance if you have many exceptions thrown and the code handles it without saying a thing. By not having exception handling, C code avoids this pitfall.
cantrecallmypwd 3 hours ago [-]
> C++ has been faster than C for a long time.
Citation needed.
zxvkhkxvdvbdxz 2 hours ago [-]
> If your C is faster than your C++ then something has gone horribly wrong. C++ has been faster than C for a long time. C++ is about as fast as it gets for a systems language.
That's interesting, did ChatGPT tell you this?
djmips 5 hours ago [-]
I agree with you except for the JVM bit - but everyone's application varies
palata 4 hours ago [-]
My point is that there are situations where C++ (or Rust) is required because the JVM wouldn't work, but those are niche.
In my experience, most people who don't want a JVM language "because it is slow" tend to take this as a principle, and when you ask why their first answer is "because it's interpreted". I would say they are stuck in the 90s, but probably they just don't know and repeat something they have heard.
Similar to someone who would say "I use Gentoo because Ubuntu sucks: it is super slow". I have many reasons to like Gentoo better than Ubuntu as my main distro, but speed isn't one in almost all cases.
twic 2 hours ago [-]
The JVM is excellent for throughput, once the program has warmed up, but it always has much more jitter than a more systemsy language like C++ or Rust. There are definitely use cases where you need to consistently react fast, where Java is not a good choice.
It also struggles with numeric work involving large matrices, because there isn't good support for that built into the language or standard library, and there isn't a well-developed library like NumPy to reach for.
peterashford 5 hours ago [-]
You think the JVM is slow?
bluGill 4 hours ago [-]
Depends. The JVM is fast once HotSpot figures things out, but that means the first level is slow and you lose your users.
vips7L 1 hours ago [-]
You can always load JIT caches if you can’t wait for warm up.
mceachen 5 hours ago [-]
IME large linear algebra algos run like molasses in a jvm compared to compiled solutions. You're always fighting the gc.
za3faran 4 hours ago [-]
Do you have any benchmarks to show, out of curiosity?
light_hue_1 3 hours ago [-]
Ok. But we have plenty of C libraries to bind to that for.
They're far slower in Python but that hasn't stopped anyone.
3 hours ago [-]
artursapek 4 hours ago [-]
Install Gentoo
palata 4 hours ago [-]
As I said, I use Gentoo already ;-).
gerdesj 3 hours ago [-]
Quite.
I was a Gentoo user (daily driver) for around 15 years but the endless compilation cycles finally got to me. It is such a shame because as I started to depart, Gentoo really got its arse in gear with things like user patching etc and no doubt is even better.
It has literally (lol) just occurred to me that some sort of dual partition thing could sort out my main issue with Gentoo.
@system could have two partitions - the running one and the next one that is compiled for and then switched over to on a reboot. @world probably ought to be split up into bits that can survive their libs being overwritten with new ones and those that can't.
Errrm, sorry, I seem to have subverted this thread.
fc417fc802 1 hours ago [-]
You have approximately described guix.
curt15 2 hours ago [-]
Gentoo Silverblue?
VWWHFSfQ 5 hours ago [-]
Rust is very easy when you want to do easy things. You can actually just completely avoid the borrow-checker altogether if you want to. Just .clone(), or Arc/Mutex. It's what all the other languages (like Go or Java) are doing anyway.
But if you want to do a difficult and complicated thing, then Rust is going to raise the guard rails. Your program won't even compile if it's unsafe. It won't let you make a buggy app. So now you need to back up and decide if you want it to be easy, or you want it to be correct.
Yes, Rust is hard. But it doesn't have to be if you don't want.
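A minimal sketch of the "easy mode" the parent describes: clone freely, and share mutable state behind `Arc<Mutex<_>>`, much as a GC'd language would share a reference. `parallel_increment` is an illustrative name, not from any crate.

```rust
use std::sync::{Arc, Mutex};
use std::thread;

// Bump a shared counter from several threads without ever arguing
// with the borrow checker about ownership.
pub fn parallel_increment(n_threads: usize) -> i32 {
    let counter = Arc::new(Mutex::new(0));
    let handles: Vec<_> = (0..n_threads)
        .map(|_| {
            // Each thread gets its own clone of the Arc.
            let counter = Arc::clone(&counter);
            thread::spawn(move || *counter.lock().unwrap() += 1)
        })
        .collect();
    for h in handles {
        h.join().unwrap();
    }
    let result = *counter.lock().unwrap();
    result
}

fn main() {
    println!("{}", parallel_increment(4)); // prints 4
}
```

The cost is an atomic refcount bump per clone and a lock per access, which is roughly the overhead a GC'd language pays implicitly.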
WD-42 4 hours ago [-]
This argument only goes so far. Would you consider querying a database hard? Most developers would say no. But it's actually a pretty hard problem if you want to do it safely. In Rust, that difficulty leaks into the crates. I have a project that uses Diesel, and making even a single composable query is a tangle of uppercase Type soup.
This just isn’t a problem in other languages I’ve used, which granted aren’t as safe.
I love Rust. But saying it’s only hard if you are doing hard things is an oversimplification.
ben-schaaf 3 hours ago [-]
Building a proper ORM is hard. Querying a database is not. See the postgres crate for an example.
Querying a database while ensuring type safety is harder, but you still don't need an ORM for that. See sqlx.
dudinax 4 hours ago [-]
My feeling is that rust makes easy things hard and hard things work.
palata 4 hours ago [-]
If you use Rust with `.clone()` and Arc/Mutex, why not just using one of the myriad of other modern and memory safe languages like Go, Scala/Kotlin/Java, C#, Swift?
The whole point of Rust is to bring memory safety with zero cost abstraction. It's essentially bringing memory safety to the use-cases that require C/C++. If you don't require that, then a whole world of modern languages becomes available :-).
mtndew4brkfst 45 minutes ago [-]
For me personally, doing the clone-everything style of Rust for a first pass means I still have a graceful incremental path to go pursue the harder optimizations that are possible with more thoughtful memory management. The distinction is that I can do this optimization pass continuing to work in Rust rather than considering, and probably discarding, a potential rewrite to a net-new language if I had started in something like Ruby/Python/Elixir. FFI to optimize just the hot paths in a multi-language project has significant downsides and tradeoffs.
Plus in the meantime, even if I'm doing the "easy mode" approach I get to use all of the features I enjoy about writing in Rust - generics, macros, sum types, pattern matching, Result/Option types. Many of these can't be found all together in a single managed/GC'd languages, and the list of those that I would consider viable for my personal or professional use is quite sparse.
echelon 5 hours ago [-]
Rust is actually quite suitable for a number of domains where it was never intended to excel.
Writing web service backends is one domain where Rust absolutely kicks ass. I would choose Rust/(Actix or Axum) over Go or Flask any day. The database story is a little rough around the edges, but it's getting better and SQLx is good enough for me.
edit: The downvoters are missing out.
palata 4 hours ago [-]
To me, web dev really sounds like the one place where everything works and it's more a question of what is in fashion. Java, Ruby, Python, PHP, C, C++, Go, Rust, Scala, Kotlin, probably even Swift? And of course NodeJS was made for that, right?
I am absolutely convinced I can find success story of web backends built with all those languages.
echelon 3 hours ago [-]
Perhaps. But a comparable Rust backend stack produces a single-binary deployable that can absorb 50,000 QPS with no latency spikes caused by garbage collection. You get all of that for free.
The type system and package manager are a delight, and writing with sum types results in code that is measurably more defect free than languages with nulls.
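A small sketch of the sum-types point: "no result" is explicit in the type, so the missing case is a compile error rather than a null dereference at runtime. `find_user` and `greeting` are hypothetical names for illustration.

```rust
// A lookup that encodes "not found" in the return type instead of null.
fn find_user(id: u32) -> Option<&'static str> {
    match id {
        1 => Some("alice"),
        2 => Some("bob"),
        _ => None,
    }
}

fn greeting(id: u32) -> String {
    // The compiler forces both arms to exist; forgetting the None case
    // is a compile error, not a NullPointerException in production.
    match find_user(id) {
        Some(name) => format!("hello, {name}"),
        None => "unknown user".to_string(),
    }
}

fn main() {
    println!("{}", greeting(1)); // prints "hello, alice"
    println!("{}", greeting(42)); // prints "unknown user"
}
```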
aquariusDue 1 hours ago [-]
Yep, that's precisely it! When dealing with other languages I miss the "match" keyword and being able to open a block anywhere. Sure, sometimes Rust allows you to write terse abominations if you don't exercise a dose of caution and empathy for future maintainers (you included).
Other than the great developer experience in tooling and language ergonomics (as in coherent features not necessarily ease of use) the reason I continue to put up with the difficulties of Rust's borrow checker is because I feel I can work towards mastering one language and then write code across multiple domains AND at the end I'll have an easy way to share it, no Docker and friends needed.
But I don't shy away from the downsides. Rust loads the cognitive burden at the ends. Hard as hell in the beginning when learning it and most people (me included) bounce from it for the first few times unless they have C++ experience (from what I can tell). At the middle it's a joy even when writing "throwaway" code with .expect("Lol oops!") and friends. But when you get to the complex stuff it becomes incredibly hard again because Rust forces you to either rethink your design to fit the borrow checker rules or deal with unsafe code blocks which seem to have their own flavor of C++ like eldritch horrors.
Anyway, would *I* recommend Rust to everyone? Nah, Go is a better proposition for a most bang for your buck language, tooling and ecosystem UNLESS you're the kind that likes to deal with complexity for the fulfilled promise of one language for almost anything. In even simpler terms Go is good for most things, Rust can be used for everything.
Also stuff like Maud and Minijinja for Rust are delights on the backend when making old fashioned MPA.
Thanks for coming to my TED talk.
vips7L 1 hours ago [-]
What language are you using that doesn’t have match? Even Java has the equivalent. The only ones I can think of that don’t are the scripting languages.. Python and JS.
icantcode 48 minutes ago [-]
Yeah, anything with nulls ends up with Option<this> and Option<that> which means unwraps or matches. There is a comment above about good bedrock and Rust works OK with nulls but it works really well with unsparse databases (avoiding joins).
jokethrowaway 56 minutes ago [-]
The bar for web services is low, so pretty much anything works as long as it's easy. I wouldn't call them a success story.
When things get complex, you start missing Rust's type system and bugs creep in.
In node.js there was a notable improvement when TS became the de-facto standard and API development improved significantly (if you ignore the poor tooling, transpiling, building, TS being too slow). It's still far from perfect because TS has too many escape hatches and you can't trust TS code; with Rust, if it compiles and there are no unsafe (which is rarely a problem in web services) you get a lot of compile time guarantees for free.
ajross 3 hours ago [-]
Yeah, "web services backend" really means "code exercising APIs pioneered by SunOS in 1988". It's easy to be rock solid if your only dependency is the bedrock.
benwilber0 4 hours ago [-]
Tokio + Axum + SQLx has been a total game-changer for me for web dev. It's by far the most productive I've been with any backend web stack.
echelon 3 hours ago [-]
People that haven't tried this are downvoting with prejudice, but they just don't know.
Rust is an absolute gem at web backend. An absolute fucking gem.
efnx 4 hours ago [-]
I think this is a problem of using the right abstractions.
Rust gamedev is the Wild West, and frontier development incurs the frontier tax. You have to put a lot of work into making an abstraction, even before you know if it’s the right fit.
Other “platforms” have the benefit of decades more work sunk into finding and maintaining the right abstractions. Add to that the fact that Rust is an ML in sheep's clothing, and that games and UI in FP have never been a solved problem (or had much investment), and it's no wonder Rust isn't ready. We haven't even agreed on the best solutions to many of these problems in FP, let alone Rust specifically!
Anyway, long story short, it takes a very special person to work on that frontier, and shipping isn’t their main concern.
lynndotpy 7 hours ago [-]
I love Rust, but this lines up with my experience roughly. Especially the rapid iteration. Tried things out with Bevy, but I went back to Godot.
There are so many QoL things which would make Rust better for gamedev without revamping the language. Just a mode to automatically coerce between numeric types would make Rust so much more ergonomic for gamedev. But that's a really hard sell (and might be harder to implement than I imagine.)
ChadNauseam 6 hours ago [-]
I wish more languages would lean into having a really permissive compiler that emits a lot of warnings. I have CI so I'm never going to actually merge anything that makes warnings. But when testing, just let me do whatever I want!
GHC has an -fdefer-type-errors option that lets you compile and run this code:
a :: Int
a = 'a'
main = print "b"
Which obviously doesn't typecheck since 'a' is not an Int, but will run just fine since the value of `a` is not observed by this program. (If it were observed, -fdefer-type-errors guarantees that you get a runtime panic when it happens.) This basically gives you the no-types Python experience when iterating, then you clean it all up when you're done.
This would be even better in cases where it can be automatically fixed. Just like how `cargo clippy --fix` will automatically fix lint errors whenever it can, there's no reason it couldn't also add explicit coercions of numeric types for you.
zaptheimpaler 5 hours ago [-]
Yeah this is my absolute dream language. Something that lets you prototype as easily as Python but then compile as efficiently and safely as Rust. I thought Rust might actually fit the bill here and it is quite good but it's still far from easy to prototype in - lots of sharp edges with say modifying arrays while iterating, complex types, concurrency. Maybe Rust can be something like this with enough unsafe but I haven't tried. I've also been meaning to try more Typescript for this kind of thing.
FacelessJim 4 hours ago [-]
You should give Julia a shot.
That’s basically that. You can start with super dynamic code in a REPL and gradually hammer it into stricter and hyper efficient code. It doesn’t have a borrow checker, but it’s expressive enough that you can write something similar as a package (see BorrowChecker.jl).
jimbokun 5 hours ago [-]
Some Common Lisp implementations like SBCL have supported this style of development for many years. Everything is dynamically typed by default but as you specify more and more types the compiler uses them to make the generated code more efficient.
fc417fc802 1 hours ago [-]
I quite like common lisp but I don't believe any existing implementation gets you anywhere near the same level of compile time safety. Maybe something like typed racket but that's still only doing a fraction of what rust does.
myaccountonhn 4 hours ago [-]
I think OCaml could be such a language personally. Its like rust-lite or a functional go.
cantrecallmypwd 3 hours ago [-]
Xen and Wall St. folks use it.
tetha 6 hours ago [-]
Yeh, I've been tinkering around a year with a Bevy-competitor, Amethyst until that project shut down. By now, I just don't think Rust is good for client-side or desktop game development.
In my book, Rust is good at moving runtime-risk to compile-time pain and effort. For the space of C-Code running nuclear reactors, robots and missiles, that's a good tradeoff.
For the space of making an enemy move the other direction of the player in 80% of the cases, except for that story choice, and also inverted and spawning impossible enemies a dozen times if you killed that cute enemy over yonder, and.... and the worst case is a crash of a game and a revert to a save at level start.... less so.
And these are very regular requirements in a game, tbh.
And a lot of _very_silly_physics_exploits_ are safely typed float interactions going entirely nuts, btw. Type safety doesn't help there.
zxvkhkxvdvbdxz 1 hours ago [-]
> Yeh, I've been tinkering around a year with a Bevy-competitor, Amethyst until that project shut down. By now, I just don't think Rust is good for client-side or desktop game development.
I don't think your experience with Amethyst merits your conclusion of the state of gamedev in rust, especially given Amethysts own take on Bevy [1, 2].
> Just a mode to automatically coerce between numeric types would make Rust so much more ergonomic for gamedev.
C# is stricter about float vs. double for literals than Rust is, and the default in C# (double) is the opposite of the one you want for gamedev. That hasn't stopped Unity from gaining enormous market share. I don't think this is remotely near the top issue.
lynndotpy 1 hours ago [-]
I have written a lot of C# and I would very much not want to use it for gamedev either. I can only speak for my own personal preference.
__loam 6 hours ago [-]
I used to hate the language but statically typed GDscript feels like the perfect weight for indie development
IshKebab 6 hours ago [-]
Yeah I haven't really used it much but from what I've seen it's kind of what Python should have been. Looks way better than Lua too.
__loam 6 hours ago [-]
I like it better than python now, but it's still got some quirks. The lack of structs and typed callables are the biggest holes right now imo but you can work around those
Seattle3503 6 hours ago [-]
What numeric types typically need conversions?
koakuma-chan 6 hours ago [-]
The fact you need a usize specifically to index an array (and most collections) is pretty annoying.
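What that friction looks like in practice, as a small sketch: game coordinates tend to be `i32` or `f32`, but indexing wants `usize`, so casts accumulate at every access. `tile_at` is a hypothetical helper, not from any named crate.

```rust
// `as usize` would silently wrap a negative x into a huge index;
// try_from makes that failure explicit instead.
pub fn tile_at(tiles: &[i32], x: i32) -> i32 {
    let idx = usize::try_from(x).expect("negative index");
    tiles[idx]
}

fn main() {
    let tiles = vec![10, 20, 30, 40];
    let x: i32 = 2; // typical game coordinate type
    println!("{}", tile_at(&tiles, x)); // prints 30

    // Without a helper, the casts show up at every call site:
    let last = tiles.len() as i32 - 1;
    println!("{}", tiles[last as usize]); // prints 40
}
```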
anticrymactic 6 hours ago [-]
This could be different in game dev, but in the last years of writing rust (outside of learning the language) I very rarely need to index any collection.
There is a very certain way Rust is supposed to be used, which is a negative on its own, but it will lead to a fulfilling and productive programming experience. (My opinion.) If you need to regularly index something, then you're using the language wrong.
bunderbunder 6 hours ago [-]
I'm no game dev but I have had friends who do it professionally.
Long story short, yes, it's very different in game dev. It's very common to pre-allocate space for all your working data as large statically sized arrays, because dynamic allocation is bad for performance. Oftentimes the data gets organized in parallel arrays (https://en.wikipedia.org/wiki/Parallel_array) instead of in collections of structs. This can save a lot of memory (because the data gets packed more densely), be more cache-friendly, and make it much easier to make efficient use of SIMD instructions.
This is also fairly common in scientific computing (which is more my wheelhouse), and for the same reason: it's good for performance.
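A sketch of the parallel-arrays (struct-of-arrays) layout described above, next to the usual array-of-structs. The type and field names are illustrative, not from any engine.

```rust
// Array-of-structs: each entity's fields are interleaved in memory.
#[allow(dead_code)]
struct EntityAos {
    x: f32,
    y: f32,
    health: i32,
}

// Struct-of-arrays: each field packed densely in its own Vec.
pub struct Entities {
    pub xs: Vec<f32>,
    pub ys: Vec<f32>,
    pub healths: Vec<i32>,
}

impl Entities {
    pub fn new() -> Self {
        Entities { xs: Vec::new(), ys: Vec::new(), healths: Vec::new() }
    }

    pub fn spawn(&mut self, x: f32, y: f32, health: i32) {
        self.xs.push(x);
        self.ys.push(y);
        self.healths.push(health);
    }

    // A per-frame movement pass touches only `xs`: health data never
    // enters cache, and a tight loop over a flat f32 slice is easy for
    // the compiler to auto-vectorize.
    pub fn integrate(&mut self, dx: f32) {
        for x in &mut self.xs {
            *x += dx;
        }
    }
}

fn main() {
    let mut world = Entities::new();
    world.spawn(0.0, 0.0, 100);
    world.spawn(2.0, 1.0, 100);
    world.integrate(1.0);
    println!("{:?}", world.xs); // prints [1.0, 3.0]
}
```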
Pet_Ant 6 hours ago [-]
> Oftentimes the data gets organized in parallel arrays (https://en.wikipedia.org/wiki/Parallel_array) instead of in collections of structs. This can save a lot of memory (because the data gets packed more densely), be more cache-friendly, and make it much easier to make efficient use of SIMD instructions.
That seems like something that could very easily be turned into a compiler optimisation and enabled with something like an annotation. Would have some issue when calling across library boundaries ( a lot like the handling of gradual types), but within the codebase that'd be easy.
crq-yml 3 hours ago [-]
The underlying issue with game engine coding is that the problem is shaped in this way:
* Everything should be random access(because you want to have novel rulesets and interactions)
* It should also be fast to iterate over per-frame(since it's real-time)
* It should have some degree of late-binding so that you can reuse behaviors and assets and plug them together in various ways
* There are no ideal data structures to fulfill all of this across all types of scene, so you start hacking away at something good enough with what you have
* Pretty soon you have some notion of queries and optional caching and memory layouts to make specific iterations easier. Also it all changes when the hardware does.
* Congratulations, you are now the maintainer of a bespoke database engine
You can succeed at automating parts of it, but note that parent said "oftentimes", not "always". It's a treadmill of whack-a-mole engineering, just like every other optimizing compiler; the problem never fully generalizes into a right answer for all scenarios. And realistically, gamedevs probably haven't come close to maxing out what is possible in a systems-level sense of things since the 90's. Instead we have a few key algorithms that go really fast and then a muddle of glue for the rest of it.
rcxdude 3 hours ago [-]
It's not at all easy to implement as an optimisation, because it changes a lot of semantics, especially around references and pointers. It is something that you can e.g. implement using rust procedural macros, but it's far from transparent to switch between the two representations.
(It's also not always a win: it can work really well if you primarily operate on the 'columns', and on each column more or less once per update loop, but otherwise you can run into memory bandwidth limitations. For example, games with a lot of heavily interacting systems and an entity list that doesn't fit in cache will probably be better off with trying to load and update each entity exactly once per loop. Factorio is a good example of a game which is limited by this, though it is a bit of an outlier in terms of simulation size.)
bunderbunder 5 hours ago [-]
Meh. I've tried "SIMD magic wand" tools before, and found them to be verschlimmbessern (German: "improvements" that make things worse).
At least on the scientific computing side of things, having the way the code says the data is organized match the way the data is actually organized ends up being a lot easier in the long run than organizing it in a way that gives frontend developers warm fuzzies and then doing constant mental gymnastics to keep track of what the program is actually doing under the hood.
I think it's probably like sock knitting. People who do a lot of sock knitting tend to use double-pointed needles. They take some getting used to and look intimidating, though. So people who are just learning to knit socks tend to jump through all sorts of hoops and use clever tricks to allow them to continue using the same kind of knitting needles they're already used to. From there it can go two ways: either they get frustrated, decide sock knitting is not for them, and go back to knitting other things; or they get frustrated, decide magic loop is not for them, and learn how to use double-pointed needles.
djmips 5 hours ago [-]
Very much agree and love your analogy but there is a third option - make a sock knitting machine.
nonameiguess 6 hours ago [-]
I'm not a game dev, but what's a straightforward way of adjusting some channel of a pixel at coordinate X,Y without indexing the underlying raster array? Iterators are fine when you want to perform some operation on every item in a collection but that is far from the only thing you ever might want to do with a collection.
maccard 4 hours ago [-]
Game dev here. If you’re concerned about performance the only answer to this is a pixel shader, as anything else involves either cpu based rendering or a texture copy back and forth.
fc417fc802 1 hours ago [-]
A compute shader could update some subset of pixels in a texture. It's on the programmer to prevent race conditions though. However that would again involve explicit indexing.
In general I think GP is correct. There is some subset of problems that absolutely requires indexing to express efficiently.
5 hours ago [-]
ChadNauseam 6 hours ago [-]
This is getting downvoted but it's kind of true. Indexing collections all the time usually means you're not using iterators enough. (Although iterators become very annoying for fallible code that you want to return a Result, so sometimes it's cleaner not to use them.)
However this problem does still come up in iterator contexts. For example Iterator::take takes a usize.
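The fallible-iterator pattern alluded to above can be made less annoying than it first appears: `collect()` can gather an iterator of `Result`s into a single `Result`, short-circuiting on the first error. `parse_all` is an illustrative name.

```rust
use std::num::ParseIntError;

// Parse every item, stopping at the first failure.
pub fn parse_all(items: &[&str]) -> Result<Vec<i32>, ParseIntError> {
    items.iter().map(|s| s.parse::<i32>()).collect()
}

fn main() {
    println!("{:?}", parse_all(&["1", "2", "3"])); // Ok([1, 2, 3])
    println!("{}", parse_all(&["1", "oops"]).is_err()); // true
}
```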
bunderbunder 6 hours ago [-]
An iterator works if you're sequentially visiting every item in the collection, in the order they're stored. It's terrible if you need random access, though.
Concrete example: pulling a single item out of a zip file, which supports random access, is O(1). Pulling a single item out of a *.tar.gz file, which can only be accessed by iterating it, is O(N).
cantrecallmypwd 2 hours ago [-]
History lesson for the cheap seats in the back:
Compressed tars are terrible for random access because the compression occurs after the concatenation and so knows nothing about inner file metadata, but it's good for streaming and backups. Uncompressed tars are much better for random access. (Tar was a used as a backup mechanism to tape (tape archive).)
Zips are terrible for streaming because their metadata is stored at the end, but are better for 1-pass creation and on-disk random access. (Remember that zip files and programs were created in an era of multiple floppy disk-based backups.)
When fast tar enumeration is desired, at the cost of compatibility and compression potential, it might be worth compressing files and then tarring them, when and if zipping alone isn't achieving enough compression and/or decompression performance. FUSE compressed tar mounting gets to be really expensive with terabyte archives.
fc417fc802 54 minutes ago [-]
> compressing files and then taring them
Just use squashfs if that is the functionality that you need.
kevincox 6 hours ago [-]
While you maybe "shouldn't" be indexing collections often (which I also don't agree with, there is a reason that we have more collections then linked lists, lookup is important) even just getting the size of a collection which is often very related to business logic can be quite annoying.
AndrewDucker 5 hours ago [-]
For data that needs to be looked up mostly I want a hashtable. Not always, but mostly. It's rare that I want to look up something but its position in a list.
Starlevel004 5 hours ago [-]
The actual problem with this is how to add it without breaking type inference for literal numbers.
lynndotpy 4 hours ago [-]
What I mean is, I want to be able to use i32/i64/u32/u64/f32/f64s interchangeably, including (and especially!) in libraries I don't own.
I'm usually working with positive values, and almost always with values within the range of integers f32 can safely represent (+- 16777216.0).
I want to be able to write `draw(x, y)` instead of `draw(x as u32, y as u32)`. I want to write "3" instead of "3.0". I want to stop writing "as".
It sounds silly, but it's enough to kill that gamedev flow loop. I'd love if the Rust compiler could (optionally) do that work for me.
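One partial workaround that exists today, as a sketch: a hypothetical `draw` taking `impl Into<f64>` accepts `i32`, `u32`, and `f32` arguments with no `as` at the call site. Note this only covers lossless widening; `From<i32>` for `f32` doesn't exist in std, so the i32-to-f32 case still needs `as`, which is exactly the ergonomic gap being described.

```rust
// Accept any number that converts losslessly to f64, so call sites
// can write draw(3, y) instead of draw(3 as f64, y as f64).
fn draw(x: impl Into<f64>, y: impl Into<f64>) -> (f64, f64) {
    (x.into(), y.into())
}

fn main() {
    let xi: i32 = 3;
    let yf: f32 = 4.5;
    // Mixed integer and float arguments, no casts in sight.
    println!("{:?}", draw(xi, yf)); // prints (3.0, 4.5)
}
```

This also only helps in libraries you own; it doesn't fix third-party APIs that take a bare `u32`.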
The fact that people love the language is an unexpected downside. In my experience the rust ecosystem has an insanely high churn rate. Crates are often abandoned seemingly for no reason, often before even hitting 1.0. My theory is this is because people want to use rust primarily, the domain problem is just a challenge, like a level in a game. Once all the fun parts are solved, they leave it for dead.
Conversely and ironically, this is why I love Go. The language itself is so boring and often ugly, but it just gets out of the way and has the best in class tooling. The worst part is having seen the promised land of eg Rust enums, and not having them in other langs.
meindnoch 5 hours ago [-]
This.
Feeling passionate about a programming language is generally bad for the products made with that language.
5 hours ago [-]
bmitc 5 hours ago [-]
I find it interesting how the software industry has done everything it can to ignore F#. This is me just lamenting how I always come back to it as the best general purpose language.
klabb3 2 hours ago [-]
Huh? Usually languages that are "ignored" turn out to be that way for reasons such as poor or proprietary tooling. As an ignorant bystander, how are things like cross compilation, the package manager and associated infrastructure, async IO (epoll, io_uring, etc.), platform support, runtime requirements, FFI support, the language server, and so on?
Are a majority of these available with first-party (or best-in-class) integrated tooling that is trivial to set up on all big three desktop platforms?
For instance, can I compile an F# lib to an iOS framework, ideally with automatically generated bindings for C, C++ or Objective C? Can I use private repo (ie github) urls with automatic overrides while pulling deps?
Generally, the answer to these questions for – let's call them "niche" – languages is "there is a GitHub project with 15 stars, last updated 3 years ago, that maybe solves that problem".
There are tons of amazing languages (or at the very least, underappreciated language features) that didn’t ”make it” because of these boring reasons.
My entire point is that the older and grumpier I get, the less the language itself matters. Sure, I hate it when my favorite elegant feature is missing, but at the end of the day it’s easy to work around. IMO the navel gazing and bikeshedding around languages is vastly overhyped in software engineering.
zxvkhkxvdvbdxz 1 hours ago [-]
The F# compiler is cross-OS and supports cross compilation (dotnet build --runtime xxx); it's packaged in most Linux distros as dotnet.
promiseofbeans 3 hours ago [-]
One of the smartest devs I know built his game from scratch in C. Pretty complex game too - 3D open-world management game. It's now successful on steam.
Thing is, he didn't make the game in C. He built his game engine in C, and the game itself in Lua. The game engine is specific to this game, but there's a very clear separation where the engine ends and the game starts. This has also enabled amazing modding capabilities, since mods can do everything the game itself can do. Yes they need to use an embedded scripting language, but the whole game is built with that embedded scripting language so it has APIs to do anything you need.
It probably supports Linux via Proton. Done. That was the official Valve recommendation a few years ago; not sure if it's still active.
Rohansi 1 hours ago [-]
Most likely because they don't use Linux. Or because it's kind of a mine field to support with bugs that occur on different distros. Even Unity has their own struggles with Linux support.
They're distributing their game on Steam too so Linux support is next to free via Proton.
fc417fc802 51 minutes ago [-]
> it's kind of a mine field to support with bugs that occur on different distros
Non-issue. Pick a single blessed distro. Clearly state that it's the only configuration that you officially support. Let the community sort the rest out.
nu11ptr 7 hours ago [-]
I did the same for my project and moved to Go from Rust. My iteration is much faster, but the code a bit more brittle, esp. for concurrency. Tests have become more important.
Still, given the nature of what my project is (APIs and basic financial stuff), I think it was the right choice. I still plan to write about 5% of the project in Rust and call it from Go, if required, as there is a piece of code that simply cannot be fast enough, but I estimate for 95% of the project Go will be more than fast enough.
klabb3 5 hours ago [-]
> but the code a bit more brittle, esp. for concurrency
Obligatory ”remember to `go run -race`”, that thing is a life saver. I never run into difficult data races or deadlocks and I’m regularly doing things like starting multiple threads to race with cancelation signals, extending timeouts etc. It’s by far my favorite concurrency model.
nu11ptr 4 hours ago [-]
Yep, I do use that, but after getting used to Rust's Send/Sync traits it feels wild and crazy there are no guardrails now on memory access between threads. More a feel thing than reality, but I just find I need to be a bit more careful.
akkad33 6 hours ago [-]
Is calling Rust from Go fast? Last time I checked the interface between C and Go is very slow
nu11ptr 5 hours ago [-]
No, it is not all that fast after the CGo call marshaling (Rust would need to compile to the C ABI). I would essentially call into Rust to start the code, run it in its own thread pool, and then call into Rust again to stop it. The time to start and stop doesn't really matter, as this is code that runs from minutes to hours and is embarrassingly parallel.
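A sketch of the Rust side of such a boundary: built with `crate-type = "cdylib"`, functions marked `extern "C"` present a plain C ABI that cgo can declare and call. `compute_sum` is an illustrative name, not from the actual project.

```rust
// Exposed over the C ABI; Go would declare this via cgo and call it
// like any C function.
#[no_mangle]
pub extern "C" fn compute_sum(data: *const f64, len: usize) -> f64 {
    // Safety contract: the caller passes a valid pointer to `len` f64s.
    let slice = unsafe { std::slice::from_raw_parts(data, len) };
    slice.iter().sum()
}

fn main() {
    // Callable from Rust too, which keeps the sketch testable.
    let v = [1.0, 2.0, 3.0];
    println!("{}", compute_sum(v.as_ptr(), v.len())); // prints 6
}
```

The start/stop pattern described above would just be two more such exported functions, with the per-call marshaling cost amortized over minutes or hours of work.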
spiffyk 6 hours ago [-]
I have no experience with FFI between C and Go, could anyone shed some light on this? They are both natively compiled languages – why would calls between them be much slower than any old function call?
atombender 3 hours ago [-]
There are two reasons:
• Go uses its own custom ABI and resizeable stacks, so there's some overhead to switching where the "Go context" must be saved and some things locked.
• Go's goroutines are a kind of preemptive green thread where multiple goroutines share the same OS thread. When calling C, the goroutine scheduler must jump through some hoops to ensure that this caller doesn't stall other goroutines on the same thread.
Calling C code from Go used to be slow, but over the last 10 years much of this overhead has been eliminated. In Go 1.21 (which came with major optimizations), a C call was down to about 40ns [1]. There are now some annotations you can use to further help speed up C calls.
And a P/Invoke call can be as cheap as a direct C call, at 1-4 ns.
In Unity, Mono and/or IL2CPP's interop mechanism also ends up in the ballpark of direct call cost.
fsmv 5 hours ago [-]
There's some type translation and the Go runtime needs to turn some things off before calling out to C
dralley 6 hours ago [-]
Rust is no different from C in that respect.
dangoodmanUT 6 hours ago [-]
it's reasonably fast now
palata 5 hours ago [-]
> I still plan to write about 5% of the project in Rust and call it from Go, if required
And chances are that it won't be required.
ryanisnan 7 hours ago [-]
This seems like the right call. When it comes to projects like these, efficiency is almost everything. Speaking about my own experiences, when I hit a snag in productivity in a project like this, it's almost always a death-knell.
I too have a hobby-level interest in Rust, but doing things in Rust is, in my experience, almost always just harder. I mean no slight to the language, but this has universally been my experience.
mikepurvis 6 hours ago [-]
The advantages of correctness, memory safety, and a rich type system are worth something, but I expect it's a lot less when you're up against the value of a whole game design ecosystem with tools, assets, modules, examples, documentation, and ChatGPT right there to tell you how it all fits together.
Perhaps someday there will be a comparable game engine written in Rust, but it would probably take a major commercial sponsor to make it happen.
ryanisnan 6 hours ago [-]
One of the challenges I never quite got over completely was that I was always fighting Rust fundamentals, which tells me I never fully assimilated into thinking like a Rustacean.
This was more of a me-problem, but I was constantly having to change my strategy to avoid fighting the borrow-checker, manage references, etc. In any case, it was a productivity sink.
bionhoward 54 seconds ago [-]
Personally, I don’t think of it as fighting, more like “compiler assistance” —
you want to make some change, so you adjust a struct or a function signature, and then your IDE highlights all the places where changes are necessary with red squigglies.
Once you’re done playing whack-a-mole with the red squigglies, and tests pass, you know there’s no weird random crash hiding somewhere
mikepurvis 5 hours ago [-]
I bet, and that's particularly difficult when so much of modern game dev is just repeating extremely well-worn patterns— moving entities around and providing for scripted and emergent interactions between those entities and the player(s).
That's not to say that games aren't a very cool space to be in, but the challenges have moved beyond the code. Particularly in the indie space, for 10+ years it's been all about story, characters, writing, artwork, visual identity, sound and music design, pacing, unique gameplay mechanics, etc. If you're making a game in 2025 and the hard part is the code, then you're almost certainly doing it wrong.
peterashford 4 hours ago [-]
This was my experience with Rust. I've bounced off it a few times and I think I've decided its just not for me.
wavemode 6 hours ago [-]
It is a question of tradeoffs. Indie studios should be happy to trade off some performance in exchange for more developer productivity (since performance is usually good enough anyway in an indie game, which usually don't have millions of entities, meanwhile developer productivity is a common failure point).
noelwelsh 5 hours ago [-]
Not a game dev, but thought I'd mess around with Bevy and Rust to learn a bit more about both. I was surprised that my code crashed at runtime due to basics I expected the type system to catch. The fancy ECS system may be great for AAA games, but it breaks the basic connections between data and use that type systems rely on. I felt that Bevy was, unfortunately, the worst of both worlds: slow iteration without safety.
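A contrived sketch of the underlying issue (not Bevy's actual API, and the names are made up): an entity handle is just an integer, so the type system can't tie it to the data it names or keep it valid across removals.

```rust
// Illustrative only: the "indices as references" pattern and why it
// loses static safety. Nothing connects `target` to the Vec it came
// from, and nothing keeps it valid when the Vec changes.

struct Enemy { hp: i32 }

// Returns true when the saved index no longer resolves to anything.
fn stale_after_removal() -> bool {
    let mut enemies = vec![Enemy { hp: 10 }, Enemy { hp: 20 }];
    let target = 1;            // "reference" to the second enemy
    enemies.swap_remove(0);    // compiles fine, silently invalidates `target`
    // The borrow checker can't flag this: at best you get a runtime
    // miss, at worst the index now names a different entity entirely.
    enemies.get(target).is_none()
}
```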
rellfy 32 minutes ago [-]
I've always liked the concept of ECS, but I agree with this, although I have very limited experience with Bevy. If I were to write a game in Rust, I would most likely not choose ECS and Bevy for two reasons: 1. Bevy will have lots of breaking changes, as pointed out in the post, and 2. ECS is almost always not required -- you can make performant games without it, and with your own engine you retain full control over breaking changes and API design compromises.
I think all posts I have seen regarding migrating away from writing a game in Rust were using Bevy, which is interesting. I do think Bevy is awesome and great, but it's a complex project.
A friend of mine wrote an article 25+ years ago about using C++-based scripting (it compiled down to C++). My friend is a super smart engineer, but I don't think he was thinking of those poor scripters who would have to wait on iteration times. Granted, 25 years ago the teams were small, but nowadays the number of scripters you would have on an AAA game is probably a dozen, if not two or three dozen, or even more!
Imagine all of them waiting on a compile, or trying to deal with correctness, etc.
ChadNauseam 6 hours ago [-]
I love Bevy, but Unity is a weapon when it comes to quickly iterating and making a game. I think the Bevy developers understand that they have a long way to go before they get there. The benefits of Bevy (code-first, Rust, open source) still make me prefer it over Unity, but Unity is ridiculously batteries-included.
Many of the negatives in the post are positives to me.
> Each update brought with it incredible features, but also a substantial amount of API thrash.
This is highly annoying, no doubt, but the API now is just so much better than it used to be. Keeping backwards compatibility is valuable once a product is mature, but like how you need to be able to iterate on your game, game engine developers need to be able to iterate on their engine. I admit that this is a debuff to the experience of using Bevy, but it also means that the API can actually get better (unlike Unity which is filled with historical baggage, like the Text component).
k__ 7 hours ago [-]
Good for them.
From a dev perspective, I think Rust and Bevy are the right direction, but after reading this account, Bevy probably isn't there yet.
For a long time, Unity games felt sluggish and bloated, but somehow they got that fixed. I played some games lately that run pretty smoothly on decade old hardware.
byearthithatius 7 hours ago [-]
Love to have this comparison analysis. Huge LOC difference between Rust and C# (64k -> 17k!!!) though I am sure that is mostly access to additional external libraries that did things they wrote by hand in Rust.
bob1029 6 hours ago [-]
> I am sure that is mostly access to additional external libraries that did things they wrote by hand in Rust
This is the biggest reason I push for C#/.NET in "serious business" where concerns like auditing and compliance are non-negotiable aspects of the software engineering process. Virtually all of the batteries are included already.
For example, which 3rd party vendors we use to build products is something that customers in sectors like banking care deeply about. No one is going to install your SaaS product inside their sacred walled garden if it depends on parties they don't already trust or can't easily vet themselves. Microsoft is a party that virtually everyone can get on board with in these contexts. No one has to jump through a bunch of hoops to explain why the bank should trust System or Microsoft namespaces. Having ~everything you need already included makes it an obvious choice if you are serious about approaching highly sensitive customers.
bunderbunder 6 hours ago [-]
I worked in a regulated space at one time, and my understanding is that this is a big reason they chose .NET over Java. Java relies a lot more on third-party libraries, which makes getting things certified harder.
Log4shell was a good example of a relative strength of .NET in this area. If a comparable bug had happened in .NET's standard logging tooling, we likely would have seen all of the first-party .NET framework patched fairly shortly after, in a single coordinated release that we could upgrade to with minimal fuss. Meanwhile, at my current job we've still got standing exceptions allowing vulnerable versions of log4j in certain services because they depend on some package that still has a hard dependency on a vulnerable version, which in turn says it can't fix it yet because it's waiting on one of its transitive dependencies to fix it, and so on. We can (and do) run periodic audits to confirm that the vulnerable parts of log4j aren't being used, but being able to put the whole thing in the past within a week or two would be vastly preferable to still having to actively worry about it 5 years later.
The relative conciseness of C# code that the parent poster mentioned was also a factor. Just shooting from the hip, I'd guess that I can get the same job done in about 2/3 as much code when I'm using C# instead of Java. Assuming that's accurate, that means that with Java we'd have had 50% more code to certify, 50% more code to maintain, 50% more code to re-certify as part of maintenance...
CharlieDigital 6 hours ago [-]
Hugely underrated aspect of .NET. If a CVE surfaces, there's a team at Microsoft that owns the code and is going to patch and ship a fix.
mawadev 4 hours ago [-]
In sectors that are critical here in the EU, nobody allows C# and Microsoft due to licensing woes long-term. It's Java and FOSS all the way down. SaaS also is not a thing unless it runs on-prem.
dgellow 3 hours ago [-]
C# and Microsoft are in all critical places in Europe. What are you talking about
neonsunset 4 hours ago [-]
What kind of nonsense is this? EU is perfectly happy to use .NET-based languages as all of them, and the platform itself, are MIT (in fact, it's pretty popular out here).
CharlieDigital 7 hours ago [-]
C# is a very highly underrated (and oft-misunderstood) language that has become more terse as it has aged -- in a very good way. C#'s terseness has not come at the cost of its legibility; in fact, I feel like it enhances it in many cases.
> The maturity and vast amount of stable historical data for C# and the Unity API mean that tools like Gemini consistently provide highly relevant guidance.
This is also a highly underrated aspect of C# in that its surface area has largely remained stable from v1 (few breaking changes (though there are some valid complaints that surface from this with regards to keyword bloat!)). So the historical volume of extremely well-written documentation is a boon for LLMs. While you may get out-dated patterns (e.g. not using latest language features for terseness), you will not likely get non-working code because of the large and stable set of first party dependencies (whereas outdated 3rd party dependencies in Node often leads to breaking incompatibilities with the latest packages on NPM).
> It was also a huge boost to his confidence and contributed to a new feeling of momentum. I should point out that Blake had never written C# before.
Often overlooked with C# is its killer feature: productivity. Yes, when you get a "batteries included" framework and those "batteries" are quite good, you can be productive. Having a centralized repository for first party documentation is also a huge boon for productivity. When you have an extremely broad, well-written, well-organized standard library and first party libraries, it's very easy to ramp up productivity versus finding different 3rd party packages to fill gaps. Entity Framework, for example, feels miles better to me than Prisma, TypeORM, Drizzle, or any option on Node.js. Having first party rate limiting libraries OOB for web APIs is great for productivity. Same for having first party OpenAPI schema generators.
Less time wasted sifting through half-baked solutions.
> Code size shrank substantially, massively improving maintainability. As far as I can tell, most of this savings was just in the elimination of ECS boilerplate.
C# has three "super powers" to reduce code bloat which is its really rich runtime reflection, first-class expression trees, and Roslyn source generators to generate code on the fly. Used correctly, this can remove a lot of boilerplate and "templatey" code.
---
I make the case that many teams that outgrow JS/TS on Node.js should look to C# because of its congruence to TS[0] before Go, Java, Kotlin, and certainly not Rust.
C# has aged better, but I feel like Java 8 is approaching ANSI C levels of solid tooling. If only Swing wasn't so ugly. They should poach Raymond Chen to make Java 8 Remastered; I like his blog posts. There's probably a DOS joke in there. Also, they should just use the JavaFX namespace so I don't have to change my code, and I want the lawyer here to laugh too.
quotemstr 5 hours ago [-]
> Java 8
Why would you use Java 8?
atombender 2 hours ago [-]
C# is a great language, but it's been hampered by its slow transition towards AOT.
My understanding (not having used it much, precisely because of this) is that AOT is still quite lacking; not very performant and not so seamless when it comes to cross-platform targeting. Do you know if things have gotten better recently?
I think that if Microsoft had dropped the old .NET platform (the CLR and so on) sooner and really nailed the AOT experience, they might have had a chance at competing with Go and even Rust and C++ for some things, but I suspect that ship has sailed, as it has for languages like D and Nim.
neonsunset 2 hours ago [-]
C# (well, .NET, because that's what does JIT/AOT compilation of the bytecode) is not transitioning to AOT. NativeAOT is just one of the ways to publish .NET applications for scenarios where it is desirable. Having JIT is a huge boon to a number of scenarios too, for example it is basically impossible to implement a competitive Regex engine with JIT compilation for the patterns in Go (aside from other limitations like not having SIMD primitives).
throw_m239339 6 hours ago [-]
> C# is a very highly underrated (and oft misunderstood) language that has become more terse as it has aged -- in a very good way. C#'s terseness has not come at the cost of its legibility and in fact, I feel like enhances it in many cases.
C# and .NET are one of the most mature platforms for development of all kinds. It's just that, online, it carries some sort of anti-Microsoft stigma...
But a lot of AA or indie games are written in C# and they do fine. It's not just C++ or Rust in that industry.
People tend to be influenced by opinions online, but often the real world is completely different. I've been using C# for a decade now and it's one of the most productive languages I have ever used: easy to set up, powerful toolchains... and yes, there are a lot of closed-source libs in the .NET ecosystem, but the open-source community is large too by now.
CharlieDigital 6 hours ago [-]
> People tend to be influenced by opinions online but often the real world is completely different.
Unfortunately, my experience has been that C#'s lack of popularity online translates into a lot of misunderstandings about the language and thus many teams simply do not consider it.
Some folks still think it's Windows-only. Some folks think you need to use Visual Studio. Some think it's too hard to learn. Lots of misconceptions lead to teams overlooking it for more "hyped" languages like Rust and Go.
bob1029 5 hours ago [-]
You don't need to use Visual Studio, but it really makes a difference in the overall experience.
I think there may also be some misunderstandings regarding the purchase models around these tools. Visual Studio 2022 Professional can be purchased outright for $500 [0] and used perpetually. You do NOT need a subscription. I've got a license key printed on paper that I can use to activate my copy each time.
Imagine a plumber or electrician spending time worrying about the ideological consequences of purchasing critical tools that cost a few hundred dollars.
> Imagine a plumber or electrician spending time worrying about the ideological consequences of purchasing critical tools that cost a few hundred dollars.
That's just the way it is, especially with startups, who I think would benefit the most from C# because -- believe it or not -- I actually think that most startups would be able to move faster with C# on the backend than with TypeScript.
dicytea 5 hours ago [-]
> Some folks think you need to use Visual Studio
How's the LSP support nowadays? I remember reading a lot of complaints about how badly done the LSP is compared to Visual Studio.
CharlieDigital 5 hours ago [-]
Pretty good.
I started using Visual Studio Code exclusively around 2020 for C# work and it's been great. Lightweight and fast. I did try Rider and 100% it is better if you are open to paying for a license and if you need more powerful refactoring, but I find VSC to be perfectly usable and I prefer its "lighter" feel.
nh2 7 hours ago [-]
The article says it's 64k -> 17k.
byearthithatius 5 hours ago [-]
Updated, good catch haha
Ygg2 6 hours ago [-]
That's not unexpected; they went from Bevy, which is more of a game framework than a proper game engine.
I mean, you could also write about how we went from 1M lines of C# in our mostly custom engine to 10k in Unreal C++.
taylorallred 6 hours ago [-]
I love Rust and wanted to use it for gamedev but I just had to admit to myself that it wasn't a good fit. Rust is a very good choice for user space systems level programming (ie. compilers, proxies, databases etc.). For gamedev, all of the explicitness that Rust requires around ownership/borrowing and types tends to just get in the way and not provide a lot of value. Games should be built to be fast, but the programmer should be able to focus almost completely on game logic rather than low-level details.
yyyk 6 hours ago [-]
GC isn't a big problem for many types of apps/games, and most games don't care about memory safety. Rust's advantages aren't so important in this domain, while its complexity remains. No surprise he prefers C# for this.
maccard 4 hours ago [-]
Disagree on both points. Anyone who has shipped a game in Unity has dealt with object pooling, flipping to structs instead of classes, avoiding string interpolation, and replacing idiomatic APIs with out parameters of reused collections.
Similarly, anyone who has shipped a game in Unreal will know that memory issues are absolutely rampant during development.
But, the cure rust presents to solve these for games is worse than the disease it seems. I don’t have a magic bullet either..
Rohansi 49 minutes ago [-]
This is a mostly Unity-specific issue. Unity unfortunately has a potato for a GC. This is not even an exaggeration - it uses Boehm GC. Unity does not support Mono's better GC (SGen). .NET has an even better GC (and JIT) that Unity can't take advantage of because they are built on Mono still.
Other game engines exist which use C# with .NET or at least Mono's better GC. When using these engines a few allocations won't turn your game into a stuttery mess.
Just wanted to make it clear that C# is not the issue -- the engine most people use, including the subject of this thread, is.
loeg 5 hours ago [-]
Not just GC -- performance in general is a total non-issue for a 2d tile-based game. You just don't need the low-level control that Rust or C++ gives you.
trealira 1 hours ago [-]
I wouldn't say it's a non-issue. I've played 2D tile-based, pixel art games where the framerate dropped noticeably with too many sprites on screen, even though it felt like a 3DS should have been able to run them, and my computer isn't super low-end, either. You have more leeway, but it's possible to make badly optimized 2D games to the point where performance becomes an issue again.
palata 5 hours ago [-]
Except that C# is memory safe.
foderking 5 hours ago [-]
great summary
rorylaitila 35 minutes ago [-]
For my going-on-5-year side game project, this is why I can only write in vanilla tools (Java, TypeScript) and with small libraries that are easy to replace. I would lose all motivation if I had to refactor my game and update the engine every time I came back to it. But also, I don't have the pressure of ever finishing the game...
ezekiel68 3 hours ago [-]
This is a personal project that had the specific goal of the person's brother, who was not a coder, being able to contribute to the project. On top of that, they felt the need to continuously upgrade to the latest version of the underlying game engine instead of locking to a version.
I have worked as a professional dev at game studios many would recognize. Those studios which used Unity didn't even upgrade Unity versions often unless a specific breaking bug got fixed. Same for those studios which used DirectX. Often a game shipped with a version of the underlying tech that was hard locked to something several years old.
The other points in the article are all valid, but the two factors above held the greatest weight as to why the project needed to switch (and the article says so -- it was an API change in Bevy that was "the straw that broke the camel's back").
> I failed to fairly evaluate my options at the start of the project.
The more projects I do, the more time I find that I dedicate to just planning things up front. Sometimes it's fun to just open a game engine and start playing with it (I too have an unfair bias in this area, but towards Godot [https://godotengine.org/]), but if I ever want to build something to release, I start with a spreadsheet.
gh0stcat 6 hours ago [-]
Do you think you needed to have those times to play around in the engine? Can a beginner possibly even know what to plan for if they don't fully understand the game engine itself? I am older so I know the benefits of planning, but I sometimes find that I need to persuade myself to plan a little less, just to get myself more in tune with the idioms and behaviors of the tool I am working in.
_QrE 6 hours ago [-]
I think even if you don't have much experience with tools, you can still plan effectively, especially now with LLMs that can give you an idea of what you're in for.
But if you're doing something for fun, then you definitely don't need much planning, if any - the project will probably be abandoned halfway through anyways :)
999900000999 6 hours ago [-]
Unity is still probably the best game engine for smaller games with Unreal being better for AAA.
The problem is you make a deal with the devil. You end up shipping a binary full of phone home spyware, if you don't use Unity in the exact way the general license intends they can and will try to force you into the more expensive industrial license.
However, the ease of actually shipping a game can't be matched.
Godot has a bunch of issues all over the place, a community more intent on self praise than actually building games. It's free and cool though.
I don't really enjoy Godot like I enjoy Unity , but I've been using Unity for over a decade. I might just need to get over it.
excerionsforte 6 hours ago [-]
I love Rust, but I would not try to make a full-fledged game with it without patience. This post is not so much about moving away from Rust as it is about Bevy not being enjoyable in its current form.
Bevy is in its early stages; I'm sure more Rust game engines will come up and make things easier. That said, Godot was a great experience for me but doesn't run well on mobile for what I was making. I enjoy using Flutter Flame now (honestly, different game engines for different genres or preferences), but as Godot continues to get better, I would personally use Godot. I'd try Unity or Unreal as well if I just wanted to focus on making a game and less on engine quirks and bugs.
meisel 3 hours ago [-]
Aren't there some scripting languages designed around seamless interop with Rust that could be used here for scripting/prototyping? Not that it would fix all the issues in that blog post, but maybe some of them.
mamcx 5 hours ago [-]
This can be summarized in a simple way: UI is totally another world.
There is no chance for any language, no matter how good it is, to match even the most horrendous (the web!) but full-featured UI toolkit.
I bet, 1000%, that it is easier to build an OS, a database engine, etc. than to match Qt, Delphi, Unity, etc.
---
I made a decision that has become the most productive and problem-free approach to making UIs in my 30 years doing this:
1- Use the de-facto UI toolkit as-is (HTML, SwiftUI, Jetpack Compose). Ignore any tool that promises cross-platform UI (so that leaves HTML, but I mean: I don't try to do HTML in Swift, OK?).
2- Use the same idea as HTML: send plain data with the full fidelity of what you want to render: Label(text=.., size=..).
3- Render it directly from the native UI toolkit.
Yes, this is more or less htmx/tailwindcss (I got the inspiration from them).
This means my logic is pure Rust: I pass serializable structs to the UI front-end and render directly from them. Critically, the UI toolkit is nearly devoid of any logic more complex than what you see in a mustache template language. It does not do localization, formatting, etc. Only UI composition.
I don't care that I need to code in different ways, with different APIs, different flows, and visually divergent UIs.
IT'S GREAT.
After the pain of the boilerplate, doing the next screen/component/whatever is so ridiculously simple that it feels like cheating.
So, the problem is not Rust. It's not F# or Lisp. It's that UI is a kind of beast that is impervious to being improved by language alone.
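A minimal sketch of this pattern (all names here are illustrative, not from any actual framework): the Rust core produces plain view-model structs, and the native UI layer only composes them, holding no logic of its own.

```rust
// Illustrative view-model types the native UI layer renders as-is.
#[derive(Debug, Clone)]
enum Widget {
    Label { text: String, size: u32 },
    Button { text: String },
}

// Localization, formatting, business rules -- all of it happens here,
// in Rust, before anything reaches the UI toolkit.
fn build_screen(user: &str, unread: usize) -> Vec<Widget> {
    vec![
        Widget::Label { text: format!("Hello, {user}"), size: 24 },
        Widget::Label { text: format!("{unread} unread messages"), size: 14 },
        Widget::Button { text: "Refresh".to_string() },
    ]
}
```

In a real app these structs would be serialized (or passed over FFI) to each platform's front-end, which maps them 1:1 onto native widgets, mustache-template style.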
peterashford 5 hours ago [-]
I disagree. The issue, which the article mentions, is iteration time. They were having issues iterating on gameplay, not UI.
My own experiences with game dev and Rust (which are separate experiences, I should add) resonate with what the article is expressing. Iterating systems is common in gamedev and Rust is slow to iterate because its precision ossifies systems. This is GREAT for safety, it's crap for momentum and fluidity
maccard 4 hours ago [-]
This is why game engines embedded scripting languages. Who gives a crap if the engine takes 12 hours to compile if 80% of the team are writing lua in a hot reload loop.
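Even without embedding Lua, the hot-reload idea can be sketched in plain data: gameplay tunables live in a text file the engine re-reads whenever it changes, so tweaking values skips the recompile. (The file format and all names below are made up for illustration.)

```rust
use std::{fs, time::SystemTime};

struct Tunables { player_speed: f32 }

// Parse "key=value" lines, falling back to defaults for anything else.
fn parse(src: &str) -> Tunables {
    let mut t = Tunables { player_speed: 1.0 };
    for line in src.lines() {
        if let Some(v) = line.strip_prefix("player_speed=") {
            if let Ok(v) = v.trim().parse::<f32>() { t.player_speed = v; }
        }
    }
    t
}

// Called once per frame: cheap mtime check, reload only on change.
fn reload_if_changed(path: &str, last: &mut SystemTime, t: &mut Tunables) {
    if let Ok(meta) = fs::metadata(path) {
        if let Ok(modified) = meta.modified() {
            if modified > *last {
                if let Ok(src) = fs::read_to_string(path) {
                    *t = parse(&src);
                    *last = modified;
                }
            }
        }
    }
}
```

A real scripting layer adds behavior, not just values, but the iteration-speed win comes from the same place: the engine binary never changes.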
peterashford 3 hours ago [-]
Yeah but no-one is recompiling the engine. This is just about gameplay code
maccard 3 hours ago [-]
Which is why I said
> this is why game engines embedded scripting languages
api 5 hours ago [-]
> I bet, 1000%, that is easier to do a OS, a database engine, etc that try to match QT, Delphi, Unity, etc.
I 100% agree. A modern mature UI toolkit is at least equivalent to a modern game engine in difficulty. GitHub is strewn with the corpses of abandoned FOSS UI toolkits that got 80% of the way there only to discover that the other 20% of the problem is actually 20000% of the work.
The only way you have a chance developing a UI toolkit is to start in full self awareness of just how hard this is going to be. Saying "I am going to develop a modern UI toolkit" is like saying "I am going to develop a complete operating system."
Even worse: a lot of the work that goes into a good UI toolkit is the kind of work programmers hate: endless fixing of nit-picky edge case bugs, implementation of standards, and catering to user needs that do not overlap with one's own preferences.
cube2222 6 hours ago [-]
That's an excellent article - it's great when people share not only their victories, but mistakes, and what they learned from them.
That said regarding both rapid gameplay mechanic iteration and modding - would that not generally be solved via a scripting language on top of the core engine? Or is Rust + Bevy not supposed to be engine-level development, and actually supposed to solve the gameplay development use-case too? This is very much not my area of expertise, I'm just genuinely curious.
skeptrune 6 hours ago [-]
>I wanted UI to be easy to build, fast to iterate, and moddable. This was an area where we learned a lot in Rust and again had a good mental model for comparison.
I feel like this harkens to the general principle of being a software developer and not an "<insert-language-here>" developer.
Choose tools that expose you to more patterns and help to further develop your taste. Don't fixate on a particular syntax.
nrvn 5 hours ago [-]
> Bevy is young and changes quickly. Each update brought with it incredible features, but also a substantial amount of API thrash
> Bevy is still in the early stages of development. Important features are missing. Documentation is sparse. A new version of Bevy containing breaking changes to the API is released approximately once every 3 months.
I would choose Bevy if and only if I would like to be heavily involved in the development of Bevy itself.
And never for anything that requires a steady foundation.
Programming language does not matter. Choose the right tool for job and be pragmatic.
eYrKEC2 2 hours ago [-]
I like not getting paged at night, so I like APIs written in Rust.
DarkmSparks 2 hours ago [-]
The best language for game logic is Lua; switching to C# probably isn't going to help any... IMHO.
Rohansi 1 hours ago [-]
What makes Lua the best for game logic? You don't even have types to help you out with Lua.
trealira 1 hours ago [-]
Yeah, I actually recently tried making a game in Lua using LOVE2D, and then making the same one in C with Raylib, and I didn't feel like Lua itself gave me all that much. I don't think Lua is best for game logic so much as it's the easiest language to embed in a game written in C or C++. That said, maybe some of its unique features, like its coroutines, or stuff relating to metatables, could be useful in defining game logic. I was writing very boring, procedural, occasionally somewhat object-oriented code either way.
Rohansi 39 minutes ago [-]
Lua would definitely help with iteration times vs. C/C++/Rust but C# compiles very quickly. Especially in Unity where you have an editor that keeps assets cached and can hot reload code changes (with a plugin).
Coroutines can definitely be very useful for games and they're also available in C#.
chaosprint 4 hours ago [-]
I completely understand, and it's not the first time I've heard of people switching from Bevy to Unity. btw, Bevy 0.16 just came out, in case you missed the discussion.
In my personal opinion, a paradox of truly open-source projects (meaning community projects, not pseudo-open-source from commercial companies) is that development tends toward diversity. While this leads to more and more cool things appearing, it always needs to be balanced against sustainable development.
Commercial projects, at least, always have a clear goal: to sell. For this goal, they can hold off on doing really cool things, or they think about differentiated competition. Perhaps if the purpose were commercial, an editor would be the primary goal (let me know if this is already on the roadmap).
---
I don't think the language itself is the problem. The situation where you have to use mature solutions for efficiency is more common in games and apps.
For example, I've seen many people who have had to give up Bevy, Dioxus, and Tauri.
But I believe for servers, audio, CLI tools, and even agent systems, Rust is absolutely my first choice.
I've recently been rewriting Glicol (https://glicol.org) after 2 years, starting from embedded devices and switching to crates like Chumsky, and I feel the ecosystem has improved a lot compared to before.
So I still have 100% confidence in Rust.
hyllos 6 hours ago [-]
To what extent did the C# implementation benefit from the clarified requirements (so that the Rust experience could be seen as prototyping mixed with production)?
Was it, in large part, actually just a major refactor in a different language (admittedly with much more proven elements)?
lostmsu 4 hours ago [-]
Related: just tried to switch to Rust when starting a new project. The main motivation was the combination of fearless concurrency and exhaustive error handling - things that were very painful in the more mature endeavor.
Gave up after 3 days for 3 reasons:
1. Refactoring and IDE tooling in general are still lightyears away from JetBrains tooling and a few astronomical units away from Visual Studio. Extract function barely works.
2. Crates with non-Rust dependencies are nearly impossible to debug as debuggers don't evaluate expressions. So, if you have a Rust wrapper for Ogg reader, you can't look at ogg_file.duration() in the debugger because that requires function evaluation.
3. In contrast to .NET and NuGet ecosystem, non-Rust dependencies typically don't ship with precompiled binaries, meaning you basically have to have fun getting the right C++ compilers, CMake, sometimes even external SDKs and manually setting up your environment variables to get them to build.
With these roadblocks I would never have gotten the "mature" project to the point where dealing with hard-to-debug concurrency issues and funky unforeseen errors became necessary.
airstrike 2 hours ago [-]
Curious what kind of project that was. Were you making a GUI by any chance?
neonsunset 4 hours ago [-]
> 3. In contrast to .NET and NuGet ecosystem, non-Rust dependencies typically don't ship with precompiled binaries, meaning you basically have to have fun getting the right C++ compilers, CMake, sometimes even external SDKs and manually setting up your environment variables to get them to build.
Depending on your scenario, you may want either one or the other. Shipping pre-compiled binaries carries its own risks, and you are at the mercy of the library author to include one for your platform. I found wiring up MSBuild to be more painful than the way it is done in Rust with the cc crate; often I would prefer the package to also build its other-language components for my specific platform, with extra optimization flags I passed in.
But yes, in .NET it creates sort of an impedance mismatch since all the managed code assemblies you get from your dependencies are portable and debuggable, and if you want to publish an application for a specific new target, with those it just works, be it FreeBSD or WASM. At the same time, when it works - it's nicer than having to build everything from scratch.
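For what it's worth, the cc-crate approach mentioned above usually amounts to just a few lines in a build script. A minimal sketch, where the vendored `src/native/ogg_shim.c` file and the library name are hypothetical placeholders:

```rust
// build.rs -- compile a vendored C source and link it into the crate.
// Assumes the `cc` crate is listed under [build-dependencies] in Cargo.toml.
fn main() {
    cc::Build::new()
        .file("src/native/ogg_shim.c") // hypothetical vendored C file
        .opt_level(3)                  // the "extra optimization flags" knob
        .compile("ogg_shim");          // builds libogg_shim.a and links it
    // Rebuild only when the C source changes.
    println!("cargo:rerun-if-changed=src/native/ogg_shim.c");
}
```

The build still needs a working C toolchain on the machine, which is exactly the trade-off being discussed: the binary matches your platform and flags, but you pay for the environment setup.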
corysama 2 hours ago [-]
Professional high-performance C++ game engine dev here. At a glance, their game looks great. But, to be frank, it also looks like it could have been made in the DOS era with sufficient effort.
Going hard with Rust ECS was not the appropriate choice here. Even a 1000x speed hit would be preferable if it gained speed of development. C# and Unity is a much smarter path for this particular game.
But, that’s not a knock on Rust. It’s just “Right tool for the job.”
jokethrowaway 1 hour ago [-]
Congrats on the rewrite!
I think the worst issue was the lack of ready-made solutions. Those 67k lines of Rust contain a good chunk of a game engine.
The second worst issue was that you targeted an unstable framework - I would have focused on a single version and shipped the entire game with it, no matter how good the goodies in the new version were.
I know it's likely the last thing you want to do, but you might be in a great position to improve Bevy. I understand open sourcing it comes with IP challenges, but it would be good to find a champion with read access within Bevy to parse your code and come up with OSS packages (cleaned up with any specific game logic) based on the countless problems you must have solved in those extra 50k lines.
I rarely touch game dev but that made me think Godot wasn't very suitable
kllrnohj 6 hours ago [-]
> We wrote extensive pros and cons, emphasizing how each option fared by the criteria above: Collaboration, Abstraction, Migration, Learning, and Modding.
Would you really expect Godot to win out over Unity given those priorities? Godot is pretty awesome these days, but it's still going to be behind for those priorities vs. Unity or Unreal.
ziddoap 7 hours ago [-]
I also would have liked to have seen the pro/con lists for each of the potential choices.
I've been toying with the idea of making a 2d game that I've had on my mind for awhile, but have no game development experience, and am having trouble deciding where to start (obviously wanting to avoid the author's predicament of choosing something and having to switch down the line).
jerf 6 hours ago [-]
The key is, you gotta be pretty cold in the analysis. It's probably more important to avoid what you hate than to lean in too hard to what you love, unless your terminal goal is to work in $FAVE_LANG. Too many people claim they want to make a game, but their actions show that their terminal goal was actually to work in their favorite language. I don't care if your goal is just to work in your favorite language, I just think you need to be brutally honest with yourself on that front.
Probably the best thing in your case is, look at the top three engines you could consider, spend maybe four hours gather what look like pros and cons, then just pick one and go. Don't overestimate your attachment to your first choice. You'll learn more just in finishing a tutorial for any of them then you can possibly learn with analysis in advance.
elktown 4 hours ago [-]
This goes for a lot of things in tech, unfortunately. For example, being stuck in an SRE/devops amusement park can be incredibly frustrating and surprisingly resource-intensive.
Sometimes it feels like we could use some kind of a temperance movement, because if one can just manage to walk the line one can often reap great rewards. But the incentives seem to be pointing in the opposite direction.
ziddoap 5 hours ago [-]
Thanks, I appreciate the comment! I'm certain that my goal is not to work in a specific language, but to bring a long-time idea to life, and ideally minimize the amount of avoidable headaches along the way.
You're probably right that it'd be best to just jump in and get going with a few of them rather than analyze the choice to death (as I am prone to do when starting anything).
jryan49 6 hours ago [-]
One of the complaints in the article was using a framework early in it's dev cycle. I imagine they were just picking what is safe at that point and didn't want to get burned again.
GardenLetter27 7 hours ago [-]
I wondered the same - the separate C# build might be a bit of a hassle still though.
But they also could have combined Rust parts and C# parts if they needed to keep some of what they had.
cantrecallmypwd 3 hours ago [-]
API churn is so expensive, largely unnecessary, and rarely value-add. It's an anti-pattern that makes otherwise promising things unusable.
lovegrenoble 4 hours ago [-]
Why not the awesome Gamemaker engine?
morning-coffee 6 hours ago [-]
Expect many more commits like #12. ;)
nickkell 4 hours ago [-]
Awww that's not fair.
C# actually has fairly good null-checking now. Older projects would have to migrate some code to take advantage of it, but new projects are pretty much using it by default.
I'm not sure what the situation is with Unity though - aren't they usually a few versions behind the latest?
shadowgovt 6 hours ago [-]
Excellent write-up.
On the topic of rapid prototyping: most successful game engines I'm aware of hit this issue eventually. They solve it by dividing into infrastructure (implemented in your low-level language) and game logic / application logic / scripting (implemented in something far more flexible and usually interpreted; I've seen Lua used for this, as well as Python and JavaScript, and I think Unity's C# also fits this category).
For any engine that would have used C++ instead, I can't think of a good reason to not use Rust, but most games with an engine aren't written in 100% C++.
johng 5 hours ago [-]
I signed up for the mailing list. The game looks interesting, I hope there is a Mac version in the future.
adamnemecek 6 hours ago [-]
For anyone considering Rust for gamedev check out the Fyrox engine
Sorry, but this engine had (has?) problems rendering a simple rectangle with an alpha-channel texture no longer than 3 months ago (I'm assuming it was fixed).
Is it normal for the Rust ecosystem to recommend software at this level of maturity?
Rust is fine as a low-level systems programming language. It's a huge improvement over C and (because memory safety) a decent improvement over C++. However, most applications don't need a low-level systems programming language, and trying to shoehorn one where it doesn't belong just leads to sadness without commensurate benefit. Rust does not
* automatically make your program fast;
* eliminate memory leaks;
* eliminate deadlocks; or
* enforce your logical invariants for you.
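The leaks point is easy to demonstrate: safe Rust happily accepts a reference-counted cycle, which is then never freed. A minimal sketch:

```rust
use std::cell::RefCell;
use std::rc::Rc;

// A node that may hold a strong reference to another node.
struct Node {
    next: RefCell<Option<Rc<Node>>>,
}

fn main() {
    let a = Rc::new(Node { next: RefCell::new(None) });
    let b = Rc::new(Node { next: RefCell::new(Some(Rc::clone(&a))) });
    // Close the cycle: a -> b -> a. Both refcounts are now 2, and when
    // `a` and `b` go out of scope they only drop to 1, so neither
    // allocation is ever freed. Safe code, no warnings, genuine leak.
    *a.next.borrow_mut() = Some(Rc::clone(&b));
    assert_eq!(Rc::strong_count(&a), 2);
    assert_eq!(Rc::strong_count(&b), 2);
}
```

(`Rc::Weak` exists precisely to break such cycles, but the compiler does not force you to use it.)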
Sometimes people mention that independent of performance and safety, Rust's pattern-matching and its traits system allow them to express logic in a clean way at least partially checked at compile time. And that's true! But other languages also have powerful type systems and expressive syntax, and these other languages don't pay the complexity penalty inherent in combining safety and manual memory management because they use automatic memory management instead --- and for the better, since the vast majority of programs out there don't need manual memory management.
I mean, sure, you can Arc<Box<Whatever>> many of your problems away, but at that point your global reference counting just becomes a crude form of manual garbage collection. You'd be better off with a finely-tuned garbage collector instead, like the one Unity has (via the CLR and Mono).
And you're not really giving anything up this way either. If you have some compute kernel that's a bottleneck, thanks to easy FFIs these high-level languages have, you can just write that one bit of code in a lower-level language without bringing systems consideration to your whole program.
killme2008 2 hours ago [-]
I completely agree with you—Rust is not well-suited for application development. Application development requires rapid iteration, acceptable performance, and most importantly, a large developer community and a rich software ecosystem.
Languages like Go , JavaScript, C# or Java are much better choices for this purpose. Rust is still best suited for scenarios where traditional system languages excel, such as embedded systems or infrastructure software that needs to run for extended periods.
Jyaif 7 hours ago [-]
Very useful writeup, thank you for taking the time to do it.
PS: I love the art style of the game.
WesolyKubeczek 7 hours ago [-]
Somehow I can't read this with uBlock Origin on. Hm.
ryanisnan 7 hours ago [-]
Strange, I had no such issue.
nottorp 5 hours ago [-]
Me neither. Default uBlock Origin settings though, maybe the OP is more strict.
worik 6 hours ago [-]
Migrating away from Bevy is the main thrust.
Rust is a niche language, there is no evidence it is going to do well in the game space.
Unity and C# sound like a much better business choice for this. Choosing a system/language....
> My love of Rust and Bevy meant that I would be willing to bear some pain
....that is not a good business case.
Maybe one day there will be a Rust game engine that can compete with Unity, probably already are, in niches.
ninjis 6 hours ago [-]
The "Learning" point drives home a concern my brother-in-law and I were talking about recently. As LLMs become more entrenched as a tool, they may inevitably become the crutch that actually holds back innovation. Individuals and teams may be hesitant to explore or adopt bleeding edge technologies specifically because LLMs don't know about them or don't know enough about them yet.
How is that different from choosing not to adopt a technology because it’s not widely used therefore not widely documented? It’s the timeless mantra of “use boring tech” that seems to resurface every once in a while. It’s all about the goal: do you want to build a viable product, quickly, or do you want to learn and contribute to a specific tech stack? That’s the trade off most of the time.
Bolwin 6 hours ago [-]
It's a lot worse. A high quality project can have great documentation and guides that make it easy to use for a human, but an LLM won't until there's a lot of code and documents out there using it.
And if it's not already popular, that won't happen.
tptacek 5 hours ago [-]
No, this doesn't ring true: long before there were LLMs, people were selecting languages and stacks because of the quality and depth of their community.
But also: there is a lot of Rust code out there! And a cubic fuckload of high-quality written material about the language, its idioms, and its libraries, much of it pretty famous. I don't think this issue is as simple as it's being made out to be.
tayo42 5 hours ago [-]
Isn't this article an example of that? There might be a lot of Rust code, but if the APIs are changing frequently, it's all outdated and leads to unusable outputs.
no-dr-onboard 5 hours ago [-]
I see this quite a bit with Rust. I honestly cringe when people get up in arms about someone taking their project out of the rust community.
The same can be said of books as of programming languages:
"Not every ___ deserves to be read/used"
If the documentation or learning curve is so steep and/or convoluted that it discourages newcomers, then perhaps it's just not a language fit for widespread adoption. That's actually fine.
"Thanks for your work on the language, but this one just isn't for me"
"Thanks for writing that awfully long book, but this one just isn't for me"
There's no harm in saying either of those statements. You shouldn't be disparaged for saying that rust just didn't work out for your case. More power to the author.
bigfatkitten 5 hours ago [-]
Rust attracts a religious fervour that you'll almost never see associated with any other language. That's why posts like this make the front page and receive over 200 comments.
If you switched from Java to C# or vice versa, nobody would care.
J_Shelby_J 3 hours ago [-]
A religious fervor against it? No one in the comments is telling the OP he's wrong.
gregschlom 6 hours ago [-]
I was actually meaning to post this as an Ask HN question but never found the time to word it well. Basically: what happens to new frameworks and technologies in the age of widespread LLM-assisted coding? Will users be reluctant to adopt bleeding-edge tools because the LLMs can't assist with them as well? Will the companies behind the big frameworks put more resources toward documenting them in a way that makes it easy for LLMs to learn from?
n_ary 6 hours ago [-]
Actually, here in my corner of the EU, only prominent, big-tech-backed, well-documented, and battle-tested tools are marketable skills. React: 50 new jobs. You worked with Svelte/SolidJS: what is that? Java/PHP/Python/Ruby/JS: adequate jobs. Go/Rust/Zig/Crystal/Nim: what are these? Go has gained some popularity in recent years and I can spot Rust once in a blue moon, but anything requiring near-metal work is always C/C++.
Availability of documentation and tooling, widespread adoption, and access to people already trained on someone else's dime are what make a hiring decision feel safe. Sometimes narrow tech is spotted in the wild, usually because some senior/staff engineer wanted to experiment and it became part of production when management saw no issue. That can open doors for practitioners of those stacks, but the probability is akin to getting hit by a lightning strike.
binary132 6 hours ago [-]
This is just reality outside of the early stage startup. The US tech industry and its social networks are very dominated by trendy startup ideas, but the reality is still the major tried-and-true platforms.
timeon 6 hours ago [-]
Maybe it is not the regulations that are holding the EU back.
rad_gruchalski 6 hours ago [-]
Another way to look at it: working bleeding edge will become a competitive advantage and a signal to how competent the team is. „Do they consume it” vs „do they own it”.
n_ary 6 hours ago [-]
Or a signal that, someone did not think about the bus factor and future of the project when most of the teams jumped ship.
this_user 5 hours ago [-]
Constantly chasing the latest tech trends has probably done more harm than good, because more often than not, it turns out that the latest hype technology actually does not deliver what the marketing had promised. Look at NoSQL and MongoDB especially as recent examples. Most people who blindly jumped on the MDB bandwagon would have probably been better off just using Postgres, and they later had to spend a lot of resources migrating away from Mongo.
To me, constantly chasing the latest trends signals a lack of experience in a team and an absence of focus on what is actually important: delivering the product.
IgorPartola 6 hours ago [-]
This already happens. Whether your new framework is popular on GitHub and Stack Overflow is a metric people use. LLMs are currently mostly capable of just adapting documentation, blog posts, and answers on SO, so they add a thin veneer on top of those resources.
px1999 5 hours ago [-]
I expect it will wind up like search engines where you either submit urls for indexing/inclusion or wait for a crawl to pick your information up.
Until the tech catches up it will have a stifling effect on progress toward and adoption of new things (which imo is pretty common of new/immature tech, eg how culture has more generally kind of stagnated since the early 2000s)
gs17 5 hours ago [-]
Hopefully, tools can adapt to integrate documentation better. I've already run into this with GitHub Copilot: trying to use Svelte 5 with it is a battle, despite Svelte 5 having been released most of a year ago.
inerte 6 hours ago [-]
There’s another future where reasoning models get better with larger context windows, and you can throw a new programming language or framework at it and it will do a pretty good job.
PaulKeeble 6 hours ago [-]
We already have quite a lot of that effect with tooling. A language can't really get much traction until it has the build, packaging, and IDE support we expect; however productive the language is, it loses out in practice if it's hard to work with and doesn't just fit into our CI/CD systems.
wewtyflakes 5 hours ago [-]
Doesn't this mean that new tech will have to demonstrate material advantages, such that outweigh the LLM inertia, in order to be adopted? This sounds good to me; so much framework churn seems to be code fashion rather than function. Now if someone releases a new framework, they need to demonstrate real value first. People that are smart enough to read the docs and absorb the material of a new, better, framework will now have a competitive advantage; this all seems good.
dogprez 6 hours ago [-]
I think it's a good point, and I experienced the same thing when playing with SDL3 the other day. So even established languages with new APIs can be problematic.
However, I had a different takeaway when playing with Rust+AI. Having a language that has strict compile-time checks gave me more confidence in the code the AI was producing.
I did see Cursor get in an infinite loop where it couldn't solve a borrow checker problem and it eventually asked me for help. I prefer that to burying a bug.
adamrezich 5 hours ago [-]
I had the same issue a few months ago when I was trying to ask LLMs about Box2D 3.0. I kept getting answers that were either for Box2D 2.x, or some horrific mashup of 2.x and 3.0.
Now Box2D 3.1 has been released and there's zero chance any of the LLMs are going to emit any useful answers that integrate the newly introduced features and changes.
breuleux 5 hours ago [-]
I have that worry as well, but it may not be as bad as I feared. I am currently developing a Python serialization/deserialization library based on advanced multiple dispatch, so it is fairly different from how existing libraries work. Nonetheless, if I ask LLMs (using Cursor) to write new functionality or plugins within my framework, they are surprisingly adept at it, even with limited guidance. I expect it'll only get better in the next few years. Perhaps a set of AI directives and examples for new technologies would suffice.
In any case, there has always been a strong bias towards established technologies that have a lot of available help online. LLMs will remain better at using them, but as long as they are not completely useless on new technologies, they will also help enthusiasts and early adopters work with them and fill in the gaps.
mbrumlow 6 hours ago [-]
I don't think we will have a lack of people who explore and know, better than others, how to do things.
LLMs will make people productive. But they will at the same time elevate those with real skill and passion to create good software. In the meantime there will be some market confusion, and some mediocre engineers might find themselves in demand like top-end engineers. But over time, companies and markets will catch on, and top dollar will go to those select engineers who can do things with and without LLMs.
Lots of people are afraid of LLMs and think they are the end of the software engineer. They are and they are not. They are the end of the "CLI engineer" or the "front-end engineer" and all those specializations that were an attempt to require less skill and pay less. But the systems engineers who know how computers work, who can spend all week describing what happens when you press enter on a keyboard at google.com, will only be pressed into higher demand, because the single-skill "engineer" won't really be a thing.
tl;dr: LLMs won't kill software engineering; it's a reset that will cull those who chose the path only because it paid well.
doug_durham 5 hours ago [-]
What innovation? Languages with curly braces versus BEGIN/END? There is no innovation going on in computer languages. Rust is C with better ergonomics and rigorous memory management. This was made possible with better processors which made more elaborate compilers practical. It all gets compiled by LLVM down to the same object code. I think we are moving to an era of "read-only" languages. Languages that have horrible writing ergonomics yet are easy to understand when read. Humans won't write code. They will review code.
jdprgm 5 hours ago [-]
I've noticed this effect even with well-established tech, just in degrees of popularity. I've recently been working on a Swift/SwiftUI project, and the experience with LLMs compared to something like web dev with React etc. is noticeably different/worse, which I mostly attribute to there probably being at least 20 times less Swift-specific content on the web.
cvwright 5 hours ago [-]
There are a ton of Swift /SwiftUI tutorials out there for every new technology.
The problem is, they’re all blogspam rehashes of the same few WWDC talks. So they all have the same blindspots and limitations, usually very surface level.
pkkm 5 hours ago [-]
Is that different from what is happening already? A lot of people won't adopt a language/technology unless it has a huge repository of answers on StackOverflow, mature tooling, and a decent hiring pool.
I'm not saying you're definitely wrong, but if you think that LLMs are going to bring qualitative change rather than just another thing to consider, then I'm interested in why.
gwd 6 hours ago [-]
New languages / packages / frameworks may need to collaborate with LLM providers to provide good training material. LLM-able training material may be the next important documentation thing.
Another potentially interesting avenue of research would be to explore allowing LLMs to use "self-play" to explore new things.
SoKamil 6 hours ago [-]
How can it compete with the vast amount of trained codebases on GitHub? For LLMs, more data equals better results, so people will naturally be driven toward better completion with already-established frameworks and languages.
It would be hard to produce organic data on all ways your technology can be (ab)used.
huijzer 6 hours ago [-]
It’s the same now. I’ve spent arguably too much time trying to avoid Python and it has cost me a whole lot of time. You keep running into bugs and have to implement much more yourself if you go off the beaten path (see also [1]). I don’t regret it since I learned a lot, but it’s definitively not always the easiest path. To this day I wonder whether maybe I should have taken the simple route.
A showerthought I had recently was that newly-written software may have a perverse incentive to be intentionally buggy such that there will be more public complaints/solutions for said software, which gives LLMs more training data to work with.
doctorpangloss 5 hours ago [-]
Unity was a better choice for game engine long before the existence of LLMs.
calvinmorrison 5 hours ago [-]
It's not even innovation. I had a new Laravel project that I was chopping around to play with some new library, and I couldn't get the dumbest stuff to work. Of course I went back to read the docs: ah, Laravel 19 or whatever is using config/bootstrap.php again, and no matter what ChatGPT or I had figured, we couldn't understand why it wasn't working.
Unfortunately, with a lot of libraries and services, I don't think ChatGPT understands the differences between versions, or it would be hard for it to. At least I have found that with writing scriptlets for RT, PHP tooling, etc. The web world moves fast enough (and RT moves hella slow) that libraries and interfaces get confused across versions.
It'd really need a wider project context where it can go look at how those includes, or functions, or whatever work instead of relying on 'built in' knowledge.
"Assume you know nothing, go look at this tool, api endpoint or, whatever, read the code, and tell me how to use it"
dmitrygr 5 hours ago [-]
Honey, a new incantation to summon Cthulhu just dropped.
The article title is half-true. It wasn't so much that they migrated away from Rust as that they migrated away from Bevy, which is an alpha-quality game engine.
I wouldn't have read the article if it'd been labeled that, so kudos to the blog writer, I guess.
jonas21 5 hours ago [-]
What are some non-alpha quality Rust game engines? If the answer is "there are none", then I'd say the title is accurate.
ivanjermakov 7 hours ago [-]
The more surprising part for me is not migrating from Rust/Bevy, but migrating _to_ C#/Unity.
Although the points mentioned in the post are quite valid.
koakuma-chan 6 hours ago [-]
Where would you migrate to?
gh0stcat 6 hours ago [-]
Not OP, but there is still a huge sentiment that Unity is not a "safe" platform to migrate to because of its relatively antagonistic approach to monetization compared to open-source game engines. I do think it makes sense to also consider Godot, given his coworker is his brother, who is stated to be new to game development; it has a scripting language even simpler than C#, more like Python. Additionally, one might expect that someone into Rust would prefer the C++ integration that Unreal offers. The timeline had an effect here too, as it's only recently that people have started taking Godot more seriously.
sadeshmukh 6 hours ago [-]
Maybe Godot? The recent Unity scandal is not great for developers.
pjmlp 6 hours ago [-]
People forget that Unity and Unreal are industry darlings for a reason.
The number of platforms they support, the number of features they support (many of which could be a PhD thesis in graphics programming), the tooling, the store, ...
Personally, literally anything except Unity. The fact that they tried to retroactively change terms on developers means that it will be a long time before I feel comfortable trusting they won't try it again.
Jyaif 6 hours ago [-]
They mentioned ABI and the ability to create mods, which are Rust things.
Here's a thought experiment:
Would Minecraft have been as popular if it had been written in Rust instead of Java?
legobmw99 6 hours ago [-]
I mean, we already have a sort-of answer, because the "Bedrock Edition" of Minecraft is written in C++, and it is indeed less popular on PC (on console, it's the only option, so _overall_ it might win out) and does lack any real modding scene
rcxdude 3 hours ago [-]
Indeed. Java is sufficiently dynamic and decompilable that a game written in it can be heavily modded without specific modding support being added. C++ is much harder (depending on the game engine), though not impossible. If you do add modding support, then everything is much better regardless of language (see Factorio, written in C++ and with a huge modding scene, because it was basically written with modding in mind; Lua certainly helps with that, of course).
jedisct1 6 hours ago [-]
The problem with Rust is that almost everything is still at an alpha stage. The vast majority of crates are at version 0.x and are eventually abandoned, replaced, or subject to constant breaking changes.
While the language itself is great and stable, the ecosystem is not, and reverting to more conservative options is often the most reasonable choice, especially for long-term projects.
the_mitsuhiko 6 hours ago [-]
I really don’t think Rust is a good match for game dev. Both because of the borrow checker which requires a lot of handles instead of pointers and because compile times are just not great.
But outside of games the situation looks very different. “Almost everything” is just not at all accurate. There are tons of very stable and productive ecosystems in Rust.
pcwalton 2 hours ago [-]
> I really don’t think Rust is a good match for game dev. Both because of the borrow checker which requires a lot of handles instead of pointers and because compile times are just not great.
I completely disagree, having been doing game dev in Rust for well over a year at this point. I've been extremely productive in Bevy, because of the ECS. And Unity compile times are pretty much just as bad (it's true, if you actually measure how long that dreaded "Reloading Domain" screen takes).
littlestymaar 6 hours ago [-]
The borrow checker is mostly a strawman in this discussion: the post is about using Bevy as an engine, and Bevy's ECS manages the lifetime of objects for you automatically. You will never have an issue with the borrow checker when using Bevy, not even once.
pclmulqdq 6 hours ago [-]
Everything in every ECS is done with handles, but the parent comment is correct that many games use hairballs of pointers all over the place (which become handles under an ECS). There is never a borrow-checker issue with handles, since they divorce the concept of a pointer from the concept of ownership.
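A toy sketch of that handle pattern (purely illustrative, not Bevy's actual `Entity` or storage types): the handle is a plain copyable index, so it carries no lifetime and never fights the borrow checker; the trade-off is that a stale handle fails logically at runtime rather than being caught at compile time.

```rust
// A handle is just an index: Copy, no lifetime, freely storable
// anywhere -- including as a back-reference.
#[derive(Clone, Copy, PartialEq, Debug)]
struct Handle(usize);

struct Arena<T> {
    items: Vec<Option<T>>,
}

impl<T> Arena<T> {
    fn new() -> Self {
        Arena { items: Vec::new() }
    }
    fn insert(&mut self, value: T) -> Handle {
        self.items.push(Some(value));
        Handle(self.items.len() - 1)
    }
    fn get(&self, h: Handle) -> Option<&T> {
        self.items.get(h.0).and_then(|slot| slot.as_ref())
    }
    fn remove(&mut self, h: Handle) -> Option<T> {
        self.items.get_mut(h.0).and_then(|slot| slot.take())
    }
}

fn main() {
    let mut arena = Arena::new();
    let h = arena.insert("player");
    assert_eq!(arena.get(h), Some(&"player"));
    // After removal the handle goes stale: you get None, not a
    // dangling pointer. The failure mode is logical, not memory-unsafe.
    arena.remove(h);
    assert_eq!(arena.get(h), None);
}
```

Production ECS implementations add a generation counter to each slot so that a reused index can be told apart from a stale one, but the ownership story is the same.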
rcxdude 3 hours ago [-]
I wouldn't say 'almost everything', but there are some areas that require a huge amount of time and effort to build a mature solution for, and where big gaps remain; UI and game engines are among them.
littlestymaar 6 hours ago [-]
It's still true for game dev indeed, but for back-end or CLI tools it hasn't been true in like 7 years or so.
Ygg2 6 hours ago [-]
> The problem with Rust is that almost everything is still at an alpha stage.
Replace Rust with Bevy and language with framework, you might have a point. Bevy is still in alpha, it's lacking plenty of things, mainly UI and an easy way to have mods.
As for almost everything being at an alpha stage: yeah. Welcome to OSS + SemVer. Moving to 1.x makes a critical statement: it's ready for wider use, and we now take backwards compatibility seriously.
But hurray! Commercial interest won again, and now you have to change engines again, once the Unity Overlords decide to go full Shittification on your poorly paying ass.
pclmulqdq 6 hours ago [-]
Unfortunately, it is a failing of many projects in the Rust sphere that they spend quite a lot longer in 0.x than other projects. Rust language and library features themselves often spend years in nightly before making it to a release build.
You can also always go from 1.0 to 2.0 if you want to make breaking changes.
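For reference, Cargo's default caret requirements treat the leftmost non-zero version digit as the breaking-change boundary, which is exactly why staying at 0.x reads as "expect breakage every minor release":

```toml
[dependencies]
serde = "1.0"   # ^1.0: any 1.x release is treated as compatible
bevy  = "0.13"  # ^0.13: only 0.13.x is compatible; 0.14 may break the API
```

So a crate that sits at 0.x for years is, by the ecosystem's own convention, reserving the right to break its users on every minor bump.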
Ygg2 6 hours ago [-]
> Unfortunately, it is a failing of many projects in the Rust sphere that they spend quite a lot longer in 0.x than other projects
Yes. Because it makes a promise about backwards compatibility.
> Rust language and library features themselves often spend years in nightly before making it to a release build.
So did Java's, and Rust probably has a fraction of its budget.
In defense of long nightly periods: more than once, stabilizing a feature early (like negative impls or never types) would have caused huge backwards-breaking changes.
> You can also always go from 1.0 to 2.0 if you want to make breaking changes.
Yeah, just like Python!
And split the community and double your maintenance burden. Or just pretend 2.0 is 1.1 and have the downstream enjoy the pain of migration.
bigstrat2003 6 hours ago [-]
> And split the community and double your maintenance burden.
If you choose to support 1.0 sure. But you don't have to. Overall I find that the Rust community is way too leery of going to 1.0. It doesn't have to be as big a burden as they make it out to be, that is something that comes down to how you handle it.
Ygg2 6 hours ago [-]
> If you choose to support 1.0 sure.
If you choose not to, then people wait for x.0 where x approaches infinity. I.e. they lose confidence in your crates/modules/libraries.
I mean, a big part of why I don't 1.x my OSS projects (not just Rust) is that I don't consider them finished yet.
justmarc 6 hours ago [-]
I have to totally disagree here.
I don't even look at crate versions, but the stuff works, and works very well. The resulting code is stable and robust, and the crates save an inordinate amount of development time. It's like Lego for high-end, high-performance code.
With Rust and the crates you can build actual, useful stuff very quickly. Hit a bug in a crate or have missing functionality? Contribute.
Software is almost always a work in progress and almost never perfect and done. It's something you live with. Try any of this in C or C++.
pjmlp 6 hours ago [-]
They might be unsafe, but there is enough tooling to pick from, with roughly 60 and 50 years of industrial use respectively.
littlestymaar 6 hours ago [-]
Well, on the flip side with C++ some of it hasn't been updated beyond very basic maintenance and you can't even understand the code if you are just familiar with more modern C++…
pjmlp 6 hours ago [-]
Well it is upon each one to be good with their craft.
If not, the language they pick doesn't really make a difference in the end.
It is like complaining playing a music instrument to be in band or orchestra requires too much effort, naturally.
littlestymaar 6 hours ago [-]
Except here you are a trained pianist and the tour manager gave you a pipe organ or a harpsichord.
pjmlp 5 hours ago [-]
Speaking as someone with a musical background, that is where we separate those who actually understand music from those who kind of get by.
Great musicians make a symphony out of what they can get their hands on.
BuyMyBitcoins 6 hours ago [-]
>”reverting to more conservative options”
From what I’ve heard about the Rust community, you may have made an unintentionally witty pun.
monkeyelite 7 hours ago [-]
It’s incredible how many projects and articles have been written around ECS with very little results.
Quake 1-3 use a single array of structs, with sometimes-unused properties. Is your game more complex than Quake 3?
The “ECS” upgrade to that is having an array for each component type but just letting there be gaps.
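The "arrays with gaps" layout can be sketched roughly like this (component types and names are illustrative, not from any particular engine):

```rust
// One Vec per component type, indexed by entity id. None marks a gap:
// the entity simply doesn't have that component.
#[derive(Clone, Copy, Debug, PartialEq)]
struct Position { x: f32, y: f32 }

#[derive(Clone, Copy, Debug, PartialEq)]
struct Health(i32);

struct World {
    positions: Vec<Option<Position>>,
    healths: Vec<Option<Health>>,
}

impl World {
    fn new() -> Self {
        World { positions: Vec::new(), healths: Vec::new() }
    }

    // Spawning an entity reserves a slot in every component array.
    fn spawn(&mut self) -> usize {
        self.positions.push(None);
        self.healths.push(None);
        self.positions.len() - 1
    }

    // A "system" just iterates the entities that have the component,
    // skipping the gaps.
    fn damage_all(&mut self, amount: i32) {
        for h in self.healths.iter_mut().flatten() {
            h.0 -= amount;
        }
    }
}

fn main() {
    let mut world = World::new();
    let player = world.spawn();
    world.positions[player] = Some(Position { x: 0.0, y: 0.0 });
    world.healths[player] = Some(Health(100));

    let scenery = world.spawn(); // has a position but no health: a gap
    world.positions[scenery] = Some(Position { x: 5.0, y: 2.0 });

    world.damage_all(10);
    assert_eq!(world.healths[player], Some(Health(90)));
    assert_eq!(world.healths[scenery], None);
}
```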
Hype as usual: too many people waste time on how to implement engines instead of on how to make a game fun to play.
cogman10 6 hours ago [-]
The important part of ECS (IMO) is more that it's a pattern that others recognize and less that it's necessarily the best pattern to use.
dist-epoch 6 hours ago [-]
Quake 1-3 were written for computers where memory was not much slower than the CPU, which is not the situation today.
But yeah, probably you don't need an ECS for 90% of the games.
fooker 6 hours ago [-]
Memory is sometimes faster today!
eftychis 5 hours ago [-]
This comment might not be liked by the usual commenters in these threads, but I think it is worth stressing:
First: I have experience with Bevy and other game engine frameworks, including Unreal. And I consider myself a seasoned Rust, C, etc. developer.
I can sympathize with what the author stated.
I think the issue here is (mainly) Bevy. It is just not even close to the standard yet (if ever). It is hard for any generic game engine to compete with Unity/Godot, never mind the de facto standard, Unreal.
But if you are a C# developer already using Unity (rather than C++ in Unreal), moving to Bevy, a bloated framework that is missing features, makes little sense. [There is also the minor issue that if you are a C# developer, you honestly don't care about low-level code or about not having a garbage collector.]
Now if you are a C++ developer using Unreal, the only point in moving to Rust (which I would argue for, for the usual reasons) is if Unreal supports Rust. Otherwise, there is nothing that even compares to Unreal. (Custom-made game engines aside.)
JeremyBarbosa 5 hours ago [-]
As someone who has used Bevy in the past, that was my reading as well. It is an incredible tool, but some of the things mentioned in the article like the gnarly function signature and constant migrations are known issues that stop a lot of people from using it. That's not even to mention the strict ECS requirement if your game doesn't work well around it. Here is a good reddit thread I remember reading about some more difficulties other people had with Bevy:
The way Bevy is discussed online obfuscates this. Someone new to game development could be confused into thinking Bevy is a fair competitor to the other engines you mentioned, and could equate Bevy with Rust, or Bevy with Rust in game dev. I think stomping this out is critical to expectation management, and perhaps to Rust's future in game dev.
From my experience one has to take Rust discussions with a grain of salt, because shortcomings and disclosures are often handwaved and/or omitted.
the__alchemist 3 hours ago [-]
I've learned to do the same. I see this in the embedded world as well.
And within rust, I've learned to look beyond the most popular and hyped tools; they are often not the best ones.
ecshafer 5 hours ago [-]
IMO the place for Rust in game dev isn't in games at all, but in base libraries and tools. Writing your procedural-generation library in Rust as an isolated package you can call from elsewhere, or similar, is where it's useful.
eftychis 5 hours ago [-]
I agree. [Unless fully adopted by a serious game engine, of course.]
Rust's "superpower" is substituting critical C++ code in-place, with the goal of ensuring correctness and soundness. And increasing the development velocity as a result.
nine_k 5 hours ago [-]
Sounds like "Migrating away from Bevy towards Unity"; the Rust to C# transition is mostly a technical consequence.
Bevy: unstable, constantly regressing, with weird APIs here and there, in flux, so LLMs can't handle it well.
Unity: rock-solid, stable, well-known, featureful, LLMs know it well. You ought to choose it if you want to build the game, not hack on the engine, be its internal language C#, Haskell, or PHP. The language is downstream from the need to ship.
Sleaker 6 hours ago [-]
Anyone else get an empty page on mobile Firefox when they try to go to the article? All that renders for me is a comment entry box. If I go back to the news page I can see the article list just fine.
firesteelrain 6 hours ago [-]
Works for me on mobile Chrome
fotta 6 hours ago [-]
Same on mobile safari
gbuk2013 7 hours ago [-]
Don’t see any content on that article for some reason (from iPhone)
maartenscholl 7 hours ago [-]
I experienced the same, I had to disable my adblocker to view it, it seems the content is inside a tag `<article class="social-sharing">` but I am unsure whether this triggered my adblocker.
janice1999 7 hours ago [-]
Adblocking seems to cause issues with the site. Disabling uBlock Origin worked for me as did readability mode in Firefox.
forrestthewoods 7 hours ago [-]
Rust is not good for video game gameplay logic. The ownership model of Rust can not represent the vast majority of allocations.
I love Rust. It’s not for shipping video games. No Tiny Glade doesn’t count.
Edit: don’t know why you’re downvoting. I love Rust. I use it at my job and look for ways to use it more. I’ve also shipped a lot of games. And if you look at Steam there are simply zero Rust made games in the top 2000. Zero. None nada zilch.
Also you’re strictly forbidden from shipping Rust code on PlayStation. So if you have a breakout indie hit on Steam in Rust (which has never happened) you can’t ship it on PS5. And maybe not Switch although I’m less certain.
pcwalton 2 hours ago [-]
> No Tiny Glade doesn’t count.
> And if you look at Steam there are simply zero Rust made games in the top 2000. Zero. None nada zilch.
Well, sure, if you arbitrarily exclude the popular game written in Rust, then of course there are no popular games written in Rust :)
> And maybe not Switch although I’m less certain.
I have talked to Nintendo SDK engineers about this and been told Rust is fine. It's not an official part of their toolchain, but if you can make Rust work they don't care.
forrestthewoods 1 hours ago [-]
Yeah, in my haste I mixed up my rants. The bane of typing at work in between things.
Tiny Glade is indeed a rust game. So there is one! I am not aware of a second. But it’s not really a Bevy game. It uses the ECS crate from Bevy.
And for something like Gnorp, Rust is probably a decent choice.
queuebert 6 hours ago [-]
> The ownership model of Rust can not represent the vast majority of allocations.
What allocations can you not do in Rust?
forrestthewoods 6 hours ago [-]
Gameplay code is a big bag of mutable data that lives for relatively unknown amounts of time. This is the antithesis of Rust.
The Unity GameObject/Component model is pretty good. It’s very simple. And clearly very successful. This architecture can not be represented in Rust. There are a dozen ECS crates but no one has replicated the worlds most popular gameplay system architecture. Because they can’t.
yuriks 4 hours ago [-]
Which part of that architecture is impossible in Rust? Actually an honest question, I'm wondering if I'm missing something.
From what I remember from my Unity days (which granted, were a long time ago), GameObjects had their own lifecycle system separate from the C# runtime and had to be created and deleted using Destroy and Create calls in the Unity API. Similarly, components and references to them had to be created and retrieved using the GetComponent calls, which internally used handles, rather than being raw GC pointers. Runtime allocation of objects frequently caused GC issues, so you were practically required to pre-allocate them in an object pool anyway.
I don't see how any of those things would be impossible or even difficult to implement in Rust. In fact, this model is almost exactly what I used to see evangelized all the time for C++ engines (using safe handles and allocator pools) in GDC presentations back then.
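The handle-and-pool model described above translates fairly directly to Rust. A minimal sketch (names and the generational-index scheme are illustrative, not any particular engine's API): objects live in a pool, "references" are copyable handles, and destroying an object invalidates stale handles safely instead of dangling.

```rust
// A handle is an index plus a generation counter, so a handle to a
// destroyed-and-reused slot fails lookup instead of aliasing.
#[derive(Clone, Copy, PartialEq, Debug)]
struct Handle { index: usize, generation: u32 }

struct Pool<T> {
    slots: Vec<Option<T>>,
    generations: Vec<u32>,
}

impl<T> Pool<T> {
    fn new() -> Self {
        Pool { slots: Vec::new(), generations: Vec::new() }
    }

    // Analogous to Unity's Create: returns a handle, not a pointer.
    fn create(&mut self, value: T) -> Handle {
        self.slots.push(Some(value));
        self.generations.push(0);
        Handle { index: self.slots.len() - 1, generation: 0 }
    }

    // Analogous to Destroy: frees the slot and bumps the generation,
    // so every outstanding handle to it becomes stale.
    fn destroy(&mut self, h: Handle) {
        if self.generations[h.index] == h.generation {
            self.slots[h.index] = None;
            self.generations[h.index] += 1;
        }
    }

    // Analogous to GetComponent: stale handles return None.
    fn get(&self, h: Handle) -> Option<&T> {
        if self.generations[h.index] == h.generation {
            self.slots[h.index].as_ref()
        } else {
            None
        }
    }
}

fn main() {
    let mut pool = Pool::new();
    let h = pool.create("enemy");
    assert_eq!(pool.get(h), Some(&"enemy"));
    pool.destroy(h);
    assert_eq!(pool.get(h), None); // stale handle: safe failure, no dangling
}
```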
In my view, as someone who has not really interacted with or explored Rust gamedev much, the issue is more that Bevy has been attempting to present an overly ambitious API, as opposed to focusing on a simpler, less idealistic one, and since it is the poster child for Rust game engines, people keep tripping over those problems.
koakuma-chan 7 hours ago [-]
You could probably write the core in Rust and use some sort of scripting for gameplay logic. Warframe's gameplay logic is written in Lua.
WinstonSmith84 6 hours ago [-]
The headline is a bit sensational and should rather have been called "Migrating away from Bevy". This isn't (really) comparing C# to Rust (or Lua, which is missing here), but rather comparing game engines, where the language is secondary. Obviously Unity is the leader here (with Unreal), despite all its flaws.
koakuma-chan 6 hours ago [-]
Isn't Veloren doing pretty good?
forrestthewoods 6 hours ago [-]
No. No one plays Veloren. It’s a toy project for programmers.
No offense to the project. It’s cool and I’m glad it exists. But if you were to plot the top 2000 games on Steam by time played there are, I believe, precisely zero written in Rust.
dismalaf 6 hours ago [-]
> No Tiny Glade doesn’t count.
Tiny Glade is also the buggiest Steam game I've ever encountered (bugs from disappearing cursor to not launching at all). Incredibly poor performance as well for a low poly game, even if it has fancy lighting...
adamrezich 5 hours ago [-]
> Also you’re strictly forbidden from shipping Rust code on PlayStation. So if you have a breakout indie hit on Steam in Rust (which has never happened) you can’t ship it on PS5. And maybe not Switch although I’m less certain.
What evidence do you have for this statement? It kind of doesn't make any sense on its face. Binaries are binaries, no matter what tools are used to compile them. Sure, you might need to use whatever platform-specific SDK stuff to sign the binary or whatever, but why would Rust in particular be singled out as being forbidden?
Despite not being yet released publicly, Jai can compile code for PlayStation, Xbox, and Switch platforms (with platform-specific modules not included in the beta release, available upon request provided proof of platform SDK access).
forrestthewoods 5 hours ago [-]
Sony mandates you use their toolchain. You don’t get to ship whatever you want on their console. They have a very thorough TRC check you must pass before you get to ship.
adamrezich 5 hours ago [-]
Rust being forbidden on a platform, and Rust being unsupported out-of-the-box with the SDK toolchain, seem to me like they're rather different things?
Philpax 5 hours ago [-]
...why does Tiny Glade not count?
Ciantic 6 hours ago [-]
> Rust can not represent the vast majority of allocations
Do you mean cyclic types?
Rust being low-level, nobody prevents one from implementing garbage-collected types, and I've been looking into this myself: https://github.com/Manishearth/rust-gc
It's "Simple tracing (mark and sweep) garbage collector for Rust", which allows cyclic allocations with simple `Gc<Foo>` syntax. Can't vouch for that implementation, but something like this would be good for many cases.
shmerl 6 hours ago [-]
Using poor-quality AI suggestions as a reason not to use Rust is a super weird argument. Something is very wrong with that idea. What's next, avoiding everything AI performs poorly at?
Scripting being flexible is a proper idea, but that's not an argument against Rust either. Rather, it's an argument for more separation between the scripting machinery and the core engine.
For example Godot allows using Rust for game logic if you don't want to use GDScript, and it's not really messing up the design of their core engine. It's just more work to allow such flexibility of course.
The rest of the arguments are more in the familiarity / learning curve group, so nothing new in that sense (Rust is not the easiest language).
tptacek 6 hours ago [-]
Yes, a lot of people are reasonably going to decide to work in environments that are more legible to LLMs. Why would that surprise you?
The rest of your comment boils down to "skills issue". I mean, OK. But you can say that about any programming environment, including writing in raw assembly.
shmerl 3 hours ago [-]
The first argument sounds like a major fallacy to me. It doesn't surprise me, but I find it extremely wrong.
tptacek 3 hours ago [-]
Why?
shmerl 2 hours ago [-]
Because it discourages learning based on the mediocrity of AI. I find that such an idea perpetuates the mediocrity (not just of AI itself, but of whatever it's used for).
It's like saying: I don't want to learn how to write a good story because AI always suggests a bad one anyway. Maybe that delivers the idea better.
tptacek 2 hours ago [-]
It's not at all clear to me what this has to do with the practical delivery of software. In languages that LLMs handle well, with a careful user (ie, not a vibe coder; someone reading every line of output and subjecting most of it to multiple cycles of prompting) the code you end up with is basically indistinguishable from the replacement-level code of an expert in the language. It won't hit that human expert's peaks, but it won't generally sink below their median. That's a huge accelerator for actually delivering projects, because, for most projects, most of the code need only be replacement-grade.
Why would I valorize discarding this kind of automation? Is this just a craft vs. production thing? Like, the same reason I'd use only hand tools when doing joinery in Japanese-style woodworking? There's a place for that! But most woodworkers... use table saws and routers.
mwcampbell 16 minutes ago [-]
> Why would I valorize discarding this kind of automation? Is this just a craft vs. production thing?
The strongest reason I can think of to discard this kind of automation, and do so proudly, is that it's effectively plagiarizing from all of the experts whose code was used in the training data set without their permission.
shmerl 2 hours ago [-]
It's not about the delivery of software, it's about the avoidance of learning based on the mediocrity of AI. I.e. the original post literally cites LLMs being poor at Rust suggestions as a reason to avoid the language.
That implies that proponents of such an approach don't want to pursue learning that requires them to do something exceeding the mediocrity level set by the AI they rely on.
For me it's obvious that it has a major negative impact on many things.
quantified 5 hours ago [-]
It's a weird idea now, but it won't be weird soon. As devs and organizations further buy into AI-first coding, anything not well-served by AI will be treated as second-class. Another thread here brought up the risk that AI will limit innovation by not being well-trained on new things.
shmerl 3 hours ago [-]
I agree that such a trend exists, but it's extremely unhealthy, and developers, of all people, should have more of a clue about how bad it is.
brokencode 6 hours ago [-]
Developers often pick languages and libraries based on the strength of their developer tools. Having great dev tools was a major reason Ruby on Rails took off, for example.
Why exclude AI dev tools from this decision making? If you don’t find such tools useful, then great, don’t use them. But not everybody feels the same way.
bsaul 6 hours ago [-]
It could be a weird argument, but as a Rust newcomer, I have to say it's really something that jumps out at you. LLMs are practically useless for anything non-basic, and Rust contains a lot of non-basic things.
quantified 5 hours ago [-]
So, what are the chances that the pendulum swings to lower-level programming via LLM-generated C/C++ if LLM-generated Rust doesn't emerge? Note that this question is a context switch from gaming to something larger. For gaming, it could easily be that the engine and culture around it (frequent regressions, etc) are the bigger problems than the language.
bsaul 3 hours ago [-]
I haven't coded in C/C++ in years, but friends who do, and who work on non-trivial codebases in those languages, had a really crappy experience with LLMs too.
A friend of mine only understood why i was so impressed by LLMs once he had to start coding a website for his new project.
My feeling is that low-level / system programming is currently at the edge of what LLMs can do. So i'd say that languages that manage to provide nice abstractions around those types of problems will thrive. The others will have a hard time gaining support among young developers.
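The Rc/Weak workaround described above can be sketched as follows. This is the unwieldy, runtime-checked version: the proposal is about proving at compile time that `upgrade` never fails, which this code cannot do.

```rust
use std::cell::RefCell;
use std::rc::{Rc, Weak};

// A owns B (via Rc in `children`); B can find A (via Weak back-reference).
struct Parent {
    name: String,
    children: RefCell<Vec<Rc<Child>>>,
}

struct Child {
    // Weak so the back-reference doesn't keep the parent alive
    // (an Rc here would create a reference cycle and leak).
    parent: Weak<Parent>,
}

fn main() {
    let parent = Rc::new(Parent {
        name: "A".to_string(),
        children: RefCell::new(Vec::new()),
    });
    let child = Rc::new(Child { parent: Rc::downgrade(&parent) });
    parent.children.borrow_mut().push(child.clone());

    // upgrade() returns Option: it can fail at runtime if the parent
    // has been dropped, which is exactly the check the comment wants
    // moved to compile time.
    let found = child.parent.upgrade().expect("parent dropped");
    assert_eq!(found.name, "A");
}
```

Note the runtime costs the comment mentions: reference-count updates on clone/drop, plus `RefCell` borrow checking at runtime.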
"Is-a" relationships are difficult
Rust traits are not objects. Traits cannot have associated data. Nor are they a good mechanism for constructing object hierarchies. People keep trying to do that, though, and the results are ugly.
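A small sketch of why the results get ugly (names are illustrative): since a trait cannot hold data, every implementor must re-declare the shared field and expose it through an accessor method, instead of inheriting it from a base class.

```rust
// No shared `name` field can live in the trait; only a method signature.
trait Entity {
    fn name(&self) -> &str;
}

// Each "subclass" duplicates the field...
struct Player { name: String, score: u32 }
struct Monster { name: String, hp: i32 }

// ...and duplicates the boilerplate accessor.
impl Entity for Player {
    fn name(&self) -> &str { &self.name }
}
impl Entity for Monster {
    fn name(&self) -> &str { &self.name }
}

fn main() {
    // Trait objects give dynamic dispatch, but not an object hierarchy:
    // there is no downcasting or field inheritance as in C#/Java.
    let entities: Vec<Box<dyn Entity>> = vec![
        Box::new(Player { name: "hero".into(), score: 0 }),
        Box::new(Monster { name: "slime".into(), hp: 10 }),
    ];
    let names: Vec<&str> = entities.iter().map(|e| e.name()).collect();
    assert_eq!(names, vec!["hero", "slime"]);
}
```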
[1] https://www.animats.com/sharpview/index.html
These are all solvable problems, but in reality, it's very hard to write a good business case for being the one to solve them. Most of the cost accrues to you and most of the benefit to the commons. Unless a corporate actor decides to write a major new engine in Rust or use Bevy as the base for the same, or unless a whole lot of indie devs and part-time hackers arduously work all this out, it's not worth the trouble if you're approaching it from the perspective of a studio with severe limitations on both funding and time.
The only challenge is not having an ecosystem with ready-made everything like you do in "batteries included" frameworks. You are basically building a game engine and a game at the same time.
We need a commercial engine in Rust or a decade of OSS work. But what features will be considered standard in Unreal Engine 2035?
I am dealing with similar issues in npm now, as someone who is touching Node dev again. The number of deprecations drives me nuts. Seems like I’m on a treadmill of updating APIs just to have the same functionality as before.
It’s not always possible to be so minimal, but I view every dependency as lugging around a huge lurking liability, so the benefit it brings had better far outweigh that big liability.
So far, I’ve only had one painful dependency upgrade in 5 years, and that was Tailwind 3-4. It wasn’t too painful, but it was painful enough to make me glad it’s not a regular occurrence.
On that subject, ironically, code gen by AI for AI-related work is often the least reliable due to fast churn. Langchain is a good example of this, and it's also kind of funny: they suggest/integrate gritql for deterministic code transforms rather than using AI directly: https://python.langchain.com/docs/versions/v0_3/.
Overall, mastering things like gritql, ast-grep, and CST tools for code transforms still pays off. For large code bases, no matter how good AI gets, it is probably better to have it use formal/deterministic tools like these rather than trust it with code transformations directly and just hope for the best.
eg: https://docs.openrewrite.org/recipes/java/migrate/joda/jodat...
I occasionally notice libraries or frameworks including OpenRewrite rules in their releases. I've never tried it, though!
I think you should think less like Java/C# and more like a database.
If you have a Comment object that has a parent, you need to store the parent as a 'reference', because you can't embed the entire parent.
So I'd probably use an id (an index into the comment store) to refer to the parent, like a foreign key.
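The database-style approach can be sketched like this (a minimal illustration, not a real schema): all comments live in one `Vec` acting as the table, and the parent "reference" is just an index, like a foreign key.

```rust
struct Comment {
    text: String,
    // Index into the `comments` Vec, not a pointer; None for root comments.
    parent: Option<usize>,
}

fn main() {
    let mut comments: Vec<Comment> = Vec::new();
    comments.push(Comment { text: "root".into(), parent: None });
    comments.push(Comment { text: "reply".into(), parent: Some(0) });

    // Walking from a reply back to its parent is an index lookup,
    // with no ownership cycle to fight the borrow checker over.
    let reply = &comments[1];
    let parent_text = reply.parent.map(|i| comments[i].text.as_str());
    assert_eq!(parent_text, Some("root"));
}
```

This is the "indices as references" workaround from the parent article, with its stated downside: a stale or wrong index is a logic bug the compiler won't catch.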
> Migration - Bevy is young and changes quickly.
We were writing an animation system in Bevy and were hit by the painful upgrade cycle twice. And the issues we had to deal with were runtime failures, not build time failures. It broke the large libraries we were using, like space_editor, until point releases and bug fixes could land. We ultimately decided to migrate to Three.js.
> The team decided to invest in an experiment. I would pick three core features and see how difficult they would be to implement in Unity.
This is exactly what we did! We feared a total migration, but we decided to see if we could implement the features in Javascript within three weeks. Turns out Three.js got us significantly farther than Bevy, much more rapidly.
I definitely sympathize with the frustration around the churn--I feel it too and regularly complain upstream--but I should mention that Bevy didn't really have anything production-quality for animation until I landed the animation graph in Bevy 0.15. So sticking with a compatible API wasn't really an option: if you don't have arbitrary blending between animations and opt-in additive blending then you can't really ship most 3D games.
This is clearly false. The Bevy performance improvements that I and the rest of the team landed in 0.16 speak for themselves [1]: 3x faster rendering on our test scenes and excellent performance compared to other popular engines. It may be true that little work is being done on rend3, but please don't claim that there isn't work being done in other parts of the ecosystem.
[1]: https://bevyengine.org/news/bevy-0-16/
...although the fact that a 3x speed improvement was available kind of proves their point, even if it may be slightly out of date.
That is, any sufficiently mature indie game project will end up implementing an informally specified, ad hoc, bug-ridden implementation of Unity (... or just use the informally specified, ad hoc and bug-ridden game engine called "Unity")
> That is, any sufficiently mature indie game project will end up implementing an informally specified, ad hoc, bug-ridden implementation of Unity (... or just use the informally specified, ad hoc and bug-ridden game engine called "Unity")
But using Bevy isn't writing your own game engine. Bevy is 400k lines of code that does quite a lot. Using Bevy right now is more like taking a game engine and filling in some missing bits. While this is significantly more effort than using Unity, it's an order of magnitude less work than writing your own game engine from scratch.
If you just want to make a game, yes, absolutely just go for Unity, for the same reason why if you just want to ship a CRUD app you should just use an established batteries-included web framework. But indie game developers come in all shapes and some of them don't just want to make a game, some of them actually do enjoy owning every part of the stack. People write their own OSes for fun, is it so hard to believe that people (who aren't you) might enjoy the process of building a game engine?
But part of the thing that attracted me to the game I'm making is that it would be hard to make in a standard cookie-cutter way. The novelty of the systems involved is part of the appeal, both to me and (ideally) to my customers. If/when I get some of those (:
Generally, I've seen the exact opposite. People who code their own engines tend to get sucked into the engine and forget that they're supposed to be shipping a game. (I say this as someone who has coded their own engine, multiple times, and ended up not shipping a game--though I had a lot of fun working on the engine.)
The problem is that the fun, cool parts about building your own game engine are vastly outnumbered by the boring parts: supporting level and save data loading/storage, content pipelines, supporting multiple input devices and things like someone plugging in an XBox controller while the game is running and switching all the input symbols to the new input device in real time, supporting various display resolutions and supporting people plugging in new displays while the game is running, and writing something that works on PC/mobile/Switch(2)/XBox/Playstation... all solved problems, none of which are particularly intellectually stimulating to solve correctly.
If someone's finances depend on shipping a game that makes money, there's really no question that you should use Unity or Unreal. Maybe Godot but even that's a stretch. There's a small handful of indie custom game engine success stories, including some of my favorites like The Witness and Axiom Verge, but those are exceptions rather than the rule. And Axiom Verge notably had to be deeply reworked to get a Switch release, because it's built on MonoGame.
Shipping a playable game involves so so many things beyond enjoyable programming bits that it's an entirely different challenge.
I think it's telling that there are more Rust game engines than games written in Rust.
What really drags you down in games is iteration speed. It can be fun making your own game engine at first but after awhile you just want the damn thing to work so you can try out new ideas.
But for the vast majority of projects, I believe that C++ is not the right language, meaning that Rust isn't, either.
I feel like many people choose Rust because is sounds like it's more efficient, a bit as if people went for C++ instead of a JVM language "because the JVM is slow" (spoiler: it is not) or for C instead of C++ because "it's faster" (spoiler: it probably doesn't matter for your project).
It's a bit like choosing Gentoo "because it's faster" (or worse, because it "sounds cool"). If that's the only reason, it's probably a bad choice (disclaimer: I use and love Gentoo).
The result was not statistically different in performance than my Java implementation. Each took the same amount of time to complete. This surprised me, so I made triply sure that I was using the right optimization settings.
Lesson learned: Java is easy to get started with out of the box, memory safe, battle tested, and the powerful JIT means that if warmup times are a negligible factor in your usage patterns your Java code can later be optimized to be equivalent in performance to a Rust implementation.
Java has the stigma of ClassFactoryGeneratorFactory sticking to it like a nasty smell but that's not how the language makes you write things. I write Java professionally and it is as readable as any other language. You can write clean, straightforward and easy to reason code without much friction. It's a great general purpose language.
The OP is doing game development. It’s possible to write a performant game in Java but you end up fighting the garbage collector the whole way and can’t use much library code because it’s just not written for predictable performance.
This said, they moved to Unity, which is C#, which is garbage collected, right?
For all my personal projects, I use a mix of Haskell and Rust, which I find covers 99% of the product domains I work in.
Ultra-low level (FPGA gateware): Haskell. The Clash compiler backend lets you compile (non-recursive) Haskell code directly to FPGA. I use this for audio codecs, IO expanders, and other gateware stuff.
Very low-level (MMUless microcontroller hard-realtime) to medium-level (graphics code, audio code): Rust dominates here
High-level (have an MMU, OS, and desktop levels of RAM, not sensitive to ~0.1ms GC pauses): Haskell becomes a lot easier to productively crank out "business logic" without worrying about memory management. If you need to specify high-level logic, implement a web server, etc. it's more productive than Rust for that type of thing.
Both languages have a lot of conceptual overlap (ADTs, constrained parametric types, etc.), so being familiar with one provides some degree of cross-training for the other.
If your C is faster than your C++ then something has gone horribly wrong. C++ has been faster than C for a long time. C++ is about as fast as it gets for a systems language.
What is your basis for this claim? C and C++ are both built on essentially the same memory and execution model. There is a significant set of programs that are valid C and C++ both -- surely you're not suggesting that merely compiling them as C++ will make them faster?
There's basically no performance technique available in C++ that is not also available in C. I don't think it's meaningful to call one faster than the other.
When people claim C++ is faster than C, that is usually understood to mean that C++ provides tools that make writing fast code easier than C does, not that the fastest possible implementation in C++ beats the fastest possible implementation in C. The latter is trivially false, since in both cases the fastest possible implementation is the same unmaintainable soup of inline assembly.
The typical example used to claim C++ is faster than C is sorting: C, lacking templates and overloading, needs `qsort` to work with void pointers and a pointer to function, which is very hard on the optimiser, while C++'s `std::sort` knows the actual types it works on and can inline the comparator directly, making the optimiser's job easier.
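For what it's worth, Rust sits on the C++ side of this divide: its sort functions are generic over the comparator type, so the comparison can be inlined much like `std::sort`'s typed comparator. A minimal sketch:

```rust
fn main() {
    let mut v = vec![3u32, 1, 2];
    // The closure has its own unique (zero-sized) type, so
    // sort_unstable_by is monomorphized for it and the comparison
    // can be fully inlined, unlike qsort's void-pointer callback.
    v.sort_unstable_by(|a, b| a.cmp(b));
    assert_eq!(v, [1, 2, 3]);
}
```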
That said, external comparators are a weakness of generic C library functions. I once manually inlined them in some performance critical code using the C preprocessor:
https://github.com/openzfs/zfs/commit/677c6f8457943fe5b56d7a...
The performance gain comes not from eliminating the function overhead, but enabling conditional move instructions to be used in the comparator, which eliminates a pipeline hazard on each loop iteration. There is some gain from eliminating the function overhead, but it is tiny in comparison to eliminating the pipeline hazard.
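The idiom in question (not the ZFS code itself, just the general trick) is a branchless three-way compare, which the compiler can lower to flag tests and conditional moves instead of an unpredictable branch. In Rust it might look like:

```rust
// Branchless three-way comparator: (a > b) - (a < b) evaluates both
// comparisons unconditionally, so there is no data-dependent branch
// for the CPU to mispredict.
fn cmp_u64(a: u64, b: u64) -> i32 {
    (a > b) as i32 - (a < b) as i32
}

fn main() {
    assert_eq!(cmp_u64(1, 2), -1);
    assert_eq!(cmp_u64(2, 2), 0);
    assert_eq!(cmp_u64(3, 2), 1);
}
```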
That said, C++ has its weaknesses too, particularly in its typical data structures, its excessive use of dynamic memory allocation and its exception handling. I gave an example here:
https://news.ycombinator.com/item?id=43827857
Honestly, I think these weaknesses are more severe than qsort being unable to inline the comparator.
This does not work if the compiler cannot look into the function, but the same is true in C++.
Edit: It sort of works for the bsearch() standard library function:
https://godbolt.org/z/3vEYrscof
However, it optimized the binary search into a linear search. I wanted to see it implement a binary search, so I tried with a bigger array:
https://godbolt.org/z/rjbev3xGM
Now it calls bsearch instead of inlining the comparator.
That's not the most general case, but it's better than I expected.
While I do not doubt some C++ code uses intrusive data structures, I doubt very much of it does. Meanwhile, C code using <sys/queue.h> uses intrusive lists as if they were second nature. C code using <sys/tree.h> from libbsd uses intrusive trees as if they were second nature. There is also the intrusive AVL trees from libuutil on systems that use ZFS and there are plenty of other options for such trees, as they are the default way of doing things in C. In any case, you see these intrusive data structures used all over C code and every time one is used, it is a performance win over the idiomatic C++ way of doing things, since it skips an allocation that C++ would otherwise do.
The use of intrusive data structures also can speed up operations on data structures in ways that are simply not possible with idiomatic C++. If you place the node and key in the same cache line, you can get two memory fetches for the price of one when sorting and searching. You might even see decent performance even if they are not in the same cache line, since the hardware prefetcher can predict the second memory access when the key and node are in the same object, while the extra memory access to access a key in a C++ STL data structure is unpredictable because it goes to an entirely different place in memory.
You could say that if you have the C++ STL allocate the objects, you can avoid this, but you can only do that for one data structure. If you want the object to be in multiple data structures (which is extremely common in C code that I have seen), you are back to inefficient search/traversal. Your object lifetime also becomes tied to that data structure, so you must be certain in advance that you will never want to use it outside of that data structure, or else you must do, at a minimum, another memory allocation and some copies that are completely unnecessary in C.
Exception handling in C++ also can silently kill performance if you have many exceptions thrown and the code handles it without saying a thing. By not having exception handling, C code avoids this pitfall.
Citation needed.
That's interesting, did ChatGPT tell you this?
In my experience, most people who don't want a JVM language "because it is slow" tend to take this as a principle, and when you ask why their first answer is "because it's interpreted". I would say they are stuck in the 90s, but probably they just don't know and repeat something they have heard.
Similar to someone who would say "I use Gentoo because Ubuntu sucks: it is super slow". I have many reasons to like Gentoo better than Ubuntu as my main distro, but speed isn't one in almost all cases.
It also struggles with numeric work involving large matrices, because there isn't good support for that built into the language or standard library, and there isn't a well-developed library like NumPy to reach for.
They're far slower in Python but that hasn't stopped anyone.
I was a Gentoo user (daily driver) for around 15 years, but the endless compilation cycles finally got to me. It is such a shame because, as I started to depart, Gentoo really got its arse in gear with things like user patching etc., and it's no doubt even better now.
It has literally (lol) just occurred to me that some sort of dual partition thing could sort out my main issue with Gentoo.
@system could have two partitions - the running one and the next one that is compiled for and then switched over to on a reboot. @world probably ought to be split up into bits that can survive their libs being overwritten with new ones and those that can't.
Errrm, sorry, I seem to have subverted this thread.
But if you want to do a difficult and complicated thing, then Rust is going to raise the guard rails. Your program won't even compile if it's memory-unsafe. It won't let you build a buggy app. So now you need to back up and decide whether you want it to be easy or you want it to be correct.
Yes, Rust is hard. But it doesn't have to be if you don't want it to be.
This just isn’t a problem in other languages I’ve used, which granted aren’t as safe.
I love Rust. But saying it’s only hard if you are doing hard things is an oversimplification.
Querying a database while ensuring type safety is harder, but you still don't need an ORM for that. See sqlx.
The whole point of Rust is to bring memory safety with zero cost abstraction. It's essentially bringing memory safety to the use-cases that require C/C++. If you don't require that, then a whole world of modern languages becomes available :-).
Plus, in the meantime, even if I'm doing the "easy mode" approach, I get to use all of the features I enjoy about writing in Rust: generics, macros, sum types, pattern matching, Result/Option types. Few managed/GC'd languages offer all of these together, and the list of those I would consider viable for my personal or professional use is quite sparse.
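As a quick illustration of the features listed above (the names here are invented, not from any real backend), sum types plus exhaustive matching look like this:

```rust
// Invented example: an error type expressed as a sum type.
enum LookupError {
    NotFound,
    Timeout,
}

fn find_user(id: u32) -> Result<String, LookupError> {
    match id {
        1 => Ok("alice".to_string()),
        2 => Err(LookupError::Timeout),
        _ => Err(LookupError::NotFound),
    }
}

fn main() {
    // The compiler rejects this match if any variant is unhandled.
    let msg = match find_user(1) {
        Ok(name) => format!("hello, {name}"),
        Err(LookupError::NotFound) => "no such user".to_string(),
        Err(LookupError::Timeout) => "try again".to_string(),
    };
    assert_eq!(msg, "hello, alice");
}
```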
Writing web service backends is one domain where Rust absolutely kicks ass. I would choose Rust/(Actix or Axum) over Go or Flask any day. The database story is a little rough around the edges, but it's getting better and SQLx is good enough for me.
edit: The downvoters are missing out.
I am absolutely convinced I can find success story of web backends built with all those languages.
The type system and package manager are a delight, and writing with sum types results in code that is measurably more defect-free than in languages with nulls.
Other than the great developer experience in tooling and language ergonomics (as in coherent features not necessarily ease of use) the reason I continue to put up with the difficulties of Rust's borrow checker is because I feel I can work towards mastering one language and then write code across multiple domains AND at the end I'll have an easy way to share it, no Docker and friends needed.
But I don't shy away from the downsides. Rust loads the cognitive burden at the ends. Hard as hell in the beginning when learning it and most people (me included) bounce from it for the first few times unless they have C++ experience (from what I can tell). At the middle it's a joy even when writing "throwaway" code with .expect("Lol oops!") and friends. But when you get to the complex stuff it becomes incredibly hard again because Rust forces you to either rethink your design to fit the borrow checker rules or deal with unsafe code blocks which seem to have their own flavor of C++ like eldritch horrors.
Anyway, would *I* recommend Rust to everyone? Nah, Go is a better proposition as a most-bang-for-your-buck language, tooling and ecosystem, UNLESS you're the kind who likes to deal with complexity for the fulfilled promise of one language for almost anything. In even simpler terms: Go is good for most things; Rust can be used for everything.
Also stuff like Maud and Minijinja for Rust are delights on the backend when making old fashioned MPA.
Thanks for coming to my TED talk.
When things get complex, you start missing Rust's type system and bugs creep in.
In Node.js there was a notable improvement when TS became the de-facto standard and API development improved significantly (if you ignore the poor tooling: transpiling, building, TS being too slow). It's still far from perfect, because TS has too many escape hatches and you can't fully trust TS code; with Rust, if it compiles and there is no unsafe (which is rarely a problem in web services), you get a lot of compile-time guarantees for free.
Rust is an absolute gem at web backend. An absolute fucking gem.
Rust gamedev is the Wild West, and frontier development incurs the frontier tax. You have to put a lot of work into making an abstraction, even before you know if it’s the right fit.
Other “platforms” have the benefit of decades more work sunk into finding and maintaining the right abstractions. Add to that the fact that Rust is an ML in sheep’s clothing, and that games and UI in FP has never been a solved problem (or had much investment even), it’s no wonder Rust isn’t ready. We haven’t even agreed on the best solutions to many of these problems in FP, let alone Rust specifically!
Anyway, long story short, it takes a very special person to work on that frontier, and shipping isn’t their main concern.
There are so many QoL things which would make Rust better for gamedev without revamping the language. Just a mode to automatically coerce between numeric types would make Rust so much more ergonomic for gamedev. But that's a really hard sell (and might be harder to implement than I imagine.)
GHC has an -fdefer-type-errors option that lets you compile and run a definition like `a = 'a' :: Int`.
This obviously doesn't typecheck, since 'a' is not an Int, but it will run just fine because the value of `a` is never observed by the program. (If it were observed, -fdefer-type-errors guarantees you get a runtime panic at that point.) This basically gives you the no-types Python experience while iterating; then you clean it all up when you're done. It would be even better in cases where the fix can be automated: just like `cargo clippy --fix` automatically fixes lint errors whenever it can, there's no reason it couldn't also add explicit coercions of numeric types for you.
In my book, Rust is good at moving runtime-risk to compile-time pain and effort. For the space of C-Code running nuclear reactors, robots and missiles, that's a good tradeoff.
For the space of making an enemy move the other direction of the player in 80% of the cases, except for that story choice, and also inverted and spawning impossible enemies a dozen times if you killed that cute enemy over yonder, and.... and the worst case is a crash of a game and a revert to a save at level start.... less so.
And these are very regular requirements in a game, tbh.
And a lot of _very_silly_physics_exploits_ are safely typed float interactions going entirely nuts, btw. Type safety doesn't help there.
I don't think your experience with Amethyst merits your conclusion of the state of gamedev in rust, especially given Amethysts own take on Bevy [1, 2].
1: https://web.archive.org/web/20220719130541mp_/https://commun...
2: https://web.archive.org/web/20240202140023/https://amethyst....
C# is stricter about float vs. double for literals than Rust is, and the default in C# (double) is the opposite of the one you want for gamedev. That hasn't stopped Unity from gaining enormous market share. I don't think this is remotely near the top issue.
There is a very certain way Rust is supposed to be used, which is a negative on its own, but it will lead to a fulfilling and productive programming experience. (My opinion) If you need to regularly index something, then you're using the language wrong.
Long story short, yes, it's very different in game dev. It's very common to pre-allocate space for all your working data as large statically sized arrays because dynamic allocation is bad for performance. Oftentimes the data gets organized in parallel arrays (https://en.wikipedia.org/wiki/Parallel_array) instead of in collections of structs. This can save a lot of memory (because the data gets packed more densely), be more cache-friendly, and make it much easier to use SIMD instructions efficiently.
This is also fairly common in scientific computing (which is more my wheelhouse), and for the same reason: it's good for performance.
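A small sketch of the parallel-array ("struct of arrays") layout, with field names invented for illustration:

```rust
// Struct of arrays: each field lives in its own densely packed Vec,
// so a pass that touches only some fields streams through contiguous
// memory and is easy for the compiler to vectorize.
struct ParticlesSoa {
    xs: Vec<f32>,
    vxs: Vec<f32>,
}

impl ParticlesSoa {
    fn integrate(&mut self, dt: f32) {
        // Tight loop over two columns; no unused fields are dragged
        // through the cache, unlike an array of full structs.
        for (x, vx) in self.xs.iter_mut().zip(&self.vxs) {
            *x += vx * dt;
        }
    }
}

fn main() {
    let mut p = ParticlesSoa { xs: vec![0.0, 1.0], vxs: vec![2.0, 2.0] };
    p.integrate(0.5);
    assert_eq!(p.xs, vec![1.0, 2.0]);
}
```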
That seems like something that could very easily be turned into a compiler optimisation and enabled with something like an annotation. It would have some issues when calling across library boundaries (a lot like the handling of gradual types), but within a single codebase that'd be easy.
* Everything should be random access(because you want to have novel rulesets and interactions)
* It should also be fast to iterate over per-frame(since it's real-time)
* It should have some degree of late-binding so that you can reuse behaviors and assets and plug them together in various ways
* There are no ideal data structures to fulfill all of this across all types of scene, so you start hacking away at something good enough with what you have
* Pretty soon you have some notion of queries and optional caching and memory layouts to make specific iterations easier. Also it all changes when the hardware does.
* Congratulations, you are now the maintainer of a bespoke database engine
You can succeed at automating parts of it, but note that parent said "oftentimes", not "always". It's a treadmill of whack-a-mole engineering, just like every other optimizing compiler; the problem never fully generalizes into a right answer for all scenarios. And realistically, gamedevs probably haven't come close to maxing out what is possible in a systems-level sense of things since the 90's. Instead we have a few key algorithms that go really fast and then a muddle of glue for the rest of it.
(It's also not always a win: it can work really well if you primarily operate on the 'columns', and on each column more or less once per update loop, but otherwise you can run into memory bandwidth limitations. For example, games with a lot of heavily interacting systems and an entity list that doesn't fit in cache will probably be better off with trying to load and update each entity exactly once per loop. Factorio is a good example of a game which is limited by this, though it is a bit of an outlier in terms of simulation size.)
At least on the scientific computing side of things, having the way the code says the data is organized match the way the data is actually organized ends up being a lot easier in the long run than organizing it in a way that gives frontend developers warm fuzzies and then doing constant mental gymnastics to keep track of what the program is actually doing under the hood.
I think it's probably like sock knitting. People who do a lot of sock knitting tend to use double-pointed needles. They take some getting used to and look intimidating, though. So people who are just learning to knit socks tend to jump through all sorts of hoops and use clever tricks (like magic loop) to allow them to continue using the same kind of knitting needles they're already used to. From there it can go two ways: either they get frustrated, decide sock knitting is not for them, and go back to knitting other things; or they get frustrated, decide magic loop is not for them, and learn how to use double-pointed needles.
In general I think GP is correct. There is some subset of problems that absolutely requires indexing to express efficiently.
However, this problem does still come up in iterator contexts. For example, Iterator::take takes a usize.
Concrete example: pulling a single item out of a zip file, which supports random access, is O(1). Pulling a single item out of a *.tar.gz file, which can only be accessed by iterating it, is O(N).
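The same asymmetry is easy to demonstrate in-memory with a plain Vec; a sketch:

```rust
fn main() {
    let v: Vec<u32> = (0..1_000).collect();
    // Indexed access: O(1), like pulling one member out of a zip.
    let by_index = v[500];
    // Iterator access: walks 501 elements, like scanning a tar stream.
    let by_iter = v.iter().copied().nth(500).unwrap();
    assert_eq!(by_index, by_iter);
}
```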
Compressed tars are terrible for random access because the compression occurs after the concatenation and so knows nothing about inner file metadata, but they are good for streaming and backups. Uncompressed tars are much better for random access. (Tar was used as a backup mechanism for tape; the name is short for "tape archive".)
Zips are terrible for streaming because their metadata is stored at the end, but are better for 1-pass creation and on-disk random access. (Remember that zip files and programs were created in an era of multiple floppy disk-based backups.)
When fast tar enumeration is desired, at the cost of compatibility and compression potential, it might be worth compressing files and then tarring them, when and if zipping alone isn't achieving enough compression and/or decompression performance. FUSE compressed tar mounting gets really expensive with terabyte archives.
Just use squashfs if that is the functionality that you need.
I'm usually working with positive values, and almost always with values within the range of integers f32 can safely represent (+- 16777216.0).
I want to be able to write `draw(x, y)` instead of `draw(x as u32, y as u32)`. I want to write "3" instead of "3.0". I want to stop writing "as".
It sounds silly, but it's enough to kill that gamedev flow loop. I'd love if the Rust compiler could (optionally) do that work for me.
[1] https://docs.rs/num-traits/latest/num_traits/trait.Num.html
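A sketch of the complaint and one possible workaround (the `draw` function here is invented for illustration, not a real API):

```rust
// Hypothetical pixel API that wants u32 coordinates.
fn draw(x: u32, y: u32) -> (u32, u32) {
    (x, y) // stand-in for actual drawing
}

// One workaround: accept anything convertible to f64 and do the
// truncating cast in one place instead of at every call site.
fn draw_any<X: Into<f64>, Y: Into<f64>>(x: X, y: Y) -> (u32, u32) {
    draw(x.into() as u32, y.into() as u32)
}

fn main() {
    let (x, y): (f32, f32) = (3.0, 4.5);
    // Today: "as" noise at every call site.
    assert_eq!(draw(x as u32, y as u32), (3, 4));
    // With the wrapper: mixed integer/float arguments just work.
    assert_eq!(draw_any(3u8, 4.5f32), (3, 4));
}
```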
Conversely and ironically, this is why I love Go. The language itself is so boring and often ugly, but it just gets out of the way and has the best in class tooling. The worst part is having seen the promised land of eg Rust enums, and not having them in other langs.
Feeling passionate about a programming language is generally bad for the products made with that language.
Cross compilation, package manager and associated infrastructure, async io (epoll, io_uring etc), platform support, runtime requirements, FFI support, language server, etc.
Are a majority of these things available with first party (or best in class) integrated tooling that are trivial to set up on all big three desktop platforms?
For instance, can I compile an F# lib to an iOS framework, ideally with automatically generated bindings for C, C++ or Objective C? Can I use private repo (ie github) urls with automatic overrides while pulling deps?
Generally, the answer to these questions for – let’s call it ”niche” asterisk – languages, are ”there is a GitHub project with 15 stars last updated 3 years ago that maybe solves that problem”.
There are tons of amazing languages (or at the very least, underappreciated language features) that didn’t ”make it” because of these boring reasons.
My entire point is that the older and grumpier I get, the less the language itself matters. Sure, I hate it when my favorite elegant feature is missing, but at the end of the day it’s easy to work around. IMO the navel gazing and bikeshedding around languages is vastly overhyped in software engineering.
Thing is, he didn't make the game in C. He built his game engine in C, and the game itself in Lua. The game engine is specific to this game, but there's a very clear separation where the engine ends and the game starts. This has also enabled amazing modding capabilities, since mods can do everything the game itself can do. Yes they need to use an embedded scripting language, but the whole game is built with that embedded scripting language so it has APIs to do anything you need.
For those who are curious - the game is 'Sapiens' on Steam: https://store.steampowered.com/app/1060230/Sapiens/
They're distributing their game on Steam too so Linux support is next to free via Proton.
Non-issue. Pick a single blessed distro. Clearly state that it's the only configuration that you officially support. Let the community sort the rest out.
Still, given the nature of what my project is (APIs and basic financial stuff), I think it was the right choice. I still plan to write about 5% of the project in Rust and call it from Go, if required, as there is a piece of code that simply cannot be fast enough, but I estimate for 95% of the project Go will be more than fast enough.
Obligatory ”remember to `go run -race`”, that thing is a life saver. I never run into difficult data races or deadlocks and I’m regularly doing things like starting multiple threads to race with cancelation signals, extending timeouts etc. It’s by far my favorite concurrency model.
• Go uses its own custom ABI and resizeable stacks, so there's some overhead to switching where the "Go context" must be saved and some things locked.
• Go's goroutines are a kind of preemptive green thread where multiple goroutines share the same OS thread. When calling C, the goroutine scheduler must jump through some hoops to ensure that this caller doesn't stall other goroutines on the same thread.
Calling C code from Go used to be slow, but over the last 10 years much of this overhead has been eliminated. In Go 1.21 (which came with major optimizations), a C call was down to about 40ns [1]. There are now some annotations you can use to further help speed up C calls.
[1] https://shane.ai/posts/cgo-performance-in-go1.21/
In Unity, Mono and/or IL2CPP's interop mechanism also ends up in the ballpark of direct call cost.
And chances are that it won't be required.
I too have a hobby-level interest in Rust, but doing things in Rust is, in my experience, almost always just harder. I mean no slight to the language, but this has universally been my experience.
Perhaps someday there will be a comparable game engine written in Rust, but it would probably take a major commercial sponsor to make it happen.
This was more of a me-problem, but I was constantly having to change my strategy to avoid fighting the borrow-checker, manage references, etc. In any case, it was a productivity sink.
you want to make some change, so you adjust a struct or a function signature, and then your IDE highlights all the places where changes are necessary with red squigglies.
Once you’re done playing whack-a-mole with the red squigglies, and tests pass, you know there’s no weird random crash hiding somewhere
That's not to say that games aren't a very cool space to be in, but the challenges have moved beyond the code. Particularly in the indie space, for 10+ years it's been all about story, characters, writing, artwork, visual identity, sound and music design, pacing, unique gameplay mechanics, etc. If you're making a game in 2025 and the hard part is the code, then you're almost certainly doing it wrong.
I think all posts I have seen regarding migrating away from writing a game in Rust were using Bevy, which is interesting. I do think Bevy is awesome and great, but it's a complex project.
Hot reloading! Iteration!
A friend of mine wrote an article 25+ years ago about using C++-based scripting (it compiled to C++). My friend is a super smart engineer, but I don't think he was thinking of those poor scripters who would have to wait on iteration times. Granted, 25 years ago the teams were small, but nowadays the number of scripters on a AAA game is probably a dozen, if not two or three dozen or even more!
Imagine all of them waiting on compile... Or trying to deal with correctness, etc.
Many of the negatives in the post are positives to me.
> Each update brought with it incredible features, but also a substantial amount of API thrash.
This is highly annoying, no doubt, but the API now is just so much better than it used to be. Keeping backwards compatibility is valuable once a product is mature, but like how you need to be able to iterate on your game, game engine developers need to be able to iterate on their engine. I admit that this is a debuff to the experience of using Bevy, but it also means that the API can actually get better (unlike Unity which is filled with historical baggage, like the Text component).
From a dev perspective, I think, Rust and Bevy are the right direction, but after reading this account, Bevy probably isn't there yet.
For a long time, Unity games felt sluggish and bloated, but somehow they got that fixed. I played some games lately that run pretty smoothly on decade old hardware.
This is the biggest reason I push for C#/.NET in "serious business" where concerns like auditing and compliance are non-negotiable aspects of the software engineering process. Virtually all of the batteries are included already.
For example, which 3rd party vendors we use to build products is something that customers in sectors like banking care deeply about. No one is going to install your SaaS product inside their sacred walled garden if it depends on parties they don't already trust or can't easily vet themselves. Microsoft is a party that virtually everyone can get on board with in these contexts. No one has to jump through a bunch of hoops to explain why the bank should trust System or Microsoft namespaces. Having ~everything you need already included makes it an obvious choice if you are serious about approaching highly sensitive customers.
Log4shell was a good example of a relative strength of .NET in this area. If a comparable bug had happened in .NET's standard logging tooling, we likely would have seen all of the first-party .NET framework patched fairly shortly after, in a single coordinated release that we could upgrade to with minimal fuss. Meanwhile, at my current job we've still got standing exceptions allowing vulnerable version of log4j in certain services because they depend on some package that still has a hard dependency on a vulnerable version, which they in turn say they can't fix yet because they're waiting on one of their transitive dependencies to fix it, and so on. We can (and do) run periodic audits to confirm that the vulnerable parts of log4j aren't being used, but being able to put the whole thing in the past within a week or two would be vastly preferable to still having to actively worry about it 5 years later.
The relative conciseness of C# code that the parent poster mentioned was also a factor. Just shooting from the hip, I'd guess that I can get the same job done in about 2/3 as much code when I'm using C# instead of Java. Assuming that's accurate, that means that with Java we'd have had 50% more code to certify, 50% more code to maintain, 50% more code to re-certify as part of maintenance...
Less time wasted sifting through half-baked solutions.
C# has three "super powers" to reduce code bloat: really rich runtime reflection, first-class expression trees, and Roslyn source generators to generate code on the fly. Used correctly, these can remove a lot of boilerplate and "templatey" code.
---
I make the case that many teams that outgrow JS/TS on Node.js should look to C# because of its congruence to TS[0] before Go, Java, Kotlin, and certainly not Rust.
[0] https://typescript-is-like-csharp.chrlschn.dev/
Why would you use Java 8?
My understanding (not having used it much, precisely because of this) is that AOT is still quite lacking; not very performant and not so seamless when it comes to cross-platform targeting. Do you know if things have gotten better recently?
I think that if Microsoft had dropped the old .NET platform (CLR and so on) sooner and really nailed the AOT experience, they may have had a chance at competing with Go and even Rust and C++ for some things, but I suspect that ship has sailed, as it has for languages like D and Nim.
C# and .NET are one of the most mature platforms for development of any kind. It's just that online, it carries some sort of anti-Microsoft stigma...
But a lot of AA or indie games are written in C# and they do fine. It's not just C++ or Rust in that industry.
People tend to be influenced by opinions online but often the real world is completely different. Been using C# for a decade now and it's one of the most productive language I have ever used, easy to set up, powerful toolchains... and yes a lot of closed source libs in the .net ecosystem but the open source community is large too by now.
Some folks still think it's Windows-only. Some folks think you need to use Visual Studio. Some think it's too hard to learn. Lots of misconceptions lead to teams overlooking it for more "hyped" languages like Rust and Go.
I think there may also be some misunderstandings regarding the purchase models around these tools. Visual Studio 2022 Professional is possible to outright purchase for $500 [0] and use perpetually. You do NOT need a subscription. I've got a license key printed on paper that I can use to activate my copy each time.
Imagine a plumber or electrician spending time worrying about the ideological consequences of purchasing critical tools that cost a few hundred dollars.
[0] https://www.microsoft.com/en-us/d/visual-studio-professional...
How's the LSP support nowadays? I remember reading a lot of complaints about how badly done the LSP is compared to Visual Studio.
I started using Visual Studio Code exclusively around 2020 for C# work and it's been great. Lightweight and fast. I did try Rider and 100% it is better if you are open to paying for a license and if you need more powerful refactoring, but I find VSC to be perfectly usable and I prefer its "lighter" feel.
I mean, you could also write about how we went from 1 million lines of C# code in our mostly custom engine to 10k lines in Unreal C++.
Similarly, anyone who has shipped a game in Unreal will know that memory issues are absolutely rampant during development.
But the cure Rust presents to solve these for games seems worse than the disease. I don't have a magic bullet either...
Other game engines exist which use C# with .NET or at least Mono's better GC. When using these engines a few allocations won't turn your game into a stuttery mess.
Just wanted to make it clear that C# is not the issue - just the engine most people use, including the topic of this thread, is the main issue.
I have worked as a professional dev at game studios many would recognize. Those studios which used Unity didn't even upgrade Unity versions often unless a specific breaking bug got fixed. Same for those studios which used DirectX. Often a game shipped with a version of the underlying tech that was hard locked to something several years old.
The other points in the article are all valid, but the two factors above held the greatest weight as to why the project needed to switch (and the article says so -- it was an API change in Bevy that was "the straw that broke the camel's back").
https://bevyengine.org/learn/quick-start/introduction/
The more projects I do, the more time I find that I dedicate to just planning things up front. Sometimes it's fun to just open a game engine and start playing with it (I too have an unfair bias in this area, but towards Godot [https://godotengine.org/]), but if I ever want to build something to release, I start with a spreadsheet.
But if you're doing something for fun, then you definitely don't need much planning, if any - the project will probably be abandoned halfway through anyways :)
The problem is you make a deal with the devil. You end up shipping a binary full of phone-home spyware, and if you don't use Unity in the exact way the general license intends, they can and will try to force you into the more expensive industrial license.
However, the ease of actually shipping a game can't be matched.
Godot has a bunch of issues all over the place, a community more intent on self praise than actually building games. It's free and cool though.
I don't really enjoy Godot like I enjoy Unity, but I've been using Unity for over a decade. I might just need to get over it.
Bevy is in its early stages. I'm sure more Rust game engines will come along and make things easier. Godot was a great experience for me, but it doesn't run well on mobile for what I was making. I enjoy using Flutter Flame now (honestly, different game engines suit different genres or preferences), but as Godot continues to get better I would personally use Godot, or try Unity or Unreal if I just want to focus on making a game and less on engine quirks and bugs.
There is no chance for any language, no matter how good it is, to match the most horrendous (the web!) but full-featured UI toolkit.
I'd bet, 1000%, that it is easier to build an OS, a database engine, etc., than to match Qt, Delphi, Unity, etc.
---
I made a decision that has become the most productive and problem-free approach to making UIs in my 30 years of doing this:
1- Use the de-facto UI toolkit as-is (HTML, SwiftUI, Jetpack Compose). Ignore any tool that promises cross-platform UI (so that means HTML, but to be clear: I don't try to do HTML in Swift, ok?).
2- Borrow the core idea of HTML: send plain data with the full fidelity of what you want to render: Label(text=.., size=..).
3- Render it directly from the native UI toolkit.
Yes, this is more or less htmx/tailwindcss (that's where I got the inspiration).
This means my logic is all in Rust; I pass serializable structs to the UI front-end and render directly from them. Critically, the UI toolkit is nearly devoid of any logic more complex than what you'd see in a mustache template language. It doesn't do localization, formatting, etc. Only UI composition.
I don't care that I need to code in different ways, with different APIs, different flows, and visually divergent UIs.
IT'S GREAT.
After the initial pain of boilerplate, doing the next screen/component/whatever is so ridiculously simple that it feels like cheating.
So the problem is not Rust. It's not F#, or Lisp. It's that UI is the kind of beast that is impervious to improvement by language alone.
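To make the pattern concrete, here is a minimal, std-only Rust sketch of the idea. The names `Widget`, `build_screen`, and `render` are invented for the example; a real version would use serde for the serializable structs and a native toolkit instead of a string renderer.

```rust
// UI described as plain data, produced by the core logic and handed to
// whatever native toolkit does the actual rendering.
#[derive(Debug, PartialEq)]
enum Widget {
    Label { text: String, size: u32 },
    Button { text: String },
}

// All logic (localization, formatting, ...) happens here, on the Rust side.
fn build_screen(user: &str) -> Vec<Widget> {
    vec![
        Widget::Label { text: format!("Hello, {user}"), size: 24 },
        Widget::Button { text: "Log out".to_string() },
    ]
}

// Stand-in for the native renderer: it only composes, it decides nothing.
fn render(widgets: &[Widget]) -> String {
    widgets
        .iter()
        .map(|w| match w {
            Widget::Label { text, size } => format!("<label size={size}>{text}</label>"),
            Widget::Button { text } => format!("<button>{text}</button>"),
        })
        .collect::<Vec<_>>()
        .join("\n")
}

fn main() {
    println!("{}", render(&build_screen("ada")));
}
```

The point is the boundary: everything up to `render` is plain data, so the native side stays as dumb as a mustache template.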
> this is why game engines embedded scripting languages
I 100% agree. A modern mature UI toolkit is at least equivalent to a modern game engine in difficulty. GitHub is strewn with the corpses of abandoned FOSS UI toolkits that got 80% of the way there only to discover that the other 20% of the problem is actually 20000% of the work.
The only way you have a chance developing a UI toolkit is to start in full self awareness of just how hard this is going to be. Saying "I am going to develop a modern UI toolkit" is like saying "I am going to develop a complete operating system."
Even worse: a lot of the work that goes into a good UI toolkit is the kind of work programmers hate: endless fixing of nit-picky edge case bugs, implementation of standards, and catering to user needs that do not overlap with one's own preferences.
That said regarding both rapid gameplay mechanic iteration and modding - would that not generally be solved via a scripting language on top of the core engine? Or is Rust + Bevy not supposed to be engine-level development, and actually supposed to solve the gameplay development use-case too? This is very much not my area of expertise, I'm just genuinely curious.
I feel like this harkens to the general principle of being a software developer and not an "<insert-language-here>" developer.
Choose tools that expose you to more patterns and help to further develop your taste. Don't fixate on a particular syntax.
> Bevy is still in the early stages of development. Important features are missing. Documentation is sparse. A new version of Bevy containing breaking changes to the API is released approximately once every 3 months.
I would choose Bevy if and only if I would like to be heavily involved in the development of Bevy itself.
And never for anything that requires a steady foundation.
Programming language does not matter. Choose the right tool for job and be pragmatic.
Coroutines can definitely be very useful for games and they're also available in C#.
https://news.ycombinator.com/item?id=43787012
In my personal opinion, a paradox of truly open-source projects (meaning community projects, not pseudo-open-source from commercial companies) is that development seems to show a tendency of diversity. While this leads to more and more cool things appearing, there always needs to be a balance with sustainable development.
Commercial projects, at least, always have a clear goal: to sell. For this goal, they can hold off on doing really cool things, or they think about differentiated competition. Perhaps if the purpose is commercial, an editor would be the primary goal (let me know if this is already on the roadmap).
---
I don't think the language itself is the problem. The situation where you have to use mature solutions for efficiency is more common in games and apps.
For example, I've seen many people who have had to give up Bevy, Dioxus, and Tauri.
But I believe for servers, audio, CLI tools, and even agent systems, Rust is absolutely my first choice.
I've recently been rewriting Glicol (https://glicol.org) after 2 years. I start from embedded devices, switching to crates like Chumsky, and I feel the ecosystem has improved a lot compared to before.
So I still have 100% confidence in Rust.
Gave up after 3 days for 3 reasons:
1. Refactoring and IDE tooling in general are still lightyears away from JetBrains tooling and a few astronomical units away from Visual Studio. Extract function barely works.
2. Crates with non-Rust dependencies are nearly impossible to debug as debuggers don't evaluate expressions. So, if you have a Rust wrapper for Ogg reader, you can't look at ogg_file.duration() in the debugger because that requires function evaluation.
3. In contrast to .NET and NuGet ecosystem, non-Rust dependencies typically don't ship with precompiled binaries, meaning you basically have to have fun getting the right C++ compilers, CMake, sometimes even external SDKs and manually setting up your environment variables to get them to build.
With these roadblocks I would never have gotten the "mature" project to the point, where dealing with hard to debug concurrency issues and funky unforeseen errors became necessary.
Depending on your scenario, you may want either one or another. Shipping pre-compiled binaries carries its own risks and you are at the mercy of the library author making sure to include the one for your platform. I found wiring up MSBuild to be more painful than the way it is done in Rust with cc crate, often I would prefer for the package to also build its other-language components for my specific platform, with extra optimization flags I passed in.
But yes, in .NET it creates sort of an impedance mismatch since all the managed code assemblies you get from your dependencies are portable and debuggable, and if you want to publish an application for a specific new target, with those it just works, be it FreeBSD or WASM. At the same time, when it works - it's nicer than having to build everything from scratch.
Going hard with Rust ECS was not the appropriate choice here. Even a 1000x speed hit would be preferable if it gained speed of development. C# and Unity is a much smarter path for this particular game.
But, that’s not a knock on Rust. It’s just “Right tool for the job.”
I think the worst issue was the lack of ready-made solutions. Those 67k lines of Rust contain a good chunk of a game engine.
The second-worst issue was targeting an unstable framework. I would have locked on a single version and shipped the entire game with it, no matter how good the goodies in the new version looked.
I know it's likely the last thing you want to do, but you might be in a great position to improve Bevy. I understand open sourcing it comes with IP challenges, but it would be good to find a champion with read access within Bevy to parse your code and come up with OSS packages (cleaned up with any specific game logic) based on the countless problems you must have solved in those extra 50k lines.
I rarely touch game dev but that made me think Godot wasn't very suitable
Would you really expect Godot to win out over Unity given those priorities? Godot is pretty awesome these days, but it's still going to be behind for those priorities vs. Unity or Unreal.
I've been toying with the idea of making a 2d game that I've had on my mind for awhile, but have no game development experience, and am having trouble deciding where to start (obviously wanting to avoid the author's predicament of choosing something and having to switch down the line).
Probably the best thing in your case is: look at the top three engines you could consider, spend maybe four hours gathering what look like pros and cons, then just pick one and go. Don't overestimate your attachment to your first choice. You'll learn more just from finishing a tutorial for any of them than you can possibly learn with analysis in advance.
Sometimes it feels like we could use some kind of a temperance movement, because if one can just manage to walk the line one can often reap great rewards. But the incentives seem to be pointing in the opposite direction.
You're probably right that it'd be best to just jump in and get going with a few of them rather than analyze the choice to death (as I am prone to do when starting anything).
But they also could have combined Rust parts and C# parts if they needed to keep some of what they had.
C# actually has fairly good null-checking now. Older projects would have to migrate some code to take advantage of it, but new projects are pretty much using it by default.
I'm not sure what the situation is with Unity though - aren't they usually a few versions behind the latest?
On the topic of rapid prototyping: most successful game engines I'm aware of hit this issue eventually. They eventually solve it by dividing into infrastructure (implemented in your low-level language) and game logic / application logic / scripting (implemented in something far more flexible and, usually, interpreted; I've seen Lua used for this, Python, JavaScript, and I think Unity's C# also fits this category?).
For any engine that would have used C++ instead, I can't think of a good reason to not use Rust, but most games with an engine aren't written in 100% C++.
https://fyrox.rs/
here's a web demo
https://fyrox.rs/assets/demo/animation/index.html
Is it normal for Rust ecosystem to suggest software with this level of maturity?
https://github.com/FyroxEngine/Fyrox/discussions/725
Rust will not:
* automatically make your program fast;
* eliminate memory leaks;
* eliminate deadlocks; or
* enforce your logical invariants for you.
Sometimes people mention that independent of performance and safety, Rust's pattern-matching and its traits system allow them to express logic in a clean way at least partially checked at compile time. And that's true! But other languages also have powerful type systems and expressive syntax, and these other languages don't pay the complexity penalty inherent in combining safety and manual memory management because they use automatic memory management instead --- and for the better, since the vast majority of programs out there don't need manual memory management.
I mean, sure, you can Arc<Box<Whatever>> many of your problems away, but at that point your global reference counting just becomes a crude form of manual garbage collection. You'd be better off with a finely tuned garbage collector instead, like the one Unity has (via the CLR and Mono).
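As a concrete illustration of the leak point: safe Rust happily lets reference-counting cycles outlive every handle to them. A minimal std-only sketch (`Node` and `make_cycle` are names invented for the example):

```rust
use std::cell::RefCell;
use std::rc::{Rc, Weak};

// A node that can point at another node, allowing a cycle to form.
struct Node {
    other: RefCell<Option<Rc<Node>>>,
}

fn make_cycle() -> Weak<Node> {
    let a = Rc::new(Node { other: RefCell::new(None) });
    let b = Rc::new(Node { other: RefCell::new(Some(a.clone())) });
    *a.other.borrow_mut() = Some(b.clone());
    // `a` and `b` now keep each other alive; dropping our handles leaks both.
    Rc::downgrade(&a)
}

fn main() {
    let observer = make_cycle();
    // Every strong handle has gone out of scope, yet the cycle still holds
    // both nodes: no compiler error, no reclamation.
    assert!(observer.upgrade().is_some());
}
```

Neither `Rc` nor the borrow checker reclaims the pair; you need a `Weak` edge, an arena, or an actual tracing collector.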
And you're not really giving anything up this way either. If you have some compute kernel that's a bottleneck, thanks to easy FFIs these high-level languages have, you can just write that one bit of code in a lower-level language without bringing systems consideration to your whole program.
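The "one hot kernel behind an FFI" idea can be sketched in a few lines: export a C-ABI function from Rust and let the high-level host (C# via P/Invoke, Java via JNI, etc.) call it. The `dot` function here is a hypothetical stand-in for whatever the real bottleneck is:

```rust
// A hot loop exported with a C ABI so a managed host can call into it.
#[no_mangle]
pub extern "C" fn dot(a: *const f32, b: *const f32, len: usize) -> f32 {
    // Safety: the caller promises both pointers are valid for `len` floats.
    let (a, b) = unsafe {
        (std::slice::from_raw_parts(a, len), std::slice::from_raw_parts(b, len))
    };
    a.iter().zip(b).map(|(x, y)| x * y).sum()
}

fn main() {
    // Exercising the kernel from Rust itself; a host language would pass
    // pinned arrays through its FFI instead.
    let (a, b) = ([1.0f32, 2.0, 3.0], [4.0f32, 5.0, 6.0]);
    assert_eq!(dot(a.as_ptr(), b.as_ptr(), 3), 32.0);
}
```

Built as a `cdylib`, this is callable from C#, Java, or Python without the rest of the program taking on systems-language concerns.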
Languages like Go, JavaScript, C#, or Java are much better choices for this purpose. Rust is still best suited for scenarios where traditional system languages excel, such as embedded systems or infrastructure software that needs to run for extended periods.
PS: I love the art style of the game.
Rust is a niche language, there is no evidence it is going to do well in the game space.
Unity and C# sound like a much better business choice for this. Choosing a system/language....
> My love of Rust and Bevy meant that I would be willing to bear some pain
....that is not a good business case.
Maybe one day there will be a Rust game engine that can compete with Unity, probably already are, in niches.
And if it's not already popular, that won't happen.
But also: there is a lot of Rust code out there! And a cubic fuckload of high-quality written material about the language, its idioms, and its libraries, many of which are pretty famous. I don't think this issue is as simple as it's being made out to be.
The same can be said of books as of programming languages:
"Not every ___ deserves to be read/used"
If the documentation or learning curve is so high and/or convoluted that it's disparaging to newcomers then perhaps it's just not a language that's fit for widespread adoption. That's actually fine.
"Thanks for your work on the language, but this one just isn't for me" "Thanks for writing that awfully long book, but this one just isn't for me"
There's no harm in saying either of those statements. You shouldn't be disparaged for saying that rust just didn't work out for your case. More power to the author.
If you switched from Java to C# or vice versa, nobody would care.
Availability of documentation and tooling, widespread adoption, and access to developers already trained on someone else's dime are what make a hiring decision feel safe. Sometimes niche tech is spotted in the wild, but it's mostly because some senior/staff engineer wanted to experiment and it became part of production when management saw no issue. That can open doors for practitioners of those stacks, but the probability is akin to getting hit by a lightning strike.
To me constantly chasing the latest trends means lack of experience in a team and absence of focus on what is actually important, which is delivering the product.
Until the tech catches up it will have a stifling effect on progress toward and adoption of new things (which imo is pretty common of new/immature tech, eg how culture has more generally kind of stagnated since the early 2000s)
However, I had a different takeaway when playing with Rust+AI. Having a language that has strict compile-time checks gave me more confidence in the code the AI was producing.
I did see Cursor get in an infinite loop where it couldn't solve a borrow checker problem and it eventually asked me for help. I prefer that to burying a bug.
Now Box2D 3.1 has been released and there's zero chance any of the LLMs are going to emit any useful answers that integrate the newly introduced features and changes.
In any case, there has always been a strong bias towards established technologies that have a lot of available help online. LLMs will remain better at using them, but as long as they are not completely useless on new technologies, they will also help enthusiasts and early adopters work with them and fill in the gaps.
LLMs will make people productive. But at the same time they will elevate those with real skill and passion to create good software. In the meantime there will be some market confusion, and some engineers who are mediocre might find themselves in demand like top-end engineers. But over time companies and markets will realize it, and top dollar will go to those select engineers who know how to do things with and without LLMs.
Lots of people are afraid of LLMs and think it is the end of the software engineer. It is and it is not. It's the end of the "CLI engineer" or the "front-end engineer" and all those specializations that were an attempt to require less skill and pay less. But the systems engineers who know how computers work, who can take all week describing what happens when you press Enter on a keyboard at google.com, will only be pushed into higher demand, because the single-skill "engineer" won't really be a thing.
tldr; LLMs won't kill software engineering; it's a reset. It will cull those who chose this path only because it paid well.
The problem is, they’re all blogspam rehashes of the same few WWDC talks. So they all have the same blindspots and limitations, usually very surface level.
I'm not saying you're definitely wrong, but if you think that LLMs are going to bring qualitative change rather than just another thing to consider, then I'm interested in why.
Another potentially interesting avenue of research would be to explore allowing LLMs to use "self-play" to explore new things.
[1]: https://huijzer.xyz/posts/killer-domain/
Unfortunately, with a lot of libraries and services, I don't think ChatGPT understands the differences between versions, and it would be hard for it to. At least I have found that when writing scriptlets for RT, PHP tooling, etc. The web world moves fast enough (and RT moves hella slow) that it confuses libraries and interfaces across versions.
It'd really need a wider project context where it can go look at how those includes, or functions, or whatever work instead of relying on 'built in' knowledge.
"Assume you know nothing, go look at this tool, api endpoint or, whatever, read the code, and tell me how to use it"
I wouldn't have read the article if it'd been labeled that, so kudos to the blog writer, I guess.
Although the points mentioned in the post are quite valid.
The amount of platforms they support, the amount of features they support, many of which could be a PhD thesis in graphics programming, the tooling, the store,....
Here's a thought experiment: Would Minecraft have been as popular if it had been written in Rust instead of Java?
While the language itself is great and stable, the ecosystem is not, and reverting to more conservative options is often the most reasonable choice, especially for long-term projects.
But outside of games the situation looks very different. “Almost everything” is just not at all accurate. There are tons of very stable and productive ecosystems in Rust.
I completely disagree, having been doing game dev in Rust for well over a year at this point. I've been extremely productive in Bevy, because of the ECS. And Unity compile times are pretty much just as bad (it's true, if you actually measure how long that dreaded "Reloading Domain" screen takes).
Replace Rust with Bevy and language with framework, you might have a point. Bevy is still in alpha, it's lacking plenty of things, mainly UI and an easy way to have mods.
As for almost everything being at an alpha stage: yeah, welcome to OSS + SemVer. Moving to 1.x makes a critical statement: it's ready for wider use, and now we take backwards compatibility seriously.
But hurray! Commercial interest won again, and now you have to change engines again, once the Unity Overlords decide to go full Shittification on your poorly paying ass.
You can also always go from 1.0 to 2.0 if you want to make breaking changes.
Yes. Because it makes a promise about backwards compatibility.
> Rust language and library features themselves often spend years in nightly before making it to a release build.
So did Java's. And Rust probably has a fraction of its budget.
In defense of long nightly periods: more than once, stabilizing a feature early (negative impls, never types) would have caused huge backwards-breaking changes.
> You can also always go from 1.0 to 2.0 if you want to make breaking changes.
Yeah, just like Python!
And split the community and double your maintenance burden. Or just pretend 2.0 is 1.1 and have the downstream enjoy the pain of migration.
If you choose to support 1.0 sure. But you don't have to. Overall I find that the Rust community is way too leery of going to 1.0. It doesn't have to be as big a burden as they make it out to be, that is something that comes down to how you handle it.
If you choose not to, then people wait for x.0 where x approaches infinity. I.e. they lose confidence in your crates/modules/libraries.
I mean, a big part of why I don't 1.x my OSS projects (not just Rust) is that I don't consider them finished yet.
I don't even look at crate versions but the stuff works, very well. The resulting code is stable, robust and the crates save an inordinate amount of development time. It's like lego for high end, high performance code.
With Rust and the crates you can build actual, useful stuff very quickly. Hit a bug in a crate or have missing functionality? contribute.
Software is something that is almost always a work in progress and almost never perfect, and done. It's something you live with. Try any of this in C or C++.
If not, the language they pick doesn't really make a difference in the end.
It is like complaining that playing a musical instrument in a band or orchestra requires too much effort; naturally it does.
Great musicians make a symphony out of what they can get their hands on.
From what I’ve heard about the Rust community, you may have made an unintentionally witty pun.
Quake 1-3 used a single array of structs, with sometimes-unused properties. Is your game more complex than Quake 3?
The "ECS" upgrade to that is having an array for each component type and just letting there be gaps.
But yeah, probably you don't need an ECS for 90% of the games.
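The per-component-array layout described above can be sketched in a few lines of Rust; `World`, `spawn`, and `damage_all` are names invented for the example, with `None` playing the role of a gap:

```rust
// Quake-style would be one array of fat structs with unused fields.
// The "ECS-lite" alternative: one array per component type, indexed by
// entity id, with gaps where an entity lacks that component.
struct World {
    positions: Vec<Option<(f32, f32)>>,
    healths: Vec<Option<i32>>,
}

impl World {
    fn spawn(&mut self) -> usize {
        self.positions.push(None);
        self.healths.push(None);
        self.positions.len() - 1
    }

    // A "system" iterates only over the arrays it cares about.
    fn damage_all(&mut self, amount: i32) {
        for h in self.healths.iter_mut().flatten() {
            *h -= amount;
        }
    }
}

fn main() {
    let mut world = World { positions: Vec::new(), healths: Vec::new() };
    let player = world.spawn();
    world.healths[player] = Some(100);
    let scenery = world.spawn(); // has a position but no health: a gap
    world.positions[scenery] = Some((3.0, 4.0));
    world.damage_all(10);
    assert_eq!(world.healths[player], Some(90));
    assert_eq!(world.healths[scenery], None);
}
```

Full ECS frameworks mostly add storage tricks (dense packing, archetypes) and scheduling on top of this shape.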
First: I have experience with Bevy and other game engine frameworks; including Unreal. And I consider myself a seasoned Rust, C etc developer.
I could sympathize with what was stated by the author.
I think the issue here is (mainly) Bevy. It is just not even close to the standard yet (if ever). It is hard for any generic game engine to compete with Unity/Godot, never mind the de facto standard, Unreal.
But if you are a C# developer already using Unity, rather than a C++ developer in Unreal, moving to Bevy, a bloated framework that is still missing features, makes little sense. [And there is also the minor issue that if you are a C# developer, honestly, you don't care about low-level code or going without a garbage collector.]
Now if you are a C++ developer using Unreal, the only reason to move to Rust (which I would otherwise argue for, for the usual reasons) would be if Unreal supported Rust. Otherwise, there is nothing that even compares to Unreal (custom-made engines aside).
https://old.reddit.com/r/rust_gamedev/comments/13wteyb/is_be...
I wonder how something simpler in the rust world like macroquad[0] would have worked out for them (superpowers from Unity's maturity aside).
[0] https://macroquad.rs/
From my experience, one has to take Rust discussions with a grain of salt, because shortcomings and disclosures are often handwaved and/or omitted.
And within rust, I've learned to look beyond the most popular and hyped tools; they are often not the best ones.
Bevy: unstable, constantly regressing, with weird APIs here and there, in flux, so LLMs can't handle it well.
Unity: rock-solid, stable, well-known, featureful, LLMs know it well. You ought to choose it if you want to build the game, not hack on the engine, be its internal language C#, Haskell, or PHP. The language is downstream from the need to ship.
I love Rust. It’s not for shipping video games. No Tiny Glade doesn’t count.
Edit: don’t know why you’re downvoting. I love Rust. I use it at my job and look for ways to use it more. I’ve also shipped a lot of games. And if you look at Steam there are simply zero Rust made games in the top 2000. Zero. None nada zilch.
Also you’re strictly forbidden from shipping Rust code on PlayStation. So if you have a breakout indie hit on Steam in Rust (which has never happened) you can’t ship it on PS5. And maybe not Switch although I’m less certain.
> And if you look at Steam there are simply zero Rust made games in the top 2000. Zero. None nada zilch.
Well, sure, if you arbitrarily exclude the popular game written in Rust, then of course there are no popular games written in Rust :)
> And maybe not Switch although I’m less certain.
I have talked to Nintendo SDK engineers about this and been told Rust is fine. It's not an official part of their toolchain, but if you can make Rust work they don't care.
Tiny Glade is indeed a rust game. So there is one! I am not aware of a second. But it’s not really a Bevy game. It uses the ECS crate from Bevy.
Egg on my face. Regrets.
And for something like Gnorp, Rust is probably a decent choice.
What allocations can you not do in Rust?
The Unity GameObject/Component model is pretty good. It's very simple, and clearly very successful. This architecture cannot be represented in Rust. There are a dozen ECS crates, but no one has replicated the world's most popular gameplay system architecture. Because they can't.
From what I remember from my Unity days (which granted, were a long time ago), GameObjects had their own lifecycle system separate from the C# runtime and had to be created and deleted using Destroy and Create calls in the Unity API. Similarly, components and references to them had to be created and retrieved using the GetComponent calls, which internally used handles, rather than being raw GC pointers. Runtime allocation of objects frequently caused GC issues, so you were practically required to pre-allocate them in an object pool anyway.
I don't see how any of those things would be impossible or even difficult to implement in Rust. In fact, this model is almost exactly what I used to see evangelized all the time for C++ engines (using safe handles and allocator pools) in GDC presentations back then.
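A minimal sketch of that handle-and-pool model in safe, std-only Rust, using generational indices; `Handle` and `Pool` are names invented for the example:

```rust
// A handle is an (index, generation) pair; a stale handle resolves to None
// instead of dangling, roughly what Unity's Destroy/GetComponent provide.
#[derive(Clone, Copy, PartialEq, Debug)]
struct Handle { index: usize, generation: u32 }

struct Pool<T> {
    slots: Vec<(u32, Option<T>)>, // (generation, payload)
}

impl<T> Pool<T> {
    fn new() -> Self { Pool { slots: Vec::new() } }

    fn create(&mut self, value: T) -> Handle {
        // A real pool would reuse freed slots; this sketch only appends.
        self.slots.push((0, Some(value)));
        Handle { index: self.slots.len() - 1, generation: 0 }
    }

    fn destroy(&mut self, h: Handle) {
        if let Some(slot) = self.slots.get_mut(h.index) {
            if slot.0 == h.generation {
                slot.0 += 1;      // invalidates every outstanding handle
                slot.1 = None;
            }
        }
    }

    fn get(&self, h: Handle) -> Option<&T> {
        self.slots
            .get(h.index)
            .filter(|slot| slot.0 == h.generation)
            .and_then(|slot| slot.1.as_ref())
    }
}

fn main() {
    let mut pool = Pool::new();
    let h = pool.create("enemy");
    assert_eq!(pool.get(h), Some(&"enemy"));
    pool.destroy(h);
    assert_eq!(pool.get(h), None); // stale handle: a safe miss, not a dangle
}
```

This is exactly the safe-handle-plus-allocator-pool pattern from those GDC-era C++ talks, and nothing about it fights the borrow checker.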
In my view, as someone who has not really interacted with or explored Rust gamedev much, the issue is more that Bevy has been attempting to present an overly ambitious API, as opposed to focusing on a simpler, less idealistic one, and since it is the poster child for Rust game engines, people keep tripping over those problems.
No offense to the project. It’s cool and I’m glad it exists. But if you were to plot the top 2000 games on Steam by time played there are, I believe, precisely zero written in Rust.
Tiny Glade is also the buggiest Steam game I've ever encountered (bugs from disappearing cursor to not launching at all). Incredibly poor performance as well for a low poly game, even if it has fancy lighting...
What evidence do you have for this statement? It kind of doesn't make any sense on its face. Binaries are binaries, no matter what tools are used to compile them. Sure, you might need to use whatever platform-specific SDK stuff to sign the binary or whatever, but why would Rust in particular be singled out as being forbidden?
Despite not being yet released publicly, Jai can compile code for PlayStation, Xbox, and Switch platforms (with platform-specific modules not included in the beta release, available upon request provided proof of platform SDK access).
Do you mean cyclic types?
Rust being low-level, nobody prevents one from implementing garbage-collected types, and I've been looking into this myself: https://github.com/Manishearth/rust-gc
It's "Simple tracing (mark and sweep) garbage collector for Rust", which allows cyclic allocations with simple `Gc<Foo>` syntax. Can't vouch for that implementation, but something like this would be good for many cases.
Scripting being flexible is a proper idea, but that's not an argument against Rust either. Rather it's an argument for more separation between scripting machinery and the core engine.
For example Godot allows using Rust for game logic if you don't want to use GDScript, and it's not really messing up the design of their core engine. It's just more work to allow such flexibility of course.
The rest of the arguments are more in the familiarity / learning curve group, so nothing new in that sense (Rust is not the easiest language).
The rest of your comment boils down to "skills issue". I mean, OK. But you can say that about any programming environment, including writing in raw assembly.
It's like saying: I don't want to learn how to write a good story, because AI always suggests a bad one anyway. Maybe that conveys the idea better.
Why would I valorize discarding this kind of automation? Is this just a craft vs. production thing? Like, the same reason I'd use only hand tools when doing joinery in Japanese-style woodworking? There's a place for that! But most woodworkers... use table saws and routers.
The strongest reason I can think of to discard this kind of automation, and do so proudly, is that it's effectively plagiarizing from all of the experts whose code was used in the training data set without their permission.
That implies that proponents of such an approach don't want to pursue learning that requires them to exceed the mediocrity level set by the AI they rely on.
For me it's obvious that it has a major negative impact on many things.
Why exclude AI dev tools from this decision making? If you don’t find such tools useful, then great, don’t use them. But not everybody feels the same way.
A friend of mine only understood why I was so impressed by LLMs once he had to start coding a website for his new project.
My feeling is that low-level / system programming is currently at the edge of what LLMs can do. So i'd say that languages that manage to provide nice abstractions around those types of problems will thrive. The others will have a hard time gaining support among young developers.