
digitalmars.D - Garbage Collection for Systems Programmers

reply Adam <adm98 gmail.com> writes:
Thought [this][1] was an interesting read, and is a good 
counterpoint to all those who refuse to give D a chance because 
of GC.

[1]: https://bitbashing.io/gc-for-systems-programmers.html
Mar 31
next sibling parent reply Anonymouse <zorael gmail.com> writes:
On Sunday, 31 March 2024 at 14:22:43 UTC, Adam wrote:
 Thought [this][1] was an interesting read, and is a good 
 counterpoint to all those who refuse to give D a chance because 
 of GC.

 [1]: https://bitbashing.io/gc-for-systems-programmers.html
We don't have a moving, recompacting, generational GC though. This doesn't invalidate the points he makes, but it's not quite apples vs apples. Incidentally [that's me to the left](https://assets.bitbashing.io/images/just-use-gc.jpg).
Mar 31
parent Walter Bright <newshound2 digitalmars.com> writes:
On 3/31/2024 8:52 AM, Anonymouse wrote:
 We don't have a moving, recompacting, generational GC though. This doesn't 
 invalidate the points he makes, but it's not quite apples vs apples.
D doesn't have a moving collector because if there is a pointer into a GC-allocated object, and the object is moved, then the pointer is pointing to garbage. There goes memory safety out the window. It is possible to detect those pointers and "pin" that particular allocation so it doesn't move, and I wrote such a collector long ago. It's called a partially compacting collector. (I thought I had invented this, but then later found a paper on it.) This could be done for D; it isn't that hard.

As for a generational collector, this relies on the collector being notified when an allocated object gets mutated. This requires adding a "write gate" to the generated code: all writes through pointers execute additional code to notify the collector that the object changed. This is fine if all memory is allocated via the collector. But in a mixed-allocator environment, you're looking at a severe performance cost. So D doesn't do that.

At one point I did implement a scheme whereby the GC's pages were marked "read only", so when a write was done, the CPU would seg fault. A handler was made for the seg fault which notified the GC that the page changed, and made the page writable again. This involved no extra code for writes. Unfortunately, benchmarking showed that it was way too slow to be viable. Oh well.
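The page-protection scheme described above can be sketched in D against the POSIX API. This is only an illustration of the mechanism, not DMD's actual implementation; the page bookkeeping and the fixed 4096-byte page size are assumptions:

```D
import core.sys.posix.signal;
import core.sys.posix.sys.mman;

__gshared void* gcPage;    // a page owned by the (hypothetical) collector
__gshared bool pageDirty;  // set when the "write gate" fires

// SIGSEGV handler acting as the write barrier: record the mutation,
// then re-enable writes so the faulting store can be retried.
extern(C) void onFault(int sig, siginfo_t* info, void* ctx)
{
    pageDirty = true;                                // notify the GC
    mprotect(gcPage, 4096, PROT_READ | PROT_WRITE);  // unprotect the page
}

void armWriteGate()
{
    sigaction_t sa;
    sa.sa_sigaction = &onFault;
    sa.sa_flags = SA_SIGINFO;
    sigaction(SIGSEGV, &sa, null);
    mprotect(gcPage, 4096, PROT_READ);  // mark the GC page read-only
}
```

Every first write to a protected page costs a fault plus two kernel round trips, which is the overhead that benchmarking showed to be too slow.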
Apr 08
prev sibling next sibling parent reply Lance Bachmeier <no spam.net> writes:
On Sunday, 31 March 2024 at 14:22:43 UTC, Adam wrote:
 Thought [this][1] was an interesting read, and is a good 
 counterpoint to all those who refuse to give D a chance because 
 of GC.

 [1]: https://bitbashing.io/gc-for-systems-programmers.html
It's a good counterpoint in some sense, but it argues about stuff like
 Slower than manual memory management
The first question to ask is "Does it matter one way or the other?" Since you're probably not writing the Linux kernel, even if you're engaged in 'systems programming', the answer is quite often no. If the answer is yes, the second question to ask is "Does it matter enough?"

I dislike these debates because most of those arguing against the GC are insufficiently informed to engage in a worthwhile debate. Many of them don't even understand that you can do things with a programming language other than write video games.
Mar 31
parent reply Adam Wilson <flyboynw gmail.com> writes:
On Monday, 1 April 2024 at 01:58:51 UTC, Lance Bachmeier wrote:
 On Sunday, 31 March 2024 at 14:22:43 UTC, Adam wrote:
 The first question to ask is "Does it matter one way or the 
 other?" Since you're probably not writing the Linux kernel, 
 even if you're engaged in 'systems programming', the answer is 
 quite often no. If the answer is yes, the second question to 
 ask is "Does it matter enough?"

 I dislike these debates because most of those arguing against 
 the GC are insufficiently informed to engage in a worthwhile 
 debate. Many of them don't even understand that you can do 
 things with a programming language other than write video games.
The discourse around the GC has gotten so ridiculous that I have seriously considered asking Walter to declare that "If you want to create an OS or a video game, consider a different language." OS/games is actually a fairly uncommon use of D, if you look at what the people who aren't whining endlessly about the GC are actually doing with it.

Personally, I blame the OS/game crowd for single-handedly keeping D out of the web service space for the past *decade* because, instead of improving the GC to the point that long-running processes are possible, we've built a mountain of (mis)features designed to assuage their demands. I maintain that this was probably the second biggest mistake in D's history.

We need to accept a fact that I learned at DConf 2017: the no-GC crowd will not be silenced until the GC is removed from the language. However, Walter has said the GC is here to stay. Therefore, the no-GC crowd will never be silenced. We've given them a plethora of tools to work without the GC. It is time that we stopped giving them so much of our time. We have bigger problems to solve.
Apr 01
next sibling parent monkyyy <crazymonkyyy gmail.com> writes:
On Monday, 1 April 2024 at 20:45:25 UTC, Adam Wilson wrote:
 
 Personally, I blame the OS/Game crowd for single-handedly 
 keeping D out of the web service space for the past *decade* 
 because, instead of improving the GC to the point that 
 long-running processes are possible, we've built a mountain of 
 (mis)features designed to assuage their demands. I maintain 
 that this was probably the second biggest mistake in D's 
 history.
Feel free to try D's wasm with a broken libc and std. The GC/allocator debate is just a proxy for the failure to merge data structures.
Apr 01
prev sibling next sibling parent reply Carl Sturtivant <sturtivant gmail.com> writes:
On Monday, 1 April 2024 at 20:45:25 UTC, Adam Wilson wrote:
 We need to accept a fact that I learned at DConf 2017, the 
 no-GC crowd will not be silenced until the GC is removed from 
 the language. However, Walter has said the GC is here to stay.
Perhaps such pseudo-debate in the forums could be shut down in a standard way. If a page was written explaining the D_Foundation/Walter's position and/or an FAQ with responses to bogus questions, then whenever the topic was raised the response could just be a link.
Apr 01
parent aberba <karabutaworld gmail.com> writes:
On Monday, 1 April 2024 at 22:11:09 UTC, Carl Sturtivant wrote:
 On Monday, 1 April 2024 at 20:45:25 UTC, Adam Wilson wrote:
 We need to accept a fact that I learned at DConf 2017, the 
 no-GC crowd will not be silenced until the GC is removed from 
 the language. However, Walter has said the GC is here to stay.
Perhaps such pseudo-debate in the forums could be shut down in a standard way. If a page was written explaining the D_Foundation/Walter's position and/or an FAQ with responses to bogus questions, then whenever the topic was raised the response could just be a link.
100% agree
Apr 01
prev sibling next sibling parent reply Paulo Pinto <pjmlp progtools.org> writes:
On Monday, 1 April 2024 at 20:45:25 UTC, Adam Wilson wrote:
 On Monday, 1 April 2024 at 01:58:51 UTC, Lance Bachmeier wrote:
 On Sunday, 31 March 2024 at 14:22:43 UTC, Adam wrote:
 The first question to ask is "Does it matter one way or the 
 other?" Since you're probably not writing the Linux kernel, 
 even if you're engaged in 'systems programming', the answer is 
 quite often no. If the answer is yes, the second question to 
 ask is "Does it matter enough?"

 I dislike these debates because most of those arguing against 
 the GC are insufficiently informed to engage in a worthwhile 
 debate. Many of them don't even understand that you can do 
 things with a programming language other than write video 
 games.
The discourse around the GC has gotten so ridiculous that I have seriously considered asking Walter to declare that "If you want to create an OS or a video game, consider a different language." OS/games is actually a fairly uncommon use of D, if you look at what the people who aren't whining endlessly about the GC are actually doing with it.

Personally, I blame the OS/game crowd for single-handedly keeping D out of the web service space for the past *decade* because, instead of improving the GC to the point that long-running processes are possible, we've built a mountain of (mis)features designed to assuage their demands. I maintain that this was probably the second biggest mistake in D's history.

We need to accept a fact that I learned at DConf 2017: the no-GC crowd will not be silenced until the GC is removed from the language. However, Walter has said the GC is here to stay. Therefore, the no-GC crowd will never be silenced. We've given them a plethora of tools to work without the GC. It is time that we stopped giving them so much of our time. We have bigger problems to solve.
crowds, regardless of what the world thinks is adequate for systems programming as a programming language, we have companies shipping real hardware being programmed in those languages.

On the Go side: the USB Armory security key, Arduino as a possible target, the Android GPU debugger, gVisor, and container-based distros with a Linux kernel + Go userspace.

On the Java side: PTC and Aicas bare-metal implementations with real-time GC, used even by the military in weapons control on battleships.

On Oberon: Astrobe has been selling compilers for ARM-based boards for about 20 years now.

Minecraft probably would never have happened if Notch had decided to listen to what would be the best programming language to write games in, instead of using the one he knew best. The same applies to other major successes like Stardew Valley.

D should just follow the same approach: have people shipping the projects they like, regardless of whether the rest of the world thinks it isn't systems programming because a GC is involved.
Apr 02
parent Sergey <kornburn yandex.ru> writes:
On Tuesday, 2 April 2024 at 07:17:56 UTC, Paulo Pinto wrote:
 D should just follow the same approach, have people shipping 
 the projects they like regardless if the rest of the world 
 thinks it isn't system programming because a GC is involved.
And we are coming to another part... where D is not about shipping products, but about research in compilers...
Apr 02
prev sibling next sibling parent Hipreme <msnmancini hotmail.com> writes:
On Monday, 1 April 2024 at 20:45:25 UTC, Adam Wilson wrote:
 On Monday, 1 April 2024 at 01:58:51 UTC, Lance Bachmeier wrote:
 On Sunday, 31 March 2024 at 14:22:43 UTC, Adam wrote:
 The first question to ask is "Does it matter one way or the 
 other?" Since you're probably not writing the Linux kernel, 
 even if you're engaged in 'systems programming', the answer is 
 quite often no. If the answer is yes, the second question to 
 ask is "Does it matter enough?"

 I dislike these debates because most of those arguing against 
 the GC are insufficiently informed to engage in a worthwhile 
 debate. Many of them don't even understand that you can do 
 things with a programming language other than write video 
 games.
The discourse around the GC has gotten so ridiculous that I have seriously considered asking Walter to declare that "If you want to create an OS or a video game, consider a different language." OS/games is actually a fairly uncommon use of D, if you look at what the people who aren't whining endlessly about the GC are actually doing with it.

Personally, I blame the OS/game crowd for single-handedly keeping D out of the web service space for the past *decade* because, instead of improving the GC to the point that long-running processes are possible, we've built a mountain of (mis)features designed to assuage their demands. I maintain that this was probably the second biggest mistake in D's history.

We need to accept a fact that I learned at DConf 2017: the no-GC crowd will not be silenced until the GC is removed from the language. However, Walter has said the GC is here to stay. Therefore, the no-GC crowd will never be silenced. We've given them a plethora of tools to work without the GC. It is time that we stopped giving them so much of our time. We have bigger problems to solve.
As a game developer, I could not care less about nogc features. I deliberately built the wasm runtime extension from Adam to use an allocation-only GC. Games can easily implement a game-level GC, where you control game entities and their specific resources and deallocate them when you want.

Of course, I would much prefer having a better GC in D. I have seen that plenty of algorithms that use memory allocation would perform better if we didn't have GC pauses. I'm not saying that I should not be allocating GC memory. I'm saying that seeing such a huge impact from the GC when I compare it to Java is quite saddening, especially given the effort involved.

I had my own implementation of an `assert`-like feature for my engine which used lazy parameters. My loop time was reduced by about 80% (not a big game) simply by removing those lazy parameters, because they were allocating every frame.
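The lazy-parameter cost described above comes from `lazy` being lowered to a delegate, whose closure can be heap-allocated at every call site. A sketch of the pattern and a non-allocating alternative (names are illustrative, not from Hipreme's engine):

```D
// Allocating version: `msg` is lowered to a delegate; capturing
// locals in it can allocate a closure on every call, even when
// `cond` is true and the message is never used.
void check(bool cond, lazy string msg)
{
    if (!cond) throw new Exception(msg);
}

// Alternative: take a `scope` delegate so the closure may stay on
// the stack, and only invoke it on the failure path.
void checkScope(bool cond, scope string delegate() msg)
{
    if (!cond) throw new Exception(msg());
}
```

On a per-frame hot path, the difference between these two signatures is exactly the kind of hidden allocation being described.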
Apr 02
prev sibling next sibling parent reply Guillaume Piolat <first.name gmail.com> writes:
On Monday, 1 April 2024 at 20:45:25 UTC, Adam Wilson wrote:
 We need to accept a fact that I learned at DConf 2017, the 
 no-GC crowd will not be silenced until the GC is removed from 
 the language. However, Walter has said the GC is here to stay. 
 Therefore, the no-GC crowd will never be silenced. We've given 
 them a plethora of tools to work without the GC. It is time 
 that we stopped giving them so much our time. We have bigger 
 problems to solve.
The problem is viewing people who use nogc as doing it out of a performance fetish, when really it makes no meaningful performance difference (2x memory consumption is usually "ok").

Instead, the Mir library, Dplug, Hipreme Engine, and soon Inochi2D are doing it simply for portability, because the regular druntime has insane requirements which the C standard library doesn't have. We went @nogc when druntime wouldn't even start on some macOS in shared-library form, not for any kind of performance reason.

If you want to use the GC and be portable, then (currently, in D today) you have to write your own D runtime. Of which there are 3 or 4 custom ones!

The solution is of course to lower the requirements of druntime so that it can run anywhere, be it in WebAssembly, on the PlayStation Vita, or on the Dreamcast, which is what people are doing nowadays (and they may or may not need @nogc).
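For scale, "writing your own D runtime" means supplying the extern(C) hooks the compiler emits calls to. A few representative declarations are sketched below; exact names and signatures vary across compilers and druntime versions, so treat these as illustrative:

```D
// Hooks a minimal custom runtime ends up providing (illustrative):
extern(C) void* _d_allocmemory(size_t sz);      // raw GC/heap allocation
extern(C) void _d_assert(string file, uint line); // assertion failure handler
extern(C) void _d_arraybounds(string file, uint line); // bounds-check failure
```

Because these hooks are undocumented and version-dependent, each of the 3 or 4 custom runtimes has to rediscover and track them, which is the lock-in problem raised in the replies below.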
Apr 02
next sibling parent Guillaume Piolat <first.name gmail.com> writes:
Perhaps I misunderstood the GP post; indeed there are now many 
tools to work with the GC better, profile it, and avoid it, and it 
has even gotten faster in quiet ways.
Apr 02
prev sibling next sibling parent reply "Richard (Rikki) Andrew Cattermole" <richard cattermole.co.nz> writes:
On 03/04/2024 2:23 AM, Guillaume Piolat wrote:
 If you want to use GC and be portable, then (currently in D today) you 
 have to write your own D runtime. Of which there is 3 or 4 custom ones!
And worse still, they each have to implement the compiler hooks, when all they really need to implement is stuff like memory allocation! Locking people into a specific compiler version makes people's lives harder than it needs to be. Splitting the hooks out into a compiler-adjacent library, with a well-defined API for runtimes to implement, will make custom runtime writers' lives a lot easier. It'll also mean faster builds, since less stuff is in object.d, so there are lots of wins here.
Apr 02
next sibling parent reply "H. S. Teoh" <hsteoh qfbox.info> writes:
On Wed, Apr 03, 2024 at 03:50:19AM +1300, Richard (Rikki) Andrew Cattermole via
Digitalmars-d wrote:
 On 03/04/2024 2:23 AM, Guillaume Piolat wrote:
 If you want to use GC and be portable, then (currently in D today)
 you have to write your own D runtime. Of which there is 3 or 4
 custom ones!
And worse still they each have to implement the compiler hooks, when all they really need to implement is stuff like allocation of memory! Locking people into a specific compiler version makes peoples lives harder than it needs to be. Split the hooks out, into a compiler adjacent library with a well defined API for runtimes to implement will make custom runtime writers lives a lot easier.
Yes, undocumented compiler hooks are a big barrier to adaptability. This needs to be refactored. Right now there are tons of undocumented conventions and hooks that you basically have to do trial-and-error to discover. This needs to be documented and refactored to a proper API instead.
 It'll also mean faster builds since less stuff is in object.d so lots
 of wins here.
Yeah, while getting my D code to run in wasm, I discovered that there's a lot of stuff in object.d that actually only matters to druntime backend code. Would be nice to get rid of this stuff.

Though tbh you probably won't see very much improvement in build times here; object.d is pretty low down on the list of druntime/phobos modules that hog compile time. The gains probably won't be noticeable in non-trivial projects.

T -- Two wrongs don't make a right; but three rights do make a left...
Apr 02
next sibling parent reply Dennis <dkorpel gmail.com> writes:
On Tuesday, 2 April 2024 at 15:31:25 UTC, H. S. Teoh wrote:
 Though tbh you probably won't see very much improvement in 
 build times here; object.d is pretty low down on the list of 
 druntime/phobos modules that hog compile time. The gains 
 probably won't be noticeable in non-trivial projects.
```
cd /usr/include/dlang/dmd
time dmd -o- object.d

dmd -o- object.d  0.03s user 0.00s system 98% cpu 0.030 total
```
There's only 30 ms to save here.
Apr 02
parent "Richard (Rikki) Andrew Cattermole" <richard cattermole.co.nz> writes:
On 03/04/2024 8:54 AM, Dennis wrote:
 On Tuesday, 2 April 2024 at 15:31:25 UTC, H. S. Teoh wrote:
 Though tbh you probably won't see very much improvement in build times 
 here; object.d is pretty low down on the list of druntime/phobos 
 modules that hog compile time. The gains probably won't be noticeable 
 in non-trivial projects.
```
cd /usr/include/dlang/dmd
time dmd -o- object.d

dmd -o- object.d  0.03s user 0.00s system 98% cpu 0.030 total
```
There's only 30 ms to save here.
That is not what is of interest. It's the symbol lookup from other modules falling back on object.d.
Apr 03
prev sibling parent Walter Bright <newshound2 digitalmars.com> writes:
On 4/2/2024 8:31 AM, H. S. Teoh wrote:
 Yes, undocumented compiler hooks are a big barrier to adaptability. This
 needs to be refactored.  Right now there are tons of undocumented
 conventions and hooks that you basically have to do trial-and-error to
 discover.  This needs to be documented and refactored to a proper API
 instead.
 
 Yeah while getting my D code to run in wasm, I discovered that there's a
 lot of stuff in object.d that actually only matters to druntime backend
 code. Would be nice to get rid of this stuff.
Please document these in a bugzilla issue. Thanks!
Apr 08
prev sibling parent reply rkompass <rkompass gmx.de> writes:
On Tuesday, 2 April 2024 at 14:50:19 UTC, Richard (Rikki) Andrew 
Cattermole wrote:
 On 03/04/2024 2:23 AM, Guillaume Piolat wrote:
 If you want to use GC and be portable, then (currently in D 
 today) you have to write your own D runtime. Of which there is 
 3 or 4 custom ones!
And worse still they each have to implement the compiler hooks, when all they really need to implement is stuff like allocation of memory!
Is this the reason why there is no D for Arduino? (Which would be a big + in terms of popularity).
Apr 03
parent reply "Richard (Rikki) Andrew Cattermole" <richard cattermole.co.nz> writes:
On 03/04/2024 10:28 PM, rkompass wrote:
 On Tuesday, 2 April 2024 at 14:50:19 UTC, Richard (Rikki) Andrew 
 Cattermole wrote:
 On 03/04/2024 2:23 AM, Guillaume Piolat wrote:
 If you want to use GC and be portable, then (currently in D today) 
 you have to write your own D runtime. Of which there is 3 or 4 custom 
 ones!
And worse still they each have to implement the compiler hooks, when all they really need to implement is stuff like allocation of memory!
Is this the reason why there is no D for Arduino? (Which would be a big + in terms of popularity).
No, although it hasn't helped. Microcontrollers tend to be on the small side; all the extra metadata you need for D code, like ModuleInfo and TypeInfo, could easily exceed that budget or bump you up into one of the more expensive micros, killing D off as a possible language. About the max you'd want here is -betterC, which already works.
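A minimal sketch of what the -betterC subset looks like on a constrained target; the point is what is *absent* (no druntime linked, so no ModuleInfo, TypeInfo, or GC):

```D
// Compile with: dmd -betterC
// No druntime: no GC, no classes with TypeInfo, no module
// constructors, no ModuleInfo emitted into the binary.
extern(C) int main()
{
    int[16] buf;             // stack allocation still works
    foreach (i, ref b; buf)  // as do slices of static arrays and foreach
        b = cast(int) i;
    return buf[15];          // but no `new`, no AAs, no exceptions
}
```

This is roughly the budget an Arduino-class micro leaves you, which is why -betterC is described as about the max you'd want there.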
Apr 03
parent Paulo Pinto <pjmlp progtools.org> writes:
On Thursday, 4 April 2024 at 06:35:49 UTC, Richard (Rikki) Andrew 
Cattermole wrote:
 On 03/04/2024 10:28 PM, rkompass wrote:
 On Tuesday, 2 April 2024 at 14:50:19 UTC, Richard (Rikki) 
 Andrew Cattermole wrote:
 On 03/04/2024 2:23 AM, Guillaume Piolat wrote:
 If you want to use GC and be portable, then (currently in D 
 today) you have to write your own D runtime. Of which there 
 is 3 or 4 custom ones!
And worse still they each have to implement the compiler hooks, when all they really need to implement is stuff like allocation of memory!
Is this the reason why there is no D for Arduino? (Which would be a big + in terms of popularity).
No. Although it hasn't helped. Micro's tend to be on the small size, all the extra metadata you need for D code like ModuleInfo and TypeInfo could easily exceed that budget or bump you up into one of the more expensive micros killing D off as a possible language. About the max you'd want here is -betterC which already works.
Yet much is still possible without taking the GC out of the picture:

https://blog.arduino.cc/2019/08/23/tinygo-on-arduino/
https://www.hackster.io/alankrantas/tinygo-on-arduino-uno-an-introduction-6130f6
https://tinygo.org/docs/reference/microcontrollers/
https://www.electromaker.io/blog/article/tinygo-brings-golang-to-microcontrollers

Now imagine if all those Maker magazines and events were adopting D instead.
Apr 04
prev sibling parent Adam Wilson <flyboynw gmail.com> writes:
On Tuesday, 2 April 2024 at 13:23:48 UTC, Guillaume Piolat wrote:
 Instead the Mir library, Dplug, Hipreme Engine and soon 
 Inochi2D are doing it simply for portability because the 
 regular druntime has insane requirements, which the C standard 
 library doesn't have. We went " nogc" when druntime wouldn't 
 even start in some macOS in shared library form, not because 
 some kind of performance reason.

 If you want to use GC and be portable, then (currently in D 
 today) you have to write your own D runtime. Of which there is 
 3 or 4 custom ones!

 The solution is of course to lower the requirements of druntime 
 so that it can run anywhere, be in WebASM, on the Playstation 
 Vita, on the Dreamcast, what people are doing nowadays (and may 
 need or not  nogc).
This is a very reasonable take, and I can understand how supporting different platforms might result in needing to abandon features that rely on DRT, especially given the difficulty in porting. And I would generally agree with DIPs and PRs that move us in that direction. Rikki has some ideas there.

In general though, I think it is reasonable for us to say that if you expect D to work fully on a new platform, you're going to have to fully port DRT. And I don't have a problem with the existing no-GC features. I just think that we need to deprioritize no-GC so that our precious few resources can be spent addressing the problems we have with the GC and opening up new markets for D.
Apr 03
prev sibling next sibling parent reply cc <cc nevernet.com> writes:
On Monday, 1 April 2024 at 20:45:25 UTC, Adam Wilson wrote:
 The discourse around the GC has gotten so ridiculous that I 
 have seriously considered asking Walter to declare that "If you 
 want to create an OS or Video Game, consider a different 
 language."

 OS/Games is actually a fairly uncommon use of D, if you look at 
 what the people who aren't whining endlessly about the GC are 
 actually doing with it.

 Personally, I blame the OS/Game crowd for single-handedly 
 keeping D out of the web service space for the past *decade* 
 because, instead of improving the GC to the point that 
 long-running processes are possible, we've built a mountain of 
 (mis)features designed to assuage their demands. I maintain 
 that this was probably the second biggest mistake in D's 
 history.
"We'd have the programming Utopia if only all the goddamn users stopped getting in the way!"

Seriously though, what an asinine opinion. Excluding an entire market to satisfy your religious opinions on memory management. Nobody has stopped anyone from improving the GC. By all means, please improve the GC. I develop commercial games in D, I have come to avoid the GC much of the time, and I'm pretty sure I haven't spent any of the past 10 years of my workload getting in your way of making the GC the best it could be. I simply haven't been using it. Oh no, someone exposed GC.free and __delete as a hacky temporary stopgap? Gosh, that 30 seconds of work sure did get in the way of a decade of someone else making that thing I want better.

In *my* opinion, a language that can't handle modern 60+ fps (144-240+ nowadays) gaming or graphical simulations is nothing more than a hobbyist piece of crap and not fit for business. But that's just my opinion... I don't try to force it on anyone else ;) I'm aware other industries exist and we can all graze on our respective farms in peace.

Fortunately, there are numerous ways to get around the problems of stop-the-world GC skips, even working within the GC. It takes a tiny bit of thought, and a document was drafted to suggest these techniques to newcomers, though nobody strained themselves with effort over this since, naturally, the doc wants to sell you its favorite pills. Fair enough. Even so, it is the natural philosophy and developmental style of many to simply avoid the GC. You can either react to this with "different strokes", or you can go full zealot and start holding inquisitions for all the blasphemers.

It actually takes very little effort to please the detmem crowd by just giving them the basic tools they want, compared to trying to sell them on a broad-spectrum cure-all GC liniment when the seller's approach is "Just use it! Just use it already, damnit, stop asking questions! No refunds!" and is brazenly ignorant of how much of an issue naive GC use can be in particular situations. D actually HAS gone and given them the tools they need, albeit they don't come in the same nice shiny box as the GC. They're just kind of handed out in plastic baggies. But hey, that's how Ultima sold 50,000 copies before Richard's net worth clapped $1.5B.

In a world where the gold standard of technology is "It Just Works", D's GC... *doesn't* Just Work. Oh, it DOES work. It just Works With Effort. And that's a problem you've had 10 years to do something about. What has been done besides painting the church and handing out a lot of fliers?
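The `GC.free`/`__delete` stopgap mentioned above does exist in core.memory. A minimal sketch of eager release, with the usual caveat that the slice dangles afterwards:

```D
import core.memory : GC;

void main()
{
    auto buf = new ubyte[](1024); // GC allocation as usual
    // ... use buf ...
    GC.free(buf.ptr); // return the block eagerly, without waiting
                      // for a collection cycle
    buf = null;       // the old slice now dangles; drop it
}
```

This is exactly the "plastic baggie" style of tooling: it works, but safety is entirely on the caller, since any surviving slice into the block becomes a dangling reference.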
Apr 03
parent reply bachmeier <no spam.net> writes:
On Thursday, 4 April 2024 at 04:01:56 UTC, cc wrote:

 Seriously though, what an asinine opinion.  Excluding an entire 
 market, to satisfy your religious opinions on memory 
 management.  Nobody has stopped anyone from improving the GC.  
 By all means, please improve the GC.  I develop commercial 
 games in D, I have come to avoid the GC much of the time, and 
 I'm pretty sure I haven't spent any of the past 10 years of my 
 workload getting in your way of making the GC the best it could 
 be.  I simply haven't been using it.  Oh no, someone exposed 
 GC.free and __delete as a hacky temporary stopgap?  Gosh, that 
 30 seconds of work sure did get in the way of a decade of 
 someone else making that thing I want better.
I think you're taking the wrong thing from this conversation. You're helping to make the case that the anti-GC crowd is wrong. For me, the problem is that the anti-GC zealots make the following claims:

- The GC gets in the way. It needs to be removed the way Rust removed theirs, for performance reasons.
- D isn't suitable as a systems programming language because it has a GC.
- GC should be removed as the default because it leads to bad practice.
- All D programs have to use the GC.

These things appear on Reddit, Hacker News, etc. any time D is discussed, and it has an effect. Convincing someone to use the GC might be right on paper in some cases, but in practice it doesn't work if the other side has a fundamentalist viewpoint that the GC has to be removed entirely. You can see it in the comments on the story: "That's a valid point, but..."

I do think we need to be realistic and realize that the anti-GC crowd has found its love and married Rust, or decided that C++ is a flawed but good-enough spouse. That battle is over. D needs to worry about folks that like C (D makes it realistic for them to keep writing the code they like writing) and folks that like GC (the story's probably not as good as it could be).

I'll also add that I use SafeRefCounted a lot, so I'm also avoiding D's GC much of the time rather than forcing the GC into everything I do.
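The SafeRefCounted pattern mentioned above, sketched for a C-allocated buffer (the `CBuffer` wrapper is illustrative, not from bachmeier's code):

```D
import std.typecons : SafeRefCounted;
import core.stdc.stdlib : calloc, free;

struct CBuffer
{
    void* p;
    this(size_t n) { p = calloc(1, n); }
    ~this() { free(p); }  // runs when the last reference dies
    @disable this(this);  // no accidental copies of the raw pointer
}

void main()
{
    // Deterministic cleanup, no GC involvement: the refcount is
    // bumped on copy and the destructor fires exactly once.
    auto buf = SafeRefCounted!CBuffer(64);
    auto buf2 = buf; // refcount goes to 2; freed when both go out of scope
}
```

This fits the use case described: memory handed over by C libraries (or another runtime's collector) gets scoped, counted ownership without thinking about it at each use site.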
Apr 04
next sibling parent reply Carl Sturtivant <sturtivant gmail.com> writes:
On Thursday, 4 April 2024 at 14:32:58 UTC, bachmeier wrote:
 I'll also add that I use SafeRefCounted a lot, so I'm also 
 avoiding D's GC much of the time rather than forcing the GC 
 into everything I do.
Interesting, could you expand on your wider thinking about doing this?
Apr 04
parent reply bachmeier <no spam.net> writes:
On Thursday, 4 April 2024 at 15:07:23 UTC, Carl Sturtivant wrote:
 On Thursday, 4 April 2024 at 14:32:58 UTC, bachmeier wrote:
 I'll also add that I use SafeRefCounted a lot, so I'm also 
 avoiding D's GC much of the time rather than forcing the GC 
 into everything I do.
Interesting, could you expand on your wider thinking about doing this?
What I mean is that a lot of memory I use is allocated either by C libraries or by R's garbage collector. SafeRefCounted takes care of all the details for me so I don't have to think about it.
Apr 04
parent Carl Sturtivant <sturtivant gmail.com> writes:
On Thursday, 4 April 2024 at 15:54:30 UTC, bachmeier wrote:
 What I mean is that a lot of memory I use is allocated either 
 by C libraries or by R's garbage collector. SafeRefCounted 
 takes care of all the details for me so I don't have to think 
 about it.
A great example of the strength of D. Eliminating the GC as the default is *premature optimization*. When you need to do something else, you can. Tune up your use of the GC or not use it.
Apr 04
prev sibling parent reply ryuukk_ <ryuukk.dev gmail.com> writes:
On Thursday, 4 April 2024 at 14:32:58 UTC, bachmeier wrote:
 On Thursday, 4 April 2024 at 04:01:56 UTC, cc wrote:

 Seriously though, what an asinine opinion.  Excluding an 
 entire market, to satisfy your religious opinions on memory 
 management.  Nobody has stopped anyone from improving the GC.  
 By all means, please improve the GC.  I develop commercial 
 games in D, I have come to avoid the GC much of the time, and 
 I'm pretty sure I haven't spent any of the past 10 years of my 
 workload getting in your way of making the GC the best it 
 could be.  I simply haven't been using it.  Oh no, someone 
 exposed GC.free and __delete as a hacky temporary stopgap?  
 Gosh, that 30 seconds of work sure did get in the way of a 
 decade of someone else making that thing I want better.
I think you're taking the wrong thing from this conversation. You're helping to make the case that the anti-GC crowd is wrong. For me, the problem is that the anti-GC zealots make the following claims:

- The GC gets in the way. It needs to be removed the way Rust removed theirs, for performance reasons.
- D isn't suitable as a systems programming language because it has a GC.
- GC should be removed as the default because it leads to bad practice.
- All D programs have to use the GC.

These things appear on Reddit, Hacker News, etc. any time D is discussed, and it has an effect. Convincing someone to use the GC might be right on paper in some cases, but in practice it doesn't work if the other side has a fundamentalist viewpoint that the GC has to be removed entirely. You can see it in the comments on the story: "That's a valid point, but..."

I do think we need to be realistic and realize that the anti-GC crowd has found its love and married Rust, or decided that C++ is a flawed but good-enough spouse. That battle is over. D needs to worry about folks that like C (D makes it realistic for them to keep writing the code they like writing) and folks that like GC (the story's probably not as good as it could be).

I'll also add that I use SafeRefCounted a lot, so I'm also avoiding D's GC much of the time rather than forcing the GC into everything I do.
There is no anti-GC crowd. I will not speak for other people, so I will speak for myself.

I advocate for: GC as a library, and the core language as pay-as-you-go, so I can use a great language without people making it annoying to use. Imagine:

```D
int[] myarray;
create_array(allocator, myarray, length: 16);

int[int] mymap;
create_map(allocator, mymap);

// or fall back to using GC for casual scripting
```

No need to pick a clan; be smart, enable people. There is no reason to require the GC to report errors, for example; this is beyond stupid.

The absolute best is what Zig is doing by encouraging people to use/request an allocator.

The pro-GC people love to enforce the GC everywhere it's not needed, and when people start having issues with it, everyone becomes silent.

```D
import core.memory;
import std.stdio;
import std.conv;
import std.datetime.stopwatch;

string[int] stuff;

void main()
{
    for (int i = 0; i < 10_000_000; i++)
        stuff[i] = "hello_" ~ to!(string)(i);

    auto sw = StopWatch(AutoStart.yes);
    auto a = new ubyte[512_000_000];
    writeln(sw.peek.total!"seconds");
}
```

Question to the pro-GC crowd: why does it take 2 seconds to create the array?
Apr 04
next sibling parent user1234 <user1234 12.fr> writes:
On Thursday, 4 April 2024 at 15:42:24 UTC, ryuukk_ wrote:
 ```D
 import core.memory;
 import std.stdio;
 import std.conv;
 import std.datetime.stopwatch;

 string[int] stuff;

 void main()
 {
     for (int i = 0; i < 10_000_000; i++)
         stuff[i] = "hello_" ~ to!(string)(i);

     auto sw = StopWatch(AutoStart.yes);
     auto a = new ubyte[512_000_000];
     writeln(sw.peek.total!"seconds");
 }

 ```

 Question to pro-GC crowd, why does it take 2 seconds to create 
 the array?
I'm not pro or con, but that rather looks like a bug, i.e. the new'ed array should not be scanned; it's not even initialized, so in no way can it contain references to GC-managed pointers. (Side note: 3 secs here ^^.)
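If the scan really is the culprit, one workaround is to allocate the block with the `NO_SCAN` attribute explicitly, telling the collector the memory cannot contain pointers. A minimal sketch using druntime's `core.memory.GC` API (the helper name `allocNoScan` is made up for illustration):

```D
import core.memory;

// Allocate a block the collector will never scan for interior pointers.
// ubyte data cannot hold references, so NO_SCAN is safe here.
ubyte[] allocNoScan(size_t n)
{
    auto p = cast(ubyte*) GC.malloc(n, GC.BlkAttr.NO_SCAN);
    return p[0 .. n];
}

void main()
{
    auto a = allocNoScan(512_000_000);
    a[0] = 42;
    assert(a[0] == 42);
}
```

In principle `new ubyte[...]` should already set `NO_SCAN` via the type's TypeInfo, which is why the 2-second pause reads like a bug rather than expected behaviour.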
Apr 04
prev sibling parent reply bachmeier <no spam.net> writes:
On Thursday, 4 April 2024 at 15:42:24 UTC, ryuukk_ wrote:

 There is no anti-GC crowd

 I will not speak for other people, so i will speak for myself
There's definitely an anti-GC crowd. Most of them don't use D, but they're happy to vandalize discussions about D.
 I advocate for: GC as a library and core language as pay as you 
 go, so i can use a great language without people making it 
 annoying to use
 Imagine:

 ```D
 int[] myarray;
 create_array(allocator, myarray, length: 16);


 int[int] mymap;
 create_map(allocator, mymap)


 // or fall back to using GC for casual scripting
 ```
But once you have to "fall back" to the GC, you've lost anyone that wants to do scripting.
 No need to pick a clan, be smart, enable people
But in your example, you have picked a clan, those that enjoy dealing with memory management.
 The absolute best is what Zig is doing by encouraging people to 
 use/request for an allocator
Why would I care about an allocator? These are the specs of the computer that will run my program:

Memory: 125.5 GiB
Processor: 13th Gen Intel® Core™ i9-13900 × 32

If I want to write a program that changes a few lines in a markdown file, for instance, there's no reason to introduce overhead.
 Question to pro-GC crowd, why does it take 2 seconds to create 
 the array?
Why are you using the GC if that's the code you need to write? Although if I'm writing a script and it's run one time once a day, even that isn't a big deal.
Apr 04
parent reply ryuukk_ <ryuukk.dev gmail.com> writes:
On Thursday, 4 April 2024 at 16:06:35 UTC, bachmeier wrote:
 On Thursday, 4 April 2024 at 15:42:24 UTC, ryuukk_ wrote:

 There is no anti-GC crowd

 I will not speak for other people, so i will speak for myself
There's definitely an anti-GC crowd. Most of them don't use D, but they're happy to vandalize discussions about D.
 I advocate for: GC as a library and core language as pay as 
 you go, so i can use a great language without people making it 
 annoying to use
 Imagine:

 ```D
 int[] myarray;
 create_array(allocator, myarray, length: 16);


 int[int] mymap;
 create_map(allocator, mymap)


 // or fall back to using GC for casual scripting
 ```
But once you have to "fall back" to the GC, you've lost anyone that wants to do scripting.
 No need to pick a clan, be smart, enable people
But in your example, you have picked a clan, those that enjoy dealing with memory management.
 The absolute best is what Zig is doing by encouraging people 
 to use/request for an allocator
Why would I care about an allocator? These are the specs of the computer that will run my program: Memory: 125.5 GiB Processor: 13th Gen Intel® Core™ i9-13900 × 32 If I want to write a program that changes a few lines in a markdown file, for instance, there's no reason to introduce overhead.
 Question to pro-GC crowd, why does it take 2 seconds to create 
 the array?
Why are you using the GC if that's the code you need to write? Although if I'm writing a script and it's run one time once a day, even that isn't a big deal.
Congratulations, you just described the reason why nobody wants to take D seriously as a systems language with a GC: "Problem? Let's sweep it under the carpet."
Apr 04
parent bachmeier <no spam.net> writes:
On Thursday, 4 April 2024 at 16:19:28 UTC, ryuukk_ wrote:

 Congratulation, you just describe the reason why nobody wants 
 to take D seriously when it comes to system language with a GC
That's a different issue though. D's GC can at times become a problem, and pretty much everyone that says D should embrace the GC also says it should be improved. In your example, the simple solution is to call `GC.disable()` before allocating the array, so it's hardly a problem for the cases where this does arise. The real solution is to fix the GC so this doesn't happen. And you can use malloc with reference counting if you don't want to disable the GC when it's needed. If we move the GC to a library and make it less convenient to use, we've eliminated most scripting usage in return for solving edge cases with a simple solution (probably a bug) like the one you've presented.
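The `GC.disable()` workaround described above might look like this (the wrapper name `allocateQuietly` is made up for illustration; `GC.disable`/`GC.enable` are real `core.memory` calls):

```D
import core.memory;

ubyte[] allocateQuietly(size_t n)
{
    GC.disable();              // no collections triggered during this allocation
    scope (exit) GC.enable();  // restore normal collection behaviour afterwards
    return new ubyte[n];       // the allocation from the benchmark above
}

void main()
{
    auto a = allocateQuietly(512_000_000);
    assert(a.length == 512_000_000);
}
```

Note that `GC.disable` only suppresses implicit collections; allocation itself still goes through the GC heap.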
Apr 04
prev sibling next sibling parent reply Martyn <martyn.developer googlemail.com> writes:
On Monday, 1 April 2024 at 20:45:25 UTC, Adam Wilson wrote:
 On Monday, 1 April 2024 at 01:58:51 UTC, Lance Bachmeier wrote:
 On Sunday, 31 March 2024 at 14:22:43 UTC, Adam wrote:
 The first question to ask is "Does it matter one way or the 
 other?" Since you're probably not writing the Linux kernel, 
 even if you're engaged in 'systems programming', the answer is 
 quite often no. If the answer is yes, the second question to 
 ask is "Does it matter enough?"

 I dislike these debates because most of those arguing against 
 the GC are insufficiently informed to engage in a worthwhile 
 debate. Many of them don't even understand that you can do 
 things with a programming language other than write video 
 games.
The discourse around the GC has gotten so ridiculous that I have seriously considered asking Walter to declare: "If you want to create an OS or video game, consider a different language." OS/games is actually a fairly uncommon use of D, if you look at what the people who aren't whining endlessly about the GC are actually doing with it.

Personally, I blame the OS/game crowd for single-handedly keeping D out of the web-service space for the past *decade* because, instead of improving the GC to the point that long-running processes are possible, we've built a mountain of (mis)features designed to assuage their demands. I maintain that this was probably the second biggest mistake in D's history.

We need to accept a fact that I learned at DConf 2017: the no-GC crowd will not be silenced until the GC is removed from the language. However, Walter has said the GC is here to stay. Therefore, the no-GC crowd will never be silenced. We've given them a plethora of tools to work without the GC. It is time that we stopped giving them so much of our time. We have bigger problems to solve.
I am sure some people on this forum would refer to me as "one of them" and, by default, label someone that "wants control of memory" as ignorant. That makes me [the guy in the middle](https://assets.bitbashing.io/images/just-use-gc.jpg), right?

I think some people are viewing this as black or white. You love the GC or you hate it. This isn't some kind of religious group. To those that might categorize me as some OOP/GC hater, I could say the same for those that want to force the GC on me. I have D programs on servers. Some use a rich set of features and... shock... the GC. Others are written in BetterC.

What we have to remember, in my opinion, is that D is a unique programming language. D has the opportunity of winning over many different sections of programmers. This is something other languages, especially higher-level ones, do not have. If you want to go down the route of "if you want to write a game or OS... this is not for you", you are enforcing what **you want** from D and excluding users/developers who need to solve other problems. It isn't just you; other web guys or app guys are happy to join in and agree.

I am not being bitter about it. The core team behind D are free to push whatever goals they wish for the language. If you want to go the route of "no gamers or OS guys", then make it clear. People are going to be argumentative about it on forums, and it is not because the "no-GC crowd will never be silenced" -- it is because people on this forum **really care** about the D programming language. I will agree that some people could word their points of view more professionally (and more respectfully of others) -- but you can tell that a number of members on this forum would take a bullet for D.

State the goals of D. Some people will dislike it and create posts about it, but in the end it comes down to two choices:

1) Accept the goals and keep using D, or
2) Accept defeat and move to another language.
Maybe there is truth behind it. Is D the language for gamers or OS'ers? With Rust entering the Linux and Windows kernel space, and game-specific languages like Odin or Jai available, maybe D is not the right choice for these after all. If the no-GC guys are that much of a burden, maybe a chat with Walter should happen sooner rather than later.
Apr 04
parent reply "Richard (Rikki) Andrew Cattermole" <richard cattermole.co.nz> writes:
I have argued for many years that D can scale, and that is a good thing.

Sometimes GC is right (especially for business logic), other times you 
need full control and are talking to the cpu directly about memory.

At one point I even got Walter to agree that someone is allowed to work 
on adding write barriers (opt-in) so a more advanced GC could be 
written, although nobody has done it.

Yet my stuff is -betterC.

I argue that reference counting should be used exclusively for system 
resources and be part of the language so that we can take full advantage 
of it, but argue that business logic should be GC based simultaneously.

It certainly isn't black or white! Its a scale of values and multiple 
may be in use at any given time and that is a desirable trait of D.
Apr 04
parent reply Sebastian Nibisz <snibisz gmail.com> writes:
Why doesn't the D language have a fully concurrent, pause-free 
GC? Look at [SGCL](https://github.com/pebal/sgcl), experimental 
tracked pointers library for C++. I've developed this library and 
can share insights on how such a GC engine works.
Apr 04
parent reply user1234 <user1234 12.fr> writes:
On Thursday, 4 April 2024 at 09:32:10 UTC, Sebastian Nibisz wrote:
 Why doesn't the D language have a fully concurrent, pause-free 
 GC?
 [...]
I believe the reason is that the GC was designed before the multi-core era (or maybe at its very beginning), so stop-the-world was less of a problem than it is today. Then nobody proposed anything new.
Apr 04
parent reply "Richard (Rikki) Andrew Cattermole" <richard cattermole.co.nz> writes:
On 05/04/2024 3:10 AM, user1234 wrote:
 On Thursday, 4 April 2024 at 09:32:10 UTC, Sebastian Nibisz wrote:
 
     Why doesn't the D language have a fully concurrent, pause-free GC? [...]
 
 I believe the reason is that the GC was designed before the multi-core 
 era (or maybe at the very beginning), so STW was less a problem than 
 todays. Then nobody proposed anything new.
People have proposed plenty of new GC stuff over the years. But to summarize: for newer GC designs, you need write barriers. They are approved as an opt-in feature. Nobody has added support, nor written a new GC. It simply comes down to manpower; nobody is willing to do the work. There just isn't enough of a win here to make anyone motivated to do it. We'd need funding to try and get a student or two to tackle it long term.
Apr 04
parent reply Carl Sturtivant <sturtivant gmail.com> writes:
On Thursday, 4 April 2024 at 14:13:52 UTC, Richard (Rikki) Andrew 
Cattermole wrote:
 It simply comes down to man power, nobody is willing to do the 
 work.

 There just isn't enough of a win here to make anyone motivated 
 to do it.
Not enough of a highly visible tactical win, no. However...

I'm just an occasional intense user of D. However, reading the forums, and reading this thread again now specifically, I think it is possible to form a wider conclusion.

*There is a massive strategic win to having a fabulous 21st-century GC for D, perfectly good for soft-real-time coding with no further ado, like the one used by the author of the article linked at the start of this thread.*

We might guess that some people who were trying to find the right tool for the job (their soft-real-time game, for example), and who do not like manual memory management because of the additional drag it imposes on the programmer, simply did not choose D even though they otherwise would have were this ace GC present (rather than the languages they detest for the memory-management administrative burden they impose on the programmer and the generally complicated nature of coding in such languages).

What we see in the dlang forums related to the above group are such soft-real-time programmers who have labored and successfully overcome or bypassed these difficulties with D's GC in one way or another for their situation.

This is a biased sample! The presence of these successes strongly suggests a larger group who failed to go to D, with the successes the minority who got in. Those that penetrated the armor and those who were deflected. It takes something extra to penetrate the armor, so we might reasonably think that the deflected are in the majority, with the successes being the tip of the iceberg.

D is deterring a class of people that are very much operating in the spirit of D from joining the D community and creating new things that in turn widen positive attitudes to the language out there.

Imagine this: what if D had had such an ace GC for the last decade? Perception and use of D would be entirely different to its present state; soft-real-time applications would abound, with a wide community of pro-D game programmers talking in the forums.
Just like ImportC being a game changer, an ace GC is a game(!) changer. It's just harder to see this, but it is so.
Apr 05
next sibling parent Lance Bachmeier <no spam.net> writes:
On Friday, 5 April 2024 at 16:40:06 UTC, Carl Sturtivant wrote:

 Imagine this: what if D had such an ace GC for the last decade? 
 Perception and use of D would be entirely different to its 
 present state; soft-real-time applications would abound, with a 
 wide community of pro-D game programmers talking in the forums.

 Just like ImportC being a game changer, ace GC is a game(!) 
 changer. It's just harder to see this, but it is so.
A desire to embrace and improve the GC is [an important part of the OpenD fork](https://dpldocs.info/this-week-in-arsd/Blog.Posted_2024_01_08.html#gc).
 One of the guiding principles of this fork is to embrace the GC 
 as a successful design rather than to shun and avoid it.
 Finally, language and library feature discussions are often 
 stopped in their tracks by concerns about  nogc compatibility, 
 without weighing the significant benefits they may bring. "But 
 it needs GC" becomes a feature-killer, and then we're stuck 
 with nothing.
 I think we're all aligned on the goals of keeping the language 
 easy to use, dispelling myths about the GC (while fixing 
 implementation issues when it isn't just a myth), and not 
 letting good work be blocked by the topic.
Apr 05
prev sibling next sibling parent reply Adam Wilson <flyboynw gmail.com> writes:
On Friday, 5 April 2024 at 16:40:06 UTC, Carl Sturtivant wrote:
 *There is a massive strategic win to having a fabulous 21st 
 century GC for D, perfectly good for soft-real-time coding with 
 no further ado, like the one used by the author of the article 
 linked at the start of this thread.*
This is the point I was trying to make. The strategic win from having a fantastic GC would be immense, and would far outweigh anything we could gain by continuing down the no-GC path.

But first I would like to state that I am *not* advocating that D remove any of the existing no-GC support. Even I use it occasionally! But I was there when the anti-GC crowd put on a full-court press to convince the community that all we needed to do to see a massive increase in adoption was make the language more accessible to C/C++ users who need to manually manage memory. As a result, tools like `@nogc` and `-betterC` were introduced. When that proved insufficient, the anti-GC crowd started demanding more invasive changes, up to and including removing the GC altogether.

Instead, the world changed around us. Memory safety is now a national security concern, and languages like C/C++ are being discouraged. Those lessons speak to the most common usages and concerns of the users of programming languages, and those usages/concerns largely do not benefit from no-GC. Chasing no-GC as far as we did was a mistake that cost us precious time and scarce resources. We need to be mature enough to admit that it was a mistake and correct our course. Given the lessons and direction of the industry over the intervening years, I would strongly argue that now is the time to return our focus to the GC.
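For context on the tools mentioned, `@nogc` is a function attribute that statically rejects any GC allocation in the function body. A minimal sketch (the function `sum` is made up for illustration):

```D
// @nogc guarantees at compile time that this function performs no GC allocation.
// Appending to an array or throwing a new exception here would be a compile error.
@nogc int sum(scope const int[] a)
{
    int s = 0;
    foreach (x; a)
        s += x;
    return s;
}

void main()
{
    static immutable arr = [1, 2, 3];
    assert(sum(arr) == 6);
}
```

The point being argued above is not that these tools are useless, but that building them out consumed effort that could have gone into the GC itself.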
 Imagine this: what if D had such an ace GC for the last decade? 
 Perception and use of D would be entirely different to its 
 present state; soft-real-time applications would abound, with a 
 wide community of pro-D game programmers talking in the forums.

 Just like ImportC being a game changer, ace GC is a game(!) 
 changer. It's just harder to see this, but it is so.
Mic. Drop.
Apr 05
parent reply "H. S. Teoh" <hsteoh qfbox.info> writes:
On Sat, Apr 06, 2024 at 12:06:04AM +0000, Adam Wilson via Digitalmars-d wrote:
 On Friday, 5 April 2024 at 16:40:06 UTC, Carl Sturtivant wrote:
 *There is a massive strategic win to having a fabulous 21st century
 GC for D, perfectly good for soft-real-time coding with no further
 ado, like the one used by the author of the article linked at the
 start of this thread.*
This is the point I was trying to make. The strategic win from having a fantastic GC would be immense, and would far outweigh anything we could gain by continuing down the no-GC path.
Totally. If we could introduce write barriers that open the door to incremental generational GCs, that would be a HUGE step in the long run. Maybe not so much in the short term, but guys, it's been at least a decade. We gotta stop chasing decisions that only benefit the short term. It's time to think about long-term strategy. [...]
 But I was there when the anti-GC crowd put on a full court press to
 convince the community that all we needed to do to see massive
 increase in adoption was make the language more accessible to C/C++
 users who need to manually manage memory. As a result, tools like
 `@nogc` and `-betterC` were introduced. When that proved insufficient,
 the anti-GC crowd started demanding more invasive changes, up to and
 including removing the GC altogether.
Walter himself used to say in the old days that it's better to cater to existing, enthusiastic customers who are already here, than to chase would-be customers who claim that if only feature X were implemented, they'd adopt D instantly. Because once you implement X, said would-be customers would start clamoring for Y instead. And when you implement Y they would start clamoring for Z. The truth is that they will never become real customers; X, Y and Z are merely convenient excuses. In the meantime, so much effort is being put towards features that in the long run doesn't draw in the promised rush of new customers (and probably never will), while quality-of-life changes for existing customers are being neglected. [...]
 Chasing no-GC as far as we did was a mistake that cost us precious
 time and scarce resources. We need to be mature enough to admit that
 it was a mistake and correct our course. Given the lessons and
 direction of the industry over the intervening years, I would strongly
 argue that now is the time to return our focus to the GC.
[...] +100. While there *have* been improvements in our current GC over the past years, we're running against a brick wall in terms of available GC algorithms, because of the pessimistic situation of no write barriers. That closes the door to many of the major advancements in GC algorithms over the past decade or two. It's time we stop sitting on the fence and commit to a GC-centric language that actually has a competitive GC to speak of.

T

--
Why did the dinosaur get into a car accident? Because a tyrannosaurus wrecks.
Apr 05
next sibling parent "Richard (Rikki) Andrew Cattermole" <richard cattermole.co.nz> writes:
On 06/04/2024 1:29 PM, H. S. Teoh wrote:
     Chasing no-GC as far as we did was a mistake that cost us precious
     time and scarce resources. We need to be mature enough to admit that
     it was a mistake and correct our course. Given the lessons and
     direction of the industry over the intervening years, I would
     strongly argue that now is the time to return our focus to the GC. [...]
 
 
 +100. While there /have/ been improvements in our current GC over the 
 past years, we're running against a brick wall in terms of available GC 
 algorithms, because of the pessimistic situation of no write barriers. 
 That closes the door to many of the major advancements in GC algorithms 
 over the past decade or two. It's time we stop sitting on the fence and 
 commit to a GC-centric language that actually has a competitive GC to 

It is not an all or nothing situation. We can have write barriers be opt-in, and if all binaries have it, then the GC can take advantage of it. The only person that needs convincing now is the one that does the work.
Apr 05
prev sibling parent Carl Sturtivant <sturtivant gmail.com> writes:
On Saturday, 6 April 2024 at 00:29:04 UTC, H. S. Teoh wrote:
 Walter himself used to say in the old days that it's better to 
 cater to existing, enthusiastic customers who are already here, 
 than to chase would-be customers who claim that if only feature 
 X were implemented, they'd adopt D instantly.  Because once you 
 implement X, said would-be customers would start clamoring for 
 Y instead.  And when you implement Y they would start clamoring 
 for Z.  The truth is that they will never become real 
 customers; X, Y and Z are merely convenient excuses. In the 
 meantime, so much effort is being put towards features that in 
 the long run doesn't draw in the promised rush of new customers 
 (and probably never will), while quality-of-life changes for 
 existing customers are being neglected.
And we are losing potential customers who don't make claims, and just want GC to be a non-issue for their soft-real-time requirements.
 While there *have* been improvements in our current GC over the 
 past years, we're running against a brick wall in terms of 
 available GC algorithms, because of the pessimistic situation 
 of no write barriers. That closes the door to many of the major 
 advancements in GC algorithms over the past decade or two. It's 
 time we stop sitting on the fence and commit to a GC-centric 
 language that actually has a competitive GC to speak of, one on 

Yes. There's no downside! Only a chance at a strategic future for D. Quite aside from technical reasons, I think it is utterly essential, and almost too late.

If this isn't embraced and the current tactical compromise is continued, then erosion of D from anti-GC zealot propaganda will continue: "Maybe you should leave D" being emitted at people in the community who want to use the GC and want a better GC without getting it; in other words, pushing people to join those who already left. "Want a pure GC language? Go away to language blah." "GC is scripting, not real programming; go away to GC-only language blah." Etcetera. We've all encountered this psychological noise that has only an emotive basis, and it's been very destructive to D.

If GC-centric people are deflected in the first place, or pushed out, then this serves anti-GC zealotry. The present state of the GC serves anti-GC zealotry: code can be written to use a lot of time attributable to the GC, and then used to shout about the GC being bad.

*It is a reasonable working hypothesis that without escaping the present situation with a competitive GC for soft real time, D will be lost strategically.*
Apr 06
prev sibling parent reply Carl Sturtivant <sturtivant gmail.com> writes:
On Friday, 5 April 2024 at 16:40:06 UTC, Carl Sturtivant wrote:
 What we see in the dlang forums related to the above group are 
 such soft-real-time programmers who have labored and 
 successfully overcome or bypassed these difficulties with D's 
 GC in one way or another for their situation.

 This is a biased sample! The presence of these successes 
 strongly suggests a larger group who failed to go to D, with 
 the successes the minority who got in. Those that penetrated 
 the armor and those who were deflected. It takes something 
 extra to penetrate the armor, so we might reasonably think that 
 the deflected are in the majority, with the successes being the 
 tip of the iceberg.

 D is deterring a class of people that are very much operating 
 in the spirit of D from joining the D community and creating 
 new things that in turn widen positive attitudes to the 
 language out there.

 Imagine this: what if D had such an ace GC for the last decade? 
 Perception and use of D would be entirely different to its 
 present state; soft-real-time applications would abound, with a 
 wide community of pro-D game programmers talking in the forums.

 Just like ImportC being a game changer, ace GC is a game(!) 
 changer. It's just harder to see this, but it is so.
Unfortunately the people I allude to above are not a part of the D community, because the STW GC is not good enough for their purposes -- or they were convinced by the anti-GC crowd that this is the case. So the D community is host to a bunch of the anti-GC crowd without pushback against them. Instead those people have gone elsewhere, so their successes are not presented here to bolster the strategic argument that a new ace GC of that variety is a necessary game-changer for D. It's catch-22.

Anti-GC crowd: don't straw-man this. None of this says that you HAVE to use GC in D. There is no reason to oppose an ace GC for D on those grounds.
Apr 06
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 4/6/2024 9:04 AM, Carl Sturtivant wrote:
 Don't straw-man this: none of this says that you HAVE to use GC in D. There is 
 no reason to oppose an ace GC for D on those grounds.
Consider the performance deficit from write-barriers.
Apr 08
next sibling parent reply "Richard (Rikki) Andrew Cattermole" <richard cattermole.co.nz> writes:
On 09/04/2024 12:14 PM, Walter Bright wrote:
 On 4/6/2024 9:04 AM, Carl Sturtivant wrote:
 Don't straw-man this: none of this says that you HAVE to use GC in D. 
 There is no reason to oppose an ace GC for D on those grounds.
Consider the performance deficit from write-barriers.
D is in an excellent position to make write barriers opt-in, with different strategies allowing for different GC designs. It's not the all-or-nothing situation that people seem to want it to be. We can get the write barriers implemented and let some PhD students implement some more advanced GC designs. No need to commit to any one strategy; people should be able to use what makes sense for their application!
Apr 08
next sibling parent reply Gregor =?UTF-8?B?TcO8Y2ts?= <gregormueckl gmx.de> writes:
On Tuesday, 9 April 2024 at 00:47:55 UTC, Richard (Rikki) Andrew 
Cattermole wrote:
 D is in an excellent position to make write barriers opt-in, 
 with different strategies allowing for different GC designs.

 Its not an all or nothing situation that people seem to want it 
 to be.

 We can get the write barriers implemented and let some PhD 
 students implement some more advanced GC designs.
How would you add write barriers to D? Is it possible without extending the language?
Apr 09
parent "Richard (Rikki) Andrew Cattermole" <richard cattermole.co.nz> writes:
On 09/04/2024 7:17 PM, Gregor Mückl wrote:
 On Tuesday, 9 April 2024 at 00:47:55 UTC, Richard (Rikki) Andrew 
 Cattermole wrote:
 D is in an excellent position to make write barriers opt-in, with 
 different strategies allowing for different GC designs.

 Its not an all or nothing situation that people seem to want it to be.

 We can get the write barriers implemented and let some PhD students 
 implement some more advanced GC designs.
How would you add write barriers to D? Is it possible without extending the language?
It's a glue-code-layer thing. If you emit a write, guard it with a call into the GC (or, if we have a known specific strategy, emit that inline instead). The language won't need to change; it's all under the hood! Other languages like Java change write-barrier behavior aplenty. The bytecode, however, doesn't change when they do.
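The glue-layer idea above can be sketched as follows. Everything here is hypothetical: `__gc_writeBarrier` is NOT part of druntime today; it stands in for the call the codegen would wrap around pointer stores.

```D
// Hypothetical runtime hook a write-barrier-aware codegen would call
// in place of a plain pointer store.
void __gc_writeBarrier(void** slot, void* newVal)
{
    // A generational GC would record `slot` in a remembered set here,
    // before the store; for the current stop-the-world GC it can be a no-op.
    *slot = newVal;
}

int* demo()
{
    int* p;
    int* q = new int(7);
    // The compiler would rewrite the plain store `p = q;` into:
    __gc_writeBarrier(cast(void**) &p, q);
    return p;
}

void main()
{
    assert(*demo() == 7);
}
```

Because the hook lives in the glue layer, swapping barrier strategies (or making the barrier a no-op) requires no source changes, which is the point being made about Java above.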
Apr 09
prev sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 4/8/2024 5:47 PM, Richard (Rikki) Andrew Cattermole wrote:
 Consider the performance deficit from write-barriers.
D is in an excellent position to make write barriers opt-in, with different strategies allowing for different GC designs.
It would be a massive bifurcation of the language. It affects everything. Languages that use this tend to have a lot of low-level C systems code, because those languages are not systems-implementation languages. Trying to mix code that does and does not do write gates will be very tedious and error-prone in hard-to-detect ways.

The net advantage tilts towards write barriers when the language does pretty much all its allocation with the GC. D is not like that. Rust is a contender, does not use GC, and does not use write barriers.

I know that D's `@live` has elicited minimal interest, but it does provide an opportunity to say that D is a memory-safe language.
Apr 09
parent "Richard (Rikki) Andrew Cattermole" <richard cattermole.co.nz> writes:
On 10/04/2024 4:41 AM, Walter Bright wrote:
 On 4/8/2024 5:47 PM, Richard (Rikki) Andrew Cattermole wrote:
 Consider the performance deficit from write-barriers.
D is in an excellent position to make write barriers opt-in, with different strategies allowing for different GC designs.
It would be a massive bifurcation of the language. It affects everything. Languages that use this tend to have a lot of low-level C systems code, because those languages are not systems implementation languages. Trying to mix code that does and does not do write gates will be very tedious and error prone in hard-to-detect ways.
Yes, it could do that. We would need to set a flag for this in ModuleInfo to detect when a binary isn't compiled with write barriers, given a specific strategy. But even then it's only going to be a partial solution. You can only turn on write barriers if you know your program can make use of them safely. I am in no way suggesting we make this opt-out.

Making this opt-in gives us a ton of room for some PhD students to research, write papers, etc., and hopefully come up with something that'll keep people happy, who could benefit from some extra work on their part.

It's good to remember that just because write barriers are codegen'd in, they may be no-ops if the GC is stop-the-world. So it isn't the end of the world. Depends upon the write-barrier strategy.
Apr 09
prev sibling next sibling parent reply Sebastian Nibisz <snibisz gmail.com> writes:
On Tuesday, 9 April 2024 at 00:14:01 UTC, Walter Bright wrote:
 On 4/6/2024 9:04 AM, Carl Sturtivant wrote:
 Don't straw-man this: none of this says that you HAVE to use 
 GC in D. There is no reason to oppose an ace GC for D on those 
 grounds.
Consider the performance deficit from write-barriers.
Stopping the world is also a performance deficit, which accumulates all at one time.
Apr 08
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 4/8/2024 5:51 PM, Sebastian Nibisz wrote:
 Stopping the world is also a performance deficit, which accumulates all at one 
 time.
A concurrent collector fixes this problem. Rainer implemented one at one point, but there were some technical issues. It's worth revisiting. Besides, the programmer can control when the collection cycle is done.
Apr 09
parent Sebastian Nibisz <snibisz gmail.com> writes:
On Tuesday, 9 April 2024 at 16:45:18 UTC, Walter Bright wrote:
 A concurrent collector fixes this problem. Rainer implemented 
 one at one point, but there were some technical issues. It's 
 worth revisiting.
Memory protection mechanisms are expensive and also cause pauses.
 Besides, the programmer can control when then the collection 
 cycle is done.
What for?
Apr 09
prev sibling parent reply Dukc <ajieskola gmail.com> writes:
On Tuesday, 9 April 2024 at 00:14:01 UTC, Walter Bright wrote:
 On 4/6/2024 9:04 AM, Carl Sturtivant wrote:
 Don't straw-man this: none of this says that you HAVE to use 
 GC in D. There is no reason to oppose an ace GC for D on those 
 grounds.
Consider the performance deficit from write-barriers.
Assignments to a pointer should probably call a DRuntime handler. As a code example, any time one assigns to a pointer (including dynamic arrays, class references, delegate contexts and so on):

```D
int* ptr;
ptr = new int(35);
```

...it would be treated as (without doing the safety checks):

```D
*cast(size_t*) &ptr = __ptrWrite!int(cast(size_t) ptr, cast(size_t) new int(35));
```

The DRuntime definition for the present behaviour would simply be:

```D
pragma(inline, true)
size_t __ptrWrite(Pointee)(size_t oldVal, size_t newVal) => newVal;
```

and this would remain an option, but alternatively `__ptrWrite` can have a write gate. If the write gate just ignored any pointers not registered with the GC, it would probably continue to work with existing code with no changes other than a slight slowdown. This would enable a tri-color GC, and its better performance could easily win back far more than the write gate overhead costs in most cases. The existing mark-and-sweep collector can still remain an option for those who don't want write gates. Also, even when the write gates are used, you could still forgo them explicitly in unsafe code by writing `*cast(size_t*) &ptr = cast(size_t) newValue`. Or probably use a less ugly library function to do the same.
Apr 09
next sibling parent Nick Treleaven <nick geany.org> writes:
On Tuesday, 9 April 2024 at 13:50:34 UTC, Dukc wrote:
 On Tuesday, 9 April 2024 at 00:14:01 UTC, Walter Bright wrote:
 Consider the performance deficit from write-barriers.
Assignments to a pointer probably should call DRuntime handle.
... Thanks, I've filed a bugzilla enhancement that links to your post: https://issues.dlang.org/show_bug.cgi?id=24492 It also links to a discussion where Walter appears to be OK with opt-in write barriers.
Apr 09
prev sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 4/9/2024 6:50 AM, Dukc wrote:
 If the write gate would just ignore any pointers not registered to the GC 
 it would probably continue to work with existing code with no changes other
than 
 a slight slowdown.
This appears to require that the language would be cognizant of two different pointer types - gc pointers, and non-gc pointers. This concept was implemented in Microsoft's "Managed C++" language. It's still in use, but I never hear anyone mention it.

The two-pointer-type scheme is a massive increase in complexity that the programmer has to deal with. For example,

```
int strcmp(char* s1, char* s2);
```

now requires 4 declarations (& means a GC pointer):

```
int strcmp(char* s1, char* s2);
int strcmp(char& s1, char* s2);
int strcmp(char* s1, char& s2);
int strcmp(char& s1, char& s2);
```

People dealt with multiple pointer types in the DOS 16 bit days, and I was very glad to be able to get away from that.
Apr 09
parent Dukc <ajieskola gmail.com> writes:
On Tuesday, 9 April 2024 at 16:54:03 UTC, Walter Bright wrote:
 On 4/9/2024 6:50 AM, Dukc wrote:
 If the write gate would just ignore any pointers not 
 registered to the GC it would probably continue to work with 
 existing code with no changes other than a slight slowdown.
This appears to require that the language would be cognizant of two different pointer types - gc pointers, and non-gc pointers. This concept was implemented in Microsoft's "Managed C++" language.
No, you misunderstood. There would be only one pointer type, whether it's write-gated would be controlled by a version switch. What I meant is: *if* the write-gates are turned on, the write gate would check *at runtime* whether the memory pointed to by the new value is registered to the GC. If not, nothing is done apart from the assignment. Meaning, pointing to memory not controlled by the GC would still work, although the pointer writes would be slower than if write gates were not used. There would be no difference in language semantics. Both GC-controlled memory and manually allocated memory work exactly the same in both cases. The difference would only be in performance.
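As a sketch of that runtime check: the gate body below is hypothetical (`gatedAssign` is not DRuntime API), but `GC.addrOf` from `core.memory` is real and returns `null` for memory the GC does not manage, so non-GC pointers fall straight through.

```D
import core.memory : GC;
import core.stdc.stdlib : free, malloc;

/// Hypothetical write gate: GC-managed targets get extra bookkeeping,
/// everything else pays only for the lookup.
void gatedAssign(T)(ref T* slot, T* newVal)
{
    if (GC.addrOf(cast(void*) newVal) !is null)
    {
        // A real gate would notify the collector here, e.g. shade the
        // pointee grey for a tri-color marker. Omitted in this sketch.
    }
    slot = newVal;
}

void main()
{
    int* gcPtr;
    gatedAssign(gcPtr, new int(35));          // GC-registered: branch taken
    assert(GC.addrOf(cast(void*) gcPtr) !is null);

    int* cPtr;
    auto raw = cast(int*) malloc(int.sizeof); // manually managed memory
    gatedAssign(cPtr, raw);                   // falls straight through
    assert(GC.addrOf(cast(void*) cPtr) is null);
    free(raw);
}
```

Either way the language semantics are unchanged; the branch only decides whether the collector hears about the store.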
Apr 10
prev sibling next sibling parent reply Ogi <ogion.art gmail.com> writes:
On Monday, 1 April 2024 at 20:45:25 UTC, Adam Wilson wrote:
 Personally, I blame the OS/Game crowd for single-handedly 
 keeping D out of the web service space for the past *decade* 
 because, instead of improving the GC to the point that 
 long-running processes are possible, we've built a mountain of 
 (mis)features designed to assuage their demands.
If you want to write web things, there are many other programming languages with established frameworks and tools, huge communities and corporate support. Perhaps it’s you who should consider a different language.
Apr 06
next sibling parent reply Paulo Pinto <pjmlp progtools.org> writes:
On Saturday, 6 April 2024 at 11:04:29 UTC, Ogi wrote:
 On Monday, 1 April 2024 at 20:45:25 UTC, Adam Wilson wrote:
 Personally, I blame the OS/Game crowd for single-handedly 
 keeping D out of the web service space for the past *decade* 
 because, instead of improving the GC to the point that 
 long-running processes are possible, we've built a mountain of 
 (mis)features designed to assuage their demands.
If you want to write web things, there are many other programming languages with established frameworks and tools, huge communities and corporate support. Perhaps it’s you who should consider a different language.
embedded development, as there are already several companies selling hardware and compiler toolchains, with companies paying real money for them. Something that D has been missing out on, by being stuck in this kind of discussion.
Apr 06
parent Carl Sturtivant <sturtivant gmail.com> writes:
On Saturday, 6 April 2024 at 14:40:04 UTC, Paulo Pinto wrote:

 embedded development, as there is already several companies 
 selling hardware and compiler toolchains, with companies paying 
 real money for them.

 Something that D has been missing out, by being stuck onto this 
 kind of discussions.
You seem to be saying that speaking of the strategic future of D has stopped useful things from happening. Seems unlikely. Let's speak of the present discussion instead of the nebulous "this kind of discussions". Specifically what about it will stop what specific useful things from happening?
Apr 06
prev sibling parent Carl Sturtivant <sturtivant gmail.com> writes:
On Saturday, 6 April 2024 at 11:04:29 UTC, Ogi wrote:
 On Monday, 1 April 2024 at 20:45:25 UTC, Adam Wilson wrote:
 Personally, I blame the OS/Game crowd for single-handedly 
 keeping D out of the web service space for the past *decade* 
 because, instead of improving the GC to the point that 
 long-running processes are possible, we've built a mountain of 
 (mis)features designed to assuage their demands.
If you want to write web things, there are many other programming languages with established frameworks and tools, huge communities and corporate support. Perhaps it’s you who should consider a different language.
Having a state-of-the-art ace GC of the effective-for-soft-real-time kind, as per the article that started this forum thread, would put D in a strong place with regard to web services too. Telling someone to leave D and not pursue this strategic future that they themselves are advocating is bizarre, to say the least. They're not just thinking tactically about the very next piece of web-services coding.
Apr 06
prev sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 4/1/2024 1:45 PM, Adam Wilson wrote:
 Personally, I blame the OS/Game crowd for single-handedly keeping D out of the 
 web service space for the past *decade* because, instead of improving the GC
to 
 the point that long-running processes are possible, we've built a mountain of 
 (mis)features designed to assuage their demands. I maintain that this was 
 probably the second biggest mistake in D's history.
The sad thing is if you don't want the GC in your program, don't use 'new'. I can never get this point across.
 We need to accept a fact that I learned at DConf 2017, the no-GC crowd will
not 
 be silenced until the GC is removed from the language. However, Walter has
said 
 the GC is here to stay. Therefore, the no-GC crowd will never be silenced.
We've 
 given them a plethora of tools to work without the GC. It is time that we 
 stopped giving them so much our time. We have bigger problems to solve.
One unexpected yuge advantage to having the GC in D is it enables CTFE.
Apr 08
parent reply Ogi <ogion.art gmail.com> writes:
On Monday, 8 April 2024 at 20:21:35 UTC, Walter Bright wrote:
 The sad thing is if you don't want the GC in your program, 
 don't use 'new'. I can never get this point across.
Also don’t use exceptions, don’t use `lazy`, don’t use built-in dynamic and associative arrays, be careful to not accidentally allocate with an array literal:

```D
if (arr[0..3] == [0, 1, 2]) { /*…*/ } // bad

int[3] arr2 = [0, 1, 2];
if (arr[0..3] == arr2) { /*…*/ } // good
```

…be careful to not accidentally create a closure:

```D
int x;
auto r = arr[].map!(e => e*x); // bad
auto r = arr[].zip(x.repeat).map!(t => t[0]*t[1]); // good
```

…and don’t use a huge bulk of Phobos. Writing no-GC code feels like walking through a minefield. Even seemingly innocent things like `DateTime` can use the GC internally.
Apr 09
next sibling parent reply tchaloupka <chalucha gmail.com> writes:
On Tuesday, 9 April 2024 at 10:29:54 UTC, Ogi wrote:
 On Monday, 8 April 2024 at 20:21:35 UTC, Walter Bright wrote:
 [...]
Also don’t use exceptions, don’t use `lazy`, don’t use built-in dynamic and associative arrays, ...
Having basic building blocks like Fiber, Thread, Mutex, etc. implemented as class objects complicates things further. There is also no tool (that I know of) to help analyze memory leaks of GC memory. Lately, I've spent almost a man-week trying to find the cause of a leak in a large codebase using vibe-d, in a long-running service that just kept growing in memory. Not fun at all. Whereas I could be using memory sanitizer tools with malloc-managed memory just fine ;-)
Apr 09
parent "Richard (Rikki) Andrew Cattermole" <richard cattermole.co.nz> writes:
On 09/04/2024 11:04 PM, tchaloupka wrote:
 There is also no tool (that I know of) to help analyze memory leaks of 
 the GC memory.
 Lately, I've spent almost a manweek trying to find the cause of the leak 
 in a large codebase using vibe-d in a long-running service that just 
 kept growing on memory. Not fun at all. While I could be using memory 
 sanitizer tools with malloc managed memory just fine ;-)
Valgrind should work. https://dlang.org/changelog/2.105.0.html#druntime.valgrind
Apr 09
prev sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
That's what @nogc is for.
Apr 09
parent reply Ogi <ogion.art gmail.com> writes:
On Tuesday, 9 April 2024 at 16:56:13 UTC, Walter Bright wrote:
 That's what  nogc is for.
Mine detectors are helpful, but it would be nice if there were fewer mines in my backyard.
Apr 11
parent reply Adam Wilson <flyboynw gmail.com> writes:
On Thursday, 11 April 2024 at 07:09:00 UTC, Ogi wrote:
 On Tuesday, 9 April 2024 at 16:56:13 UTC, Walter Bright wrote:
 That's what  nogc is for.
Mine detectors are helpful but it would be nice if there were less mines in my backyard.
And there it is. This is where we always end up in the pro-GC vs. anti-GC debate. The anti-GC side eventually ends up with some version of "if only we could just get rid of the GC, then my life would finally be great."

The number of "mines" is not going to get smaller. It can't. It will probably only grow. D is a memory safe language, of which the GC is a cornerstone component; without the GC it is no longer a memory safe language. You have tools to avoid the GC, but even so, you're going to have to do more work than if you didn't avoid it. That's what you're signing up for when you go for manual memory management. It really is that simple.
Apr 12
parent Nick Treleaven <nick geany.org> writes:
On Friday, 12 April 2024 at 22:13:53 UTC, Adam Wilson wrote:
 D is a memory safe language [...] without the GC it is no 
 longer a memory safe language.
That is not true. You can avoid heap allocation, or use allocation patterns with a safe interface like SafeRefCounted, or you can write programs that don't need deallocation, or you can avoid deallocating in safe functions.
Apr 13
prev sibling next sibling parent reply zjh <fqbqrr 163.com> writes:
On Sunday, 31 March 2024 at 14:22:43 UTC, Adam wrote:

 [1]: https://bitbashing.io/gc-for-systems-programmers.html
I think if you like `GC`, you can try `openD`, which is pure `GC`. As for `D`, if it's purely `GC`, why don't others just use `Rust`? Are you prepared to completely abandon the `system scope`?
Apr 05
next sibling parent reply Adam Wilson <flyboynw gmail.com> writes:
On Saturday, 6 April 2024 at 01:31:33 UTC, zjh wrote:
 On Sunday, 31 March 2024 at 14:22:43 UTC, Adam wrote:

 [1]: https://bitbashing.io/gc-for-systems-programmers.html
I think if you like `GC`, you can try `openD`, which is pure `GC`. As for `D`, if it's purely `GC`, why don't others use `Rust`? Are you prepared to completely abandon for `system scope`?.
Because it's not purely GC, nor is it intended to go that direction. I use the no-GC features occasionally, I don't need a pure-GC language. Just need a GC that doesn't suck compared to
Apr 05
parent zjh <fqbqrr 163.com> writes:
On Saturday, 6 April 2024 at 01:55:47 UTC, Adam Wilson wrote:

 Because it's not purely GC, nor is it intended to go that 
 direction. I use the no-GC features occasionally, I don't need 
 a pure-GC language. Just need a GC that doesn't suck compared 

You don't have to worry at all, they have discussed it hundreds of times..
Apr 05
prev sibling parent Carl Sturtivant <sturtivant gmail.com> writes:
On Saturday, 6 April 2024 at 01:31:33 UTC, zjh wrote:
 On Sunday, 31 March 2024 at 14:22:43 UTC, Adam wrote:

 [1]: https://bitbashing.io/gc-for-systems-programmers.html
I think if you like `GC`, you can try `openD`, which is pure `GC`.
False.
 As for `D`, if it's purely `GC`, why don't others use `Rust`?
Bogus assumption about D.
 Are you prepared to completely abandon for `system scope`?.
Straw man. None of the things you knock down are a part of the conversation in this forum thread. There is no "pure GC", and everything you said has been dealt with sanely in advance. On Thursday, 4 April 2024 at 14:32:58 UTC, bachmeier wrote:
 the problem is that the anti-GC zealots make the following 
 claims:

 - The GC gets in the way. It needs to be removed the way Rust 
 removed theirs for performance reasons.
 - D isn't suitable as a systems programming language because it 
 has a GC.
 - GC should be removed as the default because it leads to bad 
 practice.
 - All D programs have to use the GC.
Apr 06
prev sibling next sibling parent reply Ogi <ogion.art gmail.com> writes:
 Many developers opposed to garbage collection are building 
 “soft” real-time systems. They want to go as fast as 
 possible—more FPS in my video game! Better compression in my 
 streaming codec! But they don’t have hard latency requirements. 
 Nothing will break and nobody will die if the system 
 occasionally takes an extra millisecond.
This depends on the amount of latency. If it’s only “an extra millisecond” then yeah, not an issue. But if garbage collection can take more than an entire game tick, then it’s a no-go.

I wrote a small script that simulates a video game. A game scene contains around ten thousand objects, and on every update it creates some new objects and destroys some old ones. It runs for 36,000 iterations (simulating 10 minutes of exciting 60 FPS gameplay) and measures minimum, mean and maximum time per update. One version uses the GC, the other uses `malloc`/`free`.

The no-GC version completed the task in 15 seconds, giving a mean update time of 0.4 ms. The maximum was 1 ms.

The naïve GC version took 6 minutes, 40 seconds; the mean update time was 11 ms. The maximum was 34 milliseconds. That’s *two entire updates* in a game that runs at 60 updates per second. At some points the game cannot maintain 60 updates per second, dropping to as low as 45. Not frames—I’m not rendering anything. Updates. The whole game just slows down. The profiler reported around 12k collections. Collections could run multiple times during a single update, killing performance.

Disabling the GC for the duration of a game update and calling `GC.collect` manually was a big improvement. The simulation took 1 minute 37 seconds, so the mean update time is 2.7 ms. The maximum is 27 ms, which means occasional framerate drops. Well, at least the game can run at full speed...

So the GC is causing significantly worse performance and a worse user experience than simple `malloc`. I don’t even have to use object pools or free lists or whatever. Do people who tell us that it’s fine to use a stop-the-world GC in “soft” real-time applications actually run some tests?
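A minimal sketch of that third variant (hypothetical; the actual script isn't shown in the thread, and `updateScene` here is a stand-in for the game logic): automatic collections are suppressed with `GC.disable`, and an explicit `GC.collect` runs once per frame at a point the program chooses.

```D
import core.memory : GC;
import std.datetime.stopwatch : AutoStart, StopWatch;
import std.stdio : writefln;

// Hypothetical stand-in for one game update: churn a few allocations.
void updateScene(Object[] scene, size_t frame)
{
    foreach (i; 0 .. 20)
        scene[(frame * 20 + i) % scene.length] = new Object;
}

void main()
{
    auto scene = new Object[](10_000);

    long worstUsecs;
    GC.disable();                  // no automatic collections mid-update
    foreach (frame; 0 .. 3_600)    // the post ran 36,000 frames
    {
        auto sw = StopWatch(AutoStart.yes);
        updateScene(scene, frame);
        GC.collect();              // explicit collection runs even while disabled
        const t = sw.peek.total!"usecs";
        if (t > worstUsecs) worstUsecs = t;
    }
    writefln("worst update: %s us", worstUsecs);
}
```

The point of the pattern is only to move the pause to a chosen spot in the frame; the collection itself still stops the world.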
Apr 06
next sibling parent reply Sergey <kornburn yandex.ru> writes:
On Saturday, 6 April 2024 at 10:52:04 UTC, Ogi wrote:
 I wrote a small script that simulates a video game. A game 
 scene contains around ten thousand objects, and on every update 
 it creates some new objects and destroys some old ones. It runs 
 for 36,000 iterations (simulating 10 minutes of exciting 60 FPS 
 gameplay) and measures minimum, mean and maximum time per 
 update. One version uses GC, the other uses `malloc`/`free`.
Very similar test.. D with LDC showed more FPS than Rust and C++ :P https://github.com/NCrashed/asteroids-arena
Apr 06
next sibling parent reply Carl Sturtivant <sturtivant gmail.com> writes:
On Saturday, 6 April 2024 at 14:53:09 UTC, Sergey wrote:
 On Saturday, 6 April 2024 at 10:52:04 UTC, Ogi wrote:
 I wrote a small script that simulates a video game. A game 
 scene contains around ten thousand objects, and on every 
 update it creates some new objects and destroys some old ones. 
 It runs for 36,000 iterations (simulating 10 minutes of 
 exciting 60 FPS gameplay) and measures minimum, mean and 
 maximum time per update. One version uses GC, the other uses 
 `malloc`/`free`.
Very similar test.. D with LDC showed more FPS than Rust and C++ :P https://github.com/NCrashed/asteroids-arena
This is all very well, but this side thread is conflating the existing GC with the one we are postulating for D that would be the game changer, with performance as per the article linked when this thread was started.
Apr 06
parent reply Sergey <kornburn yandex.ru> writes:
On Saturday, 6 April 2024 at 15:52:21 UTC, Carl Sturtivant wrote:
 Very similar test..
 D with LDC showed more FPS than Rust and C++ :P
 https://github.com/NCrashed/asteroids-arena
This is all very well, but this side thread is conflating the existing GC with the one we are postulating for D that would be the game changer, with performance as per the article linked when this thread was started.
I've just shared with Ogi some other tests showing that things are not as bad as in the tests Ogi mentioned. That article is not new and was discussed previously. Moreover, several discussions about improvements to the GC have taken place on the D Discord. But I can't see any real points from the topic starter or you. Maybe you can help me with that. Let me briefly summarize some things to be on the same page:

* having a "better" GC would be good for D (kinda obvious, but even "better" could mean different things to different people)
* but it is very hard and expensive research (Google did some for Go)
* a GC overall is not a simple thing, and you need someone who will be able to spend significant resources on it
* Rikki has some ideas for improvements related to barriers, but afaic there is no consensus about it
* Walter previously mentioned at DConf that some metaprogramming features of D are available only because we have the current GC approach, and they could not work with another architecture
* (https://www.youtube.com/watch?v=tzt36EGKEZo) has a not very good GC
* one of the state-of-the-art GCs, Azul's "pauseless" GC, is the result of hard work by a whole company, which is impossible for the current state of D

So what is the point then that you wanted to discuss in this thread?
Apr 06
next sibling parent reply "Richard (Rikki) Andrew Cattermole" <richard cattermole.co.nz> writes:
On 07/04/2024 4:28 AM, Sergey wrote:
 Rikki has some ideas for improvements, that related to barriers, but 
 afaic there is no consensus about it
It's not really an idea on my end. Modern GCs tend to need write barriers to make them work. There is no way around it. We can have as much consensus as we want on the subject, but if we don't have them, we aren't getting anything advanced.
Apr 06
parent reply Sebastian Nibisz <snibisz gmail.com> writes:
On Saturday, 6 April 2024 at 16:49:44 UTC, Richard (Rikki) Andrew 
Cattermole wrote:
 Modern GC's tend to need write barriers to make them work.
Write barrier can be as cheap as a single atomic write.
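As an illustration of that shape (entirely hypothetical: `cardTable`, `heapBase` and the 512-byte card size are assumptions for the sketch, not DRuntime structures), the mutator-side cost is one atomic byte store plus the pointer store itself:

```D
import core.atomic : atomicStore;

enum cardShift = 9;              // assume 512-byte cards
shared ubyte[1 << 16] cardTable; // assume a small fixed heap for the sketch
size_t heapBase;                 // assumed base address of the tracked heap

/// Run on every pointer store: mark the card holding `slot` dirty, then store.
void gatedWrite(T)(T** slot, T* newVal)
{
    // modulo only to keep this standalone sketch in bounds
    const card = (((cast(size_t) slot) - heapBase) >> cardShift) % cardTable.length;
    atomicStore(cardTable[card], ubyte(1)); // single atomic byte write
    *slot = newVal;
}

void main()
{
    int a = 42;
    int* p = null;
    gatedWrite(&p, &a);
    assert(*p == 42); // the store itself behaves exactly as before
}
```

A concurrent marker then rescans dirty cards instead of the whole heap; the debate above is about whether that per-store cost is acceptable when much of a D program's memory never touches the GC.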
Apr 06
next sibling parent "H. S. Teoh" <hsteoh qfbox.info> writes:
On Sat, Apr 06, 2024 at 07:09:12PM +0000, Sebastian Nibisz via Digitalmars-d
wrote:
 On Saturday, 6 April 2024 at 16:49:44 UTC, Richard (Rikki) Andrew Cattermole
 wrote:
 Modern GC's tend to need write barriers to make them work.
Write barrier can be as cheap as a single atomic write.
So let's do it! T -- The problem with the world is that everybody else is stupid.
Apr 06
prev sibling next sibling parent reply "Richard (Rikki) Andrew Cattermole" <richard cattermole.co.nz> writes:
On 07/04/2024 7:09 AM, Sebastian Nibisz wrote:
 On Saturday, 6 April 2024 at 16:49:44 UTC, Richard (Rikki) Andrew 
 Cattermole wrote:
 Modern GC's tend to need write barriers to make them work.
Write barrier can be as cheap as a single atomic write.
*Can be.* It would need to call into the GC as the strategy until we know what implementations need. It must also be noted that the dmd backend doesn't support atomics as intrinsics, and Walter has been very against adding them, so it would still be a function call.
Apr 06
parent reply ryuukk_ <ryuukk.dev gmail.com> writes:
On Sunday, 7 April 2024 at 06:16:13 UTC, Richard (Rikki) Andrew 
Cattermole wrote:
 On 07/04/2024 7:09 AM, Sebastian Nibisz wrote:
 On Saturday, 6 April 2024 at 16:49:44 UTC, Richard (Rikki) 
 Andrew Cattermole wrote:
 Modern GC's tend to need write barriers to make them work.
Write barrier can be as cheap as a single atomic write.
*can be* It would need to call into the GC as the strategy until we know what implementations need. It must also be noted that dmd-be doesn't support atomics as intrinsics and Walter has been very against adding it, so it would still be a function call.
Don't wait for Walter, you are capable, if you know how to do it, submit a PR I'm pretty sure everyone wants atomics as intrinsics, myself included, just do it
Apr 07
parent "Richard (Rikki) Andrew Cattermole" <richard cattermole.co.nz> writes:
On 07/04/2024 9:41 PM, ryuukk_ wrote:
 On Sunday, 7 April 2024 at 06:16:13 UTC, Richard (Rikki) Andrew 
 Cattermole wrote:
 On 07/04/2024 7:09 AM, Sebastian Nibisz wrote:
 On Saturday, 6 April 2024 at 16:49:44 UTC, Richard (Rikki) Andrew 
 Cattermole wrote:
 Modern GC's tend to need write barriers to make them work.
Write barrier can be as cheap as a single atomic write.
*can be* It would need to call into the GC as the strategy until we know what implementations need. It must also be noted that dmd-be doesn't support atomics as intrinsics and Walter has been very against adding it, so it would still be a function call.
Don't wait for Walter, you are capable, if you know how to do it, submit a PR I'm pretty sure everyone wants atomics as intrinsics, myself included, just do it
I don't know how to do it and have zero desire to learn that backend to that point, let alone how to introduce such intrinsics. I know enough x86 assembly to be dangerous, but overall have very little experience with backends. There are better ways for me to contribute.
Apr 07
prev sibling parent reply ryuukk_ <ryuukk.dev gmail.com> writes:
On Saturday, 6 April 2024 at 19:09:12 UTC, Sebastian Nibisz wrote:
 On Saturday, 6 April 2024 at 16:49:44 UTC, Richard (Rikki) 
 Andrew Cattermole wrote:
 Modern GC's tend to need write barriers to make them work.
Write barrier can be as cheap as a single atomic write.
Again, asking to become even worse than Java. Java's latest concurrent GC doesn't need write barriers. Rust is the best evidence that nobody wants a systems language with a GC; only Microsoft does, and they have lost.
Apr 07
next sibling parent reply Sebastian Nibisz <snibisz gmail.com> writes:
On Sunday, 7 April 2024 at 09:44:21 UTC, ryuukk_ wrote:
 On Saturday, 6 April 2024 at 19:09:12 UTC, Sebastian Nibisz 
 wrote:
 Write barrier can be as cheap as a single atomic write.
Again, asking to become even worse than Java
This is better than Java.
 Java's latest new concurrent GC doesn't need write barriers
That's not true. You either use a write barrier or you stop the world; otherwise, it is impossible. All Java GCs use write barriers and pause threads. The GC for D can be completely pauseless.
Apr 07
parent deadalnix <deadalnix gmail.com> writes:
On Sunday, 7 April 2024 at 10:04:37 UTC, Sebastian Nibisz wrote:
 Java's latest new concurrent GC doesn't need write barriers
That's not true. You either use a write barrier or you stop the world; otherwise, it is impossible. All Java GCs use write barriers and pause threads. The GC for D can be completely pauseless.
You can also have read barriers, but the typical program does vastly more reads than writes, so this is a losing strategy.
Apr 08
prev sibling next sibling parent reply Paulo Pinto <pjmlp progtools.org> writes:
On Sunday, 7 April 2024 at 09:44:21 UTC, ryuukk_ wrote:
 On Saturday, 6 April 2024 at 19:09:12 UTC, Sebastian Nibisz 
 wrote:
 On Saturday, 6 April 2024 at 16:49:44 UTC, Richard (Rikki) 
 Andrew Cattermole wrote:
 Modern GC's tend to need write barriers to make them work.
Write barrier can be as cheap as a single atomic write.
Again, asking to become even worse than Java Java's latest new concurrent GC doesn't need write barriers Rust is the best evidence that nobody wants a system language with a GC, only Microsoft do, and they has lost
Swift is a systems language with a GC, and before you object that RC isn't one, I suggest reading proper CS material on automatic memory management research, and not random blog posts on the matter.
Apr 07
parent ryuukk_ <ryuukk.dev gmail.com> writes:
On Sunday, 7 April 2024 at 10:28:37 UTC, Paulo Pinto wrote:
 On Sunday, 7 April 2024 at 09:44:21 UTC, ryuukk_ wrote:
 On Saturday, 6 April 2024 at 19:09:12 UTC, Sebastian Nibisz 
 wrote:
 On Saturday, 6 April 2024 at 16:49:44 UTC, Richard (Rikki) 
 Andrew Cattermole wrote:
 Modern GC's tend to need write barriers to make them work.
Write barrier can be as cheap as a single atomic write.
Again, asking to become even worse than Java Java's latest new concurrent GC doesn't need write barriers Rust is the best evidence that nobody wants a system language with a GC, only Microsoft do, and they has lost
Swift is a systems language with a GC, and before you mention RC isn't, I suggest reading proper CS material on automatic memory management research, and not random blog posts on the matter.
Smarter than what people suggest on this forum
Apr 07
prev sibling parent Alexandru Ermicioi <alexandru.ermicioi gmail.com> writes:
On Sunday, 7 April 2024 at 09:44:21 UTC, ryuukk_ wrote:
 Rust is the best evidence that nobody wants a system language 
 with a GC, only Microsoft do, and they has lost
Please remember that D is not only a low-level systems programming language. Most people would be happy not to fight with memory management, especially companies that want to minimize development costs without compromising application security. Best regards, Alexandru.
Apr 07
prev sibling parent reply Carl Sturtivant <sturtivant gmail.com> writes:
On Saturday, 6 April 2024 at 16:28:25 UTC, Sergey wrote:
 So what is the point then you wanted to discuss in this thread?
This is not about what I wanted to discuss in this thread, it's about what IS being discussed in this thread. Finding strategic purpose via the future of GC. On Saturday, 6 April 2024 at 00:29:04 UTC, H. S. Teoh wrote:
 Chasing no-GC as far as we did was a mistake that cost us 
 precious time and scarce resources. We need to be mature 
 enough to admit that it was a mistake and correct our course. 
 Given the lessons and direction of the industry over the 
 intervening years, I would strongly argue that now is the time 
 to return our focus to the GC.
[...] +100. While there *have* been improvements in our current GC over the past years, we're running against a brick wall in terms of available GC algorithms, because of the pessimistic situation of no write barriers. That closes the door to many of the major advancements in GC algorithms over the past decade or two. It's time we stop sitting on the fence and commit to a GC-centric language that actually has a competitive GC to speak of.
Let's not confuse requirements with implementation. By muddling in all the ins-and-outs of implementation difficulty you blur the wider picture and make things look more pointless. Should everyone just give up? Your reply suggests that posture.
Apr 06
next sibling parent reply Sergey <kornburn yandex.ru> writes:
On Saturday, 6 April 2024 at 16:59:14 UTC, Carl Sturtivant wrote:
 On Saturday, 6 April 2024 at 16:28:25 UTC, Sergey wrote:
 So what is the point then you wanted to discuss in this thread?
Finding strategic purpose via the future of GC.
D doesn't have any strategy. As a samurai, D has no goal, only a path. But it is easy to get a "future GC": take a couple of PhDs/postdocs, lock them in a room with coffee and pizza for a couple of years - done.
Apr 06
parent Carl Sturtivant <sturtivant gmail.com> writes:
On Saturday, 6 April 2024 at 17:08:55 UTC, Sergey wrote:
 On Saturday, 6 April 2024 at 16:59:14 UTC, Carl Sturtivant 
 wrote:
 On Saturday, 6 April 2024 at 16:28:25 UTC, Sergey wrote:
 So what is the point then you wanted to discuss in this 
 thread?
Finding strategic purpose via the future of GC.
D doesn't have any strategy.
As discussed, it needs to get some. On Saturday, 6 April 2024 at 00:29:04 UTC, H. S. Teoh wrote:
 On Sat, Apr 06, 2024 at 12:06:04AM +0000, Adam Wilson via
 This is the point I was trying to make. The strategic win for 
 have a fantastic GC would be immense, and would far outweigh 
 anything we could gain by continuing down the no-GC path.
Totally. If we could introduce write barriers, that would open the door to incremental generational GCs, which would be a HUGE step in the long run. Maybe not so much in the short term, but guys, it's been at least a decade. We gotta stop chasing decisions that only benefit the short term. It's time to think about long-term strategy.
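For readers who haven't seen the mechanism: a card-marking write barrier is a tiny hook on every pointer store that records which region of the heap was mutated, so a generational collector can rescan only those regions on a minor collection. A minimal C sketch (illustrative only; the names and sizes here are made up, and D's current GC has no such barrier):

```c
#include <stddef.h>
#include <stdint.h>

/* Hypothetical managed heap of 1 MiB, divided into 512-byte "cards".
 * On a minor collection, a generational GC rescans only dirty cards
 * instead of the entire old generation. */
#define HEAP_SIZE (1u << 20)
#define CARD_SIZE 512u
#define NUM_CARDS (HEAP_SIZE / CARD_SIZE)

static uint8_t heap[HEAP_SIZE];
static uint8_t card_table[NUM_CARDS];

/* The write barrier: every pointer store into the managed heap also
 * marks the card containing the written slot as dirty. */
static void write_ptr(void **slot, void *value)
{
    *slot = value;
    size_t offset = (size_t)((uint8_t *)slot - heap);
    card_table[offset / CARD_SIZE] = 1;
}

/* Number of cards the collector must rescan on the next minor cycle. */
static size_t dirty_cards(void)
{
    size_t n = 0;
    for (size_t i = 0; i < NUM_CARDS; i++)
        n += card_table[i];
    return n;
}
```

The per-store cost is a subtraction, a shift, and a byte write; the debate in this thread is whether that cost on every pointer mutation is acceptable in a mixed-allocator language.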
Apr 06
prev sibling next sibling parent Carl Sturtivant <sturtivant gmail.com> writes:
On Saturday, 6 April 2024 at 16:59:14 UTC, Carl Sturtivant wrote:
 On Saturday, 6 April 2024 at 16:28:25 UTC, Sergey wrote:
 Let's not confuse requirements with implementation. By muddling 
 in all the ins-and-outs of implementation difficulty you blur 
 the wider picture and make things look more pointless. Should 
 everyone just give up? Your reply suggests that posture.
This:
 On Saturday, 6 April 2024 at 16:28:25 UTC, Sergey wrote:
 * but it is very hard and expensive research (Google did some for Go)
 * GC overall is not a simple thing, and you need someone who will be able to spend significant resources on it
 * Rikki has some ideas for improvements related to barriers, but afaik there is no consensus about them
 * Walter previously mentioned at DConf that some metaprogramming features of D are available only because of the current GC approach, and they could not work with another architecture

 https://www.youtube.com/watch?v=tzt36EGKEZo) has a not very good GC
 * Azul's state-of-the-art "pauseless" GC is the result of hard work by the whole company, which is impossible in the current state of D
Apr 06
prev sibling parent Sergey <kornburn yandex.ru> writes:
On Saturday, 6 April 2024 at 16:59:14 UTC, Carl Sturtivant wrote:
 Should everyone just give up?
Just wait for a hero: some university professor who would guide a PhD student to do it, or a company that would like to sponsor it.
Apr 06
prev sibling parent reply Ogi <ogion.art gmail.com> writes:
On Saturday, 6 April 2024 at 14:53:09 UTC, Sergey wrote:
 Very similar test..
 D with LDC showed more FPS than Rust and C++ :P
 https://github.com/NCrashed/asteroids-arena
This thing barely uses GC. It implements an ECS architecture: all game entities are split into small components which live in separate arrays, in the manner of an object pool pattern. And it utilizes `std.container.Array` to avoid the GC. An object pool may be good for performance, but it's also a form of manual memory management and therefore haram. GC bros want us to “just use GC”.
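For context, the object-pool trick being described boils down to recycling fixed slots through a free list, so steady-state gameplay never touches the allocator at all, GC or malloc. A hypothetical C sketch (names and capacity invented for illustration):

```c
/* A fixed-capacity pool of position components. Freed slots are chained
 * into a free list and recycled, so steady-state gameplay performs no
 * heap allocation at all. */
#define POOL_CAP 1024

typedef struct { float x, y; } Position;

static Position slots[POOL_CAP];
static int next_free[POOL_CAP]; /* free-list links; -1 terminates */
static int free_head = -1;
static int pool_ready = 0;

static void pool_init(void)
{
    for (int i = 0; i < POOL_CAP - 1; i++)
        next_free[i] = i + 1;
    next_free[POOL_CAP - 1] = -1;
    free_head = 0;
    pool_ready = 1;
}

/* Returns a slot index, or -1 if the pool is exhausted. */
static int pool_alloc(void)
{
    if (!pool_ready)
        pool_init();
    if (free_head < 0)
        return -1;
    int i = free_head;
    free_head = next_free[i];
    return i;
}

static void pool_free(int i)
{
    next_free[i] = free_head;
    free_head = i;
}
```

Whether one calls this "manual memory management" or just preallocation, the point stands: a benchmark built on it exercises the allocator very little, whichever allocator that is.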
Apr 06
parent Paulo Pinto <pjmlp progtools.org> writes:
On Saturday, 6 April 2024 at 21:04:03 UTC, Ogi wrote:
 On Saturday, 6 April 2024 at 14:53:09 UTC, Sergey wrote:
 Very similar test..
 D with LDC showed more FPS than Rust and C++ :P
 https://github.com/NCrashed/asteroids-arena
 This thing barely uses GC. It implements an ECS architecture: all game entities are split into small components which live in separate arrays, in the manner of an object pool pattern. And it utilizes `std.container.Array` to avoid the GC. An object pool may be good for performance, but it's also a form of manual memory management and therefore haram. GC bros want us to “just use GC”.
The fallacy is to push for an XOR strategy instead of an OR strategy in regard to resource management. The GC bros don't argue that GC is the only strategy, only that it should be the default.
Apr 06
prev sibling next sibling parent Carl Sturtivant <sturtivant gmail.com> writes:
On Saturday, 6 April 2024 at 10:52:04 UTC, Ogi wrote:
 Do people who tell us that it’s fine to use stop-the-world GC in “soft” real-time applications actually run some tests?
Straw man. No one told you to use stop-the-world GC in “soft” real-time in this forum thread. The discussion was of an ace don't-stop-the-world GC, as used by the author of the article linked to start this thread, for real-world critical applications. And how D having THAT would be a game changer.
Apr 06
prev sibling next sibling parent Julian Fondren <julian.fondren gmail.com> writes:
On Saturday, 6 April 2024 at 10:52:04 UTC, Ogi wrote:
 I wrote a small script that simulates a video game. A game 
 scene contains around ten thousand objects, and on every update 
 it creates some new objects and destroys some old ones. It runs 
 for 36,000 iterations (simulating 10 minutes of exciting 60 FPS 
 gameplay) and measures minimum, mean and maximum time per 
 update. One version uses GC, the other uses `malloc`/`free`.
This seems like an interesting benchmark. Can you post it somewhere?
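Ogi's script wasn't posted, but the malloc/free variant of a benchmark like the one described could look roughly like the following C sketch. All counts here are placeholders, not Ogi's actual numbers, and the GC variant would simply replace the malloc/free pair with GC allocation:

```c
#include <stdlib.h>
#include <time.h>

/* Keep ~10k live objects; each update destroys a few and allocates
 * replacements, timing every update and tracking min/mean/max. */
#define LIVE_OBJECTS 10000
#define CHURN 64
#define UPDATES 600 /* the described run was 36,000 updates */

typedef struct { double state[8]; } Obj;
typedef struct { double min_ms, mean_ms, max_ms; } Stats;

static Stats run_malloc_version(void)
{
    static Obj *live[LIVE_OBJECTS]; /* static: zero-initialized, off the stack */
    Stats s = { 1e9, 0.0, 0.0 };
    for (int u = 0; u < UPDATES; u++) {
        clock_t t0 = clock();
        for (int i = 0; i < CHURN; i++) {
            int k = rand() % LIVE_OBJECTS;
            free(live[k]);                 /* destroy an old object; free(NULL) is a no-op */
            live[k] = malloc(sizeof(Obj)); /* create its replacement */
            live[k]->state[0] = (double)u;
        }
        double ms = (double)(clock() - t0) * 1000.0 / CLOCKS_PER_SEC;
        if (ms < s.min_ms) s.min_ms = ms;
        if (ms > s.max_ms) s.max_ms = ms;
        s.mean_ms += ms / UPDATES;
    }
    for (int k = 0; k < LIVE_OBJECTS; k++)
        free(live[k]);
    return s;
}
```

The interesting number for games is the max, not the mean: a GC whose mean update time is better can still lose if one update absorbs an entire collection pause.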
Apr 06
prev sibling parent reply Leonardo <leotada523 gmail.com> writes:
On Saturday, 6 April 2024 at 10:52:04 UTC, Ogi wrote:
 [...]
This depends on the amount of latency. If it’s only “an extra millisecond” then yeah, not an issue. But if garbage collection can take more than an entire game tick, then it’s a no-go. [...]
Based on this and my experiments, I think we should at least improve the GC, like Unity did with an incremental GC, to avoid spikes. https://docs.unity3d.com/Manual/performance-incremental-garbage-collection.html
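The idea behind Unity's incremental collector is to split the marking phase across frames under a work budget instead of taking one long stop-the-world pause. A toy C sketch of just the budgeting part (real collectors budget by time rather than object count):

```c
#include <stddef.h>

/* Toy model of incremental marking: instead of marking the whole heap in
 * one stop-the-world pause, mark at most `budget` objects per frame and
 * resume from the saved cursor on the next frame. */
#define NUM_OBJECTS 10000

static unsigned char marked[NUM_OBJECTS];
static size_t cursor = 0;

/* Returns 1 when marking is complete, 0 if more frames are needed. */
static int mark_step(size_t budget)
{
    size_t done = 0;
    while (cursor < NUM_OBJECTS && done < budget) {
        marked[cursor++] = 1; /* stand-in for tracing one object's references */
        done++;
    }
    return cursor == NUM_OBJECTS;
}
```

With a budget of 3,000 objects per frame, marking 10,000 objects completes over four frames, each pause a fraction of a full mark. The catch: the mutator runs between steps and can move pointers behind the collector's back, and write barriers are what keep a partially completed mark correct, which is exactly the missing piece in D today.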
Apr 09
parent reply "Richard (Rikki) Andrew Cattermole" <richard cattermole.co.nz> writes:
On 10/04/2024 1:15 AM, Leonardo wrote:
 On Saturday, 6 April 2024 at 10:52:04 UTC, Ogi wrote:
 [...]
 This depends on the amount of latency. If it’s only “an extra millisecond” then yeah, not an issue. But if garbage collection can take more than an entire game tick, then it’s a no-go. [...]
 Based on this and my experiments, I think we should at least improve the GC, like Unity did with an incremental GC, to avoid spikes. https://docs.unity3d.com/Manual/performance-incremental-garbage-collection.html
Sing along with me: ~~ write barriers ~~ :)
Apr 09
parent reply "H. S. Teoh" <hsteoh qfbox.info> writes:
On Wed, Apr 10, 2024 at 01:19:22AM +1200, Richard (Rikki) Andrew Cattermole via
Digitalmars-d wrote:
 On 10/04/2024 1:15 AM, Leonardo wrote:
 On Saturday, 6 April 2024 at 10:52:04 UTC, Ogi wrote:
 [...]
 This depends on the amount of latency. If it’s only “an extra millisecond” then yeah, not an issue. But if garbage collection can take more than an entire game tick, then it’s a no-go. [...]
 Based on this and my experiments, I think we should at least improve the GC, like Unity did with an incremental GC, to avoid spikes. https://docs.unity3d.com/Manual/performance-incremental-garbage-collection.html
Sing along with me: ~~ write barriers ~~ :)
Write barriers! We've been singing this tune for literally years. It's about time we actually did it instead of merely talking about it!

T

-- 
Too many people have open minds but closed eyes.
Apr 09
parent reply jmh530 <john.michael.hall gmail.com> writes:
On Tuesday, 9 April 2024 at 14:14:08 UTC, H. S. Teoh wrote:
 [snip]
 
 Sing along with me: ~~ write barriers ~~
 
 :)
Write barriers! We've been singing this tune for literally years. It's about time we actually did it instead of merely talking about it! T
If something hasn't happened for years, usually there are good reasons why. Walter raised some recently here [1]. If you force everything to go through the GC, then you avoid some complexity, but you have the added cost of write barriers all the time. If you're in a mixed memory environment, you end up needing two types of pointers (one managed, one unmanaged) and that leads to a lot of complications that people tend to not want to deal with. [1] https://forum.dlang.org/post/uv1l7p$2tu2$1 digitalmars.com
Apr 09
parent reply Lance Bachmeier <no spam.net> writes:
On Tuesday, 9 April 2024 at 15:02:52 UTC, jmh530 wrote:
 On Tuesday, 9 April 2024 at 14:14:08 UTC, H. S. Teoh wrote:
 [snip]
 
 Sing along with me: ~~ write barriers ~~
 
 :)
Write barriers! We've been singing this tune for literally years. It's about time we actually did it instead of merely talking about it! T
If something hasn't happened for years, usually there are good reasons why. Walter raised some recently here [1]. If you force everything to go through the GC, then you avoid some complexity, but you have the added cost of write barriers all the time. If you're in a mixed memory environment, you end up needing two types of pointers (one managed, one unmanaged) and that leads to a lot of complications that people tend to not want to deal with. [1] https://forum.dlang.org/post/uv1l7p$2tu2$1 digitalmars.com
I honestly don’t find that line of argument very convincing. There are no numbers backing up the conclusions, and not even a specific application given. We already have experience with the status quo and the problems it can cause. There’s no justification for ruling out other options based on speculation that the performance hit will be unacceptable.
Apr 09
parent jmh530 <john.michael.hall gmail.com> writes:
On Tuesday, 9 April 2024 at 15:31:08 UTC, Lance Bachmeier wrote:
 [snip]

 I honestly don’t find that line of argument very convincing. 
 There are no numbers backing up the conclusions, and not even a 
 specific application given. We already have experience with the 
 status quo and the problems it can cause. There’s no 
 justification for ruling out other options based on speculation 
 that the performance hit will be unacceptable.
I don't doubt that there are some applications that will benefit from the types of GC algorithms that are enabled by write barriers, but I think the burden of proof is on the people proposing changes. They can create a fork with write barriers and benchmark some code vs. the normal compiler. Regardless, if they can create a way to make it completely opt-in without breaking existing code, then that offsets much of the concern about performance.
Apr 09
prev sibling next sibling parent Atila Neves <atila.neves gmail.com> writes:
On Sunday, 31 March 2024 at 14:22:43 UTC, Adam wrote:
 Thought [this][1] was an interesting read, and is a good
 counterpoint to all those who refuse to give D a chance because
 of GC.

 [1]: https://bitbashing.io/gc-for-systems-programmers.html
Thanks for sharing! I'm going to point a *lot* of people to this blog post in the future.
Apr 08
prev sibling parent Walter Bright <newshound2 digitalmars.com> writes:
On 3/31/2024 7:22 AM, Adam wrote:
 Thought [this][1] was an interesting read, and is a good counterpoint to all
 those who refuse to give D a chance because of GC.
 
 [1]: https://bitbashing.io/gc-for-systems-programmers.html
On HN: https://news.ycombinator.com/item?id=39873692
Apr 08