
digitalmars.D - Tasks, actors and garbage collection

reply Ola Fosheim Grøstad <ola.fosheim.grostad gmail.com> writes:
As computer memory grows, naive scan and sweep garbage collection 
becomes more and more of a burden.

Also, languages have not really come up with a satisfactory way 
to simplify multi-threaded programming, except to split the 
workload into many single-threaded tasks that are run in parallel.

It seems to me that the obvious way to retain the ease of use 
that garbage collection provides without impeding performance is 
to limit the memory to scan, and preferably do the scanning when 
nobody is using the memory.

The actor model seems to be a good fit. Or call it a task, if you 
wish. If each actor/task has its own GC pool then there is less 
memory to scan, and you can do the scanning when the actor/task 
is waiting on I/O or scheduling. So you would get less intrusive 
scanning pauses. It would also fit well with async-await/futures.

Another benefit is that if an actor is deleted before it is 
scanned, then no scanning is necessary at all. It can simply be 
released (assuming destructor-free classes are allocated in a 
separate area). This is of great benefit to web services: they 
can simply implement a request-handler as an actor/task.

The downside is that you need a non-GC mechanism for dealing with 
inter-actor/task communication, such as reference counting. 
That should be quite OK, though, as you would expect both the 
time-consuming stuff and the complex allocation patterns to 
happen within an actor/task.
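
To sketch the shape of this in today's D (no task-local GC exists, so a `Region` allocator stands in for the hypothetical per-actor pool, and `std.typecons.RefCounted` for the inter-actor references):

```D
import std.experimental.allocator : makeArray;
import std.experimental.allocator.building_blocks.region : Region;
import std.experimental.allocator.mallocator : Mallocator;
import std.typecons : RefCounted;

struct Reply { int status; } // crosses task boundaries: refcounted, not GC'd

RefCounted!Reply handleRequest()
{
    // stand-in for the per-actor pool: a 64 KiB region
    auto pool = Region!Mallocator(64 * 1024);

    auto scratch = makeArray!int(pool, 1024); // task-local garbage
    scratch[0] = 200;

    return RefCounted!Reply(Reply(scratch[0])); // only the reply escapes
} // pool dies here: no scan, no sweep, one deallocation
```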

Is this a direction D is able to move in or is a new language 
needed?
Apr 20 2021
next sibling parent reply Petar Kirov [ZombineDev] <petar.p.kirov gmail.com> writes:
On Tuesday, 20 April 2021 at 09:52:07 UTC, Ola Fosheim Grøstad 
wrote:
 As computer memory grows, naive scan and sweep garbage 
 collection becomes more and more of a burden.

 Also, languages have not really come up with a satisfactory way 
 to simplify multi-threaded programming, except to split the 
 workload into many single-threaded tasks that are run in 
 parallel.

 It seems to me that the obvious way to retain the ease of use 
 that garbage collection provides without impeding performance 
 is to limit the memory to scan, and preferably do the scanning 
 when nobody is using the memory.

 The actor model seems to be a good fit. Or call it a task, if 
 you wish. If each actor/task has its own GC pool then there is 
 less memory to scan, and you can do the scanning when the 
 actor/task is waiting on I/O or scheduling. So you would get 
 less intrusive scanning pauses. It would also fit well with 
 async-await/futures.

 Another benefit is that if an actor is deleted before it is 
 scanned, then no scanning is necessary at all. It can simply be 
 released (assuming destructor-free classes are allocated in a 
 separate area). This is of great benefit to web services: they 
 can simply implement a request-handler as an actor/task.

 The downside is that you need a non-GC mechanism for dealing 
 with inter-actor/task communication, such as reference 
 counting. That should be quite OK, though, as you would expect 
 both the time-consuming stuff and the complex allocation 
 patterns to happen within an actor/task.

 Is this a direction D is able to move in or is a new language 
 needed?
A few years ago, when [`std.experimental.allocator`][0] was still hot out of the oven, I considered that this would be one of the primary innovations it would enable.

The basic idea is that since allocators are composable first-class objects, you can pass them to any function and that way you can override and customize its memory allocation policy, without resorting to global variables. (The package does provide convenience [thread-local][1] and [global variables][2], but IMO that's an anti-pattern: if you prefer the simplicity, you can either use the GC (as always) or `Mallocator` directly. IMO, if you're reaching for `std.experimental.allocator`, you do so in order to gain more control over the memory management. Also, knowing whether `theAllocator` points to `GCAllocator` or an actually separate thread-local allocator can be critical for ensuring that code is lock-free. You either know what you're doing, or the code is not performance critical, so it doesn't matter, and you should be using the GC anyway.)

By passing the allocator as an object, you allow it to be used safely from `pure` functions. (If `pure` functions were to somehow be allowed to use those global allocator variables, you could have some ugly consequences. For example, a pure function can be preempted in the middle of its execution, only to have the global allocator replaced under its feet, thereby leaving all the memory allocated from the previous allocator dangling.)

Pure code (even in the relaxed D sense) is great for parallelism, as a scheduler can essentially assume that it's both lock-free and wait-free - it doesn't need to interact with any other thread/fiber/task to make progress.

Having multiple per-thread/fiber/actor/task GC heaps fits naturally in the model you propose. There could be a new `LocalGCAllocator`, which the runtime / framework can simply pass to the actor on its creation.

There are two main challenges:

1. Ensuring code doesn't break the assumptions of the actor model by e.g. sharing memory between threads in an uncontrolled manner. This can be addressed in a variety of ways:
    * The framework's build system can prevent you from importing code that doesn't fit its model
    * The framework can run a non-optional linter as part of the build process, which would ensure that you don't have:
        * `@system` or `@trusted` code
        * `extern` function declarations (otherwise you could define `@safe pure int printf(scope const char* format, scope const ...);`)
    * reference capabilities like [Pony][3]'s
    * other type-system or language built-in static analysis

2. Making it as ergonomic and easy to use as the GC is. Essentially, having all language and library features that currently require the GC use `LocalGCAllocator` automagically. I think this can be done in several steps:
    * Finish transitioning druntime's compiler interface from unchecked "magic" extern(C) functions to regular D (template) functions
    * Add `context` as the last parameter to each druntime function that may need to allocate memory, and set its default value to the global GC context. This is a pure refactoring, no change in behavior.
    * Add Scala `implicit` parameters [⁴][4] [⁵][5] [⁶][6] [⁷][7] [⁸][8] to the language and mark the `context` parameters as `implicit`

[0]: https://dlang.org/phobos/std_experimental_allocator.html
[1]: https://dlang.org/phobos/std_experimental_allocator.html#theAllocator
[2]: https://dlang.org/phobos/std_experimental_allocator.html#.processAllocator
[3]: https://tutorial.ponylang.io/reference-capabilities/reference-capabilities.html
[4]: https://scala-lang.org/files/archive/spec/2.13/07-implicits.html#implicit-parameters
[5]: https://docs.scala-lang.org/tour/implicit-parameters.html
[6]: https://docs.scala-lang.org/tutorials/FAQ/finding-implicits.html
[7]: https://stackoverflow.com/questions/10375633/understanding-implicit-in-scala
[8]: https://dzone.com/articles/scala-implicits-presentations
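
To make the first-class-allocator idea concrete, here is a minimal sketch using today's `std.experimental.allocator`; `LocalGCAllocator` itself is hypothetical, so `Mallocator` stands in for whatever the runtime would inject:

```D
import std.experimental.allocator : dispose, makeArray;
import std.experimental.allocator.mallocator : Mallocator;

// The allocation policy is injected by the caller; there is no hidden
// global state, which is what makes the function easy to reason about
// (and, in principle, usable from `pure` code).
int[] squares(Allocator)(auto ref Allocator alloc, size_t n)
{
    auto a = makeArray!int(alloc, n); // allocates from the injected allocator
    foreach (i, ref x; a)
        x = cast(int)(i * i);
    return a;
}

void main()
{
    auto s = squares(Mallocator.instance, 8);
    scope (exit) Mallocator.instance.dispose(s);
    assert(s[3] == 9);
    // A runtime/framework could pass a per-actor LocalGCAllocator
    // here instead, when it spawns the actor.
}
```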
Apr 20 2021
next sibling parent Ola Fosheim Grøstad <ola.fosheim.grostad gmail.com> writes:
On Tuesday, 20 April 2021 at 16:21:39 UTC, Petar Kirov 
[ZombineDev] wrote:
 By passing the allocator as an object, you allow it to be used 
 safely from `pure` functions. (If `pure` functions were to 
 somehow be allowed to use those global allocator variables, you 
 could have some ugly consequences. For example, a pure function 
 can be preempted in the middle of its execution, only to have 
 the global allocator replaced under its feet, thereby leaving 
 all the memory allocated from the previous allocator dangling.)
Yes, but I think it is too tedious to pass around allocators. Having a global user-modified variable is also not good for static analysis.

I think a Task ought to be a compiler construct, or at least a language-runtime construct. It might be desirable to have different types of Tasks, like one that is GC based and another type that is more like C++. Then the compiler needs to keep tabs on the call-tree and ensure that only a GC call-tree allows a regular pointer to own a new object. And shared pointers could always be owning (possibly RC-based) unless some kind of borrowing scheme is implemented.
 Pure code (even in the relaxed D sense) is great for 
 parallelism, as a scheduler can essentially assume that it's 
 both lock-free and wait-free - it doesn't need to interact with 
 any other thread/fiber/task to make progress.
I guess that could be useful. How would it affect scheduling?
 There are two main challenges:
 1. Ensuring code doesn't break the assumptions of the actor 
 model by e.g. sharing memory between threads in an uncontrolled 
 manner. This can be addressed in a variety of ways:
     * The framework's build system can prevent you from 
 importing code that doesn't fit its model
Hm, what implications are you thinking of that are different from what D has to deal with under the current scheme? Are you thinking about coexisting with the current regime of having a global GC as a transition, perhaps?
     * The framework can run a non-optional linter as part of 
 the build process, which would ensure that you don't have:
         * `@system` or `@trusted` code
But you should be able to call `@trusted` code? Or maybe have a different mechanism, like `unsafe`.
     * reference capabilities like [Pony][3]'s
Yes, I think being able to transition a shared object with a refcount of 1 into a GC-owned non-shared object could be desirable.
 2. Making it as ergonomic and easy to use as the GC is. 
 Essentially, having all language and library features that 
 currently require the GC use `LocalGCAllocator` automagically.
Yes, I think you either have a thread-local `current_task` pointer or have a dedicated hardware register point to the current task. (implementation defined)
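
A sketch of what the first option could look like (all names hypothetical); module-level variables in D are thread-local by default, so the pointer itself is a one-liner:

```D
// Per-task state the runtime would maintain; the hardware-register
// variant would be an ABI-level optimization of the same idea.
struct TaskContext
{
    void* gcPool; // hypothetical handle to the task-local GC heap
}

TaskContext* current_task; // thread-local by default in D

void enterTask(TaskContext* t) { current_task = t; }
```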
     * Add `context` as the last parameter to each druntime 
 function that may need to allocate memory, and set its default 
 value to the global GC context. This is a pure refactoring, no 
 change in behavior.
I guess an alternative would be to transition to a new runtime. Then code that depends on the current runtime would have to be rewritten to work with the new regime, and compilation failure would prevent mistakes from going unnoticed. Or, if a transition is needed, then I guess each runtime function could assert that it isn't called from a task call-tree (check a thread-local variable).
     * Add Scala `implicit` parameters [⁴][4] [⁵][5] [⁶][6] 
 [⁷][7] [⁸][8] to the language and mark the `context` parameters 
 as `implicit`
The FAQ on implicits was long... maybe it is a language design mistake? :-) Ola.
Apr 20 2021
prev sibling parent Ola Fosheim Grøstad <ola.fosheim.grostad gmail.com> writes:
On Tuesday, 20 April 2021 at 16:21:39 UTC, Petar Kirov 
[ZombineDev] wrote:
     * The framework can run a non-optional linter as part of 
 the build process, which would ensure that you don't have:
         * `@system` or `@trusted` code
Just want to add that some constraints should be imposed on what can execute as @safe code in a task, so that one can get fully precise garbage collection and, over time, add compaction. I guess untagged unions are the main source of imprecision. One might also want to prevent non-@safe code from allocating GC objects, so that all GC-relevant code is checked properly by the compiler.
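
A small example of why untagged unions are the problem for precision:

```D
// The GC cannot tell which member is live, so it must conservatively
// treat the bits as a possible pointer; precise collection (and any
// compaction) is off the table for such a word.
union Ambiguous
{
    int*   p; // should keep its target alive
    size_t n; // may just happen to look like an address
}

void main()
{
    Ambiguous u;
    u.n = 0xDEADBEEF; // an integer the GC may mistake for a pointer
}
```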
Apr 20 2021
prev sibling next sibling parent reply russhy <russhy gmail.com> writes:
GC is not the right model if you don't have the same resources 
as Microsoft / Google / Oracle to invest massively in GC R&D; 
there is no way D can compete with the latest Java's sub-1ms GC, 
no way

IMHO, Apple made the right decision to go with RC for Swift, 
like they did with Obj-C

D's future is with memory ownership / reference counting

The more we stick to the GC, the more we'll have to play catch-up 
with the competition

Even Go's GC is starting to become a negative amongst certain 
users 
https://blog.discord.com/why-discord-is-switching-from-go-to-rust-a190bbca2b1f


We need to transition to RC instead of GC, the sooner, the better

I know people like the GC, i do too, but in its current state it 
is absolute garbage compared to the competition

Again, the more we fill the STD library with GC code, the faster 
we'll get shadowed by the competition
Apr 20 2021
next sibling parent reply russhy <russhy gmail.com> writes:
It is very sad because it is the same discussion, every week, 
every month, every year

People agree that GC is trash, and yet no action is taken to 
improve things

I reiterate, i know i choose strong words, it's on purpose

D 3.0 needs to happen, with GC-free features/std

That's the only way to be future-proof and pragmatic


How many people who wrote the GC-dependent std are still here 
using D? i bet close to 5%

They made D unusable for some workloads, and they made it 
dependent on a very bad GC implementation, the wrong memory model 
when you advertise yourself as being a "system language"


D with core.stdc is the best version of D

Add traits/signature system and RC and you'll have a very capable 
and pragmatic system language

Simpler than coming up with 15151 different GC implementations 
with 2618518 different config options and wasting months tweaking 
them *cough* JVM *cough*
Apr 20 2021
parent reply russhy <russhy gmail.com> writes:
And please note that i never said GC hinders language 
adoption! because i never believed in that argument; it is a 
distraction from the real core issue, the memory model, when it 
is confusing, when it is not efficient, when it doesn't scale, at 
the end of the day, you'll have to pay for it

Be pragmatic about it like Swift, and the issues related to 
scaling disappear "automagically", because you no longer depend 
on a Death Aura that could trigger at any time 'when allocating 
with new or via gc-dependent language features *cough* AA *cough* 
dynamic array *cough* exceptions *cough* asserts', blocking your 
whole world
Apr 20 2021
parent reply russhy <russhy gmail.com> writes:
And no, no @ could save us

```D
@safe @system @pure @nogc @noreturn @help @nothrow void 
doSomethingForMePlease()
{
    writeln("is this where we want to go?");
}

```
Apr 20 2021
next sibling parent reply russhy <russhy gmail.com> writes:
C++ could have gone with a GC; they decided to go against it and 
encourage the use of smart pointers

That was smart, actually
Apr 20 2021
next sibling parent reply russhy <russhy gmail.com> writes:
On Tuesday, 20 April 2021 at 16:45:38 UTC, russhy wrote:
 C++ could have gone with a GC; they decided to go against it 
 and encourage the use of smart pointers

 That was smart, actually
Well, not that smart, but smarter than enforcing a GC into the language, because the std, even if trash, doesn't depend on anything and gives you the option to supply an IAllocator almost all the time; zig does that too, and it is very nice
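
For reference, the pattern being alluded to looks like this with today's `std.experimental.allocator` (a minimal sketch; `Mallocator` is used directly for brevity, the `IAllocator` interface works the same way):

```D
import std.experimental.allocator : dispose, make;
import std.experimental.allocator.mallocator : Mallocator;

void main()
{
    // caller-supplied allocator, zig-style: no GC involved
    auto p = Mallocator.instance.make!int(42);
    scope (exit) Mallocator.instance.dispose(p);
    assert(*p == 42);
}
```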
Apr 20 2021
parent evilrat <evilrat666 gmail.com> writes:
On Tuesday, 20 April 2021 at 17:08:17 UTC, russhy wrote:
 On Tuesday, 20 April 2021 at 16:45:38 UTC, russhy wrote:
 C++ could have gone with a GC; they decided to go against it 
 and encourage the use of smart pointers

 That was smart, actually
Well, not that smart, but smarter than enforcing a GC into the language, because the std, even if trash, doesn't depend on anything and gives you the option to supply an IAllocator almost all the time; zig does that too, and it is very nice
GC isn't real, it can't hurt you.
Apr 20 2021
prev sibling parent reply Paulo Pinto <pjmlp progtools.org> writes:
On Tuesday, 20 April 2021 at 16:45:38 UTC, russhy wrote:
 C++ could have gone with a GC; they decided to go against it 
 and encourage the use of smart pointers

 That was smart, actually
Kind of, there are C++/CLI, C++/CX, C++ Builder and, what many keep forgetting when talking about game developers and tracing GC, Unreal C++.

Also, in regards to Objective-C and Swift: Objective-C did go with a tracing GC; the pivot to RC as compiler-assisted help for Cocoa retain/release patterns only happened due to C's memory model borking the whole idea.

https://developer.apple.com/library/archive/documentation/Cocoa/Conceptual/GarbageCollection/Introduction.html

Swift naturally needed to use the same model, otherwise a translation layer like .NET RCW for COM would have been needed had they gone with a tracing GC.
Apr 20 2021
next sibling parent reply Ola Fosheim Grøstad <ola.fosheim.grostad gmail.com> writes:
On Wednesday, 21 April 2021 at 06:37:21 UTC, Paulo Pinto wrote:
 On Tuesday, 20 April 2021 at 16:45:38 UTC, russhy wrote:
 C++ could have gone with a GC; they decided to go against it 
 and encourage the use of smart pointers

 That was smart, actually
 Kind of, there are C++/CLI, C++/CX, C++ Builder and what many keep forgetting when talking about game developers and tracing GC, Unreal C++.
Yes, but C++ has had the Boehm collector since the early 90s, which is comparable to D's current default collector. There is also C++11 N2670 ("Garbage Collection and Reachability-Based Leak Detection") that has no compiler support, but actually is in the language. Garbage collection simply isn't widespread in C++ as the main allocation strategy, but it is in D. The demographics are different.
Apr 21 2021
parent reply Paulo Pinto <pjmlp progtools.org> writes:
On Wednesday, 21 April 2021 at 08:09:17 UTC, Ola Fosheim Grøstad 
wrote:
 On Wednesday, 21 April 2021 at 06:37:21 UTC, Paulo Pinto wrote:
 On Tuesday, 20 April 2021 at 16:45:38 UTC, russhy wrote:
 C++ could have gone with a GC; they decided to go against it 
 and encourage the use of smart pointers

 That was smart, actually
 Kind of, there are C++/CLI, C++/CX, C++ Builder and what many keep forgetting when talking about game developers and tracing GC, Unreal C++.
Yes, but C++ has had the Boehm collector since the early 90s, which is comparable to D's current default collector. There is also C++11 N2670 ("Garbage Collection and Reachability-Based Leak Detection") that has no compiler support, but actually is in the language. Garbage collection simply isn't widespread in C++ as the main allocation strategy, but it is in D. The demographics are different.
None of the examples I gave use the pre-historic Boehm collector.

Also, no one cares about C++11 N2670, because the above mentioned examples already had their own solutions and aren't going to rewrite them; it was a compromise not capable of doing what those dialects were already offering, so much so that C++11 N2670 is voted for removal in ISO C++23.
Apr 21 2021
parent reply Ola Fosheim Grøstad <ola.fosheim.grostad gmail.com> writes:
On Wednesday, 21 April 2021 at 09:23:06 UTC, Paulo Pinto wrote:
 None of the examples I gave use the pre-historic Boehm 
 collector.
That doesn't change the fact that D's current default collector is comparable to Boehm. I don't know anything about C++/CX, but wikipedia says: «A WinRT object is reference counted and thus handles similarly to ordinary C++ objects enclosed in shared_ptrs. An object will be deleted when there are no remaining references that lead to it. There is no garbage collection involved.» https://en.wikipedia.org/wiki/C%2B%2B/CX
Apr 21 2021
parent reply Paulo Pinto <pjmlp progtools.org> writes:
On Wednesday, 21 April 2021 at 09:47:32 UTC, Ola Fosheim Grøstad 
wrote:
 On Wednesday, 21 April 2021 at 09:23:06 UTC, Paulo Pinto wrote:
 None of the examples I gave use the pre-historic Boehm 
 collector.
That doesn't change the fact that D's current default collector is comparable to Boehm. I don't know anything about C++/CX, but wikipedia says: «A WinRT object is reference counted and thus handles similarly to ordinary C++ objects enclosed in shared_ptrs. An object will be deleted when there are no remaining references that lead to it. There is no garbage collection involved.» https://en.wikipedia.org/wiki/C%2B%2B/CX
Reference counting is GC.
Apr 21 2021
parent reply Ola Fosheim Grøstad <ola.fosheim.grostad gmail.com> writes:
On Wednesday, 21 April 2021 at 09:50:53 UTC, Paulo Pinto wrote:
 Reference counting is GC.
Not in the context of this thread.
Apr 21 2021
parent Paulo Pinto <pjmlp progtools.org> writes:
On Wednesday, 21 April 2021 at 10:04:44 UTC, Ola Fosheim Grøstad 
wrote:
 On Wednesday, 21 April 2021 at 09:50:53 UTC, Paulo Pinto wrote:
 Reference counting is GC.
Not in the context of this thread.
I only care about computer science context.
Apr 21 2021
prev sibling parent reply russhy <russhy gmail.com> writes:
 Kind of, there are C++/CLI, C++/CX, C++ Builder and what many 
 keep forgeting when talking about game developers and tracing 
 GC, Unreal C++.
It is so good that unreal developers are all working around the GC ;) Just like the unity peeps, everyone works around the GC with ugly hacks and ugly code
Apr 21 2021
next sibling parent Paulo Pinto <pjmlp progtools.org> writes:
On Wednesday, 21 April 2021 at 11:14:39 UTC, russhy wrote:
 Kind of, there are C++/CLI, C++/CX, C++ Builder and what many 
 keep forgeting when talking about game developers and tracing 
 GC, Unreal C++.
It is so good that unreal developers are all working around the GC ;) Just like the unity peeps, everyone workaround the GC with ugly hacks and ugly code
I bet D wouldn't mind having 1% of the market share they enjoy across the games industry, AR/VR devices and the independent movie industry, including not so successful movies like Mandalorian.

But that is just me.
Apr 21 2021
prev sibling parent Ola Fosheim Grøstad <ola.fosheim.grostad gmail.com> writes:
On Wednesday, 21 April 2021 at 11:14:39 UTC, russhy wrote:
 It is so good that unreal developers are all working around the 
 GC ;)

 Just like the unity peeps, everyone workaround the GC with ugly 
 hacks and ugly code
And that is a good reason to compartmentalize the GC so that it only affects one task on one thread if it kicks in. Hopefully one can write code that is segmented into many smaller tasks that bypass collection altogether. And if you can't, then write a GC-less task.

Anyway, dedicated specialized GCs are not relevant to the discussion. Chrome also has a GC; again, it is specialized and not relevant.
Apr 21 2021
prev sibling next sibling parent reply Stefan Koch <uplink.coder googlemail.com> writes:
On Tuesday, 20 April 2021 at 16:44:01 UTC, russhy wrote:
 And no, no @ could save us

 ```D
 @safe @system @pure @nogc @noreturn @help @nothrow void 
 doSomethingForMePlease()
 {
    writeln("is this where we want to go?");
 }

 ```
Invalid: @safe and @system contradict each other. Also, noreturn is not an annotation, it's a type. pure and nothrow are not annotations either, they are keywords, though they arguably should be annotations.
Apr 20 2021
parent russhy <russhy gmail.com> writes:
On Tuesday, 20 April 2021 at 17:09:48 UTC, Stefan Koch wrote:
 On Tuesday, 20 April 2021 at 16:44:01 UTC, russhy wrote:
 And no, no @ could save us

 ```D
 @safe @system @pure @nogc @noreturn @help @nothrow void 
 doSomethingForMePlease()
 {
    writeln("is this where we want to go?");
 }

 ```
Invalid safe and system contradict each other. Also noreturn is not an annotation it's a type. pure and nothrow are not annotations either they are keywords, though they arguably should be an annotations.
That's the point of the joke: they all look the same, they add up, and things become confusing visual noise
Apr 20 2021
prev sibling parent reply Per =?UTF-8?B?Tm9yZGzDtnc=?= <per.nordlow gmail.com> writes:
On Tuesday, 20 April 2021 at 16:44:01 UTC, russhy wrote:
 ```D
 @safe @system @pure @nogc @noreturn @help @nothrow void

 ```
IMO, it's the lack of attribute inference that's the problem, not the (implicit) presence of attributes.
Apr 25 2021
parent reply russhy <russhy gmail.com> writes:
On Sunday, 25 April 2021 at 10:39:01 UTC, Per Nordlöw wrote:
 On Tuesday, 20 April 2021 at 16:44:01 UTC, russhy wrote:
 ```D
 @safe @system @pure @nogc @noreturn @help @nothrow void

 ```
IMO, it's the lack of attribute inference that's the problem not the (implicit) presence of attributes.
This is none of that, it's people bloating the language.

Language should stay simple, and people can come up with whatever library they want.

If people love bloat, they should keep it in their own libraries and leave the language simple.

GC is a perfect example of this, it should have been a library
Apr 25 2021
parent reply evilrat <evilrat666 gmail.com> writes:
On Sunday, 25 April 2021 at 17:35:36 UTC, russhy wrote:
 On Sunday, 25 April 2021 at 10:39:01 UTC, Per Nordlöw wrote:
 On Tuesday, 20 April 2021 at 16:44:01 UTC, russhy wrote:
 ```D
 @safe @system @pure @nogc @noreturn @help @nothrow void

 ```
IMO, it's the lack of attribute inference that's the problem not the (implicit) presence of attributes.
This is none of that, it's people bloating the language. Language should stay simple, and people can come up with whatever library they want. If people love bloat, they should keep it in their own libraries and leave the language simple. GC is a perfect example of this, it should have been a library
We already have zig and rust; adding yet another fancy slick no GC language is a dead end. Take the GC from D and there is no point in staying here. And that stuff is just not as friendly as D and requires more boilerplate, but again, that could be automated. So please stop your no-GC whine. People already heard you, more than once too.
Apr 25 2021
parent reply russhy <russhy gmail.com> writes:
 We already have zig and rust, adding yet another fancy slick no 
 GC language is a dead end.
Same defeatist mentality i keep reading here, this is not what D needs
 So please stop your no-GC whine. People already heard you, more 
 than once too.
I will never stop fighting to defend D from the people who want to ruin it with more GC
Apr 25 2021
next sibling parent reply Imperatorn <johan_forsberg_86 hotmail.com> writes:
On Sunday, 25 April 2021 at 19:41:39 UTC, russhy wrote:
 We already have zig and rust, adding yet another fancy slick 
 no GC language is a dead end.
Same defeatist mentality i keep reading here, this is not what D needs
 So please stop your no-GC whine. People already heard you, 
 more than once too.
I will never stop fighting to defend D from the people who want to ruin it with more GC
I don't think ppl are saying "want more gc" per se, just don't remove it. I think there's a difference.
Apr 25 2021
parent reply russhy <russhy gmail.com> writes:
On Sunday, 25 April 2021 at 19:57:53 UTC, Imperatorn wrote:
 On Sunday, 25 April 2021 at 19:41:39 UTC, russhy wrote:
 We already have zig and rust, adding yet another fancy slick 
 no GC language is a dead end.
Same defeatist mentality i keep reading here, this is not what D needs
 So please stop your no-GC whine. People already heard you, 
 more than once too.
I will never stop fighting to defend D from the people who want to ruin it with more GC
I don't think ppl are saying "want more gc" per se, just don't remove it. I think there's a difference.
it needs to be removed and put in a library, so language features don't depend on it. i don't want to stick with core.stdc and stick with a language that only gets new features for the people who rely on the GC
Apr 25 2021
next sibling parent Imperatorn <johan_forsberg_86 hotmail.com> writes:
On Sunday, 25 April 2021 at 20:02:32 UTC, russhy wrote:
 On Sunday, 25 April 2021 at 19:57:53 UTC, Imperatorn wrote:
 On Sunday, 25 April 2021 at 19:41:39 UTC, russhy wrote:
 [...]
Same defeatist mentality i keep reading here, this is not what D needs
 [...]
I will never stop fighting to defend D from the people who want to ruin it with more GC
I don't think ppl are saying "want more gc" per se, just don't remove it. I think there's a difference.
it needs to be removed and put in a library, so language features don't depend on it. i don't want to stick with core.stdc and stick with a language that only gets new features for the people who rely on the GC
I kinda agree, but I think it would be quite some work to make that happen.
Apr 25 2021
prev sibling next sibling parent Ola Fosheim Grøstad <ola.fosheim.grostad gmail.com> writes:
On Sunday, 25 April 2021 at 20:02:32 UTC, russhy wrote:
 it needs to be removed and put in a library

 so language features don't depend on it
That has nothing to do with this thread. In order to have GC support you need compiler support, obviously.
Apr 25 2021
prev sibling parent reply Dominikus Dittes Scherkl <dominikus scherkl.de> writes:
On Sunday, 25 April 2021 at 20:02:32 UTC, russhy wrote:
 it needs to be removed and put in a library

 so language features don't depend on it
I can't even agree with that. Simply creating arrays and exceptions makes prototyping so much faster and more convenient with a GC; I will never miss that again.

And after profiling, there are almost never more than a very few inner loops where removing the GC is a performance gain - so I turn it off in those few places and do the manual memory management (which at least is feasible if it is only about a few objects) and have both: fast development time and fast execution time. Why would anyone ever want to change this?

In fact: it doesn't matter if the GC is slow and imperfect. 90% of the code is executed so rarely that a bad GC just has no measurable effect. And in the remaining 10% you can turn it off.
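
A minimal sketch of that workflow: let the GC serve the convenient 90%, and fence it out of the measured hot spots with `@nogc` and `core.memory.GC.disable`:

```D
import core.memory : GC;

void hotLoop(int[] data) @nogc nothrow
{
    // @nogc: the compiler rejects any GC allocation in here
    foreach (ref x; data)
        x *= 2;
}

void main()
{
    auto data = new int[1_000_000]; // GC-allocated, fine outside the loop
    GC.disable();                   // no collection pauses from here on
    scope (exit) GC.enable();
    hotLoop(data);
}
```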
Apr 26 2021
parent Ola Fosheim Grøstad <ola.fosheim.grostad gmail.com> writes:
On Monday, 26 April 2021 at 11:33:48 UTC, Dominikus Dittes 
Scherkl wrote:
 In fact: it doesn't matter if the GC is slow and imperfect. 90% 
 of the code is executed so rarely that a bad GC just has no 
 measurable effect. And in the remaining 10% you can turn it off.
Well, it does matter if you have to halt 8-32 threads when scanning all memory that may contain pointers. If that is the only option then automatic reference counting with optimization is a better choice, but I think a mix of task-local GC and global RC (with optimization) is a reasonable trade-off.
Apr 26 2021
prev sibling parent reply Paulo Pinto <pjmlp progtools.org> writes:
On Sunday, 25 April 2021 at 19:41:39 UTC, russhy wrote:
 We already have zig and rust, adding yet another fancy slick 
 no GC landuage is dead end.
Same defeatist mentality i keep reading here, this is not what D needs
 So please stop your no-GC whine. People already heard you, 
 more than once too.
I will never stop fighting to defend D from the people who want to ruin it with more GC
D is open source, you are free to take care of your special flavoured D.
Apr 25 2021
next sibling parent reply evilrat <evilrat666 gmail.com> writes:
On Monday, 26 April 2021 at 06:35:25 UTC, Paulo Pinto wrote:
 On Sunday, 25 April 2021 at 19:41:39 UTC, russhy wrote:
 We already have zig and rust, adding yet another fancy slick 
 no GC language is a dead end.
Same defeatist mentality i keep reading here, this is not what D needs
 So please stop your no-GC whine. People already heard you, 
 more than once too.
I will never stop fighting to defend D from the people who want to ruin it with more GC
D is open source, you are free to take care of your special flavoured D.
There is already the Volt language, and the Odin and Zig languages, which are very D-inspired and "simple" compared to D; there is a lot to choose from. But take the GC from D and you get the C2 language (guess where it is now? oh, I've heard they gave up and started the C3 language, which is even better than C2, fantastic!), and there was the even more "simple" C-- (C minus minus) language, but can you guess where it is now?

Or maybe he wants to repeat the Python 2 vs 3 story? That almost killed the entire language. D just can't afford switching direction mid-course. But what if this is really necessary? Ok, why not, just put it under a new name. But don't touch the original.

That guy teaches us about how bad GC is and provides nonsensical examples of how brave developers avoid GC by all means because of just how evil it is; meanwhile Unity has been working just fine on mobile for 10+ years, and UE4 works just fine (CPU-performance wise) on an average 4-year-old smartphone.

I also like how he hijacked the thread and expects answers from Walter and Andrei, who never showed up in the thread. He demands that they make something because he wanted it. That's definitely not going to work.

If he is so serious about reducing GC dependency he could probably start patching phobos with no-GC functionality to be on par; that would be at least useful, but in the long run it will just add clutter, technical debt and bloat (omg!). To make phobos usable with @nogc would need some serious rethinking, research and planning. It is not just "remove GC" and done; this will require adding monads and stuff, pattern matching, and more. The result will probably end up looking like Rust too.
Apr 26 2021
parent reply russhy <russhy gmail.com> writes:
On Monday, 26 April 2021 at 09:22:12 UTC, evilrat wrote:
 On Monday, 26 April 2021 at 06:35:25 UTC, Paulo Pinto wrote:
 On Sunday, 25 April 2021 at 19:41:39 UTC, russhy wrote:
 We already have zig and rust, adding yet another fancy slick 
 no GC language is a dead end.
Same defeatist mentality i keep reading here, this is not what D needs
 So please stop your no-GC whine. People already heard you, 
 more than once too.
I will never stop fighting to defend D from the people who want to ruin it with more GC
D is open source, you are free to take care of your special flavoured D.
 There is already the Volt language, and the Odin and Zig languages, which are very D-inspired and "simple" compared to D; there is a lot to choose from. But take the GC from D and you get the C2 language (guess where it is now? oh, I've heard they gave up and started the C3 language, which is even better than C2, fantastic!), and there was the even more "simple" C-- (C minus minus) language, but can you guess where it is now? Or maybe he wants to repeat the Python 2 vs 3 story? That almost killed the entire language. D just can't afford switching direction mid-course. But what if this is really necessary? Ok, why not, just put it under a new name. But don't touch the original. That guy teaches us about how bad GC is and provides nonsensical examples of how brave developers avoid GC by all means because of just how evil it is; meanwhile Unity has been working just fine on mobile for 10+ years, and UE4 works just fine (CPU-performance wise) on an average 4-year-old smartphone. I also like how he hijacked the thread and expects answers from Walter and Andrei, who never showed up in the thread. He demands that they make something because he wanted it. That's definitely not going to work. If he is so serious about reducing GC dependency he could probably start patching phobos with no-GC functionality to be on par; that would be at least useful, but in the long run it will just add clutter, technical debt and bloat (omg!). To make phobos usable with @nogc would need some serious rethinking, research and planning. It is not just "remove GC" and done; this will require adding monads and stuff, pattern matching, and more. The result will probably end up looking like Rust too.
Again, this shows how little you know. UE4's GC is fine if you make a hello world; ask any studio what they have to do to work around the GC, they wish it didn't exist. And Epic is working on ditching the GC with their upcoming data-oriented stack, just like Unity is working on ditching the GC.

So little do you know that it makes people believe GC is fine. This is why we can't have nice things, and this is why people are coming up with new languages instead of embracing Dlang: you guys make bad press for D, you are not pragmatic enough.

And i never said ditch the GC, you do whatever you want, but the language shouldn't expect you to use a GC, it should expect you to provide whatever allocator is proper for the task
Apr 26 2021
next sibling parent reply russhy <russhy gmail.com> writes:
 Or maybe he wants to repeat the Python 2 vs 3 story? That 
 almost killed the entire language.
Python 3 is what saved Python.

Its problem was the same people as you, the people who refuse to understand what is wrong with the language.

Now Python 3 is experiencing a new youth, thanks to the people who decided it was time to start fresh for more decades of growing user adoption
Apr 26 2021
parent reply russhy <russhy gmail.com> writes:
  meanwhile Unity has been working just fine on mobile for 10+ 
 years
LOL, wonder how it works fine on mobile? by avoiding the GC and doing manual memory management (object pooling -- aka custom allocators)

Woawoawoa, who would have thought that Unity recommends avoiding the GC in order to gain stable perf on constrained devices, woawoaw

Same goes for Unreal

What have you done with D, other than cli tools?
Apr 26 2021
parent reply Paulo Pinto <pjmlp progtools.org> writes:
On Monday, 26 April 2021 at 13:51:29 UTC, russhy wrote:
  meanwhile Unity has been working just fine on mobile for 
 10+ years
LOL, wonder how it works fine on mobile? by avoiding the GC and doing manual memory management (object pooling -- aka custom allocators). Woawoawoa, who would have thought that Unity recommends avoiding the GC in order to gain stable perf on constrained devices, woawoaw. Same goes for Unreal. What have you done with D, other than cli tools?
Android seems to be doing just fine with its 80% worldwide market share. And ChromeOS isn't doing badly in the US school market.
Apr 26 2021
parent reply russhy <russhy gmail.com> writes:
On Monday, 26 April 2021 at 14:48:23 UTC, Paulo Pinto wrote:
 On Monday, 26 April 2021 at 13:51:29 UTC, russhy wrote:
  meanwhile Unity has been working just fine on mobile for 
 10+ years
LOL, wonder how it works fine on mobile? by avoiding the GC and doing manual memory management (object pooling -- aka custom allocators). Woawoawoa, who would have thought that Unity recommends avoiding the GC in order to gain stable perf on constrained devices, woawoaw. Same goes for Unreal. What have you done with D, other than cli tools?
Android seems to be doing just fine with its 80% worldwide market share. And ChromeOS isn't doing badly in the US school market.
Yeah, that is why nobody uses D for android apps

And that is why nobody uses D for ChromeOS

And that is why nobody uses D for gamedev

and the list continues

--

That is also why Android apps require 2x the amount of memory/cpu/battery of their equivalents on iOS

Sure, GC is fine, if your quality standard is below 0
Apr 26 2021
parent reply Paulo Pinto <pjmlp progtools.org> writes:
On Monday, 26 April 2021 at 15:29:22 UTC, russhy wrote:
 On Monday, 26 April 2021 at 14:48:23 UTC, Paulo Pinto wrote:
 On Monday, 26 April 2021 at 13:51:29 UTC, russhy wrote:
 [...]
 Android seems to be doing just fine with its 80% worldwide market share. And ChromeOS isn't doing badly in the US school market.
Yeah, that is why nobody uses D for android apps. And that is why nobody uses D for ChromeOS. And that is why nobody uses D for gamedev, and the list continues. That is also why Android apps require 2x the amount of memory/cpu/battery of their equivalents on iOS. Sure, GC is fine, if your quality standard is below 0
They use Java, JavaScript and Unity, do you know what they have in common with D?
Apr 26 2021
parent reply russhy <russhy gmail.com> writes:
On Tuesday, 27 April 2021 at 06:09:25 UTC, Paulo Pinto wrote:
 On Monday, 26 April 2021 at 15:29:22 UTC, russhy wrote:
 On Monday, 26 April 2021 at 14:48:23 UTC, Paulo Pinto wrote:
 On Monday, 26 April 2021 at 13:51:29 UTC, russhy wrote:
 [...]
 Android seems to be doing just fine with its 80% worldwide market share. And ChromeOS isn't doing badly in the US school market.
Yeah, that is why nobody uses D for android apps. And that is why nobody uses D for ChromeOS. And that is why nobody uses D for gamedev, and the list continues. That is also why Android apps require 2x the amount of memory/cpu/battery of their equivalents on iOS. Sure, GC is fine, if your quality standard is below 0
They use Java, JavaScript and Unity, do you know what they have in common with D?
Yeah, they all use the GC, and they work hard to avoid the GC like the plague; sure, they use C++ for scripting only
Apr 27 2021
parent reply russhy <russhy gmail.com> writes:
And at least with Unreal, since it is using C++, you are not 
forced to stick with the GC all the time

With unity, they had to come up with a special language to 

are forced to stick with the GC no matter what


So yes i want D to be as pragmatic as C++



that leads


Oh and guess why android smart watches suck compared to the Apple 
Watch (perf/battery life)? yes i know you know why
Apr 27 2021
next sibling parent Mike Parker <aldacron gmail.com> writes:
On Tuesday, 27 April 2021 at 12:01:33 UTC, russhy wrote:

 So yes i want D to be as pragmatic as C++
D is a GC'ed language. Period. You're tilting at windmills.
Apr 27 2021
prev sibling next sibling parent reply Imperatorn <johan_forsberg_86 hotmail.com> writes:
On Tuesday, 27 April 2021 at 12:01:33 UTC, russhy wrote:
 And at least with Unreal, since it is using C++, you are not 
 forced to stick with the GC all the time

 With unity, they had to come up with a special language to 

 you are forced to stick with the GC no matter what


 So yes i want D to be as pragmatic as C++
D is already very pragmatic imo. What is it you want to do in D that you can't, but can in C++?

 that leads
 Oh and guess why android smart watches suck compared to the 
 Apple Watch (perf/battery life)? yes i know you know why
This has nothing to do with gc vs nogc
Apr 27 2021
parent reply sclytrack <fake hotmail.com> writes:
On Tuesday, 27 April 2021 at 12:20:12 UTC, Imperatorn wrote:
 On Tuesday, 27 April 2021 at 12:01:33 UTC, russhy wrote:
"The Rustening at Microsoft has begun." When will DIP1000 no longer be a preview? When will live go live? DIP1000.md says copyright 2016 and it is 2021.
Apr 27 2021
parent Paulo Pinto <pjmlp progtools.org> writes:
On Tuesday, 27 April 2021 at 13:00:39 UTC, sclytrack wrote:
 On Tuesday, 27 April 2021 at 12:20:12 UTC, Imperatorn wrote:
 On Tuesday, 27 April 2021 at 12:01:33 UTC, russhy wrote:
"The Rustening at Microsoft has begun." When will DIP1000 no longer be a preview? When will live go live? DIP1000.md says copyright 2016 and it is 2021.
Actually the security recommendation for new software at Microsoft is:

1 - .NET languages
2 - Rust
3 - C++ with Core Guidelines

Notice who gets first place.

https://msrc-blog.microsoft.com/2019/07/18/we-need-a-safer-systems-programming-language/
Apr 27 2021
prev sibling next sibling parent Paulo Pinto <pjmlp progtools.org> writes:
On Tuesday, 27 April 2021 at 12:01:33 UTC, russhy wrote:
 And at least with Unreal, since it is using C++, you are not 
 forced to stick with the GC all the time

 With unity, they had to come up with a special language to 

 you are forced to stick with the GC no matter what


 So yes i want D to be as pragmatic as C++



 that leads


 Oh and guess why android smart watches sucks compared to apple 
 watch (perf/battery life)?, yes i know you know why
Can you please show us your in-progress engine that beats the current use of Unity on Microsoft HoloLens?
Apr 27 2021
prev sibling parent reply evilrat <evilrat666 gmail.com> writes:
On Tuesday, 27 April 2021 at 12:01:33 UTC, russhy wrote:
 And at least with Unreal, since it is using C++, you are not 
 forced to stick with the GC all the time

 With unity, they had to come up with a special language to 

 you are forced to stick with the GC no matter what
references).
 So yes i want D to be as pragmatic as C++



 that leads
I'd call it success. Oof.

People who make games with Java can have malloc'd buffers via C directly; that also opens the door to malloc and other allocators. Moreover, it even has pointers right in the language (in unsafe code).

And just to be clear, object pools are GC-managed memory; this is simply an optimization technique and not GC avoidance. The memory is still under GC control and will still be collected by the GC at some point; it is just delayed to some moment in the future.

I don't get it: in D you have basically a SINGLE scenario when the GC actually runs - memory allocation using the GC. It just sits there, takes up just a bit of memory, but it doesn't waste your precious CPU ticks, it doesn't suck power. And the most important thing - you are free to manage the memory whatever way you want!

D even gives you the tools (the -vgc flag) to know where exactly a GC allocation can happen, so you can optimize that so-much-hated GC away to zero. All this is available right now, without changing the language and waiting for an implementation.

So what's the problem with you?
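
For example (a small sketch of that workflow; the file name is arbitrary): compiling with `dmd -vgc app.d` reports every potential GC allocation site, and this version has none to report:

```D
import core.stdc.stdlib : free, malloc;

void main() @nogc nothrow
{
    auto buf = cast(int*) malloc(1024 * int.sizeof); // manual allocation
    if (buf is null) return;
    scope (exit) free(buf);
    buf[0] = 42;
}
```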
Apr 27 2021
next sibling parent reply SealabJaster <sealabjaster gmail.com> writes:
On Tuesday, 27 April 2021 at 13:29:43 UTC, evilrat wrote:
 So what's the problem with you?
Just commenting here as an observation: I think one of his main points is that there's a lack of a standard library based around allocators rather than the GC. While there are dub packages for this stuff, it's all very scattered and incompatible with one another. And most of the higher-level libraries make use of the GC instead of supporting allocators (which is understandable, considering they were never moved out of std.experimental).

I *think* what he's trying to say is that, while D allows you to avoid the GC and do whatever, the overall ecosystem for nogc is quite lacking and has no leadership or vision for something cohesive, and is just a hodgepodge of random hobby libraries.

I personally wouldn't use D if it didn't have its GC, so I do feel that claims that "D is perfectly usable without the GC", while technically true, may not be practically true. If that makes sense. Especially if compared to no-GC languages like C++ and Rust.

All the "anti-bloat" and "pragmatic" stuff though I have no clue about. D's super pragmatic. Phobos may or may not be bloated, idk; I feel I don't even use a large portion of Phobos. Mostly just the metaprogramming, algorithm/ranges, and formatting+conversion stuff. Also a hint of std.experimental.logger.

My point is, even though this guy's very strong with his wording, and I'd also say flat out incorrect with some of these statements, there are areas here that might be worth thinking about a bit more, since D may have deficiencies there.
Apr 27 2021
parent reply sighoya <sighoya gmail.com> writes:
On Tuesday, 27 April 2021 at 13:50:09 UTC, SealabJaster wrote:
 On Tuesday, 27 April 2021 at 13:29:43 UTC, evilrat wrote:
 So what's the problem with you?
Just commenting here as an observation, I think one of his main points is that there's a lack of a standard library based around allocators rather than the GC.
Without going too much into the details here, I think you are right. But how should a stdlib look which abstracts over all allocators? Can it be safe, performant, and general?

I mean RC, for instance, is inferior in what it can accept for a language/what it can manage safely compared to GC. So in return, some algorithms may only work for a GC. Sure, we can write other algorithms for other allocators and other manual memory strategies, but then we arrive at the state-space-explosion and incompatibility problem. I think it is hard to unify the world of different MM containers, as amply evidenced by C++. Though I would kind of like to be convinced if I'm wrong.

Why not go the Nim direction and detect cycles with static analysis? I know that this is cumbersome given some steps in the evolution of D, at least from what I heard so far.

Generally, I would like to see the compiler improve by storing a huge amount of metadata in the resulting binary, e.g., whether the argument of a function is borrowed or cached somewhere, without the interception of the user in the frontend. And generally, people moaning about the GC's non-determinism may prefer to track cycle detection for potential resources just in time, with some negative runtime performance hit, for the benefit of just-in-time deallocation.
Apr 27 2021
parent reply Ola Fosheim Grøstad <ola.fosheim.grostad gmail.com> writes:
On Tuesday, 27 April 2021 at 15:23:30 UTC, sighoya wrote:
 Without going too much into the details here, I think you are 
 right. But how should a stdlib look which abstracts over all 
 allocators? Can it be safe, performant, and general?

 I mean RC, for instance, is inferior in what it can accept for 
 a language/what it can manage safely compared to GC.

 So in return, some algorithms may only work for a GC.
This can be solved by not allowing destructors/finalizers on GC objects. Then a modelling error would only lead to leaks.

If you do that then you can let libraries be written for RC. Then, when a GC is available, you convert strong and weak RC-pointers to non-RC pointers if the RC-pointer pointed to a class with no destructors/finalizers.

But since people won't give up on destructors/finalizers for GC, we can't do this. The alternatives are likely to be bug ridden...

So, people who want libraries to be memory-management agnostic actually have to successfully champion that destructors/finalizers are removed from the GC for all programs. Get consensus for that, then move on to getting rid of the GC from libraries. First things first.
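
A sketch of the rule being proposed, checkable today at the type level (the names are hypothetical): only types without elaborate destructors may drop reference counting and be handed to a tracing GC:

```D
import std.traits : hasElaborateDestructor;
import std.typecons : RefCounted;

template AgnosticPtr(T)
{
    static if (hasElaborateDestructor!T)
        alias AgnosticPtr = RefCounted!T; // deterministic release required
    else
        alias AgnosticPtr = T*;           // safe to hand over to a tracing GC
}

struct PlainData { int x; }
struct NeedsCleanup { ~this() {} }

static assert(is(AgnosticPtr!PlainData == PlainData*));
static assert(is(AgnosticPtr!NeedsCleanup == RefCounted!NeedsCleanup));
```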
Apr 27 2021
parent sighoya <sighoya gmail.com> writes:
On Tuesday, 27 April 2021 at 19:51:55 UTC, Ola Fosheim Grøstad 
wrote:

 Then when a GC is available you convert strong and weak 
 RC-pointers to non-RC pointers if the RC-pointer pointed to a 
 class with no destructors/finalizers.
This kind of direction isn't the problem, but the other way around is. You can't easily transform GC code into RC code, which is much more needed than the other way around, given the number of libraries already written. The model of weak/strong references is, in my opinion, either not fast or unsafe.
Apr 28 2021
prev sibling next sibling parent reply russhy <russhy gmail.com> writes:
On Tuesday, 27 April 2021 at 13:29:43 UTC, evilrat wrote:
 On Tuesday, 27 April 2021 at 12:01:33 UTC, russhy wrote:
 And at least with Unreal, since it is using C++, you are not 
 forced to stick with the GC all the time

 With unity, they had to come up with a special language to 

 you are forced to stick with the GC no matter what
references).
 So yes i want D to be as pragmatic as C++



 where that leads
that I'd call it success. Oof. People who make games with Java can have malloc'd buffers via C directly; that also opens the door to malloc and other allocators. Moreover, it even has pointers right in the language (in unsafe code). And just to be clear, object pools are GC-managed memory; this is simply an optimization technique and not GC avoidance. The memory is still under GC control and will still be collected by the GC at some point; it is just delayed to some moment in the future. I don't get it: in D you have basically a SINGLE scenario when the GC actually runs - memory allocation using the GC. It just sits there, takes up just a bit of memory, but it doesn't waste your precious CPU ticks, it doesn't suck power. And the most important thing - you are free to manage the memory whatever way you want! D even gives you the tools (the -vgc flag) to know where exactly a GC allocation can happen, so you can optimize that so-much-hated GC away to zero. All this is available right now, without changing the language and waiting for an implementation. So what's the problem with you?
libgdx also is a pain in the ass to work with, because you have to constantly fight with java and the GC; they had to rewrite all the data structures (collections etc, because they allocate on foreach). that is why libgdx is dead, java is not suitable for gamedev

--

concerning microsoft, that is why they are working on their new Rust-like language with full C++ interop, rip D: https://github.com/microsoft/verona/

--

you all misread me, and you all miss the point. i'm not asking to make D a no-gc-only language, i am asking that we should aim to be a memory-agnostic language, where one can plug in a GC transparently and do whatever he wants, and at the same time transition to a fully manually managed scheme with allocators transparently
Apr 27 2021
next sibling parent Ola Fosheim Grøstad <ola.fosheim.grostad gmail.com> writes:
On Tuesday, 27 April 2021 at 14:59:22 UTC, russhy wrote:
 i am asking that we should aim to be a memory-agnostic language, 
 where one can plug in a GC transparently and do whatever he 
 wants, and at the same time transition to a fully manually 
 managed scheme with allocators transparently
That would be nice, wouldn't it? But there is no such design on the table for D.

You have to write your framework differently in order to support reference counting and garbage collection if it does something non-trivial. You have to write frameworks differently for RAII and non-deterministic destruction... So in practice you have to settle on one main memory management scheme and let other management schemes be optional.

Library authors are not going to do a good job of testing their libraries with multiple memory management schemes, even if they thought they wrote the code to support both. Sure, the standard library can default to reference counting, but 3rd party libraries probably won't.

This has nothing to do with moving to task-local GC or not, though.
Apr 27 2021
prev sibling next sibling parent Paulo Pinto <pjmlp progtools.org> writes:
On Tuesday, 27 April 2021 at 14:59:22 UTC, russhy wrote:
 On Tuesday, 27 April 2021 at 13:29:43 UTC, evilrat wrote:
 On Tuesday, 27 April 2021 at 12:01:33 UTC, russhy wrote:
 And at least with Unreal, since it is using C++, you are not 
 forced to stick with the GC all the time

 With unity, they had to come up with a special language to 

 you are forced to stick with the GC no matter what
references).
 So yes i want D to be as pragmatic as C++



 where that leads
that I'd call it success. Oof. People who make games with Java can have malloc'd buffers almost directly; that also opens the door to malloc and other allocators. Moreover, it even has pointers right in the language (in unsafe code). And just to be clear, object pools are GC-managed memory; this is simply an optimization technique and not GC avoidance. The memory is still under GC control and will still be collected by the GC at some point; it is just delayed to some moment in the future. I don't get it: in D you have basically a SINGLE scenario when the GC actually runs - memory allocation using the GC. It just sits there, takes up just a bit of memory, but it doesn't waste your precious CPU ticks, it doesn't suck power. And the most important thing - you are free to manage the memory whatever way you want! D even gives you the tools (the -vgc flag) to know where exactly a GC allocation can happen, so you can optimize that so-much-hated GC away to zero. All this is available right now, without changing the language and waiting for an implementation. So what's the problem with you?
 libgdx also is a pain in the ass to work with, because you have to constantly fight with java and the GC; they had to rewrite all the data structures (collections etc, because they allocate on foreach). that is why libgdx is dead, java is not suitable for gamedev
LibGDX is pretty much alive, https://libgdx.com, I guess you keep referring to the old site. Yes, it probably has been shadowed by the fact that the original author kind of left the project after being burned by the Xamarin acquisition and shutdown of the original RoboVM project. And the tiny detail that nowadays Unreal and Unity are mostly free beer for indies, alongside their GC implementations and graphical tools, the preferred teaching tools at game universities around the globe, with first-party support from Microsoft, Apple, Google, Sony and Nintendo on their OS, VR/AR and gaming platforms.

As for Verona, it is a Microsoft Research language and it is mostly paperware in 2021, did you forget to read the link?
 This project is at a very early stage, parts of the type 
 checker are still to be implemented, and there are very few 
 language features implemented yet. This will change, but will 
 take time.
Apr 28 2021
prev sibling parent sighoya <sighoya gmail.com> writes:
On Tuesday, 27 April 2021 at 14:59:22 UTC, russhy wrote:

 i am asking that we should aim to be a memory-agnostic language, 
 where one can plug in a GC transparently and do whatever he 
 wants, and at the same time transition to a fully manually 
 managed scheme with allocators transparently
The problem is that you consider manual or managed resources solely on their own, but resources alias other resources, and other resources alias the resource under consideration; the connections are the problem.

Not only does the GC manage memory for you at runtime, it also does so in a way that guarantees memory safety. Put differently, the GC is a kind of runtime lifetime manager, although not a deterministic one. This is a feature Rust completely lacks. There are ways to reformulate solutions to cyclic data structures differently, but mostly with some additional memory or performance cost. And just to say: no one moans when Rust doesn't allow deleting resources because the lifetimes say no, but the GC shouldn't be allowed to say no?

Solving the problem in Rust is similar to solving it in D: just choose another memory model for the specific problem. Could libraries engage more in providing such solutions? I think yes, but generalizing algorithms to offer the most performant and safe code is hard and probably leads to a state-space explosion. Just look at the many ways you can create containers in Rust: you have Box, Rc, Cell, Pin, Pin<...<...>> and whatever.
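A minimal sketch of the cycle problem, using nothing beyond druntime: naive reference counting could never free these two nodes (each would hold the other's count at one forever), while the tracing GC reclaims them once no root reaches them.

```d
import core.memory : GC;

class Node
{
    Node other;   // under pure RC this back-reference pins the pair forever
}

void main()
{
    auto a = new Node;
    auto b = new Node;
    a.other = b;
    b.other = a;   // a cycle
    a = null;
    b = null;      // no roots remain
    GC.collect();  // the tracing collector frees the unreachable cycle
}
```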
Apr 28 2021
prev sibling parent Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Tuesday, 27 April 2021 at 13:29:43 UTC, evilrat wrote:
 And the most important thing - you are free to manage the 
 memory whatever way you want!
That's kinda irrelevant. You can say that for basically any language with a C-bridge. It doesn't mean it is cost efficient. If it isn't cost efficient, then it isn't competitive; if it isn't competitive, then it isn't interesting.

D's GC has no write barriers and is therefore going to freeze or significantly slow down execution under any reasonable collection scheme during collection. (I don't consider fork() reasonable.) So what barrier-assisted collectors achieve in other languages is irrelevant for D.

Can we please focus this discussion on getting a competitive memory management scheme? Reverting to C is not a competitive option.
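For readers unfamiliar with the term: a write barrier is a hook on pointer stores that keeps a concurrent or generational collector informed about mutations. The toy below is purely conceptual, not how druntime works; the point of the post is precisely that D's GC has no such instrumentation.

```d
// Conceptual toy only: a collector running concurrently with the program
// must learn about pointer stores performed while it is scanning.
Object[] rememberedSet;   // edges mutated since scanning began

void storeWithBarrier(ref Object slot, Object value)
{
    rememberedSet ~= value;   // the "barrier": report the new edge
    slot = value;             // the actual store
}

void main()
{
    Object target;
    storeWithBarrier(target, new Object);
    assert(rememberedSet.length == 1);
}
```

Without some such hook, a collector's only safe option is to stop the mutators while it scans, which is the freeze being described.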
Apr 27 2021
prev sibling parent reply Paulo Pinto <pjmlp progtools.org> writes:
On Monday, 26 April 2021 at 13:44:36 UTC, russhy wrote:
 On Monday, 26 April 2021 at 09:22:12 UTC, evilrat wrote:
 [...]
Again, this shows how little you know. UE4's GC is fine if you make a hello world; ask any studio what they have to do to work around the GC. They wish it didn't exist, and Epic is working on ditching the GC with their upcoming data-oriented stack, just like Unity is working on [...]
From "Unity Future .NET Development Status"
 But we have also a longer term idea in the future to "combine" 
 Burst+IL2CPP effort and to push for the best .NET AOT 
 experience for Unity apps. This AOT solution could be 
 considered as a Tiered level 4 compilation compatible with 
 CoreCLR. But before going there, we will have quite some work 
 ahead, as highlighted by  JoshPeterson in the original post 
 above.
https://forum.unity.com/threads/unity-future-net-development-status.1092205/#post-7035031
Apr 26 2021
parent reply russhy <russhy gmail.com> writes:
On Monday, 26 April 2021 at 14:42:22 UTC, Paulo Pinto wrote:
 On Monday, 26 April 2021 at 13:44:36 UTC, russhy wrote:
 On Monday, 26 April 2021 at 09:22:12 UTC, evilrat wrote:
 [...]
Again, this shows how little you know. UE4's GC is fine if you make a hello world; ask any studio what they have to do to work around the GC. They wish it didn't exist, and Epic is working on ditching the GC with their upcoming data-oriented stack, just like Unity is working on [...]
From "Unity Future .NET Development Status"
 But we have also a longer term idea in the future to "combine" 
 Burst+IL2CPP effort and to push for the best .NET AOT 
 experience for Unity apps. This AOT solution could be 
 considered as a Tiered level 4 compilation compatible with 
 CoreCLR. But before going there, we will have quite some work 
 ahead, as highlighted by  JoshPeterson in the original post 
 above.
https://forum.unity.com/threads/unity-future-net-development-status.1092205/#post-7035031
Unity DOTS doesn't use the GC, AT ALL.

You guys never made a game or a game engine and you are telling me GC is fine.

Sure, sure, GC will save us, but it needs to pause the world first.
Apr 26 2021
next sibling parent reply Adam D. Ruppe <destructionator gmail.com> writes:
On Monday, 26 April 2021 at 15:30:48 UTC, russhy wrote:
 You guys never made a game or a game engine and you are telling 
 me GC is fine
Which games have you made? I made a buggy version of Minesweeper once. You can play it online actually http://webassembly.arsdnet.net/minesweeper
Apr 26 2021
next sibling parent reply evilrat <evilrat666 gmail.com> writes:
On Monday, 26 April 2021 at 15:38:48 UTC, Adam D. Ruppe wrote:
 On Monday, 26 April 2021 at 15:30:48 UTC, russhy wrote:
 You guys never made a game or a game engine and you are 
 telling me GC is fine
Which games have you made? I made a buggy version of Minesweeper once. You can play it online actually http://webassembly.arsdnet.net/minesweeper
Now we are talking! I have an asteroids game that is actually written in D using Unity.

https://forum.dlang.org/post/lwxodemvzgyxkmanpptg forum.dlang.org
Apr 26 2021
parent Adam D. Ruppe <destructionator gmail.com> writes:
On Monday, 26 April 2021 at 16:35:08 UTC, evilrat wrote:
 I have an asteroids game that is actually written in D using Unity.

 https://forum.dlang.org/post/lwxodemvzgyxkmanpptg forum.dlang.org
whoa it actually ran on my linux box! that's kinda cool. I did a little asteroids-type thing too (also available online: http://webassembly.arsdnet.net/asteroids ). I started a first-person version of it too, many years ago, but I never finished it. That'd probably be kinda fun to revisit some day and finish off. idk if I'd make that work on the webassembly though, but of course it would on normal windows and linux.
Apr 27 2021
prev sibling parent reply russhy <russhy gmail.com> writes:
On Monday, 26 April 2021 at 15:38:48 UTC, Adam D. Ruppe wrote:
 On Monday, 26 April 2021 at 15:30:48 UTC, russhy wrote:
 You guys never made a game or a game engine and you are 
 telling me GC is fine
Which games have you made? I made a buggy version of Minesweeper once. You can play it online actually http://webassembly.arsdnet.net/minesweeper
I'm working on a 3D game engine.

Yeah, minesweeper can have a 5-second GC pause; users won't notice.
Apr 26 2021
next sibling parent bachmeier <no spam.net> writes:
On Monday, 26 April 2021 at 17:34:41 UTC, russhy wrote:
 On Monday, 26 April 2021 at 15:38:48 UTC, Adam D. Ruppe wrote:
 On Monday, 26 April 2021 at 15:30:48 UTC, russhy wrote:
 You guys never made a game or a game engine and you are 
 telling me GC is fine
Which games have you made? I made a buggy version of Minesweeper once. You can play it online actually http://webassembly.arsdnet.net/minesweeper
I'm working on a 3D game engine.

Yeah, minesweeper can have a 5-second GC pause; users won't notice.
If you want to be precise, this is a matter of GC implementation. There are *some games* for which *current GC implementations* result in unacceptable performance issues. Many will do just fine. There's also no reason to pretend manual memory management, reference counting, or garbage collection is a silver bullet: allocating and freeing memory always incurs a performance cost, no matter what name you give it. As [stated in this blog post](https://www.sebastiansylvan.com/post/on-gc-in-games-response-to-jeff-and-casey/):
 Jeff and Casey’s ranting is not actually about the GC itself, 
 but about promiscuous allocation behavior, and I fully agree 
 with that, but I think it’s a mistake to conflate the two. GC 
 doesn’t imply that you should heap allocate at the drop of a hat
I'm not arguing that the GC will work for all games, because it won't. You give the impression, though, of holding the fundamentalist position that "real games" (not Minesweeper) cannot have a GC and that a GC is obviously going to be slow (uncontrollable 5-second pauses). That's not a helpful starting point.
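A minimal sketch, with a toy particle loop, of the distinction being drawn here between having a GC and allocating promiscuously: nothing inside the frame loop allocates, so the collector has no reason to run mid-frame, whichever implementation sits behind it.

```d
struct Particle { float x = 0, y = 0, vx = 1, vy = 1; }

void main()
{
    // Allocate once, up front; reuse every frame.
    auto particles = new Particle[](10_000);

    foreach (frame; 0 .. 600)           // ~10 seconds at 60 fps
    {
        foreach (ref p; particles)      // in-place updates, zero allocations
        {
            p.x += p.vx;
            p.y += p.vy;
        }
    }
}
```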
Apr 26 2021
prev sibling next sibling parent evilrat <evilrat666 gmail.com> writes:
On Monday, 26 April 2021 at 17:34:41 UTC, russhy wrote:
 I'm working on a 3D game engine
Why bother making your own engine if you don't have a team and a project for it? There is already stuff like godot-d; it works, but it is still somewhat cumbersome to use. It should still be a good choice for most non-shooter games. The only bad thing is that the editor itself has the UX of a potato (just like pretty much any open-source product), and it is said that 3D is barely optimized, so we have to wait for the Godot 4 release.

https://code.dlang.org/packages/godot-d
Apr 26 2021
prev sibling parent martinm <martinmannes23312 gmail.com> writes:
On Monday, 26 April 2021 at 17:34:41 UTC, russhy wrote:
 On Monday, 26 April 2021 at 15:38:48 UTC, Adam D. Ruppe wrote:
 On Monday, 26 April 2021 at 15:30:48 UTC, russhy wrote:
 You guys never made a game or a game engine and you are 
 telling me GC is fine
Which games have you made? I made a buggy version of Minesweeper once. You can play it online actually http://webassembly.arsdnet.net/minesweeper
I'm working on a 3D game engine.

Yeah, minesweeper can have a 5-second GC pause; users won't notice.
I'm interested. Will your 3D engine also have Lua scripting?
Apr 27 2021
prev sibling parent Paulo Pinto <pjmlp progtools.org> writes:
On Monday, 26 April 2021 at 15:30:48 UTC, russhy wrote:
 On Monday, 26 April 2021 at 14:42:22 UTC, Paulo Pinto wrote:
 On Monday, 26 April 2021 at 13:44:36 UTC, russhy wrote:
 [...]
From "Unity Future .NET Development Status"
 [...]
https://forum.unity.com/threads/unity-future-net-development-status.1092205/#post-7035031
Unity DOTS doesn't use the GC, AT ALL.

You guys never made a game or a game engine and you are telling me GC is fine.

Sure, sure, GC will save us, but it needs to pause the world first.
Apr 26 2021
prev sibling parent reply russhy <russhy gmail.com> writes:
On Monday, 26 April 2021 at 06:35:25 UTC, Paulo Pinto wrote:
 On Sunday, 25 April 2021 at 19:41:39 UTC, russhy wrote:
 We already have zig and rust; adding yet another fancy slick 
 no-GC language is a dead end.
Same defeatist mentality I keep reading here; this is not what D needs.
 So please stop your no-GC whine. People already heard you, 
 more than once too.
I will never stop fighting for D against the people who want to ruin it with more GC.
D is open source, you are free to take care of your special flavoured D.
Say this to the people who want to ruin D with more GC; they are free to make their own fork called DJava.
Apr 26 2021
parent reply 12345swordy <alexanderheistermann gmail.com> writes:
On Monday, 26 April 2021 at 13:40:18 UTC, russhy wrote:
 On Monday, 26 April 2021 at 06:35:25 UTC, Paulo Pinto wrote:
 On Sunday, 25 April 2021 at 19:41:39 UTC, russhy wrote:
 We already have zig and rust; adding yet another fancy slick 
 no-GC language is a dead end.
Same defeatist mentality I keep reading here; this is not what D needs.
 So please stop your no-GC whine. People already heard you, 
 more than once too.
I will never stop fighting for D against the people who want to ruin it with more GC.
D is open source, you are free to take care of your special flavoured D.
Say this to the people who want to ruin D with more GC; they are free to make their own fork called DJava.
This is just pure GC phobia at this point in time. -Alex
Apr 26 2021
parent reply russhy <russhy gmail.com> writes:
On Monday, 26 April 2021 at 13:43:21 UTC, 12345swordy wrote:
 On Monday, 26 April 2021 at 13:40:18 UTC, russhy wrote:
 On Monday, 26 April 2021 at 06:35:25 UTC, Paulo Pinto wrote:
 On Sunday, 25 April 2021 at 19:41:39 UTC, russhy wrote:
 We already have zig and rust; adding yet another fancy slick 
 no-GC language is a dead end.
Same defeatist mentality I keep reading here; this is not what D needs.
 So please stop your no-GC whine. People already heard you, 
 more than once too.
I will never stop fighting for D against the people who want to ruin it with more GC.
D is open source, you are free to take care of your special flavoured D.
Say this to the people who want to ruin D with more GC; they are free to make their own fork called DJava.
This is just pure GC phobia at this point in time. -Alex
No, I never said the GC is useless. I said the language shouldn't expect a GC to exist; it should ask you to provide whatever allocation scheme you need for the task.
Apr 26 2021
parent 12345swordy <alexanderheistermann gmail.com> writes:
On Monday, 26 April 2021 at 13:54:47 UTC, russhy wrote:
 On Monday, 26 April 2021 at 13:43:21 UTC, 12345swordy wrote:
 On Monday, 26 April 2021 at 13:40:18 UTC, russhy wrote:
 On Monday, 26 April 2021 at 06:35:25 UTC, Paulo Pinto wrote:
 On Sunday, 25 April 2021 at 19:41:39 UTC, russhy wrote:
 We already have zig and rust; adding yet another fancy slick 
 no-GC language is a dead end.
Same defeatist mentality I keep reading here; this is not what D needs.
 So please stop your no-GC whine. People already heard you, 
 more than once too.
I will never stop fighting for D against the people who want to ruin it with more GC.
D is open source, you are free to take care of your special flavoured D.
Say this to the people who want to ruin D with more GC; they are free to make their own fork called DJava.
This is just pure GC phobia at this point in time. -Alex
No, I never said the GC is useless. I said the language shouldn't expect a GC to exist; it should ask you to provide whatever allocation scheme you need for the task.
That is just pure nonsense. Having the GC built into the language has noticeable benefits. -Alex
Apr 26 2021
prev sibling next sibling parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Tuesday, 20 April 2021 at 16:28:46 UTC, russhy wrote:
 GC is not the right model if you don't have the same resources 
 as Microsoft / Google / Oracle to invest massively in GC R&D; 
 there is no way D can compete with the latest Java's sub-1ms 
 GC, no way
Well, it would be 0ms if it runs when the task is idle. If the task is short-lived then it would be like an arena allocator (faster than malloc).
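The arena pattern for a short-lived task can be sketched with the real `Region` building block from `std.experimental.allocator` (the request handler is a made-up example): allocation is a pointer bump, and teardown releases everything in one shot with no scanning.

```d
import std.experimental.allocator : make;
import std.experimental.allocator.building_blocks.region : Region;
import std.experimental.allocator.mallocator : Mallocator;

void handleRequest()   // hypothetical short-lived task
{
    auto arena = Region!Mallocator(1024 * 1024);  // one arena per task
    auto n = arena.make!int(42);                  // bump-pointer allocation
    // ... task-local work using `arena` ...
}   // arena's destructor frees the whole block at once: no scan, no sweep

void main() { handleRequest(); }
```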
 IMHO, Apple made the right decision to go with a RC with Swift, 
 like they did with Obj-C
The people who were unhappy with D having a GC have probably left for Rust at this point. So it seems like the majority wants to have a GC.
 D's future is with memory ownership / reference counting
If it keeps moving at this pace then it will never happen... and the language semantics for borrowing aren't there.
 We need to transition to RC instead of GC, the sooner, the 
 better
Yes, if it had happened 4 years ago; but it is kinda too late. If there had been traction in the community for making everything RC, then things wouldn't move as slowly with memory management as they do now. Borrowing won't happen either: for it to happen there should've been a gamma release by now, and it will take years to get it to work properly after the first gamma release.

GC-based tasks should make converting existing code reasonable in most cases, except for heavily multithreaded user-level code. It can also be done as a prototype in short order: just adapt the existing GC code, even though that might be a bit heavy for production use.
Apr 20 2021
parent reply russhy <russhy gmail.com> writes:
Well, it would be 0ms if it runs when the task is idle. If the task is short-lived then it would be like an arena allocator (faster than malloc).
Then just use an arena allocator. Why does everyone want to make things complicated for everyone, and as a default? It's just like how Thread, or even Fibers, can't be used without the GC; that is a very, very poor and short-sighted mindset.
The people who were unhappy with D having a GC have probably left for Rust at this point. So it seems like the majority wants to have a GC.
That is why the language stagnates: no one is left to take it to the next level. Worse, the ones who stayed are taking D in the wrong direction by focusing on the GC.
If it keeps moving at this pace then it will never happen... and the language semantics for borrowing aren't there.
An argument that I share. We need the foundation people, Walter/Atila etc., to show up in these discussions, but they never do, so no interesting debates happen; another reason why the language stagnates.
 Yes, if it had happened 4 years ago; but it is kinda too late. 
 If there had been traction in the community for making 
 everything RC, then things wouldn't move as slowly with memory 
 management as they do now. Borrowing won't happen either: for 
 it to happen there should've been a gamma release by now, and 
 it will take years to get it to work properly after the first 
 gamma release.
That is a very short-term mindset, another reason why the language stagnates: lack of vision, and lack of proper discussion to make pragmatic and future-proof changes/deprecations/decisions.

Whenever that discussion happens, it's always "it's too late", "it'll never happen", "it's the way it is", "just embrace the bad, stagnating, non-scalable GC". The main issue is the lack of will and courage from the team to acknowledge the poor state of the GC; they play hide&seek when it comes to discussing/debating it.
Apr 20 2021
parent reply Ola Fosheim Grostad <ola.fosheim.grostad gmail.com> writes:
On Tuesday, 20 April 2021 at 21:59:30 UTC, russhy wrote:
 That is a very short-term mindset, another reason why the 
 language stagnates: lack of vision, and lack of proper 
 discussion to make pragmatic and future-proof 
 changes/deprecations/decisions.
 
 Whenever that discussion happens, it's always "it's too late", 
 "it'll never happen", "it's the way it is"
It is too late for D to attract C++ developers. I am ok with C++20 and don't think it is realistic for D to play catch-up. D needs to find its own vision, or it will be displaced.
Apr 20 2021
parent reply russhy <russhy gmail.com> writes:
On Tuesday, 20 April 2021 at 22:32:08 UTC, Ola Fosheim Grostad 
wrote:
 On Tuesday, 20 April 2021 at 21:59:30 UTC, russhy wrote:
 That is a very short-term mindset, another reason why the 
 language stagnates: lack of vision, and lack of proper 
 discussion to make pragmatic and future-proof 
 changes/deprecations/decisions.
 
 Whenever that discussion happens, it's always "it's too late", 
 "it'll never happen", "it's the way it is"
It is too late for D to attract C++ developers. I am ok with C++20 and don't think it is realistic for D to play catch-up.
What does RC have to do with C++? What does using a pragmatic approach and IAllocator-oriented APIs have to do with C++? I'm not talking about C++, at all.
 I am ok with C++20 and don't think it is realistic for D to 
 play catch-up.
Defeatist mindset; that is what I am talking about when I say the people who stayed are taking D in the wrong direction. And I never said it needs to catch up with C++. Sticking to the GC means D can't be a compelling alternative for people with a C/C++/Rust (and soon Zig) background, and chasing managed-language use cases is pointless because Go already fits that role perfectly fine; they did what D failed to do:
 So it's 2014. If Go does not solve this GC latency problem 
 somehow then Go isn't going to be successful. That was clear.
 Other new languages were facing the same problem. Languages 
 like Rust went a different way but we are going to talk about 
 the path that Go took.
https://blog.golang.org/ismmkeynote

Is this allocating memory? Is this using an arena GC? Is this going to end up blocking all my threads? How long can we expect a pause to be? 20ms? 1ms?

The actor model isn't bound to GC use, so your wish for a vision for D has nothing to do with the way memory is managed. Solving real-world issues is what drives adoption, but at some point your solution will have to work with D's features, and those are built on a poor man's GC that'll stop the world and won't scale well.

Foundations have to work well and be pragmatic, so you aren't bound to a specific scenario; that is the most future-proof approach, because it leaves room for anyone with ideas on how to solve real-world problems. C is not displaced because of that: it doesn't tell you what to do and how to do it, it gives you the tools so you can write portable, efficient and better software.

I personally don't give a damn about all of this since I stick to ``core.stdc``. But I care about D, and this vision of D as a managed language is not doing good for D's future. Managed languages come and go because they made strong choices that they are unable to revert; C/C++ are still strong today. Why? I guess... I don't know, right?

I went way too offtopic; I should have created a separate thread from the beginning..

---

And I agree with Andrei that this place needs a voting system, with proper moderating tools, so I'd know if what I said speaks to people or if I should just stop because I am becoming annoyingly rude for no real reason.
Apr 20 2021
parent Ola Fosheim Grostad <ola.fosheim.grostad gmail.com> writes:
On Wednesday, 21 April 2021 at 00:47:13 UTC, russhy wrote:
 C is not displaced because of that: it doesn't tell you what to 
 do and how to do it, it gives you the tools so you can write 
 portable, efficient and better software.
ISO C is being replaced with ISO C++; that trend won't stop. Walter does not want to have many pointer types, and that makes a library-based allocator strategy a mess. My proposal is simpler: if you don't want GC, don't use a GC-Task.
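To make the proposal concrete, a purely hypothetical sketch: every name here is invented, and a real implementation would need compiler support to trace the task's stack. The interesting property is the teardown: a task that dies before any collection was needed just drops its heap wholesale.

```d
import std.experimental.allocator : make;
import std.experimental.allocator.building_blocks.region : Region;
import std.experimental.allocator.mallocator : Mallocator;

// Invented name: a task owning a private heap, standing in for a
// task-local GC pool. No scanning happens in this toy at all.
struct GCTask
{
    Region!Mallocator heap;

    this(size_t heapSize) { heap = Region!Mallocator(heapSize); }

    T* alloc(T, Args...)(Args args) { return heap.make!T(args); }
}   // Region's destructor releases the pool wholesale

void main()
{
    auto task = GCTask(1024 * 1024);
    auto answer = task.alloc!int(42);
    assert(*answer == 42);
}   // the task's heap vanishes here in O(1): the "deleted before scanned" fast path
```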
Apr 20 2021
prev sibling next sibling parent reply TheGag96 <thegag96 gmail.com> writes:
On Tuesday, 20 April 2021 at 16:28:46 UTC, russhy wrote:
 We need to transition to RC instead of GC, the sooner, the 
 better
Maybe I'm wrong, but wouldn't we just be going from one form of memory management people find slow to another? And a kind we know is *particularly* slow (even if more predictable)?

It seems to me that the really anti-GC crowd (or the portion of it that I know of, the Handmade / game dev group) wants neither GC nor RC, but instead something way more C-like, with custom allocators and such. I think that's why languages like Zig, which sprang up out of nowhere, are suddenly taking off, and why there's still anticipation for Jai nearly 7 years later.

I do kind of agree that maybe a D3 is what's needed in order to have a clean rethinking and uncrufting of the language. I've been feeling that despite D's amazing features and upsides, something just isn't clicking with people. Since I became an avid D fan many years ago, most programmers I talk to have never heard of it, and the situation has not gotten better over time.
Apr 20 2021
parent russhy <russhy gmail.com> writes:
On Wednesday, 21 April 2021 at 01:00:59 UTC, TheGag96 wrote:
 On Tuesday, 20 April 2021 at 16:28:46 UTC, russhy wrote:
 We need to transition to RC instead of GC, the sooner, the 
 better
Maybe I'm wrong, but wouldn't we just be going from one form of memory management people find slow to another? And a kind we know is *particularly* slow (even if more predictable)?

It seems to me that the really anti-GC crowd (or the portion of it that I know of, the Handmade / game dev group) wants neither GC nor RC, but instead something way more C-like, with custom allocators and such. I think that's why languages like Zig, which sprang up out of nowhere, are suddenly taking off, and why there's still anticipation for Jai nearly 7 years later.

I do kind of agree that maybe a D3 is what's needed in order to have a clean rethinking and uncrufting of the language. I've been feeling that despite D's amazing features and upsides, something just isn't clicking with people. Since I became an avid D fan many years ago, most programmers I talk to have never heard of it, and the situation has not gotten better over time.

I mixed things up with the Swift analogy I used earlier, my bad; I meant the IAllocator-driven APIs. Need to allocate? Want GC in your program? Fine, provide the GCAllocator, but don't enforce it on everyone.

And I agree with you: that is the perfect opportunity for a D3 vision, and that's where "the sooner the better" makes sense, because, just like you said, there is a need for a modern C *alternative*.
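What such an allocator-driven API could look like, sketched here with a template allocator parameter rather than the `IAllocator` interface (the `squares` function is invented; `makeArray`, `dispose` and `theAllocator` are the real `std.experimental.allocator` helpers):

```d
import std.experimental.allocator : makeArray, dispose, theAllocator;
import std.experimental.allocator.mallocator : Mallocator;

// The library never assumes a GC: the caller supplies the policy.
int[] squares(Allocator)(auto ref Allocator alloc, size_t n)
{
    auto a = alloc.makeArray!int(n);
    foreach (i, ref x; a) x = cast(int)(i * i);
    return a;
}

void main()
{
    auto viaGC = squares(theAllocator, 5);            // GC-backed by default
    auto manual = squares(Mallocator.instance, 5);    // no GC involved
    Mallocator.instance.dispose(manual);              // manual policy, manual free
}
```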
  I've been feeling that despite D's amazing features and 
 upsides, something just isn't clicking with people.
The experience I have with this was always the GC.
Apr 20 2021
prev sibling parent reply 12345swordy <alexanderheistermann gmail.com> writes:
On Tuesday, 20 April 2021 at 16:28:46 UTC, russhy wrote:
 GC is not the right model if you don't have the same resources 
 as Microsoft / Google / Oracle to invest massively in GC R&D; 
 there is no way D can compete with the latest Java's sub-1ms 
 GC, no way

 IMHO, Apple made the right decision to go with a RC with Swift, 
 like they did with Obj-C

 D's future is with memory ownership / reference counting

 The more we stick to the GC, the more we'll have to play 
 catchup with the competition

 Even GO's GC is starting to become a negative amongst certain 
 users 
 https://blog.discord.com/why-discord-is-switching-from-go-to-rust-a190bbca2b1f


 We need to transition to RC instead of GC, the sooner, the 
 better

 I know people like the GC, i do too, but in its current state 
 it is absolute garbage when you compare to the competition

 Again, the more we fill the STD library with GC code, the 
 faster we'll get shadowed by the competition
U can take the gc away from my dead cold hands. -Alex
Apr 21 2021
parent reply Imperatorn <johan_forsberg_86 hotmail.com> writes:
On Wednesday, 21 April 2021 at 21:47:10 UTC, 12345swordy wrote:
 On Tuesday, 20 April 2021 at 16:28:46 UTC, russhy wrote:
 [...]
U can take the gc away from my dead cold hands. -Alex
The GC will remain, but could be improved.
Apr 25 2021
parent reply russhy <russhy gmail.com> writes:
On Sunday, 25 April 2021 at 09:48:15 UTC, Imperatorn wrote:
 On Wednesday, 21 April 2021 at 21:47:10 UTC, 12345swordy wrote:
 On Tuesday, 20 April 2021 at 16:28:46 UTC, russhy wrote:
 [...]
U can take the gc away from my dead cold hands. -Alex
The GC will remain, but could be improved.
That is a very very very very very bad decision.

That is one of the reasons I decided to stay away from std and stick to core.stdc, and that is what I recommend every time I mention D to people. That is also one of the reasons I limit my use of D for new projects and keep an eye out for new languages that focus on allocators first.

Stay the hell out of std, and stay the hell out of the GC, if you want to get started with D.

Mistakes, mistakes, mistakes, that is what people do; bloat, bloat, bloat, that is what people do. And as expected, the top-level people remain silent; they let some people destroy this beautiful language that is D. Ignoring a decade of complaints about the GC, and then saying after decades "we need to improve the GC", as if nobody had made the complaint. Blind, refusing to listen to feedback: that's the D foundation. The lack of a public roadmap is the evidence of everything. Leadership? Where?

C'mon Walter, Atila, Andrei, let us know: what do you think about the GC story? You are responsible for it, c'mon, react!

You guys were chasing Java, and Go ate your cake with a better GC. Then you figured you'd also chase C++, but Rust ate your cake too. Now you figured you should stick to chasing Java, with a shitty GC that doesn't scale, and without even the tooling to analyze heap memory sanely, something at least Java provides out of the box.

"The GC will remain", but not D.
Apr 25 2021
parent Imperatorn <johan_forsberg_86 hotmail.com> writes:
On Sunday, 25 April 2021 at 17:33:06 UTC, russhy wrote:
 On Sunday, 25 April 2021 at 09:48:15 UTC, Imperatorn wrote:
 [...]
 That is a very very very very very bad decision. That is one 
 of the reasons I decided to stay away from std and stick to 
 core.stdc, and that is what I recommend every time I mention D 
 to people. [...]
Hmm, some partly valid points, but I think the strength D has is its plasticity. Let's put it like this: what should the "GC-lovers" do without the GC? How do you please everyone? The only solution I see is to provide choice, which I think D does.
Apr 25 2021
prev sibling next sibling parent reply jmh530 <john.michael.hall gmail.com> writes:
On Tuesday, 20 April 2021 at 09:52:07 UTC, Ola Fosheim Grøstad 
wrote:
 [snip]
 It seems to me that the obvious way to retain the easy of use 
 that garbage collection provides without impeding performance 
 is to limit the memory to scan, and preferably do the scanning 
 when nobody is using the memory.

 The actor model seems to be a good fit. Or call it a task, if 
 you wish. If each actor/task has it's own GC pool then there is 
 less memory to scan, and you can do the scanning when the 
 actor/task is waiting on I/O or scheduling. So you would get 
 less intrusive scanning pauses. It would also fit well with 
 async-await/futures.

 [snip]
Are you familiar with this: https://github.com/hsutter/gcpp

How similar are these ideas?
Apr 20 2021
parent Ola Fosheim Grostad <ola.fosheim.grostad gmail.com> writes:
On Tuesday, 20 April 2021 at 17:52:33 UTC, jmh530 wrote:
 Are you familiar with this:
 https://github.com/hsutter/gcpp

 How similar are these ideas?
I've read about it, but it seems to be just a local collector? It isn't similar, as we want to trace stacks and don't want to rely on conventions. We want the type system to prevent bad situations from happening.
Apr 20 2021
prev sibling next sibling parent reply sighoya <sighoya gmail.com> writes:
On Tuesday, 20 April 2021 at 09:52:07 UTC, Ola Fosheim Grøstad 
wrote:
 Is this a direction D is able to move in or is a new language 
 needed?
I think this sounds good. There is already a language taking the same/similar direction, Pony: https://tutorial.ponylang.io/appendices/garbage-collection.html

Anyone have experience with it?
Apr 27 2021
next sibling parent Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Tuesday, 27 April 2021 at 21:18:30 UTC, sighoya wrote:
 Pony: 
 https://tutorial.ponylang.io/appendices/garbage-collection.html

 Any(one) experience with it?
I have no real experience with it, but I read quite a bit about it when it first came out. I think the typing system for references is interesting: it basically keeps track at compile time of whether a pointer has been given out or not, and restricts the capabilities of the reference based on that.
Apr 27 2021
prev sibling parent reply Tejas <notrealemail gmail.com> writes:
On Tuesday, 27 April 2021 at 21:18:30 UTC, sighoya wrote:
 On Tuesday, 20 April 2021 at 09:52:07 UTC, Ola Fosheim Grøstad 
 wrote:
 Is this a direction D is able to move in or is a new language 
 needed?
I think this sounds good. There is already a language taking the same/similar direction: Pony: https://tutorial.ponylang.io/appendices/garbage-collection.html Any(one) experience with it?
Someone made a data stream processor with it that had nanosecond-level latency requirements. It seems like a pretty good language even in practical usage.

https://youtu.be/GigBhej1gfI

However, I noticed that the company he leads is now looking to hire Rust programmers, so I can't really say what the long-term result of using this language is.
May 03 2021
parent Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Monday, 3 May 2021 at 18:17:05 UTC, Tejas wrote:
 Someone made a data stream processor with it that had 
 nanosecond-level latency requirements. It seems like a pretty 
 good language even in practical usage.

 https://youtu.be/GigBhej1gfI
Thanks for the link, Sean T Allen seemed to be very honest about strengths and weaknesses. I like that.
 However, I noticed that the company he leads is now looking to 
 hire Rust programmers, so I can't really say what the 
 long-term result of using this language is.
I think that would apply to most small languages: they are most suitable for hobbyists or personal usage, and more risky for long-term projects that are deployed.
May 03 2021
prev sibling parent reply James Lu <jamtlu gmail.com> writes:
On Tuesday, 20 April 2021 at 09:52:07 UTC, Ola Fosheim Grøstad 
wrote:
 As computer memory grows, naive scan and sweep garbage 
 collection becomes more and more a burden.

 Also, languages have not really come up with a satisfactory way 
 to simplify multi-threaded programming, except to split the 
 workload into many single-threaded tasks that are run in 
 parallel.

 It seems to me that the obvious way to retain the easy of use 
 that garbage collection provides without impeding performance 
 is to limit the memory to scan, and preferably do the scanning 
 when nobody is using the memory.
So region-based memory management with per-region GC?
Apr 28 2021
parent Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Thursday, 29 April 2021 at 01:24:27 UTC, James Lu wrote:
 So region-based memory management with per-region GC?
You could say that, although it is more based on the context of a task which can suspend.
Apr 29 2021