
digitalmars.D - Future of memory management in D

reply Rumbu <rumbu rumbu.ro> writes:
At least from my point of view, it seems that recently D made a 
shift from a general purpose language to a C successor, hence the 
last efforts to improve betterC and C interop, neglecting other 
areas of the language.

By other areas I mean half baked language built-ins or oop 
support which failed to evolve at least to keep the pace with the 
  languages from where D took inspiration initially (e.g. Java and 
its successors).

In this new light, even though I am not personally bothered by it, I 
must admit that the garbage collector has become something that 
doesn't fit in.

Now, without a GC, more than half of the language risks becoming 
unusable, and that's why I ask myself: how do you see the future of 
memory management in D?

For library development it is not necessarily a big deal, since the 
allocator pattern can be implemented for each operation that 
needs to allocate.

But for the rest of the features that are part of the core 
language (e.g. arrays, classes, exceptions), what memory model do 
you think will fit? Do you think compiler-supported ARC can be 
accepted by everyone as a deterministic memory model? Or are 
memory ownership and flow analysis better?

Not settling on a standard memory model can be a mistake: the C 
crowd will always complain that they cannot use feature X, while 
others will complain that they cannot use feature Y because it is 
not finished or its semantics are stuck in the 2000s.
Nov 16 2021
next sibling parent reply "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Tue, Nov 16, 2021 at 06:17:29PM +0000, Rumbu via Digitalmars-d wrote:
[...]
 In this new light, even I am not bothered by, I must admit that the
 garbage collector became something that doesn't fit in.
 
 Now, without a gc, more than half of the language risks to become
 unusable and that's why I ask myself how do you see the future of the
 memory management in D?
[...] You can take the GC from me over my cold dead body. T -- Study gravitation, it's a field with a lot of potential.
Nov 16 2021
next sibling parent Imperatorn <johan_forsberg_86 hotmail.com> writes:
On Tuesday, 16 November 2021 at 18:25:53 UTC, H. S. Teoh wrote:
 On Tue, Nov 16, 2021 at 06:17:29PM +0000, Rumbu via 
 Digitalmars-d wrote: [...]
 In this new light, even I am not bothered by, I must admit 
 that the garbage collector became something that doesn't fit 
 in.
 
 Now, without a gc, more than half of the language risks to 
 become unusable and that's why I ask myself how do you see the 
 future of the memory management in D?
[...] You can take the GC from me over my cold dead body. T
😂 I'm not that dramatic. But research suggests GC is actually a good invention. For me the thing about D is its plasticity. That you can choose. But, yes, to be able to choose, the stdlib must not rely on gc as much. But I really want the GC there if I need it.
Nov 16 2021
prev sibling parent Imperatorn <johan_forsberg_86 hotmail.com> writes:
On Tuesday, 16 November 2021 at 18:25:53 UTC, H. S. Teoh wrote:
 On Tue, Nov 16, 2021 at 06:17:29PM +0000, Rumbu via 
 Digitalmars-d wrote: [...]
 In this new light, even I am not bothered by, I must admit 
 that the garbage collector became something that doesn't fit 
 in.
 
 Now, without a gc, more than half of the language risks to 
 become unusable and that's why I ask myself how do you see the 
 future of the memory management in D?
[...] You can take the GC from me over my cold dead body. T
By accident I was actually reading this rn: https://ocaml.org/learn/tutorials/garbage_collection.html
Nov 16 2021
prev sibling next sibling parent reply IGotD- <nise nise.com> writes:
On Tuesday, 16 November 2021 at 18:17:29 UTC, Rumbu wrote:
 Now, without a gc, more than half of the language risks to 
 become unusable and that's why I ask myself how do you see the 
 future of the memory management in D?

 Not assuming a standard memory model can be a mistake, the C 
 crowd will always complain that they cannot use feature X, 
 others will complain that they cannot use feature Y because it 
 is not finished or its semantics are stuck in 2000's.
The Achilles heel of D and its memory management was never the GC itself or GC/not GC. It was that D didn't separate raw pointers from managed pointers as different types. If D had done that, it would have many more options. This borrow checker implementation in D, I'm not sure what the purpose is and I don't have the whole picture either.
Nov 16 2021
next sibling parent "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Tue, Nov 16, 2021 at 10:38:13PM +0000, IGotD- via Digitalmars-d wrote:
 On Tuesday, 16 November 2021 at 18:17:29 UTC, Rumbu wrote:
 
 Now, without a gc, more than half of the language risks to become
 unusable and that's why I ask myself how do you see the future of
 the memory management in D?
 
 Not assuming a standard memory model can be a mistake, the C crowd
 will always complain that they cannot use feature X, others will
 complain that they cannot use feature Y because it is not finished
 or its semantics are stuck in 2000's.
The Achilles heel of D and its memory management was never the GC itself or GC/not GC. It was that D didn't separate raw pointers from managed pointers as different types. If D had done that, it would have many more options.
[...] That may be true, but having multiple incompatible pointer types mixed together in the language makes code far more complex (and prone to bugs). The type system would be a lot more complex than it is today. And it would not really solve the problem of interop between, e.g., two different 3rd party libraries that expect different pointer types. Having a single unified pointer type increases compatibility between code of diverse origins, though of course that comes at a price. T -- "Maybe" is a strange word. When mom or dad says it it means "yes", but when my big brothers say it it means "no"! -- PJ jr.
Nov 16 2021
prev sibling parent reply jmh530 <john.michael.hall gmail.com> writes:
On Tuesday, 16 November 2021 at 22:38:13 UTC, IGotD- wrote:
 [snip]

 The Achilles heel of D and its memory management was never the 
 GC itself or GC/not GC. It was that D didn't separate raw 
 pointers from managed pointers as different types. If D had 
 done that, it would have many more options.

 This borrow checker implementation in D, I'm not sure what the 
 purpose is and I don't have the whole picture either.
I'm confused by this because it seems as if the managed C++ iterations from Microsoft do not have much traction. What is the benefit of different types for GC/non-GC pointers?
Nov 16 2021
next sibling parent Paulo Pinto <pjmlp progtools.org> writes:
On Wednesday, 17 November 2021 at 02:10:02 UTC, jmh530 wrote:
 On Tuesday, 16 November 2021 at 22:38:13 UTC, IGotD- wrote:
 [snip]

 The Achilles heel of D and its memory management was never the 
 GC itself or GC/not GC. It was that D didn't separate raw 
 pointers from managed pointers as different types. If D had 
 done that, it would have many more options.

 This borrow checker implementation in D, I'm not sure what the 
 purpose is and I don't have the whole picture either.
I'm confused by this because it seems as if the managed C++ iterations from Microsoft do not have much traction. What is the benefit of different types for GC/non-GC pointers?
They do, to the point that C++/CLI support was a major milestone for .NET Core 3.0. C++/CLI is used in the context of native interop; its goal isn't to rewrite the world in C++, and it is much easier than figuring out the right set of P/Invoke declarations. And ever since the Longhorn debacle, the Windows team has pushed COM everywhere as a replacement for .NET ideas, so you have COM reference counting across the whole OS stack. What we don't have is a developer-friendly way to author COM, because politics killed C++/CX and we are back to editing IDL files, but AddRef/Release are everywhere.
Nov 16 2021
prev sibling parent reply IGotD- <nise nise.com> writes:
On Wednesday, 17 November 2021 at 02:10:02 UTC, jmh530 wrote:
 I'm confused by this because it seems as if the managed C++ 
 iterations from Microsoft do not have much traction. What is 
 the benefit of different types for GC/non-GC pointers?
Managed C++ is now named C++/CLI and it is probably still there if you want to use it. Not many use C++/CLI, I suspect, in most cases. The benefit of a special type for managed pointers is that you can change the implementation of the GC fairly easily, as well as incorporate metadata under the hood. Tracing GC is not suitable for low-latency/embedded programs, but reference counting can be a viable alternative for them.
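A rough sketch of the idea being discussed, to make it concrete. This is not a D feature; `Managed` is a hypothetical library wrapper, only meant to show how a distinct managed-pointer type could carry metadata that a raw pointer cannot:

```d
// Hypothetical illustration only: D has no built-in Managed!T type.
struct Managed(T)
{
    private T* payload;   // the runtime could hide a refcount, color
                          // bits, or generation tags next to this
}

void example()
{
    int* raw;             // plain pointer: never traced, never moved
    Managed!int managed;  // the collector would know about this one,
                          // and its implementation could change freely
}
```

With the two kept as separate types, the GC implementation (tracing, reference counting, or something else) could change without touching code that only ever handles raw pointers.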
Nov 17 2021
next sibling parent Paulo Pinto <pjmlp progtools.org> writes:
On Wednesday, 17 November 2021 at 10:59:19 UTC, IGotD- wrote:
 On Wednesday, 17 November 2021 at 02:10:02 UTC, jmh530 wrote:
 I'm confused by this because it seems as if the managed C++ 
 iterations from Microsoft do not have much traction. What is 
 the benefit of different types for GC/non-GC pointers?
Managed C++ is now named C++/CLI and it is probably still there if you want to use it. Not many use C++/CLI and I suspect that for most cases.
It is mostly used to consume those COM APIs that the Windows team keeps producing only for C++ consumption and that are harder to get right with plain P/Invoke or RCW/CCW.
 The benefit of a special type for managed pointers is that you 
 can change the implementation of the GC fairly easily as well 
 as incorporate metadata under the hood. Tracing GC is not 
 suitable for low latency programs/embedded, but reference 
 counting can be a viable alternative for the low latency 
 programs.
PTC and Aicas have been in business for the last 25 years doing real-time GC for embedded. It is a matter of who's on the team.

"Hard Realtime Garbage Collection in Modern Object Oriented Programming Languages."
https://www.amazon.com/Realtime-Collection-Oriented-Programming-Languages/dp/3831138931/

Basically the foundational background for the Aicas product - the thesis was written by one of the founders.

"Distributed, Embedded and Real-time Java Systems"
https://link.springer.com/book/10.1007/978-1-4419-8158-5

Given that D is still in a philosophical search over whether it wants to double down on GC or not, such optimizations aren't possible.
Nov 17 2021
prev sibling parent jmh530 <john.michael.hall gmail.com> writes:
On Wednesday, 17 November 2021 at 10:59:19 UTC, IGotD- wrote:
 [snip]
 The benefit of a special type for managed pointers is that you 
 can change the implementation of the GC fairly easily as well 
 as incorporate metadata under the hood. Tracing GC is not 
 suitable for low latency programs/embedded, but reference 
 counting can be a viable alternative for the low latency 
 programs.
Thanks. I now remember that this might have come up before. I get the idea that tracing GC and reference counting are useful in different programs. However, I understand that it is possible to switch out D's GC, though that may not be so easy. Could you explain a bit more how having two different pointer types helps with switching out the GC? Also, suppose std.allocator gets put in Phobos. We can currently use the gc_allocator; would it be possible to also create an rc_allocator? Is the issue that the pointer of gc_allocator is a normal pointer, but rc_allocator would need one wrapped with additional metadata?
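For what it's worth, a minimal sketch of the allocator pattern the question refers to. `GCAllocator` and `Mallocator` are real `std.experimental.allocator` types; there is no `rc_allocator` in Phobos today, so the `RefCounted` wrapper below only illustrates where the extra metadata would have to live:

```d
import std.experimental.allocator : make, dispose;
import std.experimental.allocator.gc_allocator : GCAllocator;
import std.experimental.allocator.mallocator : Mallocator;
import std.typecons : RefCounted;

struct Point { double x, y; }

void main()
{
    // GC-backed allocation: a plain pointer, reclaimed by the collector.
    Point* a = GCAllocator.instance.make!Point(1.0, 2.0);

    // Manual allocation: still a plain pointer, but we must dispose it.
    Point* b = Mallocator.instance.make!Point(3.0, 4.0);
    scope (exit) Mallocator.instance.dispose(b);

    // Reference counting needs a wrapper that carries the count
    // alongside the payload, i.e. it is no longer a "normal" pointer.
    auto c = RefCounted!Point(5.0, 6.0);
}
```

The point being: both allocators hand back a plain `Point*`, whereas the reference-counted variant is a different type entirely, which is exactly the "wrapped with additional metadata" issue raised above.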
Nov 17 2021
prev sibling next sibling parent reply zjh <fqbqrr 163.com> writes:
On Tuesday, 16 November 2021 at 18:17:29 UTC, Rumbu wrote:


GC should be an option, not enforced on you.
Nov 16 2021
next sibling parent reply "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Wed, Nov 17, 2021 at 12:46:25AM +0000, zjh via Digitalmars-d wrote:
[...]
 GC should be an option,not enforce on you.
IIRC the current GC doesn't even initialize itself on program startup until you make your first GC allocation. So if main() is @nogc, you can pretty much already write an entire application without the GC starting up at all.

T

-- 
Long, long ago, the ancient Chinese invented a device that lets them see through walls. It was called the "window".
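A minimal sketch of that point, assuming an ordinary (non-betterC) build where druntime is linked in but never asked to collect:

```d
// If nothing in the call tree allocates from the GC, the collector is
// never initialized and never runs. @nogc makes the compiler enforce it.
@nogc nothrow
void main()
{
    import core.stdc.stdio : printf;

    printf("no GC allocations anywhere in this program\n");
}
```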
Nov 16 2021
parent reply zjh <fqbqrr 163.com> writes:
On Wednesday, 17 November 2021 at 00:55:33 UTC, H. S. Teoh wrote:

Thank you for reminding me. I didn't know that.
However, suppose others are like me. As soon as they hear GC, 
they shake their heads and leave. What should we do?
How to expand D's user base?
We should provide a way for users to never use `GC`, such as 
`embedded`, which is a good market. This is the territory of `C`.
Nov 16 2021
parent reply "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Wed, Nov 17, 2021 at 01:06:57AM +0000, zjh via Digitalmars-d wrote:
 On Wednesday, 17 November 2021 at 00:55:33 UTC, H. S. Teoh wrote:
 
 Thank you for reminding me. I don't know that point.
 However, suppose others are like me. As soon as they hear GC, they shake
 their heads and leave. What should we do?
What *can* we do? Lie to them that there's no GC, and when they find out they leave *and* get angry that we lied to them? You cannot argue with GC-phobia, it's an irrational fear. No amount of argument will change anything. You cannot reason with the unreasonable; you lose by definition.
 How to expand D's user base?
 We should provide a way for users to never use `GC`, such as
 `embedded`, which is a good market. This is the territory of `C`.
Years ago, before @nogc was implemented, people were clamoring for it, saying that if we could only have the compiler enforce no GC use, hordes of C/C++ programmers would come flocking to us.

Today, we have @nogc implemented and working, and the hordes haven't come yet.

Now people are clamoring for ref-counting and getting rid of GC use in Phobos. My prediction is that 10 years later, we will finally have ref-counting and Phobos will be @nogc, and the hordes of C/C++ programmers still will not come to us.

T

-- 
Never ascribe to malice that which is adequately explained by incompetence. -- Napoleon Bonaparte
Nov 16 2021
parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Wednesday, 17 November 2021 at 01:23:45 UTC, H. S. Teoh wrote:
 Years ago, before  nogc was implemented, people were clamoring 
 for it, saying that if we could only have the compiler enforce 
 no GC use, hordes of C/C++ programmers would come flocking to 
 us.

 Today, we have  nogc implemented and working, and the hordes 
 haven't come yet.
@nogc gave those who understand system-level programming a signal of direction. I don't remember people demanding it; IIRC Walter just did it. Nobody said it was significant.
 Now people are clamoring for ref-counting and getting rid of GC 
 use in Phobos.  My prediction is that 10 years later, we will 
 finally have ref-counting and Phobos will be  nogc, and the 
 hordes of C/C++ programmers still will not come to us.
C++ has grown since then. D has chosen to waste all its resources on @safe and @live and not change. Thus C++ is too far ahead. D should have gone with the actor model. D should have gotten rid of global GC scanning. But D has not chosen any model and tries to do everything, which isn't possible. D tries to change without changing. That leads to bloat.

Phobos suffers from bloat. The compiler suffers from bloat. The syntax suffers from bloat. Andrei seems to think that D should follow C++'s idea of simplifying by addition. That is a disaster in the making. C++ cannot change; D is not willing to use that to its advantage.

Bloat is the enemy of change. D has chosen to tweak the bloat instead of reducing it. That leads to more bloat and less change. ImportC is added to a compiler that should have been restructured first. Good luck with refactoring the compiler now; SDC might be the only hope... The global GC strategy with raw pointers and integrated C interop is one massive source of "runtime bloat".

C++ also suffers from bloat, but it has critical mass, and that is enough to keep it alive. D is competing with Zig and Nim. They are leading. They have less bloat AFAIK.

Refocus. Forget C++ and Rust. D should pick a memory model and runtime strategy that scales, and do it well! Global GC does not scale when combined with C semantics. That has always been true and will remain true. That is an undeniable fact. If D is to be competitive something has to change. Adding bloat won't change anything.
Nov 17 2021
parent reply Guillaume Piolat <first.last gmail.com> writes:
On Wednesday, 17 November 2021 at 09:04:42 UTC, Ola Fosheim 
Grøstad wrote:
  nogc gave those who understand system level programming a 
 signal of direction. I don't remmber people demanding it, IIRC 
 Walter just did it. Nobody said it was significant.
That isn't what happened: people did demand @nogc back before it existed and considered it necessary. (I was a sceptic and am now an avid user of @nogc.)

The GC in D did receive many improvements over the years: @nogc, speed and space enhancements, -profile=gc... A bit surprised by the disinformation on this thread.
Nov 17 2021
next sibling parent Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Wednesday, 17 November 2021 at 23:01:44 UTC, Guillaume Piolat 
wrote:
 That isn't not what happened, people did demand  nogc back 
 before it existed and considered it necessary. (I was a sceptic 
 and am now an avid user of  nogc).
I am sorry if my memory is off, but I thought people suggested a command-line switch for disabling the GC and that Walter came up with the idea for the attribute. I could be wrong, as this was ten years ago! I never thought it was essential, let me put it that way. I thought it was a nice feature that confirmed that Walter *wants* programmers to view D as a proper system programming language. Which is significant to what could happen down the road, but not a replacement for compiler-backed no-GC memory management.
 GC in D did receive many improvements over the years:  nogc, 
 speed and space enhancements, -profile=gc... A bit surprised by 
 the disinformation on this thread.
There have been many improvements, but that does not change the O(N) limitation and the fact that it is essentially a Boehm-style collector like C++ has. Almost no projects use that one. And that says a lot about what system-level programmers look for.
Nov 17 2021
prev sibling parent Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Wednesday, 17 November 2021 at 23:01:44 UTC, Guillaume Piolat 
wrote:
 That isn't not what happened, people did demand  nogc back 
 before it existed and considered it necessary. (I was a sceptic 
 and am now an avid user of  nogc).
To give my statement some context, this is a quote from a [2014 posting I made](https://forum.dlang.org/post/srqejkdljjhxiwmjbqzb forum.dlang.org):
I got very happy when Walter announced "@nogc" and his intent to 
create a "better C" switch on the compiler.
I felt this was a nice change of direction, but I also feel that 
this direction has stagnated and taken a turn for the worse with 
the ref-counting focus… Phobos is too much of a 
scripting-language library to me, too much like Tango, and 
hacking in ref counting makes it even more so.
DIP60 was created on 2014-4-15, shortly after I had engaged in some criticism of the GC (IIRC). I saw DIP60 as a response to that, but when I search the forums I see that there have been suggestions of various kinds several years prior to this. Now, please also understand that my view of RC/GC in D has changed since then.
Nov 17 2021
prev sibling parent reply zjh <fqbqrr 163.com> writes:
On Wednesday, 17 November 2021 at 00:46:25 UTC, zjh wrote:

 GC should be an option,not enforce on you.
D was originally a `C-like` language. If we could utilize the libraries of `C/C++`, what a big market that would be!
Nov 16 2021
parent reply "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Wed, Nov 17, 2021 at 12:58:42AM +0000, zjh via Digitalmars-d wrote:
 On Wednesday, 17 November 2021 at 00:46:25 UTC, zjh wrote:
 
 GC should be an option,not enforce on you.
D is originally a `C like` language. If we can utilize the library of `C/C++`, what a big market ?
We already can. I have many projects that call C libraries (not so much C++ because of incompatibilities between C++ and D templates), and it works fine. T -- Today's society is one of specialization: as you grow, you learn more and more about less and less. Eventually, you know everything about nothing.
Nov 16 2021
parent reply zjh <fqbqrr 163.com> writes:
On Wednesday, 17 November 2021 at 01:02:46 UTC, H. S. Teoh wrote:
 We already can.  I have many projects that call C libraries 
 (not so much C++ because of incompatibilities between C++ and D 
 templates), and it works fine.
The abstraction of C is too raw. D needs the ability to use the whole standard library without the GC, so as to survive under `low hardware conditions`. There are many C++ users, so you have to provide C++ bindings. Now there are many `Rust` users, and they may have to be bound too; their users could become ours. After all, the D user base is too small. You have to bind with others.
Nov 16 2021
parent reply "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Wed, Nov 17, 2021 at 01:15:21AM +0000, zjh via Digitalmars-d wrote:
 On Wednesday, 17 November 2021 at 01:02:46 UTC, H. S. Teoh wrote:
 We already can.  I have many projects that call C libraries (not so
 much C++ because of incompatibilities between C++ and D templates),
 and it works fine.
The abstraction of C is too raw. D needs the ability to own the whole standard library without GC, so as to survive under `low hardware conditions`.
Have you heard about -betterC? ;-)
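For reference, the smallest possible -betterC program looks something like this (no druntime, no GC, only the C runtime; assuming a build with `dmd -betterC`):

```d
// Built with: dmd -betterC app.d
// main is extern(C) because there is no D runtime entry point.
extern (C) int main()
{
    import core.stdc.stdio : printf;

    printf("hello from betterC, no GC anywhere\n");
    return 0;
}
```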
 There are many C++ users, so you have to provide C++ binding.
For the past few years Walter has been busy implementing bindings to C++ classes. I haven't kept up with the current status of that, but AFAIK it has been implemented to some extent. I haven't seen the crowd of C++ programmers flocking to D yet.
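To make the current state concrete, here is a minimal sketch of what binding to a C++ class can look like today. `extern(C++)` is a real language feature; the `Counter` interface and the `makeCounter` factory are hypothetical stand-ins for whatever the C++ side actually provides, and the C++ object file has to be linked in:

```d
// D side: an extern(C++) interface maps onto a C++ class with virtual methods.
extern (C++) interface Counter
{
    void increment();   // matches `virtual void increment();` in C++
    int value();        // matches `virtual int value();` in C++
}

extern (C++) Counter makeCounter();   // factory implemented in the C++ code

void useIt()
{
    Counter c = makeCounter();
    c.increment();
    assert(c.value() == 1);
}
```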
 Now, there are many `Rust` users, and they may have to be binded.
 Their is mine. After all, the D user base is too small. You have to
 bind with others.
AFAIK, most Rust users are ex-C/C++ users. If they have already chosen Rust over D, good luck convincing them to switch again. D already has bindings for other languages. The most important one is C, since that's the de facto baseline for basically every relevant OS out there these days. C++ is partially supported (you have extern(C++) classes, dpp, etc.). We have Java support (thanks to Adam's jni.d, which uses D's metaprogramming abilities to make it so nice to interface with Java that I was actually tempted to start using Java again!), and somebody wrote an Excel binding too. Of course, these are in various states of polishedness, and probably could use more work to round out the experience. It's easy to say "we need X, we need Y, we should do Z". But talk doesn't make anything happen; what we have today is because *somebody* decided to actually write code to make it happen instead of posting suggestions to unending threads on the forum. ;-) T -- "Hi." "'Lo."
Nov 16 2021
parent reply zjh <fqbqrr 163.com> writes:
On Wednesday, 17 November 2021 at 01:32:50 UTC, H. S. Teoh wrote:

Can I make full use of the `std` library without garbage 
collection? If this is achieved, it can be advertised everywhere. 
Then why not remove `GC` from the home page?

`-betterC`, of course. I just can't use it with the `std` 
library; it's like walking on one leg.

Examples should be provided on how to bind C++ classes and how to 
write bindings. Bindings for common `C++` libraries should be provided. 
I didn't see hordes of C++ programmers flocking to D, 
because C++ is already quite good; C++ has learned a lot from `D`. Cats teach tigers.

`Rust`'s propaganda is `no GC + memory safety`. We can also learn from that. We can add a little `beautiful syntax + excellent metaprogramming`. We need more examples of `C++` binding, to let users know how to bind. `DPP` is not usable for me. `Java support` has too few examples.

I am not an expert and can only provide `ideas`.
Nov 16 2021
next sibling parent zjh <fqbqrr 163.com> writes:
On Wednesday, 17 November 2021 at 01:57:23 UTC, zjh wrote:

 I am not an expert and can only provide `ideas`.
One wrong step leads to another; `GC` was a bad move. `D` has become a cautionary example for others.
Nov 16 2021
prev sibling next sibling parent reply "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Wed, Nov 17, 2021 at 01:57:23AM +0000, zjh via Digitalmars-d wrote:
 On Wednesday, 17 November 2021 at 01:32:50 UTC, H. S. Teoh wrote:
 
 Can I make full use of the `STD` library without garbage collection?
 If this is achieved, it can be advertised everywhere.
 ,Then why not remove `GC` on the home page?.
Why bend over backwards to please the GC-phobic crowd? They've already made up their minds; there's no convincing them.

Having a GC IME is an extremely powerful thing, contrary to what the GC haters will tell you. It frees up your APIs from being littered with memory-management minutiae. It makes your APIs clean, refactorable, and maintainable. Easy to use. It makes your code clean. You get to make progress in your problem domain instead of wasting 75% of your brain power thinking about memory management at every step. You save countless hours writing manual memory management code, and countless more hours debugging said code.

And in D, you also get to choose to use manual memory management in performance bottlenecks *where it actually matters*. 90% of application code is not on the hot path; it's a complete waste of effort to meticulously manage memory in such code, when you could be focusing your efforts on the 10% hot path where 90% of the performance gains are made.

Writing code without a GC is wasteful of programmer time, which means wasting money paying programmers to do something that should have been done in 10% of the time, leaving the rest of the time to work on things that actually matter, like implementing features and making progress in your problem domain. You spend tons of wages paying said programmers to debug memory-related bugs, which are notoriously hard to find and require a lot of time, when these wages could have been used to pay them to implement new features and drive your business forward. *And* you waste tons of wages paying them to maintain code that's needlessly complex due to having to manually manage memory all the time. It takes a lot of time and effort to maintain such code, time that could have been diverted to more useful work had a GC been in place.

And you know what? In spite of all this time and effort, programmers get it wrong anyway -- typical C code is full of memory leaks, pointer bugs, and buffer overflows. Most of them are latent and only trigger in obscure environments and unusual inputs, and so lie undiscovered, waiting for the day they suddenly explode in a customer's production environment. Or somebody comes up with a security exploit...

With a GC, you instantly eliminate 90% of these problems. Only 10% of the time do you actually need to manually manage memory -- in inner loops and hot paths where it actually matters.

GC phobia is completely irrational and I don't see why we should bend over backwards to please that crowd.

T

-- 
"I'm running Windows '98." "Yes." "My computer isn't working now." "Yes, you already said that." -- User-Friendly
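A minimal sketch of the split being argued for here, assuming a program where setup code can freely use the GC while the hot inner loop is kept allocation-free and verified by @nogc:

```d
double[] loadSamples()
{
    // Ordinary GC code: runs once, clarity matters more than allocation cost.
    return new double[](1_000_000);
}

@nogc nothrow
double hotLoop(const(double)[] samples)
{
    // The compiler rejects any GC allocation in here.
    double sum = 0;
    foreach (s; samples)
        sum += s;
    return sum;
}

void main()
{
    auto data = loadSamples();
    auto result = hotLoop(data);
}
```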
Nov 16 2021
next sibling parent reply zjh <fqbqrr 163.com> writes:
On Wednesday, 17 November 2021 at 02:32:21 UTC, H. S. Teoh wrote:

You're right. GC is good for experts.
But a plain user leaves as soon as he hears `GC`. You can't grow 
the `user base`, and no matter how good your language is, it's useless then.
Why can `Rust` attract `C/C++` users but `D` can't? It's the `GC`.
Without the GC, D might have been popular `ten years ago`. `java` is a 
`big hole`.
Moreover, if speed increases by one point and memory use drops by 
one point, the customer will gain more users. No one thinks they 
have enough users.
The `speed/memory` of the machine is much more important than the 
`speed/comfort` of the programmer.
Nov 16 2021
next sibling parent reply SealabJaster <sealabjaster gmail.com> writes:
On Wednesday, 17 November 2021 at 02:45:10 UTC, zjh wrote:
 Why can `Rust` attract `C/C++` users, but `d` can't. It's `GC`.
tbf, until we have a solid definition of what D actually wants to **be**, complaining that "we're not getting the C/C++/Python/Java/JS/TS///// users" is a bit pointless, because until we know what D wants to be, we don't know who the actual target audience is. DConf is only a few days away, so maybe we'll get some surprises then :)
Nov 16 2021
parent zjh <fqbqrr 163.com> writes:
On Wednesday, 17 November 2021 at 03:02:05 UTC, SealabJaster 
wrote:

 DConf is only a few days away, so maybe we'll get some 
 surprises then :)
Yes, I haven't seen D's PPT for a long time.
Nov 16 2021
prev sibling parent reply bachmeier <no spam.net> writes:
On Wednesday, 17 November 2021 at 02:45:10 UTC, zjh wrote:
 On Wednesday, 17 November 2021 at 02:32:21 UTC, H. S. Teoh 
 wrote:

 You're right. GC is good for experts.
 For plain users, he left when he heard `GC`. You can't increase 
 the number of `user base`.
 No matter how good your language is, it's useless.
I'm not sure if you're serious or if this is some kind of parody and I'm missing the joke.
 Why can `Rust` attract `C/C++` users, but `d` can't. It's `GC`.
Well obviously. That's about the only reason anyone still uses either language. Everyone else left for the greener pastures of the GC-world long ago.
 The `speed/memory` of the machine is much more important than 
 the `speed/comfort` of the programmer.
There is no doubt that there are a small number of cases in which this is true. Judging from the software that gets written these days, including Electron apps, I don't think the world agrees with you that it's only about maximizing speed and minimizing memory use.
Nov 16 2021
parent reply zjh <fqbqrr 163.com> writes:
On Wednesday, 17 November 2021 at 03:24:24 UTC, bachmeier wrote:

lose.
Nov 16 2021
parent reply forkit <forkit gmail.com> writes:
On Wednesday, 17 November 2021 at 03:38:54 UTC, zjh wrote:
 On Wednesday, 17 November 2021 at 03:24:24 UTC, bachmeier wrote:

lose.
I'd add to that... that when D becomes 'popular', that popularity will become a significant constraint on how D further evolves/develops. Be careful what you wish for ;-) And btw, competition has its own downfalls.
Nov 16 2021
parent zjh <fqbqrr 163.com> writes:
On Wednesday, 17 November 2021 at 04:38:52 UTC, forkit wrote:

 popularity will become a significant constraint on how D 
 further evolves/develops.
As long as the positioning is accurate, that isn't a problem. Find the `target user`:

`C++`: `speed + memory, extreme speed`.
`Rust`: `speed + memory safety`.
`D`: `speed + memory safety + extreme metaprogramming`.

If `GC` is only an option, I believe it can attract `users`. If you don't force it on the user, the user will come naturally. Provide options, not mandates.
Nov 16 2021
prev sibling next sibling parent russhy <russhy gmail.com> writes:
On Wednesday, 17 November 2021 at 02:32:21 UTC, H. S. Teoh wrote:
 On Wed, Nov 17, 2021 at 01:57:23AM +0000, zjh via Digitalmars-d 
 wrote:
 On Wednesday, 17 November 2021 at 01:32:50 UTC, H. S. Teoh 
 wrote:
 
 Can I make full use of the `STD` library without garbage 
 collection?
 If this is achieved, it can be advertised everywhere.
 ,Then why not remove `GC` on the home page?.
Why bend over backwards to please the GC-phobic crowd? They've already made up their minds, there's no convincing them. Having a GC IME is an extremely powerful thing, contrary to what the GC haters will tell you. It frees up your APIs from being littered with memory-management minutiae. It makes your APIs clean, refactorable, and maintainable. Easy to use. It makes your code clean. You get to make progress in your problem domain instead of wasting 75% of your brain power thinking about memory management at every step. You save countless hours writing manual memory management code, and countless more hours debugging said code. And in D, you also get to choose to use manual memory management in performance bottlenecks *where it actually matters*. 90% of application code is not on the hot path, it's a completely waste of effort to meticulously manage memory in such code, when you could be focusing your efforts on the 10% hot path where 90% of the performance gains are made. Writing code without a GC is wasteful of programmer time, which equals to wasting money paying programmers to do something that should have been done in 10% of the time, leaving the rest of the time to work on things that actually matter, like implementing features and making progress in your problem domain. You spend tons of wages paying said programmers to debug memory-related bugs, which are notorious to be extremely hard to find and require a lot of time, when these wages could have been used to pay them to implement new features and drive your business forward. *And* you waste tons of wages paying them to maintain code that's needlessly complex due to having to manually manage memory all the time. It takes a lot of time and effort to maintain such code, time that could have been diverted for more useful work had a GC been in place. And you know what? In spite of all this time and effort, programmers get it wrong anyway -- typical C code is full of memory leaks, pointer bugs, and buffer overflows. Most of them are latent and only trigger in obscure environments and unusual inputs, and so lie undiscovered, waiting for the day it suddenly explodes on a customer's production environment. Or somebody comes up with a security exploit... With a GC, you instantly eliminate 90% of these problems. Only 10% of the time, you actually need to manually manage memory -- in inner loops and hot paths where it actually matters. GC phobia is completely irrational and I don't see why we should bend over backwards to please that crowd. T
I can't believe this is still being debated, yet again..

--

D's GC:

- doesn't scale: the more pointers there are to scan, the slower the collection is
- stops the entire world whenever a collection happens
- no R&D

What's your use case with this? And what kind of people do you want to attract? In a cloud-native world, with such a GC? What kind of people will you attract if you are doing worse than NodeJS's GC? Scripters? Is that what D is? A language for scripters? Want to replace BASH, is that it?

--

You complain about @nogc being what the "GC-phobic" people wanted, and yet they didn't come?

People don't want tags, they want a platform where they can use D, not "annotated" D

--

-betterC failed to bring C/C++ people? Yeah, i mean..

```D
struct Test
{
    float[32] data;
}

extern (C) int main()
{
    Test b;
    Test c;
    bool r = b == c;
    return 0;
}
```

This code doesn't compile with -betterC; who will you attract if such basic and common code doesn't work?

It also sounds like it is some bone you give to a dog so he can play and you relax in peace, not a good message.

Fragmentation, that's what you get

--

GO became ultra popular thanks to **its** GC and its applications, not because **of a** GC, big difference ;)

--

Bandaid solutions don't attract people. Saying "we have a no-GC story at home" BUT you have to do X, Y, Z, and give up on A, B, C is not what the people wanted, and that is what you gave them: the "@nogc tag" and the "non-functioning -betterC".

They want what Rust provides, a modern system programming language
They want what Zig provides, a modern system programming language
They want what Odin provides, a modern system programming language
They want what Jai provides, a modern system programming language

--

Don't expect those people to want a modern Java alternative, because GO already took that role - something D could have done, and yet didn't. You have a GC, so why did it fail? Why did people complain about the GC in D, but not the GC in GO? This is the message you REFUSE to hear

--

There is no GC-phobia, there is a poor man's GC phobia! Big difference

Instead of investing billions into a competitive GC, which nobody wants to do, including the GC advocates, let's adopt the allocator-aware mindset; it's much cheaper, and much more effective ;)
Nov 16 2021
prev sibling next sibling parent Imperatorn <johan_forsberg_86 hotmail.com> writes:
On Wednesday, 17 November 2021 at 02:32:21 UTC, H. S. Teoh wrote:
 On Wed, Nov 17, 2021 at 01:57:23AM +0000, zjh via Digitalmars-d 
 wrote:
 [...]
Why bend over backwards to please the GC-phobic crowd? They've already made up their minds, there's no convincing them. [...]
+1
Nov 16 2021
prev sibling next sibling parent reply Stanislav Blinov <stanislav.blinov gmail.com> writes:
On Wednesday, 17 November 2021 at 02:32:21 UTC, H. S. Teoh wrote:

 And you know what?  In spite of all this time and effort, 
 programmers get it wrong anyway -- typical C code is full of 
 memory leaks, pointer bugs, and buffer overflows.
People write bugs. You don't say!
 Most of them are latent and only trigger in obscure 
 environments and unusual inputs, and so lie undiscovered, 
 waiting for the day it suddenly explodes on a customer's 
 production environment.
GC isn't solving those issues. It's masking them. If you have a stale pointer somewhere, the bug *is* that the pointer is stale. Making it point to some forgotten piece of data is not a solution, it's a write off of a lost cause. Yeah, it's safe, sure. Because you're leaking.
 Or somebody comes up with a security exploit...
Your susceptibility to security (and other) exploits grows proportional to the number of dependencies, of which druntime (and, consequently, GC) is one. So that's kind of a moot point.
 With a GC, you instantly eliminate 90% of these problems.  Only 
 10% of the time, you actually need to manually manage memory -- 
 in inner loops and hot paths where it actually matters.
It's so funny how you keep talking about this mythical 90% vs 10% split. When you have 16ms *for everything*, and a single collection takes 8ms, during which the whole world is stopped, you can't have 90/10. When you can't actually disable the GC (and you can't *actually* disable the GC), you can't have 90/10, because eventually some forced collection *will* turn that 90 into 99. So, practically, it comes down to either you use the GC, or you don't, period. That is not the fault of GC per se, but it's the consequence of lack of control. Unfortunately, price of convenience sometimes is just too high.
 GC phobia is completely irrational and I don't see why we 
 should bend over backwards to please that crowd.
Simply put? Because any decent API doesn't transfer its garbage to its callers. And because, surprise, sometimes deterministic run time is a requirement. If the cost of calling your function is 100 cycles now and a million at some poorly specified point in the future, I'd consider looking for something that would just take the million up front.
Nov 16 2021
next sibling parent reply Imperatorn <johan_forsberg_86 hotmail.com> writes:
On Wednesday, 17 November 2021 at 07:08:45 UTC, Stanislav Blinov 
wrote:
 On Wednesday, 17 November 2021 at 02:32:21 UTC, H. S. Teoh 
 wrote:

 [...]
People write bugs. You don't say! [...]
What do you mean by "you can't *actually* disable the GC"? 🤔
Nov 16 2021
parent reply Stanislav Blinov <stanislav.blinov gmail.com> writes:
On Wednesday, 17 November 2021 at 07:16:35 UTC, Imperatorn wrote:
 On Wednesday, 17 November 2021 at 07:08:45 UTC, Stanislav 
 Blinov wrote:
 On Wednesday, 17 November 2021 at 02:32:21 UTC, H. S. Teoh 
 wrote:

 [...]
People write bugs. You don't say! [...]
What do you mean by "you can't *actually* disable the GC"? 🤔
I mean the GC.disable/GC.enable. The spec needs to be WAY more strict and, well, specific, for those to be of use.
Nov 16 2021
parent reply Tejas <notrealemail gmail.com> writes:
On Wednesday, 17 November 2021 at 07:25:45 UTC, Stanislav Blinov 
wrote:
 On Wednesday, 17 November 2021 at 07:16:35 UTC, Imperatorn 
 wrote:
 On Wednesday, 17 November 2021 at 07:08:45 UTC, Stanislav 
 Blinov wrote:
 On Wednesday, 17 November 2021 at 02:32:21 UTC, H. S. Teoh 
 wrote:

 [...]
People write bugs. You don't say! [...]
What do you mean by "you can't *actually* disable the GC"? 🤔
I mean the GC.disable/GC.enable. The spec needs to be WAY more strict and, well, specific, for those to be of use.
`GC.disable()` disables the running of collection phases (a collection is run __only__ when even the OS returns an `Out of Memory` error), but still allows allocation. `@nogc` disables all allocations via the `GC`.

How else do you propose we further formalize these two notions? Should the GC throw an Error if even the OS returns OoM, rather than run a collection cycle?
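For context, a minimal sketch of the `core.memory` calls being discussed (with the caveat, raised in the replies below, that the spec still allows collections while disabled):

```d
import core.memory : GC;

void latencySensitiveSection()
{
    GC.disable();                      // pause collections for this stretch
    scope (exit) GC.enable();          // re-enable on every exit path

    auto scratch = new ubyte[](4096);  // allocation is still permitted
    // ... timing-critical work using `scratch` ...
}
```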
Nov 16 2021
parent reply Stanislav Blinov <stanislav.blinov gmail.com> writes:
On Wednesday, 17 November 2021 at 07:35:43 UTC, Tejas wrote:
 On Wednesday, 17 November 2021 at 07:25:45 UTC, Stanislav 
 Blinov wrote:
 On Wednesday, 17 November 2021 at 07:16:35 UTC, Imperatorn 
 wrote:
 What do you mean by "you can't *actually* disable the GC"? 🤔
I mean the GC.disable/GC.enable. The spec needs to be WAY more strict and, well, specific, for those to be of use.
`GC.disable()` disables the running of a collection phase(and runs a collection __only__ when even the OS returns `Out of Memory` error), but allows allocation.
That is not at all how it's documented. "Collections may continue to occur in instances where the implementation deems necessary for correct program behavior, such as during an out of memory condition." IOW, it's basically allowed to collect anyway whenever, and not "__only__ when even the OS returns OoM".
 ` nogc` disables all allocations via the `GC`.
Not talking about nogc.
 How else do you propose we further formalize these two notions? 
 GC should throw Error if even the OS returns OoM rather than 
 run collection cycle?
If it's disabled - yes. Because "disable" should mean "disable", and not "disable, but..." Failing that, it should be renamed to "disableBut", and specify exactly when collections may still occur, and not wave it off as implementation-defined. I wouldn't want to depend on implementation details, would you?
Nov 16 2021
parent Tejas <notrealemail gmail.com> writes:
On Wednesday, 17 November 2021 at 07:50:06 UTC, Stanislav Blinov 
wrote:

 That is not at all how it's documented. "Collections may 
 continue to occur in instances where the implementation deems 
 necessary for correct program behavior, such as during an out 
 of memory condition."
 IOW, it's basically allowed to collect anyway whenever, and not 
 "__only__ when even the OS returns OoM".
Oops Okay, that's not very nice :(
 ` nogc` disables all allocations via the `GC`.
Not talking about nogc.
You can choose to ignore it, but I feel people will bring it up in these kinds of discussions anyway.
 How else do you propose we further formalize these two 
 notions? GC should throw Error if even the OS returns OoM 
 rather than run collection cycle?
If it's disabled - yes. Because "disable" should mean "disable", and not "disable, but..." Failing that, it should be renamed to "disableBut", and specify exactly when collections may still occur, and not wave it off as implementation-defined. I wouldn't want to depend on implementation details, would you?
Yeah, me neither. I think such harsh constraints aren't making it into the language spec anytime soon. If you want such guarantees then you'll have to stick to `-betterC`... but that is only supposed to be used as a transition technology, not a dev platform in its own right...

Yeah, D is a no-go for latency-sensitive stuff, I think (maybe ARC + `@weak` pointers could help... but ARC itself seems far away, if even possible...)
Nov 17 2021
prev sibling parent Commander Zot <no no.no> writes:
 during which the whole world is stopped,
Create a new thread and don't register it with the GC; it won't be stopped.
Nov 17 2021
prev sibling next sibling parent reply tchaloupka <chalucha gmail.com> writes:
On Wednesday, 17 November 2021 at 02:32:21 UTC, H. S. Teoh wrote:
 With a GC, you instantly eliminate 90% of these problems.  Only 
 10% of the time, you actually need to manually manage memory -- 
 in inner loops and hot paths where it actually matters.

 GC phobia is completely irrational and I don't see why we 
 should bend over backwards to please that crowd.


 T
I tell you a story :) mindset compared to hardcore C/C++ devs (just get the shit done using some of the many libraries out there).

What I liked (and still like) about D is that it allowed me to become more low level, more performant, but still be very productive. D code is also often much shorter and easier to understand (Rust makes my eyes bleed). The GC allows that and is great for it. And I must admit that D has broken me in a way that I don't want to use higher-level languages anymore. I've learned a lot using D through the years.

BUT:

* have you tried to write a shared D lib used from some other language from multiple threads? I know that you must register/unregister threads in the GC, but it just hasn't worked reliably for me in any way, and you would have to track the lifetime of the thread in the calling language - not a pleasant experience at all, with no tutorials on how to do it properly that actually work - it's a some years old [experience](https://forum.dlang.org/post/lhgljscvdwjdmxrnpchv forum.dlang.org) now, so maybe something has changed
* as the GC is the `stop the world` kind, the only way to make it not interfere with you and still use it in other places is to make a thread (with a @nogc function) that is not registered in the GC and build some mechanism of your own to exchange data between the GC and @nogc threads (as std.concurrency won't help you here)
* the GC won't magically stop the leaks. Nowadays one wants to have a long running service that just works. But try that with the 'flagship' vibe-d framework and you'll probably get [this](https://forum.dlang.org/post/pzrszydsqfxyipowoprn forum.dlang.org) experience
* I don't much like it when GC.stats reports something like 239MB free out of 251MB used memory - that is a lot of unused space in a microservice world (and we had cases when the OS just OOM-killed the service as it grew indefinitely, even though GC.stats said the majority of the GC's space was free)
* and now figure that out -> after that experience I would rather use `asan` than a GC with no tools helping to figure it out
* we have somehow managed to set the GC properties in a way that it doesn't grow that much, and to get rid of a lot of small allocations, but with a cost you wouldn't expect when using a GC
* one of the cases that caused a lot of unnecessary small allocations was something like `row["count"].as!long` when reading columns from a database. Seems like a totally normal way to do it, right? But there is much more to it. As it (the `dpq2` library) uses `libpq` internally, which addresses columns by their index, it calls a C function with a `char*` column name to get that index, and so it uses `toStringz`, which allocates - for every single use of that column, for every single row being read (see the sketch after this list). You can imagine where that goes when handling complex queries on thousands of rows. And that is not something that a programmer like the 'original me' wants to care about; he just wants to use the available libraries and get the work done - that is what the GC should help with, right?
* there are leaks caused by druntime bugs themselves (for example https://issues.dlang.org/show_bug.cgi?id=20680)

After those and some other experiences with the GC I just became a bit GC-phobic (I'm OK with the GC for CLI tools, scripts, and short running programs, no problem there) and try to avoid it as much as I can. But when you want to `get shit done` you can't write everything on your own; you use the libraries that get you there without much hassle in between.
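As a sketch of the hidden allocation described in the database item above (the `columnIndexByName` function is a hypothetical stand-in for the underlying libpq call; the point is only where `toStringz` allocates and how caching moves that off the per-row path):

```d
import std.string : toStringz;

// Hypothetical stand-in for the C API that wants a zero-terminated name.
int columnIndexByName(const(char)* name) { return 0; }

int perRowLookup(string column)
{
    // toStringz generally copies the string to the GC heap to guarantee
    // a trailing '\0' -- called once per row, this adds up quickly.
    return columnIndexByName(column.toStringz);
}

int cachedLookup(string column)
{
    // Cache the zero-terminated copy so the allocation happens once
    // per distinct column name instead of once per row.
    static const(char)*[string] cache;
    if (auto p = column in cache)
        return columnIndexByName(*p);
    auto z = column.toStringz;
    cache[column] = z;
    return columnIndexByName(z);
}
```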
Overall, my 2 cents on the state of D:

* druntime relies too much on the GC
  * no Fiber usable in betterC or @nogc
  * no Thread usable in betterC or @nogc
  * etc.
  * I just think that the basic blocks we build on should be as low level as possible to be generally usable
* druntime and Phobos have many `extern(C)` or normal functions that aren't `@nogc` although they could be (it's getting better with each release thanks to various contributors who care enough to at least report it) - but look at the codebases of `mecca` or `vibe-d`, where they use their own `extern(C)` redefinitions because of this, or `mecca`'s `assumeNoGC` template to work around a missing `@nogc` attribute
* std.experimental.allocator
  * still experimental
  * not usable in `-betterC`
  * shouldn't the standard library interfaces generally use the allocators, so that the caller can actually choose the way allocation happens?
* preview switches that would stay in preview forever (e.g. `fieldwise`)?
* no `async`/`await` - it's hard to find a modern language without it, D is one of them, and there doesn't seem to be any interest in it from the leadership (it would deserve a workgroup to work on it)
  * but I'm afraid that even if it were added to the language it would still use the GC, as GC is great..
* poor tooling compared to others - I'm using VSCode on Linux (sorry, I'm too lazy to learn vim well enough to be effective with it), it somewhat works, but code completion breaks too often for me (I'm used to it over the years, but I can imagine it doesn't look good to newcomers)
* dub and code.dlang.org don't seem to be treated as official, or to be cared about much
* it's hard to tell anyone that the GC in D is fine when you look at the techempower benchmark and search for vibe-d (or anything in D) somewhere up where it should be, and it isn't (even other GC languages are much higher there)
* `betterC` seems to be becoming an unwanted child and is a minefield to use - see [bugs](https://issues.dlang.org/buglist.cgi?bug_status=UNCONFIRMED&bug_status=NEW&bug_status=ASSIGNED&bug_status=REOPENED&keywords=betterC%2C%20&keywords_type=allwords&list_id=238210&query_format=advanced&resolution=---)
* I think there are two sorts of groups in the D community - a more low-level one that doesn't like to use the GC much, and GC 'likers', for whom the GC is just 'good enough'
  * I'm afraid there can't be a consensus on what D should look like, as those groups have different priorities and points of view
  * the preferred memory model differs for them too, and I'm not sure it's possible in D to make both sides happy (and without breaking changes)
* most libraries on code.dlang.org are high level, and mostly when you want to use `betterC` or avoid the GC, you are on your own. That is a problem when you just want to use some component and be done (if there is no C alternative, or it would mean writing a more idiomatic wrapper for it).
Nov 17 2021
next sibling parent reply Adam D Ruppe <destructionator gmail.com> writes:
On Wednesday, 17 November 2021 at 12:14:46 UTC, tchaloupka wrote:
 * it's hard to tell anyone that GC in D is fine when you look 
 at techempower benchmark and searching the vibe-d (or anything 
 in D) somewhere up where it should be and isn't (event other GC 
 languages are much higher there)
I'm pretty sure you're the one who benchmarked my cgi.d and it annihilated vibe.d in that test. Maybe it isn't the GC and vibe is just poorly written?
Nov 17 2021
next sibling parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Wednesday, 17 November 2021 at 13:44:39 UTC, Adam D Ruppe 
wrote:
 Maybe it isn't the GC and vibe is just poorly written?
Make the required language changes and make the GC fully precise. In the cloud you care more about memory usage than computational speed. Your whole instance might boot on 256-512MiB. GC is ok, but you need to be able to reclaim all memory.
Nov 17 2021
parent reply Adam D Ruppe <destructionator gmail.com> writes:
On Wednesday, 17 November 2021 at 13:50:59 UTC, Ola Fosheim 
Grøstad wrote:
 Your whole instance might boot on 256-512MiB.
I've had no trouble running normal D on that right now. Though one thing that would be nice is this function: https://github.com/dlang/druntime/blob/master/src/core/internal/gc/os.d#L218 Notice the Linux implementation....... lol.
Nov 17 2021
next sibling parent reply Imperatorn <johan_forsberg_86 hotmail.com> writes:
On Wednesday, 17 November 2021 at 14:01:31 UTC, Adam D Ruppe 
wrote:
 On Wednesday, 17 November 2021 at 13:50:59 UTC, Ola Fosheim 
 Grøstad wrote:
 Your whole instance might boot on 256-512MiB.
I've had no trouble running normal D on that right now. Though one thing that would be nice is this function: https://github.com/dlang/druntime/blob/master/src/core/internal/gc/os.d#L218 Notice the Linux implementation....... lol.
wtf
Nov 17 2021
parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Wednesday, 17 November 2021 at 14:39:02 UTC, Imperatorn wrote:
 On Wednesday, 17 November 2021 at 14:01:31 UTC, Adam D Ruppe 
 wrote:
 Notice the Linux implementation....... lol.
wtf
I guess they assumed a large swap disk… :)
Nov 17 2021
parent Imperatorn <johan_forsberg_86 hotmail.com> writes:
On Wednesday, 17 November 2021 at 14:50:49 UTC, Ola Fosheim 
Grøstad wrote:
 On Wednesday, 17 November 2021 at 14:39:02 UTC, Imperatorn 
 wrote:
 On Wednesday, 17 November 2021 at 14:01:31 UTC, Adam D Ruppe 
 wrote:
 Notice the Linux implementation....... lol.
wtf
I guess they assumed a large swap disk… :)
Also why is os_physical_mem using another function which is in turn not used when checking low mem?
Nov 17 2021
prev sibling parent Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Wednesday, 17 November 2021 at 14:01:31 UTC, Adam D Ruppe 
wrote:
 On Wednesday, 17 November 2021 at 13:50:59 UTC, Ola Fosheim 
 Grøstad wrote:
 Your whole instance might boot on 256-512MiB.
I've had no trouble running normal D on that right now.
Ok, but it depends on how much garbage you create. For instance, if you want to process very large images on the same instance you might run into trouble. Although I guess you could preallocate a large buffer at boot and only do one image at a time. But if you need to buffer many large data items at the same time you can run into fragmentation issues. This is not so uncommon, as you often might want to cache files in memory.

I guess one option here is to use weak pointers and let the GC evict items from the cache, but then we need weak pointers… Ok, it can be a library type maybe, but then we are back to ownership and borrowing.

With precise collection you have some chance to reduce fragmentation. For instance, if you collect when no requests are pending then you should ideally have no fragmentation. But a proper GC language can do compaction… so that is a GC advantage. D cannot really offer compaction even in theory?

I think cloud services should have a dedicated runtime.

In short: it is not so trivial if you want to offer advantages over other languages. A primitive GC is just that: primitive.
 https://github.com/dlang/druntime/blob/master/src/core/internal/gc/os.d#L218

 Notice the Linux implementation....... lol.
Ouch. If you know the instance size I guess you also could hardcode the limit and just claim all the memory you ever want at startup. But, I don't think web-services really is an area where D should try to be highly competitive, that is much more of a high level programming arena. I guess some services can benefit from system level programming in 2021, but in the long run, no.
Nov 17 2021
prev sibling parent reply tchaloupka <chalucha gmail.com> writes:
On Wednesday, 17 November 2021 at 13:44:39 UTC, Adam D Ruppe 
wrote:
 On Wednesday, 17 November 2021 at 12:14:46 UTC, tchaloupka 
 wrote:
 * it's hard to tell anyone that GC in D is fine when you look 
 at techempower benchmark and searching the vibe-d (or anything 
 in D) somewhere up where it should be and isn't (event other 
 GC languages are much higher there)
I'm pretty sure you're the one who benchmarked my cgi.d and it annihilated vibe.d in that test. Maybe it isn't the GC and vibe is just poorly written?
Yeah, that was me ;-)

In the case of techempower, one problem with the GC is that it is tested on a CPU with many cores. For each core there is one thread managing clients (using `SO_REUSEADDR`/`SO_REUSEPORT`). That would be a problem with a `stop the world` GC, as it would stop all of them.

For a web server it would be better to use some allocator that just allocates as needed within the scope of the currently handled request and discards all of it on request completion. Something like this is used in nginx module implementations, for example. Or one would need to manage a set of subprocesses so the others aren't stopped by a GC collection.

But then you'll start making workarounds for the GC, start to manage memory manually here and there, and then you realize that you would probably be better off without the GC in the first place. Sometimes it just won't scale.

The GC would also add some syscall overhead, as it uses signals, eventfd, etc.

Of course there would also be places in vibe that are suboptimal; it's not only about the GC. Maybe even fibers alone won't be as lightweight compared to Rust/C++. Rust also has a work-stealing scheduler with waitfree/lockfree queues.
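A minimal sketch of that per-request arena idea, using Phobos building blocks (the request handler and its sizes are hypothetical; only `Region`, `Mallocator`, and `makeArray` are real `std.experimental.allocator` pieces):

```d
import std.experimental.allocator : makeArray;
import std.experimental.allocator.building_blocks.region : Region;
import std.experimental.allocator.mallocator : Mallocator;

struct Response { const(char)[] text; }

Response handleRequest(const(char)[] path)
{
    // One arena per request: everything allocated from it is dropped
    // wholesale when the arena goes out of scope at the end of the handler.
    auto arena = Region!Mallocator(64 * 1024);

    // Scratch memory for building the response, never seen by the GC.
    char[] scratch = arena.makeArray!char(path.length + 16);
    scratch[0 .. path.length] = path[];

    return Response("ok");
}
```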
Nov 17 2021
parent reply SealabJaster <sealabjaster gmail.com> writes:
On Wednesday, 17 November 2021 at 16:03:29 UTC, tchaloupka wrote:
 But then you'll start making workarounds for the GC, start to 
 manage memory manually here and there and then you realize that 
 you would probably be better without the GC in the first place. 
 Sometimes it just won't scale.
I wonder if we could make this a bit less painful if we had a thread-local GC by default, and another global GC purely for `shared` memory. So `new Class()` might only stop the current thread, while `new shared Class()` might stop the world.
Nov 17 2021
parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Wednesday, 17 November 2021 at 16:27:10 UTC, SealabJaster 
wrote:
 I wonder if we could make this a bit less painful if we had a 
 thread-local GC by default, and another global GC purely for 
 `shared` memory.
GC should not be thread-local. It should be local to the computation. The computation has to be able to move between threads to get proper balanced load and high throughput. The *big* advantage here is that for most computations you probably don't have to collect at all. You just release the heap when the computation is completed. This can be very fast if you separate objects with destructors from those without.
 So `new Class()` might only stop the current thread, while `new 
 shared Class()` might stop the world.
No. ```new Class()``` will:

1. try to collect its own heap if possible
2. try to collect the heaps of inactive pending computations if possible
3. try to grab more memory from the OS if possible
4. trigger a collection of the shared heap
5. put the computation to sleep and wait for other computations to release memory

That does not help much for those that want low latency and predictable performance. You also don't want to freeze computations that do not use ```shared``` at all, and freezing the world on ```new shared``` is not acceptable for low latency computations anyway. You need to use RC on shared to get decent performance.
Nov 17 2021
parent SealabJaster <sealabjaster gmail.com> writes:
On Wednesday, 17 November 2021 at 16:47:10 UTC, Ola Fosheim 
Grøstad wrote:
 The *big* advantage here is that for most computations you 
 probably don't have to collect at all. You just release the 
 heap when the computation is completed. This can be very fast 
 if you separate objects with destructors from those without.
Interesting concept. Thanks for the detailed response.
Nov 17 2021
prev sibling parent reply Steven Schveighoffer <schveiguy gmail.com> writes:
On 11/17/21 7:14 AM, tchaloupka wrote:
 * have you tried to write a shared D lib used from some other language 
 from multiple threads? I know that you must register/unregister threads 
 in GC, but it just hasn't work for me reliably in any way and you would 
 have to track the lifetime of the thread in the calling language - not 
 pleasant experience at all, no tutorials of how to do that properly that 
 actually works - it's some years old 
 [experience](https://forum.dlang.org/post/lhgljscvdwjdmxrnp
hv forum.dlang.org) 
 now so maybe something has changed
 * as GC is `stop the world` kind, only way to make it to not intervene 
 with you and still use it in other places is make a thread (with a  nogc 
 function) that is not registered in the GC and make some own mechanism 
 to exchange data between GC and  nogc threads (as std.concurrency won't 
 help you here)
 * GC won't magically stop the leaks. Nowadays one want's to have a long 
 running service that just works. But try that with a 'flagship' vibe-d 
 framework and you probably get 
 [this](https://forum.dlang.org/post/pzrszydsqfxyipowoprn forum.dlang.org)
experience 
I have to say, the second I see "How to X with DLL" in this forum, I mark as read and continue. D has a horrible story on DLLs, and it's been like that for as long as I can remember. If there is some infrastructure project that needs attention, it's DLLs. You are right that `betterC` and `importC` are useless if using D from C is nigh impossible or so difficult even the experts can't tell you the answer.

HOWEVER -- there is no excuse for the runtime hanging when a possible error is detected from a system call. Your linked discussion was not resolved how it should have been. Either the runtime should deal with that result properly, or it should crash the application completely. Not properly handling system call errors deep in the runtime is not acceptable.

If no bug has been filed on this, please do.
    * I don't like much when GC.stats reports something like: 239MB free 
 from 251MB used memory that is a lot of unused space in a microservice 
 world (and we had a cases when OS just OOM killed the service as it just 
 grows indefinitely regardles there is a majority of free space in GC - 
 as GC.stats says)
I believe the GC can be tuned to reduce this, as long as you haven't at one point needed that much memory at once. However, it is worth noting that GC (in any language) generally does require more memory than manual allocation or reference counting. D does not have the ability to use a moving GC, because of the type system, which makes compacting GC impossible unfortunately.
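For illustration, the kind of tuning I mean goes through druntime's GC options, either embedded in the program via `rt_options` or passed as `--DRT-gcopt=` on the command line. The option names below (minPoolSize, incPoolSize, heapSizeFactor, profile) are the ones documented for the default GC as I recall them; the values are only a sketch, not a recommendation:

```d
// Embed GC configuration in the binary; druntime reads rt_options at startup.
extern (C) __gshared string[] rt_options = [
    "gcopt=minPoolSize:1 incPoolSize:1 heapSizeFactor:1.2 profile:1"
];

void main()
{
    // The same knobs can be overridden at run time, e.g.:
    //     ./app --DRT-gcopt=profile:1
    auto a = new int[](1000);   // ordinary GC allocations, now under the tuned policy
}
```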
      * one of the cases that caused a lot of unnecessary small 
 allocations was something like this `row["count"].as!long` when reading 
 the columns from a database. Seems like a totally normal way right? But 
 there is much more to it. As it (`dpq2` library) uses `libpq` internally 
 that addresses columns by their index, it uses C method with `char*` 
 column name to get it and so is using `toStringz` that allocates, for 
 every single use of that column for every single row being read. You can 
 imagine where it goes handling some complex queries on thousands of 
 rows. And that is not something that a programmer like 'original me' 
 wants to care about, he just wants to use the available libraries and 
 get the work done, that is what GC should help with right?
Oof, the `dpq2` library should fix that (probably malloc/free the temporary C string). Have you filed a bug on that?
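Something along the lines of this sketch -- a hypothetical helper, not dpq2's actual code; `strlen` just stands in for the libpq call that wants a `char*`:

```d
import core.stdc.stdlib : malloc, free;
import core.stdc.string : memcpy;

// Run `fn` with a NUL-terminated copy of `name`; the copy is freed as soon
// as the call returns, so nothing ends up on the GC heap.
auto withCString(alias fn)(scope const(char)[] name)
{
    auto p = cast(char*) malloc(name.length + 1);
    assert(p !is null);
    scope (exit) free(p);
    memcpy(p, name.ptr, name.length);
    p[name.length] = '\0';
    return fn(p);
}

void main()
{
    import core.stdc.string : strlen;
    // strlen stands in for the column lookup that needs a char* argument.
    auto len = withCString!(p => strlen(p))("count");
    assert(len == 5);
}
```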
    * there are leaks caused by druntime bugs itself (for example 
 https://issues.dlang.org/show_bug.cgi?id=20680)
 
 After those and some other experiences with GC I just became a bit GC 
 phobic (I'm ok with GC for CLI tools, scripts, short running programs, 
 no problem with that there) and try to avoid it as much as I can. But 
 when you want to `get shit done` you can't write all on your own, but 
 use the libraries that get you there with no much hassle between.
If I can separate the use cases here: using D as a main application, on a normal modern server/desktop, I have found using the GC to be a non-issue.

There are problems clearly with:

* using D with GC as a plugin
* using GC in a memory/resource constrained environment

Does that sound fair?
 Overall my 2 cents on D state:
 
 * druntime relies too much on the GC
    * no Fiber usable in  betterC or  nogc
    * no Thread usable in  betterC or  nogc
    * etc.
I think it's more of an either-or. Using D on constrained environments likely needs a *separate* runtime that is catered to those environments. I might be wrong, and a reasonable base that can deal with both is possible, but it hasn't materialized in the 14+ years that I've been using D.
 * no `async`/`await` - it's hard to find a modern language without it, D 
 is one of them and there doesn't seem to be any interest in it by the 
 leadership (it would deserve a workgroup to work on it)
This I have to disagree on. I'm not used to async/await, but I absolutely love fiber-based i/o where I don't have to deal with those (e.g. vibe-core or mecca). There are also library solutions for it, which might be good enough, I don't know. A good demonstration of how async/await can help D, other than "because I'm used to it", would be great to see.
    * but I'm afraid even if it would potentially be added to the 
 language it would still use the GC as GC is great..
This doesn't make sense, as I believe async/await in languages enables the compiler to rewrite your functions into a state machine. How it stores the state might not need GC usage, because the compiler is in control of the state creation and the state usage. I wouldn't expect it to need more GC abilities than `scope(exit)` does.
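Purely as illustration -- a hand-written sketch of my own, assuming the lowering really does put the function's locals and resume point into a plain struct -- the caller then decides where that frame lives (stack, malloc, GC), so the rewrite itself needs no GC:

```d
// What an "async" function like
//     async int f() { int x = await g(); return x + 1; }
// might be lowered into: a frame struct plus a resume() step function.
struct AsyncAddOne
{
    int state;                 // current suspension point
    int x;                     // local that survives the "await"

    // Drive the state machine; gDone/gResult model the awaited call g().
    bool resume(bool gDone, int gResult, out int result)
    {
        switch (state)
        {
        case 0:                // entered, now awaiting g()
            state = 1;
            goto case;
        case 1:
            if (!gDone)
                return false;  // suspended: come back later
            x = gResult;
            result = x + 1;
            state = 2;
            return true;       // completed
        default:
            assert(0, "already completed");
        }
    }
}

void main()
{
    AsyncAddOne frame;         // the frame lives on the stack: no GC involved
    int result;
    assert(!frame.resume(false, 0, result)); // g() not done yet
    assert(frame.resume(true, 41, result));  // g() completed with 41
    assert(result == 42);
}
```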
 * pure tooling compared to others - I'm using VSCode in linux (sorry I'm 
 lazy to learn vim in a way I'd be effective with it), it somewhat works, 
 but code completion breaks too often for me (I'm used to it over the 
 years, but I can imagine it doesn't look good to newcomers)
If you refer to UFCS and/or templates, I'm not sure how solvable a problem this is.
 * dub and code.dlang.org doesn't seems to be too official, and being 
 cared about
It is official; it just does not receive enough care.
 * it's hard to tell anyone that GC in D is fine when you look at 
 techempower benchmark and searching the vibe-d (or anything in D) 
 somewhere up where it should be and isn't (event other GC languages are 
 much higher there)
More solid web frameworks are needed for sure. D can shine in this area; it just needs some champions.
 * I think there are 2 sorts of group in D community - one more low level 
 that won't like to use GC much, and GC 'likers', for whom GC is just 
 'good enough'
There are fundamental tradeoffs ingrained in these choices. If you want safe code, you need GC, some reference counting scheme, or Rust-style borrow checking. There are people who don't care about any of these, and don't care about safe code. If you want safety though, you have to pick one, and the GC is a perfectly good option for that.

Having the other options is a nice possibility, but I would be very much against removing the GC, or refactoring everything so I have to think about non-GC options.
    * I'm afraid that there can't be consensus of what D should look as 
 those groups has different priorities and points of view
D does pretty well letting you choose a lot of things. Even the GC has some choice in it, but for sure the ecosystem for it isn't there. Most other languages don't let you make these choices.
 * most libraries on code.dlang.org are high level, and mostly when you 
 want to use `betterC` or avoid GC, you are on your own. That is a 
 problem when you just want to use some component and be done (if there 
 is no C alternative or it would mean to write a more idiomatic wrapper 
 for it).
Such is life. If you want an ecosystem built around a certain paradigm or low-level design choice, you have to either create it or get others to join you in that paradigm.

D does give you the opportunity to do that; it certainly takes a lot of critical mass though.

-Steve
Nov 17 2021
parent reply tchaloupka <chalucha gmail.com> writes:
Thanks Steve.

On Wednesday, 17 November 2021 at 17:19:31 UTC, Steven 
Schveighoffer wrote:
 I have to say, the second I see "How to X with DLL" in this 
 forum, I mark as read and continue. D has a horrible story on 
 DLLs, and it's been like that for as long as I can remember. If 
 there is some infrastructure project that needs attention, it's 
 DLLs. You are right that `betterC` and `importC` are useless if 
 using D from C is nigh impossible or so difficult even the 
 experts can't tell you the answer.

 HOWEVER -- there is no excuse for the runtime hanging when a 
 possible error is detected from a system call. Your linked 
 discussion was not resolved how it should have been. Either the 
 runtime should deal with that result properly, or it should 
 crash the application completely. Not properly handling system 
 call errors deep in the runtime is not acceptable.

 If no bug has been filed on this, please do.
After that experience we've ended up with a workaround that passes. It works, kind of. It's not pretty or the most effective, but at least it doesn't crash. Only on DLL unload does it sometimes have problems, but that is only at program exit anyway, so not a big deal for us. But if I did it differently now, I'd use a betterC-only library for this case and avoid the runtime completely.
 I believe the GC can be tuned to reduce this, as long as you 
 haven't at one point needed that much memory at once.
Yes, we have found some settings that make it better (the highest impact came from setting the progressive growth of allocated pages to 1).
 However, it is worth noting that GC (in any language) generally 
 does require more memory than manual allocation or reference 
 counting. D does not have the ability to use a moving GC, 
 because of the type system, which makes compacting GC 
 impossible unfortunately.
Yes, it's expected, but when I saw the reported numbers of what is allocated and how much of that is actually free, I had my doubts that it is caused only by fragmentation. But there are no official tools, nor a druntime facility, to make a GC memory map of what is actually there and where, so it's pretty hard to tell and diagnose this.

I ended up using Linux debugging tools to at least see when it allocates the next page and from what stack, but too many allocations rendered it impossible to process (though it at least led us to the dpq2 allocation problem). After that it became much better, so we've decided to move it back to the backlog as it would require much more time to get further.
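For what it's worth, the coarse numbers druntime does expose look roughly like the sketch below; the field names are taken from the core.memory documentation as I recall them, so treat them as an assumption rather than gospel:

```d
import core.memory : GC;
import std.stdio : writefln;

void main()
{
    auto s = GC.stats();
    writefln("used: %s bytes, free: %s bytes", s.usedSize, s.freeSize);

    auto p = GC.profileStats();
    writefln("collections: %s, max pause: %s", p.numCollections, p.maxPauseTime);
}
```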
 Oof, the `dpq2` library should fix that (probably malloc/free 
 the temporary C string). Have you filed a bug on that?
Nope, just an internal patch, but I've fixed that: https://github.com/denizzzka/dpq2/issues/161
 If I can separate the use cases here, using D as a main 
 application, on a normal modern server/desktop, I have found 
 using the GC to be a non-issue.

 There are problems clearly with:

  * using D with GC as a plugin
  * using GC in a memory/resource constrained environment

 Does that sound fair?
Yes, there are uses where the GC is not much of a problem, and when it is, it can be worked around where it makes sense.
 This doesn't make sense, as I believe async/await in languages 
 enables the compiler to rewrite your functions into a state 
 machine. How it stores the state might not need GC usage, 
 because the compiler is in control of the state creation and 
 the state usage.

 I wouldn't expect it to need more GC abilities than 
 `scope(exit)` does.
I might be wrong, but I guess there would be more to it, such as a scheduler that controls where the work is actually done, managing some `Future`/`Promise` facilities. And when it comes to Thread/Fiber synchronization, then... But I'm not an expert in this field and, like you, I'm more used to the seemingly blocking API that vibe-d/mecca make possible.
 If you refer to UFCS and/or templates, I'm not sure how 
 solvable a problem this is.
No, this is more about it stopping working completely and needing a restart. I can understand that not all code completion is possible, but in this case what used to work just stops working. Not sure why or where (it might be the project organization with many subprojects in relative paths, etc.). But as I said, I'm used to it. This is more about newcomers that might be scared away ;-) For debugging I use printf/gdb and I can live with that.
 Having the other options is a nice possibility, but I would be 
 very much against removing the GC, or refactoring everything so 
 I have to think about non-GC options.
We're on the same page here. I'd also like the best of both worlds ;-) Removing `betterC` or forcing the GC on the user would be a showstopper for me too.
 Such is life. If you want an ecosystem built around a certain 
 paradigm or low level design choice, you have to either create 
 it or get others to join you in that paradigm.

 D does give you the opportunity to do that, it certainly does 
 take a lot of critical mass to do that.
Yes, unfortunately I haven't found any other language that I would like more. So even with its quirks, D is great but definitely deserves to be better.
Nov 17 2021
parent Steven Schveighoffer <schveiguy gmail.com> writes:
On 11/17/21 2:07 PM, tchaloupka wrote:
 
 Yes it's expected but when I've seen the reported numbers of what is 
 allocated and how much of that is actually free I had my doubts that it 
 is caused only by fragemtation.
I believe there is a minimum pool allocated, but I don't think it's in that range (250MB).

Some documentation on what performance to expect from the GC would be nice to have on the web site.

-Steve
Nov 17 2021
prev sibling parent reply Zoadian <no no.no> writes:
On Wednesday, 17 November 2021 at 02:32:21 UTC, H. S. Teoh wrote:
 On Wed, Nov 17, 2021 at 01:57:23AM +0000, zjh via Digitalmars-d 
 wrote:
 [...]
Why bend over backwards to please the GC-phobic crowd? They've already made up their minds, there's no convincing them. [...]
I 100% agree with this. There are more important things that need improvement. GC and no-GC work well enough in D (Exceptions are the only thing that needs a bit of extra work with nogc).
Nov 17 2021
parent reply Imperatorn <johan_forsberg_86 hotmail.com> writes:
On Wednesday, 17 November 2021 at 18:09:56 UTC, Zoadian wrote:
 On Wednesday, 17 November 2021 at 02:32:21 UTC, H. S. Teoh 
 wrote:
 On Wed, Nov 17, 2021 at 01:57:23AM +0000, zjh via 
 Digitalmars-d wrote:
 [...]
Why bend over backwards to please the GC-phobic crowd? They've already made up their minds, there's no convincing them. [...]
I 100% agree with this. There are more important things that need improvement. GC and no-GC work well enough in D (Exceptions are the only thing that needs a bit of extra work with nogc).
Yes, we need to focus on the bigger picture
Nov 17 2021
next sibling parent Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Wednesday, 17 November 2021 at 18:47:35 UTC, Imperatorn wrote:
 On Wednesday, 17 November 2021 at 18:09:56 UTC, Zoadian wrote:
 I 100% agree with this. There are more important things that 
 need improvement. GC and no-GC work well enough in D 
 (Exceptions are the only thing that needs a bit of extra work 
 with  nogc).
Yes, we need to focus on the bigger picture
Well, the bigger picture is that D needs solid compiler-backed memory management and a solid memory model, including adjustments to shared. What can be more important than that for a system-level programming language?

Continuing to invent homegrown ad-hoc solutions is basically free advertising for competing languages that have put in the work to get something consistent out of the box.

I don't know Rust much and have absolutely no preference for it, but if I had to write a WASM module for a commercial project I probably would have given Rust a shot, as it looks like it might be the *cheapest* manageable option. And NO, I don't want to run a GC in WASM.
Nov 17 2021
prev sibling parent reply Greg Strong <mageofmaple protonmail.com> writes:
On Wednesday, 17 November 2021 at 18:47:35 UTC, Imperatorn wrote:
 On Wednesday, 17 November 2021 at 18:09:56 UTC, Zoadian wrote:
 On Wednesday, 17 November 2021 at 02:32:21 UTC, H. S. Teoh 
 wrote:
 On Wed, Nov 17, 2021 at 01:57:23AM +0000, zjh via 
 Digitalmars-d wrote:
 [...]
Why bend over backwards to please the GC-phobic crowd? They've already made up their minds, there's no convincing them. [...]
I 100% agree with this. There are more important things that need improvement. GC and no-GC work well enough in D (Exceptions are the only thing that needs a bit of extra work with nogc).
Yes, we need to focus on the bigger picture
Agreed. Trying to please everybody is a losing battle. If the GC is so offensive to you, why are you here? There are other languages.

I'm using D to build a chess variant engine where there are recursive functions that call each other a million times a second. I absolutely, positively cannot have any GC happening there! No problem. I pre-allocate what I need and I can further add @nogc to be certain. (Which helped me, because there was a place where I was using a closure that could allocate and I didn't realize it.)

D is great at letting you go the direction you want. The hysterical no-GC crowd should find another language. That being said, reducing GC usage in Phobos is a worthy goal...
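A minimal sketch of what that looks like -- my own toy code, assuming nothing about the real engine -- where @nogc turns any hidden allocation in the hot recursion into a compile-time error instead of a runtime GC pause:

```d
int perft(int depth, scope int[] scratch) @nogc nothrow
{
    if (depth == 0)
        return 1;

    int nodes = 0;
    foreach (i; 0 .. 4)                  // pretend there are 4 legal moves
    {
        scratch[depth] = i;              // write into pre-allocated memory
        nodes += perft(depth - 1, scratch);
    }
    // scratch ~= depth;                 // would not compile: ~= allocates on the GC heap
    return nodes;
}

void main() @nogc
{
    int[64] scratch;                     // pre-allocated once, reused by the whole search
    auto total = perft(6, scratch[]);
    assert(total == 4096);               // 4^6 leaf nodes in this toy tree
}
```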
Nov 17 2021
next sibling parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Wednesday, 17 November 2021 at 22:36:51 UTC, Greg Strong wrote:
 I'm using D to build a chess variant engine where there are 
 recursive functions that call each other a million times a 
 second.
Out of curiosity, why did you pick D for this task? Sounds like it is well suited for a high level language with high level optimization capabilities.
 D is great at letting you go the direction you want.  The 
 hysterical no-gc crowd should find another language.  That 
 being said, reducing GC usage in phobos is a worth goal...
The *hysterical* no-GC crowd already found another language.

The low-latency GC crowd also found another language.

A high-latency GC language is not something anyone actively looks for. That has to be fixed to fix the manpower issue.
Nov 17 2021
parent reply bachmeier <no spam.net> writes:
On Wednesday, 17 November 2021 at 22:48:43 UTC, Ola Fosheim 
Grøstad wrote:

 The *hysterical* no-gc crowd already found another language.

 The low latency gc crowd also found another language.
And yet they still post here. It wouldn't be so bad if they posted things that made sense. They say things like "D forces you to use the GC" which is clearly nonsense. Then when someone says to use nogc, they say "But there are parts of the standard library that require GC". Which has nothing to do with their original complaint, and has absolutely nothing to do with their decision to use another language.
Nov 17 2021
next sibling parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Wednesday, 17 November 2021 at 23:39:18 UTC, bachmeier wrote:
 And yet they still post here. It wouldn't be so bad if they 
 posted things that made sense. They say things like "D forces 
 you to use the GC" which is clearly nonsense.
If you want parity with C++ *language* features you have to use the GC. So that is an issue. I've never complained about not being able to use Phobos, as I've always viewed Phobos as being too much of a high level scripting-oriented library. Let me quote more from the same post I linked above, where I in 2014 wrote:
To me, a "better C" would have a minimal runtime, a tight 
minimalistic standard library and very simple builtin ownership 
semantics (uniqe_ptr). Then a set of supporting libraries that 
are hardware-optimized (with varying degree of portability).
However, I think those that are interested in D as a tight 
system level language have to spec out "better C" themselves as 
a formal language spec sketch. I'd be happy to contribute to 
that, maybe we could start a wiki-page. Since a "better C" would 
break existing code, it would allow a more "idealistic" language 
design discussion. I think that could cut down on the noise.
On this I still agree with myself. :-)
Nov 17 2021
next sibling parent bachmeier <no spam.net> writes:
On Wednesday, 17 November 2021 at 23:45:31 UTC, Ola Fosheim 
Grøstad wrote:
 On Wednesday, 17 November 2021 at 23:39:18 UTC, bachmeier wrote:
 And yet they still post here. It wouldn't be so bad if they 
 posted things that made sense. They say things like "D forces 
 you to use the GC" which is clearly nonsense.
If you want parity with C++ *language* features you have to use the GC. So that is an issue. I've never complained about not being able to use Phobos, as I've always viewed Phobos as being too much of a high level scripting-oriented library.
I wasn't putting you in the group that makes crazy claims about the GC.
Nov 17 2021
prev sibling parent reply Atila Neves <atila.neves gmail.com> writes:
On Wednesday, 17 November 2021 at 23:45:31 UTC, Ola Fosheim 
Grøstad wrote:
 On Wednesday, 17 November 2021 at 23:39:18 UTC, bachmeier wrote:
 And yet they still post here. It wouldn't be so bad if they 
 posted things that made sense. They say things like "D forces 
 you to use the GC" which is clearly nonsense.
If you want parity with C++ *language* features you have to use the GC. So that is an issue.
And those features would be...?
Nov 18 2021
parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Thursday, 18 November 2021 at 13:56:00 UTC, Atila Neves wrote:
 And those feature would be...?
I personally would like to see C++ compatible exceptions, lambdas, and coroutines (and build a D actor on top of that), but I am not saying it is critical or important. Those are just my personal instincts. Others might have other preferences that might be equally valid. I haven't thought a lot about the consequences, as I don't think it will happen anytime soon.
Nov 18 2021
parent reply JN <666total wp.pl> writes:
On Thursday, 18 November 2021 at 14:12:36 UTC, Ola Fosheim 
Grøstad wrote:
 On Thursday, 18 November 2021 at 13:56:00 UTC, Atila Neves 
 wrote:
 And those feature would be...?
I personally would like to see C++ compatible exceptions, lambdas, and coroutines (and build a D actor on top of that), but I am not saying it is critical or important. Those are just my personal instincts. Others might have other preferences that might be equally valid. I haven't thought a lot about the consequences, as I don't think it will happen anytime soon.
I don't know if C++ compatibility is a good direction. Most modern languages try to distance themselves from C/C++ and only offer C ABI interop for legacy software and popular libraries. D always felt like trying to start from a clean slate and minimize the dependencies on C/C++. By adding C++ compatibility, whether we like it or not, we will also inherit the negative things associated with C++. D should stand on its own as a language, rather than be a GC sidekick to make code that works with C++.

(Ironically, languages which don't care about C++ interop at all seem to have better bindings for popular C++ projects such as Qt or Bullet.)
Nov 18 2021
parent Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Thursday, 18 November 2021 at 14:54:31 UTC, JN wrote:
 I don't know if C++ compatibility is a good direction. Most 
 modern languages try to distance themselves from C/C++ and only 
 offer C ABI interop for legacy software and popular libraries. 
 D always felt like trying to start from a clean slate and 
 minimize the dependencies on C/C++. By adding C++ 
 compatibility, whether we like it or not we will also inherit 
 negative things that are related with C++. D should stand on 
 its own as a language, rather than be a GC sidekick to make 
 code that works with C++.
Yes, but then the current C++ interop strategy should be unwound, otherwise you end up in that uncanny-valley situation where you are neither this nor that. You end up with the disadvantages of tracking C++ with limited benefits. Same for importC. You have to go all in to be taken seriously, not just dip your toes. It is a difficult choice to make.

Coroutines are going to become more common in C++ code over time. Exceptions are less of a burden for libraries in C++ than they used to be. So long-term strategic planning should say: do it, or pull out completely?
Nov 18 2021
prev sibling parent Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Wednesday, 17 November 2021 at 23:39:18 UTC, bachmeier wrote:
 And yet they still post here. It wouldn't be so bad if they 
 posted things that made sense. They say things like "D forces 
 you to use the GC" which is clearly nonsense. Then when someone 
 says to use nogc, they say "But there are parts of the standard 
 library that require GC". Which has nothing to do with their 
 original complaint, and has absolutely nothing to do with their 
 decision to use another language.
But nobody wanted nogc in 2010? https://forum.dlang.org/post/hhjho8$1paq$1 digitalmars.com
Nov 17 2021
prev sibling parent Stanislav Blinov <stanislav.blinov gmail.com> writes:
On Wednesday, 17 November 2021 at 22:36:51 UTC, Greg Strong wrote:
 On Wednesday, 17 November 2021 at 18:47:35 UTC, Imperatorn 
 wrote:
 On Wednesday, 17 November 2021 at 18:09:56 UTC, Zoadian wrote:
 On Wednesday, 17 November 2021 at 02:32:21 UTC, H. S. Teoh 
 wrote:
 On Wed, Nov 17, 2021 at 01:57:23AM +0000, zjh via 
 Digitalmars-d wrote:
 [...]
Why bend over backwards to please the GC-phobic crowd? They've already made up their minds, there's no convincing them. [...]
I 100% agree with this. There are more important things that need improvement. GC and no-GC work well enough in D (Exceptions are the only thing that needs a bit of extra work with nogc).
Yes, we need to focus on the bigger picture
Agreed. Trying to please everybody is a losing battle. If the GC is so offensive to you, why are you here? There are other languages. I'm using D to build a chess variant engine where there are recursive functions that call each other a million times a second. I absolutely, positively cannot have any GC happening there! No problem. I pre-allocate what I need and I can further add nogc to be certain. (Which helped me because there was a place where I was doing a closure that could allocate and I didn't realize it.) ...The hysterical no-gc crowd should find another language.
 That being said, reducing GC usage in phobos is a worth goal...
You realize this particular thread of comments started with a question about being able to use Phobos without GC?..
Nov 17 2021
prev sibling parent reply Alexandru Ermicioi <alexandru.ermicioi gmail.com> writes:
On Wednesday, 17 November 2021 at 01:57:23 UTC, zjh wrote:
 On Wednesday, 17 November 2021 at 01:32:50 UTC, H. S. Teoh 
 wrote:

 Can I make full use of the `STD` library without garbage 
 collection?
 If this is achieved, it can be advertised everywhere.
 ,Then why not remove `GC` on the home page?.
 `-BetterC `, of course. I just can't use it with the `STD` 
 library,It's like walking on one leg.
I guess you just need to replace the leg with built-in AI (the GC) for convenient walking with a manually tuned one, like https://code.dlang.org/packages/tanya

It is probably impossible to have a standard lib that satisfies the needs of the entire community. Therefore it may be best if it is split into different packages, each catering to a subset of the community. The best case would be if those modules could interoperate well between themselves.

I hope that the decision board will take the best option in the current situation for the future of dlang :).

Best regards,
Alexandru.
Nov 17 2021
parent zjh <fqbqrr 163.com> writes:
On Wednesday, 17 November 2021 at 23:06:01 UTC, Alexandru 
Ermicioi wrote:
 for convenient walking, with manually tuned one, like 
 https://code.dlang.org/packages/tanya
If the `d core maintainer` could support one, that would be good.
Nov 17 2021
prev sibling next sibling parent reply forkit <forkit gmail.com> writes:
On Tuesday, 16 November 2021 at 18:17:29 UTC, Rumbu wrote:
 At least from my point of view, it seems that recently D made a 
 shift from a general purpose language to a C successor, hence 
 the last efforts to improve betterC and C interop, neglecting 
 other areas of the language.
Nothing can compete with C in terms of being the layer just above assembly.

Many try. All fail.

Improving D's capability to write and interop with low-level code seems like a worthwhile exercise to me, as it expands the problem domains in which D can be used.

But that is a very different objective from being C's successor.

C will always B.
Nov 16 2021
parent reply "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Wed, Nov 17, 2021 at 01:23:38AM +0000, forkit via Digitalmars-d wrote:
[...]
 Nothing can compete with C, in terms of being the layer just above assembly.
 
 Many try. All fail.
 
 Improving on D's capability to write and interop with low-level code,
 seems like a worthwhile exercise to me, as it expands the problem
 domains in which D can be used.
 
 But that is a very different objective to being C's successor.
 
 C will always B.
Until buffer overflows, stack corruptions, and other lovely security holes thanks to C's unabashedly unsafe and outdated language constructs make it irrelevant in a world where security is quickly becoming (if it hasn't already become) a primary concern.

Recently at my job I was assigned to review code quality issues found by an automatic code analysing tool in a very large C codebase. From reviewing (and fixing) hundreds of them, I can confirm that Walter was 100% spot on when he talked about design problems in C, the worst of which are the memory-safety issues. Some of the top issues I've encountered are:

- Uninitialized variables (i.e., there's a code path that leads to reading an uninitialized variable), the worst of which is uninitialized pointers.

- Dereferencing of null pointers (forgetting to check for null, or using a pointer *before* checking for null -- don't laugh, this happens *very* often even with the most seasoned C programmers; the code path that triggers this is usually completely non-obvious after a function has been revised several times).

- Resource leaks (most frequent: memory leaks; after that file descriptor leaks, then other resource leaks).

- Buffer overflows / overruns (unterminated strings being the most common). The worst offenders are functions that take a pointer to an array without a size parameter and just *assume* that there's enough space (oh yes, people still write such monstrosities -- a LOT more than you might think).

- Double free.

99.9% of these issues are latent bugs: the usual code paths don't trigger them so nobody realizes that the bugs are there. But given some carefully-crafted input, or some environmental factors (disk full, file not found, different operating environments, etc.), these bugs will manifest themselves. Sometimes these bugs *do* cause things to blow up in a customer deployment environment, and cost us a LOT of money to clean up and fix. Not to mention the countless hours of employee time spent chasing down and fixing these bugs -- and the wages paid that could have been spent on things like actually implementing new features and making progress.

And 90% of these issues are completely non-obvious to code reviewers (among whom are some of the best C programmers in our team). These are not bugs caused by clueless n00bs; these are bugs that sneak into code written by seasoned C coders with decades of experience **because the language is inherently unsafe**.

One day, the realization will dawn on decision-makers that using such an outdated language in today's radically different environment from the 70's when C was designed, is a needless waste of resources. And then the inevitable will happen.


T

-- 
Democracy: The triumph of popularity over principle. -- C.Bond
Nov 16 2021
parent reply forkit <forkit gmail.com> writes:
On Wednesday, 17 November 2021 at 02:08:22 UTC, H. S. Teoh wrote:
 One day, the realization will dawn on decision-makers that 
 using such an outdated language in today's radically different 
 environment from the 70's when C was designed, is a needless 
 waste of resources. And then the inevitable will happen.


 T
I don't accept your assertion that C is outdated. Is assembly outdated?

C should be the closest to assembly you can get, and no closer. That is the value proposition of C, in 'modern times'.

If your assertion was, instead, that there are problem domains where something other than C should be considered, then I would wholeheartedly accept such an assertion. And in such domains, D is certainly worth evaluating.
Nov 16 2021
parent reply Paulo Pinto <pjmlp progtools.org> writes:
On Wednesday, 17 November 2021 at 05:51:48 UTC, forkit wrote:
 On Wednesday, 17 November 2021 at 02:08:22 UTC, H. S. Teoh 
 wrote:
 One day, the realization will dawn on decision-makers that 
 using such an outdated language in today's radically different 
 environment from the 70's when C was designed, is a needless 
 waste of resources. And then the inevitable will happen.


 T
I don't accept your assertion that C is outdated. Is assembly outdated? C should be the closest to assembly you can get, and no closer. That is the value proposition of C, in 'modern times'. If your assertion was, instead, that there are problem domains where something other than C should be considered, then I would whole heartedly accept such an assertion. And in such domains, D is certainly worth evaluating.
Google and the Linux kernel folks don't care about your opinion and have started introducing Rust into the Linux kernel.

AUTOSAR, for high-integrity computing, doesn't care about your opinion and now mandates C++14 as the language to use in AUTOSAR-certified software (ISO 26262 road vehicle functional safety).

The Arduino folks don't care about your opinion and Arduino libraries are written in C++; they also have Rust and Ada collaborations.

C was close to PDP-11 assembly; it is hardly close to any modern CPU.
Nov 16 2021
parent reply forkit <forkit gmail.com> writes:
On Wednesday, 17 November 2021 at 06:50:52 UTC, Paulo Pinto wrote:
 Google and Linux kernel don't care about your opinion and have 
 started introducing Rust into the Linux kernel.

 AUTOSAR for high integrity computing doesn't care about your 
 opinion and now mandates C++14 as the language to use on 
 AUTOSAR certified software, ISO 26262 road vehicle functional 
 safety.

 Arduino folks don't care about your opinion and Arduino 
 libraries are written in C++, they also have Rust and Ada 
 collaborations.

 C was closer to PDP-11 Assembly, it is hardly closer to any 
 modern CPU.
Well, clearly those examples demonstrate that my opinion has some merit ;-)

Also, many of C's so-called problems are really more library problems. You can't even do i/o in C without a library.

Also, you kinda left out a lot... like all the problem domains where C is still the language of choice... even to this day.

I mean, even Go was originally written in C. It seems unlikely they could have written Go in Go.
Nov 16 2021
parent Paulo Pinto <pjmlp progtools.org> writes:
On Wednesday, 17 November 2021 at 07:54:19 UTC, forkit wrote:
 On Wednesday, 17 November 2021 at 06:50:52 UTC, Paulo Pinto 
 wrote:
 Google and Linux kernel don't care about your opinion and have 
 started introducing Rust into the Linux kernel.

 AUTOSAR for high integrity computing doesn't care about your 
 opinion and now mandates C++14 as the language to use on 
 AUTOSAR certified software, ISO 26262 road vehicle functional 
 safety.

 Arduino folks don't care about your opinion and Arduino 
 libraries are written in C++, they also have Rust and Ada 
 collaborations.

 C was closer to PDP-11 Assembly, it is hardly closer to any 
 modern CPU.
Well, clearly those examples demonstate that my opinion has some merit ;-) Also, many of C's so called problems, are really more library problems. You can't even do i/o in C without a library. also. you kinda left out alot.... like all the problem domains where C is still the language of choice... even to this day. I mean, even Go was originally written in C. It seems unlilkely they could have written Go, in Go.
Go was written in C because the authors decided to reuse the Plan 9 compilers they created in the first place, instead of starting from scratch. That is all; nothing special about C other than saving time.

Currently, Go is written in Go, including its GC implementation. F-Secure has their own baremetal Go implementation, TamaGo, written in Go and sold for high-security firmware.
Nov 17 2021
prev sibling next sibling parent reply Tejas <notrealemail gmail.com> writes:
On Tuesday, 16 November 2021 at 18:17:29 UTC, Rumbu wrote:
 At least from my point of view, it seems that recently D made a 
 shift from a general purpose language to a C successor, hence 
 the last efforts to improve betterC and C interop, neglecting 
 other areas of the language.

 By other areas I mean half baked language built-ins or oop 
 support which failed to evolve at least to keep the pace with 
 the  languages from where D took inspiration initially (e.g. 
 Java and its successors).

 In this new light, even I am not bothered by, I must admit that 
 the garbage collector became something that doesn't fit in.

 Now, without a gc, more than half of the language risks to 
 become unusable and that's why I ask myself how do you see the 
 future of the memory management in D?

 For library development it is not necessary a big deal since 
 the allocator pattern can be implemented for each operation 
 that needs to allocate.

 But, for the rest of the features which are part of the core 
 language (e.g. arrays, classes, exceptions) what memory model 
 do you consider that will fit in? Do you think that compiler 
 supported ARC can be accepted as a deterministic memory model 
 by everyone? Or memory ownership and flow analysis are better?

 Not assuming a standard memory model can be a mistake, the C 
 crowd will always complain that they cannot use feature X, 
 others will complain that they cannot use feature Y because it 
 is not finished or its semantics are stuck in 2000's.
D always aimed to be a C/C++ successor though. It had Java influences, but that's it; it always had the ambition of being a systems programming language. That it was flexible enough to even be good for scripting tasks was an accidental boon (and downfall:

- since GC means no systems programming, and instead of removing the GC or embracing it, we're stuck in-between;
- ARC seems to be a pipe dream for some reason;
- front-end refactoring is considered infeasible; and
- code breakage is only allowed as a regression, i.e., a bug that must be fixed).
Nov 16 2021
parent Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Wednesday, 17 November 2021 at 07:50:20 UTC, Tejas wrote:
 - ARC seems to be a pipe dream for some reason;
It is a pipe dream because there is no plan to restructure the compiler internals. Until that happens a solid ARC implementation is infeasible (or rather, much more expensive than the restructuring costs).

What is missing is basic software engineering. The boring aspect of programming, but basically a necessity if you care about cost and quality.
Nov 17 2021
prev sibling parent reply Atila Neves <atila.neves gmail.com> writes:
On Tuesday, 16 November 2021 at 18:17:29 UTC, Rumbu wrote:
 At least from my point of view, it seems that recently D made a 
 shift from a general purpose language to a C successor,
I mean, it kind of always was, which is even indicated in the name.
 hence the last efforts to improve betterC and C interop, 
 neglecting other areas of the language.

 By other areas I mean half baked language built-ins or oop 
 support which failed to evolve at least to keep the pace with 
 the  languages from where D took inspiration initially (e.g. 
 Java and its successors).
Where would you say D has failed to evolve in terms of OOP?
 Now, without a gc, more than half of the language risks to 
 become unusable and that's why I ask myself how do you see the 
 future of the memory management in D?
The GC isn't going anywhere. It's the easiest way to write memory-safe code other than leaking everything.
Nov 17 2021
next sibling parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Wednesday, 17 November 2021 at 21:46:45 UTC, Atila Neves wrote:
 The GC isn't going anywhere. It's the easiest way to write 
 memory-safe code other than leaking everything.
What is the plan for destructors? Objects with destructors should not be allocated on the GC-heap. Will this become a type error that is caught at compile time?
Nov 17 2021
next sibling parent reply Steven Schveighoffer <schveiguy gmail.com> writes:
On 11/17/21 4:59 PM, Ola Fosheim Grøstad wrote:
 On Wednesday, 17 November 2021 at 21:46:45 UTC, Atila Neves wrote:
 The GC isn't going anywhere. It's the easiest way to write memory-safe 
 code other than leaking everything.
What is the plan for destructors? Objects with destructors should not be allocated on the GC-heap. Will this become a type error that is caught at compile time?
Nothing wrong with it. It works correctly if you use it correctly. -Steve
Nov 17 2021
parent Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Wednesday, 17 November 2021 at 22:53:45 UTC, Steven 
Schveighoffer wrote:
 Nothing wrong with it. It works correctly if you use it 
 correctly.
That is a meaningless statement! When I write assembly everything works correctly if I use it correctly.
Nov 17 2021
prev sibling parent reply Atila Neves <atila.neves gmail.com> writes:
On Wednesday, 17 November 2021 at 21:59:51 UTC, Ola Fosheim 
Grøstad wrote:
 On Wednesday, 17 November 2021 at 21:46:45 UTC, Atila Neves 
 wrote:
 The GC isn't going anywhere. It's the easiest way to write 
 memory-safe code other than leaking everything.
What is the plan for destructors? Objects with destructors should not be allocated on the GC-heap. Will this become a type error that is caught at compile time?
No plans right now. Refresh my memory: what are the main issues?
Nov 18 2021
parent Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Thursday, 18 November 2021 at 13:38:58 UTC, Atila Neves wrote:
 On Wednesday, 17 November 2021 at 21:59:51 UTC, Ola Fosheim 
 Grøstad wrote:
 On Wednesday, 17 November 2021 at 21:46:45 UTC, Atila Neves 
 wrote:
 The GC isn't going anywhere. It's the easiest way to write 
 memory-safe code other than leaking everything.
What is the plan for destructors? Objects with destructors should not be allocated on the GC-heap. Will this become a type error that is caught at compile time?
No plans right now. Refresh my memory: what are the main issues?
Off the top of my head:

1. The main issue is that beyond the trivial examples destruction order matters and you need to keep objects alive to get the correct cleanup of resources. Think database/transaction, GPU/texture, etc. So if you assume RAII for aggregates, then aggregates written for those won't work with the GC. Though you could support both by adding finalizer handlers; then you could deny GC allocation of objects with a destructor and no finalizer handler (just one possibility).

2. GC collection will be faster if you have no destructors, especially if you move to a local GC heap (per actor). Then you can just release the whole heap and do no scanning when the actor is done. If the average actor's lifespan is short and you have plenty of memory, the GC overhead will drop from "O(N)" to "O(1)" (average).

Might be other issues.
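To make point 1 concrete, a small sketch of my own (not anything from druntime): with scoped/RAII destruction the transaction is always torn down strictly before the database it belongs to, while GC finalization gives no such ordering guarantee:

```d
import std.stdio;

struct Database
{
    bool open = true;
    ~this() { open = false; writeln("database closed"); }
}

struct Transaction
{
    Database* db;
    ~this()
    {
        // Safe under RAII: `db` is destroyed strictly after this struct.
        if (db !is null && db.open) writeln("transaction rolled back");
    }
}

void main()
{
    Database db;
    auto tx = Transaction(&db);
    // Scope exit destroys tx first, then db -- deterministic order.
    // Had both been GC-heap classes, finalization order would be unspecified.
}
```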
Nov 18 2021
prev sibling parent reply rumbu <rumbu rumbu.ro> writes:
On Wednesday, 17 November 2021 at 21:46:45 UTC, Atila Neves wrote:
 On Tuesday, 16 November 2021 at 18:17:29 UTC, Rumbu wrote:
 At least from my point of view, it seems that recently D made 
 a shift from a general purpose language to a C successor,
I mean, it kind of always was, which is even indicated in the name.
I was here when D was advertised as a general-purpose language. The latest development efforts are concentrated on making a better C, not a better D.
 Where would you say D has failed to evolve in terms of OOP?
* wrong idea of what private means (yes, I know that you disagree on that, but every OOP book/study/reference considers the class as unit of encapsulation);
* struct inheritance
* explicit interface implementations
* class destructuring
* properties
* pattern matching on class type
* pattern matching on fields/properties
* implicit constructors
 Now, without a gc, more than half of the language risks to 
 become unusable and that's why I ask myself how do you see the 
 future of the memory management in D?
The GC isn't going anywhere. It's the easiest way to write memory-safe code other than leaking everything.
Personally, I want the GC to stay and improve. But when I see the language trying its best to please the GC haters, I wonder if the GC is the right default memory management in D. That was in fact the main idea of this post: which model of memory management would be most suitable to assume in D to make everyone happy.

Finally, I found [something](https://digitalmars.com/d/1.0/builtin.html) from the old D1 page:
Complex Numbers
A detailed comparison with C++'s std::complex.

The most compelling reason is compatibility with C's imaginary 
and complex floating point types. Next, is the ability to have
imaginary floating point literals. Isn't:

c = (6 + 2i - 1 + 3i) / 3i;
far preferable than writing:

c = (complex!(double)(6,2) + complex!(double)(-1,3)) / 
complex!(double)(0,3);
? It's no contest.
That was 15 years ago. I will reply with this quote to everyone who has "another library solution".
Nov 17 2021
next sibling parent reply forkit <forkit gmail.com> writes:
On Thursday, 18 November 2021 at 04:24:56 UTC, rumbu wrote:
 Personnaly I want gc to stay and improve. But when I see the 
 language trying its best to please the gc haters, I wonder if 
 the GC is the right default memory management in D. That was in 
 fact the main idea of this post, which model of memory 
 management will be more suitable to be assumed in D to make 
 everyone happy.
You want to make everyone happy? Good luck with that ;-) In any case.... "People often think that D is a garbage collected language. I hope to disabuse them of that notion.." https://www.youtube.com/watch?v=_PB6Hdi4R7M
Nov 17 2021
parent reply rumbu <rumbu rumbu.ro> writes:
On Thursday, 18 November 2021 at 05:50:22 UTC, forkit wrote:
 On Thursday, 18 November 2021 at 04:24:56 UTC, rumbu wrote:
 Personnaly I want gc to stay and improve. But when I see the 
 language trying its best to please the gc haters, I wonder if 
 the GC is the right default memory management in D. That was 
 in fact the main idea of this post, which model of memory 
 management will be more suitable to be assumed in D to make 
 everyone happy.
You want to make everyone happy? Good luck with that ;-) In any case.... "People often think that D is a garbage collected language. I hope to disabuse them of that notion.." https://www.youtube.com/watch?v=_PB6Hdi4R7M
    int* x = stackalloc int[100];
    int* y = Marshal.AllocHGlobal(1000);
    struct RC<T> where T: struct { ... }

Of course, metaprogramming makes things easier in D, but pretending that D is not a garbage-collected language when you cannot join two arrays or throw an exception without digging outside the language for a replacement is absurd.
Nov 17 2021
parent reply Nick Treleaven <nick geany.org> writes:
On Thursday, 18 November 2021 at 06:25:29 UTC, rumbu wrote:


 int* x = stackalloc int[100];
 int* y = Marshal.AllocHGlobal(1000);
 struct RC<T> where T: struct { ... }

 O course, metaprogramming makes things easier in D, but 
 pretending that D is not a garbage collected language when you 
 cannot join two arrays or throw an exception without digging  
 outside the language for a replacement is absurd.
@nogc unittest
{
    import std.container.array;
    auto a = Array!int(1, 2);
    auto b = Array!int(3, 4);
    a ~= b;

    import std.algorithm;
    assert(a[].equal([1,2,3,4]));

    static const ex = new Exception("hi");
    throw ex;
}
Nov 18 2021
parent Rumbu <rumbu rumbu.ro> writes:
On Thursday, 18 November 2021 at 14:31:36 UTC, Nick Treleaven 
wrote:
 On Thursday, 18 November 2021 at 06:25:29 UTC, rumbu wrote:


 int* x = stackalloc int[100];
 int* y = Marshal.AllocHGlobal(1000);
 struct RC<T> where T: struct { ... }

 O course, metaprogramming makes things easier in D, but 
 pretending that D is not a garbage collected language when you 
 cannot join two arrays or throw an exception without digging  
 outside the language for a replacement is absurd.
nogc unittest { import std.container.array; auto a = Array!int(1, 2); auto b = Array!int(3, 4); a ~= b; import std.algorithm; assert(a[].equal([1,2,3,4])); static const ex = new Exception("hi"); throw ex; }
As I said, you are digging outside the language spec with Array and equal, and with the overloaded operators opOpAssign and opSlice, even if they are not obvious. Preallocating global exceptions with standard messages is not a good idea.
Nov 18 2021
prev sibling next sibling parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Thursday, 18 November 2021 at 04:24:56 UTC, rumbu wrote:
 * struct inheritance
 * explicit interface implementations
 * class destructuring
 * properties
 * pattern matching on class type
 * pattern matching on fields/properties
 * implicit constructors
What is "class destructuring"? I would add user-provided default constructors to this list.
 Finally, I found 
 [something](https://digitalmars.com/d/1.0/builtin.html) from 
 the old D1 page:

Complex Numbers
A detailed comparison with C++'s std::complex.

The most compelling reason is compatibility with C's imaginary 
and complex floating point >types. Next, is the ability to have 
imaginary floating point literals. Isn't:

c = (6 + 2i - 1 + 3i) / 3i;
far preferable than writing:

c = (complex!(double)(6,2) + complex!(double)(-1,3)) / 
complex!(double)(0,3);
? It's no contest.
15 years ago. I will reply with this quote to everyone who has "another library solution".
C++ can do this in a library: https://en.cppreference.com/w/cpp/language/user_literal You need to find a better example. :-)
Nov 18 2021
parent reply Rumbu <rumbu rumbu.ro> writes:
On Thursday, 18 November 2021 at 11:08:17 UTC, Ola Fosheim 
Grøstad wrote:

 What is "class destructuring"?
In D terms: an idiom allowing `tupleof` to be overloaded as if it were an operator. Combined with tuple support you could have the following code:

```
class A { int x; int y; }

A a = new A();
(int v, int w) = a;  // v will contain a.x, w will contain a.y
```
 15 years ago. I will reply  with this quote to everyone who 
 has "another library solution".
C++ can do this in a library: https://en.cppreference.com/w/cpp/language/user_literal You need to find a better example. :-)
This was to show how the D paradigm shifted in 15 years from avoiding library solutions to encouraging them.
Nov 18 2021
parent Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Thursday, 18 November 2021 at 12:42:04 UTC, Rumbu wrote:
 In D idioms allowing to overload tupleof if it was an operator.
Ok, I get it. So that would work with public fields only? Yes, it is a weakness that class/struct are different yet very similar. Ideally a tuple should be syntactic sugar for a struct.
 This was to show how D paradigm shifted in 15 years from 
 avoiding library solutions to encouraging them.
Yes. Andrei did that with D2. A strategic mistake, but perhaps more fun.
Nov 18 2021
prev sibling next sibling parent reply Steven Schveighoffer <schveiguy gmail.com> writes:
On 11/17/21 11:24 PM, rumbu wrote:
 Finally, I found [something](https://digitalmars.com/d/1.0/builtin.html) 
 from the old D1 page:
 
 Complex Numbers
 A detailed comparison with C++'s std::complex.

 The most compelling reason is compatibility with C's imaginary and 
 complex floating point >types. Next, is the ability to have imaginary 
 floating point literals. Isn't:

 c = (6 + 2i - 1 + 3i) / 3i;
 far preferable than writing:

 c = (complex!(double)(6,2) + complex!(double)(-1,3)) / 
 complex!(double)(0,3);
 ? It's no contest.
15 years ago. I will reply  with this quote to everyone who has "another library solution".
Let's adjust the straw in that man:

```d
import std.complex;

auto i(double v)
{
    return complex(0, v);
}

void main()
{
    auto c = (6 + 2.i - 1 + 3.i) / 3.i;
}
```

-Steve
Nov 18 2021
next sibling parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Thursday, 18 November 2021 at 13:05:02 UTC, Steven 
Schveighoffer wrote:
 Let's adjust the straw in that man:
How can you bind something global to a common identifier like ```i```? You are joking, right?
Nov 18 2021
next sibling parent reply Adam D Ruppe <destructionator gmail.com> writes:
On Thursday, 18 November 2021 at 13:07:28 UTC, Ola Fosheim 
Grøstad wrote:
 How can you bind something global to a common identifier like 
 ```i```?
That's not global.
Nov 18 2021
parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Thursday, 18 November 2021 at 13:26:43 UTC, Adam D Ruppe wrote:
 On Thursday, 18 November 2021 at 13:07:28 UTC, Ola Fosheim 
 Grøstad wrote:
 How can you bind something global to a common identifier like 
 ```i```?
That's not global.
That's a rather pedantic statement. Oh well, then nothing user defined is global, everything is nested within something else.
Nov 18 2021
parent Adam D Ruppe <destructionator gmail.com> writes:
On Thursday, 18 November 2021 at 13:37:57 UTC, Ola Fosheim 
Grøstad wrote:
 That's not global.
That's a rather pedantic statement.
It is an important distinction since you get this if you want it and don't get it if you don't.
Nov 18 2021
prev sibling parent reply Steven Schveighoffer <schveiguy gmail.com> writes:
On 11/18/21 8:07 AM, Ola Fosheim Grøstad wrote:
 On Thursday, 18 November 2021 at 13:05:02 UTC, Steven Schveighoffer wrote:
 Let's adjust the straw in that man:
How can you bind something global to a common identifier like ```i```? You are joking, right?
Ironically, `i` is almost never a global identifier. Using it for a global identifier works because UFCS is not allowed on locals.

```d
import std.complex;

auto i(double v) { return complex(0, v); }

void main()
{
    int i = 5;
    auto c = (6 + 2.5.i - 1 + 3.i) / 3.i;
}
```

-Steve
Nov 18 2021
parent Ola Fosheim Grøstad <ola.fosheim.grostad gmail.com> writes:
On Thursday, 18 November 2021 at 13:27:15 UTC, Steven 
Schveighoffer wrote:
 Ironically, `i` is almost never a global identifier. Using it 
 for a global identifier works because UFCS is not allowed on 
 locals.
But you would have to stuff it into std.complex for it to be a library feature. So if someone else uses ```i(double)``` it would mess up everything. That is not what Rumbu asked for; he "asked" for something that was simple to use.
Nov 18 2021
prev sibling parent reply Tejas <notrealemail gmail.com> writes:
On Thursday, 18 November 2021 at 13:05:02 UTC, Steven 
Schveighoffer wrote:
 On 11/17/21 11:24 PM, rumbu wrote:
 Finally, I found 
 [something](https://digitalmars.com/d/1.0/builtin.html) from 
 the old D1 page:
 
 Complex Numbers
 A detailed comparison with C++'s std::complex.

 The most compelling reason is compatibility with C's 
 imaginary and complex floating point >types. Next, is the 
 ability to have imaginary floating point literals. Isn't:

 c = (6 + 2i - 1 + 3i) / 3i;
 far preferable than writing:

 c = (complex!(double)(6,2) + complex!(double)(-1,3)) / 
 complex!(double)(0,3);
 ? It's no contest.
15 years ago. I will reply  with this quote to everyone who has "another library solution".
Let's adjust the straw in that man:

```d
import std.complex;

auto i(double v) { return complex(0, v); }

void main()
{
    auto c = (6 + 2.i - 1 + 3.i) / 3.i;
}
```

-Steve
This is amazing :D But this has not been documented anywhere on the `std.complex` page, so I don't feel it's fair to say that rumbu is making a straw man argument.
Nov 18 2021
parent Steven Schveighoffer <schveiguy gmail.com> writes:
On 11/18/21 8:25 AM, Tejas wrote:
 
 This is amazing :D
D is quite amazing, especially with some of these syntax sugar features ;)
 
 But, this has not been documented anywhere in the `std.complex` page, so 
 I don't feel its fair to say that rumbu is making a straw man argument.
The strawman is from the D1 page: it compares the then-current built-in implementation to a fictitious complex number implementation (which didn't exist back then), making it look as bad as possible. I'm using the actual implementation of the library that now does exist. Note that UFCS does not exist in D1 for anything other than arrays, so in D1 that rewrite is not possible. The `i` function is not in std.complex, which is a shame; I think it should be. But I've never used complex numbers, so I haven't cared. -Steve
Nov 18 2021
prev sibling parent reply Atila Neves <atila.neves gmail.com> writes:
On Thursday, 18 November 2021 at 04:24:56 UTC, rumbu wrote:
 On Wednesday, 17 November 2021 at 21:46:45 UTC, Atila Neves 
 wrote:
 On Tuesday, 16 November 2021 at 18:17:29 UTC, Rumbu wrote:
 At least from my point of view, it seems that recently D made 
 a shift from a general purpose language to a C successor,
I mean, it kind of always was, which is even indicated in the name.
I am here when D was advertised as a general purpose language. The last development efforts are concentrated around making a better C, not a better D.
I don't see "a better C" and "a general purpose language" as mutually exclusive. Do you disagree?
 Where would you say D has failed to evolve in terms of OOP?
* wrong idea of what private means (yes, I know that you disagree on that, but every OOP book/study/reference considers the class as unit of encapsulation);
As you've mentioned, we're not going to agree. I don't think this is failing to evolve either since it's by design.
 * struct inheritance
Inheritance (subtyping) and value types don't mix.
 * explicit interface implementations
 * class destructuring
Could you please explain what these mean?
 * properties
What's missing?
 * pattern matching on class type
AFAIC this is an anti-pattern. IMHO, in OOP, there should be a switch exactly in one place in the program, and that's where instances get created. I don't know why anyone would want to recreate the functionality of the virtual table in an ad-hoc way. Multiple dispatch blurs this, however.
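(For illustration, a minimal sketch of that view; the class names are made up. The virtual table already performs the dispatch a type switch would reimplement by hand, and the only per-type decision is which class to construct:)

```d
import std.stdio;

abstract class Shape
{
    abstract double area();
}

class Circle : Shape
{
    double r;
    this(double r) { this.r = r; }
    override double area() { return 3.141592653589793 * r * r; }
}

class Rect : Shape
{
    double w, h;
    this(double w, double h) { this.w = w; this.h = h; }
    override double area() { return w * h; }
}

void main()
{
    // the only place that "switches" on the concrete type is construction
    Shape s = new Circle(1.0);
    writeln(s.area()); // the call site never inspects the type
}
```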
 * pattern matching on fields/properties
How would this work?
 Personnaly I want gc to stay and improve. But when I see the 
 language trying its best to please the gc haters,
In my view it's not really about pleasing the GC haters as much as it's acknowledging that the GC isn't always the best choice, and making sure that we can write memory-safe programs without it.
 I wonder if the GC is the right default memory management in D.
I believe it is, yes.
 Finally, I found 
 [something](https://digitalmars.com/d/1.0/builtin.html) from 
 the old D1 page:

Complex Numbers
A detailed comparison with C++'s std::complex.

The most compelling reason is compatibility with C's imaginary 
and complex floating point >types. Next, is the ability to have 
imaginary floating point literals. Isn't:

c = (6 + 2i - 1 + 3i) / 3i;
far preferable than writing:

c = (complex!(double)(6,2) + complex!(double)(-1,3)) / 
complex!(double)(0,3);
? It's no contest.
15 years ago. I will reply with this quote to everyone who has "another library solution".
Library solution: `auto c = (6 + 2.i - 1 + 3.i) / 3.i;`

I'd disagree that it's "no contest". I think that 2021 Walter would disagree as well, but he can prove me wrong here if that's not the case.
Nov 18 2021
next sibling parent Imperatorn <johan_forsberg_86 hotmail.com> writes:
On Thursday, 18 November 2021 at 13:37:28 UTC, Atila Neves wrote:
 On Thursday, 18 November 2021 at 04:24:56 UTC, rumbu wrote:
 [...]
I don't see "a better C" and "a general purpose language" as mutually exclusive. Do you disagree? [...]
I don't know about the other stuff, but explicit interface implementation is this: https://docs.microsoft.com/en-us/dotnet/csharp/programming-guide/interfaces/explicit-interface-implementation
Nov 18 2021
prev sibling next sibling parent reply rumbu <rumbu rumbu.ro> writes:
On Thursday, 18 November 2021 at 13:37:28 UTC, Atila Neves wrote:
 On Thursday, 18 November 2021 at 04:24:56 UTC, rumbu wrote:
 * wrong idea of what private means (yes, I know that you 
 disagree on that, but every OOP book/study/reference considers 
 the class as unit of encapsulation);
As you've mentioned, we're not going to agree. I don't think this is failing to evolve either since it's by design.
Let's agree to disagree. For me, it's a failure.
 * struct inheritance
Inheritance (subtyping) and value types don't mix.
A struct can inherit 2 things:
- another struct
- an interface

This doesn't involve boxing for structs, just the compiler generating a templated function when it encounters the interface as a parameter. At least this is how it is done in BeefLang.

```d
interface I { void foo(); }

void bar(I i) { i.foo(); }

class C : I { void foo() {} }
struct S : I { void foo() {} }

C c = new C();
S s = new S();

void callMe(I i) { i.foo(); }

callMe(c); // compiler will call callMe as usual
callMe(s); // compiler will generate and call a specialized callMe(S i)
```
 Could you please explain what these mean?

explicit interface implementations
```d
interface I1 { void foo(); }
interface I2 { void foo(); }

class C : I1, I2
{
    void foo() { writeln("I am I1's and I silently hide I2's foo"); }
    void I1.foo() { writeln("I am I1's foo"); } // this doesn't work
    void I2.foo() { writeln("I am I2's foo"); } // this doesn't work
}
```
 class destructuring
I already explained it: controlled decomposing of classes or structs.

```d
// if tuple syntax was built-in: (item1, item2, ....)
struct Coordinate
{
    int x, y, z;
    void opDeconstruct(out x, out y) { return (x, y); }
    void opDeconstruct(out x, out y, out z) { return (x, y, z); }
}

Coordinate co;
(x, y) = co;
(x, y, z) = co;
```
 * properties
What's missing?
A better syntax? Compiler generated backing fields? The fact they are not working as advertised?

```d
class C
{
    int _fld;
    int prop() { return _fld; }
    void prop(int x) { _fld = x; }
}

C c = new C();
c.prop += 42; // oops!
```
 * pattern matching on class type
AFAIC this is an anti-pattern. IMHO, in OOP, there should be a switch exactly in one place in the program, and that's where instances get created. I don't know why anyone would want to recreate the functionality of the virtual table in an ad-hoc way. Multiple dispatch blurs this, however.
 * pattern matching on fields/properties
How would this work?
```d
switch (JSONValue)
{
    case JSONNumber n:
        writeln("I have a number %s", n);
    case JSONString s when s.Length > 100:
        writeln("long string");
    case JSONString s:
        writeln("short string");
}
```
Nov 18 2021
next sibling parent reply JN <666total wp.pl> writes:
On Thursday, 18 November 2021 at 14:56:40 UTC, rumbu wrote:
 * pattern matching on fields/properties
How would this work?
```d
switch (JSONValue)
{
    case JSONNumber n:
        writeln("I have a number %s", n);
    case JSONString s when s.Length > 100:
        writeln("long string");
    case JSONString s:
        writeln("short string");
}
```
Isn't that just std.sumtype/tagged union?
Nov 18 2021
parent reply Rumbu <rumbu rumbu.ro> writes:
On Thursday, 18 November 2021 at 15:00:22 UTC, JN wrote:
 On Thursday, 18 November 2021 at 14:56:40 UTC, rumbu wrote:
 * pattern matching on fields/properties
How would this work?
```d
switch (JSONValue)
{
    case JSONNumber n:
        writeln("I have a number %s", n);
    case JSONString s when s.Length > 100:
        writeln("long string");
    case JSONString s:
        writeln("short string");
}
```
Isn't that just std.sumtype/tagged union?
Yes, it is, but the question was about OOP evolution in D and keeping pace with other languages, not about "here you have another lib for this".
Nov 18 2021
parent "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Thu, Nov 18, 2021 at 03:19:39PM +0000, Rumbu via Digitalmars-d wrote:
 On Thursday, 18 November 2021 at 15:00:22 UTC, JN wrote:
 On Thursday, 18 November 2021 at 14:56:40 UTC, rumbu wrote:
 * pattern matching on fields/properties
How would this work?
```d
switch (JSONValue)
{
    case JSONNumber n:
        writeln("I have a number %s", n);
    case JSONString s when s.Length > 100:
        writeln("long string");
    case JSONString s:
        writeln("short string");
}
```
Isn't that just std.sumtype/tagged union?
Yes, it is, but the question was about oop evolution in D and keeping the pace with other languages, not about "here you have another lib for this".
[...] IMO, this is actually a strength of D: it is powerful enough to express these constructs as library code instead of being baked into the language. Of course, the library experience definitely can be improved -- I don't argue with that. Some language changes to make library solutions more powerful would definitely be welcome. Documentation needs improvement, and ecosystem definitely needs work. But I don't see library solutions as a failure; I see it rather as a success that libraries are able to express such things, whereas in languages like Java the language doesn't let you express them, so you have no choice except to bake it into the language. T -- Stop staring at me like that! It's offens... no, you'll hurt your eyes!
Nov 18 2021
prev sibling parent reply Atila Neves <atila.neves gmail.com> writes:
On Thursday, 18 November 2021 at 14:56:40 UTC, rumbu wrote:
 On Thursday, 18 November 2021 at 13:37:28 UTC, Atila Neves 
 wrote:
 On Thursday, 18 November 2021 at 04:24:56 UTC, rumbu wrote:
 * struct inheritance
Inheritance (subtyping) and value types don't mix.
A struct can inherit 2 things:
- another struct
- an interface

This doesn't involve boxing for structs, just the compiler generating a templated function when it encounters the interface as a parameter. At least this is how it is done in BeefLang.

```d
interface I { void foo(); }

void bar(I i) { i.foo(); }

class C : I { void foo() {} }
struct S : I { void foo() {} }

C c = new C();
S s = new S();

void callMe(I i) { i.foo(); }

callMe(c); // compiler will call callMe as usual
callMe(s); // compiler will generate and call a specialized callMe(S i)
```
Is the point here to make sure that a struct implements a given interface? Because, if so: https://github.com/atilaneves/concepts#using-a-d-interface-to-specify-compile-time-template-constraints I hadn't thought about functions that can take a dynamic or static type though, which is interesting.
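(A rough sketch of that idea, not the concepts library itself: one template that accepts either a class implementing `I` or a struct that merely provides `foo()`. The names are made up for illustration:)

```d
interface I { void foo(); }

class C : I { void foo() {} }
struct S { void foo() {} }

void callMe(T)(T x)
    if (is(T : I) || __traits(hasMember, T, "foo"))
{
    x.foo(); // virtual call for classes, direct call for structs
}

void main()
{
    callMe(new C); // dynamic (runtime-polymorphic) type
    callMe(S());   // static type, separate template instantiation
}
```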
 * properties
What's missing?
A better syntax? Compiler generated backing fields? The fact they are not working as advertised?

```d
class C
{
    int _fld;
    int prop() { return _fld; }
    void prop(int x) { _fld = x; }
}

C c = new C();
c.prop += 42; // oops!
```
Works with `ref prop() { return _fld; }`.
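(A minimal sketch of that suggestion, assuming the same class layout as above: a getter returning by `ref` makes the read-modify-write syntax compile, at the cost of handing out the field:)

```d
class C
{
    private int _fld;
    ref int prop() { return _fld; } // getter returning a reference
}

void main()
{
    auto c = new C;
    c.prop += 42;         // c.prop() is now an lvalue, so this compiles
    assert(c.prop == 42);
}
```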
 * pattern matching on fields/properties
How would this work?
```d
switch (JSONValue)
{
    case JSONNumber n:
        writeln("I have a number %s", n);
    case JSONString s when s.Length > 100:
        writeln("long string");
    case JSONString s:
        writeln("short string");
}
```
std.sumtype?
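(For what it's worth, a sketch of how that example reads with `std.sumtype` today, assuming a simplified `JSONValue` that is just a number or a string:)

```d
import std.stdio;
import std.sumtype;

alias JSONValue = SumType!(double, string);

void describe(JSONValue v)
{
    v.match!(
        (double n) { writefln("I have a number %s", n); },
        (string s) {
            if (s.length > 100) writeln("long string");
            else writeln("short string");
        }
    );
}

void main()
{
    describe(JSONValue(3.14));
    describe(JSONValue("hi"));
}
```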
Nov 18 2021
parent reply Max Samukha <maxsamukha gmail.com> writes:
On Thursday, 18 November 2021 at 19:32:36 UTC, Atila Neves wrote:

 Works with `ref prop() { return _fld; }`.
The whole point of properties is to hide private state. Exposing a private member for direct mutation is generally a bad idea.
Nov 19 2021
next sibling parent reply Dom DiSc <dominikus scherkl.de> writes:
On Friday, 19 November 2021 at 10:21:35 UTC, Max Samukha wrote:
 On Thursday, 18 November 2021 at 19:32:36 UTC, Atila Neves 
 wrote:

 Works with `ref prop() { return _fld; }`.
The whole point of properties is to hide private state. Exposing a private member for direct mutation is generally a bad idea.
Exactly. What I would expect from a "real" property is that

    prop += 5;

is automatically lowered to

    prop(prop() + 5); // function style

or directly

    auto tmp = prop;
    tmp += 5;
    prop = tmp;

or whatever - but it would work only if there is both a getter _and_ a setter. And if only one of the two is implemented, a fitting error message should be emitted, like "no setter for prop available" or "no getter for prop available". It should NEVER EVER set a property through the getter!!!

`ref prop() { ... }` is a very bad anti-pattern. If "const" is missing or a reference is returned from a getter (a property function without parameter), this should be a compile error!

Unless that is given, I don't consider property to be a ready-to-use feature.
Nov 19 2021
next sibling parent Dom DiSc <dominikus scherkl.de> writes:
On Friday, 19 November 2021 at 11:14:46 UTC, Dom DiSc wrote:
 prop += 5;

 is automatically lowered to

 prop(prop() + 5); // function style
And by the way, taking the address of a property (the whole reason this feature was abandoned, even though it would otherwise be very easy to implement) should simply be forbidden as well. What would that address even be? The address of the getter? Of the setter? Of the private member (which may not even exist)?!?

Properties simulate a value but protect it by allowing only specific operations on it (either get or set or both, possibly with complicated processing of the assigned value). Why should they allow such an intrusive operation as taking their address?!?

If at all, we may allow a third property function that simulates the address of a value - but I can't think of a valid use case for such a thing... Through the address you can do everything to the simulated object, so why not make it public in the first place?
Nov 19 2021
prev sibling parent reply Alexandru Ermicioi <alexandru.ermicioi gmail.com> writes:
On Friday, 19 November 2021 at 11:14:46 UTC, Dom DiSc wrote:
 On Friday, 19 November 2021 at 10:21:35 UTC, Max Samukha wrote:
 On Thursday, 18 November 2021 at 19:32:36 UTC, Atila Neves 
 wrote:

 Works with `ref prop() { return _fld; }`.
The whole point of properties is to hide private state. Exposing a private member for direct mutation is generally a bad idea.
Exactly. What I would expect from a "real" property is that prop += 5; is automatically lowered to prop(prop() + 5); // function style or direct auto tmp = prop; tmp += 5; prop = tmp; or whatever - but would work only if there is both a getter _and_ a setter. And if only one of the two is implemented, a fitting error message should be emitted like "no setter for prop available" or "no getter for prop available". It should NEVER EVER set a property through the getter!!!
If you want to hide the field from public use, then just don't expect it to behave like a public field. If you want to have += and other operators that affect the private field, you could return a wrapper that does this and saves the field. This behavior of properties should just be explained a lot better, and that's it.
 ref prop() { ... } is a very bad anti-pattern.
Why is it bad? Best regards, Alexandru.
Nov 19 2021
parent reply 12345swordy <alexanderheistermann gmail.com> writes:
On Friday, 19 November 2021 at 14:30:39 UTC, Alexandru Ermicioi 
wrote:
 Why is it bad?

 Best regards,
 Alexandru.
It violates encapsulation. At that point you might as well make the l-value public. -Alex
Nov 19 2021
next sibling parent reply Timon Gehr <timon.gehr gmx.ch> writes:
On 19.11.21 16:39, 12345swordy wrote:
 On Friday, 19 November 2021 at 14:30:39 UTC, Alexandru Ermicioi wrote:
 Why is it bad?

 Best regards,
 Alexandru.
It violate encapsulation. Which at that point you might as well make the l-value public. -Alex
This is just one way to make an lvalue public. Note that the address might not be constant/at a constant offset.
Nov 19 2021
parent Timon Gehr <timon.gehr gmx.ch> writes:
On 19.11.21 16:50, Timon Gehr wrote:
 On 19.11.21 16:39, 12345swordy wrote:
 On Friday, 19 November 2021 at 14:30:39 UTC, Alexandru Ermicioi wrote:
 Why is it bad?

 Best regards,
 Alexandru.
It violate encapsulation. Which at that point you might as well make the l-value public. -Alex
This is just one way to make an lvalue public. Note that the address might not be constant/at a constant offset.
(Also, there might be lifetime tracking of some sort going on.)
Nov 19 2021
prev sibling parent reply Alexandru Ermicioi <alexandru.ermicioi gmail.com> writes:
On Friday, 19 November 2021 at 15:39:54 UTC, 12345swordy wrote:
 It violate encapsulation. Which at that point you might as well 
 make the l-value public.

 -Alex
It doesn't. Just like any other setter in any other language, it does not constrain you to return a private field of the object itself. You can return any value from any source, be it in the object itself, a sub-object, on the heap or on the stack. The only constraint is to have it stored somewhere, so you can return a reference.

The main problem here is that people expect a value type to behave like a reference type, which isn't the case in any other language either. Just try, for example, to return an int from a getter in Java and do ++ on it. You'll get the same behavior as in D.

Best regards,
Alexandru.
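(A small sketch of that point, with made-up names: a by-value getter hands out a copy, so mutating the result never touches the private field, same as in Java or C#:)

```d
class Counter
{
    private int _n;
    int n() { return _n; }    // by-value getter
    void n(int v) { _n = v; } // setter
}

void main()
{
    auto c = new Counter;
    auto copy = c.n;  // a copy of the value
    ++copy;           // modifies only the copy
    assert(c.n == 0); // the field is unchanged
}
```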
Nov 19 2021
next sibling parent Alexandru Ermicioi <alexandru.ermicioi gmail.com> writes:
On Friday, 19 November 2021 at 16:14:41 UTC, Alexandru Ermicioi 
wrote:

 It doesn't. Just like any other setter in any other language
I meant getter not setter. Regards, Alexandru
Nov 19 2021
prev sibling next sibling parent reply rumbu <rumbu rumbu.ro> writes:
On Friday, 19 November 2021 at 16:14:41 UTC, Alexandru Ermicioi 
wrote:
 On Friday, 19 November 2021 at 15:39:54 UTC, 12345swordy wrote:
 It violate encapsulation. Which at that point you might as 
 well make the l-value public.

 -Alex
It doesn't. Just like any other setter in any other language, it does not constraint you to return a private field in the object itself. You can return any value from any source, be it in object itself, a sub-object, on heap or on stack. The only constraint is to have it stored somewhere, so you can return a reference. The main problem here, is that people expect for a value type to behave like a reference type here, which isn't the case in any other language too. Just try for example to return an int from a getter in java, and do ++ on it. You'll get the same behavior as in D. Best regards, Alexandru.
```d
class Square
{
    private double width;
    public double area()
    {
        return width * width;
    }
    public ref float area(double x)
    {
        width = sqrt(x);
        // what should I return here as ref?
    }
}
```

```csharp
class Square
{
    private double width;
    public double area
    {
        get => width * width;
        set => width = Sqrt(value);
    }
}

Square sq = new Square();
sq.area += 1;
```
Nov 19 2021
parent reply Dom DiSc <dominikus scherkl.de> writes:
On Friday, 19 November 2021 at 16:31:19 UTC, rumbu wrote:
 ```d
 class Square
 {
     private double width;
     public double area()
     {
         return width * width;
     }
     public ref float area(double x)
     {
        width = sqrt(x);
       //what should I return here as ref?
     }
 }
 ```
Why should a setter return anything? (But of course it could, if you wish.) This is not necessary to realize the += operator.

I talked about the GETter, which should not return a reference, because the reason to have a getter (and no setter at all) is to make the value visible to the public but not allow any hook into anything within my object. Beside the getter, the object should remain a complete black box. If any other behavior is desired, use something else than a getter property.

It's no problem to have ordinary members that return references and of which you can get the address or pointers or whatever. But properties should stay as close to the basic needs of a pure value as possible. We don't need another way to write ordinary functions.
Nov 20 2021
parent reply Rumbu <rumbu rumbu.ro> writes:
On Saturday, 20 November 2021 at 10:08:55 UTC, Dom DiSc wrote:
 On Friday, 19 November 2021 at 16:31:19 UTC, rumbu wrote:
 ```d
 class Square
 {
     private double width;
     public double area()
     {
         return width * width;
     }
     public ref float area(double x)
     {
        width = sqrt(x);
       //what should I return here as ref?
     }
 }
 ```
Why should a setter return anything? (But of course it could, if you wish so). This is not neccessary to realize the += operator. I talked about the GETter that should not return a reference. Because the reason to have a getter (and no setter at all) is to make the value visible to the public, but not allow any hook to anything within my object. Beside the getter the object should remain a complete black box. If any other behavior is desired, use something else than a getter property. It's no problem to have ordinary members that return references and of which you can get the address or pointers or whatever. But properties should stay as close to the basic needs of a pure value as possible. We don't need another way to write ordinary functions.
Please read the entire thread. The original issue was how I can design a property in D so that I can write:

```
area += 1;
```

The solution with the ref return came from Atila, not from me.
Nov 20 2021
parent reply Commander Zot <no no.no> writes:
On Saturday, 20 November 2021 at 11:53:20 UTC, Rumbu wrote:
 The original issue was the how can I design a property in D so 
 I can write:

 ```
  area +=1;
 ```
```d
import std.stdio;

struct Property(T)
{
    T* t;
    T opOpAssign(string op)(T r)
    {
        mixin("(*t) " ~ op ~ "= r;");
        return *t;
    }
}

class Square
{
    private double width = 42;
    public auto area() { return Property!double(&width); }
}

void main()
{
    auto s = new Square;
    writeln(s.width);
    s.area += 1;
    writeln(s.width);
}
```
Nov 20 2021
parent reply Rumbu <rumbu rumbu.ro> writes:
On Saturday, 20 November 2021 at 12:11:14 UTC, Commander Zot 
wrote:
 On Saturday, 20 November 2021 at 11:53:20 UTC, Rumbu wrote:
 The original issue was the how can I design a property in D so 
 I can write:

 ```
  area +=1;
 ```
```d
import std.stdio;

struct Property(T)
{
    T* t;
    T opOpAssign(string op)(T r)
    {
        mixin("(*t) " ~ op ~ "= r;");
        return *t;
    }
}

class Square
{
    private double width = 42;
    public auto area() { return Property!double(&width); }
}

void main()
{
    auto s = new Square;
    writeln(s.width);
    s.area += 1;
    writeln(s.width);
}
```
Try again. Width must be 42.01 after you increase area by 1, not 43. And again, it's not about whether you can find a workaround or not; it's about the fact that property support is half-baked in the language.
Nov 20 2021
parent Commander Zot <no no.no> writes:
On Saturday, 20 November 2021 at 13:32:40 UTC, Rumbu wrote:
 On Saturday, 20 November 2021 at 12:11:14 UTC, Commander Zot 
 wrote:
 On Saturday, 20 November 2021 at 11:53:20 UTC, Rumbu wrote:
 The original issue was the how can I design a property in D 
 so I can write:

 ```
  area +=1;
 ```
```d
import std.stdio;

struct Property(T)
{
    T* t;
    T opOpAssign(string op)(T r)
    {
        mixin("(*t) " ~ op ~ "= r;");
        return *t;
    }
}

class Square
{
    private double width = 42;
    public auto area() { return Property!double(&width); }
}

void main()
{
    auto s = new Square;
    writeln(s.width);
    s.area += 1;
    writeln(s.width);
}
```
Try again. Width must be 42.01 after you increase area by 1, not 43. And again, it's not about that you can find or not an workaround, it's about the fact that property is half baked in the language.
This was just an example of how you can actually implement properties that support whatever operation you want. In my opinion, `@property` and the assignment rewrite should just be removed from the language.
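(A sketch of the same trick routed through the getter/setter pair instead of a raw field pointer, so that `area += 1` really goes through `sqrt`; the names `Prop` and `areaProp` are made up for illustration:)

```d
import std.math : sqrt;
import std.stdio;

struct Prop(T)
{
    T delegate() get;
    void delegate(T) set;

    void opOpAssign(string op)(T r)
    {
        T tmp = get();
        mixin("tmp " ~ op ~ "= r;");
        set(tmp);
    }
}

class Square
{
    private double width = 42;

    double area() { return width * width; }
    void area(double a) { width = sqrt(a); }

    auto areaProp()
    {
        return Prop!double(() => area(), (double a) { area(a); });
    }
}

void main()
{
    auto s = new Square;
    s.areaProp += 1;           // read via the getter, write via the setter
    writefln("%.2f", s.width); // ~42.01, not 43
}
```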
Nov 20 2021
prev sibling parent 12345swordy <alexanderheistermann gmail.com> writes:
On Friday, 19 November 2021 at 16:14:41 UTC, Alexandru Ermicioi 
wrote:
 On Friday, 19 November 2021 at 15:39:54 UTC, 12345swordy wrote:
 It violate encapsulation. Which at that point you might as 
 well make the l-value public.

 -Alex
It doesn't. Best regards, Alexandru.
Yes it does; you are directly accessing the variable itself here. The point of properties is to grant indirect access to it. - Alex
Nov 19 2021
prev sibling parent reply Atila Neves <atila.neves gmail.com> writes:
On Friday, 19 November 2021 at 10:21:35 UTC, Max Samukha wrote:
 On Thursday, 18 November 2021 at 19:32:36 UTC, Atila Neves 
 wrote:

 Works with `ref prop() { return _fld; }`.
The whole point of properties is to hide private state.
I don't agree - I think the point is to treat getters like state.
 Exposing a private member for direct mutation is generally a 
 bad idea.
I agree wholeheartedly. Getters are a code smell, setters stink.
Nov 20 2021
next sibling parent 12345swordy <alexanderheistermann gmail.com> writes:
On Saturday, 20 November 2021 at 16:42:02 UTC, Atila Neves wrote:
 On Friday, 19 November 2021 at 10:21:35 UTC, Max Samukha wrote:
 On Thursday, 18 November 2021 at 19:32:36 UTC, Atila Neves 
 wrote:

 Works with `ref prop() { return _fld; }`.
The whole point of properties is to hide private state.
I don't agree - I think the point is to treat getters like state.
 Exposing a private member for direct mutation is generally a 
 bad idea.
I agree wholeheartedly. Getters are a code smell, setters stink.
It is not a code smell if:
- You are concerned with the private member's integrity.
- You want to call other functions, such as an audit log, every time the variable is written.

- Alex
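(A small illustration of those two cases, with made-up names: the setter guards an invariant and records every write:)

```d
import std.exception : enforce;
import std.stdio;

class Account
{
    private double _balance;

    @property double balance() { return _balance; }

    @property void balance(double v)
    {
        enforce(v >= 0, "balance cannot go negative"); // integrity check
        writeln("audit: balance set to ", v);          // audit hook
        _balance = v;
    }
}

void main()
{
    auto a = new Account;
    a.balance = 100; // validated and logged; a.balance = -1 would throw
}
```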
Nov 20 2021
prev sibling next sibling parent Max Samukha <maxsamukha gmail.com> writes:
On Saturday, 20 November 2021 at 16:42:02 UTC, Atila Neves wrote:

 I don't agree - I think the point is to treat getters like 
 state.
Ok, I get your point. I think you've changed my mind.
 Exposing a private member for direct mutation is generally a 
 bad idea.
I agree wholeheartedly. Getters are a code smell, setters stink.
Nov 21 2021
prev sibling parent russhy <russhy gmail.com> writes:
On Saturday, 20 November 2021 at 16:42:02 UTC, Atila Neves wrote:
 On Friday, 19 November 2021 at 10:21:35 UTC, Max Samukha wrote:
 On Thursday, 18 November 2021 at 19:32:36 UTC, Atila Neves 
 wrote:

 Works with `ref prop() { return _fld; }`.
The whole point of properties is to hide private state.
I don't agree - I think the point is to treat getters like state.
 Exposing a private member for direct mutation is generally a 
 bad idea.
I agree wholeheartedly. Getters are a code smell, setters stink.
I 100% agree
Nov 23 2021
prev sibling parent reply Vladimir Marchevsky <vladimmi gmail.com> writes:
On Thursday, 18 November 2021 at 13:37:28 UTC, Atila Neves wrote:
 * properties
What's missing?
Ahem...
 WARNING: The definition and usefulness of property functions is 
 being reviewed, and the implementation is currently incomplete. 
 Using property functions is not recommended...
Nov 18 2021
parent Atila Neves <atila.neves gmail.com> writes:
On Thursday, 18 November 2021 at 20:13:44 UTC, Vladimir 
Marchevsky wrote:
 On Thursday, 18 November 2021 at 13:37:28 UTC, Atila Neves 
 wrote:
 * properties
What's missing?
Ahem...
 WARNING: The definition and usefulness of property functions 
 is being reviewed, and the implementation is currently 
 incomplete. Using property functions is not recommended...
So... what's missing is updating the docs? :P
Nov 18 2021