
digitalmars.D - A betterC base

reply ixid <nuaccount gmail.com> writes:
How difficult would it be for D at this point to move towards a 
pay for what you use system that out of the box is betterC and 
requires the garbage collector to be explicitly imported?

It feels like D has not overcome at least two major issues in the 
public mind, the built-in GC and, more ludicrously, the D1 
library split. Would there not be significant value in making 
this a D3 transition? As non-breaking as possible but moving the 
runtime elements into libraries and potentially taking the 
opportunity to invert the defaults for things like @safe and pure.

The story of this D3 transition to the public would then address 
the 'issues' head on, creating an easily conveyable story that 
these have been resolved. This appears to be the level on which a 
lot of language adoption works, at least between hearing about 
and trying a language. At the moment it's painful to see the 
endless criticisms of the GC and library split crop up whenever D 
is discussed. D is progressing technically but needs a 'story'.
Feb 08 2018
next sibling parent reply Seb <seb wilzba.ch> writes:
On Thursday, 8 February 2018 at 11:06:15 UTC, ixid wrote:
 How difficult would it be for D at this point to move towards a 
 pay for what you use system that out of the box is betterC and 
 requires the garbage collector to be explicitly imported?
https://github.com/dlang/druntime/pull/2057
 It feels like D has not overcome at least two major issues in 
 the public mind, the built-in GC and, more ludicrously, the D1 
 library split. Would there not be significant value in making 
 this a D3 transition? As non-breaking as possible but moving 
 the run time elements into libraries
One of Andrei's students is working on this. I think she has been focusing on templated ==, <=, and AAs so far and is now getting more into the GC business: https://github.com/dlang/druntime/pulls?utf8=%E2%9C%93&q=is%3Apr+author%3Asomzzz
 and potentially taking the opportunity to invert the defaults 
 for things like safe and pure.
I think someone (Petar?) is working on a DIP for package-wide defaults. The idea is simple: you tell the compiler once, "Hey yo, listen up. I know that old code still needs to XXX (e.g. @system by default), but don't be a bad boy and let me opt in to the cool new stuff by default, please." I don't recall the exact details, though. Ideally it's like the -std=c++11 flag, i.e. you set it once in your dub.sdl and don't have to think about it again.
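For illustration only, a sketch of what that might look like in a dub.sdl - the switch name is invented, only the `dflags` setting itself is real dub syntax:

name "myapp"
# hypothetical per-package defaults switch; nothing like it exists yet
dflags "-defaults=safe,pure"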
 The story of this D3 transition to the public would then 
 address the 'issues' head on, creating an easily conveyable 
 story that these have been resolved. This appears to be the 
 level on which a lot of language adoption works, at least 
 between hearing about and trying a language. At the moment it's 
 painful to see the endless criticisms of the GC and library 
 split crop up whenever D is discussed. D is progressing 
 technically but needs a 'story'.
I don't think D3 is going to happen anytime soon (unless someone forks the language). Breaking changes are only made for critical things, e.g. when the compiler learns to detect new errors in your code.

Regarding the 'story', there's e.g. the excellent GC series now (https://dlang.org/blog/the-gc-series) and things are moving forward, though of course PR has always been one of D's weakest points.

---

Though if there's ever a D3, my list of things to be addressed is big:

- no auto-decoding
- fix shared
- attribute bloat
- wrong defaults (e.g. @safe or final by default)
- C behavior without compiler warnings
- tuples (though it looks like they can be retroactively added now that use of the return value of the comma operator is finally gone)
- redesign Phobos with @nogc in mind
- remove the bloat from Phobos
- std.io (with streams)
- proper naming and structuring in Phobos (e.g. why is doesPointTo or RangePrimitive in std.exception?), or "hello super-messy std.traits"
- ...

Some of these can actually be fixed with little or no breakage, but for most - if addressed - hell would break loose.
Feb 08 2018
parent Arun Chandrasekaran <aruncxy gmail.com> writes:
On Thursday, 8 February 2018 at 11:40:44 UTC, Seb wrote:
 On Thursday, 8 February 2018 at 11:06:15 UTC, ixid wrote:
 [...]
https://github.com/dlang/druntime/pull/2057
 [...]
One of Andrei's student is working on this. I think she has been focusing on templated ==, <= and AAs so far and is now recently getting more into the GC business: [...]
Accompanied by a root-cause analysis of how/why these were implemented in the first place, prompting a D2 -> D3, and how to avoid such mistakes again so that a D3 -> D4 is not required. :) Just saying, techies can exhibit some managerial traits as well.
Feb 08 2018
prev sibling next sibling parent Mike Franklin <slavo5150 yahoo.com> writes:
On Thursday, 8 February 2018 at 11:06:15 UTC, ixid wrote:
 How difficult would it be for D at this point to move towards a 
 pay for what you use system that out of the box is betterC and 
 requires the garbage collector to be explicitly imported?
I'm not sure if this is what you're looking for, but I've been trying to work on something like that, and have successfully submitted a few PRs:

Opt-in ModuleInfo
https://github.com/dlang/dmd/pull/7395
https://github.com/dlang/dmd/pull/7768

Opt-in Throwable
https://github.com/dlang/dmd/pull/7786

Opt-in TypeInfo
https://github.com/dlang/dmd/pull/7799 (not yet merged; someone please review it)

With all of the above PRs merged, the compiler will no longer complain about the above missing runtime features if your code doesn't use them. It also allows one to create really small binaries with no dependencies.

Example 1
=========

object.d
--------
module object;

private alias extern(C) int function(char[][] args) MainFunc;

private extern (C) int _d_run_main(int argc, char** argv, MainFunc mainFunc)
{
    return mainFunc(null);
}

main.d
------
void main() { }

dmd -conf= -defaultlib= main.d object.d -of=main
size main
   text    data     bss     dec     hex filename
   1403     584      16    2003     7d3 main

Example 2
=========
This will avoid linking in the C standard library and C runtime. But you have to provide your own replacements.

object.d
--------
module object;

extern(C) void __d_sys_exit(long arg1)
{
    asm
    {
        mov RAX, 60;
        mov RDI, arg1;
        syscall;
    }
}

extern void main();

private extern(C) void _start()
{
    main();
    __d_sys_exit(0);
}

main.d
------
void main() { }

dmd -c -lib main.d object.d -of=main.o
ld main.o -o main
size main
   text    data     bss     dec     hex filename
     56       0       0      56      38 main

If you are creating a library to be consumed by another language, you just need to add an empty object.d file in your current directory. I tried to remove that silly limitation, but it met resistance: https://github.com/dlang/dmd/pull/7825

I have a changelog PR describing all this at https://github.com/dlang/dmd/pull/7829, with the intention of it being available in the next DMD release, but I need my other PRs reviewed and merged before I can move forward.

This is just the tip of the iceberg, though. After this stage, I would like to start tackling the overuse of TypeInfo in the coupling between the compiler and the runtime. See this comment (https://issues.dlang.org/show_bug.cgi?id=18312#c2) for more about what I mean there.

Mike
Feb 08 2018
prev sibling next sibling parent reply Adam D. Ruppe <destructionator gmail.com> writes:
On Thursday, 8 February 2018 at 11:06:15 UTC, ixid wrote:
 It feels like D has not overcome at least two major issues in 
 the public mind, the built-in GC
D is a pragmatic language aimed toward writing fast code, fast. Garbage collection has proved to be a smashing success in the industry, providing productivity and memory-safety to programmers of all skill levels. D's GC implementation follows in the footsteps of industry giants without compromising experts' ability to tweak even further. That's what we should be saying every single time someone mentions GC. Including it was the RIGHT DECISION and we should own that.
Feb 08 2018
next sibling parent reply Adam D. Ruppe <destructionator gmail.com> writes:
ooh better last sentence


D's GC implementation follows in the footsteps of industry giants 
without compromising experts' ability to realize maximum 
potential from the machine.
Feb 08 2018
next sibling parent reply ixid <nuaccount gmail.com> writes:
On Thursday, 8 February 2018 at 14:56:31 UTC, Adam D. Ruppe wrote:
 ooh better last sentence


 D's GC implementation follows in the footsteps of industry 
 giants without compromising experts' ability to realize maximum 
 potential from the machine.
That's been said over and over and the message has not gotten through. With a pay-for-what-you-use approach the GC is just as available as it is now, and yes, I completely agree it's very useful and that the reaction to it is ludicrous. This is an illogical argument that we've lost, so it needs a new approach.
Feb 08 2018
parent reply Adam D. Ruppe <destructionator gmail.com> writes:
On Thursday, 8 February 2018 at 15:43:01 UTC, ixid wrote:
 That's been said over and over and the message has not gotten 
 through.
It is almost never said! We always play by their terms and implicitly concede by saying "but we can avoid it" or "look -betterC". Reddit invades our space, and we fall back. Rust assimilates entire worlds, and we fall back. Not again! The line must be drawn here! This far, no further!
Feb 08 2018
next sibling parent reply John Gabriele <jgabriele fastmail.fm> writes:
On Thursday, 8 February 2018 at 15:51:38 UTC, Adam D. Ruppe wrote:
 On Thursday, 8 February 2018 at 15:43:01 UTC, ixid wrote:
 That's been said over and over and the message has not gotten 
 through.
It is almost never said! We always play by their terms and implicitly concede by saying "but we can avoid it" or "look -betterC". Reddit invades our space, and we fall back. Rust assimilates entire worlds, and we fall back. Not again! The line must be drawn here! This far, no further!
Woot! Love it. :) Will save that quote you provided to use elsewhere. Thanks. Regarding what you said about the implementation of the GC following in the footsteps of industry giants, what specifically about D's GC implementation is patterned after other industry giants' GCs?
Feb 08 2018
parent Adam D. Ruppe <destructionator gmail.com> writes:
On Thursday, 8 February 2018 at 16:40:46 UTC, John Gabriele wrote:
 Regarding what you said about the implementation of the GC 
 following in the footsteps of industry giants, what 
 specifically about D's GC impl is patterned after other 
 industry giant's GC's?
The simple fact that it is a GC. These debates aren't about technical details. You don't see the reddit detractors actually arguing implementation details - they just equate GC with bad. But GC isn't bad. GC is used by virtually everyone, with big productivity and memory-safety gains for most, and evidently without seriously getting in the way of the majority of the remainder... just like how D's specialized users who can't afford the GC still manage to use D.
Feb 08 2018
prev sibling next sibling parent reply ixid <nuaccount gmail.com> writes:
On Thursday, 8 February 2018 at 15:51:38 UTC, Adam D. Ruppe wrote:
 On Thursday, 8 February 2018 at 15:43:01 UTC, ixid wrote:
 That's been said over and over and the message has not gotten 
 through.
It is almost never said! We always play by their terms and implicitly concede by saying "but we can avoid it" or "look -betterC". Reddit invades our space, and we fall back. Rust assimilates entire worlds, and we fall back. Not again! The line must be drawn here! This far, no further!
You're preaching to the choir here. Being able to add GC easily to a betterC base gives you the same utility and a much stronger story to tell people, optional GC sounds good. Do you really think sticking with the current course on GC would gain more users than very slightly changing tack and making it something you add to a simpler base? I think the second of those will gain more users.
Feb 08 2018
parent Adam D. Ruppe <destructionator gmail.com> writes:
On Thursday, 8 February 2018 at 17:32:53 UTC, ixid wrote:
 Do you really think sticking with the current course on GC 
 would gain more users than very slightly changing tack and 
 making it something you add to a simpler base? I think the 
 second of those will gain more users.
No, the current course - which IS the optional GC story you're talking about - is not good. I'm saying change course by embracing our advantages instead of constantly playing defense. D isn't going to beat Rust on compiler-enforced memory safety, so we shouldn't even play that game. Instead, pound them into the ground with our programmer productivity package. Destroy them with our familiar syntax that programmers already know.
Feb 08 2018
prev sibling parent psychoticRabbit <meagain megain.com> writes:
On Thursday, 8 February 2018 at 15:51:38 UTC, Adam D. Ruppe wrote:
 On Thursday, 8 February 2018 at 15:43:01 UTC, ixid wrote:
 That's been said over and over and the message has not gotten 
 through.
It is almost never said! We always play by their terms and implicitly concede by saying "but we can avoid it" or "look -betterC". Reddit invades our space, and we fall back. Rust assimilates entire worlds, and we fall back. Not again! The line must be drawn here! This far, no further!
"Death is nothing, but to live defeated and inglorious is to die daily." - Napoleon 'D' Bonaparte Hey... logo idea for Munich 2018 -> Dman wearing a Napoleon hat - and riding a horse. Hey.. it's better than Dman lying on a death bed, dying of a stomach ulcer... I think we should have an annual D parade too...bring out all the might of D's machinery..and show the world how powerful we really are.
Feb 08 2018
prev sibling parent reply Dave Jones <dave jones.com> writes:
On Thursday, 8 February 2018 at 14:56:31 UTC, Adam D. Ruppe wrote:
 ooh better last sentence


 D's GC implementation follows in the footsteps of industry 
 giants without compromising experts' ability to realize maximum 
 potential from the machine.
If D had a decent garbage collector it might be a more convincing argument. If going malloc didn't lose you a bunch of features and bring a bunch of other stuff you need to be careful of, that might be a good argument too. I mean, a good quality GC and seamless integration of manual memory management would be a pretty good argument to make, but D has neither of those ATM.
Feb 08 2018
next sibling parent reply bachmeier <no spam.net> writes:
On Thursday, 8 February 2018 at 17:03:58 UTC, Dave Jones wrote:
 On Thursday, 8 February 2018 at 14:56:31 UTC, Adam D. Ruppe 
 wrote:
 ooh better last sentence


 D's GC implementation follows in the footsteps of industry 
 giants without compromising experts' ability to realize 
 maximum potential from the machine.
If D had a decent garbage collector it might be a more convincing argument. If going malloc didnt lose you a bunch of features and bring a bunch of other stuff you need to be careful of, that might be a good argument too. I mean a good quality GC and seamless integration of manual memory management would be a pretty good argument to make, but D has neither of those ATM.
What are D's limitations on do-it-yourself reference counting?
Feb 08 2018
parent reply Mike Franklin <slavo5150 yahoo.com> writes:
On Thursday, 8 February 2018 at 17:10:00 UTC, bachmeier wrote:

 What are D's limitations on do-it-yourself reference counting?
* Types that are built into the language like dynamic arrays, associative arrays, and exceptions won't benefit from DIY reference counting.

* Much of Phobos probably wouldn't be compatible with DIY reference counting.

That being said, there may be a way to override some runtime hooks like _d_newclass (https://dlang.org/library/rt/lifetime/_d_newclass.html), etc. to make it work. But I haven't tried.

Also, I think Walter is currently working on getting reference-counted exceptions into the language: https://github.com/dlang/druntime/pull/1995

Mike
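For illustration, a minimal sketch of what DIY reference counting can look like in D (a hypothetical toy, not automem or Phobos code): payload and counter live in one malloc'd block, and the postblit/destructor maintain the count.

import core.stdc.stdlib : malloc, free;

struct RC(T)
{
    private struct Impl { T payload; size_t count; }
    private Impl* impl;

    this(T value)
    {
        impl = cast(Impl*) malloc(Impl.sizeof);
        impl.payload = value;
        impl.count = 1;
    }

    this(this) { if (impl) ++impl.count; }                   // copy: another owner appeared

    ~this() { if (impl && --impl.count == 0) free(impl); }   // last owner frees the block

    ref T get() { return impl.payload; }
}

void main()
{
    auto a = RC!int(42);
    auto b = a;                          // count becomes 2
    assert(a.get == 42 && b.get == 42);
}                                        // both destructors run; freed exactly once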
Feb 08 2018
parent reply Mike Franklin <slavo5150 yahoo.com> writes:
On Friday, 9 February 2018 at 01:31:41 UTC, Mike Franklin wrote:
 On Thursday, 8 February 2018 at 17:10:00 UTC, bachmeier wrote:

 What are D's limitations on do-it-yourself reference counting?
* Types that are built into the language like dynamic arrays, associative arrays, and exceptions won't benefit from DIY reference counting. * Much of Phobos probably wouldn't be compatible with DIY reference counting. That being said, there may be a way to override some runtime hooks like _d_newclass (https://dlang.org/library/rt/lifetime/_d_newclass.html), etc... to make it work. But I haven't tried. Also, I think Walter is currently working on getting reference counted exceptions into the language: https://github.com/dlang/druntime/pull/1995 Mike
Also, I think DIY reference counting is already done for us in the automem library https://dlang.org/blog/2017/04/28/automem-hands-free-raii-for-d/ Mike
Feb 08 2018
parent Seb <seb wilzba.ch> writes:
On Friday, 9 February 2018 at 01:36:02 UTC, Mike Franklin wrote:
 On Friday, 9 February 2018 at 01:31:41 UTC, Mike Franklin wrote:
 On Thursday, 8 February 2018 at 17:10:00 UTC, bachmeier wrote:

 What are D's limitations on do-it-yourself reference counting?
* Types that are built into the language like dynamic arrays, associative arrays, and exceptions won't benefit from DIY reference counting. * Much of Phobos probably wouldn't be compatible with DIY reference counting. That being said, there may be a way to override some runtime hooks like _d_newclass (https://dlang.org/library/rt/lifetime/_d_newclass.html), etc... to make it work. But I haven't tried. Also, I think Walter is currently working on getting reference counted exceptions into the language: https://github.com/dlang/druntime/pull/1995 Mike
Also, I think DIY reference counting is already done for us in the automem library https://dlang.org/blog/2017/04/28/automem-hands-free-raii-for-d/ Mike
We use std.typecons.RefCounted in many places in Phobos too. There is only one big problem at the moment: it's not @safe and can only be @safe with DIP1000. This is actually blocking a lot of work - think RCString, containers, or even just a simple range which needs heap memory as state.
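A quick usage sketch with the existing std.typecons.RefCounted (hypothetical payload type, error handling omitted):

import std.typecons : RefCounted;

struct Payload { int x; }

void main()
{
    auto rc = RefCounted!Payload(7);     // payload constructed in a ref-counted block
    auto copy = rc;                      // bumps the reference count
    assert(rc.x == 7 && copy.x == 7);    // alias this forwards to the payload
}                                        // count drops to zero, payload destroyed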
Feb 09 2018
prev sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 2/8/2018 9:03 AM, Dave Jones wrote:
 If D had a decent garbage collector it might be a more convincing argument.
'Decent' GC systems rely on the compiler emitting "write gates" around every assignment to a pointer. These are justified in languages like Java and Go for which everything is GC allocated, but they would be a performance disaster for a hybrid language like D. More precise GC exacts heavy runtime penalties, too, which is why attempts to add them to D have had mixed results. I.e. it isn't an issue of us D guys being dumb about the GC.
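(For readers unfamiliar with the term, a toy sketch of what such a barrier amounts to - hypothetical code, not what druntime or any compiler actually emits: every pointer store is routed through a hook that records the mutated slot, so an incremental/generational collector can rescan only those slots.)

__gshared void*[] rememberedSet;          // hypothetical "remembered set"

void storeWithBarrier(T)(ref T* slot, T* newValue)
{
    rememberedSet ~= cast(void*) &slot;   // record the mutated location...
    slot = newValue;                      // ...then do the actual store
}

void main()
{
    int x = 1, y = 2;
    int* p = &x;
    storeWithBarrier(p, &y);              // every pointer assignment pays this cost
    assert(p == &y && rememberedSet.length == 1);
}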
 If going malloc didnt lose you a bunch of features and bring a bunch of other
stuff 
 you need to be careful of, that might be a good argument too.
With @nogc, you don't have to be careful about it. The compiler will let you know.
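A small sketch of that in action (assumed example):

@nogc void noHiddenAllocations()
{
    int[4] buf;                  // stack allocation: fine in @nogc
    buf[0] = 1;
    // auto dyn = new int[](4);  // compile error in @nogc: 'new' allocates with the GC
    // int[] arr; arr ~= 5;      // appending is rejected too: it may allocate
}

void main() { noHiddenAllocations(); }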
Feb 08 2018
next sibling parent jmh530 <john.michael.hall gmail.com> writes:
On Thursday, 8 February 2018 at 18:06:38 UTC, Walter Bright wrote:
 [snip]

 More precise GC exacts heavy runtime penalties, too, which is 
 why attempts to add them to D have had mixed results.
See, there's your problem right there. Now if you replace the current GC with the slowest possible GC you can think of and then replace that with a precise GC, all the comparisons are gonna come out roses. /s
Feb 08 2018
prev sibling next sibling parent Paulo Pinto <pjmlp progtools.org> writes:
On Thursday, 8 February 2018 at 18:06:38 UTC, Walter Bright wrote:
 On 2/8/2018 9:03 AM, Dave Jones wrote:
 If D had a decent garbage collector it might be a more 
 convincing argument.
'Decent' GC systems rely on the compiler emitting "write gates" around every assignment to a pointer. These are justified in languages like Java and Go for which everything is GC allocated, but they would be a performance disaster for a hybrid language like D. More precise GC exacts heavy runtime penalties, too, which is why attempts to add them to D have had mixed results. I.e. it isn't an issue of us D guys being dumb about the GC.
 If going malloc didnt lose you a bunch of features and bring a 
 bunch of other stuff you need to be careful of, that might be 
 a good argument too.
With nogc, you don't have to be careful about it. The compiler will let you know.
.NET, Eiffel, Modula-3 and the various Oberon variants are all examples where not everything is GC allocated - not so different from D.
Feb 08 2018
prev sibling next sibling parent reply Rubn <where is.this> writes:
On Thursday, 8 February 2018 at 18:06:38 UTC, Walter Bright wrote:
 I.e. it isn't an issue of us D guys being dumb about the GC.
So you could say it's a design flaw of D, attempting to use a GC where it isn't suited?
 If going malloc didnt lose you a bunch of features and bring a 
 bunch of other stuff you need to be careful of, that might be 
 a good argument too.
With nogc, you don't have to be careful about it. The compiler will let you know.
@nogc has issues integrating with features like delegates, but no one seems to care about that with statements like this. It's more convenient to not use @nogc than to deal with the hassles of using it.
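A sketch of the kind of friction meant here (hypothetical example): a delegate literal that captures a local normally needs a GC-allocated closure, so @nogc rejects it unless the callee takes the delegate as scope, which lets the context stay on the stack.

@nogc int twice(scope int delegate(int) @nogc dg)
{
    return dg(dg(1));
}

void main() @nogc
{
    int offset = 10;
    // Passing the lambda to a non-scope parameter would be rejected here,
    // because allocating its closure is not allowed in a @nogc function.
    assert(twice(x => x + offset) == 21);   // OK: 'scope' avoids the closure allocation
}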
Feb 08 2018
parent reply Jonathan M Davis <newsgroup.d jmdavisprog.com> writes:
On Thursday, February 08, 2018 23:57:45 Rubn via Digitalmars-d wrote:
 On Thursday, 8 February 2018 at 18:06:38 UTC, Walter Bright wrote:
 I.e. it isn't an issue of us D guys being dumb about the GC.
So you could say it's a design flaw of D, attempting to use a GC where it isn't suited?
You could say that, but many of us would not agree. Just because certain classes of GCs cannot be used with D does not mean that the fact that D has a GC built-in is not beneficial and ultimately a good design decision. Plenty of folks have been able to write very efficient code that uses D's GC. Obviously, there are use cases where it's better to avoid the GC, but for your average D program, the GC has been a fantastic asset. - Jonathan M Davis
Feb 08 2018
parent Rubn <where is.this> writes:
On Friday, 9 February 2018 at 02:09:57 UTC, Jonathan M Davis 
wrote:
 On Thursday, February 08, 2018 23:57:45 Rubn via Digitalmars-d 
 wrote:
 On Thursday, 8 February 2018 at 18:06:38 UTC, Walter Bright 
 wrote:
 I.e. it isn't an issue of us D guys being dumb about the GC.
So you could say it's a design flaw of D, attempting to use a GC where it isn't suited?
You could say that, but many of us would not agree. Just because certain classes of GCs cannot be used with D does not mean that the fact that D has a GC built-in is not beneficial and ultimately a good design decision. Plenty of folks have been able to write very efficient code that uses D's GC. Obviously, there are use cases where it's better to avoid the GC, but for your average D program, the GC has been a fantastic asset. - Jonathan M Davis
I didn't say that a GC isn't beneficial; the problem is that if you are going to be using the GC, there are plenty of other languages that implement it better. The language is designed around the GC.

Anytime I try to use an associative array, my program crashes because of the GC. The workaround I need to do is just make every associative array static. Maybe if Phobos could be built into a shared library it wouldn't be as big of a problem. But that's not the case - and before someone goes around crying that Phobos CAN be built into a shared library, remember platform matters!

You can write efficient code with Java, and that has an entire VM running between the CPU and the language. Efficiency isn't the issue. Writing code that is both GC and non-GC is extremely difficult to do correctly. It just isn't worth it at the end of the day, it complicates everything, and that is the design flaw. Having to use a complicated, inefficient GC is just a side effect of the greater issue.
Feb 09 2018
prev sibling parent Dave Jones <dave jones.com> writes:
On Thursday, 8 February 2018 at 18:06:38 UTC, Walter Bright wrote:
 On 2/8/2018 9:03 AM, Dave Jones wrote:
 If D had a decent garbage collector it might be a more 
 convincing argument.
'Decent' GC systems rely on the compiler emitting "write gates" around every assignment to a pointer. These are justified in languages like Java and Go for which everything is GC allocated, but they would be a performance disaster for a hybrid language like D. More precise GC exacts heavy runtime penalties, too, which is why attempts to add them to D have had mixed results.
When even you make excuses for the sub-standard garbage collection, how can anyone expect to use it as a positive selling point for D? That's my point: the current GC is not something that will sell many tickets to the show. Whether there are good reasons for it being so is kind of beside the point.
 I.e. it isn't an issue of us D guys being dumb about the GC.
I have no doubt about that.
 If going malloc didnt lose you a bunch of features and bring a 
 bunch of other stuff you need to be careful of, that might be 
 a good argument too.
With nogc, you don't have to be careful about it. The compiler will let you know.
I mean more the mixing of GCed and malloc'd memory. If you want to use malloc and still use language features that need the GC, you still need to be aware of and think about whether any of the malloc'd stuff needs to be registered with the GC. I.e. it's not just a case of "hey, just use malloc".
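Concretely, the bookkeeping in question is GC.addRange/removeRange from core.memory - a minimal sketch, error handling omitted: if a malloc'd block may hold pointers to GC-managed data, the collector has to be told to scan it, or it may free what those pointers still reference.

import core.memory : GC;
import core.stdc.stdlib : malloc, free;

void main()
{
    enum slots = 16;
    auto block = cast(void**) malloc(slots * (void*).sizeof);
    GC.addRange(block, slots * (void*).sizeof);   // tell the GC to scan this block
    scope (exit)
    {
        GC.removeRange(block);                    // unregister before freeing
        free(block);
    }

    auto obj = new int;                           // GC-managed allocation
    *obj = 42;
    block[0] = cast(void*) obj;                   // GC pointer stored in malloc'd memory
}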
Feb 09 2018
prev sibling next sibling parent reply JN <666total wp.pl> writes:
On Thursday, 8 February 2018 at 14:54:19 UTC, Adam D. Ruppe wrote:
 Garbage collection has proved to be a smashing success in the 
 industry, providing productivity and memory-safety to 
 programmers of all skill levels.
Citation needed on how garbage collection has been a smashing success based on its merits rather than the merits of the languages that use garbage collection. Python was also a smashing success, but it doesn't use a garbage collector in its default implementation (CPython). Unless you mean garbage collection as in "not manual memory management"? But that still might not be as simple, because RAII would fall somewhere in between.
Feb 08 2018
next sibling parent reply bachmeier <no spam.net> writes:
On Thursday, 8 February 2018 at 15:55:09 UTC, JN wrote:

 Python was also a smashing success, but it doesn't use a 
 garbage collector in it's default implementation (CPython).
I'm pretty sure CPython uses a mark-and-sweep GC together with reference counting.
Feb 08 2018
parent rjframe <dlang ryanjframe.com> writes:
On Thu, 08 Feb 2018 17:08:41 +0000, bachmeier wrote:

 On Thursday, 8 February 2018 at 15:55:09 UTC, JN wrote:
 
 Python was also a smashing success, but it doesn't use a garbage
 collector in it's default implementation (CPython).
I'm pretty sure CPython uses a mark-and-sweep GC together with reference counting.
It does. Originally it was reference-counting only, but they added the (generational) GC to clean up cyclic references. Because they do reference counting as well, you can disable the GC entirely. https://docs.python.org/3.6/library/gc.html
Feb 08 2018
prev sibling next sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 2/8/2018 7:55 AM, JN wrote:
 Citation needed on how garbage collection has been a smashing success based on 
 its merits rather than the merits of the languages that use garbage
collection. 
You can't separate the two. The Java and Go language semantics are designed around the GC.
Feb 08 2018
parent reply JN <666total wp.pl> writes:
On Thursday, 8 February 2018 at 18:08:59 UTC, Walter Bright wrote:
 On 2/8/2018 7:55 AM, JN wrote:
 Citation needed on how garbage collection has been a smashing 
 success based on its merits rather than the merits of the 
 languages that use garbage collection.
You can't separate the two. The Java and Go language semantics are designed around the GC.
I agree, however these languages would probably have been successful even without GC, using e.g. some form of automatic reference counting.
Feb 08 2018
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 2/8/2018 10:11 AM, JN wrote:
 I agree, however these languages would probably have been successful even 
 without GC, using e.g. some form of automatic reference counting.
If reference counting would work with Java, and was better, wouldn't the Java developers have done it decades ago?
Feb 08 2018
parent reply bachmeier <no spam.net> writes:
On Thursday, 8 February 2018 at 19:34:20 UTC, Walter Bright wrote:
 On 2/8/2018 10:11 AM, JN wrote:
 I agree, however these languages would probably have been 
 successful even without GC, using e.g. some form of automatic 
 reference counting.
If reference counting would work with Java, and was better, wouldn't the Java developers have done it decades ago?
The developers working on .NET had the opportunity to learn from Java, yet they went with GC.[0] Anyone that says one approach is objectively better than the other is clearly not familiar with all the arguments - or more likely, believes their problem is the only real programming problem. [0] https://blogs.msdn.microsoft.com/brada/2005/02/11/resource-management/
Feb 08 2018
next sibling parent Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Thursday, 8 February 2018 at 19:51:05 UTC, bachmeier wrote:
 The developers working on .NET had the opportunity to learn 
 from Java, yet they went with GC.[0] Anyone that says one 
 approach is objectively better than the other is clearly not 
 familiar with all the arguments - or more likely, believes 
 their problem is the only real programming problem.
Reference counting isn't a general solution, and it is very slow when you allow flexible programming paradigms that generate lots of objects. So, it all depends on how much flexibility you want to allow for your programmers and still having reasonable performance. (The vast majority of high level programming languages use GC and has done so since the 60s.)
Feb 08 2018
prev sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 2/8/2018 11:51 AM, bachmeier wrote:
 The developers working on .NET had the opportunity to learn from Java, yet
they 
 went with GC.[0] Anyone that says one approach is objectively better than the 
 other is clearly not familiar with all the arguments - or more likely,
believes 
 their problem is the only real programming problem.
 
 [0] https://blogs.msdn.microsoft.com/brada/2005/02/11/resource-management/
That really is an informative article, thanks. The only issue with it is that it doesn't cover the newer C++ ref counting model, which has proved popular.
Feb 08 2018
parent psychoticRabbit <meagain meagain.com> writes:
On Thursday, 8 February 2018 at 21:01:55 UTC, Walter Bright wrote:
 That really is an informative article, thanks. The only issue 
 with it is that it doesn't cover the newer C++ ref counting 
 model, which has proved popular.
Here is another very informative article, outlining the 'tradeoff' between program 'throughput' and 'latency': https://making.pusher.com/golangs-real-time-gc-in-theory-and-practice/ The really important part of their conclusion, however, is that there is no such thing as a 'one size fits all' GC implementation: "It is important to understand the underlying GC algorithm in order to decide whether it is appropriate for your use-case". From this, I can only conclude that integrating GC too much into the core of the language & library can be problematic for many use cases, and not so for many others.
Feb 09 2018
prev sibling next sibling parent reply Adam D. Ruppe <destructionator gmail.com> writes:
On Thursday, 8 February 2018 at 15:55:09 UTC, JN wrote:
 Citation needed on how garbage collection has been a smashing 
 success based on its merits rather than the merits of the 
 languages that use garbage collection.
Who cares? Even if the success isn't because of GC per se, the ubiquity of it in the real world means it certainly isn't a deal breaker.
Feb 08 2018
next sibling parent reply Ali <fakeemail example.com> writes:
On Thursday, 8 February 2018 at 23:27:25 UTC, Adam D. Ruppe wrote:
 On Thursday, 8 February 2018 at 15:55:09 UTC, JN wrote:
 Citation needed on how garbage collection has been a smashing 
 success based on its merits rather than the merits of the 
 languages that use garbage collection.
Who cares? Even if the success isn't because of GC per se, the ubiquity of it in the real world means it certainly isn't a deal breaker.
But D, unlike many other languages, promotes itself as primarily a system programming language https://en.wikipedia.org/wiki/System_programming_language So I would say yes, D's success does depend in very large part on the garbage collector, and on managing system resources.
Feb 08 2018
parent reply Adam D. Ruppe <destructionator gmail.com> writes:
On Thursday, 8 February 2018 at 23:50:29 UTC, Ali wrote:
 But D, unlike many other languages, promotes itself as 
 primarily a system programming language
I think that's a mistake too. I'd rebrand it as a "general purpose" programming language. One language you can use everywhere. It worked for node.js and electron... Though, of course, the GC is NOT a problem for those systems tasks. Even in the niches where it doesn't help, it doesn't actually hurt either. (In fact, the bigger problems we have in those niches are obligatory TypeInfo generation and unnecessary bloat in the runtime... implementation issues that Mike Franklin has made big progress on fixing already. And even those can be worked around; doing a serious system implementation is a bigger task than stubbing out a few functions.)
 https://en.wikipedia.org/wiki/System_programming_language
A few of those languages have GCs... and GC languages have been used for all these tasks before. It's not a dealbreaker.
Feb 08 2018
parent reply Benny <benny.luypaert rhysoft.com> writes:
On Friday, 9 February 2018 at 00:08:56 UTC, Adam D. Ruppe wrote:
 On Thursday, 8 February 2018 at 23:50:29 UTC, Ali wrote:
 But D, unlike many other languages, promotes itself as 
 primarily a system programming language
I think that's a mistake too. I'd rebrand it as a "general purpose" programming language. One language you can use everywhere. It worked for node.js and electron...
Plenty of "general purpose" programming languages. The issue being that very few offer classes, no GC, easy syntax, good tooling and editor support, ... I noticed a trend with languages with so many going to functional programming or semi-class based. From the outside D looks good but there are so many strange things in the D design, that just infuriate. - GC ... sure, if only it did not allocate so much on startup. It makes any other languages look better, by simply having a lower memory footprint on first comparison. C 0.1MB, C++ 0.2MB, Rust 0.4MB, D 1.4MB, ... Looks inefficient when its simply the whole 1MB allocation. But perception matters! - import ... really, we are 2018 and people are still wasting our time to have standard libraries as imports. Its even more fun when you split, only to need import the array library. Look how ridiculous C++ like "import std.algorithm, std.conv, std.functional, std.math, std.regex, std.stdio;" some of the example on the front page look like. I see people on Reddit sh*t all over PHP all the time and yet, its so darn easy and comfortable to not think about writing import all over the code, just to get default functionality!! Reddit is full of people who love to hate languages that simply work. - Tooling. I will say it again and again until i die, it simply sucks for Windows users. How fun is it to see dcd-server taking up between 90 to 120MB and seeing 10, 12, 15 instances loading into memory eating away 2GB memory. Or seeing VSC work with some of the plugins for 5 minutes and then break again, forcing you to constantly restart VSC. Or how competing languages seem to provide more cleaner and better working plugins, with cleaner tool tips ( source documentation ) - Even the example on the front page are so typical "scare away the newbies". It looks like a cleaner version of C++. D has always been a love/hate relationship for me. One can see the work that has gone into it but it feels like a Frankenstein's monster. Small details, big details, the lack of clear focus. BetterC just moves resources away from actually implementing a permanent solution. Instead of maintain one system, you deal with two. While default D still deals with regressions and issue, BetterC being incomplete is pushed as the next big thing. The library has design choices that date back a long time and nobody dares to touch. The whole constant GC debate is linked to those design choices. D can do a lot but the layer between both is so thin that at times you wonder if your dealing with compile or runtime features. CTFE or not. Talking about CTFE .. Stephan vanished for a long time busy with work and yet it feels reading the topics that very few people noticed him missing, despite working a year on the whole new CTFE engine. Not exactly motivating for people. I can talk until i turn blue. I already wrote "a wall of text" as some say, in the Go topic and that was not even technical issues. People talk about the need for a clear design focus, leadership and ... things go on as before. That is D in a nutshell. People doing what they want, whenever and things stay the same. New features ( that is always fun ), a few people doing to grunt work and all the rest comes down to people complaining because they see no reason to put effort into D, as it feels like a wast of time. << want to bet that this is the only thing people will quote, instead of the rest. But on-topic again: No GC, yay. Always a win because it makes a language stand out. Possible for D. NO! 
Too much design choices that limit the language. Another D3 rewrite will simply kill D. D is so tiring. Its the main reason for going with Go, simply tired of waiting. In this one+ year time watching D, i have seen blogs, betterC half finished being promoted when D is already overloaded with features and has already a higher learning curve. More regressions and bug fix releases because the new features keep breaking stuff. Some more examples on the front pages. Some nice external packages that only limited amount of people care about. And very few things to improving the issues people mentioned the year before and the year before and the year before. So again, why do people need to bother? The momentum D build up in 2016, seem to according tiobe really lost. I remember D hitting (23) 1% a year ago, now its ranking (29) 0.5%. Great another wall of text at 2.50 in the morning. Frankly, i can write a book about D issues, justified or not. It will probably read like gibberish again and ... some people will cut parts to quote, complain ... but things will stay the same. Everything feels so 90% finished, like the effort to finalyse things is always lacking. You see it everywhere in D. That is my view... disagree, fine but it does not change the outside perception of D. D needs a massive cleanup, a focus, probably a name change to get rid of the reputation... D, D never changes ( fallout reference ).
Feb 08 2018
next sibling parent psychoticRabbit <meagain megain.com> writes:
On Friday, 9 February 2018 at 01:55:10 UTC, Benny wrote:
 People talk about the need for a clear design focus, leadership 
 and ... things go on as before. That is D in a nutshell. People 
 doing what they want, whenever and things stay the same. New 
 features ( that is always fun ), a few people doing to grunt 
 work and all the rest comes down to people complaining because 
 they see no reason to put effort into D, as it feels like a 
 wast of time. << want to bet that this is the only thing people 
 will quote, instead of the rest.
D does NOT need a top-down, authoritarian, corporation-like vision imposed on it (which would solve all the issues you mention). D is an open source, meritocratic community of people who drive the project forward. Some (like you, apparently) seem to think that a lack of authoritarianism puts D at a disadvantage - I simply disagree. It may mean that (some) things progress more slowly, and the overall vision is less certain - but that's exactly how I like it. D 'emerges' from its community. It is not imposed on its community.
Feb 08 2018
prev sibling next sibling parent Suliman <evermind live.ru> writes:
 - import ... really, we are 2018 and people are still wasting 
 our time to have standard libraries as imports. Its even more 
 fun when you split, only to need import the array library.
Please explain what you mean by that?
Feb 08 2018
prev sibling next sibling parent Fra Mecca <me francescomecca.eu> writes:
On Friday, 9 February 2018 at 01:55:10 UTC, Benny wrote:

 Plenty of "general purpose" programming languages. The issue 
 being that very few offer classes, no GC, easy syntax, good 
 tooling and editor support, ...

 [...]

 D, D never changes ( fallout reference ).
Hi Benny,
I have read both of your lengthy posts and, given your harsh criticism, I decided to reply, even though I think in such cases it's better to let the flame burn out by itself.

I totally respect your choice of using Go and your problem of usability on Windows. On the other hand, I have yet to encounter the other issues you have mentioned (I don't want to belittle your considerations). When I approached D, one year ago last August, I was instantly amazed by what I consider a clear and solid design.

What I appreciate as well is the bazaar style of development. I think it is worth it in the long run because it allows D to take different directions than the ones originally envisioned. One such example is the take on backend programming (vibe and the other frameworks). Go, Elm and Java (but many others) on the other hand suffer because their "dictators" are slow or against some paradigms (generics and structured programming are one example).

BetterC may seem a half effort that may split the development, but looking at the past, the D community is full of attempts that in the end succeeded. The documentation is an example of that, or the way the @nogc story has been evolving rapidly recently.
 D is so tiring. Its the main reason for going with Go, simply 
 tired of waiting. In this one+ year time watching D, i have 
 seen blogs, betterC half finished being promoted when D is 
 already overloaded with features and has already a higher 
 learning curve.
I think the very opposite. D comes across to me as intuitive as possible. With just a solid background in C and Python I was able to understand the most important design decisions, write software more complex than before, and build abstractions (Go goes the opposite way). Even reading code written by others seems easily accomplishable (something impossible for me in C++), and many design decisions don't seem like half-assed attempts at innovation (like asyncio in Python). I don't think that any of the abstractions of the standard library or the idioms of the language add bloat, and it seems to me that I can learn each one of them day after day without losing any sanity.
 Everything feels so 90% finished, like the effort to finalyse 
 things is always lacking. You see it everywhere in D. That is 
 my view... disagree, fine but it does not change the outside 
 perception of D. D needs a massive cleanup, a focus, probably a 
 name change to get rid of the reputation...
That is my personal experience and I think that if you can be more specific and tackle some definite issues you may help this community.
Feb 09 2018
prev sibling parent reply Atila Neves <atila.neves gmail.com> writes:
On Friday, 9 February 2018 at 01:55:10 UTC, Benny wrote:
 On Friday, 9 February 2018 at 00:08:56 UTC, Adam D. Ruppe wrote:
 On Thursday, 8 February 2018 at 23:50:29 UTC, Ali wrote:
- import ... really, we are 2018 and people are still wasting our time to have standard libraries as imports. Its even more fun when you split, only to need import the array library. Look how ridiculous C++ like "import std.algorithm, std.conv, std.functional, std.math, std.regex, std.stdio;" some of the example on the front page look like.
It's easy enough to create std package like this:

module std;
public import std.algorithm;
//...

However, I'm a _huge_ fan of local imports and only importing what's needed. It helps with build times, binary sizes, and is a boon for finding where things are actually defined when DCD can't figure it out. And, of course, reduces namespace pollution.

Dependencies are usually bad. Imports are dependencies. Ergo...

Atila
Feb 09 2018
next sibling parent reply Seb <seb wilzba.ch> writes:
On Friday, 9 February 2018 at 14:11:37 UTC, Atila Neves wrote:
 On Friday, 9 February 2018 at 01:55:10 UTC, Benny wrote:
 [...]
It's easy enough to create std package like this: module std; public import std.algorithm; //... However, I'm a _huge_ fan of local imports and only importing what's needed. It helps with build times, binary sizes, and is a boon for finding where things are actually defined when DCD can't figure it out. And, of course, reduces namespace pollution. Dependencies are usually bad. Imports are dependencies. Ergo... Atila
FYI: and for the lazy ones, there will hopefully be std.experimental.scripting soon: https://github.com/dlang/phobos/pull/5916
Feb 09 2018
next sibling parent reply jmh530 <john.michael.hall gmail.com> writes:
On Friday, 9 February 2018 at 16:54:35 UTC, Seb wrote:
 FYI: and for the lazy ones, there will hopefully be 
 std.experimental.scripting soon:

 https://github.com/dlang/phobos/pull/5916
Why not make this a package.d file for std?
Feb 09 2018
parent reply Seb <seb wilzba.ch> writes:
On Friday, 9 February 2018 at 17:41:45 UTC, jmh530 wrote:
 On Friday, 9 February 2018 at 16:54:35 UTC, Seb wrote:
 FYI: and for the lazy ones, there will hopefully be 
 std.experimental.scripting soon:

 https://github.com/dlang/phobos/pull/5916
Why not make this a package.d file for std?
Yes, that's the intended goal. However, to convince everyone involved and to be able to experiment with this in the wild for a bit, we went with std.experimental first. If drawbacks get discovered, it's a lot easier to retreat.
Feb 09 2018
parent reply jmh530 <john.michael.hall gmail.com> writes:
On Friday, 9 February 2018 at 19:28:40 UTC, Seb wrote:
 Yes, that's the intended goal.
 However, to convince everyone involved and to be able to 
 experiment with this in the wild for a bit, we went with 
 std.experimental first.

 If drawbacks get discovered, it's a lot easier to retreat.
Cool. Do you know if compilation speed improves if using selective imports? E.g. import std.experimental.scripting : writeln; vs. import std.experimental.scripting; I suppose that's a general question wrt public imports, but in this case there is probably more to parse than in other smaller projects.
Feb 09 2018
parent Seb <seb wilzba.ch> writes:
On Friday, 9 February 2018 at 19:50:50 UTC, jmh530 wrote:
 On Friday, 9 February 2018 at 19:28:40 UTC, Seb wrote:
 Yes, that's the intended goal.
 However, to convince everyone involved and to be able to 
 experiment with this in the wild for a bit, we went with 
 std.experimental first.

 If drawbacks get discovered, it's a lot easier to retreat.
Cool. Do you know if compilation speed improves if using selective imports? E.g. import std.experimental.scripting : writeln; vs. import std.experimental.scripting; I suppose that's a general question wrt public imports, but in this case there is probably more to parse than in other smaller projects.
AFAICT selective imports have no impact on compilation speed at the moment. It's quite likely that future versions of the compiler will take selective imports into account, but at the moment DMD reads in the whole world and only stops at templated structs/functions/classes etc. See also: https://issues.dlang.org/show_bug.cgi?id=13255 https://issues.dlang.org/show_bug.cgi?id=18414
Feb 09 2018
prev sibling parent JN <666total wp.pl> writes:
On Friday, 9 February 2018 at 16:54:35 UTC, Seb wrote:
 FYI: and for the lazy ones, there will hopefully be 
 std.experimental.scripting soon:

 https://github.com/dlang/phobos/pull/5916
Shouldn't something like this be handled by better tooling (i.e. IDEs)? In Java you have to import every single class you want to use, but it's not a manual effort, in 99% of cases you just press Ctrl+Shift+I and IDE adds all the necessary imports. It only requires manual intervention when there's a conflict, and you get to choose between possible candidates, e.g. java.xml.Document vs java.pdf.Document.
Feb 14 2018
prev sibling parent Walter Bright <newshound2 digitalmars.com> writes:
On 2/9/2018 6:11 AM, Atila Neves wrote:
 It's easy enough to create std package like this:
 
 module std;
 public import std.algorithm;
 //...
Yes, but I suspect that'll be a large negative for compile speed for smallish programs.
Feb 09 2018
prev sibling parent psychoticRabbit <meagain meagain.com> writes:
On Thursday, 8 February 2018 at 23:27:25 UTC, Adam D. Ruppe wrote:
 On Thursday, 8 February 2018 at 15:55:09 UTC, JN wrote:
 Citation needed on how garbage collection has been a smashing 
 success based on its merits rather than the merits of the 
 languages that use garbage collection.
Who cares? Even if the success isn't because of GC per se, the ubiquity of it in the real world means it certainly isn't a deal breaker.
GC is all about time/space tradeoffs. That's all one can say about it, really. Yes, the 'ubiquity of it in the real world' (in popular and not so popular languages) suggests that most accept this tradeoff in favour of using GC. But many still don't. And many that do might decide otherwise in the future... 'cause I'm not sure how well GC really scales (in the future, the size of the heap might be terabytes... or more).

That's not an argument for not defaulting to GC in D. It's an argument for when GC in D could be a deal breaker. So it's a good thing for the D community to consider these people as well - rather than saying 'who cares'.

In the end, GC just adds to all the other bloat that's associated with programming in the modern era. The more we can reduce bloat, the -betterD. I'm glad there is a lot of research in this area, and increasingly so - that's really important, 'cause the story of automatic memory management is far from over - even in D, it seems.
Feb 08 2018
prev sibling next sibling parent reply meppl <mephisto nordhoff-online.de> writes:
On Thursday, 8 February 2018 at 15:55:09 UTC, JN wrote:
 On Thursday, 8 February 2018 at 14:54:19 UTC, Adam D. Ruppe 
 wrote:
 Garbage collection has proved to be a smashing success in the 
 industry, providing productivity and memory-safety to 
 programmers of all skill levels.
Citation needed on how garbage collection has been a smashing success based on its merits rather than the merits of the languages that use garbage collection. Python was also a smashing success, but it doesn't use a garbage collector in it's default implementation (CPython). Unless you mean garbage collection as in "not manual memory management"? But that still might not be as simple, because RAII would fall somewhere inbetween.
Let's say Python is expected to offer slow execution, so Python doesn't prove reference counting is fast (even if it is possible in theory). D, on the other hand, produces binaries that are expected to execute fast.
Feb 09 2018
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 2/9/2018 1:14 AM, meppl wrote:
 let's say python is supposed to offer slow execution. So, python doesn't prove 
 reference counting is fast (even if it is possible in theory). D on the other 
 hand provides binaries who are expected to execute fast.
I believe it has been shown (sorry, no reference) that GC is faster in aggregate time, and RC is perceived faster because it doesn't have pauses. This makes GC better for batch jobs, and RC better for interactive code. Of course, the issue can get more complex. GC uses 3x the memory of RC, and so you can get extra slowdowns from swapping and cache misses.
Feb 09 2018
next sibling parent Dukc <ajieskola gmail.com> writes:
On Friday, 9 February 2018 at 21:24:14 UTC, Walter Bright wrote:
 Of course, the issue can get more complex. GC uses 3x the 
 memory of RC, and so you can get extra slowdowns from swapping 
 and cache misses.
Is the total memory consumption tripled, or only the extra memory used for tracking allocations?
Feb 10 2018
prev sibling parent reply Dmitry Olshansky <dmitry.olsh gmail.com> writes:
On Friday, 9 February 2018 at 21:24:14 UTC, Walter Bright wrote:
 On 2/9/2018 1:14 AM, meppl wrote:
 let's say python is supposed to offer slow execution. So, 
 python doesn't prove reference counting is fast (even if it is 
 possible in theory). D on the other hand provides binaries who 
 are expected to execute fast.
I believe it has been shown (sorry, no reference) that GC is faster in aggregate time, and RC is perceived faster because it doesn't have pauses.
RC is a form of GC. Also, tracing GCs with pause times under 1ms are in production for several languages now.
 This makes GC better for batch jobs, and RC better for 
 interactive code.
Yes, GCs with lower pause times sacrifice throughput for low latency. RC included.
 Of course, the issue can get more complex. GC uses 3x the 
 memory of RC,
I’ve seen figures of about x2 but that was in an old paper on Boehm GC.
 and so you can get extra slowdowns from swapping
Oh come on... anything touching swap is usually frozen these days. Plus, heap size is usually statically bounded for GC languages, chosen not to grow beyond RAM.
 and cache misses.
Feb 10 2018
next sibling parent reply Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
On 2/10/18 10:14 AM, Dmitry Olshansky wrote:
 On Friday, 9 February 2018 at 21:24:14 UTC, Walter Bright wrote:
 Of course, the issue can get more complex. GC uses 3x the memory of RC,
 I’ve seen figures of about x2 but that was in an old paper on Boehm GC.
This is the classic reference: https://people.cs.umass.edu/~emery/pubs/gcvsmalloc.pdf. Executive review in the abstract: "With only three times as much memory, the collector runs on average 17% slower than explicit memory management. However, with only twice as much memory, garbage collection degrades performance by nearly 70%. When physical memory is scarce, paging causes garbage collection to run an order of magnitude slower than explicit memory management." -- Andrei
Feb 10 2018
parent Dmitry Olshansky <dmitry.olsh gmail.com> writes:
On Saturday, 10 February 2018 at 18:40:43 UTC, Andrei 
Alexandrescu wrote:
 On 2/10/18 10:14 AM, Dmitry Olshansky wrote:
 On Friday, 9 February 2018 at 21:24:14 UTC, Walter Bright 
 wrote:
 Of course, the issue can get more complex. GC uses 3x the 
 memory of RC,
 I’ve seen figures of about x2 but that was in an old paper on Boehm GC.
This is the classic reference: https://people.cs.umass.edu/~emery/pubs/gcvsmalloc.pdf. Executive review in the abstract: "With only three times as much memory, the collector runs on average 17% slower than explicit memory management.
Reading the whole paper is a tad more important. In particular, "manual" memory management is aided by a precomputed trace of lifetimes, without any bookkeeping performed by the application:

 "Oracular memory management framework. As Figure 1(a) shows, it first executes the Java program to calculate object lifetimes and generate the program heap trace. The system processes the program heap trace using the Merlin algorithm to compute object reachability times and generate the reachability-based oracle. [...] Using these oracles, the oracular memory manager executes the program as shown in Figure 1(b), allocating objects using calls to malloc and invoking free on objects when directed by the oracle."

Plus - single threaded only (e.g. parallel GC is a thing):

 "In the experiments we present here, we assume a single-processor environment and disable atomic operations both for Jikes RVM and for the Lea allocator. In a multithreaded environment, most thread-safe memory allocators also require at least one atomic operation for every call to malloc and free: a test-and-set operation for lock-based allocators..."
 However, with only twice as much memory, garbage collection 
 degrades performance by nearly 70%. When physical memory is 
 scarce, paging causes garbage collection to run an order of 
 magnitude slower than explicit memory management." -- Andrei
Feb 10 2018
prev sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 2/10/2018 7:14 AM, Dmitry Olshansky wrote:
 RC is a form of GC.
Pedantically, yes. But common usage regards the two as disjoint, and it's inconvenient to treat RC as a subset of GC when discussing tradeoffs between the two. Nobody bothers with s/GC/GC excluding RC/.
Feb 10 2018
parent Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
On 2/10/18 4:41 PM, Walter Bright wrote:
 On 2/10/2018 7:14 AM, Dmitry Olshansky wrote:
 RC is a form of GC.
Pedantically, yes. But common usage regards the two as disjoint, and it's inconvenient to treat RC as a subset of GC when discussing tradeoffs between the two. Nobody bothers with s/GC/GC excluding RC/.
"Tracing GC" is the common way of referring to GC techniques outside of reference counting. -- Andrei
Feb 11 2018
prev sibling parent reply Timon Gehr <timon.gehr gmx.ch> writes:
On 08.02.2018 16:55, JN wrote:
 On Thursday, 8 February 2018 at 14:54:19 UTC, Adam D. Ruppe wrote:
 Garbage collection has proved to be a smashing success in the 
 industry, providing productivity and memory-safety to programmers of 
 all skill levels.
Citation needed on how garbage collection has been a smashing success based on its merits rather than the merits of the languages that use garbage collection. Python was also a smashing success, but it doesn't use a garbage collector in it's default implementation (CPython). Unless you mean garbage collection as in "not manual memory management"? ...
Even if "garbage collection" is taken to mean "collecting garbage", reference counting is garbage collection. Referring to RC as not GC makes no sense at all and was probably only invented because some people want to think that RC is good but GC is bad, being too lazy to say "tracing GC".
Feb 10 2018
parent reply Jonathan M Davis <newsgroup.d jmdavisprog.com> writes:
On Saturday, February 10, 2018 14:06:09 Timon Gehr via Digitalmars-d wrote:
 On 08.02.2018 16:55, JN wrote:
 On Thursday, 8 February 2018 at 14:54:19 UTC, Adam D. Ruppe wrote:
 Garbage collection has proved to be a smashing success in the
 industry, providing productivity and memory-safety to programmers of
 all skill levels.
Citation needed on how garbage collection has been a smashing success based on its merits rather than the merits of the languages that use garbage collection. Python was also a smashing success, but it doesn't use a garbage collector in it's default implementation (CPython). Unless you mean garbage collection as in "not manual memory management"? ...
Even if "garbage collection" is taken to mean "collecting garbage", reference counting is garbage collection. Referring to RC as not GC makes no sense at all and was probably only invented because some people want to think that RC is good but GC is bad, being too lazy to say "tracing GC".
Except that RC and what folks typically mean when they talk about GC are fundamentally different. Yes, they both automatically free memory for you, but one is deterministic, whereas the other involves periodically running a collection to find memory that can be freed. So, yes, in a sense, RC is a form of GC, but they're very different beasts. - Jonathan M Davis
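A rough sketch of that distinction, using nothing beyond std.typecons.RefCounted and the GC heap (the function names are made up purely for illustration):

// Deterministic release (RC) vs. collected-later release (tracing GC).
import std.typecons : RefCounted;

struct Payload { int x; }

void deterministic()
{
    auto rc = RefCounted!Payload(42);   // payload allocated, constructed with x = 42
    // ... use rc.x ...
}   // last reference dropped here: the payload is freed deterministically, right now

void collected()
{
    auto p = new Payload;   // same data, but on the GC heap
    p.x = 42;
    // ... use p.x ...
}   // p merely becomes unreachable here; it is reclaimed whenever a later collection runs

void main()
{
    deterministic();
    collected();
}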
Feb 10 2018
parent Paulo Pinto <pjmlp progtools.org> writes:
On Saturday, 10 February 2018 at 19:22:51 UTC, Jonathan M Davis 
wrote:
 On Saturday, February 10, 2018 14:06:09 Timon Gehr via 
 Digitalmars-d wrote:
 On 08.02.2018 16:55, JN wrote:
 On Thursday, 8 February 2018 at 14:54:19 UTC, Adam D. Ruppe 
 wrote:
 Garbage collection has proved to be a smashing success in 
 the industry, providing productivity and memory-safety to 
 programmers of all skill levels.
Citation needed on how garbage collection has been a smashing success based on its merits rather than the merits of the languages that use garbage collection. Python was also a smashing success, but it doesn't use a garbage collector in it's default implementation (CPython). Unless you mean garbage collection as in "not manual memory management"? ...
Even if "garbage collection" is taken to mean "collecting garbage", reference counting is garbage collection. Referring to RC as not GC makes no sense at all and was probably only invented because some people want to think that RC is good but GC is bad, being too lazy to say "tracing GC".
Except that RC and what folks typically mean when they talk about GC are fundamentally different. Yes, they both automatically free memory for you, but one is deterministic, whereas the other involves periodically running a collection to find memory that can be freed. So, yes, in a sense, RC is a form of GC, but they're very different beasts. - Jonathan M Davis
People like to think that RC is deterministic. First of all, unless the counter updates are atomic, there is no guarantee on pause times while locking around counter access. Second, Herb Sutter has a great CppCon talk about non-deterministic releases, with the possibility of stack overflow, in complex data structures: Herb Sutter, "Leak-Freedom in C++... By Default." https://www.youtube.com/watch?v=JfmTagWcqoE
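The cascading-release hazard is easy to picture with a deliberately naive sketch. These hand-rolled reference counts are not how any real RC library is written; they only show why dropping the last reference to a list head can recurse once per node.

// Naive manual reference counting over a singly linked list.
import core.stdc.stdlib : malloc, free;

struct Node
{
    Node*  next;
    size_t refs = 1;
}

void release(Node* n)
{
    if (n is null) return;
    if (--n.refs == 0)
    {
        release(n.next);   // one stack frame per node: deep chains can overflow the stack
        free(n);
    }
}

void main()
{
    // Build a chain; make it long enough and the release below recurses that many times.
    Node* head = null;
    foreach (i; 0 .. 10_000)
    {
        auto n = cast(Node*) malloc(Node.sizeof);
        *n = Node(head);   // next = previous head, refs = 1
        head = n;
    }
    release(head);
}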
Feb 10 2018
prev sibling next sibling parent Michael <michael toohuman.io> writes:
On Thursday, 8 February 2018 at 14:54:19 UTC, Adam D. Ruppe wrote:
 On Thursday, 8 February 2018 at 11:06:15 UTC, ixid wrote:
 It feels like D has not overcome at least two major issues in 
 the public mind, the built-in GC
D is a pragmatic language aimed toward writing fast code, fast. Garbage collection has proved to be a smashing success in the industry, providing productivity and memory-safety to programmers of all skill levels. D's GC implementation follows in the footsteps of industry giants without compromising expert's ability to tweak even further. That's what we should be saying every single time someone mentions GC. Including it was the RIGHT DECISION and we should own that.
Yes, absolutely! It's the reason I chose to start writing programs in D: I had a background in C and Java, and wanted a fast, compiled language that would take care of the details for me. You can write programs quickly, they run quickly enough, and they can of course be tuned further by managing the GC's allocations and collections, etc.
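For concreteness, that kind of tuning needs nothing beyond core.memory's documented GC.disable/GC.enable/GC.collect; a minimal sketch:

// Keep collections out of a latency-sensitive section, then pay for them
// at a moment of our choosing.
import core.memory : GC;

void hotPath()
{
    GC.disable();   // no automatic collections will interrupt the loop below
    foreach (i; 0 .. 100_000)
    {
        auto tmp = new int[](64);   // allocations still work; garbage just accumulates
        tmp[0] = i;
        // ... latency-sensitive work ...
    }
    GC.enable();    // allow automatic collections again
    GC.collect();   // and pay for the accumulated garbage now, at a chosen point
}

void main()
{
    hotPath();
}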
Feb 08 2018
prev sibling next sibling parent Jonathan M Davis <newsgroup.d jmdavisprog.com> writes:
On Thursday, February 08, 2018 14:54:19 Adam D. Ruppe via Digitalmars-d 
wrote:
 On Thursday, 8 February 2018 at 11:06:15 UTC, ixid wrote:
 It feels like D has not overcome at least two major issues in
 the public mind, the built-in GC
D is a pragmatic language aimed toward writing fast code, fast. Garbage collection has proved to be a smashing success in the industry, providing productivity and memory-safety to programmers of all skill levels. D's GC implementation follows in the footsteps of industry giants without compromising expert's ability to tweak even further. That's what we should be saying every single time someone mentions GC. Including it was the RIGHT DECISION and we should own that.
+10000000000000000000 - Jonathan M Davis
Feb 08 2018
prev sibling parent "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Thu, Feb 08, 2018 at 12:17:06PM -0700, Jonathan M Davis via Digitalmars-d
wrote:
 On Thursday, February 08, 2018 14:54:19 Adam D. Ruppe via Digitalmars-d 
 wrote:
[...]
 Garbage collection has proved to be a smashing success in the
 industry, providing productivity and memory-safety to programmers of
 all skill levels. D's GC implementation follows in the footsteps of
 industry giants without compromising expert's ability to tweak even
 further.



 That's what we should be saying every single time someone mentions
 GC. Including it was the RIGHT DECISION and we should own that.
+10000000000000000000
[...]

/// ditto. :-P

While I agree that we *should* make D as usable as possible for those who don't want to use the GC, all too often that belies the benefits that having a GC actually brings. It's true that the current GC could be improved, and that we could reduce GC-dependence in Phobos, provide better @nogc support, etc. But we should not apologize for *having* a GC, as if it was somehow a wrong decision.

I think it's *great* to have a GC. It has saved me *so* much time, energy, and frustration that would have been spent obsessing over memory management every other line of code I write; now I can instead direct that energy towards actually solving stuff in the problem domain that is the entire purpose of the code in the first place. And for those times when performance is an issue, GC.disable and GC.collect have proven sufficient to clear the bottleneck in 95% of the cases.

And besides, D doesn't stop you from dropping back to malloc/free if you really need to. Or, for that matter, RefCounted.

T

--
If you want to solve a problem, you need to address its root cause, not just its symptoms. Otherwise it's like treating cancer with Tylenol...
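A small sketch of what that malloc/free fallback can look like in otherwise ordinary D code. The Buffer type and helper names below are made up purely for illustration; only core.stdc.stdlib and core.exception are assumed.

// Manual allocation next to GC code in the same program.
import core.stdc.stdlib : malloc, free;
import core.exception : onOutOfMemoryError;

struct Buffer   // hypothetical helper type, for illustration only
{
    ubyte* ptr;
    size_t len;
}

Buffer allocBuffer(size_t n)
{
    auto p = cast(ubyte*) malloc(n);
    if (p is null) onOutOfMemoryError();
    return Buffer(p, n);
}

void freeBuffer(ref Buffer b)
{
    free(b.ptr);
    b = Buffer.init;
}

void main()
{
    auto b = allocBuffer(4096);      // manual, deterministic allocation
    scope(exit) freeBuffer(b);

    auto names = ["malloc", "GC"];   // GC-backed slice in the same function, no conflict
}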
Feb 08 2018
prev sibling parent Jonathan M Davis <newsgroup.d jmdavisprog.com> writes:
On Thursday, February 08, 2018 11:28:52 H. S. Teoh via Digitalmars-d wrote:
 On Thu, Feb 08, 2018 at 12:17:06PM -0700, Jonathan M Davis via 
Digitalmars-d wrote:
 On Thursday, February 08, 2018 14:54:19 Adam D. Ruppe via Digitalmars-d
 wrote:
[...]
 Garbage collection has proved to be a smashing success in the
 industry, providing productivity and memory-safety to programmers of
 all skill levels. D's GC implementation follows in the footsteps of
 industry giants without compromising expert's ability to tweak even
 further.



 That's what we should be saying every single time someone mentions
 GC. Including it was the RIGHT DECISION and we should own that.
+10000000000000000000
[...] /// ditto. :-P While I agree that we *should* make D as usable as possible for those who don't want to use the GC, all too often that belies the benefits that having a GC actually brings. It's true that the current GC could be improved, and that we could reduce GC-dependence in Phobos, provide better nogc support, etc.. But we should not apologize for *having* a GC, as if it was somehow a wrong decision. I think it's *great* to have a GC. It has saved me *so* much time, energy, and frustration that would have been spent obsessing over memory management every other line of code I write; now I can instead direct that energy towards actually solving stuff in the problem domain that is the entire purpose of the code in the first place. And for those times when performance is an issue, GC.disable and GC.collect have proven sufficient to clear the bottleneck in 95% of the cases. And besides, D doesn't stop you from dropping back to malloc/free if you really need to. Or, for that matter, RefCounted.
I am completely fine with making more features pay-as-you-go so long as it doesn't require me to change any existing code (e.g. I shouldn't have to import the GC - but if no code in your program invokes the GC and that results in the GC not being linked in, that's fine with me). But whenever I see folks trying to push -betterC as the way to go or push to get the GC out of Phobos, I start getting worried about that negatively affecting normal D code.

I totally agree that there are times when you don't want something on the GC heap, and there are times when you need to do stuff like reference-counting (e.g. for OS-level resources that need to be released deterministically), but on the whole, having the GC is fantastic, and for most stuff, it works wonderfully.

We should strive to minimize the cost of nice stuff like the GC so that it's as much pay-as-you-go as is reasonable, but at some point, if you're not careful, you start losing out on nice features in your attempt to appease the folks who think that they can't afford the GC in their environment (whether they actually can or not). And I would much rather see folks have to go to a bit of extra work to turn off something that most programs are going to benefit from than to make it harder for your average D program to take advantage of all of D's great features.

- Jonathan M Davis
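That per-function middle ground already exists today; a minimal sketch of @nogc alongside plain GC-using code:

// Opting a single function out of GC allocation with @nogc, while the
// rest of the program keeps using GC-backed features unchanged.
@nogc nothrow pure
int sum(const(int)[] xs)
{
    int total = 0;
    foreach (x; xs)
        total += x;   // the compiler rejects any GC allocation inside this body
    return total;
}

void main()
{
    auto xs = [1, 2, 3, 4];   // ordinary GC-allocated array literal
    assert(sum(xs) == 10);
}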
Feb 08 2018