
digitalmars.D - Why is the nogc crowd labeled as alarmists?!?!

reply "Frustrated" <Who where.com> writes:
Are those who say the GC is fine and works for 90-95% of apps 
without issue just ignorant? Or are they arrogant?

When one is writing a real-time app and has to minimize the 
chance of losing control, a stop-the-world (STW) GC is simply 
not allowed in such apps.

This is the argument for the GC: So, you write a surveillance 
app that captures a frame every second. The GC kicks in once an 
hour and pauses the app for half a second. That's great! No big 
deal. Half a second in an hour is just 1/7200th of the time... 
far less than 1% of the app's run time is used by the GC. 
AMAZING!!! No one will notice!

Of course. Are they just too stupid? Or do they simply not care 
about any applications other than the ones they are writing? 
Seriously, which is it?

Take an audio app that is used to record a band. Same scenario. 
Ok, right? GC isn't a problem! No one will notice the glitches!

What about an online first-person shooter written in D? OK too!! 
Who will care when the game lags at that critical moment when 
you are in the heat of battle? So unlikely that the GC will 
cause any problems.

So, this is the way I see it:

There are some really arrogant people out there. They do not 
write critical real-time apps. They write stuff like 
writeln("I'm a fu$%ing cool programmer!! Look how awesome I 
am"). They never see the GC cause any issues, so it MUST not 
cause issues (which is where the ignorance comes in).

They don't want anything changed because it works for them and 
they are afraid it will require more work.

Anyway, not that this rant will do any good, but I'm getting 
sick and tired of the pathetic argument that since the GC is 
fine for MOST people, [it is fine for ALL people].

D has to decide what it wants to be able to do. If it wants to 
be held back by a GC simply because it's too much work to get it 
done right (ARC, MMM, or whatever), then so be it. But at least 
decide on something absolute and let it be known, so the 
lemmings stop using this ridiculous logic that "the GC is great, 
no one needs anything better [because I'm great and I use the GC 
and it works just fine]".

Of course... if all your programs are under 640kB, then maybe 
that GC never kicks in in the first place?!?!
Jul 17 2014
next sibling parent reply "Daniel Murphy" <yebbliesnospam gmail.com> writes:
"Frustrated"  wrote in message news:vdtunbkrdyyxnmqcgmmv forum.dlang.org... 

 Are those that say the GC is fine and works for 90-95% of apps 
 without issue just ignorant? Or are they arrogant?

 [...]
You sound frustrated.
Jul 17 2014
parent "Dicebot" <public dicebot.lv> writes:
Because applications with hard real-time requirements make up 
hardly more than 5% of the whole software industry. And those 
almost never use "standard" tools/libraries.
Jul 17 2014
prev sibling next sibling parent reply Ary Borenszweig <ary esperanto.org.ar> writes:
On 7/17/14, 3:13 PM, Frustrated wrote:
 Are those that say the GC is fine and works for 90-95% of apps without
 issue just ignorant? Or are they arrogant?
We probably do webapps and other stuff that is not real-time. A GC there works just fine.

Now, if you compare the number of audio apps, surveillance apps and real-time games with the number of webapps out there, or the number of command-line tools, or text editors (Sublime Text is done in Python, I think), or web services, or background jobs... I would conclude that 90-95% is a pretty good guess.

For that other 5% you can use C, C++ or Rust, but be prepared to deal with hard languages.

So, you are right: D has to choose what it wants to cover: that 5%, that 95%, or both (at the expense of becoming a really difficult language to use).
Jul 17 2014
parent reply "Hannes Steffenhagen" <cubicentertain gmail.com> writes:
On Thursday, 17 July 2014 at 18:28:30 UTC, Ary Borenszweig wrote:
 On 7/17/14, 3:13 PM, Frustrated wrote:
 Are those that say the GC is fine and works for 90-95% of apps 
 without
 issue just ignorant? Or are they arrogant?
 We probably do webapps and other stuff that is not real-time. A GC there works just fine. [...] D has to choose what it wants to cover: that 5%, that 95%, or both (at the expense of becoming a really difficult language to use).
Last time I checked, D was advertised as a systems programming language and a real alternative to C/C++. I think we're already well served by languages that cover the needs of web application developers; that 5% is where most people interested in D would be coming from.

Not that the web application thing is even entirely true: if you have huge workloads, you'll eventually want to take more control than managed systems give you.
Jul 17 2014
parent "Dicebot" <public dicebot.lv> writes:
On Thursday, 17 July 2014 at 20:52:15 UTC, Hannes Steffenhagen 
wrote:
 Last time I checked, D was advertised as a systems programming 
 language and a real alternative to C/C++. I think we're good 
 for languages that cover the needs of web application 
 developers, that 5% is where most people interested in D would 
 be coming from.

 Not that the web application thing is even entirely true; If 
 you have huge workloads you'll eventually want to take more 
 control than managed systems give you.
And with the recent work in LDC and GDC, D is quite capable of it, as Mike's DConf talk brilliantly proves. You may even use some parts of Phobos with the help of @nogc!

There are certain parts of the language that trigger the GC when it is entirely possible to avoid that, and we should fight those, no doubt. But that does not mean removing the GC from the core language.

And for any kind of web application the GC is not a problem if it is a concurrent GC (which is something to improve, but unrelated to the "GC is evil" topic). Just the fact that your application _may_ generate a lot of garbage does not mean you have to. Use custom allocators for most data and experience something Java can't give you.

Now, if you want to have _both_ automatic memory management _and_ removal of the GC, there will be some trouble. But that is something C/C++ can't give you either, so mentioning it as a transition blocker is hardly a good point.

Right now D is probably about 20% behind C++ feature-wise for the RAII / manual memory management coding style, and this gap is compensated by templates alone. After GDC (or was it LDC?) gets the planned switches to remove RTTI-related stuff, I'd honestly recommend anyone doing a low-level project give it a try.
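To make the @nogc part concrete, here is a minimal sketch (assuming a compiler recent enough to support the attribute; the function name is made up for illustration):

```d
// @nogc makes any GC allocation inside the function a compile-time
// error, so real-time code can be statically guaranteed pause-free.
@nogc nothrow pure
int sumSquares(const(int)[] xs)
{
    int total = 0;
    foreach (x; xs)
        total += x * x;   // plain arithmetic: no heap traffic allowed here
    return total;
}

void main()
{
    assert(sumSquares([1, 2, 3]) == 14);  // main itself need not be @nogc
}
```

Anything that would allocate from the GC - say xs ~ xs, or a delegate literal that needs a closure - is rejected by the compiler inside such a function.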
Jul 17 2014
prev sibling next sibling parent reply Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
On 7/17/14, 11:13 AM, Frustrated wrote:
 D has to decide what it wants to be able to do. If it wants to be held
 back by a GC simple because it's too much work to get it done right(ARC,
 MMM, or whatever) then so be it. But at least decide on something
 absolute and let it be known so the lemmings stop using this ridiculous
 logic that the "GC is great, no one needs anything better [because I'm
 great and I use the GC and it works just fine]".
To paraphrase a common phrase used among Facebook engineers: 
"Nothing in D is someone else's problem."

If GC is a problem for you, you can be sure your solid work on it will be acknowledged and adopted.

Andrei
Jul 17 2014
parent "Jonathan M Davis" <jmdavisProg gmx.com> writes:
On Thursday, 17 July 2014 at 18:57:51 UTC, Andrei Alexandrescu 
wrote:

 To paraphrase a common phrase used among Facebook engineers: 
 "Nothing in D is someone else's problem."
That would make a good motto.

- Jonathan M Davis
Jul 20 2014
prev sibling next sibling parent "Paulo Pinto" <pjmlp progtools.org> writes:
On Thursday, 17 July 2014 at 18:13:18 UTC, Frustrated wrote:
 Are those that say the GC is fine and works for 90-95% of apps 
 without issue just ignorant? Or are they arrogant?
...
We probably had the fortune of using operating systems written in GC-enabled systems programming languages before the UNIX/Windows duo took over desktop computing, and learned that another way was possible.

--
Paulo
Jul 18 2014
prev sibling parent reply "Kagamin" <spam here.lot> writes:
On Thursday, 17 July 2014 at 18:13:18 UTC, Frustrated wrote:
 Are those that say the GC is fine and works for 90-95% of apps 
 without issue just ignorant? Or are they arrogant?

 When one is writing a real-time app and has to minimize the 
 chance of losing control, a stop-the-world (STW) GC is simply 
 not allowed in such apps.
D works fine without GC for me. What problems do you have?
Jul 18 2014
parent reply "Dominikus Dittes Scherkl" writes:
On Friday, 18 July 2014 at 13:17:34 UTC, Kagamin wrote:
 On Thursday, 17 July 2014 at 18:13:18 UTC, Frustrated wrote:
 Are those that say the GC is fine and works for 90-95% of apps 
 without issue just ignorant? Or are they arrogant?

 When one is writing a real-time app and has to minimize the 
 chance of losing control, a stop-the-world (STW) GC is simply 
 not allowed in such apps.
D works fine without GC for me. What problems do you have?
For me also. The cool thing about D is: you can use it like a scripting language at first, and the GC (plus all the other nice features like unit tests, asserts, etc.) keeps you from bothering with stupid bugs and implementation details that are only relevant for maximum performance.

And afterwards, if it comes to RT (real-time), the first thing I throw out is all the MMI stuff (man-machine interface), e.g. everything dealing with strings. And that's about 98% of all functions that use the GC in my code. The very little rest is things like exceptions, delegates and closures - because I have no idea how to use them with manual memory management. So unfortunately I have to avoid them in RT code.

But what remains is still far, far better than what C offered. And to make that clear: nothing else was usable for embedded programming before D. No C++, no Java, nothing at all.
Jul 18 2014
next sibling parent "bearophile" <bearophileHUGS lycos.com> writes:
Dominikus Dittes Scherkl:

 The very little rest is things like exceptions, delegates and 
 closures - because I have no idea how to use them with manual 
 memory management. So unfortunately I have to avoid them in RT 
 code.
Some closures can be avoided with "scope", and some exception allocations can be moved to where they don't harm. This doesn't solve all problems, but it improves the situation a little (hard-RT code should avoid exceptions).

Bye,
bearophile
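A minimal sketch of the "scope" trick (the function and variable names here are made up for illustration):

```d
// 'scope' promises that 'dg' will not outlive this call, so the
// compiler can leave the captured variables on the caller's stack
// instead of heap-allocating a GC closure for the delegate literal.
void each(int n, scope void delegate(int) dg)
{
    foreach (i; 0 .. n)
        dg(i);
}

void main()
{
    int sum = 0;
    each(4, (i) { sum += i; });  // captures 'sum'; no GC closure needed
    assert(sum == 6);            // 0 + 1 + 2 + 3
}
```

For the exception side, one common idiom is to construct the exception object once, outside the hot path, and throw that same preallocated instance from the RT code.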
Jul 18 2014
prev sibling parent reply "Paulo Pinto" <pjmlp progtools.org> writes:
On Friday, 18 July 2014 at 13:53:14 UTC, Dominikus Dittes Scherkl 
wrote:
 [...]

 And to make that clear: nothing else was usable for embedded 
 programming before D. No C++, no Java, nothing at all.
Ada and Modula-2?
Jul 18 2014
parent "jackdeath" <jackdeath mind.com> writes:
:)) how true

On Friday, 18 July 2014 at 14:25:54 UTC, Paulo Pinto wrote:
 [...]

 nothing else was usable for embedded programming before D. No 
 C++, no Java, nothing at all.
Ada and Modula-2?
Jul 18 2014