
digitalmars.D - Why don't you advertise your language more on Quora etc.?

reply Jared Jeffries <jared.jeffries yandex.com> writes:
I've read the answers to questions like "Which is the best 
programming language to learn in 2017?".

Nobody said anything about D, which is really sad, because 
in my opinion D could be one of the best answers to this question.

I've answered this question. Better late than never.

I suggest that other "happy users" of this language do the same...
Feb 28 2017
next sibling parent reply "Nick Sabalausky (Abscissa)" <SeeWebsiteToContactMe semitwist.com> writes:
On 02/28/2017 06:29 PM, Jared Jeffries wrote:
 I've read the answers to questions like "Which is the best programming
 language to learn in 2017?".

 Nobody said anything about D, which is really sad, because in my
 opinion D could be one of the best answers to this question.

 I've answered this question. Better late than never.

 I suggest that other "happy users" of this language do the same...
What's Quora? (It's really hard to always keep on top of all the latest trendy sites/appz/whatever. It's all so fly-by-night.)
Mar 01 2017
next sibling parent reply Jared Jeffries <jared.jeffries yandex.com> writes:
It's one of those forum websites where you ask a question, 
experts give their advice, and people vote for the best answer.

You can find them by searching "best programming language 2017" 
on Google, for instance.

I suggest that D lovers answer these kinds of questions on 
these websites as often as possible.

For instance :
- best programming language
- best programming language for ...
- best programming language to learn
- etc

It's needed because I didn't know that D existed until somebody 
advised me to try it, and now it's my favorite language.

I'm now studying Java and C++ because I have to, but I'll still 
continue to use D and promote it.

I really learned a lot from D. It's probably the best "first 
programming language to learn" that one can find, much better 
than the mainstream languages usually recommended.

It should be advertised a lot for that, because it's probably the 
best one can find at the moment.

And it's also probably easier to convince new programmers like me 
to learn D as a first object-oriented language than to convince 
people who have been using other similar languages for years or 
decades.
Mar 01 2017
parent reply Mike Parker <aldacron gmail.com> writes:
On Wednesday, 1 March 2017 at 10:20:45 UTC, Jared Jeffries wrote:

 I suggest that D lovers answer these kinds of questions on 
 these websites as often as possible.
Plenty of people do, particularly on reddit, StackOverflow, Hacker News, and whatever forums & communities they tend to hang out at (e.g. gamedev.net). If there's an absence of such at Quora, it's just because none of the vocal D users are using it.
Mar 01 2017
parent thedeemon <dlang thedeemon.com> writes:
On Wednesday, 1 March 2017 at 13:53:17 UTC, Mike Parker wrote:

 Plenty of people do, particularly on reddit, StackOverflow, 
 Hacker News, and whatever forums & communities they tend to 
 hang out at (e.g. gamedev.net). If there's an absence of such 
 at Quora, it's just because none of the vocal D users are using 
 it.
Actually, in the case of Quora, the way it works is unlike Reddit or this forum, where you see all recent topics: Quora only shows you some bits it decides would be "interesting" for you. So if the first time you came to Quora you read about alpacas, later it will mostly show you questions and answers about alpacas and not about learning programming languages.

I usually don't see any questions like those mentioned above; when I come to Quora I mostly see stuff about particle physics and black holes, and only very rarely questions where I could mention D.
Mar 02 2017
prev sibling next sibling parent Jared Jeffries <jared.jeffries yandex.com> writes:
Here is a link to my answer :

https://www.quora.com/Which-is-the-best-programming-language-to-learn-in-2017/answer/Jared-Jeffries-4?prompt_topic_bio=1
Mar 01 2017
prev sibling next sibling parent reply Craig Dillabaugh <craig.dillabaugh gmail.com> writes:
On Wednesday, 1 March 2017 at 08:12:05 UTC, Nick Sabalausky 
(Abscissa) wrote:
 On 02/28/2017 06:29 PM, Jared Jeffries wrote:

 What's Quora?

 (It's really hard to always keep on top of all the latest trendy 
 sites/appz/whatever. It's all so fly-by-night.)
Quora is a general Q&A forum and hang-out for narcissistic know-it-alls. In fairness, the answers are typically of high quality, and there are definitely some smart folks on there. But I get turned off by folks who do things like list their IQ or Mensa membership in their personal profiles, or post about their sexual exploits.

However the forum does often answer some highly important and relevant questions. My favorite so far was the "If I wanted to jump from an airplane flying at 30,000 feet with nothing but bubble wrap for protection, how much would I need?"

Since you are no doubt curious, I believe the answer was you would need to be wrapped in a ball about 4m in radius. They never explained how you would get that on the plane though.
Mar 01 2017
parent "Nick Sabalausky (Abscissa)" <SeeWebsiteToContactMe semitwist.com> writes:
On 03/01/2017 08:35 AM, Craig Dillabaugh wrote:
 However the forum does often answer some highly important and relevant
 questions.  My favorite so far was the "If I wanted to jump from an
 airplane flying at 30,000 feet with nothing but bubble wrap for
 protection, how much would I need?"


 Since you are no doubt curious, I believe the answer was you would need
 to be wrapped in a ball about 4m in radius.  They never explained how
 you would get that on the plane though.
Hah :) Sounds very xkcd.
Mar 01 2017
prev sibling parent reply xenon325 <anm programmer.net> writes:
On Wednesday, 1 March 2017 at 08:12:05 UTC, Nick Sabalausky 
(Abscissa) wrote:
 What's Quora?

 (It's really hard to always keep on top of all the latest trendy 
 sites/appz/whatever. It's all so fly-by-night.)
I dismissed Quora the first time I saw it, about 5 years ago, but last year I discovered there are some super-cool folks over there:

* https://www.quora.com/profile/Peter-Norvig
* https://www.quora.com/profile/Xavier-Amatriain
Mar 02 2017
parent Swoorup Joshi <swoorupjoshi gmail.com> writes:
Why not advertise? Because deterministic memory management is 
lagging, meaning @nogc. And because of the garbage collection.

I'll probably be kicked for saying this. ^
Mar 02 2017
prev sibling next sibling parent reply Jack Stouffer <jack jackstouffer.com> writes:
On Tuesday, 28 February 2017 at 23:29:24 UTC, Jared Jeffries 
wrote:
 I've read the answers to questions like "Which is the best 
 programming language to learn in 2017?".

 Nobody said anything about D, which is really sad, because 
 in my opinion D could be one of the best answers to this 
 question.

 I've answered this question. Better late than never.

 I suggest that other "happy users" of this language do the 
 same...
There's a reason stackoverflow and softwareengineering.stackexchange delete these kinds of questions: they're counterproductive and can't actually be answered. "Which is the best programming language to learn in 2017" is one such question. It comes down strictly to opinion and circumstance.

Because of this, the "answers" are either answering a different question or are just ads for the user's favorite language. It seems most of the top answers in that thread took the question to mean "Which language would be most likely to get me a job in 2017", which isn't the same.

Programming questions on Quora are the dumping ground for bad SO questions. Most D power users spend their time either on IRC or on SO.
Mar 01 2017
next sibling parent reply Jared Jeffries <jared.jeffries yandex.com> writes:
I'm not talking specifically about Quora, even if I admit that it's 
on this forum that somebody advised me to learn D to improve my 
object-oriented programming skills.

I'm just saying that I think that D is de facto one of the best 
languages for beginners like me.

A lot better than most mainstream languages these forums are always 
promoting as the best starting programming language.

So it's sad that D is ignored by so many young programmers, because 
when they do some research on Google, like I did, there is too 
little chance they will find this advice.

D is mainly known as a system programming language because it's 
indeed "C++ done well". But that's probably not enough to convince 
C++ addicts to try D, especially because of the garbage collector.

But D is also a great alternative to mainstream garbage collected 
languages.

I think it should instead be advertised as the perfect language 
to learn programming and web development, because that's where it 
really shines, IMHO.
Mar 01 2017
next sibling parent reply Jack Stouffer <jack jackstouffer.com> writes:
On Wednesday, 1 March 2017 at 17:09:51 UTC, Jared Jeffries wrote:
 I'm not talking specifically about Quora, even if I admit that 
 it's on this forum that somebody advised me to learn D to 
 improve my object-oriented programming skills.

 I'm just saying that I think that D is de facto one of the best 
 languages for beginners like me.

 A lot better than most mainstream languages these forums are 
 always promoting as the best starting programming language.

 So it's sad that D is ignored by so many young programmers, 
 because when they do some research on Google, like I did, there 
 is too little chance they will find this advice.

 D is mainly known as a system programming language because it's 
 indeed "C++ done well". But that's probably not enough to 
 convince C++ addicts to try D, especially because of the 
 garbage collector.

 But D is also a great alternative to mainstream garbage 
 collected languages.

 I think it should instead be advertised as the perfect language 
 to learn programming and web development, because that's where 
 it really shines, IMHO.
I agree. We have a lot to improve in terms of marketing.

Mainly our messaging is jumbled.

Rust = memory safety
Go = the best runtime around
D = everything I guess?

And the problem is that D is good at everything (IMO), so how do we go about marketing to everyone without getting our messages mixed up in the public's view.
Mar 01 2017
next sibling parent Meta <jared771 gmail.com> writes:
On Wednesday, 1 March 2017 at 17:25:07 UTC, Jack Stouffer wrote:
 I agree. We have a lot to improve in terms of marketing.

 Mainly our messaging is jumbled.

 Rust = memory safety
 Go = the best runtime around
 D = everything I guess?

 And the problem is that D is good at everything (IMO), so how 
 do we go about marketing to everyone without getting our 
 messages mixed up in the public's view.
To me, D's key marketing feature has always been its incredible static introspection and metaprogramming abilities.
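For anyone curious, here's a minimal sketch of what that looks like (describe and Point are invented names, just for illustration):

import std.conv : to;
import std.stdio : writeln;

// Build a field dump for any plain struct with no runtime reflection:
// a foreach over __traits(allMembers, T) is unrolled at compile time.
string describe(T)(T value)
{
    string s = T.stringof ~ ":";
    foreach (name; __traits(allMembers, T))
        s ~= "\n  " ~ name ~ " = " ~ mixin("value." ~ name).to!string;
    return s;
}

struct Point { int x; int y; }

void main()
{
    writeln(describe(Point(3, 4))); // Point: then x = 3, y = 4, one per line
}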
Mar 01 2017
prev sibling parent reply "Nick Sabalausky (Abscissa)" <SeeWebsiteToContactMe semitwist.com> writes:
On 03/01/2017 12:25 PM, Jack Stouffer wrote:
 I agree. We have a lot to improve in terms of marketing.

 Mainly our messaging is jumbled.

 Rust = memory safety
 Go = the best runtime around
 D = everything I guess?

 And the problem is that D is good at everything (IMO), so how do we go
 about marketing to everyone without getting our messages mixed up in the
 public's view.
Yeah, it's kinda depressing that programmers will be cold towards D, and admit flat-out it's because D *isn't* a one-trick pony centered around one single gimmick. (I've encountered that myself. Such a facepalm inducer. "Polyglot programming" has rotted people's brains.)
Mar 01 2017
parent reply ixid <adamsibson gmail.com> writes:
On Wednesday, 1 March 2017 at 18:37:46 UTC, Nick Sabalausky 
(Abscissa) wrote:
 On 03/01/2017 12:25 PM, Jack Stouffer wrote:
 I agree. We have a lot to improve in terms of marketing.

 Mainly our messaging is jumbled.

 Rust = memory safety
 Go = the best runtime around
 D = everything I guess?

 And the problem is that D is good at everything (IMO), so how 
 do we go
 about marketing to everyone without getting our messages mixed 
 up in the
 public's view.
Yeah, it's kinda depressing that programmers will be cold towards D, and admit flat-out it's because D *isn't* a one-trick pony centered around one single gimmick. (I've encountered that myself. Such a facepalm inducer. "Polyglot programming" has rotted people's brains.)
We need a powerful message that resonates. I think performance is the strongest message and it directly attacks people's major and misplaced concern about D's garbage collector.
Mar 02 2017
parent reply Jared Jeffries <jared.jeffries yandex.com> writes:
 We need a powerful message that resonates. I think performance 
 is the strongest message and it directly attacks people's major 
 and misplaced concern about D's garbage collector.
Indeed, D gives better performance than other similar garbage 
collected languages.

But I don't think that performance is really what will convince 
those programmers to give D a try, because they are used to their 
current ecosystem, and are comfortable delivering their 
applications on time with it.

IMHO, what really matters to a developer experimenting with a new 
language are :
- Is the new language easy to learn ? How long will it take me to 
become productive with it ?
- Is it really worth the effort ? How will it help me in getting 
the job done ?

D is easier to learn; it's *both* more programmer-friendly 
(arrays, maps, slices, foreach loops, reference types, closures), 
less verbose and more complete (templates, etc) than similar 
mainstream languages.

Basically you just have to learn some sort of curated C++/Java 
mix.

And in return you enjoy the *expressivity and productivity* of a 
scripting language like Javascript *without sacrificing 
performance or safety*.
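As a quick, hedged illustration of that mix (all names here are invented for the example):

import std.algorithm : filter, map;
import std.array : array;
import std.stdio : writeln;

void main()
{
    // Built-in arrays, slices and foreach:
    int[] xs = [3, 1, 4, 1, 5, 9];
    foreach (x; xs[0 .. 3])
        writeln(x);             // 3, 1, 4

    // Built-in associative arrays (maps):
    int[string] ages = ["alice": 30, "bob": 25];
    writeln(ages["alice"]);     // 30

    // Closures and a script-like pipeline, statically typed throughout:
    int threshold = 2;
    auto big = xs.filter!(x => x > threshold)
                 .map!(x => x * 10)
                 .array;
    writeln(big);               // [30, 40, 50, 90]
}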
Mar 02 2017
parent reply Russel Winder via Digitalmars-d <digitalmars-d puremagic.com> writes:
On Thu, 2017-03-02 at 12:29 +0000, Jared Jeffries via Digitalmars-d
wrote:

[…]
 IMHO, what really matters to a developer experimenting with a new 
 language are :
 - Is the new language easy to learn ? How long will it take me to 
 become productive with it ?
 - Is it really worth the effort ? How will it help me in getting 
 the job done ?
And what is the tooling and workflow? We are now living in the post-VIM/post-Emacs era. If there isn't a JetBrains IDE or plugin to an IDE for a language, you are in losing territory for big take-up: without an IDE (JetBrains is just the current main example) the language is destined to be a niche or small-community one. I will elide the rant on this to avoid being accused of being repetitious.
 D is easier to learn; it's *both* more programmer-friendly 
 (arrays, maps, slices, foreach loops, reference types, closures), 
 less verbose and more complete (templates, etc) than similar 
 mainstream languages.

 Basically you just have to learn some sort of curated C++/Java 
 mix.

 And in return you enjoy the *expressivity and productivity* of a 
 scripting language like Javascript *without sacrificing 
 performance or safety*.
-- 
Russel.
Mar 02 2017
parent reply "Nick Sabalausky (Abscissa)" <SeeWebsiteToContactMe semitwist.com> writes:
On 03/02/2017 07:47 AM, Russel Winder via Digitalmars-d wrote:
 And what is the tooling and workflow? We are now living in the post-
 VIM/post-Emacs era. If there isn't a JetBrain IDE or plugin to an IDE
 for a language, you are in losing territory for big take up: without an
 IDE (JetBrains is just the current main example) the language is
 destined to be a niche or small community one.
I've used tools from JetBrains before. IMO, it should be easy for both vim and emacs to catch up to tools like JetBrains, Xamarin and such. All they need are a couple extensions to artificially boost memory footprint and introduce big startup and various UI delays. And done: instant parity with the popular IDEs. Should be easy enough to write.
Mar 02 2017
parent reply Russel Winder via Digitalmars-d <digitalmars-d puremagic.com> writes:
On Thu, 2017-03-02 at 15:02 -0500, Nick Sabalausky (Abscissa) via
Digitalmars-d wrote:
 […]
 I've used tools from JetBrains before. IMO, it should be easy for 
 both vim and emacs to catch up to tools like JetBrains, Xamarin 
 and such. All they need are a couple extensions to artificially 
 boost memory footprint and introduce big startup and various UI 
 delays. And done: instant parity with the popular IDEs. Should be 
 easy enough to write.
IDEs, big memory footprint, yes. IDEs, big startup cost, yes. IDEs, vastly more supportive, useful software development functionality than editors, especially for debugging, yes.

It's that last one, the one about getting working software developed faster, that is the one that has moved me away from Emacs to IDEs. But everyone to their own, there is no universal truth in this arena.
-- 
Russel.
Mar 03 2017
parent reply "Nick Sabalausky (Abscissa)" <SeeWebsiteToContactMe semitwist.com> writes:
On 03/03/2017 10:40 AM, Russel Winder via Digitalmars-d wrote:
 IDEs, vastly more supportive, useful software development functionality
 than editors, especially for debugging, yes.


 It's that last one, the one about getting working software developed
 faster, that is the one that has moved me away from Emacs to IDEs. But
Perhaps ironically, I used to be big on IDE's (back before the bloat). But between the bloat that started happening to them about 10+ years ago, and various factors that led me to (oddly enough) actually prefer printf debugging, I switched to more basic editors with the whole "Linux is my IDE" setup.
 everyone to their own, there is no universal truth in this arena.
Definitely true. But I do really wish, though, that the IDE devs would start prioritizing efficiency, UI snappiness, and startup time. Yeah, those tools do more, but they don't do THAT much more that would technologically necessitate THAT much of a performance discrepancy. (The plain-old editors are far more capable than I think IDE users seem to believe.)

Those IDE efficiency improvements certainly wouldn't hurt their users, and it would cause less of a "cascading bloat" effect - where devs feel they have to always have top-of-the-line hardware (because that's what their IDE demands), so then they get accustomed to how their software runs on top-of-the-line hardware, so their software winds up bloated too. And they become so familiar with all of that, they can't comprehend why any user would ever be using less than top-of-the-line hardware, and develop disinterest and even disdain for users who aren't just as computing-gearhead as they are (because they've allowed themselves to become too far removed from the world of average Joe user).
Mar 03 2017
next sibling parent reply "H. S. Teoh via Digitalmars-d" <digitalmars-d puremagic.com> writes:
On Fri, Mar 03, 2017 at 01:45:50PM -0500, Nick Sabalausky (Abscissa) via
Digitalmars-d wrote:
[...]
 But I do really wish, though, that the IDE devs would start
 prioritizing efficiency, UI snappiness, and startup time. Yeah, those
 tools do more, but they don't do THAT much more that would
 technologically necessitate THAT much of a performance discrepancy.
 (The plain-old editors are far more capable than I think IDE users
 seem to believe.)
[...]

Yeah. I am actually skeptical of the whole GUI koolaid. I'm pretty sure having a GUI is not a necessity to implementing the equivalent functionality of an IDE in a text-mode editor. In fact, I've heard of Vim scripts that let you invoke a compiler and, upon encountering a compile error, correctly jump to the right line in the buffer containing the offending source file. And invoking a debugger from an editor isn't exactly rocket science (and has been done since decades ago).

And in theory, one could also invoke an external D lexer/parser to refactor code automatically (though IMAO, if you find yourself regularly reaching for the code refactor button, then something's wrong with the way you write code -- perhaps you ought to be thinking a bit harder before typing out code, y'know, like design stuff before coding, like the crusty old textbooks say).


T

-- 
Study gravitation, it's a field with a lot of potential.
Mar 03 2017
parent reply Jared Jeffries <jared.jeffries yandex.com> writes:
 Yeah.  I am actually skeptical of the whole GUI koolaid.  I'm 
 pretty sure having a GUI is not a necessity to implementing the 
 equivalent functionality of an IDE in a text-mode editor.
Personally I'm using a mix of Geany, Coedit and Code::Blocks for D 
development, depending on what I'm doing at the moment (coding, 
fixing compilation errors or debugging). All three IDEs are fine, 
the GUI "koolaid" works well, and moreover they start quickly on my 
venerable Linux laptop. And I've heard that Visual D is nice too.

Obviously there are enough *FREE* efficient D IDEs out there, 
starting fast with just enough visual aid.

I think that a programming tutorial using D as the first 
programming language is what is really needed, and fortunately I 
see that it's now on its way.

What D needs too is probably more "fame" on the beginners' forums. 
How can people start learning a language, if they don't even know 
it exists, and that it perfectly fulfills their needs ? To be well 
known, D just needs people to talk about it more to beginners.

D is not *just* a language for meta-programming experts and 
execution speed addicts. IMHO, it's also the *simplest and most 
complete* alternative to the mainstream languages. That must be 
said on every forum, at each occasion.

Stop trying to convince only the expert programmers; most of them 
already have their language and IDE...
Mar 03 2017
next sibling parent reply "H. S. Teoh via Digitalmars-d" <digitalmars-d puremagic.com> writes:
On Fri, Mar 03, 2017 at 07:49:06PM +0000, Jared Jeffries via Digitalmars-d
wrote:
 Yeah.  I am actually skeptical of the whole GUI koolaid.  I'm pretty
 sure having a GUI is not a necessity to implementing the equivalent
 functionality of an IDE in a text-mode editor.
 Personally I'm using a mix of Geany, Coedit and Code::Blocks for D 
 development, depending on what I'm doing at the moment (coding, fixing 
 compilation errors or debugging). All three IDEs are fine, the GUI 
 "koolaid" works well, and moreover they start quickly on my venerable 
 Linux laptop.
If it works well with your workflow, then all the more power to you! All I'm saying is that, contrary to popular belief, GUI is not a *necessity* to implement the supposed superior features of IDEs over editors. After all, editors *have* advanced in the past 30-40 years too. [...]
 Stop trying to convince only the expert programmers; most of them 
 already have their language and IDE...
That's a curious statement, because I was trained mainly as a C/C++ programmer, and still use them for my job every day. I was very well-versed in the intricacies of C, and somewhat C++, yet I was very unhappy with them. For several years I would scour the net during my free time to look for a better language. I wasn't very impressed with D the first time I saw it, for various reasons, and it wasn't until I found a copy of TDPL in a local bookstore and decided on a whim to buy it, that I really got started with D. And the rest, as they say, is history. :D

Even though I suspect that I'm in the minority here, I'm pretty sure there's got to be others out there in C/C++ land who are like me, longing for a better language but just haven't found a suitable candidate yet. So don't discount us C/C++ veteran folk just yet!


T

-- 
Many open minds should be closed for repairs. -- K5 user
Mar 03 2017
parent reply Rico Decho <rico.decho gmail.com> writes:
 That's a curious statement, because I was trained mainly as a 
 C/C++ programmer, and still use them for my job every day. I 
 was very well-versed in the intricacies of C, and somewhat C++, 
 yet I was very unhappy with them.  For several years I would 
 scour the net during my free time to look for a better 
 language.  I wasn't very impressed with D the first time I saw 
 it, for various reasons, and it wasn't until I found a copy of 
 TDPL in a local bookstore and decided on a whim to buy it, that 
 I really got started with D.  And the rest, as they say, is 
 history. :D

 Even though I suspect that I'm in the minority here, I'm pretty 
 sure there's got to be others out there in C/C++ land who are 
 like me, longing for a better language but just haven't found a 
 suitable candidate yet.  So don't discount us C/C++ veteran 
 folk just yet!
My programming language at work remains C++ too, therefore I agree 
that D's clean syntax is very attractive to C++ developers.

Developers of the other mainstream languages already use something 
that is closer to D (imports, reference types, etc), so D's clean 
and powerful syntax may not be as attractive to them.

Unfortunately, C++ developers generally have to use this language in 
contexts where a garbage collector wouldn't be used. So D's garbage 
collector may be a problem in convincing most C++ professional 
developers, and D may not seem enough of an improvement to them.

But for people learning object-oriented programming, as its syntax 
has remained close to all of them, D can indeed be "sold" as the 
best first OO programming language. And once you have tried D, you 
probably will continue to use it and promote it, like I do.
Mar 04 2017
parent reply "H. S. Teoh via Digitalmars-d" <digitalmars-d puremagic.com> writes:
On Sat, Mar 04, 2017 at 09:06:52AM +0000, Rico Decho via Digitalmars-d wrote:
[...]
 Unfortunately, C++ developers generally have to use this language in
 contexts where a garbage collector wouldn't be used.

 So D's garbage collector may be a problem in convincing most C++
 professional developers, and D may not seem enough of an improvement
 to them.
Ah, the good ole GC phobia. It was somewhat of a deterrent to me in the beginning too, I admit, but honestly, in retrospect, a lot of the GC phobia among the C/C++ crowd (including myself at the time) amounts to nothing more than unfounded prejudice, mostly stemming from initial impressions of the original GC in Java, which was very sucky and had all sorts of problems. Modern GCs have addressed most of these problems.

Now, it's true that D's current GC needs a lot of improvement, but it really isn't as bad as people seem to think it is. For most applications, it's actually liberating to be able to finally stop thinking about the nitty-gritty of memory management at every turn, and to be able to focus on the actual algorithms at hand. It does require a different mindset than when you're writing C++, though, so this may not click for a while at first.

It's actually rather rare to *need* to avoid the GC -- only niche applications need that, like if you're writing a game engine that has to avoid stop-the-world pauses (which can be easily worked around, btw), or real-time medical applications where if it stops for 10ms somebody dies. 90% of real world programs out there work just fine with the GC.

IME, many C/C++ folks' GC aversion is actually more religious than anything else. I know C coders who swear against ever using any sort of GC, but then go ahead and (re)invent their own form of GC, albeit full of bugs, design flaws, and nasty unhandled corner cases that real GCs avoid. But of course, they'd never admit that their design was inferior, because GCs suck by definition, and therefore even their most flawed memory management designs must necessarily be better.


T

-- 
Why have vacation when you can work?? -- EC
Mar 05 2017
parent reply Rico Decho <rico.decho gmail.com> writes:
 It's actually rather rare to *need* to avoid the GC -- only 
 niche applications need that, like if you're writing a game 
 engine that has to avoid stop-the-world pauses (which can be 
 easily worked around, btw), or real-time medical applications 
 where if it stops for 10ms somebody dies. 90% of real world 
 programs out there work just fine with the GC.
Actually it's my case. I'd LOVE to use D for game development for 
instance, but I won't take the risk of having the GC pause the game 
when I don't want to, even for just an unknown amount of 
milliseconds, and even if I know that anyway I'll try to limit 
dynamic allocations by using caches etc.

Of course, for everything else (scripts, tools, servers, etc), now 
I use D and I'm happy with it. But for those same scripts and 
applications, I could have used any other garbage collected 
language. And that's my point.

Better than Rust too. But for D to succeed as a viable alternative 
to C++ where it's mandatory to use it, it must have a minimalistic 
version of Phobos that makes it easy and natural to use D with 
manual memory management.

Maybe it's already the case, but then I suggest that you should 
promote it louder, so that people don't even feel the need to try 
Rust for instance.
Mar 05 2017
next sibling parent Mike Parker <aldacron gmail.com> writes:
On Monday, 6 March 2017 at 05:50:01 UTC, Rico Decho wrote:

 Actually it's my case. I'd LOVE to use D for game development 
 for instance, but I won't take the risk of having the GC pause 
 the game when I don't want to, even for just an unknown amount 
 of milliseconds, and even if I know that anyway I'll try to 
 limit dynamic allocations by using caches etc.
It's *not* going to pause your game unless you do allocations in your game loop, because otherwise it isn't going to run. D's GC is perfectly acceptable for games if you just follow the best practices that are already standard in game development: allocate as much as you can when loading a new level and minimize allocations in the loop. If you do need to allocate during the level, you can avoid the GC entirely and do so via malloc.

D provides the means to reduce or eliminate the impact of the GC in your critical loops. When developing games, you should already be trying to minimize allocations in those loops anyway.
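A minimal sketch of that malloc route (Particle and the helper names are invented for illustration):

import core.stdc.stdlib : malloc, free;

struct Particle { float x, y, dx, dy; }

// Per-level data from the C heap: the GC never sees this memory, so it
// cannot trigger a collection, and will not scan or free it either.
Particle[] allocParticles(size_t n) @nogc nothrow
{
    auto p = cast(Particle*) malloc(n * Particle.sizeof);
    if (p is null) return null;
    return p[0 .. n];
}

void freeParticles(ref Particle[] ps) @nogc nothrow
{
    free(ps.ptr);
    ps = null;
}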
Mar 05 2017
prev sibling next sibling parent reply Jack Stouffer <jack jackstouffer.com> writes:
On Monday, 6 March 2017 at 05:50:01 UTC, Rico Decho wrote:
 Actually it's my case. I'd LOVE to use D for game development 
 for instance, but I won't take the risk of having the GC pause 
 the game when I don't want to, even for just an unknown amount 
 of milliseconds, and even if I know that anyway I'll try to 
 limit dynamic allocations by using caches etc.
void loop() {
    // code
}

void loadLevel() {
    import core.memory : GC;
    GC.disable();
    while(stuff)
        loop();
    GC.collect();
}

Also see EMSI containers for no-GC containers with deterministic destruction:
https://github.com/economicmodeling/containers
Mar 05 2017
next sibling parent Rico Decho <rico.decho gmail.com> writes:
 void loop() {
     // code
 }

 void loadLevel() {
     import core.memory : GC;
     GC.disable();
     while(stuff)
         loop();
     GC.collect();
 }

 Also see EMSI containers for no gc containers with 
 deterministic destruction 
 https://github.com/economicmodeling/containers
Thanks for mentioning the containers, that's exactly what I needed !

Then I think it's worth trying... This weekend I'll try to generate the D wrapper for Urho3D and test how to load some scene resources in the background.
Mar 06 2017
prev sibling parent reply bachmeier <no spam.net> writes:
On Monday, 6 March 2017 at 06:39:27 UTC, Jack Stouffer wrote:

 void loop() {
     // code
 }

 void loadLevel() {
     import core.memory : GC;
     GC.disable();
     while(stuff)
         loop();
     GC.collect();
 }
GC.disable doesn't guarantee the garbage collector won't run: https://dlang.org/phobos/core_memory.html#.GC.disable I'm not sure how much impact that has in practice.
Mar 06 2017
next sibling parent Rico Decho <rico.decho gmail.com> writes:
 GC.disable doesn't guarantee the garbage collector won't run: 
 https://dlang.org/phobos/core_memory.html#.GC.disable I'm not 
 sure how much impact that has in practice.
That's why I'd like D to also allow Nim's alternative GC method:

https://nim-lang.org/docs/gc.html

Basically, you have the choice between a standard GC and a real-time GC. With the real-time version, you ask regularly for a partial, time-limited GC. They say you can normally expect it to take less than 2 ms. And you can also decide not to call it if you know that it's not needed at the moment (no resource loading, etc), or if it's the wrong time to call it.

What worries C++ game developers like me is not a regular 2 ms GC at each frame, it's actually an occasional 200 ms GC once in a while... Hence the need to allocate manually and disable the GC at the moment.
Mar 06 2017
prev sibling parent Jack Stouffer <jack jackstouffer.com> writes:
On Monday, 6 March 2017 at 16:40:02 UTC, bachmeier wrote:
 GC.disable doesn't guarantee the garbage collector won't run
Only in exceptional cases:

 "...in instances where the implementation deems necessary for 
 correct program behavior, such as during an out of memory 
 condition"
Mar 06 2017
prev sibling parent reply Atila Neves <atila.neves gmail.com> writes:
On Monday, 6 March 2017 at 05:50:01 UTC, Rico Decho wrote:
 It's actually rather rare to *need* to avoid the GC -- only 
 niche applications need that, like if you're writing a game 
 engine that has to avoid stop-the-world pauses (which can be 
 easily worked around, btw), or real-time medical applications 
 where if it stops for 10ms somebody dies. 90% of real world 
 programs out there work just fine with the GC.
Actually it's my case. I'd LOVE to use D for game development for instance, but I won't take the risk of having the GC pause the game when I don't want to, even for just an unknown amount of milliseconds, and even if I know that anyway I'll try to limit dynamic allocations by using caches etc.
If this isn't a perfect example of D's marketing problem I don't know what is. Someone who likes D and takes the time to write on the forum yet thinks the GC will randomly run no matter what.

To make it abundantly clear: I'm not bashing on you in the slightest, Rico Decho. I'm just pointing out that there's a clear problem here in that we can't expect to convert e.g. C++ game developers who have never written a line of D before if we haven't even managed to educate the community yet.

Unfortunately, I have no ideas on how to remedy the situation. I also don't know how to get people to stop believing that C is magically fast either, which I think is a similar perception problem.

Atila
Mar 06 2017
next sibling parent reply Rico Decho <rico.decho gmail.com> writes:
 If this isn't a perfect example of D's marketing problem I 
 don't know what is. Someone who likes D and takes the time to 
 write on the forum yet thinks the GC will randomly run no 
 matter what.

 To make it abundantly clear: I'm not bashing on you in the 
 slightest, Rico Decho. I'm just pointing out that there's a 
 clear problem here in that we can't expect to convert e.g. C++ 
 game developers who have never written a line of D before if we 
 haven't even managed to educate the community yet.

 Unfortunately, I have no ideas on how to remedy the situation. 
 I also don't know how to get people to stop believing that C is 
 magically fast either, which I think is a similar perception 
 problem.

 Atila
Actually, it's written in the documentation. If I remember correctly, the garbage collection can be triggered during any allocation, for instance when concatenating some displayed text, and it freezes all threads until the collection is done.

In my opinion, the problem is D's GC implementation. For instance, Nim uses a soft (realtime) GC, which is why Nim's author himself has made the Urho3D wrapper :) With this approach, there is no need to disable the GC and make manual allocations to avoid the GC freezing all threads. Instead you simply use all the standard libraries as normal, while still trying to avoid allocating too much stuff during rendering, of course.

During the render loop, in Nim you occasionally call the GC with a numeric argument telling it how many milliseconds it is allowed to use in the worst case. For instance, you ask for GC when the game is in a menu or after a background resource loading.

That's the simple and clever way of using a garbage collected language for game development.
Mar 06 2017
next sibling parent bachmeier <no spam.net> writes:
On Monday, 6 March 2017 at 15:40:54 UTC, Rico Decho wrote:
 If I remember correctly, the garbage collection can be triggered 
 during any allocation, for instance when concatenating some 
 displayed text, and it freezes all threads until the collection 
 is done.
My understanding is that a GC collection can be triggered only when you allocate from GC memory. And if you use @nogc (which to my knowledge works) you cannot call anything that might trigger a garbage collection. https://dlang.org/spec/attribute.html#nogc
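For reference, a tiny sketch of what that compile-time guarantee looks like (render is an invented name):

@nogc void render(int frame, string label)
{
    import core.stdc.stdio : printf;
    printf("frame %d\n", frame);   // fine: no GC allocation

    // auto s = label ~ "!";       // would be rejected by the compiler:
    //                             // ~ may allocate from the GC heap
}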
Mar 06 2017
prev sibling next sibling parent reply grm <gerhard.mueller gmsoft.at> writes:
On Monday, 6 March 2017 at 15:40:54 UTC, Rico Decho wrote:
 ...
 For instance, you ask for GC when the game is in a menu or 
 after a background resource loading.
 ...
maybe that's what you're looking for: https://dlang.org/phobos/core_memory.html#.GC.collect
Mar 06 2017
next sibling parent reply Rico Decho <rico.decho gmail.com> writes:
 maybe that's what you're looking for: 
 https://dlang.org/phobos/core_memory.html#.GC.collect
Indeed ! I'll try to make some benchmarks with a 3D rendering loop to see how much time it takes if there is not much to GC.
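One possible shape for such a measurement, as a rough sketch (the frame loop body is elided):

import core.memory : GC;
import core.time : MonoTime;
import std.stdio : writeln;

void main()
{
    foreach (frame; 0 .. 600)
    {
        // ... render one frame here ...

        immutable t0 = MonoTime.currTime;
        GC.collect();                  // explicit, scheduled collection point
        immutable dt = MonoTime.currTime - t0;
        writeln("collect took ", dt.total!"usecs", " us");
    }
}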
Mar 06 2017
parent reply "H. S. Teoh via Digitalmars-d" <digitalmars-d puremagic.com> writes:
On Mon, Mar 06, 2017 at 05:30:56PM +0000, Rico Decho via Digitalmars-d wrote:
 maybe that's what you're looking for:
 https://dlang.org/phobos/core_memory.html#.GC.collect
Indeed ! I'll try to make some benchmarks with a 3D rendering loop to see how much time it takes if there is not much to GC.
If you have a lot of total allocated memory, GC may still be somewhat slow (because it has to scan a lot of memory, most of which is still live). One possible approach is to do your large-scale, long-term allocations outside the GC heap (i.e., use malloc) so that the amount of GC memory that needs to be scanned is small.

One approach of reducing the amount of GC allocations that people new to D seem to overlook, is to avoid string allocations by using D's range-based algorithms instead. E.g., instead of:

	auto s = "abc" ~ myString ~ "def"; // lots of allocations
	writeln(s);

Do this instead:

	import std.range;
	auto r = chain("abc", myString, "def"); // no allocation
	writeln(r);

This isn't always possible, e.g., if you need to store the result of a concatenation and access it later, but a lot of on-the-fly allocations (i.e., short-term garbage) could be eliminated this way.


T

-- 
If you want to solve a problem, you need to address its root cause, not just its symptoms. Otherwise it's like treating cancer with Tylenol...
Mar 06 2017
parent Rico Decho <rico.decho gmail.com> writes:
 If you have a lot of total allocated memory, GC may still be 
 somewhat slow (because it has to scan a lot of memory, most of 
 which is still live).  One possible approach is to do your 
 large-scale, long-term allocations outside the GC heap (i.e., 
 use malloc) so that the amount of GC memory that needs to be 
 scanned is small.

 One approach of reducing the amount of GC allocations that 
 people new to D seem to overlook, is to avoid string 
 allocations by using D's range-based algorithms instead.  E.g., 
 instead of:

 	auto s = "abc" ~ myString ~ "def"; // lots of allocations
 	writeln(s);

 Do this instead:

 	import std.range;
 	auto r = chain("abc", myString, "def"); // no allocation
 	writeln(r);

 This isn't always possible, e.g., if you need to store the 
 result of a concatenation and access it later, but a lot of 
 on-the-fly allocations (i.e., short-term garbage) could be 
 eliminated this way.


 T
PERFECT !!!
Mar 06 2017
prev sibling parent reply Rico Decho <rico.decho gmail.com> writes:
 maybe that's what you're looking for: 
 https://dlang.org/phobos/core_memory.html#.GC.collect
What is nice with Nim is that it has a GC heap PER THREAD. No need to stop the other threads during a GC...

https://nim-lang.org/docs/threads.html
https://nim-lang.org/docs/manual.html#threads

"Nim's memory model for threads is quite different than that of other common programming languages (C, Pascal, Java): Each thread has its own (garbage collected) heap and sharing of memory is restricted to global variables. This helps to prevent race conditions. GC efficiency is improved quite a lot, because the GC never has to stop other threads and see what they reference. Memory allocation requires no lock at all! This design easily scales to massive multicore processors that are becoming the norm."

I'd suggest taking inspiration from that for D's memory allocations and garbage collection...
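For comparison, D already has the language-level half of that model: module-level variables are thread-local unless marked shared, and std.concurrency passes values between threads by message. A small sketch (counter and worker are invented names); note that the runtime's GC heap itself is still shared, so this gives isolation but not Nim's per-thread collection:

import std.concurrency : spawn, send, receiveOnly, ownerTid;
import std.stdio : writeln;

int counter;  // thread-local by default: each thread gets its own copy

void worker()
{
    counter = 42;              // touches only this thread's copy
    ownerTid.send(counter);    // values cross threads via messages
}

void main()
{
    counter = 1;
    spawn(&worker);
    auto fromWorker = receiveOnly!int();
    writeln(counter, " ", fromWorker);  // prints: 1 42
}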
Mar 06 2017
parent reply Rico Decho <rico.decho gmail.com> writes:
Btw, I'm not promoting Nim here, just asking to take inspiration 
from its memory model ;)

I've used Nim in the past, and while it's a nice language, D is 
much closer to perfection regarding my personal needs and tastes.

I've actually converted all my Nim scripts to D, because :
1/ it doesn't force you to declare the types and functions before 
using them;
2/ it uses a standard curly-brace block syntax, which helps a lot 
when porting C++ or Node.js code to D.
Mar 06 2017
next sibling parent reply "H. S. Teoh via Digitalmars-d" <digitalmars-d puremagic.com> writes:
On Mon, Mar 06, 2017 at 05:52:40PM +0000, Rico Decho via Digitalmars-d wrote:
 Btw, I'm not promoting Nim here, just asking to take inspiration from
 its memory model ;)
Nevertheless, it's certainly true that D's GC could use a major upgrade at some point. While it's not horrible, the present implementation does leave something to be desired. Hopefully the various efforts at GC by forum members will at some point turn into some major improvements to D's GC. There was talk a year or two ago about a precise GC for D (with fallback to a conservative GC for cases where that wouldn't work), but I'm not sure what has come of it. [...]
 2/ it uses a standard curly-brace block syntax, which helps a lot when
 porting C++ or Node.js code to D.
<even-more-offtopic>
It's kinda ironic how the times have changed, that people these days regard curly-brace block syntax as "standard". I still remember not too many decades ago (har har) when C's curly-brace syntax was regarded as too obscure or line-noise-y, and people preferred "begin ... end", "if .. fi", or similar things popular at the time.
</even-more-offtopic>


T

-- 
Unix is my IDE. -- Justin Whear
Mar 06 2017
parent Jack Stouffer <jack jackstouffer.com> writes:
On Monday, 6 March 2017 at 18:22:53 UTC, H. S. Teoh wrote:
but I'm not sure what has come of it.
https://github.com/dlang/druntime/pull/1603
Mar 06 2017
prev sibling parent reply Russel Winder via Digitalmars-d <digitalmars-d puremagic.com> writes:
On Mon, 2017-03-06 at 10:22 -0800, H. S. Teoh via Digitalmars-d wrote:

[…]
 Nevertheless, it's certainly true that D's GC could use a major 
 upgrade at some point. While it's not horrible, the present 
 implementation does leave something to be desired. Hopefully the 
 various efforts at GC by forum members will at some point turn 
 into some major improvements to D's GC. There was talk a year or 
 two ago about a precise GC for D (with fallback to a conservative 
 GC for cases where that wouldn't work), but I'm not sure what has 
 come of it.
[…]

Learn the lesson from Java. It started with a truly crap GC and everyone said Java is crap because the GC is garbage. D has seemingly actually progressed beyond this stage technically but not marketing wise. The Java folk worked on the GC and kept replacing it over and over again. The GC got better and better. Now with the G1 GC almost all the problems have gone away – as has most of the moaning about Java having a crap GC. Most people never notice the GC and those that do, engineer it rather than moaning. The Java GC situation is now a sophisticated one where those who don't really care do not have a problem and those that do care have the tools to deal with it.

D seems to be in a situation where those who don't care have a crap GC which needs to be improved and those who do care have the tools to deal with it. So there needs to be ongoing replacement of the D GC until there is something good, this is a technical problem. That people who care about the effect of GC still think D is a crap GC-based language implies there is a marketing problem, not a technical one.

We all know that many, many people see the words garbage collector and run a mile in an uneducated, prejudiced way. Who cares about them. We care about the people who are willing to try stuff out and have a problem.

</rant>
-- 
Russel.
Mar 07 2017
next sibling parent reply Rico Decho <rico.decho gmail.com> writes:
 D seems to be in a situation where those who don't care have a 
 crap GC which needs to be improved and those who do care have 
 the tools to deal with it. So there needs to be ongoing 
 replacement of the D GC until there is something good, this is 
 a technical problem. That people who care about the effect of 
 GC still think D is a crap GC-based language implies there is a 
 marketing problem, not a technical one.
But I don't think that D's GC is fine for people who care about it.

If it is, why are people on this forum giving advice on how to disable and/or avoid it for soft real-time applications where a GC freeze can't be tolerated.

D's GC isn't crap at all, but better designs and implementations exist, and Nim's GC is one of them.

We can either learn from it, or ignore it... But the second solution won't make D more appropriate for soft real-time scenarios...
Mar 07 2017
next sibling parent "H. S. Teoh via Digitalmars-d" <digitalmars-d puremagic.com> writes:
On Tue, Mar 07, 2017 at 06:45:55PM +0000, Rico Decho via Digitalmars-d wrote:
[...]
 But I don't think that D's GC is fine for people who care about it.
 
 If it is, why are people on this forum giving advice on how to
 disable and/or avoid it for soft real-time applications where a GC
 freeze can't be tolerated.
 
 D's GC isn't crap at all, but better designs and implementations
 exist, and Nim's GC is one of them.
 
 We can either learn from it, or ignore it... But the second solution
 won't make D more appropriate for soft real-time scenarios...
What the D GC needs is somebody willing to sit down and actually spend the time to improve/rewrite the code. Over the years there has been an endless stream of fancy ideas, feature requests, and wishlists for the GC, but without anybody actually doing the work, nothing will actually happen. We are all very well aware of the current GC's limitations and flaws, and have been for years now, and there has been some amount of improvement going into it over the years. But again, talking about it won't magically make it better. *Somebody* has to write the code, after all.

If anyone is interested in helping, take a look at:

	https://github.com/dlang/druntime/pull/1603

and review the code, give some feedback, run the benchmarks yourself, etc., to prod this PR along.

If you have other ideas for improving the GC (e.g., adapting ideas from Nim's GC), submitting PRs to that effect would be much more effective than merely talking about it.


T

-- 
If you want to solve a problem, you need to address its root cause, not just its symptoms. Otherwise it's like treating cancer with Tylenol...
Mar 07 2017
prev sibling next sibling parent reply Russel Winder via Digitalmars-d <digitalmars-d puremagic.com> writes:
On Tue, 2017-03-07 at 18:45 +0000, Rico Decho via Digitalmars-d wrote:
 D seems to be in a situation where those who don't care have a 
 crap GC which needs to be improved and those who do care have 
 the tools to deal with it. So there needs to be ongoing 
 replacement of the D GC until there is something good, this is 
 a technical problem. That people who care about the effect of 
 GC still think D is a crap GC-based language implies there is a 
 marketing problem, not a technical one.

 But I don't think that D's GC is fine for people who care about 
 it.

 If it is, why are people on this forum giving advice on how to 
 disable and/or avoid it for soft real-time applications where a 
 GC freeze can't be tolerated.
Because an option that may be sensibly available for those that cannot cope with a GC language is for D to have a GC-less mode, at the expense of not using Phobos. Of course soft real-time and GC are not incompatible except in some people's minds: it is entirely possible to have GC in a soft real-time system, if the programming language supports it. The question here is only whether the current GC allows D to be used for soft real time.
 D's GC isn't crap at all, but better designs and 
 implementations exist, and Nim's GC is one of them.

 We can either learn from it, or ignore it... But the second 
 solution won't make D more appropriate for soft real-time 
 scenarios...
The question is who is the "we" here. A lot of people have a lot of opinions on D and its GC, including me. However, it seems that none of the people expressing opinions are willing to do anything other than express opinions on the email list.

My gut feeling is that the D language execution and data model is not compatible with a "do not stop the world" GC. However this is opinion not really backed with evidence. What needs to happen is for a group of people who like complaining about the GC to get together and gather evidence as to what needs to change in the D language to support a soft real-time compatible GC such as Go, Nim, Java G1, etc. You can't just transplant an algorithm since the GC has to fit with the language data and execution model and D is more like C than like Java or Go.

If the result is that a change to the D execution or data model is needed then this has to be proposed and debated. If this is not something open to change, then there is no point in going any further.

I cannot commit to being involved in anything such as this until 2017-06-30T17:01+01:00, but from then there is a good possibility of getting me on board an effort to create a new GC for D (but note I really abhor the Phobos coding style with its wrongly placed opening braces).
-- 
Russel.
Mar 14 2017
parent Moritz Maxeiner <moritz ucworks.org> writes:
On Tuesday, 14 March 2017 at 10:05:54 UTC, Russel Winder wrote:
 [...]

 My gut feeling is that the D language execution and data model 
 is not compatible with a "do not stop the world" GC. However 
 this is opinion not really backed with evidence.
I've recently been made aware of [1] and [2]. GC always seems to be a question of what you're willing to sacrifice, and if you want low pause times, AFAIK the only known way to get that is to sacrifice overall throughput. This seems contradictory to D's goals of efficiency and control, to me.
 What needs to happen is for a group of people who like 
 complaining about the GC to get together and gather evidence as 
 to what needs to change in the D language to support a soft 
 real-time compatible GC such as Go, Nim, Java G1, etc. You 
 can't just transplant an algorithm since the GC has to fit with 
 the language data and execution model and D is more like C than 
 like Java or Go.

 If the result is that a change to the D execution or data model 
 is needed then this has to be proposed and debated. If this is 
 not something open to change, then there is no point in going 
 any further.
The problem with changing D's execution and/or data model is that, AFAIK, the sacrifices necessary for a better GC would ensure that D can no longer compete with C in terms of performance. I'm not sure how the majority of the D community would feel about that, but I don't think I, at least, could still advocate D as a better drop-in replacement for C.
 I cannot commit to being involved in anything such as this 
 until 2017-06-30T17:01+01:00, but from then there is a good 
 possibility of getting me on board an effort to create a new GC 
 for D (but note I really abhor the Phobos coding style with 
 its wrongly placed opening braces).
OT: That's what (Git) commit hooks are for. Write how you want, automatically commit as whatever non-Allman, wrong style the project uses.

[1] https://blog.plan99.net/modern-garbage-collection-911ef4f8bd8e?gi=78635e05a6ac#.6zz5an77a
[2] http://www.infognition.com/blog/2014/the_real_problem_with_gc_in_d.html
Mar 14 2017
prev sibling next sibling parent Russel Winder via Digitalmars-d <digitalmars-d puremagic.com> writes:
On Tue, 2017-03-07 at 11:06 -0800, H. S. Teoh via Digitalmars-d wrote:
 On Tue, Mar 07, 2017 at 06:45:55PM +0000, Rico Decho via Digitalmars-
 d wrote:
 [...]
 But I don't think that D's GC is fine for people who care about it.

 If it is, why are people on this forum giving advice on how to
 disable and/or avoid it for soft real-time applications where a GC
 freeze can't be tolerated.

 D's GC isn't crap at all, but better designs and implementations
 exist, and Nim's GC is one of them.

 We can either learn from it, or ignore it... But the second
 solution won't make D more appropriate for soft real-time
 scenarios...

 What the D GC needs is somebody willing to sit down and actually
 spend the time to improve/rewrite the code. Over the years there has
 been an endless stream of fancy ideas, feature requests, and
 wishlists for the GC, but without anybody actually doing the work,
 nothing will actually happen. We are all very well aware of the
 current GC's limitations and flaws, and have been for years now, and
 there has been some amount of improvement going into it over the
 years. But again, talking about it won't magically make it better.
 *Somebody* has to write the code, after all.

 If anyone is interested to help, take a look at:

 	https://github.com/dlang/druntime/pull/1603
As mentioned previously I can schedule taking a look at this, but only from 2017-06-30T17:01+01:00 onwards.
 and review the code, give some feedback, run the benchmarks yourself,
 etc., to prod this PR along.

 If you have other ideas for improving the GC (e.g., adapting ideas
 from Nim's GC), submitting PRs to that effect would be much more
 effective than merely talking about it.

 T
-- 
Russel.
Mar 14 2017
prev sibling parent Moritz Maxeiner <moritz ucworks.org> writes:
On Tuesday, 7 March 2017 at 18:45:55 UTC, Rico Decho wrote:
 D seems to be in a situation where those who don't care have a 
 crap GC which needs to be improved and those who do care have 
 the tools to deal with it. So there needs to be ongoing 
 replacement of the D GC until there is something good, this is 
 a technical problem. That people who care about the effect of 
 GC still think D is a crap GC-based language implies there is 
 a marketing problem, not a technical one.
 But I don't think that D's GC is fine for people who care about it.
You'd have to be a lot more specific on what exactly you care about, since GC always deals with tradeoffs.
 If it is, why are people on this forum giving advice on how to 
 disable and/or avoid it for soft real-time applications where a 
 GC freeze can't be tolerated.
Because there are applications where the tradeoffs chosen for D's GC can't be tolerated.
 D's GC isn't crap at all, but better designs and 
 implementations exist, and Nim's GC is one of them.
Better implementations of the same design, probably, but I haven't checked. As far as I've been able to discern on a quick look, Nim's current GC is also a mark-and-sweep GC, with the same tradeoffs as D's GC, i.e. if you can't accept D's GC in an application, you cannot accept Nim's. Better designs? That, again, depends on what tradeoffs you're willing to make. For the goal "we want throughput on-par with C" I'm not aware of better designs.
 We can either learn from it, or ignore it... But the second 
 solution won't make D more appropriate for soft real-time 
 scenarios...
You'll have to be very explicit about what you think we should learn. And D is perfectly viable for real-time scenarios: don't call the GC.
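To make "don't call the GC" concrete, the usual pattern looks roughly like this (a sketch with invented names and sizes, not a tuned implementation): allocate up front, keep the hot path allocation-free, and collect only at moments you choose.

    import core.memory : GC;

    void render(ubyte[] frame) { frame[] = 0; } // stand-in for the real per-frame work

    void main()
    {
        auto frame = new ubyte[1920 * 1080]; // all allocation happens up front
        GC.disable();                        // no collection can interrupt the loop
        foreach (i; 0 .. 600)
            render(frame);                   // hot path reuses the buffer, never allocates
        GC.enable();
        GC.collect();                        // collect at a moment we choose
    }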
Mar 14 2017
prev sibling parent Laeeth Isharc <laeethnospam nospam.laeeth.com> writes:
On Tuesday, 7 March 2017 at 16:26:20 UTC, Russel Winder wrote:
 Learn the lesson from Java. It started with a truly crap GC and 
 everyone said Java is crap because the GC is garbage. D has 
 seemingly actually progressed beyond this stage technically but 
 not marketing wise. The Java folk worked on the GC and kept 
 replacing it over and over again. The GC got better and better. 
 Now with the G1 GC almost all the problems have gone away – as 
 has most of the moaning about Java having a crap GC. Most 
 people never notice the GC and those that do, engineer it 
 rather than moaning. The Java GC situation is now a 
 sophisticated one where those who don't really care do not have 
 a problem and those that do care have the tools to deal with it.

 D seems to be in a situation where those who don't care have a 
 crap GC which needs to be improved and those who do care have 
 the tools to deal with it.
 So there needs to be ongoing replacement of the D GC until 
 there is something good, this is a technical problem.
Obviously D generates less garbage... It's really a problem of social organisation as well - as you say in the rest of your post.

For example, Sociomantic released their parallel GC, but it's only a solution for Linux and not Windows, because there is no fork() on Windows. Why isn't it available in a form that will work with the latest dmd master? Because we haven't collectively been able to organise someone to do the work, and nobody has stepped up to do it voluntarily. (And similarly with other GC alternatives - I know that memory barriers have problems too.)

But it's probably only a matter of time, because resources are beginning to flow into supporting the language. The D Foundation didn't exist a couple of years ago - and things didn't magically change once it came into existence. It takes time and the repeated application of effort, but over time it's likely to bear fruit in different ways. Things develop at their own pace, and more complex things develop more slowly.
 That people who care about the effect of GC still think D is a 
 crap GC-based language implies there is a marketing problem, 
 not a technical one.
Yes, but that's okay too. Maybe it's a pity if you would like to work in D, since the smaller community means more limited opportunities. But there are jobs in D, and they are growing.

In the meantime, for those who are able to judge things by how they are and not depend on social proof, it might be a source of strategic advantage to adopt the language earlier - and pay the inevitable toll for that, because it is true that the tooling isn't yet completely mature. E.g. some things we worked on:

https://github.com/dlang/dub/pulls/John-Colvin
 We all know that many, many people see the word garbage 
 collector and run a mile in an uneducated prejudiced way. Who 
 cares about them. We care about the people who are willing to 
 try stuff out and have a problem.
Yes - exactly. Though we can't wave a wand and make problems disappear unless someone is willing to work on them or sponsor development. One thing that's lacking is a central list of projects that would benefit the language and community that enterprise users might sponsor. There's the bug bounty program, but that's something a bit different. Laeeth.
Mar 15 2017
prev sibling parent Atila Neves <atila.neves gmail.com> writes:
On Monday, 6 March 2017 at 15:40:54 UTC, Rico Decho wrote:
 If this isn't a perfect example of D's marketing problem I 
 don't know what is. Someone who likes D and takes the time to 
 write on the forum yet thinks the GC will randomly run no 
 matter what.

 To make it abundantly clear: I'm not bashing on you in the 
 slightest, Rico Decho. I'm just pointing out that there's a 
 clear problem here in that we can't expect to convert e.g. C++ 
 game developers who have never written a line of D before if 
 we haven't even managed to educate the community yet.

 Unfortunately, I have no ideas on how to remedy the situation. 
 I also don't know how to get people to stop believing that C 
 is magically fast either, which I think is a similar 
 perception problem.

 Atila
 Actually it's written in the documentation.
That's true, and it was pointed out to me on Twitter by a C++ dev. I don't know what's up with that, but I'm _pretty_ sure it doesn't happen in practice; only somebody who knows the GC implementation well can comment, I guess.
 If I remember well the garbage collection could be triggered 
 during any allocation, for instance when concatenating some 
 displayed text, and freeze all threads until the garbage 
 collection is done.
Right. So slap `@nogc` on whatever is in the game loop and that's guaranteed not to happen.
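To spell out what `@nogc` buys you (names invented for illustration): the compiler statically rejects anything in such a function that could allocate from the GC heap, so no collection can ever be triggered mid-frame.

    struct GameState { long tick; }

    // Nothing inside an @nogc function may allocate from the GC heap;
    // violations are compile-time errors, not runtime surprises.
    @nogc nothrow void update(ref GameState state)
    {
        state.tick++; // fine: no allocation
        // import std.conv : to;
        // auto label = state.tick.to!string; // would not compile: allocates
    }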
 In my opinion, the problem is D's GC implementation.
This is also a problem, yes.
 For instance, Nim uses a soft (realtime) GC, which is why Nim's 
 author himself has made the Urho3D wrapper :)

 With this approach, there is no need to disable the GC and make 
 manual allocations to keep the GC from freezing all threads.

 Instead you simply use all the standard libraries as normal, 
 while still trying to avoid allocating too much stuff during 
 rendering, of course.

 During the render loop, in Nim you occasionally call the GC 
 with a numeric argument telling how many milliseconds it is 
 allowed to use in the worst case.
That's pretty cool. Atila
Mar 06 2017
prev sibling next sibling parent reply bachmeier <no spam.net> writes:
On Monday, 6 March 2017 at 14:49:42 UTC, Atila Neves wrote:
 Unfortunately, I have no ideas on how to remedy the situation. 
 I also don't know how to get people to stop believing that C is 
 magically fast either, which I think is a similar perception 
 problem.
Writing up a detailed example with code showing how to avoid the GC in the most common situations, posting it on Reddit, and then making it easy to find on dlang.org would be a good start. Given the importance of these issues, it should be one of the first things you see on the homepage.
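For instance, even a short annotated snippet like this one (a sketch of the well-known buffer-reuse idiom) answers a question that comes up constantly:

    void main()
    {
        int[] buf;
        buf.reserve(1024);          // one up-front GC allocation
        foreach (frame; 0 .. 100)
        {
            buf.length = 0;         // drop the contents but keep the capacity
            buf.assumeSafeAppend(); // tell the runtime it may reuse the block
            foreach (i; 0 .. 512)
                buf ~= i;           // appends now reuse the block instead of reallocating
        }
    }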
Mar 06 2017
parent bpr <brogoff gmail.com> writes:
On Monday, 6 March 2017 at 16:42:50 UTC, bachmeier wrote:
 Writing up a detailed example with code showing how to avoid 
 the GC in the most common situations, posting it on Reddit, and 
 then making it easy to find on dlang.org would be a good start. 
 Given the importance of these issues, it should be one of the 
 first things you see on the homepage.
That's a great idea. In fact, I'd like to see multiple examples, with many different approaches to manual memory management, going over the common problems (e.g. "How do I do a writefln in a @nogc block?") and how to solve them in idiomatic D. Something like Rust's guide to unsafe programming. This set of examples could be extended as the upcoming DIPs dealing with resource management make it into D compilers.
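(For that particular question, the usual workaround is to drop down to the C library -- a sketch with an invented function name, assuming printf-style formatting is acceptable:)

    import core.stdc.stdio : printf;

    @nogc nothrow void report(int hits)
    {
        // std.stdio.writefln allocates, so it can't be called here;
        // the C library's printf never touches the GC.
        printf("hits: %d\n", hits);
    }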
Mar 06 2017
prev sibling parent Laeeth Isharc <laeethnospam nospam.laeeth.com> writes:
On Monday, 6 March 2017 at 14:49:42 UTC, Atila Neves wrote:
 On Monday, 6 March 2017 at 05:50:01 UTC, Rico Decho wrote:
 It's actually rather rare to *need* to avoid the GC -- only 
 niche applications need that, like if you're writing a game 
 engine that has to avoid stop-the-world pauses (which can be 
 easily worked around, btw), or real-time medical applications 
 where if it stops for 10ms somebody dies. 90% of real world 
 programs out there work just fine with the GC.
 Actually it's my case. I'd LOVE to use D for game development for instance, but I won't take the risk of having the GC pause the game when I don't want it to, even for just an unknown number of milliseconds, and even if I know that anyway I'll try to limit dynamic allocations by using caches etc.

 If this isn't a perfect example of D's marketing problem I don't know what is. Someone who likes D and takes the time to write on the forum yet thinks the GC will randomly run no matter what. To make it abundantly clear: I'm not bashing on you in the slightest, Rico Decho. I'm just pointing out that there's a clear problem here in that we can't expect to convert e.g. C++ game developers who have never written a line of D before if we haven't even managed to educate the community yet.
"Unfortunately, I have no ideas on how to remedy the situation. " A start would be to link this page prominently from near the front page: https://p0nce.github.io/d-idioms/#The-impossible-real-time-thread https://dlang.org/articles.html
Mar 15 2017
prev sibling parent Patrick Schluter <Patrick.Schluter bbox.fr> writes:
On Friday, 3 March 2017 at 19:49:06 UTC, Jared Jeffries wrote:

 I think that a programming tutorial using D as the first 
 programming language is what is really needed, and fortunately I 
 see that it's now on its way.
Ali Çehreli's book is really good in that regard. He explains programming from the ground up (i.e. explains bits and bytes, what a variable means, etc.) in a very straightforward and clever way. It's one of the best programming intros I've seen, regardless of the language used. That it uses D is the icing on the cake, and as you said, learning D introduces the concepts used in C, C++ and similar languages.
 What D needs too is probably more "fame" on the beginners 
 forums.
Indeed.
 How can people start learning a language if they don't even 
 know it exists and perfectly fulfills their needs?

 To be well known, D just needs people to talk about it more 
 to beginners.

 D is not *just* a language for meta-programming experts and 
 execution speed addicts.

 IMHO, it's also both the *simplest & complete* alternative to 


 That must be said on every forum, at each occasion.

 Stop trying to convince only the expert programmers, most of 
 them are probably not interested in leaving their C++, Java or 

Besides the personal preferences, there are real business and technical constraints that make it difficult for the "experts" to change. In my work, for instance (a big government-like institution), there is an official IT policy that projects have to be written in Java for WebLogic application servers, using Oracle as the database. So if you want support from the IT department, you had better use what they offer in their catalogue.

I have the luck of working on a legacy project (started in 1993), not very visible but central to the whole business of our directorate, which means that we can force the IT department's hand a little bit, so that they have to support our historical constraints (the project is a mix of C (C99), Oracle Pro*C, Perl 5, bash and a Java frontend). Now I'm trying to introduce a little bit of D, but that will only be possible when we have definitively moved from Solaris/SPARC to Linux/x86_64.

TL;DR
Difficult to introduce D when the project runs on Solaris/SPARC and interfaces with an Oracle DB.
Mar 04 2017
prev sibling parent reply Patrick Schluter <Patrick.Schluter bbox.fr> writes:
On Friday, 3 March 2017 at 18:45:50 UTC, Nick Sabalausky 
(Abscissa) wrote:
 On 03/03/2017 10:40 AM, Russel Winder via Digitalmars-d wrote:
 IDEs, vastly more supportive, useful software development 
 functionality
 than editors, especially for debugging, yes.


 It's that last one, the one about getting working software 
 developed
 faster, that is the one that has moved me away from Emacs to 
 IDEs. But
 Perhaps ironically, I used to be big on IDE's (back before the bloat). But between the bloat that started happening to them about 10+ years ago, and various factors that led me to (oddly enough) actually prefer printf debugging, I switched to more basic editors with the whole "Linux is my IDE" setup.
 everyone to their own, there is no universal truth in this 
 arena.
 Definitely true. But I do really wish that the IDE devs would start prioritizing efficiency, UI snappiness, and startup time. Yea, those tools do more, but they don't do THAT much more that would technologically necessitate THAT much of a performance discrepancy. (The plain old editors are far more capable than IDE users seem to believe.)
The thing that annoys me with IDEs is generally not the IDE itself, or even their heaviness. The main problem I encounter with them is that they often end up being tied to the project itself, which means that if you want to build or modify an existing project, you have to install the same IDE as the original developer used. On open source projects it doesn't happen too often, but at work it happens all the time.

The Java jockeys use Eclipse with a lot of extensions; it takes a day alone to install that shit (and be careful: some of them work only on 32-bit Eclipse while others require 64-bit Eclipse). The frontend guys use another Java environment. The desktop apps of our project were built with Visual Studio. The thing is, the old version of our app, which still requires support until it is replaced by the new one, doesn't compile under a recent Visual Studio.

TL;DR
The big issue with IDEs is that they become part of the projects themselves.
Mar 04 2017
parent reply Russel Winder via Digitalmars-d <digitalmars-d puremagic.com> writes:
On Sat, 2017-03-04 at 09:29 +0000, Patrick Schluter via Digitalmars-d
wrote:
 […]
 The thing that annoys me with IDEs is generally not the IDE itself, or even their heaviness. The main problem I encounter with them is that they often end up being tied to the project itself, which means that if you want to build or modify an existing project, you have to install the same IDE as the original developer used. On open source projects it doesn't happen too often, but at work it happens all the time. The Java jockeys use Eclipse with a lot of extensions; it takes a day alone to install that shit (and be careful: some of them work only on 32-bit Eclipse while others require 64-bit Eclipse). The frontend guys use another Java environment. The desktop apps of our project were built with Visual Studio. The thing is, the old version of our app, which still requires support until it is replaced by the new one, doesn't compile under a recent Visual Studio.

 TL;DR
 The big issue with IDEs is that they become part of the projects themselves.

I thought this sort of crap had gone out with the noughties. Any modern project uses something such as CMake, SCons, Meson, Gradle, etc., and the IDEs have plugins to generate the projects from the build system specifications. This is certainly true for all the JetBrains and Eclipse systems I use. As soon as you have to start defining the project in the IDE you are on to a total loser.

So I am agreeing that the experience outlined above is wrong, but I am also saying I have not had that experience with IDEs in the last five years or more.

--
Russel.
===========================================================================
Dr Russel Winder      t: +44 20 7585 2200   voip: sip:russel.winder ekiga.net
41 Buckmaster Road    m: +44 7770 465 077   xmpp: russel winder.org.uk
London SW11 1EN, UK   w: www.russel.org.uk  skype: russel_winder
Mar 05 2017
parent "Nick Sabalausky (Abscissa)" <SeeWebsiteToContactMe semitwist.com> writes:
On 03/05/2017 12:12 PM, Russel Winder via Digitalmars-d wrote:
 On Sat, 2017-03-04 at 09:29 +0000, Patrick Schluter via Digitalmars-d
 wrote:
 TL;DR
 The big issue with IDE's is that they become part of the projects
 themselves.
 I thought this sort of crap had gone out with the noughties. Any modern project uses something such as CMake, SCons, Meson, Gradle, etc., and the IDEs have plugins to generate the projects from the build system specifications. This is certainly true for all the JetBrains and Eclipse systems I use. As soon as you have to start defining the project in the IDE you are on to a total loser. So I am agreeing that the experience outlined above is wrong, but I am also saying I have not had that experience with IDEs in the last five years or more.
It's true for Unity3D, unless you count msbuild/xbuild as being along the same lines as cmake/etc (and even then, I'm not sure how much I'd trust Unity not to mess with hand-edits to the msbuild files... but that fear is mainly just because Unity is very black-boxy when it comes to project building - although not as badly black-boxed as Marmalade's build process last I looked at it, but that was a while ago).
Mar 05 2017
prev sibling next sibling parent Joakim <dlang joakim.fea.st> writes:
On Wednesday, 1 March 2017 at 17:09:51 UTC, Jared Jeffries wrote:
 I'm not talking especially about Quora, even if I admit that 
 it's on this forum that somebody advised me to learn D to 
 improve my object oriented programming skills.

 [...]
 I think it should instead be advertised as the perfect language 
 to learn programming and web development, because that's where 
 it really shines, IMHO.
Why do you believe this? Don't tell me here: write it up as a post on your blog and link it on those fora. That will help advertise the language.
Mar 01 2017
prev sibling parent reply bachmeier <no spam.net> writes:
On Wednesday, 1 March 2017 at 17:09:51 UTC, Jared Jeffries wrote:

 I think it should instead be advertised as the perfect language 
 to learn programming and web development, because that's where 
 it really shines, IMHO.
I agree, but we need an intro to programming class using D as the language in order to do that. Most of the materials for the language assume you have programming experience. You can't just say that D allows you to use pointers and other low-level constructs, you have to explain what a pointer is and what you do with it. The same goes for functional programming, OOP, contracts, compile time, and so on. That's a big task. Ali's book is great as an introduction to the language, but not really sufficient as a beginning programming tutorial.
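To illustrate the gap: a beginning tutorial has to stop and explain every line of even a snippet as small as this one (invented for illustration) -- what an address is, why writing through p changes x, and so on.

    import std.stdio;

    void main()
    {
        int x = 42;
        int* p = &x; // p stores the address of x, not the value 42
        *p = 7;      // writing through the pointer changes x itself
        writeln(x);  // prints 7
    }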
Mar 01 2017
next sibling parent reply aberba <karabutaworld gmail.com> writes:
On Wednesday, 1 March 2017 at 18:34:22 UTC, bachmeier wrote:
 On Wednesday, 1 March 2017 at 17:09:51 UTC, Jared Jeffries 
 wrote:

 I think it should instead be advertised as the perfect 
 language to learn programming and web development, because 
 that's where it really shines, IMHO.
I agree, but we need an intro to programming class using D as the language in order to do that. Most of the materials for the language assume you have programming experience. You can't just say that D allows you to use pointers and other low-level constructs, you have to explain what a pointer is and what you do with it. The same goes for functional programming, OOP, contracts, compile time, and so on. That's a big task. Ali's book is great as an introduction to the language, but not really sufficient as a beginning programming tutorial.
That's the gap I'm trying to fill:
https://github.com/aberba/learn-coding
Mar 01 2017
parent Jared Jeffries <jared.jeffries yandex.com> writes:
 That's the gap I'm trying to fill:
 https://github.com/aberba/learn-coding
GREAT INITIATIVE!!!
Mar 02 2017
prev sibling parent reply Martin Tschierschke <mt smartdolphin.de> writes:
On Wednesday, 1 March 2017 at 18:34:22 UTC, bachmeier wrote:
 On Wednesday, 1 March 2017 at 17:09:51 UTC, Jared Jeffries 
 wrote:

 I think it should instead be advertised as the perfect 
 language to learn programming and web development, because 
 that's where it really shines, IMHO.
I agree, but we need an intro to programming class using D as the language in order to do that. Most of the materials for the language assume you have programming experience. You can't just say that D allows you to use pointers and other low-level constructs, you have to explain what a pointer is and what you do with it. The same goes for functional programming, OOP, contracts, compile time, and so on. That's a big task. Ali's book is great as an introduction to the language, but not really sufficient as a beginning programming tutorial.
??? I think the book is written exactly to work as a beginning programming tutorial. Maybe my impression was wrong.

What would be very useful would be to have one .deb learning package, including a compiler, an IDE, many simple tutorial-like examples, and the ability to write simple programs with GUI output. Do you know basic256? https://sourceforge.net/projects/kidbasic/ It offers a convenient programming experience for beginners.

I started to learn programming (BASIC) with a traditional home computer in the 80's (Schneider/Amstrad CPC6128). The best thing was, you only needed to switch it on, and just by typing "DRAW 640,400" a line was drawn from the bottom left to the top right corner.

Now give someone a new computer and ask him to do the same. How many years of computer experience would be needed? How many tools would I need to install?

Regards mt.
Mar 02 2017
next sibling parent rikki cattermole <rikki cattermole.co.nz> writes:
On 03/03/2017 12:10 AM, Martin Tschierschke wrote:
 On Wednesday, 1 March 2017 at 18:34:22 UTC, bachmeier wrote:
 On Wednesday, 1 March 2017 at 17:09:51 UTC, Jared Jeffries wrote:

 I think it should instead be advertised as the perfect language to
 learn programming and web development, because that's where it really
 shines, IMHO.
 I agree, but we need an intro to programming class using D as the language in order to do that. Most of the materials for the language assume you have programming experience. You can't just say that D allows you to use pointers and other low-level constructs, you have to explain what a pointer is and what you do with it. The same goes for functional programming, OOP, contracts, compile time, and so on. That's a big task. Ali's book is great as an introduction to the language, but not really sufficient as a beginning programming tutorial.

 ??? I think the book is written exactly to work as a beginning programming tutorial. Maybe my impression was wrong. What would be very useful would be to have one .deb learning package, including a compiler, an IDE, many simple tutorial-like examples, and the ability to write simple programs with GUI output. Do you know basic256? https://sourceforge.net/projects/kidbasic/ It offers a convenient programming experience for beginners. I started to learn programming (BASIC) with a traditional home computer in the 80's (Schneider/Amstrad CPC6128). The best thing was, you only needed to switch it on, and just by typing "DRAW 640,400" a line was drawn from the bottom left to the top right corner. Now give someone a new computer and ask him to do the same. How many years of computer experience would be needed? How many tools would I need to install? Regards mt.
I recently bought[0] a book from my childhood (yes yes, it was written long before my time). I am not aware of any other book that teaches programming like it, with a nice narrative, good funny comics and pretty simple code.

The sad reality is that computers today are so much more advanced than they used to be that most programmers never truly understand the cost of the things they think are "normal". Like GUIs (hint: a massive thing I hate people wanting when they start out).

[0] https://www.amazon.co.uk/d/cka/Childs-Guide-BBC-Micro-John-Dewhirst/0521277302
Mar 02 2017
prev sibling parent reply bachmeier <no spam.net> writes:
On Thursday, 2 March 2017 at 11:10:54 UTC, Martin Tschierschke 
wrote:
 I started to learn programming (BASIC) with a traditional home 
 computer in the 80's
 (Schneider/Amstrad CPC6128).

 The best thing was, you only needed to switch it on, and just 
 by typing "DRAW 640,400" a line was drawn from the bottom 
 left to the top right corner.

 Now give someone a new computer and ask him to do the same.
 How many years of computer experience would be needed?
 How many tools would I need to install?
I too learned to program using BASIC sometime in the mid-80's. The "enterprise" side of things has created a completely unnecessary learning curve. Java being used to teach intro to computing was successful at exactly one thing - it drove people away from programming.

I spend my days working with graduate students in economics departments. They have to program for their research, but most of them have never taken a programming class. I use RStudio server. Students need only a browser to do fairly complicated analyses. Once you eliminate the startup costs, it's amazing how easy it is for them to learn.

Do we have such a thing with D? Unfortunately we are moving in the wrong direction. New users are told to write configuration files for Hello World.
Mar 02 2017
next sibling parent sarn <sarn theartofmachinery.com> writes:
On Thursday, 2 March 2017 at 15:32:26 UTC, bachmeier wrote:
 On Thursday, 2 March 2017 at 11:10:54 UTC, Martin Tschierschke 
 wrote:
 I started to learn programming (BASIC) with an traditional 
 home computer in the 80's
 (Schneider/Amstrad CPC6128).

 The best thing was, you only needed to switch it on and only 
 with typing "DRAW 640,400" a line was drawn from the bottom 
 left to the top right corner.

 Now give someone a new computer and ask him to do the same?
 How many years of computer experience will be needed?
 How many tool would I need to install?
I too learned to program using BASIC sometime in the mid-80's. The "enterprise" side of things has created a completely unnecessary learning curve. Java being used to teach intro to computing was successful at exactly one thing - it drove people away from programming. ... Do we have such a thing with D? Unfortunately we are moving in the wrong direction. New users are told to write configuration files for Hello World.
I started on a (then obsolete) Acorn Electron and later moved to QBasic. This article was written not too long ago. First I laughed; then it convinced me :)

http://www.nicolasbize.com/blog/30-years-later-qbasic-is-still-the-best/

(Cached version because it seems to be down: https://webcache.googleusercontent.com/search?q=cache:ZyMCKqG-ZKMJ:http://www.nicolasbize.com/blog/30-years-later-qbasic-is-still-the-best/+qbasic+is+still+the+best hl=ja&gbv=1&ct=clnk )

I wish everyone teaching beginner programming would read that.
Mar 02 2017
prev sibling next sibling parent reply "Nick Sabalausky (Abscissa)" <SeeWebsiteToContactMe semitwist.com> writes:
On 03/02/2017 10:32 AM, bachmeier wrote:
 I too learned to program using BASIC sometime in the mid-80's. The
Ditto here (well, late 80's). AppleSoft Basic on Apple IIc.
 "enterprise" side of things has created a completely unnecessary
 learning curve. Java being used to teach intro to computing was
 successful at exactly one thing - it drove people away from programming.
Oh, it was a disaster in so many ways, it even wound up warping the whole process of teaching programming basics into a totally ineffective mess. Case in point:
 Do we have such a thing with D? Unfortunately we are moving in the wrong
 direction. New users are told to write configuration files for Hello World.
Lot of truth to that: I'm a big fan of using D instead of bash for scripting purposes, and about a couple months ago I found myself promoting that approach to a guy who writes a lot of bash scripts and python. It was embarrassing to show this as the "hello world" for replacing bash scripts with D:

-------------------------
/+ dub.sdl:
    // This is the config file for the DUB package manager,
    // but it can be embedded into your main .d file like this.
    name "myscript"
    dependency "scriptlike" version="~>0.9.6"
+/
import scriptlike;

void main(string[] args)
{
    run("echo Foobar > stuff.txt");
}
-------------------------

I don't think I won him over.
Mar 02 2017
next sibling parent reply "H. S. Teoh via Digitalmars-d" <digitalmars-d puremagic.com> writes:
On Thu, Mar 02, 2017 at 07:12:07PM -0500, Nick Sabalausky (Abscissa) via
Digitalmars-d wrote:
 On 03/02/2017 10:32 AM, bachmeier wrote:
 
 I too learned to program using BASIC sometime in the mid-80's. The
Ditto here (well, late 80's). AppleSoft Basic on Apple IIc.
Ahh, the memories! (And how I am dating myself... but who cares.) Such fond memories of evenings spent poring over AppleSoft code trying for the first time in my life to write programs. And those lovely error messages with backwards punctuation:

	?SYNTAX ERROR

:-)

[...]
 Back around the height of Java schools, I was a tutor for CS 101
 (Intro to programming, using Java) students at a university around
 here (John Carroll University). There were two instructors who taught
 the course: A prof who'd been teaching it since well before the Java
 craze, and a Java zealot who was constantly bragging about how she'd come
 direct from a real software dev studio and had real-world experience
 with the right way of doing things.
 
 The two approached their classes very differently:
 
 The first one, the one who had been teaching code since before Java,
 started out by teaching basic flow-of-execution. "This statement runs,
 then the next one, then the next one." Conditions, loops, functions,
 etc.
 
 The second teacher, the one who was neck-deep in the "Java/OOP is our
 god, we must not question" madness that was common in that time
 period...didn't teach it that way. Those students were instead dropped
 straight into object-oriented modeling. Because, of course, OOP is
 "the right way to do everything", as most programmers believed circa
 early 2000's.
I was skeptical of OO, and especially of Java, at the time. It's odd, given that I had just been learning C++ in college and was familiar with OO concepts, but when I saw the way Java pushed for OO to the exclusion of all else, I balked. Call me a non-conformist or whatever, but every time I see too much hype surrounding something, my kneejerk reaction is to be skeptical of it. I eschew all bandwagons. [...]
 So, literally with ZERO exceptions: EVERY student I got from the first
 teacher's class pretty much knew what they were doing and were only
 coming to me for confirmation that they were on the right track.
 Invariably they were. And EVERY (again, zero exceptions) student I got
 from the second teacher's class was *completely* and utterly lost, and
 didn't even have enough grasp of the basics of basics that I was able
 to help them get a working program - at ALL.
 
 Java definitely had good points (especially compared to the C++ that was so
 predominant before Java stole its thunder), but it also lead to some real
 major blunders and corrupted a lot of minds.
To be fair, though, Java as a language in and of itself is not bad at all. In fact, in its own way, it's a pretty nicely designed language. Idealistic, and in some sense approaching perfection. But in an idealistic bubble-world kind of way (the ugliest parts of Java, IMO, are where it has to interact with the real world -- but nevertheless, it isn't *bad* in itself). The mentality and hype of the community surrounding it, though, seem to me to have gone off the deep end, and have bred rabid zealots, sad to say, to this very day, of the kind of calibre you described above.

(I also TA'd a Java course back in the day, and was quite appalled to observe the number of thoroughly-confused students who couldn't tell control flow from OO, because "classes" had been hammered into their heads long before they even understood what a statement was. Apparently, imperative statements are non-OO and therefore evil, so one was supposed to wrap literally everything in classes. Nobody ever explained how one would implement class methods without using statements, though. I suppose calling other class methods was excepted from the "evil" label, but it seemed to escape people's minds that eventually nothing would actually get accomplished if all you had was an infinite regress of calling class methods with no imperative statements in between. But such was the rabid OO-fanaticism in those days.)
 Do we have such a thing with D? Unfortunately we are moving in the
 wrong direction. New users are told to write configuration files for
 Hello World.
Lot of truth to that: I'm a big fan of using D instead of bash for scripting purposes, and about a couple months ago I found myself promoting that approach to a guy who writes a lot of bash scripts and python. It was embarrassing to show this as the "hello world" for replacing bash scripts with D:
Ha! Let the rotten tomatoes fly, but I am a skeptic when it comes to dub (or any other such tool, really -- I mean no offense to Sonke). Sure, they have their place in large software projects with potentially complicated external dependencies, but for Hello World? C'mon, now. Whatever happened to just:

	import std.stdio;
	void main() { writeln("Hello world!"); }

And seriously, what kind of koolaid have kids these days been fed, that they can no longer work with the filesystem but need to be spoonfed by some automated tool? Y'know, we can't just, like, download scriptlike.d and put it in a subdir, and then import it. Oh, no, there's no app for this, and you can't do it by pressing a big red button on your handheld touchscreen, so it doesn't count. Filesystem? What's that? I only know of downloading stuff in a browser, which magically gets shuffled somewhere into the "memory" of my device and automatically gets found when I need it, because technology is just that cool. What's the filesystem thing you speak of? Download folder? What's that?

Sigh.

T

--
Music critic: "That's an imitation fugue!"
Mar 03 2017
next sibling parent Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
On 03/03/2017 12:33 PM, H. S. Teoh via Digitalmars-d wrote:
 Ahh, the memories!
(Please keep memories marked with [OT]. Thanks! -- Andrei)
Mar 03 2017
prev sibling parent "Nick Sabalausky (Abscissa)" <SeeWebsiteToContactMe semitwist.com> writes:
On 03/03/2017 12:33 PM, H. S. Teoh via Digitalmars-d wrote:
 Call me a non-conformist or whatever, but every
 time I see too much hype surrounding something, my kneejerk reaction is
 to be skeptical of it.  I eschew all bandwagons.
Yea, I'm the same way. Not even a deliberate thing really, just my natural reaction. Can be good or bad, depending.
 To be fair, though, Java as a language in and of itself is not bad at
 all. In fact, in its own way, it's a pretty nicely designed language.
Right. Having come from C/C++ (C++ *was* "C with classes" at the time), Java taught me how a module system and class syntax *should* work. It made two big impressions on me:

1. How much quicker and easier it made certain programming tasks.
2. How much more of a gigantic pain it made other programming tasks.

That, incidentally, is what set me out on a language search that led to (a very early version of) D.
 (I also TA'd a Java course back in the day, and was quite appalled to
 observe the number of thoroughly-confused students who couldn't tell
 control flow from OO, because "classes" had been hammered into their
 heads long before they even understood what a statement was.
 Apparently, imperative statements are non-OO and therefore evil, so one
 was supposed to wrap literally everything in classes. Nobody ever
 explained how one would implement class methods without using
 statements, though.  I suppose calling other class methods was excepted
 from the "evil" label, but it seemed to escape people's minds that
 eventually nothing would actually get accomplished if all you had was an
 infinite regress of calling class methods with no imperative statements
 in between. But such was the rabid OO-fanaticism in those days.)
Yup. That's a perfect description of exactly what I observed.
 Ha!  Let the rotten tomatoes fly, but I am a skeptic when it comes to
 dub (or any other such tool, really -- I mean no offense to Sonke). Sure
 they have their place in large software projects with potentially
 complicated external dependencies, but for Hello World?
Yea, I have a love/hate thing with dub. Try to use something like vibe.d in a project that isn't built with dub (or think back to our own dsource days), and you'll quickly see why something like dub is needed. But, the pain and uselessness of dub in projects that only need it for package management, and use their own choice of build system - well, that's been a huge thorn in my side for years. It should have been a top priority from day one for dub to be usable like "pkg-config --cflags --libs". But even with all the work I put into "dub describe --data" it still isn't quite there :(
Mar 03 2017
prev sibling next sibling parent Russel Winder via Digitalmars-d <digitalmars-d puremagic.com> writes:
On Fri, 2017-03-03 at 09:33 -0800, H. S. Teoh via Digitalmars-d wrote:
 On Thu, Mar 02, 2017 at 07:12:07PM -0500, Nick Sabalausky (Abscissa)
 via Digitalmars-d wrote:
 […]

 Ahh, the memories! (And how I am dating myself... but who cares.) Such fond memories of evenings spent poring over AppleSoft code trying for the first time in my life to write programs. And those lovely error messages with backwards punctuation:

 	?SYNTAX ERROR

 :-)

Youngster. :-)

Oh for the days when the only error message you ever got was 0c4.

 […]
 I was skeptical of OO, and especially of Java, at the time. It's odd, given that I had just been learning C++ in college and was familiar with OO concepts, but when I saw the way Java pushed for OO to the exclusion of all else, I balked. Call me a non-conformist or whatever, but every time I see too much hype surrounding something, my kneejerk reaction is to be skeptical of it. I eschew all bandwagons.

So how come you are on the D bandwagon? ;-)

Having been involved in the OO scene from 1983 to, well, now I guess, yes, I look back on the functional/OO war and the religious aspects of it all with great embarrassment.

 […]

 To be fair, though, Java as a language in and of itself is not bad at all. In fact, in its own way, it's a pretty nicely designed language. Idealistic, and in some sense approaching perfection. But in an idealistic bubble-world kind of way (the ugliest parts of Java, IMO, are where it has to interact with the real world -- but nevertheless, it isn't *bad* in itself). The mentality and hype of the community surrounding it, though, seem to me to have gone off the deep end, and have bred rabid zealots, sad to say, to this very day, of the kind of calibre you described above.

Whilst I can see that of the 1994 to 2014 period, I am not sure I see it so much that way now. There are developers in Java shops who are a bit "jobsworth" and care little for personal development, and they are the people who refuse to accept the existence of languages other than Java. However, most of the Java folk at the main conferences are actually JVM folk and they know languages such as Kotlin, Scala, Clojure, Groovy, Ceylon, Frege, etc. as well as Java. The zealotry, when present, is more about the JVM than Java per se.

 (I also TA'd a Java course back in the day, and was quite appalled to observe the number of thoroughly-confused students who couldn't tell control flow from OO, because "classes" had been hammered into their heads long before they even understood what a statement was. Apparently, imperative statements are non-OO and therefore evil, so one was supposed to wrap literally everything in classes. Nobody ever explained how one would implement class methods without using statements, though. I suppose calling other class methods was excepted from the "evil" label, but it seemed to escape people's minds that eventually nothing would actually get accomplished if all you had was an infinite regress of calling class methods with no imperative statements in between. But such was the rabid OO-fanaticism in those days.)

There were, and are, a lot of bad teachers. Overzealous, as it seems in this episode. This does not make "objects first" a bad idea per se, it just has to be done properly. Just as teaching bottom-up from statements does. A bad teacher can teach any curriculum badly; that should not reflect on the curriculum.

 […]

 Ha! Let the rotten tomatoes fly, but I am a skeptic when it comes to dub (or any other such tool, really -- I mean no offense to Sonke). Sure, they have their place in large software projects with potentially complicated external dependencies, but for Hello World? C'mon, now. Whatever happened to just:

 	import std.stdio;
 	void main() { writeln("Hello world!"); }

 And seriously, what kind of koolaid have kids these days been fed, that they can no longer work with the filesystem but need to be spoonfed by some automated tool? Y'know, we can't just, like, download scriptlike.d and put it in a subdir, and then import it. Oh, no, there's no app for this, and you can't do it by pressing a big red button on your handheld touchscreen, so it doesn't count. Filesystem? What's that? I only know of downloading stuff in a browser, which magically gets shuffled somewhere into the "memory" of my device and automatically gets found when I need it, because technology is just that cool. What's the filesystem thing you speak of? Download folder? What's that?

I am not sure where this one comes from. Here in the UK most 6-year-olds are now happy manipulating filesystems. Sadly Windows ones, but I'm working on it.

--
Russel.
===========================================================================
Dr Russel Winder      t: +44 20 7585 2200   voip: sip:russel.winder ekiga.net
41 Buckmaster Road    m: +44 7770 465 077   xmpp: russel winder.org.uk
London SW11 1EN, UK   w: www.russel.org.uk  skype: russel_winder
Mar 05 2017
prev sibling parent reply "H. S. Teoh via Digitalmars-d" <digitalmars-d puremagic.com> writes:
On Sun, Mar 05, 2017 at 05:26:08PM +0000, Russel Winder via Digitalmars-d wrote:
 On Fri, 2017-03-03 at 09:33 -0800, H. S. Teoh via Digitalmars-d wrote:
 On Thu, Mar 02, 2017 at 07:12:07PM -0500, Nick Sabalausky (Abscissa)
 via Digitalmars-d wrote:
 […]
Ahh, the memories! (And how I am dating myself... but who cares.)  Such fond memories of evenings spent poring over AppleSoft code trying for the first time in my life to write programs. And those lovely error messages with backwards punctuation: ?SYNTAX ERROR :-)
Youngster. :-) Oh for the days when the only error message you ever got was 0c4.
I bow before your venerable age! :-P
 […]
 I was skeptical of OO, and especially of Java, at the time.  It's
 odd, given that I had just been learning C++ in college and was
 familiar with OO concepts, but when I saw the way Java pushed for OO
 to the exclusion of all else, I balked.  Call me a non-conformist or
 whatever, but every time I see too much hype surrounding something,
 my kneejerk reaction is to be skeptical of it.  I eschew all
 bandwagons.
So how come you are on the D bandwagon? ;-)
[...] Haha, you got me there. Though truth be told, I only chose D after a long search for a better language and finding nothing that more closely matches my ideal of what a programming language should be. And at the time, there wasn't much hype surrounding D at all (in fact, I would never have found it had I not been actively searching for new programming languages). [...]
 To be fair, though, Java as a language in and of itself is not bad
 at all. [...]  The mentality and hype of the community surrounding
 it, though, seem to me to have gone off the deep end, and have bred
 rabid zealots, sad to say, to this very day, of the kind of calibre
 you described above.
Whilst I can see that of the 1994 to 2014 period, I am not sure I see it so much that way now. There are developers in Java shops who are a bit "jobsworth" and care little for personal development, and they are the people who refuse to accept the existence of languages other than Java. However most of the Java folk at the main conferences are actually JVM folk and they know languages such as Kotlin, Scala, Clojure, Groovy, Ceylon, Frege, etc. as well as Java. The zealotry, when present, is more about the JVM than Java per se.
Perhaps my perception is colored by a close acquaintance who happens to be a Java zealot to this very day. :-P JVM zealotry, OTOH, I don't see very much at all. In fact, I'd never even heard such a term until you said it.
 (I also TA'd a Java course back in the day, and was quite appalled
 to observe the number of thoroughly-confused students who couldn't
 tell control flow from OO, because "classes" had been hammered into
 their heads long before they even understood what a statement was.
 Apparently, imperative statements are non-OO and therefore evil, so
 one was supposed to wrap literally everything in classes. Nobody
 ever explained how one would implement class methods without using
 statements, though.  I suppose calling other class methods was
 excepted from the "evil" label, but it seemed to escape people's
 minds that eventually nothing would actually get accomplished if all
 you had was an infinite regress of calling class methods with no
 imperative statements in between. But such was the rabid
 OO-fanaticism in those days.)
There were, and are, a lot of bad teachers. Overzealous as it seems in this episode. This does not make "objects first" a bad idea per se, it just has to be done properly. Just as teaching bottom up from statement does. A bad teacher can teach any curriculum badly, that should not reflect on the curriculum.
[...]

The thing that gets to me is that these teachers, good or bad, committed the fallacy of embracing a single paradigm to the exclusion of everything else, even in the face of obvious cases where said paradigm didn't fit very well with the problem domain. Some aspects of Java also reflect this same fallacy -- such as those ubiquitous singleton static classes in the OS-wrapping modules, or the impossibility of declaring a function outside of a class -- which to me are indications that it wasn't just the teachers, but a more pervasive trend in the Java ecosystem of putting on OO-centric blinders.

T

--
In a world without fences, who needs Windows and Gates? -- Christian Surchi
Mar 06 2017
parent reply "Nick Sabalausky (Abscissa)" <SeeWebsiteToContactMe semitwist.com> writes:
On 03/06/2017 07:47 PM, H. S. Teoh via Digitalmars-d wrote:
 On Sun, Mar 05, 2017 at 05:26:08PM +0000, Russel Winder via Digitalmars-d
wrote:
 Oh for the days when the only error message you ever got was 0c4.
You can get similar experiences even in modern times in the embedded area (at least hobbyist anyway, I guess there is all that JTAG stuff). I remember doing some demos on a late prototype Propeller MC, and there were times all I had for debugging was a single solitary LED. To this day, I still can't decide whether that was fun or horrible. I must have a bit of a masochist in me :P
 The zealotry,
 when present, is more about the JVM than Java per se.
Perhaps my perception is colored by a close acquiantance who happens to be a Java zealot to this very day. :-P JVM zealotry, OTOH, I don't see very much at all. In fact, I've never even heard such a term until you said it.
I learned the true meaning of Java zealotry ten or so years ago, when talking to a co-worker (our resident Java fan - 'course, this was a VB6 house so I can't entirely blame him for Java fandom) and I made some remark involving checked exceptions (which, at the time, were already widely considered problematic, or even a mistake, even within the Java world). I was stunned to see a quizzical expression on his face and then learn he was genuinely puzzled by the mere suggestion of Java's checked exceptions having any downside. Luckily, this does seem much less common than it was at the time.
 The thing that gets to me is that these teachers, good or bad, committed
 the fallacy of embracing a single paradigm to the exclusion of
 everything else, even in the face of obvious cases where said paradigm
 didn't fit very well with the problem domain.  Some aspects of Java also
 reflect this same fallacy -- such as those ubiquitous singleton static
 classes in the OS-wrapping modules, or the impossibility of declaring a
 function outside of a class -- which to me are indications that it
 wasn't just the teachers, but a more pervasive trend in the Java
 ecosystem of putting on OO-centric blinders.
Yes, this. Although, granted, the OO-koolaid *was* quite strong indeed in those days.

It really is strange to look back on all that, when I was fairly sold on OO too (just not quite as fanatically so), and compare to now: At this point I feel that class-based polymorphism mostly just turned out to be an awkward work-around for the lack of first-class functions and closures in mainstream languages. What convinced me: After years of using D, I find myself using OO less and less (OO polymorphism nearly never, aside from exception hierarchies), and instead of feeling hamstrung I feel liberated - and I'm normally a kitchen-sink kinda guy!
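To illustrate (names invented): where I once would have reached for a one-method interface plus a class per behaviour, a delegate and a closure now do the same job:

    // The old reflex: interface Validator { bool check(string s); } plus a
    // class for every behaviour. With first-class functions, one alias suffices:
    alias Check = bool delegate(string);

    bool runAll(string s, Check[] checks)
    {
        foreach (c; checks)
            if (!c(s))
                return false;
        return true;
    }

    void main()
    {
        import std.stdio : writeln;

        size_t minLen = 3; // captured by the closure below; no class needed
        Check longEnough = (string s) => s.length >= minLen;
        Check noLeadingSpace = (string s) => s.length > 0 && s[0] != ' ';
        writeln(runAll("hello", [longEnough, noLeadingSpace])); // true
    }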
Mar 06 2017
parent "H. S. Teoh via Digitalmars-d" <digitalmars-d puremagic.com> writes:
On Mon, Mar 06, 2017 at 10:41:06PM -0500, Nick Sabalausky (Abscissa) via
Digitalmars-d wrote:
[...]
 Yes, this. Although, granted, the OO-koolaid *was* quite strong indeed
 in those days.
 
 It really is strange to look back on all that, when I was fairly sold
 on OO too (just not quite as fanatically so), and compare to now:
 
 At this point I feel that class-based polymorphism mostly just turned
 out to be an awkward work-around for the lack of first-class functions
 and closures in mainstream languages. What convinced me: After years
 of using D, I find myself using OO less and less (OO polymorphism
 nearly never, aside from exception hierarchies), and instead of
 feeling hamstrung I feel liberated - and I'm normally a kitchen-sink
 kinda guy!
I was never fully "sold" on the OO bandwagon, though I did appreciate the different way of looking at a programming problem. While I found OO to be a nice way of structuring a program that deals with highly-structured data (it was like abstract data types on steroids), I never really understood the folks who see it as the be-all and end-all and want to essentially recast all of computer science in OO terms.

Like you, after coming to terms with D's duck-typing range idioms I've started moving away from OO and leaning more in the direction of generic programming via templates. These days I even prefer static polymorphism via structs and alias this to full-out classes. Of course, classes still do have their place when runtime polymorphism is needed, and I do use them at times. But they occupy a far smaller percentage of my code than the OO advocates would rally for.

T

--
Do not reason with the unreasonable; you lose by definition.
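P.S. For the record, a sketch of what I mean by static polymorphism, with invented types: no base class, no vtable -- if the type has the right members, it compiles.

    // kelvin() accepts any type with an asKelvin() member; the dispatch is
    // resolved per instantiation at compile time.
    struct Celsius
    {
        double degrees;
        double asKelvin() { return degrees + 273.15; }
    }

    struct Fahrenheit
    {
        double degrees;
        double asKelvin() { return (degrees - 32) / 1.8 + 273.15; }
    }

    double kelvin(T)(T t)
    {
        return t.asKelvin();
    }

    void main()
    {
        import std.stdio : writeln;
        writeln(kelvin(Celsius(25.0)));    // 298.15
        writeln(kelvin(Fahrenheit(77.0))); // 298.15
    }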
Mar 07 2017
prev sibling parent reply jmh530 <john.michael.hall gmail.com> writes:
On Thursday, 2 March 2017 at 15:32:26 UTC, bachmeier wrote:
 I spend my days working with graduate students in economics 
 departments. They have to program for their research, but most 
 of them have never taken a programming class. I use RStudio 
 server. Students need only a browser to do fairly complicated 
 analyses. Once you eliminate the startup costs, it's amazing 
 how easy it is for them to learn.

 Do we have such a thing with D? Unfortunately we are moving in 
 the wrong direction. New users are told to write configuration 
 files for Hello World.
A Jupyter kernel would go a long way to students being able to easily play around with it in a browser. There's already dabble (D REPL) that one could make use of. I was surprised at the breadth of the list of kernels available these days.
Mar 03 2017
parent reply Seb <seb wilzba.ch> writes:
On Friday, 3 March 2017 at 19:18:31 UTC, jmh530 wrote:
 A Jupyter kernel would go a long way to students being able to 
 easily play around with it in a browser. There's already dabble 
 (D REPL) that one could make use of. I was surprised at the 
 breadth of the list of kernels available these days.
There is also drepl: https://github.com/drepl/drepl though of course it's not comparable with a Jupyter kernel.
Mar 04 2017
parent reply Jon Degenhardt <jond noreply.com> writes:
On Saturday, 4 March 2017 at 09:13:15 UTC, Seb wrote:
 On Friday, 3 March 2017 at 19:18:31 UTC, jmh530 wrote:
 A Jupyter kernel would go a long way to students being able to 
 easily play around with it in a browser. There's already 
 dabble (D REPL) that one could make use of. I was surprised at 
 the breadth of the list of kernels available these days.
There is also drepl: https://github.com/drepl/drepl though of course it's not comparable with a Jupyter kernel.
A D repl followed by a Jupyter kernel would be a great assist in engaging the data science community.
Mar 04 2017
next sibling parent Russel Winder via Digitalmars-d <digitalmars-d puremagic.com> writes:
On Sat, 2017-03-04 at 19:36 +0000, Jon Degenhardt via Digitalmars-d
wrote:
 On Saturday, 4 March 2017 at 09:13:15 UTC, Seb wrote:
 On Friday, 3 March 2017 at 19:18:31 UTC, jmh530 wrote:
 A Jupyter kernel would go a long way to students being able to 
 easily play around with it in a browser. There's already 
 dabble (D REPL) that one could make use of. I was surprised at 
 the breadth of the list of kernels available these days.

 There is also drepl:

 https://github.com/drepl/drepl

 though of course it's not comparable with a Jupyter kernel.

 A D repl followed by a Jupyter kernel would be a great assist in 
 engaging the data science community.

Never underestimate the power of Jupyter within data science and finance. Just last month I reinforced a shift of another hedge fund from Matlab and R to Python using the power of Jupyter.

--
Russel.
===========================================================================
Dr Russel Winder      t: +44 20 7585 2200   voip: sip:russel.winder ekiga.net
41 Buckmaster Road    m: +44 7770 465 077   xmpp: russel winder.org.uk
London SW11 1EN, UK   w: www.russel.org.uk  skype: russel_winder
Mar 05 2017
prev sibling parent reply Laeeth Isharc <laeethnospam nospam.laeeth.com> writes:
On Saturday, 4 March 2017 at 19:36:30 UTC, Jon Degenhardt wrote:
 On Saturday, 4 March 2017 at 09:13:15 UTC, Seb wrote:
 On Friday, 3 March 2017 at 19:18:31 UTC, jmh530 wrote:
 A Jupyter kernel would go a long way to students being able 
 to easily play around with it in a browser. There's already 
 dabble (D REPL) that one could make use of. I was surprised 
 at the breadth of the list of kernels available these days.
There is also drepl: https://github.com/drepl/drepl though of course it's not comparable with a Jupyter kernel.
A D repl followed by a Jupyter kernel would be a great assist in engaging the data science community.
A Jupyter kernel exists - written by John Colvin. It works and I have used it, and you can write Python in one cell and D in another. It needs some work though, so I am sure that if somebody would like to contribute pull requests, John would be happy to consider them.

https://github.com/John-Colvin/PydMagic

Well - I am not sure if it's officially a kernel, as I can't remember how PydMagic works. But in effect it gets you most of the way there and is usable.
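From memory, the cell-magic flow looks roughly like the sketch below. The extension name and the wrapper boilerplate are assumptions pieced together from pyd's documented def!/PydMain pattern, so check the repository README before relying on any of it:

    %load_ext pyd_magic   # assumed extension name; the real one may differ

    %%pyd
    // D code in a notebook cell, compiled and exposed to Python via pyd
    import pyd.pyd;

    string hello() { return "Hello from D!"; }

    extern(C) void PydMain()
    {
        def!(hello)();   // expose hello() to Python
        module_init();
    }

PydMagic may well generate some or all of the PydMain boilerplate automatically; the point is only that a D function ends up callable from the Python side of the notebook.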
Mar 15 2017
parent reply jmh530 <john.michael.hall gmail.com> writes:
On Wednesday, 15 March 2017 at 21:10:49 UTC, Laeeth Isharc wrote:
 A Jupyter kernel exists - written by John Colvin.  It works and 
 I have used it, and you can write Python in one cell and D in 
 another.  It needs some work though, and so I am sure if 
 somebody would like to contribute pull requests John would be 
 happy to consider them.

 https://github.com/John-Colvin/PydMagic

 Well - I am not sure if it's officially a kernel as I can't 
 remember how PydMagic works.  But in effect it gets you most of 
 the way there and is usable.
I thought PydMagic lets one easily write D code to call from Python code in Jupyter. A D Jupyter kernel would allow one to use Jupyter with D the same way one can use it with Python or R. PydMagic probably gets part of the way there, and some of the other projects mentioned above would probably help in other ways.

I consider it a nice-to-have, but I wouldn't give it a super high priority. The real advantage is for getting people who would otherwise use Python or R or other languages to use D. rdmd works just fine for me.
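For what it's worth, the script-like workflow rdmd enables looks something like this minimal sketch (the file name is invented for the example):

    #!/usr/bin/env rdmd
    // hello.d - rdmd compiles this on first run and caches the binary,
    // so subsequent runs start almost instantly
    import std.stdio;

    void main()
    {
        writeln("Hello from a D 'script'");
    }

Making the file executable (chmod +x hello.d) lets you run it directly as ./hello.d, which is close enough to an interpreted workflow for quick exploration.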
Mar 15 2017
parent reply Russel Winder via Digitalmars-d <digitalmars-d puremagic.com> writes:
On Wed, 2017-03-15 at 23:36 +0000, jmh530 via Digitalmars-d wrote:
 […]
 otherwise use Python or R or other languages to use D. rdmd works 
 just fine for me.

Except that rdmd needs separating out as a distinct thing so that ldc 
users can use it.

-- 
Russel.
===========================================================================
Dr Russel Winder      t: +44 20 7585 2200   voip: sip:russel.winder ekiga.net
41 Buckmaster Road    m: +44 7770 465 077   xmpp: russel winder.org.uk
London SW11 1EN, UK   w: www.russel.org.uk  skype: russel_winder
Mar 16 2017
parent reply "Nick Sabalausky (Abscissa)" <SeeWebsiteToContactMe semitwist.com> writes:
On 03/16/2017 05:48 AM, Russel Winder via Digitalmars-d wrote:
 Except that rdmd needs separating out as a distinct thing so that ldc
 users can use it.
I'm pretty sure it does work fine with ldc. Just do:

rdmd --compiler=ldmd [...otherflags...]
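Concretely, that might look like the following (a sketch; recent LDC packages typically install the wrapper as ldmd2 rather than ldmd, and hello.d stands in for any D source file):

    rdmd --compiler=ldmd2 hello.d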
Mar 16 2017
parent Russel Winder via Digitalmars-d <digitalmars-d puremagic.com> writes:
On Thu, 2017-03-16 at 19:54 -0400, Nick Sabalausky (Abscissa) via
Digitalmars-d wrote:
 On 03/16/2017 05:48 AM, Russel Winder via Digitalmars-d wrote:
 Except that rdmd needs separating out as a distinct thing so that 
 ldc users can use it.

 I'm pretty sure it does work fine with ldc. Just do:

 rdmd --compiler=ldmd [...otherflags...]

The problem is that it is only bundled with dmd as far as I can tell.

-- 
Russel.
===========================================================================
Dr Russel Winder      t: +44 20 7585 2200   voip: sip:russel.winder ekiga.net
41 Buckmaster Road    m: +44 7770 465 077   xmpp: russel winder.org.uk
London SW11 1EN, UK   w: www.russel.org.uk  skype: russel_winder
Mar 20 2017
prev sibling next sibling parent Jared Jeffries <jared.jeffries yandex.com> writes:
 There's a reason stackoverflow and 
 softwareengineering.stackexchange delete these kinds of 
 questions: they're counterproductive and can't actually be 
 answered.

 The question "Which is the best programming language to learn 
 in 2017" is one such question. It comes down strictly to 
 opinion and circumstance. Because of this, the "answers" are 
 either answering a different question or just ads for the 
 user's favorite language. It seems most of the top answers in 
 that thread took the question to mean "Which language would be 
 most likely to get me a job in 2017", which isn't the same.

 Programming questions on Quora are the dumping ground for bad 
 SO questions.

 Most D power users spend their time either on the IRC or on SO.
I agree with you, the question is vague, etc. But my point is: this is exactly the kind of question beginners like me use to get advice on which language they should learn.

I did some research on Google with the keywords "best programming language to learn", which led me to what people say on Quora and similar websites. Nothing fancy. And if I hadn't asked a more specific question myself, I probably wouldn't have read anything about D, which is sad.

D is great for students, so my advice is to stop advertising D so much to power users, and to promote it more as the best language out there to learn object-oriented programming, before digging into Java, C++, Javascript etc. Because that's where D is strong: it regroups all their features (arrays/maps/slices/foreach like in Javascript, structs/pointers/templates like in C++), while keeping almost the same syntax.

I've personally ported my little experiment programs from D to these languages. I'm very impressed how D's syntax and native features make it actually a common denominator to all of them. That's really where it is stronger than any other programming language on earth: it's the simple, pragmatic and efficient synthesis of the current mainstream languages.
Mar 02 2017
prev sibling parent reply Jesse Phillips <Jesse.K.Phillips+D gmail.com> writes:
On Wednesday, 1 March 2017 at 14:34:59 UTC, Jack Stouffer wrote:
 There's a reason stackoverflow and 
 softwareengineering.stackexchange delete these kinds of 
 questions: they're counterproductive and can't actually be 
 answered.
Slant does a pretty good job of providing a platform to these opinionated questions. https://www.slant.co/topics/25/viewpoints/11/~best-programming-language-to-learn-first~d
Mar 02 2017
parent Rico Decho <rico.decho gmail.com> writes:
 Slant does a pretty good job of providing a platform to these 
 opinionated questions.

 https://www.slant.co/topics/25/viewpoints/11/~best-programming-language-to-learn-first~d
That's right.

Btw I've tested this simple "opinionated" search:

https://www.google.be/search?q=best+programming+language

At the bottom of the second page, there is indeed a slant result. Surprisingly it includes D :) And there are indeed two links to Quora webpages in the first 10 results.

https://www.slant.co/topics/25/~best-programming-language-to-learn-first

It lists many of the languages that people tried (Rust, Elixir, etc). But there is almost no other reference to the D language in the first 50 results of the "best programming language" search, which is sad.

Now we can try to improve the situation, or not. But indeed the Quora-like results are the only places where D can be added to the list of recommended programming languages, as anybody can post an answer on these forums. And this is already what I'm doing btw. I mean, talking about D on the webpages which are in the first results of obvious Google searches...
Mar 03 2017
prev sibling parent Laeeth Isharc <laeethnospam nospam.laeeth.com> writes:
On Tuesday, 28 February 2017 at 23:29:24 UTC, Jared Jeffries 
wrote:
 I've read the answer to questions like "Which is the best 
 programming language to learn in 2017?".

 Nobody was telling anything about D, which is really sad, 
 because in my opinion D could be one of the best answers to 
 this question.

 I've answered this question. Better late than never.

 I suggest that other "happy users" of this language do the 
 same...
I've written some stuff there. Something I wrote on Python, which was in a way an indirect suggestion to explore alternatives (and I mentioned D), got 250k views and was sent to a few million people (whether they read it, who knows!).

Of course it's not the number of people that matters, but a few key people. Weka use D after hearing about it from a tweet from someone they respected - and their use of D makes a big difference for others that follow (there are others too, but they heard about D from other routes). One or two people at least explored D after reading my post.

It's a good idea to get involved though, provided one doesn't become an irritating language evangelist lacking in balance or perspective. D is good for some things and people in some situations, but not for everyone. It's enough that some people who are receptive actually make the commitment to explore it - over time, that will make quite a difference. That's true, I think, both for the outside world, and inside an organisation.
Mar 15 2017