
digitalmars.D - Andrei's list of barriers to D adoption

reply Walter Bright <newshound2 digitalmars.com> writes:
Andrei posted this on another thread. I felt it deserved its own thread. It's 
very important.
-----------------------------------------------------------------------------
I go to conferences. Train and consult at large companies. Dozens every year, 
cumulatively thousands of people. I talk about D and ask people what it would 
take for them to use the language. Invariably I hear a surprisingly small 
number of reasons:

* The garbage collector eliminates probably 60% of potential users right off.

* Tooling is immature and of poorer quality compared to the competition.

* Safety has holes and bugs.

* Hiring people who know D is a problem.

* Documentation and tutorials are weak.

* There's no web services framework (by this time many folks know of D, but of 
those a shockingly small fraction has even heard of vibe.d). I have strongly 
argued with Sönke to bundle vibe.d with dmd over one year ago, and also in 
this forum. There wasn't enough interest.

* (On Windows) if it doesn't have a compelling Visual Studio plugin, it doesn't 
exist.

* Let's wait for the "herd effect" (corporate support) to start.

* Not enough advantages over the competition to make up for the weaknesses 
above.
Jun 05 2016
next sibling parent reply Pie? <AmericanPie gmail.com> writes:
On Monday, 6 June 2016 at 02:20:52 UTC, Walter Bright wrote:
 Andrei posted this on another thread. I felt it deserved its 
 own thread. It's very important.
 -----------------------------------------------------------------------------
 I go to conferences. Train and consult at large companies. 
 Dozens every year, cumulatively thousands of people. I talk 
 about D and ask people what it would take for them to use the 
 language. Invariably I hear a surprisingly small number of 
 reasons:
And it's taken you this long to know that the following are the problems? (No offense... it just seems like the following list is obvious, whether you ask 1 person or 1 googol.)
 * The garbage collector eliminates probably 60% of potential 
 users right off.
Duh! The claim is made that D can work without the GC... but that's a red herring... If you take away the GC, what do you have? A ton of effort to build something that gets D to work properly without the GC. It can be done, but it isn't done except by leet gurus who have the time and knowledge to do it. It is not a built-in option that works out of the box.
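For the record, the one built-in, out-of-the-box piece is the `@nogc` attribute (added in DMD 2.066): the compiler verifies at compile time that a function performs no GC allocation. It doesn't write the manual-memory code for you, which is the real complaint here, but it does catch accidental allocations. A minimal sketch:

```d
// @nogc is D's compile-time check: the compiler rejects any GC
// allocation inside the function, so "working without the GC" is
// verified rather than hoped for.
@nogc int sum(const(int)[] a)
{
    int total;
    foreach (x; a)
        total += x;
    return total;
}

/* This version would not compile:
@nogc void wontCompile()
{
    auto arr = new int[8]; // error: 'new' allocates from the GC heap
}
*/
```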
 * Tooling is immature and of poorer quality compared to the 
 competition.
Duh! Ever use Visual Studio? Sure, Visual D offers a glimpse of hope, but only a glimpse. D has some cool stuff, but that's all it is for most users, in my guesstimation.
 * Safety has holes and bugs.

 * Hiring people who know D is a problem.

 * Documentation and tutorials are weak.

 * There's no web services framework (by this time many folks 
 know of D, but of those a shockingly small fraction has even 
 heard of vibe.d). I have strongly argued with Sönke to bundle 
 vibe.d with dmd over one year ago, and also in this forum. 
 There wasn't enough interest.

 * (On Windows) if it doesn't have a compelling Visual Studio 
 plugin, it doesn't exist.

 * Let's wait for the "herd effect" (corporate support) to start.

 * Not enough advantages over the competition to make up for the 
 weaknesses above.
Jun 05 2016
parent reply Adam D. Ruppe <destructionator gmail.com> writes:
On Monday, 6 June 2016 at 02:30:55 UTC, Pie? wrote:
 Duh! The claim is made that D can work without the GC... but 
 that's a red herring... If you take about the GC what do you 
 have?
Like 90% of the language, still generally nicer than most of the competition. Though, I wish D would just own its decision instead of bowing to Reddit pressure. GC is a proven success in the real world with a long and impressive track record. Yes, there are times when you need to optimize your code, but even then you aren't really worse off with it than without it.
Jun 05 2016
next sibling parent reply Jack Stouffer <jack jackstouffer.com> writes:
On Monday, 6 June 2016 at 04:17:40 UTC, Adam D. Ruppe wrote:
 Though, I wish D would just own its decision instead of bowing 
 to Reddit pressure. GC is a proven success in the real world 
 with a long and impressive track record. Yes, there are times 
 when you need to optimize your code, but even then you aren't 
 really worse off with it than without it.
While I understand that some people can't afford a GC, this has confused me as well. I never understood the large number of people on /r/programming complaining about the GC when the vast majority of software is written in one of the following languages: C#, Java, PHP, Python, JavaScript. Those have to cover at least 80% of all software projects in the US, and not only do they have a GC, they force you to use a GC. This just shows me that /r/programming is not a representative sample of programmers at all. The anti-D-GC thing has become a meme at this point. I have literally seen only one person on /r/programming complain about Go's GC, despite Go being a slower language overall.
Jun 05 2016
next sibling parent reply Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
On 6/6/16 6:38 AM, Jack Stouffer wrote:
 On Monday, 6 June 2016 at 04:17:40 UTC, Adam D. Ruppe wrote:
 Though, I wish D would just own its decision instead of bowing to
 Reddit pressure. GC is a proven success in the real world with a long
 and impressive track record. Yes, there are times when you need to
 optimize your code, but even then you aren't really worse off with it
 than without it.
While I understand that some people can't afford a GC, this has confused me as well. I never understood the large amount of people on /r/programming complaining about the GC when the vast majority of software is written in one of the following languages: C#, Java, PHP, Python, JavaScript.
These are only a part of our competition. -- Andrei
Jun 05 2016
parent reply Jack Stouffer <jack jackstouffer.com> writes:
On Monday, 6 June 2016 at 06:32:15 UTC, Andrei Alexandrescu wrote:
 These are only a part of our competition. -- Andrei
Sure, I was just remarking on the fact that the amount of complaining about GCs is disproportionate to the number of people not using GCs. As I said, I think it's more of a meme ("lol D sux cuz the GC") than actual C++ users complaining.
Jun 06 2016
parent Shachar Shemesh <shachar weka.io> writes:
On 06/06/16 10:02, Jack Stouffer wrote:
 On Monday, 6 June 2016 at 06:32:15 UTC, Andrei Alexandrescu wrote:
 These are only a part of our competition. -- Andrei
Sure, I was just remarking on the fact that the amount of complaining about GCs is disproportionate to the number of people not using GCs. As I said, I think it's more of a meme ("lol D sux cuz the GC") than actual C++ users complaining.
I'm not only a C++ user, I am also a D user. And I'm complaining. Shachar
Jun 06 2016
prev sibling next sibling parent poliklosio <poliklosio happypizza.com> writes:
On Monday, 6 June 2016 at 04:38:15 UTC, Jack Stouffer wrote:
 (...)
 While I understand that some people can't afford a GC, this has 
 confused me as well.

 I never understood the large amount of people on /r/programming 
 complaining about the GC when the vast majority of software is 
 written in one of the following languages: C#, Java, PHP, 
 Python, JavaScript. Those have to cover at least 80% of all 
 software projects in the US and not only do they have a GC, 
 they force you to use a GC. This just shows to me that 
 /r/programming is not a representative sample of programmers at 
 all.

 The anti D's GC thing has become meme at this point. I have 
 literally seen only one person on /r/programming complain about 
 Go's GC, despite Go being a slower language overall.
People constantly raise the argument that some large fraction (e.g. 80%) of software in all languages is written with GC just fine. This misses a few points:

- It is often not "just fine" even if they use it. Authors sometimes don't realize that GC would be a liability in their projects until it's too late. Then they fight it. Also, people may be forced to use GC because libraries they need use GC.

- Most people don't actively want GC; they just want productivity. Whether it's GC that provides that or something else, they don't care. If something else were providing the productivity, people wouldn't care that it's not GC. CPython uses reference counting as its GC strategy. Do you think most people care?

- The minority of applications which cannot use GC is not necessarily also a minority in economic value or in the number of running copies. Most applications are one-off internal business apps or scientific experiments. Also, for every 10 programs there are probably 8 bad ones. Hence, the number of applications is a pretty silly metric. Note that non-GC applications are often multi-million-dollar operating systems, AAA games, control software, AI software, and server software.
Jun 06 2016
prev sibling parent reply Shachar Shemesh <shachar weka.io> writes:
On 06/06/16 07:38, Jack Stouffer wrote:

 I never understood the large amount of people on /r/programming
 complaining about the GC when the vast majority of software is written
 in one of the following languages: C#, Java, PHP, Python, JavaScript.
With the *possible* exception of C#, none of those are systems programming languages. D presents itself as one. Shachar
Jun 06 2016
parent Jim Hewes <jimhewes gmail.com> writes:
On 6/6/2016 9:31 AM, Shachar Shemesh wrote:
 With the *possible* exception of C#, none of those are systems
 programming languages. D presents itself as one.

 Shachar
I think that is true. I understand that some disciplines might need to avoid a GC for whatever reason, like games or small embedded systems. But the thing that always gets me about GC is not performance. It's that garbage collection always lumps all resources together with memory. I understand that there have been countless long discussions about GC, but my eyes glaze over because the discussion is usually about performance.

Coming from C++, I like to use RAII. I like to depend on deterministic destruction of resources. I don't care about memory or when it's released (depending on the type of application), but I do care when other types of resources are released. If I'm wrong to be thinking this way, I'm happy to be convinced otherwise. I know the library has Unique, which is in std.typecons. But I thought I also read once that there are certain cases where it doesn't always work correctly. Maybe I'm mistaken; I'm not clear on that.

As a developer coming from C++, okay, I've bought into the idea that D is a better C++. The first thing I want to know is, "How do I accomplish all the things in D that I normally do in C++?" For the case of deterministic destruction it might take someone a while to figure that out. (Not scope guards; they don't handle lifetimes longer than functions.)
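For readers wondering what the deterministic-destruction answer looks like in D: struct destructors give C++-style RAII on the stack (classes, by contrast, are GC-finalized, which is exactly the non-determinism being complained about). A hedged sketch; the resource type is invented for illustration:

```d
import core.stdc.stdio : FILE, fopen, fclose;
import std.string : toStringz;

// A struct with a destructor behaves like a C++ RAII type: the
// destructor runs deterministically at scope exit, with no dependence
// on when (or whether) the GC runs.
struct ScopedFile
{
    private FILE* fp;

    this(string path) { fp = fopen(path.toStringz, "w"); }

    ~this()
    {
        if (fp !is null)
            fclose(fp);   // deterministic release of the OS resource
    }

    @disable this(this);  // forbid copies: single owner, like unique_ptr
}

void main()
{
    {
        auto f = ScopedFile("out.tmp");
        // ... use f ...
    } // fclose runs here, regardless of the GC
}
```

For class objects, `std.typecons.scoped` plays a similar role by giving a class instance scope-bound lifetime.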
Jun 07 2016
prev sibling next sibling parent reply finalpatch <fengli gmail.com> writes:
On Monday, 6 June 2016 at 04:17:40 UTC, Adam D. Ruppe wrote:
 On Monday, 6 June 2016 at 02:30:55 UTC, Pie? wrote:
 Duh! The claim is made that D can work without the GC... but 
 that's a red herring... If you take about the GC what do you 
 have?
Like 90% of the language, still generally nicer than most the competition. Though, I wish D would just own its decision instead of bowing to Reddit pressure. GC is a proven success in the real world with a long and impressive track record. Yes, there are times when you need to optimize your code, but even then you aren't really worse off with it than without it.
GC is okay if there is a way to ensure it does not suspend time-critical code. Right now this is not possible. Even if my time-critical code is completely @nogc, other threads can still trigger the GC and stop the world, including my time-critical thread.
Jun 05 2016
parent reply Daniel Kozak <kozzi11 gmail.com> writes:
On Monday, 6 June 2016 at 04:47:20 UTC, finalpatch wrote:
 On Monday, 6 June 2016 at 04:17:40 UTC, Adam D. Ruppe wrote:
 On Monday, 6 June 2016 at 02:30:55 UTC, Pie? wrote:
 [...]
Like 90% of the language, still generally nicer than most the competition. Though, I wish D would just own its decision instead of bowing to Reddit pressure. GC is a proven success in the real world with a long and impressive track record. Yes, there are times when you need to optimize your code, but even then you aren't really worse off with it than without it.
GC is okay if there is a way to ensure it does not suspend time critical code. Right now this is not possible. Even if my time critical code is completely nogc, other threads still can trigger the GC and stop the world including my time critical thread.
You can still unregister your critical thread from GC.
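A rough sketch of what that looks like, using `thread_detachThis` from `core.thread` (the loop and its helpers are invented names; the real caveat is that a detached thread must not allocate from the GC heap or hold the only reference to GC-managed data, since the collector will no longer scan its stack):

```d
import core.thread : thread_detachThis;

__gshared bool running = true;          // hypothetical stop flag

void timeCriticalLoop()
{
    // Remove this thread from the runtime's thread registry, so the
    // GC's stop-the-world pause no longer suspends it.
    thread_detachThis();

    // From here on: no GC allocation, and no sole ownership of
    // GC-managed memory, or the collector may reclaim it underneath us.
    while (running)
    {
        processAudioBlock();            // hypothetical @nogc work
    }
}

@nogc void processAudioBlock() { /* ... */ }
```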
Jun 05 2016
next sibling parent reply ketmar <ketmar ketmar.no-ip.org> writes:
On Monday, 6 June 2016 at 05:13:11 UTC, Daniel Kozak wrote:
 You can still unregister your critical thread from GC.
exactly. that's what i did in my sound engine, and it works like a charm: i can enjoy hiccup-less ogg replaying on the background while the main code enjoys GC.
Jun 05 2016
next sibling parent reply Russel Winder via Digitalmars-d <digitalmars-d puremagic.com> writes:
On Mon, 2016-06-06 at 05:28 +0000, ketmar via Digitalmars-d wrote:
 On Monday, 6 June 2016 at 05:13:11 UTC, Daniel Kozak wrote:
 You can still unregister your critical thread from GC.
 exactly. that's what i did in my sound engine, and it works like 
 a charm: i can enjoy hiccup-less ogg replaying on the background 
 while the main code enjoys GC.
This should be marketed as a major feature of D: the language with a GC for those situations where you want it, and manual memory management for those cases where you do not want a GC. What is there not to like here? Why are we still debating the GC at all? It is a done deal. OK, so maybe new GC algorithms could be tried. Go did this, why not D?

-- 
Russel.
Dr Russel Winder      t: +44 20 7585 2200     voip: sip:russel.winder ekiga.net
41 Buckmaster Road    m: +44 7770 465 077     xmpp: russel winder.org.uk
London SW11 1EN, UK   w: www.russel.org.uk    skype: russel_winder
Jun 06 2016
next sibling parent reply ketmar <ketmar ketmar.no-ip.org> writes:
On Monday, 6 June 2016 at 08:18:20 UTC, Russel Winder wrote:
 On Mon, 2016-06-06 at 05:28 +0000, ketmar via Digitalmars-d 
 wrote:
 On Monday, 6 June 2016 at 05:13:11 UTC, Daniel Kozak wrote:
 You can still unregister your critical thread from GC.
exactly. that's what i did in my sound engine, and it works like a charm: i can enjoy hiccup-less ogg replaying on the background while the main code enjoys GC.
This should be marketed as a major feature of D: the language with a GC for those situations where you want it, and manual memory management for thse cases where you do not want a GC.
it is even better: i do *zero* manual memory management in Follin! synthesizers are simple classes, and they are automatically anchored with __gshared variables (when user is creating a new replay channel). so i actually enjoying the best things from both worlds! ;-)
Jun 06 2016
parent Russel Winder via Digitalmars-d <digitalmars-d puremagic.com> writes:
On Mon, 2016-06-06 at 08:24 +0000, ketmar via Digitalmars-d wrote:
[…]
 it is even better: i do *zero* manual memory management in 
 Follin! synthesizers are simple classes, and they are 
 automatically anchored with __gshared variables (when user is 
 creating a new replay channel). so i actually enjoying the best 
 things from both worlds! ;-)
Let the D community drop the angst. We now have two exemplars of GC and no-GC; let's go with instances of software and not worry about abstract problems.
Jun 06 2016
prev sibling parent reply Guillaume Piolat <first.last gmail.com> writes:
On Monday, 6 June 2016 at 08:18:20 UTC, Russel Winder wrote:
 On Mon, 2016-06-06 at 05:28 +0000, ketmar via Digitalmars-d 
 wrote:
 On Monday, 6 June 2016 at 05:13:11 UTC, Daniel Kozak wrote:
 You can still unregister your critical thread from GC.
exactly. that's what i did in my sound engine, and it works like a charm: i can enjoy hiccup-less ogg replaying on the background while the main code enjoys GC.
This should be marketed as a major feature of D: the language with a GC for those situations where you want it, and manual memory management for those cases where you do not want a GC.
Having the GC for the UI is very pleasant, while @nogc time-critical code won't use it. I think the problem is that the message then becomes more complicated. GC is easily a victim of the "holier-than-thou" fallacy, because evidently less GC is supposed to translate into faster programs. Er... right?
Jun 06 2016
parent poliklosio <poliklosio happypizza.com> writes:
On Monday, 6 June 2016 at 10:54:41 UTC, Guillaume Piolat wrote:
 On Monday, 6 June 2016 at 08:18:20 UTC, Russel Winder wrote:
 (...)
 This should be marketed as a major feature of D: the language 
 with a GC for those situations where you want it, and manual 
 memory management for those cases where you do not want a GC.
Having the GC for the UI is very pleasant, while nogc time-critical code won't use it. It think the problem is that the message then become more complicated. GC is easily victim of the "holier-than-thou" fallacy, because evidently less GC is supposed to translate into faster programs. Er... right?
I'm just worried how usable this approach really is at scale. If you combine 20 D libraries, 17 of which use the GC, are you able to control the GC well enough for a low-latency app? The problem with GC is that it's a global (per-process) resource, so it poisons everything in the program with its unpredictable time consumption. I would be hesitant to market this without designing and testing a viable development methodology first. And then there is reference counting, which is another way to be productive that doesn't have this problem.
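The reference-counting route mentioned here already exists in the standard library as `std.typecons.RefCounted`. A small sketch of how its cost model differs from the GC's (the `Buffer` type is invented for illustration):

```d
import std.typecons : RefCounted;

struct Buffer
{
    int value;
}

void main()
{
    // RefCounted stores the payload outside the GC heap and frees it
    // deterministically when the last copy's destructor runs, so the
    // cost is spread over copies/destructions instead of appearing as
    // one unpredictable global pause.
    auto a = RefCounted!Buffer(42);
    assert(a.value == 42);
    {
        auto b = a;            // refcount goes to 2
        b.value = 7;
        assert(a.value == 7);  // a and b share the same payload
    }                          // b destroyed: refcount back to 1
}                              // a destroyed: payload freed immediately
```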
Jun 06 2016
prev sibling parent Pie? <AmericanPie gmail.com> writes:
On Monday, 6 June 2016 at 05:28:58 UTC, ketmar wrote:
 On Monday, 6 June 2016 at 05:13:11 UTC, Daniel Kozak wrote:
 You can still unregister your critical thread from GC.
exactly. that's what i did in my sound engine, and it works like a charm: i can enjoy hiccup-less ogg replaying on the background while the main code enjoys GC.
So, did your sound engine 'hiccup' before you did that? Do you think that is acceptable? Do C#, Go, or other languages have this problem out of the box? I could write a hiccup-less sound player in C# without messing with the GC at all, and it would be fully functional... D has a GC problem: they never arrive on time and sometimes make a mess. The real problem is no one seems to care except the people that end up with trash all over their lawn... but isn't that always the case?
Jun 06 2016
prev sibling parent finalpatch <fengli gmail.com> writes:
On Monday, 6 June 2016 at 05:13:11 UTC, Daniel Kozak wrote:
 You can still unregister your critical thread from GC.
Thanks, didn't know you could do that.
Jun 06 2016
prev sibling next sibling parent reply Mithun Hunsur <me philpax.me> writes:
On Monday, 6 June 2016 at 04:17:40 UTC, Adam D. Ruppe wrote:
 On Monday, 6 June 2016 at 02:30:55 UTC, Pie? wrote:
 Duh! The claim is made that D can work without the GC... but 
 that's a red herring... If you take about the GC what do you 
 have?
Like 90% of the language, still generally nicer than most the competition. Though, I wish D would just own its decision instead of bowing to Reddit pressure. GC is a proven success in the real world with a long and impressive track record. Yes, there are times when you need to optimize your code, but even then you aren't really worse off with it than without it.
The problem is that D is targeted as a multi-paradigm systems programming language, and while it's largely successful at that, the GC doesn't fit in with that domain by nature of its existence. There's no problem with _having_ a GC, it just shouldn't be the default case for what's meant to be a systems language, especially when language and standard library features become dependent upon it. But I digress: we've had this debate before, we're having it now, and we'll keep having it well into the future :-)
Jun 05 2016
next sibling parent Russel Winder via Digitalmars-d <digitalmars-d puremagic.com> writes:
On Mon, 2016-06-06 at 06:24 +0000, Mithun Hunsur via Digitalmars-d
wrote:
 […]
 The problem is that D is targeted as a multi-paradigm systems 
 programming language, and while it's largely successful at that, 
 the GC doesn't fit in with that domain by nature of its existence.
 
 There's no problem with _having_ a GC, it just shouldn't be the 
 default case for what's meant to be a systems language, 
 especially when language and standard library features become 
 dependent upon it.
No. As evidence I give you Go. The whole "it's a systems programming language so it cannot have a GC" is just as wrong in 2016 as it was in 2004. Having a GC for a time-critical real-time streaming application is probably a bad idea, so turn the GC off for that. D can do that. D having a GC is not the problem; the problem is the D community agonizing over a wrong issue instead of focusing on the real one: when to switch the GC off.
 But I digress: we've had this debate before, we're having it now, 
 and we'll keep having it well into the future :-)
If the D community does continue to debate this, then D really will die as a language. This is a dead issue. It has gone to the debating hall in the sky. It is an ex-issue.
Jun 06 2016
prev sibling next sibling parent David Soria Parra via Digitalmars-d <digitalmars-d puremagic.com> writes:
On Mon, Jun 06, 2016 at 09:23:05AM +0100, Russel Winder via Digitalmars-d wrote:
 On Mon, 2016-06-06 at 06:24 +0000, Mithun Hunsur via Digitalmars-d
 wrote:
 […]
 The problem is that D is targeted as a multi-paradigm systems 
 programming language, and while it's largely successful at that, 
 the GC doesn't fit in with that domain by nature of its existence.
 
 There's no problem with _having_ a GC, it just shouldn't be the 
 default case for what's meant to be a systems language, 
 especially when language and standard library features become 
 dependent upon it.
No. As evidence I give you Go. The whole "it's a systems programming language so it cannot have GC" is just so wrong in 2016 (as it was in 2004). Having a GC for a time critical real-time streaming application is probably a bad idea, so turn GC off for that. D can do that.
Go is the perfect example here. The traction that Go has and D has not doesn't come from GC or no GC; it comes from accessible documentation, easy installation, good libraries, and community support. New developers will give a language 10-60 minutes max; if it is compelling and you feel productive and decently fast, then you are set. Sure, there are outliers where you want to replace C++ or C, but those areas are much harder to get traction in due to dependencies in the existing architecture and a (correctly so) risk aversion.
Jun 06 2016
prev sibling parent Russel Winder via Digitalmars-d <digitalmars-d puremagic.com> writes:
On Mon, 2016-06-06 at 13:19 -0700, David Soria Parra via Digitalmars-d
wrote:
[…]
 Go is the perfect example here. The traction that Go has and D has
 not doesn't come from GC
 or not, it comes from accessible documentation, easy installation,
 good libraries and community support.
I think it is slightly more complicated than that, but these are the major issues. Go arrived with an easy way for early adopters to try it out (clone the repository and type make), which caused some of them to start writing libraries, which caused more people to try it out, which caused documentation to be written. Then came the debates about how crap the system was, which eventually led to the go command, etc. So it was about rapid evolution and accreting more and more people, creating an active and constructive community. But in the background was the fact that the core team were fully funded to work on the project full time. This cannot be underestimated in the rapid rise of Go. Go's evolution over the 7 years it has been public has had many rough rides, not dissimilar to D, but always there was the team of full-time people. This is what D is missing, and is unlikely to have in the foreseeable future. Lessons can be learned not only from Go, but also Groovy, which has seen all forms of activity in its 13 years. Swift will, I suspect, go the same way as Go, exactly because it is funded and has hype behind it. Swift also has the ready-made market of being the anointed replacement for Objective-C(++) for iOS application development.
 New developerse will give a language 10-60min max, if it is
 compelling and you feel productive and
 decently fast then you are set.
I am not sure this is entirely the case, but yes, there is a distinct element of soundbite first impressions. There is a clear relationship between language age and expectation of sophistication of the installation and development support. Go got away with zero support initially because it was so young, and it got the early adopters in to do the legwork of enabling the sophisticated setups now there. Without active, full-time development of Eclipse/DDT, IntelliJ IDEA, CLion, etc. support for D, D will always be seen as an old language with no sophisticated tooling. The organizations currently using D could easily chip in and make this full-time activity happen, which, whilst a cost, should bear fruit in better development for themselves.
 sure there are outlines where you watn to replace C++ or C, but those
 areas are much harder to get
 traction on due to dependencies in the existing architecture and a
 (correctly so) risk-aversity.
So don't focus on them for D traction; let them come in their own time.
Jun 07 2016
prev sibling next sibling parent reply Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
On 6/6/16 6:17 AM, Adam D. Ruppe wrote:
 Though, I wish D would just own its decision instead of bowing to Reddit
 pressure.
Writing GC's issues off as pressure from reddit would be an understatement. -- Andrei
Jun 05 2016
parent reply Steven Schveighoffer <schveiguy yahoo.com> writes:
On 6/6/16 2:31 AM, Andrei Alexandrescu wrote:
 On 6/6/16 6:17 AM, Adam D. Ruppe wrote:
 Though, I wish D would just own its decision instead of bowing to Reddit
 pressure.
Writing GC's issues off as pressure from reddit would be an understatement. -- Andrei
I agree. It's telling that nearly all real-world examples we've seen (sociomantic, remedy games, etc.) use D without GC or with specialized handling of GC. I've had personal experience with "fixing" performance dramatically by removing or minimizing GC allocations. It is important to both get the GC operating more efficiently, and provide easier ways to avoid the GC (or better tutorials on how to do so). -Steve
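One concrete form the "specialized handling of GC" often takes is disabling collections around a latency-sensitive region and collecting at a moment of your choosing. A hedged sketch (note that `GC.disable` only suppresses automatic collections; the runtime may still collect under memory pressure):

```d
import core.memory : GC;

void latencySensitiveWork()
{
    GC.disable();          // allocations still succeed; automatic
                           // collections are held off
    scope (exit)
    {
        GC.enable();
        GC.collect();      // pay for the pause here, at a time we choose
    }

    // ... hot path: a stray allocation no longer risks triggering a
    //     stop-the-world collection mid-request ...
}
```

The complementary tactic, of course, is minimizing GC allocations in the hot path in the first place, which is what the "fixing performance dramatically" above refers to.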
Jun 06 2016
parent reply Wyatt <wyatt.epp gmail.com> writes:
On Monday, 6 June 2016 at 14:27:52 UTC, Steven Schveighoffer 
wrote:
 I agree. It's telling that nearly all real-world examples we've 
 seen (sociomantic, remedy games, etc.) use D without GC or with 
 specialized handling of GC.
I doubt either of the two you named would change, but I wonder how different the tenor of conversation would be in general if D's GC wasn't a ponderous relic? -Wyatt
Jun 06 2016
parent reply Russel Winder via Digitalmars-d <digitalmars-d puremagic.com> writes:
On Mon, 2016-06-06 at 16:56 +0000, Wyatt via Digitalmars-d wrote:
 On Monday, 6 June 2016 at 14:27:52 UTC, Steven Schveighoffer 
 wrote:
 I agree. It's telling that nearly all real-world examples we've 
 seen (sociomantic, remedy games, etc.) use D without GC or with 
 specialized handling of GC.
 I doubt either of the two you named would change, but I wonder 
 how different the tenor of conversation would be in general if 
 D's GC wasn't a ponderous relic?
So instead of debating this endlessly (I think this is about the tenth time this has come up in the last two years), why doesn't a group of people who know about GC algorithms get together and write a new one? Java has had a large number of GCs over the years: new knowledge, new algorithms, and new implementations led to better performance. Go has had at least three GCs as new knowledge, new algorithms, and new implementations led to better performance. D has had lots of discussion on email lists, but no one has followed this up by actually doing something that resulted in a change.
Jun 07 2016
next sibling parent tsbockman <thomas.bockman gmail.com> writes:
On Tuesday, 7 June 2016 at 08:05:58 UTC, Russel Winder wrote:
 So instead of debating this endlessly, I think this is about 
 the tenth time this has come up in the last two years, why 
 doesn't a group of people who know about GC algorithms get 
 together and write a new one?
 ...
 D has had lots of discussion on email lists but no-one has 
 followed this up with actually doing something that resulted in 
 a change.
As someone else mentioned earlier in this thread, Jeremy DeHaan is working on a GSoC project to begin overhauling the GC right now: http://forum.dlang.org/post/jcfwcdvvfytdkjrpdeld forum.dlang.org
Jun 07 2016
prev sibling next sibling parent Dsby <dushibaiyu yahoo.com> writes:
On Tuesday, 7 June 2016 at 08:05:58 UTC, Russel Winder wrote:
 On Mon, 2016-06-06 at 16:56 +0000, Wyatt via Digitalmars-d 
 wrote:
 [...]
So instead of debating this endlessly, I think this is about the tenth time this has come up in the last two years, why doesn't a group of people who know about GC algorithms get together and write a new one? Java has had a large number of GCs over the years: new knowledge, new algorithms, new implementation lead to better performance. Go has had at least three GCs as new knowledge, new algorithms, new implementation lead to better performance. D has had lots of discussion on email lists but no-one has followed this up with actually doing something that resulted in a change.
Why not write a new GC to replace the simple, old one?
Jun 07 2016
prev sibling next sibling parent reply Steven Schveighoffer <schveiguy yahoo.com> writes:
On 6/7/16 4:05 AM, Russel Winder via Digitalmars-d wrote:
 On Mon, 2016-06-06 at 16:56 +0000, Wyatt via Digitalmars-d wrote:
 On Monday, 6 June 2016 at 14:27:52 UTC, Steven Schveighoffer
 wrote:
 I agree. It's telling that nearly all real-world examples we've
 seen (sociomantic, remedy games, etc.) use D without GC or with
 specialized handling of GC.
I doubt either of the two you named would change, but I wonder how different the tenor of conversation would be in general if D's GC wasn't a ponderous relic?
So instead of debating this endlessly, I think this is about the tenth time this has come up in the last two years, why doesn't a group of people who know about GC algorithms get together and write a new one?
People have. Reiner wrote a precise scanning GC. Leandro wrote a forking GC that is used in Sociomantic's software. Martin has made some huge improvements to the existing GC's performance.

I think there is some difficulty in taking a new GC and putting it into druntime. We druntime developers often take for granted how the GC is implemented, without considering the ability to swap it out.

Let's also not forget that the GC is a super-integral part of the runtime, and it deals with one of the most difficult-to-debug aspects: memory allocation. If you replace the GC, you had better have it perfect. From my experience replacing the D array runtime, it is a very, very difficult thing to debug and get right; the bugs are disastrous and difficult to trace and reproduce.

Not to say we shouldn't do it. I think we should at LEAST have it optional (e.g. with globally recognized runtime parameters: --usegc myfavoritegc). But I don't want to act like GC improvement is something nobody has ever looked at.

My opinion: the first optional one to include is the forking GC, since it has been in production for years at one company. We may even find some bugs in it to help out Sociomantic, since they have discovered and helped fix so many problems with D :)

If we have the ability to swap out GCs, then it becomes easier to define how the GC API must work.

-Steve
Jun 07 2016
parent reply ketmar <ketmar ketmar.no-ip.org> writes:
On Tuesday, 7 June 2016 at 13:15:38 UTC, Steven Schveighoffer 
wrote:
 My opinion, the first optional one to include is the forking 
 GC, since it has been in production for years for one company. 
 We may even find some bugs in it to help out Sociomantic, since 
 they have discovered and helped fix so many problems with D :)
Dicebot even ported it to D2 about a year ago, but AFAIR it has some unsolvable corner cases, like deadlocking if one of the threads is malloc'ing while forking, or something like that. Dicebot probably knows better.
Jun 07 2016
parent reply Steven Schveighoffer <schveiguy yahoo.com> writes:
On 6/7/16 9:29 AM, ketmar wrote:
 On Tuesday, 7 June 2016 at 13:15:38 UTC, Steven Schveighoffer wrote:
 My opinion, the first optional one to include is the forking GC, since
 it has been in production for years for one company. We may even find
 some bugs in it to help out Sociomantic, since they have discovered
 and helped fix so many problems with D :)
Dicebot even ported it to D2 about year ago. but afair it has some unsolvable corner cases, like deadlocking if one of the threads is malloc'ing while forking, or something like that. Dicebot probably knows better.
Right, but without a way to try it out, you will never get any help. I'm not sure of Dicebot's preferences, but I can imagine porting Sociomantic's GC to D2 may not be his preferred task. Doing something you (possibly) don't want to do, with no help, doesn't usually lead to a successful outcome.

I just read elsewhere that a GSoC student is working to make the GC swappable and to add Reiner's precise scanning GC. I consider this essential work; I hope we can get it rolling soon!

-Steve
Jun 07 2016
parent reply Jack Stouffer <jack jackstouffer.com> writes:
On Tuesday, 7 June 2016 at 13:39:19 UTC, Steven Schveighoffer 
wrote:
 I just read elsewhere that a GSoC student is working to achieve 
 the goal of making the GC swappable and adding Reiner's precise 
 scanning GC. I consider this to be essential work, I hope we 
 can get this rolling soon!
https://github.com/dlang/druntime/pull/1581
Jun 07 2016
parent Adam Wilson <flyboynw gmail.com> writes:
Jack Stouffer wrote:
 On Tuesday, 7 June 2016 at 13:39:19 UTC, Steven Schveighoffer wrote:
 I just read elsewhere that a GSoC student is working to achieve the
 goal of making the GC swappable and adding Reiner's precise scanning
 GC. I consider this to be essential work, I hope we can get this
 rolling soon!
https://github.com/dlang/druntime/pull/1581
Feedback is greatly appreciated! If you have an opinion on how to implement user-selectable garbage collection algorithms, please chime in. That said, we may not include your feedback: being a GSoC project means we will need to fast-track merges so as not to block the student. We'll listen, but we might push that really cool idea of yours off until after GSoC.

--
// Adam Wilson
// import quiet.dlang.dev;
Jun 07 2016
prev sibling parent reply Wyatt <wyatt.epp gmail.com> writes:
On Tuesday, 7 June 2016 at 08:05:58 UTC, Russel Winder wrote:
 So instead of debating this endlessly, I think this is about 
 the tenth time this has come up in the last two years, why 
 doesn't a group of people who know about GC algorithms get 
 together and write a new one?
In addition to the other answers, it's worth noting that almost every good modern GC algorithm I can think of requires barriers. Walter has repeatedly and emphatically declared that D will not have barriers, so we're kind of SoL on that front. Java and Go don't have that problem. -Wyatt
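For readers unfamiliar with the term, a write barrier is a small hook the compiler inserts on every pointer store so the collector can track mutations. A minimal illustrative sketch follows; all names here are hypothetical, and D's compilers emit no such hook, which is exactly the point above:

```d
// Illustrative-only sketch of what a compiler-inserted write barrier does.
// A real one would be emitted by the compiler at every pointer store.
__gshared void*[] rememberedSet; // locations the collector must re-examine

void writeBarrier(void** slot, void* newValue)
{
    rememberedSet ~= cast(void*) slot; // log the mutated location first
    *slot = newValue;                  // then perform the actual store
}
```

A generational or concurrent collector consults the remembered set so it can rescan only the mutated locations instead of the whole heap, which is what makes short pauses possible.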
Jun 07 2016
next sibling parent ketmar <ketmar ketmar.no-ip.org> writes:
On Tuesday, 7 June 2016 at 17:19:16 UTC, Wyatt wrote:
 On Tuesday, 7 June 2016 at 08:05:58 UTC, Russel Winder wrote:
 So instead of debating this endlessly, I think this is about 
 the tenth time this has come up in the last two years, why 
 doesn't a group of people who know about GC algorithms get 
 together and write a new one?
In addition to the other answers, it's worth noting that most every good modern GC algorithm I can think of requires barriers. Walter has repeatedly and emphatically declared that D will not have barriers, so we're kind of SoL on on that front. Java and Go don't have that problem. -Wyatt
hear, hear!
Jun 07 2016
prev sibling next sibling parent Steven Schveighoffer <schveiguy yahoo.com> writes:
On 6/7/16 1:19 PM, Wyatt wrote:
 On Tuesday, 7 June 2016 at 08:05:58 UTC, Russel Winder wrote:
 So instead of debating this endlessly, I think this is about the tenth
 time this has come up in the last two years, why doesn't a group of
 people who know about GC algorithms get together and write a new one?
In addition to the other answers, it's worth noting that most every good modern GC algorithm I can think of requires barriers. Walter has repeatedly and emphatically declared that D will not have barriers, so we're kind of SoL on on that front. Java and Go don't have that problem.
So you're saying one of the barriers to adoption is no barriers... -Steve
Jun 07 2016
prev sibling parent qznc <qznc web.de> writes:
On Tuesday, 7 June 2016 at 17:19:16 UTC, Wyatt wrote:
 On Tuesday, 7 June 2016 at 08:05:58 UTC, Russel Winder wrote:
 So instead of debating this endlessly, I think this is about 
 the tenth time this has come up in the last two years, why 
 doesn't a group of people who know about GC algorithms get 
 together and write a new one?
In addition to the other answers, it's worth noting that most every good modern GC algorithm I can think of requires barriers. Walter has repeatedly and emphatically declared that D will not have barriers, so we're kind of SoL on on that front. Java and Go don't have that problem.
Barriers are only necessary if you want a low-latency or concurrent GC, AFAIK. I'm not sure low latency is worthwhile for D; if latency is important, you might be better off (partially) disabling the GC. Concurrent is nice for servers, where you do not care about memory consumption. D does not even have a precise GC so far, and that fruit hangs much lower. Precision is a precondition for basically any advanced GC technique.
Jun 07 2016
prev sibling next sibling parent Guillaume Piolat <first.last gmail.com> writes:
On Monday, 6 June 2016 at 04:17:40 UTC, Adam D. Ruppe wrote:
 On Monday, 6 June 2016 at 02:30:55 UTC, Pie? wrote:
 Duh! The claim is made that D can work without the GC... but 
 that's a red herring... If you take about the GC what do you 
 have?
Like 90% of the language, still generally nicer than most of the competition. Though I wish D would just own its decision instead of bowing to Reddit pressure. GC is a proven success in the real world with a long and impressive track record. Yes, there are times when you need to optimize your code, but even then you aren't really worse off with it than without it.
+1. With @nogc and -profile=gc, most problems disappeared. You can even guarantee no allocation at all, which isn't the case when using the C++ stdlib. An overblown "problem" that only became a topic with the appearance of Rust.
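The @nogc guarantee mentioned above can be sketched in a few lines (a minimal example; the function name is made up for illustration):

```d
// @nogc makes GC allocation a compile-time error inside the function,
// so "no allocation at all" is statically guaranteed by the compiler.
@nogc nothrow
int sum(const(int)[] a)
{
    int s = 0;
    foreach (x; a)
        s += x;   // an accidental `a ~ [x]` or `new int` here would not compile
    return s;
}
```

Combined with -profile=gc, which reports every GC allocation site, this gives both a static guarantee for annotated code and a dynamic report for the rest.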
Jun 06 2016
prev sibling next sibling parent reply Laeeth Isharc <laeethnospam nospam.laeeth.com> writes:
On Monday, 6 June 2016 at 04:17:40 UTC, Adam D. Ruppe wrote:
 On Monday, 6 June 2016 at 02:30:55 UTC, Pie? wrote:
 Duh! The claim is made that D can work without the GC... but 
 that's a red herring... If you take about the GC what do you 
 have?
Like 90% of the language, still generally nicer than most the competition. Though, I wish D would just own its decision instead of bowing to Reddit pressure. GC is a proven success in the real world with a long and impressive track record. Yes, there are times when you need to optimize your code, but even then you aren't really worse off with it than without it.
Maybe we need official containers that use the allocator, better messaging with concrete examples that get the point across that D generates less garbage than some other languages whose history might be putting people off, and better examples, tutorials, and documentation on how to do things without the GC.
Jun 06 2016
parent jmh530 <john.michael.hall gmail.com> writes:
On Monday, 6 June 2016 at 08:09:29 UTC, Laeeth Isharc wrote:
 Maybe we need official containers that use the allocator,  
 better messaging with concrete examples that get the point 
 across that D generates less garbage than some other languages 
 whose history might be putting people off, and better examples, 
 tutorials,  and documentation on how to do things without the 
 GC.
+1 to "better examples, tutorials, and documentation on how to do things without the GC."
Jun 06 2016
prev sibling next sibling parent Shachar Shemesh <shachar weka.io> writes:
On 06/06/16 07:17, Adam D. Ruppe wrote:
 On Monday, 6 June 2016 at 02:30:55 UTC, Pie? wrote:
 Duh! The claim is made that D can work without the GC... but that's a
 red herring... If you take about the GC what do you have?
Like 90% of the language, still generally nicer than most the competition. Though, I wish D would just own its decision instead of bowing to Reddit pressure. GC is a proven success in the real world with a long and impressive track record. Yes, there are times when you need to optimize your code, but even then you aren't really worse off with it than without it.
Weka thought so too, at first. We said there are two kinds of components where it is okay to use the GC:

* long-living objects (where the GC won't matter, because they are not freed anyway), and
* very small objects.

The thought was that, with these two constraints, the GC won't matter. We can run it infrequently, and all will be fine. Turns out, we were very, very, very wrong.

The problem with the GC is that its run time is proportional not to the amount of memory it frees, but to the amount of memory it needs to scan. This number is vastly different from the first one. Without keeping it low, the GC freeze time runs into the high-milliseconds range.

So we changed course. We are now trying to minimize the total use of GC memory. This, with lots of other tricks, would hopefully let us get the GC run time very low, maybe even avoid running it at all.

However, saying that the GC makes you no worse off is simply false.

Shachar
Jun 06 2016
prev sibling next sibling parent Charles Hixson via Digitalmars-d <digitalmars-d puremagic.com> writes:
On 06/05/2016 09:17 PM, Adam D. Ruppe via Digitalmars-d wrote:
 On Monday, 6 June 2016 at 02:30:55 UTC, Pie? wrote:
 Duh! The claim is made that D can work without the GC... but that's a 
 red herring... If you take about the GC what do you have?
Like 90% of the language, still generally nicer than most the competition. Though, I wish D would just own its decision instead of bowing to Reddit pressure. GC is a proven success in the real world with a long and impressive track record. Yes, there are times when you need to optimize your code, but even then you aren't really worse off with it than without it.
Usually correct, but there are times when you want to suspend garbage collection. The problem is that this should always be a scoped decision, because it's easy to accidentally leave the GC turned off, and then it's MUCH worse than not having it.
Jun 07 2016
prev sibling parent "H. S. Teoh via Digitalmars-d" <digitalmars-d puremagic.com> writes:
On Tue, Jun 07, 2016 at 07:00:13PM -0700, Charles Hixson via Digitalmars-d
wrote:
 On 06/05/2016 09:17 PM, Adam D. Ruppe via Digitalmars-d wrote:
 On Monday, 6 June 2016 at 02:30:55 UTC, Pie? wrote:
 Duh! The claim is made that D can work without the GC... but
 that's a red herring... If you take about the GC what do you have?
Like 90% of the language, still generally nicer than most the competition. Though, I wish D would just own its decision instead of bowing to Reddit pressure. GC is a proven success in the real world with a long and impressive track record. Yes, there are times when you need to optimize your code, but even then you aren't really worse off with it than without it.
Usually correct, but there are times when you want to suspend the garbage collection. The problem is this should always be a scoped decision, because it's easy to accidentally leave it turned off, and then it's MUCH worse than not having it.
    auto myFunc(Args...)(Args args)
    {
        GC.disable();
        scope(exit) GC.enable();
        doStuff();
    }

On another note, I have found that strategic disabling of the GC and/or manually triggering GC.collect() at the right times can give your programs a big boost in performance, typically around 20% to 50%, depending on the specifics of your memory usage patterns. Arguably this should be automatic once D's GC is replaced with something better than the current implementation, but the point is that performance concerns over the GC aren't insurmountable, and the fix is often not even that complicated.

I think far too much energy has been spent arguing for a GC-less language rather than actually writing the code that would fix its associated performance issues, and my suspicion is that this is mostly caused by the typical C/C++ programmer mindset (of which I used to be a part) that is always obsessed with memory management, rather than by any factual basis.

T

--
It said to install Windows 2000 or better, so I installed Linux instead.
Jun 07 2016
prev sibling next sibling parent reply Andre Pany <andre s-e-a-p.de> writes:
On Monday, 6 June 2016 at 02:20:52 UTC, Walter Bright wrote:
 Andrei posted this on another thread. I felt it deserved its 
 own thread. It's very important.
 -----------------------------------------------------------------------------
 I go to conferences. Train and consult at large companies. 
 Dozens every year, cumulatively thousands of people. I talk 
 about D and ask people what it would take for them to use the 
 language. Invariably I hear a surprisingly small number of 
 reasons:

 * The garbage collector eliminates probably 60% of potential 
 users right off.

 * Tooling is immature and of poorer quality compared to the 
 competition.

 * Safety has holes and bugs.

 * Hiring people who know D is a problem.

 * Documentation and tutorials are weak.

 * There's no web services framework (by this time many folks 
 know of D, but of those a shockingly small fraction has even 
 heard of vibe.d). I have strongly argued with Sönke to bundle 
 vibe.d with dmd over one year ago, and also in this forum. 
 There wasn't enough interest.

 * (On Windows) if it doesn't have a compelling Visual Studio 
 plugin, it doesn't exist.

 * Let's wait for the "herd effect" (corporate support) to start.

 * Not enough advantages over the competition to make up for the 
 weaknesses above.
Hi,

to be usable for companies which want to create economic software, in my opinion D lacks std.decimal. Maybe some companies will develop their own decimal libraries, but the others won't. There is some great work, but currently it seems to be blocked by std.bigint: https://github.com/andersonpd/eris/issues/6

For the tooling I can only speak for the Windows environment. To have a sophisticated IDE, the DLL topic needs some love. DLLs are a major topic on Windows, and without sophisticated DLL support it is hardly possible to build an integrated IDE similar to well-known ones like Visual Studio, Delphi, etc. One major issue I faced: I create a class in a DLL and cannot cast it to another type in my main application, due to the missing type info. The workaround would be massive code duplication, which makes the code more complex. Here too some great work has already been done, but it is blocked by the "export semantic" topic.

Kind regards
André
Jun 05 2016
next sibling parent reply Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
On 6/6/16 6:17 AM, Andre Pany wrote:
 to be usable for companies which want to create economic software,
 in my opinion D lacks std.decimal.
Do C, C++, Java, Go, or Rust have a standard decimal type? -- Andrei
Jun 05 2016
next sibling parent reply rikki cattermole <rikki cattermole.co.nz> writes:
On 06/06/2016 6:29 PM, Andrei Alexandrescu wrote:
 On 6/6/16 6:17 AM, Andre Pany wrote:
 to be usable for companies which want to create economic software,
 in my opinion D lacks std.decimal.
Do C, C++, Java, Go, or Rust have a standard decimal type? -- Andrei
C/C++: not really Java: https://docs.oracle.com/javase/tutorial/i18n/format/numberintro.html Go: Two proposals as of last year https://forum.golangbridge.org/t/what-is-the-proper-golang-equivalent-to-decimal-when-dealing-with-money/413/8 Rust: nope PHP: Has stuff, but we REALLY don't want to go around copying them So where is it used? PHP, Java and C# the most. What are the languages used in making e-commerce stuff? Them.
Jun 05 2016
next sibling parent Walter Bright <newshound2 digitalmars.com> writes:
On 6/5/2016 11:38 PM, rikki cattermole wrote:
 On 06/06/2016 6:29 PM, Andrei Alexandrescu wrote:
 On 6/6/16 6:17 AM, Andre Pany wrote:
 to be usable for companies which want to create economic software,
 in my opinion D lacks std.decimal.
 Java: https://docs.oracle.com/javase/tutorial/i18n/format/numberintro.html
That's about formatting numbers, not about a decimal type. http://docs.oracle.com/javase/8/docs/api/java/math/BigDecimal.html is more like it.
Jun 05 2016
prev sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 6/5/2016 11:38 PM, rikki cattermole wrote:
 On 06/06/2016 6:29 PM, Andrei Alexandrescu wrote:
 On 6/6/16 6:17 AM, Andre Pany wrote:
 to be usable for companies which want to create economic software,
 in my opinion D lacks std.decimal.
Do C, C++, Java, Go, or Rust have a standard decimal type? -- Andrei
Apparently there's a spec: http://speleotrove.com/decimal/decarith.html
Jun 06 2016
next sibling parent reply Observer <here inter.net> writes:
On Monday, 6 June 2016 at 07:01:33 UTC, Walter Bright wrote:
 On 6/5/2016 11:38 PM, rikki cattermole wrote:
 On 06/06/2016 6:29 PM, Andrei Alexandrescu wrote:
 On 6/6/16 6:17 AM, Andre Pany wrote:
 to be usable for companies which want to create economic 
 software,
 in my opinion D lacks std.decimal.
Do C, C++, Java, Go, or Rust have a standard decimal type? -- Andrei
Apparently there's a spec: http://speleotrove.com/decimal/decarith.html
There's been a lot of work on decimal floating-point types for C and C++, even if they haven't yet made it into the languages. See, for instance: http://www.open-std.org/JTC1/SC22/WG14/www/docs/n1312.pdf http://open-std.org/jtc1/sc22/wg21/docs/papers/2014/n3871.html and http://www.quadibloc.com/comp/cp020302.htm the last of which is not a standards-related document but provides some interesting detail. The point being, while decimal floats are something to be wished for, it's a complex matter, not something that can be quickly thrown together. That said, GNU C++ does provide some support: https://gcc.gnu.org/onlinedocs/gcc/Decimal-Float.html
Jun 06 2016
next sibling parent Observer <here inter.net> writes:
On Monday, 6 June 2016 at 08:04:07 UTC, Observer wrote:
 That said, GNU C++ does provide some support:
 https://gcc.gnu.org/onlinedocs/gcc/Decimal-Float.html
Apparently, I meant GNU C.
Jun 06 2016
prev sibling parent Walter Bright <newshound2 digitalmars.com> writes:
On 6/6/2016 1:04 AM, Observer wrote:
 On Monday, 6 June 2016 at 07:01:33 UTC, Walter Bright wrote:
 On 6/5/2016 11:38 PM, rikki cattermole wrote:
 On 06/06/2016 6:29 PM, Andrei Alexandrescu wrote:
 On 6/6/16 6:17 AM, Andre Pany wrote:
 to be usable for companies which want to create economic software,
 in my opinion D lacks std.decimal.
Do C, C++, Java, Go, or Rust have a standard decimal type? -- Andrei
Apparently there's a spec: http://speleotrove.com/decimal/decarith.html
There's been a lot of work on decimal floating-point types for C and C++, even if they haven't yet made it into the languages. See, for instance: http://www.open-std.org/JTC1/SC22/WG14/www/docs/n1312.pdf http://open-std.org/jtc1/sc22/wg21/docs/papers/2014/n3871.html
Interestingly, that contains links to open source C implementations, meaning we can just provide a D wrapper. https://software.intel.com/en-us/articles/intel-decimal-floating-point-math-library/ http://speleotrove.com/decimal/decnumber.html
 and
 http://www.quadibloc.com/comp/cp020302.htm
 the last of which is not a standards-related document but
 provides some interesting detail.

 The point being, while decimal floats are something to be
 wished for, it's a complex matter, not something that can be
 quickly thrown together.

 That said, GNU C++ does provide some support:
 https://gcc.gnu.org/onlinedocs/gcc/Decimal-Float.html
Jun 06 2016
prev sibling parent Russel Winder via Digitalmars-d <digitalmars-d puremagic.com> writes:
On Mon, 2016-06-06 at 00:01 -0700, Walter Bright via Digitalmars-d
wrote:
 On 6/5/2016 11:38 PM, rikki cattermole wrote:
 On 06/06/2016 6:29 PM, Andrei Alexandrescu wrote:
 On 6/6/16 6:17 AM, Andre Pany wrote:
 to be usable for companies which want to create economic
 software,
 in my opinion D lacks std.decimal.
Do C, C++, Java, Go, or Rust have a standard decimal type? -- Andrei
Apparently there's a spec: http://speleotrove.com/decimal/decarith.html
IBM have had hardware decimal numbers for about 50 years.

--
Russel.
Jun 06 2016
prev sibling next sibling parent reply Andre Pany <andre s-e-a-p.de> writes:
On Monday, 6 June 2016 at 06:29:27 UTC, Andrei Alexandrescu wrote:
 On 6/6/16 6:17 AM, Andre Pany wrote:
 to be usable for companies which want to create economic 
 software,
 in my opinion D lacks std.decimal.
Do C, C++, Java, Go, or Rust have a standard decimal type? -- Andrei
With java 7 there is the big decimal library in java: https://docs.oracle.com/javase/7/docs/api/java/math/BigDecimal.html Kind regards André
Jun 06 2016
parent Russel Winder via Digitalmars-d <digitalmars-d puremagic.com> writes:
On Mon, 2016-06-06 at 07:00 +0000, Andre Pany via Digitalmars-d wrote:
 On Monday, 6 June 2016 at 06:29:27 UTC, Andrei Alexandrescu wrote:
 On 6/6/16 6:17 AM, Andre Pany wrote:
to be usable for companies which want to create economic
 software,
 in my opinion D lacks std.decimal.
Do C, C++, Java, Go, or Rust have a standard decimal type? -- Andrei
With java 7 there is the big decimal library in java: https://docs.oracle.com/javase/7/docs/api/java/math/BigDecimal.html

Kind regards
André
It is such a real pain to use, though, at least from Java. A number of finance organizations specifically went with Groovy to avoid the hassle: Groovy uses BigDecimal as the default floating-point type and provides sensible expression syntax (unlike Java). These companies were overjoyed when Groovy got a static compile mode. :-) Kotlin can also do the right thing with BigDecimal, and does.

So yes, the Java Platform has a usable BigDecimal, but Java does not. Fortunately there are other languages on the Java Platform. BigDecimal is a "Huge Win"™

--
Russel.
Jun 06 2016
prev sibling parent Russel Winder via Digitalmars-d <digitalmars-d puremagic.com> writes:
On Mon, 2016-06-06 at 08:29 +0200, Andrei Alexandrescu via Digitalmars-
d wrote:
 On 6/6/16 6:17 AM, Andre Pany wrote:
 to be usable for companies which want to create economic software,
 in my opinion D lacks std.decimal.
Do C, C++, Java, Go, or Rust have a standard decimal type? -- Andrei
No, but then why is this a factor? Java has a BigDecimal, but it isn't quite what you think it might be. Many (financial) organizations create their own. They tend to treat them as an asset rather than infrastructure; I know of two C++ ones in this category. If having a decimal feature in the standard library, or at least easily available as a third-party library, would aid D's traction, then maybe it should go on the "let's get this done now" agenda?

--
Russel.
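Until a std.decimal exists, the usual workaround in this space is scaled integer arithmetic. A minimal sketch of the idea (the `Money` type and `dollars` helper are hypothetical names, not anything in Phobos):

```d
// Hypothetical fixed-point money type: store an integer count of cents
// so that 0.10 + 0.20 has no binary floating-point rounding error.
struct Money
{
    long cents;

    Money opBinary(string op)(Money rhs) const
        if (op == "+" || op == "-")
    {
        // Addition and subtraction of scaled integers are exact.
        return Money(mixin("cents " ~ op ~ " rhs.cents"));
    }
}

// Convenience constructor: dollars(1, 50) is $1.50.
Money dollars(long d, long c = 0) { return Money(d * 100 + c); }
```

This covers only exact addition and subtraction; a real decimal type (as in the General Decimal Arithmetic spec linked earlier in the thread) also needs rounding modes, scaling on multiply/divide, and configurable precision, which is why people keep asking for a library rather than rolling their own.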
Jun 06 2016
prev sibling parent reply interessted <interessted noreply.com> writes:
On Monday, 6 June 2016 at 04:17:18 UTC, Andre Pany wrote:
 Hi,

 to be usable for companies which want to create economic 
 software,
 in my opinion D lacks std.decimal. Maybe some companies will 
 develop their
 own decimal libraries but for the others they won't.
 There is some great work, but currently it seems to be blocked 
 by std.bigint
 https://github.com/andersonpd/eris/issues/6

 For the tooling I can only speak for windows environment. To 
 have a sophisticated
 IDE, the DLL topic needs some love. DLL are a major topic on 
 windows and without
 sophisticated DLL support, it is hardly possible to build an 
 integrated IDE similar
 to the well known like Visual Studio/Delphi/...
 One major issue I faced, I create a class in a DLL and cannot 
 cast the class
 in my main application to another type due to the missing type 
 info.
 The workaround would be to have massive code duplication, which 
 makes the code more complex.
 Also here some great work are already done but blocked by the 
 "export semantic" topic.

 Kind regards
 André
How true. We discarded D for development after discovering these problems in a two-hour discussion in our company. Nobody had a problem with the GC or the other previously mentioned points, except for: documentation and tutorials are weak.

It was also the general impression that Windows is an orphan, and that there should be no risk-taking with D because of that.

Instead of talking about decimal etc., you should fix the minimum issues of the language to make it useful for development (in our case, on Windows), even if this train has left the station.
Jun 06 2016
parent reply Mike Parker <aldacron gmail.com> writes:
On Monday, 6 June 2016 at 07:39:40 UTC, interessted wrote:

 It was also the general impression, that windows is an orphan 
 and there should be no risk taking with D because of that.
I've never understood this. DMD started out on Windows exclusively. Linux and other platforms came later. It has a Windows installer that will find the MS tools if you need to use them, the zip package works out of the box, the compiler ships with a minimal (and admittedly outdated) set of Win32 libraries that work with OPTLINK... Where does this impression come from that Windows is a second-class citizen?
Jun 06 2016
next sibling parent reply ixid <adamsibson hotmail.com> writes:
On Monday, 6 June 2016 at 07:44:22 UTC, Mike Parker wrote:
 Where does this impression come from that Windows is a 
 second-class citizen?
64-bit support seemed to take forever to reach Windows.
Jun 06 2016
parent reply Mike Parker <aldacron gmail.com> writes:
On Monday, 6 June 2016 at 08:17:44 UTC, ixid wrote:
 On Monday, 6 June 2016 at 07:44:22 UTC, Mike Parker wrote:
 Where does this impression come from that Windows is a 
 second-class citizen?
64-bit support seemed to take forever to reach Windows.
Well, that was only because the backend didn't support it initially. It didn't support Linux, FreeBSD or Mac either. My understanding is that implementing 64-bit support for those platforms was fairly easy once support for 32-bit was in place. 64-bit on Windows was a much tougher nut to crack so it took a long time, but Walter did get it done. So... what's the problem?
Jun 06 2016
parent ixid <adamsibson hotmail.com> writes:
On Monday, 6 June 2016 at 09:09:13 UTC, Mike Parker wrote:
 Where does this impression come from that Windows is a 
 second-class citizen?
 So... what's the problem?
I'm saying things like that is where the impression can come from. It's not a problem now.
Jun 06 2016
prev sibling parent reply tsbockman <thomas.bockman gmail.com> writes:
On Monday, 6 June 2016 at 07:44:22 UTC, Mike Parker wrote:
 I've never understood this. DMD started out on Windows 
 exclusively. Linux and other platforms came later. It has a 
 Windows installer that will find the MS tools if you need to 
 use them, the zip package works out of the box, the compiler 
 ships with a minimal (and admittedly outdated) set of Win32 
 libraries that work with OPTLINK... Where does this impression 
 come from that Windows is a second-class citizen?
The Windows build for DMD, etc. seems to require tools that Microsoft no longer distributes publicly. I asked about this, but no one replied... http://forum.dlang.org/post/lzmnllscqyyuqlusrwwe forum.dlang.org
Jun 06 2016
next sibling parent Mike Parker <aldacron gmail.com> writes:
On Monday, 6 June 2016 at 09:07:32 UTC, tsbockman wrote:

 The Windows build for DMD, etc. seems to require tools that 
 Microsoft no longer distributes publicly. I asked about this, 
 but no one replied...
     
 http://forum.dlang.org/post/lzmnllscqyyuqlusrwwe forum.dlang.org
It would seem that most of the people who want to build custom DMD binaries are Linux users, so the Windows build system doesn't get the attention it should. Perhaps if more people were trying to build on Windows, it would be a different story (because we'd surely have more PRs). What we need is someone who lives and breathes Windows development to maintain it so that it uses whatever MS toolchain is available on the system and is always up to date. Still, I don't see that this makes Windows a second class citizen. Again, DMD works out of the box on Windows.
Jun 06 2016
prev sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 6/6/2016 2:07 AM, tsbockman wrote:
 The Windows build for DMD, etc. seems to require tools that Microsoft no longer
 distributes publicly. I asked about this, but no one replied...
     http://forum.dlang.org/post/lzmnllscqyyuqlusrwwe forum.dlang.org
Building dmd with Microsoft C++ isn't an official build. Building it with DMC++ is, works fine, and does not depend on Microsoft tools.
Jun 06 2016
parent reply Russel Winder via Digitalmars-d <digitalmars-d puremagic.com> writes:
On Mon, 2016-06-06 at 02:46 -0700, Walter Bright via Digitalmars-d
wrote:
 On 6/6/2016 2:07 AM, tsbockman wrote:
 The Windows build for DMD, etc. seems to require tools that
 Microsoft no longer
 distributes publicly. I asked about this, but no one replied...
     http://forum.dlang.org/post/lzmnllscqyyuqlusrwwe forum.dlang.org
Building dmd with Microsoft C++ isn't an official build. Building it with DMC++ is, works fine, and does not depend on Microsoft tools.
But standard C++ should be compilable with any standards compliant C++ compiler. So if it is C++ and compiles with DMC++ then it should compile with MS C++.
--
Russel.
===========================================
Dr Russel Winder      t: +44 20 7585 2200   voip: sip:russel.winder ekiga.net
41 Buckmaster Road    m: +44 7770 465 077   xmpp: russel winder.org.uk
London SW11 1EN, UK   w: www.russel.org.uk  skype: russel_winder
Jun 06 2016
next sibling parent Walter Bright <newshound2 digitalmars.com> writes:
On 6/6/2016 3:40 AM, Russel Winder via Digitalmars-d wrote:
 On Mon, 2016-06-06 at 02:46 -0700, Walter Bright via Digitalmars-d
 wrote:
 On 6/6/2016 2:07 AM, tsbockman wrote:
 The Windows build for DMD, etc. seems to require tools that
 Microsoft no longer
 distributes publicly. I asked about this, but no one replied...
     http://forum.dlang.org/post/lzmnllscqyyuqlusrwwe forum.dlang.or
 g
Building dmd with Microsoft C++ isn't an official build. Building it with DMC++ is, works fine, and does not depend on Microsoft tools.
But standard C++ should be compilable with any standards compliant C++ compiler. So if it C++ and compiles with DMC++ then it should compile with MS C++.
That wasn't the complaint in the post. Besides, most any non-trivial program is going to break the Standard rules. For example, will your code work on a machine where bytes are 32 bits long? Neither will mine.
Jun 06 2016
prev sibling parent Daniel Murphy <yebbliesnospam gmail.com> writes:
On 6/06/2016 8:40 PM, Russel Winder via Digitalmars-d wrote:
 Building dmd with Microsoft C++ isn't an official build. Building it
 with DMC++
 is, works fine, and does not depend on Microsoft tools.
But standard C++ should be compilable with any standards compliant C++ compiler. So if it C++ and compiles with DMC++ then it should compile with MS C++.
DMD is not written in C++ any more...
Jun 07 2016
prev sibling next sibling parent reply Jack Stouffer <jack jackstouffer.com> writes:
On Monday, 6 June 2016 at 02:20:52 UTC, Walter Bright wrote:
 * Documentation and tutorials are weak.
I never understood this, as I've always found D's docs to be pretty average. Let's compare a typical Phobos page with a page from the Python docs: Python Section on String Methods: https://docs.python.org/2.7/library/stdtypes.html#string-methods std.string: http://dlang.org/phobos/std_string.html Which is more helpful? Yet people still use Python despite it.
 * There's no web services framework (by this time many folks 
 know of D, but of those a shockingly small fraction has even 
 heard of vibe.d). I have strongly argued with Sönke to bundle 
 vibe.d with dmd over one year ago, and also in this forum. 
 There wasn't enough interest.
"Web Services", like "cloud", is a fancy marketing term that hides the simplicity of the concept in order to sound cool in board rooms. It just means "a web server that doesn't just serve web pages". I'm going to sound cold here, but how much can we accommodate people who don't do their research? Literally do this http://lmgtfy.com/?q=d+web+server. If they can't be bothered to do the google search, then I suspect they're one of the people who Walter talks about: always finding an excuse not to use something. Vibe.d was one of the things that first drew me to D because I was interested in it's feature set after I took ten seconds and googled for "D web server". Maybe the best solution here would be a "recommended packages" page that's linked from the home page under the "Packages" section. But I foresee that as being seen like we're playing favorites by some in the D community.
Jun 05 2016
parent reply Johnjo Willoughby <who me.com> writes:
On Monday, 6 June 2016 at 04:24:14 UTC, Jack Stouffer wrote:
 On Monday, 6 June 2016 at 02:20:52 UTC, Walter Bright wrote:
 * Documentation and tutorials are weak.
I never understood this, as I've always found D's docs to be pretty average.
If you understand why they are "pretty average" you can imagine what would make them better, that gives you two reference points from which you can extrapolate back to a point at which they are "fairly crap". "fairly crap" on the graph is where new users come in because... 1. They are not yet fully invested in D, so do not have the inherent bias of a convert. 2. They do not have long familiarity with the docs. All that aside, it doesn't actually matter what you think or whether you understand why it is a common complaint. It is simply a fact that a lot of new users find the documentation to be "fairly crap". So you can either choose to... 1. Flap your hands and bury your head in the sand. 2. Say it doesn't make sense, these people must be morons. 3. Fix the documentation to make it more accessible to new users. 1 and 2 are the current solution from what I can see.
Jun 06 2016
parent Dave <david.dave dave.com> writes:
On Monday, 6 June 2016 at 18:21:49 UTC, Johnjo Willoughby wrote:
 On Monday, 6 June 2016 at 04:24:14 UTC, Jack Stouffer wrote:
 On Monday, 6 June 2016 at 02:20:52 UTC, Walter Bright wrote:
 * Documentation and tutorials are weak.
I never understood this, as I've always found D's docs to be pretty average.
If you understand why they are "pretty average" you can imagine what would make them better, that gives you two reference points from which you can extrapolate back to a point at which they are "fairly crap". "fairly crap" on the graph is where new users come in because... 1. They are not yet fully invested in D, so do not have the inherent bias of a convert. 2. They do not have long familiarity with the docs. All that aside, it doesn't actually matter what you think or whether you understand why it is a common complaint. It is simply a fact that a lot of new users find the documentation to be "fairly crap". So you can either choose to... 1. Flap your hands and bury your head in the sand. 2. Say it doesn't make sense, these people must be morons. 3. Fix the documentation to make it more accessible to new users. 1 and 2 are the current solution from what I can see.
The documentation can also be classified as broken, as broken links were quite the annoyance for me. It's a minor inconvenience, but one that has an additive effect every time I encountered one on their site. That being said, my major complaint is the lack of coherent, relevant, or useful examples of what a function/type/template is actually useful for. Most are just ripped out of unittests, provide no context, and often use other library calls that aren't really needed to solve an uncommon, unexplained problem.
Jun 06 2016
prev sibling next sibling parent Suliman <evermind live.ru> writes:
A lot of people need a GUI.

Second, DB drivers and ORMs.

I am one of those who prefer to work with languages that have a GC.

Yeah, docs should have more examples. Sometimes it's very hard to 
understand how to use a function without examples.
Jun 05 2016
prev sibling next sibling parent reply Ethan Watson <gooberman gmail.com> writes:
There's definitely an information war that needs to be won. That 
D has been around for 15-odd years and is still considered an 
emerging language is something of a problem.

I linked my DConf talks on a games industry forum, and the first 
response was that "It looks like a poor man's Rust". A notion I 
quickly dispelled, but it's a mindset that needs solid, linkable 
examples to work against. The talk I'm hoping to hold at GDC 
Europe in August will have some examples to that effect (they 
still haven't got back to me with confirmation). I'll need to 
make that more visible than slides/video once the talk is done.

Echoing the need for decimal support. I won't use it myself, but 
I know it's critical for finance.
Jun 05 2016
next sibling parent reply Laeeth Isharc <laeethnospam nospam.laeeth.com> writes:
On Monday, 6 June 2016 at 05:49:53 UTC, Ethan Watson wrote:
 There's definitely an information war that needs to be won. 
 That D has been around for 15-odd years and is still considered 
 an emerging language is something of a problem.
Hi Ethan. I enjoyed your talk, and thanks for this. But don't you think that as a language D has intrinsically matured quite slowly? Sociomantic began in 2008 or 2009, whenever it was, but at the time, given where the language was, that must have been quite a courageous decision if one thought one might be using it to process large amounts of data. There is nothing wrong with maturing more slowly - indeed maybe more complex creatures take time for everything to come together. Things develop at their own pace. I.e. it's important to go to the root of the challenge - it's a different thing if the language has been ready for a decade and adoption is perceived to be slow than if it's been ready for some uses for maybe five years and people have that perception.
 I linked my DConf talks on a games industry forum, and the 
 first response was that "It looks like a poor man's Rust". A 
 notion I quickly dispelled, but it's a mindset that needs 
 solid, linkable examples to work against.
Agree about this. Also to have a few different channels by industry because I guess what's important for you is different for bio informatics and different again for me in finance. In addition there's a tribal and social proof aspect and people relate more easily to use cases closest to what they wish to do.
 Echoing the need for decimal support. I won't use it myself, 
 but I know it's critical for finance.
In banking maybe, and it would be nice to have, but large parts of finance (I am on the hedge fund side) don't need it so much.
Jun 06 2016
parent reply Ethan Watson <gooberman gmail.com> writes:
On Monday, 6 June 2016 at 08:00:30 UTC, Laeeth Isharc wrote:
 Hi Ethan.
Ahoy.
 But don't you think that as a language D has intrinsically 
 matured quite slowly?  Sociomantic began in 2008,or 
 2009,whenever it was,  but at the time given where the language 
 was that must have been quite a courageous decision if one 
 thought one might be using it to process large amounts of data.

 There is nothing wrong with maturing more slowly - indeed maybe 
 more complex creatures take time for everything to come 
 together. Things develop at their own pace.
Maturing slowly tends to be a counterpoint when I talk about it. And it's purely down to an information war thing. Compare to how C++ matures. And by matures, I mean the old dinosaur becomes more and more fossilized with age and is kept animated by cybernetic enhancements bolted on to the side in a haphazard manner. There's a lot of people I know that are fine with that because of entrenchment. D is still ahead of the pack in terms of features. Communicating that, and why you should buy into the better way, is a bit of a challenge. A colleague of mine complained that D uses another wacky operator (~) to join strings and that it's just another way of doing string work, a complaint which came about because he hadn't looked deeply enough into the language to realise it's just normal array concatenation. Yet despite being ahead of the pack, its slow adoption doesn't speak well for it. But there is precedent for slow adoption, at least in gaming. C++ was virtually unused until after the turn of the century, and now it's deeply entrenched. Moving to C++ was a pretty clear path forward for C programmers. Moving forward from C++? There's options (Rust, Swift, C#, D). And the other options have a far greater mindshare than D at the moment.
Jun 06 2016
next sibling parent reply Jonathan M Davis via Digitalmars-d <digitalmars-d puremagic.com> writes:
On Monday, June 06, 2016 09:12:19 Ethan Watson via Digitalmars-d wrote:
 Yet despite being ahead of the pack, its slow adoption doesn't
 speak well for it. But there is precedent for slow adoption,
It's my understanding that python had slow adoption. It's huge now, but it took them a long time to get there. Slow growth does not necessarily equate to forever being small. - Jonathan M Davis
Jun 06 2016
parent BigDog <big.dog gmail.com> writes:
On Monday, 6 June 2016 at 13:15:12 UTC, Jonathan M Davis wrote:
 On Monday, June 06, 2016 09:12:19 Ethan Watson via 
 Digitalmars-d wrote:
 Yet despite being ahead of the pack, its slow adoption doesn't 
 speak well for it. But there is precedent for slow adoption,
It's my understanding that python had slow adoption. It's huge now, but it took them a long time to get there. Slow growth does not necessarily equate to forever being small. - Jonathan M Davis
Ada also had slow adoption. So has Erlang. O.o
Jun 06 2016
prev sibling parent reply BigDog <big.dog gmail.com> writes:
On Monday, 6 June 2016 at 09:12:19 UTC, Ethan Watson wrote:
 On Monday, 6 June 2016 at 08:00:30 UTC, Laeeth Isharc wrote:
 D is still ahead of the pack in terms of features.
I always think of Jurassic Park when the D community talks of features. Specifically Jeff Goldblum's line: "Your scientists were so preoccupied with whether or not they could, they didn't stop to think if they should." D has pretty much most of the features present in every programming language. I'm not sure if that is a positive or a negative, personally.
Jun 06 2016
parent reply Seb <seb wilzba.ch> writes:
On Monday, 6 June 2016 at 13:53:13 UTC, BigDog wrote:
 On Monday, 6 June 2016 at 09:12:19 UTC, Ethan Watson wrote:
 On Monday, 6 June 2016 at 08:00:30 UTC, Laeeth Isharc wrote:
 D is still ahead of the pack in terms of features.
I always think of Jurassic Park, when the D community talks of features. Specifically Jeff Goldblum's line of "Your scientist were so concerned if they could, they didn't stop to think if they should". D has pretty much most of the features present in every programming language. I'm not sure if that is a positive or negative personally.
How about doing a collaborative poll and giving Andrei and Walter some feedback (backed with "some" number)? This time I found a platform that allows everyone to add new answers and select from the existing ones: http://www.rkursem.com/poll/view.php?id=7f7ebc16c280d0c3c Happy voting! Disclaimer: I am _not_ affiliated with this website by any means.
Jun 06 2016
parent reply Dave <david.dave dave.com> writes:
On Monday, 6 June 2016 at 14:00:29 UTC, Seb wrote:
 On Monday, 6 June 2016 at 13:53:13 UTC, BigDog wrote:
 On Monday, 6 June 2016 at 09:12:19 UTC, Ethan Watson wrote:
 On Monday, 6 June 2016 at 08:00:30 UTC, Laeeth Isharc wrote:
 D is still ahead of the pack in terms of features.
I always think of Jurassic Park, when the D community talks of features. Specifically Jeff Goldblum's line of "Your scientist were so concerned if they could, they didn't stop to think if they should". D has pretty much most of the features present in every programming language. I'm not sure if that is a positive or negative personally.
How about doing a collaborative poll and giving Andrei and Walter some feedback (backed with "some" number)? This time I found a platform that allows everyone to add new answers and select from the existing ones: http://www.rkursem.com/poll/view.php?id=7f7ebc16c280d0c3c Happy voting! Disclaimer: I am _not_ affiliated with this website by any means.
One thing I mentioned in another post is that the documentation is lacking... particularly, parts don't work and the examples are rather chaotic. As I dig into the language further (for reasons I haven't yet determined) I also find that the standard library has big, gaping holes. Particularly in the graphics department, which was pretty much a deal killer for the project I was going to try using D with. But it also seems to be missing formal data structures that it should have: deques, queues, stacks, etc. Also, no formal official HTTP support is a bummer... I would dismiss this as "growing pains for a new language", but then you find out D is almost 20 years old O.o
Jun 06 2016
parent reply Chris <wendlec tcd.ie> writes:
On Monday, 6 June 2016 at 14:10:57 UTC, Dave wrote:

 One thing I mentioned in another post is the documentation is 
 lacking...particularly parts don't work and the examples are 
 rather chaotic.

 As I dig into the language further (for some reason I haven't 
 determined why yet) I also find that the standard library is 
 has big giant holes. Particularly in the graphics department, 
 which pretty much was a deal killer for my project I was gonna 
 try using D with. But also it seems to be missing formal data 
 structures that it should have. Deques, Queues, Stacks, etc. 
 Also no formal official http support is also a bummer...
Would this be of interest for you: http://dlang.org/phobos/std_container_dlist.html http://dlang.org/phobos/std_container_slist.html http://dlang.org/phobos/std_container_array.html
 I would dismiss this as "growing pains for a new language" then 
 you find out D is almost 20 years old O.o
...but maintained by how many people on a voluntary basis? You have to take the restricted resources into account as well. It's not like 20 years + Apple or Google behind it. Given how slowly big languages like Java have progressed over the years, one can only admire the wealth of (sometimes innovative) features D has, implemented by a small number of core developers.
Jun 07 2016
parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= writes:
On Tuesday, 7 June 2016 at 14:16:03 UTC, Chris wrote:
 It's not like 20 years + Apple or Google behind it. Given how 
 slowly big languages like Java have progressed over the years, 
 one can only admire the wealth of (sometimes innovative) 
 features D has, implemented by a small number of core 
 developers.
The problem with that reasoning is that the standard libraries of languages like C++, Java and Python are less likely to contain undocumented bugs. Which is more important than features. The sole purpose of a standard library is to have something very stable to build your own libraries upon. A large number of features in a standard library is not really a selling point for production work. Having a large number of independent narrow high quality maintained 3rd party libraries is a selling point. The role of a good standard library is to enable writing narrow independent libraries that can be combined. This is an area where many languages go wrong. Basically, if there is no significant demand for a feature from library authors then it probably should not be added to a standard library. Arcane bloat becomes baggage down the line and can even keep the language itself from evolving. (breaking your own standard library is much worse than breaking 3rd party frameworks)
Jun 07 2016
parent reply Chris <wendlec tcd.ie> writes:
On Tuesday, 7 June 2016 at 14:35:51 UTC, Ola Fosheim Grøstad 
wrote:
 On Tuesday, 7 June 2016 at 14:16:03 UTC, Chris wrote:
 It's not like 20 years + Apple or Google behind it. Given how 
 slowly big languages like Java have progressed over the years, 
 one can only admire the wealth of (sometimes innovative) 
 features D has, implemented by a small number of core 
 developers.
The problem with that reasoning is that the standard libraries of languages like C++, Java and Python are less likely to contain undocumented bugs. Which is more important than features. The sole purpose of a standard library is to have something very stable to build your own libraries upon. A large number of features in a standard library is not really a selling point for production work. Having a large number of independent narrow high quality maintained 3rd party libraries is a selling point. The role of a good standard library is to enable writing narrow independent libraries that can be combined. This is an area where many languages go wrong. Basically, if there is no significant demand for a feature from library authors then it probably should not be added to a standard library. Arcane bloat becomes baggage down the line and can even keep the language itself from evolving. (breaking your own standard library is much worse than breaking 3rd party frameworks)
Features are important. Templates, for example, make writing code in general and libraries in particular much easier. You sound as if a wealth of features and good libraries were mutually exclusive. They are not. The problems with Phobos are not due to D's features but to a specific implementation (slow algorithms) - like in any other language. When writing software, it's important to have a wealth of features to choose from so you can use the one that best fits the task at hand. You realize that when you have to use a language with less features, then it's repeat yourself, repeat yourself ... It's always funny when other languages introduce a feature that D has had for years - after having insisted that the feature was unnecessary. Java has lambdas now (since version 8, I think) and I read somewhere that it's not certain that Java programmers will adopt (i.e. use) them at all. D has the advantage that its users are not so complacent and actually demand and welcome new features and happily use them. They're not lulled into believing that this or that feature is useless. This is why D evolves somewhat faster which helps to attract users who look for something different, who are not happy with the status quo. At the same time this will forever keep the complacent and those with a herd mentality away and there is absolutely no way to convince them - and we shouldn't try to, because it's a waste of time.
Jun 07 2016
next sibling parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= writes:
On Tuesday, 7 June 2016 at 15:15:03 UTC, Chris wrote:
 Features are important. Templates, for example, make writing 
 code in general and libraries in particular much easier.
Both C++ and D have simple macro-like duck-typing templates. I don't find that approach particularly impressive. Yes, I use it a lot, but templated programming in this fashion also easily gets convoluted, and having multiple levels of abstraction can reduce transparency and make debugging more difficult.
 You sound as if a wealth of features and good libraries were 
 mutually exclusive. They are not. The problems with Phobos are 
 not due to D's features but to a specific implementation (slow 
 algorithms) - like in any other language.
Well, D has some language features that probably should be library features (like slices) and some library features that should have been language features (like memory management). However I was not complaining about "problems", but pointing out that system library APIs get out-of-date over time irrespective of the quality of the implementation. The more detached library features are from the core language the more likely they are to get out-dated.
 When writing software, it's important to have a wealth of 
 features to choose from so you can use the one that best fits 
 the task at hand.
Not really. C++/D would have been better languages if they just cut all the pointless special casing and boiled the core semantics down to templates and functors. C++ has at least boiled lambdas down to functors and is better for it.
 You realize that when you have to use a language with less 
 features, then it's repeat yourself, repeat yourself ...
Not true in my experience. Which languages have you used? Orthogonal minimalistic languages can be very expressive. Actually, it is often the opposite, because without that minimalism you have to special case a lot of stuff when similar entities cannot be treated the same. However in some minimalistic languages everything looks the same (like Lisp) which can make it harder to read source code. Sometimes you want a DSL for readability. However I don't think C++ or D score high on readability. It is not their core strength. I don't find C++ std or Phobos particularly readable. I can read it, but neither provide shining examples of legible code.
 Java has lambdas now (since version 8, I think) and I read 
 somewhere that it's not certain that Java programmers will 
 adopt (i.e. use) them at all.
So, Java was designed to be a simple language, just like the JVM is simple. Lambdas aren't anything special. Most languages have anonymous functions. Even the minimalistic OO ones.
 D has the advantage that its users are not so complacent and 
 actually demand and welcome new features and happily use them. 
 They're not lulled into believing that this or that feature is 
 useless.
Uhm... Do you really think this reflects reality? You mean the D language designers and fanbois don't try to cut people down when they point out where D needs improvement? ORLY???
 This is why D evolves somewhat faster which helps to attract 
 users who look for something different, who are not happy with 
 the status quo.
But it doesn't evolve much; it adds features without fixing what is already there. Which makes it harder and harder to make any significant progress on the language semantics. C++ actually does improve on existing features: lambdas were improved in C++14 and further improved in C++17 (constexpr). And C++'s take on lambdas is better than D lambdas. Despite C++ not being able to push breaking changes, which D can (D's major advantage over C++). If D improved on the existing feature set then it could stand a chance. Like, it could provide better integer semantics than C/C++, but chose not to. It could provide better floating point semantics than C/C++ (not hard to beat, as gcc/clang messes up rounding modes), but chose to stick with something even worse. The only advantage D has in the template department is the ability to select class members using meta-programming. Everything else you need for practical meta-programming can be done just as well in C++. If you compare current-day D to C++ it only has two advantages:

- static if

- a naive garbage collector

But D lambdas are worse, floats are worse, ints are worse, SIMD support is worse (gcc/clang) and documentation is worse. Please understand that I am not saying that C++ is really great, but D is deliberately not trying to be better, and for some reason chooses to just be "somewhat different". That cannot work out to D's advantage, as both languages have too many pitfalls to be suitable for newbies (in comparison to high level languages). As such, Go is better than C++ in some domains by being dependent on GC and stripping down features to the essentials. Go is very unexciting, but they stuck to improving on some core competitive features over time. Which paid off in that niche, as other languages cannot get those features within 3 years. D needs to improve too. Not by growing the feature set, but by improving on what is already there.
Jun 07 2016
next sibling parent reply jmh530 <john.michael.hall gmail.com> writes:
On Tuesday, 7 June 2016 at 16:12:42 UTC, Ola Fosheim Grøstad 
wrote:
 If you compare current day D to C++ it only have two advantage 
 points:

 - static if

 - a naive garbage collector

 But D lambdas are worse, floats are worse, ints are worse, simd 
 support is worse (gcc/clang) and documentation is worse.
How about no headers or macros? Dub? Chaining range operations with ufcs?
Jun 07 2016
parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= writes:
On Tuesday, 7 June 2016 at 17:04:19 UTC, jmh530 wrote:
 How about no headers or macros?
Textual include files are annoying and C++ needs more symbolic modules, but not having the ability to use namespaces is annoying too, and D's take on name resolution can break code. No need to use macros in C++11, but they come in handy in debugging and unit testing, actually. So, I am a bit torn on that. So those are just break-even issues, neither better nor worse. Plusses and minuses in both camps.
 Dub?
Not a language feature; I avoid using such features if I can. I really hate being forced to use one with node.js. I prefer downloading directly from GitHub and putting specific versions of libraries in my own projects. I don't use a package manager with Python either. Just download libraries, remove unneeded stuff and dump them into my project directory.
 Chaining range operations with ufcs?
I don't like how UFCS makes code less maintainable, and was happy to learn that C++17 most likely won't add it. I only use generators for testing, not for production. In general I end up using explicit easy to read inner-loops as they are easier for step-debugging and easier to optimize. The time it takes to write the code is less costly than the time it takes to understand what is going on in a debugger. I use generators in high level REPL languages like Python though.
Jun 07 2016
parent Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= writes:
On Tuesday, 7 June 2016 at 17:38:01 UTC, Ola Fosheim Grøstad 
wrote:
 On Tuesday, 7 June 2016 at 17:04:19 UTC, jmh530 wrote:
 How about no headers or macros?
Textual include files are annoying and C++ needs more symbolic modules, but not having the ability to use namespaces is annoying too and D's take on name resolution can break code.
Maybe I should explain the namespace part, as what I am referring to is a relatively new feature in C++ called "inline namespaces". I basically do this in different include files:

    namespace mylib {
        inline namespace common {
            // stuff I want to use without prefix or with ::mylib::*
        }
        // stuff I want to use with prefix ::mylib::*
    }

When I include various parts of mylib I simply write "using namespace ::mylib::common;". Then all symbols in ::mylib::common are accessible without prefixing them, but I can choose to use explicit prefixing ::mylib::... whenever I want to anyway. Handy for not polluting the namespace, while also getting easy access to things often used in conditionals like "is_empty(x)". Not sure if other people do this, but I like it.
Jun 07 2016
prev sibling parent reply Chris <wendlec tcd.ie> writes:
On Tuesday, 7 June 2016 at 16:12:42 UTC, Ola Fosheim Grøstad 
wrote:
 On Tuesday, 7 June 2016 at 15:15:03 UTC, Chris wrote:
 Features are important. Templates, for example, make writing 
 code in general and libraries in particular much easier.
Both C++ and D have simple macro-like duck-typing templates. I don't find that approach particularly impressive. Yes, I use it a lot, but templated programming in this fashion also easily get convoluted and having multiple levels of abstraction can reduce transparency and make debugging more difficult.
But we agree that templates are a good idea in general, regardless of the actual implementation.
 You sound as if a wealth of features and good libraries were 
 mutually exclusive. They are not. The problems with Phobos are 
 not due to D's features but to a specific implementation (slow 
 algorithms) - like in any other language.
Well, D has some language features that probably should be library features (like slices) and some library features that should have been language features (like memory management).
What do you mean by `memory management`? GC/RC built into the compiler?
 However I was not complaining about "problems", but pointing 
 out that system library APIs get out-of-date over time 
 irrespective of the quality of the implementation. The more 
 detached library features are from the core language the more 
 likely they are to get out-dated.
What do you mean? Is it a good or a bad thing that the library is detached from the core language?
 When writing software, it's important to have a wealth of 
 features to choose from so you can use the one that best fits 
 the task at hand.
Not really. C++/D would have been better languages if they had just cut all the pointless special casing and boiled the core semantics down to templates and functors. C++ has at least boiled lambdas down to functors and is better for it.
Believe me, features will be requested. Would you have an example of how such a language, or better still did you have time to design and test one? A proof of concept.
 You realize that when you have to use a language with less 
 features, then it's repeat yourself, repeat yourself ...
Not true in my experience. Which languages have you used? Orthogonal minimalistic languages can be very expressive. Actually, it is often the opposite, because without that minimalism you have to special case a lot of stuff when similar entities cannot be treated the same. However in some minimalistic languages everything looks the same (like Lisp) which can make it harder to read source code. Sometimes you want a DSL for readability. However I don't think C++ or D score high on readability. It is not their core strength. I don't find C++ std or Phobos particularly readable. I can read it, but neither provide shining examples of legible code.
Having to write the same for loop with slight variations all over again is not my definition of efficient programming. One of D's strengths is that it offers nice abstractions for data representation.
 Java has lambdas now (since version 8, I think) and I read 
 somewhere that it's not certain that Java programmers will 
 adopt (i.e. use) them at all.
So, Java was designed to be a simple language, just like the JVM is simple. Lambdas aren't anything special. Most languages have anonymous functions. Even the minimalistic OO ones.
Not special but handy. Before Java 8 (?) you had to use inner/anonymous classes to mimic lambdas. Not very efficient. Boiler plate, repetition, the whole lot.
 D has the advantage that its users are not so complacent and 
 actually demand and welcome new features and happily use them. 
 They're not lulled into believing that this or that feature is 
 useless.
Uhm... Do you really think this reflects reality? You mean the D language designers and fanbois don't try to cut people down when they point out where D needs improvement? ORLY???
I was not talking about that. Read it again. I said that the D community actively demands features or improvements and uses them. What you refer to has nothing to do with my point and it doesn't happen very often on this forum. Usually everything is discussed to death and beyond.
 This is why D evolves somewhat faster which helps to attract 
 users who look for something different, who are not happy with 
 the status quo.
But it doesn't evolve much, it adds features without fixing what is already there. Which makes it harder and harder to make any significant progress on the language semantics.
That's how it works. Add, test, improve. Having to wait for years for a feature is terrible, because you build up a code base with code that is not optimal and verbose (cf. Java). It goes without saying that existing features have to be improved. It's a question of manpower.
 C++ actually do improve on the existing features, lambdas was 
 improved in C++14 and further improved in C++17 (constexpr). 
 And C++'s take on lambdas is better than D lambdas. Despite C++ 
 not being able to push breaking changes, which D can (D's major 
 advantage over C++).

 If D improved on the existing feature set then it could stand a 
 chance. Like, it could provide better integer semantics than 
 C/C++, but chose not to. It could provide better floating point 
 semantics than C/C++ (not hard to beat, as gcc/clang messes up 
 rounding modes), but chose to stick with something even worse. 
 The only advantage D has in the template department is the 
 ability to select class-members using meta-programming. 
 Everything else you need for practical meta-programming can be 
 done just as well in C++.

 If you compare current day D to C++ it only have two advantage 
 points:

 - static if

 - a naive garbage collector

 But D lambdas are worse, floats are worse, ints are worse, simd 
 support is worse (gcc/clang) and documentation is worse.

 Please understand that I am not saying that C++ is really 
 great, but D is deliberately not trying to be better, but for 
 some reason choose to just be "somewhat different". That cannot 
 work out to D's advantage as both languages have too many 
 pitfalls to be suitable for newbies (in comparison to high 
 level languages).
There is a learning curve that cannot be made flatter. There are concepts that have to be grasped and understood. Any language (cf. Nim) that allows you to do sophisticated and low-level things is harder to learn than JS or Python.
 As such, Go is better than C++ in some domains by being 
 dependent on GC and stripping down features to the essentials. 
 Go is very unexciting, but they stuck to improving on some core 
 competitive features over time. Which paid off in that niche as 
 other languages cannot get those features within 3 years.

 D needs to improve too. Not by growing the feature set, but by 
 improving on what is already there.
Go forces you to repeat yourself. The less features you have, the more you have to write the same type of code all over again. Look at all the for loops in a C program.
Jun 07 2016
parent reply Ola Fosheim Grøstad writes:
On Tuesday, 7 June 2016 at 19:52:47 UTC, Chris wrote:
 But we agree that templates are a good idea in general, 
 regardless of the actual implementation.
Having access to parametric abstractions is a good idea. How to best use them is not so obvious... in real projects where things change. (Except for trivial stuff.)
 What do you mean by `memory management`? GC/RC built into the 
 compiler?
Everything related to managing ownership and access, making the most out of static analysis. Putting things on the stack, or simply not allocating if not used.
 What do you mean? Is it a good or a bad thing that the library 
 is detached from the core language?
I mean that the standard library features that are closely related to the core language semantics are more stable than things like HTTP.
 Believe me, features will be requested. Would you have an 
 example of how such a language, or better still did you have 
 time to design and test one? A proof of concept.
C++ with just member functions and wrappers around builtins is pretty close. Yes, I have used minimalistic languages, Beta for instance. I believe gBeta is closer. I guess dynamic languages like Self and even Javascript are there too. Probably also some of the dependently typed languages, but I have never used those. If you stripped down C++ by taking out everything that can be expressed using another feature then you would have the foundation.
 Having to write the same for loop with slight variations all 
 over again is not my definition of efficient programming. One 
 of D's strengths is that it offers nice abstractions for data 
 representation.
Hm? You can have templated functions in C++.
 Not special but handy. Before Java 8 (?) you had to use 
 inner/anonymous classes to mimic lambdas. Not very efficient. 
 Boiler plate, repetition, the whole lot.
Well, that is a matter of implementation. C++ lambdas are exactly that, function objects, but there is no inherent performance penalty. A "lambda" is mostly syntactical sugar.
 I was not talking about that. Read it again. I said that the D 
 community actively demands features or improvements and uses 
 them.
Are you talking about the language or the standard library? I honestly don't think the latter matters much. Except for memory management.
 It goes without saying that existing features have to be 
 improved. It's a question of manpower.
No, it is a matter of being willing to improve the semantics. Many of the improvements that are needed to best C++ are simple, but slightly breaking, changes. D could change floats so that interval arithmetic can be implemented, which is difficult to do in clang/gcc. That would be a major selling point. But the basic reasoning is that this is not needed, because C/C++ fails to comply with the IEEE standard as well. If the motivation is merely to trail C/C++, then there is no way to surpass C/C++, and no real motivation to switch.
 There is a learning curve that cannot be made flatter. There 
 are concepts that have to be grasped and understood. Any 
 language (cf. Nim) that allows you to do sophisticated and 
 low-level things is harder to learn than JS or Python.
Not sure what sophisticated things you are referring to. (JS and Python have complexity issues as well; you just don't need to learn them to make good use of the languages.)
 Go forces you to repeat yourself. The less features you have, 
 the more you have to write the same type of code all over 
 again. Look at all the for loops in a C program.
Loops and writing the same code over is not a major hurdle. Getting it right is the major hurdle. So having many loops is not bad, but having a way to express that you want to iterate from 1 to 10 in a less error-prone way matters. But you can do that in minimalistic languages like Beta, which has only two core entities:

- a pattern (type/function/subclass)
- an instance of a pattern

Just define a pattern that iterates and prefix your body with it; that is the same as subclassing it. Pseudo-code (not actual Beta syntax, but Cish syntax for simplicity):

iterate: {
    N: int;
    i: int;
    enter N
    do
        i = 0
        while (i < N) do
            inner;   // insert prefixed do-part here
            i++
}

10 -> iterate{ do i -> print; }

Or you could just do

repeat10: {
    N:< { n: int; do 10->n; inner; exit n };
    i: int;
    do N -> iterate{ do inner; }
}

repeat99: repeat10 {
    N:< { do 99->n; inner; }
}

repeat99{ do i -> print; "bottles of wine" -> print }

etc...
Jun 07 2016
parent Ola Fosheim Grøstad writes:
On Tuesday, 7 June 2016 at 20:55:12 UTC, Ola Fosheim Grøstad 
wrote:
 repeat10:{
    N:<{ n: int; do 10->n; inner; exit n};
    i: int;
    do N -> iterate{ do inner; }
 }

 repeat99:repeat10{
   N:<{ do 99->n; inner; }
 }

 repeat99{ do i -> print; "bottles of wine" ->print }
Adding some comments, as the example was not clear on its own:

// repeat10 is a new pattern (class) inheriting from object by default
repeat10: {
    // N is a virtual pattern (function)
    N:< { n: int; do 10->n; inner; exit n };
    // this is a subclass of the previously defined iterate pattern
    do N -> iterate{ do inner; }
}

// repeat99 is a new pattern inheriting from repeat10 above
repeat99: repeat10 {
    // N is a specialization of N in repeat10
    // N expands to { n: int; do 10->n; 99->n; inner; exit n }
    N:< { do 99->n; inner; }
}

// this is a subclass of repeat99
repeat99{ do i -> print; "bottles of wine" -> print }

Give or take, I haven't used Beta in 20 years. Abstraction is not the problem; a very simple language can provide what most programmers need. Perhaps not with a familiar syntax, though.
Jun 07 2016
prev sibling parent reply Russel Winder via Digitalmars-d <digitalmars-d puremagic.com> writes:
On Tue, 2016-06-07 at 15:15 +0000, Chris via Digitalmars-d wrote:

[…]
 Java has lambdas now (since version 8, I think) and I read 
 somewhere that it's not certain that Java programmers will adopt 
 (i.e. use) them at all. D has the advantage that its users are 
 […]
Whatever you read, the writer didn't really know what they were talking about. At least not in general, and if they were talking of the Javaverse as a whole. Java 8 features such as lambda expressions, Streams, method references, etc. are no longer even controversial. There is world-wide activity in transforming Java 6 and Java 7 code to Java 8. Yes, some of this is pull rather than push, and I am sure there are islands of intransigence (*). However the bulk of Java programmers will eventually get and use the features. Of course many people have stopped using Java and use Kotlin, Ceylon, or Scala (**). The crucial point here is that the Javaverse is much, much more than just the Java language. (*) Usually people who think Java 5 was a bad move and stick with Java 1.4.2. (**) There are others but these are the main players. -- Russel. ======================================================================== Dr Russel Winder t: +44 20 7585 2200 voip: sip:russel.winder ekiga.net 41 Buckmaster Road m: +44 7770 465 077 xmpp: russel winder.org.uk London SW11 1EN, UK w: www.russel.org.uk skype: russel_winder
Jun 10 2016
parent Chris <wendlec tcd.ie> writes:
On Friday, 10 June 2016 at 17:09:18 UTC, Russel Winder wrote:

 Whatever you read, the writer didn't really know what they were 
 talking about. At least not in general, and if they were 
 talking of the Javaverse as a whole. Java 8 features such as 
 lambda expressions, Streams, method references, etc. are no 
 longer even controversial. There is a world-wide activity in 
 transforming Java 6 and Java 7 code to Java 8. Yes some of this 
 is pull rather than push, and I am sure there are islands of 
 intransigence (*). However the bulk of Java programmers will 
 eventually get and use the features.

 Of course many people have stopped using Java and use Kotlin, 
 Ceylon, or Scala (**). The crucial point here is that the 
 Javaverse is much, much more than just the Java language.
This only proves my point. This happens in languages that are "feature resistant". For years you have to write awkward code[1], and once a feature finally gets accepted you have to revise your old code and jazz it up. And then of course you get conservative programmers who loathe change; they are a direct consequence of feature resistance. The more progressive ones turn to other languages like Clojure and Kotlin. All this proves that being feature resistant is not healthy for a language. [1] E.g. Java event listeners, Runnable etc.
 (*) Usually people who think Java 5 was a bad move and stick 
 with Java
 1.4.2.

 (**) There are others but these are the main players.
Jun 11 2016
prev sibling next sibling parent reply jmh530 <john.michael.hall gmail.com> writes:
On Monday, 6 June 2016 at 05:49:53 UTC, Ethan Watson wrote:
 Echoing the need for decimal support. I won't use it myself, 
 but I know it's critical for finance.
You can always round something to two digits if you need to.
Jun 06 2016
next sibling parent reply Observer <here inter.net> writes:
On Monday, 6 June 2016 at 13:10:32 UTC, jmh530 wrote:
 On Monday, 6 June 2016 at 05:49:53 UTC, Ethan Watson wrote:
 Echoing the need for decimal support. I won't use it myself, 
 but I know it's critical for finance.
You can always round something to two digits if you need to.
It's more complicated than that. Part of what you need is to be able to declare a variable as (say) having two significant fractional digits, and have the rounding rules be implicitly applied when saving to that variable, producing an exact representation of the rounded result in storage. And part of the reason for that is that if you compare "3.499" to "3.501" (as the originally computed numbers) when only 2 sig figs should be in play, they must compare equal. This is very much unlike what happens with binary floating point, where exact comparisons are rarely possible due to low-order bit differences. In commercial decimal-arithmetic applications (think "money"), the low-order bits are supposed to be discarded for almost all values that actually represent an amount of currency. For reliability of the application, this has to be built into the data type, not dependent on programmer vigilance. That is, just like certain other language features, a decimal type without implicit truncation would be thought of as "doesn't scale well".
Jun 06 2016
next sibling parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= writes:
On Monday, 6 June 2016 at 18:36:37 UTC, Observer wrote:
 It's more complicated than that.  Part of what you need is to
 be able to declare a variable as (say) having two significant
 fractional digits, and have the rounding rules be implicitly
 applied when saving to that variable, producing an exact
 representation of the rounded result in storage.
Yes, but if you want accurate representation of two fractional digits on storage only, then it makes most sense to do all calculations on cents (scale everything by 100) and store as integers.
Jun 06 2016
parent reply Observer <here inter.net> writes:
On Monday, 6 June 2016 at 18:55:22 UTC, Ola Fosheim Grøstad wrote:
 On Monday, 6 June 2016 at 18:36:37 UTC, Observer wrote:
 It's more complicated than that.  Part of what you need is to
 be able to declare a variable as (say) having two significant
 fractional digits, and have the rounding rules be implicitly
 applied when saving to that variable, producing an exact
 representation of the rounded result in storage.
Yes, but if you want accurate representation of two fractional digits on storage only, then it makes most sense to do all calculations on cents (scale everything by 100) and store as integers.
That's only a workaround. It's probably best to think of decimal arithmetic as an accountant would, not as a computer geek would. Intermediate arithmetic results should perhaps also be rounded/truncated, though it depends on the particular calculation. For instance, I might have an interest percentage that gets carried to 4 sig figs (fractional digits), but when applied to a currency amount with only 2 sig figs, the result should end up with 2 sig figs, not 6. And then that result should continue to be used in later arithmetic in the same larger expression, even without an explicit save operation to invoke the rounding/truncation. If you tried this using float for the percentage and integer for the amount, once the conversion to float happened it would not revert in the middle of the expression. Also, keeping values as only integers complicates input/output formatting. I want 123456 pennies to show up as either 1234.56 or perhaps 1,234.56 (at least, I want that punctuation in a U.S. locale), and this should happen without my needing to manually rescale during i/o operations.
Jun 06 2016
next sibling parent Ola Fosheim Grøstad writes:
On Monday, 6 June 2016 at 19:30:52 UTC, Observer wrote:
 Also, keeping values as only integers complicates input/output
 formatting.  I want 123456 pennies to show up as either 1234.56
 or perhaps 1,234.56 (at least, I want that punctuation in a U.S.
 locale), and this should happen without my needing to manually
 rescale during i/o operations.
You can solve all these things with static typing, unless you want to vary the number of significant decimals at runtime. (well, you can then as well, but need 2 values)
Jun 06 2016
prev sibling parent reply DLearner <bmqazwsx123 gmail.com> writes:
On Monday, 6 June 2016 at 19:30:52 UTC, Observer wrote:
 On Monday, 6 June 2016 at 18:55:22 UTC, Ola Fosheim Grøstad 
 wrote:
 On Monday, 6 June 2016 at 18:36:37 UTC, Observer wrote:
 It's more complicated than that.  Part of what you need is to
 be able to declare a variable as (say) having two significant
 fractional digits, and have the rounding rules be implicitly
 applied when saving to that variable, producing an exact
 representation of the rounded result in storage.
Yes, but if you want accurate representation of two fractional digits on storage only, then it makes most sense to do all calculations on cents (scale everything by 100) and store as integers.
That's only a workaround. It's probably best to think of decimal arithmetic as an accountant would, not as a computer geek would. Intermediate arithmetic results should perhaps also be rounded/ truncated, though it depends on the particular calculation. For instance, I might have an interest percentage that gets carried to 4 sig figs (fractional digits), but when applied to a currency amount with only 2 sig figs, the result should end up with 2 sig figs, not 6. And then that result should continue to be used in later arithmetic in the same larger expression, even without an explicit save operation to invoke the rounding/truncation. If you tried this using float for the percentage and integer for the amount, once the conversion to float happened it would not revert in the middle of the expression. Also, keeping values as only integers complicates input/output formatting. I want 123456 pennies to show up as either 1234.56 or perhaps 1,234.56 (at least, I want that punctuation in a U.S. locale), and this should happen without my needing to manually rescale during i/o operations.
Entirely agree. Scaling everything internally to force things to integers, and then re-scaling on output, is just bending the language because it cannot properly address the problem. The result is just creating something more to go wrong - consider a large accounting program maintained over several years, not always by its original author. If we allow _int foo;_ to declare an integer variable foo, then I suggest we have _dec bar(a,b);_ to declare a decimal variable bar with a digits of total length and b decimal places. The D language then defines how (for example) assignment is carried out (by default), and also provides mechanisms for alternatives (i.e. some sort of dec_trunc() and dec_round() functions).
Jun 06 2016
parent reply Observer <here inter.net> writes:
On Monday, 6 June 2016 at 19:55:53 UTC, DLearner wrote:
 If we allow _int foo;_ to declare an integer variable foo, then 
 suggest we have
 _dec bar(a,b);_ to declare a decimal variable bar with a units 
 in total length, b units of decimal places.
 D language then defines how (for example) assignment is carried 
 out (by default), and also provides mechanisms for alternatives 
 (ie some sort of dec_trunc() and dec_round() functions).
And when real-world money is involved, you'd better have arithmetic overflow trigger an exception, not just wrap around silently.
Jun 06 2016
next sibling parent reply DLearner <bmqazwsx123 gmail.com> writes:
On Monday, 6 June 2016 at 20:19:20 UTC, Observer wrote:
 And when real-world money is involved, you'd better have 
 arithmetic
 overflow trigger an exception, not just wrap around silently.
Suggest as compiler option, default=on. IBM PL/I had a _FIXED DECIMAL_ datatype, many many years ago. Could it be used as a model?
Jun 06 2016
next sibling parent Observer <here inter.net> writes:
On Monday, 6 June 2016 at 21:35:20 UTC, DLearner wrote:
 IBM PL/I had a _FIXED DECIMAL_ datatype, many many years ago.
 Could it be used as a model?
Not by me. I just checked, and I still have my IBM Fortran IV manuals, but my PL/I (F) Reference Manual seems to have evaporated. On the other hand, I'd probably prefer to use PL/I as a reference for ideas than COBOL. But really, I assume such issues are already described in the C/C++ standards papers I referenced in an earlier post.
Jun 06 2016
prev sibling next sibling parent Russel Winder via Digitalmars-d <digitalmars-d puremagic.com> writes:
On Mon, 2016-06-06 at 21:35 +0000, DLearner via Digitalmars-d wrote:
[…]
 IBM PL/I had a _FIXED DECIMAL_ datatype, many many years ago.
 Could it be used as a model?
That will almost certainly be because IBM hardware has a decimal ALU as well as a binary ALU.
Jun 07 2016
prev sibling parent tsbockman <thomas.bockman gmail.com> writes:
On Monday, 6 June 2016 at 21:35:20 UTC, DLearner wrote:
 On Monday, 6 June 2016 at 20:19:20 UTC, Observer wrote:
 And when real-world money is involved, you'd better have 
 arithmetic
 overflow trigger an exception, not just wrap around silently.
Suggest as compiler option, default=on.
Full compiler-level support is not currently planned, but we've been working on library support for a while now, and it's almost ready: https://github.com/dlang/phobos/pull/4407 https://code.dlang.org/packages/checkedint It's not as fast as a built-in type could be, but still much faster than BigDecimal.
Jun 07 2016
prev sibling parent qznc <qznc web.de> writes:
On Monday, 6 June 2016 at 20:19:20 UTC, Observer wrote:
 On Monday, 6 June 2016 at 19:55:53 UTC, DLearner wrote:
 If we allow _int foo;_ to declare an integer variable foo, 
 then suggest we have
 _dec bar(a,b);_ to declare a decimal variable bar with a units 
 in total length, b units of decimal places.
 D language then defines how (for example) assignment is 
 carried out (by default), and also provides mechanisms for 
 alternatives (ie some sort of dec_trunc() and dec_round() 
 functions).
And when real-world money is involved, you'd better have arithmetic overflow trigger an exception, not just wrap around silently.
I made an abstraction, which uses long internally. If the value range is big enough, it works fine. http://code.dlang.org/packages/money
Jun 07 2016
prev sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 6/6/2016 11:36 AM, Observer wrote:
 It's more complicated than that.  Part of what you need is to
 be able to declare a variable as (say) having two significant
 fractional digits, and have the rounding rules be implicitly
 applied when saving to that variable, producing an exact
 representation of the rounded result in storage.  And part of
 the reason for that is that if you compare "3.499" to "3.501"
 (as the originally computed numbers) when only 2 sig figs
 should be in play, they must compare equal.  This is very much
 unlike what happens with binary floating point, where exact
 comparisons are rarely possible due to low-order bit differences.
 In commercial decimal-arithmetic applications (think "money"),
 the low-order bits are supposed to be discarded for almost all
 values that actually represent an amount of currency.  For
 reliability of the application, this has to be built into the
 data type, not dependent on programmer vigilance.  That is,
 just like certain other language features, a decimal type
 without implicit truncation would be thought of as "doesn't
 scale well".
What I've done, though I know that I won't convince any users of BigDecimal, is use longs and have them represent pennies instead of dollars. Then they're all exact to two places.
Jun 06 2016
next sibling parent reply Observer <here inter.net> writes:
On Monday, 6 June 2016 at 20:35:26 UTC, Walter Bright wrote:
 What I've done, though I know that I won't convince any users 
 of BigDecimal, is use longs and have them represent pennies 
 instead of dollars. Then they're all exact to two places.
I loaned out my copy of TDPL, so I don't have it nearby to check, but if that's your approach, you need to ensure that your underlying integral type is *always* 64-bits, not 32-bits. The reason is that otherwise, you've just limited your apps to handling a maximum amount of $21,474,836.48. But banks can accept checks up to $99,999,999.99 (yes, they do have a limit, or at least they used to when I was working in that field).
Jun 06 2016
next sibling parent Observer <here inter.net> writes:
On Monday, 6 June 2016 at 20:56:04 UTC, Observer wrote:
 On Monday, 6 June 2016 at 20:35:26 UTC, Walter Bright wrote:
 What I've done, though I know that I won't convince any users 
 of BigDecimal, is use longs and have them represent pennies 
 instead of dollars. Then they're all exact to two places.
I loaned out my copy of TDPL, so I don't have it nearby to check, but if that's your approach, you need to ensure that your underlying integral type is *always* 64-bits, not 32-bits. The reason is that otherwise, you've just limited your apps to handling a maximum amount of $21,474,836.48. But banks can accept checks up to $99,999,999.99 (yes, they do have a limit, or at least they used to when I was working in that field).
Silly me, I meant $21,474,836.47, of course.
Jun 06 2016
prev sibling next sibling parent Steven Schveighoffer <schveiguy yahoo.com> writes:
On 6/6/16 4:56 PM, Observer wrote:
 On Monday, 6 June 2016 at 20:35:26 UTC, Walter Bright wrote:
 What I've done, though I know that I won't convince any users of
 BigDecimal, is use longs and have them represent pennies instead of
 dollars. Then they're all exact to two places.
I loaned out my copy of TDPL, so I don't have it nearby to check, but if that's your approach, you need to ensure that your underlying integral type is *always* 64-bits, not 32-bits. The reason is that otherwise, you've just limited your apps to handling a maximum amount of $21,474,836.48. But banks can accept checks up to $99,999,999.99 (yes, they do have a limit, or at least they used to when I was working in that field).
longs are always 64-bit in D. -Steve
Jun 06 2016
prev sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 6/6/2016 1:56 PM, Observer wrote:
 On Monday, 6 June 2016 at 20:35:26 UTC, Walter Bright wrote:
 What I've done, though I know that I won't convince any users of BigDecimal,
 is use longs and have them represent pennies instead of dollars. Then they're
 all exact to two places.
I loaned out my copy of TDPL, so I don't have it nearby to check, but if that's your approach, you need to ensure that your underlying integral type is *always* 64-bits, not 32-bits.
D's long is always 64 bits.
 The reason is that otherwise, you've just limited your apps
 to handling a maximum amount of $21,474,836.48.
I wouldn't mind running into that problem :-)
 But banks
 can accept checks up to $99,999,999.99 (yes, they do have
 a limit, or at least they used to when I was working in that
 field).
Jun 06 2016
parent reply jmh530 <john.michael.hall gmail.com> writes:
On Monday, 6 June 2016 at 21:28:14 UTC, Walter Bright wrote:
 The reason is that otherwise, you've just limited your apps
 to handling a maximum amount of $21,474,836.48.
I wouldn't mind running into that problem :-)
It's really not that hard when you consider that many institutional investors wouldn't even consider investing in mutual funds with that amount under management. Vanguard has mutual funds with 100+ billion. I imagine their NAV (net asset value) calculations have to be exact to dollars/cents too.
Jun 06 2016
parent Andre Pany <andre s-e-a-p.de> writes:
On Monday, 6 June 2016 at 22:21:45 UTC, jmh530 wrote:
 On Monday, 6 June 2016 at 21:28:14 UTC, Walter Bright wrote:
 The reason is that otherwise, you've just limited your apps
 to handling a maximum amount of $21,474,836.48.
I wouldn't mind running into that problem :-)
It's really not that hard when you consider that many institutional investors wouldn't even consider investing in mutual funds with that amount under management. Vanguard has mutual funds with 100+ billion. I imagine their NAV (net asset value) calculations have to be exact to dollars/cents too.
In addition to the finance sector there are also a lot of other industries which depend on decimals, e.g. all industries dealing with sensors and measurements. The requirements for them (precision and scale) are very different. There could be very small values with a lot of digits after the decimal separator, which need to be exact. Kind regards André
Jun 06 2016
prev sibling parent reply Russel Winder via Digitalmars-d <digitalmars-d puremagic.com> writes:
On Mon, 2016-06-06 at 13:35 -0700, Walter Bright via Digitalmars-d
wrote:
[…]
 What I've done, though I know that I won't convince any users of
 BigDecimal, is use longs and have them represent pennies instead
 of dollars. Then they're all exact to two places.
You are right, your use of integers (even 128-bit ones) will not convince anyone doing anything finance other than household accounting.

--
Russel.
Dr Russel Winder      t: +44 20 7585 2200   voip: sip:russel.winder ekiga.net
41 Buckmaster Road    m: +44 7770 465 077   xmpp: russel winder.org.uk
London SW11 1EN, UK   w: www.russel.org.uk   skype: russel_winder
Jun 07 2016
parent Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= writes:
On Tuesday, 7 June 2016 at 08:21:23 UTC, Russel Winder wrote:
 On Mon, 2016-06-06 at 13:35 -0700, Walter Bright via 
 Digitalmars-d wrote:
 
[…]
 What I've done, though I know that I won't convince any users 
 of
 BigDecimal, is
 use longs and have them represent pennies instead of dollars. 
 Then
 they're all
 exact to two places.
You are right, your use of integers (even 128-bit ones) will not convince anyone doing anything finance other than household accounting.
It is the same as base 10 floating point, with rounding to N decimal places; you just make the exponent part of the type. Making the exponent and rounding fixed is basically just an optimization where it is statically known.

Anyway, there is no point in having base 10 floating point as part of the language unless you target hardware with base 10 floats. There are already base 10 IEEE 754-2008 implementations available for other languages that those in need can:

1. translate to D
2. unit test against the reference implementation

If well-funded industries like finance want it, they can easily do it. This is quite different from, say, meta-programming, syntax, tooling, memory management and basic data types, which are language-related issues, not small-scale funding issues.
Jun 07 2016
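The equivalence claimed above, that an integer scaled by a fixed exponent is the same thing as a base-10 float, can be sketched with Python's standard-library `decimal` module (used here purely as a convenient reference implementation of base-10 arithmetic):

```python
from decimal import Decimal

# An integer count of cents is just a base-10 float whose exponent is
# fixed at -2: scaling 1999 by 10^-2 gives exactly 19.99.
cents = 1999
assert Decimal(cents).scaleb(-2) == Decimal("19.99")

# Fixing the rounding to two places is quantize() in decimal terms; the
# default context rounds half-to-even (banker's rounding).
assert Decimal("0.105").quantize(Decimal("0.01")) == Decimal("0.10")

print("scaled-integer and base-10 float representations agree")
```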
prev sibling parent Russel Winder via Digitalmars-d <digitalmars-d puremagic.com> writes:
On Mon, 2016-06-06 at 13:10 +0000, jmh530 via Digitalmars-d wrote:
 On Monday, 6 June 2016 at 05:49:53 UTC, Ethan Watson wrote:
 Echoing the need for decimal support. I won't use it myself,
 but I know it's critical for finance.
You can always round something to two digits if you need to.
For graphics cards this approach to floating point may work: in fact we know it does. Graphics-card floating point is fast but really quite inaccurate, yet accurate enough that the final rounding leads to the right result (usually). For finance, you have to have very high accuracy all the way through the calculations: floating-point rounding errors are a serious problem, solvable only with arbitrary size/precision numbers. Hence Java's BigDecimal, GNU's gmp, and all the other realizations. This is not just an abstract technical debate; it is a matter of compliance with legal and regulatory requirements. And no, you can't do it with integers, at least not hardware ones.

--
Russel.
Jun 07 2016
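The accumulation of binary rounding error described above is easy to demonstrate (a small Python illustration; the thread's own context is D with BigDecimal/gmp-style libraries):

```python
from decimal import Decimal

# Adding one cent ten thousand times should give exactly 100 dollars.
# In binary floating point 0.01 is not representable, so error
# accumulates; a decimal (or integer-cents) representation stays exact.
float_total = sum(0.01 for _ in range(10_000))
decimal_total = sum(Decimal("0.01") for _ in range(10_000))

print(float_total == 100.0)                # False
print(decimal_total == Decimal("100.00"))  # True
```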
prev sibling next sibling parent reply Nick Sabalausky <SeeWebsiteToContactMe semitwist.com> writes:
On 06/06/2016 01:49 AM, Ethan Watson wrote:
 I linked my DConf talks on a games industry forum, and the first
 response was that "It looks like a poor man's Rust". A notion I quickly
The games industry (with some exceptions) has a very deeply rooted mentality of "$$$ matters, everything else is teenagers hacking in their basement". Naturally leads to: "Rust -> Mozilla -> $$$ -> Valid" vs "D -> Volunteers -> Kids dinking around -> Worthless bullcrap". Honestly, I wouldn't even bother trying with that crowd. Let them circle the drain with their neverending spiral of ballooning-costs.
Jun 06 2016
parent Russel Winder via Digitalmars-d <digitalmars-d puremagic.com> writes:
This.

On Mon, 2016-06-06 at 13:07 -0400, Nick Sabalausky via Digitalmars-d
wrote:
[…]
 The games industry (with some exceptions) has a very deeply rooted
 mentality of "$$$ matters, everything else is teenagers hacking in
 their basement". Naturally leads to: "Rust -> Mozilla -> $$$ ->
 Valid" vs "D -> Volunteers -> Kids dinking around -> Worthless
 bullcrap". Honestly, I wouldn't even bother trying with that crowd.
 Let them circle the drain with their neverending spiral of
 ballooning costs.

--
Russel.
Jun 07 2016
prev sibling parent reply ketmar <ketmar ketmar.no-ip.org> writes:
On Monday, 6 June 2016 at 05:49:53 UTC, Ethan Watson wrote:
 Echoing the need for decimal support. I won't use it myself, 
 but I know it's critical for finance.
funny thing: those "financial sector" most of the time doesn't give anything back. adding special decimal type complicating the compiler and all backends. i myself never needed that for my whole lifetime (and this is more than two decades of programming, in various free and commercial projects). i'd say: "you want it? DIY or GTFO."
Jun 06 2016
next sibling parent reply "H. S. Teoh via Digitalmars-d" <digitalmars-d puremagic.com> writes:
On Tue, Jun 07, 2016 at 05:39:37AM +0000, ketmar via Digitalmars-d wrote:
 On Monday, 6 June 2016 at 05:49:53 UTC, Ethan Watson wrote:
 Echoing the need for decimal support. I won't use it myself, but I
 know it's critical for finance.
funny thing: those "financial sector" most of the time doesn't give anything back. adding special decimal type complicating the compiler and all backends. i myself never needed that for my whole lifetime (and this is more than two decades of programming, in various free and commercial projects).
[...] A Decimal type isn't hard to implement as a user-defined type. I don't understand the obsession with some people that something must be a built-in type to be acceptable... user-defined types were invented for a reason, and in D you have the facilities of making user-defined types behave almost like built-in types in a way no other language I know of can. Same thing goes with a fixed point type. People keep complaining about it, but honestly if I were in the finance sector I'd implement the type myself in a couple o' days and put it up on code.dlang.org or something. It's not *that* hard. T -- To provoke is to call someone stupid; to argue is to call each other stupid.
Jun 06 2016
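The user-defined fixed-point type described above really is a small job; a minimal sketch follows (in Python for brevity, since the D version would differ only in syntax; the names `Money` and `from_str` are illustrative, and the parser handles non-negative amounts only):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Money:
    """Exact two-place fixed point: an integer count of cents."""
    cents: int

    @classmethod
    def from_str(cls, text):
        # Parse a non-negative "dollars.cents" string without ever
        # touching binary floating point.
        dollars, _, frac = text.partition(".")
        return cls(int(dollars) * 100 + int((frac + "00")[:2]))

    def __add__(self, other):
        return Money(self.cents + other.cents)

    def __str__(self):
        return f"{self.cents // 100}.{self.cents % 100:02d}"

print(Money.from_str("19.99") + Money.from_str("0.01"))  # 20.00
```

In D the same idea becomes a struct with overloaded operators, which is exactly what makes it behave like a built-in type.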
next sibling parent reply ketmar <ketmar ketmar.no-ip.org> writes:
On Tuesday, 7 June 2016 at 05:38:25 UTC, H. S. Teoh wrote:
 Same thing goes with a fixed point type. People keep 
 complaining about it, but honestly if I were in the finance 
 sector I'd implement the type myself in a couple o' days and 
 put it up on code.dlang.org or something. It's not *that* hard.
i've seen some implementations already (and, of course, made my own too). it is not that hard, and it working nice. so yes, i'm all for that. and literals of various custom types can be implemented with CTFE magic, like `octal!`, for example.
Jun 06 2016
parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= writes:
On Tuesday, 7 June 2016 at 05:52:33 UTC, ketmar wrote:
 On Tuesday, 7 June 2016 at 05:38:25 UTC, H. S. Teoh wrote:
 Same thing goes with a fixed point type. People keep 
 complaining about it, but honestly if I were in the finance 
 sector I'd implement the type myself in a couple o' days and 
 put it up on code.dlang.org or something. It's not *that* hard.
i've seen some implementations already (and, of course, made my own too). it is not that hard, and it working nice. so yes, i'm all for that. and literals of various custom types can be implemented with CTFE magic, like `octal!`, for example.
Well, it is a lot of work to get the base 10 IEEE 754-2008 implementation conformant. Fortunately IBM has already done it under a BSD license: http://www.bytereef.org/mpdecimal/index.html So all you base-10 dudes have to do is to translate it into D.
Jun 07 2016
parent Walter Bright <newshound2 digitalmars.com> writes:
On 6/7/2016 12:43 AM, Ola Fosheim Grøstad wrote:
 Well, it is a lot of work to get the base 10 IEEE 754-2008 implementation
 conformant. Fortunately IBM has already done it under a BSD license:

 http://www.bytereef.org/mpdecimal/index.html

 So all you base-10 dudes have to do is to translate it into D.
Nice link. It isn't even necessary to translate it to D. Just put a D wrapper around it (it is a C library, after all). Since it is BSD licensed, it probably shouldn't go into Phobos itself, but as an add-on library there should be no problem.
Jun 07 2016
prev sibling next sibling parent reply Andre Pany <andre s-e-a-p.de> writes:
On Tuesday, 7 June 2016 at 05:38:25 UTC, H. S. Teoh wrote:
 On Tue, Jun 07, 2016 at 05:39:37AM +0000, ketmar via 
 Digitalmars-d wrote:
 On Monday, 6 June 2016 at 05:49:53 UTC, Ethan Watson wrote:
 Echoing the need for decimal support. I won't use it myself, 
 but I know it's critical for finance.
funny thing: those "financial sector" most of the time doesn't give anything back. adding special decimal type complicating the compiler and all backends. i myself never needed that for my whole lifetime (and this is more than two decades of programming, in various free and commercial projects).
[...] A Decimal type isn't hard to implement as a user-defined type. I don't understand the obsession with some people that something must be a built-in type to be acceptable... user-defined types were invented for a reason, and in D you have the facilities of making user-defined types behave almost like built-in types in a way no other language I know of can. Same thing goes with a fixed point type. People keep complaining about it, but honestly if I were in the finance sector I'd implement the type myself in a couple o' days and put it up on code.dlang.org or something. It's not *that* hard. T
In my opinion passionate D developer will build the missing parts themselves. I also developed my own decimal library. But the developers starting with D and which rather want to build content instead of the missing parts won't. Also there is a much higher trust if a library is provided within phobos than provided from a single person in terms of long time support. Kind regards Andre
Jun 06 2016
parent Russel Winder via Digitalmars-d <digitalmars-d puremagic.com> writes:
On Tue, 2016-06-07 at 06:23 +0000, Andre Pany via Digitalmars-d wrote:
[…]
 In my opinion passionate D developer will build the missing parts
 themselves. I also developed my own decimal library. But the
 developers starting with D and which rather want to build content
 instead of the missing parts won't. Also there is a much higher
 trust if a library is provided within phobos than provided from a
 single person in terms of long time support.
Forget Phobos, be agile: get your stuff out on GitHub and into the Dub repository early, announce it, get people using it and feeding back to improve it. Be first to market and get traction. The evolution of Go, Rust, and Ceylon has been about library development not in the central library but in easily accessed and used third-party libraries. Yes, the central library core is important, but the real development, and traction, action is outside that.

--
Russel.
Jun 07 2016
prev sibling parent Ethan Watson <gooberman gmail.com> writes:
On Tuesday, 7 June 2016 at 05:38:25 UTC, H. S. Teoh wrote:
 A Decimal type isn't hard to implement as a user-defined type. 
 I don't understand the obsession with some people that 
 something must be a built-in type to be acceptable...
As I see it, any kind of an implementation that's comparable to what's out there is acceptable, be it a standard libary or user library, as long as it's visible and people can easily find it. For example in C++ land: https://software.intel.com/en-us/articles/intel-decimal-floating-point-math-library/ Which makes a point of stating it conforms to standards and is usable in cases where decimal is legally required. And in C# land: https://msdn.microsoft.com/en-us/library/system.decimal(v=vs.110).aspx System.Decimal and the basic decimal type being a part of the .NET runtime/C# language. Both are highly visible from Google searches.
Jun 07 2016
prev sibling next sibling parent Russel Winder via Digitalmars-d <digitalmars-d puremagic.com> writes:
On Mon, 2016-06-06 at 22:38 -0700, H. S. Teoh via Digitalmars-d wrote:
[…]
 A Decimal type isn't hard to implement as a user-defined type. I
 don't understand the obsession with some people that something must
 be a built-in type to be acceptable... user-defined types were
 invented for a reason, and in D you have the facilities of making
 user-defined types behave almost like built-in types in a way no
 other language I know of can.

 Same thing goes with a fixed point type. People keep complaining
 about it, but honestly if I were in the finance sector I'd implement
 the type myself in a couple o' days and put it up on code.dlang.org
 or something. It's not *that* hard.
It is certainly the case that now there is the Dub repository, it is feasible to think in terms of library solutions for things not in Phobos. Until then, if it wasn't in Phobos it didn't exist. Yes, organizations can implement their own, but I bet they would think twice about putting it into the Dub repository.

--
Russel.
Jun 07 2016
prev sibling parent reply Russel Winder via Digitalmars-d <digitalmars-d puremagic.com> writes:
On Tue, 2016-06-07 at 05:39 +0000, ketmar via Digitalmars-d wrote:
[…]
 funny thing: those "financial sector" most of the time doesn't
 give anything back. adding special decimal type complicating the
 compiler and all backends. i myself never needed that for my
 whole lifetime (and this is more than two decades of programming,
 in various free and commercial projects).
Financial organizations generally want all their language infrastructure for free (GCC, Python, Eclipse); anything they write themselves is treated as an asset and so is theirs, not for anyone else. Show them PyPy is 30x faster than CPython for their use case but needs £50,000 to be production ready, and they carry on using CPython. (And waste millions of £s in staff time waiting for Python codes to finish.)
 i'd say: "you want it? DIY or GTFO."
That leads to them using Java, Scala, C++, C# and Python. Which they do.

--
Russel.
Jun 07 2016
next sibling parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= writes:
On Tuesday, 7 June 2016 at 08:31:09 UTC, Russel Winder wrote:
 Financial organizations generally want all their language 
 infrastructure for free (GCC, Python, Eclipse) anything they 
 write themselves is treated as asset and so theirs not for 
 anyone else.
So basically no advantage in having them use your tools?
 That leads to them using Java, Scala, C++, C# and Python. Which 
 they do.
How does that benefit C++ if they never contribute anything?
Jun 07 2016
parent reply Russel Winder via Digitalmars-d <digitalmars-d puremagic.com> writes:
On Tue, 2016-06-07 at 08:41 +0000, Ola Fosheim Grøstad via
Digitalmars-d wrote:
 On Tuesday, 7 June 2016 at 08:31:09 UTC, Russel Winder wrote:
 Financial organizations generally want all their language
 infrastructure for free (GCC, Python, Eclipse) anything they
 write themselves is treated as asset and so theirs not for
 anyone else.
 So basically no advantage in having them use your tools?
From my experience as a trainer in those organizations… no.
 That leads to them using Java, Scala, C++, C# and Python. Which
 they do.
 How does that benefit C++ if they never contribute anything?
It doesn't. As a counter-point to the "downer" on financial institutions and contributing back, it perhaps should be noted that Goldman Sachs did release their Java data structures framework as FOSS, but I think it hasn't gained much traction. Also there is Jane Street Capital's contributions to OCaml. However, in the main, financial institutions do not like giving back. Where there is some giving back, it comes from organizations that work with the financial institutions: Python has a number of organizations contributing to Python and library development who get their income by training or consulting for financial institutions.

--
Russel.
Jun 07 2016
next sibling parent Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= writes:
On Tuesday, 7 June 2016 at 09:03:15 UTC, Russel Winder wrote:
 As a counter-point to the "downer" on financial institutions 
 and contributing back, it perhaps should be noted that Goldman 
 Sachs did release their Java data structures framework as FOSS, 
 but I think it hasn't gained much traction. Also there is Jane 
 Street Capital's contributions to OCaml.
I think big corporations releasing internal frameworks doesn't add much in general. They often "change the language" by requiring programmers to adapt a specific paradigm on a very basic level that has been aggregated over time. Even Qt and moc has such issues. Internal libraries can sometimes be reworked to something more general, but frameworks are usually a waste of time if it has not been used by a very large number of projects while being developed.
 Python has a number of organizations contributing to Python and 
 library development who get their income by training or 
 consulting to financial institutions.
*nods* So you basically need very large scale adoption before you get the benefits. Which kind of makes it irrelevant in this context, where the goal is to gain traction.
Jun 07 2016
prev sibling parent Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= writes:
On Tuesday, 7 June 2016 at 09:03:15 UTC, Russel Winder wrote:
 Where there is some giving back is organizations that work with 
 the financial institutions. Python has a number of 
 organizations contributing to Python and library development 
 who get their income by training or consulting to financial 
 institutions.
Btw, I just found this on pony-lang: https://www.infoq.com/presentations/pony I guess one model is for the language-creator to do consulting/implementation for the industry and make the improvements on the compiler/libraries available. But Pony is a rather unique language regarding compile time guarantees.
Jun 07 2016
prev sibling next sibling parent reply ketmar <ketmar ketmar.no-ip.org> writes:
On Tuesday, 7 June 2016 at 08:31:09 UTC, Russel Winder wrote:
 Financial organizations generally want all their language 
 infrastructure for free (GCC, Python, Eclipse) anything they 
 write themselves is treated as asset and so theirs not for 
 anyone else.
...
 That leads to them using Java, Scala, C++, C# and Python. Which 
 they do.
so let 'em use those languages. they will not return anything, thus i see no reason to care about their bussiness needs. i've never seen "this language is used in banking" to be used as the main reason for choosing one language over another (outside the banking, of course ;-). if they want to contribute and support their contribution — great, they can. but if they want somebody to do the work for 'em for free... two words, seven letters.
Jun 07 2016
next sibling parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= writes:
On Tuesday, 7 June 2016 at 08:45:14 UTC, ketmar wrote:
 never seen "this language is used in banking" to be used as the 
 main reason for choosing one language over another (outside the 
 banking, of course ;-).
True, positioning yourself as a toolmaker for banking/finance is basically saying it is arcane and slow-moving (like Cobol and Java). It does not make you attractive for programmers wanting hardware access. On the contrary, any language that is stuffed into the Java category would make me highly sceptical when looking for system programming tooling.
Jun 07 2016
parent reply Laeeth Isharc <laeethnospam nospam.laeeth.com> writes:
On Tuesday, 7 June 2016 at 08:57:36 UTC, Ola Fosheim Grøstad 
wrote:
 On Tuesday, 7 June 2016 at 08:45:14 UTC, ketmar wrote:
 never seen "this language is used in banking" to be used as 
 the main reason for choosing one language over another 
 (outside the banking, of course ;-).
True, positioning yourself as a toolmaker for banking/finance is basically saying it is arcane and slow-moving (like Cobol and Java).
I shall try to respect Andrei's advice, but really! Hahahaha!
Jun 07 2016
parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= writes:
On Tuesday, 7 June 2016 at 09:02:05 UTC, Laeeth Isharc wrote:
 On Tuesday, 7 June 2016 at 08:57:36 UTC, Ola Fosheim Grøstad 
 wrote:
 On Tuesday, 7 June 2016 at 08:45:14 UTC, ketmar wrote:
 never seen "this language is used in banking" to be used as 
 the main reason for choosing one language over another 
 (outside the banking, of course ;-).
True, positioning yourself as a toolmaker for banking/finance is basically saying it is arcane and slow-moving (like Cobol and Java).
I shall try to respect Andrei's advice, but really! Hahahaha!
Try tell someone making a 3D engine that your tooling is used in banking and that they should switch from C++. Now, don't feel insulted, but banking/finance is considered a boring application area by most programmers I know of.
Jun 07 2016
parent reply Ethan Watson <gooberman gmail.com> writes:
On Tuesday, 7 June 2016 at 09:11:38 UTC, Ola Fosheim Grøstad 
wrote:
 Try tell someone making a 3D engine that your tooling is used 
 in banking and that they should switch from C++.

 Now, don't feel insulted, but banking/finance is considered a 
 boring application area by most programmers I know of.
And yet, the financial/banking sector loves game/engine programmers because they give a damn about real-time performance. There are plenty of ex-game-developers in that sector making three times as much money as they used to. Not to say that it isn't boring. That's purely a subjective thing.
Jun 07 2016
parent Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= writes:
On Tuesday, 7 June 2016 at 10:28:37 UTC, Ethan Watson wrote:
 performance. There are plenty of ex-game-developers in that 
 sector making three times as much money as they used to.
I am sure there is, game programmers/smaller companies also contribute a lot of libraries and knowhow (tutorials etc). Whereas the suits in games are in it for the money, I think most game programmers are in it for other more "idealistic" reasons. The difference between: 1. Programming in order to reach some non-software performance goal. 2. Programming in order to achieve a programming related esthetic result. Attracting the culture in group 2 is much more valuable to a community project as they find it meaningful to share their knowledge (games, raytracing, compilers etc). It isn't only a job then.
 Not to say that it isn't boring. That's purely a subjective 
 thing.
I don't know if it is boring or not, probably depends on where you work, but the reputation isn't very marketable. Unlike say embedded programming. Embedded programming -> excellent hardware access / memory usage Games programming -> excellent access to OS APIs and resource management
Jun 07 2016
prev sibling parent reply Russel Winder via Digitalmars-d <digitalmars-d puremagic.com> writes:
On Tue, 2016-06-07 at 08:45 +0000, ketmar via Digitalmars-d wrote:
[…]
 so let 'em use those languages. they will not return anything,
 thus i see no reason to care about their bussiness needs. i've
 never seen "this language is used in banking" to be used as the
 main reason for choosing one language over another (outside the
 banking, of course ;-).

 if they want to contribute and support their contribution —
 great, they can. but if they want somebody to do the work for 'em
 for free... two words, seven letters.
Generally I would agree. However, with something like "big decimal" I'd say it is worth doing anyway, even though the impetus appears to be from financial computing.

--
Russel.
Jun 07 2016
parent ketmar <ketmar ketmar.no-ip.org> writes:
On Tuesday, 7 June 2016 at 09:05:10 UTC, Russel Winder wrote:
 Generally I would agree. However with something like "big 
 decimal" I'd say it is worth doing anyway – even though the 
 impetus appears to be from financial computing.
as an external library, i'd say, by someone who really needs it. i don't feel that it is something that should be included into language distribution and supported by core devs.
Jun 07 2016
prev sibling parent reply Laeeth Isharc <laeethnospam nospam.laeeth.com> writes:
On Tuesday, 7 June 2016 at 08:31:09 UTC, Russel Winder wrote:
 On Tue, 2016-06-07 at 05:39 +0000, ketmar via Digitalmars-d 
 wrote:
 
[…]
 funny thing: those "financial sector" most of the time doesn't 
 give anything back. adding special decimal type complicating 
 the compiler and all backends. i myself never needed that for 
 my whole lifetime (and this is more than two decades of 
 programming, in various free and commercial projects).
Financial organizations generally want all their language infrastructure for free (GCC, Python, Eclipse) anything they write themselves is treated as asset and so theirs not for anyone else. Show them PyPy is 30x faster than CPython for their use case but needs £50,000 to be production ready, and they carry on using CPython. (And waste millions of £s in staff time waiting for Python codes to finish.)
 i'd say: "you want it? DIY or GTFO."
That leads to them using Java, Scala, C++, C# and Python. Which they do.
I only know a certain portion of that world, but for example Jane Street has done quite a lot for Ocaml, Bloomberg has released some useful things including for languages, Morgan Stanley has supported Scala, I have supported in a small way some things for D and will be releasing a working Bloomberg API soon. Don't look for innovation to come from the banks because they have had other things to deal with, but even there there is the beginning of a broader change in mindset. EMSI are in econ not quite finance and they released containers. Finance is just one more industry, but it's quite a pragmatic one and still has a decent share of global IT spending.
Jun 07 2016
parent reply Russel Winder via Digitalmars-d <digitalmars-d puremagic.com> writes:
On Tue, 2016-06-07 at 09:00 +0000, Laeeth Isharc via Digitalmars-d
wrote:
[…]
 I only know a certain portion of that world, but for example Jane
 Street has done quite a lot for Ocaml, Bloomberg has released
 some useful things including for languages, Morgan Stanley has
 supported Scala, I have supported in a small way some things for
 D and will be releasing a working Bloomberg API soon. Don't look
 for innovation to come from the banks because they have had other
 things to deal with, but even there there is the beginning of a
 broader change in mindset.
Jane Street are indeed well-known and well-renowned for their work with OCaml. It works for them but remains a small niche with little traction. Bloomberg actually do a significant amount of indirect give-back for C++ and a little for Java: they do a lot of sponsoring of C++ events and have staff on standards committees. We at ACCUConf like Bloomberg. I haven't been aware of Scala give-back from Morgan Stanley, I shall hunt it out. The organizations I know using Scala generally stay pretty quiet about it other than they are using it. Ditto for Python.
 Finance is just one more industry, but it's quite a pragmatic one
 and still has a decent share of global IT spending.
I have made quite a lot of my income over the last 7 years from the finance industry, commercial banking and hedge funds; I am not complaining. :-)

--
Russel.
Jun 07 2016
parent Laeeth Isharc <laeethnospam nospam.laeeth.com> writes:
On Tuesday, 7 June 2016 at 09:45:36 UTC, Russel Winder wrote:
 Jane Street are indeed well-known and well-renowned for their 
 work with OCaml. It works for them but remains a small niche 
 with little traction.
Perhaps, but my point is that there's a company that adopted a less-known language without starting out with any grand plan to do so, and they have clearly contributed very substantially to that language community in terms of money and, I would guess, code (as well as hiring key people from the OCaml community).
 Bloomberg actually do a significant amount of indirect 
 give-back for C++ and a little for Java: they do a lot of 
 sponsoring of C++ events and have staff on standards 
 committees. We at ACCUConf like Bloomberg.

 I haven't been aware of Scala give-back from Morgan Stanley, I 
 shall hunt it out. The organizations I know using Scala 
 generally stay pretty quiet about it other than they are using 
 it. Ditto for Python.
So perhaps you might agree that even the few examples that immediately came to mind demonstrate it isn't true that finance doesn't, and wouldn't, give back. I don't know the details of MS involvement with Scala (except that they use it), but a Scala guy I talked to was concerned that they were having too much influence on the development of the language, so I guess there must have been some money and code involved.
 Finance is just one more industry, but it's quite a pragmatic 
 one and still has a decent share of global IT spending.
I have made quite a lot of my income over the last 7 years from the finance industry, commercial banking and hedge funds, I am not complaining. :-)
Glad to hear. But maybe in that context one should be thoughtful about suggesting entire industries are unpromising targets. It's quite heterogeneous, and hard to speak in broad-brush terms.
Jun 07 2016
prev sibling next sibling parent reply poliklosio <poliklosio happypizza.com> writes:
On Monday, 6 June 2016 at 02:20:52 UTC, Walter Bright wrote:
 * The garbage collector eliminates probably 60% of potential 
 users right off.
Please, eliminate the GC. This also hurts the open source community. Why would I write and open-source a high-performance library if I know that projects like AAA games are not going to use it anyway, due to the GC in D? On the other hand, if I can write a library that guarantees not to use and not to need a garbage collector, then even C and C++ projects can use it. With the GC, D doesn't play nice with existing C/C++ code.
 * Tooling is immature and of poorer quality compared to the 
 competition.
Quality is an issue, but I think a bigger problem for adoption is just the time it takes a new user to set the dev environment up. If you look at these pages:
https://wiki.dlang.org/Editors
https://wiki.dlang.org/IDEs
most of the tools use dcd, dscanner and dfmt to provide the most important features like auto-completion, autoformatting etc. The problem is that dcd, dscanner and dfmt are not bundled together, so it takes a long time to actually install all this. Note that discovering what is what also takes a lot of time for a new user.

I tried to do this recently and it took me 2 days before I found a combination of versions of dcd, dscanner, dfmt, dub and an IDE that work together correctly on Windows. My example is probably extreme, but it is a lesson.

May I suggest bundling the official tools with dcd, dscanner, dfmt and dub to create a Dlang-SDK. Then the user would only have to install:
- Dlang-SDK
- An editor

And everything would work. This would reduce the time from a few hours (with a very large variance) to 30 minutes. Then maybe people who try D in their free time without being strongly indoctrinated by Andrei will not quit after 30 minutes. :)
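The bundling idea could start as little more than a bootstrap script. A minimal sketch, assuming dub and a D compiler are already on the PATH; the "Dlang-SDK" framing is the suggestion above, not an existing installer, and it dry-runs by default so it only prints the dub commands it would execute:

```shell
# Hypothetical Dlang-SDK bootstrap: fetch and build the usual tooling via dub.
# DRYRUN=1 (the default) only prints the commands; DRYRUN=0 actually runs them.
DRYRUN=${DRYRUN:-1}
for tool in dcd dscanner dfmt; do
    if [ "$DRYRUN" = 1 ]; then
        echo "dub fetch $tool && dub build $tool"
    else
        dub fetch "$tool" && dub build "$tool"
    fi
done
```

Even something this small would turn "discovering what is what" into a single command, which is the point of an SDK bundle.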
Jun 05 2016
next sibling parent reply Guillaume Piolat <first.last gmail.com> writes:
On Monday, 6 June 2016 at 06:50:36 UTC, poliklosio wrote:
 Please, elliminate GC.
 This also hurts the open source community. Why would I 
 write/opensource a high performance library if I know that 
 projects like AAA games are not going to use it anyway due to 
 GC in D?
- well, there is an AAA game using it now,
- many existing AAA games have a GC already,
- the GC doesn't prevent anything from being done when it comes to real-time,
- you lose development time by not having a GC,
- preciously few people get to make AAA games anyway, and they will work around anything in their path
 On the other hand if I can write a library that guarantees to 
 not use and not need garbage collector then even C and C++ 
 projects can use it.
Even if you use the GC you can make a library that C and C++ can use.
Jun 06 2016
parent reply Ethan Watson <gooberman gmail.com> writes:
On Monday, 6 June 2016 at 07:18:56 UTC, Guillaume Piolat wrote:
 - well there is an AAA game using it now,
Replying solely to highlight that Unreal Engine has had garbage collection since forever, and Unity is a .NET runtime environment with all the GC frills that come with it. GC in the AAA/indie gaming space is hardly a new concept.
Jun 06 2016
parent Danni Coy via Digitalmars-d <digitalmars-d puremagic.com> writes:
On Mon, Jun 6, 2016 at 7:13 PM, Ethan Watson via Digitalmars-d
<digitalmars-d puremagic.com> wrote:
 On Monday, 6 June 2016 at 07:18:56 UTC, Guillaume Piolat wrote:
 - well there is an AAA game using it now,
Replying solely to highlight that Unreal Engine has garbage collected since forever; and Unity is a .NET runtime environment and all the GC frills that come with it. GC in the AAA/indie gaming space is hardly a new concept.
And Unity has had some pretty serious scalability problems that come from this decision, and with Unreal the garbage-collected components are optional; when Epic goes to push performance, they write it in C++.

Having said that, there are garbage collectors out there that are realtime friendly; neither Mono's nor D's, as far as I can tell, performs well in this scenario. Personally I would accept either a garbage collector that guaranteed me better than 5ms latency, or manual memory management.
Jun 06 2016
prev sibling parent reply Russel Winder via Digitalmars-d <digitalmars-d puremagic.com> writes:
On Mon, 2016-06-06 at 06:50 +0000, poliklosio via Digitalmars-d wrote:
 […]
 Please, eliminate the GC.
Let's not. It is a USP against C++ and Rust. It forges a new route to traction, cf. Go, Java, etc.
 This also hurts the open source community. Why would I 
 write/opensource a high performance library if I know that 
 projects like AAA games are not going to use it anyway due to GC 
 in D? On the other hand if I can write a library that guarantees 
 to not use and not need garbage collector then even C and C++ 
 projects can use it.
 With GC, D doesn't play nice with existing C/C++ code.
There may be some instances where this is the case. Let them use C++ or Rust. Fine. If AAA games people want C++ they will use C++, for D they are a lost market. That is fine, there is nothing wrong with that. See what happened with Go. And anyway D has a GC if you want and a no-GC if you want. Thus this is not actually an issue anyway.
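The "GC if you want, no-GC if you want" point is concrete in today's language: `@nogc` is checked at compile time, and `core.memory.GC` lets you disable collections around a latency-critical phase. A minimal sketch (the frame-loop framing is my own, not from this thread):

```d
// Sketch: a GC-free hot path plus explicit control over when collection runs.
import core.memory : GC;

// @nogc is statically enforced: any GC allocation in here is a compile error.
int sumSquares(int n) @nogc nothrow pure
{
    int total = 0;
    foreach (i; 1 .. n + 1)
        total += i * i;
    return total;
}

void main()
{
    GC.disable();                // no collections during the critical phase
    immutable r = sumSquares(3); // 1 + 4 + 9 = 14, with no allocation at all
    GC.enable();
    GC.collect();                // take the pause at a moment of our choosing
    assert(r == 14);
}
```

So both camps can be served in one program: the hot path is provably allocation-free, while the rest of the code keeps the convenience of the GC.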
 * Tooling is immature and of poorer quality compared to the 
 competition.

 Quality is an issue, but I think a bigger problem for adoption is 
 just the time it takes a new user to set the dev environment up. 
 If you look at those pages:
 https://wiki.dlang.org/Editors
 https://wiki.dlang.org/IDEs
 most of the tools use dcd, dscanner and dfmt to provide the most 
 important features like auto-completion, autoformatting etc. The 
 problem is that dcd, dscanner and dfmt are not bundled together 
 so it takes a long time to actually install all this together. 
 Note that discovering what is what also takes a lot of time for a 
 new user.
 I tried to do this recently and it took me 2 days before I found 
 a combination of versions of dcd, dscanner, dfmt, dub and an IDE 
 that work together correctly on Windows. My example is probably 
 extreme but is a lesson.
Agreed. The editor, support and toolchain thing is an issue. Whilst some of us can cope with putting the bits together, this is not a route to traction for the language.
 May I suggest bundling the official tools with dcd, dscanner, 
 dfmt and dub to create a Dlang-SDK.
Having a small selection of "all in" bundles is a good idea. Better is a tool for creating an installation from the bits.
 Then the user would only have to install
 - Dlang-SDK
 - An editor
 And everything would work. This would reduce the time from a few 
 hours (with a very large variance) to 30 minutes. Then maybe 
 people who try D in their free time without being strongly 
 indoctrinated by Andrei will not quit after 30 minutes. :)
-- 
Russel.
Jun 06 2016
parent reply poliklosio <poliklosio happypizza.com> writes:
On Monday, 6 June 2016 at 08:42:55 UTC, Russel Winder wrote:
 On Mon, 2016-06-06 at 06:50 +0000, poliklosio via Digitalmars-d 
 wrote:
 
 […]
 Please, elliminate GC.
Let's not. It is a USP against C++ and Rust. It forges a new route to traction, cf. Go, Java, etc.
I should have been more specific here. I mean I want to eliminate the GC in my own code. I don't mind if you or anyone else uses the GC. Even I use GC languages when writing things like scripts, so I'm not a no-GC-because-I-say-so person.

Is it a unique selling point (USP) against C++ or Rust? I don't think so. People who use GC languages for business/scientific apps don't care what is behind the scenes. Also, the relationship between GC and productivity is a subtle point that requires some CompSci background to grasp. I think D is far too complicated to be marketed as even simpler than Python or Go. Low-latency people do care what is behind the scenes, and they understandably want no GC. That leaves high-performance, high-latency people. If you think you can find a niche there, fair enough; otherwise it's not a USP.

D's power is in its native-but-productive approach. This is an improvement in C++ niches, not a simplified language for banging out end-user apps.
 This also hurts the open source community. Why would I
 write/opensource a high performance library if I know that
 projects like AAA games are not going to use it anyway due to 
 GC
 in D? On the other hand if I can write a library that 
 guarantees
 to not use and not need garbage collector then even C and C++
 projects can use it.
 With GC, D doesn't play nice with existing C/C++ code.
There may be some instances where this is the case. Let them use C++ or Rust. Fine. If AAA games people want C++ they will use C++, for D they are a lost market.
Why would they not use D? D is a much better language for them as well. To give some examples, in C++ code there is a ton of boilerplate, while D code hardly has any. Also, the number of bugs in a D program is smaller due to easier unittesting. Also, templates don't cause day-long stop-and-learn sessions as in C++. I don't think those people are a lost market.
 And anyway D has a GC if you want and a no-GC if you want. Thus 
 this is not actually an issue anyway.
This is a big issue now due to lack of a comprehensive guide, as well as holes in the language and phobos (strings, exceptions, delegates). C++ doesn't have those holes.
Jun 06 2016
parent reply Russel Winder via Digitalmars-d <digitalmars-d puremagic.com> writes:
On Mon, 2016-06-06 at 19:57 +0000, poliklosio via Digitalmars-d wrote:
 […]

 I should have been more specific here. I mean I want to 
 eliminate the GC in my code. I don't mind if you or anyone else 
 uses GC. Even I use GC languages when writing things like 
 scripts, so I'm not a no-GC-because-I-say-so person.
Indeed. D has a GC by default which can be switched off. This is good. That D needs a better GC is an issue.
 Is it a unique selling point (USP) against C++ or Rust? I don't 
 think so. People who use the GC languages for business/scientific 
 apps don't care what is behind the scenes. Also, the relationship 
 between GC and productivity is a subtle point that requires some 
 CompSci background to grasp. I think D is far too complicated to 
 be marketed as even simpler than Python or Go. Low-latency people 
 do care what is behind the scenes and they understandably want no 
 GC. That leaves high-performance high-latency people. If you 
 think you can find a niche there, fair enough, otherwise it's not 
 a USP.
My feeling is there is no point in over-thinking this, or being abstract. C++ can have a GC but doesn't. Rust can have a GC but doesn't. Python has a GC. Go has a GC. Java has a GC. D has a GC that you can turn off. That seems like a USP to me. Whether this is good or bad for traction is down to the marketing and the domain of use.
 D's power is in its native-but-productive approach. This is an 
 improvement in C++ niches, not a simplified language for banging 
 end-user apps.
Productive is way, way more important than native.

[…]
 Why would they not use D? D is a much better language for them as 
 well. To give some examples, in C++ code there is a ton of 
 boilerplate, while D code hardly has any. Also, the number of 
 bugs in a D program is smaller due to easier unittesting. Also, 
 templates don't cause day-long stop-and-learn sessions as in C++. 
 I don't think those people are a lost market.
Can we drop the "unit" and just talk about testing. Unit, integration and system testing are all important; focusing always on unit testing is an error.

As to why not use D? The usual answer is "no one else is using D so we won't use D", for everyone except early adopters.

D needs to remanufacture an early-adopter feel. It is almost there: LDC announcing 1.0.0, Dub getting some activity, new test frameworks (can they all lose the "unit" please in a renaming), rising in the TIOBE table. This can all be used to try and get activity. However, it needs to be activity of an early-adopter style, because there are so many obvious problems with the toolchains and developer environments. So let's focus on getting these things improved, so that the people who will only come to a language with a sophisticated developer experience can then follow.
 This is a big issue now due to lack of a comprehensive guide, as 
 well as holes in the language and Phobos (strings, exceptions, 
 delegates). C++ doesn't have those holes.
Holes in Phobos can be handled by having third-party things in the Dub repository that are superior to what is in Phobos.

Documentation, guides, and tutorials are a problem, but until someone steps up and contributes this is just going to remain a problem; one that will undermine all attempts to get traction for D. So how to get organizations to put resource into doing this?

-- 
Russel.
Jun 07 2016
parent poliklosio <poliklosio happypizza.com> writes:
On Tuesday, 7 June 2016 at 08:57:33 UTC, Russel Winder wrote:
 On Mon, 2016-06-06 at 19:57 +0000, poliklosio via Digitalmars-d 
 wrote:
 […]
 
 I should have been more specific here. I mean I want to 
 elliminate GC in my code. I don't mind if you or anyone else 
 uses GC. Even I use GC languages when writing things like 
 scripts, so I'm not a no-GC-because-I-say-so person.
Indeed. D has a GC by default which can be switched off. This is good. That D needs a better GC is an issue.
For me, much bigger issues are:
- the smaller power of the language with the GC switched off;
- that mixing GC and noGC is still an experimental thing that few experts know how to do properly.

A better GC is not a bigger issue for me, as I'm not going to use much of it. A better GC is of course advantageous for adoption; I just have a strong impression that there are more important things, like nailing easy setup of editors, and providing a guide for programming **without** the GC and for mixing GC and noGC.

You have to distinguish switching the GC off (which implies 2 languages, 2 communities, 2 separate standard libraries, all with some overlap) from being able to mix GC and non-GC code in the same program. The problem is that AFAIK the second is not a viable methodology outside a tightly controlled environment, where you select the libraries you use very carefully and limit their number.
 (...)
My feeling is there is no point in over-thinking this, or being abstract. C++ can have a GC but doesn't. Rust can have a GC but doesn't. Python has a GC. Go has a GC. Java has a GC. D has a GC that you can turn off. That seems like a USP to me. Whether this is good or bad for traction is down to the marketing and the domain of use.
I'm trying to be as far from abstract as I can. Having a GC is hardly a unique selling point. As for switching it off, see the issues above. After they are solved to a point where experts can get noGC stuff done easily, it will be a USP.
 D's power is in its native-but-productive approach. This is an 
 improvement in C++ niches, not a simplified language for 
 banging end-user apps.
Productive is way, way more important that native.
For some people native is necessary. For them D is the way to get productive. Others ideally would use D as well, but currently there are more productive options for them, like C# or Python.
 […]
 
 Why would they not use D? D is a much better language for them 
 as well. To give some examples, in C++ code there is a ton of 
 boilerplate, while D code hardly has any. Also, the number of 
 bugs in a D program is smaller due to easier unittesting. 
 Also, templates don't cause day-long stop-and-learn sessions 
 as in C++. I don't think those people are a lost market.
Can we drop the unit and just talk about testing. Unit, integration and system testing are all important, focusing always on unit testing is an error.
There's nothing wrong with discussing unit testing on its own. In fact, this is very relevant, because it's the unit testing that D makes easier. More coarse-grained testing can be done as easily in any other language: you just make some executables for various subsystems and variants of your program and run them in test scenarios.
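The "D makes unit testing easier" point is worth showing: tests live in `unittest` blocks right beside the code and run with `dmd -unittest -run file.d`, no framework required. A small sketch (`addClamped` is an invented example, not from the thread):

```d
// Invented example: saturating addition, with its tests right beside it.
int addClamped(int a, int b) pure nothrow @nogc
{
    // Do the sum in 64 bits so overflow is detectable, then clamp.
    immutable long s = cast(long) a + b;
    if (s > int.max) return int.max;
    if (s < int.min) return int.min;
    return cast(int) s;
}

unittest
{
    assert(addClamped(1, 2) == 3);
    assert(addClamped(int.max, 1) == int.max);   // saturates instead of wrapping
    assert(addClamped(int.min, -1) == int.min);
}

void main() {} // with -unittest, the unittest blocks run before main
```

The zero-setup cost is the point: there is no test runner to install or registration boilerplate to write, so the tests tend to actually get written.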
 As to why not use D? The usual answer is that "no else is using 
 D so we won't use D" for everyone except early adopters.

 D needs to remanufacture an early adopter feel. It is almost 
 there: LDC announcing 1.0.0, Dub getting some activity, new 
 test frameworks (can they all lose the unit please in a 
 renaming), rising in TIOBE table. This can all be used to try 
 and get activity. However it needs to be activity of an early 
 adopter style because there are so many obvious problems with 
 the toolchains and developer environments. So let's focus on 
 getting these things improved so that then the people who will 
 only come to a language that has sophistication of developer 
 experience.
As long as those are improvements in getting started fast and time-to-market for D apps, then yes, and that's probably 10 times more important than both the slow GC and the poor noGC experience.
 This is a big issue now due to lack of a comprehensive guide, 
 as well as holes in the language and phobos (strings, 
 exceptions, delegates). C++ doesn't have those holes.
Holes in Phobos can be handled by having third-party things in the Dub repository that are superior to what is in Phobos.
I don't think that third-party libraries can have the reach of Phobos libraries. Also, things as basic as strings should be in the standard library or language, otherwise the whole idea of using D looks ridiculous. Having said that, third-party libraries can help.
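For what the Dub route looks like in practice, a project declares the third-party package in its dub.sdl and the registry does the rest. A sketch only; the package name "myapp" is invented and the vibe-d version is merely illustrative of the 2016-era registry:

```sdl
name "myapp"
description "App pulling a web framework from the Dub registry rather than Phobos"
dependency "vibe-d" "~>0.7.29"
```

That is low-friction, but it still lacks the reach and discoverability of something shipped in the standard library, which is the point being made above.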
 Documentation, guides, and tutorials are a problem, but until 
 someone steps up and contributes this is just going to remain a 
 problem. One that will undermine all attempts to get traction 
 for D. So how to get organizations to put resource into doing 
 this?
I think those things can easily be done by individuals as well, as long as pull requests are accepted. But of course, until someone steps up... :)
Jun 07 2016
prev sibling next sibling parent reply Russel Winder via Digitalmars-d <digitalmars-d puremagic.com> writes:
On Sun, 2016-06-05 at 19:20 -0700, Walter Bright via Digitalmars-d
wrote:
 […]

 * The garbage collector eliminates probably 60% of potential users
 right off.

And I bet over 80% of them are just saying this based on zero evidence, just prejudice.

Go went with the attitude "Go has a GC, if you cannot deal with that #### off". Many people did exactly that, and the Go community said "byeeee". Arrogant this may have been, but Pike, Cox, et al. stuck to their guns and forged a community and a niche for the language. This then created traction. Now GC in Go is not an issue.

D has a different problem in that it is not a new language; it cannot ride a "new language" hype wave to get over knee-jerk opinions. D needs to make it clear that there is the normal GC approach, proven to work fine over the years (just look at Go), and there is the GC-less mode for those who really do need absolute control over all memory management. If D does not have this then C++ and Rust win.

If GC-less D is a problem, then why not let people who want absolute control of memory at all times go to another language, e.g. C, C++, Rust? They will anyway. So maybe get rid of GC-less mode as Go has done, and not try to make D a language for people who are already prejudiced against GC and for C, C++ and Rust? If GC is a problem for these people, let them go.
 * Tooling is immature and of poorer quality compared to the
 competition.
This is true. We have too many half-finished attempts at things, basically because everything is volunteer activity, not directly associated with paid work. Nothing wrong with this per se, but it is an obvious explanation of why it is so. Unless an organization or seven put some work-oriented effort into the tooling, nothing will change.

I would suggest three ways forward:

1. Get effort on the IntelliJ IDEA and CLion plugin. Kingsley has made a start. I suggest using his work as a basis and doing a new version written in Kotlin instead of Java. Kotlin will be easier than Java for D people to work with, and easy for Java people to work with.

2. Get effort on the DDT Eclipse plugin. Bruno has declared it finished, which is fine, but I would say it should not be treated that way.

3. Have one lightweight, D-realized, cross-platform IDE. Qt is probably the best widget set to use for this. My model here is LiteIDE, which is a Qt-based Go IDE realized in C++. It should of course be realized in Go, but there are no Qt bindings for Go, only QML ones.
 * Safety has holes and bugs.
Then so do C, C++ and Rust, so this is just a comment made because it can be made and it sounds bad. Bad enough to salve the conscience of the speaker as to why they are not already using D. What does safety mean here anyway? C++ safety perhaps?
 * Hiring people who know D is a problem.
Definitely. But this is also an excuse since there were no Go programmers, no Rust programmers, no C++ programmers, yet now there is no problem. The point here is that a few organizations just need to take a public stance on using the language to create a buzz and traction either happens (as with Go) or it doesn't.
 * Documentation and tutorials are weak.
True. This is a resourcing issue. Go had full resourcing from Google, and then later others, to get to the state they have – which is not half bad.
 * There's no web services framework (by this time many folks know of 
 D, but of those a shockingly small fraction has even heard of 
 vibe.d). I have strongly argued with Sönke to bundle vibe.d with 
 dmd over one year ago, and also in this forum. There wasn't 
 enough interest.
vibe.d is not a full Web services/applications framework in most people's eyes, it is just the event loop. I guess it is like saying Node is a framework without recognizing the importance of npm and the library of REST, SQL, and other components. Another analogy would be saying Flask is a web framework without pip, SQLAlchemy, etc.
 * (On Windows) if it doesn't have a compelling Visual Studio plugin, 
 it doesn't exist.
I guess Windows people are so wedded to Visual Studio this is true.
 * Let's wait for the "herd effect" (corporate support) to start.
Seems reasonable to me. Why should organization X take a risk if organization Y hasn't already? This is the position of businesses doomed to failure. Without some risk-taking there is no new business and hence no growth, not even the status quo. So target this decade-old technology at less risk-averse organizations. Likely this means SMEs, probably small businesses (or small business units within bigger businesses), simply because they are generally less risk averse.
 * Not enough advantages over the competition to make up for the
 weaknesses above.
A position clearly based in a total lack of knowledge, probably of the technology they are currently using. This attitude is simply the risk-averse position in another guise.

It is worth taking note of these and learning from them, clearly. But perhaps D as a language and community should be less defensive about the language and its position in the world. Let's stop the "it's a better C++" line; that is clearly a dead route to traction. Let's take lessons from Go, Rust, Kotlin. Find the things that D does well, in the organizations that use it, and push those.

Go came along as a systems programming language for Google interns to do productive work and not get things wrong. It has ended up as a premier Web programming language, among other things, taking market share from PHP and Python as well as C. Yes, it had the "better C" angle, but it did not agonize about how to move people from C. Yes, it had Google and hype, but it gained traction both because of that and in spite of that.

Rust is pushing the "Rust is a better C++ but with memory safety" line; the fact that it is a quasi-functional language isn't being made enough of. Many people are blocked from using current C libraries because of many features of Rust. So, for example, using GStreamer from Rust is noted as being fundamentally impossible by the core GStreamer team, and they have been trying – to the extent of having to contemplate a new implementation in Rust. D has a big win here as it already has a fully working GStreamer adapter API – though I wish it were not a part of GtkD but a separate thing.

So where should D go? Let's cease the defensive attitude of trying to compete with C++ where C++ is entrenched. Let us take D out to people as a native code language with a declarative feel that has excellent data structures and algorithms, concurrency, parallelism; a language good for computation. Many in finance are already using it.
Market their usage, leave the defensive approach, and promote the organizations who have taken the risk so that other organizations can see what they have done.

-- 
Russel.
Jun 06 2016
next sibling parent reply ketmar <ketmar ketmar.no-ip.org> writes:
On Monday, 6 June 2016 at 08:15:42 UTC, Russel Winder wrote:
 3. Have one lightweight D realized cross platform IDE.
by the way, Buggins has dlangIDE written with his dlangUI package. it is cross-platform, has debugger support, and written in D!
Jun 06 2016
next sibling parent reply Vadim Lopatin <coolreader.org gmail.com> writes:
On Monday, 6 June 2016 at 08:21:22 UTC, ketmar wrote:
 On Monday, 6 June 2016 at 08:15:42 UTC, Russel Winder wrote:
 3. Have one lightweight D realized cross platform IDE.
by the way, Buggins has dlangIDE written with his dlangUI package. it is cross-platform, has debugger support, and written in D!
As well, it's only a few megabytes in size. On Windows it can be bundled with mago-mi debugger to avoid Visual Studio dependencies.
Jun 06 2016
parent ketmar <ketmar ketmar.no-ip.org> writes:
On Monday, 6 June 2016 at 10:33:29 UTC, Vadim Lopatin wrote:
 On Monday, 6 June 2016 at 08:21:22 UTC, ketmar wrote:
 On Monday, 6 June 2016 at 08:15:42 UTC, Russel Winder wrote:
 3. Have one lightweight D realized cross platform IDE.
by the way, Buggins has dlangIDE written with his dlangUI package. it is cross-platform, has debugger support, and written in D!
As well, it's only a few megabytes in size. On Windows it can be bundled with mago-mi debugger to avoid Visual Studio dependencies.
you probably should write an article about it, or something like that. something we can point people to when they are asking about D IDEs, so they can read about features, see some screenshots, and download binary packages for at least windows. something more beginner-friendly than the current wiki entry point. not that you "absolutely must" do it, but you are in the best position to do that, as the author. ;-)
Jun 06 2016
prev sibling parent reply Russel Winder via Digitalmars-d <digitalmars-d puremagic.com> writes:
On Mon, 2016-06-06 at 08:21 +0000, ketmar via Digitalmars-d wrote:
 On Monday, 6 June 2016 at 08:15:42 UTC, Russel Winder wrote:
 3. Have one lightweight D realized cross platform IDE.
 by the way, Buggins has dlangIDE written with his dlangUI 
 package. it is cross-platform, has debugger support, and written 
 in D!
Some months ago I cloned the repository, compiled it, and then found no way of getting a light-on-dark mode. I thus deleted and ignored it.

Maybe I should try again and, instead of ignoring problems, jump up and down, scream, throw my toys out of the pram, and write an issue :-) Yes, I know, and submit a pull request.

-- 
Russel.
Jun 07 2016
parent reply ketmar <ketmar ketmar.no-ip.org> writes:
On Tuesday, 7 June 2016 at 09:09:04 UTC, Russel Winder wrote:
 On Mon, 2016-06-06 at 08:21 +0000, ketmar via Digitalmars-d 
 wrote:
 On Monday, 6 June 2016 at 08:15:42 UTC, Russel Winder wrote:
 3. Have one lightweight D realized cross platform IDE.
by the way, Buggins has dlangIDE written with his dlangUI package. it is cross-platform, has debugger support, and written in D!
Some months ago I cloned the repository, compiled it, and then found no way of getting a light on dark mode. I thus deleted and ignored it. Maybe I should try again and instead of ignoring problems, jump up and down, scream, throw my toys out of the pram, and write an issue :-) Yes, I know, and submit a pull request.
considering that the whole package, including dlangUI, is one man work... give it a chance! ;-)
Jun 07 2016
parent reply Russel Winder via Digitalmars-d <digitalmars-d puremagic.com> writes:
On Tue, 2016-06-07 at 09:55 +0000, ketmar via Digitalmars-d wrote:
[…]
 considering that the whole package, including dlangUI, is one man 
 work... give it a chance! ;-)
A project starting as a one person thing is quite natural, a project aiming to gain traction remaining a one person project is a dead end siding.
Jun 07 2016
parent reply ketmar <ketmar ketmar.no-ip.org> writes:
On Tuesday, 7 June 2016 at 11:11:31 UTC, Russel Winder wrote:
 On Tue, 2016-06-07 at 09:55 +0000, ketmar via Digitalmars-d 
 wrote:
 
[…]
 considering that the whole package, including dlangUI, is one 
 man work... give it a chance! ;-)
A project starting as a one person thing is quite natural, a project aiming to gain traction remaining a one person project is a dead end siding.
not everybody is good at promoting their work. yes, this skill is required to make your project wide-known (and then wide-used), but... this is where other people can step in to help. i suck at promoting things too, so i'm doing as much as i can: mentioning the project occasionally here and there.
Jun 07 2016
parent Russel Winder via Digitalmars-d <digitalmars-d puremagic.com> writes:
On Tue, 2016-06-07 at 11:21 +0000, ketmar via Digitalmars-d wrote:
 On Tuesday, 7 June 2016 at 11:11:31 UTC, Russel Winder wrote:
 On Tue, 2016-06-07 at 09:55 +0000, ketmar via Digitalmars-d 
 wrote:

[…]
 considering that the whole package, including dlangUI, is one 
 man work... give it a chance! ;-)
A project starting as a one person thing is quite natural, a project aiming to gain traction remaining a one person project is a dead end siding.
not everybody is good at promoting their work. yes, this skill is required to make your project wide-known (and then wide-used), but... this is where other people can step in to help. i suck at promoting things too, so i'm doing as much as i can: mentioning the project occasionally here and there.
My point was not so much a direct marketing one, more an indirect one: if a project is claiming to be a production thing usable by all and sundry but is a one-person project, then it isn't actually production ready. It may actually be production ready, but it will not be perceived as that: person-under-a-bus scenario and all that.
Jun 10 2016
prev sibling next sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 6/6/2016 1:15 AM, Russel Winder via Digitalmars-d wrote:
 * Safety has holes and bugs.
Then so does C, C++ and Rust, so this is just a comment made because it can be made and sounds bad. Bad enough to salve the conscience of the speaker as to why they are not already using D.
It's pretty clear when they say that, and then continue using C++ which has no safety, that safety isn't the real reason. Reminds me of an anecdote Andrei is tired of. In the 80s, a C++ developer said that compilation speed, speed, speed is the most important thing in a C++ compiler. This went on until it was pointed out to him that he was using Microsoft C++, which was the slowest at compiling by a factor of 4. Clearly, what was actually most important to him was a name brand compiler (social proof), but he needed a more palatable reason, so he just latched on to one without much thought. We have to be careful about being led down the garden path by people who say "I'd use your product if only it did X." I have a lot of experience with that, and it's very rare that they'll use it if you do X. They'll just respond with "yeah, ok, but what I really need is Y." This process never ends. Sometimes it's because they're embarrassed by the real reason, sometimes they just take pleasure in you dancing to their tune. Paying attention to our existing users is a much more reliable source of information.
Jun 06 2016
next sibling parent reply Carl Vogel <carljv gmail.com> writes:
On Monday, 6 June 2016 at 09:16:45 UTC, Walter Bright wrote:
 On 6/6/2016 1:15 AM, Russel Winder via Digitalmars-d wrote:
 * Safety has holes and bugs.
Then so does C, C++ and Rust, so this is just a comment made because it can be made and sounds bad. Bad enough to salve the conscience of the speaker as to why they are not already using D.
It's pretty clear when they say that, and then continue using C++ which has no safety, that safety isn't the real reason. Reminds me of an anecdote Andrei is tired of. In the 80s, a C++ developer said that compilation speed, speed, speed is the most important thing in a C++ compiler. This went on until it was pointed out to him that he was using Microsoft C++, which was the slowest at compiling by a factor of 4. Clearly, what was actually most important to him was a name brand compiler (social proof), but he needed a more palatable reason, so he just latched on to one without much thought. We have to be careful about being led down the garden path by people who say "I'd use your product if only it did X." I have a lot of experience with that, and it's very rare that they'll use it if you do X. They'll just respond with "yeah, ok, but what I really need is Y." This process never ends. Sometimes it's because they're embarrassed by the real reason, sometimes they just take pleasure in you dancing to their tune. Paying attention to our existing users is a much more reliable source of information.
I think this is a really important point and I 100% agree with Walter. Also, the idea that more people will adopt D if you just "get rid of the GC" ignores the fact that you don't just "get rid of the GC," you replace it with another memory management scheme (manual, RAII, RC). If you replace all the parts of the language and phobos that rely on GC with, e.g., a poor implementation of RC, then you're just going to get another round of complaints, and no real adoption. In my experience, GC complaints are split between existence and implementation. Some folks are dead set against any GC code, others just don't like how the current GC works. In my world---which is not AAA gaming or HFT, but is still very concerned with performance---a GC is tolerable, but a clunky stop-the-world GC with big unpredictable pauses is less so. Having a high quality GC that can be avoided (with some agreed-to and well-documented limitations) would be great. My concern is that the kill-the-GC craze is going to fall into a second-system trap, and be replaced with buggy half-implementations of, say, RC, that I can't rely on, or will be combing through piles of stale bug reports about in 6 months. I believe a big issue for D, and for any not-mainstream language, is being straight about what works and what doesn't. D is not alone in this, but I often feel I'm sold on features that I later find out are not fully implemented or have big holes in them. The limitations themselves aren't the problem---the trust is the problem. I never know if I can tell someone else "D can do that" safely (turning off the GC is a good example---it took me weeks of reading forums and documentation to see how practical that really was after initially reading that it was straightforward.)
Jun 06 2016
next sibling parent Observer <here inter.net> writes:
On Monday, 6 June 2016 at 15:06:49 UTC, Carl Vogel wrote:
 I believe a big issue for D, and for any not-mainstream 
 language, is being straight about what works and what doesn't. 
 D is not alone in this, but I often feel I'm sold on features 
 that I later find out are not fully implemented or have big 
 holes in them. The limitations themselves aren't the 
 problem---the trust is the problem. I never know if I can tell 
 someone else "D can do that" safely (turning off the GC is a 
 good example---it took me weeks of reading forums and 
 documentation to see how practical that really was after 
 initially reading that it was straightforward.)
I completely agree with this point. I read TDPL cover-to-cover, and got excited about writing a large, complex, multi-threaded program in this language. Then I started to look at Web resources, and read http://p0nce.github.io/d-idioms/#The-truth-about-shared . Suddenly it looked like the language was far less stable than I had believed, and that realization put the brakes on my investing time in this direction without a lot more investigation.
Jun 06 2016
prev sibling parent reply poliklosio <poliklosio happypizza.com> writes:
On Monday, 6 June 2016 at 15:06:49 UTC, Carl Vogel wrote:
 (...) Also, the idea that more people will adopt D if you just 
 "get rid of the GC" ignores the fact that you don't just "get 
 rid of the GC," you replace it with another memory management 
 scheme (manual, RAII, RC). If you replace all the parts of the 
 language and phobos that rely on GC with, e.g., a poor 
 implementation of RC, then you're just going to get another 
 round of complaints, and no real adoption.
I think you are going to get some adoption if you replace a good GC with clunky RC. The key difference is that a call to a function that uses GC is incomplete: some of it will execute later, after the call has finished. On the other hand, a call to an equivalent function that uses RC has only local (in time) consequences: once you finished the call, it stopped executing. If it returned something that needs to be freed later, you know what it is. Of course people can write arbitrarily messed up things like singletons etc., but I'm not counting those because good libraries are usually free of those. This means you have control over all the OTHER code, however inefficient the call is. Hence, the second is acceptable in low-latency code, but the first is not.
Jun 06 2016
parent reply Carl Vogel <carljv gmail.com> writes:
On Monday, 6 June 2016 at 20:28:47 UTC, poliklosio wrote:
 On Monday, 6 June 2016 at 15:06:49 UTC, Carl Vogel wrote:
 (...) Also, the idea that more people will adopt D if you just 
 "get rid of the GC" ignores the fact that you don't just "get 
 rid of the GC," you replace it with another memory management 
 scheme (manual, RAII, RC). If you replace all the parts of the 
 language and phobos that rely on GC with, e.g., a poor 
 implementation of RC, then you're just going to get another 
 round of complaints, and no real adoption.
I think you are going to get some adoption if you replace a good GC with clunky RC. The key difference is that a call to a function that uses GC is incomplete: some of it will execute later, after the call has finished. On the other hand, a call to an equivalent function that uses RC has only local (in time) consequences: once you finished the call, it stopped executing. If it returned something that needs to be freed later, you know what it is. Of course people can write arbitrarily messed up things like singletons etc., but I'm not counting those because good libraries are usually free of those. This means you have control over all the OTHER code, however inefficient the call is. Hence, the second is acceptable in low-latency code, but the first is not.
I get that and agree. My point was a different one -- that these conversations are about a totally hypothetical RC implementation that we all imagine is perfect, and so we just discuss theoretical GC vs RC tradeoffs. The real one that gets made is going to have bugs and unexpected corner cases. So the risk is that at some point we'll all run to reddit and say "Tada, no more GC" and folks will then just say "D has RC, but it's buggy and unreliable and doesn't work when [insert anecdote]". Maybe that won't be so---maybe the new memory management regime will be perfect and elegant and have no nasty surprises. But I feel a real "grass is greener" sense when, as many others have pointed out, the current GC could use a lot of love, which would solve problems for a lot of current users.
Jun 06 2016
parent Russel Winder via Digitalmars-d <digitalmars-d puremagic.com> writes:
On Mon, 2016-06-06 at 22:45 +0000, Carl Vogel via Digitalmars-d wrote:
[…]
 I get that and agree. My point was a different one -- that these 
 conversations are about a totally hypothetical RC implementation 
 that we all imagine is perfect, and so we just discuss 
 theoretical GC vs RC tradeoffs. The real one that gets made is 
 going to have bugs and unexpected corner cases. So the risk is 
 that at some point we'll all run to reddit and say "Tada, no more 
 GC" and folks will then just say "D has RC, but it's buggy and 
 unreliable and doesn't work when [insert anecdote]"
 […]
I suspect what will actually happen is that people will every so often raise these points, have long threads of debate, and yet again do nothing. RC is fine, cf. Python. GC is fine, cf. JVM. D has a GC that can be switched off. Let's go with that. So the status quo is fine per se. The discussion should only be about whether this GC is good enough, and if it isn't, write a new one. The JVM did this, Go did this; D just has debates.
Jun 07 2016
prev sibling next sibling parent Dmitry Olshansky <dmitry.olsh gmail.com> writes:
On 06-Jun-2016 12:16, Walter Bright wrote:
 On 6/6/2016 1:15 AM, Russel Winder via Digitalmars-d wrote:
 Paying attention to our existing users is a much more reliable source of
 information.
Can't agree more. In fact, I'm tired of discussions that revolve around winning appeal of some existing crowd. What we should do instead is help our users do what they already are doing. The resulting top-notch products would be our best marketing. -- Dmitry Olshansky
Jun 06 2016
prev sibling next sibling parent reply Brad Roberts via Digitalmars-d <digitalmars-d puremagic.com> writes:
On 6/6/2016 2:16 AM, Walter Bright via Digitalmars-d wrote:
 On 6/6/2016 1:15 AM, Russel Winder via Digitalmars-d wrote:
 * Safety has holes and bugs.
Then so does C, C++ and Rust, so this is just a comment made because it can be made and sounds bad. Bad enough to salve the conscience of the speaker as to why they are not already using D.
It's pretty clear when they say that, and then continue using C++ which has no safety, that safety isn't the real reason.
This isn't a small problem, don't dismiss it quite that quickly. Safety as a usable subset of D is still pretty non-existent and yet is used as a selling point. The language still has holes -- I don't have bug report numbers, but others have reported them in the past, some closed some not. At the library level things are far worse. I've yet to be able to write any interesting apps with an @safe main. Has anyone? Later, Brad
Jun 06 2016
next sibling parent David <David.dave dave.com> writes:
On Tuesday, 7 June 2016 at 00:19:27 UTC, Brad Roberts wrote:
 On 6/6/2016 2:16 AM, Walter Bright via Digitalmars-d wrote:
 On 6/6/2016 1:15 AM, Russel Winder via Digitalmars-d wrote:
 * Safety has holes and bugs.
Then so does C, C++ and Rust, so this is just a comment made because it can be made and sounds bad. Bad enough to salve the conscience of the speaker as to why they are not already using D.
It's pretty clear when they say that, and then continue using C++ which has no safety, that safety isn't the real reason.
This isn't a small problem, don't dismiss it quite that quickly. Safety as a usable subset of D is still pretty non-existent and yet is used as a selling point. The language still has holes -- I don't have bug report numbers, but others have reported them in the past, some closed some not. At the library level things are far worse. I've yet to be able to write any interesting apps with an @safe main. Has anyone? Later, Brad
My two cents on what I've read so far (in my week-long pursuit of learning this language) is that D's safety system is the same old "C++"-style philosophy in disguise. Sure, if you follow all of the paradigms and code in an idiomatic way, you will have fewer bugs. But my experience is people do things less idiomatic and more idiotic. In short, D has an opt-in system. Which means it's EXTREMELY easy to opt out. Which is essentially what C++ has too. You can have a moderate amount of memory safety if you stick to smart pointers and stray away from manual memory management (sure you have some things to worry about, such as reference cycles, etc.), but forging a pointer is so easy. And even worse... so convenient at times. Languages like Go, C# and Java are far better in this regard because you have to go WAAAAY out of your way to start doing silly things. You can. And you can by accident (C# has delegates that lead to a lot of leaks, for instance), but in general the situation is better. D has the features... but they are opt in. Which means people aren't gonna go out of their way. Just like in C++. Just an observation. In other news, if I had to guess I'd say @safe never was meant to be slapped on main, but to help give you reasonable guarantees of where issues are not. However, @unsafe would probably have been better (so you'd have to opt into the unsafeness) because that is greppable. In terms of safety, I don't see how D is any different than C++. Which is why Rust is probably going to be the tool people reach for when they want safety. But only after reaching for Go, C# and Java, because those require so much less of a learning curve.
Jun 06 2016
prev sibling next sibling parent Walter Bright <newshound2 digitalmars.com> writes:
On 6/6/2016 5:19 PM, Brad Roberts via Digitalmars-d wrote:
 This isn't a small problem, don't dismiss it quite that quickly.  Safety as a
 usable subset of D is still pretty non-existent and yet is used as a selling
 point.
I use it regularly.
 The language still has holes -- I don't have bug report numbers, but
 others have reported them in the past, some closed some not.  At the library
 level things are far worse.  I've yet to be able to write any interesting apps
 with an @safe main.
It's harder to make your program compile with @safe. That is not a bug in the language.
Jun 06 2016
prev sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 6/6/2016 5:19 PM, Brad Roberts via Digitalmars-d wrote:
 Safety as a
 usable subset of D is still pretty non-existent and yet is used as a selling
 point.  The language still has holes -- I don't have bug report numbers, but
 others have reported them in the past, some closed some not.  At the library
 level things are far worse.  I've yet to be able to write any interesting apps
 with an @safe main.
Without knowing any details of why your app wouldn't compile as @safe, there's nothing useful nor actionable in the complaint. There is also a conflation of two issues in the complaint - compiling programs that are unsafe despite being marked @safe, and the compiler complaining about unsafe code in code you'd like to be marked @safe. Which is it?
Jun 06 2016
parent reply Brad Roberts via Digitalmars-d <digitalmars-d puremagic.com> writes:
On 6/6/2016 10:25 PM, Walter Bright via Digitalmars-d wrote:
 On 6/6/2016 5:19 PM, Brad Roberts via Digitalmars-d wrote:
 Safety as a
 usable subset of D is still pretty non-existent and yet is used as a
 selling
 point.  The language still has holes -- I don't have bug report
 numbers, but
 others have reported them in the past, some closed some not.  At the
 library
 level things are far worse.  I've yet to be able to write any
 interesting apps
 with an @safe main.
Without knowing any details of why your app wouldn't compile as @safe, there's nothing useful nor actionable in the complaint. There is also a conflation of two issues in the complaint - compiling programs that are unsafe despite being marked @safe, and the compiler complaining about unsafe code in code you'd like to be marked @safe. Which is it?
For me, it's the latter, but the issues with the former make it hard to trust either all that much. I've fixed some of the issues in a couple bursts of activity over the last several years, and filed a bunch more bugs, but the specifics aren't the point I'm raising here, though your trimming of the thread dropped that part of the context. You dismissed complaints of the incompleteness of safety as the whining of non-users. I'm a user. I was a much more frequent user until I got tired of the only partially complete nature of so much of the language + core library. Yes they're separate, no that's not relevant to the majority of users. Yes, I can and have contributed to the fixes, but it's clearly (just based on commit history) not a priority for many people. The D ecosystem is a large pile of incomplete features, with more added all the time.
Jun 06 2016
next sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 6/6/2016 10:38 PM, Brad Roberts via Digitalmars-d wrote:
 The D ecosystem is a large pile of incomplete features, with more added all the
 time.
Even with only array bounds checking, D is safer than C++.
Jun 06 2016
next sibling parent reply ketmar <ketmar ketmar.no-ip.org> writes:
On Tuesday, 7 June 2016 at 06:22:50 UTC, Walter Bright wrote:
 On 6/6/2016 10:38 PM, Brad Roberts via Digitalmars-d wrote:
 The D ecosystem is a large pile of incomplete features, with 
 more added all the
 time.
Even with only array bounds checking, D is safer than C++.
+inf. this feature alone made me much more productive, and took away a lot of pain. (dreaming) if only RangeError could show the invalid index too...
Jun 06 2016
parent default0 <Kevin.Labschek gmx.de> writes:
On Tuesday, 7 June 2016 at 06:30:05 UTC, ketmar wrote:
 On Tuesday, 7 June 2016 at 06:22:50 UTC, Walter Bright wrote:
 On 6/6/2016 10:38 PM, Brad Roberts via Digitalmars-d wrote:
 The D ecosystem is a large pile of incomplete features, with 
 more added all the
 time.
Even with only array bounds checking, D is safer than C++.
+inf. this feature alone made me much more productive, and took away alot of pain. (dreaming) if only RangeError could show invalid index too...
^ That's my biggest beef with it, too. Error messages should contain as much detail as is reasonable for the operation. As an aside, I remember Adam did a PR/bug submission or something and it was argued that it would add too much bloat(??) Here's to hoping we get that at some point.
Jun 06 2016
prev sibling next sibling parent reply deadalnix <deadalnix gmail.com> writes:
On Tuesday, 7 June 2016 at 06:22:50 UTC, Walter Bright wrote:
 On 6/6/2016 10:38 PM, Brad Roberts via Digitalmars-d wrote:
 The D ecosystem is a large pile of incomplete features, with 
 more added all the
 time.
Even with only array bounds checking, D is safer than C++.
Better doesn't matter. So good that it justifies the switch does matter; anything less is worthless.
Jun 06 2016
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 6/6/2016 11:46 PM, deadalnix wrote:
 On Tuesday, 7 June 2016 at 06:22:50 UTC, Walter Bright wrote:
 On 6/6/2016 10:38 PM, Brad Roberts via Digitalmars-d wrote:
 The D ecosystem is a large pile of incomplete features, with more added all the
 time.
Even with only array bounds checking, D is safer than C++.
Better doesn't matter. So good that it justifies the switch does matter; anything less is worthless.
The context is someone using C++ for the sole reason that D has holes in safe, and that doesn't make sense. Granted, one can certainly have other reasons to prefer C++. But memory safety isn't one of them.
Jun 07 2016
parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= writes:
On Tuesday, 7 June 2016 at 07:17:04 UTC, Walter Bright wrote:
 Granted, one can certainly have other reasons to prefer C++. 
 But memory safety isn't one of them.
That's true, but memory safety isn't a big problem in C++ if one sticks to what one can do in safe code. Using gsl::span you get the same stuff as in D for slicing etc. In my experience memory issues in C++ often comes from casting through pointers (which often is necessary for performance reasons) and having typos in pointer-offsets.
Jun 07 2016
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 6/7/2016 12:40 AM, Ola Fosheim Grøstad wrote:
 On Tuesday, 7 June 2016 at 07:17:04 UTC, Walter Bright wrote:
 Granted, one can certainly have other reasons to prefer C++. But memory safety
 isn't one of them.
That's true, but memory safety isn't a big problem in C++ if one sticks to what one can do in safe code.
Obviously, code that doesn't do unsafe things is safe. I used to write programs that corrupted memory for years and years in C++. Over time, I gradually evolved practices that avoided those sorts of bugs, and now I rarely have a corrupted memory issue in my code. It's not that C++ got any safer. All that old code will still compile and crash. It's that I got better, which should not be confused with the language getting better. I learned not to stick my fingers into the high voltage section of the power supply. C++ still suffers from: http://www.digitalmars.com/articles/b44.html and probably always will.
Jun 07 2016
next sibling parent reply Ethan Watson <gooberman gmail.com> writes:
On Tuesday, 7 June 2016 at 07:57:09 UTC, Walter Bright wrote:
 C++ still suffers from:

 http://www.digitalmars.com/articles/b44.html

 and probably always will.
template< size_t size > void function( char ( &array )[ size ] ); It's horrible syntax (no surprise), and being a templated function means it's recompiled N times... but there's at least something for it now. (Of note is that it's how you're expected to handle string literals at compile time with constexpr, but it doesn't make string manipulation at compile time any easier. Go figure.)
Jun 07 2016
parent Olivier <olivier.grant gmail.com> writes:
On Tuesday, 7 June 2016 at 08:09:49 UTC, Ethan Watson wrote:
 On Tuesday, 7 June 2016 at 07:57:09 UTC, Walter Bright wrote:
 C++ still suffers from:

 http://www.digitalmars.com/articles/b44.html

 and probably always will.
template< size_t size > void function( char ( &array )[ size ] ); It's horrible syntax (no surprise), and being a templated function means it's recompiled N times... but there's at least something for it now.
There are array_view and string_view, which have been proposed (and I even think accepted) for C++17. But if you want a fat pointer, you can just write one yourself:

#include <cstddef>

template< typename T >
struct FatPtr
{
    template< std::size_t N >
    FatPtr( T (&a)[N] )
     : p (a)
     , n (N)
    { }

    T *p;
    std::size_t n;
};

void function( FatPtr<const char> a ) { }

int main( )
{
    char a[128];
    function(a);
    function("foo");
}
Jun 13 2016
prev sibling parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= writes:
On Tuesday, 7 June 2016 at 07:57:09 UTC, Walter Bright wrote:
 It's not that C++ got any safer. All that old code will still 
 compile and crash. It's that I got better, which should not be 
 confused with the language getting better. I learned not to 
 stick my fingers into the high voltage section of the power 
 supply.
Well, I don't know how newbies approach C++ and what problems they run into, but when I learned C++ in the 90s most of my problems were related to using the C-subset of C++. Often for performance reasons as compilers weren't as good back then so even simple compile-time abstractions could give significantly lower performance. Of course, I knew C++98 and was a proficient programmer before I started using C++14/17 18 months ago and the learning curve for becoming proficient in C++14/17 is quite steep and involves both learning, inventing (going where people haven't gone before) and unlearning. There are just way too many ways of doing the same thing in C++ to be certain that one does something the best-possible way. Which is rather costly compared to say Go or the direction D1 took, which focus on being simpler than C++ and generally offers one way to do something. Work on making D simpler (easier to use and easier to read) and you may find new markets. Simpler does not mean less powerful. The best way to get a simple and consistent language is to create a simple high level IR that can represent all language constructs you are interested in (by "lowering").
 C++ still suffers from:

 http://www.digitalmars.com/articles/b44.html
The array issue is solved. I only use std::array<Type, Dim>. I never use Type[Dim]... I also use gsl::span<Type, Dim> or gsl::span<Type> for array parameters. I also have my own array types for special use cases (forcing heap allocation etc). So this is solved in modern C++. I don't think it is "idiomatic" to write C-code in C++14/17. What really irks me about C++/D is that they don't focus on making it easy to write readable code (making complex code more readable). I don't think it is difficult to fix, but it requires significant breaking changes.
Jun 07 2016
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 6/7/2016 1:22 AM, Ola Fosheim Grøstad wrote:
 So this is solved in modern C++.
This is where we diverge. A language isn't safe unless it can mechanically guarantee that unsafe constructs are not used. Saying "don't write unsafe code" in C++ does not make it a safe language. How would you know some random 10,000-line piece of C++ code is using std::vector instead of [ ]? How do you know that some random PR pulled into your project does not have [ ] in it? It's faith-based programming. Faith-based programming does not scale and is not the point of @safe.
Jun 07 2016
next sibling parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= writes:
On Tuesday, 7 June 2016 at 08:54:32 UTC, Walter Bright wrote:
 On 6/7/2016 1:22 AM, Ola Fosheim Grøstad wrote:
 So this is solved in modern C++.
This is where we diverge. A language isn't safe unless it can mechanically guarantee that unsafe constructs are not used. Saying "don't write unsafe code" in C++ does not make it safe language.
C++ isn't a safe language, but if you are proficient in modern C++ then memory issues aren't the big hurdle. I find the syntactic mess that comes from having N different convoluted ways of doing the same thing in meta-programming to be more problematic in day-to-day programming than safety issues.
 How would you know some random 10,000 line piece of C++ code is 
 using std::vector instead of [ ]?
Static analysis tooling? But I don't use std::vector. If you only "borrow" access to an array you should use gsl::span (a slice).
Jun 07 2016
parent reply Russel Winder via Digitalmars-d <digitalmars-d puremagic.com> writes:
On Tue, 2016-06-07 at 09:10 +0000, Ola Fosheim Grøstad via Digitalmars-d wrote:

[…]
 C++ isn't a safe language, but if you are proficient in modern 
 C++ then memory issues aren't the big hurdle. I find the 
 syntactic mess that comes from having N different convoluted ways 
 of doing the same thing in meta-programming to be more 
 problematic in day-to-day programming than safety issues.

std::unique_ptr and std::shared_ptr may be great for those who have to use C++, but for those with a choice it is the fastest route to Rust. And then you find Rust cannot cope nicely with many C libraries. Hence you find your way to D. Only to find the developer environments nowhere near as good. For a traditional Emacs person, I am finding CLion a joy to use. D needs equivalents.
Jun 07 2016
parent Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= writes:
On Tuesday, 7 June 2016 at 09:49:34 UTC, Russel Winder wrote:
 std::unique_ptr and std::shared_ptr may be great for those who 
 have to use C++, but for those with a choice it is the fastest 
 route to Rust. And then you find Rust cannot cope nicely with 
 many C libraries. Hence you find your way to D. Only to find the 
 developer environments nowhere near as good.
Well, currently C++17 has overall better semantics than Rust and D for system-level programming. It is just a very expensive language to become and remain proficient in, and C++'s syntax issues mean you have to spend more effort on making your code maintainable... :-/ Both Rust and D have syntax and semantic issues; if they had focused on improving the ergonomics instead of adding features then they could take on C++. As it stands, they cannot. So yes, a top-notch editor is needed to gain ground, but it isn't sufficient as of today.
 For a traditional Emacs person, I am finding CLion a joy to 
 use. D needs equivalents.
Good editors built on static typing matter _a lot_. My own experience is that PyCharm makes it harder to justify using a statically typed language over Python. So Python benefits enormously from PyCharm being available in a community edition. It is quite interesting that editors can add missing language features like that and turn dynamic languages into gradually typed languages (more or less).
Jun 07 2016
prev sibling parent reply Timon Gehr <timon.gehr gmx.ch> writes:
On 07.06.2016 10:54, Walter Bright wrote:
 On 6/7/2016 1:22 AM, Ola Fosheim Grøstad wrote:
 So this is solved in modern C++.
This is where we diverge. A language isn't safe unless it can mechanically guarantee that unsafe constructs are not used. Saying "don't write unsafe code" in C++ does not make it a safe language. How would you know some random 10,000-line piece of C++ code is using std::vector instead of [ ]? How do you know that some random PR pulled into your project does not have [ ] in it? It's faith-based programming. Faith-based programming does not scale, and is not the point of @safe.
How do you know that some random @safe PR pulled into your project does not corrupt memory?
Jun 07 2016
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 6/7/2016 10:44 AM, Timon Gehr wrote:
 How do you know that some random @safe PR pulled into your project does not
 corrupt memory?
@trusted and @system are designed to be greppable, i.e. you can look for them without needing a static analysis tool.
Jun 07 2016
next sibling parent reply Jack Stouffer <jack jackstouffer.com> writes:
On Tuesday, 7 June 2016 at 18:15:28 UTC, Walter Bright wrote:
 @trusted and @system are designed to be greppable, i.e. you can 
 look for them without needing a static analysis tool.
But you can't grep for @system because 99% of the time it's implicit. The problem becomes even harder when using templates for everything, which I do.
Jun 07 2016
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 6/7/2016 11:19 AM, Jack Stouffer wrote:
 On Tuesday, 7 June 2016 at 18:15:28 UTC, Walter Bright wrote:
 @trusted and @system are designed to be greppable, i.e. you can look for them
 without needing a static analysis tool.
But you can't grep for @system because 99% of the time it's implicit. The problem becomes even harder when using templates for everything, which I do.
Add:

     @safe:

at the top of your D module and you'll find the @system code. The D compiler is the static analysis tool. It's true that @safe should have been the default, but too much code would break if that were changed. Adding one line to the top of a module is very doable for those that are desirous of adding the safety checks. You can also add:

     @nogc:

at the top, too. It isn't necessary to tediously annotate every function.
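As a sketch of that workflow (module and function names invented), one line at the top makes the compiler do the grepping:

```d
// find_system.d -- module-wide @safe: turns the compiler into the grep tool.
@safe:   // applies to the module-level declarations that follow

int twice(int x) { return 2 * x; }   // compiles: verified @safe

// This declaration would now fail to compile, pointing straight at the
// @system construct:
//
//     int* offset(int* p) { return p + 1; }
//     // error: pointer arithmetic is not allowed in @safe functions
```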
Jun 07 2016
next sibling parent reply Adam D. Ruppe <destructionator gmail.com> writes:
On Tuesday, 7 June 2016 at 18:24:33 UTC, Walter Bright wrote:
 You can also add:

     @nogc:

 at the top, too. It isn't necessary to tediously annotate every 
 function.
@nogc:

struct Foo {
    int* a() { return new int; }
}
Jun 07 2016
next sibling parent reply jmh530 <john.michael.hall gmail.com> writes:
On Tuesday, 7 June 2016 at 18:32:05 UTC, Adam D. Ruppe wrote:
 On Tuesday, 7 June 2016 at 18:24:33 UTC, Walter Bright wrote:
 You can also add:

     @nogc:

 at the top, too. It isn't necessary to tediously annotate 
 every function.
@nogc:

struct Foo {
    int* a() { return new int; }
}
Are you trying to say that you shouldn't be allowed to do that? I get an error when I actually call x.a().

@nogc:

struct Foo {
    int* a() { return new int; }
}

void main()
{
    Foo x;
    auto y = x.a();
}
Jun 07 2016
next sibling parent reply Timon Gehr <timon.gehr gmx.ch> writes:
On 07.06.2016 20:43, jmh530 wrote:

 Are you trying to say that you shouldn't be allowed to do that? I get an
 error when I actually call x.a().

  @nogc:

 struct Foo {
          int* a() { return new int; }
 }

 void main()
 {
      Foo x;
      auto y = x.a();
 }
You'll get the same error for this program:

@nogc:

struct Foo {
    int x;
    int* a() { return &x; }
}

void main() {
    Foo x;
    auto y = x.a();
}
Jun 07 2016
parent jmh530 <john.michael.hall gmail.com> writes:
On Tuesday, 7 June 2016 at 18:54:25 UTC, Timon Gehr wrote:
 You'll get the same error for this program:
Also, it doesn't help to make the struct or member functions templates, because that's no different from appending @nogc everywhere. You would need some kind of !@nogc, as others have pointed out.
Jun 07 2016
prev sibling next sibling parent reply Adam D. Ruppe <destructionator gmail.com> writes:
On Tuesday, 7 June 2016 at 18:43:12 UTC, jmh530 wrote:
 Are you trying to say that you shouldn't be allowed to do that? 
 I get an error when I actually call x.a().
The colon attributes do NOT extend inside aggregates. Putting @nogc: at the top did not affect my function a(), because it was inside a struct. Real-world code tends to have a lot of structs and such, so the claim that you can just slap whatever: at the top doesn't stand up to scrutiny, even disregarding the other problems like the lack of inversion and the backward default. I wrote more about this a couple of months ago here: http://arsdnet.net/this-week-in-d/2016-apr-17.html I have actually softened my position somewhat since then, but I still hold to the key point: the default behavior means apathy ruins the feature. Use a library written by someone who didn't care to write @nogc, but nevertheless didn't actually use the GC? Too bad.
Jun 07 2016
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 6/7/2016 12:01 PM, Adam D. Ruppe wrote:
 the key point: the default behavior means apathy ruins the feature. Use a
 library written by someone who didn't care to write @nogc, but nevertheless
 didn't actually use the GC? Too bad.
I don't think it is that bad. Lots of formerly acceptable C coding practice is still perfectly allowed by the C Standard, but has become laughed at in practice, such as:

    #define BEGIN {
    #define END   }

The point being that a culture of "best practices" does arise and evolve over time, and professional programmers know it. Such a culture has also arisen for D over things like unittests and Ddoc comments. Would you even want to use a library not written by professionals who have a reasonable awareness of best practices? Would you want to use a library where the maintainers refuse to use @nogc even if they aren't using the GC?
Jun 07 2016
next sibling parent ketmar <ketmar ketmar.no-ip.org> writes:
On Tuesday, 7 June 2016 at 20:41:08 UTC, Walter Bright wrote:
 Would you want to use a library where the maintainers refuse to 
 use @nogc even if they aren't using the GC?
yes, i do. i'm actively using Adam's arsd libraries, and they don't have @nogc spam all over the place, even where functions don't use the gc. more than that: i tend to ignore @nogc in my code too, almost never bothering to put it. it just isn't worth the effort.
Jun 07 2016
prev sibling parent Adam D. Ruppe <destructionator gmail.com> writes:
On Tuesday, 7 June 2016 at 20:41:08 UTC, Walter Bright wrote:
 The point being that a culture of "best practices" does arise 
 and evolve over time, and professional programmers know it.
Sure, that's one of the big advantages C++ has over D: people have that institutional knowledge. But, two important questions:

1) You criticize C++ because it isn't good enough - programmers are lazy, and just because you can do it right doesn't mean they will. The right thing also needs to be the easy thing. D's attribute spam is not the easy thing. And there's no apparent punishment for doing it "wrong" - everything works equally well for the author. It is only when a third party comes in and tries to slap the attribute on that it becomes an issue.

2) What makes you so sure @nogc will actually be part of the best practice? I haven't done a comprehensive study, but my impression so far is that it is very rarely used: the biggest win is being inferred in templates... which seems to imply that people aren't caring enough to actually write it out.
 Would you want to use a library where the maintainers refuse to 
 use @nogc even if they aren't using the GC?
I think you underestimate the cost of @nogc (and @safe, and pure, and nothrow) to the library author. It is littering their code with something they don't use themselves (and thus easy to forget to put there on new functions) and don't derive any direct benefit from. Moreover, it limits their flexibility in the future: once a function is published with one of those attributes, it is part of the public interface, so the implementation cannot change its mind in the future. That might be a good thing to the user begging for verified @nogc or whatever, but to the library author it is another cost to maintain. Though, the apathy factor I think is bigger than the maintenance factor: I don't write it in my code because I just don't care. I have had only one user ever complain too... and he seems to have changed his mind by now and no longer cares either (probably because a critical mass of library authors still just don't care, so you can't realistically slap @nogc @safe on that much anyway).
Jun 07 2016
prev sibling parent Steven Schveighoffer <schveiguy yahoo.com> writes:
On 6/7/16 2:43 PM, jmh530 wrote:
 On Tuesday, 7 June 2016 at 18:32:05 UTC, Adam D. Ruppe wrote:
 On Tuesday, 7 June 2016 at 18:24:33 UTC, Walter Bright wrote:
 You can also add:

     @nogc:

 at the top, too. It isn't necessary to tediously annotate every
 function.
@nogc:

struct Foo {
    int* a() { return new int; }
}
Are you trying to say that you shouldn't be allowed to do that? I get an error when I actually call x.a().

@nogc:

struct Foo {
    int* a() { return new int; }
}

void main()
{
    Foo x;
    auto y = x.a();
}
That's because main is @nogc. If you did:

void main()
{
    Foo x;
    auto y = x.a();
}

@nogc:

struct Foo {
    int* a() { return new int; }
}

it would work. Adam's point is that putting @nogc: at the top doesn't mark everything @nogc.

-Steve
Jun 07 2016
prev sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 6/7/2016 11:32 AM, Adam D. Ruppe wrote:
 On Tuesday, 7 June 2016 at 18:24:33 UTC, Walter Bright wrote:
 You can also add:

     @nogc:

 at the top, too. It isn't necessary to tediously annotate every function.
@nogc:

struct Foo {
    int* a() { return new int; }
}
You're right, the global @nogc doesn't go through the struct. But @safe does.
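A sketch of the asymmetry being described (illustrative code, relying on the behavior stated above):

```d
@safe:   // per the above, this does reach member functions of the struct

struct S
{
    int deref(int* p) { return *p; }        // checked as @safe
    // int* next(int* p) { return p + 1; }  // would be a compile error:
                                            // pointer arithmetic in @safe code
}

// An @nogc: label in the same position would NOT reach into S,
// so its members could still allocate from the GC.
```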
Jun 07 2016
parent reply Steven Schveighoffer <schveiguy yahoo.com> writes:
On 6/7/16 3:44 PM, Walter Bright wrote:
 On 6/7/2016 11:32 AM, Adam D. Ruppe wrote:
 On Tuesday, 7 June 2016 at 18:24:33 UTC, Walter Bright wrote:
 You can also add:

     @nogc:

 at the top, too. It isn't necessary to tediously annotate every
 function.
@nogc:

struct Foo {
    int* a() { return new int; }
}
You're right, the global @nogc doesn't go through the struct. But @safe does.
Bug? I would have expected @nogc: to permeate. -Steve
Jun 07 2016
parent Walter Bright <newshound2 digitalmars.com> writes:
On 6/7/2016 12:56 PM, Steven Schveighoffer wrote:
 Bug? I would have expected @nogc: to permeate.
It did originally, but that was removed. It's deliberate.
Jun 07 2016
prev sibling next sibling parent reply Dave <david.dave dave.com> writes:
On Tuesday, 7 June 2016 at 18:24:33 UTC, Walter Bright wrote:
 On 6/7/2016 11:19 AM, Jack Stouffer wrote:
 On Tuesday, 7 June 2016 at 18:15:28 UTC, Walter Bright wrote:
 [...]
But you can't grep for @system because 99% of the time it's implicit. The problem becomes even harder when using templates for everything, which I do.
Add:

     @safe:

at the top of your D module and you'll find the @system code. The D compiler is the static analysis tool. It's true that @safe should have been the default, but too much code would break if that were changed. Adding one line to the top of a module is very doable for those that are desirous of adding the safety checks. You can also add:

     @nogc:

at the top, too. It isn't necessary to tediously annotate every function.
Seems fair. But perhaps Phobos should also follow this standard? Which might be why people get the mindset that they have to annotate everything...
Jun 07 2016
parent reply Jonathan M Davis via Digitalmars-d <digitalmars-d puremagic.com> writes:
On Tuesday, June 07, 2016 18:33:01 Dave via Digitalmars-d wrote:
 On Tuesday, 7 June 2016 at 18:24:33 UTC, Walter Bright wrote:
 On 6/7/2016 11:19 AM, Jack Stouffer wrote:
 On Tuesday, 7 June 2016 at 18:15:28 UTC, Walter Bright wrote:
 [...]
But you can't grep for @system because 99% of the time it's implicit. The problem becomes even harder when using templates for everything, which I do.
Add:

     @safe:

at the top of your D module and you'll find the @system code. The D compiler is the static analysis tool. It's true that @safe should have been the default, but too much code would break if that were changed. Adding one line to the top of a module is very doable for those that are desirous of adding the safety checks. You can also add:

     @nogc:

at the top, too. It isn't necessary to tediously annotate every function.
Seems fair. But perhaps Phobos should also follow this standard? Which might be why people get the mindset that they have to annotate everything...
IMHO, it's bad practice to mass apply attributes with labels or blocks. It's far too easy to accidentally mark a function with an attribute that you didn't mean to, and it makes it way harder to figure out which attributes actually apply to a function. And when you add templates into the mix, applying attributes en masse doesn't work anyway, because pretty much the only time that you want to mark a template function with an attribute is when the template arguments have nothing to do with whether the attribute is appropriate or not. So, while mass applying something like @safe temporarily to check stuff makes some sense, I really don't think that it's a good idea to do it in any code that you'd ever commit. - Jonathan M Davis
Jun 07 2016
next sibling parent reply Dave <david.dave dave.com> writes:
On Tuesday, 7 June 2016 at 20:48:13 UTC, Jonathan M Davis wrote:
 On Tuesday, June 07, 2016 18:33:01 Dave via Digitalmars-d wrote:
 [...]
IMHO, it's bad practice to mass apply attributes with labels or blocks. It's far too easy to accidentally mark a function with an attribute that you didn't mean to, and it makes it way harder to figure out which attributes actually apply to a function. And when you add templates into the mix, applying attributes en masse doesn't work anyway, because pretty much the only time that you want to mark a template function with an attribute is when the template arguments have nothing to do with whether the attribute is appropriate or not. [...]
So we should not follow the advice of Walter?
Jun 07 2016
parent reply Jonathan M Davis via Digitalmars-d <digitalmars-d puremagic.com> writes:
On Tuesday, June 07, 2016 20:52:15 Dave via Digitalmars-d wrote:
 On Tuesday, 7 June 2016 at 20:48:13 UTC, Jonathan M Davis wrote:
 On Tuesday, June 07, 2016 18:33:01 Dave via Digitalmars-d wrote:
 [...]
IMHO, it's bad practice to mass apply attributes with labels or blocks. It's far too easy to accidentally mark a function with an attribute that you didn't mean to, and it makes it way harder to figure out which attributes actually apply to a function. And when you add templates into the mix, applying attributes en masse doesn't work anyway, because pretty much the only time that you want to mark a template function with an attribute is when the template arguments have nothing to do with whether the attribute is appropriate or not. [...]
So we should not follow the advice of Walter?
If he's arguing that you should slap an attribute on the top of your module to apply to everything, then no, I don't think that we should follow his advice. He's a very smart guy, but he's not always right. And in my experience, mass applying attributes is a mistake. - Jonathan M Davis
Jun 07 2016
parent reply Steven Schveighoffer <schveiguy yahoo.com> writes:
On 6/7/16 5:10 PM, Jonathan M Davis via Digitalmars-d wrote:
 On Tuesday, June 07, 2016 20:52:15 Dave via Digitalmars-d wrote:
 On Tuesday, 7 June 2016 at 20:48:13 UTC, Jonathan M Davis wrote:
 On Tuesday, June 07, 2016 18:33:01 Dave via Digitalmars-d wrote:
 [...]
IMHO, it's bad practice to mass apply attributes with labels or blocks. It's far too easy to accidentally mark a function with an attribute that you didn't mean to, and it makes it way harder to figure out which attributes actually apply to a function. And when you add templates into the mix, applying attributes en masse doesn't work anyway, because pretty much the only time that you want to mark a template function with an attribute is when the template arguments have nothing to do with whether the attribute is appropriate or not. [...]
So we should not follow the advice of Walter?
If he's arguing that you should slap an attribute on the top of your module to apply to everything, then no, I don't think that we should follow his advice. He's a very smart guy, but he's not always right. And in my experience, mass applying attributes is a mistake.
The original(?) complaint was that it's hard to grep for @system because it's the default. I think the advice is to put the attribute at the top to see where your non-conforming code lies, not as a permanent fixture. I can attest that figuring out why something isn't inferred @safe isn't always easy, and the "slap a @safe: tag at the top" trick isn't always going to help. But it can be a technique to find such things. -Steve
Jun 07 2016
next sibling parent Jonathan M Davis via Digitalmars-d <digitalmars-d puremagic.com> writes:
On Tuesday, June 07, 2016 17:28:27 Steven Schveighoffer via Digitalmars-d 
wrote:
 On 6/7/16 5:10 PM, Jonathan M Davis via Digitalmars-d wrote:
 On Tuesday, June 07, 2016 20:52:15 Dave via Digitalmars-d wrote:
 On Tuesday, 7 June 2016 at 20:48:13 UTC, Jonathan M Davis wrote:
 On Tuesday, June 07, 2016 18:33:01 Dave via Digitalmars-d wrote:
 [...]
IMHO, it's bad practice to mass apply attributes with labels or blocks. It's far too easy to accidentally mark a function with an attribute that you didn't mean to, and it makes it way harder to figure out which attributes actually apply to a function. And when you add templates into the mix, applying attributes en masse doesn't work anyway, because pretty much the only time that you want to mark a template function with an attribute is when the template arguments have nothing to do with whether the attribute is appropriate or not. [...]
So we should not follow the advice of Walter?
If he's arguing that you should slap an attribute on the top of your module to apply to everything, then no, I don't think that we should follow his advice. He's a very smart guy, but he's not always right. And in my experience, mass applying attributes is a mistake.
 The original(?) complaint was that it's hard to grep for @system because it's the default. I think the advice is to put the attribute at the top to see where your non-conforming code lies. Not as a permanent fixture. I can attest that figuring out why something isn't inferred @safe isn't always easy, and the "slap a @safe: tag at the top" isn't always going to help. But it can be a technique to find such things.
Yeah. It makes sense as a temporary solution to track down problems. It makes a lot less sense as a way to write your code normally. - Jonathan M Davis
Jun 07 2016
prev sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 6/7/2016 2:28 PM, Steven Schveighoffer wrote:
 I can attest that figuring out why something isn't inferred @safe isn't always
 easy, and the "slap a @safe: tag at the top" isn't always going to help.
Having a -safe compiler switch to make @safe the default won't improve that in the slightest.
Jun 07 2016
next sibling parent Steven Schveighoffer <schveiguy yahoo.com> writes:
On 6/7/16 7:05 PM, Walter Bright wrote:
 On 6/7/2016 2:28 PM, Steven Schveighoffer wrote:
 I can attest that figuring out why something isn't inferred @safe
 isn't always easy, and the "slap a @safe: tag at the top" isn't always going to help.
Having a -safe compiler switch to make @safe the default won't improve that in the slightest.
No, of course not. I don't think anyone has said this. In my experience, finding the reasons something isn't inferred @safe is an iterative process with the compiler and temporarily marking targeted code. I don't think grep helps here at all, and neither do global @safe attributes. -Steve
Jun 07 2016
prev sibling parent Observer <here inter.net> writes:
On Tuesday, 7 June 2016 at 23:05:49 UTC, Walter Bright wrote:
 On 6/7/2016 2:28 PM, Steven Schveighoffer wrote:
 I can attest that figuring out why something isn't inferred @safe
 isn't always easy, and the "slap a @safe: tag at the top" isn't always
 going to help.
Having a -safe compiler switch to make @safe the default won't improve that in the slightest.
I think it's useful here to compare one aspect of Perl's approach to security, its "taint" mode. It tags insecure data to make sure it does not affect the security of the application, and blocks actions where insecure data would otherwise be used. The Perl invocation accepts a couple of flags to control how taint mode works:

-t   Like -T, but taint checks will issue warnings rather than fatal errors. These warnings can now be controlled normally with "no warnings qw(taint)". Note: This is not a substitute for -T! This is meant to be used only as a temporary development aid while securing legacy code: for real production code and for new secure code written from scratch, always use the real -T.

-T   Turns on "taint" checks so you can test them. Ordinarily these checks are done only when running setuid or setgid. It's a good idea to turn them on explicitly for programs that run on behalf of someone else whom you might not necessarily trust, such as CGI programs or any internet servers you might write in Perl. See perlsec for details. For security reasons, this option must be seen by Perl quite early; usually this means it must appear early on the command line or in the "#!" line for systems which support that construct.

The point being, such flags provide a very simple means for the user to check the execution of their code, without being terribly intrusive. They can be a great convenience as a stepping stone to discovering where problems exist and addressing them.
Jun 11 2016
prev sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 6/7/2016 1:48 PM, Jonathan M Davis via Digitalmars-d wrote:
 So, while mass applying something like @safe temporarily to check stuff
 makes some sense, I really don't think that it's a good idea to do it in any
 code that you'd ever commit.
The downsides you listed do not apply to @safe.
Jun 07 2016
parent Jonathan M Davis via Digitalmars-d <digitalmars-d puremagic.com> writes:
On Tuesday, June 07, 2016 16:04:05 Walter Bright via Digitalmars-d wrote:
 On 6/7/2016 1:48 PM, Jonathan M Davis via Digitalmars-d wrote:
 So, while mass applying something like @safe temporarily to check stuff
 makes some sense, I really don't think that it's a good idea to do it in
 any code that you'd ever commit.
 The downsides you listed do not apply to @safe.
Sure they do. Regardless of the attribute, if it can be inferred and templates are involved, you can't mass apply it, because you almost always need the attribute to be inferred. And regardless of whether an attribute can be inferred, mass applying it tends to mean that it's harder to figure out which attributes a function is actually marked with. It's easier when it's just a label at the top of the file, but we've already had PRs in Phobos where an attribute got applied locally as part of the PR, because the person doing the work did not realize that it was already in effect. And personally, it always throws me off when attribute labels or blocks are used, because it looks like the attribute is not being applied to a function when it actually is. I don't think that it matters what the attribute is. All of the same downsides apply. The primary difference with @safe over some of the others is that you can reverse it, whereas you can't with most of them. But even then, you can't tell a template to infer @safe when you've marked the whole file with @safe, so while you can change which level of trust you're applying to a function, you can't remove the trust attributes entirely once one of them has been applied. Personally, I think that it's almost always a mistake to mass apply attributes - especially those that can be inferred in templated code. It does not play well with templates, and it causes maintenance problems. - Jonathan M Davis
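A small sketch of the template-inference point (names are illustrative): the attribute set of a template function is inferred per instantiation, which is exactly what a module-wide label would clobber:

```d
// Attributes of template functions are inferred per instantiation.
void call(alias f)() { f(); }

void noAlloc() @safe @nogc { }
void allocates() @safe { auto p = new int; }   // uses the GC

void main() @safe @nogc
{
    call!noAlloc();       // fine: this instantiation is inferred @safe @nogc
    // call!allocates();  // would not compile: that instantiation is inferred
                          // as GC-allocating, so a @nogc caller can't use it
}
```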
Jun 07 2016
prev sibling parent Jack Stouffer <jack jackstouffer.com> writes:
On Tuesday, 7 June 2016 at 18:24:33 UTC, Walter Bright wrote:
 Add:

     @safe:

 at the top of your D module and you'll find the @system code.
Sure, that's easy to do in my code, but we were talking about 3rd-party code. Plus the template problem comes up again: no one should be annotating their templates with anything but maybe @property. So in practice, because you recommend (correctly) that everything should be a template, there's nothing to grep.
 You can also add:

     @nogc:

 at the top, too. It isn't necessary to tediously annotate every 
 function.
Are you sure? @nogc is not like @safe, in that it has no corresponding negating version to cancel the top-level attribute out. If I have @nogc: at the top, there's no way to declare an allocating function in that file; therefore you have to use @nogc {} blocks instead. This is why some have proposed hacks like !@nogc.
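A sketch of the difference (illustrative code): @safe: can be locally overridden with @system, but @nogc has no inverse, so the only way to keep part of a module unconstrained is to scope @nogc with braces rather than a colon:

```d
// @safe has an escape hatch:
@safe:
@system void lowLevel(int* p) { auto q = p + 1; }  // opts back out of @safe

// @nogc does not, so apply it as a block instead of a label:
@nogc
{
    int addOne(int x) { return x + 1; }  // must not touch the GC
}

int* makeInt()       // outside the block: allocating is still allowed
{
    auto p = new int;
    *p = 5;
    return p;
}

void main()
{
    assert(addOne(1) == 2);
    assert(*makeInt() == 5);
}
```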
Jun 07 2016
prev sibling parent reply Timon Gehr <timon.gehr gmx.ch> writes:
On 07.06.2016 20:15, Walter Bright wrote:
 On 6/7/2016 10:44 AM, Timon Gehr wrote:
 How do you know that some random @safe PR pulled into your project
 does not corrupt memory?
@trusted and @system are designed to be greppable,
$ grep -r "@trusted" *
$ grep -r "@system" *
 i.e. you can look for
 them without needing a static analysis tool.
mixin("@tru" ~ "sted void foo(){ ... }");

Anyway, this is not actually the issue. One can hack the compiler such that it reports the locations of @trusted functions easily. I still don't know the code is memory safe if main is @safe and there are no @trusted functions in the code. The @safe subset should be specified and implemented by inclusion, such that it is obvious that it does the right thing. I don't know what's 'unspecific' about this. Closing holes one-by-one is not the right approach here. You don't know when you are done and might never be.
Jun 07 2016
next sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 6/7/2016 11:32 AM, Timon Gehr wrote:
 The @safe subset should be specified and
 implemented by inclusion, such that it is obvious that it does the right thing.
 I don't know what's 'unspecific' about this.
 Closing holes one-by-one is not the
 right approach here. You don't know when you are done and might never be.
I don't see how it is any different from painting the fence from one direction or the other. There are omissions possible either way. Another issue is implementing such a spec. The "disapproved" list is how the compiler works, and makes it reasonably straightforward to check the implementation against the list. It's quite a mess to try to tag everything the compiler does with approved/disapproved, so you wind up in exactly the same boat anyway. In any case, writing such a large specification covering every semantic action of the language is way, way beyond being a bugzilla issue. If you want to take charge of writing such a specification DIP, please do so.
Jun 07 2016
next sibling parent reply Brad Roberts via Digitalmars-d <digitalmars-d puremagic.com> writes:
On 6/7/2016 12:52 PM, Walter Bright via Digitalmars-d wrote:
 On 6/7/2016 11:32 AM, Timon Gehr wrote:
 The @safe subset should be specified and
 implemented by inclusion, such that it is obvious that it does the
 right thing.
 I don't know what's 'unspecific' about this.
 Closing holes one-by-one is not the
 right approach here. You don't know when you are done and might never be.
I don't see how it is any different painting the fence from one direction or the other. There are omissions possible either way.
Yes, either direction has the probability of being incomplete. However, the disallow-by-default, allow-only-explicitly approach is _far_ more correct in the face of omissions. In the case of an omission of something that should be allowed, you haven't discovered a violation of safety; you've discovered a place to allow additional safe code. To use an analogy: would you trust a security model where the people allowed to withdraw from your bank account are determined by a whitelist, or by a blacklist?
 Another issue is implementing such a spec. The "disapproved" list is how
 the compiler works, and makes it reasonably straightforward to check the
 implementation against the list. It's quite a mess to try to tag
 everything the compiler does with approved/disapproved, so you wind up
 in exactly the same boat anyway.

 In any case, writing such a large specification covering every semantic
 action of the of the language is way, way beyond being a bugzilla issue.
Yes, it's hard to implement. Shrug, you signed up for it.
Jun 07 2016
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 6/7/2016 1:13 PM, Brad Roberts via Digitalmars-d wrote:
 Yes, it's hard to implement.  Shrug, you signed up for it.
I work on things pretty much on a maximizing benefit/cost basis. Working on something that has a clear, tremendous cost and whose benefit is that it might close a hole nobody has run into after many years of use puts it near the bottom of the list of productive things to do. This is how all engineering projects work. However, that should not dissuade anyone who believes it is worth their own effort to work on it, as everyone has their own benefit/cost function. BTW, it is a nice idea to require mathematical proofs of code properties, but real-world programming languages have turned out to be remarkably resistant to the construction of such proofs. As I recall, Java was initially proven to be memory safe, until someone found a hole in the proof. And so on and so forth for every malware attack vector people find. We plug the problems as we find them.
Jun 07 2016
parent reply Timon Gehr <timon.gehr gmx.ch> writes:
On 07.06.2016 22:36, Walter Bright wrote:
 ...

 BTW, it is a nice idea to require mathematical proofs of code
 properties, but real world programming languages have turned out to be
 remarkably resistant to construction of such proofs. As I recall, Java
 had initially proven that Java was memory safe, until someone found a
 hole in it. And so on and so forth for every malware attack vector
 people find. We plug the problems as we find them.
Obviously they proved the virtual machine itself memory safe, not all of its implementations. If you have a mechanized proof of memory safety, then your language is memory safe.
Jun 07 2016
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 6/7/2016 3:23 PM, Timon Gehr wrote:
 Obviously they proved the virtual machine itself memory safe,
As I recall, the proof was broken, not the implementation. People do make mistakes and overlook cases with proofs. There's nothing magical about them.
Jun 07 2016
next sibling parent reply Timon Gehr <timon.gehr gmx.ch> writes:
On 08.06.2016 00:47, Walter Bright wrote:
 On 6/7/2016 3:23 PM, Timon Gehr wrote:
 Obviously they proved the virtual machine itself memory safe,
As I recall, the proof was broken, not the implementation.
Which time?
 People do
 make mistakes and overlook cases with proofs. There's nothing magical
 about them.
Obviously, but there are reliable systems that check proofs automatically.
Jun 07 2016
parent reply Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
On 6/8/16 12:53 AM, Timon Gehr wrote:
 On 08.06.2016 00:47, Walter Bright wrote:
 On 6/7/2016 3:23 PM, Timon Gehr wrote:
 Obviously they proved the virtual machine itself memory safe,
As I recall, the proof was broken, not the implementation.
Which time?
That is an old result that has essentially expired and should not be generalized. See http://www.seas.upenn.edu/~sweirich/types/archive/1999-2003/msg00849.html. I assume the matter has been long fixed by now, do you happen to know?
 People do
 make mistakes and overlook cases with proofs. There's nothing magical
 about them.
Obviously, but there are reliable systems that check proofs automatically.
It is my opinion that writing off formal proofs of safety is a mistake. Clearly we don't have the capability on the core team to work on such. However, I am very interested if you'd want to lead such an effort. Andrei
Jun 07 2016
next sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 6/7/2016 4:07 PM, Andrei Alexandrescu wrote:
 It is my opinion that writing off formal proofs of safety is a mistake. Clearly
 we don't have the capability on the core team to work on such. However, I am
 very interested if you'd want to lead such an effort.
On the contrary, I think a formal proof would be very valuable. I am just skeptical of the notion that a proof is automatically correct. I've read about mistakes being found in many published mathematical proofs. I read somewhere that Hilbert made many mistakes in his proofs, even though the end results turned out correct.
Jun 07 2016
next sibling parent Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= writes:
On Wednesday, 8 June 2016 at 00:39:54 UTC, Walter Bright wrote:
 On the contrary, I think a formal proof would be very valuable. 
 I am just skeptical of the notion that a proof is automatically 
 correct. I've read about mistakes being found in many published 
 mathematical proofs. I read somewhere that Hilbert made many 
 mistakes in his proofs, even though the end results turned out 
 correct.
Well, you cannot prevent errors in the requirements, but you can eliminate errors in the proof; so if the requirements are too complex you have a bad deal. The theorem prover is separate from the proof verifier. It works like this:

1. A human specifies the requirements (e.g. assert(...) in D).

2. The theorem prover takes program + requirements + strategies (prodding the prover along the right track) and emits a loooong formal proof in a standard format.

3. The proof is handed to N independently implemented verifiers that check the proof.

But that is impractical for a typical user-created program. You only want to do that once, for your backend or your type system etc.

What you can do is, as you've stated before, transform your source code into a simpler form and verify that it can only lead to situations that are provably safe. The advantage of this is that you can also prove specific cases of pointer arithmetic safe (say, a fixed-size array on the stack), thus reducing the need for trusted. The disadvantage is that it will slow down the compiler and make it more complicated, so why have it in the compiler and not as a separate program? Make it a separate program so it works on uninstantiated code and can prove libraries to be correctly marked safe before they are uploaded to repositories etc. If safe does not affect code gen, why have it in the compiler?
Jun 08 2016
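Ola's prover/verifier split can be illustrated with a toy sketch. This is purely hypothetical code, not any real proof system: the `prover` and `verifier` functions are invented for illustration, and Python's `eval` stands in for checking a single proof step. The point is only that the trusted verifier is small and independent of however the untrusted prover found the derivation.

```python
def prover(lhs, rhs):
    # Untrusted search: emit a chain of expressions from lhs to rhs.
    # (A real prover emits rewrite steps; here we cheat by evaluating.)
    return [lhs, str(eval(lhs)), rhs]

def verifier(steps):
    # Trusted, independently reimplementable check: every adjacent pair
    # of steps must denote the same value.
    return all(eval(a) == eval(b) for a, b in zip(steps, steps[1:]))

proof = prover("2 * (3 + 4)", "14")
print(verifier(proof))           # True: the emitted proof checks out
print(verifier(["2 + 2", "5"]))  # False: a bogus proof is rejected
```

Because the verifier is tiny compared to the prover, it can be reimplemented N times independently, which is exactly the redundancy the post describes.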
prev sibling parent Timon Gehr <timon.gehr gmx.ch> writes:
On 08.06.2016 02:39, Walter Bright wrote:
 On 6/7/2016 4:07 PM, Andrei Alexandrescu wrote:
 It is my opinion that writing off formal proofs of safety is a
 mistake. Clearly
 we don't have the capability on the core team to work on such.
 However, I am
 very interested if you'd want to lead such an effort.
On the contrary, I think a formal proof would be very valuable. I am just skeptical of the notion that a proof is automatically correct. I've read about mistakes being found in many published mathematical proofs. I read somewhere that Hilbert made many mistakes in his proofs, even though the end results turned out correct.
Mathematicians use a semi-formal style of reasoning in publications. Most mistakes are minor and most mathematicians don't use tools (such as https://coq.inria.fr/) to verify their proofs like computer scientists often do when proving properties of formal systems. The focus of Mathematics isn't necessarily on verification, it is usually on aesthetics, understanding, communication etc. Current tools are not particularly strong in such areas and it is often more tedious to get the proof through than it should be. And certainly Hilbert didn't have access to anything like them.
Jun 08 2016
prev sibling parent reply Timon Gehr <timon.gehr gmx.ch> writes:
On 08.06.2016 01:07, Andrei Alexandrescu wrote:
 On 6/8/16 12:53 AM, Timon Gehr wrote:
 On 08.06.2016 00:47, Walter Bright wrote:
 On 6/7/2016 3:23 PM, Timon Gehr wrote:
 Obviously they proved the virtual machine itself memory safe,
As I recall, the proof was broken, not the implementation.
Which time?
That is an old result that has essentially expired and should not be generalized. See http://www.seas.upenn.edu/~sweirich/types/archive/1999-2003/msg00849.html.
I think this can't be what Walter is referring to: "the type inference system for generic method calls was not subjected to formal proof. In fact, it is unsound," I.e. no proof, unsound.
 I assume the matter has been long fixed by now, do you happen to know?
 ...
I don't know. BTW, Java's type system is unsound [1].

class Unsound {
    static class Bound<A, B extends A> {}
    static class Bind<A> {
        <B extends A> A bad(Bound<A, B> bound, B b) { return b; }
    }
    public static <T, U> U coerce(T t) {
        Bound<U, ? super T> bound = null;
        Bind<U> bind = new Bind<U>();
        return bind.bad(bound, t);
    }
    public static void main(String[] args) {
        String s = coerce(0);
    }
}
 People do
 make mistakes and overlook cases with proofs. There's nothing magical
 about them.
Obviously, but there are reliable systems that check proofs automatically.
It is my opinion that writing off formal proofs of safety is a mistake. Clearly we don't have the capability on the core team to work on such. However, I am very interested if you'd want to lead such an effort. Andrei
I'll probably do it at some point. (However, first I need to figure out what the formal language specification should actually be, this is one reason why I'm implementing a D compiler.) [1] https://www.facebook.com/ross.tate/posts/10102249566392775.
Jun 08 2016
parent Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
On 6/8/16 1:50 PM, Timon Gehr wrote:
 I'll probably do it at some point. (However, first I need to figure out
 what the formal language specification should actually be, this is one
 reason why I'm implementing a D compiler.)
That's very very promising. Looking forward to anything in that area! -- Andrei
Jun 08 2016
prev sibling parent reply Jonathan M Davis via Digitalmars-d <digitalmars-d puremagic.com> writes:
On Tuesday, June 07, 2016 15:47:10 Walter Bright via Digitalmars-d wrote:
 On 6/7/2016 3:23 PM, Timon Gehr wrote:
 Obviously they proved the virtual machine itself memory safe,
As I recall, the proof was broken, not the implementation. People do make mistakes and overlook cases with proofs. There's nothing magical about them.
Yeah. I recall an article by Joel Spoelsky where he talks about deciding that proofs of correctness weren't worth much, because they were even harder to get right than the software. I do think that there are situations where proofs are valuable, but they do tend to be very difficult to get right, and their application is ultimately fairly limited. - Jonathan M Davis
Jun 07 2016
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 6/7/2016 4:01 PM, Jonathan M Davis via Digitalmars-d wrote:
 Yeah. I recall an article by Joel Spoelsky where he talks about deciding
 that proofs of correctness weren't worth much, because they were even harder
 to get right than the software.

 I do think that there are situations where proofs are valuable, but they do
 tend to be very difficult to get right, and their application is ultimately
 fairly limited.
My understanding is that academic researchers who need to prove a theory use a subset of Java, because the smaller the language, the more practical it is to write proofs about it. I also remember bearophile bringing up the Spec# language, which was supposed to be able to formally prove things, but it turned out it couldn't prove much. I fed it some one-liners with bit masking, and it threw in the towel on them. I suspect D has long since passed the point where it is too complicated for the rather limited ability of mathematicians to prove things about it.
Jun 07 2016
parent reply Timon Gehr <timon.gehr gmx.ch> writes:
On 08.06.2016 01:59, Walter Bright wrote:
 ...

 I suspect D has long since passed point where it is too complicated for
 the rather limited ability of mathematicians to prove things about it.
The main reason why it is currently impractical to prove things about D is that D is not really a mathematical object. I.e. there is no precise spec.
Jun 08 2016
next sibling parent Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= writes:
On Wednesday, 8 June 2016 at 13:43:27 UTC, Timon Gehr wrote:
 On 08.06.2016 01:59, Walter Bright wrote:
 ...

 I suspect D has long since passed point where it is too 
 complicated for
 the rather limited ability of mathematicians to prove things 
 about it.
The main reason why it is currently impractical to prove things about D is that D is not really a mathematical object. I.e. there is no precise spec.
Besides that, even if a safe checker is slightly flawed, it only has to be vetted better than the backend, which most likely is unverified anyway. This is different from some of the static analysis done on C, which converts the LLVM bitcode or even x86 assembly into a format that can be queried using a solver. That way the proof holds even in the case where the backend is buggy.
Jun 08 2016
prev sibling parent Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
On 6/8/16 3:43 PM, Timon Gehr wrote:
 On 08.06.2016 01:59, Walter Bright wrote:
 ...

 I suspect D has long since passed point where it is too complicated for
 the rather limited ability of mathematicians to prove things about it.
The main reason why it is currently impractical to prove things about D is that D is not really a mathematical object. I.e. there is no precise spec.
Walter and I have spoken about the matter and reached the conclusion that work on a formal spec (be it in legalese, typing trees, small step semantics etc) on a reduced form of D would be very beneficial. We are very much supportive of such work. Andrei
Jun 08 2016
prev sibling parent reply Timon Gehr <timon.gehr gmx.ch> writes:
On 07.06.2016 21:52, Walter Bright wrote:
 On 6/7/2016 11:32 AM, Timon Gehr wrote:
 The  safe subset should be specified and
 implemented by inclusion, such that it is obvious that it does the
 right thing.
 I don't know what's 'unspecific' about this.
 Closing holes one-by-one is not the
 right approach here. You don't know when you are done and might never be.
I don't see how it is any different painting the fence from one direction or the other.
The fence is infinitely long, your painting speed is finite and people will be looking at the fence mostly at the left end.
 There are omissions possible either way.
 ...
In one way, an omission means you are potentially tracking down memory corruption inside a huge codebase by grepping for trusted, until you notice that the issue is in safe code. In the other way, an omission means you are getting a spurious compile error that is easily worked around.
 Another issue is implementing such a spec. The "disapproved" list is how
 the compiler works,
It is how the compiler fails to work.
 and makes it reasonably straightforward to check the
 implementation against the list. It's quite a mess to try to tag
 everything the compiler does with approved/disapproved, so you wind up
 in exactly the same boat anyway.
 ...
The compiler should work by inclusion too.
 In any case, writing such a large specification covering every semantic
 action of the language is way, way beyond being a bugzilla issue.
 ...
Does not apply. The bugzilla issue can be fixed by disallowing all code in safe. Also, why not just close the bugzilla issue _after_ there is a more adequate replacement?
 If you want to take charge of writing such a specification DIP,
 please do so.
If you think progress on this matters, why are you arguing against it?
Jun 07 2016
parent Walter Bright <newshound2 digitalmars.com> writes:
On 6/7/2016 3:10 PM, Timon Gehr wrote:
 If you think progress on this matters, why are you arguing against it?
I don't believe it is worth the effort. You do. You need to make a better case for it, and the best way to do that is to actually write a spec. Demanding it of someone (i.e. me) who doesn't believe in it, has a poor track record of writing specs, and has no academic background in writing proofs means you aren't going to get what you want by that route. You've made some valuable contributions to D, in the form of finding problems. Why not contribute something more substantial, like a spec? You have a good idea in mind of what it should be; just write it.
Jun 07 2016
prev sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 6/7/2016 11:32 AM, Timon Gehr wrote:
 mixin(" tru"~"sted void foo(){ ... }");
So grep for mixin, too. Not hard.
Jun 07 2016
parent Timon Gehr <timon.gehr gmx.ch> writes:
On 08.06.2016 00:44, Walter Bright wrote:
 On 6/7/2016 11:32 AM, Timon Gehr wrote:
 mixin(" tru"~"sted void foo(){ ... }");
So grep for mixin, too. Not hard.
Huge amounts of false positives.
Jun 07 2016
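A hypothetical toy model of the grep audit debated above (the file names and contents are invented): a direct search for "trusted" misses code that assembles the attribute through a string mixin, while searching for "mixin" flags every mixin, including harmless ones, which are Timon's false positives.

```python
# Three invented D source files: one with a direct trusted function, one
# that builds "trusted" via string concatenation in a mixin, and one with
# a perfectly harmless mixin.
files = {
    "direct.d": '@trusted void foo() {}',
    "hidden.d": 'mixin("@tru" ~ "sted void bar(){}");',
    "harmless.d": 'mixin template T() {}',
}

def grep(pattern):
    # Crude substring search, standing in for `grep -l pattern *.d`.
    return sorted(name for name, text in files.items() if pattern in text)

print(grep("trusted"))  # ['direct.d']: the mixin-built one is missed
print(grep("mixin"))    # ['harmless.d', 'hidden.d']: false positive included
```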
prev sibling parent reply Brad Roberts via Digitalmars-d <digitalmars-d puremagic.com> writes:
On 6/6/2016 11:22 PM, Walter Bright via Digitalmars-d wrote:
 On 6/6/2016 10:38 PM, Brad Roberts via Digitalmars-d wrote:
 The D ecosystem is a large pile of incomplete features, with more
 added all the
 time.
Even with only array bounds checking, D is safer than C++.
Nice deflection, has the benefit of being both correct and irrelevant. Yes, bounds checking is great. Yes, D is safer than C++, but we were talking about safe as a specific, advertised, but low-priority and largely unrealized feature, not the general safety of the language. safe is intended to be a big leap forward, but falls flat outside toys. I want the protections it's intended to bring. I want the notification I've accidentally violated one of the rules. I want the productivity gains from not having to debug cases it's intended to catch. This minimization and deflection of real user feedback is rather disappointing and somewhat insulting. Instead, I'd prefer to see acceptance that it's an actual issue and not hand waved away as not really a problem of current, real, users. Sigh, Brad
Jun 07 2016
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 6/7/2016 12:07 AM, Brad Roberts via Digitalmars-d wrote:
 Yes, D is safer than C++, but we were talking
 about  safe as a specific, advertised, but low-priority and largely unrealized
 feature, not the general safety of the language.  safe is intended to be a big
 leap forward, but falls flat outside toys.
I looked over the issues marked 'safe' in bugzilla, and do not understand the complaint that it falls flat and is largely unrealized. There are holes in it, sure, but they do not take away from what is covered, which is quite a bit.
 This minimization and deflection of real user feedback is rather disappointing
 and somewhat insulting.  Instead, I'd prefer to see acceptance that it's an
 actual issue and not hand waved away as not really a problem of current, real,
 users.
I cannot do anything with handwaving statements, whether I accept them and hang my head in shame or not. Although I may have missed some, or some may not be tagged properly, I found only one bugzilla issue from you tagged 'safe': https://issues.dlang.org/show_bug.cgi?id=13607 Here is the complete list of issues tagged with 'safe': https://issues.dlang.org/buglist.cgi?bug_status=NEW&bug_status=ASSIGNED&bug_status=REOPENED&keywords=safe&keywords_type=allwords&list_id=208819&query_format=advanced Writing up specifically what problems you're having would be most appreciated.
Jun 07 2016
next sibling parent Walter Bright <newshound2 digitalmars.com> writes:
I just noticed you [i.e. Brad] had made some bugzilla entries about  safe that 
were not tagged with the 'safe' keyword. I have gone ahead and tagged them, so
now:

https://issues.dlang.org/buglist.cgi?bug_status=NEW&bug_status=ASSIGNED&bug_status=REOPENED&keywords=safe&keywords_type=allwords&list_id=208819&query_format=advanced

should be a more or less complete list. If there are more, please add.
Jun 07 2016
prev sibling next sibling parent reply "H. S. Teoh via Digitalmars-d" <digitalmars-d puremagic.com> writes:
On Tue, Jun 07, 2016 at 12:49:55AM -0700, Walter Bright via Digitalmars-d wrote:
 On 6/7/2016 12:07 AM, Brad Roberts via Digitalmars-d wrote:
[...]
 This minimization and deflection of real user feedback is rather
 disappointing and somewhat insulting.  Instead, I'd prefer to see
 acceptance that it's an actual issue and not hand waved away as not
 really a problem of current, real, users.
I cannot do anything with handwaving statements, whether I accept them and hang my head in shame or not. Although I may have missed some, or some may not be tagged properly, I found only one bugzilla issue from you tagged 'safe': https://issues.dlang.org/show_bug.cgi?id=13607 Here is the complete list of issues tagged with 'safe': https://issues.dlang.org/buglist.cgi?bug_status=NEW&bug_status=ASSIGNED&bug_status=REOPENED&keywords=safe&keywords_type=allwords&list_id=208819&query_format=advanced Writing up specifically what problems you're having would be most appreciated.
I can't seem to find an issue I filed some years ago about safe needing to be whitelist-based rather than blacklist-based. Did it get closed while I wasn't looking? The problem is that D, being as complex as it is, has far too many corner cases and unanticipated feature combinations, that there is always going to be something that violates safe in some way that we hadn't anticipated. Currently, safe is implemented as what's effectively a blacklist: code is assumed to be safe by default, unless they use one of a set of features deemed unsafe. This means unanticipated unsafe corner cases will be assumed safe by default. A better approach would be to assume that *everything* is unsafe, and then whitelist those operations that are verifiably safe. Yes, at first it will be almost impossible to write safe code because the compiler will complain about everything, but as the whitelist is expanded more code becomes compilable in safe, and we at least would have the confidence that unanticipated corner cases will be assumed unsafe. If they actually turn out to be safe afterwards, we can always add them to the whitelist, and in the meantime we have the confidence that no code in the wild is actually unsafe while assumed to be safe. Whereas right now, it's hard to be confident that code marked safe doesn't contain some unexpected feature combination that is actually unsafe, but that we just haven't caught it yet. T -- "No, John. I want formats that are actually useful, rather than over-featured megaliths that address all questions by piling on ridiculous internal links in forms which are hideously over-complex." -- Simon St. Laurent on xml-dev
Jun 07 2016
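The divergence described above can be sketched with a hypothetical mini-checker (not dmd's actual implementation; the operation names are invented). The two approaches agree on everything both lists anticipate and differ exactly on the unanticipated corner case:

```python
SAFE_OPS = {"add", "bounds_checked_index", "call_safe"}   # whitelist
UNSAFE_OPS = {"pointer_arithmetic", "unsafe_cast"}        # blacklist

def safe_by_whitelist(ops):
    # Whitelist: reject anything not positively known to be safe.
    return all(op in SAFE_OPS for op in ops)

def safe_by_blacklist(ops):
    # Blacklist: accept anything not positively known to be unsafe.
    return not any(op in UNSAFE_OPS for op in ops)

# An operation neither list anticipated (the unanticipated corner case):
ops = ["add", "brand_new_feature"]
print(safe_by_whitelist(ops))  # False: conservatively rejected
print(safe_by_blacklist(ops))  # True: silently assumed safe
```

Under the whitelist, an omission costs only a spurious compile error; under the blacklist, it costs silently accepted unsafe code.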
parent reply Steven Schveighoffer <schveiguy yahoo.com> writes:
On Tuesday, 7 June 2016 at 13:47:39 UTC, H. S. Teoh wrote:
 .

 I can't seem to find an issue I filed some years ago about 
  safe needing to be whitelist-based rather than 
 blacklist-based. Did it get closed while I wasn't looking?
This one? http://forum.dlang.org/post/mailman.2912.1465288884.26339.digitalmars-d-bugs puremagic.com -Steve
Jun 07 2016
parent reply "H. S. Teoh via Digitalmars-d" <digitalmars-d puremagic.com> writes:
On Tue, Jun 07, 2016 at 02:31:43PM +0000, Steven Schveighoffer via
Digitalmars-d wrote:
 On Tuesday, 7 June 2016 at 13:47:39 UTC, H. S. Teoh wrote:
 .
 
 I can't seem to find an issue I filed some years ago about  safe
 needing to be whitelist-based rather than blacklist-based. Did it
 get closed while I wasn't looking?
This one? http://forum.dlang.org/post/mailman.2912.1465288884.26339.digitalmars-d-bugs puremagic.com
[...] Oh yes, that's the one. T -- Making non-nullable pointers is just plugging one hole in a cheese grater. -- Walter Bright
Jun 07 2016
parent reply Timon Gehr <timon.gehr gmx.ch> writes:
On 07.06.2016 17:11, H. S. Teoh via Digitalmars-d wrote:
 On Tue, Jun 07, 2016 at 02:31:43PM +0000, Steven Schveighoffer via
Digitalmars-d wrote:
 On Tuesday, 7 June 2016 at 13:47:39 UTC, H. S. Teoh wrote:
 .

 I can't seem to find an issue I filed some years ago about  safe
 needing to be whitelist-based rather than blacklist-based. Did it
 get closed while I wasn't looking?
This one? http://forum.dlang.org/post/mailman.2912.1465288884.26339.digitalmars-d-bugs puremagic.com
[...] Oh yes, that's the one. T
You didn't find it because it was closed by Walter for no good reason about 10 hours ago.
Jun 07 2016
parent Walter Bright <newshound2 digitalmars.com> writes:
On 6/7/2016 11:36 AM, Timon Gehr wrote:
 You didn't find it because it was closed by Walter for no good reason about 10
 hours ago.
If you want to make a DIP out of it, please do so. It was inappropriate as a bugzilla issue.
Jun 07 2016
prev sibling parent reply Jonathan M Davis via Digitalmars-d <digitalmars-d puremagic.com> writes:
On Tuesday, June 07, 2016 06:47:39 H. S. Teoh via Digitalmars-d wrote:
 I can't seem to find an issue I filed some years ago about  safe needing
 to be whitelist-based rather than blacklist-based. Did it get closed
 while I wasn't looking?
Walter closed it a day or two ago on the grounds that it wasn't a specific issue but more of a discussion topic: https://issues.dlang.org/show_bug.cgi?id=12941 In principle, I think that you're very right that safe needs to be implemented as a whitelist. Security in general does not work as a blacklist, and I think that safe has the same problem. The problem is code breakage. Even assuming that the change in implementation were straightforward (and I have no idea whether it is or not), it would be pretty much guaranteed that we would break a lot of code marked safe if we were to switch to a whitelist. Some of that code is not truly safe and really should be fixed, but just throwing the switch like that is too sudden. We'd probably be forced to have both a whitelist and a blacklist and treat the whitelist results as warnings temporarily before switching fully to the whitelist implementation. And that's likely feasible, but it seems like it would be a bit of a mess. So, I don't know if we reasonably can switch to a whitelist or not. But I think it's clear that we ideally would. - Jonathan M Davis
Jun 07 2016
parent Observer <here inter.net> writes:
On Tuesday, 7 June 2016 at 20:41:21 UTC, Jonathan M Davis wrote:
 In principle, I think that you're very right that  safe needs 
 to be implemented as a whitelist. Security in general does not 
 work as a blacklist, and I think that  safe has the same 
 problem. The problem is code breakage. Even assuming that the 
 change in implementation were straightforward (and I have no 
 idea whether it is or not), it would be pretty much guaranteed 
 that we would break a lot of code marked  safe if we were to 
 switch to a whitelist. Some of that code is not truly  safe and 
 really should be fixed, but just throwing the switch like that 
 is too sudden. We'd probably be forced to have both a whitelist 
 and a blacklist and treat the whitelist results as warnings 
 temporarily before switching fully to the whitelist 
 implementation. And that's likely feasible, but it seems like 
 it would be a bit of a mess. So, I don't know if we reasonably 
 can switch to a whitelist or not. But I think it's clear 
 that we ideally would.
I think you meant "treat the non-whitelist results as warnings". Seems to me the proper answer is simple. Stuff on the whitelist should pass without comment. Stuff on neither the whitelist nor the blacklist should generate warnings. Stuff on the blacklist should generate errors. A compiler flag similar to gcc's -Werror that turns all warnings into errors would allow the end-user to select whether or not to worry, during a phase of transition. This way, those warnings could be pushed back upstream to the compiler maintainers as "hey, your whitelist/blacklist division omits certain real-world cases". And gradually, the graylist would be narrowed over successive compiler releases.
Jun 11 2016
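Observer's three-way policy can be sketched as follows. Everything here is hypothetical: the lists, the operation names, and the flag, which plays the role of gcc's -Werror:

```python
WHITELIST = {"bounds_checked_index", "safe_call"}
BLACKLIST = {"untyped_pointer_cast"}

def classify(op, warnings_as_errors=False):
    if op in WHITELIST:
        return "ok"          # known safe: passes without comment
    if op in BLACKLIST:
        return "error"       # known unsafe: always an error
    # The graylist: on neither list, so warn (or error under the flag).
    return "error" if warnings_as_errors else "warning"

print(classify("safe_call"))                              # ok
print(classify("untyped_pointer_cast"))                   # error
print(classify("brand_new_op"))                           # warning
print(classify("brand_new_op", warnings_as_errors=True))  # error
```

The warnings are the feedback channel: each one reported upstream shrinks the graylist in a later compiler release.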
prev sibling parent Paolo Invernizzi <paolo.invernizzi no.address> writes:
On Tuesday, 7 June 2016 at 05:38:09 UTC, Brad Roberts wrote:
 I've fixed some of the issues in a couple bursts of activity 
 over the last several years, and filed a bunch more bugs, but 
 the specifics aren't the point I'm raising here, though your 
 trimming of the thread dropped that part of the context.  You 
 dismissed complaints of the incompleteness of safety as the 
 whining of non-users.  I'm a user.  I was a much more frequent 
 user until I got tired of the sheer number of only partially 
 complete nature of so much of the language + core library.  Yes 
 they're separate, no that's not relevant to the majority of 
 users.  Yes, I can and have contributed to the fixes, but it's 
 clearly (just based on commit history) not a priority to many 
 people.
Same here....
 The D ecosystem is a large pile of incomplete features, with 
 more added all the time.
I'm not using D anymore, at least not for new production projects, exactly for that reason. I don't feel confident any more on the future of the language exactly for that reason. /Paolo
Jun 07 2016
prev sibling parent HaraldZealot <harald_zealot tut.by> writes:
On Monday, 6 June 2016 at 09:16:45 UTC, Walter Bright wrote:
 Paying attention to our existing users is a much more reliable 
 source of information.
+1000
Jun 07 2016
prev sibling next sibling parent reply NX <nightmarex1337 hotmail.com> writes:
On Monday, 6 June 2016 at 08:15:42 UTC, Russel Winder wrote:
 On Sun, 2016-06-05 at 19:20 -0700, Walter Bright via 
 Digitalmars-d wrote:
 […]
 
 * The garbage collector eliminates probably 60% of potential 
 users right off.
And I bet over 80% of them are just saying this based on zero evidence, just prejudice. Go went with the attitude "Go has a GC, if you cannot deal with that #### off". Many people did exactly that and the Go community said "byeeee". Arrogant this may have been, but Pike, Cox, et al. stuck to their guns and forged a community and a niche for the language. This then created traction. Now GC in Go is not an issue.
This. I think the biggest problem about D is it's trying to satisfy *everyone*. It's literally combining C++, Java, Python, and ~the like. At some point it even tries to mimic Rust! Trying to be everything at once is key po
Jun 06 2016
parent reply NX <nightmarex1337 hotmail.com> writes:
On Monday, 6 June 2016 at 13:51:30 UTC, NX wrote:
 This.

 I think the biggest problem about D is it's trying to satisfy 
 *everyone*. It's literally combining C++, Java, Python, and 
 ~the like. At some point it even tries to mimic Rust! Trying to 
 be everything at once is key po-----
-------int of failure. You try to be everything, then you see you're none of them. There is no clear vision of what D is aiming to be. Think about it. Let's say we seamlessly combined GC + borrow semantics + manual memory management. How can you expect a library writer to decide its way? There are way too many things to consider. And then we may write 2-3 versions of the same lib; say, one that uses GC and the other one that requires manual memory management. What a mess! D has a GC. I am completely fine with GCs. But there are things that hurt rea
Jun 06 2016
parent NX <nightmarex1337 hotmail.com> writes:
And this forum is impossible to use on smart phones...

I gave up. You know the rest...
Jun 06 2016
prev sibling next sibling parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= writes:
On Monday, 6 June 2016 at 08:15:42 UTC, Russel Winder wrote:
 On Sun, 2016-06-05 at 19:20 -0700, Walter Bright via 
 Digitalmars-d wrote:
 […]
 
 * The garbage collector eliminates probably 60% of potential 
 users right off.
And I bet over 80% of them are just saying this based on zero evidence, just prejudice. Go went with the attitude "Go has a GC, if you cannot deal with that #### off". Many people did exactly that and the Go community said "byeeee". Arrogant this may have been, but Pike, Cox, et al. stuck to their guns and forged a community and a niche for the language. This then created traction. Now GC in Go is not an issue.
GC in Go is not an issue, because in Go the concurrent GC is basically what it has to offer in addition to builtin decent HTTP and cloud-server adoption. GC in Go would have been a big big issue if Go was not designed for it or tried to present itself as a system-level programming language. For performance you would still not use Go, you would use either C++ or Rust. But few servers in the cloud need those extra 20%.
Jun 06 2016
parent reply Russel Winder via Digitalmars-d <digitalmars-d puremagic.com> writes:
On Mon, 2016-06-06 at 18:03 +0000, Ola Fosheim Gr=C3=B8stad via Digitalmars=
-
d wrote:
=20
[=E2=80=A6]
 GC in Go is not an issue, because in Go the concurrent GC is=C2=A0
 basically what it has to offer in addition to builtin decent HTTP=C2=A0
 and cloud-server adoption.
That is the current state after 7 years of development, and at least three GCs. The arguments about GCs in the Go mailing list were almost similar to those in these D mailing lists. The crucial difference the full time Go developers did something. D appears to not have that rather =C2=A0crucial resource.
 GC is Go would have been a big big issue if Go was not designed=C2=A0
 for it or tried to present itself as a system level programming=C2=A0
 language.
Go was always, and always will be a GC language, very true. However it is, and always has been, emphasized as a systems programming language. Their "strap line" is effectively that GC is not a problem for systems programming. And they are right. Which is why D has no problem with being a GC language.
 For performance you would still not use Go; you would use either 
 C++ or Rust. But few servers in the cloud need those extra 20%.
Go has blossomed due to the Web niche, but it is used elsewhere. If C++ and Rust can only offer a 20% improvement they are dead in the water. I am sure that by appropriate adjustment of the number of processors, a well-designed Go program can go far more than 20% faster. The CSP emphasis of Go application design is a wonderful thing. Now I can use Kotlin+Quasar as well as Groovy+GPars to push this same message. Sadly D doesn't yet have an equivalent and therefore has a problem. I would love to create a CSP thing for D but I cannot give the time to do this on my own.
--
Russel.
===========================================================================
Dr Russel Winder      t: +44 20 7585 2200      voip: sip:russel.winder ekiga.net
41 Buckmaster Road    m: +44 7770 465 077      xmpp: russel winder.org.uk
London SW11 1EN, UK   w: www.russel.org.uk     skype: russel_winder
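For what it's worth, D's std.concurrency already provides message-passing primitives that approximate the CSP style (mailboxes rather than named channels — a sketch, not a CSP library; the worker/sentinel protocol here is made up for illustration):

```d
import std.concurrency;
import std.stdio;

// CSP-ish worker process: reads ints from its mailbox, sends squares back.
void worker(Tid owner)
{
    for (;;)
    {
        immutable n = receiveOnly!int();   // blocks, like a channel read
        if (n < 0) break;                  // negative value = stop sentinel
        owner.send(n * n);
    }
}

void main()
{
    auto tid = spawn(&worker, thisTid);
    foreach (i; 0 .. 4)
        tid.send(i);
    tid.send(-1);                          // tell the worker to finish
    foreach (_; 0 .. 4)
        writeln(receiveOnly!int());        // squares arrive in send order
}
```

The mailbox model differs from CSP's synchronous channels (sends don't block by default), which is part of why a dedicated CSP library would still be worthwhile.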
Jun 07 2016
next sibling parent Ola Fosheim Grøstad writes:
On Tuesday, 7 June 2016 at 09:29:35 UTC, Russel Winder wrote:
 That is the current state after 7 years of development, and at 
 least three GCs. The arguments about GCs in the Go mailing list 
 were almost similar to those in these D mailing lists.
It probably was. I only followed those mailing lists when Go was first launched and people didn't complain then, but they complained about lacking exceptions and non-null pointers. I knew Go users complained about GC performance after some time (when trying to use it in production), but I didn't know they complained about having a GC?
 However it is, and always has been, emphasized as a systems 
 programming language. Their "strap line" is effectively that GC 
 is not a problem for systems programming. And they are right.
They initially said it was a systems programming language, but they later turned around and said it isn't and isn't trying to be. Go is a high-level language; it does not try to give unbiased access to hardware semantics, so it cannot be considered a system-level language. I am not even sure if Rust fully qualifies atm.
 Which is why D has no problem with being a GC language.
Well... that remains unproven. :-) And I don't agree.
 If C++ and Rust can only offer 20% improvement they are dead in 
 the water.
I meant that by not having a GC, C++ can get roughly a 20% improvement over Go with idiomatic code equivalent to the Go code. There is also overhead when making system calls and C calls in Go, thanks to the GC. That said, in the cloud it makes a lot of sense not to use a system-level language and to use a high-level "virtual" environment instead, so that you can upgrade hardware without affecting the behaviour of services. If Go was positioned as a system-level language it would be dead.
 I would love to create a CSP thing for D but I cannot give the 
 time to do this on my own.
I don't use CSP in Go... The only things I care about regarding Go are: high level (not particularly hardware dependent), GC, HTTP, significantly faster than Python for web servers, AppEngine support, and good searchable online documentation (traction). If Go was a system-level language I would not consider using it, actually, and I am only starting to adopt it, very very slowly, as Python still has many advantages over Go in production. I am never going to use Go for something I couldn't do with Python.

So to me C++ and Go have disjoint application areas. Rust and D are mostly in the C++ domain, with a little more overlap with Go than C++ has, but not much. That said, if the basic simplicity-oriented design philosophy of D1 had been refined, then D could have taken on Go. As it stands, I don't think D can without making very radical changes to the language.
Jun 07 2016
prev sibling parent reply Dave <david.dave dave.com> writes:
On Tuesday, 7 June 2016 at 09:29:35 UTC, Russel Winder wrote:
 Their "strap line" is effectively that GC is not a problem for 
 systems programming. And they are right. Which is why D has no 
 problem with being a GC language.
Incorrect. Pike, on a panel with D's Andrei, even said that when they labeled Go a systems programming language it was kind of taken in a way they didn't mean. It's more of a server programming language. I don't think Pike would agree that Go would be the best choice for an OS. I'm sure you can create one, and I'm sure he'd agree with this, but I doubt he'd personally reach for it.
 Go has blossomed due to the Web niche but it is used elsewhere.
No. It has blossomed because of its tooling and standard library, which get traction from Google's funding.
Jun 07 2016
parent reply Ola Fosheim Grøstad writes:
On Tuesday, 7 June 2016 at 17:09:38 UTC, Dave wrote:
 Incorrect. Pike, on a panel with D's Andrei, even said that 
 when they labeled Go a systems programming language it was kind 
 of taken in a way they didn't mean. It's more of a server 
 programming language. I don't think Pike would agree that Go 
 would be the best choice for an OS. I'm sure you can create 
 one, and I'm sure he'd agree with this, but I doubt he'd 
 personally reach for it.
There was actually a person from an academic OS research team who wrote about adopting Go (probably changing it) for an experimental OS implementation. Someone on the Go team thought it was a good idea, and also that they could do it using garbage collection... so, well... I am not sure they are sober about what Go is suitable for... ;-) (but yes, they are no longer developing it as a system-level language)
Jun 07 2016
parent reply ketmar <ketmar ketmar.no-ip.org> writes:
On Tuesday, 7 June 2016 at 18:00:43 UTC, Ola Fosheim Grøstad 
wrote:
 There was actually a person from an academic OS research team 
 that wrote about adopting Go (probably changing it) for an 
 experimental OS implementation. Someone on the Go team thought 
 it was a good idea and also that they could do it using garbage 
 collection...
side note: garbage-collected OS is possible, and written, and is working. Inferno.
Jun 07 2016
parent reply Ola Fosheim Grøstad writes:
On Tuesday, 7 June 2016 at 18:59:13 UTC, ketmar wrote:
 side note: garbage-collected OS is possible, and written, and 
 is working. Inferno.
Yes, the Go guys were involved. So it is possible, yes, but why would you want to have a garbage-collected kernel? What do you gain from it? Extra trouble is what you gain from it.
Jun 07 2016
next sibling parent reply ketmar <ketmar ketmar.no-ip.org> writes:
On Tuesday, 7 June 2016 at 19:07:03 UTC, Ola Fosheim Grøstad 
wrote:
 On Tuesday, 7 June 2016 at 18:59:13 UTC, ketmar wrote:
 side note: garbage-collected OS is possible, and written, and 
 is working. Inferno.
 Yes, the Go guys were involved. So it is possible, yes, but why would you want to have a garbage-collected kernel? What do you gain from it? Extra trouble is what you gain from it.
'cause it simplifies memory management, on all levels... if we'll switch to microkernel architecture. but this is very off-topic.
Jun 07 2016
parent Ola Fosheim Grøstad writes:
On Tuesday, 7 June 2016 at 19:16:13 UTC, ketmar wrote:
 'cause it simplifies memory management, on all levels... if 
 we'll switch to microkernel architecture. but this is very 
 off-topic.
Inferno was an MMU-less VM like the JVM, but failed miserably to gain traction... In a microkernel design all services are moved to user space and the privileged kernel is minimal, which suggests no benefits from GC. Yeah, OT.

Except it does reflect the Go authors' preference for weird-and-pointless solutions. Just like Go's desperate attempt to avoid adding exceptions: panic-defer-recover, a weird hack that is possible, but not something that would make anyone satisfied. There are more hacks like that in Go, like D's growing slices that suddenly reallocate behind the scenes. They seem to follow the guideline «it is possible, and kind of simplistic, therefore it is a good idea», but they are wrong. Go is a mixed bag.
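That slice behaviour is easy to demonstrate in D; whether an append reallocates depends on the spare capacity of the underlying block, which is why it feels like it happens "behind the scenes" (a sketch — the actual capacities are runtime-dependent):

```d
import std.stdio;

void main()
{
    int[] a = [1, 2, 3];
    int[] b = a;    // b aliases the same memory as a
    b ~= 4;         // may grow in place or reallocate,
                    // depending on the block's capacity
    b[0] = 99;
    // If the append reallocated, a[0] is still 1 and a.ptr != b.ptr;
    // if it grew in place, a[0] is 99 and the slices still alias.
    writeln(a[0]);
    writeln(a.ptr == b.ptr);
}
```

Calling `b.reserve` or checking `b.capacity` before appending makes the behaviour predictable, but nothing in the syntax warns you at the append site.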
Jun 07 2016
prev sibling parent reply Dave <david.dave dave.com> writes:
On Tuesday, 7 June 2016 at 19:07:03 UTC, Ola Fosheim Grøstad 
wrote:
 On Tuesday, 7 June 2016 at 18:59:13 UTC, ketmar wrote:
 side note: garbage-collected OS is possible, and written, and 
 is working. Inferno.
 Yes, the Go guys were involved. So it is possible, yes, but why would you want to have a garbage-collected kernel? What do you gain from it? Extra trouble is what you gain from it.
I also 'never' claimed it wasn't possible, nor that you couldn't do it in Go. However, should you... I'm not that convinced it's a good tool for the job.
Jun 07 2016
parent Ola Fosheim Grøstad writes:
On Tuesday, 7 June 2016 at 19:29:11 UTC, Dave wrote:
 I also 'never' claimed it wasn't possible. Nor that you 
 couldn't do it in Go. However, should you...I'm not that 
 convinced it's a good tool for the job.
That's right. It isn't. I think the ones looking at Go just wanted to use it as a simple starting point and planned to strip out the GC. I don't know why they didn't want to write their OS in C; maybe they wanted to add some semantic analysis that isn't possible in C as part of their research. I just thought the request was very odd: https://groups.google.com/forum/#!topic/golang-nuts/6dI4vIxRgn8/discussion
Jun 07 2016
prev sibling next sibling parent Nick Sabalausky <SeeWebsiteToContactMe semitwist.com> writes:
On 06/06/2016 04:15 AM, Russel Winder via Digitalmars-d wrote:
 3. Have one lightweight D realized cross platform IDE. Qt us probably
 the best widget set to use for this. My model here is LiteIDE which is
 a Qt-based Go IDE realized in C++. It should of course be realized in
 Go, but there are no Qt bindings for Go, only QML ones.
One thing I've been really wanting to do for a while (and even more so after switching my main desktop to Linux) is take Programmer's Notepad 2 (a Windows program, but very lightweight and very nice) and try porting it to D+Qt (or maybe libui if it gets a Qt backend). Although I don't know how realistic Qt on D is right now, and I haven't been able to justify the personal time & energy investment, even as much as I'd like to :( Just can't find a Linux editor I like as much as PN2 :(
Jun 11 2016
prev sibling parent reply Bruno Medeiros <bruno.do.medeiros+dng gmail.com> writes:
On 06/06/2016 09:15, Russel Winder via Digitalmars-d wrote:
 * Tooling is immature and of poorer quality compared to the
 competition.
 This is true. We have too many half-finished attempts at things, basically because everything is volunteer activity, not directly associated with work. Nothing wrong with this per se, but an obvious explanation why it is so. Unless an organization or seven put some work-oriented effort into the tooling, nothing will change. I would suggest three ways forward:

 1. Get effort on the IntelliJ IDEA and CLion plugin. Kingsley has made a start. I suggest using his work as a basis and doing a new version written in Kotlin instead of Java. Kotlin will be easier than Java for D people to work with, and easy for Java people to work with.

 2. Get effort on the DDT Eclipse plugin. Bruno has declared it finished, which is fine, but I would say it should not be treated that way.

 3. Have one lightweight D-realized cross-platform IDE. Qt is probably the best widget set to use for this. My model here is LiteIDE, which is a Qt-based Go IDE realized in C++. It should of course be realized in Go, but there are no Qt bindings for Go, only QML ones.
If anything is to be done about improving the IDE tooling, it should be work on a tool like the D Completion Daemon (DCD) - that is, an IDE-agnostic tool for code completion and other language-analysis functionality (find references, refactor, etc.).

The IDE space is just too fragmented - there are now even more popular IDEs than, say, 5 years ago - like VS Code, Atom, etc. Even Sublime Text is a relatively recent player. As such it's not feasible to focus work on just a few IDEs. You have to make the core of IDE functionality available in an IDE-agnostic tool. VS Code for example has even defined a language-agnostic protocol for such language servers: https://github.com/Microsoft/vscode-languageserver-protocol/ , and there is work in a few other IDEs to adopt that protocol as well and write their own IDE client implementation (Eclipse for example, but it's all very early stages).

In any case, this is all of secondary importance, IMO. The GC issue is much more critical. If people think D has a worthwhile future for building apps in real-world scenarios, then the tooling will get there eventually; it will catch up. But if people think other languages will work much better for their needs (like Rust or others), no amount of exceptional tooling will make a difference.

BTW, "finished" is not the right word to describe the state of DDT; if anything it's now in maintenance mode (actually not that different from what it has been in the last year or so).

-- 
Bruno Medeiros
https://twitter.com/brunodomedeiros
Jun 13 2016
parent Peter Lewis <peter werl.me> writes:
As someone learning D, I thought I would give my insight in how I 
came to D.

My biggest reason for choosing D is the GC. I have come from Java 
and don't quite believe that I'm ready to manage my own memory 
throughout an entire program, but the ability to disconnect from 
the GC is a great way to start. I'm not saying that D should be a 
stopgap language, it has far too much potential for that.

I think that D definitely has many positives but that there is 
still work that needs to go into it. But all languages need work, 
no language is perfect.

I don't have much insight onto how the long term development and 
goals have gone, but I see that D is moving in a good direction 
and hope it will be around for many years to come. I also wish 
that D would have a wider adoption.

As for tools: I agree work needs to be done on them, but that is 
not as important as a well-done, competent compiler set and great 
documentation. D has great docs and a quite competent compiler 
group.
Jun 14 2016
prev sibling next sibling parent Bienlein <jeti789 web.de> writes:
RefCounted does not work for classes, only for structs. Reason 
against adoption, at least for me ;-).
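The limitation is visible right at instantiation: std.typecons.RefCounted compiles for a struct payload but is constrained against classes (a sketch; Payload and C are made-up names):

```d
import std.typecons : RefCounted;
import std.stdio;

struct Payload { int x; }
class C { int x; }

void main()
{
    auto rc = RefCounted!Payload(42);
    writeln(rc.x);   // alias this forwards to the payload
    // RefCounted!C does not compile: the template is constrained
    // to non-class payloads, so class instances are left to the GC.
}
```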
Jun 06 2016
prev sibling next sibling parent reply Andrea Fontana <nospam example.com> writes:
On Monday, 6 June 2016 at 02:20:52 UTC, Walter Bright wrote:
 Andrei posted this on another thread. I felt it deserved its 
 own thread. It's very important.
 -----------------------------------------------------------------------------
 I go to conferences. Train and consult at large companies. 
 Dozens every year, cumulatively thousands of people. I talk 
 about D and ask people what it would take for them to use the 
 language. Invariably I hear a surprisingly small number of 
 reasons:
IMHO the biggest problems are:

- There's no standard "de facto" cross-platform, up-to-date IDE with a just-working debugger. At least dub is doing a good job of unifying the build system. It would be very useful for beginners and tutorials.

- We miss a lot of "standard" libraries. An example: in my company we have a webservice used by my colleagues to upload and store pictures using AWS. It would be nice to convert it to D, but we miss:
a) a standard way to listen on http or https (vibe.d could be ok: it appears overkill to me, and I don't know it. I don't need the whole template framework or MVC etc.)
b) a standard library to manage pictures (we need to do some simple operations - resize, cut, etc.)
c) support for the AWS API in D
d) support for a DB API in D (we need of course to store info in a DB)

The last point is ok in our case, because I wrote a binding for MongoDB. I wrote a simple FastCGI binding to solve the first point. But b) and c) are still waiting for a solution. And, of course, this is a simple example. For a website or a GUI app a lot of other things are missing. So I think the problem is not D itself but the ecosystem.

Andrea
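For point a), vibe.d can in fact be used as a plain HTTP listener without the template/MVC layer - roughly like this with the 0.7-era API (a sketch; the port and response body are arbitrary):

```d
import vibe.d;   // dub dependency: "vibe-d"

shared static this()
{
    auto settings = new HTTPServerSettings;
    settings.port = 8080;
    // Plain request handler: no Diet templates, no MVC, just HTTP.
    listenHTTP(settings, (req, res) {
        res.writeBody("picture service up", "text/plain");
    });
}
// vibe.d supplies main() itself via its VibeDefaultMain version.
```

Whether this counts as "standard" is of course the point being argued: it works, but it isn't bundled with the compiler.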
Jun 06 2016
parent rikki cattermole <rikki cattermole.co.nz> writes:
On 06/06/2016 10:35 PM, Andrea Fontana wrote:
 On Monday, 6 June 2016 at 02:20:52 UTC, Walter Bright wrote:
 Andrei posted this on another thread. I felt it deserved its own
 thread. It's very important.
 -----------------------------------------------------------------------------

 I go to conferences. Train and consult at large companies. Dozens
 every year, cumulatively thousands of people. I talk about D and ask
 people what it would take for them to use the language. Invariably I
 hear a surprisingly small number of reasons:
 IMHO the biggest problems are:

 - There's no standard "de facto" cross-platform, up-to-date IDE with a just-working debugger. At least dub is doing a good job of unifying the build system. It would be very useful for beginners and tutorials.

 - We miss a lot of "standard" libraries. An example: in my company we have a webservice used by my colleagues to upload and store pictures using AWS. It would be nice to convert it to D, but we miss:
 a) a standard way to listen on http or https (vibe.d could be ok: it appears overkill to me, and I don't know it. I don't need the whole template framework or MVC etc.)
 b) a standard library to manage pictures (we need to do some simple operations - resize, cut, etc.)
 c) support for the AWS API in D
 d) support for a DB API in D (we need of course to store info in a DB)

 The last point is ok in our case, because I wrote a binding for MongoDB. I wrote a simple FastCGI binding to solve the first point. But b) and c) are still waiting for a solution. And, of course, this is a simple example. For a website or a GUI app a lot of other things are missing. So I think the problem is not D itself but the ecosystem.

 Andrea
My goal is to get a windowing + image library into Phobos. Do note that this is not a GUI toolkit, but it is definitely a major dependency. I certainly do need help - help is very much welcome[0]. Manu Evans has good work done for color[1], but he is quite busy so it's not done yet.

[0] https://github.com/rikkimax/alphaPhobos
[1] https://github.com/dlang/phobos/pull/2845
Jun 06 2016
prev sibling next sibling parent reply Jacob Carlborg <doob me.com> writes:
On 06/06/16 04:20, Walter Bright wrote:

 * Tooling is immature and of poorer quality compared to the competition.
What is the competition in this case? -- /Jacob Carlborg
Jun 06 2016
parent Observer <here inter.net> writes:
On Monday, 6 June 2016 at 10:35:40 UTC, Jacob Carlborg wrote:
 On 06/06/16 04:20, Walter Bright wrote:

 * Tooling is immature and of poorer quality compared to the 
 competition.
What is the competition in this case?
I write a lot of Perl code, and I swear by perltidy to keep things clean and neat. Aside from running on full files to clean up the mess I often inherit, I have a single-button-press macro in the editor to invoke it on the current block of code. I would want similar full-featured capability and convenience from dfmt (or my own dtidy, if I were up to writing it).
Jun 06 2016
prev sibling next sibling parent maik klein <maikklein googlemail.com> writes:
On Monday, 6 June 2016 at 02:20:52 UTC, Walter Bright wrote:
 Andrei posted this on another thread. I felt it deserved its 
 own thread. It's very important.
 -----------------------------------------------------------------------------
 I go to conferences. Train and consult at large companies. 
 Dozens every year, cumulatively thousands of people. I talk 
 about D and ask people what it would take for them to use the 
 language. Invariably I hear a surprisingly small number of 
 reasons:

 * The garbage collector eliminates probably 60% of potential 
 users right off.

 * Tooling is immature and of poorer quality compared to the 
 competition.

 * Safety has holes and bugs.

 * Hiring people who know D is a problem.

 * Documentation and tutorials are weak.

 * There's no web services framework (by this time many folks 
 know of D, but of those a shockingly small fraction has even 
 heard of vibe.d). I have strongly argued with Sönke to bundle 
 vibe.d with dmd over one year ago, and also in this forum. 
 There wasn't enough interest.

 * (On Windows) if it doesn't have a compelling Visual Studio 
 plugin, it doesn't exist.

 * Let's wait for the "herd effect" (corporate support) to start.

 * Not enough advantages over the competition to make up for the 
 weaknesses above.
I have been a D user for probably 6 months now; I can give you my list from the point of view of a game dev. I concentrate on the negatives.

* The garbage collector eliminates probably 60% of potential users right off.

I think that is mostly true. While it is possible to avoid the GC, it is not easy. The biggest problem is that you would need something like unique_ptr and shared_ptr. While you can create these, almost nothing in Phobos works with non-copyable types. Unique and RefCounted both still use the GC, and Unique can't be used with (any?) container in Phobos currently. You can't even use writeln on it, because almost everything in Phobos "copies" behind your back. This makes avoiding the GC not really practical unless you are recreating a lot of stuff from scratch and making it move-aware. I don't think many want to do this on their own.

* Tooling is immature and of poorer quality compared to the competition.

Honestly I think most languages have very poor tooling, but D's is especially bad. I say that without any anger, nor do I want to point fingers. I basically use D without any tooling right now except GDB, and only use DCD for autocompleting import statements.

* Error messages

I think most error messages are actually not bad, but some, especially for compile-time lambdas that are passed into templates, are especially bad. Most of the time you literally get no message at all if the error is inside that lambda.

* Compiler not written in D

The DMD backend is not open source and is written in C++; ldc is written in C++. I don't think that is a big issue now, but at the time of evaluating D I was mostly wondering why D's compilers weren't implemented in D itself.

* Long-standing issues

There are so many long-standing issues in the bug tracker that it also made me wonder how active the community is in fixing stuff. Most issues that I encountered were Phobos issues that I worked around by creating my own stuff, like a zipped range not being able to mutate its elements. At one point I will try to contribute all this stuff, but my implementations use a lot of things that I have written myself and that are not currently in Phobos.

* Size of community and getting help

I think D's community is very helpful and we have a few experts answering my questions on Stack Overflow very often. But you usually have to wait quite some time for some answers. For example, most questions have already been answered for C++, and if you have an advanced C++ question you usually get an answer in a matter of minutes.

I think D is an amazingly well-designed language, and if we could "fix" tooling (autocompletion, renaming, goto definition, etc.), GC and error messages, I think the other issues would be fixed indirectly by having more users.
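To make the non-copyable-type complaint concrete, here is the kind of wrapper game developers reach for; generic code that copies its arguments rejects it (a sketch - this Unique is a made-up minimal version, not std.typecons.Unique, and it allocates on the GC heap just for brevity):

```d
import std.stdio;

struct Unique(T)
{
    private T* payload;
    this(T v) { payload = new T; *payload = v; }
    @disable this(this);        // no postblit: the type cannot be copied
    ref T get() { return *payload; }
}

void main()
{
    auto u = Unique!int(7);
    writeln(u.get);    // fine: no copy of the wrapper involved
    // auto v = u;     // error: struct Unique is not copyable
    // writeln(u);     // error in most Phobos versions: formatting
    //                 // copies the value it is given
}
```

Until Phobos is move-aware throughout, such a type has to be passed by ref or via an accessor like `get` everywhere, which is the friction being described.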
Jun 06 2016
prev sibling next sibling parent reply Joseph Cassman <jc7919 outlook.com> writes:
On Monday, 6 June 2016 at 02:20:52 UTC, Walter Bright wrote:
 Andrei posted this on another thread. I felt it deserved its 
 own thread. It's very important.
 [...]
 * The garbage collector eliminates probably 60% of potential 
 users right off.
Curious about the progress/plan for RC. From the autodecode thread it sounds like RC-string is still in the works. How's that effort coming along? I am guessing it may be guiding the implementation of an RC design for D. If you and Andrei have a working design, perhaps it could be shared (no pressure, just curious, since it is a tough problem to solve). Perhaps sharing a plan on this front could dispel some of the negativity attached to the GC?

As an aside, I am not opposed to garbage collection. Working with D the little that I have has helped me understand better how to avoid allocating memory and how allocation works with memory-management algorithms. That is great. But I still do not have a clear idea of how to completely avoid the GC and do just manual memory management in D. Perhaps a short tutorial on this could be presented front-and-center on the website.

Joe
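On the "completely avoid the GC" question, the basic recipe in D is core.stdc.stdlib plus std.conv.emplace: allocate raw memory, construct in place, destroy and free explicitly (a minimal sketch; real code would wrap this in a type and handle allocation failure):

```d
import core.stdc.stdlib : malloc, free;
import std.conv : emplace;

struct Point { double x, y; }

void main()
{
    // Allocate outside the GC heap and construct the value in place.
    void* mem = malloc(Point.sizeof);
    assert(mem !is null);
    Point* p = emplace(cast(Point*) mem, 1.0, 2.0);
    assert(p.x == 1.0 && p.y == 2.0);

    destroy(*p);   // run the destructor (a no-op for plain Point)
    free(mem);     // release the raw memory manually
}
```

If the block will hold pointers into GC-managed memory, it also has to be registered with GC.addRange so the collector can see them.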
Jun 06 2016
parent reply Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
On 6/6/16 5:41 PM, Joseph Cassman wrote:
 On Monday, 6 June 2016 at 02:20:52 UTC, Walter Bright wrote:
 Andrei posted this on another thread. I felt it deserved its own
 thread. It's very important.
 [...]
 * The garbage collector eliminates probably 60% of potential users
 right off.
 Curious about the progress/plan for RC. From the autodecode thread it sounds like RC-string is still in the works. How's that effort coming along?
Progress is slow but steady.
 I
 am guessing it may be guiding the implementation of an RC design for D.
 If you and Andrei have a working design perhaps it could be shared (no
 pressure, just curious since it is a tough problem to solve). Perhaps
 sharing a plan on this front could dispel some of the negativity
 attached to the GC?
We will share things as soon as we have something worth sharing. Andrei
Jun 06 2016
next sibling parent Joseph Cassman <jc7919 outlook.com> writes:
On Monday, 6 June 2016 at 22:07:03 UTC, Andrei Alexandrescu wrote:
 On 6/6/16 5:41 PM, Joseph Cassman wrote:
 On Monday, 6 June 2016 at 02:20:52 UTC, Walter Bright wrote:
 [...]
Curious of the progress/plan for RC. From the autodecode thread sounds like RC-string is still in the works. How's that effort coming along?
Progress is slow but steady.
 I
 am guessing it may be guiding the implementation of an RC 
 design for D.
 If you and Andrei have a working design perhaps it could be 
 shared (no
 pressure, just curious since it is a tough problem to solve). 
 Perhaps
 sharing a plan on this front could dispel some of the 
 negativity
 attached to the GC?
We will share things as soon as we have something worth sharing. Andrei
Sounds good. Look forward to it! Joe
Jun 06 2016
prev sibling parent Russel Winder via Digitalmars-d <digitalmars-d puremagic.com> writes:
On Tue, 2016-06-07 at 00:07 +0200, Andrei Alexandrescu via Digitalmars-d wrote:
 […]

 We will share things as soon as we have something worth sharing.

Why not release early, release often?
--
Russel.
prev sibling next sibling parent reply bob belcher <claudiu.garba gmail.com> writes:
On Monday, 6 June 2016 at 02:20:52 UTC, Walter Bright wrote:
 Andrei posted this on another thread. I felt it deserved its 
 own thread. It's very important.
 -----------------------------------------------------------------------------
 I go to conferences. Train and consult at large companies. 
 Dozens every year, cumulatively thousands of people. I talk 
 about D and ask people what it would take for them to use the 
 language. Invariably I hear a surprisingly small number of 
 reasons:

 * The garbage collector eliminates probably 60% of potential 
 users right off.

 * Tooling is immature and of poorer quality compared to the 
 competition.

 * Safety has holes and bugs.

 * Hiring people who know D is a problem.

 * Documentation and tutorials are weak.

 * There's no web services framework (by this time many folks 
 know of D, but of those a shockingly small fraction has even 
 heard of vibe.d). I have strongly argued with Sönke to bundle 
 vibe.d with dmd over one year ago, and also in this forum. 
 There wasn't enough interest.

 * (On Windows) if it doesn't have a compelling Visual Studio 
 plugin, it doesn't exist.

 * Let's wait for the "herd effect" (corporate support) to start.

 * Not enough advantages over the competition to make up for the 
 weaknesses above.
Hello,

why not a poll, and ask the community what they want first?

- tiny web library from vibe.d will not be complicated
- improve documentation, the same
- tour.dlang.io improvements
- make an editor work properly on all platforms YES
- weekly tutorials (that will be 30 until the end of the year)
- more noise on how to use proper dlang: dfmt, dub, dscanner
- make the website much friendlier. Hire some freelancers and make the website nice!

#makedlanggreatagain!
Jun 06 2016
next sibling parent reply jmh530 <john.michael.hall gmail.com> writes:
On Monday, 6 June 2016 at 16:10:29 UTC, bob belcher wrote:
 why not a poll, and ask the community that they want first.
I'm not sure poll-driven development is necessarily a good idea. The people leading development probably already have a good sense of the community's priorities. Also, it's not like all energy will be poured into one or two of them - ideally several could be worked on. I think a better approach is just listing community priorities and making clear who has responsibility/ownership over them.
Jun 06 2016
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 6/6/2016 9:40 AM, jmh530 wrote:
 Also, it's not like all energy will be poured into one or two of
 them - ideally several could be worked on.
Consider the recent gigantic thread on autodecoding. Is anyone working on any PRs for it?
Jun 06 2016
next sibling parent Stefan Koch <uplink.coder googlemail.com> writes:
On Monday, 6 June 2016 at 22:25:34 UTC, Walter Bright wrote:
 On 6/6/2016 9:40 AM, jmh530 wrote:
 Also, it's not like all energy will be poured into one or two 
 of
 them - ideally several could be worked on.
Consider the recent gigantic thread on autodecoding. Is anyone working on any PRs for it?
Adam did I believe.
Jun 06 2016
prev sibling parent reply Jack Stouffer <jack jackstouffer.com> writes:
On Monday, 6 June 2016 at 22:25:34 UTC, Walter Bright wrote:
 Consider the recent gigantic thread on autodecoding. Is anyone 
 working on any PRs for it?
https://github.com/dlang/phobos/pull/4394
Jun 06 2016
parent Walter Bright <newshound2 digitalmars.com> writes:
On 6/6/2016 4:22 PM, Jack Stouffer wrote:
 On Monday, 6 June 2016 at 22:25:34 UTC, Walter Bright wrote:
 Consider the recent gigantic thread on autodecoding. Is anyone working on any
 PRs for it?
https://github.com/dlang/phobos/pull/4394
Wonderful!
Jun 06 2016
prev sibling next sibling parent reply Seb <seb wilzba.ch> writes:
On Monday, 6 June 2016 at 16:10:29 UTC, bob belcher wrote:
 Hello,

 why not a poll, and ask the community that they want first.
http://www.rkursem.com/poll/view.php?id=7f7ebc16c280d0c3c
 - tiny web library from vibe.d will not be complicated
 - improve documentation, the same
I don't think it's bad in general. It just needs more eyes - people willing to submit even small PRs for minor things!
 - tour.dlang.io improvements
We are working on it, everyone is cordially invited to help us: https://github.com/stonemaster/dlang-tour/issues
 - make an editor work properly on all platforms YES
I am fine with Vim, and the amazing part about D is that you don't need a sophisticated IDE that handles the boilerplate code, because Walter et al. have done a great job of removing the boilerplate code in the first place! We should start to teach people that, à la #dlanglovesanyeditor
 - weekly tutorials. (that will be 30 until the end of year)
Adam regularly features them in "This week in D" http://arsdnet.net/this-week-in-d Afaik he lost some of his input sources - we should just feed him with more info or tutorials!
 - more noise on how to use proper dlang. dfmd, dub, dscanner.
I agree that dfmt and dscanner should be included in the installer and distributed to all platforms, see e.g. https://github.com/Hackerpilot/dfix/issues/36
 - make the website much friendlier. Hire some freelancers and 
 make the website nice!
What exactly don't you like about dlang.org? It's beautiful, fast and responsive.
 #makedlanggreatagain!
it's already great!
Jun 06 2016
parent Seb <seb wilzba.ch> writes:
On Monday, 6 June 2016 at 21:36:52 UTC, Seb wrote:
 On Monday, 6 June 2016 at 16:10:29 UTC, bob belcher wrote:
 Hello,

 why not a poll, and ask the community that they want first.
http://www.rkursem.com/poll/view.php?id=7f7ebc16c280d0c3c
Btw current intermediate results:

Immature tooling - 11.4% (18)
Holes in the Standard Library: Images, Containers ... - 10.8% (17)
Weak tutorials and documentation - 9.5% (15)
Garbage collector - 7.6% (12)
Unwillingness to break old code to improve the lan... - 7.6% (12)
Excessive memory usage during compilation - 7.0% (11)
Missing sophisticated DLL support - 5.7% (9)
No (big) corporate support - 4.4% (7)
Unfinished and/or inadequate language features - 4.4% (7)
Small job market - 4.4% (7)
Large size of binaries - 4.4% (7)
Holes and bugs in safety - 3.8% (6)
No compelling debugger support - 3.8% (6)
No web services framework - 3.8% (6)
No compelling plugin for wide-spread IDEs (Intelli... - 3.2% (5)
General unwillingness of Andrei&Walter to listen t... - 2.5% (4)
No compelling Visual Studio plugin - 1.3% (2)
Not enough advantages/features - 1.3% (2)
General unwillingness of Andrei&Walter to listen t... - 1.3% (2)
compile times do not justify coffee breaks - 1.3% (2)
Too fast - 0.6% (1)
Jun 07 2016
prev sibling parent Russel Winder via Digitalmars-d <digitalmars-d puremagic.com> writes:
On Mon, 2016-06-06 at 16:10 +0000, bob belcher via Digitalmars-d wrote:
 […]
 why not a poll, and ask the community that they want first.
 - tiny web library from vibe.d will not be complicated
 - improve documentation, the same
 - tour.dlang.io improvements
 - make an editor work properly on all platforms YES
 - weekly tutorials. (that will be 30 until the end of year)
 - more noise on how to use proper dlang. dfmd, dub, dscanner.
 - make the website much friendlier. Hire some freelancers and make 
 the website nice!
Polls are all very nice as input to creating plans and furthering discussion. But unless someone actually does something so as to create stuff then it is a waste of time.

-- 
Russel.
=============================================================================
Dr Russel Winder      t: +44 20 7585 2200   voip: sip:russel.winder ekiga.net
41 Buckmaster Road    m: +44 7770 465 077   xmpp: russel winder.org.uk
London SW11 1EN, UK   w: www.russel.org.uk  skype: russel_winder
Jun 07 2016
prev sibling next sibling parent reply Karabuta <karabutaworld gmail.com> writes:
On Monday, 6 June 2016 at 02:20:52 UTC, Walter Bright wrote:
 Andrei posted this on another thread. I felt it deserved its 
 own thread. It's very important.
 -----------------------------------------------------------------------------
 I go to conferences. Train and consult at large companies. 
 Dozens every year, cumulatively thousands of people. I talk 
 about D and ask people what it would take for them to use the 
 language. Invariably I hear a surprisingly small number of 
 reasons:

 * The garbage collector eliminates probably 60% of potential 
 users right off.

 * Tooling is immature and of poorer quality compared to the 
 competition.

 * Safety has holes and bugs.

 * Hiring people who know D is a problem.

 * Documentation and tutorials are weak.

 * There's no web services framework (by this time many folks 
 know of D, but of those a shockingly small fraction has even 
 heard of vibe.d). I have strongly argued with Sönke to bundle 
 vibe.d with dmd over one year ago, and also in this forum. 
 There wasn't enough interest.

 * (On Windows) if it doesn't have a compelling Visual Studio 
 plugin, it doesn't exist.

 * Let's wait for the "herd effect" (corporate support) to start.

 * Not enough advantages over the competition to make up for the 
 weaknesses above.
Tutorials, tutorials, tutorials... Search YouTube for D tutorials and you will find hardly any that are helpful. Check Rust tutorials: yeah. JavaScript tutorials: an abundance. Go tutorials: plenty. Java tutorials: yeah. Clearly there seems to be a problem with tutorials.
Jun 06 2016
parent Dave <david.dave dave.com> writes:
On Monday, 6 June 2016 at 18:47:25 UTC, Karabuta wrote:
 On Monday, 6 June 2016 at 02:20:52 UTC, Walter Bright wrote:
 [...]
Tutorials, tutorials, tutorials... Search YouTube for D tutorials and you will find hardly any that are helpful. Check Rust tutorials: yeah. JavaScript tutorials: an abundance. Go tutorials: plenty. Java tutorials: yeah. Clearly there seems to be a problem with tutorials.
For instance, Apple posted a great series of tutorials within a week (days?) of launching Swift.
Jun 06 2016
prev sibling next sibling parent reply qznc <qznc web.de> writes:
On Monday, 6 June 2016 at 02:20:52 UTC, Walter Bright wrote:
 * Hiring people who know D is a problem.

 * Documentation and tutorials are weak.

 * There's no web services framework (by this time many folks 
 know of D, but of those a shockingly small fraction has even 
 heard of vibe.d). I have strongly argued with Sönke to bundle 
 vibe.d with dmd over one year ago, and also in this forum. 
 There wasn't enough interest.
All three of those are affected by documentation and tutorials. From this list, this seems to be the biggest issue. Personally, this is surprising to me. I have read lots of complaints here in the forum, but I never experienced it myself. I think my first contact with D was in 2009 or 2010, and the documentation was certainly not better then. My conclusion is that I'm not normal. This also means I have a hard time seeing where it should be improved.
Jun 06 2016
parent Atila Neves <atila.neves gmail.com> writes:
On Monday, 6 June 2016 at 19:43:13 UTC, qznc wrote:
 On Monday, 6 June 2016 at 02:20:52 UTC, Walter Bright wrote:
 * Hiring people who know D is a problem.

 * Documentation and tutorials are weak.

 * There's no web services framework (by this time many folks 
 know of D, but of those a shockingly small fraction has even 
 heard of vibe.d). I have strongly argued with Sönke to bundle 
 vibe.d with dmd over one year ago, and also in this forum. 
 There wasn't enough interest.
All three of those are affected by documentation and tutorials. From this list, this seems to be the biggest issue. Personally, this is surprising to me. I have read lots of complaints here in the forum, but I never experienced it myself. I think my first contact with D was in 2009 or 2010 and the documentation was certainly not better then. My conclusion is that I'm not normal. This also means I have a hard time to see where it should be improved.
Same here. I'm perfectly fine with the tools and docs as they are now, they wouldn't make my top 20 priorities. It's a common complaint, and therefore this must be important for other people, but I just don't get it. Conversely, Go is usually brought up as an example of a language with good tools. I've written some Go and I also don't understand what people mean when they say that. Why are they good? What's easier? This thread itself has had multiple people say that Phobos lacks proper examples of usage, which is the exact opposite of what I'd say. I'd rather have runnable documentation (unit tests) than static-could-be-lying-cos-it-hasn't-been-updated docs any day of the week and twice on Sundays. I read TDPL cover-to-cover in an afternoon, started writing code and never looked back. Conclusion? I'm weird. Atila
Jun 06 2016
prev sibling next sibling parent reply Satoshi <satoshi rikarin.org> writes:
On Monday, 6 June 2016 at 02:20:52 UTC, Walter Bright wrote:
 Andrei posted this on another thread. I felt it deserved its 
 own thread. It's very important.
 -----------------------------------------------------------------------------
 I go to conferences. Train and consult at large companies. 
 Dozens every year, cumulatively thousands of people. I talk 
 about D and ask people what it would take for them to use the 
 language. Invariably I hear a surprisingly small number of 
 reasons:

 * The garbage collector eliminates probably 60% of potential 
 users right off.

 * Tooling is immature and of poorer quality compared to the 
 competition.

 * Safety has holes and bugs.

 * Hiring people who know D is a problem.

 * Documentation and tutorials are weak.

 * There's no web services framework (by this time many folks 
 know of D, but of those a shockingly small fraction has even 
 heard of vibe.d). I have strongly argued with Sönke to bundle 
 vibe.d with dmd over one year ago, and also in this forum. 
 There wasn't enough interest.

 * (On Windows) if it doesn't have a compelling Visual Studio 
 plugin, it doesn't exist.

 * Let's wait for the "herd effect" (corporate support) to start.

 * Not enough advantages over the competition to make up for the 
 weaknesses above.
(I work as a C/C++ programmer in the military sector, in the area of information and communication infrastructure and electronic warfare.) When I told my boss about D, he had a couple of reasons why not to use it for development of our products.

* Backward compatibility with existing code.
* D is much more complex than C++.
* Not enough tutorials and solved problems in D on Stack Overflow (LOL).
* We have a problem recruiting a good C++ programmer, let alone a good D programmer. (1/100 is good.)
* My boss does not have free time to learn new things...
* Using a GC is strictly prohibited in realtime apps, and D does not have a compiler-supported ARC.
* D without a GC or ARC is not as powerful as it could be.
* More and more people are dumber; we must write our programs for later reuse by whatever junior we employ. C++ is much easier than D in this respect, because you know what every line of your program does. Employing a C++ junior programmer and letting him learn D and then work on our projects is not a good (and cost-effective) idea.
* Not everyone is interested in programming; sometimes people do it just for money.
Jun 06 2016
next sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 6/6/2016 12:44 PM, Satoshi wrote:
 When I told of D to my boss he had a couple of reasons why not to use D for
 development of our products.

 * Backward compatibility with existing code.
 * D is much more complex than C++
More complex? Wow!
 * Not enough tutorials and solved problems in D on stack overflow (LOL)
 * We have problem to recruit a good C++ not a good D programmer. (1/100 is
good)
 * My boss does not have free time to learn new things...
 * Using GC is strictly prohibited in realtime apps. And D does not have an
 compiler supported ARC
 * D without GC or ARC is not powerful as it can be.
 * More and more people are dumber, we must write our programs for later
re-usage
 by any junior what we must employ. C++ is in this way much more easier than D,
 cuz you know what every line of your program do. Employ C++ junior programmer
 and let him to learn D and then work on our projects is not a good (and
 cost-effective) idea.
C++ for junior programmers is easier? Wow!
 * Not everyone is interested in programming, sometimes people are doing it just
 for money.
Jun 06 2016
next sibling parent reply Dave <david.dave dave.com> writes:
On Monday, 6 June 2016 at 20:33:14 UTC, Walter Bright wrote:
 On 6/6/2016 12:44 PM, Satoshi wrote:
 [...]
More complex? Wow!
 [...]
C++ for junior programmers is easier? Wow!
 [...]
Most learn C++ in college. So they have a 'headstart'
Jun 06 2016
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 6/6/2016 1:36 PM, Dave wrote:
 On Monday, 6 June 2016 at 20:33:14 UTC, Walter Bright wrote:
 On 6/6/2016 12:44 PM, Satoshi wrote:
 [...]
More complex? Wow!
 [...]
C++ for junior programmers is easier? Wow!
 [...]
Most learn C++ in college. So they have a 'headstart'
Even acquiring minimal competency in C++ has a very long learning curve, or perhaps I just have a different idea of what minimal competency is. Heck, the C++ code I wrote just a few years ago I regard as crap today. My personal pet peeve is junior programmers who think they get it, find some perverted corner case in the language, and build their entire application around that. They confuse their mastery of the corner case with competence. (This happens with all languages, not just C++, it's just that C++ provides so many opportunities for it!)
Jun 06 2016
next sibling parent David <David.dave dave.com> writes:
On Monday, 6 June 2016 at 21:25:07 UTC, Walter Bright wrote:
 On 6/6/2016 1:36 PM, Dave wrote:
 On Monday, 6 June 2016 at 20:33:14 UTC, Walter Bright wrote:
 On 6/6/2016 12:44 PM, Satoshi wrote:
 [...]
More complex? Wow!
 [...]
C++ for junior programmers is easier? Wow!
 [...]
Most learn C++ in college. So they have a 'headstart'
Even acquiring minimal competency in C++ has a very long learning curve, or perhaps I just have a different idea of what minimal competency is. Heck, the C++ code I wrote just a few years ago I regard as crap today. My personal pet peeve are junior programmers who think they get it, find some perverted corner case in the language, and build their entire application around that. They confuse their mastery of the corner case with competence. (This happens with all languages, not just C++, it's just that C++ provides so many opportunities for it!)
D is a little bit harder than most, considering the lack of tutorials and the chaotic documentation that I have alluded to. Consider this: https://www.youtube.com/watch?v=c8BGQ3CfPBs I saw this series literally the day AFTER they released Swift. I find lots of D talks, but nothing even REMOTELY close to this. I have to fall back on your documentation, which is littered with non-useful examples and broken links. This isn't a knock on YOUR work personally. Every time I learn a detail of this language I am pleased with the solution (maybe with a slight caveat for the GC). It's just how long it takes to figure it out and learn what is needed to do things right.
Jun 06 2016
prev sibling parent Russel Winder via Digitalmars-d <digitalmars-d puremagic.com> writes:
On Mon, 2016-06-06 at 14:25 -0700, Walter Bright via Digitalmars-d
wrote:
[…]
 My personal pet peeve are junior programmers who think they get it,
 find some perverted corner case in the language, and build their
 entire application around that. They confuse their mastery of the
 corner case with competence.
[…] I found some nice features of FORTRAN in 1975 and proceeded to do all my FORTRAN programming in abstract-data-type style (a few years later this might have been labelled "object oriented"). Wholly inappropriate for FORTRAN obviously. I think many of the same techniques are still being used in JavaScript programming.
Jun 07 2016
prev sibling parent Satoshi <satoshi rikarin.org> writes:
On Monday, 6 June 2016 at 20:33:14 UTC, Walter Bright wrote:
 On 6/6/2016 12:44 PM, Satoshi wrote:
 When I told of D to my boss he had a couple of reasons why not 
 to use D for
 development of our products.

 * Backward compatibility with existing code.
 * D is much more complex than C++
More complex? Wow!
 * Not enough tutorials and solved problems in D on stack 
 overflow (LOL)
 * We have problem to recruit a good C++ not a good D 
 programmer. (1/100 is good)
 * My boss does not have free time to learn new things...
 * Using GC is strictly prohibited in realtime apps. And D does 
 not have an
 compiler supported ARC
 * D without GC or ARC is not powerful as it can be.
 * More and more people are dumber, we must write our programs 
 for later re-usage
 by any junior what we must employ. C++ is in this way much 
 more easier than D,
 cuz you know what every line of your program do. Employ C++ 
 junior programmer
 and let him to learn D and then work on our projects is not a 
 good (and
 cost-effective) idea.
C++ for junior programmers is easier? Wow!
 * Not everyone is interested in programming, sometimes people 
 are doing it just
 for money.
Yes. D has many more features than C++ to learn before you can use it the right way. I have never met a programmer who started with D as a first language, but I know many people who started with C++. You cannot be objective in this case; it's just your point of view. I made an OS in C++, then rewrote it in D, and I must say I had many more problems with D than with C++.
Jun 06 2016
prev sibling parent Chris <wendlec tcd.ie> writes:
On Monday, 6 June 2016 at 19:44:12 UTC, Satoshi wrote:

 When I told of D to my boss he had a couple of reasons why not 
 to use D for development of our products.

 * Backward compatibility with existing code.
It's never a good idea to introduce a new language, no matter which, into a long-standing project. If your new code builds on or depends on old code, then stick to the language the project is written in. Introducing a new language would be more for new (ideally independent) projects. So this is _not_ an argument against D, but about switching languages midway in general.
 * D is much more complex than C++
D is not easy, but I'd say it's at least more structured than C++ (less dead weight).
 * Not enough tutorials and solved problems in D on stack 
 overflow (LOL)
 * We have problem to recruit a good C++ not a good D 
 programmer. (1/100 is good)
Again, this is not about the language, but about finding exceptionally good people. Then again, maybe someone who's really into D might be very good, because s/he does it out of genuine interest in programming, not just as a job.
 * My boss does not have free time to learn new things...
Well, I think this is one of the _real_ reasons people don't adopt D. Too much work to learn a new language (any language not just D).
 * Using GC is strictly prohibited in realtime apps. And D does 
 not have an compiler supported ARC
 * D without GC or ARC is not powerful as it can be.
Examples?
 * More and more people are dumber, we must write our programs 
 for later re-usage by any junior what we must employ. C++ is in 
 this way much more easier than D, cuz you know what every line 
 of your program do. Employ C++ junior programmer and let him to 
 learn D and then work on our projects is not a good (and 
 cost-effective) idea.
This kind of thinking is very common and is one of the reasons IT is still stuck with outdated languages. It's this terrible must-work-out-of-the-box thinking, meaning that the new employee has to be able to contribute code from day one; if not, s/he'll not be hired. It's a vicious circle you can't get out of. A has to know B, and we use B, because A knows B. In this way, no progress will ever be made. Why not invest a little time in educating employees? If s/he understands C++, D is not that hard to pick up. A little investment pays off in the future.
 * Not everyone is interested in programming, sometimes people 
 are doing it just for money.
Well, with these people things will never progress. But we shouldn't make our future dependent on the complacent and disinterested. They never innovate. Sorry, but most reasons given by your boss are very general and could be applied to any new technology. It has nothing to do with D.
Jun 07 2016
prev sibling next sibling parent David Soria Parra via Digitalmars-d <digitalmars-d puremagic.com> writes:
On Sun, Jun 05, 2016 at 07:20:52PM -0700, Walter Bright via Digitalmars-d wrote:
 Andrei posted this on another thread. I felt it deserved its own thread.
 It's very important.
 -----------------------------------------------------------------------------
 I go to conferences. Train and consult at large companies. Dozens every
 year, cumulatively thousands of people. I talk about D and ask people what
 it would take for them to use the language. Invariably I hear a surprisingly
 small number of reasons:
From my personal perspective:
- Compiler bugs are a real issue. At one point I had to have a dev-build of DMD around.
 * The garbage collector eliminates probably 60% of potential users right off.
- If you think of it as a C++ competitor, you are right, but I personally think D is more a Python competitor. I think of it as Python, but fast. Or "Go" with better C compatibility. So I consider it more a glue language than something competing at the lowest layer of an architecture (C++), which creates a lot of architectural dependencies when you try to change it.
 
 * There's no web services framework (by this time many folks know of D, but
 of those a shockingly small fraction has even heard of vibe.d). I have
 strongly argued with Sönke to bundle vibe.d with dmd over one year ago, and
 also in this forum. There wasn't enough interest.
- Yes please - Debian/Fedora packages for DMD, LDC, and GDC should be one click away to get stuff installed. - Better libraries, "batteries included". A proper DB interface is missing, for example; besides that, it's already a great tool for gluing stuff together fast. - Other missing things: a nice HTTP API that is not curl, but easier. Protobuf/Thrift integration (Thrift exists). Just my 2 cents,
Jun 06 2016
prev sibling next sibling parent Piece <none mail.com> writes:
On Monday, 6 June 2016 at 02:20:52 UTC, Walter Bright wrote:
 * Let's wait for the "herd effect" (corporate support) to start.
We should make a Slack community and another kind of forum like this one: http://elixirforum.com/. They make it really easy for starters to learn or ask anything related to D.
Jun 06 2016
prev sibling next sibling parent Adam Wilson <flyboynw gmail.com> writes:
Walter Bright wrote:
 Andrei posted this on another thread. I felt it deserved its own thread.
 It's very important.
 -----------------------------------------------------------------------------

 I go to conferences. Train and consult at large companies. Dozens every
 year, cumulatively thousands of people. I talk about D and ask people
 what it would take for them to use the language. Invariably I hear a
 surprisingly small number of reasons:

 * The garbage collector eliminates probably 60% of potential users right
 off.

 * Tooling is immature and of poorer quality compared to the competition.

 * Safety has holes and bugs.

 * Hiring people who know D is a problem.

 * Documentation and tutorials are weak.

 * There's no web services framework (by this time many folks know of D,
 but of those a shockingly small fraction has even heard of vibe.d). I
 have strongly argued with Sönke to bundle vibe.d with dmd over one year
 ago, and also in this forum. There wasn't enough interest.

 * (On Windows) if it doesn't have a compelling Visual Studio plugin, it
 doesn't exist.

 * Let's wait for the "herd effect" (corporate support) to start.

 * Not enough advantages over the competition to make up for the
 weaknesses above.
A note to all on the GC. We have a GSoC project this year for the GC. The student is currently working on improving the GC code to allow for multiple GC implementations and bringing Rainer's precise GC on board. Once he has completed the on-boarding he will work on improving the precision of the precise GC. A precise GC is important as it paves the way for background/generational/concurrent GC algorithms, such as what you find in modern .NET/Java apps. I feel that this will go a *long* way towards solving the majority of the complaints about the GC, with the exception of the "never-GC" crowd.

-- 
// Adam Wilson
// import quiet.dlang.dev;
Jun 06 2016
prev sibling next sibling parent poliklosio <poliklosio happypizza.com> writes:
On Monday, 6 June 2016 at 02:20:52 UTC, Walter Bright wrote:
 Andrei posted this on another thread. I felt it deserved its 
 own thread. It's very important.
 -----------------------------------------------------------------------------
 (...)
 * Documentation and tutorials are weak.
Regarding documentation, I'll just point out something that people seem not to understand here. I think the complaints about bad docs are not really about the accuracy of what is displayed on the screen. I think accuracy is very good. They are more about usefulness and discoverability: the sheer amount of cruft a user has to go through to find out what exists and how to apply it. I'll pick on string documentation as an example. Other concepts will have different issues.

As a concrete example, how good is https://docs.python.org/2/library/stdtypes.html#str.find in comparison with https://dlang.org/phobos/std_string.html#.indexOf ? I think the Python version has several advantages:
- It is more concise due to uniform handling of single chars and strings.
- It is more concise due to not displaying a lot of pointless cruft that the user doesn't care about.
- It is more discoverable due to the name "find".
- It is more discoverable due to being a method of the str type.

Problems with the D version are:
- There is heavy use of UFCS, so things are split into different modules. String is an array, which means that UFCS must be used to extend it.
- The whole deal with dchar and Unicode vocabulary is all over the place, which adds to the amount of information. It would be much better to just assume correct Unicode handling and concentrate on how a function differs from other functions, as this is what the user needs. Yes, details are good, but:
- Strong typing in D has a lot of concepts that a newcomer has to learn before the docs stop causing anxiety.
- The type constraints add visual noise.
- For the type constraints, it is hard to find out the intention behind their existence. It is typically not spelled out in English.

Can something like this be made for the D version? I claim it can. We just have to drop the compulsion to document every detail and generate a simplified version of the docs.

I think such a simplified, Python-like documentation could live as yet another version of the docs that concentrates on usage rather than definition. It could contain links to the full version. :)

And, as a cherry on the cake, what is the first thing the user sees in the D version? "Jump to: 2". It is not readable at all. What is 2? Second what? What is the difference between the two things? I had to squint hard to find out (I wasn't lucky enough to read the first line of the description first when I was finding out the difference; I started by looking at the function signatures). Those things should be described in English somehow.

And regarding cheatsheets, as pointed out before, they don't work as a discoverability aid. You really need 2 or 3 sentences to tell what a function does. A line of cryptic code that presents *one* data sample doesn't work.
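For what it's worth, the uniform single-char/substring handling that the Python docs get credit for is already how std.string.indexOf behaves; the complaint is about how the docs present it, not what the function can do. A minimal sketch:

```d
import std.string : indexOf;

void main()
{
    string s = "hello world";

    // One overload set covers both the single-character and
    // substring cases that Python documents for str.find.
    assert(s.indexOf('o') == 4);      // first 'o'
    assert(s.indexOf("world") == 6);  // substring start
    assert(s.indexOf("xyz") == -1);   // not found: same -1 sentinel as str.find
}
```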
Jun 07 2016
prev sibling next sibling parent Jonathan M Davis via Digitalmars-d <digitalmars-d puremagic.com> writes:
On Tuesday, June 07, 2016 19:04:06 H. S. Teoh via Digitalmars-d wrote:
 I think far too much energy has been spent arguing for a GC-less
 language than actually writing the code that would fix its associated
 performance issues, and my suspicion is that this is mostly caused by
 your typical C/C++ programmer mindset (of which I used to be a part)
 that's always obsessed about memory management, rather than any factual
 basis.
One of the other big issues is that many of the companies which use D and have been active in the community have been doing _very_ high performance stuff where they can't afford for the GC to kick in even occasionally, even for less than a hundred milliseconds (e.g. IIRC, Manu considered 10ms to be too long for what they were doing at Remedy). When you start getting requirements that stringent, you start needing to not use the GC. And when some of the more visible users of D have requirements like that and raise issues about how the GC gets in their way, then it becomes a big deal even if it's not a problem for your average D user. We need to support GC-less code, and we need to avoid using the GC in Phobos where it's not necessary, since it will impede the high performance folks otherwise. And doing some of the stuff like turning off the GC in specific code like you discussed will take care of many of the folks that would be affected by GC issues. But for your average D program, I really think that it's a non-issue, and as the GC situation improves, it will be even less of an issue. So, to some extent, I agree that there's too much of an issue made over the GC - especially by folks who aren't even using the language and pretty much just react negatively to the idea of a GC. But we do still need to do a better job of not requiring the GC when it's not actually needed, as well as better supporting some of the GC-less paradigms like ref-counting.

- Jonathan M Davis
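On the "support GC-less code" point, the @nogc attribute (in the language since 2.066) already gives a static guarantee. A minimal sketch, with a made-up function name for illustration:

```d
// `@nogc` makes any hidden GC allocation inside the function a
// compile-time error, so hot paths can be audited statically.
@nogc nothrow pure int sumArray(const(int)[] a)
{
    int total = 0;
    foreach (x; a)
        total += x;
    return total;
}

void main()
{
    // The array literal allocates here in main, which is fine:
    // only sumArray itself is constrained to be allocation-free.
    assert(sumArray([1, 2, 3]) == 6);
}
```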
Jun 07 2016
prev sibling next sibling parent "H. S. Teoh via Digitalmars-d" <digitalmars-d puremagic.com> writes:
On Tue, Jun 07, 2016 at 07:24:55PM -0700, Jonathan M Davis via Digitalmars-d
wrote:
 On Tuesday, June 07, 2016 19:04:06 H. S. Teoh via Digitalmars-d wrote:
 I think far too much energy has been spent arguing for a GC-less
 language than actually writing the code that would fix its
 associated performance issues, and my suspicion is that this is
 mostly caused by your typical C/C++ programmer mindset (of which I
 used to be a part) that's always obsessed about memory management,
 rather than any factual basis.
One of the other big issues is that many of the companies which use D and have been active in the community have been doing _very_ high performance stuff where they can't afford the GC to kick in even occasionally for less than a hundred milliseconds (e.g. IIRC, Manu considering 10ms to be too long for what they were doing at Rememdy).
I'm pretty sure there are applications for which GC is verboten, such as high-performance game engines and what-not (but even for them, it's just the core code that needs to avoid GC; peripheral things like in-game scripting may actually be using a forced-GC scripting language -- it's just that you want to control certain core operations to be maximally performant). But these are the exceptions rather than the norm. [...]
 We need to support GC-less code, and we need to avoid using the GC in
 Phobos stuff where it's not necessary, since it will impede the high
 performance folks otherwise.  And doing some of the stuff like turning
 off the GC in specific code like you discussed will take care of many
 of the folks that would be affected by GC issues. But for your average
 D program, I really think that it's a non-issue, and as the GC
 situation improves, it will be even less of an issue.
Actually, I'm not sure how much of Phobos actually depends on the GC. Most of the stuff I use frequently is from std.range and std.algorithm, and we've pretty much gotten rid of GC-dependence from most of the stuff there. Phobos modules that are GC-heavy ought to be avoided in high-performance code anyway; the only problematic case I can think of is std.string, which greedily allocates. But we've been making progress on that over the past year or so by turning many of the functions into range-based algorithms rather than string-specific ones. Lately I've been considering in my own code that a lot of things actually don't *need* the GC. Even things like string transformations generally don't need to allocate if they are structured as a range-based pipeline, and the consumer (sink) processes the data as it becomes available instead of storing everything in intermediate buffers. Even if you do need to allocate some intermediate buffers, these are generally well-scoped and can be taken care of with malloc/free and scope(exit). The only time you really need the GC is when passing around large recursive data structures with long lifetimes, like trees and graphs.
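The well-scoped buffer pattern mentioned above (malloc/free plus scope(exit)) can be sketched like this; sumViaScratch is a hypothetical helper, not a Phobos function:

```d
import core.stdc.stdlib : malloc, free;

// Sum the bytes of `data` via a temporary copy in a C-allocated
// scratch buffer; the GC is never involved, and scope(exit)
// guarantees the free on every exit path, including exceptions.
long sumViaScratch(const(ubyte)[] data)
{
    auto p = cast(ubyte*) malloc(data.length);
    if (p is null)
        return 0;
    scope(exit) free(p);

    p[0 .. data.length] = data[];  // copy into the scratch buffer
    long total = 0;
    foreach (b; p[0 .. data.length])
        total += b;
    return total;
}

void main()
{
    assert(sumViaScratch([1, 2, 3]) == 6);
}
```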
 So, to some extent, I agree that there's too much an issue made over
 the GC - especially by folks who aren't even using the language and
 pretty much just react negatively to the idea of the GC.
As Walter has said before: people who aren't already using the language are probably only using GC as an excuse to not use the language. They won't necessarily start using the language after we've bent over backwards to get rid of the GC. We should be paying attention to current users (and yes I know Manu has been clamoring for nogc and he's a current user).
 But we do still need to do a better job of not requiring the GC when
 it's not actually needed as well as better supporting some of the
 GC-less paradigms like ref-counting.
[...] IMO a lot of Phobos modules actually could become much higher-quality if rewritten to be GC-independent. Some of the less-frequently used modules are GC-heavy not out of necessity but because of sloppy code quality, or because they were written prior to major D2 breakthroughs like ranges and templates / CT introspection. T -- The early bird gets the worm. Moral: ewww...
Jun 07 2016
prev sibling next sibling parent Jonathan M Davis via Digitalmars-d <digitalmars-d puremagic.com> writes:
On Tuesday, June 07, 2016 21:00:06 H. S. Teoh via Digitalmars-d wrote:
 Actually, I'm not sure how much of Phobos actually depends on the GC.
 Most of the stuff I use frequently are from std.range and std.algorithm,
 and we've pretty much gotten rid of GC-dependence from most of the stuff
 there.  Phobos modules that are GC-heavy ought to be avoided in
 high-performance code anyway; the only problematic case I can think of
 being std.string which greedily allocates. But we've been making
 progress on that over the past year or so by turning many of the
 functions into range-based algorithms rather than string-specific.
As I understand it, the big problems relate to lambdas and closures and the like. As it stands, it's way too easy to end up allocating when using stuff like std.algorithm even though it doesn't obviously allocate. And on some level at least, I think that that's more an issue of language improvements than library improvements.

But no, explicit use of the GC in Phobos is not particularly heavy. Array-related stuff often allocates, and the few places in Phobos that use classes allocate, but in general, Phobos really doesn't do much in the way of explicit allocations. It's the implicit ones that are the killer.

- Jonathan M Davis
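A hypothetical illustration of the implicit-allocation problem described above (the function names are mine, not from the thread): neither function writes `new` anywhere, but the second one heap-allocates.

```d
import std.algorithm : map;
import std.range : iota;

// Captures nothing, so no closure is involved; an instantiation like
// this can typically even be marked @nogc.
auto doubled()
{
    return iota(10).map!(x => x * 2);
}

// The lambda captures the local `factor`, and the returned range
// outlives the stack frame -- so the compiler heap-allocates the frame
// as a GC closure, even though no `new` appears anywhere in the code.
auto scaled(int factor)
{
    return iota(10).map!(x => x * factor);
}
```

Marking `scaled` as `@nogc` should make the compiler reject the hidden allocation at compile time, which is one way to surface these cases.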
Jun 07 2016
prev sibling parent reply Yura <min_yura mail.ru> writes:
On Monday, 6 June 2016 at 02:20:52 UTC, Walter Bright wrote:
 Andrei posted this on another thread. I felt it deserved its 
 own thread. It's very important.
 -----------------------------------------------------------------------------
 I go to conferences. Train and consult at large companies. 
 Dozens every year, cumulatively thousands of people. I talk 
 about D and ask people what it would take for them to use the 
 language. Invariably I hear a surprisingly small number of 
 reasons:

 * The garbage collector eliminates probably 60% of potential 
 users right off.

 * Tooling is immature and of poorer quality compared to the 
 competition.

 * Safety has holes and bugs.

 * Hiring people who know D is a problem.

 * Documentation and tutorials are weak.

 * There's no web services framework (by this time many folks 
 know of D, but of those a shockingly small fraction has even 
 heard of vibe.d). I have strongly argued with Sönke to bundle 
 vibe.d with dmd over one year ago, and also in this forum. 
 There wasn't enough interest.

 * (On Windows) if it doesn't have a compelling Visual Studio 
 plugin, it doesn't exist.

 * Let's wait for the "herd effect" (corporate support) to start.

 * Not enough advantages over the competition to make up for the 
 weaknesses above.
Hello, I have to stress I am a beginner in programming, mainly interested in number crunching in academia (at least so far). I started to write a small project in D, but had to switch to C for a few reasons:

1) Importance for my CV. I know Python; if I add C as well, it sounds good and could be useful, since the C language is, apart from the other reasons, popular and could help me with a future job, both in academia and industry, since there are many C/C++ projects.

2) The libraries - in the scientific world you can find practically everything which has already been coded in C, => many C libraries. Linking them for use within D code requires some work/effort, and since I am not that confident in my IT skills, I decided that C code calling C libraries is much safer.

3) For C - a lot of tutorials; everything has been explained on Stack Overflow many times; a huge community of people. E.g. you want to use OpenMP or Open MPI - everything is there, explained many times, etc.

4) The C language is well tested and rock solid stable. However, if you encounter a potential bug in D, I am not sure how long it would take to fix.

5) Garbage collector - it will slow my number crunching down.

Please do not take this as criticism. I like the D language; I tried it before C and I find it much, much easier and more user friendly. I feel it is more similar to Python. On the other hand, C++ is too complex for me, and D would be the perfect option for the scientific community if the above points were fixed somehow.

Best luck with your work!
Jun 09 2016
next sibling parent reply deadalnix <deadalnix gmail.com> writes:
On Thursday, 9 June 2016 at 16:44:23 UTC, Yura wrote:
 5) Garbage collector - it will slow my number crunching down.
You are a scientist, so try to measure. GC generally improves throughput at the cost of latency.
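A rough measurement sketch along those lines (the kernel, problem sizes, and iteration counts are placeholders, not from the thread): time the same allocation-heavy loop with the collector enabled and disabled and compare.

```d
import core.memory : GC;
import core.time : MonoTime;
import std.stdio : writefln;

// A toy number-crunching kernel that allocates via the GC.
double[] crunch(size_t n)
{
    auto a = new double[n];
    foreach (i, ref x; a)
        x = i * 0.5;
    return a;
}

void time(string label, void delegate() work)
{
    auto start = MonoTime.currTime;
    work();
    writefln("%s: %s", label, MonoTime.currTime - start);
}

void main()
{
    time("GC enabled ", { foreach (_; 0 .. 1_000) cast(void) crunch(10_000); });

    GC.disable();            // collections deferred: memory traded for latency
    scope(exit) GC.enable();
    time("GC disabled", { foreach (_; 0 .. 1_000) cast(void) crunch(10_000); });
}
```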
Jun 09 2016
parent reply Jack Stouffer <jack jackstouffer.com> writes:
On Thursday, 9 June 2016 at 18:02:05 UTC, deadalnix wrote:
 You are a scientist, so try to measure. GC generally improves 
 throughput at the cost of latency.
As a side note, I always found it funny that programmers call themselves "computer scientists" while many write a lot of their programs without tests.
Jun 09 2016
next sibling parent deadalnix <deadalnix gmail.com> writes:
On Thursday, 9 June 2016 at 20:38:30 UTC, Jack Stouffer wrote:
 On Thursday, 9 June 2016 at 18:02:05 UTC, deadalnix wrote:
 You are a scientist, so try to measure. GC generally improves 
 throughput at the cost of latency.
As a side note, I always found it funny that programmers call themselves "computer scientists" while many write a lot of their programs without tests.
A ton of computer science, even the peer-reviewed kind, does not publish code. It's garbage... And then you look at https://twitter.com/RealPeerReview and conclude maybe it isn't that bad.
Jun 09 2016
prev sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 6/9/2016 1:38 PM, Jack Stouffer wrote:
 On Thursday, 9 June 2016 at 18:02:05 UTC, deadalnix wrote:
 You are a scientist, so try to measure. GC generally improves throughput at
 the cost of latency.
As a side note, I always found it funny that programmers call themselves "computer scientists" while many write a lot of their programs without tests.
A scientist is someone who does research to make discoveries, while an engineer puts scientific discoveries to work. Programming is a mix of engineering and craft. There are people who do research into programming theory, and those are computer scientists. I'm not one of them. Andrei is.
Jun 09 2016
parent reply Jack Stouffer <jack jackstouffer.com> writes:
On Thursday, 9 June 2016 at 21:46:28 UTC, Walter Bright wrote:
 Programming is a mix of engineering and craft. There are people 
 who do research into programming theory, and those are computer 
 scientists. I'm not one of them. Andrei is.
Unfortunately, the term "software engineer" is a LOT less popular than "computer scientist".
Jun 09 2016
parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= writes:
On Thursday, 9 June 2016 at 21:54:05 UTC, Jack Stouffer wrote:
 On Thursday, 9 June 2016 at 21:46:28 UTC, Walter Bright wrote:
 Programming is a mix of engineering and craft. There are 
 people who do research into programming theory, and those are 
 computer scientists. I'm not one of them. Andrei is.
Unfortunately, the term "software engineer" is a LOT less popular than "computer scientist".
How so? I only hear people use the terms "programmer" or "informatics".

Computer Science -> pure math / classification / concepts. Software Engineering -> process of developing software.

At my uni we had the term "informatics", which covers comp.sci., software engineering, requirements analysis, human factors, etc. But it IS possible to be a computer scientist and only know math and no actual programming. Not common, but possible.

But yes, sometimes people who have not studied compsci, but only read stuff on Wikipedia, engage in debates as if they knew the topic, and then the distinction matters. There are things you never have to explain to a person who knows compsci, but you almost always have trouble explaining to people who don't know it (but think they do, because they are programmers and have seen big-oh notation in documentation). It is like a car engineer listening to a driver claiming that you should pour oil on your brakes if they make noise. Or a mathematician having to explain what infinity entails. At some point it is easier to just make a distinction between those who know the fundamental things about how brakes actually are constructed, and those who know how to drive a car.

The core difference as far as debates go is that compsci is mostly objective (proofs) and software engineering is highly subjective (contextual practice). So, if the topic is compsci, then you usually can prove that the other person is wrong in a step-by-step, irrefutable fashion. Which makes a big difference, actually. People who know compsci usually think that is ok, because they like to improve their knowledge and are used to getting things wrong (that's how you learn). People who don't know compsci usually don't like it, because they are being told that they don't know something they like to think they know (but actually don't and probably never will). That's just the truth... ;-)
Jun 09 2016
parent reply Jonathan M Davis via Digitalmars-d <digitalmars-d puremagic.com> writes:
On Friday, June 10, 2016 02:38:28 Ola Fosheim Grstad via Digitalmars-d wrote:
 On Thursday, 9 June 2016 at 21:54:05 UTC, Jack Stouffer wrote:
 On Thursday, 9 June 2016 at 21:46:28 UTC, Walter Bright wrote:
 Programming is a mix of engineering and craft. There are
 people who do research into programming theory, and those are
 computer scientists. I'm not one of them. Andrei is.
Unfortunately, the term "software engineer" is a LOT less popular than "computer scientist".
How so? I only hear people use the term "programmer" or "informatics".
I assume that you're not from the US? In the US at least, professional programmers are almost always referred to officially as software engineers (though they use the term programmers informally all the time), whereas the terms computer science and computer scientist are generally reserved for academics. And while the term informatics (or very similar terms) are used in several other languages/countries, I've never heard the term used in the US except to mention that some other languages/countries use the term informatics for computer science, and I'm willing to bet that relatively few programmers in the US have ever even heard the term informatics. - Jonathan M Davis
Jun 09 2016
parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= writes:
On Friday, 10 June 2016 at 05:37:37 UTC, Jonathan M Davis wrote:
 I assume that you're not from the US?
Right, I am in Oslo (Norway).
 In the US at least, professional programmers are almost always 
 referred to officially as software engineers (though they use 
 the term programmers informally all the time), whereas the 
 terms computer science and computer scientist are generally 
 reserved for academics
Well, I don't know what is "official". Some Norwegian companies seem to use convoluted "international business" terminology for everything, which is just weird and "snobbish". I think "system developer" ("systemutvikler") is the broad general term here.

So you can be an "informatiker" (broad term for your education) with an education in the fields of "computer science" and "software engineering", and work in the role of a "system developer".

If you have a bachelor's that fulfills the requirements for starting on a comp.sci. master's, then you are a computer scientist, but if you have a bachelor's that doesn't, and focuses more on practical computing, then you are a software engineer?

You can have an education that is all about discrete math and still be a computer scientist. You couldn't then say you have a bachelor's in software engineering, as that would be wrong. Likewise, you can have a bachelor's in software engineering and barely know anything about complexity theory.
 And while the term informatics (or very similar terms) are used 
 in several other languages/countries, I've never heard the term 
 used in the US except to mention that some other 
 languages/countries use the term informatics for computer 
 science, and I'm willing to bet that relatively few programmers 
 in the US have ever even heard the term informatics.
Yes, but it makes sense to distinguish between "computer science" (the timeless math and concepts behind computing) and "software engineering" (contemporary development methodology and practice). Although I think an education should cover both. "Informatics" just covers it all (as an educational field).
Jun 10 2016
parent reply Jonathan M Davis via Digitalmars-d <digitalmars-d puremagic.com> writes:
On Friday, June 10, 2016 07:45:03 Ola Fosheim Grstad via Digitalmars-d wrote:
 On Friday, 10 June 2016 at 05:37:37 UTC, Jonathan M Davis wrote:
 I assume that you're not from the US?
Right, I am in Oslo (Norway).
 In the US at least, professional programmers are almost always
 referred to officially as software engineers (though they use
 the term programmers informally all the time), whereas the
 terms computer science and computer scientist are generally
 reserved for academics
Well, I don't know what is "official". Some norwegian companies seem to use convoluted "international business" terminology for everything, which is just weird and "snobbish". I think "system developer" ("systemutvikler") is the broad general term here.
Well, I meant official as in what someone's job title would be. Most developers have titles like "Software Engineer" or "Senior Software Engineer." They're frequently called programmers and/or software developers when not talking about titles.
 So you can be an "informatiker" (broad term for your education)
 with an education in the fields of "computer science" and
 "software engineering", and work in the role of a "system
 developer".

 If you have a bachelor that fulfills the requirements for
 starting on a comp.sci. master then you are a computer scientist,
 but if you have a bachelor that doesn't and focus more on
 practical computing then you are a software engineer?

 You can have an education that is all about discrete math and
 still be a computer scientist. You couldn't then say you have a
 bachelor in software engineering, as it would be wrong. Likewise,
 you can have a bachelor in software engineering and barely know
 anything about complexity theory.
Yeah. Most universities in the US have a Computer Science degree, but some have Software Engineering as a separate degree. My college had Computer Science, Software Engineering, and Computer Engineering, which is atypical. All of them took practical courses, but the SE guys didn't have to take some of the more theoretical stuff and instead took additional classes focused on working on projects in teams and whatnot. And CPE was basically a hybrid between Computer Science and Electrical Engineering with an aim towards embedded systems.

But all of them had more of a practical focus than is the case at many schools, because the school's motto is "learn by doing," and they focus a lot on the practical side of things, whereas many Computer Science programs suffer from not enough practical skills being taught. The college in the city where I lived for my last job is notoriously bad at teaching its Computer Science students much in the way of practical skills.

I think that it's by far the most typical, though, that someone gets a degree in Computer Science (with varying degrees of practical skills involved) and then takes a job as a Software Engineer. And if you got a good focus on practical skills in school in addition to the theory, then you went to a good school, whereas some schools do a very poor job on the practical side of things.
 And while the term informatics (or very similar terms) are used
 in several other languages/countries, I've never heard the term
 used in the US except to mention that some other
 languages/countries use the term informatics for computer
 science, and I'm willing to bet that relatively few programmers
 in the US have ever even heard the term informatics.
Yes, but it makes sense to distinguish between "computer science" (the timeless math and concepts behind computing) and "software engineering" (contemporary development methodology and practice). Although I think an education should cover both. "Informatics" just covers it all (as an educational field).
Agreed. A good education covers both the theoretical stuff and the practical stuff, and some schools do distinguish based on what the focus of their program is, but in the US at least, it's very common to have a Computer Science program and less common to have a Software Engineering program (though I think that Software Engineering degrees are becoming more common). - Jonathan M Davis
Jun 10 2016
parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= writes:
On Friday, 10 June 2016 at 15:27:03 UTC, Jonathan M Davis wrote:
 Most developers have titles like "Software Engineer" or "Senior 
 Software Engineer." They're frequently called programmers and/or 
 software developers when not talking about titles.
Neither academia nor businesses use Computer Scientist as a job title... though?
 Yeah. Most universities in the US have a Computer Science 
 degree, but some have Software Engineering as a separate 
 degree. My college had Computer Science, Software Engineer, and 
 Computer Engineering, which is atypical. All of them took 
 practical courses, but the SE guys didn't have to take some of 
 the more theoretical stuff and instead took additional classes 
 focused on working on projects in teams and whatnot.
Sounds like a good setup. At my uni we could pick freely what courses we wanted each semester, but needed a certain combination of fields and topics to get a specific degree. Like, for entering computer science you would need the most feared topic, Program Verification, taught by Ole-Johan Dahl (co-creator of Simula), who was very formal on the blackboard... I felt it was useless at the time, but there are some insights you have to be force-fed... only to be appreciated later in life. It is useless, but still insightful.

Not sure if those more narrow programs are doing their students a favour, as often the hardest part is getting a good intuition for the basics of a topic, while getting the "expert" knowledge for a specific task is comparatively easier. Especially now that we have the web. So, being "forced" to learn the basics of a wider field is useful.

I'm rather sceptical of choosing C++ as a language, for instance. Seems like you would end up wasting a lot of time on trivia and end up with students hating programming...
Jun 10 2016
parent reply Jonathan M Davis via Digitalmars-d <digitalmars-d puremagic.com> writes:
On Friday, June 10, 2016 17:20:29 Ola Fosheim Grstad via Digitalmars-d wrote:
 On Friday, 10 June 2016 at 15:27:03 UTC, Jonathan M Davis wrote:
 Most developers have titles like "Software Engineer" or "Senior
 Software Engineer." They're frequently called programmers and/or
 software developers when not talking about titles.
Neither academia or businesses use Computer Scientist as a job title... tough?
In academia, you'd be a professor of Computer Science or a professor in the Computer Science department. You wouldn't normally be called a Computer Scientist - certainly not as a job title. And in businesses, the only companies that even _might_ have Computer Scientist as a title would be where it was a very research-heavy job, which would not be at all normal. Research-heavy jobs like that do exist in some large companies, but in the vast majority of cases, programmers are hired as Software Engineers to write code for actual products.
 Yeah. Most universities in the US have a Computer Science
 degree, but some have Software Engineering as a separate
 degree. My college had Computer Science, Software Engineer, and
 Computer Engineering, which is atypical. All of them took
 practical courses, but the SE guys didn't have to take some of
 the more theoretical stuff and instead took additional classes
 focused on working on projects in teams and whatnot.
Sounds like a good setup. At my uni we could pick freely what courses we wanted each semester, but needed a certain combination of fields and topics to get a specific degree. Like for entering computer science you would need the most feared topic, Program Verification taught by Ole-Johan Dahl (co-creator of Simula) who was very formal on the blackboard... I felt it was useless at the time, but there are some insights you have to be force-fed... only to be appreciated later in life. It is useless, but still insightful. Not sure if those more narrow programs are doing their students a favour, as often times the hardest part is getting a good intuition for the basics of a topic, while getting the "expert" knowledge for a specific task is comparatively easier. Especially now we have the web. So, being "forced" to learning the basics of a wider field is useful.
I tend to be of the opinion that the best college program has all of the more theoretical stuff, because it provides a solid base for real-life programming, but project-based, real-world stuff is also very important to help prepare students for actual jobs. Too many college programs do very little to help prepare students for actual programming jobs, but at the same time, I think that skipping a lot of the theoretical stuff will harm students in the long run. But striking a good balance isn't exactly easy, and it's definitely the case that a lot of the more theoretical stuff isn't as obviously useful then as it is later.

In some ways, it would actually be very beneficial to go back to school to study that stuff after having programmed professionally for a while, but that's a pain to pull off time-wise, and the classes aren't really designed with that in mind anyway.
 I'm rather sceptical of choosing C++ as a language for instance.
 Seems like you would end up wasting a lot of time on trivia and
 end up students hating programming...
Choosing the right language for teaching is an endless debate with all kinds of pros and cons. Part of the problem is that good languages for professional work tend to be complicated, with advantages aimed at getting work done rather than teaching, which causes problems for teaching, but picking a language that skips a lot of the complications means that students aren't necessarily well-prepared to deal with the more complicated aspects of a language.

When I started out in school, C++ was the main language, but it quickly changed to Java, which removes certain kinds of problems, but it still has a lot of extra cruft (like forcing everything to be in a class and a ton of attributes forced to be on main), and it doesn't at all prepare students to properly deal with pointers and memory. So, students starting out with Java have some fun problems when they then have to deal with C or C++. Alternatively, there are folks in favor of starting with functional languages, which has certain advantages, but it's so different from how folks would program normally that I'm not sure that it's ultimately a good idea.

All around, it's a difficult problem, and I don't know what the right choice is. In general, there are serious problems with teaching with real-world languages, and teaching with a language that was designed for teaching doesn't necessarily prepare students for the real world. I don't envy teachers having to figure out how to teach basic programming concepts. Regardless, I think that students should at least be exposed to both the imperative/OO languages and the functional languages over the course of school if they're going to be well-rounded programmers. So, a lot of the question is how to best teach the beginning concepts rather than what to use later in the curriculum. To some extent, once you've got the basic stuff down, the language doesn't necessarily matter much.
I did think that it was funny though when in the networking course that I took, the teacher said that we were doing it in C, because if we did it in Java, then there wouldn't be much of a class. We basically implemented TCP on top of UDP as part of the course, whereas in Java, it would be a lot more likely to use RMI or the like and not even deal with sockets, let alone memory. - Jonathan M Davis
Jun 10 2016
parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= writes:
On Friday, 10 June 2016 at 18:59:02 UTC, Jonathan M Davis wrote:
 then as it is later. In some ways, it would actually be very 
 beneficial to actually go back to school to study that stuff 
 after having programmed professionally for a while, but that's 
 a pain to pull off time-wise, and the classes aren't really 
 designed with that in mind anyway.
I am definitely considering it, maybe on some topics that I've read on my own, to fill in the missing bits. Or topics that have had some advances since the 90s. It wouldn't be too much of a pain, as I could get there in 15 minutes on a bike, so it would just be exercise.

I believe lectures at the University of Oslo are open to the public if there is enough space, and the fee at the University of Oslo is about $100/semester, so the threshold for signing up is low. And I don't even have to do the exam, which probably makes it more enjoyable.
 When I started out in school, C++ was the main language, but it 
 quickly changed to Java, which removes all kinds of certain 
 problems, but it still has a lot of extra cruft (like forcing 
 everything to be in a class and a ton of attributes forced to 
 be on main), and it doesn't at all prepare students to properly 
 deal with pointers and memory. So, students starting out with 
 Java have some fun problems when they then have to deal with C 
 or C++. Alternatively, there are folks in favor of starting 
 with functional languages, which has certain advantages, but 
 it's so different from how folks would program normally that 
 I'm not sure that it's ultimately a good idea.
I went to a high school that had programming/digital circuits on the curriculum. In the programming class we started with Logo, which actually is kind of neat, as you are working with very concrete, intuitive geometric problems. Then we went on to Turbo Pascal. It wasn't challenging enough, so the better students went with digital circuits and machine language for the last year.

At the uni, the entry-level courses used Simula, but later courses used C, Beta, Scheme, etc., based on the topic. In several courses I could use whatever language I wanted for projects, as long as the assistant teacher could read it. That made sense, since the grades usually were based on a final exam only.
 world.  I don't envy teachers having to figure out how to teach 
 basic programming concepts.
Yes, some people are simply never going to be able to do programming well... I'm talking about having trouble reading code with basic input-output loops (not even writing it) after having it carefully explained to them many times. With some students you know after the first few sessions that they will never be able to pass the exam. But you cannot tell them to quit... so you have to encourage them, basically encouraging them to strive towards a certain failure. That's frustrating.

Educational institutions should probably have an aptitude test. At the entry level, maybe 30% of the students are never going to be able to become even mediocre programmers.
Jun 11 2016
parent reply Jonathan M Davis via Digitalmars-d <digitalmars-d puremagic.com> writes:
On Saturday, June 11, 2016 08:06:21 Ola Fosheim Grstad via Digitalmars-d 
wrote:
 On Friday, 10 June 2016 at 18:59:02 UTC, Jonathan M Davis wrote:
 then as it is later. In some ways, it would actually be very
 beneficial to actually go back to school to study that stuff
 after having programmed professionally for a while, but that's
 a pain to pull off time-wise, and the classes aren't really
 designed with that in mind anyway.
I am definitively considering it, maybe on some topics that I've read on my own, to fill in the missing bits. Or topics that has had some advances since the 90s. It wouldn't be too much of a pain as I could get there in 15 minutes on a bike, so it would just be exercise. I believe lectures at the University of Oslo are open to the public if there is enough space, and the fee at the University of Oslo is at $100/semester so the threshold for signing up is low. And I don't even have to do the exam, which probably makes it more enjoyable.
LOL. 10x that would be cheap in the US, and I don't think that your average school will let folks sit in on courses (though some will). For your average college in the US, I would only expect anyone to take classes if they're actually working towards a degree, though I'm sure that there are exceptions in some places.
 world.  I don't envy teachers having to figure out how to teach
 basic programming concepts.
Yes, some people are simply never going to be able to do programming well... I'm talking having trouble reading code with basic input - output loops (not even writing it) after having it carefully explained to them many times. With some students you know they will never be able to pass the exam after the first few sessions. But you cannot tell them to quit... so you have to encourage them, basically encouraging them to strive towards a certain failure. That's frustrating. Educational institutions should probably have an aptitude test. At the entry level courses maybe 30% are never going to be able to become even mediocre programmers.
It works better when the school itself is really hard to get into. For instance, my dad went to MIT, and according to him, you usually don't have much of a need for weeder courses there, because it was already hard enough to get into the school that folks who can't hack it won't even be there - and it's an engineering school, so you're typically going to get very smart, well-prepared students who want to do engineering.

Contrast that with schools where almost anyone can get in, and there are always problems with folks entering engineering programs where they can't hack it - especially computer science, since it doesn't obviously involve the really hard math and science that would scare many of those folks away. You frequently either end up with the school trying to weed out a lot of folks up front by having very hard beginning courses, or making its beginning classes easy in an attempt to give everyone a chance, though I think that tends to just delay the inevitable for many students.

I can appreciate wanting to give everyone a chance, and I'm sure that there are folks who have a hard time at first who get it later, but many folks just don't seem to think the right way to be able to program. So, I agree that it would be nice if there were some sort of aptitude test up front that at least indicated whether you were likely to have a very hard time with programming. But I don't think that I've ever heard of any schools doing anything like that (though obviously, some could be, and I just haven't heard of it). And I don't know how you would even go about making such a test, though I expect that there are computer science professors out there who would.

- Jonathan M Davis
Jun 11 2016
parent Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= writes:
On Saturday, 11 June 2016 at 12:19:52 UTC, Jonathan M Davis wrote:
 LOL. 10x that would be cheap in the US, and I don't think that 
 your average school will let folks sit in on courses (though 
 some will). For your average college in the US, I would only 
 expect anyone to take classes if they're actually working 
 towards a degree, though I'm sure that there are exceptions in 
 some places.
I remember that we sometimes had older programmers taking some courses, maybe to get a degree? But not often. The $100/semester fee isn't for tuition though, it is for student activities/facilities, paper copies etc. Tuition is free.
 It works better when the school itself is really hard to get 
 into. For instance, my dad went to MIT, and according to him, 
 you usually don't have much of a need for weeder courses there, 
 because it was already hard enough to get into the school that 
 folks who can't hack it won't even be there - and it's an 
 engineering school, so you're typically going to get very 
 smart, well-prepared students who want to do engineering.
It sorts itself out at higher levels, although I once had a project group at the master level that came to the hallway outside my office to get help, and it eventually dawned on me that none of the three boys knew that they should end statements with ";"... I couldn't help laughing... and I kinda felt bad about it, but they laughed too, so I guess it was ok. I was so totally unprepared for that kind of problem in a master level course. I assume they came from some other field, as it was a web-oriented course. These things with uneven knowledge levels are more of a problem in "hip" project oriented courses, not so much in the courses that are proper compsci and are based on individual final exams. It kinda works out ok as long as students of the same level go on the same group, but it creates a lot of friction if you get a mixed group where the better students feel the other ones are freeloaders.
 You frequently either end up with the school trying to weed out 
 a lot of folks up front by having very hard beginning courses 
 or making their beginning classes easy in an attempt to make it 
 so that everyone has a chance, though I think that tends to 
 just delay the inevitable for many students.
Yep, exactly, but the problem was that the introduction course in programming was required by other fields, such as getting a master in bio-chemistry. That didn't go very well when the lecturer once came up with a "clever exam" where you got stuck if you didn't master the first task. So 40% failed their final exam - 200 angry students? That would've made me feel bad. After that they softened the tasks a bit... making failure harder. In the math department they had a narrower calculus course for those who wanted to specialise in math and a broader, more pragmatic calculus course for those who were going to use math as an applied tool in other fields. Probably a better solution.
 to be able to program. So, I agree that it would be nice if 
 there were some sort of aptitude test up front that at least 
 indicated whether you were likely to have a very hard time with 
 programming. But I don't think that I've ever heard of any 
 schools doing anything like that (though obviously, some could 
 be, and I just haven't heard of it). And I don't know how you 
 would even go about making such a test, though I expect that 
 there are computer science professors out there who would.
Well, I don't know. I guess having average or above in math would work out. Not that you have to know math, but general problem solving. I noticed that people from other fields who were working on their master's picked up programming faster, perhaps because they had acquired skills in structuring and problem solving. Then again, pure theoretical topics kill motivation for me. Like, I could never find any interest in solving tricky integrals analytically, as it seemed like a pointless exercise. And to be honest, I've never found the need to do it. But as you said, some topics become more meaningful later in life, and I'd probably put more energy into topics like program verification and combinatorics today than I did in my youth.
Jun 11 2016
prev sibling next sibling parent Walter Bright <newshound2 digitalmars.com> writes:
On 6/9/2016 9:44 AM, Yura wrote:
 4) The C language is well tested and rock solid stable. However, if you
 encounter a potential bug in D, I am not sure how long would it take to fix.
Thanks for taking the time to post here. Yes, there are bugs in D. Having dealt with buggy compilers from every vendor for decades, I can say from experience that almost every bug has workarounds that will keep the project moving. Also, bugs in D tend to be in the advanced features. But there's a C-ish subset that corresponds nearly 1:1 to C, and if you are content with C style it'll serve you very well.
Jun 09 2016
prev sibling next sibling parent reply ketmar <ketmar ketmar.no-ip.org> writes:
On Thursday, 9 June 2016 at 16:44:23 UTC, Yura wrote:
 4) The C language is well tested and rock solid stable.
loooooool.
Jun 09 2016
parent reply ketmar <ketmar ketmar.no-ip.org> writes:
On Friday, 10 June 2016 at 06:25:55 UTC, ketmar wrote:
 On Thursday, 9 June 2016 at 16:44:23 UTC, Yura wrote:
 4) The C language is well tested and rock solid stable.
loooooool.
ah, sorry, let me explain myself. i hit ALOT of gcc bugs in my life. and i never fixed any of them myself, 'cause gcc is very huge, and i don't feel that it's worth it. even with bugs that blocked my work i used workarounds and hand-written asm. i hit some bugs in D too. curiously, the number is comparable with gcc (maybe this tells us something, and maybe it is just a coincidence). some of them i was able not only to report, but to fix. usually, the official fix comes later, and is better than my hacky patch, but hey... the DMD compiler is less complex than gcc, *alot* less complex. now, why i loled: i thought about what you wrote, and found that gcc bugs block my work/pet projects more often than dmd bugs. it may sound strange, but a dmd bug is usually either fixed fast enough (and building new dmd+druntime+phobos from sources takes less than two minutes on my PC), or i know a workaround. contrary to that, even if a gcc bug is fixed fast (and usually they aren't), rebuilding gcc takes 20‒30 minutes. and most of the time i can't even understand what the fix does, due to the huge and unknown codebase. so no, the C language is not "rock solid stable". it just has alot less features, and if you use the same feature set in DMD, you will hardly hit a bug either.
Jun 09 2016
parent reply Yura <min_yura mail.ru> writes:
On Friday, 10 June 2016 at 06:37:08 UTC, ketmar wrote:
 On Friday, 10 June 2016 at 06:25:55 UTC, ketmar wrote:
 On Thursday, 9 June 2016 at 16:44:23 UTC, Yura wrote:
 4) The C language is well tested and rock solid stable.
loooooool.
ah, sorry, let me explain myself. i hit ALOT of gcc bugs in my life. and i never fixed any of them myself, 'cause gcc is very huge, and i don't feel that it's worth it. even with bugs that blocked my work i used workarounds and hand-written asm. i hit some bugs in D too. curiously, the number is comparable with gcc (maybe this tells us something, and maybe it is just a coincidence). some of them i was able not only to report, but to fix. usually, the official fix comes later, and is better than my hacky patch, but hey... the DMD compiler is less complex than gcc, *alot* less complex. now, why i loled: i thought about what you wrote, and found that gcc bugs block my work/pet projects more often than dmd bugs. it may sound strange, but a dmd bug is usually either fixed fast enough (and building new dmd+druntime+phobos from sources takes less than two minutes on my PC), or i know a workaround. contrary to that, even if a gcc bug is fixed fast (and usually they aren't), rebuilding gcc takes 20‒30 minutes. and most of the time i can't even understand what the fix does, due to the huge and unknown codebase. so no, the C language is not "rock solid stable". it just has alot less features, and if you use the same feature set in DMD, you will hardly hit a bug either.
Thanks all of you for the constructive discussion. I am a chemist studying programming by myself, since I need it to explore chemistry at the molecular level and to check my chemical ideas. A professional software engineer (SE)/computer scientist (CS) would do the job much faster, and the resulting code would look much better - but to do that you would need to explain all the chemistry behind it to the SE/CS, which would be tricky, and the most important (drastic) approximations come exactly from the chemistry - not from the particular language. I hope you excuse me for the long introduction. In my area there are 3 dominant languages: Python, Fortran and C/C++. The first is easy (many libraries are available), but can be very slow. Fortran is used by the old professors and has tons of libraries, but is not used outside of academia - and this stops young people from studying it, because everyone may quit academia at some point. C and C++ perhaps dominate the field, but the second one especially is very tough for people coming from a non-IT background. I believe D has very high chances to be competitive in this field. Regarding the GC, I will try to check it when I have some time, but since most of the code is written in non-GC languages (https://en.wikipedia.org/wiki/List_of_quantum_chemistry_and_solid-state_physics_software), something tells me that the GC would slow you down, because in this field people are fighting for every percent of performance (many simulations run for weeks). Another point is linking the libraries: with my poor background in IT, even linking a C library to C code is a challenge, and something tells me that linking it to D would be even more challenging => making linking of C libraries as easy as possible (Fortran or C++ libraries are not as important) and having an active support forum where you can ask for help in linking libraries to your D code would be helpful.
Once people have this support, they will be confident to start their new projects in D. Then, at conferences/in scientific papers people can advertise and promote the language, which is more user friendly than C, Fortran and C++ and is modern enough. However, perhaps, enthusiasm alone is not sufficient for all this; you need sponsors... I agree the C subset for sure guarantees (almost) absence of bugs. Another thing where I make almost all my mistakes: array bounds violations/accessing memory which was already freed => the result is undefined behavior. If I remember correctly, D is better in that respect? Anyway, a super easy way to use all C libraries + super active support would clearly help to increase D popularity/adoption. All the other points I raised are perhaps not that important.
Jun 10 2016
next sibling parent ketmar <ketmar ketmar.no-ip.org> writes:
On Friday, 10 June 2016 at 08:29:50 UTC, Yura wrote:
 something tells me that GC would slow you down
 because in this field people are fighting for every
 percent of the performance (many simulations are
 running for weeks).
yes, GC will have an effect for such things. but then, people fighting for performance will resort to "dirty tricks" in any language, i believe, and in D it is not that hard to avoid the GC (i'm doing something like that in my videogame and sound engines). but the great thing — at least for me — is that you can easily prototype your solution with the GC, debug it, and then gradually replace data structures with other data structures that do manual memory management. this way you can debug your data structures and algorithms independently. of course, it is possible in C and C++ too, but i found that with D it is somewhat easier.
 Another point is to link the libraries, with my poor
 background in IT, even to link the C library to the
 C code is a challenge, and something tells me that
 to link it to D would be even more challenging
i found that D is better here too, it just requires some... discipline. thanks to UFCS, one can write D code that *looks* like OOP, but actually only uses structs and free functions. this way you can use `extern(C)` on your public API, and still use `myvar.myfunc()` syntax in D, but have a C-ready `myfunc(&myvar)` syntax to export. also, with some CTFE magic one can even generate such wrappers at compile time.
 Another things where I do almost all my mistakes is: array 
 bounding/calling the memory which was free => the result is 
 undefined behavior. If I remember correctly the D is better 
 with that respect?
yes. with the GC, you won't hit a "use after free" error. and D does bounds checking on array access (this can be turned off once you've debugged your code), so you will get a stack trace on a RangeError.
 Anyway, super easy way to use all C libraries + super active 
 support would clearly help to increase D popularity/adoption.
and as for C libraries... i'm actively using alot of C libs in my D projects, and most of the time i do wrappers for them with sed. ;-) i.e. i just take a C header file, run some regular expression replaces on it, and then do some manual cleanup. that is, even without specialized tools one is able to produce a wrapper with a very small time and effort investment. tbh, i even translated some C libraries to D mostly with sed. for example, enet and libotr.
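the sed workflow above can be sketched in a few lines — here as Python rather than sed, since regexes are the common ground. this is purely illustrative, not the actual scripts: the type table, the two prototypes, and the function names are made up for the example.

```python
import re

# Toy sketch of a regex-based C-header-to-D pass (illustrative only).
# Map a few C type spellings to their D equivalents.
C_TO_D_TYPES = [
    ("unsigned int", "uint"),
    ("unsigned char", "ubyte"),
]

def c_proto_to_d(line):
    line = line.replace("(void)", "()")  # D spells an empty parameter list ()
    for ctype, dtype in C_TO_D_TYPES:
        line = re.sub(r"\b" + re.escape(ctype) + r"\b", dtype, line)
    return line.strip()

# Hypothetical two-line header, wrapped in an extern(C) block for D.
header = [
    "int enet_initialize(void);",
    "unsigned int enet_time_get(void);",
]
print("extern(C) {")
for proto in header:
    print("    " + c_proto_to_d(proto))
print("}")
```

real headers need manual cleanup afterwards (macros, nested structs, callbacks), which is exactly the "some manual cleanup" step above.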
Jun 10 2016
prev sibling parent Kagamin <spam here.lot> writes:
On Friday, 10 June 2016 at 08:29:50 UTC, Yura wrote:
 Another things where I do almost all my mistakes is: array 
 bounding/calling the memory which was free => the result is 
 undefined behavior. If I remember correctly the D is better 
 with that respect?
I think slices and automatic bounds checking are the most important improvement of D over C. An important concern in simulations (mentioned by someone using D in bioinformatics) is correctness: if you have a bug, the program is not guaranteed to crash, it can just give an incorrect result.
 Anyway, super easy way to use all C libraries + super active 
 support would clearly help to increase D popularity/adoption. 
 All other point I raised are perhaps not that important.
I'm not as optimistic about binding C libraries as others :) I think it requires skills.
Jun 10 2016
prev sibling parent reply Chris <wendlec tcd.ie> writes:
On Thursday, 9 June 2016 at 16:44:23 UTC, Yura wrote:
 Hello,

 I have to stress I am beginner in programming, mainly 
 interested in number crunching in academia (at least so far). I 
 started to write a small project in D, but had to switch to C 
 for few reasons:

 1) Importance for my CV. I know Python; if I add C as well - it 
 sounds good, and could be useful since the C language, apart from 
 the other reasons, is popular and could help me with a future 
 job, both in academia and industry, since there are many C/C++ 
 projects.
I wouldn't worry too much about the CV. Maybe in a year or two companies will demand you know Ruby or Javascript :) Once you know how to program it's not so hard to pick up other languages. The basic concepts of handling/mapping data are always the same (hash tables, arrays ...)
 2) The libraries - in the scientific world you can find 
 practically everything which has already been coded in C, => 
 many C libraries. To link it to be used within D code requires 
 some work/efforts, and since I am not that confident in my IT 
 skills, I decided that C code calling C libraries is much safer.
It's a piece of cake most of the time, it's really easy.[1] When I first tried it, I couldn't believe that it was _that_ simple. I use some C code too in one of my projects, and it's easy to either call individual C functions or, if need be, turn a C header file into a D interface file with only a few changes (they will look almost identical). There is absolutely no reason not to use D because of existing C libraries. The seamless interfacing to C is one of D's major advantages. In this way you can write elegant D code and still take advantage of the plethora of C libraries that are available. Here are examples of ported C libraries that have D interfaces: https://github.com/D-Programming-Deimos?page=1 If you need help, you can always ask on the forum. Nobody will bite you :-) There are even D frameworks that enable you to interact with Python [2] and Lua. I'd say give it a shot. [1] http://dlang.org/spec/interfaceToC.html [2] https://github.com/ariovistus/pyd Other links: http://dlang.org/ctod.html http://dlang.org/articles.html
 3) For C - a lot of tutorials, everything has been explained at 
 stack overflow many times, huge community of people. E.g. you 
 want to use OpenMP, Open MPI - everything is there, explained 
 many times, etc.

 4) The C language is well tested and rock solid stable. 
 However, if you encounter a potential bug in D, I am not sure 
 how long would it take to fix.

 5) Garbage collector - it will slow my number crunching down.
You should test it first, gut feeling is not proof. If it really does slow down your code, write GC-free code, as ketmar suggested. You can always ask on the .learn forum.
 Please, do not take it as criticism, I like the D language, I 
 tried it before C and I find it much, much easier and more user 
 friendly. I feel it is more similar to Python. On the other hand 
 C++ is too complex for me, and D would be the perfect option for 
 the scientific community, if the above points were fixed 
 somehow..

 Best luck with your work!
Jun 10 2016
parent reply Chris <wendlec tcd.ie> writes:
And also, always use ldc or gdc once your project is ready to 
go. dmd-generated code is slower, as dmd is only the reference compiler.

http://dlang.org/download.html
Jun 10 2016
parent reply ketmar <ketmar ketmar.no-ip.org> writes:
On Friday, 10 June 2016 at 09:35:32 UTC, Chris wrote:
 And also, always use ldc or gdc, once your project is ready to 
 go. dmd generated code is slow as it is only a reference 
 compiler.
but not *dog* *slow*. ;-) if one doesn't really need to squeeze every possible cycle out of the CPU, DMD-generated code is more than acceptable. i, for example, managed to make my scripting language almost as efficient with DMD -O as Lua with gcc -O3. ;-)
Jun 10 2016
parent reply Chris <wendlec tcd.ie> writes:
On Friday, 10 June 2016 at 09:46:11 UTC, ketmar wrote:
 On Friday, 10 June 2016 at 09:35:32 UTC, Chris wrote:
 And also, always use ldc or gdc, once your project is ready to 
 go. dmd generated code is slow as it is only a reference 
 compiler.
but not *dog* *slow*. ;-) if one doesn't really need to squeeze every possible cycle out of the CPU, DMD-generated code is more than acceptable. i, for example, managed to make my scripting language almost as efficient with DMD -O as Lua with gcc -O3. ;-)
No, not slow slow. Even in debug mode it produces acceptable results for testing. A scripting language based on D? Is it open source? I've always dreamt of something like that.
Jun 10 2016
parent reply ketmar <ketmar ketmar.no-ip.org> writes:
On Friday, 10 June 2016 at 10:03:29 UTC, Chris wrote:
 A scripting language based on D? Is it open source? I've always 
 dreamt of something like that.
i have several of them, actually. yet they are very specialized — i.e. designed to support my game engines, not to be "wide-area scripting languages". publicly accessible are: DACS[1] — statically typed, with modules and UFCS, and JIT compiler built with LibJIT[2], optionally supports internal stack-based VM. GML[3] — part of my WIP Game Maker 8 emulator, register-based 3-op VM, no JIT. [1] http://repo.or.cz/dacs.git [2] https://www.gnu.org/software/libjit/ [3] http://repo.or.cz/gaemu.git
Jun 10 2016
parent reply Chris <wendlec tcd.ie> writes:
On Friday, 10 June 2016 at 10:21:07 UTC, ketmar wrote:
 i have several of them, actually. yet they are very specialized 
 — i.e. designed to support my game engines, not to be 
 "wide-area scripting languages".

 publicly accessible are:

 DACS[1] — statically typed, with modules and UFCS, and JIT 
 compiler built with LibJIT[2], optionally supports internal 
 stack-based VM.

 GML[3] — part of my WIP Game Maker 8 emulator, register-based 
 3-op VM, no JIT.


 [1] http://repo.or.cz/dacs.git
 [2] https://www.gnu.org/software/libjit/
 [3] http://repo.or.cz/gaemu.git
Cool. I'd love to see `DScript` one day - and replace JS once and for all ... well ... just day dreaming ...
Jun 10 2016
next sibling parent reply ketmar <ketmar ketmar.no-ip.org> writes:
On Friday, 10 June 2016 at 10:55:42 UTC, Chris wrote:
 Cool. I'd love to see `DScript` one day - and replace JS once 
 and for all ... well ... just day dreaming ...
Adam has a scripting engine in his arsd repo[1]. it's not a speed demon, but it is much more like JS, it even has exceptions, and it is very easy to integrate with D code. you may take a look at it too. afair, you only need the jsvar.d and script.d modules to use it. [1] https://github.com/adamdruppe/arsd
Jun 10 2016
parent reply Chris <wendlec tcd.ie> writes:
On Friday, 10 June 2016 at 11:05:27 UTC, ketmar wrote:
 Adam has scripting engine in his arsd repo[1]. it's not a speed 
 demon, but it is much more like JS, it even has exceptions, and 
 it is very easy to integrate it with D code. you may take a 
 look at it too. afair, you only need jsvar.d and script.d 
 modules to use it.

 [1] https://github.com/adamdruppe/arsd
Nice. Anyone interested in turning this into "DScript"? Having a scripting language powered by D would also boost D's prestige. And it would be easy to write modules in pure D. DScript could then be used by scientists, game developers etc., à la Python, and if speed is crucial, just write a module in pure D.
Jun 10 2016
next sibling parent reply ketmar <ketmar ketmar.no-ip.org> writes:
On Friday, 10 June 2016 at 11:11:49 UTC, Chris wrote:
 Nice. Anyone interested in turning this into "DScript"? Having 
 a scripting language powered by D would also boost D's 
 prestige. And it would be easy to write modules in pure D.

 DScript could then be used by scientists, game developers etc. 
 à la Python and if speed is crucial, just write a module in 
 pure D.
it looks like you just described a project you can start yourself! ;-)
Jun 10 2016
parent reply Chris <wendlec tcd.ie> writes:
On Friday, 10 June 2016 at 11:20:35 UTC, ketmar wrote:
 On Friday, 10 June 2016 at 11:11:49 UTC, Chris wrote:
 Nice. Anyone interested in turning this into "DScript"? Having 
 a scripting language powered by D would also boost D's 
 prestige. And it would be easy to write modules in pure D.

 DScript could then be used by scientists, game developers etc. 
 à la Python and if speed is crucial, just write a module in 
 pure D.
it looks like you just described a project you can start yourself! ;-)
I have neither the time nor the required expertise to write a scripting language from scratch ;) You on the other hand ... :-)
Jun 10 2016
next sibling parent ketmar <ketmar ketmar.no-ip.org> writes:
On Friday, 10 June 2016 at 13:55:28 UTC, Chris wrote:
 I have neither time nor the required expertise to write a 
 scripting language from scratch ;) You on the other hand ... :-)
so just use Adam's code as the starting point then! ;-)
Jun 10 2016
prev sibling parent reply Adam D. Ruppe <destructionator gmail.com> writes:
On Friday, 10 June 2016 at 13:55:28 UTC, Chris wrote:
 I have neither time nor the required expertise to write a 
 scripting language from scratch ;) You on the other hand ... :-)
Oh, it isn't that hard, at least to do a quick basic thing. You might want to start with the various math parsers. A postfix one is relatively easy:

2 3 +

break it up into tokens, read them in, build a syntax tree (well, for the postfix thing, it is probably a stack!). That approach will even work for a Lisp-like language! Then try an infix one. You'd use the same tokenizer, but the parser is different... and this kind of parser gets you started for a typical script language.

2 + 3

The way this works is you read the token, peek ahead, create an object and build a tree. You'd use different functions for different contexts. So it might start with readExpression, which calls readFactor. Then readFactor might call readAddend... If you look at the D grammar: http://dlang.org/spec/grammar.html you'll find the various terms are defined as WhateverExpressions, often recursively... you can write the parser to follow that basically the same way! You end up with one of these: https://en.wikipedia.org/wiki/Recursive_descent_parser Once you get addition and multiplication working with correct order of operations, you just kinda start adding stuff! Make a function call and an if/loop statement and boom, you have a simple programming language. After that, it is basically just adding more token recognition and AST classes. To make an interpreter, you can just add a method to the AST objects that interprets and gives a result.... boom, it works! Compiling is basically the same idea, just spitting out something other than the result of the expression - spitting out code that gives you the result. That gets harder once you get into all the fancy techniques, but it builds on the same foundation. It is a good thing to know how to do, at least the basic parts!
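The two steps above can be sketched compactly — here in Python rather than D, purely for brevity. The names read_factor/read_term/read_expression are illustrative stand-ins for the readFactor/readExpression idea; the nesting of the three functions is what encodes operator precedence.

```python
# Postfix "2 3 +": push numbers, pop two when an operator appears.
def eval_postfix(src):
    stack = []
    for tok in src.split():
        if tok in "+-*/":
            b, a = stack.pop(), stack.pop()
            stack.append({"+": a + b, "-": a - b,
                          "*": a * b, "/": a / b}[tok])
        else:
            stack.append(float(tok))
    return stack.pop()

# Infix "2 + 3 * 4": recursive descent with two levels, so that
# "*" and "/" bind tighter than "+" and "-".
def eval_infix(src):
    toks = src.split()
    pos = 0
    def peek():
        return toks[pos] if pos < len(toks) else None
    def read_factor():                 # numbers only, in this sketch
        nonlocal pos
        tok = toks[pos]; pos += 1
        return float(tok)
    def read_term():                   # factor (("*"|"/") factor)*
        nonlocal pos
        v = read_factor()
        while peek() in ("*", "/"):
            op = toks[pos]; pos += 1
            rhs = read_factor()
            v = v * rhs if op == "*" else v / rhs
        return v
    def read_expression():             # term (("+"|"-") term)*
        nonlocal pos
        v = read_term()
        while peek() in ("+", "-"):
            op = toks[pos]; pos += 1
            rhs = read_term()
            v = v + rhs if op == "+" else v - rhs
        return v
    return read_expression()
```

eval_postfix("2 3 +") gives 5.0, and eval_infix("2 + 3 * 4") gives 14.0 — the precedence falls out of which function loops over which, exactly as in the grammar-shaped parser described above.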
Jun 10 2016
next sibling parent reply jmh530 <john.michael.hall gmail.com> writes:
On Friday, 10 June 2016 at 14:25:37 UTC, Adam D. Ruppe wrote:
 To make an interpreter, you can just add a method to the AST 
 objects that interprets and gives a result.... boom, it works!
Given my limited knowledge of compilers/interpreters, this part kind of seems like magic. Let's say you have something simple like 1+2, you would build an AST that looks something like

    +
   / \
  1   2

What would be the next step?
Jun 10 2016
next sibling parent reply ketmar <ketmar ketmar.no-ip.org> writes:
On Friday, 10 June 2016 at 15:03:30 UTC, jmh530 wrote:
 On Friday, 10 June 2016 at 14:25:37 UTC, Adam D. Ruppe wrote:
 To make an interpreter, you can just add a method to the AST 
 objects that interprets and gives a result.... boom, it works!
 Given my limited knowledge of compilers/interpreters, this part kind of seems like magic. Let's say you have something simple like 1+2, you would build an AST that looks something like

    +
   / \
  1   2

 What would be the next step?
1. this is heavily OT. ;-) 2. you may take a look at my gml engine. it has clearly separated language parser and AST builder (gaem.parser), and AST->VM compiler (gaem.runner/compiler.d).
Jun 10 2016
parent reply jmh530 <john.michael.hall gmail.com> writes:
On Friday, 10 June 2016 at 15:14:02 UTC, ketmar wrote:
 1. this is heavily OT. ;-)
I didn't forget to mark it! :-)
 2. you may take a look at my gml engine. it has clearly 
 separated language parser and AST builder (gaem.parser), and 
 AST->VM compiler (gaem.runner/compiler.d).
I couldn't for the life of me find a link to this.
Jun 10 2016
next sibling parent reply Wyatt <wyatt.epp gmail.com> writes:
On Friday, 10 June 2016 at 15:35:32 UTC, jmh530 wrote:
 On Friday, 10 June 2016 at 15:14:02 UTC, ketmar wrote:
 2. you may take a look at my gml engine. it has clearly 
 separated language parser and AST builder (gaem.parser), and 
 AST->VM compiler (gaem.runner/compiler.d).
I couldn't for the life of me find a link to this.
He linked it earlier: http://repo.or.cz/gaemu.git/tree/HEAD:/gaem/parser -Wyatt
Jun 10 2016
parent jmh530 <john.michael.hall gmail.com> writes:
On Friday, 10 June 2016 at 15:40:45 UTC, Wyatt wrote:
 He linked it earlier:
 http://repo.or.cz/gaemu.git/tree/HEAD:/gaem/parser

 -Wyatt
Cheers.
Jun 10 2016
prev sibling next sibling parent Adam D. Ruppe <destructionator gmail.com> writes:
On Friday, 10 June 2016 at 15:35:32 UTC, jmh530 wrote:
 On Friday, 10 June 2016 at 15:14:02 UTC, ketmar wrote:
 1. this is heavily OT. ;-)
I didn't forget to mark it! :-)
Well, yeah, we should start a new thread, but compiler programming isn't really off topic at all on a forum where we talk about programming a compiler! Knowing the idea helps reading dmd source too.
Jun 10 2016
prev sibling parent ketmar <ketmar ketmar.no-ip.org> writes:
On Friday, 10 June 2016 at 15:35:32 UTC, jmh530 wrote:
 On Friday, 10 June 2016 at 15:14:02 UTC, ketmar wrote:
 1. this is heavily OT. ;-)
I didn't forget to mark it! :-)
 2. you may take a look at my gml engine. it has clearly 
 separated language parser and AST builder (gaem.parser), and 
 AST->VM compiler (gaem.runner/compiler.d).
I couldn't for the life of me find a link to this.
sorry. Wyatt kindly fixed that for me. ;-) also, you can replace the code generation in the compiler with direct execution, and you will get an AST-based interpreter. just create a new AA with local variables on NodeFCall (this will serve as the "stack frame"), and make `compileExpr` return a value instead of a stack slot index. then it is as easy as: (NodeBinarySub n) => compileExpr(n.el)-compileExpr(n.er); and so on. also, fix `compileVarAccess` and `compileVarStore` to use your "stack frame" AA. this whole business is not hard at all. i'd say it is easier than many other programming tasks.
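a rough Python analogue of the idea above — evaluation functions that return values directly, with a plain dict standing in for the "stack frame" AA. the tuple node shapes here are made up for the sketch; they are not gaemu's actual node types.

```python
# Function-style AST evaluation: one function dispatching on node kind,
# returning values directly instead of stack slot indices.
def eval_node(node, frame):
    kind = node[0]
    if kind == "num":                        # ("num", 3)
        return node[1]
    if kind == "var":                        # ("var", "x") -- frame lookup
        return frame[node[1]]
    if kind == "sub":                        # ("sub", lhs, rhs)
        return eval_node(node[1], frame) - eval_node(node[2], frame)
    if kind == "assign":                     # ("assign", "x", expr)
        frame[node[1]] = eval_node(node[2], frame)
        return frame[node[1]]
    raise ValueError("unknown node: " + kind)

frame = {}                                   # the "stack frame" dict
eval_node(("assign", "x", ("num", 10)), frame)
print(eval_node(("sub", ("var", "x"), ("num", 3)), frame))  # prints 7
```

the binary-subtraction case is the direct analogue of the (NodeBinarySub n) => compileExpr(n.el)-compileExpr(n.er) lambda mentioned above.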
Jun 11 2016
prev sibling next sibling parent Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= writes:
On Friday, 10 June 2016 at 15:03:30 UTC, jmh530 wrote:
 On Friday, 10 June 2016 at 14:25:37 UTC, Adam D. Ruppe wrote:
 To make an interpreter, you can just add a method to the AST 
 objects that interprets and gives a result.... boom, it works!
Given my limited knowledge of compilers/interpreters, this part kind of seems like magic. Let's say you have something simple like 1+2, you would build an AST that looks something like

    +
   / \
  1   2

What would be the next step?
https://en.wikipedia.org/wiki/Tree_traversal#Post-order
Jun 10 2016
prev sibling parent reply Adam D. Ruppe <destructionator gmail.com> writes:
On Friday, 10 June 2016 at 15:03:30 UTC, jmh530 wrote:
 Let's say you have something simple like 1+2, you would build 
 an AST that looks something like
    +
   / \
  1   2
 What would be the next step?
https://github.com/adamdruppe/arsd/blob/master/script.d#L879 The function is pretty simple: interpret the left hand side (here it is 1, so it yields int(1)), interpret the right hand side (yields int(2)), combine them with the operator ("+") and return the result. Notice that interpreting the left hand side is a recursive call to the interpret function - it can be arbitrarily complex, and the recursion will go all the way down, then all the way back up to get the value.
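The same shape as script.d's interpret method can be sketched in Python rather than D (the class and method names here are illustrative, not the actual ones from script.d): each node class carries an interpret() that recurses into its children, then combines the results with its operator.

```python
# Method-per-node AST interpretation, as described above.
class Num:
    def __init__(self, value):
        self.value = value
    def interpret(self):
        return self.value          # leaves just yield their value

class BinOp:
    OPS = {"+": lambda a, b: a + b,
           "-": lambda a, b: a - b,
           "*": lambda a, b: a * b}
    def __init__(self, op, lhs, rhs):
        self.op, self.lhs, self.rhs = op, lhs, rhs
    def interpret(self):
        # recursive calls: each side may itself be an arbitrarily deep tree
        return self.OPS[self.op](self.lhs.interpret(), self.rhs.interpret())

# the tree for 1 + 2 * 3: "+" at the root, the "*" subtree on the right
tree = BinOp("+", Num(1), BinOp("*", Num(2), Num(3)))
print(tree.interpret())            # prints 7
```

the recursion goes all the way down to the Num leaves and back up, exactly as described for the 1 and 2 case above.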
Jun 10 2016
parent reply jmh530 <john.michael.hall gmail.com> writes:
On Friday, 10 June 2016 at 17:02:06 UTC, Adam D. Ruppe wrote:
 https://github.com/adamdruppe/arsd/blob/master/script.d#L879

 The function is pretty simple: interpret the left hand side 
 (here it is 1, so it yields int(1)), interpret the right hand 
 side (yields int(2)), combine them with the operator ("+") and 
 return the result.

 Notice that interpreting the left hand side is a recursive call 
 to the interpret function - it can be arbitrarily complex, and 
 the recursion will go all the way down, then all the way back 
 up to get the value.
Ah, it produces mixin("1+2") and evaluates that. What's the PrototypeObject sc I see everywhere doing?
Jun 10 2016
parent reply Adam D. Ruppe <destructionator gmail.com> writes:
On Friday, 10 June 2016 at 17:36:02 UTC, jmh530 wrote:
 Ah, it produces mixin("1+2") and evaluates that.
Sort of, 1 and 2 are both runtime variables there so it really produces mixin("a+b") after setting a = 1 and b = 2 above. But yeah, that's the idea - it just hoists that mixin to runtime for scripting.
 What's the PrototypeObject sc I see everywhere doing?
sc is short for "scope" - it refers to the chain of local variables. So consider the following:

var a = 1;
function foo() {
    var b = 4;
    var c = a + b;
}
foo();

So as this is interpreted by my thing, it is like it runs the following D code:

// this happens as part of the interpreter initialization
auto globalScope = new PrototypeObject(globals_the_d_programmer_passed);

// now it starts running
auto currentScope = globalScope;

// var a = 1;
currentScope["a"] = 1; // it holds the local variables!

call_function("foo", []); // script foo();

// when we enter the new scope inside the function, it
// creates a new object, based on the old one
currentScope = new PrototypeObject(currentScope);

// var b = 4;
currentScope["b"] = 4; // remember the scope changed above, so this is local to the function now

// var c = a + b;
currentScope["c"] = currentScope["a"] + currentScope["b"];

/* OK, so at this point, we get two variables: a and b. That's what the sc object in the script.d source represents - what I called currentScope here. The opIndex does two things: check the current scope for the name. If it is there, return that value. If not, go up to the parent scope and look there. Continue until you find it, or if it isn't there, throw a "no such variable" exception. It'd find b in the current scope and return the function-local variable, and it'd find a in the parent scope. */

// and now that the function is over, we pop off the local
// variables from the function by setting the current back
// to the old parent
currentScope = currentScope.parent;

So yeah, the sc in the interpreter is just the currentScope from the pseudocode, a chain of AAs holding the local variables.
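A minimal Python sketch of that scope-chain behaviour (a stand-in for PrototypeObject, not its real implementation): lookup walks the parent links, assignment always defines locally.

```python
# Scope chain: each scope is a dict plus a parent link.
class Scope:
    def __init__(self, parent=None):
        self.vars = {}
        self.parent = parent
    def __getitem__(self, name):
        scope = self
        while scope is not None:           # walk up the chain...
            if name in scope.vars:
                return scope.vars[name]
            scope = scope.parent
        raise NameError("no such variable: " + name)
    def __setitem__(self, name, value):
        self.vars[name] = value            # ...but always define locally

global_scope = Scope()
global_scope["a"] = 1                      # var a = 1;

# entering foo() pushes a child scope...
func_scope = Scope(parent=global_scope)
func_scope["b"] = 4                        # var b = 4;
func_scope["c"] = func_scope["a"] + func_scope["b"]  # a found in parent
print(func_scope["c"])                     # prints 5

# ...and leaving the function pops back to the parent
current = func_scope.parent
```

the __getitem__ walk is the analogue of the opIndex lookup described above: local variables shadow outer ones, and a name missing from the whole chain raises the "no such variable" error.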
Jun 10 2016
parent jmh530 <john.michael.hall gmail.com> writes:
On Friday, 10 June 2016 at 17:59:15 UTC, Adam D. Ruppe wrote:
 What's the PrototypeObject sc I see everywhere doing?
sc is short for "scope" - it refers to the chain of local variables. So consider the following: [snip]
Cool. Thanks.
Jun 10 2016
prev sibling parent reply Chris <wendlec tcd.ie> writes:
On Friday, 10 June 2016 at 14:25:37 UTC, Adam D. Ruppe wrote:
 On Friday, 10 June 2016 at 13:55:28 UTC, Chris wrote:
 I have neither time nor the required expertise to write a 
 scripting language from scratch ;) You on the other hand ... 
 :-)
Oh, it isn't that hard, at least to do a quick basic thing. You might want to start with the various math parsers. A postfix one is relatively easy:

    2 3 +

Break it up into tokens, read them in, build a syntax tree (well, for the postfix thing, it is probably a stack!). That approach will even work for a Lisp-like language!

Then try an infix one:

    2 + 3

You'd use the same tokenizer, but the parser is different... and this kind of parser gets you started for a typical script language. The way this works is you read the token, peek ahead, create an object and build a tree. You'd use different functions for different contexts. So it might start with readExpression, which calls readFactor. Then readFactor might call readAddend...

If you look at the D grammar: http://dlang.org/spec/grammar.html you'll find the various terms are defined as WhateverExpressions, often recursively... you can write the parser to follow that basically the same way! You end up with one of these: https://en.wikipedia.org/wiki/Recursive_descent_parser

Once you get addition and multiplication working with correct order of operations, you just kinda start adding stuff! Make a function call and an if/loop statement and boom, you have a simple programming language. After that, it is basically just adding more token recognition and AST classes.

To make an interpreter, you can just add a method to the AST objects that interprets and gives a result.... boom, it works! Compiling is basically the same idea, just spitting out something other than the result of the expression - spitting out code that gives you the result. That gets harder as you get into all the fancy techniques, but it builds on the same foundation. It is a good thing to know how to do, at least the basic parts!
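The stack-based postfix idea above can be sketched in a dozen lines - Python here, purely for illustration (eval_postfix is a made-up name, not part of any library discussed in this thread):

```python
# Minimal postfix (RPN) evaluator: split into tokens, push numbers,
# and apply each operator to the top two stack entries - no syntax
# tree needed, the stack plays that role.
def eval_postfix(src):
    stack = []
    for tok in src.split():
        if tok in ("+", "-", "*", "/"):
            b = stack.pop()   # right operand was pushed last
            a = stack.pop()
            stack.append({"+": a + b, "-": a - b,
                          "*": a * b, "/": a / b}[tok])
        else:
            stack.append(float(tok))
    return stack.pop()

print(eval_postfix("2 3 +"))      # prints 5.0
print(eval_postfix("2 3 4 * +"))  # prints 14.0
```

An infix parser would reuse the same tokenizer but replace the stack loop with mutually recursive readExpression/readFactor style functions, as described above.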
I agree. It's good to know how to do it. But don't get me started, else I'll have a new obsession ... ;) But seriously, would you like to work on something like DScript? Your scripting language already fulfills things that were on my wishlist (easy D interop).
Jun 10 2016
next sibling parent Adam D. Ruppe <destructionator gmail.com> writes:
On Friday, 10 June 2016 at 15:29:01 UTC, Chris wrote:
 But seriously, would you like to work on something like 
 DScript. Your scripting language already fulfills things that 
 were on my wishlist (easy D interop).
I'm best when working on something that I'm actively using, since then I find the bugs myself and have some personal thing to gain (a lot of times, I can take time out of the day job to do it then, since it contributes directly back to it)... and alas, right now, I'm not actively using it. I do have some plans for it, but no set schedule.

That said though, it is already fairly useful... if you guys use it and report bugs/feature requests, I can probably respond to that.
Jun 10 2016
prev sibling parent reply ketmar <ketmar ketmar.no-ip.org> writes:
On Friday, 10 June 2016 at 15:29:01 UTC, Chris wrote:
 DScript. Your scripting language already fulfills things that 
 were on my wishlist (easy D interop).
hey, both GML and DACS have some of that too! ;-)

    VM["print"] = (string s) { writeln(s); };
    VM["add"] = (int a, int b) => a+b;

wow, now we can print things from script, and (for some unknown reason) use a function to add two numbers. with DACS you still have to declare function prototypes, but with GML it will "just work" (including conversion from internal nan-boxed doubles to strings and ints, and back).

GML is somewhat limited, but can be extended, and it is almost as fast as Lua. DACS, with its JIT, is sometimes even comparable to gcc -O2 (but only sometimes, lol; and LuaJIT still makes it look like a snail).
Jun 11 2016
parent Chris <wendlec tcd.ie> writes:
On Saturday, 11 June 2016 at 12:44:54 UTC, ketmar wrote:
 On Friday, 10 June 2016 at 15:29:01 UTC, Chris wrote:
 DScript. Your scripting language already fulfills things that 
 were on my wishlist (easy D interop).
hey, both GML and DACS has some of that too! ;-) [snip]
Cool. Maybe we should continue this here http://forum.dlang.org/thread/njfdch$2627$1 digitalmars.com
Jun 11 2016
prev sibling parent reply Adam D. Ruppe <destructionator gmail.com> writes:
On Friday, 10 June 2016 at 11:11:49 UTC, Chris wrote:
 Nice. Anyone interested in turning this into "DScript"? Having 
 a scripting language powered by D would also boost D's 
 prestige. And it would be easy to write modules in pure D.
So I use my toy thing from time to time and it is pretty cool. My favorite part (and the reason I made it) is the easy interop with D itself: you basically just assign your D functions and values to a global object and get them out via the same var type - in D!

    var globals = var.emptyObject;
    globals.write = &(writeln!string);
    var result = interpret(your_script_string, globals);
    writeln(result);

where the script string looks like:

    write("Hi!");
    10 + 3 * 4;

and it will work:

    $ dmd test.d arsd/script.d arsd/jsvar.d
    $ ./test
    Hi!
    22

So really easy to use in all three ways: D interop is easy, the script lang itself is easy, and compiling it is easy - it is just the two modules.

I've even done a bit of GUI and DOM wrapping with it and my simpledisplay.d and dom.d in toys... a surprisingly big chunk of things just work.

The downside though is that it is something I basically slapped together in a weekend to support var.eval on a lark... it has a few weird bugs, and the code is no longer beautiful as it has grown organically. It isn't very fast either - it is a simple AST interpreter that makes liberal use of new objects in D (even a null object is allocated on the D side) - but it is REALLY easy to use, and coupled with native D functions for real work, it might just be interesting enough to play with.

tho idk if I'd recommend it for serious work. Just use D for that!
Jun 10 2016
parent reply Wyatt <wyatt.epp gmail.com> writes:
On Friday, 10 June 2016 at 14:34:53 UTC, Adam D. Ruppe wrote:
 var globals = var.emptyObject;
 globals.write = &(writeln!string);
Woah, I never thought of using it like that!
 The downside though is that it is something I basically slapped 
 together in a weekend to support var.eval on a lark... it has a 
 few weird bugs
And yet it somehow seems to _work_ better than std.variant. :/
 tho idk if I'd recommend it for serious work. Just use D for 
 that!
I use it in my toml parser and it's very pleasant. I figured it probably isn't very fast, but it works and that's important. -Wyatt
Jun 10 2016
parent reply Adam D. Ruppe <destructionator gmail.com> writes:
On Friday, 10 June 2016 at 15:30:19 UTC, Wyatt wrote:
 globals.write = &(writeln!string);
Woah, I never thought of using it like that!
Yeah, since writeln is a template, you need to instantiate it with some arguments. This isn't the ideal way to do it in the script btw, it'd be like:

    globals.write = (var this_, var[] args) {
        writeln(args);
    };

or something like that - this signature gives a variadic function to the scripting language, whereas writeln!string just has a single argument. But, of course, the script language cannot instantiate D templates itself, so you gotta do that before assigning it to the runtime var. But from there, the jsvar.d reflection code will handle the rest of the var<->string conversions.
 I use it in my toml parser and it's very pleasant.  I figured 
 it probably isn't very fast, but it works and that's important.
kewl! Did you use the script component for interpreting or just the jsvar part for the data?
Jun 10 2016
parent Wyatt <wyatt.epp gmail.com> writes:
On Friday, 10 June 2016 at 17:10:39 UTC, Adam D. Ruppe wrote:
 On Friday, 10 June 2016 at 15:30:19 UTC, Wyatt wrote:
 I use it in my toml parser and it's very pleasant.  I figured 
 it probably isn't very fast, but it works and that's important.
kewl! Did you use the script component for interpreting or just the jsvar part for the data?
Just the jsvar; I've got a Pegged grammar mixin doing most of the heavy lifting. IIRC, you actually wrote it around the time I was fighting a losing battle with nested Variant arrays and it saved me a lot of headache. -Wyatt
Jun 10 2016
prev sibling parent Walter Bright <newshound2 digitalmars.com> writes:
On 6/10/2016 3:55 AM, Chris wrote:
 Cool. I'd love to see `DScript` one day - and replace JS once and for all ...
 well ... just day dreaming ...
Started a new thread for that.
Jun 10 2016