
digitalmars.D - C's Biggest Mistake on Hacker News

reply Walter Bright <newshound2 digitalmars.com> writes:
My article C's Biggest Mistake is on the front page of https://news.ycombinator.com !
Jul 21 2018
next sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 7/21/2018 11:53 PM, Walter Bright wrote:
 My article C's Biggest Mistake is on the front page of https://news.ycombinator.com !
Direct link: https://news.ycombinator.com/item?id=17585357
Jul 22 2018
parent reply Jim Balter <Jim Balter.name> writes:
On Sunday, 22 July 2018 at 20:10:27 UTC, Walter Bright wrote:
 On 7/21/2018 11:53 PM, Walter Bright wrote:
 My article C's Biggest Mistake is on the front page of https://news.ycombinator.com !
Direct link: https://news.ycombinator.com/item?id=17585357
The responses are not encouraging, but I suppose they're useful for sociologists studying fallacious thinking.
Jul 23 2018
next sibling parent reply Joakim <dlang joakim.fea.st> writes:
On Monday, 23 July 2018 at 11:51:54 UTC, Jim Balter wrote:
 On Sunday, 22 July 2018 at 20:10:27 UTC, Walter Bright wrote:
 On 7/21/2018 11:53 PM, Walter Bright wrote:
 My article C's Biggest Mistake is on the front page of https://news.ycombinator.com !
Direct link: https://news.ycombinator.com/item?id=17585357
The responses are not encouraging, but I suppose they're useful for sociologists studying fallacious thinking.
In my experience, people never learn, even from the blatantly obvious, _particularly_ when they're invested in the outdated. What inevitably happens is the new tech gets good enough to put them out of business, then they finally pick it up or retire. Until most system software is written in D/Go/Rust/Swift/Zig/etc., they will keep mouthing platitudes about how C is here to stay.
Jul 23 2018
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 7/23/2018 5:39 AM, Joakim wrote:
 In my experience, people never learn, even from the blatantly obvious, _particularly_ when they're invested in the outdated. What inevitably happens is the new tech gets good enough to put them out of business, then they finally pick it up or retire. Until most system software is written in D/Go/Rust/Swift/Zig/etc., they will keep mouthing platitudes about how C is here to stay.
I've predicted before that what will kill C is managers and customers requiring memory safety, because unsafeness costs them millions. The "just hire better programmers" approach will never work.
Jul 23 2018
next sibling parent reply RhyS <sale rhysoft.com> writes:
On Monday, 23 July 2018 at 22:45:15 UTC, Walter Bright wrote:
 I've predicted before that what will kill C is managers and customers requiring memory safety, because unsafeness costs them millions. The "just hire better programmers" approach will never work.
I have yet to see a company, Walter, where the higher-ups take the correct actions to resolve issues. Customers do not understand **** about programming. You're lucky if most clients can even get a proper specification formulated for what they want. If clients were that knowledgeable, we would not need to constantly deal with issues where clients had things in their heads different from what they told us / envisioned.

And most managers are not going to rock the boat and stick their necks out. Not when they can simply blame issues on programmer incompetence or "it has always been like that with programming languages". I have yet to see managers really take responsibility beyond guiding the projects so they do not get fired, hoping to rake in bonuses. Issues can always be blamed on the tools or programmers. Sorry, but that response is so naive, Walter, that it surprises me. It's like wanting a unicorn.

And frankly, good luck convincing any company to convert millions of lines of C code into D code. Not when managers hear about some new language or framework or whatever that is the chizz. They'd rather keep running the old code and move to something new. D is not something new, it's not the chizz; it's the same issue that D has struggled with for years. It's the same reason why that topic derailed so fast. You want to see something fun? Mention PHP on HackerNews/Reddit and you see the exact same trolling. People would rather push their new favorite language, be it Go, Rust, ..., than pick D.

The response at my work when I made some stuff in D: "Why did you not use Go?" Because the managers knew Go from the hype. They know Google is behind it. And some of our colleagues in sister companies already used Go. And that is all it takes.

I am sorry to say, but to succeed as a language beyond being a small or hobby language it takes being established already or having a big name to hype behind your "product". Anything beyond that will have topics derail and, frankly, it's more negative than positive. And D has too much old baggage. It's the same reason why PHP, despite being a good language (for what it is), still keeps getting the exact same crud on forums.

If I am honest, DasBetterC is for me an unreliable D product, because using specific D library functions can pull in the GC. Or DasBetterC needs to be sold as C only, ever; forget about everything else that is D (library, packages, ...). Until everything is 100% GC-free, you're going to run into this. And even when it's 100% GC-free, people have long memories. It's always a struggle swimming up a river.
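To make that pain point concrete, here is a minimal sketch of what the -betterC switch does and does not accept (assuming DMD; the commented-out lines are the ones that fail to build):

// build with: dmd -betterC app.d
extern(C) void main()
{
    import core.stdc.stdio : printf;

    int[4] onStack = [1, 2, 3, 4];   // fine: no D runtime needed
    printf("%d\n", onStack[0]);

    // int[] onHeap = new int[4];    // error: `new` needs the GC runtime
    // int[] grow; grow ~= 1;        // error: appending allocates with the GC
}

Any library function that allocates behaves like those commented-out lines, which is exactly the reliability problem described above.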
Jul 23 2018
next sibling parent reply JohnB <jb jb.com> writes:
On Tuesday, 24 July 2018 at 00:41:54 UTC, RhyS wrote:
 Customers do not understand **** about programming. You're lucky if most clients can even get a proper specification formulated for what they want. If clients were that knowledgeable, we would not need to constantly deal with issues where clients had things in their heads different from what they told us / envisioned.
I think that what Walter meant was that when customers have this problem where their data is leaking (and they are perhaps losing money), they will ask why, and an alternative to avoid this in the future will be to rely on a language that tends to follow the safety aspect. John B.
Jul 23 2018
parent Paolo Invernizzi <paolo.invernizzi gmail.com> writes:
On Tuesday, 24 July 2018 at 01:31:13 UTC, JohnB wrote:
 On Tuesday, 24 July 2018 at 00:41:54 UTC, RhyS wrote:
 Customers do not understand **** about programming. You're lucky if most clients can even get a proper specification formulated for what they want. If clients were that knowledgeable, we would not need to constantly deal with issues where clients had things in their heads different from what they told us / envisioned.
 I think that what Walter meant was that when customers have this problem where their data is leaking (and they are perhaps losing money), they will ask why, and an alternative to avoid this in the future will be to rely on a language that tends to follow the safety aspect. John B.
Having read the whole forum thread about the necessity to terminate an application on bugs, or to catch-them-and-keep-going, I'm not sure the programmer folk aren't also to blame for those problems. I also agree with the discussion about management, but sometimes little companies have enlightened technical management that can dare to rely on innovative languages to do their job and compete (against big companies with not-so-enlightened management). So D definitely has some value for them. /Paolo
Jul 24 2018
prev sibling next sibling parent reply Paolo Invernizzi <paolo.invernizzi gmail.com> writes:
On Tuesday, 24 July 2018 at 00:41:54 UTC, RhyS wrote:

 I am sorry to say, but to succeed as a language beyond being a small or hobby language it takes being established already or having a big name to hype behind your "product". Anything beyond that will have topics derail and, frankly, it's more negative than positive.
If I'm not wrong, Python grew really slowly and quietly until its recent big success in the scientific field. I also remember when Ruby was released, and what a killer application like Ruby on Rails did for its success. /Paolo
Jul 24 2018
parent Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Tuesday, 24 July 2018 at 07:19:21 UTC, Paolo Invernizzi wrote:
 If I'm not wrong, Python grew really slowly and quietly until its recent big success in the scientific field.
Python replaced Perl and, to some extent, PHP... Two very questionable languages, which was a good starting point. C++98 was a similarly questionable starting point, but nobody got the timing right when that window was open. The success of Python in scientific computing is to a large extent related to C, though.
Jul 25 2018
prev sibling next sibling parent reply Ecstatic Coder <ecstatic.coder gmail.com> writes:
On Tuesday, 24 July 2018 at 00:41:54 UTC, RhyS wrote:
 On Monday, 23 July 2018 at 22:45:15 UTC, Walter Bright wrote:
 I've predicted before that what will kill C is managers and customers requiring memory safety, because unsafeness costs them millions. The "just hire better programmers" approach will never work.
 I have yet to see a company, Walter, where the higher-ups take the correct actions to resolve issues. Customers do not understand **** about programming. You're lucky if most clients can even get a proper specification formulated for what they want. If clients were that knowledgeable, we would not need to constantly deal with issues where clients had things in their heads different from what they told us / envisioned.

 And most managers are not going to rock the boat and stick their necks out. Not when they can simply blame issues on programmer incompetence or "it has always been like that with programming languages". I have yet to see managers really take responsibility beyond guiding the projects so they do not get fired, hoping to rake in bonuses. Issues can always be blamed on the tools or programmers. Sorry, but that response is so naive, Walter, that it surprises me. It's like wanting a unicorn.

 And frankly, good luck convincing any company to convert millions of lines of C code into D code. Not when managers hear about some new language or framework or whatever that is the chizz. They'd rather keep running the old code and move to something new. D is not something new, it's not the chizz; it's the same issue that D has struggled with for years. It's the same reason why that topic derailed so fast. You want to see something fun? Mention PHP on HackerNews/Reddit and you see the exact same trolling. People would rather push their new favorite language, be it Go, Rust, ..., than pick D.

 The response at my work when I made some stuff in D: "Why did you not use Go?" Because the managers knew Go from the hype. They know Google is behind it. And some of our colleagues in sister companies already used Go. And that is all it takes.

 I am sorry to say, but to succeed as a language beyond being a small or hobby language it takes being established already or having a big name to hype behind your "product". Anything beyond that will have topics derail and, frankly, it's more negative than positive. And D has too much old baggage. It's the same reason why PHP, despite being a good language (for what it is), still keeps getting the exact same crud on forums.

 If I am honest, DasBetterC is for me an unreliable D product, because using specific D library functions can pull in the GC. Or DasBetterC needs to be sold as C only, ever; forget about everything else that is D (library, packages, ...). Until everything is 100% GC-free, you're going to run into this. And even when it's 100% GC-free, people have long memories. It's always a struggle swimming up a river.
+1

IMO, D in its current state, and with its current ecosystem, even after more than a decade of existence, is still NOT the best alternative to C/C++ where they HAVE to be used (microcontrollers, game engines, etc), despite D having always had this objective in mind. And despite C++ being an unsafe language which makes it easy to have memory leaks, dangling pointers, etc.

Because in those cases, most of the time, when you use C or C++, it's because you HAVE to: not only because they run fast, but also because they can run without a garbage collector. Just that simple.

In C++, memory is immediately released to the allocation system by the collections and smart pointers as soon as it's no longer used. This may not be perfect, but the process is continuous and predictable. In D, unused memory blocks progressively fill the available memory until the non-incremental GC is triggered, either automatically or manually. Completely the opposite way. Not really appropriate for a "forever" event loop, where unused memory HAS to be released in a continuous way, not once in a while.

And when you CAN afford to use a garbage collector, unfortunately D is still not the best pick in many use cases. While D's standard library makes D a great "plug-n-play" language for file processing and data analysis, for many other use cases, like web development for instance, some very recent languages already provide better alternatives "out of the box" (Go, Crystal, etc), as they have PLENTY of third-party libraries (web frameworks, etc) built on top of the SAME building blocks provided by those languages' default libraries.

So, at the moment, I don't see how you can EASILY convince people to use BetterC for C/C++ use cases, like programming games, microcontrollers, etc. Same if you want to EASILY convince people to start using D for services, web sites, etc. Though I know that some pioneer companies may have already chosen D for those same use cases, and are perfectly happy with D for that...

So my opinion remains that: to become a TRUE and COMPLETE C++ replacement, D should provide the FULL D experience (arrays/strings/slices/maps/etc) using automatic reference counting INSTEAD of the garbage collector (like Kotlin/Native); and to become a TRUE and COMPLETE Go replacement, D should provide the FULL Go-like experience (i.e. BUILTIN fibers/channels/server/etc).

Of course, removing everything that makes D a pleasant language to use (i.e. arrays/strings/slices/maps/etc), calling it "BetterC", and expecting people to use this instead of C++ might also work...
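To make the lifetime contrast concrete, here is a minimal D sketch (the Buffer type is made up for illustration): a struct destructor releases memory deterministically at scope exit, the way C++ collections and smart pointers do, while a GC allocation lingers until a collection cycle runs.

import core.stdc.stdlib : malloc, free;

// Deterministic, C++-style lifetime: the buffer is returned to the
// allocator the instant its owner goes out of scope.
struct Buffer
{
    void* data;
    this(size_t n) { data = malloc(n); }
    ~this() { free(data); }
    @disable this(this);   // forbid copies so ownership stays unique
}

void deterministic()
{
    auto buf = Buffer(1024);
}   // ~this runs here: the memory is released immediately

void collected()
{
    auto arr = new ubyte[1024];
}   // arr is unreachable now, but the memory waits for the next GC cycle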
Jul 24 2018
next sibling parent reply 12345swordy <alexanderheistermann gmail.com> writes:
On Tuesday, 24 July 2018 at 09:54:37 UTC, Ecstatic Coder wrote:
 So, at the moment, I don't see how you can EASILY convince 
 people to use BetterC for C/C++ use cases, like programming 
 games, microcontrollers, etc.
*Extremely powerful metaprogramming that blows C++ metaprogramming out of the water (see the sketch below)
*Clean, readable syntax
*No header file nonsense
*Standard keyword for ASM if you really need the performance boost
*Compiler-enforced memory safety

-Alex
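As a sketch of that first point (the Point struct and printFields are made-up names, not library code), compile-time introspection replaces a good chunk of C++ template machinery:

import std.stdio : writeln;

// Generate a field printer for any struct at compile time:
// no macros, no header tricks, just __traits and static foreach.
void printFields(T)(T value)
{
    static foreach (name; __traits(allMembers, T))
        writeln(name, " = ", __traits(getMember, value, name));
}

struct Point { int x = 1; int y = 2; }

void main()
{
    printFields(Point());   // prints "x = 1" then "y = 2"
}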
Jul 24 2018
parent reply Ecstatic Coder <ecstatic.coder gmail.com> writes:
On Tuesday, 24 July 2018 at 13:23:32 UTC, 12345swordy wrote:
 On Tuesday, 24 July 2018 at 09:54:37 UTC, Ecstatic Coder wrote:
 So, at the moment, I don't see how you can EASILY convince 
 people to use BetterC for C/C++ use cases, like programming 
 games, microcontrollers, etc.
*Extremely powerful metaprogramming that blows C++ metaprogramming out of the water
*Clean, readable syntax
*No header file nonsense
*Standard keyword for ASM if you really need the performance boost
*Compiler-enforced memory safety

-Alex
I know. And D's builtin strings/arrays/slices/maps/etc and automatic memory deallocation are part of what makes D a better alternative to C++ too. I'm just saying: Kotlin/Native automates memory management through automatic reference counting with cycle detection. That solution may have its own drawbacks compared to a "true" traditional garbage collector, but its main advantage is that it's transparent. Business as usual... And IF you need to disable the cycle collector, you can still have a TRUE and COMPLETE replacement for C++ by simply using weak references to avoid strong reference cycles, just like in the provided standard library. Best of both worlds: no need for a "nogc" standard library, as it IS nogc by default, while still providing exactly the same functionalities as the "gc" standard library...
Jul 24 2018
parent reply bpr <brogoff gmail.com> writes:
On Tuesday, 24 July 2018 at 14:07:43 UTC, Ecstatic Coder wrote:
 On Tuesday, 24 July 2018 at 13:23:32 UTC, 12345swordy wrote:
 On Tuesday, 24 July 2018 at 09:54:37 UTC, Ecstatic Coder wrote:
 So, at the moment, I don't see how you can EASILY convince 
 people to use BetterC for C/C++ use cases, like programming 
 games, microcontrollers, etc.
*Extremely powerful metaprogramming that blows C++ metaprogramming out of the water
*Clean, readable syntax
*No header file nonsense
*Standard keyword for ASM if you really need the performance boost
*Compiler-enforced memory safety

-Alex
I know. And D's builtin strings/arrays/slices/maps/etc and automatic memory deallocation are part of what makes D a better alternative to C++ too.
No. For many C++ users, tracing GC is absolutely not an option. And, if it were, D's GC is not a shining example of a good GC. It's not even precise, and I would bet that it never will be. If I'm able to tolerate a GC, there are languages with much better GCs than the D one, like Go and Java. I work in a mostly C++ shop where exceptions are intolerable in C++ code, and in many places we use CRTP to eliminate dispatch overhead. DasBetterC would be usable here but it's too late given the existing investment in C++. Obviously there's no CRTP in DasBetterC without struct inheritance, but there are other designs to address this issue. Besides having more betterC libraries, I'd like to see some kind of restricted approach to exception handling, like the ones being investigated in http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2018/p0709r1.pdf. If you want a better C++, look at what people who have to use C++ use it for, and where the pain points are.
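One of those "other designs" could look like the following sketch (made-up ShapeOps/Square names, not a claim about any particular codebase): a mixin template gives CRTP-style static dispatch with no vtable, and it compiles under -betterC.

// The mixin is instantiated inside each struct, so area()
// resolves at compile time: no virtual call, as with CRTP.
mixin template ShapeOps()
{
    double doubledArea() { return this.area() * 2; }
}

struct Square
{
    double side;
    mixin ShapeOps;
    double area() { return side * side; }
}

extern(C) void main()
{
    import core.stdc.stdio : printf;
    auto s = Square(3.0);
    printf("%f\n", s.doubledArea());   // statically dispatched
}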
Jul 24 2018
next sibling parent Ecstatic Coder <ecstatic.coder gmail.com> writes:
On Tuesday, 24 July 2018 at 16:15:52 UTC, bpr wrote:
 On Tuesday, 24 July 2018 at 14:07:43 UTC, Ecstatic Coder wrote:
 On Tuesday, 24 July 2018 at 13:23:32 UTC, 12345swordy wrote:
 On Tuesday, 24 July 2018 at 09:54:37 UTC, Ecstatic Coder 
 wrote:
 So, at the moment, I don't see how you can EASILY convince 
 people to use BetterC for C/C++ use cases, like programming 
 games, microcontrollers, etc.
*Extremely powerful metaprogramming that blows C++ metaprogramming out of the water
*Clean, readable syntax
*No header file nonsense
*Standard keyword for ASM if you really need the performance boost
*Compiler-enforced memory safety

-Alex
I know. And D's builtin strings/arrays/slices/maps/etc and automatic memory deallocation are part of what makes D a better alternative to C++ too.
No. For many C++ users, tracing GC is absolutely not an option. And, if it were, D's GC is not a shining example of a good GC. It's not even precise, and I would bet that it never will be. If I'm able to tolerate a GC, there are languages with much better GCs than the D one, like Go and Java. I work in a mostly C++ shop where exceptions are intolerable in C++ code, and in many places we use CRTP to eliminate dispatch overhead. DasBetterC would be usable here but it's too late given the existing investment in C++. Obviously there's no CRTP in DasBetterC without struct inheritance, but there are other designs to address this issue. Besides having more betterC libraries, I'd like to see some kind of restricted approach to exception handling, like the ones being investigated in http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2018/p0709r1.pdf. If you want a better C++, look at what people who have to use C++ use it for, and where the pain points are.
I agree. What I learnt, after having built several realtime 3D engines simply using strong/weak references to transparently release unused objects, is that I don't see why such features couldn't be integrated in a language like D as a core feature (T, T^, T*), instead of being a template library. This gets the job done, and while not perfect, it remains very handy. A cycle detector is only required as a debugging tool. All you need is 3 kinds of pointers (see the sketch below):

- strong reference
- weak reference
- raw pointer

And, unfortunately, more discipline to manage mutual references yourself, instead of letting the GC manage that for you. So in some cases, having an optional cycle collector can be very useful when using D in a Go-like way...
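A toy version of those three pointer kinds might look like this (illustrative only, every name made up): the strong child pointer owns a share of the count, the weak parent back-pointer does not, so a parent/child cycle cannot keep either object alive.

import core.stdc.stdlib : malloc, free;

struct Node
{
    int strongCount;
    Node* child;    // strong: owns the child
    Node* parent;   // weak: observes only, no count bump
}

Node* makeNode()
{
    auto n = cast(Node*) malloc(Node.sizeof);
    *n = Node(1, null, null);   // born with one strong reference
    return n;
}

void release(Node* n)
{
    if (n is null || --n.strongCount > 0) return;
    release(n.child);   // drop our strong reference to the child
    free(n);            // the weak parent pointer never blocks this
}

void main()
{
    auto parent = makeNode();
    parent.child = makeNode();
    parent.child.parent = parent;   // weak back-pointer: no cycle leak
    release(parent);                // frees both nodes
}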
Jul 24 2018
prev sibling next sibling parent reply Chris M. <chrismohrfeld comcast.net> writes:
On Tuesday, 24 July 2018 at 16:15:52 UTC, bpr wrote:
 On Tuesday, 24 July 2018 at 14:07:43 UTC, Ecstatic Coder wrote:
 [...]
No. For many C++ users, tracing GC is absolutely not an option. And, if it were, D's GC is not a shining example of a good GC. It's not even precise, and I would bet that it never will be. If I'm able to tolerate a GC, there are languages with much better GCs than the D one, like Go and Java. [...]
There was a precise GC in the works at one point, no clue what happened to it.
Jul 24 2018
parent reply Seb <seb wilzba.ch> writes:
On Tuesday, 24 July 2018 at 17:14:53 UTC, Chris M. wrote:
 On Tuesday, 24 July 2018 at 16:15:52 UTC, bpr wrote:
 On Tuesday, 24 July 2018 at 14:07:43 UTC, Ecstatic Coder wrote:
 [...]
No. For many C++ users, tracing GC is absolutely not an option. And, if it were, D's GC is not a shining example of a good GC. It's not even precise, and I would bet that it never will be. If I'm able to tolerate a GC, there are languages with much better GCs than the D one, like Go and Java. [...]
There was a precise GC in the works at one point, no clue what happened to it.
The newest PR is: https://github.com/dlang/druntime/pull/1977 Though there's already a bit of precise scanning on Windows, e.g. https://github.com/dlang/druntime/pull/1798 and IIRC Visual D uses a precise GC too.
Jul 24 2018
parent reply bpr <brogoff gmail.com> writes:
On Tuesday, 24 July 2018 at 17:24:41 UTC, Seb wrote:
 On Tuesday, 24 July 2018 at 17:14:53 UTC, Chris M. wrote:
 On Tuesday, 24 July 2018 at 16:15:52 UTC, bpr wrote:
 On Tuesday, 24 July 2018 at 14:07:43 UTC, Ecstatic Coder 
 wrote:
 [...]
No. For many C++ users, tracing GC is absolutely not an option. And, if it were, D's GC is not a shining example of a good GC. It's not even precise, and I would bet that it never will be. If I'm able to tolerate a GC, there are languages with much better GCs than the D one, like Go and Java. [...]
There was a precise GC in the works at one point, no clue what happened to it.
The newest PR is: https://github.com/dlang/druntime/pull/1977 Though there's already a bit of precise scanning on Windows, e.g. https://github.com/dlang/druntime/pull/1798 and IIRC Visual D uses a precise GC too.
Well, this is a big problem with D IMO. There are a lot of unfinished, half baked features which linger in development for years. How long for precise GC now, over 5 years? I don't think D was really designed to be friendly to GC, and it just isn't realistic to expect that there will *ever* be a production quality precise GC for all of D. Maybe giving up on some things and finishing/fixing others would be a better strategy? I think so, which is why I think DasBetterC is the most appealing thing I've seen in D lately.
Jul 25 2018
next sibling parent reply Ecstatic Coder <ecstatic.coder gmail.com> writes:
On Wednesday, 25 July 2018 at 16:39:51 UTC, bpr wrote:
 On Tuesday, 24 July 2018 at 17:24:41 UTC, Seb wrote:
 On Tuesday, 24 July 2018 at 17:14:53 UTC, Chris M. wrote:
 On Tuesday, 24 July 2018 at 16:15:52 UTC, bpr wrote:
 On Tuesday, 24 July 2018 at 14:07:43 UTC, Ecstatic Coder 
 wrote:
 [...]
No. For many C++ users, tracing GC is absolutely not an option. And, if it were, D's GC is not a shining example of a good GC. It's not even precise, and I would bet that it never will be. If I'm able to tolerate a GC, there are languages with much better GCs than the D one, like Go and Java. [...]
There was a precise GC in the works at one point, no clue what happened to it.
The newest PR is: https://github.com/dlang/druntime/pull/1977 Though there's already a bit of precise scanning on Windows, e.g. https://github.com/dlang/druntime/pull/1798 and IIRC Visual D uses a precise GC too.
Well, this is a big problem with D IMO. There are a lot of unfinished, half baked features which linger in development for years. How long for precise GC now, over 5 years? I don't think D was really designed to be friendly to GC, and it just isn't realistic to expect that there will *ever* be a production quality precise GC for all of D. Maybe giving up on some things and finishing/fixing others would be a better strategy? I think so, which is why I think DasBetterC is the most appealing thing I've seen in D lately.
+1

But don't be too optimistic about BetterC... Honestly, considering the D leadership's current priorities, I don't see how it could soon become a true C++ or Go competitor, even with the half-baked BetterC initiative...

For instance, I've suggested they consider using reference counting as an alternative default memory management scheme, and add it to the lists of scholarship and crowdsourced projects, and of course they have added all the other suggestions, but not this one. What a surprise ;) Despite this probably being one of the most used allocation management schemes in typical C++ development, as it drastically reduces the risks of memory leaks and dangling pointers...

Anyway, meanwhile D remains a fantastic strongly-typed scripting language for file processing and data analysis, and its recent adoption at Netflix has once again clearly proved it...
Jul 25 2018
next sibling parent reply bpr <brogoff gmail.com> writes:
On Wednesday, 25 July 2018 at 17:23:40 UTC, Ecstatic Coder wrote:
 But don't be too optimistic about BetterC...
I'm too old to get optimistic about these things. In the very best case, D has quite an uphill battle for market share. Any non mainstream language does. If I were a betting man, I'd bet on Rust.
 Honestly, considering the D leadership's current priorities, I don't see how it could soon become a true C++ or Go competitor, even with the half-baked BetterC initiative...
There are a few ways I can see, and doubtless others can see different ones. Here's one: use Mir and BetterC to write a TensorFlow competitor for use in developing and deploying ML models. I'm sure you can shoot holes in that idea, but you get the point. Try lots of things and see what works, and keep doing more of those things. Worked for Python.
 For instance, I've suggested they consider using reference counting as an alternative default memory management scheme, and add it to the lists of scholarship and crowdsourced projects, and of course they have added all the other suggestions, but not this one. What a surprise ;)
I'm pretty sure D leadership is pursuing such things. In fact, https://wiki.dlang.org/Vision/2018H1 rather prominently mentions it.
 Despite this probably being one of the most used allocation management schemes in typical C++ development, as it drastically reduces the risks of memory leaks and dangling pointers...
 Anyway, meanwhile D remains a fantastic strongly-typed 
 scripting language for file processing and data analysis, and 
 its recent adoption at Netflix has once again clearly proved 
 it...
For this and similar uses, tracing GC is fine, better in fact than the alternatives. I'm only making noise about betterC for the cases where C++ dominates and tracing GC is a showstopper. In an alternative timeline, DasBetterC would have been released before D with GC, and the main libraries would have been nogc, and maybe there'd be a split between raw pointers and traced refs (like Nim and Modula-3), and then maybe there'd have been no strong desire for Rust, since D could have filled that niche.
Jul 25 2018
parent Ecstatic Coder <ecstatic.coder gmail.com> writes:
On Wednesday, 25 July 2018 at 20:24:39 UTC, bpr wrote:
 On Wednesday, 25 July 2018 at 17:23:40 UTC, Ecstatic Coder 
 wrote:
 But don't be too optimistic about BetterC...
I'm too old to get optimistic about these things. In the very best case, D has quite an uphill battle for market share. Any non mainstream language does. If I were a betting man, I'd bet on Rust.
 Honestly, considering the D leadership's current priorities, I don't see how it could soon become a true C++ or Go competitor, even with the half-baked BetterC initiative...
There are a few ways I can see, and doubtless others can see different ones. Here's one: use Mir and BetterC to write a TensorFlow competitor for use in developing and deploying ML models. I'm sure you can shoot holes in that idea, but you get the point. Try lots of things and see what works, and keep doing more of those things. Worked for Python.
 For instance, I've suggested they consider using reference counting as an alternative default memory management scheme, and add it to the lists of scholarship and crowdsourced projects, and of course they have added all the other suggestions, but not this one. What a surprise ;)
I'm pretty sure D leadership is pursuing such things. In fact, https://wiki.dlang.org/Vision/2018H1 rather prominently mentions it.
 Despite this probably being one of the most used allocation management schemes in typical C++ development, as it drastically reduces the risks of memory leaks and dangling pointers...
 Anyway, meanwhile D remains a fantastic strongly-typed 
 scripting language for file processing and data analysis, and 
 its recent adoption at Netflix has once again clearly proved 
 it...
For this and similar uses, tracing GC is fine, better in fact than the alternatives. I'm only making noise about betterC for the cases where C++ dominates and tracing GC is a showstopper. In an alternative timeline, DasBetterC would have been released before D with GC, and the main libraries would have been nogc, and maybe there'd be a split between raw pointers and traced refs (like Nim and Modula-3), and then maybe there'd have been no strong desire for Rust, since D could have filled that niche.
+1
Jul 25 2018
prev sibling parent reply Seb <seb wilzba.ch> writes:
On Wednesday, 25 July 2018 at 17:23:40 UTC, Ecstatic Coder wrote:
 On Wednesday, 25 July 2018 at 16:39:51 UTC, bpr wrote:
 On Tuesday, 24 July 2018 at 17:24:41 UTC, Seb wrote:
 On Tuesday, 24 July 2018 at 17:14:53 UTC, Chris M. wrote:
 On Tuesday, 24 July 2018 at 16:15:52 UTC, bpr wrote:
 On Tuesday, 24 July 2018 at 14:07:43 UTC, Ecstatic Coder 
 wrote:
 [...]
No. For many C++ users, tracing GC is absolutely not an option. And, if it were, D's GC is not a shining example of a good GC. It's not even precise, and I would bet that it never will be. If I'm able to tolerate a GC, there are languages with much better GCs than the D one, like Go and Java. [...]
There was a precise GC in the works at one point, no clue what happened to it.
The newest PR is: https://github.com/dlang/druntime/pull/1977 Though there's already a bit of precise scanning on Windows, e.g. https://github.com/dlang/druntime/pull/1798 and IIRC Visual D uses a precise GC too.
Well, this is a big problem with D IMO. There are a lot of unfinished, half baked features which linger in development for years. How long for precise GC now, over 5 years? I don't think D was really designed to be friendly to GC, and it just isn't realistic to expect that there will *ever* be a production quality precise GC for all of D. Maybe giving up on some things and finishing/fixing others would be a better strategy? I think so, which is why I think DasBetterC is the most appealing thing I've seen in D lately.
+1

But don't be too optimistic about BetterC... Honestly, considering the D leadership's current priorities, I don't see how it could soon become a true C++ or Go competitor, even with the half-baked BetterC initiative...

For instance, I've suggested they consider using reference counting as an alternative default memory management scheme, and add it to the lists of scholarship and crowdsourced projects, and of course they have added all the other suggestions, but not this one. What a surprise ;)
The scholarship list is an idea list that is community-maintained. Let me quote Mike: "Thanks to everyone for the project ideas, but I put the list on the Wiki for a reason. I'm always pressed for time, so if you have an idea for a project suggestion, it would help me tremendously if you can just summarize it on the Wiki rather than here." The crowdsourced project was an experiment, and the most popular item from the State of D survey that had someone who could be contacted, and who was more than willing to work for a scholarship salary, was picked. As Mike has already announced on the blog, it wasn't known beforehand that essentially only one goal could be selected. In the future, it will be possible to select the project(s) you are most interested in when donating.
 Despite this probably being one of the most used allocation management schemes in typical C++ development, as it drastically reduces the risks of memory leaks and dangling pointers...
Agreed, but it's easily implemented in a library like RefCounted (automem has a good implementation too: https://github.com/atilaneves/automem). For example, std.stdio.File is reference-counted ;-) core.rc will come, but at the moment only Martin is planning to work on it, and he's busy with a lot of other things (e.g. the release process, maintaining the project tester, migrating code.dlang.org to a highly-available cluster, fixing DMD regressions, ...). It's the same story as always: just from complaining, things won't get magically better... (though it would be great if the world worked that way, because then maybe my relationships would be more successful :O)
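For reference, basic std.typecons.RefCounted usage looks like this (a minimal sketch; the int payload is arbitrary): the payload is destroyed as soon as the last copy goes away, the same mechanism std.stdio.File relies on.

import std.stdio : writeln;
import std.typecons : RefCounted;

void main()
{
    // the payload is allocated and freed by the wrapper, not the GC
    auto a = RefCounted!int(42);
    auto b = a;                     // reference count is now 2
    writeln(a.refCountedPayload);   // prints 42
}   // count reaches 0 here and the payload is freed immediately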
Jul 26 2018
parent reply Ecstatic Coder <ecstatic.coder gmail.com> writes:
 It's the same story as always: just from complaining, things won't get magically better... (though it would be great if the world worked that way, because then maybe my relationships would be more successful :O)
You can choose whatever priorities you prefer for your scholarship and funded projects. Sorry to have shown my disagreement with some of your choices and strategies. That was silly, a waste of time for both of us, indeed.
Jul 26 2018
parent Seb <seb wilzba.ch> writes:
On Thursday, 26 July 2018 at 09:55:39 UTC, Ecstatic Coder wrote:
 You can choose whatever priorities you prefer for your 
 scholarship and funded projects.
I only tried to point out that the SAoC scholarships depend on proposal ideas from the D community (e.g. through the D wiki) and on encouraging students to submit their applications. Thus complaining that the D leadership didn't respect your proposal, when you didn't even put up a potential proposal in the D wiki, isn't really fair to them.
 Sorry to have shown my disagreement with some of your choices and strategies.

 That was silly, a waste of time for both of us, indeed.
Sorry if you misunderstood. Criticism and negative feedback are welcome (!), but if it's just "they don't follow me", it's hard to turn it into actionable items and make it happen.
Jul 26 2018
prev sibling parent reply Kagamin <spam here.lot> writes:
On Wednesday, 25 July 2018 at 16:39:51 UTC, bpr wrote:
 Well, this is a big problem with D IMO. There are a lot of 
 unfinished, half baked features which linger in development for 
 years. How long for precise GC now, over 5 years?
Precise GC is only relevant for 32-bit address space, which is shrinking even in gamedev.

On Wednesday, 25 July 2018 at 20:24:39 UTC, bpr wrote:
 For this and similar uses, tracing GC is fine, better in fact 
 than the alternatives. I'm only making noise about betterC for 
 the cases where C++ dominates and tracing GC is a showstopper.
https://forum.dlang.org/post/cgdkzpltclkufotkpbih forum.dlang.org like this?
Jul 26 2018
parent Radu <void null.pt> writes:
On Thursday, 26 July 2018 at 08:45:41 UTC, Kagamin wrote:
 On Wednesday, 25 July 2018 at 16:39:51 UTC, bpr wrote:
 Well, this is a big problem with D IMO. There are a lot of 
 unfinished, half baked features which linger in development 
 for years. How long for precise GC now, over 5 years?
 Precise GC is only relevant for 32-bit address space, which is shrinking even in gamedev.

 On Wednesday, 25 July 2018 at 20:24:39 UTC, bpr wrote:
 For this and similar uses, tracing GC is fine, better in fact 
 than the alternatives. I'm only making noise about betterC for 
 the cases where C++ dominates and tracing GC is a showstopper.
https://forum.dlang.org/post/cgdkzpltclkufotkpbih forum.dlang.org like this?
Please, there are still a *lot* of 32-bit targets out there; most embedded devices, bare metal or Linux-based, are still 32-bit, and they surely will be for the foreseeable future. Let's not assume how dlang is used or could be used; after all, most embedded applications are not really explicit about how they are made. Enabling a slower but precise GC for 32-bit would be acceptable for some applications. But D's GC is getting a lot of criticism and little interest in improving it. I would like to see a goal on opencollective to support improving the precise collector and finishing it to the point where it is at least available as an option in the official builds. I will contribute to that goal; I'm sure others will too.
Jul 26 2018
prev sibling parent Kagamin <spam here.lot> writes:
On Tuesday, 24 July 2018 at 16:15:52 UTC, bpr wrote:
 Besides having more betterC libraries, I'd like to see some kind of restricted approach to exception handling, like the ones being investigated in http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2018/p0709r1.pdf. If you want a better C++, look at what people who have to use C++ use it for, and where the pain points are.
I'd say that, because of default initialization, exceptions in constructors are less necessary in D than in C++.
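As a small made-up illustration of that point: every field starts at its type's .init value, so a D struct is never in the half-constructed state that forces C++ constructors to throw.

import std.stdio : writeln;

struct Config
{
    int retries;      // starts at 0
    double timeout;   // starts at nan, loudly marking "never set"
    string host;      // starts at null
}

void main()
{
    Config c;   // no constructor ran, yet every field is well-defined
    writeln(c.retries, " ", c.timeout, " ", c.host);
}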
Jul 26 2018
prev sibling parent reply Paulo Pinto <pjmlp progtools.org> writes:
On Tuesday, 24 July 2018 at 09:54:37 UTC, Ecstatic Coder wrote:
 On Tuesday, 24 July 2018 at 00:41:54 UTC, RhyS wrote:
 [...]
 +1

 IMO, D in its current state, and with its current ecosystem, even after more than a decade of existence, is still NOT the best alternative to C/C++ where they HAVE to be used (microcontrollers, game engines, etc), despite D having always had this objective in mind. And despite C++ being an unsafe language which makes it easy to have memory leaks, dangling pointers, etc. [...]
the games industry. https://unity3d.com/unity/features/job-system-ECS Mike Acton and Andreas Fredriksson left Insomniac Games to help drive this effort. Mike's opinions regarding performance and C vs C++ are very well known, and he is now driving performance at Unity. -- Paulo
Jul 25 2018
parent Ecstatic Coder <ecstatic.coder gmail.com> writes:
On Wednesday, 25 July 2018 at 08:23:40 UTC, Paulo Pinto wrote:
 On Tuesday, 24 July 2018 at 09:54:37 UTC, Ecstatic Coder wrote:
 On Tuesday, 24 July 2018 at 00:41:54 UTC, RhyS wrote:
 [...]
 +1

 IMO, D in its current state, and with its current ecosystem, even after more than a decade of existence, is still NOT the best alternative to C/C++ where they HAVE to be used (microcontrollers, game engines, etc), despite D having always had this objective in mind. And despite C++ being an unsafe language which makes it easy to have memory leaks, dangling pointers, etc. [...]
 the games industry. https://unity3d.com/unity/features/job-system-ECS Mike Acton and Andreas Fredriksson left Insomniac Games to help drive this effort. Mike's opinions regarding performance and C vs C++ are very well known, and he is now driving performance at Unity. -- Paulo
Yop :) Orthodox C++ and data-oriented designs have been the basis of most new game engines for several years now. I'm glad that Unity management has finally decided to switch its engine to a more modern architecture, so we can now develop our games like everybody else in the industry...
Jul 25 2018
prev sibling parent reply Laeeth Isharc <laeeth laeeth.com> writes:
On Tuesday, 24 July 2018 at 00:41:54 UTC, RhyS wrote:
 On Monday, 23 July 2018 at 22:45:15 UTC, Walter Bright wrote:
 I've predicted before that what will kill C is managers and customers requiring memory safety, because unsafeness costs them millions. The "just hire better programmers" approach will never work.
I have yet to see a company, Walter, where the higher-ups take the correct actions to resolve issues.
It might be that you are working for the wrong companies. Half the companies in the world are below average and few are excellent.
 And most managers are not going to rock the boat and stick their necks out. Not when they can simply blame issues on programmer incompetence or "it has always been like that with programming languages". I have yet to see managers really take responsibility beyond guiding the projects so they do not get fired, hoping to rake in bonuses. Issues can always be blamed on the tools or programmers.
That's a good point, but the nice thing about not having dominant market share is it's easy to grow it. You don't need to convince most managers. Just a few more people who are on the edge already anyway. Quality is better than quantity because the former concentrate power.
 And frankly, good luck convincing any company to convert millions of lines of C code into D code.
The point is that with betterC you don't need to. And on the other hand, if you did, libclang would take quite a lot of the pain out once you did a bit of work upfront. See DPP.
 I am sorry to say, but to succeed as a language beyond being a small or hobby language it takes being established already or having a big name to hype behind your "product".
I don't agree. We are in a time of positive disruption when old heuristics break down. All D has to do is to keep compounding its adoption and what average people think of D is completely irrelevant. What's important is what the best people amongst those who are principals rather than agents think of D. There's no point selling it to a committee, but who wants to deal with committees anyway - life is too short for that if one possibly has the choice.
 It's the same reason why PHP, despite being a good language (for what it is), still keeps getting the exact same crud on forums.

!

 If I am honest, DasBetterC is for me an unreliable D product, because using specific D library functions can pull in the GC. Or DasBetterC needs to be sold as C only, ever; forget about everything else that is D (library, packages, ...). Until everything is 100% GC-free, you're going to run into this. And even when it's 100% GC-free, people have long memories.
Don't use it if you don't want to. But making predictions is a tricky thing and mostly of not much value. I think it's more interesting to be the change you wish to see in the world.
Jul 25 2018
next sibling parent "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Wed, Jul 25, 2018 at 11:27:45PM +0000, Laeeth Isharc via Digitalmars-d wrote:
 On Tuesday, 24 July 2018 at 00:41:54 UTC, RhyS wrote:
[...]
 I am sorry to say, but to succeed as a language beyond being a small or hobby language it takes being established already or having a big name to hype behind your "product".
 I don't agree. We are in a time of positive disruption when old heuristics break down. All D has to do is to keep compounding its adoption and what average people think of D is completely irrelevant. What's important is what the best people amongst those who are principals rather than agents think of D. There's no point selling it to a committee, but who wants to deal with committees anyway - life is too short for that if one possibly has the choice.

+1.

[...]
+1. [...]
 If I am honest, DasBetterC is for me an unreliable D product, because using specific D library functions can pull in the GC. Or DasBetterC needs to be sold as C only, ever; forget about everything else that is D (library, packages, ...). Until everything is 100% GC-free, you're going to run into this. And even when it's 100% GC-free, people have long memories.
Don't use it if you don't want to. But making predictions is a tricky thing and mostly of not much value. I think it's more interesting to be the change you wish to see in the world.
[...] +1!!! Predictions rarely come true, as history proves. And for the amount of effort put into naysaying, so much more could have been done productively. But hey, if that's what people like doing, then what is one to say. T -- Being able to learn is a great learning; being able to unlearn is a greater learning.
Jul 25 2018
prev sibling next sibling parent reply Abdulhaq <alynch4047 gmail.com> writes:
On Wednesday, 25 July 2018 at 23:27:45 UTC, Laeeth Isharc wrote:

 But making predictions is a tricky thing and mostly of not much 
 value.
I'm really surprised to hear you say this - so much money in the financial services is poured into making predictions, lots of them and as fast as possible. Isn't that one of the promises of D in that market? Whatever the reality about that, in the life of all humans the ability to make good predictions is fundamental to survival - if I cross the road now, will I be run over? If I build a chair to make money, will anyone buy it? Likewise, if I am investing time in developing my skills to further my career, will learning D be a net benefit? This important question depends heavily on predicting the future of D (among many other things). If I use D for my startup, will it be the secret sauce that will propel us to the top, or will I be better off with JDK8 or modern C++?
  I think it's more interesting to be the change you wish to see 
 in the world.
This has a lovely ring but it doesn't mean not to assess / predict if what you do will provide a net benefit.
Jul 27 2018
parent reply Laeeth Isharc <laeeth laeeth.com> writes:
On Friday, 27 July 2018 at 15:04:12 UTC, Abdulhaq wrote:
 On Wednesday, 25 July 2018 at 23:27:45 UTC, Laeeth Isharc wrote:

 But making predictions is a tricky thing and mostly of not 
 much value.
I'm really surprised to hear you say this - so much money in the financial services is poured into making predictions, lots of them and as fast as possible. Isn't that one of the promises of D in that market?
For me, I think that managing money is about choosing to expose your capital intelligently to the market, balancing the risk of loss against the prospective gain and considering this in a portfolio sense. Prediction doesn't really come into that. I do personally sometimes write longer-term pieces about markets - I wrote a piece in June 2012 asking if the dollar was bottoming, and I said it was. But that was based on a gestalt, not some Cartesian predictive model.
 Whatever the reality about that, in the life of all humans the 
 ability to make good predictions is fundamental to survival - 
 if I cross the road now, will I be run over? If I build a chair 
 to make money, will anyone buy it?
I disagree. It's not the prediction that matters but what you do. It's habits, routines, perception, adaptation and actions that matter. What people think drives their behaviour isn't what actually does. See Dr Iain McGilchrist's The Master and His Emissary for more. And in particular, if you survive based on having insight, then it's interesting to listen to what you say. If you are known as an expert but don't depend on having insight, it's interesting how others perceive what you say and how that evolves, but the substance of your analysis is not - without skin in the game, it's just talk. Bernanke admits he has had no clue about economic developments before they happen. I used to trade a lot of gilts, and the UK debt management office asked me to meet the IMF financial stability review guy in 2005. He had a bee in his bonnet about hedge funds and the dollar-yen carry trade. I told him to look at the banks and what they were buying. He didn't listen. I had lunch with Kohn, the Fed vice chair, in summer 2006. I asked him about housing. He wasn't worried at all. So lots of people talk about all kinds of things. Look at how insightful they have been in the past. Predictions themselves aren't worth much - recognising change early is. And for what it's worth, I think D is early in a bull market that will last for decades. The grumbling is, funnily enough, quite characteristic of such too.
 Likewise, if I am investing time in developing my skills to 
 further my career, will learning D be a net benefit?
It really depends. There are some very good jobs in D. If it should turn out we hired you some day, then most likely you would find it quite satisfying and well paid, and be rather glad you learnt D. If not us, and not someone else, then who knows. I personally found following my intuition, like in a Jack London novel, to be better than trying to live by optimising and figuring out the best angle. But people are different and it's difficult to know. If you feel like learning D, do it. If it's purely a career move, then there are too many factors to say.
 important question depends heavily on predicting the future of 
 D (among many other things). If I use D for my startup, will it 
 be the secret sauce that will propel us to the top, or will I 
 be better off with JDK8 or modern C++?
Things once alive tend to grow. The future is unknown if not unimaginable. I don't think life works like that. It's more like you pick something for your startup and the start-up fails because your business partner gets divorced. But through some unlikely chain of coincidences that leads to some better opportunity you never could have found by approaching it head on. So things are beyond calculation, but not beyond considering intuition and what resonates with you. See the work of my colleague Flavia Cymbalista - How George Soros Knows What He Knows.
  I think it's more interesting to be the change you wish to 
 see in the world.
This has a lovely ring but it doesn't mean not to assess / predict if what you do will provide a net benefit.
It's really up to you what you do. As for people who make high-stakes decisions, also commercially: I'm really not sure predictions play the part you think they do. One little trick: if you have an insight nobody agrees with, then you know you might be onto something when surprises start to come in your direction in a way nobody could have quite imagined. We are seeing this now with processor challenges, for example.
Jul 27 2018
next sibling parent greentea <greentea gmail.com> writes:
On Friday, 27 July 2018 at 23:42:47 UTC, Laeeth Isharc wrote:
 On Friday, 27 July 2018 at 15:04:12 UTC, Abdulhaq wrote:
 On Wednesday, 25 July 2018 at 23:27:45 UTC, Laeeth Isharc 
 wrote:
 I personally found following my intuition like in a Jack London 
 novel to be better than trying to live by optimising and 
 figuring out the best angle.  But people are different and it's 
 difficult to know.
...
 So things are beyond calculation, but not beyond considering 
 intuition and what resonates with you.
Malcolm Gladwell wrote an interesting book about how intuition contributes to decision-making (Blink - The Power of Thinking Without Thinking).
Jul 27 2018
prev sibling parent reply Abdulhaq <alynch4047 gmail.com> writes:
On Friday, 27 July 2018 at 23:42:47 UTC, Laeeth Isharc wrote:

 For me, I think that managing money is about choosing to expose 
 your capital intelligently to the market, balancing the risk of 
 loss against the prospective gain and considering this in a 
 portfolio sense.

 Prediction doesn't really come into that
I think this apparent difference of opinion is down to different definitions of the word prediction. When I say prediction, I mean the assessment of what the possible futures for a scenario are and how likely each one is. It can be conscious or unconscious. I think my understanding of the word is not an uncommon one. By that definition, when you balance the risk of loss (i.e. predict how likely you are to lose money) against the prospective gain (i.e. multiply the probability of each possible outcome by its reward and sum the total to get a prospective value), then you are, for me, by definition, making predictions.
 It's not the prediction that matters but what you do.  It's 
 habits, routines, perception, adaptation and actions that 
 matter.
I agree they are integral to our behaviour, and that habits and routines do not involve the element of prediction. Perceptions come before, and actions take place after, the decision process (conscious or not), and so they don't factor into this discussion for me.

In truth I avoid discussions that are really just arguing about definitions of words, but you made a couple of sweeping, bumper-stickery comments: that trying to predict things was usually a waste of time and that, as an alternative, we should 'be the change...'. I wholeheartedly agree we should 'be the change...', but it's not an alternative to making predictions; it goes hand in hand with it. I'm sure you've read Kahneman's Thinking, Fast and Slow. You made a generalisation that applies to the 'fast' part. I'm saying your universal rule is wrong because of the slow part.

I learnt D many years ago, just after Andrei's book came out. I love it, but it's on the shelf at the moment for me. I rarely get time for side projects these days, but when I do, I want them to run on Android with easy access to all the APIs and without too much ado in the build setup. They must continue to work and be supported with future versions of Android. At work, on Windows, JDK8/JavaFX/Eclipse/maven and python/numpy/Qt/OpenCascade/VTK hit the spot. Each project I start, I give some very hard thought to which development environment I'm going to use, and D is often one of those options. The likely future of D on the different platforms is an important part of that assessment, hence 'predicting' the future of D, hard and very unreliable though that is, is an important element in some of my less trivial decisions.
Jul 28 2018
parent reply Laeeth Isharc <Laeeth laeeth.com> writes:
On Saturday, 28 July 2018 at 11:09:28 UTC, Abdulhaq wrote:
 On Friday, 27 July 2018 at 23:42:47 UTC, Laeeth Isharc wrote:

 For me, I think that managing money is about choosing to 
 expose your capital intelligently to the market, balancing the 
 risk of loss against the prospective gain and considering this 
 in a portfolio sense.

 Prediction doesn't really come into that
I think this apparent difference of opinion is down to different definitions of the word prediction. When I say prediction I mean the assessment of what are the possible futures for a scenario and how likely each one is. It can be conscious or unconscious. I think my understanding of the word is not an uncommon one. By my definition, when you balance the risk of loss (i.e. predict how likely you are to lose money) against the prospective gain (i.e. multiply the probability of each possible outcome by its reward and sum the total to get a prospective value) then you are, by my definition and therefore, for me, by definition, making predictions.
It's tough when dealing with genuine Knightian uncertainty, or even more radical versions. When one doesn't even know the structure of the problem, maximising expected utility doesn't work. One can look at capacities - Choquet and the like - but then it's harder to say something useful about what you should do. And I think when dealing with human action and institutions we are in a world of uncertainty more often than not.
 It's not the prediction that matters but what you do.  It's 
 habits, routines, perception, adaptation and actions that 
 matter.
I agree they are integral to our behaviour and habits and routines do not involve the element of prediction. Perceptions come before and actions take place after the decision process is made (conscious or not) and so don't factor into this discussion for me.
But it's a loop and one never takes a final decision to master D. Also habits, routines and structures _do_ shape perception.
 In truth I avoid discussions that are really just arguing about definitions of words, but you made a couple of sweeping, bumper-stickery comments
That's entertaining. I've not been accused of that before! Bear in mind also I tend to write on my phone.
 that trying to predict things was usually a waste of time and that, as an alternative, we should 'be the change...'. I wholeheartedly agree we should 'be the change...', but it's not an alternative to making predictions; it goes hand in hand with it. I'm sure you've read Kahneman's Thinking, Fast and Slow. You made a generalisation that applies to the 'fast' part. I'm saying your universal rule is wrong because of the slow part.
Yes, I read Kahneman et al's papers for the first time in '92 in the university library. I speed-read his book, and I thought it was a bad book. I work with a specialist in making decisions under uncertainty - she was the only person able to articulate to George Soros how he made money, because he certainly couldn't, and she is mentioned in the preface to the revised version of Alchemy. She has the same view as me - behavioural finance is largely a dead end. One learns much more by going straight to the neuroeconomics and also incorporating the work of Dr Iain McGilchrist. Kahneman makes a mistake in his choice of dimension. There's analytic and intuitive/gestalt, and in my experience people making high-stakes decisions are much less purely analytical than a believer in the popular Kahneman might suggest. What I said about prediction being overrated isn't controversial amongst a good number of the best traders and business people in finance. You might read Nassim Taleb also.
 I learnt D many years ago just after Andrei's book came out. I 
 love it but it's on the shelf at the moment for me. I rarely 
 get time for side projects these days but when I do I want them 
 to run on Android with easy access to all the APIs and without 
 too much ado in the build setup. They must continue to work and 
 be supported with future versions of Android. At work, on 
 Windows, JDK8/JavaFX/Eclipse/maven and 
 python/numpy/Qt/OpenCascade/VTK hit the spot.
Well it's a pity the D Android ecosystem isn't yet mature. Still I remain in awe of the stubborn accomplishment of the man (with help) who got LDC to run on Android. It's not that bad calling D from Java. Some day I will see if I can help automate that - Kai started working on it already I think.
each project I
 start I give some very hard thought about which development 
 environment I'm going to use, and D is often one of those 
 options. The likely future of D on the different platforms is 
 an important part of that assessment, hence 'predicting' the 
 future of D, hard and very unreliable though that is, is an 
 important element in some of my less trivial decisions.
Since you already know D, you need to answer a different question: what's the chance the compiler will die on the relevant horizon, and how bad will it be for me if that happens? Personally I'm not worried. If D should disappear in a few years, it wouldn't be the end of the world to port things. I just don't think that's very likely. Of course it depends on your context.

The people who use D at work seem to be more principals, who have the right to take the best decision as they see it, than agents, who must persuade others who are the real decision-makers. That's a recipe for quiet adoption that's dispersed across many industries initially, and for the early adopters of D being highly interesting people. Since, as the Wharton professor Adam Grant observes, we are in an age where positive disruptors can achieve a lot within an organisation, that's also rather interesting.
Jul 28 2018
next sibling parent reply Paolo Invernizzi <paolo.invernizzi gmail.com> writes:
On Saturday, 28 July 2018 at 12:43:55 UTC, Laeeth Isharc wrote:

each project I
 start I give some very hard thought about which development 
 environment I'm going to use, and D is often one of those 
 options. The likely future of D on the different platforms is 
 an important part of that assessment, hence 'predicting' the 
 future of D, hard and very unreliable though that is, is an 
 important element in some of my less trivial decisions.
Since you already know D, you need to answer a different question: what's the chance the compiler will die on the relevant horizon, and how bad will it be for me if that happens? Personally I'm not worried. If D should disappear in a few years, it wouldn't be the end of the world to port things. I just don't think that's very likely. Of course it depends on your context.

The people who use D at work seem to be more principals, who have the right to take the best decision as they see it, than agents, who must persuade others who are the real decision-makers. That's a recipe for quiet adoption that's dispersed across many industries initially, and for the early adopters of D being highly interesting people. Since, as the Wharton professor Adam Grant observes, we are in an age where positive disruptors can achieve a lot within an organisation, that's also rather interesting.
A very interesting discussion... really. Perceptions, expectations, prediction... an easy read I suggest on the latest trends [1], if someone is interested...

BTW, Laeeth is right in the last paragraph, too. I was one of the 'principals' who took the decision to use D in production, 14 years ago, and he described the reasoning of that era very well. Today I'm still convinced that the adoption of D is a competitive advantage for a company; I definitely have to work on improving my bad temper (eheh) to persuade my current CTO to give it another chance.

/Paolo (btw, I'm the CEO...)
Jul 28 2018
parent reply Laeeth Isharc <laeeth laeeth.com> writes:
On Saturday, 28 July 2018 at 13:55:31 UTC, Paolo Invernizzi wrote:
 On Saturday, 28 July 2018 at 12:43:55 UTC, Laeeth Isharc wrote:

each project I
 start I give some very hard thought about which development 
 environment I'm going to use, and D is often one of those 
 options. The likely future of D on the different platforms is 
 an important part of that assessment, hence 'predicting' the 
 future of D, hard and very unreliable though that is, is an 
 important element in some of my less trivial decisions.
Since you already know D, you need to answer a different question: what's the chance the compiler will die on the relevant horizon, and how bad will it be for me if that happens? Personally I'm not worried. If D should disappear in a few years, it wouldn't be the end of the world to port things. I just don't think that's very likely. Of course it depends on your context.

The people who use D at work seem to be more principals, who have the right to take the best decision as they see it, than agents, who must persuade others who are the real decision-makers. That's a recipe for quiet adoption that's dispersed across many industries initially, and for the early adopters of D being highly interesting people. Since, as the Wharton professor Adam Grant observes, we are in an age where positive disruptors can achieve a lot within an organisation, that's also rather interesting.
A very interesting discussion... really. Perceptions, expectations, prediction... an easy read I suggest on the latest trends [1], if someone is interested...

BTW, Laeeth is right in the last paragraph, too. I was one of the 'principals' who took the decision to use D in production, 14 years ago, and he described the reasoning of that era very well. Today I'm still convinced that the adoption of D is a competitive advantage for a company; I definitely have to work on improving my bad temper (eheh) to persuade my current CTO to give it another chance.

/Paolo (btw, I'm the CEO...)
Thanks for the colour, Paolo. Yes - it's a competitive advantage, but opportunity often comes dressed in work clothes. We're in an era when most people are not used to discomfort and have an inordinate distaste for it. If you're fine with that and make decisions as best you can based on objective factors (objectivity being something quite different from 'evidence-based', because of the drunk/lamppost issue) then there is treasure everywhere (to steal Andrei's talk title). Opportunities are abundant where people aren't looking, because they don't want to.
Jul 28 2018
next sibling parent reply Paolo Invernizzi <paolo.invernizzi gmail.com> writes:
On Saturday, 28 July 2018 at 14:09:44 UTC, Laeeth Isharc wrote:
 On Saturday, 28 July 2018 at 13:55:31 UTC, Paolo Invernizzi
 Perceptions, expectations, prediction...   an easy read I 
 suggest on the latest trends [1], if someone is interested...
I forgot the link... here it is: https://www.quantamagazine.org/to-make-sense-of-the-present-brains-may-predict-the-future-20180710
 Yes - it's a competitive advantage, but opportunity often comes 
 dressed in work clothes.
Curiosity is the salt of evolution... for example, I'm now intrigued by The Master and His Emissary; I have to read it. And another curiosity: I studied in the '90s in Milano - what were your thoughts on Hayek and von Mises at that time? Classical economics was so boring...
 We're in an era when most people are not used to discomfort and 
 have an inordinate distaste for it.  If you're fine with that 
 and make decisions as best you can based on objective factors 
 (objectivity being something quite different from 
 'evidence-based' because of the drunk/lamppost issue) then 
 there is treasure everywhere (to steal Andrei's talk title).  
 Opportunities are abundant where people aren't looking because 
 they don't want to.
Me and my colleague are pretty different, in the approach to that kind of stuff... Maybe I'll post on the Forum a 'Request for D Advocacy', a-la PostgreSQL, so the community can try to address some of his concerns about modern D, and lower his discomfort! :-P /Paolo
Jul 28 2018
parent reply Abdulhaq <alynch4047 gmail.com> writes:
On Saturday, 28 July 2018 at 14:45:19 UTC, Paolo Invernizzi wrote:
 I forgot the link... here it is:
 https://www.quantamagazine.org/to-make-sense-of-the-present-brains-may-predict-the-future-20180710
An interesting article. I found that Dennett's Consciousness Explained, which is presumably debunked old hat by now, is full of interesting experiments and speculation about how we model things in our mind and how our perceptions feed into that. It's a long time since I read it, but if I remember correctly he shows how we seem to have a kind of mental theatre which has an expectation of what will come next from the senses, leading to interesting mistakes in perception. It's a useful model of how the mind works.

That website often carries good articles about new maths as well.
 Me and my colleague are pretty different, in the approach to 
 that kind of stuff...

 Maybe I'll post on the Forum a 'Request for D Advocacy', a-la 
 PostgreSQL, so the community can try to address some of his 
 concerns about modern D, and lower his discomfort!

 :-P
If you can explain to me what the _direction_ of D is in terms of interfacing with large C++ libraries, it would be very much appreciated! I'd love to be using D for some of my projects, but I have a perception that using e.g. VTK is still a difficult thing to do from D. Is that still true? What is the long-term plan for D - is it extern(C++), a binding technology? Is there any interest in Calypso from the upper echelons? I want to know where D is trying to go, not just where it is now. I want to know if anyone has got their heart in it.

My CV says my main languages are Java, Python and D. That last one is mainly wishful thinking at the moment. I wish it wasn't! Make me believe, Paolo!
Jul 29 2018
parent reply Laeeth Isharc <Laeeth laeeth.com> writes:
On Sunday, 29 July 2018 at 09:35:06 UTC, Abdulhaq wrote:
 On Saturday, 28 July 2018 at 14:45:19 UTC, Paolo Invernizzi 
 wrote:
 I forgot the link... here it is:
 https://www.quantamagazine.org/to-make-sense-of-the-present-brains-may-predict-the-future-20180710
An interesting article. I found that Dennett's Consciousness Explained, which is presumably debunked old hat by now, is full of interesting experiments and speculation about how we model things in our mind and how our perceptions feed into that. It's a long time since I read it, but if I remember correctly he shows how we seem to have a kind of mental theatre which has an expectation of what will come next from the senses, leading to interesting mistakes in perception. It's a useful model of how the mind works.

That website often carries good articles about new maths as well.
 Me and my colleague are pretty different, in the approach to 
 that kind of stuff...

 Maybe I'll post on the Forum a 'Request for D Advocacy', a-la 
 PostgreSQL, so the community can try to address some of his 
 concerns about modern D, and lower his discomfort!

 :-P
If you can explain to me what the _direction_ of D is in terms of interfacing with large C++ libraries, it would be very much appreciated! I'd love to be using D for some of my projects, but I have a perception that using e.g. VTK is still a difficult thing to do from D. Is that still true? What is the long-term plan for D - is it extern(C++), a binding technology? Is there any interest in Calypso from the upper echelons? I want to know where D is trying to go, not just where it is now. I want to know if anyone has got their heart in it.

My CV says my main languages are Java, Python and D. That last one is mainly wishful thinking at the moment. I wish it wasn't! Make me believe, Paolo!
Well, we are hiring D programmers in London and HK, in case it's interesting.

dpp doesn't work with the STL yet. I asked Atila how long it would take to #include <vector> and he thought maybe two months of full-time work. That's not out of the question in time, but we have too much else to do right now. I'm not sure if recent mangling improvements help and how much that changes things. But dpp keeps improving, as does extern(C++), and probably one way and another it will work for quite a lot. Calypso makes C++ classes work as both value and reference types. I don't know the limit of what's possible without such changes - it seems like C++ mangling is improving by leaps and bounds, but I don't know when it will be dependable for templates.

It's not that relevant what Andrei or Walter might think, because it's a community-led project and we will make progress if somebody decides to spend their time working on it, or a company lends a resource for the same purpose. I'm sure they are all in favour of greater C++ interoperability, but I don't think the binding constraint is will from the top, but rather people willing and able to do the work. And if one wants to see it go faster then one can logically find a way to help with the work or contribute financially. I don't think anything else will make a difference.

Same thing with Calypso. It's not ready yet to be integrated into a production compiler, so it's an academic question as to the leadership's view about it.
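For the curious, a minimal sketch of what the extern(C++) route already handles well today - free functions with matching name mangling. Everything below is invented for illustration; the STL cases discussed above are precisely what this does not cover:

// shim.cpp (hypothetical C++ side, compiled with the system C++ compiler)
// int addInts(int a, int b) { return a + b; }

// main.d
extern(C++) int addInts(int a, int b); // declaration matches the C++ mangling

void main()
{
    import std.stdio : writeln;
    writeln(addInts(20, 22)); // calls straight into the C++ object file
}

// build sketch: g++ -c shim.cpp && dmd main.d shim.o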
Jul 31 2018
parent Abdulhaq <alynch4047 gmail.com> writes:
On Tuesday, 31 July 2018 at 22:55:08 UTC, Laeeth Isharc wrote:

 Dpp doesn't work with STL yet.  I asked Atila how long to 
 #include vector and he thought maybe two months of full-time 
 work.  That's not out of the question in time, but we have too 
 much else to do right now.  I'm not sure if recent mangling 
 improvements help and how much that changes things.  But DPP 
 keeps improving as does extern (C++) and probably one way and 
 another it will work for quite a lot.  Calypso makes cpp 
 classes work as both value and reference types.  I don't know 
 the limit of what's possible without such changes - seems like 
 C++ mangling is improving by leaps and bounds but I don't know 
 when it will be dependable for templates.
Yes OK, thanks.
 It's not that relevant what Andrei or Walter might think 
 because it's a community-led project and we will make progress 
 if somebody decides to spend their time working on it, or a 
 company lends a resource for the same purpose.  I'm sure they 
 are all in favour of greater cpp interoperability, but I don't 
 think the binding constraint is will from the top, but rather 
 people willing and able to do the work.
I think the DIP system has greatly improved the situation, but for anyone thinking of embarking on a lot of work for something like e.g. the GC, you do need to feel that there will be a good chance of it being adopted - otherwise all that work could go to waste.
 And if one wants to see it go faster then one can logically 
 find a way to help with the work or contribute financially.  I 
 don't think anything else will make a difference.
Agreed entirely.
 Same thing with Calypso.  It's not ready yet to be integrated 
 in a production compiler so it's an academic question as to the 
 leadership's view about it.
Where I'm coming from is that writing and maintaining something as large and complex as Calypso requires a whole heap of motivation and of encouragement from the sidelines - especially from Walter and/or Andrei. If someone starts to feel that the backing is not there then it's very, very hard to maintain motivation, particularly on infrastructure-related code that, if not integrated by Walter, will always be hard for people to use and therefore not be widely adopted.

To be fair to Walter, though, this is a really intractable problem for him. He could adopt something like Calypso, and then find the original maintainer loses interest. That would leave Walter either needing to maintain someone else's complex code, or trying to extricate himself from code he has already integrated. Also, there is no guarantee, in this particular case, that as C++ evolves it will still be possible to use Calypso's strategy. Of course there are other very good reasons why adopting it is problematic. Still, it leaves the developer struggling, I expect, to maintain motivation.

Considering the above, knowing the general direction that Walter/Andrei want to take D would be a great help in deciding what larger projects are worth undertaking. It seems to me, anyway (big caveat).
Aug 03 2018
prev sibling parent Walter Bright <newshound2 digitalmars.com> writes:
On 7/28/2018 7:09 AM, Laeeth Isharc wrote:
 Opportunities are 
 abundant where people aren't looking because they don't want to.
My father told me I wasn't at all afraid of hard work. I could lie down right next to it and go to sleep.
Jul 28 2018
prev sibling next sibling parent reply Abdulhaq <alynch4047 gmail.com> writes:
On Saturday, 28 July 2018 at 12:43:55 UTC, Laeeth Isharc wrote:
 It's tough when dealing with genuine - Knightian uncertainty or 
 even more radical versions.  When one doesn't even know the 
 structure of the problem then maximising expected utility 
 doesn't work.  One can look at capacities - Choquet and the 
 like - but then it's harder to say something useful about what 
 you should do.
Sounds interesting, I'll look into it.
 But it's a loop and one never takes a final decision to master 
 D. Also habits, routines and structures _do_ shape perception.

 In truth I avoid discussions that are really just arguing 
 about definitions of words, but you made a couple of sweeping 
 bumper-stickery comments
That's entertaining. I've not been accused of that before! Bear in mind also I tend to write on my phone.
I think I was just in need of a decent conversation. I didn't mean it in an accusatory manner :-). TBH I read those comments as coming from a D advocate who was in a motivational mood. They triggered a debate in me that has been wanting to come out, but I rarely contribute to forums these days.
 Yes I read Kahneman et al papers for the first time in 92 in 
 the university library.  I speed-read his book, and I thought 
 it was a bad book.  I work with a specialist in making 
 decisions under uncertainty - she was the only person able to 
 articulate to George Soros how he made money because he 
 certainly couldn't, and she is mentioned in the preface to the 
 revised version of Alchemy.  She has the same view as me - 
 behavioural finance is largely a dead end.  One learns much 
 more by going straight to the neuroeconomics and incorporating 
 also the work of Dr Iain McGilchrist.

 Kahneman makes a mistake in his choice of dimension.  There's 
 analytic and intuitive/gestalt and in my experience people 
 making high stakes decisions are much less purely analytical 
 than a believer in the popular Kahneman might suggest.

 What I said about prediction being overrated isn't 
 controversial amongst a good number of the best traders and 
 business people in finance.  You might read Nassim Taleb also.
You're way ahead of me here, obviously. I didn't read any Taleb until he made an appearance at the local bookshop. It was Black Swan and it didn't say anything that hadn't independently occurred to me already. However, for some reason it seemed to be a revelation to a lot of people.
 Well it's a pity the D Android ecosystem isn't yet mature.  
 Still I remain in awe of the stubborn accomplishment of the man 
 (with help) who got LDC to run on Android.

 It's not that bad calling D from Java.  Some day I will see if 
 I can help automate that - Kai started working on it already I 
 think.
D as a programming language has numerous benefits over Java, but trying to analyse why I would nevertheless choose Kotlin/Java for Android development:

* The Android work I do largely does not need high low-level performance. The important thinking that is done is the user interface, how communication with the servers should look for good performance, caching etc. Designing good algorithms.

* Having done the above, I want a low-friction way of getting that into code. That requires a decent expressive language with a quality build system that can churn out an APK without me having to think too hard about it. Kotlin/JDK8 are good enough and Android Studio helps a lot.

* Given the above, choosing D to implement some of the code would just be a cognitive and time overhead. It's no reflection on D in any way; it's just that all the tooling is for Java and the platform API/ABI is totally designed to host Java.

* "The man who (with help) got LDC to run on Android": the team, with the best will in the world, is too small to answer all the questions that the world of pain known as Android can throw up. Why doesn't this build for me? Gradle is killing me... Dub doesn't seem to be working right after the upgrade to X.Y... it works on my LG but not my Samsung... I've upgraded this but now that doesn't work anymore...

* Will there be a functioning team in 5 years' time? Will they support older versions of Android? Can I develop on Windows? Or Linux? Why not? Etc., etc.
 Since you already know D you need to answer a different 
 question.
  What's the chance the compiler will die on the relevant 
 horizon, and how bad will it be for me if that happens.  
 Personally I'm not worried.   If D should disappear in a few 
 years, it wouldn't be the end of the world to port things.  I 
 just don't think that's very likely.
I answered the Android question already. As for engineering/scientific work (I design/develop engineering frameworks/tools for wing designers), Python has bindings to numpy, Qt, CAD kernels, data-visualisation tools. Python is fast enough to string those things together and run the overarching algorithms, GUIs, launch trade studies and scipy optimisations. It has even more expressive power than D, and we use a typing library that makes it semi-statically typed - good and sound enough for large code bases and development teams. Despite our code using a typing library (Enthought Traits), the IDE doesn't know about that and hence suffers from the same problems as a D IDE will - it simply can't know what code might call that function you wrote, and what types are incoming. Still, given the huge number of quality Python libraries and bindings into the big C++ libraries, that outweighs (currently) the great tooling for e.g. JDK8.
 Of course it depends on your context.  The people who use D at 
 work seem to be more principals who have the right to take the 
 best decision as they see it then agents who must persuade 
 others who are the real decision-makers.  That's a recipe for 
 quiet adoption that's dispersed across many industries 
 initially and for the early adopters of D being highly 
 interesting people.  Since, as the Wharton professor, Adam 
 Grant observes, we are in an age where positive disruptors can 
 achieve a lot within an organisation, that's also rather 
 interesting.
I agree entirely. They need to maintain their own motivation and create the product they want. I'm confident that D will do well. When I read TDPL I was very excited. In that sense, I still am. In the decade since then I started a binding to Qt that got a long way (I wanted to use Qt/VTK from D) but then stopped, as I didn't get enough momentum/motivation to finish it and Calypso came along. I felt that Andrei and Walter should have put all their weight behind it. Following the forums, I don't think that has really happened.

Then, I am happy with the GC approach. I am coming from Java/Python (with smatterings from assembly to LISP), some C++ for a bit of self-flagellation, "because it's there". But the dlang focus on performance became so important, and there was the big push to remove dependence on the GC and to offer a reference-counted alternative. Are either of these options mature? Without investing a lot of time to try them out, I can't make a judgement. I follow this forum but I still haven't managed to draw out an opinion from that data.

I think that I no longer fall into the category of developer that D is after. D is targeting pedal-to-the-metal requirements, and I don't need that. TBH I think 99% of developers don't need it. We like to think we do and we love to marvel at the speed of improved code, but like prediction, it's overrated ;-)
Jul 28 2018
parent reply bpr <brogoff gmail.com> writes:
On Saturday, 28 July 2018 at 15:36:43 UTC, Abdulhaq wrote:
 I think that I no longer fall into the category of developer 
 that D is after. D is targeting pedal-to-the-metal 
 requirements, and I don't need that. TBH I think 99% of 
 developers don't need it.
I'm 99% sure you just made that number up ;-)

For those developers who don't need the performance usually achieved with C or C++, and can tolerate GC overheads, there are, IMO, better languages than D. I'm not saying that here to be inflammatory, just that I believe performance is a very big part of the attractiveness of D. If you're mostly working on Android, then Kotlin seems like your best option for a non-Java language. It seems OK, there's Kotlin/Native in the works, the tooling is fine, there's a REPL, etc. I like it better than I like Go.
 We like to think we do and we love to marvel at the speed of 
 improved code, but like prediction, it's overrated ;-)
For you, perhaps. I currently work mostly at a pretty low level and I'm pretty sure it's not just self delusion that causes us to use C++ at that low level. Perhaps you've noticed the rise of Rust lately? Are the Mozilla engineers behind it deluded in that they eschew GC and exceptions? I doubt it. I mostly prefer higher level languages with GCs, but nothing in life is free, and GC has significant costs.
Jul 28 2018
parent reply Abdulhaq <alynch4047 gmail.com> writes:
On Saturday, 28 July 2018 at 19:55:56 UTC, bpr wrote:
 On Saturday, 28 July 2018 at 15:36:43 UTC, Abdulhaq wrote:
 I think that I no longer fall into the category of developer 
 that D is after. D is targeting pedal-to-the-metal 
 requirements, and I don't need that. TBH I think 99% of 
 developers don't need it.
I'm 99% sure you just made that number up ;-)
Sure, I plucked it out of thin air. But I do think of the software development world as an inverted pyramid in terms of performance demands and headcount.

At the bottom of my inverted pyramid I have Linux and Windows. This code needs to be as performant as possible and as bug-free as possible. C/C++/D shine at this stuff. However, I number those particular developers in the thousands.

Then we have driver writers. Performance is important here, but as a user I feel that I wish they would concentrate on the 'bug-free' part a bit more. Especially those cowboys who develop printer and bluetooth drivers. Of course, according to them it's the hardware that stinks. These guys and gals number in the tens of thousands. Yes, I made that up.

Then we have a layer up: libc developers and co. Then platform developers. Unity and Lumberyard for games. Apache.

I think the great bulk of developers, though, sit at the application development layer. They are pumping out great swathes of Java etc. Users of Spring and dozens of other frameworks. C++ is usually the wrong choice for this type of work, but can be adopted in a mistaken bid for performance. And how many are churning out all that javascript and PHP code?

Hence I think that the number of developers who really need top performance is much smaller than the number who don't.
 For you, perhaps. I currently work mostly at a pretty low level 
 and I'm pretty sure it's not just self delusion that causes us 
 to use C++ at that low level. Perhaps you've noticed the rise 
 of Rust lately? Are the Mozilla engineers behind it deluded in 
 that they eschew GC and exceptions? I doubt it. I mostly prefer 
 higher level languages with GCs, but nothing in life is free, 
 and GC has significant costs.
If I had to write CFD code - and I'd love to have a crack - then I'd really be wanting to use D for its expressiveness and performance. But because of the domain that I do work in, I feel that I am no longer in D's target demographic.

I remember the subject of write barriers coming up in order (I think?) to improve the GC. Around that time Walter said he would not change D in any way that would reduce performance by even 1%. Hence I feel that D is ruling itself out of the application-developer market. That's totally cool with me, but it took me a long time to realise that this was the case, and that therefore D was less promising to me than it had seemed before.
Jul 28 2018
parent reply bpr <brogoff gmail.com> writes:
On Saturday, 28 July 2018 at 20:34:37 UTC, Abdulhaq wrote:
 On Saturday, 28 July 2018 at 19:55:56 UTC, bpr wrote:
 On Saturday, 28 July 2018 at 15:36:43 UTC, Abdulhaq wrote:
 I think that I no longer fall into the category of developer 
 that D is after. D is targeting pedal-to-the-metal 
 requirements, and I don't need that. TBH I think 99% of 
 developers don't need it.
I'm 99% sure you just made that number up ;-)
Sure, I plucked it out of thin air. But I do think of the software development world as an inverted pyramid in terms of performance demands and headcount. At the bottom of my inverted pyramid I have Linux and Windows. This code needs to be as performant as possible and as bug-free as possible. C/C++/D shine at this stuff. However, I number those particular developers in the thousands.
The developers at Mozilla working on the browser internals, for example, are unaccounted for in your analysis. As are the developers where I work.
 I think a great bulk of developers, though, sit at the 
 application development layer. They are pumping out great 
 swathes of Java etc. Users of Spring and dozens of other 
 frameworks. C++ is usually the wrong choice for this type of 
 work, but can be adopted in a mistaken bid for performance.
I don't know that the great bulk of developers work in Java.
 Any how many are churning out all that javascript and PHP code?

 Hence I think that the number of developers who really need top 
 performance is much smaller than the number who don't.
I'd be willing to accept that, but I have no idea what the actual numbers are.
 If I had to write CFD code, and I'd love to have a crack, then 
 I'd really be wanting to use D for its expressiveness and 
 performance. But because of the domain that I do work in, I 
 feel that I am no longer in D's target demographic.
If I had to write CFD code, and I wanted to scratch an itch to use a new language, I'd probably pick Julia, because that community is made up of scientific computing experts. D might be high on my list, but not likely the first choice. C++ would be in there too :-(.
 I remember the subject of write barriers coming up in order (I 
 think?) to improve the GC. Around that time Walter said he 
 would not change D in any way that would reduce performance by 
 even 1%.
Here we kind of agree. If D is going to support a GC, I want a state-of-the-art precise GC like Go has. That may rule out some D features, or incur some cost that high-performance programmers don't like, or even suggest two kinds of pointer (a la Modula-3/Nim), which Walter also dislikes.
 Hence I feel that D is ruling itself out of the application 
 developer market.
At this stage in its life, I don't think D should try to be all things to all programmers, but rather focus on doing a few things way better than the competition.
 That's totally cool with me, but it me a long time to realise 
 that it was the case and that therefore it was less promising 
 to me than it had seemed before.
I hear you. You're looking (roughly) for a better Java/Go/Scala, and I'm looking for a better C/C++/Rust, at least for what I work on now. I don't think D can be both right now; the language which can satisfy both of us doesn't exist yet, though D is close.
Jul 28 2018
next sibling parent reply Abdulhaq <alynch4047 gmail.com> writes:
On Saturday, 28 July 2018 at 21:27:12 UTC, bpr wrote:
 I hear you. You're looking (roughly) for a better 
 Java/Go/Scala, and I'm looking for a better C/C++/Rust, at 
 least for what I work on now. I don't think D can be both right 
 now, and that the language which can satisfy both of us doesn't 
 exist yet, though D is close.
Yes, this. In the light of D's experience, is it even possible to have a language that satisfies both?
Jul 28 2018
parent bpr <brogoff gmail.com> writes:
On Saturday, 28 July 2018 at 21:44:10 UTC, Abdulhaq wrote:
 On Saturday, 28 July 2018 at 21:27:12 UTC, bpr wrote:
 I hear you. You're looking (roughly) for a better 
 Java/Go/Scala, and I'm looking for a better C/C++/Rust, at 
 least for what I work on now. I don't think D can be both 
 right now, and that the language which can satisfy both of us 
 doesn't exist yet, though D is close.
Yes, this. In the light of D's experience, is it even possible to have a language that satisfies both?
I believe that the tension between low- and high-level features makes it nearly impossible, that tracing GC is one of those difficult problems that rules out satisfying both sets of users optimally, and that the best D (and C++ and Nim) can do is to be "mediocre to good, but not great" at both the low-level (C/Rust) domain and high-level domains simultaneously. There are far fewer players in the low-level space, which is why I see D more as a competitor there, and welcome DasBetterC and the noGC initiatives, so that D can be a great low-level language and maybe just a good high-level one.
Jul 30 2018
prev sibling parent reply Kagamin <spam here.lot> writes:
On Saturday, 28 July 2018 at 19:55:56 UTC, bpr wrote:
 Are the Mozilla engineers behind it deluded in that they eschew 
 GC and exceptions? I doubt it.
They are trying to outcompete Chrome in bugs too. You're not Mozilla. And why do you mention exceptions, but not bounds checking?
 Here we kind of agree. If D is going to support a GC, I want a 
 state of the art precise GC like Go has.
Go's GC is far from being state of the art; it trades everything for low latency and ease of configuration.
Jul 31 2018
parent jmh530 <john.michael.hall gmail.com> writes:
On Tuesday, 31 July 2018 at 12:02:55 UTC, Kagamin wrote:
 On Saturday, 28 July 2018 at 19:55:56 UTC, bpr wrote:
 Are the Mozilla engineers behind it deluded in that they 
 eschew GC and exceptions? I doubt it.
They are trying to outcompete Chrome in bugs too. You're not Mozilla. And why do you mention exceptions, but not bounds checking?
Firefox has been complete garbage on my work computer ever since the Quantum update. Works fine at home though.
Jul 31 2018
prev sibling parent Ali Çehreli <acehreli yahoo.com> writes:
On 07/28/2018 05:43 AM, Laeeth Isharc wrote:

 It's not that bad calling D from Java.
Running D's GC in a thread that is started by an external runtime (like Java's) can be problematic. If a D function on another D-runtime thread needs to run a collection, the runtime will not know about this Java thread and won't stop it. One outcome is a crash, if this thread continues to allocate while the other one is collecting.

The solution is having to call thread_attachThis() upon entry to the D function and thread_detachThis() upon exit. However, there are bugs with these functions, for which I posted a pull request (and abandoned it because of 32-bit OS X test failures).

I think a better option would be to forget about all that and not do any GC work in the D function that is called from Java. This simple function should just send a message to a D-runtime thread and return back to Java.

Ali
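A minimal sketch of the attach/detach pattern described above, for a hypothetical D entry point called from a Java-created thread (the function name and its JNI plumbing are invented for illustration):

extern(C) int processRequest(int n)
{
    import core.thread : thread_attachThis, thread_detachThis;

    thread_attachThis();              // register this foreign thread with the D runtime
    scope(exit) thread_detachThis();  // unregister before returning to Java

    auto buf = new int[](64);         // GC allocation is now safe on this thread
    buf[0] = n;
    return buf[0] * 2;
}

Given the bugs mentioned above, the safer design is the last one described: keep the entry point allocation-free and merely hand the work to a thread the D runtime already owns.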
Jul 28 2018
prev sibling parent Walter Bright <newshound2 digitalmars.com> writes:
On 7/25/2018 4:27 PM, Laeeth Isharc wrote:
 I think it's more interesting to be the change you 
 wish to see in the world.
Haha, the whole point of me starting D. I was tired of trying to convince entrenched interests (and I wasn't very good at that).
Jul 28 2018
prev sibling next sibling parent Dave Jones <dave jones.com> writes:
On Monday, 23 July 2018 at 22:45:15 UTC, Walter Bright wrote:
 On 7/23/2018 5:39 AM, Joakim wrote:
 In my experience, people never learn, even from the blatantly 
 obvious, _particularly_ when they're invested in the outdated. 
 What inevitably happens is the new tech gets good enough to 
 put them out of business, then they finally pick it up or 
 retire. Until most system software is written in 
 D/Go/Rust/Swift/Zig/etc., they will keep mouthing platitudes 
 about how C is here to stay.
I've predicted before that what will kill C is managers and customers requiring memory safety because unsafeness costs them millions. The "just hire better programmers" will never work.
My son broke the handle of a jug a few days ago and spent about 10 minutes arguing with me about how it wasn't really broken, you just had to hold it differently.

"C" - you just need to hold it differently.
Jul 24 2018
prev sibling parent reply Jim Balter <Jim Balter.name> writes:
On Monday, 23 July 2018 at 22:45:15 UTC, Walter Bright wrote:
 On 7/23/2018 5:39 AM, Joakim wrote:
 In my experience, people never learn, even from the blatantly 
 obvious, _particularly_ when they're invested in the outdated. 
 What inevitably happens is the new tech gets good enough to 
 put them out of business, then they finally pick it up or 
 retire. Until most system software is written in 
 D/Go/Rust/Swift/Zig/etc., they will keep mouthing platitudes 
 about how C is here to stay.
I've predicted before that what will kill C is managers and customers requiring memory safety because unsafeness costs them millions. The "just hire better programmers" will never work.
It ought to be obvious that "just use better tools" is far cheaper and more effective, but I think one of the problems is something that I also see in politics quite a bit: a lot of people are more interested in feeling superior or punishing people for their flaws than in avoiding bad outcomes. And there's also the magical "if only everyone would ..." thinking. If you want to get everyone to do something they aren't currently doing, you need some *causal mechanism* (and it has to be feasible, which "avoid all mistakes through discipline" is not).
Jul 25 2018
parent Walter Bright <newshound2 digitalmars.com> writes:
On 7/25/2018 2:08 AM, Jim Balter wrote:
 It ought to be obvious that "just use better tools" is far cheaper and 
 more effective, but I think one of the problems is something that I also 
 see in politics quite a bit: a lot of people are more interested in 
 feeling superior or punishing people for their flaws than in avoiding 
 bad outcomes. And there's also the magical "if only everyone would ..." 
 thinking. If you want to get everyone to do something they aren't 
 currently doing, you need some *causal mechanism* (and it has to be 
 feasible, which "avoid all mistakes through discipline" is not).
I recommend anyone interested in design watch "Aviation Disasters", where even the most well-trained pilots fall victim to user interface design flaws that are always obvious in retrospect.

Like the one where the checklist for loss of cabin pressure starts out with fiddling with dials and switches to diagnose the problem. A crew followed that, passed out from hypoxia, crashed and died. The corrected sequence starts with:

Step 1: put on your oxygen mask.

because it only takes a few seconds to pass out from hypoxia. Pretty much every episode follows that pattern.

---

My father worked as an accident investigator for the Air Force once upon a time. The AF used standard phrases for things to reduce confusion. One such is "takeoff power", meaning full power, as that's what you use for taking off. A pilot was coming in for a landing once, and the strip was obstructed, so he yelled "takeoff power" to the copilot, who heard "take off power" and pulled the throttles back. The jet sank and crashed. The AF changed the phrase to "full power" (or "maximum power", I forgot which).

Naturally, this also seems ridiculously obvious - but only in retrospect. Sort of like the infamous Windows "Start" button which turns off the machine :-)

---

My experience is that all programmers (including myself) believe they know what is "intuitively obvious" going forward and what is not. They're all wrong. These things are learned only in hindsight.
Jul 26 2018
prev sibling next sibling parent reply Ecstatic Coder <ecstatic.coder gmail.com> writes:
On Monday, 23 July 2018 at 11:51:54 UTC, Jim Balter wrote:
 On Sunday, 22 July 2018 at 20:10:27 UTC, Walter Bright wrote:
 On 7/21/2018 11:53 PM, Walter Bright wrote:
 My article C's Biggest Mistake on front page of 
 https://news.ycombinator.com !
Direct link: https://news.ycombinator.com/item?id=17585357
The responses are not encouraging, but I suppose they're useful for sociologists studying fallacious thinking.
I agree. As I've already said in the past here on this forum, D's way of managing strings/arrays/slices in the same manner is one of its biggest advances over C/C++, both in safety and expressivity. Very simple stuff indeed, but still light-years ahead of C++. And something that REALLY must be integrated into BetterC's low-level standard library in some way, IMHO...
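For readers coming from C++, a minimal sketch of the unified model being praised here - strings are just immutable(char)[] slices, so arrays and strings share one mechanism:

void main()
{
    string s = "Hello, world";
    auto hello = s[0 .. 5];   // slicing: no copy, the length travels with the slice
    int[] a = [1, 2, 3, 4];
    auto middle = a[1 .. 3];  // identical syntax for any array
    assert(hello == "Hello");
    assert(middle == [2, 3]);
}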
Jul 23 2018
parent reply Dukc <ajieskola gmail.com> writes:
On Monday, 23 July 2018 at 15:06:16 UTC, Ecstatic Coder wrote:
 And something that REALLY must be integrated into BetterC's 
 low-level standard library in some way IMHO...
They already work, except for the concatenation operator, because it obviously requires the GC. And converting a pointer from C code to D is easy, because you can slice pointers just like arrays - it's just that it won't be bounds checked.
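A minimal sketch of that pointer-to-slice conversion (the C functions here are hypothetical):

extern(C) char* c_get_buffer();  // imagine these come from some C library
extern(C) size_t c_get_length();

void useBuffer()
{
    char* p = c_get_buffer();
    auto s = p[0 .. c_get_length()]; // char[] slice over the C memory
    // The slicing of the raw pointer itself is unchecked, but s carries
    // its length, so subsequent indexing of s IS bounds-checked.
}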
Jul 24 2018
parent reply Ecstatic Coder <ecstatic.coder gmail.com> writes:
On Tuesday, 24 July 2018 at 10:40:33 UTC, Dukc wrote:
 On Monday, 23 July 2018 at 15:06:16 UTC, Ecstatic Coder wrote:
 And something that REALLY must be integrated into BetterC's 
 low-level standard library in some way IMHO...
They already work, except for the concatenation operator, because it obviously requires the GC. And converting a pointer from C code to D is easy, because you can slice pointers just like arrays - it's just that it won't be bounds checked.
Nice. But if you want D to be REALLY appealing to a majority of C++ developers, you'd better provide them with the FULL D experience. And unfortunately, using builtin arrays/strings/slices/maps in the usual way is probably a big part of it.

Don't forget that concatenating strings is perfectly ALLOWED in C++, WITHOUT using a GC...

#include <iostream>

using namespace std;

int main()
{
    string str, str1, str2;
    str1 = "Hello";
    str2 = "World";
    str = str1 + " " + str2;
    cout << str << endl;

    return 0;
}

Instead of removing D's GC and the features which rely on it, you'd better replace it with something which releases the unused memory blocks as soon as they become unused, like the reference-counting approach used not only in C++, but also in Kotlin/Native, Crack, etc...

THAT would make D stand above its competition, by making it more pleasing and enjoyable to use than C, C++ and Rust, for instance, for their typical use cases...
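For what it's worth, a minimal sketch of that reference-counting direction using std.typecons.RefCounted, which Phobos already ships; whether this can replace the GC wholesale is of course exactly the debate:

import std.typecons : RefCounted;

void main()
{
    auto a = RefCounted!int(42); // the int lives outside the GC heap
    {
        auto b = a;              // copying bumps the count to 2
        assert(b == 42);
    }                            // b destroyed: count back to 1
    assert(a == 42);
}                                // count hits 0: memory freed immediately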
Jul 24 2018
next sibling parent reply Atila Neves <atila.neves gmail.com> writes:
On Tuesday, 24 July 2018 at 11:53:35 UTC, Ecstatic Coder wrote:
 On Tuesday, 24 July 2018 at 10:40:33 UTC, Dukc wrote:
 On Monday, 23 July 2018 at 15:06:16 UTC, Ecstatic Coder wrote:
 [...]
They already work, except for the concatenation operator, because it obviously requires the GC. And converting a pointer from C code to D is easy, because you can slice pointers just like arrays - it's just that it won't be bounds checked.
Nice. But if you want D to be REALLY appealing to a majority of C++ developers, you'd better provide them with the FULL D experience. And unfortunately, using builtin arrays/strings/slices/maps in the usual way is probably a big part of it. Don't forget that concatenating strings is perfectly ALLOWED in C++, WITHOUT using a GC...
Same in D, it's just that nobody's bothered writing a string class/struct. Atila
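To illustrate the point, a minimal sketch of such a struct - malloc-backed and move-only to keep it short, and decidedly not production code:

import core.stdc.stdlib : malloc, free;
import core.stdc.string : memcpy;

struct CStr
{
    private char* ptr;
    private size_t len;

    this(const(char)[] s)
    {
        len = s.length;
        ptr = cast(char*) malloc(len);
        memcpy(ptr, s.ptr, len);
    }

    ~this() { free(ptr); }  // deterministic release, no GC involved
    @disable this(this);    // move-only, to dodge ownership questions here

    // '~' concatenation without the GC: allocate a fresh buffer and copy
    CStr opBinary(string op : "~")(const(char)[] rhs) const
    {
        CStr r;
        r.len = len + rhs.length;
        r.ptr = cast(char*) malloc(r.len);
        memcpy(r.ptr, ptr, len);
        memcpy(r.ptr + len, rhs.ptr, rhs.length);
        return r;
    }

    const(char)[] opSlice() const { return ptr[0 .. len]; }
}

unittest
{
    auto s = CStr("Hello") ~ " World";
    assert(s[] == "Hello World");
}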
Jul 24 2018
parent reply Ecstatic Coder <ecstatic.coder gmail.com> writes:
On Tuesday, 24 July 2018 at 12:13:27 UTC, Atila Neves wrote:
 On Tuesday, 24 July 2018 at 11:53:35 UTC, Ecstatic Coder wrote:
 On Tuesday, 24 July 2018 at 10:40:33 UTC, Dukc wrote:
 On Monday, 23 July 2018 at 15:06:16 UTC, Ecstatic Coder wrote:
 [...]
They already work, except for the concatenation operator, because it obviously requires the GC. And converting a pointer from C code to D is easy, because you can slice pointers just like arrays - it's just that it won't be bounds checked.
Nice. But if you want D to be REALLY appealing to a majority of C++ developers, you'd better provide them with the FULL D experience. And unfortunately, using builtin arrays/strings/slices/maps in the usual way is probably a big part of it. Don't forget that concatenating strings is perfectly ALLOWED in C++, WITHOUT using a GC...
Same in D, it's just that nobody's bothered writing a string class/struct. Atila
Indeed...
Jul 24 2018
parent Seb <seb wilzba.ch> writes:
On Tuesday, 24 July 2018 at 13:52:20 UTC, Ecstatic Coder wrote:
 On Tuesday, 24 July 2018 at 12:13:27 UTC, Atila Neves wrote:
 Same in D, it's just that nobody's bothered writing a string 
 class/struct.

 Atila
Indeed...
Better late than never: https://forum.dlang.org/post/fsjspoewhdooowjotgok forum.dlang.org
Jul 26 2018
prev sibling next sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 7/24/2018 4:53 AM, Ecstatic Coder wrote:
      str = str1 + " " + str2;
But you have to be careful how it is written:

    str = "hello" + "world";
    str = "hello" + "world" + str1;

don't work, etc.
Jul 25 2018
next sibling parent reply Ecstatic Coder <ecstatic.coder gmail.com> writes:
On Wednesday, 25 July 2018 at 21:16:40 UTC, Walter Bright wrote:
 On 7/24/2018 4:53 AM, Ecstatic Coder wrote:
      str = str1 + " " + str2;
But you have to be careful how it is written:

    str = "hello" + "world";
    str = "hello" + "world" + str1;

don't work, etc.
Yeah. That's exactly where D shines, and C++ s*cks... C++ string constants are stupid pointers, no size etc. Indeed one big C++ silly thing that Walter fixed perfectly. He is the only language designer who found and applied the perfect solution for strings, arrays and slices. Big respect to him...
Jul 25 2018
parent Walter Bright <newshound2 digitalmars.com> writes:
On 7/25/2018 3:46 PM, Ecstatic Coder wrote:
 C++ string constants are stupid pointers, no size etc. Indeed one big 
 C++ silly thing that Walter fixed perfectly. He is the only language 
 designer who found and applied the perfect solution for strings, arrays 
 and slices. Big respect to him...
Not everyone agrees with that assessment, but I'm pretty happy that it has worked even better than I'd dared to hope.
Jul 26 2018
prev sibling parent reply Paulo Pinto <pjmlp progtools.org> writes:
On Wednesday, 25 July 2018 at 21:16:40 UTC, Walter Bright wrote:
 On 7/24/2018 4:53 AM, Ecstatic Coder wrote:
      str = str1 + " " + str2;
But you have to be careful how it is written:

    str = "hello" + "world";
    str = "hello" + "world" + str1;

don't work, etc.
Well, like everything in C++, there is always a way.

    str = "hello"s + "world";
    str = "hello"s + "world" + str1;

Spot the difference. :)
Jul 25 2018
parent Ecstatic Coder <ecstatic.coder gmail.com> writes:
On Thursday, 26 July 2018 at 06:04:33 UTC, Paulo Pinto wrote:
 On Wednesday, 25 July 2018 at 21:16:40 UTC, Walter Bright wrote:
 On 7/24/2018 4:53 AM, Ecstatic Coder wrote:
      str = str1 + " " + str2;
But you have to be careful how it is written:

    str = "hello" + "world";
    str = "hello" + "world" + str1;

don't work, etc.

Well, like everything in C++, there is always a way.

    str = "hello"s + "world";
    str = "hello"s + "world" + str1;

Spot the difference. :)
It's just syntactic sugar for a constructed string. You can't even use C++14 string literals to initialize a string view, or you have a dangling pointer, as it's NOT a true constant. Ridiculous...
Jul 25 2018
prev sibling parent Arun Chandrasekaran <aruncxy gmail.com> writes:
On Tuesday, 24 July 2018 at 11:53:35 UTC, Ecstatic Coder wrote:
 On Tuesday, 24 July 2018 at 10:40:33 UTC, Dukc wrote:
 On Monday, 23 July 2018 at 15:06:16 UTC, Ecstatic Coder wrote:
 And something that REALLY must be integrated into BetterC's 
 low-level standard library in some way IMHO...
They already work, except for the concatenation operator because it obviously requires the GC. And converiting a pointer from C code to D is easy, because you can slice pointers just like arrays -it's just that it won't be bounds checked.
Nice. But if you want D to be REALLY appealing to a majority of C++ developers, you'd better provide them with the FULL D experience. And unfortunately, using builtin arrays/strings/slices/maps in the usual way is probably a big part of it. Don't forget that concatenating strings is perfectly ALLOWED in C++, WITHOUT using a GC...

#include <iostream>

using namespace std;

int main()
{
    string str, str1, str2;
    str1 = "Hello";
    str2 = "World";
    str = str1 + " " + str2;
    cout << str << endl;

    return 0;
}
Recently in my code base, similar concatenation worked fine in debug mode but crashed in release mode: Win64, VS2017. It worked fine on Linux with GCC 7.3. I had to use std::ostringstream to resolve it, or use .append(). This kind of UB is what makes a language esoteric. C wins the lot for UBs nevertheless!
Jul 26 2018
prev sibling parent Walter Bright <newshound2 digitalmars.com> writes:
On 7/23/2018 4:51 AM, Jim Balter wrote:
 The responses are not encouraging, but I suppose they're useful for 
 sociologists studying fallacious thinking.
A big motivation for starting D was not having to convince the C/C++ community of such things. I'd rather write code than argue.
Jul 23 2018
prev sibling parent reply Dibyendu Majumdar <mobile majumdar.org.uk> writes:
https://sqlite.org/whyc.html

Personally I think the D team should try to convince some well-known 
project to switch from C to D. Not many projects are written in C 
these days, though... but SQLite is amongst the few.
Jul 28 2018
parent Kagamin <spam here.lot> writes:
On Saturday, 28 July 2018 at 21:49:12 UTC, Dibyendu Majumdar 
wrote:
 https://sqlite.org/whyc.html

 Personally I think D team should try to convince some well 
 known project to switch from C to D. Not many projects are 
 written in C these days though ... but SQLite is amongst the 
 few.
 The C language is old and boring. It is a well-known and 
 well-understood language.

If C is so well understood, why do we have all these buffer overflows?
Jul 31 2018