
digitalmars.D - Why is D unpopular?

reply Dr Machine Code <jckj33 gmail.com> writes:
It got [asked on the reddit sub](https://www.reddit.com/r/d_language/comments/q74bzr/why_is_d_unpopular/), but for those who aren't active there, I'd also like your opinions. Please don't get me wrong, I also love D. I've used it everywhere I can, and I'd say it's my favourite language (yes, I have one...), but like the reddit OP, I'm trying to understand why it's unpopular. Rust and Go seem to be getting more and more users. I think it's due to their large ecosystems and the big corporations with deep pockets that push them. But I'd like to know all your opinions.
Nov 02 2021
next sibling parent reply IGotD- <nise nise.com> writes:
On Tuesday, 2 November 2021 at 17:27:25 UTC, Dr Machine Code 
wrote:
 
 I think it's due to their large ecosystems and the big 
 corporations with deep pockets that push them. But I'd like to 
 know all your opinions.
Yes, that's a big part of it. If you look at other languages that don't have corporate backing, they are about as popular as D. Also, when something is developed in a proper organization, it usually means some kind of functional management and a few developers who can work on it full time. Python is kind of an outlier here; it has grown organically.
Nov 02 2021
next sibling parent reply Tejas <notrealemail gmail.com> writes:
On Tuesday, 2 November 2021 at 17:35:08 UTC, IGotD- wrote:
 On Tuesday, 2 November 2021 at 17:27:25 UTC, Dr Machine Code 
 wrote:
 
 I think it's due to their large ecosystems and the big 
 corporations with deep pockets that push them. But I'd like to 
 know all your opinions.
Yes, that's a big part of it. If you look at other languages that don't have corporate backing, they are about as popular as D. Also, when something is developed in a proper organization, it usually means some kind of functional management and a few developers who can work on it full time. Python is kind of an outlier here; it has grown organically.
Also C++, maybe? It's not like AT&T straight up commissioned Dr. Bjarne to write C with Classes; at least the first few years of cfront were very rough, and it stayed relevant only because it had 100% C compatibility.
Nov 02 2021
parent reply Paulo Pinto <pjmlp progtools.org> writes:
On Tuesday, 2 November 2021 at 17:50:04 UTC, Tejas wrote:
 On Tuesday, 2 November 2021 at 17:35:08 UTC, IGotD- wrote:
 On Tuesday, 2 November 2021 at 17:27:25 UTC, Dr Machine Code 
 wrote:
 
 I think it's due to their large ecosystems and the big 
 corporations with deep pockets that push them. But I'd like to 
 know all your opinions.
Yes, that's a big part of it. If you look at other languages that don't have corporate backing, they are about as popular as D. Also, when something is developed in a proper organization, it usually means some kind of functional management and a few developers who can work on it full time. Python is kind of an outlier here; it has grown organically.
Also C++, maybe? It's not like AT&T straight up commissioned Dr. Bjarne to write C with Classes; at least the first few years of cfront were very rough, and it stayed relevant only because it had 100% C compatibility.
That fact alone made almost every C compiler vendor embrace C++, as C++ was born in the same place as UNIX and C. Then on the PC and Mac it quickly got the love from Apple (replacing Object Pascal with C++), IBM, Microsoft, Borland, Watcom, SGI, Sun, HP, among others, and naturally Digital Mars/Symantec as well. During the 1990s, C++ was everywhere for desktop GUI frameworks from all OS vendors, and even TUI ones (like Turbo Vision). Also, Bjarne likes to tell how C++ only became an ISO standard because representatives of companies like IBM, and others of similar size, made it a requirement for adoption after a visit to AT&T.
Nov 02 2021
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 11/2/2021 11:48 AM, Paulo Pinto wrote:
 Then on the PC and Mac it quickly got the love from Apple (replacing Object 
 Pascal with C++), IBM, Microsoft, Borland, Watcom, SGI, Sun, HP, among 
 others, and naturally Digital Mars/Symantec as well.
Um, Zortech C++ was the first native C++ compiler on DOS in 1987. (The existing ones were all cfront based, and were terribly slow.) ZTC++ produced the first boom in use of C++, accounting for perhaps 90% of C++ use. This popularity led to Borland dumping their own OOP C and going with C++, which then led to Microsoft getting on the bandwagon. This popularity then fed back into the Unix systems. No, you won't find this account in D&E or the other C++ histories, but it's what actually happened.
Apr 28 2022
next sibling parent reply Tejas <notrealemail gmail.com> writes:
On Friday, 29 April 2022 at 01:33:36 UTC, Walter Bright wrote:
 On 11/2/2021 11:48 AM, Paulo Pinto wrote:
 Then on the PC and Mac it quickly got the love from Apple 
 (replacing Object Pascal with C++), IBM, Microsoft, Borland, 
 Watcom,  SGI, Sun, HP, among others, and naturally Digital 
 Mars/Symantec as well.
Um, Zortech C++ was the first native C++ compiler on DOS in 1987. (The existing ones were all cfront based, and were terribly slow.) ZTC++ produced the first boom in use of C++, accounting for perhaps 90% of C++ use. This popularity led to Borland dumping their own OOP C and going with C++, which then led to Microsoft getting on the bandwagon. This popularity then fed back into the Unix systems. No, you won't find this account in D&E or the other C++ histories, but it's what actually happened.
What is D&E? The internet suggests "Design & Engineering". Is that it?
Apr 28 2022
parent Walter Bright <newshound2 digitalmars.com> writes:
On 4/28/2022 7:56 PM, Tejas wrote:
 What is D&E? The internet suggests "Design & Engineering". 
 Is that it?
Design and Evolution
Apr 28 2022
prev sibling next sibling parent reply Araq <rumpf_a web.de> writes:
On Friday, 29 April 2022 at 01:33:36 UTC, Walter Bright wrote:
 On 11/2/2021 11:48 AM, Paulo Pinto wrote:
 Then on the PC and Mac it quickly got the love from Apple 
 (replacing Object Pascal with C++), IBM, Microsoft, Borland, 
 Watcom,  SGI, Sun, HP, among others, and naturally Digital 
 Mars/Symantec as well.
Um, Zortech C++ was the first native C++ compiler on DOS in 1987. (The existing ones were all cfront based, and were terribly slow.)
From D&E: "The size of this overhead depends critically on the time needed to read and write the intermediate C representation and that primarily depends on the disc read/write strategy of a system. Over the years I have measured this overhead on various systems and found it to be between 25% and 100% of the "necessary" parts of a compilation. I have also seen C++ compilers that didn't use intermediate C yet were slower than Cfront plus a C compiler." That's not "terribly slow". And before you bring up "templates are slow to compile", in 1987 cfront did not have templates. "The earliest implementation of templates that was integrated into a compiler was a version of Cfront that supported class templates (only) written by Sam Haradhvala at Object Design Inc. in 1989."
 ZTC++ produced the first boom in use of C++, accounting for 
 perhaps 90% of C++ use.

 This popularity led to Borland dumping their own OOP C and 
 going with C++, which then led to Microsoft getting on the 
 bandwagon.

 This popularity then fed back into the Unix systems.

 No, you won't find this account in D&E or the other C++ 
 histories, but it's what actually happened.
Well, that's the history as you remember it, and Stroustrup does list "1st Zortech C++ release" in June 1988. I cannot say whether your "90%" figure is correct or not.
Apr 28 2022
next sibling parent reply Bruce Carneal <bcarneal gmail.com> writes:
On Friday, 29 April 2022 at 04:09:40 UTC, Araq wrote:
 On Friday, 29 April 2022 at 01:33:36 UTC, Walter Bright wrote:
 On 11/2/2021 11:48 AM, Paulo Pinto wrote:
...
 Um, Zortech C++ was the first native C++ compiler on DOS in 
 1987. (The existing ones were all cfront based, and were 
 terribly slow.)
From D&E: "The size of this overhead depends critically on the time needed to read and write the intermediate C representation and that primarily depends on the disc read/write strategy of a system. Over the years I have measured this overhead on various systems and found it to be between 25% and 100% of the "necessary" parts of a compilation. I have also seen C++ compilers that didn't use intermediate C yet were slower than Cfront plus a C compiler." That's not "terribly slow". And before you bring up "templates are slow to compile", in 1987 cfront did not have templates.
Is there evidence that Zortech C++ was one of the "various systems" mentioned in your quote? Is it known that the "necessary parts of a compilation" were implemented to run at competitive speed (as opposed to, say, 4X slower than your best effort)? ...
 No, you won't find this account in D&E or the other C++ 
 histories, but it's what actually happened.
Well, that's the history as you remember it, and Stroustrup does list "1st Zortech C++ release" in June 1988. I cannot say whether your "90%" figure is correct or not.
Is your intent here to make clear that you have no access to hard data or that you don't believe Walter? Both? Other?
Apr 28 2022
parent reply Araq <rumpf_a web.de> writes:
On Friday, 29 April 2022 at 05:37:40 UTC, Bruce Carneal wrote:
 Is your intent here to make clear that you have no access to 
 hard data or that you don't believe Walter?  Both?  Other?
My intent is to get hard data from Walter.
Apr 28 2022
parent Walter Bright <newshound2 digitalmars.com> writes:
On 4/28/2022 11:11 PM, Araq wrote:
 My intent is to get hard data from Walter.
https://dl.acm.org/doi/abs/10.1145/3386323
Apr 29 2022
prev sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 4/28/2022 9:09 PM, Araq wrote:
 On Friday, 29 April 2022 at 01:33:36 UTC, Walter Bright wrote:
 Um, Zortech C++ was the first native C++ compiler on DOS in 1987. (The 
 existing ones were all cfront based, and were terribly slow.)
From D&E: "The size of this overhead depends critically on the time needed to read and write the intermediate C representation and that primarily depends on the disc read/write strategy of a system. Over the years I have measured this overhead on various systems and found it to be between 25% and 100% of the "necessary" parts of a compilation. I have also seen C++ compilers that didn't use intermediate C yet were slower than Cfront plus a C compiler." That's not "terribly slow". And before you bring up "templates are slow to compile", in 1987 cfront did not have templates. "The earliest implementation of templates that was integrated into a compiler was a version of Cfront that supported class templates (only) written by Sam Haradhvala at Object Design Inc. in 1989."
Zortech C++ was about 4 times faster than cfront-based C++ on DOS machines. Those were my measurements. I agree it had nothing to do with templates. Personally, I doubt Stroustrup ever tried ZTC++. I was wrong earlier: ZTC++ came out in 1988, not 1987.
 ZTC++ produced the first boom in use of C++, accounting for perhaps 90% of C++ 
 use.

 This popularity led to Borland dumping their own OOP C and going with C++, 
 which then led to Microsoft getting on the bandwagon.

 This popularity then fed back into the Unix systems.

 No, you won't find this account in D&E or the other C++ histories, but it's 
 what actually happened.
Well, that's the history as you remember it, and Stroustrup does list "1st Zortech C++ release" in June 1988. I cannot say whether your "90%" figure is correct or not.
DOS computers were where the great mass of programmers were at the time; 90% is likely conservative. The programming magazines were all focused on DOS programming, and the articles about C++ were about DOS C++ programming. Before ZTC++, the traffic on comp.lang.c++ and comp.lang.objectivec was about the same, and not very much. After ZTC++, the traffic on comp.lang.c++ took off, and comp.lang.objectivec stagnated.
Apr 29 2022
next sibling parent reply Araq <rumpf_a web.de> writes:
On Friday, 29 April 2022 at 14:21:46 UTC, Walter Bright wrote:
 Zortech C++ was about 4 times faster than cfront-based C++ on 
 DOS machines. Those were my measurements. I agree it had nothing 
 to do with templates. Personally, I doubt Stroustrup ever 
 tried ZTC++.

 I was wrong earlier: ZTC++ came out in 1988, not 1987.

 DOS computers were where the great mass of programmers were at 
 the time; 90% is likely conservative. The programming magazines 
 were all focused on DOS programming, and the articles about C++ 
 were about DOS C++ programming.

 Before ZTC++, the traffic on comp.lang.c++ and 
 comp.lang.objectivec was about the same, and not very much. 
 After ZTC++, the traffic on comp.lang.c++ took off, and 
 comp.lang.objectivec stagnated.
Thank you!
Apr 29 2022
parent Walter Bright <newshound2 digitalmars.com> writes:
On 4/29/2022 9:10 AM, Araq wrote:
 Thank you!
You're welcome! BTW, the reasons for cfront's execrable performance were pretty simple:

1. ZTC did everything in memory; cfront+MSC wrote intermediate files to/from disk multiple times.
2. cfront's output was C source code, meaning that MSC had to preprocess, retokenize, and reparse it.

cfront also had an intractable problem in that it did not support near/far pointer types, which rendered it nearly unusable for non-trivial 16-bit DOS programming. Zortech brought a usable C++ compiler to DOS, and DOS was where all the action and the money was.
Apr 29 2022
prev sibling parent reply Araq <rumpf_a web.de> writes:
On Friday, 29 April 2022 at 14:21:46 UTC, Walter Bright wrote:
 Zortech C++ was about 4 times faster than cfront based C++ on 
 DOS machines. This was my measurements. I agree it had nothing 
 to do with templates. Personally I doubt Stroustrup had ever 
 tried ZTC++.
For this comparison, which underlying C compiler did cfront use? Zortech's?
Apr 29 2022
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 4/29/2022 9:15 AM, Araq wrote:
 For this comparison, which underlying C compiler did cfront use? Zortech's?
Microsoft's.
Apr 29 2022
parent reply Arjan <arjan ask.me.to> writes:
On Friday, 29 April 2022 at 17:26:44 UTC, Walter Bright wrote:
 On 4/29/2022 9:15 AM, Araq wrote:
 For this comparison, which underlying C compiler did cfront 
 use? Zortech's?
Microsoft's.
IIRC Comeau C++ also transpiled to C, and it supported many C compilers, including DMC.
Apr 29 2022
parent Walter Bright <newshound2 digitalmars.com> writes:
On 4/29/2022 11:38 AM, Arjan wrote:
 IIRC Comeau C++ also transpiled to C, and it supported many C compilers, 
 including DMC.
True, but Comeau's compiler never really caught on.
Apr 29 2022
prev sibling parent reply Paulo Pinto <pjmlp progtools.org> writes:
On Friday, 29 April 2022 at 01:33:36 UTC, Walter Bright wrote:
 On 11/2/2021 11:48 AM, Paulo Pinto wrote:
 Then on the PC and Mac it quickly got the love from Apple 
 (replacing Object Pascal with C++), IBM, Microsoft, Borland, 
 Watcom,  SGI, Sun, HP, among others, and naturally Digital 
 Mars/Symantec as well.
Um, Zortech C++ was the first native C++ compiler on DOS in 1987. (The existing ones were all cfront based, and were terribly slow.) ZTC++ produced the first boom in use of C++, accounting for perhaps 90% of C++ use. This popularity led to Borland dumping their own OOP C and going with C++, which then led to Microsoft getting on the bandwagon. This popularity then fed back into the Unix systems. No, you won't find this account in D&E or the other C++ histories, but it's what actually happened.
That was my experience living through it, from the 1980s Portuguese view of the computing world. For example, I never saw Zortech being sold in any computer shop, while Watcom, Borland and Microsoft compilers were available all over the country. Symantec's later became available in the mid-90s.
Apr 29 2022
next sibling parent IGotD- <nise nise.com> writes:
On Friday, 29 April 2022 at 07:32:09 UTC, Paulo Pinto wrote:
 That was my experience living through it, from the 1980s 
 Portuguese view of the computing world.
 
 For example, I never saw Zortech being sold in any computer 
 shop, while Watcom, Borland and Microsoft compilers were 
 available all over the country. Symantec's later became 
 available in the mid-90s.
That is also my experience. Microsoft, Borland and Watcom were big, but I never heard of Zortech. Later on, in the 90s, I did see the Digital Mars C++ compiler.
Apr 29 2022
prev sibling parent Walter Bright <newshound2 digitalmars.com> writes:
On 4/29/2022 12:32 AM, Paulo Pinto wrote:
 For example, I never saw Zortech being sold in any computer 
 shop, while Watcom, Borland and Microsoft compilers were 
 available all over the country. Symantec's later became 
 available in the mid-90s.
ZTC was only sold mail order.
Apr 29 2022
prev sibling next sibling parent workman <workman gmail.com> writes:
On Tuesday, 2 November 2021 at 17:35:08 UTC, IGotD- wrote:
 On Tuesday, 2 November 2021 at 17:27:25 UTC, Dr Machine Code 
 wrote:
 
 I think it's due to their large ecosystems and the big 
 corporations with deep pockets that push them. But I'd like to 
 know all your opinions.
Yes, that's a big part of it. If you look at other languages that don't have corporate backing, they are about as popular as D. Also, when something is developed in a proper organization, it usually means some kind of functional management and a few developers who can work on it full time. Python is kind of an outlier here; it has grown organically.
Because Python/Vue and some other projects are productive and easy to maintain; that is why they are popular. D claims to be "productive", but it is not easy to maintain.
Nov 02 2021
prev sibling parent reply harakim <harakim gmail.com> writes:
On Tuesday, 2 November 2021 at 17:35:08 UTC, IGotD- wrote:
 On Tuesday, 2 November 2021 at 17:27:25 UTC, Dr Machine Code 
 wrote:
 
 I think it's due to their large ecosystems and the big 
 corporations with deep pockets that push them. But I'd like to 
 know all your opinions.
Yes, that's a big part of it. If you look at other languages that don't have corporate backing, they are about as popular as D. Also, when something is developed in a proper organization, it usually means some kind of functional management and a few developers who can work on it full time. Python is kind of an outlier here; it has grown organically.
I've been thinking about this post for a while. What about R, Perl, Ruby or PHP? I don't think even C++ had corporate backing out of the gate. Which company backed Scala? Sun? It seems like Python is in good company, and the corporate-backed languages are the outliers.
Nov 03 2021
next sibling parent Paulo Pinto <pjmlp progtools.org> writes:
On Wednesday, 3 November 2021 at 17:36:19 UTC, harakim wrote:
 On Tuesday, 2 November 2021 at 17:35:08 UTC, IGotD- wrote:
 On Tuesday, 2 November 2021 at 17:27:25 UTC, Dr Machine Code 
 wrote:
 
 I think it's due to their large ecosystems and the big 
 corporations with deep pockets that push them. But I'd like to 
 know all your opinions.
Yes, that's a big part of it. If you look at other languages that don't have corporate backing, they are about as popular as D. Also, when something is developed in a proper organization, it usually means some kind of functional management and a few developers who can work on it full time. Python is kind of an outlier here; it has grown organically.
I've been thinking about this post for a while. What about R, Perl, Ruby or PHP? I don't think even C++ had corporate backing out of the gate. Which company backed Scala? Sun? It seems like Python is in good company, and the corporate-backed languages are the outliers.
C++ had the corporate backing of AT&T, being part of UNIX, so all major C compiler vendors either adopted CFront or tried to create their own compiler; one of the first was created by Walter. All UNIX workstations had a C++ compiler alongside C, Apple moved from Object Pascal to C++, Windows and OS/2 used C for low level code alongside high level frameworks in C++ like OWL, VCL and MFC, Borland had an MS-DOS C++ framework that remains famous to this day (Turbo Vision), BeOS and Symbian were written in C++, and in the last MS-DOS days Watcom C++ was the go-to compiler for game development. PHP had corporate support from Zend and ISPs across the world. Python had initial backing from Zope, followed by major corporations like Google and Dropbox that kept paying Guido's salary. Ruby was mostly ignored until Ruby on Rails stormed the world.
Nov 03 2021
prev sibling parent bachmeier <no spam.net> writes:
On Wednesday, 3 November 2021 at 17:36:19 UTC, harakim wrote:
 On Tuesday, 2 November 2021 at 17:35:08 UTC, IGotD- wrote:
 On Tuesday, 2 November 2021 at 17:27:25 UTC, Dr Machine Code 
 wrote:
 
 I think it's due to their large ecosystems and the big 
 corporations with deep pockets that push them. But I'd like to 
 know all your opinions.
Yes, that's a big part of it. If you look at other languages that don't have corporate backing, they are about as popular as D. Also, when something is developed in a proper organization, it usually means some kind of functional management and a few developers who can work on it full time. Python is kind of an outlier here; it has grown organically.
I've been thinking about this post for a while. What about R, Perl, Ruby or PHP? I don't think even C++ had corporate backing out of the gate. Which company backed Scala? Sun? It seems like Python is in good company, and the corporate-backed languages are the outliers.
[S was created at Bell Labs](https://web.archive.org/web/20181014111802/http://ect.bell-labs.com/sl/S/) shortly after the creation of C. R is an open source implementation of S that became extremely popular for university teaching. Students in many fields, not just statistics, use it in multiple courses before graduation. And unlike other applications, someone analyzing data has a lot of freedom in choosing their language, so adoption didn't require convincing someone in a suit.
Nov 03 2021
prev sibling next sibling parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Tuesday, 2 November 2021 at 17:27:25 UTC, Dr Machine Code 
wrote:
 It got [asked on the reddit 
 sub](https://www.reddit.com/r/d_language/comments/q74bzr/why_is_d_unpopular/), 
 but for those who aren't active there, I'd also like your 
 opinions. Please don't get me wrong, I also love D. I've used 
 it everywhere I can, and I'd say it's my favourite language 
 (yes, I have one...), but like the reddit OP, I'm trying to 
 understand why it's unpopular.
I don't think it is reasonable to say it is unpopular; [Github activity shows that people create new projects with it](https://forum.dlang.org/thread/ltfgzovqcadknyjnabwp forum.dlang.org) at roughly the same rate as Nim, Crystal and other smaller languages. What would be interesting to know is: what made people who were very enthusiastic about D in the past (in the forums) switch to another language? Which language was it, and why was that a better fit for them?
Nov 02 2021
next sibling parent reply Satoshi <satoshi rikarin.org> writes:
On Tuesday, 2 November 2021 at 18:01:37 UTC, Ola Fosheim Grøstad 
wrote:
 On Tuesday, 2 November 2021 at 17:27:25 UTC, Dr Machine Code 
 wrote:
 It got [asked on the reddit 
 sub](https://www.reddit.com/r/d_language/comments/q74bzr/why_is_d_unpopular/), 
 but for those who aren't active there, I'd also like your 
 opinions. Please don't get me wrong, I also love D. I've used 
 it everywhere I can, and I'd say it's my favourite language 
 (yes, I have one...), but like the reddit OP, I'm trying to 
 understand why it's unpopular.
I don't think it is reasonable to say it is unpopular; [Github activity shows that people create new projects with it](https://forum.dlang.org/thread/ltfgzovqcadknyjnabwp forum.dlang.org) at roughly the same rate as Nim, Crystal and other smaller languages. What would be interesting to know is: what made people who were very enthusiastic about D in the past (in the forums) switch to another language? Which language was it, and why was that a better fit for them?
The reasons I left D were:
o The language is inconsistent and lacks a clear vision.
o Too many BS features, but lacking cutting-edge syntactic sugar and features such as async/await (the state machine, not the fiber joke), nullable types, and forced nullability...
o Metaprogramming is hard to understand and even harder to debug.
o Lack of tutorials and frameworks.
o Not as OOP a language as I wanted.
o Too many features, but not every one of them was finished, and sometimes I didn't get the concept behind it.
o The developers of the language were focused on their vision and totally ignored others. Also I don't know why they didn't take the inspiration from other modern languages. I had a feeling of an old conservative guy in his sixties, for more than 7 years.
Apr 27 2022
next sibling parent Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Wednesday, 27 April 2022 at 15:59:43 UTC, Satoshi wrote:
 o The language is inconsistent and lacks a clear vision.
By inconsistent, do you mean the syntax?
 o Not as OOP a language as I wanted.
I am a bit surprised by this as D has the classic OOP mechanisms. What OOP features are you missing?
 I had a feeling of an old conservative guy in his sixties, for 
 more than 7 years.
The type system is a bit weak, like C, so I get what you mean, mostly.
Apr 27 2022
prev sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 4/27/2022 8:59 AM, Satoshi wrote:
 Also I don't know why they didn't take the inspiration from other modern 
 languages.
Other languages have taken inspiration from D, such as ranges and compile-time expression evaluation. D now has an ownership/borrowing system inspired by other languages. Right, D doesn't have async/await. That's mainly because nobody has spent the time to figure out how to do it, not because I'm blocking it.
Apr 29 2022
parent reply Paulo Pinto <pjmlp progtools.org> writes:
On Friday, 29 April 2022 at 15:28:16 UTC, Walter Bright wrote:
 On 4/27/2022 8:59 AM, Satoshi wrote:
 Also I don't know why they didn't take the inspiration from 
 other modern languages.
Other languages have taken inspiration from D, such as ranges and compile time expression evaluation. ....
Sorry, Lisp, ML, CLU and Smalltalk did it first; D was surely not the first in this regard. There are plenty of SIGPLAN papers on the subject.
Apr 29 2022
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 4/29/2022 10:00 AM, Paulo Pinto wrote:
 On Friday, 29 April 2022 at 15:28:16 UTC, Walter Bright wrote:
 On 4/27/2022 8:59 AM, Satoshi wrote:
 Also I don't know why they didn't take the inspiration from other modern 
 languages.
Other languages have taken inspiration from D, such as ranges and compile time expression evaluation. ....
Sorry, Lisp, ML, CLU and Smalltalk did it first, D was surely not the first in this regard. Plenty of SIGPLAN papers on the subject.
Those were interpreters first and added native code generation later. D did it the other way around, and the native-code-generating compilers started doing it soon afterwards.
Apr 29 2022
parent reply Paulo Pinto <pjmlp progtools.org> writes:
On Friday, 29 April 2022 at 18:05:42 UTC, Walter Bright wrote:
 On 4/29/2022 10:00 AM, Paulo Pinto wrote:
 On Friday, 29 April 2022 at 15:28:16 UTC, Walter Bright wrote:
 On 4/27/2022 8:59 AM, Satoshi wrote:
 Also I don't know why they didn't take the inspiration from 
 other modern languages.
Other languages have taken inspiration from D, such as ranges and compile time expression evaluation. ....
Sorry, Lisp, ML, CLU and Smalltalk did it first; D was surely not the first in this regard. There are plenty of SIGPLAN papers on the subject.
Those were interpreters first and added native code generation later. D did it the other way around, and the native-code-generating compilers started doing it soon afterwards.
Decades before D was even an idea. Again, SIGPLAN.
Apr 29 2022
next sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 4/29/2022 11:26 AM, Paulo Pinto wrote:
 Those were interpreters first and added native code generation 
 later. D did it the other way around, and the native-code-generating 
 compilers started doing it soon afterwards.
Decades before D was even an idea. Again, SIGPLAN.
So why did other native languages suddenly start doing it after D did, to the point of it being something a language can't skip anymore?
Apr 29 2022
next sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 4/29/2022 12:10 PM, Walter Bright wrote:
 So why did other native languages suddenly start doing it after D did, to the 
 point of it being something a language can't skip anymore?
I've seen endless lists of features people wanted to add to C and C++. None of them included CTFE. When we added it to D, people were excited and surprised.
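(A minimal sketch of the feature under discussion, with made-up names: an ordinary D function runs under CTFE simply because its result is needed as a compile-time constant.)

```d
// CTFE: the same function the compiler would emit for run time is
// instead interpreted during compilation here, because an enum
// initializer must be a compile-time constant.
int factorial(int n)
{
    return n <= 1 ? 1 : n * factorial(n - 1);
}

enum f10 = factorial(10);          // evaluated by the compiler

static assert(f10 == 3_628_800);   // checked before the program ever runs
```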
Apr 29 2022
next sibling parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Friday, 29 April 2022 at 20:17:38 UTC, Walter Bright wrote:
 I've seen endless lists of features people wanted to add to C 
 and C++. None of them included CTFE. When we added it to D, 
 people were excited and surprised.
Not if they had a decent CS background; it is a well-known strategy for speeding up programs. Wikipedia also points out a working C++ prototype from 2003, so I doubt they needed outside influence to move in that direction.
Apr 29 2022
parent reply Tobias Pankrath <tobias pankrath.net> writes:
On Saturday, 30 April 2022 at 06:30:46 UTC, Ola Fosheim Grøstad 
wrote:
 On Friday, 29 April 2022 at 20:17:38 UTC, Walter Bright wrote:
 I've seen endless lists of features people wanted to add to C 
 and C++. None of them included CTFE. When we added it to D, 
 people were excited and surprised.
Not if they had a decent CS background; it is a well-known strategy for speeding up programs. Wikipedia also points out a working C++ prototype from 2003, so I doubt they needed outside influence to move in that direction.
Sometimes a good idea from academia needs a practical example (D in this case) to show its usefulness before it sees widespread adoption. That doesn't mean that it was invented by D, or that there is no prior art.
Apr 30 2022
next sibling parent Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Saturday, 30 April 2022 at 07:07:31 UTC, Tobias Pankrath wrote:
 Sometimes a good idea from academia needs a practical example 
 (D in this case) to show its usefulness before it sees 
 widespread adoption. That doesn't mean that it was invented by 
 D, or that there is no prior art.
Speeding up execution by evaluating functions whose input is known is the first thing you think about when considering optimizations. This practice predates the existence of most CS departments. People even did it without compiler support decades before D came into existence; one common generic strategy was to core dump a program right before it requested user input and then turn the core dump into an executable.

Languages like C were designed for limited semantics and fast compilation, and essentially relied on the programmer for optimization rather than the compiler, because clever compilers were slow and costly to develop. C++ tried to turn C into a proper high level language, but ended up triggering bastardized template programming practices that never should have existed, which in turn triggered what D programmers think of as CTFE. So it is more a result of poor C++ practice than cleverness. You have improved on the terrible C++ template practice, but still don't have a solution for compiler performance or debugging.
Apr 30 2022
prev sibling next sibling parent Paulo Pinto <pjmlp progtools.org> writes:
On Saturday, 30 April 2022 at 07:07:31 UTC, Tobias Pankrath wrote:
 On Saturday, 30 April 2022 at 06:30:46 UTC, Ola Fosheim Grøstad 
 wrote:
 On Friday, 29 April 2022 at 20:17:38 UTC, Walter Bright wrote:
 I've seen endless lists of features people wanted to add to C 
 and C++. None of them included CTFE. When we added it to D, 
 people were excited and surprised.
Not if they had a decent CS background; it is a well-known strategy for speeding up programs. Wikipedia also points out a working C++ prototype from 2003, so I doubt they needed outside influence to move in that direction.
Sometimes a good idea from academia needs a practical example (D in this case) to show its usefulness before it sees widespread adoption. That doesn't mean that it was invented by D, or that there is no prior art.
Those practical examples precede D's existence by decades.
Apr 30 2022
prev sibling parent reply Paulo Pinto <pjmlp progtools.org> writes:
On Saturday, 30 April 2022 at 07:07:31 UTC, Tobias Pankrath wrote:
 On Saturday, 30 April 2022 at 06:30:46 UTC, Ola Fosheim Grøstad 
 wrote:
 On Friday, 29 April 2022 at 20:17:38 UTC, Walter Bright wrote:
 I've seen endless lists of features people wanted to add to C 
 and C++. None of them included CTFE. When we added it to D, 
 people were excited and surprised.
Not if they had a decent CS background; it is a well-known strategy for speeding up programs. Wikipedia also points out a working C++ prototype from 2003, so I doubt they needed outside influence to move in that direction.
Sometimes a good idea from academia needs a practical example (D in this case) to show its usefulness before it sees widespread adoption. That doesn't mean that it was invented by D, or that there is no prior art.
Those practical examples precede D's existence by decades, and what you are mentioning is not what Walter asserts; rather, he asserts that it was D that brought it to the world. As if all those people doing CS research in programming languages needed D's existence to notice what has been known in academia for decades.
Apr 30 2022
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 4/30/2022 1:02 AM, Paulo Pinto wrote:
 As if all those people doing CS research in programming languages needed D's 
 existence to notice what has been known in academia for decades.
Bjarne Stroustrup has a PhD in CS. Why didn't C++ have it? Why does every iteration of C++ make CTFE work more like D's? Why didn't any of the other mainstream natively compiled languages have it? Fortran? Ada? Pascal? Modula-2? (The latter two by CS academic researcher Niklaus Wirth.) CTFE is a *huge* win. Why was this well-known thing languishing in complete obscurity?
Apr 30 2022
parent reply Paulo Pinto <pjmlp progtools.org> writes:
On Saturday, 30 April 2022 at 17:14:24 UTC, Walter Bright wrote:
 On 4/30/2022 1:02 AM, Paulo Pinto wrote:
 As if all those people doing CS research in programming 
 languages needed D's existence to notice what has been known in 
 academia for decades.
Bjarne Stroustrup has a PhD in CS. Why didn't C++ have it? Why does every iteration of C++ make CTFE work more like D's? Why didn't any of the other mainstream natively compiled languages have it? Fortran? Ada? Pascal? Modula-2? (The latter two by CS academic researcher Niklaus Wirth.) CTFE is a *huge* win. Why was this well-known thing languishing in complete obscurity?
C++ doesn't pretend to have invented features that preceded it by decades, other than what they might have taken from D. Other programming languages had other design goals, which is not the same as claiming to have invented something.
May 01 2022
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 5/1/2022 12:37 AM, Paulo Pinto wrote:
 On Saturday, 30 April 2022 at 17:14:24 UTC, Walter Bright wrote:
 On 4/30/2022 1:02 AM, Paulo Pinto wrote:
 As if all those people doing CS research in programming languages needed D's 
 existence to notice what has been known in academia for decades.
Bjarne Stroustrup has a PhD in CS. Why didn't C++ have it? Why does every iteration of C++ make CTFE work more like D's? Why didn't any of the other mainstream natively compiled languages have it? Fortran? Ada? Pascal? Modula-2? (The latter two by CS academic researcher Niklaus Wirth.) CTFE is a *huge* win. Why was this well-known thing languishing in complete obscurity?
C++ doesn't pretend to have invented features that preceded it by decades, other than what they might have taken from D. Other programming languages had other design goals, which is not the same as claiming to have invented something.
That doesn't explain why C and C++ did not implement this well-known feature, nor any of those other major native-compilation languages I mentioned. Instead, C went the route of macros, and C++ the Turing-complete template programming language. I remember the articles saying how great C++ templates were because you could calculate factorials with them at compile time. I don't recall reading in them "geez, why not just interpret a function at compile time?" But if you can find one that did, I'll buy you a beer at DConf, too!
May 01 2022
parent reply Guillaume Piolat <first.last gmail.com> writes:
On Sunday, 1 May 2022 at 08:26:50 UTC, Walter Bright wrote:
 text
I think it is obvious to the casual observer that D had an enormous influence. After D, it became rare to see a new native language without CTFE, fast build times, static if, or unittest blocks.

**My interpretation of CTFE prior art, from a quick research:**

2003: The EDG presentation from 2003 does indeed describe the same thing: http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2003/n1471.pdf
What is transparent is that "metacode" is seen as code separate from normal code, whereas one could say the D innovation is to avoid this distinction, bringing it much closer in spirit to LISP. And indeed, for D programmers, reflecting on the structure of the program progressively becomes second nature after a few years. Possibly core.reflect would enhance this effect. The EDG authors probably knew very well what they were getting into and envisioned all it could do for native programming. However, C++ didn't get constexpr until eight years later, with C++11.

2007: CTFE was in DMD 1.006 (2007). Critically, as there was no committee or other implementers to convince, it could be adopted immediately. One could say D's merit was to have pushed through the implementation, making CTFE so ubiquitous and so easy that stuff that was never written for meta-programming often works on the first try, and pushing other languages to have it. When you are not the first ever to implement something but do it in a way that has better UX, you are doing more to popularize the feature than just inventing it. It is a bit strange the obvious syntax didn't take off likewise, but maybe we can attribute that to the "new thing => loud syntax" bias of programmers. Also, D tends to implement CTFE more completely: https://nim-lang.org/docs/manual.html#restrictions-on-compileminustime-execution
With objects, reference types, pointers, floats, exp/pow/log... (Bonus: the point of view of the Nim designer: https://www.mail-archive.com/digitalmars-d puremagic.com/msg88688.html Which of course could be possible, as some ideas "float in the air" to be independently (re)discovered.)

**static if:**

The sanctioned way before static if / template `if` constraints was "policy-based design" and the techniques popularized by Alexandrescu in Modern C++ Design (2001). From then on, "meta-programming" in C++ primarily looked like "traits" + template specializations + type lists (AliasSeq, but recursive) + SFINAE, with enormous damage done to C++ build times across the world. Templates are much slower than CTFE to compile. Such a style is very much non-LISPy, needing extra data and headaches instead of code as data. D codebases have relatively few external code generators, but these are common in C++ contexts. All in all, meta-programming in C++ used to be a very different beast; in D it doesn't require expert knowledge, making it quite a cultural change vs C++. (See the sketch below.)

**slices:**

Go and Rust had them from the start, Nimrod got them, etc. I unfortunately lack the time to do complete research about prior art, because it seems surprising to me that no other native language had them before D. I have a strong feeling that, like other successful features, the experience of D strongly influenced other designs.

In sharp contrast, there are less-impressive ideas that, like it or not, were left behind:
- pure
- TLS by default
- shared
- transitive immutability
- insert features you hate here
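(A small sketch of that static if style, with a made-up function, contrasted with the specialization/SFINAE approach described above:)

```d
// One template, with compile-time branching on the type, where
// C++98-era code would use specializations and SFINAE.
T twice(T)(T x)
{
    static if (is(T : long))        // integral types
        return cast(T)(x * 2);
    else static if (is(T : real))   // floating-point types
        return x * 2;
    else
        static assert(0, T.stringof ~ " is not supported");
}

static assert(twice(21) == 42);     // and it works under CTFE, too
```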
May 01 2022
next sibling parent reply zjh <fqbqrr 163.com> writes:
On Sunday, 1 May 2022 at 13:35:46 UTC, Guillaume Piolat wrote:

Rust is copying `d`.
`c++` is copying `d`.

`D` is very good.
May 01 2022
parent reply zjh <fqbqrr 163.com> writes:
On Sunday, 1 May 2022 at 13:52:56 UTC, zjh wrote:

And `d` is very `creative` and `elegant`.
I believe: yes, D can!
May 01 2022
parent zjh <fqbqrr 163.com> writes:
On Sunday, 1 May 2022 at 14:04:04 UTC, zjh wrote:

 And `d` is very `creative` and `elegant`.
 I believe: yes, D can!
I like and respect those who keep the `original spirit` of the `language`. Those `plagiarists`, they are `born ugly`.
May 01 2022
prev sibling next sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 5/1/2022 6:35 AM, Guillaume Piolat wrote:
 **slices**
     Go and Rust had them from the start, Nimrod got them, etc.
     I unfortunately lack the time to do complete research about 
 prior art, because it seems surprising to me that no other 
 native language had them before D. I have a strong feeling 
 that, like other successful features, the experience of D 
 strongly influenced other designs.
D slices date back to 2001 or so. I don't know of any language before that that had such a thing as part of the language. Of course, people would do it in ad-hoc ways.
 In sharp contrast, there are less-impressive ideas that, like 
 it or not, were left behind:
 - pure
 - TLS by default
 - shared
 - transitive immutability
 - insert features you hate here
I think constexpr in C++ implies pure. Rust kinda sorta has transitivity; it's inherent to its borrowing system.
May 01 2022
parent Patrick Schluter <Patrick.Schluter bbox.fr> writes:
On Sunday, 1 May 2022 at 20:31:14 UTC, Walter Bright wrote:
 On 5/1/2022 6:35 AM, Guillaume Piolat wrote:
 **slices**
     Go and Rust had them from the start, Nimrod got them, etc.
     I unfortunately lack the time to do complete research 
 about prior art, because it seems surprising to me that no other 
 native language had them before D. I have a strong feeling that, 
 like other successful features, the experience of D strongly 
 influenced other designs.
D slices date back to 2001 or so. I don't know of any language before that that had such a thing as part of the language. Of course, people would do it in ad-hoc ways.
As mentioned in my other comment, Sinclair BASIC had something very close to them (and I think HP had the same thing in their BASIC interpreters).
May 02 2022
prev sibling parent reply Patrick Schluter <Patrick.Schluter bbox.fr> writes:
On Sunday, 1 May 2022 at 13:35:46 UTC, Guillaume Piolat wrote:

[..]
 **slices**
    Go and Rust had them from the start, Nimrod got them, etc.
    I unfortunately lack the time to do complete research 
 about prior art, because it seems surprising to me that no other 
 native language had them before D. I have a strong feeling that, 
 like other successful features, the experience of D strongly 
 influenced other designs.
I discovered recently that one of the first languages to have slices was BASIC. Not the Microsoft-derived BASICs, but the lowly Sinclair BASIC had something conceptually very close to slices. From Wikipedia:

````
Unlike the LEFT$(), MID$() and RIGHT$() functions used in the ubiquitous Microsoft BASIC dialects for home computers, parts of strings in Sinclair BASIC are accessed by numeric range. For example, a$(5 TO 10) gives a substring starting with the 5th and ending with the 10th character of the variable a$. Thus, it is possible to replace the LEFT$() and RIGHT$() commands by simply omitting the left or right array position respectively; for example a$( TO 5) is equivalent to LEFT$(a$,5). Further, a$(5) alone is enough to replace MID$(a$,5,1).
````

And what's not mentioned in the Wikipedia article is that the slicing also worked as an lvalue: `a$(5 TO 7)="12"` was possible.
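(For comparison, a sketch of the same operations with D's built-in slices; note that D indexing is 0-based, ranges are end-exclusive, and slice assignment requires matching lengths.)

```d
import std.stdio;

void main()
{
    char[] a = "Hello, world!".dup;

    // a$(5 TO 10) in Sinclair BASIC: the 5th through 10th characters
    // (1-based, inclusive). In D that window is a[4 .. 10].
    writeln(a[4 .. 10]);   // "o, wor"

    // The slice works as an lvalue here too, like a$(5 TO 6)="12":
    a[4 .. 6] = "12";
    writeln(a);            // "Hell12 world!"
}
```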
May 02 2022
next sibling parent Paolo Invernizzi <paolo.invernizzi gmail.com> writes:
On Monday, 2 May 2022 at 07:04:31 UTC, Patrick Schluter wrote:
 On Sunday, 1 May 2022 at 13:35:46 UTC, Guillaume Piolat wrote:

 [..]
 [...]
I discovered recently that one of the first languages to have slices was BASIC. Not the Microsoft-derived BASICs, but the lowly Sinclair BASIC had something conceptually very close to slices. [...]
OMG, that's true! My first programming tool was the Spectrum. Thank you Patrick, you made my day! :-P
May 02 2022
prev sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 5/2/2022 12:04 AM, Patrick Schluter wrote:
 From Wikipedia:
 ````
 Unlike the LEFT$(), MID$() and RIGHT$() functions used in the ubiquitous 
 Microsoft BASIC dialects for home computers, parts of strings in Sinclair BASIC 
 are accessed by numeric range. For example, a$(5 TO 10) gives a substring 
 starting with the 5th and ending with the 10th character of the variable a$. 
 Thus, it is possible to replace the LEFT$() and RIGHT$() commands by simply 
 omitting the left or right array position respectively; for example a$( TO 5) 
 is equivalent to LEFT$(a$,5). Further, a$(5) alone is enough to replace 
 MID$(a$,5,1).
 ````
 And what's not mentioned in the Wikipedia article is that the slicing also 
 worked as an lvalue: `a$(5 TO 7)="12"` was possible.
Nice. Slices are a huge deal.
May 02 2022
parent Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Monday, 2 May 2022 at 21:31:28 UTC, Walter Bright wrote:
 Nice. Slices are a huge deal.
Why? Old languages like Simula have it for strings (aka substrings), to save space or maybe reduce the need for GC by having strings ref counted. For other containers you might as well use a library type. I never thought of it as something special, and I have written a lot of Python code. What would be more impactful is to have one-line generators like Python's: give D ranges some dedicated syntax. In general: improve on the features existing users think are language defining.
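(To make the gap concrete: Python's one-liner `(x*x for x in range(10))` has no dedicated D syntax; the closest current idiom goes through the range API, as in this sketch:)

```d
import std.algorithm : map;
import std.range : iota;
import std.stdio : writeln;

void main()
{
    // Lazy, like a Python generator: values are computed on iteration.
    auto squares = iota(10).map!(x => x * x);
    writeln(squares);   // [0, 1, 4, 9, 16, 25, 36, 49, 64, 81]
}
```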
May 02 2022
prev sibling next sibling parent reply Paulo Pinto <pjmlp progtools.org> writes:
On Friday, 29 April 2022 at 20:17:38 UTC, Walter Bright wrote:
 On 4/29/2022 12:10 PM, Walter Bright wrote:
 So why did other native languages suddenly start doing it 
 after D did, to the point of it being something a language 
 can't skip anymore?
I've seen endless lists of features people wanted to add to C and C++. None of them included CTFE. When we added it to D, people were excited and surprised.
Only those lacking a sound CS background in language research.
Apr 30 2022
parent Walter Bright <newshound2 digitalmars.com> writes:
On 4/30/2022 12:07 AM, Paulo Pinto wrote:
 On Friday, 29 April 2022 at 20:17:38 UTC, Walter Bright wrote:
 I've seen endless lists of features people wanted to add to C and C++. None of 
 them included CTFE. When we added it to D, people were excited and surprised.
Only those lacking a sound CS background in language research.
Bjarne Stroustrup and Niklaus Wirth are PhD CS researchers. The C++ illuminati is full of CS academics. Why did zero of them propose it for C++? Instead, we got all this excitement over the *discovery* that one could execute programs at compile time with C++ templates. A discovery made by academics. Why did none of them say "why don't we just interpret the function instead of this absurd template metaprogramming technique?"
Apr 30 2022
prev sibling parent reply Max Samukha <maxsamukha gmail.com> writes:
On Friday, 29 April 2022 at 20:17:38 UTC, Walter Bright wrote:
 On 4/29/2022 12:10 PM, Walter Bright wrote:
 So why did other native languages suddenly start doing it 
 after D did, to the point of it being something a language 
 can't skip anymore?
I've seen endless lists of features people wanted to add to C and C++. None of them included CTFE. When we added it to D, people were excited and surprised.
Your lists are not representative. When D added it, our reaction was more like "finally, somebody did that!". And even today, the feature is only marginally useful because of the countless forward reference bugs. I recently filed one more (https://issues.dlang.org/show_bug.cgi?id=22981), which is not a CTFE bug per se but was encountered in another futile attempt to generate code with CTFE in a reasonable manner.
Apr 30 2022
next sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 4/30/2022 12:35 AM, Max Samukha wrote:
 On Friday, 29 April 2022 at 20:17:38 UTC, Walter Bright wrote:
 On 4/29/2022 12:10 PM, Walter Bright wrote:
 So why did other native languages suddenly start doing it after D did, to the 
 point of it being something a language can't skip anymore?
I've seen endless lists of features people wanted to add to C and C++. None of them included CTFE. When we added it to D, people were excited and surprised.
Your lists are not representative. When D added it, our reaction was more like "finally, somebody did that!".
I'm open to a reference to one that does have it, that predates D's CTFE.
 And even today, the feature is only marginally useful because 
 of the countless forward reference bugs. I recently filed one 
 more (https://issues.dlang.org/show_bug.cgi?id=22981), which is 
 not a CTFE bug per se but was encountered in another futile 
 attempt to generate code with CTFE in a reasonable manner.
I'm sorry about the problems you encountered, but as you say they are forward reference issues, not about CTFE. You couldn't get that code to work in C++, either, because C++ does not allow forward references at all. Thanks for submitting a well-done bug report on it.
Apr 30 2022
parent reply Max Samukha <maxsamukha gmail.com> writes:
On Saturday, 30 April 2022 at 17:25:27 UTC, Walter Bright wrote:

 I'm open to a reference to one that does have it, that predates 
 D's CTFE.
I think Paulo Pinto has given you enough hard evidence. However, I was responding to the other of your claims, which is that nobody asked for CTFE. I doubt it, because the factorial implemented with templates is the first thing a new-born C++ programmer sees, and her first words are "Template metaprogramming is an abomination. Why can't we just evaluate functions at compile time?".
 And even today, the feature is only marginally useful because 
 of the countless forward reference bugs. I recently filed one 
 more (https://issues.dlang.org/show_bug.cgi?id=22981), which 
 is not a CTFE bug per se but was encountered in another futile 
 attempt to generate code with CTFE in a reasonable manner.
I'm sorry about the problems you encountered, but as you say they are forward reference issues, not about CTFE.
I also said that CTFE is not so useful in the presence of forward reference bugs.
 You couldn't get that code to work in C++, either, because C++ 
 does not allow forward references at all.
Yes, that is one of the reasons why we want to use D and not C++. When it doesn't work, the reason goes away.
 Thanks for submitting a well-done bug report on it.
D will prevail!
May 01 2022
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 5/1/2022 12:13 AM, Max Samukha wrote:
 On Saturday, 30 April 2022 at 17:25:27 UTC, Walter Bright wrote:
 
 I'm open to a reference to one that does have it, that predates D's CTFE.
I think Paulo Pinto has given you enough hard evidence.
Of interpreters that later generated native code. Not the other way around.
 However, I was responding to the other of your claims, which is 
 that nobody asked for CTFE. I doubt it, because the factorial 
 implemented with templates is the first thing a new-born C++ 
 programmer sees, and her first words are "Template 
 metaprogramming is an abomination. Why can't we just evaluate 
 functions at compile time?".
You would think they would have, as many said it was an abomination. But they never took that next step. There were many, many articles about using C++ templates as a Turing-complete programming language. Find one that said "why can't we ...". Find one and I'll buy you a beer at DConf!
May 01 2022
parent reply Max Samukha <maxsamukha gmail.com> writes:
On Sunday, 1 May 2022 at 08:14:34 UTC, Walter Bright wrote:

 Of interpreters that later generated native code. Not the other 
 way around.
I don't quite understand why you insist on native code. Do compilers to bytecode count? There used to be a language called Nemerle (https://en.wikipedia.org/wiki/Nemerle), which had been mentioned on these forums many times, long before D got CTFE. The earliest mention I found is https://forum.dlang.org/post/ca56h1$2k4h$1 digitaldaemon.com. It had very powerful macros, which could be used as compile time functions, AST macros, and whatnot. The language is now dead because it was too good for humans.
 However, I was responding to the other of your claims, which is 
 that nobody asked for CTFE. I doubt it, because the factorial 
 implemented with templates is the first thing a new-born C++ 
 programmer sees, and her first words are "Template 
 metaprogramming is an abomination. Why can't we just evaluate 
 functions at compile time?".
You would think they would have, as many said it was an abomination. But they never took that next step. There were many, many articles about using C++ templates as a Turing-complete programming language. Find one that said "why can't we ...". Find one and I'll buy you a beer at DConf!
I think Nemerle deserves a case of beer.
May 01 2022
next sibling parent reply Bruce Carneal <bcarneal gmail.com> writes:
On Sunday, 1 May 2022 at 20:39:36 UTC, Max Samukha wrote:
 On Sunday, 1 May 2022 at 08:14:34 UTC, Walter Bright wrote:

 Of interpreters that later generated native code. Not the 
 other way around.
I don't quite understand why you insist on native code. Do compilers to bytecode count? There used to be a language called Nemerle (https://en.wikipedia.org/wiki/Nemerle), which had been mentioned on these forums many times, long before D got CTFE. The earliest mention I found is https://forum.dlang.org/post/ca56h1$2k4h$1 digitaldaemon.com. It had very powerful macros, which could be used as compile time functions, AST macros, and whatnot. The language is now dead because it was too good for humans.
...
 I think Nemerle deserves a case of beer.
Does writing a compile time function require any new knowledge/skill or is it like writing a runtime function? Accurately answering "they're like any other function, use functions in either context and you'll be fine" means you've got something immediately useful to newcomers, an ultra low friction path to more power. Answering "no, but we have super duper xyz which is every bit as powerful theoretically and should probably be preferred because it's hard for people to understand and qualifies you for your programming wizard merit badge", means you, as a language designer, did not understand what you could have had. Unless I'm missing something big from the Nemerle wiki page those language designers did not understand what they could have had. I'm happy to give credit where it is due but I'd advise hanging on to that beer in this case. :-)
May 01 2022
parent reply Max Samukha <maxsamukha gmail.com> writes:
On Monday, 2 May 2022 at 00:24:24 UTC, Bruce Carneal wrote:
 Does writing a compile time function require any new 
 knowledge/skill or is it like writing a runtime function?  
 Accurately answering "they're like any other function, use 
 functions in either context and you'll be fine" means you've 
 got something immediately useful to newcomers, an ultra low 
 friction path to more power.
It does require new knowledge: you have to stick "macro" on the function declaration. In D you don't need to do that, because the grammatical context the function is used in determines whether the function will be executed at compile time.
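(A sketch of that point, with a made-up function: the same unannotated code is interpreted at compile time or compiled for run time depending only on where the call appears.)

```d
int sq(int n) { return n * n; }   // no 'macro' or any other annotation

enum atCompileTime = sq(7);       // enum initializer: forces CTFE

void main()
{
    int x = 7;
    auto atRunTime = sq(x);       // ordinary run-time call
    assert(atRunTime == atCompileTime);
}
```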
 Answering "no, but we have super duper xyz which is every bit 
 as powerful theoretically and should probably be preferred 
 because it's hard for people to understand and qualifies you 
 for your programming wizard merit badge", means you, as a 
 language designer, did not understand what you could have had.

 Unless I'm missing something big from the Nemerle wiki page 
 those language designers did not understand what they could 
 have had.
There is nothing big about CTFE. )
 I'm happy to give credit where it is due but I'd advise hanging 
 on to that beer in this case. :-)
I need my beer badly right now!
May 02 2022
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 5/2/2022 2:57 AM, Max Samukha wrote:
 There is nothing big about CTFE. )
It completely transformed how D code is written. Check out this, for example: https://github.com/dlang/dmd/blob/master/src/dmd/backend/oper.d#L388 DMD used to do this initialization as a separate program, which did what the lambda did here, and wrote out a file which was then compiled in as part of DMD. The Digital Mars C++ did the same thing for its tables.
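The pattern looks roughly like this (a simplified sketch, not the actual dmd table): a module-level immutable initialized by an immediately-invoked lambda must be evaluated at compile time, which replaces the old generate-a-file-and-compile-it-in step.

    // Build a lookup table at compile time instead of with a separate generator program
    immutable bool[256] isVowel = () {
        bool[256] tab;
        foreach (c; "aeiouAEIOU")
            tab[c] = true;
        return tab;
    }();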
May 02 2022
parent reply Max Samukha <maxsamukha gmail.com> writes:
On Monday, 2 May 2022 at 20:30:36 UTC, Walter Bright wrote:
 On 5/2/2022 2:57 AM, Max Samukha wrote:
 There is nothing big about CTFE. )
It completely transformed how D code is written. Check out this, for example: https://github.com/dlang/dmd/blob/master/src/dmd/backend/oper.d#L388 DMD used to do this initialization as a separate program, which did what the lambda did here, and wrote out a file which was then compiled in as part of DMD. The Digital Mars C++ did the same thing for its tables.
I've been using D in-and-out since around 2006 and might be aware of every existing use case of CTFE. Lambdas work well for simple cases like the one you mentioned, but not so well for more involved ones:

    alias format(A...) = (string format, A args...)
    {
        string[] r;
        r ~= "bar";
        return r;
    };

    enum s = format("...", 1, 2);

1. Can't make it a normal function, because it needs to be usable with -betterC, and ~= prevents that.
3. Can't make it a lambda, because there's no way to express variadics for lambdas without loosing IFTI.
May 03 2022
next sibling parent Max Samukha <maxsamukha gmail.com> writes:
On Tuesday, 3 May 2022 at 07:29:24 UTC, Max Samukha wrote:

 alias format(A...) = (string format, A args...)
alias format(A...) = (string format, A args)
May 03 2022
prev sibling next sibling parent reply test123 <test123 gmail.com> writes:
On Tuesday, 3 May 2022 at 07:29:24 UTC, Max Samukha wrote:
 On Monday, 2 May 2022 at 20:30:36 UTC, Walter Bright wrote:
 [...]
I've been using D in-and-out since around 2006 and might be aware of every existing use case of CTFE. Lambdas work well for simple cases like the one you mentioned, but not so well for more involved ones:

    alias format(A...) = (string format, A args...)
    {
        string[] r;
        r ~= "bar";
        return r;
    };

    enum s = format("...", 1, 2);

1. Can't make it a normal function, because it needs to be usable with -betterC, and ~= prevents that.
3. Can't make it a lambda, because there's no way to express variadics for lambdas without loosing IFTI.
You can use it with betterC if you put the format into a separate di file.
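A sketch of that suggestion (file and function names are illustrative, and it assumes the function is only ever called in a CTFE context): the body lives in a .di file, so it is visible for compile-time evaluation but no object code is generated for it, and -betterC never has to compile the ~=.

    // fmt.di - the body is available to CTFE when imported,
    // but is not code-generated by the importing compilation
    string[] makeBar(string fmt)
    {
        string[] r;
        r ~= "bar"; // fine inside CTFE; never reaches -betterC codegen
        return r;
    }

    // app.d - compile with: dmd -betterC app.d
    import fmt;
    enum s = makeBar("..."); // evaluated entirely at compile time
    extern(C) void main() {}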
May 03 2022
parent Max Samukha <maxsamukha gmail.com> writes:
On Tuesday, 3 May 2022 at 07:35:08 UTC, test123 wrote:

 You can use it with betterC if you put the format into a 
 separate di file.
That might work, thank you!
May 03 2022
prev sibling parent Max Samukha <maxsamukha gmail.com> writes:
On Tuesday, 3 May 2022 at 07:29:24 UTC, Max Samukha wrote:

 1. Can't make it a normal function, because it needs to be 
 usable with -betterC, and ~= prevents that.
 3. Can't make it a lambda, because there's no way to express 
 variadics for lambdas without loosing IFTI.
*losing 3 -> 2 (CTFE is big. Editable forum posts would be huge.)
May 03 2022
prev sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 5/1/2022 1:39 PM, Max Samukha wrote:
 I don't quite understand why you insist on native code. Do compilers to
bytecode 
 count? There used to be a language called Nemerle 
 (https://en.wikipedia.org/wiki/Nemerle), which had been mentioned on these 
 forums many times, long before D got CTFE. The earliest mention I found is 
 https://forum.dlang.org/post/ca56h1$2k4h$1 digitaldaemon.com. It had very 
 powerful macros, which could be used as compile time functions, AST macros,
and 
 whatnot. The language is now dead because it was too good for humans.
A language designed for interpretation does not distinguish between compile time and run time. While the program is executing, it can generate more code, which the interpreter executes. If the language is successful, it'll often get a native code generator to speed it up. The compiler is part of the runtime of the language.

A language designed for native compilation draws a hard distinction between compile time and run time. You'll see this in the grammar for the language, in the form of a constant-expression for compile time, and just expression for run time. The constant-expression does constant folding at compile time. The runtime does not include a compiler.

D is the first language I know of, designed for native compilation, with constant-expressions, that extended the evaluation of constant-expressions with the ability to execute functions at compile time. There's no compiler in the runtime. D does not support compiling code at runtime.

To clarify, if the runtime of a language includes a compiler, it is in the interpreted class of languages (even if they jit to native code), and it is not an example of what D's CTFE is; it does not fall into this category.

To show D is not the first, I request an example of a language designed for native compilation, that does not include a compiler in the runtime, that has constant-expressions in the grammar that must be computed at compile time and can execute ordinary functions at compile time. C++ came closest to the mark with its Turing-complete templates.
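A minimal D illustration of such a constant-expression context (the function is illustrative): the grammar demands a compile-time constant for an array dimension, so the compiler must execute the ordinary function right there.

    int fib(int n) { return n < 2 ? n : fib(n - 1) + fib(n - 2); }

    int[fib(10)] buffer; // array dimension is a constant-expression: fib runs at compile time
    static assert(buffer.length == 55);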
May 01 2022
next sibling parent reply FeepingCreature <feepingcreature gmail.com> writes:
On Monday, 2 May 2022 at 01:42:19 UTC, Walter Bright wrote:
 On 5/1/2022 1:39 PM, Max Samukha wrote:
 I don't quite understand why you insist on native code. Do 
 compilers to bytecode count? There used to be a language 
 called Nemerle (https://en.wikipedia.org/wiki/Nemerle), which 
 had been mentioned on these forums many times, long before D 
 got CTFE. The earliest mention I found is 
 https://forum.dlang.org/post/ca56h1$2k4h$1 digitaldaemon.com. 
 It had very powerful macros, which could be used as compile 
 time functions, AST macros, and whatnot. The language is now 
 dead because it was too good for humans.
A language designed for interpretation does not distinguish between compile time and run time. While the program is executing, it can generate more code, which the interpreter executes. If the language is successful, it'll often get a native code generator to speed it up. The compiler is part of the runtime of the language.

A language designed for native compilation draws a hard distinction between compile time and run time. You'll see this in the grammar for the language, in the form of a constant-expression for compile time, and just expression for run time. The constant-expression does constant folding at compile time. The runtime does not include a compiler.

D is the first language I know of, designed for native compilation, with constant-expressions, that extended the evaluation of constant-expressions with the ability to execute functions at compile time. There's no compiler in the runtime. D does not support compiling code at runtime.

To clarify, if the runtime of a language includes a compiler, it is in the interpreted class of languages (even if they jit to native code), and it is not an example of what D's CTFE is; it does not fall into this category.

To show D is not the first, I request an example of a language designed for native compilation, that does not include a compiler in the runtime, that has constant-expressions in the grammar that must be computed at compile time and can execute ordinary functions at compile time. C++ came closest to the mark with its Turing-complete templates.
This is an odd division to me. The way I do it in my lang is to just treat compiler runtime as a separate compilation target. Interpretation, code generation, it's all a backend concern. But that still gives me the sort of "run program code at compiletime" capability that I think Nemerle aims at (though I've never used it), without any interpretation, just by targeting parts of the program at a backend that can be loaded back during the compilation run.

And I think that's fundamentally a cleaner model than CTFE, because it doesn't rely on, in effect, embedding an entirely separate codepath for constant folding that reimplements deep parts of the compiler; instead, there is only one path through the compiler, and the split happens cleanly at the back-end... and one backend just happens to be invoked by the compiler itself during compilation to get a function pointer to directly call.

One problem of doing it this way is that it makes `static if` very awkward. You're trying to evaluate an expression, but you want to access local "compiletime variables" - so you need to compile the expression being tested in an odd context where it lives in a function, "but not really" - any access to local variables triggers a special error, because the expression is really semantically at something like toplevel, it just syntactically lives inside a function. That's why for now, I just use constant folding like D for `static if`. (Also, I'm dog slow and I'm trying to limit the amount of macro compilation roundtrips. Gonna switch to an *interpreter* backend some time - for a *speedup.* LLVM makes good code, but its performance is painful.)

But I still think this is fundamentally the right way to think about CTFE. The compiler runtime is just a backend target, and because the compiler is a library, the macro can just recurse back into the compiler for parsing and helper functions. It's elegant, it gets native performance and complete language support "for free", and most importantly, it did not require much effort to implement.
May 01 2022
next sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 5/1/2022 11:15 PM, FeepingCreature wrote:
 [...]
It's a cool way to make it work. But I don't think it changes the nature of what I was talking about.
May 02 2022
parent reply FeepingCreature <feepingcreature gmail.com> writes:
On Monday, 2 May 2022 at 07:05:00 UTC, Walter Bright wrote:
 On 5/1/2022 11:15 PM, FeepingCreature wrote:
 [...]
It's a cool way to make it work. But I don't think it changes the nature of what I was talking about.
That's fair, I'm kind of jumping in on the tail end of the discussion, so I'm probably missing a lot of context. I guess I just wanted to highlight that having compiletime in-language macros doesn't commit you to a compilation model that weakens the compiletime/runtime boundary. Anyway, I just have a hard time seeing how the CLR target relates to this. Just because Nemerle targeted the CLR doesn't make it an interpreted language with regard to CTFE, because targeting the CLR doesn't actually buy you any simplicity here. You can compile to code that is then loaded back into the running context just as easily on x86 as on the CLR. For the purpose of compiler design, the CLR is just a processor, no?
May 02 2022
parent Walter Bright <newshound2 digitalmars.com> writes:
On 5/2/2022 12:10 AM, FeepingCreature wrote:
 That's fair, I'm kind of jumping in on the tail end of the discussion. So I'm 
 probably missing a lot of context. I guess I just wanted to highlight that 
 having compiletime in-language macros doesn't commit you to a compilation
model 
 that weakens the compiletime/runtime boundary. Anyway, I just have a hard time 
 of seeing how the CLR target relates to this. Just because Nemerle targeted
the 
 CLR doesn't make it an interpreted language with regard to CTFE, because 
 targeting the CLR doesn't actually buy you any simplicity in this. You can 
 compile to code that is then loaded back into the running context just as
easily 
 on x86 as on CLR. For the purpose of compiler design, the CLR is just a 
 processor, no?
Look at it this way. The runtime of Nemerle includes a compiler. This is quite different from CTFE, which does not rely on a compiler in the runtime.
May 02 2022
prev sibling parent Max Samukha <maxsamukha gmail.com> writes:
On Monday, 2 May 2022 at 06:15:32 UTC, FeepingCreature wrote:

 But I still think this is fundamentally the right way to think 
 about CTFE. The compiler runtime is just a backend target, and 
 because the compiler is a library, the macro can just recurse 
 back into the compiler for parsing and helper functions. It's 
 elegant, it gets native performance and complete language 
 support "for free", and most importantly, it did not require 
 much effort to implement.
Yay!
May 02 2022
prev sibling parent reply Max Samukha <maxsamukha gmail.com> writes:
On Monday, 2 May 2022 at 01:42:19 UTC, Walter Bright wrote:

 A language designed for native compilation draws a hard 
 distinction between compile time and run time. You'll see this 
 in the grammar for the language, in the form of a 
 constant-expression for compile time, and just expression for 
 run time. The constant-expression does constant folding at 
 compile time. The runtime does not include a compiler.
Nope, Nemerle doesn't require a compiler runtime at runtime (however, you can include it if you need to). The Nemerle compiler compiles the const-expressions into a dll (yes, the target is bytecode, but it could be native code - it doesn't matter) and then loads the compiled code back and executes it *at compile time*. It could as well do interpretation the way D does. Both approaches have their pros and cons, but they do fundamentally the same thing.
May 02 2022
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 5/2/2022 2:10 AM, Max Samukha wrote:
 On Monday, 2 May 2022 at 01:42:19 UTC, Walter Bright wrote:
 
 A language designed for native compilation draws a hard distinction between 
 compile time and run time. You'll see this in the grammar for the language, in 
 the form of a constant-expression for compile time, and just expression for 
 run time. The constant-expression does constant folding at compile time. The 
 runtime does not include a compiler.
Nope, Nemerle doesn't require a compiler runtime at runtime (however, you can include it if you need to). The Nemerle compiler compiles the const-expressions into a dll (yes, the target is bytecode, but it could be native code - it doesn't matter) and then loads the compiled code back and executes it *at compile time*. It could as well do interpretation the way D does. Both approaches have their pros and cons, but they do fundamentally the same thing.
May 02 2022
parent reply Max Samukha <maxsamukha gmail.com> writes:
On Monday, 2 May 2022 at 20:24:29 UTC, Walter Bright wrote:



C# can't generate code (without resorting to hacks) at compile time based on UDAs the way you can in Nemerle or D. In C# you usually process UDAs at runtime. I guess that is what you mean when you say "it must defer code generation to runtime".
May 03 2022
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 5/3/2022 12:34 AM, Max Samukha wrote:
 On Monday, 2 May 2022 at 20:24:29 UTC, Walter Bright wrote:
 


C# can't generate code (without resorting to hacks) at compile time based on UDAs the way you can in Nemerle or D. In C# you usually process UDAs at runtime. I guess that is what you mean when you say "it must defer code generation to runtime".
Looks like they never thought of it :-) Java can create and compile code at runtime. I ran into this when creating a Java native compiler for Symantec. It was used very rarely, but just enough to sink the notion of a native compiler.
May 03 2022
next sibling parent reply 12345swordy <alexanderheistermann gmail.com> writes:
On Tuesday, 3 May 2022 at 19:01:44 UTC, Walter Bright wrote:
 On 5/3/2022 12:34 AM, Max Samukha wrote:
 On Monday, 2 May 2022 at 20:24:29 UTC, Walter Bright wrote:
 


C# can't generate code (without resorting to hacks) at compile time based on UDAs the way you can in Nemerle or D. In C# you usually process UDAs at runtime. I guess that is what you mean when you say "it must defer code generation to runtime".
Looks like they never thought of it :-) Java can create and compile code at runtime. I ran into this when creating a Java native compiler for Symantec. It was used very rarely, but just enough to sink the notion of a native compiler.
Actually: https://github.com/dotnet/csharplang/discussions/2379 - Alex
May 03 2022
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 5/3/2022 4:50 PM, 12345swordy wrote:
 Actually:
 https://github.com/dotnet/csharplang/discussions/2379
That's dated 2019, 12 years after D acquired Compile Time Function Execution
May 03 2022
parent reply =?UTF-8?Q?Ali_=c3=87ehreli?= <acehreli yahoo.com> writes:
On 5/3/22 18:03, Walter Bright wrote:
 On 5/3/2022 4:50 PM, 12345swordy wrote:
 Actually:
 https://github.com/dotnet/csharplang/discussions/2379
That's dated 2019, 12 years after D acquired Compile Time Function Execution
They acknowledge: "This feature would be pretty similar to other languages supporting CTFE, like D or C++11 with constexpr." Ali
May 03 2022
next sibling parent Tejas <notrealemail gmail.com> writes:
On Wednesday, 4 May 2022 at 01:54:30 UTC, Ali Çehreli wrote:
 On 5/3/22 18:03, Walter Bright wrote:
 On 5/3/2022 4:50 PM, 12345swordy wrote:
 Actually:
 https://github.com/dotnet/csharplang/discussions/2379
That's dated 2019, 12 years after D acquired Compile Time Function Execution
They acknowledge: "This feature would be pretty similar to other languages supporting CTFE, like D or C++11 with constexpr." Ali
Looks like they're not planning on adding it at a compiler level, rather depending on the optimisations made at the Intermediate Language level to do it
May 03 2022
prev sibling parent Walter Bright <newshound2 digitalmars.com> writes:
On 5/3/2022 6:54 PM, Ali Çehreli wrote:
 They acknowledge: "This feature would be pretty similar to other languages 
 supporting CTFE, like D or C++11 with constexpr."
Nice! I missed that. Thanks for pointing it out.
May 03 2022
prev sibling next sibling parent reply Max Samukha <maxsamukha gmail.com> writes:
On Tuesday, 3 May 2022 at 19:01:44 UTC, Walter Bright wrote:


 Looks like they never thought of it :-)
Nemerle creators did (obviously inspired by LISPs and MLs).
 Java can create and compile code at runtime. I ran into this 
 when creating a Java native compiler for Symantec. It was used 
 very rarely, but just enough to sink the notion of a native 
 compiler.
The important part is that Nemerle can execute functions at compile time - whether it's done via interpretation or compilation is not relevant to the argument. D could as well compile CTFE into native code or IL (as in newCTFE) from the start. Also, one could argue that native code is not native code, because it is further translated by the CPU into microcode, hence the CPU is an interpreter, hence any native compiler is not native. We'd rather accept that IL is native code to the VM and move on.
May 03 2022
next sibling parent reply max haughton <maxhaton gmail.com> writes:
On Wednesday, 4 May 2022 at 05:54:42 UTC, Max Samukha wrote:
 On Tuesday, 3 May 2022 at 19:01:44 UTC, Walter Bright wrote:


 thought of it :-)
Nemerle creators did (obviously inspired by LISPs and MLs).
 Java can create and compile code at runtime. I ran into this 
 when creating a Java native compiler for Symantec. It was used 
 very rarely, but just enough to sink the notion of a native 
 compiler.
The important part is that Nemerle can execute functions at compile time - whether it's done via interpretation or compilation is not relevant to the argument. D could as well compile CTFE into native code or IL (as in newCTFE) from the start. Also, one could argue that native code is not native code, because it is further translated by the CPU into microcode, hence the CPU is an interpreter, hence any native compiler is not native. We'd rather accept that IL is native code to the VM and move on.
Native code is indeed not *exactly* native code, however calling a CPU an interpreter is either false or requires such a loose definition of interpreter that it loses most of its descriptive power. Basically all CPUs translate an ISA into some kind of internal state; big processors just happen to have an extra layer.

Also, I suppose this is mostly nomenclature: the instructions are translated into micro-operations/uops, whereas "microcode" as a term is usually reserved for either extremely complex instructions or configuring CPU features. Otherwise you have the term microcode referring to the same thing it always has, versus an innovation Intel made with the early Pentiums (pentia?).
May 03 2022
parent Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Wednesday, 4 May 2022 at 06:29:55 UTC, max haughton wrote:
 Native code is indeed not *exactly* native code however calling 
 a CPU an interpreter is either false or requires such a loose 
 definition of interpreter that it loses most of its descriptive 
 power.
In this context there is no difference between a VM and hardware. If you can build hardware for the VM they should be considered similar.

 Basically all CPUs translate an ISA into some kind of internal state, big processors just happen to have an extra layer.
The term RISC came into use to distinguish those that did require decoding from those that did not.
 Also, I suppose this is mostly nomenclature, the instructions 
 are translated into micro-operations/uops whereas microcode as 
 a term is usually reserved for either extremely
Microcode refers to the instruction sequence that is used internally after decoding.
May 03 2022
prev sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 5/3/2022 10:54 PM, Max Samukha wrote:
 The important part is that Nemerle can execute functions at compile time - 
 whether it's done via interpretation or compilation is not relevant to the 
 argument. D could as well compile CTFE into native code or IL (as in newCTFE) 
 from the start.
That's pedantically true. But I can't seem to explain the difference to you. D doesn't have a compiler in the runtime. Having a compiler in the runtime means that you can dynamically create code and compile it at runtime. It's a *fundamental* difference.

If there is no difference (after all, all of them are Turing machines, no difference at all!), and CTFE is popular and well known, why did ZERO of the native compilers do it? Why didn't it appear on feature wish lists? Why wasn't it in C/C++/Pascal/Fortran/Modula-2/Ada/Algol compilers? Why did nobody mention this when all those articles gushing about computing factorials with C++ templates NEVER mention just interpreting the factorial function?

Surely you can see that there must have been SOME difference there, even if it was just perception. It was D that changed that perception. Suddenly, native languages started implementing CTFE.
May 05 2022
next sibling parent reply monkyyy <crazymonkyyy gmail.com> writes:
On Thursday, 5 May 2022 at 23:06:26 UTC, Walter Bright wrote:
 It's a *fundamental* difference.
Maybe a fundamental difference to solve; less so for the user experience.
May 05 2022
parent Walter Bright <newshound2 digitalmars.com> writes:
On 5/5/2022 5:48 PM, monkyyy wrote:
 On Thursday, 5 May 2022 at 23:06:26 UTC, Walter Bright wrote:
 It's a *fundamental* difference.
 Maybe a fundamental difference to solve; less so for the user experience.
It actually wasn't difficult to implement. It was mostly tedious, as D is a complex language.
May 05 2022
prev sibling next sibling parent reply Siarhei Siamashka <siarhei.siamashka gmail.com> writes:
On Thursday, 5 May 2022 at 23:06:26 UTC, Walter Bright wrote:
 It was D that changed that perception. Suddenly, native 
 languages started implementing CTFE.
Is CTFE really that useful? Generating code as a part of the build process has been in use for a very long time. Any programming language (perl, python, php, ...) or tool (bison, flex, re2c, ...) could be used for this. Yes, the build process becomes a bit more complicated, because you suddenly have more build dependencies and more complicated build scripts or makefiles. Still, it's not rocket science. Everything is pretty easy to use and understand.

CTFE allows us to cut some corners and move this complexity into a compiler. The upside is that we don't need advanced build scripts. But the downside is potentially slower compilation (especially with the interpreting approach), which is also too obscure and hard to notice and fix.

Additionally, can CTFE be used to sniff some passwords from my system and embed them into a compiled binary? Now we have some security concerns on top of that.

As for the other languages implementing CTFE, my understanding is that compiler people just generally have to do something to keep the ball rolling and have themselves entertained and employed ;-) The features themselves may be useful or they may be just gimmicks. Only time will tell.

At the end of the day, the D language happens to be unpopular. CTFE doesn't look like a panacea.
May 05 2022
next sibling parent reply Patrick Schluter <Patrick.Schluter bbox.fr> writes:
On Friday, 6 May 2022 at 05:58:37 UTC, Siarhei Siamashka wrote:
 On Thursday, 5 May 2022 at 23:06:26 UTC, Walter Bright wrote:
 It was D that changed that perception. Suddenly, native 
 languages started implementing CTFE.
 Is CTFE really that useful? Generating code as a part of the build process has been in use for a very long time. Any programming language (perl, python, php, ...) or tool (bison, flex, re2c, ...) could be used for this. Yes, the build process becomes a bit more complicated, because you suddenly have more build dependencies and more complicated build scripts or makefiles. Still, it's not rocket science. Everything is pretty easy to use and understand.
That's the point. It reduces build complexity in a disruptive way.
 CTFE allows us to cut some corners and move this complexity
 into a compiler. The upside is that we don't need advanced 
 build scripts. But the downside is potentially slower 
 compilation (especially with the interpreting approach), which 
 is also too obscure and hard to notice and fix.
You have to include your build tool invocation and development time in the comparison.
 Additionally, can CTFE be used to sniff some passwords from my 
 system and embed them into a compiled binary? Now we got some 
 security concerns on top of that.
We are now deep in "whataboutism fallacy" territory here, aka clutching at straws to win the last word in a debate. The same security issue exists (I would even say, a worse one) with external build tools generating code.
 As for the other languages implementing CTFE, my understanding 
 is that compiler people just generally have to do something to 
 keep the ball rolling and have themselves entertained and 
 employed ;-) The features themselves may be useful or they may 
 be just gimmicks. Only time will tell.

 At the end of the day, D language happens to be unpopular. CTFE 
 doesn't look like a panacea.
No-one ever said it was (this is another rhetorical fallacy, called a strawman).
May 05 2022
next sibling parent reply Siarhei Siamashka <siarhei.siamashka gmail.com> writes:
On Friday, 6 May 2022 at 06:18:08 UTC, Patrick Schluter wrote:
 That's the point. It reduces build complexity in a disruptive 
 way.
My point is that CTFE actually increases the complexity and moves it somewhere else in a somewhat obscure way. It's one of the zillions of extra features that make the language spec bloated and difficult to learn.
 CTFE allows us to cut some corners and move this complexity
 into a compiler. The upside is that we don't need advanced 
 build scripts. But the downside is potentially slower 
 compilation (especially with the interpreting approach), which 
 is also too obscure and hard to notice and fix.
 You have to include your build tool invocation and development time in the comparison.
Yes and no. What really matters is transparency and simplicity. For example, I can have a tool which calculates Pi with high precision. Then I bundle this tool as a part of the project and simply use the computed constant in the source code. This extra tool is very visible in the project structure and there's nothing obscure about it. But with the CTFE approach, somebody may be tempted to do the expensive Pi computation every time at compile time. And this may be hidden somewhere deep in the source code in a much less visible way. Looks like such a feature makes it too easy to shoot yourself in the foot.
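A minimal D sketch of that worry (the function is illustrative and deliberately naive): nothing at the use site warns that every build now pays for the computation.

    // Naive Leibniz series for Pi; under CTFE it reruns in the compiler's
    // interpreter on every compilation of this module
    double slowPi(int terms)
    {
        double sum = 0;
        foreach (i; 0 .. terms)
            sum += (i % 2 ? -4.0 : 4.0) / (2 * i + 1);
        return sum;
    }

    enum pi = slowPi(1_000_000); // silently recomputed each compile
    enum piPasted = 3.141592653589793; // constant produced once by a visible external tool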
 Additionally, can CTFE be used to sniff some passwords from my 
 system and embed them into a compiled binary? Now we got some 
 security concerns on top of that.
We are now deep in "whataboutism-fallacy" territory here, aka as clutching at straws to win he last word in a debate. The same security issue (I would even say, worse) with external build tools generating code.
Uhm? Sorry? Are you saying that it's me who is clutching at straws? This whole thread is about what makes the D language unpopular. People seem to be clutching at straws to claim that the D language is great, but somehow the rest of the world just doesn't get it. The setup with external build tools is surely much easier to understand and review.
May 05 2022
next sibling parent Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Friday, 6 May 2022 at 06:57:52 UTC, Siarhei Siamashka wrote:
 For example, I can have a tool which calculates Pi number with 
 high precision. Then I bundle this tool as a part of the 
 project and simply use the computed constant in the source 
 code. This extra tool is very visible in the project structure 
 and there's nothing obscure about it. But with the CTFE 
 approach, somebody may be tempted to do the expensive Pi 
 constant computation every time at the compilation time. And 
 this may be hidden somewhere deep in the source code in a much 
 less visible way. Looks like such feature makes it too easy to 
 shoot yourself in the foot.
Yes, you have some points here. The total development overhead of metaprogramming is costly if you don't write libraries that are reused in multiple projects. So in order to reap the benefits of having advanced metaprogramming in a language, you need a community that builds libraries for reuse. In D that primarily means the standard library, as the number and quality of third party libraries is limited in comparison with more mainstream languages. This could be improved by having a smaller stable standard library and, in addition, a set of curated libraries that are more open to evolution. As it is, D cannot capitalize on metaprogramming features, because the ecosystem for building reusable components isn't there.

If you have an unchanging external library and an advanced compiler, then it could cache some of the work done for reuse to lower the compilation times. This could be improved by having a strategy where the focus was on building a more modular compiler rather than adding more features to the existing one.

---

D is where it is because of management choices made (or not made), I think. Overall, D is not as popular as it could be because there are missing bits and pieces, both in the language/implementation and in the approach to the ecosystem. The solution to this could be:

1. Simplify syntax and semantics, clean it up with a focus on ease of use. Full feature freeze until this is done.

2. Create a compiler infrastructure that is modular, where contributors can contribute in focused areas without understanding the whole thing.

3. Reduce the core standard library to something that is stable and can be evolved to be of the highest quality. Then move auxiliary stuff to official support libraries where breaking changes are more accepted (with semver support so you can still use an older version).

4. Map out the missing bits in documentation and official support libraries that hamper productivity.

With no such clear strategy, nothing will change.
May 06 2022
prev sibling next sibling parent reply Paolo Invernizzi <paolo.invernizzi gmail.com> writes:
On Friday, 6 May 2022 at 06:57:52 UTC, Siarhei Siamashka wrote:
 On Friday, 6 May 2022 at 06:18:08 UTC, Patrick Schluter wrote:
 That's the point. It reduces build complexity in a disruptive 
 way.
My point is that CTFE actually increases the complexity and moves it somewhere else in a somewhat obscure way. It's one of the zillions of extra features that make the language spec bloated and difficult to learn.
It's exactly the contrary. Doing things at compile time with templates means learning something new and difficult (see C++). Doing things at compile time with CTFE doesn't add anything new to learn; you write code as always.
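A small D sketch of the contrast, with the classic factorial: the template version is a separate sublanguage to learn, while the CTFE version is just an everyday function.

    // Template recursion, C++-style: a compile-time sublanguage of its own
    template Factorial(int n)
    {
        static if (n <= 1)
            enum Factorial = 1;
        else
            enum Factorial = n * Factorial!(n - 1);
    }

    // CTFE: the same ordinary function works at compile time and at run time
    int factorial(int n) { return n <= 1 ? 1 : n * factorial(n - 1); }

    static assert(Factorial!10 == factorial(10)); // both evaluated at compile time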
May 06 2022
parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Friday, 6 May 2022 at 07:58:06 UTC, Paolo Invernizzi wrote:
 Doing thing at compile time with templates implies something 
 new and difficult to learn (see C++). Doing things at compile 
 time with CTFE don't add anything new to learn, you write code 
 as always.
Let us put this in perspective first. Using templates for computations was not something the C++ designers wanted to support. In general, allowing programmers to execute recursion/loops at compile time is a serious liability for large projects:

1. In general, language designers want the compilation to execute in O(N) where N is the length of the source code.

2. There is no good solution to debug what goes wrong, so it certainly creates complications!

Those are two very good reasons for not adding CTFE.

Now, in C++ CTFE was inevitable because they have a large third party eco system where people started to use templates this way and it was becoming a language problem. So basically, it is a byproduct of a significant mistake in their original template design combined with having a large eco system of reusable code (libraries). Basically, they had no choice but to add it.

When C++ adds it, there will be more demand for it because many programmers know what C++ has.
May 06 2022
parent reply zjh <fqbqrr 163.com> writes:
On Friday, 6 May 2022 at 08:19:38 UTC, Ola Fosheim Grøstad wrote:

 Now, in C++ CTFE was inevitable because they have a large third 
 party eco system where people started to use templates this way 
 and it was becoming a language problem.
`D` needs to pay `more attention` to the development of `libraries`. It is `even more important` to pay attention to `libraries` than to the `D` language itself. Look at `Python`: because they have `many libraries`, they are developing very well!
May 06 2022
parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Friday, 6 May 2022 at 08:52:16 UTC, zjh wrote:
 It is `even more important` to pay more attention to 
 `libraries` than the `d` language.
D has many libraries that end up not being maintained, so the most important thing is to retain users, and then one needs to understand why they leave. One key issue that has been mentioned in this thread is that people leave because of inconsistencies in the language.
 Look at `Python`, because they have `many libraries`, they are 
 developing very well!
Or maybe the other way around: because the language has aimed for clean semantics and syntax, in addition to being stable, it gets many libraries…
May 06 2022
next sibling parent reply IGotD- <nise nise.com> writes:
On Friday, 6 May 2022 at 09:37:24 UTC, Ola Fosheim Grøstad wrote:
 D has many libraries that end up not being maintained, so the 
 most important thing is to retain users, and then one needs to 
 understand why they leave. One key issue that has been 
 mentioned in this thread is that people leave because of 
 inconsistencies in the language.
Yes, what we need is to start D3, where the language can evolve and people can tinker around more. D2 isn't going anywhere now and the maintainers believe that it is "perfect". Alternatively, D could be forked.
May 06 2022
parent reply forkit <forkit gmail.com> writes:
On Friday, 6 May 2022 at 09:47:22 UTC, IGotD- wrote:
 On Friday, 6 May 2022 at 09:37:24 UTC, Ola Fosheim Grøstad 
 wrote:
 D has many libraries that end up not being maintained, so the 
 most important thing is to retain users, and then one needs to 
 understand why they leave. One key issue that has been 
 mentioned in this thread is that people leave because of 
 inconsistencies in the language.
Yes, what we need is to start D3 where the language can evolve and people can tinker around more. D2 isn't going anywhere now and the maintainers believe that it is "perfect". Alternatively that D is forked.
the problem with forks, is finding competent compiler developers.. and there ain't that many of them. then you need a competent team to manage the fork...and there ain't too many of them either... it forkin ain't easy...to fork. btw. the premise of the question posed ( a long...long..long time ago..) .. is flawed and kinda meaningless. the question should have been => 'Why is D not widely used'.
May 06 2022
parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Friday, 6 May 2022 at 10:11:12 UTC, forkit wrote:
 the problem with forks, is finding competent compiler 
 developers.. and there ain't that many of them.
Actually, there are quite a lot of people who have enough compiler knowledge from their CS degree to do something significant, so that skill isn't as rare as one might think. Compiler design is a course that many take. But the current codebase isn't really attractive for that group. Maybe SDC is a better horse to bet on, but I don't know enough about the status or internals of SDC to make a judgement.
 then you need a competent team to mangage the fork...and there 
 ain't too many of them either...
That is a concern. Getting a loose group of people to focus on one shared long term vision doesn't happen by chance.
May 06 2022
parent reply max haughton <maxhaton gmail.com> writes:
On Friday, 6 May 2022 at 12:05:40 UTC, Ola Fosheim Grøstad wrote:
 On Friday, 6 May 2022 at 10:11:12 UTC, forkit wrote:
 the problem with forks, is finding competent compiler 
 developers.. and there ain't that many of them.
Actually, there are quite lot of people who have enough compiler knowledge from their CS degree to do something significant, so that skill isn't as rare are one might think. Compiler design is a course that many take. But the current codebase isn't really attractive for that group. Maybe SDC is a better horse to bet on, but I don't know much about the status or internals of SDC to make a judgement.
 then you need a competent team to mangage the fork...and there 
 ain't too many of them either...
That is a concern. Getting a loose group of people to focus on one shared long term vision doesn't happen by chance.
Amaury and I are trying to bring SDC back into the fold. Feel free to contribute.
May 06 2022
next sibling parent reply zjh <fqbqrr 163.com> writes:
On Friday, 6 May 2022 at 13:23:30 UTC, max haughton wrote:

 Amaury and I are trying to bring SDC back into the folds. Feel 
 free to contribute.
So, we have `four` good D `compilers`? But how do we level up our `library` ecosystem? We should give up the idea that the language is the most important thing; the `ecosystem` is the most important. Look at `Python`; maybe we should learn from them.
May 06 2022
parent zjh <fqbqrr 163.com> writes:
On Friday, 6 May 2022 at 14:23:21 UTC, zjh wrote:
 `compilers`? But how do we level up our `library` ecosystem?
Maybe we can do some investigation. `Investigate` programmers' `favorite libraries` in `D and other languages`. Then we port them. Theirs are ours!
May 06 2022
prev sibling next sibling parent Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Friday, 6 May 2022 at 13:23:30 UTC, max haughton wrote:
 Amaury and I are trying to bring SDC back into the folds. Feel 
 free to contribute.
I have taken a look at the repo, and it looks promising. Is it able to compile a subset of the language at this point? Is there a mailing list for it? I'd like to subscribe if there is, but summertime is unfortunately very busy, so I won't have any time until late in the autumn. It would be very interesting to see how you think about your concept, though.
May 10 2022
prev sibling parent reply Jack <jckj33 gmail.com> writes:
On Friday, 6 May 2022 at 13:23:30 UTC, max haughton wrote:
 On Friday, 6 May 2022 at 12:05:40 UTC, Ola Fosheim Grøstad 
 wrote:
 On Friday, 6 May 2022 at 10:11:12 UTC, forkit wrote:
 [...]
Actually, there are quite lot of people who have enough compiler knowledge from their CS degree to do something significant, so that skill isn't as rare are one might think. Compiler design is a course that many take. But the current codebase isn't really attractive for that group. Maybe SDC is a better horse to bet on, but I don't know much about the status or internals of SDC to make a judgement.
 [...]
That is a concern. Getting a loose group of people to focus on one shared long term vision doesn't happen by chance.
Amaury and I are trying to bring SDC back into the folds. Feel free to contribute.
what is SDC? is that "Stupid D compiler"
May 29 2022
parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Sunday, 29 May 2022 at 20:56:20 UTC, Jack wrote:
 what is SDC? is that "Stupid D compiler"
**S N A Z Z Y !**
May 29 2022
next sibling parent deadalnix <deadalnix gmail.com> writes:
On Sunday, 29 May 2022 at 21:03:29 UTC, Ola Fosheim Grøstad wrote:
 On Sunday, 29 May 2022 at 20:56:20 UTC, Jack wrote:
 what is SDC? is that "Stupid D compiler"
**S N A Z Z Y !**
Webfreak even made a logo for it, which I haven't published yet. Thanks Webfreak, and I apologize for taking so long!
May 29 2022
prev sibling parent reply Jack <jckj33 gmail.com> writes:
On Sunday, 29 May 2022 at 21:03:29 UTC, Ola Fosheim Grøstad wrote:
 On Sunday, 29 May 2022 at 20:56:20 UTC, Jack wrote:
 what is SDC? is that "Stupid D compiler"
**S N A Z Z Y !**
where is the link for that?
May 29 2022
parent reply bauss <jj_1337 live.dk> writes:
On Monday, 30 May 2022 at 06:15:24 UTC, Jack wrote:
 On Sunday, 29 May 2022 at 21:03:29 UTC, Ola Fosheim Grøstad 
 wrote:
 On Sunday, 29 May 2022 at 20:56:20 UTC, Jack wrote:
 what is SDC? is that "Stupid D compiler"
**S N A Z Z Y !**
where is the link for that?
https://github.com/snazzy-d/sdc
May 29 2022
parent Jack <jckj33 gmail.com> writes:
On Monday, 30 May 2022 at 06:25:40 UTC, bauss wrote:
 On Monday, 30 May 2022 at 06:15:24 UTC, Jack wrote:
 On Sunday, 29 May 2022 at 21:03:29 UTC, Ola Fosheim Grøstad 
 wrote:
 On Sunday, 29 May 2022 at 20:56:20 UTC, Jack wrote:
 what is SDC? is that "Stupid D compiler"
**S N A Z Z Y !**
where is the link for that?
https://github.com/snazzy-d/sdc
thanks!
May 30 2022
prev sibling next sibling parent reply mee6 <mee6 lookat.me> writes:
On Friday, 6 May 2022 at 09:37:24 UTC, Ola Fosheim Grøstad wrote:
 On Friday, 6 May 2022 at 08:52:16 UTC, zjh wrote:
 It is `even more important` to pay more attention to 
 `libraries` than the `d` language.
D has many libraries that end up not being maintained, so the most important thing is to retain users, and then one needs to understand why they leave. One key issue that has been mentioned in this thread is that people leave because of inconsistencies in the language.
That's one reason; another reason is no long term support. If you want to maintain D source code for any large project, you are going to run into a compiler bug, and that means having to become familiar with dmd's source code, which adds a lot of overhead. All I know is that whenever I went to compile my project over the years, I always found a new compiler bug. D really needs an LTS build. Not having one just screams you don't care about large projects.

I also worked on vibe.d for a little bit and it had the same issue. It couldn't keep up with all the different versions of D. It might make development for D easier but it comes at the cost of the user experience. It's really just many small problems like that that start to add up.
May 06 2022
parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Friday, 6 May 2022 at 10:01:00 UTC, mee6 wrote:
 That's one reason, another reason is no long term support. If 
 you want to maintain source code in D of any large project, you 
 are going to run into a compiler bug and that means having to 
 become familiar with Dmd's source code which adds a lot of 
 overhead.
Sure, I think that can be labeled as "inconsistencies", i.e. it does not work as intuitively expected. Of course, a language like C++ has many things that are not intuitive, but one cannot compete with C++ by having the same critical flaw…

Adding more features also does not help with inconsistencies/bugs. It is unlikely to address those issues that made people, who had enough interest to build a library, leave. It creates enthusiasm among those that are unlikely to leave (the regular enthusiasts that demand the addition), but does not really change the dynamics and makes structural changes even more expensive…
 I also worked on vibe.d for a little bit and it had the same 
 issue. It couldn't keep up with all the different versions of 
 D. It might make development for D easier but it comes at the 
 cost of the user experience. It's really just many small 
 problems like that that just start to add up.
I understand what you are saying. Of course, other web frameworks also have versioning issues, but smaller eco systems have to pay even more attention to this than larger eco systems, as the larger ones provide workarounds through search engines: you'll quickly find a workaround on Stack Overflow or a wholesale replacement (offering the same API). Basically, if you are small, you have to be *very focused* to offset the eco system disadvantages.

As a user it is very difficult to see where the overall focus and strategy is. There might be one, but it isn't clear to prospective users, I think.
May 06 2022
parent reply Guillaume Piolat <first.last gmail.com> writes:
On Friday, 6 May 2022 at 10:40:29 UTC, Ola Fosheim Grøstad wrote:
 the eco system **disadvantages.**
I think that's a meme opinion on these forums.

https://github.com/p0nce/DIID

For example, Go has a "great ecosystem", yet the solutions to those problems are not simpler, and often longer. Let's find the equivalent for the first 5 DIID in Golang:

- DIID 1 => https://tutorialedge.net/golang/parsing-xml-with-golang/
- DIID 2 => https://github.com/faiface/beep/wiki/Hello,-Beep!
- DIID 3 => https://golangexample.com/simple-markdown-to-html-converter-in-go/
- DIID 4 => https://github.com/skelterjohn/go.wde/blob/master/wdetest/wdetest.go
- DIID 5 => https://cloudoki.com/generating-pdfs-with-go/

OK, let's do the equivalent for Rust, it has a great ecosystem:

- DIID 1 => https://docs.rs/minidom/0.14.0/minidom/
- DIID 2 => https://github.com/RustAudio/rodio/blob/master/examples/basic.rs
- DIID 3 => https://github.com/johannhof/markdown.rs
- DIID 4 => https://github.com/sinclairzx81/black
- DIID 5 => https://lib.rs/crates/printpdf

Apart from Rust + DIID 3, all these tasks are longer to do than in D. But D has a "bad ecosystem". It is unfair because I wrote the DIIDs to showcase D, but I think the D ecosystem stands the comparison well.
May 06 2022
next sibling parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Friday, 6 May 2022 at 12:45:40 UTC, Guillaume Piolat wrote:
 On Friday, 6 May 2022 at 10:40:29 UTC, Ola Fosheim Grøstad 
 wrote:
 the eco system **disadvantages.**
I think that's a meme opinion on these forums.
It is clearly not a meme. Go is primarily useful for web-services. If you try to do something outside that domain you'll get into a rougher landscape. I don't know anything about what Rust is practical for, but they have a significant feature that nobody else has… If I were to pick it up I would use it primarily for WebAssembly. Based on what little I know, it would probably be difficult to beat for that purpose. C++ is very difficult to beat for things like embedded, simulation and computer graphics. Dart is very difficult to beat for things like cross platform mobile. Faust is very difficult to beat for prototyping LTI signal processing. All these languages have good eco systems (or system libraries) for their application areas. To get a reasonable comparison you need to focus on one application area and look at what you would need to build a full featured application.
 It is unfair because I wrote the DIIDs to showcase D, but I 
 think the D ecosystem stands well the comparison.
It has nothing to do with showcasing individual features or libraries, it has to do with picking the right tool for the job. Basically nobody is looking for a generic language. The question you want to answer is this: does this language provide solid, maintained, well-documented libraries at the right abstraction level to fill *all the gaps* that need to be filled for this *particular application* I am going to build?
May 06 2022
next sibling parent reply Adam D Ruppe <destructionator gmail.com> writes:
On Friday, 6 May 2022 at 13:25:00 UTC, Ola Fosheim Grøstad wrote:
 All these languages have good eco systems (or system libraries) 
 for their application areas. To get a reasonable comparison you 
 need to focus on one application area and look at what you 
 would need to build a full featured application.
D is ideal for all applications. I know there's some belief out there that you can only possibly be good at one thing, but it isn't even true for insects, much less for divine delights like D.
May 06 2022
next sibling parent reply mee6 <mee6 lookat.me> writes:
On Friday, 6 May 2022 at 13:40:19 UTC, Adam D Ruppe wrote:
 On Friday, 6 May 2022 at 13:25:00 UTC, Ola Fosheim Grøstad 
 wrote:
 All these languages have good eco systems (or system 
 libraries) for their application areas. To get a reasonable 
 comparison you need to focus on one application area and look 
 at what you would need to build a full featured application.
D is ideal for all applications.
Not in practice.
 I know there's some belief out there that you can only possibly 
 be good at one thing, but it isn't even true for insects, much 
 less for divine delights like D.
Not sure why you are looking down on insects, they're more important in an ecosystem than humans.
May 06 2022
next sibling parent monkyyy <crazymonkyyy gmail.com> writes:
On Friday, 6 May 2022 at 14:30:27 UTC, mee6 wrote:
 Not sure why you are looking down on insects, they're more 
 important in an ecosystem than humans.
May 06 2022
prev sibling parent reply Adam D Ruppe <destructionator gmail.com> writes:
On Friday, 6 May 2022 at 14:30:27 UTC, mee6 wrote:
 Not in practice.
Yes in practice.
 I know there's some belief out there that you can only 
 possibly be good at one thing, but it isn't even true for 
 insects, much less for divine delights like D.
Not sure why you are looking down on insects, they're more important in an ecosystem than humans.
I'm not looking down on insects. Quite the opposite - I'm defending their flexibility against common misconceptions over their hyper specialization.
May 06 2022
parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Friday, 6 May 2022 at 15:26:52 UTC, Adam D Ruppe wrote:
 I'm not looking down on insects. Quite the opposite - I'm 
 defending their flexibility against common misconceptions over 
 their hyper specialization.
What makes insects more adaptable than humans when disasters hit is that they are so simple and many that they can adapt through genetic mutations. Poor analogy for D, but it would be a great analogy in favour of LISP! Anyway, most imperative languages are more or less equally powerful and have very similar features (in the abstract). That was not the point. The point was that you need to retain dedicated users over time in order to build an eco system. It is not even strictly about how many users you've got or the expressiveness of the language. What those users are interested in and the "gravity" of their libraries can then cause a formation of a niche around that language. If they leave early, then you cannot sustain an eco system, nor grow a niche. Think of it in terms of gravity.
May 06 2022
parent reply Adam D Ruppe <destructionator gmail.com> writes:
On Friday, 6 May 2022 at 16:15:23 UTC, Ola Fosheim Grøstad wrote:
 The point was that you need to retain dedicated users over time 
 in order to build an eco system.
http://dpldocs.info/this-week-in-d/Blog.Posted_2022_05_02.html
May 06 2022
next sibling parent Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Friday, 6 May 2022 at 17:30:46 UTC, Adam D Ruppe wrote:
 On Friday, 6 May 2022 at 16:15:23 UTC, Ola Fosheim Grøstad 
 wrote:
 The point was that you need to retain dedicated users over 
 time in order to build an eco system.
http://dpldocs.info/this-week-in-d/Blog.Posted_2022_05_02.html
Well written article, but supporting @safe and @nogc is optional, so isn't that a self-imposed burden by authors? I'm inclined to believe that authors giving up on libraries have other causes, but I could be wrong.

Anyway, it is very difficult to grow the eco-system until you have retention; until then "gravity" won't happen. And without gravity you won't have enough maintainers of a library to sell stability…

It is great to see that Max has joined SDC; browsing through the repo, it looks like a lot of effort has gone into it! That is the kind of "gravity" that you need. If you have two highly active developers, you might get three and so on… increasing belief in the long term viability of the project.
May 06 2022
prev sibling parent Walter Bright <newshound2 digitalmars.com> writes:
On 5/6/2022 10:30 AM, Adam D Ruppe wrote:
 http://dpldocs.info/this-week-in-d/Blog.Posted_2022_05_02.html
You wrote about new D bug-detecting features not finding bugs in existing code. This is completely unsurprising. For example, I ran one of those new C++ bug finding scanners over DMD (back when it was still in C++). How many bugs did it find? Zero. The thing is, the DMD source code was very well tested. The routine coding bugs that caused problems were already fixed. The bug finding features are for *new* code, so you can find the bugs at compile time when it is much more efficient to fix them.
May 06 2022
prev sibling parent Dukc <ajieskola gmail.com> writes:
On Friday, 6 May 2022 at 13:40:19 UTC, Adam D Ruppe wrote:
 On Friday, 6 May 2022 at 13:25:00 UTC, Ola Fosheim Grøstad 
 wrote:
 All these languages have good eco systems (or system 
 libraries) for their application areas. To get a reasonable 
 comparison you need to focus on one application area and look 
 at what you would need to build a full featured application.
D is ideal for all applications.
Not quite ideal, but it does not have to be. A family car does not need to be as good at pulling trailers as a tractor. It's still the more practical choice for that for most people. Commonality counts just as much as usefulness in one single purpose.
May 07 2022
prev sibling parent reply Guillaume Piolat <first.last gmail.com> writes:
(first let's agree that I will not answer to your next message, 
so that our time is capped :))

On Friday, 6 May 2022 at 13:25:00 UTC, Ola Fosheim Grøstad wrote:
 Go is primarily useful for web-services. If you try to do 
 something outside that domain you'll get into a rougher 
 landscape.
That doesn't mean that, if you have a webservice to write, Go is the "best tool for the job". D could very well be the "best tool for the job". How do we know? By measuring. Who decides what is the best tool for the job? It seems to be "hearsay" from what you are saying. Have people made the comparison? It doesn't seem so to me. I did, for a limited domain, and it turns out D performs well.
 C++ is very difficult to beat for things like embedded, 
 simulation and computer graphics.
There is a good reason to start a new codebase in C++, and that is that the year is 2007 and you are into Transformers - the movie, it just came out. No one born after Crazy Frog will ever want to maintain code in a language that deliberately ignores the developer experience, and history went in the other direction - making things easier. Of course the consultants will have lots of grey goo work to rejoice and lose sanity on.
 Faust is very difficult to beat for prototyping LTI signal 
 processing.
Yeah. But it isn't particularly useful. I've never heard anyone from the audio industry say they use Faust, and I think DSP DSLs aren't a good idea. DSP is not even 1/3 of the products people use. (You will have noticed that Faust can even be implemented at CTFE in D, so you can save writing a compiler and a backend, and get a D-man tattoo instead.)
May 06 2022
parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Friday, 6 May 2022 at 14:02:40 UTC, Guillaume Piolat wrote:
 (first let's agree that I will not answer to your next message, 
 so that our time is capped :))
Sure :)
 That doesn't mean if you have a webservice to write, that Go is 
 the "best tool for the job". D could be very well be the "best 
 tool for the job".
D does not have cloud service support as far as I am aware. Eco system matters. Running an on-demand Go service will be much cheaper and more responsive (I am talking CPU time).
 the products people use. (You will have noticed that Faust it 
 can be implemented at CTFE in D even, so you can save writing a 
 compiler and a backend, and get a D-man tattoo instead).
Making modifications real time and seeing the spectrum change as you type is fun and intuitive, you also get access to the standard library and can conduct experiments within seconds: https://faustlibraries.grame.fr/ It is great that D does everything you want for your use context, but it is not comparable in terms of features for that domain in general.
May 06 2022
parent reply Dukc <ajieskola gmail.com> writes:
On Friday, 6 May 2022 at 15:27:58 UTC, Ola Fosheim Grøstad wrote:
 That doesn't mean if you have a webservice to write, that Go 
 is the "best tool for the job". D could be very well be the 
 "best tool for the job".
D does not have cloud service support as I am aware of. Eco system matters.
It is definitely true that comparing our ecosystem side-by-side with Go's or Rust's, we lose. Even in general, let alone if considering only the area where Go is presumably strongest: web servers. But I argue that this does not always matter much. When you need a library to do something, it does not matter whether you have 5 or 50 good options. As Guillaume's DIID series shows, D's ecosystem is easily big enough for most purposes. Of course, most purposes != all purposes, but still. If you compare the languages in something where both D and Go/Rust provide good libraries, it's the language itself and its standard library that make most of the difference. Not the ecosystem.
 Running an on-demand Go service will be much cheaper and 
 responsive (I am talking CPU time).
How you use the networking library or framework is still going to matter more than what it is. Probably a Go web server is faster than a Vibe.D server if both have seen similar optimisation effort. Still, you can always drop to lower level networking APIs if you need more performance than the canonical use of Vibe.D provides. If there is any big language-level advantage in one over the other, it probably weighs more than that performance difference. OTOH if a feature-rich networking framework is needed instead of just the basic functionality, I can then see the richer ecosystem of Go trumping other considerations.
May 07 2022
next sibling parent Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Saturday, 7 May 2022 at 19:30:08 UTC, Dukc wrote:
 If you're compare the languages in something where both D and 
 Go/Rust provide good libraries for, it's the language itself 
 and it's standard library that make most of the difference. Not 
 the ecosystem.
Yes, I agree with this in some cases. It depends on the full application. I often only use 1, 2 or 3 external libraries, but then I also need those specific ones. So it depends on the application area, sure. How many unfilled gaps can you live with in that particular case? For instance, if you need a solver, you are generally better off using the language that has the best API for that specific solver. You don't want to mess around with bugs in the API when programming the solver correctly is challenge enough! No room for "gaps" in the dev environment in that case.
 to matter more than what it is. Probably a Go web server is 
 faster than a Vibe.D server if both have seen similar 
 optimisation effort.
Either way, I don't think web applications are a fair comparison, as that space is more about the cloud services enabling fast-boot specialized infrastructure, possibly with custom runtimes, for the most commonly used languages. (No executables allowed.) It is very difficult to get access and compete in that space. D could probably be more competitive for real-time chat, game servers and services where you tailor the infrastructure yourself to run 24/7.
 use of Vibe.D provides. If there is any big language-level 
 advantage in one over the other, it probably weighs more than 
 that performance difference.
Yes, it is not about maximizing CPU performance. It is about infrastructure support. Instant boot, instant shutdown, automatic scaling. Basically one executable per web address in the extreme case (one server per "function").
 OTOH if a feature-rich networking framework is needed instead 
 of just the basic functionality, I can then see the richer 
 ecosystem of Go trumping other considerations.
If you are doing something mainstream, you will basically find something that gets you there faster using a mainstream language (for that purpose). Smaller languages are better off looking for the "underserved" niches. Could be game servers. Could be something else, but run-of-the-mill web services are "overserved" by others, so it is basically impossible to do anything *interesting* in that space that will make developers gravitate towards you.
May 07 2022
prev sibling parent reply Joshua <jtacoma pm.me> writes:
On Saturday, 7 May 2022 at 19:30:08 UTC, Dukc wrote:
 On Friday, 6 May 2022 at 15:27:58 UTC, Ola Fosheim Grøstad 
 wrote:
 That doesn't mean if you have a webservice to write, that Go 
 is the "best tool for the job". D could be very well be the 
 "best tool for the job".
D does not have cloud service support as I am aware of. Eco system matters.
 It is definitely true that comparing our ecosystem side-by-side with Go's or Rust's, we lose. Even in general, let alone if considering only the area where Go is presumably strongest: web servers. But I argue that this does not always matter much. When you need a library to do something, it does not matter whether you have 5 or 50 good options. As Guillaume's DIID series shows, D's ecosystem is easily big enough for most purposes. Of course, most purposes != all purposes, but still. If you compare the languages in something where both D and Go/Rust provide good libraries, it's the language itself and its standard library that make most of the difference. Not the ecosystem.
Yes. I tried D a few times over the past twenty years or so, passing on it the first time over GC performance concerns (I was a bit of a purist) and the second time over uncertainty around D1 vs D2. I came back to it in the past week or two for use in a hobby project after trying many other options…

I’m so tired of jumping through hoops to combine build systems just so I can use one neat Go library or one neat Rust library, or just build most (but not all) of a mobile app in Flutter/Dart… however, the standard libraries for these languages don’t support UUIDs or graphemes, so I need to pull in even more dependencies, and then one hobby coder owning a small test helper project ten layers deep in my dependency graph introduces a subtle incompatibility that ripples through my whole build system and I’m almost back to square one. It would be nice if choosing a language didn’t have to mean choosing the entire build system and ecosystem of hobby projects that comes with it.

Worse, these build systems’ support for mixing languages is mostly superficial. Rust’s Cargo, for example, or Go’s Get, are neither optional nor easy to integrate into other build systems. Have you ever tried combining C++ and Rust in the same project?

So I keep looking back to C++ or even C to just build a cross-platform base that can be extended with plugins written in whatever other language, and then scripting languages start to look good. Lately I started looking at compiled functional languages that integrate easily with C, because I expect I’ll need their expressiveness and type safety for some tricky data transformations. OCaml gets pretty close - if it came with pkg-config support to find the runtime library, I might even have stopped there!

However, I looked even further and found D has it all: it’s compiled, it integrates easily with existing C code whether I’m using GCC or Clang, it doesn’t force me to use a bespoke build system (I’m avoiding DUB so far; Meson is good for me), and wonder of wonders: it has standard library support for iterating through Unicode text one grapheme at a time!

Well, I’m sure I’ll encounter a rough edge or two. Maybe I’ll even move on again. However, with CTFE and lazy parameters to boot, it’ll take more than one or two scrapes to shake my newfound hope in this language.
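As a taste, grapheme-level iteration is a few imports away (a minimal sketch of my own; the sample string is invented):

```d
import std.range : walkLength;
import std.stdio : writeln;
import std.uni : byGrapheme;

void main()
{
    // 'e' followed by a combining acute accent: one user-perceived character
    string s = "e\u0301";
    writeln(s.length);                // 3 -- UTF-8 code units (bytes)
    writeln(s.walkLength);            // 2 -- code points
    writeln(s.byGrapheme.walkLength); // 1 -- graphemes
}
```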
May 09 2022
next sibling parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Tuesday, 10 May 2022 at 04:16:32 UTC, Joshua wrote:
 Worse, these build systems’ support for mixing languages is 
 mostly superficial. Rust’s Cargo, for example, or Go’s Get, are 
 neither optional nor easy to integrate into other build systems.
I am a bit puzzled by your experience. It is not uncommon to use "vendoring" of libraries in Go, so I am not quite sure why you feel bound to something specific. Granted, Go has other issues that are annoying, e.g. no exceptions and not the best abstraction mechanisms. That said, D seems to have switched focus from C++ to C lately, so it might be a good fit for your needs. The tradeoff is that you cannot have a modern GC and also have good C interop.
May 10 2022
parent reply Joshua <jtacoma pm.me> writes:
On Tuesday, 10 May 2022 at 07:00:03 UTC, Ola Fosheim Grøstad 
wrote:
 On Tuesday, 10 May 2022 at 04:16:32 UTC, Joshua wrote:
 Worse, these build systems’ support for mixing languages is 
 mostly superficial. Rust’s Cargo, for example, or Go’s Get, 
 are neither optional nor easy to integrate into other build 
 systems.
I am a bit puzzled by your experience. It is not uncommon to use "vendoring" of libraries in Go, so I am not quite sure why you feel bound to something specific.
Right! Ok, what I mean is that I can easily combine D, C, and C++ in the same binary build target because none of these languages is tied to a build system that supports only one language. As an example, if I'm willing to write a bit of C-interop boilerplate and make some decisions about garbage collection, a [meson.build](https://mesonbuild.com/) like the following just _works_:

```
project('myapp', ['d', 'c', 'cpp'])
executable('myapp', ['lib.d', 'lib.c', 'lib.c++'])
```

I [can't](https://mesonbuild.com/Rust.html) throw Rust into the same build target, but even if I could I'd still have to deal with the fact that doing anything interesting in Rust (or Go) requires using Cargo (or `go get`) to download 10× more dependencies than I asked for, which is 10× more long-term instability than I intended to accept.

I don't mean to be excessively critical: they're wonderful languages, and having the option to reuse work from such large ecosystems is _truly amazing_. AFAIK, nothing remotely similar existed for compiled languages more than 10 or 15 years ago. I'm just not a fan of the trade-off: it's like, _"Everything you can see in this vast ecosystem is free to reuse, all you have to do is use this here build system which supports the only programming language you'll ever need. Isn't that great? Now go forth and make a name for yourself by rewriting useful libraries from ~~other~~ inferior languages into our new paradise!"_ It's exciting and fun and so much work and at the end you get... a silo. Not so with D (or at least not as far as I know yet! ☺)
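And the boilerplate itself is tiny. A minimal sketch (`add` is a function I'm inventing here, assumed to be defined in lib.c):

```d
// lib.d -- calling into lib.c from the same Meson build target
extern(C) int add(int a, int b); // declaration only; the definition lives in lib.c

void main()
{
    import std.stdio : writeln;
    writeln(add(2, 3)); // resolved by the linker, no bindings generator involved
}
```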
May 10 2022
next sibling parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Tuesday, 10 May 2022 at 23:08:18 UTC, Joshua wrote:
 amazing_. AFAIK, nothing remotely similar existed for compiled 
 languages more than 10 or 15 years ago. I'm just not a fan of 
 the trade-off: it's like, _"Everything you can see in this vast 
 ecosystem is free to reuse, all you have to do is use this here 
 build system which supports the only programming language 
 you'll ever need. Isn't that great? Now go forth and make a 
 name for yourself by rewriting useful libraries from other 
 ~~other~~ inferior languages into our new paradise!"_ It's 
 exciting and fun and so much work and at the end you get... a 
 silo. Not so with D (or at least not as far as I know yet! ☺)
Hm, I get a feeling you speak more of packages than builds? I assume any build system that generates makefiles can be extended to work well with Go and Rust? It is easier to create C wrappers in D than in Go; for modern C++, not so much, as modern C++ is heavy on header files.
May 10 2022
parent reply Don Allen <donaldcallen gmail.com> writes:
Others have spoken in this long thread about why D is not more 
popular than X, where X=Go or Rust or what have you. I don't have 
much to add to this, other than to observe that Nim is good work 
and hasn't exactly taken over the world and, like D, it doesn't 
have a major corporate sponsor.

I can only speak from my personal experience. I'm in the process 
of completing a project to port about 9000 lines of C I wrote 10 
years ago to provide me with tools to manage my finances the way 
I want to. I'm a very experienced software developer and project 
manager, now retired. I'm certainly out-of-touch with the way 
programmers work today, but I am very familiar with contemporary 
languages as a result of this project, as well as a  
long-standing personal interest in programming languages.

As I observed in another recent post, I considered and rejected 
Rust, Go, Scheme, Haskell and Nim for this project and chose D. A 
few comments on each:

1. The main application of my project makes heavy use of gtk and 
sqlite. In Rust, gtk callbacks are required to be "static", 
meaning that the closures you pass to the signal connection 
routines must not make any free-variable references that are not 
static (don't live as long as the program). I will spare you the 
details, but if you are using sqlite in those callbacks, a 
necessity (all the work is done in the callbacks), I contend that 
there is no way to avoid using "unsafe" Rust code. The Rust 
community, like D's, is helpful, and I've discussed the specifics 
with them in gory detail, and no one has come up with a solution 
to the problems I've encountered, nor have I and believe me, I 
tried. If I am going to be forced to write unsafe Rust, why put 
up with all the borrow-checker, lifetime-checker pain? Rust's 
goal of memory safety without a GC makes us, the programmers, part 
of the memory management system. Add to that very long 
compilation times right in the middle of your edit, compile, 
debug cycle and it's just not worth it.

2. Scheme is a great language and Chez Scheme is a fine 
implementation. It's mature, well documented and very fast. I use 
Scheme for a lot of things. But this project is large enough that 
the dynamic typing becomes a liability. Too many issues that 
would be caught at compile time with a static language turn into 
run-time debugging adventures. There is also an issue with weak 
type-checking across the foreign-function interface (e.g., all 
pointers look like void* pointers to Chez, so you can pass the 
wrong one to C without complaint from the Chez or C compiler).

3. I could have done this project in Go and I'm sure it would 
have been fine, but I chose D instead based on my preference for 
the language. The tradeoffs weren't all that different so it came 
down to personal preference.

4. I can say the same about Nim -- personal preference. And there 
is a fair amount of chatter about compiler bugs on the network that 
was also a bit off-putting.

5. Haskell is another great language, but the nature of how gtk 
works forces you into maintaining a fair amount of state the 
old-fashioned way, rather than passing new state with function 
calls. If much of your code is going to be old-fashioned 
imperative, why choose a functional language that has had 
imperative capabilities glued on?

This brings me to my D experience. It took me a while to learn 
where the land-mines are. The big one was passing a pointer to C 
that originated on the D side and then getting burned by the GC 
because I hadn't retained a reference to the object. It took a 
long time to debug that one, with help from this community. In 
hindsight, it would have been a lot simpler if I'd read the 
warning in the toStringz documentation that spells this out. 
Someone in this thread said that the GC makes C interaction 
difficult (I forget the exact phrase used). Nah -- you just need 
to know about this issue and take care to protect objects that 
originate on the D side. If they originate on the C side, no 
problem.
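
Schematically, the trap looks like this (a sketch; `c_remember` is a 
hypothetical C function that holds onto the pointer):

```d
import std.string : toStringz;

extern(C) void c_remember(const(char)* s); // hypothetical: C keeps the pointer

void risky(string name)
{
    // toStringz may allocate a new GC buffer to append the terminating '\0'.
    // With no D-side reference to that buffer, the GC is free to collect it
    // while C still holds the pointer.
    c_remember(name.toStringz);
}

void safer(string name)
{
    static const(char)* keepAlive; // a live D reference protects the buffer
    keepAlive = name.toStringz;
    c_remember(keepAlive);
}
```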

But once I got comfortable with the language, the porting of the 
C code went fairly quickly and the result is just what I wanted: 
much more concise, readable (and therefore maintainable) code the 
performance of which is indistinguishable from the C version 
(truth be told, most of the heavy lifting is done in sqlite, so I 
probably could have written this in Python -- ugh -- and gotten 
acceptable performance). I like being able to debug with gdb when 
runtime problems do arise and I appreciate the fast compile 
times. After all, the edit-compile-debug cycle is the inner loop 
when you are developing.

Overall, I'm very impressed by D. The compiler seems solid, the 
library functions I've used have all worked properly, and the 
documentation is mostly good. I think the criticisms that have 
been directed at D about lacking vision, blah, blah, have been 
from those who have not used it to produce working software that 
does what it is supposed to do with good performance. In other 
words, usable in practice (recall that Einstein said "In theory, 
there's no difference between theory and practice"). That is the 
ultimate test and at least for me, D has passed that test.

/Don
May 11 2022
next sibling parent Walter Bright <newshound2 digitalmars.com> writes:
Thanks, Don. I really appreciate you spending the time to write this. It's quite
an enjoyable read! I'm glad D meets your programming needs.
May 11 2022
prev sibling parent "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Wed, May 11, 2022 at 07:22:53PM +0000, Don Allen via Digitalmars-d wrote:
[...]
 I think the criticisms that have been directed at D about lacking
 vision, blah, blah, have been from those who have not used it to
 produce working software that does what it is supposed to do with good
 performance. In other words, usable in practice (recall that Einstein
 said "In theory, there's no difference between theory and practice").
 That is the ultimate test and at least for me, D has passed that test.
[...] Possibly this is true, I don't know. But personally, I have not found a better language than D, among those I have tried out when I first got deeply dissatisfied with C++ and started looking for alternatives. It's not without its own flaws, granted, but for what I need to do, it's ideal. T -- There is no gravity. The earth sucks.
May 12 2022
prev sibling parent "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Tue, May 10, 2022 at 11:08:18PM +0000, Joshua via Digitalmars-d wrote:
[...]
 I [can't](https://mesonbuild.com/Rust.html) throw Rust into the same
 build target, but even if I could I'd still have to deal with the fact
 that doing anything interesting in Rust (or Go) requires using Cargo
 (or `go get`) to download 10× more dependencies than I asked for,
 which is 10× more long-term instability than I intended to accept.
Yeah, that's one big factor that turns me off most popular "new" languages these days. I don't *want* tons of dependencies that I know nothing about. I want the minimum to get my job done, and that's *it*. Anything that requires a big hairball of dependencies is a no-go in my book.
 I don't mean to be excessively critical: they're wonderful languages,
 and having the option to reuse work from such large ecosystems is
 _truly amazing_. AFAIK, nothing remotely similar existed for compiled
 languages more than 10 or 15 years ago. I'm just not a fan of the
 trade-off: it's like, _"Everything you can see in this vast ecosystem
 is free to reuse, all you have to do is use this here build system
 which supports the only programming language you'll ever need. Isn't
 that great? Now go forth and make a name for yourself by rewriting
 useful libraries from other ~~other~~ inferior languages into our new
 paradise!"_ It's exciting and fun and so much work and at the end you
 get... a silo. Not so with D (or at least not as far as I know yet! ☺)
This is one reason why I'm still not sold on dub, despite having used D for a decade. I just don't like its walled garden philosophy. It places arbitrary limitations on what is essentially an algorithmic solution that applies to far more general things than its authors deemed worthy of recognition. This is why I wrote what I did recently, about busting dub out of its own walled garden. Don't know if I'll succeed; but in the worst case, my last-resort secret plan is to write a general build tool that can read dub configurations and process them without any effort on the part of the package authors, so that I can import dub projects without ever using dub.

As for D itself, its ease of integration with other languages has been great. It integrates with C basically seamlessly -- recently I've been working on a project that uses libxcb, and D lets me declare xcb prototypes on an as-needed basis: I don't even need to convert the entire header file, just port over those functions that I actually use and I'm good to go (see the sketch below). And where conversion of entire headers is necessary, tools like dpp are great. C++ integration is a bit trickier (some parts aren't 100% compatible, so I heard -- haven't tried it myself). But Java integration with Adam's jni.d is awesome. I've also used D's metaprogramming capabilities to reduce the amount of boilerplate needed to interface with GLSL shader code, and that's been great too.
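For illustration, the as-needed declarations amount to something like this (a sketch; the two prototypes mirror the real libxcb signatures):

```d
// Just the slice of xcb.h this program actually touches, declared by hand:
extern(C) nothrow @nogc
{
    struct xcb_connection_t; // opaque handle; its internals never need porting
    xcb_connection_t* xcb_connect(const(char)* displayname, int* screenp);
    void xcb_disconnect(xcb_connection_t* c);
}

void main()
{
    auto conn = xcb_connect(null, null); // build with -L-lxcb
    scope(exit) xcb_disconnect(conn);
}
```

T -- Only boring people get bored. -- JM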
May 11 2022
prev sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 5/9/2022 9:16 PM, Joshua wrote:
 Lately I started looking at compiled 
 functional languages that integrate easily with C because I expect I’ll need 
 their expressiveness and type safety for some tricky data transformations.
With ImportC, D is getting pretty darned good at integrating with C.
May 11 2022
next sibling parent reply Adam D Ruppe <destructionator gmail.com> writes:
On Thursday, 12 May 2022 at 00:10:00 UTC, Walter Bright wrote:
 With ImportC, D is getting pretty darned good at integrating 
 with C.
Still not as good as it already was without it. http://dpldocs.info/this-week-in-d/Blog.Posted_2022_05_09.html#importc
May 11 2022
next sibling parent zjh <fqbqrr 163.com> writes:
On Thursday, 12 May 2022 at 00:12:05 UTC, Adam D Ruppe wrote:

 http://dpldocs.info/this-week-in-d/Blog.Posted_2022_05_09.html#importc
No, I think `ImportC` is a good idea, just like having both `unittest` and external test libraries. I also think `d` should link to `C++` libraries, so that `d` users can directly use `C++`'s functions and libraries. Similarly, `d` should also link to `Rust`. Their ecosystems are relatively large, so we should make use of them. Let theirs become ours.
May 11 2022
prev sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 5/11/2022 5:12 PM, Adam D Ruppe wrote:
 On Thursday, 12 May 2022 at 00:10:00 UTC, Walter Bright wrote:
 With ImportC, D is getting pretty darned good at integrating with C.
Still not as good as it already was without it. http://dpldocs.info/this-week-in-d/Blog.Posted_2022_05_09.html#importc
You should mention that ImportC supports C constructs that are not expressible in D, such as _Generic and bitfields. (Although I added bitfield support to D recently.)
 And what could have been done to D itself in that time?
C interoperability is a major barrier to using D. ImportC is probably the best leverage for D among existing initiatives.
 translating those files wasn't that hard and now it is done
The trouble comes when those header files change, you've got to figure out what changed and update those hand-translated files. This is a big win for ImportC.
May 11 2022
next sibling parent reply Adam D Ruppe <destructionator gmail.com> writes:
On Thursday, 12 May 2022 at 03:43:50 UTC, Walter Bright wrote:
 You should mention that ImportC supports C constructs that are 
 not expressible in D, such as _Generic and bitfields.
I've been using D for 15 years across a variety of applications, including making drivers to interface with esoteric hardware. Those have never been useful. C bitfields are useless even in C for hardware work, unless you lock into a proprietary compiler, since the layout is undefined. You're much better off just taking out the spec sheet and using the | and & operators.
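To illustrate (a sketch with a made-up register layout):

```d
// Shifts and masks taken straight from the spec sheet, independent of any
// compiler's bitfield layout:
enum uint MODE_MASK   = 0b0000_0111; // bits 0..2
enum uint SPEED_MASK  = 0b0111_1000; // bits 3..6
enum uint SPEED_SHIFT = 3;

uint withMode(uint reg, uint mode)
{
    return (reg & ~MODE_MASK) | (mode & MODE_MASK);
}

uint speed(uint reg)
{
    return (reg & SPEED_MASK) >> SPEED_SHIFT;
}

unittest
{
    assert(withMode(0xF0, 0b101).speed == 0b1110);
}
```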
 C interoperability is a major barrier to using D.
D is very good at C interoperability. You have a bad habit of spreading misinformation about D. Why would someone use it if its own creator thinks it sucks? But in reality, D is a very good language that makes accessing C code very easy.
 The trouble comes when those header files change, you've got to 
 figure out what changed and update those hand-translated files.
That's not really hard when they change in one place. You can do a cherry-pick merge and be done with it in little time. Stable APIs are a very small investment with big returns in D. It is hard, though, when you are trying to match the version on the user's computer, since dstep's 99% translation may not be good enough and a pre-made packaged thing may not match. This is where importC can win - matching what's on the build system automatically. It isn't there yet, though.
May 12 2022
parent Walter Bright <newshound2 digitalmars.com> writes:
On 5/12/2022 6:34 AM, Adam D Ruppe wrote:
 C bitfields are useless even in C for hardware work, unless you lock into 
 proprietary compiler, since the layout is undefined.
It's not undefined, it's implementation defined. Also, the layout of ordinary fields is also implementation defined. C has a lot of surprising unportable features. And if you're interfacing to specific hardware, portability to some other platform likely isn't in the cards anyway. If you use bitfields for, say, reducing the memory consumption of the program, it is irrelevant if a different compiler uses a different layout. It would only be an issue if the bit fields were written to a file, which is not that common.
May 12 2022
prev sibling parent reply Guillaume Piolat <first.last gmail.com> writes:
On Thursday, 12 May 2022 at 03:43:50 UTC, Walter Bright wrote:
 On 5/11/2022 5:12 PM, Adam D Ruppe wrote:

 C interoperability is a major barrier to using D. ImportC is 
 probably the best leverage for D among existing initiatives.
Personally I think importC will be useful; it will be able to leverage many C libraries that are available but would take too long to port: lua, codecs, all the stb_thing.h single-header libraries... the list is really quite long. Why not steal C++'s and Zig's biggest appeal?
May 12 2022
parent reply Adam D Ruppe <destructionator gmail.com> writes:
On Thursday, 12 May 2022 at 14:02:51 UTC, Guillaume Piolat wrote:
 will be able to leverage many C libraries that are available 
 but too long to port: lua, codecs, all stb_thing.h... the list 
 is really quite long.
You know, most of those things already work.
May 12 2022
next sibling parent reply jmh530 <john.michael.hall gmail.com> writes:
On Thursday, 12 May 2022 at 15:12:53 UTC, Adam D Ruppe wrote:
 On Thursday, 12 May 2022 at 14:02:51 UTC, Guillaume Piolat 
 wrote:
 will be able to leverage many C libraries that are available 
 but too long to port: lua, codecs, all stb_thing.h... the list 
 is really quite long.
You know most those things already work.
I think you are coming at this as someone who knows a lot about C and D already (not that Guillaume doesn't). The relevant question to me is how much work it is for someone relatively new to D to figure out how to call C code. If importC reduces the barrier, then I see it as a positive, even if there are other tools that work better for you or others.
May 12 2022
parent reply Adam D Ruppe <destructionator gmail.com> writes:
On Thursday, 12 May 2022 at 15:58:03 UTC, jmh530 wrote:
 If it importC reduces the barrier, then I see it as a positive
 even if there are other tools that work better for you or 
 others.
Well, like I said, importC does have potential to exceed dstep. Maybe it will next year, but it isn't there yet. So we shouldn't undersell the significant success that D and dstep have already had.
May 12 2022
next sibling parent Andrea Fontana <nospam example.org> writes:
On Thursday, 12 May 2022 at 16:38:21 UTC, Adam D Ruppe wrote:
 On Thursday, 12 May 2022 at 15:58:03 UTC, jmh530 wrote:
 If it importC reduces the barrier, then I see it as a positive
 even if there are other tools that work better for you or 
 others.
Well, like I said, importC does have potential to exceed dstep. Maybe it will next year, but it isn't there yet. So we shouldn't undersell the significant success that D and dstep have already had.
The real advantage of importC I've found is that I was able to mix whole C files into my project, not only the headers. And it saves me from compiling and linking an external library (if the license allows it). It's difficult to make it work with GCC or Clang, but it's really easy using tcc.
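Concretely, the tcc route looks something like this (just a sketch, file names invented):

```
tcc -E -o mylib.i mylib.c   # preprocess the whole C file with tcc
dmd app.d mylib.i           # hand the preprocessed C source to dmd (ImportC)
```

Andrea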
May 12 2022
prev sibling parent reply "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Thu, May 12, 2022 at 04:38:21PM +0000, Adam D Ruppe via Digitalmars-d wrote:
 On Thursday, 12 May 2022 at 15:58:03 UTC, jmh530 wrote:
 If it importC reduces the barrier, then I see it as a positive even
 if there are other tools that work better for you or others.
Well, like I said, importC does have potential to exceed dstep. Maybe it will next year, but it isn't there yet. So we shouldn't undersell the significant success that D and dstep have already had.
Yeah, current D (without ImportC) + dstep (or just manual translation, really) is already very nice to work with, in terms of interfacing with C libraries. I've done a fair bit of work using this method, interfacing with libraries like libxcb, xlib, MPFR, freetype, sqlite, EGL, GLES2, etc., just to name a few. What will make ImportC stand out above the current situation is integrated preprocessor support, which apparently Walter already submitted a PR for. Once that's done and we can import C headers without creating a separate input file (either a .di or .d with dstep or manual preprocessing), *then* ImportC would have better value than what we currently have. T -- Fact is stranger than fiction.
May 12 2022
parent reply max haughton <maxhaton gmail.com> writes:
On Thursday, 12 May 2022 at 17:04:27 UTC, H. S. Teoh wrote:
 On Thu, May 12, 2022 at 04:38:21PM +0000, Adam D Ruppe via 
 Digitalmars-d wrote:
 [...]
 Yeah, current D (without ImportC) + dstep (or just manual translation, really) is already very nice to work with, in terms of interfacing with C libraries. I've done a fair bit of work using this method, interfacing with libraries like libxcb, xlib, MPFR, freetype, sqlite, EGL, GLES2, etc., just to name a few. What will make ImportC stand out above the current situation is integrated preprocessor support, which apparently Walter already submitted a PR for. Once that's done and we can import C headers without creating a separate input file (either a .di or .d with dstep or manual preprocessing), *then* ImportC would have better value than what we currently have. T
Preprocessor support takes more than just preprocessing the C source. You need to be able to use the macros or it's useless. That's the main thing dstep is able to do.
May 12 2022
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 5/12/2022 10:25 AM, max haughton wrote:
 Preprocessor support takes more than just preprocessing the C source.
I know.
 You need to be able to use the macros or it's useless. That's the main thing 
 dstep is able to do.
I understand that access to the manifest constant #define's is important. But it's not useless to not have them. We're making progress on it.
May 12 2022
parent reply "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Thu, May 12, 2022 at 10:43:56AM -0700, Walter Bright via Digitalmars-d wrote:
 On 5/12/2022 10:25 AM, max haughton wrote:
 Preprocessor support takes more than just preprocessing the C source.
I know.
 You need to be able to use the macros or it's useless. That's the
 main thing dstep is able to do.
I understand that access to the manifest constant #define's is important. But it's not useless to not have them. We're making progress on it.
It's not just manifest constants. There are also macro functions that sometimes crop up in complex C headers. Those are hard to automatically translate; some cases may need inline functions, some involve token-pasting and may not be translatable without human intervention.
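For example (the macro is a common C idiom; the D rendering is my own guess at the intent):

```d
// C:  #define CLAMP(x, lo, hi)  ((x) < (lo) ? (lo) : ((x) > (hi) ? (hi) : (x)))
// A plausible hand translation -- but only a human knows whether the macro
// was meant to be generic, evaluate arguments lazily, or paste tokens:
T clamp(T)(T x, T lo, T hi)
{
    return x < lo ? lo : (x > hi ? hi : x);
}
```

T -- Tell me and I forget. Teach me and I remember. Involve me and I understand. -- Benjamin Franklin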
May 12 2022
next sibling parent Adam Ruppe <destructionator gmail.com> writes:
On Thursday, 12 May 2022 at 18:10:20 UTC, H. S. Teoh wrote:
 It's not just manifest constants. There's also macro functions 
 that sometimes crop up in complex C headers. Those are hard to 
 automatically translate; some cases may need inline functions, 
 some involve token-pasting and may not be translatable without 
 human intervention.
importC's advantage is that they don't have to be translated. If you can mixin some C code, you can do the macro expansion there and then merge the ASTs, without losing the hygiene of the D code. If this gets implemented - and Max took a few steps toward it recently - we might actually exceed dstep in some cases. (You'd still have to express it in the right way, but still.)
May 12 2022
prev sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 5/12/2022 11:10 AM, H. S. Teoh wrote:
 There's also macro functions that
 sometimes crop up in complex C headers. Those are hard to automatically
 translate; some cases may need inline functions, some involve
 token-pasting and may not be translatable without human intervention.
All true. If a .h file uses preprocessor for metaprogramming, there's no way to make that work in D.
May 12 2022
parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Thursday, 12 May 2022 at 21:36:40 UTC, Walter Bright wrote:
 On 5/12/2022 11:10 AM, H. S. Teoh wrote:
 There's also macro functions that
 sometimes crop up in complex C headers. Those are hard to 
 automatically
 translate; some cases may need inline functions, some involve
 token-pasting and may not be translatable without human 
 intervention.
All true. If a .h file uses preprocessor for metaprogramming, there's no way to make that work in D.
It should be possible most of the time, as macros in most cases expand to counterparts in the C grammar. You also have to support _Generic. It is by and large a matter of aligning D semantics with C, and that is not a far-fetched goal. If such things are not supported, then the utility is so limited that it might be better to drop ImportC. People generally don't want to deal with 90% solutions (only the most hardcore users want that).
May 13 2022
parent reply forkit <forkit gmail.com> writes:
On Friday, 13 May 2022 at 09:58:17 UTC, Ola Fosheim Grøstad wrote:
 ..
 ...
 If such things are not supported then the utility is so limited 
 that it might be better to drop import-C. People generally dont 
 want to deal with 90% solutions (only the most hardcore users 
 want that).
Personally, I'm being turned off from using D, because of importC. And to 'try' to keep this thread back on topic, it's why D is becoming 'unpopular' with me. Please drop import-C. A C hack in D is pointless. Walter, please go and create C->M (C with Modules) instead. Then make it an international standard. Then I will give serious consideration to it.
May 13 2022
next sibling parent reply IGotD- <nise nise.com> writes:
On Friday, 13 May 2022 at 10:15:37 UTC, forkit wrote:
 Personally, I'm being turned off from using D, because of 
 importC.

 And to 'try' to keep this thread back on topic, it's why D is 
 becoming 'unpopular' with me.

 Please drop import-C.

 A C hack in D is pointless.

 Walter, please go and create C->M (C with Modules) instead.

 Then make it an international standard.

 Then I will give serious consideration to it.
Thank you, and you are saying what I thought from the beginning when I read about import C. Import C is an answer to a question we never asked. Since we don't have a preprocessor (and if it ever exists it will be crippled and unusable), we basically have to run a .h file through GCC/Clang in order to run the preprocessor; then what is the point of import C if we have to use an external tool to begin with? Then we can just use a stand-alone tool to convert C .h files to D, which is likely to work much better as well. External tools also open up the possibility of translating C++ and other languages, something that will never happen with import C. Now we are several months into import C and Walter claimed it was easy to implement from the beginning, which is obviously not true at all. If you look at the bug list, it is just riddled with import C bugs and will be. Just remove import C, it is pointless; rely on external translation tools instead.
May 13 2022
next sibling parent reply zjh <fqbqrr 163.com> writes:
On Friday, 13 May 2022 at 10:37:08 UTC, IGotD- wrote:
 we are several months into import C.
It's impossible to eat the whole cake in one go, and it's even worse to give up halfway. We can only keep supporting `importc`. Besides, it's not that bad.
May 13 2022
parent reply IGotD- <nise nise.com> writes:
On Friday, 13 May 2022 at 11:21:18 UTC, zjh wrote:
 On Friday, 13 May 2022 at 10:37:08 UTC, IGotD- wrote:
 we are several months into import C.
 It's impossible to eat the whole cake in one go, and it's even worse to give up halfway. We can only keep supporting `importc`. Besides, it's not that bad.
Import C is obsolete before it is ready (which it will never be). The limitations of import C will force people to conversion tools and/or manual conversion anyway. C .h files are in general full of "clever" preprocessor tricks that will never be supported by import C. Import C feels like one of those corporate blunders where management throws resources into something that will hardly sell, while it was obvious from the very beginning that it was a bad idea.
May 13 2022
next sibling parent Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Friday, 13 May 2022 at 12:45:46 UTC, IGotD- wrote:
 general full of "clever" preprocessor tricks that never will be 
 supported by import C.

 import C feels like one of those corporate blunders where the 
 management throws in resources into something that will hardly 
 sell while it was obvious from the very beginning it was a bad 
 idea.
It isn’t really all that complicated, just tedious. What you have to do is expand the macros and turn the resulting code into a set of C-compatible D AST nodes. Not rocket science, but it is work that requires buckets of patience.
May 13 2022
prev sibling parent Walter Bright <newshound2 digitalmars.com> writes:
On 5/13/2022 5:45 AM, IGotD- wrote:
 Import C is obsolete before it is ready (which it will never be). The
 limitations of import C will force people to conversion tools and/or manual
 conversion. C .h files are in general full of "clever" preprocessor tricks that
 will never be supported by import C.
Preprocessor metaprogramming macros will never be directly available to D. However, they work with ImportC, because the preprocessor runs on the ImportC code to produce C. htod, dstep, and dpp also will simply ignore metaprogramming macros.
 import C feels like one of those corporate blunders where the management throws
 in resources into something that will hardly sell while it was obvious from the
 very beginning it was a bad idea.
My whole career is based on doing things everyone tells me are stupid :-)
May 13 2022
prev sibling next sibling parent reply Adam D Ruppe <destructionator gmail.com> writes:
On Friday, 13 May 2022 at 10:37:08 UTC, IGotD- wrote:
 Now we are several months into import C
Actually, it is over a year into it now.
 Import C is an answer to a question we never asked.
Yeah, the fact that ImportC's release mentions dstep's name but says absolutely nothing else about it, and that my questions about its shortcomings have always gone unanswered, tells me there was no serious investigation into the existing options. And since that wasn't done, it seems unlikely there was an actual market analysis either. I have actually written about these things, but apparently neither of my readers has the ear of D lol.
May 13 2022
next sibling parent reply Paolo Invernizzi <paolo.invernizzi gmail.com> writes:
On Friday, 13 May 2022 at 12:57:04 UTC, Adam D Ruppe wrote:
 On Friday, 13 May 2022 at 10:37:08 UTC, IGotD- wrote:
 Now we are several months into import C
 Actually, it is over a year into it now.
 Import C is an answer to a question we never asked.
 Yeah, the fact that ImportC's release mentions dstep's name but says absolutely nothing else about it, and that my questions about its shortcomings have always gone unanswered, tells me there was no serious investigation into the existing options. And since that wasn't done, it seems unlikely there was an actual market analysis either. I have actually written about these things, but apparently neither of my readers has the ear of D lol.
Market analysis died at the time of the famous first D poll, but anyway I hope the importC effort turns out to be a success in the end. One thing I've not understood is why Walter decided not to integrate warp ... I don't remember a clear motivation for this choice.
May 13 2022
next sibling parent reply Adam D Ruppe <destructionator gmail.com> writes:
On Friday, 13 May 2022 at 13:04:24 UTC, Paolo Invernizzi wrote:
 One thing I've not understood is why Walter decided not to 
 integrate warp ... I don't remember a clear motivation of this 
 choice
Using system headers means compatibility with the system preprocessor, which might have strange proprietary quirks. I don't know how much of a problem that is in practice (the current importC guidelines are to just #define away half the header), but that is the reasoning behind it. But you can still use the system preprocessor; there's an open PR to just shell out to it. Even then, it is still less capable than dstep already is, since nothing is done with the defined constants. This is why I wrote the mixin C proposal last October. Max has started experimenting with it, so there's hope. (It is just a proposal I wrote up in the middle of a call, and it might be horribly useless, but nobody has actually made an argument yet. It seems the only way to get D leadership's attention is to troll comments on Hacker News.)
May 13 2022
next sibling parent reply "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Fri, May 13, 2022 at 01:12:38PM +0000, Adam D Ruppe via Digitalmars-d wrote:
 On Friday, 13 May 2022 at 13:04:24 UTC, Paolo Invernizzi wrote:
 One thing I've not understood is why Walter decided not to integrate
 warp ... I don't remember a clear motivation of this choice
 Using system headers means compatibility with the system preprocessor, which might have strange proprietary quirks. I don't know how much of a problem that is in practice (the current importC guidelines are to just #define away half the header), but that is the reasoning behind it. But you can still use the system preprocessor; there's an open PR to just shell out to it. Even then, it is still less capable than dstep already is, since nothing is done with the defined constants.
[...] Walter's PR has been merged. So the latest dmd should be able to preprocess .h files automatically now.

As for #define'd manifest constants, wouldn't it just be a matter of adding a separate pass over the .h file to extract #define's that look like they can be transformed into enum constants? Shouldn't be *that* hard to do, in theory anyway.
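The intended mapping would be something like this (a sketch of the idea, not of any existing implementation):

```d
// C header:                       what the extra pass would emit:
//   #define BUFSIZE 4096     ->   enum BUFSIZE = 4096;
//   #define APP_NAME "demo"  ->   enum APP_NAME = "demo";
enum BUFSIZE = 4096;
enum APP_NAME = "demo";
static assert(BUFSIZE * 2 == 8192); // usable wherever a constant is expected
```

T -- Questions are the beginning of intelligence, but the fear of God is the beginning of wisdom.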
May 13 2022
next sibling parent reply IGotD- <nise nise.com> writes:
On Friday, 13 May 2022 at 17:27:01 UTC, H. S. Teoh wrote:
 As for #define'd manifest constants, wouldn't it just be a 
 matter of adding a separate pass over the .h file to extract 
 #define's that look like they can be transformed into enum 
 constants?  Shouldn't be *that* hard to do, in theory anyway.
There must be a similar way as with C compilers (the -D command line option) to inject defines. The D compiler must add this so that import C constants can be defined at the command line; otherwise we have that extra custom build step again.
May 13 2022
next sibling parent Paul Backus <snarwin gmail.com> writes:
On Friday, 13 May 2022 at 17:35:35 UTC, IGotD- wrote:
 On Friday, 13 May 2022 at 17:27:01 UTC, H. S. Teoh wrote:
 As for #define'd manifest constants, wouldn't it just be a 
 matter of adding a separate pass over the .h file to extract 
 #define's that look like they can be transformed into enum 
 constants?  Shouldn't be *that* hard to do, in theory anyway.
There must be a similar way as with C compilers (the -D command line option) to inject defines. The D compiler must add this so that import C constants can be defined at the command line otherwise we have that extra custom build step again.
Most likely the D compiler will provide some way to pass arbitrary command-line options to the C preprocessor, just like it does for the linker (-Lwhatever).
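Something in the spirit of the existing linker pass-through, e.g. (the -L line is how dmd works today; the -P line is purely illustrative, not an existing switch):

```
dmd app.d -L-lsqlite3    # -L... is forwarded verbatim to the linker
dmd app.d -P-DNDEBUG     # illustrative only: a -P-style pass-through for cpp
```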
May 13 2022
prev sibling next sibling parent "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Fri, May 13, 2022 at 05:35:35PM +0000, IGotD- via Digitalmars-d wrote:
 On Friday, 13 May 2022 at 17:27:01 UTC, H. S. Teoh wrote:
 
 As for #define'd manifest constants, wouldn't it just be a matter of
 adding a separate pass over the .h file to extract #define's that
 look like they can be transformed into enum constants?  Shouldn't be
 *that* hard to do, in theory anyway.
 
There must be a similar way as with C compilers (the -D command line option) to inject defines. The D compiler must add this so that import C constants can be defined at the command line otherwise we have that extra custom build step again.
Poor man's workaround: ------ main.d ------ import std; import __stdin; void main() { writeln(MY_VALUE); } -------------------- ------ command line ------ echo 'enum MY_VALUE = 123;' | dmd - -run main.d -------------------------- ------ output ------ 123 -------------------- ;-) This isn't limited to defining constants; you can inject arbitrary snippets of D code into a compilation this way. It isn't an *ideal* solution, of course. But it's possible. Though unfortunately, I don't think this will work with ImportC. As in, the compiler won't know to invoke the preprocessor with the appropriate -D... flags, so if the .h file depends on some identifier like, e.g., _GNU_SOURCE, being predefined, then it won't work. T -- Only boring people get bored. -- JM
May 13 2022
prev sibling next sibling parent Adam D Ruppe <destructionator gmail.com> writes:
On Friday, 13 May 2022 at 17:35:35 UTC, IGotD- wrote:
 There must be a similar way as with C compilers (the -D command 
 line option) to inject defines. The D compiler must add this so 
 that import C constants can be defined at the command line 
 otherwise we have that extra custom build step again.
Yeah, it will also need import paths and such. I imagine a command line forwarder is easy enough. I'd also like for it to be available from inside D - passing structured macros in and getting things out. Though that wouldn't be strictly necessary, since you can always CTFE-construct a #define string and mixin-C it (assuming they do even the basics of mixin C, but I've made enough noise that it is a prototype WIP now).
May 13 2022
prev sibling parent Walter Bright <newshound2 digitalmars.com> writes:
On 5/13/2022 10:35 AM, IGotD- wrote:
 There must be a similar way as with C compilers (the -D command line option) to
 inject defines. The D compiler must add this so that import C constants can be
 defined at the command line; otherwise we have that extra custom build step again.
The plan is to add a switch to dmd that passes command line options on to the preprocessor. Just like what dmd does to pass such to the linker.
May 13 2022
prev sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 5/13/2022 10:27 AM, H. S. Teoh wrote:
 As for #define'd manifest constants, wouldn't it just be a matter of
 adding a separate pass over the .h file to extract #define's that look
 like they can be transformed into enum constants?  Shouldn't be *that*
 hard to do, in theory anyway.
cpp and cl both have the ability to emit all the macro definitions to stdout at the end of a run. (I intend to add it to sppn.) The idea is to intercept that output, look for #define patterns we can deal with, and deal with them. It's straightforward, it will just take some coding.
May 13 2022
parent rikki cattermole <rikki cattermole.co.nz> writes:
On 14/05/2022 6:44 AM, Walter Bright wrote:
 cpp and cl all have the ability to emit all the macro definitions to 
 stdout at the end of a run. (I intend to add it to sppn.) The idea is to 
 intercept that output, look for #define patterns we can deal with, and 
 deal with them.
I must admit, that would have been some good information to have a year ago.
May 13 2022
prev sibling parent Walter Bright <newshound2 digitalmars.com> writes:
On 5/13/2022 6:12 AM, Adam D Ruppe wrote:
 But you can still use the system preprocessor. There's an open PR to just
shell 
 out to it.
It was merged yesterday.
 Just this still leaves it less capable than dstep already is due to 
 no work with the defined constants.
I know. But that is not a difficult problem.
 This is why I wrote this mixin c proposal in last October.
Feel free to post it here.
May 13 2022
prev sibling parent Walter Bright <newshound2 digitalmars.com> writes:
On 5/13/2022 6:04 AM, Paolo Invernizzi wrote:
 One thing I've not understood is why Walter decided not to integrate warp ... I
 don't remember a clear motivation for this choice
The reason is all the predefined macros that appear in each of cpp, cl, and sppn. Around 500 for cpp, largely undocumented, and sensitive to which version of cpp one is running and what command line switches are present. Keeping that list of predefined macros current with every variation of cpp, cl and sppn would be a constant maintenance nightmare.
May 13 2022
prev sibling parent reply Daniel N <no public.email> writes:
On Friday, 13 May 2022 at 12:57:04 UTC, Adam D Ruppe wrote:
 On Friday, 13 May 2022 at 10:37:08 UTC, IGotD- wrote:
 Now we are several months into import C
 Actually, it is over a year into it now.
 Import C is an answer to a question we never asked.
For me ImportC is *the* killer feature. Currently at work we simply change one file from *.c to *.cc, fix a few compile errors, and we're done; this is why C++ became what it is. If D had had this from day one, I think it would have been a huge success. I can only hope dlang added this feature in time to still succeed; many other languages are starting to catch up.

I have zero interest in adding custom build steps to exotic build systems to generate bindings, nor is it possible to generate static bindings, because all the other teams update their *.h files daily and we need to be able to use the new features without manual steps.

Now, if you argue that custom build steps are no big thing, then why is CTFE so awesome? No custom build steps, that's why!
May 13 2022
next sibling parent reply IGotD- <nise nise.com> writes:
On Friday, 13 May 2022 at 13:30:56 UTC, Daniel N wrote:
 I have zero interest in adding custom build steps in exotic 
 build systems to generate bindings, nor is it possible to 
 generate static bindings because all other teams update their 
 *.h files daily and we need to be able to use the new features 
 without manual steps.
... but import C requires a custom build step today to remove all preprocessor text from the .h file. You must have zero interest in import C in that case.
May 13 2022
next sibling parent "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Fri, May 13, 2022 at 04:18:41PM +0000, IGotD- via Digitalmars-d wrote:
 On Friday, 13 May 2022 at 13:30:56 UTC, Daniel N wrote:
 
 I have zero interest in adding custom build steps in exotic build
 systems to generate bindings, nor is it possible to generate static
 bindings because all other teams update their *.h files daily and we
 need to be able to use the new features without manual steps.
 
... but import C require a custom build step today to remove all preprocessor text from the .h file. You must have zero interest in import C in that case.
This is why I said, ImportC must be able to preprocess the imported .h file, otherwise it does not add much value to what we already have (not enough to justify its existence anyway). T -- Being forced to write comments actually improves code, because it is easier to fix a crock than to explain it. -- G. Steele
May 13 2022
prev sibling parent reply Daniel N <no public.email> writes:
On Friday, 13 May 2022 at 16:18:41 UTC, IGotD- wrote:
 On Friday, 13 May 2022 at 13:30:56 UTC, Daniel N wrote:
 I have zero interest in adding custom build steps in exotic 
 build systems to generate bindings, nor is it possible to 
 generate static bindings because all other teams update their 
 *.h files daily and we need to be able to use the new features 
 without manual steps.
 ... but import C requires a custom build step today to remove all preprocessor text from the .h file. You must have zero interest in import C in that case.
It's a work in progress; it will work in the future.
May 13 2022
parent Walter Bright <newshound2 digitalmars.com> writes:
On 5/13/2022 10:33 AM, Daniel N wrote:
 ... but import C requires a custom build step today to remove all preprocessor
 text from the .h file. You must have zero interest in import C in that case.
 It's a work in progress; it will work in the future.
It works today (on master).
May 13 2022
prev sibling parent Walter Bright <newshound2 digitalmars.com> writes:
On 5/13/2022 6:30 AM, Daniel N wrote:
 For me ImportC is *the* killer feature. Currently at work we simply change one
 file from *.c to *.cc, fix a few compile errors, and we're done; this is why
 C++ became what it is. If D had had this from day one, I think it would have
 been a huge success. I can only hope dlang added this feature in time to still
 succeed; many other languages are starting to catch up.
 
 I have zero interest in adding custom build steps to exotic build systems to
 generate bindings, nor is it possible to generate static bindings, because all
 the other teams update their *.h files daily and we need to be able to use the
 new features without manual steps.
 
 Now, if you argue that custom build steps are no big thing, then why is CTFE so
 awesome? No custom build steps, that's why!
That's exactly the point.
May 13 2022
prev sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 5/13/2022 3:37 AM, IGotD- wrote:
 Thank you, and you are saying what I thought from the beginning when I read
 about import C. Import C is an answer to a question we never asked.
htod, dstep and dpp suggest otherwise.
 Since we don't have a preprocessor
We do: https://github.com/DigitalMars/dmpp
 and if it ever exists it will be crippled and unusable.
??
 Basically we have to run a .h file through GCC/Clang in order to run the 
 preprocessor,
It gets run through cpp at the moment, not the C compiler.
 then what is the point of import C if we have to use an external 
 tool to begin with.
D has always required the "associated C compiler" if only because that's where the C standard library it links with, and the external linker, comes from.
 Then we can just use a stand alone tool to convert C .h 
 files to D which is likely to work much better as well.
We already have that tool: htod, dstep and dpp. Ironically, dpp relies on clang.
 External tools also open up the possibility of translating C++ and other
 languages, something that will never happen with import C.
C is how disparate languages communicate with each other. It's the lingua franca of programming languages.
 Now we are several months into import C and Walter claimed it was easy to
 implement from the beginning, which is obviously not true at all. If you look at
 the bug list, it is just riddled with import C bugs and will be.
Not really. Every compiler has bugs in it.
 Just remove import C, it is pointless; rely on external translation tools
 instead.
They're good tools, but don't quite get us there.
May 13 2022
parent reply Adam D Ruppe <destructionator gmail.com> writes:
On Friday, 13 May 2022 at 18:17:06 UTC, Walter Bright wrote:
 htod, dstep and dpp suggest otherwise.
What shortcomings did you see in those, and how does ImportC solve them?
 They're good tools, but don't quite get us there.
In what way? How does ImportC get where they don't?
May 13 2022
next sibling parent "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Fri, May 13, 2022 at 06:22:08PM +0000, Adam D Ruppe via Digitalmars-d wrote:
 On Friday, 13 May 2022 at 18:17:06 UTC, Walter Bright wrote:
 htod, dstep and dpp suggest otherwise.
What shortcomings did you see in those, and how does ImportC solve them?
I think somebody has mentioned it already: all of these require an extra build step. Personally, I don't think that's a big deal, but then again I use a real build system as opposed to crippled walled gardens (*ahem* *cough* not naming names here), so hey.

In any case, if you could just `import "freetype/freetype.h";` and have it Just Work(tm) without further ado, that would make C integration even more accessible than ever, especially to people with a phobia of using/modifying a real build system.
 They're good tools, but don't quite get us there.
In what way? How does ImportC get where they don't?
Assuming that ImportC overcomes its current limitations, the difference is:

htod/dstep/dpp:
- Edit dub.json / makefile / whatever.
- Create a rule to run htod/dstep/dpp on library.h to create a library.di or library.d.
- Don't forget to add `-L-llibrary` to the linker flags.
- `import library;`

ImportC (assuming limitations are fixed):
- `import library;`

It's a matter of removing the speedbumps on the road so that there's less friction to interfacing with C.

T

-- 
Your inconsistency is the only consistent thing about you! -- KD
May 13 2022
prev sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 5/13/2022 11:22 AM, Adam D Ruppe wrote:
 On Friday, 13 May 2022 at 18:17:06 UTC, Walter Bright wrote:
 htod, dstep and dpp suggest otherwise.
What shortcomings did you see in those, and how does ImportC solve them?
The fact that there are 3 of them suggests shortcomings.

ImportC can:

1. handle C constructs that don't have D equivalents, such as bitfields and _Generic (yes, I did recently add the bitfields to D)

2. inline C functions

3. do CTFE on C functions

4. D and C source files can be handed to the dmd command line

5. no special syntax for importing C files

6. can be used as a straightforward C compiler

7. import D code (!)

8. minimizes friction in the build system

9. do inline assembler (well, D style inline assembler, not gcc/cl/dmc style)

10. generate linkable libraries from C code

11. make it easy to incrementally convert C code to D

12. make mixing and matching D and C code almost trivial
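A quick sketch illustrating points 3, 4 and 5; the file and function are invented for illustration:

```d
// Given an ordinary C file:
//
//     // square.c
//     int square(int x) { return x * x; }
//
// build with: dmd app.d square.c   (point 4)
import square;                   // point 5: no special import syntax

static assert(square(4) == 16);  // point 3: CTFE on a C function

void main() {}
```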
May 13 2022
next sibling parent reply IGotD- <nise nise.com> writes:
On Friday, 13 May 2022 at 19:18:35 UTC, Walter Bright wrote:
 The fact that there are 3 of them suggests shortcomings.

 ImportC can:

 1. handle C constructs that don't have D equivalents, such as 
 bitfields and _Generic (yes, I did recently add the bitfields 
 to D).

 2. inline C functions

 3. do CTFE on C functions

 4. D and C source files can be handed to the dmd command line

 5. no special syntax for importing C files

 6. can be used as a straightforward C compiler

 7. import D code (!)

 8. minimizes friction in the build system

 9. do inline assembler (well, D style inline assembler, not 
 gcc/cl/dmc style)

 10. generate linkable libraries from C code

 11. make it easy to incrementally convert C code to D

 12. make mixing and matching D and C code almost trivial
Nice, but I will give you a challenge. You are going to write a Linux driver in D (or betterC), and you are only allowed to import the necessary C header files like <linux/kernel.h>, <linux/init.h>, etc. through ImportC.
May 13 2022
parent Walter Bright <newshound2 digitalmars.com> writes:
On 5/13/2022 12:43 PM, IGotD- wrote:
 Nice, but I will give you a challenge. You are going to write a Linux driver 
 in D (or betterC), and you are only allowed to import the necessary C header 
 files like <linux/kernel.h>, <linux/init.h>, etc. through ImportC.
I'll let you do that, and please post any issues you find to bugzilla.
May 13 2022
prev sibling next sibling parent reply forkit <forkit gmail.com> writes:
On Friday, 13 May 2022 at 19:18:35 UTC, Walter Bright wrote:
 ...

 11. make it easy to incrementally convert C code to D

 12. make mixing and matching D and C code almost trivial
on 11: ImportC will actually dissuade people from moving away from C.

on 12: This too will dissuade people from moving away from C.

In essence, you're forever locking C into D. C++ made this mistake too.
May 13 2022
next sibling parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Friday, 13 May 2022 at 22:49:09 UTC, forkit wrote:
 In essence, you're forever locking C into D. C++ made this 
 mistake too.
Yes, but that decision has been there all the time by keeping weird things from C such as messy operator precedence. I am more concerned about compiler internals and refactoring. (D's biggest mistake was to not align semantics with C++ and incorporate clang for C++ interop, but that is too late now.)
May 13 2022
next sibling parent reply forkit <forkit gmail.com> writes:
On Friday, 13 May 2022 at 23:24:14 UTC, Ola Fosheim Grøstad wrote:
 On Friday, 13 May 2022 at 22:49:09 UTC, forkit wrote:
 In essence, you're forever locking C into D. C++ made this 
 mistake too.
Yes, but that decision has been there all the time by keeping weird things from C such as messy operator precedence. I am more concerned about compiler internals and refactoring. (D's biggest mistake was to not align semantics with C++ and incorporate clang for C++ interop, but that is too late now.)
I think 'why is D unpopular' can be answered by D's biggest mistake. Which was to not move away from C (i.e. differentiate itself from C). Instead, it tries to offer you the future while being tied to the past.
May 13 2022
parent Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Friday, 13 May 2022 at 23:33:42 UTC, forkit wrote:
 I think 'why is D unpopular' can be answered by D's biggest 
 mistake.

 Which was, to not move away from C (i.e. differentiate itself 
 from C).

 Instead, it tries to offer you the future, while being tied to 
 the past.
I guess so. C++ is the only realistic continuation of the past and will remain so for another 30 years. In order to take that lead you have to embrace C++ in a way that is future-compatible... The future of low-level programming is a language that draws on Rust and TypeScript, sprinkled with some ideas from C++ and research on verification. It does not exist yet and will need decades to become mature.
May 13 2022
prev sibling parent reply Tejas <notrealemail gmail.com> writes:
On Friday, 13 May 2022 at 23:24:14 UTC, Ola Fosheim Grøstad wrote:
 On Friday, 13 May 2022 at 22:49:09 UTC, forkit wrote:
 In essence, you're forever locking C into D. C++ made this 
 mistake too.
Yes, but that decision has been there all the time by keeping weird things from C such as messy operator precedence. I am more concerned about compiler internals and refactoring. (D's biggest mistake was to not align semantics with C++ and incorporate clang for C++ interop, but that is too late now.)
+1000!!!!
May 13 2022
parent reply Adam D Ruppe <destructionator gmail.com> writes:
On Saturday, 14 May 2022 at 00:11:25 UTC, Tejas wrote:
 On Friday, 13 May 2022 at 23:24:14 UTC, Ola Fosheim Grøstad 
 wrote:
 (D's biggest mistake was to not align semantics with C++ and 
 incorporate clang for C++ interop, but that is too late now.)
+1000!!!!
Well, clearly anyone who did that would have conquered the world. https://wiki.dlang.org/Calypso
May 13 2022
parent reply Tejas <notrealemail gmail.com> writes:
On Saturday, 14 May 2022 at 00:18:15 UTC, Adam D Ruppe wrote:
 On Saturday, 14 May 2022 at 00:11:25 UTC, Tejas wrote:
 On Friday, 13 May 2022 at 23:24:14 UTC, Ola Fosheim Grøstad 
 wrote:
 (D's biggest mistake was to not align semantics with C++ and 
 incorporate clang for C++ interop, but that is too late now.)
+1000!!!!
Well, clearly anyone who did that would have conquered the world. https://wiki.dlang.org/Calypso
Well, do we know why Calypso's development stopped? Maybe it had nothing to do with the merit of the idea but simply the circumstances of the developers? Many open source projects stall because the devs simply don't have time to maintain it anymore, perhaps the same happened here?
May 13 2022
next sibling parent forkit <forkit gmail.com> writes:
On Saturday, 14 May 2022 at 00:24:49 UTC, Tejas wrote:
 Well, do we know why Calypso's development stopped?

 Maybe it had nothing to do with the merit of the idea but 
 simply the circumstances of the developers? Many open source 
 projects stall because the devs simply don't have time to 
 maintain it anymore, perhaps the same happened here?
Well, anything that seeks to 'widen and facilitate' C/C++ ... has a limited lifetime ;-)
May 13 2022
prev sibling parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Saturday, 14 May 2022 at 00:24:49 UTC, Tejas wrote:
 Well, do we know why Calypso's development stopped?

 Maybe it had nothing to do with the merit of the idea but 
 simply the circumstances of the developers? Many open source 
 projects stall because the devs simply don't have time to 
 maintain it anymore, perhaps the same happened here?
Bolting clang on is the wrong approach; one has to integrate with it to get the full feature set.
May 13 2022
parent reply Tejas <notrealemail gmail.com> writes:
On Saturday, 14 May 2022 at 05:04:40 UTC, Ola Fosheim Grøstad 
wrote:
 On Saturday, 14 May 2022 at 00:24:49 UTC, Tejas wrote:
 Well, do we know why Calypso's development stopped?

 Maybe it had nothing to do with the merit of the idea but 
 simply the circumstances of the developers? Many open source 
 projects stall because the devs simply don't have time to 
 maintain it anymore, perhaps the same happened here?
Bolting clang on is the wrong approach; one has to integrate with it to get the full feature set.
What do you mean? We should try to get D _inside_ Clang? Emit C++ instead of machine code, and feed _that_ to Clang?
May 14 2022
parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Saturday, 14 May 2022 at 10:44:39 UTC, Tejas wrote:
 We should try to get D _inside_ Clang?
I think D should forget about C++ and do C interop well. Too late to go for C++ now, better to do one thing well once the decision has been made to focus on C.
May 14 2022
next sibling parent reply zjh <fqbqrr 163.com> writes:
On Saturday, 14 May 2022 at 13:40:11 UTC, Ola Fosheim Grøstad 
wrote:

 I think D should forget about C++ and do C interop well. Too 
 late to go for C++ now.
You're wrong. Interfacing with C++ is very `beneficial`! Of course, for now there is just `ImportC`.

`Interfacing` with C is very beneficial. This is also the reason `ImportC` exists, and `C++`'s success is largely because of it! Interfacing with `C++` would be just as beneficial. The `C++` ecosystem is too large to ignore; you will meet `C++` everywhere. Only by interfacing with `C++` can the ecosystem of `d` grow up calmly. As the saying goes, lean against a big tree to enjoy the shade! If `rust` grows up, we will need to link with it too. Take the corners first, then move to the center. Make theirs ours!
May 14 2022
next sibling parent zjh <fqbqrr 163.com> writes:
On Saturday, 14 May 2022 at 14:37:49 UTC, zjh wrote:

 Make theirs ours!
Take another example: `go` always advertises `simplicity`, but why did it have to add `generics`? Because it must learn from `successful languages`' experience!
May 14 2022
prev sibling parent Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Saturday, 14 May 2022 at 14:37:49 UTC, zjh wrote:
 On Saturday, 14 May 2022 at 13:40:11 UTC, Ola Fosheim Grøstad 
 wrote:

 I think D should forget about C++ and do C interop well. Too 
 late to go for C++ now.
You're wrong. Interfacing with C++ is very `beneficial`! Of course, for now there is just `ImportC`.
Wrong or not, after ImportC is done there will be no way back; there is a limit to how far you can stretch an implementation.
May 14 2022
prev sibling parent reply forkit <forkit gmail.com> writes:
On Saturday, 14 May 2022 at 13:40:11 UTC, Ola Fosheim Grøstad 
wrote:
 On Saturday, 14 May 2022 at 10:44:39 UTC, Tejas wrote:
 We should try to get D _inside_ Clang?
I think D should forget about C++ and do C interop well. Too late to go for C++ now, better to do one thing well once the decision has been made to focus on C.
ImportC goes well beyond the need for C interop.

By the way, if one stops for a moment and considers the two most popular programming languages (C and Java), how many software projects exist that use a combination of these two languages?

There is no actual market for ImportC in the 'real' world. It's really just a 'see what we can do' thing, which will leave considerable technical debt for the D 'community'.
May 14 2022
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 5/14/2022 6:44 PM, forkit wrote:
 There is no actual market for ImportC, in the 'real' world.
People are already using it.
May 14 2022
next sibling parent reply electricface <electricface qq.com> writes:
On Sunday, 15 May 2022 at 01:51:23 UTC, Walter Bright wrote:
 On 5/14/2022 6:44 PM, forkit wrote:
 There is no actual market for ImportC, in the 'real' world.
People are already using it.
Times have changed, and hardware performance is constantly improving. Programmers need to be able to easily get the most valuable feedback on their code from the language server:

- errors from the compiler are displayed quickly
- show exactly the type of a variable
- query exactly where a function is called
- query exactly where a function is defined
- support refactoring too; you can't avoid adjusting your code

Advanced features of the language should also be supported:

- CTFE
- generics
- templates
- mixin
- UFCS
- and many more

The D language adds too many features, which makes it difficult to implement a language server.
May 14 2022
parent zjh <fqbqrr 163.com> writes:
On Sunday, 15 May 2022 at 02:03:53 UTC, electricface wrote:

 Advanced features of the language should also be supported:
 CTFE
 Generics
 template
 mixin
 UFCS
 and many more

 The D language adds too many features, which makes it 
 difficult to implement a language server.
This is very `important` and should be taken seriously by the `D leadership`.
May 14 2022
prev sibling parent reply forkit <forkit gmail.com> writes:
On Sunday, 15 May 2022 at 01:51:23 UTC, Walter Bright wrote:
 On 5/14/2022 6:44 PM, forkit wrote:
 There is no actual market for ImportC, in the 'real' world.
People are already using it.
Sure. This might make sense if the team involved in the project is well versed in both languages (and whatever other languages may also be used in the project), and if they can confidently obtain the tools and people they need to make the project succeed and maintain it over time.

But some of the biggest software security concerns (ones that still exist to this day) are a result of C. Modern programming languages should be encouraging a move away from C (and providing the means to do so), not encouraging a move towards C.

My real concern though (and in relation to the topic of this thread) is that ImportC will actually make D even less popular. Who has the time to be (continually) well-versed in both D and C?
May 14 2022
next sibling parent reply Mike Parker <aldacron gmail.com> writes:
On Sunday, 15 May 2022 at 02:30:52 UTC, forkit wrote:

 Who has the time to be (continually) well-versed in both D and 
 C?
That's kind of the point. With ImportC, you don't *have* to be well-versed in C if you need to use a C library.
May 14 2022
parent reply forkit <forkit gmail.com> writes:
On Sunday, 15 May 2022 at 02:38:12 UTC, Mike Parker wrote:
 On Sunday, 15 May 2022 at 02:30:52 UTC, forkit wrote:

 Who has the time to be (continually) well-versed in both D and 
 C?
That's kind of the point. With ImportC, you don't *have* to be well-versed in C if you need to use a C library.
Integrating a library whose code you're not well versed in seems like a recipe for disaster.

I'm now thinking of the two fighter planes (C and D) flying together in one of Walter's videos. Are the planes (let alone the pilots) going to make it back to base, or not?
May 14 2022
next sibling parent reply max haughton <maxhaton gmail.com> writes:
On Sunday, 15 May 2022 at 03:02:24 UTC, forkit wrote:
 On Sunday, 15 May 2022 at 02:38:12 UTC, Mike Parker wrote:
 On Sunday, 15 May 2022 at 02:30:52 UTC, forkit wrote:

 Who has the time to be (continually) well-versed in both D 
 and C?
That's kind of the point. With ImportC, you don't *have* to be well-versed in C if you need to use a C library.
Integrating a library whose code you're not well versed in seems like a recipe for disaster. I'm now thinking of the two fighter planes (C and D) flying together in one of Walter's videos. Are the planes (let alone the pilots) going to make it back to base, or not?
It may be a recipe for disaster sometimes but most of the time it's how programmers actually make software... If you want to make a game you don't need to know how to implement a graphics driver.
May 14 2022
parent forkit <forkit gmail.com> writes:
On Sunday, 15 May 2022 at 03:07:31 UTC, max haughton wrote:
 ...
 If you want to make a game you don't need to know how to 
 implement a graphics driver.
Sure. But someone will likely take advantage of that C code used in that graphics driver on your device... and they'll take control of your device (because nobody on the dev team understood what that C code was actually doing, since they were all D programmers).
May 14 2022
prev sibling next sibling parent forkit <forkit gmail.com> writes:
On Sunday, 15 May 2022 at 03:02:24 UTC, forkit wrote:
 
Also, the time for emulating the C++ experience of interfacing to C has long gone.

The advantage C++ had (soooooo..long...ago....) in doing this was that C programmers would be more likely to move towards using C++. That is what made C++ popular, after all. But that was a long......time.........ago....

C programmers have had enough time to move to a newer language. If they still program in C, I don't see them moving to D anytime soon. Nor do I see D programmers moving (back) to C. So I fail to see the benefit in having D and C flying around together...
May 14 2022
prev sibling next sibling parent reply Mike Parker <aldacron gmail.com> writes:
On Sunday, 15 May 2022 at 03:02:24 UTC, forkit wrote:

 Integrating a library whose code you're not well versed in 
 seems like a recipe for disaster.
Then how do you ever use any libraries?
May 14 2022
parent reply forkit <forkit gmail.com> writes:
On Sunday, 15 May 2022 at 03:52:28 UTC, Mike Parker wrote:
 On Sunday, 15 May 2022 at 03:02:24 UTC, forkit wrote:

 Integrating a library whose code you're not well versed in, 
 seems like a recipe for disaster.
Then how do you ever use any libraries?
Getting it wrong in software has always had consequences, sometimes really bad consequences. But since software now operates in entirely new spheres that affect all aspects of our life and economy, developers have a wider obligation than just stitching together software so that it works.

Structured higher-level languages are where we need to be moving towards, not moving backwards to low-level languages like C.

Also, operating systems of the (near) future will require safety guarantees from the software intended to run on them. C is not a language that facilitates this.

I understand the appeal of making it easier to use C libraries in an otherwise D solution. But that does not progress or advance the nature of software development, nor the responsibility programmers have to their clients.

I was (initially) attracted to D because of how it advanced the problem of writing safe code (particularly @safe). This is what would make D 'popular'.

But ImportC is short-sighted in my opinion, and is a step in the opposite direction. The focus should instead be on @safe, not C.
May 14 2022
next sibling parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Sunday, 15 May 2022 at 06:18:58 UTC, forkit wrote:
 Getting it wrong in software has always had consequences, 
 sometimes really bad consequences. But since software now 
 operates in entirely new spheres that affect all aspects of our 
 life and economy, developers have a wider obligation than just 
 stitching together software so that it works.
Most software is not critical, but in this case you are worse off hand-translating a header file, which is more error-prone. The question is whether there is enough patience to design and implement macro expansion in a way that works well. That I cannot tell.
 Structured higher-level languages are where we need to be moving 
 towards, not moving backwards to low-level languages like C.
Right, but D will always have raw pointers, so that cannot be D.
 Also, operating systems of the (near) future will require 
 safety guarantees from the software that is intended to run on 
 that operating system. C is not a language that facilitates 
 this.
Not sure what you mean; safety guarantees are enforced by the hardware and the OS. Beyond that, safety cannot be had, as you can always implement an interpreter that emulates C. Also, look at Android, which had Java, then put in an effort to allow C. Same with web browsers.
 I was (initially) attracted to D because of how it advanced the 
 problem of writing safe code (particularly @safe). This is what 
 would make D 'popular'.

 But ImportC is short-sighted in my opinion, and is a step in 
 the opposite direction.

 The focus should instead be on @safe, not C.
You cannot have a reasonably safe language without designing the language for that from the ground up. Most languages are safe, so there are many to choose from; there is no shortage of safe languages. D will never be able to offer more than «safer than» C.
May 15 2022
parent reply forkit <forkit gmail.com> writes:
On Sunday, 15 May 2022 at 07:06:46 UTC, Ola Fosheim Grøstad wrote:
 ...
 ....
 Most software is not critical, but in this case you are worse 
 off hand-translating a header file, which is more error-prone.
 ...
Umm... part of that statement is not correct. Most software (in fact) runs things that are critical to our everyday lives (both personally and economically).

As such, the focus really should be on creating safer programming languages, and enabling developers to write safer software that is less prone to bugs and attacks.

D's new focus on encouraging even more C into D projects is not something that should be getting underway in the year 2022. I perfectly understand that those who want such a thing will have a different opinion ;-)
May 15 2022
parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Sunday, 15 May 2022 at 07:43:02 UTC, forkit wrote:
 Most software (in fact) runs things that are critical to our 
 everyday lives (both personally and
Well, let me put it this way: if the controller of my electric chainsaw fails, it won't hurt me, I'll just use my backup solution. If the brakes on a car fail, my life is in jeopardy. If you can recover, then it isn't critical. Business software fails all the time; the question is how fast you can recover. D and C++ will never be great in terms of being able to quickly identify the point of failure.
 As such, the focus really should be on creating safer 
 programming languages, and enabling developers to write safer 
 software that is less prone to bugs and attacks.
Yes, but you make tradeoffs. ImportC makes tradeoffs. Now that the decision has been made, it should be brought to completion. The risk is not in ImportC, but in D programmers using it in an incomplete state, leaving it in a state that people depend on, so that it cannot be removed if completing it turns out to be too tedious.
May 15 2022
parent reply zjh <fqbqrr 163.com> writes:
On Sunday, 15 May 2022 at 08:12:46 UTC, Ola Fosheim Grøstad wrote:

 The risk is not in ImportC, but in D programmers using it in an 
 incomplete state.
95% is a big victory.
May 15 2022
parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Sunday, 15 May 2022 at 10:43:01 UTC, zjh wrote:
 On Sunday, 15 May 2022 at 08:12:46 UTC, Ola Fosheim Grøstad 
 wrote:

 The risk is not in ImportC, but in D programmers using it in 
 an incomplete state.
Sure, but for whom? Even 95% means that it will fail in 1 out of 20 attempts. That isn't a good selling point. What I meant is that the risk is that it will be used and then considered «sufficient» because people use it, and then the feature will not be brought to completion when difficulties crop up. It is better if people do not use a feature until it is complete.
May 15 2022
parent reply zjh <fqbqrr 163.com> writes:
On Sunday, 15 May 2022 at 10:59:41 UTC, Ola Fosheim Grøstad wrote:
 On Sunday, 15 May 2022 at 10:43:01 UTC, zjh wrote:
Take some time to modify the `C source code`, and you can `link` it with D. It's well `worth` the time. Then I can use `D` all the way.
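A sketch of that manual route; compute.c and c_compute are invented names:

```d
// app.d -- declare the C function yourself, then link the C object in:
//
//     gcc -c compute.c
//     dmd app.d compute.o
//
extern(C) int c_compute(int x);  // must match the signature in compute.c

void main()
{
    import core.stdc.stdio : printf;
    printf("%d\n", c_compute(21));
}
```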
May 15 2022
parent reply zjh <fqbqrr 163.com> writes:
On Sunday, 15 May 2022 at 12:33:23 UTC, zjh wrote:

95%.
It can well be achieved that where originally 20 modifications were needed, now only 1 or 2 are needed. This can be achieved through various `macro matching` auto-conversions, like Atila's `dpp` does.
May 15 2022
parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Sunday, 15 May 2022 at 12:41:20 UTC, zjh wrote:
 On Sunday, 15 May 2022 at 12:33:23 UTC, zjh wrote:

95%.
It can well be achieved that where originally 20 modifications were needed, now only 1 or 2 are needed. This can be achieved through various `macro matching` auto-conversions, like Atila's `dpp` does.
I think macro expansion should be done in the lexer; differences between C and D should be resolved in the parser and in the static analysis after parsing. All content of the macro can be assumed to be proper C, but things that are passed in as parameters can refer to D code, and that is where you need to be careful. Perhaps new language constructs have to be introduced if it turns out that the analysis is too difficult, or the semantics of the D language even have to change (difficult to say in advance).

I don't think one should encourage use of this feature until it is done, as then you have no way to change the approach and you risk being stuck with something that is inadequate. I think this is a clear-cut case of: be prepared to do it properly or don't do it at all.
May 15 2022
parent reply zjh <fqbqrr 163.com> writes:
On Sunday, 15 May 2022 at 13:00:26 UTC, Ola Fosheim Grøstad wrote:
 On Sunday, 15 May 2022 at 12:41:20 UTC, zjh wrote:
 I think macro expansion should be done in the lexer, 
 differences between C and D should be resolved in the parser 
 and in the static analysis after parsing.
`Macro` matching can be done just with `mixin`. And `most macros` have patterns; they are very similar to `string mixins`. Of course, I don't know if they can handle monster macros like `boost`'s? Maybe it deserves a try; after all, they are also macros. If you can't be sure, just report it and modify it manually. If you can do it, just do it accurately. This is entirely feasible. Just treat the `macro` as another `small language`: similar to `ImportC`, take another, different `branch`.
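A sketch of the easy end of that spectrum, assuming macros with no token pasting or conditional tricks; this is roughly the kind of output such an auto-conversion could emit:

```d
// C: #define BUFSIZE 4096                        (object-like macro)
enum BUFSIZE = 4096;

// C: #define MAX(a, b) ((a) > (b) ? (a) : (b))   (function-like macro)
auto MAX(A, B)(A a, B b) { return a > b ? a : b; }

unittest
{
    static assert(BUFSIZE == 4096);
    assert(MAX(3, 5) == 5);
}
// Monster macros (token pasting, statement-shaped macros) are where
// the hard cases live.
```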
May 15 2022
parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Sunday, 15 May 2022 at 13:12:29 UTC, zjh wrote:
 `Macro` matching can be done just with `mixin`.
Not really, because they contain fragments of C code. So if you inject them straight into D code it may look like gibberish to the D compiler. You also cannot parse a C fragment without knowing the context.

So what you need to do is to use a parser that allows mixing D and C code, under the assumption that the C tokens are tagged as C and the D tokens are tagged as D. This is probably possible (maybe with some adjustments) since D already is very close to C in its structure.

E.g. assume a C macro called FUNCTION:

```$FUNCTION(void,functionname){…}```

After lexing, the lexer annotates the tokens in the expanded FUNCTION macro as coming from a C macro, and the compiler tries a different branch in the parser that is oriented towards C and not D. From this it builds a D-compatible function signature before adding the body of the function (which is regular D code).
May 15 2022
parent zjh <fqbqrr 163.com> writes:
On Sunday, 15 May 2022 at 13:41:31 UTC, Ola Fosheim Grøstad wrote:

 Not really, because they contain fragments of C-code. So if you 
 inject them straight into D-code it may look like gibberish to 
 the D compiler.
 So what you need to do is to use a parser that allows mixing D 
 and C code under the assumption that the C-tokens are tagged as 
 C and the D-tokens are tagged as D.

 This is probably possible (maybe requires some adjustments) 
 since D already is very close to C in its structure.
You are right, this should be feasible.
May 15 2022
prev sibling next sibling parent reply Mike Parker <aldacron gmail.com> writes:
On Sunday, 15 May 2022 at 06:18:58 UTC, forkit wrote:

 But ImportC is short-sighted in my opinion, and is a step in 
 the opposite direction.
Like it or not, C libraries are a critical part of the D ecosystem. They fill in the gaps. C interop is also important for integrating D into existing code bases. ImportC improves D's interop by making it part of the out-of-the-box experience. It's something we should have had long ago.
 The focus should instead be on @safe, not C.
Memory safety is very much in focus. While Walter is working on ImportC, Atila has been focused on setting the stage to enable DIP 1000 by default. Dennis Korpel has overseen the squashing of several DIP 1000 bugs. There will be more work on shoring up memory safety features going forward.
May 15 2022
parent forkit <forkit gmail.com> writes:
On Sunday, 15 May 2022 at 07:13:34 UTC, Mike Parker wrote:
 ..
 ...
 Memory safety is very much in focus. While Walter is working on 
 ImportC, Atila has been focused on setting the stage to enable 
 DIP 1000 by default. Dennis Korpel has overseen the squashing 
 of several DIP 1000 bugs. There will be more work on shoring up 
 memory safety features going forward.
That's nice to hear. I'd much rather people coming over to D were doing so because they want to write safer software, rather than wanting to stitch together new solutions using their old C code (which they've been able to do with C++ for decades now - and look where that got us).
May 15 2022
prev sibling next sibling parent reply "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Sun, May 15, 2022 at 06:18:58AM +0000, forkit via Digitalmars-d wrote:
[...]
 I understand the appeal of making it easier to use C libraries in an
 otherwise D solution. But that does not progress or advance the nature
 of software development, nor the responsibility programmers have to
 their clients.
 
 I was (initially) attracted to D because of how it advanced the
 problem of writing safe code (particularly @safe). This is what would
 make D 'popular'.
 
 But ImportC is short-sighted in my opinion, and is a step in the
 opposite direction.
 
 The focus should instead be on @safe, not C.
Making it easier to use C libraries actually *increases* the motivation to write safe code. Because, if I were to start a new project and I needed to use a C library, then if it were too hard to interface D with that C library, it'd incentivize me to write the project in C instead. Net result: I write more unsafe code -- because it'd be in C. OTOH, if it were easy to integrate that C library, I would be motivated to write D code instead. Net result: I write more safe code. Sure, it's not 100%, but that's still better than 0%.

There is, of course, the option of rewriting said C library in D. But that's a low-incentive option, because it requires a lot of time/effort, and is a net loss from a business POV (why reinvent functionality that already exists / increase time to market, which reduces competitiveness). Depending on which library you're talking about, this cost could be very prohibitive.

Concrete example: text layout with something like Cairo. There is NO WAY I'm gonna be able to reinvent the whole thing myself; that'd take me several lifetimes to pull off. (And believe me, I've tried. Text layout is an extremely convoluted and messy process, with tons of unexpected complexities in this day and age of internationalization and Unicode. It's not something you can just reinvent in a fortnight; it took YEARS for existing solutions to be produced, and requires expertise you probably do not have (and would not have for years even if you were to start studying it now). You can't just throw it all out the window because OMG it's not written in D so it's BAD!)

Which would you rather have -- more projects written in C because that's the only way you can leverage this existing technology, because it's too hard to use it from D; or more projects written in D because you can easily use a C library with minimum friction, so you actually *have* the option of writing it in D in the first place?

T

-- 
You only live once.
May 15 2022
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 5/15/2022 4:55 AM, H. S. Teoh wrote:
 There is, of course, the option of rewriting said C library in D.
In my experience, it is not an option to rewrite working C code into D.

Note that I have successfully converted small and medium C code projects to D. I've done other translations of programs from one language to another.

1. if you don't have a test suite for the program, a successful conversion becomes an order of magnitude harder

2. converting a program all at once does not work. It must be done incrementally, one function at a time

3. even so, when faced with a large, complex project, there's just no business case for doing a conversion

Even just converting the .h files to D can be a major, rather unpleasant undertaking. We've put a lot of time into converting the various system .h files into D for druntime. There's always a risk of a mistake, and we've made them, and the result is bizarre crashes because of ABI mismatches. Hand-checking them is error-prone, tedious and very boring work.

ImportC makes things so much easier. You can write new D code and hook it up with your existing C code base. You can get reliable, easy, and accurate access to .h files. D will happily work with a project that's a patchwork of C and D code. I.e. it dramatically lowers the barrier to adopting D.

As for languages other than C, they routinely have a C interface. ImportC makes it easy to hook up with them via the C interface they provide.
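A sketch of what point 2 looks like in practice; the file name, function and signature are invented:

```d
// parse.d -- replaces exactly one function from the old parse.c;
// the rest of the project stays C and keeps calling the same symbol.
extern(C) int parse_header(const(char)* buf, size_t len)
{
    // new D implementation behind the unchanged C ABI
    if (buf is null || len == 0)
        return -1;
    // ... port the body, rerun the test suite, then move on to the
    // next function
    return 0;
}
```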
May 15 2022
next sibling parent reply Siarhei Siamashka <siarhei.siamashka gmail.com> writes:
On Sunday, 15 May 2022 at 17:21:18 UTC, Walter Bright wrote:
 In my experience, it is not an option to rewrite working C code 
 into D.

 Note that I have successfully converted small and medium C code 
 projects to D.
Don't these two sentences contradict each other? Unless I'm misinterpreting the meaning of words "converted" and "successfully".
 I've done other translations of programs from one language to 
 another.

 1. if you don't have a test suite for the program, a successful 
 conversion becomes an order of magnitude harder

 2. converting a program all at once does not work. It must be 
 done incrementally, one function at a time
Can you elaborate on this? One-function-at-a-time conversion can easily be done if one language is mostly a superset of the other, such as converting from C to C++. But converting from C++ back to C one function at a time would not be so easy anymore (because there are classes, templates and other advanced language features).
 3. even so, when faced with a large, complex project, there's 
 just no business case for doing a conversion
Yes, if it ain't broke, don't fix it. Especially if fixing it costs money and introduces unnecessary risks.

C is not very good for implementing new code compared to other programming languages. But once the job is done, long-term maintenance is relatively painless. The language is standardized, and newer versions of the compilers are only becoming more strict about certain things.

D is good for rapid development of new code, but isn't great for long-term maintenance because of the language's evolution and compatibility breakages. My understanding is that many D projects and libraries died off because they could not afford to keep up and can't even be compiled anymore.

I don't think that converting existing C code into D makes much sense, because such a conversion only turns the strength of one language into the weakness of another.
 Even just converting the .h files to D can be a major, rather 
 unpleasant undertaking. We've put a lot of time into converting 
 the various system .h files into D for druntime. There's always 
 a risk of a mistake, and we've made them and the result is 
 bizarre crashes because of ABI mismatches. Hand-checking them 
 is error-prone, tedious and very boring work.
Well, everyone is doing this and bindings for popular C libraries are available for most programming languages.
 ImportC makes things so much easier. You can write new D code 
 and hook it up with your existing C code base. You can get 
 reliable, easy, and accurate access to .h files. D will happily 
 work with a project that's a patchwork of C and D code.
 I.e. it dramatically lowers the barrier to adopting D.
Do I understand it right that ImportC is intended for implementing major new features in existing old C projects using the D language?
May 15 2022
next sibling parent Tejas <notrealemail gmail.com> writes:
On Monday, 16 May 2022 at 06:51:46 UTC, Siarhei Siamashka wrote:

 One of the old C projects is the Linux kernel. I know that some 
 people are working on making it possible to develop some parts 
 of the Linux kernel using Rust language. Would ImportC make it 
 possible to use D for developing some parts of the Linux kernel 
 and how is it different from what Rust people are doing?
Rust isn't being used to develop any part of the Linux kernel itself, AFAIK. It's approved for driver development, not any part of the kernel proper (although device drivers are, like, 90+% of the code in the Linux project, so...).
May 16 2022
prev sibling next sibling parent IGotD- <nise nise.com> writes:
On Monday, 16 May 2022 at 06:51:46 UTC, Siarhei Siamashka wrote:
 Do I understand it right that ImportC is intended for 
 implementing major new features in existing old C projects 
 using the D language?
That is possible, yes. There are many chip vendors that have some kind of C API framework that comes with their IPs. In this case (if ImportC works correctly) you can import their C API in D and then write D code instead of the usual C/C++ code, which is more likely to be buggy. For low-level code you're more likely to use betterC than full D, though.

If ImportC worked, I would have done that right off the bat (the managers would stop me, of course).
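A hypothetical sketch of that workflow; every name below is a placeholder, not a real vendor SDK:

```d
// main.d -- build with: dmd -betterC main.d vendor_hal.c
// vendor_hal.c is assumed to pull in the vendor's C API headers.
import vendor_hal;   // the vendor's C API, via ImportC

extern(C) void app_main()
{
    hal_init();                  // placeholder vendor calls
    hal_gpio_write(LED_PIN, 1);
}
```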
May 16 2022
prev sibling next sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 5/15/2022 11:51 PM, Siarhei Siamashka wrote:
 On Sunday, 15 May 2022 at 17:21:18 UTC, Walter Bright wrote:
 In my experience, it is not an option to rewrite working C code into D.

 Note that I have successfully converted small and medium C code projects to D.
Don't these two sentences contradict each other? Unless I'm misinterpreting the meaning of words "converted" and "successfully".
It means I have credibility when it comes to this topic.
 2. converting a program all at once does not work. It must be done 
 incrementally, one function at a time
Can you elaborate on this?
Doing it one function at a time means if your new build doesn't work, you only have one function to look at to find the error. This means orders of magnitude less time spent debugging.
 But once the job is done, long term maintenance is relatively 
 painless.
No, it isn't. I speak from experience. C's limitations make for code that is brittle (very hard to refactor).
 D is good for rapid development of new code, but isn't great for long-term 
 maintenance because of the language's evolution and compatibility breakages. 
 My understanding is that many D projects and libraries died off because they 
 could not afford to keep up and can't even be compiled anymore.
I've brought code forward myself. The D1 => D2 transition was hard, but since then it hasn't been that hard. But people don't want to bother with this.
 Well, everyone is doing this and bindings for popular C libraries are 
 available for most programming languages.
This vastly underestimates the scope of the problem.
 Do I understand it right that ImportC is intended for implementing major new 
 features in existing old C projects using the D language?
?
 One of the old C projects is the Linux kernel. I know that some people are 
 working on making it possible to develop some parts of the Linux kernel 
 using the Rust language. Would ImportC make it possible to use D for 
 developing some parts of the Linux kernel, and how is it different from what 
 the Rust people are doing?
I am unfamiliar with kernel development and its needs. It apparently is also written in a dialect of C with special compiler switches.
May 16 2022
next sibling parent Siarhei Siamashka <siarhei.siamashka gmail.com> writes:
On Monday, 16 May 2022 at 08:08:51 UTC, Walter Bright wrote:
 On 5/15/2022 11:51 PM, Siarhei Siamashka wrote:
 Don't these two sentences contradict each other? Unless I'm 
 misinterpreting the meaning of words "converted" and 
 "successfully".
It means I have credibility when it comes to this topic.
Which of the two contradictory statements has credibility? Is it "not an option to rewrite working C code into D" or "I have successfully converted small and medium C code projects to D"?
 But once the job is done, long term maintenance is relatively 
 painless.
 No, it isn't. I speak from experience. C's limitations make for code that is brittle (very hard to refactor).
Yes, it is relatively painless. Huge amounts of the existing C code developed over the span of decades show us a very different picture. Why would you want to refactor something that already works fine and needs to keep working fine in the future?
 Well, everyone is doing this and bindings for popular C 
 libraries are available for most programming languages.
This vastly underestimates the scope of the problem.
I think that you are exaggerating the problem.
 Do I understand it right that ImportC is intended for 
 implementing major new features in existing old C projects 
 using the D language?
?
If an old C project is doing its job just fine, then it only needs minimal maintenance and has no use for any fancy stuff. Now if a major new feature is needed in such an old project, then people normally use C (or C++) to implement it. Or, if they prefer a more convenient, nicer, higher-level language, then maybe they embed Lua or Python code to do the job.

Is ImportC intended to allow using the D language as an alternative to Lua/Python for such mixed-language hybrid projects?
May 16 2022
prev sibling parent reply forkit <forkit gmail.com> writes:
On Monday, 16 May 2022 at 08:08:51 UTC, Walter Bright wrote:
 ...
 ....
 I am unfamiliar with kernel development and its needs. It 
 apparently is also written in a dialect of C with special 
 compiler switches.
What it 'needs' is to move away from C. https://cve.mitre.org/cgi-bin/cvekey.cgi?keyword=linux+kernel
May 16 2022
parent Siarhei Siamashka <siarhei.siamashka gmail.com> writes:
On Monday, 16 May 2022 at 22:52:27 UTC, forkit wrote:
 On Monday, 16 May 2022 at 08:08:51 UTC, Walter Bright wrote:
 ...
 ....
 I am unfamiliar with kernel development and its needs. It 
 apparently is also written in a dialect of C with special 
 compiler switches.
What it 'needs' is to move away from C. https://cve.mitre.org/cgi-bin/cvekey.cgi?keyword=linux+kernel
It's moving to Rust as an option, at least for some parts of the kernel, and this will reduce the attack surface, so you are preaching to the choir. D decided not to participate in this race. Or didn't even notice that such an opportunity existed.
May 16 2022
prev sibling parent reply Mike Parker <aldacron gmail.com> writes:
On Monday, 16 May 2022 at 06:51:46 UTC, Siarhei Siamashka wrote:


 D is good for rapid development of new code, but isn't great 
 for long-term maintenance because of the language's evolution 
 and compatibility breakages. My understanding is that many D 
 projects and libraries died off because they could not afford 
 to keep up and can't even be compiled anymore.
I don't think that's true at all. Maybe some people felt the rate of change was too high (others will tell you they want more breakage), but I suspect many D projects and libraries died off because their creators moved on to other things before they got their projects to the state they wanted. You can find countless projects like that in every language ecosystem. They're perhaps more noticeable in ours because we're so small.

It's very easy to start a new project on a whim in any language, but getting it to the state you're aiming for and maintaining it long-term require discipline and commitment. Talk to people who actually maintain projects long-term to see what their take is.
May 16 2022
next sibling parent reply Siarhei Siamashka <siarhei.siamashka gmail.com> writes:
On Monday, 16 May 2022 at 08:33:08 UTC, Mike Parker wrote:
 On Monday, 16 May 2022 at 06:51:46 UTC, Siarhei Siamashka wrote:
 D is good for rapid development of new code, but isn't great 
 for long-term maintenance because of the language's evolution 
 and compatibility breakages. My understanding is that many D 
 projects and libraries died off because they could not afford 
 to keep up and can't even be compiled anymore.
I don't think that's true at all. Maybe some people felt the rate of change was too high (others will tell you they want more breakage), but I suspect many D projects and libraries died off because their creators moved on to other things before they got their projects to the state they wanted. You can find countless projects like that in every language ecosystem. They're perhaps more noticeable in ours because we're so small. It's very easy to start a new project on a whim in any language, but getting it to the state you're aiming for and maintaining it long-term require discipline and commitment. Talk to people who actually maintain projects long-term to see what their take is.
I vaguely remember this message about maintaining 400kloc of D code and keeping it up to date with the latest compiler versions: https://forum.dlang.org/post/idswokerwzdkszevjbrh@forum.dlang.org

Now I see that the poster said that the amount of breakage that comes from compiler upgrades was reasonably small (at least by the poster's standards). But honestly, the description of their development process and the fact that there are at least some compatibility breaks terrifies me. For example, I wonder what they do when they need to bisect a somewhat old bug? There are many little things like this that make everything problematic.

C++ compilers allow picking an older version of the standard for keeping compatibility with legacy code (such as the '-std=c++11' option supported by G++). Rust has language editions too, which help to ensure that legacy code keeps working even as the compiler evolves. But the D language forever remains experimental. Or at least it is perceived as such by many outsiders and gets dismissed because of this.
May 16 2022
parent reply Guillaume Piolat <first.last gmail.com> writes:
On Monday, 16 May 2022 at 09:25:14 UTC, Siarhei Siamashka wrote:
 C++ compilers allow picking an older version of the standard 
 for keeping compatibility with legacy code (such as the 
 '-std=c++11' option supported by G++).
Having done both, keeping up with C++ compilers is a lot more work than keeping up with D compilers.

C++ compilers do not share the same standard library, nor the same front-end. Some headers are missing on other compilers. MSVC might remove features you need just like that, like inline 32-bit assembly. And C++ compilers have bugs too, and when you get one there is no nice centralized Bugzilla to post it to.
May 16 2022
parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Monday, 16 May 2022 at 10:45:51 UTC, Guillaume Piolat wrote:
 And, C++ compilers have bugs too and when you get one there is 
 no nice centralized Bugzilla to post it to.
I believe MS has totally revamped their compiler suite and all writings on this topic suggest that they are leading in implementing C++20. Whatever has been true in the past may not hold in 2022. You could say the same about D, too, I am sure.
May 16 2022
parent reply Guillaume Piolat <first.last gmail.com> writes:
Again: I won't answer your inevitable next message, because my 
time seems to be more limited.

On Monday, 16 May 2022 at 11:18:36 UTC, Ola Fosheim Grøstad wrote:
 I believe MS has totally revamped their compiler suite and all 
 writings on this topic suggest that they are leading in 
 implementing C++20.
I'm not sure why you even bring up Microsoft. I found more bugs in ICC than in MSVC, FWIW. MSVC has worse codegen and error messages than clang, and I don't see a reason why that would change. MSVC has not undergone a full rewrite; it's being redone part by part.
 Whatever has been true in the past may not hold in 2022.
I stand by my words, having done maintenance for both C++ and D projects for more than 6 years each: D requires less work/debt to keep up to date with compilers, especially because the front-end is the same and the stdlib is also the same. Moreover, the experience is like night and day, with D being far less heavy on mental load.
May 16 2022
next sibling parent zjh <fqbqrr 163.com> writes:
On Monday, 16 May 2022 at 12:10:36 UTC, Guillaume Piolat wrote:
 
How does your D code call `C++` code?
May 16 2022
prev sibling parent Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Monday, 16 May 2022 at 12:10:36 UTC, Guillaume Piolat wrote:
 I'm not sure why do you even bring Microsoft? I found more bugs 
 in ICC than MSVC fwiw.
People say that Microsoft didn't provide the resources needed to develop their C++ compiler for a while, which is why they lagged behind the standards. This has changed, and they are now reportedly taking a lead in implementing the ISO standard. From what I can tell, the funding situation for the C++ ecosystem has improved over the past few years. Difficult to make an objective assessment, of course.
May 16 2022
prev sibling next sibling parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Monday, 16 May 2022 at 08:33:08 UTC, Mike Parker wrote:
 breakage), but I suspect many D projects and libraries died off 
 because their creators moved on to other things before they got 
 their projects to the state they wanted.
But why did they move on? People have different answers, some related to the evolution of D as we can see in this thread.
May 16 2022
parent zjh <fqbqrr 163.com> writes:
On Monday, 16 May 2022 at 11:21:52 UTC, Ola Fosheim Grøstad wrote:

 But why did they move on?

 People have different answers, some related to the evolution of 
 D as we can see in this thread.
It may be a major compatibility violation. For `large compatibility problems`, `migration tools` should be provided.
May 16 2022
prev sibling next sibling parent Walter Bright <newshound2 digitalmars.com> writes:
On 5/16/2022 1:33 AM, Mike Parker wrote:
 It's very easy to start a new project on a whim in any language, but getting 
 it to the state you're aiming for and maintaining it long-term require 
 discipline and commitment. Talk to people who actually maintain projects 
 long-term to see what their take is.
Yup. The last 1% takes 99% of the time.

My old C and C++ projects all suffered from bit rot. This is due to a variety of factors, like:

1. the language changes
2. the compilers get stricter
3. implementation-defined behavior changes
4. undefined behavior changes
5. portability problems
6. build system changes
7. operating system changes

Heck, just reworking it to be a git repository takes time and effort.
May 16 2022
prev sibling parent reply Fry <fry131313 gmail.com> writes:
On Monday, 16 May 2022 at 08:33:08 UTC, Mike Parker wrote:
 I don't think that's true at all. Maybe some people felt the 
 rate of change is to high (others will tell you they want more 
 breakage), but I suspect many D projects and libraries died off 
 because their creators moved on to other things before they got 
 their projects to the state they wanted. You can find countless 
 projects like that in every language ecosystem. They're perhaps 
 more noticeable in ours because we're so small.

 It's very easy to start a new project on a whim in any 
 language, but getting it to the state you're aiming for and 
 maintaining it long-term require discipline and commitment. 
 Talk to people who actually maintain projects long-term to see 
 what their take is.
I maintained a personal project that was 60k loc of D for the last 6-7 years. I probably won't pick D for a long-term project again. There was _always_ a new compiler bug whenever I upgraded to a new version of the compiler. I'd try to stick to one version of the compiler, but bug fixes only exist for newer versions. There were also quite a few breaking changes; I turned off warnings-as-errors at some point.

I also remember another instance of someone else here maintaining a D project that was still being used but not actively developed anymore. A fix was made in a newer version of D, but they were originally using one several versions behind, and their code just wasn't compatible with the new D compiler anymore. The response from Walter was to suggest making a donation to backport the fix.

What they did, and what is probably being done by other individuals, is to just stick to one version of D, a single release, then hope you don't come across a bug you need fixed. Not sure who else you meant to ask, but yeah, long-term development with D sucks compared to other languages.
May 16 2022
next sibling parent Adam D Ruppe <destructionator gmail.com> writes:
On Monday, 16 May 2022 at 19:20:53 UTC, Fry wrote:
 There was _always_ a new compiler bug whenever I upgraded to a 
 new version of the compiler.
Do you recall what those bugs related to?
May 16 2022
prev sibling next sibling parent max haughton <maxhaton gmail.com> writes:
On Monday, 16 May 2022 at 19:20:53 UTC, Fry wrote:
 On Monday, 16 May 2022 at 08:33:08 UTC, Mike Parker wrote:
 [...]
I maintained a personal project that was 60k loc of D for the last 6-7 years. Probably won't pick D again for a long term project again. There was _always_ a new compiler bug whenever I upgraded to a new version of the compiler. I'd try to stick to one version of the compiler, but bug fixes only exist for newer versions. Also a quite a bit of breaking changes, I turned off warnings as errors at some point. [...]
At Symmetry we stick to one release but then bump to the latest one when we know it works.
May 16 2022
prev sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 5/16/2022 12:20 PM, Fry wrote:
 I maintained a personal project that was 60k loc of D for the last 6-7 
 years. I probably won't pick D for a long-term project again. There was 
 _always_ a new compiler bug whenever I upgraded to a new version of the 
 compiler. I'd try to stick to one version of the compiler, but bug fixes 
 only exist for newer versions. There were also quite a few breaking changes; 
 I turned off warnings-as-errors at some point.
I'm sorry, but we don't have enough staff to maintain multiple versions simultaneously, especially on a volunteer basis. Most of us are not paid.
 What they did, and what is probably being done by other individuals, is to 
 just stick to one version of D, a single release, then hope you don't come 
 across a bug you need fixed. Not sure who else you meant to ask, but yeah, 
 long-term development with D sucks compared to other languages.
Often people want 3 bug fixes along with 2 enhancements. Trying to create various combinations of these in multiple binaries is pretty hard to do.
May 16 2022
parent reply rikki cattermole <rikki cattermole.co.nz> writes:
Thing is, we have had multiple versions being maintained up till now.

Iain has been doing an absolutely wonderful job essentially creating an 
LTS frontend, and we didn't bother to ship an LTS version of dmd.

We discussed this last night on Discord, and I made this very point. If 
he says we can do it, then we can do it. He has the experience for it.
May 17 2022
next sibling parent reply Siarhei Siamashka <siarhei.siamashka gmail.com> writes:
On Tuesday, 17 May 2022 at 07:06:47 UTC, rikki cattermole wrote:
 Iain has been doing an absolutely wonderful job essentially 
 creating an LTS frontend, and we didn't bother to ship an LTS 
 version of dmd.
Do you mean that the 2.076 frontend from GDC 9/10/11 can be essentially treated as a D language "edition 2017"? I actually like the idea, even though I can see a few challenges that can make everything a bit complicated.
May 17 2022
parent reply rikki cattermole <rikki cattermole.co.nz> writes:
On 18/05/2022 12:41 AM, Siarhei Siamashka wrote:
 On Tuesday, 17 May 2022 at 07:06:47 UTC, rikki cattermole wrote:
 Iain has been doing an absolutely wonderful job essentially creating 
 an LTS frontend, and we didn't bother to ship an LTS version of dmd.
Do you mean that the 2.076 frontend from GDC 9/10/11 can be essentially treated as a D language "edition 2017"? I actually like the idea, even though I can see a few challenges that can make everything a bit complicated.
Could have been, yes. I wouldn't suggest that we do it now though.
May 17 2022
parent Siarhei Siamashka <siarhei.siamashka gmail.com> writes:
On Tuesday, 17 May 2022 at 12:42:42 UTC, rikki cattermole wrote:
 Could have been yes. I wouldn't suggest that we do it now 
 though.
Why not? But first it would be very interesting to check what percentage of packages from https://code.dlang.org/ can still be successfully compiled using the old 2.076 frontend, versus the percentage that can be compiled using the most recent 2.100 frontend. And maybe some frontend versions in between. These kinds of statistics are necessary before making any commitments.
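A rough sketch of how such a survey could be scripted in D itself (the `packages.txt` list file, the compiler names, and the use of `dub build <package>` here are placeholders, not existing project tooling):

```d
import std.file : readText;
import std.process : execute;
import std.stdio : writefln;
import std.string : splitLines, strip;

void main(string[] args)
{
    // compiler to test with, e.g. "gdc-10" (old 2.076 frontend) or "dmd"
    const compiler = args.length > 1 ? args[1] : "dmd";

    size_t ok, total;
    foreach (line; readText("packages.txt").splitLines)
    {
        const pkg = line.strip;
        if (pkg.length == 0)
            continue;
        ++total;
        // `dub build <package>` fetches the package and tries to build it
        const r = execute(["dub", "build", pkg, "--compiler=" ~ compiler]);
        if (r.status == 0)
            ++ok;
    }
    writefln("%d/%d packages built with %s", ok, total, compiler);
}
```

Run it once per frontend and compare the two success counts.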
May 17 2022
prev sibling parent Walter Bright <newshound2 digitalmars.com> writes:
On 5/17/2022 12:06 AM, rikki cattermole wrote:
 Thing is, we have had multiple versions being maintained up till now.
 
 Iain has been doing an absolutely wonderful job essentially creating an LTS 
 frontend, and we didn't bother to ship an LTS version of dmd.
 
 We discussed this last night on Discord, and I made this very point. If he
says 
 we can do it, then we can do it. He has the experience for it.
Many thanks to Iain for doing this!
May 17 2022
prev sibling next sibling parent Guillaume Piolat <first.last gmail.com> writes:
On Sunday, 15 May 2022 at 17:21:18 UTC, Walter Bright wrote:
 3. even so, when faced with a large, complex project, there's 
 just no business case for doing a conversion
This is indeed the biggest problem after a translation, since the social structure that got that software written and maintained in the first place does not exist for the fork. It might work, but it receives no maintenance, performance enhancements, etc. For example, in July 2021 I re-translated parts of stb-image to v2.27 and got a +30%/+45% performance improvement in PNG loading. If I were using an ideal ImportC, I would just have to copy the new header into the codebase. So ImportC will help us keep up with dependencies that improve over time instead of being cast in stone. The security argument is moot; this will enhance the security of D programs.
May 16 2022
prev sibling parent "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Sun, May 15, 2022 at 10:21:18AM -0700, Walter Bright via Digitalmars-d wrote:
 On 5/15/2022 4:55 AM, H. S. Teoh wrote:
 There is, of course, the option of rewriting said C library in D.
In my experience, it is not an option to rewrite working C code into D. Note that I have successfully converted small and medium C code projects to D. I've done other translations of programs from one language to another.
I've converted a medium-sized C++ project into D. (Not a library, though, which would entail different mechanics because there'd be external code that uses it, so API compatibility becomes an issue.)
 1. if you don't have a test suite for the program, a successful
 conversion becomes an order of magnitude harder
Very true.
 2. converting a program all at once does not work. It must be done
 incrementally, one function at a time
On the contrary, with the C++ project that I converted, I found it too onerous to convert one function at a time. I started off that way, but quickly found it too painful because of tight coupling in the original code -- converting a single function sometimes actually required converting 15 functions, because they were all interdependent and/or shared the same data structures. In theory I could've done it, I suppose. But I eventually threw in the towel and decided to leap into D cold-turkey. Strictly speaking it was more a complete rewrite in D from the ground up than a conversion, using the C++ code more as an external reference for comparing behaviour than actually converting the code itself. Fortunately, I had a largish website that provided plenty of actual use cases for the program, so even though it wasn't technically a test suite, it did serve as a check for whether I failed to match the old C++ behaviour, as well as a progress meter for how far the D code had progressed. Now the entire project is in D, and I'm mighty proud of it. It's much more maintainable than the original C++ codebase, and thanks to D's modern features it's much easier to implement new features without constantly getting bogged down by the amount of babysitting that C++ demands.
 3. even so, when faced with a large, complex project, there's just no
 business case for doing a conversion
 
 Even just converting the .h files to D can be a major, rather
 unpleasant undertaking. We've put a lot of time into converting the
 various system .h files into D for druntime. There's always a risk of
 a mistake, and we've made them and the result is bizarre crashes
 because of ABI mismatches.  Hand-checking them is error-prone, tedious
 and very boring work.
[...] Yes, automatic conversion is the way to go. Even when I'm "hand-copying" C prototypes into D, I always use cut-n-paste + edit afterwards, rather than typing them out from scratch, because the latter is so much more error-prone. My latest project, which uses libxcb heavily, has a sed script for doing 90% of the work of recasting C types into D -- routine stuff like uint8_t -> ubyte, etc. Routine work is the most dangerous in terms of likelihood of human error, because humans are bad at doing repetitive things accurately. After the first 5 times your brain just zones out and defers to muscle memory, and mistakes creep in that you're not even conscious of. T -- Take care of your clothes while they're new, and of your health while you're young.
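For illustration, the same kind of mechanical rewriting can be done in D itself with std.regex (the mapping table below is a made-up fragment, not the actual sed script):

```d
import std.regex : regex, replaceAll;
import std.stdio : writeln;

string cTypesToD(string src)
{
    // \b word-boundary anchors so "int8_t" doesn't match inside "uint8_t"
    static immutable string[2][] pairs = [
        [`\bint8_t\b`,   "byte"],
        [`\buint8_t\b`,  "ubyte"],
        [`\bint16_t\b`,  "short"],
        [`\buint16_t\b`, "ushort"],
        [`\bint32_t\b`,  "int"],
        [`\buint32_t\b`, "uint"],
    ];
    foreach (p; pairs)
        src = replaceAll(src, regex(p[0]), p[1]);
    return src;
}

void main()
{
    writeln(cTypesToD("uint8_t depth; uint32_t pixel;"));
    // prints: ubyte depth; uint pixel;
}
```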
May 16 2022
prev sibling next sibling parent reply IGotD- <nise nise.com> writes:
On Sunday, 15 May 2022 at 06:18:58 UTC, forkit wrote:
 Also, operating systems of the (near) future will require 
 safety guarantees from the software that is intended to run on 
 that operating system. C is not a language that facilitates 
 this.
Um, no that will not happen, ever. The safety guarantee of modern operating systems is and will be the MMU. Relying on "safe" software will never be enough. There have been attempts using safe intermediary languages like Microsoft Singularity but don't expect this ever to be a commercial operating system. The MMU is here to stay.
May 15 2022
next sibling parent reply max haughton <maxhaton gmail.com> writes:
On Sunday, 15 May 2022 at 18:47:22 UTC, IGotD- wrote:
 On Sunday, 15 May 2022 at 06:18:58 UTC, forkit wrote:
 Also, operating systems of the (near) future will require 
 safety guarantees from the software that is intended to run on 
 that operating system. C is not a language that facilitates 
 this.
Um, no that will not happen, ever. The safety guarantee of modern operating systems is and will be the MMU. Relying on "safe" software will never be enough. There have been attempts using safe intermediary languages like Microsoft Singularity but don't expect this ever to be a commercial operating system. The MMU is here to stay.
I don't think the choice of language is going to make much difference, but I'd be surprised if Apple don't do something along these lines in the future. They already scan and analyse applications relatively aggressively for mobile, and to an extent for MacOS too.
May 15 2022
parent reply forkit <forkit gmail.com> writes:
On Sunday, 15 May 2022 at 23:11:28 UTC, max haughton wrote:
 On Sunday, 15 May 2022 at 18:47:22 UTC, IGotD- wrote:
 On Sunday, 15 May 2022 at 06:18:58 UTC, forkit wrote:
 Also, operating systems of the (near) future will require 
 safety guarantees from the software that is intended to run 
 on that operating system. C is not a language that 
 facilitates this.
Um, no that will not happen, ever. The safety guarantee of modern operating systems is and will be the MMU. Relying on "safe" software will never be enough. There have been attempts using safe intermediary languages like Microsoft Singularity but don't expect this ever to be a commercial operating system. The MMU is here to stay.
I don't think the choice of language is going to make much difference but I'd be surprised if Apple don't do something along these lines in the future. They already scan and analyse applications relatively aggressive for mobile and to an extent for MacOS too.
The choice of language can eliminate a whole class of bugs. It can also increase the likelihood of a whole class of bugs. I'm sure you know this of course. I'm just stating the obvious. Fortunately, many get this, and are actively researching ways to move away from C - e.g. https://github.com/tock/tock/blob/master/doc/Design.md If Apple is not already doing very extensive research in this area, I'd be ...rather shocked. In the meantime, over here in the D community, we're actively seeking even more ways to get closer to C :-(
May 16 2022
parent Paulo Pinto <pjmlp progtools.org> writes:
On Tuesday, 17 May 2022 at 01:03:14 UTC, forkit wrote:
 On Sunday, 15 May 2022 at 23:11:28 UTC, max haughton wrote:
 On Sunday, 15 May 2022 at 18:47:22 UTC, IGotD- wrote:
 On Sunday, 15 May 2022 at 06:18:58 UTC, forkit wrote:
 Also, operating systems of the (near) future will require 
 safety guarantees from the software that is intended to run 
 on that operating system. C is not a language that 
 facilitates this.
Um, no that will not happen, ever. The safety guarantee of modern operating systems is and will be the MMU. Relying on "safe" software will never be enough. There have been attempts using safe intermediary languages like Microsoft Singularity but don't expect this ever to be a commercial operating system. The MMU is here to stay.
I don't think the choice of language is going to make much difference but I'd be surprised if Apple don't do something along these lines in the future. They already scan and analyse applications relatively aggressive for mobile and to an extent for MacOS too.
The choice of language can eliminate a whole class of bugs. It can also increase the likelihood of a whole class of bugs. I'm sure you know this of course. I'm just stating the obvious. Fortunately, many get this, and are actively researching ways to move away from C - e.g. https://github.com/tock/tock/blob/master/doc/Design.md If Apple is not already doing very extensive research in this area, I'd be ...rather shocked. ...
That is what Swift is all about,
 Swift is intended as a replacement for C-based languages (C, 
 C++, and Objective-C).
Taken from https://www.swift.org/about/
 Swift is a successor to both the C and Objective-C languages.
Taken from https://developer.apple.com/swift/#fast And some C improvements as well, thanks hardware memory tagging support
 In iOS 14 and iPadOS 14, Apple modified the C compiler 
 toolchain used to build the iBoot bootloader to improve its 
 security. The modified toolchain implements code designed to 
 prevent memory- and type-safety issues that are typically 
 encountered in C programs.
Taken from https://support.apple.com/guide/security/memory-safe-iboot-implementation-sec30d8d9ec1/web
May 16 2022
prev sibling parent forkit <forkit gmail.com> writes:
On Sunday, 15 May 2022 at 18:47:22 UTC, IGotD- wrote:
 Um, no that will not happen, ever. The safety guarantee of 
 modern operating systems is and will be the MMU. Relying on 
 "safe" software will never be enough. There have been attempts 
 using safe intermediary languages like Microsoft Singularity 
 but don't expect this ever to be a commercial operating system. 
 The MMU is here to stay.
Relying on safe software alone is not what I said. But modern architectures do not provide a 'safe' operating environment either. e.g. Evict+Time cache attacks on the MMU? There is an abundance (to put it mildly) of research and real-world evidence that operating systems are not safe (by design). IoiT (Internet of insecure Things) OSes bring this to a whole new level. All (mostly) built using C, which results in an inherently unsafe operating environment. The problem here is C, used (and still used) primarily for performance reasons. As useful as ImportC sounds (if it worked 100% correctly), it seems to be taking D programmers in the wrong direction (given my argument above). i.e. "just bring C code over here into your D project. It'll be fine. Don't worry. You don't even need to understand what that code does.. just import it."
May 15 2022
prev sibling parent reply =?UTF-8?Q?Ali_=c3=87ehreli?= <acehreli yahoo.com> writes:
On 5/14/22 23:18, forkit wrote:

 developers have
 a wider obligation than just stitching together software so that it 
 works.
Wider than their salaries? Employers need developers, and developers work for money. In the end it just works out.
 Structured higher-level languages is where we need to be moving towards,
Ok.
 not moving backwards to low-level languages like C.
I am never moving to C. I don't think anybody should be using C ever.
 I understand the appeal of making it easier to use C libraries in a
 otherwise D solution.
Me too! :) I think you think easier access to C libraries is wrong. I disagree. Besides, I think we've been using the wrong term here: ImportC makes it easier to use libraries with C interfaces. The implementation could have been in any language. But that's beside the point. If there is a library with a C interface that solves a company's problem, then they will use that library.
 But ImportC is short-sighted in my opinion, and is a step in the
 opposite direction.

 The focus should instead be on @safe, not C.
An open source community allows (should allow?) individuals to spend their energy on any topic that they want. Some will work on ImportC, others (which apparently includes you) will work on @safe. Ali
May 15 2022
next sibling parent reply forkit <forkit gmail.com> writes:
On Sunday, 15 May 2022 at 23:32:33 UTC, Ali Çehreli wrote:
 ...
 I think you think easier access to C libraries is wrong. I 
 disagree.
No. My argument is not that ImportC is wrong, or even useless. My argument has to be taken *in context* of wanting programmers to write safer, more secure code, and how ImportC does not advance that aim, and *possibly* does the exact opposite. So, by raising my concern, I hope to get people to think twice before importing C code into their D projects, with the aim of not even needing to know what that C code does.
May 15 2022
parent reply "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Sun, May 15, 2022 at 11:41:06PM +0000, forkit via Digitalmars-d wrote:
[...]
 No. My argument is not that ImportC is wrong, or even useless.
 
 My argument has to be taken *in context* of wanting programmers to
 write safer, more secure code, and how ImportC does not advance that
 aim, and *possibly* does the exact opposite.
 
 So, by raising my concern, I hope to get people to think twice before
 importing C code into their D projects, with the aim of not even
 needing to know what that C code does.
IMO this is misguided. Without ImportC, it takes more effort to use a C library. Which incentivizes would-be project authors to just write more C to interface with the library, rather than put in the effort to leap through the hoops just so you can write D. The net result is more unsafe C code is written. With ImportC (fully implemented -- though we're not quite there yet), it's easy to call an existing C library from D, so there's less resistance to writing D code to interface with it. The net result is *less* C code is written, more D code is written. And I'm sure you'll agree that more D code and less C code == more safety and security, than the other way round. T -- Being forced to write comments actually improves code, because it is easier to fix a crock than to explain it. -- G. Steele
May 16 2022
parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Monday, 16 May 2022 at 14:21:30 UTC, H. S. Teoh wrote:
 IMO this is misguided.  Without ImportC, it takes more effort 
 to use a C library.
Yes, if it actually works with all the macros in a clean and intuitive way. Meaning, the resulting D code should not look much worse than the corresponding C code. The biggest impact IMHO is that the threshold for *trying out* C-libraries is lowered. If you have to create your own binding, you are essentially "locked in" due to the effort you spent just to get started. The larger the C-library is and the more frequently it is updated, the more impactful this feature might be.
May 16 2022
parent reply forkit <forkit gmail.com> writes:
On Monday, 16 May 2022 at 15:08:13 UTC, Ola Fosheim Grøstad wrote:
 ...
 ....
 The biggest impact IMHO is that the threshold for *trying out* 
 C-libraries is lowered. If you have to create your own binding 
 you are essentially "locked in" due to the effort you spent 
 just to get started.

 The larger the C-library is and the more frequently it is 
 updated, the more impactful this feature might be.
You just summarised my argument for why ImportC == 'here be dragons' ;-) D's focus should be 10 years ahead, not 30 years behind. Just imagine Rust/Go implementing a C compiler inside their own compiler. They'd be a laughing stock (not because it's wrong or silly, but because they took the initiative to step further away from C, not further towards it).
May 16 2022
parent reply max haughton <maxhaton gmail.com> writes:
On Monday, 16 May 2022 at 22:35:00 UTC, forkit wrote:
 On Monday, 16 May 2022 at 15:08:13 UTC, Ola Fosheim Grøstad 
 wrote:
 ...
 ....
 The biggest impact IMHO is that the threshold for *trying out* 
 C-libraries is lowered. If you have to create your own binding 
 you are essentially "locked in" due to the effort you spent 
 just to get started.

 The larger the C-library is and the more frequently it is 
 updated, the more impactful this feature might be.
You just summarised my argument for why ImportC == 'here be dragons' ;-) D's focus should be 10 years ahead, not 30 years behind. Just imagine Rust/Go implementing a C compiler inside their own compiler They'd be laughing stock (not because it's wrong or silly, but because they took the initiative to step further awat from C, not further towards it).
https://github.com/rust-lang/rust-bindgen It's not built into the compiler, but it's officially supported by the Rust Foundation.
May 16 2022
next sibling parent reply forkit <forkit gmail.com> writes:
On Tuesday, 17 May 2022 at 00:50:53 UTC, max haughton wrote:
 ...

 https://github.com/rust-lang/rust-bindgen

 It's not built in to the compiler but it's officially supported 
 by the Rust foundation.
in the hands of advanced Rustcracians who come from a C background, I don't feel too uncomfortable with this. In the same way, I don't feel uncomfortable about Walter using ImportC (given his extensive expertise in C and D). But making it easily available to the mass of beginners out there (and even the mass of intermediates) worries me. C is profoundly unsafe. I'd rather they were focused on safer programming practices ;-) Of course, the C ABI is so simple that it's always going to be a temptation to interact with it. Couple that with all the C code out there, and the temptation is very high indeed. How on earth are we ever going to move the world of software away from C? The answer (I suspect) is: when mass societal consequences occur (e.g. a cyber war outbreak, or some really widespread consequential disaster), and we end up finding out they targeted all our C platforms. Only then will we have sufficient impetus to get off our butts and do something about our extensive reliance on what is arguably the most unsafe language of all -> C.
May 16 2022
next sibling parent reply StarCanopy <starcanopy protonmail.com> writes:
On Tuesday, 17 May 2022 at 01:57:44 UTC, forkit wrote:
 How on earth are we ever going to move the world of software 
 away from C?

 The answer is ( I suspect) - when mass societal consequences 
 occur (e.g. a cyber war outbreak or some really widespread 
 consequential disaster - and we end finding out, they targeted 
 all our C platforms.

 Only then will we have sufficient impetus to get off our butts 
 and do something about our extensive reliance on what is 
 arguably, the most unsafe language of all -> C.
Too much "we," not enough "I."
May 16 2022
parent reply forkit <forkit gmail.com> writes:
On Tuesday, 17 May 2022 at 02:05:54 UTC, StarCanopy wrote:
 On Tuesday, 17 May 2022 at 01:57:44 UTC, forkit wrote:
 How on earth are we ever going to move the world of software 
 away from C?

 The answer is ( I suspect) - when mass societal consequences 
 occur (e.g. a cyber war outbreak or some really widespread 
 consequential disaster - and we end finding out, they targeted 
 all our C platforms.

 Only then will we have sufficient impetus to get off our butts 
 and do something about our extensive reliance on what is 
 arguably, the most unsafe language of all -> C.
Too much "we," not enough "I."
But I cannot do it ;-) It *must* be 'we'. (Same with global warming, I guess.) Having said that, the first line I write in a D module is -> @safe: How many others can say the same?
May 16 2022
parent reply StarCanopy <starcanopy protonmail.com> writes:
On Tuesday, 17 May 2022 at 02:21:28 UTC, forkit wrote:
 But I cannot do it ;-)

 It *must* be 'we'.

 (same with global warming I guess).

 Having said that, the first line I write in a D module is -> 
 @safe:

 how many others can say the same?
Then fund it, or if you have insufficient monetary resources to effect anything substantial, assist in organizing a group to sponsor libraries written in D. That is, if you truly want to achieve your goals. P.S. While I often apply @safe at module-level, the GC and bounds-checking already accomplish a lot by themselves in my experience.
May 16 2022
parent forkit <forkit gmail.com> writes:
On Tuesday, 17 May 2022 at 05:37:20 UTC, StarCanopy wrote:
 Then fund it, or if you have insufficient monetary resources to 
 effect anything substantial, assist in organizing a group to 
 sponsor libraries written in D. That is, if you truly want to 
 achieve your goals.

 P.S. While I often apply @safe at module-level, the GC and 
 bounds-checking already accomplish a lot by themselves in my 
 experience.
I think advancing my goals towards safer programming practices can best be achieved by encouraging programmers to 'opt out' of @safe, rather than 'opt in' to @safe. Until @safe is the default in D (if ever), this can be accomplished right now, by everyone making the first line in every module they create be -> @safe: Now they have to make a conscious decision to 'opt out' of @safe. And now they become responsible for the consequences. At the moment, since @safe is not the default, D is responsible for the consequences ;-)
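A minimal sketch of that convention (the function names are illustrative only):

```d
@safe:  // first line of the module: everything below is checked

void fine(int[] a)
{
    if (a.length > 0)
        a[0] = 1;  // bounds-checked; verified by the compiler
}

@system void riskier(int* p)  // explicit, conscious opt-out
{
    *(p + 42) = 1;  // pointer arithmetic: the author now owns the risk
}
```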
May 17 2022
prev sibling next sibling parent forkit <forkit gmail.com> writes:
On Tuesday, 17 May 2022 at 01:57:44 UTC, forkit wrote:

'here be dragons' -> case in point..

"are you sure that's right?"

https://youtu.be/1H9FHhRntAk?t=199
May 16 2022
prev sibling parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Tuesday, 17 May 2022 at 01:57:44 UTC, forkit wrote:
 Only then will we have sufficient impetus to get off our butts 
 and do something about our extensive reliance on what is 
 arguably, the most unsafe language of all -> C.
Actually, C is one of the few languages in use where there are industrial verification solutions... Anyway, D has always followed C semantics and has always been system level. It never claimed not to be. People who care a lot about correctness now use Rust; compilers for such languages are implemented in Rust. In that narrow space, Rust cannot be displaced in the next 10 years. Yet Skia, Z3, LLVM and other big performance libraries will remain in C in the next 10 years. Nobody wants to rewrite those in other languages. Nobody can afford to build competitive free alternatives. So, C it is! ImportC done well allows D to benefit from the self-imposed restriction of being C-like that has been there from day 1. The only way to do it well is to do an integration that drills down to the level of the lexer, parser and AST.
May 16 2022
next sibling parent forkit <forkit gmail.com> writes:
On Tuesday, 17 May 2022 at 02:43:31 UTC, Ola Fosheim Grøstad 
wrote:
 People who care a lot about correctness now use Rust, compilers 
 for such languages are implemented in Rust. In that narrow 
 space Rust cannot be displaced in the next 10 years.
yes. I have to agree here. Rust is really the only contender at the moment. D's 'incrementing C by 1' was never going to be enough :-( But D4.. or .. D10... could be different. But I doubt it will be. In the meantime, please everyone, write @safe: at the beginning of your module, and then you'll have no choice but to explicitly choose unsafe mode 'manually' (as opposed to an unsafe environment being presented to you, by default). If you can't choose to be safe, at least choose to be unsafe ;-)
May 16 2022
prev sibling next sibling parent reply max haughton <maxhaton gmail.com> writes:
On Tuesday, 17 May 2022 at 02:43:31 UTC, Ola Fosheim Grøstad 
wrote:
 On Tuesday, 17 May 2022 at 01:57:44 UTC, forkit wrote:
 [...]
Actually, C is one of the few languages in use where there are industrial verification solutions... [...]
Z3 and LLVM are not written in C, unless you mean the stable APIs to them. The Z3 C++ interface is actually implemented on top of the C API.
May 16 2022
parent reply Ola Fosheim Gr <ola.fosheim.grostad gmail.com> writes:
On Tuesday, 17 May 2022 at 04:30:18 UTC, max haughton wrote:
 Z3 and LLVM are not written in C, unless you mean the stable 
 APIs to them.
Sooo... In this context, C and C++ with a C interface are the same. Yes, many Cish codebases are dressed up as C++ these days.
May 16 2022
parent reply Araq <rumpf_a web.de> writes:
On Tuesday, 17 May 2022 at 05:19:52 UTC, Ola Fosheim Gr wrote:
 On Tuesday, 17 May 2022 at 04:30:18 UTC, max haughton wrote:
 Z3 and LLVM are not written in C, unless you mean the stable 
 APIs to them.
Sooo... In this context C and C++ with C interface is the same. Yes, many Cish codebases are dressed up as C++ these days.
That doesn't mean anything... You're such a windbag that it reduces the quality of the forum.
May 16 2022
parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Tuesday, 17 May 2022 at 05:43:07 UTC, Araq wrote:
 Yes, many Cish codebases are dressed up as C++ these days.
That doesn't mean anything...
Sorry, but I’ve looked over both code bases. They make limited use of C++ abstractions, just like dmd did. And you would interface with Z3 from D through its C interface, not C++.
May 16 2022
parent Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Tuesday, 17 May 2022 at 05:55:16 UTC, Ola Fosheim Grøstad 
wrote:
 On Tuesday, 17 May 2022 at 05:43:07 UTC, Araq wrote:
 Yes, many Cish codebases are dressed up as C++ these days.
That doesn't mean anything...
Sorry, but I’ve looked over both code bases. They make limited use of C++ abstractions, just like dmd did. And you would interface with D through the Z3 C interface, not C++.
In case that was unclear: modern C++ will typically present a template-heavy interface. D cannot interface with that; now you are locked up in C++ land. Cish code in C++ dressing easily provides a C API. The original DMD codebase, Z3 and many other C++ codebases can, with limited effort, be translated into C. That makes them Cish… There is a 1-to-1 mapping of most of the model that you need to interface with. And that is why ImportC is a valuable addition, even if fewer libraries are written in C and more libraries are written in C++ over time.
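A tiny sketch of why that matters for D (the names below are hypothetical, not the real Z3 API): a flat C interface maps onto plain extern(C) declarations, while a template-heavy C++ interface has no such mapping.

```d
// hypothetical C-style API surface, declared directly in D
extern (C) nothrow @nogc
{
    struct SolverCtx;                  // opaque handle; fields unknown to D
    SolverCtx* solver_create();
    void solver_destroy(SolverCtx*);
    int solver_check(SolverCtx*);      // plain ints and pointers only
}
```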
May 16 2022
prev sibling parent reply max haughton <maxhaton gmail.com> writes:
On Tuesday, 17 May 2022 at 02:43:31 UTC, Ola Fosheim Grøstad 
wrote:
 On Tuesday, 17 May 2022 at 01:57:44 UTC, forkit wrote:
 Only then will we have sufficient impetus to get off our butts 
 and do something about our extensive reliance on what is 
 arguably, the most unsafe language of all -> C.
Actually, C is one of the few languages in use where there are industtial verification solutions... Anyway, D has always followed C semantics and has always been system level. It never claimed not to be. People who care a lot about correctness now use Rust, compilers for such languages are implemented in Rust. In that narrow space Rust cannot be displaced in the next 10 years. Yet, Skia, Z3, LLVM and other big performance libraries will remain in C in the next 10 years. Nobody wants to rewrite those in other languages. Nobody can afford to build competitive free alternatives. So, C it is! ImportC done well allows D to benefit from the selfimposed restriction of being C-like that has been there from day 1. The only way to do it well is to do an integration that drills down to the level of the lexer, parser and AST.
Other than memory safety, Rust doesn't have all that many virtues beyond any other language for guaranteeing correctness. Ada remains the top dog for properly critical software. SPARK still does not have many proper challengers in the space.
May 16 2022
next sibling parent Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Tuesday, 17 May 2022 at 04:34:16 UTC, max haughton wrote:
 Other than memory safety rust doesn't have all that many 
 virtues beyond any other language for guaranteeing correctness.

 Ada remains the top dog for properly critical software. SPARK 
 still does not have many proper challengers in the space.
Rust is getting attention from people/academics who design/explore/research/implement tooling with a focus on correctness. They have the momentum at this point.
May 16 2022
prev sibling parent reply Paulo Pinto <pjmlp progtools.org> writes:
On Tuesday, 17 May 2022 at 04:34:16 UTC, max haughton wrote:
 On Tuesday, 17 May 2022 at 02:43:31 UTC, Ola Fosheim Grøstad 
 wrote:
 On Tuesday, 17 May 2022 at 01:57:44 UTC, forkit wrote:
 Only then will we have sufficient impetus to get off our 
 butts and do something about our extensive reliance on what 
 is arguably, the most unsafe language of all -> C.
Actually, C is one of the few languages in use where there are industtial verification solutions... Anyway, D has always followed C semantics and has always been system level. It never claimed not to be. People who care a lot about correctness now use Rust, compilers for such languages are implemented in Rust. In that narrow space Rust cannot be displaced in the next 10 years. Yet, Skia, Z3, LLVM and other big performance libraries will remain in C in the next 10 years. Nobody wants to rewrite those in other languages. Nobody can afford to build competitive free alternatives. So, C it is! ImportC done well allows D to benefit from the selfimposed restriction of being C-like that has been there from day 1. The only way to do it well is to do an integration that drills down to the level of the lexer, parser and AST.
Other than memory safety rust doesn't have all that many virtues beyond any other language for guaranteeing correctness. Ada remains the top dog for properly critical software. SPARK still does not have many proper challengers in the space.
Those who care for Ada, or correctness, are also interested in bringing Rust into the game: https://blog.adacore.com/adacore-and-ferrous-systems-joining-forces-to-support-rust https://www.autosar.org/news-events/details/autosar-investigates-how-the-programming-language-rust-could-be-applied-in-adaptive-platform-context/
May 16 2022
parent reply max haughton <maxhaton gmail.com> writes:
On Tuesday, 17 May 2022 at 05:22:57 UTC, Paulo Pinto wrote:
 On Tuesday, 17 May 2022 at 04:34:16 UTC, max haughton wrote:
 On Tuesday, 17 May 2022 at 02:43:31 UTC, Ola Fosheim Grøstad 
 wrote:
 [...]
Other than memory safety rust doesn't have all that many virtues beyond any other language for guaranteeing correctness. Ada remains the top dog for properly critical software. SPARK still does not have many proper challengers in the space.
Those who care for Ada, or correctness, are also interested into bringing Rust into the game, https://blog.adacore.com/adacore-and-ferrous-systems-joining-forces-to-support-rust https://www.autosar.org/news-events/details/autosar-investigates-how-the-programming-language-rust-could-be-applied-in-adaptive-platform-context/
I'm not surprised, but I also think Rust has a long way to go to really compete on technical grounds with at least some aspects of Ada. Whether Rust is useful or not depends on whether the program actually has to allocate memory or not.
May 17 2022
parent reply Paulo Pinto <pjmlp progtools.org> writes:
On Tuesday, 17 May 2022 at 14:46:41 UTC, max haughton wrote:
 On Tuesday, 17 May 2022 at 05:22:57 UTC, Paulo Pinto wrote:
 On Tuesday, 17 May 2022 at 04:34:16 UTC, max haughton wrote:
 On Tuesday, 17 May 2022 at 02:43:31 UTC, Ola Fosheim Grøstad 
 wrote:
 [...]
Other than memory safety rust doesn't have all that many virtues beyond any other language for guaranteeing correctness. Ada remains the top dog for properly critical software. SPARK still does not have many proper challengers in the space.
Those who care for Ada, or correctness, are also interested into bringing Rust into the game, https://blog.adacore.com/adacore-and-ferrous-systems-joining-forces-to-support-rust https://www.autosar.org/news-events/details/autosar-investigates-how-the-programming-language-rust-could-be-applied-in-adaptive-platform-context/
I'm not surprised, but I also think Rust has a long way to go to really compete on technical grounds with at least some aspects of Ada Whether rust is useful or not depends on whether the program has to actually allocate memory or not.
It certainly does have a lot of catching up to do with SPARK, and NVidia has chosen Ada instead of Rust exactly because of that. Yet there is money being thrown at the problem, and standards organizations interested in making it happen. It won't be there today, but it will be eventually, because they have one specific answer to "what do you use X for".
May 17 2022
next sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 5/17/2022 8:41 AM, Paulo Pinto wrote:
 It certainly does have to a lot to catch up with SPARK, and NVidia has chosen 
 Ada instead of Rust exactly because of that, yet there is money being thrown
out 
 at the problem, and standard organizations interested into making it happen.
 
 It won't be there today, but it will eventually, because they have one
specific 
 answer to "what you use X for".
I'm curious what feature(s) of Ada led to NVDA's selection.
May 17 2022
parent reply Paulo Pinto <pjmlp progtools.org> writes:
On Tuesday, 17 May 2022 at 20:16:45 UTC, Walter Bright wrote:
 On 5/17/2022 8:41 AM, Paulo Pinto wrote:
 It certainly does have to a lot to catch up with SPARK, and 
 NVidia has chosen Ada instead of Rust exactly because of that, 
 yet there is money being thrown out at the problem, and 
 standard organizations interested into making it happen.
 
 It won't be there today, but it will eventually, because they 
 have one specific answer to "what you use X for".
I'm curious what feature(s) of Ada led to NVDA's selection.
Easy, they did a whole talk on the matter, https://www.adacore.com/nvidia
May 17 2022
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 5/17/2022 2:11 PM, Paulo Pinto wrote:
 Easy, they did a whole talk on the matter,
 
 https://www.adacore.com/nvidia
Sweet! Thanks
May 18 2022
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 5/18/2022 12:32 AM, Walter Bright wrote:
 On 5/17/2022 2:11 PM, Paulo Pinto wrote:
 Easy, they did a whole talk on the matter,

 https://www.adacore.com/nvidia
Sweet! Thanks
I watched the video, thanks. The technical issues mentioned:

1. no buffer overflows
2. no undefined behavior
3. no integer overflow
4. no integer divide by zero
5. pre and post conditions
6. prover
7. zero overhead interfacing with C
8. easy mix&match of Ada and C code
9. ranged integers
10. C-like syntax

The prover seems to rely on proving that the pre- and post-conditions are met by the function. Some years ago I tried this out on SPARK, and found it was very limited (it was unable to deal with things like AND and OR in expressions). But perhaps that has improved over time.

D (in @safe code) has: 1, 5, 7, 8, 9

I know that @safe code eliminates a lot of undefined behavior, and we should check into eliminating all of it.

SPARK is unable to fulfill its goal without adding runtime checks, as the presentation talked about the need to use switches that eliminated those checks.
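For reference, item 5 in D; a minimal sketch using the short contract syntax:

```d
int isqrt(int n)
in (n >= 0, "negative input")                 // precondition
out (r; r * r <= n && (r + 1) * (r + 1) > n)  // postcondition
{
    import std.math : sqrt;
    return cast(int) sqrt(cast(double) n);
}

void main()
{
    assert(isqrt(10) == 3);  // contracts are checked unless compiled out
}
```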
May 18 2022
parent reply Walter Bright <newshound2 digitalmars.com> writes:
I did a quick look at D's undefined behavior, and nearly all of it is disallowed in @safe code.

It's not 100%, but we're in a strong position.
May 18 2022
next sibling parent reply Dukc <ajieskola gmail.com> writes:
On Thursday, 19 May 2022 at 00:29:46 UTC, Walter Bright wrote:
 I did a quick look at D's undefined behavior, and nearly all of 
 it is disallowed in @safe code.

 It's not 100%, but we're in a strong position.
One hole that remains: https://forum.dlang.org/thread/kazglmjwsihfewhscioe@forum.dlang.org According to a reply, the current language implementation does not cause anything special to happen, but it is still undefined behaviour in `@safe` code if we go by the spec.
May 19 2022
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 5/19/2022 12:30 AM, Dukc wrote:
 One hole that remains: 
 https://forum.dlang.org/thread/kazglmjwsihfewhscioe@forum.dlang.org
 
 According to a reply, the current language implementation does not cause 
 anything special to happen, but it is still undefined behaviour in `@safe` code 
 if we go by the spec.
I'm aware of that, but if the user selects "ignore the asserts" then he assumes the risk of what happens if the assert would have otherwise tripped. Just like if the user turns off array bounds checking. You want an option for it to continue and be defined behavior. Others have argued for this, too, but that makes it kinda pointless to have.
May 19 2022
parent reply Dukc <ajieskola gmail.com> writes:
On Thursday, 19 May 2022 at 18:39:51 UTC, Walter Bright wrote:
 On 5/19/2022 12:30 AM, Dukc wrote:
 One hole that remains: 
 https://forum.dlang.org/thread/kazglmjwsihfewhscioe@forum.dlang.org
 
 According to a reply, the current language implementation does 
 not cause anything special to happen, but it is still 
 undefined behaviour in `@safe` code if we go by the spec.
I'm aware of that, but if the user selects "ignore the asserts" then he assumes the risk of what happens if the assert would have otherwise tripped.
Agreed. But asserts are often used without regard for performance, because it's assumed they are removed from release code, which puts pressure to disable them in release builds.
 Just like if the user turns off array bounds checking.
We can do a bit better. If an assert trips, the program will go into an unknown state, and there's no telling what it will do. But that's still not quite the same as undefined behaviour. If the program trips an assert in `@safe` code, it probably won't cause a buffer overflow vulnerability, because `@safe` would almost certainly have detected a vulnerability if one existed. But if the spec says "undefined behaviour", the compiler is allowed to optimise so that it creates such a vulnerability - see the linked thread for an example. I know the current compilers do not do that, but the spec allows doing so in the future.
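A small sketch of the distinction (hypothetical code): with asserts compiled out, a bad index reaches the array, but `@safe` bounds checking still turns it into an error rather than a buffer overflow; only if a failed assert is treated as undefined behaviour could the optimiser legally drop that bounds check too.

```d
@safe int fetch(int[] a, size_t i)
{
    assert(i < a.length);  // removed by -release
    return a[i];           // still bounds-checked in @safe code --
                           // unless the spec's UB licence lets the
                           // optimiser assume the assert always held
}
```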
May 19 2022
next sibling parent reply Tejas <notrealemail gmail.com> writes:
On Thursday, 19 May 2022 at 20:17:36 UTC, Dukc wrote:
 On Thursday, 19 May 2022 at 18:39:51 UTC, Walter Bright wrote:
 On 5/19/2022 12:30 AM, Dukc wrote:
 [...]
I'm aware of that, but if the user selects "ignore the asserts" then he assumes the risk of what happens if the assert would have otherwise tripped.
Agreed. But asserts are often used without regard for performance, because it's assumed they are removed from release code, which puts pressure to disable them in release builds.
 Just like if the user turns off array bounds checking.
We can do a bit better. If an assert trips, the program will go into an unknown state, and there's no telling what it will do. But that's still not quite the same as undefined behaviour. If the program trips an assert in `@safe` code, it probably won't cause a buffer overflow vulnerability, because `@safe` would almost certainly have detected a vulnerability if one existed. But if the spec says "undefined behaviour", the compiler is allowed to optimise so that it creates such a vulnerability - see the linked thread for an example. I know the current compilers do not do that, but the spec allows doing so in the future.
Isn't the advice to use `enforce` for handling/verifying input in release builds and `assert` for development builds though? Yeah, it's violating DRY if a particular check needs to be done in both development and release, but then one can skip the `assert` and just do `enforce`, no?
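A small sketch of that division of labour (`average` is an invented example):

```d
import std.exception : enforce;

double average(const int[] xs)
{
    // input validation: throws an Exception, survives release builds
    enforce(xs.length > 0, "need at least one value");

    long sum = 0;
    foreach (x; xs)
        sum += x;

    const avg = cast(double) sum / xs.length;
    assert(avg >= int.min && avg <= int.max);  // internal invariant,
                                               // stripped by -release
    return avg;
}

void main()
{
    assert(average([1, 2, 3]) == 2.0);
}
```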
May 19 2022
next sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 5/19/2022 11:40 PM, Tejas wrote:
 Isn't the advice to use `enforce` for handling/verifying input in release
builds 
 and `assert` for development builds though?
 Yeah, it's violating DRY if a particular check needs to be done in both 
 development and release, but then one can skip the `assert` and just do 
 `enforce`, no?
Asserts are *not* for validating program input. Please do not use them for that. They are for checking that the program's logic is correct. If an assert is tripped, it is a bug in the program, not a problem with user input.
May 20 2022
parent reply forkit <forkit gmail.com> writes:
On Saturday, 21 May 2022 at 03:05:08 UTC, Walter Bright wrote:
 On 5/19/2022 11:40 PM, Tejas wrote:
 Isn't the advice to use `enforce` for handling/verifying input 
 in release builds and `assert` for development builds though?
 Yeah, it's violating DRY if a particular check needs to be 
 done in both development and release, but then one can skip 
 the `assert` and just do `enforce`, no?
Asserts are *not* for validating program input. Please do not use them for that. They are for checking that the program's logic is correct. If an assert is tripped, it is a bug in the program, not a problem with user input.
I don't agree that 'tripping an assert' == 'a bug in your program'. It may well ward off a potential bug (by asserting beforehand, to prevent a bug). But I can do this with exceptions too. The main reason I'd choose to catch exceptions, rather than use an assert, is in a situation where I want to *both* test and handle 'expected' conditions. I'd use an assert where I don't want to handle any 'unexpected' conditions. I rarely use assert in any case, as the software I develop is very user friendly ;-) But this argument is like the argument over whether you should use your left foot or your right foot to brake.
May 21 2022
next sibling parent forkit <forkit gmail.com> writes:
On Sunday, 22 May 2022 at 02:51:24 UTC, forkit wrote:
 I'd use an assert where I don't want to handle any 'unexpected' 
 conditions.
e.g. out of memory, no disk space left, etc...
May 21 2022
prev sibling parent Walter Bright <newshound2 digitalmars.com> writes:
On 5/21/2022 7:51 PM, forkit wrote:
 I don't agree, that 'tripping an assert' == 'a bug in your program'.
That's what they're designed for. If you want to use them for some other purpose, feel free, but you'll have to take responsibility for the results.
 e.g. out of memory. no disk space left, etc...
You're better off with: printf("fatal error\n"); exit(1);
May 21 2022
prev sibling parent reply forkit <forkit gmail.com> writes:
On Friday, 20 May 2022 at 06:40:30 UTC, Tejas wrote:
 Isn't the advice to use `enforce` for handling/verifying input 
 in release builds and `assert` for development builds though?
 Yeah, it's violating DRY if a particular check needs to be done 
 in both development and release, but then one can skip the 
 `assert` and just do `enforce`, no?
argh! I often forget that D removes assert in release mode :-( If only we had Rust style asserts. And by that, I specifically mean: "Assertions are always checked in both debug and release builds, and cannot be disabled. See debug_assert! for assertions that are not enabled in release builds by default." https://doc.rust-lang.org/std/macro.assert.html
May 22 2022
next sibling parent reply forkit <forkit gmail.com> writes:
On Sunday, 22 May 2022 at 11:49:27 UTC, forkit wrote:

How about a compile time option to NOT remove assert in release?
May 22 2022
parent reply Mike Parker <aldacron gmail.com> writes:
On Sunday, 22 May 2022 at 11:51:10 UTC, forkit wrote:
 On Sunday, 22 May 2022 at 11:49:27 UTC, forkit wrote:

 How about a compile time option to NOT remove assert in release?
`-release` is shorthand for disabling bounds checking outside of `@safe`, turning off contracts, and disabling asserts. These days, you can manage the first two with `-boundscheck` and `-check`, and then asserts will still be enabled.
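For example (flag spellings as documented for dmd; a sketch, not an exhaustive mapping):

```d
// build with:   dmd -O -inline -boundscheck=safeonly -check=invariant=off app.d
// rather than:  dmd -O -inline -release app.d
// to keep asserts enabled while still dropping the checks you choose
void main()
{
    int[] a = [1, 2, 3];
    assert(a.length == 3);  // survives, because -release was not used
}
```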
May 22 2022
parent forkit <forkit gmail.com> writes:
On Sunday, 22 May 2022 at 12:30:21 UTC, Mike Parker wrote:
 On Sunday, 22 May 2022 at 11:51:10 UTC, forkit wrote:
 On Sunday, 22 May 2022 at 11:49:27 UTC, forkit wrote:

 How about a compile time option to NOT remove assert in 
 release?
`-release` is shorthand for disabling bounds checking outside of `@safe`, turning off contracts, and disabling asserts. These days, you can manage the first two with `-boundscheck` and `-check`, and then asserts will still be enabled.
Oh. Useful info, thanks.
May 22 2022
prev sibling next sibling parent reply Siarhei Siamashka <siarhei.siamashka gmail.com> writes:
On Sunday, 22 May 2022 at 11:49:27 UTC, forkit wrote:
 On Friday, 20 May 2022 at 06:40:30 UTC, Tejas wrote:
 Isn't the advice to use `enforce` for handling/verifying input 
 in release builds and `assert` for development builds though?
 Yeah, it's violating DRY if a particular check needs to be 
 done in both development and release, but then one can skip 
 the `assert` and just do `enforce`, no?
argh! I often forget that D removes assert in release mode :-( If only we had Rust style asserts. And by that, I specifically mean: "Assertions are always checked in both debug and release builds, and cannot be disabled. See debug_assert! for assertions that are not enabled in release builds by default." https://doc.rust-lang.org/std/macro.assert.html
"enforce" in D is the same as "assert!" in Rust "assert" in D is the same as "debug_assert!" in Rust Looks like you are only unhappy about names, but all the necessary functionality is available.
May 22 2022
parent forkit <forkit gmail.com> writes:
On Sunday, 22 May 2022 at 15:47:05 UTC, Siarhei Siamashka wrote:
 "enforce" in D is the same as "assert!" in Rust
 "assert" in D is the same as "debug_assert!" in Rust

 Looks like you are only unhappy about names, but all the 
 necessary functionality is available.
'assert' comes naturally when making an 'assertion'.

assert(assert == assertion) // true

'enforce' does not. It just seems out of place to me.
May 22 2022
prev sibling parent reply Paul Backus <snarwin gmail.com> writes:
On Sunday, 22 May 2022 at 11:49:27 UTC, forkit wrote:
 On Friday, 20 May 2022 at 06:40:30 UTC, Tejas wrote:
 Isn't the advice to use `enforce` for handling/verifying input 
 in release builds and `assert` for development builds though?
 Yeah, it's violating DRY if a particular check needs to be 
 done in both development and release, but then one can skip 
 the `assert` and just do `enforce`, no?
argh! I often forget that D removes assert in release mode :-( If only we had Rust style asserts. And by that, I specifically mean: "Assertions are always checked in both debug and release builds, and cannot be disabled. See debug_assert! for assertions that are not enabled in release builds by default." https://doc.rust-lang.org/std/macro.assert.html
We actually have this in D, it's just written weirdly:

```d
// debug assertion
assert(condition);

// release assertion
if (!condition) assert(0);
```

It would probably be more intuitive if we could write release assertions as plain `assert`, and use `debug assert` for debug assertions. But the functionality is there.
May 22 2022
next sibling parent forkit <forkit gmail.com> writes:
On Monday, 23 May 2022 at 01:37:09 UTC, Paul Backus wrote:
 On Sunday, 22 May 2022 at 11:49:27 UTC, forkit wrote:
 [...]
We actually have this in D, it's just written weirdly:

```d
// debug assertion
assert(condition);

// release assertion
if (!condition) assert(0);
```

It would probably be more intuitive if we could write release assertions as plain `assert`, and use `debug assert` for debug assertions. But the functionality is there.
helpful info. thanks. Sadly, it is (as you suggest) rather counterintuitive that the same keyword gets removed in one case, and not the other.
May 22 2022
prev sibling next sibling parent reply Dukc <ajieskola gmail.com> writes:
On Monday, 23 May 2022 at 01:37:09 UTC, Paul Backus wrote:
 We actually have this in D, it's just written weirdly:

 ```d
 // debug assertion
 assert(condition);

 // release assertion
 if (!condition) assert(0);
 ```

 It would probably be more intuitive if we could write release 
 assertions as plain `assert`, and use `debug assert` for debug 
 assertions. But the functionality is there.
This is probably the way to go even now. Walter is not convinced we should have an option to remove `assert`s from release builds in a `@safe` way. So I think the most reasonable convention is:

```d
// debug assertion, use when you suspect performance impact
debug assert(condition);

// regular assertion. Should be on for most applications even in release
// builds, but can be omitted if you're willing to accept you don't have
// memory safety if an assert fails. Should be the most common assert.
assert(condition);

// Semi-strong assertion. Should be on for all applications that care
// about memory-safety at all. Use for custom bounds checking in
// `@trusted` or `@system` code.
version(D_NoBoundsChecks){} else if(!condition) assert(0);

// Strong assertion. Always on.
if(!condition) assert(0);
```

In fact, I think this is a good way to go even if Walter changes his mind.
May 23 2022
parent Guillaume Piolat <first.last gmail.com> writes:
On Monday, 23 May 2022 at 08:46:13 UTC, Dukc wrote:
 ```d
 // debug assertion, use when you suspect performance impact
 debug assert(condition);

 // regular assertion. Should be on for most applications even 
 in release builds, but can be omitted if you're willing to 
 accept you don't have memory safety if an assert fails. Should 
 be the most common assert.
 assert(condition);

 // Semi-strong assertion. Should be on for all applications 
 that care about memory-safety at all. Use for custom bounds 
 checking in `@trusted` or `@system` code.
 version(D_NoBoundsChecks){} else if(!condition) assert(0);

 // Strong assertion. Always on.
 if(!condition) assert(0);
 ```
Good tips. Using `if(!condition) assert(0);` for a release assertion seems reliable. If D ever changes this in favor of a more explicit syntax, it would need a deprecation period.
May 23 2022
prev sibling next sibling parent reply claptrap <clap trap.com> writes:
On Monday, 23 May 2022 at 01:37:09 UTC, Paul Backus wrote:
 On Sunday, 22 May 2022 at 11:49:27 UTC, forkit wrote:
 On Friday, 20 May 2022 at 06:40:30 UTC, Tejas wrote:
 Isn't the advice to use `enforce` for handling/verifying 
 input in release builds and `assert` for development builds 
 though?
 Yeah, it's violating DRY if a particular check needs to be 
 done in both development and release, but then one can skip 
 the `assert` and just do `enforce`, no?
argh! I often forget that D removes assert in release mode :-( If only we had Rust style asserts. And by that, I specifically mean: "Assertions are always checked in both debug and release builds, and cannot be disabled. See debug_assert! for assertions that are not enabled in release builds by default." https://doc.rust-lang.org/std/macro.assert.html
We actually have this in D, it's just written weirdly:

```d
// debug assertion
assert(condition);

// release assertion
if (!condition) assert(0);
```

It would probably be more intuitive if we could write release assertions as plain `assert`, and use `debug assert` for debug assertions. But the functionality is there.
That's a prime example of one of the main things that irritate me about D. For the sake of not adding a new keyword or something like that, a special case is added instead: asserts are removed in release mode unless it is assert(0). It's idiotic. assert(0) isn't even an assert, it's abort(). uuugh. Sometimes it feels like D is intent on finding the most ways to reuse the same keyword to mean different things in different contexts.
May 23 2022
next sibling parent Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Monday, 23 May 2022 at 09:45:21 UTC, claptrap wrote:
 assert(0) isn't even an assert, it's abort().
I agree that abort() and unreachable() should be different things. In proofs, ```assert(false)``` is a potential disaster, as you can deduce anything from it; kinda like claiming that what must be true is indeed not true at this point, so it can clearly never happen, and if it can never happen then anything before it can be whatever you fancy, as it clearly doesn't matter… Or something like that :-) I guess D is borrowing from MSVC's ```__assume(false)```? C++23 will provide [unreachable()](https://en.cppreference.com/w/cpp/utility/unreachable).
May 23 2022
prev sibling next sibling parent reply bauss <jj_1337 live.dk> writes:
On Monday, 23 May 2022 at 09:45:21 UTC, claptrap wrote:>
 Sometimes it feels like D is intent on finding the most ways to 
 reuse the same keyword to mean different things in different 
 contexts.
alias says hi
May 23 2022
parent Walter Bright <newshound2 digitalmars.com> writes:
On 5/23/2022 4:54 AM, bauss wrote:
 On Monday, 23 May 2022 at 09:45:21 UTC, claptrap wrote:>
 Sometimes it feels like D is intent on finding the most ways to reuse the same 
 keyword to mean different things in different contexts.
alias says hi
static was already taken
May 23 2022
prev sibling parent reply wjoe <invalid example.com> writes:
On Monday, 23 May 2022 at 09:45:21 UTC, claptrap wrote:
 [...]
 Asserts are removed in release mode unless it is assert(0). Its 
 idiotic.
Isn't that C behavior though?
May 23 2022
parent reply deadalnix <deadalnix gmail.com> writes:
On Monday, 23 May 2022 at 13:55:34 UTC, wjoe wrote:
 On Monday, 23 May 2022 at 09:45:21 UTC, claptrap wrote:
 [...]
 Asserts are removed in release mode unless it is assert(0). 
 Its idiotic.
Isn't that C behavior though?
C asserts are included or not based on various defines.
May 23 2022
next sibling parent reply forkit <forkit gmail.com> writes:
On Monday, 23 May 2022 at 14:07:32 UTC, deadalnix wrote:
 On Monday, 23 May 2022 at 13:55:34 UTC, wjoe wrote:
 On Monday, 23 May 2022 at 09:45:21 UTC, claptrap wrote:
 [...]
 Asserts are removed in release mode unless it is assert(0). 
 Its idiotic.
Isn't that C behavior though?
C asserts are included or not based on various defines.
It's rather odd that it's so much easier in C:

// uncomment to disable assert()
// #define NDEBUG

(whereas) .. So much mucking around in D. It's ridiculous.
May 23 2022
parent reply max haughton <maxhaton gmail.com> writes:
On Monday, 23 May 2022 at 22:05:23 UTC, forkit wrote:
 On Monday, 23 May 2022 at 14:07:32 UTC, deadalnix wrote:
 On Monday, 23 May 2022 at 13:55:34 UTC, wjoe wrote:
 On Monday, 23 May 2022 at 09:45:21 UTC, claptrap wrote:
 [...]
 Asserts are removed in release mode unless it is assert(0). 
 Its idiotic.
Isn't that C behavior though?
C asserts are included or not based on various defines.
It's rather odd, that it's so much easier in C. // uncomment to disable assert() // #define NDEBUG (whereas) .. So much mucking around in D. It's ridiculous.
It's a command line option versus a macro (likely supplied on the command line). It's just fuss over nothing.
May 23 2022
parent reply forkit <forkit gmail.com> writes:
On Monday, 23 May 2022 at 22:42:52 UTC, max haughton wrote:
 On Monday, 23 May 2022 at 22:05:23 UTC, forkit wrote:
 On Monday, 23 May 2022 at 14:07:32 UTC, deadalnix wrote:
 On Monday, 23 May 2022 at 13:55:34 UTC, wjoe wrote:
 On Monday, 23 May 2022 at 09:45:21 UTC, claptrap wrote:
 [...]
 Asserts are removed in release mode unless it is assert(0). 
 Its idiotic.
Isn't that C behavior though?
C asserts are included or not based on various defines.
It's rather odd, that it's so much easier in C. // uncomment to disable assert() // #define NDEBUG (whereas) .. So much mucking around in D. It's ridiculous.
It's a command line option versus a macro (likely supplies on the command line). It's just fuss over nothing.
I don't fuss over 'nothing'. First, I have to 'remember' that assert is removed in -release, but that depends on how you've actually used it. Then, in order to leave my asserts in production code, I need to 'remember' not to compile with -release, but rather with -someothercrap I can't remember at the moment. So while it's not the end of the world, it's also not 'nothing'.
May 23 2022
next sibling parent reply Siarhei Siamashka <siarhei.siamashka gmail.com> writes:
On Tuesday, 24 May 2022 at 06:12:40 UTC, forkit wrote:
 First, I have to 'remember' that assert is removed in -release, 
 but that depends on how you've actually used it.

 Then, in order to leave my assert's in production code, I need 
 to 'remember' not to compile with -release, but rather 
 -someothercrap I can't remember at the moment.
Relying on command line options to make fishy code work the way you want is bad style. The logic of your code should be correct regardless of the command line options used. You just need to remember to use 'enforce' instead of 'assert'. People coming from a C background normally know that they can't count on asserts remaining in the compiled binaries and don't use them for checks that shouldn't be omitted. But if you are coming from Rust, then you may indeed find the 'assert' vs. 'enforce' naming confusing. But I doubt that 'assert' makes the D language unpopular. It's getting way too much attention here ;-)
May 23 2022
parent reply forkit <forkit gmail.com> writes:
On Tuesday, 24 May 2022 at 06:55:52 UTC, Siarhei Siamashka wrote:
 
 ....
 You just need to remember to use 'enforce' instead of 'assert'. 
 ....
It's not correct to suggest that these are simply interchangeable. They may appear to be similar, but they are two very different things. For example (to just make that point):

```d
nothrow void checkSomething(bool checkThis)
{
    //enforce(checkThis == true); // nope! enforce throws an exception.
    assert(checkThis == true); // ok.
}
```
May 24 2022
parent Siarhei Siamashka <siarhei.siamashka gmail.com> writes:
On Tuesday, 24 May 2022 at 09:12:02 UTC, forkit wrote:
 
 You just need to remember to use 'enforce' instead of 
 'assert'. ....
It's not correct to suggest that these are simply interchangeable.
Of course they are not. Otherwise there would be no need for changing 'assert' to 'enforce' in the code in the first place.
 For example (to just make that point):

 nothrow void checkSomething(bool checkThis)
 {
     //enforce(checkThis == true); // nope! enforce throws an 
 exception.
     assert(checkThis == true); // ok.
 }
Yes, 'enforce' is not usable in some cases, but in many cases it is. It could probably be changed to abort when used in 'nothrow' functions instead of failing to compile (yes, I understand that this is controversial). Or something with a different name could be introduced to cover this case. Still, my point is that having code that behaves correctly only when used together with some special command line option is not a great idea. You never know who is going to compile your code and what kind of options they are going to use to hurt themselves. And when they do, it's you who has to provide support and do the necessary detective work to even figure out what exactly happened.
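For instance, here's a minimal sketch of a nothrow-friendly release check built on the assert(0) special case (`require` is a hypothetical helper, not a Phobos API):

```d
// Hypothetical helper: the check is kept even with -release (assert(0)
// is never removed) and is usable in nothrow code, because a failed
// check raises an Error rather than an Exception.
nothrow @safe void require(bool condition, string msg = "check failed")
{
    if (!condition)
        assert(0, msg); // throws AssertError in debug, halts with -release
}

nothrow void checkSomething(bool checkThis)
{
    require(checkThis); // fine in nothrow code, survives release builds
}

void main()
{
    checkSomething(true);
}
```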
May 24 2022
prev sibling next sibling parent reply max haughton <maxhaton gmail.com> writes:
On Tuesday, 24 May 2022 at 06:12:40 UTC, forkit wrote:
 On Monday, 23 May 2022 at 22:42:52 UTC, max haughton wrote:
 On Monday, 23 May 2022 at 22:05:23 UTC, forkit wrote:
 On Monday, 23 May 2022 at 14:07:32 UTC, deadalnix wrote:
 On Monday, 23 May 2022 at 13:55:34 UTC, wjoe wrote:
 On Monday, 23 May 2022 at 09:45:21 UTC, claptrap wrote:
 [...]
 Asserts are removed in release mode unless it is 
 assert(0). Its idiotic.
Isn't that C behavior though?
C asserts are included or not based on various defines.
It's rather odd, that it's so much easier in C. // uncomment to disable assert() // #define NDEBUG (whereas) .. So much mucking around in D. It's ridiculous.
It's a command line option versus a macro (likely supplies on the command line). It's just fuss over nothing.
I don't fuss over 'nothing'. First, I have to 'remember' that assert is removed in -release, but that depends on how you've actually used it. Then, in order to leave my assert's in production code, I need to 'remember' not to compile with -release, but rather -someothercrap I can't remember at the moment. So while it's not the end of the world, it's also not 'nothing'.
This is very close to nothing when it comes to actually shipping code. By the time you have something in "production" you will have bigger things to worry about than implementing your policy for assertions, i.e. reading the manual. The idea of a release build is something of a mistake anyway; it's usually not exactly right for a given purpose, so you're better off specifying exactly what you want instead. I use the flag more often on compiler explorer than anything else.
May 24 2022
parent reply Siarhei Siamashka <siarhei.siamashka gmail.com> writes:
On Tuesday, 24 May 2022 at 07:02:29 UTC, max haughton wrote:
 The idea of a release build is something of a mistake anyway, 
 it's usually not exactly right for a given purpose so you're 
 better off specifying exactly what you want instead anyway.
I strongly disagree with this opinion. When shipping a product as source code, end users may use various wild combinations of compiler options to build it. This may result in sub-optimal performance if some of the important options are forgotten, or in program misbehavior if it gets miscompiled by some super-duper experimental options. It's also difficult to test all possible permutations of the available build options. So having the '-release' option, which provides reasonable and mostly optimal defaults, is very useful. In reality, https://en.wikipedia.org/wiki/Safety_in_numbers is also a thing. If the majority of users use the same build options, then they are likely to encounter the same bugs and/or performance problems in the compiler, report these problems, and have them resolved. Autovectorization and LTO are somewhat immature and controversial, but if they were rock solid and never regressed anything, then I would like to see them included as part of the default '-release' options bundle too (that's of course a big IF). When developing free open source software in C/C++, I had to deal with end users or distro maintainers picking bad compilation options on more than one occasion. In my experience, fine-grained individual optimization options to enable or disable something are mostly useful for troubleshooting compiler bugs. Considering what I said above, it's not surprising that D is also often misconfigured on various programming competition websites (which is probably not a very important use case, but it still shows the language in a bad light):

* https://codeforces.com/blog/entry/101509#comment-901360
* https://discuss.codechef.com/t/gdc-compiler-settings-on-codechef/95359
May 24 2022
parent reply Adam D Ruppe <destructionator gmail.com> writes:
On Tuesday, 24 May 2022 at 09:05:57 UTC, Siarhei Siamashka wrote:
 So having the '-release' option, which provides reasonable and 
 mostly optimal defaults is very useful.
It is neither reasonable nor optimal. -release enables a random set of *bad* switches that will HURT your code. It removes all the default safety features in the language for no actual benefit! Seriously, *never* use it. It should be removed. If you want optimization, -release doesn't do this. -O does.
May 24 2022
parent reply Siarhei Siamashka <siarhei.siamashka gmail.com> writes:
On Tuesday, 24 May 2022 at 11:30:28 UTC, Adam D Ruppe wrote:
 On Tuesday, 24 May 2022 at 09:05:57 UTC, Siarhei Siamashka 
 wrote:
 So having the '-release' option, which provides reasonable and 
 mostly optimal defaults is very useful.
It is neither reasonable nor optimal. -release enables a random set of *bad* switches that will HURT your code. It removes all the default safety features in the language for no actual benefit!
I think that you are barking up the wrong tree. One of the valid criticisms of D language is that it is not @safe by default. A workaround is to add "@safe:" at the top of the source files to achieve this. Does this resolve your problems? Right now the '-release' option disables bounds checking in @system code and this is a **necessary** escape hatch to keep the D language competitive with C++ and the other system programming languages. Many safe programming languages still allow going down the unsafe route in the performance-critical parts of code to disable bounds checking and the other expensive safety features. For example, you can have a look at https://doc.rust-lang.org/book/ch19-01-unsafe-rust.html
 Seriously, *never* use it. It should be removed.
What would be your suggested replacement for this very important functionality? But again, please first check the @safe attribute before suggesting anything really disruptive. Moreover, the concept of 'release' builds exists in many other programming languages and removing it from D would be weird.
May 24 2022
next sibling parent reply Adam D Ruppe <destructionator gmail.com> writes:
On Tuesday, 24 May 2022 at 12:10:36 UTC, Siarhei Siamashka wrote:
 One of the valid criticisms of D language is that it is not 
 @safe by default.
I don't care about @safe. I'm talking real-world safety, things like automatic bounds checks.
 Right now the '-release' option disables bounds checking in 
 @system code and this is a **necessary** escape hatch to keep D 
 language competitive with C++ and the other system programming 
 languages.
False. You can put `.ptr` in strategic places to disable it locally instead of bludgeoning it with a superglobal switch. Even if you want to disable it globally (and you don't, unless you're too incompetent to be allowed to release anything to the public), there's the -boundscheck=off switch for that. Bundling it with a bunch of other things and hiding it behind a harmless-sounding "-release" thing is criminally negligent in a sane world.
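To illustrate, a minimal sketch of the `.ptr` escape hatch (made-up function and data): validate the length once, then skip the per-access checks in the hot loop while everything else stays bounds-checked.

```d
// .ptr indexing bypasses the bounds check for that one access, so it
// is @system and only sane after the index range has been validated.
@system int sumFirstN(const int[] arr, size_t n)
{
    assert(n <= arr.length); // validate once, up front
    int total = 0;
    foreach (i; 0 .. n)
        total += arr.ptr[i]; // unchecked access inside the loop
    return total;
}

void main()
{
    assert(sumFirstN([1, 2, 3, 4], 3) == 6);
}
```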
May 24 2022
parent reply Siarhei Siamashka <siarhei.siamashka gmail.com> writes:
On Tuesday, 24 May 2022 at 12:19:56 UTC, Adam D Ruppe wrote:
 On Tuesday, 24 May 2022 at 12:10:36 UTC, Siarhei Siamashka 
 wrote:
 One of the valid criticisms of D language is that it is not 
 @safe by default.
I don't care about @safe. I'm talking real-world safety, things like automatic bounds checks.
Please try it nevertheless. You will have automatic bounds checks in your @safe code even with the '-release' switch. That's the whole point. The @safe code remains safe and the @system or @trusted code remains fast. The choice is up to me to decide what I really want in my code on a per-function basis.
 Right now the '-release' option disables bounds checking in 
  system code and this is a **necessary** escape hatch to keep 
 D language competitive with C++ and the other system 
 programming languages.
False. You can put `.ptr` in strategic places to disable it locally instead of
If I didn't care about convenience, then I would be probably using Rust instead of D.
 bludgeoning it with a superglobal switch.
This isn't what I'm doing. Please educate yourself about the @safe attribute.
 Even if you want to disable it globally (and you don't, unless 
 you're too incompetent to be allowed to release anything to the 
 public), there's the -boundscheck=off switch for that. Bundling 
 it with a bunch of other things and hiding it behind a 
 harmless-sounding "-release" thing is criminally negligent in a 
 sane world.
I asked you for your suggested alternative. This is **your** suggestion. And it's **me**, who is supposed to criticize your suggestion. Thanks for doing my job, but what was the purpose of posting this paragraph? Are there any other suggestions that you actually consider reasonable?
May 24 2022
parent reply Adam D Ruppe <destructionator gmail.com> writes:
On Tuesday, 24 May 2022 at 12:51:32 UTC, Siarhei Siamashka wrote:
 Please educate yourself about the @safe attribute.
Please refrain from uninformed personal attacks. You might notice I said "default safety features". Here are the facts. D's default: significant safety *by default*, eliminating over 90% of C-specific bugs (which btw are a minority of actual bugs; it is important to keep that in perspective). Where it is necessary to bypass these important checks, which btw is a small minority of places, you can use .ptr locally, after verifying correctness, to disable it selectively while keeping safety by default. By contrast, once you choose to use -release, you get *security holes* by default, which is the opposite of what you want when actually releasing code to real users! You can then opt back into minimum safety checks (which you want in a vast majority of places) on a case-by-case basis by adding `@safe` (...until some poor user is told to use -boundscheck=off, but that's on them; at least that switch sounds like you should think twice, unlike -release, which sounds harmless while being anything but). The compiler is likely to fight you throughout this process as other library authors must also remember to opt into it. A programming language ought to align with safety and common use. -release does the opposite of this, with very little warning. Never using it, on the other hand, aligns with these virtues, while still letting you very conveniently bypass checks when it is genuinely necessary and beneficial.
May 24 2022
parent reply Siarhei Siamashka <siarhei.siamashka gmail.com> writes:
On Tuesday, 24 May 2022 at 13:43:07 UTC, Adam D Ruppe wrote:
 On Tuesday, 24 May 2022 at 12:51:32 UTC, Siarhei Siamashka 
 wrote:
 Please educate yourself about the @safe attribute.
Please refrain from uninformed personal attacks.
Says the guy who proclaimed "I don't care about @safe". Please understand that we are just not on the same page until you have a sufficient understanding of the @safe attribute, the reasons why it exists, and its interaction with the '-release' command line option. That's why I asked you to check it out.
 You might notice I said "default safety features". Here's the 
 facts.

 D's default: significant safety *by default*,
Do you mean that @system code with bounds checking enabled provides "significant safety"? I don't think that this is enough, but this is just my opinion and you are always free to disagree. I think that the other programming languages are setting the bar much higher than that.
 Where it is necessary to bypass these important checks, which 
 btw is a small minority of places, you can use .ptr locally, 
 after verifying correctness, to disable it selectively while 
 keeping safety by default.
There are multiple problems with this approach. The most severe of them is that the ".ptr" construct does not provide bounds checking in debug builds. Convenience is a major factor too, and despite your claims I don't agree that ".ptr" is convenient. I personally don't see any reason to use it.
 By contrast, once you choose to use -release, you get *security 
 holes* by default, which is the opposite of what you want when 
 actually releasing code to real users! You can then opt back 
 into minimum safety checks (which you want in a vast majority 
 of places) on a case-by-case basis by adding `@safe`
Did you even read my replies? You got everything backwards. I'm in favor of having everything @safe by default and then opting out on a case-by-case basis by adding @trusted where necessary. This works very nicely with the current '-release' option. Whoever implemented it this way did a great job! I remind you that there's no obligation for us to agree. And no, having a different opinion is not a personal attack.
 The compiler is likely to fight you throughout this process as
 other library authors must also remember to opt into it.
Yes, not all libraries are fully compatible with @safe and this is probably something that could be improved. Thankfully my use cases don't depend on other libraries at the moment.
 A programming language ought to align with safety and common 
 use. -release does the opposite of this, with very little 
 warning.
See above. We just have a major disagreement here.
 Never using it, on the other hand, aligns with these virtues, 
 while still letting you very conveniently bypass checks when it 
 is genuinely necessary and beneficial.
Your standards of what is "convenient" are obviously very different from mine.
May 25 2022
parent reply Alexandru Ermicioi <alexandru.ermicioi gmail.com> writes:
On Wednesday, 25 May 2022 at 08:34:49 UTC, Siarhei Siamashka 
wrote:
 I remind you that there's no obligation for us to agree. And 
 no, having a different opinion is not a personal attack.
Asking someone to educate himself, implying that the person doesn't know anything, can be treated as an attack, though. We should not have the release switch as it is now, due to those security holes mentioned. Probably the best thing would be to not turn off some of the security features, such as bounds checking, in release mode. I.e. keep optimizations enabled, as well as security checks that are expected to be beneficial even in release binaries.
May 25 2022
parent reply Siarhei Siamashka <siarhei.siamashka gmail.com> writes:
On Wednesday, 25 May 2022 at 13:34:35 UTC, Alexandru Ermicioi 
wrote:
 We should not have release switch as it is now due to those 
 security holes mentioned.
That's merely Adam's claim. He is trying very aggressively to "save" me from some non-existent problems without realizing that I'm just not using D language in the same way as he does. He is too busy being engaged in some sort of shadowboxing against himself and is not paying attention to my explanations. To sum it up. The '-release' switch doesn't disable bounds checking in @safe parts of the code. So we are fine as long as the majority of code in a project is marked as @safe. Rather than removing or changing the '-release' switch, I think that a much better idea is to migrate more code and libraries to @safe and my understanding is that D language is moving in that direction. Maybe Walter can confirm or deny this? Regarding the name of this topic: if the '-release' switch removal idea gains traction, it will be a strong reason for me to quit. Performance parity with C++ without unreasonable extra effort is very high on my priority list. If extra effort is unavoidable and D loses its rapid development appeal, then there's always Rust as a more mainstream alternative.
May 25 2022
next sibling parent Bruce Carneal <bcarneal gmail.com> writes:
On Wednesday, 25 May 2022 at 16:17:06 UTC, Siarhei Siamashka 
wrote:
 On Wednesday, 25 May 2022 at 13:34:35 UTC, Alexandru Ermicioi 
 wrote:
 We should not have release switch as it is now due to those 
 security holes mentioned.
That's merely the Adam's claim. He is trying to very aggressively "save" me from some non-existing problems without realizing that I'm just not using D language in the same way as he does. He is too busy being engaged in some sort of shadowboxing against himself and is not paying attention to my explanations.
Adam has "saved" many dlang programmers with his pragmatic and real-world-safety oriented advice. He is the most prolific, and one of the most respected, dlang contributors in the discord threads. His book, https://www.amazon.com/D-Cookbook-Adam-D-Ruppe/dp/1783287217 , was one of the sources I referenced when onboarding. His library work, https://github.com/adamdruppe/arsd , is very well regarded.
 To sum it up. The '-release' switch doesn't disable bounds 
 checking in @safe parts of the code. So we are fine as long as 
 the majority of code in a project is marked as @safe. Rather 
 than removing or changing the '-release' switch, I think that a 
 much better idea is to migrate more code and libraries to @safe 
 and my understanding is that D language is moving in that 
 direction. Maybe Walter can confirm or deny this?
D gives you a lot of options wrt @safe-ty. My personal preference, probably shared by some others, is that the defaults should favor correctness and convenience, with all else being opt-in. I'm quite concerned with ultimate performance (real-time video processing) and yet have found it easy to employ D in a very safe/convenient form for almost all my code (@safe, immutable/const, pure, gc, throw) while using dcompute, __vector, @trusted, .ptr and the like when I need to be on the metal.
 Regarding the name of this topic. If the '-release' switch 
 removal idea gains traction, it will be a strong reason for me 
 to quit. Performance parity with C++ without unreasonable extra 
 efforts is very high on my priority list. If extra efforts are 
 unavoidable and D loses its rapid development appeal, then 
 there's always Rust as a more mainstream alternative.
The -release switch activates a combination of finer-grained controls. You can put together whatever combination you wish in your build scripts. I don't know how a command-line-fixable inconvenience compares to those you'd experience if you decamp to the Rust world, but if you do, I'd like to hear your take on it.
May 25 2022
prev sibling next sibling parent reply Alexandru Ermicioi <alexandru.ermicioi gmail.com> writes:
On Wednesday, 25 May 2022 at 16:17:06 UTC, Siarhei Siamashka 
wrote:
 On Wednesday, 25 May 2022 at 13:34:35 UTC, Alexandru Ermicioi 
 wrote:
 We should not have release switch as it is now due to those 
 security holes mentioned.
That's merely the Adam's claim. He is trying to very aggressively "save" me from some non-existing problems without realizing that I'm just not using D language in the same way as he does. He is too busy being engaged in some sort of shadowboxing against himself and is not paying attention to my explanations.
Nah, not just Adam's. There are other people who also complained about disabled bounds checking in release mode, as well as disabled contracts. So the opinion on release mode now leans towards more safety, even for @system code.
 Regarding the name of this topic. If the '-release' switch 
 removal idea gains traction, it will be a strong reason for me 
 to quit. Performance parity with C++ without unreasonable extra 
 efforts is very high on my priority list. If extra efforts are 
 unavoidable and D loses its rapid development appeal, then 
 there's always Rust as a more mainstream alternative.
I do agree that it should not be removed, but I'm for a change in what this switch turns on and off (more safety even for @system code). It is quite a nice switch for turning on all the default recommendations, without the need to scurry through all the dmd options to find which ones should be turned on for release. That's actually what I really hated in C++ compilers as well as makefiles. So many options, words, and magic, even for simple projects, that it just blows a newbie's mind, and you end up with a copy-pasted makefile from somewhere in order to start coding.
May 25 2022
parent reply zjh <fqbqrr 163.com> writes:
On Wednesday, 25 May 2022 at 18:29:02 UTC, Alexandru Ermicioi 
wrote:
 complained at disabled bounds checking in release mode as well 
 as disabled contracts. So the opinion for release mode now 
 leans towards more safety even for system code.
Maybe there should be `fine switches` to meet the needs of different people. For example, like `VIM`, which has many `switches`.
May 25 2022
parent reply Alexandru Ermicioi <alexandru.ermicioi gmail.com> writes:
On Thursday, 26 May 2022 at 01:25:08 UTC, zjh wrote:
 On Wednesday, 25 May 2022 at 18:29:02 UTC, Alexandru Ermicioi 
 wrote:
 complained at disabled bounds checking in release mode as well 
 as disabled contracts. So the opinion for release mode now 
 leans towards more safety even for system code.
Maybe there should be a `fine switch` to meet the needs of different people . For example,Like `VIM` which has many `switches`.
There are already lots of 'fine' switches. The thing with release is that it should enable/configure switches in dmd to produce a release binary for user use. I'm disappointed that it only disables bounds checking. It should also enable all reasonable optimizations to make the release binary fast and optimized in size (LTO perhaps?). Basically, if you don't want to fiddle with all the micro-optimization flags, you should be able to use some default profile for release, which is denoted by the release switch. Best regards, Alexandru.
May 29 2022
parent zjh <fqbqrr 163.com> writes:
On Sunday, 29 May 2022 at 07:24:21 UTC, Alexandru Ermicioi wrote:
 The thing with release is that it should enable/configure 
 switches in dmd to produce a release binary for user use. I'm 
 disappointed that it only disables bounds checking. It should 
 also enable all reasonable optimizations to make release binary 
 fast and optimized in size (lto perhaps?)
There should be both large switches and small switches. Configurability is important. The problem with `release` is its `implementation` not matching its `name`.
May 29 2022
prev sibling parent reply Adam D Ruppe <destructionator gmail.com> writes:
On Wednesday, 25 May 2022 at 16:17:06 UTC, Siarhei Siamashka 
wrote:
 I think that a much better idea is to migrate more code
You're right, we're not on the same page. I am talking about the D language we have today, with code we have today, making decisions in the here-and-now. You're talking about a fantasy.
 Regarding the name of this topic. If the '-release' switch 
 removal idea gains traction, it will be a strong reason for me 
 to quit.
lol you can still use -boundscheck=safeonly -check=none to get the same result. -release is just a shorthand for those. (of course, if you want maximum performance, you want to use gdc -O instead of any of this other stuff)
May 25 2022
parent reply "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Wed, May 25, 2022 at 07:25:43PM +0000, Adam D Ruppe via Digitalmars-d wrote:
 On Wednesday, 25 May 2022 at 16:17:06 UTC, Siarhei Siamashka wrote:
[...]
 Regarding the name of this topic. If the '-release' switch removal
 idea gains traction, it will be a strong reason for me to quit.
lol you can still use -boundscheck=safeonly -check=none to get the same result. -release is just a shorthand for those. (of course, if you want maximum performance, you want to use gdc -O instead of any of this other stuff)
FWIW, after the first year or so of using D, I stopped using -release. These days, I just use `ldc2 -O2` which gives me better performance than any silly bounds-check-removing nonsense like -release. We're not living in the 70's anymore; the few nanoseconds you save by skipping a lousy bounds check is just not worth the nightmare of a buffer overflow exploit on a customer production environment. T -- Be in denial for long enough, and one day you'll deny yourself of things you wish you hadn't.
May 25 2022
parent Paulo Pinto <pjmlp progtools.org> writes:
On Wednesday, 25 May 2022 at 19:37:19 UTC, H. S. Teoh wrote:
 On Wed, May 25, 2022 at 07:25:43PM +0000, Adam D Ruppe via 
 Digitalmars-d wrote:
 On Wednesday, 25 May 2022 at 16:17:06 UTC, Siarhei Siamashka 
 wrote:
[...] FWIW, after the first year or so of using D, I stopped using -release. These days, I just use `ldc2 -O2` which gives me better performance than any silly bounds-check-removing nonsense like -release. We're not living in the 70's anymore; the few nanoseconds you save by skipping a lousy bounds check is just not worth the nightmare of a buffer overflow exploit on a customer production environment. T
Even in the 60's, leaving out bounds checking was considered a dumb idea; naturally, the C school of thought took another path.
 Many years later we asked our customers whether they wished us 
 to provide an option to switch off these checks in the 
 interests of efficiency on production runs. Unanimously, they 
 urged us not to--they already knew how frequently subscript 
 errors occur on production runs where failure to detect them 
 could be disastrous. I note with fear and horror that even in 
 1980, language designers and users have not learned this 
 lesson. In any respectable branch of engineering, failure to 
 observe such elementary precautions would have long been 
 against the law.
C.A.R. Hoare in his 1980 Turing Award speech.
May 25 2022
prev sibling parent Paul Backus <snarwin gmail.com> writes:
On Tuesday, 24 May 2022 at 12:10:36 UTC, Siarhei Siamashka wrote:
 Seriously, *never* use it. It should be removed.
What would be your suggested replacement for this very important functionality? But again, please first check safe attribute before suggesting anything really disruptive.
The `-check`, `-checkaction`, and `-boundscheck` switches can do everything `-release` does. They also give you more fine-grained control, so you can, e.g., disable assertions without also disabling array bounds checking.
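For example, a build that drops asserts but keeps array bounds checking might look like this (illustrative invocations; consult `dmd -h` for the exact syntax of your compiler version):

```
dmd -O -inline -check=assert=off app.d      # asserts off, bounds checks on
dmd -O -inline -boundscheck=safeonly app.d  # bounds checks in @safe code only
```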
May 24 2022
prev sibling parent reply Adam D Ruppe <destructionator gmail.com> writes:
On Tuesday, 24 May 2022 at 06:12:40 UTC, forkit wrote:
 Then, in order to leave my assert's in production code, I need 
 to 'remember' not to compile with -release, but rather 
 -someothercrap I can't remember at the moment.
Never ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever use -release. It should be removed, or at least renamed to -add-random-security-holes-for-zero-benefit. It is a terrible switch that does random bad things.
May 24 2022
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 5/24/2022 4:26 AM, Adam D Ruppe wrote:
 It is a terrible switch that does random bad things.
Back in the olden daze, I've seen magazine compiler benchmark articles trash various compilers for poor runtime performance. It nearly always boiled down to the journalist not using the right switches for release builds. So I'm leery of not being able to turn off the runtime checks to get max performance. Besides, it provides a way of accurately measuring how much the runtime checks are costing.
May 25 2022
next sibling parent reply "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Wed, May 25, 2022 at 11:56:03AM -0700, Walter Bright via Digitalmars-d wrote:
 On 5/24/2022 4:26 AM, Adam D Ruppe wrote:
 It is a terrible switch that does random bad things.
Back in the olden daze, I've seen magazine compiler benchmark articles trash various compilers for poor runtime performance. It nearly always boiled down to the journalist not using the right switches for release builds. So I'm leery of not being able to turn off the runtime checks to get max performance.
Unfortunately, we're not living in the olden daze anymore. We're living in days of widespread exploitation of out-of-bounds memory accesses, buffer overflows, and other such things by malicious entities. These days, what the internet equivalent of magazines trash is no longer poor raw performance, but the lack of security features that prevent these sorts of exploits. Slices and safe were the right direction towards the future; -release disabling all bounds checks is regressing towards bygone days that are better left forgotten. Unfortunately, -release negates much of the benefits the former gives us.
 Besides, it provides a way of accurately measuring how much the
 runtime checks are costing.
I wouldn't be against an explicit switch to disable bounds checks for when you want to measure this. But it should not be the default behaviour of -release. It should be called -disable-all-checks or something along those lines. In this day and age, nobody should release software without bounds checks anymore. We're no longer living in the 70's. Just browse through the latest CVEs and see how frequently the lack of bounds checks in C/C++ leads to all sorts of recurring security holes and exploits. Plugging those holes matter far more than the bragging rights of "winning" some petty benchmarks. T -- It is of the new things that men tire --- of fashions and proposals and improvements and change. It is the old things that startle and intoxicate. It is the old things that are young. -- G.K. Chesterton
May 25 2022
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 5/25/2022 12:19 PM, H. S. Teoh wrote:
 I wouldn't be against an explicit switch to disable bounds checks for
 when you want to measure this.  But it should not be the default
 behaviour of -release.
It isn't. To turn off bounds checks, you'll need -noboundscheck. -release just turns off asserts.
May 25 2022
next sibling parent Sergey <kornburn yandex.ru> writes:
On Wednesday, 25 May 2022 at 21:47:38 UTC, Walter Bright wrote:
 On 5/25/2022 12:19 PM, H. S. Teoh wrote:
 I wouldn't be against an explicit switch to disable bounds 
 checks for
 when you want to measure this.  But it should not be the 
 default
 behaviour of -release.
It isn't. To turn off bounds check, you'll need -noboundscheck -release just turns off asserts.
Exactly what competitive programmers do for better performance. Flags for Codeforces, for example:

```
dmd -L/STACK:268435456 -version=ONLINE_JUDGE -O -release -inline -noboundscheck {file}
```
May 25 2022
prev sibling parent reply forkit <forkit gmail.com> writes:
On Wednesday, 25 May 2022 at 21:47:38 UTC, Walter Bright wrote:
 On 5/25/2022 12:19 PM, H. S. Teoh wrote:
 I wouldn't be against an explicit switch to disable bounds 
 checks for
 when you want to measure this.  But it should not be the 
 default
 behaviour of -release.
It isn't. To turn off bounds check, you'll need -noboundscheck -release just turns off asserts.
Hhh? You mean that's all -release does? Just turn off asserts? Nothing else (whatsoever)?
May 25 2022
parent reply max haughton <maxhaton gmail.com> writes:
On Wednesday, 25 May 2022 at 23:16:35 UTC, forkit wrote:
 On Wednesday, 25 May 2022 at 21:47:38 UTC, Walter Bright wrote:
 On 5/25/2022 12:19 PM, H. S. Teoh wrote:
 I wouldn't be against an explicit switch to disable bounds 
 checks for
 when you want to measure this.  But it should not be the 
 default
 behaviour of -release.
It isn't. To turn off bounds check, you'll need -noboundscheck -release just turns off asserts.
Hhh? You mean that's all -release does? Just turn off asserts? Nothing else (whatsoever)?
https://d.godbolt.org/z/MjPe35Maz

-release does turn off bounds checking.
May 25 2022
next sibling parent Siarhei Siamashka <siarhei.siamashka gmail.com> writes:
On Wednesday, 25 May 2022 at 23:34:30 UTC, max haughton wrote:
 https://d.godbolt.org/z/MjPe35Maz

 release does turn off bounds checking.
But it does not turn off bounds checking if you add "@safe:" at the top of your source file. This is a very important detail! In release builds we have safety in @safe code and full performance in @trusted/@system code (which needs much more careful review as a safeguard against bugs). In debug builds we still have bounds checks in the @trusted/@system code too, which is useful for testing/debugging.
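A minimal sketch of that behavior (hypothetical file, built with `dmd -release app.d`): the @safe function keeps its bounds check, the @system one loses it.

```d
@safe int firstChecked(int[] arr)
{
    return arr[0]; // still bounds-checked under -release
}

@system int firstUnchecked(int[] arr)
{
    return arr[0]; // bounds check removed by -release
}

void main()
{
    int[] empty;
    // firstChecked(empty) would throw core.exception.RangeError;
    // firstUnchecked(empty) is memory-unsafe under -release.
}
```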
May 25 2022
prev sibling parent reply forkit <forkit gmail.com> writes:
On Wednesday, 25 May 2022 at 23:34:30 UTC, max haughton wrote:
 On Wednesday, 25 May 2022 at 23:16:35 UTC, forkit wrote:
 On Wednesday, 25 May 2022 at 21:47:38 UTC, Walter Bright wrote:
 On 5/25/2022 12:19 PM, H. S. Teoh wrote:
 I wouldn't be against an explicit switch to disable bounds 
 checks for
 when you want to measure this.  But it should not be the 
 default
 behaviour of -release.
It isn't. To turn off bounds check, you'll need -noboundscheck -release just turns off asserts.
Hhh? You mean that's all -release does? Just turn off asserts? Nothing else (whatsoever)?
https://d.godbolt.org/z/MjPe35Maz release does turn off bounds checking.
I'm gunna-hav-ta take your word for it ;-) ..(cause I can't read assembly, nor do I want to ;-) But the fact that nobody seems to really know (even Walter!) is troubling, to say the least. Perhaps someone well informed needs to write a brief article, so we can all learn what -release 'really' does.
May 25 2022
parent reply Siarhei Siamashka <siarhei.siamashka gmail.com> writes:
On Thursday, 26 May 2022 at 01:07:38 UTC, forkit wrote:
 But the fact that nobody seems to really know (even Walter!), 
 is troubling, to say the least.

 Perhaps someone well informed needs to write a brief article, 
 so we can all learn what -release 'really' does.
Well, I surely do know :-) And this is documented at least for GDC and LDC:

"man gdc":
```
       -frelease
           Turns on compiling in release mode, which means not emitting
           runtime checks for contracts and asserts. Array bounds checking
           is not done for @system and @trusted functions, and assertion
           failures are undefined behavior. This is equivalent to compiling
           with the following options:

               gdc -fno-assert -fbounds-check=safe -fno-invariants \
                   -fno-postconditions -fno-preconditions -fno-switch-errors
```

"ldc2 --help":
```
  --release             - Compile release version, defaulting to disabled
                          asserts/contracts/invariants, and bounds checks
                          in @safe functions only
```

But "man dmd" isn't helpful at all:
```
       -release
              Compile release version
```

Still, DMD behaves in the same way as the other compilers.
May 25 2022
next sibling parent Mike Parker <aldacron gmail.com> writes:
On Thursday, 26 May 2022 at 02:04:18 UTC, Siarhei Siamashka wrote:
 But "man dmd" isn't helpful at all:
 ```
        -release
               Compile release version
 ```

 Still DMD behaves in the same way as the other compilers.
https://dlang.org/dmd-linux.html#switch-release
May 25 2022
prev sibling parent rikki cattermole <rikki cattermole.co.nz> writes:
https://issues.dlang.org/show_bug.cgi?id=23141
May 25 2022
prev sibling next sibling parent Adam D Ruppe <destructionator gmail.com> writes:
On Wednesday, 25 May 2022 at 18:56:03 UTC, Walter Bright wrote:
 Back in the olden daze, I've seen magazine compiler benchmark 
 articles trash various compilers for poor runtime performance.
I read a Raymond Chen thing not too long ago saying the reason they didn't do seconds on the Windows 95 clock is that swapping in a couple more pages each second hurt their benchmark performance. And since it wasn't that important to users, it got axed.
 So I'm leery of not being able to turn off the runtime checks 
 to get max performance.
I'm not against being able to do it - you have -boundscheck, and -check, and similar on the compiler switch, and you have .ptr in the language itself - my problem is putting it in the routine-sounding -release switch that people aren't fully educated on. If you know exactly what is happening and understand why it is happening, do it. But if you're just copying -release because "I want a release build" or because someone told you "you can win the benchmark if you use this" without realizing that it opens you back up to potentially serious problems, that's no good.
 Besides, it provides a way of accurately measuring how much the 
 runtime checks are costing.
Sure, use -boundscheck=off for that.
May 25 2022
prev sibling parent Bruce Carneal <bcarneal gmail.com> writes:
On Wednesday, 25 May 2022 at 18:56:03 UTC, Walter Bright wrote:
 On 5/24/2022 4:26 AM, Adam D Ruppe wrote:
 It is a terrible switch that does random bad things.
Back in the olden daze, I've seen magazine compiler benchmark articles trash various compilers for poor runtime performance. It nearly always boiled down to the journalist not using the right switches for release builds.
At present this concern is not addressable by a single switch. Imagine the confusion if a performance reviewer did not choose ldc/gdc, did not avail themselves of the -Ox options, did not employ LTO, did not enable target-arch-specific optimization, ...
 So I'm leery of not being able to turn off the runtime checks 
 to get max performance.

 Besides, it provides a way of accurately measuring how much the 
 runtime checks are costing.
Yes, it enables that measurement but, as others have noted, it would be strange (if not outright foolish) to actually release code compiled with "-release". I think there would be little concern if "-release" had been named "-removeAllSafetyChecks".
May 25 2022
prev sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 5/23/2022 7:07 AM, deadalnix wrote:
 C asserts are included or not based on various defines.
Building them in offers some semantic advantages:

1. being able to get them to do things like insert a HLT instruction
2. format an error message based on stringizing the expression
3. people won't be motivated to create their own
May 23 2022
parent reply max haughton <maxhaton gmail.com> writes:
On Tuesday, 24 May 2022 at 02:05:39 UTC, Walter Bright wrote:
 On 5/23/2022 7:07 AM, deadalnix wrote:
 C asserts are included or not based on various defines.
Building them in offers some semantic advantages, 1. being able to get them to do things like insert a HLT instruction 2. format an error message based on stringizing the expression 3. people won't be motivated to create their own
Another reason is that forcing people down a blessed road means that the mechanisms you provide for fiddling with behaviour can be composed together much more easily (whether that be some runtime hook or merely all assertions throwing the same Error).
May 23 2022
parent Walter Bright <newshound2 digitalmars.com> writes:
On 5/23/2022 8:43 PM, max haughton wrote:
 On Tuesday, 24 May 2022 at 02:05:39 UTC, Walter Bright wrote:
 On 5/23/2022 7:07 AM, deadalnix wrote:
 C asserts are included or not based on various defines.
Building them in offers some semantic advantages, 1. being able to get them to do things like insert a HLT instruction 2. format an error message based on stringizing the expression 3. people won't be motivated to create their own
Another reason is that forcing people down a blessed road means that the means you provide for fiddling with behaviour can be composed together much more easily (whether this be some runtime hook or merely all Assertions throwing the same Error).
A user can still write their own myAssert(). But by building it in and making it so convenient, this just doesn't happen. It's also why `debug` is builtin - managers had complained to me that they could never get C++ code written by different teams to work together because they each developed their own `debug` conventions.
May 25 2022
prev sibling parent reply Loara <loara noreply.com> writes:
On Monday, 23 May 2022 at 01:37:09 UTC, Paul Backus wrote:
 We actually have this in D, it's just written weirdly:

 ```d
 // debug assertion
 assert(condition);

 // release assertion
 if (!condition) assert(0);
 ```

 It would probably be more intuitive if we could write release 
 assertions as plain `assert`, and use `debug assert` for debug 
 assertions. But the functionality is there.
I think this is better:

```d
// debug assertion
assert(condition);

// release assertion (AssertError is in core.exception)
if (!condition) throw new AssertError("assertion failed");
```

or also

```d
// debug assertion
assert(condition);

// release assertion
if (!condition) throw new MyCustomError("My error message");
```

so when your program fails the final user is aware of what went wrong.
May 24 2022
parent Paul Backus <snarwin gmail.com> writes:
On Tuesday, 24 May 2022 at 12:51:14 UTC, Loara wrote:
 I think this is better
 ```d
  // debug assertion
  assert(condition);

  // release assertion
  if (!condition) throw new AssertError();
 ```
 or also
 ```d
  // debug assertion
  assert(condition);

  // release assertion
  if (!condition) throw new MyCustomError("My error message");
 ```
 so when your program fails the final  user is aware of what 
 went wrong.
I left off messages for the sake of the example, but of course you should use them in real code. IMO it is better to use the `assert` keyword than to throw an `Error`, so that you can use the `-checkaction` switch to choose what happens at runtime when an assertion fails.
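For instance (illustrative invocations; the `-checkaction` values are listed in the dmd help text):

```
dmd -checkaction=context app.d   # failure message includes operand values
dmd -checkaction=halt app.d      # a failed assert executes a halt instruction
```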
May 24 2022
prev sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 5/19/2022 1:17 PM, Dukc wrote:
 We can do a bit better. If an assert trips, the program will go to unknown 
 state, and there's no telling what it will do. But that's still not quite the 
 same as undefined behaviour.
Yes, it is. Deliberately so. In fact, the program is *already* in an unknown state if it trips the assert. This has been extensively and exhaustively debated several times in this n.g. I don't really want to re-litigate it. The compiler is allowed to assume the assert holds. Whether it actually does or not is irrelevant. If the user uses the compiler switch to turn off the asserts, the language guarantees no longer hold, and that is the user's choice. There were a few releases where the assert's got turned off to make the compiler go faster. Inevitably, some mysterious crasher bugs appeared. Turning the asserts back on detected these. The asserts are staying on.
May 20 2022
parent reply Dukc <ajieskola gmail.com> writes:
On Saturday, 21 May 2022 at 03:03:41 UTC, Walter Bright wrote:
 On 5/19/2022 1:17 PM, Dukc wrote:
 We can do a bit better. If an assert trips, the program will 
 go to unknown state, and there's no telling what it will do. 
 But that's still not quite the same as undefined behaviour.
Yes, it is. Deliberately so. In fact, the program is *already* in an unknown state if it trips the assert.
I'm not sure if you read the rest of my post where I explained the practical difference between unknown state and undefined behaviour. Anyhow you do not address that. I did not dispute that a failed assert means unknown state.
 This has been extensively and exhaustively debated several 
 times in this n.g. I don't really want to re-litigate it.
Can you provide links? I'm interested in the topic and if there are points brought up I have not considered, I don't want people having to re-explain them to me.
 There were a few releases where the assert's got turned off to 
 make the compiler go faster. Inevitably, some mysterious 
 crasher bugs appeared. Turning the asserts back on detected 
 these. The asserts are staying on.
I do agree that was a wise solution, and would be regardless whether the spec says what it now does or what I'm proposing.
May 21 2022
next sibling parent Walter Bright <newshound2 digitalmars.com> writes:
On 5/21/2022 1:56 AM, Dukc wrote:
 Can you provide links? I'm interested in the topic and if there are points 
 brought up I have not considered, I don't want people having to re-explain
them 
 to me.
Not really (far too many messages!), but I'd try searching for "assert" and "logical assert" and "launch nuclear missiles". They're long threads.
May 21 2022
prev sibling parent reply Paul Backus <snarwin gmail.com> writes:
On Saturday, 21 May 2022 at 08:56:57 UTC, Dukc wrote:
 On Saturday, 21 May 2022 at 03:03:41 UTC, Walter Bright wrote:
 This has been extensively and exhaustively debated several 
 times in this n.g. I don't really want to re-litigate it.
Can you provide links? I'm interested in the topic and if there are points brought up I have not considered, I don't want people having to re-explain them to me.
The best explanation of this that I've found is actually from an article about error handling in a research language called Midori. The following link goes directly to the relevant section: http://joeduffyblog.com/2016/02/07/the-error-model/#bugs-arent-recoverable-errors If you want to read previous discussion about this from the D community, including several posts from Walter where he explains his position in his own words, there is a very long thread from 2014 here: https://forum.dlang.org/thread/m07gf1$18jl$1@digitalmars.com
May 21 2022
next sibling parent Walter Bright <newshound2 digitalmars.com> writes:
On 5/21/2022 12:06 PM, Paul Backus wrote:
 If you want to read previous discussion about this from the D community, 
 including several posts from Walter where he explains his position in his own 
 words, there is a very long thread from 2014 here:
 
 https://forum.dlang.org/thread/m07gf1$18jl$1@digitalmars.com
https://github.com/dlang/dlang.org/pull/3307
May 21 2022
prev sibling next sibling parent Paulo Pinto <pjmlp progtools.org> writes:
On Saturday, 21 May 2022 at 19:06:58 UTC, Paul Backus wrote:
 On Saturday, 21 May 2022 at 08:56:57 UTC, Dukc wrote:
 On Saturday, 21 May 2022 at 03:03:41 UTC, Walter Bright wrote:
 This has been extensively and exhaustively debated several 
 times in this n.g. I don't really want to re-litigate it.
Can you provide links? I'm interested in the topic and if there are points brought up I have not considered, I don't want people having to re-explain them to me.
The best explanation of this that I've found is actually from an article about error handling in a research language called Midori. The following link goes directly to the relevant section: http://joeduffyblog.com/2016/02/07/the-error-model/#bugs-arent-recoverable-errors ....
A research operating system.
May 22 2022
prev sibling parent reply Max Samukha <maxsamukha gmail.com> writes:
On Saturday, 21 May 2022 at 19:06:58 UTC, Paul Backus wrote:

 The best explanation of this that I've found is actually from 
 an article about error handling in a research language called 
 Midori. The following link goes directly to the relevant 
 section:

 http://joeduffyblog.com/2016/02/07/the-error-model/#bugs-arent-recoverable-errors
I have always had problems with this classification because real world systems are modular, and bugs in a module are often treated as exceptions by another module. That is, many systems using plugins try to recover from bugs in a plugin. In a sense, the plugin itself becomes an input that the host has to validate.
May 22 2022
next sibling parent Paul Backus <snarwin gmail.com> writes:
On Sunday, 22 May 2022 at 09:33:25 UTC, Max Samukha wrote:
 On Saturday, 21 May 2022 at 19:06:58 UTC, Paul Backus wrote:

 The best explanation of this that I've found is actually from 
 an article about error handling in a research language called 
 Midori. The following link goes directly to the relevant 
 section:

 http://joeduffyblog.com/2016/02/07/the-error-model/#bugs-arent-recoverable-errors
I have always had problems with this classification because real world systems are modular, and bugs in a module are often treated as exceptions by another module. That is, many systems using plugins try to recover from bugs in a plugin. In a sense, the plugin itself becomes an input that the host has to validate.
If there is no isolation boundary between the plugin and the host, then the host cannot reliably recover from bugs in the plugin. The next section of the article, "Reliability, Fault-Tolerance, and Isolation," goes into much more detail about this.
May 22 2022
prev sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 5/22/2022 2:33 AM, Max Samukha wrote:
 I have always had problems with this classification because real world systems 
 are modular, and bugs in a module are often treated as exceptions by another 
 module. That is, many systems using plugins try to recover from bugs in a 
 plugin. In a sense, the plugin itself becomes an input that the host has to 
 validate.
This is a faulty system design. There's nothing stopping modules from corrupting the memory of the caller. The correct approach is to run those modules as separate processes, where they can only corrupt themselves. It's why operating systems support processes and interprocess communications.
May 22 2022
next sibling parent Max Samukha <maxsamukha gmail.com> writes:
On Sunday, 22 May 2022 at 22:59:25 UTC, Walter Bright wrote:

 This is a faulty system design. There's nothing stopping 
 modules from corrupting the memory of the caller.
It is not as much a faulty design as it is a trade off. Some systems cannot afford IPC. Consider Linux kernel "oops" failures, which may leave the system in an unstable state without tearing it down completely.
 The correct approach is to run those modules as separate 
 processes, where they can only corrupt themselves.
Or .NET application domains (WebAssembly modules, etc) running in the same process.
 It's why operating systems support processes and interprocess 
 communications.
I know how operating systems work. BTW, the entire monolithic vs microkernel debate is about this.
May 25 2022
prev sibling parent reply kdevel <kdevel vogtner.de> writes:
On Sunday, 22 May 2022 at 22:59:25 UTC, Walter Bright wrote:
 On 5/22/2022 2:33 AM, Max Samukha wrote:
 [...] many systems using plugins try to recover from bugs in a 
 plugin.
 In a sense, the plugin itself becomes an input that the host 
 has to
 validate.
This is a faulty system design. There's nothing stopping modules from corrupting the memory of the caller.
True.
 The correct approach is to run those modules as separate
 processes, where they can only corrupt themselves. It's why 
 operating systems support processes and interprocess 
 communications.
Why is this so rarely done? In my experience one can use the ssh or curl binary in a subprocess instead of using libssh/libcurl in the caller's address space. Apropos cURL: when eyeballing https://dlang.org/phobos/etc_c_curl.html I discern nothing. The same goes for https://dlang.org/phobos/etc_c_sqlite3.html; both start with an unfathomable cloud of symbols. There is no structure, there are no concepts. I think both components are good candidates for using their binaries in a subprocess instead of pulling their library code into one's own address space.
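A minimal sketch of that subprocess approach in D, using std.process from Phobos (the curl flags and URL are placeholders):

```d
// Run the curl binary in a child process instead of linking libcurl;
// a crash or memory bug inside curl cannot corrupt this process.
import std.process : execute;
import std.stdio : writeln;

void main()
{
    auto result = execute(["curl", "-s", "https://example.com"]);
    if (result.status != 0)
        writeln("curl failed with exit code ", result.status);
    else
        writeln("fetched ", result.output.length, " bytes");
}
```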
May 27 2022
parent reply rikki cattermole <rikki cattermole.co.nz> writes:
On 28/05/2022 7:27 AM, kdevel wrote:
 Apropos cURL: When eyeballing https://dlang.org/phobos/etc_c_curl.html I 
 discern nothing. The same goes for 
 https://dlang.org/phobos/etc_c_sqlite3.html both start with an 
 unfathomable cloud of symbols. There is no structure, there are no 
 concepts. I think both components are good candidates for using their 
 binaries in a subprocess instead of pulling their library code into the 
 own address space.
Those are bindings. They are not meant to have structure. They exist so that the D compiler can understand the non-D symbols.
May 27 2022
parent reply kdevel <kdevel vogtner.de> writes:
On Friday, 27 May 2022 at 19:36:24 UTC, rikki cattermole wrote:
 On 28/05/2022 7:27 AM, kdevel wrote:
 [...]
 https://dlang.org/phobos/etc_c_curl.html I discern nothing. 
 The same goes for https://dlang.org/phobos/etc_c_sqlite3.html 
 [...]
Those are bindings. They are not meant to have structure. They exist so that the D compiler can understand the non-D symbols.
dmd or gdc read those html files? I doubt that.
May 27 2022
parent rikki cattermole <rikki cattermole.co.nz> writes:
On 28/05/2022 7:43 AM, kdevel wrote:
 dmd or gdc read those html files? I doubt that.
What html files? All the documentation is generated from D files. https://github.com/dlang/phobos/blob/master/etc/c/curl.d
May 27 2022
prev sibling parent reply Siarhei Siamashka <siarhei.siamashka gmail.com> writes:
On Thursday, 19 May 2022 at 00:29:46 UTC, Walter Bright wrote:
 I did a quick look at D's undefined behavior, and nearly all of 
 it is disallowed in @safe code.

 It's not 100%, but we're in a strong position.
Regarding safety and avoiding various sources of undefined behavior: do you agree that "Implementation Defined: The built-in associative arrays do not preserve the order of the keys inserted into the array. In particular, in a foreach loop the order in which the elements are iterated is typically unspecified." from https://dlang.org/spec/hash-map.html can be a source of bugs in user code? Also I wonder about the motivation for having "SwapStrategy.unstable" as the default for https://dlang.org/phobos/std_algorithm_sorting.html#sort? If somebody actually wants a stable sort but just happens to forget to override the default in some part of their code, then this mistake may cost many hours spent on debugging/troubleshooting.
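To illustrate the second point, a minimal sketch (made-up data): stability has to be requested explicitly at every call site, and forgetting it silently reorders equal elements.

```d
import std.algorithm.sorting : sort, SwapStrategy;

struct Task { int priority; string name; }

void main()
{
    auto tasks = [Task(1, "a"), Task(0, "b"), Task(1, "c")];

    // Default is SwapStrategy.unstable: the relative order of the two
    // priority-1 tasks is unspecified after sorting.
    // tasks.sort!((x, y) => x.priority < y.priority);

    // The stable variant keeps "a" before "c".
    tasks.sort!((x, y) => x.priority < y.priority, SwapStrategy.stable);
    assert(tasks[1].name == "a" && tasks[2].name == "c");
}
```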
May 19 2022
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 5/19/2022 3:15 AM, Siarhei Siamashka wrote:
 Regarding safety and avoiding various sources of undefined behavior. Do you 
 agree that "Implementation Defined: The built-in associative arrays do not 
 preserve the order of the keys inserted into the array. In particular, in a 
 foreach loop the order in which the elements are iterated is typically 
 unspecified." from https://dlang.org/spec/hash-map.html can be a source of
bugs 
 in the user code?
Yes. But still, languages draw a distinction between implementation defined and undefined. There are many implementation defined behaviors in D, like the precise layout of struct fields.
May 19 2022
parent reply Siarhei Siamashka <siarhei.siamashka gmail.com> writes:
On Thursday, 19 May 2022 at 18:42:11 UTC, Walter Bright wrote:
 On 5/19/2022 3:15 AM, Siarhei Siamashka wrote:
 Regarding safety and avoiding various sources of undefined 
 behavior. Do you agree that "Implementation Defined: The 
 built-in associative arrays do not preserve the order of the 
 keys inserted into the array. In particular, in a foreach loop 
 the order in which the elements are iterated is typically 
 unspecified." from https://dlang.org/spec/hash-map.html can be 
 a source of bugs in the user code?
But still, languages draw a distinction between implementation defined and undefined.
Some programming languages (Ruby, Crystal, Python, ...) just maintain the order of insertion for their associative arrays, and this makes their behavior much more predictable and friendly to the developers. Go intentionally randomizes the order of iteration over elements from its associative arrays, so that the unpredictable nature of it is much more visible to the developers and can be caught by unit tests. Every little bit helps to improve safety.

You skipped my comment about the default unstable sort in Phobos, but that decision also erodes safety to some extent.
May 20 2022
next sibling parent reply bauss <jj_1337 live.dk> writes:
On Friday, 20 May 2022 at 07:01:09 UTC, Siarhei Siamashka wrote:
 Go intentionally randomizes the order of iteration over 
 elements from its associative arrays, so that the unpredictable 
 nature of it is much more visible to the developers and can be 
 caught by unit tests.
I'm not sure how a unittest can catch it, because even if the order is randomized then it can in theory end up being ordered.
May 20 2022
parent reply Siarhei Siamashka <siarhei.siamashka gmail.com> writes:
On Friday, 20 May 2022 at 09:26:58 UTC, bauss wrote:
 On Friday, 20 May 2022 at 07:01:09 UTC, Siarhei Siamashka wrote:
 Go intentionally randomizes the order of iteration over 
 elements from its associative arrays, so that the 
 unpredictable nature of it is much more visible to the 
 developers and can be caught by unit tests.
I'm not sure how a unittest can catch it, because even if the order is randomized then it can in theory end up being ordered.
D unittest (save as "main_test.d" and run as "rdmd -unittest main_test.d"):

```D
unittest {
	// Create an associative array
	auto a = ["Alice": true, "Bob": true, "Charlie": true];

	// Iterate over the associative array and save keys
	string[] s;
	foreach (k, _ ; a)
		s ~= k;

	// If the order is preserved, then the first retrieved name will be "Alice"
	assert(s[0] == "Alice");
}

void main() {
}
```

Go unittest (save as "main_test.go" and run as "go test main_test.go"):

```Go
package main

import "testing"

func TestAssocArrayOrder(t *testing.T) {
	// Create an associative array
	a := map[string]bool{"Alice": true, "Bob": true, "Charlie": true}

	// Iterate over the associative array and save keys
	var s []string
	for k := range a {
		s = append(s, k)
	}

	// If the order is preserved, then the first retrieved name will be "Alice"
	if s[0] != "Alice" {
		t.Fatalf("The first returned key was not 'Alice'! The actual order: %v", s)
	}
}
```

In Go this unittest will sporadically fail and the problem will be detected reasonably fast. Of course, assuming that the unit tests are run regularly.
May 20 2022
parent deadalnix <deadalnix gmail.com> writes:
On Friday, 20 May 2022 at 12:45:07 UTC, Siarhei Siamashka wrote:
 D unittest (save as "main_test.d" and run as "rdmd -unittest 
 main_test.d"):
 ```D
 unittest {
 	// Create an associative array
 	auto a = ["Alice": true, "Bob": true, "Charlie": true];

 	// Iterate over the associative array and save keys
 	string[] s;
 	foreach (k, _ ; a)
 		s ~= k;

 	// If the order is preserved, then the first retrieved name 
 will be "Alice"
 	assert(s[0] == "Alice");
 }

 void main() {
 }
 ```
The problem quickly shows when you dig slightly deeper. Import another module: it fails. Add -i: it works, but now you also run the unittests of the module you import. Bonus point: dependencies are not tracked properly, so you might be running stale code.
May 20 2022
prev sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 5/20/2022 12:01 AM, Siarhei Siamashka wrote:
 You skipped my comment about default 
 unstable sort in Phobos, but such decision also erodes safety to some extent.
The language and Phobos are two different things.
May 20 2022
parent reply Siarhei Siamashka <siarhei.siamashka gmail.com> writes:
On Saturday, 21 May 2022 at 02:55:25 UTC, Walter Bright wrote:
 On 5/20/2022 12:01 AM, Siarhei Siamashka wrote:
 You skipped my comment about default unstable sort in Phobos, 
 but such decision also erodes safety to some extent.
The language and Phobos are two different things.
I disagree. The language can't become popular without a good standard library that is bundled with it out of the box. Phobos is very much relevant to the topic of this thread and hand waving "oh, but the core language is okay" doesn't help.
May 20 2022
parent Walter Bright <newshound2 digitalmars.com> writes:
On 5/20/2022 9:14 PM, Siarhei Siamashka wrote:
 On Saturday, 21 May 2022 at 02:55:25 UTC, Walter Bright wrote:
 On 5/20/2022 12:01 AM, Siarhei Siamashka wrote:
 You skipped my comment about default unstable sort in Phobos, but such 
 decision also erodes safety to some extent.
The language and Phobos are two different things.
I disagree. The language can't become popular without a good standard library that is bundled with it out of the box. Phobos is very much relevant to the topic of this thread and hand waving "oh, but the core language is okay" doesn't help.
The language is where the guarantees come from, not the library.
May 21 2022
prev sibling parent Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Tuesday, 17 May 2022 at 15:41:25 UTC, Paulo Pinto wrote:
 It certainly does have to a lot to catch up with SPARK, and 
 NVidia has chosen Ada instead of Rust exactly because of that, 
 yet there is money being thrown out at the problem, and 
 standard organizations interested into making it happen.

 It won't be there today, but it will eventually, because they 
 have one specific answer to "what you use X for".
I'd say solutions will be developed when enough people who are into verification use it. E.g. smart contracts in Rust: https://github.com/move-language
May 18 2022
prev sibling parent reply Siarhei Siamashka <siarhei.siamashka gmail.com> writes:
On Tuesday, 17 May 2022 at 00:50:53 UTC, max haughton wrote:
 On Monday, 16 May 2022 at 22:35:00 UTC, forkit wrote:
 Just imagine Rust/Go implementing a C compiler inside their 
 own compiler They'd be laughing stock (not because it's wrong 
 or silly, but because they took the initiative to step further 
 awat from C, not further towards it).
https://github.com/rust-lang/rust-bindgen It's not built in to the compiler but it's officially supported by the Rust foundation.
Well, this looks like one of the automated bindings generators. Many of these have existed for various languages for a very long time. The problem of creating bindings to existing popular C and even C++ libraries in order to use them from other programming languages is not new. Numerous solutions for this problem are also not new. I think https://www.swig.org/ has existed since like forever, and D is even listed as one of the supported languages.

In my earlier comment I also mentioned hybrid projects, which mix multiple programming languages in the same codebase to leverage their advantages (C for the performance-critical parts and Lua/Python for convenience/safety in the other parts).

Walter is very much hyped about ImportC. But his solution needs to be objectively compared to the other existing solutions, rather than pretending that it's a unique groundbreaking innovation which will finally make D language popular.
May 16 2022
next sibling parent Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Tuesday, 17 May 2022 at 06:26:23 UTC, Siarhei Siamashka wrote:
 Walter is very much hyped about ImportC. But his solution needs 
 to be objectively compared to the other existing solutions, 
 rather than pretending that it's a unique groundbreaking 
 innovation which will finally make D language popular.
I actually think D can become a disruptor if this is done properly and the other loose ends of the language are plugged. So being hyped up is warranted in my opinion, even if the road ahead is a long one. Especially if he has managed to express all C11 constructs in the AST, because then you have something to work with and can gradually improve on embedding macro expansion in D code.

I just pray that the D community avoids heuristic pattern-matching hacks. mixinC is OK for the most hardcore users, so you can have that, but you need something with better usability for this to be a success. If it is significantly harder to use than C macros in C, people will just ask themselves why they don't use C/C++ etc.
May 17 2022
prev sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 5/16/2022 11:26 PM, Siarhei Siamashka wrote:
 Walter is very much hyped about ImportC. But his solution needs to be 
 objectively compared to the other existing solutions, rather than pretending 
 that it's a unique groundbreaking innovation which will finally make D
language 
 popular.
It's conceptually equivalent to C++ being able to #include C files. I was in the business when C++ came out, and am pretty certain this was crucial to C++'s early adoption success. ImportC cannot be as good as that (preprocessor issues), but it's closer than any of the other solutions I've seen.

For another language doing the same thing, see Kotlin, which can seamlessly interact with Java, and this has been crucial to Kotlin's adoption.

How is ImportC unique? No other solution I've seen is as simple as:

import mycfile;

These things matter. Feel free to compare ImportC with other existing solutions, and post your results!

Note that ImportC can inline C functions, and ImportC code can import D code and access D constructs. Who else does that? (Not even C++! Maybe Kotlin can.)
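For illustration, a minimal sketch of the shape of it (file names hypothetical; the compiler picks up the .c file and runs it through ImportC):

```D
// app.d -- assumes a sibling file square.c containing only:
//     int square(int x) { return x * x; }
import square;   // no bindings, no wrapper: the C file is imported directly

void main()
{
    assert(square(3) == 9);
}
```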
May 17 2022
parent reply Paulo Pinto <pjmlp progtools.org> writes:
On Tuesday, 17 May 2022 at 15:46:09 UTC, Walter Bright wrote:
 On 5/16/2022 11:26 PM, Siarhei Siamashka wrote:
 Walter is very much hyped about ImportC. But his solution 
 needs to be objectively compared to the other existing 
 solutions, rather than pretending that it's a unique 
 groundbreaking innovation which will finally make D language 
 popular.
It's conceptually equivalent to C++ being able to #include C files. I was in the business when C++ came out, and am pretty certain this was crucial to C++'s early adoption success. ImportC cannot be as good as that (preprocessor issues), but it's closer than any of the other solutions I've seen. For another language doing the same thing, see Kotlin, which can seamlessly interact with Java, and this has been crucial to Kotlin's adoption. ...
Not really, as the numbers show:

https://www.infoq.com/news/2022/03/jrebel-report-2022/

What was crucial for Kotlin's adoption was Google's decision to push Kotlin ("my way or the highway") for anything new on Android frameworks, while stagnating Java support on purpose. Those lucky ones with Android 12 devices might get an update for OpenJDK 11 LTS support (a Java subset), but several of the new Jetpack frameworks are only accessible via Kotlin.
May 17 2022
next sibling parent reply Dave Blanchard <none nowhere.com> writes:
Why is D unpopular? Well let's see.

Several months ago I posted on here regarding the totally screwed 
up and incomprehensibly broken process you use for 
building/installing D. I didn't bother to waste time checking 
back on that thread to see any replies. I simply waited, to see 
if my concerns were addressed. They were not.

D's approach to gaining new users:

* Put a broken pile of shit up on the download page, with no 
usable instructions on what to actually do with it.

* ...

* Surprisingly, the prospective user can't do a fucking thing 
with it, of course, because there are NO INSTRUCTIONS.

The very brief instructions in the README file are worthless. 
They don't work.

The slightly more detailed instructions I find by searching on 
the internet are worthless. They don't work.

When searching, I find the Digital Mars web site apparently 
hasn't been updated since 2003.

How the fuck am I supposed to use D if I can't even get the damn 
thing installed and working, because there are NO USABLE 
INSTRUCTIONS, AT ALL?

Furthermore, I am NOT impressed by any install process that 
insists on automatically downloading junk from the internet. Also 
had a laugh about the "install script" offered on the download 
page, which one is supposed to pipe straight into bash. Yeah, if 
I were missing half my brain, I would do something like that.

Look, this is so simple even a child could do it:

Give me ONE ARCHIVE FILE which contains EVERYTHING needed to 
build and install DMD. And then put a script IN THE ROOT 
DIRECTORY of that archive which builds the compiler using what it 
has on hand--with no hassle, no bullshit, or stupid fucking 
errors.

Alternately, have two archives for download: one containing 
binaries only, which is easily unzipped into /usr (or with a 
basic script/Makefile to do the install), and one containing the 
complete source code, which can be easily bootstrapped by said 
binary package.

The README file, it should go without saying, should contain 
full, complete, clear instructions on how to do anything related 
to building and installing the compiler.

Until you pull your heads out of your asses and make the damn 
compiler actually accessible to anyone who isn't already an 
expert on bootstrapping this overcomplicated piece of shit, then 
sadly, I have no use at all for D. I'll bet there's many others 
in the same boat also.

By the way, I built my own entire Linux distro from scratch, with 
over a thousand packages that cross compile to glibc/musl, 
openssl/libressl, and x86_64/i686/arm32/arm64, so if I can't 
figure out how to successfully bootstrap DMD, then YOU HAVE A 
PROBLEM.

ADDENDUM: The CAPTCHA questions on this posting form are TOTALLY 
INSANE AND RIDICULOUS. I guess that could be D's tagline: 
"TOTALLY INSANE AND RIDICULOUS."
May 17 2022
next sibling parent rikki cattermole <rikki cattermole.co.nz> writes:
On 18/05/2022 6:19 AM, Dave Blanchard wrote:
 Alternately, have two archives for download: one containing binaries 
 only, which is easily unzipped into /usr (or with a basic 
 script/Makefile to do the install), and one containing the complete 
 source code, which can be easily bootstrapped by said binary package.
That isn't possible. D requires the source for druntime and phobos as part of normal compilation. Even a minimalist build using -betterC is still going to need object.d from druntime.

But yeah, overall, especially since the reference D frontend is now in D, we really need to sort out our bootstrapping situation. But alas, that would require LTSs.
May 17 2022
prev sibling next sibling parent max haughton <maxhaton gmail.com> writes:
On Tuesday, 17 May 2022 at 18:19:01 UTC, Dave Blanchard wrote:
 Why is D unpopular? Well let's see.

 Several months ago I posted on here regarding the totally 
 screwed up and incomprehensibly broken process you use for 
 building/installing D. I didn't bother to waste time checking 
 back on that thread to see any replies. I simply waited, to see 
 if my concerns were addressed. They were not.

 D's approach to gaining new users:

 * Put a broken pile of shit up on the download page, with no 
 usable instructions on what to actually do with it.
This is an unfair extrapolation from your own unique system. In the most recent data I can get my hands on, something like 50% of D users use Windows, and 16% of those who answered install from source (summed across all platforms).
 * ...

 * Surprisingly, the prospective user can't do a fucking thing 
 with it, of course, because there are NO INSTRUCTIONS.
 The very brief instructions in the README file are worthless. 
 They don't work.

 The slightly more detailed instructions I find by searching on 
 the internet are worthless. They don't work.
Which instructions? As I will elaborate below, this is simply a pointless approach to technical discussion.
 When searching, I find the Digital Mars web site apparently 
 hasn't been updated since 2003.
Digital Mars is Walter's company, not associated with D other than through history.
 Furthermore, I am NOT impressed by any install process that 
 insists on automatically downloading junk from the internet. 
 Also had a laugh about the "install script" offered on the 
 download page, which one is supposed to pipe straight into 
 bash. Yeah, if I were missing half my brain, I would do 
 something like that.
Download the script and read it before running it? What it does is very simple; it just has some logic for grabbing new compilers as they are released.
 Look, this is so simple even a child could do it:

 Give me ONE ARCHIVE FILE which contains EVERYTHING needed to 
 build and install DMD. And then put a script IN THE ROOT 
 DIRECTORY of that archive which builds the compiler using what 
 it has on hand--with no hassle, no bullshit, or stupid fucking 
 errors.
And what will it have on hand? Will there be a D compiler? A C compiler? If so, which one? Do we need to bootstrap from C to a C++ compiler and then to a D compiler? Which libc is available? Etc. It's really not that simple.

If you ask for help and provide useful information then we can help you, but so far you have been at first quite obtuse, then a bit rude, and now, I would say, needlessly aggressive.

Also, the bootstrap-anywhere, run-on-anything D compiler is basically gcc. Maybe you aren't aware of this, but it would've been mentioned in the previous thread if you had responded when people tried to help. Iain Buclaw, who maintains D-in-GCC, is very finicky about platform support and bootstrapping. In fact, Rikki did mention gdc in response to your previous thread.
 Alternately, have two archives for download: one containing 
 binaries only, which is easily unzipped into /usr (or with a 
 basic script/Makefile to do the install), and one containing 
 the complete source code, which can be easily bootstrapped by 
 said binary package.
We ship .deb and .rpm packages, which as you will know cover a large share of all Linux installations. Writing a script that can blindly target basically any Linux distribution (or pseudo-distribution) is not a trivial thing to do, especially given that the *vast* majority of D users do not need one.
 The README file, it should go without saying, should contain 
 full, complete, clear instructions on how to do anything 
 related to building and installing the compiler.
The dmd README does actually contain a link to the wiki, which explains in some detail how to build either dmd or one of the other compilers from source: https://wiki.dlang.org/DMD Did you read it? You did mention a link to a website (which one?).
 Until you pull your heads out of your asses and make the damn 
 compiler actually accessible to anyone who isn't already an 
 expert on bootstrapping this overcomplicated piece of shit, 
 then sadly, I have no use at all for D. I'll bet there's many 
 others in the same boat also.
As linked above there are instructions, maybe you missed them.
 By the way, I built my own entire Linux distro from scratch, 
 with over a thousand packages that cross compile to glibc/musl, 
 openssl/libressl, and x86_64/i686/arm32/arm64, so if I can't 
 figure out how to successfully bootstrap DMD, then YOU HAVE A 
 PROBLEM.
Or you have a problem communicating? I read your previous forum post, and I tried to help, but you didn't respond. Maybe you were busy, but your initial post was completely vague and had almost no actionable information. You might think it did, but you're assuming that we know anything about your system; we don't.

For example, you previously complained about documentation, e.g. "Another site wanted me to pay to access their information.": Which site? There is almost no actionable information in this clause, certainly none unless we go and work out what website you are talking about.

If you really want to try D, try it on a more conventional operating system, then think about getting it running on an esoteric one. A lot of your previous questions would've been answered by just using it normally; e.g. D *does* use the system linker, so having to download one (presumably OPTLINK) means you've gone the wrong way.

I can and probably soon will write some instructions for building everything from source, but:

1. It's not the recommended way to use D. E.g. we recently put PGO in the build script, which makes the compiler faster at the expense of compile time, so unless you would like a Gentoo-like Linux experience you may want to download a binary.

2. If you're going to maintain a weird system, then the buck has to stop somewhere with respect to what tools the process needs.
May 17 2022
prev sibling parent forkit <forkit gmail.com> writes:
On Tuesday, 17 May 2022 at 18:19:01 UTC, Dave Blanchard wrote:
 Why is D unpopular? Well let's see.
 ....
 ......
 ........
 ............
 Until you pull your heads out of your asses and make the damn 
 compiler actually accessible to anyone who isn't already an 
 expert on bootstrapping this overcomplicated piece of shit, 
 then sadly, I have no use at all for D. I'll bet there's many 
 others in the same boat also.
 ................
These comments are actually very common, and entirely reasonable (although not necessarily correct in their detail). They arise from 'expectations' (which are usually reasonable, but not always so). But how does a 'volunteer' project (like dlang) address such expectations?

Well, often the default response is 'well, why don't you pitch in?'. That response, often used, is usually not helpful.

dlang first needs strong leadership, vision, goals, priorities, benchmarks, KPIs... (as you would get in a commercially driven project). From that, you attract and obtain your best resources (to work on that agenda).

A project that lets anyone do whatever they feel like doing (their own agenda) is never going to be 'popular' (as in never going to be widely popular like other languages).
May 17 2022
prev sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 5/17/2022 9:54 AM, Paulo Pinto wrote:
 Not really, as the numbers show
 
 https://www.infoq.com/news/2022/03/jrebel-report-2022/
It doesn't mention Kotlin.
 What was crucial for Kotlin's adoption was Google's decision to push Kotlin
("my 
 way or the highway") for anything new on Android frameworks, while stagnating 
 Java support on purpose.
I'm sure that helped. But I've seen it mentioned many times that Kotlin being able to access all the old Java code was crucial.
May 17 2022
parent reply Paulo Pinto <pjmlp progtools.org> writes:
On Tuesday, 17 May 2022 at 20:21:16 UTC, Walter Bright wrote:
 On 5/17/2022 9:54 AM, Paulo Pinto wrote:
 Not really, as the numbers show
 
 https://www.infoq.com/news/2022/03/jrebel-report-2022/
It doesn't mention Kotlin.
It surely does, read the graphic with attention, 8%. And yes, it is more used than Scala and Groovy, which score even lower, despite also having great Java interoperability.
 What was crucial for Kotlin's adoption was Google's decision 
 to push Kotlin ("my way or the highway") for anything new on 
 Android frameworks, while stagnating Java support on purpose.
I'm sure that helped. But I've seen it mentioned many times that Kotlin being able to access all the old Java code was crucial.
Reality in the JVM ecosystem proves otherwise. Also, this only applies to constructs up to Java 7; Kotlin hardly gets any of the newer goodies up to Java 18. Additionally, it now tries to have a foot in native, JavaScript, and Android, which compromises its design in regards to host platforms.
May 17 2022
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 5/17/2022 2:09 PM, Paulo Pinto wrote:
 It surely does, read the graphic with attention, 8%.
And so it does, you're right. 8% of the enormous Java market is huge.
May 18 2022
parent reply Paulo Pinto <pjmlp progtools.org> writes:
On Wednesday, 18 May 2022 at 07:35:20 UTC, Walter Bright wrote:
 On 5/17/2022 2:09 PM, Paulo Pinto wrote:
 It surely does, read the graphic with attention, 8%.
And so it does, you're right. 8% of the enormous Java market is huge.
Yeah, but it got there thanks to Google replacing Java with Kotlin on Android, although they might still be "committed" to their Android Java flavour in regards to backwards compatibility.

You know how to turn Java code into Groovy? Start by changing the file extension from .java to .groovy. Yet it can hardly do better than 6%, and mostly thanks to Gradle being around, and having been chosen as Android's official build system.

This is what D lacks, the killer use case, ImportC on its own will hardly change the current adoption state.
May 18 2022
next sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 5/18/2022 1:56 AM, Paulo Pinto wrote:
 This is what D lacks, the killer use case, ImportC on its own will hardly
change 
 the current adoption state.
Even if it doesn't, it will make things much easier for current D users.

For example, look at the dmd source code itself. We maintain, by hand, a set of equivalent .h for them. A cumulatively enormous amount of time has been expended on that, and finding/fixing the bugs from when they get out of sync.

(ImportC doesn't directly address that, because it's C++ not C, but it's still illustrative. I have also considered adding "C with Classes" support in ImportC as dmd's C++ interface is "C with Classes".)
May 18 2022
parent reply "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Wed, May 18, 2022 at 11:57:13AM -0700, Walter Bright via Digitalmars-d wrote:
[...]
 For example, look at the dmd source code itself. We maintain, by hand,
 a set of equivalent .h for them. A cumulatively enormous amount of
 time has been expended on that, and finding/fixing the bugs from when
 they get out of sync.
[...] This is, to me, a flashing red (and beeping) neon sign begging for auto-generation of .h from .d. At least, that's what I'd do in my own project if I faced a similar situation. It's never a good idea to maintain two (slightly different) copies of the same thing by hand.

T

-- Why do conspiracy theories always come from the same people??
May 18 2022
parent reply max haughton <maxhaton gmail.com> writes:
On Wednesday, 18 May 2022 at 19:07:38 UTC, H. S. Teoh wrote:
 On Wed, May 18, 2022 at 11:57:13AM -0700, Walter Bright via 
 Digitalmars-d wrote: [...]
 For example, look at the dmd source code itself. We maintain, 
 by hand, a set of equivalent .h for them. A cumulatively 
 enormous amount of time has been expended on that, and 
 finding/fixing the bugs from when they get out of sync.
[...] This is, to me, a flashing red (and beeping) neon sign begging for auto-generation of .h from .d. At least, that's what I'd do in my own project if I faced a similar situation. It's never a good idea to maintain two (slightly different) copies of the same thing by hand.

T
In this case it's actually a really good idea because politically it forces people to try and look after the C++ header interface. One day it'll likely be more automated but until that automation can detect nasty changes it's better to have the frontend.h diff be extremely obvious.
May 18 2022
parent reply "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Wed, May 18, 2022 at 07:08:59PM +0000, max haughton via Digitalmars-d wrote:
 On Wednesday, 18 May 2022 at 19:07:38 UTC, H. S. Teoh wrote:
 On Wed, May 18, 2022 at 11:57:13AM -0700, Walter Bright via
 Digitalmars-d wrote: [...]
 For example, look at the dmd source code itself. We maintain, by
 hand, a set of equivalent .h for them. A cumulatively enormous
 amount of time has been expended on that, and finding/fixing the
 bugs from when they get out of sync.
[...] This is, to me, a flashing red (and beeping) neon sign begging for auto-generation of .h from .d. At least, that's what I'd do in my own project if I faced a similar situation. It's never a good idea to maintain two (slightly different) copies of the same thing by hand.
[...]
 In this case it's actually a really good idea because politically it
 forces people to try and look after the C++ header interface.
 
 One day it'll likely be more automated but until that automation can
 detect nasty changes it's better to have the frontend.h diff be
 extremely obvious.
What nasty changes are there? T -- Your inconsistency is the only consistent thing about you! -- KD
May 18 2022
next sibling parent reply max haughton <maxhaton gmail.com> writes:
On Wednesday, 18 May 2022 at 19:20:58 UTC, H. S. Teoh wrote:
 On Wed, May 18, 2022 at 07:08:59PM +0000, max haughton via 
 Digitalmars-d wrote:
 On Wednesday, 18 May 2022 at 19:07:38 UTC, H. S. Teoh wrote:
 [...]
[...]
 In this case it's actually a really good idea because 
 politically it forces people to try and look after the C++ 
 header interface.
 
 One day it'll likely be more automated but until that 
 automation can detect nasty changes it's better to have the 
 frontend.h diff be extremely obvious.
What nasty changes are there? T
You'll have to ask Iain and Martin since they are on the front lines but it is very easy to break the C++ interface to the compiler if you so wish.
May 18 2022
parent reply "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Wed, May 18, 2022 at 07:26:28PM +0000, max haughton via Digitalmars-d wrote:
 On Wednesday, 18 May 2022 at 19:20:58 UTC, H. S. Teoh wrote:
 On Wed, May 18, 2022 at 07:08:59PM +0000, max haughton via Digitalmars-d
 wrote:
 On Wednesday, 18 May 2022 at 19:07:38 UTC, H. S. Teoh wrote:
 [...]
[...]
 In this case it's actually a really good idea because politically
 it forces people to try and look after the C++ header interface.
 
 One day it'll likely be more automated but until that automation
 can detect nasty changes it's better to have the frontend.h diff
 be extremely obvious.
What nasty changes are there?
[...]
 You'll have to ask Iain and Martin since they are on the front lines
 but it is very easy to break the C++ interface to the compiler if you
 so wish.
Ah I got it, we have to preserve the C++ interface in order to integrate with the LLVM/GCC backends, right? In that case, if I were put in the same situation, I'd auto-generate a .di from the C++ .h instead. Basically, I'm wary of maintaining two parallel versions of the same thing, because the chances of human error causing them to go out of sync is just too high. Let the machine do what it's best at; leave the human to do what humans are good at, which is NOT repetitive tasks that require high accuracy. T -- A bend in the road is not the end of the road unless you fail to make the turn. -- Brian White
May 18 2022
parent reply max haughton <maxhaton gmail.com> writes:
On Wednesday, 18 May 2022 at 19:36:32 UTC, H. S. Teoh wrote:
 On Wed, May 18, 2022 at 07:26:28PM +0000, max haughton via 
 Digitalmars-d wrote:
 On Wednesday, 18 May 2022 at 19:20:58 UTC, H. S. Teoh wrote:
 On Wed, May 18, 2022 at 07:08:59PM +0000, max haughton via 
 Digitalmars-d wrote:
 [...]
[...]
 [...]
What nasty changes are there?
[...]
 You'll have to ask Iain and Martin since they are on the front 
 lines but it is very easy to break the C++ interface to the 
 compiler if you so wish.
Ah I got it, we have to preserve the C++ interface in order to integrate with the LLVM/GCC backends, right? In that case, if I were put in the same situation, I'd auto-generate a .di from the C++ .h instead. Basically, I'm wary of maintaining two parallel versions of the same thing, because the chances of human error causing them to go out of sync is just too high. Let the machine do what it's best at; leave the human to do what humans are good at, which is NOT repetitive tasks that require high accuracy. T
I think you have it backwards.

GCC and LDC both use the D frontend sources, but their glue code is written in C++. This means that they need C++ bindings *to* the frontend. A .di file would be pointless because there would be no one to consume it.
May 18 2022
parent "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Wed, May 18, 2022 at 07:57:31PM +0000, max haughton via Digitalmars-d wrote:
 On Wednesday, 18 May 2022 at 19:36:32 UTC, H. S. Teoh wrote:
[...]
 Ah I got it, we have to preserve the C++ interface in order to
 integrate with the LLVM/GCC backends, right?
 
 In that case, if I were put in the same situation, I'd auto-generate
 a .di from the C++ .h instead.  Basically, I'm wary of maintaining
 two parallel versions of the same thing, because the chances of
 human error causing them to go out of sync is just too high.  Let
 the machine do what it's best at; leave the human to do what humans
 are good at, which is NOT repetitive tasks that require high
 accuracy.
[...]
 I think you have it backwards.
 
 GCC and LDC both use the D frontend sources, but their glue code is
 written in C++. This means that they need C++ bindings *to* the
 frontend. A .di file would be pointless because there would be no one
 to consume it.
The .di is to force the compiler to abort with an error if the corresponding D code doesn't match the C++ declarations. Well, OK, I don't know if .di would work; perhaps the .d declarations themselves should be auto-generated. Or something like that. Whichever direction you go, the idea is to observe SST (a single source of truth).

T

-- Famous last words: I *think* this will work...
May 18 2022
prev sibling parent reply user1234 <user1234 12.de> writes:
On Wednesday, 18 May 2022 at 19:20:58 UTC, H. S. Teoh wrote:
 What nasty changes are there?


 T
- use of base enum type (not the case anymore, though),
- difference in the layout of members,
- missing extern(C++),
- ...

I don't know if it's still the case, but in the past CI did not fail when a PR didn't update the headers, so it was not rare to see a follow-up PR, a few days later, to fix that up.
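To illustrate the third point, a minimal hedged sketch (function names invented here, not taken from the actual frontend): without extern(C++) a symbol gets D name mangling, so C++ glue code expecting the declaration from the hand-maintained header can no longer link against it.

```D
module mangling_sketch;

// D mangling (e.g. _D15mangling_sketch5visitFPiZv): invisible to C++ callers.
void visit(int* e) { }

// C++ mangling: matches `void visitCpp(int*);` declared in a C++ header.
extern(C++) void visitCpp(int* e) { }
```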
May 18 2022
parent Walter Bright <newshound2 digitalmars.com> writes:
On 5/18/2022 12:46 PM, user1234 wrote:
 I don't know if it's still the case but in the past CI did not fail when a PR 
 didn't update the headers so it was not rare to see a follow-up PR, a few days 
 later, to fix that up.
This historically has put a really unfair burden on Iain and Martin.
May 18 2022
prev sibling parent Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Wednesday, 18 May 2022 at 08:56:40 UTC, Paulo Pinto wrote:
 This is what D lacks, the killer use case, ImportC on its own 
 will hardly change the current adoption state.
Adoption does not seem to be the main issue; retention appears to be the main issue. So first: tie up the loose ends, plug the potholes, get a coherent memory management story, put some makeup on the syntax, project a clear vision, keep the forums active with buzz and emotion (so it does not look like a dead language that nobody cares about), make sure the help forum is more visible and accessible, and keep it friendly. Then worry about adoption.

As you can see with Rust, a strong narrow semantic vision can create enough gravity to attract enough similar users, and then it will gain traction in some niche(s). The "killer app" follows from that. So, vision first; the "killer use case" follows from the vision (after 10 years of people gravitating towards you and you being able to retain those users)…

Being an upgrade path from C (and other Cish languages) for people looking for something higher level is a viable vision. But then upgrading from C to D must be comparable to upgrading from C to C++/Vala/etc… So you need near-perfect macro expansion, basically a more advanced approach to parsing. Linux and the Vala use case would be an obvious target, with gdc now being in the GNU suite etc.

Is it likely to happen? Probably not. Is it viable? I think so. What is required? Focus!
May 19 2022
prev sibling parent reply forkit <forkit gmail.com> writes:
On Sunday, 15 May 2022 at 23:32:33 UTC, Ali Çehreli wrote:
 An open source community allows (should allow?) individuals to 
 spend their energy on any topic that they want. Some will work 
 on ImportC, others (which apparently includes you) will work on 
 @safe.

 Ali
My comments are in the context of this thread, about why D is unpopular. It's not about me wanting to stop people in an open-source community from spending their time on whatever they want.

But I feel ImportC and @safe push D in two completely different directions. So D may become 'more' popular with those wanting to import more C into their D. And D may become 'less' popular because more people are importing C into their D.

I'm struggling to see how two different directions get us to... well.. anywhere.
May 15 2022
next sibling parent max haughton <maxhaton gmail.com> writes:
On Monday, 16 May 2022 at 02:05:32 UTC, forkit wrote:
 On Sunday, 15 May 2022 at 23:32:33 UTC, Ali Çehreli wrote:
 An open source community allows (should allow?) individuals to 
 spend their energy on any topic that they want. Some will work 
 on ImportC, others (which apparently includes you) will work 
 on @safe.

 Ali
My comments are in the context of this thread, about why D is unpopular. It's not about me wanting to stop people in an open-source community from spending their time on whatever they want. But I feel ImportC and @safe push D in two completely different directions. So D may become 'more' popular with those wanting to import more C into their D. And D may become 'less' popular because more people are importing C into their D. I'm struggling to see how two different directions get us to... well.. anywhere.
This assumes that both features apply to exactly the same subset of all D programs, which they don't. At Symmetry we have very large D codebases which, it could easily be argued, would benefit from both ImportC and @safe, given optimal implementations of both.

Most D code does not directly interface with C, but a lot of D *projects* do, hence ImportC. Most D projects also benefit from being able to rely on the safety of a given chunk of D code.
May 15 2022
prev sibling parent reply forkit <forkit gmail.com> writes:
On Monday, 16 May 2022 at 02:05:32 UTC, forkit wrote:

I just realised, I can now answer the question 'Why is D 
unpopular?'

Because it has no shared vision.
May 15 2022
parent reply deadalnix <deadalnix gmail.com> writes:
On Monday, 16 May 2022 at 02:34:16 UTC, forkit wrote:
 On Monday, 16 May 2022 at 02:05:32 UTC, forkit wrote:

 I just realised, I can now answer the question 'Why is D 
 unpopular?'

 Because it has no shared vision.
Yes.
May 18 2022
next sibling parent reply Max Samukha <maxsamukha gmail.com> writes:
On Wednesday, 18 May 2022 at 12:47:52 UTC, deadalnix wrote:
 On Monday, 16 May 2022 at 02:34:16 UTC, forkit wrote:
 On Monday, 16 May 2022 at 02:05:32 UTC, forkit wrote:

 I just realised, I can now answer the question 'Why is D 
 unpopular?'

 Because it has no shared vision.
Yes.
I suggest the vision be to make me happy. For example, the impossibility to have anonymous struct literals as the RHS of an assignment (or as a function argument in general) makes me unhappy. Moreover, after DIP1030 is implemented, D wants to deprecate anonymous struct literals entirely, which makes me the unhappiest man in the world.
May 18 2022
next sibling parent reply Paul Backus <snarwin gmail.com> writes:
On Wednesday, 18 May 2022 at 15:34:33 UTC, Max Samukha wrote:
 I suggest the vision be to make me happy. For example, the 
 impossibility to have anonymous struct literals as the RHS of 
 an assignment (or as a function argument in general) makes me 
 unhappy. Moreover, after DIP1030 is implemented, D wants to 
 deprecate anonymous struct literals entirely, which makes me 
 the unhappiest man in the world.
Realistically, brace initialization for structs is probably never going to be deprecated or removed. It would cause too much disruption to existing code for too little benefit. With DIP 1030 implemented, D's struct literals will achieve feature parity with C99's compound literals (which are also not anonymous--they require you to specify the type using a cast-like syntax).
May 18 2022
parent reply Max Samukha <maxsamukha gmail.com> writes:
On Wednesday, 18 May 2022 at 16:02:52 UTC, Paul Backus wrote:
 With DIP 1030 implemented, D's struct literals will achieve 
 feature parity with C99's compound literals (which are also not 
 anonymous--they require you to specify the type using a 
 cast-like syntax).
In C, you only have to specify the top level type:

    struct Foo { int x; };
    struct Bar { struct Foo f[2]; };

    int main() {
        struct Bar b;
        b = (struct Bar){{{1}, {2}}};
        // or just
        b = (struct Bar){{1, 2}};
        return 0;
    }

In D, you would have to:

    b = Bar([Foo(1), Foo(2)]);
May 18 2022
parent zjh <fqbqrr 163.com> writes:
On Wednesday, 18 May 2022 at 17:04:01 UTC, Max Samukha wrote:
  b = Bar([Foo(1), Foo(2)]);
Yes, it's too ugly.
May 18 2022
prev sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 5/18/2022 8:34 AM, Max Samukha wrote:
 Moreover, after DIP1030 is implemented, D wants to 
 deprecate anonymous struct literals entirely, which makes me the unhappiest
man 
 in the world.
DIP1030 is not moving forward.
May 18 2022
next sibling parent monkyyy <crazymonkyyy gmail.com> writes:
On Wednesday, 18 May 2022 at 18:58:51 UTC, Walter Bright wrote:
 DIP1030 is not moving forward.
Is this the official announcement :thonk:
May 18 2022
prev sibling parent reply max haughton <maxhaton gmail.com> writes:
On Wednesday, 18 May 2022 at 18:58:51 UTC, Walter Bright wrote:
 On 5/18/2022 8:34 AM, Max Samukha wrote:
 Moreover, after DIP1030 is implemented, D wants to deprecate 
 anonymous struct literals entirely, which makes me the 
 unhappiest man in the world.
DIP1030 is not moving forward.
Named arguments (the important part of DIP1030) will happen.

This week in particular I'm going to refactor the compiler's notion of function arguments into a struct that can represent both the semantically lowered and the original lexical order of arguments. Previously I tried doing this as additional data, but it just became extremely hard to work with. I got named arguments working for non-templated functions, but I could never get the logic covering all cases for all the different flavours of template matching in the compiler.

Unifying all this into one notion of arguments should massively simplify the resulting logic.
May 18 2022
parent Walter Bright <newshound2 digitalmars.com> writes:
On 5/18/2022 12:14 PM, max haughton wrote:
 On Wednesday, 18 May 2022 at 18:58:51 UTC, Walter Bright wrote:
 On 5/18/2022 8:34 AM, Max Samukha wrote:
 Moreover, after DIP1030 is implemented, D wants to deprecate anonymous struct 
 literals entirely, which makes me the unhappiest man in the world.
DIP1030 is not moving forward.
Named arguments (the important part of DIP1030) will happen.
Yes. I was referring to the removal of the { } initializer syntax. That's not going to happen, due to pushback from the community.
May 18 2022
prev sibling parent zjh <fqbqrr 163.com> writes:
On Wednesday, 18 May 2022 at 12:47:52 UTC, deadalnix wrote:

 Because it has no shared vision.
Yes.
We should vote `democratically` to decide future `priorities`. `Experts` should have `greater` weight. There should be a `clear vision` and sorted, detailed objectives!
May 18 2022
prev sibling parent "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Sun, May 15, 2022 at 03:02:24AM +0000, forkit via Digitalmars-d wrote:
 On Sunday, 15 May 2022 at 02:38:12 UTC, Mike Parker wrote:
 On Sunday, 15 May 2022 at 02:30:52 UTC, forkit wrote:
 
 Who has the time to be (continually) well-versed in both D and C?
That's kind of the point. With ImportC, you don't *have* to be well-versed in C if you need to use a C library.
Integrating a library whose code you're not well versed in, seems like a recipe for disaster.
[...] So you'd rather reinvent every C library out there that you need for your project, just because it's not written in D?

Pretty much every OS API out there is, under the hood, written in C. Would you reinvent your own OS too, just because existing OSes aren't written in D?

If I were deciding whether to start a new project and my choices are (1) use an existing C library that provides a critical part of the functionality I need, vs. (2) rewrite said C library from scratch because it's not written in D, guess which option I'm gonna choose. And guess which option is going to keep my business going, as opposed to sinking my project in the amount of resources/programmer time needed to write it from scratch.

I mean, I admire your ideals, but you can't just start from zero every time. You wouldn't be able to fly past your own backyard that way. *Somewhere* along the line you have to stand on the shoulders of existing technology and take off from there, rather than reinventing fire with sticks and stones just because whoever originally discovered fire didn't use D to do it.

T

-- Beware of bugs in the above code; I have only proved it correct, not tried it. -- Donald Knuth
May 15 2022
prev sibling parent reply "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Sun, May 15, 2022 at 02:30:52AM +0000, forkit via Digitalmars-d wrote:
[...]
 Modern programming languages should be encouraging a move away from C
 (and providing the means to do so), and not encouraging a move towards
 C.
I think it's a matter of perspective. I don't see ImportC as encouraging a move towards C; if somebody wanted to move towards C, D would be the last thing he would be looking at. :-P

Rather, I see ImportC as a way for current C programmers to move *away* from C -- incrementally. You can start writing parts of your code in D without needing to rewrite your existing C code, and gradually migrate away from C as you go along. Nobody has the time and/or resources to reengineer an entire C project in D in one shot. ImportC serves as a smooth off-ramp to get off C and start writing D.
 My real concern though (and in relation to the topic of this thread),
 is that ImportC will actually make D even less popular.
Why would ImportC make D less popular? If anything, it would make D *more* popular, because now if you have some project that depends on an existing C library (and let's face it, D's ecosystem is *nowhere* near replacing the entire C ecosystem, which pervades pretty much every corner of the programming landscape as the lingua franca of OS APIs, and the future where C is completely supplanted for OS APIs is nowhere in sight yet), you can now write your code in D and have no worries about integrating with the C library. Whereas without ImportC you have to jump through additional hoops to do so. Which equals more friction, which equals less inclination for people to start using D.
 Who has the time to be (continually) well-versed in both D and C?
Why would you need to be well-versed in C just because D provides a way for you to smoothly integrate with existing C libraries? By the same logic, you might as well say `extern(C)` is a bad idea because now people reading D code have to be well-versed in C, and we should kill off `extern(C)` completely and make D an isolated walled garden that only interacts with other D code and nothing else.

The reality is that you'd just treat ImportC the same way as extern(C): black-box interfaces to OS APIs and foreign language libraries.

I wrote quite a lot of D code that uses C libraries -- libfreetype, libxcb, libmpfr, to name a few. If D didn't have an easy way to interface with libfreetype, for example, my project would never have gotten off the ground. In fact, I'd have been motivated to write more C code instead. There is no way I'm gonna waste my time/energy to reinvent libfreetype just because it happens to be written in C rather than D. Life is too short to be reinventing square wheels at every single turn.

As it stands, it took a bit of effort to make it work -- thankfully I use a sane build system (cough) so it wasn't a big problem, but I *did* have to spend the time to hand-port some libfreetype prototypes into D extern(C) declarations, and do the trick with cpp to auto-generate error message strings from one of libfreetype's headers. Now if ImportC had been at the high level of integration (that we're still working towards), I wouldn't have needed to put in this effort -- I'd have just written `import freetype.freetype;` and gone on with my business. *That* would have given me MUCH more motivation to write D code.

Truth is, I balked at using libfreetype for a period of time because of the anticipated effort needed to interface with it. Ultimately I decided to go ahead -- thanks to D's relatively easy integration with C. But had `import freetype.freetype;` Just Worked(tm) at that time, I wouldn't even have hesitated in the first place.

TL;DR: if ImportC had been done 10 years ago, I'd have had MORE motivation to write more D code, not less. And if interfacing D with C had been harder than it was, I'd have been motivated to write my project in C rather than D (or just decided not to start the project in the first place).

T

-- IBM = I'll Buy Microsoft!
May 15 2022
next sibling parent reply zjh <fqbqrr 163.com> writes:
On Sunday, 15 May 2022 at 11:25:47 UTC, H. S. Teoh wrote:

He has been talking nonsense. Just ignore him.
May 15 2022
next sibling parent forkit <forkit gmail.com> writes:
On Sunday, 15 May 2022 at 12:28:54 UTC, zjh wrote:
 On Sunday, 15 May 2022 at 11:25:47 UTC, H. S. Teoh wrote:

 He has been talking nonsense. Just ignore him.
Shame on you. Just because this is an online forum, and we're not sitting around a table together, you think that is ok? Please try raising your point-of-view (whatever that is) to an appropriate intellectual level.
May 15 2022
prev sibling parent Walter Bright <newshound2 digitalmars.com> writes:
On 5/15/2022 5:28 AM, zjh wrote:
 [...]
Let's keep the discourse here professional.
May 15 2022
prev sibling next sibling parent Walter Bright <newshound2 digitalmars.com> writes:
On 5/15/2022 4:25 AM, H. S. Teoh wrote:
 TL;DR: if ImportC had been done 10 years ago, I'd have had MORE
 motivation to write more D code, not less.  And if interfacing D with C
 had been harder than it was, I'd have been motivated to write my project
 in C rather than D (or just decided not to start the project in the
 first place).
I made a big mistake not doing ImportC 15 years ago!
May 15 2022
prev sibling parent Jack <jckj33 gmail.com> writes:
On Sunday, 15 May 2022 at 11:25:47 UTC, H. S. Teoh wrote:
 On Sun, May 15, 2022 at 02:30:52AM +0000, forkit via 
 Digitalmars-d wrote: [...]
 [...]
I think it's a matter of perspective. I don't see ImportC as encouraging a move towards C; if somebody wanted to move towards C, D would be last thing he would be looking at. :-P Rather, I see ImportC as a way for current C programmers to move *away* from C -- incrementally. You can start writing parts of your code in D without needing to rewrite your existing C code, and gradually migrate away from C as you go along. [...]
that's how I see ImportC as well. Far from encouraging people to write more C code, it makes people waste less time by reusing existing C code, leaving more time to write D code.
May 29 2022
prev sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 5/13/2022 3:49 PM, forkit wrote:
 on 11. ImportC will actually dissuade people from moving away from C.
 
 on 13. This too will dissuade people from moving away from C.
People almost *never* translate functioning programs from one language to another. After all, people still use COBOL.
 In essence, you're forever locking C into D. C++ made this mistake too.
The existence of ImportC is a dev environment thing. It doesn't change the specification of D at all.
May 13 2022
parent forkit <forkit gmail.com> writes:
On Friday, 13 May 2022 at 23:45:16 UTC, Walter Bright wrote:
 On 5/13/2022 3:49 PM, forkit wrote:
 on 11. ImportC will actually dissuade people from moving away 
 from C.
 
 on 13. This too will dissuade people from moving away from C.
People almost *never* translate functioning programs from one language to another. After all, people still use COBOL.
 In essence, you're forever locking C into D. C++ made this 
 mistake too.
The existence of ImportC is a dev environment thing. It doesn't change the specification of D at all.
That may be so. I don't disagree. However, when people say they developed their program in D, nobody really knows what they mean. You'll have to go look at all the source code to see what it really is.
May 13 2022
prev sibling parent reply Adam D Ruppe <destructionator gmail.com> writes:
On Friday, 13 May 2022 at 19:18:35 UTC, Walter Bright wrote:
 The fact that there are 3 of them suggests shortcomings.
And those shortcomings are...? The fact there are 3 of them can also be explained by people not investigating existing things before doing their own toy, or by maintainers refusing to work together, forcing a fork.

Almost everything you listed here already exists in the competition, and what's left could have been done as an automatic build integration (even dmd itself just shelling out... you know, like it is doing for the preprocessor now...) instead of a whole new compiler.

As I've said several times now, there ARE things ImportC can potentially do that the others can't. But you said in a previous message:
 Preprocessor metaprogramming macros will never be directly 
 available to D.
It actually CAN work, through a hygienic mixin. Oh, and from that message:
 htod, dstep, and dpp also will simply ignore metaprogramming 
 macros.
That's not true. They don't always succeed, but they don't simply ignore them. dstep tries to convert some back to D templates and mixins. dpp actually tries to apply them to the complete source (usually mangling the code in the process, but it does NOT ignore them).
May 13 2022
parent reply Walter Bright <newshound2 digitalmars.com> writes:
I spoke with Atila a year ago on his excellent dpp. He handles the 
metaprogramming macros with a handbuilt translation of them. This works, but
has 
to be constantly tuned. The lack of a general solution was what I was referring
to.

I said I wanted to incorporate his work on this into ImportC's handling of the 
preprocessor macros.

With this:

   https://github.com/dlang/dmd/pull/14121

the following ImportC program compiles and runs:

   #include <stdio.h>
   void main() {
     printf("hello world\n");
   }
May 13 2022
parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Saturday, 14 May 2022 at 01:42:21 UTC, Walter Bright wrote:
 I spoke with Atila a year ago on his excellent dpp. He handles 
 the metaprogramming macros with a handbuilt translation of 
 them. This works, but has to be constantly tuned. The lack of a 
 general solution was what I was referring to.
Just extend D so that it accepts C code, then macro expansion will work. You are 90% there already. Or rather, extend C so that it accepts D code. So when you expand $MACRONAME(param) you inject a C-context that accepts D mixed in. Should work 99.9% of the time. That is good enough, now you only have to manually deal with 1 in 1000. 90% is not good enough, as that means dealing manually with 1 in 10...
May 13 2022
parent reply rikki cattermole <rikki cattermole.co.nz> writes:
I had a similar idea very early on.

Since the compiler understands the difference between a D and C scope, 
it could mix them.

But I think there will be issues surrounding it. It won't work in a lot 
of cases.

Adam's mixinC idea is much more likely to "just work" in all cases, so 
I think that is the direction we ought to be going in.
May 13 2022
parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Saturday, 14 May 2022 at 05:45:03 UTC, rikki cattermole wrote:
 Adam's mixinC idea is much more likely to "just work" in all cases, so I think that is the direction we ought to be going in.
What will it look like in macro intensive code? If you end up writing C then the point is lost. What do you do with macros that expand to function signatures followed by a body? The only solution that can work in most cases is to align D more with C (or rather the opposite) and reflect that in the AST.
May 13 2022
parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Saturday, 14 May 2022 at 05:58:48 UTC, Ola Fosheim Grøstad 
wrote:
 On Saturday, 14 May 2022 at 05:45:03 UTC, rikki cattermole 
 wrote:
 Adam's mixinC idea is much more likely to "just work" in all cases, so I think that is the direction we ought to be going in.
 What will it look like in macro intensive code? If you end up writing C then the point is lost. What do you do with macros that expand to function signatures followed by a body? The only solution that can work in most cases is to align D more with C (or rather the opposite) and reflect that in the AST.
You need to do the macro expansion in the lexer, then inject C/D context switch tokens used by the parser, then you inject C/D AST nodes if there are semantic differences, or annotate AST nodes with the differences. That way you can have a C signature followed by a D body. Anything short of this is a waste of time IMHO. The last thing the language needs is clunky interop.
May 13 2022
parent Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Saturday, 14 May 2022 at 06:24:15 UTC, Ola Fosheim Grøstad 
wrote:
 You need to do the macro expansion in the lexer, then inject 
 C/D context switch tokens used by the parser, then you inject 
 C/D AST nodes if there are semantic differences, or annotate 
 AST nodes with the differences. That way you can have a C 
 signature followed by a D body.
Technically it might be better to not use context switch tokens, but instead reserve a bit in the token to distinguish between C and D. Then the parser can swallow either one where there are no semantic differences. These are details though...
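A minimal sketch of that refinement (the TokKind enum and its members are hypothetical):

    // One bit on the token records which grammar produced it, so the
    // parser can accept either lexeme where the semantics coincide.
    enum Lang : ubyte { d, c }

    enum TokKind { identifier, number, lparen, rparen, semicolon /* ... */ }

    struct Token
    {
        TokKind kind;
        Lang lang;     // the reserved bit: C token or D token
        string text;
    }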
May 13 2022
prev sibling parent Walter Bright <newshound2 digitalmars.com> writes:
On 5/13/2022 3:15 AM, forkit wrote:
 Walter, please go and create C->M (C with Modules) instead.
I already did. You can use it today.
 Then make it an international standard.
That's not up to me.
May 13 2022
prev sibling parent Guillaume Piolat <first.last gmail.com> writes:
On Thursday, 12 May 2022 at 15:12:53 UTC, Adam D Ruppe wrote:
 You know most those things already work.
But until now it was worth translating 5000 LOC by hand just to simplify the build.
May 12 2022
prev sibling parent reply "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Wed, May 11, 2022 at 05:10:00PM -0700, Walter Bright via Digitalmars-d wrote:
 On 5/9/2022 9:16 PM, Joshua wrote:
 Lately I started looking at compiled functional languages that
 integrate easily with C because I expect I’ll need their
 expressiveness and type safety for some tricky data transformations.
With ImportC, D is getting pretty darned good at integrating with C.
Even without ImportC, I already integrate with C well enough. I've been converting C prototypes to extern(C) declarations on an as-needed basis (as opposed to translating the entire C header), and it works pretty well. Thanks to pragma(lib), I don't even need to change my build script; the compiler automatically pulls in the library for me upon importing the converted D module.

The main hurdle of ImportC that makes it not as attractive as it could be is the lack of an integrated C preprocessor. Meaning, I have to manually invoke the C preprocessor before ImportC is useful. That pretty much nullifies the value of ImportC: I might as well just translate the C prototypes myself and it'd be less troublesome.

If ImportC were to reach the point where I can literally just import any system C header, *without* needing to run the preprocessor myself beforehand, *that* would make ImportC immensely more useful in terms of C integration.

T

-- 
If you want to solve a problem, you need to address its root cause, not just its symptoms. Otherwise it's like treating cancer with Tylenol...
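A hedged sketch of that hand-binding workflow, assuming the system provides zlib, whose crc32() has the C prototype uLong crc32(uLong crc, const Bytef *buf, uInt len):

    pragma(lib, "z");   // the compiler pulls in libz when this module is imported

    import core.stdc.config : c_ulong;

    // one prototype converted by hand, on an as-needed basis
    extern(C) c_ulong crc32(c_ulong crc, const(ubyte)* buf, uint len);

    void main()
    {
        import std.stdio : writeln;
        immutable ubyte[] data = [1, 2, 3];
        writeln(crc32(0, data.ptr, cast(uint) data.length));
    }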
May 11 2022
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 5/11/2022 5:53 PM, H. S. Teoh wrote:
 The main hurdle of ImportC that makes it not as attractive as it could
 be, is the lack of an integrated C preprocessor. Meaning, I have to
 manually invoke the C preprocessor before ImportC is useful.
I know. dmd now invokes cpp automatically for POSIX systems, and cl /P for Windows COFF. A PR invokes sppn for Win32 OMF builds; it's all green: https://github.com/dlang/dmd/pull/14090
 That pretty
 much nullifies the value of ImportC: I might as well just translate the
 C prototypes myself and it'd be less troublesome.
Wouldn't adding a make rule to run the preprocessor be just a couple lines of make?
 If ImportC were to
 reach the point where I can literally just import any system C header,
 *without* needing to run the preprocessor myself beforehand, *that*
 would make ImportC immensely more useful in terms of C integration.
I agree. Now, if only someone would approve 14090, we'd be good to go!
May 11 2022
parent Walter Bright <newshound2 digitalmars.com> writes:
On 5/11/2022 8:48 PM, Walter Bright wrote:
 I agree. Now, if only someone would approve 14090, we'd be good to go!
And, it's pulled now!
May 12 2022
prev sibling parent reply zjh <fqbqrr 163.com> writes:
On Friday, 6 May 2022 at 12:45:40 UTC, Guillaume Piolat wrote:

 https://github.com/p0nce/DIID
`Good project`. We should collect these kinds of `resources` in the `forum/somewhere`, to serve `D`'s `novices and veterans`.
May 06 2022
parent Guillaume Piolat <first.last gmail.com> writes:
On Friday, 6 May 2022 at 14:06:30 UTC, zjh wrote:
 On Friday, 6 May 2022 at 12:45:40 UTC, Guillaume Piolat wrote:

 https://github.com/p0nce/DIID
 `Good project`. We should collect these kinds of `resources` in the `forum/somewhere`, to serve `D`'s `novices and veterans`.
`Thanks`
May 06 2022
prev sibling parent zjh <fqbqrr 163.com> writes:
On Friday, 6 May 2022 at 09:37:24 UTC, Ola Fosheim Grøstad wrote:
 One key issue that has been mentioned in this thread is that 
 people leave because of `inconsistencies` in the language.
Yes, D should `investigate` why `library authors` don't maintain their libraries. It is `very important` for a `language` to serve `library authors` well! I hope `D`'s officials will pay attention to this and `investigate` it. The `D authors` should really `attach great importance` to excellent `library authors and libraries`.
May 06 2022
prev sibling parent Walter Bright <newshound2 digitalmars.com> writes:
On 5/5/2022 11:57 PM, Siarhei Siamashka wrote:
 My point is that CTFE actually increases the complexity and moves it somewhere 
 else in a somewhat obscure way. It's one of the zillions of extra features
that 
 make the language spec bloated and difficult to learn.
Actually, it doesn't add any new syntax. And it removes that peculiar limitation that functions cannot be used to determine enum values. It's more the removal of a compiler limitation than adding a feature. CTFE in C++ was a new feature, as it (pointlessly) added new syntax with new rules.
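A minimal illustration of that point: an ordinary function, no extra syntax, used where a compile-time constant is required:

    // No annotations needed: CTFE kicks in because an enum initializer
    // must be known at compile time.
    int factorial(int n) { return n <= 1 ? 1 : n * factorial(n - 1); }

    enum f10 = factorial(10);        // evaluated during compilation
    static assert(f10 == 3_628_800);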
May 06 2022
prev sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 5/5/2022 11:18 PM, Patrick Schluter wrote:
 That's the point. It reduces build complexity in a disruptive way.
As someone who used to write C code to generate C code to be compiled into a program, this was all replaced very nicely with CTFE. CTFE is worth it just for that. I sure enjoyed deleting all that stuff. Good riddance. (The multi-level build process broke many "better make" replacements.)
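The pattern being replaced looks roughly like this sketch (a stand-in bit-reversal table, not the actual generated code): instead of a build step emitting a source file full of table entries, an ordinary D function fills the table during compilation:

    // Reverses the bits of a byte; an ordinary, runtime-capable function.
    ubyte reverseBits(ubyte b)
    {
        ubyte r;
        foreach (i; 0 .. 8)
            r = cast(ubyte)((r << 1) | ((b >> i) & 1));
        return r;
    }

    // The initializer must be a compile-time constant, so the lambda
    // runs entirely in CTFE; no generator program, no extra build step.
    immutable ubyte[256] bitReverseTable = () {
        ubyte[256] t;
        foreach (i; 0 .. 256)
            t[i] = reverseBits(cast(ubyte) i);
        return t;
    }();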
May 06 2022
parent "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Fri, May 06, 2022 at 08:10:16PM -0700, Walter Bright via Digitalmars-d wrote:
[...]
 (The multi-level build process broke many "better make" replacements.)
IMNSHO, "better make" replacements that cannot handle multi-level builds are not worthy to be considered as "better make", but "worse make". (Including, unfortunately, dub. I've been thinking about how to break dub out of its walled garden, partly to remove this limitation.)

T

-- 
Political correctness: socially-sanctioned hypocrisy.
May 07 2022
prev sibling parent Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Friday, 6 May 2022 at 05:58:37 UTC, Siarhei Siamashka wrote:
 On Thursday, 5 May 2022 at 23:06:26 UTC, Walter Bright wrote:
 It was D that changed that perception. Suddenly, native 
 languages started implementing CTFE.
 Is CTFE really that useful? Generating code as part of the build process has been in use for a very long time. Any programming language (perl, python, php, ...) or tool (bison, flex, re2c, ...) could be used for this.
I find it to be useful for parametric types that initialize constant tables. If the program does constant lookups, those values can be turned into constants known at compile time. Or like in the example I gave further up, where you compute the length of delay lines at compile time so that you can make sound effects allocation free.

In general, CTFE would be most powerful in combination with AST manipulation to construct new types. D uses string mixins for this purpose, which is macro-like in nature, so this particular use case for CTFE is macroish IMHO.

For code gen I actually think having a Python script can lead to more readable code, because then you see the generated code as part of your program and debugging is straightforward. Such code is seldom modified in my experience, so I sometimes just inline the Python script as a comment. So overall, for me CTFE is most useful for computing constants.
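A small sketch of the delay-line case (the sample rate and delay time are illustrative): the length is computed once, at compile time, so the buffer can be a fixed-size array and the effect allocation-free:

    enum double sampleRate = 44_100.0;   // assumption: 44.1 kHz

    size_t delayLength(double milliseconds)
    {
        return cast(size_t)(sampleRate * milliseconds / 1000.0);
    }

    enum size_t combLen = delayLength(29.7);   // CTFE: an ordinary constant
    float[combLen] combDelay;                  // static buffer, no allocation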
May 06 2022
prev sibling next sibling parent reply Araq <rumpf_a web.de> writes:
On Thursday, 5 May 2022 at 23:06:26 UTC, Walter Bright wrote:
 On 5/3/2022 10:54 PM, Max Samukha wrote:
 The important part is that Nemerle can execute functions at 
 compile time - whether it's done via interpretation or 
 compilation is not relevant to the argument. D could as well 
 compile CTFE into native code or IL (as in newCTFE) from the 
 start.
That's pedantically true. But I can't seem to explain the difference to you. D doesn't have a compiler in the runtime. Having a compiler in the runtime means that you can dynamically create code and compile it at runtime. It's a *fundamental* difference. If there is no difference (after all, all of them are Turing machines, no difference at all!), and CTFE is popular and well known, why did ZERO of the native compilers do it? Why didn't it appear on feature wish lists? Why wasn't it in C/C++/Pascal/Fortran/Modula-2/Ada/Algol compilers?
It was in the Nim(rod) compiler, which is a Modula-3 derivative, quite early in the project's lifetime, maybe as early as 2004, maybe later, I don't remember. And maybe D had it earlier. However:

1. I'm quite sure I didn't copy it from D. Because:
2. Nim actually **needs** it because otherwise its AST macro system simply cannot work.
May 06 2022
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 5/6/2022 12:26 AM, Araq wrote:
 It was in the Nim(rod) compiler which is a Modula 3 derivate quite early in
the 
 project's lifetime, maybe as early as 2004, maybe later, I don't remember. And 
 maybe D had it earlier. However:
 
 1. I'm quite sure I didn't copy it from D. Because:
 2. Nim actually **needs** it because otherwise its AST macro system simply 
 cannot work.
 
You wrote the Nimrod compiler? I'm impressed!
May 06 2022
parent Tejas <notrealemail gmail.com> writes:
On Saturday, 7 May 2022 at 03:24:37 UTC, Walter Bright wrote:
 On 5/6/2022 12:26 AM, Araq wrote:
 It was in the Nim(rod) compiler which is a Modula 3 derivate 
 quite early in the project's lifetime, maybe as early as 2004, 
 maybe later, I don't remember. And maybe D had it earlier. 
 However:
 
 1. I'm quite sure I didn't copy it from D. Because:
 2. Nim actually **needs** it because otherwise its AST macro 
 system simply cannot work.
 
You wrote the Nimrod compiler? I'm impressed!
His project is still alive, but renamed to [Nim](https://nim-lang.org/). It's also competing with D as a do-it-all language, i.e., it is suitable for high-level as well as low-level programming.
May 06 2022
prev sibling parent reply Max Samukha <maxsamukha gmail.com> writes:
On Thursday, 5 May 2022 at 23:06:26 UTC, Walter Bright wrote:
 Surely you can see that there must have been SOME difference 
 there, even if it was just perception.
I grant you that D packaged the concept neatly, so there is a practical difference.
 It was D that changed that perception. Suddenly, native 
 languages started implementing CTFE.
Looks like I'm not getting my well-deserved beer. ((
May 17 2022
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 5/17/2022 6:41 AM, Max Samukha wrote:
 Looks like I'm not getting my well-deserved beer. ((
If you come to DConf, you'll get a beer from me regardless!
May 17 2022
parent Max Samukha <maxsamukha gmail.com> writes:
On Tuesday, 17 May 2022 at 15:47:31 UTC, Walter Bright wrote:
 If you come to DConf, you'll get a beer from me regardless!
Yay!
May 18 2022
prev sibling parent a11e99z <black80 bk.ru> writes:
On Tuesday, 3 May 2022 at 19:01:44 UTC, Walter Bright wrote:
 On 5/3/2022 12:34 AM, Max Samukha wrote:
 On Monday, 2 May 2022 at 20:24:29 UTC, Walter Bright wrote:
 


(without resorting to hacks) at compile time based on UDAs, rather at runtime. I guess that is what you mean when you say "it must defer code generation to runtime".
C# thought of it :-)
https://docs.microsoft.com/en-us/dotnet/csharp/roslyn-sdk/source-generators-overview
May 04 2022
prev sibling parent reply John Colvin <john.loughran.colvin gmail.com> writes:
On Saturday, 30 April 2022 at 07:35:29 UTC, Max Samukha wrote:
 On Friday, 29 April 2022 at 20:17:38 UTC, Walter Bright wrote:
 On 4/29/2022 12:10 PM, Walter Bright wrote:
 So why did other native languages suddenly start doing it 
 after D did to the point of it being something a language 
 can't skip anymore?
I've seen endless lists of features people wanted to add to C and C++. None of them were CTFE. When we added it to D, people were excited and surprised.
Your lists are not representative. When D added it, our reaction was more like "finally, somebody did that!". And even today, the feature is only marginally useful because of the countless forward reference bugs. I recently filed one more (https://issues.dlang.org/show_bug.cgi?id=22981), which is not a CTFE bug per se but was encountered in another futile attempt to generate code with CTFE in a reasonable manner.
I don’t know what your threshold for “marginally useful” is, but ctfe is proving its usefulness at Symmetry Investments every day. Not as a niche feature, but as a “wherever we need it, all over the place” feature.
May 02 2022
parent Max Samukha <maxsamukha gmail.com> writes:
On Monday, 2 May 2022 at 15:41:52 UTC, John Colvin wrote:

 I don’t know what your threshold for “marginally useful” is, 
 but ctfe is proving its usefulness at Symmetry Investments 
 every day. Not as a niche feature, as a “wherever we need it, 
 all over the place” feature.
Yeah, I am aware you are using it heavily. "Marginally" is a hyperbole provoked by another compiler bug, which made me rethink and rewrite a good chunk of code.
May 02 2022
prev sibling parent reply Paulo Pinto <pjmlp progtools.org> writes:
On Friday, 29 April 2022 at 19:10:32 UTC, Walter Bright wrote:
 On 4/29/2022 11:26 AM, Paulo Pinto wrote:
 Those were interpreters first and added native code 
 generation later. D did is the other way around, and the 
 native code generating compilers started doing it soon 
 afterwards.
Decades before D was even an idea. Again, SIGPLAN.
So why did other native languages suddenly start doing it after D did to the point of it being something a language can't skip anymore?
They didn't; they got inspired by those that preceded D. You just want to believe D was the cause.
Apr 30 2022
next sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 4/30/2022 12:05 AM, Paulo Pinto wrote:
 On Friday, 29 April 2022 at 19:10:32 UTC, Walter Bright wrote:
 So why did other native languages suddenly start doing it after D did to the 
 point of it being something a language can't skip anymore?
They didn't, they got inspired by those that preceded D, you just want to believe D was the cause.
The timing suggests strongly otherwise. C++'s discovery that templates could be used for CTFE suggests otherwise, too. All those articles about it never mentioned just interpreting an ordinary function instead.
Apr 30 2022
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 4/30/2022 10:27 AM, Walter Bright wrote:
 On 4/30/2022 12:05 AM, Paulo Pinto wrote:
 On Friday, 29 April 2022 at 19:10:32 UTC, Walter Bright wrote:
 So why did other native languages suddenly start doing it after D did to the 
 point of it being something a language can't skip anymore?
They didn't, they got inspired by those that preceded D, you just want to believe D was the cause.
The timing suggests strongly otherwise. C++'s discovery that templates could be used for CTFE suggests otherwise, too. All those articles about it never mentioned just interpreting an ordinary function instead.
There is this paper from 2007:

   http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2007/n2235.pdf

which proposes extending constant folding to functions, as long as those functions are preceded by `constexpr` and consist of a single `return` statement. Recursion is prohibited. Only literal types are permitted. It couldn't even replace the use of template metaprogramming to compute values. It's also just a proposal. D released an implementation in 2007 that was way beyond n2235.
Apr 30 2022
parent reply =?UTF-8?Q?Ali_=c3=87ehreli?= <acehreli yahoo.com> writes:
On 4/30/22 11:15, Walter Bright wrote:

 There is this paper from 2007:

 http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2007/n2235.pdf

 which proposes extending constant folding to functions, as long as those
 functions are preceded by `constexpr` and consist of a single `return`
 statement. Recursion is prohibited. Only literal types permitted.

 It couldn't even replace the use of template metaprogramming to compute
 values.
Confusing that C++ proposal with D's CTFE makes me desperate. :( C++ is attempting to go one step beyond C preprocessor constants there. "Compile-time function execution" is a couple of decades beyond that simplicity. Ali
Apr 30 2022
parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Saturday, 30 April 2022 at 21:41:06 UTC, Ali Çehreli wrote:
 Confusing that C++ proposal with D's CTFE makes me desperate. 
 :( C++ is attempting to go one step beyond C preprocessor 
 constants there. "Compile-time function execution" is a couple 
 of decades beyond that simplicity.
2003: http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2003/n1471.pdf

An ISO standard cannot absorb every single proposal; you also need all vendors on board, and there should be industry demand. Lots of "promising" ideas are thrown around, but language evolution of production languages should be conservative and not move forward until you have multiple independent implementations. There is a cost involved...

What C++ does right is that they complete the features they spec out. At the end of the day, that is more important than absorbing cute/clever proposals.
Apr 30 2022
next sibling parent monkyyy <crazymonkyyy gmail.com> writes:
On Saturday, 30 April 2022 at 23:10:19 UTC, Ola Fosheim Grøstad 
wrote:
 What C++ does right is that they complete the features they spec out. At the end of the day, that is more important than absorbing cute/clever proposals.
If cuteness and cleverness aren't required, I'm sure you could always use C++ templates. I think the rest of the world can look at the syntax and conclude the spec community is just wrong.
Apr 30 2022
prev sibling parent reply =?UTF-8?Q?Ali_=c3=87ehreli?= <acehreli yahoo.com> writes:
On 4/30/22 16:10, Ola Fosheim Grøstad wrote:
 On Saturday, 30 April 2022 at 21:41:06 UTC, Ali Çehreli wrote:
 Confusing that C++ proposal with D's CTFE makes me desperate. :( C++
 is attempting to go one step beyond C preprocessor constants there.
 "Compile-time function execution" is a couple of decades beyond that
 simplicity.
2003: http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2003/n1471.pdf
Good old days... I have a pre-release copy of David's templates book that he plugs in that presentation. (I used to help organize ACCU C++ meetings in Silicon Valley so I had the privilege of reading review copies of C++ books including that one). I am very well aware of every single template metaprogramming technique that you could do with C++03 and I did use many of them in production. (Oh, I was so clever.) But I don't remember the 'metacode' in that presentation. It must not have caught on. (?)
 An ISO standard cannot absorb every single proposal; you also need all vendors on board, and there should be industry demand. Lots of "promising" ideas are thrown around, but language evolution of production languages should be conservative and not move forward until you have multiple independent implementations. There is a cost involved...

 What C++ does right is that they complete the features they spec out. At the end of the day, that is more important than absorbing cute/clever proposals.
Those are very wise but misplaced words. You are responding to a paragraph where I said confusing C++'s constexpr function proposal with D's CTFE gives me desperation. Let me say it in plain words to those who may take your ISO references as proof against what I said: C++ does not have anything that comes close to D's CTFE. Maybe you are saying that ISO will eventually produce something in the future (C++35 maybe?). I agree. Ali
Apr 30 2022
parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Sunday, 1 May 2022 at 01:52:09 UTC, Ali Çehreli wrote:
 I am very well aware of every single template metaprogramming 
 technique that you could do with C++03 and I did use many of 
 them in production. (Oh, I was so clever.)

 But I don't remember the 'metacode' in that presentation. It 
 must not have caught on. (?)
My point was that it did propose CTFE, and the presentation states that they had a prototype compiler extension for it, back in 2003.
 Those are very wise but misplaced words. You are responding to 
 a paragraph where I said confusing C++'s constexpr function 
 proposal with D's CTFE gives me desperation.
The point is that there have been many ideas, but they shouldn't pick up the most demanding ones, they should move slowly and extend the language gradually. Which they do.
 Let me say it in plain words to those who may take your ISO 
 references as proof against what I said: C++ does not have 
 anything that comes close to D's CTFE.
Not sure what you mean by this. The only thing I lack in practice is static foreach, which isn't even CTFE, but since C++ is more suited for template composition you can find other ways to bring better structure to your code. D also makes some unsound assumptions by assuming that the hardware you compile on is giving the same answer as the hardware you execute on. That can give surprising results, bugs that are nigh impossible to pinpoint, because you don't get them on your development machine. Are you really sure you want that? Thankfully consteval gives you full control over what happens when.
Apr 30 2022
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 4/30/2022 10:17 PM, Ola Fosheim Grøstad wrote:
 My point was that it did propose CTFE,
Nope. It proposed more syntax for something else. I wasn't sure what, as the examples and text apparently needed the verbal part of the presentation to make sense. I don't know what "metacode injection" is or what it has to do with CTFE.
 and the presentation states that they had 
 a prototype compiler extension for it, back in 2003.
It says "partially implemented", not a prototype. It also seems to have nothing in common with Bjarne's proposal 4 years later.
 D also makes some unsound assumptions by assuming that the hardware you
compile 
 on is giving the same answer as the hardware you execute on That can give
 surprising results, bugs that are nigh impossible to pinpoint, because you
don't 
 get them on your development machine. Are you really sure you want that?
Are you sure D's CTFE does that? (CTFE doesn't allow unsafe code, for example.) Are you sure C++'s does not? D is much more portable between systems than C++ is, in particular things like the sizes of types.
 Thankfully consteval gives you full control over what happens when.
That's the C++ rationale for consteval. But D allows 100% control over whether CTFE is run or not. consteval is completely unnecessary. It's simple - CTFE is run if otherwise the compiler would give an error message that it cannot resolve an expression at compile time. For example,

    void test() {
        enum e = foo(); // runs foo() at compile time
        auto f = foo(); // runs foo() at run time
    }

------

CTFE is conceptually a no-brainer. Run a function at compile time. That's it. Neither Daveed's nor Bjarne's proposals do that. Zero new syntax is necessary.
Apr 30 2022
parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Sunday, 1 May 2022 at 06:48:42 UTC, Walter Bright wrote:
 part of the presentation to make sense. I don't know what 
 "metacode injection" is or what it has to do with CTFE.
There is no need to understand the metastuff; compile-time evaluation was a bullet point, and it is referenced from the Wikipedia page on compile-time function evaluation.
 It says "partially implemented", not a prototype. It also seems 
 to have nothing in common with Bjarne's proposal 4 years later.
It was obviously a prototype as it was not part of the language?
 Are you sure D's CTFE does that? (CTFE doesn't allow unsafe 
 code, for example.) Are you sure C++'s does not? D is much more 
 portable between systems than C++ is, in particular things like 
 the sizes of types.
In C++ you use stdint, which gives you 3 options for each bit width: exact, at-least, or fast... Then you bind these in meaningful ways for your application using alias. Unfortunately C++ does not provide nominal binding, but D doesn't offer that either so...
 Thankfully consteval gives you full control over what happens 
 when.
That's the C++ rationale for consteval. But D allows 100% control over whether CTFE is run or not. consteval is completely unnecessary. It's simple - CTFE is run if otherwise the compiler would give an error message that it cannot resolve an expression at compile time.
The difference is that I have functions that I only want to run at compile time or only at runtime; in D you have to test for it in the body, in C++ you are forced to think about it. But it might be better if C++ had a less verbose syntax for this... That is an area where you could get an advantage for D: clean up the syntax and semantics. I recently ported some DSP C code that computed the length of delay lines twice; that could go wrong if one execution happened at compile time and the other at runtime, thanks to floating point. By being forced to mark it as consteval I can be sure that all executions give the same delay length.
 CTFE is conceptually a no-brainer. Run a function at compile 
 time. That's it.
I agree that the concept is trivial, which is why it is surprising that people think other languages haven't considered this option. The main reason for not executing loops at compile time is that compile times become unpredictable/slow for large multi-layered applications.
May 01 2022
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 5/1/2022 12:26 AM, Ola Fosheim Grøstad wrote:
 In C++ you use stdint, which gives you 3 options for each bit width: exact, at-least, or fast... Then you bind these in meaningful ways for your application using alias. Unfortunately C++ does not provide nominal binding, but D doesn't offer that either so...
I.e. if you write portable code it is portable. But that wasn't your complaint - which was about getting portability wrong. C++ offers many more options for that.
 Thankfully consteval gives you full control over what happens when.
That's the C++ rationale for consteval. But D allows 100% control over whether CTFE is run or not. consteval is completely unnecessary. It's simple - CTFE is run if otherwise the compiler would give an error message that it cannot resolve an expression at compile time.
Th difference is that I have functions that I only want to run at compile time or runtime, in D you have to test it in the body, in C++ you are forced to think about it.
I haven't made it clear. There is no ambiguity in D about when a function is run at compile time or run time. None. Zero. It is entirely unnecessary to add a keyword for that. If you want to run a function at compile time, you don't have to test it. Just run it at compile time. We've had CTFE in D for 15 years now. None of your objections to it have caused a problem that has come to my attention. CTFE is used heavily in D. The beauty of CTFE is it is so simple that Daveed and Bjarne missed it. You can see that in their proposals.
 CTFE is conceptually a no-brainer. Run a function at compile time. That's it.
I agree that the concept is trivial, which is why it is surprising that people think other languages haven't considered this option.
Many times obvious things are obvious only in retrospect, and we get so comfortable with them we can no longer imagine otherwise. I implemented modules in 10 lines of code for C. It seems so obvious - why didn't I do it 40 years ago? I can't explain it. Why did it take C++ 35 years to come up with modules? Why didn't I invent Visicalc? It seems so obvious now. Ditto for all the FAANG zillionaires. I kick myself about Autotune. It's so obvious even the inventor didn't think of it. His wife, a singer, casually mentioned to him that it would be nice to have a device that fixed her pitch. A friend of mine who worked at a major company in the 90's would interview candidates. A question he'd ask was "what features would you add to this cellphone if you could?" Not one of them came up with anything other than things like "better voice quality". Nobody thought of using it as a radio, a music player, a book reader, a note taker, a voice recorder, a camera, and on and on. NOBODY!
May 01 2022
parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Sunday, 1 May 2022 at 08:10:28 UTC, Walter Bright wrote:
 I.e. if you write portable code it is portable. But that wasn't 
 your complaint - which was about getting portability wrong. C++ 
 offers many more options for that.
The code wasn't wrong, it was C code that I ported to C++ and modified to be allocation free, so the delay lengths had to be known at compile time. If I had ported it to D, maybe it would have saved me a few key-strokes but I would not have realized that the delay-length was computed both at compile time and runtime. So I prefer consteval over constexpr for such cases because it gives *stronger typing*. I like strong typing. I also don't like implicit conversion from int to float, and float to double, but it seems like the C-family are stuck on it. Strong typing is very useful when you port or modify code written by others, or when you refactor your own code. A modern language ought to provide gradual typing IMHO so that you can increase the rigidity of the model as it evolves.
 I haven't made it clear. There is no ambiguity in D about when 
 a function is run at compile time or run time. None. Zero. It 
 is entirely unnecessary to add a keyword for that.
There is no ambiguity for the compiler, but that is not the same as programmers having a full overview of what goes on in a complex code base that they might not even have written themselves. *consteval* is just a stronger version of *constexpr*; what D does is roughly equivalent to making everything constexpr in C++.
 Many times obvious things are obvious only in retrospect, and 
 we get so comfortable with them we can no longer imagine 
 otherwise.
But in this case it is obvious. It is so obvious that people added macro-languages to their builds to get similar effects.
 I implemented modules in 10 lines of code for C. It seems so 
 obvious - why didn't I do it 40 years ago? I can't explain it. 
 Why did it take C++ 35 years to come up with modules?
Probably because C++ just was an addition to C and they got around that in a more generic way by introducing namespaces. I like namespaces btw. The only reason to add modules to C++ is that people are undisciplined and #include everything rather than just what they need. There are no technical reasons to add modules to C++, IMHO.
 I kick myself about Autotune. It's so obvious even the inventor 
 didn't think of it. His wife, a singer, casually mentioned to 
 him that it would be nice to have a device that fixed her pitch.
Autotune is more about fashion and marketing, being picked up by influential producers; by the time it became a plague, the music market was accustomed to an "electronic sound" on the radio. Fairly advanced usage of phase vocoders and pitch-trackers was in use in music prior to this. There is a difference between existing and becoming fashionable. The original autotune effect sounds bad in terms of musical quality. You could say the same thing about bit crushers (basically taking a high fidelity signal and setting the lower bits to zero), which create aliasing in the sound. Things that "objectively" sound bad can become fashionable for a limited time period (or become a musical style and linger on).
May 01 2022
next sibling parent reply Guillaume Piolat <first.last gmail.com> writes:
On Sunday, 1 May 2022 at 14:36:12 UTC, Ola Fosheim Grøstad wrote:
 Autotune is more about fashion and marketing
 
 Things that "objectively" sounds bad can become fashionable for 
 a limited time period (or become a musical style and linger on).
It has been just a fad for over 23 years now.
May 01 2022
parent reply claptrap <clap trap.com> writes:
On Sunday, 1 May 2022 at 15:50:17 UTC, Guillaume Piolat wrote:
 On Sunday, 1 May 2022 at 14:36:12 UTC, Ola Fosheim Grøstad 
 wrote:
 Autotune is more about fashion and marketing
 
 Things that "objectively" sounds bad can become fashionable 
 for a limited time period (or become a musical style and 
 linger on).
It has been just a fad for over 23 years now.
And 99.9% of the time you're listening to AutoTuned vocals you don't even know.
May 01 2022
next sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 5/1/2022 9:31 AM, claptrap wrote:
 And 99.9% of the time you're listening to AutoTuned vocals you dont even know.
It's gotten to the point where I can tell :-) I don't mean cranking it up to an absurd point like Cher did. I mean the subtle use of it, as it was intended. What has happened is the *style* of singing has changed to accommodate use of autotune. This becomes most apparent if you listen to a lot of singers in the 1970s vs today. Modern singers also, since they don't have to train to be on pitch, don't seem to train to develop a good tone, either. Their voices just don't have good tone compared to 70's singers.
May 01 2022
next sibling parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Sunday, 1 May 2022 at 18:09:16 UTC, Walter Bright wrote:
 What has happened is the *style* of singing has changed to 
 accommodate use of autotune. This becomes most apparent if you 
 listen to a lot of singers in the 1970s vs today. Modern 
 singers also, since they don't have to train to be on pitch, 
 don't seem to train to develop a good tone, either. Their 
 voices just don't have good tone compared to 70's singers.
It is true that there are styles inspired (or defined) by pitch-correction; actually some singers are so nimble and pitch-perfect that people refuse to believe that they don't use pitch correction!

However, if you are talking about doing minor adjustments in post production, you would be better off using spectral tools. Auto-Tune did not invent pitch correction; the author of the software didn't discover something obvious as you claimed. It might have been the first marketable successful product, but the concept was there beforehand and was well known.

The basics of a phase vocoder aren't all that complex: you do an FFT, then you detect the peaks, then you move the peaks, adjust the phases, and do an inverse FFT. (If you shift far you have to correct the peaks to avoid an effect that sounds like Mickey Mouse.) There are a couple of challenges though: one is that consonants and other complex transients get smeared, so you have to fix that with some hack. You also have the issue of tracking pitch correctly, which I believe is where Autotune cut costs by doing pitch tracking by autocorrelation more cheaply. That is a technical improvement, not a conceptual one.
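For concreteness, a very rough D sketch of that loop (window, FFT, modify spectrum, inverse FFT, overlap-add), assuming a 1024-point frame, 75% overlap and a Hann window; the peak detection and phase adjustment, which are the genuinely hard part, are left as a marked gap:

    import std.math : cos, PI;
    import std.numeric : Fft;

    enum N = 1024;      // frame size (assumption)
    enum hop = N / 4;   // 75% overlap (assumption)

    // output is assumed zero-filled and at least input.length long
    void phaseVocoder(const(double)[] input, double[] output)
    {
        auto fft = new Fft(N);

        double[N] win;                       // Hann analysis/synthesis window
        foreach (i; 0 .. N)
            win[i] = 0.5 - 0.5 * cos(2.0 * PI * i / N);

        for (size_t pos = 0; pos + N <= input.length; pos += hop)
        {
            double[N] frame;
            foreach (i; 0 .. N)              // window the current chunk
                frame[i] = input[pos + i] * win[i];

            auto spec = fft.fft(frame[]);    // forward FFT: Complex!double[]
            // ... detect the spectral peaks, move them, adjust phases here ...
            auto back = fft.inverseFft(spec);

            foreach (i; 0 .. N)              // windowed overlap-add resynthesis
                output[pos + i] += back[i].re * win[i];
        }
    }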
May 01 2022
parent Guillaume Piolat <first.last gmail.com> writes:
On Sunday, 1 May 2022 at 19:02:41 UTC, Ola Fosheim Grøstad wrote:
 You also have the issue of tracking pitch correctly, which I 
 believe is where Autotune cut costs by doing pitch tracking by 
 autocorrelation more cheaply. That is a technical improvement, 
 not a conceptual one.
There is indeed a 1999 technique by the Autotune makers that did track pitch with a kind of "sliding auto-correlation" (you would slide spectrally, but instead of computing a single DFT coefficient, you compute a correlation coefficient). I believe this is maybe the technique that was famously imported from seismic detection.
May 01 2022
prev sibling next sibling parent reply monkyyy <crazymonkyyy gmail.com> writes:
On Sunday, 1 May 2022 at 18:09:16 UTC, Walter Bright wrote:
 It's gotten to the point where I can tell :-)

 I don't mean cranking it up to an absurd point like Cher did. I 
 mean the subtle use of it, as it was intended.

 What has happened is the *style* of singing has changed to 
 accommodate use of autotune. This becomes most apparent if you 
 listen to a lot of singers in the 1970s vs today. Modern 
 singers also, since they don't have to train to be on pitch, 
 don't seem to train to develop a good tone, either. Their 
 voices just don't have good tone compared to 70's singers.
The children now love luxury; they have bad manners, contempt for authority; they show disrespect for elders and love chatter in place of exercise. Children are now tyrants, not the servants of their households. They no longer rise when elders enter the room. They contradict their parents, chatter before company, gobble up dainties at the table, cross their legs, and tyrannize their teachers.
May 01 2022
parent Patrick Schluter <Patrick.Schluter bbox.fr> writes:
On Sunday, 1 May 2022 at 19:43:42 UTC, monkyyy wrote:
 On Sunday, 1 May 2022 at 18:09:16 UTC, Walter Bright wrote:
 [...]
The children now love luxury; they have bad manners, contempt for authority; they show disrespect for elders and love chatter in place of exercise. Children are now tyrants, not the servants of their households. They no longer rise when elders enter the room. They contradict their parents, chatter before company, gobble up dainties at the table, cross their legs, and tyrannize their teachers.
Yes, we had 2.5 millennia to correct that impression ;-)
May 02 2022
prev sibling parent reply claptrap <clap trap.com> writes:
On Sunday, 1 May 2022 at 18:09:16 UTC, Walter Bright wrote:
 On 5/1/2022 9:31 AM, claptrap wrote:
 And 99.9% of the time you're listening to AutoTuned vocals you 
 dont even know.
It's gotten to the point where I can tell :-)
How do you know when you can't tell? You don't. You just assume that because you spot it sometimes, you can always tell. You can't.

And the thing about singers being better in the 70s: it's not true. It's just that we've forgotten 90% of the music and we only remember the good stuff. It's natural selection. 20 or 30 years from now people will say the same about the 2010s, because all the crap will have been forgotten and only the good stuff remains. There's a name for it but I can't remember what it is.

I mean, seriously, look up the charts for a specific week in the 70s, or 80s or whatever; most of it was awful. But we just remember the stuff that stood the test of time.
May 01 2022
next sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 5/1/2022 6:44 PM, claptrap wrote:
 On Sunday, 1 May 2022 at 18:09:16 UTC, Walter Bright wrote:
 On 5/1/2022 9:31 AM, claptrap wrote:
 And 99.9% of the time you're listening to AutoTuned vocals you dont even know.
It's gotten to the point where I can tell :-)
How do you know when you cant tell? You dont, you just assume because you spot it sometimes you can always tell, you cant.
It leaves artifacts, and its use changes the style of singing.
 and the thing about singers being better in the 70s, it's not true, it's just 
 that we've forgotten 90% of the music and we only remember the good stuff.
It's 
 natural selection. 20 or 30 years from now people will say the same about the 
 2010s, because all the crap will have been forgotten and only the good stuff 
 remains. There's a name for it but I cant remember what it is.
Survivorship bias, yes, it's a real thing.
 I mean seriously look up the charts for a specific week in the 70s, or 80s or 
 whatever, most of it was awful. But we just remember the stuff that stood the 
 test of time.
There were a lot of awful songs, sure. But that's not what I'm talking about. I'm talking about the voice quality of the singer. It's not just hazy old memories of mine - this stuff is all available to listen to today at the click of a mouse, and the recordings are as high quality as today. You can hear and compare for yourself. May I suggest Roberta Flack, Karen Carpenter, Madonna, Robert Plant, Greg Lake, Jimmy Somerville.

I've heard music executives on documentaries say they no longer look for good singers, because they can fix their singing electronically. They look for someone who looks good and has charisma. Singers today are autotuned and electronically "sweetened". It just doesn't sound natural.

(An extreme form of this is thrash metal singers who run their voices through a guitar effects pedal(!) but I'm not talking about that. There's also vocoder use, but that isn't trying to make the singer sound better.)
May 01 2022
parent Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Monday, 2 May 2022 at 03:53:11 UTC, Walter Bright wrote:
 I've heard music executives on documentaries say they no longer 
 look for good singers, because they can fix their singing 
 electronically. They look for someone who looks good and has 
 charisma.
Madonna is a prime example of that, though... but it is true that the charts contain overproduced music, or worse: there is much less variation of style in the chart tops now than in the 80s. But if you move out of the top you find singers like Yeba and Angelina Jordan who are the real deal in terms of talent.
May 01 2022
prev sibling parent bauss <jj_1337 live.dk> writes:
On Monday, 2 May 2022 at 01:44:03 UTC, claptrap wrote:
 On Sunday, 1 May 2022 at 18:09:16 UTC, Walter Bright wrote:
 On 5/1/2022 9:31 AM, claptrap wrote:
 And 99.9% of the time you're listening to AutoTuned vocals 
 you dont even know.
It's gotten to the point where I can tell :-)
How do you know when you cant tell? You dont, you just assume because you spot it sometimes you can always tell, you cant. and the thing about singers being better in the 70s, it's not true, it's just that we've forgotten 90% of the music and we only remember the good stuff. It's natural selection. 20 or 30 years from now people will say the same about the 2010s, because all the crap will have been forgotten and only the good stuff remains. There's a name for it but I cant remember what it is. I mean seriously look up the charts for a specific week in the 70s, or 80s or whatever, most of it was awful. But we just remember the stuff that stood the test of time.
I agree entirely with you. Even though there's a lot of bad music being made, there's still so much good music too. I don't think it's really that much different from back then, and I believe that for nostalgic reasons people won't think newer music is better, even when it is. Same reason some people think movies etc. were better back then, when that isn't close to the truth either. Tons of movies I watched as a child and thought were amazing that I rewatched as an adult and hated.
May 02 2022
prev sibling parent reply bauss <jj_1337 live.dk> writes:
On Sunday, 1 May 2022 at 16:31:41 UTC, claptrap wrote:
 On Sunday, 1 May 2022 at 15:50:17 UTC, Guillaume Piolat wrote:
 On Sunday, 1 May 2022 at 14:36:12 UTC, Ola Fosheim Grøstad 
 wrote:
 Autotune is more about fashion and marketing
 
 Things that "objectively" sounds bad can become fashionable 
 for a limited time period (or become a musical style and 
 linger on).
It has been just a fad for over 23 years now.
And 99.9% of the time you're listening to AutoTuned vocals you dont even know.
Autotune and vocal mixing are two different things, albeit the general population doesn't know the difference and thinks they're the same.

A lot of people mistake vocal mixing for autotune, when it really isn't.

Autotune takes vocals as input and changes each pitch to match a specific pitch etc.

Vocal mixing might fix individual notes that were just sung, in the chorus and stuff like that; you don't go through all pitches in the vocal sample. On top of that it might add reverb, compression etc., all of which has nothing to do with autotune, but improves the sound a lot.
May 02 2022
parent reply claptrap <clap trap.com> writes:
On Monday, 2 May 2022 at 07:39:29 UTC, bauss wrote:
 On Sunday, 1 May 2022 at 16:31:41 UTC, claptrap wrote:
 On Sunday, 1 May 2022 at 15:50:17 UTC, Guillaume Piolat wrote:
 On Sunday, 1 May 2022 at 14:36:12 UTC, Ola Fosheim Grøstad
 Autotune and vocal mixing are two different things, albide the 
 normal population don't know the difference and think it's the 
 same.

 A lot of people mistake vocal mixing for autotune, when it 
 really isn't.

 Autotune takes vocals as input and changes each pitch to match 
 a specific pitch etc.

 Vocal mixing, might fix individual notes that were just sung 

 in the chorus and stuff like that, you don't go through all 
 pitches in the vocal sample, on top of that it might add 
 reverb, compression etc. all of which has nothing to do with 
 autotune, but improves the sound a lot.
Yeah, that was started by Melodyne, which came out pretty soon after AutoTune, and that really was pretty mind-blowing at the time. But even before the "digital revolution" in sound recording, producers would just record multiple vocal tracks and cut in and out on the mixing desk, or cut the actual tape and splice it together. Then it was done with DAWs and samplers; now it's done with stuff like Melodyne and Autotune. And most people have no idea. Record producers have been fixing vocals since the invention of magnetic tape.
May 02 2022
parent Walter Bright <newshound2 digitalmars.com> writes:
On 5/2/2022 7:32 AM, claptrap wrote:
 But even before the "digital revolution" in sound recording producers would
just 
 record multiple vocal tracks and cut in and out on the mixing desk or cut the 
 actual tape and splice it together.
True, but in the documentary I saw on Autotune that was very time consuming and expensive, and required many takes. Hence the value that Autotune added.
 Record producers have been fixing vocals since the invention of magnetic tape.
I started liking live performances because they were imperfect :-)
May 03 2022
prev sibling parent reply claptrap <clap trap.com> writes:
On Sunday, 1 May 2022 at 14:36:12 UTC, Ola Fosheim Grøstad wrote:
 On Sunday, 1 May 2022 at 08:10:28 UTC, Walter Bright wrote:

 Autotune is more about fashion and marketing, being picked up 
 by influential producers, at the time it became a plague the 
 music market was accustomed to "electronic sound" on the radio. 
  Fairly advanced usage of phase vocoders and pitch-trackers 
 were in use in music prior to this.
There was no automatic pitch correction before AutoTune. There were pitch shifters, and offline editing, but nothing automatic and real-time as far as I remember. I'm not even sure phase vocoders would have been feasible on DSP hardware in those days.
May 01 2022
parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Sunday, 1 May 2022 at 16:31:20 UTC, claptrap wrote:
 and real time as far as I remember. Im not even sure phase 
 vocoders would have been feasible on dsp hardware in those days.
It was possible. The original papers on phase vocoders discuss real time.
May 01 2022
parent reply claptrap <clap trap.com> writes:
On Sunday, 1 May 2022 at 16:56:12 UTC, Ola Fosheim Grøstad wrote:
 On Sunday, 1 May 2022 at 16:31:20 UTC, claptrap wrote:
 and real time as far as I remember. Im not even sure phase 
 vocoders would have been feasible on dsp hardware in those 
 days.
It was possible. The original papers on phase vocoders discuss real time.
I said it likely wasn't "feasible", not that it was impossible. Even the high-end digital effects units in the mid 90s only managed a handful of basic effects at the same time, and they usually did that by using multiple chips, with different chips handling different blocks in the chain. A phase vocoder would have been pretty hard to pull off on that kind of hardware, even if it was possible to a level of quality that was useful. I mean it's basically fft-->processing-->ifft, which is an order of magnitude or two more than the 20 or 30 cycles/sample you need to implement a reasonable quality chorus, phaser or EQ, etc... Which is likely why AutoTune appeared first as a DAW plugin.
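As a rough back-of-envelope check of that claim (assuming a radix-2 FFT costing about 5*N*log2(N) flops, N = 1024 and a hop of 256 samples): two FFTs per frame cost about 2 * 5 * 1024 * 10, roughly 100k flops, spread over 256 new samples, i.e. around 400 flops per sample before any spectral processing. That is indeed an order of magnitude or two above the 20-30 cycles/sample budget of a simple chorus or EQ.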
May 01 2022
parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Monday, 2 May 2022 at 01:43:03 UTC, claptrap wrote:
 I said it likely wasn't "feasible" not that it was impossible. 
 Even the high end digital effects units in the mid 90s only 
 managed a handful of basic effects at the same time, and they 
 usually did that by using multiple chips, with different chips 
 handling different blocks in the chain. A phase vocoder would 
 have been pretty hard to pull off on that kind of hardware even 
 if it was possible to a level of quality that was useful.
Technically even the Motorola 56000 can do over 500 FFTs per second with a window size of 1024, according to Wikipedia. So the phase vocoder part was feasible, but it might not have been sonically feasible, in the sense that you would not end up with a product believed to be marketable, or that it wasn't believed to be feasible to reach a sonic quality that would satisfy the market. That could come down to pitch-tracking, phase-vocoder issues, or the details of putting it all together.

Phase vocoders do introduce artifacts in the sound; it kinda follows from the uncertainty principle: you get to choose between high resolution in time or high resolution in frequency, but not both. So when you modify the sound of chunks of sound only in the frequency domain (with no concern for time) and then glue those chunks back together, you will get something that has changed not only in pitch (in the general case). So it takes a fair amount of cleverness and time-consuming fiddling to "suppress" those "time domain artifacts" in such a way that we don't find it disturbing. (But as I said, by the late 90s such artifacts were becoming the norm in commercial music. House music pushed the sound of popular music in that direction throughout the 90s.)

However, the concept of decomposing sound into spectral components in order to modify or improve the resulting sound has been an active field ever since ordinary computers were able to run FFT in reasonable time. So there is no reason to claim that someone suddenly woke up with this obvious idea that nobody had thought about before. It comes down to executing and hitting a wave (being adopted).

In general, truly original innovators rarely succeed in producing a marketable product. Market success usually happens by someone else with the right knowledge taking ideas that exist, refining them, making them less costly to produce, and using good marketing at the right time (+ a stroke of luck, like being picked up by someone that gives it traction). "Someone woke up with an obvious idea that nobody had thought about before" makes for good journalistic entertainment, but is usually not true. Successful products tend to come in the wake of "not quite there" efforts. You very rarely find examples of the opposite. (The exception might be in chemistry, where people stumble upon a substance with interesting properties.)
May 02 2022
next sibling parent reply user1234 <user1234 12.de> writes:
On Monday, 2 May 2022 at 08:52:06 UTC, Ola Fosheim Grøstad wrote:
 On Monday, 2 May 2022 at 01:43:03 UTC, claptrap wrote:
 I said it likely wasn't "feasible" not that it was impossible. 
 Even the high end digital effects units in the mid 90s only 
 managed a handful of basic effects at the same time, and they 
 usually did that by using multiple chips, with different chips 
 handling different blocks in the chain. A phase vocoder would 
 have been pretty hard to pull off on that kind of hardware 
 even if it was possible to a level of quality that was useful.
Technically even the Motorola 56000 can do over 500 FFTs per second with a window size of 1024 according to Wikipedia. So the phase vocoder part was feasible, but it might not have been sonically feasible in the sense that you would not end up with a product believed to be marketable or that it wasn't believed to be feasible to reach a sonic quality that would satisfy the market. That could come down to pitch-tracking, phase-vocoder issues or the details of putting it all together. Phase vocoders do introduce artifacts in the sound, it kinda follows from the uncertainty principle, you get to choose between high resolution in time or high resolution in frequency, but not both. So when you modify the sound of chunks of sound only in the frequency domain (with no concern for time) and then glue those chunks back together you will get something that has changed not only in pitch (in the general case). So it takes a fair amount of cleverness and time consuming fiddling to "suppress" those "time domain artifacts" in such a way that we don't find it disturbing. (But as I said, by the late 90s, such artifacts was becoming the norm in commercial music. House music pushed the sound of popular music in a that direction throughout the 90s.)
The concept of "windowing" + "overlap-add" to reduce artifacts is quite old; e.g. the Harris window is from [1978]. I don't know of better ones (typically Hanning is used). This doubles the amount of FFT required for a frame, but you seem to say this was technically possible.

[1978]: https://en.wikipedia.org/wiki/Window_function#Harris
May 02 2022
parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Monday, 2 May 2022 at 08:57:21 UTC, user1234 wrote:
 The concept of "windowing" + "overlapp add" to reduce artifacts 
 is quite old, e.g the Harris Window is [1978]. Dont known for 
 better ones (typically Hanning).
 This doubles the amount of FFT required for a frame but you 
 seem to say this was technically possible.
Yes, I assume anyone who knows about FFT also knows the theory for windowing? The theoretically "optimal" one for analysis is DPSS, although Kaiser is basically the same, but I never use those. I use 4x not 2x, and Hann^2 (cos*cos) as the window function, for simplicity. The reason for this is that when you heavily modify the frequency content you need to window it again. So you multiply with cos(t) twice, but when you add them together the sum = 1. Probably not optimal, but easy to deal with for experiments.

I also believe it is possible to use Hann-Poisson for analysis. It has excessive spectral leakage, but supposedly allows you to accurately find the peaks, as the flanks of the spectral leakage are monotone (smooth slope), so you can use hill climbing. But I doubt you can use this for resynthesis. What you could do is use Hann-Poisson for detecting peaks and then use another window function for resynthesis. I will try this some day :-).
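A quick numerical check of that overlap property (a sketch, assuming N = 1024, hop = N/4 and the periodic Hann definition; with these conventions the squared window sums to the constant 3/2, which is then scaled away to get 1):

    import std.math : cos, isClose, PI;

    enum N = 1024;      // frame size (assumption)
    enum hop = N / 4;   // 4x overlap, as described above

    // periodic Hann window
    double hann(size_t i) { return 0.5 - 0.5 * cos(2.0 * PI * i / N); }

    void main()
    {
        foreach (n; N .. 2 * N)   // any steady-state sample position
        {
            double sum = 0.0;
            // every frame start (a multiple of hop) whose window covers n
            for (size_t start = (n - N + hop) / hop * hop; start <= n; start += hop)
                sum += hann(n - start) ^^ 2;   // analysis window * synthesis window
            assert(isClose(sum, 1.5));  // constant, so overlap-add reconstructs up to scale
        }
    }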
May 02 2022
parent reply user1234 <user1234 12.de> writes:
On Monday, 2 May 2022 at 12:07:17 UTC, Ola Fosheim Grøstad wrote:
 On Monday, 2 May 2022 at 08:57:21 UTC, user1234 wrote:
 The concept of "windowing" + "overlap-add" to reduce 
 artifacts is quite old; e.g. the Harris window dates to [1978]. 
 I don't know of better ones (Hanning is the typical choice).
 This doubles the number of FFTs required per frame, but you 
 seem to say this was technically possible.
Yes, I assume anyone who knows about FFT also knows the theory for windowing? The theoretically "optimal" one for analysis is DPSS, although Kaiser is basically the same, but I never use those.
OK, I thought the artifacts you mentioned were about not using a window, or the rectangular window ;)
May 04 2022
parent reply Ola Fosheim Grøstad <ola.fosheim.grostad gmail.com> writes:
On Wednesday, 4 May 2022 at 17:23:48 UTC, user1234 wrote:
 OK, I thought the artifacts you mentioned were about not using 
 a window, or the rectangular window ;)
Understood. One trick is to shift the whole content sideways by 50% before doing the FFT, so that the phase information you get is referenced to the center of the bell-shaped top of the window function, which simplifies calculations. (IIRC simpler effects can be done with 50% overlap, and 75% for more general usage.)
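In code the trick is just a circular rotation of the (already windowed) frame; a minimal sketch:

    import std.algorithm : bringToFront;

    // Rotate the frame by half its length before the FFT, so measured
    // phases are referenced to the center of the window rather than
    // its left edge ("zero-phase" windowing).
    void centerPhase(double[] frame)
    {
        bringToFront(frame[0 .. $ / 2], frame[$ / 2 .. $]);
    }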
May 04 2022
parent reply user1234 <user1234 12.de> writes:
On Wednesday, 4 May 2022 at 17:38:28 UTC, Ola Fosheim Grøstad 
wrote:
 On Wednesday, 4 May 2022 at 17:23:48 UTC, user1234 wrote:
 OK, I thought the artifacts you mentioned were about not using 
 a window, or the rectangular window ;)
Understood. One trick is to shift the whole content sideways by 50% before doing the FFT, so that the phase information you get is referenced to the center of the bell-shaped top of the window function, which simplifies calculations. (IIRC simpler effects can be done with 50% overlap, and 75% for more general usage.)
15 years later, Prosoniq Morph is still top notch, I hope. The quality of the sound it produces has still not been matched; it's still the best you can do in the frequency domain ;) Well, there is also SpectrumWorx, but the morphing you could do with that product never reached the quality of Morph.
May 04 2022
parent reply Ola Fosheim Grøstad <ola.fosheim.grostad gmail.com> writes:
On Wednesday, 4 May 2022 at 18:14:21 UTC, user1234 wrote:
 15 years later, Prosoniq Morph is still top notch, I hope.
 The quality of the sound it produces has still not been matched;
 it's still the best you can do in the frequency domain ;)
I have never tried audio morphing plugins, but it sounds fun! A long time ago I read a paper that decomposed sound into noise and sine components and used that for morphing between instruments or spectrally changing the character of an instrument. I cannot find the paper… but it seemed to use the same principles as Melodyne.
May 04 2022
parent reply user1234 <user1234 12.de> writes:
On Wednesday, 4 May 2022 at 20:31:52 UTC, Ola Fosheim Grøstad 
wrote:
 On Wednesday, 4 May 2022 at 18:14:21 UTC, user1234 wrote:
 15 years later, Prosoniq Morph is still top notch, I hope.
 The quality of the sound it produces has still not been matched;
 it's still the best you can do in the frequency domain ;)
I have never tried audio morphing plugins,
seriously ?
May 04 2022
parent reply Ola Fosheim Grøstad <ola.fosheim.grostad gmail.com> writes:
On Wednesday, 4 May 2022 at 20:48:12 UTC, user1234 wrote:
 seriously ?
I am interested in the ideas, not really the products. :-) For hobby music I actually am most happy if I get to create all the sounds from scratch using basic math + flangers + reverb, but it is time consuming…
May 04 2022
parent reply user1234 <user1234 12.de> writes:
On Wednesday, 4 May 2022 at 21:24:40 UTC, Ola Fosheim Grøstad 
wrote:
 On Wednesday, 4 May 2022 at 20:48:12 UTC, user1234 wrote:
 seriously ?
I am interested in the ideas, not really the products. :-) For hobby music I actually am most happy if I get to create all the sounds from scratch using basic math + flangers + reverb, but it is time consuming…
https://www.youtube.com/watch?v=Vkfpi2H8tOE https://www.youtube.com/watch?v=7liQx92aoKk
May 06 2022
parent reply user1234 <user1234 12.de> writes:
On Friday, 6 May 2022 at 22:38:06 UTC, user1234 wrote:
 On Wednesday, 4 May 2022 at 21:24:40 UTC, Ola Fosheim Grøstad 
 wrote:
 On Wednesday, 4 May 2022 at 20:48:12 UTC, user1234 wrote:
 seriously ?
I am interested in the ideas, not really the products. :-) For hobby music I actually am most happy if I get to create all the sounds from scratch using basic math + flangers + reverb, but it is time consuming…
https://www.youtube.com/watch?v=Vkfpi2H8tOE https://www.youtube.com/watch?v=7liQx92aoKk
https://www.youtube.com/watch?v=BXf1j8Hz2bU https://www.youtube.com/watch?v=l3QxT-w3WMo life is so ;)
May 06 2022
next sibling parent Ola Fosheim Grøstad <ola.fosheim.grostad gmail.com> writes:
On Friday, 6 May 2022 at 23:29:38 UTC, user1234 wrote:
 https://www.youtube.com/watch?v=Vkfpi2H8tOE
 https://www.youtube.com/watch?v=7liQx92aoKk
https://www.youtube.com/watch?v=BXf1j8Hz2bU https://www.youtube.com/watch?v=l3QxT-w3WMo life is so ;)
Thank you for the music! (Abba?) I once created this simple techno thing in Audacity, with all the sounds done with math in the built-in language: https://m.soundcloud.com/bambinella/electro-storm-2013 The editing got tedious, and it is kinda difficult to change your mind, so you had better not regret what you started with. The process is fun though.
May 06 2022
prev sibling parent reply Basile B. <b2.temp gmx.com> writes:
On Friday, 6 May 2022 at 23:29:38 UTC, user1234 wrote:
 On Friday, 6 May 2022 at 22:38:06 UTC, user1234 wrote:
 On Wednesday, 4 May 2022 at 21:24:40 UTC, Ola Fosheim Grøstad 
 wrote:
 On Wednesday, 4 May 2022 at 20:48:12 UTC, user1234 wrote:
 seriously ?
I am interested in the ideas, not really the products. :-) For hobby music I actually am most happy if I get to create all the sounds from scratch using basic math + flangers + reverb, but it is time consuming…
https://www.youtube.com/watch?v=Vkfpi2H8tOE https://www.youtube.com/watch?v=7liQx92aoKk
https://www.youtube.com/watch?v=BXf1j8Hz2bU https://www.youtube.com/watch?v=l3QxT-w3WMo life is so ;)
CTFE is so. Several implementations may exist but people only remember one in particular.
May 09 2022
parent Basile B. <b2.temp gmx.com> writes:
On Monday, 9 May 2022 at 20:45:41 UTC, Basile B. wrote:
 On Friday, 6 May 2022 at 23:29:38 UTC, user1234 wrote:
 On Friday, 6 May 2022 at 22:38:06 UTC, user1234 wrote:
 On Wednesday, 4 May 2022 at 21:24:40 UTC, Ola Fosheim Grøstad 
 wrote:
 On Wednesday, 4 May 2022 at 20:48:12 UTC, user1234 wrote:
 seriously ?
I am interested in the ideas, not really the products. :-) For hobby music I actually am most happy if I get to create all the sounds from scratch using basic math + flangers + reverb, but it is time consuming…
https://www.youtube.com/watch?v=Vkfpi2H8tOE https://www.youtube.com/watch?v=7liQx92aoKk
https://www.youtube.com/watch?v=BXf1j8Hz2bU https://www.youtube.com/watch?v=l3QxT-w3WMo life is so ;)
CTFE is so. Several implementations may exist but people only remember one in particular.
IMO the one that allows `mixin (makesomecode())` is the only one that's worthwhile, whether it was first or not.
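For reference, the pattern in its minimal form (a toy example of mine):

    import std.stdio;

    // An ordinary function, run through CTFE because its result is
    // needed at compile time; it returns D code as a string.
    string makesomecode()
    {
        string s;
        foreach (name; ["x", "y", "z"])
            s ~= "int " ~ name ~ ";\n";
        return s;
    }

    mixin(makesomecode()); // declares module-level ints x, y and z

    void main()
    {
        x = 1; y = 2; z = 3;
        writeln(x + y + z); // 6
    }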
May 09 2022
prev sibling next sibling parent reply Guillaume Piolat <first.last gmail.com> writes:
On Monday, 2 May 2022 at 08:52:06 UTC, Ola Fosheim Grøstad wrote:
  (But as I said, by the late 90s, such artifacts was becoming 
 the norm in commercial music. House music pushed the sound of 
 popular music in a that direction throughout the 90s.)
Sometimes artifacts sound "good", be it for cultural or "objective" reasons.

Many small delays can help a voice "fit in the mix", and spectral leakage in a phase vocoder does just that. So some may want to come through an STFT process just for the sound of the leakage, which makes a voice sound "processed" (even without pitch change). Why? Because in a live performance, you would have those delays because of mic leakage.

It is also true of the artifacts that lead to reduced dynamics (such as phase misalignment in a phase vocoder). Didn't like those annoying vocal dynamics? Here is less of them, as a side effect.

The phase-shift in oversampling? It can make drums sound more processed by delaying the basses, again. To the point people use oversampling for processors that only add minimal aliasing.

Plus in the 2020s, anything with the sound of a popular codec is going to sound "good" because it's the sound of streaming.
May 02 2022
parent reply Ola Fosheim Grøstad <ola.fosheim.grostad gmail.com> writes:
On Monday, 2 May 2022 at 09:23:10 UTC, Guillaume Piolat wrote:
 Sometimes artifacts sound "good", be it for cultural or 
 "objective" reason.
Yes, this is true. Like, the loudness competition that led to excessive use of compression (multiband?) and ducking (to let the bass drum through) led to a sound image that was pumping in and out. I personally find that annoying, but when you see kids driving in the streets playing loud music they seem to favour this "musically bad" sound. I guess they find excitement in it, where I think of it as poor mastering. And I guess in some genres it is now considered bad mastering if you don't use excessive compression.

I believe this loudness competition and "overproduction" have also affected non-pop genres. If you get the ability to tweak, it is difficult to stop in time... I frequently find the live performances of talented singers on youtube more interesting than their studio recordings, actually.

The french music scene might be different? French "electro" seemed more refined/sophisticated in the sound than many other "similar" genres, but this is only my impression, which could be wrong.
 Many small delays can help a voice "fit in the mix", and 
 spectral leakage in a phase vocoder does just that. So some may 
 want to come through an STFT process just for the sound of the 
 leakage, which makes a voice sound "processed" (even without 
 pitch change). Why? Because in a live performance, you would 
 have those delays because of mic leakage.
I hadn't thought of that. Interesting perspective about mics, but a phase vocoder has other challenges related to changing the frequency content. How would you create a glissando from scratch just using the inverse FFT? It is not so obvious. How do you tell the difference between a click and a "shhhhhhh" sound? The only difference is in the phase… so not very intuitive in the frequency domain, but very intuitive in the time domain. You don't only get spectral leakage from windowing, you can also get phasing artifacts when you manipulate the frequency content. And so on…

But the audience today is very much accustomed to electronic soundscapes in mainstream music, so sounding "artificial" is not a negative. In the 80s you could see people argue seriously, and with a fair amount of contempt, that electronic music wasn't real music… That is a big difference!

Maybe similar things are happening in programming. Maybe very young programmers have a completely different view of what programming should be like? I don't know, but I've got a feeling that they would view C as a relic of the past. If we were teens, would we focus on the GPU and forget about the CPU, or just patch together libraries in Javascript? Javascript is actually quite capable today, so…
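A quick illustration of the click-versus-"shhh" point, using Phobos' std.numeric.Fft (my sketch; the signals and sizes are arbitrary):

    import std.complex : abs;
    import std.numeric : Fft;
    import std.random : uniform;
    import std.stdio;

    void main()
    {
        enum N = 256;

        // A click: a single impulse.
        auto click = new double[N];
        click[] = 0.0;
        click[0] = 1.0;

        // A noise burst ("shhh"): random polarity at every sample.
        auto noise = new double[N];
        foreach (ref s; noise)
            s = uniform(0, 2) ? 1.0 : -1.0;

        auto f = new Fft(N);
        auto clickSpec = f.fft(click);
        auto noiseSpec = f.fft(noise);

        // The click's magnitude spectrum is exactly flat; the noise's is
        // flat only on average (it varies per bin and per run). All the
        // information separating the two sounds is in the phase.
        writefln("click: |X[1]| = %.2f, |X[64]| = %.2f",
                 clickSpec[1].abs, clickSpec[64].abs);
        writefln("noise: |X[1]| = %.2f", noiseSpec[1].abs);
    }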
 The phase-shift in oversampling? It can make drums sound more 
 processed by delaying the basses, again. To the point people 
 use oversampling for processors that only add minimal aliasing.
I didn't understand this one, do you mean that musicians misunderstand what is causing the effect, so that they think that it is caused by the main effect, but instead it is caused by the internal delay of the unit? Or did you mean something else?
 Plus in the 2020s, anything with the sound of a popular codec 
 is going to sound "good" because it's the sound of streaming.
I hadn't thought of that. I'm not sure if I hear the difference between the original and the mp3 when playing other people's music (maybe the hi-hats). I do hear a difference when listening to my own mix (maybe because I've spent so many hours analysing it).
May 02 2022
parent reply Guillaume Piolat <first.last gmail.com> writes:
On Monday, 2 May 2022 at 11:19:18 UTC, Ola Fosheim Grøstad wrote:
 I guess they find excitement in it, where I think of it as poor 
 mastering. And I guess in some genres it is now considered bad 
 mastering if you don't use excessive compression.
I don't think there is any real reason to trust one's own taste, as taste is socially constructed (cf. La Distinction by Bourdieu) and - simplifying - it reflects too much of your socioeconomic background to be significant. Music particularly reflects that.
 The french music scene might be different? French "electro" 
 seemed more refined/sophisticated in the sound than many other 
 "similar" genres, but this is only my impression, which could 
 be wrong.
French hiphop was amazing (and is popular) from 2017 to ~2021 but I don't think we have something interesting otherwise. French electro is much less interesting than the Argentinian progressive house scene for example, and that's just my opinion again.

A lot of good music gets produced in niches, only to be completely ignored nowadays, so it would be hard to say which scene is interesting; we all get to miss it anyway.
 I didn't understand this one, do you mean that musicians 
 misunderstand what is causing the effect, so that they think 
 that it is caused by the main effect, but instead it is caused 
 by the internal delay of the unit? Or did you mean something else?
Oversampling typically produces:
A. a phase shift
B. anti-aliasing
but because aliasing is a tiny problem in dynamics processing in the first place, people choose to use it while hearing only (A), which can sound good by itself. The by-product becomes more desirable than the non-problem it solves. Now everyone wants the feature!
 I do hear a difference when listening to my own mix (maybe 
 because I've spent so many hours analysing it).
If a typically polished song is listened to as MP3, then MP3 becomes the norm. And then what-everyone-else-is-doing sincerely sounds better to our ears. A process you could call "legitimation".

I had a strange conversation about Autotune once with a 20-year-old:
- a heavily autotuned voice sounded "normal" and not-autotuned to her
- but the _talkbox_ in Kavinsky - Nightcall sounded ugly to her and "autotuned". She mentioned, of course, that she didn't like Autotune. But she was unable to identify it in practice.
May 02 2022
parent reply Ola Fosheim Grøstad <ola.fosheim.grostad gmail.com> writes:
On Monday, 2 May 2022 at 13:44:24 UTC, Guillaume Piolat wrote:
 I don't think there is any real reason to trust one's own taste, 
 as taste is socially constructed (cf. La Distinction by 
 Bourdieu) and - simplifying - it reflects too much of your 
 socioeconomic background to be significant. Music particularly 
 reflects that.
I understand what you say, but with regard to aesthetic analysis you can think in terms of multiple dimensions. Some music is "meaningful" or "complex" in many dimensions.

Socioeconomics matter, but take Eurovision or TV singing contests. When you take the average of everyone's taste, you end up with not-very-interesting music, at best engaging entertainment. I was recently very disappointed in the Norwegian version of The Voice: there were some phenomenal singers, the professional jury celebrated them, but when the viewers got to vote, they voted for the guy that sang a boring Beatles rendition or the singer with good dance moves… Basically, the good technical vocalists were voted out.

I guess we can discuss the merits of taste, but if "all musicians" would pick one and the majority of "non-musicians" pick another, there are some objective aspects to taste that go beyond "socioeconomic" reasons.
 French hiphop was amazing (and is popular) from 2017 to ~2021 
 but I don't think we have something interesting otherwise. 
 French electro is much less interesting than the Argentinian 
 progressive house scene for example, and that's just my opinion 
 again.
Thanks for the tip, I'll try to find some Argentinian progressive house. Latin producers often add a new flair to dance-oriented genres. (Not to mention the top hip-hop mixing duo Latin Rascals in the 80s, still worth a listen, in my opinion.)
 A lot of good music gets produced in niches, only to be 
 completely ignored nowadays, so it would be hard to say which 
 scene is interesting; we all get to miss it anyway.
It is difficult to be visible when 50000 songs are released every day? (Or was it a different number? Something huge anyway.) It is quite mind-blowing how transformative capable home computers have been.
 Oversampling typically produces:
 A. a phase shift
 B. anti-aliasing
I don't think I understand what you mean by oversampling. Why does sampling at 96kHz instead of 48kHz have any sonic impact? It shouldn't?
 The by-product becomes more desirable than the non-problem it 
 solves. Now everyone wants the feature!
This is new to me, is this related to some of your plugins? Got a link?
 I had a strange conversation about Autotune once with a 
 20-year-old:
 - a heavily autotuned voice sounded "normal" and not-autotuned 
 to her
 - but the _talkbox_ in Kavinsky - Nightcall sounded ugly to her 
 and "autotuned". She mentioned, of course, that she didn't like 
 Autotune. But she was unable to identify it in practice.
Maybe there is an increasing gap in music perception between people who create music as a hobby (or pros) and the average person? Last year [this singer](https://www.youtube.com/watch?v=fAqMMKqmdfY) performed on The Voice Norway without any pitch effects, and of course some would insist that it was Autotune. (That Nightcall song reminds me of an analog 8-channel vocoder I built from a mail-order DIY kit back in the day, from a tiny company called [PAiA](https://paia.com/). :-)
May 02 2022
parent reply Ola Fosheim Grøstad <ola.fosheim.grostad gmail.com> writes:
On Monday, 2 May 2022 at 15:39:41 UTC, Ola Fosheim Grøstad wrote:
 On Monday, 2 May 2022 at 13:44:24 UTC, Guillaume Piolat wrote:
 Oversampling typically produces:
 A. a phase shift
 B. anti-aliasing
I don't think I understand what you mean by oversampling. Why does sampling at 96kHz instead of 48kHz have any sonic impact? It shouldn't?
Having thought some about this: do you mean in AD converters or in DSP?

I don't know too much about state-of-the-art AD circuits, but I would imagine that they use a higher internal sample rate so that they can use an analog filter that does not affect the audible signal in a destructive way, followed by a digital correction filter, followed by decimation? The result ought to be neutral?

Or are you talking about side-effects from low pass filters in the DSP process, moving the knee (-3dB) of the filter out of the audible range by using oversampling? But regardless, you should be able to use a phase-correcting allpass filter, if desired…?

I am not trying to be difficult, I am trying to understand the context.
May 02 2022
parent Guillaume Piolat <first.last gmail.com> writes:
On Monday, 2 May 2022 at 17:36:46 UTC, Ola Fosheim Grøstad wrote:
 Or are you talking about side-effects from low pass filters
Yes, in a DSP process, upsampling and downsampling are two lowpass filters themselves.
 But regardless, you should be able to use a phase-correcting 
 allpass filter, if desired…?
You can go linear phase, yes, but you need to choose between:
- introducing latency for all frequencies (linear phase),
- or introducing a phase shift just for the basses (min-phase).
Min-phase is typically used because it is more efficient and better quality; linear phase sounds "metallic".
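For the curious, the whole 2x wrapper being discussed looks roughly like this (my sketch; the 31-tap windowed-sinc FIR is made up for illustration, and it is exactly where the latency or bass phase shift comes from):

    import std.math : PI, cos, sin;

    // Half-band FIR lowpass for the 2x rate: a short windowed sinc.
    double[] lowpassHalfBand(const double[] x)
    {
        enum taps = 31;
        double[taps] h;
        foreach (i; 0 .. taps)
        {
            immutable t = i - (taps - 1) / 2.0;
            immutable sinc = t == 0 ? 0.5 : sin(PI * t / 2) / (PI * t);
            immutable hann = 0.5 - 0.5 * cos(2 * PI * i / (taps - 1));
            h[i] = 2 * sinc * hann; // gain 2 restores level after zero-stuffing
        }
        auto y = new double[x.length];
        foreach (n; 0 .. x.length)
        {
            double acc = 0;
            foreach (i; 0 .. taps)
                if (n >= i) acc += h[i] * x[n - i];
            y[n] = acc;
        }
        return y;
    }

    // Zero-stuff, lowpass, run the effect at 2x, lowpass, decimate.
    double[] oversample2x(const double[] x, double[] delegate(double[]) process)
    {
        auto up = new double[x.length * 2];
        up[] = 0.0;
        foreach (n, s; x) up[2 * n] = s;
        auto hi = process(lowpassHalfBand(up));
        auto lp = lowpassHalfBand(hi);
        auto y = new double[x.length];
        foreach (n; 0 .. x.length) y[n] = lp[2 * n];
        return y;
    }

A symmetric FIR like this one is linear phase (pure latency); the IIR half-band filters typically used instead, as noted above, are cheaper but shift the phase of the lows.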
May 02 2022
prev sibling parent reply claptrap <clap trap.com> writes:
On Monday, 2 May 2022 at 08:52:06 UTC, Ola Fosheim Grøstad wrote:
 On Monday, 2 May 2022 at 01:43:03 UTC, claptrap wrote:

However, the concept of decomposing sound into spectral components in order to modify or improve on the resulting sound has been an active field ever since ordinary computers were able to run FFT in reasonable time. So there is no reason to claim that someone suddenly woke up with this obvious idea that nobody had thought about before. It comes down to executing and hitting a wave (being adopted).
It was adopted because it was revolutionary: it took something that was a tedious and difficult manual task and made it ridiculously easy. It wasn't about fashion or getting a few bigwig producers to make it popular.

And maybe other people had thought to themselves, "wouldn't it be cool if we had some tool to automatically re-tune the vocals". I mean, "wouldn't it be cool if we could take this tedious manual task and automate it somehow" is probably the main driver of invention. But to focus on that does a disservice to what is involved in actually getting it to work, and especially so in real time.

I used to loiter in a forum for audio software developers, and you know how often people came in and posted "I have this great idea for a product and I just need someone to implement it and we'll make loads of money"? It was all the time, so much so that there was a sticky at the top of the forum telling people why it's a dumb thing to post.

Genius isn't having the idea, it's more often than not making the idea work.
May 02 2022
next sibling parent reply Ola Fosheim Grøstad <ola.fosheim.grostad gmail.com> writes:
On Monday, 2 May 2022 at 14:34:24 UTC, claptrap wrote:
 Genius isn't having the idea, it's more often than not making 
 the idea work.
For me, the most interesting stuff is what comes from people associated with institutions like CCRMA, IRCAM and the like, but I am not sure I would ascribe *genius* to anything related to audio. Most of it is layers of knowledge, not one amazing discovery. I guess Chowning's FM synthesis could qualify, but in general it is a series of smaller steps.
May 02 2022
parent reply claptrap <clap trap.com> writes:
On Monday, 2 May 2022 at 15:22:16 UTC, Ola Fosheim Grøstad wrote:
 On Monday, 2 May 2022 at 14:34:24 UTC, claptrap wrote:
 Genius isn't having the idea, it's more often than not making 
 the idea work.
For me, the most interesting stuff is what comes from people associated with institutions like CCRMA, IRCAM and the like, but I am not sure I would ascribe *genius* to anything related to audio. Most of it is layers of knowledge, not one amazing discovery.
Yeah, genius is probably the wrong word, but what I mean is, it's like that quote about genius being 1% inspiration and 99% perspiration. Focusing on saying the idea was obvious does a disservice to what's involved in actually getting it working. And to be fair, almost all human knowledge is built up in layers. Even when someone solves a really hard problem, you usually find lots of different people have chipped away at it in different ways.
 I guess Chowning's FM synthesis could qualify, but in general 
 it is a series of smaller steps.
See to me that's less impressive, I mean I reckon people were doing FM synthesis with analog hardware already. So it was more likely just a refinement, or exploration, it's actually technically pretty simple. I mean real time pitch tracking and artifact free pitch shifting are orders of magnitude harder problems than FM synthesis. But maybe the implementation was harder because of the limited digital hardware back then?
May 03 2022
next sibling parent reply Ola Fosheim Grøstad <ola.fosheim.grostad gmail.com> writes:
On Tuesday, 3 May 2022 at 14:59:12 UTC, claptrap wrote:
 Yeah, genius is probably the wrong word, but what I mean is, 
 it's like that quote about genius being 1% inspiration and 99% 
 perspiration. Focusing on saying the idea was obvious does 
 a disservice to what's involved in actually getting it working.
Ok, but in DSP I think many ideas are obvious if you know the field, but getting the right mix, the right hacks, the right tweaks, getting it to run fast and making it sound good takes a lot of effort (or can happen as an accident :-). I certainly don't doubt that there are many years of highly skilled effort that has gone into the product as it is today. But that is solid engineering, not a moment of "wowsers!" :-D
 And to be fair, almost all human knowledge is built up in layers. 
 Even when someone solves a really hard problem, you usually find 
 lots of different people have chipped away at it in different 
 ways.
I think what is special about computer music is that the bottom layer is all about human perception of sound. I think knowledge at that layer is more impressive than the other layers. Like, the technology behind mp3 isn't really all that impressive; what makes it impressive is how they used knowledge about human perception (our inability to distinguish differences/resolutions between certain "sound textures"). When developers manage to create new "illusions" based on perceptual psychology and create algorithms that exploit them, you have something special in my eyes (regardless of whether it has any practical application).
 See to me that's less impressive, I mean I reckon people were 
 doing FM synthesis with analog hardware already. So it was more 
 likely just a refinement, or exploration, it's actually 
 technically pretty simple.
It is difficult to find any individual discovery that is obviously impressive, and I guess putting a sin() into another sin() may seem intuitive, given that people already used LFOs. I think the work he put into making it musically useful and expressive, creating new types of bell-like sounds, is why people emphasize his contribution. I find this wiki quote a bit funny: «This was Stanford's most lucrative patent at one time, eclipsing many in electronics, computer science, and biotechnology.» Fooling around with some math expressions paid off! It was apparently first made available in the Synclavier I, which I find interesting; that was the upper high end at the time.
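The sin-into-sin, for what it's worth (a sketch; all the numbers are arbitrary):

    import std.math : PI, sin;

    // Chowning-style FM: a sine whose phase is modulated by another sine.
    double fmSample(double t, double carrier, double modulator, double index)
    {
        return sin(2 * PI * carrier * t + index * sin(2 * PI * modulator * t));
    }

    // A bell-ish tone: non-integer carrier/modulator ratio, e.g.
    //   fmSample(n / 44_100.0, 200.0, 280.0, 5.0)
    // with the index decaying over time.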
 I mean real time pitch tracking and artifact free pitch 
 shifting are orders of magnitude harder problems than FM 
 synthesis.
Many people worked on that, though? It is very much the work of a community… In general most things in audio build on something else. Like, the concept of vocoders is in some ways ingenious, but it was invented for speech in telecom by Bell Labs in the 1930s.
May 03 2022
parent reply claptrap <clap trap.com> writes:
On Tuesday, 3 May 2022 at 15:40:45 UTC, Ola Fosheim Grøstad wrote:
 On Tuesday, 3 May 2022 at 14:59:12 UTC, claptrap wrote:
 Ok, but in DSP I think many ideas are obvious if you know the 
 field, but getting the right mix, the right hacks, the right 
 tweaks, getting it to run fast and making it sound good takes a 
 lot of effort (or can happen as an accident :-). I certainly 
 don't doubt that there are many years of highly skilled effort 
 that has gone into the product as it is today. But that is 
 solid engineering, not a moment of "wowsers!" :-D
That's pretty much my experience. The actual math / "engineering" part is fairly straightforward if you're decent at math. But making it sound good is a bit more art than science, I reckon. I guess that's because at the end of the day it's being used to make art, and that is a much more subjective realm.
 See to me that's less impressive, I mean I reckon people were 
 doing FM synthesis with analog hardware already. So it was 
 more likely just a refinement, or exploration, it's actually 
 technically pretty simple.
It is difficult to find any individual discovery that is obviously impressive, and I guess putting a sin() into another sin() may seem intuitive, given that people already used LFOs. I think the work he put into making it musically useful and expressive, creating new types of bell-like sounds, is why people emphasize his contribution. I find this wiki quote a bit funny: «This was Stanford's most lucrative patent at one time, eclipsing many in electronics, computer science, and biotechnology.»
It's just that the building blocks in an FM synthesiser are quite simple, at least conceptually; I reckon I could knock one up in about 30 minutes, just the audio part anyway. Even the math is pretty straightforward, what sidebands you'll get etc... I think maybe it seems complicated to the end user because it's not very user-friendly for making presets. But it's actually pretty simple, and was probably already being done on analog gear; I mean, I imagine VCOs existed with linear frequency control back then?

AutoTune? I reckon days, maybe? Plus a lot of research and months of time experimenting trying to make it not sound like crap?
 I mean real time pitch tracking and artifact free pitch 
 shifting are orders of magnitude harder problems than FM 
 synthesis.
Many people worked on that, though? It is very much the work of a community… In general most things in audio build on something else. Like, the concept of vocoders is in some ways ingenious, but it was invented for speech in telecom by Bell Labs in the 1930s.
That's engineering though isn't it, the higher you get up complexity wise, the more you're building on work done by other people. It doesn't mean we should only be impressed by people who lay foundations.
May 04 2022
parent Ola Fosheim Grøstad <ola.fosheim.grostad gmail.com> writes:
On Wednesday, 4 May 2022 at 12:30:13 UTC, claptrap wrote:
 at math. But making it sound good is a bit more art than 
 science, I reckon. I guess that's because at the end of the day 
 it's being used to make art, and that is a much more subjective realm.
Yes, that art aspect is what makes this field interesting too, as there is no objectively right or wrong tool. If you can enable artists to create new "modes" of expression, you have a success! (Even if it is as simple as setting all the lower bits to zero in a bitcrusher.) Same thing in visuals: when the researchers got bored with photo-realistic rendering and started looking at non-photo-realistic rendering, they also opened up endless new possibilities for artistic toolmaking.
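The bitcrusher really is that small. A sketch, assuming 16-bit samples and a hypothetical `crush` helper:

    // Keep the top `bits` bits of each sample and zero the rest.
    short crush(short sample, uint bits)
    {
        immutable mask = ~((1 << (16 - bits)) - 1);
        return cast(short)(sample & mask);
    }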
 It's just that the building blocks in an FM synthesiser are quite 
 simple, at least conceptually; I reckon I could knock one up in 
 about 30 minutes, just the audio part anyway. Even the math is 
 pretty straightforward, what sidebands you'll get etc...
Yes, I agree. I only mentioned it because it is difficult to find areas where you can point to one person doing it all on their own. Also, doing computer music on mainframes at that time must have been tedious! The most well-known tool from that time period is [Music V](https://www.britannica.com/topic/Music-V), which has an open source successor in [CSound](https://csound.com/). The latter actually has an online IDE that one can play with for fun: https://ide.csound.com/
 That's engineering though isn't it, the higher you get up 
 complexity wise, the more you're building on work done by other 
 people. It doesn't mean we should only be impressed by people 
 who lay foundations.
Sure, any tool that enables artists to create new expressions more easily is valuable, but it is quite rare that something has not been tried in the past, or something close to it.
May 04 2022
prev sibling parent reply IGotD- <nise nise.com> writes:
On Tuesday, 3 May 2022 at 14:59:12 UTC, claptrap wrote:
[... something about music production ...]
This thread is about why D is unpopular. It has so completely derailed.
May 03 2022
parent reply Ola Fosheim Grøstad <ola.fosheim.grostad gmail.com> writes:
On Tuesday, 3 May 2022 at 15:52:30 UTC, IGotD- wrote:
 On Tuesday, 3 May 2022 at 14:59:12 UTC, claptrap wrote:
[... something about music production ...]
This thread is about why D is unpopular. It has so completely derailed.
We are trying to make D popular in audio-programming…
May 03 2022
parent reply Guillaume Piolat <first.last gmail.com> writes:
On Tuesday, 3 May 2022 at 16:09:30 UTC, Ola Fosheim Grøstad wrote:
 We are trying to make D popular in audio-programming…
Are you serious? Afaik you are just talking here and have never contributed to D. Please do not associate yourself with people walking the walk. Honestly, I just answered your _many_ question marks, which I won't fall for again.
May 03 2022
parent Ola Fosheim Grøstad <ola.fosheim.grostad gmail.com> writes:
On Tuesday, 3 May 2022 at 18:41:42 UTC, Guillaume Piolat wrote:
 On Tuesday, 3 May 2022 at 16:09:30 UTC, Ola Fosheim Grøstad 
 wrote:
 We are trying to make D popular in audio-programming…
Are you serious?
Of course not, this thread turned into a chat a long time ago. No need to be upset.
May 03 2022
prev sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 5/2/2022 7:34 AM, claptrap wrote:
 Genius isn't having the idea, it's more often than not making the idea work.
Yup. I've heard endless arguments that Edison didn't really invent the light bulb, the Wrights did not invent the airplane, Musk did not invent reusable rockets, etc. An idea ain't worth spit if it is not implemented. The value in D's CTFE is that after we demonstrated it, it suddenly became a "must have" feature in the other mainstream native languages.
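For anyone reading along, the demonstration fits in a couple of lines of ordinary D:

    ulong fib(uint n)
    {
        return n < 2 ? n : fib(n - 1) + fib(n - 2);
    }

    enum answer = fib(10); // evaluated by the compiler; the binary just holds 55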
May 02 2022
next sibling parent test123 <test123 gmail.com> writes:
On Tuesday, 3 May 2022 at 03:55:34 UTC, Walter Bright wrote:
 Yup. I've heard endless arguments that Edison didn't really 
 invent the light bulb, the Wrights did not invent the airplane, 
 Musk did not invent reusable rockets, etc.

 An idea ain't worth spit if it is not implemented.

 The value in D's CTFE is that after we demonstrated it, it suddenly 
 became a "must have" feature in the other mainstream native 
 languages.
Wilzbach asked Andrei Alexandrescu to comment at this link 4 years ago, without an answer: https://github.com/dlang/dmd/pull/8460#issuecomment-438920811 Don't ask for it, earn it. Don't ask people for respect without respecting others first.
May 02 2022
prev sibling parent reply Max Samukha <maxsamukha gmail.com> writes:
On Tuesday, 3 May 2022 at 03:55:34 UTC, Walter Bright wrote:
 On 5/2/2022 7:34 AM, claptrap wrote:
 Genius isn't having the idea, it's more often than not making 
 the idea work.
Yup. I've heard endless arguments that Edison didn't really invent the light bulb, the Wrights did not invent the airplane, Musk did not invent reusable rockets, etc. An idea ain't worth spit if it is not implemented.
I disagree. There would have been no light bulb or reusable rocket if not for the hard mental and physical work of thousands of people who will never get due credit.
 The value in D's CTFE is that after we demonstrated it, it suddenly 
 became a "must have" feature in the other mainstream native 
 languages.
Sorry for being a nuisance, but D was not the first to demonstrate it. It's a fact.
May 03 2022
parent reply Ola Fosheim Grøstad <ola.fosheim.grostad gmail.com> writes:
On Tuesday, 3 May 2022 at 08:04:28 UTC, Max Samukha wrote:
 I disagree. There would have been no light bulb or reusable 
 rocket if not for the hard mental and physical work of 
 thousands of people who will never get due credit.
Equating entrepreneurship with genius is an American thing. The US doesn't really have a unifying cultural identity; symbols of capitalism have become the unifier. You see it all the time in US media, money becomes culture (symptoms: Kardashians, Paris Hilton, Donald Trump etc).

Most reasonable people would reserve the term "genius" for people whose intellectual work far surpasses what comes after: Bach, Einstein, Aristotle etc… It has nothing to do with business at all.

(And no, Bill Gates, Elon Musk and Steve Jobs do not qualify.)
May 03 2022
next sibling parent reply bauss <jj_1337 live.dk> writes:
On Tuesday, 3 May 2022 at 08:42:03 UTC, Ola Fosheim Grøstad wrote:
 (And no, Bill Gates, Elon Musk and Steve Jobs do not qualify.)
I partially agree with this, but I would take Bill Gates out of this list as he's actually very smart. He didn't succeed because of his business skills, but because of the technological products that he provided.

Steve Jobs was nothing but a businessman; he wasn't technologically smart IMHO. He just knew what would sell.

I don't consider Elon Musk a genius. He is smart, but far more when it comes to business than when it comes to technology. Most of the inventions by his companies are inventions of other people; Tesla, for example, wasn't an idea he came up with.
May 03 2022
parent Ola Fosheim Grøstad <ola.fosheim.grostad gmail.com> writes:
On Tuesday, 3 May 2022 at 09:40:25 UTC, bauss wrote:
 On Tuesday, 3 May 2022 at 08:42:03 UTC, Ola Fosheim Grøstad 
 wrote:
 (And no, Bill Gates, Elon Musk and Steve Jobs do not qualify.)
I partially agree with this, but I would take Bill Gates out of this list as he's actually very smart.
Right, he is a nerd in the positive sense of the word and does seek knowledge. But being an entrepreneur is more like sports: you build a skilled team, map the terrain and focus on winning; you don't give up when you have setbacks and you ruthlessly pursue your goals. Which earned Microsoft some bad reputation under Gates. Of course, the same applies to other IT giants like Oracle. Gates seems to have evolved a lot as a person after leaving Microsoft, though.
May 03 2022
prev sibling parent reply Max Samukha <maxsamukha gmail.com> writes:
On Tuesday, 3 May 2022 at 08:42:03 UTC, Ola Fosheim Grøstad wrote:

 Equating entrepreneurship with genius is an American thing. The 
 US doesn't really have a unifying cultural identity; symbols of 
 capitalism have become the unifier. You see it all the time in 
 US media, money becomes culture (symptoms: Kardashians, Paris 
 Hilton, Donald Trump etc).

 Most reasonable people would reserve the term "genius" for 
 people whose intellectual work far surpasses what comes after: 
 Bach, Einstein, Aristotle etc… It has nothing to do with 
 business at all.

 (And no, Bill Gates, Elon Musk and Steve Jobs do not qualify.)
Yeah, it's fun to hear Musk saying something like "it's useless to think about things that cannot be turned into a product", while all his products are using the results of the intellectual work that people like him considered useless a century ago.
May 03 2022
parent reply FeepingCreature <feepingcreature gmail.com> writes:
On Tuesday, 3 May 2022 at 09:40:42 UTC, Max Samukha wrote:
 Yeah, it's fun to hear Musk saying something like "it's useless 
 to think about things that cannot be turned into a product", 
 while all his products are using the results of the 
 intellectual work that people like him considered useless a 
 century ago.
It's hard to argue though that the world would not look very different if it had, for instance, a hundred additional Elons Musk in it. I do think there's a value-add there. You need entrepreneurs with a combination of business sense and product focus - that's actually pretty rare.
May 03 2022
next sibling parent reply Max Samukha <maxsamukha gmail.com> writes:
On Tuesday, 3 May 2022 at 10:28:19 UTC, FeepingCreature wrote:

 It's hard to argue though that the world would not look very 
 different if it had, for instance, a hundred additional Elons 
 Musk in it. I do think there's a value-add there. You need 
 entrepreneurs with a combination of business sense and product 
 focus - that's actually pretty rare.
I totally agree. Musk is brilliant. It just would be nice if people stopped calling him an inventor or engineer.
May 03 2022
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 5/3/2022 4:16 AM, Max Samukha wrote:
 I totally agree. Musk is brilliant. It just would be nice if people stopped 
 calling him an inventor or engineer.
I am an engineer, and I've got no problem calling Musk one. He has a degree in physics, which has little difference from engineering. He's the Chief Engineer at SpaceX. As for being an inventor, here's a list of his patents: https://patents.justia.com/inventor/elon-musk Holding a patent makes one an inventor.
May 03 2022
parent reply mee6 <mee6 lookat.me> writes:
On Tuesday, 3 May 2022 at 19:22:37 UTC, Walter Bright wrote:
 On 5/3/2022 4:16 AM, Max Samukha wrote:
 I totally agree. Musk is brilliant. It just would be nice if 
 people stopped calling him an inventor or engineer.
I am an engineer, and I've got no problem calling Musk one. He has a degree in physics, which has little difference from engineering. He's the Chief Engineer at SpaceX. As for being an inventor, here's a list of his patents: https://patents.justia.com/inventor/elon-musk Holding a patent makes one an inventor.
The patent was first invented in 1421, it appears. Whether it was even a good invention is yet to be seen. I think software patents are bad for software in general.

I think Elon Musk isn't really an engineer. I think he's just a really good PR person. Most people don't know that Elon wasn't really involved with Tesla; he just bought out the person who actually started Tesla.

You don't become a billionaire without exploiting people. He severely underpays people for their work, overworks them, and takes credit for all of their successes.
May 03 2022
parent reply zjh <fqbqrr 163.com> writes:
On Tuesday, 3 May 2022 at 23:09:31 UTC, mee6 wrote:

 I think Elon Musk isn't really an engineer.
A capitalist with 230B! A man that doesn't pay taxes.
May 03 2022
parent mee6 <mee6 lookat.me> writes:
On Wednesday, 4 May 2022 at 00:41:10 UTC, zjh wrote:
 On Tuesday, 3 May 2022 at 23:09:31 UTC, mee6 wrote:

 I think Elon Musk isn't really an engineer.
A capitalist with 230B! A man that doesn't pay taxes.
Yes, and banks give him near-0-interest loans that he doesn't have to pay tax on. The whole financial system needs to be gutted. It's a legacy system we're stuck in.
May 04 2022
prev sibling parent reply Ola Fosheim Grøstad <ola.fosheim.grostad gmail.com> writes:
On Tuesday, 3 May 2022 at 10:28:19 UTC, FeepingCreature wrote:
 Musk in it. I do think there's a value-add there. You need 
 entrepreneurs with a combination of business sense and product 
 focus - that's actually pretty rare.
Not really. Every country has thousands if not millions of entrepreneurs, but few of them have the capital to grow fast. Where you have large gains you also have high risks, and when you take high risks you usually also need luck.

Why was my country flooded with Teslas when they launched? It was because the Norwegian government had removed taxes on electric cars and allowed them to drive in the bus/taxi lane, so Teslas became "cheap" luxury cars… You cannot plan for that kind of luck. When the media tell tales about success they tend to ignore the timing, the luck, and not having the competition launch a submarine product that undermines your own product. For every success story there are many failures that did roughly the same things, and the source of failure can be as simple as not having the funds to do marketing.

People who run fast are also not very rare, but there is only one person who runs faster than everyone else. That person will take it all. If you remove the fastest runners, you still have plenty of people who run fast. So I don't buy your argument here.

If you remove Intel, we will still have fast computers. If you remove Apple we will still have good mobile phones. If you remove Google we will still have high quality search. If you remove Microsoft we will still have good cloud computing services. Etc. Etc. Etc. If you remove Apple, Microsoft and Amazon, very little will change, because the same people will work for some other entity filling the void.
May 03 2022
parent Ola Fosheim Grøstad <ola.fosheim.grostad gmail.com> writes:
On Tuesday, 3 May 2022 at 11:25:48 UTC, Ola Fosheim Grøstad wrote:
 People who run fast are also not very rare, but there is only 
 one person who runs faster than everyone else. That person will 
 take it all. If you remove the fastest runners, you still have 
 plenty of people who run fast. So I don't buy your argument 
 here.
TL;DR: nobody is indispensable.

Physics would have landed on Einstein's theory eventually; maybe he saved it a few decades of work. FM synthesis would have been discovered; plenty of computer music scientists have a math background, and modulating a sine wave with a sine wave is something you would expect a mathematician to try.

So if a genius can only buy a few decades of progress, it is very difficult to find examples of individual contributions that significantly alter technological progress by more than a few years. It is much easier to find examples of people who set back progress! (e.g. warfare)
May 03 2022
prev sibling parent ShadoLight <ettienne.gilbert gmail.com> writes:
On Saturday, 30 April 2022 at 07:05:28 UTC, Paulo Pinto wrote:
 On Friday, 29 April 2022 at 19:10:32 UTC, Walter Bright wrote:
 On 4/29/2022 11:26 AM, Paulo Pinto wrote:
 Those were interpreters first and added native code 
 generation later. D did is the other way around, and the 
 native code generating compilers started doing it soon 
 afterwards.
Decades before D was even an idea. Again, SIGPLAN.
So why did other native languages suddenly start doing it after D did to the point of it being something a language can't skip anymore?
They didn't, they got inspired by those that preceded D, you just want to believe D was the cause.
Walter may be suffering from a bit of confirmation bias, but aren't you doing the same thing? You are arguing that in the sequence A, B, C, D, E, ... where (let's say, using your own examples): A (Lisp, 1962) .., B(Interlisp, 1983), C(Allegro Common, 1985), ... D(DMD), E, ... ...you are arguing that it is impossible that some feature(s) of E could have been inspired by D, only by prior languages (any or some combination of A, B & C .. and others preceding D). How do you know they only (to quote you) *"got inspired by those that preceded D"*? Walter's original quote was *"Other languages have taken inspiration from D, such as ranges and compile time expression evaluation."* So 'E' above is *"other languages"* with *"ranges and compile time expression evaluation"*. So let's see - I can quote from this 2013 ACCU article named "C++ Range and Elevation" [1], which references a talk by Andrei in 2009: *"Back in 2009, Andrei Alexandrescu gave a presentation at the ACCU Conference about Ranges. The short version is that although you can represent a range in C++98 using a pair of iterators, the usage is cumbersome..."* *"And all that’s just using the iterators the C++ Standard Library gives you; defining your own iterator is notoriously complex. So, Andrei introduced a much simpler – and more powerful – abstraction: the Range [ Alexandrescu09 ]"* *"There have been a few attempts to implement Andrei’s ideas in C++ (he implemented them for D, and they form the basis of the D Standard Library)...etc..."* So, articles in relation to C++ mention prior work done by Andrei on ranges, mention attempts to achieve the same in C++, mentions the difficulties, etc... and this was all way back in 2013. And this, according to you, had zero impact on proposals in C++ regarding ranges? Similarly to Walter's assertion (without evidence), aren't you just asserting the opposite (also without evidence)? I mean, let alone any other languages, how do you know this **for certain** just about C++? [1]: https://accu.org/journals/overload/21/117/love_1833/
May 01 2022
prev sibling parent reply max haughton <maxhaton gmail.com> writes:
On Friday, 29 April 2022 at 18:26:46 UTC, Paulo Pinto wrote:
 On Friday, 29 April 2022 at 18:05:42 UTC, Walter Bright wrote:
 On 4/29/2022 10:00 AM, Paulo Pinto wrote:
 On Friday, 29 April 2022 at 15:28:16 UTC, Walter Bright wrote:
 On 4/27/2022 8:59 AM, Satoshi wrote:
 [...]
Other languages have taken inspiration from D, such as ranges and compile time expression evaluation. ....
Sorry, Lisp, ML, CLU and Smalltalk did it first, D was surely not the first in this regard. Plenty of SIGPLAN papers on the subject.
 Those were interpreters first and added native code generation later. D did it the other way around, and the native code generating compilers started doing it soon afterwards.
Decades before D was even an idea. Again, SIGPLAN.
Which papers?
Apr 29 2022
parent reply Paulo Pinto <pjmlp progtools.org> writes:
On Friday, 29 April 2022 at 19:44:09 UTC, max haughton wrote:
 On Friday, 29 April 2022 at 18:26:46 UTC, Paulo Pinto wrote:
 On Friday, 29 April 2022 at 18:05:42 UTC, Walter Bright wrote:
 On 4/29/2022 10:00 AM, Paulo Pinto wrote:
 On Friday, 29 April 2022 at 15:28:16 UTC, Walter Bright 
 wrote:
 On 4/27/2022 8:59 AM, Satoshi wrote:
 [...]
Other languages have taken inspiration from D, such as ranges and compile time expression evaluation. ....
Sorry, Lisp, ML, CLU and Smalltalk did it first, D was surely not the first in this regard. Plenty of SIGPLAN papers on the subject.
 Those were interpreters first and added native code generation later. D did it the other way around, and the native code generating compilers started doing it soon afterwards.
Decades before D was even an idea. Again, SIGPLAN.
Which papers?
Given that the point is compile-time execution and ranges with compiled code, let's start with Lisp macros and restrict ourselves to the first generation of Lisp compilers that were relatively well known.

Interlisp-D at Xerox PARC, http://www.softwarepreservation.org/projects/LISP/interlisp_family, 1983. Don't be deceived by the references to bytecode or a VM; Dorado workstations used microcoded CPUs loaded on boot, hardly any different from modern Intel/AMD CPUs doing on-the-fly translation from CISC to their RISC internals.

But if you want to be pedantic about the very first Lisp compiler with macro support, it was created in 1962: https://web.archive.org/web/20201213195043/ftp://publications.ai.mit.edu/ai-publications/pdf/AIM-039.pdf

Or a version that is still in use, like Allegro Common Lisp, first released in 1985: http://www.softwarepreservation.org/projects/LISP/common_lisp_family

Maybe Lisp isn't the thing; then we can turn our attention to the ML lineage, with MetaML (2000) and Template Haskell (2002) being two of the most well-known examples: https://www.sciencedirect.com/science/article/pii/S0304397500000530 https://userpages.uni-koblenz.de/~laemmel/TheEagle/dl/SheardPJ02.pdf

Switching gears to ranges, we have Smalltalk-80 collections as one possible example, https://www.researchgate.net/publication/2409926_Interfaces_and_Specifications_for_the_Smalltalk-80_Collection_Classes
Apr 30 2022
next sibling parent reply Daniel N <no public.email> writes:
On Saturday, 30 April 2022 at 08:32:19 UTC, Paulo Pinto wrote:
 On Friday, 29 April 2022 at 19:44:09 UTC, max haughton wrote:

 Switching gears to ranges, we have Smalltalk-80 collections as 
 one possible example,

 https://www.researchgate.net/publication/2409926_Interfaces_and_Specifications_for_the_Smalltalk-80_Collection_Classes
Let's focus on Smalltalk: what syntax do you use to choose whether your code snippet should run at runtime or at compile time? Not the entire program, but 50% compile time and 50% runtime.
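For contrast, in D the very same function runs in either context, chosen purely by where its result is needed:

    int square(int n) { return n * n; }

    enum ct = square(21);     // initializer needed at compile time => CTFE

    void main()
    {
        auto rt = square(21); // ordinary runtime call, same function
    }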
Apr 30 2022
next sibling parent reply Paulo Pinto <pjmlp progtools.org> writes:
On Saturday, 30 April 2022 at 08:56:55 UTC, Daniel N wrote:
 On Saturday, 30 April 2022 at 08:32:19 UTC, Paulo Pinto wrote:
 On Friday, 29 April 2022 at 19:44:09 UTC, max haughton wrote:

 Switching gears to ranges, we have Smalltalk-80 collections as 
 one possible example,

 https://www.researchgate.net/publication/2409926_Interfaces_and_Specifications_for_the_Smalltalk-80_Collection_Classes
 Let's focus on Smalltalk: what syntax do you use to choose whether your code snippet should run at runtime or at compile time? Not the entire program, but 50% compile time and 50% runtime.
You're focusing on the wrong apple; the Smalltalk example was related to ranges outside of the functional programming family. So we can focus instead on Lisp as one possible example: use macros for the 50% at compile time, and leave the rest of the code using streams for the ranges part of the equation. If you don't like parentheses, that is also possible in Dylan or Template Haskell.

D hasn't invented anything new here, and regardless of the wishful thinking that it did, you won't find any references to D as inspiration for those features in modern language papers like HOPL, rather to those that preceded it.
Apr 30 2022
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 4/30/2022 2:30 AM, Paulo Pinto wrote:
 D hasn't invented anything new here, and regardless of the wishful thinking 
 that it did, you won't find any references to D as inspiration for those 
 features in modern language papers like HOPL, rather to those that preceded it.
You won't find any references to D as inspiration for static if in C++, either, despite the fact that Andrei, Herb, and I submitted a formal proposal for it for C++.
Apr 30 2022
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 4/30/2022 10:43 AM, Walter Bright wrote:
 On 4/30/2022 2:30 AM, Paulo Pinto wrote:
 D hasn't invented anything new here, and regardless of the wishful thinking 
 that it did, you won't find any references to D as inspiration for those 
 features in modern language papers like HOPL, rather to those that 
 preceded it.
You won't find any references to D as inspiration for static if in C++, either, despite the fact that Andrei, Herb, and I submitted a formal proposal for it for C++.
As to why there aren't references to D as inspiration, and no references to Zortech C++'s seminal role in the early days of C++, consider this: http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2013/n3613.pdf
Apr 30 2022
parent reply Dukc <ajieskola gmail.com> writes:
On Saturday, 30 April 2022 at 20:05:14 UTC, Walter Bright wrote:
 On 4/30/2022 10:43 AM, Walter Bright wrote:
 On 4/30/2022 2:30 AM, Paulo Pinto wrote:
 D hasn't invented anything new here, and regardless of the 
 wishful thinking that it did, you won't find any references 
 to D as inspiration for those features in modern language 
 papers like HOPL, rather to those that preceded it.
You won't find any references to D as inspiration for static if in C++, either, despite the fact that Andrei, Herb, and I submitted a formal proposal for it for C++.
As to why there aren't references to D as inspiration, and no references to Zortech C++'s seminal role in the early days of C++, consider this: http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2013/n3613.pdf
I can't fathom what they were thinking when they wrote that paper. The reasoning in it is so bad that it's an outright disgrace to the C++ committee. Not saying they should have accepted the proposal, there were some good arguments, but as a whole it's ignorant to the point of appearing downright hostile. You really deserved much better than that.
Apr 30 2022
parent reply =?UTF-8?Q?Ali_=c3=87ehreli?= <acehreli yahoo.com> writes:
On 4/30/22 13:14, Dukc wrote:
 On Saturday, 30 April 2022 at 20:05:14 UTC, Walter Bright wrote:
 As to why there aren't references to D as inspiration,
Bjarne Stroustrup mentions non-C++ programming languages only when he sees failures as in "Java tried." I will never hear him mention D.
 and no
 references to Zortech C++'s seminal role in the early days of C++,
 consider this:

 http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2013/n3613.pdf
I can't fathom what they were thinking when they wrote that paper. The reasoning in it is so bad that it's an outright disgrace to the C++ committee.
Note that the authors are just three influential people there, preempting the rest of the C++ committee from making a "mistake". As I later learned, Andrew Sutton has been strongly against the ideas of Sean Baxter's Circle compiler as well, so I started to think he may be the main author of the paper above. (The ratio of typos in that article is beyond what I am accustomed to in Bjarne Stroustrup's papers.) It is unfortunate that the rest of the C++ committee accepted the "mistake" of leaving 'static if' out.
 downright hostile. You really deserved much better than that.
Not everybody has the tactfulness to accept the ideas of non-academics. So much so that they went out of their way to write a paper about it. And with a juvenile title like that... Ali
Apr 30 2022
next sibling parent reply Araq <rumpf_a web.de> writes:
On Saturday, 30 April 2022 at 21:59:52 UTC, Ali Çehreli wrote:
 On 4/30/22 13:14, Dukc wrote:
 On Saturday, 30 April 2022 at 20:05:14 UTC, Walter Bright
wrote:
 As to why there aren't references to D as inspiration,
Bjarne Stroustrup mentions non-C++ programming languages only when he sees failures as in "Java tried." I will never hear him mention D.
 and no
 references to Zortech C++'s seminal role in the early days
of C++,
 consider this:

 
http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2013/n3613.pdf
 I can't fathom what they were thinking when they wrote that
paper. The
 reasoning in it is so bad that it's an outright disgrace to
the C++
 committee.
Huh? The paper is quite good IMHO. The tooling problems that result from `static if` are real.
Apr 30 2022
parent reply =?UTF-8?Q?Ali_=c3=87ehreli?= <acehreli yahoo.com> writes:
On 4/30/22 15:17, Araq wrote:

 Huh? The paper is quite good IMHO.
You have a lower bar for "papers" than I do. The paper sounds scientific but is riddled with unscientific claims like "its adoption would be a disaster for the language" and "This will make programs harder to read, understand, maintain, and debug", and so on (blah blah blah). That might look good in a certain type of paper, but not in a scientific-sounding opinion piece like that.

The paper is written without a single piece of experience with static if. They say "Any use of those declarations would also need to be checked by more static if statements." Poppycock! (Now, that's scientific!)

I have difficulty reading that paper because the authors do not have a single bit of self-doubt. Their (I quote) "silly" code that mixes the preprocessor and static if is useless. Just because they could come up with ridiculous programming constructs does not constitute a counter-argument to anything.

And watch this: "We have already heard suggestions of static_for and static_while. Can static_switch be far behind?" So what? What is the point of that? Is that an argument? Do the authors refute static if just because they ask that question with a smirk? I reject it as an argument from grown-ups. (The authors are from a university!)

No, re-reading the paper now (though I skipped the last parts because the first parts are more than enough for me), my opinion is the same: a bad paper.
 The tooling problems that result from
 `static if` are real.
Oh! I must have missed their point. Ali
Apr 30 2022
parent reply Araq <rumpf_a web.de> writes:
On Sunday, 1 May 2022 at 02:18:37 UTC, Ali Çehreli wrote:
 The tooling problems that result from
 `static if` are real.
Oh! I must have missed their point.
I cannot tell if you're sarcastic or not but the major point of contention is the scoping involved, `static if` doesn't introduce a new scope (and that's a feature) whereas C++'s `if constexpr` does introduce a scope. This really does make quite a difference for tooling.
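To make the difference concrete, a minimal D sketch (the struct name is made up for illustration):

    // `static if` introduces no new scope: the declaration in the
    // taken branch lands directly in the enclosing aggregate.
    struct Buffer(T)
    {
        static if (is(T == ubyte))
            enum granularity = 1;    // visible as Buffer!T.granularity
        else
            enum granularity = T.sizeof;
    }

    static assert(Buffer!ubyte.granularity == 1);
    static assert(Buffer!int.granularity == 4);

With `if constexpr`, a declaration like that stays confined to the branch's own scope, which is exactly what a tool can rely on.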
May 01 2022
parent reply =?UTF-8?Q?Ali_=c3=87ehreli?= <acehreli yahoo.com> writes:
On 5/1/22 00:12, Araq wrote:
 On Sunday, 1 May 2022 at 02:18:37 UTC, Ali Çehreli wrote:
 The tooling problems that result from
 `static if` are real.
Oh! I must have missed their point.
I cannot tell if you're sarcastic
I wasn't in a good mood yesterday. I apologize for that, but I was only half sarcastic at most. :)

I think the authors desperately held on to one idea that they could prove to be correct. One truth does not make all their points correct.
 or not but the major point of
 contention is the scoping involved, `static if` doesn't introduce a new
 scope (and that's a feature) whereas C++'s `if constexpr` does introduce
 a scope. This really does make quite a difference for tooling.
C++ is not in a position to fake that it cares about tooling. That train has already landed... (What? :p)

Although tooling is a good point against the proposal to discuss, it is not important enough to make anyone write a paper to preemptively kill a proposal. After all, they could write (or say) that another token might be used. How about the angle brackets?

  static if (condition) <
    // ...
  >

Solved. :)

Ali
May 01 2022
next sibling parent reply Araq <rumpf_a web.de> writes:
On Sunday, 1 May 2022 at 21:00:23 UTC, Ali Çehreli wrote:

 C++ is not in a position to fake that it cares about tooling. 
 That train has already landed... (What? :p)
"X is already bad so let's make it worse."
 Although tooling is a good point against the proposal to 
 discuss, it is not important enough to make anyone write a 
 paper to preemptively kill a proposal. After all, they could 
 write (or say) that another token might be used. How about the 
 angle brackets?

   static if (condition) <
     // ...
   >

 Solved. :)
The tooling problem is not caused by the syntax but by the scoping rules.
May 01 2022
next sibling parent Bruce Carneal <bcarneal gmail.com> writes:
On Monday, 2 May 2022 at 00:21:10 UTC, Araq wrote:
 On Sunday, 1 May 2022 at 21:00:23 UTC, Ali Çehreli wrote:

 C++ is not in a position to fake that it cares about tooling. 
 That train has already landed... (What? :p)
"X is already bad so let's make it worse."
 Although tooling is a good point against the proposal to 
 discuss, it is not important enough to make anyone write a 
 paper to preemptively kill a proposal. After all, they could 
 write (or say) that another token might be used. How about the 
 angle brackets?

   static if (condition) <
     // ...
   >

 Solved. :)
The tooling problem is not caused by the syntax but by the scoping rules.
Yep. There's a lot of utility in scopeless `static if`, but I'd imagine it's harder on the tooling folk (no guarantee of trivial mappings). Still, I think that upgrading the tooling is preferable to hobbling the language.
May 01 2022
prev sibling parent reply =?UTF-8?Q?Ali_=c3=87ehreli?= <acehreli yahoo.com> writes:
On 5/1/22 17:21, Araq wrote:
 On Sunday, 1 May 2022 at 21:00:23 UTC, Ali Çehreli wrote:

 C++ is not in a position to fake that it cares about tooling. That
 train has already landed... (What? :p)
"X is already bad so let's make it worse."
Not at all. I meant that the authors could not use tooling problems as an excuse, because C++ was and is already hostile to tooling. But I understand all your points, so I am moving on to happier topics.

Ali
May 01 2022
parent Bruce Carneal <bcarneal gmail.com> writes:
On Monday, 2 May 2022 at 02:24:01 UTC, Ali Çehreli wrote:
 On 5/1/22 17:21, Araq wrote:
...
 But I understand all your points so I am moving on to happier 
 topics.
Very good idea. I'm following your example.
 Ali
May 01 2022
prev sibling parent Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Sunday, 1 May 2022 at 21:00:23 UTC, Ali Çehreli wrote:
 I think the authors desperately held on to one idea that they 
 could prove to be correct. One truth does not make all their 
 points correct.
I believe they chose "if constexpr" because it doesn't significantly alter language semantics and compiler internals. C++ already had #ifdef, so I personally don't find a need for more than "if constexpr". The concept of "if constexpr" also has a valuable advantage when writing a library: the compiler can type check all branches. The primary advantage, IMHO, of D having "static if" rather than "if constexpr" is that a static for loop becomes trivial to implement. So this is an advantage that D is likely to retain.
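A minimal sketch of that last point (the type and field names are made up):

    // `static foreach`, like `static if`, introduces no scope, so a
    // compile-time loop can emit declarations into the aggregate.
    struct Vec3
    {
        static foreach (name; ["x", "y", "z"])
            mixin("double " ~ name ~ " = 0;");
    }

    static assert(is(typeof(Vec3.z) == double));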
May 02 2022
prev sibling next sibling parent reply Timon Gehr <timon.gehr gmx.ch> writes:
On 30.04.22 23:59, Ali Çehreli wrote:
 
  > downright hostile. You really deserved much better than that.
 
 Not everybody has the tactfulness to accept the ideas of non-academics.
Unfortunately that often goes both ways. :/ Also, Andrei has a PhD.
Apr 30 2022
parent =?UTF-8?Q?Ali_=c3=87ehreli?= <acehreli yahoo.com> writes:
On 4/30/22 15:26, Timon Gehr wrote:
 On 30.04.22 23:59, Ali Çehreli wrote:
  > downright hostile. You really deserved much better than that.

 Not everybody has the tactfulness to accept the ideas of non-academics.
Unfortunately that often goes both ways. :/ Also, Andrei has a PhD.
I did not mean otherwise. With "academic" I meant (copying from the internet) "a member (such as a professor) of an institution of learning (such as a university)". Ali
Apr 30 2022
prev sibling next sibling parent Walter Bright <newshound2 digitalmars.com> writes:
On 4/30/2022 2:59 PM, Ali Çehreli wrote:
 Note that the authors are just three influential people there, preempting the 
 rest of the C++ committee from making a "mistake".
`static if` was eventually accepted, but with a different name and different authors.
Apr 30 2022
prev sibling parent Dukc <ajieskola gmail.com> writes:
On Saturday, 30 April 2022 at 21:59:52 UTC, Ali Çehreli wrote:
 Not everybody has the tactfulness to accept the ideas of 
 non-academics. So much so that they went out of their way to 
 write a paper about it. And with a juvenile title like that...
The title is almost right. Just move the quotes and it describes the paper perfectly. Unfortunately.
May 07 2022
prev sibling parent user1234 <user1234 12.de> writes:
On Saturday, 30 April 2022 at 08:56:55 UTC, Daniel N wrote:
 Let's focus on Smalltalk: what syntax do you use to choose if 
 your code snippet should run at runtime or at compile time? Not 
 the entire program, but 50% compile-time and 50% runtime.
Naive solution: cost estimation via an AST walker, just like inliners do.
Apr 30 2022
prev sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 4/30/2022 1:32 AM, Paulo Pinto wrote:
 Switching gears to ranges, we have Smalltalk-80 collections as one possible 
 example,
 
 https://www.researchgate.net/publication/2409926_Interfaces_and_Specifications_for_the_Smalltalk-80_Collection_Classes
C++ went the iterator approach. Ranges in C++ occurred only after D did them. Also, Lisp started out as an interpreted language. Native compilation came much later. I'm interested in an example of a language that started out as a natively compiled language, and then added interpretation. I know back in the 80's there were some C interpreters, but they were not combined with a native compiler. Nobody thought to put the chocolate and the peanut butter together. No version of Fortran I ever used had CTFE.
Apr 30 2022
next sibling parent reply Siarhei Siamashka <siarhei.siamashka gmail.com> writes:
On Saturday, 30 April 2022 at 17:39:04 UTC, Walter Bright wrote:

 I'm interested in an example of a language that started out as 
 a natively compiled language, and then added interpretation.
https://crystal-lang.org/2021/12/29/crystal-i.html
Apr 30 2022
parent Walter Bright <newshound2 digitalmars.com> writes:
On 4/30/2022 12:30 PM, Siarhei Siamashka wrote:
 On Saturday, 30 April 2022 at 17:39:04 UTC, Walter Bright wrote:
 
 I'm interested in an example of a language that started out as a natively 
 compiled language, and then added interpretation.
https://crystal-lang.org/2021/12/29/crystal-i.html
14 years after D :-)
Apr 30 2022
prev sibling next sibling parent reply Dukc <ajieskola gmail.com> writes:
On Saturday, 30 April 2022 at 17:39:04 UTC, Walter Bright wrote:
 On 4/30/2022 1:32 AM, Paulo Pinto wrote:
 Switching gears to ranges, we have Smalltalk-80 collections as 
 one possible example,
 
 https://www.researchgate.net/publication/2409926_Interfaces_and_Specifications_for_the_Smalltalk-80_Collection_Classes
C++ went the iterator approach. Ranges in C++ occurred only after D did them.
We're strange. IIRC Bjarne's third C++ book from 1998 already discusses a bit about ranges, albeit calling them "sequences". It shows a few examples how they can work, by pairing iterators into one type, and then goes on to other topics. Spending any amount of time using Phobos ranges will reveal them as clearly superior to iterators in common usage. One would think that when the idea has been around at least 24 years, ranges would have long since displaced iterators as the recommended standard C++ construct. Yet no. BTW, As I understand it ranges came to D at around 2008 or 2009. Out of interest, what was D's way of doing the same tasks before that?
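To give the "clearly superior in common usage" claim above a concrete face, a minimal Phobos sketch:

    import std.algorithm : filter, map;
    import std.range : iota;
    import std.stdio : writeln;

    void main()
    {
        // one pipeline, no iterator pairs to keep in sync
        iota(1, 10)
            .filter!(n => n % 2 == 0)
            .map!(n => n * n)
            .writeln;    // [4, 16, 36, 64]
    }

The equivalent with begin/end iterator pairs needs every algorithm call to carry both ends of the sequence explicitly.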
Apr 30 2022
next sibling parent Walter Bright <newshound2 digitalmars.com> writes:
On 4/30/2022 12:33 PM, Dukc wrote:
 On Saturday, 30 April 2022 at 17:39:04 UTC, Walter Bright wrote:
 On 4/30/2022 1:32 AM, Paulo Pinto wrote:
 Switching gears to ranges, we have Smalltalk-80 collections as one possible 
 example,

 https://www.researchgate.net/publication/2409926_Interfaces_and_Specifications_for_the_Smalltalk-80_Collection_Classes
C++ went the iterator approach. Ranges in C++ occurred only after D did them.
We're strange. IIRC Bjarne's third C++ book from 1998 already discusses a bit about ranges, albeit calling them "sequences". It shows a few examples how they can work, by pairing iterators into one type, and then goes on to other topics.
There is little choice in C++ but to use a pair of iterators. The next step was to put them together into std::pair, but that simply went nowhere in C++ until Eric Niebler decided to do something about it.
 Spending any amount of time using Phobos ranges will reveal them as clearly 
 superior to iterators in common usage. One would think that when the idea has 
 been around at least 24 years, ranges would have long since displaced
iterators 
 as the recommended standard C++ construct. Yet no.
Iterators had gotten very, very entrenched in C++ by then.
 BTW, As I understand it ranges came to D at around 2008 or 2009. Out of 
 interest, what was D's way of doing the same tasks before that?
D didn't have a metaprogramming way of doing that before. Ranges based on dynamic arrays were a natural fit for D.
Apr 30 2022
prev sibling parent Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Saturday, 30 April 2022 at 19:33:18 UTC, Dukc wrote:
 We're strange. IIRC Bjarne's third C++ book from 1998 already 
 discusses a bit about ranges, albeit calling them "sequences". 
 It shows a few examples how they can work, by pairing iterators 
 into one type, and then goes on to other topics.
What C++ calls iterators is elsewhere usually called table pointers or cursors. What D calls ranges is usually called iterators (or generators). They have different usages.
Apr 30 2022
prev sibling next sibling parent Walter Bright <newshound2 digitalmars.com> writes:
On 4/30/2022 10:39 AM, Walter Bright wrote:
 On 4/30/2022 1:32 AM, Paulo Pinto wrote:
 Switching gears to ranges, we have Smalltalk-80 collections as one possible 
 example,

 https://www.researchgate.net/publication/2409926_Interfaces_and_Specifications_for_the_Smalltalk-80_Collection_Classes
C++ went the iterator approach. Ranges in C++ occurred only after D did them.
A bit more. C++ invented iterators as a generalization of pointers. D's contribution was ranges as a generalization of D's dynamic arrays. Since then, ranges have found their way into C++ via Eric Niebler's contributions. I think the C++ ranges are a generalization of an iterator pair, which in my not-so-humble opinion is an inferior design as assembling one is not safe. I am not claiming invention of the concept of an object that can iterate through a data structure, or pipeline programming.
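A minimal sketch of what "generalization of D's dynamic arrays" means in practice:

    // std.range.primitives gives every slice the input-range
    // interface, so arrays and user-defined ranges share one protocol.
    import std.range.primitives : empty, front, popFront;

    void main()
    {
        int[] a = [1, 2, 3];
        assert(a.front == 1);
        a.popFront();              // just narrows the slice
        assert(a == [2, 3] && !a.empty);
    }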
Apr 30 2022
prev sibling parent reply Paulo Pinto <pjmlp progtools.org> writes:
On Saturday, 30 April 2022 at 17:39:04 UTC, Walter Bright wrote:
 On 4/30/2022 1:32 AM, Paulo Pinto wrote:
 Switching gears to ranges, we have Smalltalk-80 collections as 
 one possible example,
 
 https://www.researchgate.net/publication/2409926_Interfaces_and_Specifications_for_the_Smalltalk-80_Collection_Classes
C++ went the iterator approach. Ranges in C++ occurred only after D did them. Also, Lisp started out as an interpreted language. Native compilation came much later. I'm interested in an example of a language that started out as a natively compiled language, and then added interpretation. I know back in the 80's there were some C interpreters, but they were not combined with a native compiler. Nobody thought to put the chocolate and the peanut butter together. No version of Fortran I ever used had CTFE.
The first Lisp compiler was in the 1960s....
May 01 2022
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 5/1/2022 12:33 AM, Paulo Pinto wrote:
 First Lisp compiler was in 1960's....
I know. And Lisp 1 was an interpreter, page 9 of: http://jmc.stanford.edu/articles/lisp/lisp.pdf I know perfectly well that interpreters have long evolved to generate native code. I did one myself (Symantec's Java) in the 1990s. I considered it for the Javascript interpreter I wrote around 2000. I've also seen C interpreters in the 1980s. Why native C compilers still didn't add interpretation to functions is a mystery. The UCSD P-System had interpreting compilers for C, Pascal, and Fortran in the 1980s. ***** Note that even the C interpreters would reject things like: `int a[foo()];` i.e. CTFE was not part of the *language* semantics. ***** After D did it, suddenly the other native languages moved in that direction. If you have another explanation for the timing, I'd like to hear it. If you have a reference to a natively compiled language specification that had compile-time constant-expressions that could interpret a function at compile time, I'd appreciate it. No, not an interpreted language that JITs whatever it can. Thanks!
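For reference, a minimal sketch of the distinction in today's D, where the constant-expression really does run an ordinary function at compile time:

    // CTFE as part of the language semantics: foo() is an ordinary
    // function, evaluated at compile time because the context
    // requires a constant.
    int foo() { return 4; }

    int[foo()] a;                   // static array length via CTFE
    static assert(a.length == 4);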
May 01 2022
next sibling parent reply Paulo Pissas <paulopissas gmail.com> writes:
On Sunday, 1 May 2022 at 09:04:11 UTC, Walter Bright wrote:
 On 5/1/2022 12:33 AM, Paulo Pinto wrote:
 First Lisp compiler was in 1960's....
I know. And Lisp 1 was an interpreter, page 9 of: http://jmc.stanford.edu/articles/lisp/lisp.pdf I know perfectly well that interpreters have long evolved to generate native code. I did one myself (Symantec's Java) in the 1990s. I considered it for the Javascript interpreter I wrote around 2000. I've also seen C interpreters in the 1980s. Why native C compilers still didn't add interpretation to functions is a mystery. The UCSD P-System had interpreting compilers for C, Pascal, and Fortran in the 1980s. ***** Note that even the C interpreters would reject things like: `int a[foo()];` i.e. CTFE was not part of the *language* semantics. ***** After D did it, suddenly the other native languages moved in that direction. If you have another explanation for the timing, I'd like to hear it. If you have a reference to a natively compiled language specification that had compile-time constant-expressions that could interpret a function at compile time, I'd appreciate it. No, not an interpreted language that JITs whatever it can. Thanks!
At the end of the day, the busy programmer doesn't care who invented what, or when. He wants to get something done with as little friction as possible, and chooses a tool appropriately.

He might have a look at D and get seduced by its syntax, which is succinct. As he spends more time in the language, he realises it's even better than expected, as he learns about the not so obvious features, like CTFE, static if, templates, ranges/algorithms... His next step is to try and use D for everything. Why not? D is amazing as a language, the compiler is faster than C++, he tells himself... However, as the size of the project grows, compile times are no longer impressive. As he starts encountering frictions, he realises a lot of this friction derives from issues in the language/compiler, and that some issues have been there for a long time. The community is not big enough, and currently has no interest in addressing these issues.

He realises the direction of the community is not to improve the busy programmer's life, but to increase their own joy and the usability of the features they use, and to somehow find ways to claim D is better than other languages, an illusion based on their belief that more (unique) features are what makes a programming language better, despite the very obvious evidence to the contrary seen in the real world.

The D community is talented and above average in terms of knowledge/skills. However, this is not representative of the reality for a small company building a simple SaaS, website, or web application. So, even if the community (comprised of a lot of enthusiasts) is comfortable with the status quo, this does not mean that is really the case for the "real world". D has a lot of good things, but the focus has never been on making D popular - sometimes it felt the goal was to keep D as "elite" as possible.

The reason I dropped D after over 6 years of using it exclusively, was a loss of belief in the direction, and the fact that after many years, not a lot had improved in regards to frictions for the busy programmer. It really felt like an enthusiasts' project, and not really an effort to build a widely adopted tool. This might be an outdated view.
May 01 2022
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 5/1/2022 3:57 AM, Paulo Pissas wrote:
 The reason I dropped D after over 6 years of using it exclusively, was a loss
of 
 belief in the direction, and the fact that after many years, not a lot had 
 improved in regards to frictions for the busy programmer. It really felt like
an 
 enthusiasts' project, and not really an effort to build a widely adopted tool.
What friction was there that other languages you switched to did not have?
May 01 2022
parent Walter Bright <newshound2 digitalmars.com> writes:
On 5/1/2022 11:04 AM, Walter Bright wrote:
 On 5/1/2022 3:57 AM, Paulo Pissas wrote:
 The reason I dropped D after over 6 years of using it exclusively, was a loss 
 of belief in the direction, and the fact that after many years, not a lot had 
 improved in regards to frictions for the busy programmer. It really felt like 
 an enthusiasts' project, and not really an effort to build a widely adopted
tool.
What friction was there that other languages you switched to did not have?
BTW, the entire reason for ImportC was to greatly reduce the friction of interfacing with C. It is not a feature of the D language itself.
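A minimal sketch of what that looks like (file names are made up; the C file is simply passed on the dmd command line next to the D module):

    // app.d -- build with: dmd app.d square.c
    // square.c contains just: int square(int x) { return x * x; }
    import square;      // ImportC compiles the C file; no hand-written bindings

    void main()
    {
        assert(square(7) == 49);
    }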
May 01 2022
prev sibling parent reply Paulo Pinto <pjmlp progtools.org> writes:
On Sunday, 1 May 2022 at 09:04:11 UTC, Walter Bright wrote:
 On 5/1/2022 12:33 AM, Paulo Pinto wrote:
 First Lisp compiler was in 1960's....
I know. And Lisp 1 was an interpreter, page 9 of: http://jmc.stanford.edu/articles/lisp/lisp.pdf I know perfectly well that interpreters have long evolved to generate native code. I did one myself (Symantec's Java) in the 1990s. I considered it for the Javascript interpreter I wrote around 2000. I've also seen C interpreters in the 1980s. Why native C compilers still didn't add interpretation to functions is a mystery. The UCSD P-System had interpreting compilers for C, Pascal, and Fortran in the 1980s. ***** Note that even the C interpreters would reject things like: `int a[foo()];` i.e. CTFE was not part of the *language* semantics. ***** After D did it, suddenly the other native languages moved in that direction. If you have another explanation for the timing, I'd like to hear it. If you have a reference to a natively compiled language specification that had compile-time constant-expressions that could interpret a function at compile time, I'd appreciate it. No, not an interpreted language that JITs whatever it can. Thanks!
I give up, as you clearly can't accept a compiled language from 1960, about 30 years older than D, so why bother when it will be dismissed no matter what.
May 03 2022
next sibling parent Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Tuesday, 3 May 2022 at 11:55:25 UTC, Paulo Pinto wrote:
 I give up, as you clearly can't accept a compiled language from 
 1960, about 30 years older than D, so why bother when it will 
 be dismissed no matter what.
I guess humans are disposed to be romantic about technology and advances. Fortunately, as more people get higher education, it becomes quite apparent that steady progress is the consequence of communities and not of individuals, which makes society much more robust!

Modern programming languages are just "artistic expressions" of the aggregate ideas from the community of programming language research. It is all about finding the right mix of features and syntax, not really about the big groundbreaking ideas.

Re the idea further up the thread that 100 Elon Musks would make a big difference: it would just mean that you would have 100 people competing for hype in the press and competing for the same high-risk capital, most likely ending with 100 underfunded, expensive prestige projects. It wouldn't do anything for society. The romantic view of technological progress is totally unrealistic! (but very seductive, as nobody likes to be an ant in a hive)
May 03 2022
prev sibling parent Walter Bright <newshound2 digitalmars.com> writes:
On 5/3/2022 4:55 AM, Paulo Pinto wrote:
 I give up, as you clearly can't accept a compiled language from 1960, about 30 
 years older than D, so why bother when it will be dismissed no matter what.
I accept that you and I see things differently :-)
May 03 2022
prev sibling parent reply Adrian Matoga <dlang.spam matoga.info> writes:
On Tuesday, 2 November 2021 at 18:01:37 UTC, Ola Fosheim Grøstad 
wrote:
 On Tuesday, 2 November 2021 at 17:27:25 UTC, Dr Machine Code 
 wrote:
 It got [asked on 
 reddit](https://www.reddit.com/r/d_language/comments/q74bzr/
hy_is_d_unpopular/) sub but for those that aren't active too, I'd like you
opinions. Please don't get me wrong, I also love D, I've used it everywhere I
can and I'd say it's my favourite language (yes I have one...) but I'm as as
the reddit's OP, trying to understand why it's unpopular.
I don't think it is reasonable to say it is unpopular, [Github activity shows that people create new projects with it](https://forum.dlang.org/thread/ltfgzovqcadknyjnabwp forum.dlang.org) at roughly the same rate as Nim, Crystal and other smaller languages. What would be interesting to know is what made people who were very enthusiastic about D in the past (in the forums) switch to another language? Which language was it and why was that a better fit for them?
While I haven't been active in the D community for something like five years already, it wasn't because I switched to another PL, but mostly due to some disturbances in personal life that made me shift my spare time activities from programming to anything from politics to gardening and woodworking, while still trying to advocate for D or at least write all my single-use tools in it (I learned that woodworkers call such stuff jigs and I sort of like it) at work. I've recently returned to tinkering with electronics and programming at home so let me share my view.

From my not-so-extensive professional experience, large companies usually try to stick to well-established technologies, for two reasons. One is to make a product fast by reusing as much as possible from existing software, and this is mostly written in C and C++ in the area of compilers/drivers/system-level services (at least in the Linux world), or Python and C/C++-based proprietary GPU languages in ML, but I'm sure in other businesses there have also been some default language choices. The other one is to sell the product fast by not making it too weird to their customers. Many SW engineers still think of D as a curiosity, and for their managers and salespeople it's surely more exotic than space travel.

So it's mostly about a killer app that makes a language an obvious choice for thing X that one wants to work with, and I think it's been mentioned a thousand times in this forum. You just need to be ready and jump in first, or as one of the first, into an emerging business to be successful in this race. And it's not only about programming languages, but about any technology or even non-technology related idea too. Python became the default language for ML, because it was easy enough for people whose main focus wasn't programming, and who didn't require system level performance because available bindings to C libraries were good enough. Guido didn't anticipate that, but somehow Python was ready to be picked up when a need for such a language emerged. NVidia made a lot of money on ML and cryptocurrency mining not because they entered this market when it was already mature, but because they'd already had a mature technology (originally designed for something completely unrelated, where they didn't have much competition either) that someone picked up for a job that was yet to become a big business. ARM CPUs were mostly used in various embedded systems most people don't even realize have a computer inside, until someone picked them up for smartphones - a product everyone on the planet desired in their pocket - and they're still The Default Architecture for most mobile gadgets.

It seems to me that popularity or commercial success does not exactly follow the rules of good engineering process. In my previous job we frequently joked that the upper management seemed to drive the company in an upside-down manner, sort of: hey guys, let's make this or that solution, and when you're done, we'll try to look for a problem that it solves. Yet experimenting with stuff that has no immediate application is the core of actual research, and when you try to look for a solution to a long-existing problem, people have usually accustomed themselves to dealing with it using the half-baked means then available, and it's extremely hard to change their habits.

What D tried to do was to be "better C++" or "better C", but in 2022 it's about 40 years too late to be successful in that. 
There're millions of programs in C and C++ that have been good enough to make revenue for many companies and thus convinced others to invest more money, effort and time in more stuff that depends on C and C++. D has a low chance of becoming popular in systems programming or game programming or web programming or mobile programming or machine learning, mostly because there had already been large markets using other technologies in these areas long before D tried to enter them with enthusiasts lacking huge financial backing. It's far easier to be a pioneer in something new than to outrun the tycoons.

Anyway, D is still my favourite language and the default language I choose when coding something from scratch. It does have rough edges, but it's most often less PITA than any other language I've worked with, mostly because it's really flexible in terms of style - it lets me solve common problems with just a bunch of chained range calls, it lets me write clever meta stuff when I don't want to repeat myself, and it lets me get down to dirty imperative fors and ifs and pointer hacks when I'm too lazy to find nicer solutions. Just the right balance between typing and thinking. I sometimes miss the rich ecosystem that Python has, where you can find one-liners for many higher level tasks, but quite often they're limited to some basic set of use cases and they're not one-liners anymore when you need them to do something beyond those. I recall I had some good experience with C# in terms of how quickly I was able to reuse existing libraries and implement any new code, especially with pretty convenient tooling from MS, but that was long time ago when it wasn't seriously usable outside Windows and I didn't have much interest in developing for Windows later.

What I've missed the most so far in D was a zero-effort reuse of C libraries, because there's a lot of useful libs in C I already know. Of course it's much less tedious to interface C in D than in Python, but I bet if D had a fully-blown ImportC from the very beginning, it could be where C++ is today.
Apr 27 2022
next sibling parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Wednesday, 27 April 2022 at 22:43:25 UTC, Adrian Matoga wrote:
 of like it) at work. I've recently returned to tinkering with 
 electronics and programming at home so let me share my view.
Do you use or plan to use microcontrollers? If so, with what language?
 technology or even non-technology related idea too. Python 
 became the default language for ML, because it was easy enough 
 for people whose main focus wasn't programming, and who didn't 
 require system level performance because available bindings to 
 C libraries were good enough.
Yes, but I think this also has something to do with Python replacing Matlab in academic research institutions. Python is becoming the default platform for analysis and experimentation.
 What D tried to do was to be "better C++" or "better C", but in 
 2022 it's about 40 years too late to be successful in that. 
 There're millions of programs in C and C++ that have been good 
 enough to make revenue for many companies and thus convinced 
 others to invest more money, effort and time in more stuff that 
 depends on C and C++.
Yes, and they are ISO standards, so nobody "owns" C or C++. That creates a more "open" evolution that is industry-oriented (the main purpose of ISO is to make industrial tools and interfaces interoperable).
 do something beyond those. I recall I had some good experience 
 with C# in terms of how quickly I was able to reuse existing 
 libraries and implement any new code, especially with pretty 
 convenient tooling from MS, but that was long time ago when it 
 wasn't seriously usable outside Windows and I didn't have much 
 interest in developing for Windows later.
Was the good C# experience mostly about the tooling, e.g. completion suggestions in the IDE, or was it something that has to do with the language itself?
 What I've missed the most so far in D was a zero-effort reuse 
 of C libraries, because there's a lot of useful libs in C I 
 already know.
Yes, has the new ImportC feature been helpful for you in that regard?
 Of course it's much less tedious to interface C in D than in 
 Python, but I bet if D had a fully-blown ImportC from the very 
 beginning, it could be where C++ is today.
When compared to C++, I'd say that D still needs to get its memory management story right and fix some language shortcomings (incomplete features), but memory management is at least being looked at actively now. (People expect something more than malloc/free and roll-your-own ref-counting.) Thanks for sharing your thoughts, it was an interesting read!
Apr 28 2022
next sibling parent reply IGotD- <nise nise.com> writes:
On Thursday, 28 April 2022 at 07:54:44 UTC, Ola Fosheim Grøstad 
wrote:

 Was the good C# experience mostly about the tooling, e.g. 
 completion suggestions in the IDE, or was it something that has 
 to do with the language itself?
Several reasons. C# is a simplified language compared to C++. You can easily learn it coming from C++ or D. While being more limited than C++ without real metaprogramming, you rarely need it. The libraries are easy to use and they are plenty. The tooling is the gold standard.
Apr 28 2022
parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Thursday, 28 April 2022 at 09:02:09 UTC, IGotD- wrote:
 C# is a simplified language compared to C++. You can easily 
 learn it coming from C++ or D. While being more limited than C++ 
 without real metaprogramming, you rarely need it.
True, metaprogramming is mostly useful for writing flexible libraries; you can usually do the same things without it, just with more lines of library code.

 The libraries are easy to use and they are plenty.


 The tooling is the gold standard.
So it is all about the ecosystem and no advantages tied to the language itself (except avoiding memory pointers and such)?
Apr 28 2022
next sibling parent reply IGotD- <nise nise.com> writes:
On Thursday, 28 April 2022 at 09:12:07 UTC, Ola Fosheim Grøstad 
wrote:
 So it is all about the ecosystem and no advantages tied to the 
 language itself (except avoiding memory pointers and such)?
Visual Basic also supports .NET to some extent but is rarely used. However, the ecosystem is of course one of the major reasons for the success of C#. C# is supported on all major operating systems like Windows, Linux, iOS etc. This means C# will be used in more and more systems in the future and will eat up C++ market share significantly. C# has GC as default just like D, but there nobody complains about it. The designers were smart though and totally removed raw pointers (there are raw pointers, but you need to step outside the safe context to use them).
Apr 28 2022
parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Thursday, 28 April 2022 at 09:27:54 UTC, IGotD- wrote:

 Visual Basic also supports .NET to some extent but is rarely used.

C# seems to be much better liked than Java. I wonder if some of the language features they added over time are the reason people prefer it? Why is that?

 This means C# will be used in more and more systems in the 
 future and will eat up C++ market share significantly.
Maybe. But both Google and Microsoft tried to distance themselves from C++ for a while, yet they are fully backing it now. You also have things like Intel ISPC that tries to make parallel programming easy, and I think that one is sending signals of where performance programming is heading. The future of "low level programming" is difficult to predict, but maybe the nature of "low level programming" is changing? I.e. less about pointers and instructions and more about utilizing the hardware?
Apr 28 2022
parent Gregor =?UTF-8?B?TcO8Y2ts?= <gregormueckl gmx.de> writes:
On Thursday, 28 April 2022 at 09:39:57 UTC, Ola Fosheim Grøstad 
wrote:
 On Thursday, 28 April 2022 at 09:27:54 UTC, IGotD- wrote:

 Visual Basic also supports .NET to some extent but is rarely used.

 C# seems to be much better liked than Java. I wonder if some of the language features they added over time are the reason people prefer it? Why is that?
I'll say how I feel about it after having used both languages extensively. For a long time, Java has been a very basic language in some respects, which forces a lot of boilerplate code and convoluted solutions onto the user. This flaw is really on a language level. C# has evolved much faster, with a lot of quality of life features. Generics are a bit more powerful. Properties have nicer syntax. Delegates, lambdas, structs, pattern matching etc. make it feel more refined and more expressive. At the same time, the whole thing isn't too complex (yet?).
Apr 28 2022
prev sibling parent reply bauss <jj_1337 live.dk> writes:
On Thursday, 28 April 2022 at 09:12:07 UTC, Ola Fosheim Grøstad 
wrote:
 So it is all about the ecosystem and no advantages tied to the 
 language itself (except avoiding memory pointers and such)?
I can't speak for anyone else, but I started using D about a decade ago and back then I always wished other languages had the features D had; now it's the opposite: I'm always missing features in D.

* async/await
* Properties that actually work, with a really good syntax IMHO (getters/setters)
* string interpolation
* Shortened methods (and properties) - much better than the proposed version for D
* nullability built-in, e.g. object?.method(), as well as null-coalescing
* pattern matching / switch expressions
* out parameters that can be declared directly in the method call
* built-in tuples, as well as auto-expanding them to variables etc.
Apr 28 2022
next sibling parent reply SealabJaster <sealabjaster gmail.com> writes:
On Thursday, 28 April 2022 at 12:04:11 UTC, bauss wrote:


 ...
+100 to all of those things. Regrettably this community doesn't see much value in syntax sugar, and is fine with more bulky library solutions instead. It's not even about the amount of keystrokes like many here claim, it's about readability. All these little things add up as well.
Apr 28 2022
next sibling parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Thursday, 28 April 2022 at 12:33:48 UTC, SealabJaster wrote:
 Regrettably this community doesn't see much value in syntax 
 sugar, and is fine with more bulky library solutions instead.
It is possible to have syntactical sugar for library solutions if you have clearly defined protocols. Like D and C++ do with for-loops over containers. It is in many ways the best approach, but the question is how to define the right protocol mechanism.
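A minimal sketch of such a protocol in D (the type is made up): any type exposing empty/front/popFront gets the for-loop sugar for free:

    // foreach lowers onto the range protocol, so a library type
    // receives language-level syntax without any compiler changes.
    struct Countdown
    {
        int n;
        bool empty() const { return n == 0; }
        int front() const { return n; }
        void popFront() { --n; }
    }

    void main()
    {
        foreach (i; Countdown(3))
        {
            // i takes the values 3, 2, 1
        }
    }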
Apr 28 2022
parent reply bauss <jj_1337 live.dk> writes:
On Thursday, 28 April 2022 at 13:50:19 UTC, Ola Fosheim Grøstad 
wrote:
 On Thursday, 28 April 2022 at 12:33:48 UTC, SealabJaster wrote:
 Regrettably this community doesn't see much value in syntax 
 sugar, and is fine with more bulky library solutions instead.
It is possible to have syntactical sugar for library solutions if you have clearly defined protocols. Like D and C++ do with for-loops over containers. It is in many ways the best approach, but the question is how to define the right protocol mechanism.
I think his point was that D seems to favor library-only solutions in a lot of cases and then the syntactic sugar is never added properly.

No matter how good your library solutions are, you can never implement async/await in a clear fashion without the compiler emitting a state machine for you.
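For contrast, a minimal sketch of what the library route looks like today, using core.thread's Fiber rather than a compiler-generated state machine:

    import core.thread : Fiber;

    void main()
    {
        // every "await" point has to be an explicit yield; the
        // compiler builds no state machine for you.
        auto task = new Fiber({
            Fiber.yield();      // suspend, as an await point would
        });
        task.call();            // runs until the yield
        task.call();            // resumes to completion
        assert(task.state == Fiber.State.TERM);
    }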
Apr 29 2022
next sibling parent reply Paul Backus <snarwin gmail.com> writes:
On Friday, 29 April 2022 at 07:56:15 UTC, bauss wrote:
 I think his point was that D seems to favor library-only 
 solutions in a lot of cases and then the syntactic sugar is 
 never added properly.

 No matter how good your library solutions are, you can never 
 implement async/await in a clear fashion without the compiler 
 emitting a state machine for you.
The dirty secret here is that the code quality of the DMD frontend has deteriorated to the point where it is basically impossible to do a correct, complete implementation of any non-trivial language feature. So we can either have a library solution that works, but has ugly syntax; or we can have a language solution that has nice syntax, but doesn't work.
Apr 29 2022
next sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 4/29/2022 7:56 AM, Paul Backus wrote:
 On Friday, 29 April 2022 at 07:56:15 UTC, bauss wrote:
 I think his point was that D seems to favor library-only solutions in a lot of 
 cases and then the syntactic sugar is never added properly.

 No matter how good your library solutions are, you can never implement 
 async/await in a clear fashion without the compiler emitting a state 
 machine for you.
The dirty secret here is that the code quality of the DMD frontend has deteriorated to the point where it is basically impossible to do a correct, complete implementation of any non-trivial language feature. So we can either have a library solution that works, but has ugly syntax; or we can have a language solution that has nice syntax, but doesn't work.
I don't know what the rewrites of async/await would be. If someone would write that down, I doubt it would be that hard. After all, I got ImportC to work :-/
Apr 29 2022
parent reply Paul Backus <snarwin gmail.com> writes:
On Friday, 29 April 2022 at 15:36:34 UTC, Walter Bright wrote:
 On 4/29/2022 7:56 AM, Paul Backus wrote:
 On Friday, 29 April 2022 at 07:56:15 UTC, bauss wrote:
 I think his point was that D seems to favor library-only 
 solutions in a lot of cases and then the syntactic sugar is 
 never added properly.

 No matter how good your library solutions are, you can never 
 implement async/await in a clear fashion without the compiler 
 emitting a state machine for you.
The dirty secret here is that the code quality of the DMD frontend has deteriorated to the point where it is basically impossible to do a correct, complete implementation of any non-trivial language feature. So we can either have a library solution that works, but has ugly syntax; or we can have a language solution that has nice syntax, but doesn't work.
I don't know what the rewrites of async/await would be. If someone would write that down, I doubt it would be that hard. After all, I got ImportC to work :-/
I would hardly call ImportC's implementation "correct" and "complete" at this point, given the large number of outstanding bugs and the fact that it does not even support the preprocessor yet. This is exactly what people mean when they call features of D "incomplete" or "unfinished" or "half-assed": the happy path works, but important features are missing and the edge cases are riddled with bugs. I'm sure ImportC will improve, given enough time--maybe 3-5 years? But outside of a tiny number of core developers such as yourself who work on D full-time, it is unrealistic to expect an open-source contributor to see a project of that scale through to completion. That's what I really mean by "basically impossible". Not that it literally cannot be done, but that the amount of time and effort required is prohibitive for the vast majority of contributors.
Apr 29 2022
next sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 4/29/2022 9:03 AM, Paul Backus wrote:
 I would hardly call ImportC's implementation "correct" and "complete" at this 
 point, given the large number of outstanding bugs and the fact that it does
not 
 even support the preprocessor yet.
https://github.com/dlang/dmd/pull/13955 which has been there for 3 weeks, and I can't get anyone to approve it.
 This is exactly what people mean when they call features of D "incomplete" or 
 "unfinished" or "half-assed": the happy path works, but important features are 
 missing and the edge cases are riddled with bugs.
ImportC is only a few months old. Is it realistic to expect an entire language implementation to work perfectly that quickly?

As for bugs, here is the current list:

https://issues.dlang.org/buglist.cgi?quicksearch=importc&list_id=240477

The ones older than a week fall into these categories:

1. asking for extensions to match extensions in other compilers

2. the { } initializer syntax, which needs to be completely redone
 I'm sure ImportC will improve, given enough time--maybe 3-5 years?
Oh, it'll be a lot quicker than that.
 But outside 
 of a tiny number of core developers such as yourself who work on D full-time,
it 
 is unrealistic to expect an open-source contributor to see a project of that 
 scale through to completion. That's what I really mean by "basically 
 impossible". Not that it literally cannot be done, but that the amount of time 
 and effort required is prohibitive for the vast majority of contributors.
I'm not asking for help coding ImportC. I am asking for:

1. submit bug reports for problems (dave287091 gmail.com and duser neet.fi have been prolific in submitting very well done bug reports)

2. approve the PRs I submit for it
Apr 29 2022
parent reply Adam D Ruppe <destructionator gmail.com> writes:
On Friday, 29 April 2022 at 16:55:25 UTC, Walter Bright wrote:
 ImportC is only a few months old.
Oh how the time flies. It is actually about a full year old.

https://forum.dlang.org/thread/s79ib1$1122$1 digitalmars.com

Announced on May 09, 2021. I assume you were working on it for a little while before that, so probably over a year, but still, definitely more than "a few months".

And what did I say at the time? That the preprocessor question needs attention (still no solution today) and, I quote: "There'd still be all the little details that need to be done and the inevitable flurry of bugs. I'd be surprised if this is legitimately usable by the end of the year."
Apr 29 2022
parent Walter Bright <newshound2 digitalmars.com> writes:
On 4/29/2022 10:20 AM, Adam D Ruppe wrote:
 On Friday, 29 April 2022 at 16:55:25 UTC, Walter Bright wrote:
 ImportC is only a few months old.
Oh how the time flies. It is actually about a full year old. https://forum.dlang.org/thread/s79ib1$1122$1 digitalmars.com Announced on May 09, 2021. I assume you were working on it for a little while before that, so probably over a year, but still, definitely more than "a few months".
I thought I'd done that last August, not May.
 And what did I say at the time? That the preprocessor question needs attention 
 (still no solution today)
There is a solution, https://github.com/dlang/dmd/pull/13955
 and, I quote: "There'd still be all the little 
 details that need to be done and the inevitable flurry of bugs. I'd be
surprised 
 if this is legitimately usable by the end of the year."
It's still got bugs, but it is usable, particularly if your code avoids C extensions.
Apr 29 2022
prev sibling parent reply mw <mingwu gmail.com> writes:
On Friday, 29 April 2022 at 16:03:16 UTC, Paul Backus wrote:
 I would hardly call ImportC's implementation "correct" and 
 "complete" at this point, given the large number of outstanding 
 bugs and the fact that it does not even support the 
 preprocessor yet.
For this particular preprocessor issue, I think we should cut the scope: ImportC's goal should be to take input only from cpp's output (.i files) instead of .h files.

1) Why should D reinvent the wheel and duplicate the effort that all those C preprocessors have put in over the past several decades?

2) Otherwise, there are so many C compiler (flavored) headers; does dmd want to handle them all? After all, this is a D compiler, not a C compiler.
 This is exactly what people mean when they call features of D 
 "incomplete" or "unfinished" or "half-assed": the happy path 
 works, but important features are missing and the edge cases 
 are riddled with bugs.

 I'm sure ImportC will improve, given enough time--maybe 3-5 
 years? But outside of a tiny number of core developers such as 
 yourself who work on D full-time, it is unrealistic to expect 
 an open-source contributor to see a project of that scale 
 through to completion. That's what I really mean by "basically 
 impossible". Not that it literally cannot be done, but that the 
 amount of time and effort required is prohibitive for the vast 
 majority of contributors.
I agree with all the rest. We should really cut the scope of each D feature and make them stable & mature; instead of 100 features that are ~95% complete, users are happier with 10 features that are 100% complete. Those ~5% incomplete corner cases will drive users away, especially for production software.
Jun 10 2022
parent max haughton <maxhaton gmail.com> writes:
On Friday, 10 June 2022 at 20:00:26 UTC, mw wrote:
 On Friday, 29 April 2022 at 16:03:16 UTC, Paul Backus wrote:
 [...]
For this particular preprocessor issue, I think we should cut the scope: ImportC's goal should be to take input only from cpp's output (.i files) instead of .h files. [...]
This is what it does already. It uses the system compiler's preprocessor.
Jun 10 2022
prev sibling next sibling parent Jack <jckj33 gmail.com> writes:
On Friday, 29 April 2022 at 14:56:37 UTC, Paul Backus wrote:
 On Friday, 29 April 2022 at 07:56:15 UTC, bauss wrote:
 I think his point was that D seems to favor library-only 
 solutions in a lot of cases and then the syntactic sugar is 
 never added properly.

 No matter how good your library solutions are, you can never 
 implement async/await in a clear fashion without the compiler 
 emitting a state machine for you.
The dirty secret here is that the code quality of the DMD frontend has deteriorated to the point where it is basically impossible to do a correct, complete implementation of any non-trivial language feature. So we can either have a library solution that works, but has ugly syntax; or we can have a language solution that has nice syntax, but doesn't work.
wow
May 28 2022
prev sibling parent reply mw <mingwu gmail.com> writes:
On Friday, 29 April 2022 at 14:56:37 UTC, Paul Backus wrote:

 The dirty secret here is that the code quality of the DMD 
 frontend has deteriorated to the point where it is basically 
 impossible to do a correct, complete implementation of any 
 non-trivial language feature. So we can either have a library 
 solution that works, but has ugly syntax; or we can have a 
 language solution that has nice syntax, but doesn't work.
Wow, can't believe it! According to https://stackoverflow.com/a/14203288, all the compilers (LDC, GDC) share a common frontend from DMD, so they all suffer from this?

And dmd itself is developed in D (https://github.com/dlang/dmd is 87.9% D) by the best D developers (authors & experts), who must believe D is one of the best OO languages.

How come the DMD frontend is in such a terrible state?
Jun 10 2022
parent reply "H. S. Teoh" <hsteoh qfbox.info> writes:
On Fri, Jun 10, 2022 at 07:22:51PM +0000, mw via Digitalmars-d wrote:
[...]
 How come the DMD frontend is in such a terrible state?
Because: https://www.joelonsoftware.com/2000/04/06/things-you-should-never-do-part-i/ Selected quotes: [...] you can ask almost any programmer today about the code they are working on. “It’s a big hairy mess,” they will tell you. “I’d like nothing better than to throw it out and start over.” Why is it a mess? “Well,” they say, “look at this function. It is two pages long! None of this stuff belongs in there! I don’t know what half of these API calls are for.” [...] Yes, I know, it’s just a simple function to display a window, but it has grown little hairs and stuff on it and nobody knows why. Well, I’ll tell you why: those are bug fixes. One of them fixes that bug that Nancy had when she tried to install the thing on a computer that didn’t have Internet Explorer. Another one fixes that bug that occurs in low memory conditions. Another one fixes that bug that occurred when the file is on a floppy disk and the user yanks out the disk in the middle. That LoadLibrary call is ugly but it makes the code work on old versions of Windows 95. Each of these bugs took weeks of real-world usage before they were found. The programmer might have spent a couple of days reproducing the bug in the lab and fixing it. If it’s like a lot of bugs, the fix might be one line of code, or it might even be a couple of characters, but a lot of work and time went into those two characters. When you throw away code and start from scratch, you are throwing away all that knowledge. All those collected bug fixes. Years of programming work. T -- If lightning were to ever strike an orchestra, it'd always hit the conductor first.
Jun 10 2022
next sibling parent reply max haughton <maxhaton gmail.com> writes:
On Friday, 10 June 2022 at 19:37:37 UTC, H. S. Teoh wrote:
 On Fri, Jun 10, 2022 at 07:22:51PM +0000, mw via Digitalmars-d 
 wrote: [...]
 How come the DMD frontend is in such a terrible state?
Because: https://www.joelonsoftware.com/2000/04/06/things-you-should-never-do-part-i/ Selected quotes: [...] you can ask almost any programmer today about the code they are working on. “It’s a big hairy mess,” they will tell you. “I’d like nothing better than to throw it out and start over.” Why is it a mess? “Well,” they say, “look at this function. It is two pages long! None of this stuff belongs in there! I don’t know what half of these API calls are for.” [...] Yes, I know, it’s just a simple function to display a window, but it has grown little hairs and stuff on it and nobody knows why. Well, I’ll tell you why: those are bug fixes. One of them fixes that bug that Nancy had when she tried to install the thing on a computer that didn’t have Internet Explorer. Another one fixes that bug that occurs in low memory conditions. Another one fixes that bug that occurred when the file is on a floppy disk and the user yanks out the disk in the middle. That LoadLibrary call is ugly but it makes the code work on old versions of Windows 95. Each of these bugs took weeks of real-world usage before they were found. The programmer might have spent a couple of days reproducing the bug in the lab and fixing it. If it’s like a lot of bugs, the fix might be one line of code, or it might even be a couple of characters, but a lot of work and time went into those two characters. When you throw away code and start from scratch, you are throwing away all that knowledge. All those collected bug fixes. Years of programming work. T
No, it really is bad. Some newer areas are OK, but the quality of the code is overall just bad: it relies on enormous amounts of mutability, doesn't have a proper opinion about how to resolve symbols (it only has 3 passes), and tries to make decisions before properly analyzing the problem, etc. The compiler is mostly reasonable semantically because D is a conventional language, but several key parts of the logic are either extremely old messy bits of code that basically cannot be easily changed or types with a very sloppy heritage that lead to an explosion of edge cases all over the place: Array, Struct, and Int32 are all considered to be the same type of thing according to the enum at the heart of the class that represents types. It's ad-hoc "eh, just ship it" code that almost no one can be bothered to fix, because they've either been scared off from working on the compiler by the aforementioned warts, or because they've tried to divide and conquer the cleanup efforts and been told no.

Probably 40% of the bug fixes of the kind you posit are *because* of the frontend being unreliable.
Jun 10 2022
next sibling parent reply mw <mingwu gmail.com> writes:
On Friday, 10 June 2022 at 19:52:15 UTC, max haughton wrote:

 No, it really is bad. Some newer areas are OK, but the quality 
 of the code is overall just bad: it relies on enormous amounts 
 of mutability,
"relies on enormous amounts of mutability" of global state / variables?
 ... but several key parts of the logic are either extremely old 
 messy bits of code that basically cannot be easily changed or 
 types with a very sloppy heritage that lead to an explosion of 
 edge cases all over the place
...
 Probably 40% of the bug fixes of the kind you posit are 
 *because* of the frontend being unreliable.
Just curious: how did DMD become self-hosted in D? Did it start from scratch, or was it translated from the old C++ implementation? Where is this old mess of baggage coming from?

I still feel puzzled: D is supposed to be a better OO language (read: encapsulation, separation of concerns), and DMD is developed by a number of highly capable very experienced D developers (read: not ordinary programmers), how come DMD is in such a terrible state as if it's done by some average Joel (above)?

No offense, I am just puzzled by this software engineering myth.
Jun 10 2022
next sibling parent reply "H. S. Teoh" <hsteoh qfbox.info> writes:
On Fri, Jun 10, 2022 at 08:59:38PM +0000, mw via Digitalmars-d wrote:
[...]
 Just curious: how DMD is becoming self hosted in D? it started from
 scratch, or being translated from the old C++ implementation? where
 this old mess baggage coming from?
DMD was originally written in C++. There was a period of transition when the code was being auto-transliterated to D with increasing coverage until the result could be compiled. Then when it started passing the test suite, the official repo switched to the D version and dropped the C++ code. The auto translation, of course, was the absolute minimum needed to get C++-style code to compile as D code. Since that time, there has been a good amount of refactoring to take advantage of D's features, but a good chunk still remains more-or-less the same as in the C++ days (except with C++ syntax translated to D). From time to time Walter would refactor bits of this code, taking advantage of D features to make it better, but there's a LONG way to go before it could be considered anywhere close to idiomatic D. [...]
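As a hedged illustration - code invented for this explanation, not taken from the DMD repository - the gap between transliterated and idiomatic style looks roughly like this:

```d
// C++-style code mechanically transliterated to D: a raw pointer plus
// a length, walked with a manual index loop.
int sumTransliterated(const(int)* p, size_t len)
{
    int total = 0;
    for (size_t i = 0; i < len; i++)
        total += p[i];
    return total;
}

// The idiomatic D refactoring: a slice and foreach replace the
// pointer/length pair and the manual loop.
int sumIdiomatic(const(int)[] values)
{
    int total = 0;
    foreach (v; values)
        total += v;
    return total;
}
```

Both functions do the same work; the refactoring described above is thousands of small steps from the first shape toward the second.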
 D is supposed to be a better OO language (read: encapsulation,
 separation of concerns), and DMD is developed by a number of highly
 capable very experienced D developers (read: not ordinary
 programmers), how come DMD is in such a terrible state as if it's done
 by some average Joel (above)?
 
 No offense, I am just puzzled by this software engineering myth.
The answer is very simple: historical baggage. It happens to every project that's been around for more than just a few years. T -- PNP = Plug 'N' Pray
Jun 10 2022
parent reply Walter Bright <newshound2 digitalmars.com> writes:
The optimizer and backend date back to the 1980s, and were written in the C 
style fashionable at the time. I've been slowly refactoring them to be 
better, in particular through the use of D arrays and the CodeBuilder 
system. The code still excessively uses global variables, though I have 
refactored some of that away.

The frontend dates back to 2000 or so, and was written in the "C with Classes" 
style of the time. Of course, this is outdated today. It doesn't use templates 
because C++ templates of that time were terrible, and frankly, I still find 
them unpleasant to use.

A recent "D-ify" of it is the greatly expanded use of nested functions, and
some 
lambdas.
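For readers who haven't used them, a minimal sketch (mine, not from the DMD source) of the kind of cleanup nested functions allow:

```d
// The nested function reads and mutates the enclosing locals directly,
// so a C-style file-scope helper taking extra parameters (or touching
// globals) can become local to the one function that uses it.
int countMatches(const(int)[] values, int threshold)
{
    int matches;

    void check(int v)
    {
        if (v > threshold) // captures threshold from the enclosing scope
            ++matches;     // mutates the enclosing local directly
    }

    foreach (v; values)
        check(v);
    return matches;
}
```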
Jun 11 2022
parent zjh <fqbqrr 163.com> writes:
On Saturday, 11 June 2022 at 20:14:02 UTC, Walter Bright wrote:
  It doesn't use templates because C++ templates of that
 time were terrible, and frankly, I still find them unpleasant 
 to use.
`Modern C++` templates are very easy to use. The combination of `concepts`, variadic templates, and `index_sequence` is very powerful!
Jun 11 2022
prev sibling next sibling parent reply forkit <forkit gmail.com> writes:
On Friday, 10 June 2022 at 20:59:38 UTC, mw wrote:
 ..
 D is supposed to be a better OO language (read: encapsulation, 
 separation of concerns), and DMD is developed by a number of 
 highly capable very experienced D developers (read: not 
 ordinary programmers), how come DMD is in such a terrible state 
 as if it's done by some average Joel (above)?

 No offense, I am just puzzled by this software engineering myth.
Nonsense. D .. a better OO language?? Do you even know how hard it is to reason about a D module?

The D module is, apparently, THE single most important abstraction for encapsulation - someone decided to design it this way. This conflicts with the OO principle of being able to encapsulate an object's invariants in its specification. So D, a -betterOOP .. hah!

The D module is designed to encourage shared mutability. There are no means to specify, let alone verify and enforce, encapsulated object invariants. They have no 'boundary' inside a D module - by that I mean, any other code in the same module can transgress any boundary that has been specified. The 'supremacy' of the D module encourages exactly what is going on in the dmd and phobos source.

Co-operative mutability is a major source of bugs - always has been, always will be (cause it makes it so difficult to reason about code). Mutable state subverts encapsulation, makes it more difficult to reason about code, and makes it difficult to scale 'correct' code. Mutability is certainly 'convenient', and often necessary, especially in low-level code, but it needs to be approached with greater caution than what is demonstrated in D's source code.

D's module is the result of imperative, co-operative-mutability thinking, not OO thinking. Please drop this idea that D is a better OO language. It is not.
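For concreteness, a minimal sketch of the behavior being complained about (file and names are mine):

```d
module counter;

class Counter
{
    private int count;   // in D, private means private to the *module*

    void increment() { ++count; }
    int value() const { return count; }
}

// A free function in the same module compiles fine even though it pokes
// at count directly, bypassing Counter's methods entirely.
void reset(Counter c)
{
    c.count = 0;
}
```

Move `reset` into another module and the access becomes a compile error; inside the module, nothing stops it. That is the boundary (or lack of one) being argued about here.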
Jun 10 2022
next sibling parent reply monkyyy <crazymonkyyy gmail.com> writes:
On Friday, 10 June 2022 at 22:44:25 UTC, forkit wrote:
 On Friday, 10 June 2022 at 20:59:38 UTC, mw wrote:
 [...]
[...]
Betterc++
Jun 10 2022
parent mw <mingwu gmail.com> writes:
 The answer is very simple: historical baggage.
OK, that explains.

On Friday, 10 June 2022 at 23:09:57 UTC, monkyyy wrote:
 On Friday, 10 June 2022 at 22:44:25 UTC, forkit wrote:

 Betterc++
Yeah, that's what I mean: better than C++, that's where D got started.

And I guess D's module stuff comes from Java's package idea: all the classes/files in the same directory are supposed to be closely related, so the access control (private/protected) is relaxed. IIRC, Walter also produced a Java compiler in the early Java 1.0 days, before he started D.

Sometimes a complete rewrite is the only way to solve the problem. I know one of the mega-cap tech companies completely rewrote their whole infrastructure at least 3 times, each time with the lessons learned from the previous version. But, given D's small community size, I'm not sure if this is feasible. Just wondering if someone wants to start from scratch writing a completely new D front-end.
Jun 10 2022
prev sibling next sibling parent reply "H. S. Teoh" <hsteoh qfbox.info> writes:
On Fri, Jun 10, 2022 at 10:44:25PM +0000, forkit via Digitalmars-d wrote:
 On Friday, 10 June 2022 at 20:59:38 UTC, mw wrote:
[...]
 D is supposed to be a better OO language (read: encapsulation,
 separation of concerns), [...]
[...]
 Nonsense. D .. a better OO language??
This makes me chuckle. So the OO bandwagon is still alive and tooting, after all these years. [...]
 The D module is designed to encourage shared mutability. There are no
 means to specify, let alone verify and enforce, encapsulated object
 invariants.  They have no 'boundary' inside a D module - by that I
 mean, any other code in the same module can transgress any boundary
 that has been specified.
Interesting. By the same logic, any code in the same class in, say, Java, can "transgress any boundary" and omg modify shared state touched by other code in that class -- oh the horrors!

The only pertinent difference, really, is the size of the unit of encapsulation in which this is permitted to happen. For any non-trivial code to work at all, it has to interact with other code *somehow*. One part of the program has to share data with another part of the program, otherwise it might as well do nothing at all.

One may, of course, disagree with the chosen unit of encapsulation, but that's all there is to it, a difference in units. It isn't as though the universe is in danger of imploding as soon as shared state leaks beyond the boundaries of a(n) {expression, function body, class, module, thread, etc.}. It's a matter of balancing between risk and bureaucracy: the larger the unit, the higher the risk; but the smaller the scope, the more the bureaucracy (when you need to cross boundaries). I think almost all of us agree that there has to be *some* unit of encapsulation, we just disagree on what that unit should be. [...]
 Co-operative mutability, is a major source of bugs - always has been,
 always will be (cause it makes it so difficult to reason about code).
 
 Mutable state subverts encapsulation, makes it more difficult to
 reason about code, and makes it difficult to scale 'correct' code.
[...] This sounds like a sales pitch for Haskell. :-P The logical conclusion of the above line of reasoning is to eliminate mutable state altogether, and for that, Haskell perfectly fits the bill. T -- If it tastes good, it's probably bad for you.
Jun 10 2022
next sibling parent Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Friday, 10 June 2022 at 23:36:48 UTC, H. S. Teoh wrote:
 On Fri, Jun 10, 2022 at 10:44:25PM +0000, forkit via 
 Digitalmars-d wrote:
 On Friday, 10 June 2022 at 20:59:38 UTC, mw wrote:
[...]
 D is supposed to be a better OO language (read: 
 encapsulation,
 separation of concerns), [...]
[...]
 Nonsense. D .. a better OO language??
This makes me chuckle. So the OO bandwagon is still alive and tooting, after all these years.
Friday... Drunk? What other modelling paradigm than OO would be suitable? Certainly not ER? Or maybe you think modelling is a waste of time? Let's just jump in and write code! Most open source software suffers badly from this attitude. D is no exception, of course. It is universal: modelling is boring and hard, coding is fun and easy, let's code!!
Jun 10 2022
prev sibling next sibling parent reply forkit <forkit gmail.com> writes:
On Friday, 10 June 2022 at 23:36:48 UTC, H. S. Teoh wrote:

In OOP, the primary unit of scope/encapsulation is the class.

I'm not aware of there being different views on this matter - but 
happy to be enlightened.

That is the basis on which I reject the assertion that D is a 
better OO language.

btw. I'm not a proponent of Java, as some have suggested ;-)

So using Java to rebut my argument is pointless ;-)

But I do think programming languages of the future need to 
default to immutability and memory safety. One should of course 
have the option to opt out where that is needed and 
appropriate.

The problem with D, is that it set out to be a betterC, perhaps 
even a betterC++.

But when one sets out to design a better motorcycle, you don't 
end up with a motorcycle at all. You end up with something 
completely different.

The advantage of a motorcycle is you can just jump on it and 
zoom off...... and of course, you have the rush of being in 
control of your own fate..

That's the advantage of the motorcycle. Safety issues were never 
a primary consideration - otherwise you wouldn't be on that 
motorcycle.

So designing a better C or C++ is a futile effort that results 
in some strange hybrid of things.. that don't seem to really go 
all that well together.

I don't want a seat-belt on my motorcycle.

I don't want the seat-belts taken out of my car either.
Jun 10 2022
parent reply "H. S. Teoh" <hsteoh qfbox.info> writes:
On Sat, Jun 11, 2022 at 12:25:08AM +0000, forkit via Digitalmars-d wrote:
[...]
 In OOP, the primary unit of scope/encapsulation is the class.
 
 I'm not aware of there being different views on this matter - but
 happy to be enlightened.

 That is the basis on which I reject the assertion that D is a better
 OO language.
OK, you got me there. :-D But the person you were replying to *did* clarify that what they meant by "OO" is "encapsulation, separation of concerns". So not necessarily OO in the strict sense of the word.
 btw. I'm not a proponent of Java, as some have suggested ;-)
 
 So using Java to rebutt my argument, is pointless ;-)
"Without geometry, life would be pointless..." Sorry, wrong forum. :-P
 But I do think, programming language of the future need to default to
 immutability and memory safety. One should have of course, the option,
 to opt-out where that is needed and appropriate.
Memory safety -- definitely agreed. Just browsing the latest list of CVEs one can't help noting that a disproportionately large number of them come from memory safety flaws. It may take another decade or four, but languages that are unsafe by default will IMO eventually become obsolete.

Immutability -- not so much. Maybe something like Rust's `mut` to ask for mutability might work, but Haskell-style straitjacketed immutability isn't my cup of tea.
 The problem with D, is that it set out to be a betterC, perhaps even a
 betterC++.
 
 But when one sets out to design a better motorcycle, you don't end up
 with a motorcycle at all. You end up with something completely
 different.
 
 The advantage of a motorcycle, is you can just jump on it a zoom
 off......  and of course, you have the rush of being in control of
 your own fate..
 
 That's the advantage of the motorcycle. Safety issues were never a
 primary consideration - otherwise you wouldn't be on that motorcycle.
 
 So designing a better C or C++ is a futile effort, that results in
 some strange hybrid of things.. that don't see to really go all that
 well together.
 
 I don't want a seat-belt on my motorcycle.
 
 I don't want the seat-belts taken out of my car either.
But but... *I* want a car-torcycle with half-length seatbelts that come off automatically when I'm getting on or off, but stay firmly on in the case of an accident! Why can't I have both? :-P Surely with DbI and the appropriate UDAs, and perhaps an undocumented -preview=seatbelt switch or three, that could be made to work...!

T

-- 
People tell me that I'm skeptical, but I don't believe them.
Jun 10 2022
parent forkit <forkit gmail.com> writes:
On Saturday, 11 June 2022 at 00:48:29 UTC, H. S. Teoh wrote:
 But but... *I* want a car-torcycle with half-length seatbelts 
 that come off automatically when I'm getting on or off, but 
 stay firmly on in the case of an accident!  Why can't I have 
 both?  :-P  Surely with DbI and the appropriate UDAs, and 
 perhaps an undocumented -preview=seatbelt switch or three, that 
 could be made to work...!


 T
D makes a great car-torcycle. I'll give you that ;-) I see plenty of teenagers in my street, driving strange looking hybrids up and down the street. They love it too. But I don't wanna drive interstate using one of those. Happy to drive around the streets though.
Jun 10 2022
prev sibling next sibling parent reply forkit <forkit gmail.com> writes:
On Friday, 10 June 2022 at 23:36:48 UTC, H. S. Teoh wrote:
 ...
 The only pertinent difference, really, is the size of the unit 
 of encapsulation in which this is permitted to happen.  For any 
 non-trivial code to work at all, it has to interact with other 
 code *somehow*. One part of the program has to share data with 
 another part of the program, otherwise it might as well do 
 nothing at all.
People can interact, and nothing needs to get mutated. If they're all stubborn in their views, then it's likely nothing can be mutated anyway. In this case, interaction and mutation are mutually exclusive.

In the same way, I want to protect my class from mutation (making it stubborn to being mutated by the outside world). My class will decide when to mutate, and when not to mutate. My module should not force mutation on me.
Jun 10 2022
parent reply "H. S. Teoh" <hsteoh qfbox.info> writes:
On Sat, Jun 11, 2022 at 12:37:17AM +0000, forkit via Digitalmars-d wrote:
 On Friday, 10 June 2022 at 23:36:48 UTC, H. S. Teoh wrote:
 ...
 The only pertinent difference, really, is the size of the unit of
 encapsulation in which this is permitted to happen.  For any
 non-trivial code to work at all, it has to interact with other code
 *somehow*. One part of the program has to share data with another
 part of the program, otherwise it might as well do nothing at all.
People can interact, and nothing needs to get mutated. If they're all stubborn in their views, then it's likely nothing can be mutated anyway. In this case, interaction and mutation are mutually exclusive. In the same way, I want to protect my class from mutation (making it stubborn to being mutated by the outside world). My class will decide when to mutate, and when not to mutate. My module should not force mutation on me.
You know how ridiculous that sounds, right? -- when you rephrase that in terms of a different unit of encapsulation. "I want to protect my function's per-object state from mutation by other functions in the class. My function will decide when to mutate, and when not to mutate. My class should not force mutation on me." Or, "I want to protect my local variables from mutation by other blocks in the function. My block will decide when to mutate, and when not to mutate. My function body should not force mutation on me."

One can argue that functions in a class ought to work together on that class's data; one could argue the same for functions (and other code) in a module. Or blocks in a function. It's essentially the same argument at the core; the only difference is the unit of encapsulation.

As I said, opinions differ on this. You say the class ought to be the unit of encapsulation, Walter says it should be the module. Maybe next week I should write a DIP arguing for the block to be the unit of encapsulation instead. Each side of the argument has its merits and demerits; the choice is essentially arbitrary, based on what the language designer deems more important or not, in balancing the tradeoffs in the language.

There are bigger fish to fry in the pond of programming language design.

T

-- 
It is impossible to make anything foolproof because fools are so ingenious. -- Sammy
Jun 10 2022
next sibling parent reply forkit <forkit gmail.com> writes:
On Saturday, 11 June 2022 at 01:00:44 UTC, H. S. Teoh wrote:
 ..
 You know how ridiculous that sounds, right? -- when you 
 rephrase that in terms of a different unit of encapsulation.  
 "I want to protect my function's per-object state from mutation 
 by other functions in the class.  My function will decide when 
 to mutate, and when not to mutate. My class should not force 
 mutation on me."  Or, "I want to protect my local variables 
 from mutation by other blocks in the function. My block will 
 decide when to mutate, and when not to mutate.  My function 
 body should not force mutation on me."

 One can argue that functions in a class ought to work together 
 on that class's data; one could argue the same for functions 
 (and other code) in a module. Or blocks in a function.  It's 
 essentially the same argument at the core; the only difference 
 is the unit of encapsulation.
Any argument taken to the extreme will certainly sound ridiculous - as you've demonstrated.
 As I said, opinions differ on this.  You say the class ought to 
 be unit of encapsulation, Walter says it should be the module. 
 Maybe next week I should write a DIP arguing for the block to 
 be the unit of encapsulation instead.  Each side of the 
 argument has its merits and demerits; the choice is essentially 
 arbitrary, based on what the language designer deems more 
 important or not, in balancing the tradeoffs in the language.

 There are bigger fish to fry in the pond of programming 
 language design.


 T
In OOP, opinions do not differ on the level of encapsulation. It is the class. I'm happy to be corrected here, if I'm wrong. It's why it's called OOP, not MOP.

My only disagreement is where a language offering OOP goes and puts this principle aside. It's not a disagreement about the module being an abstraction that provides a barrier around it. It's that the module swallows up the barrier of the class. Therefore, if you use classes in D, the very principle of OOP has been 'mutated'.

In my case, I'm protecting this principle: it's private, and I mean really private, and the outside world is not going to mutate it .. I even have barriers in memory to prevent that from happening. So no amount of hacking will circumvent that barrier ;-) .. but I'm sure many will keep trying.
Jun 10 2022
parent reply Paulo Pinto <pjmlp progtools.org> writes:
On Saturday, 11 June 2022 at 01:14:16 UTC, forkit wrote:
 On Saturday, 11 June 2022 at 01:00:44 UTC, H. S. Teoh wrote:
 ..
 You know how ridiculous that sounds, right? -- when you 
 rephrase that in terms of a different unit of encapsulation.  
 "I want to protect my function's per-object state from 
 mutation by other functions in the class.  My function will 
 decide when to mutate, and when not to mutate. My class should 
 not force mutation on me."  Or, "I want to protect my local 
 variables from mutation by other blocks in the function. My 
 block will decide when to mutate, and when not to mutate.  My 
 function body should not force mutation on me."

 One can argue that functions in a class ought to work together 
 on that class's data; one could argue the same for functions 
 (and other code) in a module. Or blocks in a function.  It's 
 essentially the same argument at the core; the only difference 
 is the unit of encapsulation.
Any argument taken to the extreme will certainly sound ridiculous - as you've demonstrated.
 As I said, opinions differ on this.  You say the class ought 
 to be unit of encapsulation, Walter says it should be the 
 module. Maybe next week I should write a DIP arguing for the 
 block to be the unit of encapsulation instead.  Each side of 
 the argument has its merits and demerits; the choice is 
 essentially arbitrary, based on what the language designer 
 deems more important or not, in balancing the tradeoffs in the 
 language.

 There are bigger fish to fry in the pond of programming 
 language design.


 T
In OOP, opinions do not differ on the level of encapsulation. It is the class. I'm happy to be corrected here, if I'm wrong. It's why it's called OOP, not MOP. ...
Regardless of how it is called, OOP is definitely not the class.

There are no classes in prototype based OOP languages like SELF and JavaScript (the ES6 "class" gets desugared into prototypes anyway).

There are no classes in pattern based OOP languages like BETA.

There are no classes in type extension based OOP languages like Oberon.

There are no classes in multi-methods/protocol based OOP languages like Common Lisp, Clojure, Dylan and Julia.

There are no classes in interface based OOP languages like VB (pre-.NET), Go, Rust, OCaml.

Basically, the OOP design space is big enough not to have plain classes define the ultimate meaning of what OOP is.
Jun 11 2022
next sibling parent Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Saturday, 11 June 2022 at 07:27:43 UTC, Paulo Pinto wrote:
 There are no classes in pattern based OOP languages like BETA.
Pattern is very close to how classes work in Simula, just generalized to cover functions and more. You can have executable code in the body of a Simula class too. Lambdas in C++ capture some of that spirit.

What tends to confuse people is that they conflate OO with language features, but OO is a modelling strategy, not a language extension mechanism. OO language features are there to make implementing OO models easier. OOP is a good feature set for implementing a model, but usually inadequate for extending the language (which it never was intended for).

(Regardless, D is class based and should be evaluated as such.)
Jun 11 2022
prev sibling next sibling parent reply forkit <forkit gmail.com> writes:
On Saturday, 11 June 2022 at 07:27:43 UTC, Paulo Pinto wrote:
 Regardless of how it is called, OOP is definitely not the class.

 There are no classes in prototype based OOP languages like SELF 
 and JavaScript (the ES6 "class" gets desugared into prototypes 
 anyway).

 There are no classes in pattern based OOP languages like BETA.

 There are no classes in type extension based OOP languages like 
 Oberon.

 There are no classes in multi-methods/protocol based OOP 
 languages like Common Lisp, Clojure, Dylan and Julia.

 There are no classes in interface based OOP languages like VB 
 (pre-.NET), Go, Rust, OCaml.

 Basically, the OOP design space is big enough not to have plain 
 classes define the ultimate meaning of what OOP is.
When you redefine what an 'object' is, then anything can be OOP ;-)

Please properly acquaint yourself with this concept ;-)

http://kristennygaard.org/FORSKNINGSDOK_MAPPE/F_OO_start.html
Jun 11 2022
next sibling parent Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Saturday, 11 June 2022 at 08:57:40 UTC, forkit wrote:
 When you redefine what an 'object' is, then anything can be OOP 
 ;-)

 Please properly acquaint yourself this this concept ;-)

 http://kristennygaard.org/FORSKNINGSDOK_MAPPE/F_OO_start.html
Heh, I had a short face-to-face exchange about the merits of the minimalism of BETA (the Simula successor) with Kristen, and his opinion was that Self went one step too far, but he never suggested that it wasn't OOP. He also made a big point of OO not being a paradigm, but something to be used with other approaches. At that point he also felt that stakeholders would benefit from being trained in OO (IIRC), so it was more about a modelling mindset. He was also highly sceptical of pure OOP detached from modelling (I believe this was an American thing). To Kristen, OO was useful outside of programming and fit into ideas about empowerment. (My attempt at recollecting what he said in the 90s.)

(Languages presented at OO conferences are OO. Simula didn't get encapsulation until the 70s, btw.)

What is funny about the D culture is that the same people who whine about OO would also celebrate `alias this` as a great invention! The same people who point out how great "voldemort types" are, because they think they give better encapsulation, also think that having class encapsulation is bad. (These NIH traits, and celebrating being different for the sake of being different, are sure signs of a cult...)
Jun 11 2022
prev sibling parent reply Paulo Pinto <pjmlp progtools.org> writes:
On Saturday, 11 June 2022 at 08:57:40 UTC, forkit wrote:
 On Saturday, 11 June 2022 at 07:27:43 UTC, Paulo Pinto wrote:
 Regardless of how it is called, OOP is definitely not the 
 class.

 There are no classes in prototype based OOP languages like 
 SELF and JavaScript (the ES6 "class" gets desugared into 
 prototypes anyway).

 There are no classes in pattern based OOP languages like BETA.

 There are no classes in type extension based OOP languages 
 like Oberon.

 There are no classes in multi-methods/protocol based OOP 
 languages like Common Lisp, Clojure, Dylan and Julia.

 There are no classes in interface based OOP languages like VB 
 (pre-.NET), Go, Rust, OCaml.

 Basically, the OOP design space is big enough not to have plain 
 classes define the ultimate meaning of what OOP is.
When you redefine what an 'object' is, then anything can be OOP ;-) Please properly acquaint yourself this this concept ;-) http://kristennygaard.org/FORSKNINGSDOK_MAPPE/F_OO_start.html
I am well acquainted, thank you very much. It would have been a bad major in systems programming and languages if I had missed such basic stuff.

Basically you are asserting that anything besides Simula isn't OOP; that is like asserting only pure lambda calculus is FP.
Jun 11 2022
parent reply forkit <forkit gmail.com> writes:
On Saturday, 11 June 2022 at 11:31:17 UTC, Paulo Pinto wrote:
 I am well acquitted, thank you very much. It would have been a 
 bad major in systems programming and languages, if I had missed 
 such basic stuff.

 Basically you are asserting that anything besides Simula isn't 
 OOP, that is like asserting only pure lambda calculus is FP.
I think we're clearly talking about two different things here?

me - object-oriented languages (which support all of the features of object-oriented programming)

you - object-based languages (which support a subset of the features of OOP)

We have to agree on terminology, or otherwise it's not possible to represent a concrete idea. A bit like the D module, which can't agree on what is a type and what isn't ;-)
Jun 11 2022
parent reply forkit <forkit gmail.com> writes:
On Saturday, 11 June 2022 at 11:42:06 UTC, forkit wrote:

btw.

I'm open to being enlightened, on how a language can support all 
the features of object-oriented programming - including 
encapsulation, inheritance and polymorphism - without a class 
type.

My assertion was, that a language that supports OOP, must have a 
class type.

I'm not aware of another way to provide all these features 
without a class type.

Is there another way? Did you learn something in your degree that 
I'm unaware of?
Jun 11 2022
parent Paulo Pinto <pjmlp progtools.org> writes:
On Saturday, 11 June 2022 at 11:50:46 UTC, forkit wrote:
 On Saturday, 11 June 2022 at 11:42:06 UTC, forkit wrote:

 btw.

 I'm open to being enlightended, on how a language can support 
 all the features of object-oriented progamming -  including, 
 encapsulation, inheritance and polymorphism - without a class 
 type.

 My assertion was, that a language that supports OOP, must have 
 a class type.

 I'm not aware of another way to provide all these features 
 without a class type.

 Is there another way? Did you learn something in your degree 
 that I'm unaware of?
OOP doesn't imply being class based; SIGPLAN has enough information on the matter regarding programming language taxonomy. You can start with the ECOOP proceedings, available from Springer Verlag.

However, it is clearly a waste of my time trying to convince you otherwise.
Jun 11 2022
prev sibling next sibling parent forkit <forkit gmail.com> writes:
On Saturday, 11 June 2022 at 07:27:43 UTC, Paulo Pinto wrote:
 ..
 Regardless of how it is called, OOP is definitely not the class.

 There are no classes in prototype based OOP languages like SELF 
 and JavaScript (the ES6 "class" gets desugared into prototypes 
 anyway).

 There are no classes in pattern based OOP languages like BETA.

 There are no classes in type extension based OOP languages like 
 Oberon.

 There are no classes in multi-methods/protocol based OOP 
 languages like Common Lisp, Clojure, Dylan and Julia.

 There are no classes in interface based OOP languages like VB 
 (pre-.NET), Go, Rust, OCaml.

 Basically, the OOP design space is big enough not to have plain 
 classes define the ultimate meaning of what OOP is.
You forgot to mention the programming language called 17. It doesn't use classes either. But it's object oriented. After all, 17 is an object.
Jun 11 2022
prev sibling parent Antonio <antonio abrevia.net> writes:
On Saturday, 11 June 2022 at 07:27:43 UTC, Paulo Pinto wrote:
 Regardless of how it is called, OOP is definitely not the class.

 There are no classes in prototype based OOP languages like SELF 
 and JavaScript (the ES6 "class" gets desugared into prototypes 
 anyway).

 There are no classes in pattern based OOP languages like BETA.

 There are no classes in type extension based OOP languages like 
 Oberon.

 There are no classes in multi-methods/protocol based OOP 
 languages like Common Lisp, Clojure, Dylan and Julia.

 There are no classes in interface based OOP languages like VB 
 (pre-.NET), Go, Rust, OCaml.

 Basically, the OOP design space is big enough not to have plain 
 classes define the ultimate meaning of what OOP is.
**Amen**
Jun 14 2022
prev sibling next sibling parent reply forkit <forkit gmail.com> writes:
On Saturday, 11 June 2022 at 01:00:44 UTC, H. S. Teoh wrote:
 .. As I said, opinions differ on this.
 You say the class ought to be unit of encapsulation,
 Walter says it should be the module.
No. This is not my argument. It never has been. To use yet another analogy (but not taken to its extreme though):

Think of the module as a house. Now think of it as a house *without* doors inside it - no barriers - anyone can go anywhere. Now consider what happens as you increase the number of people living in it, beyond one. Very quickly, even at a small number, you start to realise that hey, maybe we need a door here, or over there. But tough luck. Cause this house does not accommodate doors. If you want even the most modest level of privacy, you need to go build your own house.

What I like to do is put rooms in my house. Cause we're a family, and we like living together - but some of us really do want a reasonable, modest level of privacy - without having to go build our own house.

Consider the class as being the door that provides that.
Jun 10 2022
next sibling parent reply forkit <forkit gmail.com> writes:
On Saturday, 11 June 2022 at 02:01:19 UTC, forkit wrote:

.. and Walter's response would be:

No. There are no doors in this house. It's not designed to 
accommodate doors.
Jun 10 2022
parent forkit <forkit gmail.com> writes:
On Saturday, 11 June 2022 at 02:07:04 UTC, forkit wrote:

of course, a door itself won't do very much unless I also have 
walls.

Consider the walls as the compiler 'forcing' you to go through 
the door.

We got no doors, and no walls, in D.
Jun 10 2022
prev sibling parent reply Paul Backus <snarwin gmail.com> writes:
On Saturday, 11 June 2022 at 02:01:19 UTC, forkit wrote:
 Think of the module as a house.
[...]
 If you want even the most modest level of privacy, you need to 
 go build your own house.

 What I like to do is put rooms in my house. Cause we're a 
 family, and we like living together - but some of us really do 
 want a reasonable, modest level of privacy - without having to 
 go build our own house.
I guess when you say "without having to go build your own house", you mean "without having to create a new file"? In other words, you would like encapsulation boundaries to be decoupled from file boundaries. IMO the correct way to do this would be to simply allow multiple modules to be declared in the same file, rather than to couple encapsulation boundaries to class boundaries. This is a feature that has been requested many times for D, and one that several other languages have implemented successfully. It is not entirely without downsides. Putting multiple modules in one file would complicate the build process somewhat--for example, `dmd -i` would no longer Just Work™ the way it currently does. But if someone could put together a convincing DIP, I think it's possible this feature could be added to D in the future.
Jun 10 2022
parent Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Saturday, 11 June 2022 at 04:00:32 UTC, Paul Backus wrote:
 IMO the correct way to do this would be to simply allow 
 multiple modules to be declared in the same file, rather than 
 to couple encapsulation boundaries to class boundaries. This is 
 a feature that has been requested many times for D, and one 
 that several other languages have implemented successfully.
Bad idea: it does not help at all, it just makes code harder to read and navigate. This is how languages deteriorate. Just keep it simple: either add a 'hidden' keyword or leave it as is. You still don't get a way to protect inner classes from the outer classes.
Jun 10 2022
prev sibling parent reply Max Samukha <maxsamukha gmail.com> writes:
On Saturday, 11 June 2022 at 01:00:44 UTC, H. S. Teoh wrote:

 As I said, opinions differ on this.  You say the class ought to 
 be unit of encapsulation, Walter says it should be the module.
Class is *the* unit of encapsulation by any reasonable definition of class-based OOP. You don't create objects of modules (though modules can be viewed as singleton objects). You can argue about the mechanism of sharing the state between objects of different classes, but forcing the programmer to group classes together and share the entirety of their state is not the greatest of options. Disclaimer: Ola may assume I see OOP as god, but I actually don't.
Jun 11 2022
next sibling parent forkit <forkit gmail.com> writes:
On Saturday, 11 June 2022 at 08:23:27 UTC, Max Samukha wrote:
 Class is *the* unit of encapsulation by any reasonable 
 definition of class-based OOP. You don't create objects of 
 modules (though modules can be viewed as singleton objects). 
 You can argue about the mechanism of sharing the state between 
 objects of different classes, but forcing the programmer to 
 group classes together and share the entirety of their state is 
 not the greatest of options.
+1

Also, a class is a type, just like any other type - for example, an int. When you assign to an int, type checking is done. You cannot put "wtf!" into an int! It has invariants, and the compiler checks these (type checking). But for some reason, a class doesn't get the same protection - not even an option to protect it (from other code in a module).

The module is not a type; it's a unit of encapsulation, and that is all it is. A class is a type, which is much more than just a unit of encapsulation. When a class sets invariants, the compiler must do type checking, just as it does with an int, to ensure the invariants are being upheld.

When a programming language stops treating a class as a type, it cannot claim to have support for OOP.
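For the record, D does have class-level `invariant` blocks, but they are runtime checks (enabled in non-release builds) rather than something the type checker enforces. A minimal sketch, with names of my own invention:

```d
class Account
{
    private long balance;   // still private-to-the-module, per D's rules

    invariant
    {
        assert(balance >= 0, "balance must never go negative");
    }

    void deposit(long amount)
    in (amount > 0)
    {
        balance += amount;
    }
}
```

The rub, for the argument above: the invariant runs around calls to public member functions, so other code in the same module that assigns `balance` directly never triggers the check. That is exactly the gap being complained about.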
Jun 11 2022
prev sibling parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Saturday, 11 June 2022 at 08:23:27 UTC, Max Samukha wrote:
 Disclaimer: Ola may assume I see OOP as god, but I actually 
 don't.
I try not to assume too much unless people say something explicitly. Also, you've never stated what kind of OOP you are referring to and for what purpose, and what your judgment is based on, so it tells me next to nothing!?

If the domain you deal with is suitable for OOA&D then a good OOPL will save you a lot of time. If it isn't, then you can choose not to use OO features. Nobody cares what you think then! That's like city dwellers saying that tractors are no good. Farmers don't care!

If you refuse to accept that any OO methodologies are useful, regardless of the setting, then you have to point to a better methodology and explain in detail what situations that methodology is better for and why. E.g. the classic SA can be useful if an organization is moving from paper-based to computer-based handling, but that scenario is less useful today than it was before...
Jun 11 2022
parent reply Max Samukha <maxsamukha gmail.com> writes:
On Saturday, 11 June 2022 at 12:21:31 UTC, Ola Fosheim Grøstad 
wrote:

 Also, you've never stated what kind of OOP you are referring to 
 and for what purpose, and what your judgment is based on, so it 
 tells me next to nothing!?
I am talking about OOP, where an object is literally the unit of encapsulation. I don't understand why people are so eager to redefine the term.
Jun 11 2022
next sibling parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Saturday, 11 June 2022 at 13:14:20 UTC, Max Samukha wrote:
 I am talking about OOP, where an object is literally the unit 
 of encapsulation. I don't understand why people are so eager to 
 redefine the term.
The term OOP usually just implies inheritance of some sort and polymorphism of some sort. If you mean something more specific you have to spell it out...
Jun 11 2022
parent reply Max Samukha <maxsamukha gmail.com> writes:
On Saturday, 11 June 2022 at 13:39:01 UTC, Ola Fosheim Grøstad 
wrote:

 The term OOP usually just implies inheritance of some sort and 
 polymorphism of some sort. If you mean something more specific 
 you have to spell it out...
I hate to quote Alan Kay again (http://userpage.fu-berlin.de/~ram/pub/pub_jf47ht81Ht/doc_kay_oop_en): "OOP to me means only messaging, local retention and protection and hiding of state-process, and extreme late-binding of all things. It can be done in Smalltalk and in LISP. There are possibly other systems in which this is possible, but I'm not aware of them." "local retention and protection and hiding of state-process" is encapsulation. I don't know how to think about OOP without it. Objects must protect their state from unconstrained mutation. Otherwise, the concept of OOP becomes meaningless.
Jun 11 2022
parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Saturday, 11 June 2022 at 14:09:54 UTC, Max Samukha wrote:
 "OOP to me means only messaging, local retention and protection 
 and
 hiding of state-process, and extreme late-binding of all 
 things. It
 can be done in Smalltalk and in LISP. There are possibly other
 systems in which this is possible, but I'm not aware of them."
Simula was created for simulation. It added inheritance and virtual functions to an Algol like language. You don't need encapsulation to do simulation. Talking about Smalltalk in this context is strange.
Jun 11 2022
parent reply Max Samukha <maxsamukha gmail.com> writes:
On Saturday, 11 June 2022 at 15:43:52 UTC, Ola Fosheim Grøstad 
wrote:

 Simula was created for simulation. It added inheritance and 
 virtual functions to an Algol like language. You don't need 
 encapsulation to do simulation.
Yes, you can break encapsulation with Simula. You need to rely on discipline.
 Talking about Smalltalk in this context is strange.
Or talking about Simula, which doesn't have a mechanism for encapsulation.
Jun 11 2022
parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Saturday, 11 June 2022 at 18:03:14 UTC, Max Samukha wrote:
 On Saturday, 11 June 2022 at 15:43:52 UTC, Ola Fosheim Grøstad 
 wrote:

 Simula was created for simulation. It added inheritance and 
 virtual functions to an Algol like language. You don't need 
 encapsulation to do simulation.
Yes, you can break encapsulation with Simula. You need to rely on discipline.
 Talking about Smalltalk in this context is strange.
Or talking about Simula, which doesn't have a mechanism for encapsulation.
C++, Java and D follow Simula. Simula's protection levels were added in the 70s: hidden, protected and hidden protected, but the defining characteristics are class inheritance, virtual functions and coroutines.
Jun 11 2022
parent reply Max Samukha <maxsamukha gmail.com> writes:
On Saturday, 11 June 2022 at 18:59:42 UTC, Ola Fosheim Grøstad 
wrote:

 Or talking about Simula, which doesn't have a mechanism for 
 encapsulation.
C++, Java and D follows Simula. Simula's protection levels were added in the 70s: hidden, protected and hidden protected, but the defining characteristics is class inheritance, virtual functions and coroutines.
I didn't know those had been added. If it had them from the start, would that make encapsulation another defining characteristic?
Jun 12 2022
parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Sunday, 12 June 2022 at 09:30:06 UTC, Max Samukha wrote:
 I didn't know those had been added. If it had them from the 
 start, would that make encapsulation another defining 
 characteristic?
Many OO languages don't provide encapsulation, like Python. I would say encapsulation has more to do with scaling up and evolving, as well as reliability (e.g. actors). But I guess you can say that OO features span a space where encapsulation is one dimension.
Jun 12 2022
parent reply Max Samukha <maxsamukha gmail.com> writes:
On Sunday, 12 June 2022 at 09:54:42 UTC, Ola Fosheim Grøstad 
wrote:

 Many OO languages don't provide encapsulation, like Python.
Right, but Python programmers are expected to prepend private members with '_'.
 I would say encapsulation has more to do with scaling up and 
 evolving, as well as reliability (e.g. actors).
 But I guess you can say that OO features span a space where 
 encapsulation is one dimension.
Then it is natural to expect the feature would apply at the class level (if we are talking about class-based OOP)? Really, I've yet to meet a D user that wouldn't be surprised 'private' is module-level. And the 'friend' story rarely impresses them. The reply is always: "Ok, 'friend' breaks OOP principles. Is D better because it breaks OOP in its own way?"
Jun 12 2022
next sibling parent reply Mike Parker <aldacron gmail.com> writes:
On Sunday, 12 June 2022 at 10:40:02 UTC, Max Samukha wrote:

 I've yet to meet a D user that wouldn't be surprised 'private' 
 is module-level.
Hi. My name's Mike. Nice to meet you.
Jun 12 2022
parent reply Max Samukha <maxsamukha gmail.com> writes:
On Sunday, 12 June 2022 at 11:47:53 UTC, Mike Parker wrote:

 Hi. My name's Mike. Nice to meet you.
Hi, Mike! Congratulations on being the first unsurprised D user! (You were actually surprised for a moment, weren't you?)
Jun 12 2022
next sibling parent reply Mike Parker <aldacron gmail.com> writes:
On Sunday, 12 June 2022 at 14:05:00 UTC, Max Samukha wrote:
 On Sunday, 12 June 2022 at 11:47:53 UTC, Mike Parker wrote:
 Hi, Mike! Congratulations on being the first unsurprised D 
 user! (You were actually surprised for a moment, weren't you?)
No. Nor was I surprised, for example, when I learned that access modifiers in other languages can be more or less restrictive than `protected` in Java. Every language takes an approach similar to other languages on some things, and a different approach on others. I've investigated enough programming languages that I learned long ago to be open to the differences and never to expect that just because something is true in Language A, it will be true for a similar feature in Language B.

I often have reactions of "neat" or "cool", or "too bad" or "that sucks", but I can't say I'm ever really surprised when learning about features for the first time. You have to learn to think in the language you're using if you want to be productive with it, and that means accepting the differences. You may find some things grate on your nerves because they don't square with your view of the world, in which case you either push to change them, accept them, or, if it's too much to handle, move on to a language that better fits your mental model. The latter is why I never stuck with C++.

When I first learned about D's private-to-the-module approach, it made perfect sense to me. It fits right in with D's concept of modules.

I have been surprised occasionally, though, when I was certain a feature worked a certain way, but I learned later my understanding was wrong. There were a couple of those instances when I was writing Learning D, but I can't for the life of me remember what they were.
Jun 12 2022
next sibling parent reply forkit <forkit gmail.com> writes:
On Sunday, 12 June 2022 at 14:56:53 UTC, Mike Parker wrote:

It takes a certain amount of ' ?? ' (not sure of the right word 
to use here) to not be surprised, when your new car arrives and 
you're told "only brakes on the rear are necessary in this car", 
when your experience is that brakes on all wheels are more 
likely to protect you from an accident.
Jun 12 2022
parent reply Mike Parker <aldacron gmail.com> writes:
On Sunday, 12 June 2022 at 23:15:30 UTC, forkit wrote:
 On Sunday, 12 June 2022 at 14:56:53 UTC, Mike Parker wrote:

 It takes a certain amount of ' ?? ' (not sure of the right word 
 to use here) to not be surprised, when your new car arrives 
 and you're told "only brakes on the rear are necessary in this 
 car", when your experience is that brakes on all wheels are 
 more likely to protect you from an accident.
The ?? is that you think this is a relevant analogy.
Jun 12 2022
next sibling parent reply forkit <forkit gmail.com> writes:
On Monday, 13 June 2022 at 01:07:07 UTC, Mike Parker wrote:
 On Sunday, 12 June 2022 at 23:15:30 UTC, forkit wrote:
 On Sunday, 12 June 2022 at 14:56:53 UTC, Mike Parker wrote:

 It takes a certain amount of ' ?? ' (not sure of the right 
 word to use here) to not be surprised, when your new car 
 arrives and you're told "only brakes on the rear are necessary 
 in this car", when your experience is that brakes on all 
 wheels are more likely to protect you from an accident.
The ?? is that you think this is a relevant analogy.
Yes, coming up with an analogy for a confusing decision in D is challenging ;-)

Putting analogies aside....

The D approach of 'everything in a module is global to the module - and that's all there is to it' results in massive blobs of potentially globally mutating logic that is very difficult to reason about.

Don't believe me? Go look at D source code.

But just look what happens when you have more flexibility with access control:

```
public class SomePublicClass                     // explicitly public class
{
    fileprivate func someFilePrivateMethod() {}  // explicitly file-private
    private func somePrivateMethod() {}          // explicitly private
}
```

(1) Now you can fully reason about that 'chunk' of code. You don't need to factor in what other code does as well, just to understand this little chunk.

(2) Now the reasoning acts as a form of documentation also.

(3) Now the reasoning is enforced by the compiler (you don't have to 'just not make mistakes').

Why would anyone want to 'give this up' (or be forced to give it up) in favour of a 'global module mutation approach'? It is beyond my comprehension, and difficult to come up with a suitable analogy.

Although, that brakes analogy suddenly starts to look not so bad after all. i.e. Perhaps they 'just don't care all that much' about reducing the likelihood of an accident. 'Just don't have one' is their motto.
Jun 12 2022
parent reply zjh <fqbqrr 163.com> writes:
On Monday, 13 June 2022 at 01:40:29 UTC, forkit wrote:

 It is beyond my comprehension, and difficult to come up with a 
 suitable analogy.
They like to be like a big company where everyone can work on everything: I can use your data if I want. They hope that all people are `friends`. When their `data or code` is in a mess in the future, they simply can't find who messed it up. Because they are all friends! Originally, each was responsible for `one piece` of the work. Now, anyone can step in. Just like you drive a car, but you can also fly a plane, because we are all `friends`!
Jun 12 2022
parent reply zjh <fqbqrr 163.com> writes:
On Monday, 13 June 2022 at 02:24:49 UTC, zjh wrote:

 Now, anyone can step in.
 Just like you drive a car, but you can also fly a plane, because 
 we are all `friends`!
They forgot what they were proud of: that they are professionals. Then they allow `unprofessional outsiders` to sabotage `your professional work`. Because we are all `friends (enemies)`!
Jun 12 2022
parent forkit <forkit gmail.com> writes:
On Monday, 13 June 2022 at 02:29:54 UTC, zjh wrote:
 On Monday, 13 June 2022 at 02:24:49 UTC, zjh wrote:

 Now, anyone can step in.
 Just like you drive a car, but you can also fly a plane, because 
 we are all `friends`!
They forgot what they were proud of: that they are professionals. Then they allow `unprofessional outsiders` to sabotage `your professional work`. Because we are all `friends (enemies)`!
I'm not convinced you're progressing this argument any better than I am with my analogies ;-)

Let's keep it a little more focused... perhaps?
Jun 12 2022
prev sibling parent reply forkit <forkit gmail.com> writes:
On Monday, 13 June 2022 at 01:07:07 UTC, Mike Parker wrote:
 On Sunday, 12 June 2022 at 23:15:30 UTC, forkit wrote:
 On Sunday, 12 June 2022 at 14:56:53 UTC, Mike Parker wrote:

 It takes a certain amount of ' ?? ' (not sure of the right 
 word to use here) to not be surprised, when your new car 
 arrives and you're told "only brakes on the rear are necessary 
 in this car", when your experience is that brakes on all 
 wheels are more likely to protect you from an accident.
The ?? is that you think this is a relevant analogy.
let's forget analogies! How about two questions on your end-of-semester exam instead:

Within a module:

(1) What is the disadvantage of having an 'optional' access modifier, such as 'class private', so that you can separate interface from implementation?

(2) What is the disadvantage in leveraging the compiler to help us in determining whether code outside that class, but within the same module, is correctly using that interface?
Jun 12 2022
parent reply Mike Parker <aldacron gmail.com> writes:
On Monday, 13 June 2022 at 02:42:52 UTC, forkit wrote:

 Within a module:

 (1) What is the disadvantage of having an 'optional' access 
 modifier, such as 'class private', so that you can separate 
 interface from implementation?

 (2) what is the disadvantage in leveraging the compiler to help 
 us in determining whether code outside that class, but within 
 the same module, is correctly using that interface?
That's backwards. You're talking about adding a new language feature. The bar in that case is to show that the new feature provides an advantage over the status quo, such that the additional complexity is justified. That's what this boils down to. Walter and Atila are who you need to convince. Not me.

I just don't see a *practical* advantage to it. Let's say I have this class using a shiny new `private(intern)` feature implemented by someone on my team.

```d
class Foo {
    private(intern) int x;
}
```

Now I want to add a function in the module that manipulates `x`.

```d
void newFunction(Foo f) {
    f.x = 10;
}
```

Oops! Can't do that. But no problem. I have access to the source. I can change `x` to `private` so that I can access it everywhere in the module. Or I add a new `private(intern)` member function to set the value of `x` in the way I need it.

This is the reason I think it's a useless feature. If you have access to the module, you have access to the internal members of the class. It's no different than having a class per file in Java. At some point, it comes down to coding conventions (e.g., all internal access to private members within a Java class must explicitly go through the public interface).

How does this feature bring any benefit that can't be had by putting `Foo` in a separate module? That's the question you have to answer.
Jun 12 2022
next sibling parent reply 12345swordy <alexanderheistermann gmail.com> writes:
On Monday, 13 June 2022 at 03:46:52 UTC, Mike Parker wrote:

 How does this feature bring any benefit that can't be had by 
 putting `Foo` in a separate module?
The issue here is that you can only have one module per file. Allow multiple modules per file and the motivation behind "private for this class" will be satisfied.

-Alex
Jun 12 2022
parent reply Mike Parker <aldacron gmail.com> writes:
On Monday, 13 June 2022 at 04:09:12 UTC, 12345swordy wrote:
 On Monday, 13 June 2022 at 03:46:52 UTC, Mike Parker wrote:

 How does this feature bring any benefit that can't be had by 
 putting `Foo` in a separate module?
The issue here is that you can only have one module per file. Allow multiple modules per files and the motivation behind "private for this class" will be satisfied.
Right now, you can split your module into two files and present them to the world as a single module with package.d. What does your suggestion buy us that this doesn't aside from a single module name trait?
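A sketch of that layout, using a hypothetical package name `mylib` split across two implementation files:

```d
// mylib/package.d - importing "mylib" presents both parts as one name
module mylib;

public import mylib.part_a;
public import mylib.part_b;
```

```d
// mylib/part_a.d
module mylib.part_a;

int fromA() { return 1; }
```

```d
// mylib/part_b.d
module mylib.part_b;

int fromB() { return 2; }
```

Client code just writes `import mylib;` and sees `fromA` and `fromB` together. Note that `private` still stops at each file: part_a cannot see part_b's private symbols, because each file remains its own module.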
Jun 12 2022
next sibling parent reply forkit <forkit gmail.com> writes:
On Monday, 13 June 2022 at 04:39:23 UTC, Mike Parker wrote:
 On Monday, 13 June 2022 at 04:09:12 UTC, 12345swordy wrote:
 On Monday, 13 June 2022 at 03:46:52 UTC, Mike Parker wrote:

 How does this feature bring any benefit that can't be had by 
 putting `Foo` in a separate module?
The issue here is that you can only have one module per file. Allow multiple modules per files and the motivation behind "private for this class" will be satisfied.
Right now, you can split your module into two files and present them to the world as a single module with package.d. What does your suggestion buy us that this doesn't aside from a single module name trait?
a choice.
Jun 12 2022
parent reply forkit <forkit gmail.com> writes:
On Monday, 13 June 2022 at 04:59:09 UTC, forkit wrote:
 a choice.
oh. I misread the question. I don't want multiple modules per file. Multiple files per module would be nice though.
Jun 12 2022
parent Mike Parker <aldacron gmail.com> writes:
On Monday, 13 June 2022 at 05:11:31 UTC, forkit wrote:

 oh. I misread the question.
Yeah, I misread the suggestion.
Jun 12 2022
prev sibling parent reply bauss <jj_1337 live.dk> writes:
On Monday, 13 June 2022 at 04:39:23 UTC, Mike Parker wrote:
 Right now, you can split your module into two files and present 
 them to the world as a single module with package.d. What does 
 your suggestion buy us that this doesn't aside from a single 
 module name trait?
Let's see.

a.d
```
module a;

class Foo
{
private:
    int _c;
}

import b;

void handle(Bar child)
{
    child._c += child.c;
}
```

b.d
```
module b;

import a;

class Bar : Foo
{
public:
    int c;

    this(int c)
    {
        this.c = c;
    }
}
```

main.d
```
module main;

import a;
import b;

void main()
{
    auto bar = new Bar(30);
    handle(bar);
}
```

If D is truly "module private" then the above should compile; if it's "class private" then it shouldn't. Since the above doesn't compile, D isn't really "module private", and thus the conclusion is that one of the most fundamental features of D is also an unfinished feature.

D states it is "module private", but really it is neither "module private" nor "class private" - it's a mix between the two... and I'm not sure whether that's good or not.
Jun 13 2022
next sibling parent Tobias Pankrath <tobias pankrath.net> writes:
On Monday, 13 June 2022 at 07:59:24 UTC, bauss wrote:
 <snip>
It does what I would expect it to do. What exactly is the issue with this?
Jun 13 2022
prev sibling next sibling parent reply Mike Parker <aldacron gmail.com> writes:
On Monday, 13 June 2022 at 07:59:24 UTC, bauss wrote:

 If D is truly "module private" then the above should be able to 
 compile, if it's "class private" then it shouldn't be able to 
 compile.

 Since the above doesn't compile, D isn't really "module 
 private" and thus the conclusion is that one of the most 
 fundamental features of D is also an unfinished feature.
But why should that compile? You're trying to manipulate `_c` through an instance of `Bar`. It's not visible in any `Bar`, ever, so why should it be visible here? It has to be gotten at through the interface of `Foo`.
Jun 13 2022
parent reply bauss <jj_1337 live.dk> writes:
On Monday, 13 June 2022 at 08:07:05 UTC, Mike Parker wrote:
 But why should that compile? You're trying to manipulate `_c` 
 through an instance of `Bar`. It's not visible in any `Bar`, 
 ever, so why should it be visible here? It has to be gotten at 
 through the interface of `Foo`.
Because I'm in the module of a, _c is a member of Foo, and Foo is in a. Thus _c should be accessible within a, regardless of whether the access goes through the b module or not.
Jun 13 2022
next sibling parent reply bauss <jj_1337 live.dk> writes:
On Monday, 13 June 2022 at 08:44:59 UTC, bauss wrote:
 On Monday, 13 June 2022 at 08:07:05 UTC, Mike Parker wrote:
 But why should that compile? You're trying to manipulate `_c` 
 through an instance of `Bar`. It's not visible in any `Bar`, 
 ever, so why should it be visible here? It has to be gotten at 
 through the interface of `Foo`.
Because I'm in the module of a, _c is a member of Foo, and Foo is in a. Thus _c should be accessible within a, regardless of whether the access goes through the b module or not.
My argument for why it should work is that if you place the subclass within the same module then it works, so clearly it's just a matter of where the function was called, not where it resides.
Jun 13 2022
next sibling parent bauss <jj_1337 live.dk> writes:
On Monday, 13 June 2022 at 08:49:18 UTC, bauss wrote:
 On Monday, 13 June 2022 at 08:44:59 UTC, bauss wrote:
 On Monday, 13 June 2022 at 08:07:05 UTC, Mike Parker wrote:
 But why should that compile? You're trying to manipulate `_c` 
 through an instance of `Bar`. It's not visible in any `Bar`, 
 ever, so why should it be visible here? It has to be gotten 
 at through the interface of `Foo`.
Because I'm in the module of a, _c is a member of Foo, and Foo is in a. Thus _c should be accessible within a, regardless of whether the access goes through the b module or not.
My argument for why it should work is that if you place the subclass within the same module then it works, so clearly it's just a matter of where the function was called, not where it resides.
Actually I could have worded this better, but hopefully my point comes across.
Jun 13 2022
prev sibling parent reply Mike Parker <aldacron gmail.com> writes:
On Monday, 13 June 2022 at 08:49:18 UTC, bauss wrote:

 My argument for why it should work is that if you place the 
 subclass within the same module then it works, so clearly it's 
 just a matter of where the function was called, not where it 
 resides.
I don't get why you expect it to work. If you declare `Bar` outside of the module, then how could you access the private members of `Foo` through a `Bar`? It works in the same module because... private to the module.
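For contrast, a compressed sketch of the same-module case being referred to here, reusing the names from bauss's example above:

```
module a;

class Foo { private int _c; }

class Bar : Foo { int c; }  // subclass declared in the SAME module this time

void handle(Bar child)
{
    child._c += child.c;    // compiles: private is open within module a
}
```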
Jun 13 2022
next sibling parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Monday, 13 June 2022 at 09:05:43 UTC, Mike Parker wrote:
 On Monday, 13 June 2022 at 08:49:18 UTC, bauss wrote:

 My argument for why it should work is that if you place the 
 subclass within the same module then it works, so clearly it's 
 just a matter of where the function was called, not where it 
 resides.
I don't get why you expect it to work. If you declare `Bar` outside of the module, then how could you access the private members of `Foo` through a `Bar`? It works in the same module because... private to the module.
It follows from the principles of inheritance. A Bar is a more detailed Foo. If you cannot interact with a Bar as you would with a Foo, then Bar is breaking with the core principles of inheritance. A pointer to Foo, just means that you can assume less about it than if you had a pointer to Bar. If you can assume more about a pointer to a Foo than a pointer to a Bar then something is wrong.
Jun 13 2022
parent reply Mike Parker <aldacron gmail.com> writes:
On Monday, 13 June 2022 at 09:11:42 UTC, Ola Fosheim Grøstad 
wrote:

 It follows from the principles of inheritance. A Bar is a more 
 detailed Foo. If you cannot interact with a Bar as you would 
 with a Foo, then Bar is breaking with the core principles of 
 inheritance
No, it does not follow in this case. A subclass does not have access to the private members of the superclass in D unless they are declared in the same module. So what that code is doing is saying, "Hey Bar, give me this field that you don't have access to."
Jun 13 2022
parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Monday, 13 June 2022 at 09:34:46 UTC, Mike Parker wrote:
 No, it does not follow in this case. A subclass does not have 
 access to the private members of the superclass in D unless 
 they are declared in the same module. So what that code is 
 doing is saying, "Hey Bar, give me this field that you don't 
 have access to."
It wasn't the subclass that tried to access, it was the owner of the subclass, the module. If the owner of Cyborgs has direct access to the brain that all Cyborgs possess, then it follows that it also must have direct access to the brain of PinkCyborgs too, without having to call it a Cyborg. A PinkCyborg is also a Cyborg. If a surgeon can operate directly on the brain of a Human, then it follows that he should also be able to operate directly on the brain of Men and Women, without having to claim that they are just Humans to him (and that he does not care about their gender-specific attributes). Men and Women are undeniably Human. It should not matter whether the ambulance personnel says "this is a Man" or "this is a Human".
Jun 13 2022
next sibling parent Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Monday, 13 June 2022 at 10:03:59 UTC, Ola Fosheim Grøstad 
wrote:
 On Monday, 13 June 2022 at 09:34:46 UTC, Mike Parker wrote:
 No, it does not follow in this case. A subclass does not have 
 access to the private members of the superclass in D unless 
 they are declared in the same module. So what that code is 
 doing is saying, "Hey Bar, give me this field that you don't 
 have access to."
It wasn't the subclass that tried to access, it was the owner of the subclass, the module.
Typo: "it was the owner of the *super class*, the module."
Jun 13 2022
prev sibling parent reply Mike Parker <aldacron gmail.com> writes:
On Monday, 13 June 2022 at 10:03:59 UTC, Ola Fosheim Grøstad 
wrote:

 It wasn't the subclass that tried to access, it was the owner 
 of the subclass, the module.
*Through* the subclass.
 If the owner of Cyborgs
 If a surgeon can operate
I'm a Parker. My father is a Parker. But if you ask me for the contents of my father's safe, and I don't have the combination, then you aren't getting the contents of my father's safe. Ask me to put you in touch with my father though, and you can work something out.

It's all about the interface here. An instance of B is an A only in terms of the public (and protected) interface. It doesn't have access to A's private members if it isn't declared in the same module, so you can't get A's private members through an instance of B. You have to cast to A.

This doesn't allow access either:

```
module ca;
import cb;

class A
{
    private int _x;
    protected void modX(B b) { b._x = 10; }
}
```
```
module cb;
import ca;

class B : A
{
    void setX() { modX(this); }
}

void main()
{
    B b = new B;
    b.setX;
}
```
Jun 13 2022
next sibling parent Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Monday, 13 June 2022 at 10:13:30 UTC, Mike Parker wrote:
 On Monday, 13 June 2022 at 10:03:59 UTC, Ola Fosheim Grøstad 
 wrote:

 It wasn't the subclass that tried to access, it was the owner 
 of the subclass, the module.
*Through* the subclass.
No. You have direct access.
 If the owner of Cyborgs
 If a surgeon can operate
I'm a Parker. My father is a Parker. But if you ask me for the contents of my father's safe, and I don't have the combination, then you aren't getting the contents of my father's safe. Ask me to put you in touch with my father though, and you can work something out.
This is not an example of inheritance. Inheritance is a [subtyping relation](https://en.wikipedia.org/wiki/Subtyping).

If you let the Surgeon module own the Human blueprint and have direct access to the brains of all Humans, then it follows that the Surgeon module has to be given direct access to all Humans, whether you know that a given Human is also a Woman or not.

Keep in mind that there are no pure Humans in the real world, only Men or Women (or some other gendered variations). The fact that we can instantiate Humans does not mean that they are not Women or Men. It means that we don't care (or know) whether they are Women or Men when we give them a computer representation.
Jun 13 2022
prev sibling parent reply bauss <jj_1337 live.dk> writes:
On Monday, 13 June 2022 at 10:13:30 UTC, Mike Parker wrote:

Look, I'm not against the logic. I completely understand it from the perspective that a subclass shouldn't have access to private members of the class it inherits from, BUT remember that in D private doesn't mean private to the class, so the symbol should only be private when accessed from anywhere but the module.
Jun 13 2022
parent reply bauss <jj_1337 live.dk> writes:
On Monday, 13 June 2022 at 10:43:09 UTC, bauss wrote:
 On Monday, 13 June 2022 at 10:13:30 UTC, Mike Parker wrote:

 Look, I'm not against the logic. I completely understand it
 from the perspective that a subclass shouldn't have access
 to private members of the class it inherits from, BUT remember
 that in D private doesn't mean private to the class, so the symbol
 should only be private when accessed from anywhere but the
 module.
To add on to this; it's not a matter of what's more logical, but simply that D is contradicting itself, so either D stops being referred to as "module private" or this will be deemed an unfinished feature/bug.
Jun 13 2022
parent reply Mike Parker <aldacron gmail.com> writes:
On Monday, 13 June 2022 at 10:44:43 UTC, bauss wrote:
 On Monday, 13 June 2022 at 10:43:09 UTC, bauss wrote:
 On Monday, 13 June 2022 at 10:13:30 UTC, Mike Parker wrote:

 Look, I'm not against the logic. I completely understand it
 from the perspective that a subclass shouldn't have access
 to private members of the class it inherits from, BUT remember
 that in D private doesn't mean private to the class, so the symbol
 should only be private when accessed from anywhere but the
 module.
To add on to this; it's not a matter of what's more logical, but simply that D is contradicting itself, so either D stops being referred to as "module private" or this will be deemed an unfinished feature/bug.
And I'm arguing that this is exactly what we should expect from private-to-the-module, since B is not declared in the same module as the superclass, so it's neither unfinished nor a bug.
Jun 13 2022
next sibling parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Monday, 13 June 2022 at 10:51:10 UTC, Mike Parker wrote:
 And I'm arguing that this is exactly what we should expect from 
 private-to-the-module, since B is not declared in the same 
 module as the superclass, so it's neither unfinished nor a bug.
It breaks the sub-typing requirement. If you get more access by recasting a pointer to the super-type then the sub-typing relation cannot be satisfied. Hence, it is certainly broken. If it is not a bug, then it is broken by design. Which is no better.
Jun 13 2022
next sibling parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Monday, 13 June 2022 at 10:55:36 UTC, Ola Fosheim Grøstad 
wrote:
 On Monday, 13 June 2022 at 10:51:10 UTC, Mike Parker wrote:
 And I'm arguing that this is exactly what we should expect 
 from private-to-the-module, since B is not declared in the 
 same module as the superclass, so it's neither unfinished nor 
 a bug.
It breaks the sub-typing requirement. If you get more access by recasting a pointer to the super-type then the sub-typing relation cannot be satisfied. Hence, it is certainly broken. If it is not a bug, then it is broken by design. Which is no better.
Or to explain it in simple terms: *A cast to a supertype should only imply additional restrictions.* In this case the opposite happens. So this aspect of the type system is broken.
Jun 13 2022
parent Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Monday, 13 June 2022 at 11:00:17 UTC, Ola Fosheim Grøstad 
wrote:
 Or to explain it in simple terms:

 *A cast to a supertype should only imply additional 
 restrictions.*

 In this case the opposite happens. So this aspect of the type 
 system is broken.
Or perhaps this formulation is less confusing:

*A cast to a superclass is an act of forgetfulness.*

Being forgetful should not enable you to do more. The less you know, the less you should be able to do. The more you know, the more you should be able to do.
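A sketch of that asymmetry in code, reusing the `a`/`b` modules from bauss's example above; the behavior in the comments is as reported later in this thread, and the `static assert` line is only an illustration, not from the original posts:

```
module a;

import b;  // Bar is declared in module b

class Foo { private int _c; }

void handle(Bar child)
{
    static assert(!__traits(compiles, child._c = 1)); // hidden through Bar

    (cast(Foo)child)._c = 1;  // "forgetting" it is a Bar restores access
    child.Foo._c = 1;         // the explicit base-class path works too
}
```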
Jun 13 2022
prev sibling parent reply Max Samukha <maxsamukha gmail.com> writes:
On Monday, 13 June 2022 at 10:55:36 UTC, Ola Fosheim Grøstad 
wrote:
 On Monday, 13 June 2022 at 10:51:10 UTC, Mike Parker wrote:
 And I'm arguing that this is exactly what we should expect 
 from private-to-the-module, since B is not declared in the 
 same module as the superclass, so it's neither unfinished nor 
 a bug.
It breaks the sub-typing requirement. If you get more access by recasting a pointer to the super-type then the sub-typing relation cannot be satisfied.
And the reason is the private member of the class shouldn't be accessible outside its declaration scope in the first place. Module-level 'private' is dysfunctional by design.
 Hence, it is certainly broken. If it is not a bug, then it is 
 broken by design. Which is no better.
Jun 13 2022
next sibling parent reply zjh <fqbqrr 163.com> writes:
On Monday, 13 June 2022 at 12:03:10 UTC, Max Samukha wrote:

 Module-level 'private' is dysfunctional by design.
No! We don't need `encapsulation`! We are all friends. Your variable, I will use it if I `want to`! You should be happy however I use it, we are `friends`!
Jun 13 2022
parent zjh <fqbqrr 163.com> writes:
On Monday, 13 June 2022 at 12:16:28 UTC, zjh wrote:

 No! We don't need `encapsulation`!
 We are all friends.
 Your variable, I will use it if I `want to`! You should be happy
 however I use it, we are `friends`!
We should put all the modules in `one file`, so that there will be `more friends`!
Jun 13 2022
prev sibling next sibling parent bauss <jj_1337 live.dk> writes:
On Monday, 13 June 2022 at 12:03:10 UTC, Max Samukha wrote:
 On Monday, 13 June 2022 at 10:55:36 UTC, Ola Fosheim Grøstad 
 wrote:
 On Monday, 13 June 2022 at 10:51:10 UTC, Mike Parker wrote:
 And I'm arguing that this is exactly what we should expect 
 from private-to-the-module, since B is not declared in the 
 same module as the superclass, so it's neither unfinished nor 
 a bug.
It breaks the sub-typing requirement. If you get more access by recasting a pointer to the super-type then the sub-typing relation cannot be satisfied.
And the reason is the private member of the class shouldn't be accessible outside its declaration scope in the first place. Module-level 'private' is dysfunctional by design.
 Hence, it is certainly broken. If it is not a bug, then it is 
 broken by design. Which is no better.
I honestly don't have a problem with module-level private, but I do have a problem with inconsistency and ignorance. D argues so much about being module-level private, but it really isn't.
Jun 13 2022
prev sibling parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Monday, 13 June 2022 at 12:03:10 UTC, Max Samukha wrote:
 And the reason is the private member of the class shouldn't be 
 accessible outside its declaration scope in the first place. 
 Module-level 'private' is dysfunctional by design.
It can be fixed, but I personally prefer granting other entities «roles». C++ friend-mechanism is one variant of granting "roles". For instance if you make the class MedicalRole a friend of the class Brain then you can make BrainSurgeon a subclass of MedicalRole which grants the BrainSurgeon access to the internals of Brains. Since C++ has multiple inheritance it follows that you can design your own role-granting regime in C++ if you want to. (I personally prefer to only grant access to individual functions because it is tighter, but the concept of class based roles is easier to deal with when debugging/remodelling than the concept of package.)
Jun 13 2022
parent Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Monday, 13 June 2022 at 12:34:56 UTC, Ola Fosheim Grøstad 
wrote:
 On Monday, 13 June 2022 at 12:03:10 UTC, Max Samukha wrote:
 And the reason is the private member of the class shouldn't be 
 accessible outside its declaration scope in the first place. 
 Module-level 'private' is dysfunctional by design.
It can be fixed, but I personally prefer granting other entities «roles». C++ friend-mechanism is one variant of granting "roles".
I guess it is worth adding that one weakness in C++ (and D) is that functions are not objects (in contrast to Beta). If functions were objects and if the called function could inspect the caller's object-type then you could design a much more advanced regime and create interesting novel framework-mechanisms using meta-programming.
Jun 13 2022
prev sibling parent Fry <fry131313 gmail.com> writes:
On Monday, 13 June 2022 at 10:51:10 UTC, Mike Parker wrote:
 On Monday, 13 June 2022 at 10:44:43 UTC, bauss wrote:
 On Monday, 13 June 2022 at 10:43:09 UTC, bauss wrote:
 On Monday, 13 June 2022 at 10:13:30 UTC, Mike Parker wrote:

 Look, I'm not against the logic. I completely understand it
 from the perspective that a subclass shouldn't have access
 to private members of the class it inherits from, BUT
 remember that in D private doesn't mean private to the class, so
 the symbol should only be private when accessed from anywhere
 but the module.
To add on to this; it's not a matter of what's more logical, but simply that D is contradicting itself, so either D stops being referred to as "module private" or this will be deemed an unfinished feature/bug.
And I'm arguing that this is exactly what we should expect from private-to-the-module, since B is not declared in the same module as the superclass, so it's neither unfinished nor a bug.
Now THIS is why D is unpopular. It is unfinished or a bug, while the people maintaining it say it isn't.

Take the source code from the original post: https://forum.dlang.org/post/yyurtzlglypsvgizxodg@forum.dlang.org

```
import b;

void handle(Bar child)
{
    child.Foo._c += child.c; // works
    child._c += child.c;     // error: no property _c
}
```

Both are true: you can access `_c` through child, and it is also private and can't be accessed through child. So which is it? Is this what you define as "working" as intended?
Jun 13 2022
prev sibling parent bauss <jj_1337 live.dk> writes:
On Monday, 13 June 2022 at 09:05:43 UTC, Mike Parker wrote:
 I don't get why you expect it to work. If you declare `Bar` 
 outside of the module, then how could you access the private 
 members of `Foo` through a `Bar`? It works in the same module 
 because... private to the module.
Because Bar inherits from Foo, and Foo is in a, it should be as follows:

Outside of the module Foo resides in:
- Only public members are accessible.

Inside the module Foo resides in:
- All members are accessible.

That's the whole point of module private: that you can access private members within a module regardless of where they reside, as long as you're in the module the symbol belongs to. In this case _c belongs to the module a, so we should always be able to access _c within the module.

If D wasn't module private, then I wouldn't expect it to work, but since D states itself to be module private, I expect it to work because: Bar inherits Foo, and _c is a member of Foo, but also private to the module of Foo.

When we access Bar within the module of Foo, then all members of Foo should be accessible. When we access Bar outside of Foo's module, then only public members of Foo should be accessible.

This should always be true with module private; it wouldn't be true with class private. So either D needs to admit it's not truly module private or it needs to fix that.
Jun 13 2022
prev sibling next sibling parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Monday, 13 June 2022 at 08:44:59 UTC, bauss wrote:
 Thus _c should be accessible within a regardless of whether 
 it's public within the b module or not.
Yes, I agree. It should not matter whether the pointer is typed as Foo or Bar, as a Bar is also a Foo.
Jun 13 2022
parent bauss <jj_1337 live.dk> writes:
On Monday, 13 June 2022 at 08:59:24 UTC, Ola Fosheim Grøstad 
wrote:
 On Monday, 13 June 2022 at 08:44:59 UTC, bauss wrote:
 Thus _c should be accessible within a regardless of whether 
 it's public within the b module or not.
Yes, I agree. It should not matter whether the pointer is typed as Foo or Bar, as a Bar is also a Foo.
Exactly. I'm all for it being illegal, but then D isn't "module private" and stating it is would be wrong.
Jun 13 2022
prev sibling parent reply Paul Backus <snarwin gmail.com> writes:
On Monday, 13 June 2022 at 08:44:59 UTC, bauss wrote:
 On Monday, 13 June 2022 at 08:07:05 UTC, Mike Parker wrote:
 But why should that compile? You're trying to manipulate `_c` 
 through an instance of `Bar`. It's not visible in any `Bar`, 
 ever, so why should it be visible here? It has to be gotten at 
 through the interface of `Foo`.
Because I'm in the module of a, _c is a member of Foo, and Foo is in a. Thus _c should be accessible within a, regardless of whether the access goes through the b module or not.
`_c` is accessible; you just have to use the syntax `child.Foo._c` to access it. This is documented in the language spec:
 Members of a base class can be accessed by prepending the name 
 of the base class followed by a dot
https://dlang.org/spec/class.html#fields

This is necessary because D allows you to define fields with the same name in a base class and its derived class; for example:

```
class Base
{
    int x = 123;
}

class Derived : Base
{
    int x = 456;
}

void main()
{
    auto instance = new Derived;
    assert(instance.x == 456);
    assert(instance.Base.x == 123);
}
```
Jun 13 2022
next sibling parent reply mw <mingwu gmail.com> writes:
On Monday, 13 June 2022 at 13:51:40 UTC, Paul Backus wrote:

 This is necessary because D allows you to define fields with 
 the same name in a base class and its derived class; for 
 example:
This is horrifying, one of the darkest corners of D.
 ```
 class Base
 {
     int x = 123;
 }

 class Derived : Base
 {
     int x = 456;
 }

 void main()
 {
     auto instance = new Derived;
     assert(instance.x == 456);
     assert(instance.Base.x == 123);
 }
 ```
Jun 13 2022
parent mw <mingwu gmail.com> writes:
On Monday, 13 June 2022 at 14:10:48 UTC, mw wrote:
 On Monday, 13 June 2022 at 13:51:40 UTC, Paul Backus wrote:

 This is necessary because D allows you to define fields with 
 the same name in a base class and its derived class; for 
 example:
 This is horrifying, one of the darkest corners of D.
Maybe there's no way to change the language now, but I need a warning message when this happens, in case I accidentally redefine such a field, at least behind a compiler flag I can switch on (a sketch of such a check follows the quoted code below).
 ```
 class Base
 {
     int x = 123;
 }

 class Derived : Base
 {
     int x = 456;
 }

 void main()
 {
     auto instance = new Derived;
     assert(instance.x == 456);
     assert(instance.Base.x == 123);
 }
 ```
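No such flag exists today, but the check can be approximated in user code; a rough sketch (`hasShadowedField` is a made-up helper, not a compiler or Phobos feature):

```
import std.traits : BaseClassesTuple, FieldNameTuple;

// Hypothetical helper: true at compile time if T redeclares a field
// name that already exists in one of its base classes.
template hasShadowedField(T) if (is(T == class))
{
    bool impl()
    {
        bool found = false;
        static foreach (B; BaseClassesTuple!T)              // every ancestor
            static foreach (baseName; FieldNameTuple!B)
                static foreach (ownName; FieldNameTuple!T)  // T's own fields
                    static if (baseName == ownName)
                        found = true;
        return found;
    }
    enum hasShadowedField = impl();
}

class Base { int x = 123; }
class Derived : Base { int x = 456; }

static assert( hasShadowedField!Derived); // the redefinition gets flagged
static assert(!hasShadowedField!Base);
```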
Jun 13 2022
prev sibling parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Monday, 13 June 2022 at 13:51:40 UTC, Paul Backus wrote:
 This is necessary because D allows you to define fields with 
 the same name in a base class and its derived class;
Hardly necessary, C++ will error on this:

```c++
#include <iostream>
using std::cout;

class A {
    int x = 1;
    friend int main();
};

class B : public A {
    int x = 4;
};

int main() {
    B obj{};
    cout << obj.x;
}
```

But allow this:

```c++
#include <iostream>
using std::cout;

class A {
    int x = 1;
    friend int main();
};

class B : public A {
    int y = 4;
};

int main() {
    B obj{};
    cout << obj.x;
}
```

Of course, since C++ has multiple inheritance it has to deal with conflicts. D forbid it, although I guess you could argue that the subclass should not be affected by naming of fields in the superclass. I still prefer that this is not allowed, as shadowing in class hierarchies makes debugging so much more confusing.
Jun 13 2022
next sibling parent Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Monday, 13 June 2022 at 14:15:15 UTC, Ola Fosheim Grøstad 
wrote:
 D forbid it, although I guess you could argue
D *could* forbid it (since it is single inheritance).
Jun 13 2022
prev sibling parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Monday, 13 June 2022 at 14:15:15 UTC, Ola Fosheim Grøstad 
wrote:
 On Monday, 13 June 2022 at 13:51:40 UTC, Paul Backus wrote:
 This is necessary because D allows you to define fields with 
 the same name in a base class and its derived class;
Hardly necessary, C++ will error on this:
This was wrong. Sorry. I never shadow in real code…
 I still prefer that this is not allowed as shadowing in class 
 hierarchies makes debugging so much more confusing.
This is true. ;-)
Jun 14 2022
parent Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Tuesday, 14 June 2022 at 13:19:18 UTC, Ola Fosheim Grøstad 
wrote:
 This was wrong. Sorry. I never shadow in real code…
(It wasn't strictly wrong, but not a complete explanation.)
Jun 14 2022
prev sibling parent forkit <forkit gmail.com> writes:
On Monday, 13 June 2022 at 07:59:24 UTC, bauss wrote:
 ....
 D states it is "module private" but really it is neither
 "module private" nor "class private"; it's a mix between the
 two... and I'm not sure whether that's good or not.
on the issue of 'good or bad', well the answer is subjective. It depends on what you want, and expect.

```
module test;
@safe:

import std;

void myFunc()
{
    int a = 100; // a is bound (private) to scope, not module
                 // so = good, I wanted and expected this.
}

class foo
{
    static private int b = 200; // b is bound to module scope
    // even though I thought I was binding it to local scope
    // This is not what I wanted or expected.
    // so module global = bad.
}

class bar
{
    static private(module) int c = 300; // c is bound to module scope
    // good, that's what I wanted and expected.
}

void main()
{
    writeln(a); // won't compile = good

    writeln(foo.b); // will compile = bad.
    // bad cause compiler has no idea what i wanted,
    // cause i can't tell it what I wanted.
    // there is no feature in D to do this.

    writeln(bar.c); // will compile = good. It's what I wanted and expected.
}
```
Jun 13 2022
prev sibling next sibling parent forkit <forkit gmail.com> writes:
On Monday, 13 June 2022 at 03:46:52 UTC, Mike Parker wrote:
 On Monday, 13 June 2022 at 02:42:52 UTC, forkit wrote:

 Withing a module:

 (1) What is the disadvantage of having an 'optional' access 
 modifier, such as 'class private', so that you can separate 
 interface from implementation?

 (2) what is the disadvantage in leveraging the compiler to 
 help us in determining whether code outside that class, but 
 within the same module, is correctly using that interface?
That's backwards. You're talking about adding a new language feature. The bar in that case is to show that the new feature provides an advantage over the status quo, such that the additional complexity is justified. That's what this boils down to. Walter and Atila are who you need to convince. Not me.

I just don't see a *practical* advantage to it. Let's say I have this class using a shiny new `private(intern)` feature implemented by someone on my team.

```d
class Foo {
    private(intern) int x;
}
```

Now I want to add a function in the module that manipulates `x`.

```d
void newFunction(Foo f) {
    f.x = 10;
}
```

Oops! Can't do that. But no problem. I have access to the source. I can change `x` to `private` so that I can access it everywhere in the module. Or I add a new `private(intern)` member function to set the value of `x` in the way I need it.

This is the reason I think it's a useless feature. If you have access to the module, you have access to the internal members of the class. It's no different than having a class per file in Java. At some point, it comes down to coding conventions (e.g., all internal access to private members within a Java class must explicitly go through the public interface).

How does this feature bring any benefit that can't be had by putting `Foo` in a separate module? That's the question you have to answer.
In your example, there is no benefit. There are only a couple of lines of code in the module. I can chunk the whole module almost in one go. That approach is not scalable because: first, humans decipher information using chunks, and second, humans make mistakes.

The argument that you have access to the source file is a strawman. The one-class-per-file argument is also a strawman. Why not one struct per file, one enum per file, one function per file, one int per file... why only a class per file?

This is not about this approach or that approach. It's about giving the programmer an option to decide which approach is best for them. Your side of the argument is basically saying we don't want to give you that option - cause we know what approach is best for you.
Jun 12 2022
prev sibling parent reply jmh530 <john.michael.hall gmail.com> writes:
On Monday, 13 June 2022 at 03:46:52 UTC, Mike Parker wrote:
 [snip]
 This is the reason I think it's a useless feature. If you have 
 access to the module, you have access to the internal members 
 of the class. It's no different than having a class per file in 
 Java. At some point, it comes down to coding conventions (e.g., 
 all internal access to private members within a Java class must 
 explicitly go through the public interface).

 How does this feature bring any benefit that can't be had by 
 putting `Foo` in a separate module? That's the question you 
 have to answer.
A fair point, but one of the arguments (over who knows how many pages at this point...) is that some people want to be able to have some way to ensure that the only way to access internal members is controlled through the class itself. You would reply that there is an alternative, which is to put that class in a separate module. It is a little like @safe (in a project with no dependencies, is @safe useless because the programmer has access to all the code and can verify themselves whether there are any errors?). Some people want guarantees. The people arguing for this want the guarantee that if they write some other function *in the module* that tries to access one of its internal members, then they get an error. Now, that's not something that has been a concern to me, but I don't necessarily think it is a useless feature.
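A sketch of the guarantee being asked for, expressed with the only tool available today, module isolation; module and member names are illustrative:

```
// foo_only.d -- the class alone in its own module
module foo_only;

class Foo
{
    private int x;
    void setX(int v) { x = v; }  // the one sanctioned write path
}
```
```
// consumer.d -- any function here that pokes at the internals fails to build
module consumer;

import foo_only;

// the compile-time guarantee described above, captured as a test:
static assert(!__traits(compiles, (Foo f) { f.x = 1; }));
```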
Jun 13 2022
parent forkit <forkit gmail.com> writes:
On Monday, 13 June 2022 at 12:33:32 UTC, jmh530 wrote:
 A fair point, but one of the arguments (over who knows how many 
 pages at this point...) is that some people want to be able to 
 have some way to ensure that the only way to access internal 
 members is controlled through the class itself. You would reply 
 that there is an alternative, which is to put that class in a 
 separate module. It is a little like @safe (in a project with
 no dependencies, is @safe useless because the programmer has
 access to all the code and can verify themselves whether there 
 are any errors?). Some people want guarantees. The people 
 arguing for this want the guarantee that if they write some 
 other function *in the module* that tries to access one of its
 internal members, then they get an error. Now, that's not 
 something that has been a concern to me, but I don't 
 necessarily think it is a useless feature.
of course, this is primarily a matter of scale (both in terms of code, and of people contributing to that code - even within the same module - and not just now, but in the future as well).

the only way to scale in D when using OO designs, without this feature, is to put every class on its own in its own module - for no other reason than to protect the code of the class from any other code. that is not scalable.

sure, some choose to put one class per file - that is a design decision - not one that should be forced on to you by the language because the language refuses to provide the tools you need. you should be able to protect the code of your class from surrounding code, without being told this is the only workaround we can provide for you.

there is no downside to having an option to contain private parts to the scope of a class. What a complete joke to suggest there is. people will choose to use it, or not - but they'll have a choice. of course I already have that choice in ALL the other languages I use - a programmer can already do this.

There is a strong 'anti' class-private group in the D community. Their motivations are their own, and not all that clear to me. But they sure are intent on ensuring I (and everyone else that uses D) don't have that choice.
Jun 13 2022
prev sibling parent Max Samukha <maxsamukha gmail.com> writes:
On Sunday, 12 June 2022 at 14:56:53 UTC, Mike Parker wrote:
 On Sunday, 12 June 2022 at 14:05:00 UTC, Max Samukha wrote:
 On Sunday, 12 June 2022 at 11:47:53 UTC, Mike Parker wrote:
 Hi, Mike! Congratulations on being the first unsurprised D 
 user! (You were actually surprised for a moment, weren't you?)
No. Nor was I surprised, for example, when I learned that `protected` in some other languages is more restrictive than `protected` in Java. Every language has a similar approach to some things as other languages, and a different approach to others. I've investigated enough programming languages that I learned long ago to be open to the differences and never to expect that just because something is true in Language A it will be true for a similar feature in Language B. I often have reactions of "neat" or "cool", or "too bad" or "that sucks", but I can't say I'm ever really surprised when I run into them.

You have to learn to think in the language you're using if you want to be productive with it, and that means accepting the differences. You may find some things grate on your nerves because they don't square with your view of the world, in which case you either push to change them, accept them, or, if it's too much to handle, move on to a language that better fits your mental model. The latter is why I never stuck with C++.
The problem is that there are no 'practical' languages to move on to. All of them make you maintain inconsistent mental models. Like, 'synchronized' is class-level, and its semantics is based on the assumption that 'private' is class-level too, but it is not, so 'shared' is broken as a consequence.
 When I first learned about D's private-to-the-module approach, 
 it made perfect sense to me. It fits right in with D's concept 
 of modules.
Right. The problem is it doesn't fit in with the concept of classes.
 I have been surprised occasionally, though, when I was certain 
 a feature worked a certain way, but I learned later my 
 understanding was wrong. There were a couple of those instances 
 when I was writing Learning D, but I can't for the life of me 
 remember what they were.
Yes, I have a similar experience. Some of the features made perfect sense after I had learned the reasoning behind them. Unfortunately, the module-level 'private' is not one of those.
Jun 13 2022
prev sibling next sibling parent reply Adam D Ruppe <destructionator gmail.com> writes:
On Sunday, 12 June 2022 at 14:05:00 UTC, Max Samukha wrote:
 (You were actually surprised for a moment, weren't you?)
Actually, when I used PHP's classes the first time, I was so perplexed at why this private thing was giving an error when the code was right next to it. It just seems so unnatural.
Jun 12 2022
parent Max Samukha <maxsamukha gmail.com> writes:
On Sunday, 12 June 2022 at 17:42:32 UTC, Adam D Ruppe wrote:
 On Sunday, 12 June 2022 at 14:05:00 UTC, Max Samukha wrote:
 (You were actually surprised for a moment, weren't you?)
Actually, when I used PHP's classes the first time, I was so perplexed at why this private thing was giving an error when the code was right next to it. It just seems so unnatural.
But then 'private' on modules should not seem more natural to you either.
Jun 13 2022
prev sibling parent reply claptrap <clap trap.com> writes:
On Sunday, 12 June 2022 at 14:05:00 UTC, Max Samukha wrote:
 On Sunday, 12 June 2022 at 11:47:53 UTC, Mike Parker wrote:

 Hi. My name's Mike. Nice to meet you.
Hi, Mike! Congratulations on being the first unsurprised D user! (You were actually surprised for a moment, weren't you?)
Anyone who used Object Pascal / Delphi wouldn't have been surprised either.
Jun 13 2022
next sibling parent reply Max Samukha <maxsamukha gmail.com> writes:
On Monday, 13 June 2022 at 08:13:14 UTC, claptrap wrote:
 On Sunday, 12 June 2022 at 14:05:00 UTC, Max Samukha wrote:
 On Sunday, 12 June 2022 at 11:47:53 UTC, Mike Parker wrote:

 Hi. My name's Mike. Nice to meet you.
Hi, Mike! Congratulations on being the first unsurprised D user! (You were actually surprised for a moment, weren't you?)
Anyone who used Object Pascal / Delphi wouldn't have been surprised either.
I am a narrow-minded person and have never written a line in those. However, I heard that the original author of Turbo Pascal gave C# class-level 'private' and 'internal' to break out if it.
Jun 13 2022
parent Max Samukha <maxsamukha gmail.com> writes:
On Monday, 13 June 2022 at 08:47:29 UTC, Max Samukha wrote:

 class-level 'private' and 'internal' to break out if it.
*out of it
Jun 13 2022
prev sibling parent reply user1234 <user1234 12.de> writes:
On Monday, 13 June 2022 at 08:13:14 UTC, claptrap wrote:
 On Sunday, 12 June 2022 at 14:05:00 UTC, Max Samukha wrote:
 On Sunday, 12 June 2022 at 11:47:53 UTC, Mike Parker wrote:

 Hi. My name's Mike. Nice to meet you.
Hi, Mike! Congratulations on being the first unsurprised D user! (You were actually surprised for a moment, weren't you?)
Anyone who used Object Pascal / Delphi wouldn't have been surprised either.
True, but ObjFPC has had `strict private` too for a while now: [demo]. This does exactly what has been discussed on the D forum for the past week.

[demo]: https://ideone.com/j0kCMD
Jun 13 2022
parent reply user1234 <user1234 12.de> writes:
On Monday, 13 June 2022 at 09:09:08 UTC, user1234 wrote:
 On Monday, 13 June 2022 at 08:13:14 UTC, claptrap wrote:
 On Sunday, 12 June 2022 at 14:05:00 UTC, Max Samukha wrote:
 On Sunday, 12 June 2022 at 11:47:53 UTC, Mike Parker wrote:

 Hi. My name's Mike. Nice to meet you.
Hi, Mike! Congratulations on being the first unsurprised D user! (You were actually surprised for a moment, weren't you?)
Anyone who used Object Pascal / Delphi wouldn't have been surprised either.
True, but ObjFPC has had `strict private` too for a while now: [demo]. This does exactly what has been discussed on the D forum for the past week. [demo]: https://ideone.com/j0kCMD
well the online compilers are too old, but if you try this code with a more recent compiler

```pascal
{$MODE OBJFPC}{$H+}
{$MODESWITCH ADVANCEDRECORDS}

type
  TS = record
  private
    a: integer;
  strict private
    b: integer;
  end;

var
  s: TS;

begin
  s.a := 1; // OK
  s.b := 1; // NG
end.
```

this gives
 project1.lpr(15,5) Error: identifier idents no member "b"
Jun 13 2022
parent reply Paulo Pinto <pjmlp progtools.org> writes:
On Monday, 13 June 2022 at 09:12:47 UTC, user1234 wrote:
 On Monday, 13 June 2022 at 09:09:08 UTC, user1234 wrote:
 On Monday, 13 June 2022 at 08:13:14 UTC, claptrap wrote:
 [...]
 True, but ObjFPC has had `strict private` too for a while now: [demo]. This does exactly what has been discussed on the D forum for the past week. [demo]: https://ideone.com/j0kCMD
well the online compilers are too old, but if you try this code with a more recent compiler

```pascal
{$MODE OBJFPC}{$H+}
{$MODESWITCH ADVANCEDRECORDS}

type
  TS = record
  private
    a: integer;
  strict private
    b: integer;
  end;

var
  s: TS;

begin
  s.a := 1; // OK
  s.b := 1; // NG
end.
```

this gives
 project1.lpr(15,5) Error: identifier idents no member "b"
You should have used compiler explorer. :) https://godbolt.org/z/6M7be7M57
Jun 13 2022
parent user1234 <user1234 12.de> writes:
On Monday, 13 June 2022 at 10:49:37 UTC, Paulo Pinto wrote:
 On Monday, 13 June 2022 at 09:12:47 UTC, user1234 wrote:
 On Monday, 13 June 2022 at 09:09:08 UTC, user1234 wrote:
 [...]
well the online compilers are too old, but if you try this code with a more recent compiler

```pascal
{$MODE OBJFPC}{$H+}
{$MODESWITCH ADVANCEDRECORDS}

type
  TS = record
  private
    a: integer;
  strict private
    b: integer;
  end;

var
  s: TS;

begin
  s.a := 1; // OK
  s.b := 1; // NG
end.
```

this gives
 [...]
You should have used compiler explorer. :) https://godbolt.org/z/6M7be7M57
nice, even the highlighting of "strict private" is correct there. one more kudos for compiler explorer.
Jun 13 2022
prev sibling next sibling parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Sunday, 12 June 2022 at 10:40:02 UTC, Max Samukha wrote:
 Then it is natural to expect the feature would apply to the 
 class level (if we are talking about class-based OOP)? Really, 
 I'm yet to meet a D user that wouldn't be surprised 'private' 
 is module-level. And the 'friend' story rarely impresses them. 
 The reply is always: "Ok, 'friend' breaks OOP principles. Is D 
 better because it breaks OOP in its own way?"
Yes, of course, D could make private work like C++ and give module access to protected instead. You could also allow reading of object attributes from the module and restrict writing to the object (without using properties).
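A sketch of what that split could look like; the comments describe the hypothetical semantics being suggested, not current behavior (today both accesses compile inside the module):

```
module m;

class Foo
{
    private int a;    // hypothetical: class-only, as in C++
    protected int b;  // hypothetical: subclasses plus the rest of module m
}

void f(Foo foo)
{
    // foo.a = 1;     // would become an error under this scheme
    foo.b = 2;        // would stay legal: module access granted via protected
}
```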
Jun 12 2022
parent reply Chris Katko <ckatko gmail.com> writes:
On Sunday, 12 June 2022 at 11:49:21 UTC, Ola Fosheim Grøstad 
wrote:
 On Sunday, 12 June 2022 at 10:40:02 UTC, Max Samukha wrote:
 Then it is natural to expect the feature would apply to the 
 class level (if we are talking about class-based OOP)? Really, 
 I'm yet to meet a D user that wouldn't be surprised 'private' 
 is module-level. And the 'friend' story rarely impresses them. 
 The reply is always: "Ok, 'friend' breaks OOP principles. Is D 
 better because it breaks OOP in its own way?"
Yes, of course, D could make private work like C++ and give module access to protected instead. You could also allow reading of object attributes from the module and restrict writing to the object (without using properties).
If I could have private (or any keyword that mimics C++ private) as an optional compiler flag, I would turn that on in a heartbeat.
Jun 12 2022
parent reply forkit <forkit gmail.com> writes:
On Sunday, 12 June 2022 at 12:26:33 UTC, Chris Katko wrote:
 If I could have private (or any keyword that mimics C++ 
 private) as an optional compiler flag, I would turn that on in a 
 heartbeat.
That is actually a decent alternative (i.e. an 'optional' compiler switch). But even with the support of the D community, this would likely end up being something that Walter would have to implement (in that messy thing known as the frontend). I'm sure he would give it his foremost attention ;-)

To be honest, the more I look into Swift (I only started a few days ago), the less impressed I am with D. I think Swift has a really bright future actually.

https://docs.swift.org/swift-book/GuidedTour/GuidedTour.html
Jun 12 2022
parent reply Paulo Pinto <pjmlp progtools.org> writes:
On Sunday, 12 June 2022 at 23:09:22 UTC, forkit wrote:
 On Sunday, 12 June 2022 at 12:26:33 UTC, Chris Katko wrote:
 If I could have private (or any keyword that mimics C++ 
 private) as an optional compiler flag, I would turn that on in 
 a heartbeat.
That is actually a decent alternative (i.e. an 'optional' compiler switch). But even with the support of the D community, this would likely end up being something that Walter would have to implement (in that messy thing known as the frontend). I'm sure he would give it his foremost attention ;-) To be honest, the more I look into Swift (I only started a few days ago), the less impressed I am with D. I think Swift has a really bright future actually. https://docs.swift.org/swift-book/GuidedTour/GuidedTour.html
Swift definitely has a bright future no matter what, because it has behind it one of the most powerful companies, which asserts that it is the only way to play on their turf going forward, besides Objective-C and C++.

Unfortunately the time for such big industry players to pick D has moved on; they are now busy with Go, Rust, or adding to Java, .NET and C++ the missing pieces that made D a better option, while having a much bigger ecosystem in tooling and libraries.
Jun 12 2022
parent reply Tejas <notrealemail gmail.com> writes:
On Monday, 13 June 2022 at 06:09:34 UTC, Paulo Pinto wrote:
 On Sunday, 12 June 2022 at 23:09:22 UTC, forkit wrote:
 [...]
Swift definitely has a bright future no matter what, because it has behind it one of the most powerful companies, which asserts that it is the only way to play on their turf going forward, besides Objective-C and C++. Unfortunately the time for such big industry players to pick D has moved on; they are now busy with Go, Rust, or adding to Java, .NET and C++ the missing pieces that made D a better option, while having a much bigger ecosystem in tooling and libraries.
Isn't stuff like `Dart` also feasible on Apple though, due to its amazing portability (i.e., `Flutter`)? Doubt non-Apple people will pick up Swift, but it'll definitely work the other way round
Jun 12 2022
parent Paulo Pinto <pjmlp progtools.org> writes:
On Monday, 13 June 2022 at 06:38:13 UTC, Tejas wrote:
 On Monday, 13 June 2022 at 06:09:34 UTC, Paulo Pinto wrote:
 On Sunday, 12 June 2022 at 23:09:22 UTC, forkit wrote:
 [...]
Swift definitely has a bright future no matter what, because it has behind it one of the most powerful companies, which asserts that it is the only way to play on their turf going forward, besides Objective-C and C++. Unfortunately the time for such big industry players to pick D has moved on; they are now busy with Go, Rust, or adding to Java, .NET and C++ the missing pieces that made D a better option, while having a much bigger ecosystem in tooling and libraries.
Isn't stuff like `Dart` also feasible on Apple though, due to its amazing portability (ie, `Flutter`)? Doubt non Apple people will pick up Swift, but it'll definitely work the other way round
Maybe, but that also boils down to the same basic example anyway: one of the biggest corporations in the world (Google) pushing a cross-platform framework (Flutter) where only one language gets to play (Dart).
Jun 13 2022
prev sibling parent reply =?UTF-8?Q?Ali_=c3=87ehreli?= <acehreli yahoo.com> writes:
On 6/12/22 03:40, Max Samukha wrote:
 On Sunday, 12 June 2022 at 09:54:42 UTC, Ola Fosheim Grøstad wrote:

 Many OO languages don't provide encapsulation, like Python.
Right, but Python programmers are expected to prepend private members with '_'.
 I would say encapsulation has more to do with scaling up and evolving,
 as well as reliability (e.g. actors).
 But I guess you can say that OO features span a space where
 encapsulation is one dimension.
Then it is natural to expect the feature would apply to the class level (if we are talking about class-based OOP)? Really, I'm yet to meet a D user that wouldn't be surprised 'private' is module-level.
I was surprised too. Then I realized D's view was better than C++'s (my frame of reference at the time). Then I realized I've been under the influence of C++'s view of OOP. Then I thought more about what encapsulation actually means. Then I realized encapsulation has nothing to do with access control. I am much happier with this repaired frame of reference.
 And the
 'friend' story rarely impresses them. The reply is always: "Ok, 'friend'
 breaks OOP principles. Is D better because it breaks OOP in its own way?"
Access control is just an aspect of OOP. Neither friend nor module-level private breaks OOP. Being a simpleton, I start reading from Wikipedia but others can read their trusted OOP gospel to gather the same information: https://en.wikipedia.org/wiki/Encapsulation_(computer_programming) For example: "Under the definition that encapsulation "can be used to hide data members and member functions", the internal representation of an object is generally hidden from view outside of the object's definition." Note "can be used to hide" and "generally hidden". See, all the other languages apply their view of 'private', not OOP's view of 'private'. I find D's 'private' very useful and I doubt a single project had any problem with it. Ali
Jun 13 2022
next sibling parent reply rikki cattermole <rikki cattermole.co.nz> writes:
On 14/06/2022 5:08 AM, Ali Çehreli wrote:
 I was surprised too. Then I realized D's view was better than C++'s (my 
 frame of reference at the time). Then I realized I've been under the 
 influence of C++'s view of OOP. Then I thought more about what 
 encapsulation actually means. Then I realized encapsulation has nothing 
 to do with access control. I am much happier with this repaired frame of 
 reference.
Replace C++ with Java and I'm the same.
 I find D's 'private' very useful and I doubt a single project had any 
 problem with it.
Same. I've got many more problems with export being a visibility modifier than private has ever given me.
Jun 13 2022
parent reply Max Samukha <maxsamukha gmail.com> writes:
On Monday, 13 June 2022 at 17:34:32 UTC, rikki cattermole wrote:

 I've got many more problems with export being a visibility 
 modifier than private has ever given me.
I bet you would have even fewer problems if there weren't 'private' at all.
Jun 13 2022
parent reply rikki cattermole <rikki cattermole.co.nz> writes:
On 14/06/2022 7:04 AM, Max Samukha wrote:
 On Monday, 13 June 2022 at 17:34:32 UTC, rikki cattermole wrote:
 I've got many more problems with export being a visibility modifier 
 than private has ever given me.
I bet you would have even fewer problems if there weren't 'private' at all.
You can't go lower than zero.

Export, on the other hand, blocks whole use cases for D.
Jun 13 2022
parent Max Samukha <maxsamukha gmail.com> writes:
On Monday, 13 June 2022 at 19:13:56 UTC, rikki cattermole wrote:

 You can't go lower than zero.

 Export, on the other hand, blocks whole use cases for D.
True.
Jun 13 2022
prev sibling next sibling parent Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Monday, 13 June 2022 at 17:08:55 UTC, Ali Çehreli wrote:
 Then I thought more about what encapsulation actually means. 
 Then I realized encapsulation has nothing to do with access 
 control.
How did you reach that conclusion? There are at least two important aspects of encapsulation in components: information hiding and access control.

*Information hiding* has to do with preventing the user from making assumptions about how something works beyond what is documented.

*Access control* has to do with reducing the number of failure points that have to be inspected in more complex software.

The latter aspect is quite important in complicated low-level scenarios, like performance-oriented concurrency code, as well as complex settings with many moving parts at any level (also at the cloud level). In Python the latter is not so critical, as you typically don't care too much about performance or concurrency and have many opportunities to avoid complexity. Maybe the typical usage of D falls somewhere between Python and C++.

Whether something is ok or not depends on the use case… The key question is: what usage scenario is D trying to be best for? Answer that, then we can discuss concrete features. Without an answer to that question discussions will either be personal or abstract.
Jun 13 2022
prev sibling next sibling parent reply Max Samukha <maxsamukha gmail.com> writes:
On Monday, 13 June 2022 at 17:08:55 UTC, Ali Çehreli wrote:

 Access control is just an aspect of OOP. Neither friend nor 
 module-level private breaks OOP. Being a simpleton, I start 
 reading from Wikipedia but others can read their trusted OOP 
 gospel to gather the same information:

   
 https://en.wikipedia.org/wiki/Encapsulation_(computer_programming)

 For example:

   "Under the definition that encapsulation "can be used to hide 
 data members and member functions", the internal representation 
 of an object is generally hidden from view outside of the 
 object's definition."

 Note "can be used to hide" and "generally hidden". See, all the 
 other languages apply their view of 'private', not OOP's view 
 of 'private'.

 I find D's 'private' very useful and I doubt a single project 
 had any problem with it.

 Ali
I am not debating its usefulness, but it would be even more useful if it meant what I believe most programmers not damaged by Simula or Pascal intuitively expect, that is, 'private to the parent scope'. And then we could have something like 'private(ancestor)' (or extend 'package(ancestor)') for specifying the desired boundary of encapsulation. Currently I just cannot enforce class invariants (without isolating the class in its own module). See this gospel: "Encapsulation also protects the integrity of the component, by preventing users from setting the internal data of the component into an invalid or inconsistent state." (https://en.wikipedia.org/wiki/Information_hiding#Encapsulation)
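The closest current approximation is exactly that isolation, combined with D's `invariant` blocks; a minimal sketch, assuming the class gets a module to itself (names illustrative):

```
// counter.d -- the whole module is just this class, so module-private
// and class-private coincide, and the invariant is actually enforceable
module counter;

class Counter
{
    private int value;

    invariant
    {
        assert(value >= 0, "Counter invariant violated");
    }

    void add(int n)
    {
        value += n;  // invariant re-checked on entry/exit of public
                     // member functions (in non-release builds)
    }

    int current() const { return value; }
}
```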
Jun 13 2022
parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Monday, 13 June 2022 at 20:05:47 UTC, Max Samukha wrote:
 it would be even more useful if it meant what I believe most 
 programmers not damaged by Simula
Huh? Simula has the same protection modes as C++, except it has one more.
Jun 13 2022
parent reply Max Samukha <maxsamukha gmail.com> writes:
On Monday, 13 June 2022 at 20:45:58 UTC, Ola Fosheim Grøstad 
wrote:

 Huh? Simula has the same protection modes as C++, except it has 
 one more.
Delphi, or whichever other dead technology I have a hard time keeping memory of.
Jun 13 2022
parent reply Paulo Pinto <pjmlp progtools.org> writes:
On Tuesday, 14 June 2022 at 05:15:11 UTC, Max Samukha wrote:
 On Monday, 13 June 2022 at 20:45:58 UTC, Ola Fosheim Grøstad 
 wrote:

 Huh? Simula has the same protection modes as C++, except it 
 has one more.
Delphi or whichever other dead technology I have hard time keeping memory of.
Ironically, the way things are going, it looks much more alive than D to me:

https://www.embarcadero.com/products/delphi
https://entwickler-konferenz.de/
Jun 13 2022
parent Max Samukha <maxsamukha gmail.com> writes:
On Tuesday, 14 June 2022 at 05:36:05 UTC, Paulo Pinto wrote:

 Delphi, or whichever other dead technology I have a hard time 
 keeping memory of.
Ironically, the way things are going, it looks much more alive than D to me: https://www.embarcadero.com/products/delphi https://entwickler-konferenz.de/
Oh, I see. They were smart enough to join the buzzword production industry.
Jun 13 2022
prev sibling next sibling parent reply forkit <forkit gmail.com> writes:
On Monday, 13 June 2022 at 17:08:55 UTC, Ali Çehreli wrote:
 ...
 I was surprised too. Then I realized D's view was better than 
 C++'s (my frame of reference at the time). Then I realized I've 
 been under the influence of C++'s view of OOP. Then I thought 
 more about what encapsulation actually means. Then I realized 
 encapsulation has nothing to do with access control. I am much 
 happier with this repaired frame of reference.
Really?

Sure, you can separate interface from implementation without access control. But you can also ensure you don't put a signed int into an unsigned int - by 'just not doing it'.

It's the 'just don't do it' approach that bothers me.

The 'just don't do it' philosophy of D is not a good tool to rely on in software design and development, especially at scale.

That is where 'private(scope)' comes in handy. If I 'do' do it, or someone else 'does it', then the compiler will know that what I'm doing, or someone else is doing, conflicts with the intent as specified in the class. It provides a level of assurance that you simply cannot get without it. That's its benefit. There is no downside. You always have the choice to use it or not - that should always be up to you to decide.

The problem with D is that it decides this for you, and leaves you with only two options:
(1) just don't do it.
(2) put the class in its own file

To me, this is unnecessarily restrictive, and puts an immediate constraint on me before I've even begun to consider what design is best for me.
 And the
 'friend' story rarely impresses them. The reply is always:
"Ok, 'friend'
 breaks OOP principles. Is D better because it breaks OOP in
its own way?"
The 'friend' argument is a strawman. No D programmer would need to alter anything they currently do with the 'option' that I'm asking for. But a large number of programmers coming to D will have to change what they're currently doing - because they have no option.
 I find D's 'private' very useful and I doubt a single project 
 had any problem with it.

 Ali
This is another strawman. The problem is not the D module; it's not having a choice.
Jun 13 2022
next sibling parent reply bauss <jj_1337 live.dk> writes:
On Tuesday, 14 June 2022 at 00:54:34 UTC, forkit wrote:
 It's the 'just don't do it' approach that bothers me.

 The 'just don't do it' philosophy of D is not a good tool to 
 rely on in software design and development, especially at 
 scale.
I completely agree with this. It bothers me A LOT that it's the view of most people around here. Why stop there? Why not remove any constraints in the language? After all you can just stop making mistakes. "Just don't do it." Why do we need @safe, dip1000 etc.? Just verify the memory yourself, if you accidentally screw up then that's your fault, surely it isn't the language's fault. @nogc? Nah we don't need that, just don't allocate using the GC.
Jun 13 2022
parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Tuesday, 14 June 2022 at 06:46:45 UTC, bauss wrote:
 I completely agree with this. It bothers me A LOT that it's the 
 view of most people around here.

 Why stop there? Why not remove any constraints in the language? 
 After all you can just stop making mistakes. "Just don't do it."

 Why do we need @safe, dip1000 etc.? Just verify the memory 
 yourself, if you accidentally screw up then that's your fault, 
 surely it isn't the language's fault.

 @nogc? Nah we don't need that, just don't allocate using the GC.
When you end up adding more and more stuff to a language it is considered better to go back to the drawing board and start over. Otherwise you’ll end up with a mess...
Jun 14 2022
next sibling parent reply bauss <jj_1337 live.dk> writes:
On Tuesday, 14 June 2022 at 07:09:23 UTC, Ola Fosheim Grøstad 
wrote:
 On Tuesday, 14 June 2022 at 06:46:45 UTC, bauss wrote:
 I completely agree with this. It bothers me A LOT that it's 
 the view of most people around here.

 Why stop there? Why not remove any constraints in the 
 language? After all you can just stop making mistakes. "Just 
 don't do it."

 Why do we need @safe, dip1000 etc.? Just verify the memory 
 yourself, if you accidentally screw up then that's your fault, 
 surely it isn't the language's fault.

 @nogc? Nah we don't need that, just don't allocate using the 
 GC.
When you end up adding more and more stuff to a language it is considered better to go back to the drawing board and start over. Otherwise you’ll end up with a mess...
D is honestly a mess compared to when I first started using it over a decade ago; back then it was simple and you could easily do something in it.

Now you have this huge amount of attribute soup that you either need to sprinkle everywhere or you have to completely ignore. The community and libraries etc. also feel a lot more divided, even compared to when Tango was a viable alternative to Phobos; at least that's how it feels from my point of view. Also, it's like D chooses the most complex implementation for every new feature that has to be added.

I came to D because it was easier to use and get things done in than any other language, but over the past couple of years I've slowly stopped starting new projects in D, because it's becoming a drain to use and all its selling points are no longer selling points because other languages are either on par with D or do it better.

D was on the path to greatness, but no more. It's spiraling down a bad path and it's honestly sad. It had so much potential and I'm not sure it'll ever recover.

The only way for D to ever succeed would be to start D3 ASAP and start by figuring out how to get rid of the attribute soup etc. because all it does is add clutter and clutter is what makes a language difficult to use. C++ wasn't hard because of the technical stuff, but because there's a lot of clutter in the language with templates etc. IMHO.
Jun 14 2022
next sibling parent Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Tuesday, 14 June 2022 at 07:34:32 UTC, bauss wrote:
 D is honestly a mess compared to when I first started using it 
 over a decade ago, back then it was simple and you could easily 
 do something in it.
It is my viewpoint that D would have been more successful if it had chosen to be a simple system-level language without GC/templates and instead focused on high-quality builtins and high-level optimizations… but that ship has sailed for sure!? Rust is taking that space, except it isn't all that convenient. I don't have much belief in Zig and the other newcomers… C++ is just too far ahead at this point for me to view minor system-level languages as interesting. And by not being interesting, I mean that I am not even willing to download and give them one spin… There is also much less need for system-level programming today than 10 years ago, so it is a shrinking market with more competition… Only 2 or 3 languages can succeed in building an ecosystem in that space and also reach an acceptable level of portability (full support for various hardware, iOS, Android, compilation to FPGA, etc.).
 D was on the path to greatness, but no more. It's spiraling 
 down a bad path and it's honestly sad. It had so much potential 
 and I'm not sure it'll ever recover.
That depends on SDC. If SDC implements the core language and doesn't sacrifice compiler internals to support other cruft, then I guess you could attract compiler devs who would evolve it into a clean-slate D3.
 The only way for D to ever succeed would be to start D3 ASAP 
 and start by figuring out how to get rid of the attribute soup 
 etc. because all it does is adding clutter and clutter is what 
 makes a language difficult to use.
You could reimagine D3 as a high level language with the possibility of going system level. If you design from that angle then all the clutter will vanish as you cannot tolerate clutter in a high level language design. Clutter in D has for the most part been added in the name of "system level". (Which is a bit odd, because system level programming is even more in need of clean simplicity due to the drastic consequences of making a misstep.)
 C++ wasn't hard because of the technical stuff, but because 
 there's a lot of clutter in the language with templates etc. 
 IMHO.
Well, there isn't all that much incomprehensible clutter in C++ anymore if you want to avoid it, but you still have to deal with "evolved surprises", so it isn't easy for beginners. C++ cannot become acceptable for high level programming though, as it has a heavy focus on enabling compiler optimizations. As a programmer you have to focus heavily on the correctness of your code, rather than relying on wrongdoings being caught. That makes C++ unsuited for evolutionary programming. You basically need a design before you code in C++. So C++ is not a good alternative for D programmers who like to experiment and prototype.
Jun 14 2022
prev sibling parent reply norm <nr gmail.com> writes:
On Tuesday, 14 June 2022 at 07:34:32 UTC, bauss wrote:

 Now you have this huge amount of attribute soup that you either 
 need to sprinkle everywhere or you have to completely ignore.
Totally agree with this; it puts me off using D. TBH I haven't started a project in D for a while now; I am simply reaching for Python or C++20. Both get the job done, and while not great, C++ is at least moving in the right direction.
 The community and libraries etc. also feel a lot more divided, 
 even compared to when tango was a viable alternative to phobos; 
 at least that's how it feels from my point of view.
I much prefer Phobos over Tango, but I agree the core development community seems very divided. You would expect the focus to be on growing the user base with a great experience, adding syntax sugar and friction-reducing features like `int[$] arr = [1, 2, 3];` etc. But no, a massive amount of energy is spent on chasing C? Seriously?? C interop is hugely important, but D already interop'd with C seamlessly enough... clearly not enough for a core few, because that is the main focus for D development right now.
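(For context: `int[$]` refers to proposed length-inference sugar for static arrays; it is not valid D today. A rough sketch of what it would save, using the existing Phobos workaround:)

```
import std.array : staticArray;

void main()
{
    // Proposed sugar: int[$] arr = [1, 2, 3];
    // Today's workaround infers the static length instead:
    auto arr = [1, 2, 3].staticArray;
    static assert(is(typeof(arr) == int[3]));
}
```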
 Also it's like D chooses the most complex implementations for 
 every new feature that has to be added.

 I came to D because it was easier to use and get things done in 
 than any other language, but over the past couple of years I've 
 slowly stopped starting new projects in D, because it's becoming 
 a drain to use and all its selling points are no longer selling 
 points because other languages are either on par with D or 
 do it better.
This is also my experience, unfortunately it is at the point where I do not start new projects in D.
 D was on the path to greatness, but no more. It's spiraling 
 down a bad path and it's honestly sad. It had so much potential 
 and I'm not sure it'll ever recover.
Totally agree but I wouldn't say sad, it just is. I have moved back to C++ for hobby projects and I'm rather enjoying C++20. I also like the fact that I'm staying up to date with the latest C++. There are a lot more C++ jobs than D jobs out there.
 The only way for D to ever succeed would be to start D3 ASAP 
 and start by figuring out how to get rid of the attribute soup 
 etc. because all it does is add clutter and clutter is what 
 makes a language difficult to use. C++ wasn't hard because of 
 the technical stuff, but because there's a lot of clutter in 
 the language with templates etc. IMHO.
Agreed, I like the improvements in C++20, but it has a long way to go to make things readable. That said, `int func(auto value) {}` is a pretty easy-to-read template. With D code now, with all the attributes, I wouldn't know where to start and TBH would probably just ignore them.
Jun 14 2022
next sibling parent forkit <forkit gmail.com> writes:
On Tuesday, 14 June 2022 at 08:35:53 UTC, norm wrote:
 Agreed, I like the improvements in C++20, but it has a long way 
 to go to make things readable. That said, `int func(auto value) 
 {}` is a pretty easy-to-read template. With D code now, with all 
 the attributes, I wouldn't know where to start and TBH would 
 probably just ignore them.
It's easy. Start with `module myModule;` and then `@safe:` ... now when things don't work, you just comment out `@safe:` and the magic happens ;-) Except it isn't that easy. I don't mind what D is doing with the other attributes. I like attributes. The more the merrier... it gives me choice. I just need to know what those choices actually are, and whether they work.
Jun 14 2022
prev sibling parent Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Tuesday, 14 June 2022 at 08:35:53 UTC, norm wrote:
 Totally agree with this; it puts me off using D. TBH I haven't 
 started a project in D for a while now; I am simply reaching 
 for Python or C++20.
Do you combine Python and C++, i.e. call C++ code from Python? Or do you "chase" higher level and lower level aspects in different projects?
 etc. But no, a massive amount of energy is spent on chasing C? 
 Seriously?? C interop is hugely important, but D already 
 interop'd with C seamlessly enough... clearly not enough for a 
 core few, because that is the main focus for D development right 
 now.
It is being done for fun. It is not unreasonable that people do things they find interesting in their own spare time. The problematic part is that incomplete features are being merged instead of being polished in an experimental branch. Even more problematic when it is stated that the feature perhaps will never become complete (macro expansion). Such experiments should never be done on the main branch and that pretty much defines almost all of D's challenges. There is quite a difference between this practice and C++ compilers making features available for preview when they are following a formal spec!
 Totally agree but I wouldn't say sad, it just is.
It is a bit sad if you count all the hours put into attempts to build an ecosystem around it. The current approach guarantees a mode of perpetual instability, and that prevents an ecosystem from emerging. It would be much easier to build an ecosystem if you had one big fat breaking change to set the record straight (D3) than with what is perceived as random movements. If people feel you are moving about randomly, they become passive.
Jun 14 2022
prev sibling parent reply forkit <forkit gmail.com> writes:
On Tuesday, 14 June 2022 at 07:09:23 UTC, Ola Fosheim Grøstad 
wrote:
 On Tuesday, 14 June 2022 at 06:46:45 UTC, bauss wrote:
 I completely agree with this. It bothers me A LOT that it's 
 the view of most people around here.

 Why stop there? Why not remove any constraints in the 
 language? After all you can just stop making mistakes. "Just 
 don't do it."

 Why do we need @safe, dip1000 etc.? Just verify the memory 
 yourself, if you accidentally screw up then that's your fault, 
 surely it isn't the language's fault.

 @nogc? Nah we don't need that, just don't allocate using the 
 GC.
When you end up adding more and more stuff to a language it is considered better to go back to the drawing board and start over. Otherwise you’ll end up with a mess...
Well, that cycle does seem to be the case. E.g. the universe is a result of 'things' combining together to make other things, and those things combining to make yet more things, and those things combining to (at some point) make atoms, and atoms combining to make molecules, and then molecules adding to make .., and .. .. ... Sure, it'll all come crashing down at some point (well, presuming a non-infinite universe, which i doubt is the case). But we exist because these things were adding themselves to each other. At some point, the universe will implode and it'll all start all over again (at least that's what things seem to suggest). The trick is to make that point so far into the future that it doesn't really matter in your everyday considerations ;-) i.e. adding a thousand new features to D every day might be going too far.
Jun 14 2022
parent forkit <forkit gmail.com> writes:
On Tuesday, 14 June 2022 at 07:45:26 UTC, forkit wrote:
 ...
 Sure, it'll all come crashing down at some point (well, 
 presuming a non-infinite universe, which i doubt is the case)
oops. replace 'doubt' with 'expect' " .. which i expect is the case)"
Jun 14 2022
prev sibling parent reply bauss <jj_1337 live.dk> writes:
On Tuesday, 14 June 2022 at 00:54:34 UTC, forkit wrote:
 And the
 'friend' story rarely impresses them. The reply is always:
"Ok, 'friend'
 breaks OOP principles. Is D better because it breaks OOP in
its own way?"
The 'friend' argument is a strawman. No D programmer would need to alter anything they currently do with the 'option' that I'm asking for. But a large number of programmers coming to D will have to change what they're currently doing - because they have no option.
 I find D's 'private' very useful and I doubt a single project 
 had any problem with it.

 Ali
This is another strawman. The problem is not the D module; it's not having a choice.
People also completely misunderstand what we're asking for. We're not saying the current design is bad - I like module-level private as much as the next person - BUT I would really like to have the option of a stricter private for classes when chosen. I'd rather have a module with a couple of classes that are somewhat closely related than many files each holding one class. It's honestly easier to maintain a small number of files than many files. You end up with so many tabs in your editor too, whereas a symbol list makes it much easier to skip to the necessary class. I really don't understand why people are so much against it when it can be added without interfering with existing code, but it surely will help. Especially when you have projects multiple people work on, it will help ensure that someone in the future isn't gonna abuse it.
Jun 13 2022
parent reply Mike Parker <aldacron gmail.com> writes:
On Tuesday, 14 June 2022 at 06:49:59 UTC, bauss wrote:

 People also completely misunderstand what we're asking for.
No. No we do not. We understand *exactly* what you are asking for.
 I really don't understand why people are so much against it 
 when it can be added without interfering with existing code, 
 but it surely will help.
I can't speak for anyone else, but I don't think it adds *any* benefit to the language whatsoever. I have read all of the arguments for it and I don't find any of them compelling enough that the feature would bring significant advantage over putting classes with "strictly private" members in their own modules. But again, I'm just a guy expressing his opinion. Walter and Atila are the ones you need to convince. Talking about it endlessly in the forums has zero chance of getting the feature in. You guys posting about it in every other thread apparently feel strongly about it, so write a DIP. Get feedback from others who support it and try to craft a rationale that Walter and Atila will find compelling enough to accept (i.e., show that the benefit outweighs the added cost in maintenance and language complexity, why the arguments against it are wrong, etc.). Once you submit it and it's had enough time in Draft Review that you're satisfied with it, I'll even give it priority over any other DIPs in the queue.
Jun 14 2022
next sibling parent reply bauss <jj_1337 live.dk> writes:
On Tuesday, 14 June 2022 at 10:29:14 UTC, Mike Parker wrote:
 I can't speak for anyone else, but I don't think it adds *any* 
 benefit to the language whatsoever. I have read all of the 
 arguments for it and I don't find any of them compelling enough 
 that the feature would bring significant advantage over putting 
 classes with "strictly private" members in their own modules.
Just because it doesn't add a benefit for you doesn't mean it doesn't add a benefit for anyone else. I don't see a benefit in @nogc, importC, etc. for that matter, yet I still see the value in those and wouldn't propose - and haven't proposed - for them to be removed or not implemented. I don't see the problem in adding things that help one group of developers, if those things don't ruin it for anyone else. I think it's really ignorant to have a stance like that with a programming language that strives to be general-purpose. The whole point of a programming language being general-purpose is that you might not need all the features or see value in all features, but you have them at your disposal if needed, because you want to appeal to as many people as possible. I can't say I'm shocked, because it's the usual view with D, barely anyone in the community is willing to compromise.
Jun 14 2022
parent reply Mike Parker <aldacron gmail.com> writes:
On Tuesday, 14 June 2022 at 10:45:15 UTC, bauss wrote:

 I think it's really ignorant to have a stance like that with a 
 programming language that strives to be general-purpose. The 
 whole point of a programming language being general-purpose is 
 that you might not need all the features or see value in all 
 features, but you have them at your disposal if needed, because 
 you want to appeal to as many people as possible.

 I can't say I'm shocked, because it's the usual view with D, 
 barely anyone in the community is willing to compromise.
I'm not the gatekeeper. There's nothing for me to compromise on. If the feature gets in, that's fine. Life goes on. I'm not speaking for Walter or Atila when I express my opinion on this. D is already a complex language, so I just think that *any* new feature should have a strong justification. Personally, I haven't seen one yet for this one. But if you can convince Walter and Atila, more power to you.
Jun 14 2022
next sibling parent bauss <jj_1337 live.dk> writes:
On Tuesday, 14 June 2022 at 12:15:44 UTC, Mike Parker wrote:
 On Tuesday, 14 June 2022 at 10:45:15 UTC, bauss wrote:

 I think it's really ignorant to have a stance like that with a 
 programming language that strives to be general-purpose. The 
 whole point of a programming language being general-purpose is 
 that you might not need all the features or see value in all 
 features, but you have them at your disposal if needed, 
 because you want to appeal to as many people as possible.

 I can't say I'm shocked, because it's the usual view with D, 
 barely anyone in the community is willing to compromise.
I'm not the gatekeeper. There's nothing for me to compromise on. If the feature gets in, that's fine. Life goes on. I'm not speaking for Walter or Atila when I express my opinion on this. D is already a complex language, so I just think that *any* new feature should have a strong justification. Personally, I haven't seen one yet for this one. But if you can convince Walter and Atila, more power to you.
I don't think anyone can convince them, at least not Walter.
Jun 14 2022
prev sibling parent reply mee6 <mee6 lookat.me> writes:
On Tuesday, 14 June 2022 at 12:15:44 UTC, Mike Parker wrote:
 D is already a complex language, so I just think that *any* new 
 feature should have a strong justification. Personally, I 
 haven't seen one yet for this one. But if you can convince 
 Walter and Atila, more power to you.
The latest features that have been added, and some in the pipeline, are questionable. Consider the complexity added by basically maintaining a C compiler inside of D now. The added complexity isn't worth it. It's just another massive feature stretched thin. Having to convince those two is a waste of time. The entire community can be in an uproar, as I saw with '@safe by default' for C declarations, and they still won't change their minds. Whoever says that effort is better spent doing bug fixes or writing a DIP obviously hasn't tried to do either. My time is better served complaining about how awful it is than wasted going through the processes.
Jun 14 2022
next sibling parent Paul Backus <snarwin gmail.com> writes:
On Tuesday, 14 June 2022 at 19:22:34 UTC, mee6 wrote:
 Whoever says that effort is better served doing bug fixes or 
 writing Dip obviously hasn't tried to do either. My time is 
 best served complaining about how awful it is than to waste my 
 time going through the processes.
In my experience, just about anything is a better use of one's time than complaining on the internet. Take a walk in the park; listen to some music; read a book. You will be much happier.
Jun 14 2022
prev sibling parent reply Mike Parker <aldacron gmail.com> writes:
On Tuesday, 14 June 2022 at 19:22:34 UTC, mee6 wrote:

 Having to convince those two is a waste of time.
In the early days of D, new features could get into the language by persuading Walter here in the forums. The community was tiny, and such discussions were easy to follow. That's why we have templates, the is operator (i.e., `c is null`), mixin templates, and more. As time went by and the language became more complex, Walter had to raise the bar on letting new features in out of necessity (my private nickname for him back then was "Dr. No"). Moreover, at some point the community became too large for focused feature discussions. They were scattered across threads, rambling off topic, and difficult to follow (like this whole private-to-the-module discussion going on now). The DIP was an initiative started 100% by *the community*. [The Rationale of DIP 1](https://wiki.dlang.org/DIP1) shows why:
 Keeping track of improvement proposals is very hard and not 
 well documented organized. Having a template (and a process) 
 for such proposals can improve the situation significantly.
This forum discussion is where it came together: https://digitalmars.com/d/archives/digitalmars/D/new_DIP1_DIP_Template_92908.html Do a search on the following page for DIP and you'll see there were 6 DIPs "submitted" within a month of that post (and two more over the next three months): https://digitalmars.com/d/archives/digitalmars/D/index2009.html This was all community driven. There was no buy-in from Walter at that point. There's even one post there from [someone asking for Walter to declare DIPs "official"](https://digitalmars.com/d/archives/digitalmars/D/Wiki4D_Walter_and_DIPs_93463.html). There are plenty of DIP discussions in the archives, but it was never a formal process. Of course, there was some frustration because of that. So eventually, one volunteer stepped up and formalized the process with Walter and Andrei's blessing: https://www.digitalmars.com/d/archives/digitalmars/D/announce/Announcing_new_DIP_handling_process_44502.html ``` There are two main goals for going this way: 1) Ensure communication between language authors and DIP authors, establish better sense of actually contributing as opposed to simply throwing ideas into the black box. 2) Improve overall quality of DIP documents to the point where language authors can reasonably review them without spending too much time on trivialities. Additional benefit I am hoping for is to have a centralized place to subscribe to learn about all coming major changes coming to language for those who can't keep up with NG regularly. Walter and Atila are the maintainers. If you want a new feature, provide an argument that convinces them of its benefits vs. its costs via the DIP process. That's the bar. ``` And he added this note: ``` I will act as a DIP manager for the time being. Please note that role of DIP manager does not imply any decision power regarding DIP approval, it remains an exclusive domain of language authors. ``` I quote that for the last bit: "it remains the exclusive domain of the language authors." The DIP process was created by the community as a way to refine ideas for new features and ultimately present them to the language authors for consideration. If you want to add a new language feature, then this is the mechanism to do it, and Walter and Atila are the two who you need to convince.
Jun 14 2022
parent Mike Parker <aldacron gmail.com> writes:
On Wednesday, 15 June 2022 at 04:28:25 UTC, Mike Parker wrote:

 ```
 Walter and Atila are the maintainers. If you want a new 
 feature, provide an argument that convinces them of its 
 benefits vs. its costs via the DIP process. That's the bar.
 ```
This bit was supposed to have been outside of the block. That's my text.
Jun 14 2022
prev sibling next sibling parent reply bauss <jj_1337 live.dk> writes:
On Tuesday, 14 June 2022 at 10:29:14 UTC, Mike Parker wrote:
 Once you submit it and it's had enough time in Draft Review 
 that you're satisfied with it, I'll even give it priority over 
 any other DIPs in the queue.
Personally I would do it, but unfortunately I do not have enough time on my hands, and I do believe it would be the right way to go about it instead of these long discussions here in the forum. If anyone is willing to do it, then I for sure wouldn't mind helping, but I don't have enough time to write a whole DIP by myself.
Jun 14 2022
parent reply forkit <forkit gmail.com> writes:
On Tuesday, 14 June 2022 at 10:47:36 UTC, bauss wrote:
 On Tuesday, 14 June 2022 at 10:29:14 UTC, Mike Parker wrote:
 Once you submit it and it's had enough time in Draft Review 
 that you're satisfied with it, I'll even give it priority over 
 any other DIPs in the queue.
Personally I would do it, but I unfortunately do not have enough time on my hands that I could even do that and I do believe it would be the right way to do it instead of these long discussions about it here in the forum. If anyone is willing to do it, then I for sure wouldn't mind helping but I don't have enough time to write a whole DIP by myself.
I don't agree. As someone who has observed politics for a long time, I can tell you that things come about as a result of discussion, not because someone has brought a new policy to the floor. You bring the policy to the floor when you have a reasonable understanding of the likelihood of that policy getting passed, and you cannot determine that except by having the discussions first. Of course, discussions continue to take place once a policy is brought to the floor.

The other likely outcome here is that many are keeping silent and will not 'come out' unless or until a DIP is produced, and then they will feel forced to come out - either to express rejection or acceptance of the idea. I think there is a significant silent majority here, and it makes it difficult to assess whether it's worth the effort of doing a DIP.

What does seem clear is that the core team rejects this idea outright. In which case, if a majority wanted it, what would the core team do? Would we see another Phobos-Tango situation arise?

There is also the question of Swift, which for me has this feature already, as you'd expect, and also provides many, perhaps even most, of the features that attracted me to D. So where I spend my effort is also a consideration ;-)
Jun 14 2022
next sibling parent forkit <forkit gmail.com> writes:
On Tuesday, 14 June 2022 at 11:15:41 UTC, forkit wrote:
 There is also the question of Swift, which for me, has this 
 feature already, as you'd expect, and also provides many, 
 perhaps even most, of the features that attracted me to D.
Oh, and Swift has many additional features that I find interesting and useful - ones that D doesn't have, and likely will never, ever have. I think D is living in its own little world sometimes. Others have already passed it by, and more will too.
Jun 14 2022
prev sibling parent Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Tuesday, 14 June 2022 at 11:15:41 UTC, forkit wrote:
 I think there is a significant silent majority here, and it 
 makes it difficult to assess whether it's worth the effort of 
 doing a DIP.
It is difficult regardless; I would, for instance, "vote it down" if the syntax were anything more complex than «hidden». I want less complexity, more elegance, not more language attrition… Maybe write a DIP for adding a new forum called «New Features».
Jun 14 2022
prev sibling parent Max Samukha <maxsamukha gmail.com> writes:
On Tuesday, 14 June 2022 at 10:29:14 UTC, Mike Parker wrote:

 But again, I'm just a guy expressing his opinion. Walter and 
 Atila are the ones you need to convince.

 Talking about it endlessly in the forums has zero chance of 
 getting the feature in. You guys posting about it in every 
 other thread apparently feel strongly about it, so write a DIP.
 Get feedback from others who support it and try to craft a 
 rationale that Walter and Atila will find compelling enough to 
 accept (i.e., show that the benefit outweighs the added cost in 
 maintenance and language complexity, why the arguments against 
 it are wrong, etc.).

 Once you submit it and it's had enough time in Draft Review 
 that you're satisfied with it, I'll even give it priority over 
 any other DIPs in the queue.
Rationales have been presented. Walter has commented. It is obvious the feature won't get in. Why ask people to waste their time on a DIP that has zero chance of being accepted?
Jun 14 2022
prev sibling next sibling parent forkit <forkit gmail.com> writes:
On Monday, 13 June 2022 at 17:08:55 UTC, Ali Çehreli wrote:

Also, it should be considered a 'design' control, not an 
'access' control.

You cannot prevent 'access' by simply saying it's private.

But a 'design' control that's not enforceable seems pretty 
pointless.

private(scope) would provide the D programmer with an 
optional, enforceable design control.

Please explain the downside to this.
Jun 13 2022
prev sibling parent forkit <forkit gmail.com> writes:
On Monday, 13 June 2022 at 17:08:55 UTC, Ali Çehreli wrote:
 ...
 Then I thought more about what encapsulation actually means. 
 Then I realized encapsulation has nothing to do with access 
 control. I am much happier with this repaired frame of 
 reference.
But this has always been the case ;-) Encapsulation is an abstract concept. A cell can be considered encapsulated. The earth could be considered encapsulated. The solar system could be considered encapsulated. The galaxy could be considered encapsulated. There are even some suggestions that the universe could be considered encapsulated. Whereas private(scope) is a real, compile-time-enforceable design constraint.
Jun 13 2022
prev sibling parent monkyyy <crazymonkyyy gmail.com> writes:
On Saturday, 11 June 2022 at 13:14:20 UTC, Max Samukha wrote:
 On Saturday, 11 June 2022 at 12:21:31 UTC, Ola Fosheim Grøstad 
 wrote:

 Also, you've never stated what kind of OOP you are referring 
 to and for what purpose, and what your judgment is based on, 
 so it tells me next to nothing!?
I am talking about OOP, where an object is literally the unit of encapsulation. I don't understand why people are so eager to redefine the term.
But they are classes, not objects, which presumably is because C++ was thinking "well, I'm not making real objects here".
Jun 11 2022
prev sibling parent forkit <forkit gmail.com> writes:
On Friday, 10 June 2022 at 23:36:48 UTC, H. S. Teoh wrote:
 [...]
 Co-operative mutability is a major source of bugs - always 
 has been, always will be (because it makes it so difficult to 
 reason about code).
 
 Mutable state subverts encapsulation, makes it more difficult 
 to reason about code, and makes it difficult to scale 
 'correct' code.
[...] This sounds like a sales pitch for Haskell. :-P The logical conclusion of the above line of reasoning is to eliminate mutable state altogether, and for that, Haskell perfectly fits the bill. T
C'mon. Please don't do that. Here is what I actually wrote: --- Co-operative mutability is a major source of bugs - always has been, always will be (because it makes it so difficult to reason about code). Mutable state subverts encapsulation, makes it more difficult to reason about code, and makes it difficult to scale 'correct' code. Mutability is certainly 'convenient', and often necessary, especially in low-level code, but it needs to be approached with greater caution than what is demonstrated in D's source code. ---
Jun 10 2022
prev sibling parent reply Alexandru Ermicioi <alexandru.ermicioi gmail.com> writes:
On Friday, 10 June 2022 at 22:44:25 UTC, forkit wrote:
 On Friday, 10 June 2022 at 20:59:38 UTC, mw wrote:
 ..
 D is supposed to be a better OO language (read: encapsulation, 
 separation of concerns), and DMD is developed by a number of 
 highly capable very experienced D developers (read: not 
 ordinary programmers), how come DMD is in such a terrible 
 state as if it's done by some average Joel (above)?

 No offense, I am just puzzled by this software engineering 
 myth.
Nonsense. D .. a better OO language?? Do you even know how hard it is to reason about a D module? The D module is, apparently, THE single most important abstraction for encapsulation - someone decided to design it this way. This conflicts with the OO principle of being able to encapsulate an object's invariants in its specification. So D, a 'better OO language' .. hah!
Welcome to one module per class. Really, I fail to understand what the problem is with having one class per module, and importing multiple classes at once through a module with public imports if you desire that.
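A minimal sketch of that layout, with hypothetical module names; the package.d module lets users pull in several one-class modules with a single import:

```
// geometry/circle.d
module geometry.circle;
class Circle { private double radius; /* ... */ }

// geometry/square.d
module geometry.square;
class Square { private double side; /* ... */ }

// geometry/package.d - lets users write `import geometry;`
module geometry;
public import geometry.circle;
public import geometry.square;
```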
 The D module is designed to encourage shared mutability. There 
 are no means to specify, let alone verify and enforce, 
 encapsulated object invariants. They have no 'boundary' inside 
 a D module - by that I mean, any other code in the same module 
 can transgress any boundary that has been specified.
Which sometimes is quite useful, otherwise you'd end up with static nested classes/structs...
 Please drop this idea, that D is a better OO langauge. It is 
 not.
While it is not better per your reasoning, it is certainly not worse than others per my experience, and at least better than C++. I'd say it's on a level with Java, or a bit higher in functionality and convenience. Best regards, Alexandru.
Jun 11 2022
parent reply forkit <forkit gmail.com> writes:
On Saturday, 11 June 2022 at 10:49:09 UTC, Alexandru Ermicioi 
wrote:
 ..
 Welcome to single module per class. Really, I fail to 
 understand what's the problem with having one class per module, 
 and import multiple classes per module using public imports if 
 you desire to import multiple classes at once.
The problem I have is that a class is not considered a proper type in D. If it were, you would be able to define and enforce (at compile time) its invariants. As it is, any code in the module (outside of the class) can invalidate the invariants. This is equivalent to D letting you put "wtf!" into an int. Would you be happy with that? I.e. even an int is given better type conformance in D.

The module is not a type!

I do not design my solution around the module. I design it around types, and I use the module to bring related types together.

Separate, unrelated types can go in their own module.

In the event a type (and I'm talking about an instantiation of a class here) is a standalone type.. sure.. it can go in its own module, no problem. Makes sense even.

But you don't have to put every int you define into its own module, just so the compiler can enforce its invariants. That would be crazy.. yes?

A class is just a type - why is it considered less of a type than an int? (i.e. the only way to protect its invariants is to put it all by itself in its own module.)

What utter nonsense!
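To put the int-versus-class comparison in code (a minimal sketch with hypothetical names):

```
module m;

class Account
{
    private int balance; // intended invariant: balance >= 0
}

void sameModuleCode(Account a)
{
    // int b = "wtf!"; // rejected: the compiler enforces int's rules
    a.balance = -1;    // accepted: the class's intended invariant is
                       // not enforced against same-module code
}
```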
 While it is not better per your reasoning, it is certainly not 
 worse than others per my experience, and at least better than 
 C++. I'd say on level with Java or a bit higher in 
 functionality and convenience.

 Best regards,
 Alexandru.
The module is not a type. You do not design your solution using types. You do not instantiate a module and start sending it messages, or receive messages from it, or change its behavior, or have it change its own behavior. The module is just a convenient static encapsulation for related types. But you design a program by using types, not modules.
Jun 11 2022
next sibling parent reply forkit <forkit gmail.com> writes:
On Saturday, 11 June 2022 at 11:24:04 UTC, forkit wrote:
 The module is not a type. You do not design your solution using 
 types.
oops. of course the above should have said:
 The module is not a type. You design your solution using types.
Jun 11 2022
parent reply zjh <fqbqrr 163.com> writes:
On Saturday, 11 June 2022 at 11:27:34 UTC, forkit wrote:

 The module is not a type. You design your solution using types.
`Defeat` them, I support `you`!
Jun 11 2022
next sibling parent forkit <forkit gmail.com> writes:
On Saturday, 11 June 2022 at 11:32:52 UTC, zjh wrote:
 On Saturday, 11 June 2022 at 11:27:34 UTC, forkit wrote:

 The module is not a type. You design your solution using 
 types.
`Defeat` them, I support `you`!
I'm sure giving it my best shot ;-) If I'm going to use a class type in D, then D needs to treat it at least as well as an int type ;-) I.e. if I put an invariant in my type, I expect the compiler to enforce it (at compile time).
Jun 11 2022
prev sibling parent reply zjh <fqbqrr 163.com> writes:
On Saturday, 11 June 2022 at 11:32:52 UTC, zjh wrote:
 On Saturday, 11 June 2022 at 11:27:34 UTC, forkit wrote:

 The module is not a type. You design your solution using 
 types.
`Defeat` them, I support `you`!
Back then, `AA` put forward the concept of `"small class"` and absolutely opposed `"large class"`. But as a result, in `'d'`, `module` is the `encapsulation unit`? What an irony! Other languages have absorbed 'small classes' and developed 'interfaces'. And the `'d'` actually choose `module|file` as `encapsulation unit`?
Jun 11 2022
parent reply zjh <fqbqrr 163.com> writes:
On Saturday, 11 June 2022 at 13:56:09 UTC, zjh wrote:

 And the `'d'` actually choose `module|file` as `encapsulation 
 unit`?
`Big step backwards`! But a lot of people are still saying, good!very good!
Jun 11 2022
parent reply zjh <fqbqrr 163.com> writes:
On Saturday, 11 June 2022 at 13:58:57 UTC, zjh wrote:
 On Saturday, 11 June 2022 at 13:56:09 UTC, zjh wrote:

 And the `'d'` actually choose `module|file` as `encapsulation 
 unit`?
`Big step backwards`! But a lot of people are still saying, good!very good!
With `module encapsulation`, you don't need `enemies`! They are `enemies`!
Jun 11 2022
parent reply norm <norm.rowtree gmail.com> writes:
On Saturday, 11 June 2022 at 14:16:54 UTC, zjh wrote:
 On Saturday, 11 June 2022 at 13:58:57 UTC, zjh wrote:
 On Saturday, 11 June 2022 at 13:56:09 UTC, zjh wrote:

 And the `'d'` actually choose `module|file` as `encapsulation 
 unit`?
`Big step backwards`! But a lot of people are still saying, good!very good!
With `module encapsulation`, you don't need `enemies`! They are `enemies`!
In a practical sense, module-scope encapsulation works really well; I find it much better than strict class encapsulation, which then has to be broken anyway with friends. In fact I've not encountered one bug using module-scope encapsulation. None. I have had many bugs in my code, though, using friends in C++! It is fascinating the amount of energy in D forums spent talking about D's shortcomings; it reminds me of the internet in the late 90's/early 00's, where every forum chat ended in a flame war. All this energy would be much better spent making software with D or building up its ecosystem... but that requires work and is not as much fun as pointing out someone is wrong on the internet. The latter requires no work at all for the same physiological response in the brain as recognition by others for a job well done. I guess it is the nature of the beast: D is so flexible and very good at so many things that people are unwilling to accept that D by design just doesn't do everything the way they want.
Jun 11 2022
next sibling parent forkit <forkit gmail.com> writes:
On Sunday, 12 June 2022 at 00:41:20 UTC, norm wrote:
 In a practical sense module scope encapsulation works really 
 well, I find it much better than strict class encapsulation 
 that then has to be broken anyway with friends. In fact I've 
 not encountered one bug using module scope encapsulation. None. 
 I have had many bugs in my code though using friends in C++!

 It is fascinating the amount of energy in D forums spent 
 talking about D's shortcomings and reminds me of the internet 
 in the late 90's early 00's where every forum chat ended in a 
 flame war.

 All this energy would be much better spent making software with 
 D or building up its ecosystem....but that requires work and 
 not as much fun as pointing out someone is wrong on the 
 internet. The latter requires no work at all for the same 
 physiological response in the brain as recognition by others 
 for a job well done.

 I guess it is the nature of the beast, D is so flexible and 
 very good at so many things that people are unwilling to accept 
 that D by design just doesn't do everything the way they want.
Nice distraction, yet again ;-) This is not about 'strict' encapsulation. How many times do I have to say this.. jeessssee........ I just want 'the option' to hide my invariants from other code in the module, and have the compiler enforce those invariants at compile time. Really, it doesn't sound like such a big ask to me. Why so many are against this is puzzling, to say the least.
Jun 11 2022
prev sibling next sibling parent zjh <fqbqrr 163.com> writes:
On Sunday, 12 June 2022 at 00:41:20 UTC, norm wrote:
  D is so flexible and
 very good at so many things that people are unwilling to accept 
 that D by design just doesn't do everything the way they want.
Do you know why `c++` is `cool`? C++ provides `tools`. I can use them as I like. D `limits` me!
Jun 11 2022
prev sibling next sibling parent reply zjh <fqbqrr 163.com> writes:
On Sunday, 12 June 2022 at 00:41:20 UTC, norm wrote:

 In a practical sense module scope encapsulation works really 
 well, I find it much better than strict class encapsulation 
 that then has to be broken anyway with friends.
Unless it's `very special`, I don't need `c++'s` friend.
Jun 11 2022
parent reply zjh <fqbqrr 163.com> writes:
On Sunday, 12 June 2022 at 01:02:21 UTC, zjh wrote:

 Unless it's `very special`, I don't need `c++'s` friend.
But `c++` gives you `friend`. You can have it if you want. If you don't want it, forget it. D does `not`. D forces you to have `a bunch of` friends. They are not `friends`, they are `enemies`!
Jun 11 2022
parent zjh <fqbqrr 163.com> writes:
On Sunday, 12 June 2022 at 01:04:55 UTC, zjh wrote:

 D does `not`.
 D forces you to have `a bunch of` friends.
 They are not `friends`, they are `enemies`!
It's just like `several outsiders` in your family misusing your things without permission `every day`. Do you think they are your `friends`?
Jun 11 2022
prev sibling parent reply forkit <forkit gmail.com> writes:
On Sunday, 12 June 2022 at 00:41:20 UTC, norm wrote:
 ... It is fascinating the amount of energy in D forums spent 
 talking about D's shortcomings and reminds me of the internet 
 in the late 90's early 00's where every forum chat ended in a 
 flame war.

 All this energy would be much better spent making software with 
 D or building up its ecosystem....but that requires work and 
 not as much fun as pointing out someone is wrong on the 
 internet. The latter requires no work at all for the same 
 physiological response in the brain as recognition by others 
 for a job well done.
Actually, 'the latter' as you call it takes a LOT of work .. it's exhausting ;-) The 'flaming', as you call it, usually results from the passive-aggressive types always coming out of the woodwork... and they use all kinds of diversionary tactics to shut down your argument: https://medium.com/the-mission/5-tactics-used-by-passive-aggressive-arguers-and-the-best-forms-of-defense-42a9348b60ed My argument could not be simpler: Compiler, please enforce the invariants of my class at compile time, if that's what I ask you to do (with, for example, 'scope private int x'). It's a really, really, really, really simple proposition.
Jun 11 2022
parent reply Sergey <kornburn yandex.ru> writes:
On Sunday, 12 June 2022 at 01:03:06 UTC, forkit wrote:
 My argument could not be simpler:

 Compiler, please enforce the invariants of my class at compile 
 time, if that's what I ask you to do (with, for example, 'scope 
 private int x').

 It's a really, really, really, really, simple proposition.
Why not prepare a well-written DIP with examples from other languages and ask some experienced core developer to have a look at it?
Jun 12 2022
parent forkit <forkit gmail.com> writes:
On Sunday, 12 June 2022 at 11:11:32 UTC, Sergey wrote:
 On Sunday, 12 June 2022 at 01:03:06 UTC, forkit wrote:
 My argument could not be simpler:

 Compiler, please enforce the invariants of my class at compile 
 time, if that's what I ask you to do (with, for example, 
 'scope private int x').

 It's a really, really, really, really, simple proposition.
Why not prepare well written DIP with examples from other languages and to ask some experienced core developer to have a look at it?
Because the decision to reject it has already been made ;-) If I want a compiler that can enforce the invariants of a class against the code surrounding it, I'll have to look elsewhere. Which is probably what I'll end up doing (e.g. Swift).
Jun 12 2022
prev sibling parent reply Alexandru Ermicioi <alexandru.ermicioi gmail.com> writes:
On Saturday, 11 June 2022 at 11:24:04 UTC, forkit wrote:
 The module is not a type!

 I do not design my solution around the module. I design it 
 around types, and I use the module to bring related types 
 together.
Then don't use a module to bring together types that are related yet not tightly coupled. Use packages instead. It seems that your definition of a module more closely matches a D package than a D module. It is indeed a pain sometimes to have to create new modules, but you can argue that it is also a pain to skim through thousands of lines of code in a single module.
 Separate, unrelated types can go in their own module.

 In the event a type (and I'm talking about an instantiation of 
 a class here) is a standalone type.. sure.. it can go in its own 
 module, no problem. Makes sense even.
 
 But you don't have to put every int you define into its own 
 module, just so the compiler can enforce its invariants. That 
 would be crazy.. yes?
Why not? Why crazy? Usually a type (class/struct) is not just one or two lines, but 10-20 at least. Imho it is perfectly fine to put it in a single file. In Java, that is how it is done: one file, one public class.
 A class is just a type - why is it considered less of a type than 
 an int? (i.e. the only way to protect its invariants is to 
 put it all by itself in its own module.)

 What utter nonsense!
Obviously you're comparing apples with grapes. int is a built-in type and certainly not a class or struct, although making built-in types and user-defined ones follow mostly homogeneous rules would be beneficial.
 The module is not a type. You do not design your solution using 
 types.

 You do not instatiate a module, and start sending it messages, 
 or recieve messages from it, or change its behavior, or have it 
 change its own behavior.

 The module is a just convenient static encapsulation for 
 related types.
Well, why can't it be? You certainly can consider it a per-thread singleton, where global vars are its internal state and exported functions are the methods of the class (module). Certainly it doesn't have inheritance and other OOP aspects, but saying that a module is only a bunch of related types is not right. Better to say that they are a bunch of tightly coupled types, meaning that extracting one type into a separate module would most certainly require the other types in the module to be adjusted to the change (not by importing the new module, but through actual code changes). In summary, the current access modifiers don't make OOP programming in D in any way less powerful than any other language with comparable OOP features. Best regards, Alexandru.
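A minimal sketch of that 'module as per-thread singleton' view (names are hypothetical):

```
module counter;

// Module-level variables are thread-local by default in D,
// so this behaves like per-thread singleton state.
private int count;

// Exported functions play the role of the module's 'methods'.
void increment() { ++count; }
int  current()   { return count; }
```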
Jun 11 2022
parent forkit <forkit gmail.com> writes:
On Saturday, 11 June 2022 at 12:57:25 UTC, Alexandru Ermicioi 
wrote:
 On Saturday, 11 June 2022 at 11:24:04 UTC, forkit wrote:
 The module is not a type!

 I do not design my solution around the module. I design it 
 around types, and I use the module to bring related types 
 together.
Then don't use a module to bring together types that are related yet not tightly coupled. Use packages instead. It seems that your definition of a module more closely matches a D package than a D module.
I don't. So the suggestion is irrelevant. A package brings related modules together, not related types.
 It is indeed a pain sometimes to have to create new modules, but 
 you can argue that it is also a pain to skim through thousands of 
 lines of code in a single module.
Yes, exactly! One class per module, just to ensure the invariants of that class can be verified at compile time, is .. a pain!
 Separate, unrelated types can go in their own module.

 In the event a type (and I'm talking about an instantiation of 
 a class here) is a standalone type.. sure.. it can go in its own 
 module, no problem. Makes sense even.
 
 But you don't have to put every int you define into its own 
 module, just so the compiler can enforce its invariants. That 
 would be crazy.. yes?
Why not? Why crazy?
I don't get it. Put every int into its own module? Is that what you're saying here?
 Usually a type (class/struct) is not just one or two 
 lines, but 10-20 at least. Imho it is perfectly fine to put 
 it in a single file. In Java, that is how it is done: one file, 
 one public class.
Don't talk to me about Java ;-) The problem is NOT whether a type should go in its own module. The problem is being forced by the language to put a type in its own module, just so you can ensure its invariants. Please read the above lines again, so you know what my argument actually is.
 A class is just a type - why is it considered less of a type than 
 an int? (i.e. the only way to protect its invariants is to 
 put it all by itself in its own module.)

 What utter nonsense!
Obviously you're comparing apples with grapes. int is a built-in type and certainly not a class or struct, although making built-in types and user-defined ones follow mostly homogeneous rules would be beneficial.
No. You cannot put "wtf!" into an int. Its invariants are enforced (against code surrounding that int, in the same module). Not so with a class type.
 The module is not a type. You do not design your solution 
 using types.

 You do not instantiate a module and start sending it messages, 
 or receive messages from it, or change its behavior, or have 
 it change its own behavior.
 
 The module is just a convenient static encapsulation for 
 related types.
Well, why can't it be? You certainly can consider it a per-thread singleton, where global vars are its internal state and exported functions are the methods of the class (module). Certainly it doesn't have inheritance and other OOP aspects, but saying that a module is only a bunch of related types is not right. Better to say that they are a bunch of tightly coupled types, meaning that extracting one type into a separate module would most certainly require the other types in the module to be adjusted to the change (not by importing the new module, but through actual code changes).
Again, this is not an argument as to whether a type should go in its own module. This is an argument about the language forcing you to put a type (in this case a class type) into its own module, for the single purpose of enforcing the invariants of that type from surrounding code. Please read above lines again, so you know what my argument actually is.
 In summary, the current access modifiers, don't make oop 
 programming in D in any way less powerful, than any other 
 language with comparable oop features.

 Best regards,
 Alexandru.
I don't agree. If I cannot enforce the invariants of my class type against ANY code surrounding it (related or not), then that type is no longer a strong type, but a weak type - and this requires you to restructure your design to accommodate this .. flaw!
Jun 11 2022
prev sibling parent reply monkyyy <crazymonkyyy gmail.com> writes:
On Friday, 10 June 2022 at 20:59:38 UTC, mw wrote:
 I still feel puzzled:

 D is supposed to be a better OO language (read: encapsulation, 
 separation of concerns), and DMD is developed by a number of 
 highly capable very experienced D developers (read: not 
 ordinary programmers), how come DMD is in such a terrible state 
 as if it's done by some average Joel (above)?

 No offense, I am just puzzled by this software engineering myth.
It's an above-average compiler; it's just that the average language, on a scale of 1-10, is maybe a 2.
Jun 10 2022
next sibling parent Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Friday, 10 June 2022 at 23:13:22 UTC, monkyyy wrote:
 It's an above-average compiler; it's just that the average language, 
 on a scale of 1-10, is maybe a 2.
Unrelated, you can implement a good model in a spartan language like C and end up with high quality on every level. It takes more discipline, that's all. Few people are good at designing architectures. Most good architectures build on the experience from other architectures: you need to study the work of others before you write a single line of code. Or write many compilers with different strategies. If you go DIY and focus heavily on performance, then the architecture will not be able to take on novel features without a significant toll on the code base. At some point you'll be deadlocked into a backlog of bugs to fix.
Jun 10 2022
prev sibling parent reply "H. S. Teoh" <hsteoh qfbox.info> writes:
On Fri, Jun 10, 2022 at 11:13:22PM +0000, monkyyy via Digitalmars-d wrote:
 On Friday, 10 June 2022 at 20:59:38 UTC, mw wrote:
 I still feel puzzled:
 
 D is supposed to be a better OO language (read: encapsulation,
 separation of concerns), and DMD is developed by a number of highly
 capable very experienced D developers (read: not ordinary
 programmers), how come DMD is in such a terrible state as if it's
 done by some average Joel (above)?
 
 No offense, I am just puzzled by this software engineering myth.
 It's an above-average compiler; it's just that the average language, on a scale of 1-10, is maybe a 2.
Wow, you're optimistic. After having worked with "enterprise" code for the last how-many decades, I would rate 99% of non-trivial codebases out there somewhere between 0 and 1 on a scale of 1-10. If you think dmd is bad, wait till you see the code of an "enterprise" compiler for an "enterprise" language. I guarantee you, you will need therapy afterwards. :-P To this day I still suffer PTSD from having once attempted to read glibc source code... T -- Give a man a fish, and he eats once. Teach a man to fish, and he will sit forever.
Jun 10 2022
next sibling parent reply monkyyy <crazymonkyyy gmail.com> writes:
On Friday, 10 June 2022 at 23:41:58 UTC, H. S. Teoh wrote:

 After having worked with "enterprise" code for the last how-many decades, I would rate 99% of non-trivial codebases out there somewhere between 0 and 1 on a scale of 1-10.

 If you think dmd is bad, wait till you see the code of an 
 "enterprise" compiler for an "enterprise" language. I guarantee 
 you, you will need therapy afterwards. :-P  To this day I still 
 suffer PTSD from having once attempted to read glibc source 
 code...


 T
I feel a lot of language dev is academic, and enterprise-grade code is worse than academia; also averages that include extremes on a linear scale bias upward
Jun 10 2022
parent reply "H. S. Teoh" <hsteoh qfbox.info> writes:
On Sat, Jun 11, 2022 at 12:08:26AM +0000, monkyyy via Digitalmars-d wrote:
 On Friday, 10 June 2022 at 23:41:58 UTC, H. S. Teoh wrote:
 [...] I would rate 99% of non-trivial codebases out there somewhere
 between 0 and 1 on a scale of 1-10.
 
 If you think dmd is bad, wait till you see the code of an
 "enterprise" compiler for an "enterprise" language. I guarantee you,
 you will need therapy afterwards. :-P  To this day I still suffer
 PTSD from having once attempted to read glibc source code...
[...]
 I feel a lot of language dev is academic, and enterprise-grade code is worse than academia; also averages that include extremes on a linear scale bias upward
IMO, in spite of all its flaws and dark corners, D strikes a good balance between "academic" design and pragmatic enterprise-style code. T -- Why do conspiracy theories always come from the same people??
Jun 10 2022
next sibling parent forkit <forkit gmail.com> writes:
On Saturday, 11 June 2022 at 00:33:27 UTC, H. S. Teoh wrote:
 ..
 IMO, in spite of all its flaws and dark corners, D strikes a 
 good balance between "academic" design and pragmatic 
 enterprise-style code.


 T
I'm not necessarily rejecting your assertion, but can you elaborate, please? A good balance of what? And for what purpose?
Jun 10 2022
prev sibling parent monkyyy <crazymonkyyy gmail.com> writes:
On Saturday, 11 June 2022 at 00:33:27 UTC, H. S. Teoh wrote:
 IMO, in spite of all its flaws and dark corners, D strikes a 
 good balance between "academic" design and pragmatic 
 enterprise-style code.


 T
Why, that's quite an insult; neither academia nor enterprise-grade OO has any value.
Jun 10 2022
prev sibling parent Max Samukha <maxsamukha gmail.com> writes:
On Friday, 10 June 2022 at 23:41:58 UTC, H. S. Teoh wrote:

 Wow, you're optimistic.  After having worked with "enterprise" 
 code for the last how-many decades, I would rate 99% of 
 non-trivial codebases out there somewhere between 0 and 1 on a 
 scale of 1-10.

 If you think dmd is bad, wait till you see the code of an 
 "enterprise" compiler for an "enterprise" language. I guarantee 
 you, you will need therapy afterwards. :-P  To this day I still 
 suffer PTSD from having once attempted to read glibc source 
 code...
And at the same time, it looks like the D leadership reveres "the industry" as a source of "best industry practices" we should adopt.
Jun 11 2022
prev sibling parent Walter Bright <newshound2 digitalmars.com> writes:
On 6/10/2022 12:52 PM, max haughton wrote:
 No it really is bad. Some newer areas are ok but the quality of the code is 
 overall just bad, relies on enormous amounts of mutability, doesn't have a 
 proper opinion about how to resolve symbols (it only has 3 passes), tries to 
 make decisions before properly analyzing the problem, etc.
RE the symbol lookup: before you joined us, the symbol table lookup was simple and straightforward. But everyone complained that it was "unintuitive". Lookup in scopes, classes, and imports all worked exactly the same way. I spent countless emails explaining how lookup all worked the same way. Not one person said they understood it; it was always "it makes no sense". So it was changed to a 2-pass lookup. Everyone liked that, and said it was intuitive. The 3rd pass was added to enable legacy compatibility. So there you have it, 3 passes. It's a classic case of something that is simple and straightforward from a math point of view being hopelessly unintuitive for humans.

As for mutability, this stems from two things:

1. Using the Visitor pattern, which precludes using const and is excessively complex (in my opinion). But as you know, Dennis and I have been gradually replacing it with nested functions.

2. In order to support forward references, the semantic routines are lazy to a large extent. Lazy means that when information is asked for, it often needs to be computed if it hasn't been yet.
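(A user-level sketch of the forward-reference point, for readers unfamiliar with it: the compiler cannot finish the semantics of `front` until it has computed the signature of `later`, which appears further down the file, hence the on-demand computation.)

```d
// `later` is referenced before its declaration; its signature is
// computed on demand rather than in strict lexical order.
int front() { return later() + 1; }
int later() { return 41; }

void main()
{
    assert(front() == 42);
}
```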
 The compiler is mostly reasonable semantically because D is a conventional language, but several key parts of the logic are either extremely old messy bits of code that basically cannot be easily changed, or types with a very sloppy heritage that lead to an explosion of edge cases all over the place: Array, Struct, and Int32 are all considered to be the same type of thing according to the enum at the heart of the class that represents types,
??? isTypeArray and isTypeStruct are distinct types
 it's ad-hoc "eh just 
 ship it code" that almost no one can be bothered to fix because they've either 
 been scared off from working on the compiler because of aforementioned warts
or 
 because they've tried to divide and conquer the cleanup efforts and been told
no.
Doing cleanup is hard, though you've seen many PRs from me that move in that direction.
 Probably 40% of the bug fixes of the kind you posit are *because* of the 
 frontend being unreliable.
I disagree. It's mostly because of unexpected interactions between features, or cases nobody thought of. BTW, I hate the front-end inliner, and am trying to replace it with a far simpler backend one. Success there means eliminating nearly 3000 lines of ugly code. But I'm having some problems with the test suite, as it is failing in ways I cannot reproduce locally. Any help in getting closer to what causes the failures would be most helpful: https://github.com/dlang/dmd/pull/14194
Jun 11 2022
prev sibling parent reply deadalnix <deadalnix gmail.com> writes:
On Friday, 10 June 2022 at 19:37:37 UTC, H. S. Teoh wrote:
 On Fri, Jun 10, 2022 at 07:22:51PM +0000, mw via Digitalmars-d 
 wrote: [...]
 How come the DMD frontend is in such terrible state?
Because: https://www.joelonsoftware.com/2000/04/06/things-you-should-never-do-part-i/ Selected quotes: [...] you can ask almost any programmer today about the code they are working on. “It’s a big hairy mess,” they will tell you. “I’d like nothing better than to throw it out and start over.”
Joel is assuming that typical refactoring and maintenance is happening to keep the complexity under control. This hasn't happened in DMD at all. Joel is right that devs are usually too quick to pull that card. However, this case is slightly different.
Jun 10 2022
next sibling parent Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Friday, 10 June 2022 at 20:29:52 UTC, deadalnix wrote:
 Joel is right that dev are usually too prompt to pull that 
 card. However, this case is slightly different.
Nah, he is wrong. Why do people care so much about bloggers? Software usually needs a complete rewrite after decades of adding new features the original architecture did not foresee. It doesn't look like crap because of bugfixes; if that were so, then the original design and implementation were beyond saving anyway. The reality is that you have a better understanding the second and third time and can come up with a better model, plus hardware improvements (like more RAM) allow adoption of better models.
Jun 10 2022
prev sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 6/10/2022 1:29 PM, deadalnix wrote:
 Joel is assuming that the typical refactoring and maintenance is happening to 
 keep the complexity under control. This hasn't happened in DMD at all.
Never mind the refactorings I regularly do to it.
Jun 11 2022
parent Hipreme <msnmancini hotmail.com> writes:
On Saturday, 11 June 2022 at 20:16:58 UTC, Walter Bright wrote:
 On 6/10/2022 1:29 PM, deadalnix wrote:
 Joel is assuming that the typical refactoring and maintenance 
 is happening to keep the complexity under control. This hasn't 
 happened in DMD at all.
Never mind the refactorings I regularly do to it.
Well, maybe someday we'll have some kind of D compilation server like modern languages have today, making recompilations a lot faster, and actually modularize DMD enough to make it work as a lib, so maybe a lot of people would not have the kind of hard work they have today. Some people have attempted to do it but... now they're simply gone.
Jun 11 2022
prev sibling parent Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Friday, 29 April 2022 at 07:56:15 UTC, bauss wrote:
 No matter how good your library solutions are, you can never implement async/await in a clear fashion without the compiler emitting a state machine for it.
Seems to me that coroutine support is sufficient?
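For context, D's runtime already ships stackful coroutines as `core.thread.Fiber`; here is a minimal sketch of the suspend/resume flow (not an async/await replacement, just the underlying primitive):

```d
import core.thread : Fiber;
import std.stdio : writeln;

void main()
{
    // A fiber runs until it yields; the caller resumes it later.
    auto task = new Fiber({
        writeln("step 1");
        Fiber.yield();      // suspend; control returns to main
        writeln("step 2");
    });

    task.call();            // prints "step 1", then suspends
    writeln("back in main");
    task.call();            // resumes after yield, prints "step 2"
}
```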
Apr 30 2022
prev sibling parent Walter Bright <newshound2 digitalmars.com> writes:
On 4/28/2022 5:33 AM, SealabJaster wrote:
 It's not even about the amount of keystrokes like many here claim, it's about it being a joy for me to read and write.
Fortunately, bitfields are now in the language, although they're behind a -preview=bitfields switch. Builtin bitfields are indeed all about syntax sugar. I agree with you that readability is more important than keystrokes.
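For illustration, a minimal sketch of the preview syntax (compile with `-preview=bitfields`; struct and field names hypothetical):

```d
struct Flags
{
    // C-style bitfields: each field occupies the stated number of bits,
    // packed into the underlying uint with C-compatible layout.
    uint ready   : 1;
    uint mode    : 3;
    uint counter : 4;
}

void main()
{
    Flags f;
    f.mode = 5;          // 5 fits in 3 bits
    assert(f.mode == 5);
}
```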
Apr 29 2022
prev sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 4/28/2022 5:04 AM, bauss wrote:
 * async/await
 * Properties that actually work and also with a really good syntax IMHO 
 (getters/setters)
 * string interpolation
 * Shortened methods (and properties.) - much better than the proposed version for D
 * nullability built-in, e.g. object?.method(), as well as null-coalescing
 * pattern matching / switch expressions
 * out parameters that can be declared directly in the method call
 * built-in tuples, as well as auto-expanding them to variables etc.
It's a good list. I proposed a straightforward string interpolation method, but it received no traction. I don't know what you're driving at about properties not working in D. I use them all the time.
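For reference, a minimal sketch of the getter/setter form being referred to (names hypothetical):

```d
struct Temperature
{
    private double _celsius = 0;

    @property double celsius() const { return _celsius; }      // getter
    @property void celsius(double value) { _celsius = value; } // setter
}

void main()
{
    Temperature t;
    t.celsius = 21.5;          // invokes the setter
    assert(t.celsius == 21.5); // invokes the getter
}
```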
Apr 29 2022
parent Adam D Ruppe <destructionator gmail.com> writes:
On Friday, 29 April 2022 at 15:31:05 UTC, Walter Bright wrote:
 Though I proposed a straightforward string interpolation 
 method, but it received no traction.
A few modifications to your proposal were, and still are, plenty workable; it's just that the DIP process is extremely draining.
Apr 29 2022
prev sibling parent reply Adrian Matoga <dlang.spam matoga.info> writes:
On Thursday, 28 April 2022 at 07:54:44 UTC, Ola Fosheim Grøstad 
wrote:
 On Wednesday, 27 April 2022 at 22:43:25 UTC, Adrian Matoga 
 wrote:
 of like it) at work. I've recently returned to tinkering with 
 electronics and programming at home so let me share my view.
Do you use or plan to use microcontrollers? If so, with what language?
I do, mostly STM32s. Based on what Adam and Mike had shared, it wasn't hard to get started, but the lack of a readily usable HAL or RTOS was heavily distracting from actual work, pushing me towards constructing some basic framework instead. Still, all the compile-time stuff D has is very useful in that environment, and so is e.g. the scope(exit)/scope(failure) feature, which makes resource cleanup much less confusing without the need to write any wrappers (see the sketch below). Currently I'm working on an RPi Hat where I had to add some drivers to Linux, where anything other than C won't work, but I have the userspace app on top of it, which is all written in D, and it's way more convenient for me to develop than e.g. in C++, even though I had to manually add a lot of bindings. Plus, I have had a very pleasant experience with GDC on RPi since the very moment I got my first Pi around 2013. LDC works very well too, but GDC is easier to get OOTB both in native Raspbian and in buildroot. Iain's work is a true game changer.
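For readers unfamiliar with the feature just mentioned, a minimal sketch of scope(exit)/scope(failure) cleanup (file name hypothetical):

```d
import core.stdc.stdio : FILE, fopen, fclose, printf;

void process(const(char)* path)
{
    FILE* f = fopen(path, "rb");
    if (f is null)
        return;
    scope(exit) fclose(f);  // runs however this scope is left
    scope(failure) printf("failed while reading %s\n", path); // only if an exception escapes

    // ... read from f; every early return or throw still closes the file
}

void main()
{
    process("data.bin");
}
```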
 technology or even non-technology related idea too. Python 
 became the default language for ML, because it was easy enough 
 for people whose main focus wasn't programming, and who didn't 
 require system level performance because available bindings to 
 C libraries were good enough.
Yes, but I think this also has something to do with Python replacing Matlab in academic research institutions. Python is becoming the default platform for analysis and experimentation.
Right! I studied CS at a physics department and many teachers were either nuclear or solid-state physicists, so we did a lot of Matlab, and Python was only about to enter the field. ROOT was also used in some projects, but I could never wrap my head around its weirdness.
 What D tried to do was to be "better C++" or "better C", but 
 in 2022 it's about 40 years too late to be successful in that. 
 There're millions of programs in C and C++ that have been good 
 enough to make revenue for many companies and thus convinced 
 others to invest more money, effort and time in more stuff 
 that depends on C and C++.
Yes, and they are ISO standards, so nobody "owns" C or C++. That creates a more "open" evolution that is industry-oriented (the main purpose of ISO is to make industrial tools and interfaces interoperable).
Yup, standardization may play a big role in adoption too. We've worked with customers who strongly insisted on sticking to OpenCL C with no extensions, rather than switching to CUDA or any vendor-specific extensions to OCL, both to keep their hands clean in terms of safety and to avoid vendor lock-in, even though that meant worse performance and fatter hardware bills.
 do something beyond those. I recall I had some good experience with C#: it was easy to reuse existing libraries and implement any new code, especially with pretty convenient tooling from MS, but that was a long time ago, when it wasn't seriously usable outside Windows and I didn't have much interest in developing for Windows later.
 Was it the suggestions in the IDE, or was it something that has to do with the language itself?
The language itself (C# in the times of D1) was very similar in look and feel and expressiveness to D, and they felt equally convenient for me. And coming from a {} mindset, both were easy to pick up; the auto-completion and interactive debugging were mind-blowing. It made coding pretty effortless. Nowadays a properly configured IDE for C, C++ or D can be just as good in terms of completions. As for debugging, I've been working for some time in projects where languages from Python to assembly are mixed with JSONs and YAMLs that control various stages of code generation, with compilers and even HW specs still under development, so I've learned not to expect anything more than print-, memory dump-, logic analyzer- or even guess-based debugging.
 What I've missed the most so far in D was a zero-effort reuse 
 of C libraries, because there's a lot of useful libs in C I 
 already know.
Yes, has the new import-C feature been helpful for you in that regard?
Not yet, as it's not in GDC yet. I expect it to be a huge win for system-level headers, like all the ioctls, V4L2, alsa etc. I'd really feel safer ABI-wise if they were parsed right from the same sysroot as the rest of the system.
 Of course it's much less tedious to interface C in D than in 
 Python, but I bet if D had a fully-blown ImportC from the very 
 beginning, it could be where C++ is today.
When compared to C++, I'd say that D still needs to get its memory management story right and fix some language short-coming (incomplete features), but memory management is at least being looked at actively now. (People expect something more than malloc/free and roll-your-own ref-counting.)
Right, C++ has been developing faster for the last decade and has gotten much more support from industry, and there are a lot of clever freaks working on it. Still, many things that should be easy and straightforward are made overly complex, for genericity or low-level control or whatever reasons. In that regard I prefer D, where I can prototype fast and gradually add lower-level optimizations as needed. The latter is also where I find Python very cumbersome compared to D.
Apr 28 2022
parent reply Iain Buclaw <ibuclaw gdcproject.org> writes:
On Thursday, 28 April 2022 at 15:28:40 UTC, Adrian Matoga wrote:
 On Thursday, 28 April 2022 at 07:54:44 UTC, Ola Fosheim Grøstad 
 wrote:
 On Wednesday, 27 April 2022 at 22:43:25 UTC, Adrian Matoga 
 wrote:
 What I've missed the most so far in D was a zero-effort reuse 
 of C libraries, because there's a lot of useful libs in C I 
 already know.
Yes, has the new import-C feature been helpful for you in that regard?
 Not yet, as it's not in GDC yet. I expect it to be a huge win for system-level headers, like all the ioctls, V4L2, alsa etc. I'd really feel safer ABI-wise if they were parsed right from the same sysroot as the rest of the system.
GCC-12 has been branched and is currently in prerelease phase, with an RC to be made on Monday if all is going to schedule. GDC in there is currently in sync with the current DMD HEAD on the stable branch (tag: `v2.100.0-beta.1`, dmd `313d28b3d`, druntime `e361d200`, phobos `ac296f80c`). If you can build it, feel free to send bug reports. :-)
Apr 28 2022
next sibling parent M.M. <matus email.cz> writes:
On Thursday, 28 April 2022 at 19:02:39 UTC, Iain Buclaw wrote:
 On Thursday, 28 April 2022 at 15:28:40 UTC, Adrian Matoga wrote:
 On Thursday, 28 April 2022 at 07:54:44 UTC, Ola Fosheim 
 Grøstad wrote:
 [...]
 Not yet, as it's not in GDC yet. I expect it to be a huge win for system-level headers, like all the ioctls, V4L2, alsa etc. I'd really feel safer ABI-wise if they were parsed right from the same sysroot as the rest of the system.
 GCC-12 has been branched and is currently in prerelease phase, with an RC to be made on Monday if all is going to schedule. GDC in there is currently in sync with the current DMD HEAD on the stable branch (tag: `v2.100.0-beta.1`, dmd `313d28b3d`, druntime `e361d200`, phobos `ac296f80c`). If you can build it, feel free to send bug reports. :-)
wow wow
Apr 28 2022
prev sibling parent Walter Bright <newshound2 digitalmars.com> writes:
On 4/28/2022 12:02 PM, Iain Buclaw wrote:
 GCC-12 has been branched and is currently in prerelease phase, with an RC to be made on Monday if all is going to schedule.

 GDC in there is currently in sync with the current DMD HEAD on the stable branch (tag: `v2.100.0-beta.1`, dmd `313d28b3d`, druntime `e361d200`, phobos `ac296f80c`).
Great news!
Apr 29 2022
prev sibling parent reply tastyminerals <tastyminerals gmail.com> writes:
On Wednesday, 27 April 2022 at 22:43:25 UTC, Adrian Matoga wrote:
 On Tuesday, 2 November 2021 at 18:01:37 UTC, Ola Fosheim 
 Grøstad wrote:
 [...]
 While I haven't been active in the D community for something like five years already, it wasn't because I switched to another PL, but mostly due to some disturbances in my personal life that made me shift my spare-time activities from programming to anything from politics to gardening and woodworking, while still trying to advocate for D or at least write all my single-use tools in it (I learned that woodworkers call such stuff jigs and I sort of like it) at work. I've recently returned to tinkering with electronics and programming at home so let me share my view. [...]
This. Very well said. And I would love D to just `import awesomeClib;`.
May 06 2022
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 5/6/2022 11:59 AM, tastyminerals wrote:
 And I would love D to just `import awesomeClib;`.
That's very much what ImportC is all about.
May 07 2022
parent reply tastyminerals <tastyminerals gmail.com> writes:
On Saturday, 7 May 2022 at 07:00:52 UTC, Walter Bright wrote:
 On 5/6/2022 11:59 AM, tastyminerals wrote:
 And I would love D to just `import awesomeClib;`.
That's very much what ImportC is all about.
That is great indeed! I've just skimmed over the ImportC article. Unfortunately there are no code examples showing how to do it; it looks incomplete. Luckily, I found a D forum question about ImportC where people explain that all you need to do is import a C file into your D code and it should work. Why a simple code example is missing from the article brings me to the topic of this thread. I wish D could "market" and explain its awesome features the way other, more popular but (in my opinion) less feature-rich languages do: with concrete code examples and easy-to-read text.
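For the record, a minimal sketch of what such an example could look like (file and function names hypothetical):

```d
// app.d  (compile with: dmd app.d awesomeclib.c)
//
// awesomeclib.c contains plain C, say:
//     int triple(int x) { return 3 * x; }

import awesomeclib; // ImportC compiles the .c file and exposes its symbols

void main()
{
    assert(triple(14) == 42);
}
```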
May 14 2022
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 5/14/2022 9:13 AM, tastyminerals wrote:
 Why a simple code example is missing from the article brings me to the topic of this thread.
Which ImportC article? Or did you mean the documentation? We can improve it.
May 14 2022
parent reply electricface <electricface qq.com> writes:
On Saturday, 14 May 2022 at 17:25:51 UTC, Walter Bright wrote:
 On 5/14/2022 9:13 AM, tastyminerals wrote:
 Why a simple code example is missing from the article brings 
 me to the topic of this thread.
Which ImportC article? Or did you mean the documentation? We can improve it.
Why is D unpopular? I think the current language is good enough, but the IDE experience for programmers is not good enough - not smart enough to read and refactor code easily.
May 14 2022
parent reply electricface <electricface qq.com> writes:
On Sunday, 15 May 2022 at 00:55:32 UTC, electricface wrote:
 On Saturday, 14 May 2022 at 17:25:51 UTC, Walter Bright wrote:
 On 5/14/2022 9:13 AM, tastyminerals wrote:
 Why a simple code example is missing from the article brings 
 me to the topic of this thread.
Which ImportC article? Or did you mean the documentation? We can improve it.
 Why is D unpopular? I think the current language is good enough, but the IDE experience for programmers is not good enough - not smart enough to read and refactor code easily.
I think it might be better to use a part of the D compiler front-end as a language server, preferably to support all the good features of the language.
May 14 2022
parent reply StarCanopy <starcanopy protonmail.com> writes:
On Sunday, 15 May 2022 at 01:18:13 UTC, electricface wrote:
 I think it might be better to use a part of the D compiler 
 front-end as a language server, preferably to support all the 
 good features of the language.
Good news, [SDC](https://github.com/snazzy-d/SDC) has been revived, and the project's objectives revised.
May 14 2022
parent electricface <electricface qq.com> writes:
On Sunday, 15 May 2022 at 03:21:18 UTC, StarCanopy wrote:
 On Sunday, 15 May 2022 at 01:18:13 UTC, electricface wrote:
 I think it might be better to use a part of the D compiler 
 front-end as a language server, preferably to support all the 
 good features of the language.
Good news, [SDC](https://github.com/snazzy-d/SDC) has been revived, and the project's objectives revised.
This is really good news. If it succeeds, it will greatly improve the programming experience, and it will make me prefer the D language.
May 14 2022
prev sibling next sibling parent reply Imperatorn <johan_forsberg_86 hotmail.com> writes:
On Tuesday, 2 November 2021 at 17:27:25 UTC, Dr Machine Code 
wrote:
 It got [asked on the reddit sub](https://www.reddit.com/r/d_language/comments/q74bzr/why_is_d_unpopular/), but for those who aren't active there too, I'd like your opinions. Please don't get me wrong, I also love D; I've used it everywhere I can and I'd say it's my favourite language (yes, I have one...), but like the reddit OP, I'm trying to understand why it's unpopular. Rust and Go seem to be getting more and more users. I think it's due to their large ecosystems and the big corporations with deep pockets that push them. But I'd like to know all your opinions.
Have asked this question for a long time now. Still don't know why.
Nov 02 2021
next sibling parent Martin <martin.brzenska googlemail.com> writes:
On Tuesday, 2 November 2021 at 18:08:53 UTC, Imperatorn wrote:
 On Tuesday, 2 November 2021 at 17:27:25 UTC, Dr Machine Code 
 wrote:
 It got [asked on the reddit sub](https://www.reddit.com/r/d_language/comments/q74bzr/why_is_d_unpopular/), but for those who aren't active there too, I'd like your opinions. Please don't get me wrong, I also love D; I've used it everywhere I can and I'd say it's my favourite language (yes, I have one...), but like the reddit OP, I'm trying to understand why it's unpopular. Rust and Go seem to be getting more and more users. I think it's due to their large ecosystems and the big corporations with deep pockets that push them. But I'd like to know all your opinions.
 Have asked this question for a long time now. Still don't know why.
Strange, I didn't ask this question, but I have seen a lot of answers here in the forums
Nov 02 2021
prev sibling parent reply Paulo Pinto <pjmlp progtools.org> writes:
On Tuesday, 2 November 2021 at 18:08:53 UTC, Imperatorn wrote:
 On Tuesday, 2 November 2021 at 17:27:25 UTC, Dr Machine Code 
 wrote:
 It got [asked on the reddit sub](https://www.reddit.com/r/d_language/comments/q74bzr/why_is_d_unpopular/), but for those who aren't active there too, I'd like your opinions. Please don't get me wrong, I also love D; I've used it everywhere I can and I'd say it's my favourite language (yes, I have one...), but like the reddit OP, I'm trying to understand why it's unpopular. Rust and Go seem to be getting more and more users. I think it's due to their large ecosystems and the big corporations with deep pockets that push them. But I'd like to know all your opinions.
 Have asked this question for a long time now. Still don't know why.
Mainstream languages have been adopting the features that D had as an edge over them. While those improvements might be clunky compared to how D has them, they allow for good enough usage while keeping the large ecosystem of libraries, IDEs and commercial support from third-party vendors. Meanwhile Swift, Go and Rust managed to gather steam either by having a benevolent corporate sponsor or by having a proper goal of what they want to achieve. D's main target seems to keep changing: every year there is the feature that will finally get everyone to come to D, then a year goes by and another feature goal replaces that one. The latest example seems to be the ongoing Phobos reboot, with lifetime, ImportC, nogc, scope, shared, dynamic loading across all OSes, WASM, .... still not fully there.
Nov 02 2021
next sibling parent reply Imperatorn <johan_forsberg_86 hotmail.com> writes:
On Tuesday, 2 November 2021 at 18:57:32 UTC, Paulo Pinto wrote:
 On Tuesday, 2 November 2021 at 18:08:53 UTC, Imperatorn wrote:
 [...]
 Mainstream languages have been adopting the features that D had as an edge over them. [...]
Well, yeah. But, if D has got something for everyone, why hasn't it got more users? 🤔
Nov 02 2021
parent reply SealabJaster <sealabjaster gmail.com> writes:
On Tuesday, 2 November 2021 at 19:02:52 UTC, Imperatorn wrote:
 Well, yeah. But, if D has got something for everyone, why 
 hasn't it got more users? 🤔
It's easier to just go with another language with:

* Better ecosystem
* Libraries with actual documentation
* Popular/core libraries tend to have tutorials
* Standard ways for logging, DB connection, etc
* A VSCode plugin/IDE that's actually capable of handling the language properly
* Doesn't require you to often dive into spending ages getting a random C library to compile
* Doesn't require you to write C code alongside the native language's code
* Doesn't constantly undergo an identity crisis to chase the next fad

Starting from scratch is an excellent experience, since the language itself is amazing. But getting stuff done is simply easier to do with other languages. Friction. D has plenty to spare.
Nov 02 2021
parent reply Imperatorn <johan_forsberg_86 hotmail.com> writes:
On Tuesday, 2 November 2021 at 19:13:38 UTC, SealabJaster wrote:
 On Tuesday, 2 November 2021 at 19:02:52 UTC, Imperatorn wrote:
 Well, yeah. But, if D has got something for everyone, why 
 hasn't it got more users? 🤔
It's easier to just go with another language with: * Better ecosystem
Agreed
 * Libraries with actual documentation
Could be better yes
 * Popular/core libraries tend to have tutorials
True
 * Standard ways for logging, DB connection, etc
Also true. There's a pattern to the above: are we afraid of standardizing stuff in D?
 * A VSCode plugin/IDE that's actually capable of handling the 
 language properly
code-d and Visual D in conjunction work well enough for me, but it could be better.
 * Doesn't require you to often dive into spending ages getting 
 a random C library to compile
I don't quite get what you mean here
 * Doesn't require you to write C code alongside the native 
 language's code
Or here
 * Doesn't constantly undergo an identity crisis to chase the 
 next fad
Yes we need to make some wise decisions!
 Starting from scratch is an excellent experience, since the 
 language itself is amazing.
Agreed :)
 But getting stuff done is simply easier to do with other 
 languages.
Hmm, I would agree this is true only for the bigger languages. But I guess that's what you mean
 Friction. D has plenty to spare.
It has a learning curve and edge cases, yes. I guess that's the price you pay for not having standard ways of doing some stuff...
Nov 02 2021
parent SealabJaster <sealabjaster gmail.com> writes:
On Tuesday, 2 November 2021 at 19:42:45 UTC, Imperatorn wrote:
 Pattern for the above, are we afraid of standardizing stuff in 
 D?
We seem to be afraid of doing anything that doesn't work in >>insert fringe edge case the language designers care about<<
 I don't quite get what you mean here
Yeah, that was a bit vague. What I was getting at is that some D libraries depend on some C library, but expect you to build said C library yourself, which is sometimes very straightforward, but sometimes a nightmare. And then sometimes you have to go to a C library since D doesn't have a (functioning) package you need, so you end up writing C-style code alongside your normal D code.
Nov 03 2021
prev sibling next sibling parent reply bachmeier <no spam.net> writes:
On Tuesday, 2 November 2021 at 18:57:32 UTC, Paulo Pinto wrote:
 On Tuesday, 2 November 2021 at 18:08:53 UTC, Imperatorn wrote:
 On Tuesday, 2 November 2021 at 17:27:25 UTC, Dr Machine Code 
 wrote:
 It got [asked on the reddit sub](https://www.reddit.com/r/d_language/comments/q74bzr/why_is_d_unpopular/), but for those who aren't active there too, I'd like your opinions. Please don't get me wrong, I also love D; I've used it everywhere I can and I'd say it's my favourite language (yes, I have one...), but like the reddit OP, I'm trying to understand why it's unpopular. Rust and Go seem to be getting more and more users. I think it's due to their large ecosystems and the big corporations with deep pockets that push them. But I'd like to know all your opinions.
 Have asked this question for a long time now. Still don't know why.
 Mainstream languages have been adopting the features that D had as an edge over them.
I suspect an even bigger factor is the proliferation of good languages in the last 20 years. Someone wanting to have fun has plenty of options. Someone wanting to replace C for things that need to be much faster than a scripting language has Rust, Go, Nim, and others. The fact that some of those languages are good for compiling to Javascript is yet another reason to not use D.
Nov 02 2021
parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Tuesday, 2 November 2021 at 19:10:08 UTC, bachmeier wrote:
 Someone wanting to replace C for things that need to be much 
 faster than a scripting language have Rust, Go, Nim, and 
 others. The fact that some of those languages are good for 
 compiling to Javascript is yet another reason to not use D.
Not so sure about Javascript, but for people looking for a niche programming language there certainly are many options. Although I feel the primary reason is that D is without direction. The original focus was to be a cleaned up C++, without the complicating bits, but D2 gave it roughly the same disadvantages that C++ has without the advantages C++ has. That in combination with some odd choices (on multiple levels) for system level programming sends a signal of no clear vision. I think it is very important for small niche languages to send a very strong and clear signal of what the vision is, like Rust did. Because people don't pick them up for what they are capable of today, but for where they are heading. They want to join the ride, be one of the early adopters, gold rush, innovators, etc. That's how you get hyped up… The current vision appears to be to entertain those that find the current feature set attractive, but that is a very foggy picture. It is very difficult for outsiders to figure out where D is headed from such an unclear vision.
Nov 02 2021
parent Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Tuesday, 2 November 2021 at 19:21:36 UTC, Ola Fosheim Grøstad 
wrote:
 did. Because people don't pick them up for what they are 
 capable of today, but for where they are heading. They want to 
 join the ride, be one of the early adopters, gold rush, 
 innovators, etc. That's how you get hyped up…
Another point that is easy to miss is that the hardcore geeks are very important for building the ecosystem. The «hacker» type. They sometimes feel at home with niche languages as they don't mind being outsiders, or maybe they hold «anti establishment»/«anarchy» views. Then the values of the community are just as important as the language (cf. Lisp). So, there are no easy answers, really. Attraction and rejection happen on multiple levels.
Nov 02 2021
prev sibling parent reply Guillaume Piolat <first.last gmail.com> writes:
On Tuesday, 2 November 2021 at 18:57:32 UTC, Paulo Pinto wrote:
 Meanwhile Swift, Go and Rust, managed to gather steam either by 
 having a benolovent corporate sponsor or by having a proper 
 goal of what they want to achieve.
On day one of their "introduction" (these languages were started way before their public release) Rust, Go and Swift had a marketing narrative that was both simple and advertised to the public, with money. These narratives play on people's aspirations, by telling them what they want to hear _about themselves_.

Go: "You're going to Google scale."

Rust: "We nailed security. You care about security, right? Also you're going to be one of those badass system programmers where things go fast."

Swift: "Well, you have no choice. This is the future (if you want to sell something to Apple users)."

I think they not quite subtly played on FOMO. I think D also has a narrative actually:

D: "You too can be financially independent by crushing the competition with D and hard work. More power than C++."

hence "Do It In D".

The problem is that the (Go/Rust/Swift) stories speak to more people, like people in managerial positions. D's understated motto - and that's just an opinion - is fundamentally an appeal to far fewer people; I'd like to elaborate more but probably I'm just speculating without substance. I don't think features or bugs ever entered the picture! Or only to support or contradict the main narrative.
Nov 03 2021
next sibling parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Wednesday, 3 November 2021 at 13:25:18 UTC, Guillaume Piolat 
wrote:
 I think they not quite subtly played on FOMO.
 I think D also has a narrative actually:

 D: "You too can be financially independent by crushing the 
 competition with D and hard work. More power than C++."

 hence "Do It In D".
Does FOMO mean "fear of missing out"? Maybe, but I think they also played up to a dissatisfaction with system level programming being tedious, and alluding to the power of system level programming to programmers that had not gone there before. (D does too.) Go later distanced themselves from system level programming, when people saw that it was not suitable for that, and rightfully felt that they were mislead, but maybe they still benefited in terms of hype from the controversy? D did receive quite a bit of hype for being a possible successor to C++ on Slashdot around 2006/2007 or so (at that point in time /. was more inclined to publish stories about new exciting tech). Basically, D was projected as become the language that would remove the complexities of C++ and make you more productive. I doubt I would have looked at D if the connection to C++ wasn't used as a "marketing trajectory".
 The problem is that the (Go/Rust/Swift) stories speaks to more 
 people, like people in managerial positions.
At launch or now? Initially I think developers want to taste the latest and greatest and get a glimpse of the future. So when Apple or Google release a new language, many will study it. I think many with an academic background got excited about Rust just on paper, so the setup was blogger-friendly. Now, I think it is by and large critical mass. Go has a solid runtime for cloud microservices. Swift is a platform. Rust is a good choice when C++ is too low-level and Go is too high-level.
Nov 03 2021
next sibling parent reply Guillaume Piolat <first.last gmail.com> writes:
On Wednesday, 3 November 2021 at 14:02:52 UTC, Ola Fosheim 
Grøstad wrote:
 D did receive quite a bit of hype for being a possible 
 successor to C++ on Slashdot around 2006/2007 or so (at that 
 point in time /. was more inclined to publish stories about new 
 exciting tech). Basically, D was projected to become the 
 language that would remove the complexities of C++ and make you 
 more productive.
What I meant is that the posting of the spec on Slashdot in 2001 wasn't the kind of carefully executed release you could have for... consumer applications and new languages from big companies.
 I doubt I would have looked at D if the connection to C++ 
 wasn't used as a "marketing trajectory".
Absolutely. For me it was playing Kenta Cho games and noticing they were all written in D, without source available, and the sheer amount of innovation and grit that Kenta would display through his games. I did fall for D later having to use C++ at work, because it instantly felt wrong after a long exposure to the Wirth family of languages. So yes exposure to C++ highlight the value of D more I guess, so you can get the issue of intersecting audiences.
Nov 03 2021
next sibling parent Guillaume Piolat <first.last gmail.com> writes:
On Wednesday, 3 November 2021 at 14:38:39 UTC, Guillaume Piolat 
wrote:
 without source available
with* source available
Nov 03 2021
prev sibling parent Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Wednesday, 3 November 2021 at 14:38:39 UTC, Guillaume Piolat 
wrote:
 What I meant is that posting of the spec on Slashdot in 2001 
 wasn't a carefully executed software release you could have 
 for... consumer applications and new languages from big 
 companies.
Yes sure. I also remember that when I followed /. I generally paid little attention to the article and was more interested in reading the opinions and experiences that the users shared in their comments. Kinda like reddit. So, in some sense the /. population created their own hype machine fed by the editors… If reddit was more structured I guess it could have some of the same effect.
 Wirth family of languages. So yes exposure to C++ highlight the 
 value of D more I guess, so you can get the issue of 
 intersecting audiences.
Yes, especially since Walter pointed out on the website (IIRC) that the D compiler was built on the same core as a C++ compiler and that he had set out on a personal «mission» to create an easier language in the C++ family (he probably used different words, but that was the impression his «story» left in me). That made the goal believable, even though I hit some show-stopping experiences the first year I tried to use it. «Ok, let me wait and see how it looks when those compiler issues are dealt with.» Then D2 came with raised ambitions and things got more complicated.
Nov 03 2021
prev sibling parent reply bachmeier <no spam.net> writes:
On Wednesday, 3 November 2021 at 14:02:52 UTC, Ola Fosheim 
Grøstad wrote:
 I doubt I would have looked at D if the connection to C++ 
 wasn't used as a "marketing trajectory".
In contrast, I knew about D for many years before even looking at it, because I didn't want a better C++. I wanted something that wasn't C++. Go and Rust clearly separated themselves from C++, so I used them both before trying D. The only reason I even downloaded DMD was because Go and Rust had a certain kind of suckiness to them, so I kept searching.
Nov 03 2021
parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Wednesday, 3 November 2021 at 15:42:17 UTC, bachmeier wrote:
 On Wednesday, 3 November 2021 at 14:02:52 UTC, Ola Fosheim 
 Grøstad wrote:
 I doubt I would have looked at D if the connection to C++ 
 wasn't used as a "marketing trajectory".
 In contrast, I knew about D for many years before even looking at it, because I didn't want a better C++. I wanted something that wasn't C++. Go and Rust clearly separated themselves from C++, so I used them both before trying D. The only reason I even downloaded DMD was because Go and Rust had a certain kind of suckiness to them, so I kept searching.
Yeah, I get the suckiness thing, but when did you start to use D? I think D presented itself differently up to about 2014?
Nov 03 2021
parent bachmeier <no spam.net> writes:
On Wednesday, 3 November 2021 at 16:14:48 UTC, Ola Fosheim 
Grøstad wrote:
 On Wednesday, 3 November 2021 at 15:42:17 UTC, bachmeier wrote:
 On Wednesday, 3 November 2021 at 14:02:52 UTC, Ola Fosheim 
 Grøstad wrote:
 I doubt I would have looked at D if the connection to C++ 
 wasn't used as a "marketing trajectory".
 In contrast, I knew about D for many years before even looking at it, because I didn't want a better C++. I wanted something that wasn't C++. Go and Rust clearly separated themselves from C++, so I used them both before trying D. The only reason I even downloaded DMD was because Go and Rust had a certain kind of suckiness to them, so I kept searching.
 Yeah, I get the suckiness thing, but when did you start to use D? I think D presented itself differently up to about 2014?
Pretty sure it was 2013, in the aftermath of DConf. Back then D was all about "C++ done right".
Nov 03 2021
prev sibling parent reply zjh <fqbqrr 163.com> writes:
On Wednesday, 3 November 2021 at 13:25:18 UTC, Guillaume Piolat 
wrote:
 D: "You too can be financially independent by crushing the 
 competition with D and hard work. More power than C++."
D's users must be senior technicians; ordinary `IT` people don't care about it. And they are more willing to take risks to start a business. `More power than C++` and `money`; `more independence, freedom, risk`. For first-class talent.
Nov 03 2021
parent zjh <fqbqrr 163.com> writes:
On Wednesday, 3 November 2021 at 14:39:19 UTC, zjh wrote:

 For first-class talent.
After getting rid of the GC, I can `heavily promote` D in the QQ group. `Rust` is for `large companies`. `D` can be aimed at the `technology entrepreneurship` crowd. `D` can be aimed at `small companies`. For them, relying on a `large company`'s language is `unreliable`. So, work hard. Unlike others, I think D has a `good` future.
Nov 03 2021
prev sibling next sibling parent bachmeier <no spam.net> writes:
On Tuesday, 2 November 2021 at 17:27:25 UTC, Dr Machine Code 
wrote:
 It got [asked on 
 reddit](https://www.reddit.com/r/d_language/comments/q74bzr/
hy_is_d_unpopular/) sub but for those that aren't active too, I'd like you
opinions. Please don't get me wrong, I also love D, I've used it everywhere I
can and I'd say it's my favourite language (yes I have one...) but I'm as as
the reddit's OP, trying to understand why it's unpopular. Rust and Go seeming
to be getting more and more users. I think it's due to large ecosystem and the
big corporations with deep pockets that pushes them. But I'd like to know you
all opinions
What's the basis for saying it's not popular? How popular should it be? I don't think the discussion is productive without additional details. Why doesn't it have heavy adoption in businesses? No, that's still too vague to be helpful. Why don't FAANG use it to develop their customer-facing apps? Still too vague. Why doesn't Amazon use it for [something Amazon sells]. Why doesn't NASA adopt it as their main language? Why isn't it used to teach intro programming classes?
Nov 02 2021
prev sibling next sibling parent reply Dukc <ajieskola gmail.com> writes:
On Tuesday, 2 November 2021 at 17:27:25 UTC, Dr Machine Code 
wrote:
 It got [asked on 
 reddit](https://www.reddit.com/r/d_language/comments/q74bzr/
hy_is_d_unpopular/) sub but for those that aren't active too, I'd like you
opinions. Please don't get me wrong, I also love D, I've used it everywhere I
can and I'd say it's my favourite language (yes I have one...) but I'm as as
the reddit's OP, trying to understand why it's unpopular. Rust and Go seeming
to be getting more and more users. I think it's due to large ecosystem and the
big corporations with deep pockets that pushes them. But I'd like to know you
all opinions
This is the sort of thing many of us love to claim we "know", but we're really only guessing based on past personal experience and gut feeling - very shaky. Perhaps someone really has more insight than the rest but we can hardly tell who. We need some real research on human behaviour to base our opinions on, otherwise we're never going to know the real reasons with any meaningful confidence.
Nov 02 2021
next sibling parent harakim <harakim gmail.com> writes:
On Tuesday, 2 November 2021 at 21:08:22 UTC, Dukc wrote:
 This is the sort of thing what many of us love to tell they 
 "know", but we're really only guessing based on past personal 
 experience and gut feeling - very shaky. Perhaps someone really 
 has more insight than the rest but we can hardly tell who.

 We need some real research on human behaviour to base our 
 opinions on, otherwise we're never going to know the real 
 reasons with any meaningful confidence.
Most everyone in the thread seems to agree. I started writing my answer several hours ago and hadn't seen most of the other replies. My post has the same answers, though, because they're pretty clear. As someone who has stepped away from D many times to get work done, I feel like I am one of those people. I just always hold hope and come back.
Nov 02 2021
prev sibling parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Tuesday, 2 November 2021 at 21:08:22 UTC, Dukc wrote:
 We need some real research on human behaviour to base our 
 opinions on, otherwise we're never going to know the real 
 reasons with any meaningful confidence.
You get an idea when you follow the forums for several years (where people explain why they quit) and read what people write on various social media. Lack of direction seems to be quite a common theme. If we take Rust as an example, then we know that it was growing and attracting users before it hit 1.0 and generated lots of hype. Rust projected very strong values, both in terms of language philosophy and community. Some found it off-putting, but if that means that they scared off 50%, it also means that those that remain are more likely to pull in the same direction. Which is immensely helpful… And they managed to attract a very geeky academic subset of the programming population (highly capable programmers) which is good for the ecosystem. By casting the net wide, D has a population that is pulling in all kinds of directions, which is kinda interesting, but not really ideal for progress.
Nov 02 2021
parent reply Dukc <ajieskola gmail.com> writes:
On Tuesday, 2 November 2021 at 21:35:45 UTC, Ola Fosheim Grøstad 
wrote:
 You get an idea when you follow the forums for several years 
 (where people explain why they quit) and read what people write 
 on various social media. Lack of direction seems to be quite a 
 common theme.
This alone does not tell much. How often do people leave our language versus other languages? What reasons do they give when they leave other languages? We do not know how we compare. Do those who post a goodbye (or badbye?) rant represent the typical leaver? And are the things that would have more people choose D the same as those that would have fewer people leaving? Those are all questions one can't answer by gut with any reliability.
 If we take Rust as an example, then we know that it was growing 
 and attracting users before it hit 1.0 and generated lots of 
 hype. Rust projected very strong values, both in terms of 
 language philosophy and community. Some found it off-putting, 
 but if that means that they scared off 50%, it also means that 
 those that remain are more likely to pull in the same direction.
But you can't conclude from that that projecting strong values correlates strongly with success, or you're committing the Texas sharpshooter fallacy. There are tons of other potential explanations. And no, you're not avoiding the fallacy merely by listing a few other successful languages that also had strong opinions.
Nov 02 2021
parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Tuesday, 2 November 2021 at 22:00:34 UTC, Dukc wrote:
 But you can't conclude from that that projecting strong values 
 correlates strongly with success, or you're committing the 
 Texas sharpshooter fallacy. There are tons of other potential 
 explanations.
There are many factors. One single factor is of course not sufficient. In the case of Rust, it seems quite clear that early adoption was related to projecting a strong vision. I think we can say the same for Java. It was projected as the next big thing on the Internet a long time before it actually was useful.
 And no, you're not avoiding the fallacy merely by listing a few 
 other successful languages that also had strong opinions.
We have to consider that there is a landscape of languages and that enthusiastic D users have moved from D to Rust, Zig and Nim. I think that ought to raise some soul-searching questions.
Nov 02 2021
parent reply Dukc <ajieskola gmail.com> writes:
On Tuesday, 2 November 2021 at 22:43:13 UTC, Ola Fosheim Grøstad 
wrote:
 In the case of Rust, it seems quite clear that early adoption 
 was related to projecting a strong vision. I think we can say 
 the same for Java. It was projected as the next big thing on 
 the Internet a long time before it actually was useful.
It could be that one or both of them succeeded not because of strong opinions, but rather because of the bandwidth to broadcast them. In other words, perhaps an unopinionated language with as much bandwidth to broadcast its malleability might succeed just as well. And that's just one alternative theory. Not saying your theory is wrong, but I'm not going to put much weight on it and neither should anyone else, unless you can show some research you're basing your opinions on. And that applies to all forum theories about subjects like this.
 And no, you're not avoiding the fallacy merely by listing a 
 few other successful languages that also had strong opinions.
 We have to consider that there is a landscape of languages and that enthusiastic D users have moved from D to Rust, Zig and Nim. I think that ought to raise some soul-searching questions.
As if something else was suggested. We are not talking about whether we wish to answer questions about language adoption, we are talking about how they can be somewhat reliably answered, if at all.
Nov 02 2021
next sibling parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Tuesday, 2 November 2021 at 23:47:42 UTC, Dukc wrote:

 Not saying your theory is wrong, but I'm not going to put much 
 weight on it and neither should anyone else, unless you can 
 show some research you're basing your opinions on. And that 
 applies to all forum theories about subjects like this.
Quantitative science is not very good at answering questions related to design and culture where the context is changing. So you have to make do with qualitative analysis. If you don't like that, why engage in discussions? What you are saying is that we can never find a quantitative foundation for innovation, but that does not mean that we cannot find a qualitative foundation for design. Which leads to better design, because we consider more factors.
Nov 02 2021
parent reply Dukc <ajieskola gmail.com> writes:
On Wednesday, 3 November 2021 at 05:20:11 UTC, Ola Fosheim 
Grøstad wrote:
 On Tuesday, 2 November 2021 at 23:47:42 UTC, Dukc wrote:

 Not saying your theory is wrong, but I'm not going to put much 
 weight on it and neither should anyone else, unless you can 
 show some research you're basing your opinions on. And that 
 applies to all forum theories about subjects like this.
 Quantitative science is not very good at answering questions related to design and culture where the context is changing. So you have to make do with qualitative analysis. If you don't like that, why engage in discussions?
Qualitative research is okay. But it has to be based on much more than what people say on the forum / Reddit / Hacker news. Following and analyzing the development on GitHub and alternatives would be a start, but even that misses closed-source projects and the underlying reasons on why people come and go. So ideally we want something else too, say interviews. The research should consider a spectrum of possibilities. For each theory the research considers unlikely in the conclusion, it must provide evidence against it. It is not enough to provide evidence for the possibility considered likely. Perhaps you have done something like that already. But if you cannot or will not show it, we others have no way of telling your theory from yet another forum ramble. Note that I'm not taking this stance for all forum discussions. When discussing DIPs, for example, you can provide examples everyone can verify (or debunk). But if you're saying that a stronger vision will/would have attracted more people, only research can really tell.
Nov 03 2021
parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Wednesday, 3 November 2021 at 11:03:35 UTC, Dukc wrote:
 Qualitative research is okay. But it has to be based on much 
 more than what people say on the forum / Reddit / Hacker news. 
 Following and analyzing the development on GitHub and 
 alternatives would be a start, but even that misses 
 closed-source projects and the underlying reasons on why people 
 come and go. So ideally we want something else too, say 
 interviews.
Interviews would be good, but I think you are putting too much faith in research practices. Of the papers I've read, in 90% of the cases I would have objections to their conclusions based on the methodology (for both qualitative and quantitative methods). If you don't want to participate in a line of reasoning, of course, you don't have to. In general, researchers tend to be very open to exploring ideas in the way we do in this thread, so I don't quite share your objection to ranting. When you explore ideas through a line of reasoning then other people can choose to dig into it, find new angles and ideas following from it, or they can ignore it. But qualitative analysis is an exploration. You don't aim to grade impact in a predictive manner, you try to unfold many different aspects of a process so that you can form a model of what might influence it. Also bringing nuances into an analysis is a very important aspect where qualitative methods shine. Then you also have interpretative research applying models of power etc to situations. The world is not black and white. You don't have to wait for someone to do formal data collection to explore the nuances of the situation.
Nov 03 2021
parent reply Dukc <ajieskola gmail.com> writes:
On Wednesday, 3 November 2021 at 12:00:48 UTC, Ola Fosheim 
Grøstad wrote:
 The world is not black and white. You don't have to wait for 
 someone to do formal data collection to explore the nuances of 
 the situation.
You're assuming the truth is such that you can infer it from what you see without going out to collect more data, and trust your analysis. There is no reason to assume so. We forum participants are likely to be a highly self-selected group. I do understand that you have programming experience from other places too, but realistically you're definitely not such a polyglot that you can see all the bubbles from the outside. Think Walter. He has seen pretty much everything, yet all of us can see blind spots in his thinking fairly often. You cannot expect yourself to do much better than Walter no matter how experienced you are.
Nov 03 2021
parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Wednesday, 3 November 2021 at 12:52:10 UTC, Dukc wrote:
 You're assuming the truth is such that you can infer it from 
 what you see, without going out to collect more data, and trust 
 your analysis.
I am not assuming, I am presenting a viewpoint with arguments. Also don't assume that research is about conveying truth. It is about providing perspectives, models and ideally a transparent exposition. In reality you can always dismiss research on complex systems by just pointing out that the context is different, the sample is biased, the analysis is biased, that they only looked for X and didn't look for Y, and that the results cannot be used for prediction. But that does not mean that the perspectives and models are useless.
 Think of Walter. He has seen pretty much everything, yet all of 
 us can see blind spots in his thinking fairly often. You cannot 
 expect yourself to do much better than Walter, no matter how 
 experienced you are.
I don't know how to respond to this. Let us keep Walter out of the analysis.
Nov 03 2021
parent reply Dukc <ajieskola gmail.com> writes:
On Wednesday, 3 November 2021 at 13:10:21 UTC, Ola Fosheim 
Grøstad wrote:
 I am not assuming, I am presenting a viewpoint with arguments. 
 Also don't assume that research is about conveying truth. It is 
 about providing perspectives, models and ideally a transparent 
 exposition.
But how to evaluate the viewpoint with its arguments, or the research: that's the problem. It is possible that I didn't pinpoint the problem with most forum theories about language popularity accurately, so I think I should present an [example of what I do find convincing](https://astralcodexten.substack.com/p/please-dont-give-up-on-having-kids). It's not about language adoption, but it is nonetheless about a complex subject where you can't rebut bad arguments with simple self-made examples. If you can provide that level of argumentation for your favorite language adoption theory, then it's time to take it seriously IMO.
Nov 04 2021
parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Thursday, 4 November 2021 at 10:28:52 UTC, Dukc wrote:
 It is possible that I didn't pinpoint the problem with most 
 forum theories about language popularity accurately, so I think 
 I should present an [example of what I do find 
 convincing](https://astralcodexten.substack.com/p/please-dont-give-up-on-having-kids). 
 It's not about language adoption, but it is nonetheless about a 
 complex subject where you can't rebut bad arguments with simple 
 self-made examples.
And I find that conclusion to be highly suspicious. I can make an argument for why, but it is totally off topic. If you don't agree with an argument, put forth counter-arguments/examples or just ignore it.
Nov 04 2021
parent reply Dukc <ajieskola gmail.com> writes:
On Thursday, 4 November 2021 at 10:45:05 UTC, Ola Fosheim Grøstad 
wrote:
 And I find that conclusion to be highly suspicious. I can make 
 an argument for why, but it is totally off topic.
Read it all and consider it. Really do. I don't think you managed to give it serious consideration in 15 minutes. Not saying you should agree with it, but you're losing a lot if you don't consider its arguments.

But there is no need for the counter-argument. First, as you said, it would be off topic. Second, I don't want to reveal here what I precisely think of that article, besides that its arguments are generally convincing. Third, if that article does not convince you, there is no way I am going to.
 If you don't agree with an argument, put forth 
 counter-arguments/examples or just ignore it.
Of course.
Nov 04 2021
parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Thursday, 4 November 2021 at 10:59:42 UTC, Dukc wrote:
 Not saying you should agree with it, but you're losing a lot if 
 you don't consider its arguments.
I have no interest in the topic… But please understand that in fields such as software process improvement, educational research and design, the most useful ideas are not backed by "hard data". Most papers on education and design are anecdotal in nature, but that does not mean you should ignore them. This is the nature of the setting, as the context is not stable and the system dynamics are complex.
Nov 04 2021
parent reply Dukc <ajieskola gmail.com> writes:
On Thursday, 4 November 2021 at 11:10:53 UTC, Ola Fosheim Grøstad 
wrote:
 On Thursday, 4 November 2021 at 10:59:42 UTC, Dukc wrote:
 Not saying you should agree with it, but you're losing a lot 
 if you don't consider its arguments.
 I have no interest in the topic… But please understand that in fields such as software process improvement, educational research and design, the most useful ideas are not backed by "hard data". Most papers on education and design are anecdotal in nature, but that does not mean you should ignore them.
I believe what counts is the strength of the signal, hard data or anecdotal. You mentioned software process improvement, so let's take an example of a useful idea there: unit testing. Yeah, I have no hard data on its usefulness, only anecdotal experience that dmd/DRuntime/Phobos unit tests regularly prevent introducing bugs. The cause and effect are so clear there that I definitely believe they help a lot with program reliability, even if they aren't worth it for every project.

So yes, an anecdotal forum theory about language adoption can be believable in principle. But the average case is nowhere near transparent enough to be considered anything more than noise. The writer may personally have good reasons to trust his/her theory, but it is at least as likely that they're just making something up because they don't know. It's usually impossible to tell from the outside.
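(For readers coming from other languages: unit tests are a built-in language feature in D, which is part of why they are so pervasive in dmd/DRuntime/Phobos. A minimal sketch, with a made-up function for illustration:)

```d
// Build and run the tests with: dmd -unittest -main -run clamp.d

/// Clamps a value into the inclusive range [lo, hi].
int clamp(int value, int lo, int hi)
{
    if (value < lo) return lo;
    if (value > hi) return hi;
    return value;
}

// unittest blocks live next to the code they test and run
// before main() when the module is compiled with -unittest.
unittest
{
    assert(clamp(5, 0, 10) == 5);   // in range: unchanged
    assert(clamp(-3, 0, 10) == 0);  // below the range: clamped to lo
    assert(clamp(42, 0, 10) == 10); // above the range: clamped to hi
}
```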
Nov 04 2021
parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Thursday, 4 November 2021 at 12:09:38 UTC, Dukc wrote:
 So yes, an anecdotal forum theory about language adoption can 
 be believable in principle. But the average case is nowhere 
 near transparent enough to be considered anything more than 
 noise. The writer may personally have good reasons to trust 
 his/her theory, but it is at least as likely that they're just 
 making something up because they don't know. It's usually 
 impossible to tell from the outside.
As I already said, this is not a black/white issue. It is not a single-factor issue. If you choose to ignore all perspectives then you cannot make any judgement at all. Then you cannot design. Cannot improve. If you look at a situation from many perspectives then you can make judgements and weigh them. Is the outcome certain? Obviously not. No design outcome is certain!

Just because you don't feel you can evaluate the usefulness of a perspective does not mean that this is the general case for everybody else. The more experience you get, the better judgement you can make. The more perspectives you use when analysing, the more likely you are to make a good judgement.

It is only impossible to make judgements from the outside because you choose to place yourself on the outside.
Nov 04 2021
parent reply Dukc <ajieskola gmail.com> writes:
On Thursday, 4 November 2021 at 12:19:01 UTC, Ola Fosheim Grøstad 
wrote:
 Just because you don't feel you can evaluate the usefulness of 
 a perspective does not mean that this is the general case for 
 everybody else. The more experience you get, the better 
 judgement you can make. The more perspectives you use when 
 analysing, the more likely you are to make a good judgement.

 It is only impossible to make judgements from the outside 
 because you choose to place yourself on the outside.
I believe we both agree on the basic principle of the usefulness of a viewpoint: backed up by arguments that can be judged, useful. Not backed up, or backed up only by non-judgeable arguments, garbage. Ten or a hundred such viewpoints, still just as much garbage.

I am saying that the standard Reddit ramble about language adoption belongs to the latter category, and I believe you think it might belong to the former if the one evaluating has enough experience? The reason I believe a large majority of such posts are worth nothing is [that we have a tendency to make up reasons for even utterly random stuff](https://unherd.com/2021/07/what-warhammer-taught-me-about-life/).
Nov 04 2021
parent reply Dukc <ajieskola gmail.com> writes:
On Thursday, 4 November 2021 at 12:39:28 UTC, Dukc wrote:
 The reason I believe a large majority of such posts are worth 
 nothing is [that we have a tendency to make up reasons for even 
 utterly random 
 stuff](https://unherd.com/2021/07/what-warhammer-taught-me-about-life/).
Meaning, you should not give any "authority" weight to the viewpoint of a random poster, unless they can show their viewpoint to be well-founded.
Nov 04 2021
parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Thursday, 4 November 2021 at 12:58:36 UTC, Dukc wrote:
 Meaning, you should not give any "authority" weight to the 
 viewpoint of a random poster, unless they can show their 
 viewpoint to be well-founded.
But you shouldn't give "authority" to research results either, because the context is different and the research results could come from "tunnel vision". The viewpoint of a random poster could provide a perspective you had not thought about and that gives you a new angle to analyse your design.

If you are unhappy with a situation then you need to change. You cannot predict the outcome of the change, but by using multiple perspectives you:

1. Get an idea of which directions you can make changes in.
2. Can consider more potential outcomes of the changes you make.

But if you don't make any changes (because there is no data), then the situation is highly unlikely to improve. The core of design: you don't know the outcome; the outcome is one or multiple hypotheses. But if you look at it from multiple angles then you have a better grasp of where this could head.
Nov 04 2021
parent reply Dukc <ajieskola gmail.com> writes:
On Thursday, 4 November 2021 at 13:11:26 UTC, Ola Fosheim Grøstad 
wrote:
 The viewpoint of a random poster could provide a perspective 
 you had not thought about and that gives you a new angle to 
 analyse your design.

 If you are unhappy with a situation then you need to change. 
 You cannot predict the outcome of the change, but by using 
 multiple perspectives you:

 1. Get an idea of which directions you can make changes in.
 2. Can consider more potential outcomes of the changes you make.
You're saying that the forum ranting may be valuable because it might give you ideas, even if it conveys no trustworthy data? Okay, granted. If that's all you claim, we don't disagree after all.

I guess my stance came out as "talking about language adoption is strictly negative, unless you have exceptional arguments", and you felt a need to rebut that. That wasn't quite what I said, but thanks for the clarification anyway. I admit that I probably was in need of that clarification myself.
Nov 04 2021
parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Thursday, 4 November 2021 at 13:26:10 UTC, Dukc wrote:
 I guess my stance came out as "talking about language adoption 
 is strictly negative, unless you have exceptional arguments", 
 and you felt a need to rebut that. That wasn't quite what I 
 said, but thanks for the clarification anyway. I admit that I 
 probably was in need of that clarification myself.
Why would it be negative? Ok, so let us go deeper into what "design" is. It is perfectly reasonable to claim that we cannot make any certain predictions about the outcome, so we have to rely on hypotheses. Then we need to consider two types of hypotheses:

1. hypotheses about possible negative effects of the design change
2. hypotheses about possible positive effects of the design change

Then you can evaluate various designs and make trade-offs. You don't need to know the exact outcome, but in order to plan ahead you benefit from having a good grasp of the possible positive and negative outcomes, both in order to find the right design and in order to plan ahead beyond that.

If a possible positive effect does not happen, the negative impact is low. If we overlook a possible negative impact, then the negative impact can be quite high. So, when we design for change it isn't critical that a possible positive outcome did not occur, but it is critical to avoid negative outcomes. So yes, enabling more potentially positive outcomes and minimizing potentially negative outcomes will, over several iterations, make for a better situation (from a statistical perspective).
Nov 04 2021
parent reply Dukc <ajieskola gmail.com> writes:
On Thursday, 4 November 2021 at 13:39:28 UTC, Ola Fosheim Grøstad 
wrote:
 On Thursday, 4 November 2021 at 13:26:10 UTC, Dukc wrote:
 I guess my stance came out as "talking about language adoption 
 is strictly negative, unless you have exceptional arguments", 
 and you felt a need to rebut that. That wasn't quite what I 
 said, but thanks for the clarification anyway. I admit that I 
 probably was in need of that clarification myself.
 Why would it be negative? Ok, so let us go deeper into what "design" is. [some insights about design theory]
I'm sorry, I'm not sure why you're explaining design theory to me? I thought we were not talking about design?
Nov 04 2021
parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Thursday, 4 November 2021 at 16:11:02 UTC, Dukc wrote:
 I'm sorry, I'm not sure why you're explaining design theory to 
 me? I thought we were not talking about design?
I think we are? We are basically discussing which design aspects influence adoption. Visions are designed. A process is designed (or should be). Webpages are designed. Documentation is designed. Language features are designed.
Nov 04 2021
parent reply Dukc <ajieskola gmail.com> writes:
On Thursday, 4 November 2021 at 16:18:03 UTC, Ola Fosheim Grøstad 
wrote:
 On Thursday, 4 November 2021 at 16:11:02 UTC, Dukc wrote:
 I'm sorry, I'm not sure why you're explaining design theory to 
 me? I thought we were not talking about design?
 I think we are? We are basically discussing which design aspects influence adoption.
I see - that is what this thread is about, after all. But you didn't finish your explanation with any point about language adoption.
Nov 04 2021
parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Thursday, 4 November 2021 at 16:36:39 UTC, Dukc wrote:
 I see - that is what this thread is about after all. But you 
 didn't finish your explanation to make any point about language 
 adoption.
I tried to point out that having multiple perspectives isn't negative, even if some turn out to have only a negligible effect. To increase adoption I think one should take a look at all the perspectives that have been presented in the thread, because I think they all are contributing factors. But one also has to think about multiple groups/classes of programmers and prioritize the groups that are most likely to address the worst weak spot (the ecosystem).

If I personally were looking for an alternative to C++ today, I would probably give Rust a spin first, because they seem very focused and determined to support lower-level programming and make it more productive than C++. So for me a clear vision would be the first thing to look for, in addition to the technical side.
Nov 04 2021
parent reply Dukc <ajieskola gmail.com> writes:
On Thursday, 4 November 2021 at 16:45:34 UTC, Ola Fosheim Grøstad 
wrote:
 I tried to point out that having multiple perspectives isn't 
 negative, even if some turn out to have only a negligible 
 effect.

 To increase adoption I think one should take a look at all the 
 perspectives that have been presented in the thread, because I 
 think they all are contributing factors.
In the brainstorming sense, yes. But in the "contributes to evidence in favour of this position" sense, no, unless backed up with transparent reasoning.

Note, I'm assuming a thread where people question the general strategy of a language. If discussing a particular language feature or other improvement, it's different, since it's more concrete and the facts can be reasoned about by anyone.
Nov 04 2021
parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Thursday, 4 November 2021 at 16:58:05 UTC, Dukc wrote:
 Note, I'm assuming a thread where people question the general 
 strategy of a language. If discussing a particular language 
 feature or other improvement, it's different, since it's more 
 concrete and the facts can be reasoned about by anyone.
Ok, but again, it calls for soul searching if current users of the language most likely would not have landed on D if they started with no prior knowledge and experience. So, that means the "market position" has changed. So, how do you then get back to a position where you have enough hardcore programmers to develop something like an application framework that would make the language more attractive to a broader audience? You would need the same level of hardcore enthusiasm that led a group of developers to build Tango. How do you get there?
Nov 04 2021
parent Dukc <ajieskola gmail.com> writes:
On Thursday, 4 November 2021 at 17:11:34 UTC, Ola Fosheim Grøstad 
wrote:
 Ok, but again, it calls for soul searching if current users of 
 the language most likely would not have landed on D if they 
 started with no prior knowledge and experience. So, that means 
 the "market position" has changed.

 So, how do you then get back to a position where you have 
 enough hardcore programmers to develop something like an 
 application framework that would make the language more 
 attractive to a broader audience?

 You would need the same level of hardcore enthusiasm that led 
 a group of developers to build Tango.

 How do you get there?
I don't think I'm any better placed than the average forum ranter to answer that. I do have technical experience with D, so I consider myself qualified to contribute opinions on specific proposals or decisions made in the monthly meetups. But to suggest a brand-new general strategy that I would feel has better chances than the current one, not so much.
Nov 04 2021
prev sibling parent reply harakim <harakim gmail.com> writes:
On Tuesday, 2 November 2021 at 23:47:42 UTC, Dukc wrote:
 Not saying your theory is wrong, but I'm not going to put much 
 weight on it and neither should anyone else, unless you can 
 show some research you're basing your opinions on. And that 
 applies to all forum theories about subjects like this.
 As if something else was suggested. We are not talking about 
 whether we wish to answer questions about language adoption, we 
 are talking about how they can be somewhat reliably answered, 
 if at all.
Let me share my credentials with you. I am a human and a developer. I have been one for some number of years. I interact with humans who are developers on a weekdaily basis and have done so for over a decade. I guess you could call me a world expert on humans who are developers.

As a world expert on humans, I have noticed that humans don't really like uncertainty. As a human developer who works with developers, I have noticed that human developers aren't super interested in sinking a bunch of time into something that will not provide lasting value. They look at the long-term/Big-Oh value as opposed to the immediate gratification when it comes to development projects.

Based on those two concepts, and virtually every post I have ever seen about why people leave D, I can come to some conclusions:

1. There is uncertainty in D. Where is it going? Is it going to support the things I like about it in the future? Is it going to grow? Is it going to die? Will I be able to find an answer to a question that comes up? What IDE should I use? Does it matter which compiler I use?

2. People are not sure their investment in writing D programs is a secure investment. Will your code still work in a year? Will the language exist in a few years? Are the maintainers going to make some crazy non-backwards-compatible decision that will ruin my application? Will the library I am using go away?

You could call what it lacks a vision. I don't mean a vision that spells out where the language is going or what it's good at, although that can be part of a vision. I think people can accept very little vision if they have some guarantees: some things they can hold on to and make decisions around. These are the guarantees which most people like and which are provided by most popular languages: tools will be maintained and fixed if there is an issue, libraries for common tasks will be maintained and available, the language will remain backwards compatible, and you can quickly get answers to your questions from a number of sources.

I also think people see a language that is 20 years old with the toolset and libraries it has, and assume that this is all the progress that has been made and that progress must be slow. They may not know about all the corpses of libraries and tools strewn along the way. They might think this is it. That is just speculation, though.

If you want to know what people think, I can see two good ways to do it. The first is to just think of all the reasons to use D. Then think of a specific type of user whom those reasons would appeal to. Now ask them to use D and see what they say. Repeat. Do the reasons appeal to them? Do they make up for the unappealing parts of the ecosystem? You can also do this as a thought experiment. The second is to find repos in the D language that haven't been maintained in a while and reach out to the users and ask them why they left.

In the end, the question is: does this solve a problem people have, and does it solve it in a way that another tool doesn't? In some sense, yes, but what do you have to give up to get those gains? I think the answer is pretty obvious, and it's the reason D is what I always consider first, but not what I usually use.
Nov 02 2021
next sibling parent Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Wednesday, 3 November 2021 at 06:08:57 UTC, harakim wrote:
 really like uncertainty. As a human developer who works with 
 developers, I have noticed that human developers aren't super 
 interested in sinking a bunch of time into something that will 
 not provide lasting value. They look at the long-term/Big-Oh
It is true that developers look for safe choices when picking languages in professional contexts, which basically always means either a language that is already popular or a very narrow language that is well supported. As such, small generic niche languages will always look unsafe, and developers with (sound) risk aversion will avoid them. So we cannot extrapolate too much from the choices people make in professional software development contexts. We should expect that the majority of uptake comes from individuals that have a lot of freedom and can take higher risk.

But the critical adoption is not with average developers. Average developers are essentially consumers that can motivate people to write tutorials and so on (basically an audience), but they do not increase the quality of the ecosystem. The critical adoption is related to those highly skilled programmers that also have programming as their primary hobby: the ones that are willing to work all day with programming using a cookie-cutter language and in addition want to spend their evenings and weekends programming a niche language (plus those that have a lot of freedom in their daytime). Those are the ones that can drive early adoption and build a solid ecosystem for small languages.

D's major problem is not that it does not have sufficient numbers of "consuming programmers". I think it does. The major problem is that it does not have enough of those hardcore hobbyists. It has not been able to retain enough of them over time. That is where the vision is very important. And it would also help if the compiler had a better architecture, and some semantic cleanup to bring it more in line with the ideals of computer science (as the most skilled programmers will know what the semantics ought to be, and that can also be a turn-off).

This matters less for the already-popular languages, because those languages have reached critical mass. They have so many users that they statistically also have a large number of hardcore programmers (even if that percentage is very low). A small language needs a higher ratio of hardcore vs average programmers than a language that already has critical mass.
Nov 03 2021
prev sibling next sibling parent Dukc <ajieskola gmail.com> writes:
On Wednesday, 3 November 2021 at 06:08:57 UTC, harakim wrote:
 Let me share my credentials with you. I am a human and a 
 developer. I have been one for some number of years. I interact 
 with humans who are developers on a weekdaily basis and have 
 done so for over a decade. I guess you could call me a world 
 expert on humans who are developers.
Sigh. If you think that being a developer with roughly normal feelings makes your experience qualify as research results, I don't even know where to begin. You're underestimating the problems by an incredible factor. You have a lot, lot to learn about doing research.
Nov 03 2021
prev sibling parent Dennis <dkorpel gmail.com> writes:
On Wednesday, 3 November 2021 at 06:08:57 UTC, harakim wrote:
 As a world expert on humans,
This is my new favorite quote.
Nov 03 2021
prev sibling next sibling parent harakim <harakim gmail.com> writes:
On Tuesday, 2 November 2021 at 17:27:25 UTC, Dr Machine Code 
wrote:
 It got [asked on 
 reddit](https://www.reddit.com/r/d_language/comments/q74bzr/why_is_d_unpopular/) 
 sub but for those that aren't active too, I'd like you 
 opinions. Please don't get me wrong, I also love D, I've used it 
 everywhere I can and I'd say it's my favourite language (yes I 
 have one...) but I'm as as the reddit's OP, trying to understand 
 why it's unpopular. Rust and Go seeming to be getting more and 
 more users. I think it's due to large ecosystem and the big 
 corporations with deep pockets that pushes them. But I'd like to 
 know you all opinions
I left a long response on that, but I guess I can add one here. tldr: it's not stable, there aren't many libraries for it, there are few standard tools, you can't find answers from other people who had the problem you're currently having, and the alternatives are not that bad.

What is the mindset of someone investigating D? Are they trying to learn about D, or are they trying to find a solution to some problems they either have or expect to have? So right out of the gate, they probably aren't coming to D. They are searching in a search engine and finding Rust, Nim, etc., if they are looking for something low level. They are probably also finding that C++ is popular and is what is used the most. (The same goes for the web options, so those people don't feel bad.) They will even find Go.

If they manage to find D and decide to evaluate it, imagine their experience as contrasted with other languages. Type "learn d programming" into Google and you'll get a tutorial that says you already need to know how to program and that "You just need to have a basic understanding of working with a simple text editor and command line." I think at that point, a lot of people say "this isn't for me." The second hit is the D language home page. When you go to the homepage, it shows a language download, an editor that people who already know D can use, and a donation button. Very small, in the off-screen section, you will find a tour. This is fine for Java, which people have to use and which people learn in school. It's not fine when you're trying to get people interested. The third link is a sub-link and is the tour. Now the tour starts with a wall of text and many language options, none of which is English. It also doesn't really tell you how to set up your environment. It says you don't need an IDE, but you can use one if you want. The next thing it shows is the command line. I'm not saying this tutorial is that hard to follow, especially if you already use the command line and know about programming; I'm just saying it's 4 pages in, probably 20 minutes if you really read through, and there is no pay dirt yet.

Contrast that to Ruby. For me, the first link is Code Academy; the second link is https://www.ruby-lang.org/en/documentation/quickstart/. It's the official Ruby page and it's still pretty decent. It gets almost straight to the action.

I think that is one series of barriers to entry. People may not even get to the point where they try the language. Those are the main barriers. Once you get in, though, there isn't really a coherent way to move forward with projects. There are not a lot of tutorials that end with a working piece of software, and there are not a lot of projects or libraries. Most of the libraries you find are no longer maintained. So you have to do a lot from scratch. To be honest, this isn't as big of a deal as in other languages, because you have low-level access through the standard library, you can use C libraries, and it's quicker to build a library than in other languages. But it's not nothing.

When you get tripped up, you can't find examples in D or answers to your specific problem on Stack Overflow. The language changes in non-backwards-compatible ways. If you're a casual user, you probably won't notice these changes being announced. You'll just notice when you have to update your D compiler 30 versions for a library. It will compile the library, but it doesn't compile *your* code anymore. At some point, I think most people who have used D have asked: is it worth it?
I mean, you can just switch to another language that might be a little inferior, but that has tons of libraries and stays backwards compatible. You give something up, though probably not with Python. At some point, I just have to say: "What am I trying to accomplish, and is D essential to that?" I think a lot of people decided it was not.

I do think there is a path for D to become a major programming language, but it would need to change course a little. I don't know if that is worth it to the people who work on D.
Nov 02 2021
prev sibling next sibling parent Dave P. <dave287091 gmail.com> writes:
On Tuesday, 2 November 2021 at 17:27:25 UTC, Dr Machine Code 
wrote:
 It got [asked on 
 reddit](https://www.reddit.com/r/d_language/comments/q74bzr/why_is_d_unpopular/) 
 sub but for those that aren't active too, I'd like you 
 opinions. Please don't get me wrong, I also love D, I've used it 
 everywhere I can and I'd say it's my favourite language (yes I 
 have one...) but I'm as as the reddit's OP, trying to understand 
 why it's unpopular. Rust and Go seeming to be getting more and 
 more users. I think it's due to large ecosystem and the big 
 corporations with deep pockets that pushes them. But I'd like to 
 know you all opinions
Personally, I think most of the explanations people give miss the mark. I think D's issues are mostly engineering. I personally would like to use D, but I keep running into bugs in the compiler. The latest one I've hit is this [one](https://issues.dlang.org/show_bug.cgi?id=22427). Occasionally I give D a good try and really like the language itself, but then I run into weird errors like that one. That makes it hard to adopt and recommend.
Nov 02 2021
prev sibling next sibling parent zjh <fqbqrr 163.com> writes:
On Tuesday, 2 November 2021 at 17:27:25 UTC, Dr Machine Code 
wrote:
 It got [asked on 
 reddit](https://www.reddit.com/r/d_language/comments/q74bzr/why_is_d_unpopular/)
`D` was bitten by GC.
Nov 02 2021
prev sibling next sibling parent reply mw <mingwu gmail.com> writes:
On Tuesday, 2 November 2021 at 17:27:25 UTC, Dr Machine Code 
wrote:
 ... I'd like to know you all opinions
For me, D is on a one-man island, having trouble cooperating with software components in other languages due to bugs that have gone unfixed for a long time:

-- apache-thrift-d: LDC has a build failure because of this bug (logged 4 months ago now):

https://issues.dlang.org/show_bug.cgi?id=22083

-- The D grpc client cannot talk to a Python grpc server because of this bug (logged a year ago now):

https://github.com/huntlabs/grpc-dlang/issues/15

BTW, apache-thrift and grpc are the two most important RPC frameworks used in the industry.
Nov 02 2021
next sibling parent reply Imperatorn <johan_forsberg_86 hotmail.com> writes:
On Wednesday, 3 November 2021 at 06:42:05 UTC, mw wrote:
 On Tuesday, 2 November 2021 at 17:27:25 UTC, Dr Machine Code 
 wrote:
 ... I'd like to know you all opinions
 For me, D is on a one-man island, having trouble cooperating 
 with software components in other languages due to bugs that 
 have gone unfixed for a long time:
 
 -- apache-thrift-d: LDC has a build failure because of this bug 
 (logged 4 months ago now):
 
 https://issues.dlang.org/show_bug.cgi?id=22083
 
 -- The D grpc client cannot talk to a Python grpc server because 
 of this bug (logged a year ago now):
 
 https://github.com/huntlabs/grpc-dlang/issues/15
 
 BTW, apache-thrift and grpc are the two most important RPC 
 frameworks used in the industry.
We need to fix these
Nov 03 2021
prev sibling parent rikki cattermole <rikki cattermole.co.nz> writes:
On 03/11/2021 7:42 PM, mw wrote:
 -- apache-thrift-d: LDC has a build failure because of this bug (logged 
 4 months ago now):
 
 https://issues.dlang.org/show_bug.cgi?id=22083
 
 
 -- The D grpc client cannot talk to a Python grpc server because of 
 this bug (logged a year ago now):
 
 https://github.com/huntlabs/grpc-dlang/issues/15
 
 
 BTW, apache-thrift and grpc are the two most important RPC frameworks 
 used in the industry.
#dbugfix
May 14 2022
prev sibling next sibling parent reply kot <kot lin.ko> writes:
On Tuesday, 2 November 2021 at 17:27:25 UTC, Dr Machine Code 
wrote:
 It got [asked on 
 reddit](https://www.reddit.com/r/d_language/comments/q74bzr/why_is_d_unpopular/) 
 sub but for those that aren't active too, I'd like you 
 opinions. Please don't get me wrong, I also love D, I've used it 
 everywhere I can and I'd say it's my favourite language (yes I 
 have one...) but I'm as as the reddit's OP, trying to understand 
 why it's unpopular. Rust and Go seeming to be getting more and 
 more users. I think it's due to large ecosystem and the big 
 corporations with deep pockets that pushes them. But I'd like to 
 know you all opinions
i don't care if a language is popular. i want to share a real-world example that i believe answers a few questions related to language adoption.

years ago i was asked to develop an app for both mobile platforms (ios, android); they didn't care which language[s] was being used. swift was the obvious choice for ios. for android i first checked clojure. android support was there but not seamless. then i found out about a language called kotlin. not only did kotlin support android, it had tools to convert java code to kotlin. i immediately tried the tools and the results were beautiful. this much quality at the alpha/beta stage of a language... it was obvious that this language i had just found out about would dominate android development, and soon enough it was the official language.

now, if D had supported android/ios half as good as swift or kotlin, i would not think twice. i find these language wars silly, it is *always* about tooling/support. i am using c++ for my current project because i have to. if i could use D as painless as C++ (again, not about language quality. tool quality, os-support, seamless ecosystem) i wouldn't think twice. for the project i am working on, experiments and live coding are vital. so, my obvious choice would be lisp, right? but i can't.
Nov 03 2021
parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Wednesday, 3 November 2021 at 16:25:55 UTC, kot wrote:
 now, if D had supported android/ios half as good as swift or 
 kotlin, i would not think twice. i find these language wars 
 silly, it is *always* about tooling/support. i am using c++ for 
 my current project because i have to. if i could use D as 
 painless as C++ (again, not about language quality. tool 
 quality, os-support, seamless ecosystem) i wouldn't think 
 twice. for the project i am working on, experiments and live 
 coding are vital. so, my obvious choice would be lisp, right? but 
 i can't.
But do you feel productive in C++? I find that even for simple things, in C++ it will take 10x longer than in Python, and a language like D is somewhere in between. I guess that to some extent this is because I usually don't do things in C++ unless speed is critical, but the main gripe I have with C++ is that changing code is very costly. So it does not encourage you to avoid premature optimization. This is basically an area where a language like D (perhaps also Rust) might do better. So when you say that you are doing a project that requires experimentation, what made you reject languages other than C++?
Nov 03 2021
next sibling parent reply MGW <mgw yandex.ru> writes:
I think D is positioning itself wrongly. It tries to be useful to a very narrow circle of highly skilled developers (mostly C++) by betting on an advanced compiler. But such developers are hampered by the GC; there are few of them, and they don't promote D.

On the other hand, beginning developers do not need it, because D grew up with C++ and beginning developers are repulsed by the very mention of C++.

D should be positioned for intermediate developers. Here the GC is a big plus. The bulk of development does not require serious C++-level knowledge. The main slogan: the power of C++ and the ease of programming of Python.

To do this, you need to:

1 - try to "freeze" compiler changes that break compatibility with already-written code.
2 - ask the developers to update the packages for the modern version of the compiler.
3 - clearly show groups of packages by direction (CLI, GUI, WEB, NET, etc.) on the homepage. The developer should be able to take an example from any group and get a demo version running with minimal effort.
4 - Try to "stick" to the mainstream, such as Qt. Now Qt fully supports only C++ and Python.  If we can position D as the third power in Qt, the rise in popularity of the D language is guaranteed.
Nov 03 2021
parent Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Wednesday, 3 November 2021 at 16:42:01 UTC, MGW wrote:
 4 - Try to "stick" to the mainstream, such as Qt. Now Qt fully 
 supports only C++ and Python.  If we can position D as the 
 third power in Qt, the rise in popularity of the D language is 
 guaranteed.
In another thread Guillaume pointed out that D was suitable for developing cross-platform desktop applications. That could be; right now there are no good options for that (unless you consider Java to be a good option). Dart/Flutter is preparing to cover the desktop with C++ interop, so that could be a contender in the future.

You are right, an easy-to-use application framework is needed if you want to attract intermediate developers. Being limited to web and command-line utils isn't all that attractive. Something like Qt is ok, but I don't think Qt is all that attractive. I think taking the engine in Flutter, tailoring it to the language, and giving it better 2D graphics capabilities might be the best option. Then you have something current and generic that can be used for both simple games and desktop UI. But you need maybe 5 developers to do it well. Are they here? I don't know.
Nov 03 2021
prev sibling parent reply kot <kot lin.ko> writes:
On Wednesday, 3 November 2021 at 16:38:03 UTC, Ola Fosheim 
Grøstad wrote:
 On Wednesday, 3 November 2021 at 16:25:55 UTC, kot wrote:
 now, if D had supported android/ios half as good as swift or 
 kotlin, i would not think twice. i find these language wars 
 silly, it is *always* about tooling/support. i am using c++ 
 for my current project because i have to. if i could use D as 
 painless as C++ (again, not about language quality. tool 
 quality, os-support, seamless ecosystem) i wouldn't think 
 twice. for the project i am working on, experiments and live 
 coding are vital. so, my obvious choice would be lisp, right? 
 but i can't.
 But do you feel productive in C++? I find that even for simple things, in C++ it will take 10x longer than in Python, and a language like D is somewhere in between. I guess that to some extent this is because I usually don't do things in C++ unless speed is critical, but the main gripe I have with C++ is that changing code is very costly. So it does not encourage you to avoid premature optimization. This is basically an area where a language like D (perhaps also Rust) might do better. So when you say that you are doing a project that requires experimentation, what made you reject languages other than C++?
i have been using c++ for almost 20 years and i am quite productive in it. given enough time i think one can be productive in any language. of course D would at least double that. this project (a game) at first targeted both mobile and pc platforms. for this reason alone, i was stuck with c/c++. then i dropped mobile support. i don't know the state of D's tooling, but if it were seamless enough then D would be my first choice now. i don't like rust as much; rust code looks even uglier than c++, and its handling of generic code/metaprogramming looks even worse. they should have started from D templates, not c++.
 what made you reject languages other than C++?
the obvious choice for such a project is lisp; afaik no other language still has that speed/power when it comes to live coding. compared to C++, D has that too. the answer is the same for both: tool and os support.
Nov 03 2021
parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Wednesday, 3 November 2021 at 17:12:28 UTC, kot wrote:
 i have been using c++ for almost 20 years and i am quite 
 productive in it. given enough time i think one can be 
 productive in any language.
Yes, but some languages require you to do solid modelling before you start, so they limit "evolutionary experiments". I find C++ to be such a language: in order to be productive you need a clear picture of what you want before you start. Lisp and Python and such languages allow you to grow a program "like a tree". I guess we could say they support metamorphosis (although dynamic typing can backfire).
 i  don't like rust as much, rust code looks even uglier than 
 c++ and its handling of generic-code/metaprogramming looks even 
 worse. they should have started from D templates, not c++.
The first time I saw a C program I thought it looked like incomprehensible shit compared to Turbo Pascal. :-) Then I learned it and went out of my way to write terse C code on an assignment in an algorithm class. I think the teacher got a headache when I wrote my solution on the blackboard (he allowed us to pick whatever language we wanted). When I see a Rust program (like the Rust compiler) I get the same feeling, but I cannot be sure if it is me or Rust. Maybe both?
 obvious choice for such a project is lisp, afaik no other 
 language still has that speed/power when it comes to live 
 coding. compared to C++ D has that too, answer is the same for 
 both; tool and os support
If it is a game, then I guess your choice is reasonable. C++ with gcc extensions is well tailored for games. Not having a SIMD type with traits is a weak spot in C++, though. I had to make my own library with SIMD traits to clean up my code, and figuring out how to do that was time-consuming… That is my main gripe with C++: you set out to clean up your code by designing a beautiful library, and then implementing it turns into a chore because of some language weak spots. It is important for D to iron out weak spots, otherwise D is too much like C++ to stand out.

(I guess you can embed Scheme/Guile or some other scripting language if you want to experiment too.)
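(For comparison, D exposes vector types directly in the runtime library. A minimal sketch, assuming a compiler and target where core.simd's float4 is available, with made-up values for illustration:)

```d
import core.simd;

// Element-wise multiply-add over four floats at a time;
// arithmetic on float4 compiles down to vector instructions.
float4 madd(float4 a, float4 b, float4 c)
{
    return a * b + c;
}

unittest
{
    float4 a = [1.0f, 2.0f, 3.0f, 4.0f];
    float4 b = [2.0f, 2.0f, 2.0f, 2.0f];
    float4 c = [0.5f, 0.5f, 0.5f, 0.5f];
    assert(madd(a, b, c).array == [2.5f, 4.5f, 6.5f, 8.5f]);
}
```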
Nov 03 2021
parent reply kot <kot lin.ko> writes:
On Wednesday, 3 November 2021 at 17:47:45 UTC, Ola Fosheim 
Grøstad wrote:
 When I see a Rust program (like the Rust compiler) I get the 
 same feeling, but I cannot be sure if it is me or Rust. Maybe 
 both?
after D every language looks ugly when it comes to generic programming.
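(To illustrate for readers who haven't written D: generic code is mostly ordinary code with compile-time parameters, and a template constraint states what the type must support. A toy sketch, not taken from any real library:)

```d
// A generic sum: T is inferred at the call site, and the
// constraint documents (and checks) what T must support.
T sum(T)(T[] values)
    if (is(typeof(T.init + T.init) : T))
{
    T total = 0;
    foreach (v; values)
        total += v;
    return total;
}

unittest
{
    assert(sum([1, 2, 3]) == 6);    // T inferred as int
    assert(sum([1.5, 2.5]) == 4.0); // T inferred as double
}
```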
 If it is a game, then I guess your choice is reasonable. C++ 
 with gcc extensions is well tailored for games. Not having a 
 SIMD type with traits is a weak spot in C++ though. I had to 
 make my own library with SIMD traits to clean up my code, and 
 figuring out how to do that was time-consuming… That is my main 
 gripe with C++, you set out to clean up your code by designing 
 a beautiful library and then implementing it turns into a chore 
 because of some language weak spots. It is important for D to 
 iron out weak spots, otherwise D is too much like C++ to stand 
 out.

 (I guess you can embed Scheme/Guile or some other scripting 
 language if you want to experiment too.)
years ago i suggested that D should be distributed with an embedded C compiler. since it was already required to be binary compatible with C, this was the obvious next step, which would also solve most tooling issues out of the box. from license issues to practicality there were many voices against it; some said it was dumb.

what you are saying about c++ at first looks like a failure of the language. for the most part that is of course right. but i also think it is a compliment to c/c++. at least with these languages you are not dealing with tooling issues, which to me is the most important thing. they save you from one hell, only to welcome you with another, yet they give you the tools (half-assed tools, but tools nonetheless) to get something done.
Nov 03 2021
parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Wednesday, 3 November 2021 at 18:20:55 UTC, kot wrote:
 years ago i suggested that D should be distributed with an 
 embedded C compiler. since it was already required to be binary 
 compatible with C, this was the obvious next step, which would 
 also solve most tooling issues out of the box. from license 
 issues to practicality there were many voices against it; some 
 said it was dumb.
Hah, I have suggested this too!! About seven (??) years ago. Did you use a different nickname back then? (Walter didn't like it, but has he changed his mind now?)
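(Presumably the change of mind refers to ImportC, which was landing in dmd right around the time of this thread: the D compiler can compile plain C source itself and expose it as an importable module. A minimal sketch, assuming a dmd new enough to ship ImportC; the file and function names are made up:)

```d
// util.c, a plain C file compiled by dmd itself:
//
//     int square(int x) { return x * x; }
//
// main.d, built together with it via: dmd main.d util.c
import util; // ImportC: the C file becomes a D module

void main()
{
    assert(square(7) == 49); // calling the C function directly
}
```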
 is the most important thing. they save you from one hell, only 
 to welcome you with another, yet they give you the tools 
 (half-assed tools, but tools nonetheless) to get something done.
Well, yes, although it shows that C/C++ were not designed with tooling in mind. But looking back, those languages have improved a lot. When I started with C, the ANSI standard was so new that most codebases I retrieved by FTP didn't support it. So they were littered with #ifdefs and macros to support all the C dialects (and Unices…). There has been a steady stream of improvement since, although C itself looks arcane now.
Nov 03 2021
parent reply kot <kot lin.ko> writes:
On Wednesday, 3 November 2021 at 18:49:46 UTC, Ola Fosheim 
Grøstad wrote:
 On Wednesday, 3 November 2021 at 18:20:55 UTC, kot wrote:
 years ago i suggested that D should be distributed with an 
 embedded C compiler. since it was already required to be binary 
 compatible with C, this was the obvious next step, which would 
 also solve most tooling issues out of the box. from license 
 issues to practicality there were many voices against it; some 
 said it was dumb.
 Hah, I have suggested this too!! About seven (??) years ago. Did you use a different nickname back then? (Walter didn't like it, but has he changed his mind now?)
looks like it was around 2011-2013, i have yet to find the post. yes i used a different name back then, haven't posted anything since.
Nov 03 2021
parent Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Wednesday, 3 November 2021 at 19:05:49 UTC, kot wrote:
 looks like it was around 2011-2013, i have yet to find the 
 post. yes i used a different name back then, haven't posted 
 anything since.
It could have been around that time. I am not able to find my first post in the forums, maybe some of the early posts have been lost? Anyway, one other user agreed (:-D), but I cannot remember the nick…
Nov 03 2021
prev sibling next sibling parent reply arco <qva6y4sqi relay.firefox.com> writes:
On Tuesday, 2 November 2021 at 17:27:25 UTC, Dr Machine Code 
wrote:
 It got [asked on 
 reddit](https://www.reddit.com/r/d_language/comments/q74bzr/why_is_d_unpopular/) 
 sub but for those that aren't active too, I'd like you 
 opinions. Please don't get me wrong, I also love D, I've used it 
 everywhere I can and I'd say it's my favourite language (yes I 
 have one...) but I'm as as the reddit's OP, trying to understand 
 why it's unpopular. Rust and Go seeming to be getting more and 
 more users. I think it's due to large ecosystem and the big 
 corporations with deep pockets that pushes them. But I'd like to 
 know you all opinions
This is my first post here. I've been reading this forum for a long time out of general interest, but since I never actually decided to try D, I thought I could contribute my 2 cents to this topic. In addition to all that has already been said, I think there are two fundamental issues.

This first one is historic: for a long time, D was not open source, in an era where proprietary languages had long been frowned upon. People expect a language and all its associated tools to be free, both as in speech and as in beer. D may not have been expensive and difficult to obtain like Ada was, but this was still probably a big detriment to its early adoption. By the time it was relicensed and truly open source compilers appeared, its launch window had closed and there were other languages around.

This brings me to the second point: perhaps D is simply based on the wrong idea. From my reading of the dlang forums, it wants to be, if not all things to everyone, at least a general all-around language that can do more or less everything. The problem is that maybe this is not what the world wants. Most people are very happy to use different languages for different problems, and they will go for the one that is the best (for varying definitions of best) at something, rather than one that is pretty good at lots of things. Today one can certainly do low level system programming in D, but Rust is a better systems language. One can develop microservices etc. in D, and it might be pretty good for that, but Go is better. D can even be used as a scripting language or one to drive high level logic, but Python is better for that.

To put it differently, the world doesn't seem to want another C++. Both Rust and Go came after D and enjoyed significant uptake in areas that overlap with D's. The key difference IMHO is that they both know not only what they need to provide to be good options for their selected application spaces, but also what they don't want to become and what is totally out of scope for them. I really believe that the latter part is as important as the former, if not more so, and if there is one feature that D is lacking to get more traction, it's probably this one: deciding which rabbits it's not trying to chase.
Nov 03 2021
next sibling parent reply Adam Ruppe <destructionator gmail.com> writes:
On Wednesday, 3 November 2021 at 23:08:54 UTC, arco wrote:
 This first one is historic: for a long time, D was not open 
 source
This is not true. It is a really persistent myth, but it is easy to prove that D was GPL'd as early as 2002. gdc's predecessor was out by 2003, also GPL-licensed.
Nov 03 2021
parent reply arco <qva6y4sqi relay.firefox.com> writes:
On Wednesday, 3 November 2021 at 23:13:34 UTC, Adam Ruppe wrote:
 On Wednesday, 3 November 2021 at 23:08:54 UTC, arco wrote:
 This first one is historic: for a long time, D was not open 
 source
This is not true. Really persistent myth but easy to prove that it was GPL'd as early as 2002. gdc's predecessor was out by 2003, also GPL licensed.
But at what point did D become truly usable using open source compilers? GDC was only declared feature complete, supported, and merged upstream in GCC 9.0, that is, in 2019. Both Go and Rust came with open source, production-quality reference compilers from day 1. It really makes a difference.

Have you got some information about the early GPL'd compiler? My understanding is that DMD only became open source some time around 2014; was there another early project?
Nov 03 2021
next sibling parent "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Wed, Nov 03, 2021 at 11:33:05PM +0000, arco via Digitalmars-d wrote:
 On Wednesday, 3 November 2021 at 23:13:34 UTC, Adam Ruppe wrote:
 On Wednesday, 3 November 2021 at 23:08:54 UTC, arco wrote:
 This first one is historic: for a long time, D was not open source
 This is not true. It is a really persistent myth, but it is easy to prove
 that D was GPL'd as early as 2002. gdc's predecessor was out by 2003,
 also GPL-licensed.
[...]
 Have you got some information about the early GPL'd compiler? My
 understanding is that DMD only became open source some time around
 2014; was there another early project?
DMD's front end has always been open source, and GDC has always used only the DMD front end, so GDC has always been open source since the first day GDC came into existence. The DMD backend was not open source due to an obligation between Walter and Symantec, but Symantec eventually released Walter from this obligation in 2017, and since then the entire DMD toolchain has been open source. T -- Music critic: "That's an imitation fugue!"
Nov 03 2021
prev sibling next sibling parent reply Iain Buclaw <ibuclaw gdcproject.org> writes:
On Wednesday, 3 November 2021 at 23:33:05 UTC, arco wrote:
 On Wednesday, 3 November 2021 at 23:13:34 UTC, Adam Ruppe wrote:
 On Wednesday, 3 November 2021 at 23:08:54 UTC, arco wrote:
 This first one is historic: for a long time, D was not open 
 source
 This is not true. It is a really persistent myth, but it is easy to prove that D was GPL'd as early as 2002. gdc's predecessor was out by 2003, also GPL-licensed.
 But at what point did D become truly usable using open source compilers? GDC was only declared feature complete, supported, and merged upstream in GCC 9.0, that is, in 2019. Both Go and
I would like to know where you read that GDC wasn't feature complete for nearly 20 years.
 Rust came with open source, production-quality reference 
 compilers from day 1. It really makes a difference.
Actually, using your logic with GDC, you could say that because Rust isn't merged into GCC, Rust is actually *not* open source, feature complete, supported, or truly usable.
 Have you got some information about the early GPL'd compiler? 
 My understanding is that DMD only became open source some time 
 around 2014, was there another early project?
Watch my talk on GDC, which covers the timeline of the early D compilers (BrightD, OpenD, DLI, GDMD, and a few others). :-) In short, open source (and more specifically porting to Linux) was talked about since day one really, and Walter released the source code in 2002 in order to let the community do this. It took about two years before there was eventually a compiler that was almost feature complete: DGCC, later most commonly known as GDC.
Nov 05 2021
parent Imperatorn <johan_forsberg_86 hotmail.com> writes:
On Friday, 5 November 2021 at 09:16:39 UTC, Iain Buclaw wrote:
 On Wednesday, 3 November 2021 at 23:33:05 UTC, arco wrote:
 [...]
I would like to know where you read that GDC wasn't feature complete for nearly 20 years. [...]
GDC rox
Nov 05 2021
prev sibling parent reply bachmeier <no spam.net> writes:
On Wednesday, 3 November 2021 at 23:33:05 UTC, arco wrote:

 But at what point did D become truly usable using open source 
 compilers? GDC was only declared feature complete, supported, 
 and merged upstream in GCC 9.0, that is, in 2019. Both Go and 
 Rust came with open source, production-quality reference 
 compilers from day 1. It really makes a difference.
How are you defining "day 1"? I stopped using Rust before version 1, and I can guarantee that there was nothing production-quality about their compiler. Unless you mean by "day 1" the day version 1 was released, which was four years after the first release. When I checked Wikipedia to see these dates, I found this interesting quote:
 In January 2014, before the first stable release, Rust 1.0, the 
 editor-in-chief of Dr. Dobb's, Andrew Binstock, commented on 
 Rust's chances of becoming a competitor to C++ and to the other 
 up-and-coming languages D, Go, and Nim (then Nimrod). According 
 to Binstock, while Rust was "widely viewed as a remarkably 
 elegant language", adoption slowed because it repeatedly 
 changed between versions.
Nov 05 2021
parent Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Friday, 5 November 2021 at 16:02:45 UTC, bachmeier wrote:
 How are you defining "day 1"? I stopped using Rust before 
 version 1, and I can guarantee that there was nothing 
 production quality about their compiler. Unless you mean by 
 "day 1" the day version 1 was released - which was four years 
 after the first release". When I checked Wikipedia to see these 
 dates, I found this interesting quote
Rust had a very erudite hardcore following from the start. Some people would argue about their proofs for various formulations of aspects of the type system in Coq. Lots of showing off in the commentary… So they attracted people with a comp.sci. background (and possibly students). In that sense they did not need to be productive from the start; they had fun with the type system etc. I don't think this has been the case for D, but you could argue that those who created Tango also had fun with ideas (albeit in a less erudite manner).

I also don't think open source mattered much in the case of Go. Like most Google products it is primarily developed by Google employees, and these products tend to die/stagnate when Google stops development.
Nov 05 2021
prev sibling parent reply Guillaume Piolat <first.last gmail.com> writes:
On Wednesday, 3 November 2021 at 23:08:54 UTC, arco wrote:
 Today one can certainly do low level system programming in D, 
 but Rust is a better systems language. One can develop 
 microservices etc. in D, and it might be pretty good for that, 
 but Go is better. D can even be used as a scripting language or 
 one to drive high level logic, but Python is better for that.
Interesting point of view, because it obviously isn't the point of view of many in the D community. I'll spare you the details, but it is very possible that D outperforms those languages in their own specialties. What you said is indicative of the effectiveness of the stories told by those other languages, probably because they were described in terms of benefits: https://www.lumitos.com/en/blog/feature-advantage-benefit-the-fab-formula-for-product-descriptions-that-sell/

All 3 homepages of Rust/Go/Python start with a benefit and mention few features. It is a conscious marketing effort with relatively standard marketing thinking.
Nov 05 2021
next sibling parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Friday, 5 November 2021 at 13:10:58 UTC, Guillaume Piolat 
wrote:
 Interesting point of view because obviously it isn't the point 
 of view of many in the D community. I'll spare you the details 
 but it is very possible D outperform those in their specialty.
I think we need to embrace the idea that there is no single factor, and that D cannot best languages with critical mass in their speciality niches.

One does not have to like Go as a language to understand that Go objectively has a runtime and ecosystem that make it more suitable for micro services than D; Go is better supported (by cloud providers and by library authors). For instance, last time I tested I could automatically boot OS+service on a new machine as a Go instance in one second on App Engine Standard, and a Java server in 4 seconds. Although I could set up D, I would not be able to get close to those numbers, it would not be automatic and it would cost more. Clearly Go wins (unless Python is suitable, in which case Python wins).

Rust clearly is more suitable than D if you want WASM (C++ might be even better, I don't know). Python is objectively the better portable scripting language, because D isn't a scripting language. Python has vast libraries and interop you get access to; there is no way D can reach critical mass there.

Of course, there is no reason for D to go after microservices, scripting or WASM either!! If D had an application framework, then it would be better than Go/Rust/Python for writing desktop apps, for instance.
Nov 05 2021
next sibling parent reply Bruce Carneal <bcarneal gmail.com> writes:
On Friday, 5 November 2021 at 13:42:44 UTC, Ola Fosheim Grøstad 
wrote:
 On Friday, 5 November 2021 at 13:10:58 UTC, Guillaume Piolat 
 wrote:
 Interesting point of view because obviously it isn't the point 
 of view of many in the D community. I'll spare you the details 
 but it is very possible D outperform those in their specialty.
I think we need to embrace the idea that there is no single factor, and that D cannot best languages with critical mass in their speciality niches.
I agree that language evaluation occurs in a multi-dimensional space (no single factor), but I believe that D has already "bested" C++, to pick one competitor, in several ways. For example, in performance-critical areas it has allowed individuals, or small groups, to quickly develop software that equals or outperforms the world's best. Roughly a (hard performance / development cost) metric.

For me D is significantly better than my previous favorite in the performance space (C++/CUDA). The accessible and powerful metaprogramming capabilities let you get to the metal quickly with code that is quite readable. Unittest, modules, nested functions, ... are also very helpful.

I imagine the Mir developer(s) might feel the same way. I don't know exactly how much Intel has spent on the portion of MKL that competes with Mir, but I would be quite surprised if it were less than 10X that spent on Mir.

Finally, note that "besting" a particular language in some way(s) does not imply "besting" it in terms of uptake, but it does mean that for those who weight those factors heavily, D is the right choice.
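To make the metaprogramming point concrete, here is a minimal sketch of the kind of thing I mean (the fixed-size dot product and all names are invented for illustration, not taken from Mir):

```
import std.traits : isFloatingPoint;

// Fixed-size dot product: N is a compile-time parameter, so the
// loop below is fully unrolled and there is no runtime loop overhead.
T dot(T, size_t N)(const ref T[N] a, const ref T[N] b)
    if (isFloatingPoint!T)
{
    T sum = 0;
    static foreach (i; 0 .. N)
        sum += a[i] * b[i];
    return sum;
}

unittest
{
    double[3] a = [1.0, 2.0, 3.0];
    double[3] b = [4.0, 5.0, 6.0];
    assert(dot(a, b) == 32.0);
}
```

Nothing exotic: templates, static foreach and built-in unittest blocks, all in ordinary D.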
Nov 05 2021
next sibling parent Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Friday, 5 November 2021 at 15:30:51 UTC, Bruce Carneal wrote:
 Finally, note that "besting" a particular language in some 
 way(s) does not imply "besting" that language in terms of 
 uptake but it does mean that for those who weight those factors 
 heavily, D is the right choice.
Yes, for instance, if you know D and not Python and don't need to do system integration, then the cost of using D for "scripting" is much less than using Python. So I understand that programmers who don't want to be polyglot might prefer a swiss-army-knife language like D.

For polyglot programmers, I think, specialities tend to win out. For instance, I have no preference for Dart as a language for writing GUI applications. But right now it looks like it could win out by conquering iPhone, Android and, in the future, the desktop. Once Dart has established itself on the desktop, it might be difficult to compete. So there is a window of opportunity for languages like D, Nim etc in that space right now. Who builds the more attractive portable application framework wins. And in that space I would agree that C++ is not a strong contender even though it has Qt (although C++ does compete in companionship with other languages).
Nov 05 2021
prev sibling parent reply Dr Machine Code <jckj33 gmail.com> writes:
On Friday, 5 November 2021 at 15:30:51 UTC, Bruce Carneal wrote:
 On Friday, 5 November 2021 at 13:42:44 UTC, Ola Fosheim Grøstad 
 wrote:
[...]
I agree that language evaluation occurs in a multi-dimensional space (no single factor) but I believe that D has already "bested" C++, to pick one competitor, in several ways. For example, in performance critical areas it has allowed individuals, or small groups, to quickly develop software that equals or outperforms the world's best. Roughly a (hard performance / development cost) metric. [...]
What's Mir and MKL?
Nov 05 2021
parent mw <mingwu gmail.com> writes:
On Saturday, 6 November 2021 at 03:20:30 UTC, Dr Machine Code 
wrote:
 What's Mir and MKL?
https://www.libmir.org/ https://en.m.wikipedia.org/wiki/Math_Kernel_Library
Nov 05 2021
prev sibling parent reply Dr Machine Code <jckj33 gmail.com> writes:
On Friday, 5 November 2021 at 13:42:44 UTC, Ola Fosheim Grøstad 
wrote:
 On Friday, 5 November 2021 at 13:10:58 UTC, Guillaume Piolat 
 wrote:
[...]
 Of course, there is no reason for D to go after microservices, 
 scripting or WASM either!! If D had an application framework, 
 then it would be better than Go/Rust/Python for writing desktop 
 apps, for instance.
What kind of application framework? Something like .NET?
Nov 05 2021
parent Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Saturday, 6 November 2021 at 02:43:44 UTC, Dr Machine Code 
wrote:
 What kind of application framework? something like .NET?
Something suitable for audio-visual applications. Something you could use to build visualization tools, photo editors, audio editors, training software, simulation tools, simpler games… A framework for building the kind of desktop applications where you still benefit from native compilation and OS features.
Nov 06 2021
prev sibling parent reply arco <qva6y4sqi relay.firefox.com> writes:
On Friday, 5 November 2021 at 13:10:58 UTC, Guillaume Piolat 
wrote:

 Interesting point of view because obviously it isn't the point 
 of view of many in the D community. I'll spare you the details 
 but it is very possible D outperform those in their specialty.
I know it's not the point of view of many in the D community, but at least in part this topic is about the gap between the PoV of the D community and the rest. When the D community believes D has many advantages but the rest of the developer community is not rushing to embrace it, and when at the same time the OP asks why that is, then it's fair to ask whether outsiders might actually see certain things that the dlang community just doesn't want to recognise.

For low-level systems programming, Rust is fully committed to manual memory management, unlike D with its schizophrenic attitude towards the GC. Together with the borrow checker, Rust can offer some static guarantees that D can't and which are especially important in low-level code. Unlike D with Phobos, using Rust that way also doesn't require you to forego the standard library. At the same time, its type system is richer than D's, with full Hindley-Milner inference, and especially now that GATs are being stabilised the language is basically on its way to supporting HKTs. It also offers sophisticated error handling without using exceptions (another big win in low-level code).

Many people have forgotten it now, but at one point Rust had classes with inheritance as well as a GC. It was concluded that those features didn't belong in a systems language, and they were removed. This is probably Rust's greatest strength over D: its consistency, the follow-through to ensure that features that are declared stable are really rock solid and make sense within the language's general philosophy. Sometimes the choice made is the wrong one, of course (the async semantics don't really play well with Linux's new io_uring API, for example), but even then, the community is fully committed to supporting those features into the future with all the care it takes, and to finding solutions that work whenever they made the wrong call. Add a homogeneous ecosystem, top notch tooling and documentation, and it's easy to see why it has been so successful.

The second example was Go. It's often criticised for being a limited language with not much expressive power. Undeniably D's modelling power is far greater than Go's. It's also largely irrelevant: Go still doesn't even have generics in stable, and that hasn't slowed down its adoption. Why? Because Go's designers knew their target: corporate coders whose only concern is productivity and time-to-market. Go is not "elegant" or rich or powerful, but it's extremely easy to use and quick to get things done. A GC is an advantage for those applications, and Go's is best of breed. D's situation here is a kind of lose-lose: it is primarily and clearly a GC language, which hampers its use for low-level code, yet where a GC is desirable, it can't compete with Go's GC.

Goroutines are another big item: a web application developer needs to react asynchronously to an event, and calling "go process_stuff()" is an immediate answer. They want the code to work today, not to spend time marvelling at how cleverly (or "cleverly") some framework that no longer compiles with a recent compiler used to implement the same thing through template metaprogramming. Go gives them that, and the time and effort it saves is worth every penny. In many ways, Go epitomises an important trait of the Zen of Python: for every problem, there is one obvious way to do it. That's exactly what that programming community wants: a language that gets out of their way and just gives them the simple, basic building blocks (but of excellent quality) that allow them to solve their actual problem.
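(For comparison, the closest off-the-shelf D counterpart to "go process_stuff()" is probably std.parallelism; a minimal sketch with an invented worker function. Note that this queues work on a thread pool rather than spawning a cheap goroutine, which is part of the difference:)

```
import std.parallelism : task, taskPool;

// hypothetical worker, standing in for Go's process_stuff()
void processStuff()
{
    // ... do the work ...
}

void main()
{
    auto t = task!processStuff();
    taskPool.put(t); // roughly `go processStuff()`
    t.yieldForce();  // wait for completion (Go would use a WaitGroup)
}
```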
Regarding Python, its biggest advantage (other than its outstanding ecosystem) is that it's a dynamic language. Prototyping, testing, customising etc. will always be easier and more flexible in Python than in any compiled language. Plus its conscious focus on a simple syntax makes it, in my opinion, the modern-day heir to the BASIC of the 80s in terms of universal applicability and a kind of immediacy.

So where does that leave D? Someone here mentioned mobile development. IMHO that is one huge missed opportunity for D. It could have been a language of choice on Android (better than either Java or Kotlin). But note that no-one develops mobile applications in C, C++ or Rust. That's because, once again, it requires a language that seamlessly blends with the APIs and doesn't bother the developer with the minutiae of memory management etc.

D's general approach has been to always try to have its cake and eat it too. It wants a GC but at the same time wants to pretend that it doesn't need a GC; it wants to support OOP (with inheritance etc.) and at the same time tries to downplay it; etc. Ultimately, in each of these cases it fails to provide the advantages of either of the valid choices while retaining the disadvantages of both.
Nov 06 2021
next sibling parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Sunday, 7 November 2021 at 06:36:48 UTC, arco wrote:
 I know it's not the point of view of many in the D community, 
 but at least in part this topic is all about the gap between 
 the PoV of the D community vs the rest.
Yes, I think you get this response because people have written successful applications for things like micro-services. But there is a difference between being usable and having solid support.

So the real challenge for D is focusing and providing solid support for something specific. I think everyone understands this. People who use D in professional contexts seem to be OK with creating their own frameworks, which requires dedication.

As of today D is for programmers with a high level of dedication. That basically puts D in the same category as C++ and Haskell, but not in the same category as Go, Python, Java etc. That there are fewer programmers with a high level of dedication to a single language should not surprise anyone.
 that GAT's are being stabilised and the language is basically 
 on its way to support HKTs.
What does "GAT" and "HKT" mean?
 It also offers sophisticated error handling without using 
 exceptions (another big win in low level code).
Doesn't seem to be better than C++20 in that regard? How is Rust's error handling sophisticated?
 it hasn't slowed down its adoption. Why? Because Go's designer 
 knew their target: corporate coders whose only concern is 
 productivity and time-to-market. Go is not "elegant" or rich or 
 powerful, but it's extremely easy to use and quick to get 
 things done.
Go is OK for tiny applications that fit well with its standard library, but the language does not seem to scale very well. I am not so sure about productivity; I want to see that measured in terms of long-term evolution and maintenance.

I would prefer Java, but Go's runtime is lighter and spins up faster on the platform I use, so Go it is. That has nothing to do with the language, only runtime and stability. The JVM clearly provides better languages than Go…

D isn't stable enough. Both Java and Go have stellar backwards compatibility, and that is critical for servers where you may be forced to upgrade overnight (either because of security issues or because the cloud service requires you to).
 D's situation here is a kind of lose-lose: it is primarily and 
 clearly a GC language, which hampers its use for low level 
 code, yet where a GC is desirable, it can't compete with Go's 
 GC.
Well, it is primarily C with some adjustments and additions. What is unfortunate is that the language does not fit well with a global GC and there is no willingness to do what is necessary: switch to a GC model that is local to the computation and use reference counting elsewhere (other options are also possible; a library building block for the reference counting part already exists, see the sketch below).

There is also little willingness to adjust the language, clean out inconsistencies, improve on C (like fixing the flawed operator precedence rules of C). These language issues are perhaps not so apparent to people who primarily use D, but they stand out to anyone with a comp.sci. background or who uses many languages.

I see people claim tooling as the "one issue". I don't think so. I think language issues come first, then ecosystem, then tooling. Can I prove it? No, but if you have quirks that are off-putting to highly educated programmers, then that means a less sophisticated ecosystem. This is particularly true for D and C++, as making good use of metaprogramming requires a high level of sophistication!

The one issue that truly prevents progress is that the language designers are happy with the language as is, and they are not interested in changes (only additions).
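(A minimal sketch of the library building block I mean, std.typecons.RefCounted; the Payload type is invented:)

```
import std.typecons : RefCounted;

struct Payload { int x; }

void main()
{
    // the payload lives outside the GC heap and is freed
    // deterministically when the last copy goes away
    auto a = RefCounted!Payload(42);
    auto b = a; // reference count is now 2
    assert(b.x == 42);
}
```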
 Regarding Python, its biggest advantage (other than its 
 outstanding ecosystem) is that it's a dynamic language.
And its biggest disadvantage. However now that type annotations have been added, you basically have gradual typing in Python, which is very productive.
 development. IMHO that is one huge missed opportunity for D. It 
 could have been a language of choice on Android (better than 
 either Java or Kotlin).
Moving target. You need vast resources to address the continuous changes on mobile platforms. The graveyard for mobile cross platform frameworks is enormous! It is improbable for D to succeed in this field.
 D's general approach has been to always try to have its cake 
 and eat it too. It wants a GC but at the same time wants to 
 pretend that it doesn't need a GC, it wants to support OOP 
 (with inheritance etc) and at the same time tries to downplay 
 it, etc. Ultimately in each of these cases it fails to provide 
 the advantages of either of the valid choices while retaining 
 the disadvantages of both.
D is retaining the disadvantages because features are being replicated from other languages rather than reinvented. If you want to combine low-level programming with high-level programming, you need to move further away from the mainstream than D does (e.g. choose the actor model). In order to reinvent, D needs to accept more breaking changes at the language level.
Nov 07 2021
next sibling parent Imperatorn <johan_forsberg_86 hotmail.com> writes:
On Sunday, 7 November 2021 at 09:18:37 UTC, Ola Fosheim Grøstad 
wrote:
 On Sunday, 7 November 2021 at 06:36:48 UTC, arco wrote:
 [...]
Yes, I think you get this response because people have written successful applications for things like micro-services. But there is a difference between being usable and having solid support. [...]
Generic Associated Types and Higher Kinded Types
Nov 07 2021
prev sibling parent reply arco <qva6y4sqi relay.firefox.com> writes:
On Sunday, 7 November 2021 at 09:18:37 UTC, Ola Fosheim Grøstad 
wrote:

 Yes, I think you get this response because people have written 
 successful applications for things like micro-services. But 
 there is a difference between being usable and having solid 
 support.
It is a very common response when people are enthusiastic about a particular technology. People for whom using D is a goal in itself are using it, but the others are not going to move to it for the sake of using D. They would do it if D solves some problems they are currently facing, or if it allows them to achieve their own objectives (which are probably unrelated to D) better than some other alternative.
 So the real challenge for D is focusing and providing solid 
 support for something specific. I think everyone understands 
 this. People who use D in professional contexts seems to be ok 
 with creating their own frameworks, which requires dedication.
But there are two aspects to this where in my opinion D is currently failing, and both are more cultural than technical. That D "can" do something is not enough; that someone successfully wrote some app in D doesn't mean D is a good choice for that application field. Like any language, D must not just be good at something, it must be the best at something. It's fair to say that currently it isn't. The dlang community wants to push static if etc. as their killer feature, but it seems that more often than not, it elicits a "meh" reaction.

The other problem is that D seems deeply allergic to making hard decisions. From reading this forum it seems to me that the overall attitude within D is that it should offer choices, not enforce decisions. But in the real world, that's often a bad thing. Another language with the same mantra was Perl. Look at what happened to Perl the moment we got Python, Ruby etc. Enforcing decisions is a Good Thing: it ensures that people understand each other's code, that analysis tools can do a much better job, that people won't roll out their own error management that doesn't work together with someone else's, that there will be consistency in memory management, etc etc etc.

Once again, the two languages somewhat comparable to D that "made it", Go and Rust, are all about enforcing what they deemed to be their way of doing things. That doesn't mean it's the only legitimate way, but it means that if you want to do X, this is how you do it in Go (not that there isn't a one true way, but you are free to try to devise 15 different ways). And if the Go way doesn't work for you, that's perfectly fine; there are other languages that would be a better fit for your problem. Same for Rust. This is a hard lesson that D desperately doesn't want to learn. By trying to appease everyone, it ultimately doesn't win for anyone in particular.
 As of today D is for programmers with a high level of 
 dedication. That basically puts D in the same category as C++ 
 and Haskell, but not in the same category as Go, Python, Java 
 etc.
Not really. C++ has a huge wealth of frameworks, IDEs, tools and libraries that make up its strength. Plus, ever since C++11 it has been dedicated to recognising and attempting to fix its problems (although the solutions are often a kind of placebo: there is no way to ensure, for example, that a unique_ptr is really unique, or that an object won't be accessed after std::move). I personally dislike C++ with a vengeance, but if you develop in C++, you will never be on your own. Not so with D.

Haskell is a different beast altogether. It's a niche language that knows exactly what it wants to be and especially what it doesn't want. It's the perfect solution for a specific area of problems. Which illustrates the discussion above.
 That there are fewer programmers with a high level of 
 dedication to a single language should not surprise anyone.
Of course. But why should they? A language is nothing more than a tool. Carpenters are dedicated to building furniture, not to using a hammer.
 What does "GAT" and "HKT" mean?
[Generic Associated Types](https://blog.rust-lang.org/2021/08/03/GATs-stabilization-push.html). HKT = Higher Kinded Types. GATs are a subset of HKTs.
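(For context on the D side: D templates can already abstract over type constructors, which covers some HKT-style use cases; a toy sketch:)

```
import std.typecons : Nullable;

// F is itself a template (a type constructor), so Twice abstracts
// over type constructors, which is the HKT-flavoured part
template Twice(alias F, T)
{
    alias Twice = F!(F!T);
}

static assert(is(Twice!(Nullable, int) == Nullable!(Nullable!int)));
```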
 Doesn't seem to be better than C++20 in that regard? How is 
 Rust's error handling sophisticated?
In several ways. Rust has proper sum types and a "match" statement that guarantees that all cases are covered (plus it has functional semantics). That alone makes it more general and safer than C++. It also has some syntactic sugar (like the ? operator) and the pseudo-monadic methods (and_then etc.) that together make it basically as easy to use as exceptions. But it also has the advantage that, since all errors are in-band values, they are friendly towards multithreading and parallelisation. With the Rayon library (which provides parallel iterators) you can for example write something like:

```
// with rayon's prelude in scope: use rayon::prelude::*;
let results: Vec<_> = data.par_iter()
    .map(|v| process_value(v))
    .collect();
```

This will run process_value() in parallel on all elements of the data collection. If process_value() can fail (= it could throw in D or C++), it will return a Result. You will get a collection (like a Vec) of Results, and you will be able to check which ones have succeeded and which have failed. Using exceptions in multithreaded code is... no comment.
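(The same in-band pattern can be written in D with std.parallelism, though nothing in the language pushes you towards it; a rough sketch with invented names, using Nullable as a poor man's Result:)

```
import std.parallelism : taskPool;
import std.typecons : Nullable, nullable;

// hypothetical fallible worker: failure is an empty Nullable, not a throw
Nullable!int processValue(int v)
{
    if (v < 0)
        return Nullable!int.init; // in-band failure
    return nullable(v * 2);
}

void main()
{
    auto data = [1, 2, -3, 4];
    // eager parallel map; each element's success/failure stays in-band
    auto results = taskPool.amap!processValue(data);
    foreach (i, r; results)
    {
        if (r.isNull) { /* data[i] failed */ }
    }
}
```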
 Go is ok for tiny applications that fits well to their standard 
 library, but the language does not seem to scale very well. I 
 am not so sure about productivity. I want to see that measured 
 in terms of long term evolution and maintenance.
It seems that those who use Go, Google to begin with, beg to disagree that it's just for tiny applications. As for whether it scales well, its long term evolution and maintenance... how do you think it compares to D as far as hard data and proof of the pudding are concerned?
 And its biggest disadvantage. However now that type annotations 
 have been added, you basically have gradual typing in Python, 
 which is very productive.
Of course it's also a big disadvantage. But notice the pattern: Python is dynamic - type annotations don't change that, it can still alter classes at runtime, reflect, etc. It made that design choice and tries to offer developers all the benefits that go with it. And where it's not suitable, you simply don't use Python.
 Moving target. You need vast resources to address the 
 continuous changes on mobile platforms. The graveyard for 
 mobile cross platform frameworks is enormous! It is improbable 
 for D to succeed in this field.
It needn't be improbable. Kotlin was born as a third-party project, like D but smaller. Google saw its benefit for Android development and embraced it wholeheartedly. It could have been D instead (and IMO it would make a nicer Android language than Kotlin). Making D target the JVM shouldn't have been a big problem. What was and is still missing is consistency, well-defined features that make sense for a stated goal, excellent tools and documentation, and community dedication (to the application field, not to the language as such).
 D is retaining the disadvantages because features are being 
 replicated from other languages rather than reinvented. If you 
 want to combine low level programming with high level 
 programming you need to go further away from the mainstream 
 than D. (E.g. choose the actor model).
You are onto something there. Maybe an interesting and promising direction for D would be to take inspiration from Erlang as an actor-based application language, but with full C interoperability and a generally more conventional (and thus more approachable) language design. But once again, it needs to make the choices that are consistent with that, in full knowledge that it means saying NO to the features that aren't, and keep on track. It also requires a deliberate and long-term community-building effort. Go does it and Rust does it, but D still seems to believe that "if you just make a compiler, they will come".
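(D already ships a small seed of that style in std.concurrency, i.e. message passing between threads; a minimal sketch, with the worker and its messages invented for illustration:)

```
import std.concurrency : spawn, send, receive, ownerTid;

// actor-ish worker: receives ints, replies with their squares
void worker()
{
    bool done = false;
    while (!done)
    {
        receive(
            (int n) { ownerTid.send(n * n); },   // handle a job
            (string s) { done = (s == "stop"); } // shutdown message
        );
    }
}

void main()
{
    auto w = spawn(&worker);
    w.send(6);
    receive((int sq) { assert(sq == 36); });
    w.send("stop");
}
```

Of course this is nowhere near Erlang's supervision trees and per-process heaps; it only shows that the messaging primitive exists.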
Nov 07 2021
next sibling parent Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Monday, 8 November 2021 at 01:36:59 UTC, arco wrote:
 On Sunday, 7 November 2021 at 09:18:37 UTC, Ola Fosheim Grøstad 
 wrote:
 But there are two aspects to this where in my opinion D is 
 currently failing and both are more cultural than technical.
I don't think D is failing, as it does what the creator set out to make it do. That this appeals to a limited audience is not as such a failure. In the context of computer language history, you might argue that some aspects of the semantics and syntax have serious flaws and that the language designers were unwilling or unable to fix them. But "failure" is not an appropriate word even in that context.
 The other problem is that D seems deeply allergic to making 
 hard decisions.
Yes, this is also typical for language design processes. It is easier to add new features than to adjust what is already there. You can always say «we are busy adding this new feature that will make all the difference», so the focus on streamlining what exists is limited to the most trivial changes, and for the rest one can assert that the price of change is too high.
 Once again, the two languages somewhat comparable to D that 
 "made it", Go and Rust, are all about enforcing what they 
 deemed to be their way to do things. That doesn't mean it's the 
 only legitimate way, but it means that if you want to do X, 
 this is how you do it in Go (not that there isn't a one true 
 way, but you are free to try to devise 15 different ways).
If there are 15 ways, then there is no real enforcement? I roll my own error handling using panic. I care more about correctness and maintenance than about a slowdown of code paths that are rarely executed. The Go Way does not lead to more correct programs as far as I am concerned. Most people don't code C++ the way they do at conferences either; that is the power of metaprogramming. The promise of high-powered metaprogramming is that you can shape the language to the domain.

In fact I think this is a weak spot of D. It should leave more room for shaping the language. This is an area where D should be doing better, but the language designers have put severe constraints on operators that make it more difficult to create useful types. These constraints are there out of fear of creating a tower of Babel. As such, D has kneecapped what is touted as its strong point. D should also add unicode operators and become the best language for building frameworks for scientific computing etc.

But there is no strategic thinking or "scientific philosophy" shaping the language, just opinions. As I have already said: too much focus on what other programming languages are doing (replicating) and not enough focus on unused parts of the design space (reinventing/innovating). There should be a balance, but D has not quite found the right mix yet that would set it apart from other languages.
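(To make the operator point concrete: shaping a type today means templating over the fixed token set via opBinary; you cannot introduce new operator tokens or precedence levels. Vec2 is an invented toy:)

```
import std.math : isClose;

struct Vec2
{
    double x, y;

    // one templated hook covers the tokens you opt into
    Vec2 opBinary(string op)(Vec2 rhs) const
        if (op == "+" || op == "-")
    {
        return mixin("Vec2(x " ~ op ~ " rhs.x, y " ~ op ~ " rhs.y)");
    }

    // a dot product has to hijack an existing token such as "*"
    double opBinary(string op : "*")(Vec2 rhs) const
    {
        return x * rhs.x + y * rhs.y;
    }
}

unittest
{
    auto a = Vec2(1, 2), b = Vec2(3, 4);
    assert(a + b == Vec2(4, 6));
    assert(isClose(a * b, 11.0));
}
```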
 And if the Go way doesn't work for you, that's perfectly fine, 
 there are other languages that would be a better fit for your 
 problem. Same for Rust.
Not really. I am only interested in managed autoscaling, so App Engine Standard is my preferred option for now. I have to pick between Python, Java (JVM), Node.js, PHP, Ruby and Go. If the JVM weren't heavier than the other options, I'd say it is the most attractive one, in addition to Python. But since performance sometimes matters, Go sometimes is the better option. And only because GOOGLE SAYS SO! (they optimize the service for Go?) This isn't unique. In mobile space you have Swift, Kotlin, Dart, and that is it. On the browser you have TypeScript and WASM. End of story.

D should not look for areas where there is volume. Too crowded. It should look for areas where metaprogramming could be transformative! The only areas where it can be transformative are those where building your own types has a huge impact: scientific computing, perhaps signal processing, 3D graphics etc…

But to do that it needs to improve on metaprogramming (and memory management). Not because other languages have it, but because metaprogramming is so weak in all languages that very few frameworks make good use of it, irrespective of language. It is therefore an underutilized language feature (in all languages), but the core D language needs changes so it can blossom. It is great that the standard lib is going to be streamlined, but it won't move the needle, as the root cause is in the language.
 This is a hard lesson that D desperately doesn't want to learn. 
 By trying to appease everyone it ultimately doesn't win for 
 anyone in particular.
Yes, D primarily appeals to those that consider and reject C++ (for various reasons).
 std::move). I personally dislike C++ with a vengeance, but if 
 you develop in C++, you will never be on your own. Not so with 
 D.
I don't know. Many C++ apps roll their own, all the way. Outside of games C++ applications tend to be very focused and there is limited room for reuse (e.g. a compiler can reuse the LLVM backend, and that is about it).
 Haskell is a different beast altogether. It's a niche language 
 that knows exactly what it wants to be and especially what it 
 doesn't want.
Yes, it was designed to be an academic language. When I say Haskell in this context I mean people who use it in production, e.g. for web solutions. You have to be dedicated to use it. People who do use it say they are less miserable as programmers. I believe them, but they need to put in extra work initially, for sure. And that is not all that different from what professional D users do and say.
 In several ways. Rust has proper sum types and a "match" 
 statement that guarantees that all cases are covered (plus it 
 has functional semantics). That alone makes it more general and 
 safer than C++. It also has some syntactic sugar (like the ? 
 operator) and the pseudo-monadic methods and_then etc. that 
 together make it basically as easy to use as exceptions.
OK, so this is the issue I have with Rust. All the points where Rust claims to have a better solution than C++ are areas where I never experience problems with C++! As far as I am concerned, Rust is solving the wrong problems. Maybe Rust is better for bigger teams or bigger projects, but I fail to see how it could improve my programs.

If D were streamlined, cleaned up, with better memory management and improved metaprogramming (without the kneecapping), it could provide a better solution than C++ for writing full applications (like games, audio editors and the like). Unfortunately the language designers have no interest in these application areas, so D is moving in the right direction, but very slowly. Too slow.
 Using exceptions in multithreaded code is... no comment.
I don't do fine-grained multithreading, so RAII and exceptions covers me well in C++. Rust provides a solution for a problem I am unlikely to have…
 It seems that those who use Go, Google to begin with, beg to 
 disagree that it's just for tiny applications. As for whether 
 it scales well, its long term evolution and maintenance... how 
 do you think it compares to D as far as hard data and proof of 
 the pudding are concerned?
D isn't stable. Go, Java and C++ are stable. That is a prerequisite for writing anything larger that is meant to last. But there is no point in going stable until you have gotten the upper hand on semantics and syntax (or have reached critical mass).
 still alter classes at runtime, reflect etc. It made that 
 design choice and tries to offer developers all the benefits 
 that go with it. And where it's not suitable, then you simply 
 don't use Python.
Yes, I use Python whenever I can because it is cheaper (less time). But I would have preferred a stronger static type system. I rarely use the dynamic features.
 It needn't be improbable. Kotlin was born as a third party 
 project, like D but smaller. Google saw its benefit for Android 
 development and embraced it wholeheartedly. It could have been 
 D instead (and IMO it would make a nicer Android language than 
 Kotlin). Making D target the JVM shouldn't have been a big 
 problem. What was and still missing is consistency, well
This is apples and oranges. I am pretty sure that Google didn't pick "Kotlin" in isolation; they picked JetBrains' Kotlin + JetBrains' IDE. Clearly a strategic partnership that benefits all. D could not have been Google's choice for the JVM. That is improbable.

Quite a few years back MoSync did quite well in providing a cross-platform C++ solution (also for the JVM), but it failed in the market. Google could have picked them up if this is what they wanted. There is no point in dreaming about what is improbable. D could not have been on Google's table for Android. To get there D would have had to be a completely different language.
Nov 08 2021
prev sibling next sibling parent reply "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Mon, Nov 08, 2021 at 01:36:59AM +0000, arco via Digitalmars-d wrote:
[...]
 The other problem is that D seems deeply allergic to making hard
 decisions.  From reading this forum it seems to me that the overall
 attitude within D is that it should offer choices, not enforce
 decisions. But in the real world, that's often a bad thing. Another
 language with the same mantra was Perl.  Look at what happened to Perl
 the moment we got Python, Ruby etc. Enforcing decisions is a Good
 Thing: it ensures that people understand each other's code, that
 analysis tools can do a much better job, that people won't roll out
 their own error management that doesn't work together with someone
 else's, that there will be consistency in memory management, etc etc
 etc.  Once again, the two languages somewhat comparable to D that
 "made it", Go and Rust, are all about enforcing what they deemed to be
 their way to do things. That doesn't mean it's the only legitimate
 way, but it means that if you want to do X, this is how you do it in
 Go (not that there isn't a one true way, but you are free to try to
 devise 15 different ways). And if the Go way doesn't work for you,
 that's perfectly fine, there are other languages that would be a
 better fit for your problem. Same for Rust.
[...]

Interesting, one of the reasons I *like* D is because it lets me do the deciding, instead of dictating how I should do things. IMO that's what makes it adaptable to all kinds of different tasks. For one task I might need heavy use of the OO paradigm, for another task I might need a more functional approach, and for yet another task I might need to go low-level and C-like, with manual loop unrolling and hand-tweaking of generated code. D lets me do all of that without any encumbrance (I don't have to e.g. pay lip service to OO just to get the thing to compile, like I have to in, say, Java), and best of all, it lets me do all of that *in the same program*, because it's all D. I'm honestly surprised anyone would want it any other way! :-D (Being forced to do things one particular way is what drove me *away* from languages like Java.)

T

--
Кто везде - тот нигде.
Nov 08 2021
parent reply jfondren <julian.fondren gmail.com> writes:
On Monday, 8 November 2021 at 15:16:07 UTC, H. S. Teoh wrote:
 I'm honestly surprised anyone would want it any other way! :-D  
 (Being forced to do things one particular way is what drove me 
 *away* from languages like Java.)
I doubt that people do want it any other way; strictness is seen rather as an easy to understand catalyst for what they actually want:

- for the language to evolve in a predictable direction (and definitely not to add features they don't want, or waste time on features they don't care about)
- for the language's future to be more certain
- for the language to get more popular
- for robust follow-through on features that are added
- for there to be an easy argument to get language devs to move quickly to fix a problem (this go bug makes compilation super slow; this rust bug breaks memory safety; this d bug breaks ???).

There's a lot of arguing for means instead of ends like this when language popularity comes up.
Nov 08 2021
next sibling parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Monday, 8 November 2021 at 15:57:30 UTC, jfondren wrote:
 On Monday, 8 November 2021 at 15:16:07 UTC, H. S. Teoh wrote:
 I'm honestly surprised anyone would want it any other way! :-D 
  (Being forced to do things one particular way is what drove 
 me *away* from languages like Java.)
I doubt that people do want it any other way; strictness is seen rather as an easy to understand catalyst for what they actually want:
You need consistency in the language in order to enable sensible metaprogramming, so there is a need for streamlining. For instance, having three ways to express references (class, "*", "ref") is not necessary. Ideally you would define a minimal core and express all other concepts through metaprogramming, but add syntactical sugar where necessary. (Associative arrays could have been a standard library type with syntactical sugar.)
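(A minimal illustration of the three spellings, with toy names:)

```
class Node { int value; }            // class objects are always references

void byPointer(int* p) { *p += 1; }  // C-style pointer
void byRef(ref int v) { v += 1; }    // ref parameter, no pointer syntax

void main()
{
    auto n = new Node;  // reference semantics, no '*' anywhere
    n.value = 1;

    int x = 0;
    byPointer(&x);
    byRef(x);
    assert(x == 2);
}
```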
Nov 08 2021
parent reply 12345swordy <alexanderheistermann gmail.com> writes:
On Monday, 8 November 2021 at 16:13:42 UTC, Ola Fosheim Grøstad 
wrote:
 On Monday, 8 November 2021 at 15:57:30 UTC, jfondren wrote:
 On Monday, 8 November 2021 at 15:16:07 UTC, H. S. Teoh wrote:
 I'm honestly surprised anyone would want it any other way! 
 :-D  (Being forced to do things one particular way is what 
 drove me *away* from languages like Java.)
I doubt that people do want it any other way; strictness is seen rather as an easy to understand catalyst for what they actually want:
You need consistency in the language in order to enable sensible meta programming. So there is a need for streamlining. For instance, having three ways to express references is not necessary. (class, "*", "ref"). Ideally you would define a minimal core and express all other concepts through meta-programming, but add syntactical sugar where necessary. (Associative arrays could have been a standard library type with syntactical sugar.)
A class as a reference type makes sense, as you are dealing with polymorphism. The only thing I see as unnecessary is the "*", which from my understanding is an intentional design decision by Walter. According to him, it is there to make porting C code to D easier. Yet we have ImportC now, so I don't know what he thinks of it now.

-Alex
Nov 08 2021
parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Monday, 8 November 2021 at 17:40:08 UTC, 12345swordy wrote:
 class as a reference type makes sense as you are dealing with 
 polymorphism.
Maybe so, but you could have the same typing scheme. Just make it a typing error to instantiate it as a non-reference type. You could for instance collapse struct/class, yet allow the programmer to specify that this particular struct/class can only be instantiated as a heap object. And you could similarly put a ban on virtual members. So rather than having the special cases struct and class, you allow the addition of constraints to a singular aggregate concept.
Nov 08 2021
next sibling parent reply 12345swordy <alexanderheistermann gmail.com> writes:
On Monday, 8 November 2021 at 18:02:27 UTC, Ola Fosheim Grøstad 
wrote:
 On Monday, 8 November 2021 at 17:40:08 UTC, 12345swordy wrote:
 class as a reference type makes sense as you are dealing with 
 polymorphism.
Maybe so, but you could have the same typing scheme. Just make it a typing error to instantiate it as a non-reference type. So you could for instance collapse struct/class, yet allow the programmer to specify that this particular struct/class can only be instantiated as heap-object. And you could similarly put a ban on virtual members. So rather than having special cases struct and class, you allow the addition of constraints to a singular aggregate concept.
That would be getting close to the infamous "curse of lisp" here.

-Alex
Nov 08 2021
parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Monday, 8 November 2021 at 19:56:11 UTC, 12345swordy wrote:
 That would be reaching near the infamous "curse of lisp" here.
Why is that? As I said you can add syntactical sugar.
Nov 08 2021
parent reply 12345swordy <alexanderheistermann gmail.com> writes:
On Monday, 8 November 2021 at 20:20:05 UTC, Ola Fosheim Grøstad 
wrote:
 On Monday, 8 November 2021 at 19:56:11 UTC, 12345swordy wrote:
 That would be reaching near the infamous "curse of lisp" here.
Why is that? As I said you can add syntactical sugar.
You may end up creating types that are considered bad design by other people, and thus create unneeded issues. What exactly do you have in your vision that the current language is preventing you from carrying out?

- Alex
Nov 08 2021
parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Monday, 8 November 2021 at 21:19:55 UTC, 12345swordy wrote:
 You may up creating types that are consider to be bad design by 
 other people, and thus create unneeded issues. What exactly do 
 you have in your vision that the current language is preventing 
 you from carrying out that vision?
Less bloat. Why would making the type system more uniform be considered bad design?
Nov 08 2021
parent reply 12345swordy <alexanderheistermann gmail.com> writes:
On Monday, 8 November 2021 at 21:21:43 UTC, Ola Fosheim Grøstad 
wrote:
 On Monday, 8 November 2021 at 21:19:55 UTC, 12345swordy wrote:
 You may up creating types that are consider to be bad design 
 by other people, and thus create unneeded issues. What exactly 
 do you have in your vision that the current language is 
 preventing you from carrying out that vision?
Less bloat. Why would making the type system more uniform be considered as bad design?
This isn't exactly helpful here. What exactly do you mean by "less bloat"? Lines of code? Memory? Why does the current type system need to be more uniform than it currently is? Are there limitations that negatively affect you? If so, how would your solution address this without introducing the "lisp curse"?

- Alex
Nov 08 2021
parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Monday, 8 November 2021 at 21:37:30 UTC, 12345swordy wrote:
 This isn't exactly helpful here. What exactly do you mean by 
 "Less bloat" here? Lines Of code? Memory?
Yes, fewer lines of code, more homogeneous.
 If so, how is your solution would address this without 
 introducing the "lisp curse"?
There is no "lisp curse" in this. As I said, you can have syntactical sugar for the common case if need be.
Nov 08 2021
parent reply 12345swordy <alexanderheistermann gmail.com> writes:
On Monday, 8 November 2021 at 21:55:10 UTC, Ola Fosheim Grøstad 
wrote:
 On Monday, 8 November 2021 at 21:37:30 UTC, 12345swordy wrote:
 This isn't exactly helpful here. What exactly do you mean by 
 "Less bloat" here? Lines Of code? Memory?
Yes, fewer lines of code, more homogeneous.
 If so, how is your solution would address this without 
 introducing the "lisp curse"?
There is no "lisp curse" in this. As I said, you can have syntactical sugar for the common case if need be.
How about you write down your proposal in a file first, then link said proposal, so that the rest of us can see what exactly you are proposing here. Right now you are making claims without substantiating them.

-Alex
Nov 08 2021
parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Monday, 8 November 2021 at 22:22:04 UTC, 12345swordy wrote:
 How about you write down your proposal in a file first then 
 link said proposal, so that the rest of us can see what exactly 
 are you proposing here. Right now you are making claims without 
 substantiating them.
Uhm. Why would I write a proposal? The proof is all over Phobos. A more uniform core language would help immensely with metaprogramming. That is just a fact.
Nov 08 2021
parent reply 12345swordy <alexanderheistermann gmail.com> writes:
On Monday, 8 November 2021 at 22:35:10 UTC, Ola Fosheim Grøstad 
wrote:
 On Monday, 8 November 2021 at 22:22:04 UTC, 12345swordy wrote:
 How about you write down your proposal in a file first then 
 link said proposal, so that the rest of us can see what 
 exactly are you proposing here. Right now you are making 
 claims without substantiating them.
Uhm. Why would I write a proposal?
That is because I literally have no idea what your solution consists of here! Right now, I am not convinced at all regarding your solution.

- Alex
Nov 08 2021
parent Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Monday, 8 November 2021 at 22:53:09 UTC, 12345swordy wrote:
 That is because I literally have no idea what exactly does your 
 solution consist here! Right now, I am not convinced at all 
 regarding your solution.
Alright, but my sketch of a solution would require rethinking the D type system, and that would have to follow a restructuring of compiler internals, so it would totally depend on the compiler authors' willingness to build a clean architecture first. Meaning it is something you could plan for, but not do overnight.

Anyway, the solution would be to make the language more uniform under the hood, so you would not necessarily notice it much as an application programmer; it would mostly be visible in metaprogramming. What you do is, internally in the compiler, collapse struct and class into one concept, with the ability to constrain it. That way a class and a referenced struct become similar in metaprogramming. For an application programmer it could be more or less the same as now, if desired. Syntax could be expanded internally into the new representation.
Nov 09 2021
prev sibling parent Daniel N <no public.email> writes:
On Monday, 8 November 2021 at 18:02:27 UTC, Ola Fosheim Grøstad 
wrote:
 On Monday, 8 November 2021 at 17:40:08 UTC, 12345swordy wrote:
 class as a reference type makes sense as you are dealing with 
 polymorphism.
Maybe so, but you could have the same typing scheme. Just make it a typing error to instantiate it as a non-reference type. So you could for instance collapse struct/class, yet allow the programmer to specify that this particular struct/class can only be instantiated as heap-object. And you could similarly put a ban on virtual members. So rather than having special cases struct and class, you allow the addition of constraints to a singular aggregate concept.
We are lucky we have class in D, as ref is near useless in D metaprogramming:

```
alias I1(alias T) = T;
alias I2(T...) = T;

I1!(ref int) i1; // RIP: `ref int` is not a type, so this does not compile
I2!(ref int) i2; // RIP: same here
```

I rest my case.
Nov 08 2021
prev sibling parent "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Mon, Nov 08, 2021 at 03:57:30PM +0000, jfondren via Digitalmars-d wrote:
 On Monday, 8 November 2021 at 15:16:07 UTC, H. S. Teoh wrote:
 I'm honestly surprised anyone would want it any other way! :-D
 (Being forced to do things one particular way is what drove me
 *away* from languages like Java.)
I doubt that people do want it any other way; strictness is seen rather as an easy to understand catalyst for what they actually want: - for the language to evolve in a predictable direction (and definitely not to add features they don't want, or waste time on features they don't care about)
I can understand this sentiment, but why would I care about features that I don't want/use? As long as I'm not forced to use them, I can just not use those features. I used to write quite a lot of C++, but I doubt I even used 50% of its features. That didn't stop me from writing lots of useful C++ code.
 - for the language's future to be more certain
I'm not sure how strictness equates with the future, the latter doesn't necessarily follow from the former. But sure, if people want something concrete to put a finger on...
 - for the language to get more popular
Again, not sure how this follows from strictness, but OK, sure.
 - for robust follow-through on features that are added
Now this is something I could stand behind. A lot of D features aren't bad ideas per se, but they only cover a small subset of use cases, and their interactions with other language features are anybody's guess (usually this means an ICE). Things like `inout` or `shared` fall into this category.
 - for there to be an easy argument to get language devs to move
   quickly to fix a problem (this go bug makes compilation super slow;
   this rust bug breaks memory safety; this d bug breaks ???).
"This D bug breaks existing code" - that seems to be the biggest bugbear / motivator these days. :-D
 There's a lot of arguing for means instead of ends like this when
 language popularity comes up.
I usually don't bother participating in threads about popularity, because I don't believe in the philosophy that more popular == better, or that popularity should be a goal at all. But I responded this time because it sounded really strange that people would actually prefer less choice instead of more in a programming language. :-D

T

--
For every argument for something, there is always an equal and opposite argument against it. Debates don't give answers, only wounded or inflated egos.
Nov 08 2021
prev sibling parent Imperatorn <johan_forsberg_86 hotmail.com> writes:
On Monday, 8 November 2021 at 01:36:59 UTC, arco wrote:
 On Sunday, 7 November 2021 at 09:18:37 UTC, Ola Fosheim Grøstad 
 wrote:

 [...]
It is a very common response when people are enthusiastic about a particular technology. People for whom using D is a goal in itself are using it, but the others are not going to move to it for the sake of using D. They would do it if D solves some problems they are currently facing, or if it allows them to achieve their own objectives (which are probably unrelated to D) better than some other alternative. [...]
Erlang rox
Nov 08 2021
prev sibling parent reply harakim <harakim gmail.com> writes:
On Sunday, 7 November 2021 at 06:36:48 UTC, arco wrote:
 Many people have forgotten it now but at one point, Rust had 
 classes with inheritance as well as a GC. It was concluded that 
 those features didn't belong in a systems language and were 
 removed. This is probably Rust's greatest strength over D: its 
 consistency, the follow through to ensure that features that 
 are declared stable are really rock solid and make sense within 
 the language's general philosophy.
I agree.
 Go still doesn't even have generics in stable and it hasn't 
 slowed down its adoption.
I have noticed Go adoption has slowed down a lot. I wanted to love Go... but it was too slow to get anything done beyond a hello-world type program. Since you said this, I decided to double check, and the TIOBE index seems to agree, as does Google Trends. I think Go's decline in popularity is an example of problems D doesn't have: its mild success includes all the things D lacks (standard tools, corporate backing, tutorials, etc.). Maybe not libraries.
 Regarding Python, its biggest advantage (other than its 
 outstanding ecosystem) is that it's a dynamic language. 
 Prototyping, testing, customising etc will always be easier and 
 more flexible in Python than in any compiled language. Plus its 
 conscious focus on a simple syntax makes it in my opinion the 
 modern day heir to the BASIC of the 80s in terms of universal 
 applicability and a kind of immediacy.
I agree that that is the appeal of Python. It's dynamic but also has libraries. What other language is like that? JavaScript is kind of like that, but it's such a horrible language in comparison (I think).
 D's general approach has been to always try to have its cake 
 and eat it too. It wants a GC but at the same time wants to 
 pretend that it doesn't need a GC, it wants to support OOP 
 (with inheritance etc) and at the same time tries to downplay 
 it, etc. Ultimately in each of these cases it fails to provide 
 the advantages of either of the valid choices while retaining 
 the disadvantages of both.
I have experienced this phenomenon a lot in my career, but I think D is pretty good at having more than one way of doing things. I like not having to have classes, but I like having them available. The way I feel about frameworks is how I feel about languages like Python: you have to do it their way even though you know a better way. With D, I feel like the language is a library: I can use its features if I want and not if I don't. This is one of its strongest advantages. If I want to do string handling in a pointer-based way, I can! If I want to use normal methods, the language supports that. Arrays are a great example: you get most of, if not all of, the power of a list in another language, but you get all the power of arrays. I'd like to see an example that shows you get the worst of both! I have relied on that heavily this weekend.
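To make that concrete, here is a minimal sketch of the "more than one way" point; nothing here comes from a particular library, it's just illustrative D:

```
import std.stdio;

void main()
{
    string s = "hello world";

    // Pointer-based, C-style scan over the same string data.
    size_t spaces;
    for (auto p = s.ptr; p < s.ptr + s.length; ++p)
        if (*p == ' ') ++spaces;
    writeln(spaces); // 1

    // High-level, list-like use of the built-in array.
    int[] xs = [3, 1, 2];
    xs ~= 4;              // grows like a list
    writeln(xs[1 .. 3]);  // slices like an array: [1, 2]
}
```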
 This is a hard lesson that D desperately doesn't want to learn. 
 By trying to appease everyone it ultimately doesn't win for 
 anyone in particular.
Although I agree with pretty much everything you have written, I think that is one of D's strengths. That was reiterated this weekend. I got a call for a job and they said the next interview would be on codesignal.com. I did about 100 practice problems and I found myself reaching for D a lot. There were some problems where I knew C# would let me do it in under 5 minutes (most things Linq related or string related; I don't know what D's equivalents are).

I don't think D needs to stick to one way to do everything. I think Python is popular almost in spite of its one-way-only philosophy because it's easy to start and it has good libraries. (Although being easy to get started is probably helped by the one-way-only philosophy for reasons I get to below.) You get that constant success feedback where you write something and it works right away. Then you tweak it and get little wins along the way. D is similar except for some safety stuff that I have ideas about (immutable, const, etc.)

There are two effects I see where being opinionated is VERY helpful - maybe essential - to a language.

1. Documentation and tutorials. Tutorials need to be opinionated. People need to see ONE way to do things when they learn. Once they understand, they can take advantage of all the different ways, but they need to see a path to success.

2. Whatever you expect people to do with the language, it needs to be designed so there is at LEAST one way to do the above things that is elegant. For example, web servers usually allow some kind of annotation-based routing system. If you use Jetty and you do your own routing, that is fairly confusing and time-consuming. That is something where the annotation-based method is a simple solution and allows you to not have to discuss much about routing and that part of the framework with a new user. It allows you to streamline. However, for Jetty (and technically others), you don't HAVE to do it that way. It's just the default way. Someone once said something like: allow endless configuration but provide sensible defaults. That's what the D language is good at.

I would like to read your response to this.
Nov 07 2021
parent arco <qva6y4sqi relay.firefox.com> writes:
On Monday, 8 November 2021 at 05:20:43 UTC, harakim wrote:


 I have experienced this phenomenon a lot in my career, but I 
 think D is pretty good at having more than one way of doing 
 things. I like not having to have classes, but I like having 
 them available. The way I feel about frameworks is how I feel 
 about languages like python. You have to do it their way even 
 though you know a better way. With D, I feel like the language 
 is a library. I can use its features if I want and not if I 
 don't. This is one of its strongest advantages. If I want to do 
 string handling 
 in a pointer-based way, I can! If I want to use normal methods, 
 the language supports that. Arrays are a great example. You get 
 most of, if not all of, the power of a list in another 
 language, but you get all the power of arrays. I'd like to see 
 an example that shows you get the worst of both! I have relied 
 on that heavily this weekend.
This is great if you are an enthusiast who implements stuff this weekend. In large scale projects you unavoidably end up in the situation where you are convinced you know a better solution, and so does your colleague, and another colleague, etc. Of course, each of you has a different solution and you all "know" that yours is the best. The bicycle shed fallacy is always lurking at every corner with this type of approach.

The examples to support my point are plenty. The GC to begin with: D relies on a GC, so it's not really usable as a systems language. At the same time, since it doesn't want to fully commit to the GC and wear it with pride, it has a mediocre one and is not a good choice for projects where I feel that a GC-based language is the way to go. Developers are permanently wondering what kind of memory management they should use, because they can't see a clear picture or consensus, and whichever decision they make, it will unavoidably clash with other parts of the ecosystem or even the same codebase. Similarly, you may want to handle your strings as pointers like in C, but Joe's output library expects std.string. What now? As a result, the developer spends time dealing with trivial issues like that instead of focusing on solving the actual problem.

It just keeps coming back to the same issue: the point of software development is not to have fun with the language or be "creative" with it, it's to provide a solution to a problem that other people will be able to use with as little friction as possible.
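To make the string friction concrete, a small hedged sketch: emit() is a made-up stand-in for "Joe's output library", while fromStringz/toStringz are the real Phobos conversion helpers:

```
import std.string : fromStringz, toStringz;

// Hypothetical third-party API that expects a D string.
void emit(string line) { /* ... */ }

void main()
{
    // You chose C-style zero-terminated pointers...
    const(char)* raw = "status: ok";

    // ...so every call across the boundary needs glue: fromStringz
    // only slices, and .idup allocates to satisfy `string`.
    emit(fromStringz(raw).idup);

    // Going the other way needs toStringz, which may also allocate.
    const(char)* back = toStringz("reply");
}
```

Every boundary crossing costs a conversion, and possibly an allocation; that is the friction being described.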
 Although I agree with pretty much everything you have written, 
 I think that is one of D's strengths. That was reiterated this 
 weekend. I got a call for a job and they said the next 
 interview would be on codesignal.com. I did about 100 practice 
 problems and I found myself reaching for D a lot. There were 

 to do it in under 5 minutes (most things Linq related or string 
 related. I don't know what D's equivalents are.)
You know D, you like D, so it's natural that you would often reach for D to solve a problem. But the point of this thread is not that people who appreciate D can use it to do things; it's why other people are not showing more interest in D.
 I don't think D needs to stick to one way to do everything. I 
 think Python is popular almost in spite of its one-way-only 
 philosophy because it's easy to start and it has good 
 libraries. (Although being easy to get started is probably 
 helped by the one-way-only philosophy for reasons I get to 
 below.) You get that constant success feedback where you write 
 something and it works right away. Then you tweak it and get 
 little wins along the way. D is similar except for some safety 
 stuff that I have ideas about (immutable, const, etc.)
I think the fact that Python lives by its "one true way only" philosophy is precisely the reason it is so easy to get started with and has such good libraries. It's easy because tutorials make sense; if you are beginning and face a problem, other people can help you easily; and libraries provide APIs with little or no surprise and, for the overwhelming part, they work well together. Perl, in contrast, said "there shall be more than one way to do it". Its ecosystem has always been a huge mess.

Now of course D doesn't need to stick to one way for absolutely everything, otherwise it would be a DSL, which is obviously not its purpose. But it should at least have a clear and cogent memory management story (and not "you can do it absolutely every way you want and so can everyone else"). It should have a clear position on strings. Autodecoding was a bad idea and maybe it's now stuck with it, or maybe it's possible to introduce a new string type that doesn't autodecode and keep std.string for legacy code, but there should be The D Way of handling strings (and not "ok, and you can also use pointers like in C, and by the way you can have an array of dchars" etc.). It should decide once and for all what its pointer aliasing and coercion rules are, and not introduce a new function attribute every week on one hand and rely on hacks like "alias this" on the other.

And then there is the tooling. D has dub, but it's neither fully mature nor universally used, which means that adding a 3rd party dependency to a project is much more of a chore than it needs to be (and should be).
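Since autodecoding keeps coming up, here is a small sketch of the behavior in question, using the real Phobos facilities walkLength and byCodeUnit:

```
import std.range : walkLength;
import std.utf : byCodeUnit;

void main()
{
    string s = "héllo";
    assert(s.length == 6);                // UTF-8 code units
    assert(s.walkLength == 5);            // ranges auto-decode to dchars
    assert(s.byCodeUnit.walkLength == 6); // explicit per-call opt-out
}
```

The same string reports two different lengths depending on whether you treat it as an array or a range, which is exactly the kind of surprise a single clear string story would avoid.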
 There are two effects I see where being opinionated is VERY 
 helpful - maybe essential - to a language.
 1. Documentation and tutorials
 Tutorials need to be opinionated. People need to see ONE way to 
 do things when they learn. Once they understand, they can take 
 advantage of all the different ways, but they need to see a 
 path to success.
 2. Whatever you expect people to do with the language, it needs 
 to be designed so there is at LEAST one way to do the above 
 things that is elegant. For example, web servers usually allow 
 some kind of annotation-based routing system. If you use Jetty 
 and you do your own routing, that is fairly confusing and 
 time-consuming. That is something where the annotation-based 
 method is a simple solution and allows you to not have to 
 discuss much about routing and that part of the framework to a 
 new user. It allows you to streamline. However, for Jetty (and 
 technically others), you don't HAVE to do it that way. It's 
 just the default way. Someone once said something like allow 
 endless configuration but provide sensible defaults. That's 
 what the D language is good at.

 I would like to read your response to this.
I've never personally used Jetty and admittedly web application development is not one of my areas of interest or expertise so I can't comment on that. On a more general level though, the notion of what is "elegant" is subjective. There should be at least one way that is idiomatic, well supported and proven to work for the proverbial 99% of the common cases. And if there is more than one way, then they should interoperate as easily as possible.
Nov 08 2021
prev sibling next sibling parent reply Andrey Zherikov <andrey.zherikov gmail.com> writes:
On Tuesday, 2 November 2021 at 17:27:25 UTC, Dr Machine Code 
wrote:
 It got [asked on 
 reddit](https://www.reddit.com/r/d_language/comments/q74bzr/
hy_is_d_unpopular/) sub but for those that aren't active too, I'd like you
opinions. Please don't get me wrong, I also love D, I've used it everywhere I
can and I'd say it's my favourite language (yes I have one...) but I'm as as
the reddit's OP, trying to understand why it's unpopular. Rust and Go seeming
to be getting more and more users. I think it's due to large ecosystem and the
big corporations with deep pockets that pushes them. But I'd like to know you
all opinions
I've been playing with D from time to time for maybe 10 years, and here are my observations.

Positive:
- Maintainers like contributions.
- My PRs were reviewed in a timely manner.
- Great thanks to Ali for his book - it helped me a lot.

Negative:
- If I don't know how to fix a bug, it will likely never be fixed.
- Not enough tutorials on how to do things.
- Lack of "best practices" documentation.
- Quite a steep learning curve for contributing (things might have changed, but I remember that compiling DMD on Windows was not so simple).
- Unclear roadmap for D/DMD/Phobos.
- Unclear process for extending the standard library - there are a [bunch of packages marked as "candidate for inclusion into std lib"](https://code.dlang.org/?sort=updated&category=library.std_aspirant&skip=0&limit=100) and some of them have not been updated for 5+ years.

My personal verdict: I like D and I'll try to bring it into my company (if I don't leave it soon), but only to allow people to play with it.

What I'd like to see in the future:
- Fixing bugs in the compiler should be the highest priority, so we can say that stable code is a priority.
- A clear roadmap of what to expect in D/DMD/Phobos.
- A clear process for how new features are delivered to the public.
- A "preview" mode for new features (e.g. `dmd --preview=feature-foo`).
- A clear process for how contributors can add new modules to Phobos. For example: someone nominates a library for inclusion; the application is reviewed by stakeholders and might be conditionally approved (they might also reject it); the condition is that the library source code must meet "Phobos code quality" requirements (checks should be automatic), and as soon as this requirement is met, the library is added to std.
- A tool that upgrades code, so I don't need to do the same simple thing in a thousand places (this is the case for long-living production code).
- Better support in IDEs - I prefer using IDEA, not Visual Studio.

The other thought that I hope might be useful: there should be a list of "preferred" (not sure that this is the correct word) packages. It should be an opt-in list that (a) provides a benefit for the package of being listed as preferred, (b) is a requirement for a package to be promoted into Phobos, and (c) guarantees the package is tested with the last 5 (or some other reasonable number of) compiler versions and keeps building. On the other side, adding a package to that list requires some commitment from the author(s): if an upcoming compiler version breaks the build, the author is responsible for fixing it, otherwise the package will be excluded. Having such a list would give the public an understanding of what is expected to work and which packages are maintained. In addition, DMD/Phobos maintainers could test their changes against the packages in the list to ensure they are not breaking the most maintained packages.
Nov 03 2021
parent Andrey Zherikov <andrey.zherikov gmail.com> writes:
On Thursday, 4 November 2021 at 02:40:54 UTC, Andrey Zherikov 
wrote:
 ...
Also, I think all bugs should be in GitHub, not Bugzilla. There should be a requirement on what they must include (like the compiler version and minimal code that reproduces the bug). Having this, we could have a bot that reproduces the bug automatically (i.e. builds the provided example). This bot could even validate that the bug still persists with a new compiler and automatically close the ones that are no longer reproducible (for example, because the bug was fixed by another change).
Nov 03 2021
prev sibling next sibling parent reply Antonio <antonio abrevia.net> writes:
On Tuesday, 2 November 2021 at 17:27:25 UTC, Dr Machine Code 
wrote:
 It got [asked on 
 reddit](https://www.reddit.com/r/d_language/comments/q74bzr/
hy_is_d_unpopular/) sub but for those that aren't active too, I'd like you
opinions. Please don't get me wrong, I also love D, I've used it everywhere I
can and I'd say it's my favourite language (yes I have one...) but I'm as as
the reddit's OP, trying to understand why it's unpopular. Rust and Go seeming
to be getting more and more users. I think it's due to large ecosystem and the
big corporations with deep pockets that pushes them. But I'd like to know you
all opinions
My answer: **Because it can't!**

Some of D's greatest features are rejected by common developers (mainly the mass/productivity profiles).

My favourite one is templates & mixins & compile time code. People accept working with *generics*: a method signature is a contract, and generic types can be part of the signature. It is easy to understand this:

```
IIterable<R> map<T,R>(IIterable<T>, (T)=>R fmapper)
```

The language itself offers a native way to describe unambiguously what *map* accepts and returns... you can then read the comments to see what "map" means. Even if the language offers optional typing, in the end you can deduce what each type is without actually compiling the code (the intellisense system itself can deduce it from the source code).

**And D?** You see the D map signature: it is template syntax, not type-based signature syntax:

```
template map(fun...)
```

Then you enter the code for details:

```
auto map(Range)(Range r) if (isInputRange!(Unqual!Range))
```

What type does auto refer to? The "if" signature part tells us about verification methods, not type-dependent contracts. Ok, let's go deeper into the code:

```
return MapResult!(_fun, Range)(r);
```

Because Range is a "generic" type and you previously know it verifies isInputRange... it refers to an InputRange. Good: auto is an input range :-) But what exactly is an InputRange... is it an interface? No: you must enter the isInputRange implementation to see more details. Good... but then... what is fun? Of course, you have the documentation, the large descriptive comments that explain what the signature can't, and a complete chapter about ranges to understand how they work.

**My point:**

D introduces a different philosophy: you must accept **conventions**, don't expect to find interfaces... That is not what developers are used to seeing. It is a paradigm in itself: your project will define more and more conventions (without the help of language native contracts) and the compiler will check them.

Languages like Scala 3 or Typescript introduce new strong type flexibility with "union types" (i.e.: sayHello( who: "peter" | "andrew" ): string | string[] ) or exploit pattern matching... but without losing the type based contracts.

Maybe D is unpopular because it can't be popular.
Nov 05 2021
parent Paul Backus <snarwin gmail.com> writes:
On Friday, 5 November 2021 at 15:54:53 UTC, Antonio wrote:
 **My point:**

 D introduces a different philosophy:  you must accept 
 **conventions**, don't expect to find interfaces... That is 
 not what developers are used to seeing. It is a paradigm in 
 itself: your project will define more and more conventions 
 (without the help of language native contracts) and the 
 compiler will check them.

 Languages like Scala 3 or Typescript introduce new strong type 
 flexibility with "union types" (i.e.: sayHello( who: "peter" | 
 "andrew" ): string | string[] ) or exploit pattern matching... 
 but without losing the type based contracts.

 Maybe D is unpopular because it can't be popular.
This kind of "convention-based" programming is actually very common in dynamic languages like Python and Ruby, where it goes by the name of ["duck typing"][1]. [1]: https://en.wikipedia.org/wiki/Duck_typing
Nov 05 2021
prev sibling next sibling parent Kapps <opantm2+spam gmail.com> writes:
On Tuesday, 2 November 2021 at 17:27:25 UTC, Dr Machine Code 
wrote:
 It got [asked on 
 reddit](https://www.reddit.com/r/d_language/comments/q74bzr/
hy_is_d_unpopular/) sub but for those that aren't active too, I'd like you
opinions. Please don't get me wrong, I also love D, I've used it everywhere I
can and I'd say it's my favourite language (yes I have one...) but I'm as as
the reddit's OP, trying to understand why it's unpopular. Rust and Go seeming
to be getting more and more users. I think it's due to large ecosystem and the
big corporations with deep pockets that pushes them. But I'd like to know you
all opinions
I used to use D a lot for hobby projects. But at some point, much of my code broke in ways that took me many hours to fix and still didn't get decent results. I forget the exact details, but I think one of the changes that caused me significant issues was something related to no longer being able to access private members in traits. And then, eventually working around that, const just kept getting in the way for code that was previously okay. During all this, there were changes that sounded like improvements that weren't made, in the spirit of not breaking code, all the while my code was breaking in very frustrating ways. There was also an aspect where the type of code I was writing (utilizing traits to generate runtime reflection recursively for types) would very frequently trigger new compiler/library bugs. I really enjoy the open source aspect of D and that I was able to make PRs to fix some of these bugs, but sometimes it was painful to figure out what was going on.

I'm going to be using D for one of the services in a project I'm working on soon, as it's a good fit. I need a language that's easy to use, yet where I can have full memory control and native performance. Performance is the number one concern, and I won't need to use other cloud services or have many dependencies. D is a fantastic fit there, but I likely won't use it for much more.

In terms of language issues, personally the main thing that causes me frustration is const: it just makes things annoying when using templates (such as containers) or operators. Sometimes it feels like basics are missing from the standard library because they can be implemented using other constructs, but the way to do that isn't necessarily obvious.

The real problem though is not related to the language at all, it's just tooling and library support. I personally don't want to write a bunch of infrastructure around AWS to be able to deploy a Lambda that triggers from S3 events and reads/writes files while calling other AWS services. In addition, I've been very spoiled; it's really hard to go to a language that doesn't have virtually flawless completion (UFCS with templates and mixins inherently makes adding these IDE features a nightmare, I would think).
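For anyone who hasn't written that kind of code, here is a hedged miniature of the trait-based reflection involved; Config is a made-up type:

```
import std.stdio;

struct Config { int port; string host; }

// Iterate a type's members at compile time: the building block of
// the recursive runtime-reflection generation described above.
void printMembers(T)()
{
    static foreach (name; __traits(allMembers, T))
        writeln(name, " : ",
                typeof(__traits(getMember, T, name)).stringof);
}

void main() { printMembers!Config(); }
```

Code like this leans on compiler behavior at every step, which is why it tends to surface compiler and library bugs so quickly.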
Nov 05 2021
prev sibling next sibling parent reply valmat <ufabiz gmail.com> writes:
On Tuesday, 2 November 2021 at 17:27:25 UTC, Dr Machine Code 
wrote:
 It got [asked on 
 reddit](https://www.reddit.com/r/d_language/comments/q74bzr/
hy_is_d_unpopular/) sub but for those that aren't active too, I'd like you
opinions. Please don't get me wrong, I also love D, I've used it everywhere I
can and I'd say it's my favourite language (yes I have one...) but I'm as as
the reddit's OP, trying to understand why it's unpopular. Rust and Go seeming
to be getting more and more users. I think it's due to large ecosystem and the
big corporations with deep pockets that pushes them. But I'd like to know you
all opinions
My point: it is because D is swimming against the current. D tries to be a systems language, and in that capacity few people need it. Instead of moving against the flow, we could join the flow and overtake it. What is D really good at? D allows you to do all the same things that can be done in other languages, and it lets you write well-organized, well-readable code very quickly, and this code executes very fast. We need to stop fighting with C++ and attract those who write in languages like Python. We can give endless possibilities while preserving what such languages are loved for -- the speed of development.
Nov 06 2021
parent reply zjh <fqbqrr 163.com> writes:
On Saturday, 6 November 2021 at 07:29:25 UTC, valmat wrote:
 On Tuesday, 2 November 2021 at 17:27:25 UTC, Dr Machine Code 
 wrote:
In this case, why would I use `d`? Don't other languages smell good? Only by `daring` to go against the current can we succeed. `D` needs to solve the unfriendly tooling question. I've heard many people say that they don't dare to use it without good tools.
Nov 06 2021
next sibling parent reply Imperatorn <johan_forsberg_86 hotmail.com> writes:
On Saturday, 6 November 2021 at 08:19:29 UTC, zjh wrote:
 On Saturday, 6 November 2021 at 07:29:25 UTC, valmat wrote:
 On Tuesday, 2 November 2021 at 17:27:25 UTC, Dr Machine Code 
 wrote:
In this case, why would I use `d`? Don't other languages smell good? Only by `daring` to go against the current can we succeed. `D` needs to solve the unfriendly tooling question. I've heard many people say that they don't dare to use it without good tools.
I've been thinking about this for about 9000 years now and I think tooling is the answer. Why? Because developers are used to it now. And are lazy. My first encounter with D felt like coding into the void: where was my static analysis, intellisense and refactoring tools? I don't know what made me continue my journey, but I'm glad I did. With that said though, for D to succeed we need better tooling around the language.
Nov 06 2021
parent reply zjh <fqbqrr 163.com> writes:
On Saturday, 6 November 2021 at 08:52:42 UTC, Imperatorn wrote:
 On Saturday, 6 November 2021 at 08:19:29 UTC, zjh wrote:
The `D` community should find several people to provide excellent `VIM/vscode` plug-in tools. In fact, `many` people (really) `like` D; they just lack good tools. Maybe we can investigate the particular tools (`IDEs/editors`) used by programmers in other communities (`c++/rust`), and then work on that.
Nov 06 2021
next sibling parent reply zjh <fqbqrr 163.com> writes:
On Saturday, 6 November 2021 at 09:02:07 UTC, zjh wrote:

 `VIM/vscode` plug-in tools.
For example, people say that it is difficult to implement automatic completion for `ufcs/template mixin`, so we should find a way to solve it, then provide it through the `vim/vscode` plugins.
Nov 06 2021
parent reply zjh <fqbqrr 163.com> writes:
On Saturday, 6 November 2021 at 09:09:34 UTC, zjh wrote:
 plugin.
D needs to provide `options`, not mandates. `utf8`, auto decoding and `GC` are all like this: they are mandatory. Therefore, users get upset and leave. What `D` needs is `positioning`: what is `D`? For whom? Don't be greedy. As a `better C++`, do a good job of being a better C++; for `system programming`, don't force the `GC`. Since `metaprogramming` is powerful, D should continue to maintain that advantage. `D` can be positioned as serving `experts`. `Experts` are a `minority`: that is why `C++` has fewer users than `Python` - `ordinary users` far outnumber `professional programmers`. `D` just needs to position itself and the `experts` will come naturally.
Nov 07 2021
parent reply zjh <fqbqrr 163.com> writes:
On Sunday, 7 November 2021 at 07:12:20 UTC, zjh wrote:

 experts!experts!experts!
Serving ordinary users has no future, because `other languages` have already occupied that field. The `expert` user is the `commanding height` of a programming language, because only the `expert` writes the `library`. Only when there are more `experts` writing `libraries` can we form `positive feedback`.
Nov 07 2021
parent Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Sunday, 7 November 2021 at 07:28:38 UTC, zjh wrote:
 The `expert` user is the `commanding height` of the programming 
 language, because only the `expert` writes the `library`.
 Only when there are more `experts` writing the `library` can we 
 form `positive feedback`.
I agree. If meta-programming is meant to be D's strong feature then experts are invaluable, and you have to be willing to do what is necessary to get them interested (including breaking changes). ((However, experts also want to spend less time on boring tasks, so you also need some high level features.))
Nov 07 2021
prev sibling parent reply MGW <mgw yandex.ru> writes:
On Saturday, 6 November 2021 at 09:02:07 UTC, zjh wrote:
 In fact, `many` people (really) `like` D, they just lack good 
 tools.
For me, a good tool is Qt adapted to D. It allows me to make beautiful applications with minimal effort. The documentation is the Qt documentation. ![example](https://sun9-86.userapi.com/impg/6CeKlZzZjh2pT1IxCQGJBJPBS6ty1stJmzPHxQ/77-df6SYc8Q.jpg?size=1280x720&quality=96&sign=908fc19f826649e79dacbfba214543f7&type=album)
Nov 07 2021
parent reply Tobias Pankrath <tobias pankrath.net> writes:
On Sunday, 7 November 2021 at 11:45:42 UTC, MGW wrote:
 On Saturday, 6 November 2021 at 09:02:07 UTC, zjh wrote:
 In fact, `many` people (really) `like` D, they just lack good 
 tools.
For me, a good tool is Qt adapted to D. It allows me to make beautiful applications with minimal effort. The documentation is the Qt documentation. ![example](https://sun9-86.userapi.com/impg/6CeKlZzZjh2pT1IxCQGJBJPBS6ty1stJmzPHxQ/77-df6SYc8Q.jpg?size=1280x720&quality=96&sign=908fc19f826649e79dacbfba214543f7&type=album)
Does that work reliably now?
Nov 07 2021
parent reply MGW <mgw yandex.ru> writes:
 Does that work reliably now?
For the tasks I use it for, it's quite reliable. I haven't tried it in tasks that require long execution times or manipulating a huge number of Qt class instances. I see Qt as an ecosystem to hold on to, one which can dramatically speed up application creation. D's place is between C++ and Python.
Nov 07 2021
parent reply zjh <fqbqrr 163.com> writes:
On Sunday, 7 November 2021 at 12:09:42 UTC, MGW wrote:
 Does that work reliably now?
If `you` could write an introduction, that would be good. There is a great need for articles about `D`.
Nov 07 2021
next sibling parent pilger <abcd dcba.com> writes:
On Sunday, 7 November 2021 at 12:37:44 UTC, zjh wrote:
 If `You` could write an introduction,
 That will be good.
especially since the GitHub page and repository are kinda confusing...
Nov 07 2021
prev sibling parent reply zjh <fqbqrr 163.com> writes:
On Sunday, 7 November 2021 at 12:37:44 UTC, zjh wrote:

 There is a great need for articles for `D`.
Phobos's mistake is `force`. It forces you to use the `GC`, forces you to use `utf8`, and forces you to use `auto decoding`... `Coercion` is not a good thing. Users should be provided with options. The `GC` is what really hurt `d`. Why does `Rust` not use a `GC`? For a `system language`, `GC` is `garbage`.
Nov 08 2021
next sibling parent reply zjh <fqbqrr 163.com> writes:
On Monday, 8 November 2021 at 12:43:05 UTC, zjh wrote:

 Phobos's
But metaprogramming is definitely a bonus. Look at the new languages: which one has no generics?
Nov 08 2021
parent reply Antonio <antonio abrevia.net> writes:
On Monday, 8 November 2021 at 12:50:46 UTC, zjh wrote:
 On Monday, 8 November 2021 at 12:43:05 UTC, zjh wrote:

 Phobos's
But metaprogramming is definitely a bonus. Look at the newly languages. Which one has no generics?
Generics have good type inference support: they are a well-established part of method/class/interface signatures... metaprogramming with mixins makes type inference difficult for code intellisense.
Nov 08 2021
parent zjh <fqbqrr 163.com> writes:
On Monday, 8 November 2021 at 13:03:59 UTC, Antonio wrote:
 On Monday, 8 November 2021 at 12:50:46 UTC, zjh wrote:
 Look at the newly languages. Which one has no generics?
 metaprogramming with mixins makes type inference difficult for 
 code intellisense
This is what the `D team` should dive into. The problem has been raised and must be solved if we want to `compete` with other popular languages.
Nov 08 2021
prev sibling parent reply bachmeier <no spam.net> writes:
On Monday, 8 November 2021 at 12:43:05 UTC, zjh wrote:
 Phobos's idea is wrong in `force`. It forces you to use `GC`, 
 forces you to use `utf8`, and forces you to use `auto 
 decoding`...
Strange definition of "force". You have to voluntarily choose to call those functions that use the GC, which does not qualify as "force" according to any standard definition. You seem to have it reversed. You want to force everyone using D to deal with the headache of avoiding the GC.
Nov 08 2021
parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Monday, 8 November 2021 at 13:52:52 UTC, bachmeier wrote:
 Strange definition of "force". You have to voluntarily choose 
 to call those functions that use the GC, which does not qualify 
 as "force" according to any standard definition.
The language mandates a GC as design, meaning: you cannot remove the GC and use the full language. What is needed is more of a layered approach.
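A short sketch of what "cannot remove the GC and use the full language" means in practice; none of this compiles under -betterC:

```
// Core language features that lean on the GC runtime:
int[] grow(int[] xs)
{
    xs ~= 42;           // appending may allocate from the GC heap
    return xs ~ [1, 2]; // array concatenation always allocates
}

int delegate() counter()
{
    int n;
    return () => ++n;   // the closure's frame is GC-allocated
}
```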
Nov 08 2021
next sibling parent reply Rumbu <rumbu rumbu.ro> writes:
On Monday, 8 November 2021 at 13:59:30 UTC, Ola Fosheim Grøstad 
wrote:
 The language mandates a GC as design, meaning: you cannot 
 remove the GC and use the full language.

 What is needed is more of a layered approach.
Yes, but it can be a deterministic GC like reference counting. Pascal (the modern one) did it from the beginning 20 years ago with strings, dynamic arrays and interfaces, and nobody from the "every cpu cycle counts" camp complained about it. Interfaces in Pascal also resolved the class/struct pointer dichotomy: use a class if you want to manage memory yourself, use an interface implemented by that class if you want us to take care of the deallocation. I wonder if the !gc crowd would be happy with ARC built into the language.
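Not a proposal, just a hand-rolled flavor of the reference counting being discussed, to show its shape; this sketch assumes a plain value type and ignores thread safety (real code would use emplace and handle null from malloc):

```
struct Rc(T)
{
    private static struct Box { T value; size_t refs; }
    private Box* box;

    this(T value)
    {
        import core.stdc.stdlib : malloc;
        box = cast(Box*) malloc(Box.sizeof);
        box.value = value; // sketch: fine for plain value types
        box.refs = 1;
    }
    this(this) { if (box) ++box.refs; }  // copy bumps the count
    ~this()
    {
        import core.stdc.stdlib : free;
        if (box && --box.refs == 0) free(box);
    }
    ref T get() { return box.value; }
}
```

Built into the language, the compiler could elide most of those counter updates, which is what makes ARC attractive compared to this manual version.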
Nov 08 2021
parent Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Monday, 8 November 2021 at 14:45:19 UTC, Rumbu wrote:
 Yes, but it can be a deterministic gc like reference counting. 
 Pascal (modern one) did it from the beginning 20 years ago with 
 strings, dynamic arrays and interfaces and nobody from the 
 "every cpu cycle counts" camp complained about it. Interfaces 
 in pascal resolved also the class/struct pointer dichotomy: use 
 a class if you want to manage memory yourself, use an interface 
 implemented by that class if you want that we take care of the 
 deallocation.

 I wonder if the !gc crowd will be happy with ARC built in 
 language.
I would. You could have ARC for objects that are shared between actors/threads and use a proper GC within an actor/thread. Then you get the best of both worlds IMO. What is preventing ARC from happening right now is that the interface between the compiler frontend and backend isn't clearly defined, so it is very costly for an individual to implement it. With a better compiler architecture, ARC is something a separate group of 2-3 people could add, as it requires limited insight into the compiler internals; you could start by adding regular reference counting and gradually improve the ARC optimizations.
Nov 08 2021
prev sibling next sibling parent reply bachmeier <no spam.net> writes:
On Monday, 8 November 2021 at 13:59:30 UTC, Ola Fosheim Grøstad 
wrote:
 On Monday, 8 November 2021 at 13:52:52 UTC, bachmeier wrote:
 Strange definition of "force". You have to voluntarily choose 
 to call those functions that use the GC, which does not 
 qualify as "force" according to any standard definition.
The language mandates a GC as design, meaning: you cannot remove the GC and use the full language.
I'd be willing to bet money that no program has ever been written that uses all of the language features.
Nov 08 2021
parent Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Monday, 8 November 2021 at 16:29:28 UTC, bachmeier wrote:
 I'd be willing to be money that no program has ever been 
 written that uses all of the language features.
I think that is the wrong line of argument. The core language ought to be minimal, yet complete. So, for a system level programmer, having some language features be GC dependent is a red flag. For that to be reasonable you would need different profiles and a more layered approach: a core language (no GC) and then, on top of that, a more high level language (possibly with GC).
Nov 08 2021
prev sibling parent reply Dukc <ajieskola gmail.com> writes:
On Monday, 8 November 2021 at 13:59:30 UTC, Ola Fosheim Grøstad 
wrote:
 On Monday, 8 November 2021 at 13:52:52 UTC, bachmeier wrote:
 Strange definition of "force". You have to voluntarily choose 
 to call those functions that use the GC, which does not 
 qualify as "force" according to any standard definition.
The language mandates a GC as design, meaning: you cannot remove the GC and use the full language. What is needed is more of a layered approach.
That's what `-betterC` is for I believe.
Nov 08 2021
parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Monday, 8 November 2021 at 18:59:56 UTC, Dukc wrote:
 On Monday, 8 November 2021 at 13:59:30 UTC, Ola Fosheim Grøstad 
 wrote:
 What is needed is more of a layered approach.
That's what `-betterC` is for I believe.
That is more of a feature removal. You ought to have all interfacing types at the bottom layer.
Nov 08 2021
parent reply Dukc <ajieskola gmail.com> writes:
On Monday, 8 November 2021 at 19:27:49 UTC, Ola Fosheim Grøstad 
wrote:
 On Monday, 8 November 2021 at 18:59:56 UTC, Dukc wrote:
 On Monday, 8 November 2021 at 13:59:30 UTC, Ola Fosheim 
 Grøstad wrote:
 What is needed is more of a layered approach.
That's what `-betterC` is for I believe.
That is more of a feature removal. You ought to have all interfacing types at the bottom layer.
What do "interfacing types" mean? You somehow use a lot of different words than the rest of us, making you difficult to follow. And not a lot of concrete D examples. And I mean in general, not just this thread. I've read dozens of your posts and I still don't have a good picture of what you're lobbying for. What I THINK I've gathered so far: - Dmd should be rewritten in idiomatic D style, so that it's easier to experiment with. - After that, a grand rework of the whole language. - A small simple core for the reworked language, much like Lisp or Forth. - A different fork for the language rework, instead of having all that in the same codebase behind `version` declarations or `-preview` switches or such. - No serious priority to stability and backwards compatibility before the language rework is complete. If I got those even nearly right, You're in essence proposing D3.
Nov 09 2021
parent Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Tuesday, 9 November 2021 at 11:58:37 UTC, Dukc wrote:
 What do "interfacing types" mean?
All builtin types should be present in the bottom layer. The layer above should be built predominantly from builtin types plus meta-programming and syntactical sugar.
 - Dmd should be rewritten in idiomatic D style, so that it's 
 easier to experiment with.
Not a requirement. Just a more modular architecture of the compiler, greater independence of compilation stages.
 - After that, a grand rework of the whole language.
Adjustments.
 - A small simple core for the reworked language, much like Lisp 
 or Forth.
No, nothing like Lisp or Forth. Moving as much as possible to meta-programming does not mean Forth or Lisp. It also does not mean a minimalistic syntax: you can have syntactical sugar for common constructs.
 - A different fork for the language rework, instead of having 
 all that in the same codebase behind `version` declarations or 
 `-preview` switches or such.
 - No serious priority to stability and backwards compatibility 
 before the language rework is complete.

 If I got those even nearly right, You're in essence proposing 
 D3.
That is a faster path than evolving the current compiler structure, and also faster than dealing with all the bickering about even the smallest adjustment. There is a reason they are working on Golang 2.
Nov 09 2021
prev sibling parent zjh <fqbqrr 163.com> writes:
On Saturday, 6 November 2021 at 08:19:29 UTC, zjh wrote:

 I've heard many people say that they don't dare to use them , 
 without good tools.
We need to work hard on `vscode/VIM` plugins for `D` users.
Nov 06 2021
prev sibling next sibling parent reply Alexey <invalid email.address> writes:
On Tuesday, 2 November 2021 at 17:27:25 UTC, Dr Machine Code 
wrote:
 Go seeming to be
Dlang could already easily take over Golang programmers: 'go generate', the absence of generics and other language features, and the low expressiveness of Golang make Dlang better already. As for Rust: I don't think Dlang should compete with it directly, though Dlang could optionally adopt Rust's 'strong' features and by this make itself more attractive to those who would otherwise choose Rust. But in general, there are too few projects which can display Dlang's awesomeness, while for Golang it's Docker and Kubernetes, and for Rust it is Mozilla's programs. IMHO, if Dlang had something like a web engine written in it... or maybe its own Linux kernel replacement, or some DBMS (for instance a RethinkDB fork or a Cassandra fork), or an OpenShift substitute - this would be huge!
Nov 06 2021
next sibling parent Paulo Pinto <pjmlp progtools.org> writes:
On Saturday, 6 November 2021 at 11:09:24 UTC, Alexey wrote:
 On Tuesday, 2 November 2021 at 17:27:25 UTC, Dr Machine Code 
 wrote:
 [...]
Dlang can now already easily takeover Golang programmers: 'go generate', absence of generics and other language features and low expressiveness of Golang, makes Dlang already better. [...]
https://go.googlesource.com/proposal/+/refs/heads/master/design/43651-type-parameters.md
 This is the design for adding generic programming using type 
 parameters to the Go language. This design has been proposed 
 and accepted as a future language change. We currently expect 
 that this change will be available in the Go 1.18 release in 
 early 2022.
https://go2goplay.golang.org/ While D keeps trying to decide what market it should cater to, others keep closing the gap, while having better ecosystem.
Nov 06 2021
prev sibling parent Alexey <invalid email.address> writes:
On Saturday, 6 November 2021 at 11:09:24 UTC, Alexey wrote:
 there are too few projects, which can display Dlang's 
 awesomeness, while for Golang it's Docker and Kubernetes and 
 for Rust it is Mozilla programs.

 IMHO, If Dlang would have something like Web-engine written on 
 it.. or may be if Dlang had own Linux-kernel replacement, or 
 maybe some DBMS (for instance RethinkDB fork or Cassandra 
 fork), or OpenShift substitution - this would be huge!
How about this: create a Dlang job board, so companies and entrepreneurs could find and hire D programmers? If this works out, Dlang's fame will become self-sustaining.
Nov 09 2021
prev sibling next sibling parent Booster <Booster Rooster.com> writes:
On Tuesday, 2 November 2021 at 17:27:25 UTC, Dr Machine Code 
wrote:
 It got [asked on 
 reddit](https://www.reddit.com/r/d_language/comments/q74bzr/
hy_is_d_unpopular/) sub but for those that aren't active too, I'd like you
opinions. Please don't get me wrong, I also love D, I've used it everywhere I
can and I'd say it's my favourite language (yes I have one...) but I'm as as
the reddit's OP, trying to understand why it's unpopular. Rust and Go seeming
to be getting more and more users. I think it's due to large ecosystem and the
big corporations with deep pockets that pushes them. But I'd like to know you
all opinions
It's because the leaders of D do not want it to be popular. D's popularity is a manifestation of their attitude about it. They don't care any more. It's just their golf game, simple as that.
Nov 06 2021
prev sibling next sibling parent reply rumbu <rumbu rumbu.ro> writes:
On Tuesday, 2 November 2021 at 17:27:25 UTC, Dr Machine Code 
wrote:
 It got [asked on 
 reddit](https://www.reddit.com/r/d_language/comments/q74bzr/
hy_is_d_unpopular/) sub but for those that aren't active too, I'd like you
opinions. Please don't get me wrong, I also love D, I've used it everywhere I
can and I'd say it's my favourite language (yes I have one...) but I'm as as
the reddit's OP, trying to understand why it's unpopular. Rust and Go seeming
to be getting more and more users. I think it's due to large ecosystem and the
big corporations with deep pockets that pushes them. But I'd like to know you
all opinions
First of all, I will separate the language itself from the standard library.

One of the big mistakes - in my opinion - was the involvement of language maintainers in the standard library design, which is a very different animal. Language maintainers must provide the minimal blocks in the standard library and let the crowd design the rest of the content as they see fit. This would allow, for example, the gc crowd to abuse the garbage collector if they want to, but also the !gc crowd to get rid of it. The future will prove whether D really needs a garbage collector or not. The language maintainers need just to publish some rules, and that's all.

Third party users want to get the job done. Let's do a web server. Let's connect to a database. Let's spawn some window on the screen. What do they get instead? 12 sorting methods in std.algorithm. Personally, I really like the arsd libs more than any mumbo-jumbo written across Phobos. If you ask me, I would grant Adam the official position of standard library designer.

Probably you will say that's ok, the crowd is free to design their libraries, just push them to code.dlang.org. In reality this is a graveyard (or the morgue, if we count std.experimental as the graveyard). Why are projects dead? Simply because they are not officially blessed by the language maintainers and not included in the standard library.

To reinforce what I said, I will bring to the table the unwanted subject of Tango (yes, you can lie to yourself that it was not the official library, but the reality is that it was the de facto standard library). When the library design was left in the hands of the crowd, the content exceeded any expectation and consequently D's popularity flourished, despite the fact that there was only one way to sort things (and by default a correct one; string collation was built-in). Phobos is still struggling after 15 years to match some Tango features.

Now, having the library designed by the crowd would put pressure on language maintainers to update D to cope with the library requirements. If ranges are first class citizens, let's get them some syntactic sugar. If the gc is worthless, let's get rid of it. And so on.

Language maintainers became lazy. Instead of improving the language, it's easy to outsource everything to a library. D was nice and innovative 15 years ago; now it's struggling to keep pace with new languages by patching missing features with libraries. You want reference counting? Here you have 15 libraries to choose from. You want tagged unions? Here I give you Algebraic. Wait, let's deprecate this, SumType sounds better. Tuples, Nullable, Optional? I have another 10 brand new libraries for you. Dependency injection, serializing? Too advanced to be included in the language, here you have another 5 libraries... Even old languages like C++ embraced new features.

To sum things up, why D became unpopular:
- because the standard library does not match users' expectations;
- because the language didn't evolve in the last 13 years;
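The tagged-union churn in miniature; std.sumtype is the module that superseded Algebraic, and this is merely a sketch of its use:

```
import std.sumtype : SumType, match;

alias Number = SumType!(int, double);

double asDouble(Number n)
{
    // Handlers must cover every member type, checked at compile time.
    return n.match!(
        (int i) => cast(double) i,
        (double d) => d
    );
}
```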
Nov 08 2021
next sibling parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Monday, 8 November 2021 at 08:07:54 UTC, rumbu wrote:
 One of the big mistakes - in my opinion - was the involvement 
 of language maintainers in the standard library design which is 
 a very different animal. Language maintainers must provide the 
 minimal blocks in the standard library and let the crowd design 
 the rest of content as they consider. This will allow, for 
 example, the gc crowd to abuse the garbage collector if they 
 want so, but also the !gc crowd to get rid of it. The future 
 will prove if D really needs a garbage collector or not. The 
 language maintainers need just to publish some rules and that's 
 all.
Yes, I think this is correct. I never understood why people claim that Tango was a big issue. I was only interested in low level programming and did not use Tango (it was too high level for me), but I never saw it as a limiting factor. I was more interested in using C libraries than D libraries.
 Probably you will say that's ok, the crowd is free to design 
 their libraries, just push it on code.dlang.org. In reality 
 this is a graveyard (or the morgue, if we count 
 std.experimental as the graveyard). Why projects are dead, 
 simply because they are not officially blessed by the language 
 maintainers and not included in the standard library.
I think the idea was to replicate the success of Python, but with Python speed does not matter; it is all about convenience and stability. So that is essentially not possible for D, where people have very different requirements (in comparison to Python). Also, developing a standard library like Python's takes a lot of time and effort, and you need critical mass to do it (or financial backing).
 - because the language didn't evolve in the last 13 years;
Yes, it is growing, but not really evolving.
Nov 08 2021
parent Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Monday, 8 November 2021 at 10:57:37 UTC, Ola Fosheim Grøstad 
wrote:
 Yes, I think this is correct. I never understand why people 
 claim that Tango was a big issue. I was only interested in low 
 level programming and did not use Tango, it was too high level 
 for me, but I never saw it as a limiting factor.
For the record, it is of course a requirement that all frameworks build on top of the same foundational runtime. So you can have a core standard library without a GC runtime, and on top of it an expanded (compatible) runtime with GC. You need cooperation on runtime profiles in addition to a small, focused core standard library.
Nov 08 2021
prev sibling parent Antonio <antonio abrevia.net> writes:
On Monday, 8 November 2021 at 08:07:54 UTC, rumbu wrote:
 ...

 To reinforce what I said, I will bring on the table the 
 unwanted subject of Tango (yes, you can lie yourself that it 
 was not the official library, but the reality is that it was de 
 facto standard library). When the library design was let in the 
 hands of the crowd, the content exceeded any expectation and 
 consequentely D's popularity flourished, despite the fact that 
 there was only one way to sort things (and by default a correct 
 one, string collation was built-in). Phobos is still struggling 
 after 15 years to match some Tango features.
Phobos is absolutely "template" oriented: this is a hard decision that, may be, is good for some developers/projects but absolutely bad for others: templates (and mixins, and...) are hard to integrate with debugging/inspecting and intelligence tooling. Tango was a more standard library from the point of view of how an standard library is (Using the "generic type" way instead "template all"). It is difficult for a team accustomed to the speed of working with other statically typed languages well integrated with of". * If I can't inspect objects and its properties recursively as easy as I can with Java then I really can't debug code. * If I can't identify easily a variable type (As a class or interface or whatever) and I need to answer the compiler what "this strange thing" is... then language doesn't describe 100% the model with it's own words: sometimes I wonder if the D team thinks in terms of language or only in terms of compiler. In my opinion... a "template all" library is an option for some developments, but forcing everybody to work this way is an stopper not only for developers: for D language itself. The facto, Phobos is what actually defines the D language: Phobos is killing D.
 Now, having the library designed by the crowd, it will put 
 pressure to language maintainers to update D to cope with the 
 library requirements. If ranges are the first class citizens, 
 let's get them some syntactic sugar. If the gc is worthless, 
 let's get rid of it. And so on.
Absolutely
 Language maintainers became lazy. Instead of improving the 
 language, it's easy to outsource everything to a library. D was 
 nice and innovative 15 years ago, now it's struggling to keep 
 the pace with new languages by patching missing features with 
 libraries. You want resource counting, here you have 15 
 libraries to choose from. You want tagged unions, here I give 
 you Algebraic. Wait, let's deprecate this, sumtype sounds 
 better. Tuples, Nullable, Optional? I have another 10 brand new 
 libraries for you. Dependency injection, serializing? Too 
 advanced to be included in the language, here you have another 
 5 libraries.... Even old languages like C++ embraced new 
 features;
I agree. Optional/Some/None (remove Nullable ASAP) and pattern matching would be a great acquisition; union types would be appreciated; so would powerful type inference (i.e. with Voldemort types). If TypeScript can afford some of these features, why not D?
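For reference, what Phobos currently offers here, as a hedged sketch; std.typecons.Nullable is real, find() is made up:

```
import std.typecons : Nullable, nullable;

// The current Phobos stand-in for Optional/Some/None.
Nullable!size_t find(const int[] xs, int needle)
{
    foreach (i, x; xs)
        if (x == needle) return nullable(i);
    return Nullable!size_t.init; // the "None" case
}
```

Without language-level pattern matching, callers must check isNull and call get by hand, which is exactly the ergonomic gap being pointed out.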
 To sum things up, why D became unpopular:
 - because the standard library does not match users' 
 expectations;
And it doesn't fit developers' "normalized" way of working with their tooling.
 - because the language didn't evolve in the last 13 years;
Good point.
Nov 08 2021
prev sibling next sibling parent forkit <forkit gmail.com> writes:
On Tuesday, 2 November 2021 at 17:27:25 UTC, Dr Machine Code 
wrote:
 It got [asked on 
 reddit](https://www.reddit.com/r/d_language/comments/q74bzr/
hy_is_d_unpopular/) sub but for those that aren't active too, I'd like you
opinions. Please don't get me wrong, I also love D, I've used it everywhere I
can and I'd say it's my favourite language (yes I have one...) but I'm as as
the reddit's OP, trying to understand why it's unpopular. Rust and Go seeming
to be getting more and more users. I think it's due to large ecosystem and the
big corporations with deep pockets that pushes them. But I'd like to know you
all opinions
The most important aspects of a programming language are its cognitive demands and its problem-solving performance. In the end, people just want to solve problems quickly, easily, and efficiently. This is, and always has been, the driver for language development and evolution. So 'if' your assertion is correct, then one basis for that is that 'people' are solving their problems more quickly, more easily, and more efficiently, using other (as in one or more) languages.
Nov 08 2021
prev sibling next sibling parent reply arco <qva6y4sqi relay.firefox.com> writes:
On Tuesday, 2 November 2021 at 17:27:25 UTC, Dr Machine Code 
wrote:
 It got [asked on 
 reddit](https://www.reddit.com/r/d_language/comments/q74bzr/
hy_is_d_unpopular/) sub but for those that aren't active too, I'd like you
opinions. Please don't get me wrong, I also love D, I've used it everywhere I
can and I'd say it's my favourite language (yes I have one...) but I'm as as
the reddit's OP, trying to understand why it's unpopular. Rust and Go seeming
to be getting more and more users. I think it's due to large ecosystem and the
big corporations with deep pockets that pushes them. But I'd like to know you
all opinions
I think this argument has it backwards. The big corporations with deep pockets are a consequence of the success, not the cause. Big corporations like Microsoft, Google, Facebook etc. only really became interested in Rust in 2019. Until then it was a small enthusiasts' language with the occasional in-house project here and there, not unlike D. In fact, I find the geneses of D and Rust remarkably similar: both were born in a company, out of frustration with C++ and the belief that their creators could design something better. Even the problems of C++ that D and Rust wanted to fix overlap to a large degree: better memory management, better type system, better encapsulation, getting rid of the preprocessor... Of course, from there their respective routes were very different. Rust succeeded in convincing the big corps to fund it and adopt it. But if we are debating why D didn't, the question is then what made Rust different. I have tried to lay out what I believe are the reasons.
Nov 08 2021
next sibling parent reply Mike Parker <aldacron gmail.com> writes:
On Tuesday, 9 November 2021 at 07:22:01 UTC, arco wrote:

 In fact I find that the geneses of D and Rust are remarkably 
 similar: both were born in a company, out of frustration with 
 C++ and the belief that their creators could design something 
 better.
Not even close to similar. Digital Mars is a one-man company and D a one-man project. Rust had and continues to have the resources of Mozilla behind it.
Nov 08 2021
parent reply arco <qva6y4sqi relay.firefox.com> writes:
On Tuesday, 9 November 2021 at 07:53:19 UTC, Mike Parker wrote:

 Not even close to similar. Digital Mars is a one-man company 
 and D a one-man project. Rust had and continues to have the 
 resources of Mozilla behind it.
Rust also started as a one-man show (Graydon Hoare). Mozilla initially supported it as a research project, not as a large investment, and its resources are limited anyway compared to the likes of Google. The comparison holds in my opinion; it's what came after that made the difference.
Nov 09 2021
next sibling parent reply bachmeier <no spam.net> writes:
On Tuesday, 9 November 2021 at 08:32:07 UTC, arco wrote:

 Rust also started as a one-man show (Graydon Hoare). Mozilla 
 initially supported it as a research project, not as a large 
 investment, and its resources are limited anyway compared to 
 the likes of Google. The comparison holds in my opinion; it's 
 what came after that made the difference.
This is an interesting interpretation of history. By the end of 2014, the Rust core team had eight members, at least seven of whom were hired by Mozilla to work on getting the language in order for the 1.0 release: https://web.archive.org/web/20141225072631/https://github.com/rust-lang/rust/wiki/Note-core-team But that understates the funding. They had the Servo team, many of whom used paid hours to work on Rust*, and paid interns. \* Even when these were not direct contributions to the language, being paid to work on Servo meant they were paid to test the language, find bugs, and identify areas needing improvement.
Nov 09 2021
parent reply arco <qva6y4sqi relay.firefox.com> writes:
On Tuesday, 9 November 2021 at 09:54:06 UTC, bachmeier wrote:
 On Tuesday, 9 November 2021 at 08:32:07 UTC, arco wrote:

 Rust also started as a one-man show (Graydon Hoare). Mozilla 
 initially supported it as a research project, not as a large 
 investment, and its resources are limited anyway compared to 
 the likes of Google. The comparison holds in my opinion; it's 
 what came after that made the difference.
This is an interesting interpretation of history. By the end of 2014, the Rust core team had eight members, at least seven of whom were hired by Mozilla to work on getting the language in order for the 1.0 release: https://web.archive.org/web/20141225072631/https://github.com/rust-lang/rust/wiki/Note-core-team But that understates the funding. They had the Servo team, many of whom used paid hours to work on Rust*, and paid interns.
Yes. That was in 2014. But Hoare started Rust in 2006 as a personal project. Which is my point: in 2006 it was an obscure experiment created by one person. In 2014 Mozilla was employing people both to work on it and to use it. In 2019 Microsoft & co moved in. In 2021 Linux developers are considering using it in the kernel. In other words, instead of dismissing comparisons between Dlang and Rust as somehow unfair because Rust has a lot more resources, a more interesting question IMO is why and how Rust attracted those resources and what lessons can be learnt from it.
Nov 09 2021
next sibling parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Tuesday, 9 November 2021 at 10:32:57 UTC, arco wrote:
 In other words, instead of dismissing comparisons between Dlang 
 and Rust as somehow unfair because Rust has a lot more 
 resources, a more interesting question IMO is why and how Rust 
 attracted those resources and what lessons can be learnt from 
 it.
Walter is unlikely to hand over D to a commercial entity, which I think most D users are happy with. That would be a completely different project. Rust follows the same path as Swift and JavaScript: a person affiliated with an organization creates a language, then hands decision making over to the organization, which in turn evolves it further into a "commercial" commodity.
Nov 09 2021
next sibling parent reply zjh <fqbqrr 163.com> writes:
On Tuesday, 9 November 2021 at 10:48:01 UTC, Ola Fosheim Grøstad 
wrote:

 Walter is unlikely to hand over D to a commercial entity, which 
 I think most D users are happy with. That would be a completely 
 different project.
The reason I like `D` is that the author of D doesn't bloat the language.
Nov 09 2021
parent reply zjh <fqbqrr 163.com> writes:
On Tuesday, 9 November 2021 at 11:05:09 UTC, zjh wrote:

GUI is a market with a large number of users. At present, it is 
still a mess. I think the `d` community could arrange for 
somebody to port `wxWidgets`. This may be a good idea.
Many people strongly need a good 'GUI'.
Microsoft is also working on `winui3`
Nov 13 2021
next sibling parent reply zjh <fqbqrr 163.com> writes:
On Sunday, 14 November 2021 at 03:48:14 UTC, zjh wrote:

 Microsoft is also working on `winui3`
Those who are satisfied with GC: it is YOU who make `d` unpopular.
Nov 13 2021
parent zjh <fqbqrr 163.com> writes:
On Sunday, 14 November 2021 at 03:51:35 UTC, zjh wrote:
 Microsoft is also working on `winui3`
`Rust` is for large companies. D can fight for the so-called `poor man`. So striving for a `d` port of the poor man's GUI, `wxWidgets`, may be good, I think.
Nov 13 2021
prev sibling parent reply Imperatorn <johan_forsberg_86 hotmail.com> writes:
On Sunday, 14 November 2021 at 03:48:14 UTC, zjh wrote:
 On Tuesday, 9 November 2021 at 11:05:09 UTC, zjh wrote:

 GUI is a market with a large number of users. At present, it is 
 still a mess. I think the `d` community could arrange for 
 somebody to port `wxWidgets`. This may be a good idea.
 Many people strongly need a good 'GUI'.
 Microsoft is also working on `winui3`
Do you mean port like an actual port and not only bindings?
Nov 14 2021
parent reply zjh <fqbqrr 163.com> writes:
On Sunday, 14 November 2021 at 20:39:07 UTC, Imperatorn wrote:

 Do you mean port like an actual port and not only bindings?
A binding is OK, as long as it's usable for D users.
Nov 14 2021
parent reply Imperatorn <johan_forsberg_86 hotmail.com> writes:
On Monday, 15 November 2021 at 00:57:43 UTC, zjh wrote:
 On Sunday, 14 November 2021 at 20:39:07 UTC, Imperatorn wrote:

 Do you mean port like an actual port and not only bindings?
A binding is OK, as long as it's usable for D users.
http://wxd.sourceforge.net/ It's a bit outdated tho. Tried it?
Nov 15 2021
parent reply zjh <fqbqrr 163.com> writes:
On Monday, 15 November 2021 at 09:28:29 UTC, Imperatorn wrote:

 It's a bit outdated tho. Tried it?
It's just out of date. Someone needs to bind it. I'm sure it will attract a lot of users for `D`. `wxWidgets` with `D` must be good. `GUI` is a big market.
Nov 15 2021
next sibling parent reply Imperatorn <johan_forsberg_86 hotmail.com> writes:
On Monday, 15 November 2021 at 10:21:04 UTC, zjh wrote:
 On Monday, 15 November 2021 at 09:28:29 UTC, Imperatorn wrote:

 It's a bit outdated tho. Tried it?
It's just out of date. Someone needs to bind it. I'm sure it will attract a lot of users for `D`. `wxWidgets` with `D` must be good. `GUI` is a big market.
Which of these alternatives do you consider best atm: https://wiki.dlang.org/GUI_Libraries
Nov 15 2021
parent reply zjh <fqbqrr 163.com> writes:
On Tuesday, 16 November 2021 at 07:15:58 UTC, Imperatorn wrote:

 `wxWidgets` with `D` must be good.
 `GUI` is a big market.
 https://wiki.dlang.org/GUI_Libraries
Those GUIs are not competitive. Only `wxWidgets` is good. It is the `most friendly` one for programmers.
Nov 15 2021
next sibling parent reply zjh <fqbqrr 163.com> writes:
On Tuesday, 16 November 2021 at 07:20:26 UTC, zjh wrote:

 Only `wxWidgets` is good.
 It is the `most friendly` one for programmers.
`wxWidgets`'s overall capability is the best.
Nov 15 2021
parent reply zjh <fqbqrr 163.com> writes:
On Tuesday, 16 November 2021 at 07:22:16 UTC, zjh wrote:

 `wxWidgets`'s overall capability is the best.
`QT` is `DLL`-only; `sciter` is the same. The rest are too small. `wxWidgets` is comprehensive, can be statically linked as a `lib`, and is free. Very good. Maybe we can cooperate with the `wxWidgets` community.
Nov 15 2021
parent zjh <fqbqrr 163.com> writes:
On Tuesday, 16 November 2021 at 07:37:17 UTC, zjh wrote:

 `wxWidgets`'s overall capability is the best.
We can also investigate `python`'s common binding for reference. Then we can do our own binding.
Nov 15 2021
prev sibling parent reply Imperatorn <johan_forsberg_86 hotmail.com> writes:
On Tuesday, 16 November 2021 at 07:20:26 UTC, zjh wrote:
 On Tuesday, 16 November 2021 at 07:15:58 UTC, Imperatorn wrote:

 `wxWidgets` with `D` must be good.
 `GUI` is a big market.
 https://wiki.dlang.org/GUI_Libraries
Those GUIs are not competitive. Only `wxWidgets` is good. It is the `most friendly` one for programmers.
I have no idea what you just said. Do you mean that GTK, DWT/SWT, Qt and tk aren't usable? :D
Nov 16 2021
parent zjh <fqbqrr 163.com> writes:
On Tuesday, 16 November 2021 at 08:40:30 UTC, Imperatorn wrote:

 Do you mean that GTK, DWT/SWT, Qt and tk aren't usable? :D
If you score `GUI` libraries according to `size/user-friendliness/comprehensiveness/open-source/speed`, `wxWidgets` is basically the best.
Nov 16 2021
prev sibling parent zjh <fqbqrr 163.com> writes:
On Monday, 15 November 2021 at 10:21:04 UTC, zjh wrote:

 `GUI` is a big market.
The `GUI` market is a warlord scuffle. We just need to bind to the one with the most potential. We can get a large number of users.
Nov 15 2021
prev sibling parent arco <qva6y4sqi relay.firefox.com> writes:
On Tuesday, 9 November 2021 at 10:48:01 UTC, Ola Fosheim Grøstad 
wrote:
 On Tuesday, 9 November 2021 at 10:32:57 UTC, arco wrote:
 In other words, instead of dismissing comparisons between Dlang 
 and Rust as somehow unfair because Rust has a lot more 
 resources, a more interesting question IMO is why and how Rust 
 attracted those resources and what lessons can be learnt from 
 it.
Walter is unlikely to hand over D to a commercial entity, which I think most D users are happy with. That would be a completely different project. Rust follows the same path as Swift and JavaScript: a person affiliated with an organization creates a language, then hands decision making over to the organization, which in turn evolves it further into a "commercial" commodity.
Rust is not a commercial entity. It's governed by the Rust Foundation, which is non-profit (and also very recent), but it remains a community project open to anyone who wants to get involved. Again, it's not very different from D in that regard.
Nov 09 2021
prev sibling parent bachmeier <no spam.net> writes:
On Tuesday, 9 November 2021 at 10:32:57 UTC, arco wrote:

 In other words, instead of dismissing comparisons between Dlang 
 and Rust as somehow unfair because Rust has a lot more 
 resources, a more interesting question IMO is why and how Rust 
 attracted those resources and what lessons can be learnt from 
 it.
Okay
Nov 09 2021
prev sibling parent Andrei Alexandrescu <SeeWebsiteForEmail erdani.com> writes:
On 2021-11-09 3:32, arco wrote:
 On Tuesday, 9 November 2021 at 07:53:19 UTC, Mike Parker wrote:
 
 Not even close to similar. Digital Mars is a one-man company 
 and D a one-man project. Rust had and continues to have the 
 resources of Mozilla behind it.
Rust also started as a one-man show (Graydon Hoare). Mozilla initially supported it as a research project, not as a large investment, and its resources are limited anyway compared to the likes of Google. The comparison holds in my opinion; it's what came after that made the difference.
and nope
Nov 11 2021
prev sibling next sibling parent Ola Fosheim Grøstad <ola.fosheim.grostad gmail.com> writes:
On Tuesday, 9 November 2021 at 07:22:01 UTC, arco wrote:
 In fact I find that the geneses of D and Rust are remarkably 
 similar: both were born in a company, out of frustration with 
 C++ and the belief that their creators could design something 
 better.
D was not born in a company. D was created because Walter didn't feel like being retired and set out to create something he would like to use and share with the world. If you want D to take a different direction, you have to change Walter's idea of what makes up a good language. AFAIK, Rust was handed over by the creator to the organization he was part of. Completely different settings.
Nov 08 2021
prev sibling parent Andrei Alexandrescu <SeeWebsiteForEmail erdani.com> writes:
On 2021-11-09 2:22, arco wrote:
 In fact I find that the geneses of D and Rust are remarkably similar: 
 both were born in a company
nope
Nov 11 2021
prev sibling next sibling parent reply forkit <forkit gmail.com> writes:
On Tuesday, 2 November 2021 at 17:27:25 UTC, Dr Machine Code 
wrote:
 ...trying to understand why it's unpopular.
 ...But I'd like to know you all opinions.
Does not being popular mean it's unpopular?? That's a philosophical question, I suppose. ...in any case... Is this comment (below) a 'possible' contributing factor? "Regrets? I should have made it open source from the beginning! Stupid me." - Walter Bright https://news.ycombinator.com/item?id=27102584
Nov 09 2021
parent reply harakim <harakim gmail.com> writes:
On Wednesday, 10 November 2021 at 00:49:43 UTC, forkit wrote:
 On Tuesday, 2 November 2021 at 17:27:25 UTC, Dr Machine Code 
 wrote:
 ...trying to understand why it's unpopular.
 ...But I'd like to know you all opinions.
Does not being popular mean it's unpopular?? That's a philosophical question, I suppose. ...in any case... Is this comment (below) a 'possible' contributing factor? "Regrets? I should have made it open source from the beginning! Stupid me." - Walter Bright https://news.ycombinator.com/item?id=27102584
And to my point and Kapps' (?) point, this quote: "But dlang has a long problem of change constantly without backward compatibility and a lot of projects just die because of it, for example the mentioned https://github.com/facebookarchive/warp :"
Nov 09 2021
parent reply Mike Parker <aldacron gmail.com> writes:
On Wednesday, 10 November 2021 at 06:35:08 UTC, harakim wrote:

 And to my point and Kapps' (?) point, this quote:
 "But dlang has a long problem of change constantly without 
 backward compatibility and a lot of projects just die because 
 of it, for example the mentioned 
 https://github.com/facebookarchive/warp :"
And Walter's reply to that:
 That isn't why Warp was discontinued.
His fork: https://github.com/DigitalMars/dmpp
Nov 09 2021
parent reply Dr Machine Code <jckj33 gmail.com> writes:
On Wednesday, 10 November 2021 at 07:15:56 UTC, Mike Parker wrote:
 On Wednesday, 10 November 2021 at 06:35:08 UTC, harakim wrote:

 And to my point and Kapps' (?) point, this quote:
 "But dlang has a long problem of change constantly without 
 backward compatibility and a lot of projects just die because 
 of it, for example the mentioned 
 https://github.com/facebookarchive/warp :"
And Walter's reply to that:
 That isn't why Warp was discontinued.
His fork: https://github.com/DigitalMars/dmpp
Why was it discontinued? I didn't even know that.
Nov 10 2021
parent Mike Parker <aldacron gmail.com> writes:
On Wednesday, 10 November 2021 at 17:00:19 UTC, Dr Machine Code 
wrote:
 And Walter's reply to that:

 That isn't why Warp was discontinued.
His fork: https://github.com/DigitalMars/dmpp
Why was it discontinued? I didn't even know that.
Facebook stopped using it. I don’t know anything beyond that.
Nov 10 2021
prev sibling next sibling parent reply forkit <forkit gmail.com> writes:
On Tuesday, 2 November 2021 at 17:27:25 UTC, Dr Machine Code 
wrote:
 [...]
One of the creators of Go said, and I quote, "you can actually write quite nice code in C++" https://youtu.be/sln-gJaURzk?t=868 "... if you write in a subset of it..." Another interesting comment during those few minutes of the video, where they were discussing C++, was about being able to better reason about the code (in Go) vs C++. I wonder if that is why C++ is becoming much less popular - despite it being everywhere - cause it's soooo hard to reason about the code in C++ (vs Go, is their argument) - even more so for newcomers. My 'first point' being: will being able to 'reason about the code' be (or is it already) a reason why D won't ever become 'popular'? Is there an equivalent 'subset' in D, where 'you can actually write quite nice code'? Also, I found an interesting part of another video relevant too, where 'Dave' basically has a go at Herb for trying to 'rephrase' user complaints about C++ being too complex. Herb's argument is that we're making it simpler (by adding to it). Dave's argument is: no, you're NOT - https://youtu.be/raB_289NxBk?t=5486 My 'second point' being: has D already become too complex? And is there any way to make it simpler, other than 'adding to it'? I personally do not like simple. To me, that equates to restricted. But I doubt most are like me, hence why D may never, ever, become 'popular'. I'd welcome comments that respond to my 'first point' and my 'second point'.
Nov 13 2021
parent reply Ola Fosheim Grøstad <ola.fosheim.grostad gmail.com> writes:
On Sunday, 14 November 2021 at 03:19:37 UTC, forkit wrote:
 My 'second point' being: has D already become too complex? And 
 is there any way to make it simpler, other than 'adding to it'?

 I personally do not like simple. To me, that equates to 
 restricted. But I doubt most are like me, hence why D may 
 never, ever, become 'popular'.
Features like `@trusted` only make sense if the invariants (rules) of the type system are easy to reason about; otherwise it becomes impossible for most programmers to write correct `@trusted` code. Both C++ and D have unnecessary complexity, meaning that the complexity does not add power; it is basically just relics of the past. C++ cannot fix this, because of critical mass. It is a mistake for D to follow the same recipe…
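For example, a minimal sketch of that reasoning burden (the function names are made up): the `@trusted` wrapper below is only sound because the programmer can verify by hand an invariant the compiler cannot check, namely that a slice's `.ptr` and `.length` always describe valid memory.

```D
// Raw pointer arithmetic: the compiler cannot verify the bounds, so @system.
@system int sumUnchecked(const(int)* p, size_t n)
{
    int total = 0;
    foreach (i; 0 .. n)
        total += p[i];
    return total;
}

// @trusted vouches for the call; sound only under the slice invariant above.
@trusted int sum(const(int)[] a)
{
    return sumUnchecked(a.ptr, a.length);
}

@safe unittest
{
    assert(sum([1, 2, 3]) == 6); // callable from @safe code
}
```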
Nov 14 2021
parent reply Dr Machine Code <jckj33 gmail.com> writes:
On Sunday, 14 November 2021 at 21:28:35 UTC, Ola Fosheim Grøstad 
wrote:
 On Sunday, 14 November 2021 at 03:19:37 UTC, forkit wrote:
 [...]
Features like `@trusted` only make sense if the invariants (rules) of the type system are easy to reason about; otherwise it becomes impossible for most programmers to write correct `@trusted` code. Both C++ and D have unnecessary complexity, meaning that the complexity does not add power; it is basically just relics of the past. C++ cannot fix this, because of critical mass. It is a mistake for D to follow the same recipe…
What's an example of D's complexity?
Nov 14 2021
parent reply Ola Fosheim Grøstad <ola.fosheim.grostad gmail.com> writes:
On Sunday, 14 November 2021 at 21:42:44 UTC, Dr Machine Code 
wrote:
 What's an example of D's complexity?
In general? The forums are full of those. I don't want to start a flamewar by listing them, but despite people claiming meta-programming is easy in D, it isn't difficult to point out that there are half-baked mechanisms, feature overlap that isn't needed, limited deduction abilities that can make expressing things more awkward, etc. There is too much complexity in function signatures, for sure; how many casual D programmers remember what inout does? The acid test of language complexity is to look at typical code bases and ask yourself if it would be a good idea to tell newbies to learn the language from studying those. I think C++, Rust and D all have a usability problem there.
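For reference, a minimal sketch of what `inout` does (the function name is made up): one body serves mutable, const and immutable arguments, and the qualifier of the result is deduced from the argument.

```D
// inout propagates the caller's qualifier to the return type.
inout(int)[] firstHalf(inout(int)[] arr)
{
    return arr[0 .. arr.length / 2];
}

void main()
{
    int[] m = [1, 2, 3, 4];
    immutable int[] i = [1, 2, 3, 4];
    int[] a = firstHalf(m);            // result is mutable
    immutable(int)[] b = firstHalf(i); // result is immutable
    assert(a == [1, 2] && b == [1, 2]);
}
```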
Nov 14 2021
next sibling parent forkit <forkit gmail.com> writes:
On Sunday, 14 November 2021 at 21:58:21 UTC, Ola Fosheim Grøstad 
wrote:
 The acid test of language complexity is to look at typical code 
 bases and ask yourself if it would be a good idea to tell 
 newbies to learn the language from studying those.

 I think C++, Rust and D all have a usability problem there.
No. Never send a newbie to study a typical code base, no matter what the language. First, the concept of there being a 'typical' code base is a bit sus. Second, newbies need (up-to-date) learning material that minimizes learning time and confusion, and provides concise, focused guides to specific topics (Andrew Koenig, Accelerated C++, 2000).
Nov 14 2021
prev sibling next sibling parent forkit <forkit gmail.com> writes:
On Sunday, 14 November 2021 at 21:58:21 UTC, Ola Fosheim Grøstad 
wrote:
 On Sunday, 14 November 2021 at 21:42:44 UTC, Dr Machine Code 
 wrote:
 What's an example of D's complexity?
In general? The forums are full of those. I don't want to start a flamewar by listing them, but despite people claiming meta-programming is easy in D, it isn't difficult to point out that there are half-baked mechanisms, feature overlap that isn't needed, limited deduction abilities that can make expressing things more awkward, etc. There is too much complexity in function signatures, for sure; how many casual D programmers remember what inout does? The acid test of language complexity is to look at typical code bases and ask yourself if it would be a good idea to tell newbies to learn the language from studying those. I think C++, Rust and D all have a usability problem there.
I'd also argue that complexity is the natural outcome of any programming language that is suitable for solving problems across different domains. Complexity is the natural outcome because developing 'general purpose' problem-solving strategies that cut across different domains is very difficult. With such a language, you must begin on the basis: 'expect the unexpected'. When introducing such a language to novices, you need to do it in a very structured and focused manner, and with a good understanding of cognitive science as it relates to learning. The inverse is also true: a programming language designed for solving problems within a specific domain will be less complex than one designed to solve problems across different domains. Complexity is not the problem per se. Our approach to that complexity is usually the problem.
Nov 14 2021
prev sibling next sibling parent reply forkit <forkit gmail.com> writes:
On Sunday, 14 November 2021 at 21:58:21 UTC, Ola Fosheim Grøstad 
wrote:
 I think C++, Rust and D all have a usability problem there.
I really like this video... it compares doing gcd (greatest common divisor) in 16 different languages. If I were running an intro to programming course, I'd make them all watch this as soon as they walk into their first class. The link below starts where he compares the C++ vs D vs Rust solutions. In D, you just do it and, quote, '...move on'. https://youtu.be/UVUjnzpQKUo?t=449
Nov 14 2021
parent Siarhei Siamashka <siarhei.siamashka gmail.com> writes:
On Monday, 15 November 2021 at 03:10:34 UTC, forkit wrote:
 I really like this video... it compares doing gcd (greatest 
 common divisor) in 16 different languages. If I were running 
 an intro to programming course, I'd make them all watch this 
 as soon as they walk into their first class.

 The link below starts where he compares the C++ vs D vs Rust 
 solutions.

 In D, you just do it and, quote, '...move on'.

 https://youtu.be/UVUjnzpQKUo?t=449
This video shows a suboptimal solution for D, because their implementation unnecessarily iterates over the array twice. Their C++ solution is much faster, because it does the processing in a single pass. But D can do it in a single pass too:

```D
import std.algorithm : max, maxElement, min, minElement, reduce;
import std.numeric : gcd;

int findGCD_from_the_video_if_you_dont_care_about_performance(const int[] nums)
{
    // Two passes over the array: one for the minimum, one for the maximum.
    return gcd(nums.minElement, nums.maxElement);
}

int findGCD_twice_faster_for_large_arrays(const int[] nums)
{
    // A single pass: reduce computes (min, max) together, and .expand
    // turns the resulting tuple into gcd's two arguments.
    return gcd(nums.reduce!(min, max).expand);
}
```
Nov 14 2021
prev sibling parent reply forkit <forkit gmail.com> writes:
On Sunday, 14 November 2021 at 21:58:21 UTC, Ola Fosheim Grøstad 
wrote:
 The acid test of language complexity is to look at typical code 
 bases and ask yourself if it would be a good idea to tell 
 newbies to learn the language from studying those.
I'd argue that the acid test of language complexity is the extent to which you can make sense of the code you're looking at, without any prior exposure to the language it is written in.
Nov 14 2021
next sibling parent Ola Fosheim Grøstad <ola.fosheim.grostad gmal.com> writes:
On Monday, 15 November 2021 at 03:57:59 UTC, forkit wrote:
 I'd argue that the acid test of language complexity is the 
 extent to which you can make sense of the code you're looking 
 at, without any prior exposure to the language it is written 
 in.
That would favour languages that are similar to the most used ones.
Nov 14 2021
prev sibling parent reply Paul Backus <snarwin gmail.com> writes:
On Monday, 15 November 2021 at 03:57:59 UTC, forkit wrote:
 I'd argue that the acid test of language complexity is the 
 extent to which you can make sense of the code you're looking 
 at, without any prior exposure to the language it is written 
 in.
This seems more like a test of familiarity than complexity. A better metric for complexity would be something like "number of words in the language spec."
Nov 14 2021
next sibling parent reply Ola Fosheim Grøstad <ola.fosheim.grostad gmail.com> writes:
On Monday, 15 November 2021 at 04:06:31 UTC, Paul Backus wrote:
 This seems more like a test of familiarity than complexity. A 
 better metric for complexity would be something like "number of 
 words in the language spec."
No, difficult implementation != difficult to use
Nov 14 2021
parent reply Paul Backus <snarwin gmail.com> writes:
On Monday, 15 November 2021 at 04:10:01 UTC, Ola Fosheim Grøstad 
wrote:
 On Monday, 15 November 2021 at 04:06:31 UTC, Paul Backus wrote:
 This seems more like a test of familiarity than complexity. A 
 better metric for complexity would be something like "number 
 of words in the language spec."
No, difficult implementation != difficult to use
Spec != implementation
Nov 14 2021
parent Ola Fosheim Grøstad <ola.fosheim.grostad gmail.com> writes:
On Monday, 15 November 2021 at 04:37:48 UTC, Paul Backus wrote:
 On Monday, 15 November 2021 at 04:10:01 UTC, Ola Fosheim 
 Grøstad wrote:
 On Monday, 15 November 2021 at 04:06:31 UTC, Paul Backus wrote:
 This seems more like a test of familiarity than complexity. A 
 better metric for complexity would be something like "number 
 of words in the language spec."
No, difficult implementation != difficult to use
Spec != implementation
Actually, a formal spec is pretty close...
Nov 14 2021
prev sibling parent reply forkit <forkit gmail.com> writes:
On Monday, 15 November 2021 at 04:06:31 UTC, Paul Backus wrote:
 This seems more like a test of familiarity than complexity. A 
 better metric for complexity would be something like "number of 
 words in the language spec."
My assertion was based on there being no familiarity. Yes, familiarity will play a role if you're looking at code in a language that looks/works similar to one you're familiar with. But I mean complete novices. Never exposed to a programming language. Also, I'm not referring to 'language' complexity per se, but rather 'cognitive' complexity. Specifically, chunking: https://en.wikipedia.org/wiki/Chunking_(psychology) Some languages are well suited to chunking (either intentionally by design, or by accident). Chunking will impact on your capacity to learn and remember. Others seem more like an "undifferentiated mess of atomic information items". Ever wondered why C++ is so hard for a novice to learn and remember? Is it because you cannot fit those "undifferentiated mess of atomic information items" into working memory?
Nov 14 2021
parent Ola Fosheim Grøstad <ola.fosheim.grostad gmail.com> writes:
On Monday, 15 November 2021 at 06:18:17 UTC, forkit wrote:
 But I mean complete novices. Never exposed to a programming 
 language.
There are languages designed for novices, but they seem to be cumbersome for production use.
 Also, I'm not referring to 'language' complexity per se, but 
 rather 'cognitive' complexity.
Yes, this is an interesting topic. I think new (and old) programmers benefit from writing pseudo-code before they write in the implementation language. Basically, write the code in their own favourite shorthand English mixed with symbols of their own choice. We could probably learn a lot about preferences if we collected personal "pseudo-code" from a large number of programmers. I think there is quite a distance between the pseudo-code people choose to write and the implementation in a system level language. That in itself suggests to me that "something is missing" in terms of usability. There is clearly room for improvement.
 Chunking will impact on your capacity to learn and remember.

 Others seem more like an "undifferentiated mess of atomic 
 information items".
The visual image can often be noisy, and text editors provide limited tools for visually cleaning up and bringing emphasis to the important parts. Maybe auto-formatting also means that programmers lose a bit of creativity/interest in improving the visual presentation of code?
 Ever wondered why C++ is so hard for a novice to learn and 
 remember?

 Is it because you cannot fit those "undifferentiated mess of 
 atomic information items" into working memory?
Maybe so, and another factor is that they cannot filter out what is important and what isn't. It is like driving in a busy city. If you seldom do, then it is a scary and taxing experience; pedestrians basically jump out in front of the car... With lots of experience you filter out all the noise and laser-focus on the critical elements (like kids). If you cannot filter properly then you will "run out of space" in your short term memory. It probably takes a lot of exposure to get used to the extended use of C++ namespaces, which makes the code look rather cluttered. It might have something to do with visual symbols too. To me `namespace::function()` looks more like two items than `namespace'function()`, so that could definitely be a chunking issue. I favour the latter notation for that reason (I think it is used by Ada?). Unique usage of mnemonics can also help. One issue in D is reusing symbols and keywords for unrelated things. That obviously makes things harder, as you now have to associate multiple things with the same visual impression, and that has a cognitive differentiation cost. So simpler visuals do not have to be better.
Nov 15 2021
prev sibling next sibling parent anonymous <anonymous anonymous.com> writes:
On Tuesday, 2 November 2021 at 17:27:25 UTC, Dr Machine Code 
wrote:
 But I'd like to know you all opinions
In my opinion, the reason is that D never had a stable version and never will. (It's not enough to call something stable: when it changes every 2 or 3 months, it is not stable.)
Nov 16 2021
prev sibling next sibling parent reply Chris <wendlec tcd.ie> writes:
On Tuesday, 2 November 2021 at 17:27:25 UTC, Dr Machine Code 
wrote:
 [...]
Two things: 1. This question keeps popping up. There must be a reason. 2. On reddit, a poster wrote: "D was used as a POC to show some of the functionality proposed." I don't know to which extent this is true, but D is certainly more of a research than a real world language. What galls ex-users is that this is not communicated clearly to newcomers. Instead, you have to get deep into D (which, in fairness, has its merits, because you learn a lot) before you realize that your projects will forever be hampered by D's shortcomings. The same shortcomings are never really addressed because D is, at its core, a CS research language (which is slowly losing its edge) and should not be sold as production ready. I remember the "update lottery" whenever a new version was available: will it break my code or not? This is a situation you find yourself in with languages that are below 1.0, and then you know what you're in for. However, D is already 2.+. The only explanation is that in reality D is still at a stage that is below 1.0. I've been an early adopter with some PLs and one OS; it's always a bit of a gamble and it takes some effort. However, those PLs and the OS have matured and stabilized. D never ever matures or stabilizes; it just goes on and on and on, being between 0.1 and 0.9. This made it impossible for me to create real world applications. Apart from making my work unnecessarily difficult, how could I justify the use of an eternal 0.x language to superiors and users? break (D) { switch; }
Apr 29 2022
parent Walter Bright <newshound2 digitalmars.com> writes:
On 4/29/2022 3:40 AM, Chris wrote:
 [...] I remember the "update lottery" whenever a new version was 
 available: will it break my code or not? [...]
We leave deprecated features alive for several years, and there are quite a lot of long-term projects on Buildkite that are part of the test suite, so we know if something breaks.
Apr 29 2022
prev sibling next sibling parent reply zoujiaqing <zoujiaqing gmail.com> writes:
On Tuesday, 2 November 2021 at 17:27:25 UTC, Dr Machine Code 
wrote:
 [...]
Two aspects: First of all: D's biggest problem is the name! Of course, many people deny this problem. If you search for a single letter in a search engine, it's hard to get valid results. For example, compare the Google search results for "D" with those for "Python", "Ruby", and "Rust". This has the undesirable effect of making it difficult for new users to find learning materials. Second: the standard library is weak. The most basic HTTP modules are well supported in Python, Golang, and Rust, making it easy to build HTTP servers and to use HTTP clients to access HTTP server resources; even the most basic URL parsing is missing from the D standard library. I think the library should offer an experience comparable to Python's and Golang's. -- zoujiaqing
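For what it's worth, the closest thing Phobos ships today is the thin libcurl wrapper `std.net.curl`, which is neither a native HTTP stack nor a URL parser; a minimal sketch, assuming libcurl is available on the system:

```D
import std.net.curl : get;
import std.stdio : writeln;

void main()
{
    // get() fetches the response body as char[] via the system's libcurl.
    auto page = get("https://dlang.org");
    writeln("fetched ", page.length, " characters");
}
```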
May 16 2022
next sibling parent reply Vladimir Panteleev <thecybershadow.lists gmail.com> writes:
On Monday, 16 May 2022 at 10:35:14 UTC, zoujiaqing wrote:
 Two aspects:

 First of all:
 D's biggest problem is the name! Of course, many people deny 
 this problem. If you search for a single letter in a search 
 engine, it's hard to get valid results. For example, compare 
 the Google search results for "D" with those for "Python", 
 "Ruby", and "Rust". This has the undesirable effect of making 
 it difficult for new users to find learning materials.
because of its popularity and origins, this problem was rectified within a few years.
May 16 2022
parent reply "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Monday, 16 May 2022 at 10:35:14 UTC, zoujiaqing wrote:
 Two aspects:

 First of all:
 D's biggest problem is the name! [...]
[...] This is not really a big problem. Just search for `dlang` instead of `D` and you get good results. T -- Written on the window of a clothing store: No shirt, no shoes, no service.
May 16 2022
parent Walter Bright <newshound2 digitalmars.com> writes:
On 5/16/2022 1:29 PM, H. S. Teoh wrote:
 This is not really a big problem. Just search for `dlang` instead of `D`
 and you get good results.
Or 'D Programming'. When I search for C stuff, I use 'C Programming'. Both work well.
May 16 2022
prev sibling parent Jack <jckj33 gmail.com> writes:
On Monday, 16 May 2022 at 10:35:14 UTC, zoujiaqing wrote:
 On Tuesday, 2 November 2021 at 17:27:25 UTC, Dr Machine Code 
 wrote:
 [...]
Two aspects: [...]
i'm using "dlang", it works even in search engines such as duck.com. Same goes for "go": people should use "golang" instead to find something.
 [...]
yep, i agree.
 -- zoujiaqing
May 30 2022
prev sibling next sibling parent Antonio <antonio abrevia.net> writes:
On Tuesday, 2 November 2021 at 17:27:25 UTC, Dr Machine Code 
wrote:
 [...]
I decided not to question the whys... D simply follows its "winding" path (like Paul McCartney's Beatles song). I use it as a scripting language, and I simply wait for some productive "things" to be incorporated one day: named parameters, string interpolation, unlimited UFCS, null safety (or Optional/Some/None native support, or union types, or whatever D decides to do), better optional typing inference (at least as powerful as dart's or typescript's or ...), a nice D debugger inspector. I'm not one to be involved in D experts' brainy discussions. Languages are tools... just use the one that fits your needs. It can be D or not.
May 20 2022
prev sibling next sibling parent reply Ozan Süel <ozan.sueel gmail.com> writes:
It's the name of the language ;-)

Try to find "D". You will get a long list of results from 
everything and everyone.
Renaming it to something like DFutureC would help.

Regards, Ozan
Jun 13 2022
parent "H. S. Teoh" <hsteoh qfbox.info> writes:
On Mon, Jun 13, 2022 at 11:33:35AM +0000, Ozan Sel via Digitalmars-d wrote:
 It's the name of the language ;-)
 
 Try to find "D". You will get a long list of results from everything
 and everyone.
Just search for "dlang" instead. T -- I see that you JS got Bach.
Jun 13 2022
prev sibling next sibling parent reply monkyyy <crazymonkyyy gmail.com> writes:
On Tuesday, 2 November 2021 at 17:27:25 UTC, Dr Machine Code 
wrote:

The reason d is unpopular is that the forums are not active 
enough; has there even been a thread with 1k comments?
Jun 14 2022
parent reply forkit <forkit gmail.com> writes:
On Tuesday, 14 June 2022 at 21:02:32 UTC, monkyyy wrote:
 On Tuesday, 2 November 2021 at 17:27:25 UTC, Dr Machine Code 
 wrote:

 The reason d is unpopular is that the forums are not active 
 enough; has there even been a thread with 1k comments?
If I had not raised, and continually pursued, this idea, it would never have got to 1k ;-) D is too complex and has too many unfinished ideas in it to accept any new ideas, it seems. How sad. Core are too stuck in their ways, and seem focused on programming in the small. D also suffers (and will continually suffer) from bringing too much of C along with it. It'll constrain it... forever. Because of all this (and more), other languages have already bypassed D, and many will continue to do so. I personally cannot make the case for using D any longer.
Jun 14 2022
parent reply monkyyy <crazymonkyyy gmail.com> writes:
On Tuesday, 14 June 2022 at 21:41:50 UTC, forkit wrote:
 On Tuesday, 14 June 2022 at 21:02:32 UTC, monkyyy wrote:
 On Tuesday, 2 November 2021 at 17:27:25 UTC, Dr Machine Code 
 wrote:

 The reason d is unpopular is that the forums are not active 
 enough; has there even been a thread with 1k comments?
If I had not raised, and continually pursued, this idea, it would never have got to 1k ;-) D is too complex and has too many unfinished ideas in it to accept any new ideas, it seems. How sad. Core are too stuck in their ways, and seem focused on programming in the small. D also suffers (and will continually suffer) from bringing too much of C along with it. It'll constrain it... forever. Because of all this (and more), other languages have already bypassed D, and many will continue to do so. I personally cannot make the case for using D any longer.
D having a lot of C is part of why it's usable; like, it's probably the single most important feature of C++, and having C compatibility is the consistent target.
Jun 14 2022
parent forkit <forkit gmail.com> writes:
On Tuesday, 14 June 2022 at 21:52:39 UTC, monkyyy wrote:
 D having a lot of C is part of why it's usable; like, it's 
 probably the single most important feature of C++, and having 
 C compatibility is the consistent target.
I already have both C and C++, both of which are better at C and C++ than D ;-)
Jun 14 2022
prev sibling parent Mike Parker <aldacron gmail.com> writes:
This thread has gone off the rails more than once. At this point, 
I don't see that it's serving any purpose. I don't have a means 
to actually lock it other than to declare:

THIS THREAD IS CLOSED

Any further posts here will be deleted. If you'd like to raise a 
new point, or continue a discussion on a specific topic raised 
here, please do so in a new thread focused on that topic.

Thank you.
Jun 14 2022