
digitalmars.D - We are forking D

reply GrimMaple <grimmaple95 gmail.com> writes:
Hello everyone!

Growing greatly dissatisfied with how things are in the D 
Programming Language, we decided it is time to fork it.
We want to change the way decisions are made, giving both more 
freedom to change, and more speed to decision making. We want the 
process of contribution to be as open-ended as possible, and not 
having unnecessary blockage to contributions. We also want the 
language to allow for faster software development. The way this 
is going to be achieved is still not finalized, but that is the 
goal.
One of the ways to achieve our goal is to have core focuses of the 
language. Such focuses are:

* Embracing the GC and improving upon it, disregarding betterC 
and @nogc in the process
* Concentrating on the code being `@safe` by default
* Revising & rewriting the standard library, making `std.v2`
* Improving `druntime` and porting it to other platforms, like 
wasm
* Encouraging writing code in D, not sticking with C
* Improving the toolchain

The following stuff will be forked:
* dmd
* ldc
* phobos
* druntime

As hard as it is to say this, unfortunate code-breaking changes 
are going to be made. But only if they help achieve the goals 
listed above.

The forking process is still in progress, and there isn't much 
done per se. We are discussing the future of the fork and what we 
want from it; it might be a little crazy at first. But if you 
wish to help out, bring your changes in, or just look around, 
please join our Discord server to discuss: 
https://discord.gg/tfT9MjA69u . Temporary website: 
https://dpldocs.info/opend/contribute.html

Thank you, and good luck.
Jan 02
next sibling parent Daniel N <no public.email> writes:
On Tuesday, 2 January 2024 at 17:55:56 UTC, GrimMaple wrote:
 Hello everyone!


 Thank you, and good luck.
Good luck! Please use a different name for your module, not std.v2, as that should be reserved for the original dlang.
Jan 02
prev sibling next sibling parent ryuukk_ <ryuukk.dev gmail.com> writes:
Embracing GC only sets you up for failure

The best approach is to be pragmatic and embrace APIs built 
around an allocator, so you can have the default allocation 
strategy be a GC without hurting projects that require fine-tuning the 
allocation strategy (games, drivers, anything realtime)
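
To make that concrete, here's a rough sketch of what I mean by an allocator-driven API (`duplicate`, `GCAlloc`, and `CAlloc` are names I just made up): the default stays GC-backed, while realtime code supplies its own strategy:

```d
// Sketch only: GCAlloc, CAlloc and duplicate are invented for illustration.
import core.stdc.stdlib : malloc, free;

struct GCAlloc
{
    void[] allocate(size_t n) { return new ubyte[n]; } // GC-backed default
}

struct CAlloc
{
    // Manual allocation, realtime-friendly (error handling omitted).
    void[] allocate(size_t n) { return malloc(n)[0 .. n]; }
    void deallocate(void[] p) { free(p.ptr); }
}

// The API takes its allocator as a template parameter, defaulting to the GC one.
ubyte[] duplicate(Alloc = GCAlloc)(const(ubyte)[] src, Alloc alloc = Alloc.init)
{
    auto dst = cast(ubyte[]) alloc.allocate(src.length);
    dst[] = src[];
    return dst;
}

void main()
{
    auto a = duplicate([1, 2, 3]);    // callers who don't care just get the GC
    CAlloc c;
    auto b = duplicate([1, 2, 3], c); // realtime code opts out of the GC...
    c.deallocate(b);                  // ...and cleans up manually
}
```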

Type inference and tuples, that's very nice; overall the roadmap 
is interesting and promising. Add tagged unions and an 
allocator-driven API and I would root for you


I'd make `-i` the default as well


but.. who is 'we'?
Jan 02
prev sibling next sibling parent reply IGotD- <nise nise.com> writes:
On Tuesday, 2 January 2024 at 17:55:56 UTC, GrimMaple wrote:
 Hello everyone!

 Growing greatly dissatisfied with how things are in the D 
 Programming Language, we decided it is time to fork it......
 
 ...

 Thank you, and good luck.
That's good news, but who are "we" in this case? The name OpenD is ok; I was thinking of naming it "D professional", but that's not so important.

This is a decision that is long overdue and I am happy that it is finally happening. It is also very unfortunate, but there is simply no other way to make progress. The important thing is that you get productive people on board who want to contribute.

Also, is there a new forum that we can use in order to discuss things? I'm talking about a forum and not a chat group.
Jan 02
next sibling parent "H. S. Teoh" <hsteoh qfbox.info> writes:
On Tue, Jan 02, 2024 at 06:54:17PM +0000, IGotD- via Digitalmars-d wrote:
 On Tuesday, 2 January 2024 at 17:55:56 UTC, GrimMaple wrote:
 Hello everyone!
 
 Growing greatly dissatisfied with how things are in the D
 Programming Language, we decided it is time to fork it......
 
 ...
 
 Thank you, and good luck.
That's good news but who are "we" in this case?
From the looks of it, GrimMaple and Adam Ruppe.  Don't know who else.
Maybe Jonathan Marler? Some openness(!) on this front would be appreciated.
 The name OpenD is ok, I was thinking to naming it "D professional" but
 that's not so important.
I actually think Boulder (or boulDer) would be a much better name than OpenD. Adam would understand why. ;-)
 This is a decision that is long overdue and I am happy that it is
 finally happening. It is also very unfortunate but there is simply no
 other way in order to progress. The important thing is that you get
 productive people on board who wants to contribute.
If there's a way to get Kenji Hara back, then there'll be a fighting chance that this will actually succeed. Maybe also Bearophile, but it may be too late for that by now.
 Also is there a new forum that we can use in order to discuss things?
 I'm talking about a forum and not a chat group.
Yeah, that would be much preferred. I'm not interested in joining discord just for this. Same reason why I'll never sign up for FB. T -- Those who don't understand D are condemned to reinvent it, poorly. -- Daniel N
Jan 02
prev sibling next sibling parent matheus <matheus gmail.com> writes:
On Tuesday, 2 January 2024 at 18:54:17 UTC, IGotD- wrote:
 ...
 The name OpenD is ok, I was thinking to naming it "D 
 professional" but that's not so important.
 ...
D++? If I'm not wrong, there was a topic once about renaming D to something different; there were a bunch of nice choices there. I couldn't find that topic, but there was this one: https://forum.dlang.org/thread/hgyujiwbqffzjxudbofc forum.dlang.org In any case I wish you luck and I hope to see the SI (string interpolation) feature soon, Matheus.
Jan 02
prev sibling next sibling parent Profunctor <profunctor example.com> writes:
On Tuesday, 2 January 2024 at 18:54:17 UTC, IGotD- wrote:
 Also is there a new forum that we can use in order to discuss 
 things? I'm talking about a forum and not a chat group.
I too am interested in this.
Jan 02
prev sibling parent reply GrimMaple <grimmaple95 gmail.com> writes:
Thank you everyone for your genuine interest in this project!

To answer a few questions:

On Tuesday, 2 January 2024 at 18:54:17 UTC, IGotD- wrote:
 That's good news but who are "we" in this case?
We -- the community. The idea of a fork was in the air for a long time; it just so happened that Adam beat me to it by an inch. I joined soon after :) There are 2 dedicated members as of now: Adam and me. We're gonna directly ask some of the D people around later on.
 Also is there a new forum that we can use in order to discuss 
 things? I'm talking about a forum and not a chat group.
Setting up a forum is too much work™, so we're gonna have to stick with Discord as of now. Maybe a forum will arise later. As for the name, I don't think it's really important. At least as of now.
Jan 02
next sibling parent reply IGotD- <nise nise.com> writes:
On Tuesday, 2 January 2024 at 20:25:17 UTC, GrimMaple wrote:
 Setting up a forum is too much work™, so we're gonna have to 
 stick with Discord as of now. Maybe a forum will arise later.
Actually no, I was thinking of something until we have a more permanent forum. One way is to set up something at freeforums.net or a similar forum host.
Jan 02
parent reply GrimMaple <grimmaple95 gmail.com> writes:
On Tuesday, 2 January 2024 at 20:35:49 UTC, IGotD- wrote:
 On Tuesday, 2 January 2024 at 20:25:17 UTC, GrimMaple wrote:
 Setting up a forum is too much work™, so we're gonna have to 
 stick with Discord as of now. Maybe a forum will arise later.
Actually no. I was thinking until we have a more permanent forum. One way is to setup something at freeforums.net or similar forum host.
I've enabled discussions on GitHub, if that helps: https://github.com/orgs/opendlang/discussions They can be used as a forum.
Jan 02
next sibling parent "H. S. Teoh" <hsteoh qfbox.info> writes:
On Tue, Jan 02, 2024 at 08:40:45PM +0000, GrimMaple via Digitalmars-d wrote:
 On Tuesday, 2 January 2024 at 20:35:49 UTC, IGotD- wrote:
 On Tuesday, 2 January 2024 at 20:25:17 UTC, GrimMaple wrote:
 
 Setting up a forum is too much work™, so we're gonna have to stick
 with Discord as of now. Maybe a forum will arise later.
 
Actually no. I was thinking until we have a more permanent forum. One way is to setup something at freeforums.net or similar forum host.
Enabled discussions on github, if that helps: https://github.com/orgs/opendlang/discussions They can be used as a forum.
Cool, thanks! That would be a good place to start, even if temporary. T -- Famous last words: I *think* this will work...
Jan 02
prev sibling parent IGotD- <nise nise.com> writes:
On Tuesday, 2 January 2024 at 20:40:45 UTC, GrimMaple wrote:
 Enabled discussions on github, if that helps: 
 https://github.com/orgs/opendlang/discussions

 They can be used as a forum.
Thank you, that will certainly do to begin with.
Jan 02
prev sibling parent reply mw <mw g.c> writes:
On Tuesday, 2 January 2024 at 20:25:17 UTC, GrimMaple wrote:

 Also is there a new forum that we can use in order to discuss 
 things? I'm talking about a forum and not a chat group.
Setting up a forum is too much work™, so we're gonna have to stick with Discord as of now. Maybe a forum will arise later. As for the name, I don't think it's really important. At least as of now.
Maybe just create another group here "OpenD", so people can easily see what's going on in D (general) and OpenD.
Jan 02
parent reply Quirin Schroll <qs.il.paperinik gmail.com> writes:
On Tuesday, 2 January 2024 at 21:56:54 UTC, mw wrote:
 On Tuesday, 2 January 2024 at 20:25:17 UTC, GrimMaple wrote:

 Also is there a new forum that we can use in order to discuss 
 things? I'm talking about a forum and not a chat group.
Setting up a forum is too much work™, so we're gonna have to stick with Discord as of now. Maybe a forum will arise later. As for the name, I don't think it's really important. At least as of now.
Maybe just create another group here "OpenD", so people can easily see what's going on in D (general) and OpenD.
Why not name it I? Going by the history of D, it’s a successor of C, which is a successor of B. Now, the next letter would be E, but that is taken, as are F, G, and H, but I is the next one that’s not already taken.
Jan 09
next sibling parent reply Martyn <martyn.developer googlemail.com> writes:
On Tuesday, 9 January 2024 at 15:46:15 UTC, Quirin Schroll wrote:
 On Tuesday, 2 January 2024 at 21:56:54 UTC, mw wrote:
 On Tuesday, 2 January 2024 at 20:25:17 UTC, GrimMaple wrote:

 Also is there a new forum that we can use in order to 
 discuss things? I'm talking about a forum and not a chat 
 group.
Setting up a forum is too much work™, so we're gonna have to stick with Discord as of now. Maybe a forum will arise later. As for the name, I don't think it's really important. At least as of now.
Maybe just create another group here "OpenD", so people can easily see what's going on in D (general) and OpenD.
Why not name it I? Going by the history of D, it’s a successor of C, which is a successor of B. Now, the next letter would be E, but that is taken, as are F, G, and H, but I is the next one that’s not already taken.
:-) In all seriousness, if their fork does work out for them, I can see them renaming to something like **D**ivergent.
Jan 09
parent reply "H. S. Teoh" <hsteoh qfbox.info> writes:
On Tue, Jan 09, 2024 at 04:18:40PM +0000, Martyn via Digitalmars-d wrote:
 On Tuesday, 9 January 2024 at 15:46:15 UTC, Quirin Schroll wrote:
[...]
 Why not name it I? Going by the history of D, it’s a successor of C,
 which is a successor of B. Now, the next letter would be E, but that
 is taken, as are F, G, and H, but I is the next one that’s not
 already taken.
:-) In all seriousness, if their fork does work out for them, I can see them renaming to something like **D**ivergent.
[...] According to Adam, he wants to keep the name "OpenD": the idea is that it's D, only open to others' contributions without undue delay or onerous demands. My personal hope is that we'd call it Boulder or boulDer: the thing that gets D rolling again, that makes D rock (or rather, boulder) again. :-P T -- Change is inevitable, except from a vending machine.
Jan 09
next sibling parent bomat <Tempest_spam gmx.de> writes:
On Tuesday, 9 January 2024 at 16:52:09 UTC, H. S. Teoh wrote:
 According to Adam, he wants to keep the name "OpenD": the idea 
 is that it's D, only open to others' contributions without 
 undue delay or onerous demands.

 My personal hope is that we'd call it Boulder or boulDer: the 
 thing that gets D rolling again, that makes D rock (or rather, 
 boulder) again. :-P
Dork, the D-Fork.
Jan 09
prev sibling parent cc <cc nevernet.com> writes:
On Tuesday, 9 January 2024 at 16:52:09 UTC, H. S. Teoh wrote:
 My personal hope is that we'd call it Boulder or boulDer: the 
 thing that gets D rolling again, that makes D rock (or rather, 
 boulder) again. :-P
Call it *Dive*, cause sometimes just going for a *DIP* is not enough...
Jan 11
prev sibling next sibling parent Martyn <martyn.developer googlemail.com> writes:
On Tuesday, 9 January 2024 at 15:46:15 UTC, Quirin Schroll wrote:
 ..
 Why not name it I?
 ..
Come to think of it -- I am surprised Apple did not use I, as in ilang. Goes well with iphone, ipod, etc. I guess 'Swift' sounds cooler for their cool fanbase. :-)
Jan 09
prev sibling parent tony <tonytdominguez aol.com> writes:
On Tuesday, 9 January 2024 at 15:46:15 UTC, Quirin Schroll wrote:

 Why not name it I? Going by the history of D, it’s a successor 
 of C, which is a successor of B. Now, the next letter would be 
 E, but that is taken, as are F, G, and H, but I is the next one 
 that’s not already taken.
A language named "I" would be problematic from a search standpoint.
Jan 09
prev sibling next sibling parent monkyyy <crazymonkyyy gmail.com> writes:
On Tuesday, 2 January 2024 at 17:55:56 UTC, GrimMaple wrote:
 One of the ways to achieve our goal is to have core focuses of 
 the language. Such focuses are:

 * Embracing the GC and improving upon it, disregarding betterC
Will you be breaking betterc before wasm is merged?
 The following stuff will be forked:
 dmd
 ldc
why both?
 we
Got a list of names?
Jan 02
prev sibling next sibling parent reply victoroak <victoroak victor.oak> writes:
On Tuesday, 2 January 2024 at 17:55:56 UTC, GrimMaple wrote:
 Hello everyone!

 [...]

 Thank you, and good luck.
Good luck with this. I do not agree with every goal of the project but it's nice to see the community stepping up and fixing things. Maybe [issue 5710](https://issues.dlang.org/show_bug.cgi?id=5710) might be merged again or fixed in another way. It would be great to see async/await or some kind of stackless resumable function implemented too. I might enter the discord later to see more about the project.
Jan 02
parent d007 <d007 gmail.com> writes:
On Tuesday, 2 January 2024 at 19:52:30 UTC, victoroak wrote:
 Good luck with this. I do not agree with every goal of the 
 project but it's nice to see the community stepping up and 
 fixing things. Maybe [issue 
 5710](https://issues.dlang.org/show_bug.cgi?id=5710) might be 
 merged again or fixed in another way. It would be great to see 
 async/await or some kind of stackless resumable function 
 implemented too.

 I might enter the discord later to see more about the project.
Using LLVM IR would make stackless async/await easy to implement. betterC + stackless async/await, without GC, would be my dream language.
Jan 02
prev sibling next sibling parent reply Profunctor <profunctor example.com> writes:
On Tuesday, 2 January 2024 at 17:55:56 UTC, GrimMaple wrote:
 * Embracing the GC and improving upon it, disregarding betterC 
 and nogc in the process
This alone is worth it. I pray for your success in these endeavors.
Jan 02
next sibling parent user <user tmp.com> writes:
On Tuesday, 2 January 2024 at 20:13:59 UTC, Profunctor wrote:
 On Tuesday, 2 January 2024 at 17:55:56 UTC, GrimMaple wrote:
 * Embracing the GC and improving upon it, disregarding betterC 
 and nogc in the process
This alone is worth it. I pray for your success in these endeavors.
I got attracted to D1 for this exact reason. I also pray for your success. Maybe I will try contributing where possible.
Jan 02
prev sibling parent Antonio <antoniocabreraperez gmail.com> writes:
On Tuesday, 2 January 2024 at 20:13:59 UTC, Profunctor wrote:
 On Tuesday, 2 January 2024 at 17:55:56 UTC, GrimMaple wrote:
 * Embracing the GC and improving upon it, disregarding betterC 
 and nogc in the process
This alone is worth it. I pray for your success in these endeavors.
I think the same. This eternal discussion only serves to make many .NET/Java/... programmers feel insecure. If D opted for GC, it should have embraced that path from the beginning. If "many" C or C++ programmers were interested in D but did not want to use GC, they are the ones who should have created their own fork, not the other way around.

D had (and has) great qualities as an efficient "high level" language, and that is how I perceived it 20 years ago, but it ended up disappointing me. Each time I return to D I have to "remember" or "learn" it again, and there is no toolchain that helps me to "remember" naturally the way other languages do (i.e. the IntelliSense situation is a real wall that is hard to cross). And, of course, I always find something annoying (e.g. https://issues.dlang.org/show_bug.cgi?id=3543 ) that consumes my time until I find out it is a bug or an unexpected behavior (like the limitations with UFCS).

My conclusion was that D is managed mainly by C/C++ developers who don't really need to move from C++ to D. But that's my opinion.

Welcome to OpenD
Jan 12
prev sibling next sibling parent Konstantin <kostya.hm2 gmail.com> writes:
On Tuesday, 2 January 2024 at 17:55:56 UTC, GrimMaple wrote:
 * Embracing the GC and improving upon it, disregarding betterC 
 and @nogc in the process
 * Concentrating on the code being `@safe` by default
 * Revising & rewriting the standard library, making `std.v2`
 * Improving `druntime` and porting it to other platforms, like 
 wasm
 * Encouraging writing code in D, not sticking up with C
 * Improving toolchain
Good luck! On the contrary, I'm trying to find a basic, uncomplicated D to use as a better C++ (modules, mixins, UFCS and so on), with an experimental @nogc compiler stripped of the features that split the language into subsets (betterC, safe/unsafe and so on). The compiler is based on the old dmd-cxx branch.
Jan 02
prev sibling next sibling parent reply Luna <luna foxgirls.gay> writes:
On Tuesday, 2 January 2024 at 17:55:56 UTC, GrimMaple wrote:
 Hello everyone!

 [...]

 Thank you, and good luck.
For the love of everything good, if you want to fork D, please don't *name* it D; that could create another wave of issues where people think OpenD is *actual* D and get confused. It's fair that you're unhappy with the state of the D language, but I don't think creating confusion is a good solution.
Jan 02
next sibling parent reply "H. S. Teoh" <hsteoh qfbox.info> writes:
On Tue, Jan 02, 2024 at 08:54:49PM +0000, Luna via Digitalmars-d wrote:
[...]
 For the love of everything good, if you want to fork D, please don't
 *name* it D, that could create another wave of issues where people
 think OpenD is *actual* D and getting confused.
[...] I propose Boulder. Or boulDer. ;-) I'm more hoping that this drastic action would force things to change for the better, rather than that this would become an actual, hard fork. Boulder, the thing that forces things to get rolling again. :-D The thing that makes D r0x^WI mean, boulder again. :-P T -- "Holy war is an oxymoron." -- Lazarus Long
Jan 02
parent Daniel N <no public.email> writes:
On Tuesday, 2 January 2024 at 21:16:44 UTC, H. S. Teoh wrote:
 On Tue, Jan 02, 2024 at 08:54:49PM +0000, Luna via 
 Digitalmars-d wrote: [...]
 For the love of everything good, if you want to fork D, please 
 don't *name* it D, that could create another wave of issues 
 where people think OpenD is *actual* D and getting confused.
[...] I propose Boulder. Or boulDer. ;-) I'm more hoping that this drastic action would force things to change for the better, rather than that this would become an actual, hard fork. Boulder, the thing that forces things to get rolling again. :-D The thing that makes D r0x^WI mean, boulder again. :-P T
You only need one single feature to succeed: AST macros. They can greatly accelerate innovation, as not every contributor needs to be a compiler dev. They could also help in evaluating experimental GC features.
Jan 02
prev sibling next sibling parent reply user <user tmp.com> writes:
On Tuesday, 2 January 2024 at 20:54:49 UTC, Luna wrote:
 On Tuesday, 2 January 2024 at 17:55:56 UTC, GrimMaple wrote:
 [...]
For the love of everything good, if you want to fork D, please don't *name* it D, that could create another wave of issues where people think OpenD is *actual* D and getting confused. It's fair that you're unhappy with the state of the D language, but I don't think creating confusion is a good solution.
Yup, maybe Dazzle, Delight, Dauntless :-)
Jan 02
next sibling parent Andrey Zherikov <andrey.zherikov gmail.com> writes:
On Wednesday, 3 January 2024 at 03:26:59 UTC, user wrote:
 On Tuesday, 2 January 2024 at 20:54:49 UTC, Luna wrote:
 On Tuesday, 2 January 2024 at 17:55:56 UTC, GrimMaple wrote:
 [...]
For the love of everything good, if you want to fork D, please don't *name* it D, that could create another wave of issues where people think OpenD is *actual* D and getting confused. It's fair that you're unhappy with the state of the D language, but I don't think creating confusion is a good solution.
Yup, maybe Dazzle, Delight, Dauntless :-)
Dream :)
Jan 02
prev sibling parent Patrick Schluter <Patrick.Schluter bbox.fr> writes:
On Wednesday, 3 January 2024 at 03:26:59 UTC, user wrote:
 On Tuesday, 2 January 2024 at 20:54:49 UTC, Luna wrote:
 On Tuesday, 2 January 2024 at 17:55:56 UTC, GrimMaple wrote:
 [...]
For the love of everything good, if you want to fork D, please don't *name* it D, that could create another wave of issues where people think OpenD is *actual* D and getting confused. It's fair that you're unhappy with the state of the D language, but I don't think creating confusion is a good solution.
Yup, maybe Dazzle, Delight, Dauntless :-)
Dceive, Dception, Dcrepit, Dgenerate, Don't ;-)
Jan 03
prev sibling parent Iain Buclaw <ibuclaw gdcproject.org> writes:
On Tuesday, 2 January 2024 at 20:54:49 UTC, Luna wrote:
 
 For the love of everything good, if you want to fork D, please 
 don't *name* it D, that could create another wave of issues 
 where people think OpenD is *actual* D and getting confused.
It's also somebody else's brand now: https://www.opend.org I had a look into getting the old site back, but there's likely no chance of that anymore. https://web.archive.org/web/20180329114817/http://opend.org/
Jan 02
prev sibling next sibling parent Andrej Mitrovic <andrej.mitrovich gmail.com> writes:
On Tuesday, 2 January 2024 at 17:55:56 UTC, GrimMaple wrote:
 Hello everyone!
 Thank you, and good luck.
Here's your first bug report ;) https://dpldocs.info/opend/developer-setup.html throws an error with: ``` arsd.webtemplate.TemplateException /home/me/program/lib/arsd/webtemplate.d(67): Exception in template ./developer-setup.html: char 1026 (line 16): mismatched tag: </ol> != <li> (opened on line 13) ```
Jan 02
prev sibling next sibling parent Andrej Mitrovic <andrej.mitrovich gmail.com> writes:
On Tuesday, 2 January 2024 at 17:55:56 UTC, GrimMaple wrote:
 Hello everyone!
https://dpldocs.info/opend/roadmap.html On this page it's mentioned:
 D's current dictatorship model goes contrary to those lessons. 
 Whether benevolent or not, a dictator is still a single point 
 of failure and a bottleneck on the process. We need to correct 
 this to finally allow D to thrive.
I think it's easy to fall into the trap of replacing one "dictator" with another. For example, the roadmap on that page lists language features which will be added or removed. But, who decided on this? The community? Or just another person who declared themselves to be the new ruler? I'll be really curious to see how the new leadership handles themselves when tough calls have to be made. Wishing you all the best.
Jan 02
prev sibling next sibling parent Per =?UTF-8?B?Tm9yZGzDtnc=?= <per.nordlow gmail.com> writes:
On Tuesday, 2 January 2024 at 17:55:56 UTC, GrimMaple wrote:
 Hello everyone!
I'd look at the Neat programming language for inspiration. Good luck.
Jan 02
prev sibling next sibling parent reply Martyn <martyn.developer googlemail.com> writes:
On Tuesday, 2 January 2024 at 17:55:56 UTC, GrimMaple wrote:
 Hello everyone!

 [...]

 Thank you, and good luck.
I was expecting a post like this from one of (about) 5 regulars in this forum.

The only area I personally would disagree on is:
`Embracing the GC and improving upon it, disregarding betterC and nogc in the process`

I think GC should be optional or, at least, there should be some kind of allocator feature so we can have control if needed. D allows you to code whatever way you like, or a combination of styles... why not provide this power when it comes to memory?

(To those it concerns -- I am not interested in turning this into another side debate about GC or other areas, like the other post before Christmas. I am probably in the minority with this mindset. If I am, then good luck - but I don't think this will serve my purposes if you go this route)
Jan 03
next sibling parent reply JN <666total wp.pl> writes:
On Wednesday, 3 January 2024 at 09:28:15 UTC, Martyn wrote:
 The only area I personally would disagree on is:-
 `Embracing the GC and improving upon it, disregarding betterC 
 and nogc in the process`

 I think GC should be optional or, atleast, have some kind of 
 Allocator feature so we can have control if needed.

 D allows you to code whatever way you like, or a combination of 
 them... why not provide this power when it comes to memory?
Or perhaps that power comes with a maintenance cost. I don't have experience with language design, but I assume there are some features that are easier to implement, or possible at all, only if you assume a GC is present. But if you always have to assume the GC may not be present, you are limiting the development of the language. Not saying whether GC is the right way, or refcounting, or anything else, but it's certainly easier if you only have one real memory management method to deal with.
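
For instance (my own illustration, not an exhaustive list), several everyday D features quietly lean on the GC, which is why they get rejected under `@nogc`:

```d
// Illustration: each marked line compiles in ordinary D, but is rejected
// if the function is annotated @nogc, because it needs the GC.
void usesTheGC()
{
    int[] xs;
    xs ~= 42;                    // appending may (re)allocate via the GC
    int[string] counts;
    counts["hits"] = 1;          // associative arrays live on the GC heap
    int local = 10;
    auto dg = () => local + 1;   // capturing a local: GC-allocated closure
    assert(dg() == 11);
}

// @nogc void noGC()
// {
//     int[] xs;
//     xs ~= 42;  // Error: cannot use operator ~= in @nogc function
// }

void main() { usesTheGC(); }
```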
Jan 03
next sibling parent matheus <matheus gmail.com> writes:
On Wednesday, 3 January 2024 at 12:54:10 UTC, JN wrote:
 On Wednesday, 3 January 2024 at 09:28:15 UTC, Martyn wrote:
 The only area I personally would disagree on is:-
 `Embracing the GC and improving upon it, disregarding betterC 
 and nogc in the process`

 I think GC should be optional or, atleast, have some kind of 
 Allocator feature so we can have control if needed.

 D allows you to code whatever way you like, or a combination 
 of them... why not provide this power when it comes to memory?
Or perhaps that power comes with maintenance cost. I don't have experience with language design, but I assume there are some features that are easier to implement or possible at all only if you assume a GC is present. But if you always have to assume GC may not be present, you are limiting the development of the language. Not saying whether GC is the right way or refcounting or anything else, but it's certainly easier if you only have one real memory management method to deal with.
What caught my attention about D back in the day was a native/compiled language like C/C++ but with a GC built in. I think sometimes you need to have constraints and stick with them; there are a lot of languages out there, and it will be very hard to do everything everyone else is doing (at least at a high level). So I'd prefer a nice language with some constraints over a half-baked one. This is a "new language" (fork!) derived from another one; I think that if people don't want GC, they can just keep with the current one or go with another alternative. In my opinion the first 2 priorities, GC and safe by default, are a nice thing to start with, and maybe even string interpolation should be added, as it's already developed but in alpha/toy mode, so people could taste it more. Matheus.
Jan 03
prev sibling parent reply Martyn <martyn.developer googlemail.com> writes:
On Wednesday, 3 January 2024 at 12:54:10 UTC, JN wrote:
 On Wednesday, 3 January 2024 at 09:28:15 UTC, Martyn wrote:
 The only area I personally would disagree on is:-
 `Embracing the GC and improving upon it, disregarding betterC 
 and nogc in the process`

 I think GC should be optional or, atleast, have some kind of 
 Allocator feature so we can have control if needed.

 D allows you to code whatever way you like, or a combination 
 of them... why not provide this power when it comes to memory?
Or perhaps that power comes with maintenance cost. I don't have experience with language design, but I assume there are some features that are easier to implement or possible at all only if you assume a GC is present. But if you always have to assume GC may not be present, you are limiting the development of the language. Not saying whether GC is the right way or refcounting or anything else, but it's certainly easier if you only have one real memory management method to deal with.
Sure. It's down to those involved in the currently titled std.v2. A plan looks set with this fork, and it's going full steam ahead! If they are choosing to remove @nogc (going all in on GC), they are free to do so.

One of the reasons why I was interested in D going back a few years now was because:
- The GC is optional.
- Performance-wise, it is comparable to C/C++.
- OOP is optional as well. I can code functional, procedural, etc. I can write code to (really) be a `better C` or a `better C++`.

Now a fork is happening. Two things are going to happen:
1) Dlang will eventually die if the fork is successful, or
2) The fork fails.
(Yes, I do think it will be one or the other, but we shall see within the next 18 months.)

If the fork is really going ALL-IN on the GC, then it no longer serves my purpose. Again, I might be in the minority. I think there are at least 1 or 2 other members who share the same views as me, as evident in previous threads. However, if the majority joining the fork want this... then more power to them. I wish nothing but success to Dlang and the new fork. My point is, if the new fork is successful and they go all in on GC, then it is time for me to move on (if dlang fades away further as a result of the fork).
Jan 03
next sibling parent IGotD- <nise nise.com> writes:
On Wednesday, 3 January 2024 at 13:38:10 UTC, Martyn wrote:
 If the fork is really going ALL-IN on the GC, then it no longer 
 serves my purpose.
I think you need to first define what you mean by "no GC". I would presume that a D fork would work like D right now: you have a choice not to use the GC and instead use library containers similar to C++'s STL (which D already has in part), as sketched below.

What I would like is to remove the GC requirement from druntime so that you can use basic system functions without turning the GC on. However, making Phobos non-GC is a huge undertaking and I really wonder if it is possible or really worth it. The only realistic way I can see is to introduce managed pointers so that the GC type can be changed, to at least satisfy a few more people.

It remains to be seen which direction memory management will go. I personally believe that in the short term, improvements will be made to the tracing GC.
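
For example (a sketch of my own, not part of any proposal), `std.container.Array` already keeps its payload in malloc'd memory instead of the GC heap:

```d
// std.container.Array stores its elements in malloc'd memory and frees them
// deterministically, so the GC never scans or collects the payload.
import std.container.array : Array;

void main()
{
    Array!int a;
    foreach (i; 0 .. 5)
        a.insertBack(i);
    assert(a.length == 5 && a[2] == 2);
}   // the last reference going away releases the memory right here
```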
Jan 03
prev sibling next sibling parent GrimMaple <grimmaple95 gmail.com> writes:
I am going to answer a few questions that are on everybody's mind.

On Wednesday, 3 January 2024 at 13:38:10 UTC, Martyn wrote:
 - The GC is optional.
 - Performance-wise, it is comparable to C\C++.
 - OOP is optional as well. I can code functional, procedural, 
 etc.
This is still going to be the case. If you don't use the GC, you don't use it, just the way it is now. If there are no allocations being made with the GC, then the GC isn't going to be run. That being said, there is no real will to directly support GC-less code, so as I see it, you'd have to write your own nogc code if you want to.

On Wednesday, 3 January 2024 at 16:20:39 UTC, Hors wrote:
 If this fork also maintains interop with C, then we can still 
 have control via malloc() and free() from a C library.
Of course, the C interop is still gonna be in, it's not something we want to get rid of entirely. We just don't want to encourage using it. But if you want to - feel free to do so.
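
For illustration (a trivial sketch of my own, not an official example), this kind of code keeps working exactly as it does today, with no GC involvement at all:

```d
// Plain C allocation from D, entirely outside the GC; @nogc just makes the
// compiler confirm that nothing here can trigger a collection.
import core.stdc.stdlib : malloc, free;
import core.stdc.stdio : printf;

@nogc nothrow void main()
{
    auto p = cast(int*) malloc(100 * int.sizeof);
    if (p is null) return;   // malloc can fail
    scope (exit) free(p);    // deterministic cleanup, no collector involved
    p[0 .. 100] = 0;         // use the buffer like an ordinary slice
    p[0] = 42;
    printf("first element: %d\n", p[0]);
}
```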
Jan 03
prev sibling parent reply "H. S. Teoh" <hsteoh qfbox.info> writes:
On Wed, Jan 03, 2024 at 01:38:10PM +0000, Martyn via Digitalmars-d wrote:
[...]
 Now -- a fork is happening. 2 things are going to happen.
 1) Dlang will eventually die if the fork is successful, or
 2) The fork fails
[...] Actually, it's more nuanced than that. One of 5 things could happen: [*]

1) Successful forking: the original D language and the fork both survive and thrive as distinct languages;
2) The fork merges back upstream at some point, e.g., if the current dispute(s) is/are eventually resolved;
3) The fork fails: eventually the fork burns out and is effectively discontinued;
4) The fork takes over: the original D language stagnates and falls by the wayside, while the fork takes over and assumes primary development;
5) Both fail: the fork drains resources from the original project and fragments the community, and both eventually find themselves unsustainable.

[*] https://www.cs.cmu.edu/~ckaestne/pdf/icse20-forks.pdf

Which will be the actual outcome remains to be seen.

T -- All problems are easy in retrospect.
Jan 03
parent reply Don Allen <donaldcallen gmail.com> writes:
I view this development positively. The constant strife I've 
observed as a latecomer to D, but as someone who has done and 
managed software development for a very long time, was clearly 
not healthy or accomplishing anything other than wasting people's 
energy, because it wasn't converging. This divorce will hopefully 
allow the disagreements to be resolved on technical merits.

Having said that, I want to express my great respect for what 
Walter and the others responsible for D have accomplished. Given 
its conservative objectives (as opposed to something like Haskell 
or even Rust -- "here's a new way to think about computer 
programming"), I think D is very, very good and as a long-time 
language enthusiast, I'm familiar with the competition, such as 
Nim and Zig (which isn't really competition because it is still 
far from releasable quality, both the software and 
documentation). I do admit to having had little experience with 
C++, just enough to share Linus Torvalds' opinion.

That D hasn't taken over the world is beside the point; good 
things aren't always popular, e.g., Scheme, and sometimes bad 
things are very popular, e.g., Windows, JavaScript, C/C++. What 
percentage of the world's music-loving population listens to the 
music of Bach or Mozart?

My opinions, of course.
Jan 03
next sibling parent reply bomat <Tempest_spam gmx.de> writes:
On Thursday, 4 January 2024 at 03:12:48 UTC, Don Allen wrote:
 I view this development positively. The constant strife I've 
 observed as a latecomer to D, but as someone who has done and 
 managed software development for a very long time, was clearly 
 not healthy or accomplishing anything other than wasting 
 peoples' energy, because it wasn't converging. This divorce 
 will hopefully allow the disagreements to be resolved on 
 technical merits.
Also being a newcomer to the language, I quite agree. While it's sad to see a split in a language that is already niche as it is, I try to see the positive sides (as Abdulhaq wrote, forking is better than quitting), and I hope that everyone can gain new insights by having a direct comparison between different approaches.
 Having said that, I want to express my great respect for what 
 Walter and the others responsible for D have accomplished.
Seconded! At the same time however, I want to express respect towards Adam D Ruppe. I don't know him at all, but I have enjoyed his D Cookbook and, although discussions may have become a bit too heated, I respect people with so much passion for a project.
 Given its conservative objectives (as opposed to something like 
 Haskell or even Rust -- "here's a new way to think about 
 computer programming"), I think D is very, very good...
Again, agreed! When I see the syntax of a lot of self-proclaimed "C-Killers", I can't help but wonder if things really *have* to be that ugly...
 ... and as a long-time language enthusiast, I'm familiar with 
 the competition, such as Nim and Zig (which isn't really 
 competition because it is still far from releasable quality, 
 both the software and documentation).
And yet, I feel like Zig is already getting more attention than D. Just my impression, I'm not going to speculate why that is.
 That D hasn't taken over the world is beside the point; good 
 things aren't always popular, e.g., Scheme, and sometimes bad 
 things are very popular, e.g., Windows, JavaScript, C/C++.
Now this is the point where I have to totally disagree with you. It doesn't suffice for a system to be well designed and great to use "in theory"; there must also be tooling, documentation, thousands if not millions of samples, and an active community. Otherwise it will not feel safe to embrace it - certainly not for companies, but to a lesser extent for every single developer.

If you look at your list of examples again - regardless of whether you deem them "good" or "bad" - this is something that every single one of them has, and which their competitors *don't*. If you google for a problem/question you have with any of the mentioned things, you are *very likely* to find a viable solution.

In short, in order for something to be successful, it already has to be successful. This is a paradox that has been written about a lot, and by much smarter people than me, and it is mysterious to most why a few projects have managed to get over this hump while millions of others haven't.

This brings me back to the beginning of my post, where I lamented the split of an already niche language. Again, I hope that the motto "unity is strength" does not apply in this case and that everyone keeps open-minded enough to profit from each other.
Jan 07
next sibling parent reply Lance Bachmeier <no spam.net> writes:
On Sunday, 7 January 2024 at 11:54:16 UTC, bomat wrote:
 On Thursday, 4 January 2024 at 03:12:48 UTC, Don Allen wrote:
 I view this development positively. The constant strife I've 
 observed as a latecomer to D, but as someone who has done and 
 managed software development for a very long time, was clearly 
 not healthy or accomplishing anything other than wasting 
 peoples' energy, because it wasn't converging. This divorce 
 will hopefully allow the disagreements to be resolved on 
 technical merits.
Also being a newcomer to the language, I quite agree. While it's sad to see a split in a language that is already niche as it is, I try to see the positive sides (as Abdulhaq wrote, forking is better than quitting), and I hope that everyone can gain new insights by having a direct comparison between different approaches.
I'd much rather Adam put his time into a fork than take the more common approach, where he'd post here under various names, make ridiculous claims, and vandalize the discussions. If you're new, you may not have seen the many posts from someone who doesn't like private at the module level.

Whether there are useful insights from this or any other fork will depend on what they do with it. If there's too much incompatibility of code, due to breaking changes, it won't have much effect. There's already Nim, Rust, Go, Zig, etc., to compare with, and the forks will in each case be just another language.
 That D hasn't taken over the world is beside the point; good 
 things aren't always popular, e.g., Scheme, and sometimes bad 
 things are very popular, e.g., Windows, JavaScript, C/C++.
Now this is the point where I have to totally disagree with you. It doesn't suffice for a system to be well designed and great to use "in theory", there must also be tooling, documentation, thousands if not millions of samples, and an active community.
Not really. Those things come after the userbase gets large. I was using Python in the 1990s and I can assure you that the growth did not come because of good tooling, documentation, or code samples. Similarly, I was using R when you used Emacs or a plain text editor, the documentation was similar to Linux man pages, and you asked questions on a mailing list with Brian Ripley. Only after it took off did they build RStudio. What both languages had was a small, dedicated group of users that were willing and able to build useful things with the language.
Jan 07
next sibling parent reply "H. S. Teoh" <hsteoh qfbox.info> writes:
On Sun, Jan 07, 2024 at 04:43:17PM +0000, Lance Bachmeier via Digitalmars-d
wrote:
[...]
 I'd much rather Adam put his time into a fork, rather than the more
 common approach where he'd post here under various names, make
 ridiculous claims, and vandalize the discussions. If you're new, you
 may not have seen the many posts from someone that doesn't like
 private at the module level.
Are you sure you have your facts straight? AFAIK the pseudonymous trolls who posted about private or this or that complaint were not Adam, they were some other disgruntled former D user who has since left. AFAIK Adam has never engaged in such tactics and was in close contact with the core D team until the recent spat (which is the culmination of ongoing disagreements on governance that started more recently).
 Whether there are useful insights from this or any other fork will
 depend on what they do with it. If there's too much incompatibility of
 code, due to breaking changes, it won't have much effect. There's
 already Nim, Rust, Go, Zig, etc., to compare with and the forks will
 in each case be just another language.
Adam has stated (publicly, in the opendlang github discussions) that he does not plan to make major changes to the language that would massively break compatibility. But of course, if this fork continues independently then over time things will diverge sooner or later.
 That D hasn't taken over the world is beside the point; good
 things aren't always popular, e.g., Scheme, and sometimes bad
 things are very popular, e.g., Windows, JavaScript, C/C++.
Now this is the point where I have to totally disagree with you. It doesn't suffice for a system to be well designed and great to use "in theory", there must also be tooling, documentation, thousands if not millions of samples, and an active community.
Not really. Those things come after the userbase gets large. I was using Python in the 1990s and I can assure you that the growth did not come because of good tooling, documentation, or code samples. Similarly, I was using R when you used Emacs or a plain text editor, the documentation was similar to Linux man pages, and you asked questions on a mailing list with Brian Ripley. Only after it took off did they build RStudio. What both languages had was a small, dedicated group of users that were willing and able to build useful things with the language.
As somebody has said, it depends on your definition of "success". If your definition of success is popularity, then sure, you need a big community, lots of existing code, hype, etc.. By that measure C++ is more successful than D and I should be using C++ instead. But I came to D not because of popularity, but because of technical merit. I would rather stay with a small, relatively unknown community where technical excellence plays a deciding role, than in a large community of mediocrity where popularity is the deciding factor. So my definition of success is rather different from what some have been using when bemoaning the current state of D. T -- Why is it that all of the instruments seeking intelligent life in the universe are pointed away from Earth? -- Michael Beibl
Jan 07
next sibling parent Lance Bachmeier <no spam.net> writes:
On Sunday, 7 January 2024 at 21:16:46 UTC, H. S. Teoh wrote:
 On Sun, Jan 07, 2024 at 04:43:17PM +0000, Lance Bachmeier via 
 Digitalmars-d wrote: [...]
 I'd much rather Adam put his time into a fork, rather than the 
 more common approach where he'd post here under various names, 
 make ridiculous claims, and vandalize the discussions. If 
 you're new, you may not have seen the many posts from someone 
 that doesn't like private at the module level.
Are you sure you have your facts straight? AFAIK the pseudonymous trolls who posted about private or this or that complaint were not Adam, they were some other disgruntled former D user who has since left. AFAIK Adam has never engaged in such tactics and was in close contact with the core D team until the recent spat (which is the culmination of ongoing disagreements on governance that started more recently).
I'm saying it's good Adam isn't doing that, not that he is.
 Whether there are useful insights from this or any other fork 
 will depend on what they do with it. If there's too much 
 incompatibility of code, due to breaking changes, it won't 
 have much effect. There's already Nim, Rust, Go, Zig, etc., to 
 compare with and the forks will in each case be just another 
 language.
Adam has stated (publicly, in the opendlang github discussions) that he does not plan to make major changes to the language that would massively break compatibility. But of course, if this fork continues independently then over time things will diverge sooner or later.
I'm not sure what he has planned. I was just making a statement about forks in general - the more incompatibilities of a fork from the original, the less effect it has upstream. I think it's too early to speculate what will come of Adam's fork.
Jan 07
prev sibling parent reply Dukc <ajieskola gmail.com> writes:
On Sunday, 7 January 2024 at 21:16:46 UTC, H. S. Teoh wrote:
 As somebody has said, it depends on your definition of 
 "success".  If your definition of success is popularity, then 
 sure, you need a big community, lots of existing code, hype, 
 etc..  By that measure C++ is more successful than D and I 
 should be using C++ instead.  But I came to D not because of 
 popularity, but because of technical merit.  I would rather 
 stay with a small, relatively unknown community where technical 
 excellence plays a deciding role, than in a large community of 
 mediocrity where popularity is the deciding factor.  So my 
 definition of success is rather different from what some have 
 been using when bemoaning the current state of D.
I think a good definition of success is `popularity * (yourLanguage.technicalMerit - replacedLanguage.technicalMerit)`. No matter how popular a language is, if it isn't better than what it replaces it can't be considered a success. If it is outright worse than the old ones, it's actually a bad thing for it to gain popularity.

Of course, there are many definitions of technical merit. Maybe your language in itself serves its task only as well as the one it replaces, but if it is better at teaching good mental skills than its replacement, it can still be considered a success in another sense. Also, great technical merit is always positive as long as you have at least some users, but the success tends to be minimal compared to a language which is only slightly better than older ones but massively more popular.

Another caveat - popularity also has many forms. Even if you have few or no direct users, if your work serves to improve other languages that take their ideas from you, IMO you get part of the credit if those languages become popular.
Jan 08
parent "H. S. Teoh" <hsteoh qfbox.info> writes:
On Mon, Jan 08, 2024 at 05:27:09PM +0000, Dukc via Digitalmars-d wrote:
 On Sunday, 7 January 2024 at 21:16:46 UTC, H. S. Teoh wrote:
 As somebody has said, it depends on your definition of "success".
[...]
 I think a good definition of success is `popularity *
 (yourLanguage.technicalMerit - replacedLanguage.technicalMerit)`. No
 matter how popular a language is, if it isn't better than what it
 replaces it can't be considered a success. If it is outright worse
 than the old ones, it's actually a bad thing for it to gain
 popularity.
[...] The thing is, arbitrary definitions like this miss the point that success in its literal sense means you achieved the goal(s) you set out to accomplish. As such, what constitutes success depends on what your goals are in the first place. If the goals are unclear or unknown, then success is strictly speaking undefined. And since the definition of success depends on its goals, if two languages have two different goals then you can't really compare their respective "successes", because that's just comparing apples and oranges. Furthermore, what the *user* considers as success may not be the same as what the author considers as success -- because their respective goals differ. If the language author's goal is a language that can express the kind of complex tasks he wishes to program, then he might consider it a success once the language has reached that level of expressivity. But if a user's goal is to use a popular language, then he may not agree on this "success". tl;dr: "success" is often a squirrely word, used more for self-praise or criticism rather than an objective measure of a language. T -- You are only young once, but you can stay immature indefinitely. -- azephrahel
Jan 08
prev sibling next sibling parent reply GrimMaple <grimmaple95 gmail.com> writes:
On Sunday, 7 January 2024 at 16:43:17 UTC, Lance Bachmeier wrote:
 Whether there are useful insights from this or any other fork 
 will depend on what they do with it. If there's too much 
 incompatibility of code, due to breaking changes, it won't have 
 much effect. There's already Nim, Rust, Go, Zig, etc., to 
 compare with and the forks will in each case be just another 
 language.
Realistically speaking, the competition in "system-level programming language" is really tough, and it's unreasonable to compete there. But there simply isn't a language that's as easy to come to from C++; I picked up D thinking it would be a good match, and it really is. At times, it even outperforms C++ in some aspects! And there simply isn't a GC-based language that can be as quick as C++; get that going and it would be unlike anything else on the market.
Jan 07
parent reply Don Allen <donaldcallen gmail.com> writes:
On Sunday, 7 January 2024 at 21:58:50 UTC, GrimMaple wrote:

 There simply isn't a GC based language that can be as quick as 
 C++
That's perhaps true of languages whose only memory-management method is GC-based. D isn't such a language. You can frequently avoid use of the GC in D because you have multiple memory-management techniques at your disposal.

I'm going to say something now that should be known by people who write code professionally, but is too often overlooked (I base this on many years of experience needing to convince people not to optimize prematurely and without having measured why their code doesn't perform as they would like). I'm not suggesting that you are guilty of this. I simply want it said explicitly.

The speed of the code generated by the language you are using is not necessarily the determining factor of the speed of your application. If your application is not cpu-limited and written in Python, re-writing it in C++ in pursuit of better performance is going to be a disappointment. If your application *is* cpu-limited but spends most of its time in some library, e.g., sqlite, or in system calls, re-writing it in C++ will only help in proportion to the time spent in your own code, which is small by assumption.

My point is that the performance vs. ease-of-coding tradeoff is not simple. It takes some smart engineering for a project to be near the sweet spot.
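
To put rough numbers on that (my own back-of-the-envelope example, Amdahl's law with invented figures):

```d
// Amdahl's law: if only a fraction p of the runtime is in code you can make
// s times faster, the overall speedup is 1 / ((1 - p) + p / s).
import std.stdio : writefln;

double overallSpeedup(double p, double s)
{
    return 1.0 / ((1.0 - p) + p / s);
}

void main()
{
    // 90% of the time inside sqlite/system calls, 10% in your own code:
    writefln("%.2fx", overallSpeedup(0.10, 10.0)); // ~1.10x, even with a 10x rewrite
    // 90% of the time in your own code:
    writefln("%.2fx", overallSpeedup(0.90, 10.0)); // ~5.26x -- now it pays off
}
```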
Jan 07
next sibling parent reply "H. S. Teoh" <hsteoh qfbox.info> writes:
On Mon, Jan 08, 2024 at 02:58:58AM +0000, Don Allen via Digitalmars-d wrote:
[...]
 The speed of the code generated by the language you are using is not
 necessarily the determining factor of the speed of your application.
 If your application is not cpu-limited and written in Python,
 re-writing it in C++ in pursuit of better performance is going to be a
 disappointment. If your application *is* cpu-limited but spends most
 of its time in some library, e.g.,sqlite, or in system-calls,
 re-writing it in C++ will only help in proportion to the time spent in
 your own code, which is small by assumption.
I learned this the hard way over the years. I used to be a self-proclaimed C/C++ expert, took pride in knowing obscure corners of C that very few others knew (I managed to answer correctly an obscure interview question that even the interviewer didn't know the answer to). I thought I knew it all about optimization, and that I could optimize just by looking at the code. Boy was I wrong.

This one time I was having performance issues with my code, and I spent hours poring over every single line and optimizing it to next year and back. The overall performance improvement was a disappointment. Finally, I conceded to using a profiler (before that I felt profilers were for wimps who didn't know how to program properly). The profiler instantly found the problem: a stray debug printf() that I forgot to remove after fixing a bug, that just happened to be sitting in the hot path.

That was just one of many, many such instances of hard reality smacking me upside the head. After many years of such experiences, I began to realize just how wrong I am 90% of the time when it comes to performance. The bottleneck is usually far away from where I assumed it'd be. After learning this the hard way time and again, I learned one thing: Until you run a profiler on your program, you are most probably wrong about where the bottleneck is.

The corollary goes: If you're optimizing a piece of code before running a profiler on it, it's premature optimization and you're probably wasting your time.

The general theorem reads: Discussions about performance without hard numbers produced by actual profiler output are usually wrong. Until you profile, it's all just guesswork and unfounded assumptions.

The practical takeaway is, whenever performance is an issue: Profile, profile, profile. If you're not profiling and you talk about performance, I probably won't bother reading any further. Or perhaps I would, but more for the entertainment value than any true technical merit.
 My point is that the performance vs. ease-of-coding tradeoff is not
 simple.  It takes some smart engineering for a project to be near the
 sweet spot.
[...] Yep, optimization is an extremely complex problem that's very sensitive to fine details, and often, the exact use case or even exact data you're operating on. Sweeping general statements about performance are rarely accurate, and not worth relying on unless there's hard data to back it up. And that's profiler data, not some made-up numbers made from some preconceived notion about optimization that's not based in reality. T -- Those who don't understand D are condemned to reinvent it, poorly. -- Daniel N
Jan 07
next sibling parent reply Siarhei Siamashka <siarhei.siamashka gmail.com> writes:
On Monday, 8 January 2024 at 04:00:52 UTC, H. S. Teoh wrote:
 On Mon, Jan 08, 2024 at 02:58:58AM +0000, Don Allen via 
 Digitalmars-d wrote: [...]
[...]
 If you're not profiling and you talk about performance, I 
 probably won't bother reading any further.  Or perhaps I would, 
 but more for the entertainment value than any true technical 
 merit.
Now how do you interpret the following Don Allen's sentence: "If your application *is* cpu-limited but spends most of its time in some library, e.g.,sqlite, or in system-calls, re-writing it in C++ will only help in proportion to the time spent in your own code, which is small by assumption." ? The information about the time spent in sqlite or in system-calls is trivially obtainable via running a profiler and pretty much everyone knows this. What kind of *alternative method* of obtaining this information do you imagine Don Allen might have used? There's no need to write angry walls of text about the most elementary things.
Jan 07
parent Jordan Wilson <wilsonjord gmail.com> writes:
On Monday, 8 January 2024 at 07:55:55 UTC, Siarhei Siamashka 
wrote:
 On Monday, 8 January 2024 at 04:00:52 UTC, H. S. Teoh wrote:
 [...]
[...]
 [...]
Now how do you interpret the following Don Allen's sentence: "If your application *is* cpu-limited but spends most of its time in some library, e.g.,sqlite, or in system-calls, re-writing it in C++ will only help in proportion to the time spent in your own code, which is small by assumption." ? The information about the time spent in sqlite or in system-calls is trivially obtainable via running a profiler and pretty much everyone knows this. What kind of *alternative method* of obtaining this information do you imagine Don Allen might have used? There's no need to write angry walls of text about the most elementary things.
I think you've misunderstood. H.S Teoh was agreeing with Don Allen. And in the part you quoted, he was not meaning Don specifically, but a general "you".
Jan 08
prev sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 1/7/2024 8:00 PM, H. S. Teoh wrote:
 [...]
Over the years, again and again, I've seen published benchmarks (including ones posted in this forum) that purport to benchmark a section of code, yet never realize that there was a printf in it and they were just benchmarking printf.

Borland made a very smart move back in the day with TurboC. The Borland C compiler generated rather poor code. So Borland hired a guy who coded their printf implementation in highly optimized assembler. With a fast printf, people tended to not notice the generated C code was slow.
Jan 08
parent reply "H. S. Teoh" <hsteoh qfbox.info> writes:
On Mon, Jan 08, 2024 at 03:37:22PM -0800, Walter Bright via Digitalmars-d wrote:
 On 1/7/2024 8:00 PM, H. S. Teoh wrote:
 [...]
Over the years, again and again, I've seen published benchmarks (including ones posted in this forum) that purport to benchmark a section of code, yet never realizing that there was a printf and they were just benchmarking printf.
On the other extreme, there were also benchmarks that were actually measuring background noise instead of the function it's purportedly benchmarking, because the optimizer has elided the entire function call after realizing that the return value is never used. LDC has a tendency to do this. :-P It also has the tendency of executing simple functions at compile-time and replacing the function call with an instruction that loads the answer.
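A common guard against this (a minimal sketch; the function and numbers here are made up) is to make sure the measured result is actually consumed, so the optimizer cannot throw the work away:

```d
// Consume the benchmarked result so dead-code elimination can't remove it.
import std.datetime.stopwatch : benchmark;
import std.stdio;

long work(long x) { return x * x + 1; }

void main()
{
    long sink;                          // the consumed result
    auto r = benchmark!({
        foreach (i; 0 .. 1_000_000)
            sink += work(i);
    })(10);
    writeln(r[0], " for 10 runs, sink = ", sink);
}
```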
 Borland made a very smart move back in the day with TurboC. The
 Borland C compiler generated rather poor code. So Borland hired a guy
 who coded their printf implementation in highly optimized assembler.
 With a fast printf, people tended to not notice the generated C code
 was slow.
:-D T -- It won't be covered in the book. The source code has to be useful for something, after all. -- Larry Wall
Jan 08
parent Walter Bright <newshound2 digitalmars.com> writes:
On 1/8/2024 4:51 PM, H. S. Teoh wrote:
 On the other extreme, there were also benchmarks that were actually
 measuring background noise instead of the function it's purportedly
 benchmarking, because the optimizer has elided the entire function call
 after realizing that the return value is never used.
I'm painfully aware of this. My compiler (Datalight C) was the first PC compiler to sport a data flow analysis optimizer. It determined that the benchmark code didn't do anything useful, and so deleted it. The reviewer never contacted me about this, he just wrote the benchmark roundup article saying that the Datalight compiler was buggy because it didn't execute the benchmark code. About a year later, other compilers had implemented DFA, and of course this time the reporter got it right. I'm still mad about that :-/
Jan 08
prev sibling parent Walter Bright <newshound2 digitalmars.com> writes:
On 1/7/2024 6:58 PM, Don Allen wrote:
 The speed of the code generated by the language you are using is not
necessarily 
 the determining factor of the speed of your application.
Interestingly, C string processing is pretty slow, in spite of it being a "to the metal" language.

The reason is simple. C strings are 0-terminated. This means that whenever you want to refer to a slice of a string that does not include the end, it is necessary to allocate memory, copy the slice to it, add a terminating 0, and then some time later free the memory. This is quite wasteful and inefficient.

Another egregious source of inefficiency in C is the never-ending re-scanning of strings to find the length of them. This is CPU-intensive, and very cache-unfriendly.

This is where D's slices shine. Yes, you can emulate D slices in C, but essentially every C API uses 0 terminated strings so you're borked anyway.
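For illustration, a minimal sketch of the slicing behaviour described above (the example string is made up):

```d
// A D slice is a (pointer, length) pair into existing memory:
// no allocation, no copy, no terminating 0, no re-scanning for length.
import std.stdio;

void main()
{
    string s = "hello, world";
    string word = s[7 .. 12];        // refers to the same memory as s
    writeln(word);                   // world
    writeln(word.length);            // length is stored, never re-scanned
    assert(word.ptr == s.ptr + 7);   // no copy was made
}
```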
Jan 08
prev sibling parent reply Siarhei Siamashka <siarhei.siamashka gmail.com> writes:
On Sunday, 7 January 2024 at 16:43:17 UTC, Lance Bachmeier wrote:
 I'd much rather Adam put his time into a fork, rather than the 
 more common approach where he'd post here under various names, 
 make ridiculous claims, and vandalize the discussions. If 
 you're new, you may not have seen the many posts from someone 
 that doesn't like private at the module level.
Do you think that there can't possibly be more than one person in the whole world who doesn't like D's private keyword? And that this single person persistently registering under different names is the only possible explanation? Really? I did mention the private keyword in this forum before. I guess now I'm starting to understand why some topics provoke an unusually hostile reaction around here.
Jan 08
next sibling parent Alexandru Ermicioi <alexandru.ermicioi gmail.com> writes:
On Monday, 8 January 2024 at 08:25:08 UTC, Siarhei Siamashka 
wrote:
 Do you think that there can't possibly be more than one person 
 in the whole world, who doesn't like the D's private keyword? 
 And that this single person persistently registering under 
 different names is the only possible explanation? Really?
There are. But there was a set of users with an unusually similar style of writing and set of arguments, which was easy to spot.
Jan 08
prev sibling parent Lance Bachmeier <no spam.net> writes:
On Monday, 8 January 2024 at 08:25:08 UTC, Siarhei Siamashka 
wrote:
 On Sunday, 7 January 2024 at 16:43:17 UTC, Lance Bachmeier 
 wrote:
 I'd much rather Adam put his time into a fork, rather than the 
 more common approach where he'd post here under various names, 
 make ridiculous claims, and vandalize the discussions. If 
 you're new, you may not have seen the many posts from someone 
 that doesn't like private at the module level.
Do you think that there can't possibly be more than one person in the whole world, who doesn't like the D's private keyword? And that this single person persistently registering under different names is the only possible explanation? Really? I did mention the private keyword in this forum before. I guess, now I'm starting to understand the reasons why some topics provoke unusually hostile reaction around here.
Here's one example, posted under the name "UmmReally", even admitting that it's off-topic:
 In my version of D (a fork based on someone elses work), I am 
 not 'forced' to use that 'workaround'.

 Even javascript has private to the class.

 D is comlete joke! i.e. ..>  that you cannot even make a 
 private member within a class (except through some stupid 
 'workaround').

 So with that...back on to topic.. YES 'offical' D really IS 
 that bad!

 (but not my version of D ;-)

 btw. This is not really a complaint ;-)
 
It's great that I can create my own fork based on someone else 
work (to do what I can do in anyother langauge, including 
javascript!).
There's nothing useful about a post like that. But I'm not just talking about this topic. We used to have trolls waiting for any positive post so that they could add a bunch of fabricated/misleading statements to the thread. They wanted to make it so Google wouldn't return anything positive about D (and they mostly succeeded).
Jan 08
prev sibling next sibling parent reply monkyyy <crazymonkyyy gmail.com> writes:
On Sunday, 7 January 2024 at 11:54:16 UTC, bomat wrote:
 And yet, I feel like Zig is already getting more attention than 
 D. Just my impression, I'm not going to speculate why that is.
I'll do it for you: d is a c++ killer, not a c killer. zig will probably win the c killer race because it has a ton of different platforms and is working on a new libc, meaning zig and basically no one else can make breaking changes to fundamental things. And d is a template hell with the most features and fastest compile time, but doesn't actually ship a completed data structures and algorithms project, and honestly copies too many of stl's mistakes.
Jan 07
next sibling parent reply Lance Bachmeier <no spam.net> writes:
On Sunday, 7 January 2024 at 17:47:37 UTC, monkyyy wrote:

 zig will probably win the c killer race
The race is over for a language like Zig that requires you to learn a new syntax and do things differently. Rust has the users that don't want GC, and Go has those that do. D has the advantage that you can write C code as you've always done, mixing in conveniences as you go. Predictions are always tough when they involve the future, but it's hard to see Zig gaining much traction when Rust and Go are established in that area. (I haven't looked at Zig recently. Maybe things have changed. I couldn't find a reason to use it when I investigated earlier.)
Jan 07
next sibling parent Don Allen <donaldcallen gmail.com> writes:
On Sunday, 7 January 2024 at 21:12:11 UTC, Lance Bachmeier wrote:
 On Sunday, 7 January 2024 at 17:47:37 UTC, monkyyy wrote:

 zig will probably win the c killer race
The race is over for a language like Zig that requires you to learn a new syntax and do things differently. Rust has the users that don't want GC, and Go has those that do. D has the advantage that you can write C code as you've always done, mixing in conveniences as you go. Predictions are always tough when they involve the future, but it's hard to see Zig gaining much traction when Rust and Go are established in that area. (I haven't looked at Zig recently. Maybe things have changed. I couldn't find a reason to use it when I investigated earlier.)
The first derivative of the issues count is still positive and you see it in practice. Every time I check in on the Zig project and try to use it for something real, I run into a show-stopper. The documentation is simply awful -- not only sparse, but frequently inaccurate. It reminds me of BSD4.3 40 years ago -- motto: "it was hard to build, it should be hard to use". Running it on an overloaded Vax 780 made the experience just that much more wonderful.

Having said that, I think there is some good work being done by the Zig people. But my guess is that they have a long way to go. I'll be surprised if they make Andrew's 2025 estimate for release.
Jan 07
prev sibling parent monkyyy <crazymonkyyy gmail.com> writes:
On Sunday, 7 January 2024 at 21:12:11 UTC, Lance Bachmeier wrote:
 On Sunday, 7 January 2024 at 17:47:37 UTC, monkyyy wrote:

 zig will probably win the c killer race
The race is over for a language like Zig that requires you to learn a new syntax and do things differently. Rust has the users that don't want GC, and Go has those that do. D has the advantage that you can write C code as you've always done, mixing in conveniences as you go. Predictions are always tough when they involve the future, but it's hard to see Zig gaining much traction when Rust and Go are established in that area. (I haven't looked at Zig recently. Maybe things have changed. I couldn't find a reason to use it when I investigated earlier.)
Rust is a c++ competitor; like, you can't be that ugly and complex without going for lots of abstractions (also, it will fail). Go is too narrow and owned by google, and etc etc etc.
Jan 07
prev sibling parent bomat <Tempest_spam gmx.de> writes:
On Sunday, 7 January 2024 at 17:47:37 UTC, monkyyy wrote:
 I'll do it for you, d is a c++ killer not a c killer, ...
Sorry, but that is just untrue; with the BetterC subset, D specifically targets C programmers as well.
 zig will probably win the c killer race ...
Just to throw in my 2 cents on this... when it comes to potential C killers, my money would be on Jai if it ever came out of closed beta. But that being said, I don't think C will ever truly be killed, at least not in my life time.
Jan 07
prev sibling parent reply Don Allen <donaldcallen gmail.com> writes:
On Sunday, 7 January 2024 at 11:54:16 UTC, bomat wrote:
 On Thursday, 4 January 2024 at 03:12:48 UTC, Don Allen wrote:
[snip]
 That D hasn't taken over the world is beside the point; good 
 things aren't always popular, e.g., Scheme, and sometimes bad 
 things are very popular, e.g., Windows, JavaScript, C/C++.
Now this is the point where I have to totally disagree with you. It doesn't suffice for a system to be well designed and great to use "in theory", there must also be tooling, documentation, thousands if not millions of samples, and an active community.
I said nothing about good tooling, documentation, etc. Of *course* those are necessary for something to be "good", which I'd point out, was *my* definition of "good", as "My opinions, of course" was intended to convey.
 Otherwise it will not feel safe to embrace it - certainly not 
 for companies, but to a lesser extent for every single 
 developer.
 If you look at your list of examples again - regardless if you 
 deem them "good" or "bad" - this is something that every single 
 one of them has, and which their competitors *don't*.
 If you google for a problem/question you have with any of the 
 mentioned things, you are *very likely* to find a viable 
 solution.

 In short, in order for something to be successful, it already 
 has to be successful. This is a paradox that has been written 
 about a lot and by much smarter people than me, and it is 
 mysterious to most why some few projects have achieved to get 
 over this hump while millions of others haven't.
It all depends on how you define "successful". If you define success as having a huge user community, then your paragraph above applies. I would argue that there are alternative ways to measure success. Is Scheme a success? OpenBSD? JS Bach? I say emphatically "yes" (especially regarding Bach) and yet all of them have user communities orders of magnitude smaller than their most popular competitors.
 This brings me back to the beginning of my post where I 
 lamented the split of an already niche language. Again, I hope 
 that the motto "unity is strength" does not apply in this case 
 and that everyone keeps open minded enough to profit from each 
 other.
Jan 07
next sibling parent Walter Bright <newshound2 digitalmars.com> writes:
D has been used by several companies to make money with, that they've said would have been much more difficult to do with other languages.

That's one great definition of success!
Jan 07
prev sibling parent reply bomat <Tempest_spam gmx.de> writes:
On Sunday, 7 January 2024 at 17:53:27 UTC, Don Allen wrote:
 It all depends on how you define "successful". If you define 
 success as having a huge user community, then your paragraph 
 above applies. I would argue that there are alternative ways to 
 measure success. Is Scheme a success? OpenBSD? JS Bach? I say 
 emphatically "yes" (especially regarding Bach) and yet all of 
 them have user communities orders of magnitude smaller than 
 their most popular competitors.
In this context, my definition of "successful" is a language that you can put your trust in. Like in this thread, just as an example: https://forum.dlang.org/thread/qwrguthjwvawnjtpzcbf forum.dlang.org This is about the question if using D for a major commercial project is a solid business decision. I am not so sure, to be honest. And please be assured that I'm not trying to trash-talk here, I just call it as I see it, as a newcomer to the language. D has some awesome aspects to it, that's why I started looking into it for personal projects. I appreciate that there are some good books, I got the ones by Ali Cehreli, Mike Parker, and Adam Ruppe, plus the one by Kai Nacke on vibe.d. So there's good learning material. But on the other hand, a lot of people seem to have walked away from the language over the years, like Andrei Alexandrescu. Fine, people walk away from projects all the time, doesn't have to mean anything. But then, looking at the forums, there seems to be a lot of dissent and even a recent fork. When I tried to get help with my vibe.d project, I didn't get much feedback in this forum and the "official" vibe.d forum at https://forum.rejectedsoftware.com/ seems abandoned entirely. What's that all about? What happened to Sönke Ludwig, did he walk away, too? What does that mean for the future of vibe.d? Has any large project ever been built with vibe.d? Apparently not: https://forum.dlang.org/thread/mjewutfvitpnlovbwofu forum.dlang.org I'm using VS Code as my IDE, mostly because I can use it under Windows and Linux alike. I quite like its good D support provided by the code-d plugin. Problem is, though, that it hasn't been updated for two years: https://marketplace.visualstudio.com/items?itemName=webfreak.code-d I asked about that here (https://forum.dlang.org/thread/gwulbuatekebrtrdmyhi forum.dlang.org) and got no answer. What's up with WebFreak001, is he gone, too? I just don't know. I will continue with my D/vibe.d project because it is just a fun experiment that I do in my free time. If I get stuck at some point, fine, it still will have been a great learning experience. But would I stick with D if my business depended on it? Hell no. I'd probably use Go or maybe even C++ with wt or something like that. Something that I can be reasonably sure will still be maintained in 5 years from now. Maybe I will be able to complete my project successfully. But as to the D language as a whole, it does not feel like a success story to me. Please feel free to correct me on any and all points that I got wrong (I'm sure there are many) - again, I have been describing my totally subjective impressions, with no intention to offend anyone.
Jan 07
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 1/7/2024 3:27 PM, bomat wrote:
 I will continue with my D/vibe.d project because it is just a fun experiment that I do in my free time. If I get stuck at some point, fine, it still will have been a great learning experience.
 But would I stick with D if my business depended on it? Hell no. I'd probably use Go or maybe even C++ with wt or something like that. Something that I can be reasonably sure will still be maintained in 5 years from now.
D has been continuously improved and supported for over 20 years now.
Jan 07
parent bomat <Tempest_spam gmx.de> writes:
On Monday, 8 January 2024 at 00:37:52 UTC, Walter Bright wrote:
 On 1/7/2024 3:27 PM, bomat wrote:
 I will continue with my D/vibe.d project because it is just a 
 fun experiment that I do in my free time. If I get stuck at 
 some point, fine, it still will have been a great learning 
 experience.
 But would I stick with D if my business depended on it? Hell 
 no. I'd probably use Go or maybe even C++ with wt or something 
 like that. Something that I can be reasonably sure will still 
 be maintained in 5 years from now.
D has been continuously improved and supported for over 20 years now.
Yes. And now it's running out of steam. At least that's what I'm afraid of, considering the loss of important infrastructure and library maintainers I mentioned. I hope I'm wrong, of course.
Jan 08
prev sibling parent Max Samukha <maxsamukha gmail.com> writes:
On Thursday, 4 January 2024 at 03:12:48 UTC, Don Allen wrote:
 as opposed to something like Haskell or even Rust -- "here's a 
 new way to think about computer programming")
Haskell is not a new way to think about programming. Its history can be traced back at least to https://en.wikipedia.org/wiki/ISWIM, which dates from 1966.
Jan 07
prev sibling parent reply Hors <q q.com> writes:
On Wednesday, 3 January 2024 at 09:28:15 UTC, Martyn wrote:
 The only area I personally would disagree on is:-
 `Embracing the GC and improving upon it, disregarding betterC 
 and nogc in the process`

 I think GC should be optional or, atleast, have some kind of 
 Allocator feature so we can have control if needed.

 D allows you to code whatever way you like, or a combination of 
 them... why not provide this power when it comes to memory?
 [...]
If this fork also maintains interop with C, then we can still have control via malloc() and free() from a C library.
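This already works in today's D; for illustration, a minimal sketch (example values made up) of manual allocation through the C runtime, bypassing the GC:

```d
// Manual allocation via the C runtime (core.stdc), with no GC involvement.
import core.stdc.stdlib : malloc, free;
import std.stdio;

void main()
{
    auto p = cast(int*) malloc(10 * int.sizeof);
    assert(p !is null);
    scope (exit) free(p);            // deterministic release
    foreach (i; 0 .. 10)
        p[i] = i * i;
    writeln(p[0 .. 10]);             // a C allocation can be sliced like a D array
}
```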
Jan 03
parent "H. S. Teoh" <hsteoh qfbox.info> writes:
On Wed, Jan 03, 2024 at 04:20:39PM +0000, Hors via Digitalmars-d wrote:
 On Wednesday, 3 January 2024 at 09:28:15 UTC, Martyn wrote:
 The only area I personally would disagree on is:-
 `Embracing the GC and improving upon it, disregarding betterC and
 nogc in the process`
 
 I think GC should be optional or, atleast, have some kind of
 Allocator feature so we can have control if needed.
 
 D allows you to code whatever way you like, or a combination of
 them...  why not provide this power when it comes to memory?
 [...]
If this fork also maintains interop with C, then we can still have control via malloc() and free() from a C library.
D has always had interop with C on this level at least. I don't see the fork moving away from this in the foreseeable future, as this is pretty deeply ingrained in the way codegen is done in D. T -- It's amazing how careful choice of punctuation can leave you hanging:
Jan 03
prev sibling next sibling parent Paulo Pinto <pjmlp progtools.org> writes:
On Tuesday, 2 January 2024 at 17:55:56 UTC, GrimMaple wrote:
 Hello everyone!

 Growing greatly dissatisfied with how things are in the D 
 Programming Language, we decided it is time to fork it.
 We want to change the way decisions are made, giving both more 
 freedom to change, and more speed to decision making. We want 
 the process of contribution to be as open-ended as possible, 
 and not having unnecessary blockage to contributions. We also 
 want the language to allow for faster software development. The 
 way this is going to be achieved is still not finalized, but 
 that is the goal.
 One of the ways to achive our goal is to have core focuses of 
 the language. Such focuses are:

 [...]
All the best on your efforts; other languages have since filled, for me, the role I had wished D would fill back when Andrei's book got published.
Jan 03
prev sibling next sibling parent reply i_meva <i_meva outlook.com> writes:
On Tuesday, 2 January 2024 at 17:55:56 UTC, GrimMaple wrote:
 One of the ways to achive our goal is to have core focuses of 
 the language. Such focuses are:

 [...]
If the code we write in the current D language will work smoothly with this fork, I would like to contribute to this project. A D language that does not have any breakage is very important for stability.
Jan 04
next sibling parent Hors <q q.com> writes:
On Thursday, 4 January 2024 at 10:09:27 UTC, i_meva wrote:
 On Tuesday, 2 January 2024 at 17:55:56 UTC, GrimMaple wrote:
 One of the ways to achive our goal is to have core focuses of 
 the language. Such focuses are:

 [...]
If the codes we write in the current D language will work smoothly with this fork, I would like to contribute to this project. A D language that does not have any breakdowns is very important for stability.
D code using nogc and betterC may not work.
Jan 04
prev sibling parent Martyn <martyn.developer googlemail.com> writes:
On Thursday, 4 January 2024 at 10:09:27 UTC, i_meva wrote:
 On Tuesday, 2 January 2024 at 17:55:56 UTC, GrimMaple wrote:
 One of the ways to achive our goal is to have core focuses of 
 the language. Such focuses are:

 [...]
If the codes we write in the current D language will work smoothly with this fork, I would like to contribute to this project. A D language that does not have any breakdowns is very important for stability.
It depends, I guess, on how you wrote your program/lib. While things can change, this is their roadmap, currently titled: "Stuff i might do in my D fork"

This is their "No" category:

*Focus on betterC or nogc. I might or might not deliberately break them, but I certainly won't expend effort maintaining or promoting them*

So if you write anything for betterC or nogc, you might not be able to migrate to "OpenD".

As I say - it's early days yet, and this roadmap could change.
Jan 04
prev sibling next sibling parent reply Abdulhaq <alynch4048 gmail.com> writes:
On Tuesday, 2 January 2024 at 17:55:56 UTC, GrimMaple wrote:
 Hello everyone!

 Growing greatly dissatisfied with how things are in the D 
 Programming Language, we decided it is time to fork it.
 We want to change the way decisions are made, giving both more 
 freedom to change, and more speed to decision making. We want 
 the process of contribution to be as open-ended as possible, 
 and not having unnecessary blockage to contributions. We also 
 want the language to allow for faster software development. The 
 way this is going to be achieved is still not finalized, but 
 that is the goal.
I think this is a good way for Adam and co. to blow off some steam and push forward with the changes they would like to see. I wish them all well on that. They should achieve their goal of evolving their preferred flavour of D faster than it would otherwise happen.

Other potential goals such as becoming a popular and well used dialect of D will be much harder to achieve. A lot of people currently put in a lot of effort on the project admin side, such as web sites, source code management, funding, organising conferences, dealing with commercial customers etc. A handful of gearheads will not want to be spending their weekends and evenings doing that. This time next year Adam will have a new understanding of why things are as they are.

If this "fork" could be under the umbrella of the main D project, as an experimental D, then I think it has more chance of influencing D and getting the changes into the mainline.

Right now in Discord there is talk of changes to ranges, dub, iterators etc. Significant changes in those areas would take months/years and likely isolate "OpenD" and break many of the libraries in existence. It would make the chances of adoption very small and the community would likely remain small and cut off from the main stream.

Just my two cents.
Jan 04
parent IGotD- <nise nise.com> writes:
On Thursday, 4 January 2024 at 12:51:31 UTC, Abdulhaq wrote:
 If this "fork" could be under the umbrella of the main D 
 project, as an experimental D, then I think it has more chance 
 of influencing D and getting the changes into the mainline.
This is very much likely to happen. Walter is a notorious naysayer (a bit too much in my opinion) and rejects ideas that aren't his own or whose benefits he doesn't fully understand at the moment. If he gets to play around with new features we will probably discover "this is great" and merge them back into D. Also there will be a lot of "OpenD has this feature now and it works great" from other people in the forum. There will be a lot of pressure towards D from others in that regard. That is what I hope at least, that the new D fork will be the tip of the spear.
Jan 04
prev sibling next sibling parent Etienne <etcimon gmail.com> writes:
On Tuesday, 2 January 2024 at 17:55:56 UTC, GrimMaple wrote:
 The forking process is still in progress, and there isn't much 
 done per se. We are discussing the future of the fork and what 
 we want from it, it might be a little crazy at first. But if 
 you wish to help out, bring your changes in, or just look 
 around, please join our Discord server to discuss: 
 https://discord.gg/tfT9MjA69u . Temporary website: 
 https://dpldocs.info/opend/contribute.html

 Thank you, and good luck.
The GC would be more efficient if it was instantiated as thread-local for `new` allocations and globally shared for `new shared` allocations.
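For illustration, a minimal sketch of the two allocation forms such a split would distinguish (the per-thread/shared heaps themselves are hypothetical; the code only shows the `new` vs `new shared` distinction that exists in D today):

```d
// The two allocation forms a thread-local/shared GC split would key off.
class Node { int value; }

void main()
{
    auto local  = new Node;          // unshared: the type system treats it as thread-local
    auto global = new shared Node;   // explicitly shared across threads
    local.value = 1;
    // global.value = 2;             // mutating shared data needs synchronization/atomics
}
```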
Jan 04
prev sibling next sibling parent reply Dibyendu Majumdar <d.majumdar gmail.com> writes:
On Tuesday, 2 January 2024 at 17:55:56 UTC, GrimMaple wrote:
 Hello everyone!

 Growing greatly dissatisfied with how things are in the D 
 Programming Language, we decided it is time to fork it.
The chances of a fork being successful when the main repo is alive and rapidly changing is close to zero, I am afraid.
Jan 05
parent reply matheus <matheus gmail.com> writes:
On Friday, 5 January 2024 at 17:02:12 UTC, Dibyendu Majumdar 
wrote:
 On Tuesday, 2 January 2024 at 17:55:56 UTC, GrimMaple wrote:
 Hello everyone!

 Growing greatly dissatisfied with how things are in the D 
 Programming Language, we decided it is time to fork it.
The chances of a fork being successful when the main repo is alive and rapidly changing is close to zero, I am afraid.
You have a point, but I also wonder what will happen if things like DIP1015, DIP1028 and DIP1036 (which is currently being discussed in "opend"), or other things which usually take years, get some traction. And what about other fellows coming back (maybe?), like Jonathan Marler and others, which was already discussed before.

Matheus.
Jan 05
parent reply "H. S. Teoh" <hsteoh qfbox.info> writes:
On Fri, Jan 05, 2024 at 06:55:15PM +0000, matheus via Digitalmars-d wrote:
 On Friday, 5 January 2024 at 17:02:12 UTC, Dibyendu Majumdar wrote:
 On Tuesday, 2 January 2024 at 17:55:56 UTC, GrimMaple wrote:
 Hello everyone!
 
 Growing greatly dissatisfied with how things are in the D
 Programming Language, we decided it is time to fork it.
 
The chances of a fork being successful when the main repo is alive and rapidly changing is close to zero, I am afraid.
You have a point, but also I wonder what will happen if things like: DIP1015, DIP1028 and DIP1036 which is currently being discussed in "opend", or other things which usually take years get some traction.
DIP1036 is already merged and working. E.g., the following works today:

----------snip----------
import std;

void main()
{
    int i = 10;
    string abc = "hooray";
    float j = 1.5;
    writeln(i"i=$(i) abc=$(abc) j=$(j)");
}
----------snip----------

Output:

----------snip----------
i=10 abc=hooray j=1.5
----------snip----------
 And about other fellows coming back (Maybe?) like Jonathan Marler and
 others which was already discussed before.
[...] There's been a lot more than just 2 people actively involved in discussions on the github discussions page, probably even more on discord. It's anybody's guess what will happen in the future, but at present it seems like this is bigger than it might first appear. T -- Perhaps the most widespread illusion is that if we were in power we would behave very differently from those who now hold it---when, in truth, in order to get power we would have to become very much like them. -- Unknown
Jan 05
parent reply matheus <matheus gmail.com> writes:
On Friday, 5 January 2024 at 19:21:36 UTC, H. S. Teoh wrote:
 ...
 DIP1036 is already merged and working. E.g., the following 
 works today:
 ...
Already!? - I need to check this, but I couldn't find any nightly build (the current Readme points to the original Dlang), so I think I will need to build it myself?

Matheus.
Jan 06
next sibling parent reply "H. S. Teoh" <hsteoh qfbox.info> writes:
On Sat, Jan 06, 2024 at 01:53:18PM +0000, matheus via Digitalmars-d wrote:
 On Friday, 5 January 2024 at 19:21:36 UTC, H. S. Teoh wrote:
 ...
 DIP1036 is already merged and working. E.g., the following works today:
 ...
Already!? - I need to check this but I couldn't find any nightly build (The currently Readme points to original Dlang), so I think I will need to build myself?
[...] Thanks to the recent unified Makefile (which happened before the fork), you can just git clone dmd and phobos (under the same parent directory) and run Make in each, and you should get a working compiler. T -- Change is inevitable, except from a vending machine.
Jan 06
parent Mengu <mengukagan gmail.com> writes:
On Saturday, 6 January 2024 at 14:44:07 UTC, H. S. Teoh wrote:
 On Sat, Jan 06, 2024 at 01:53:18PM +0000, matheus via 
 Digitalmars-d wrote:
 On Friday, 5 January 2024 at 19:21:36 UTC, H. S. Teoh wrote:
 ...
 DIP1036 is already merged and working. E.g., the following 
 works today:
 ...
Already!? - I need to check this but I couldn't find any nightly build (The currently Readme points to original Dlang), so I think I will need to build myself?
[...] Thanks to the recent unified Makefile (which happened before the fork), you can just git clone dmd and phobos (under the same parent directory) and run Make in each, and you should get a working compiler. T
except for apple silicon :-)
Jan 06
prev sibling next sibling parent IGotD- <nise nise.com> writes:
On Saturday, 6 January 2024 at 13:53:18 UTC, matheus wrote:
 Already!? - I need to check this but I couldn't find any 
 nightly build (The currently Readme points to original Dlang), 
 so I think I will need to build myself?
Yes, the fork was announced just a few days ago and the CI system has not yet been built up.
Jan 06
prev sibling parent reply "H. S. Teoh" <hsteoh qfbox.info> writes:
On Sat, Jan 06, 2024 at 01:53:18PM +0000, matheus via Digitalmars-d wrote:
 On Friday, 5 January 2024 at 19:21:36 UTC, H. S. Teoh wrote:
 ...
 DIP1036 is already merged and working. E.g., the following works today:
 ...
Already!? - I need to check this but I couldn't find any nightly build (The currently Readme points to original Dlang), so I think I will need to build myself?
[...]

Thought I'd showcase what the string interpolation tuple DIP can do, which currently compiles with the forked compiler and behaves as advertised:

````
// Taken from Adam's interpolation-examples repo
// (https://github.com/adamdruppe/interpolation-examples)
module demo.sql;

import arsd.sqlite;
import lib.sql;

import std.stdio;

void main()
{
    auto db = new Sqlite(":memory:");
    db.query("CREATE TABLE sample (id INTEGER, name TEXT)");

    // you might think this is sql injection... but it isn't! the lib
    // uses the rich metadata provided by the interpolated sequence to
    // use prepared statements appropriate for the db engine under the hood
    int id = 1;
    string name = "' DROP TABLE', '";
    db.execi(i"INSERT INTO sample VALUES ($(id), $(name))");

    foreach(row; db.query("SELECT * from sample"))
        writeln(row[0], ": ", row[1]);
}
````

This compiles with the forked compiler today, and correctly handles the dangerous string `name` that imitates a SQL injection attack. Thanks to the compile-time information provided by the interpolation tuple, the backend code is able to understand that the contents of `name` is string data to be bound to a SQL placeholder rather than copied verbatim into a SQL query string (which would have allowed the SQL injection attack to work), so it is able to safely store the data as a harmless string inside the database.

Output (note that the string "' DROP TABLE', '" is stored as a string, and the attempted SQL injection did not work):

----------------
1: ' DROP TABLE', '
----------------

Noteworthy is the fact that the competing string interpolation proposals are *not* immune to this sort of SQL injection attack, because premature conversion of the i"" literal to string *would* result in a successful injection.

My personal hope is that the string interpolation tuple DIP would go through in the official version of D and we would all benefit from it, rather than have the current stalemate continue for another who knows how many years.


T

--
Unix was not designed to stop people from doing stupid things, because that would also stop them from doing clever things. -- Doug Gwyn
Jan 06
next sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 1/6/2024 9:35 PM, H. S. Teoh wrote:
 Noteworthy is the fact that the competing string interpolation proposals
 are *not* immune to this sort of SQL injection attack, because premature
 conversion of the i"" literal to string *would* result in a successful
 injection.
The same technique of having a template take the generated tuple and modify it as it sees fit works with DIP1027, too. I posted an example here in the last debate about this.

The tuple generated from the istring is passed to a template that accepts tuples. The format string is the first element in the tuple, and it is a string literal. The template reads the format string character by character, generating a new format string as it goes. It examines the format specifications, and the type of the corresponding tuple argument, and adjusts the output to the new format string as required. The template then returns the new tuple, which consists of the new format string followed by the arguments.

It's true that in order for this to work,

```
db.execi(i"INSERT INTO sample VALUES ($(id), $(name))");
```

would need to be written as:

```
db.execi(xxx!(i"INSERT INTO sample VALUES ($(id), $(name))"));
```

where `xxx` is the thoroughly unimaginative name of the transformer template. Is adding the template call an undue burden?

Follow the call to db.execi() with:

```
std.stdio.writeln(i"id = ($(id), name = $(name)");
```

I haven't tried this myself, as I don't have sql on my machine, but I expect the output to stdout would not be what one would expect, i.e. the imported Interpolation functions will produce what is right for sql, not writeln, which would be in error. Since you do have this setup, please give this line a try and let us know what it prints.
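For illustration only, here is a minimal hypothetical sketch of the format-rewriting step such a transformer could perform, assuming a DIP1027-style tuple whose first element is a format string using %s for each argument (the names `toSqlFormat` and `toSql` are made up, not the actual template from that earlier debate):

```d
// Hypothetical sketch of the rewriting step described above: walk the
// format string at compile time and turn each %s into a SQL '?' placeholder.
import std.meta : AliasSeq;

// CTFE-able helper.
string toSqlFormat(string s)
{
    string r;
    for (size_t i = 0; i < s.length; ++i)
    {
        if (i + 1 < s.length && s[i] == '%' && s[i + 1] == 's')
        {
            r ~= '?';
            ++i;        // skip the 's'
        }
        else
            r ~= s[i];
    }
    return r;
}

// The "xxx"-style wrapper: yields (rewritten format, original arguments...).
template toSql(string fmt, args...)
{
    alias toSql = AliasSeq!(toSqlFormat(fmt), args);
}

void main()
{
    int id = 1;
    string name = "' DROP TABLE', '";
    alias call = toSql!("INSERT INTO sample VALUES (%s, %s)", id, name);
    static assert(call[0] == "INSERT INTO sample VALUES (?, ?)");
    assert(call[1] == 1 && call[2] == name);
}
```

The real transformer would of course also have to consult the argument types while rewriting; this only shows the character-by-character pass over the format string.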
Jan 07
next sibling parent reply Paolo Invernizzi <paolo.invernizzi gmail.com> writes:
On Sunday, 7 January 2024 at 09:04:16 UTC, Walter Bright wrote:
 On 1/6/2024 9:35 PM, H. S. Teoh wrote:
 Noteworthy is the fact that the competing string interpolation 
 proposals
 are *not* immune to this sort of SQL injection attack, because 
 premature
 conversion of the i"" literal to string *would* result in a 
 successful
 injection.
The same technique of having a template take the generated tuple and modifying it as it sees fit works with DIP1027, too. I posted an example here in the last debate about this. The tuple generated from the istring is passed to a template that accepts tuples. The format string is the first element in the tuples, and it is a string literal. The template reads the format string character by character, generating a new format string as it goes. It examines the format specifications, and the type of the corresponding tuple argument, and adjusts the output to the new format string as required. The template then returns the new tuple which consists of the new format string followed by the arguments. It's true that in order for this to work, ``` db.execi(i"INSERT INTO sample VALUES ($(id), $(name))"); ``` would need to be written as: ``` db.execi(xxx!(i"INSERT INTO sample VALUES ($(id), $(name))")); ``` where `xxx` is the thoroughly unimaginative name of the transformer template. Is adding the template call an undue burden? Follow the call to deb.execi() with: ``` std.stdio.writeln(i"id = ($(id), name = $(name)"); ``` I haven't tried this myself, as I don't have sql on my machine, but I expect the output to stdout would not be what one would expect. I.e. The imported Interpolation functions will produce what is right for sql, not writeln, which would be in error. Since you do have this setup, please give this line a try and let us know what it prints.
In our codebase, we are binding of local variables to sql parameters with something like: mixin( BindParameters!actualQuery ); Also that way of doing it is not 'an undue burden'. The whole point is, well, let's move on and simplify it! DIP1036 could allow us to do that, with better library code and more encapsulation, your proposal simply can't do that without what you call 'more burden'. Long story short: in our codebase, we will stick with mixins with your proposal merged, on the contrary, we will use DIP1036 functionalities if merged. /P
Jan 07
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 1/7/2024 1:22 AM, Paolo Invernizzi wrote:
 The whole point is, well, let's move on and simplify it!
Of course!
 DIP1036 could allow us to do that, with better library code and more 
 encapsulation, your proposal simply can't do that without what you call 'more 
 burden'.
I asked the question if adding a template call constituted an "undue" burden. What it does do is send a clear visual signal that the default behavior of istring is being adjusted.
 Long story short: in our codebase, we will stick with mixins with your
proposal 
 merged, on the contrary, we will use DIP1036 functionalities if merged.
Using DIP1036 string interpolation for sql calls means:

1. If something other than core.interpolation is imported at the global level, it will apply to all users of istring in that module, not just the ones calling sql functions. It means calling writeln with istrings won't do what is expected, and this is not visually apparent by looking at the code, nor is it checkable by the compiler.

2. An alternative would be locally importing core.interpolation or arsd.sqlite as required for each scoped use of istrings.
Jan 07
next sibling parent reply brianush1 <brianush1 outlook.com> writes:
On Sunday, 7 January 2024 at 18:51:40 UTC, Walter Bright wrote:
 On 1/7/2024 1:22 AM, Paolo Invernizzi wrote:
 The whole point is, well, let's move on and simplify it!
Of course!
 DIP1036 could allow us to do that, with better library code 
 and more encapsulation, your proposal simply can't do that 
 without what you call 'more burden'.
I asked the question if adding a template call constituted an "undue" burden. What it does do is send a clear visual signal that the default behavior of istring is being adjusted.
 Long story short: in our codebase, we will stick with mixins 
 with your proposal merged, on the contrary, we will use 
 DIP1036 functionalities if merged.
Using DIP1036 for string interpolation means that for using it for sql calls means: 1. if something other than core.interpolation is imported at the global level, it will apply to all users of istring in that module, not just the ones calling sql functions. It means calling writeln with istrings won't do what is expected, and this is not visually apparent by looking at the code, nor is it checkable by the compiler. 2. an alternative would be locally importing core.interpolation or arsd.sqlite as required for each scoped use of istrings.
It seems you have a fundamental misunderstanding of what DIP1036 is, so here's a quick explanation:

`i"$(str) has $(num) items."` becomes

    AliasSeq!(
        InterpolationHeader(),
        InterpolatedExpression!"str", str,
        InterpolatedLiteral!" has ",
        InterpolatedExpression!"num", num,
        InterpolatedLiteral!" items.",
        InterpolationFooter(),
    )

`InterpolationHeader`, `InterpolatedExpression`, `InterpolatedLiteral`, and `InterpolationFooter` are defined in core.interpolation, which doesn't need to be imported in order to use interpolated strings.

Neither arsd.sqlite nor any other library defines its own interpolated strings. Importing a library does not and cannot change the behavior of interpolated strings; that is a misunderstanding. Library functions simply take in the interpolated string using a vararg template and do literally *whatever they want with it,* since they're given all the information about the interpolated string, including the evaluated expressions that were inside the string.
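For illustration, here is a minimal self-contained sketch of that consuming side. The `Interpolated*` structs below are mock stand-ins so the example compiles with a stock compiler, and `mySink` is a made-up name; with DIP1036 the real types come from core.interpolation and the call would simply be `mySink(i"x = $(x)")`:

```d
// Sketch of a library-side vararg template that walks the pieces of an
// interpolated string and decides what to do with each one.
import std.stdio;
import std.traits : isInstanceOf;

// Mock stand-ins for the core.interpolation types described above.
struct InterpolationHeader {}
struct InterpolationFooter {}
struct InterpolatedLiteral(string s)    { enum text = s; }
struct InterpolatedExpression(string s) { enum source = s; }

void mySink(Args...)(Args args)
{
    foreach (arg; args)                 // unrolled at compile time
    {
        alias T = typeof(arg);
        static if (isInstanceOf!(InterpolatedLiteral, T))
            write(T.text);              // literal string segment
        else static if (isInstanceOf!(InterpolatedExpression, T))
        { /* T.source is the expression's source code, e.g. for SQL binding */ }
        else static if (is(T == InterpolationHeader) || is(T == InterpolationFooter))
        { /* markers delimiting the interpolated string */ }
        else
            write(arg);                 // the evaluated expression's value
    }
    writeln();
}

void main()
{
    int x = 42;
    // Hand-written equivalent of what i"x = $(x)" lowers to:
    mySink(InterpolationHeader(), InterpolatedLiteral!"x = "(),
           InterpolatedExpression!"x"(), x, InterpolationFooter());
}
```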
Jan 07
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 1/7/2024 11:52 AM, brianush1 wrote:
 Not arsd.sqlite, nor any other library, defines their own interpolated
strings. 
 Importing a library does not and cannot change the behavior of interpolated 
 strings, that is a misunderstanding. Library functions simple take in the 
 interpolated string using a vararg template and do literally *whatever they
want 
 with it,* since they're given all the information about the interpolated
string, 
 including the evaluated expressions that were inside the string.
Thanks for the clarification. I had indeed misunderstood it. So that means db.execi() is the template that adjusts how the call to sql is made. (I had assumed it was part of the sql api, rather than a wrapper.) This makes it no different than DIP1027 in that regard.

It is not really necessary to have a marker to say if an argument is a literal or a variable. Consider this program:

```
import std.stdio;

void main()
{
    writeln("hello".stringof);
    string foo = "betty";
    writeln(foo.stringof);
}
```

Which, when run, prints:

"hello"
foo

and so a template, by using .stringof, can determine if a tuple element is a string literal or a variable.
Jan 07
parent reply Timon Gehr <timon.gehr gmx.ch> writes:
On 1/7/24 21:19, Walter Bright wrote:
 On 1/7/2024 11:52 AM, brianush1 wrote:
 Not arsd.sqlite, nor any other library, defines their own interpolated 
 strings. Importing a library does not and cannot change the behavior 
 of interpolated strings, that is a misunderstanding. Library functions 
 simple take in the interpolated string using a vararg template and do 
 literally *whatever they want with it,* since they're given all the 
 information about the interpolated string, including the evaluated 
 expressions that were inside the string.
Thanks for the clarification. I had indeed misunderstood it. So that means db.execi() is the template that adjusts how the call to sql is made. (I had assumed it was part of the sql api, rather than a wrapper.) This makes it no different than DIP1027 in that regard. It is not really necessary to have a marker to say if an argument is a literal or a variable. Consider this program: ``` import std.stdio; void main() {     writeln("hello".stringof);     string foo = "betty";     writeln(foo.stringof); } ``` Which, when run, prints: "hello" foo and so a template, by using .stringof, can determine if a tuple element is a string literal or a variable.
Only if you pass the i-string as a template parameter (which might not even work because any non-symbol expression will need to be evaluated at compile time but maybe cannot). DIP1036 does not require that. Also, what do you do if someone nests multiple i-strings? DIP1036 handles it. etc. The design is simple but it addresses many issues that authors of competing proposals did not even think about.
Jan 07
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 1/7/2024 3:45 PM, Timon Gehr wrote:
 Only if you pass the i-string as a template parameter (which might not even
work 
 because any non-symbol expression will need to be evaluated at compile time
but 
 maybe cannot).
Currently D internally supports non-trivial expressions as tuples. DIP1027 relies on that, as:

```
int i = 4;
writefln(i"hello $(3 + i)");
```

prints:

hello 7

But, if the tuple is created with:

```
template tuple(A ...) { alias tuple = A; }
```

then a non-trivial expression cannot be passed to it, as it will not work as an alias parameter. This is a general shortcoming in D, not a specific problem with DIP1027.

We've been moving towards:

```
auto tup = (1, 3 + i);
```

for a while now, which is why comma expressions have been deprecated.

Amusingly, istrings can be used to create non-trivial expression tuples with this:

```
int i = 4;
auto tup2 = i"$i $(3 + i)"[1..3];
writeln("a tuple: ", tup2);
```

which prints:

a tuple: 47

I can't recommend doing things that way, but it just illustrates istrings as being a building block rather than an end product.

As for detecting string literals, I have tried a few schemes to no avail. It may have to be added with a __traits, which is where we put such things. Trying that with DIP1036 means there are several more entries added to the tuple, which would have to be filtered out.
 Also, what do you do if someone nests multiple i-strings? DIP1036 handles it. 
 etc. The design is simple but it addresses many issues that authors of
competing 
 proposals did not even think about.
Nested istrings would do the expected - create a tuple within a tuple, which gets flattened out. You'd likely wind up with a compile time error that there are too many arguments for the format. Recall that the istring is simply converted to a tuple; after that, predictable tuple rules are followed. However, the nested istring can always be inserted as an argument to the `text` function, which will expand it into a single argument, and no tuple twubble.

In general, you and I agree that D should move towards much better tuple support. DIP1027 fits right in with that, as it does no magic. Better tuple support will fit right in with extending istring power, rather than making istring itself more powerful.
Jan 07
next sibling parent reply Nick Treleaven <nick geany.org> writes:
On Monday, 8 January 2024 at 02:10:03 UTC, Walter Bright wrote:
 Amusingly, istrings can be used to create non-trivial 
 expression tuples with this:

 ```
 int i = 4;
 auto tup2 = i"$i $(3 + i)"[1..3];
 writeln("a tuple: ", tup2);
 ```

 which prints:

 a tuple: 47
Wouldn't you just use `std.typecons.Tuple`?

```d
import std;

void main()
{
    int i = 4;
    auto tup2 = tuple(i, 3 + i);
    writeln("a tuple: ", tup2[]);
}
```
Jan 08
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 1/8/2024 7:50 AM, Nick Treleaven wrote:
 Wouldn't you just use `std.typecons.Tuple`?
I never really understood Tuple. It creates a struct that behaves like a tuple.
Jan 08
parent Timon Gehr <timon.gehr gmx.ch> writes:
On 1/9/24 00:15, Walter Bright wrote:
 On 1/8/2024 7:50 AM, Nick Treleaven wrote:
 Wouldn't you just use `std.typecons.Tuple`?
I never really understood Tuple. It creates a struct that behaves like a tuple.
- It is a proper type.
- It can be used to create ad-hoc groups of values when using generic code.
- It does not expand unless the user asks for it.
- It can be put in data structures.
- It can be returned from functions.
- It has an address.

etc.
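For illustration, a minimal sketch of the expansion difference (the function `takesTwo` is a made-up example):

```d
// A Tuple is one value and only expands on request; an AliasSeq auto-expands.
import std.typecons : tuple;
import std.meta : AliasSeq;
import std.stdio;

void takesTwo(int a, int b) { writeln(a + b); }

void main()
{
    auto t = tuple(1, 2);        // a single value: storable, returnable, addressable
    // takesTwo(t);              // error: Tuple!(int, int) is not (int, int)
    takesTwo(t.expand);          // expansion only when asked for: prints 3
    alias s = AliasSeq!(1, 2);   // a compile-time sequence: auto-expands
    takesTwo(s);                 // prints 3
}
```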
Jan 09
prev sibling parent reply Timon Gehr <timon.gehr gmx.ch> writes:
On 1/8/24 03:10, Walter Bright wrote:
 On 1/7/2024 3:45 PM, Timon Gehr wrote:
 Only if you pass the i-string as a template parameter (which might not 
 even work because any non-symbol expression will need to be evaluated 
 at compile time but maybe cannot).
Currently D internally supports non-trivial expressions as tuples. DIP1027 relies on that, as: ``` int i = 4; writefln(i"hello $(3 + i)"); ``` prints: hello 7 ...
Well yes, but my point was you can't pass that to a template. You were the one passing the istring to a template as part of a suggested routine operation, it was not me.
 But, if the tuple is created with:
 
 ```
 template tuple(A ...) { alias tuple = A; }
 ```
 
 then a non-trivial expression cannot be passed to it, as it will not 
 work as an alias parameter.
Yes.
 This is a general shortcoming in D, not a specific problem with DIP1027.
 ...
The specific problem with DIP1027 is that it has problems for which your suggested solution is to pass an istring to a template, which does not work. DIP1036e simply does not have such problems, because it separates compile-time from runtime information in a way where information is not lost when passing the istring to a function template.
 We've been moving towards:
 
 ```
 auto tup = (1, 3 + i);
 ```
 for a while now, which is why comma expressions have been deprecated.
 ...
Which is good, but I fear your intention is to use this syntax to create auto-expanding entities, which would be bad. In any case, this still does not allow `(1, 3 + i)` to be passed as a template parameter.
 Amusingly, istrings can be used to create non-trivial expression tuples 
 with this:
 
 ```
 int i = 4;
 auto tup2 = i"$i $(3 + i)"[1..3];
 writeln("a tuple: ", tup2);
 ```
 
 which prints:
 
 a tuple: 47
 ...
`tup2` is a sequence of aliases to local variables. It is exactly the same as what we get from this:

```d
import std.typecons : tuple;
auto tup2 = tuple(i, 3 + i).expand;
```

I get that what you mean is that in the right-hand side you have an actual sequence of expressions. It used to be the case that UDAs did not evaluate such expressions but this has been fixed in the meantime.
 I can't recommend doing things that way, but it just illustrates 
 istrings as being a building block rather than an end product.
 ...
And yet with DIP1027 they are presented to a function that would consume them just like the final product without any way of distinguishing a building block from an end product.
 As for detecting string literals, I have tried a few schemes to no 
 avail. It may have to be added with a __traits, which is where we put 
 such things.
 ...
You cannot detect whether a function argument was a string literal in the current language even if you add a `__traits`, because such details are not passed through implicit function template instantiation. And anyway, what do you do if a user decides to do something like this?

i"$("hi") $("%s") $("hi") $(i"$("hi")")"

DIP1036e can handle this relatively naturally.
 Trying that with DIP1036 means there are several more entries added to 
 the tuple, which would have to be filtered out.
 ...
Building block vs end product applies here.
 
 Also, what do you do if someone nests multiple i-strings? DIP1036 
 handles it. etc. The design is simple but it addresses many issues 
 that authors of competing proposals did not even think about.
Nested istrings would do the expected - create a tuple within a tuple, which gets flattened out. You'd likely wind up with a compile time error that there are too many arguments for the format. Recall that the istring is simply converted to a tuple; after that, predictable tuple rules are followed.

...
I want it to work instead of fail to work or pretend to work in a way that is predictable to a type checker developer like you or me. ;)
 However, the nested istring can always be inserted as an argument to the 
 `text` function which will expand it into a single argument, and no 
 tuple twubble.
 ...
At which point you may get back the security vulnerabilities.
 In general, you and I agree that D should move towards much better tuple 
 support.
Yes. NB: I think, among other things, that entails having a clean separation between auto-expanding sequences and "real" tuples that do not auto-expand. Note that the fact that expression sequences auto-expand into any context does not help either DIP1027 or DIP1036e (on the contrary). The reason why DIP1036e uses an expression sequence is that structs have some limitations.
 DIP1027 fits right in with that, as it does no magic. Better 
 tuple support will fit right in in extending istring power, rather than 
 making istring itself more powerful.
I think the things DIP1036e allows to do that DIP1029 does not still do not work with DIP1027 together better tuple support. Rather you would need to add more introspection features to implicit template instantiation to even approach what DIP1036e does.
Jan 08
parent reply Timon Gehr <timon.gehr gmx.ch> writes:
On 1/8/24 21:32, Timon Gehr wrote:
 I think the things DIP1036e allows to do that DIP1029 does not still do 
 not work with DIP1027 together better tuple support. Rather you would 
 need to add more introspection features to implicit template 
 instantiation to even approach what DIP1036e does.
Should have been something like:
 I think the things DIP1036e allows to do that DIP1027 does not still do
 not work with DIP1027 together with better tuple support. Rather you 
would
 need to add more introspection features to implicit template
 instantiation to even approach what DIP1036e does.
Jan 08
parent Walter Bright <newshound2 digitalmars.com> writes:
I started a new thread on this "Interpolated strings and SQL".
Jan 08
prev sibling parent reply DrDread <no no.no> writes:
On Sunday, 7 January 2024 at 18:51:40 UTC, Walter Bright wrote:
 On 1/7/2024 1:22 AM, Paolo Invernizzi wrote:
 [...]
Of course!
 [...]
I asked the question if adding a template call constituted an "undue" burden. What it does do is send a clear visual signal that the default behavior of istring is being adjusted.
 [...]
Using DIP1036 string interpolation for sql calls means:

1. if something other than core.interpolation is imported at the global level, it will apply to all users of istring in that module, not just the ones calling sql functions. It means calling writeln with istrings won't do what is expected, and this is not visually apparent by looking at the code, nor is it checkable by the compiler.

2. an alternative would be locally importing core.interpolation or arsd.sqlite as required for each scoped use of istrings.
and you _still_ misunderstand DIP 1036. which is the whole problem here. and we've told you repeatedly that you misunderstand it. please just go and look at the implementation.
Jan 08
next sibling parent reply Bruce Carneal <bcarneal gmail.com> writes:
On Monday, 8 January 2024 at 14:09:09 UTC, DrDread wrote:
 On Sunday, 7 January 2024 at 18:51:40 UTC, Walter Bright wrote:
 ...
and you _still_ misunderstand DIP 1036. which is the whole problem here. and we've told you repeatedly that you misunderstand it. please just go and look at the implementation.
I read a draft 1036 spec from Atila on his publicly accessible github here: https://github.com/atilaneves/DIPs/blob/string-interpolation/Interpolation.md It might not still be up but I found it approachable. Again, though, it's a draft and might not match the code (or the actual intent for that matter). Still, a useful possibility for those who prefer not to look at code in the early going.
Jan 08
next sibling parent "Richard (Rikki) Andrew Cattermole" <richard cattermole.co.nz> writes:
On 09/01/2024 3:34 AM, Bruce Carneal wrote:
 It might not still be up but I found it approachable.  Again, though, 
 it's a draft and might not match the code (or the actual intent for that 
 matter).  Still, a useful possibility for those who prefer not to look 
 at code in the early going.
The main issue with it that I've seen is that it still uses the ``$$`` escape for ``$``, rather than ``\$``. Apart from that it looks ok.
Jan 08
prev sibling next sibling parent reply Andrej Mitrovic <andrej.mitrovich gmail.com> writes:
On Monday, 8 January 2024 at 14:34:26 UTC, Bruce Carneal wrote:
 On Monday, 8 January 2024 at 14:09:09 UTC, DrDread wrote:
 On Sunday, 7 January 2024 at 18:51:40 UTC, Walter Bright wrote:
 ...
and you _still_ misunderstand DIP 1036. which is the whole problem here. and we've told you repeatedly that you misunderstand it. please just go and look at the implementation.
I read a draft 1036 spec from Atila on his publicly accessible github here: https://github.com/atilaneves/DIPs/blob/string-interpolation/Interpolation.md It might not still be up but I found it approachable. Again, though, it's a draft and might not match the code (or the actual intent for that matter). Still, a useful possibility for those who prefer not to look at code in the early going.
I really wish we didn't have to force parentheses in `$()` if the only thing inside the expression is an alphanumeric variable name. There's the example of:

```D
enum result = text(
    i"@property bool $(name)() @safe pure nothrow @nogc const { return ($(store) & $(maskAllElse)) != 0; }
 @property void $(name)(bool v) @safe pure nothrow @nogc { if (v) $(store) |= $(maskAllElse); else $(store) &= cast(typeof($(store)))(-1-cast(typeof($(store)))$(maskAllElse)); }\n"
);
```

But why not take it a step further and allow this:

```D
enum result = text(
    i"@property bool $name() @safe pure nothrow @nogc const { return ($store & $maskAllElse) != 0; }
 @property void $name(bool v) @safe pure nothrow @nogc { if (v) $store |= $maskAllElse; else $store &= cast(typeof($store))(-1-cast(typeof($store))$maskAllElse); }\n"
);
```

To me that's more readable than the first example, where I have to count the closing parens to figure out where an expression starts and ends. The `$()` should just be an escape hatch when you need more complex expressions. For example:

```D
int foo = 2;
int bar = 4;
writeln(i"$foo + $bar is $(foo + bar)"); // 2 + 4 is 6
```

I haven't read any rationale why parentheses are absolutely required in any of the DIPs I've skimmed through.
Jan 08
next sibling parent reply "Richard (Rikki) Andrew Cattermole" <richard cattermole.co.nz> writes:
On 09/01/2024 3:50 AM, Andrej Mitrovic wrote:
 I really wish we didn't have to force parentheses in |$()| if the only 
 thing inside the expression is an alphanumeric variable name.
It isn't. It was voted on as the common denominator that everyone could agree upon who voted. Adding support for an identifier is trivial, as would be adding formatting support à la f-strings. At least this way it could be expanded upon later.
Jan 08
parent reply Max Samukha <maxsamukha gmail.com> writes:
On Monday, 8 January 2024 at 14:58:13 UTC, Richard (Rikki) Andrew 
Cattermole wrote:

 It was voted on as the common denominator that everyone could 
 agree upon who voted.
If optional () for single-token template argument lists had been put to a vote, the majority would have voted against it. I doubt anybody would want to remove it today.
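For readers who don't remember the feature being referenced, a tiny sketch of the single-token template argument shorthand:

```d
import std.conv : to;

void main()
{
    auto a = to!(int)("42"); // full form: parenthesized template argument list
    auto b = to!int("42");   // shorthand: the () are optional for a single token
    assert(a == b);
}
```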
Jan 08
next sibling parent Paolo Invernizzi <paolo.invernizzi gmail.com> writes:
On Monday, 8 January 2024 at 15:30:19 UTC, Max Samukha wrote:
 On Monday, 8 January 2024 at 14:58:13 UTC, Richard (Rikki) 
 Andrew Cattermole wrote:

 It was voted on as the common denominator that everyone could 
 agree upon who voted.
If optional () for single-token template argument lists had been put to a vote, the majority would have voted against it. I doubt anybody would want to remove it today.
+1 /P
Jan 08
prev sibling next sibling parent Max Samukha <maxsamukha gmail.com> writes:
On Monday, 8 January 2024 at 15:30:19 UTC, Max Samukha wrote:

 If optional () for single-token template argument lists had 
 been put to a vote, the majority would have voted against it. I 
 doubt anybody would want to remove it today.
(The majority included me, btw.)
Jan 08
prev sibling parent reply "Richard (Rikki) Andrew Cattermole" <richard cattermole.co.nz> writes:
On 09/01/2024 4:30 AM, Max Samukha wrote:
 On Monday, 8 January 2024 at 14:58:13 UTC, Richard (Rikki) Andrew 
 Cattermole wrote:
 
 It was voted on as the common denominator that everyone could agree 
 upon who voted.
If optional () for single-token template argument lists had been put to a vote, the majority would have voted against it. I doubt anybody would want to remove it today.
I want full blown f-strings. I have a formatter all written for it and date/time updated too. Its pretty nice. But alas, Adam, didn't. Best I could do was argue against locking us in to a different feature set and keeping it consistent with double quoted strings (``\$`` instead of ``$$`` escape).
Jan 08
parent reply Max Samukha <maxsamukha gmail.com> writes:
On Monday, 8 January 2024 at 15:34:21 UTC, Richard (Rikki) Andrew 
Cattermole wrote:

 I want full blown f-strings.

 I have a formatter all written for it and date/time updated 
 too. Its pretty nice.

 But alas, Adam, didn't.

 Best I could do was argue against locking us in to a different 
 feature set and keeping it consistent with double quoted 
 strings (``\$`` instead of ``$$`` escape).
I don't know why they opted for $$. \ is used for any other escape sequence in D.
Jan 08
next sibling parent "Richard (Rikki) Andrew Cattermole" <richard cattermole.co.nz> writes:
On 09/01/2024 5:18 AM, Max Samukha wrote:
 On Monday, 8 January 2024 at 15:34:21 UTC, Richard (Rikki) Andrew 
 Cattermole wrote:
 
 I want full blown f-strings.

 I have a formatter all written for it and date/time updated too. Its 
 pretty nice.

 But alas, Adam, didn't.

 Best I could do was argue against locking us in to a different feature 
 set and keeping it consistent with double quoted strings (``\$`` 
 instead of ``$$`` escape).
I don't know why they opted for $$. \ is used for any other escape sequence in D.
I don't either. I'm glad that I managed to argue to make it based upon double quoted strings. Keeping things consistent is important when a decision doesn't matter too much otherwise!
Jan 08
prev sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 1/8/2024 8:18 AM, Max Samukha wrote:
 I don't know why they opted for $$. \ is used for any other escape sequence in
D.
The trouble is \ is already used and consumed when lexing the string. That is why printf uses %%.
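A small sketch of the point being made here: the lexer consumes backslash escapes before any format-string machinery ever runs, which is why printf needs its own `%%` escape for a literal percent sign:

```d
import core.stdc.stdio : printf;
import std.stdio : writeln;

void main()
{
    // The lexer turns \\ into a single backslash and \n into a newline
    // before any library code sees the string.
    writeln("a\\b");       // prints: a\b

    // printf's format language runs after lexing, so a literal percent
    // sign needs printf's own escape: %%.
    printf("50%% done\n"); // prints: 50% done
}
```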
Jan 08
parent reply "Richard (Rikki) Andrew Cattermole" <richard cattermole.co.nz> writes:
On 09/01/2024 12:09 PM, Walter Bright wrote:
 On 1/8/2024 8:18 AM, Max Samukha wrote:
 I don't know why they opted for $$. \ is used for any other escape 
 sequence in D.
The trouble is \ is already used and consumed when lexing the string. That is why printf uses %%.
It makes no difference. You would simply be replacing:

https://github.com/dlang/dmd/blob/c04ae03765bb1d4da8e6b3d0087d6bc46c5c3789/compiler/src/dmd/lexer.d#L1940

with a check in:

https://github.com/dlang/dmd/blob/c04ae03765bb1d4da8e6b3d0087d6bc46c5c3789/compiler/src/dmd/lexer.d#L1954

Works:

``i"\n"``
``i"\$"`` (with this proposed change)
``i"$$"``

Does not work:

``i"$(\n)``
``i"$(\$)``
``i"$($$)"``

So what behavior are you attributing to this escape, that requires it to not be consistent with double quoted strings?
Jan 08
next sibling parent "Richard (Rikki) Andrew Cattermole" <richard cattermole.co.nz> writes:
On 09/01/2024 4:57 PM, Richard (Rikki) Andrew Cattermole wrote:
 |i"$(\n)| |i"$(\$)| |i"$($$)"|
Meant to be:

``i"$(\n)"``
``i"$(\$)"``
``i"$($$)"``
Jan 08
prev sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 1/8/2024 7:57 PM, Richard (Rikki) Andrew Cattermole wrote:
 So what behavior are you attributing to this escape, that requires it to not
be 
 consistent with double quoted strings?
At least in DIP1027, the istring is lexed the same as regular strings. The \ is removed.
Jan 08
parent reply "Richard (Rikki) Andrew Cattermole" <richard cattermole.co.nz> writes:
On 09/01/2024 5:07 PM, Walter Bright wrote:
 On 1/8/2024 7:57 PM, Richard (Rikki) Andrew Cattermole wrote:
 So what behavior are you attributing to this escape, that requires it 
 to not be consistent with double quoted strings?
At least in DIP1027, the istring is lexed the same as regular strings. The \ is removed.
In 1036e they are handled with the same functions that handle string literal escapes. There is no distinction.

Once you are in the expression, the interpolated string handling becomes a parser calling ``scan`` to get tokens where it is doing brace counting.

Essentially the lexer is splitting:

``i"prefix$(Identifier1 3.2)suffix"``

into:

``"prefix"``
Identifier1
3.2
``"suffix"``
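As a rough illustration of that splitting (the `consume` function and the hand-written argument list are hypothetical stand-ins, not the actual DIP machinery), a variadic template can already accept such a mixed sequence of string fragments and expression values:

```d
import std.stdio : write, writeln;

// Hypothetical consumer of the pieces the lexer produces:
// literal fragments alternating with the values of the $(...) expressions.
void consume(Args...)(Args args)
{
    foreach (arg; args)
        write("[", arg, "]");
    writeln();
}

void main()
{
    // Hand-written stand-in for what i"prefix$(x * 2)suffix" splits into:
    int x = 3;
    consume("prefix", x * 2, "suffix"); // prints: [prefix][6][suffix]
}
```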
Jan 08
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 1/8/2024 8:11 PM, Richard (Rikki) Andrew Cattermole wrote:
 Essentially the lexer is splitting:
 
 ``i"prefix$(Identifier1 3.2)suffix"``
 
 Into:
 
 ``"prefix"``
 
 Identifier1
 
 3.2
 
 ``"suffix"``
Yes, and see what happens with:

i"pr\\efix$(Identifier1 3.2)suffix"

Are you going to get pr\efix or prefix?
Jan 08
parent reply "Richard (Rikki) Andrew Cattermole" <richard cattermole.co.nz> writes:
On 09/01/2024 8:17 PM, Walter Bright wrote:
 On 1/8/2024 8:11 PM, Richard (Rikki) Andrew Cattermole wrote:
 Essentially the lexer is splitting:

 ``i"prefix$(Identifier1 3.2)suffix"``

 Into:

 ``"prefix"``

 Identifier1

 3.2

 ``"suffix"``
Yes, and see what happens with: i"pr\\efix$(Identifier1 3.2)suffix" Are you going to get pr\efix or prefix?
Is equivalent to:

"pr\\efix"
Identifier1
3.2
"suffix"

So you would get ``pr\efix`` as the string value. As the standard rules of double quoted strings would apply in that subset of the i-string.

Both ``"p\ns"`` and ``i"p\ns"`` are the same thing, ignoring the extra expansion of types/values of i-strings.
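A quick way to check the double quoted string behaviour being described here (plain strings, no i-string involved):

```d
import std.stdio : writeln;

void main()
{
    // The lexer collapses \\ to a single backslash character.
    writeln("pr\\efix");            // prints: pr\efix
    assert("pr\\efix".length == 7); // p r \ e f i x
}
```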
Jan 08
next sibling parent reply "Richard (Rikki) Andrew Cattermole" <richard cattermole.co.nz> writes:
On 09/01/2024 8:40 PM, Richard (Rikki) Andrew Cattermole wrote:
 Is equivalent to:
 
 "pr\efix"
If you use a markdown viewer (like myself) this is probably rendering out to be one backslash. It is meant to be two, as you need to escape the backslash to get into the value rather than trying to escape the e.
Jan 08
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 1/8/2024 11:42 PM, Richard (Rikki) Andrew Cattermole wrote:
 On 09/01/2024 8:40 PM, Richard (Rikki) Andrew Cattermole wrote:
 Is equivalent to:

 "pr\efix"
If you use a markdown viewer (like myself) this is probably rendering out to be one backslash. It is meant to be two, as you need to escape the backslash to get into the value rather than trying to escape the e.
Oh, ok. That explains it.
Jan 09
parent "Richard (Rikki) Andrew Cattermole" <richard cattermole.co.nz> writes:
On 10/01/2024 9:23 AM, Walter Bright wrote:
 On 1/8/2024 11:42 PM, Richard (Rikki) Andrew Cattermole wrote:
 On 09/01/2024 8:40 PM, Richard (Rikki) Andrew Cattermole wrote:
 Is equivalent to:

 "pr\efix"
If you use a markdown viewer (like myself) this is probably rendering out to be one backslash. It is meant to be two, as you need to escape the backslash to get into the value rather than trying to escape the e.
Oh, ok. That explains it.
I'm glad that we have resolved this line of misunderstandings. It was getting silly, given that you would've wrote the code that handled this very thing and it was easy to see what was the case if you read the PR ;)
Jan 09
prev sibling parent Walter Bright <newshound2 digitalmars.com> writes:
On 1/8/2024 11:40 PM, Richard (Rikki) Andrew Cattermole wrote:
 So you would get ``pr\efix`` as the string value. As the standard rules of 
 double quoted strings would apply in that subset of the i-string.
The standard rules of double quoted strings produce the error message:

Error: undefined escape sequence \e
Jan 09
prev sibling parent Walter Bright <newshound2 digitalmars.com> writes:
On 1/8/2024 6:50 AM, Andrej Mitrovic wrote:
 I haven't read any rationale why parentheses are absolutely required in any of 
 the DIPs I've skimmed through.
They're not required in DIP1027.
Jan 08
prev sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 1/8/2024 6:34 AM, Bruce Carneal wrote:
 It might not still be up but I found it approachable.  Again, though, it's a 
 draft and might not match the code (or the actual intent for that matter).  
I had thought the Interpolation template implementations were meant to be overridden. But looking at the sql implementation: https://github.com/adamdruppe/interpolation-examples/blob/master/lib/sql.d shows this not to be the case.
 Still, a useful possibility for those who prefer not to look at code in the
 early going.
A specification should not require looking at the code. After all, do you expect to need to read the C++ compiler source to figure out what it does?
Jan 08
parent reply Bruce Carneal <bcarneal gmail.com> writes:
On Monday, 8 January 2024 at 18:35:53 UTC, Walter Bright wrote:
 On 1/8/2024 6:34 AM, Bruce Carneal wrote:
 It might not still be up but I found it approachable.  Again, 
 though, it's a draft and might not match the code (or the 
 actual intent for that matter).
I had thought the Interpolation template implementations were meant to be overridden. But looking at the sql implementation: https://github.com/adamdruppe/interpolation-examples/blob/master/lib/sql.d shows this not to be the case.
 Still, a useful possibility for those who prefer not to look
at code in the
 early going.
A specification should not require looking at the code. After all, do you expect to need to read the C++ compiler source to figure out what it does?
Sure, specs are useful as are the documented use cases provided with a working implementation. Working prototypes let you "kick the tires" and can help you *debug* the spec. They can also help you estimate the long term support burden more accurately than you would from simply looking at a spec (forestalls a bunch of hand waving). I'm not saying we should drop specs of course, in fact I consider them a requirement at this level, rather that we should understand their limitations and the benefits provided by working code with examples.
Jan 08
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 1/8/2024 11:44 AM, Bruce Carneal wrote:
 I'm not saying we should drop specs of course, in fact I consider them a 
 requirement at this level, rather that we should understand their limitations 
 and the benefits provided by working code with examples.
I agree, and that's why specifications usually include examples. Examples in the spec should be specific and minimal. A reader should not need to be familiar with SQL code in order to understand string interpolation. Anyhow, this is now moot, see my new thread topic "Interpolated strings and SQL".
Jan 08
parent reply Bruce Carneal <bcarneal gmail.com> writes:
On Monday, 8 January 2024 at 23:42:49 UTC, Walter Bright wrote:
 On 1/8/2024 11:44 AM, Bruce Carneal wrote:
 I'm not saying we should drop specs of course, in fact I 
 consider them a requirement at this level, rather that we 
 should understand their limitations and the benefits provided 
 by working code with examples.
I agree, and that's why specifications usually include examples. Examples in the spec should be specific and minimal. ...
Yes, examples are good but there is a qualitative difference between the pseudo-code of never compiled examples in a standalone spec and the actual code of examples compiled by a prototype implementation. Additionally noted earlier: a naked spec (no implementation) tempts us to speculate about the relative difficulty and invasiveness of the eventual, often long delayed, implementation. I don't think we should require an implementation to accompany a proposal. I do think, however, that proposals that have them will be easier to evaluate properly and deserving of expedited review.
Jan 08
next sibling parent Walter Bright <newshound2 digitalmars.com> writes:
On 1/8/2024 7:49 PM, Bruce Carneal wrote:
 I don't think we should require an implementation to accompany a proposal.
We don't. We just require an accurate description (i.e. a specification), rather than suggesting reviewers reverse engineer the implementation. Implementations do a lot more than just be a specification - they manage memory, have optimizations, have workarounds for language problems, manage error handling, deal with the operating system, deal with configuration switches, and on and on.
 I do 
 think, however, that proposals that have them will be easier to evaluate 
 properly and deserving of expedited review.
That's true.
Jan 08
prev sibling parent razyk <user home.org> writes:
On 09.01.24 04:49, Bruce Carneal wrote:
 Additionally noted earlier: a naked spec (no implementation) tempts us 
 to speculate about the relative difficulty and invasiveness of the 
 eventual, often long delayed, implementation.
But the implementation is easy to change later, the specification is not.
Jan 09
prev sibling parent Walter Bright <newshound2 digitalmars.com> writes:
On 1/8/2024 6:09 AM, DrDread wrote:
 and you _still_ misunderstand DIP 1036. which is the whole problem here. and 
 we've told you repeatedly that you misunderstand it. please just go and look
at 
 the implementation.
You're right on that point. More later.
Jan 08
prev sibling next sibling parent reply Bruce Carneal <bcarneal gmail.com> writes:
On Sunday, 7 January 2024 at 09:04:16 UTC, Walter Bright wrote:
 On 1/6/2024 9:35 PM, H. S. Teoh wrote:
 Noteworthy is the fact that the competing string interpolation 
 proposals
 are *not* immune to this sort of SQL injection attack, because 
 premature
 conversion of the i"" literal to string *would* result in a 
 successful
 injection.
The same technique of having a template take the generated tuple and modifying it as it sees fit works with DIP1027, too. I posted an example here in the last debate about this.

...

It's true that in order for this to work,

```
db.execi(i"INSERT INTO sample VALUES ($(id), $(name))");
```

would need to be written as:

```
db.execi(xxx!(i"INSERT INTO sample VALUES ($(id), $(name))"));
```

where `xxx` is the thoroughly unimaginative name of the
... So 1027 is equivalently good in this aspect as long as programmers are conscientious in their definition and use of a typing convention? Unless 1036e is believed to be very difficult to implement correctly, or has nasty cross dependencies that could cause problems later, this would seem to be a bad trade (hypothetical? simplification of implementation in exchange for making things harder for users).
Jan 07
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 1/7/2024 11:09 AM, Bruce Carneal wrote:
 It's true that in order for this to work,

 ```
 db.execi(i"INSERT INTO sample VALUES ($(id), $(name))");
 ```
 would need to be written as:
 ```
 db.execi(xxx!(i"INSERT INTO sample VALUES ($(id), $(name))"));
 ```
 where `xxx` is the thoroughly unimaginative name of the
... So 1027 is equivalently good in this aspect as long as programmers are conscientious in their definition and use of a typing convention?
It turns out this is an issue for DIP1036 as well, as db.execi() is the template. I hadn't realized that.
 Unless 1036e is believed to be very difficult to implement correctly, or has 
 nasty cross dependencies that could cause problems later, this would seem to
be 
 a bad trade (hypothetical? simplification of implementation in exchange for 
 making things harder for users).
Apparently DIP1027 is no harder for users than DIP1036.
Jan 07
parent Timon Gehr <timon.gehr gmx.ch> writes:
On 1/7/24 21:22, Walter Bright wrote:
 So 1027 is equivalently good in this aspect as long as programmers are 
 conscientious in their definition and use of a typing convention?
It turns out this is an issue for DIP1036 as well, as db.execi() is the template. I hadn't realized that. ...
No, it is not an issue for DIP1036e, because it properly separates out compile-time and runtime data.
 
 Unless 1036e is believed to be very difficult to implement correctly, 
 or has nasty cross dependencies that could cause problems later, this 
 would seem to be a bad trade (hypothetical? simplification of 
 implementation in exchange for making things harder for users).
Apparently DIP1027 is no harder for users than DIP1036.
Yes it is. DIP1027 is impossible for users. It's a low bar.
Jan 09
prev sibling parent reply Timon Gehr <timon.gehr gmx.ch> writes:
On 1/7/24 10:04, Walter Bright wrote:
 
 It's true that in order for this to work,
 
 ```
 db.execi(i"INSERT INTO sample VALUES ($(id), $(name))");
 ```
 would need to be written as:
 ```
 db.execi(xxx!(i"INSERT INTO sample VALUES ($(id), $(name))"));
 ```
 where `xxx` is the thoroughly unimaginative name of the transformer 
 template.
 
 Is adding the template call an undue burden?
Yes, among other reasons because it does not work.

This works with DIP1036:

```
int x=readln.strip.split.to!int;
db.execi(i"INSERT INTO sample VALUES ($(id), $(2*x))");
```

This cannot work:

```
int x=readln.strip.split.to!int;
db.execi(xxx!i"INSERT INTO sample VALUES ($(id), $(2*x))");
```
Jan 07
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 1/7/2024 3:50 PM, Timon Gehr wrote:
 This cannot work:
 
 ```
 int x=readln.strip.split.to!int;
 db.execi(xxx!i"INSERT INTO sample VALUES ($(id), $(2*x))");
 ```
True, you got me there. It's the 2*x that is not turnable into an alias. I'm going to think about this a bit.
Jan 07
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 1/7/2024 6:30 PM, Walter Bright wrote:
 On 1/7/2024 3:50 PM, Timon Gehr wrote:
 This cannot work:

 ```
 int x=readln.strip.split.to!int;
 db.execi(xxx!i"INSERT INTO sample VALUES ($(id), $(2*x))");
 ```
True, you got me there. It's the 2*x that is not turnable into an alias. I'm going to think about this a bit.
I wonder if what we're missing are functions that operate on tuples and return tuples. We almost have them in the form of:

```
template tuple(A ...) { alias tuple = A; }
```

but the compiler wants A to only consist of symbols, types and expressions that can be computed at compile time. This is so the name mangling will work. But what if we don't bother doing name mangling for this kind of template?
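A small sketch of that limitation (`Seq` here is the same idea under a hypothetical name): a local symbol can be passed as an alias sequence member, but a run-time expression like `3 + i` cannot:

```d
template Seq(A...) { alias Seq = A; }

void main()
{
    int i = 4;

    alias ok = Seq!(i, "hello");  // a symbol and a literal work fine
    static assert(ok.length == 2);

    // alias bad = Seq!(3 + i);   // error: variable `i` cannot be read at compile time
}
```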
Jan 07
parent "Richard (Rikki) Andrew Cattermole" <richard cattermole.co.nz> writes:
I kinda wish we had tuples that are not alias sequences right now.

1036e could be (at least in my opinion) better with them.

```
tuple(prefix, expression or positional, ...)
```

Removes the possibility of ref yes, but does introduce the possibility 
of f-string positional argument support and easily passing it around.

``i"prefix1${1}prefix2%{arg}"``

becomes

``tuple(PrefixType!("prefix1"), PositionType!1, PrefixType!("prefix2"), 
arg)``

```d
void writeln(Args...)(Args args) {
	static foreach(i, Arg; Args) {
		static if (is(Arg : typeof(tuple))) {
			// is tuple for arg i
		}
	}
}
```
Jan 07
prev sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 1/6/2024 9:35 PM, H. S. Teoh wrote:
 import lib.sql;
Where is this file?
Jan 07
parent reply matheus <matheus gmail.com> writes:
On Sunday, 7 January 2024 at 20:03:03 UTC, Walter Bright wrote:
 On 1/6/2024 9:35 PM, H. S. Teoh wrote:
 import lib.sql;
Where is this file?
I think it's this one: https://github.com/adamdruppe/interpolation-examples/blob/master/lib/sql.d

From Adam's examples: https://github.com/adamdruppe/interpolation-examples/

Matheus.
Jan 07
parent Walter Bright <newshound2 digitalmars.com> writes:
On 1/7/2024 12:08 PM, matheus wrote:
 I think it's this one: 
Thank you. It turns out I had a misunderstanding, so I don't need it.
Jan 07
prev sibling next sibling parent reply Bastiaan Veelo <Bastiaan Veelo.net> writes:
On Tuesday, 2 January 2024 at 17:55:56 UTC, GrimMaple wrote:
 Growing greatly dissatisfied with how things are in the D 
 Programming Language, we decided it is time to fork it.
 We want the process of contribution to be as open-ended as 
 possible, and not having unnecessary blockage to contributions.
 https://dpldocs.info/opend/contribute.html
But why name it "OpenD"? If I read "Open" in any project name, I think of "Open Source", not "open-ended". I find it very unfortunate that this may signal that D is somehow not open source, a misconception that we were struggling with for a long time. D being open source is the very reason why a fork is possible in the first place. Forking is a lot better than just leaving, and I wish you, Adam, and any future contributors success and happiness. I have appreciation for the work that you (especially Adam) have put into D thus far, and I am sure you appreciate the work that everybody else has put into D as well (because why otherwise fork it). Naming it OpenD doesn't look like a nice move to me, and I hope you'll find a better name. -- Bastiaan.
Jan 06
parent reply matheus <matheus gmail.com> writes:
On Saturday, 6 January 2024 at 13:18:40 UTC, Bastiaan Veelo wrote:
 ...
At the risk of putting myself in hot water, my understanding (I may be wrong!) is that after the xmas incident, Adam's posts were put under moderation and he has been lying low around this mailing list. So I'd suggest you talk to Adam directly over IRC in #opend, which I think will reach him fast.

Matheus.
Jan 06
parent Meta <jared771 gmail.com> writes:
On Saturday, 6 January 2024 at 14:39:44 UTC, matheus wrote:
 On Saturday, 6 January 2024 at 13:18:40 UTC, Bastiaan Veelo 
 wrote:
 ...
At the risk of putting myself in hot water, my understanding (I may be wrong!) is that after the xmas incident, Adam's posts were put under moderation and he has been lying low around this mailing list. So I'd suggest you talk to Adam directly over IRC in #opend, which I think will reach him fast.

Matheus.
What happened on Christmas?
Jan 06
prev sibling next sibling parent reply whitebyte <caffeine9999 mailbox.org> writes:
On Tuesday, 2 January 2024 at 17:55:56 UTC, GrimMaple wrote:
 Hello everyone!

 Growing greatly dissatisfied with how things are in the D 
 Programming Language, we decided it is time to fork it.
It's fascinating that Walter did not relate to this at all, but readily jumped into a lengthy technical discussion of a tangentially related topic.
Jan 09
next sibling parent reply BlueBeach <blue.beach7052 fastmail.com> writes:
On Tuesday, 9 January 2024 at 08:45:03 UTC, whitebyte wrote:
 It's fascinating that Walter did not relate to this at all, but 
 readily jumped into a lengthy technical discussion of a 
 tangentially related topic.
Agreed, particularly with the word „fascinating“. I think the whole forking situation is kinda sad. This fork will probably go nowhere, something that has been proven time and time again, unfortunately. And I‘m afraid for dlang it will be a net negative too.

I just reread the whole thread and I think in the first 4 to 5 pages the discussion stayed mostly on topic. But then the discussion first got a little more confrontational when some senior dlang people joined the discussion. Subsequently the entire discussion was basically ended by drowning it in technical detail for the next 5 pages.

To be honest it is hard for me to see the end goal of this strategy. What makes it so hard to discuss this fork and engage on topic? Why not let them at least have their discussion?
Jan 09
next sibling parent "Richard (Rikki) Andrew Cattermole" <richard cattermole.co.nz> writes:
On 10/01/2024 2:00 AM, BlueBeach wrote:
 To be honest it is hard for me to see the end goal of this strategy. 
 What makes it so hard to discuss this fork and engage on topic? Why not 
 let them at least have their discussion?
It is incredibly easy to go from material that is fully relevant to the original topic to material that should be split out into a different thread because it is related but not quite relevant, and you can't do that split after the fact. Which is what happened here.
Jan 09
prev sibling parent reply Timon Gehr <timon.gehr gmx.ch> writes:
On 1/9/24 14:00, BlueBeach wrote:
 On Tuesday, 9 January 2024 at 08:45:03 UTC, whitebyte wrote:
 It's fascinating that Walter did not relate to this at all, but 
 readily jumped into a lengthy technical discussion of a tangentially 
 related topic.
Agreed, particularly with the word „fascinating“. I think the whole forking situation is kinda sad. This fork will probably go nowhere, something that has been proven time and time again, unfortunately.
Not really. See druntime.
 And I‘m afraid for dlang it will be a net negative too.
 ...
A bit of competition is usually a good thing. It's also not like the two projects have no synergies, they are literally built on the same code base.
 I just reread the whole thread and I think in the first 4 to 5 pages the 
 discussion stayed mostly on topic. But then the discussion first got a 
 little more confrontational when some senior dlang people joined the 
 discussion. Subsequently the entire discussion was basically ended by 
 drowning it in technical detail for the next 5 pages.
 ...
Well, the stubbornly different understanding of technical details of DIP1027 vs DIP1036e is the final straw that spawned the fork. Also, more senior dlang people tend to use e.g. thunderbird rather than forum.dlang.org, with a threaded view.
 To be honest it is hard for me  to see the end goal of this strategy. 
 What makes it so hard to discuss this fork and engage on topic? Why not 
 let them at least have their discussion?
Walter opened a new thread now.
Jan 09
parent reply BlueBeach <blue.beach7052 fastmail.com> writes:
On Tuesday, 9 January 2024 at 13:22:08 UTC, Timon Gehr wrote:

 ... See druntime.
I'm not familiar with this case. It is part of the DMD repository. Can you link/give a little more background?
 A bit of competition is usually a good thing. It's also not 
 like the two projects have no synergies,
From what I understand, one of the grievances of the OP is the slow processing of PRs. If that's really the case, the effect of synergies by exchange is hindered.
 Well, the stubbornly different understanding of technical 
details of DIP1027 vs DIP1036e is the final straw that spawned
 the fork.
There is nothing wrong with going into detail when you make your point. But the discussion was conducted in a way that gave the impression of a serious lack of awareness of the 'where' (in a thread discussing a fork, created mainly because of non-technical reasons) and the 'who' (in a thread read by a lot of people, most of them not primarily interested in a conflict about escape characters). To be honest it struck me as arrogant and ignorant rather than particularly savvy.
 Also, more senior dlang people tend to use e.g. thunderbird 
 rather than forum.dlang.org, with a threaded view.
If that's a reason for how the discussion is conducted, I would address that.
 Walter opened a new thread now.
That is very good. I think this distinction between organisational and technical issues is important, and frankly I don't quite understand the problem with it. It seems to me the immediate reaction from Dlang's leadership is to deny or downplay the organisational aspect by jumping immediately and with great vigor into technical sophistry without any real need. On the one hand it is fascinating, on the other hand it is also quite sad.
Jan 09
next sibling parent Lance Bachmeier <no spam.net> writes:
On Tuesday, 9 January 2024 at 14:02:36 UTC, BlueBeach wrote:
 On Tuesday, 9 January 2024 at 13:22:08 UTC, Timon Gehr wrote:

 ... See druntime.
I'm not familiar with this case. It is part of the DMD repository. Can you link/give a little more background?
I assume it's a reference to Tango vs Phobos. It was a very long time ago (resolved before I started using D in 2013) so I'm not the best person to explain it. You can find information if you search this forum.
 Well, the stubbornly different understanding of technical 
details of DIP1027 vs DIP1036e is the final straw that spawned
 the fork.
There is nothing wrong with going into detail when you make your point. But the discussion was conducted in a way that gave the impression of a serious lack of awareness of the 'where' (in a thread discussing a fork, created mainly because of non-technical reasons) and the 'who' (in a thread read by a lot of people, most of them not primarily interested in a conflict about escape characters). To be honest it struck me as arrogant and ignorant rather than particularly savvy.
I think to the insiders the problems are known quite well. Until there is a change in the way decisions are made, there probably isn't much to discuss with respect to the fork.
Jan 09
prev sibling parent reply "H. S. Teoh" <hsteoh qfbox.info> writes:
On Tue, Jan 09, 2024 at 02:02:36PM +0000, BlueBeach via Digitalmars-d wrote:
 On Tuesday, 9 January 2024 at 13:22:08 UTC, Timon Gehr wrote:
 
 ... See druntime.
I'm not familiar with this case. It is part of the DMD repository. Can you link/give a little more background?
See:

http://dpldocs.info/this-week-in-d/Blog.Posted_2024_01_01.html

About 2/10 of the way down the page, you have this:

    The code we now know as "druntime" originated as a fork because
    Walter failed to accept community contributions. In 2004, after
    attempting to work with upstream, these developers were left with
    no option but to fork the language to keep their contributions -
    which they must have felt were necessary to expand the use of D -
    from being totally lost.

    This fork was called "Ares" at first, and would later join forces
    with other community efforts to become "Tango". Tango called itself
    "The Developer's Library for D". I didn't understand why at the
    time, I just knew to use it, you had to install some things from an
    additional website and I didn't want to do that. But now, knowing
    what I know now about D, the name was obvious: Tango was where the
    developers actually were welcome to contribute. And contribute they
    did: the Tango ecosystem had many things the Phobos ecosystem
    lacked.

        "Walter blesses many ideas. What I'm wondering is how quickly
        he incorporates the results." jcc7, September 10, 2004,
        dsource.org forums

    Only after four years of persisting in the fork and gaining
    significant popularity, including capturing D's early commercial
    users, did upstream finally relent and opened up to a
    reconciliation, leading to the "druntime" we enjoy today.
 A bit of competition is usually a good thing. It's also not like the
 two projects have no synergies,
From what I understand, one of the grievances of the OP is the slow processing of PRs. If that's really the case, the effect of synergies by exchange is hindered.
It's not just the slow processing of PRs. It's the arbitrary shutdown of contributions after months, often years, of silence, with no prior warning and no regard for the amount of work put into maintaining said PRs over that length of time. And often while totally misunderstanding what was actually done, as is being shown right at this moment with the discussion on DIP 1036e.

[...]
 Walter opened a new thread now.
That is very good. I think this distinction between organisational and technical issues is important, and frankly I don't quite understand the problem with it. It seems to me the immediate reaction from Dlang's leadership is to deny or downplay the organisational aspect by jumping immediately and with great vigor into technical sophistry without any real need. On the one hand it is fascinating, on the other hand it is also quite sad.
This is by far not the first time. Some years ago, I, a random nobody online, showed up and complained about the lack of progress in the Phobos PR queue. (By then I had contributed a handful of PRs.) After making enough of a noise about it, I was suddenly handed the "keys to the kingdom", so to speak, i.e., commit access to Phobos, along with another contributor. That was good; we got to work and after several months, or perhaps even a year of work, we managed to unclog most of Phobos' PR queue. A happy ending... or was it?

Well, after some time had passed, Andrei suddenly appeared out of nowhere after having been mostly silent (or occasionally giving one-sentence or one-word responses) and came down upon us like a ton of bricks, saying that Phobos is a mess and indirectly implying that our efforts to get the ball rolling in the Phobos PR queue were the cause. He went on about Good Work vs. Great Work and a bunch of philosophical gripes -- none of which we were informed of prior to this. Neither was any concrete explanation given as to what exactly was required beyond the vague terms "Great Work" vs. "Good Work" and some handwaving explanation.

In the aftermath, the other contributor quit contributing. I continued, but scaled back my contributions to a mere trickle. Not because I was mad or anything, mind you, I just decided that since whatever I did wasn't good enough, and since I didn't "get it" about Good Work vs. Great Work, I'd just leave it to somebody else who "got it" to step up and fill the role.

Well guess what? Nobody stepped up. Phobos returned to stagnation, and the PR queue grew back to its old unmanageable proportions, clogged up with PRs that never went anywhere unless you were willing to wait months or even years. And even after that it was anybody's guess as to whether it would move forward at all. Most just stagnated until the original author(s) abandoned it or quit in frustration. Eventually the DLF was forced to spend money to hire somebody to look after the queue, because nobody was willing to do it unless they were paid to do so.

And it wasn't as if there was a lack of willing contributors. Many willing contributors came, were active for a period of time, then left, usually over some disagreement or frustration at the way things were managed. If it was merely a handful of contributors, then one could reasonably attribute it to personal disagreements, or that that person simply didn't "fit" in how things worked here. But it isn't just a handful. It's a whole long series of active contributors, some of whom made enormous contributions (Kenji Hara, for example, contributed about 1/2 of all the code currently in dmd), but eventually left due to some dispute.

After more than 10 years of this very same pattern repeated over and over -- someone comes on board, actively contributes, sometimes makes major contributions, then leaves in a huff or withdraws from active contribution -- one cannot help asking the question, why? What are we doing wrong that's driving willing contributors away? What should we do to change this?

This issue has been brought up time and again throughout the history of D, and I have yet to see any action on the part of the leadership that actually made a difference. It's not that the leadership didn't try; they certainly did. But it's been more than 10 years, and whatever they have tried has not qualitatively changed the outcome. This past year there was a lot of noise about changing the process for the better, etc., but what was the outcome?
Adam, one of the major contributors to D, decides to fork the project. What gives?

It's time to ask some serious questions. Why is it that other programming languages are gaining contributors while we're bleeding them? Why are long-time D fans feeling so much frustration after years of being loyal to D? Something is not adding up, and whatever we've attempted over the past 10+ years hasn't solved this problem. Perhaps it's time for a drastic, fundamental change in the way this project is run.


T

-- 
In order to understand recursion you must first understand recursion.
Jan 09
next sibling parent matheus <matheus gmail.com> writes:
On Tuesday, 9 January 2024 at 15:01:28 UTC, H. S. Teoh wrote:
 ...
Given the small size of this community, this should be taken very seriously. I have seen this argument before, and I have seen some people leave because of the problem you mentioned. I would be worried about losing contributors in a free project like this, and I think I would probably talk to them and try to smooth things over a bit.

Finally, sorry if this is disrupting the topic (about the fork), I'll refrain from now on. I just think that post raises a serious issue, something that led to the creation of this topic/fork, and again it should be reflected upon.

Matheus.
Jan 09
prev sibling next sibling parent reply BlueBeach <blue.beach7052 fastmail.com> writes:
On Tuesday, 9 January 2024 at 15:01:28 UTC, H. S. Teoh wrote:
 On Tue, Jan 09, 2024 at 02:02:36PM +0000, BlueBeach via 
 Digitalmars-d wrote:
 On Tuesday, 9 January 2024 at 13:22:08 UTC, Timon Gehr wrote:
 
 ... See druntime.
I'm not familiar with this case. It is part of the DMD repository. Can you link/give a little more background?
[...] In 2004, after attempting to work with upstream, these developers were left with no option but to fork the language to keep their contributions [...]
Thanks for the info, and wow. That's 20 years ago (already) ...
 After more than 10 years of this very same pattern repeated 
 over and over -- someone comes on board, actively contributes, 
 sometimes makes major contributions, then leaves in a huff or 
 withdraws from active contribution -- one cannot help asking 
 the question, why?  What are we doing wrong that's driving 
 willing contributors away?  What should we do to change this?
Since you asked, I hope it is OK if I share my opinion on this topic, although I'm mostly a lurker and not active in the community.

I think the main reason for some of the recurring organisational issues and their unpleasant side effects is unresolved questions around Walter's authority. Since Dlang is an Open Source project, there are expectations of a certain level of democracy. Nobody is perfect and a flawed democracy would probably suffice, but a lot of people seem to experience the Dlang community as a flawed oligarchy where only a minority has a say and sometimes even those people are omitted. Maybe a less nice way to describe the style of this project is a mix of meritocracy and dictatorship. While nobody is against leadership by merit, there is a reason why autocratic forms of government are disliked: They are poor in words and highly unpredictable.

In the end only Walter, as the founder of the project, can decide how and if he wants to define and exercise his authority and how fundamental democratic structures should be. One thing I know for sure: If you want a more democratic and predictable leadership, you are not getting it by chance. It is not the natural state and you really have to fight (or work) for it.
Jan 09
parent reply "H. S. Teoh" <hsteoh qfbox.info> writes:
On Tue, Jan 09, 2024 at 05:06:20PM +0000, BlueBeach via Digitalmars-d wrote:
[...]
I think the main reason for some of the recurring organisational issues and their unpleasant side effects is unresolved questions around Walter's authority. Since Dlang is an Open Source project, there are expectations of a certain level of democracy. Nobody is perfect and a flawed democracy would probably suffice, but a lot of people seem to experience the Dlang community as a flawed oligarchy where only a minority has a say and sometimes even those people are omitted. Maybe a less nice way to describe the style of this project is a mix of meritocracy and dictatorship. While nobody is against leadership by merit, there is a reason why autocratic forms of government are disliked: They are poor in words and highly unpredictable.
[...]
From the first paragraph in Adam's blog on 2024-01-01:
    While the oft-repeated claim that D is a closed-source language is
    not really true (the D-specific parts of the compiler were GPL'd as
    early as 2002, leading to an all-GPL compiler (what we now know as
    gdc) being released in 2004), it is true that D's development
    methodologies are not especially open and that decision making has
    very little meaningful input from the community, and this has been
    true for its entire history.

It has always struck me as somewhat incongruent that while D's *code* is open source and licensed under an open source license, its development methodologies are very much entrenched in the proprietary (commercial), closed source mentality. A mentality where decisions are made behind closed doors and there is no obligation to provide any rationale. This leads to a lot of friction with contributors who come in expecting a more open style of development that one would expect to find in an open source project, but who discover to their chagrin that it's being run as if it were a closed source, proprietary project.

Of course, this project was initiated by Walter and he has the right to choose whatever development methodology he wishes, and if you wish to play ball in this project then that's just what you have to work with. And I'm not saying that this style of management 100% doesn't work. But D's history has shown us time and again that it is at least one of the sources of much frustration on the part of contributors who are expecting an open source project that's run more like an open source project. Given our track record so far, perhaps it's time to reconsider the very fundamental principles by which D is being managed.

From a technical standpoint, D has no parallels that I know of -- it comes very close to my ideal of what a programming language should be. But the way it's managed leaves a lot to be desired. It would be a pity for this beautiful language to languish when under a different style of management it could be flourishing and taking over the world.


T

-- 
Never criticize a man until you've walked a mile in his shoes. Then when you do criticize him, you'll be a mile away and he won't have his shoes.
Jan 09
next sibling parent reply user1234 <user1234 12.de> writes:
On Tuesday, 9 January 2024 at 17:42:11 UTC, H. S. Teoh wrote:
 [...]
I don't see how a D fork managed by Ruppe could be managed in a better way. That would be a downgrade. You are pitting someone who spent years trying to maintain a community against someone who goes crazy over a single disagreement.
Jan 09
next sibling parent "H. S. Teoh" <hsteoh qfbox.info> writes:
On Tue, Jan 09, 2024 at 06:09:19PM +0000, user1234 via Digitalmars-d wrote:
 On Tuesday, 9 January 2024 at 17:42:11 UTC, H. S. Teoh wrote:
 [...]
I don't see how a D fork managed by Ruppe could be managed in a better way. That would be a downgrade. You are pitting someone who spent years trying to maintain a community against someone who goes crazy over a single disagreement.
You are free to form whatever opinion you like of me or Adam, it doesn't bother me. But I have worked with Adam before, and he's generally much more open to contributions than Walter, and generally more pragmatic about not holding things up and letting the perfect be the enemy of the good. And he doesn't strike me as the "crazy" reactionary type, even though lately, before the fork, he did seem unusually riled up. Which, given the frustrations he has gone through in the past years, isn't entirely unexpected either. And it is far from a merely "a single disagreement"; that was merely the last straw that broke the camel's back after many years of pent-up frustration. I'm not 100% in agreement with his approach either, but given D's track record, which I've already elaborated on at length and won't repeat here, I'm interested to see where this leads. It may lead nowhere as the naysayers are already saying, but it may also lead somewhere D ought to have been but hasn't gotten to so far for the aforementioned reasons. It will be educational to see how this all pans out. Ultimately, I, and Adam himself as he already expressed privately to me, hope that this fork will lead to D moving past its present roadblocks and making progress instead of continuing to stagnate. The goal isn't to oppose Walter or anything in the current D leadership, but to bring D forward. T -- Frank disagreement binds closer than feigned agreement.
Jan 09
prev sibling parent max haughton <maxhaton gmail.com> writes:
On Tuesday, 9 January 2024 at 18:09:19 UTC, user1234 wrote:
 On Tuesday, 9 January 2024 at 17:42:11 UTC, H. S. Teoh wrote:
 [...]
I don't see how a D fork managed by Ruppe could be managed in a better way. That would be a downgrade. You are pitting someone who spent years trying to maintain a community against someone who goes crazy over a single disagreement.
Adam's been around for ages, come on now. FWIW I contributed to both projects over the weekend because I view the fork as more of an opportunity to try things out than necessarily the future, and it was a lot easier with the fork. A lot of the practices and layout of the main D project are either antiquated or just bad; a fork is a nice way to try out new stuff and learn something rather than making a plan to make a plan about how you might fix things.
Jan 09
prev sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 1/9/2024 9:42 AM, H. S. Teoh wrote:
  From a technical
 standpoint, D has no parallels that I know of -- it comes very close to
 my ideal of what a programming language should be.  But the way it's
 managed leaves a lot to be desired.  It would be a pity for this
 beautiful language to languish when under a different style of
 management it could be flourishing and taking over the world.
Thank you for the kind compliments about D. Perhaps one reason it is such a nice language is because I say "no" to most enhancements? D would have version algebra and macros if it was a committee. Some features are great ideas, until you've used them for 10 years, and come to the realization that they aren't so good of an idea.

Aesthetic appeal is a big deal. D has got to look good on the screen, because after all, we spend most "programming" time just staring at the code. I remember once attending a C++ conference where the presenter had slides of his innovative ideas, and I had the thought that there was no way to format that or rewrite it so it looked good. I've had that experience many times with C++.

For example, one of the Tango features I rejected was creating a clone of C++'s iostreams. I knew by then that iostreams was a great idea, but it just looked awful on the screen (and had some other fundamental problems). The modern consensus is that iostreams was a misuse of operator overloading. D also restricts operator overloading to discourage using it as a DSL (though Tango still managed to use it for I/O).

I could go on with that, but that's enough for the moment.

The end goal for me with D is that it will no longer need me.

As for Phobos, I am not involved with it directly. There has been a sequence of people in charge of it, but that hasn't worked out too well. But there is a core team of 35 people (though some are inactive) that controls what goes into it:

https://github.com/orgs/dlang/teams/team-phobos

They have the authority to decide what goes in Phobos or not. I'm open to nominations to that team.

Anybody can bring attention on the n.g. to any PR that is being overlooked.
Jan 09
next sibling parent "H. S. Teoh" <hsteoh qfbox.info> writes:
On Tue, Jan 09, 2024 at 01:11:39PM -0800, Walter Bright via Digitalmars-d wrote:
 On 1/9/2024 9:42 AM, H. S. Teoh wrote:
 From a technical standpoint, D has no parallels that I know of -- it
 comes very close to my ideal of what a programming language should
 be.  But the way it's managed leaves a lot to be desired.  It would
 be a pity for this beautiful language to languish when under a
 different style of management it could be flourishing and taking
 over the world.
Thank you for the kind compliments about D. Perhaps one reason it is such a nice language is because I say "no" to most enhancements? D would have version algebra and macros if it was a committee. Some features are great ideas, until you've used them for 10 years, and come to the realization that they aren't so good of an idea. Aesthetic appeal is a big deal. D has got to look good on the screen, because after all, we spend most "programming" time just staring at the code. I remember once attending a C++ conference where the presenter had slides of his innovative ideas, and I had the thought that there was no way to format that or rewrite it so it looked good. I've had that experience many times with C++.
It's C++, aesthetic appeal isn't even on the list. :-D [...]
 The end goal for me with D is that it will no longer need me.
Wonderful! The way it's going right now, however, appears to be in the complete opposite direction.
 As for Phobos, I am not involved with it directly. There has been a
 sequence of people in charge of it, but that hasn't worked out too
 well. But there is a core team of 35 people (though some are inactive)
 that controls what goes into it:
 
 https://github.com/orgs/dlang/teams/team-phobos
 
 They have the authority to decide what goes in Phobos or not. I'm open
 to nominations to that team.
I'm on that list. ;-) But I haven't contributed for a long while now. Currently there isn't much incentive for me to do so. The barrier of entry is too high, both for contributor and reviewer -- even for a D veteran like me, if I can say so myself. The requirements are disproportionate for small changes, needless to say for big changes. And there are a lot of unstated, unwritten expectations. I don't have the energy / patience to second guess what's acceptable and what's not, when I could be writing code for my own projects instead.
 Anybody can bring attention on the n.g. to any PR that is being
 overlooked.
And they're unlikely to get any better response. All of this could be justifiable. There may be solid technical reasons behind it all. But the message that would-be contributors are getting is unfortunately not inviting more of them to join in. So this situation persists. It is what it is. T -- Change is inevitable, except from a vending machine.
Jan 09
prev sibling next sibling parent Martyn <martyn.developer googlemail.com> writes:
On Tuesday, 9 January 2024 at 21:11:39 UTC, Walter Bright wrote:
 On 1/9/2024 9:42 AM, H. S. Teoh wrote:
  From a technical
 standpoint, D has no parallels that I know of -- it comes very 
 close to
 my ideal of what a programming language should be.  But the 
 way it's
 managed leaves a lot to be desired.  It would be a pity for 
 this
 beautiful language to languish when under a different style of
 management it could be flourishing and taking over the world.
Thank you for the kind compliments about D. Perhaps one reason it is such a nice language is because I say "no" to most enhancements? D would have version algebra and macros if it was a committee. Some features are great ideas, until you've used them for 10 years, and come to the realization that they aren't so good of an idea. Aesthetic appeal is a big deal. D has got to look good on the screen, because after all, we spend most "programming" time just staring at the code. I remember once attending a C++ conference where the presenter had slides of his innovative ideas, and I had the thought that there was no way to format that or rewrite it so it looked good. I've had that experience many times with C++. For example, one of the Tango features I rejected was creating a clone of C++'s iostreams. I knew by then that iostreams was a great idea, but it just looked awful on the screen (and had some other fundamental problems). The modern consensus is that iostreams was a misuse of operator overloading. D also restricts operator overloading to discourage using it as a DSL (though Tango still managed to use it for I/O). I could go on with that, but that's enough for the moment. The end goal for me with D is that it will no longer need me. As for Phobos, I am not involved with it directly. There has been a sequence of people in charge of it, but that hasn't worked out too well. But there is a core team of 35 people (though some are inactive) that controls what goes into it: https://github.com/orgs/dlang/teams/team-phobos They have the authority to decide what goes in Phobos or not. I'm open to nominations to that team. Anybody can bring attention on the n.g. to any PR that is being overlooked.
*Of course, I personally do not want to see this split at all. This is a rather serious issue where both projects can suffer.*

With regards to the Forked project - I am just sitting on the fence to see how it turns out. It could be successful and, if so, more power to Adam and contributors. If it fails.. even badly, I will still take my hat off to them for the attempt. We do live in an (internet) age where people like to bash and put people down. I refuse to be one of those people. The same can be said of this forum on a number of occasions, and a lot of it directed towards Walter and a few others.

Coming back to Walter - I do understand his position and his comment (above) confirms that this is the right mindset whether people like it or not. D **is** a very good language and I don't think Walter should just add new things if he is not 100% committed to them. Some things could be great at the time but could be a mistake in 10 years - and D will then be stuck with it. I think the reason why I am not frustrated with certain features not making it into the language is because D has much of what I need. However, I understand that there are people who don't agree and have waited a long time for progress on a given feature with nothing as a result.

I do believe that *OpenD* will diverge from D pretty quickly, merging new features within the first 6 months. It will diverge so quickly that even if there is a chance of agreement between the two projects, they will simply be too far apart to put back together without some plan. On top of this, *OpenD* could be including a bunch of things that I personally do not care about. It could change the direction of the language itself.

This is why I am sitting on the fence. It might still serve my purposes or it (very much) won't.
Jan 10
prev sibling parent reply Max Samukha <maxsamukha gmail.com> writes:
On Tuesday, 9 January 2024 at 21:11:39 UTC, Walter Bright wrote:

 The modern consensus is that iostreams was a misuse of operator 
 overloading.
I don't know where you got the idea that there is a consensus. I've never met an unindoctrinated programmer with such a prejudice. For most people, it is natural to adapt to the fact that a name can have a different meaning in different contexts. The appeal to aesthetics doesn't work, either. Aesthetics is highly subjective and depends on the environment. In reality, some of your decisions that limit the language in order to impose your aesthetic preferences on the programmer often result in the most unaesthetic hacks I've ever seen.
Jan 10
next sibling parent Nick Treleaven <nick geany.org> writes:
On Wednesday, 10 January 2024 at 13:38:51 UTC, Max Samukha wrote:
 On Tuesday, 9 January 2024 at 21:11:39 UTC, Walter Bright wrote:

 The modern consensus is that iostreams was a misuse of 
 operator overloading.
 I don't know where you got the idea that there is a consensus.
Aren't std::print and std::format intended to supplant iostream?
Jan 10
prev sibling next sibling parent Walter Bright <newshound2 digitalmars.com> writes:
On 1/10/2024 5:38 AM, Max Samukha wrote:
 On Tuesday, 9 January 2024 at 21:11:39 UTC, Walter Bright wrote:
 
 The modern consensus is that iostreams was a misuse of operator overloading.
 I don't know where you got the idea that there is a consensus.
I know a number of leaders in the C++ community who have decades of experience with C++. Everyone thought iostreams was great in the 1980s. The luster of it wore off year by year.
Jan 12
prev sibling parent Walter Bright <newshound2 digitalmars.com> writes:
On 1/10/2024 5:38 AM, Max Samukha wrote:
 The appeal to aesthetics doesn't work, either. Aesthetics is highly subjective 
 and depends on the environment.
Aesthetics based on fashion are highly subjective, sure. There are enduring things of beauty, too. For example, there are ugly airplanes and beautiful ones. The beautiful ones tend to fly better. The lines on a modern airliner are beautiful, and none of it is the result of artists.

Speaking as an engineer, there's a consistent correlation between things that are beautiful and things that work well. It's visible everywhere - bridges, ships, turbines, rockets, even clothing. Back when I designed electronic circuits, I laid things out so they'd form neatly arranged patterns. A break in the pattern suggested a mistake. Students who created a circuit that looked like a rat's nest of wires and parts rarely got them to work.

So why not programming languages?
 In reality, some of your decisions that limit the language in order to impose 
 your aesthetic preferences on the programmer often result in the most 
 unaesthetic hacks I've ever seen.
That's correct. The idea is to nudge the programmer to find a better way. For example, version algebra in C is a rich, endless source of bugs and errors, on top of being ugly. There are much better ways to do it in C, but it's just too easy to create the ugly buggy version.
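As a concrete (hypothetical) illustration of the nudge: instead of C-style `#if defined(A) || defined(B)` scattered through the code, D's `version` statement has no boolean operators, which pushes you to compute one meaningful identifier in a single place and test only that identifier everywhere else. A minimal sketch, with made-up version names:

```d
// C-style "version algebra" (the error-prone pattern being discouraged):
//     #if defined(Windows) || (defined(Posix) && !defined(NO_THREADS))
//
// The D approach: decide once, give the decision a name, test only the name.
version (Windows)
    version = HasNativeThreads;
else version (Posix)
    version = HasNativeThreads;

version (HasNativeThreads)
{
    void startThread() { /* platform thread code */ }
}
else
{
    void startThread() { /* fallback implementation */ }
}
```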
Jan 12
prev sibling next sibling parent reply Abdulhaq <alynch4048 gmail.com> writes:
On Tuesday, 9 January 2024 at 15:01:28 UTC, H. S. Teoh wrote:

Great comment.

I love Andrei but yes, that Good Work vs Great Work thing was 
meant to be motivational but was actually just plain BS.

Andrei also teased the community with leadership-backed evolution 
of Phobos, but then he failed to show up.

A few birds-eye view observations of my own:

* D is Walter's baby and his life work. As other people come and 
go, as they are wont to do with any project such as D, he knows he 
will be left holding the baby and maintaining it. That is why he 
will not accept PRs unless they appeal to his taste and he is 
confident they are a net positive and don't complicate the whole 
thing too much. I fully expect Adam to start behaving like this 
BTW, for the same reasons.

At some point in the history of D it was realised there needed to 
be a way to make the evolution of D more democratic. This is when 
DIPs were introduced. However, Walter's reluctance to accept DIPs 
made them an infamous time sink and often a dead end. In my view 
this is not unreasonable but it's certainly annoying for would-be 
contributors.

* Languages such as D need a BDFL who spends more time managing 
and orchestrating developments than cutting their own code. 
However, it often doesn't work out like that. Walter becomes a 
bottleneck and until he steps back from CTO, he will remain so.

* In the python world, the standard library is considered to be 
the place where libraries go to die, because the API becomes 
frozen. I agree with that take, and would concentrate on having a 
good packaging system where it's easy to find the popular and well 
maintained libraries for given tasks e.g. XML, json, database 
clients etc. Forget about fighting to get stuff into Phobos, it's 
too hard and a fool's errand.
Jan 09
next sibling parent reply "H. S. Teoh" <hsteoh qfbox.info> writes:
On Tue, Jan 09, 2024 at 06:23:47PM +0000, Abdulhaq via Digitalmars-d wrote:
[...]
 A few birds-eye view observations of my own:
 
 * D is Walter's baby and his life work. As other people come and go,
 as they are wont do with any project such as D, he knows he will be
 left holding the baby and maintaining it. That is why he will not
 accept PRs unless they appeal to his taste and he is confident they
 are a net positive and don't complicate the whole thing too much. I
 fully expect Adam to start behaving like this BTW, for the same
 reasons.
Adam is already doing this. There's been a ton of proposals and ideas, and while he hasn't straight out turned down anyone yet, he *is* already warning that everything will be evaluated based on whether it brings a net positive, or will simply be too costly to be worth the effort.

There are nuances to such management, though. Take Linux for example. Linus still calls the final shots for whatever makes it into the kernel. But he knows how to delegate -- he has not a small number of trusted delegates that look after major subsystems, and he trusts them to make decisions of their own without always needing to go through him. He can still override their decision if he sees something obviously wrong, and he can (and has) reverted stuff that he felt was wrong after the fact. But the key point is that he does not demand that every decision go through him, and that's what keeps him from becoming a bottleneck.

We do have something similar in D to some extent, but nowhere near the point where Walter ceases to be a bottleneck. The whole process could use more streamlining. A LOT more.
 At some point in the history of D it was realised there needed to be a
 way to make the evolution of D more democratic. This is when DIPs were
 introduced. However, Walter's reluctance to accept DIPs made them an
 infamous time sink and often a dead end.In my view this is not
 unreasonable but it's certainly annoying for would-be contributors.
It's definitely reasonable. Walter is the one who decides what goes into D and what doesn't, so naturally he needs to be fully convinced of the value of a DIP before he will accept it. His standards are naturally high -- D being what it is, it could hardly be otherwise. Unfortunately he has also shown time and again that he often misunderstands exactly what is being proposed, and appears reluctant to take the time to understand the proposal before shooting it down. It's his prerogative, of course, but it does make working with him a rather challenging task. One that not many would-be contributors would be willing to go through.
 * Languages such as D need a BDFL who spends more time managing and
 orchestrating developments than cutting their own code. However, if
 often doesn't work out like that. Walter becomes a bottleneck and
 until he steps back from CTO, he will remain so.
And this is where we have trouble: Walter and Andrei are technical geniuses, but the way they interact with the community is, how do I put it, lackluster. This thread is a prime example: when confronted with a full-out fork of the project, any manager in charge would at least be, to put it mildly, *somewhat* concerned, and at least try to engage with the issues being presented, even if it is to disagree. However, here we have dead silence on the core dispute and instead lots of activity on a tangential technical issue. OT1H it shows where Walter's strength is -- grappling with technical issues -- but OTOH it also leaves a lot to be desired on the management side of things.
 * In the python world, the standard library is considered to be the
 place where libraries go to die, because the API becomes frozen. I
 agree with that take, and would concentrate on having a good packaging
 system where it's easy find the popular and well maintained libraries
 for given tasks e.g.  XML, json, database clients etc. Forget about
 fighting to get stuff into Phobos, it's too hard and a fool's errand.
[...] But why does it have to be this way? Why must the standard library be held to such unattainable standards that nobody but a rare few could reach it? Maybe it's time to reconsider how it is managed. Why can't it be open for the community to maintain? Delegate each major module -- std.json, std.xml, std.db (hypothetical), etc., to one or two competent people who are enthusiastic about it and who can maintain it long term, then take your hands off and just let them do their job. Micromanagement helps no one and only hurts in the long term. People need to earn their trust, it's true, but once they've earned it, they also need some room to do what they do, rather than be stifled by onerous demands or unreasonably high standards. I'm not saying this is a silver bullet that will solve all D's problems, but why not give it a try and see? T -- The diminished 7th chord is the most flexible and fear-instilling chord. Use it often, use it unsparingly, to subdue your listeners into submission!
Jan 09
next sibling parent Konstantin <kostya.hm2 gmail.com> writes:
On Tuesday, 9 January 2024 at 19:19:51 UTC, H. S. Teoh wrote:

 But why does it have to be this way?  Why must the standard 
 library be held to such unattainable standards that nobody but 
 a rare few could reach it?  Maybe it's time to reconsider how 
 it is managed.  Why can't it be open for the community to 
 maintain?  Delegate each major module -- std.json, std.xml, 
 std.db (hypothetical), etc., to one or two competent people who 
 are enthusiastic about it and who can maintain it long term, 
 then take your hands off and just let them do their job. 
 Micromanagement helps no one and only hurts in the long term.  
 People need to earn their trust, it's true, but once they've 
 earned it, they also need some room to do what they do, rather 
 than be stifled by onerous demands or unreasonably high 
 standards.
Maybe create a collection of libraries? Something like Boost for C++, which is community-driven. Some libs from Boost have, time after time, been merged into std.
Jan 09
prev sibling next sibling parent jmh530 <john.michael.hall gmail.com> writes:
On Tuesday, 9 January 2024 at 19:19:51 UTC, H. S. Teoh wrote:
 [snip]

 And this is where we have trouble: Walter and Andrei are 
 technical geniuses, but the way they interact with the 
 community is, how do I put it, lackluster.  This thread is a 
 prime example: when confronted with a full-out fork of the 
 project, any manager in charge would at least be, to put it 
 mildly, *somewhat* concerned, and at least try to engage with 
 the issues being presented, even if it is to disagree.  
 However, here we have dead silence on the core dispute and 
 instead lots of activity on a tangential technical issue.
There's a bit of "damned if you do, damned if you don't" on this point.
Jan 09
prev sibling parent reply Lance Bachmeier <no spam.net> writes:
On Tuesday, 9 January 2024 at 19:19:51 UTC, H. S. Teoh wrote:

 And this is where we have trouble: Walter and Andrei are 
 technical geniuses, but the way they interact with the 
 community is, how do I put it, lackluster.  This thread is a 
 prime example: when confronted with a full-out fork of the 
 project, any manager in charge would at least be, to put it 
 mildly, *somewhat* concerned, and at least try to engage with 
 the issues being presented, even if it is to disagree.  
 However, here we have dead silence on the core dispute and 
 instead lots of activity on a tangential technical issue.  OT1H 
 it shows where Walter's strength is -- grappling with technical 
 issues -- but OTOH it also leaves a lot to be desired on the 
 management side of things.
I'll point out that there's a third leader that nobody's expecting to interact with the community. To the point that nobody's even bringing him up or asking why he doesn't respond to the fork. He's also strong technically, but interacting with random people and building a community are not exactly his strong suit. I think the problems run deeper than Walter being slow to give feedback or being conservative about accepting language additions.
Jan 09
parent reply BlueBeach <blue.beach7052 fastmail.com> writes:
On Tuesday, 9 January 2024 at 22:04:52 UTC, Lance Bachmeier wrote:

 I'll point out that there's a third leader that nobody's 
 expecting to interact with the community.
Stupid question: Is there a page on dlang.org where people and their roles are mentioned? A who is who of the Dlang team …
Jan 09
parent bachmeier <no spam.net> writes:
On Wednesday, 10 January 2024 at 00:05:10 UTC, BlueBeach wrote:
 On Tuesday, 9 January 2024 at 22:04:52 UTC, Lance Bachmeier 
 wrote:

 I'll point out that there's a third leader that nobody's 
 expecting to interact with the community.
Stupid question: Is there a page on dlang.org where people and their roles are mentioned? A who is who of the Dlang team …
Not sure. Walter and Atila Neves are "co-maintainers" of the language. The D Language Foundation officers are here: https://dlang.org/foundation/about.html The attendees of the most recent foundation meeting are listed here: https://forum.dlang.org/post/kcokzqxwdgtwvigqcrsi forum.dlang.org And I think this is the list of everyone that can commit: https://github.com/orgs/dlang/people?page=1
Jan 09
prev sibling next sibling parent whitebyte <caffeine9999 mailbox.org> writes:
On Tuesday, 9 January 2024 at 18:23:47 UTC, Abdulhaq wrote:
 * D is Walter's baby and his life work. As other people come 
 and go, as they are wont do with any project such as D, he 
 knows he will be left holding the baby and maintaining it.
There was a similar story with Vim. Bram Moolenaar was a BDFL and was rather defensive in regards to the project direction. So eventually Vim was forked and Neovim happened. Not only does it enjoy great popularity now, but it was a great catalyst for Vim development, which was stagnating at that time.
Jan 09
prev sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 1/9/2024 10:23 AM, Abdulhaq wrote:
 * Languages such as D need a BDFL who spends more time managing and 
 orchestrating developments than cutting their own code.
The trouble is there are some coding problems that only I can resolve. For example, nobody else is crazy enough to have embedded a C compiler into D. Heck, I thought it was a crazy idea for a couple decades.

Would anyone else have implemented an ownership/borrowing system for D? It exists as a prototype in the compiler now, though it's been fallow for a bit as too many other things are happening. I know its design is controversial (Timon doesn't like it at all!), and it hasn't yet proven itself.

Many bugzilla issues get forwarded to me because nobody else seems to want to, or is able to, fix them. I've been slowly working on restructuring the front end so it is more understandable and tractable. I'm also very impressed with how far along Razvan and Dennis have come in being able to deal with difficult compiler problems.
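For readers unfamiliar with the "embedded C compiler" (ImportC): a D module can import a C source file directly, with the compiler handling the C side. A minimal sketch, with made-up file and function names:

```d
// square.c (ordinary C, compiled by the embedded C compiler):
//     int sq(int x) { return x * x; }
//
// app.d:
import square;          // the .c file becomes an importable module
import std.stdio;

void main()
{
    writeln(sq(6));     // calls the C function directly -- no hand-written bindings
}

// Build both together, e.g.:  dmd app.d square.c
```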
Jan 09
next sibling parent reply Lance Bachmeier <no spam.net> writes:
On Tuesday, 9 January 2024 at 21:56:55 UTC, Walter Bright wrote:
 On 1/9/2024 10:23 AM, Abdulhaq wrote:
 * Languages such as D need a BDFL who spends more time 
 managing and orchestrating developments than cutting their own 
 code.
The trouble is there are some coding problems that only I can resolve. For example, nobody else is crazy enough to have embedded a C compiler into D. Heck, I thought it was a crazy idea for a couple decades. Would anyone else have implemented an ownership/borrowing system for D? It exists as a prototype in the compiler now, though it's been fallow for a bit as too many other things are happening. I know its design is controversial (Timon doesn't like it at all!), and it hasn't yet proven itself.
If you contribute this work to the fork, you'll have folks to try it out and provide feedback, and you won't have to add experimental flags to the compiler or mess with the compiler at all.
Jan 09
parent reply Nick Treleaven <nick geany.org> writes:
On Tuesday, 9 January 2024 at 22:10:45 UTC, Lance Bachmeier wrote:
 If you contribute this work to the fork, you'll have folks to 
 try it out and provide feedback, and you won't have to add 
 experimental flags to the compiler or mess with the compiler at 
 all.
My impression so far is that the openD fork may diverge too far from dmd to be able to merge changes upstream. They've already changed the layout of the repositories - combining phobos and ldc into the compiler one. Maybe my git knowledge is not perfect, but it seems to have made merging openD changes back into dmd much more awkward.
Jan 10
parent reply bachmeier <no spam.net> writes:
On Wednesday, 10 January 2024 at 15:40:55 UTC, Nick Treleaven 
wrote:
 On Tuesday, 9 January 2024 at 22:10:45 UTC, Lance Bachmeier 
 wrote:
 If you contribute this work to the fork, you'll have folks to 
 try it out and provide feedback, and you won't have to add 
 experimental flags to the compiler or mess with the compiler 
 at all.
My impression so far is that the openD fork may diverge too far from dmd to be able to merge changes upstream. They've already changed the layout of the repositories - combining phobos and ldc into the compiler one. Maybe my git knowledge is not perfect, but it seems to have made merging openD changes back into dmd much more awkward.
It's hard to say at this point. A fork doesn't have to diverge terribly far from upstream. This fork would be quite helpful if it provided a way to experiment with new ideas that could be merged upstream - something that doesn't happen enough now.
Jan 10
next sibling parent Sergey <kornburn yandex.ru> writes:
On Wednesday, 10 January 2024 at 17:21:26 UTC, bachmeier wrote:
 It's hard to say at this point. A fork doesn't have to diverge 
 terribly far from upstream. This fork would be quite helpful if 
 it provided a way to experiment with new ideas that could be 
 merged upstream - something that doesn't happen enough now.
Like "experimental" or "nightly" edition... But I think OpenD doesn't have it in mind by design
Jan 10
prev sibling parent reply "H. S. Teoh" <hsteoh qfbox.info> writes:
On Wed, Jan 10, 2024 at 05:21:26PM +0000, bachmeier via Digitalmars-d wrote:
 On Wednesday, 10 January 2024 at 15:40:55 UTC, Nick Treleaven wrote:
[...]
 My impression so far is that the openD fork may diverge too far from
 dmd to be able to merge changes upstream. They've already changed
 the layout of the repositories - combining phobos and ldc into the
 compiler one.  Maybe my git knowledge is not perfect, but it seems
 to have made merging openD changes back into dmd much more awkward.
Adam merged the repos for ease of management. As far as the code itself is concerned, he's generally taking the more conservative approach of not breaking things deliberately unless there's a good reason to. Of course, this will eventually lead to irreconcilable divergence from upstream, but it won't happen overnight. The plan is to stay close to D as much as possible.
 It's hard to say at this point. A fork doesn't have to diverge
 terribly far from upstream. This fork would be quite helpful if it
 provided a way to experiment with new ideas that could be merged
 upstream - something that doesn't happen enough now.
Judging from the responses to this thread, it seems clear that the current upstream team is not interested in changing their direction. Which means that merging features back from the fork is probably not going to happen, since these features generally would be those that upstream has rejected. So I'm not holding my breath. The best that could happen in this scenario is that upstream would borrow ideas from the fork, but would write their own implementation, possibly with changes to suit their taste. It doesn't seem very likely that code from the fork would be adopted as-is by upstream. T -- PENTIUM = Produces Erroneous Numbers Thru Incorrect Understanding of Mathematics
Jan 10
parent M. M. <matus email.cz> writes:
On Wednesday, 10 January 2024 at 17:46:38 UTC, H. S. Teoh wrote:
 Judging from the responses to this thread, it seems clear that 
 the current upstream team is not interested in changing their 
 direction. Which means that merging features back from the fork 
 is probably not going to happen, since these features generally 
 would be those that upstream has rejected.  So I'm not holding 
 my breath.
Maybe... But from the discussions on the forum it rather seems that the problem is not the features being rejected but the speed at which they are discussed, and sometimes the seeming reluctance of Walter to try to understand what the community thinks (like safe for extern(C) or now the interpolated strings). Yet, taking time to accept features may be beneficial. Imagine that the first DIP of Adam and Steven had been accepted, or the original DIP1036. We would never have arrived at DIP1036e, which I hope will be accepted. (After Atila reverse engineers the implementation...)
Jan 10
prev sibling next sibling parent reply Paolo Invernizzi <paolo.invernizzi gmail.com> writes:
On Tuesday, 9 January 2024 at 21:56:55 UTC, Walter Bright wrote:
 On 1/9/2024 10:23 AM, Abdulhaq wrote:
 * Languages such as D need a BDFL who spends more time 
 managing and orchestrating developments than cutting their own 
 code.
The trouble is there are some coding problems that only I can resolve. For example, nobody else is crazy enough to have embedded a C compiler into D. Heck, I thought it was a crazy idea for a couple decades. Would anyone else have implemented an ownership/borrowing system for D? It exists as a prototype in the compiler now, though it's been fallow for a bit as too many other things are happening. I know its design is controversial (Timon doesn't like it at all!), and it hasn't yet proven itself. Many bugzilla issues get forwarded to me because nobody else seems to want to or are able to fix them. I've been slowly working on restructuring the front end so it is more understandable and tractable. I'm also very impressed with how far along Razvan and Dennis have come in being able to deal with difficult compiler problems.
Nobody is asking you to solve all the problems. I'm here for almost 20 years now, following and actively using (taking unfair advantages?) D at work.

I've the impression that things are slowly moving on, and right now it's a pivot point for D history, just like turning it open source was, or the joining of Andrei and the first book, something similar.

You, Walter, created an incredibly useful language (and a beautiful one!), so you have all my respect. It's clear in my mind how big the effort was in the past, and still is: I've followed your effort since pre D1.

A fork can revitalise D, just as Rust and the flourishing of other new modern languages shook C++. I think all the best of Adam, he is able to produce an incredible amount of code that actually DOES the job. I will not bet on the failure of OpenD; ironically, Adam is VERY pragmatic.

And pragmatism was what first attracted me to D, a pragmatic view of problems: we have a big problem right now, so let's try to find a way to resolve it in a pragmatic way.

The D programming language does not need another Kenji event. D can't lose talented contributors just for the sake of "formatting": that was an incredibly foolish example of total nonsense, and the net loss for the community was terribly high.

Pragmatically, again, D core members need to sit down (hopefully in front of a good beer!) and find an escape path. My humble suggestion, from what I see and have seen in the past:

Literally NOBODY is in the DIP 1027 camp. This means that EVERYONE is sometimes wrong, even when not convinced of it at all. There should be a mechanism that triggers in that case: "Logic clearly dictates that the needs of the many outweigh the needs of the few." How am I to contradict Spock? With this mechanism in place, DIP1036e should be merged. Yes, the same happened with safe by default.

Also, add a third person to the Walter/Atila duo, a member of the community with better focus and understanding of the attitude of the community, but also a strong tech view of D. My ideal choice would be Steven (but hey, maybe Steven is horrified by the idea), as Mike is already doing a great job in his role.

As someone said, a modest proposal.

/P
Jan 09
next sibling parent "H. S. Teoh" <hsteoh qfbox.info> writes:
On Tue, Jan 09, 2024 at 11:14:56PM +0000, Paolo Invernizzi via Digitalmars-d
wrote:
[...]
 A fork can revitalise D, as Rust, and the flourishing of other new
 modern languages shock C++ , I think all the best about Adam, he is
 able to produce an incredible amount of code that actually DO the job.
 I will not bet against the failure of OpenD, ironically Adam is VERY
 pragmatic.
If anyone hasn't tried out Adam's arsd.* libs yet, I'd highly recommend trying them out. Especially check out arsd.jni, which is so awesome it almost makes working with Java palatable to me again. As far as I'm concerned, Adam could well be the D library analogue of Kenji. If anybody can make a D fork succeed, he'd be one of the prime candidates. Personally I don't always agree with him, but there's no arguing with his results. [...]
 The D programming language does not need another Kenji event. D can't
 loose talented contributors just for the sake of "formatting": that
 was an incredible fool example of total nonsense, and the net loss for
 the community was terribly hight.
The loss of Kenji was one of untold proportions. He was one of the major drivers of D development back in the day -- we'd be discussing some hypothetical feature in the forums, debating the pros and cons, and Kenji would suddenly pop up with an implementation that addressed all concerns, had a neat, consistent design, and worked as expected. It was phenomenal. D wouldn't be halfway where it is today without him. He literally contributed about 1/2 of the entire dmd codebase when he was still with us.

To lose such a major contributor over some petty squabble about formatting is, well, I'm at a loss for words. If one can't see this tragic catastrophe for what it is, then I really don't know what else to say, it's a lost cause.
 Pragmatically, again, D core members need to sit down (hopefully in
 front of a good beer!) and find an escape path. My humble suggestion,
 from what I see and I've seen in the past.
 
 Literally NOBODY is on DIP 1027 camp. This means that EVERYONE
 sometimes is wrong, also if not convinced at all. There should be a
 mechanism that triggers in that case: “Logic clearly dictates that the
 needs of the many outweigh the needs of the few.", How am I to
 contradict Spock? With this mechanism in place, DIP DIP1036e should be
 merged. Yes, the same happened with safe by default.
Exactly. It's the same pattern repeated over and over throughout the years. It has not changed one bit. If it was once, we could call it a mistake. If it was twice, we could blame it on some thing or another. But now that it has happened repeatedly for more than 10 years, without fail, even the most stubborn among us has to admit that something isn't going right, and it's fundamental. We should not fool ourselves any longer. D isn't going to get past this blockade until something radical changes.

It's not a question of decisions that I don't like or that didn't go my way, or anybody else's way. Many design decisions could go either way, there's always pros and cons and sometimes you just have to arbitrarily pick one way or the other, and no matter which choice you make, somebody will be unhappy. That's expected, and that's not the problem here. The problem here is the persistent, recurring, and consistent act of rejecting something *without having bothered to understand what it was in the first place*, topped up with the outright unwillingness to be made to understand. I've been trying not to use the word disrespect directly, but there is really no nice way to put this. Just read the current discussion on DIP1036e in the other thread, it's there for all to see. As long as this continues, we're not getting past this blockade. That's all there is to it.

This is why I'm extremely interested in this fork, even if I don't always agree with Adam's motivations and decisions. This could be the event that will shake things up enough to break D through the blockade. It would be the biggest tragedy if D continues to languish and never reaches its full potential.

It's no longer just about DIP1036e, or safe by default (which failed over a seriously minor quibble, totally disproportionate to the benefit that we would have reaped had the leadership relented on that minor point), or a whole bunch of other technical issues. What's at stake is the long-term viability of D. I've said before and I say again: this issue here is not technical, it's social. Until something changes on that front, D is going to be stuck behind that blockade indefinitely.

OTOH, maybe that will be for the best. Let the current D continue to stagnate, and let Adam's fork thrive. Time will prove which way was the right way forward. A radical change must happen one way or another; the status quo cannot continue anymore.
 Also, add a third person to the Walter/Atila duo, a member of the
 community with better focus and understanding about the attitude of
 the community, but also a strong tech view of D.
What we sorely lack is a person with better social skills than your typical average D geek (including yours truly -- I do not exclude myself from disqualification). In spite of all our hopes to the contrary, not every problem can be solved with technical means; that's just not how things work in real life. Somebody in the leadership needs to have the social skills to interact with the community, and interact successfully, otherwise this stalemate will only be prolonged. Like it or not, that's just the harsh reality we have to face.
 My ideal choice would be Steven (but hey, maybe Steven is horrified by
 the idea), as Mike is already doing a great job in his role.
[...] If Steven would be nominated, I'd support him. ;-) OTOH, based on the reactions I'm observing on this thread, the chances of a breakthrough are rather low, even if Steven were to be added. In spite of myself I'm seeing Adam's fork as being the more promising alternative at present. Time will tell where it will lead. T -- People tell me that I'm paranoid, but they're just out to get me.
Jan 09
prev sibling next sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 1/9/2024 3:14 PM, Paolo Invernizzi wrote:
 I've the impression that things are slowly moving on, and right now it's a pivot point for D history, just like turning it open source was, or the joining of Andrei and the first book, something similar.
 
 You, Walter, created an incredibly useful language (and a beautiful one!), so you have all my respect. It's clear in my mind how big the effort was in the past, and still is: I've followed your effort since pre D1.
 
 A fork can revitalise D, just as Rust and the flourishing of other new modern languages shook C++. I think all the best of Adam, he is able to produce an incredible amount of code that actually DOES the job. I will not bet on the failure of OpenD; ironically, Adam is VERY pragmatic.
 
 And pragmatism was what first attracted me to D, a pragmatic view of problems: we have a big problem right now, so let's try to find a way to resolve it in a pragmatic way.
I appreciate your thoughts on this. One issue is that, as D has become more complex, it also is inevitably going to move more slowly. A lot of effort is needed to keep from breaking stuff and to try not to box ourselves into a corner with a feature-of-the-moment. (Autodecoding was a box we put ourselves in, arrggh.)

For example, quite recently, there was a storm on the n.g. about not fixing existing problems, but instead adding new features. We decided to stop adding new features for a while, and concentrate on backing and filling what we'd already done. For example, I posted a spec on pattern matching and option types, but that is on hold until we get some more backing and filling done.

As an example of backing and filling, we merged several PRs that re-enabled deprecated features, in order to better support older code. We've amended our mission now to do the best we can to not break existing code. It's kind of an invisible feature, it doesn't generate any excitement, its effect is just in not making people mad (!).
 The D programming language does not need another Kenji event.
There's a non-public story about what happened with Kenji. He has chosen to not leave an explanation, and I respect that by reciprocating. I hope he has done well and prospered since leaving us. P.S. I don't reject proposals just because they come from Adam. I recently approved his standalone constructor proposal, because he competently addressed all my issues with it.
Jan 09
next sibling parent "Richard (Rikki) Andrew Cattermole" <richard cattermole.co.nz> writes:
On 10/01/2024 1:29 PM, Walter Bright wrote:
     The D programming language does not need another Kenji event.
 
 There's a non-public story about what happened with Kenji. He has chosen 
 to not leave an explanation, and I respect that by reciprocating. I hope 
 he has done well and prospered since leaving us.
When it comes to leadership changes, even if the position was informal (like with Kenji), it's important to make a statement, otherwise it can lead to tensions like it has done here. This is the first time I have seen it suggested that it wasn't just a disagreement that led to him leaving. Quite often the details don't matter. What matters is the intent, and the intent up to this point has appeared to be one of being insulted.
Jan 09
prev sibling parent reply Paolo Invernizzi <paolo.invernizzi gmail.com> writes:
On Wednesday, 10 January 2024 at 00:29:38 UTC, Walter Bright 
wrote:

 I appreciate your thoughts on this. One issue is that, as D has 
 become more complex, it also is inevitably going to move more 
 slowly. A lot of effort is needed to keep from breaking stuff 
 and to try not to box ourselves into a corner with a 
 feature-of-the-moment. (Autodecoding was a box we put ourselves 
 in, arrggh.)

 For example, quite recently, there was a storm on the n.g. 
 about not fixing existing problems, but instead adding new 
 features.

 We decided to stop adding new features for a while, and 
 concentrate on backing and filling what we'd already done.
The point is not the complexity of the language, or the velocity of change. It's good that D is actually in the "let's fix stuff" phase of life. Nor is the point the eternal "break / don't break my code" war.

Contributors are having a hard life, that's the point.

You have repeated multiple times over the years that you excel in the technical field (like Andrei, or Atila), but that it's more difficult for you to handle the human side of management. That's especially true when you fall in love with your own idea: you literally start arguing with "your idea glasses" on. It's really clear in the thread about DIP1027 vs DIP1036e on SQL, you are arguing there with DIP1027 glasses on. It's not your specific fault, we are human, we sometimes behave like that.

Let's come back to pragmatism:
- everyone thinks DIP1036e is far better than DIP1027
- you are not convinced, but hey, we are human beings, maybe you are wrong.

What's the solution? Simple, recognise that! You have grown a really talented group of people, so trust them! We are talking about a "language" feature, we have Timon onboard, with a raised thumb on that, trust his judgement! There should be a way to simply trigger some procedure in such cases; put someone onboard with that role: a tap on your shoulder about that.

I will also add, regarding auto-decoding and the box we put ourselves in, that in that case there was no unanimous consensus, and real Unicode expertise was lacking to the designer at that time: it was a different story, and one that of course can be corrected.
 As an example of backing and filling, we merged several PRs 
 that re-enabled deprecated features, in order to better support 
 older code. We've amended our mission now to do the best we can 
 to not break existing code. It's kind of an invisible feature, 
 it doesn't generate any excitement, its effect is just in not 
 making people mad (!).
I will not discuss that too deeply, because I see a clear dichotomy between trying to keep things simple in the compiler, because it's growing too complex and big and really no-one understands it fully (cough cough CTFE), and trying to keep every historical feature inside it while evolving: it's a herculean effort, so I'm skeptical about that.
 The D programming language does not need another Kenji event.
There's a non-public story about what happened with Kenji. He has chosen to not leave an explanation, and I respect that by reciprocating. I hope he has done well and prospered since leaving us.
It's good to know, thank you for the clarification, that's refreshing indeed.
 P.S. I don't reject proposals just because they come from Adam. 
 I recently approved his standalone constructor proposal, 
 because he competently addressed all my issues with it.
I've no doubt about that, you are a serious and ethical person. /P
Jan 10
next sibling parent reply Guillaume Piolat <first.name gmail.com> writes:
On Wednesday, 10 January 2024 at 10:40:51 UTC, Paolo Invernizzi 
wrote:
 Let's come back to pragmatism:
 - everyone thinks DIP1036e is far better than DIP1027
This isn't strictly true. Some people like me don't care at all, don't have time to read DIPs and arguments, and trust the core team to choose for them. It's called having someone responsible for the design.

I'm in the camp of people fed up with hearing about string interpolation for the last 3 months, and all the drama surrounding it. I'd rather not have string interpolation than keep hearing people complain for months. Because this is what happened, and at this point I can very well live without variables in quotes. That's from seeing the leadership fence off bad ideas for years and years. A lot of the time, roughly the right decision was taken.

It's painful seeing people become ever more demanding of open-source projects. And I remember very well this community being against the introduction of nogc, of UDAs (there was massive backlash), of -betterC, of memory safety... including me.
Jan 10
next sibling parent reply Paolo Invernizzi <paolo.invernizzi gmail.com> writes:
On Wednesday, 10 January 2024 at 11:24:29 UTC, Guillaume Piolat 
wrote:
 On Wednesday, 10 January 2024 at 10:40:51 UTC, Paolo Invernizzi 
 wrote:
 Let's come back to pragmatism:
 - everyone thinks DIP1036e is far better than DIP1027
This isn't strictly true. Some people like me don't care at all don't have time to read DIP and arguments, and trust the core team to choose for them. It's called having someone responsible for the design. I'm in the camp of people fed up hearing about string interpolation for the last 3 months, and all the drama surrounding it. I'd rather not have string interpolation than just hearing about people complaining for months. Because this is what happened and at this point I can very well live without variables in quotes. That's from seeing the leadership fence off bad ideas since years and years. A lot of the times, about the right decision was taken. It's painful seeing people becoming ever more demanding of open-source projects. And I remember very well this community to be against introduction of nogc, of UDAs (there was massive backlash), of -betterC, of memory-safety... including me.
What was obviously meant is: everyone who cares, who has had time to read the DIPs and arguments, and who obviously trusts the core team, is in the DIP1036e camp. If you don't care, well, that's fine.

I care, for example, because it would be an improvement in my company's codebase.

But, again, the point under discussion is different, and Teoh explained it well in his posts in this thread; don't focus on the specific case of string interpolation.

/P
Jan 10
parent reply Guillaume Piolat <first.name gmail.com> writes:
On Wednesday, 10 January 2024 at 12:08:07 UTC, Paolo Invernizzi 
wrote:
 I care, for example, because that would be an improvement in my 
 company codebase.
But it's not an improvement _in the D ecosystem_ to hear shouting for 3 months (we lose lots of users this way), and lost goodwill, over a menial feature that was always possible as part of the scriptlike package 9 years ago. https://github.com/Abscissa/scriptlike/blob/master/examples/features/StringInterpolation.d

I care a lot about D, mind you, and I see no evidence that any feature is worth that kind of shouting contest. It's going to be very fun when everyone comes to agree ImportC was a good idea in the end.
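For context, here is a minimal sketch of that kind of library-level interpolation done with a mixin. This is a toy of my own, not scriptlike's actual API; the `interp` helper and the `${...}` syntax are made up for illustration:

```d
import std.conv : text;
import std.string : indexOf;

// Turns "Hello ${name}" into D code like:  "" ~ `Hello ` ~ text(name)
string interp(string s)
{
    string code = `""`;
    while (true)
    {
        auto open = s.indexOf("${");
        if (open < 0) { code ~= " ~ `" ~ s ~ "`"; break; }
        auto close = s.indexOf("}", open);
        assert(close > open, "unterminated ${...}");
        code ~= " ~ `" ~ s[0 .. open] ~ "` ~ text(" ~ s[open + 2 .. close] ~ ")";
        s = s[close + 1 .. $];
    }
    return code;
}

unittest
{
    string name = "D";
    int age = 24;
    // interp() runs at compile time; the mixin pastes the generated expression in.
    auto msg = mixin(interp("Hello ${name}, you are ${age}"));
    assert(msg == "Hello D, you are 24");
}
```

The obvious downside, and the reason people kept asking for a language feature, is that the string has to go through a mixin at every use site rather than just being written inline.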
Jan 10
next sibling parent matheus <matheus gmail.com> writes:
On Wednesday, 10 January 2024 at 13:02:55 UTC, Guillaume Piolat 
wrote:
 ... It's going to be very fun when everyone come to agree 
 ImportC was a good idea in the end.
There was a debate/contest against this? I thought most wanted an easy way to use C inside D without hassle. Sometimes I see complaints about it not being complete, though. Matheus.
Jan 10
prev sibling parent reply Paolo Invernizzi <paolo.invernizzi gmail.com> writes:
On Wednesday, 10 January 2024 at 13:02:55 UTC, Guillaume Piolat 
wrote:
 On Wednesday, 10 January 2024 at 12:08:07 UTC, Paolo Invernizzi 
 wrote:
 I care, for example, because that would be an improvement in 
 my company codebase.
But it's not an improvement _in the D ecosystem_ to hear shouts during 3 month (we loose lots of users this way), lost goodwill
Those are speculations; users come and go. The loss of Adam and a fork is a fact. The discussion is about losing _long term_ contributors, and trying to find out why, and how to keep them contributing. But that's fine, we agree to disagree.
, over a menial feature that was always possible as
 part of the scriptlike package 9 years ago.
https://github.com/Abscissa/scriptlike/blob/master/examples/features/StringInterpolation.d
I know about Nick's solution; I remind you we are using mixins too.
 I care a lot about D mind you and I see no evidence that any 
 feature is worth that kind of shouting contest. It's going to 
 be very fun when everyone come to agree ImportC was a good idea 
 in the end.
Again, the point is not about features, it's about caring about contributors. It's not a shouting contest, it's about trying to prevent people from being pushed to the point of shouting.

Let's start from a common point: we all care about D, so let's try to be positive and find out if there's a good way to move forward and solve the kind of problems that we are facing with this situation.

I honestly ask: do you have suggestions?
Jan 10
parent reply Guillaume Piolat <first.name gmail.com> writes:
On Wednesday, 10 January 2024 at 13:30:02 UTC, Paolo Invernizzi 
wrote:
 Let's start from a common point: we all care about D, let's try 
 to be positive and find out if there's a good way to move 
 forward and solve the kind of problems that we are facing with 
 this situation.
Exactly.
 I honestly ask, you have suggestions?
Yes, but I don't think I would be more relevant than what the DLF says itself.

What I observe is that we've given ample space to a discourse that fantasizes a horrible destiny for D, and sometimes to just plain impoliteness, while from where I stand all kinds of issues go away over time: 50 of my 64 Bugzilla entries have been solved (and the others don't matter), and in D industry meetings some have nothing to ask for! And everyone seems to be liking D. How do you reconcile that?

The core of the doom discourse is that somehow the core team limits D. I feel instead that the community fails to be supportive when it needs to be, and even accepts users behaving in an unprofessional way...

If it were for the common good it would be worth it, but I think we're eventually learning that putting up with bad behaviour is not worth it.
Jan 10
next sibling parent "Richard (Rikki) Andrew Cattermole" <richard cattermole.co.nz> writes:
On 11/01/2024 4:32 AM, Guillaume Piolat wrote:
 The core of the doom discourse is that somehow core team limits D, I 
 feel instead that the community fails to be supportive when it needs to 
 and even accept users behaving in a unprofessional way...
 
 If it were for the common good it would be worth it, but I think we're 
 learning eventually that putting up with bad behaviour is not worth it.
Agreed. I have been trying to foster positive growth factors in the community over the last year due to this. Negativity, especially when it is not earned, does not create growth. Trying to be positive, and encouraging is quite a change.
Jan 10
prev sibling parent Paolo Invernizzi <paolo.invernizzi gmail.com> writes:
On Wednesday, 10 January 2024 at 15:32:11 UTC, Guillaume Piolat 
wrote:
 On Wednesday, 10 January 2024 at 13:30:02 UTC, Paolo Invernizzi 
 wrote:
 Let's start from a common point: we all care about D, let's 
 try to be positive and find out if there's a good way to move 
 forward and solve the kind of problems that we are facing with 
 this situation.
Exactly.
 I honestly ask, you have suggestions?
Yes, but I don't think I would be more relevant than what the DLF say itself. What I observe is that we've given ample space to a discourse that fantasize a horrible destiny for D, and sometimes just plain impoliteness, while from where I stand all kinds of issues go away over time, 50 of my 64 Buzilla entries have been solved (and the other don't matter), and in D industry meetings some have nothing to ask for! And everyone seems to be liking D. How do you reconcile that?
Everyone likes D, and everyone would like more, not less, of the kind of work that people like Adam have contributed over the past years. My company and I would like more work like the excellent SumType inclusion in Phobos, to give you a concrete example. But again, that's OT.

I reiterate: what can we do to better handle the kind of events that resulted in what we are seeing today, the fork? I've given my suggestion: we need more people like Steven, with his attitude, as maintainers.
 The core of the doom discourse is that somehow core team limits 
 D, I feel instead that the community fails to be supportive 
 when it needs to and even accept users behaving in a 
 unprofessional way...
Maybe, but I think that the common feeling is exactly the opposite, and that's what I've observed for many, many, many years. Things have been moving in a better direction in recent months, so it seems to me that this is the right time to tackle this specific issue, as a fix seems within reach.

To be clear, I don't care whether, in the end, interpolation is merged or not; I care about seeing improvements in the way people are managed.
 If it were for the common good it would be worth it, but I 
 think we're learning eventually that putting up with bad 
 behaviour is not worth it.
I totally agree, and I reiterate: how do we avoid rousing people to that point? /P
Jan 10
prev sibling parent reply Martyn <martyn.developer googlemail.com> writes:
On Wednesday, 10 January 2024 at 11:24:29 UTC, Guillaume Piolat 
wrote:
 On Wednesday, 10 January 2024 at 10:40:51 UTC, Paolo Invernizzi 
 wrote:
 Let's come back to pragmatism:
 - everyone thinks DIP1036e is far better than DIP1027
This isn't strictly true. Some people like me don't care at all don't have time to read DIP and arguments, and trust the core team to choose for them. It's called having someone responsible for the design. I'm in the camp of people fed up hearing about string interpolation for the last 3 months, and all the drama surrounding it. I'd rather not have string interpolation than just hearing about people complaining for months. Because this is what happened and at this point I can very well live without variables in quotes. That's from seeing the leadership fence off bad ideas since years and years. A lot of the times, about the right decision was taken. It's painful seeing people becoming ever more demanding of open-source projects. And I remember very well this community to be against introduction of nogc, of UDAs (there was massive backlash), of -betterC, of memory-safety... including me.
I am surprised there was a massive backlash towards UDAs. The idea of **Attributes** that are handled at compile time sounds awesome. Maybe its initial plans had various flaws?

I am not sure when UDAs were added to D. Taking a guess, it must be pre-2016. I am sure UDAs existed when I started viewing D more seriously.

Assuming the backlash is correct, I think it is a good example of understanding what the community wanted from D at the time. I mean, if UDAs were first introduced in more recent times, I believe they would be met positively by the community. Maybe I am wrong?

If I am correct then it shows how the community categorized D back then compared to now. Rather than D being a C++ killer.. something like UDAs would be seen as more of a welcome idea. It shows how things change in 10 years. What is "cool" today might not be the case down the road.

For me, I find UDAs to be a welcome feature of D. Whether it succeeds - again, we shall see in a few years.
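For anyone who hasn't used them, a minimal sketch of what a UDA looks like in practice (the attribute type and names below are made up, not from any real library):

```d
import std.stdio;

struct Table { string name; }   // an ordinary struct used as an attribute

@Table("users")                 // attach compile-time metadata to a declaration
struct User { int id; string email; }

void main()
{
    // UDAs are purely compile-time data, read back via __traits.
    static foreach (attr; __traits(getAttributes, User))
        writeln("mapped to table: ", attr.name);
}
```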
Jan 10
next sibling parent reply zjh <fqbqrr 163.com> writes:
On Wednesday, 10 January 2024 at 12:23:32 UTC, Martyn wrote:

 If I am correct then it shows where the community categorize D 
 back then compared to now. Rather than being a C++ killer.. I 

 like UDA would be more of a welcoming idea.

 It shows how things change in 10 years. What is "cool" today 
 might not be the case down the road. For me, I find UDAs to be 
 a welcoming feature of D.


 if it succeeds. Again - we shall see in a few years.
Having an `'openD'` branch is not necessarily a bad thing. The main thing is that there should be `communication` between `'openD'` and the `'dmd'` main branch! Some features will exist in both OpenD and DMD, so OpenD should be able to provide feedback to `DMD` where appropriate, and vice versa, `DMD` can also provide `feedback` to `openD`. `'dmd'` can focus on competing with `'C++/Rust'` and others!
Jan 10
parent zjh <fqbqrr 163.com> writes:
On Wednesday, 10 January 2024 at 12:36:42 UTC, zjh wrote:


 `'dmd'` focuses on competing with `'C++/rust'` and others!
I think the existing `d` should be developed more `aggressively`. There should be a `suitable` mechanism for features to be tried and, if they fail, retired, because development of the existing `dmd` is too slow. Looking at `opend`, there is already a lot of discussion about `features`! The real problem is that `d` should expand its user base! And `d` can now focus more on competing with `rust/C++`.
Jan 10
prev sibling parent Walter Bright <newshound2 digitalmars.com> writes:
On 1/10/2024 4:23 AM, Martyn wrote:
 I am surprised there was a massive backlash towards UDAs.
I was, too.
Jan 14
prev sibling next sibling parent Walter Bright <newshound2 digitalmars.com> writes:
On 1/10/2024 2:40 AM, Paolo Invernizzi wrote:
 Let's come back to pragmatism:
 - everyone thinks DIP1036e is far better than DIP1027
 - you are not convinced, but hey, we are human beings, maybe you are wrong.
Please save that thought for the other thread.
Jan 10
prev sibling parent reply Timon Gehr <timon.gehr gmx.ch> writes:
On 1/10/24 11:40, Paolo Invernizzi wrote:
 
 Let's come back to pragmatism:
 - everyone thinks DIP1036e is far better than DIP1027
 - you are not convinced, but hey, we are human beings, maybe you are wrong.
 
 What's the solution? Simple, recognise that! You grown a really talented 
 group of people, so trust them! We are talking about a "language" 
 feature, we have Timon onboard, with a raised thumb on that, trust his 
 judgement!
To be clear, in order of importance: - I do not want DIP1027 in the language; it solves the wrong problem. - I think DIP1036e is cool and solves the right problem. Whether DIP1036e should be merged as-is or should be separated out into different features that allow solving the same problem is another question; at the moment I am mostly arguing that DIP1027 is inadequate. Walter implementing the previously rejected DIP1027 instead of engaging with Adam's new proposal that addressed DIP1027's shortcomings I think was not a great move, but I understand it is more fun to implement your own idea than to try to understand someone else's.
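(For readers new to the debate, here is a hedged sketch of the kind of usage at stake, assuming a compiler and standard library that implement the 1036-style interpolation sequences, as the OpenD implementation does; the variable names and the writeln overload that accepts the sequence are assumptions for illustration, not text from either DIP.)

```d
// A hedged sketch, not code from either DIP: what 1036-style interpolation
// looks like to the user, assuming i"" literal support in the compiler and
// a writeln overload that accepts the resulting interpolation sequence.
import std.stdio : writeln;

void main()
{
    string name = "world";
    int count = 3;
    // The literal lowers to a compile-time sequence of string fragments and
    // the interpolated expressions; the callee decides how to render them.
    writeln(i"hello $(name), you have $(count) items");
}
```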
Jan 11
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 1/11/2024 1:17 PM, Timon Gehr wrote:
 Walter implementing the previously rejected DIP1027 instead of engaging with 
 Adam's new proposal that addressed DIP1027's shortcomings I think was not a 
 great move, but I understand it is more fun to implement your own idea than to 
 try to understand someone else's.
I wrote a review of Adam's proposal some months back.
Jan 11
parent Walter Bright <newshound2 digitalmars.com> writes:
On 1/11/2024 11:02 PM, Walter Bright wrote:
 On 1/11/2024 1:17 PM, Timon Gehr wrote:
 Walter implementing the previously rejected DIP1027 instead of engaging with 
 Adam's new proposal that addressed DIP1027's shortcomings I think was not a 
 great move, but I understand it is more fun to implement your own idea than to 
 try to understand someone else's.
I wrote a review of Adam's proposal some months back.
Earlier proposals were also extensively discussed in the n.g., which I participated in.
Jan 11
prev sibling parent aberba <karabutaworld gmail.com> writes:
On Tuesday, 9 January 2024 at 23:14:56 UTC, Paolo Invernizzi 
wrote:
 On Tuesday, 9 January 2024 at 21:56:55 UTC, Walter Bright wrote:
 Nobody is asking you ...
 Literally NOBODY is ...
I don't like this style of making a point. It tends to claim more weight than it might actually have. Who's "nobody"? I'd rather you speak for yourself with "I". Let's just merge whatever DIP is available lol.
Jan 10
prev sibling parent reply GrimMaple <grimmaple95 gmail.com> writes:
On Tuesday, 9 January 2024 at 21:56:55 UTC, Walter Bright wrote:
 The trouble is there are some coding problems that only I can 
 resolve. For example, nobody else is crazy enough to have 
 embedded a C compiler into D. Heck, I thought it was a crazy 
 idea for a couple decades.
Have you ever considered that this is the case because you **deliberately** created an environment where other people simply don't want to resolve problems? Do you think that getting your changes reverted encourages anyone to try to fix anything? Why bother fixing a difficult problem if, out of the blue, you're gonna show up and just revert stuff because you "can't grep properly"?
 Would anyone else have implemented an ownership/borrowing 
 system for D? It exists as a prototype in the compiler now, 
 though it's been fallow for a bit as too many other things are 
 happening. I know its design is controversial (Timon doesn't 
 like it at all!), and it hasn't yet proven itself.
Has anyone ever **cared** about ownership/borrowing in a language that already fixed the problems that borrowing fixes? Just use the GC -- and there is no need for ownership checks. The latter part is just funny to me, because it reiterates what I said earlier: whenever it's a community-accepted solution against just you, it's a no-go. When the community is against something - you just push it in anyway.
 Many bugzilla issues get forwarded to me because nobody else 
 seems to want to or are able to fix them.
When you have the mentality of "I have the final say" -- of course nobody is gonna do anything. If you have the final say, you come up with a solution. Would you attempt to fix something knowing that your fix has a very good chance of being dismissed? I doubt it. Maybe out of enthusiasm, sure. But enthusiasm can only get you so far. After some point you just give up and find a better use for your time. Interestingly enough, being too involved in D made me somewhat afraid of making contributions at all. I was pleasantly surprised when my changes were **silently** merged into other projects despite me just dropping them out of nowhere. This is the way I think an open-source project has to work to have any form of success.
 I've been slowly working on restructuring the front end so it 
 is more understandable and tractable.
Funny you say this, because I had to update the compiler recently (to work on OpenD), and it started spitting out deprecations on my other code. And it's something that I've been complaining about for years, yet here we are. Updating the compiler even one version ahead gives me deprecations.
Jan 10
next sibling parent reply Nick Treleaven <nick geany.org> writes:
On Wednesday, 10 January 2024 at 11:01:07 UTC, GrimMaple wrote:
 On Tuesday, 9 January 2024 at 21:56:55 UTC, Walter Bright wrote:
 The trouble is there are some coding problems that only I can 
 resolve. For example, nobody else is crazy enough to have 
 embedded a C compiler into D. Heck, I thought it was a crazy 
 idea for a couple decades.
Have you ever considered that this is the case because you **deliberately** created an environment where other people simply don't want to resolve problems?
You may believe that but you can't know that your sentence is true. There's a good principle: 'Never attribute to malice that which can adequately be explained by incompetence'. It does both you and the recipient no good to insist on malice.
 Do you think that getting your changes reverted enables 
 positive thinking for trying to fix anything?
There are times when reverting things is necessary for the good of users in future, even if it upsets some people.
 Would anyone else have implemented an ownership/borrowing 
 system for D? It exists as a prototype in the compiler now, 
 though it's been fallow for a bit as too many other things are 
 happening. I know its design is controversial (Timon doesn't 
 like it at all!), and it hasn't yet proven itself.
Has anyone ever **cared** about ownership/borrowing in a language that already fixed problems that borrowing fixes? Just use the GC -- and there isn't a need for ownership checks.
Then why do people use Rust? People here use @nogc and -betterC. Some kind of ownership/borrowing system is the go-to solution for memory safety without a GC.
 Interestingly enough, being too involved in D made me somewhat 
 afraid of making contributions at all. I was pleasantly 
 surprised when my changes were **silently** merged into other 
 projects despite me just dropping them out of nowhere. This is 
 the way I see an open-source project shall be to have any form 
 of success.
OTOH, users have complained about features not being finished or not interacting with other features how they want. So it's a great thing for users when language maintainers are careful when people want to add features or break compatibility. Fortunately I think the DLF have accepted the need for editions, so compatibility won't be so much of an issue.
Jan 10
parent reply GrimMaple <grimmaple95 gmail.com> writes:
On Wednesday, 10 January 2024 at 15:19:18 UTC, Nick Treleaven 
wrote:
 You may believe that but you can't know that your sentence is 
 true. There's a good principle: 'Never attribute to malice that 
 which can adequately be explained by incompetence'. It does 
 both you and the recipient no good to insist on malice.
I am not "believing" that, I am __seeing__ that. See those for example https://github.com/dlang/dmd/pull/10460 -- Reverts commit without even notifying the person who did that commit https://github.com/dlang/dmd/pull/9881 -- no reason for reverting is given at all https://github.com/dlang/dmd/pull/9880 -- reverted because it breaks something that has nothign to do with DMD in the first place. Perfectly reasonable argument by Adam is simply ignored. https://github.com/dlang/dmd/pull/12828 -- this is the saddest thing that personally broke my heart to see. A community consensus was reached, Walter reverts anyway. One of the important contributors leave.
 There are times when reverting things is necessary for the good 
 of users in future, even if it upsets some people.
Unfortunately, we are talking about _everyone_. Not just some people.
 Then why do people use Rust? People here use @nogc and 
 -betterC. Some kind of ownership/borrowing system is the go-to 
 solution for memory-safety without a GC.
Because people cared, they created Rust. On the other hand, all of that is up for removal in OpenD (at least we are actively discussing that)
 OTOH, users have complained about features not being finished 
 or not interacting with other features how they want. So it's a 
 great thing for users when language maintainers are careful 
 when people want to add features or break compatibility. 
 Fortunately I think the DLF have accepted the need for 
 editions, so compatibility won't be so much of an issue.
This is not the point being argued. The point is, Walter demands perfection when it's someone else's work, yet lets his own subpar code slip in all the time.
Jan 10
next sibling parent reply Nick Treleaven <nick geany.org> writes:
On Wednesday, 10 January 2024 at 15:56:27 UTC, GrimMaple wrote:
 I am not "believing" that, I am __seeing__ that. See those for 
 example
 https://github.com/dlang/dmd/pull/10460 -- Reverts commit 
 without even notifying the person who did that commit
 https://github.com/dlang/dmd/pull/9881 -- no reason for 
 reverting is given at all
 https://github.com/dlang/dmd/pull/9880 -- reverted because it 
 breaks something that has nothing to do with DMD in the first 
 place. Perfectly reasonable argument by Adam is simply ignored.
 https://github.com/dlang/dmd/pull/12828 -- this is the saddest 
 thing that personally broke my heart to see. A community 
 consensus was reached, Walter reverts anyway. One of the 
 important contributors leave.
I looked at the last one, that doesn't prove malice. (As the first one I looked at wasn't clearly malice I didn't bother looking at the others). I agree it would be nice if there was a brief reason given in the description, but at least the reason was posted in a comment. No one is perfect every day.
 There are times when reverting things is necessary for the 
 good of users in future, even if it upsets some people.
Unfortunately, we are talking about _everyone_. Not just some people.
Please don't exaggerate. Probably most D users don't even post on the forum. And what about future users?
 Then why do people use Rust? People here use @nogc and 
 -betterC. Some kind of ownership/borrowing system is the go-to 
 solution for memory-safety without a GC.
Because people cared, they created Rust. On the other hand, all of that is up for removal in OpenD (at least we are actively discussing that)
The fact people here use -betterC refutes your point that D users are not interested in avoiding the GC. Please don't make that point in future.
 OTOH, users have complained about features not being finished 
 or not interacting with other features how they want. So it's 
 a great thing for users when language maintainers are careful 
 when people want to add features or break compatibility. 
 Fortunately I think the DLF have accepted the need for 
 editions, so compatibility won't be so much of an issue.
This is not the point that is being argued. The point is, Walter demands perfection when it's someone else, yet allows his subpar code slip in all the time.
If you mean @live, that's under a preview switch. If you mean importC, that's a compiler feature, not part of the D language.
Jan 10
parent Walter Bright <newshound2 digitalmars.com> writes:
On 1/10/2024 8:15 AM, Nick Treleaven wrote:
 If you mean @live, that's under a preview switch. If you mean importC, that's a 
 compiler feature, not part of the D language.
Interestingly, I have begun adopting an O/B style in my own coding. It has a lot of merit, even for GC code, as it makes code more understandable. I know well that O/B isn't a solution for every problem, which is why Rust has an "unsafe" mode. I also have no intention of breaking everyone's code by introducing O/B. That's why it's restricted to @live functions.
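(A minimal sketch of what that opt-in checking looks like, based on the spec's Ownership/Borrowing chapter rather than on any code from this thread; the function and variable names are made up, and exact diagnostics vary by compiler version.)

```d
// Hypothetical example: @live enables ownership/borrowing checks inside
// this one function only; the rest of the program is unaffected.
import core.stdc.stdlib : free, malloc;

@live void consume(int* p)
{
    free(p);     // the owner p is consumed here
    // *p = 1;   // would be rejected inside a @live function: p was already consumed
}

void main()
{
    int* q = cast(int*) malloc(int.sizeof);
    consume(q);  // main is not @live, so no checks apply here
}
```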
Jan 10
prev sibling parent Walter Bright <newshound2 digitalmars.com> writes:
On 1/10/2024 7:56 AM, GrimMaple wrote:
 I am not "believing" that, I am __seeing__ that. See those for example
 https://github.com/dlang/dmd/pull/10460 -- Reverts commit without even
notifying 
 the person who did that commit
Scroll down and look at the last entry.
Jan 10
prev sibling parent Walter Bright <newshound2 digitalmars.com> writes:
On 1/10/2024 3:01 AM, GrimMaple wrote:
 When you have the mentality of
Rudeness to any forum member is not acceptable here.
 Funny you say this, because I had to update the compiler recently (to work on 
 OpenD), and it started spitting out deprecations on my other code. And it's 
 something that I've been complaining about for years, yet here we are. Updating 
 the compiler even one version ahead gives me deprecations.
Please post what they are and I will address them.
Jan 10
prev sibling next sibling parent reply GrimMaple <grimmaple95 gmail.com> writes:
On Tuesday, 9 January 2024 at 15:01:28 UTC, H. S. Teoh wrote:
 It's not just the slow procesing of PRs. It's the arbitrary 
 shutdown of contributions after months, often years, of 
 silence, with no prior warning and no regard of the amount of 
 work put into maintaining said PRs over that length of time. 
 And often while totally misunderstanding what was actually 
 done, as is being shown right at this moment with the 
 discussion on DIP 1036e.
Though I don't have a long-standing track record of contributing to D, my futile attempts to even get anywhere were so painful I just stopped bothering very quickly. It always sucks to get criticized, but it sucks extra when you're not even told what to do with this criticism. Most of the time, the D community seems to try to look like it cares about the end user, but in reality (this thread is a confirmation) it's just pointless bantering about something the end user doesn't care about. More importantly, this pointless bantering is used as a scapegoat to shut down changes that the DLF doesn't want/like, yet the DLF itself never bothers with breakage if it implements some kewl new feature that nobody asked for. As a result, the string interpolation that works very well right now in opend can't get into upstream D because someone just doesn't want to accept it. On the other hand, the same someone has spent an entire year on a feature that still doesn't work even remotely well.
 It's fascinating that Walter did not relate to this at all, but 
 readily jumped into a lengthy technical discussion of a
 tangentially related topic.
Which is why a fork was created in the first place.
 I dont see how a D fork managed by Ruppe could be managed in a 
 better way
Actually, Adam might be the only person who is genuinely interested in getting things done. He is very quick to answer and __never__ gets dragged into pointless philosophical discussions.
Jan 09
parent reply BlueBeach <blue.beach7052 fastmail.com> writes:
On Tuesday, 9 January 2024 at 19:32:10 UTC, GrimMaple wrote:
 On Tuesday, 9 January 2024 at 15:01:28 UTC, H. S. Teoh wrote:
 […] More importantly, this pointless bantering is used
as a scapegoat to shut down changes that DLF doesn't want/like, yet DLF itself never bothers with breakage if it implements some kewl new feature that nobody asked for.
There is the vision document from Mike Parker (https://github.com/dlang/vision-document). I think it’s a noble attempt and could be a starting point for improvements. But I’m interested in your opinion. Were you involved somehow? Do you see any of your concerns addressed in it?
Jan 09
parent GrimMaple <grimmaple95 gmail.com> writes:
On Tuesday, 9 January 2024 at 20:02:40 UTC, BlueBeach wrote:
 There is the vision document from Mike Parker 
 (https://github.com/dlang/vision-document)

 I think it’s noble attempt and could be starting point for 
 improvements. But I’m interested in your opinion.

 Where you involved somehow?
 Do you see any of your concerns somehow addressed?
I think I accidentally kick-started that with one of my rants on the Discord. I definitely wasn't the cause, but maybe just the final straw. It is a good read; however, two years have passed and you see literally zero development. Nothing has improved in reality. It's all just talk, no action. I had my scepticism, but now I'm 95% sure that none of it is going to be implemented; it's just gonna rot as a historical artifact on the internet.
Jan 09
prev sibling parent karitoy <karitoy gmail.com> writes:
On Tuesday, 9 January 2024 at 15:01:28 UTC, H. S. Teoh wrote:
 On Tue, Jan 09, 2024 at 02:02:36PM +0000, BlueBeach via 
 Digitalmars-d wrote:
 [...]
See: http://dpldocs.info/this-week-in-d/Blog.Posted_2024_01_01.html [...]
I agree wholeheartedly!!!
Jan 10
prev sibling next sibling parent bomat <Tempest_spam gmx.de> writes:
On Tuesday, 9 January 2024 at 08:45:03 UTC, whitebyte wrote:
 On Tuesday, 2 January 2024 at 17:55:56 UTC, GrimMaple wrote:
 Hello everyone!

 Growing greatly dissatisfied with how things are in the D 
 Programming Language, we decided it is time to fork it.
It's fascinating that Walter did not relate to this at all, but readily jumped into a lengthy technical discussion of a tangentially related topic.
This. Sorry for the lack of content in this post, I just wanted to stress my astonishment as well.
Jan 09
prev sibling next sibling parent Walter Bright <newshound2 digitalmars.com> writes:
On 1/9/2024 12:45 AM, whitebyte wrote:
 It's fascinating that Walter did not relate to this at all
A great strength of D is that it is deliberately constructed so that anyone can fork it at any time for any reason. I'm really pleased that we were finally able to get the back end Boost licensed, too. That means the D compiler has the least restrictive license in the world! I did not dive into this as I thought it reasonable for people to make up their own minds without myself muddying the waters.
Jan 09
prev sibling parent reply Les De Ridder <dlang lesderid.net> writes:
On Tuesday, 9 January 2024 at 08:45:03 UTC, whitebyte wrote:
 On Tuesday, 2 January 2024 at 17:55:56 UTC, GrimMaple wrote:
 Hello everyone!

 Growing greatly dissatisfied with how things are in the D 
 Programming Language, we decided it is time to fork it.
It's fascinating that Walter did not relate to this at all, but readily jumped into a lengthy technical discussion of a tangentially related topic.
This should not have happened. This is my first time reading the D newsgroup in years, and the first thing I see is a post about a fork that quickly turned into an off-topic discussion that Walter decided to engage in. I realised he ended up forking it into a new thread, but it just shouldn't have happened in the first place. Good luck to both projects. I wonder what my go-to compiled language will be in a couple years.
Jan 16
next sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 1/16/2024 12:49 PM, Les De Ridder wrote:
 Good luck to both projects. I wonder what my go-to compiled language
 will be in a couple years.
Steel only gets tempered with fire and hammer :-)
Jan 16
parent reply Salih Dincer <salihdb hotmail.com> writes:
On Wednesday, 17 January 2024 at 00:44:06 UTC, Walter Bright 
wrote:
 On 1/16/2024 12:49 PM, Les De Ridder wrote:
 Good luck to both projects. I wonder what my go-to compiled 
 language
 will be in a couple years.
Steel only gets tempered with fire and hammer :-)
What do they expect from Walter? To take the hammer and forge a piece of iron (OpenD) in plain view? Walter has a responsibility; there is a foundation he heads. Undoubtedly, this project will also contribute to D, but D will live within the foundation forever. SDB 79
Jan 20
parent reply Danilo <codedan aol.com> writes:
D is Walter's invention, his baby and life-time project. It's his 
legacy to this world.

It's what Walter Bright will be remembered for in the future 
world, after his death, on Wikipedia etc.
His contribution to the world of programming languages and 
compilers.

Some people just don't get it. Some specific people will not be 
remembered at all, nobody cares if they ever lived.
They didn't contribute anything useful to the world at all, so 
they will be forgotten forever.

It is how it is.
Jan 20
next sibling parent reply Danilo <codedan aol.com> writes:
On Sunday, 21 January 2024 at 04:45:57 UTC, Danilo wrote:
 D is Walter's invention, his baby and life-time project. It's
Walter Bright will probably be in the league of Knuth and Niklaus Wirth when it comes to contributions to compilers and programming languages. Nobody will care about 'Ruppe', 'Danilo', and 'grim'...
Jan 20
parent claptrap <clap trap.com> writes:
On Sunday, 21 January 2024 at 05:13:54 UTC, Danilo wrote:
 On Sunday, 21 January 2024 at 04:45:57 UTC, Danilo wrote:
 D is Walter's invention, his baby and life-time project. It's
Walter Bright will probably be in the league of Knuth and Niklaus Wirth, when it comes to the contribution to compilers and programming languages. Nobody will care about 'Ruppe', 'Danilo', and 'grim'...
Should we all chip in and build him a statue? Maybe Walter standing on top of a pile of old computers, holding a shining D-Man aloft, maybe with everyone else prostrate on the ground beneath him? I'm sure he'd love that ;-)
Jan 21
prev sibling parent matheus <matheus gmail.com> writes:
On Sunday, 21 January 2024 at 04:45:57 UTC, Danilo wrote:
 D is Walter's invention, his baby and life-time project. It's 
 his legacy to this world.

 It's what Walter Bright will be remembered for in the future 
 world, after his death, on Wikipedia etc.
 His contribution to the world of programming languages and 
 compilers.
I think that outside the programming world, most people actually know him because of Empire (https://www.classicempire.com) =] Matheus.
Jan 21
prev sibling parent reply cc <cc nevernet.com> writes:
On Tuesday, 16 January 2024 at 20:49:51 UTC, Les De Ridder wrote:
 Good luck to both projects. I wonder what my go-to compiled 
 language
 will be in a couple years.
There is a ~99.9% chance mine will end in the letter "D".
Jan 19
parent Dom DiSc <dominikus scherkl.de> writes:
On Friday, 19 January 2024 at 09:25:00 UTC, cc wrote:

 There is a ~99.9% chance mine will end in the letter "D".
RusD? (sorry, couldn't resist :-)
Jan 19
prev sibling parent reply "H. S. Teoh" <hsteoh qfbox.info> writes:
On Tue, Jan 09, 2024 at 07:32:10PM +0000, GrimMaple via Digitalmars-d wrote:
 On Tuesday, 9 January 2024 at 15:01:28 UTC, H. S. Teoh wrote:
 It's not just the slow procesing of PRs. It's the arbitrary shutdown
 of contributions after months, often years, of silence, with no
 prior warning and no regard of the amount of work put into
 maintaining said PRs over that length of time. And often while
 totally misunderstanding what was actually done, as is being shown
 right at this moment with the discussion on DIP 1036e.
Though I don't have a long-standing track record of contributing to D, my futile attempts to even get anywhere were so painful I just stopped bothering very quickly. It always sucks to get criticized, but it sucks extra when you're not even told what to do with this criticism.
Well, I *do* have a long track record of contributing to D. I've had not a small number of PRs merged into Phobos and a few in DMD. I was even given commit access to Phobos (and still have it, though I haven't used it in years) and for a time put a lot of effort into reviewing and merging PRs. It's not so much the criticism that's the problem -- in any technical project criticism is good and necessary for high-quality work, as long as the criticism is justified. It's not even the slow processing of PRs: as an "insider", I see (or at least, used to see -- I haven't been involved recently, so my information is probably outdated) the other side of the situation. As a volunteer offering my precious little free time to help D, I simply don't have the time/energy to review big complex PRs. Neither do I have the confidence to review PRs involving areas that I'm not an expert in, such as numerical algorithms. As such, I tend to avoid reviewing PRs that either (1) require too much time/effort, or (2) are outside my area of expertise. I'm sure I'm not the only one who feels this way among the Phobos committers. So the total effect of this is that large PRs get neglected in the queue, along with PRs that contain controversial changes (as a volunteer, I obviously do not want to waste time merging a PR only to have Walter or Andrei revert it later) or technicalities requiring very specific expertise that only one or two (or sometimes zero) people have. All of this stems from a shortage of manpower, but that in itself is solvable and isn't the core of the problem -- you just draw more willing contributors and sooner or later one of them will have the needed skills to do the job. Or you gain enough manpower to offload the routine maintenance tasks. The troubles begin when you're losing more contributors than you're gaining. And losing them because of things like somebody in high standing appearing out of nowhere and dropping on you like a ton of bricks because you didn't meet requirements A, B, C, D, none of which were communicated to you prior to your involvement. Worse, when said person fails to elucidate just what exactly A, B, C, D are beyond some vague, non-actionable hand-waving. And arbitrarily reverting work you've poured hours into, even though prior to that they were unresponsive when asked for feedback and did not make it clear exactly what was expected of you. And to add salt to the wound, when you've poured even more countless hours into fixing up your PR to meet the requested changes, only to have the reviewer go MIA (or only give one-word responses) for months on end, and then come back later to say no, it's still not good enough, here are additional requirements E, F, G -- none of which were mentioned as issues before. And to top it all off, when the person rejecting the work does not fully understand what has been done, and clearly has not bothered to try to understand it (because if they did, they wouldn't say the things they did or continue to repeat wrong claims which have already been debunked, often multiple times). In a commercial enterprise where people are paid to do the work and have to listen to you no matter what, such an approach may not have been as disastrous. (Well OK, people will quit, but at least some would stay just because of the money.) But when you're dealing with volunteers burning up their free time in hopes of contributing to what they feel is a worthy cause, is it any surprise that people rapidly lose interest in contributing further?
And when even longstanding contributors after years of involvement decide to call it quits, and when this happens not once or twice, but in a continual, recurring pattern throughout the history of D, then you really gotta wonder just *what* is going on.
 Most of the time, D community seems to try and look like it cares
 about the end user, but in reality (this thread is a confirmation)
 it's just pointless bantering about something that the end user
 doesn't care about. More importantly, this pointless bantering is used
 as a scapegoat to shut down changes that DLF doesn't want/like, yet
 DLF itself never bothers with breakage if it implements some kewl new
 feature that nobody asked for.
I wouldn't go so far as to say that the DLF doesn't care about the end user. I'm sure they do, as most of them are users themselves. The problem is more with the perfectionist ideal of letting the perfect be the enemy of the good. Contributions that are not 100% perfect are rejected, even when they're already good enough for the majority of use cases. Rather than have a solution that's good enough for the time being, we would rather have no solution at all until the ideal arrives. If it ever arrives at all. It's that, plus the problems with communication. Or the lack thereof. IOW, the core issue here isn't technical -- we have no problem with technical issues, Walter is an expert on that -- it's social. On Tue, Jan 09, 2024 at 08:02:40PM +0000, BlueBeach via Digitalmars-d wrote:
 On Tuesday, 9 January 2024 at 19:32:10 UTC, GrimMaple wrote:
 On Tuesday, 9 January 2024 at 15:01:28 UTC, H. S. Teoh wrote:
 […] More importantly, this pointless bantering is used
as a scapegoat to shut down changes that DLF doesn't want/like, yet DLF itself never bothers with breakage if it implements some kewl new feature that nobody asked for.
There is the vision document from Mike Parker (https://github.com/dlang/vision-document). I think it’s a noble attempt and could be a starting point for improvements. But I’m interested in your opinion.
[...] Over the years, there have been countless such "vision documents" and other similar things. The more time goes on, the more skeptical I have become that they have any real impact on anything. They sound just like the typical corporate motivational "visions" that have lots of buzzwords but scanty in actual, actionable details. You know, the kind of pep talk that your CEO would give at company-wide meetings where he would proudly announce "for this upcoming year, our company slogan will be $buzzword1, $buzzword2 and $buzzword3. We're doing awesome, and we're going to be even more awesome by leveraging $buzzword1 to $buzzword2 the $buzzword3 and achieve even higher levels of $buzzword4 which will blah blah blah ... PROFIT!". Everybody claps their hands as the soap opera episode^W^W^Wcompany meeting comes to an end: the strawman has been defeated and everything has reset to the status quo as at the beginning of the show. The next day will be business as usual. I wish things were different, but our track record isn't looking very promising right now. I'd much rather look at actual work that's being done, than such documents that we're not actually acting on. // Now, to be fair, LOTS of work has been done in D over the past years. Don't get me wrong, it isn't as though D has come to a standstill. There's still lots of good stuff in D, and they're still trickling in. (I wish they were pouring in, but I'll settle with a trickle.) Unfortunately, the fundamental issues like I described above remain unsolved, and from all appearances, unlikely to be solved. We excel at solving technical issues, social issues not so much. I'm not holding my breath. T -- Кто везде - тот нигде.
Jan 09
next sibling parent zjh <fqbqrr 163.com> writes:
On Tuesday, 9 January 2024 at 21:26:30 UTC, H. S. Teoh wrote:

We excel at solving
 technical issues, social issues not so much.  I'm not holding 
 my breath.


 T
Yes, all these high demands have driven out D contributors. We don't need to be absolutely right from the beginning; we need to meet countless needs from the beginning. And many proposals and requirements are killed before we even try them out. Prohibition after prohibition, so contributors and users all ran away. The simplest example: C++ users want a `private` that is private to the class. But no, you just can't have it, because it doesn't fit D's sense of perfection? The problem could be solved simply by adding a switch! They just don't want your users to be satisfied. So the users ran, and the contributors ran!
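(An illustrative sketch of the difference being complained about; the module, class, and function names are invented for the example. In D, `private` means private to the module, so code elsewhere in the same module can still touch the member; there is no C++-style class-only private.)

```d
// counter.d -- hypothetical module, written only to illustrate the point above
module counter;

class Counter
{
    private int count;   // private to the *module*, not to the class
}

void bump(Counter c)
{
    ++c.count;           // compiles: same module, so the access is allowed
}

void main()
{
    auto c = new Counter;
    bump(c);
    assert(c.count == 1);
}
```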
Jan 09
prev sibling next sibling parent reply zjh <fqbqrr 163.com> writes:
On Tuesday, 9 January 2024 at 21:26:30 UTC, H. S. Teoh wrote:
 Now, to be fair, LOTS of work has been done in D over the past 
 years. Don't get me wrong, it isn't as though D has come to a 
 standstill. There's still lots of good stuff in D, and they're 
 still trickling in. (I wish they were pouring in, but I'll 
 settle with a trickle.) Unfortunately, the fundamental issues 
 like I described above remain unsolved, and from all 
 appearances, unlikely to be solved. We excel at solving 
 technical issues, social issues not so much.  I'm not holding 
 my breath.


 T
Yes, all these high demands have driven out D contributors. We don't need to be absolutely right from the beginning; we need to meet countless needs from the beginning. And many proposals and requirements are killed before we even try them out. Prohibition after prohibition, so contributors and users all ran away. The simplest example: C++ users want a `private` that is private to the class. But no, you just can't have it, because it doesn't fit D's sense of perfection? The problem could be solved simply by adding a switch! They just don't want your users to be satisfied. So the users ran, and the contributors ran!
Jan 09
parent reply zjh <fqbqrr 163.com> writes:
On Wednesday, 10 January 2024 at 01:56:48 UTC, zjh wrote:
 So the users ran, and the contributors ran!
Users! Meeting their needs is fundamental! A language without users has no future!
Jan 09
parent zjh <fqbqrr 163.com> writes:
On Wednesday, 10 January 2024 at 02:04:00 UTC, zjh wrote:

 Users! Meeting their needs is fundamental!
 A language without users has no future!
Take `'dip1027'` or `'dip1036'`: simply add a switch and let users try it first, `discovering and solving` problems during use; isn't that more useful than discussing it over and over again? If you love `'dip1027'`, you can use it. If you love `'dip1036'`, it can also get a switch. When everyone agrees one is okay, then whichever has more users becomes the default! Spending `too much time` discussing how to make it perfect? No, you don't need to. You can gradually make it perfect `during use`! What the D community needs most is to `increase users`, not a `perfect language`! As long as a feature can bring `a large number of users`, it can be added, such as the class-level `private feature` of C++, serving C++ users!
Jan 09
prev sibling next sibling parent aberba <karabutaworld gmail.com> writes:
On Tuesday, 9 January 2024 at 21:26:30 UTC, H. S. Teoh wrote:
 On Tue, Jan 09, 2024 at 07:32:10PM +0000, GrimMaple via 
 Digitalmars-d wrote:
 [...]
Well, I *do* have a long track record of contributing to D. I've had not a small number of PRs merged into Phobos and a few in DMD. I was even given commit access to Phobos (and still have it, though I haven't used it in years) and for a time put a lot of effort into reviewing and merging PRs. [...]
Pretty much how it was 3 years ago. I even wrote about the leadership lacking the soft skills needed to grow the community [1](https://aberba.com/posts/2020-12-09-why-i-still-use-d). Unfortunately not much has changed. I, among others, used to complain a lot about the above-mentioned issues, but it only ended up stressing us all out lol.
Jan 10
prev sibling parent reply a11e99z <black80 bk.ru> writes:
what should be done with D?

1) /for any future path of D/
drop DMD entirely and develop only LDC. point.
LLVM has:
- code generation/optimization: nobody should have to spend precious 
time fiddling with register allocation and instruction generation.
- RT code generation for any script, JIT or something.
- dozens attributes and internal asm for any platform.
- best interop with C and C++ through clang/AST.
- many other interops through LLVM-IR with any other languages 
that can generate it.
You'd have to be a fool to reject this.
- minus: too-slow code generation. Well, Zig decided to generate 
LLVM binary IR without the LLVM framework/tools, and to optimize the 
compiler itself rather than the generated code.

2) allow fat pointers for GC refs and a class userdata field.
Let people enjoy inventing something useful, or just integrate 
Go's GC or Nim's ORC.

3) @nogc std lib.

/my C++ vision of D dev/ C++ is good but a little ugly. Current D 
has become the same.

10) simplify D: too many keywords for almost nothing. WTF copy 
constructor (a sketch of this boilerplate follows after this list):
this(ref return scope const A rhs) { }

vs. something like:
 returns(scoped | other_flags)

11) getting closer to C++: use C++ types, call C++ 
ctors/dtors/operators, const/refs the same as in C++, const char* const ptr 
and const T&, catch C++ exceptions, etc.
There are millions of libs in C++, and you drop them for abandoned/dead 
manual D wrappers - Detroit for packages.
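(For reference, a minimal sketch of the copy-constructor boilerplate item 10 complains about; the struct and field names are invented for the example, and the signature follows the form quoted above.)

```d
// Hypothetical struct, only to show the full qualifier soup on a copy constructor.
struct A
{
    int[] data;

    // copy constructor: ref + return + scope + const, as in the signature above
    this(ref return scope const A rhs)
    {
        data = rhs.data.dup;   // deep-copy so the two copies don't share storage
    }
}

void main()
{
    A a;
    a.data = [1, 2, 3];
    auto b = a;                // invokes the copy constructor
    b.data[0] = 42;
    assert(a.data[0] == 1);    // a is unaffected by changes to b
}
```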
Jan 11
next sibling parent reply a11e99z <black80 bk.ru> writes:
On Thursday, 11 January 2024 at 17:40:50 UTC, a11e99z wrote:
 what should be done with D?
12) drop betterC - Zig/Vlang have already won this race. Develop importC, because then you don’t have to write wrappers.
Jan 11
parent reply zjh <fqbqrr 163.com> writes:
On Thursday, 11 January 2024 at 17:54:24 UTC, a11e99z wrote:

 12) drop betterC - Zig/Vlang already won this race.
If you give up on `'betterC'`, then 'D' will lose a significant portion of its users! Now with `OpenD`, anyone who likes the `GC` can go there. Anyone who doesn't like the `'GC'` can stay behind and even make `'@nogc'` the default!
Jan 11
parent reply zjh <fqbqrr 163.com> writes:
On Friday, 12 January 2024 at 03:06:55 UTC, zjh wrote:

 Now with `OpenD`, anyone who likes `GC` can go there.
 Anyone who doesn't like `'GC'` can stay behind and even make 
 `'@nogc'` the default!
Let `'openD'` compete with the `'GC'` series of languages! Let Dmd focus on competing with `C++/Rust`!
Jan 11
parent zjh <fqbqrr 163.com> writes:
On Friday, 12 January 2024 at 03:08:25 UTC, zjh wrote:

 Let `'openD'` compete with the `'GC'` series of languages!
 Let Dmd focus on competing with `C++/Rust`!
You can even use both `'openD'` and `'dmd'` at the same time.
Jan 11
prev sibling next sibling parent Timon Gehr <timon.gehr gmx.ch> writes:
On 1/11/24 18:40, a11e99z wrote:
 what should be done with D?
 
 1) /for any future path of D/
 drop DMD entirely and develop only LDC.
DMD is much easier to build so as things are it is worth keeping even if only for the sake of compiler development.
 ...
 - minus: too-slow code generation.
It's a big one.
 well, Zig decides to generate 
 LLVM-binary-IR w/o LLVM framework/tools. and need to optimize compiler 
 itself not generated code.
 ...
That seems potentially interesting, though an additional step to transform the generated LLVM IR into machine code is certainly more expensive than outputting machine code directly. I am sure the Zig developers must be aware of this and are planning to do their own native backends for common architectures as well? Though I hear that LDC spends a lot of time on IR validation to detect bugs by default and that disabling it using --disable-verify can increase LDC build speeds significantly.
Jan 11
prev sibling next sibling parent reply ryuukk_ <ryuukk.dev gmail.com> writes:
On Thursday, 11 January 2024 at 17:40:50 UTC, a11e99z wrote:
 what should be done with D?

 1) /for any future path of D/
 drop DMD entirely and develop only LDC. point.
 - minus: to slow code generation. well, Zig decides to generate 
 LLVM-binary-IR w/o LLVM framework/tools. and need to optimize 
 compiler itself not generated code.
A mistake. DMD is D's best asset; it provides very fast compilation. I have tried plenty of languages, and it is the main reason why I stick with D: no other language can compete, they all depend on LLVM. Go read this if you want to wake up from your illusions: https://kristoff.it/blog/zig-new-relationship-llvm/ https://github.com/ziglang/zig/issues/16270 TLDR: Zig will do like D and maintain their own backend, in order to provide faster compilation for their debug builds (and, very far in the future, release builds as well), just like DMD. How woken up do you feel now? D is leading in that area; one shall not give it up.
Jan 11
parent reply Hors <q q.com> writes:
On Thursday, 11 January 2024 at 21:52:58 UTC, ryuukk_ wrote:
 On Thursday, 11 January 2024 at 17:40:50 UTC, a11e99z wrote:
 [...]
mistake DMD is D's best asset, provides very fast code compilation, i have tried plenty of language, and it is the main reason why i stick with D, no other language can compete, they all depend on LLVM and go read that if you want to wake up from your disillusion: https://kristoff.it/blog/zig-new-relationship-llvm/ https://github.com/ziglang/zig/issues/16270 TLDR: zig will do like D, and will maintain their own backend, in order to provide faster compilation for their debug builds (and very far in the future, release as well), just like DMD how woken up do you feel now? D is leading in that area, one shall not give it up
People are forgetting that D has an extremely small community; it is very unlikely the community can maintain two different compilers (without compiler bugs). Maybe sometime in the future, but not now.
Jan 11
parent reply Paolo Invernizzi <paolo.invernizzi gmail.com> writes:
On Friday, 12 January 2024 at 06:51:43 UTC, Hors wrote:
 On Thursday, 11 January 2024 at 21:52:58 UTC, ryuukk_ wrote:
 On Thursday, 11 January 2024 at 17:40:50 UTC, a11e99z wrote:
 [...]
mistake DMD is D's best asset, provides very fast code compilation, i have tried plenty of language, and it is the main reason why i stick with D, no other language can compete, they all depend on LLVM and go read that if you want to wake up from your disillusion: https://kristoff.it/blog/zig-new-relationship-llvm/ https://github.com/ziglang/zig/issues/16270 TLDR: zig will do like D, and will maintain their own backend, in order to provide faster compilation for their debug builds (and very far in the future, release as well), just like DMD how woken up do you feel now? D is leading in that area, one shall not give it up
People are forgetting D has extreme small community, it is very unlikely community can maintain two different compilers (without having compiler bugs). Maybe something in future, but not now.
Walter created and maintained DMD by himself at the beginning ... a solo project can achieve incredible results sometimes, so I won't bet against the success of the fork. There's also the plus that D is a really productive language. Hey, I remember the discussion about the opportunity to convert the D codebase from C++ to D: "that will turn the compiler into something extremely manageable, that will be D's unfair advantage over other languages!"
Jan 12
parent Walter Bright <newshound2 digitalmars.com> writes:
On 1/12/2024 12:45 AM, Paolo Invernizzi wrote:
 There's also the plus that D is a really productive language. Hey, I remember 
 the discussion about the opportunity to convert the D codebase from C++ to D: 
 "that will turn the compiler into something extremely manageable, that will be 
 D's unfair advantage over other languages!"
I'm very happy it's now 100% D code; it's much more pleasant to maintain it in D. The codebase still retains a "C with Classes" style to it, one I am gradually bending away from into a more D-ish style.
Jan 14
prev sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 1/11/2024 9:40 AM, a11e99z wrote:
 2) allow fat pointers for GC-refs and class' userdata-field.
 let people enjoy to invent something useful or just integrate GoGC or Nim'ORC.
Microsoft did this with their "Managed C++" project. C++ had two fundamental pointer types, one for GC memory and the other for non-managed memory. I'm told it is still around, but it never caught on. I expect it had similar issues that DOS C compilers had. They had 3 pointer types - near pointers, far pointers, and huge pointers (and sometimes stack pointers and code pointers, too!). That often meant a function had to have multiple versions, one for each pointer type. It was wonderful to get away from that and have only one pointer type to deal with.
Jan 14
parent reply Paulo Pinto <pjmlp progtools.org> writes:
On Sunday, 14 January 2024 at 08:28:00 UTC, Walter Bright wrote:
 On 1/11/2024 9:40 AM, a11e99z wrote:
 2) allow fat pointers for GC-refs and class' userdata-field.
 let people enjoy to invent something useful or just integrate 
 GoGC or Nim'ORC.
Microsoft did this with their "Managed C++" project. C++ had two fundamental pointer types, one for GC memory and the other for non-managed memory. I'm told it is still around, but it never caught on. ....
It caught on where it mattered: .NET interop with native libraries. It just got updated to C++20 last year. https://devblogs.microsoft.com/cppblog/cpp20-support-comes-to-cpp-cli/ Then there is Unreal C++, with its GC support for Blueprint visual scripting. Any game written in Unreal using Blueprints makes use of it.
Jan 14
next sibling parent evilrat <evilrat666 gmail.com> writes:
On Sunday, 14 January 2024 at 11:34:19 UTC, Paulo Pinto wrote:
 Then there is Unreal C++, with its GC support for Blueprint 
 visual scripting.

 Any game written in Unreal, using Blueprint makes use of it.
Great, but in Unreal it is just plain pointers, and it is UPROPERTY that makes it work; IIRC all it does is add and delete GC roots for said pointers when objects are spawned/deleted. If you forget to add UPROPERTY you will be quickly reminded, as the object will get released way too soon.
Jan 14
prev sibling parent Walter Bright <newshound2 digitalmars.com> writes:
On 1/14/2024 3:34 AM, Paulo Pinto wrote:
 It caught on where it mattered, .NET interop with native libraries.
It caught on in DOS programming, too, because there was no other way. Nobody liked it, though. It remained ugly with duplicative code. Needing both a "far" strlen and a "near" strlen just stunk. It was constant cognitive load - should this pointer be a far one or a near one?
Jan 14