
digitalmars.D - Long-term evolution of D

Brian Hay <bhay construct3d.com> writes:
Has anyone considered the long-term positive evolution of D, assuming a 
successful version 1.0 release and growing interest?

If D is meant to be "a reengineering of C and C++", I think we also need 
to adopt a post-version-1.0 development paradigm different from that of 
C and C++; otherwise we risk repeating the history of those languages. 
How do we avoid or minimize the problems of the past, or is it folly 
to even think we can? Is the "F Programming Language" inevitable two 
decades from now?

I'm not talking about road maps and feature sets but more so the 
framework within which D will evolve as a language in the long term, in 
order to avoid, as much as possible, the legacy crud, competing vested 
interests and community fragmentation that begin to hamper language 
development after a decade or so.

The specification of the D Programming Language is largely a one-person 
effort, albeit with much community input, and I think at the present 
time it benefits from this model, given Walter's extensive language 
knowledge and compiler implementation experience. But what happens when 
D does become the success we all know it can be? Is standardization 
(ISO, ECMA etc) an option?

It might be too early to consider such things. Just a thought.

Brian.
Mar 08 2006
Sebastián E. Peyrott <as7cf yahoo.com> writes:
In article <duobud$1rjk$1 digitaldaemon.com>, Brian Hay says...
Has anyone considered the long-term positive evolution of D, assuming a 
successful version 1.0 release and growing interest?

If D is meant to be "a reengineering of C and C++", I think we also need 
to adopt a post-version-1.0 development paradigm different from that of 
C and C++; otherwise we risk repeating the history of those languages. 
How do we avoid or minimize the problems of the past, or is it folly 
to even think we can? Is the "F Programming Language" inevitable two 
decades from now?

I'm not talking about road maps and feature sets but more so the 
framework within which D will evolve as a language in the long term, in 
order to avoid, as much as possible, the legacy crud, competing vested 
interests and community fragmentation that begin to hamper language 
development after a decade or so.

The specification of the D Programming Language is largely a one-person 
effort, albeit with much community input, and I think at the present 
time it benefits from this model, given Walter's extensive language 
knowledge and compiler implementation experience. But what happens when 
D does become the success we all know it can be? Is standardization 
(ISO, ECMA etc) an option?

It might be too early to consider such things. Just a thought.

Brian.

IMO, although certain assumptions can be made, there's one thing we cannot be entirely sure about: hardware. What type of hardware will there be in the future? Will D be useful on such architectures? Will it be easy to adapt it? Will we need an entirely new specification? It may be possible to avoid making the same mistakes once again... the question is, will that be enough?

--
Sebastián.
Mar 08 2006
Sean Kelly <sean f4.ca> writes:
Brian Hay wrote:
 
 The specification of the D Programming Language is largely a one-person 
 effort, albeit with much community input, and I think at the present 
 time it benefits from this model, given Walter's extensive language 
 knowledge and compiler implementation experience. But what happens when 
 D does become the success we all know it can be? Is standardization 
 (ISO, ECMA etc) an option?

I've begun to think that the standardization process may simply not be a good fit for software, simply because of how slow it is. While it's a welcome assurance that a language isn't going to change out from under you, the alternative seems to be that it is unable to keep up with changing requirements. That said, I would be pleased to eventually see D accepted as some sort of open standard, but perhaps not with the 5-10 year cycle apparently required by the ISO process.

Sean
Mar 09 2006
Jari-Matti Mäkelä <jmjmak utu.fi.invalid> writes:
Sean Kelly wrote:
 Brian Hay wrote:
 The specification of the D Programming Language is largely a
 one-person effort, albeit with much community input, and I think at
 the present time it benefits from this model, given Walter's extensive
 language knowledge and compiler implementation experience. But what
 happens when D does become the success we all know it can be? Is
 standardization (ISO, ECMA etc) an option?

I've begun to think that the standardization process may simply not be a good fit for software, simply because of how slow it is. While it's a welcome assurance that a language isn't going to change out from under you, the alternative seems to be that it is unable to keep up with changing requirements. That said, I would be pleased to eventually see D accepted as some sort of open standard, but perhaps not with the 5-10 year cycle apparently required by the ISO process.

The only good thing in a standard would be that corporations like M$ could not so easily "embrace and extend" (=rape) the reference specs. It took a while even for C++ before it was ready for standardization. I don't think we're in a hurry here.
Mar 09 2006
Georg Wrede <georg.wrede nospam.org> writes:
Jari-Matti Mäkelä wrote:
 
 The only good thing in a standard would be that corporations like M$
 could not so easily "embrace and extend" (=rape) the reference specs. It
 took a while even for C++ before it was ready for standardization. I
 don't think we're in a hurry here.

The only thing even a little slowing Microsoft from doing that is if we trademark the moniker D#. ;-)
Mar 10 2006
Don Clugston <dac nospam.com.au> writes:
Sean Kelly wrote:
 Brian Hay wrote:
 The specification of the D Programming Language is largely a 
 one-person effort, albeit with much community input, and I think at 
 the present time it benefits from this model, given Walter's extensive 
 language knowledge and compiler implementation experience. But what 
 happens when D does become the success we all know it can be? Is 
 standardization (ISO, ECMA etc) an option?

I've begun to think that the standardization process may simply not be a good fit for software, simply because of how slow it is. While it's a welcome assurance that a language isn't going to change out from under you, the alternative seems to be that it is unable to keep up with changing requirements. That said, I would be pleased to eventually see D accepted as some sort of open standard, but perhaps not with the 5-10 year cycle apparently required by the ISO process.

I really like the way that dstress is becoming a de facto standard compliance test. The standard could simply be "must pass all tests in dstress", rather than the absurd C++ situation where parts of the standard are unimplementable. Defining a standard via tests seems to me to be more appropriate to software than the legalese that standards bodies inevitably generate.
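Don's idea — that the spec simply *is* the test suite — can be sketched in a few lines. Everything in the sketch below is hypothetical: the three cases stand in for dstress entries, and the toy balanced-parenthesis checker stands in for a real compiler front end. The shape is what matters: an implementation conforms if and only if it produces the expected accept/reject verdict on every case.

```java
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.function.Predicate;

public class TestDefinedStandard {
    // The "standard" is just a map from test input to the verdict a
    // conforming implementation must produce (true = accept).
    static final Map<String, Boolean> SPEC = new LinkedHashMap<>();
    static {
        // Hypothetical stand-ins for dstress-style test cases; the toy
        // "language" here is strings of balanced parentheses.
        SPEC.put("()",   true);   // must accept
        SPEC.put("(())", true);   // must accept
        SPEC.put("(()",  false);  // must reject
    }

    // An implementation conforms iff it matches the spec on every case.
    static boolean conforms(Predicate<String> impl) {
        return SPEC.entrySet().stream()
                   .allMatch(e -> impl.test(e.getKey()) == e.getValue());
    }

    // Toy implementation standing in for a compiler front end.
    static boolean balanced(String s) {
        int depth = 0;
        for (char c : s.toCharArray()) {
            if (c == '(') depth++;
            else if (c == ')' && --depth < 0) return false;
        }
        return depth == 0;
    }

    public static void main(String[] args) {
        System.out.println(conforms(TestDefinedStandard::balanced)); // true
        System.out.println(conforms(s -> true)); // false: wrongly accepts "(()"
    }
}
```

Note the contrast with a prose spec: there is no room for interpretation, but also no verdict at all for inputs outside the suite — which is exactly the trade-off Sean raises in the reply below.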
 
 

Mar 09 2006
Sean Kelly <sean f4.ca> writes:
Don Clugston wrote:
 Sean Kelly wrote:
 Brian Hay wrote:
 The specification of the D Programming Language is largely a 
 one-person effort, albeit with much community input, and I think at 
 the present time it benefits from this model, given Walter's 
 extensive language knowledge and compiler implementation experience. 
 But what happens when D does become the success we all know it can 
 be? Is standardization (ISO, ECMA etc) an option?

I've begun to think that the standardization process may simply not be a good fit for software, simply because of how slow it is. While it's a welcome assurance that a language isn't going to change out from under you, the alternative seems to be that it is unable to keep up with changing requirements. That said, I would be pleased to eventually see D accepted as some sort of open standard, but perhaps not with the 5-10 year cycle apparently required by the ISO process.

I really like the way that dstress is becoming a de facto standard compliance test. The standard could simply be "must pass all tests in dstress", rather than the absurd C++ situation where parts of the standard are unimplementable. Defining a standard via tests seems to me to be more appropriate to software than the legalese that standards bodies inevitably generate.

But as a language user, I like having the legalese available to know what should and should not be possible. Of the language specs I've seen, the C and C++ versions are by far the best. In fact, I refer to them for language issues far more than any other reference material.

But I agree that implementing by spec alone is perhaps not ideal, as it's impossible to eliminate room for interpretation, particularly when concepts are used that have little direct relation to compiler design (such as sequence points in C++). I think it helps D quite a bit that the language is being designed by a compiler writer, as there's little risk that any of its features will be meaningless to one ;-)

Sean
Mar 09 2006
Bruno Medeiros <daiphoenixNO SPAMlycos.com> writes:
One interesting question to raise: supposing we have this 1.0 release, 
what will future releases (like 2.0) look like in terms of backwards 
compatibility?

This I'd like to know, and I state already that I hope D doesn't get 
clogged and stunted by backwards compatibility with 1.0. That would be 
repeating the mistakes we see happening with C++ and, more recently, 
with Java (in the latter case to an extreme). In fact, Java vs. C# is 
an interesting case study. Both languages have been developed at more 
or less the same pace and with the same set of features, yet Java 
already got some ugly hacks in its second iteration (Java 5.0) to 
implement new features without breaking existing code (like "foreach" 
and @Override), whereas C# is growing more smoothly and cleanly; in 
C# 3.0 we can already see some interesting upcoming features (like 
LINQ, auto-typing (var) and coroutines (yield)), which would be 
near-impossible to put into Java, at least in a clean way.

Also, regardless, I hope we put every good feature we can think of into 
D 1.0 already, and not leave it for a "2.0" release, even if such a 
feature is only in the spec and not yet implemented in a compiler.

-- 
Bruno Medeiros - CS/E student
http://www.prowiki.org/wiki4d/wiki.cgi?BrunoMedeiros#D
Mar 09 2006
"Ameer Armaly" <ameer_armaly hotmail.com> writes:
"Bruno Medeiros" <daiphoenixNO SPAMlycos.com> wrote in message 
news:duqbnv$1fct$1 digitaldaemon.com...
 One interesting question to raise: supposing we have this 1.0 release, 
 what will future releases (like 2.0) look like in terms of backwards 
 compatibility?

 This I'd like to know, and I state already that I hope D doesn't get 
 clogged and stunted by backwards compatibility with 1.0. That would be 
 repeating the mistakes we see happening with C++ and, more recently, with 
 Java (in the latter case to an extreme). In fact, Java vs. C# is an 
 interesting case study. Both languages have been developed at more or 
 less the same pace and with the same set of features, yet Java already 
 got some ugly hacks in its second iteration (Java 5.0) to implement new 
 features without breaking existing code (like "foreach" and @Override), 
 whereas C# is growing more smoothly and cleanly; in C# 3.0 we can already 
 see some interesting upcoming features (like LINQ, auto-typing (var) and 
 coroutines (yield)), which would be near-impossible to put into Java, at 
 least in a clean way.

 Also, regardless, I hope we put every good feature we can think of into 
 D 1.0 already, and not leave it for a "2.0" release, even if such a 
 feature is only in the spec and not yet implemented in a compiler.

enough to jump out at you when you're not expecting it; a feature should be entirely implemented, or not at all.
 -- 
 Bruno Medeiros - CS/E student
 http://www.prowiki.org/wiki4d/wiki.cgi?BrunoMedeiros#D 

Mar 09 2006