
digitalmars.D - C++0x

reply Valgrind valmouth.eu writes:
Did anyone see the slashdot article on Bjarne Stroustrup Previews C++0x?

http://it.slashdot.org/it/06/01/03/0351202.shtml?tid=156&tid=8
Jan 03 2006
next sibling parent reply "John C" <johnch_atms hotmail.com> writes:
<Valgrind valmouth.eu> wrote in message 
news:dpflnm$1vo1$1 digitaldaemon.com...
 Did anyone see the slashdot article on Bjarne Stroustrup Previews C++0x?

 http://it.slashdot.org/it/06/01/03/0351202.shtml?tid=156&tid=8

A more useful overview, from Stroustrup himself, is presented here: http://www.artima.com/cppsource/cpp0x.html. I like the new "template concepts" (analogous to C# generic constraints). This would be a useful addition to D as well. Also, the idea of "sequence constructors" to allow array-like initialisation of UDTs is a good one (I suggested something similar for D a while back). Strangely, I find the proposals both disappointing and promising at the same time. Plus it's at least three years away, longer if you factor in compiler vendors updating their implementations.
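The "sequence constructor" idea later took shape in C++11 as std::initializer_list; a minimal sketch of array-like initialisation of a user-defined type (hedged: this is written against C++11 as eventually standardized, not against any 2006 proposal text, and the type name is illustrative):

```cpp
#include <cstddef>
#include <initializer_list>
#include <vector>

// A user-defined type that accepts array-like initialisation, in the
// spirit of the "sequence constructor" proposal.
class IntSet {
public:
    IntSet(std::initializer_list<int> init) : values_(init) {}
    std::size_t size() const { return values_.size(); }
    bool contains(int v) const {
        for (int x : values_)
            if (x == v) return true;
        return false;
    }
private:
    std::vector<int> values_;
};
```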
Jan 04 2006
parent Sean Kelly <sean f4.ca> writes:
John C wrote:
 
 A more useful overview, from Stroustrup himself, is presented here: 
 http://www.artima.com/cppsource/cpp0x.html.
 
 I like the new "template concepts" (analogous to C# generic constraints). 
 This would be a useful addition to D as well.

Agreed. Concept checking is a great benefit to C++, though static assert can fill much of that role in D.
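As an illustration of the role such a check plays, the constraint below rejects bad instantiations at compile time with a readable message. This is a sketch using C++11 static_assert and type_traits, both of which postdate this thread; D's `static assert` is the direct analogue:

```cpp
#include <type_traits>

// Poor man's concept check: reject instantiations that do not satisfy
// the constraint, with a readable diagnostic instead of a template
// error cascade.
template <typename T>
T twice(T value) {
    static_assert(std::is_arithmetic<T>::value,
                  "twice<T> requires an arithmetic type");
    return value + value;
}
```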
 Strangely, I find the proposals both disappointing and promising at the same 
 time. Plus it's at least three years away, longer if you factor in compiler 
 vendors updating their implementations. 

C++ is already the most complex language I know of, and the next iteration of the standard promises to make it a lot more complicated (at least in terms of library code). While I'm excited about a lot of the additions, I'm also worried that the language may end up as somewhat of a monstrosity. I suppose it all really depends on just how many additional proposals are accepted by 2009. Sean
Jan 04 2006
prev sibling next sibling parent reply "Craig Black" <cblack ara.com> writes:
These new features will not be able to fix the innumerable difficulties of 
the language.  Too little too late to reform C++ in my opinion.  By the time 
these features are ratified and become available in compilers, D will have 
already stolen the show.  At least it would seem that way at the pace that 
Walter is improving both the language and compiler.

-Craig 
Jan 04 2006
parent reply Matthias Spycher <matthias coware.com> writes:
I frequently discuss D with a colleague who represented a company on the 
C++ standards committee for a few years and who has a deep knowledge of 
the language and libraries. He often reminds me that without certain 
features like multiple inheritance, certain users (like himself) will 
remain unconvinced. I tend to disagree when it comes to features like 
multiple inheritance, but IMHO, there are other barriers ahead for D.

Concurrency, memory model semantics, and data coherency on hyperthreaded 
multiprocessor cores are probably going to divide programming languages 
in the coming years. For systems programming and sequential or lightly 
multithreaded applications programming C/C++/D will probably dominate. 
But for large, highly concurrent, applications Java/C# and other systems 
that rely on virtual machine technology will probably win out. They 
define a consistent model of memory in light of concurrency and they 
don't expose raw memory to the programmer directly. The cost in time of 
dynamic profiling, recompilation, and garbage collection, etc. will 
become more insignificant as multicore processors become the norm. On a 
machine with, say, 32 threads running in parallel on a single chip, e.g. 
http://www.sun.com/processors/UltraSPARC-T1/index.xml, application 
performance is unlikely to be significantly different from a statically 
compiled program.

Without a clear specification of the memory model and corresponding 
language abstractions (e.g. thread lifecycle, atomic assignment rules) 
to guarantee the same behavior for programs compiled by different 
compilers, D will not be able to challenge jvm/clr technology in the 
years to come. This is not to say it can't remain an attractive 
alternative for certain kinds of programs.

The main problem D faces in this context is the lack of a large 
ecosystem which grows the language/libraries and provides the 
development tools that increase programmer productivity. If a company 
like IBM could throw 40 million dollars at this problem to get the ball 
rolling (as they did with the Eclipse/JDT project for Java), then D 
could really change the game. The Java programming environment is 
scalable and productive because it has tools like Eclipse/JDT/EMF/GEF 
that eliminate much of the drudgery.

Another event that might give D some extra momentum is the development 
of a new operating system with its own libraries -- and no dependency on 
C or any other legacy. It might take 10-20 years to reach the main 
stream, but that's quite normal for this type of technology.

Matthias

Craig Black wrote:
 These new features will not be able to fix the innumerable difficulties of 
 the language.  Too little too late to reform C++ in my opinion.  By the time 
 these features are ratified and become available in compilers, D will have 
 already stolen the show.  At least it would seem that way at the pace that 
 Walter is improving both the language and compiler.
 
 -Craig 
 
 

Jan 04 2006
next sibling parent reply Sean Kelly <sean f4.ca> writes:
Matthias Spycher wrote:
 I frequently discuss D with a colleague who represented a company on the 
 C++ standards committee for a few years and who has a deep knowledge of 
 the language and libraries. He often reminds me that without certain 
 features like multiple inheritance, certain users (like himself) will 
 remain unconvinced. I tend to disagree when it comes to features like 
 multiple inheritance, but IMHO, there are other barriers ahead for D.

Mixins have addressed most of the design problems that MI typically addresses. Though I grant that aliasing to make everything properly visible is a tad annoying.
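A rough C++ sketch of the same mixin idea is the CRTP pattern, which injects reusable behaviour into a class without multiple inheritance of state (the names here are purely illustrative):

```cpp
// CRTP "mixin": EqualityMixin grants operator!= to any class that
// defines operator==, without multiple implementation inheritance.
template <typename Derived>
struct EqualityMixin {
    friend bool operator!=(const Derived& a, const Derived& b) {
        return !(a == b);
    }
};

struct Point : EqualityMixin<Point> {
    int x, y;
    Point(int x_, int y_) : x(x_), y(y_) {}
    friend bool operator==(const Point& a, const Point& b) {
        return a.x == b.x && a.y == b.y;
    }
};
```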
 Concurrency, memory model semantics, and data coherency on hyperthreaded 
 multiprocessor cores are probably going to divide programming languages 
 in the coming years. For systems programming and sequential or lightly 
 multithreaded applications programming C/C++/D will probably dominate. 
 But for large, highly concurrent, applications Java/C# and other systems 
 that rely on virtual machine technology will probably win out. They 
 define a consistent model of memory in light of concurrency and they 
 don't expose raw memory to the programmer directly. The cost in time of 
 dynamic profiling, recompilation, and garbage collection, etc. will 
 become more insignificant as multicore processors become the norm. On a 
 machine with, say, 32 threads running parallel in a single core, eg 
 http://www.sun.com/processors/UltraSPARC-T1/index.xml, application 
 performance is unlikely to be significantly different from a statically 
 compiled program.

I very much agree. And given the relatively slow progress the committee has made on multithreading support for C++0x, there is little chance that compiler writers will be able to implement these features early.
 Without a clear specification of the memory model and corresponding 
 language abstractions (e.g. thread lifecycle, atomic assignment rules) 
 to guarantee the same behavior for programs compiled by different 
 compilers, D will not be able to challenge jvm/clr technology in the 
 years to come. This is not to say it can't remain an attractive 
 alternative for certain kinds of programs.

While I grant that D could do with defined memory semantics, I think it can get by using inline assembler and the volatile statement. I expect that D may ultimately adopt whatever the C++ committee decides, however, if the design seems reasonable. The major issue here is to have a model that is simple and usable and that allows room for hardware changes, such as transactional memory. I agree with the C++ committee that the Java spec is unduly limiting, but I'm concerned about their slow progress in defining acceptable semantics.
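For comparison, the release/acquire semantics the C++ committee eventually standardized (in C++11, years after this thread) can be sketched like this:

```cpp
#include <atomic>
#include <thread>

// Release/acquire publication: the writer's plain store to `payload`
// is guaranteed visible to any reader that observes `ready == true`
// via an acquire load.
int payload = 0;
std::atomic<bool> ready(false);

void writer() {
    payload = 42;                                  // plain store
    ready.store(true, std::memory_order_release);  // publish
}

int reader() {
    while (!ready.load(std::memory_order_acquire)) {
        // spin until the writer publishes
    }
    return payload;  // guaranteed to see 42
}
```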
 The main problem D faces in this context is the lack of a large 
 ecosystem which grows the language/libraries and provides the 
 development tools that increase programmer productivity. If a company 
 like IBM could throw 40 million dollars at this problem to get the ball 
 rolling (as they did with the Eclipse/JDT project for Java), then D 
 could really change the game. The Java programming environment is 
 scalable and productive because it has tools like Eclipse/JDT/EMF/GEF 
 that eliminate much of the drudgery.

Personally, I think having a fancy IDE is largely useful for UI programming. So long as D has decent interactive debugger support, I don't much miss the presence of a full-featured package like Eclipse.
 Another event that might give D some extra momentum is the development 
 of a new operating system with its own libraries -- and no dependency on 
 C or any other legacy. It might take 10-20 years to reach the main 
 stream, but that's quite normal for this type of technology.

I think you were on the right track with what you said about concurrent programming. If D can show the capacity to produce correct concurrent code now, it will be the only general-purpose systems programming language I know of which can make such a claim. Given the rapidly growing interest in concurrent programming, this may position D quite well to gain users from the C++ community over the next 5 years. Sean
Jan 04 2006
next sibling parent reply Matthias Spycher <matthias coware.com> writes:
Sean Kelly wrote:
 Matthias Spycher wrote:

 The main problem D faces in this context is the lack of a large 
 ecosystem which grows the language/libraries and provides the 
 development tools that increase programmer productivity. If a company 
 like IBM could throw 40 million dollars at this problem to get the 
 ball rolling (as they did with the Eclipse/JDT project for Java), then 
 D could really change the game. The Java programming environment is 
 scalable and productive because it has tools like Eclipse/JDT/EMF/GEF 
 that eliminate much of the drudgery.

Personally, I think having a fancy IDE is largely useful for UI programming. So long as D has decent interactive debugger support, I don't much miss the presence of a full-featured package like Eclipse.

 Sean

I like emacs too..., but I see the power of Eclipse/JDT not in UI development, but rather the ability to maintain massive amounts of code (thousands of files, millions of lines). A refactoring operation that touches hundreds of classes can be completed in minutes and large code bases evolve much more rapidly than they do in my emacs-based C/C++ environment. Eclipse would probably not have matured as quickly if it wasn't written in C++. Good debugging features like dynamic code replacement at runtime (no restart necessary) help too, of course...
Jan 04 2006
parent Matthias Spycher <matthias coware.com> writes:
Matthias Spycher should have written:

 environment. Eclipse would probably not have matured as quickly if it 
 wasn't written in C++. Good debugging features like dynamic code 

 replacement at runtime (no restart necessary) help too, of course...

mea culpa
Jan 04 2006
prev sibling parent Sean Kelly <sean f4.ca> writes:
Sean Kelly wrote:
 
 While I grant that D could do with defined memory semantics, I think it 
 can get by using inline assembler and the volatile statement.  I expect 
 that D may ultimately adopt whatever the C++ committee decides if the 
 design seems reasonable however.  The major issue here is to have a 
 model that is simple and usable and that allows room for hardware 
 changes, such as transactional memory.  I agree with the C++ committee 
 that the Java spec is unduly limiting, but I'm concerned about their 
 slow progress in defining acceptable semantics.

By the way, an interesting way to go for D might be to toss the idea of explicit synchronization and adopt transactional programming instead. I need to read up on the limitations, but transactional memory models can be done in software--code already exists for C# and Java, for example. I have a sneaking feeling that the days of explicit synchronization are drawing to a close and that transactional programming will begin to get some attention over the next few years. Sean
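The primitive most STM designs build on is atomic compare-and-swap, which transactional memory generalizes from a single word to whole read/write sets. A minimal lock-free update loop, sketched with the C++11 atomics that postdate this discussion (the function name is illustrative):

```cpp
#include <atomic>

// Lock-free update via a compare-and-swap retry loop. If another
// thread races us between the load and the CAS, the CAS fails,
// `expected` is refreshed with the current value, and we retry.
int fetch_and_double(std::atomic<int>& value) {
    int expected = value.load();
    while (!value.compare_exchange_weak(expected, expected * 2)) {
        // `expected` now holds the freshly observed value; loop again.
    }
    return expected;  // value before doubling
}
```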
Jan 04 2006
prev sibling next sibling parent reply "Kris" <fu bar.com> writes:
Amen to that, Matthias.

There are technical hurdles for D (libraries, tools, some language 
constructs), and there are strategic hurdles. The latter gets too little 
love; perhaps because there's not enough to go around?

I'm totally with you on the Threading aspect, and where the industry is 
heading these days (32 hardware threads etc). There have been one or two 
topics around here on consideration for explicit language support of 
parallel constructs (perhaps a smidgen of Occam?), but they haven't had any 
traction. Either way, D could probably use some strategic goals in this 
regard and others; just to set the stage if nothing else?

One strategic aspect to question is Systems Programming itself (the arena D 
purports to target). There's little doubt (in my mind) that the largest 
segment of systems programmers is in the embedded market, yet D effectively 
ignores the entire area (if you really take a critical look at the 
language). There's a variety of other factors that sometimes combine to make 
me wonder "what does D want to be when it grows up?". This is all about 
strategic direction, or perhaps a lack thereof. Whilst stating something 
like "D is the successor to C++" may garner some attention, it's a bit too 
nebulous to figure out what D could be actually used for (there's little in 
the way of explicit or targeted goals).

A problem, as I see it, is that Walter has but one pair of hands. Perhaps 
there's precious little time to spend on strategy? Perhaps it's something 
that Walter does not feel comfortable with, or is not particularly adept at? 
I see a similar pattern in the library development too ~ not just in Phobos, 
but in the vaguely fractured picture over at dsource.org (no disrespect to 
anyone there ~ it's a genuinely great place). It would perhaps be more 
productive to set some overall goals and cajole a bunch of willing hands in 
that direction? Yet, that tends to require some kind of "guidance body" ~ 
one which has notably been left for dead on a number of occasions thus far. 
Note that there's no poor reflection upon any one person here; or indeed 
anyone.

While it's great to have independent, dedicated, developers doing their own 
thing for D, I wonder if some larger over-arching strategic goals could be 
set and met? On the other hand, it can be tough to organize such a thing 
when some of the more obvious examples are repeatedly overlooked ~ for 
instance, full debugging support is surely a pre-requisite to widespread 
adoption, though never gets the attention it deserves? Hint hint :)

Thus, I think there may be a bit of "follow the leader" going on ~ by that I 
mean (or I suspect) that there's little impetus for the independent 
developers to set strategic goals for themselves (as a group; though there 
are exceptions such as DDL), when the language development itself tends to 
avoid such things? This is certainly not an intent to have a go at Walter 
(or anyone else for that matter), but is instead just a personal reflection 
upon the state of play; and how it's been for the 2 years I've been around.

As you say; if someone threw $40M in the pot, things would undoubtedly 
become more organized. Money can certainly motivate people to adopt one 
particular direction over another :)

2 cents;



"Matthias Spycher" <matthias coware.com> wrote in...
I frequently discuss D with a colleague who represented a company on the 
C++ standards committee for a few years and who has a deep knowledge of the 
language and libraries. He often reminds me that without certain features 
like multiple inheritance, certain users (like himself) will remain 
unconvinced. I tend to disagree when it comes to features like multiple 
inheritance, but IMHO, there are other barriers ahead for D.

 Concurrency, memory model semantics, and data coherency on hyperthreaded 
 multiprocessor cores are probably going to divide programming languages in 
 the coming years. For systems programming and sequential or lightly 
 multithreaded applications programming C/C++/D will probably dominate. But 
 for large, highly concurrent, applications Java/C# and other systems that 
 rely on virtual machine technology will probably win out. They define a 
 consistent model of memory in light of concurrency and they don't expose 
 raw memory to the programmer directly. The cost in time of dynamic 
 profiling, recompilation, and garbage collection, etc. will become more 
 insignificant as multicore processors become the norm. On a machine with, 
 say, 32 threads running parallel in a single core, eg 
 http://www.sun.com/processors/UltraSPARC-T1/index.xml, application 
 performance is unlikely to be significantly different from a statically 
 compiled program.

 Without a clear specification of the memory model and corresponding 
 language abstractions (e.g. thread lifecycle, atomic assignment rules) to 
 guarantee the same behavior for programs compiled by different compilers, 
 D will not be able to challenge jvm/clr technology in the years to come. 
 This is not to say it can't remain an attractive alternative for certain 
 kinds of programs.

 The main problem D faces in this context is the lack of a large ecosystem 
 which grows the language/libraries and provides the development tools that 
 increase programmer productivity. If a company like IBM could throw 40 
 million dollars at this problem to get the ball rolling (as they did with 
 the Eclipse/JDT project for Java), then D could really change the game. 
 The Java programming environment is scalable and productive because it has 
 tools like Eclipse/JDT/EMF/GEF that eliminate much of the drudgery.

 Another event that might give D some extra momentum is the development of 
 a new operating system with its own libraries -- and no dependency on C or 
 any other legacy. It might take 10-20 years to reach the main stream, but 
 that's quite normal for this type of technology.

 Matthias

 Craig Black wrote:
 These new features will not be able to fix the innumerable difficulties 
 of the language.  Too little too late to reform C++ in my opinion.  By 
 the time these features are ratified and become available in compilers, D 
 will have already stolen the show.  At least it would seem that way at the 
 pace that Walter is improving both the language and compiler.

 -Craig 


Jan 04 2006
parent reply Sean Kelly <sean f4.ca> writes:
Kris wrote:
 
 I'm totally with you on the Threading aspect, and where the industry is 
 heading these days (32 hardware threads etc). There have been one or two 
 topics around here on consideration for explicit language support of 
 parallel constructs (perhaps a smidgen of Occam?), but they haven't had any 
 traction. Either way, D could probably use some strategic goals in this 
 regard and others; just to set the stage if nothing else?

See my post on transactional programming. I need to find time to read up on STM (software transactional memory) to see just how it might integrate with D, but it could be an interesting approach if the restrictions and code generation/library complexity are tractable.
 As you say; if someone threw $40M in the pot, things would undoubtedly 
 become more organized. Money can certainly motivate people to adopt one 
 particular direction over another :)

Perhaps we should all start playing the lottery and promise to set up a D development shop if we win ;-) Sean
Jan 04 2006
parent reply "Kris" <fu bar.com> writes:
"Sean Kelly" <sean f4.ca> wrote ...
 Kris wrote:
 I'm totally with you on the Threading aspect, and where the industry is 
 heading these days (32 hardware threads etc). There have been one or two 
 topics around here on consideration for explicit language support of 
 parallel constructs (perhaps a smidgen of Occam?), but they haven't had 
 any traction. Either way, D could probably use some strategic goals in 
 this regard and others; just to set the stage if nothing else?

See my post on transactional programming. I need to find time to read up on STM (software transactional memory) to see just how it might integrate with D, but it could be an interesting approach if the restrictions and code generation/library complexity are tractable.

Aye. There's a lot of interesting research in a related area of "thread level speculation" ~ for those interested, here's a taster: http://www.vldb2005.org/program/paper/tue/p73-colohan.pdf That one targets a DB, since it's an obvious candidate. Yet, I wonder whether much of this type of research will remain relegated to niche applications?
Jan 04 2006
parent Sean Kelly <sean f4.ca> writes:
Kris wrote:
 "Sean Kelly" <sean f4.ca> wrote ...
 Kris wrote:
 I'm totally with you on the Threading aspect, and where the industry is 
 heading these days (32 hardware threads etc). There have been one or two 
 topics around here on consideration for explicit language support of 
 parallel constructs (perhaps a smidgen of Occam?), but they haven't had 
 any traction. Either way, D could probably use some strategic goals in 
 this regard and others; just to set the stage if nothing else?

on STM (software transactional memory) to see just how it might integrate with D, but it could be an interesting approach if the restrictions and code generation/library complexity are tractable.

Aye. There's a lot of interesting research in a related area of "thread level speculation" ~ for those interested, here's a taster: http://www.vldb2005.org/program/paper/tue/p73-colohan.pdf That one targets a DB, since it's an obvious candidate. Yet, I wonder whether much of this type of research will remain relegated to niche application?

In the long term, I think the mainstream will move away from lock-based programming--the technique is prone to subtle bugs, deadlocks, and resists scalability in many cases. Also, current lock-free techniques are very tricky to implement and they don't really apply to higher level data integrity concerns. On the surface, transactional programming seems to address all of these concerns adequately, and the concept of a transaction is something many people are familiar with. And it helps that the technique seems to be gaining popularity in the right circles. But there seem to be issues that still need to be sorted out, and the hardware community is notorious for not listening to the software folks ;-) Sean
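One of the subtle bugs alluded to here is lock-ordering deadlock: two threads acquiring the same pair of locks in opposite orders. The std::lock facility later added in C++11 sidesteps it by acquiring both mutexes as a unit (a hedged sketch, not available to code of this era; the account example is illustrative):

```cpp
#include <mutex>

// Transfer between two accounts. Naively locking `from` then `to`
// deadlocks if another thread transfers in the opposite direction;
// std::lock acquires both without a fixed ordering requirement.
struct Account {
    std::mutex m;
    int balance = 0;
};

void transfer(Account& from, Account& to, int amount) {
    std::lock(from.m, to.m);  // deadlock-free acquisition of both
    std::lock_guard<std::mutex> g1(from.m, std::adopt_lock);
    std::lock_guard<std::mutex> g2(to.m, std::adopt_lock);
    from.balance -= amount;
    to.balance += amount;
}
```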
Jan 04 2006
prev sibling next sibling parent reply "Walter Bright" <newshound digitalmars.com> writes:
"Matthias Spycher" <matthias coware.com> wrote in message 
news:dphaqc$hn9$1 digitaldaemon.com...
 Without a clear specification of the memory model and corresponding 
 language abstractions (e.g. thread lifecycle, atomic assignment rules) to 
 guarantee the same behavior for programs compiled by different compilers, 
 D will not be able to challenge jvm/clr technology in the years to come. 
 This is not to say it can't remain an attractive alternative for certain 
 kinds of programs.

This is a problem mostly with the specification of D, not with the implementation. It can be fixed by clarifying the spec, once I figure out how to do that <g>.
Jan 04 2006
parent Matthias Spycher <matthias coware.com> writes:
Walter Bright wrote:
 "Matthias Spycher" <matthias coware.com> wrote in message 
 news:dphaqc$hn9$1 digitaldaemon.com...
 
Without a clear specification of the memory model and corresponding 
language abstractions (e.g. thread lifecycle, atomic assignment rules) to 
guarantee the same behavior for programs compiled by different compilers, 
D will not be able to challenge jvm/clr technology in the years to come. 
This is not to say it can't remain an attractive alternative for certain 
kinds of programs.

This is a problem mostly with the specification of D, not with the implementation. It can be fixed by clarifying the spec, once I figure out how to do that <g>.

I agree. But exposing raw memory to the programmer introduces serious constraints on the implementation of (scalable, incremental, generational, parallel) garbage collectors. Once the spec is ready, you may recognize the need for a "managed D" subset. And there's nothing wrong with that. You have the advantage of building on a relatively clean syntax and you can glean from the work done in other languages/runtimes. I do hope you succeed. Matthias
Jan 05 2006
prev sibling next sibling parent "Walter Bright" <newshound digitalmars.com> writes:
"Matthias Spycher" <matthias coware.com> wrote in message 
news:dphaqc$hn9$1 digitaldaemon.com...
I frequently discuss D with a colleague who represented a company on the 
C++ standards committee for a few years and who has a deep knowledge of the 
language and libraries. He often reminds me that without certain features 
like multiple inheritance, certain users (like himself) will remain 
unconvinced. I tend to disagree when it comes to features like multiple 
inheritance, but IMHO, there are other barriers ahead for D.

I've run into this often before, the "D is no good because it needs feature X." The reality is, if D implemented feature X tomorrow, the person will still not use D, they'll just find another Y that is "needed." It's a red herring. The domain of problems that multiple inheritance solves that is not also addressed by interfaces, mixins, and mere aggregation is insignificant. And heck, D has a number of very useful features (like closures, nested functions, inner classes, foreach, UTF support) that cannot be done by C++ in a reasonable way.
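The point about closures is easy to illustrate: what D writes as a nested function capturing a local becomes a hand-written function object in C++98 (an illustrative sketch; the names are invented for the example):

```cpp
#include <algorithm>
#include <vector>

// C++98 has no closures: to capture `threshold` for use with
// std::count_if, one must write an explicit function object.
struct GreaterThan {
    int threshold;
    explicit GreaterThan(int t) : threshold(t) {}
    bool operator()(int x) const { return x > threshold; }
};

long count_above(const std::vector<int>& v, int threshold) {
    return std::count_if(v.begin(), v.end(), GreaterThan(threshold));
}
```

In D, the same predicate can be a one-line nested function or delegate passed directly at the call site.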
Jan 04 2006
prev sibling parent reply "Craig Black" <cblack ara.com> writes:
 The cost in time of  dynamic profiling, recompilation, and garbage 
 collection, etc. will  become more insignificant as multicore processors 
 become the norm. On a machine with, say, 32 threads running parallel in a 
 single core, eg http://www.sun.com/processors/UltraSPARC-T1/index.xml, 
 application performance is unlikely to be significantly different from a 
 statically compiled program.

I disagree. The cost of all the above-mentioned crap that goes on with a VM will not be diminished by parallel processing. Computers will obviously be faster, but software will always become more complex and more demanding. D currently does not support concurrency as well as C# or Java, but that does not mean it will not have a good solution in the future. What impresses me most about D is not its current capabilities, but the rapid pace at which the language evolves. With all of the new features that have been added over the past months, I am confident that D will be able to add concurrency before processors with 32 hardware threads become the norm. It does worry me a little that Walter is the only one maintaining the language, but so far he has done an excellent job.
 Without a clear specification of the memory model and corresponding 
 language abstractions (e.g. thread lifecycle, atomic assignment rules) to 
 guarantee the same behavior for programs compiled by different compilers, 
 D will not be able to challenge jvm/clr technology in the years to come. 
 This is not to say it can't remain an attractive alternative for certain 
 kinds of programs.

blah blah blah
 The main problem D faces in this context is the lack of a large ecosystem 
 which grows the language/libraries and provides the development tools that 
 increase programmer productivity. If a company like IBM could throw 40 
 million dollars at this problem to get the ball rolling (as they did with 
 the Eclipse/JDT project for Java), then D could really change the game. 
 The Java programming environment is scalable and productive because it has 
 tools like Eclipse/JDT/EMF/GEF that eliminate much of the drudgery.

I agree. 40 million would help. D needs standard libraries. Especially a GUI. I don't know how this will be done without funding. This is indeed a conundrum.
 Another event that might give D some extra momentum is the development of 
 a new operating system with its own libraries -- and no dependency on C or 
 any other legacy. It might take 10-20 years to reach the main stream, but 
 that's quite normal for this type of technology.

I don't think it will take so long to become mainstream. Developing an operating system is probably not the approach to take, though. I like the library development approach better. -Craig
Jan 05 2006
parent reply Lucas Goss <lgoss007 gmail.com> writes:
Craig Black wrote:
 blah blah blah
 
The main problem D faces in this context is the lack of a large ecosystem 
which grows the language/libraries and provides the development tools that 
increase programmer productivity. If a company like IBM could throw 40 
million dollars at this problem to get the ball rolling (as they did with 
the Eclipse/JDT project for Java), then D could really change the game. 
The Java programming environment is scalable and productive because it has 
tools like Eclipse/JDT/EMF/GEF that eliminate much of the drudgery.

I agree. 40 million would help. D needs standard libraries. Especially a GUI. I don't know how this will be done without funding. This is indeed a conundrum.

40 million would be nice, but I'll work for much cheaper :) Actually I'd love to have a job doing D programming on libraries and GUI work. But as it is now I'm doing this in my free time, so I'm essentially working for free (currently on some math libraries, and eventually I'll be doing GUI stuff), so I probably won't be paid in the future either. Hmm, this isn't looking good...

But really, I don't see why the community can't get a jump on these efforts now. A large group of programmers can get a lot more accomplished working together than a whole bunch of individual programmers (which is the current situation).

I'd really like to hear Walter's opinion on the library though, as I've never seen him reply whenever the library is mentioned. Like would he be open to a repository that could hold the current library, working versions, specialized fields, etc? Or does he have something else planned? Lucas
Jan 05 2006
parent reply Brad Anderson <brad dsource.dot.org> writes:
Lucas Goss wrote:
 40 million would be nice, but I'll work for much cheaper, :)

dsource.org would be a lot better, and it wouldn't take $40MM.
Jan 05 2006
parent Lucas Goss <lgoss007 gmail.com> writes:
Brad Anderson wrote:
 Lucas Goss wrote:
 
 40 million would be nice, but I'll work for much cheaper, :)

dsource.org would be a lot better, and it wouldn't take $40MM.

Yeah I like dsource.org, I just wish it was more... official. And, well, I think Jason Thomas summed it up best on the Ares forum: "...The fact that there isn't yet a cohesive uniting entity, like the Java Community Process, to create and adopt these projects into a standard library, is a problem. This is especially true since it's likely these projects are following divergent coding philosophies and styles, and the resulting integrated library will definitely not follow the principle of least surprise..." That just bothers me, probably because I'm spoiled a little bit by C# and Java. Though I can say the lack of standard libraries in C++ is really a turn-off, so much so that to me the lack of libraries is worse than the language itself (which I really didn't mind until I found D). But yeah, I agree that dsource.org is a good place to get started. Lucas
Jan 05 2006
prev sibling parent reply pragma <pragma_member pathlink.com> writes:
In article <dpflnm$1vo1$1 digitaldaemon.com>, Valgrind valmouth.eu says...
Did anyone see the slashdot article on Bjarne Stroustrup Previews C++0x?

http://it.slashdot.org/it/06/01/03/0351202.shtml?tid=156&tid=8

What I don't understand is: why is Stroustrup making this language even more complicated than it already is? It's like that old expression, "throwing good money after bad". Maybe someone should shoot him an email and invite him to this newsgroup before he spends another 4 years "improving" C++?

I think Java, C# and D have all demonstrated that less is more when it comes to language design. D gets bonus points for the cleanest meshing of compile-time and runtime constructs in any C-style language; I don't miss seeing <Foo<T,Bar<A>>> everywhere.

- EricAnderton at yahoo
Jan 04 2006
parent reply bobef <bobef lessequal.com> writes:
pragma wrote:
 In article <dpflnm$1vo1$1 digitaldaemon.com>, Valgrind valmouth.eu says...
 
Did anyone see the slashdot article on Bjarne Stroustrup Previews C++0x?

http://it.slashdot.org/it/06/01/03/0351202.shtml?tid=156&tid=8

What I don't understand is: why is Stroustrup making this language even more complicated than it already is?

I really can't understand why so many people are finding C++ a complicated language (if they are not writing a compiler for it)... Probably I am not using its full potential (after I found D I am not using it at all). Anyway...
 Maybe someone should shoot him an email and invite him to
 this newsgroup before he spends another 4 years "improving" c++?

I like this idea. I will do it.
Jan 05 2006
next sibling parent John Reimer <terminal.node gmail.com> writes:
bobef wrote:
 pragma wrote:
 In article <dpflnm$1vo1$1 digitaldaemon.com>, Valgrind valmouth.eu 
 says...

 Did anyone see the slashdot article on Bjarne Stroustrup Previews C++0x?

 http://it.slashdot.org/it/06/01/03/0351202.shtml?tid=156&tid=8

What I don't understand is: why is Stroustrup making this language even more complicated than it already is?

I really can't understand why so many people are finding C++ a complicated language (if they are not writing a compiler for it)... Probably I am not using its full potential (after I found D I am not using it at all). Anyway...

I found it excessively complicated. Not everyone does find it so, I guess. Maybe it's just my intrinsic laziness, but I found C++ tedious to work with. Add to that the prevalent use of macros in C/C++ languages, and you get a mess... Compare C++ to another "complicated" language: Ada. I imagine between the two, Ada has code that's much easier to read. -JJR
Jan 05 2006
prev sibling next sibling parent pragma <pragma_member pathlink.com> writes:
In article <dpinj0$1l2s$1 digitaldaemon.com>, bobef says...
pragma wrote:
 In article <dpflnm$1vo1$1 digitaldaemon.com>, Valgrind valmouth.eu says...
 
Did anyone see the slashdot article on Bjarne Stroustrup Previews C++0x?

http://it.slashdot.org/it/06/01/03/0351202.shtml?tid=156&tid=8

What I don't understand is: why is Stroustrup making this language even more complicated than it already is?

I really can't understand why so many people are finding C++ a complicated language (if they are not writing a compiler for it)... Probably I am not using its full potential (after I found D I am not using it at all). Anyway...

Well, it's never really the language's fault, right? The very grammar of templates combined with some other features leads to very unreadable code for relatively simple, yet insanely useful, things (at least when compared with other languages). It's hard for the C++ developer to go "nope, that's just unreadable" and walk away when it *does* work. It just seems like the language lends itself to unwieldy solutions, and its proponents seem to accept this as okay (or are just better at coping with it).

There's an old saying: "C lets you shoot yourself in the foot, and C++ lets you reuse the bullet."

- EricAnderton at yahoo
Jan 05 2006
prev sibling parent reply bobef <bobef lessequal.com> writes:
bobef wrote:

 
 Maybe someone should shoot him an email and invite him to
 this newsgroup before he spends another 4 years "improving" c++?

I like this idea. I will do it.

He replied: "Thanks you. I'm aware of D, but my work is with C++."
Jan 05 2006
parent reply James Dunne <james.jdunne gmail.com> writes:
bobef wrote:
 bobef wrote:
 
 Maybe someone should shoot him an email and invite him to
 this newsgroup before he spends another 4 years "improving" c++?

I like this idea. I will do it.

He replied: "Thanks you. I'm aware of D, but my work is with C++."

What a cop out.
Jan 05 2006
next sibling parent reply bobef <bobef lessequal.com> writes:
James Dunne wrote:
 bobef wrote:
 
 bobef wrote:

 Maybe someone should shoot him an email and invite him to
 this newsgroup before he spends another 4 years "improving" c++?

I like this idea. I will do it.

He replied: "Thanks you. I'm aware of D, but my work is with C++."

What a cop out.

"What a cop out." What is this? I don't understand it. Sorry for my poor English.
Jan 05 2006
parent reply Sean Kelly <sean f4.ca> writes:
bobef wrote:
 James Dunne wrote:
 bobef wrote:

 bobef wrote:
 Maybe someone should shoot him an email and invite him to
 this newsgroup before he spends another 4 years "improving" c++?

I like this idea. I will do it.

He replied: "Thanks you. I'm aware of D, but my work is with C++."

What a cop out.

"What a cop out." What is this? I don't understand it. Sorry for my poor English.

It means "what a poor excuse," essentially. Sean
Jan 05 2006
parent bobef <bobef lessequal.com> writes:
Sean Kelly wrote:
 bobef wrote:
 
 James Dunne wrote:

 bobef wrote:

 bobef wrote:

 Maybe someone should shoot him an email and invite him to
 this newsgroup before he spends another 4 years "improving" c++?

I like this idea. I will do it.

He replied: "Thanks you. I'm aware of D, but my work is with C++."

What a cop out.

"What a cop out." What is this? I don't understand it. Sorry for my poor English.

It means "what a poor excuse," essentially.

Oh, I see. Thank you.
Jan 05 2006
prev sibling parent reply "Matthew" <matthew hat.stlsoft.dot.org> writes:
"James Dunne" <james.jdunne gmail.com> wrote in message
news:dpjmoe$2hbf$1 digitaldaemon.com...
 bobef wrote:
 bobef wrote:

 Maybe someone should shoot him an email and invite him to
 this newsgroup before he spends another 4 years "improving" c++?

I like this idea. I will do it.

He replied: "Thanks you. I'm aware of D, but my work is with C++."

What a cop out.

By Dr Stroustrup, or by you guys? Perhaps what he should have said was "oh yes, this language whose billions of lines of extant code that're underpinning most of the technology in this modern world in which we live, and which has a number of flaws, some of which were necessitated to get it from the research lab and into the code shops, and some of which are side-effects of emergent properties of novel, unanticipated and powerful uses and combinations of new features that have been added to the language as it has evolved over the last two decades while keeping it viable and backwards compatible, this language ... well, I'll just throw it all out right now and go for D, a new language with huge potential but that is immature, poor on support, without a library worth the name, without a distilled (or at least apparent) vision, with a huge number of questions yet to be answered, and all under the control of one man who, albeit with stupendous talent and incredible 29:1-like productivity, has an admittedly compiler-writer viewpoint rather than one of library-writer or application-writer, or some combination thereof."

Yes, what a cop-out indeed.

Matthew

P.S. FWIW: Sean and I are putting our money where our mouths are this month, and will be presenting the first of a number of proposals to the C++ standards committee. It's boring and arduous (and very likely thankless and futile) work, to be sure, and much less fun than just pouring scorn on someone from afar, but, Hey!, someone's got to do it. (And for anyone who would rather dismiss this as just a defensive rant from Bjarne Stroustrup's little puppy, you should have seen some of the heat I received from several members of the TCS Ed Panel for the title of my latest TCS article. <g>)
Jan 05 2006
next sibling parent reply BCS <BCS_member pathlink.com> writes:
Matthew wrote:

 little puppy, you should have seen some of the heat I received from several
 members of the TCS Ed Panel for the title of my latest TCS article. <g>)
 
 

link??
Jan 05 2006
parent reply "Matthew" <matthew hat.stlsoft.dot.org> writes:
"BCS" <BCS_member pathlink.com> wrote in message
news:dpk2pu$2s1n$1 digitaldaemon.com...
 Matthew wrote:

 little puppy, you should have seen some of the heat I received from several
 members of the TCS Ed Panel for the title of my latest TCS article. <g>)

link??

http://www.artima.com/cppsource/deepspace.html
Jan 05 2006
next sibling parent Sean Kelly <sean f4.ca> writes:
Matthew wrote:
 "BCS" <BCS_member pathlink.com> wrote in message
 news:dpk2pu$2s1n$1 digitaldaemon.com...
 Matthew wrote:

 little puppy, you should have seen some of the heat I received from several
 members of the TCS Ed Panel for the title of my latest TCS article. <g>)


http://www.artima.com/cppsource/deepspace.html

You caught flak over that? LOL. The C++ community can be a bit odd at times. Sean
Jan 05 2006
prev sibling parent reply Derek Parnell <derek psych.ward> writes:
On Fri, 6 Jan 2006 10:45:27 +1100, Matthew wrote:

 http://www.artima.com/cppsource/deepspace.html

Loved the article. I'm happy to report that our thinking is, so far, identical on this matter. -- Derek (skype: derek.j.parnell) Melbourne, Australia "Down with mediocracy!" 6/01/2006 12:43:01 PM
Jan 05 2006
parent "Matthew" <matthew hat.stlsoft.dot.org> writes:
 http://www.artima.com/cppsource/deepspace.html

Loved the article.

Thanks!
 I'm happy to report that our thinking is, so far,
 identical on this matter.

Give it time. ;-)
Jan 05 2006
prev sibling next sibling parent reply John Reimer <terminal.node gmail.com> writes:
Matthew wrote:
 "James Dunne" <james.jdunne gmail.com> wrote in message
 news:dpjmoe$2hbf$1 digitaldaemon.com...
 bobef wrote:
 bobef wrote:

 Maybe someone should shoot him an email and invite him to
 this newsgroup before he spends another 4 years "improving" c++?

I like this idea. I will do it.

He replied: "Thanks you. I'm aware of D, but my work is with C++."


By Dr Stroustrup, or by you guys? Perhaps what he should have said was "oh yes, this language whose billions of lines of extant code that're underpinning most of the technology in this modern world in which we live, and which has a number of flaws, some of which were necessitated to get it from the research lab and into the code shops, and some of which are side-effects of emergent properties of novel, unanticipated and powerful uses and combinations of new features that have been added to the language as it has evolved over the last two decades while keeping it viable and backwards compatible, this language ... well, I'll just throw it all out right now and go for D, a new language with huge potential but that is immature, poor on support, without a library worth the name, without a distilled (or at least apparent) vision, with a huge number of questions yet to be answered, and all under the control of one man who, albeit with stupendous talent and incredible 29:1-like productivity, has an admittedly compiler-writer viewpoint rather than one of library-writer or application-writer, or some combination thereof." Yes, what a cop-out indeed. Matthew P.S. FWIW: Sean and I are putting our money where our mouths are this month, and will be presenting the first of a number of proposals to the C++ standards committee. It's boring and arduous (and very likely thankless and futile) work, to be sure, and much less fun than just pouring scorn on someone from afar, but, Hey!, someone's got to do it. (And for anyone who would rather dismiss this as just a defensive rant from Bjarne Stroustrup's little puppy, you should have seen some of the heat I received from several members of the TCS Ed Panel for the title of my latest TCS article. <g>)

Ah... I agree Matthew. But I think the point we were trying to get across was that "we're disappointed." It may be illogical in the childish-folding-of-arms-and-pouting sort of way, but it doesn't remove the fact that we're disappointed that he didn't put all those "inconsequential" things aside to visit here! :D -JJR
Jan 05 2006
next sibling parent reply "Matthew" <matthew hat.stlsoft.dot.org> writes:
 Ah... I agree Matthew.  But I think the point we were trying to get
 across was that "we're disappointed."  It may be illogical in the
 childish-folding-of-arms-and-pouting sort of way, but it doesn't remove
 the fact that we're disappointed that he didn't put all those
 "inconsequential" things aside to visit here! :D

Well, you're assuming he's not looked at the language, and my impression of comments over the last couple of years is that he has. I suspect it's not a detailed look, maybe a bit of a test program or two, probably not much more. (All that's pure conjecture, of course.)

For D to prosper - which is, to steal away mindshare from C++ and from Java and from C#/.NET - it has to be both demonstrably superior and demonstrably viable. There's no point just telling people how great some features are now. People need proof: tangible, immediate, compelling. And it needs to be easy. Which means excellent tools, excellent standard library, seriously impressive non-trivial applications, well-honed language specification, and so on.

Since D's not at this point yet, any expressions of discontent and/or wonder that people are not yet leaving their other, established, languages in droves are both pointless and foolish. All that energy would be better spent working/honing/inventing/documenting. (Not that most of that's terribly fun, of course.)

So, however much I/you/we might like D and/or think D has great potential, most people simply won't care until it's irrefutable. (That's why I wonder why Walter bashes his head against the brick-wall of c.l.c++.m so much. Perhaps he's using it to stimulate his fertile mind for such C++/Java/C#-killing ideas.)
Jan 05 2006
next sibling parent John Reimer <terminal.node gmail.com> writes:
Matthew wrote:
 Ah... I agree Matthew.  But I think the point we were trying to get
 across was that "we're disappointed."  It may be illogical in the
 childish-folding-of-arms-and-pouting sort of way, but it doesn't remove
 the fact that we're disappointed that he didn't put all those
 "inconsequential" things aside to visit here! :D

Well, you're assuming he's not looked at the language, and my impression of comments over the last couple of years is that he has. I suspect it's not a detailed look, maybe a bit of a test program or two, probably not much more. (All that's pure conjecture, of course.) For D to prosper - which is, to steal away mindshare from C++ and from Java and from C#/.NET - it has to be both demonstrably superior and demonstrably viable. There's no point just telling people how great some features are now. People need proof: tangible, immediate, compelling. And it needs to be easy. Which means excellent tools, excellent standard library, seriously impressive non-trivial applications, well-honed language specification, and so on. Since D's not at this point yet, any expressions of discontent and/or wonder that people are not yet leaving their other, established, languages in droves are both pointless and foolish. All that energy would be better spent working/honing/inventing/documenting. (Not that most of that's terribly fun, of course.) So, however much I/you/we might like D and/or think D has great potential, most people simply won't care until it's irrefutable. (That's why I wonder why Walter bashes his head against the brick-wall of c.l.c++.m so much. Perhaps he's using it to stimulate his fertile mind for such C++/Java/C#-killing ideas.)

Waaiiiittt a minute... What was I thinking? Why would we want to have come to /this/ newsgroup. Why would we want him to see all the gripes and grimaces going on in here? Better to paint a pretty picture for him and others far and away from here! Concerning the above, all good points, Matthew. Welcome back! -JJR
Jan 05 2006
prev sibling parent reply "Walter Bright" <newshound digitalmars.com> writes:
"Matthew" <matthew hat.stlsoft.dot.org> wrote in message 
news:dpkbp7$22v$1 digitaldaemon.com...
 (That's why I wonder
 why Walter bashes his head against the brick-wall of c.l.c++.m so much.
 Perhaps he's using it to stimulate his fertile mind for such
 C++/Java/C#-killing ideas.)

Because it's good practice.
Jan 05 2006
parent reply "Matthew" <matthew hat.stlsoft.dot.org> writes:
"Walter Bright" <newshound digitalmars.com> wrote in message
news:dpkole$ai0$1 digitaldaemon.com...
 "Matthew" <matthew hat.stlsoft.dot.org> wrote in message
 news:dpkbp7$22v$1 digitaldaemon.com...
 (That's why I wonder
 why Walter bashes his head against the brick-wall of c.l.c++.m so much.
 Perhaps he's using it to stimulate his fertile mind for such
 C++/Java/C#-killing ideas.)

Because it's good practice.

Indeed. I guess I just admire your ability to absorb the banging. I get a headache if I visit the C++ newsgroups more than once a week. That's probably why I like being here ('cept when I'm tired and grumpy). <g>
Jan 05 2006
parent reply "Walter Bright" <newshound digitalmars.com> writes:
"Matthew" <matthew hat.stlsoft.dot.org> wrote in message 
news:dpkqgo$bml$1 digitaldaemon.com...
 "Walter Bright" <newshound digitalmars.com> wrote in message
 news:dpkole$ai0$1 digitaldaemon.com...
 "Matthew" <matthew hat.stlsoft.dot.org> wrote in message
 news:dpkbp7$22v$1 digitaldaemon.com...
 (That's why I wonder
 why Walter bashes his head against the brick-wall of c.l.c++.m so much.
 Perhaps he's using it to stimulate his fertile mind for such
 C++/Java/C#-killing ideas.)

Because it's good practice.

Indeed. I guess I just admire your ability to absorb the banging. I get a headache if I visit the C++ newsgroups more than once a week. That's probably why I like being here ('cept when I'm tired and grumpy). <g>

I was invited to another company in a couple weeks to give a presentation on D. I find these debates on the C++ newsgroups help me a lot in being prepared. I confess to enjoying the debate, too <g>.
Jan 05 2006
parent reply "Matthew" <matthew hat.stlsoft.dot.org> writes:
 (That's why I wonder
 why Walter bashes his head against the brick-wall of c.l.c++.m so much.
 Perhaps he's using it to stimulate his fertile mind for such
 C++/Java/C#-killing ideas.)

Because it's good practice.

Indeed. I guess I just admire your ability to absorb the banging. I get a
 headache if I visit the C++ newsgroups more than once a week. That's
 probably why I like being here ('cept when I'm tired and grumpy). <g>

I was invited to another company in a couple weeks to give a presentation on
 D. I find these debates on the C++ newsgroups help me a lot in being
 prepared. I confess to enjoying the debate, too <g>.

It's funny, the differences, I suppose. I enjoy debate only until it gets to where I'd have to repeat myself, or where the person I'm talking to goes all emotional and unscientific and refuses to acknowledge the trail of established propositions from agreed axioms, at which point I get the overwhelming feeling that my life is being sucked out and wasted, so I slope off to write an article on the topic, subject largely to my own argument and approval. (Hence, the birth of the Nuclear Reactor and the Deep Space Probe series, which started on this NG some 18 months ago.) You however, would rather visit the dentist than write an article, but yet are quite happy to go on repeating your points to people who won't listen (or to those who've already proven you wrong <g>). Vive le difference, I suppose. ;-)
Jan 05 2006
next sibling parent reply Sean Kelly <sean f4.ca> writes:
Matthew wrote:
 
 It's funny, the differences, I suppose.
 
 I enjoy debate only until it gets to where I'd have to repeat myself, or
 where the person I'm talking to goes all emotional and unscientific and
 refuses to acknowledge the trail of established propositions from agreed
 axioms, at which point I get the overwhelming feeling that my life is being
 sucked out and wasted, so I slope off to write an article on the topic,
 subject largely to my own argument and approval. (Hence, the birth of the
 Nuclear Reactor and the Deep Space Probe series, which started on this NG
 some 18 months ago.)
 
 You however, would rather visit the dentist than write an article, but yet
 are quite happy to go on repeating your points to people who won't listen
 (or to those who've already proven you wrong <g>).
 
 Vive le difference, I suppose. ;-)

Perhaps you and Walter should co-author the first dialectic programming book. Then bulletproof arguments could be made and Walter could feel like he was arguing with somebody ;-) Sean
Jan 05 2006
next sibling parent kris <fu bar.org> writes:
Sean Kelly wrote:
 Matthew wrote:
 
 It's funny, the differences, I suppose.

 I enjoy debate only until it gets to where I'd have to repeat myself, or
 where the person I'm talking to goes all emotional and unscientific and
 refuses to acknowledge the trail of established propositions from agreed
 axioms, at which point I get the overwhelming feeling that my life is 
 being
 sucked out and wasted, so I slope off to write an article on the topic,
 subject largely to my own argument and approval. (Hence, the birth of the
 Nuclear Reactor and the Deep Space Probe series, which started on this NG
 some 18 months ago.)

 You however, would rather visit the dentist than write an article, but 
 yet
 are quite happy to go on repeating your points to people who won't listen
 (or to those who've already proven you wrong <g>).

 Vive le difference, I suppose. ;-)

Perhaps you and Walter should co-author the first dialectic programming book. Then bulletproof arguments could be made and Walter could feel like he was arguing with somebody ;-) Sean

ROFL! Funniest post of the year! Uhhhhhhhh ... oh. Well, it's funny anyway <g>
Jan 05 2006
prev sibling next sibling parent pragma <pragma_member pathlink.com> writes:
In article <dpl34r$hbi$1 digitaldaemon.com>, Sean Kelly says...
Matthew wrote:
 
 It's funny, the differences, I suppose.
 
 I enjoy debate only until it gets to where I'd have to repeat myself, or
 where the person I'm talking to goes all emotional and unscientific and
 refuses to acknowledge the trail of established propositions from agreed
 axioms, at which point I get the overwhelming feeling that my life is being
 sucked out and wasted, so I slope off to write an article on the topic,
 subject largely to my own argument and approval. (Hence, the birth of the
 Nuclear Reactor and the Deep Space Probe series, which started on this NG
 some 18 months ago.)
 
 You however, would rather visit the dentist than write an article, but yet
 are quite happy to go on repeating your points to people who won't listen
 (or to those who've already proven you wrong <g>).
 
 Vive le difference, I suppose. ;-)

Perhaps you and Walter should co-author the first dialectic programming book. Then bulletproof arguments could be made and Walter could feel like he was arguing with somebody ;-)

Dialectic book? A radio-show format podcast would work nicely, IMO. Plus, there's fewer editors to deal with. - EricAnderton at yahoo
Jan 06 2006
prev sibling parent reply "Matthew" <matthew hat.stlsoft.dot.org> writes:
"Sean Kelly" <sean f4.ca> wrote in message
news:dpl34r$hbi$1 digitaldaemon.com...
 Matthew wrote:
 It's funny, the differences, I suppose.

 I enjoy debate only until it gets to where I'd have to repeat myself, or
 where the person I'm talking to goes all emotional and unscientific and
 refuses to acknowledge the trail of established propositions from agreed
 axioms, at which point I get the overwhelming feeling that my life is being
 sucked out and wasted, so I slope off to write an article on the topic,
 subject largely to my own argument and approval. (Hence, the birth of the
 Nuclear Reactor and the Deep Space Probe series, which started on this NG
 some 18 months ago.)

 You however, would rather visit the dentist than write an article, but yet
 are quite happy to go on repeating your points to people who won't listen
 (or to those who've already proven you wrong <g>).

 Vive le difference, I suppose. ;-)

Perhaps you and Walter should co-author the first dialectic programming book. Then bulletproof arguments could be made and Walter could feel like he was arguing with somebody ;-)

Don't give me more book ideas. I've already got two more for after Extended STL, volume 2! My wife's all set for getting out the guillotine, and it's not entirely clear whether that's for the books or my neck ...
Jan 06 2006
parent reply John Reimer <terminal.node gmail.com> writes:
Matthew wrote:

 Don't give me more book ideas. I've already got two more for after Extended
 STL, volume 2!
 
 My wife's all set for getting out the guillotine, and it's not entirely
 clear whether that's for the books or my neck ...
 
 

Matthew, I hate to tell you this, but I haven't heard of an instance in which a guillotine was used on any books. I think you're in trouble! -JJR
Jan 06 2006
parent "Matthew" <matthew hat.stlsoft.dot.org> writes:
 Don't give me more book ideas. I've already got two more for after Extended
 STL, volume 2!

 My wife's all set for getting out the guillotine, and it's not entirely
 clear whether that's for the books or my neck ...

Matthew, I hate to tell you this, but I haven't heard of an instance in which a guillotine was used on any books.

Ulp!
Jan 06 2006
prev sibling next sibling parent "Walter Bright" <newshound digitalmars.com> writes:
"Matthew" <matthew hat.stlsoft.dot.org> wrote in message 
news:dpkveu$eth$1 digitaldaemon.com...
 You however, would rather visit the dentist than write an article,

True enough, the dentist is less painful <g>. I enjoy debate like others enjoy a relaxing game of football between friends. What I don't enjoy is trading insults, and when a debate gets personal, I'll check out.
Jan 06 2006
prev sibling parent "Craig Black" <cblack ara.com> writes:
"Matthew" <matthew hat.stlsoft.dot.org> wrote in message 
news:dpkveu$eth$1 digitaldaemon.com...
 (That's why I wonder
 why Walter bashes his head against the brick-wall of c.l.c++.m so much.
 Perhaps he's using it to stimulate his fertile mind for such
 C++/Java/C#-killing ideas.)

Because it's good practice.

Indeed. I guess I just admire your ability to absorb the banging. I get a
 headache if I visit the C++ newsgroups more than once a week. That's
 probably why I like being here ('cept when I'm tired and grumpy). <g>

I was invited to another company in a couple weeks to give a presentation on
 D. I find these debates on the C++ newsgroups help me a lot in being
 prepared. I confess to enjoying the debate, too <g>.

It's funny, the differences, I suppose. I enjoy debate only until it gets to where I'd have to repeat myself, or where the person I'm talking to goes all emotional and unscientific and refuses to acknowledge the trail of established propositions from agreed axioms, at which point I get the overwhelming feeling that my life is being sucked out and wasted, so I slope off to write an article on the topic, subject largely to my own argument and approval. (Hence, the birth of the Nuclear Reactor and the Deep Space Probe series, which started on this NG some 18 months ago.) You however, would rather visit the dentist than write an article, but yet are quite happy to go on repeating your points to people who won't listen (or to those who've already proven you wrong <g>). Vive le difference, I suppose. ;-)

lol! -Craig
Jan 06 2006
prev sibling parent Carlos Santander <csantander619 gmail.com> writes:
John Reimer escribió:
 Matthew wrote:
 
 "James Dunne" <james.jdunne gmail.com> wrote in message
 news:dpjmoe$2hbf$1 digitaldaemon.com...

 bobef wrote:

 bobef wrote:

 Maybe someone should shoot him an email and invite him to
 this newsgroup before he spends another 4 years "improving" c++?

I think he already knew about D. I vaguely remember someone saying something about that. At least Walter has had his name in the Acknowledgments page since I can remember.
 I like this idea. I will do it.

He replied: "Thanks you. I'm aware of D, but my work is with C++."

What a cop out.

By Dr Stroustrup, or by you guys? Perhaps what he should have said was "oh yes, this language whose billions of lines of extant code that're underpinning most of the technology in this modern world in which we live, and which has a number of flaws, some of which were necessitated to get it from the research lab and into the code shops, and some of which are side-effects of emergent properties of novel, unanticipated and powerful uses and combinations of new features that have been added to the language as it has evolved over the last two decades while keeping it viable and backwards compatible, this language ... well, I'll just throw it all out right now and go for D, a new language with huge potential but that is immature, poor on support, without a library worth the name, without a distilled (or at least apparent) vision, with a huge number of questions yet to be answered, and all under the control of one man who, albeit with stupendous talent and incredible 29:1-like productivity, has an admittedly compiler-writer viewpoint rather than one of library-writer or application-writer, or some combination thereof."


I understand that you and Mr. Stroustrup feel that way, while I don't agree with all points.
 Yes, what a cop-out indeed.

 Matthew

 P.S. FWIW: Sean and I are putting our money where our mouths are this 
 month,
 and will be presenting the first of a number of proposals to the C++
 standards committee. It's boring and arduous (and very likely 
 thankless and
 futile) work, to be sure, and much less fun than just pouring scorn on
 someone from afar, but, Hey!, someone's got to do it. (And for anyone who
 would rather dismiss this as just a defensive rant from Bjarne 
 Stroustrup's
 little puppy, you should have seen some of the heat I received from 
 several
 members of the TCS Ed Panel for the title of my latest TCS article. <g>)

Ah... I agree Matthew. But I think the point we were trying to get across was that "we're disappointed." It may be illogical in the childish-folding-of-arms-and-pouting sort of way, but it doesn't remove the fact that we're disappointed that he didn't put all those "inconsequential" things aside to visit here! :D -JJR

Maybe he is too busy? You know, after seeing a bit of the real world, I understand why some (many) persons and companies won't show any interest in D at this moment. It's going to take a while for D to get noticed, and I'm still 200% on the D boat, but I don't feel like saying "what a cop out" because of what he said. -- Carlos Santander Bernal
Jan 05 2006
prev sibling parent reply James Dunne <james.jdunne gmail.com> writes:
Matthew wrote:
 "James Dunne" <james.jdunne gmail.com> wrote in message
 news:dpjmoe$2hbf$1 digitaldaemon.com...
 
bobef wrote:

bobef wrote:


Maybe someone should shoot him an email and invite him to
this newsgroup before he spends another 4 years "improving" c++?

I like this idea. I will do it.

He replied: "Thanks you. I'm aware of D, but my work is with C++."

What a cop out.

By Dr Stroustrup, or by you guys? Perhaps what he should have said was "oh yes, this language whose billions of lines of extant code that're underpinning most of the technology in this modern world in which we live, and which has a number of flaws, some of which were necessitated to get it from the research lab and into the code shops, and some of which are side-effects of emergent properties of novel, unanticipated and powerful uses and combinations of new features that have been added to the language as it has evolved over the last two decades while keeping it viable and backwards compatible, this language ... well, I'll just throw it all out right now and go for D,

Where'd you get all that from out of "cop out"? I certainly understand it all, and yes I agree, but it's not what I was getting at. I didn't say any of that.
 a new language with huge
 potential but that is immature, poor on support, without a library worth the
 name, without a distilled (or at least apparent) vision, with a huge number
 of questions yet to be answered, and all under the control of one man who,
 albeit with stupendous talent and incredible 29:1-like productivity, has an
 admittedly compiler-writer viewpoint rather than one of library-writer or
 application-writer, or some combination thereof."

I like your 29:1 figure :). I think it makes sense for Walter to be the one-man-show for now until the language is mostly finalized and frozen at 1.0 with most of the quirks and gotchas removed. Until this point comes, the standard library development should remain at near a halt.
 Yes, what a cop-out indeed.
 
 Matthew
 
 P.S. FWIW: Sean and I are putting our money where our mouths are this month,
 and will be presenting the first of a number of proposals to the C++
 standards committee. It's boring and arduous (and very likely thankless and
 futile) work, to be sure, and much less fun than just pouring scorn on
 someone from afar, but, Hey!, someone's got to do it. (And for anyone who
 would rather dismiss this as just a defensive rant from Bjarne Stroustrup's
 little puppy, you should have seen some of the heat I received from several
 members of the TCS Ed Panel for the title of my latest TCS article. <g>)
 

Jan 06 2006
parent reply "Craig Black" <cblack ara.com> writes:
 I think it makes sense for Walter to be the one-man-show for now until the 
 language is mostly finalized and frozen at 1.0 with most of the quirks and 
 gotchas removed.  Until this point comes, the standard library development 
 should remain at near a halt.

Huh? A near halt? That's crazy talk! Just because the language isn't finalized doesn't mean that the syntax will change radically. I don't think that the pre-1.0 state of the compiler should be a deterrent to advancing the standard library. -Craig
Jan 10 2006
parent reply Ivan Senji <ivan.senji_REMOVE_ _THIS__gmail.com> writes:
Craig Black wrote:
I think it makes sense for Walter to be the one-man-show for now until the 
language is mostly finalized and frozen at 1.0 with most of the quirks and 
gotchas removed.  Until this point comes, the standard library development 
should remain at near a halt.

Huh? A near halt? That's crazy talk! Just because the language isn't finalized doesn't mean that the syntax will change radically. I don't think that the pre-1.0 state of the compiler should be a deterrent to advancing the standard library.

I agree 100%. If things were done this way when 1.0 is out it would have no library and no one would want or be able to use it.
Jan 10 2006
parent reply "Ameer Armaly" <ameer_armaly hotmail.com> writes:
"Ivan Senji" <ivan.senji_REMOVE_ _THIS__gmail.com> wrote in message 
news:dq184n$2u0o$1 digitaldaemon.com...
 Craig Black wrote:
I think it makes sense for Walter to be the one-man-show for now until 
the language is mostly finalized and frozen at 1.0 with most of the 
quirks and gotchas removed.  Until this point comes, the standard library 
development should remain at near a halt.

Huh? A near halt? That's crazy talk! Just because the language isn't finalized doesn't mean that the syntax will change radically. I don't think that the pre-1.0 state of the compiler should be a deterrent to advancing the standard library.

I agree 100%. If things were done this way when 1.0 is out it would have no library and no one would want or be able to use it.

I agree. The language and library are dependent on each other to achieve success.
Jan 10 2006
parent reply James Dunne <james.jdunne gmail.com> writes:
Ameer Armaly wrote:
 "Ivan Senji" <ivan.senji_REMOVE_ _THIS__gmail.com> wrote in message 
 news:dq184n$2u0o$1 digitaldaemon.com...
 
Craig Black wrote:

I think it makes sense for Walter to be the one-man-show for now until 
the language is mostly finalized and frozen at 1.0 with most of the 
quirks and gotchas removed.  Until this point comes, the standard library 
development should remain at near a halt.

Huh? A near halt? That's crazy talk! Just because the language isn't finalized doesn't mean that the syntax will change radically. I don't think that the pre-1.0 state of the compiler should be a deterrent to advancing the standard library.

I agree 100%. If things were done this way when 1.0 is out it would have no library and no one would want or be able to use it.

I agree. The language and library are dependent on each other to achieve success.

Not if the semantics of language features keep changing.
Jan 11 2006
parent reply "Ameer Armaly" <ameer_armaly hotmail.com> writes:
"James Dunne" <james.jdunne gmail.com> wrote in message 
news:dq3ht8$1tsd$1 digitaldaemon.com...
 Ameer Armaly wrote:
 "Ivan Senji" <ivan.senji_REMOVE_ _THIS__gmail.com> wrote in message 
 news:dq184n$2u0o$1 digitaldaemon.com...

Craig Black wrote:

I think it makes sense for Walter to be the one-man-show for now until 
the language is mostly finalized and frozen at 1.0 with most of the 
quirks and gotchas removed.  Until this point comes, the standard 
library development should remain at near a halt.

Huh? A near halt? That's crazy talk! Just because the language isn't finalized doesn't mean that the syntax will change radically. I don't think that the pre-1.0 state of the compiler should be a deterrent to advancing the standard library.

I agree 100%. If things were done this way when 1.0 is out it would have no library and no one would want or be able to use it.

I agree. The language and library are dependent on each other to achieve success.

Not if the semantics of language features keep changing.

But is the rate of change a big enough concern at this point to justify that? In my opinion it's not; the language has achieved a reasonable level of stability for a faster pace in library development.
Jan 11 2006
parent reply James Dunne <james.jdunne gmail.com> writes:
Ameer Armaly wrote:
 "James Dunne" <james.jdunne gmail.com> wrote in message 
 news:dq3ht8$1tsd$1 digitaldaemon.com...
 
Ameer Armaly wrote:

"Ivan Senji" <ivan.senji_REMOVE_ _THIS__gmail.com> wrote in message 
news:dq184n$2u0o$1 digitaldaemon.com...


Craig Black wrote:


I think it makes sense for Walter to be the one-man-show for now until 
the language is mostly finalized and frozen at 1.0 with most of the 
quirks and gotchas removed.  Until this point comes, the standard 
library development should remain at near a halt.

Huh? A near halt? That's crazy talk! Just because the language isn't finalized doesn't mean that the syntax will change radically. I don't think that the pre-1.0 state of the compiler should be a deterrent to advancing the standard library.

I agree 100%. If things were done this way when 1.0 is out it would have no library and no one would want or be able to use it.

D the language 1.0? or D the infrastructure + language 1.0?
I agree.  The language and library are dependent on each other to achieve 
success.

Yes, and in order for success to be achieved, the D library must wait for the D language.
Not if the semantics of language features keep changing.

But is the rate of change a big enough concern at this point to justify that? In my opinion it's not; the language has achieved a reasonable level of stability for a faster pace in library development.

You can't be guaranteed that the rate of change is constant or even predictable. So many of the latest proposals for D offer breaking changes or radically different semantics. You don't know which ones are planned for integration. Without some kind of roadmap in front of you, it's hard to see if you're going to drive straight into a tree.

The D infrastructure 1.0 can stand to wait for the D language 1.0. IMO, waiting for the D language 1.0 can prove to have many benefits that outweigh the costs of beginning to develop large quantities of possibly unsupported code. Compiler bugs often force one to hack around them or even refactor certain portions of code. To go back after that bug has been fixed and unhack / re-refactor the design is a costly operation, depending on how much code depends on that hack/design.

Walter has more direct control of the language pre-1.0. We must be patient and let him (and us all) improve the language and fix bugs up to the feature freeze. Let us ensure ourselves that we have a quality language on which to design and build a quality infrastructure. Let's not repeat the mistakes of C++ here, please.

Summary: Patience. There is no deadline for this project. What's the rush?
Jan 12 2006
parent reply Sean Kelly <sean f4.ca> writes:
James Dunne wrote:
 
 You can't be guaranteed that the rate of change is constant or even 
 predictable.  So many of the latest proposals for D offer breaking 
 changes or radically different semantics.  You don't know which ones are 
 planned for integration.  Without some kind of roadmap in front of you, 
 it's hard to see if you're going to drive straight into a tree.

True. But language development has settled down for the most part. And most of the existing features that may yet change have been talked about on this NG quite a bit. AAs, for instance.
 
 The D infrastructure 1.0 can stand to wait for the D language 1.0.  IMO, 
 waiting for the D language 1.0 can prove to have many benefits that 
 outweigh the costs of beginning to develop large quantities of possibly 
 unsupported code.  Compiler bugs often warrant one to hack around or 
 possibly even refactor certain portions of code.  To go back after that 
 bug has been fixed and unhack / re-refactor the design is a costly 
 operation, depending on how much code depends on that hack/design.

Yup. Which is why I've been avoiding serious template programming to date. Every time I try it I run into bugs that are either showstoppers or require workarounds I'm not willing to provide. That said, the compiler has improved here quite a bit recently. There's only one or two remaining issues I have with Ares in this regard, but since few people are using std.atomic (Ares' only real use of templates at the moment), it can wait.
 Walter has more direct control of the language pre-1.0.  We must be 
 patient and let him (and us all) improve the language and fix bugs up to 
 the feature freeze.  Let us ensure ourselves that we have a quality 
 language on which to design and build a quality infrastructure.  Let's 
 not repeat the mistakes of C++ here, please.
 
 Summary:  Patience.  There is no deadline for this project.  What's the 
 rush?

D's window may close a tad if things take too long. But that's more a measure of years than months. Sean
Jan 12 2006
parent reply James Dunne <james.jdunne gmail.com> writes:
Sean Kelly wrote:
 James Dunne wrote:
 
 You can't be guaranteed that the rate of change is constant or even 
 predictable.  So many of the latest proposals for D offer breaking 
 changes or radically different semantics.  You don't know which ones 
 are planned for integration.  Without some kind of roadmap in front of 
 you, it's hard to see if you're going to drive straight into a tree.

True. But language development has settled down for the most part. And most of the existing features that may yet change have been talked about on this NG quite a bit. AAs, for instance.

Finally, thank you for understanding my point of view here. =)

In response: What about sets, ranges, array literals? Potential features shouldn't be left out too, especially considering the quantity of posts relating to them.

Bit arrays are also a problem, and should be library-ized. It was a nice attempt but not implemented to the extent it should've been. bool should be sized to a machine-size word for speed and addressability. bool should remain bool: true or false, nothing else. While it is true that a bool can map to a bit, it creates problems in the long run when people don't realize the limitations of bits, bit arrays, etc.

For example: Just not too long ago I wrote some D code that used a dynamic bool array to keep track of freed items. Weird problems arose because it was aliased down to a bit array which had issues (I can't recall what the exact problem was). I converted the bool array to an int array and the problems magically disappeared. This is a 'gotcha' that should either be fixed by fixing the bit array implementation or by creating a standard bool type.

Inner classes and templates are still sources of major problems.
 The D infrastructure 1.0 can stand to wait for the D language 1.0.  
 IMO, waiting for the D language 1.0 can prove to have many benefits 
 that outweigh the costs of beginning to develop large quantities of 
 possibly unsupported code.  Compiler bugs often warrant one to hack 
 around or possibly even refactor certain portions of code.  To go back 
 after that bug has been fixed and unhack / re-refactor the design is a 
 costly operation, depending on how much code depends on that hack/design.

Yup. Which is why I've been avoiding serious template programming to date. Every time I try it I run into bugs that are either showstoppers or require workarounds I'm not willing to provide. That said, the compiler has improved here quite a bit recently. There's only one or two remaining issues I have with Ares in this regard, but since few people are using std.atomic (Ares' only real use of templates at the moment), it can wait.

So have I (on template programming). We need a serious template programming tutorial/article to really show the masses (and me) how much power D's template system holds. I know it's powerful, but I don't feel comfortable using it at the moment. Mostly, all the aliasing and scope-avoidance quirks make me squeamish. Especially that damned annoying "template X cannot be used as a type" error message I get back all the time.

BTW, I would like to use std.atomic if someone would build a concurrent library around it (as lock-free as possible). Much like Ben Hinkle has done with MinTL, but last I recall he hasn't updated it for the latest compilers. Correct me if I'm wrong, please.
 Walter has more direct control of the language pre-1.0.  We must be 
 patient and let him (and us all) improve the language and fix bugs up 
 to the feature freeze.  Let us ensure ourselves that we have a quality 
 language on which to design and build a quality infrastructure.  Let's 
 not repeat the mistakes of C++ here, please.

 Summary:  Patience.  There is no deadline for this project.  What's 
 the rush?

D's window may close a tad if things take too long. But that's more a measure of years than months. Sean

D's window to whom exactly? Those who don't know about it? What's closing that window?

I can think of a few things on Digital Mars' side of things that would close that window up in a few years, namely the 32-bit-only code generator and incompatible (with Microsoft) binary linker written in assembly language. No need to reiterate this over and over though. The D language is a specification, and I assume others are free to take up that specification and implement their own compilers. (I really wonder what Manfred Nowak's would look like =P)

I've mentioned in passing before, but a C99-compatible source code generator as a backend for D is starting to sound like a reasonable idea. This would open up the floodgates for users to test out other compiler toolchains and platforms not directly supported by the Digital Mars reference compiler. A host of ideas come rushing to my mind at the thought of this, but I'm certainly not going to dump them all out here. Others can contemplate on this now that I've thrown it out there.
Jan 12 2006
next sibling parent reply Sean Kelly <sean f4.ca> writes:
James Dunne wrote:
 Sean Kelly wrote:
 James Dunne wrote:

 You can't be guaranteed that the rate of change is constant or even 
 predictable.  So many of the latest proposals for D offer breaking 
 changes or radically different semantics.  You don't know which ones 
 are planned for integration.  Without some kind of roadmap in front 
 of you, it's hard to see if you're going to drive straight into a tree.

True. But language development has settled down for the most part. And most of the existing features that may yet change have been talked about on this NG quite a bit. AAs, for instance.

Finally, thank you for understanding my point of view here. =) In response: What about sets, ranges, array literals? Potential features shouldn't be left out too, especially considering the quantity of posts relating to them.

I agree. However, new features typically don't require old code to be refactored.
 Bit arrays are also a problem, and should be library-ized.  It was a 
 nice attempt but not implemented to the extent it should've been.

I agree.
 bool should be sized to a machine-size word for speed and 
 addressability.  bool should remain bool: true or false, nothing else. 
 While it is true that a bool can map to a bit, it creates problems in 
 the long run when people don't realize the limitations of bits, bit 
 arrays, etc.

I agree with the intent, but the last thing I want is for D to have two boolean types. However, if packed bit arrays are removed from D then perhaps it would make sense to replace "bit" with "bool" as well, because when I see "bit" I kind of assume that I can use it to perform bit-level operations, at least in some context.
 For example: Just not too long ago I wrote some D code that used a 
 dynamic bool array to keep track of freed items.  Weird problems arose 
 because it was aliased down to a bit array which had issues (I can't 
 recall what the exact problem was).  I converted the bool array to an 
 int array and the problems magically disappeared.  This is a 'gotcha' 
 that should either be fixed by fixing the bit array implementation or by 
 creating a standard bool type.

Definitely. I've actually avoided the use of bit arrays in the past because of the addressing problem. For example, you can't even do this:

    bit[] field;
    foreach( inout bit b; field ) { }

This is kind of a showstopper in my opinion, particularly in the face of template code. And since it's completely understandable why this isn't supported, I think packed bit arrays should simply be removed. It's the same exact issue as the vector<bool> mess in C++.

So I agree that a roadmap would be very helpful, as it would both allow us to design with upcoming features or changes in mind (at least by tagging potential problem areas with easily greppable comment tags) and to offer constructive feedback on areas where changes are being considered. In some respects I think feature discussions tend to escalate into arguments simply because it feels like they're being done in a void, and way down deep perhaps people feel that enough screaming and yelling may get Walter's attention. Historically, however, Walter has often kept silent and implemented changes based on his analysis of these discussions, which seems to have resulted in a few changes that those who participated in the discussions aren't entirely happy with (AAs being the prime offender here). A bit more back and forth may have sorted things out at the outset, or at least made people more aware of what they were getting themselves into.
 The D infrastructure 1.0 can stand to wait for the D language 1.0.  
 IMO, waiting for the D language 1.0 can prove to have many benefits 
 that outweigh the costs of beginning to develop large quantities of 
 possibly unsupported code.  Compiler bugs often warrant one to hack 
 around or possibly even refactor certain portions of code.  To go 
 back after that bug has been fixed and unhack / re-refactor the 
 design is a costly operation, depending on how much code depends on 
 that hack/design.

Yup. Which is why I've been avoiding serious template programming to date. Every time I try it I run into bugs that are either showstoppers or require workarounds I'm not willing to provide. That said, the compiler has improved here quite a bit recently. There's only one or two remaining issues I have with Ares in this regard, but since few people are using std.atomic (Ares' only real use of templates at the moment), it can wait.

So have I (on template programming). We need a serious template programming tutorial/article to really show the masses (and me) how much power D's template system holds. I know it's powerful, but I don't feel comfortable using it at the moment. Mostly, all the aliasing and scope-avoidance quirks make me squeamish. Especially that damned annoying "template X cannot be used as a type" error message I get back all the time.

For me it's mostly been a matter of where to spend my limited free time. I'd rather implement something that works and feels pretty solid than search for workarounds for compiler bugs. Particularly when there's no telling how long those bugs will remain in the queue (the template bugs I've reported have remained unfixed for months). So if I start a project and run up against a wall I usually just report the obstacles and move on to a different project... unless that thing is necessary for some reason, and I've had few of those.
 BTW, I would like to use std.atomic if someone would build a concurrent 
 library around it (as lock-free as possible).  Much like Ben Hinkle has 
 done with MinTL, but last I recall he hasn't updated it for the latest 
 compilers.  Correct me if I'm wrong, please.

It hasn't. I'll admit that I've been putting off container support in Ares simply because there are so many options to choose from and I want to be able to spend some real time thinking about interface issues before I go any further there. But I'm running out of excuses, so this is something I'll probably be looking at soon. And when I do, I'll keep lock-free in mind.

I'm also ruminating on support for software transactional memory, which would allow for some fairly fancy lock-free containers (such as red-black trees), but I've got some more research to do here before I decide whether this is worth pursuing.
 Walter has more direct control of the language pre-1.0.  We must be 
 patient and let him (and us all) improve the language and fix bugs up 
 to the feature freeze.  Let us ensure ourselves that we have a 
 quality language on which to design and build a quality 
 infrastructure.  Let's not repeat the mistakes of C++ here, please.

 Summary:  Patience.  There is no deadline for this project.  What's 
 the rush?

D's window may close a tad if things take too long. But that's more a measure of years than months.

D's window to whom exactly? Those who don't know about it? What's closing that window?

Right now, D is about the only game in town for some things--generalized, system-level concurrent programming being one such example. But .NET/CLI is progressing fairly rapidly, the next iteration of the C++ standard is shaping up, etc. None of these do/will address these issues quite as well as D does, but their popularity is a mitigating factor. I think it would be a selling point if D could make 1.0 before the alternatives get to the point where they can at least pretend to address the same problem areas. That said, D's support for floating-point math is unparalleled, and this will continue to be a significant feature even if it gains competition in other areas.
 I can think of a few things on Digital Mars' side of things that would 
 close that window up in a few years, namely the 32-bit-only code 
 generator and incompatible (with Microsoft) binary linker written in 
 assembly language.  No need to reiterate this over and over though.  The 
 D language is a specification, and I assume others are free to take up 
 that specification and implement their own compilers.  (I really wonder 
 what Manfred Nowak's would look like =P)

This is one reason GDC is such an important project, as I suspect it can rely on the GCC code generator for the most part.
 I've mentioned in passing before, but a C99-compatible source code 
 generator as a backend for D is starting to sound like a reasonable 
 idea.  This would open up the floodgates for users to test out other 
 compiler toolchains and platforms not directly supported by the Digital 
 Mars reference compiler.  A host of ideas come rushing to my mind at the 
 thought of this, but I'm certainly not going to dump them all out here. 
  Others can contemplate on this now that I've thrown it out there.

True enough. Sean
Jan 12 2006
parent =?ISO-8859-1?Q?Anders_F_Bj=F6rklund?= <afb algonet.se> writes:
Sean Kelly wrote:

 bool should be sized to a machine-size word for speed and 
 addressability.  bool should remain bool: true or false, nothing else. 
 While it is true that a bool can map to a bit, it creates problems in 
 the long run when people don't realize the limitations of bits, bit 
 arrays, etc.

I agree with the intent, but the last thing I want is for D to have two boolean types. However, if packed bit arrays are removed from D then perhaps it would make sense to replace "bit" with "bool" as well, because when I see "bit" I kind of assume that I can use it to perform bit-level operations, at least in some context.

Just two? :-P I thought D was proud of three boolean types, and three string types (that would be bit/byte/int and char[]/wchar[]/dchar[], respectively).

http://www.prowiki.org/wiki4d/wiki.cgi?BitsAndBools
http://www.prowiki.org/wiki4d/wiki.cgi?CharsAndStrs

--anders
Jan 12 2006
prev sibling parent reply Ben Hinkle <Ben_member pathlink.com> writes:
BTW, I would like to use std.atomic if someone would build a concurrent 
library around it (as lock-free as possible).  Much like Ben Hinkle has 
done with MinTL, but last I recall he hasn't updated it for the latest 
compilers.  Correct me if I'm wrong, please.

I think you're right (well, technically the synchronization stuff is in the Locks library, not MinTL, since MinTL is just a bunch of containers). I'll try to update as much as possible this weekend. -Ben
Jan 13 2006
parent James Dunne <james.jdunne gmail.com> writes:
Ben Hinkle wrote:
BTW, I would like to use std.atomic if someone would build a concurrent 
library around it (as lock-free as possible).  Much like Ben Hinkle has 
done with MinTL, but last I recall he hasn't updated it for the latest 
compilers.  Correct me if I'm wrong, please.

I think you're right (well, technically the synchronization stuff is in the Locks library not MinTL since MinTL is just a bunch of containers). I'll try to update as much as possible this weekend. -Ben

Sweet! Thanks.
Jan 13 2006