
digitalmars.D - What's C's biggest mistake?

reply Walter Bright <newshound1 digitalmars.com> writes:
http://www.reddit.com/r/programming/comments/ai9uc/whats_cs_biggest_mistake/
Dec 24 2009
next sibling parent reply bearophile <bearophileHUGS lycos.com> writes:
Walter Bright:
 http://www.reddit.com/r/programming/comments/ai9uc/whats_cs_biggest_mistake/
Many people would say null pointers. I don't agree.<
Note that in this newsgroup I've asked for nonnull class references. I have said nothing serious about pointers. Please don't mix the two things.
C can still be fixed. All it needs is a little new syntax:<
Good luck moving that iceberg. Bye, bearophile
Dec 24 2009
parent Walter Bright <newshound1 digitalmars.com> writes:
bearophile wrote:
 C can still be fixed. All it needs is a little new syntax:<
Good luck moving that iceberg.
I've already thrown my hat in the ring with D <g>. The C standards group has already made two attempts to fix the array problem in C. Both attempts missed the mark by a wide margin. The development of so-called safe C libraries also shows the interest in a decent solution.
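The idea is essentially what D's arrays already do: the length travels with the pointer, so the callee can bounds-check. A minimal sketch in D (the function and data are just an illustration, not the article's exact proposed C syntax):

    import std.stdio;

    // The length travels with the array, so no separate count argument
    // and no reliance on a terminating '\0'.
    size_t countSpaces(const(char)[] s) {
        size_t n = 0;
        foreach (c; s)
            if (c == ' ')
                n++;
        return n;
    }

    void main() {
        writeln(countSpaces("a b c")); // prints 2
    }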
Dec 24 2009
prev sibling next sibling parent reply Edward Diener <eddielee_no_spam_here tropicsoft.com> writes:
Walter Bright wrote:
 http://www.reddit.com/r/programming/comments/ai9uc/whats_cs_biggest_mistake/ 
 
This reminds me of the Java idiots when Java first came out. They discussed C endlessly trying to prove Java's supremacy as the new premier language, and either completely ignored that C++ existed or acted as if C++ was a computer language nobody actually used, as opposed to that paragon of up-to-date programming excellence circa 1996, the C programming language.

Think how you will feel in 10 years when others belittle the D programming language compared to other languages around in 2019, and are always referring to D version 1.

While I admire much of the work you have done in creating D, it does not take much for an intelligent programmer to realize that your own view of C++ is jaundiced and heavily affected by what you perceive as D's improvements over C++. As a suggestion, which I don't expect you to take: you would do much better in viewing C++ in a more reasonable light while touting the features of D which you feel are improvements.
Dec 24 2009
parent reply Walter Bright <newshound1 digitalmars.com> writes:
Edward Diener wrote:
 Walter Bright wrote:
 http://www.reddit.com/r/programming/comments/ai9uc/whats_cs_biggest_mistake/ 
This reminds me of the Java idiots when Java first came out. They discussed C endlessly trying to prove Java's supremacy as the new premier language, and either completely ignored that C++ existed or acted as if C++ was a computer language nobody actually used, as opposed to that paragon of up-to-date programming excellence circa 1996, the C programming language.
C is still heavily used today for new code, so fixing a mistake in it is very relevant. It's a lot easier to add a simple compatible change to C than to convince people to change to a whole new language.
 Think how you will feel in 10 years when others belittle the D 
 programming language, compared to other languages around in 2019, and 
 are always referring to D version 1.
I expect that time will expose many mistakes in the design of D. How I feel about it would be quite irrelevant. My article was not about attempting to get anyone to switch away from C - it was how a simple fix to C will address a serious shortcoming. It will still be C. It is not fundamentally different from bashing C++ for not having variadic templates and then proposing variadic templates for C++0x.
 While I admire much of the work you have done in creating D, it does not 
 take much for an intelligent programmer to realize that your own view of 
 C++ is jaundiced and heavily affected by what you perceive as D's 
 improvements over C++. As a suggestion, which I don't expect you to 
 take, you would do much better in viewing C++ in a more reasonable light 
 while touting the features of D which you feel are improvements.
Many people have suggested I stop comparing C++ to D, and they're right, and I have generally done so.

As for my view of C++ being jaundiced, consider that I have been in the C++ compiler business since 1986 or so. I'm still the only person who has written a C++ compiler from front to back. I still support and maintain the C++ compiler. C++ has been good to me - professionally and financially.

I did not discover D and become a fanboy; it was created out of my frustrations with C++. Of course I perceive D's improvements as improvements over C++, because I deliberately designed them that way! I have a C++ compiler, I wrote nearly every line in it, I know how it works and how to implement all of C++'s features. So if D doesn't work like C++ in one way or another, there's a deliberate choice made to make the change - not that I didn't know how to do it.

In other words, I don't feel my view of C++ is based on prejudice. It's based on spending most of my professional life developing, implementing, and using C++. C++ did a lot of things right, and you can see that in D.

My opinions might still be wrong, of course. I have another blog post I'm working on that lists several design mistakes in D <g>. I hope you'll find it enjoyable! Andrei and I were just talking about programming language books, and how we both thought it was disingenuous that some of them never admit to any flaws in the language. We hope we don't fall into that trap.
Dec 24 2009
next sibling parent Max Samukha <spambox d-coding.com> writes:
On 25.12.2009 2:31, Walter Bright wrote:
 Edward Diener wrote:
 Walter Bright wrote:
 http://www.reddit.com/r/programming/comments/ai9uc/whats_cs_biggest_mistake/
 My opinions might still be wrong, of course. I have another blog post
 I'm working on that lists several design mistakes in D <g>. I hope
 you'll find it enjoyable!
That would be helpful. Not knowing that a feature is actually a design mistake may lead to painful revelations in the future. Also, I suggest mentioning which design mistakes are going to be fixed and which are considered unfixable without a major (and unlikely) language redesign.
 Andrei and I were just talking about programming language books, and how
 we both thought it was disingenuous that some of them never admit to any
 flaws in the language. We hope we don't fall into that trap.
Dec 25 2009
prev sibling parent "Kagamin" <spam here.lot> writes:
On Friday, 25 December 2009 at 00:31:58 UTC, Walter Bright wrote:
 Andrei and I were just talking about programming language 
 books, and how we both thought it was disingenuous that some of 
 them never admit to any flaws in the language. We hope we don't 
 fall into that trap.
Whoops.
Nov 07 2012
prev sibling next sibling parent reply Sean Kelly <sean invisibleduck.org> writes:
Walter Bright Wrote:

 http://www.reddit.com/r/programming/comments/ai9uc/whats_cs_biggest_mistake/
On seeing this title I was going to interject "using pointers to represent arrays!" but it seems you beat me to it. This is definitely the biggest problem with C.
Dec 25 2009
parent reply Walter Bright <newshound1 digitalmars.com> writes:
Sean Kelly wrote:
 On seeing this title I was going to interject "using pointers to
 represent arrays!" but it seems you beat me to it.  This is
 definitely the biggest problem with C.
Also here: http://news.ycombinator.com/item?id=1014533 I find the responses to be very curious, particularly the "not in the spirit of C" ones. Then there are the ones who deny that this even is an actual problem in C code. It reminds me of why I decided to work on D rather than keep pushing on those ropes.
Dec 25 2009
next sibling parent reply bearophile <bearophileHUGS lycos.com> writes:
Walter Bright:
I find the responses to be very curious, particularly the "not in the spirit of
C" ones.<
There are people who think of C as something set in stone, something that has a "necessary" design. A few years of discussions in the D newsgroups teach instead that C was not born as a single atomic perfect thing; it's a collection of design choices and design compromises, and the original authors have chosen only part of the possible alternatives. Some of those design choices can be improved today. (In biology it's the same thing: a mammalian body like the human one is the result of a very large number of design choices, many of them arbitrary, and some of them just wrong.)

Today D is not a replacement for C, because of its GC and a few other things (I don't think today you can use D to create 1200-byte-long binaries that run on an Arduino CPU), but maybe a reduced-D can be used for that purpose too.

Bye, bearophile
Dec 25 2009
next sibling parent reply retard <re tard.com.invalid> writes:
Fri, 25 Dec 2009 14:54:18 -0500, bearophile wrote:

 Today D is not a replacement for C, because of its GC and a few other
 things (I don't think today you can use D to create 1200-byte-long
 binaries that run on an Arduino CPU), but maybe a reduced-D can be used
 for that purpose too.
Where can I download the compiler for this reduced-D? Let's be honest - even with the upx exe compressor the binaries are huge. If you have microcontrollers with 64 to 128 kB of ROM, the best you can do with dmd is a 'hello world'. Unless a better fork of the language is made (without extra typeinfo, with an optional gc - we need a compiler switch; compiling the stdlib is a PITA - and without template bloat -> a smart linker, not some legacy crap made with assembler), a single version of D can't serve the tastes of all audiences.
Dec 26 2009
parent Sean Kelly <sean invisibleduck.org> writes:
retard Wrote:

 Fri, 25 Dec 2009 14:54:18 -0500, bearophile wrote:
 
 Today D is not a replacement for C, because of its GC and a few other
 things (I don't think today you can use D to create 1200-byte-long
 binaries that run on an Arduino CPU), but maybe a reduced-D can be used
 for that purpose too.
Where can I download the compiler for this reduced-D? Let's be honest - even with the upx exe compressor the binaries are huge. If you have microcontrollers with 64 to 128 kB of ROM, the best you can do with dmd is a 'hello world'. Unless a better fork of the language is made (without extra typeinfo, with an optional gc - we need a compiler switch; compiling the stdlib is a PITA - and without template bloat -> a smart linker, not some legacy crap made with assembler), a single version of D can't serve the tastes of all audiences.
Yeah. Even without the GC, binaries contain a good bit of TypeInfo data. I think we're getting to a point where this could be left out and the language would still be pretty much the same, though.
Dec 26 2009
prev sibling parent reply Marco <marco nospam.com> writes:
bearophile Wrote:

 Walter Bright:
I find the responses to be very curious, particularly the "not in the spirit of
C" ones.<
There are people who think of C as something set in stone, something that has a "necessary" design. A few years of discussions in the D newsgroups teach instead that C was not born as a single atomic perfect thing; it's a collection of design choices and design compromises, and the original authors have chosen only part of the possible alternatives.
For the most part, the C design is complete at this point in its language lifecycle, with only minor tweaks allowed in the core language that don't break existing ISO C90 standard code. C99 introduced VLAs, which was the mistake of bolting on a new feature. Of course, cleanup of the standard libraries should continue.
Dec 27 2009
next sibling parent bearophile <bearophileHUGS lycos.com> writes:
Marco:

C99 introduced VLAs, which was the mistake of bolting on a new feature.<
I may want those VLAs in D :-) They are more handy than using alloca(). Bye, bearophile
Dec 27 2009
prev sibling parent Walter Bright <newshound1 digitalmars.com> writes:
Marco wrote:
 bearophile Wrote:
 
 Walter Bright:
 I find the responses to be very curious, particularly the "not in
 the spirit of C" ones.<
There are people who think of C as something set in stone, something that has a "necessary" design. A few years of discussions in the D newsgroups teach instead that C was not born as a single atomic perfect thing; it's a collection of design choices and design compromises, and the original authors have chosen only part of the possible alternatives.
For the most part, the C design is complete at this point in its language lifecycle, with only minor tweaks allowed in the core language that don't break existing ISO C90 standard code. C99 introduced VLAs, which was the mistake of bolting on a new feature. Of course, cleanup of the standard libraries should continue.
VLAs are not a mistake because they are a bolted-on new feature. They are a mistake because they are poorly designed and have very limited utility. On the other hand, I know the array proposal delivers, because it's been very successful in D for nearly 10 years now. It won't break any existing code, either. And it's simple to implement (easier than VLAs). It's just a win, all around.
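For comparison, a minimal D sketch of covering the VLA use case with a slice over fixed stack storage (the reserve size of 100 is an arbitrary choice for illustration):

    // A slice gives VLA-like usage without a new declaration form:
    void process(int n) {
        assert(n <= 100);
        int[100] buf;          // plain stack storage
        int[] a = buf[0 .. n]; // slice: pointer + length, bounds-checked
        foreach (ref x; a)
            x = 0;
    }

    void main() {
        process(10);
    }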
Dec 27 2009
prev sibling next sibling parent Sean Kelly <sean invisibleduck.org> writes:
Walter Bright Wrote:

 Sean Kelly wrote:
 On seeing this title I was going to interject "using pointers to
 represent arrays!" but it seems you beat me to it.  This is
 definitely the biggest problem with C.
Also here: http://news.ycombinator.com/item?id=1014533 I find the responses to be very curious, particularly the "not in the spirit of C" ones. Then there are the ones who deny that this even is an actual problem in C code.
I'd be inclined to think most of these people are students or hobbyist programmers. I couldn't imagine someone using C professionally and having this attitude. Though there are certainly a lot of language lawyer wannabes on the NGs, so who knows.
 It reminds me of why I decided to work on D rather than keep pushing on 
 those ropes.
And the programming community is better for it!
Dec 26 2009
prev sibling parent reply BCS <none anon.com> writes:
Hello Walter,

 Sean Kelly wrote:
 
 On seeing this title I was going to interject "using pointers to
 represent arrays!" but it seems you beat me to it.  This is
 definitely the biggest problem with C.
 
Also here: http://news.ycombinator.com/item?id=1014533 I find the responses to be very curious, particularly the "not in the spirit of C" ones. Then there are the ones who deny that this even is an actual problem in C code. It reminds me of why I decided to work on D rather than keep pushing on those ropes.
C is Latin, about 300 years ago: A dead language that no one uses but everyone knows.
Dec 29 2009
parent reply Walter Bright <newshound1 digitalmars.com> writes:
BCS wrote:
 C is Latin, about 300 years ago: A dead language that no one uses but 
 everyone knows.
C is still in wide use.
Dec 30 2009
parent reply BCS <none anon.com> writes:
Hello Walter,

 BCS wrote:
 
 C is Latin, about 300 years ago: A dead language that no one uses but
 everyone knows.
 
C is still in wide use.
OK, I overstated. OTOH, Latin was in wide use (for some professions) 300 years ago. I guess my point is that aside from VERY resource limited systems, almost no one will have C as their first choice. Even with those limited systems I'd bet that most people would rather be working in something else if they could. That said, there are many places where it ends up being the lingua franca.
Dec 30 2009
next sibling parent reply Walter Bright <newshound1 digitalmars.com> writes:
BCS wrote:
 I guess my point is that aside from VERY resource limited systems, 
 almost no one will have C as their first choice. Even with those limited 
 systems I'd bet that most people would rather be working in something 
 else if they could. That said, there are many places where it ends up 
 being the lingua franca.
I still think you're underestimating C's audience. Consider the Linux effort - they can choose any implementation language they want, and they choose C. They aren't forced into C.
Dec 30 2009
next sibling parent bearophile <bearophileHUGS lycos.com> writes:
Walter Bright:
 I still think you're underestimating C's audience.
Yep. The Tiobe index is mostly trash, but it shows C as the second most used language, just after Java, and Java usage is getting lower. I think in a couple of years C even risks becoming the most used, according to that flawed metric: http://www.tiobe.com/content/paperinfo/tpci/index.html Tiobe is not a serious thing, but you can't deny that C is still in wide use. It's a mostly clean language, concise, and you usually know that what you do will not cost too much for the CPU. Bye, bearophile
Dec 30 2009
prev sibling parent reply BCS <none anon.com> writes:
Hello Walter,

 BCS wrote:
 
 I guess my point is that aside from VERY resource limited systems,
 almost no one will have C as their first choice. Even with those
 limited systems I'd bet that most people would rather be working in
 something else if they could. That said, there are many places where
 it ends up being the lingua franca.
 
I still think you're underestimating C's audience. Consider the Linux effort - they can choose any implementation language they want, and they choose C. They aren't forced into C.
Yes, C has a wide audience (anyone who says differently is selling something). But I still think that there are very few cases where C will be chosen for reasons other than its purely technical merits. If another language would have done just as good a job in Linux as C, I'd almost bet that Linux wouldn't have been written in C. My point isn't that C is never the right choice (as that is clearly false) but that when C is chosen, it's (almost) always for technical reasons rather than aesthetic ones (where it is merely good enough).
Dec 31 2009
parent reply Kevin Bealer <kevinbealer gmail.com> writes:
BCS Wrote:

 Hello Walter,
 
 BCS wrote:
 
 I guess my point is that aside from VERY resource limited systems,
 almost no one will have C as their first choice. Even with those
 limited systems I'd bet that most people would rather be working in
 something else if they could. That said, there are many places where
 it ends up being the lingua franca.
 
I still think you're underestimating C's audience. Consider the Linux effort - they can choose any implementation language they want, and they choose C. They aren't forced into C.
Yes, C has a wide audience (anyone who says differently is selling something). But I still think that there are very few cases where C will be chosen for reasons other than its purely technical merits. If another language would have done just as good a job in Linux as C, I'd almost bet that Linux wouldn't have been written in C. My point isn't that C is never the right choice (as that is clearly false) but that when C is chosen, it's (almost) always for technical reasons rather than aesthetic ones (where it is merely good enough).
I would say these are the technical merits of C that get it chosen these days:

1. The new code they're writing will be part of a large body of existing C code which they don't have time, permission, or inclination to convert to C++.

2. They need to be aware of every tiny low level detail anyway, so having the language do too many things "for you" is not the desired approach (security, O/S and embedded work).

3. C has a real ABI on almost every platform; therefore, C is chosen for most inter-language work such as writing python modules.

But some people really *are* choosing C for aesthetics. Linus Torvalds, bless his little world dominating heart, chose C for a normal app (git), and he cited that the existence of operator overloading in C++ is bad because it hides information -- e.g. in the general case you "never know what an expression is actually doing."

I think this can be seen as mainly an aesthetic choice. Avoiding a language because it *supports* information hiding (which is what I think operator overloading is) is not really an 'economic' tradeoff, since you could choose not to hide information by not using those features. He'd just rather not be in the vicinity of language features that make those kinds of choices because they seem wrong to him (and because he wants to keep C++ies out of his code, I think.)

Some people want their language to have a "WYSIWYG" relationship with the generated assembler code (if I'm right, it does seem consistent with him being an OS developer).

I also know some scientists and mathematicians who use C rather than C++. I think the reason is that by using a simpler language they can know everything about the language. I think the sooner they can 'get the computer science stuff out of the way', the sooner they can focus on what they see as the domain issues. (I think once the program gets big enough, the CompSci aspects reassert themselves and scalability and maintainability issues begin to bite you in the rear.)

Kevin
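To make the operator-overloading complaint concrete, here is a toy sketch in D (the type and names are invented for illustration) of an expression that looks like plain arithmetic but is an ordinary function call in disguise:

    import std.stdio;

    struct Logged {
        int v;
        // '+' could allocate, lock, or log; the call site won't show it.
        Logged opBinary(string op : "+")(Logged rhs) {
            writeln("hidden work happens here");
            return Logged(v + rhs.v);
        }
    }

    void main() {
        auto c = Logged(1) + Logged(2); // looks like plain arithmetic
    }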
Dec 31 2009
next sibling parent reply dsimcha <dsimcha yahoo.com> writes:
== Quote from Kevin Bealer (kevinbealer gmail.com)'s article
 I would say these are the technical merits of C that get it chosen these days:
 1. The new code they're writing will be part of a large body of existing C code
 which they don't have time, permission, or inclination to convert to C++.
 2. They need to be aware of every tiny low level detail anyway, so having the
 language do too many things "for you" is not the desired approach (security,
 O/S and embedded work).
Even if you need to be aware of every tiny detail, it still sometimes pays to have more ability to automate some stuff. For example, even if you care about performance enough to really think hard about when to use virtual functions, it's nice to have an automated, non-error-prone way to create them if you do want them. Similarly, if you need to parametrize something on types, it's nice to be able to automate this with templates instead of doing it manually.
 3. C has a real ABI on almost every platform; therefore, C is chosen for most
 inter-language work such as writing python modules.
 But some people really *are* choosing C for aesthetics. Linus Torvalds, bless
 his little world dominating heart, chose C for a normal app (git), and he cited
 that the existence of operator overloading in C++ is bad because it hides
 information -- e.g. in the general case you "never know what an expression is
 actually doing."
Isn't the whole purpose of any language higher-level than assembler to hide information? If your language isn't hiding any complexity, you may as well be writing in raw, inscrutable hexadecimal numbers.
 I think this can be seen as mainly an aesthetic choice. Avoiding a language
 because it *supports* information hiding (which is what I think operator
 overloading is) is not really an 'economic' tradeoff, since you could choose
 not to hide information by not using those features. He'd just rather not be
 in the vicinity of language features that make those kinds of choices because
 they seem wrong to him (and because he wants to keep C++ies out of his code,
 I think.)
 Some people want their language to have a "WYSIWYG" relationship with the
 generated assembler code (if I'm right, it does seem consistent with him being
 an OS developer).
This kind of thinking is understandable for kernel development and very resource-constrained environments, but not much else.
 I also know some scientists and mathematicians who use C rather than C++. I
 think the reason is that by using a simpler language they can know everything
 about the language. I think the sooner they can 'get the computer science
 stuff out of the way', the sooner they can focus on what they see as the
 domain issues. (I think once the program gets big enough, the CompSci aspects
 reassert themselves and scalability and maintainability issues begin to bite
 you in the rear.)
I personally am a scientist (bioinformatics specifically) and I think having basic complexity management in your code is worthwhile even at fairly small project sizes. I learned this the hard way. For anything over about 100 lines I want some modularity (classes/structs, higher order functions, arrays that are more than a pointer + a convention, etc.) so that I can tweak my scientific app easily. Furthermore, multithreading is absolutely essential for some of the stuff I do, since it's embarrassingly parallel, and is a huge PITA in C.
Dec 31 2009
next sibling parent reply Walter Bright <newshound1 digitalmars.com> writes:
dsimcha wrote:
 I personally am a scientist (bioinformatics specifically) and I think having
 basic complexity management in your code is worthwhile even at fairly small
 project sizes. I learned this the hard way. For anything over about 100 lines
 I want some modularity (classes/structs, higher order functions, arrays that
 are more than a pointer + a convention, etc.) so that I can tweak my
 scientific app easily. Furthermore, multithreading is absolutely essential for
 some of the stuff I do, since it's embarrassingly parallel, and is a huge PITA
 in C.
When I was working on converting Optlink to C, I thought long and hard about why C instead of D. The only, and I mean only, reason to do it via C was because part of the build process for Optlink used old tools that did not recognize newer features of the OMF that D outputs. Once it is all in C, the old build system can be dispensed with, and then it can be easily converted to D.

If you want to, you can literally write code in D that is line-for-line nearly identical to C, and it will compile to the same code and perform the same. You can do the same with C++ - Linus surely knows this, but I suspect he didn't want to use C++ because, sure as shinola, members of his dev team would start using operator overloading, virtual base classes, etc.
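A minimal sketch of what such line-for-line C-style D can look like (the function is an arbitrary example; the C I/O comes in through druntime's core.stdc bindings):

    import core.stdc.stdio;

    // No GC, no classes: this is C written with a D compiler.
    int sum(const(int)* p, size_t n) {
        int s = 0;
        for (size_t i = 0; i < n; i++)
            s += p[i];
        return s;
    }

    void main() {
        int[3] a = [1, 2, 3];
        printf("%d\n", sum(a.ptr, a.length));
    }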
Dec 31 2009
parent reply Don <nospam nospam.com> writes:
Walter Bright wrote:
 dsimcha wrote:
 I personally am a scientist (bioinformatics specifically) and I think having
 basic complexity management in your code is worthwhile even at fairly small
 project sizes. I learned this the hard way. For anything over about 100 lines
 I want some modularity (classes/structs, higher order functions, arrays that
 are more than a pointer + a convention, etc.) so that I can tweak my
 scientific app easily. Furthermore, multithreading is absolutely essential for
 some of the stuff I do, since it's embarrassingly parallel, and is a huge PITA
 in C.
When I was working on converting Optlink to C, I thought long and hard about why C instead of D. The only, and I mean only, reason to do it via C was because part of the build process for Optlink used old tools that did not recognize newer features of the OMF that D outputs. Once it is all in C, the old build system can be dispensed with, and then it can be easily converted to D. If you want to, you can literally write code in D that is line-for-line nearly identical to C, and it will compile to the same code, and will perform the same. You can do the same with C++ - Linus surely knows this, but I suspect he didn't want to use C++ because sure as shinola, members of his dev team would start using operator overloading, virtual base classes, etc.
Well, if you ask the question "what's C++'s biggest mistake?" it's much more difficult. C++'s failure to specify the ABI is enough of a reason to use C instead, I reckon. I think it's an appalling, inexcusable mistake -- it guaranteed that compiled libraries 20 years later would use extern(C), not extern(C++). And that's not the worst C++ mistake.
Dec 31 2009
next sibling parent bearophile <bearophileHUGS lycos.com> writes:
Don:
 Well, if you ask the question "what's C++'s biggest mistake?" it's much 
 more difficult. C++'s failure to specify the ABI is enough of a reason 
 to use C instead, I reckon.
What about mistakes that D can avoid? :-) A larger stack alignment? (I think on Snow Leopard the standard stack alignment is 16 bytes.) Bye, bearophile
Jan 01 2010
prev sibling parent reply Walter Bright <newshound1 digitalmars.com> writes:
Don wrote:
 Well, if you ask the question "what's C++'s biggest mistake?" it's much 
 more difficult. C++'s failure to specify the ABI is enough of a reason 
 to use C instead, I reckon. I think it's an appalling, inexcusable 
 mistake -- it guaranteed compiled libraries 20 years later would use 
 extern(C), not extern(C++). And that's not the worst C++ mistake.
I'd be hard pressed to come up with C++'s biggest mistake. Perhaps it was failing to address the array => pointer conversion.
Jan 01 2010
next sibling parent Don <nospam nospam.com> writes:
Walter Bright wrote:
 Don wrote:
 Well, if you ask the question "what's C++'s biggest mistake?" it's 
 much more difficult. C++'s failure to specify the ABI is enough of a 
 reason to use C instead, I reckon. I think it's an appalling, 
 inexcusable mistake -- it guaranteed compiled libraries 20 years later 
 would use extern(C), not extern(C++). And that's not the worst C++ 
 mistake.
I'd be hard pressed to come up with C++'s biggest mistake. Perhaps it was failing to address the array => pointer conversion.
I think the biggest mistake was preserving backwards source-code compatibility with C, but _still_ managing to lose many of C's advantages. As a pure superset of C, it _should_ have been able to replace 100% of the uses of C, but failed miserably.
Jan 01 2010
prev sibling parent reply Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
Walter Bright wrote:
 Don wrote:
 Well, if you ask the question "what's C++'s biggest mistake?" it's 
 much more difficult. C++'s failure to specify the ABI is enough of a 
 reason to use C instead, I reckon. I think it's an appalling, 
 inexcusable mistake -- it guaranteed compiled libraries 20 years later 
 would use extern(C), not extern(C++). And that's not the worst C++ 
 mistake.
I'd be hard pressed to come up with C++'s biggest mistake. Perhaps it was failing to address the array => pointer conversion.
That's partially addressed by the ability to define somewhat encapsulated types like std::vector. Don's suggested lack of ABI is big, and unfortunately not addressed in C++0x. There are many smaller ones to follow... two that come to mind: using "<"/">" for grouping and argument-dependent lookup. Happy New Year in 2010! Let's make it a great year for D. Andrei
Jan 01 2010
parent reply Walter Bright <newshound1 digitalmars.com> writes:
Andrei Alexandrescu wrote:
 Walter Bright wrote:
 Don wrote:
 Well, if you ask the question "what's C++'s biggest mistake?" it's 
 much more difficult. C++'s failure to specify the ABI is enough of a 
 reason to use C instead, I reckon. I think it's an appalling, 
 inexcusable mistake -- it guaranteed compiled libraries 20 years 
 later would use extern(C), not extern(C++). And that's not the worst 
 C++ mistake.
I'd be hard pressed to come up with C++'s biggest mistake. Perhaps it was failing to address the array => pointer conversion.
That's partially addressed by the ability to define somewhat encapsulated types like std::vector.
I agree that it is partially addressed by std::vector and std::string. But:

1. Those appeared 10 years after C++ was in widespread use, 10 years of wandering in the desert with everyone trying to invent their own string class.

2. It still reflects in the design of std::string, in that to preserve the 0-termination design, it makes severe compromises.

3. The glaring fact that std::vector<char> and std::string are different suggests something is still wrong.

4. The pointer-centric design resulted in std::iterator, and of course iterators must go!
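On point 4, a small sketch of the alternative: a D range bundles the begin/end iterator pair into a single value (std.algorithm's find; the data is an arbitrary example):

    import std.algorithm : find;

    void main() {
        int[] v = [1, 2, 3, 4];
        auto r = v.find(3);   // one value holds both "position" and "end"
        assert(r == [3, 4]);
        // The C++ counterpart needs a pair:
        //   auto it = std::find(v.begin(), v.end(), 3);
    }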
 Don's suggested lack of ABI is big, 
 and unfortunately not addressed in C++0x. There are many smaller ones to 
 follow... two that come to mind: using "<"/">" for grouping and 
 argument-dependent lookup.
I would suggest a broader issue than < >: the willingness to require semantic analysis to successfully parse C++. < > would never have been used if that had not been considered acceptable. Only later did the C++ community realize there was value in parsing things without semantic analysis, i.e. template bodies. ADL was a fix for another issue, the asymmetric design of operator overloading. So I don't think ADL was a mistake so much as what C++ was forced into to compensate for a previous mistake.
Jan 01 2010
parent reply bearophile <bearophileHUGS lycos.com> writes:
Walter Bright:
 3. The glaring fact that std::vector<char> and std::string are different 
 suggests something is still wrong.
In an array/vector you want O(1) access time to all items (ignoring RAM-cache access/transfer delays), while in a string with variable-width Unicode encoding that can be hard to do. So they look like two different data structures. Bye, bearophile
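A small D illustration of that tension (assuming UTF-8 source and the usual char-based string):

    void main() {
        string s = "héllo";    // UTF-8: 'é' occupies two code units
        assert(s.length == 6); // length counts code units (bytes), not characters
        // s[1] is one byte of 'é', not a character.
        size_t n = 0;
        foreach (dchar c; s)   // decoding iteration: sequential, not O(1) indexing
            n++;
        assert(n == 5);
    }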
Jan 01 2010
next sibling parent reply Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
bearophile wrote:
 Walter Bright:
 3. The glaring fact that std::vector<char> and std::string are
 different suggests something is still wrong.
In an array/vector you want O(1) access time to all items (ignoring RAM-cache access/transfer delays), while in a string with variable-width Unicode encoding that can be hard to do. So they look like two different data structures.
The real reason is different (multibyte support in std::string is at best nonexistent). std::vector was defined by Stepanov alone. But by the time std::string was standardized, many factions of the committee had a feature on their list. std::string is the result of that patchwork. Andrei
Jan 01 2010
parent reply bearophile <bearophileHUGS lycos.com> writes:
Andrei Alexandrescu:
 The real reason is different (multibyte support in std::string is at 
 best nonexistent). std::vector was defined by Stepanov alone. But by the 
 time std::string was standardized, many factions of the committee had a 
 feature on their list. std::string is the result of that patchwork.
Thank you for the info, I didn't know that. In practice I was mostly talking about D2, which interests me more than C++. In D2 strings can be your bidirectional Ranges, while fixed-sized/dynamic arrays can be random access Ranges (strings can be random access Ranges over just the underlying bytes. This may require two different syntaxes to access strings, the normal str[] and something else like str.byte[] for the bytes, and usually only the second one can guarantee O(1) access time, unless the string holds 32-bit-wide Unicode chars. The access with [] may use something simple, like a "skip list", to speed up access from O(n) to O(ln n)). And to avoid silly bugs, D2 associative arrays can allow constant/immutable keys only (especially when used in safe modules), as in Python. Because if you put a key in a set/AA and later you modify it, its hash value and position don't get updated. Bye, bearophile
Jan 01 2010
parent Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
bearophile wrote:
 Andrei Alexandrescu:
 The real reason is different (multibyte support in std::string is at 
 best nonexistent). std::vector was defined by Stepanov alone. But by the 
 time std::string was standardized, many factions of the committee had a 
 feature on their list. std::string is the result of that patchwork.
Thank you for the info, I didn't know that. In practice I was mostly talking about D2, which interests me more than C++. In D2 strings can be your bidirectional Ranges, while fixed-sized/dynamic arrays can be random access Ranges (strings can be random access Ranges over just the underlying bytes. This may require two different syntaxes to access strings, the normal str[] and something else like str.byte[] for the bytes, and usually only the second one can guarantee O(1) access time, unless the string holds 32-bit-wide Unicode chars. The access with [] may use something simple, like a "skip list", to speed up access from O(n) to O(ln n)).
Look for byCodeUnit in here: http://dsource.org/projects/phobos/browser/trunk/phobos/std/string.d and improve it.
 And to avoid silly bugs, D2 associative arrays can allow constant/immutable
keys only (especially when used in safe modules), as in Python. Because if you
put a key in a set/AA and later you modify it, its hash value and position
don't get updated.
That's a long discussion, sigh. Andrei
Jan 01 2010
prev sibling parent Kevin Bealer <kevinbealer gmail.com> writes:
bearophile Wrote:

 Walter Bright:
 3. The glaring fact that std::vector<char> and std::string are different 
 suggests something is still wrong.
In an array/vector you want O(1) access time to all items (ignoring RAM-cache access/transfer delays), while in a string with variable-width Unicode encoding that can be hard to do. So they look like two different data structures. Bye, bearophile
Yeah, I think the charset thing was probably the main reason for the string/vector split, that and the desire to have special properties like conversion from char* that wouldn't be in vector. Using basic_string<T> with locales is something of a historical wart, because with Unicode, getting your charset from your locale is somewhat obsolete for general-purpose computers. (Maybe very small-profile systems will continue to use ASCII or the code page of whatever culture built them.) But I don't think C++'s string can be made to index by character unless you use wchar_t for the T in basic_string<T>. I don't think string.size() is ever anything but a byte or wchar_t count. Kevin
Jan 02 2010
prev sibling parent retard <re tard.com.invalid> writes:
Fri, 01 Jan 2010 00:56:04 +0000, dsimcha wrote:

[OT] the nntp client you use seems to have serious problems with line 
wrapping.
Jan 04 2010
prev sibling parent BCS <none anon.com> writes:
Hello Kevin,

 I would say these are the technical merits of C that get it chosen
 these days:
 
 1. The new code they're writing will be part of a large body of
 existing C code which they don't have time, permission, or inclination
 to convert to C++.
Probably the most common reason that C is used (OTOH I'm not sure that counts as "choose" rather than just used)
 
 2. They need to be aware of every tiny low level detail anyway, so
 having the language do too many things "for you" is not the desired
 approach (security, O/S and embedded work).
Nod.
 
 3. C has a real ABI on almost every platform; therefore, C is chosen
 for most inter-language work such as writing python modules.
 
Nod.
 But some people really *are* choosing C for aesthetics.  Linus
 Torvalds, bless his little world dominating heart, chose C for a
 normal app (git), and he cited that the existence of operator
 overloading in C++ is bad because it hides information -- e.g. in the
 general case you "never know what an expression is actually doing."
Is that choosing C or getting stuck with it after removing the non-options?
 
 I think this can be seen as mainly an aesthetic choice.  Avoiding a
 language because it *supports* information hiding (which is what I
 think operator overloading is) is not really an 'economic' tradeoff,
 since you could choose not to hide information by not using those
 features.  He'd just rather not be in the vicinity of language
 features that make those kinds of choices because they seem wrong to
 him (and because he wants to keep C++ies out of his code I think.)
I considered citing Linus as the counter example... but there are also people who LIKE assembler so I think we should stick to the other 99%.
 
 Some people want their language to have a "WYSIWYG" relationship with
 the generated assembler code (if I'm right, it does seem consistent
 with him being an OS developer).
 
 I also know some scientists and mathematicians who use C rather than
 C++.  I think the reason is that by using a simpler language they can
 know everything about the language.  I think the sooner they can 'get
 the computer science stuff out of the way', the sooner they can focus
 on what they see as the domain issues.  (I think once the program gets
 big enough, the CompSci aspects reassert themself and scalability and
 maintainability issues begin to bite you in the rear.)
Odd, I'd expect that crowd to go with Fortran...
 
 Kevin
 
Jan 01 2010
prev sibling next sibling parent reply retard <re tard.com.invalid> writes:
Wed, 30 Dec 2009 23:36:36 +0000, BCS wrote:

 Hello Walter,
 
 BCS wrote:
 
 C is Latin, about 300 years ago: A dead language that no one uses but
 everyone knows.
 
C is still in wide use.
OK, I overstated. OTOH, Latin was in wide use (for some professions) 300 years ago. I guess my point is that aside from VERY resource limited systems, almost no one will have C as their first choice. Even with those limited systems I'd bet that most people would rather be working in something else if they could. That said, there are many places where it ends up being the lingua franca.
If you ask this question of e.g. the Linux kernel devs, they always answer C. It's not a surprise that projects like Git are (mostly) written in C. Most traditional *nix software is written in C, and so are libraries like GTK+. C just works :)
Dec 30 2009
parent Walter Bright <newshound1 digitalmars.com> writes:
retard wrote:
 C just works :)
Until you have buffer overflows!
Dec 30 2009
prev sibling parent reply Sean Kelly <sean invisibleduck.org> writes:
BCS Wrote:
 
 I guess my point is that aside from VERY resource limited systems, almost 
 no one will have C as their first choice. Even with those limited systems 
 I'd bet that most people would rather be working in something else if they 
 could. That said, there are many places where it ends up being the lingua 
 franca.
C has the advantage of working pretty much the same on every platform around, while C++ compilers are /still/ unreliable about standard library support, language features, etc. In fact, my current project is in C, though I'd prefer at least using the "C with objects" style of C++ like DMD is written in. As you've said, C is the lingua franca in many places and it's difficult to displace.
Dec 30 2009
parent reply dsimcha <dsimcha yahoo.com> writes:
== Quote from Sean Kelly (sean invisibleduck.org)'s article
 BCS Wrote:
 I guess my point is that aside from VERY resource limited systems, almost
 no one will have C as their first choice. Even with those limited systems
 I'd bet that most people would rather be working in something else if they
 could. That said, there are many places where it ends up being the lingua
 franca.
 C has the advantage of working pretty much the same on every platform around,
 while C++ compilers are /still/ unreliable about standard library support,
 language features, etc. In fact, my current project is in C, though I'd prefer
 at least using the "C with objects" style of C++ like DMD is written in. As
 you've said, C is the lingua franca in many places and it's difficult to
 displace.
C is such an unbelievably low level language that I find it amazing that anyone would use it outside of kernels, device drivers, very resource-limited embedded systems and legacy systems where the decision was made a long time ago. I would think the portability issues of C++ would be easier to deal with than the extreme low levelness of C.
Dec 31 2009
parent Sean Kelly <sean invisibleduck.org> writes:
dsimcha Wrote:

 == Quote from Sean Kelly (sean invisibleduck.org)'s article
 BCS Wrote:
 I guess my point is that aside from VERY resource limited systems, almost
 no one will have C as their first choice. Even with those limited systems
 I'd bet that most people would rather be working in something else if they
 could. That said, there are many places where it ends up being the lingua
 franca.
 C has the advantage of working pretty much the same on every platform around,
 while C++ compilers are /still/ unreliable about standard library support,
 language features, etc. In fact, my current project is in C, though I'd prefer
 at least using the "C with objects" style of C++ like DMD is written in. As
 you've said, C is the lingua franca in many places and it's difficult to
 displace.
 C is such an unbelievably low level language that I find it amazing that anyone
 would use it outside of kernels, device drivers, very resource-limited embedded
 systems and legacy systems where the decision was made a long time ago. I would
 think the portability issues of C++ would be easier to deal with than the
 extreme low levelness of C.
In my case it's really mostly entrenchment. I could have pushed for C++, but I'd have been the only one on the team who'd spent much time with the language so there'd have been little point. As it is, I find myself lamenting the "pointers are arrays" issue every single day. I've ended up writing a whole slew of memxxx() routines the standard library left out just so I could do string manipulation without inserting and removing null terminators for each operation.
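For contrast, a small sketch of what D slices buy in exactly this situation (the key=value line is an arbitrary example):

    import std.string : indexOf;

    void main() {
        string line = "key=value";
        auto eq = line.indexOf('=');
        // Substrings are views: no copying, no '\0' insertion or removal.
        string key = line[0 .. eq];
        string val = line[eq + 1 .. $];
        assert(key == "key" && val == "value");
    }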
Dec 31 2009
prev sibling next sibling parent reply Mike James <foo bar.com> writes:
Walter Bright Wrote:

 http://www.reddit.com/r/programming/comments/ai9uc/whats_cs_biggest_mistake/
I write a lot of embedded C for microcontrollers. It would be great to have an easy means of accessing bytes in an int or short without having to resort to messy unions. Maybe make it a bit more BCPL :-) -=mike=-
Dec 27 2009
parent Teemu Pudas <tpudas cc.hut.fi> writes:
On 27/12/2009 20:29, Mike James wrote:
 It would be great to have an easy means of accessing bytes in an int or short
without having to resort to messy unions.
Don't do that. It's undefined behaviour. Yes, I've been bitten by it.
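A portable alternative that sidesteps the union question entirely is shifts and masks, which operate on the value rather than on memory and so are also endian-independent (sketched in D; the same expressions work in C):

    void main() {
        uint x = 0x11223344;
        ubyte b0 = x & 0xFF;         // 0x44, lowest-order byte
        ubyte b1 = (x >> 8) & 0xFF;  // 0x33
        ubyte b2 = (x >> 16) & 0xFF; // 0x22
        ubyte b3 = (x >> 24) & 0xFF; // 0x11
        assert(b0 == 0x44 && b3 == 0x11);
    }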
Dec 27 2009
prev sibling next sibling parent reply merlin <merlin esilo.com> writes:
Walter Bright wrote:
 http://www.reddit.com/r/programming/comments/ai9uc/whats_cs_biggest_mistake/ 
That's a big one. I don't know if it's the biggest; there are so many to choose from:

*) lack of standard bool type (later fixed)
*) lack of guaranteed-length integer types (later fixed)
*) lack of string type and broken standard library string handling (not fixed)
*) obviously wrong type declaration (int v[] not int[] v)
*) grammar not context free (so close, yet so far...)
*) lousy exception handling implementation

IMO, had a few things gone differently with C, then Java, C++, and other attempts to simplify/fix it would not have happened. D is the only language that really got it right IMO.

merlin
Dec 30 2009
parent reply "renoX" <renozyx gmail.com> writes:
On Wednesday, 30 December 2009 at 14:32:01 UTC, merlin wrote:
 Walter Bright wrote:
 http://www.reddit.com/r/programming/comments/ai9uc/whats_cs_biggest_mistake/
 That's a big one. I don't know if it's the biggest; there are so many to
 choose from:
 *) lack of standard bool type (later fixed)
 *) lack of guaranteed-length integer types (later fixed)
 *) lack of string type and broken standard library string handling (not fixed)
 *) obviously wrong type declaration (int v[] not int[] v)
I agree with your previous points, but the type declaration syntax is still awful IMHO: declaring int[Y][X] and then using [x][y]... I don't like reading type declarations right-to-left and then normal code left-to-right.
 *) grammar not context free (so close, yet so far...)
 *) lousy exception handling implementation
You forgot the lack of sane integer overflow behaviour: an undefined program on overflow isn't a good default behaviour; it should be only a possible optimisation. Same with array indexing. renoX
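For reference, a small sketch of the saner default being asked for: D specifies integral types as fixed-size two's complement, so overflow wraps instead of making the program undefined:

    void main() {
        int x = int.max;
        x++;                 // defined: wraps around
        assert(x == int.min);
    }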
Nov 08 2012
parent reply "Kagamin" <spam here.lot> writes:
On Thursday, 8 November 2012 at 09:38:28 UTC, renoX wrote:
 I agree with your previous points, but the type declaration
 syntax is still awful IMHO: declaring int[Y][X] and then using
 [x][y]...
 I don't like reading type declarations right-to-left and then
 normal code left-to-right.
Well, then read type declarations left-to-right. It's the strangest decision in golang's design to reverse type declarations. I always read byte[] as `byte array`, not `an array of bytes`.
Nov 08 2012
next sibling parent reply Nick Sabalausky <SeeWebsiteToContactMe semitwist.com> writes:
On Thu, 08 Nov 2012 19:45:31 +0100
"Kagamin" <spam here.lot> wrote:

 On Thursday, 8 November 2012 at 09:38:28 UTC, renoX wrote:
 I agree with your previous points, but the type declaration
 syntax is still awful IMHO: declaring int[Y][X] and then using
 [x][y]...
 I don't like reading type declarations right-to-left and then
 normal code left-to-right.
Well, then read type declarations left-to-right. It's the strangest decision in golang's design to reverse type declarations. I always read byte[] as `byte array`, not `an array of bytes`.
Doing "int[y][x] ... foo[x][y]" is an odd reversal. But Issue 9's "[x][y]int" *also* feels very backwards to me (though perhaps I'd get used to it?). Either way though, they still both beat the hell out of C/C++'s seemingly random arrangement which can't be read left-to-right *or* right-to-left. So I'm happy either way :)
Nov 08 2012
parent reply "Kagamin" <spam here.lot> writes:
Well, in the same vein one could argue that write(a,b) looks as
if the function is called first, then the arguments are computed
and passed, so the call should be written (a,b)write instead. The
language has not only syntax, but also semantics.
Nov 08 2012
parent reply Nick Sabalausky <SeeWebsiteToContactMe semitwist.com> writes:
On Thu, 08 Nov 2012 21:04:06 +0100
"Kagamin" <spam here.lot> wrote:

 Well, in the same vein one could argue that write(a,b) looks as
 if the function is called first, then the arguments are computed
 and passed, so the call should be written (a,b)write instead. The
 language has not only syntax, but also semantics.
Actually, that's one of the reasons I prefer UFCS function chaining over nested calls.
Nov 08 2012
next sibling parent "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Thu, Nov 08, 2012 at 10:47:10PM -0500, Nick Sabalausky wrote:
 On Thu, 08 Nov 2012 21:04:06 +0100
 "Kagamin" <spam here.lot> wrote:
 
 Well, in the same vein one could argue that write(a,b) looks as
 if the function is called first, then the arguments are computed
 and passed, so the call should be written (a,b)write instead. The
 language has not only syntax, but also semantics.
In that case, we should just switch wholesale to reverse Polish notation, and get rid of parenthesis completely. Why write hard-to-read expressions like a+2*(b-c) when you can write a 2 b c - * +? Then function calls would fit right in: 1 5 sqrt + 2 / GoldenRatio == assert; Even constructs like if statements would be considerably simplified: i 10 < if i++ else i--; Things like function composition would actually make sense, as opposed to the reversed order of writing things that mathematicians have imposed upon us. Instead of f(g(x)) which makes no sense in terms of ordering, we'd have x g f, which shows exactly in what order things are evaluated. ;-)
 Actually, that's one of the reasons I prefer UFCS function chaining
 over nested calls.
Fortunately for me, I got used to UFCS-style function chaining when learning jQuery. (Yes, Javascript actually proved beneficial in that case, shockingly enough.) T -- Prosperity breeds contempt, and poverty breeds consent. -- Suck.com
Nov 08 2012
prev sibling parent "Kagamin" <spam here.lot> writes:
On Friday, 9 November 2012 at 00:18:40 UTC, Tommi wrote:
 How do you read byte[5][2] from left to right?
The structure is basically the same as for byte[]. On Friday, 9 November 2012 at 03:47:16 UTC, Nick Sabalausky wrote:
 Actually, that's one of the reasons I prefer UFCS function 
 chaining
 over nested calls.
Left-to-right chaining, right-to-left chaining and nesting are three different cases.
Nov 12 2012
prev sibling parent "Tommi" <tommitissari hotmail.com> writes:
On Thursday, 8 November 2012 at 18:45:35 UTC, Kagamin wrote:
 Well, then read type declarations left-to-right. It's the
 strangest decision in golang's design to reverse type
 declarations. I always read byte[] as `byte array`, not `an
 array of bytes`.
How do you read byte[5][2] from left to right? "Byte arrays of 5 elements 2 times in an array". It's impossible. On the other hand, [5][2]byte reads nicely from left to right: "Array of 5 arrays of 2 bytes". You start with the most important fact: that it's an array. Then you start describing what the array is made of.
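The mismatch in a nutshell, as a D sketch:

    void main() {
        byte[5][2] a; // the type reads right to left: an array of 2 (byte[5])
        a[1][4] = 7;  // but indices apply left to right: first 0..1, then 0..4
        assert(a.length == 2 && a[0].length == 5);
    }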
Nov 08 2012
prev sibling next sibling parent reply Bane <branimir.milosavljevic gmail.com> writes:
Biggest mistake? That's easy:

C made Walter angry, so he created D. 

Now that it's done, I can never go back to typing & ** -> #ifdef... when I can
accomplish the same thing with much less headache using D.
Dec 30 2009
parent bearophile <bearophileHUGS lycos.com> writes:
Bane:
Now that it's done, I can never go back to typing & ** -> #ifdef... when I can
accomplish the same thing with much less headache using D.<
The replacement of -> with the dot is cute and handy, but it still leads to a little break of symmetry. If you have a class like:

    class V3 {
        float[3] data;
        void opIndexAssign(float x, size_t i) { data[i] = x; }
        double dot() {
            float s = 0.0; // slow
            foreach (d; data) s += d * d;
            return s;
        }
    }
    void main() {
        V3 v = new V3;
        v[0] = 1.0; v[1] = 2.0; v[2] = 3.0;
        printf("%f\n", v.dot());
    }

To increase performance you may want to rewrite it as:

    struct V3 {
        float[3] data;
        void opIndexAssign(float x, size_t i) { data[i] = x; }
        double dot() {
            float s = 0.0; // slow
            foreach (d; data) s += d * d;
            return s;
        }
    }
    void main() {
        V3* v = new V3;
        (*v)[0] = 1.0; (*v)[1] = 2.0; (*v)[2] = 3.0;
        printf("%f\n", v.dot());
    }

As you see, the call to dot() is unchanged, while the usage of opIndexAssign() is changed. The last line can also be written like this:

    printf("%f\n", (*v).dot());

To restore symmetry you can also write this, but I am not sure if this is good practice:

    struct V3 {
        float[3] data;
        void opIndexAssign(float x, size_t i) { data[i] = x; }
        double dot() {
            float s = 0.0; // slow
            foreach (d; data) s += d * d;
            return s;
        }
    }
    struct V3p {
        V3* ptr;
        void opIndexAssign(float x, size_t i) { assert(ptr != null); (*ptr)[i] = x; }
        double dot() { assert(ptr != null); return ptr.dot(); }
    }
    void main() {
        V3p v = V3p(new V3);
        v[0] = 1.0; v[1] = 2.0; v[2] = 3.0;
        printf("%f\n", v.dot());
    }

Bye, bearophile
Dec 30 2009
prev sibling parent reply "Ali =?UTF-8?B?w4dlaHJlbGki?= <acehreli yahoo.com> writes:
On Thursday, 24 December 2009 at 19:52:00 UTC, Walter Bright 
wrote:
 http://www.reddit.com/r/programming/comments/ai9uc/whats_cs_biggest_mistake/
That article is not on Dr.Dobb's anymore: http://www.drdobbs.com/author/Walter-Bright Is there a copy somewhere else? Ali
Nov 07 2012
next sibling parent Andrej Mitrovic <andrej.mitrovich gmail.com> writes:
On 11/7/12, "Ali Çehreli" <acehreli yahoo.com> wrote:
 On Thursday, 24 December 2009 at 19:52:00 UTC, Walter Bright
 wrote:
 http://www.reddit.com/r/programming/comments/ai9uc/whats_cs_biggest_mistake/
 That article is not on Dr.Dobb's anymore:

    http://www.drdobbs.com/author/Walter-Bright

 Is there a copy somewhere else?

 Ali
http://web.archive.org/web/20100128003913/http://dobbscodetalk.com/index.php?option=com_myblog&show=Cs-Biggest-Mistake.html&Itemid=29
Nov 07 2012
prev sibling parent "Jesse Phillips" <Jessekphillips+D gmail.com> writes:
On Wednesday, 7 November 2012 at 21:36:57 UTC, Ali Çehreli wrote:
 On Thursday, 24 December 2009 at 19:52:00 UTC, Walter Bright 
 wrote:
 http://www.reddit.com/r/programming/comments/ai9uc/whats_cs_biggest_mistake/
That article is not on Dr.Dobb's anymore: http://www.drdobbs.com/author/Walter-Bright Is there a copy somewhere else? Ali
I think I've seen that with other articles. Dr. Dobb's does not appear to provide very permanent hosting for articles.
Nov 07 2012