
digitalmars.D - Re: [article] Language Design Deal Breakers

reply "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Sun, May 26, 2013 at 01:13:30PM +0200, Paulo Pinto wrote:
[...]
 Now it is too late for it, but at the time C could have stayed as
 powerful as it is while offering:
 
 - proper modules, or at least namespaces
 
 - no automatic conversions between arrays and pointers. How hard is it
 to write &a[0]?

I think this was a historical accident. The original C, being an abstraction of assembly, simply inherited the convention of using a memory address to refer to some data, be it singular (pointer) or plural (array). Of course, in retrospect, this conflation was a bad idea, but it's not surprising that this choice was made.
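A small sketch of the conflation (made-up names, of course):

    /* array-to-pointer decay: 'a' and '&a[0]' mean the same thing to a callee */
    #include <stdio.h>

    void takes_pointer(const int *p) { printf("first element: %d\n", p[0]); }

    int main(void) {
        int a[4] = {1, 2, 3, 4};
        takes_pointer(a);       /* 'a' silently decays to a pointer          */
        takes_pointer(&a[0]);   /* the explicit spelling Paulo is asking for */
        printf("%zu vs %zu\n", sizeof a, sizeof &a[0]);  /* e.g. 16 vs 8 on a 64-bit target */
        return 0;
    }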
 - arguments by reference, no need to check for null for every parameter

Again, this was a historical accident. In assembly, there is no distinction between a pointer and a reference, so it's not surprising that the inventors of C didn't make that distinction either. In retrospect, of course, this would've been a huge improvement, but as they say, hindsight is always 20-20.
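A minimal sketch of the boilerplate in question (names are mine):

    #include <stddef.h>

    /* with a pointer parameter, every callee has to decide what NULL means */
    size_t count_chars(const char *s) {
        if (s == NULL)   /* the defensive check a non-nullable reference would make unnecessary */
            return 0;
        size_t n = 0;
        while (s[n] != '\0')
            n++;
        return n;
    }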
 - strong typed enumerations

I've always been on the fence about this one. I wonder how things would have turned out if Kernighan and Ritchie had made a distinction between proper enumerations (nothing outside of the set is allowed) and what's essentially an int with named values (values outside the set of names are allowed). I find that a lot of C code actually wants the latter, not the former, esp. with bitfields.
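For example (hypothetical flag names), this is perfectly legal C precisely because an enum is just an int with names attached:

    enum FileMode { MODE_READ = 1, MODE_WRITE = 2, MODE_APPEND = 4 };

    int main(void) {
        /* 3 is not one of the named values, yet flag-style code depends on this */
        enum FileMode m = MODE_READ | MODE_WRITE;
        return (int)m;
    }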
 - memory allocation without requiring the developer to use sizeof
 everywhere

And typecasts everywhere. I mean, seriously, it's so incredibly annoying to keep writing:

    myLongNamedType *t = (myLongNamedType *)malloc(sizeof(myLongNamedType) * n);
    if (!t) { ... }

... over and over and over. Talk about verbosity. (And this isn't even Java!) In retrospect, though, I can understand why it was done this way (malloc was just another library function), but again, hindsight is 20-20. Back then there was a lot of pressure to minimize the language; nowadays we know better, and realize that certain things, like memory allocation, really are much better handled within the language itself.
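For what it's worth, the usual mitigation (just a sketch, reusing the type name from above) only goes so far:

    #include <stdlib.h>

    typedef struct { double x, y, z; } myLongNamedType;

    myLongNamedType *make_array(size_t n) {
        /* sizeof *t avoids repeating the type name, and in C (unlike C++)
           the cast on malloc's void* result isn't needed at all */
        myLongNamedType *t = malloc(n * sizeof *t);
        return t;   /* the caller still has to check for NULL and eventually free() it */
    }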
 - strings similar to what D has

Yeah, ASCIIZ was both a blessing and a curse. A blessing in that you don't have to keep passing a length around; a curse in that sometimes passing a length around is the right thing to do.
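Which is why everybody ends up hand-rolling something like this sooner or later (a rough sketch; roughly what a D string carries built in):

    #include <stdio.h>
    #include <string.h>

    /* a length-carrying slice: pointer and length travel together */
    struct str_slice { const char *ptr; size_t len; };

    int main(void) {
        const char *z = "hello world";          /* ASCIIZ: the length is implicit */
        struct str_slice s = { z, strlen(z) };  /* one O(n) walk to recover it    */
        printf("%.*s (length %zu)\n", (int)s.len, s.ptr, s.len);
        return 0;
    }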
 - proper arrays, after all the compilers for other languages always
 offered control over when bound checking code was generated

Well, in those days, premature optimization was king, and I can see why C went the way it did. Saving every last byte was always a top concern, even if today we scoff at it (and rightly so!). Time proved, though, that ultimately we *had* to keep the length around for various reasons anyway, so in retrospect proper arrays should've been included in the first place.
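And indeed the length ends up riding along as a second parameter anyway, with nothing checking it for us (a sketch with made-up names):

    #include <stddef.h>

    int sum(const int *a, size_t len) {
        int total = 0;
        for (size_t i = 0; i < len; i++)  /* pass the wrong len and this silently reads past the end */
            total += a[i];
        return total;
    }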
 In the end, same syntax, just some semantic improvements on the type
 system.
 
 But now it is too late; we only have modern C++ with its warts, or
 hopefully D, Rust, Go, C#, or something else as a possible
 replacement.
 
 However, given that C and UNIX are one and the same, it will outlive us all.

I don't know, if we get our act together in polishing up D, we may yet live to see the rightful demise of C (or, more likely, its fading into the obscure dusts of time). I'm hopeful, anyway. :)

On Sun, May 26, 2013 at 05:22:18AM -0400, Nick Sabalausky wrote:
 On Sun, 26 May 2013 13:27:55 +1000
 Peter Williams <pwil3058 bigpond.net.au> wrote:
 
 I should mention that this was back in the mid
 90s and C++ may have improved since then :-).
 

 I dunno. The more I learned about C++'s more advanced features the more disillusioned I became with it. I was always happiest (or at least, least unhappy) with the "C with classes" form of C++.

Have to agree with that. Though when templates first came out, they were a huge thing for me. It was only later that it became clear that the way C++ handled them was ... well, it left a lot to be desired. :) When I got acquainted with D's templates, I was totally blown away. It was like a veil was lifted and I saw for the first time what a *real* template system ought to look like.
 But then again, maybe that has nothing to do with "older C++ vs newer
 C++"?

"C with classes" *was* the older C++, whatever that meant. Since then C++ has been trying to become purely OO (and failing horribly, IMO -- Java takes the cake in that area, no matter how much I may dislike that fact), and tried to clean up the horrible mess it inherited from C (but not very successfully). C++11 (finally!) introduced lambdas and type inference, and a bunch of other stuff, but ... meh. A lot of it feels like "too little, too late". T -- Indifference will certainly be the downfall of mankind, but who cares? -- Miquel van Smoorenburg
May 26 2013
Paulo Pinto <pjmlp progtools.org> writes:
On 26.05.2013 17:18, H. S. Teoh wrote:
 [...]
I don't know, if we get our act together in polishing up D, we may yet live to see the rightful demise of C (or more likely, its fading into the obscure dusts of time). I'm hopeful, anyway. :)

I do understand why many of C's decisions were made; it is especially interesting to read Ritchie's history of how everything came to be:

http://cm.bell-labs.com/cm/cs/who/dmr/chist.html

I started using computers at the age of 10 back in 1986, so there are a few of those early languages that I missed. However, given my special interest in language design and the endless stream of information my university used to offer, I devoured all the publications I could get my hands on.

According to many of those papers, some of the languages available in those days for systems programming could have provided for a safer C, assuming C's designers had been aware of them. Then UNIX's popularity made everyone have the same tools on their home systems, and we got C instead of Modula-2 or similar languages.

I am not religious about it and will use it if it makes sense to do so, but since 2001 I haven't missed it.

-- 
Paulo
May 26 2013