
digitalmars.D.learn - ESR on post-C landscape

Laeeth Isharc <laeeth laeeth.com> writes:
He mentions D, a bit dismissively.
http://esr.ibiblio.org/?p=7724&cpage=1#comment-1912717
Nov 13
lobo <swamp.lobo gmail.com> writes:
On Tuesday, 14 November 2017 at 04:31:43 UTC, Laeeth Isharc wrote:
 He mentions D, a bit dismissively.
 http://esr.ibiblio.org/?p=7724&cpage=1#comment-1912717
"[snip]...Then came the day we discovered that a person we incautiously gave commit privileges to had fucked up the games’s AI core. It became apparent that I was the only dev on the team not too frightened of that code to go in. And I fixed it all right – took me two weeks of struggle. After which I swore a mighty oath never to go near C++ again. ...[snip]"

Either no one manages SW on his team, so this "bad" dev could run off and build a monster architecture over a period of weeks, or this guy has no idea how to revert a commit. I stopped reading at this point.
Nov 13
Ola Fosheim Grøstad writes:
On Tuesday, 14 November 2017 at 06:32:55 UTC, lobo wrote:
 "[snip]...Then came the day we discovered that a person we 
 incautiously gave commit privileges to had fucked up the 
 games’s AI core. It became apparent that I was the only dev on 
 the team not too frightened of that code to go in. And I fixed 
 it all right – took me two weeks of struggle. After which I 
 swore a mighty oath never to go near C++ again. ...[snip]"

 Either no one manages SW in his team so that this "bad" dev 
 could run off and to build a monster architecture, which would 
 take weeks, or this guy has no idea how to revert commit.
ESR got famous for his cathedral vs bazaar piece, which IMO was basically just a not very insightful allegory for waterfall vs evolutionary development models, but since many software developers don't know the basics of software development he managed to become infamous for it… But I think embracing emergence has hurt open source projects more than it has helped them. D bears signs of too much emergence too, and is still trying to correct those «random moves» with DIPs.

ESR states: «C is flawed, but it does have one immensely valuable property that C++ didn’t keep – if you can mentally model the hardware it’s running on, you can easily see all the way down. If C++ had actually eliminated C’s flaws (that is, been type-safe and memory-safe) giving away that transparency might be a trade worth making. As it is, nope.»

I don't think this is true; you can reduce C++ down to the level where it is just like C. If he cannot mentally model the hardware in C++, that basically just means he has never tried to get there… I also think he is in denial if he does not see that C++ is taking over C. Starting a big project in C today sounds like a very bad idea to me.

Actually, one could say that one of the weaknesses of C++ is that it is limited by a relatively direct mapping to the underlying hardware, which makes some types of optimization and convenient programming harder. *shrug*
Nov 14
Patrick Schluter <Patrick.Schluter bbox.fr> writes:
On Tuesday, 14 November 2017 at 09:43:07 UTC, Ola Fosheim Grøstad 
wrote:
 On Tuesday, 14 November 2017 at 06:32:55 UTC, lobo wrote:
 "[snip]...Then came the day we discovered that a person we 
 incautiously gave commit privileges to had fucked up the 
 games’s AI core. It became apparent that I was the only dev on 
 the team not too frightened of that code to go in. And I fixed 
 it all right – took me two weeks of struggle. After which I 
 swore a mighty oath never to go near C++ again. ...[snip]"

 Either no one manages SW in his team so that this "bad" dev 
 could run off and to build a monster architecture, which would 
 take weeks, or this guy has no idea how to revert commit.
ESR got famous for his cathedral vs bazaar piece, which IMO was basically just a not very insightful allegory for waterfall vs evolutionary development models, but since many software developers don't know the basics of software development he managed to become infamous for it… But I think embracing emergence has hurt open source projects more than it has helped them. D bears signs of too much emergence too, and is still trying to correct those «random moves» with DIPs.

ESR states: «C is flawed, but it does have one immensely valuable property that C++ didn’t keep – if you can mentally model the hardware it’s running on, you can easily see all the way down. If C++ had actually eliminated C’s flaws (that is, been type-safe and memory-safe) giving away that transparency might be a trade worth making. As it is, nope.»

I don't think this is true; you can reduce C++ down to the level where it is just like C. If he cannot mentally model the hardware in C++, that basically just means he has never tried to get there…
The sheer amount of inscrutable cruft and rules, plus the moving target of continuously changing semantics an order of magnitude or two bigger than C's, added to the fact that you still need to know C's gotchas, makes it one or two orders of magnitude more difficult to mentally model the hardware. You can also mentally model the hardware with Intercal; if you haven't managed to, it just means you haven't tried hard enough.
 I also think he is in denial if he does not see that C++ is 
 taking over C. Starting a big project in C today sounds like a 
 very bad idea to me.
Even worse in C++ with its changing standards every 5 years.
Nov 16
Ola Fosheim Grostad <ola.fosheim.grostad gmail.com> writes:
On Thursday, 16 November 2017 at 18:02:10 UTC, Patrick Schluter 
wrote:
 The sheer amount of inscrutable cruft and rules, plus the 
 moving target of continuously changing semantics an order of 
 magnitude or two bigger than C's, added to the fact that you 
 still need to know C's gotchas, makes it one or two orders of 
 magnitude more difficult to mentally model the hardware.
I don't feel that way; most of what C++ adds to C happens at the type-system or textual level. The core language is similar to C.
 Even worse in C++ with its changing standards every 5 years.
But those features are mostly shorthand for things that are already in the language. E.g. lambdas are just objects, and move semantics is just an additional nominal ref type with barely any semantics attached to it (some rules for coercion to regular references)... So while these things make a difference, they don't change my low-level mental model of C++, which remains as close to C today as it did in the 90s.
Nov 16
sarn <sarn theartofmachinery.com> writes:
On Tuesday, 14 November 2017 at 09:43:07 UTC, Ola Fosheim Grøstad 
wrote:
 ESR got famous for his cathedral vs bazaar piece, which IMO was 
 basically just a not very insightful allegory over waterfall vs 
 evolutionary development models, but since many software 
 developers don't know the basics of software development he 
 managed to become infamous for it…
Everything ESR says is worth taking with a good dose of salt, but his "The Art of Unix Programming" isn't a bad read.
Nov 16
Era Scarecrow <rtcvb32 yahoo.com> writes:
On Tuesday, 14 November 2017 at 06:32:55 UTC, lobo wrote:
 And I fixed it all right – took me two weeks of struggle. After 
 which I swore a mighty oath never to go near C++ again. 
 ...[snip]"
Reminds me of the last time I touched C++. A friend wanted help with the Unreal Engine. While I was skeptical, the actual headers and code I was going into were... really straightforward. #IfDefs to encapsulate and control whether something was/wasn't used, and simple C syntax with no overrides or special code otherwise. But it was ugly... it was verbose... it was still hard to find my way around. And I still don't want to ever touch C++ if I can avoid it.
Nov 14
codephantom <me noyb.com> writes:
On Tuesday, 14 November 2017 at 04:31:43 UTC, Laeeth Isharc wrote:
 He mentions D, a bit dismissively.
 http://esr.ibiblio.org/?p=7724&cpage=1#comment-1912717
The reason he can dismiss D so easily is his starting premise that C is flawed. As soon as you begin with that premise, you justify searching for C's replacement, which makes it difficult to envision something like D.

That's why we got C++, instead of D. Because the starting point for C++ was the idea that C was flawed.

C is not flawed. It doesn't need a new language to replace it. If that was the starting point for Go and Rust, then it is ill conceived.

One should also not make the same error by starting with the premise that we need a simpler language to replace the complexity of the C++ language. If that was the starting point for Go and Rust, then it is ill conceived.

What we need is a language that provides you with the flexibility to model your solution to a problem, *as you see fit*. If that were my starting point, then it's unlikely I'd end up designing Go or Rust. Only something like D can result from that starting point.

I'd like Eric to go write a new article with that as the starting point. Because then it's unlikely he would get away with being so dismissive of D.
Nov 14
Ola Fosheim Grostad <ola.fosheim.grostad gmail.com> writes:
On Tuesday, 14 November 2017 at 11:55:17 UTC, codephantom wrote:
 The reason he can dismiss D, so easily, is because of his 
 starting premise that C is flawed. As soon as you begin with 
 that premise, you justify searching for C's replacement, which 
 makes it difficult to envision something like D.
Well, in another thread he talked about the Tango split, so not sure where he is coming from.
 That's why we got C++, instead of D. Because the starting point 
 for C++, was the idea that C was flawed.
No, the starting point for C++ was that Simula is better for a specific kind of modelling than C.
 C is not flawed. It doesn't need a new language to replace it.
It is flawed... ESR got that right, not sure how anyone can disagree. The only thing C has going for it is that CPU designs have been adapted to C for decades. But that is changing. C no longer models the hardware in a reasonable manner.
 If that was the starting point for Go and Rust, then it is ill 
 conceived.
It wasn't really. The starting point for Go was just as much a language used to implement Plan 9. Don't know about Rust, but it looks like an ML spinoff.
 One should also not make the same error, by starting with the 
 premise that we need a simpler language to replace the 
 complexity of the C++ language.
Why not? Much of the evolved complexity of C++ can be removed by streamlining.
 If that was the starting point for Go and Rust, then it is ill 
 conceived.
It was the starting point for D...
 What we need, is a language that provides you with the 
 flexibility to model your solution to a problem, *as you see 
 fit*.

 If that were my starting point, then it's unlikely I'd end up 
 designing Go or Rust. Only something like D can result from 
 that starting point.
Or C++, or ML, or BETA, or Scala, or etc etc...
 Because then, it's unlikely he would get away with being so 
 dismissive of D.
If he is dismissive of C++ and Rust, then he will most likely remain dismissive of D as well?
Nov 14
codephantom <me noyb.com> writes:
On Tuesday, 14 November 2017 at 16:38:58 UTC, Ola Fosheim Grostad 
wrote:
 It [C] is flawed... ESR got that right, not sure how anyone can 
 disagree.
Well I 'can' disagree ;-)

Is a scalpel flawed because someone tried to use it to screw in a screw?

Languages are just part of an evolutionary chain. No part of the chain should be considered flawed - unless it was actually flawed - in that it didn't meet the demands of the environment in which it was initially conceived. In that circumstance, it must be considered flawed, and evolutionary forces will quickly take care of that. But a programming language is not flawed simply because people use it in an environment where it was not designed to operate.

If I take the average Joe Blow out of his comfy house and put him in the middle of a raging battlefield, is Joe Blow flawed because he quickly got shot down? What's flawed there is the decision to take Joe Blow and put him in the battlefield.

Corporate needs/strategy skews one's view of the larger environment, and infects language design. I think it's infected Go, from the get Go. I am glad D is not being designed by a corporate, otherwise D would be something very different, and far less interesting.

The idea that C is flawed also skews one's view of the larger environment, and so it too infects language design. This is where Eric got it wrong, in my opinion. He's looking for the language that can best fix the flaws of C. In fact C has barely had to evolve (which is not a sign of something that is flawed), because it works just fine in the environments for which it was designed. And those environments still exist today. They will still exist tomorrow..and the next day...and the next..and...

So language designers..please stop the senseless bashing of C. Why does anyone need array index validation anyway? I don't get it. If you're indexing incorrectly into an array..you're a fool.

btw. The conditions under which C evolved are well documented here. It's a fascinating read. https://www.bell-labs.com/usr/dmr/www/chist.pdf
Nov 14
Paulo Pinto <pjmlp progtools.org> writes:
On Wednesday, 15 November 2017 at 02:05:27 UTC, codephantom wrote:
 On Tuesday, 14 November 2017 at 16:38:58 UTC, Ola Fosheim 
 Grostad wrote:
 It [C]is flawed... ESR got that right, not sure how anyone can 
 disagree.
Well I 'can' disagree ;-)

Is a scalpel flawed because someone tried to use it to screw in a screw?

Languages are just part of an evolutionary chain. No part of the chain should be considered flawed - unless it was actually flawed - in that it didn't meet the demands of the environment in which it was initially conceived. In that circumstance, it must be considered flawed, and evolutionary forces will quickly take care of that. But a programming language is not flawed simply because people use it in an environment where it was not designed to operate.

If I take the average Joe Blow out of his comfy house and put him in the middle of a raging battlefield, is Joe Blow flawed because he quickly got shot down? What's flawed there is the decision to take Joe Blow and put him in the battlefield.

Corporate needs/strategy skews one's view of the larger environment, and infects language design. I think it's infected Go, from the get Go. I am glad D is not being designed by a corporate, otherwise D would be something very different, and far less interesting.

The idea that C is flawed also skews one's view of the larger environment, and so it too infects language design. This is where Eric got it wrong, in my opinion. He's looking for the language that can best fix the flaws of C. In fact C has barely had to evolve (which is not a sign of something that is flawed), because it works just fine in the environments for which it was designed. And those environments still exist today. They will still exist tomorrow..and the next day...and the next..and...

So language designers..please stop the senseless bashing of C. Why does anyone need array index validation anyway? I don't get it. If you're indexing incorrectly into an array..you're a fool.

btw. The conditions under which C evolved are well documented here. It's a fascinating read. https://www.bell-labs.com/usr/dmr/www/chist.pdf
Quite fascinating indeed.

"Although the first edition of K&R described most of the rules that brought C’s type structure to its present form, many programs written in the older, more relaxed style persisted, and so did compilers that tolerated it. To encourage people to pay more attention to the official language rules, to detect legal but suspicious constructions, and to help find interface mismatches undetectable with simple mechanisms for separate compilation, Steve Johnson adapted his pcc compiler to produce lint [Johnson 79b], which scanned a set of files and remarked on dubious constructions."

"Two ideas are most characteristic of C among languages of its class: the relationship between arrays and pointers, and the way in which declaration syntax mimics expression syntax. They are also among its most frequently criticized features, and often serve as stumbling blocks to the beginner. In both cases, historical accidents or mistakes have exacerbated their difficulty. The most important of these has been the tolerance of C compilers to errors in type."

"On the other hand, C’s treatment of arrays in general (not just strings) has unfortunate implications both for optimization and for future extensions. The prevalence of pointers in C programs, whether those declared explicitly or arising from arrays, means that optimizers must be cautious, and must use careful dataflow techniques to achieve good results. Sophisticated compilers can understand what most pointers can possibly change, but some important usages remain difficult to analyze."

There are quite a few snippets to take from it, but I will finish with the design goals regarding the spirit of C for ANSI C11 instead.

"12. Trust the programmer, as a goal, is outdated in respect to the security and safety programming communities. While it should not be totally disregarded as a facet of the spirit of C, the C11 version of the C Standard should take into account that programmers need the ability to check their work."

http://www.open-std.org/jtc1/sc22/wg14/www/docs/n2021.htm
Nov 15
Ola Fosheim Grøstad writes:
On Wednesday, 15 November 2017 at 02:05:27 UTC, codephantom wrote:
 On Tuesday, 14 November 2017 at 16:38:58 UTC, Ola Fosheim 
 Grostad wrote:
 It [C] is flawed... ESR got that right, not sure how anyone can 
 disagree.
Well I 'can' disagree ;-)
Right… :-)
 Languages are just part of an evolutionary chain.
Right, and C is part of this chain: BCPL->B->C. The funny thing is that BCPL was never meant to be used as a language beyond bootstrapping CPL, but given the limited computers of the day, BCPL and later C became the default system programming languages exactly because they weren't much of a language and fit rather well with the CPUs of the day (among other things, you could hold the whole compiler in main memory ;-).

But C doesn't fit well with the underlying hardware anymore, even though CPU makers benchmark against existing C code and make provisions for the C model. That argument can be used against C++, D and Rust too. :-P So, if the abstraction no longer matches the concrete hardware well, then it will make less and less sense to use it.

Overall, I think we will over time see growth in higher level languages designed to map well onto the hardware at the optimization stage. Current languages aren't quite there yet though, and frankly, neither are the CPUs. I think we are in a transition period (wide pipelines in the CPU + GPU is an indicator).
 No part of the chain should be considered flawed - unless it 
 was actually flawed - in that it didn't meet the demands of the 
 environment in which it was initially conceived.
Right, but the performance bottle-neck of serial code is now forcing changes in the environment.
 In that circumstance, it must be considered flawed, and 
 evolutionary forces will quickly take care of that.
Not quickly, you have the whole issue with installed base, center of gravity… The mindset of people…
 But a programming language is not flawed, simply because people 
 use it an environment where it was not designed to operate.
Ok, fair enough. BCPL was meant to bootstrap CPL and C was a hack to implement Unix… ;^)
 Corporate needs/strategy, skews ones view of the larger 
 environment, and infects language design. I think it's infected 
 Go, from the get Go. I am glad D is not being designed by a 
 corporate, otherwise D would be something very different, and 
 far less interesting.
I don't think Go is much affected by the corporate… The Go designers appear to be strong-headed, and the language design is in line with their prior language designs. I believe they also made the same mistake as D with reallocating buffers "randomly" when extending slices, such that you end up with two arrays because the other slices aren't updated. Not very reassuring when a team of people makes such correctness blunders. And what about their "exceptions", a dirty hack only added because they are hellbent on not having exceptions… All about the mentality of the designers… put blame where blame is due…

The design of Dart was affected by the corporate requirements, according to the designer, who called it a "bland language". I think he would have preferred something closer to Smalltalk :-). Dart is probably more important to Google than Go, as their business frontend depends on it. And Dart is now getting static typing, to the dismay of the original designer (who is in the dynamic camp).

But yeah, it can be interesting to think about how a mix of personalities and external pressure affects language design.
 This is where Eric got it wrong, in my opinion. He's looking 
 for the language that can best fix the flaws of C.
Seems to me that he is looking for something that is easier to deal with than C, but where he can retain his C mindset without having performance issues related to GC. So the bare-bones semantics of Go combined with a decent runtime and libraries that are geared towards network programming probably fits his use case (NTP something?)
 In fact C has barely had to evolve (which is not a sign of 
 something that is flawed), because it works just fine, in 
 enviroments for which it was designed to work in. And those 
 enviroments still exist today. They will still exist 
 tomorrow..and the next day...and the next..and...
They exist in embedded devices/SoCs etc. Not sure if it is reasonable to say that they exist on the desktop anymore beyond a mere hygienic backwards-compatibility mentality among CPU designers. Ola.
Nov 15
codephantom <me noyb.com> writes:
On Wednesday, 15 November 2017 at 09:26:49 UTC, Ola Fosheim 
Grøstad wrote:
 I don't think Go is much affected by the corporate…
Umm.... "We made the language to help make google more productive and helpful internally" - Rob Pike https://www.youtube.com/watch?v=sln-gJaURzk 2min:55sec To be honest, it's really hard for me to be critical of something that (The) Ken Thompson was involved in ;-) ...but I've gotta speak the truth. sorry Ken. But Go sucks.
Nov 15
Ola Fosheim Grostad <ola.fosheim.grostad gmail.com> writes:
On Wednesday, 15 November 2017 at 10:40:50 UTC, codephantom wrote:
 On Wednesday, 15 November 2017 at 09:26:49 UTC, Ola Fosheim 
 Grøstad wrote:
 I don't think Go is much affected by the corporate…
Umm.... "We made the language to help make google more productive and helpful internally" - Rob Pike
I know, I followed the debate for a while, but that sounds much more like a defence of their own minimalistic aesthetics (which they don't deny) than a corporate requirement. With a different team Go most certainly would have exceptions and generics, like Dart. Makes no sense to claim that their server programmers are less skilled than their front end programmers?
Nov 15
Patrick Schluter <Patrick.Schluter bbox.fr> writes:
On Tuesday, 14 November 2017 at 16:38:58 UTC, Ola Fosheim Grostad 
wrote:
 On Tuesday, 14 November 2017 at 11:55:17 UTC, codephantom wrote:
 [...]
Well, in another thread he talked about the Tango split, so not sure where he is coming from.
 [...]
No, the starting point for C++ was that Simula is better for a specific kind of modelling than C.
 [...]
It is flawed... ESR got that right, not sure how anyone can disagree. The only thing C has going for it is that CPU designs have been adapted to C for decades. But that is changing. C no longer models the hardware in a reasonable manner.
Because of the flawed interpretation of UB by the compiler writers, not because of a property of the language itself.
Nov 16
Ola Fosheim Grostad <ola.fosheim.grostad gmail.com> writes:
On Thursday, 16 November 2017 at 18:06:22 UTC, Patrick Schluter 
wrote:
 On Tuesday, 14 November 2017 at 16:38:58 UTC, Ola Fosheim 
 Grostad wrote:
 changing. C no longer models the hardware in a reasonable 
 manner.
Because of the flawed interpretation of UB by the compiler writers, not because of a property of the language itself.
No, I am talking about the actual hardware, not UB. In the 80s there was an almost 1-to-1 correspondence between C and CPU internals. CPUs are still designed for C, but the more code shifts away from C, the more rewarding it will be for hardware designers to move to more parallel designs.
Nov 16
codephantom <me noyb.com> writes:
On Tuesday, 14 November 2017 at 11:55:17 UTC, codephantom wrote:
 The reason he can dismiss D, so easily, is because of his 
 starting premise that C is flawed. As soon as you begin with 
 that premise, you justify searching for C's replacement, which 
 makes it difficult to envision something like D.

 That's why we got C++, instead of D. Because the starting point 
 for C++, was the idea that C was flawed.
Actually, I got that wrong.

Perhaps the mistake C++ made was concluding that 'classes' were the "proper primary focus of program design" (ch. 1, The Design and Evolution of C++). I have to wonder whether that conclusion sparked the inevitable demise of C++.

Eric should be asking a similar question about Go ..what decision has been made that sparked Go's inevitable demise - or, in the case of Go, 'decision' would be 'decisions'. This is what did it for me:

a := b
Nov 15
Bauss <jj_1337 live.dk> writes:
On Thursday, 16 November 2017 at 02:12:10 UTC, codephantom wrote:
 On Tuesday, 14 November 2017 at 11:55:17 UTC, codephantom wrote:
[...]
Actually, I got that wrong.

Perhaps the mistake C++ made was concluding that 'classes' were the "proper primary focus of program design" (ch. 1, The Design and Evolution of C++). I have to wonder whether that conclusion sparked the inevitable demise of C++.

Eric should be asking a similar question about Go ..what decision has been made that sparked Go's inevitable demise - or, in the case of Go, 'decision' would be 'decisions'. This is what did it for me:

a := b
interface{} definitely
Nov 15
Ola Fosheim Grostad <ola.fosheim.grostad gmail.com> writes:
On Thursday, 16 November 2017 at 02:12:10 UTC, codephantom wrote:
 Perhaps the mistake C++ made, was concluding that 'classes' 
 were the "proper primary focus of program design" (chp1. The 
 Design and Evolution of C++).
No, classes are a powerful modelling primitive. C++ got that right. C++ is also fairly uniform because of it. Not as uniform as Self and Beta, but more so than D. People who harp about how OO is a failure don't know how to do real-world modelling...
 I have to wonder whether that conclusion sparked the inevitable 
 demise of C++.
There is no demise...
 Eric should be asking a similar question about Go ..what 
 decision has been made that sparked Go's inevitable demise - or 
 in the case of Go, decision would be decisions.
Go is growing...
 a := b
A practical shorthand; if you don't like it, then don't use it.
Nov 15
rikki cattermole <rikki cattermole.co.nz> writes:
On 16/11/2017 6:35 AM, Ola Fosheim Grostad wrote:
 On Thursday, 16 November 2017 at 02:12:10 UTC, codephantom wrote:
 Perhaps the mistake C++ made, was concluding that 'classes' were the 
 "proper primary focus of program design" (chp1. The Design and 
 Evolution of C++).
No, classes are a powerful modelling primitive. C++ got that right. C++ is also fairly uniform because of it. Not as uniform as Self and Beta, but more so than D. People who harp about how OO is a failure don't know how to do real-world modelling...
Thing is, it is a failure the way most people use it. When used correctly it is a very nice addition to any code base. It just can't be the only one.
Nov 15
Ola Fosheim Grostad <ola.fosheim.grostad gmail.com> writes:
On Thursday, 16 November 2017 at 06:51:58 UTC, rikki cattermole 
wrote:
 On 16/11/2017 6:35 AM, Ola Fosheim Grostad wrote:
 Thing is, it is a failure, the way most people use it.
You can say that about most things: exceptions, arrays, pointers, memory, structs with public fields... But I guess what you are saying is that many people aren't good at modelling...
 When used correctly it is a very nice addition to any code base.
 It just can't be the only one.
Well, it can in a flexible OO language (niche languages). However it was never meant to be used out of context, i.e not meant to be used for "pure math".
Nov 15
Ola Fosheim Grøstad writes:
On Thursday, 16 November 2017 at 07:12:16 UTC, Ola Fosheim 
Grostad wrote:
 But I guess what you are saying is that many people aren't good 
 at modelling...
I just want to add to this that I believe most people are much better at OO modelling than at other modelling strategies (ER, SA, NIAM etc.), simply because people are good at understanding stereotypes, even without training. The ability to communicate with your customer is therefore a good reason to use OO modelling, so that you can get some feedback on your take on their world.
Nov 15
codephantom <me noyb.com> writes:
On Thursday, 16 November 2017 at 06:35:30 UTC, Ola Fosheim 
Grostad wrote:
 No, classes is a powerful modelling primitive. C++ got that 
 right. C++ is also fairly uniform because of it.
Yes, I agree that classes are a powerful modelling primitive, but my point was that Stroustrup made classes the 'primary focus of program design'. Yes, that made it more uniform alright... uniformly more complicated. And why? Because he went on to throw C into the mix, because performance in Simula was so poor and would not scale. C promised the efficiency and scalability he was after. But an efficient and scalable 'class oriented' language meant complexity was inevitable.

It wasn't a bad decision on his part. It was right for the time, I guess. But it set the stage for its demise, I believe.
 People who harp about how OO is a failure don't know how to do 
 real world modelling...
I would never say OO itself is a failure. But the idea that it should be the 'primary focus of program design' .. I think that is a failure... and I think that principle is generally accepted these days.
 I have to wonder whether that conclusion sparked the 
 inevitable demise of C++.
There is no demise...
If the next C++ doesn't get modules, that'll be the end of it...for sure.
 Eric should be asking a similar question about Go ..what 
 decision has been made that sparked Go's inevitable demise - 
 or in the case of Go, decision would be decisions.
Go is growing...
Yeah..but into what? It's all those furry gopher toys, t-shirts, and playful colors.. I think that's what's attracting people to Go. Google is the master of advertising after all. Would work well in a kindergarten. But it makes me want to puke. It's so fake.
 a := b
A practical shorthand, if you dont like it, then dont use it.
It was just a senseless, unnecessary change. The immediate impression I got was that they were trying to undo a decision made when B was developed, rather than doing it because it really assisted the modern programmer (what language uses that? None that I use, that's for sure). And I get that feeling about other decisions they've made... as if they are just trying to correct the past. They should be focused on the future.

They should have got some experienced younger programmers at Google to design a language instead. I bet it wouldn't look anything like Go.
Nov 16
Ola Fosheim Grostad <ola.fosheim.grostad gmail.com> writes:
On Thursday, 16 November 2017 at 11:24:09 UTC, codephantom wrote:
 On Thursday, 16 November 2017 at 06:35:30 UTC, Ola Fosheim 
 Grostad wrote:
 Yes, I agree that classes are a powerful modelling primitive, 
 but my point was that Stroustrup made classes the 'primary 
 focus of program design'. Yes, that made it more uniform 
 alright... uniformly more complicated. And why? Because he went 
 on to throw C into the mix, because performance in Simula was 
 so poor, and would not scale. C promised the efficiency and 
 scalability he was after. But an efficient and scalable 'class 
 oriented' language, means complexity was inevitable.
Nah, he is just making excuses. Simula wasn't particularly slow as a design, but it used a GC similar to the one in D and bounds checks on arrays, like D. C++ was just a simple layer over C and evolved from that. It had nothing to do with language design; it was all about cheap implementation. The initial version of C++ was cheap and easy to do.
 I would never say OO itself is a failure. But the idea that it 
 should be the 'primary focus of program design' .. I think that 
 is a failure...and I think that principle is generally accepted 
 these days.
Uhm, no? What do you mean by 'primary focus of program design' and in which context?
 If the next C++ doesn't get modules, that'll be the end of 
 it...for sure.
I like namespaces. Flat is generally better when you want explicit qualifications.
 Yeah..but into what? It's all those furry gopher toys, 
 t-shirts, and playful colors.. I think that's what's attracting 
 people to Go. Google is the master of advertising afterall. 
 Would work well in a kindergarten. But it makes me want to 
 puke. It's so fake.
It is the runtime and standard library. And stability. Nice for smaller web services.
 correct the past. They should be focused on the future. They 
 should have got some experienced younger programmers at google 
 to design a language instead. I bet it wouldn't look anything 
 like Go.
Go isn't exciting and has some shortcomings that are surprising, but they managed to reach a stable state, which is desirable when writing server code. It is this stability that has ensured that they could improve on the runtime. ("experienced young programmers" is a rather contradictory term, btw :-)
Nov 16
next sibling parent reply sarn <sarn theartofmachinery.com> writes:
On Thursday, 16 November 2017 at 11:52:45 UTC, Ola Fosheim 
Grostad wrote:
 On Thursday, 16 November 2017 at 11:24:09 UTC, codephantom
 I would never say OO itself is a failure. But the idea that it 
 should be the 'primary focus of program design' .. I think 
 that is a failure...and I think that principle is generally 
 accepted these days.
 Uhm, no? What do you mean by 'primary focus of program design' and in which context?
In the 90s (and a bit into the 00s) there was a pretty extreme "everything must be an object; OO is the solution to everything" movement in the industry. Like most tech fads, it was associated with a lot of marketing and snake oil from people promising anything managers would pay money to hear (e.g., "use OO and your projects will be made up of reusable objects that you can simply drop into your next project!"). Look around most programming languages today and you'll see objects, so in that sense OOP never failed. What failed was the hype train. It's no different from most other tech fads (except XML has declined drastically since the hype passed).
Nov 16
parent Ola Fosheim Grøstad writes:
On Thursday, 16 November 2017 at 22:27:58 UTC, sarn wrote:
 In the 90s (and a bit into the 00s) there was a pretty extreme 
 "everything must be an object; OO is the solution to 
 everything" movement in the industry.
Yes, around 1991, the computer mags were all over C++ and the bookshelves in the programming section of book stores too…
 Look around most programming languages today and you'll see 
 objects, so in that sense OOP never failed.  What failed was 
 the hype train.  It's no different from most other tech fads 
 (except XML has declined drastically since the hype passed).
*nods* I recall Kristen Nygaard (driving force behind OO and Simula) being sceptical of some of the drive away from OO modelling and towards OO-everything in the mid 90s. However, in Scandinavia I think the focus was predominantly on supporting modelling. OOP is a way to support the model. I never heard any of the people behind OO suggest anything more than that OO was one paradigm among many. Nygaard also believed that OO modelling would be useful outside programming, as a mode of thinking when doing analysis, for instance in government. Of course there are plenty of pure OO languages that also are interesting in their own right: Smalltalk, Beta, gBeta, Self, and, in online text games, MOO. Javascript could have made it onto that list too, if it had been given a more suitable syntax and slightly different semantics.
Nov 16
prev sibling parent reply codephantom <me noyb.com> writes:
On Thursday, 16 November 2017 at 11:52:45 UTC, Ola Fosheim 
Grostad wrote:
 Uhm, no? What do you mean by 'primary focus of program design' 
 and in which context?
In the context that this is specifically what Stroustrup says in his book (The Design and Evolution of C++, 1994): "Simula's class concept was seen as the key difference, and ever since I have seen classes as the proper primary focus of program design." (ch. 1, page 20). Freud would tell us that Stroustrup's obsession with Simula is where it all began. Stroustrup also wrote this paper in 1995 (due to all the hype of OO in the 90's), where again he highlights how classes (and their derivatives) are his primary focus of program design: http://www.stroustrup.com/oopsla.pdf
Nov 16
parent Ola Fosheim Grostad <ola.fosheim.grostad gmail.com> writes:
On Friday, 17 November 2017 at 00:36:21 UTC, codephantom wrote:
 On Thursday, 16 November 2017 at 11:52:45 UTC, Ola Fosheim 
 Grostad wrote:
 Uhm, no? What do you mean by 'primary focus of program design' 
 and in which context?
 In the context that this is specifically what Stroustrup says in his book (The Design and Evolution of C++, 1994): "Simula's class concept was seen as the key difference, and ever since I have seen classes as the proper primary focus of program design." (ch. 1, page 20)
Yes, that is reasonable; it is very useful when made available in a generic form. I believe Simula was used in teaching at his university. A class in Simula is essentially a record, library scope, block scope, and coroutines, with inheritance and virtual functions, implemented as a closure where the body acts as an extensible constructor. So it is a foundational primitive. Nygaard and Dahl got the Turing award for their work. Ole-Johan Dahl was also a coauthor of an influential book on structured programming which had a chapter on it IIRC. Nygaard and others in Denmark later in the 70s and 80s refined the class concept into a unifying concept that was essentially the only primary building block in Beta (called a pattern, which allows functions to be extended using inheritance, instantiation of objects from virtual patterns, type variables as members, etc.). So Beta established that you don't need structural mechanisms other than a powerful class concept + tuples for parameters. Self established the same with objects.
 Freud would tell us, that Stroustups obssesion with Simula, is 
 where it all began.
Anyone who cares already knows that Simula was an ancestor of C++, Smalltalk, Java and many other OO languages... But Stroustrup wasn't obsessed with Simula; if he were, he would have added things like coroutines, local functions, used the class as a module scope, etc. He also would have avoided multiple inheritance.
Nov 16
prev sibling next sibling parent reply Joakim <dlang joakim.fea.st> writes:
On Tuesday, 14 November 2017 at 04:31:43 UTC, Laeeth Isharc wrote:
 He mentions D, a bit dismissively.
 http://esr.ibiblio.org/?p=7724&cpage=1#comment-1912717
Eh, he parrots decade-old anti-D talking points about non-technical, organizational issues and doesn't say anything about the language itself; who knows if he's even tried it. As for the rest, the usual bunk from him, a fair amount of random theorizing only to reach conclusions that many others reached years ago: C has serious problems and more memory-safe languages are aiming to replace it, while C++ doesn't have a chance for the same reason it took off: it bakes in all of C's problems and adds more on top. He's basically just jumping on the same bandwagon that a lot of people are already on, as it starts to pick up speed. Good for him that he sees it picking up momentum and has jumped in instead of being left behind clinging to the old tech, but no big deal if he didn't.
Nov 14
parent reply Joakim <dlang joakim.fea.st> writes:
On Tuesday, 14 November 2017 at 19:48:07 UTC, Joakim wrote:
 On Tuesday, 14 November 2017 at 04:31:43 UTC, Laeeth Isharc 
 wrote:
 He mentions D, a bit dismissively.
 http://esr.ibiblio.org/?p=7724&cpage=1#comment-1912717
 Eh, he parrots decade-old anti-D talking points about non-technical, organizational issues and doesn't say anything about the language itself; who knows if he's even tried it. As for the rest, the usual bunk from him, a fair amount of random theorizing only to reach conclusions that many others reached years ago: C has serious problems and more memory-safe languages are aiming to replace it, while C++ doesn't have a chance for the same reason it took off: it bakes in all of C's problems and adds more on top. He's basically just jumping on the same bandwagon that a lot of people are already on, as it starts to pick up speed. Good for him that he sees it picking up momentum and has jumped in instead of being left behind clinging to the old tech, but no big deal if he didn't.
I thought this was a much better post in that thread, especially the last two paragraphs: "Some may claim that the programming language isn’t the place to look for help, but I disagree. If it can prevent language errors in the first place (memory management, type systems) and help me use available resources (concurrency), and deal with expected failure (distribution) then I want it in the flow of the program (my description of what should happen), not in some box bolted onto the side. And it has to be efficient, because I’ve only a few cycles to waste and no IO or memory. So that’s where I’d look for action in the programming language field – not to improve C, an imperfect solution to yesterday’s problems; I want something that helps with apps that are big, distributed, concurrent, and efficient because those are the more important problems people are solving today and in the future." http://esr.ibiblio.org/?p=7724&cpage=1#comment-1913062 To the extent microservices push in exactly this direction, D needs to make an effort there.
Nov 15
parent Ola Fosheim Grøstad writes:
On Wednesday, 15 November 2017 at 09:00:38 UTC, Joakim wrote:
 problems; I want something that helps with apps that are big, 
 distributed, concurrent, and efficient because those are the 
 more important problems people are solving today and in the 
 future."
 http://esr.ibiblio.org/?p=7724&cpage=1#comment-1913062

 To the extent microservices push in exactly this direction, D 
 needs to make an effort there.
I'm not sure if most distributed computing requires D/C++/Rust levels of efficiency. Erlang isn't exactly efficient or convenient, but people still use it… for semantic reasons. Seems like this is an area where high-level protocol verification would be more important than C-style coding. And since such tools are on the way (as static semantic analysis), it will be difficult to provide good library solutions for that field.
Nov 15
prev sibling next sibling parent Nicholas Wilson <iamthewilsonator hotmail.com> writes:
On Tuesday, 14 November 2017 at 04:31:43 UTC, Laeeth Isharc wrote:
 He mentions D, a bit dismissively.
 http://esr.ibiblio.org/?p=7724&cpage=1#comment-1912717
I think that the date he mentions in that paragraph (2001) says a lot about his argument, i.e. it's completely outdated.
Nov 14
prev sibling next sibling parent bauss <jj_1337 live.dk> writes:
On Tuesday, 14 November 2017 at 04:31:43 UTC, Laeeth Isharc wrote:
 He mentions D, a bit dismissively.
 http://esr.ibiblio.org/?p=7724&cpage=1#comment-1912717
Couldn't read that without cringing.
Nov 14
prev sibling parent Kagamin <spam here.lot> writes:
Also http://ithare.com/chapter-vb-modular-architecture-client-side-programming-languages-for-games-including-resilience-to-reverse-engineering-and-portability/ scroll to the part about language choice.
Nov 22