
digitalmars.D - You are a stupid programmer, you can't have that

reply IGotD- <nise nise.com> writes:
This is a general discussion which applies to all computer 
languages and also over several decades. What I have observed is 
that language designers see programmers misuse the language and 
introduce possible bugs, and therefore remove features from 
languages. An analogy would be limiting the functionality of cars 
because people are sometimes involved in accidents, like the 
automatic speed limiter (soon to be law in several countries).

Language designers seem to have a big brother attitude towards 
programmers and think they will save the world by introducing 
limitations.

Examples.

1.
Array indexes should be signed instead of unsigned because 
somehow programmers mess up loops, among other things. Bjarne 
Stroustrup considered his unsigned index to be a historic 
mistake. While unsigned array indexes make perfect sense, the 
bias towards signed seems to be that programmers are stupid. The 
question is, if you can't make a for loop with unsigned math, 
shouldn't you look for another job?

2.
Somewhat related: when Java was designed, the designer (James 
Gosling I believe) claimed that programmers were too stupid to 
understand the difference between signed and unsigned math 
(despite often several years of university education) and 
removed unsigned math entirely from the language. The impact is 
that when unsigned math is required, you are forced into 
conversions and library solutions. Not ideal when HW APIs 
deal with unsigned numbers, for example.

You are welcome to add any other examples that you find 
significant for the discussion.


This applies to D to some extent, but it can often be found 
in other languages and in the mentality of several language 
designers.

The question is, do you think language designers go too far when 
trying to "save" programmers from misuse or not?
Do you think there can be approaches that both prevent bugs and 
at the same time do not limit the language?
Aug 07 2021
next sibling parent reply Dukc <ajieskola gmail.com> writes:
On Saturday, 7 August 2021 at 12:15:15 UTC, IGotD- wrote:
 The question is, do you think language designers go too far when 
 trying to "save" programmers from misuse or not?
 Do you think there can be approaches that both prevent bugs and 
 at the same time do not limit the language?
I think that a tool should not prevent unrecommended uses completely, but it should make them harder, so that they won't be done by mistake or because they're easier than the right way. Well, that's my standard philosophy on this issue anyway, but perhaps there are cases where that approach is too idealistic. But IMO it's what D usually does. Examples:

String mixins. They can be used to do anything the C preprocessor can do (and more!), but the syntax is ugly, so it's nicer to use normal templates, manifest constants, aliases, or at least template mixins when they can do the job.

`version` declarations. They do not allow expressions in them, but if one really needs expressions, `static if` will work around the problem.

Default initialization. The shortest syntax will not introduce heisenbugs, but if performance is really that important there's still `= void`.

Floats default initialized to NaN instead of 0. One can still initialize them to anything, but this forces one to think about it.
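To illustrate the default-initialization points, a minimal sketch (the comments describe the spec-defined behaviour; the variable names are mine):

```d
import std.math : isNaN;

void main()
{
    int i;            // default-initialized to 0
    double d;         // default-initialized to NaN, not 0
    assert(i == 0);
    assert(d.isNaN);  // a forgotten float initialization is hard to miss

    int fast = void;  // explicitly uninitialized: opting in to the risk
    fast = 42;        // must be assigned before any meaningful use
    assert(fast == 42);
}
```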
Aug 07 2021
next sibling parent reply Guillaume Piolat <first.last gmail.com> writes:
 When Java was designed, the designer (James Gosling I believe) 
 claimed that programmers were too stupid to understand the 
 difference between signed and unsigned math (despite often 
 several years of university education) and removed unsigned 
 math entirely from the language. The impact is that when 
 unsigned math is required, you are forced into conversions and 
 library solutions.
It's hard to appreciate how good that Java decision was without maintaining large C++ codebases.

Unsigned/signed math and the implicit conversions between them absolutely DO cause bugs in real-world code, the most prominent of which must be:

    #include <vector>
    #include <iostream>

    bool isArrayOrdered(std::vector<int>& vec)
    {
        for (size_t n = 0; n < vec.size() - 1; ++n)
        {
            if (vec[n] > vec[n+1])
                return false;
        }
        return true;
    }

    int main(int argc, char** argv)
    {
        std::vector<int> v;
        // works if the array isn't empty
        // v.push_back(4);
        std::cout << isArrayOrdered(v);
    }

See the bug in isArrayOrdered?

It will fail with an empty array since:
1. vec.size() yields size_t
2. vec.size() - 1 stays unsigned; if the size was zero, it is now the maximum possible size
3. and we have an unsigned comparison, with a very long loop which in this case will cause out-of-bounds access

(Solution: use + 1 on the other side of the comparison)

Similar problems happen when people store image size with unsigned, or widget positions in unsigned, or dates in unsigned integers.

All of these problems disappear:
- if only signed integers are available.
- if container length returned a signed integer (and that's what D could have done...)
- if comparison or arithmetic with mixed signedness is forbidden (but: lots of casts)

Programmers do not actually learn about integer promotion except when forced by such bugs.
It has nothing to do with programmers being dumb or low-skilled, and everything to do with an economy of information: it's better if the problem doesn't exist in the first place.

So I believe Java made a decision that saves mental space.
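For what it's worth, the same trap carries over almost verbatim to D, where `length` is an unsigned `size_t`; a sketch (hypothetical code, not from any real codebase):

```d
bool isArrayOrdered(const(int)[] arr)
{
    // Bug: for an empty array, arr.length - 1 wraps to size_t.max,
    // so the loop runs and the first index throws a RangeError
    // (with bounds checking on).
    for (size_t n = 0; n < arr.length - 1; ++n)
    {
        if (arr[n] > arr[n + 1])
            return false;
    }
    return true;
}

void main()
{
    assert(isArrayOrdered([1, 2, 3]));
    int[] empty;
    // isArrayOrdered(empty); // would throw core.exception.RangeError
}
```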
Aug 07 2021
next sibling parent Max Samukha <maxsamukha gmail.com> writes:
On Saturday, 7 August 2021 at 14:21:52 UTC, Guillaume Piolat 
wrote:

 See the bug in isArrayOrdered?
Gotchas like that are harmful but well-known, and these days the knowledge spreads fast thanks to the internets. I believe an average C++ programmer today is less likely to make that mistake. On the other hand, we don't know how much harm the lack of unsigned types has caused. To claim that it is less harmful would be a case of survivorship bias.
Aug 07 2021
prev sibling parent IGotD- <nise nise.com> writes:
On Saturday, 7 August 2021 at 14:21:52 UTC, Guillaume Piolat 
wrote:
 It's hard to appreciate how good that Java decision was without 
 maintaining large C++ codebases.
 Unsigned/signed math and the implicit conversions between them 
 absolutely DO cause bugs in real-world code, the most prominent 
 of which must be:

     #include <vector>
     #include <iostream>

     bool isArrayOrdered(std::vector<int>& vec)
     {
         for (size_t n = 0; n < vec.size() - 1; ++n)
         {
             if (vec[n] > vec[n+1])
                 return false;
         }
         return true;
     }

     int main(int argc, char** argv)
     {
         std::vector<int> v;
         // works if the array isn't empty
         // v.push_back(4);
         std::cout << isArrayOrdered(v);
     }


 See the bug in isArrayOrdered?

 It will fail with an empty array since:
 1. vec.size() yields size_t
 2. vec.size() - 1 stays unsigned; if the size was zero, it is 
 now the maximum possible size
 3. and we have an unsigned comparison, with a very long loop 
 which in this case will cause out-of-bounds access

 (Solution: use + 1 on the other side of the comparison)

 Similar problems happen when people store image size with 
 unsigned, or widget positions in unsigned, or dates in unsigned 
 integers.

 All of these problems disappear:
 - if only signed integers are available.
 - if container length returned a signed integer (and that's 
 what D could have done...)
 - if comparison or arithmetic with mixed signedness is 
 forbidden (but: lots of casts)

 Programmers do not actually learn about integer promotion 
 except when forced by such bugs.
 It has nothing to do with programmers being dumb or low-skilled, 
 and everything to do with an economy of information: it's 
 better if the problem doesn't exist in the first place.

 So I believe Java made a decision that saves mental space.
Your example is a valid argument, and you also provide the solution for the unsigned math. The most interesting thing about it is that it offers a one-comparison solution for signed/unsigned.

This type of error would quickly be found: in D it generates a RangeError, and in C++ it will typically crash, since out-of-bounds access is undefined behaviour there. Then perhaps the lazy programmer would just add an extra check for when the size is zero. For example:

    bool isArrayOrdered(std::vector<int>& vec)
    {
        if (vec.empty())
        {
            return true;
        }

        for (size_t n = 0; n < vec.size() - 1; ++n)
        {
            if (vec[n] > vec[n+1])
                return false;
        }
        return true;
    }

This would be better, but it requires an extra check: two comparisons. I think you provide an excellent example that the problem isn't signed/unsigned math, but how you can use the arithmetic to your advantage regardless of signedness. There are often solutions when using unsigned math, and that is what I think the programmer should learn instead of limiting the language.
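For reference, the one-comparison `+ 1` form that Guillaume hints at, written out in D (a sketch; the same shape works in C++):

```d
bool isArrayOrdered(const(int)[] arr)
{
    // n + 1 < arr.length never underflows, so empty and
    // single-element arrays fall through to `return true`.
    for (size_t n = 0; n + 1 < arr.length; ++n)
    {
        if (arr[n] > arr[n + 1])
            return false;
    }
    return true;
}

unittest
{
    int[] empty;
    assert(isArrayOrdered(empty));        // no wrap-around
    assert(isArrayOrdered([5]));
    assert(isArrayOrdered([1, 2, 2, 3]));
    assert(!isArrayOrdered([2, 1]));
}
```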
Aug 07 2021
prev sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 8/7/2021 6:05 AM, Dukc wrote:
 `version` declarations. They do not allow expressions in them, but if one
really 
 needs expressions, `static if` will work around the problem.
Many years back, some revisions were made in druntime to use enums and `static if` instead of versions. It didn't take long before the problems I warned about with #if expressions emerged. It's inevitable. I replaced it all with versions, and the problems went away.

D's design is very much based on decades of my experience with unanticipated problems that arise from certain features, and on my discussions with other experienced programmers and team leaders about what goes wrong and causes trouble.

---

As for car analogies, I recall a TV show called "Pinks" where amateur drivers drag raced for pink slips. On one episode, a teenager showed up with his Beetle, in which he had installed a much more powerful engine. The Christmas tree turned green, he popped the clutch, and snapped both rear axles. The car rolled 10 feet down the track and stopped. He sheepishly got out, suffering a humiliating defeat.

He shows up again on another episode a few months later. This time, he says, he got the drive train upgraded for the power. But also, his Beetle had sprouted a full roll cage. The MC asked him about the cage, and he sheepishly rolled his eyes and said his mom made him put it in.

The tree turned green, he popped the clutch, and launched down the track. About halfway down, a front axle gave way. As the car was going very fast, it promptly flipped and tumbled down the track, over and over, with pieces of the car flying off everywhere. The MC was all "oh my gawd" and everyone ran down the track to see if he was dead. There was nothing left of the car but the roll cage, with our hero strapped within. He was unhurt.

All he said was "thank god for mom!"
Aug 08 2021
next sibling parent reply user1234 <user1234 12.de> writes:
On Sunday, 8 August 2021 at 08:59:16 UTC, Walter Bright wrote:
[...]
 The tree turns green, he popped the clutch, and launched down 
 the track. About halfway down, a front axle gave way. As the 
 car was going very fast, it promptly flipped and tumbled down 
 the track, over and over, with pieces of the car flying off 
 everywhere. The MC was all "oh my gawd" and everyone ran down 
 the track to see if he was dead. There was nothing left of the 
 car but the roll cage, with our hero strapped within. He was 
 unhurt.

 All he said was "thank god for mom!"
[indeed](https://youtu.be/HNMlooJeJ2M?t=2453)
Aug 08 2021
parent Walter Bright <newshound2 digitalmars.com> writes:
On 8/8/2021 6:57 AM, user1234 wrote:
 [indeed](https://youtu.be/HNMlooJeJ2M?t=2453)
Nice!
Aug 08 2021
prev sibling next sibling parent reply Max Samukha <maxsamukha gmail.com> writes:
On Sunday, 8 August 2021 at 08:59:16 UTC, Walter Bright wrote:

 I replaced it all with versions, and the problems went away.
You have just dismissed the problems caused by your solution, as you often do.
Aug 08 2021
next sibling parent reply Kyle Ingraham <kyle kyleingraham.com> writes:
On Sunday, 8 August 2021 at 14:30:29 UTC, Max Samukha wrote:
 On Sunday, 8 August 2021 at 08:59:16 UTC, Walter Bright wrote:

 I replaced it all with versions, and the problems went away.
You have just dismissed the problems caused by your solution, as you often do.
Why the personal attack? What does it add here? I see these from time to time, and they distract from the topic at hand, sometimes even souring the discussion.
Aug 08 2021
next sibling parent reply Adam D Ruppe <destructionator gmail.com> writes:
On Sunday, 8 August 2021 at 16:15:00 UTC, Kyle Ingraham wrote:
 Why the personal attack?
That's directly addressing the argument, not a personal attack. Imagine going to a mechanic and saying "my car shakes and thumps" and getting the answer "just don't drive it then, no more shaking!" True, but presumably you were driving the car for a reason, like maybe you had some place to be and that's not going to change. So you need some kind of solution. Maybe it is to ride a bike or take the bus, but you can't just ignore that need while saying "the problems went away".
Aug 08 2021
next sibling parent jfondren <julian.fondren gmail.com> writes:
On Sunday, 8 August 2021 at 16:38:41 UTC, Adam D Ruppe wrote:
 On Sunday, 8 August 2021 at 16:15:00 UTC, Kyle Ingraham wrote:
 Why the personal attack?
That's directly addressing the argument, not a personal attack. Imagine going to a mechanic and saying "my car shakes and thumps" and getting the answer "just don't drive it then, no more shaking!"
Or rather, imagine going to a mechanic and saying "my engine heats up and stops", and getting the answer "yep, I checked it out and the radiator's busted. I replaced it for you. You won't be having them heat-ups anymore." And then, because the mechanic didn't go into sufficiently tedious detail to deflect uncharitable criticism, you blow up at him: "why are you COVERING UP the problem with my engine by replacing some random part?!"
Aug 08 2021
prev sibling parent Kyle Ingraham <kyle kyleingraham.com> writes:
On Sunday, 8 August 2021 at 16:38:41 UTC, Adam D Ruppe wrote:
 On Sunday, 8 August 2021 at 16:15:00 UTC, Kyle Ingraham wrote:
 Why the personal attack?
That's directly addressing the argument, not a personal attack. Imagine going to a mechanic and saying "my car shakes and thumps" and getting the answer "just don't drive it then, no more shaking!" True, but presumably you were driving the car for a reason, like maybe you had some place to be and that's not going to change. So you need some kind of solution. Maybe it is to ride a bike or take the bus, but you can't just ignore that need while saying "the problems went away".
I read it the same way you describe, up to “You have just dismissed the problems caused by your solution…”. I just felt there was more there, without background context, that didn’t need to be there.
Aug 16 2021
prev sibling parent reply Max Samukha <maxsamukha gmail.com> writes:
On Sunday, 8 August 2021 at 16:15:00 UTC, Kyle Ingraham wrote:

 Why the personal attack? What does it add here?
I did it out of frustration with an issue that has a long history. Shouldn't have done that, sorry. I promise not to post here again.
Aug 08 2021
parent Kyle Ingraham <kyle kyleingraham.com> writes:
On Sunday, 8 August 2021 at 16:45:23 UTC, Max Samukha wrote:
 On Sunday, 8 August 2021 at 16:15:00 UTC, Kyle Ingraham wrote:

 Why the personal attack? What does it add here?
I did it out of frustration with an issue that has a long history. Shouldn't have done that, sorry. I promise not to post here again.
Definitely post here again. I only asked about something I had also been frustrated with. Don’t hold back your contribution because of my questions.
Aug 16 2021
prev sibling parent Walter Bright <newshound2 digitalmars.com> writes:
On 8/8/2021 7:30 AM, Max Samukha wrote:
 I replaced it all with versions, and the problems went away.
You have just dismissed the problems caused by your solution, as you often do.
The static if solution was written by people who asserted I was wrong, that they needed that power. I got involved because the static if solution resulted in the predicted bugs, and it fell into my lap to fix them.

No programming technique is without problems, but it's better to use the technique with fewer problems. I cannot transfer my experience to others, I can only explain. If people want to do things their way on their projects, that's ok, but with the D project we need to follow best practices.

Over time, as people gain experience with D and get comfortable with D best practices, the desire for #if expressions tends to fade, and the value in D's version system becomes evident.
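To make the contrast concrete, a minimal sketch of the two styles (the identifiers are made up):

```d
// The version style: each identifier is a plain on/off flag, set on
// the command line (-version=UseFastPath) or predefined (Windows, ...).
version (Windows)
    enum lineEnding = "\r\n";
else
    enum lineEnding = "\n";

// The backed-out style: enums plus static if, which permits arbitrary
// boolean expressions, and with them all the #if-style trouble.
enum useFastPath = true;

static if (useFastPath && size_t.sizeof == 8)
    enum chunkSize = 64;
else
    enum chunkSize = 16;
```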
Aug 08 2021
prev sibling next sibling parent someone <someone somewhere.com> writes:
On Sunday, 8 August 2021 at 08:59:16 UTC, Walter Bright wrote:

First and foremost ... thanks for D; your language seems superb 
:) !

Second: you all are a welcoming community to outsiders/newbies, 
something not seen very often on these kind of projects.

 All he said was "thank god for mom!"
I am not complaining nor anything like it; I think you know what should have been done and proceeded accordingly, given your vast and indisputable experience and track record. But, as usual, there are two points-of-view for any given problem. I think I have a better analogy than the car one:

- NASA/JPL/Boeing/et-al vs SpaceX

For instance, JPL won't even let you make a minor-minor-minor deviation on any given issue, and so their track record is almost astounding, to say the least (let's forget about that *minor* issue involving a metric/imperial conversion). Think mom. Same for NASA/Boeing/and-family ... superb engineers thinking up contingencies against the impossible. You got excellent track records but ...

On the other side, when you let unrestricted tools loose in the wild, you end up with things like SpaceX. The progress we are having right now in a very short span of time (I mean we as humanity, not as a SpaceX employee of course) is unheard of. Lots of out-of-the-box thinkers and what-not, at the expense of taking increased risk in order to succeed. And the whole establishment frantically playing catch-up.

Which approach is better? Both and neither one; it depends on the point-of-view.

Why do I bring back this analogy? Because when you give users unrestricted tools/hardware, they'll surely find ways of doing things you could never dream of. That's how innovation works. At the expense of risk, of course; you can't have both at the same time.

And in our tech-sphere, same with hardware vendors: they want you to use their products the way it suits them, for whatever reasons ($), and people always find ways to build superb things outside the intended use-case ... just think of Google and hardware vendors at the beginning, before this hyper-scale thing came into existence.

Again: I am not complaining. I am very happy with D so far; it is your project, your language, you can do whatever you want with it the way you like :)

It is just that I found this such a goood topic!
Aug 08 2021
prev sibling parent Mark <mt.rzezniczak gmail.com> writes:
On Sunday, 8 August 2021 at 08:59:16 UTC, Walter Bright wrote:
 All he said was "thank god for mom!"
Just an off-topic philosophical question here, but was he thanking god, his mom, or both?
Aug 09 2021
prev sibling next sibling parent Paulo Pinto <pjmlp progtools.org> writes:
On Saturday, 7 August 2021 at 12:15:15 UTC, IGotD- wrote:
 This is a general discussion which applies to all computer 
 languages and also over several decades. What I have observed 
 is that language designers see programmers misuse the language 
 and introduce possible bugs, and therefore remove features 
 from languages. An analogy would be limiting the functionality 
 of cars because people are sometimes involved in accidents, 
 like the automatic speed limiter (soon to be law in several 
 countries).

 Language designers seem to have a big brother attitude towards 
 programmers and think they will save the world by introducing 
 limitations.

 Examples.

 1.
 Array indexes should be signed instead of unsigned because 
 somehow programmers mess up loops among other things. Bjarne 
 Stroustrup considered his unsigned index to be a historic 
 mistake. While unsigned array indexes make perfect sense, the 
 bias towards signed seems to be that programmers are stupid. 
 The question is, if you can't make a for loop with unsigned math, 
 shouldn't you look for another job?

 2.
 Somewhat related: when Java was designed, the designer (James 
 Gosling I believe) claimed that programmers were too stupid to 
 understand the difference between signed and unsigned math 
 (despite often several years of university education) and 
 removed unsigned math entirely from the language. The impact is 
 that when unsigned math is required, you are forced into 
 conversions and library solutions. Not ideal when HW APIs 
 deal with unsigned numbers, for example.

 You are welcome to add any other examples that you find 
 significant for the discussion.


 This applies to D to some extent, but it can often be 
 found in other languages and in the mentality of several 
 language designers.

 The question is, do you think language designers go too far when 
 trying to "save" programmers from misuse or not?
 Do you think there can be approaches that both prevent bugs and 
 at the same time do not limit the language?
"In programming language design, one of the standard problems is that the language grows so complex that nobody can understand it. One of the little experiments I tried was asking people about the rules for unsigned arithmetic in C. It turns out nobody understands how unsigned arithmetic in C works. There are a few obvious things that people understand, but many people don't understand it." https://www.artima.com/articles/james-gosling-on-java-may-2001 Turns out not everyone has an university degree for programming, and many known less than they actually are able to deliver, specifically when under stress and deadlines. By the way, Java supports unsigned math since Java 8 via unsigned math methods, if only people would actually bother to update, we are on version 16 now.
Aug 07 2021
prev sibling next sibling parent reply Dylan Graham <dylan.graham2000 gmail.com> writes:
On Saturday, 7 August 2021 at 12:15:15 UTC, IGotD- wrote:
 Somewhat related: when Java was designed, the designer (James 
 Gosling I believe) claimed that programmers were too stupid to 
 understand the difference between signed and unsigned math 
 (despite often several years of university education) and 
 removed unsigned math entirely from the language. The impact is 
 that when unsigned math is required, you are forced into 
 conversions and library solutions
Zig does stuff like this and it's why I can't take that language seriously. At all. To paraphrase what I was told by Zig's community and BDFL: "if there's multiple ways to do something then obviously dumb dumb programmer will get it completely wrong". I like programming languages that help me catch bugs (D or Rust), not languages that treat me like a 3 year old.
Aug 08 2021
parent reply Dukc <ajieskola gmail.com> writes:
On Sunday, 8 August 2021 at 09:49:41 UTC, Dylan Graham wrote:
 On Saturday, 7 August 2021 at 12:15:15 UTC, IGotD- wrote:
 [snip]
Zig does stuff like this and it's why I can't take that language seriously. At all. To paraphrase what I was told by Zig's community and BDFL: "if there's multiple ways to do something then obviously dumb dumb programmer will get it completely wrong". I like programming languages that help me catch bugs (D or Rust), not languages that treat me like a 3 year old.
To be fair, I don't think lack of unsigned integers is a big deal, because many operators do not care whether it's signed or unsigned values they're operating on, and for the remaining ones (divide, modulus, right shift) the language can define standard library functions (sketched below).

But I do agree in general that either Java does not have much faith in programmer ability, or they value implementation simplicity much more than readability. I can see no other reasons for disallowing free functions, or for requiring curly braces for every single `if` or `while` statement, or for requiring `public` to be repeated on every public member instead of just before the first one as in C++ (haven't done much Java, forgive me if I recall the details wrong).
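For instance, a rough D sketch of the kind of library helper I mean: unsigned division implemented on top of signed operations only (the name is made up):

```d
// Widen to a larger signed type, mask to reinterpret the 32-bit
// pattern as a non-negative value, divide, then narrow back.
int divideUnsigned(int dividend, int divisor)
{
    long u = dividend & 0xFFFF_FFFFL; // low 32 bits, zero-extended
    long v = divisor  & 0xFFFF_FFFFL;
    return cast(int)(u / v);
}

unittest
{
    assert(divideUnsigned(10, 3) == 3);
    // -1 reinterpreted as unsigned is 4_294_967_295
    assert(divideUnsigned(-1, 2) == int.max);
}
```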
Aug 09 2021
next sibling parent Paulo Pinto <pjmlp progtools.org> writes:
On Monday, 9 August 2021 at 12:14:27 UTC, Dukc wrote:
 On Sunday, 8 August 2021 at 09:49:41 UTC, Dylan Graham wrote:
 On Saturday, 7 August 2021 at 12:15:15 UTC, IGotD- wrote:
 [snip]
Zig does stuff like this and it's why I can't take that language seriously. At all. To paraphrase what I was told by Zig's community and BDFL: "if there's multiple ways to do something then obviously dumb dumb programmer will get it completely wrong". I like programming languages that help me catch bugs (D or Rust), not languages that treat me like a 3 year old.
To be fair, I don't think lack of unsigned integers is a big deal. Because many operators do not care whether it's signed or unsigned that's being used, and for the remaining ones (divide, modulus, right shift) the language can define standard library functions. But I do agree in general that either Java does not have much faith in programmer ability, or they value implementation simplicity much more than readability. I can see no other reasons for disallowing free functions or requiring curly braces for every singe `if` or `while` statement. Or for requiring repeating `public` on every public member instead of just before the first one as in C++ (Haven't done much Java, forgive me if I recall the details wrong).
You can kind of fake free functions with import static or method references.

Compound statements aren't required for if/while; that's only a convention inherited from C best practices, and Apple's "goto fail" bug showed the world why it is a good idea.

Class members without a visibility specifier default to package visibility, so they are public to the remaining classes in the package or module (as of Java 9+).
Aug 09 2021
prev sibling parent reply Alexandru Ermicioi <alexandru.ermicioi gmail.com> writes:
On Monday, 9 August 2021 at 12:14:27 UTC, Dukc wrote:
 I can see no other reasons for disallowing free functions or 
 requiring curly braces for every singe `if` or `while` 
 statement.
Java is an object-oriented-only language, hence it makes sense not to have free functions as in C++. And you actually do have free functions there: they are static methods.

Braces are not required for control statements, but they are encouraged, to avoid ambiguous "then" statements.

Best regards,
Alexandru
Aug 09 2021
parent reply "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Mon, Aug 09, 2021 at 05:37:26PM +0000, Alexandru Ermicioi via Digitalmars-d
wrote:
 On Monday, 9 August 2021 at 12:14:27 UTC, Dukc wrote:
 I can see no other reasons for disallowing free functions or
 requiring curly braces for every singe `if` or `while` statement.
Java is object oriented only language, hence it makes sense to not have free functions as in C++, and you actually do have free functions there. They are static methods.
I find this to be silly OO-extremism on Java's part. Static methods of a singleton class are basically free global functions. Why can't they just *be* global free functions?! But nooo, that would not jive with the OO agenda, so we have to wrap that in a singleton class just to do lip service to the OO dogma and pacify the OO police. We're eating the greasy unhealthy food of global free functions, but to pacify our OO conscience we order the diet coke of a singleton class's static methods. Seriously, just call it what it is already.
 Braces are not required for control statements, but encouraged to
 avoid ambiguous then statements.
[...] This one I actually agree with. (Even if I do find it annoying sometimes.)

Ambiguous statements are actually worse than they appear at first glance, because of long-term maintainability issues. Consider this code where braces are omitted:

	if (someCondition)
		doSomething();
	else
		doSomethingElse();
	finishUp();

Imagine if some time later somebody discovers a bug, and introduces this fix:

	if (someCondition)
		doSomething();
	else
		doSomethingElse();
		return 1;	// <--- bugfix
	finishUp();

Had braces been required, the `return 1;` would have been introduced in the proper scope and there would have been no problem. But now this fix has introduced a new bug.

T

-- 
The volume of a pizza of thickness a and radius z can be described by the following formula: pi zz a. -- Wouter Verhelst
Aug 09 2021
next sibling parent reply Daniel N <no public.email> writes:
On Monday, 9 August 2021 at 18:00:23 UTC, H. S. Teoh wrote:
 	if (someCondition)
 		doSomething();
 	else
 		doSomethingElse();
 		return 1;	// <--- bugfix
 	finishUp();

 Had braces been required, the `return 1;` would have been 
 introduced in the proper scope and there would have been no 
 problem. But now this fix has introduced a new bug.


 T
The above code will fail to compile in recent C/C++ compilers, because of suspected intentional backdoors hiding in plain sight using this technique:

    <source>:7:5: warning: this 'else' clause does not guard... [-Wmisleading-indentation]
    <source>:9:9: note: ...this statement, but the latter is misleadingly indented as if it were

Thus there's no longer a good reason to avoid this style, unless you are using a compiler where this is not yet implemented.
Aug 09 2021
parent "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Mon, Aug 09, 2021 at 06:23:42PM +0000, Daniel N via Digitalmars-d wrote:
 On Monday, 9 August 2021 at 18:00:23 UTC, H. S. Teoh wrote:
 	if (someCondition)
 		doSomething();
 	else
 		doSomethingElse();
 		return 1;	// <--- bugfix
 	finishUp();
 
 Had braces been required, the `return 1;` would have been introduced
 in the proper scope and there would have been no problem. But now
 this fix has introduced a new bug.
[...]
 The above code will fail to compile in recent C/C++ compilers, because
 of suspected intentional backdoors hiding in plain sight using this
 technique.
 
 <source>:7:5: warning: this 'else' clause does not guard...
 [-Wmisleading-indentation]
 <source>:9:9: note: ...this statement, but the latter is misleadingly
 indented as if it were
 
 Thus there's no longer a good reason to avoid this style, unless you
 are using a compiler where this is not yet implemented.
I would not trust a coding style that depends on the compiler to warn about mistakes. What if the compiler fails to issue the warning in some obscure corner cases? It's completely preventable: just don't write code this way. Besides, it's kinda sad that C is moving towards Python-style indentation sensitivity... T -- This is a tpyo.
Aug 09 2021
prev sibling next sibling parent jmh530 <john.michael.hall gmail.com> writes:
On Monday, 9 August 2021 at 18:00:23 UTC, H. S. Teoh wrote:
 [snip]

 Imagine if some time later somebody discovers a bug, and 
 introduces this fix:

 	if (someCondition)
 		doSomething();
 	else
 		doSomethingElse();
 		return 1;	// <--- bugfix
 	finishUp();

 Had braces been required, the `return 1;` would have been 
 introduced in the proper scope and there would have been no 
 problem. But now this fix has introduced a new bug.


 T
I strongly prefer requiring the braces here.
Aug 09 2021
prev sibling parent reply Alexandru Ermicioi <alexandru.ermicioi gmail.com> writes:
On Monday, 9 August 2021 at 18:00:23 UTC, H. S. Teoh wrote:
 I find this to be silly OO-extremism on Java's part.  Static 
 methods of a singleton class are basically free global 
 functions.  Why can't they just *be* global free functions?!  
 But nooo, that would not jive with the OO agenda, so we have to 
 wrap that in a singleton class just to do lip service to the OO 
 dogma and pacify the OO police.  We're eating the greasy 
 unhealthy food of global free functions, but to pacify our OO 
 conscience we order the diet coke of a singleton class's static 
 methods.
Well, if you're being picky, then no, static methods are not free functions in global space, but rather can be considered methods of the object representing the class of objects (x.class). They are usually used as free functions, though, and people don't fuss over whether they are free or part of a class.

I think Java doesn't have any notion of a global space where you can put your funcs. There is just the class, and that is it.

To note, I'm not defending the choices made for Java by its dev team, just stating some facts and personal views.

Best regards,
Alexandru.
Aug 09 2021
parent reply "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Mon, Aug 09, 2021 at 10:32:01PM +0000, Alexandru Ermicioi via Digitalmars-d
wrote:
 On Monday, 9 August 2021 at 18:00:23 UTC, H. S. Teoh wrote:
 I find this to be silly OO-extremism on Java's part.  Static methods
 of a singleton class are basically free global functions.  Why can't
 they just *be* global free functions?!  But nooo, that would not
 jive with the OO agenda, so we have to wrap that in a singleton
 class just to do lip service to the OO dogma and pacify the OO
 police.  We're eating the greasy unhealthy food of global free
 functions, but to pacify our OO conscience we order the diet coke of
 a singleton class's static methods.
Well, if you're picky, then no, static methods are not free functions in global space, but rather can be considered as methods of object representing the class of objects (x.class). They though are usually used as free functions, and people don't fuss over whether they a free or part of a class.
Frankly, that's just calling a duck an aquatic chicken with webbed feet. The class is a singleton (which kinda defeats the whole point of a *class* -- at that point it might as well be just a namespace, as far as effective semantics are concerned), and the static method does not even take a reference to said singleton class; it's completely standalone and independent (aside from the FQN). It basically behaves like a global free function -- and is in fact the exact idiom Java uses when interfacing with C functions. The equivalence is practically 1-to-1.

It quacks like a duck and waddles like a duck -- it's a duck. Calling it an aquatic class method with webbed feet is really only good for doing lip service to OO dogma. For all practical purposes, it's a free function.
 I think java doesn't have any notion of a global space where you can
 put your funcs. There is just class and that is it.
Yes, and this is why such a non-OO hack was introduced in the first place. Since anything existing outside a class is taboo in OO dogma, in order to rationalize away the non-OO-ness of a global free function we wrap it in the robes of a class and crown it as a method of that class. But the class is a singleton (and holds no state) and the method is static (does not depend on the class instance at all), which makes it painfully obvious that the entire intent is to rephrase "global free function" in terms that are less offensive to OO dogma. But regardless, this shoe-horning doesn't change the fact that it essentially, for all practical purposes, behaves like a global free function.
 To note, I'm not defending the choices made for java by it's dev team,
 just stated some facts, and personal views.
[...] No offense taken. I just find the OO extremism of Java to be laughable, that's all. Why not just admit that OO isn't the be-all and end-all of programming, and call free global functions what they are. This is why D's multi-paradigm approach makes so much more sense: sometimes one paradigm doesn't fit the problem, why not acknowledge it and allow a different paradigm to step in. Not every problem is a nail to which the OO hammer must be brought to bear. T -- Talk is cheap. Whining is actually free. -- Lars Wirzenius
Aug 09 2021
parent reply Paulo Pinto <pjmlp progtools.org> writes:
On Monday, 9 August 2021 at 23:09:30 UTC, H. S. Teoh wrote:
 ...
 No offense taken.  I just find the OO extremism of Java to be 
 laughable, that's all.  Why not just admit that OO isn't the 
 be-all and end-all of programming, and call free global 
 functions what they are.

 This is why D's multi-paradigm approach makes so much more 
 sense: sometimes one paradigm doesn't fit the problem, why not 
 acknowledge it and allow a different paradigm to step in.  Not 
 every problem is a nail to which the OO hammer must be brought 
 to bear.


 T
I love how people love to hate Java, yet have no words of hate (check methods from typeof(func)), Dart.

How has D's multi-paradigm approach helped it achieve better market share than some of those extremist OOP languages?
Aug 09 2021
next sibling parent reply Brian Tiffin <btiffin gnu.org> writes:
On Tuesday, 10 August 2021 at 06:23:39 UTC, Paulo Pinto wrote:
 On Monday, 9 August 2021 at 23:09:30 UTC, H. S. Teoh wrote:
 ...
 I love how people love to hate Java, yet have no words of hate 

 (check methods from typeof(func)), Dart.

 How has D's multiparadigm helped to have a better marketshare 
 than some of those extremist OOP languages?
I may have started some of the jump-on-Java trend with the bumper car comment. *This may get long, as there are historical evolutions to cover, an old guy in the mood to ramble, and maybe pass on some learnin's, save a few younglings from some choice-of-programming-tool angst.*

I'm not going to speak for others, Paulo, but I find Java to be fun, not hated, more smirky amusing.

*Now rambling to the room.* Being a fellow Canuck, I've followed the works of James Gosling with some interest, for a lot of years now. Back when it was Oak, and the smart agent in Oak was Duke, and Duke is still a very cool mascot. *Be sure to check out sc, the spreadsheet calculator, a TUI console spreadsheet with vi-like key bindings. Something James also wrote, way back when, as public domain code.*

And some *personal* reasons behind the smirking at Java. When Java was first being hyped (and it was hyped; something like $500 million was allocated to advertising by Sun), I was programming in Forth on a Vax for a phone company. PCs were toys and core-business adjacent, not core business tools, in my mind at the time. The web was a distraction, for science and amusement, viewed as business adjacent, not core. *At the time.* Could see the potential of Mosaic and the WWW, but would not have recommended it to a manager as a smart move for critical work. It was for the marketing department, not a competitor to telephony. The web did not seem like bigboy-pants internet technology at the time, not like the more serious uses like Gopher, Telnet and FTP. :-)

Attitude on the web changed fairly quickly, but Java still seemed like a toy. Real programmers wrote to machine code, not a play-area sandbox. But $500 million in advertising budget caught the eye of many a leader, and it was fun.

Our Forth system was set to be superseded by an amalgam overlord system in C++, consolidating some 20 or 30 other large systems. Watched in predicted horror as three years of 300 people's time was wasted on project management and Grady Booch cloud-diagram planning. 300 people. 3 years. The writing was on the wall when **zero** user-level demo screens were available years after starting. *C++ isn't going to work, this needs to be in Java, was the next recommendation.*

I think the Forth project we worked on (for many more years after the story from above) was finally retired a few years ago, after being outsourced to a Vaxen emulator in Bangalore a decade ago. The amalgam is still in a continuing cycle of development startup followed by cancel and fail, last I heard. 25ish years later.

Yet, unless given the runway we initially enjoyed with the Forth system, I would not recommend Forth for large-scale systems. You need time to build an application vocabulary when using Forth in the large, and there is little chance for cost recovery during that phase. Given lead time, absolutely, but that'd be a rare management decision in the 3rd millennium.

C++ rose to fame, in part, due to human pride. A sense of "I figured it out, ain't I smart". That's a powerful draw. For many, C++ is very smart. Smarter than some of us prideful programmers though.

Java did not rise to fame on merit (it might have, more slowly). It rose to fame on aggressive marketing and expensive hype. Since the lock-in had started, many companies have poured billions into JVM technology, to ensure its success at making reasonably fast bumper cars run in a sandbox. That's all ok, but I was young when it was just starting, and it smelled like a toy then.
It still kinda smells like a toy, because of that initial bias. Useful for business-adjacent pretty reporting and such. Haven't been paid to work in Java since 1998-ish, but have followed it along the way; 1.2 was just coming out, with the whole J2EE thing (or J2SE, or JDK, can't quite remember the marketing speak without looking it up). I'm not sure, but I'd wager some of that Java 1.2 is still in the marketing application we wrote, if the company still exists.

Still enjoy exploring OpenJDK 11 now, and have an intrinsic FUNCTION JVM built into a version of GnuCOBOL, which should eventually be solid enough for us to advertise GnuCOBOL as ready to run any enterprise Java classes in an engine embedded in COBOL via JNI. *At its core, JDK Java is written in C, and GnuCOBOL compiles via C.* I'm still that kind of biased internally: COBOL for core business, Java for some fluffy bits and pretty pictures. Also realize that the amount of .class code in the world is immense. Why not leverage it to help an enterprise pursue its goals, while counting all the beans in COBOL, integrated tightly?

My bias regarding OOP is similar. Watched too many large-scale failures come and go with C++ and Java rip-and-replace projects. But when determined, and with deep enough pockets, most failures can be silently buried, small successes over-hyped, until it all becomes legacy code anyway, ready to be replaced by Go, Ruby, Rust, Zig, or whatever is the lang hype du jour at the time.

Except for the life expectancy of source code thing, I'd happily hype D to any management at any large company. But at 6ish years of life expectancy before being forced to manually update a codebase, it'd be a hard sell for large systems. Will recommend D for business-adjacent short-span productivity applications from now on though.

How many sites still run Python 2 because the code borks in Python 3? How many sites will be on Java 8 until 2038 or beyond? The borks might be extremely trivial to a programmer, parens around arguments to `print` for instance, but when a non-tech boss sees the first `SyntaxError: Missing parentheses in call to 'print'. Did you mean print(name)?` they may just stick with Python 2 and get on with running their business. They paid for that code already. 10 years is nothing in that kind of time frame. They will wait until the pain of stagnation (or being mocked as archaic legacy) exceeds a threshold. Then they will pour money at the tech team, often just to save face, and in a sour mood. There will always be another consultant ready to hype a tech, and promise a short cut to the promised land.

Slow, long-tail growth is the best kind of growth, in my old-guy opinion. D is making sound decisions in the now, it seems, growing slowly, which is sound in the long term. Now to convince the D dev team that we, the stupid programmers, leave behind code that may not age well if even a small language detail changes. Cover that base and businesses will follow.

To keep a little bit more *in thread*, some of the decisions may be guard-rail, protect-the-programmer-from-the-programmer decisions. Those aren't all bad. COBOL isn't active at 60 because it's great tech in the internet era; it's still active, with a huge market share, because it grew that big with a never-shrinking codebase that still compiles. Pennies are still pennies, core banking is still the same banking.

If I'm not mistaken, the oldest still-running codebase is a contract management system, MOCAS, for the U.S. Department of Defense. 60+ years old, mostly COBOL.
There are billions on the line for the first team that can answer a still-open call to convince the DoD that a replacement won't drop any features or muck up any contracts in play at the time of transition. Some have tried; all failed to convince the brass, so far. More famous, but still old, is the SABRE airline reservation system, in COBOL/CICS. There are programs running in the IRS, supporting the Individual Master File and Master Banking File projects, still using COBOL sources written in 1962 (probably; these are not overly public source codes), recompiled for each new mainframe iteration.

Java 8 is set to ~~enjoy~~ suffer a similar fate. As is Python 2, Perl 5, ... (whether the Python/Perl/... core contributors like it or not). Ruby, maybe not, but maybe. If you can convince a company to pay for some code, they are not going to want to pay for the same code again, even decades later.

Ranting almost over. D design decisions seem like very sane long-view decisions. But source codes need a longer life expectancy, or D may end up relegated to short- and mid-range development planning, ad infinitum. That's ok, if that is the end goal or an acceptable fate.

D does not feel like a design effort with stupid in mind. The guard rails and preferred paths in D seem well planned and well placed. *Still new to D, could be completely wrong in the depths.* Protecting from tricks some people may feel comfortable with, while keeping simple mistakes from being catastrophic, painful to find and fix, or costly to maintain, is not limiting, in my opinion, at least not in a bad way. That goes for D or any other programming-language tool chain.

Have good, make well, excuse the long read and slow drift.
Aug 10 2021
parent reply "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Tue, Aug 10, 2021 at 10:59:58AM +0000, Brian Tiffin via Digitalmars-d wrote:
[...]
 *Now rambling to the room.* Being a fellow Canuck, I've followed the
 works of James Gosling with some interest, for a lot of years now.
 Back when it was Oak, and the smart agent in Oak was Duke, and Duke is
 still a very cool mascot.
Haha, didn't realize Gosling was Canadian. Hooray for fellow Canuckians! ;-) [...]
 When Java was first being hyped (and it was hyped, something like $500
 million was allocated to advertising by Sun), [...]
 [...] Real programmers wrote to machine code, not a play area sandbox.
 But, $500 million in advertising budget caught the eye of many a
 leader, and it was fun.
Yep, that's the perception I had from the 90's. I was young and idealistic, and immediately smelled the hype behind Java -- and distrusted it. Almost 3 decades later, I still distrust it. But now, with history in hindsight, it is also abundantly clear that what drives programming language popularity is marketing budget. Technical merit plays at best a secondary role (if even). [...]
 Java did not rise to fame on merit (it might have, more slowly). It rose to
 fame on aggressive marketing and expensive hype.  Since the lock in had
 started, many companies have poured billions into JVM technology, to ensure
 its success at making reasonably fast bumper cars run in a sandbox.
Yes, and that is why marketing hype is ultimately the decider of technology adoption, not the technical merit itself. First, with a large enough budget, you attract the attention of the VIPs who make the decisions at the top level. They dictate the use of said tech among the lower ranks, and then over time the tech becomes entrenched, ensuring its continued use. Technical excellence does not play a major role here. If it works reasonably well, it will stay. And the longer it stays, the harder it becomes to displace it. Inertia is a powerful force. [...]
 My bias to OOP is similar.  Watched too many large scale failures come
 and go with C++ and Java rip and replace projects.
As Joel Spolsky once said: ... the single worst strategic mistake that any software company can make: They decided to rewrite the code from scratch. ;-)
 But when determined and with deep enough pockets most failures can be
 silently buried, small successes over hyped until it all becomes
 legacy code anyway, ready to be replaced by Go, Ruby, Rust, Zig, or
 whatever is the lang hype du jour at the time.
Haha, this reminds me of one of my favorite Walter quotes:

	I've been around long enough to have seen an endless parade of
	magic new techniques du jour, most of which purport to remove
	the necessity of thought about your programming problem. In the
	end they wind up contributing one or two pieces to the
	collective wisdom, and fade away in the rearview mirror.
	-- Walter Bright

But yeah, it's the hype that drives adoption. The technical merit, not so much. If at all.
 Slow, long tail growth is the best kind of growth, in my old guy
 opinion.
I agree. But that's a hard sell in today's age of instant gratification. Nobody wants to -- nor has the time to -- gradually build up an infrastructure that will last for the long term. They want, and need, a solution NOW, and you better be able to deliver NOW, otherwise they will just move on to the next salesman who promises the here and now. And it's hard to fault them when the investors are knocking on their door asking when the promised results will materialize.
 D is making sound decisions in the now it seems, growing slowly, which
 is sound in the long term.  Now to convince the D dev team that we,
 the stupid programmers, leave behind code that may not age well if
 even a small language detail changes.  Cover that base and businesses
 will follow.
I think this is what Andrei has been saying every so often: stop deleting, start adding. I.e., never remove old features, just add new ones to be used in favor if necessary. If needed, deprecate and hide away old features (hide away the docs so that new people don't accidentally use them for new code), but never, ever remove them. If a new, better way to do something is discovered, add it to the language, highlight it front and center in the docs, but don't ever remove the old stuff. If we need to redesign Phobos, for example, do it in a new namespace, don't remove/replace the old one. Make it possible for the two to coexist. Accretion rather than replacement.

OT1H the idealist in me screams "please break my code, fix the language to remove all the broken stuff!". OTOH the long-term code maintainer in me shakes his fists every time a compiler upgrade breaks one of my old projects.

The solution to the conundrum is accretion rather than replacement. Let the old code compile. Complain about deprecations if you have to, but don't ever make it not compile (unless it was an outright bug to have compiled in the first place). But new code can use the new stuff and benefit from it.

T

-- 
Trying to define yourself is like trying to bite your own teeth. -- Alan Watts
Aug 10 2021
parent reply Paul Backus <snarwin gmail.com> writes:
On Tuesday, 10 August 2021 at 17:47:48 UTC, H. S. Teoh wrote:
 I think this is what Andrei has been saying every so often: 
 stop deleting, start adding.  I.e., never remove old features, 
 just add new ones to be used in favor if necessary.  If needed, 
 deprecate and hide away old features (hide away the docs so 
 that new people don't accidentally use it for new code), but 
 never, ever remove it. If a new, better way to do something is 
 discovered, add it to the language, highlight it front and 
 center in the docs, but don't ever remove the old stuff.  If we 
 need to redesign Phobos, for example, do it in a new namespace, 
 don't remove/replace the old one.  Make it possible for the two 
 to coexist.  Accretion rather than replacement.

 OT1H the idealist in me screams "please break my code, fix the 
 language to remove all the broken stuff!".  OTOH the long-term 
 code maintainer in me shakes his fists every time a compiler 
 upgrade breaks one of my old projects.

 The solution to the conundrum is, accretion rather than 
 replacement. Let the old code compile.  Complain about 
 deprecations if you have to, but don't ever make it not compile 
 (unless it was an outright bug to have compiled in the first 
 place).  But new code can use the new stuff and benefit from it.
Of course, the obvious counterargument is that this approach is exactly how you end up with a language like C++: powerful in the hands of experts, with a large and successful ecosystem, but packed to the gills with pitfalls and footguns for beginners to hurt themselves with.

Ultimately, that's the question D's leadership has to answer. What's more important to D's future: existing users of D, or potential future users of D?

C++ has a lot of inertia, and doesn't necessarily need to attract new users to continue being a successful language. But I'm not sure the same is true for D. If C++ holds onto legacy features that cause pain for beginners, people will grumble and put up with it, because C++ is so entrenched that they have no better option. If D does the same thing, people are more likely to just move on.
Aug 10 2021
parent reply "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Tue, Aug 10, 2021 at 06:46:57PM +0000, Paul Backus via Digitalmars-d wrote:
 On Tuesday, 10 August 2021 at 17:47:48 UTC, H. S. Teoh wrote:
[...]
 OT1H the idealist in me screams "please break my code, fix the
 language to remove all the broken stuff!".  OTOH the long-term code
 maintainer in me shakes his fists every time a compiler upgrade
 breaks one of my old projects.
 
 The solution to the conundrum is, accretion rather than replacement.
 Let the old code compile.  Complain about deprecations if you have
 to, but don't ever make it not compile (unless it was an outright
 bug to have compiled in the first place).  But new code can use the
 new stuff and benefit from it.
Of course, the obvious counterargument is that this approach is exactly how you end up with a language like C++: powerful in the hands of experts, with a large and successful ecosystem, but packed to the gills with pitfalls and footguns for beginners to hurt themselves with.
Very true. Which is why one time I brought up this idea of embedding the language version in source files. Perhaps right after the module declaration there could be an optional version declaration that states which language version the source file was written for:

	module mymodule;
	version = 2.097;
	...

If the version declaration is earlier than the current compiler version, it would enter a compatibility mode in which it emulates the behaviour of a previous language version. But if the version declaration is up-to-date or omitted, the compiler would apply the latest version of the language, which may or may not break older features.

That gives us a way to fix flaws in the language and reverse design decisions that were wrong in hindsight, yet without breaking old code. Newer code would then be encouraged to use the latest language version, that does not have the pitfalls of older language versions.

Of course, in practice this will add a whole ton of extra work for the compiler devs, especially once the number of supported language versions increases. And it's not clear what to do if an older language construct has to interact with a newer, incompatible language feature. So I'm not sure how practical this will be. But if we could somehow pull it off, then we could *both* evolve the language and retain backward compatibility at the same time.

T

-- 
Fact is stranger than fiction.
Aug 10 2021
parent reply Alexandru Ermicioi <alexandru.ermicioi gmail.com> writes:
On Tuesday, 10 August 2021 at 19:18:47 UTC, H. S. Teoh wrote:
 Of course, in practice this will add a whole ton of extra work 
 for the compiler devs, especially once the number of supported 
 language versions increases.  And it's not clear what to do if 
 an older language construct has to interact with a newer, 
 incompatible language feature.  So I'm not sure how practical 
 this will be.  But if we could somehow pull it off, then we 
 could *both* evolve the language and retain backward 
 compatibility at the same time.


 T
Better in this case to have a semi- or fully-automatic code migrator that will automatically upgrade source code to the latest version of the D compiler. Then you won't require any extensive support for older versions of the language in the compiler itself.
Aug 10 2021
parent "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Tue, Aug 10, 2021 at 07:23:52PM +0000, Alexandru Ermicioi via Digitalmars-d
wrote:
 On Tuesday, 10 August 2021 at 19:18:47 UTC, H. S. Teoh wrote:
 Of course, in practice this will add a whole ton of extra work for
 the compiler devs, especially once the number of supported language
 versions increases.  And it's not clear what to do if an older
 language construct has to interact with a newer, incompatible
 language feature.  So I'm not sure how practical this will be.  But
 if we could somehow pull it off, then we could *both* evolve the
 language and retain backward compatibility at the same time.
[...]
 Better in this case to have a semi- or fully automatic code migrator
 that will upgrade source code to the latest version of the D compiler.
 Then you won't require any extensive support for older versions of the
 language in the compiler itself.
This has been brought up many times, but I still see no actual progress in this area.  We really need to make dfix a part of the official compiler default installation so that it's officially supported, and not merely an afterthought.

Or even better yet, make it a part of the compiler (or at least auto-spawned by the compiler, the same way the linker is).  The less external tools the user needs to manually invoke to make the upgrade happen, the better.  In the ideal scenario, the act of running the compiler would auto-upgrade the source code in-place, and compile that instead of the original code.  Of course, this may be too intrusive, so maybe hide it behind a compiler switch?  When the switch is specified the compiler invokes dfix on the user's behalf, replaces the input file(s) with the upgraded versions, and compiles them, all in one go.

Imagine how much better it is, if we could just:

    dmd -i -auto-upgrade my_project/*.d -of=profit
    ./profit

and have everything Just Work(tm), rather than having to:

    dfix


T

-- 
Amateurs built the Ark; professionals built the Titanic.
Aug 10 2021
prev sibling next sibling parent Dukc <ajieskola gmail.com> writes:
On Tuesday, 10 August 2021 at 06:23:39 UTC, Paulo Pinto wrote:
 I love how people love to hate Java, yet have no words of hate 
 against C#, Smalltalk, Eiffel, Python (check methods from 
 typeof(func)), Dart.
In this case, Java was already raised as an example, so it was easiest to talk about it when criticizing the dogmatism about OO. C# is not as restrictive as Java, because at least it has user-defined value types and operator overloading. I'll grant that this is probably not because Java would trust the programmer less; I guess I dislike scarce-featured languages in general - I'm not very tempted by Go either, not that I have tried it. C is kind of fun to play with because of the low-level tricks, but I don't consider it a good general-purpose language either.
 How has D's multiparadigm helped to have a better marketshare 
 than some of those extremist OOP languages?
I don't think I have to explain the bandwagon fallacy for you, so I'm assuming you mean "How has D's multiparadigm helped to have a better marketshare than if it had an extremist attitude towards OOP?".

Consider a simple example from a bug report (https://issues.dlang.org/show_bug.cgi?id=22151) I submitted recently:

```d
void main()
{
    *&main = *&main;
}
```

In a language where every function must live inside a type, the reduced test case would have to be something like

```d
struct S
{
    static freeFun()
    {
        *&freeFun = *&freeFun;
    }
}
```

Note that the comparison is fair - I'm still taking advantage of D's expressive power in the second example, by dropping the unnecessary `void` and visibility attribute.

If every single small code snippet (or utility module) has to contain an enclosing `struct`, `class` or `union`, that totals to a HUGE readability price. Yes, allowing free functions does complicate modules, and includes the need for function hijack protection. But that pays back in a big way when it comes to readability.

Also remember that you cannot have UFCS without free functions - unless you count C#-style extension methods, which are `static` member functions. And frankly, that solution has NO upsides whatsoever compared to D. The programmer still needs to define the needless `class`, and language designers need to think about the module system and hijack protection just as hard as with free functions.
Aug 10 2021
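A concrete illustration of the UFCS point (a minimal sketch, not taken from the bug report; `twice` is a made-up name): the member-call syntax only works because a free, module-level function exists for the compiler to rewrite the call to.

```d
// UFCS in action: the compiler rewrites 21.twice() into twice(21),
// which is only possible because `twice` is a free function.
import std.stdio : writeln;

int twice(int x) { return 2 * x; }  // free function, no enclosing type

void main()
{
    writeln(21.twice()); // member-call syntax on a plain int
    writeln(twice(21));  // the ordinary call form - same thing
}
```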
prev sibling next sibling parent bachmeier <no spam.net> writes:
On Tuesday, 10 August 2021 at 06:23:39 UTC, Paulo Pinto wrote:

 I love how people love to hate Java, yet have no words of hate 
 against C#, Smalltalk, Eiffel, Python (check methods from 
 typeof(func)), Dart.

 How has D's multiparadigm helped to have a better marketshare 
 than some of those extremist OOP languages?
I think in both cases a better comparison is Scala, since it's so close to Java. Martin Odersky cleaned up the OOP and made it multiparadigm (with functional programming) and a lot of people liked it. Since it's a programming language, people do complain about Scala, but they complain about other things. The features that stand out about Java are the boilerplate and the lack of expressiveness, and in my opinion, those criticisms are correct. In one sense, the Java language designers agree, since they've copied so much from languages like Scala.
Aug 10 2021
prev sibling parent "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Tue, Aug 10, 2021 at 06:23:39AM +0000, Paulo Pinto via Digitalmars-d wrote:
 On Monday, 9 August 2021 at 23:09:30 UTC, H. S. Teoh wrote:
 ...
 No offense taken.  I just find the OO extremism of Java to be
 laughable, that's all.  Why not just admit that OO isn't the be-all
 and end-all of programming, and call free global functions what they
 are.
 
 This is why D's multi-paradigm approach makes so much more sense:
 sometimes one paradigm doesn't fit the problem, why not acknowledge
 it and allow a different paradigm to step in.  Not every problem is
 a nail to which the OO hammer must be brought to bear.
[...]
 I love how people love to hate Java, yet have no words of hate against
 C#, Smalltalk, Eiffel, Python (check methods from typeof(func)), Dart.
Whoa, slow down right there.  My rant was against shoehorning every programming problem into a single paradigm, not against OOP itself.  OOP certainly has its uses -- that's why D has it too!  The problem comes when OOP zealots try to shove it down everyone's throats even when the problem at hand clearly does not fit the glove.

I picked Java because that's the language I am most familiar with that exhibits this syndrome.  I don't know enough (well, any) Smalltalk or Eiffel to be able to criticize them coherently, and I have not written any C# at all.  Python I have written in very small scales, and it does have some nice things about it.  But I've not written enough non-trivial Python code to be able to criticize it effectively.

But the point is that single-paradigm languages ultimately suffer from the shoehorn problem: trying to shoehorn every programming problem into a single mold, even when it doesn't really fit.  Certain algorithms are simply better expressed as an imperative loop that mutates variables; it's a royal pain in the neck to try to write the equivalent of such code in Lisp, for example.  It's certainly *possible*, thanks to Turing completeness, but it exhibits all the symptoms of dressing up a duck as an aquatic chicken with webbed feet: it's awkward, requires excessive paraphrases, and is more complex than it really needs to be.  The same can be said about writing range-based code in a language like C: you can do it, but it's awkward, error-prone, and just plain ugly.

In a multi-paradigm language, you can choose the best paradigm to express the algorithm at hand, in the most convenient, clear way, without having to resort to the shoehorn.  It's faster to write, easier to read, and more maintainable in the long run.  You can treat a duck as a duck instead of as an aquatic chicken with webbed feet.  It's so refreshing not to have to paraphrase yourself all the time.
 How has D's multiparadigm helped to have a better marketshare than
 some of those extremist OOP languages?
I don't understand where marketshare enters the equation.  We're talking about how well a language fits a problem domain, not about the fashion contest that is today's programming language "market".

(And incidentally, which language is most popular has very little to do with its technical merit; it is primarily determined by how much marketing capital it has and how well the marketing department sells it.  With a large-enough marketing budget and a spark of marketing genius, you can sell *anything* to anyone.  I have no interest in what's popular and what's not; what I look for is technical excellence, which is the true value.)


T

-- 
Дерево держится корнями, а человек - друзьями. (A tree is held up by its roots, and a man by his friends.)
Aug 10 2021
prev sibling next sibling parent reply Brian Tiffin <btiffin gnu.org> writes:
On Saturday, 7 August 2021 at 12:15:15 UTC, IGotD- wrote:
 ...
 Language designers seem to have a big brother attitude towards 
 programmers and think they will save the world by introducing 
 limitations.

 Examples.
 ...
 2.
 Somewhat related. when Java was designed, the designer (James 
 Gosling I believe) claimed that programmers were too stupid to 
 understand the difference between signed and unsigned math 
 (despite often several years of university education) and 
 removed signed math entirely from the language. The impact is 
 that when unsigned math is required, you are forced to 
 conversions and library solutions. Not ideal when an HW APIs 
 deals with unsigned numbers for example.

 You are welcome to add any other examples that you find 
 significant for the discussion.


 This partially applies to D in some extent but can often be 
 found in other languages and mentality of several language 
 designers.

 The question is, do you think language designers go to far when 
 trying to "save" programmers from misuse or not?
 Do you think there can be approaches that both prevent bugs at 
 the same time do not limit the language?
Just to point out that using Java as a sample seems a bit off.  Java (well, the JVM) is the ultimate in restrictive.  It's a compiler that doesn't emit CPU code, but safe(r) emulation in a sandbox.

Java stormed the Enterprise world because business managers wanted safer code, and were willing to slow everything down for an extra margin of safety.  The JVM was not accepted en masse because bosses didn't trust users, but because they didn't trust the programmers.  It kept those pesky cubicle workers one level away from **their** hardware.  They were promised fewer show-stopping mistakes while still allowing the hiring of all the coders they need to help automate everyone else out of work.  That James and the other Java designers decided to lessen signed/unsigned transition errors by simply removing unsigned is just par for the JVM course.

Walter seems to be trying to lead D development down paths he sees as smarter, and wiser in the large.  I don't get the sense that D is being designed for stupid programmers, or limited because of stupid.  At least nowhere near the same league as the JVM decisions, *which was purposely built because non-technical people don't trust programmers (lacking the skills to quickly determine if one is foolish or wise, but still wanting to hire them all while pursuing business goals)*.

Just about everyone thinks they are a good driver, or they wouldn't drive.  But seat belts still save a lot of lives.  *They would probably save even more, if we didn't all drive that little bit faster feeling safer behind a seat belt, in cars now designed to crumple more for less crumpling of passengers.*  They don't install guard rails on highways because *everyone* misses the turn, but some few might on a bad day, so there is a guard rail to limit the damage.

I've always viewed the JVM as programming in bumper cars.  There is a lot of fun to be had in bumper cars.  You can learn a lot about yourself and others.  Business leaders, particularly the non-technical, seem to agree.

I'm still new to D, but see its present and future as an easy to read and write *feels safer, is safer* programming environment, at speed, en masse.

Have good, make well.
Aug 08 2021
parent Paulo Pinto <pjmlp progtools.org> writes:
On Sunday, 8 August 2021 at 18:26:10 UTC, Brian Tiffin wrote:
 On Saturday, 7 August 2021 at 12:15:15 UTC, IGotD- wrote:
 ...
 Language designers seem to have a big brother attitude towards 
 programmers and think they will save the world by introducing 
 limitations.

 Examples.
 ...
 2.
 Somewhat related. when Java was designed, the designer (James 
 Gosling I believe) claimed that programmers were too stupid to 
 understand the difference between signed and unsigned math 
 (despite often several years of university education) and 
 removed signed math entirely from the language. The impact is 
 that when unsigned math is required, you are forced to 
 conversions and library solutions. Not ideal when an HW APIs 
 deals with unsigned numbers for example.

 You are welcome to add any other examples that you find 
 significant for the discussion.


 This partially applies to D in some extent but can often be 
 found in other languages and mentality of several language 
 designers.

 The question is, do you think language designers go to far 
 when trying to "save" programmers from misuse or not?
 Do you think there can be approaches that both prevent bugs at 
 the same time do not limit the language?
Just to point out that using Java as a sample seems a bit off. Java (well, the JVM) is the ultimate in restrictive. It's a compiler that doesn't emit CPU code, but safe(r) emulation in a sandbox. ....
If you are talking about Java 1.0 - 1.2, yeah.

Since around 2000, AOT compilation has always been a capability of commercial JDKs like Excelsior JET, IBM RealTime WebSphere and Aonix; even GCC had GCJ until 2009.  Nowadays PTC and Jamaica are still around, GraalVM exists, the JIT cache from JRockit has been made part of OpenJDK, and IBM RealTime WebSphere is now part of IBM OpenJ9.
Aug 08 2021
prev sibling next sibling parent reply cy <dlang verge.info.tm> writes:
On Saturday, 7 August 2021 at 12:15:15 UTC, IGotD- wrote:
 Array indexes should be signed instead of unsigned because 
 somehow programmers mess up loops among other things.
This bugs me to no end. It is not hard to understand the concept of `idx < size()` and anyone using `idx < size() - 1` should take a remedial course in basic Algebra, rather than fixing my language by not allowing me to say unsigned things. It's just not hard to remember to always use "less than" and "size" in your for loops.

Reverse for loops are really the only exception, and that's just 1 more thing to remember: `for(top=size(), top > 0; --top) { auto idx = top - 1; ... }` repeat for every program ever exactly the same way. I can accept a language that doesn't let me use unsigned arithmetic though, even if I grumble about it.

I do approve of stuff that limits subtle errors in rarely encountered, esoteric situations, like how you can't have a multi-byte character inside a 1 byte literal.

What really is a bad idea is stupid, fake encapsulation that doesn't encapsulate anything and is just encouraging programmers to make things hard for other programmers purely for ego stroking, with no savings in complexity. So basically I think the `private:` keyword needs to go die in a fire.
 The question is, do you think language designers go to far when 
 trying to "save" programmers from misuse or not?
I believe the technical term these days for when language designers go too far trying to save programmers from misuse is "rust."
Aug 08 2021
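Written out as a runnable D sketch (an illustration of the idiom, not a correction of the post - note the snippet above has a comma where its first semicolon belongs, which the follow-up below pokes fun at): counting `top` down from the length to 1 keeps the unsigned index from ever wrapping below zero.

```d
// The reverse-loop idiom with an unsigned counter: `top` runs from
// length down to 1, and the real index is derived inside the body,
// so the size_t counter never wraps past zero.
void main()
{
    import std.stdio : writeln;

    int[] a = [10, 20, 30];
    for (size_t top = a.length; top > 0; --top)
    {
        size_t idx = top - 1;
        writeln(idx, ": ", a[idx]);  // visits indexes 2, 1, 0
    }
}
```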
parent cy <dlang verge.info.tm> writes:
On Monday, 9 August 2021 at 01:37:09 UTC, cy wrote:
 `for(top=size(), top > 0; --top) { auto idx = top - 1; ... }`
Oh, and I really appreciate languages that don't allow me to use for loops with a single semicolon, refusing to assume that I meant to type a comma there.
Aug 08 2021
prev sibling next sibling parent reply bachmeier <no spam.net> writes:
On Saturday, 7 August 2021 at 12:15:15 UTC, IGotD- wrote:

 Somewhat related. when Java was designed, the designer (James 
 Gosling I believe) claimed that programmers were too stupid to 
 understand the difference between signed and unsigned math 
 (despite often several years of university education) and 
 removed signed math entirely from the language. The impact is 
 that when unsigned math is required, you are forced to 
 conversions and library solutions. Not ideal when an HW APIs 
 deals with unsigned numbers for example.
Programmers are humans that write programs. I've surely written more than a million lines of code in my life (who knows how much, but that's definitely a lower bound) and I did not study unsigned math in college. I took one programming class and I've done a lot of independent study. Maybe I could figure out how to work with unsigned math, but why would I want to? I have better things to do with my time.

But set all that aside. Anyone that's taught a university class will agree that you can't assume someone understands something just because they attended a lecture and took a test over it.

I don't necessarily disagree that there are *some* cases of overly restrictive language design. I didn't last very long with Go for that reason. I just think unsigned math is not the best example. Switching to @safe by default would be a better example.
Aug 09 2021
next sibling parent reply Mark <mt.rzezniczak gmail.com> writes:
On Monday, 9 August 2021 at 15:15:53 UTC, bachmeier wrote:
 On Saturday, 7 August 2021 at 12:15:15 UTC, IGotD- wrote:

 Somewhat related. when Java was designed, the designer (James 
 Gosling I believe) claimed that programmers were too stupid to 
 understand the difference between signed and unsigned math 
 (despite often several years of university education) and 
 removed signed math entirely from the language. The impact is 
 that when unsigned math is required, you are forced to 
 conversions and library solutions. Not ideal when an HW APIs 
 deals with unsigned numbers for example.
Programmers are humans that write programs. I've surely written more than a million lines of code in my life (who knows how much, but that's definitely a lower bound) and I did not study unsigned math in college. I took one programming class and I've done a lot of independent study. Maybe I could figure out how to work with unsigned math, but why would I want to? I have better things to do with my time. But set all that aside. Anyone that's taught a university class will agree that you can't assume someone understands something just because they attended a lecture and took a test over it. I don't necessarily disagree that there are *some* cases of overly restrictive language design. I didn't last very long with Go for that reason. I just think unsigned math is not the best example. Switching to safe by default would be a better example.
Doesn't unsigned math exist because we are limited in how many bits a number can hold? There's no concept of signed/unsigned for float/double/real. Which one is simpler and would introduce less complexity for the programmer writing code?
Aug 09 2021
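One nuance worth pinning down (a small sketch of standard IEEE behaviour, not something from the post): floats are not "unsigned" - an IEEE double carries an explicit sign bit, which is why a distinct -0.0 exists.

```d
// IEEE floating point is sign-magnitude: the sign lives in its own bit.
// -0.0 compares equal to +0.0, yet the two have different bit patterns.
void main()
{
    import std.math : signbit;

    double z = -0.0;
    assert(z == 0.0);          // equal under comparison...
    assert(signbit(z) == 1);   // ...but the sign bit is set
    assert(signbit(0.0) == 0);
}
```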
next sibling parent bachmeier <no spam.net> writes:
On Monday, 9 August 2021 at 16:14:49 UTC, Mark wrote:

 Isn't unsigned math exist because we are limited in how many 
 bits the number can hold? There's no concept of signed/unsigned 
 for float/double/real. Which one is simpler that would 
 introduce less complexity for the programmer writing code?
I'm definitely not qualified to answer such questions, but I can say I hate unsigned integers with a passion. See for instance https://forum.dlang.org/post/rdrqedmbknwrppbfixll@forum.dlang.org
Aug 09 2021
prev sibling parent reply "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Mon, Aug 09, 2021 at 04:14:49PM +0000, Mark via Digitalmars-d wrote:
[...]
 Isn't unsigned math exist because we are limited in how many bits the
 number can hold? There's no concept of signed/unsigned for
 float/double/real. Which one is simpler that would introduce less
 complexity for the programmer writing code?
Floating-point, if anything, is *more* complex and comes with more pitfalls than signed/unsigned mistakes.  Worse yet, it *appears* to behave like what most people imagine real numbers to behave, but in fact doesn't, which makes mistakes more likely, and also harder to catch because they usually only crop up in special corner cases while generally appearing to work correctly.  Cf.:

	https://docs.oracle.com/cd/E19957-01/806-3568/ncg_goldberg.html

At the end of the day, it boils down to: you have to learn how your programming language and its types work.  If you imagine you can just cowboy your way through programming without fully understanding what you're doing, you only have yourself to blame when the results are disappointing.  As Walter once said:

	I've been around long enough to have seen an endless parade of
	magic new techniques du jour, most of which purport to remove
	the necessity of thought about your programming problem.  In the
	end they wind up contributing one or two pieces to the
	collective wisdom, and fade away in the rearview mirror.
	-- Walter Bright


T

-- 
Windows 95 was a joke, and Windows 98 was the punchline.
Aug 09 2021
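A minimal demonstration of the "appears to behave like the reals" trap (the classic 0.1 example, not taken from the post): 0.1 has no exact binary representation, so ten additions of it do not sum to 1.0.

```d
// 0.1 cannot be represented exactly in binary floating point, so the
// rounding error accumulates: the sum is close to 1.0, but not equal.
void main()
{
    import std.math : isClose;
    import std.stdio : writefln;

    double sum = 0.0;
    foreach (i; 0 .. 10)
        sum += 0.1;

    writefln("%.17f", sum);     // prints 0.99999999999999989
    assert(sum != 1.0);         // the naive expectation fails
    assert(isClose(sum, 1.0));  // tolerance-based comparison is the fix
}
```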
parent reply bachmeier <no spam.net> writes:
On Monday, 9 August 2021 at 17:03:57 UTC, H. S. Teoh wrote:

 Floating-point, if anything, is *more* complex and comes with 
 more pitfalls than signed/unsigned mistakes. Worse yet, it 
 *appears* to behave like what most people imagine real numbers 
 to behave, but in fact doesn't, which makes mistakes more 
 likely, and also harder to catch because they usually only crop 
 up in special corner cases while generally appearing to work 
 correctly.
There's an important difference though. Signed/unsigned mistakes are a choice the programming language designer makes - Walter made his choice and Gosling made a different choice. You're more or less stuck with the limitations of floating point.
Aug 09 2021
next sibling parent reply "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Mon, Aug 09, 2021 at 07:01:18PM +0000, bachmeier via Digitalmars-d wrote:
 On Monday, 9 August 2021 at 17:03:57 UTC, H. S. Teoh wrote:
 
 Floating-point, if anything, is *more* complex and comes with more
 pitfalls than signed/unsigned mistakes.
[...]
 There's an important difference though. Signed/unsigned mistakes are a
 choice the programming language designer makes - Walter made his
 choice and Gosling made a different choice. You're more or less stuck
 with the limitations of floating point.
Thing is, Java can (somewhat) get away with it because it's designed to run in an idealized VM environment, abstracted away from the ugly details of the underlying hardware.  But D is a systems programming language, so it should not hide the ugly realities of the hardware that it runs on.  By using a systems programming language you kinda signed up for this, in a sense.  As Knuth once said:

	People who are more than casually interested in computers should
	have at least some idea of what the underlying hardware is like.
	Otherwise the programs they write will be pretty weird.
	-- D. Knuth

That includes knowing the ugly realities of 2's complement arithmetic.


T

-- 
Klein bottle for rent ... inquire within. -- Stephen Mulraney
Aug 09 2021
next sibling parent reply Paul Backus <snarwin gmail.com> writes:
On Monday, 9 August 2021 at 19:23:14 UTC, H. S. Teoh wrote:
 As Knuth once said:

 	People who are more than casually interested in computers 
 should
 	have at least some idea of what the underlying hardware is 
 like.
 	Otherwise the programs they write will be pretty weird.
 	-- D. Knuth

 That includes knowing the ugly realities of 2's complement 
 arithmetic.
The way I remember it, 2's complement notation is a method of encoding signed integers as unsigned integers such that a CPU can use the same instructions and circuits for both signed and unsigned arithmetic. So, from the hardware's point of view, unsigned arithmetic is the pure, simple version, and signed arithmetic is the ugly, complex version. :)
Aug 09 2021
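Paul's point as a runnable sketch (an illustration, not from the post): the bit pattern is the same; only the type decides how it is read.

```d
// One 32-bit pattern, two interpretations.  The hardware adds both
// the same way; only the printed meaning differs.
void main()
{
    import std.stdio : writefln;

    uint u = 0xFFFF_FFFF;   // read as unsigned: 4294967295
    int  s = cast(int) u;   // same bits, read as signed: -1

    writefln("unsigned: %s, signed: %s", u, s);
    assert(u + 1 == 0);              // both wrap through zero,
    assert(cast(uint)(s + 1) == 0);  // using the identical adder
}
```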
parent reply "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Mon, Aug 09, 2021 at 08:18:26PM +0000, Paul Backus via Digitalmars-d wrote:
 On Monday, 9 August 2021 at 19:23:14 UTC, H. S. Teoh wrote:
 As Knuth once said:
 
 	People who are more than casually interested in computers should
 	have at least some idea of what the underlying hardware is like.
 	Otherwise the programs they write will be pretty weird.
 	-- D. Knuth
 
 That includes knowing the ugly realities of 2's complement
 arithmetic.
The way I remember it, 2's complement notation is a method of encoding signed integers as unsigned integers such that a CPU can use the same instructions and circuits for both signed and unsigned arithmetic. So, from the hardware's point of view, unsigned arithmetic is the pure, simple version, and signed arithmetic is the ugly, complex version. :)
Indeed! If you look at the assembly level, unsigned arithmetic is the one with straightforward instructions mapping 1-to-1 with arithmetic operations, whereas signed arithmetic is the one that involves carry bits and other such additional complications. T -- Heads I win, tails you lose.
Aug 09 2021
next sibling parent reply Adam D Ruppe <destructionator gmail.com> writes:
On Monday, 9 August 2021 at 21:05:05 UTC, H. S. Teoh wrote:
 Indeed!  If you look at the assembly level, unsigned arithmetic 
 is the one with straightforward instructions mapping 1-to-1 
 with arithmetic operations, whereas signed arithmetic is the 
 one that involves carry bits and other such additional 
 complications.
They're literally identical for most operations; you can use the very same instructions and the only difference is how you interpret the data. In x86 they set both carry and overflow flags so you can decide which one you care about.
Aug 09 2021
parent reply claptrap <clap trap.com> writes:
On Monday, 9 August 2021 at 22:01:33 UTC, Adam D Ruppe wrote:
 On Monday, 9 August 2021 at 21:05:05 UTC, H. S. Teoh wrote:
 Indeed!  If you look at the assembly level, unsigned 
 arithmetic is the one with straightforward instructions 
 mapping 1-to-1 with arithmetic operations, whereas signed 
 arithmetic is the one that involves carry bits and other such 
 additional complications.
They're literally identical for most operations; you can use the very same instructions and the only difference is how you interpret the data. In x86 they set both carry and overflow flags so you can decide which one you care about.
Since there's only ADD,SUB,MUL(IMUL),DIV(IDIV), it's about 50/50. Half the ops have only unsigned versions, half have signed and unsigned. IIRC the overflow flag is actually just for catching the error, not for actual arithmetic, since if you're doing signed multiword arithmetic you only use a signed word at the top. That's why there's ADC, SBB, but no equivalents for the overflow flag.
Aug 09 2021
parent Adam D Ruppe <destructionator gmail.com> writes:
On Monday, 9 August 2021 at 22:57:22 UTC, claptrap wrote:
 Since there's only ADD,SUB,MUL(IMUL),DIV(IDIV), it's about 
 50/50.
Well, mul is semi-agnostic. It only matters if you look at the high word (which is frequently discarded anyway) and I believe imul works either way. And there's also sar vs shr which are slightly different, while sal and shl are identical.
 IIRC the overflow flag is actually just for catching the error, 
 not for actual arithmetic, since if you're doing signed 
 multiword arithmetic you only use a signed word at the top.
aye.
Aug 09 2021
prev sibling parent reply claptrap <clap trap.com> writes:
On Monday, 9 August 2021 at 21:05:05 UTC, H. S. Teoh wrote:
 On Mon, Aug 09, 2021 at 08:18:26PM +0000, Paul Backus via 
 Digitalmars-d wrote:
 On Monday, 9 August 2021 at 19:23:14 UTC, H. S. Teoh wrote:
 As Knuth once said:
 
 	People who are more than casually interested in computers 
 should
 	have at least some idea of what the underlying hardware is 
 like.
 	Otherwise the programs they write will be pretty weird.
 	-- D. Knuth
 
 That includes knowing the ugly realities of 2's complement 
 arithmetic.
The way I remember it, 2's complement notation is a method of encoding signed integers as unsigned integers such that a CPU can use the same instructions and circuits for both signed and unsigned arithmetic. So, from the hardware's point of view, unsigned arithmetic is the pure, simple version, and signed arithmetic is the ugly, complex version. :)
Indeed! If you look at the assembly level, unsigned arithmetic is the one with straightforward instructions mapping 1-to-1 with arithmetic operations, whereas signed arithmetic is the one that involves carry bits and other such additional complications.
The carry flag is for unsigned arithmetic; it's the overflow flag that is for signed arithmetic. On a 32-bit system, carry indicates that bit 31 has overflowed into bit 32, and overflow indicates that bit 30 has overflowed into bit 31. They are mainly for multiword arithmetic. And what's probably not obvious is that you need unsigned arithmetic to do multiword signed arithmetic; you only use signed arithmetic for the topmost word.
Aug 09 2021
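As a sketch in D (the `U128` helper is made up purely for illustration): this is the software analogue of ADC - the carry is recovered from unsigned wraparound, which is exactly the "unsigned arithmetic underneath" that multiword signed math needs.

```d
// 128-bit addition from two 64-bit halves.  The carry out of the low
// word is detected via unsigned wrap: if the sum is smaller than an
// operand, it wrapped, so a carry must be propagated (ADC in hardware).
struct U128 { ulong lo, hi; }

U128 add(U128 a, U128 b)
{
    U128 r;
    r.lo = a.lo + b.lo;
    ulong carry = (r.lo < a.lo) ? 1 : 0;
    r.hi = a.hi + b.hi + carry;
    return r;
}

void main()
{
    auto r = add(U128(ulong.max, 0), U128(1, 0));  // (2^64 - 1) + 1
    assert(r.lo == 0 && r.hi == 1);                // == 2^64
}
```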
parent "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Mon, Aug 09, 2021 at 10:49:13PM +0000, claptrap via Digitalmars-d wrote:
[...]
 And whats probably not obvious is that you need unsigned arithmetic to
 do multiword signed arithmetic, you only use signed arithmetic for the
 top most word.
Exactly!! This is why unsigned arithmetic is more basic and fundamental than signed arithmetic. The latter is built from the former, not the former from the latter. T -- "You know, maybe we don't *need* enemies." "Yeah, best friends are about all I can take." -- Calvin & Hobbes
Aug 09 2021
prev sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 8/9/2021 12:23 PM, H. S. Teoh wrote:
 As Knuth once said:
 
 	People who are more than casually interested in computers should
 	have at least some idea of what the underlying hardware is like.
 	Otherwise the programs they write will be pretty weird.
 	-- D. Knuth
 
 That includes knowing the ugly realities of 2's complement arithmetic.
A friend of mine (a very smart one) back in college one day decided to learn programming. He got the Fortran specification(!), read it, and wrote a program. The program ran correctly, but incredibly slowly. Baffled, he took it to his programmer friend for help. The friend laughed, and said here's the problem: you're writing to a file in a loop:

    loop
        open the file
        append a character
        close the file

instead of:

    open the file
    loop
        append character
    close the file

My friend said he followed the spec, which said nothing at all about how file I/O actually worked.
Aug 10 2021
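The same trap sketched in D (an illustration of the story, not Walter's original Fortran; paths and counts are made up): per-iteration open/close turns one stream of writes into a thousand syscall pairs.

```d
// Opening the file inside the loop costs an open+close per character;
// hoisting the open out does the same work with one open+close total.
import std.stdio : File;

void slow(string path)
{
    foreach (i; 0 .. 1000)
    {
        auto f = File(path, "a");  // open on every iteration
        f.write('x');              // append one character
    }                              // ...and close again (destructor)
}

void fast(string path)
{
    auto f = File(path, "a");      // open once
    foreach (i; 0 .. 1000)
        f.write('x');
}                                  // close once

void main()
{
    slow("slow.txt");
    fast("fast.txt");
}
```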
parent "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Tue, Aug 10, 2021 at 11:29:35AM -0700, Walter Bright via Digitalmars-d wrote:
[...]
 A friend mine (a very smart one) back in college one day decided to
 learn programming. He got the Fortran specification(!), read it, and
 wrote a program.
Hey, what better way to learn a language than to get to the very definition of it? ;-)
 The program ran correctly, but incredibly slowly. Baffled, he took it
 to his programmer friend for help. The friend laughed, and said here's
 the problem: you're writing to a file in a loop:
 
     loop
         open the file
         append a character
         close the file
 
 instead of:
 
    open the file
    loop
       append character
    close the file
 
 My friend said he followed the spec, which said nothing at all about
 how file I/O actually worked.
:-D  What the spec *doesn't* say is often just as important as, if not more important than, what it does say. :-)

Reminds me of learning a foreign language by reading a dictionary... you can learn all the words, and even consult a book on grammar, but will it survive the first encounter with a native speaker? ;-)

True story: one time, I found a particular foreign word in a dictionary, and thought that was how you referred to a particular thing. So I used that word with a native speaker. He stared at me for a moment with this strange look of incomprehension, and then suddenly comprehension gleamed in his eyes, and with a smile he explained that this was an archaic word that is no longer in widespread use, and that people these days use a different word for the same thing. He further added that if I were to use that word with somebody younger than him, they probably wouldn't even understand what it meant.

Such is the peril of learning a language by spec without understanding the context in which it operates. :-)


T

-- 
Almost all proofs have bugs, but almost all theorems are true. -- Paul Pedersen
Aug 10 2021
prev sibling parent Walter Bright <newshound2 digitalmars.com> writes:
On 8/9/2021 12:01 PM, bachmeier wrote:
 There's an important difference though. Signed/unsigned mistakes are a choice 
 the programming language designer makes - Walter made his choice and Gosling 
 made a different choice. You're more or less stuck with the limitations of 
 floating point.
The thing about two's complement arithmetic, and IEEE floating point, is that their faults are very well known. You can move your understanding of them from one language to the next. It's worthwhile to invest a few minutes learning about them if you're going to spend your career programming. Just like it's worthwhile learning about how tire adhesion works when looking forward to decades of driving. (It can save your life.)
Aug 10 2021
prev sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 8/9/2021 8:15 AM, bachmeier wrote:
 I did not study unsigned math in college.
There are no classes in unsigned math.
 I took one programming class and I've done a lot of independent
 study. Maybe I could figure out how to work with unsigned math, but
 why would I want to? I have better things to do with my time.
It's worth spending 5 minutes to learn what 2's complement arithmetic is: https://en.wikipedia.org/wiki/Two%27s_complement
Aug 10 2021
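The five-minute version, as executable D (a standard identity, not taken from the article): in two's complement, negation is bitwise complement plus one, which is exactly why one adder circuit serves both signed and unsigned values.

```d
// Two's complement in one line: -x == ~x + 1 for every int, including
// the wrap-around case of int.min.
void main()
{
    foreach (int x; [0, 1, 42, -7, int.max, int.min])
        assert(-x == ~x + 1);
}
```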
parent reply Patrick Schluter <Patrick.Schluter bbox.fr> writes:
On Tuesday, 10 August 2021 at 17:47:46 UTC, Walter Bright wrote:
 On 8/9/2021 8:15 AM, bachmeier wrote:
 I did not study unsigned math in college.
There are no classes in unsigned math.
Primary school is where unsigned arithmetic is taught. :-)
Aug 10 2021
next sibling parent Paulo Pinto <pjmlp progtools.org> writes:
On Tuesday, 10 August 2021 at 21:51:51 UTC, Patrick Schluter 
wrote:
 On Tuesday, 10 August 2021 at 17:47:46 UTC, Walter Bright wrote:
 On 8/9/2021 8:15 AM, bachmeier wrote:
 I did not study unsigned math in college.
There are no classes in unsigned math.
Primary school is where unsigned arithmetic is tought. :-)
With unbounded storage, base 10 and no wrapping.
Aug 10 2021
prev sibling next sibling parent Dominikus Dittes Scherkl <dominikus scherkl.de> writes:
On Tuesday, 10 August 2021 at 21:51:51 UTC, Patrick Schluter 
wrote:
 On Tuesday, 10 August 2021 at 17:47:46 UTC, Walter Bright wrote:
 On 8/9/2021 8:15 AM, bachmeier wrote:
 I did not study unsigned math in college.
There are no classes in unsigned math.
Primary school is where unsigned arithmetic is taught. :-)
No. There are classes in unsigned math - but very much later. They are called "Finite Fields Arithmetic" or even "binary polynomials over finite fields", and although it is not too complicated, it assumes a lot of basic math knowledge to be understood well.
Aug 10 2021
prev sibling parent =?UTF-8?Q?Ali_=c3=87ehreli?= <acehreli yahoo.com> writes:
On 8/10/21 2:51 PM, Patrick Schluter wrote:

 Primary school is where unsigned arithmetic is tought. :-)
I was surprised at how quickly my son learned negative numbers at probably the age of 5, in less than 5 minutes. I am not exaggerating! I only said: "You know the numbers and you know zero... Well, actually there are the negative numbers on the other side too." (I used colored domino pieces to represent those three kinds.) That was it! :)

I would show off to my friends by asking my son:

-- "Let's say we are reading a book with copyright 2009. How old *were* you when that book was published?"

After thinking 2 seconds:

-- "Negative 4!"

:)

Ali
Aug 11 2021
prev sibling next sibling parent reply Steven Schveighoffer <schveiguy gmail.com> writes:
On 8/7/21 8:15 AM, IGotD- wrote:
 This is a general discussion which applies to all computer languages and 
 also under several decades. What I have observed is that language 
 designers see programmers misuse the language and introduce possible 
 bugs and therefore remove features in languages. An analogy would limit 
 the functionality of cars because people sometimes are involved in 
 accidents, like automatic speed limiter (soon to be law in several 
 countries).
 
 Language designers seem to have a big brother attitude towards 
 programmers and think they will save the world by introducing limitations.
 
 Examples.
 
 1.
 Array indexes should be signed instead of unsigned because somehow 
 programmers mess up loops among other things. Bjarne Stroustrup 
 considered his unsigned index to be a historic mistake. While unsigned 
 array indexes make perfectly sense, the bias towards signed seems to be 
 that programmers are stupid. The question is, if can't make a for loop 
 with unsigned math, shouldn't you look for another job?
If people who have made mistakes with unsigned math in loops were disqualified from programming, this place would be a ghost town. Also note that signed indexes are allowed (you can also use negative indexes)! It's the array LENGTH being unsigned which is the problem. Note also that, due to D's promotion rules, using any unsigned value poisons all your other values to be unsigned (usually unexpectedly).
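A minimal sketch of that poisoning (assuming a 64-bit target, where `size_t` is `ulong`; the values are made up):

```d
// Mixing a signed int with the unsigned a.length makes the whole
// expression unsigned, so "negative" results silently wrap around.
void main()
{
    int[] a = [1, 2, 3];
    int i = -1;

    auto x = a.length + i;                 // inferred as ulong, not long
    static assert(is(typeof(x) == ulong));
    assert(x == 2);                        // happens to be fine here...

    auto y = i - a.length;                 // ..."wants" to be -4, but
    assert(y == ulong.max - 3);            // wraps to 18446744073709551612
}
```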
 2.
 Somewhat related. when Java was designed, the designer (James Gosling I 
 believe) claimed that programmers were too stupid to understand the 
 difference between signed and unsigned math (despite often several years 
 of university education) and removed signed math entirely from the 
 language. The impact is that when unsigned math is required, you are 
 forced to conversions and library solutions. Not ideal when an HW APIs 
 deals with unsigned numbers for example.
"too stupid" seems like an incorrect assessment. More like "too careless". Consider that it's not really unsigned math or signed math, but when you are doing math between signed and unsigned values, what should happen? That's where most people get into trouble. Note that signed math and unsigned math is identical, it's just most people aren't doing math around the value 2^31, but they do a lot around the value 0. I would love for D to use signed indexes for arrays, especially with 64-bit integers.
 The question is, do you think language designers go to far when trying 
 to "save" programmers from misuse or not?
 Do you think there can be approaches that both prevent bugs at the same 
 time do not limit the language?
Until the world of programming is ruled by perfect AI, please keep trying to fix my stupid human mistakes, thanks! However, I do know of cases that have gone too far. Like Swift eliminating for loops -- that one stung. -Steve
Aug 09 2021
next sibling parent reply IGotD- <nise nise.com> writes:
On Monday, 9 August 2021 at 16:27:28 UTC, Steven Schveighoffer 
wrote:
 However, I do know of cases that have gone too far. Like Swift 
 eliminating for loops -- that one stung.

 -Steve
Is this correct? For loops with the classical C syntax are removed, but you can still have for loops over ranges, and the x...y syntax just makes an integer range of your liking. This is similar to foreach (i; 0 .. 3) in D. It's just a syntax change, and bugs with ranges are probably just as easy to make as with the old C syntax.
Aug 09 2021
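The D form mentioned above, as a complete snippet (trivial, but it shows the range literal in context):

```d
// foreach over the half-open integer range 0 .. 3 visits 0, 1, 2.
void main()
{
    import std.stdio : writeln;
    foreach (i; 0 .. 3)
        writeln(i);
}
```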
parent Steven Schveighoffer <schveiguy gmail.com> writes:
On 8/9/21 1:14 PM, IGotD- wrote:
 On Monday, 9 August 2021 at 16:27:28 UTC, Steven Schveighoffer wrote:
 However, I do know of cases that have gone too far. Like Swift 
 eliminating for loops -- that one stung.
Is this correct?
Yes.
 for loops with classical C syntax are removed but you 
 can still have for loops over ranges and the x...y syntax just makes an 
 integer range of your liking. This is similar to foreach (i; 0 .. 3) in D.
There are more uses for traditional `for` loops than just looping over integers or ranges.
 
 It's just a syntax change and bugs with ranges is probably just as easy 
 as with the old C syntax.
The use case I had, I needed to rewrite into a sequence. Hm... Let me find the change:

```swift
for var i = 0; (gridOrigin.x + CGFloat(i) * subGridSpacing.width) * scale < bounds.width; i += 1 {
    ...
```

I wrote a "for generator" sequence type that accepted a lambda for the condition. It now looks like:

```swift
for i in forgen(0, condition: { (self.gridOrigin.x + CGFloat($0) * self.subGridSpacing.width) * self.scale < self.bounds.width }) {
    ...
```

This is just a hack, and not a full replacement. It takes an initial value and an increment (by default 1), and only works with Int. Honestly, it was a long time ago (heck, they probably removed more features and this won't compile now). There might be better ways.

And I probably would have been fine with Swift never having for loops. But to start out with them, and then remove them, seemed unnecessary.

-Steve
Aug 09 2021
prev sibling parent Yatheendra <gmail yath.com> writes:
 most people aren't doing math around the value 2^31, but they 
 do a lot around the value 0
Very true. I have shifted from looping 'N-1 to 0' to looping 'N to 1', naming the loop variable ith rather than i, which makes 0 a natural sentinel value for ith in C style: (ith = N; ith > 0; --ith).

This is part of a late realization for me of how under-used and mis-used naming conventions are in programming. I appreciate "object-oriented style" C even more now for this reason (bespoke code in commercial settings; I wouldn't know about glib/gtk). Maybe -betterC with template support suffices for most programming use cases (and we should be using dynamically typed languages for everything else).
Aug 25 2021
prev sibling next sibling parent tausy <tausy protonmail.com> writes:
On Saturday, 7 August 2021 at 12:15:15 UTC, IGotD- wrote:
 This is a general discussion which applies to all computer 
 languages and also under several decades. What I have observed 
 is that language designers see programmers misuse the language 
 and introduce possible bugs and therefore remove features in 
 languages. An analogy would limit the functionality of cars 
 because people sometimes are involved in accidents, like 
 automatic speed limiter (soon to be law in several countries).

 Language designers seem to have a big brother attitude towards 
 programmers and think they will save the world by introducing 
 limitations.

 Examples.

 1.
 Array indexes should be signed instead of unsigned because 
 somehow programmers mess up loops among other things. Bjarne 
 Stroustrup considered his unsigned index to be a historic 
 mistake. While unsigned array indexes make perfectly sense, the 
 bias towards signed seems to be that programmers are stupid. 
 The question is, if can't make a for loop with unsigned math, 
 shouldn't you look for another job?

 2.
 Somewhat related. when Java was designed, the designer (James 
 Gosling I believe) claimed that programmers were too stupid to 
 understand the difference between signed and unsigned math 
 (despite often several years of university education) and 
 removed signed math entirely from the language. The impact is 
 that when unsigned math is required, you are forced to 
 conversions and library solutions. Not ideal when an HW APIs 
 deals with unsigned numbers for example.

 You are welcome to add any other examples that you find 
 significant for the discussion.


 This partially applies to D in some extent but can often be 
 found in other languages and mentality of several language 
 designers.

 The question is, do you think language designers go to far when 
 trying to "save" programmers from misuse or not?
 Do you think there can be approaches that both prevent bugs at 
 the same time do not limit the language?
If there wasn't a need for safe and restrictive languages, they probably wouldn't exist. I'd say, like most things in this world, the drive to make languages safe and easy is money:

https://money.cnn.com/2012/08/09/technology/knight-expensive-computer-bug/index.html

People make mistakes, even the smartest "college/university educated" people. As well, if I want to start a business and I don't need to use a systems language like C, that would be nice. I'm sure it's cheaper for a startup to hire Python developers to write a backend than it would be to hire C/C++ developers to do the same thing.

Another reason, even more important than the first, would be that bugs can and have killed people:

https://theinsurancenerd.com/therac-25-a-computer-bug-that-killed-many/
https://www.linkedin.com/pulse/real-world-bugs-debugging-embedded-system-stanly-christoper

I can probably say with certainty that the people working on these systems were intelligent, had a CS degree, and weren't stupid. Some languages, to me, seem overly restrictive, but the fact is I don't have to use or learn them. Imagine how restrictive a language must be to have mathematical proof that it's type safe, memory safe, runtime-exception safe, data-race free and deadlock free. Take a look at Pony. Too much for me, I'll pass.

My point is that it's not that programmers are stupid; it's just about safety.
Aug 09 2021
prev sibling parent reply surlymoor <surlymoor cock.li> writes:
On Saturday, 7 August 2021 at 12:15:15 UTC, IGotD- wrote:
 [...]
Those examples don't feel all that insulting; then again, I am a stupid programmer. However, something that would sting is being deprived of generics because the Elder Gods deemed them too confusing.
Aug 10 2021
parent Paulo Pinto <pjmlp progtools.org> writes:
On Tuesday, 10 August 2021 at 08:22:04 UTC, surlymoor wrote:
 On Saturday, 7 August 2021 at 12:15:15 UTC, IGotD- wrote:
 [...]
Those examples don't feel all that insulting; then again, I am a stupid programmer. However, something that would sting is being deprived of generics because the Elder Gods deemed them too confusing.
Indeed, however after years of gatherings at the Eldery Mountains, a new revelation on the Book of Wisdom has been made visible on the Old World, an event that is deemed to happen every few centuries, upon which the Elder Gods have summoned the runes of generics, and great rejoice has spread throughout the land with festivities across all villages.

https://go.googlesource.com/proposal/+/refs/heads/master/design/43651-type-parameters.md
https://go2goplay.golang.org/
Aug 10 2021