
digitalmars.D - Wish: Variable Not Used Warning

reply "Nick Sabalausky" <a a.a> writes:
I don't suppose there's any chance of DMD getting a warning for 
variables/arguments that are declared but never accessed? Just today alone 
there's been two bugs I spent 10-30 minutes going nuts trying to track down 
that turned out to be variables I had intended to use but forgot to. 
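
For illustration, a minimal sketch of the bug class (the function and 
names here are invented):

    int sumOf(int[] values)
    {
        int total;     // the variable I meant to use...
        int sum = 0;   // ...and the leftover one that gets all the updates
        foreach (x; values)
            sum += x;
        return total;  // always 0 -- compiles without a peep
    }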
Jul 05 2008
next sibling parent reply "Koroskin Denis" <2korden gmail.com> writes:
On Sun, 06 Jul 2008 03:34:52 +0400, Nick Sabalausky <a a.a> wrote:

 I don't suppose there's any chance of DMD getting a warning for
 variables/arguments that are declared but never accessed? Just today  
 alone
 there's been two bugs I spent 10-30 minutes going nuts trying to track  
 down
 that turned out to be variables I had intended to use but forgot to.

Agreed, put it into a bugzilla, otherwise it may get lost. Since D has no warnings, it should be an error, but it would break too much of the existing code. A better solution might be to enable this check when compiling with -w...
Jul 05 2008
next sibling parent reply bearophile <bearophileHUGS lycos.com> writes:
Koroskin Denis:
 Agreed, put it into a bugzilla, otherwise it may get lost.

I have written this too in one of my lists of suggestions I have posted in this newsgroup, probably more than one year ago :-)
 Since D has no warnings, it should be an error, but it would break too  
 much of the existing code.

What's bad about warnings? Bye, bearophile
Jul 05 2008
next sibling parent "Nick Sabalausky" <a a.a> writes:
"bearophile" <bearophileHUGS lycos.com> wrote in message 
news:g4p207$24dk$1 digitalmars.com...
 Koroskin Denis:
 Agreed, put it into a bugzilla, otherwise it may get lost.

I have written this too in one of my lists of suggestions I have posted in this newsgroup, probably more than one year ago :-)
 Since D has no warnings, it should be an error, but it would break too
 much of the existing code.

What's bad about warnings?

Walter doesn't like them. He feels (felt?) that they tend to reflect shortcomings in a language's design (and I think there are many cases where he's right on that, looking at some other languages). But a while ago he was finally convinced to put in some warnings when the "-w" flag is used. There was much rejoicing.
Jul 05 2008
prev sibling parent "Unknown W. Brackets" <unknown simplemachines.org> writes:
Well, I think the problems center around the following:

1. In many cases a "warning" in some C/C++ compilers really should be an 
error.  The line seems to be difficult and arguable.

2. Compilers don't agree on warnings.  This is sorta #1 but for those 
who ignore warnings, it means their code might (surprisingly to them) be 
non-portable.

3. Warnings beg for warning-suppression methodologies, because they are 
sometimes issued incorrectly (otherwise they'd be errors, no?).

4. It creates more of a rift between people who are pedantic about 
warnings, etc. and people who are not.  In my experience at least, 
maintenance programmers, newer programmers, and experienced programmers 
can all fit into those two groups in awkward ways.

A lot of these points are well expressed by PHP's (imho very flawed) 
error reporting.  Most PHP programmers learn to turn warnings off before 
they even learn what a for loop is.  I've taken many large open-source 
or proprietary PHP codebases, simply turned on warnings, and been able 
to point out a handful of very obvious bugs in short order.

I think breaking it like D does is an excellent strategy based on real 
world, practical problems with warnings.

In any case, I would very much like to see (or develop) a code-standards 
enforcing lint tool for D.  This wouldn't be that hard to make based on 
dmd's open source frontend, and could be configured to enforce such 
guidelines as:

1. No commented out code (WebKit uses this guideline, I do with some 
languages as well.)

2. Either consistent or specific indentation style.

3. Variable usage and naming.

4. Use of unstable, deprecated, or untrusted language or library features.

But I really think that works better as a separate tool (that could be a 
checkin hook for whatever preferred versioning system, etc.)  This helps 
especially since some people don't compile things (although they should) 
before checkin, and actually recompiling automatically is often wrong.

-[Unknown]


bearophile wrote:
 Koroskin Denis:
 Agreed, put it into a bugzilla, otherwise it may get lost.

I have written this too in one of my lists of suggestions I have posted in this newsgroup, probably more than one year ago :-)
 Since D has no warnings, it should be an error, but it would break too  
 much of the existing code.

What's bad about warnings? Bye, bearophile

Jul 05 2008
prev sibling parent "Nick Sabalausky" <a a.a> writes:
"Koroskin Denis" <2korden gmail.com> wrote in message 
news:op.uduer8uhenyajd korden...
 On Sun, 06 Jul 2008 03:34:52 +0400, Nick Sabalausky <a a.a> wrote:

 I don't suppose there's any chance of DMD getting a warning for
 variables/arguments that are declared but never accessed? Just today 
 alone
 there's been two bugs I spent 10-30 minutes going nuts trying to track 
 down
 that turned out to be variables I had intended to use but forgot to.

Agreed, put it into a bugzilla, otherwise it may get lost. Since D has no warnings, it should be an error, but it would break too much of the existing code. A better solution might be to enable this check when compiling with -w...

Added as issue #2197 http://d.puremagic.com/issues/show_bug.cgi?id=2197
Jul 05 2008
prev sibling next sibling parent Ary Borenszweig <ary esperanto.org.ar> writes:
Nick Sabalausky wrote:
 I don't suppose there's any chance of DMD getting a warning for 
 variables/arguments that are declared but never accessed? Just today alone 
 there's been two bugs I spent 10-30 minutes going nuts trying to track down 
 that turned out to be variables I had intended to use but forgot to. 

Moreover, I'd like a warning when a private variable is declared but never read. And when a private method is declared but never invoked.
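
For example (a made-up sketch; the class and member names are invented):

    class Cache
    {
        private int hitCount;     // written nowhere, read nowhere
        private void evict() { }  // declared but never invoked
    }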
Jul 05 2008
prev sibling next sibling parent reply Walter Bright <newshound1 digitalmars.com> writes:
Nick Sabalausky wrote:
 I don't suppose there's any chance of DMD getting a warning for 
 variables/arguments that are declared but never accessed? Just today alone 
 there's been two bugs I spent 10-30 minutes going nuts trying to track down 
 that turned out to be variables I had intended to use but forgot to. 

The problem with unused variable warnings is they are annoying when you're developing code in an iterative manner. They get in the way when you're commenting out sections of code to try and isolate a problem. They can be a problem when using "version" and "static if" statements.

So, why not just turn off the warnings? The problem with warnings is that if there are n warnings, there are essentially n factorial different versions of the language. If you're faced with compiling someone else's code (like you downloaded it off the internet and have to compile it because it only is distributed as source) and warnings go off, is that a bug in the code or not? What do you do?

Some shops have a "thou shall compile with warnings enabled, and there shall be no warning messages." That causes problems when you port the code to a different compiler with a different, even contradictory, notion of what is a warning. So then you wind up putting wacky things in the code just to get the compiler to shut up about the warnings. Those kind of things tend to interfere with the beauty of the code, and since they are not necessary to the program's logic, they tend to confuse and misdirect the maintenance programmer (why is this variable pointlessly referenced here? Why is this unreachable return statement here? Is this a bug?)

There is a place for warnings, however. That is in a separate static analysis tool (i.e. lint, coverity, etc.) which can be armed with all kinds of heuristics with which to flag questionable constructs. I don't think they should be part of the compiler, however.
Jul 05 2008
next sibling parent "Nick Sabalausky" <a a.a> writes:
"Walter Bright" <newshound1 digitalmars.com> wrote in message 
news:g4pplc$gno$1 digitalmars.com...
 Nick Sabalausky wrote:
 I don't suppose there's any chance of DMD getting a warning for 
 variables/arguments that are declared but never accessed? Just today 
 alone there's been two bugs I spent 10-30 minutes going nuts trying to 
 track down that turned out to be variables I had intended to use but 
 forgot to.

The problem with unused variable warnings is they are annoying when you're developing code in an iterative manner. They get in the way when you're commenting out sections of code to try and isolate a problem. They can be a problem when using "version" and "static if" statements.

I develop code in a highly iterative manner and find "unused variable" warnings highly useful. In all of the time I've spent with other compilers that do issue "unused variable" warnings, I've never found it to be an annoyance. And the way I've always seen it, warnings literally *are* a built-in lint tool.
Jul 06 2008
prev sibling next sibling parent reply "Koroskin Denis" <2korden gmail.com> writes:
On Sun, 06 Jul 2008 10:45:03 +0400, Walter Bright  
<newshound1 digitalmars.com> wrote:

 Nick Sabalausky wrote:
 I don't suppose there's any chance of DMD getting a warning for  
 variables/arguments that are declared but never accessed? Just today  
 alone there's been two bugs I spent 10-30 minutes going nuts trying to  
 track down that turned out to be variables I had intended to use but  
 forgot to.

The problem with unused variable warnings is they are annoying when you're developing code in an iterative manner. They get in the way when you're commenting out sections of code to try and isolate a problem. They can be a problem when using "version" and "static if" statements. So, why not just turn off the warnings? The problem with warnings is that if there are n warnings, there are essentially n factorial different versions of the language. If you're faced with compiling someone else's code (like you downloaded it off the internet and have to compile it because it only is distributed as source) and warnings go off, is that a bug in the code or not? What do you do? Some shops have a "thou shall compile with warnings enabled, and there shall be no warning messages." That causes problems when you port the code to a different compiler with a different, even contradictory, notion of what is a warning. So then you wind up putting wacky things in the code just to get the compiler to shut up about the warnings. Those kind of things tend to interfere with the beauty of the code, and since they are not necessary to the program's logic, they tend to confuse and misdirect the maintenance programmer (why is this variable pointlessly referenced here? Why is this unreachable return statement here? Is this a bug?) There is a place for warnings, however. That is in a separate static analysis tool (i.e. lint, coverity, etc.) which can be armed with all kinds of heuristics with which to flag questionable constructs. I don't think they should be part of the compiler, however.

Since DMD already has a -w switch, why not make use of it? I think it would be a good practice to compile your code with -w on just once in a while, say, before a public release. This should enable more strict code checking, like unused methods, variables, unreachable code, etc.
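
For instance, a minimal sketch of the kind of thing such a check would 
flag (assuming an unreachable-code check along those lines; the exact 
diagnostics dmd emits may differ):

    int f()
    {
        return 1;
        return 2;  // dead code -- a candidate for a -w diagnostic
    }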
Jul 06 2008
parent "Nick Sabalausky" <a a.a> writes:
"Koroskin Denis" <2korden gmail.com> wrote in message 
news:op.udu17gmcenyajd korden...
 On Sun, 06 Jul 2008 10:45:03 +0400, Walter Bright 
 <newshound1 digitalmars.com> wrote:

 Nick Sabalausky wrote:
 I don't suppose there's any chance of DMD getting a warning for 
 variables/arguments that are declared but never accessed? Just today 
 alone there's been two bugs I spent 10-30 minutes going nuts trying to 
 track down that turned out to be variables I had intended to use but 
 forgot to.

 The problem with unused variable warnings is they are annoying when you're developing code in an iterative manner. [...] There is a place for warnings, however. That is in a separate static analysis tool (i.e. lint, coverity, etc.) which can be armed with all kinds of heuristics with which to flag questionable constructs. I don't think they should be part of the compiler, however.

 Since DMD already has a -w switch, why not make use of it? I think it would be a good practice to compile your code with -w on just once in a while, say, before a public release. This should enable more strict code checking, like unused methods, variables, unreachable code, etc.

Not to beat a dead horse, but I always have warnings permanently turned on in every compiler I use, including DMD.
Jul 06 2008
prev sibling next sibling parent reply Robert Fraser <fraserofthenight gmail.com> writes:
Walter Bright wrote:
 There is a place for warnings, however. That is in a separate static 
 analysis tool (i.e. lint, coverity, etc.) which can be armed with all 
 kinds of heuristics with which to flag questionable constructs. I don't 
 think they should be part of the compiler, however.

The compiler already has full semantic knowledge of the code, and at least some of the warnings seem like "low-hanging fruit" so why not make the compiler act as a "mini-lint"?
Jul 06 2008
parent reply Walter Bright <newshound1 digitalmars.com> writes:
Robert Fraser wrote:
 The compiler already has full semantic knowledge of the code, and at 
 least some of the warnings seem like "low-hanging fruit" so why not make 
 the compiler act as a "mini-lint"?

Generally for the reasons already mentioned. Warnings are properly in the scope of static analysis tools, which have a different purpose than a compiler.
Jul 06 2008
parent reply Robert Fraser <fraserofthenight gmail.com> writes:
Walter Bright wrote:
 Robert Fraser wrote:
 The compiler already has full semantic knowledge of the code, and at 
 least some of the warnings seem like "low-hanging fruit" so why not 
 make the compiler act as a "mini-lint"?

Generally for the reasons already mentioned. Warnings are properly in the scope of static analysis tools, which have a different purpose than a compiler.

A compiler is not a documentation generator or a header generator, yet DMD does both (with some switches). Why not the same with lint-like functionality?
Jul 06 2008
parent reply Walter Bright <newshound1 digitalmars.com> writes:
Robert Fraser wrote:
 Walter Bright wrote:
 Robert Fraser wrote:
 The compiler already has full semantic knowledge of the code, and at 
 least some of the warnings seem like "low-hanging fruit" so why not 
 make the compiler act as a "mini-lint"?

Generally for the reasons already mentioned. Warnings are properly in the scope of static analysis tools, which have a different purpose than a compiler.

A compiler is not a documentation generator or a header generator, yet DMD does both (with some switches). Why not the same with lint-like functionality?

Because what constitutes a proper warning is a very subjective issue; there is plenty of room for different ideas. If it was in the compiler, it would inhibit development of static analysis tools, and would confuse the issue of what was correct D code.
Jul 08 2008
next sibling parent reply "Nick Sabalausky" <a a.a> writes:
"Walter Bright" <newshound1 digitalmars.com> wrote in message 
news:g4v646$c2j$1 digitalmars.com...
 Robert Fraser wrote:
 Walter Bright wrote:
 Robert Fraser wrote:
 The compiler already has full semantic knowledge of the code, and at 
 least some of the warnings seem like "low-hanging fruit" so why not 
 make the compiler act as a "mini-lint"?

Generally for the reasons already mentioned. Warnings are properly in the scope of static analysis tools, which have a different purpose than a compiler.

A compiler is not a documentation generator or a header generator, yet DMD does both (with some switches). Why not the same with lint-like functionality?

 Because what constitutes a proper warning is a very subjective issue; there is plenty of room for different ideas.

Ok, so the different warnings should be able to be turned on and off. If you don't agree with a particular type of warning then you turn it off. That's the nice thing about warnings as opposed to errors: they're optionally letting you know about certain conditions that you might want to be aware of, and they do it without changing, redefining, or otherwise affecting the language itself.
 If it was in the compiler, it would inhibit development of static analysis 
 tools,

Can you elaborate on how this would happen?
 and would confuse the issue of what was correct D code.

Anything that generates a warning instead of an error is by definition valid code. If it wasn't valid it would generate an error instead of a warning. Although, if by "correct" you're referring to style guidelines instead of syntactic and semantic validity, then I still disagree that it's an issue. For instance, I don't think many people would be opposed to having an optional switch that checked for consistent indentation style (I'm not actually requesting this though). People have different indentation style preferences, so the type of indentation could be configured, but perhaps have some sort of default. I can't imagine that confusing people as to what correct style was. Anyone who isn't an absolute novice is well aware of what does and doesn't constitute an issue of style (If it doesn't cause compile-time/runtime errors, then it's a matter of style).
Jul 08 2008
parent reply Walter Bright <newshound1 digitalmars.com> writes:
Nick Sabalausky wrote:
 Ok, so the different warnings should be able to be turned on and off. If you 
 don't agree with a particular type of warning then you turn it off. That's 
 the nice thing about warnings as opposed to errors: they're optionally 
 letting you know about certain conditions that you might want to be aware 
 of, and they do it without changing, redefining, or otherwise affecting the 
 language itself.

That situation exists today for C++ compilers, and it's not so good. You have, as I mentioned previously, n factorial different languages instead of 1. Portability becomes a problem. Confusion about whether the code should compile or not reigns.
 If it was in the compiler, it would inhibit development of static analysis 
 tools,

 Can you elaborate on how this would happen?

It's the same reason why "m4" never caught on as a C preprocessor, despite being vastly superior, and despite everyone who wanted a better CPP being told to use m4.
 and would confuse the issue of what was correct D code.

Anything that generates a warning instead of an error is by definition valid code. If it wasn't valid it would generate an error instead of a warning.

That's true, but it is not what happens in the real world with warnings. I've dealt with warnings on C/C++ compilers for 25 years, and the practice is very different from the theory.
Jul 08 2008
parent reply bearophile <bearophileHUGS lycos.com> writes:
Markus Koskimies:
 For me, it could even warn about indentation quirks, like:
 
 	...
 	if(a == b)
 		do_that();
 		do_that_also();
 	...
 
 ...In which case the compiler could stop and say, that either add {}'s or 
 correct the indentation :)

Or maybe... I have a revolutionary idea: just express to the compiler what you mean once, not using two different means that (by mistake) may say conflicting things. Let's see... maybe just using indentation? This seems a revolutionary idea, surely no one has put it into practice... oh, Knuth expressed the same idea more than 30 years ago... how cute. Bye, bearophile
Jul 10 2008
parent "Nick Sabalausky" <a a.a> writes:
"bearophile" <bearophileHUGS lycos.com> wrote in message 
news:g54j46$2e05$1 digitalmars.com...
 Markus Koskimies:
 For me, it could even warn about indentation quirks, like:

 	...
 	if(a == b)
 		do_that();
 		do_that_also();
 	...

 ...In which case the compiler could stop and say, that either add {}'s or
 correct the indentation :)

Or maybe... I have a revolutionary idea: just express to the compiler what you mean once, not using two different means that (by mistake) may say conflicting things. Let's see... maybe just using indentation? This seems a revolutionary idea, surely no one has put it into practice... oh, Knuth expressed the same idea more than 30 years ago... how cute. Bye, bearophile

At the risk of reliving an old discussion... http://dobbscodetalk.com/index.php?option=com_myblog&show=Redundancy-in-Programming-Languages.html&Itemid=29

In the case of Python (I assume that's the same sort of behavior as the Knuth you mention), the whole point behind the way it handles scope/indentation was to correct the problem of source files that have improper indentation by actually enforcing proper indentation. That's a very worthy goal. But the problem is in the way it goes about it: Python doesn't enforce a damn thing with regard to indentation. It *can't* enforce proper indentation because it runs around assuming that the indentation it receives *is* the intended scope. So it can't enforce it just because it doesn't have the slightest idea what the proper indentation for a particular piece of code would be - that would require separating scope from indentation and... oh, yea, that's what C-based languages like D do.
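
To make that concrete, a minimal sketch in D (the helper names are 
borrowed from Markus's example above):

    void check(int a, int b)
    {
        if (a == b)
            do_that();
            do_that_also();   // indented as if guarded, but always runs

        if (a == b)
        {
            do_that();
            do_that_also();   // braces pin the scope down, so a checker
        }                     // could compare indentation against them
    }

    void do_that() { }
    void do_that_also() { }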
Jul 10 2008
prev sibling parent reply Sean Kelly <sean invisibleduck.org> writes:
== Quote from Walter Bright (newshound1 digitalmars.com)'s article
 Robert Fraser wrote:
 Walter Bright wrote:
 Robert Fraser wrote:
 The compiler already has full semantic knowledge of the code, and at
 least some of the warnings seem like "low-hanging fruit" so why not
 make the compiler act as a "mini-lint"?

Generally for the reasons already mentioned. Warnings are properly in the scope of static analysis tools, which have a different purpose than a compiler.

A compiler is not a documentation generator or a header generator, yet DMD does both (with some switches). Why not the same with lint-like functionality?

 Because what constitutes a proper warning is a very subjective issue; there is plenty of room for different ideas. If it was in the compiler, it would inhibit development of static analysis tools, and would confuse the issue of what was correct D code.

And regarding this particular issue, it's not uncommon to have unused
function parameters.  And while C++ allows them to be left out:

void fn( int ) {}

D does not.  A warning for this would be terribly annoying.

Sean
Jul 08 2008
parent reply "Jarrett Billingsley" <kb3ctd2 yahoo.com> writes:
"Sean Kelly" <sean invisibleduck.org> wrote in message 
news:g4vvtg$2237$1 digitalmars.com...

 And regarding this particular issue, it's not uncommon to have unused
 function parameters.  And while C++ allows them to be left out:

 void fn( int ) {}

 D does not.  A warning for this would be terribly annoying.

Are you sure about that? Cause that compiles and runs in D.
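
A minimal sketch of the case in question (behavior as reported here for 
the dmd of this era):

    void fn(int) { }   // parameter deliberately left unnamed

    void main()
    {
        fn(42);        // compiles and runs; an unnamed parameter gives
    }                  // an unused-variable check nothing to complain about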
Jul 08 2008
parent Sean Kelly <sean invisibleduck.org> writes:
== Quote from Jarrett Billingsley (kb3ctd2 yahoo.com)'s article
 "Sean Kelly" <sean invisibleduck.org> wrote in message
 news:g4vvtg$2237$1 digitalmars.com...
 And regarding this particular issue, it's not uncommon to have unused
 function parameters.  And while C++ allows them to be left out:

 void fn( int ) {}

 D does not.  A warning for this would be terribly annoying.


Really? It didn't used to :-) Sean
Jul 08 2008
prev sibling next sibling parent reply "Nick Sabalausky" <a a.a> writes:
"Walter Bright" <newshound1 digitalmars.com> wrote in message 
news:g4pplc$gno$1 digitalmars.com...
 Nick Sabalausky wrote:
 I don't suppose there's any chance of DMD getting a warning for 
 variables/arguments that are declared but never accessed? Just today 
 alone there's been two bugs I spent 10-30 minutes going nuts trying to 
 track down that turned out to be variables I had intended to use but 
 forgot to.

The problem with unused variable warnings is they are annoying when you're developing code in an iterative manner. They get in the way when you're commenting out sections of code to try and isolate a problem. They can be a problem when using "version" and "static if" statements. So, why not just turn off the warnings? The problem with warnings is that if there are n warnings, there are essentially n factorial different versions of the language.

I really don't see how this is, unless every compiler always has "treat warnings as errors" permanently enabled with no way to disable. That's like saying that using different lint tools, or different settings within the same lint tool, constitutes different versions of the same language, and then claiming that means we shouldn't use lint tools.
  If you're faced with compiling someone else's code (like you downloaded 
 it off the internet and have to compile it because it only is distributed 
 as source) and warnings go off, is that a bug in the code or not? What do 
 you do?

By definition, that's not an error. Even if it is a manifestation of a bug, that's no reason to deliberately be silent - bugs should be noisy. What do you do? Whatever you would normally do when you come across something in a program that you're not sure is right or not. It's not an issue that's specific (or particularly relevant, imho) to compiler warnings.
 Some shops have a "thou shall compile with warnings enabled, and there 
 shall be no warning messages."

That's a management problem, not a compiler design problem. (But I'm not saying that deliberately minimizing warning conditions is bad. I'm just saying that being strict enough that it becomes a problem, i.e. not balancing it with practical common sense, is just begging for trouble, and there's no reason we should be bending over backwards to help compensate for the bad management choices of certain teams.)
 That causes problems when you port the code to a different compiler with a 
 different, even contradictory, notion of what is a warning. So then you 
 wind up putting wacky things in the code just to get the compiler to shut 
 up about the warnings.

 Those kind of things tend to interfere with the beauty of the code, and 
 since they are not necessary to the program's logic, they tend to confuse 
 and misdirect the maintenance programmer (why is this variable pointlessly 
 referenced here? Why is this unreachable return statement here? Is this a 
 bug?)

Thus comments.
 There is a place for warnings, however. That is in a separate static 
 analysis tool (i.e. lint, coverity, etc.) which can be armed with all 
 kinds of heuristics with which to flag questionable constructs. I don't 
 think they should be part of the compiler, however.

Like I've said, compiler warnings are essentially a built-in lint tool. I see no reason to think of them any other way.
Jul 06 2008
next sibling parent reply Walter Bright <newshound1 digitalmars.com> writes:
Nick Sabalausky wrote:
 The problem with warnings is that if there are n warnings, there are 
 essentially n factorial different versions of the language.

 I really don't see how this is, unless every compiler always has "treat warnings as errors" permanently enabled with no way to disable. That's like saying that using different lint tools, or different settings within the same lint tool, constitutes different versions of the same language, and then claiming that means we shouldn't use lint tools.

If you have 10 warnings, each independently toggled on or off, then you have 10 factorial different languages. The difference between lint and a compiler is people know lint is not a compiler and do not worry about lint's complaints. Warnings in the compiler are treated, in reality, like programming errors.
  If you're faced with compiling someone else's code (like you downloaded 
 it off the internet and have to compile it because it only is distributed 
 as source) and warnings go off, is that a bug in the code or not? What do 
 you do?

 By definition, that's not an error. Even if it is a manifestation of a bug, that's no reason to deliberately be silent - bugs should be noisy.

I know, but that is NOT how they are perceived. People wonder if they downloaded it right, or if they downloaded the right version, they wonder if they should complain about it, they wonder if the program will work properly if compiled. It sucks.
 Some shops have a "thou shall compile with warnings enabled, and there 
 shall be no warning messages."

 That's a management problem, not a compiler design problem.

Management of programming teams is an important issue. There are a number of characteristics in D that try to make it easier for managers to manage the programmers. These are based on my conversations with many managers about the kinds of problems they face. I don't agree that these issues should be ignored and dismissed as just a management problem.
 Those kind of things tend to interfere with the beauty of the code, and 
 since they are not necessary to the program's logic, they tend to confuse 
 and misdirect the maintenance programmer (why is this variable pointlessly 
 referenced here? Why is this unreachable return statement here? Is this a 
 bug?)

 Thus comments.

I don't agree with relying on comments to make up for a language design that encourages confusing and misleading code to be written.
 Like I've said, compiler warnings are essentially a built-in lint tool. I see 
 no reason to think of them any other way.

I think you and I have had very different experiences with warnings!
Jul 08 2008
next sibling parent dennis luehring <dl.soluz gmx.net> writes:
is "walter bright" the name for a group of high experienced software 
developers and managers - or is an escaped us-army expert-system- experiment

because sometimes i am realy shocked how perfect your ideas fits 
developers(managers) daily needs

and i hope its not to hard for you to describe people, who are years 
behind your experience - and normaly don't understand the(or your) 
problem domain - why your ideas are better...

you are my how-it-should-work-brain-brother
its like a memo of my thinking each time i read your comments
thx very much for writing them down :-)
Jul 09 2008
prev sibling next sibling parent reply "Nick Sabalausky" <a a.a> writes:
"Walter Bright" <newshound1 digitalmars.com> wrote in message 
news:g51k8s$102f$1 digitalmars.com...
 The difference between lint and a compiler is people know lint is not a 
 compiler and do not worry about lint's complaints. Warnings in the 
 compiler are treated, in reality, like programming errors.

Ahh, now this appears to be the root of our differing opinions on this. I think I understand your reasoning behind this now, even though I still don't agree with it. It sounds like (previously unknown to me) there's a rift between the reality of warnings and the perceptions that many programmers (excluding us) have about warnings. As I understand it, you consider it more important to design around common perceptions of warnings, even if they're mistaken perceptions (such as warnings, by definition, not actually being errors). My disagreement is that I consider it better to design around the realities, and use a more education-based approach (I don't necessarily mean school) to address misperceptions. Is this a fair assessment of your stance, or am I still misunderstanding? If this is so, then our disagreement on this runs deeper than just the warnings themselves and exists on more of a "design-values" level, so I won't push this any further than to just simply note my disagreement.
Jul 09 2008
next sibling parent "Nick Sabalausky" <a a.a> writes:
"Nick Sabalausky" <a a.a> wrote in message 
news:g51qgu$1f63$1 digitalmars.com...
 "Walter Bright" <newshound1 digitalmars.com> wrote in message 
 news:g51k8s$102f$1 digitalmars.com...
 The difference between lint and a compiler is people know lint is not a 
 compiler and do not worry about lint's complaints. Warnings in the 
 compiler are treated, in reality, like programming errors.

Ahh, now this appears to be the root of our differing opinions on this. I think I understand your reasoning behind this now, even though I still don't agree with it. It sounds like (previously unknown to me) there's a rift between the reality of warnings and the perceptions that many programmers (excluding us) have about warnings. As I understand it, you consider it more important to design around common perceptions of warnings, even if they're mistaken perceptions (such as warnings, by definition, not actually being errors). My disagreement is that I consider it better to design around the realities, and use a more education-based approach (I don't necessarily mean school) to address misperceptions. Is this a fair assessment of your stance, or am I still misunderstanding? If this is so, then our disagreement on this runs deeper than just the warnings themselves and exists on more of a "design-values" level, so I won't push this any further than to just simply note my disagreement.

I'd also like to note one other thing... Umm, this might come across sounding harsh, so please understand I don't in any way intend it as any sort of personal or professional disrespect/insult/sarcasm/etc.: It's just that I've always seen lint tools as a sign of popular languages and compilers doing an insufficient job of catching easily-overlooked programming mistakes. (For instance, if I were going to use a language that allows implicit variable declarations (makes hidden mistakes easy), *and* there was no way to prevent the compiler/interpreter from remaining silent about it when it happened (a mere band-aid in the case of the implicit declaration problem, but a very welcome band-aid nonetheless), then I would grumble about it and try to find a lint tool that plugged that bug-hole. This, of course, goes back to the "good/bad redundancy in language design" point that you've made.)
Jul 09 2008
prev sibling next sibling parent reply Bill Baxter <dnewsgroup billbaxter.com> writes:
Nick Sabalausky wrote:
 "Walter Bright" <newshound1 digitalmars.com> wrote in message 
 news:g51k8s$102f$1 digitalmars.com...
 The difference between lint and a compiler is people know lint is not a 
 compiler and do not worry about lint's complaints. Warnings in the 
 compiler are treated, in reality, like programming errors.

Ahh, now this appears to be the root of our differing opinions on this. I think I understand your reasoning behind this now, even though I still don't agree with it. It sounds like (previously unknown to me) there's a rift between the reality of warnings and the perceptions that many programmers (excluding us) have about warnings. As I understand it, you consider it more important to design around common perceptions of warnings, even if they're mistaken perceptions (such as warnings, by definition, not actually being errors). My disagreement is that I consider it better to design around the realities, and use a more education-based approach (I don't necessarily mean school) to address misperceptions. Is this a fair assessment of your stance, or am I still misunderstanding? If this is so, then our disagreement on this runs deeper than just the warnings themselves and exists on more of a "design-values" level, so I won't push this any further than to just simply note my disagreement.

I think Walter is right here too. With Microsoft compilers warnings are so copious that they become almost useless. They warn about piles of trivial things that only have a remote possibility of being a bug. So you end up just ignoring them, and in that case they might as well not be there. It's just annoying. I think the problem is that the compiler writers have this attitude that they can be "helpful" by warning about anything that possibly could be a bug, even if it's going to have 100 times more false positives than real hits. That's not a good way to do warnings.

By making warnings either off or fatal like D, you force the compiler writers to actually think long and hard about whether the warning they're thinking to add is really so likely to be a bug that they should force the user to change the code. If it's fairly likely that the coder actually knows what they're doing, then that really doesn't justify the compiler issuing the warning. A lint tool fine, but not the compiler.

+1 votes for Walter :-)

--bb
Jul 09 2008
parent Bruno Medeiros <brunodomedeiros+spam com.gmail> writes:
Bill Baxter wrote:
 Nick Sabalausky wrote:
 "Walter Bright" <newshound1 digitalmars.com> wrote in message 
 news:g51k8s$102f$1 digitalmars.com...
 The difference between lint and a compiler is people know lint is not 
 a compiler and do not worry about lint's complaints. Warnings in the 
 compiler are treated, in reality, like programming errors.

 Ahh, now this appears to be the root of our differing opinions on this. [...] If this is so, then our disagreement on this runs deeper than just the warnings themselves and exists on more of a "design-values" level, so I won't push this any further than to just simply note my disagreement.

I think Walter is right here too. With Microsoft compilers warnings are so copious that they become almost useless. They warn about piles of trivial things that only have a remote possibility of being a bug. So you end up just ignoring them, and in that case they might as well not be there. It's just annoying. I think the problem is that the compiler writers have this attitude that they can be "helpful" by warning about anything that possibly could be a bug, even if it's going to have 100 times more false positives than real hits. That's not a good way to do warnings.

But Visual Studio had the option to disable *specific* warnings either globally (in the IDE), or locally (in source code with pragma statements). So in my experience with VS C++, even though I did find several types of warnings which were fairly useless, I simply disabled those kinds of warnings globally, keeping all the others. So I don't see a problem here.
 By making warnings either off or fatal like D, you force the compiler 
 writers to actually think long and hard about whether the warning 
 they're thinking to add is really so likely to be a bug that they should 
 force the user to change the code.  If it's fairly likely that the coder 
 actually knows what they're doing, then that really doesn't justify the 
 compiler issuing the warning.  A lint tool fine, but not the compiler.
 
 +1 votes for Walter  :-)
 
 --bb

And because I don't see a problem, I also don't find the need for "making warnings either off or fatal like D", thus denying the use case where you want the compiler to report a code situation which is not necessarily "fairly likely" to be a bug, but is still relevant enough to report to the coder, for whatever reason.

-- 
Bruno Medeiros - Software Developer, MSc. in CS/E graduate
http://www.prowiki.org/wiki4d/wiki.cgi?BrunoMedeiros#D
Jul 27 2008
prev sibling next sibling parent reply Walter Bright <newshound1 digitalmars.com> writes:
Nick Sabalausky wrote:
 It sounds like (previously unknown to me) there's a rift between the reality 
 of warnings and the perceptions that many programmers (excluding us) have 
 about warnings. As I understand it, you consider it more important to design 
 around common perceptions of warnings, even if they're mistaken perceptions 
 (such as warnings, by definition, not actually being errors). My 
 disagreement is that I consider it better to design around the realities, 
 and use a more education-based approach (I don't necessarily mean school) to 
 address misperceptions. Is this a fair assessment of your stance, or am I 
 still misunderstanding?

It's a fair assessment. I give more weight to designing a language around the way programmers are and the way they tend to work, rather than trying to force them to adapt to the language.

As for the needs of programming managers, I think D is the only language that has attempted to address those needs. At least I've never ever heard of any other language even acknowledge the existence of such needs.
Jul 09 2008
parent reply "Nick Sabalausky" <a a.a> writes:
"Walter Bright" <newshound1 digitalmars.com> wrote in message 
news:g51uc7$1let$1 digitalmars.com...
 Nick Sabalausky wrote:
 It sounds like (previously unknown to me) there's a rift between the 
 reality of warnings and the perceptions that many programmers (excluding 
 us) have about warnings. As I understand it, you consider it more 
 important to design around common perceptions of warnings, even if 
 they're mistaken perceptions (such as warnings, by definition, not 
 actually being errors). My disagreement is that I consider it better to 
 design around the realities, and use a more education-based approach (I 
 don't necessarily mean school) to address misperceptions. Is this a fair 
 assessment of your stance, or am I still misunderstanding?

 It's a fair assessment. I give more weight to designing a language around the way programmers are and the way they tend to work, rather than trying to force them to adapt to the language.

The way I program, I tend to run into situations such as the two Koroskin Denis pointed out. Unless this hypothetical D lint tool actually ends up materializing, I'm forced to adapt to a compiler that refuses to let me know about a condition that I *want* to know about.

Someone else in this thread just mentioned that DMD's warnings are always treated as errors, instead of only being treated as errors with a "warnings as errors" switch. I wasn't aware of this. That approach *certainly* confuses the issue of "warning" vs. "error" and creates what are effectively multiple languages (and, as other people pointed out, makes such "'warnings'-but-not-really-true-warnings" useless when using outside source libraries).

(If you're wondering how I could have not known DMD treats warnings as errors, since I'm obviously so pro-warning that I would certainly be using the -w switch, it's because at the moment I seem to be having trouble getting DMD 1.029 to emit any warnings, even when deliberately trying to trigger the ones it's supposed to support. *But* for all I know right now this may be a rebuild or IDE issue; I haven't had a chance to look into it yet.)
 As for the needs of programming managers, I think D is the only language 
 that has attempted to address those needs. At least I've never ever heard 
 of any other language even acknowledge the existence of such needs.

If there's a legitimate need that programming managers have that can be met by a compiler without creating any problems for the actual programmers, then I'm all for it. But when there's a "programming manager" that's steadfast about "all warnings must always be treated as errors", *BUT* refuses to be practical about it and entertain any notion that there may actually be some warnings that are NOT really problems (in other words, "delusional" by the very definition of the word), then said "programming manager" is clearly incompetent and by no means should be indulged. That's like creating a programming language where 2 + 2 equals 7, just because you find out that there are "programmers" who are incompetent enough to insist that 2 + 2 really does equal 7.
Jul 09 2008
parent reply Walter Bright <newshound1 digitalmars.com> writes:
The reason for treating warnings as errors when warnings are enabled is 
so that, for a long build, they don't scroll up and off your screen and 
go unnoticed.
Jul 09 2008
next sibling parent reply "Steven Schveighoffer" <schveiguy yahoo.com> writes:
"Walter Bright" wrote
 The reason for treating warnings as errors when warnings are enabled is so 
 that, for a long build, they don't scroll up and off your screen and go 
 unnoticed.

I've been following this thread, and I'm not really sure which side of the issue I'm on, but this, sir, is one of the worst explanations for a feature. Ever heard of 'less'? or 'more' on Windows? Maybe piping to a file? Maybe using an IDE that stores all the warnings/errors for you? Please stop saving poor Mr. ignorant programmer from himself. Education is the key to solving this problem, not catering to the ignorance. Sorry for the harshness, but seriously! -Steve
Jul 09 2008
next sibling parent reply BCS <ao pathlink.com> writes:
Reply to Steven,

 "Walter Bright" wrote
 
 The reason for treating warnings as errors when warnings are enabled
 is so that, for a long build, they don't scroll up and off your
 screen and go unnoticed.
 

 I've been following this thread, and I'm not really sure which side of the issue I'm on, but this, sir, is one of the worst explanations for a feature. Ever heard of 'less'? or 'more' on Windows? Maybe piping to a file? Maybe using an IDE that stores all the warnings/errors for you? Please stop saving poor Mr. ignorant programmer from himself. Education is the key to solving this problem, not catering to the ignorance. Sorry for the harshness, but seriously! -Steve

I think grep is more useful there. I have a few builds that have several pages of output on a successful build and I sometimes miss even the errors. (I once had an error that didn't even fit in the scroll back buffer, but that was just a bit nuts ;)
Jul 09 2008
parent reply TomD <t_demmern.ospam web.de> writes:
BCS Wrote:
[...]
 
 I think grep is more useful there. I have a few builds that have several 
 pages of output on a successful build and I sometimes miss even the errors.
 
 (I once had an error that didn't even fit in the scroll back buffer, but 
 that was just a bit nuts ;)

That's why I think having a "tee" is even better :-) Anyway, you cannot do anything useful under Windows unless it is in a Cygwin bash... Ciao Tom
Jul 09 2008
parent reply BCS <ao pathlink.com> writes:
Reply to tomD,

 BCS Wrote:
 [...]
 I think grep is more useful there. I have a few builds that have
 several pages of output on a successful build and I sometimes miss
 even the errors.
 
 (I once had an error that didn't even fit in the scroll back buffer,
 but that was just a bit nuts ;)
 

 That's why I think having a "tee" is even better :-) Anyway, you cannot do anything useful under Windows unless it is in a Cygwin bash...

<sarcastic>No? You can't?!</sarcastic> <G/> Your right and Oooh do i feel it now and again.
Jul 09 2008
parent reply Bill Baxter <dnewsgroup billbaxter.com> writes:
BCS wrote:
 Reply to tomD,
 
 BCS Wrote:
 [...]
 I think grep is more useful there. I have a few builds that have
 several pages of output on a successful build and I sometimes miss
 even the errors.

 (I once had an error that didn't even fit in the scroll back buffer,
 but that was just a bit nuts ;)

Anyway, you cannot do anything useful under Windows unless it is in a Cygwin bash...

<sarcastic>No? You can't?!</sarcastic> <G/> Your right and Oooh do i feel it now and again.

I agree Cygwin is nice, but go get yourself the gnuwin32 tools. Then you'll be able to use all your favorite unix commands (like 'tee') from the dos box. Makes it much less painful. And get Console2 also. --bb
Jul 09 2008
parent BCS <ao pathlink.com> writes:
Reply to Bill,

 BCS wrote:
 
 Reply to tomD,
 
 BCS Wrote:
 [...]
 I think grep is more useful there. I have a few builds that have
 several pages of output on a successful build and I sometimes miss
 even the errors.
 
 (I once had an error that didn't even fit in the scroll back
 buffer, but that was just a bit nuts ;)
 

 Anyway, you cannot do anything useful under Windows unless it is in a Cygwin bash...

 You're right and Oooh do i feel it now and again.

 I agree Cygwin is nice, but go get yourself the gnuwin32 tools. Then you'll be able to use all your favorite unix commands (like 'tee') from the dos box. Makes it much less painful. And get Console2 also. --bb

six of one, half a dozen of the other - just Give Me My Linux CLI Tools!
Jul 09 2008
prev sibling parent reply superdan <super dan.org> writes:
Steven Schveighoffer Wrote:

 "Walter Bright" wrote
 The reason for treating warnings as errors when warnings are enabled is so 
 that, for a long build, they don't scroll up and off your screen and go 
 unnoticed.

I've been following this thread, and I'm not really sure which side of the issue I'm on, but this, sir, is one of the worst explanations for a feature. Ever heard of 'less'? or 'more' on Windows? Maybe piping to a file? Maybe using an IDE that stores all the warnings/errors for you? Please stop saving poor Mr. ignorant programmer from himself. Education is the key to solving this problem, not catering to the ignorance. Sorry for the harshness, but seriously!

in c++ this kind of argument that contains "it's an issue of education and shit" in it has been used for many years. after a lot of experience in the field nowadays everyone silently agrees that that argument is useless. folks on comp.lang.c++ start mocking you if u bring that argument up. i am 110% on walter's side on this shit. there should be no warnings and shit. only errors. it is not catering to the ignorant. it is a matter of a properly defined language. a lint tool should not be folded into d. such a tool could e.g. follow pointers, do virtual execution, and some other weird shit. it could run for hours and produce output that takes an expert to interpret. that kind of shit does not belong in the compiler.
Jul 09 2008
next sibling parent reply "Steven Schveighoffer" <schveiguy yahoo.com> writes:
"superdan" wrote
 Steven Schveighoffer Wrote:

 "Walter Bright" wrote
 The reason for treating warnings as errors when warnings are enabled is 
 so
 that, for a long build, they don't scroll up and off your screen and go
 unnoticed.

I've been following this thread, and I'm not really sure which side of the issue I'm on, but this, sir, is one of the worst explanations for a feature. Ever heard of 'less'? or 'more' on Windows? Maybe piping to a file? Maybe using an IDE that stores all the warnings/errors for you? Please stop saving poor Mr. ignorant programmer from himself. Education is the key to solving this problem, not catering to the ignorance. Sorry for the harshness, but seriously!

in c++ this kind of argument that contains "it's an issue of education and shit" in it has been used for many years. after a lot of experience in the field nowadays everyone silently agrees that that argument is useless. folks on comp.lang.c++ start mocking you if u bring that argument up. i am 110% on walter's side on this shit. there should be no warnings and shit. only errors. it is not catering to the ignorant. it is a matter of a properly defined language.

I think you missed my point. Walter's position on warnings being errors (mind you, not by default, only when the -w (show me the warnings) switch is applied) is that people run out of screen space. To me, that's just plain silly as an argument. If you're gonna have warnings, which aren't considered errors by default, at least have it possible to configure so the compiler doesn't error out on the 1st warning.

By education I mean, tell the ignorant programmer how to use his shell to pipe the warnings into a paged format, or to a file, or whatever. Don't hinder the knowledgeable programmers who want to have everything at once.

Regarding whether warnings should be in a lint tool or not, I'm undecided on the issue, as I have been hit by both sides (too many useless warnings, or gee it would have been nice for the compiler to tell me I did this wrong).

-Steve
Jul 09 2008
parent superdan <super dan.org> writes:
Steven Schveighoffer Wrote:

 "superdan" wrote
 Steven Schveighoffer Wrote:

 "Walter Bright" wrote
 The reason for treating warnings as errors when warnings are enabled is 
 so
 that, for a long build, they don't scroll up and off your screen and go
 unnoticed.

I've been following this thread, and I'm not really sure which side of the issue I'm on, but this, sir, is one of the worst explanations for a feature. Ever heard of 'less'? or 'more' on Windows? Maybe piping to a file? Maybe using an IDE that stores all the warnings/errors for you? Please stop saving poor Mr. ignorant programmer from himself. Education is the key to solving this problem, not catering to the ignorance. Sorry for the harshness, but seriously!

in c++ this kind of argument that contains "it's an issue of education and shit" in it has been used for many years. after a lot of experience in the field nowadays everyone silently agrees that that argument is useless. folks on comp.lang.c++ start mocking you if u bring that argument up. i am 110% on walter's side on this shit. there should be no warnings and shit. only errors. it is not catering to the ignorant. it is a matter of a properly defined language.

I think you missed my point. Walter's position on warnings being errors (mind you, not by default, only when the -w (show me the warnings) switch is applied) is that people run out of screen space. To me, that's just plain silly as an argument. If you're gonna have warnings, which aren't considered errors by default, at least have it possible to configure so the compiler doesn't error out on the 1st warning.

yarp i also didn't exactly get high on walter's argument.
 By education I mean, tell the ignorant programmer how to use his shell to 
 pipe the warnings into a paged format, or to a file, or whatever.  Don't 
 hinder the knowledgeable programmers who want to have everything at once.

fair enough. by the way i'm with that gun zealot (what's his name) that good shit should output exactly nothing on success.
 Regarding whether warnings should be in a lint tool or not, I'm undecided on 
 the issue, as I have been hit by both sides (too many useless warnings, or 
 gee it would have been nice for the compiler to tell me I did this wrong).

that's a good argument that there should be no two ways about it. walter, make all warnings errors without -w and get rid of -w.
Jul 09 2008
prev sibling parent reply "Nick Sabalausky" <a a.a> writes:
"superdan" <super dan.org> wrote in message 
news:g53831$20jk$1 digitalmars.com...
 Steven Schveighoffer Wrote:

 "Walter Bright" wrote
 The reason for treating warnings as errors when warnings are enabled is 
 so
 that, for a long build, they don't scroll up and off your screen and go
 unnoticed.

I've been following this thread, and I'm not really sure which side of the issue I'm on, but this, sir, is one of the worst explanations for a feature. Ever heard of 'less'? or 'more' on Windows? Maybe piping to a file? Maybe using an IDE that stores all the warnings/errors for you? Please stop saving poor Mr. ignorant programmer from himself. Education is the key to solving this problem, not catering to the ignorance. Sorry for the harshness, but seriously!

in c++ this kind of argument that contains "it's an issue of education and shit" in it has been used for many years. after a lot of experience in the field nowadays everyone silently agrees that that argument is useless. folks on comp.lang.c++ start mocking you if u bring that argument up.

That's probably because over the past ten years, the people who care more about doing things the right way than catering to the status quo have been leaving C++ en masse (hence, D). It's no surprise that the people still remaining onboard C++ are either A. people who hold that particular viewpoint or B. people who are required to use C++ for some reason and have long since gotten used to the fact that C++ is never going to fix most of its problems. So I wouldn't place too much weight on the "comp.lang.c++" take on this particular issue; their consensus is likely just a reflection of group dynamics.
 i am 110% on walter's side on this shit. there should be no warnings and 
 shit. only errors. it is not catering to the ignorant. it is a matter of a 
 properly defined language.

That's right, no true warnings, but just a handful of what are in effect "optional errors". In a "properly defined language", how would you solve the problem of unintentionally-unused variables? Adopt the "unused" keyword that Koroskin Denis proposed and say that an unused var without the unused keyword is an error, and accessing a var that does have the unused keyword is also an error? That sounded good to me at first, but then I realized: What happens when you're in the middle of an implementation and you stick the "unused" keyword on a variable in a function that you've only partially implemented, just because you want to test the partial implementation? Then you fix any problems, get distracted by something else, and forget to finish (it happens more than you may think). Well great, now that wonderful compiles/errors dichotomy has just *created* a hole for that bug to slip in, whereas a real warning (the true kind, not the "warnings as errors" kind) would have caught it.

So how else could a "properly defined language" solve it? Just simply treat it as a non-error as it is now and be done with it? That turns potentially-noisy errors into silent errors, which is one of the biggest design mistakes of all. Any other suggestions on how to "properly design" a fix for that? If it works, I'd be all for it.

Suppose that does get fixed. Now, when some other common gotcha is discovered in a language, or a particular version of a language, that's had a design freeze (like D1), then what do you do? Stick to your "warnings are bad" guns and just leave everyone tripping over the gotcha in the dark, maybe hoping that someone else could come along and create a lint tool that would do the job that you could have already done?

Designing everything to fit into a compiles/errors dichotomy is great, in theory. But in practice it's just unrealistic. Even Walter ended up having to add a few "warnings" to D (even if he implemented them more as optional errors than as true warnings). Which is why, as I was saying in the beginning, trying to eliminate the need for a specific warning is great - *if* it actually pans out. But that doesn't always happen.
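To make the unused-marker failure mode above concrete, here's a sketch (the hypothetical suppression marker is spelled as a cast(void) statement, which D accepts; "computeTotal" is an invented name):

int computeTotal(int[] data)
{
    int total;
    cast(void)total; // stand-in for the hypothetical "unused" marker,
                     // added only to test the half-finished function
    foreach (x; data)
    {
        // ...got distracted here; "total += x;" never gets written...
    }
    return total; // always 0, and with the marker in place nothing ever nags
}

Under a strict compiles/errors dichotomy the marker silences the check permanently; a true warning would keep nagging until the marker came back out.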
 a lint tool should not be folded into d. such a tool could e.g. follow 
 pointers, do virtual execution, and some other weird shit. it could run 
 for hours and produce output that takes an expert to interpret. that kind 
 of shit does not belong in the compiler.

Anything like that can be attached to an optional command-line parameter that defaults to "off". Problem solved.
Jul 09 2008
parent reply superdan <super dan.org> writes:
Nick Sabalausky Wrote:

 "superdan" <super dan.org> wrote in message 
 news:g53831$20jk$1 digitalmars.com...
 Steven Schveighoffer Wrote:

 "Walter Bright" wrote
 The reason for treating warnings as errors when warnings are enabled is 
 so
 that, for a long build, they don't scroll up and off your screen and go
 unnoticed.

I've been following this thread, and I'm not really sure which side of the issue I'm on, but this, sir, is one of the worst explanations for a feature. Ever heard of 'less'? Or 'more' on Windows? Maybe piping to a file? Maybe using an IDE that stores all the warnings/errors for you? Please stop saving poor Mr. ignorant programmer from himself. Education is the key to solving this problem, not catering to ignorance. Sorry for the harshness, but seriously!

in c++ this kind of argument that contains "it's an issue of education and shit" in it has been used for many years. after a lot of experience in the field nowadays everyone silently agrees that that argument is useless. folks on comp.lang.c++ start mocking you if you bring that argument up.

That's probably because over the past ten years, the people who care more about doing things the right way than catering to the status quo have been leaving C++ en masse (hence, D). It's no surprise that the people still remaining onboard C++ are either A. people who hold that particular viewpoint or B. people who are required to use C++ for some reason and have long since gotten used to the fact that C++ is never going to fix most of its problems. So I wouldn't place too much weight on the "comp.lang.c++" take on this particular issue; their consensus is likely just a reflection of group dynamics.

group was given as an example. the thing is it has become clear to the luminaries that invoking better education is not an answer. it is clear from the literature and also from c++0x.
 i am 110% on walter's side on this shit. there should be no warnings and 
 shit. only errors. it is not catering to the ignorant. it is a matter of a 
 properly defined language.

That's right, no true warnings, but just a handful of what are in effect "optional errors". In a "properly defined language", how would you solve the problem of unintentionally-unused variables?

first i'd stop bitching why oh why the language does not build that shit in. that would be a great start. give me my fucking soapbox again. there. thanks. too many people around here are trigger happy about changing the language. (next breath they yell they want stability.) has nothing to do with you but reminds me of shit goin' on here in this group.

moron: "d has no bitfields. somehow in my fucking world bitfields are so essential, i can't fucking live without them. hence i can't use d. give me bitfields and i'll give you my girlfriend."

months go by.

walter: "here, there are perfectly functional bitfields in std.bitmanip. they're more flexible and more rigorously defined than in fucking c. you can count on'em."

moron: "don't like the syntax. still won't use d. i want them in the language. put them in the language and i'll use d."
 Adopt the "unused" keyword that Koroskin Denis proposed and say that an 
 unused var without the unused keyword is an error, and accessing a var that 
 does have the unused keyword is also an error?

once i stop bitching i get a clearer mind and i get to write some shit like this.

void vacuouslyUse(T)(ref T x) {}

void foo()
{
   int crap;
   vacuouslyUse(crap);
   ................
}

use and remove as you wish.
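if that went into phobos it could just as well swallow several names at once. a sketch with template tuples (hypothetical signature, nothing standard):

void vacuouslyUse(T...)(T xs) {}

void bar()
{
    int crap;
    char[] moreCrap;
    vacuouslyUse(crap, moreCrap); // both now count as "used"
}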
 That sounded good to me at 
 first but then I realized: What happens when you're in the middle of an 
 implementation and you stick the "unused" keyword on a variable in a 
 function that you've only partially implemented just because you want to 
 test the partial implementation. Then you fix any problems, get distracted 
 by something else, and forget to finish (it happens more than you may 
 think). Well great, now that wonderful compiles/errors dichotomy has just 
 *created* a hole for that bug to slip in, whereas a real warning (the true 
 kind, not the "warnings as errors" kind) would have caught it.

unused name should be an error. if you want to not use something, you must sweat a little. vacuouslyUse fits the bill exactly. should be in phobos.
 So how else could a "properly defined language" solve it? Just simply treat 
 it as a non-error as it is now and be done with it? That turns 
 potentially-noisy errors into silent errors which is one of the biggest 
 design mistakes of all.
 
 Any other suggestions on how to "properly design" a fix for that? If it 
 works, I'd be all for it.

it works but i kinda doubt you'll be all for it. you don't want to solve the unused variable problem. you want compiler warnings. somehow you'll work your argument out to make my solution undesirable.
 Suppose that does get fixed. Now, when some other common gotcha is 
 discovered in a language, or a particular version of a language, that's had 
 a design freeze (like D1), then what do you do? Stick to your "warnings are 
 bad" guns and just leave everyone tripping over the gotcha in the dark, 
 maybe hoping that someone else could come along and create a lint tool that 
 would do the job that you could have already done?

this is an imperfect world. i see value in the no-warning stance. you don't see. therefore when competition in the d compiler arena picks up i'd see a warning as a shitty concession, while you will grin "i told ya all along".
 Designing everything to fit into a compiles/errors dichotomy is great, in 
 theory. But in practice it's just unrealistic. Even Walter ended up having 
 to add a few "warnings" to D (even if he implemented them more as optional 
 errors than as true warnings). Which is why, as I was saying in the 
 beginning, trying to eliminate the need for a specific warning is great - 
 *if* it actually pans out. But that doesn't always happen.

when doesn't it happen?
 a lint tool should not be folded into d. such a tool could e.g. follow 
 pointers, do virtual execution, and some other weird shit. it could run 
 for hours and produce output that takes an expert to interpret. that kind 
 of shit does not belong in the compiler.

Anything like that can be attached to an optional command-line parameter that defaults to "off". Problem solved.

weak argument. a good program does some shit and does it well. i'm pissed that emacs can browse the web already, alright?
Jul 09 2008
next sibling parent Walter Bright <newshound1 digitalmars.com> writes:
superdan wrote:
 walter: "here, there are perfectly functional bitfields in
 std.bitmanip. they're more flexible and more rigorously defined than
 in fucking c. you can count on'em."

I'd like to take credit for std.bitmanip, but it's Andrei's design and effort.
Jul 09 2008
prev sibling parent reply "Nick Sabalausky" <a a.a> writes:
"superdan" <super dan.org> wrote in message 
news:g53ms5$h6n$1 digitalmars.com...
 group was given as an example. the thing is it has become clear to the 
 luminaries that invoking better education is not an answer. it is clear 
 from the literature and also from c++ oh ecs.

 once i stop bitching i get a clearer mind and I get to write some shit 
 like this.

 void vacuouslyUse(T)(ref T x) {}

 void foo()
 {
    int crap;
    vacuouslyUse(crap);
    ................
 }

 use and remove as you wish.

 unused name should be an error. if you want to not use something, you must 
 sweat a little. vacuouslyUse fits the bill exactly. should be in phobos.

I would still prefer it to be a warning (that way it would keep nagging me when I forget to finish up and take out the temporary vacuouslyUse), but at this point I could live with this compromise. It would certainly be a lot better than the total silence it gives me now.
 it works but i kinda doubt you'll be all for it. you don't want to solve 
 the unused variable problem. you want compiler warnings. somehow you'll 
 work your argument out to make my solution undesirable.


 this is an imperfect world. i see value in the no warning stance. you 
 don't see.

I see value in warnings, you don't see. The imperfect-world fact only serves to illustrate that taking sound practical advice ("the need for warnings should be minimized") to a unilateral extreme ("all warnings are always bad") just doesn't typically work out. Remember when the Java folks were trying to tell us that nothing should ever be non-OO?
 therefore when competition in d compilers arena will pick up i'd see a 
 warning as a shitty concession, while you will grin "i told ya all along".

I'm well aware of the difference between truth and popular opinion.
 Designing everything to fit into a compiles/errors dichotomy is great, in
 theory. But in practice it's just unrealistic. Even Walter ended up 
 having
 to add a few "warnings" to D (even if he implemented them more as 
 optional
 errors than as true warnings). Which is why, as I was saying in the
 beginning, trying to eliminate the need for a specific warning is great -
 *if* it actually pans out. But that doesn't always happen.

when doesn't it happen?

As just a few examples: http://www.digitalmars.com/d/1.0/warnings.html
 Anything like that can be attached to an optional command-line parameter
 that defaults to "off". Problem solved.

weak argument. a good program does some shit and does it well. i'm pissed that emacs can browse the web already, alright?

Trying to convince a Unix-hater of something by appealing to Unix-values is kinda like using the bible to convince an atheist of something. But, I'm well aware that debating the merits of Unix-philosophy to a Unix-fan is equally fruitless, so I'm going to leave this particular point at that.
Jul 09 2008
parent reply superdan <super dan.org> writes:
Nick Sabalausky Wrote:

 As just a few examples:
 http://www.digitalmars.com/d/1.0/warnings.html

yarp i'm so happy you sent those. let's take'em 1 by 1. please let me know agree or disagree.

1. warning - implicit conversion of expression expr of type type to type can cause loss of data

it's a shame this is allowed at all. any conversion that involves a loss must require a cast right there. as far as the example given goes:

byte a, b;
byte c = a + b;

compiler can't know a + b is in byte range so a cast is good. but take this now:

byte c = a & b;

in this case the compiler must accept the code. so what i'm saying is that better operator types will help a ton.

2. warning - array 'length' hides other 'length' name in outer scope

i seem to recall andrei pissed on this one until it dissolved into the fucking ground. can't agree more. it is a crying shame that this stupid length thing is still in the language. just get rid of it already.

3. warning - no return at end of function

now what a sick decision was it to accept that in the first place. an overwhelming percentage of functions *can* and *will* be written to have a meaningful return at the end. then why the fuck cater for the minority and hurt everybody else. just require a return or throw and call it a day. people who can't return something meaningful can just put a throw. code growth is negligible. impact on speed is virtually nil. why the hell do we even bother arguing over it.

4. warning - switch statement has no default

another example of a motherfuck. just require total coverage. in closed-set cases i routinely write anyway:

switch (crap)
{
case a: ...; break;
case b: ...; break;
default: assert(crap == c); ...; break;
}

again: vast majority of code already has a default. the minority just has to add a little code. make it an error.

5. warning - statement is not reachable

this is a tad more messy. people routinely insert a premature return in there to check for stuff. it pisses me off when that won't compile. i discovered i could do this:

if (true) return crap;

that takes care of the error. and i think it's actually good for me because it really is supposed to be temporary code. it jumps at me in a review. as it should.
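to put item 1 in code, this is what the proposed rule would accept and reject (a sketch; the comments describe the proposal, not what dmd does today):

void demo()
{
    byte a, b;
    // byte c0 = a + b;          // proposed: error, a + b promotes to int
    byte c1 = cast(byte)(a + b); // fine: the loss is spelled out at the site
    byte c2 = a & b;             // proposed: ok as-is, a & b always fits a byte
}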
Jul 10 2008
parent reply Don <nospam nospam.com.au> writes:
superdan wrote:
 Nick Sabalausky Wrote:
 
 As just a few examples:
 http://www.digitalmars.com/d/1.0/warnings.html

 yarp i'm so happy you sent those. let's take'em 1 by 1. please let me know agree or disagree.
 
 1. warning - implicit conversion of expression expr of type type to type can cause loss of data
 
 it's a shame this is allowed at all. any conversion that involves a loss must require a cast right there. as far as the example given goes:
 
 byte a, b;
 byte c = a + b;
 
 compiler can't know a + b is in byte range so a cast is good. but take this now:
 
 byte c = a & b;
 
 in this case the compiler must accept the code. so what i'm saying is that better operator types will help a ton.

That's in bugzilla. http://d.puremagic.com/issues/show_bug.cgi?id=1257 That whole area needs to be tidied up. Polysemous types should really help with this.
 3. warning - no return at end of function
 
 now what a sick decision was it to accept that in the first place. an
overwhelming percentage of functions *can* and *will* be written to have a
meaningful return at the end. then why the fuck cater for the minority and hurt
everybody else. just require a return or throw and call it a day. people who
can't return something meaningful can just put a throw. code growth is
negligible. impact on speed is virtually nil. why the hell do we even bother
arguing over it.

Yup. return should be required, unless the function contains inline asm. Otherwise manually put assert(0); on the last line.
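For example, a minimal sketch of the pattern:

int sign(int x)
{
    if (x > 0) return 1;
    if (x < 0) return -1;
    if (x == 0) return 0;
    assert(0); // explicit: falling off the end would be a bug
}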
 4. warning - switch statement has no default
 
 another example of a motherfuck. just require total coverage. in closed-set
cases i routinely write anyway:
 
 switch (crap) 
 {
 case a: ...; break;
 case b: ...; break;
 default: assert(crap == c); ...; break;
 }
 
 again: vast majority of code already has a default. the minority just has to
add a little code. make it an error.

Yup. Make it an error.
 
 5. warning - statement is not reachable
 
 this is a tad more messy. people routinely insert a premature return in there
to check for stuff. it pisses me off when that won't compile. i discovered i
could do this:
 
 if (true) return crap;
 
 that takes care of the error. and i think it's actually good for me because it
really is supposed to be temporary code. it jumps at me in a review. as it
should.

You can also put assert(0); at the top of the unreachable code.

2, 3, and 4 should definitely be errors.

I also think that uninitialised class variables should be a compile-time error. It's a horrible newbie trap, especially for anyone with a C++ background:
-------------
import std.stdio;

class C {
   void hello() { writefln("Hello crashing world"); }
}

void main()  {
  C c;  // should be illegal
  c.hello();
}
--------------
My first D program using classes was somewhat like that; took me ages to work out why it was segfaulting at runtime. It's still the most common mistake I make.

You should have to write C c = null; for the rare cases where you really want an uninitialised class.
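The fix, for contrast, is one changed line (reusing the class C above):

void main()  {
  C c = new C();  // allocate an instance; c no longer holds null
  c.hello();      // prints instead of segfaulting
}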
Jul 11 2008
next sibling parent reply superdan <super dan.org> writes:
Don Wrote:

 superdan wrote:
 Nick Sabalausky Wrote:
 
 As just a few examples:
 http://www.digitalmars.com/d/1.0/warnings.html

 yarp i'm so happy you sent those. let's take'em 1 by 1. please let me know agree or disagree.
 
 1. warning - implicit conversion of expression expr of type type to type can cause loss of data
 
 it's a shame this is allowed at all. any conversion that involves a loss must require a cast right there. as far as the example given goes:
 
 byte a, b;
 byte c = a + b;
 
 compiler can't know a + b is in byte range so a cast is good. but take this now:
 
 byte c = a & b;
 
 in this case the compiler must accept the code. so what i'm saying is that better operator types will help a ton.

That's in bugzilla. http://d.puremagic.com/issues/show_bug.cgi?id=1257

cool that's great. just a nit now. you mention only logical operations there. (actually you meant bitwise operations.) but i got to thinking a bit and a few integer arithmetic operations also should be included. a / b is never larger than a (cept for signed/unsigned mixed shit). a % b is never larger than b (again save for same shit). this could go a long way making casts unnecessary. as a consequence the compiler could tighten its sphincters and become more strict about implicit casts & shit.

someone else also mentioned a < b which is fucked for mixed signs. all ordering comparisons like that are fucked and should be disabled. only == and != work for mixed signs. for the rest, cast must be required. of course if one is constant there may be no need. i have no idea on what to do about a + b with mixed signs. it's messed up like shit.
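here's the mixed-sign trap in a few lines. this is what the c-style conversion does today, and why the cast should be forced:

void main()
{
    int  a = -1;
    uint b = 1;
    // a is implicitly converted to uint, so -1 becomes uint.max
    assert(!(a < b)); // passes, which is exactly the problem
}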
 That whole area needs to be tidied up. Polysemous types should really 
 help with this.

could someone care to explain what this polysemous shit is (is it not polysemantic btw). the video is too vague about it. maybe this will convince andrei to haul his russian ass over here. btw thought he'd be older and more self-righteous. but i was surprised he seems a laid back dood. tries too hard to be funny tho. but he knows his shit.
 3. warning - no return at end of function
 
 now what a sick decision was it to accept that in the first place. an
overwhelming percentage of functions *can* and *will* be written to have a
meaningful return at the end. then why the fuck cater for the minority and hurt
everybody else. just require a return or throw and call it a day. people who
can't return something meaningful can just put a throw. code growth is
negligible. impact on speed is virtually nil. why the hell do we even bother
arguing over it.

Yup. return should be required, unless function contains inline asm. Otherwise manually put assert(0); at the last line.

i don't think assert(0); is cool. in a release build it disappears and that fucks the whole plan right there.
 4. warning - switch statement has no default
 
 another example of a motherfuck. just require total coverage. in closed-set
cases i routinely write anyway:
 
 switch (crap) 
 {
 case a: ...; break;
 case b: ...; break;
 default: assert(crap == c); ...; break;
 }
 
 again: vast majority of code already has a default. the minority just has to
add a little code. make it an error.

Yup. Make it an error.

great! where do i sign the petition?
 5. warning - statement is not reachable
 
 this is a tad more messy. people routinely insert a premature return in there
to check for stuff. it pisses me off when that won't compile. i discovered i
could do this:
 
 if (true) return crap;
 
 that takes care of the error. and i think it's actually good for me because it
really is supposed to be temporary code. it jumps at me in a review. as it
should.

You can also put assert(0); at the top of the unreachable code.

again assert(0); goes away in release mode. but wait, that's unreachable code anyway. guess that could work.
 2,3, and 4 should definitely be errors.
 I also think that uninitialised class variables should be a compile-time 
 error. It's a horrible newbie trap, especially for anyone with a C++ 
 background:
 -------------
 import std.stdio;
 
 class C {
    void hello() { writefln("Hello crashing world"); }
 }
 
 void main()  {
   C c;  // should be illegal
   c.hello();
 }
 --------------
 My first D program using classes was somewhat like that; took me ages to 
 work out why it was segfaulting at runtime. It's still the most common 
 mistake I make.
 You should have to write C c = null; for the rare cases where you really 
 want an uninitialised class.

yarp. can't tell how many times this bit my ass. in fact even "new" is bad. there should be no new. auto c = C(crap); then classes and structs are more interchangeable.
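a library-level approximation, for flavor. hypothetical helper, not in phobos; classes still get heap-allocated underneath:

// lets "auto c = make!(C)(42);" stand in for "auto c = new C(42);"
// (works for any class whose constructor matches the arguments)
T make(T, A...)(A args) { return new T(args); }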
Jul 11 2008
next sibling parent reply Robert Fraser <fraserofthenight gmail.com> writes:
superdan Wrote:
 a / b is never larger than a (cept for signed/unsigned mixed shit).

a = -10; b = -5
Jul 11 2008
parent superdan <super dan.org> writes:
Robert Fraser Wrote:

 superdan Wrote:
 a / b is never larger than a (cept for signed/unsigned mixed shit).

a = -10; b = -5

i got lazy. whenever i said "larger" i really meant "larger type". so in this case the correct sentence was: a / b never requires a larger type than the type of a.
Jul 11 2008
prev sibling parent reply Bill Baxter <dnewsgroup billbaxter.com> writes:
superdan wrote:
 i don't think assert(0); is cool. in a release build it disappears 

Actually it does not disappear in release mode. assert(0) and assert(false) are always active. For better or worse, it's treated as a special case. --bb
Jul 11 2008
next sibling parent superdan <super dan.org> writes:
Bill Baxter Wrote:

 superdan wrote:
 i don't think assert(0); is cool. in a release build it disappears 

Actually it does not disappear in release mode. assert(0) and assert(false) are always active. For better or worse, it's treated as a special case.

thanks. ow that sucks goat balls. i finally explained to myself the weird shit that happened to me a couple months ago. why the special case???
Jul 11 2008
prev sibling parent Bruno Medeiros <brunodomedeiros+spam com.gmail> writes:
Bill Baxter wrote:
 superdan wrote:
 i don't think assert(0); is cool. in a release build it disappears 

Actually it does not disappear in release mode. assert(0) and assert(false) are always active. For better or worse, it's treated as a special case. --bb

What's "worse" about it? I think it actually makes sense. The reason normal asserts are not put in release mode is for performance reasons: so that the program doesn't waste time processing the assert condition, when it could evaluate to true, and not generate an exception. But when a program reaches an assert(false) it would always throw, and you have a bug, so I see no reason for it to be removed in release code. Unless you want your program to try to keeping going nonetheless (instead of throwing), but I'm not sure that's a good idea, although I guess it could work in some cases. -- Bruno Medeiros - Software Developer, MSc. in CS/E graduate http://www.prowiki.org/wiki4d/wiki.cgi?BrunoMedeiros#D
Jul 27 2008
prev sibling parent reply Robert Fraser <fraserofthenight gmail.com> writes:
Don Wrote:

 superdan wrote:
 4. warning - switch statement has no default
 
 another example of a motherfuck. just require total coverage. in closed-set
cases i routinely write anyway:
 
 switch (crap) 
 {
 case a: ...; break;
 case b: ...; break;
 default: assert(crap == c): ...; break;
 }
 
 again: vast majority of code already has a default. the minority just has to
add a little code. make it an error.

Yup. Make it an error.

I agree with everything else, but this one I think shouldn't be an error or warning (the implicit assert(0) is enough). This is because the vast majority of switch statements I use (and many I see) are over enums, and if every branch in the enumeration is covered, a pointless "default" will just complicate code. The "final switch" thing mentioned at the conference & now forgotten, OTOH, is a great idea for statically checking switch statements.
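For reference, the conference sketch went roughly like this (proposed syntax, not in the language as of this writing):

enum Color { red, green, blue }

int score(Color c)
{
    int result;
    final switch (c) // proposed: compile-time error if any member is unhandled
    {
        case Color.red:   result = 1; break;
        case Color.green: result = 2; break;
        case Color.blue:  result = 3; break;
    }
    return result;
}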
Jul 11 2008
parent superdan <super dan.org> writes:
Robert Fraser Wrote:

 Don Wrote:
 
 superdan wrote:
 4. warning - switch statement has no default
 
 another example of a motherfuck. just require total coverage. in closed-set
cases i routinely write anyway:
 
 switch (crap) 
 {
 case a: ...; break;
 case b: ...; break;
 default: assert(crap == c); ...; break;
 }
 
 again: vast majority of code already has a default. the minority just has to
add a little code. make it an error.

Yup. Make it an error.

I agree with everything else, but this one I think shouldn't be an error or warning (the implicit assert(0) is enough). This is because the vast majority of switch statements I use (and many I see) are over enums, and if every branch in the enumeration is covered, a pointless "default" will just complicate code.

you are not disagreeing. switching over an enum is already closed if you mention all cases. the compiler knows that. it should indeed just throw an error if you have an out-of-range value that you forged from an int. but that's an uncommon case. don't make all pay for a rare bug.
 The "final switch" thing mentioned at the conference & now forgotten, 
 OTOH, is a great idea for statically checking switch statements.

yarp i liked it too til i realized all switches should be final.
Jul 11 2008
prev sibling next sibling parent reply "Nick Sabalausky" <a a.a> writes:
"Walter Bright" <newshound1 digitalmars.com> wrote in message 
news:g530j8$18th$1 digitalmars.com...
 The reason for treating warnings as errors when warnings are enabled is so 
 that, for a long build, they don't scroll up and off your screen and go 
 unnoticed.

Pardon me for saying so, but that doesn't sound like a very convincing reason to turn every warning (which, by its very nature, is something that might not be a bug) into something that splits the language into what are effectively different languages.

In all the time I've spent using Microsoft compilers, I've found the "x number of errors, y number of warnings" display at the end of every compile to be perfectly sufficient for the problem you point out. If a build involves many different calls to a compiler, then whatever rebuild-like tool is being used could be made to screen-scrape and total up the warnings and errors. Or the compiler could stick the error/warning counts in an output file that gets read and accumulated by the rebuild/make tool. Or a copy of all the output could just be piped into a "grep for the error/warning counts" tool.

This way, DMD's warnings could be lint-like warnings instead of the language-splitting "optional errors" (which I can understand your reluctance to create more of) that they are now. A "treat warnings as errors" flag could be retained for any large builds that involve multiple compiler invocations but for some reason still don't do any form of proper cumulative "x warnings / x errors".
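For what it's worth, the accumulating wrapper is trivial to sketch (using current Phobos names for brevity; "dmd" and the "Error:"/"Warning:" message prefixes are assumptions about the output format):

import std.algorithm.searching : canFind;
import std.array : join;
import std.process : executeShell;
import std.stdio : write, writefln;
import std.string : lineSplitter;

void main(string[] args)
{
    // forward our arguments to the real compiler and capture its output
    auto r = executeShell("dmd " ~ args[1 .. $].join(" "));
    write(r.output); // pass everything through untouched

    size_t errors, warnings;
    foreach (line; r.output.lineSplitter)
    {
        if (line.canFind("Error:"))   ++errors;
        if (line.canFind("Warning:")) ++warnings;
    }
    // the cumulative summary that makes scrolled-away warnings visible
    writefln("%s errors, %s warnings", errors, warnings);
}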
Jul 09 2008
parent reply Walter Bright <newshound1 digitalmars.com> writes:
Nick Sabalausky wrote:
 "Walter Bright" <newshound1 digitalmars.com> wrote in message 
 news:g530j8$18th$1 digitalmars.com...
 The reason for treating warnings as errors when warnings are enabled is so 
 that, for a long build, they don't scroll up and off your screen and go 
 unnoticed.

Pardon me for saying so, but that doesn't sound like a very convincing reason to turn every warning (which, by its very nature, is something that might not be a bug) into something that splits the language into what are effectively different languages.

I'll draw on my 25 years of experience with warnings to answer this.

If you turn warnings on, then you want to see them and presumably deal with them. If you don't deal with them, then they persist every time you compile, and either they get very irritating and you fix them anyway, or you develop a blind spot for them and never see the ones you do want to fix.

Piping the output into a file and then perusing it manually looking for warning statements is never going to happen. Complex builds tend to produce a lot of output, and poking through it looking for warnings every time you build is not practical. Changing your build process to point out warnings is the same thing as the compiler treating them as errors, except it's extra work for the build master.

Trying to educate your programmers into doing extra work to deal with warnings that scroll off the screen is a lost cause.

If you're using a static analysis tool, such as Coverity, which produces lots of spurious warnings, it is not put in the build process. It's run occasionally as a separate evaluation tool.
Jul 09 2008
next sibling parent Sean Kelly <sean invisibleduck.org> writes:
Walter Bright wrote:
 Nick Sabalausky wrote:
 "Walter Bright" <newshound1 digitalmars.com> wrote in message 
 news:g530j8$18th$1 digitalmars.com...
 The reason for treating warnings as errors when warnings are enabled 
 is so that, for a long build, they don't scroll up and off your 
 screen and go unnoticed.

Pardon me for saying so, but that doesn't sound like a very convincing reason to turn every warning (which, by its very nature, is something that might not be a bug) into something that splits the language into what are effectively different languages.

I'll draw on my 25 years of experience with warnings to answer this. If you turn warnings on, then you want to see them and presumably deal with them. If you don't deal with them, then they persist every time you compile, and either they get very irritating and you fix them anyway, or you develop a blind spot for them and never see the ones you do want to fix.

This is true. However, warnings are often related to code structure, and the compiler isn't perfect at identifying real problems... and code changes to work around deficiencies in the checking tool aren't always appealing. For example, there is a file in the GC code, if I remember correctly, that doesn't compile correctly with warnings enabled because it uses a goto or some such that confuses the compiler about what's going on. If this were C++ I might be inclined to pragma out that particular warning for the area where the warning is displayed.

Another issue is with third-party libraries. I always compile my code with the strictest warning settings, yet some of the libraries I use aren't so careful. With them, the easiest thing to do is often to assume that they work correctly despite the warnings and disable warning messages for the relevant headers.
 Piping the output into a file and then perusing it manually looking for 
 warning statements is never going to happen. Complex builds tend to 
 produce a lot of output, and poking through it looking for warnings 
 every time you build is not practical. Changing your build process to 
 point out warnings is the same thing as the compiler treating them as 
 errors, except it's extra work for the build master.

It isn't practical to do so for every build, but it's not uncommon for a team to set aside some time to address warnings in bulk, say between releases.
 Trying to educate your programmers into doing extra work to deal with 
 warnings that scroll off the screen is a lost cause.
 
 If you're using a static analysis tool, such as Coverity, which produces 
 lots of spurious warnings, it is not put in the build process. It's run 
 occasionally as a separate evaluation tool.

That's certainly an option, and probably a preferable one overall. Sean
Jul 09 2008
prev sibling next sibling parent "Nick Sabalausky" <a a.a> writes:
"Walter Bright" <newshound1 digitalmars.com> wrote in message 
news:g537q0$1vi0$1 digitalmars.com...
 Nick Sabalausky wrote:
 "Walter Bright" <newshound1 digitalmars.com> wrote in message 
 news:g530j8$18th$1 digitalmars.com...
 The reason for treating warnings as errors when warnings are enabled is 
 so that, for a long build, they don't scroll up and off your screen and 
 go unnoticed.

Pardon me for saying so, but that doesn't sound like a very convincing reason to turn every warning (which, by its very nature, is something that might not be a bug) into something that splits the language into what are effectively different languages.

I'll draw on my 25 years of experience with warnings to answer this. If you turn warnings on, then you want to see them and presumably deal with them. If you don't deal with them, then they persist every time you compile, and either they get very irritating and you fix them anyway, or you develop a blind spot for them and never see the ones you do want to fix.

First of all, if you don't want to deal with warnings, then along with what you said, you presumably wouldn't have turned them on in the first place. So I'm not sure you'd be developing a blind spot (unless you're being required to use them, which goes back to the management discussion).

Aside from that, I'll fully agree that certain warnings can get annoying and eventually overlooked, *if* you're using a language like C/C++ that has accumulated decades of warnings that resulted from design issues that could have been fixed by a design change but never were, simply because maintaining full backwards compatibility (across all those years) was considered more important than ever fixing a problem properly. D's not really in that boat.

But even if it did end up in that boat someday, I'd much rather have the *chance* of not noticing a particular warning than be guaranteed never to notice it simply because it was decided not to even offer the warning out of fear that it might get ignored.
 Piping the output into a file and then perusing it manually looking for 
 warning statements is never going to happen. Complex builds tend to 
 produce a lot of output, and poking through it looking for warnings every 
 time you build is not practical. Changing your build process to point out 
 warnings is the same thing as the compiler treating them as errors, except 
 it's extra work for the build master.

 Trying to educate your programmers into doing extra work to deal with 
 warnings that scroll off the screen is a lost cause.

I think you misunderstood me. What I was talking about would only involve the makers of things like rebuild or make. All we need is a cumulative "x errors, x warnings" at the end of the build process. That's enough to let people know that there were warnings they should scroll up and look at (if they care about warnings in the first place). That would eliminate the need to always force warnings as errors out of the mere worry that someone might not see a warning because it scrolled away.

And if a so-called "programmer" has a problem looking at the "x errors, x warnings" display, then they themselves are already a lost cause, period. I don't want to have to put up with gimped tools just because some incompetent morons are masquerading as real programmers.
 If you're using a static analysis tool, such as Coverity, which produces 
 lots of spurious warnings, it is not put in the build process. It's run 
 occasionally as a separate evaluation tool.

I can agree with that, but with the caveat that I, for one, would at the very least choose the most useful subset of those warnings to run during each compile.

As an example, the last time I was using ActionScript 1 (ECMAScript without any of the improvements from v3 or beyond), I was constantly running into problems that a compiler like DMD would have caught (and considered errors) but that I ended up spending half an hour, a full hour, etc., trying to debug. I incorporated a JavaScript lint tool into my workflow (ran it every time I saved and was about to test something), and it helped immensely. Never gave me any problem.

My point is, I *do* want certain warnings to be checked for on every compile. Now yes, there can be extra ones that are really anal and only occasionally useful. But in a normal setup where warnings are only *optionally* treated as errors *and* I can select which warnings I want, then those really anal annoying warnings can just be run on occasion, and I can still have my more common and highly-useful ones caught right away - which is what I want. And I seriously doubt I'm any sort of exceptional case in feeling that way about it.
Jul 09 2008
prev sibling next sibling parent "Bruce Adams" <tortoise_74 yeah.who.co.uk> writes:
On Wed, 09 Jul 2008 21:41:35 +0100, Walter Bright  
<newshound1 digitalmars.com> wrote:

 Nick Sabalausky wrote:
 "Walter Bright" <newshound1 digitalmars.com> wrote in message  
 news:g530j8$18th$1 digitalmars.com...
 The reason for treating warnings as errors when warnings are enabled  
 is so that, for a long build, they don't scroll up and off your screen  
 and go unnoticed.

Pardon me for saying so, but that doesn't sound like a very convincing reason to turn every warning (which, by its very nature, is something that might not be a bug) into something that splits the language into what are effectively different languages.

 I'll draw on my 25 years of experience with warnings to answer this. If you turn warnings on, then you want to see them and presumably deal with them. If you don't deal with them, then they persist every time you compile, and either they get very irritating and you fix them anyway, or you develop a blind spot for them and never see the ones you do want to fix.
 
 Piping the output into a file and then perusing it manually looking for warning statements is never going to happen. Complex builds tend to produce a lot of output, and poking through it looking for warnings every time you build is not practical. Changing your build process to point out warnings is the same thing as the compiler treating them as errors, except it's extra work for the build master.
 
 Trying to educate your programmers into doing extra work to deal with warnings that scroll off the screen is a lost cause.
 
 If you're using a static analysis tool, such as Coverity, which produces lots of spurious warnings, it is not put in the build process. It's run occasionally as a separate evaluation tool.

Focussing mainly on your last point... Whenever I work with static analysis tools (I'm talking C++ here, obviously) the first thing I do is to put them into the build process, right after the compiler. If you run one just occasionally you will get lost in a sea of spurious warnings. Eliminating warnings, and hence the slight possibility of error that goes with them, takes effort, and that effort can be focussed on problem areas. You can use a different lint configuration for a different set of files and gradually crank up the quality, if necessary enabling one type of warning for only a few files at a time. A similar approach is to record the warning count and require that it never increases, occasionally working to lower the limit.

Instead of educating programmers to deal with screenfuls of messages, the focus should be on educating them that code quality is important and that removing warnings is one way of improving quality. It is not the be-all and end-all, and not nearly as good as, say, automated testing. Personally I insist on both wherever I can.

Regards,

Bruce.
Jul 09 2008
prev sibling next sibling parent reply Leandro Lucarella <llucax gmail.com> writes:
Walter Bright, el  9 de julio a las 13:41 me escribiste:
 Piping the output into a file and then perusing it manually looking for
warning 
 statements is never going to happen.

I code using VIM. VIM has a very convenient feature that collects the make (compiler) output and lets you iterate over the warnings/errors (using :cn and :cp). So yes, it's going to happen. It happens all the time. And I think most decent IDEs/editors do that, so it's not something VIM-specific.

-- 
Leandro Lucarella (luca) | Blog colectivo: http://www.mazziblog.com.ar/blog/
----------------------------------------------------------------------------
GPG Key: 5F5A8D05 (F8CD F9A7 BF00 5431 4145 104C 949E BFB6 5F5A 8D05)
----------------------------------------------------------------------------
Me duele encontrarte en mis sueños muertos
Jul 10 2008
parent reply Bill Baxter <dnewsgroup billbaxter.com> writes:
Leandro Lucarella wrote:
 Walter Bright, el  9 de julio a las 13:41 me escribiste:
 Piping the output into a file and then perusing it manually looking for
warning 
 statements is never going to happen.

I code using VIM. VIM has a very convenient feature that collects the make (compiler) output and lets you iterate over the warnings/errors (using :cn and :cp). So yes, it's going to happen. It happens all the time. And I think most decent IDEs/editors do that, so it's not something VIM-specific.

Emacs has it too! M-x ` --bb
Jul 10 2008
parent "Nick Sabalausky" <a a.a> writes:
"Bill Baxter" <dnewsgroup billbaxter.com> wrote in message 
news:g561hh$2g6g$2 digitalmars.com...
 Leandro Lucarella wrote:
 Walter Bright, el  9 de julio a las 13:41 me escribiste:
 Piping the output into a file and then perusing it manually looking for 
 warning statements is never going to happen.

I code using VIM. VIM has a very convenient feature that collects the make (compiler) output and lets you iterate over the warnings/errors (using :cn and :cp). So yes, it's going to happen. It happens all the time. And I think most decent IDEs/editors do that, so it's not something VIM-specific.

Emacs has it too! M-x `

Every IDE I've ever used does it. And I'm constantly IDE-hopping.
Jul 10 2008
prev sibling parent Bruno Medeiros <brunodomedeiros+spam com.gmail> writes:
Walter Bright wrote:
 
 I'll draw on my 25 years of experience with warnings to answer this.
 
 If you turn warnings on, then you want to see them and presumably deal 
 with them. If you don't deal with them, then they persist every time you 
 compile, and either they get very irritating and you fix them anyway, or 
 you develop a blind spot for them and never see the ones you do want to 
 fix.
 
 Piping the output into a file and then perusing it manually looking for 
 warning statements is never going to happen. Complex builds tend to 
 produce a lot of output, and poking through it looking for warnings 
 every time you build is not practical. Changing your build process to 
 point out warnings is the same thing as the compiler treating them as 
 errors, except it's extra work for the build master.
 

Of course it's not going to happen. Cause manually looking at the compiler output is plain ridiculous. See my other post for details.
 Trying to educate your programmers into doing extra work to deal with 
 warnings that scroll off the screen is a lost cause.
 

Again, anyone who firmly believes that trying to look at console output is even a worthy cause to begin with (lost or not) is living in the past.

-- 
Bruno Medeiros - Software Developer, MSc. in CS/E graduate
http://www.prowiki.org/wiki4d/wiki.cgi?BrunoMedeiros#D
Jul 27 2008
prev sibling parent reply Bruno Medeiros <brunodomedeiros+spam com.gmail> writes:
Walter Bright wrote:
 The reason for treating warnings as errors when warnings are enabled is 
 so that, for a long build, they don't scroll up and off your screen and 
 go unnoticed.

Dear gods... These are the kind of comments that make me cringe deep inside, and honestly worry me about the future of D. :(

Looking at the output of a compiler in a console is a thing of the past. It's fraking obsolete. It's only done when you're hobbying or toying with the language. No one who does serious development is going to do that. What you do is use an IDE with a minimum of intelligence, one that presents the warnings to you in a sensible way. Here's an example from CDT:
http://www-128.ibm.com/developerworks/library/os-eclipse-ganymede/

Before some people here say they don't use an IDE, but instead use <editor foo with syntax highlighting and little more than that> and are fine with it, well, ask yourselves: are you doing any serious development, or just toying around? If you were in a multi-team, 6+ month project, working with such tools, do you think you would perform the same as the same team with a proper toolchain? Heed my words: you wouldn't. Thinking otherwise is a delusion.

And it's even worse if you're Walter. Basing such language/tool design issues on outdated notions is a danger to D's development. And it's not just the "looking at compiler output in console"; there are plenty of other cases of this mentality. Walter, you need to shed some of your outdated notions of the software development process and think of the *modern* (present and future) development models that exist, or D will risk heavily retarded adoption (or even failure). I'm dead serious, and I want to "record" this message for future reference, especially if things don't go well (which may not be obvious though).

-- 
Bruno Medeiros - Software Developer, MSc. in CS/E graduate
http://www.prowiki.org/wiki4d/wiki.cgi?BrunoMedeiros#D
Jul 27 2008
parent reply Jussi Jumppanen <jussij zeusedit.com> writes:
Bruno Medeiros Wrote:

 Before some people here say they don't use an IDE, but 
 instead use <editor foo with syntax highlighting and 
 little more than that> and are fine with it, 

I would say that the reason developers still prefer to code with text editors rather than IDEs is they find the text editor more productive.

Eclipse-based IDEs are just far too slow for a good developer's fingers. When you're used to a super quick, highly responsive editor, it can be terribly frustrating to have to step down to a slow IDE. The slowness of the keyboard response turns what was an automatic action, that of typing, into a thought process, and this plays havoc with the 'thinking about the code while I type' thought process.
 If you were in a multi-team, 6+ months project, working with 
 such tools, do you think you would perform the same as the same 
 team, with a proper toolchain?

Yes. I would say a team of 'editor based programmers' would be far more productive than a team of 'IDE based programmers'.

The simple fact that the editor programmer can code outside the IDE immediately means they have a better understanding of their coding environment and their toolchain. There is nothing more pathetic than to watch an IDE programmer turn into a quivering mess, just because they can't find the answer to simple questions like: Why does my program run fine in the IDE but not outside the IDE?
Jul 27 2008
next sibling parent reply Robert Fraser <fraserofthenight gmail.com> writes:
Jussi Jumppanen wrote:
 Bruno Medeiros Wrote:
 
 Before some people here say they don't use an IDE, but 
 instead use <editor foo with syntax highlighting and 
 little more than that> and are fine with it, 

 I would say that the reason developers still prefer to code with text editors rather than IDEs is they find the text editor more productive. Eclipse-based IDEs are just far too slow for a good developer's fingers. When you're used to a super quick, highly responsive editor, it can be terribly frustrating to have to step down to a slow IDE. The slowness of the keyboard response turns what was an automatic action, that of typing, into a thought process, and this plays havoc with the 'thinking about the code while I type' thought process.

Bullshit. Do you have a 200 MHz Pentium with 128MB RAM? Even then, IDEs are going to prioritize the editor itself over any autocomplete/background processing, so the editor shouldn't be any less responsive. It might take 5 seconds if you click "go to definition" and it has to open a new file, but that's vs. 2 minutes of searching for an import, finding the file location, and using find to get to the definition in that file.

The issue is the placebo effect and the comfort zone... which are real issues (that's why so many people are like "oh, Vista is soooo bloated compared to XP"...). If you've been using ed to write code for the last 30 years, the mental concept of using your $2000 computer to its full potential to help you write software is mind-boggling. If you're more comfortable with your "power-editor" or just can't deal with a 1-minute startup time for a product you're going to be using for 8 hours, well, all the more power to ya; no amount of productivity gains could make you willing to switch.

I'm not saying "more complex is always better," but why let all that processing power go to waste?
Jul 27 2008
next sibling parent reply Robert Fraser <fraserofthenight gmail.com> writes:
Bill Baxter Wrote:

 On Mon, Jul 28, 2008 at 11:15 AM, Robert Fraser
 <fraserofthenight gmail.com>wrote:
 
 Jussi Jumppanen wrote:

 Bruno Medeiros Wrote:

  Before some people here say they don't use an IDE, but instead use
 <editor foo with syntax highlighting and little more than that> and are fine
 with it,

I would say that the reason developers still prefer to code with text editors rather than IDE's is they find the text editor more productive. Eclipse based IDE are just far too slow for a good developer's fingers. When you're used to a super quick, highly responsive editor, it can be terribly frustrating to have you step down to a slow IDE. The slowness of the keyboard response turns what was an automatic action, that of typing, into a though process and this plays havoc with the 'thinking about the code while I type' through process.

Bullshit. Do you have a 200 MHz Pentium with 128MB RAM? Even then, IDEs are going to prioritize the editor itself over any autocomplete/background processing, so the editor shouldn't be any less responsive. It might take 5 seconds if you click "go to definition" and it has to open a new file, but that's vs. 2 minutes of searching for an import, finding the file location, and using find to get to the definition in that file. The issue is the placebo effect and the comfort zone... which are real issues (that's why so many people are like "oh, Vista is soooo bloated compared to XP"...). If you've been using ed to write code for the last 30 years, the mental concept of using your $2000 computer to its full potential to help you write software is mind-boggling. If you're more comfortable with your "power-editor" or just can't deal with a 1-minute startup time for a product you're going to be using for 8 hours, well, all the more power to ya; no amount of productivity gains could make you willing to switch. I'm not saying "more complex is always better," but why let all that processing power go to waste?

I think part of the problem is that there are a whole lot of IDEs that really don't live up to the potential you guys are talking about. Plus IDEs come with their own set of problems. For instance, I just wasted most of a day getting a MSVC7 project set up to also work with MSVC9. That's just ridiculous. Microsoft goes and makes these minor changes to their project file formats for every release of Visual Studio, and then only provides a tool to do 1-way, in-place upgrades of all your project files. It's insane. Just imagine if you were forced to fork your makefiles for every dang version of GCC that comes out. The way project management works in IDEs is often just completely silly like that.

The so-called "Intellisense" in Visual Studio has also historically been pretty lame, with refactoring support basically non-existent. The Visual Assist add-on from Whole Tomato was pretty much a "must" to bring it up to snuff. I get the impression that the Java IDEs offer a lot more on the refactoring frontier.

So that's just to say, it's easy to get the impression that IDEs are not useful, because there are many IDEs that genuinely are not that useful. I can see where Jussi is coming from. I have a feeling when Bruno says "IDE" he's thinking of IDEs at their very best, not another one of these lame editors with syntax highlighting and a "compile" button that claims to be an IDE.

I still primarily like to use my good ole emacs for writing large amounts of new code. There I don't find all the little buttons and completion popups and things in an IDE very useful. But when it comes to debugging and fixing code, damn it's nice to have the IDE there with all its quick cross-linking abilities. The integrated debugger in MSVC is also damn fine.

--bb

VS is crap (when the VS team is using Source Insight to develop it, you know something is wrong...). Even Visual C# pales in comparison to what I can do with Eclipse + JDT for Java; you have to use ReSharper to get the functionality a real IDE can provide.
Jul 28 2008
next sibling parent reply Robert Fraser <fraserofthenight gmail.com> writes:
Bill Baxter Wrote:

 On Tue, Jul 29, 2008 at 2:56 AM, Robert Fraser
 <fraserofthenight gmail.com>wrote:
 
 VS is crap (when the VS team is using Source Insight to develop it, you
 know something is wrong...). Even Visual C# pales in comparison to what
 I can do with Eclipse + JDT for Java; you have to use ReSharper to get
 the functionality a real IDE can provide.

Hmm, Bruno is an Eclipse fan too. So maybe when you guys say "an IDE" you really mean "Eclipse+JDT for Java". Are there any other IDEs, for any language, out there that you would deem acceptable? Just curious.

--bb

For C++, I like Source Insight. For C#, I use VS + ReSharper (vanilla VS sucks). I don't like either as much as I like JDT, but what can you do?
Jul 28 2008
parent Yigal Chripun <yigal100 gmail.com> writes:
Robert Fraser wrote:
 Bill Baxter Wrote:
 
 On Tue, Jul 29, 2008 at 2:56 AM, Robert Fraser
 <fraserofthenight gmail.com>wrote:

 VS is crap (when the VS team is using Source Insight to develop it, you
 know something is wrong...). Even Visual C# pales in comparison to what
 I can do with Eclipse + JDT for Java; you have to use ReSharper to get
 the functionality a real IDE can provide.

really mean "Eclipse+JDT for Java". Are there any other IDEs, for any language, out there that you would deem acceptable? Just curious. --bb

 For C++, I like Source Insight. For C#, I use VS + ReSharper (vanilla VS sucks). I don't like either as much as I like JDT, but what can you do?

Have you tried CDT for eclipse? Netbeans also has a c++ plugin. I'm sure that there's also an eclipse plugin for C#.
Jul 29 2008
prev sibling next sibling parent Paul D. Anderson <paul.d.removethis.anderson comcast.andthis.net> writes:
Bill Baxter Wrote:

 On Tue, Jul 29, 2008 at 2:56 AM, Robert Fraser
 <fraserofthenight gmail.com>wrote:
 
 VS is crap (when the VS team is using Source Insight to develop it, you
 know something is wrong...). Even Visual C# pales in comparison to what
 I can do with Eclipse + JDT for Java; you have to use ReSharper to get
 the functionality a real IDE can provide.

Hmm, Brunos is an Eclipse fan too. So maybe when you guys say "an IDE" you really mean "Eclipse+JDT for Java". Are there any other IDEs, for any language, out there that you would deem acceptable? Just curious. --bb

I prefer IntelliJ for Java development, although Eclipse and NetBeans are both good tools. IntelliJ has better (IMHO) code completion, macro, and refactoring capabilities, but the difference is probably just that I've used IntelliJ more. If I had to pick one feature that stands out, it is the refactoring. IntelliJ is a commercial product, but they have a policy of making it available to open source projects at no cost (which I've been the beneficiary of). Paul
Jul 28 2008
prev sibling next sibling parent Don <nospam nospam.com.au> writes:
Bill Baxter wrote:
 
 
 On Tue, Jul 29, 2008 at 2:56 AM, Robert Fraser 
 <fraserofthenight gmail.com <mailto:fraserofthenight gmail.com>> wrote:
 
     VS is crap (when the VS team is using Source Insight to develop it, you
     know something is wrong...). Even Visual C# pales in comparison to what
     I can do with Eclipse + JDT for Java; you have to use ReSharper to get
     the functionality a real IDE can provide.
 
 
 Hmm, Brunos is an Eclipse fan too.  So maybe when you guys say "an IDE" 
 you really mean "Eclipse+JDT for Java".

This makes sense now. There might not be so much disagreement after all.

1: "my favourite text editor is better than the IDEs I've used (VS)"
2: "my favourite IDE is better than any text editor"

Both of these statements could be true.
Jul 29 2008
prev sibling parent Bruno Medeiros <brunodomedeiros+spam com.gmail> writes:
Bill Baxter wrote:
 
 
 On Tue, Jul 29, 2008 at 2:56 AM, Robert Fraser 
 <fraserofthenight gmail.com <mailto:fraserofthenight gmail.com>> wrote:
 
     VS is crap (when the VS team is using Source Insight to develop it, you
     know something is wrong...). Even Visual C# pales in comparison to what
     I can do with Eclipse + JDT for Java; you have to use ReSharper to get
     the functionality a real IDE can provide.
 
 
 Hmm, Brunos is an Eclipse fan too.  So maybe when you guys say "an IDE" 
 you really mean "Eclipse+JDT for Java".  Are there any other IDEs, for 
 any language, out there that you would deem acceptable?  Just curious.
 
 --bb

There's Eclipse+CDT, like Yigal mentioned. Although I haven't used it or 
examined it in depth recently, I think it has advanced a lot in the last 
few years, and is on par with, if not better than, VS. Configuring a 
compiler might not be as easy as in VS, since CDT doesn't come bundled with 
one, but on semantic features (code completion, open/find references, 
refactoring) it seems to be much better than VS is. Dunno about debugging. 
IntelliJ is also pretty good, but it's a paid IDE.

But really, for the point I was making (productivity of simple tools vs. 
IDEs), it still applied with many other IDEs, like VS, KDev, etc. I wasn't 
thinking of Eclipse alone.

-- 
Bruno Medeiros - Software Developer, MSc. in CS/E graduate
http://www.prowiki.org/wiki4d/wiki.cgi?BrunoMedeiros#D
Jul 30 2008
prev sibling parent Jussi Jumppanen <jussij zeusedit.com> writes:
Robert Fraser Wrote:

 It might take 5 seconds if you click "go to definition" and 
 it has to open a new file, but that's vs 2 minutes of searching 
 for an import, finding the file location, and using find to get 
 to the definition in that file.

When you're accustomed to load times of less than 1 second, 5 seconds can feel like an eternity.
 If you've been using ed to write code for the last 30 years, the 
 mental concept of using your $2000 computer to its full potential 

Ed was not the text editor I was referring to.
 to help you write software is mind-boggling. 

If I had been referring to Ed or Notepad then I would agree with you.
 just can't deal with a 1-minute startup time for a product you're 
 going to be using for 8 hours, well all the more power to ya; no 
 amount of productivity gains could make you willing to switch.

You've hit the nail right on the head. When you're expecting sub-second response times, having to put up with delays of seconds or minutes is rather off-putting, to the point of being counterproductive.
 I'm not saying "more complex is always better," but why let all 
 that processing power go to waste?

But all that power is not going to waste. All that processing power lets the computer respond at an amazingly fast speed. It responds so fast it feels like it is not even there.
Jul 28 2008
prev sibling parent Ary Borenszweig <ary esperanto.org.ar> writes:
Jussi Jumppanen wrote:
 Bruno Medeiros Wrote:
 
 Before some people here say they don't use an IDE, but 
 instead use <editor foo with syntax highlighting and 
 little more than that> and are fine with it, 

I would say that the reason developers still prefer to code with text editors rather than IDEs is they find the text editor more productive. Eclipse-based IDEs are just far too slow for a good developer's fingers.

In Eclipse there's a time delay you can configure before autocompletion proposals appear (by default 200ms). That means that if you are faster than that delay (your claim), the IDE won't help you. But if you do wait a little, probably because you don't know what all the possible autocompletions are, then it will show the popup with suggestions. I can't see how that is worse than not having autocompletion at all, plus not having go-to-definition or semantic highlighting.
Jul 28 2008
prev sibling parent "Bill Baxter" <wbaxter gmail.com> writes:

On Tue, Jul 29, 2008 at 2:56 AM, Robert Fraser
<fraserofthenight gmail.com>wrote:

 VS is crap (when the VS team is using Source Insight to develop it, you
 know something is wrong...). Even Visual C# pales in comparison to what
 I can do with Eclipse + JDT for Java; you have to use ReSharper to get
 the functionality a real IDE can provide.

Hmm, Brunos is an Eclipse fan too. So maybe when you guys say "an IDE" you really mean "Eclipse+JDT for Java". Are there any other IDEs, for any language, out there that you would deem acceptable? Just curious. --bb
Jul 28 2008
prev sibling parent reply Walter Bright <newshound1 digitalmars.com> writes:
Here are some horrid examples from my own code which, to please the 
client, had to compile with all warnings on for MSVC:

---
   p = NULL;  // suppress spurious warning
---
   b = NULL;  // Needed for the b->Put() below to shutup a compiler 
use-without-init warning
---
   #if _MSC_VER
   // Disable useless warnings about unreferenced formal parameters
   #pragma warning (disable : 4100)
   #endif
---
   #define LOG 0       // 0: disable logging, 1: enable it

   #ifdef _MSC_VER
   #pragma warning(disable: 4127)      // Caused by if (LOG)
   #endif // _MSC_VER
---

Note the uglification this makes for code by forcing useless statements 
to be added. If I hadn't put in the comments (and comments are often 
omitted) these things would be a mystery.
Jul 09 2008
next sibling parent reply bearophile <bearophileHUGS lycos.com> writes:
Koroskin Denis:
 Moreover, I would be happy to have an `unused` modifier in addition to in,  
 out and inout (doh!) to denote that a variable is not going to be used. In  
 this case compiler will show an error if the variable is used by chance.  
 It could help programmer to catch potential bugs at early stage once he  
 eventually start using it. Besides, it really fits well into D, IMO:
 void bar( unused int foo ) // no warning is generated
 {
 }

Can you explain to me in what practical situation(s) this can be useful?

Bye,
bearophile
Jul 09 2008
parent reply "Manfred_Nowak" <svv1999 hotmail.com> writes:
Koroskin Denis wrote:


     void connect(unused int timeout)

 connection.connect(0); // I don't care, since it is immediate
 anyway 

 and then refactor your code

In which way is this type of coding better than preparing for overloading 
`connect':

   void connect(){
       // ...
   }
   void connect( int timeout){
       // ...
       this.connect();
   }

and then calling

   connection.connect(); // immediate connection

In addition: why is it good to be forced to refactor?

-manfred

-- 
Maybe some knowledge of some types of disagreeing and their relation can 
turn out to be useful:
http://blog.createdebate.com/2008/04/07/writing-strong-arguments/
Jul 09 2008
parent reply "Manfred_Nowak" <svv1999 hotmail.com> writes:
Koroskin Denis wrote:

 You asked an example, I provided one. There is another one:

Bearophile asked for a _practical_ example. But your example seems to illustrate consequences rooted in a coding style and not rooted in the absence of an `unused' keyword and its semantics.
 I was going to modify a local variable, not a member. 

This is a well-known phenomenon. But again no need for an `unused' keyword shows up. To the contrary: within the function you want to use both variables, although one of them only for reading. Pollution of the namespace within the function causes the problem. But would you really want to write import statements for variables from surrounding scopes?
 The compiler could warn me that I don't use it

This claim comes up once in a while, but seems to be unprovable in general. 
It might be provable in your special case though. But without the general 
proof one may have both:

- many false warnings
- many true bugs without warnings

Do you have a proof for the general case?

-manfred

-- 
Maybe some knowledge of some types of disagreeing and their relation can 
turn out to be useful:
http://blog.createdebate.com/2008/04/07/writing-strong-arguments/
Jul 09 2008
parent reply "Nick Sabalausky" <a a.a> writes:
"Manfred_Nowak" <svv1999 hotmail.com> wrote in message 
news:g52faq$2s3g$1 digitalmars.com...
 Koroskin Denis wrote:

 You asked an example, I provided one. There is another one:

Bearophile asked for a _practical_ example. But your example seems to illustrate consequences rooted in a coding style and not rooted in the absence of an `unused' keyword and its semantics.
 I was going to modify local variable, but not a member.

This is a well known phenomenon. But again no need for an `unused' keyword shows up. To the contrary: within the function you want to use both variables, although one of them only for reading. Pollution of the namspace within the function causes the problem. But would you really want to write import statements for variables from surrounding scopes?

Can you prove that namespace pollution is the root cause of "unintentionally-unused variable" errors in the general case?
 The compiler could warn me that I don't use it

 This claim comes up once in a while, but seems to be unprovable in
 general. It might be provable in your special case though. But without
 the general proof one may have both:
 - many false warnings
 - many true bugs without warnings
 Do you have a proof for the general case?

Do you have a general-case proof that an "unused variable" warning would cause too many false warnings/etc.? Would the proof still hold with the proposed "unused" keyword (or some functionally-equivalent alternative)?
Jul 09 2008
parent "Manfred_Nowak" <svv1999 hotmail.com> writes:
Nick Sabalausky wrote:

 Can you prove

 Do you have a general-case proof

No---and in addition no one is obliged to have a counter proof for any 
claim, especially not if the claim is not formalized. I have a counter hint 
only: D, as an intended systems programming language, has `cast'- and 
`asm'- statements as well as pointers available. With these a clever coder 
might be able to access every data storage location accessible to the 
program, regardless of the protection status announced by the source.

-manfred

-- 
Maybe some knowledge of some types of disagreeing and their relation can 
turn out to be useful:
http://blog.createdebate.com/2008/04/07/writing-strong-arguments/
Jul 09 2008
prev sibling next sibling parent reply Matti Niemenmaa <see_signature for.real.address> writes:
Koroskin Denis wrote:
 Moreover, I would be happy to have an `unused` modifier in addition to 
 in, out and inout (doh!) to denote that a variable is not going to be 
 used. In this case compiler will show an error if the variable is used 
 by chance. It could help programmer to catch potential bugs at early 
 stage once he eventually start using it. Besides, it really fits well 
 into D, IMO:
 
 void bar( unused int foo ) // no warning is generated
 {
 }

Just do:

   void bar(int) {}

I.e. don't name the variable. And you will get an error if you try to use 
it regardless, as you might expect. <g>

-- 
E-mail address: matti.niemenmaa+news, domain is iki (DOT) fi
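(To make the tip concrete, a small compilable sketch: my illustration, not 
Matti's. `ImmediateConnection` is an invented name tying it to the 
connect() example discussed elsewhere in the thread.)

   import std.stdio;

   class Connection
   {
       void connect(int timeout)
       {
           writefln("connecting, timeout = %s", timeout);
       }
   }

   class ImmediateConnection : Connection
   {
       // The timeout is irrelevant here, so the parameter is left
       // unnamed. Any attempt to reference it in the body is a
       // compile-time error.
       override void connect(int)
       {
           writefln("connected immediately");
       }
   }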
Jul 09 2008
parent "Manfred_Nowak" <svv1999 hotmail.com> writes:
Matti Niemenmaa wrote:

 you will get an error if you try to use it

This is true only if the name he tries to use isn't declared in any visible 
scope.

-manfred

-- 
Maybe some knowledge of some types of disagreeing and their relation can 
turn out to be useful:
http://blog.createdebate.com/2008/04/07/writing-strong-arguments/
Jul 09 2008
prev sibling parent reply "Bruce Adams" <tortoise_74 yeah.who.co.uk> writes:
On Wed, 09 Jul 2008 09:49:34 +0100, Walter Bright 
<newshound1 digitalmars.com> wrote:

 Here are some horrid examples from my own code which, to please the 
 client, had to compile with all warnings on for MSVC:

 ---
    p = NULL;  // suppress spurious warning
 ---
    b = NULL;  // Needed for the b->Put() below to shutup a compiler 
 use-without-init warning
 ---
    #if _MSC_VER
    // Disable useless warnings about unreferenced formal parameters
    #pragma warning (disable : 4100)
    #endif
 ---
    #define LOG 0       // 0: disable logging, 1: enable it

    #ifdef _MSC_VER
    #pragma warning(disable: 4127)      // Caused by if (LOG)
    #endif // _MSC_VER
 ---

 Note the uglification this makes for code by forcing useless statements 
 to be added. If I hadn't put in the comments (and comments are often 
 omitted) these things would be a mystery.

I would contend this is a problem with the quality of headers provided by 
M$. Library code has a greater need to be high quality than regular code. 
Operating system APIs even more so.

Removing warnings from C/C++ headers requires you to write them carefully 
to remove the ambiguity that leads to the warning. That is, this definition 
of quality is a measure that increases with decreasing semantic ambiguity.

Asking users of your library code to disable warnings with a #pragma is 
laziness that a big monopoly like M$ can get away with. Then people wrongly 
start to think it's okay because the big monopoly does it.

Regards,

Bruce.
Jul 09 2008
parent reply Don <nospam nospam.com.au> writes:
Bruce Adams wrote:
 On Wed, 09 Jul 2008 09:49:34 +0100, Walter Bright 
 <newshound1 digitalmars.com> wrote:
 
 Here are some horrid examples from my own code which, to please the 
 client, had to compile with all warnings on for MSVC:

 ---
    p = NULL;  // suppress spurious warning
 ---
    b = NULL;  // Needed for the b->Put() below to shutup a compiler 
 use-without-init warning
 ---
    #if _MSC_VER
    // Disable useless warnings about unreferenced formal parameters
    #pragma warning (disable : 4100)
    #endif
 ---
    #define LOG 0       // 0: disable logging, 1: enable it

    #ifdef _MSC_VER
    #pragma warning(disable: 4127)      // Caused by if (LOG)
    #endif // _MSC_VER
 ---

 Note the uglification this makes for code by forcing useless 
 statements to be added. If I hadn't put in the comments (and comments 
 are often omitted) these things would be a mystery.

I would contend this is a problem with the quality of headers provided by M$. Library code has a greater need to be high quality than regular code. Operating system APIs even more so. Removing warnings from C/C++ headers requires you to write them carefully to remove the ambiguity that leads to the warning. That is, this definition of quality is a measure that increases with decreasing semantic ambiguity.

I think it's a complete fallacy to think that lower-number-of-warnings is proportional to better-code-quality. Once a warning is so spurious (eg, so that it has a <1% chance of being an error), it's more likely that you'll introduce an error in getting rid of the warning. In C++, error-free code is clearly defined in the spec. But warning-free code is not in the spec. You're at the mercy of any compiler writer who decides to put in some poorly thought out, idiotic warning. If you insist on avoiding all warnings, you're effectively using the programming language spec which one individual has carelessly made on a whim. For example, VC6 generates some utterly ridiculous warnings. In some cases, the chance of it being a bug is not small, it is ZERO. In DMD, the signed/unsigned mismatch warning is almost always spurious. Getting rid of it reduces code quality.
Jul 10 2008
parent reply "Bruce Adams" <tortoise_74 yeah.who.co.uk> writes:
On Thu, 10 Jul 2008 11:20:21 +0100, Don <nospam nospam.com.au> wrote:

 Bruce Adams wrote:

  I would contend this is a problem with the quality of headers provided  
 by M$.
 Library code has a greater need to be high quality than regular code.
 Operating system APIs even more so.
 Removing warnings from C/C++ headers requires you to write them  
 carefully to
 remove the ambiguity that leads to the warning. That is, this  
 definition of
 quality is a measure that increases with decreasing semantic ambiguity.

I think it's a complete fallacy to think that lower-number-of-warnings is proportional to better-code-quality. Once a warning is so spurious (eg, so that it has a <1% chance of being an error), it's more likely that you'll introduce an error in getting rid of the warning. In C++, error-free code is clearly defined in the spec. But warning-free code is not in the spec. You're at the mercy of any compiler writer who decides to put in some poorly thought out, idiotic warning.

I'm not claiming quality is proportional to warning count, only that it is a factor. There are other factors that are typically more significant. Still, given the choice between code with some warnings and warning-free code, all other things being equal I would pick the warning-free code. You obviously shift your quality measure towards that aspect of readability. Personally I think the impact on readability is minimal.
 If you insist on avoiding all warnings, you're effectively using the  
 programming language spec which one individual has carelessly made on a  
 whim.

Some may be, but it's unfair in general to say they're introduced carelessly on a whim.
 For example, VC6 generates some utterly ridiculous warnings. In some  
 cases, the chance of it being a bug is not small, it is ZERO.

If that's true then it would be a compiler bug. If you know it to be true you can disable the warning with a pragma. Similarly in gcc all warnings are supposed to have an on/off switch. So you get to choose which warnings you think are important. I am well aware that some people choose to ignore all warnings in order to code faster. In general it's a false economy, like not writing unit-tests.
 In DMD, the signed/unsigned mismatch warning is almost always spurious.  
 Getting rid of it reduces code quality.

I have encountered quite a few bugs (in C++) relating to unsigned/signed 
mismatches. It's a very subtle and hard-to-spot problem when a simple 
addition suddenly changes the sign of your result. It costs an ugly cast to 
remove the warning, but that is a trade I'm prepared to make to never have 
to worry about such bugs.

Regards,

Bruce.
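(A minimal D sketch of the trap being described; the names and values are 
invented, but the integral-promotion behaviour is real, and it is the same 
in C and C++:)

   import std.stdio;

   void main()
   {
       int  balance = -1;
       uint credit  = 1;

       // The comparison converts `balance` to uint, so -1 becomes
       // 4294967295 and the "obviously true" test silently fails.
       if (balance < credit)
           writefln("in debt");  // what you'd expect to see
       else
           writefln("flush");    // what actually prints
   }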
Jul 10 2008
parent reply Don <nospam nospam.com.au> writes:
Bruce Adams wrote:
 On Thu, 10 Jul 2008 11:20:21 +0100, Don <nospam nospam.com.au> wrote:
 
 Bruce Adams wrote:

  I would contend this is a problem with the quality of headers 
 provided by M$.
 Library code has a greater need to be high quality than regular code.
 Operating system APIs even more so.
 Removing warnings from C/C++ headers requires you to write them 
 carefully to
 remove the ambiguity that leads to the warning. That is, this 
 definition of
 quality is a measure that increases with decreasing semantic ambiguity.

I think it's a complete fallacy to think that lower-number-of-warnings is proportional to better-code-quality. Once a warning is so spurious (eg, so that it has a <1% chance of being an error), it's more likely that you'll introduce an error in getting rid of the warning. In C++, error-free code is clearly defined in the spec. But warning-free code is not in the spec. You're at the mercy of any compiler writer who decides to put in some poorly thought out, idiotic warning.

I'm not claiming quality is proportional to warning count, only that it is a factor. There are other factors that are typically more significant. Still, given the choice between code with some warnings and warning-free code, all other things being equal I would pick the warning-free code. You obviously shift your quality measure towards that aspect of readability. Personally I think the impact on readability is minimal.

I don't care about readability so much as correctness. My point is that sometimes making code warning-free makes it WORSE. It depends on the quality of the warning. Some are hugely beneficial.
 
 If you insist on avoiding all warnings, you're effectively using the 
 programming language spec which one individual has carelessly made on 
 a whim.

Some may be, but it's unfair in general to say they're introduced carelessly on a whim.

The VC6 ones certainly seemed to be. Generation of a warning by a compiler is something that deserves almost as much care as a language specification. In the C++ world I really haven't seen evidence that much care is taken.
 For example, VC6 generates some utterly ridiculous warnings. In some 
 cases, the chance of it being a bug is not small, it is ZERO.

If that's true then it would be a compiler bug. If you know it to be true you can disable the warning with a pragma. Similarly in gcc all warnings are supposed to have an on/off switch. So you get to choose which warnings you think are important. I am well aware that some people choose to ignore all warnings in order to code faster. In general it's a false economy, like not writing unit-tests.

That's totally different. The issue is not that "generating warning-free code is more work". Rather, the problem is when there is no reasonable way to make the warning go away that doesn't involve writing incorrect code.
 In DMD, the signed/unsigned mismatch warning is almost always 
 spurious. Getting rid of it reduces code quality.

I have encountered quite a few bugs (in C++) relating to unsigned/signed mismatches. It's a very subtle and hard-to-spot problem when a simple addition suddenly changes the sign of your result. It costs an ugly cast to remove the warning, but that is a trade I'm prepared to make to never have to worry about such bugs.

(1) Casts can hide far more serious errors. You want to say, "I know this 
is a signed-unsigned mismatch, but I know it is ok in this instance". But 
what you are saying is, "Please change the type of this variable. No matter 
what it is, turn it into an int".

(2) In DMD, the signed/unsigned error (as it exists today) really is 
garbage. I've had to introduce incorrect code (via casts) into both Tango 
and Phobos in order to satisfy it.
Jul 11 2008
parent reply Walter Bright <newshound1 digitalmars.com> writes:
Don wrote:
 (2) In DMD, the signed/unsigned error (as it exists today) really is 
 garbage. I've had to introduce incorrect code (via casts) into both 
 Tango and Phobos in order to satisfy it.

I agree that looks pretty damning.
Jul 11 2008
parent Don <nospam nospam.com.au> writes:
Walter Bright wrote:
 Don wrote:
 (2) In DMD, the signed/unsigned error (as it exists today) really is 
 garbage. I've had to introduce incorrect code (via casts) into both 
 Tango and Phobos in order to satisfy it.

I agree that looks pretty damning.

It's bugzilla 1257.
Jul 14 2008
prev sibling next sibling parent reply "Manfred_Nowak" <svv1999 hotmail.com> writes:
Nick Sabalausky wrote:

 Like I've said, compiler warnings are essentialy a built-in lint
 tool. 

Finally the contradiction seems to show up: if lint is a tool in its own 
right, then one must have strong arguments to incorporate it into any other 
tool.

In the parallel posting Walter remarks, but does not emphasize, that 
compilers have more goals than enabling the evaluation of sources, which is 
what your OP concentrates on. Building large software systems and migrating 
some application source to another architecture are among the duties of 
compilers.

For at least huge parts of these latter tasks a reevaluation of some static 
aspects of semantics of the application is useless but time consuming. In 
addition and by definition lint tools are not capable of doing more than 
this.

This is the central point: lint tools are only capable of informing about 
possible bugs. If a warning emitted by a lint tool would be a sure hint for 
a bug in the program, then the compiler should have emitted an error.

Thus there should be no need to incorporate any lint tool into any 
compiler. I am ready to read your counter arguments.

-manfred

-- 
Maybe some knowledge of some types of disagreeing and their relation can 
turn out to be useful:
http://blog.createdebate.com/2008/04/07/writing-strong-arguments/
Jul 09 2008
parent reply "Nick Sabalausky" <a a.a> writes:
"Manfred_Nowak" <svv1999 hotmail.com> wrote in message 
news:g51tmn$1kb3$1 digitalmars.com...
 Nick Sabalausky wrote:

 Like I've said, compiler warnings are essentialy a built-in lint
 tool.

Finally the contradiction seems to show up: if lint is a tool in its own right, then one must have strong arguments to incorporate it into any other tool.

1. There is no sufficient D lint tool either currently in existence or on 
the foreseeable horizon (at least as far as I'm aware).

2. The compiler is already in a position to provide such diagnostics (and 
in fact, already does for certain other conditions).
 In the parallel posting Walter remarks, but does not emphasize, that
 compilers have more goals than enabling the evaluation of sources, which
 is what your OP concentrates on. Building large software systems and
 migrating some application source to another architecture are among the
 duties of compilers.

 For at least huge parts of these latter tasks a reevaluation of some
 static aspects of semantics of the application is useless but time
 consuming.

Hence, optional.
 In addition and by definition lint tools are not capable of
 doing more than this.

Which is part of what makes a compiler inherently more general-purpose, and a lint tool a mere symptom of a compiler's shortcomings.
 This is the central point: lint tools are only capable of informing
 about possible bugs. If a warning emitted by a lint tool would be a
 sure hint for a bug in the program, then the compiler should have
 emitted an error.

Warnings are never sure hints about a bug in the program either. That's what makes them warnings.
 Thus there should be no need to incorporate any lint tool into any
 compiler. I am ready to read your counter arguments.

I could turn that around and say that with a sufficient lint tool incorporated into the compiler (activated optionally, of course), there would be no need for an external lint tool. Plus, an external lint tool is, by necessity, going to incorporate a lot of duplicated functionality from the compiler (roughly the whole front-end). Although I suppose that could be moved into a shared library to avoid duplicated maintenance efforts. But since you mentioned that having lint work being done in the compiler would be uselessly time consuming (Again, uselessly time consuming only if there's no switch to turn such checking on/off. Also, I assume you're referring to the speed of compiling), then I should point out that with an external lint tool, you're likely to have plenty of duplicated processing going on (lexing and parsing once for the external lint, and again for the real compiler).
Jul 09 2008
next sibling parent reply Walter Bright <newshound1 digitalmars.com> writes:
Nick Sabalausky wrote:
 Warnings are never sure hints about a bug in the program either. That's what 
 makes them warnings.

Not always. Sometimes they are the result of an inability to change the language specification (because it's a standard). Common C++ compiler warnings are indications of bugs in the standard that can't be fixed.
Jul 09 2008
parent reply "Nick Sabalausky" <a a.a> writes:
"Walter Bright" <newshound1 digitalmars.com> wrote in message 
news:g530ol$18th$2 digitalmars.com...
 Nick Sabalausky wrote:
 Warnings are never sure hints about a bug in the program either. That's 
 what makes them warnings.

Not always. Sometimes they are the result of an inability to change the language specification (because it's a standard). Common C++ compiler warnings are indications of bugs in the standard that can't be fixed.

Fair enough. But that doesn't really apply to the current state of D, though, does it?
Jul 09 2008
parent Walter Bright <newshound1 digitalmars.com> writes:
Nick Sabalausky wrote:
 Fair enough. But that doesn't really apply to the current state of D, 
 though, does it? 

No, except for 1.0. Many common C++ warnings were put into D as errors because the spec could be changed.
Jul 09 2008
prev sibling parent "Manfred_Nowak" <svv1999 hotmail.com> writes:
Nick Sabalausky wrote:

 1. There is no sufficient D lint tool either currently in
 existence or on the foreseeable horizon (at least as far as I'm
 aware). 

But this is at most a requirement for building a lint tool, not an argument for incorporating a lint tool into a compiler.
 2. The compiler is already in a position to provide such
 diagnostics (and in fact, already does for certain other
 conditions). 

This again is no argument for a lint tool in a compiler. It is at most an argument that, where some checking is already built into a compiler, one should be able to toggle its behaviour on or off.
 For at least huge parts of these latter tasks a reevaluation of
 some static aspects of semantics of the application is useless
 but time consuming.

Hence, optional.

But why optional? If one needs the code one needs no checking any more. If one needs the checking, one needs no code.
 In addition and by definition lint tools are not capable of
 doing more than this.

Which is part of what makes a compiler inherently more general-purpose, and a lint tool a mere symptom of a compiler's shortcomings.

This is based on the assumption that a compiler without lint functionality has shortcomings, which still has to be proven.
 Plus, an
 external lint tool is, by necessity, going to incorporate a lot of
 duplicated functionality from the compiler (roughly the whole
 front-end).
 Although I suppose that could be moved into a shared
 library to avoid duplicated maintenance efforts. But since you
 mentioned that having lint work being done in the compiler would 
 be uselessly time consuming (Again, uselessly time consuming only
 if there's no switch to turn such checking on/off. Also, I assume
 you're referring to the speed of compiling), then I should point
 out that with an external lint tool, you're likely to have plenty
 of duplicated processing going on (lexing and parsing once for the
 external lint, and again for the real compiler). 

This is an argument only for having an intermediate representation suitable for the compiler and the lint tool. Interestingly IBM wants to sell the integration of a lint tool into the IDE: http://www-306.ibm.com/software/rational/announce/swanalyzer/ -manfred -- Maybe some knowledge of some types of disagreeing and their relation can turn out to be useful: http://blog.createdebate.com/2008/04/07/writing-strong-arguments/
Aug 05 2008
prev sibling parent "Bill Baxter" <wbaxter gmail.com> writes:

On Mon, Jul 28, 2008 at 11:15 AM, Robert Fraser
<fraserofthenight gmail.com>wrote:

 Jussi Jumppanen wrote:

 Bruno Medeiros Wrote:

  Before some people here say they don't use an IDE, but instead use
 <editor foo with syntax highlighting and little more than that> and are fine
 with it,

I would say that the reason developers still prefer to code with text editors rather than IDEs is they find the text editor more productive. Eclipse-based IDEs are just far too slow for a good developer's fingers. When you're used to a super quick, highly responsive editor, it can be terribly frustrating to have to step down to a slow IDE. The slowness of the keyboard response turns what was an automatic action, that of typing, into a thought process, and this plays havoc with the 'thinking about the code while I type' thought process.

Bullshit. Do you have a 200 MHz Pentium with 128MB RAM? Even then, IDEs are going to prioritize the editor itself over any autocomplete/background processing, so the editor shouldn't be any less responsive. It might take 5 seconds if you click "go to definition" and it has to open a new file, but that's vs 2 minutes of searching for an import, finding the file location, and using find to get to the definition in that file. The issue is the placebo effect and the comfort zone... which are real issues (that's why so many people are like "oh, Vista is soooo bloated compared to XP"...). If you've been using ed to write code for the last 30 years, the mental concept of using your $2000 computer to its full potential to help you write software is mind-boggling. If you're more comfortable with your "power-editor" or just can't deal with a 1-minute startup time for a product you're going to be using for 8 hours, well all the more power to ya; no amount of productivity gains could make you willing to switch. I'm not saying "more complex is always better," but why let all that processing power go to waste?

I think part of the problem is that there are a whole lot of IDEs that 
really don't live up to the potential you guys are talking about. Plus, 
IDEs come with their own set of problems. For instance, I just wasted most 
of a day getting a MSVC7 project set up to also work with MSVC9. That's 
just ridiculous. Microsoft goes and makes these minor changes to their 
project file formats for every release of Visual Studio, and then only 
provides a tool to do 1-way, in-place upgrades of all your project files. 
It's insane. Just imagine if you were forced to fork your makefiles for 
every dang version of GCC that comes out. The way project management works 
in IDEs is often just completely silly like that. The so-called 
"Intellisense" in Visual Studio has also historically been pretty lame, 
with refactoring support basically non-existent. The Visual Assist add-on 
from Whole Tomato was pretty much a "must" to bring it up to snuff. I get 
the impression that the Java IDEs offer a lot more on the refactoring 
frontier.

So that's just to say, it's easy to get the impression that IDEs are not 
useful, because there are many IDEs that genuinely are not that useful. I 
can see where Jussi is coming from. I have a feeling that when Brunos says 
"IDE" he's thinking of IDEs at their very best, not another one of these 
lame editors with syntax highlighting and a "compile" button that claims 
to be an IDE.

I still primarily like to use my good ole emacs for writing large amounts 
of new code. There I don't find all the little buttons and completion 
popups and things in an IDE very useful. But when it comes to debugging 
and fixing code, damn it's nice to have the IDE there with all its quick 
cross-linking abilities. The integrated debugger in MSVC is also damn fine.

--bb
Jul 27 2008
prev sibling next sibling parent reply Don <nospam nospam.com.au> writes:
Walter Bright wrote:
 Nick Sabalausky wrote:
 I don't suppose there's any chance of DMD getting a warning for 
 variables/arguments that are declared but never accessed? Just today 
 alone there's been two bugs I spent 10-30 minutes going nuts trying to 
 track down that turned out to be variables I had intended to use but 
 forgot to. 

The problem with unused variable warnings is they are annoying when you're 
developing code in an iterative manner. They get in the way when you're 
commenting out sections of code to try and isolate a problem. They can be a 
problem when using "version" and "static if" statements.

So, why not just turn off the warnings? The problem with warnings is that 
if there are n warnings, there are essentially 2^n different versions of 
the language. If you're faced with compiling someone else's code (like you 
downloaded it off the internet and have to compile it because it is only 
distributed as source) and warnings go off, is that a bug in the code or 
not? What do you do?

Some shops have a "thou shalt compile with warnings enabled, and there 
shall be no warning messages." That causes problems when you port the code 
to a different compiler with a different, even contradictory, notion of 
what is a warning. So then you wind up putting wacky things in the code 
just to get the compiler to shut up about the warnings. Those kinds of 
things tend to interfere with the beauty of the code, and since they are 
not necessary to the program's logic, they tend to confuse and misdirect 
the maintenance programmer (why is this variable pointlessly referenced 
here? Why is this unreachable return statement here? Is this a bug?)

I agree. Unfortunately, there's a problem with the 'optional' warnings we have in DMD right now. They are not optional for libraries. If a library generates warnings, then it is not usable by anyone who wants to compile with warnings enabled. I'd love to see the warnings tightened up so that they can become bugs.
 There is a place for warnings, however. That is in a separate static 
 analysis tool (i.e. lint, coverity, etc.) which can be armed with all 
 kinds of heuristics with which to flag questionable constructs. I don't 
 think they should be part of the compiler, however.

Jul 07 2008
next sibling parent Don <nospam nospam.com.au> writes:
Don wrote:
 Walter Bright wrote:
 Nick Sabalausky wrote:
 I don't suppose there's any chance of DMD getting a warning for 
 variables/arguments that are declared but never accessed? Just today 
 alone there's been two bugs I spent 10-30 minutes going nuts trying 
 to track down that turned out to be variables I had intended to use 
 but forgot to. 

 The problem with unused variable warnings is they are annoying when you're developing code in an iterative manner. They get in the way when you're commenting out sections of code to try and isolate a problem. They can be a problem when using "version" and "static if" statements. So, why not just turn off the warnings? The problem with warnings is that if there are n warnings, there are essentially 2^n different versions of the language. If you're faced with compiling someone else's code (like you downloaded it off the internet and have to compile it because it is only distributed as source) and warnings go off, is that a bug in the code or not? What do you do? Some shops have a "thou shalt compile with warnings enabled, and there shall be no warning messages." That causes problems when you port the code to a different compiler with a different, even contradictory, notion of what is a warning. So then you wind up putting wacky things in the code just to get the compiler to shut up about the warnings. Those kinds of things tend to interfere with the beauty of the code, and since they are not necessary to the program's logic, they tend to confuse and misdirect the maintenance programmer (why is this variable pointlessly referenced here? Why is this unreachable return statement here? Is this a bug?)

I agree. Unfortunately, there's a problem with the 'optional' warnings we have in DMD right now. They are not optional for libraries. If a library generates warnings, then it is not usable by anyone who wants to compile with warnings enabled. I'd love to see the warnings tightened up so that they can become bugs.

Of course, I mean 'errors' not bugs!
 
 There is a place for warnings, however. That is in a separate static 
 analysis tool (i.e. lint, coverity, etc.) which can be armed with all 
 kinds of heuristics with which to flag questionable constructs. I 
 don't think they should be part of the compiler, however.


Jul 07 2008
prev sibling parent reply "Bruce Adams" <tortoise_74 yeah.who.co.uk> writes:
On Mon, 07 Jul 2008 09:23:15 +0100, Don <nospam nospam.com.au> wrote:

 I agree. Unfortunately, there's a problem with the 'optional' warnings  
 we have in DMD right now. They are not optional for libraries. If a  
 library generates warnings, then it is not usable by anyone who wants to  
 compile with warnings enabled.

 I'd love to see the warnings tightened up so that they can become bugs.

That is arguably a flaw in the way D processes libraries. But it's the same 
flaw as in C++. If your include file has warnings you have to suppress them 
when using the library. Presumably if you use .di files (created/compiled 
with warnings switched off) this won't happen?

Regards,

Bruce.
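(A sketch of that workflow using flags DMD actually has; the file names are 
invented. -H asks the compiler to emit a .di interface file, -c compiles 
without linking, and -w turns warnings on:)

   dmd -H -c mylib.d       # emits mylib.di alongside the object file
   dmd -w -c app.d         # app's import of mylib resolves to mylib.di,
                           # so the library body is never re-parsed
   dmd app.obj mylib.obj   # link as usual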
Jul 09 2008
parent reply Don <nospam nospam.com.au> writes:
Bruce Adams wrote:
 On Mon, 07 Jul 2008 09:23:15 +0100, Don <nospam nospam.com.au> wrote:
 
 I agree. Unfortunately, there's a problem with the 'optional' warnings 
 we have in DMD right now. They are not optional for libraries. If a 
 library generates warnings, then it is not usable by anyone who wants 
 to compile with warnings enabled.

 I'd love to see the warnings tightened up so that they can become bugs.

That is arguably a flaw in the way D processes libraries. But it's the same flaw as in C++.

It's even worse in D, though, because with warnings switched on the library won't compile at all.
 If your include file has warnings you have to surpress them when using 
 the library.

And D has no way to do that.
 Presumably if you use .di files (created/compiled with warnings switched 
 off) this won't
 happen?

That's an interesting idea. That might well work.
 
 Regards,
 
 Bruce.

Jul 09 2008
parent "Bruce Adams" <tortoise_74 yeah.who.co.uk> writes:
On Wed, 09 Jul 2008 09:34:30 +0100, Don <nospam nospam.com.au> wrote:

 Bruce Adams wrote:
 On Mon, 07 Jul 2008 09:23:15 +0100, Don <nospam nospam.com.au> wrote:

 I agree. Unfortunately, there's a problem with the 'optional' warnings  
 we have in DMD right now. They are not optional for libraries. If a  
 library generates warnings, then it is not usable by anyone who wants  
 to compile with warnings enabled.

 I'd love to see the warnings tightened up so that they can become bugs.

 That is arguably a flaw in the way D processes libraries. But it's the same flaw as in C++.

It's even worse in D, though, because with warnings switched on the library won't compile at all.

Presumably not, if the library has already been compiled with whatever flags it needs and you are only parsing the D to import interfaces etc. for your code.
 If your include file has warnings you have to surpress them when using  
 the library.

And D has no way to do that.
 Presumably if you use .di files (created/compiled with warnings  
 switched off) this won't
 happen?

That's an interesting idea. That might well work.

Jul 09 2008
prev sibling next sibling parent "Koroskin Denis" <2korden gmail.com> writes:
On Wed, 09 Jul 2008 12:49:34 +0400, Walter Bright  
<newshound1 digitalmars.com> wrote:

 Here are some horrid examples from my own code which, to please the  
 client, had to compile with all warnings on for MSVC:

 ---
    p = NULL;  // suppress spurious warning
 ---
    b = NULL;  // Needed for the b->Put() below to shutup a compiler  
 use-without-init warning
 ---
    #if _MSC_VER
    // Disable useless warnings about unreferenced formal parameters
    #pragma warning (disable : 4100)
    #endif
 ---
    #define LOG 0       // 0: disable logging, 1: enable it

    #ifdef _MSC_VER
    #pragma warning(disable: 4127)      // Caused by if (LOG)
    #endif // _MSC_VER
 ---

 Note the uglification this makes for code by forcing useless statements  
 to be added. If I hadn't put in the comments (and comments are often  
 omitted) these things would be a mystery.

We don't have problems with most of these in D, since there are no C-style 
macros and no uninitialized variables.

Moreover, I would be happy to have an `unused` modifier in addition to in, 
out and inout (doh!) to denote that a variable is not going to be used. In 
this case the compiler will show an error if the variable is used by 
chance. It could help the programmer to catch potential bugs at an early 
stage once he eventually starts using it. Besides, it really fits well into 
D, IMO:

void bar( unused int foo ) // no warning is generated
{
}
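(A sketch of the semantics being proposed; note that `unused` is 
hypothetical and not an actual D keyword:)

   void bar(unused int foo)
   {
       foo += 1;  // would be rejected: foo is declared unused
   }

   void baz(unused int foo)
   {
       // fine: foo is never referenced, exactly as declared
   }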
Jul 09 2008
prev sibling next sibling parent "Koroskin Denis" <2korden gmail.com> writes:
On Wed, 09 Jul 2008 14:05:21 +0400, bearophile <bearophileHUGS lycos.com>  
wrote:

 Koroskin Denis:
 Moreover, I would be happy to have an `unused` modifier in addition to  
 in,
 out and inout (doh!) to denote that a variable is not going to be used.  
 In
 this case compiler will show an error if the variable is used by chance.
 It could help programmer to catch potential bugs at early stage once he
 eventually start using it. Besides, it really fits well into D, IMO:
 void bar( unused int foo ) // no warning is generated
 {
 }

 Can you explain to me in what practical situation(s) this can be useful? Bye, bearophile

It is most useful if a warning is generated when a variable is unused:

class Connection
{
    void connect(int timeout)
    {
        // do something
    }
}

class SomeOtherConnectionType : Connection
{
    void connect(unused int timeout)
    {
        // this type of connection is immediate, and therefore there is
        // no need for timeout; but since we don't use timeout, just mark
        // it unused
        // do the connection
    }
}

And then you realize that due to some specific changes this type of 
connection is no longer immediate, so now you are going to take the timeout 
into account. And you see an unused modifier that says to you: "Man, this 
variable was not used before; go check your code to see whether there are 
some cases where you passed some dummy value to this function just to 
satisfy the compiler, like this:

auto connection = new SomeOtherConnectionType();
connection.connect(0); // I don't care, since it is immediate anyway

and then refactor your code".
Jul 09 2008
prev sibling next sibling parent "Koroskin Denis" <2korden gmail.com> writes:
On Wed, 09 Jul 2008 15:26:54 +0400, Manfred_Nowak <svv1999 hotmail.com>  
wrote:

 Koroskin Denis wrote:


     void connect(unused int timeout)

 connection.connect(0); // I don't care, since it is immediate
 anyway

 and then refactor your code

 In which way is this type of coding better than preparing for overloading
 `connect':

    void connect(){
        // ...
    }
    void connect( int timeout){
        // ...
        this.connect();
    }

 and then calling

    connection.connect(); // immediate connection

 In addition: why is it good to be forced to refactor?

 -manfred

You asked for an example, I provided one. There is another one:

class Node
{
    private Node parent;

    Node getRoot()
    {
        Node p = parent;
        while (parent !is null)
        {
            parent = parent.parent;
        }
        return parent;
    }
}

Actually, getRoot() isn't supposed to modify this.parent, but it does by 
accident (say nothing about const, please!). In my code, I was going to 
modify a local variable, not a member. The local variable p was defined and 
has a value assigned, but it's _not_ used. The compiler could warn me that 
I don't use it, and that would help to detect the problem.
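(For contrast, a reconstruction of what the loop was presumably meant to 
look like; this is my sketch, not from the original post. All mutation goes 
through the local p, and it is precisely the accidentally-unused p above 
that the proposed warning would have flagged:)

   Node getRoot()
   {
       Node p = this;
       while (p.parent !is null)
       {
           p = p.parent;
       }
       return p;
   }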
Jul 09 2008
prev sibling next sibling parent "Davidson Corry" <davidsoncorry comcast.net> writes:
On Tue, 08 Jul 2008 14:40:26 -0700, Walter Bright  
<newshound1 digitalmars.com> wrote:

 Nick Sabalausky wrote:
 Ok, so the different warnings should be able to be turned on and off.  
 If you don't agree with a particular type of warning then you turn it  
 off. That's the nice thing about warnings as opposed to errors: they're  
 optionally letting you know about certain conditions that you might  
 want to be aware of, and they do it without changing, redefining, or  
 otherwise affecting the language itself.

That situation exists today for C++ compilers, and it's not so good. You have, as I mentioned previously, 2^n different languages instead of 1. Portability becomes a problem. Confusion about whether the code should compile or not reigns.
 If it was in the compiler, it would inhibit development of static  
 analysis tools,


It's the same reason why "m4" never caught on as a C preprocessor, despite being vastly superior, and despite everyone who wanted a better CPP being told to use m4.
 and would confuse the issue of what was correct D code.

Warnings wouldn't confuse the issue of what is valid code. If it wasn't valid it would generate an error instead of a warning.

That's true, but it is not what happens in the real world with warnings. I've dealt with warnings on C/C++ compilers for 25 years, and the practice is very different from the theory.

I agree with Walter. One of the driving forces behind D was a desire *not* to have the quirks, corners and obscurities that grew within C++ over the years because Stroustrup wanted full backwards compatibility with C, etc. I want a compiler that says *this* is legal D, *that* is not, and there's an end on it. I *also* want a tool (or sheaf of tools, smart editor, etc.) that will do lint-like static analysis and style vetting to warn me that, yes, this is legal D but you're using it in an obscure or unmaintainable or not easily extensible or not easily understood manner. _But_I_don't_want_that_tool_to_be_the_compiler_! Walter is right that you end up with effectively 2**n different languages depending, not only on which warnings you enable|disable, but also on whether the shop you work for demands that you compile at /W1 or /W3 or /W4 and does or doesn't treat warnings as errors. Yes, having the full parse tree available makes it easier to find some (not all) of those sorts of... not "errors", call them "infelicities". So compiler writers have tried to be generous and give their users more information "for free", and by doing so have made IMHO a design error. It is exactly analogous to overloading an operator with functionality that doesn't properly apply to that operation. You're trying to do too much with one tool. I applaud Walter for not making that error. And I want him focused on writing a knife-clean compiler that stabs illegal code in the heart, and trusts the programmer to have meant what he said when the code is legal, even if it's "excessively clever". Let someone *else* write "Clippy for D". -- Dai
Jul 09 2008
prev sibling next sibling parent reply JAnderson <ask me.com> writes:
Walter Bright wrote:
 Nick Sabalausky wrote:
 I don't suppose there's any chance of DMD getting a warning for 
 variables/arguments that are declared but never accessed? Just today 
 alone there's been two bugs I spent 10-30 minutes going nuts trying to 
 track down that turned out to be variables I had intended to use but 
 forgot to. 

 So, why not just turn off the warnings? The problem with warnings is that if there are n warnings, there are essentially 2^n different versions of the language. <Snip>

 There is a place for warnings, however. That is in a separate static 
 analysis tool (i.e. lint, coverity, etc.) which can be armed with all 
 kinds of heuristics with which to flag questionable constructs. I don't 
 think they should be part of the compiler, however.

Something like lint can be run and have consistent output on every D 
compiler and platform, since it doesn't care about building the actual 
platform-specific code. Having no warnings in the compiler means you can't 
have hardware/platform-specific warnings. And I think that's great.

Personally, in C++ I like to turn on warning level 4 with warnings as 
errors and run both a GCC and an MSVC++ compile (when working on multiple 
platforms). Most warnings can be removed without use of pragmas, and using 
both compilers gives much better coverage of warnings.

I'm personally a fan of catching things early. The more warnings as errors 
the better. If I have to suffer a little for false positives, *shrug*, it's 
much better than spending hours in a mud pit full of crocodiles; that is, 
debugging.

-Joel
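(Concretely, that practice corresponds to flags both compilers really have; 
"src.cpp" is a placeholder file name:)

   g++ -Wall -Wextra -Werror -c src.cpp   # GCC: warnings on, as errors
   cl /W4 /WX /c src.cpp                  # MSVC: level 4, warnings as errors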
Jul 09 2008
parent reply "Manfred_Nowak" <svv1999 hotmail.com> writes:
JAnderson wrote:

 The more warnings as errors the better.  If I have to suffer a
 little for false positives *shrug*

What do you understand by "a little"?

Please look at the example from 
http://www.digitalmars.com/webnews/newsgroups.php?art_group=digitalmars.D&article_id=73441

Do you recognize how many warnings a lint tool might emit on that code? 
Would you admit, then, that a paranoid lint would be quite useless, even if 
it detects that the variable `p' should be accessed? Would you admit that 
you yourself are unable to decide whether the presence of some access 
statements to `p' should suppress the warning?

My understanding of lint tools is that they incorporate a collection of 
programming patterns together with a fuzzy recognition algorithm. If there 
are enough hits for a specific pattern, but it is still only partially 
implemented, then warnings are generated. Under this the primary question 
is: what is so special about the collection of programming patterns that 
they can be formalized into a lint tool but not be used as paradigms in the 
source language?

-manfred

-- 
Maybe some knowledge of some types of disagreeing and their relation can 
turn out to be useful:
http://blog.createdebate.com/2008/04/07/writing-strong-arguments/
Jul 10 2008
next sibling parent reply JAnderson <ask me.com> writes:
Manfred_Nowak wrote:
 JAnderson wrote:
 
 The more warnings as errors the better.  If I have to suffer a
 little for false positives *shrug*

What do you understand by "a little"?

I don't understand what you're asking. I meant that if I have to fix it because the compiler tells me it's an error, then so be it. It's a little pain for a lot of gain.
 
 Please look at the example from 
 http://www.digitalmars.com/webnews/newsgroups.php?
 art_group=digitalmars.D&article_id=73441 
 
 Do you recognize how many warnings a lint tool might emit on that code?
 Would you admit then, that a paranoid lint would be quite useless, even 
 if it detects that the variable `p' should be accessed?

I don't understand? With lint it just gives you hints about what could be wrong. You pick and choose what to fix.
 Would you 
 admit, that you yourself are unable to decide whether the presence of 
 some access statements to `p' should suppress the warning?

I would prefer this be an error like C#. In C++, because all my warnings are errors, it would be an error too. If you really want to use an uninitialized variable there should be a workaround, but it should be harder to do.
 My understanding of lint tools is, that they incorporate a collection 
 of programming patterns together with a fuzzy recognition algorithm. If 
 there are enough hits for a specific pattern, but it is still only 
 partially implemented, then warnings are generated. Under this the 
 primary question is: what is so special to the collection of 
 programming patterns that they can be formalized into a lint tool but 
 not be used as paradigms in the source language?

For me, anything that isn't really an error (and I think a lot more of C++ warnings should be errors). This means the lint effort can be separate. It means they can continually add and remove checks while the compiler is worked on as a separate effort. Things like unused variables might be a candidate; however, being the pedantic coder that I am, I prefer them as errors as well. I simply don't add an identifier, or I semicolon the value, when I'm writing stubs.
 
 -manfred  
 

Jul 10 2008
parent JAnderson <ask me.com> writes:
JAnderson wrote:
 Manfred_Nowak wrote:
 JAnderson wrote:

 The more warnings as errors the better.  If I have to suffer a
 little for false positives *shrug*

What do you understand by "a little"?

 I don't understand what you're asking. I meant that if I have to fix it because the compiler tells me it's an error, then so be it. It's a little pain for a lot of gain.
 Please look at the example from 
 http://www.digitalmars.com/webnews/newsgroups.php?
 art_group=digitalmars.D&article_id=73441
 Do you recognize how many warnings a lint tool might emit on that code?
 Would you admit then, that a paranoic lint would be quite useless, 
 even if it detects that the variable `p' should be accessed?

I don't understand? With lint it just gives you hints about what could be wrong. You pick and choose what to fix.
 Would you admit, that you yourself are unable to decide whether the 
 presence of some access statements to `p' should suppress the warning?

 I would prefer this be an error like C#. In C++, because all my warnings are errors, it would be an error too. If you really want to use an uninitialized variable there should be a workaround, but it should be harder to do.

Before someone else corrects me: this is not an error in C#. I was thinking of "used uninitialized variable", not "variable not used". And I still prefer errors for these.
 
 My understanding of lint tools is, that they incorporate a collection 
 of programming patterns together with a fuzzy recognition algorithm. 
 If there are enough hits for a specific pattern, but it is still only 
 partial implemented, then warnings are generated. Under this the 
 primary question is: what is so special to the collection of 
 programming patterns that they can be formalized into a lint tool but 
 not be used as paradigms in the source language?

 For me, anything that isn't really an error (and I think a lot more of C++ warnings should be errors). This means the lint effort can be separate. It means they can continually add and remove checks while the compiler is worked on as a separate effort. Things like unused variables might be a candidate; however, being the pedantic coder that I am, I prefer them as errors as well. I simply don't add an identifier, or I semicolon the value, when I'm writing stubs.
 -manfred 


Jul 10 2008
prev sibling parent "Bruce Adams" <tortoise_74 yeah.who.co.uk> writes:
On Thu, 10 Jul 2008 10:01:25 +0100, Manfred_Nowak <svv1999 hotmail.com> wrote:

 JAnderson wrote:

 The more warnings as errors the better.  If I have to suffer a
 little for false positives *shrug*

 What do you understand by "a little"? Please look at the example from
 http://www.digitalmars.com/webnews/newsgroups.php?
 art_group=digitalmars.D&article_id=73441
 Do you recognize how many warnings a lint tool might emit on that code?

 Would you admit then, that a paranoic lint would be quite useless, even
 if it detects that the variable `p' should be accessed? Would you
 admit, that you yourself are unable to decide whether the presence of
 some access statements to `p' should suppress the warning?

Generally there are two types of code: code for which warnings are allowed, and warning-free code. Transitioning from code that's been allowed to have warnings for a long time to warning-free code takes effort. I still think the long-term benefits make it worth it.
 My understanding of lint tools is, that they incorporate a collection
 of programming patterns together with a fuzzy recognition algorithm. If
 there are enough hits for a specific pattern, but it is still only
 partially implemented, then warnings are generated. Under this the
 primary question is: what is so special to the collection of
 programming patterns that they can be formalized into a lint tool but
 not be used as paradigms in the source language?

 -manfred

...how you are allowed to put bricks together. Static analysis tools work at a much higher level. They say things like: this is a load-bearing wall, putting a door here without a lintel (pardon the pun) is unwise.

Semantic checks rely on trying to work out what your code is trying to do, not just following the steps needed to execute it (with certain exceptions).

Regards,

Bruce.
Jul 10 2008
prev sibling next sibling parent reply Markus Koskimies <markus reaaliaika.net> writes:
On Wed, 09 Jul 2008 15:13:15 -0700, Davidson Corry wrote:

 I agree with Walter. One of the driving forces behind D was a desire
 *not* to have the quirks, corners and obscurities that grew within C++
 over the years because Stroustrup wanted full backwards compatibility
 with C, etc.

This part I agree. D is a great language, and it has been my "home language" for years (replaced C++).
 I want a compiler that says *this* is legal D, *that* is
 not, and there's an end on it.

Maybe unused local vars, arguments or static arrays would be defined not to be legal D? :)
 I *also* want a tool (or sheaf of tools, smart editor, etc.) that will
 do lint-like static analysis and style vetting to warn me that, yes,
 this is legal D but you're using it in an obscure or unmaintainable or
 not easily extensible or not easily understood manner.
 _But_I_don't_want_that_tool_to_be_the_compiler_!

Oh, I would like to see that as a part of the compiler. In fact, the more warnings the compiler generates, the more I like it. For me, it could even warn about indentation quirks, like:

	...
	if(a == b)
		do_that();
		do_that_also();
	...

...in which case the compiler could stop and say: either add {}'s or correct the indentation :)
 Walter is right that you end up with effectively 2**n different
 languages depending, not only on which warnings you enable|disable, but
 also on whether the shop you work for demands that you compile at /W1 or
 /W3 or /W4 and does or doesn't treat warnings as errors.

Ah, there need only be one warning level: enable all, and regard warnings as errors. Who wants to disable warnings? Who wants to see only some of the warnings? There's just no use; IMO it's just fine to put all of them on screen and not compile until the programmer has corrected them :)
 I applaud Walter for not making that error. And I want him focused on
 writing a knife-clean compiler that stabs illegal code in the heart, and
 trusts the programmer to have meant what he said when the code is legal,
 even if it's "excessively clever".

Heh, I like compilers that do not over-estimate the cleverness of the developer, but instead think that they (the compilers) are the smarter part ;) Even knowing the syntax and best practices of a language well, many times I write something other than what I thought I wrote. For catching these kinds of spurious "miswritings" there is "syntactic salt" in many languages, including D. But at some point I think there's no use adding more of this salt; instead, do static checks to make the language better.

As a very simple example, the current DMD warns about this:

---
void error(string msg)
{
    writefln(msg);
    exit(-1);
}

int doSomethingWith(string a)
{
    if(a == null)
    {
        error("A null string");
    }
    else
    {
        return a.length;
    }
}
---

$ dmd test.d
warning - xxx: function test.doSomethingWith no return at end of function

...since it does not understand that exit never returns (yes, I know that case should be written in a different manner, but it is just an example). It could be told e.g. with some new return type (instead of "void exit" you would write "never_return exit"), and of course the analysis should go through the possible execution flows to check which parts of the code may return and which parts cannot. Similar cases occur with switch statements.

What I'm trying to say is that IMO it is impossible to think of language, compiler (code generation) and static checking as three separate things. If there is good synergy between these three elements, the language is great. But that's just my opinion...
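For what it's worth, a minimal sketch of how that example can be made to pass the check in D as it exists, using the assert(0) idiom (the compiler accepts assert(0) as a valid way to end a code path; the import locations assume a D2-era compiler):

---
import std.stdio;
import core.stdc.stdlib : exit;

void error(string msg)
{
    writefln(msg);
    exit(-1);
    assert(0); // documents, and tells the compiler, that control never gets here
}

int doSomethingWith(string a)
{
    if(a is null)
    {
        error("A null string");
        assert(0); // satisfies the "no return at end of function" check
    }
    else
    {
        return cast(int) a.length;
    }
}
---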
Jul 09 2008
parent reply "Nick Sabalausky" <a a.a> writes:
"Markus Koskimies" <markus reaaliaika.net> wrote in message 
news:g549hh$1h9i$2 digitalmars.com...
 On Wed, 09 Jul 2008 15:13:15 -0700, Davidson Corry wrote:

 I *also* want a tool (or sheaf of tools, smart editor, etc.) that will
 do lint-like static analysis and style vetting to warn me that, yes,
 this is legal D but you're using it in an obscure or unmaintainable or
 not easily extensible or not easily understood manner.
 _But_I_don't_want_that_tool_to_be_the_compiler_!

Oh, I would like to see that as a part of a compiler. In fact, the more the compiler generates warnings, the more I like it.

Right. See, even if you don't want that tool to be your compiler...you don't have to turn that feature on. If I want to use a TV remote, I can do so without dealing with the buttons that are built into the TV.
 Walter is right that you end up with effectively 2**n different
 languages depending, not only on which warnings you enable|disable, but
 also on whether the shop you work for demands that you compile at /W1 or
 /W3 or /W4 and does or doesn't treat warnings as errors.

Ah, there needs only be one warning level - enable all, and regard warnings as errors. Who wants to disable warnings? Who want only see part of warnings? Just no use, IMO it's just OK to put all of them to screen and not to compile until the programmer has corrected those :)

I'm not sure I see the need for as many as four warning levels (though I suppose I could be convinced given an appropriate argument), but something like this sounds ideal to me:

- enable typically-useful warnings
- enable anally-retentive, only sometimes-helpful, warnings
- treat typically-useful warnings as errors
- treat all warnings as errors
 I applaud Walter for not making that error. And I want him focused on
 writing a knife-clean compiler that stabs illegal code in the heart, and
 trusts the programmer to have meant what he said when the code is legal,
 even if it's "excessively clever".

Heh, I like compilers that do not over-estimate the cleverness of the developer, but instead think that they (the compilers) are the smarter part ;) Even knowing the syntax and best practices of a language well, many times I write something other than what I thought I wrote. For catching these kinds of spurious "miswritings" there is "syntactic salt" in many languages, including D. But at some point I think there's no use adding more of this salt; instead, do static checks to make the language better.

At the risk of a "me too" post...Me too ;)
Jul 10 2008
parent "Nick Sabalausky" <a a.a> writes:
"Markus Koskimies" <markus reaaliaika.net> wrote in message 
news:g55tmb$1h9i$17 digitalmars.com...
 On Thu, 10 Jul 2008 14:55:49 -0400, Nick Sabalausky wrote:

 I'm not sure I see the need for as many as four warning levels (though I
 suppose I could be convinced given an appropriate argument), but
 something like this sounds ideal to me:

 - enable typically-useful warnings
 - enable anally-retentive, only sometimes-helpful, warnings

 - treat typically-useful warnings as errors - treat all warnings as
 errors

What I think is that the basic compiler needs the following:

- A set of warnings that usually indicate bugs in the code and are relatively easy to circumvent (like unused vars and such), but which may be tolerated as at least somewhat frequent things while sketching software

- Basically two warning levels: either generate code while there are warnings, or don't generate code (treating them as errors)

Suppressing the output of warnings? Why? What use? If you are not going to correct the warnings in your code when completing it, why do you even read the output of the compiler (if the code is generated)? Closing your eyes does not make the things behind the warnings go away :)

Then, when dealing with larger software and looking for good places for refactoring, there could be an external "anally-retentive" lint-like tool :)

You've convinced me :)
Jul 10 2008
prev sibling next sibling parent Markus Koskimies <markus reaaliaika.net> writes:
On Wed, 09 Jul 2008 13:41:35 -0700, Walter Bright wrote:

 Nick Sabalausky wrote:
 "Walter Bright" <newshound1 digitalmars.com> wrote in message
 news:g530j8$18th$1 digitalmars.com...
 The reason for treating warnings as errors when warnings are enabled
 is so that, for a long build, they don't scroll up and off your screen
 and go unnoticed.

Pardon me for saying so, but that doesn't sound like a very convincing reason to turn every warning (which, by its very nature, is something that might not be a bug) into something that splits the language into what are effectively different languages.

If you turn warnings on, then you want to see them and presumably deal with them. If you don't deal with them, then they persist every time you compile, and either they get very irritating and you fix them anyway, or you develop a blind spot for them and never see the ones you do want to fix.

I completely agree with this. If warnings are generated, it's best to stop compilation and let the developer correct those parts. Warnings that do not stop the building process have no use at all.
Jul 09 2008
prev sibling next sibling parent reply Markus Koskimies <markus reaaliaika.net> writes:
On Wed, 09 Jul 2008 17:53:52 -0400, Nick Sabalausky wrote:

 In a "properly defined language", how would you solve the problem of
 unintentionally-unused variables?

My suggestion: just give an error. No need for an "unused" keyword; just comment out code that has no effect. For function arguments that are unused but mandatory for keeping an interface, leave the argument without a name (see the sketch below). Furthermore, also give errors for unused private/static things. If they are not used, why are they in the code? Just comment them out.

In a similar manner, warn about conditional expressions that have a constant value (like "uint a; if(a > 0) { ... }"), code that has no effect, and all those things :)

And yes, warnings could be considered "optional errors" for those of us who think it's best to tackle all sorts of quirks & potential bugs at compile time rather than trying to find them with runtime debugging. As long as the warning makes some sense and can be circumvented in some reasonable way, just throw it to my screen :)
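A minimal sketch of the unnamed-argument idea; leaving a parameter nameless is already legal D, though the interface and names here are invented for illustration:

---
import std.stdio;

interface EventHandler
{
    void onEvent(int code, string payload);
}

class Logger : EventHandler
{
    // payload is mandatory for the interface but unused here,
    // so the parameter is simply left without a name
    void onEvent(int code, string)
    {
        writefln("event %s", code);
    }
}
---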
Jul 09 2008
parent "Nick Sabalausky" <a a.a> writes:
"Markus Koskimies" <markus reaaliaika.net> wrote in message 
news:g54b6m$1h9i$4 digitalmars.com...
 On Wed, 09 Jul 2008 17:53:52 -0400, Nick Sabalausky wrote:

 In a "properly defined language", how would you solve the problem of
 unintentionally-unused variables?

My suggestion: just give an error. No need for an "unused" keyword; just comment out code that has no effect. For function arguments that are unused but mandatory for keeping an interface, leave the argument without a name. Furthermore, also give errors for unused private/static things. If they are not used, why are they in the code? Just comment them out.

In a similar manner, warn about conditional expressions that have a constant value (like "uint a; if(a > 0) { ... }"), code that has no effect, and all those things :)

I'd prefer a warning, but I'd be fine with all this.
 And yes, warnings could be considered as "optional errors" for us who
 think that it's best to tackle all sorts of quirks & potential bugs at
 compile time and not trying to find them with runtime debugging. As long
 as the warning makes some sense and can be circumvented in some
 reasonable way, just throw it to my screen :)

I, too, like to tackle all that stuff right when I compile. But whenever I've referred to warnings as "optional errors" here, what I meant was the case where it's impossible to turn off "treat warnings as errors". Even if warnings are not treated as errors, they're still going to show up on your screen (provided you at least enabled them, of course), so you can still choose to deal with them right then and there. The benefit is that you wouldn't have to fix (or wait for a fix for) any warnings in any third-party source libraries you use.

Also, while I can't confirm or deny this at the moment, someone here said that -w compiles currently halt at the first warning. If that's the case, then disabling "treat warnings as errors" would let you see all the warnings at once, not just the first one.

Plus, allowing "treat warnings as errors" to be disabled would decrease the strength of the phenomenon Walter and others described where warnings effectively create multiple versions of the same language. The phenomenon would only occur in places that take "No warnings allowed!" to an obsessive/compulsive/irrational level (rather than a merely sensible level), instead of happening to everybody.
Jul 10 2008
prev sibling next sibling parent Markus Koskimies <markus reaaliaika.net> writes:
On Thu, 10 Jul 2008 06:17:22 +0000, Markus Koskimies wrote:

Well, I need to share this experience with you; I had been debugging one 
of my segfaulting D programs for a few hours, and I finally found the 
reason. A shortened version:

	foreach(...)
	{
		Arg arg = null;

		...
		if( ... )
		...
		else if( ... )
		{
			...
			arg = somefunc( ... );
		}
		else if( ... )
		...
		else if( ... )
		{
--->			someotherfunc( ... ); <---
		}
		...
	...
	}

Those functions return Arg class objects, but earlier they returned voids 
(and used Arg objects as parameters). When modifying the code I forgot to 
store the return value in one of the execution paths --> segfaults.

Having an "optional error" (a warning) about not using the return value of 
a function would have saved a lot of time, cigarettes, coffee and 
headache :D
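A self-contained distillation of the bug pattern (the class and function names are invented for illustration): the branch that forgets to assign leaves arg null, and the later dereference crashes.

---
class Arg { int value; }

Arg somefunc()      { return new Arg; }
Arg someotherfunc() { return new Arg; }

void main(string[] args)
{
    Arg arg = null;

    if(args.length > 1)
        arg = somefunc();
    else
        someotherfunc();   // return value silently dropped; arg stays null

    arg.value = 1;         // segfaults whenever the second path is taken
}
---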
Jul 10 2008
prev sibling next sibling parent Markus Koskimies <markus reaaliaika.net> writes:
On Thu, 10 Jul 2008 05:00:54 -0400, bearophile wrote:

[...]
 Let's see... maybe just using indentation?

Ah, I'm a big fan of Python and I wouldn't complain if D used the same method for determining blocks ;D
Jul 10 2008
prev sibling next sibling parent Markus Koskimies <markus reaaliaika.net> writes:
On Thu, 10 Jul 2008 14:55:49 -0400, Nick Sabalausky wrote:

 I'm not sure I see the need for as many as four warning levels (though I
 suppose I could be convinced given an appropriate argument), but
 something like this sounds ideal to me:
 
 - enable typically-useful warnings
 - enable anally-retentive, only sometimes-helpful, warnings
 
 - treat typically-useful warnings as errors - treat all warnings as
 errors

What I think is that the basic compiler needs the following:

- A set of warnings that usually indicate bugs in the code and are relatively easy to circumvent (like unused vars and such), but which may be tolerated as at least somewhat frequent things while sketching software

- Basically two warning levels: either generate code while there are warnings, or don't generate code (treating them as errors)

Suppressing the output of warnings? Why? What use? If you are not going to correct the warnings in your code when completing it, why do you even read the output of the compiler (if the code is generated)? Closing your eyes does not make the things behind the warnings go away :)

Then, when dealing with larger software and looking for good places for refactoring, there could be an external "anally-retentive" lint-like tool :)
Jul 10 2008
prev sibling next sibling parent reply Markus Koskimies <markus reaaliaika.net> writes:
On Thu, 10 Jul 2008 15:20:54 -0400, Nick Sabalausky wrote:

About that "warnings as errors": for me the reason for that behavior is 
that I usually include executing the program in the command line when coding 
(and use the up arrow to rerun it, if the compiler didn't accept my code):

$ make && ./myProgram

If the compiler does not stop for warnings, I'd need some sort of build 
log to examine the warnings after execution. But if the compiler returns an 
error value (like -1) when it meets warnings, the program is not executed 
and I can easily examine the reasons.

This same happens of course with IDEs, when using "Run" instead of first 
compiling/building the software.

 Also, while I
 can't confirm or deny this at the moment, someone here said that -w
 compiles currently halt at the first warning.

No, it shows all warnings it generates. But IMO it does not generate enough warnings.
Jul 10 2008
parent "Nick Sabalausky" <a a.a> writes:
"Markus Koskimies" <markus reaaliaika.net> wrote in message 
news:g55u4d$1h9i$18 digitalmars.com...
 On Thu, 10 Jul 2008 15:20:54 -0400, Nick Sabalausky wrote:

 About that "warnings as errors"; for me the reason for that behavior is
 that I usually include executing the code to the command line when coding
 (and use up arrow to rerun it, if the compiler didn't accept my code):

 $ make && ./myProgram

 If the compiler does not stop for warnings, I'd need some sort of build
 log to examine the warnings after execution. But if the compiler returns
 error value (like -1) when meeting warnings, the program was not executed
 and I can easily examine the reasons.

 This same happens of course with IDEs, when using "Run" instead of first
 compiling/building the software.

 Also, while I
 can't confirm or deny this at the moment, someone here said that -w
 compiles currently halt at the first warning.

No, it shows all warnings it generates. But IMO it does not generate enough warnings.

I suppose I should point out that I have nothing against treating warnings as errors, per se. I just think it should be optional, and not forced by the compiler to be either "always treated as errors and there's nothing you can do about it" or "never treated as errors and there's nothing you can do about it".
Jul 10 2008
prev sibling next sibling parent Markus Koskimies <markus reaaliaika.net> writes:
On Thu, 10 Jul 2008 20:28:53 -0400, Nick Sabalausky wrote:

 I suppose I should point out that I have nothing against treating
 warnings as errors, per se. I just think it should be optional and not
 forced by the compiler to be either "always treated as errors and
 there's nothing you can do about it" or "never treated as errors and
 there's nothing you can do about it"

Honestly:

(1) I had been using the D compiler happily for some years and I thought that it generated warnings just like other compilers do. I was shocked to discover that it really does not.

(2) I realized that there is some kind of fundamentalist ideology of not producing warnings from the compiler (which is extremely silly from my point of view); that's why I suggested that, combining the current D possibilities, it would really make no big difference to treat warnings as errors (since it seems that it is more likely for checks to enter the compiler as errors, not warnings).

(3) From the point of view of both the programmer and the compiler designer, I see absolutely no point in not generating warnings when the compiler knows something probably silly has been done. The more optimizations the compiler does, the more aware it is of the source code. What v*#p%&"/(¤ %&#s¤/&/ # %¤yh&/&/"&/&# %&/#¤ (*) is the sole reason not to show the analysis the compiler has already made (about unused vars, private methods, dead code, unused imports etc. etc.)?

---

(*) Those are Finnish swearing words that do not compile to English. You may use "f**k" for every character ;)
Jul 10 2008
prev sibling next sibling parent Markus Koskimies <markus reaaliaika.net> writes:
On Thu, 10 Jul 2008 20:31:02 -0400, Nick Sabalausky wrote:

 "Bill Baxter" <dnewsgroup billbaxter.com> wrote in message
 news:g561hh$2g6g$2 digitalmars.com...
 Leandro Lucarella wrote:
 Walter Bright, el  9 de julio a las 13:41 me escribiste:
 Piping the output into a file and then perusing it manually looking
 for warning statements is never going to happen.

I code using VIM. VIM has a very convenient feature that collects the make (compiler) output and lets you iterate over warnings/errors (using :cn and :cp). So yes, it's going to happen. It happens all the time. And I think most decent IDEs/editors do that, so it's not something VIM-specific.

Emacs has it too! M-x `


Currently, I use an IDE only if forced to do so. Kate/nedit & make & tee do everything I need ;)
Jul 10 2008
prev sibling next sibling parent Markus Koskimies <markus reaaliaika.net> writes:
On Thu, 10 Jul 2008 20:53:53 +0100, Bruce Adams wrote:

 Generally there are two types of code. Code for which warnings are
 allowed and warning free code. Transitioning from code that's been
 allowed to have warnings
 for a long time to warning free code takes effort. I still think the
 long term benefits make it worth it.

I so totally agree with this! "me too"...
Jul 10 2008
prev sibling next sibling parent Markus Koskimies <markus reaaliaika.net> writes:
On Thu, 10 Jul 2008 16:06:45 -0400, superdan wrote:

 byte a, b;
 byte c = a + b;

I think that compilers should never generate warnings in these cases. If you overflow in arithmetic operations on same-sized types, it is at most a runtime-error issue. You should know how large a value each basic type can store, and use a large enough data type in the code. Since explicit casting easily hides bugs, it should not be used pervasively.
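For reference, a minimal sketch of how the quoted example actually behaves under D2 semantics (the operands are promoted to int, and narrowing back down requires a cast):

---
void main()
{
    byte a = 100, b = 100;

    // byte c = a + b;          // rejected: a + b has type int, and int
    //                          // does not implicitly narrow back to byte

    byte c = cast(byte)(a + b); // compiles, but silently wraps: c == -56
    int  d = a + b;             // the safer fix: use a wide enough type
}
---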
 5. warning - statement is not reachable
 
 this is a tad more messy. people routinely insert a premature return in
 there to check for stuff. it pisses me off when that won't compile. 

That is a very good example of why it would be good to have the possibility to (temporarily) generate code even when it has warnings. That way the warning does not go anywhere, and you can still debug your program.
 i discovered i could do this:
 
 if (true) return crap;

Thanks! :)
Jul 11 2008
prev sibling parent reply Bruno Medeiros <brunodomedeiros+spam com.gmail> writes:
Walter Bright wrote:
 Nick Sabalausky wrote:
 I don't suppose there's any chance of DMD getting a warning for 
 variables/arguments that are declared but never accessed? Just today 
 alone there's been two bugs I spent 10-30 minutes going nuts trying to 
 track down that turned out to be variables I had intended to use but 
 forgot to. 

The problem with unused variable warnings is they are annoying when you're developing code in an iterative manner. They get in the way when you're commenting out sections of code to try and isolate a problem. They can be a problem when using "version" and "static if" statements. So, why not just turn off the warnings?

The problem with warnings is that if there are n warnings, there are essentially n factorial different versions of the language. If you're faced with compiling someone else's code (like you downloaded it off the internet and have to compile it because it only is distributed as source) and warnings go off, is that a bug in the code or not? What do you do?

You have a distorted notion of warnings. Warnings are not errors (and by corollary are not "optional errors" either). They are simply messages which indicate some "strange" situations in code, situations which suggest some attention from the developer (now or in the future). That's why other compilers have an *option* such as "treat-warnings-as-errors". If warnings were errors, they wouldn't need that option, cause they would already be treated as errors (cause they would *be* errors...), lol. :(

You (Walter) and other people may be inclined to disagree, especially if you are heavily biased towards C++, where warnings, like you said, have been used for *things that should have been errors*, and have created this whole messed up and confused situation, and scenarios where people think C++ code should compile without errors, etc., etc. But that is just a scenario arising from C++ fucked-up-ness. If you (and others) still don't agree, which you probably won't, then let's not argue semantics, and just call this notion of warnings that I defined before "cautions".

With this in mind, what's wrong with the compiler generating messages (and just messages, not errors) for certain suspicious code situations, such as unused variables? Just that. What do you say?

--
Bruno Medeiros - Software Developer, MSc. in CS/E graduate
http://www.prowiki.org/wiki4d/wiki.cgi?BrunoMedeiros#D
Jul 27 2008
parent reply "Koroskin Denis" <2korden gmail.com> writes:
On Sun, 27 Jul 2008 17:56:01 +0400, Bruno Medeiros  
<brunodomedeiros+spam com.gmail> wrote:

 Walter Bright wrote:
 Nick Sabalausky wrote:
 I don't suppose there's any chance of DMD getting a warning for  
 variables/arguments that are declared but never accessed? Just today  
 alone there's been two bugs I spent 10-30 minutes going nuts trying to  
 track down that turned out to be variables I had intended to use but  
 forgot to.

The problem with unused variable warnings is they are annoying when you're developing code in an iterative manner. They get in the way when you're commenting out sections of code to try and isolate a problem. They can be a problem when using "version" and "static if" statements. So, why not just turn off the warnings?

The problem with warnings is that if there are n warnings, there are essentially n factorial different versions of the language. If you're faced with compiling someone else's code (like you downloaded it off the internet and have to compile it because it only is distributed as source) and warnings go off, is that a bug in the code or not? What do you do?

You have a distorted notion of warnings. Warnings are not errors (and by corollary are not "optional errors" either). They are simply messages which indicate some "strange" situations in code, situations which suggest some attention from the developer (now or in the future). That's why other compilers have an *option* such as "treat-warnings-as-errors". If warnings were errors, they wouldn't need that option, cause they would already be treated as errors (cause they would *be* errors...), lol. :(

You (Walter) and other people may be inclined to disagree, especially if you are heavily biased towards C++, where warnings, like you said, have been used for *things that should have been errors*, and have created this whole messed up and confused situation, and scenarios where people think C++ code should compile without errors, etc., etc. But that is just a scenario arising from C++ fucked-up-ness. If you (and others) still don't agree, which you probably won't, then let's not argue semantics, and just call this notion of warnings that I defined before "cautions".

With this in mind, what's wrong with the compiler generating messages (and just messages, not errors) for certain suspicious code situations, such as unused variables? Just that. What do you say?

Now I agree with Walter on that matter. The compiler's job is to compile an executable. As a Gentoo user, when I compile something (and I do it a lot :p) I expect two messages: "build finished" *or* "build failed for the following reason: ...". All those warnings are *not for me*; they are for developers, and needed during development time only. Imagine you are updating a web browser or some other application and get all those "comparison between signed and unsigned types" messages. Do you want to read them?

OTOH, I want my code to be constantly analyzed for suspicious situations, but _only during development time_. That's why I use an IDE. And my IDE should help me as I type: syntax highlighting, code autocomplete, refactoring *and* warnings. It's almost free for the IDE to analyze my code for possible errors. But the compiler's job is to compile *or* reject the code, and it should do that as fast as possible, without spending time looking into suspicious situations.

Compiler and IDE tasks do often overlap, of course, but that doesn't mean they should be merged into a single solution.

Just my $0.02...
Jul 27 2008
next sibling parent bearophile <bearophileHUGS lycos.com> writes:
Koroskin Denis:
 As a gentoo user when I compile something (and I do it alot :p  
 ) I expect two messages: "build finished" *or* "build failed for the  
 following reason: ...". All those warning are *not for me*, they are for  
 developers and needed during development time only. Imagine you are  
 updating a web-browser or some other application and get all those  
 "comparison between signed and unsigned types" messages. Do you want to  
 read them?

Then if you build things just to use them quickly, you may want to omit putting -Wall there...

Bye,
bearophile
Jul 27 2008
prev sibling parent reply Bruno Medeiros <brunodomedeiros+spam com.gmail> writes:
Koroskin Denis wrote:
 On Sun, 27 Jul 2008 17:56:01 +0400, Bruno Medeiros 
 <brunodomedeiros+spam com.gmail> wrote:
 
 Walter Bright wrote:
 Nick Sabalausky wrote:
 I don't suppose there's any chance of DMD getting a warning for 
 variables/arguments that are declared but never accessed? Just today 
 alone there's been two bugs I spent 10-30 minutes going nuts trying 
 to track down that turned out to be variables I had intended to use 
 but forgot to.

The problem with unused variable warnings is they are annoying when you're developing code in an iterative manner. They get in the way when you're commenting out sections of code to try and isolate a problem. They can be a problem when using "version" and "static if" statements. So, why not just turn off the warnings?

The problem with warnings is that if there are n warnings, there are essentially n factorial different versions of the language. If you're faced with compiling someone else's code (like you downloaded it off the internet and have to compile it because it only is distributed as source) and warnings go off, is that a bug in the code or not? What do you do?

You have a distorted notion of warnings. Warnings are not errors (and by corollary are not "optional errors" either). They are simply messages which indicate some "strange" situations in code, situations which suggest some attention from the developer (now or in the future). That's why other compilers have an *option* such as "treat-warnings-as-errors". If warnings were errors, they wouldn't need that option, cause they would already be treated as errors (cause they would *be* errors...), lol. :(

You (Walter) and other people may be inclined to disagree, especially if you are heavily biased towards C++, where warnings, like you said, have been used for *things that should have been errors*, and have created this whole messed up and confused situation, and scenarios where people think C++ code should compile without errors, etc., etc. But that is just a scenario arising from C++ fucked-up-ness. If you (and others) still don't agree, which you probably won't, then let's not argue semantics, and just call this notion of warnings that I defined before "cautions".

With this in mind, what's wrong with the compiler generating messages (and just messages, not errors) for certain suspicious code situations, such as unused variables? Just that. What do you say?

Now I agree with Walter on that matter. The compiler's job is to compile an executable. As a Gentoo user, when I compile something (and I do it a lot :p) I expect two messages: "build finished" *or* "build failed for the following reason: ...". All those warnings are *not for me*; they are for developers, and needed during development time only. Imagine you are updating a web browser or some other application and get all those "comparison between signed and unsigned types" messages. Do you want to read them?

I too was talking about development time only. If you're compiling as a user, then yes there should be an option that suppresses various output, warnings or not.
 OTOH, I want for my code to be constantly analyzed for suspicious 
 situation but _only during development time_. That's why I use IDE. And 
 my IDE should help me as I type: syntax highlighting, code autocomplete, 
 refactoring *and* warnings. It's almost free for IDE to analyze my code 
 for possible errors. But compiler's job is to compile *or* reject the 
 code, and it should do it as fast as possible without spending time for 
 looking into suspicious situations.
 
 Compiler and IDE tasks do often overlap, of course, but it doesn't mean 
 that they should be merged into single solution.
 
 Just my $0.02...

Like you said, the compiler and IDE tasks overlap. In D's case, if DMD did a proper warning analysis, then an IDE could use the compiler to present warnings to the user in a proper manner (like squiggles in the source code editor). In Descent's case, it would be particularly easy to do that, since it has an embedded/ported DMD frontend, and already does the same for compiler errors.

--
Bruno Medeiros - Software Developer, MSc. in CS/E graduate
http://www.prowiki.org/wiki4d/wiki.cgi?BrunoMedeiros#D
Jul 27 2008
parent reply Yigal Chripun <yigal100 gmail.com> writes:
Bruno Medeiros wrote:
 Koroskin Denis wrote:
 On Sun, 27 Jul 2008 17:56:01 +0400, Bruno Medeiros
 <brunodomedeiros+spam com.gmail> wrote:

 Walter Bright wrote:
 Nick Sabalausky wrote:
 I don't suppose there's any chance of DMD getting a warning for
 variables/arguments that are declared but never accessed? Just
 today alone there's been two bugs I spent 10-30 minutes going nuts
 trying to track down that turned out to be variables I had intended
 to use but forgot to.

The problem with unused variable warnings is they are annoying when you're developing code in an iterative manner. They get in the way when you're commenting out sections of code to try and isolate a problem. They can be a problem when using "version" and "static if" statements. So, why not just turn off the warnings?

The problem with warnings is that if there are n warnings, there are essentially n factorial different versions of the language. If you're faced with compiling someone else's code (like you downloaded it off the internet and have to compile it because it only is distributed as source) and warnings go off, is that a bug in the code or not? What do you do?

You have a distorted notion of warnings. Warnings are not errors (and by corollary are not "optional errors" either). They are simply messages which indicate some "strange" situations in code, situations which suggest some attention from the developer (now or in the future). That's why other compilers have an *option* such as "treat-warnings-as-errors". If warnings were errors, they wouldn't need that option, cause they would already be treated as errors (cause they would *be* errors...), lol. :(

You (Walter) and other people may be inclined to disagree, especially if you are heavily biased towards C++, where warnings, like you said, have been used for *things that should have been errors*, and have created this whole messed up and confused situation, and scenarios where people think C++ code should compile without errors, etc., etc. But that is just a scenario arising from C++ fucked-up-ness. If you (and others) still don't agree, which you probably won't, then let's not argue semantics, and just call this notion of warnings that I defined before "cautions".

With this in mind, what's wrong with the compiler generating messages (and just messages, not errors) for certain suspicious code situations, such as unused variables? Just that. What do you say?

Now I agree with Walter on that matter. The compiler's job is to compile an executable. As a Gentoo user, when I compile something (and I do it a lot :p) I expect two messages: "build finished" *or* "build failed for the following reason: ...". All those warnings are *not for me*; they are for developers, and needed during development time only. Imagine you are updating a web browser or some other application and get all those "comparison between signed and unsigned types" messages. Do you want to read them?

I too was talking about development time only. If you're compiling as a user, then yes there should be an option that suppresses various output, warnings or not.
 OTOH, I want for my code to be constantly analyzed for suspicious
 situation but _only during development time_. That's why I use IDE.
 And my IDE should help me as I type: syntax highlighting, code
 autocomplete, refactoring *and* warnings. It's almost free for IDE to
 analyze my code for possible errors. But compiler's job is to compile
 *or* reject the code, and it should do it as fast as possible without
 spending time for looking into suspicious situations.

 Compiler and IDE tasks do often overlap, of course, but it doesn't
 mean that they should be merged into single solution.

 Just my $0.02...

Like you said, the compiler and IDE tasks overlap. In D's case, if DMD did a proper warning analysis, then an IDE could use the compiler to present warnings to the user in a proper manner (like squiggles in the source code editor). In Descent's case, it would be particularly easy to do that, since it has an embedded/ported DMD frontend, and already does the same for compiler errors.

Even better would be to have something like Clang, which offers a collection of libs (codegen, semantic analysis, parsing, etc.) and an API for each lib. That way the Descent folks could have just used the semantic-analysis and parser DLLs of the compiler, and the respective APIs, instead of having to port the DMD frontend from C++ to Java. I think Ary wrote here once that he had to replace all the gotos with exceptions, or something like that. That doesn't sound good or maintainable to me..
Jul 27 2008
parent Bruno Medeiros <brunodomedeiros+spam com.gmail> writes:
Yigal Chripun wrote:
 
 Even better would be to have something like clang which offers a
 collection of libs (codegen, semantic analysis, parsing, etc..) and an
 API for each lib. that way the descent folks could have just used the
 semantic analysis and parser DLLs of the compiler and the respective
 APIs instead of having to port the DMD frontend from c++ to Java. I
 think Ary wrote here once that he had to replace all the gotos with
 exceptions or something like that. That doesn't sound good or
 maintainable to me..

*How* an IDE uses the compiler to perform analysis is another story. Right now the point is simply that it would be nice if the compiler (DMD) had more analysis functionality.

--
Bruno Medeiros - Software Developer, MSc. in CS/E graduate
http://www.prowiki.org/wiki4d/wiki.cgi?BrunoMedeiros#D
Jul 27 2008
prev sibling parent reply "Manfred_Nowak" <svv1999 hotmail.com> writes:
Nick Sabalausky wrote:

 turned out to be variables I had intended to use but forgot to

I am trying to tackle such time wastings with "protocols" in drokue. If one were able to formally attach one's intentions to variables, then such bugs could possibly be prevented.

In your case the intention might have been to write and read the variable several times, of course starting with a write followed by some read. This intention can be expressed by a regular expression like:

  write+ read ( write | read )*

For evaluating this at runtime (!) one may attach a DFA to the variable---a DFA that interprets the operations on the variable as input. Of course the DFA has to be initialized somewhere before the first operation on the variable. At program termination the DFA can then be checked as to whether it is in a final state; if it is not, an "intention violation" error can be reported. This way your time wouldn't have been wasted. (A sketch follows below.)

Please note that such a check cannot be done by a lint tool.

-manfred

--
Maybe some knowledge of some types of disagreeing and their relation can turn out to be useful:
http://blog.createdebate.com/2008/04/07/writing-strong-arguments/
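A minimal sketch in D of how such a runtime protocol check might look; the struct, the state names and the transitions are invented for illustration, and are not drokue's actual design:

---
import std.stdio;

// A tiny hand-rolled DFA for the protocol:  write+ read ( write | read )*
enum State { Start, Written, Accept, Error }

struct Tracked(T)
{
    private T value;
    private State state = State.Start;

    void write(T v)
    {
        value = v;
        if (state == State.Start || state == State.Written)
            state = State.Written;   // still inside "write+"
        // in Accept or Error the state does not change
    }

    T read()
    {
        if (state == State.Start)
            state = State.Error;     // a read before any write: violation
        else if (state != State.Error)
            state = State.Accept;    // "read" after "write+": a final state
        return value;
    }

    // When the variable dies, check whether the DFA accepted.
    ~this()
    {
        if (state != State.Accept)
            writefln("intention violation on a tracked variable");
    }
}

void main()
{
    Tracked!int x;
    x.write(42);
    writeln(x.read());  // protocol satisfied; no report at scope exit
}
---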
Jul 09 2008
parent reply "Nick Sabalausky" <a a.a> writes:
"Manfred_Nowak" <svv1999 hotmail.com> wrote in message 
news:g51vnt$1o9n$1 digitalmars.com...
 Nick Sabalausky wrote:

 turned out to be variables I had intended to use but forgot to

I am trying to tackle such time wastings with "protocols" in drokue. If one were able to formally attach one's intentions to variables, then such bugs could possibly be prevented.

In your case the intention might have been to write and read the variable several times, of course starting with a write followed by some read. This intention can be expressed by a regular expression like:

  write+ read ( write | read )*

For evaluating this at runtime (!) one may attach a DFA to the variable---a DFA that interprets the operations on the variable as input. Of course the DFA has to be initialized somewhere before the first operation on the variable. At program termination the DFA can then be checked as to whether it is in a final state; if it is not, an "intention violation" error can be reported. This way your time wouldn't have been wasted.

I've been seeing a lot in these last few years about such... I'm going to call it "intent oriented programming". There's a lot of good stuff that works that way (unit testing and D's function contracts, for instance; I've seen some other new things from the Java world as well), and your idea is certainly interesting.

But I worry that eventually we'll get to some point where all code either is or can be generated straight from "intents" syntax. Now, that certainly sounds great, but at that point all we would really have done is reinvent "programming language", and we'd be left with the same problem we have today: how can we be sure that the "code"/"intents" that we wrote are really what we intended to write? The solution would have just recreated the problem.

Regarding your specific idea, my concern is that the whole "write and read the variable several times, starting with a write followed by some read" is tied too closely to the actual implementation. Change your algorithm/approach, and you've got to go update your intents. I'd feel like I'd gone right back to header files.
 Please note, that such cannot be done by a lint tool.

True, since lint tools only do front-end work. But the compiler would be able to do it by injecting appropriate code into its output. An external lint tool *could* be made to do it by using ultra-fancy CTFE, but then the ultra-fancy-CTFE engine would effectively be a VM (or, heck, even real native code), which would mean adding a backend to the lint tool which would basically turn it into a compiler. Thus, in a manner of speaking, there would be correctness-analysis that a compiler could do that a lint tool (by a definition of "lint tool" that would have admittedly become rather debatable by that point) couldn't. ;)
Jul 09 2008
next sibling parent reply BCS <ao pathlink.com> writes:
Reply to Nick,

 I've been seeing alot in these last few years about such...I'm going
 to call it "intent oriented programming".
 

Look up intentional programming.
 But I worry that eventually we'll get to some point where all code
 either is or can be generated straight from "intents" syntax. Now that
 certainly sounds great, but at that point all we really would have
 done is reinvent "programming language" and we'd be left with the same
 problem we have today: how can we be sure that the "code"/"intents"
 that we wrote are really what we intended to write? The solution would
 have just recreated the problem.

But the hope is that this stuff will be easier for you to read/evaluate: smaller, and written in terms that are closer to how you think about the problem and further from how it's implemented. At some point the issue arises of "is this what the end user wants?" (lint can't help you there :)
Jul 09 2008
parent "Nick Sabalausky" <a a.a> writes:
"BCS" <ao pathlink.com> wrote in message 
news:55391cb32ef798caafd0fdb12c62 news.digitalmars.com...

 At some point the issue arises of "is this what the end user wants?" (lint 
 can't help you there :)

I would love to have a "deal with the client" tool I could delegate all of that stuff to ;)
Jul 09 2008
prev sibling parent Walter Bright <newshound1 digitalmars.com> writes:
Nick Sabalausky wrote:
 But I worry that eventually we'll get to some point where all code either is 
 or can be generated straight from "intents" syntax. Now that certainly 
 sounds great, but at that point all we really would have done is reinvent 
 "programming language" and we'd be left with the same problem we have today: 
 how can we be sure that the "code"/"intents" that we wrote are really what 
 we intended to write? The solution would have just recreated the problem.

Back in the 80's, there was a heavily advertised product that touted "no more programming necessary". All you had to do was write in their "scripting language" and the product would read that and do all the "programming" for you. I thought it was hilarious.
Jul 09 2008