
digitalmars.D - Wish: Variable Not Used Warning

reply "Nick Sabalausky" <a a.a> writes:
I don't suppose there's any chance of DMD getting a warning for 
variables/arguments that are declared but never accessed? Just today alone 
there's been two bugs I spent 10-30 minutes going nuts trying to track down 
that turned out to be variables I had intended to use but forgot to. 
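(Here's a contrived sketch of the shape of the bug -- not the actual code, 
and the names are made up:)

int sumOfSquares(int[] values)
{
	int total;           // the accumulator I meant to use -- never touched again
	int result;
	foreach(v; values)
		result += v;     // oops: meant total += v * v;
	return result;       // compiles silently, returns the plain sum
}

Since "total" is never accessed after its declaration, exactly the warning 
I'm asking for would have pointed straight at it.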
Jul 05 2008
next sibling parent reply "Koroskin Denis" <2korden gmail.com> writes:
On Sun, 06 Jul 2008 03:34:52 +0400, Nick Sabalausky <a a.a> wrote:

 I don't suppose there's any chance of DMD getting a warning for
 variables/arguments that are declared but never accessed? Just today  
 alone
 there's been two bugs I spent 10-30 minutes going nuts trying to track  
 down
 that turned out to be variables I had intended to use but forgot to.
Agreed, put it into a bugzilla, otherwise it may get lost. Since D has no warnings, it should be an error, but it would break too much of the existing code. A better solution might be to enable this check when compiling with -w...
Jul 05 2008
next sibling parent reply bearophile <bearophileHUGS lycos.com> writes:
Koroskin Denis:
 Agreed, put it into a bugzilla, otherwise it may get lost.
I have written this too in one of my lists of suggestions I have posted in this newsgroup, probably more than one year ago :-)
 Since D has no warnings, it should be an error, but it would break too  
 much of the existing code.
What's bad about warnings?

Bye,
bearophile
Jul 05 2008
next sibling parent "Nick Sabalausky" <a a.a> writes:
"bearophile" <bearophileHUGS lycos.com> wrote in message 
news:g4p207$24dk$1 digitalmars.com...
 Koroskin Denis:
 Agreed, put it into a bugzilla, otherwise it may get lost.
I have written this too in one of my lists of suggestions I have posted in this newsgroup, probably more than one year ago :-)
 Since D has no warnings, it should be an error, but it would break too
 much of the existing code.
What's bad about warnings?
Walter doesn't like them. He feels (felt?) that they tend to reflect shortcomings in a language's design (and I think there are many cases where he's right on that, looking at some other languages). But a while ago he was finally convinced to put in some warnings when the "-w" flag is used. There was much rejoicing.
Jul 05 2008
prev sibling parent "Unknown W. Brackets" <unknown simplemachines.org> writes:
Well, I think the problems center around the following:

1. In many cases a "warning" in some C/C++ compilers really should be an 
error.  The line seems to be difficult and arguable.

2. Different compilers warn about different things, so for people 
who ignore warnings, it means their code might (surprisingly to them) be 
non-portable.

3. Warnings beg for warning-suppression methodologies, because they are 
sometimes issued incorrectly (otherwise they'd be errors, no?)

4. It creates more of a rift between people who are pedantic about 
warnings, etc. and people who are not.  In my experience at least, 
maintenance programmers, newer programmers, and experienced programmers 
can all fit into those two groups in awkward ways.

A lot of these points are well expressed by PHP's (imho very flawed) 
error reporting.  Most PHP programmers learn to turn warnings off before 
they even learn what a for loop is.  I've taken many large open-source 
or proprietary PHP codebases, simply turned on warnings, and been able 
to point out a handful of very obvious bugs in short order.

I think breaking it like D does is an excellent strategy based on real 
world, practical problems with warnings.

In any case, I would very much like to see (or develop) a code-standards 
enforcing lint tool for D.  This wouldn't be that hard to make based on 
dmd's open source frontend, and could be configured to enforce such 
guidelines as:

1. No commented out code (WebKit uses this guideline, I do with some 
languages as well.)

2. Either consistent or specific indentation style.

3. Variable usage and naming.

4. Use of unstable, deprecated, or untrusted language or library features.

But I really think that works better as a separate tool (that could be a 
checkin hook for whatever preferred versioning system, etc.)  This helps 
especially since some people don't compile things (although they should) 
before checkin, and actually recompiling automatically is often wrong.

-[Unknown]


bearophile wrote:
 Koroskin Denis:
 Agreed, put it into a bugzilla, otherwise it may get lost.
I have written this too in one of my lists of suggestions I have posted in this newsgroup, probably more than one year ago :-)
 Since D has no warnings, it should be an error, but it would break too  
 much of the existing code.
What's bad about warnings?

Bye,
bearophile
Jul 05 2008
prev sibling parent "Nick Sabalausky" <a a.a> writes:
"Koroskin Denis" <2korden gmail.com> wrote in message 
news:op.uduer8uhenyajd korden...
 On Sun, 06 Jul 2008 03:34:52 +0400, Nick Sabalausky <a a.a> wrote:

 I don't suppose there's any chance of DMD getting a warning for
 variables/arguments that are declared but never accessed? Just today 
 alone
 there's been two bugs I spent 10-30 minutes going nuts trying to track 
 down
 that turned out to be variables I had intended to use but forgot to.
Agreed, put it into a bugzilla, otherwise it may get lost. Since D has no warnings, it should be an error, but it would break too much of the existing code. A better solution might be to enable this check when compiling with -w...
http://d.puremagic.com/issues/show_bug.cgi?id=2197
Jul 05 2008
prev sibling next sibling parent Ary Borenszweig <ary esperanto.org.ar> writes:
Nick Sabalausky wrote:
 I don't suppose there's any chance of DMD getting a warning for 
 variables/arguments that are declared but never accessed? Just today alone 
 there's been two bugs I spent 10-30 minutes going nuts trying to track down 
 that turned out to be variables I had intended to use but forgot to. 
Moreover, I'd like a warning when a private variable is declared but never read. And when a private method is declared but never invoked.
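For example (a contrived sketch; the names are made up):

class Account
{
	private int balance;    // read by currentBalance() below: fine
	private int lastError;  // written but never read anywhere: flag it

	private void audit() {} // private and never invoked: flag it too

	this(int b) { balance = b; lastError = 0; }

	int currentBalance() { return balance; }
}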
Jul 05 2008
prev sibling next sibling parent reply Walter Bright <newshound1 digitalmars.com> writes:
Nick Sabalausky wrote:
 I don't suppose there's any chance of DMD getting a warning for 
 variables/arguments that are declared but never accessed? Just today alone 
 there's been two bugs I spent 10-30 minutes going nuts trying to track down 
 that turned out to be variables I had intended to use but forgot to. 
The problem with unused variable warnings is they are annoying when you're 
developing code in an iterative manner. They get in the way when you're 
commenting out sections of code to try and isolate a problem. They can be a 
problem when using "version" and "static if" statements.

So, why not just turn off the warnings? The problem with warnings is that if 
there are n warnings, there are essentially n factorial different versions 
of the language. If you're faced with compiling someone else's code (like 
you downloaded it off the internet and have to compile it because it is only 
distributed as source) and warnings go off, is that a bug in the code or 
not? What do you do?

Some shops have a "thou shall compile with warnings enabled, and there shall 
be no warning messages." That causes problems when you port the code to a 
different compiler with a different, even contradictory, notion of what is a 
warning. So then you wind up putting wacky things in the code just to get 
the compiler to shut up about the warnings. Those kinds of things tend to 
interfere with the beauty of the code, and since they are not necessary to 
the program's logic, they tend to confuse and misdirect the maintenance 
programmer (why is this variable pointlessly referenced here? Why is this 
unreachable return statement here? Is this a bug?)

There is a place for warnings, however. That is in a separate static 
analysis tool (i.e. lint, coverity, etc.) which can be armed with all kinds 
of heuristics with which to flag questionable constructs. I don't think they 
should be part of the compiler, however.
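To make the "version" interaction concrete, here is a contrived sketch 
(computeChecksum and log are stand-in stubs, not real APIs):

int computeChecksum(char[] data) { return cast(int) data.length; }
void log(int value) {}

void process(char[] data)
{
	int checksum = computeChecksum(data);

	version (WithLogging)
		log(checksum);  // checksum is read only in this configuration

	// Built without -version=WithLogging, checksum is assigned but never
	// read, so a mandatory unused-variable warning would flag correct code.
}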
Jul 05 2008
next sibling parent "Nick Sabalausky" <a a.a> writes:
"Walter Bright" <newshound1 digitalmars.com> wrote in message 
news:g4pplc$gno$1 digitalmars.com...
 Nick Sabalausky wrote:
 I don't suppose there's any chance of DMD getting a warning for 
 variables/arguments that are declared but never accessed? Just today 
 alone there's been two bugs I spent 10-30 minutes going nuts trying to 
 track down that turned out to be variables I had intended to use but 
 forgot to.
The problem with unused variable warnings is they are annoying when you're developing code in an iterative manner. They get in the way when you're commenting out sections of code to try and isolate a problem. They can be a problem when using "version" and "static if" statements.
I develop code in a highly iterative manner and find "unused variable" warnings highly useful. In all of the time I've spent with other compilers that do issue "unused variable" warnings, I've never found it to be an annoyance. And the way I've always seen it, warnings literally *are* a built-in lint tool.
Jul 06 2008
prev sibling next sibling parent reply "Koroskin Denis" <2korden gmail.com> writes:
On Sun, 06 Jul 2008 10:45:03 +0400, Walter Bright  
<newshound1 digitalmars.com> wrote:

 Nick Sabalausky wrote:
 I don't suppose there's any chance of DMD getting a warning for  
 variables/arguments that are declared but never accessed? Just today  
 alone there's been two bugs I spent 10-30 minutes going nuts trying to  
 track down that turned out to be variables I had intended to use but  
 forgot to.
The problem with unused variable warnings is they are annoying when you're 
developing code in an iterative manner. They get in the way when you're 
commenting out sections of code to try and isolate a problem. They can be a 
problem when using "version" and "static if" statements.

So, why not just turn off the warnings? The problem with warnings is that if 
there are n warnings, there are essentially n factorial different versions 
of the language. If you're faced with compiling someone else's code (like 
you downloaded it off the internet and have to compile it because it is only 
distributed as source) and warnings go off, is that a bug in the code or 
not? What do you do?

Some shops have a "thou shall compile with warnings enabled, and there shall 
be no warning messages." That causes problems when you port the code to a 
different compiler with a different, even contradictory, notion of what is a 
warning. So then you wind up putting wacky things in the code just to get 
the compiler to shut up about the warnings. Those kinds of things tend to 
interfere with the beauty of the code, and since they are not necessary to 
the program's logic, they tend to confuse and misdirect the maintenance 
programmer (why is this variable pointlessly referenced here? Why is this 
unreachable return statement here? Is this a bug?)

There is a place for warnings, however. That is in a separate static 
analysis tool (i.e. lint, coverity, etc.) which can be armed with all kinds 
of heuristics with which to flag questionable constructs. I don't think they 
should be part of the compiler, however.
Since DMD already has a -w switch, why not make use of it? I think it would be a good practice to compile your code with -w on just once in a while, say, before a public release. This should enable more strict code checking, like unused methods, variables, unreachable code etc.
Jul 06 2008
parent "Nick Sabalausky" <a a.a> writes:
"Koroskin Denis" <2korden gmail.com> wrote in message 
news:op.udu17gmcenyajd korden...
 On Sun, 06 Jul 2008 10:45:03 +0400, Walter Bright 
 <newshound1 digitalmars.com> wrote:

 Nick Sabalausky wrote:
 I don't suppose there's any chance of DMD getting a warning for 
 variables/arguments that are declared but never accessed? Just today 
 alone there's been two bugs I spent 10-30 minutes going nuts trying to 
 track down that turned out to be variables I had intended to use but 
 forgot to.
The problem with unused variable warnings is they are annoying when you're 
developing code in an iterative manner. They get in the way when you're 
commenting out sections of code to try and isolate a problem. They can be a 
problem when using "version" and "static if" statements.

So, why not just turn off the warnings? The problem with warnings is that if 
there are n warnings, there are essentially n factorial different versions 
of the language. If you're faced with compiling someone else's code (like 
you downloaded it off the internet and have to compile it because it is only 
distributed as source) and warnings go off, is that a bug in the code or 
not? What do you do?

Some shops have a "thou shall compile with warnings enabled, and there shall 
be no warning messages." That causes problems when you port the code to a 
different compiler with a different, even contradictory, notion of what is a 
warning. So then you wind up putting wacky things in the code just to get 
the compiler to shut up about the warnings. Those kinds of things tend to 
interfere with the beauty of the code, and since they are not necessary to 
the program's logic, they tend to confuse and misdirect the maintenance 
programmer (why is this variable pointlessly referenced here? Why is this 
unreachable return statement here? Is this a bug?)

There is a place for warnings, however. That is in a separate static 
analysis tool (i.e. lint, coverity, etc.) which can be armed with all kinds 
of heuristics with which to flag questionable constructs. I don't think they 
should be part of the compiler, however.
Since DMD already has a -w switch, why not make use of it? I think it would be a good practice to compile your code with -w on just once in a while, say, before a public release. This should enable more strict code checking, like unused methods, variables, unreachable code etc.
Not to beat a dead horse, but I always have warnings permanently turned on in every compiler I use, including DMD.
Jul 06 2008
prev sibling next sibling parent reply Robert Fraser <fraserofthenight gmail.com> writes:
Walter Bright wrote:
 There is a place for warnings, however. That is in a separate static 
 analysis tool (i.e. lint, coverity, etc.) which can be armed with all 
 kinds of heuristics with which to flag questionable constructs. I don't 
 think they should be part of the compiler, however.
The compiler already has full semantic knowledge of the code, and at least some of the warnings seem like "low-hanging fruit" so why not make the compiler act as a "mini-lint"?
Jul 06 2008
parent reply Walter Bright <newshound1 digitalmars.com> writes:
Robert Fraser wrote:
 The compiler already has full semantic knowledge of the code, and at 
 least some of the warnings seem like "low-hanging fruit" so why not make 
 the compiler act as a "mini-lint"?
Generally for the reasons already mentioned. Warnings are properly in the scope of static analysis tools, which have a different purpose than a compiler.
Jul 06 2008
parent reply Robert Fraser <fraserofthenight gmail.com> writes:
Walter Bright wrote:
 Robert Fraser wrote:
 The compiler already has full semantic knowledge of the code, and at 
 least some of the warnings seem like "low-hanging fruit" so why not 
 make the compiler act as a "mini-lint"?
Generally for the reasons already mentioned. Warnings are properly in the scope of static analysis tools, which have a different purpose than a compiler.
A compiler is not a documentation generator or a header generator, yet DMD does both (with some switches). Why not the same with lint-like functionality?
Jul 06 2008
parent reply Walter Bright <newshound1 digitalmars.com> writes:
Robert Fraser wrote:
 Walter Bright wrote:
 Robert Fraser wrote:
 The compiler already has full semantic knowledge of the code, and at 
 least some of the warnings seem like "low-hanging fruit" so why not 
 make the compiler act as a "mini-lint"?
Generally for the reasons already mentioned. Warnings are properly in the scope of static analysis tools, which have a different purpose than a compiler.
A compiler is not a documentation generator or a header generator, yet DMD does both (with some switches). Why not the same with lint-like functionality?
Because what constitutes a proper warning is a very subjective issue, there is plenty of room for different ideas. If it was in the compiler, it would inhibit development of static analysis tools, and would confuse the issue of what was correct D code.
Jul 08 2008
next sibling parent reply "Nick Sabalausky" <a a.a> writes:
"Walter Bright" <newshound1 digitalmars.com> wrote in message 
news:g4v646$c2j$1 digitalmars.com...
 Robert Fraser wrote:
 Walter Bright wrote:
 Robert Fraser wrote:
 The compiler already has full semantic knowledge of the code, and at 
 least some of the warnings seem like "low-hanging fruit" so why not 
 make the compiler act as a "mini-lint"?
Generally for the reasons already mentioned. Warnings are properly in the scope of static analysis tools, which have a different purpose than a compiler.
A compiler is not a documentation generator or a header generator, yet DMD does both (with some switches). Why not the same with lint-like functionality?
Because what constitutes a proper warning is a very subjective issue, there is plenty of room for different ideas.
Ok, so the different warnings should be able to be turned on and off. If you don't agree with a particular type of warning then you turn it off. That's the nice thing about warnings as opposed to errors: they're optionally letting you know about certain conditions that you might want to be aware of, and they do it without changing, redefining, or otherwise affecting the language itself.
 If it was in the compiler, it would inhibit development of static analysis 
 tools,
Can you elaborate on how this would happen?
 and would confuse the issue of what was correct D code.
Anything that generates a warning instead of an error is by definition valid 
code. If it wasn't valid it would generate an error instead of a warning.

Although, if by "correct" you're referring to style guidelines instead of 
syntactic and semantic validity, then I still disagree that it's an issue. 
For instance, I don't think many people would be opposed to having an 
optional switch that checked for consistent indentation style (I'm not 
actually requesting this though). People have different indentation style 
preferences, so the type of indentation could be configured, but perhaps 
have some sort of default. I can't imagine that confusing people as to what 
correct style was. Anyone who isn't an absolute novice is well aware of what 
does and doesn't constitute an issue of style (if it doesn't cause 
compile-time/runtime errors, then it's a matter of style).
Jul 08 2008
parent reply Walter Bright <newshound1 digitalmars.com> writes:
Nick Sabalausky wrote:
 Ok, so the different warnings should be able to be turned on and off. If you 
 don't agree with a particular type of warning then you turn it off. That's 
 the nice thing about warnings as opposed to errors: they're optionally 
 letting you know about certain conditions that you might want to be aware 
 of, and they do it without changing, redefining, or otherwise affecting the 
 language itself.
That situation exists today for C++ compilers, and it's not so good. You have, as I mentioned previously, n factorial different languages instead of 1. Portability becomes a problem. Confusion about whether the code should compile or not reigns.
 If it was in the compiler, it would inhibit development of static analysis 
 tools,
Can you elaborate on how this would happen?
It's the same reason why "m4" never caught on as a C preprocessor, despite being vastly superior, and despite everyone who wanted a better CPP being told to use m4.
 and would confuse the issue of what was correct D code.
Anything that generates a warning instead of an error is by definition valid code. If it wasn't valid it would generate an error instead of a warning.
That's true, but it is not what happens in the real world with warnings. I've dealt with warnings on C/C++ compilers for 25 years, and the practice is very different from the theory.
Jul 08 2008
parent reply "Davidson Corry" <davidsoncorry comcast.net> writes:
On Tue, 08 Jul 2008 14:40:26 -0700, Walter Bright  
<newshound1 digitalmars.com> wrote:

 Nick Sabalausky wrote:
 Ok, so the different warnings should be able to be turned on and off.  
 If you don't agree with a particular type of warning then you turn it  
 off. That's the nice thing about warnings as opposed to errors: they're  
 optionally letting you know about certain conditions that you might  
 want to be aware of, and they do it without changing, redefining, or  
 otherwise affecting the language itself.
That situation exists today for C++ compilers, and it's not so good. You have, as I mentioned previously, n factorial different languages instead of 1. Portability becomes a problem. Confusion about whether the code should compile or not reigns.
 If it was in the compiler, it would inhibit development of static  
 analysis tools,
Can you elaborate on how this would happen?
It's the same reason why "m4" never caught on as a C preprocessor, despite being vastly superior, and despite everyone who wanted a better CPP being told to use m4.
 and would confuse the issue of what was correct D code.
Anything that generates a warning instead of an error is by definition valid code. If it wasn't valid it would generate an error instead of a warning.
That's true, but it is not what happens in the real world with warnings. I've dealt with warnings on C/C++ compilers for 25 years, and the practice is very different from the theory.
I agree with Walter. One of the driving forces behind D was a desire *not* 
to have the quirks, corners and obscurities that grew within C++ over the 
years because Stroustrup wanted full backwards compatibility with C, etc. I 
want a compiler that says *this* is legal D, *that* is not, and there's an 
end on it.

I *also* want a tool (or sheaf of tools, smart editor, etc.) that will do 
lint-like static analysis and style vetting to warn me that, yes, this is 
legal D but you're using it in an obscure or unmaintainable or not easily 
extensible or not easily understood manner. 
_But_I_don't_want_that_tool_to_be_the_compiler_!

Walter is right that you end up with effectively 2**n different languages 
depending, not only on which warnings you enable|disable, but also on 
whether the shop you work for demands that you compile at /W1 or /W3 or /W4 
and does or doesn't treat warnings as errors.

Yes, having the full parse tree available makes it easier to find some (not 
all) of those sorts of... not "errors", call them "infelicities". So 
compiler writers have tried to be generous and give their users more 
information "for free", and by doing so have made IMHO a design error. It is 
exactly analogous to overloading an operator with functionality that doesn't 
properly apply to that operation. You're trying to do too much with one tool.

I applaud Walter for not making that error. And I want him focused on 
writing a knife-clean compiler that stabs illegal code in the heart, and 
trusts the programmer to have meant what he said when the code is legal, 
even if it's "excessively clever". Let someone *else* write "Clippy for D".

-- Dai
Jul 09 2008
parent reply Markus Koskimies <markus reaaliaika.net> writes:
On Wed, 09 Jul 2008 15:13:15 -0700, Davidson Corry wrote:

 I agree with Walter. One of the driving forces behind D was a desire
 *not* to have the quirks, corners and obscurities that grew within C++
 over the years because Stroustrup wanted full backwards compatibility
 with C, etc.
This part I agree. D is a great language, and it has been my "home language" for years (replaced C++).
 I want a compiler that says *this* is legal D, *that* is
 not, and there's an end on it.
Maybe unused local vars, arguments or static arrays would be defined not to be legal D? :)
 I *also* want a tool (or sheaf of tools, smart editor, etc.) that will
 do lint-like static analysis and style vetting to warn me that, yes,
 this is legal D but you're using it in an obscure or unmaintainable or
 not easily extensible or not easily understood manner.
 _But_I_don't_want_that_tool_to_be_the_compiler_!
Oh, I would like to see that as a part of the compiler. In fact, the more 
warnings the compiler generates, the more I like it. For me, it could even 
warn about indentation quirks, like:

	...
	if(a == b)
		do_that();
		do_that_also();
	...

...in which case the compiler could stop and say that you should either add 
{}'s or correct the indentation :)
 Walter is right that you end up with effectively 2**n different
 languages depending, not only on which warnings you enable|disable, but
 also on whether the shop you work for demands that you compile at /W1 or
 /W3 or /W4 and does or doesn't treat warnings as errors.
Ah, there needs to be only one warning level - enable all, and regard 
warnings as errors. Who wants to disable warnings? Who wants to see only 
some of the warnings? There's just no use; IMO it's fine to put all of them 
on screen and not compile until the programmer has corrected them :)
 I applaud Walter for not making that error. And I want him focused on
 writing a knife-clean compiler that stabs illegal code in the heart, and
 trusts the programmer to have meant what he said when the code is legal,
 even if it's "excessively clever".
Heh, I like compilers that do not over-estimate the cleverness of the 
developer, but instead think that they (the compilers) are the smarter 
part ;)

Although I am well acquainted with the syntax and best practices of a 
language, many times I write something else than I thought I wrote. For 
catching these kinds of spurious "miswritings" there is "syntactic salt" in 
many languages, including D. But at some point I think that it's no use to 
add more of this salt, but instead do static checks to make the language 
better.

As a very simple example, the current DMD warns about this:

---
import std.stdio;
import std.c.stdlib;

void error(string msg)
{
	writefln(msg);
	exit(-1);
}

int doSomethingWith(string a)
{
	if(a == null)
	{
		error("A null string");
	}
	else
	{
		return a.length;
	}
}
---

$ dmd -w test.d
warning - xxx: function test.doSomethingWith no return at end of function

...since it does not understand that exit never returns (yes, I know that 
case should be written in a different manner, but it is just an example). It 
could be told e.g. with some new return type (instead of "void exit" you 
would write "never_return exit"), and of course the analysis should go 
through the possible execution flows to check which parts of the code may 
return and which parts cannot. Similar cases occur with switch statements.

What I try to say is that IMO it is impossible to think that language, 
compiler (code generation) and static checking are three separate things. If 
there is a good synergy between these three elements, the language is great. 
But that's just my opinion...
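For what it's worth, one way to quiet that particular warning today is to 
end the non-returning path with assert(0) -- a sketch reusing the error() 
from above:

---
int doSomethingWith(string a)
{
	if(a == null)
	{
		error("A null string");
		assert(0);  // error() never returns; assert(0) makes that
		            // explicit to both the reader and the compiler
	}
	return a.length;
}
---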
Jul 09 2008
next sibling parent Markus Koskimies <markus reaaliaika.net> writes:
On Thu, 10 Jul 2008 06:17:22 +0000, Markus Koskimies wrote:

Well, I need to share this experience with you; I have been debugging one 
of my segfaulting D programs for a few hours, and I finally found the 
reason for that. A shortened version:

	foreach(...)
	{
		Arg arg = null;

		...
		if( ... )
		...
		else if( ... )
		{
			...
			arg = somefunc( ... );
		}
		else if( ... )
		...
		else if( ... )
		{
--->			someotherfunc( ... ); <---
		}
		...
	...
	}

Those functions return Arg class objects, but earlier they returned void 
(and took Arg objects as parameters). When modifying the code I forgot to 
store the return value in one of the execution paths --> segfaults.

Having an "optional error" (call it a warning) about ignoring the return 
value of a function would have saved a lot of time, cigarettes, coffee and 
headache :D
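And for the rare case where discarding a result really is intended, the 
discard could be made explicit with a void cast, so a hypothetical warning 
would have something to respect -- a contrived sketch:

	class Arg {}

	Arg somefunc() { return new Arg; }

	void main()
	{
		Arg arg = somefunc();   // normal use: the result is stored

		cast(void) somefunc();  // deliberate, greppable discard that a
		                        // hypothetical warning could skip over
	}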
Jul 10 2008
prev sibling next sibling parent reply bearophile <bearophileHUGS lycos.com> writes:
Markus Koskimies:
 For me, it could even warn about indentation quirks, like:
 
 	...
 	if(a == b)
 		do_that();
 		do_that_also();
 	...
 
 ...In which case the compiler could stop and say, that either add {}'s or 
 correct the indentation :)
Or maybe... I have a revolutionary idea: just express to the compiler what you mean once, not using two different means that (by mistake) may say conflicting things. Let's see... maybe just using indentation? This seems a revolutionary idea, surely no one has put it into practice... oh, Knuth expressed the same idea more than 30 years ago... how cute.

Bye,
bearophile
Jul 10 2008
next sibling parent Markus Koskimies <markus reaaliaika.net> writes:
On Thu, 10 Jul 2008 05:00:54 -0400, bearophile wrote:

[...]
 Let's see... maybe just using indentation?
Ah, I'm a big fan of Python and I wouldn't complain if D used the same method for determining blocks ;D
Jul 10 2008
prev sibling parent "Nick Sabalausky" <a a.a> writes:
"bearophile" <bearophileHUGS lycos.com> wrote in message 
news:g54j46$2e05$1 digitalmars.com...
 Markus Koskimies:
 For me, it could even warn about indentation quirks, like:

 ...
 if(a == b)
 do_that();
 do_that_also();
 ...

 ...In which case the compiler could stop and say, that either add {}'s or
 correct the indentation :)
Or maybe... I have a revolutionary idea: just express to the compiler what you mean once, not using two different means that (by mistake) may say conflicting things. Let's see... maybe just using indentation? This seems a revolutionary idea, surely no one has put it into practice... oh, Knuth expressed the same idea more than 30 years ago... how cute.

Bye,
bearophile
At the risk of reliving an old discussion...
http://dobbscodetalk.com/index.php?option=com_myblog&show=Redundancy-in-Programming-Languages.html&Itemid=29

In the case of Python (I assume that's the same sort of behavior as the 
Knuth you mention), the whole point behind the way it handles 
scope/indentation was to correct the problem of source files that have 
improper indentation by actually enforcing proper indentation. That's a very 
worthy goal. But the problem is in the way it goes about it: Python doesn't 
enforce a damn thing with regard to indentation. It *can't* enforce proper 
indentation because it runs around assuming that the indentation it receives 
*is* the intended scope. So it can't enforce it simply because it doesn't 
have the slightest idea what the proper indentation for a particular piece 
of code would be - that would require separating scope from indentation 
and... oh, yea, that's what C-based languages like D do.
Jul 10 2008
prev sibling parent reply "Nick Sabalausky" <a a.a> writes:
"Markus Koskimies" <markus reaaliaika.net> wrote in message 
news:g549hh$1h9i$2 digitalmars.com...
 On Wed, 09 Jul 2008 15:13:15 -0700, Davidson Corry wrote:

 I *also* want a tool (or sheaf of tools, smart editor, etc.) that will
 do lint-like static analysis and style vetting to warn me that, yes,
 this is legal D but you're using it in an obscure or unmaintainable or
 not easily extensible or not easily understood manner.
 _But_I_don't_want_that_tool_to_be_the_compiler_!
Oh, I would like to see that as a part of a compiler. In fact, the more the compiler generates warnings, the more I like it.
Right. See, even if you don't want that tool to be your compiler...you don't have to turn that feature on. If I want to use a TV remote, I can do so without dealing with the buttons that are built into the TV.
 Walter is right that you end up with effectively 2**n different
 languages depending, not only on which warnings you enable|disable, but
 also on whether the shop you work for demands that you compile at /W1 or
 /W3 or /W4 and does or doesn't treat warnings as errors.
Ah, there needs to be only one warning level - enable all, and regard 
warnings as errors. Who wants to disable warnings? Who wants to see only 
some of the warnings? There's just no use; IMO it's fine to put all of them 
on screen and not compile until the programmer has corrected them :)
I'm not sure I see the need for as many as four warning levels (though I 
suppose I could be convinced given an appropriate argument), but something 
like this sounds ideal to me:

- enable typically-useful warnings
- enable anally-retentive, only sometimes-helpful, warnings

- treat typically-useful warnings as errors
- treat all warnings as errors
 I applaud Walter for not making that error. And I want him focused on
 writing a knife-clean compiler that stabs illegal code in the heart, and
 trusts the programmer to have meant what he said when the code is legal,
 even if it's "excessively clever".
Heh, I like compilers that do not over-estimate the cleverness of the 
developer, but instead think that they (the compilers) are the smarter 
part ;)

Although I am well acquainted with the syntax and best practices of a 
language, many times I write something else than I thought I wrote. For 
catching these kinds of spurious "miswritings" there is "syntactic salt" in 
many languages, including D. But at some point I think that it's no use to 
add more of this salt, but instead do static checks to make the language 
better.
At the risk of a "me too" post...Me too ;)
Jul 10 2008
parent reply Markus Koskimies <markus reaaliaika.net> writes:
On Thu, 10 Jul 2008 14:55:49 -0400, Nick Sabalausky wrote:

 I'm not sure I see the need for as many as four warning levels (though I
 suppose I could be convinced given an appropriate argument), but
 something like this sounds ideal to me:
 
 - enable typically-useful warnings
 - enable anally-retentive, only sometimes-helpful, warnings
 
 - treat typically-useful warnings as errors - treat all warnings as
 errors
What I think is that the basic compiler needs the following:

- A set of warnings that usually indicate bugs in the code and are 
relatively easy to circumvent (like unused vars and such), but which can be 
expected to come up fairly frequently while sketching software

- Basically two warning levels: either generate code while there are 
warnings, or don't generate code (treating them as errors)

Suppressing the output of warnings? Why? What use is it? If you are not 
going to correct the warnings in your code when completing it, why do you 
even read the output of the compiler (if the code is generated)? Closing 
your eyes does not make the things behind the warnings go away :)

Then, when dealing with larger software and looking for good places for 
refactoring, there could be an external "anally-retentive" lint-like tool :)
Jul 10 2008
parent "Nick Sabalausky" <a a.a> writes:
"Markus Koskimies" <markus reaaliaika.net> wrote in message 
news:g55tmb$1h9i$17 digitalmars.com...
 On Thu, 10 Jul 2008 14:55:49 -0400, Nick Sabalausky wrote:

 I'm not sure I see the need for as many as four warning levels (though I
 suppose I could be convinced given an appropriate argument), but
 something like this sounds ideal to me:

 - enable typically-useful warnings
 - enable anally-retentive, only sometimes-helpful, warnings

 - treat typically-useful warnings as errors - treat all warnings as
 errors
What I think is that the basic compiler needs the following:

- A set of warnings that usually indicate bugs in the code and are 
relatively easy to circumvent (like unused vars and such), but which can be 
expected to come up fairly frequently while sketching software

- Basically two warning levels: either generate code while there are 
warnings, or don't generate code (treating them as errors)

Suppressing the output of warnings? Why? What use is it? If you are not 
going to correct the warnings in your code when completing it, why do you 
even read the output of the compiler (if the code is generated)? Closing 
your eyes does not make the things behind the warnings go away :)

Then, when dealing with larger software and looking for good places for 
refactoring, there could be an external "anally-retentive" lint-like tool :)
You've convinced me :)
Jul 10 2008
prev sibling parent reply Sean Kelly <sean invisibleduck.org> writes:
== Quote from Walter Bright (newshound1 digitalmars.com)'s article
 Robert Fraser wrote:
 Walter Bright wrote:
 Robert Fraser wrote:
 The compiler already has full semantic knowledge of the code, and at
 least some of the warnings seem like "low-hanging fruit" so why not
 make the compiler act as a "mini-lint"?
Generally for the reasons already mentioned. Warnings are properly in the scope of static analysis tools, which have a different purpose than a compiler.
A compiler is not a documentation generator or a header generator, yet DMD does both (with some switches). Why not the same with lint-like functionality?
Because what constitutes a proper warning is a very subjective issue, there is plenty of room for different ideas. If it was in the compiler, it would inhibit development of static analysis tools, and would confuse the issue of what was correct D code.
And regarding this particular issue, it's not uncommon to have unused
function parameters.  And while C++ allows them to be left out:

void fn( int ) {}

D does not.  A warning for this would be terribly annoying.

Sean
Jul 08 2008
parent reply "Jarrett Billingsley" <kb3ctd2 yahoo.com> writes:
"Sean Kelly" <sean invisibleduck.org> wrote in message 
news:g4vvtg$2237$1 digitalmars.com...

 And regarding this particular issue, it's not uncommon to have unused
 function parameters.  And while C++ allows them to be left out:

 void fn( int ) {}

 D does not.  A warning for this would be terribly annoying.
Are you sure about that? Cause that compiles and runs in D.
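A minimal test, for the record (the names come from Sean's example):

void fn( int ) {}  // parameter has a type but no name

void main()
{
	fn(42);  // compiles and runs; the unnamed parameter simply
	         // can't be referenced inside fn
}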
Jul 08 2008
parent Sean Kelly <sean invisibleduck.org> writes:
== Quote from Jarrett Billingsley (kb3ctd2 yahoo.com)'s article
 "Sean Kelly" <sean invisibleduck.org> wrote in message
 news:g4vvtg$2237$1 digitalmars.com...
 And regarding this particular issue, it's not uncommon to have unused
 function parameters.  And while C++ allows them to be left out:

 void fn( int ) {}

 D does not.  A warning for this would be terribly annoying.
Are you sure about that? Cause that compiles and runs in D.
Really? It didn't used to :-) Sean
Jul 08 2008
prev sibling next sibling parent reply "Nick Sabalausky" <a a.a> writes:
"Walter Bright" <newshound1 digitalmars.com> wrote in message 
news:g4pplc$gno$1 digitalmars.com...
 Nick Sabalausky wrote:
 I don't suppose there's any chance of DMD getting a warning for 
 variables/arguments that are declared but never accessed? Just today 
 alone there's been two bugs I spent 10-30 minutes going nuts trying to 
 track down that turned out to be variables I had intended to use but 
 forgot to.
The problem with unused variable warnings is they are annoying when you're 
developing code in an iterative manner. They get in the way when you're 
commenting out sections of code to try and isolate a problem. They can be a 
problem when using "version" and "static if" statements.

So, why not just turn off the warnings? The problem with warnings is that if 
there are n warnings, there are essentially n factorial different versions 
of the language.
I really don't see how this is, unless every compiler always has "treat 
warnings as errors" permanently enabled with no way to disable it. That's 
like saying that using different lint tools, or different settings within 
the same lint tool, constitutes different versions of the same language, and 
then claiming that means we shouldn't use lint tools.
  If you're faced with compiling someone else's code (like you downloaded 
 it off the internet and have to compile it because it only is distributed 
 as source) and warnings go off, is that a bug in the code or not? What do 
 you do?
By definition, that's not an error. Even if it is a manifestation of a bug, 
that's no reason to deliberately be silent - bugs should be noisy.

What do you do? Whatever you would normally do when you come across 
something in a program that you're not sure is right or not. It's not an 
issue that's specific (or particularly relevant, imho) to compiler warnings.
 Some shops have a "thou shall compile with warnings enabled, and there 
 shall be no warning messages."
That's a management problem, not a compiler design problem. (But I'm not 
saying that deliberately minimizing warning conditions is bad. I'm just 
saying that being strict enough that it becomes a problem, i.e. not 
balancing it with practical common sense, is just begging for trouble, and 
there's no reason we should be bending over backwards to help compensate for 
the bad management choices of certain teams.)
 That causes problems when you port the code to a different compiler with a 
 different, even contradictory, notion of what is a warning. So then you 
 wind up putting wacky things in the code just to get the compiler to shut 
 up about the warnings.

 Those kind of things tend to interfere with the beauty of the code, and 
 since they are not necessary to the program's logic, they tend to confuse 
 and misdirect the maintenance programmer (why is this variable pointlessly 
 referenced here? Why is this unreachable return statement here? Is this a 
 bug?)
Thus comments.
 There is a place for warnings, however. That is in a separate static 
 analysis tool (i.e. lint, coverity, etc.) which can be armed with all 
 kinds of heuristics with which to flag questionable constructs. I don't 
 think they should be part of the compiler, however.
Like I've said, compiler warnings are essentially a built-in lint tool. I 
see no reason to think of them any other way.
Jul 06 2008
next sibling parent reply Walter Bright <newshound1 digitalmars.com> writes:
Nick Sabalausky wrote:
 The problem with warnings is that if there are n warnings, there are 
 essentially n factorial different versions of the language.
I really don't see how this is, unless every compiler always has "treat 
warnings as errors" permanently enabled with no way to disable it. That's 
like saying that using different lint tools, or different settings within 
the same lint tool, constitutes different versions of the same language, and 
then claiming that means we shouldn't use lint tools.
If you have 10 warnings, each independently toggled on or off, then you have 
10 factorial different languages.

The difference between lint and a compiler is people know lint is not a 
compiler and do not worry about lint's complaints. Warnings in the compiler 
are treated, in reality, like programming errors.
  If you're faced with compiling someone else's code (like you downloaded 
 it off the internet and have to compile it because it only is distributed 
 as source) and warnings go off, is that a bug in the code or not? What do 
 you do?
By definition, that's not an error.
I know, but that is NOT how they are perceived. People wonder if they downloaded it right, or if they downloaded the right version, they wonder if they should complain about it, they wonder if the program will work properly if compiled. It sucks.
 Some shops have a "thou shall compile with warnings enabled, and there 
 shall be no warning messages."
That's a management problem, not a compiler design problem.
Management of programming teams is an important issue. There are a number of characteristics in D that try to make it easier for managers to manage the programmers. These are based on my conversations with many managers about the kinds of problems they face. I don't agree that these issues should be ignored and dismissed as just a management problem.
 Those kind of things tend to interfere with the beauty of the code, and 
 since they are not necessary to the program's logic, they tend to confuse 
 and misdirect the maintenance programmer (why is this variable pointlessly 
 referenced here? Why is this unreachable return statement here? Is this a 
 bug?)
Thus comments.
I don't agree with relying on comments to make up for a language design that encourages confusing and misleading code to be written.
 Like I've said, compiler warnings are essentialy a built-in lint tool. I see 
 no reason to think of them any other way. 
I think you and I have had very different experiences with warnings!
Jul 08 2008
next sibling parent dennis luehring <dl.soluz gmx.net> writes:
is "walter bright" the name for a group of high experienced software 
developers and managers - or is an escaped us-army expert-system- experiment

because sometimes i am realy shocked how perfect your ideas fits 
developers(managers) daily needs

and i hope its not to hard for you to describe people, who are years 
behind your experience - and normaly don't understand the(or your) 
problem domain - why your ideas are better...

you are my how-it-should-work-brain-brother
its like a memo of my thinking each time i read your comments
thx very much for writing them down :-)
Jul 09 2008
prev sibling next sibling parent reply "Nick Sabalausky" <a a.a> writes:
"Walter Bright" <newshound1 digitalmars.com> wrote in message 
news:g51k8s$102f$1 digitalmars.com...
 The difference between lint and a compiler is people know lint is not a 
 compiler and do not worry about lint's complaints. Warnings in the 
 compiler are treated, in reality, like programming errors.
Ahh, now this appears to be the root of our differing opinions on this. I 
think I understand your reasoning behind this now, even though I still don't 
agree with it.

It sounds like (previously unknown to me) there's a rift between the reality 
of warnings and the perceptions that many programmers (excluding us) have 
about warnings. As I understand it, you consider it more important to design 
around common perceptions of warnings, even if they're mistaken perceptions 
(such as warnings, by definition, not actually being errors). My 
disagreement is that I consider it better to design around the realities, 
and use a more education-based approach (I don't necessarily mean school) to 
address misperceptions. Is this a fair assessment of your stance, or am I 
still misunderstanding?

If this is so, then our disagreement on this runs deeper than just the 
warnings themselves and exists on more of a "design-values" level, so I 
won't push this any further than to just simply note my disagreement.
Jul 09 2008
next sibling parent "Nick Sabalausky" <a a.a> writes:
"Nick Sabalausky" <a a.a> wrote in message 
news:g51qgu$1f63$1 digitalmars.com...
 "Walter Bright" <newshound1 digitalmars.com> wrote in message 
 news:g51k8s$102f$1 digitalmars.com...
 The difference between lint and a compiler is people know lint is not a 
 compiler and do not worry about lint's complaints. Warnings in the 
 compiler are treated, in reality, like programming errors.
Ahh, now this appears to be the root of our differing opinions on this. I 
think I understand your reasoning behind this now, even though I still don't 
agree with it.

It sounds like (previously unknown to me) there's a rift between the reality 
of warnings and the perceptions that many programmers (excluding us) have 
about warnings. As I understand it, you consider it more important to design 
around common perceptions of warnings, even if they're mistaken perceptions 
(such as warnings, by definition, not actually being errors). My 
disagreement is that I consider it better to design around the realities, 
and use a more education-based approach (I don't necessarily mean school) to 
address misperceptions. Is this a fair assessment of your stance, or am I 
still misunderstanding?

If this is so, then our disagreement on this runs deeper than just the 
warnings themselves and exists on more of a "design-values" level, so I 
won't push this any further than to just simply note my disagreement.
I'd also like to note one other thing... Umm, this might come across 
sounding harsh, so please understand I don't in any way intend it as any 
sort of personal or professional disrespect/insult/sarcasm/etc.: it's just 
that I've always seen lint tools as a sign of popular languages and 
compilers doing an insufficient job of catching easily-overlooked 
programming mistakes.

(For instance, if I were going to use a language that allows implicit 
variable declarations (makes hidden mistakes easy), *and* there was no way 
to prevent the compiler/interpreter from remaining silent about it when it 
happened (a mere band-aid in the case of the implicit declaration problem, 
but a very welcome band-aid nonetheless), then I would grumble about it and 
try to find a lint tool that plugged that bug-hole. This, of course, goes 
back to the "good/bad redundancy in language design" point that you've made.)
Jul 09 2008
prev sibling next sibling parent reply Bill Baxter <dnewsgroup billbaxter.com> writes:
Nick Sabalausky wrote:
 "Walter Bright" <newshound1 digitalmars.com> wrote in message 
 news:g51k8s$102f$1 digitalmars.com...
 The difference between lint and a compiler is people know lint is not a 
 compiler and do not worry about lint's complaints. Warnings in the 
 compiler are treated, in reality, like programming errors.
Ahh, now this appears to be the root of our differing opinions on this. I 
think I understand your reasoning behind this now, even though I still don't 
agree with it.

It sounds like (previously unknown to me) there's a rift between the reality 
of warnings and the perceptions that many programmers (excluding us) have 
about warnings. As I understand it, you consider it more important to design 
around common perceptions of warnings, even if they're mistaken perceptions 
(such as warnings, by definition, not actually being errors). My 
disagreement is that I consider it better to design around the realities, 
and use a more education-based approach (I don't necessarily mean school) to 
address misperceptions. Is this a fair assessment of your stance, or am I 
still misunderstanding?

If this is so, then our disagreement on this runs deeper than just the 
warnings themselves and exists on more of a "design-values" level, so I 
won't push this any further than to just simply note my disagreement.
I think Walter is right here too.

With Microsoft compilers, warnings are so copious that they become almost 
useless. They warn about piles of trivial things that only have a remote 
possibility of being a bug. So you end up just ignoring them, and in that 
case they might as well not be there. It's just annoying. I think the 
problem is that the compiler writers have this attitude that they can be 
"helpful" by warning about anything that could possibly be a bug, even if 
it's going to have 100 times more false positives than real hits. That's not 
a good way to do warnings.

By making warnings either off or fatal like D does, you force the compiler 
writers to actually think long and hard about whether the warning they're 
thinking of adding is really so likely to be a bug that they should force 
the user to change the code. If it's fairly likely that the coder actually 
knows what they're doing, then that really doesn't justify the compiler 
issuing the warning. A lint tool, fine, but not the compiler.

+1 votes for Walter :-)

--bb
Jul 09 2008
parent Bruno Medeiros <brunodomedeiros+spam com.gmail> writes:
Bill Baxter wrote:
 Nick Sabalausky wrote:
 "Walter Bright" <newshound1 digitalmars.com> wrote in message 
 news:g51k8s$102f$1 digitalmars.com...
 The difference between lint and a compiler is people know lint is not 
 a compiler and do not worry about lint's complaints. Warnings in the 
 compiler are treated, in reality, like programming errors.
Ahh, now this appears to be the root of our differing opinions on this. I 
think I understand your reasoning behind this now, even though I still don't 
agree with it.

It sounds like (previously unknown to me) there's a rift between the reality 
of warnings and the perceptions that many programmers (excluding us) have 
about warnings. As I understand it, you consider it more important to design 
around common perceptions of warnings, even if they're mistaken perceptions 
(such as warnings, by definition, not actually being errors). My 
disagreement is that I consider it better to design around the realities, 
and use a more education-based approach (I don't necessarily mean school) to 
address misperceptions. Is this a fair assessment of your stance, or am I 
still misunderstanding?

If this is so, then our disagreement on this runs deeper than just the 
warnings themselves and exists on more of a "design-values" level, so I 
won't push this any further than to just simply note my disagreement.
I think Walter is right here too.

With Microsoft compilers, warnings are so copious that they become almost 
useless. They warn about piles of trivial things that only have a remote 
possibility of being a bug. So you end up just ignoring them, and in that 
case they might as well not be there. It's just annoying. I think the 
problem is that the compiler writers have this attitude that they can be 
"helpful" by warning about anything that could possibly be a bug, even if 
it's going to have 100 times more false positives than real hits. That's not 
a good way to do warnings.
But Visual Studio had the option to disable *specific* warnings either globally (in the IDE), or locally (in source code with pragma statements). So in my experience with VS C++, even though I did find several types of warnings which were fairly useless, I simply disabled those kinds of warnings globally, keeping all the others. So I don't see a problem here.
 By making warnings either off or fatal like D, you force the compiler 
 writers to actually think long and hard about whether the warning 
 they're thinking to add is really so likely to be a bug that they should 
 force the user to change the code.  If it's fairly likely that the coder 
 actually knows what they're doing, then that really doesn't justify the 
 compiler issuing the warning.  A lint tool fine, but not the compiler.
 
 +1 votes for Walter  :-)
 
 --bb
And because I don't see a problem, I also don't find the need to make 
warnings "either off or fatal" like D does, thus denying the use case where 
you want the compiler to report a code situation which is not necessarily 
"fairly likely" to be a bug, but is still relevant enough to report to the 
coder, for whatever reason.

-- 
Bruno Medeiros - Software Developer, MSc. in CS/E graduate
http://www.prowiki.org/wiki4d/wiki.cgi?BrunoMedeiros#D
Jul 27 2008
prev sibling parent reply Walter Bright <newshound1 digitalmars.com> writes:
Nick Sabalausky wrote:
 It sounds like (previously unknown to me) there's a rift between the reality 
 of warnings and the perceptions that many programmers (excluding us) have 
 about warnings. As I understand it, you consider it more important to design 
 around common perceptions of warnings, even if they're mistaken perceptions 
 (such as warnings, by definition, not actually being errors). My 
 disagreement is that I consider it better to design around the realities, 
 and use a more education-based approach (I don't necessarily mean school) to 
 address misperceptions. Is this a fair assessment of your stance, or am I 
 still misunderstanding?
It's a fair assessment. I give more weight to designing a language around 
the way programmers are and the way they tend to work, rather than trying to 
force them to adapt to the language.

As for the needs of programming managers, I think D is the only language 
that has attempted to address those needs. At least I've never ever heard of 
any other language even acknowledge the existence of such needs.
Jul 09 2008
parent reply "Nick Sabalausky" <a a.a> writes:
"Walter Bright" <newshound1 digitalmars.com> wrote in message 
news:g51uc7$1let$1 digitalmars.com...
 Nick Sabalausky wrote:
 It sounds like (previously unknown to me) there's a rift between the 
 reality of warnings and the perceptions that many programmers (excluding 
 us) have about warnings. As I understand it, you consider it more 
 important to design around common perceptions of warnings, even if 
 they're mistaken perceptions (such as warnings, by definition, not 
 actually being errors). My disagreement is that I consider it better to 
 design around the realities, and use a more education-based approach (I 
 don't necessarily mean school) to address misperceptions. Is this a fair 
 assessment of your stance, or am I still misunderstanding?
It's a fair assessment. I give more weight to designing a language around the way programmers are and the way they tend to work, rather than trying to force them to adapt to the language.
The way I program, I tend to run into situations such as the two Koroskin 
Denis pointed out. Unless this hypothetical D lint tool actually ends up 
materializing, I'm forced to adapt to a compiler that refuses to let me know 
about a condition that I *want* to know about.

Someone else in this thread just mentioned that DMD's warnings are always 
treated as errors, instead of only being treated as errors with a "warnings 
as errors" switch. I wasn't aware of this. That approach *certainly* 
confuses the issue of "warning" vs. "error" and creates what are effectively 
multiple languages (and, as other people pointed out, makes such 
"'warnings'-but-not-really-true-warnings" useless when using outside source 
libraries).

(If you're wondering how I could have not known DMD treats warnings as 
errors since I'm obviously so pro-warning that I would certainly be using 
the -w switch, it's because at the moment, I seem to be having trouble 
getting DMD 1.029 to emit any warnings, even when deliberately trying to 
trigger the ones it's supposed to support. *But* for all I know right now 
this may be a rebuild or IDE issue, I haven't had a chance to look into it 
yet.)
 As for the needs of programming managers, I think D is the only language 
 that has attempted to address those needs. At least I've never ever heard 
 of any other language even acknowledge the existence of such needs.
If there's a legitimate need that programming managers have that can be met by a compiler without creating any problems for the actual programmers, then I'm all for it. But when there's a "programming manager" who's steadfast about "all warnings must always be treated as errors", *BUT* refuses to be practical about it and entertain any notion that some warnings may actually NOT be real problems (in other words, "delusional" by the very definition of the word), then said "programming manager" is clearly incompetent and by no means should be indulged.

That's like creating a programming language where 2 + 2 equals 7, just because you find out that there are "programmers" incompetent enough to insist that 2 + 2 really does equal 7.
Jul 09 2008
parent reply Walter Bright <newshound1 digitalmars.com> writes:
The reason for treating warnings as errors when warnings are enabled is 
so that, for a long build, they don't scroll up and off your screen and 
go unnoticed.
Jul 09 2008
next sibling parent reply "Steven Schveighoffer" <schveiguy yahoo.com> writes:
"Walter Bright" wrote
 The reason for treating warnings as errors when warnings are enabled is so 
 that, for a long build, they don't scroll up and off your screen and go 
 unnoticed.
I've been following this thread, and I'm not really sure which side of the issue I'm on, but this, sir, is one of the worst explanations for a feature. Ever heard of 'less'? or 'more' on Windows? Maybe piping to a file? Maybe using an IDE that stores all the warnings/errors for you? Please stop saving poor Mr. ignorant programmer from himself. Education is the key to solving this problem, not catering to the ignorance. Sorry for the harshness, but seriously! -Steve
Jul 09 2008
next sibling parent reply BCS <ao pathlink.com> writes:
Reply to Steven,

 "Walter Bright" wrote
 
 The reason for treating warnings as errors when warnings are enabled
 is so that, for a long build, they don't scroll up and off your
 screen and go unnoticed.
 
I've been following this thread, and I'm not really sure which side of the issue I'm on, but this, sir, is one of the worst explanations for a feature. Ever heard of 'less'? or 'more' on Windows? Maybe piping to a file? Maybe using an IDE that stores all the warnings/errors for you? Please stop saving poor Mr. ignorant programmer from himself. Education is the key to solving this problem, not catering to the ignorance. Sorry for the harshness, but seriously! -Steve
I think grep is more useful there. I have a few builds that have several pages of output on a successful build and I sometimes miss even the errors. (I once had an error that didn't even fit in the scroll back buffer, but that was just a bit nuts ;)
Jul 09 2008
parent reply TomD <t_demmern.ospam web.de> writes:
BCS Wrote:
[...]
 
 I think grep is more useful there. I have a few builds that have several 
 pages of output on a successful build and I sometimes miss even the errors.
 
 (I once had an error that didn't even fit in the scroll back buffer, but 
 that was just a bit nuts ;)
That's why I think having a "tee" is even better :-) Anyway, you cannot do anything useful under Windows unless it is in a Cygwin bash... Ciao Tom
Jul 09 2008
parent reply BCS <ao pathlink.com> writes:
Reply to tomD,

 BCS Wrote:
 [...]
 I think grep is more useful there. I have a few builds that have
 several pages of output on a successful build and I sometimes miss
 even the errors.
 
 (I once had an error that didn't even fit in the scroll back buffer,
 but that was just a bit nuts ;)
 
That's why I think having a "tee" is even better :-) Anyway, you cannot do anything useful under Windows unless it is in a Cygwin bash...
<sarcastic>No? You can't?!</sarcastic> <G/> You're right, and oooh do I feel it now and again.
Jul 09 2008
parent reply Bill Baxter <dnewsgroup billbaxter.com> writes:
BCS wrote:
 Reply to tomD,
 
 BCS Wrote:
 [...]
 I think grep is more useful there. I have a few builds that have
 several pages of output on a successful build and I sometimes miss
 even the errors.

 (I once had an error that didn't even fit in the scroll back buffer,
 but that was just a bit nuts ;)
That's why I think having a "tee" is even better :-) Anyway, you cannot do anything useful under Windows unless it is in a Cygwin bash...
<sarcastic>No? You can't?!</sarcastic> <G/> You're right, and oooh do I feel it now and again.
I agree Cygwin is nice, but go get yourself the gnuwin32 tools. Then you'll be able to use all your favorite unix commands (like 'tee') from the dos box. Makes it much less painful. And get Console2 also. --bb
Jul 09 2008
parent BCS <ao pathlink.com> writes:
Reply to Bill,

 BCS wrote:
 
 Reply to tomD,
 
 BCS Wrote:
 [...]
 I think grep is more useful there. I have a few builds that have
 several pages of output on a successful build and I sometimes miss
 even the errors.
 
 (I once had an error that didn't even fit in the scroll back
 buffer, but that was just a bit nuts ;)
 
That's why I think having a "tee" is even better :-) Anyway, you cannot do anything useful under Windows unless it is in a Cygwin bash...
<sarcastic>No? You can't?!</sarcastic> <G/> You're right, and oooh do I feel it now and again.
I agree Cygwin is nice, but go get yourself the gnuwin32 tools. Then you'll be able to use all your favorite unix commands (like 'tee') from the dos box. Makes it much less painful. And get Console2 also. --bb
six of one, half a dozen of the other; just Give Me My Linux CLI Tools!
Jul 09 2008
prev sibling parent reply superdan <super dan.org> writes:
Steven Schveighoffer Wrote:

 "Walter Bright" wrote
 The reason for treating warnings as errors when warnings are enabled is so 
 that, for a long build, they don't scroll up and off your screen and go 
 unnoticed.
I've been following this thread, and I'm not really sure which side of the issue I'm on, but this, sir, is one of the worst explanations for a feature. Ever heard of 'less'? or 'more' on Windows? Maybe piping to a file? Maybe using an IDE that stores all the warnings/errors for you? Please stop saving poor Mr. ignorant programmer from himself. Education is the key to solving this problem, not catering to the ignorance. Sorry for the harshness, but seriously!
in c++ this kind of "it's an issue of education and shit" argument has been used for many years. after a lot of experience in the field, nowadays everyone silently agrees that argument is useless. folks on comp.lang.c++ start mocking you if u bring that argument up. i am 110% on walter's side on this shit. there should be no warnings and shit. only errors. it is not catering to the ignorant. it is a matter of a properly defined language. a lint tool should not be folded into d. such a tool could e.g. follow pointers, do virtual execution, and some other weird shit. it could run for hours and produce output that takes an expert to interpret. that kind of shit does not belong in the compiler.
Jul 09 2008
next sibling parent reply "Steven Schveighoffer" <schveiguy yahoo.com> writes:
"superdan" wrote
 Steven Schveighoffer Wrote:

 "Walter Bright" wrote
 The reason for treating warnings as errors when warnings are enabled is 
 so
 that, for a long build, they don't scroll up and off your screen and go
 unnoticed.
I've been following this thread, and I'm not really sure which side of the issue I'm on, but this, sir, is one of the worst explanations for a feature. Ever heard of 'less'? or 'more' on Windows? Maybe piping to a file? Maybe using an IDE that stores all the warnings/errors for you? Please stop saving poor Mr. ignorant programmer from himself. Education is the key to solving this problem, not catering to the ignorance. Sorry for the harshness, but seriously!
in c++ this kind of argument that contains "it's an issue of education and shit" in it has been used for many years. after a lot of experience in the field nowadays everyone silently agrees that that argument is useless. folks on comp.lang.c++ start mocking you if u bring that argument up. i am 110% on walter's side on this shit. there should be no warnings and shit. only errors. it is not catering to the ignorant. it is a matter of a properly defined language.
I think you missed my point. Walter's position on warnings being errors (mind you, not by default, only when the -w (show me the warnings) switch is applied) is that people run out of screen space. To me, that's just plain silly as an argument. If you're gonna have warnings, which aren't considered errors by default, at least make it possible to configure things so the compiler doesn't error out on the 1st warning.

By education I mean: tell the ignorant programmer how to use his shell to pipe the warnings into a paged format, or to a file, or whatever. Don't hinder the knowledgeable programmers who want to have everything at once.

Regarding whether warnings should be in a lint tool or not, I'm undecided on the issue, as I have been hit by both sides (too many useless warnings, or gee it would have been nice for the compiler to tell me I did this wrong).

-Steve
Jul 09 2008
parent superdan <super dan.org> writes:
Steven Schveighoffer Wrote:

 "superdan" wrote
 Steven Schveighoffer Wrote:

 "Walter Bright" wrote
 The reason for treating warnings as errors when warnings are enabled is 
 so
 that, for a long build, they don't scroll up and off your screen and go
 unnoticed.
I've been following this thread, and I'm not really sure which side of the issue I'm on, but this, sir, is one of the worst explanations for a feature. Ever heard of 'less'? or 'more' on Windows? Maybe piping to a file? Maybe using an IDE that stores all the warnings/errors for you? Please stop saving poor Mr. ignorant programmer from himself. Education is the key to solving this problem, not catering to the ignorance. Sorry for the harshness, but seriously!
in c++ this kind of argument that contains "it's an issue of education and shit" in it has been used for many years. after a lot of experience in the field nowadays everyone silently agrees that that argument is useless. folks on comp.lang.c++ start mocking you if u bring that argument up. i am 110% on walter's side on this shit. there should be no warnings and shit. only errors. it is not catering to the ignorant. it is a matter of a properly defined language.
I think you missed my point. Walter's position on warnings being errors (mind you, not by default, only when the -w (show me the warnings) switch is applied) is that people run out of screen space. To me, that's just plain silly as an argument. If you're gonna have warnings, which aren't considered errors by default, at least make it possible to configure things so the compiler doesn't error out on the 1st warning.
yarp i also didn't exactly get high on walter's argument.
 By education I mean, tell the ignorant programmer how to use his shell to 
 pipe the warnings into a paged format, or to a file, or whatever.  Don't 
hinder the knowledgeable programmers who want to have everything at once.
fair enough. by the way i'm with that gun zealot (what's his name) that good shit should output exactly nothing on success.
 Regarding whether warnings should be in a lint tool or not, I'm undecided on 
 the issue, as I have been hit by both sides (too many useless warnings, or 
 gee it would have been nice for the compiler to tell me I did this wrong).
that's a good argument that there should be no two ways about it. walter, make all warnings errors without -w and get rid of -w.
Jul 09 2008
prev sibling parent reply "Nick Sabalausky" <a a.a> writes:
"superdan" <super dan.org> wrote in message 
news:g53831$20jk$1 digitalmars.com...
 Steven Schveighoffer Wrote:

 "Walter Bright" wrote
 The reason for treating warnings as errors when warnings are enabled is 
 so
 that, for a long build, they don't scroll up and off your screen and go
 unnoticed.
I've been following this thread, and I'm not really sure which side of the issue I'm on, but this, sir, is one of the worst explanations for a feature. Ever heard of 'less'? or 'more' on Windows? Maybe piping to a file? Maybe using an IDE that stores all the warnings/errors for you? Please stop saving poor Mr. ignorant programmer from himself. Education is the key to solving this problem, not catering to the ignorance. Sorry for the harshness, but seriously!
in c++ this kind of argument that contains "it's an issue of education and shit" in it has been used for many years. after a lot of experience in the field nowadays everyone silently agrees that that argument is useless. folks on comp.lang.c++ start mocking you if u bring that argument up.
That's probably because over the past ten years, the people who care more about doing things the right way than catering to the status quo have been leaving C++ en masse (hence, D). It's no surprise that the people still remaining onboard C++ are either A. people who hold that particular viewpoint or B. people who are required to use C++ for some reason and have long since gotten used to the fact that C++ is never going to fix most of its problems. So I wouldn't place too much weight on the "comp.lang.c++" take on this particular issue; their consensus is likely just a reflection of group dynamics.
 i am 110% on walter's side on this shit. there should be no warnings and 
 shit. only errors. it is not catering to the ignorant. it is a matter of a 
 properly defined language.
That's right, no true warnings, but just a handful of what are in effect "optional errors".

In a "properly defined language", how would you solve the problem of unintentionally-unused variables (see the sketch at the end of this post)? Adopt the "unused" keyword that Koroskin Denis proposed and say that an unused var without the unused keyword is an error, and accessing a var that does have the unused keyword is also an error? That sounded good to me at first, but then I realized: What happens when you're in the middle of an implementation and you stick the "unused" keyword on a variable in a function that you've only partially implemented, just because you want to test the partial implementation? Then you fix any problems, get distracted by something else, and forget to finish (it happens more than you may think). Well great, now that wonderful compiles/errors dichotomy has just *created* a hole for that bug to slip in, whereas a real warning (the true kind, not the "warnings as errors" kind) would have caught it.

So how else could a "properly defined language" solve it? Just treat it as a non-error as it is now and be done with it? That turns potentially-noisy errors into silent errors, which is one of the biggest design mistakes of all. Any other suggestions on how to "properly design" a fix for that? If it works, I'd be all for it.

Suppose that does get fixed. Now, when some other common gotcha is discovered in a language, or in a particular version of a language that's had a design freeze (like D1), then what do you do? Stick to your "warnings are bad" guns and just leave everyone tripping over the gotcha in the dark, maybe hoping that someone else could come along and create a lint tool that would do the job that you could have already done?

Designing everything to fit into a compiles/errors dichotomy is great, in theory. But in practice it's just unrealistic. Even Walter ended up having to add a few "warnings" to D (even if he implemented them more as optional errors than as true warnings). Which is why, as I was saying in the beginning, trying to eliminate the need for a specific warning is great - *if* it actually pans out. But that doesn't always happen.
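To make the failure mode concrete, here's a minimal D sketch of the kind of bug being discussed (the names and numbers are made up):

import std.stdio;

void main()
{
    int[3] prices = [3, 5, 7];
    int total = 0;
    foreach (price; prices)
    {
        int discounted = price - 1; // intended to be summed below...
        total += price;             // ...but the wrong variable is used
    }
    // 'discounted' is assigned but never read; this compiles in silence
    // today, and an unused-variable warning is exactly what would flag it
    writefln("total = %s", total);
}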
 a lint tool should not be folded into d. such a tool could e.g. follow 
 pointers, do virtual execution, and some other weird shit. it could run 
 for hours and produce output that takes an expert to interpret. that kind 
 of shit does not belong in the compiler.
Anything like that can be attached to an optional command-line parameter that defaults to "off". Problem solved.
Jul 09 2008
next sibling parent reply superdan <super dan.org> writes:
Nick Sabalausky Wrote:

 "superdan" <super dan.org> wrote in message 
 news:g53831$20jk$1 digitalmars.com...
 Steven Schveighoffer Wrote:

 "Walter Bright" wrote
 The reason for treating warnings as errors when warnings are enabled is 
 so
 that, for a long build, they don't scroll up and off your screen and go
 unnoticed.
I've been following this thread, and I'm not really sure which side of the issue I'm on, but this, sir, is one of the worst explanations for a feature. Ever heard of 'less'? or 'more' on Windows? Maybe piping to a file? Maybe using an IDE that stores all the warnings/errors for you? Please stop saving poor Mr. ignorant programmer from himself. Education is the key to solving this problem, not catering to the ignorance. Sorry for the harshness, but seriously!
in c++ this kind of argument that contains "it's an issue of education and shit" in it has been used for many years. after a lot of experience in the field nowadays everyone silently agrees that that argument is useless. folks on comp.lang.c++ start mocking you if u bring that argument up.
That's probably because over the past ten years, the people who care more about doing things the right way than catering to the status quo have been leaving C++ en masse (hence, D). It's no surprise that the people still remaining onboard C++ are either A. people who hold that particular viewpoint or B. people who are required to use C++ for some reason and have long since gotten used to the fact that C++ is never going to fix most of its problems. So I wouldn't place too much weight on the "comp.lang.c++" take on this particular issue; their consensus is likely just a reflection of group dynamics.
the group was given as an example. the thing is, it has become clear to the luminaries that invoking better education is not an answer. that is clear from the literature and also from c++0x.
 i am 110% on walter's side on this shit. there should be no warnings and 
 shit. only errors. it is not catering to the ignorant. it is a matter of a 
 properly defined language.
That's right, no true warnings, but just a handful of what are in effect "optional errors". In a "properly defined language", how would you solve the problem of unintentionally-unused variables?
first i'd stop bitching why oh why the language does not build that shit in. that would be a great start. give me my fucking soapbox again. there. thanks. too many people around here are trigger happy about changing the language. (next breath they yell they want stability.) has nothing to do with you but reminds me of shit goin' on here in this group.

moron: "d has no bitfields. somehow in my fucking world bitfields are so essential, i can't fucking live without them. hence i can't use d. give me bitfields and i'll give you my girlfriend."

months go by.

walter: "here, there are perfectly functional bitfields in std.bitmanip. they're more flexible and more rigorously defined than in fucking c. you can count on'em."

moron: "don't like the syntax. still won't use d. i want them in the language. put them in the language and i'll use d."
 Adopt the "unused" keyword that Koroskin Denis proposed and say that an 
 unused var without the unused keyword is an error, and accessing a var that 
 does have the unused keyword is also an error?
once i stop bitching i get a clearer mind and i get to write some shit like this.

// does nothing; its only job is to count as a "use" of the variable
void vacuouslyUse(T)(ref T x) {}

void foo()
{
    int crap;
    vacuouslyUse(crap);
    ................
}

use and remove as you wish.
 That sounded good to me at 
 first but then I realized: What happens when you're in the middle of an 
 implementation and you stick the "unused" keyword on a variable in a 
 function that you've only partially implemented just because you want to 
 test the partial implementation. Then you fix any problems, get distracted 
 by something else, and forget to finish (it happens more than you may 
 think). Well great, now that wonderful compiles/errors dichotomy has just 
 *created* a hole for that bug to slip in, whereas a real warning (the true 
 kind, not the "warnings as errors" kind) would have caught it.
unused name should be an error. if you want to not use something, you must sweat a little. vacuouslyUse fits the bill exactly. should be in phobos.
 So how else could a "properly defined language" solve it? Just simply treat 
 it as a non-error as it is now and be done with it? That turns 
 potentially-noisy errors into silent errors which is one of the biggest 
 design mistakes of all.
 
 Any other suggestions on how to "properly design" a fix for that? If it 
 works, I'd be all for it.
it works but i kinda doubt you'll be all for it. you don't want to solve the unused variable problem. you want compiler warnings. somehow you'll work your argument out to make my solution undesirable.
 Suppose that does get fixed. Now, when some other common gotcha is 
 discovered in a language, or a particular version of a language, that's had 
 a design freeze (like D1), then what do you do? Stick to your "warnings are 
 bad" guns and just leave everyone tripping over the gotcha in the dark, 
 maybe hoping that someone else could come along and create a lint tool that 
 would do the job that you could have already done?
this is an imperfect world. i see value in the no warning stance. you don't see it. therefore when competition in the d compiler arena picks up, i'd see a warning as a shitty concession, while you will grin "i told ya all along".
 Designing everything to fit into a compiles/errors dichotomy is great, in 
 theory. But in practice it's just unrealistic. Even Walter ended up having 
 to add a few "warnings" to D (even if he implemented them more as optional 
 errors than as true warnings). Which is why, as I was saying in the 
 beginning, trying to eliminate the need for a specific warning is great - 
 *if* it actually pans out. But that doesn't always happen.
when doesn't it happen?
 a lint tool should not be folded into d. such a tool could e.g. follow 
 pointers, do virtual execution, and some other weird shit. it could run 
 for hours and produce output that takes an expert to interpret. that kind 
 of shit does not belong in the compiler.
Anything like that can be attached to an optional command-line parameter that defaults to "off". Problem solved.
weak argument. a good program does some shit and does it well. i'm pissed that emacs can browse the web already, alright?
Jul 09 2008
next sibling parent Walter Bright <newshound1 digitalmars.com> writes:
superdan wrote:
 walter: "here, there are perfectly functional bitfields in
 std.bitmanip. they're more flexible and more rigorously defined than
 in fucking c. you can count on'em."
I'd like to take credit for std.bitmanip, but it's Andrei's design and effort.
Jul 09 2008
prev sibling parent reply "Nick Sabalausky" <a a.a> writes:
"superdan" <super dan.org> wrote in message 
news:g53ms5$h6n$1 digitalmars.com...
 group was given as an example. the thing is it has become clear to the 
 luminaries that invoking better education is not an answer. it is clear 
 from the literature and also from c++ oh ecs.
[rambling, barely-readable cuss-fest trimmed]
 once i stop bitching i get a clearer mind and I get to write some shit 
 like this.

 void vacuouslyUse(T)(ref T x) {}

 void foo()
 {
    int crap;
    vacuouslyUse(crap);
    ................
 }

 use and remove as you wish.

 unused name should be an error. if you want to not use something, you must 
 sweat a little. vacuouslyUse fits the bill exactly. should be in phobos.
I would still prefer it to be a warning (that way it would keep nagging me when I forget to finish up and take out the temporary vacuouslyUse), but at this point I could live with this compromise. It would certainly be a lot better than the total silence it gives me now.
 it works but i kinda doubt you'll be all for it. you don't want to solve 
 the unused variable problem. you want compiler warnings. somehow you'll 
 work your argument out to make my solution undesirable.


 this is an imperfect world. i see value in the no warning stance. you 
 don't see.
I see value in warnings, you don't. The imperfect-world fact only serves to illustrate that taking sound practical advice ("the need for warnings should be minimized") to a unilateral extreme ("all warnings are always bad") just doesn't typically work out. Remember when the Java folks were trying to tell us that nothing should ever be non-OO?
 therefore when competition in d compilers arena will pick up i'd see a 
 warning as a shitty concession, while you will grin "i told ya all along".
I'm well aware of the difference between truth and popular opinion.
 Designing everything to fit into a compiles/errors dichotomy is great, in
 theory. But in practice it's just unrealistic. Even Walter ended up 
 having
 to add a few "warnings" to D (even if he implemented them more as 
 optional
 errors than as true warnings). Which is why, as I was saying in the
 beginning, trying to eliminate the need for a specific warning is great -
 *if* it actually pans out. But that doesn't always happen.
when doesn't it happen?
As just a few examples: http://www.digitalmars.com/d/1.0/warnings.html
 Anything like that can be attached to an optional command-line parameter
 that defaults to "off". Problem solved.
weak argument. a good program does some shit and does it well. i'm pissed that emacs can browse the web already, alright?
Trying to convince a Unix-hater of something by appealing to Unix values is kinda like using the bible to convince an atheist of something. But I'm well aware that debating the merits of Unix philosophy with a Unix fan is equally fruitless, so I'm going to leave this particular point at that.
Jul 09 2008
parent reply superdan <super dan.org> writes:
Nick Sabalausky Wrote:

 As just a few examples:
 http://www.digitalmars.com/d/1.0/warnings.html
yarp i'm so happy you sent those. let's take'em 1 by 1. please let me know agree or disagree.

1. warning - implicit conversion of expression expr of type type to type can cause loss of data

it's a shame this is allowed at all. any conversion that involves a loss must require a cast right there. as far as the example given goes:

byte a, b;
byte c = a + b;

the compiler can't know a + b is in byte range so a cast is good. but take this now:

byte c = a & b;

in this case the compiler must accept the code. so what i'm saying is that better operator types will help a ton.

2. warning - array 'length' hides other 'length' name in outer scope

i seem to recall andrei pissed on this one until it dissolved into the fucking ground. can't agree more. it is a crying shame that this stupid length thing is still in the language. just get rid of it already.

3. warning - no return at end of function

now what a sick decision was it to accept that in the first place. an overwhelming percentage of functions *can* and *will* be written to have a meaningful return at the end. then why the fuck cater for the minority and hurt everybody else. just require a return or throw and call it a day. people who can't return something meaningful can just put a throw. code growth is negligible. impact on speed is virtually nil. why the hell do we even bother arguing over it.

4. warning - switch statement has no default

another example of a motherfuck. just require total coverage. in closed-set cases i routinely write anyway:

switch (crap)
{
case a: ...; break;
case b: ...; break;
default: assert(crap == c); ...; break;
}

again: the vast majority of code already has a default. the minority just has to add a little code. make it an error.

5. warning - statement is not reachable

this is a tad more messy. people routinely insert a premature return in there to check for stuff. it pisses me off when that won't compile. i discovered i could do this:

if (true) return crap;

that takes care of the error. and i think it's actually good for me because it really is supposed to be temporary code. it jumps at me in a review. as it should.
Jul 10 2008
next sibling parent reply Don <nospam nospam.com.au> writes:
superdan wrote:
 Nick Sabalausky Wrote:
 
 As just a few examples:
 http://www.digitalmars.com/d/1.0/warnings.html
yarp i'm so happy you sent those. let's take'em 1 by 1. please let me know agree or disagree. 1. warning - implicit conversion of expression expr of type type to type can cause loss of data it's a shame this is allowed at all. any conversion that involves a loss must require a cast right there. as far as the example give goes: byte a, b; byte c = a + b; compiler can't know a + b is in byte range so a cast is good. but take this now: byte c = a & b; in this case the compiler must accept code. so what i'm saying is that better operator types will help a ton.
That's in bugzilla. http://d.puremagic.com/issues/show_bug.cgi?id=1257 That whole area needs to be tidied up. Polysemous types should really help with this.
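Roughly, the behavior being asked for here is reasoning from value ranges instead of just types. A sketch of the intended rules (not of what DMD does today):

ubyte a, b;
ubyte c = a & b;        // range of a & b provably fits in a ubyte: accept
ubyte d = (a + b) / 2;  // range is [0, 255] after the division: accept
// ubyte e = a + b;     // range is [0, 510]: should still require a cast

(D2's later "value range propagation" implements much of this.)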
 3. warning - no return at end of function
 
 now what a sick decision was it to accept that in the first place. an
overwhelming percentage of functions *can* and *will* be written to have a
meaningful return at the end. then why the fuck cater for the minority and hurt
everybody else. just require a return or throw and call it a day. people who
can't return something meaningful can just put a throw. code growth is
negligible. impact on speed is virtually nil. why the hell do we even bother
arguing over it.
Yup. return should be required, unless the function contains inline asm. Otherwise, manually put assert(0); on the last line.
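In code, that convention looks like this (a trivial sketch):

int sign(int x)
{
    if (x < 0) return -1;
    if (x > 0) return 1;
    if (x == 0) return 0;
    assert(0); // every path above returns; this marks the end as unreachable
}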
 4. warning - switch statement has no default
 
 another example of a motherfuck. just require total coverage. in closed-set
cases i routinely write anyway:
 
 switch (crap) 
 {
 case a: ...; break;
 case b: ...; break;
default: assert(crap == c); ...; break;
 }
 
 again: vast majority of code already has a default. the minority just has to
add a little code. make it an error.
Yup. Make it an error.
 
 5. warning - statement is not reachable
 
 this is a tad more messy. people routinely insert a premature return in there
to check for stuff. it pisses me off when that won't compile. i discovered i
could do this:
 
 if (true) return crap;
 
 that takes care of the error. and i think it's actually good for me because it
really is supposed to be temporary code. it jumps at me in a review. as it
should.
You can also put assert(0); at the top of the unreachable code.

2, 3, and 4 should definitely be errors.

I also think that uninitialised class variables should be a compile-time error. It's a horrible newbie trap, especially for anyone with a C++ background:

-------------
class C {
    void hello() { writefln("Hello crashing world"); }
}

void main()  {
    C c;  // should be illegal
    c.hello();
}
--------------

My first D program using classes was somewhat like that; it took me ages to work out why it was segfaulting at runtime. It's still the most common mistake I make. You should have to write C c = null; for the rare cases where you really want an uninitialised class reference.
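For contrast, the working version differs by a single initialization:

C c = new C(); // an actual instance now, not a null reference
c.hello();     // prints instead of segfaulting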
Jul 11 2008
next sibling parent reply superdan <super dan.org> writes:
Don Wrote:

 superdan wrote:
 Nick Sabalausky Wrote:
 
 As just a few examples:
 http://www.digitalmars.com/d/1.0/warnings.html
yarp i'm so happy you sent those. let's take'em 1 by 1. please let me know agree or disagree. 1. warning - implicit conversion of expression expr of type type to type can cause loss of data it's a shame this is allowed at all. any conversion that involves a loss must require a cast right there. as far as the example give goes: byte a, b; byte c = a + b; compiler can't know a + b is in byte range so a cast is good. but take this now: byte c = a & b; in this case the compiler must accept code. so what i'm saying is that better operator types will help a ton.
That's in bugzilla. http://d.puremagic.com/issues/show_bug.cgi?id=1257
cool that's great. just a nit now. you mention only logical operations there. (actually you meant bitwise operations.) but i got to thinking a bit and a few integer arithmetic operations should also be included. a / b is never larger than a (cept for signed/unsigned mixed shit). a % b is never larger than b (again save for same shit). this could go a long way toward making casts unnecessary. as a consequence the compiler could tighten its sphincters and become more strict about implicit casts & shit. someone else also mentioned a < b which is fucked for mixed signs. all ordering comparisons like that are fucked and should be disabled. only == and != work for mixed signs. for the rest, a cast must be required. of course if one side is constant there may be no need. i have no idea what to do about a + b with mixed signs. it's messed up like shit.
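For anyone who hasn't been bitten by the mixed-sign ordering trap yet, here it is concretely (D follows C's usual arithmetic conversions here, as far as I know):

void main()
{
    int i = -1;
    uint u = 1;
    // both operands are converted to uint, so i compares as 4294967295
    assert(!(i < u)); // passes, counterintuitively
}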
 That whole area needs to be tidied up. Polysemous types should really 
 help with this.
could someone care to explain what this polysemous shit is (is it not polysemantic btw). the video is too vague about it. maybe this will convince andrei to haul his russian ass over here. btw thought he'd be older and more self-righteous. but i was surprised he seems a laid back dood. tries too hard to be funny tho. but he knows his shit.
 3. warning - no return at end of function
 
 now what a sick decision was it to accept that in the first place. an
overwhelming percentage of functions *can* and *will* be written to have a
meaningful return at the end. then why the fuck cater for the minority and hurt
everybody else. just require a return or throw and call it a day. people who
can't return something meaningful can just put a throw. code growth is
negligible. impact on speed is virtually nil. why the hell do we even bother
arguing over it.
Yup. return should be required, unless function contains inline asm. Otherwise manually put assert(0); at the last line.
i don't think assert(0); is cool. in a release build it disappears and that fucks the whole plan right there.
 4. warning - switch statement has no default
 
 another example of a motherfuck. just require total coverage. in closed-set
cases i routinely write anyway:
 
 switch (crap) 
 {
 case a: ...; break;
 case b: ...; break;
default: assert(crap == c); ...; break;
 }
 
 again: vast majority of code already has a default. the minority just has to
add a little code. make it an error.
Yup. Make it an error.
great! where do i sign the petition?
 5. warning - statement is not reachable
 
 this is a tad more messy. people routinely insert a premature return in there
to check for stuff. it pisses me off when that won't compile. i discovered i
could do this:
 
 if (true) return crap;
 
 that takes care of the error. and i think it's actually good for me because it
really is supposed to be temporary code. it jumps at me in a review. as it
should.
You can also put assert(0); at the top of the unreachable code.
again assert(0); goes away in release mode. but wait, that's unreachable code anyway. guess that could work.
 2,3, and 4 should definitely be errors.
 I also think that uninitialised class variables should be a compile-time 
 error. It's a horrible newbie trap, especially for anyone with a C++ 
 background:
 -------------
 class C {
    void hello() { writefln("Hello crashing world"); }
 };
 
 void main()  {
   C c;  // should be illegal
   c.hello();
 }
 --------------
 My first D program using classes was somewhat like that; took me ages to 
 work out why it was segfaulting at runtime. It's still the most common 
 mistake I make.
 You should have to write C c = null; for the rare cases where you really 
 want an uninitialised class.
yarp. can't tell how many times this bit my ass. in fact even "new" is bad. there should be no new. auto c = C(crap); then classes and structs are more interchangeable.
Jul 11 2008
next sibling parent reply Robert Fraser <fraserofthenight gmail.com> writes:
superdan Wrote:
 a / b is never larger than a (cept for signed/unsigned mixed shit).
a = -10; b = -5
Jul 11 2008
parent superdan <super dan.org> writes:
Robert Fraser Wrote:

 superdan Wrote:
 a / b is never larger than a (cept for signed/unsigned mixed shit).
a = -10; b = -5
i got lazy. whenever i said "larger" i really meant "larger type". so in this case the correct sentence was: a / b never requires a larger type than the type of a.
Jul 11 2008
prev sibling parent reply Bill Baxter <dnewsgroup billbaxter.com> writes:
superdan wrote:
 i don't think assert(0); is cool. in a release build it disappears 
Actually it does not disappear in release mode. assert(0) and assert(false) are always active. For better or worse, it's treated as a special case.
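A two-line demonstration (with DMD, the release-mode form is compiled down to a halt instruction rather than an AssertError):

void main()
{
    assert(0); // debug build: throws AssertError; -release build: halts
    // execution never gets past this line in either mode
}

--bb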
Jul 11 2008
next sibling parent superdan <super dan.org> writes:
Bill Baxter Wrote:

 superdan wrote:
 i don't think assert(0); is cool. in a release build it disappears 
Actually it does not disappear in release mode. assert(0) and assert(false) are always active. For better or worse, it's treated as a special case.
thanks. ow that sucks goat balls. i finally explained to myself the weird shit that happened to me a couple months ago. why the special case???
Jul 11 2008
prev sibling parent Bruno Medeiros <brunodomedeiros+spam com.gmail> writes:
Bill Baxter wrote:
 superdan wrote:
 i don't think assert(0); is cool. in a release build it disappears 
Actually it does not disappear in release mode. assert(0) and assert(false) are always active. For better or worse, it's treated as a special case. --bb
What's "worse" about it? I think it actually makes sense. The reason normal asserts are not put in release mode is for performance reasons: so that the program doesn't waste time processing the assert condition, when it could evaluate to true, and not generate an exception. But when a program reaches an assert(false) it would always throw, and you have a bug, so I see no reason for it to be removed in release code. Unless you want your program to try to keeping going nonetheless (instead of throwing), but I'm not sure that's a good idea, although I guess it could work in some cases. -- Bruno Medeiros - Software Developer, MSc. in CS/E graduate http://www.prowiki.org/wiki4d/wiki.cgi?BrunoMedeiros#D
Jul 27 2008
prev sibling parent reply Robert Fraser <fraserofthenight gmail.com> writes:
Don Wrote:

 superdan wrote:
 4. warning - switch statement has no default
 
 another example of a motherfuck. just require total coverage. in closed-set
cases i routinely write anyway:
 
 switch (crap) 
 {
 case a: ...; break;
 case b: ...; break;
default: assert(crap == c); ...; break;
 }
 
 again: vast majority of code already has a default. the minority just has to
add a little code. make it an error.
Yup. Make it an error.
I agree with everything else, but this one I think shouldn't be an error or warning (the implicit assert(0) is enough). This is because the vast majority of switch statements I use (and many I see) are over enums, and if every branch in the enumeration is covered, a pointless "default" will just complicate code. The "final switch" thing mentioned at the conference & now forgotten, OTOH, is a great idea for statically checking switch statements.
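A sketch of the idea, using the "final switch" syntax it eventually acquired in D2:

import std.stdio;

enum Color { red, green, blue }

void describe(Color c)
{
    final switch (c) // compile error if any member of Color lacks a case
    {
        case Color.red:   writefln("red");   break;
        case Color.green: writefln("green"); break;
        case Color.blue:  writefln("blue");  break;
    }
}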
Jul 11 2008
parent superdan <super dan.org> writes:
Robert Fraser Wrote:

 Don Wrote:
 
 superdan wrote:
 4. warning - switch statement has no default
 
 another example of a motherfuck. just require total coverage. in closed-set
cases i routinely write anyway:
 
 switch (crap) 
 {
 case a: ...; break;
 case b: ...; break;
default: assert(crap == c); ...; break;
 }
 
 again: vast majority of code already has a default. the minority just has to
add a little code. make it an error.
Yup. Make it an error.
I agree with everything else, but this one I think shouldn't be an error or warning (the implicit assert(0) is enough). This is because the vast majority of switch statements I use (and many I see) are over enums, and if every branch in the enumeration is covered, a pointless "default" will just complicate code.
you are not disagreeing. switching over an enum is already closed if you mention all the cases. the compiler knows that. it should indeed just throw an error if you have an out-of-range value that you forged from an int. but that's an uncommon case. don't make everyone pay for a rare bug.
 The "final switch" thing mentioned at the conference & now forgotten, 
 OTOH, is a great idea for statically checking switch statements.
yarp i liked it too til i realized all switches should be final.
Jul 11 2008
prev sibling parent Markus Koskimies <markus reaaliaika.net> writes:
On Thu, 10 Jul 2008 16:06:45 -0400, superdan wrote:

 byte a, b;
 byte c = a + b;
I think that compilers should never generate warnings in these cases. If you overflow in arithmetic operations on operands of the same type, it is at most a runtime error issue. You should know how large a value each basic type can store, and use a large enough data type in the code. Since explicit casting easily hides bugs, it should not be overused.
 5. warning - statement is not reachable
 
 this is a tad more messy. people routinely insert a premature return in
 there to check for stuff. it pisses me off when that won't compile. 
That is a very good example of why it would be good to have the possibility to (temporarily) generate code even when it has warnings. That way the warning does not go anywhere and you can still debug your program.
 i discovered i could do this:
 
 if (true) return crap;
Thanks! :)
Jul 11 2008
prev sibling parent reply Markus Koskimies <markus reaaliaika.net> writes:
On Wed, 09 Jul 2008 17:53:52 -0400, Nick Sabalausky wrote:

 In a "properly defined language", how would you solve the problem of
 unintentionally-unused variables?
My suggestion: just give an error. No need for an "unused" keyword; just comment out code that has no effect. For function arguments that are unused but mandatory for keeping an interface, leave the argument without a name (see the sketch below). Furthermore, also give errors for unused private/static things. If they are not used, why are they in the code? Just comment them out. In a similar manner, warn about conditional expressions that have a constant value (like "uint a; if(a > 0) { ... }"), code that has no effect, and all those things :)

And yes, warnings could be considered "optional errors" for those of us who think it's best to tackle all sorts of quirks & potential bugs at compile time rather than trying to find them with runtime debugging. As long as the warning makes some sense and can be circumvented in some reasonable way, just throw it on my screen :)
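What "leave it without a name" looks like in D, with made-up names:

interface Renderer
{
    void draw(int x, int y, bool highlight);
}

class PlainRenderer : Renderer
{
    // the third parameter is required by the interface but deliberately
    // ignored; omitting its name says so explicitly
    void draw(int x, int y, bool)
    {
        // ... draw at (x, y), never highlighting ...
    }
}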
Jul 09 2008
parent reply "Nick Sabalausky" <a a.a> writes:
"Markus Koskimies" <markus reaaliaika.net> wrote in message 
news:g54b6m$1h9i$4 digitalmars.com...
 On Wed, 09 Jul 2008 17:53:52 -0400, Nick Sabalausky wrote:

 In a "properly defined language", how would you solve the problem of
 unintentionally-unused variables?
My suggestion: just give error. No need for "unused" keyword, just comment out code that has no effects. For function arguments, if they are unused but mandatory because of keeping interface, leave it without name if it is not used. Furthermore, give also errors unused private/static things. If they are not used, why are they in the code? Just comment them out. In similar manner, warn about conditional expressions that have constant value (like "uint a; if(a > 0) { ... }"), code that has no effect and all those things :)
I'd prefer a warning, but I'd be fine with all this.
 And yes, warnings could be considered as "optional errors" for us who
 think that it's best to tackle all sorts of quirks & potential bugs at
 compile time and not trying to find them with runtime debugging. As long
 as the warning makes some sense and can be circumvented in some
 reasonable way, just throw it to my screen :)
I, too, like to tackle all that stuff right when I compile. But whenever I've referred to warnings as "optional errors" here, what I meant was that it's impossible to turn off "treat warnings as errors". Even if warnings are not treated as errors, they're still going to show up on your screen (provided you at least enabled them, of course), so you can still choose to deal with them right then and there. The benefit is that you wouldn't have to fix (or wait for a fix for) any warnings in any third-party source libraries you use.

Also, while I can't confirm or deny this at the moment, someone here said that -w compiles currently halt at the first warning. If that's the case, then disabling "treat warnings as errors" would let you see all the warnings at once, not just the first one.

Plus, allowing "treat warnings as errors" to be disabled would decrease the strength of the phenomenon Walter and others described where warnings effectively create multiple versions of the same language. The phenomenon would only occur in places that take "No warnings allowed!" to an obsessive/compulsive/irrational level (rather than a merely sensible level), instead of happening to everybody.
Jul 10 2008
parent reply Markus Koskimies <markus reaaliaika.net> writes:
On Thu, 10 Jul 2008 15:20:54 -0400, Nick Sabalausky wrote:

About that "warnings as errors"; for me the reason for that behavior is 
that I usually include executing the code to the command line when coding 
(and use up arrow to rerun it, if the compiler didn't accept my code):

$ make && ./myProgram

If the compiler does not stop for warnings, I'd need some sort of build 
log to examine the warnings after execution. But if the compiler returns an 
error value (like -1) when it meets warnings, the program is not executed 
and I can easily examine the reasons.

The same happens of course with IDEs, when using "Run" instead of first 
compiling/building the software.

 Also, while I
 can't confirm or deny this at the moment, someone here said that -w
 compiles currently halt at the first warning.
No, it shows all warnings it generates. But IMO it does not generate enough warnings.
Jul 10 2008
parent reply "Nick Sabalausky" <a a.a> writes:
"Markus Koskimies" <markus reaaliaika.net> wrote in message 
news:g55u4d$1h9i$18 digitalmars.com...
 On Thu, 10 Jul 2008 15:20:54 -0400, Nick Sabalausky wrote:

 About that "warnings as errors"; for me the reason for that behavior is
 that I usually include executing the code to the command line when coding
 (and use up arrow to rerun it, if the compiler didn't accept my code):

 $ make && ./myProgram

 If the compiler does not stop for warnings, I'd need some sort of build
 log to examine the warnings after execution. But if the compiler returns
 error value (like -1) when meeting warnings, the program was not executed
 and I can easily examine the reasons.

 This same happens of course with IDEs, when using "Run" instead of first
 compiling/building the software.

 Also, while I
 can't confirm or deny this at the moment, someone here said that -w
 compiles currently halt at the first warning.
No, it shows all warnings it generates. But IMO it does not generate enough warnings.
I suppose I should point out that I have nothing against treating warnings as errors, per se. I just think it should be optional and not forced by the compiler to be either "always treated as errors and there's nothing you can do about it" or "never treated as errors and there's nothing you can do about it"
Jul 10 2008
parent Markus Koskimies <markus reaaliaika.net> writes:
On Thu, 10 Jul 2008 20:28:53 -0400, Nick Sabalausky wrote:

 I suppose I should point out that I have nothing against treating
 warnings as errors, per se. I just think it should be optional and not
 forced by the compiler to be either "always treated as errors and
 there's nothing you can do about it" or "never treated as errors and
 there's nothing you can do about it"
Honestly, (1) I was using the D compiler happily for some years and I thought that it generated warnings just like other compilers do. I was shocked to realize that it really does not.

(2) I realized that there is some kind of fundamentalist ideology of not producing warnings from the compiler (which is extremely silly from my point of view); that's why I suggested that, combining the current D possibilities, it would really make no big difference to treat warnings as errors (since it seems that it is more likely for errors to get into the compiler than warnings).

(3) From the point of view of both the programmer and the compiler designer, I see absolutely no point in not generating warnings when the compiler knows it has done something probably silly. The more optimizations the compiler does, the more aware it is of the source code, and of what v*#p%&"/(¤ %&#s¤/&/ analyses the compiler has already made (about unused vars, private methods, dead code, unused imports etc. etc).

---
(*) Those are Finnish swearing words that do not compile to English. You may use "f**k" for every character ;)
Jul 10 2008
prev sibling next sibling parent reply "Nick Sabalausky" <a a.a> writes:
"Walter Bright" <newshound1 digitalmars.com> wrote in message 
news:g530j8$18th$1 digitalmars.com...
 The reason for treating warnings as errors when warnings are enabled is so 
 that, for a long build, they don't scroll up and off your screen and go 
 unnoticed.
Pardon me for saying so, but that doesn't sound like a very convincing reason to turn every warning (which, by its very nature, is something that might not be a bug) into something that splits the language into what are effectively different languages.

In all the time I've spent using Microsoft compilers, I've found the "x number of errors, y number of warnings" display at the end of every compile to be perfectly sufficient for the problem you point out. If a build involves many different calls to a compiler, then whatever rebuild-like tool is being used could be made to screen-scrape and total up the warnings and errors. Or the compiler could stick the error/warning counts into an output file that gets read and accumulated by the rebuild/make tool. Or a copy of all the output could just be piped into a "grep for the error/warning counts" tool.

This way, DMD's warnings could be lint-like warnings instead of the language-splitting "optional errors" (which I can understand your reluctance to create more of) that they are now. A "treat warnings as errors" flag could be retained for any large builds that involve multiple compiler invocations but for some reason still don't do any form of proper cumulative "x warnings / x errors".
Jul 09 2008
parent reply Walter Bright <newshound1 digitalmars.com> writes:
Nick Sabalausky wrote:
 "Walter Bright" <newshound1 digitalmars.com> wrote in message 
 news:g530j8$18th$1 digitalmars.com...
 The reason for treating warnings as errors when warnings are enabled is so 
 that, for a long build, they don't scroll up and off your screen and go 
 unnoticed.
Pardon me for saying so, but that doesn't sound like a very convincing reason to turn every warning (which, by it's very nature, is something that might not be a bug) into something that splits the language into what are effectively different languages.
I'll draw on my 25 years of experience with warnings to answer this. If you turn warnings on, then you want to see them and presumably deal with them. If you don't deal with them, then they persist every time you compile, and either they get very irritating and you fix them anyway, or you develop a blind spot for them and never see the ones you do want to fix. Piping the output into a file and then perusing it manually looking for warning statements is never going to happen. Complex builds tend to produce a lot of output, and poking through it looking for warnings every time you build is not practical. Changing your build process to point out warnings is the same thing as the compiler treating them as errors, except it's extra work for the build master. Trying to educate your programmers into doing extra work to deal with warnings that scroll off the screen is a lost cause. If you're using a static analysis tool, such as Coverity, which produces lots of spurious warnings, it is not put in the build process. It's run occasionally as a separate evaluation tool.
Jul 09 2008
next sibling parent Sean Kelly <sean invisibleduck.org> writes:
Walter Bright wrote:
 Nick Sabalausky wrote:
 "Walter Bright" <newshound1 digitalmars.com> wrote in message 
 news:g530j8$18th$1 digitalmars.com...
 The reason for treating warnings as errors when warnings are enabled 
 is so that, for a long build, they don't scroll up and off your 
 screen and go unnoticed.
Pardon me for saying so, but that doesn't sound like a very convincing reason to turn every warning (which, by it's very nature, is something that might not be a bug) into something that splits the language into what are effectively different languages.
I'll draw on my 25 years of experience with warnings to answer this. If you turn warnings on, then you want to see them and presumably deal with them. If you don't deal with them, then they persist every time you compile, and either they get very irritating and you fix them anyway, or you develop a blind spot for them and never see the ones you do want to fix.
This is true. However, warnings are often related to code structure, and the compiler isn't perfect at identifying real problems... and code changes to work around deficiencies in the checking tool aren't always appealing. For example, there is a file in the GC code, if I remember correctly, that doesn't compile correctly with warnings enabled because it uses a goto or some such that confuses the compiler about what's going on. If this were C++ I might be inclined to pragma out that particular warning for the area where the warning is displayed.

Another issue is with third-party libraries. I always compile my code with the strictest warning settings, yet some of the libraries I use aren't so careful. With them, the easiest thing to do is often to assume that they work correctly despite the warnings and disable warning messages for the relevant headers.
 Piping the output into a file and then perusing it manually looking for 
 warning statements is never going to happen. Complex builds tend to 
 produce a lot of output, and poking through it looking for warnings 
 every time you build is not practical. Changing your build process to 
 point out warnings is the same thing as the compiler treating them as 
 errors, except it's extra work for the build master.
It isn't practical to do so for every build, but it's not uncommon for a team to set aside some time to address warnings in bulk, say between releases.
 Trying to educate your programmers into doing extra work to deal with 
 warnings that scroll off the screen is a lost cause.
 
 If you're using a static analysis tool, such as Coverity, which produces 
 lots of spurious warnings, it is not put in the build process. It's run 
 occasionally as a separate evaluation tool.
That's certainly an option, and probably a preferable one overall. Sean
Jul 09 2008
prev sibling next sibling parent "Nick Sabalausky" <a a.a> writes:
"Walter Bright" <newshound1 digitalmars.com> wrote in message 
news:g537q0$1vi0$1 digitalmars.com...
 Nick Sabalausky wrote:
 "Walter Bright" <newshound1 digitalmars.com> wrote in message 
 news:g530j8$18th$1 digitalmars.com...
 The reason for treating warnings as errors when warnings are enabled is 
 so that, for a long build, they don't scroll up and off your screen and 
 go unnoticed.
Pardon me for saying so, but that doesn't sound like a very convincing reason to turn every warning (which, by it's very nature, is something that might not be a bug) into something that splits the language into what are effectively different languages.
I'll draw on my 25 years of experience with warnings to answer this. If you turn warnings on, then you want to see them and presumably deal with them. If you don't deal with them, then they persist every time you compile, and either they get very irritating and you fix them anyway, or you develop a blind spot for them and never see the ones you do want to fix.
First of all, if you don't want to deal with warnings, then along with what you said, you presumably wouldn't have turned them on in the first place. So I'm not sure you'd be developing a blind spot (unless you're being required to use them, which goes back to the management discussion).

Aside from that, I'll fully agree that certain warnings can get annoying and eventually overlooked, *if* you're using a language like C/C++ that has accumulated decades of warnings resulting from design issues that could have been fixed by a design change, but never were, simply because maintaining full backwards compatibility (across all those years) was considered more important than ever fixing the problems properly. D's not really in that boat. But even if it did end up in that boat someday, I'd much rather have the *chance* of not noticing a particular warning than be guaranteed never to notice it simply because it was decided not to even offer the warning out of fear that it might get ignored.
 Piping the output into a file and then perusing it manually looking for 
 warning statements is never going to happen. Complex builds tend to 
 produce a lot of output, and poking through it looking for warnings every 
 time you build is not practical. Changing your build process to point out 
 warnings is the same thing as the compiler treating them as errors, except 
 it's extra work for the build master.

 Trying to educate your programmers into doing extra work to deal with 
 warnings that scroll off the screen is a lost cause.
I think you misunderstood me. What I was talking about would only involve the makers of things like rebuild or make. All we need is a cumulative "x errors, x warnings" line at the end of the build process. That's enough to let people know that there were warnings they should scroll up and look at (if they care about warnings in the first place). That would eliminate the need to always force warnings as errors out of the mere worry that someone might not see one because it scrolled away.

And if a so-called "programmer" has a problem looking at the "x errors, x warnings" display, then they themselves are already a lost cause, period. I don't want to have to put up with gimped tools just because some incompetent morons are masquerading as real programmers.

For illustration, such a wrapper would be nearly trivial; here's a rough sketch in D (the invocation and the naive "Error"/"Warning" matching are invented):
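import std.algorithm.searching : canFind;
import std.process : execute;
import std.stdio : write, writefln;
import std.string : splitLines;

int main(string[] args)
{
    // e.g. `buildsummary dmd -w -c foo.d` -- runs the real compiler...
    auto result = execute(args[1 .. $]);
    write(result.output); // pass the compiler's output through untouched

    // ...then append the cumulative summary so warnings can't just
    // scroll away unnoticed.
    size_t errors, warnings;
    foreach (line; result.output.splitLines())
    {
        if (line.canFind("Error"))   ++errors;
        if (line.canFind("Warning")) ++warnings;
    }
    writefln("%s errors, %s warnings", errors, warnings);
    return result.status;
}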
 If you're using a static analysis tool, such as Coverity, which produces 
 lots of spurious warnings, it is not put in the build process. It's run 
 occasionally as a separate evaluation tool.
I can agree with that, but with the caveat that I, for one, would at the very least choose the most useful subset of those warnings to run during each compile.

As an example, the last time I was using ActionScript 1 (ECMAScript without any of the improvements from v3 or beyond), I was constantly running into problems that a compiler like DMD would have caught (and considered errors), but ended up spending half an hour, a full hour, etc., trying to debug. I incorporated a JavaScript lint tool into my workflow (ran it every time I saved and was about to test something), and it helped immensely. Never gave me any problem.

My point is, I *do* want certain warnings to be checked for on every compile. Now yes, there can be extra ones that are really anal and only occasionally useful. But in a normal setup where warnings are only *optionally* treated as errors *and* I can select which warnings I want, those really anal, annoying warnings can just be run on occasion, and I can still have my more common and highly useful ones caught right away - which is what I want. And I seriously doubt I'm any sort of exceptional case by feeling that way about it.
Jul 09 2008
prev sibling next sibling parent "Bruce Adams" <tortoise_74 yeah.who.co.uk> writes:
On Wed, 09 Jul 2008 21:41:35 +0100, Walter Bright  
<newshound1 digitalmars.com> wrote:

 Nick Sabalausky wrote:
 "Walter Bright" <newshound1 digitalmars.com> wrote in message  
 news:g530j8$18th$1 digitalmars.com...
 The reason for treating warnings as errors when warnings are enabled  
 is so that, for a long build, they don't scroll up and off your screen  
 and go unnoticed.
Pardon me for saying so, but that doesn't sound like a very convincing reason to turn every warning (which, by its very nature, is something that might not be a bug) into something that splits the language into what are effectively different languages.
I'll draw on my 25 years of experience with warnings to answer this. If you turn warnings on, then you want to see them and presumably deal with them. If you don't deal with them, then they persist every time you compile, and either they get very irritating and you fix them anyway, or you develop a blind spot for them and never see the ones you do want to fix. Piping the output into a file and then perusing it manually looking for warning statements is never going to happen. Complex builds tend to produce a lot of output, and poking through it looking for warnings every time you build is not practical. Changing your build process to point out warnings is the same thing as the compiler treating them as errors, except it's extra work for the build master. Trying to educate your programmers into doing extra work to deal with warnings that scroll off the screen is a lost cause. If you're using a static analysis tool, such as Coverity, which produces lots of spurious warnings, it is not put in the build process. It's run occasionally as a separate evaluation tool.
Focussing mainly on your last point... Whenever I work with static analysis tools (I'm talking C++ here, obviously), the first thing I do is to put them into the build process, right after the compiler. If you run one just occasionally you will get lost in a sea of spurious warnings.

Eliminating warnings, and hence the slight possibility of error that goes with them, takes effort, and that effort can be focussed on problem areas. You can use a different lint configuration for a different set of files and gradually crank up the quality, if necessary enabling one type of warning for only a few files at a time. A similar approach is to record the warning count and require that it never increases, occasionally working to lower the limit.

Instead of educating programmers to deal with screenfuls of messages, the focus should be on educating them that code quality is important and that removing warnings is one way of improving quality. It is not the be-all and end-all, and not nearly as good as, say, automated testing. Personally I insist on both wherever I can.

For concreteness, a rough sketch of that warning-count ratchet as a little D tool (the log and baseline file names, and the naive "Warning" matching, are all invented for illustration):
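import std.algorithm.searching : canFind;
import std.conv : to;
import std.file : exists, readText, write;
import std.stdio : stderr;
import std.string : splitLines, strip;

int main()
{
    // Assumes the build script captured the compiler output to build.log.
    size_t count;
    foreach (line; readText("build.log").splitLines())
        if (line.canFind("Warning")) ++count;

    // Any increase over the recorded baseline fails the build.
    size_t baseline = exists("warnings.baseline")
        ? readText("warnings.baseline").strip().to!size_t : size_t.max;
    if (count > baseline)
    {
        stderr.writefln("warning count rose: %s -> %s", baseline, count);
        return 1;
    }

    write("warnings.baseline", count.to!string); // ratchet down over time
    return 0;
}

Regards,

Bruce.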
Jul 09 2008
prev sibling next sibling parent Markus Koskimies <markus reaaliaika.net> writes:
On Wed, 09 Jul 2008 13:41:35 -0700, Walter Bright wrote:

 Nick Sabalausky wrote:
 "Walter Bright" <newshound1 digitalmars.com> wrote in message
 news:g530j8$18th$1 digitalmars.com...
 The reason for treating warnings as errors when warnings are enabled
 is so that, for a long build, they don't scroll up and off your screen
 and go unnoticed.
Pardon me for saying so, but that doesn't sound like a very convincing reason to turn every warning (which, by its very nature, is something that might not be a bug) into something that splits the language into what are effectively different languages.
If you turn warnings on, then you want to see them and presumably deal with them. If you don't deal with them, then they persist every time you compile, and either they get very irritating and you fix them anyway, or you develop a blind spot for them and never see the ones you do want to fix.
I completely agree with this. If warnings are generated, it's best to stop compilation and let the developer correct those parts. Warnings that don't stop the build process have no use at all.
Jul 09 2008
prev sibling next sibling parent reply Leandro Lucarella <llucax gmail.com> writes:
Walter Bright wrote, on 9 July at 13:41:
 Piping the output into a file and then perusing it manually looking for
warning 
 statements is never going to happen.
I code using VIM. VIM has a very convenient feature that collects the make (compiler) output and lets you iterate over warnings/errors (using :cn and :cp). So yes. It's going to happen. It happens all the time. And I think most decent IDEs/editors do that, so it's not something VIM-specific.

--
Leandro Lucarella (luca) | Blog colectivo: http://www.mazziblog.com.ar/blog/
----------------------------------------------------------------------------
GPG Key: 5F5A8D05 (F8CD F9A7 BF00 5431 4145 104C 949E BFB6 5F5A 8D05)
----------------------------------------------------------------------------
Me duele encontrarte en mis sueños muertos
Jul 10 2008
parent reply Bill Baxter <dnewsgroup billbaxter.com> writes:
Leandro Lucarella wrote:
 Walter Bright wrote, on 9 July at 13:41:
 Piping the output into a file and then perusing it manually looking for
warning 
 statements is never going to happen.
I code using VIM. VIM has a very convenient feature that collects the make (compiler) output and lets you iterate over warnings/errors (using :cn and :cp). So yes. It's going to happen. It happens all the time. And I think most decent IDEs/editors do that, so it's not something VIM-specific.
Emacs has it too! M-x ` --bb
Jul 10 2008
parent reply "Nick Sabalausky" <a a.a> writes:
"Bill Baxter" <dnewsgroup billbaxter.com> wrote in message 
news:g561hh$2g6g$2 digitalmars.com...
 Leandro Lucarella wrote:
 Walter Bright wrote, on 9 July at 13:41:
 Piping the output into a file and then perusing it manually looking for 
 warning statements is never going to happen.
I code using VIM. VIM has a very convenient feature that collects the make (compiler) output and lets you iterate over warnings/errors (using :cn and :cp). So yes. It's going to happen. It happens all the time. And I think most decent IDEs/editors do that, so it's not something VIM-specific.
Emacs has it too! M-x `
Every IDE I've ever used does it. And I'm constantly IDE-hopping.
Jul 10 2008
parent Markus Koskimies <markus reaaliaika.net> writes:
On Thu, 10 Jul 2008 20:31:02 -0400, Nick Sabalausky wrote:

 "Bill Baxter" <dnewsgroup billbaxter.com> wrote in message
 news:g561hh$2g6g$2 digitalmars.com...
 Leandro Lucarella wrote:
 Walter Bright wrote, on 9 July at 13:41:
 Piping the output into a file and then perusing it manually looking
 for warning statements is never going to happen.
I code using VIM. VIM has a very convenient feature that collects the make (compiler) output and lets you iterate over warnings/errors (using :cn and :cp). So yes. It's going to happen. It happens all the time. And I think most decent IDEs/editors do that, so it's not something VIM-specific.
Emacs has it too! M-x `
Every IDE I've ever used does it. And I'm constantly IDE-hopping.
Currently, I use an IDE only if forced to do so. Kate/nedit & make & tee do everything I need ;)
Jul 10 2008
prev sibling parent Bruno Medeiros <brunodomedeiros+spam com.gmail> writes:
Walter Bright wrote:
 
 I'll draw on my 25 years of experience with warnings to answer this.
 
 If you turn warnings on, then you want to see them and presumably deal 
 with them. If you don't deal with them, then they persist every time you 
 compile, and either they get very irritating and you fix them anyway, or 
 you develop a blind spot for them and never see the ones you do want to 
 fix.
 
 Piping the output into a file and then perusing it manually looking for 
 warning statements is never going to happen. Complex builds tend to 
 produce a lot of output, and poking through it looking for warnings 
 every time you build is not practical. Changing your build process to 
 point out warnings is the same thing as the compiler treating them as 
 errors, except it's extra work for the build master.
 
Of course it's not going to happen. Cause manually looking at the compiler output is plain ridiculous. See my other post for details.
 Trying to educate your programmers into doing extra work to deal with 
 warnings that scroll off the screen is a lost cause.
 
Again, anyone who firmly believes that trying to look at console output is even a worthy cause to begin with (lost or not), is living in the past. -- Bruno Medeiros - Software Developer, MSc. in CS/E graduate http://www.prowiki.org/wiki4d/wiki.cgi?BrunoMedeiros#D
Jul 27 2008
prev sibling parent reply Bruno Medeiros <brunodomedeiros+spam com.gmail> writes:
Walter Bright wrote:
 The reason for treating warnings as errors when warnings are enabled is 
 so that, for a long build, they don't scroll up and off your screen and 
 go unnoticed.
Dear gods... These are the kind of comments that make me cringe deep inside, and honestly worry me about the future of D. :(

Looking at the output of a compiler in a console is a thing of the past. It's fraking obsolete. It's only done when you're hobbying or toying with the language. No one who does serious development is going to do that. What you do is use an IDE with a minimum of intelligence, one that presents the warnings to you in a sensible way. Here's an example from CDT:
http://www-128.ibm.com/developerworks/library/os-eclipse-ganymede/

Before some people here say they don't use an IDE, but instead use <editor foo with syntax highlighting and little more than that> and are fine with it, well, ask yourselves: are you doing any serious development, or just toying around? If you were in a multi-team, 6+ month project, working with such tools, do you think you would perform the same as the same team with a proper toolchain? Heed my words: you wouldn't. Thinking otherwise is a delusion.

And it's even worse if you're Walter. Basing such language/tool design issues on outdated notions is a danger to D's development. And it's not just the "looking at compiler output in a console" issue; there are plenty of other cases of this mentality. Walter, you need to shed some of your outdated notions of the software development process and think of the *modern* (present and future) development models that exist, or D will risk heavily retarded adoption (or even failure). I'm dead serious, and I want to "record" this message for future reference, especially if things don't go well (which may not be obvious though).

--
Bruno Medeiros - Software Developer, MSc. in CS/E graduate
http://www.prowiki.org/wiki4d/wiki.cgi?BrunoMedeiros#D
Jul 27 2008
parent reply Jussi Jumppanen <jussij zeusedit.com> writes:
Bruno Medeiros Wrote:

 Before some people here say they don't use an IDE, but 
 instead use <editor foo with syntax highlighting and 
 little more than that> and are fine with it, 
I would say that the reason developers still prefer to code with text editors rather than IDEs is that they find the text editor more productive. Eclipse-based IDEs are just far too slow for a good developer's fingers. When you're used to a super quick, highly responsive editor, it can be terribly frustrating to have to step down to a slow IDE. The slowness of the keyboard response turns what was an automatic action, that of typing, into a thought process, and this plays havoc with the 'thinking about the code while I type' thought process.
 If you were in a multi-team, 6+ months project, working with 
 such tools, do you think you would perform the same as the same 
 team, with a proper toolchain?
Yes. I would say a team of 'editor-based programmers' would be far more productive than a team of 'IDE-based programmers'. The simple fact that the editor programmer can code outside the IDE immediately means they have a better understanding of their coding environment and their toolchain. There is nothing more pathetic than to watch an IDE programmer turn into a quivering mess, just because they can't find the answer to simple questions like: Why does my program run fine in the IDE but not outside the IDE?
Jul 27 2008
next sibling parent reply Robert Fraser <fraserofthenight gmail.com> writes:
Jussi Jumppanen wrote:
 Bruno Medeiros Wrote:
 
 Before some people here say they don't use an IDE, but 
 instead use <editor foo with syntax highlighting and 
 little more than that> and are fine with it, 
I would say that the reason developers still prefer to code with text editors rather than IDEs is that they find the text editor more productive. Eclipse-based IDEs are just far too slow for a good developer's fingers. When you're used to a super quick, highly responsive editor, it can be terribly frustrating to have to step down to a slow IDE. The slowness of the keyboard response turns what was an automatic action, that of typing, into a thought process, and this plays havoc with the 'thinking about the code while I type' thought process.
Bullshit. Do you have a 200 MHz Pentium with 128MB RAM? Even then, IDEs are going to prioritize the editor itself over any autocomplete/background processing, so the editor shouldn't be any less responsive. It might take 5 seconds if you click "go to definition" and it has to open a new file, but that's vs. 2 minutes of searching for an import, finding the file location, and using find to get to the definition in that file.

The issue is the placebo effect and the comfort zone... which are real issues (that's why so many people are like "oh, Vista is soooo bloated compared to XP"...). If you've been using ed to write code for the last 30 years, the mental concept of using your $2000 computer to its full potential to help you write software is mind-boggling. If you're more comfortable with your "power-editor", or just can't deal with a 1-minute startup time for a product you're going to be using for 8 hours, well, all the more power to ya; no amount of productivity gains could make you willing to switch. I'm not saying "more complex is always better," but why let all that processing power go to waste?
Jul 27 2008
next sibling parent reply "Bill Baxter" <wbaxter gmail.com> writes:
On Mon, Jul 28, 2008 at 11:15 AM, Robert Fraser
<fraserofthenight gmail.com>wrote:

 Jussi Jumppanen wrote:

 Bruno Medeiros Wrote:

  Before some people here say they don't use an IDE, but instead use
 <editor foo with syntax highlighting and little more than that> and are fine
 with it,
I would say that the reason developers still prefer to code with text editors rather than IDEs is that they find the text editor more productive. Eclipse-based IDEs are just far too slow for a good developer's fingers. When you're used to a super quick, highly responsive editor, it can be terribly frustrating to have to step down to a slow IDE. The slowness of the keyboard response turns what was an automatic action, that of typing, into a thought process, and this plays havoc with the 'thinking about the code while I type' thought process.
Bullshit. Do you have a 200 MHz Pentium with 128MB RAM? Even then, IDEs are going to prioritize the editor itself over any autocomplete/background processing, so the editor shouldn't be any less responsive. It might take 5 seconds if you click "go to definition" and it has to open a new file, but that's vs 2 minutes of searching for an import, finding the file location, and using find to get to the definition in that file. The issue is the placebo effect and the comfort zone... which are real issues (that's why so many people are like "oh, Vista is soooo bloated compared to XP"...). If you've been using ed to write code for the last 30 years, the mental concept of using your $2000 computer to its full potential to help you write software is mind-boggling. If you're more comfortable with your "power-editor" or just can't deal with a 1-minute startup time for a product you're going to be using for 8 hours, well all the more power to ya; no amount of productivity gains could make you willing to switch. I'm not saying "more complex is always better," but why let all that processing power go to waste?
I think part of the problem is that there are a whole lot of IDEs that really don't live up to the potential you guys are talking about. Plus IDEs come with their own set of problems. For instance, I just wasted most of a day getting an MSVC7 project set up to also work with MSVC9. That's just ridiculous. Microsoft goes and makes these minor changes to their project file formats for every release of Visual Studio, and then only provides a tool to do 1-way, in-place upgrades of all your project files. It's insane. Just imagine if you were forced to fork your makefiles for every dang version of GCC that comes out. The way project management works in IDEs is often just completely silly like that.

The so-called "Intellisense" in Visual Studio also has historically been pretty lame, with refactoring support basically non-existent. The Visual Assist add-on from Whole Tomato was pretty much a "must" to bring it up to snuff. I get the impression that the Java IDEs offer a lot more on the refactoring frontier.

So that's just to say, it's easy to get the impression that IDEs are not useful, because there are many IDEs that genuinely are not that useful. I can see where Jussi is coming from. I have a feeling that when Bruno says "IDE" he's thinking of IDEs at their very best, not another one of these lame editors with syntax highlighting and a "compile" button that claims to be an IDE.

I still primarily like to use my good ole emacs for writing large amounts of new code. There I don't find all the little buttons and completion popups and things in an IDE very useful. But when it comes to debugging and fixing code, damn it's nice to have the IDE there with all its quick cross-linking abilities. The integrated debugger in MSVC is also damn fine.

--bb
Jul 27 2008
parent reply Robert Fraser <fraserofthenight gmail.com> writes:
Bill Baxter Wrote:

 On Mon, Jul 28, 2008 at 11:15 AM, Robert Fraser
 <fraserofthenight gmail.com>wrote:
 
 Jussi Jumppanen wrote:

 Bruno Medeiros Wrote:

  Before some people here say they don't use an IDE, but instead use
 <editor foo with syntax highlighting and little more than that> and are fine
 with it,
I would say that the reason developers still prefer to code with text editors rather than IDEs is that they find the text editor more productive. Eclipse-based IDEs are just far too slow for a good developer's fingers. When you're used to a super quick, highly responsive editor, it can be terribly frustrating to have to step down to a slow IDE. The slowness of the keyboard response turns what was an automatic action, that of typing, into a thought process, and this plays havoc with the 'thinking about the code while I type' thought process.
Bullshit. Do you have a 200 MHz Pentium with 128MB RAM? Even then, IDEs are going to prioritize the editor itself over any autocomplete/background processing, so the editor shouldn't be any less responsive. It might take 5 seconds if you click "go to definition" and it has to open a new file, but that's vs 2 minutes of searching for an import, finding the file location, and using find to get to the definition in that file. The issue is the placebo effect and the comfort zone... which are real issues (that's why so many people are like "oh, Vista is soooo bloated compared to XP"...). If you've been using ed to write code for the last 30 years, the mental concept of using your $2000 computer to its full potential to help you write software is mind-boggling. If you're more comfortable with your "power-editor" or just can't deal with a 1-minute startup time for a product you're going to be using for 8 hours, well all the more power to ya; no amount of productivity gains could make you willing to switch. I'm not saying "more complex is always better," but why let all that processing power go to waste?
I think part of the problem is that there are a whole lot of IDEs that really don't live up to the potential you guys are talking about. Plus IDEs come with their own set of problems. For instance I just wasted most of a day getting a MSVC7 project set up to also work with MSVC9. That's just ridiculous. Microsoft goes and makes these minor changes to their project file formats for every release of Visual Studio, and then only provide a tool to do 1-way, in-place upgrades of all your project files. It's insane. Just imagine if you were forced to fork your makefiles for every dang version of GCC that comes out. The way project management works in IDEs is often just completely silly like that. The so called "Intellisense" in Visual Studio also has historically been pretty lame, with refactoring support basically non-existant. The Visual Assist add-on from Whole Tomato was pretty much a "must" to bring it up to snuff. I get the impression that the Java IDEs offer a lot more on the refactoring frontier. So that's just to say, it's easy to get the impression that IDEs are not useful because there are many IDEs that genuinely are not that useful. I can see where Jussi is coming from. I have a feeling when Brunos says "IDE" he's thinking of IDEs at their very best. Not another one of these lame editors with syntax highlighting and a "compile" button that claims to be an IDE. I still primarily like to use my good ole emacs for writing large amounts of new code. There I don't find all the little buttons and completion popups and things in an IDE very useful. But when it comes to debugging and fixing code, damn it's nice to have the IDE there with all it's quick cross-linking abilities. The integrated debugger in MSVC is also damn fine. --bb
VS is crap (when the VS team is using Source Insight to develop it, you I can do with Eclipse + JDT for Java; you have to use ReSharper to get the functionality a real IDE can provide.
Jul 28 2008
parent reply "Bill Baxter" <wbaxter gmail.com> writes:
Content-Disposition: inline

On Tue, Jul 29, 2008 at 2:56 AM, Robert Fraser
<fraserofthenight gmail.com>wrote:

 VS is crap (when the VS team is using Source Insight to develop it, you

 I can do with Eclipse + JDT for Java; you have to use ReSharper to get
 the functionality a real IDE can provide.
Hmm, Bruno is an Eclipse fan too. So maybe when you guys say "an IDE" you really mean "Eclipse+JDT for Java". Are there any other IDEs, for any language, out there that you would deem acceptable? Just curious. --bb
Jul 28 2008
next sibling parent reply Robert Fraser <fraserofthenight gmail.com> writes:
Bill Baxter Wrote:

 On Tue, Jul 29, 2008 at 2:56 AM, Robert Fraser
 <fraserofthenight gmail.com>wrote:
 
 VS is crap (when the VS team is using Source Insight to develop it, you

 I can do with Eclipse + JDT for Java; you have to use ReSharper to get
 the functionality a real IDE can provide.
Hmm, Brunos is an Eclipse fan too. So maybe when you guys say "an IDE" you really mean "Eclipse+JDT for Java". Are there any other IDEs, for any language, out there that you would deem acceptable? Just curious. --bb
sucks). I don't like either as much as I like JDT, but what can you do?
Jul 28 2008
parent Yigal Chripun <yigal100 gmail.com> writes:
Robert Fraser wrote:
 Bill Baxter Wrote:
 
 On Tue, Jul 29, 2008 at 2:56 AM, Robert Fraser
 <fraserofthenight gmail.com>wrote:

 VS is crap (when the VS team is using Source Insight to develop it, you

 I can do with Eclipse + JDT for Java; you have to use ReSharper to get
 the functionality a real IDE can provide.
Hmm, Brunos is an Eclipse fan too. So maybe when you guys say "an IDE" you really mean "Eclipse+JDT for Java". Are there any other IDEs, for any language, out there that you would deem acceptable? Just curious. --bb
sucks). I don't like either as much as I like JDT, but what can you do?
Have you tried CDT for Eclipse? NetBeans also has a C++ plugin.
Jul 29 2008
prev sibling next sibling parent Paul D. Anderson <paul.d.removethis.anderson comcast.andthis.net> writes:
Bill Baxter Wrote:

 On Tue, Jul 29, 2008 at 2:56 AM, Robert Fraser
 <fraserofthenight gmail.com>wrote:
 
 VS is crap (when the VS team is using Source Insight to develop it, you

 I can do with Eclipse + JDT for Java; you have to use ReSharper to get
 the functionality a real IDE can provide.
Hmm, Brunos is an Eclipse fan too. So maybe when you guys say "an IDE" you really mean "Eclipse+JDT for Java". Are there any other IDEs, for any language, out there that you would deem acceptable? Just curious. --bb
I prefer IntelliJ for Java development, although Eclipse and NetBeans are both good tools. IntelliJ has better (IMHO) code completion, macro and refactoring capabilities, but the difference is probably just that I've used IntelliJ more. If I had to pick one feature that stands out, it is the refactoring. IntelliJ is a commercial product, but they have a policy of making it available to open source projects at no cost (which I've been the beneficiary of). Paul
Jul 28 2008
prev sibling next sibling parent Don <nospam nospam.com.au> writes:
Bill Baxter wrote:
 
 
 On Tue, Jul 29, 2008 at 2:56 AM, Robert Fraser 
 <fraserofthenight gmail.com <mailto:fraserofthenight gmail.com>> wrote:
 
     VS is crap (when the VS team is using Source Insight to develop it, you

     I can do with Eclipse + JDT for Java; you have to use ReSharper to get
     the functionality a real IDE can provide.
 
 
 Hmm, Bruno is an Eclipse fan too.  So maybe when you guys say "an IDE" 
 you really mean "Eclipse+JDT for Java".
This makes sense now. There might not be so much disagreement after all.

1: "my favourite text editor is better than the IDEs I've used (VS)"
2: "my favourite IDE is better than any text editor"

Both of these statements could be true.
Jul 29 2008
prev sibling parent Bruno Medeiros <brunodomedeiros+spam com.gmail> writes:
Bill Baxter wrote:
 
 
 On Tue, Jul 29, 2008 at 2:56 AM, Robert Fraser 
 <fraserofthenight gmail.com <mailto:fraserofthenight gmail.com>> wrote:
 
     VS is crap (when the VS team is using Source Insight to develop it, you

     I can do with Eclipse + JDT for Java; you have to use ReSharper to get
     the functionality a real IDE can provide.
 
 
 Hmm, Bruno is an Eclipse fan too.  So maybe when you guys say "an IDE" 
 you really mean "Eclipse+JDT for Java".  Are there any other IDEs, for 
 any language, out there that you would deem acceptable?  Just curious.
 
 --bb
There's Eclipse+CDT, like Yigal mentioned. Although I haven't used it or examined it in-depth recently, I think it has advanced a lot in the last few years, and is on par with, if not better than, VS. Configuring a compiler might not be as easy as in VS, since CDT doesn't come bundled with one, but on semantic features (code completion, open/find references, refactoring) it seems to be much better than VS is. Dunno about debugging. IntelliJ is also pretty good, but it's a paid IDE.

But really, for the point I was making (productivity of simple tools vs. IDEs), it still applies to many other IDEs, like VS, KDev, etc. I wasn't thinking of Eclipse alone.

--
Bruno Medeiros - Software Developer, MSc. in CS/E graduate
http://www.prowiki.org/wiki4d/wiki.cgi?BrunoMedeiros#D
Jul 30 2008
prev sibling parent Jussi Jumppanen <jussij zeusedit.com> writes:
Robert Fraser Wrote:

 It might take 5 seconds if you click "go to definition" and 
 it has to open a new file, but that's vs 2 minutes of searching 
 for an import, finding the file location, and using find to get 
 to the definition in that file.
When you're accustomed to load times of less than 1 second, 5 seconds can feel like an eternity.
 If you've been using ed to write code for the last 30 years, the 
 mental concept of using your $2000 computer to its full potential 
Ed was not the text editor I was referring to.
 to help you write software is mind-boggling. 
If I had been referring to Ed or Notepad then I would agree with you.
 just can't deal with a 1-minute startup time for a product you're 
 going to be using for 8 hours, well all the more power to ya; no 
 amount of productivity gains could make you willing to switch.
You've hit the nail right on the head. When you're expecting sub-second response times, having to put up with delays of seconds or minutes is rather off-putting, to the point of being counterproductive.
 I'm not saying "more complex is always better," but why let all 
 that processing power go to waste?
But all that power is not going to waste. All that processing power lets the computer respond at an amazingly fast speed. It responds so fast it feels like it is not even there.
Jul 28 2008
prev sibling parent Ary Borenszweig <ary esperanto.org.ar> writes:
Jussi Jumppanen wrote:
 Bruno Medeiros Wrote:
 
 Before some people here say they don't use an IDE, but 
 instead use <editor foo with syntax highlighting and 
 little more than that> and are fine with it, 
I would say that the reason developers still prefer to code with text editors rather than IDEs is that they find the text editor more productive. Eclipse-based IDEs are just far too slow for a good developer's fingers.
In Eclipse there's a time delay you can configure before autocompletion proposals appear (by default 200ms). That means that if you are faster than that delay (your claim), the IDE won't help you. But if you do wait a little, probably because you don't know what all the possible autocompletions are, then it will show the popup with suggestions. I can't see how that is worse than not having autocompletion at all, plus not having go-to-definition or semantic highlighting.
Jul 28 2008
prev sibling parent reply Walter Bright <newshound1 digitalmars.com> writes:
Here are some horrid examples from my own code which, to please the 
client, had to compile with all warnings on for MSVC:

---
   p = NULL;  // suppress spurious warning
---
   b = NULL;  // Needed for the b->Put() below to shutup a compiler 
use-without-init warning
---
   #if _MSC_VER
   // Disable useless warnings about unreferenced formal parameters
   #pragma warning (disable : 4100)
   #endif
---
   #define LOG 0       // 0: disable logging, 1: enable it

   #ifdef _MSC_VER
   #pragma warning(disable: 4127)      // Caused by if (LOG)
   #endif // _MSC_VER
---

Note the uglification this makes for code by forcing useless statements 
to be added. If I hadn't put in the comments (and comments are often 
omitted) these things would be a mystery.
Jul 09 2008
next sibling parent reply "Koroskin Denis" <2korden gmail.com> writes:
On Wed, 09 Jul 2008 12:49:34 +0400, Walter Bright  
<newshound1 digitalmars.com> wrote:

 Here are some horrid examples from my own code which, to please the  
 client, had to compile with all warnings on for MSVC:

 ---
    p = NULL;  // suppress spurious warning
 ---
    b = NULL;  // Needed for the b->Put() below to shutup a compiler  
 use-without-init warning
 ---
    #if _MSC_VER
    // Disable useless warnings about unreferenced formal parameters
    #pragma warning (disable : 4100)
    #endif
 ---
    #define LOG 0       // 0: disable logging, 1: enable it

    #ifdef _MSC_VER
    #pragma warning(disable: 4127)      // Caused by if (LOG)
    #endif // _MSC_VER
 ---

 Note the uglification this makes for code by forcing useless statements  
 to be added. If I hadn't put in the comments (and comments are often  
 omitted) these things would be a mystery.
We don't have problems with most of these in D, since there are no C-style macros and no uninitialized variables.

Moreover, I would be happy to have an `unused` modifier in addition to in, out and inout (doh!) to denote that a variable is not going to be used. In this case the compiler would show an error if the variable were used by chance. It could help the programmer catch potential bugs at an early stage once he eventually starts using it. Besides, it really fits well into D, IMO:

void bar( unused int foo ) // no warning is generated
{
}
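In the meantime, something close can be faked in a library; a rough sketch (the markUnused name is invented for illustration):

// A no-op helper: touching the parameter once documents the intent and
// would satisfy a hypothetical unused-variable check.
void markUnused(T)(auto ref T) {}

void bar(int timeout)
{
    markUnused(timeout); // deliberate: timeout kept for interface compatibility
}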
Jul 09 2008
next sibling parent reply bearophile <bearophileHUGS lycos.com> writes:
Koroskin Denis:
 Moreover, I would be happy to have an `unused` modifier in addition to in,  
 out and inout (doh!) to denote that a variable is not going to be used. In  
 this case compiler will show an error if the variable is used by chance.  
 It could help programmer to catch potential bugs at early stage once he  
 eventually start using it. Besides, it really fits well into D, IMO:
 void bar( unused int foo ) // no warning is generated
 {
 }
Can you explain to me in what practical situation(s) this can be useful? Bye, bearophile
Jul 09 2008
parent reply "Koroskin Denis" <2korden gmail.com> writes:
On Wed, 09 Jul 2008 14:05:21 +0400, bearophile <bearophileHUGS lycos.com>  
wrote:

 Koroskin Denis:
 Moreover, I would be happy to have an `unused` modifier in addition to  
 in,
 out and inout (doh!) to denote that a variable is not going to be used.  
 In
 this case compiler will show an error if the variable is used by chance.
 It could help programmer to catch potential bugs at early stage once he
 eventually start using it. Besides, it really fits well into D, IMO:
 void bar( unused int foo ) // no warning is generated
 {
 }
Can you explain to me in what practical situation(s) this can be useful? Bye, bearophile
It is most useful if a warning is generated when a variable is unused:

class Connection
{
    void connect(int timeout)
    {
        // do something
    }
}

class SomeOtherConnectionType : Connection
{
    void connect(unused int timeout)
    {
        // this type of connection is immediate, so there is no need for a timeout;
        // but since we don't use timeout, just mark it unused
        // do the connection
    }
}

And then you realize that due to some specific changes this type of connection is no longer immediate, so now you are going to take the timeout into account. And you see an unused modifier that says to you: "Man, this variable was not used before; go check your code to see whether there are some cases where you passed some dummy value to this function just to satisfy the compiler, like this:

auto connection = new SomeOtherConnectionType();
connection.connect(0); // I don't care, since it is immediate anyway

and then refactor your code".
Jul 09 2008
parent reply "Manfred_Nowak" <svv1999 hotmail.com> writes:
Koroskin Denis wrote:


     void connect(unused int timeout)
[...]
 connection.connect(0); // I don't care, since it is immediate
 anyway 
[...]
 and then refactor your code
In which way is this type of coding better than preparing for overloading `connect':

void connect()
{
    // ...
}

void connect(int timeout)
{
    // ...
    this.connect();
}

and then calling

connection.connect(); // immediate connection

In addition: why is it good to be forced to refactor?

-manfred

--
Maybe some knowledge of some types of disagreeing and their relation
can turn out to be useful:
http://blog.createdebate.com/2008/04/07/writing-strong-arguments/
Jul 09 2008
parent reply "Koroskin Denis" <2korden gmail.com> writes:
On Wed, 09 Jul 2008 15:26:54 +0400, Manfred_Nowak <svv1999 hotmail.com>  
wrote:

 Koroskin Denis wrote:


     void connect(unused int timeout)
[...]
 connection.connect(0); // I don't care, since it is immediate
 anyway
[...]
 and then refactor your code
In which way is this type of coding better than preparing for overloading `connect': void connect(){ // ... } void connect( int timeout){ // ... this.connect(); } and then calling > connection.connect(); // immediate connection In addition: why is it good to be forced to refactor? -manfred
You asked for an example, I provided one. There is another one:

class Node
{
    private Node parent;

    Node getRoot()
    {
        Node p = parent;
        while (parent !is null)
        {
            parent = parent.parent; // oops: meant to advance `p`
        }
        return parent;
    }
}

Actually, getRoot() isn't supposed to modify this.parent, but it does by accident (say nothing about const, please!). In my code, I was going to modify the local variable, not the member. The local variable p was defined and has a value assigned, but it's _not_ used. The compiler could warn me that I don't use it, and that would help to detect the problem.
Jul 09 2008
parent reply "Manfred_Nowak" <svv1999 hotmail.com> writes:
Koroskin Denis wrote:

 You asked an example, I provided one. There is another one:
[...] Bearophile asked for a _practical_ example. But your example seems to illustrate consequences rooted in a coding style and not rooted in the absence of an `unused' keyword and its semantics.
 I was going to modify local variable, but not a member. 
This is a well-known phenomenon. But again, no need for an `unused' keyword shows up. To the contrary: within the function you want to use both variables, although one of them only for reading. Pollution of the namespace within the function causes the problem. But would you really want to write import statements for variables from surrounding scopes?
 Compiler could warn me that I don't use it
This claim comes up once in a while, but seems to be unprovable in general. It might be provable in your special case though. But without the general proof one may have both:

- many false warnings
- many true bugs without warnings

Do you have a proof for the general case?

-manfred

--
Maybe some knowledge of some types of disagreeing and their relation
can turn out to be useful:
http://blog.createdebate.com/2008/04/07/writing-strong-arguments/
Jul 09 2008
parent reply "Nick Sabalausky" <a a.a> writes:
"Manfred_Nowak" <svv1999 hotmail.com> wrote in message 
news:g52faq$2s3g$1 digitalmars.com...
 Koroskin Denis wrote:

 You asked an example, I provided one. There is another one:
[...] Bearophile asked for a _practical_ example. But your example seems to illustrate consequences rooted in a coding style and not rooted in the absence of an `unused' keyword and its semantics.
 I was going to modify local variable, but not a member.
This is a well known phenomenon. But again no need for an `unused' keyword shows up. To the contrary: within the function you want to use both variables, although one of them only for reading. Pollution of the namspace within the function causes the problem. But would you really want to write import statements for variables from surrounding scopes?
Can you prove that namespace pollution is the root cause of "unintentionally-unused variable" errors in the general case?
 Compiler could warn me that I don't use it
This claim comes up once in a while, but seems to be unprovable in general. It might be provable in your special case though. But without the general proof one may have both: - many false warnings - many true bugs without warnings Do you have a proof for the general case?
Do you have a general-case proof that an "unused variable" warning would cause too many false warnings/etc.? Would the proof still hold with the proposed "unused" keyword (or some functionally-equivalent alternative)?
Jul 09 2008
parent "Manfred_Nowak" <svv1999 hotmail.com> writes:
Nick Sabalausky wrote:

 Can you prove
[...]
 Do you have a general-case proof
No---and in addition no one is obliged to have a counter proof for any claim, especially not if the claim is not formalized.

I have a counter hint only: D, as an intended systems programming language, has `cast'- and `asm'- statements as well as pointers available. With these a clever coder might be able to access every data storage location accessible to the program, regardless of the protection status announced by the source.

-manfred

--
Maybe some knowledge of some types of disagreeing and their relation
can turn out to be useful:
http://blog.createdebate.com/2008/04/07/writing-strong-arguments/
Jul 09 2008
prev sibling parent reply Matti Niemenmaa <see_signature for.real.address> writes:
Koroskin Denis wrote:
 Moreover, I would be happy to have an `unused` modifier in addition to 
 in, out and inout (doh!) to denote that a variable is not going to be 
 used. In this case compiler will show an error if the variable is used 
 by chance. It could help programmer to catch potential bugs at early 
 stage once he eventually start using it. Besides, it really fits well 
 into D, IMO:
 
 void bar( unused int foo ) // no warning is generated
 {
 }
Just do:

void bar(int) {}

I.e. don't name the variable. And you will get an error if you try to use it regardless, as you might expect. <g>

--
E-mail address: matti.niemenmaa+news, domain is iki (DOT) fi
Jul 09 2008
parent "Manfred_Nowak" <svv1999 hotmail.com> writes:
Matti Niemenmaa wrote:

 you will get an error if you try to use it
This is only true if the name he tries to use isn't declared in any visible scope.
Jul 09 2008
prev sibling parent reply "Bruce Adams" <tortoise_74 yeah.who.co.uk> writes:
On Wed, 09 Jul 2008 09:49:34 +0100, Walter Bright 
<newshound1 digitalmars.com> wrote:

 Here are some horrid examples from my own code which, to please the 
 client, had to compile with all warnings on for MSVC:

 ---
    p = NULL;  // suppress spurious warning
 ---
    b = NULL;  // Needed for the b->Put() below to shutup a compiler 
 use-without-init warning
 ---
    #if _MSC_VER
    // Disable useless warnings about unreferenced formal parameters
    #pragma warning (disable : 4100)
    #endif
 ---
    #define LOG 0       // 0: disable logging, 1: enable it

    #ifdef _MSC_VER
    #pragma warning(disable: 4127)      // Caused by if (LOG)
    #endif // _MSC_VER
 ---

 Note the uglification this makes for code by forcing useless statements 
 to be added. If I hadn't put in the comments (and comments are often 
 omitted) these things would be a mystery.
I would contend this is a problem with the quality of headers provided by M$. Library code has a greater need to be high quality than regular code. Operating system APIs even more so.

Removing warnings from C/C++ headers requires you to write them carefully, to remove the ambiguity that leads to the warning. That is, this definition of quality is a measure that increases with decreasing semantic ambiguity.

Asking users of your library code to disable warnings with a #pragma is laziness that a big monopoly like M$ can get away with. Then people wrongly start to think it's okay because the big monopoly does it.

Regards,

Bruce.
Jul 09 2008
parent reply Don <nospam nospam.com.au> writes:
Bruce Adams wrote:
 On Wed, 09 Jul 2008 09:49:34 +0100, Walter Bright 
 <newshound1 digitalmars.com> wrote:
 
 Here are some horrid examples from my own code which, to please the 
 client, had to compile with all warnings on for MSVC:

 ---
    p = NULL;  // suppress spurious warning
 ---
    b = NULL;  // Needed for the b->Put() below to shutup a compiler 
 use-without-init warning
 ---
    #if _MSC_VER
    // Disable useless warnings about unreferenced formal parameters
    #pragma warning (disable : 4100)
    #endif
 ---
    #define LOG 0       // 0: disable logging, 1: enable it

    #ifdef _MSC_VER
    #pragma warning(disable: 4127)      // Caused by if (LOG)
    #endif // _MSC_VER
 ---

 Note the uglification this makes for code by forcing useless 
 statements to be added. If I hadn't put in the comments (and comments 
 are often omitted) these things would be a mystery.
I would contend this is a problem with the quality of headers provided by M$. Library code has a greater need to be high quality than regular code. Operating system APIs even more so. Removing warnings from C/C++ headers requires you to write them carefully to remove the ambiguity that leads to the warning. That is, this definition of quality is a measure that increases with decreasing semantic ambiguity.
I think it's a complete fallacy to think that lower-number-of-warnings is proportional to better-code-quality. Once a warning is sufficiently spurious (eg, it has a <1% chance of indicating an error), it's more likely that you'll introduce an error in getting rid of the warning.

In C++, error-free code is clearly defined in the spec. But warning-free code is not in the spec. You're at the mercy of any compiler writer who decides to put in some poorly thought out, idiotic warning. If you insist on avoiding all warnings, you're effectively coding to a language spec which one individual has carelessly made up on a whim. For example, VC6 generates some utterly ridiculous warnings. In some cases, the chance of it being a bug is not small, it is ZERO.

In DMD, the signed/unsigned mismatch warning is almost always spurious. Getting rid of it reduces code quality. For illustration, this is the classic shape of code that trips such a check (a contrived example; whether a given compiler flags exactly this case is beside the point):
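// A loop that is perfectly safe for any realistically sized array,
// yet compares a signed index against the unsigned .length property.
int firstNegative(int[] arr)
{
    for (int i = 0; i < arr.length; i++) // signed/unsigned comparison
    {
        if (arr[i] < 0)
            return i;
    }
    return -1;
}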
Jul 10 2008
parent reply "Bruce Adams" <tortoise_74 yeah.who.co.uk> writes:
On Thu, 10 Jul 2008 11:20:21 +0100, Don <nospam nospam.com.au> wrote:

 Bruce Adams wrote:
  I would contend this is a problem with the quality of headers provided  
 by M$.
 Library code has a greater need to be high quality than regular code.
 Operating system APIs even more so.
 Removing warnings from C/C++ headers requires you to write them  
 carefully to
 remove the ambiguity that leads to the warning. That is, this  
 definition of
 quality is a measure that increases with decreasing semantic ambiguity.
I think it's a complete fallacy to think that lower-number-of-warnings is proportional to better-code-quality. Once a warning is so spurious (eg, so that it has a <1% chance of being an error), it's more likely that you'll introduce an error in getting rid of the warning. In C++, error-free code is clearly defined in the spec. But warning-free code is not in the spec. You're at the mercy of any compiler writer who decides to put in some poorly thought out, idiotic warning.
I didn't say that *overall* quality is related to lower warnings, but it is a factor. There are other factors that are typically more significant. Still, given the choice between code with some warnings and warning-free code, all other things being equal, I would pick the warning-free code. You obviously shift your quality measure towards that aspect of readability. Personally I think the impact on readability is minimal.
 If you insist on avoiding all warnings, you're effectively using the  
 programming language spec which one individual has carelessly made on a  
 whim.
While some warnings are less useful than others, I don't think it's fair in general to say they're introduced carelessly on a whim.
 For example, VC6 generates some utterly ridiculous warnings. In some  
 cases, the chance of it being a bug is not small, it is ZERO.
Before they got Herb Sutter on board, VC++ was notoriously bad. If that's true then it would be a compiler bug. If you know it to be true, you can disable the warning with a pragma. Similarly, in gcc all warnings are supposed to have an on/off switch. So you get to choose which warnings you think are important. I am well aware that some people choose to ignore all warnings in order to code faster. In general it's a false economy, like not writing unit tests.
 In DMD, the signed/unsigned mismatch warning is almost always spurious.  
 Getting rid of it reduces code quality.
I have encountered quite a few bugs (in C++) relating to unsigned/signed mismatches. It's a very subtle and hard-to-spot problem when a simple addition suddenly changes the sign of your result. The classic shape of it, as a contrived D example:
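import std.stdio : writeln;

void main()
{
    uint length = 10;
    int offset = -20;

    // The usual arithmetic conversions promote the sum to uint, so the
    // "negative" result silently becomes a huge positive number.
    auto result = length + offset;
    writeln(result); // prints 4294967286, not -10
}

It costs an ugly cast to remove the warning, but that is a trade I'm prepared to make to never have to worry about such bugs.

Regards,

Bruce.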
Jul 10 2008
parent reply Don <nospam nospam.com.au> writes:
Bruce Adams wrote:
 On Thu, 10 Jul 2008 11:20:21 +0100, Don <nospam nospam.com.au> wrote:
 
 Bruce Adams wrote:
  I would contend this is a problem with the quality of headers 
 provided by M$.
 Library code has a greater need to be high quality than regular code.
 Operating system APIs even more so.
 Removing warnings from C/C++ headers requires you to write them 
 carefully to
 remove the ambiguity that leads to the warning. That is, this 
 definition of
 quality is a measure that increases with decreasing semantic ambiguity.
I think it's a complete fallacy to think that lower-number-of-warnings is proportional to better-code-quality. Once a warning is so spurious (eg, so that it has a <1% chance of being an error), it's more likely that you'll introduce an error in getting rid of the warning. In C++, error-free code is clearly defined in the spec. But warning-free code is not in the spec. You're at the mercy of any compiler writer who decides to put in some poorly thought out, idiotic warning.
I didn't say that *overall* quality is related to lower warnings but it is a factor. There are other factors that are typically more significant. Still given the choice between code with some warnings and warning free code all other things being equal I would pick the warning free code. You obviously shift your quality measure towards that aspect of readability. Personally I think the impact on readability is minimal.
I don't care about readability so much as correctness. My point is that sometimes making code warning-free makes it WORSE. It depends on the quality of the warning. Some are hugely beneficial.
 
 If you insist on avoiding all warnings, you're effectively using the 
 programming language spec which one individual has carelessly made on 
 a whim.
While some warnings are less useful than others I don't think its fair in general to say they're introduced carelessly on a whim.
The VC6 ones certainly seemed to be. Generation of a warning by a compiler is something that deserves almost as much care as a language specification. In the C++ world I really haven't seen evidence that much care is taken.
 For example, VC6 generates some utterly ridiculous warnings. In some 
 cases, the chance of it being a bug is not small, it is ZERO.
Before they got Herb Sutter on board VC++ was notoriously bad. If that's true then it would be a compiler bug. If you know it to be true you can disable the warning with a pragma. Similarly in gcc all warnings are supposed to have an on/off switch. So you get to choose which warnings you think are important. I am well aware that some people choose to ignore all warnings in order to code faster. In general its a false economy like not writing unit-tests.
That's totally different. The issue is not that "generating warning-free code is more work". Rather, the problem is when there is no reasonable way to make the warning go away that doesn't involve writing incorrect code.
 In DMD, the signed/unsigned mismatch warning is almost always 
 spurious. Getting rid of it reduces code quality.
I have encountered quite a few bugs (in C++) relating to unsigned/signed mismatches. Its a very subtle and hard to spot problem when a simple addition suddenly changes the sign of your result. It costs a ugly cast to remove the warning but that is a trade I'm prepared to make to never have to worry about such bugs.
(1) Casts can hide far more serious errors. You want to say, "I know this is a signed-unsigned mismatch, but I know it is ok in this instance". But what you are actually saying is, "Please change the type of this variable. No matter what it is, turn it into an int".

(2) In DMD, the signed/unsigned error (as it exists today) really is garbage. I've had to introduce incorrect code (via casts) into both Tango and Phobos in order to satisfy it.

To illustrate (1), a made-up example of how such a cast bites later, once a type changes out from under it:
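import std.stdio : writeln;

// Was `uint fileSize()` when the cast below was written; it has since
// grown to `long` to support files over 4GB. (Invented example.)
long fileSize() { return 5_000_000_000; }

void main()
{
    // The cast, added long ago to silence a mismatch warning, now
    // silently truncates instead of flagging the type change.
    uint size = cast(uint) fileSize();
    writeln(size); // prints 705032704, not 5000000000
}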
Jul 11 2008
parent reply Walter Bright <newshound1 digitalmars.com> writes:
Don wrote:
 (2) In DMD, the signed/unsigned error (as it exists today) really is 
 garbage. I've had to introduce incorrect code (via casts) into both 
 Tango and Phobos in order to satisfy it.
I agree that looks pretty damning.
Jul 11 2008
parent Don <nospam nospam.com.au> writes:
Walter Bright wrote:
 Don wrote:
 (2) In DMD, the signed/unsigned error (as it exists today) really is 
 garbage. I've had to introduce incorrect code (via casts) into both 
 Tango and Phobos in order to satisfy it.
I agree that looks pretty damning.
It's bugzilla 1257.
Jul 14 2008
prev sibling parent reply "Manfred_Nowak" <svv1999 hotmail.com> writes:
Nick Sabalausky wrote:

 Like I've said, compiler warnings are essentially a built-in lint 
 tool. 
Finally the contradiction seems to show up: if lint is a tool in its own right, then one must have strong arguments to incorporate it into any other tool.

In the parallel posting Walter remarks, but does not emphasize, that compilers have more goals than enabling the evaluation of sources, on which your OP concentrates. Building large software systems and migrating some application source to another architecture are in the duties of compilers. For at least huge parts of these latter tasks, a reevaluation of some static aspects of the semantics of the application is useless but time consuming. In addition, and by definition, lint tools are not capable of doing more than this.

This is the central point: lint tools are only capable of informing about possible bugs. If a warning emitted by a lint tool were a sure hint of a bug in the program, then the compiler should have emitted an error. Thus there should be no need to incorporate any lint tool into any compiler. I am ready to read your counter arguments.

-manfred

--
Maybe some knowledge of some types of disagreeing and their relation
can turn out to be useful:
http://blog.createdebate.com/2008/04/07/writing-strong-arguments/
Jul 09 2008
parent reply "Nick Sabalausky" <a a.a> writes:
"Manfred_Nowak" <svv1999 hotmail.com> wrote in message 
news:g51tmn$1kb3$1 digitalmars.com...
 Nick Sabalausky wrote:

 Like I've said, compiler warnings are essentially a built-in lint
 tool.
Finally the contradiction seems to show up: if lint is a tool in its own right, then one must have strong arguments to incorporate it into any other tool.
1. There is no sufficient D lint tool either currently in existence or on the foreseeable horizon (at least as far as I'm aware).

2. The compiler is already in a position to provide such diagnostics (and in fact, already does for certain other conditions).
 In the paralell posting Walter remarks but not emphasizes that
 compilers have more goals, than enabling the evaluation of sources, on
 which your OP concentrates. Building large software systems and
 migrating some application source to another architecture are in the
 duties of compilers.

 For at least huge parts of these latter tasks a reevaluation of some
 static aspects of semantics of the application is useless but time
 consuming.
Hence, optional.
 In addition and by definition lint tools are not capable of
 doing more than this.
Which is part of what makes a compiler inherently more general-purpose, and a lint tool a mere symptom of a compiler's shortcomings.
 This is the central point: lint tools are only capable of informing
 about possible bugs. If a warning emitted by a lint tool would be a
 sure hint for a bug in the program, then the compiler should have
 emitted an error.
Warnings are never sure hints about a bug in the program either. That's what makes them warnings.
 Thus there should be no need to incorporate any lint tool into any
 compiler. I am ready to read your counter arguments.
I could turn that around and say that with a sufficient lint tool incorporated into the compiler (activated optionally, of course), there would be no need for an external lint tool.

Plus, an external lint tool is, by necessity, going to incorporate a lot of duplicated functionality from the compiler (roughly the whole front-end). Although I suppose that could be moved into a shared library to avoid duplicated maintenance efforts.

But since you mentioned that having lint work being done in the compiler would be uselessly time consuming (again, uselessly time consuming only if there's no switch to turn such checking on/off; also, I assume you're referring to the speed of compiling), then I should point out that with an external lint tool, you're likely to have plenty of duplicated processing going on (lexing and parsing once for the external lint, and again for the real compiler).
Jul 09 2008
next sibling parent reply Walter Bright <newshound1 digitalmars.com> writes:
Nick Sabalausky wrote:
 Warings are never sure hints about a bug in the program either. That's what 
 makes them warnings.
Not always. Sometimes they are the result of an inability to change the language specification (because it's a standard). Common C++ compiler warnings are indications of bugs in the standard that can't be fixed.
Jul 09 2008
parent reply "Nick Sabalausky" <a a.a> writes:
"Walter Bright" <newshound1 digitalmars.com> wrote in message 
news:g530ol$18th$2 digitalmars.com...
 Nick Sabalausky wrote:
 Warings are never sure hints about a bug in the program either. That's 
 what makes them warnings.
Not always. Sometimes they are the result of an inability to change the language specification (because it's a standard). Common C++ compiler warnings are indications of bugs in the standard that can't be fixed.
Fair enough. But that doesn't really apply to the current state of D, though, does it?
Jul 09 2008
parent Walter Bright <newshound1 digitalmars.com> writes:
Nick Sabalausky wrote:
 Fair enough. But that doesn't really apply to the current state of D, 
 though, does it? 
No, except for 1.0. Many common C++ warnings were put into D as errors because the spec could be changed.
Jul 09 2008
prev sibling parent "Manfred_Nowak" <svv1999 hotmail.com> writes:
Nick Sabalausky wrote:

 1. There is no sufficient D lint tool either currently in
 existence or on the foreseeable horizon (at least as far as I'm
 aware). 
But this is at most a requirement for building a lint tool, not an argument for incorporating a lint tool into a compiler.
 2. The compiler is already in a position to provide such
 diagnostics (and in fact, already does for certain other
 conditions). 
This again is no argument for a lint tool in a compiler. It is at most an argument that, where some checking is already built into a compiler, one should be able to toggle its behaviour on or off.
 For at least huge parts of these latter tasks a reevaluation of
 some static aspects of semantics of the application is useless
 but time consuming.
Hence, optional.
But why optional? If one needs the code one needs no checking any more. If one needs the checking, one needs no code.
 In addition and by definition lint tools are not capable of
 doing more than this.
Which is part of what makes a compiler inherently more general-purpose, and a lint tool a mere symptom of a compiler's shortcomings.
This is based on the assumption that a compiler without lint functionality has shortcomings, which is still to be proven.
 Plus, an
 external lint tool is, by necessity, going to incorporate a lot of
 duplicated functionality from the compiler (roughly the whole
 front-end).
 Although I suppose that could be moved into a shared
 library to avoid duplicated maintenance efforts. But since you
 mentioned that having lint work being done in the compiler would 
 be uselessly time consuming (Again, uselessly time consuming only
 if there's no switch to turn such checking on/off. Also, I assume
 you're referring to the speed of compiling), then I should point
 out that with an external lint tool, you're likely to have plenty
 of duplicated processing going on (lexing and parsing once for the
 external lint, and again for the real compiler). 
This is an argument only for having an intermediate representation suitable for both the compiler and the lint tool.

Interestingly, IBM wants to sell the integration of a lint tool into the IDE:
http://www-306.ibm.com/software/rational/announce/swanalyzer/

-manfred

-- 
Maybe some knowledge of some types of disagreeing and their relation can turn out to be useful:
http://blog.createdebate.com/2008/04/07/writing-strong-arguments/
Aug 05 2008
prev sibling next sibling parent reply Don <nospam nospam.com.au> writes:
Walter Bright wrote:
 Nick Sabalausky wrote:
 I don't suppose there's any chance of DMD getting a warning for 
 variables/arguments that are declared but never accessed? Just today 
 alone there's been two bugs I spent 10-30 minutes going nuts trying to 
 track down that turned out to be variables I had intended to use but 
 forgot to. 
The problem with unused variable warnings is they are annoying when you're developing code in an iterative manner. They get in the way when you're commenting out sections of code to try and isolate a problem. They can be a problem when using "version" and "static if" statements.

So, why not just turn off the warnings? The problem with warnings is that if there are n warnings, there are essentially n factorial different versions of the language. If you're faced with compiling someone else's code (like you downloaded it off the internet and have to compile it because it only is distributed as source) and warnings go off, is that a bug in the code or not? What do you do?

Some shops have a "thou shall compile with warnings enabled, and there shall be no warning messages." That causes problems when you port the code to a different compiler with a different, even contradictory, notion of what is a warning. So then you wind up putting wacky things in the code just to get the compiler to shut up about the warnings. Those kind of things tend to interfere with the beauty of the code, and since they are not necessary to the program's logic, they tend to confuse and misdirect the maintenance programmer (why is this variable pointlessly referenced here? Why is this unreachable return statement here? Is this a bug?)
I agree. Unfortunately, there's a problem with the 'optional' warnings we have in DMD right now. They are not optional for libraries. If a library generates warnings, then it is not usable by anyone who wants to compile with warnings enabled. I'd love to see the warnings tightened up so that they can become bugs.
 There is a place for warnings, however. That is in a separate static 
 analysis tool (i.e. lint, coverity, etc.) which can be armed with all 
 kinds of heuristics with which to flag questionable constructs. I don't 
 think they should be part of the compiler, however.
Jul 07 2008
next sibling parent Don <nospam nospam.com.au> writes:
Don wrote:
 Walter Bright wrote:
 Nick Sabalausky wrote:
 I don't suppose there's any chance of DMD getting a warning for 
 variables/arguments that are declared but never accessed? Just today 
 alone there's been two bugs I spent 10-30 minutes going nuts trying 
 to track down that turned out to be variables I had intended to use 
 but forgot to. 
The problem with unused variable warnings is they are annoying when you're developing code in an iterative manner. They get in the way when you're commenting out sections of code to try and isolate a problem. They can be a problem when using "version" and "static if" statements.

So, why not just turn off the warnings? The problem with warnings is that if there are n warnings, there are essentially n factorial different versions of the language. If you're faced with compiling someone else's code (like you downloaded it off the internet and have to compile it because it only is distributed as source) and warnings go off, is that a bug in the code or not? What do you do?

Some shops have a "thou shall compile with warnings enabled, and there shall be no warning messages." That causes problems when you port the code to a different compiler with a different, even contradictory, notion of what is a warning. So then you wind up putting wacky things in the code just to get the compiler to shut up about the warnings. Those kind of things tend to interfere with the beauty of the code, and since they are not necessary to the program's logic, they tend to confuse and misdirect the maintenance programmer (why is this variable pointlessly referenced here? Why is this unreachable return statement here? Is this a bug?)
I agree. Unfortunately, there's a problem with the 'optional' warnings we have in DMD right now. They are not optional for libraries. If a library generates warnings, then it is not usable by anyone who wants to compile with warnings enabled. I'd love to see the warnings tightened up so that they can become bugs.
Of course, I mean 'errors' not bugs!
 
 There is a place for warnings, however. That is in a separate static 
 analysis tool (i.e. lint, coverity, etc.) which can be armed with all 
 kinds of heuristics with which to flag questionable constructs. I 
 don't think they should be part of the compiler, however.
Jul 07 2008
prev sibling parent reply "Bruce Adams" <tortoise_74 yeah.who.co.uk> writes:
On Mon, 07 Jul 2008 09:23:15 +0100, Don <nospam nospam.com.au> wrote:

 I agree. Unfortunately, there's a problem with the 'optional' warnings  
 we have in DMD right now. They are not optional for libraries. If a  
 library generates warnings, then it is not usable by anyone who wants to  
 compile with warnings enabled.

 I'd love to see the warnings tightened up so that they can become bugs.
That is arguably a flaw in the way D processes libraries, but it is the same flaw as in C++. If your include file has warnings, you have to suppress them when using the library.

Presumably if you use .di files (created/compiled with warnings switched off) this won't happen?

Regards,

Bruce.
Jul 09 2008
parent reply Don <nospam nospam.com.au> writes:
Bruce Adams wrote:
 On Mon, 07 Jul 2008 09:23:15 +0100, Don <nospam nospam.com.au> wrote:
 
 I agree. Unfortunately, there's a problem with the 'optional' warnings 
 we have in DMD right now. They are not optional for libraries. If a 
 library generates warnings, then it is not usable by anyone who wants 
 to compile with warnings enabled.

 I'd love to see the warnings tightened up so that they can become bugs.
That is arguably a flaw in the way D processes libraries, but it is the same flaw as in C++.
It's even worse in D, though, because with warnings switched on the library won't compile at all.
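For concreteness, a sketch of what goes wrong (hypothetical library code; "statement is not reachable" is, if I remember right, one of the few warnings DMD currently emits, and -w makes warnings fatal):

// mylib.d -- accepted by "dmd -c mylib.d", rejected by "dmd -w -c mylib.d"
int f(bool b)
{
    if (b)
        return 1;
    else
        return 0;
    return -1;  // warning: statement is not reachable
}

So a client who builds everything with -w cannot compile the library at all, even though the library's own author considers it finished.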
 If your include file has warnings, you have to suppress them when using
 the library.
And D has no way to do that.
 Presumably if you use .di files (created/compiled with warnings switched 
 off) this won't
 happen?
That's an interesting idea. That might well work.
 
 Regards,
 
 Bruce.
Jul 09 2008
parent "Bruce Adams" <tortoise_74 yeah.who.co.uk> writes:
On Wed, 09 Jul 2008 09:34:30 +0100, Don <nospam nospam.com.au> wrote:

 Bruce Adams wrote:
 On Mon, 07 Jul 2008 09:23:15 +0100, Don <nospam nospam.com.au> wrote:

 I agree. Unfortunately, there's a problem with the 'optional' warnings  
 we have in DMD right now. They are not optional for libraries. If a  
 library generates warnings, then it is not usable by anyone who wants  
 to compile with warnings enabled.

 I'd love to see the warnings tightened up so that they can become bugs.
That is arguably a flaw in the way D processes libraries, but it is the same flaw as in C++.
It's even worse in D, though, because with warnings switched on the library won't compile at all.
I was assuming you've already compiled the library with whatever options it needs and you are only parsing the D to import interfaces etc. for your code.
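Concretely, the workflow I have in mind would be something like this (a sketch; -c and -H being the DMD switches for compile-only and for emitting a .di interface file, if I have them right):

    dmd -c -H mylib.d       (library build, warnings off; emits mylib.di and mylib.obj/.o)
    dmd -w app.d mylib.obj  (client build; -w only ever sees app.d and mylib.di)

Since the generated .di contains only declarations, whatever produced warnings in the library's function bodies never reaches the client's compile.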
 If your include file has warnings, you have to suppress them when using
 the library.
And D has no way to do that.
 Presumably if you use .di files (created/compiled with warnings  
 switched off) this won't
 happen?
That's an interesting idea. That might well work.
Jul 09 2008
prev sibling next sibling parent reply JAnderson <ask me.com> writes:
Walter Bright wrote:
 Nick Sabalausky wrote:
 I don't suppose there's any chance of DMD getting a warning for 
 variables/arguments that are declared but never accessed? Just today 
 alone there's been two bugs I spent 10-30 minutes going nuts trying to 
 track down that turned out to be variables I had intended to use but 
 forgot to. 
So, why not just turn off the warnings? The problem with warnings is that if there are n warnings, there are essentially n factorial different versions of the language. <Snip>
 There is a place for warnings, however. That is in a separate static 
 analysis tool (i.e. lint, coverity, etc.) which can be armed with all 
 kinds of heuristics with which to flag questionable constructs. I don't 
 think they should be part of the compiler, however.
Something like lint can be run and have a consistent output on every D compiler and platform, since it doesn't care about building the actual platform-specific code. Having no warnings in the compiler means you can't have hardware/platform-specific warnings. And I think that's great.

Personally, in C++ I like to turn on warning level 4 with warnings as errors, and run both a GCC and an MSVC++ compile (when working on multiple platforms). Most warnings can be removed without use of pragma, and using both compilers gives much better coverage of warnings.

I'm personally a fan of catching things early. The more warnings as errors the better. If I have to suffer a little for false positives, *shrug*; it's much better than spending hours in a mud pit full of crocodiles, that is, debugging.

-Joel
Jul 09 2008
parent reply "Manfred_Nowak" <svv1999 hotmail.com> writes:
JAnderson wrote:

 The more warnings as errors the better.  If I have to suffer a
 little for false positives *shrug*
What do you understand by "a little"?

Please look at the example from http://www.digitalmars.com/webnews/newsgroups.php?art_group=digitalmars.D&article_id=73441

Do you recognize how many warnings a lint tool might emit on that code? Would you admit then, that a paranoid lint would be quite useless, even if it detects that the variable `p' should be accessed? Would you admit that you yourself are unable to decide whether the presence of some access statements to `p' should suppress the warning?

My understanding of lint tools is that they incorporate a collection of programming patterns together with a fuzzy recognition algorithm. If there are enough hits for a specific pattern, but it is still only partially implemented, then warnings are generated. Under this, the primary question is: what is so special about the collection of programming patterns that they can be formalized into a lint tool but not be used as paradigms in the source language?

-manfred

-- 
Maybe some knowledge of some types of disagreeing and their relation can turn out to be useful:
http://blog.createdebate.com/2008/04/07/writing-strong-arguments/
Jul 10 2008
next sibling parent reply JAnderson <ask me.com> writes:
Manfred_Nowak wrote:
 JAnderson wrote:
 
 The more warnings as errors the better.  If I have to suffer a
 little for false positives *shrug*
What do you understand by "a little"?
I don't understand what you're asking. I meant that if I have to fix it because the compiler tells me it's an error, then so be it. It's a little pain for a lot of gain.
 
 Please look at the example from 
 http://www.digitalmars.com/webnews/newsgroups.php?art_group=digitalmars.D&article_id=73441
 
 Do you recognize how many warnings a lint tool might emit on that code?
 Would you admit then, that a paranoic lint would be quite useless, even 
 if it detects that the variable `p' should be accessed?
I don't understand. With lint, it just gives you hints about what could be wrong. You pick and choose what to fix.
 Would you 
 admit, that you yourself are unable to decide whether the presence of 
 some access statements to `p' should suppress the warning?
Since all my warnings are errors, it would be an error too. If you really want to use an uninitialized variable there should be a workaround, but it should be harder to do.
 My understanding of lint tools is, that they incorporate a collection 
 of programming patterns together with a fuzzy recognition algorithm. If 
 there are enough hits for a specific pattern, but it is still only 
 partial implemented, then warnings are generated. Under this the 
 primary question is: what is so special to the collection of 
 programming patterns that they can be formalized into a lint tool but 
 not be used as paradigms in the source language?
For me, anything that isn't really an error (and I think a lot more of C++'s warnings should be errors). This means the lint effort can be separate. It means they can continually add and remove checks while the compiler is worked on as a separate effort. Things like unused variables might be a candidate; however, being the pedantic coder that I am, I prefer them as errors as well. I simply don't add an identifier, or I semicolon the value, when I'm writing stubs.
 
 -manfred  
 
Jul 10 2008
parent JAnderson <ask me.com> writes:
JAnderson wrote:
 Manfred_Nowak wrote:
 JAnderson wrote:

 The more warnings as errors the better.  If I have to suffer a
 little for false positives *shrug*
What do you understand by "a little"?
I don't understand what you're asking. I meant that if I have to fix it because the compiler tells me it's an error, then so be it. It's a little pain for a lot of gain.
 Please look at the example from 
 http://www.digitalmars.com/webnews/newsgroups.php?art_group=digitalmars.D&article_id=73441
 Do you recognize how many warnings a lint tool might emit on that code?
 Would you admit then, that a paranoic lint would be quite useless, 
 even if it detects that the variable `p' should be accessed?
I don't understand. With lint, it just gives you hints about what could be wrong. You pick and choose what to fix.
 Would you admit, that you yourself are unable to decide whether the 
 presence of some access statements to `p' should suppress the warning?
Since all my warnings are errors, it would be an error too. If you really want to use an uninitialized variable there should be a workaround, but it should be harder to do.
My mistake, I was thinking of "used uninitialized variable", not "variable not used". And I still prefer errors for these.
 
 My understanding of lint tools is, that they incorporate a collection 
 of programming patterns together with a fuzzy recognition algorithm. 
 If there are enough hits for a specific pattern, but it is still only 
 partial implemented, then warnings are generated. Under this the 
 primary question is: what is so special to the collection of 
 programming patterns that they can be formalized into a lint tool but 
 not be used as paradigms in the source language?
For me, anything that isn't really an error (and I think a lot more of C++'s warnings should be errors). This means the lint effort can be separate. It means they can continually add and remove checks while the compiler is worked on as a separate effort. Things like unused variables might be a candidate; however, being the pedantic coder that I am, I prefer them as errors as well. I simply don't add an identifier, or I semicolon the value, when I'm writing stubs.
 -manfred 
Jul 10 2008
prev sibling parent reply "Bruce Adams" <tortoise_74 yeah.who.co.uk> writes:
On Thu, 10 Jul 2008 10:01:25 +0100, Manfred_Nowak <svv1999 hotmail.com> wrote:

 JAnderson wrote:

 The more warnings as errors the better.  If I have to suffer a
 little for false positives *shrug*
What do you understand by "a little"?

Please look at the example from http://www.digitalmars.com/webnews/newsgroups.php?art_group=digitalmars.D&article_id=73441

Do you recognize how many warnings a lint tool might emit on that code? Would you admit then, that a paranoid lint would be quite useless, even if it detects that the variable `p' should be accessed? Would you admit that you yourself are unable to decide whether the presence of some access statements to `p' should suppress the warning?
Generally there are two types of code: code for which warnings are allowed, and warning-free code. Transitioning from code that's been allowed to have warnings for a long time to warning-free code takes effort. I still think the long-term benefits make it worth it.
My understanding of lint tools is that they incorporate a collection of programming patterns together with a fuzzy recognition algorithm. If there are enough hits for a specific pattern, but it is still only partially implemented, then warnings are generated. Under this, the primary question is: what is so special about the collection of programming patterns that they can be formalized into a lint tool but not be used as paradigms in the source language?

 -manfred
The reason is that the language specification is low level. It tells you how you are allowed to put bricks together. Static analysis tools work at a much higher level. They say things like: this is a load-bearing wall, putting a door here without a lintel (pardon the pun) is unwise.

Semantic checks rely on trying to work out what your code is trying to do, not just following the steps needed to execute it (with certain exceptions).

Regards,

Bruce.
Jul 10 2008
parent Markus Koskimies <markus reaaliaika.net> writes:
On Thu, 10 Jul 2008 20:53:53 +0100, Bruce Adams wrote:

 Generally there are two types of code. Code for which warnings are
 allowed and warning free code. Transitioning from code that's been
 allowed to have warnings
 for a long time to warning free code takes effort. I still think the
 long term benefits make it worth it.
I so totally agree with this! "me too"...
Jul 10 2008
prev sibling parent reply Bruno Medeiros <brunodomedeiros+spam com.gmail> writes:
Walter Bright wrote:
 Nick Sabalausky wrote:
 I don't suppose there's any chance of DMD getting a warning for 
 variables/arguments that are declared but never accessed? Just today 
 alone there's been two bugs I spent 10-30 minutes going nuts trying to 
 track down that turned out to be variables I had intended to use but 
 forgot to. 
The problem with unused variable warnings is they are annoying when you're developing code in an iterative manner. They get in the way when you're commenting out sections of code to try and isolate a problem. They can be a problem when using "version" and "static if" statements.

So, why not just turn off the warnings? The problem with warnings is that if there are n warnings, there are essentially n factorial different versions of the language. If you're faced with compiling someone else's code (like you downloaded it off the internet and have to compile it because it only is distributed as source) and warnings go off, is that a bug in the code or not? What do you do?
You have a distorted notion of warnings. Warnings are not errors (and by corollary are not "optional errors" either). They are simply messages which indicate some "strange" situations in code, which suggest some attention from the developer (now or in the future). That's why other compilers have an *option* such as "treat-warnings-as-errors". If they were errors, they wouldn't need that option, cause they would already be treated as errors (cause they would *be* errors...), lol. :(

You (Walter) and other people may be inclined to disagree, especially if you are heavily biased towards C++, where warnings, like you said, have been used for *things that should have been errors*, and have created this whole messed up and confused situation, and scenarios where people think C++ code should compile without errors, etc., etc. But that is just a scenario arising from C++ fucked-up-ness. If you (and others) still don't agree, which you probably won't, then let's not argue semantics, and just call this notion of warnings that I defined before "cautions".

With this in mind, what's wrong with the compiler generating messages (and just messages, not errors) for certain suspicious code situations, such as unused variables? Just that, what do you say?

-- 
Bruno Medeiros - Software Developer, MSc. in CS/E graduate
http://www.prowiki.org/wiki4d/wiki.cgi?BrunoMedeiros#D
Jul 27 2008
parent reply "Koroskin Denis" <2korden gmail.com> writes:
On Sun, 27 Jul 2008 17:56:01 +0400, Bruno Medeiros  
<brunodomedeiros+spam com.gmail> wrote:

 Walter Bright wrote:
 Nick Sabalausky wrote:
 I don't suppose there's any chance of DMD getting a warning for  
 variables/arguments that are declared but never accessed? Just today  
 alone there's been two bugs I spent 10-30 minutes going nuts trying to  
 track down that turned out to be variables I had intended to use but  
 forgot to.
The problem with unused variable warnings is they are annoying when you're developing code in an iterative manner. They get in the way when you're commenting out sections of code to try and isolate a problem. They can be a problem when using "version" and "static if" statements.

So, why not just turn off the warnings? The problem with warnings is that if there are n warnings, there are essentially n factorial different versions of the language. If you're faced with compiling someone else's code (like you downloaded it off the internet and have to compile it because it only is distributed as source) and warnings go off, is that a bug in the code or not? What do you do?
You have a distorted notion of warnings. Warnings are not errors (and by corollary are not "optional errors" either). They are simply messages which indicate some "strange" situations in code, which suggest some attention from the developer (now or in the future). That's why other compilers have an *option* such as "treat-warnings-as-errors". If they were errors, they wouldn't need that option, cause they would already be treated as errors (cause they would *be* errors...), lol. :(

You (Walter) and other people may be inclined to disagree, especially if you are heavily biased towards C++, where warnings, like you said, have been used for *things that should have been errors*, and have created this whole messed up and confused situation, and scenarios where people think C++ code should compile without errors, etc., etc. But that is just a scenario arising from C++ fucked-up-ness. If you (and others) still don't agree, which you probably won't, then let's not argue semantics, and just call this notion of warnings that I defined before "cautions".

With this in mind, what's wrong with the compiler generating messages (and just messages, not errors) for certain suspicious code situations, such as unused variables? Just that, what do you say?
Now I agree with Walter on that matter. The compiler's job is to compile an executable. As a gentoo user, when I compile something (and I do it a lot :p ) I expect two messages: "build finished" *or* "build failed for the following reason: ...". All those warnings are *not for me*, they are for developers and needed during development time only. Imagine you are updating a web browser or some other application and get all those "comparison between signed and unsigned types" messages. Do you want to read them?

OTOH, I want my code to be constantly analyzed for suspicious situations, but _only during development time_. That's why I use an IDE. And my IDE should help me as I type: syntax highlighting, code autocomplete, refactoring *and* warnings. It's almost free for an IDE to analyze my code for possible errors. But the compiler's job is to compile *or* reject the code, and it should do it as fast as possible without spending time looking into suspicious situations.

Compiler and IDE tasks do often overlap, of course, but it doesn't mean that they should be merged into a single solution.

Just my $0.02...
Jul 27 2008
next sibling parent bearophile <bearophileHUGS lycos.com> writes:
Koroskin Denis:
 As a gentoo user when I compile something (and I do it alot :p  
 ) I expect two messages: "build finished" *or* "build failed for the  
 following reason: ...". All those warning are *not for me*, they are for  
 developers and needed during development time only. Imagine you are  
 updating a web-browser or some other application and get all those  
 "comparison between signed and unsigned types" messages. Do you want to  
 read them?
If you build things just to use them quickly, then you may want to omit putting -Wall there... Bye, bearophile
Jul 27 2008
prev sibling parent reply Bruno Medeiros <brunodomedeiros+spam com.gmail> writes:
Koroskin Denis wrote:
 On Sun, 27 Jul 2008 17:56:01 +0400, Bruno Medeiros 
 <brunodomedeiros+spam com.gmail> wrote:
 
 Walter Bright wrote:
 Nick Sabalausky wrote:
 I don't suppose there's any chance of DMD getting a warning for 
 variables/arguments that are declared but never accessed? Just today 
 alone there's been two bugs I spent 10-30 minutes going nuts trying 
 to track down that turned out to be variables I had intended to use 
 but forgot to.
The problem with unused variable warnings is they are annoying when you're developing code in an iterative manner. They get in the way when you're commenting out sections of code to try and isolate a problem. They can be a problem when using "version" and "static if" statements.

So, why not just turn off the warnings? The problem with warnings is that if there are n warnings, there are essentially n factorial different versions of the language. If you're faced with compiling someone else's code (like you downloaded it off the internet and have to compile it because it only is distributed as source) and warnings go off, is that a bug in the code or not? What do you do?
You have a distorted notion of warnings. Warnings are not errors (and by corollary are not "optional errors" either). They are simply messages which indicate some "strange" situations in code, which suggest some attention from the developer (now or in the future). That's why other compilers have an *option* such as "treat-warnings-as-errors". If they were errors, they wouldn't need that option, cause they would already be treated as errors (cause they would *be* errors...), lol. :(

You (Walter) and other people may be inclined to disagree, especially if you are heavily biased towards C++, where warnings, like you said, have been used for *things that should have been errors*, and have created this whole messed up and confused situation, and scenarios where people think C++ code should compile without errors, etc., etc. But that is just a scenario arising from C++ fucked-up-ness. If you (and others) still don't agree, which you probably won't, then let's not argue semantics, and just call this notion of warnings that I defined before "cautions".

With this in mind, what's wrong with the compiler generating messages (and just messages, not errors) for certain suspicious code situations, such as unused variables? Just that, what do you say?
Now I agree with Walter on that matter. The compiler's job is to compile an executable. As a gentoo user, when I compile something (and I do it a lot :p ) I expect two messages: "build finished" *or* "build failed for the following reason: ...". All those warnings are *not for me*, they are for developers and needed during development time only. Imagine you are updating a web browser or some other application and get all those "comparison between signed and unsigned types" messages. Do you want to read them?
I too was talking about development time only. If you're compiling as a user, then yes there should be an option that suppresses various output, warnings or not.
 OTOH, I want for my code to be constantly analyzed for suspicious 
 situation but _only during development time_. That's why I use IDE. And 
 my IDE should help me as I type: syntax highlighting, code autocomplete, 
 refactoring *and* warnings. It's almost free for IDE to analyze my code 
 for possible errors. But compiler's job is to compile *or* reject the 
 code, and it should do it as fast as possible without spending time for 
 looking into suspicious situations.
 
 Compiler and IDE tasks do often overlap, of course, but it doesn't mean 
 that they should be merged into single solution.
 
 Just my $0.02...
Like you said, the compiler and IDE tasks overlap. In D's case, if DMD did a proper warning analysis, then an IDE could use the compiler to present warnings to the user in a proper manner (like squiggles in the source code editor). In Descent's case, it would be particularly easy to do that, since it has an embedded/ported DMD frontend, and already does the same for compiler errors.

-- 
Bruno Medeiros - Software Developer, MSc. in CS/E graduate
http://www.prowiki.org/wiki4d/wiki.cgi?BrunoMedeiros#D
Jul 27 2008
parent reply Yigal Chripun <yigal100 gmail.com> writes:
Bruno Medeiros wrote:
 Koroskin Denis wrote:
 On Sun, 27 Jul 2008 17:56:01 +0400, Bruno Medeiros
 <brunodomedeiros+spam com.gmail> wrote:

 Walter Bright wrote:
 Nick Sabalausky wrote:
 I don't suppose there's any chance of DMD getting a warning for
 variables/arguments that are declared but never accessed? Just
 today alone there's been two bugs I spent 10-30 minutes going nuts
 trying to track down that turned out to be variables I had intended
 to use but forgot to.
The problem with unused variable warnings is they are annoying when you're developing code in an iterative manner. They get in the way when you're commenting out sections of code to try and isolate a problem. They can be a problem when using "version" and "static if" statements.

So, why not just turn off the warnings? The problem with warnings is that if there are n warnings, there are essentially n factorial different versions of the language. If you're faced with compiling someone else's code (like you downloaded it off the internet and have to compile it because it only is distributed as source) and warnings go off, is that a bug in the code or not? What do you do?
You have a distorted notion of warnings. Warnings are not errors (and by corollary are not "optional errors" either). They are simply messages which indicate some "strange" situations in code, which suggest some attention from the developer (now or in the future). That's why other compilers have an *option* such as "treat-warnings-as-errors". If they were errors, they wouldn't need that option, cause they would already be treated as errors (cause they would *be* errors...), lol. :(

You (Walter) and other people may be inclined to disagree, especially if you are heavily biased towards C++, where warnings, like you said, have been used for *things that should have been errors*, and have created this whole messed up and confused situation, and scenarios where people think C++ code should compile without errors, etc., etc. But that is just a scenario arising from C++ fucked-up-ness. If you (and others) still don't agree, which you probably won't, then let's not argue semantics, and just call this notion of warnings that I defined before "cautions".

With this in mind, what's wrong with the compiler generating messages (and just messages, not errors) for certain suspicious code situations, such as unused variables? Just that, what do you say?
Now I agree with Walter on that matter. The compiler's job is to compile an executable. As a gentoo user, when I compile something (and I do it a lot :p ) I expect two messages: "build finished" *or* "build failed for the following reason: ...". All those warnings are *not for me*, they are for developers and needed during development time only. Imagine you are updating a web browser or some other application and get all those "comparison between signed and unsigned types" messages. Do you want to read them?
I too was talking about development time only. If you're compiling as a user, then yes there should be an option that suppresses various output, warnings or not.
 OTOH, I want for my code to be constantly analyzed for suspicious
 situation but _only during development time_. That's why I use IDE.
 And my IDE should help me as I type: syntax highlighting, code
 autocomplete, refactoring *and* warnings. It's almost free for IDE to
 analyze my code for possible errors. But compiler's job is to compile
 *or* reject the code, and it should do it as fast as possible without
 spending time for looking into suspicious situations.

 Compiler and IDE tasks do often overlap, of course, but it doesn't
 mean that they should be merged into single solution.

 Just my $0.02...
Like you said, the compiler and IDE tasks overlap. In D's case, if DMD did a proper warning analysis, then an IDE could use the compiler to present warnings to the user in a proper manner (like squiggles in the source code editor). In Descent's case, it would be particularly easy to do that, since it has an embedded/ported DMD frontend, and already does the same for compiler errors.
Even better would be to have something like clang, which offers a collection of libs (codegen, semantic analysis, parsing, etc.) and an API for each lib. That way the Descent folks could have just used the semantic analysis and parser DLLs of the compiler and the respective APIs, instead of having to port the DMD frontend from C++ to Java. I think Ary wrote here once that he had to replace all the gotos with exceptions or something like that. That doesn't sound good or maintainable to me.
Jul 27 2008
parent Bruno Medeiros <brunodomedeiros+spam com.gmail> writes:
Yigal Chripun wrote:
 
 Even better would be to have something like clang which offers a
 collection of libs (codegen, semantic analysis, parsing, etc..) and an
 API for each lib. that way the descent folks could have just used the
 semantic analysis and parser DLLs of the compiler and the respective
 APIs instead of having to port the DMD frontend from c++ to Java. I
 think Ary wrote here once that he had to replace all the gotos with
 exceptions or something like that. That doesn't sound good or
 maintainable to me..
*How* an IDE uses the compiler to perform analysis is another story. Right now the point is simply that it would be nice if the compiler (DMD) had more analysis functionality.

-- 
Bruno Medeiros - Software Developer, MSc. in CS/E graduate
http://www.prowiki.org/wiki4d/wiki.cgi?BrunoMedeiros#D
Jul 27 2008
prev sibling next sibling parent reply "Manfred_Nowak" <svv1999 hotmail.com> writes:
Nick Sabalausky wrote:

 turned out to be variables I had intended to use but forgot to
I am trying to tackle such time wastings with "protocols" in drokue. If one were able to formally attach one's intentions to variables, then such bugs could possibly be prevented.

In your case the intention might have been to write and read the variable several times, of course starting with a write followed by some read. This intention can be expressed by a regular expression like:

  write+ read ( write | read )*

For evaluating this at runtime (!) one may attach a DFA to the variable, a DFA that interprets the operations on the variable as input. Of course the DFA has to be initialized somewhere before the first operation on the variable. At program termination the DFA can then be checked as to whether it is in a final state. If at program termination the DFA is not in a final state, then an "intention violation" error can be reported. This way your time wouldn't have been wasted.

Please note that such a thing cannot be done by a lint tool.

-manfred

-- 
Maybe some knowledge of some types of disagreeing and their relation can turn out to be useful:
http://blog.createdebate.com/2008/04/07/writing-strong-arguments/
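To make the scheme concrete, here is a hand-written sketch of such a protocol DFA (all names invented for illustration; drokue itself is not public, so this only shows the idea, not its actual implementation):

import std.stdio;

enum Op { write, read }

// DFA for the protocol:  write+ read ( write | read )*
// states: 0 = start, 1 = written at least once, 2 = accepting, -1 = dead
struct Protocol
{
    int state = 0;

    void feed(Op op)
    {
        switch (state)
        {
            case 0:  state = (op == Op.write) ? 1 : -1; break;
            case 1:  if (op == Op.read) state = 2;      break;
            case 2:  break;   // ( write | read )* : stays accepting
            default: break;   // dead state is absorbing
        }
    }

    bool accepted() { return state == 2; }
}

void main()
{
    Protocol p;
    p.feed(Op.write);   // the variable is written...
    // ...but never read before program termination:
    if (!p.accepted())
        writefln("intention violation: written but never read");
}

A real implementation would of course hook feed() into every access to the tracked variable (via a wrapper type or compiler instrumentation) rather than calling it by hand.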
Jul 09 2008
parent reply "Nick Sabalausky" <a a.a> writes:
"Manfred_Nowak" <svv1999 hotmail.com> wrote in message 
news:g51vnt$1o9n$1 digitalmars.com...
 Nick Sabalausky wrote:

 turned out to be variables I had intended to use but forgot to
I am trying to tackle such time wastings with "protocols" in drokue. If one would be able to formally attach ones intentions to variables then such bugs could possibly be prevented. In your case the intention might have been to write and read the variable several times, of course starting with a write followed by some read. This intention can be expressed by a regular expression like: write+ read ( write | read )* For evaluating this at runtime (!) one may attach a DFA to the variable---a DFA that interpretes the operations on the variable as input. Of course the DFA has to be initiated somewhere before the first operation on the variable. At program termination the DFA can than be checked, whether it is in a final state. If at program termination the DFA is not in a final state then an "intention violation"-error can be reported. This way your time wouldn't have been wasted.
I've been seeing a lot in these last few years about such... I'm going to call it "intent oriented programming". There's a lot of good stuff that works that way (unit testing and D's function contracts, for instance; I've seen some other new things from the Java world as well), and your idea is certainly interesting.

But I worry that eventually we'll get to some point where all code either is or can be generated straight from "intents" syntax. Now that certainly sounds great, but at that point all we really would have done is reinvent "programming language", and we'd be left with the same problem we have today: how can we be sure that the "code"/"intents" that we wrote are really what we intended to write? The solution would have just recreated the problem.

Regarding your specific idea, my concern is that the whole "write and read the variable several times, starting with a write followed by some read" is tied too closely to the actual implementation. Change your algorithm/approach, and you've got to go update your intents. I'd feel like I'd gone right back to header files.
 Please note, that such cannot be done by a lint tool.
True, since lint tools only do front-end work. But the compiler would be able to do it by injecting appropriate code into its output. An external lint tool *could* be made to do it by using ultra-fancy CTFE, but then the ultra-fancy-CTFE engine would effectively be a VM (or, heck, even real native code), which would mean adding a backend to the lint tool which would basically turn it into a compiler. Thus, in a manner of speaking, there would be correctness-analysis that a compiler could do that a lint tool (by a definition of "lint tool" that would have admittedly become rather debatable by that point) couldn't. ;)
Jul 09 2008
next sibling parent reply BCS <ao pathlink.com> writes:
Reply to Nick,

 I've been seeing a lot in these last few years about such... I'm going
 to call it "intent oriented programming".
 
look up intentional programming.
 But I worry that eventually we'll get to some point where all code
 either is or can be generated straight from "intents" syntax. Now that
 certainly sounds great, but at that point all we really would have
 done is reinvent "programming language" and we'd be left with the same
 problem we have today: how can we be sure that the "code"/"intents"
 that we wrote are really what we intended to write? The solution would
 have just recreated the problem.
But the hope is that this stuff will be easier for you to read/evaluate, smaller, written in terms that are closer to how you think of the problem and further from how it's implemented. At some point the issue arises of "is this what the end user wants?" (lint can't help you there :)
Jul 09 2008
parent "Nick Sabalausky" <a a.a> writes:
"BCS" <ao pathlink.com> wrote in message 
news:55391cb32ef798caafd0fdb12c62 news.digitalmars.com...

 At some point the issue arises of "is this what the end user wants?" (lint 
 can't help you there :)
I would love to have a "deal with the client" tool I could delegate all of that stuff to ;)
Jul 09 2008
prev sibling parent Walter Bright <newshound1 digitalmars.com> writes:
Nick Sabalausky wrote:
 But I worry that eventually we'll get to some point where all code either is 
 or can be generated straight from "intents" syntax. Now that certainly 
 sounds great, but at that point all we really would have done is reinvent 
 "programming language" and we'd be left with the same problem we have today: 
 how can we be sure that the "code"/"intents" that we wrote are really what 
 we intended to write? The solution would have just recreated the problem.
Back in the 80's, there was a heavily advertised product that touted "no more programming necessary". All you had to do was write in their "scripting language" and the product would read that and do all the "programming" for you. I thought it was hilarious.
Jul 09 2008
prev sibling next sibling parent reply Era Scarecrow <rtcvb32 yahoo.com> writes:
 Manfred_Nowak wrote:
 JAnderson wrote:
 
 The more warnings as errors the better.  If I have
to suffer a
 little for false positives *shrug*
What do you understand by "a little"?
I don't understand what your asking. I meant that if I have to fix it because the compiler tells me its an error then so be it. Its a little pain for a lot of gain.
Forcing all warnings to be errors will not always be nice or pretty in any given circumstance. And occasionally, the 'errors' will appear on certain architectures and not others, am I right?

Under a basic understanding, the compiler does the following at the start of a function/code block: first it prepares the stack with space, then initializes if it's static, etc. With an unused variable, it shouldn't be an error during certain phases, and should be on others.

Example: during my first few phases I usually start by declaring my function, starting with what it needs and the order of the parameters and names, then I put in several different variables, declaring them.

//basic declaration to be finished later
int isPrime(int number, int[] primesList)
{
    int cnt;

    return 0; //shut the compiler up for now.
}

Simply as an example, I can declare primesList, which I intend to make use of at a later time for performance and speed, but at the moment I'll implement a very basic formula that is simple and requires very little work. What's the worst case if primesList or cnt aren't used right now? 8? 12 bytes on the stack? Geez, you all make it sound like it should be an error and it's illegal to not touch something now because I'm working on it and other things at once.

I think I'd rather have a simple warning saying 'oh by the way, you aren't using this' when I go through my warnings/error list, and I'll say later on down the road 'oh yeah, here's something I need to finish' or 'oh, I don't need that variable, let's remove it', rather than....

//basic declaration to be finished later
int isPrime(int number, int[] primesList)
{
    int cnt;
    unusedReferenceToShutUpErrors(cnt);
    unusedReferenceToShutUpErrors(primesList);

    return 0; //shut the compiler up for now.
}

In my opinion that extra code is just taking up space to tell the compiler to shut the hell up, with no functional usage whatsoever. Please don't say it isn't; I'm not going to fight and reply on the issue. I don't see why everyone is so nit-picky about this specific topic.

My opinion? If the variable isn't used, have it as a warning but compile it. If you set warnings as errors, you can remove/comment the declarations when the time comes, but I always work to get the code working before I refactor it to make the 'by the way' warnings go away, if they need it.

I'm not sure about everyone else, but I've been reading a lot on how Walter has been defining and building D. The way it works, usage, declarations; there are a few minor things I am not sure about yet (bitfields, multi-dimensional arrays, etc.). However it's a big step up from C, and looks much easier to understand without the ugly operators and overloading that C++ has (::, friend functions, headers and preprocessing, etc.). It was these very things that kept me from learning C++, because it was too hard for me to grasp because it was ugly, and I don't want to make something ugly.
 Please look at the example from 
 http://www.digitalmars.com/webnews/newsgroups.php?art_group=digitalmars.D&article_id=73441
 
 Do you recognize how many warnings a lint tool might emit on that code?
 Would you admit then, that a paranoid lint would be quite useless, even
 if it detects that the variable `p' should be accessed?

I don't understand. With lint, it just gives you hints about what could be wrong. You pick and choose what to fix.
 Would you admit that you yourself are unable to decide whether the presence of
 some access statements to `p' should suppress the warning?

Since all my warnings are errors, it would be an error too. If you really want to use an uninitialized variable there should be a workaround, but it should be harder to do.
Indeed. But some warnings you can't ignore without dropping out the code. Take the example:

const DEBUG = 0;
...
if (DEBUG) {
    //put debugging information
}

The compiler SHOULD give you a happy warning of 'always comes up false' or 'always comes up true' when you're enabling/disabling code in this manner. Making all warnings into errors simply removes the functionality of the code. I believe GCC does a good job at it, last I checked, and occasionally I notice I simply did the equation wrong when I see a warning where it shouldn't be.

while(1) //always true
{
    if (someCondition){
        break;
    }
}

I've had to use this code once or twice; once again you'll have a warning, not strictly an error.

Era
Jul 10 2008
parent reply Markus Koskimies <markus reaaliaika.net> writes:
On Thu, 10 Jul 2008 04:16:06 -0700, Era Scarecrow wrote:

[...]
 //basic declaration to be finished later int isPrime(int number, int[]
 primesList) {
     int cnt;
 
     return 0; //shut the compiler up for now.
 }
[...]
 //basic declaration to be finished later int isPrime(int number, int[]
 primesList) {
     int cnt;
     unusedReferenceToShutUpErrors(cnt);
     unusedReferenceToShutUpErrors(primesList);
 
     return 0; //shut the compiler up for now.
 }
In "C++'ish" / D way, this is normally dealt like this: int isPrime(int number, int[] /*primesList*/) { //int cnt; return 0; } Hard & ugly? I think warnings are not meant to be used to remember you that you have something unfinished. I regularly tag those parts with "// todo", which is easily grep'd from code.
  What's the worst case that primesList or cnt aren't used right now? 8?
  12 bytes in the stack? Geez, you all make it sound like it should be an
  error and it's illegal to not touch something now because i'm working
  on it and other things at once.
It is not about consuming memory, it's about compiling code that won't work. It is not about making intentionally dead code, it's about accidentally having dead code.
  I don't see why everyone is so nit-picky about this specific topic.
Because syntactic salt and more pedantic checking save a lot of debugging time.
  It was these very things that kept me from learning C++, because it was
  too hard for me to grasp because it was ugly, and i don't want to make
  something ugly.
Warnings & checks do not make C++ ugly :) Constant conditionals:
 const DEBUG = 0;
 
 ..
 
 if (DEBUG) {
     //put debugging information
 }
1) Use static if? (A sketch follows below.)

2) Anyway, it is not a constant conditional in the sense that is normally warned about; you have intentionally set the value so, and thus the compiler can optimize the dead code away without problem. Normally it is warned about if you have an expression:

for(...; (a + b) < 8; ...) { ... }

...if the compiler recognizes that the condition can never be anything other than true or false, do you think it is intentional?
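For illustration, the static if route from point 1, reusing the DEBUG constant from the example above:

import std.stdio;

const DEBUG = 0;

void main()
{
    // Resolved at compile time: with DEBUG == 0 this block is never
    // compiled at all, so there is no runtime branch left to warn about.
    static if (DEBUG)
    {
        writefln("debugging information");
    }
}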
 while(1)    //always true
 {
     if (someCondition){
         break;
     }
 }
The same holds here; "while(true)" and "for(;;)" are intentionally written to be infinite loops. I regularly define in C/C++:

#define forever for(;;)

I really don't mean that the "warning system" should be something like "code smell analysis" for refactoring, but I think that most of those basic things could easily be caught at compile time (w/ warnings) without any negative side effects.
Jul 10 2008
next sibling parent Markus Koskimies <markus reaaliaika.net> writes:
On Thu, 10 Jul 2008 12:00:15 +0000, Markus Koskimies wrote:

An addition; I wouldn't mind (in fact I would hail it) if all unused parts were declared an error in the D spec. Unused/uninitialized vars, private members, dead code blocks and such are normally produced when you have had an intensive coding session to change some internals, but have not remembered to change the behavior in every affected place. At least I never intentionally write dead code; it always means that I have forgotten to do something. Furthermore, D has so many powerful tools for quickly "out-commenting" dead code (like "/+ ... +/" and static if) that it should not be a problem.

There are some harder-to-detect things, just like infinite loops. I think that it is always an error to make a compiler-detectable infinite loop (or if statement) using comparison operators (even if there are reachable breaks or exception-throwing calls inside of it), like:

	for(; -1 > 0 ;) { ... }

"for(;;)" is valid for those, as well as "while(true)". "if(true)" would be better written as "static if(true/false)".

Signed-unsigned comparison is easily made unintentionally, since most programmers use int extensively while library functions return size_t and similar types, which are unsigned. IMO it is a clear programming error.
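For illustration, a sketch of how easily it bites (array and values chosen arbitrarily; .length is the usual unsigned culprit):

import std.stdio;

void main()
{
    int[] a = [1, 2, 3];
    int i = -1;

    // i is promoted to the unsigned type, so -1 becomes the maximum
    // value and the "obviously true" comparison is false:
    if (i < a.length)
        writefln("in range");
    else
        writefln("surprise: -1 is not less than %s after promotion", a.length);
}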
Jul 10 2008
prev sibling parent reply "Nick Sabalausky" <a a.a> writes:
"Markus Koskimies" <markus reaaliaika.net> wrote in message 
news:g54tke$1h9i$11 digitalmars.com...
 On Thu, 10 Jul 2008 04:16:06 -0700, Era Scarecrow wrote:

 [...]
 //basic declaration to be finished later int isPrime(int number, int[]
 primesList) {
     int cnt;

     return 0; //shut the compiler up for now.
 }
[...]
 //basic declaration to be finished later int isPrime(int number, int[]
 primesList) {
     int cnt;
     unusedReferenceToShutUpErrors(cnt);
     unusedReferenceToShutUpErrors(primesList);

     return 0; //shut the compiler up for now.
 }
In "C++'ish" / D way, this is normally dealt like this: int isPrime(int number, int[] /*primesList*/) { //int cnt; return 0; } Hard & ugly? I think warnings are not meant to be used to remember you that you have something unfinished. I regularly tag those parts with "// todo", which is easily grep'd from code.
Warnings are intended to point out things you may have overlooked. Forgetting to finish something certainly qualifies as "overlooked". I do "//TODO"s as well, but I do it so much that I'm very likely to end up forgetting to finish a partial implementation before I get around to going through my TODOs and getting to that particular one.

So the problem that inevitably crops up is: When I've commented that stuff out to make my partial implementation compile (so that I can test what I've implemented so far), and then move on to something else and forget to come back to my partial implementation, what's going to happen? It's NOT going to give me a nice noisy warning about "Hey, Mr. Screwup, you're not using this variable!", which *would* have pointed me right back to my unfinished code *before* I got around to that particular "//TODO". So now I have a hidden bug on my hands, just because I've deliberately circumvented the warning system and thereby lost the benefits of having it.
Jul 10 2008
parent reply "Nick Sabalausky" <a a.a> writes:
"Nick Sabalausky" <a a.a> wrote in message 
news:g55ovu$1p0b$1 digitalmars.com...
 "Markus Koskimies" <markus reaaliaika.net> wrote in message 
 news:g54tke$1h9i$11 digitalmars.com...
 On Thu, 10 Jul 2008 04:16:06 -0700, Era Scarecrow wrote:

 [...]
 //basic declaration to be finished later int isPrime(int number, int[]
 primesList) {
     int cnt;

     return 0; //shut the compiler up for now.
 }
[...]
 //basic declaration to be finished later
 int isPrime(int number, int[] primesList) {
     int cnt;
     unusedReferenceToShutUpErrors(cnt);
     unusedReferenceToShutUpErrors(primesList);

     return 0; //shut the compiler up for now.
 }
In "C++'ish" / D way, this is normally dealt like this: int isPrime(int number, int[] /*primesList*/) { //int cnt; return 0; } Hard & ugly? I think warnings are not meant to be used to remember you that you have something unfinished. I regularly tag those parts with "// todo", which is easily grep'd from code.
Warnings are intended to point out things you may have overlooked. Forgetting to finish something certainly qualifies as "overlooked". I do "//TODO"s as well, but I do it so much that I'm very likely to end up forgetting to finish a partial implementation before I get around to going through my TODOs and getting to that particular one.

So the problem that inevitably crops up is: When I've commented that stuff out to make my partial implementation compile (so that I can test what I've implemented so far), and then move on to something else and forget to come back to my partial implementation, what's going to happen? It's NOT going to give me a nice noisy warning about "Hey, Mr. Screwup, you're not using this variable!", which *would* have pointed me right back to my unfinished code *before* I got around to that particular "//TODO". So now I have a hidden bug on my hands, just because I've deliberately circumvented the warning system and thereby lost the benefits of having it.
In other words, treating warnings as errors (at least if it's all the time) or turning warnings into errors creates a need to suppress the warnings. And in cases like these, suppressing them ends up defeating the whole point of having them in the first place.
Jul 10 2008
parent Markus Koskimies <markus reaaliaika.net> writes:
On Thu, 10 Jul 2008 15:51:52 -0400, Nick Sabalausky wrote:

 In other words, treating warnings as errors (at least if it's all the
 time) or turning warnings into errors creates a need to suppress the
 warnings. And in cases like these, suppressing them ends up defeating the
 whole point of having them in the first place.
I understand your point, and that's why my suggestion would be that the compiler has a flag to relax (warnings as warnings) when sketching the code :) (sure, it is just the same whether the compiler is relaxed or strict by default; the whole point is whether the warnings are needed or not). The current DMD -release flag would be a good point to turn warnings into errors, but unfortunately it removes runtime checks :(
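For example - assuming the current compiler switches - the optional warnings 
already have to be asked for explicitly:

	dmd -w mymodule.d

where -w enables the optional warnings; the "--cmon-relax" flag I mention 
would simply be the reverse default.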
Jul 10 2008
prev sibling next sibling parent reply Era Scarecrow <rtcvb32 yahoo.com> writes:
 In "C++'ish" / D way, this is normally dealt
 like this:
 
 int isPrime(int number, int[] /*primesList*/)
 {
 	//int cnt;
 
 	return 0;
 }
 
 Hard & ugly? I think warnings are not meant to be used
 to remember you 
 that you have something unfinished. I regularly tag those
 parts with "//
 todo", which is easily grep'd from code.
Maybe in that instance no, it isn't hard or ugly. However, if I had continued and partially implemented it, you need it enabled, but enabled doesn't help much since you can't get to it.

	//returns true/false if a prime.
	bool isPrime(int number, int[] primesList)
	{
		int cnt;
		bool prime = true;

		primesList = null;	//disable for now.
					//When fully implemented, remove

		if (primesList)	//never executes, but the logic is there.
		{
			foreach (pnum; primesList)
			{
				//finish later
			}
		}
		else
			for (cnt = 2; cnt < number; cnt++)
				if (number % cnt == 0)
					prime = false;

		return prime;
	}

Being partially implemented, I don't really want to comment out large areas of my code, since I want to know if I wrote it wrong or right. In the middle of my project, I'll make sure all my block openings/closings are good, and then compile the code, which I will never run; instead I use the errors and warnings to find certain bugs and simple problems early, rather than do an entire file and then trace all the bugs at once. Or am I doing it wrong?
 It is not about consuming memory, it's about compiling code that won't
 work. It is not about making intentionally dead code, it's about
 accidentally having dead code.
Agreed. Truly dead code should produce an error.
 Because syntactic salt and more pedantic checking save a lot of
 debugging time.
 
  It was these very things that kept me from learning
C++, because it was
  too hard for me to grasp because it was ugly, and i
don't want to make
  something ugly.
Warnings & checkings does not make C++ ugly :)
No, I mean the :: operator and other details in the language. Sorry for the confusion there.
 Constant conditionals:
 
 const DEBUG = 0;
 
 ..
 
 if (DEBUG) {
     //put debugging information
 }
1) Use static if?
Still learning D; I've seen references to static if but I haven't read enough on it yet.
 2) Anyway, it is not a constant conditional in the way that is
 normally warned about - you have intentionally set the value so, and
 thus the compiler could optimize the dead code away without problems.
 Normally it is warned about if you have an expression:
 
 	for(...; (a + b) < 8; ...) { ... }
 
 ...and the compiler recognizes that the condition can never be
 anything other than true or false - do you think it is intentional?
If it was truly intentional, then wouldn't you do..?

	for (;;){}
or
	while(true){}

If it wasn't intentional, then you were actually trying to check for something.
 
 while(1)    //always true
 {
     if (someCondition){
         break;
     }
 }
 The same holds here; "while(true)" and "for(;;)" are intentionally written to be infinite loops. I regularly define in C/C++:
Those are truly intentional; a condition should actually have a chance to change and be true/false at different steps, correct?
 #define forever for(;;)
 
 I really don't mean that the "warning system" should be something like
 "code smell analysis" for refactoring, but I think that most of those
 basic things could easily be caught at compile time (w/ warnings)
 without any negative side effects.
Jul 10 2008
parent Markus Koskimies <markus reaaliaika.net> writes:
On Thu, 10 Jul 2008 10:10:49 -0700, Era Scarecrow wrote:

 //returns true/false if a prime.
 bool isPrime(int number, int[] primesList) {
 	int cnt;
 	bool prime = true;
 
 	primesList = null;	//disable for now.
                               //When fully implemented, remove
 
 	if (primesList)	//gives an error since it never executes,
                       //but logic is there. 
       {
 		foreach (pnum; primesList)
 		{
 		//finish later
 		}
 
 	}
 	else
 		for (cnt = 2; cnt < number; cnt++)
 			if (number % cnt == 0)
 				prime=false;
 
 
 	return prime;
 }
Hmmh, if I were to do something like that, I would do it like this:

	bool isPrime(int number, int[] /* primesList */)
	{
		bool prime = true;

		static if(false)
		{
			// Partial implementation; should still be
			// syntactically valid D
			foreach(pnum; primesList) { ... }
		}
		else
		{
			for(int cnt = 2; ...) { ... }
		}

		return prime;
	}

In fact, I wouldn't even put that list-scanning part there if I'm working with the loop. Even when I'm just sketching code, I try to avoid large numbers of unused and incomplete parts; I mainly sketch the interface, and start from the core parts to put things together.
  Being partially implimented, i don't really want to comment out large
  areas of my code, since i want to know if i wrote it wrong, or right.
Yeah, I don't like large commented-out blocks either. But I normally write the code in pieces, so in that case I would first implement that loop and only slightly - if at all - touch that primesList part. This easily turns into a discussion about how to sketch code, and I try to avoid that. I still see no use in allowing unused vars & members and dead code, since they are regularly easily worked around in D during the sketching phase, and you don't want them to be there at the end.
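For example, one minimal workaround sketch - assuming unused-variable errors 
existed - is an explicit discard:

	int cnt;
	cast(void)cnt;	// explicitly "uses" cnt and documents the intent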
  In the middle of my project, I'll make sure all my block
  openings/closings are good, and then compile the code, which I will
  never run; instead I use the errors and warnings to find certain
  bugs and simple problems early, rather than do an entire file and
  then trace all the bugs at once.
That's exactly the same way I work with code. I keep adding blocks and continuously compile to see if there are warnings or errors. If there were a situation like above, and I had just added that list-scanning part but was still unsure if it works, I would add an assert or similar to stop the program immediately if it goes there:

	bool isPrime(int number, int[] primeList)
	{
		if(primeList)
		{
			assert(false); // Don't go here at the moment
			foreach(pnum; primeList) { ... }
		}
		else
		{
			for(int cnt; ...;)
				if(!(number%cnt)) return false;
			return true;
		}
	}

But anyway, if I were starting to make that part, build-stopping warnings about unused code and similar would not be a big issue any more, since I would not be executing the program until I got the errors & warnings away.
  Or am I doing it wrong?
I don't think so, but I feel that warnings and code sketching are not mutually exclusive things :D

Anyway, I surfed the net and found some writings about warnings and D. I'll quote two things here. Walter Bright, about Redundancy in Programming Languages:

"You can get a feel for where redundancy in a language is lacking by looking at warning diagnostics the compiler implementors tend to add over time. For the designer of a new language, common warnings are a rich source of inspiration for improvements."

http://dobbscodetalk.com/index.php?option=com_myblog&show=Redundancy-in-Programming-Languages.html&Itemid=29

D 2.0 Overview, "Who D is Not For":

"Language purists. D is a practical language, and each feature of it is evaluated in that light, rather than by an ideal."

http://www.digitalmars.com/d/2.0/overview.html

Warnings may indicate imperfection in language design, but from a practical point of view I would accept that and add all practical warnings, although my aim would be to get rid of them in the future.
  Still learning D; I've seen references to static if but I haven't
  read enough on it yet.
Static if is something like #if-#endif in C/C++, but since it is in the compiler, it knows all about the constants & types in the code (unlike the C preprocessor) \o/
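A small sketch of that difference (the alias name is made up):

	alias long MyInt;

	static if(MyInt.sizeof == 8)
	    pragma(msg, "MyInt is 64 bits");	// decided at compile time
	else
	    pragma(msg, "MyInt is 32 bits");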
 if you have an expression:
 
 	for(...; (a + b) < 8; ...) { ... }
 
 ...if the compiler recognizes that the condition can never be
 anything other than true or false, do you think it is intentional?
If it was truly intentional, then wouldn't you do..?
Assume that (a+b) is always smaller than 8, which causes an infinite loop. The compiler is able to know that. If the programmer was intentionally making an infinite loop, why didn't he use "for(;;)" or "while(true)"? If the compiler gives an error that the expression is always true, I think there are two possibilities:

1) The programmer was checking out how smart the compiler is, by expressing an infinite loop in a hard way; to get the code compiled, he should change it to a regular "for(;;)" or "while(true)".

2) The programmer didn't notice that the values he was using are never evaluated greater than the intended loop termination condition. He re-examines the code and probably finds something that he was missing in the first attempt (forgot to add some other value, forgot to change the loop termination condition, ...)

A win-win situation, isn't it?
 If it wasn't intentional, then you were actually trying to check for
 something.
Exactly. That's why I like error-like warnings that won't let suspicious parts slip through.
 while(1)    //always true
 {
     if (someCondition){
         break;
     }
 }
 The same holds here; "while(true)" and "for(;;)" are intentionally written to be infinite loops. I regularly define in C/C++:
Those are truly intentional; a condition should actually have a chance to change and be true/false at different steps, correct?
From this point on, I no longer use the word warning. Instead I use the word error. I'm not speaking about a "Variable Not Used *Warning*", but instead of a "Variable Not Used *Error*" and an "Unreachable Statement *Error*".

Yes, right. Of course, the compiler should not give any kind of error if the loop is clearly intentionally made infinite, or if the condition is clearly intentionally made constant:

	while(true) ...		// No error
	if(true) ...		// No error

	auto a=file.getSize("myfile.txt");	// ulong
	...
	while( a != -1) ...	// Error

In fact, I would change the default behavior of the compiler so that it uses all warnings and regards them as errors. For testing out incomplete code, a "--cmon-relax" flag could be used :)
Jul 10 2008
prev sibling next sibling parent reply Robert Fraser <fraserofthenight gmail.com> writes:
Markus Koskimies Wrote:

 On Wed, 09 Jul 2008 17:53:52 -0400, Nick Sabalausky wrote:
 
 In a "properly defined language", how would you solve the problem of
 unintentionally-unused variables?
My suggestion: just give an error. No need for an "unused" keyword; just comment out code that has no effect. For function arguments that are unused but mandatory for keeping an interface, leave the argument without a name (see the sketch below).

Furthermore, also give errors for unused private/static things. If they are not used, why are they in the code? Just comment them out. In a similar manner, warn about conditional expressions that have a constant value (like "uint a; if(a > 0) { ... }"), code that has no effect, and all those things :)

And yes, warnings could be considered "optional errors" for those of us who think it's best to tackle all sorts of quirks & potential bugs at compile time instead of trying to find them with runtime debugging. As long as the warning makes some sense and can be circumvented in some reasonable way, just throw it on my screen :)
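For example, a minimal sketch (the function name is just illustrative):

	// The second argument keeps the interface but has no name,
	// so there is nothing "unused" to give an error about:
	int countPrimes(int number, int[] /* reserved for later */)
	{
	    return 0;
	}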
In a final release, unused things are signs of errors. When writing code, unused variables (perhaps they were used in a commented-out section?) are a dime a dozen. If unused vars were errors in any language, it would be for development. With our linkers, unused imports are potentially more dangerous than an unused local variable that the code generator throws out.
Jul 10 2008
parent reply Walter Bright <newshound1 digitalmars.com> writes:
Robert Fraser wrote:
 In a final release, unused things are signs of errors. When writing
 code, unused variables (perhaps they were used in a commented-out
 section?) are a dime a dozen.
Yes, that's why I find the warning to be a nuisance, not a help.
Jul 10 2008
next sibling parent reply Markus Koskimies <markus reaaliaika.net> writes:
On Thu, 10 Jul 2008 12:51:47 -0700, Walter Bright wrote:

 Robert Fraser wrote:
 In a final release, unused things are signs of errors. When writing
 code, unused variables (perhaps they were used in a commented-out
 section?) are a dime a dozen.
Yes, that's why I find the warning to be a nuisance, not a help.
I've been coding for a while, and I have a hard time remembering the last time I had unused vars or other pieces of code that were there intentionally... Sure, I may have some sort of brain damage due to heavy use of "gcc -Wall" and "lint" affecting my coding style, so that I unconsciously avoid warning-generating sketches... About those unused imports (mentioned by Robert) - I think that the compiler could stop on those too. I was just coming to that subject :D
Jul 10 2008
next sibling parent reply BCS <ao pathlink.com> writes:
Reply to Markus,

 On Thu, 10 Jul 2008 12:51:47 -0700, Walter Bright wrote:
 
 Robert Fraser wrote:
 
 In a final release, unused things are signs of errors. When writing
 code, unused variables (perhaps they were used in a commented-out
 section?) are a dime a dozen.
 
Yes, that's why I find the warning to be a nuisance, not a help.
I've been coding for a while, and I have a hard time remembering the last time I had unused vars or other pieces of code that were there intentionally... Sure, I may have some sort of brain damage due to heavy use of "gcc -Wall" and "lint" affecting my coding style, so that I unconsciously avoid warning-generating sketches... About those unused imports (mentioned by Robert) - I think that the compiler could stop on those too. I was just coming to that subject :D
One case where extra vars might be added is as padding for cache effects.
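e.g. a rough D sketch, assuming 64-byte cache lines (the names are made up):

	struct PaddedCounter
	{
	    int value;
	    ubyte[64 - int.sizeof] pad;	// keep each counter on its own line
	}

	PaddedCounter[2] counters;	// two threads bumping counters[0] and
					// counters[1] won't share a cache line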
Jul 10 2008
next sibling parent reply Markus Koskimies <markus reaaliaika.net> writes:
On Thu, 10 Jul 2008 21:09:26 +0000, BCS wrote:

 One case where extra vars might be added is as padding for cache
 effects.
?!? You mean you use extra vars to make the cache load things into the right cache lines? Sounds extremely silly to me!
Jul 10 2008
parent reply Brad Roberts <braddr puremagic.com> writes:
Markus Koskimies wrote:
 On Thu, 10 Jul 2008 21:09:26 +0000, BCS wrote:
 
 One case where extra vars might be added is as padding for cache
 effects.
?!? You mean you use extra vars to make the cache load things into the right cache lines? Sounds extremely silly to me!
For a good talk on just how important this can be, read the slides and/or watch the video here:

http://www.nwcpp.org/Meetings/2007/09.html

The memory latency and cache line behavior is covered towards the end. The entire talk is really, really good. Parts of it talk about another issue that's come up on these newsgroups more than once: instruction and memory ordering within concurrent applications.

Later,
Brad
Jul 10 2008
parent reply Markus Koskimies <markus reaaliaika.net> writes:
On Thu, 10 Jul 2008 20:23:03 -0700, Brad Roberts wrote:

 Markus Koskimies wrote:
 On Thu, 10 Jul 2008 21:09:26 +0000, BCS wrote:
 
 One case where extra vars might be added is as padding for cache
 effects.
?!? You mean you use extra vars to make the cache load things into the right cache lines? Sounds extremely silly to me!
For a good talk on just how important this can be, read the slides and/or watch the video here: http://www.nwcpp.org/Meetings/2007/09.html
I'll read that later.
 The memory latency and cache line behavior is covered towards the end.
 The entire talk is really really good.  Parts of it talk about another
 issue that's come up on these newsgroups more than once, instruction and
 memory ordering within concurrent applications.
Sorry to say, but:

1) The situation is different in embedded systems. In those systems you know the size of the cache, the number of ways it has and the number of lines it stores. And in those systems, you don't use the compiler to optimize for the cache; you use the linker to put things in the correct places.

2) For PCs, IMO cache optimization is totally ridiculous. You really don't have any kind of clue on which kind of computer your code is being executed. If you optimize the cache usage for one special type of cache-CPU configuration, you have no idea how it performs in another configuration. I'll bet all my €2 that optimizations made for someone's Windows Pentium 4 really have no (good) effects on my 64-bit Linux. If you're going to make cache optimizations, you'll need the linker, and in the PC world you need a system that does it automatically for you.
Jul 10 2008
parent reply Markus Koskimies <markus reaaliaika.net> writes:
On Fri, 11 Jul 2008 04:17:13 +0000, Markus Koskimies wrote:

 I'll read that later.
I read it. It's all about the well-known barrier between processors, memories (RAM) and disks, and the necessity of (1) having multi-level caches and (2) striving for locality of execution. Nothing to do with the D compiler, extra unused vars and performance.

If you really want to do cache optimization for a modern PC, and not trust the compiler & runtime environment, you would need to (1) determine the cache hierarchy, sizes and the number of ways it has (as well as the indexing) and (2) write your code in assembler, and locate it at runtime so that it fills the cache lines optimally.

Certainly nothing to do with HLLs like D. Absolutely nothing.
Jul 10 2008
next sibling parent reply Brad Roberts <braddr puremagic.com> writes:
Markus Koskimies wrote:
 On Fri, 11 Jul 2008 04:17:13 +0000, Markus Koskimies wrote:
 
 I'll read that later.
I read it. It's all about the well-known barrier between processors, memories (RAM) and disks, and the necessity of (1) having multi-level caches and (2) striving for locality of execution. Nothing to do with the D compiler, extra unused vars and performance.

If you really want to do cache optimization for a modern PC, and not trust the compiler & runtime environment, you would need to (1) determine the cache hierarchy, sizes and the number of ways it has (as well as the indexing) and (2) write your code in assembler, and locate it at runtime so that it fills the cache lines optimally.

Certainly nothing to do with HLLs like D. Absolutely nothing.
Why is it that so many people here seem to have some sort of weird blinders that turn the world into black and white with no shades of grey? The world just doesn't work like that. Sorry to burst your bubble.

I'm glad it's well known to you, but it's completely foreign to others. It's very relevant information, and that's why I posted the URL. Additionally, your last sentence makes me think you're either being willfully blind or just stubborn. The cache latency and multi-processor interlocking on cache lines can be a serious performance killer that is easily resolved with padding, without the need to dip into linker tricks and assembly. Unfortunately, tools don't really exist to make it easy to discover these sorts of problems, so just knowing that they can exist might help someone out there realize a new avenue of thought at some point in their programming career.

Every modern x86 shares a cache line size these days.. 64 bytes. That one optimization alone can double the performance of a system that's hitting cache line contention. An awful lot of people aren't even aware this sort of thing can occur. Are you suggesting that it's not something programmers should be aware of?

Your 'absolutely nothing' comment is wrong. Every one of the examples in that presentation is in C, and they demonstrate the effects quite clearly. Can you do even better by going lower level? Sure, but that doesn't make it worthless or nothing.

Later,
Brad
Jul 10 2008
parent reply Markus Koskimies <markus reaaliaika.net> writes:
On Thu, 10 Jul 2008 21:53:49 -0700, Brad Roberts wrote:

 Certainly nothing to do with HLLs like D. Absolutely nothing.
Why is it that so many people here seem to have some sort of weird blinders that turn the world into black and white with no shades of grey? The world just doesn't work like that. Sorry to burst your bubble.
Be my guest.
 Additionally, your last sentence makes me think you're either being
 willfully blind or just stubborn.
Probably both.
 Every modern x86 shares a cache line size these days.. 64 bytes.  That
 one optimization alone can double the performance of a system that's
 hitting cache line contention.
There is a thing called align. If you don't care about cache indexes, but just want to make things appear in separate lines, use align. But that really does not have any effect on _regular_ multi-way caches.
 An awful lot of people aren't even aware
 this sort of thing can occur.
For decades, PC processor manufacturers have optimized their processors for software, not the other way around. That is why processors execute functions so quickly, and it is the sole reason for having caches (the regular locality of software, e.g. the IBM study from the 60's).
 Are you suggesting that it's not
 something programmers should be aware of?
Yes, I am.
 Your 'absolutely nothing' comment is wrong.  Every one of the examples
 in that presentation is in C, and they demonstrate the effects quite
 clearly.  Can you do even better by going lower level? Sure, but that
 doesn't make it worthless or nothing.
Certainly, if you make low-level optimizations, they pay off somehow - but only on the architectures you made them for. Not an HLL thing IMO. And whenever I'm optimizing cache usage for a specific architecture, I use alignment (to cache line sizes) and the linker (so as not to put two regularly referenced things at the same index).
Jul 10 2008
next sibling parent Markus Koskimies <markus reaaliaika.net> writes:
On Fri, 11 Jul 2008 05:14:57 +0000, Markus Koskimies wrote:

 And whenever I'm optimizing cache usage for a specific architecture,
 I use alignment (to cache line sizes) and the linker (so as not to put
 two regularly referenced things at the same index).
Never ever have I tried to do cache optimization with unused variables.
Jul 10 2008
prev sibling parent reply BCS <ao pathlink.com> writes:
Reply to Markus,

 For decades, PC processor manufacturers have optimized their processors
 for software, not the other way around. That is why processors execute
 functions so quickly, and it is the sole reason for having caches (the
 regular locality of software, e.g. the IBM study from the 60's).
 
I hope I'm reading you wrong but if I'm not: The whole point of the talk is that CPU's can't get better performance by optimizing them more. If the code isn't written well (the code isn't optimized for the CPU) performance will not improve... ever.
 Are you suggesting that it's not
 something programmers should be aware of?
Yes, I am.
How can you say that? Expecting the tool chain to deal with cache effects would be like expecting it to convert a bubble sort into qsort.
Jul 11 2008
parent reply Markus Koskimies <markus reaaliaika.net> writes:
On Fri, 11 Jul 2008 17:16:54 +0000, BCS wrote:

 Reply to Markus,
 
 For decades, PC processor manufacturers have optimized their processors
 for software, not the other way around. That is why processors execute
 functions so quickly, and it is the sole reason for having caches (the
 regular locality of software, e.g. the IBM study from the 60's).
 
 
 
I hope I'm reading you wrong but if I'm not: The whole point of the talk is that CPU's can't get better performance by optimizing them more. If the code isn't written well (the code isn't optimized for the CPU) performance will not improve... ever.
They will get better, and that is going to affect your software. IMO you should not write your software for the CPU; instead you need to follow certain paradigms. Let me explain this at length.

The current processors are fundamentally based on RASP models, which are an example of the so-called von Neumann architecture. This architecture offers, when physically realized, a very flexible but dense computing platform, since it is constructed from two specialized parts - memory and CPU. The drawback of this architecture is the so-called von Neumann bottleneck, which has been irritating both processor and software designers for decades.

---

The processor fabrication technology sets limits on how fast a processor can execute instructions. The early processors fetched instructions always from main memory (causing, of course, lots of external bus activity), and they processed one instruction at a time. Since processor fabrication technology improves quite slowly, there has always been interest in searching for "alternative" solutions, which could give performance benefits on current technology. These improvements have included, for example:

- Pipelining
- Super-scalar architectures
- OoO execution
- Threaded processors
- Multi-core processors
- etc.

The more switches you can put on silicon, the more you can try to find performance benefits from concurrency. Pipelining & OoO have had a major impact on compiler technology; in the early days, code generation was relatively easy, but in modern days, to get the best possible performance you really need to know the internals of the processors. When writing code in C or D, you really have very minimal possibilities to make your software utilize pipelines and OoO - if the compiler does not do that, your program will not do that.

But at the same time, processors have been made compiler-friendly; since high-level languages use lots of certain instructions and patterns, the processors try to be good at them. If you take a look at the evolution of processors and compare it to the evolution of software design, you will see the impacts of changing from BASIC/assembler programming to compiled HLLs, changing from procedural languages to OO languages, and changing to threaded architectures.

In the BASIC/assembler era, the processor machine language was intended for humans; that was the era of CISC-style processors. Compilers do not need human-readable machine code, and when the compiled languages were taken into use, there was the rise of RISC processors. The procedural languages used lots of calls - the processors were optimized for calling functions quickly. OO introduced intensive use of referring to data via pointers (compared to the data segments of procedural languages); the processors were optimized for accessing memory efficiently via pointers.

How does caching relate to this? A complex memory hierarchy (and in fact the pipelines and OoO, too) is not a desirable and intentional thing; it is a symptom arising from the RASP model. It has been introduced only because it can give performance benefits to software, and the key word here is locality.

Locality - and its natural consequence, distribution - is, in fact, one of the keywords of forthcoming processor models. The next major step in processor architectures is very likely reconfigurable platforms, and they will introduce a whole new set of challenges for compilers and software to fully utilize them. Refer to the PlayStation Cell compiler to get the idea.
At the code level, you really can't design your software to be "reconfigurable-friendly". The best thing is just to keep the code clear, and hope that the compilers get the idea and produce good results. At your software architecture level, if you are using threads, try to keep everything local. The importance of that is only getting higher.
 Are you suggesting that it's not
 something programmers should be aware of?
Yes, I am.
How can you say that? Expecting the tool chain to deal with cache effects would be like expecting it to convert a bubble sort into qsort.
Does the description above answer this question? In case it does not, I'll explain: in general software, don't mess with the cache. Instead, strive for locality and distribution. Use the threading libraries, and when possible, try to do the interactions between threads in some standard way.

If you're writing lower-level code, like a threading library or a hardware driver, you will probably need to know about caching. That is a totally different story, since writing a hardware driver especially introduces many more things to take into account along with caches.
Jul 11 2008
parent BCS <ao pathlink.com> writes:
Reply to Markus,

[lots of stuff]

I think we are seeing the same effects from slightly different perspectives 
and arriving at /only slightly/ different results.

As long as you have code that has a wide fan-out of potential memory access 
("in 42 instructions I might be accessing any of 2GB of data" vs. "I might 
be accessing any of only 32 bytes in the next 200 instructions"), a deep memory 
hierarchy will be needed, because you can't fit 2GB of RAM on the CPU and 
accessing /anything/ off chip (and even things on chip, to some extent) is 
slow. PGA's and PPA's (programmable /processor/ arrays) might be able to 
tackle some of these issues, particularly in highly pipeline-centric programming 
(do X to Y1 through Y1e6), but this still requires that the programmer be aware 
of the CPU/cache/memory stuff.

Also, I have access to 2 P4's, 3 P-III's and a Sparc. So if I want to improve 
performance, my only option is to write better programs, and that's something 
my tool chain can only do so much for. Then I need to know about 
the system. I'm all for better chips. And when they come, I'll (where it's 
needed) optimize my code for them. But till then - and even then - programmers 
need to know what they are working with.
Jul 11 2008
prev sibling parent reply BCS <ao pathlink.com> writes:
Reply to Markus,

 On Fri, 11 Jul 2008 04:17:13 +0000, Markus Koskimies wrote:
 
 I'll read that later.
 
 I read it. It's all about the well-known barrier between processors, memories (RAM) and disks, and the necessity of
The specific effect I was talking about is not in the slides. If you haven't seen the video, you didn't see the part I was referring to.

	int[1000] data;

	thread 1:
	    for(int i = 1_000_000; i; i--) data[0]++;

	thread 2a:
	    for(int i = 1_000_000; i; i--) data[1]++;

	thread 2b:
	    for(int i = 1_000_000; i; i--) data[999]++;

On a multi-core system, run threads 1 and 2a, and then run 1 and 2b. You will see a difference.
Jul 11 2008
parent Markus Koskimies <markus reaaliaika.net> writes:
On Fri, 11 Jul 2008 17:10:50 +0000, BCS wrote:

 The specific effect I was talking about is not in the slides. If you
 haven't seen the video, you didn't see the part I was referring to.
 
 
 int[1000] data;
 
 thread 1:
    for(int i = 1_000_000; i; i--) data[0]++;
 
 thread 2a:
    for(int i = 1_000_000; i; i--) data[1]++;
 
 thread 2b:
    for(int i = 1_000_000; i; i--) data[999]++;
 
 
 On a multi-core system, run threads 1 and 2a, and then run 1 and 2b. You
 will see a difference.
Sure I will. In the first example the caches of the processor cores will be constantly negotiating the cache contents. If you are writing a program with threads intensively accessing the same data structures, you need to know what you are doing. There is a big difference between doing:

	1)	int thread_status[1000];
		thread_code() { ... thread_status[my_id] = X ... }

	2)	Thread* threads[1000];
		Thread { int status; run() { ... status = X ... } }

In the first example, you use a global data structure for threads, and that can always cause problems. The entire cache system is based on locality; without locality in software it will not work. In that example, you would need to know the details of the cache system to align the data correctly.

In the second example, the thread table is global, yes; but the data structures for the threads get allocated from the heap, and they are local. Whether they are allocated from the same cache line or not depends on the operating system as well as the runtime library (the implementation of the heap; does it align the blocks to cache lines or not).

When doing threaded code, I would always suggest trying to minimize accesses to global data structures, and trying to always use local data. Most probably every forthcoming processor architecture will try to improve the effectiveness of such threads. I would also try to use the standard thread libraries, since they try to tackle the machine-dependent bottlenecks.
Jul 11 2008
prev sibling parent JAnderson <ask me.com> writes:
BCS wrote:
 Reply to Markus,
 
 On Thu, 10 Jul 2008 12:51:47 -0700, Walter Bright wrote:

 Robert Fraser wrote:

 In a final release, unused things are signs of errors. When writing
 code, unused variables (perhaps they were used in a commented-out
 section?) are a dime a dozen.
Yes, that's why I find the warning to be a nuisance, not a help.
I'd been coding for awhile, and I have hard times to remember the last time I have had unused vars or other pieces of code, that would be there intentionally... Sure, I may have got some sort of brain-damage due to heavy use of "gcc -Wall" and "lint" to affect to my coding style so that I unconsciously avoid warning-generating sketchs... About those unused imports (mentioned by Robert) - I think that the compiler could stop also with those. I was just coming to that subject :D
 One case where extra vars might be added is as padding for cache effects.
In C++ this is not really a problem. There are two ways to deal with it: one, reference the variable with a bare "var;" statement, and two, don't give the variable a name.

-Joel
Jul 29 2008
prev sibling parent reply "Bruce Adams" <tortoise_74 yeah.who.co.uk> writes:
On Thu, 10 Jul 2008 21:32:23 +0100, Markus Koskimies  
<markus reaaliaika.net> wrote:

 About those unused imports (mentioned by Robert) - I think that the
 compiler could stop also with those. I was just coming to that subject :D
Unused symbols must not be a warning in the case of shared libraries. They must be errors when called from functions in a real program as this is most likely a bug and is what we have come to expect from C programs. For unreachable code I suppose it doesn't matter so much. I would still be inclined to err on the side of caution and require a stub rather than trying to be clever on the sly (which might go wrong). It is of course reasonable to have symbols that are resolved at runtime by a dynamic linker. These probably should be annotated specially in the code. But please for gawd's sake not __declspec(dllimport) because that's just sick and twisted. Regards, Bruce.
Jul 14 2008
parent Markus Koskimies <markus reaaliaika.net> writes:
On Mon, 14 Jul 2008 21:51:16 +0100, Bruce Adams wrote:

 On Thu, 10 Jul 2008 21:32:23 +0100, Markus Koskimies
 <markus reaaliaika.net> wrote:
 
 About those unused imports (mentioned by Robert) - I think that the
 compiler could stop also with those. I was just coming to that subject
 :D
Unused symbols must not be a warning in the case of shared libraries.
You mean unused local vars or private members in classes? (Although I learned a short time ago that private members in D are not invisible to other code.) Certainly public members in classes/modules are never "unused", since you may link them with different code and they become used.
Jul 14 2008
prev sibling next sibling parent Ary Borenszweig <ary esperanto.org.ar> writes:
Walter Bright a écrit :
 Robert Fraser wrote:
 In a final release, unused things are signs of errors. When writing
 code, unused variables (perhaps they were used in a commented-out
 section?) are a dime a dozen.
Yes, that's why I find the warning to be a nuisance, not a help.
Related to this, but not specifically to your post: I found some unused variables in DMD's front-end code. I don't know if they are there for a reason. Should I post them as bugs or something like that?
Jul 10 2008
prev sibling next sibling parent reply Markus Koskimies <markus reaaliaika.net> writes:
On Thu, 10 Jul 2008 12:51:47 -0700, Walter Bright wrote:

 Robert Fraser wrote:
 In a final release, unused things are signs of errors. When writing
 code, unused variables (perhaps they were used in a commented-out
 section?) are a dime a dozen.
Yes, that's why I find the warning to be a nuisance, not a help.
I have now thought about this issue more closely for a few hours. I'll try to write a more comprehensive answer on my home pages, but here is something quickly thought out;

* What is the purpose of source code?

There are probably lots of answers, but I will present just one; source code is aimed at human readers & writers. Certainly this does not mean that the source should be readable by John Doe - certainly, you need to learn the practices used by a programming language and get familiar with them before you can say anything about the practicality of that language.

An ultra-short history; I have been programming for about 30 years, and have done it for a living for about 20 years. My "brain damage" is that I have mostly programmed embedded systems, DSPs, MCUs and microcontrollers, so people more familiar with Win/Linux programming can tell me where I'm wrong. I have never been a programming language purist. In fact, most of my colleagues think that I'm a misuser of OOP, since I see nothing wrong with using gotos, large switch-cases or God objects, if they just work.

I think that source code is not for the compiler. Compilers can deal with languages that have no syntactic salt at all, they do not require comments, and they generally don't give a damn about indentation. No, source code is not meant to be compiler-friendly (although it is very good when it is, for completely different reasons); instead it is meant to be read by a human - it is the bridge between informal humans and formal compilers.

How could you improve your source code specification? I think that there is just one answer, which D at the moment follows; the more complex the programs you can understand by reading the source (being familiar with that specific language), the better it is. I know that sounds somewhat humanist, but really - if source is not meant to be understood by the limited capabilities of humans, why the hell are we using (1) modular languages, (2) high-level languages, (3) _indentation_, (4) __COMMENTS?!?__?!? Tell me that source code is fundamentally aimed at something other than being understood by a human familiar with the language, and I'll reconsider the following.

---

Since source code is aimed at humans, to let them understand ever more complex structures, what is the purpose of warnings?

Many of us D users have long experience with C, C++ and Java. Many of us are well aware of the problems in those languages. There is a very solid reason that I - and we - nowadays use D for our free-time activities, and why I'm advocating that C++ people give D a try (I'm very sad that, having been a few years out of the community, there is currently a big fight between DMD-GDC-Tango-Phobos; that is something we all need to solve to really make D the future programming language - which it can be IMO!)

But those warnings? I know that some readers think that warnings are something that indicates inadequacies in the design of the programming language. But - referring to the previous parts - what is the sole reason for a source language? All of you who think that D does not need any more warnings, please answer this question: here are two pieces of source code, and let's assume that both of them are valid D;

1)
	class A
	{
		int thing;
		int getThing() { return thing; }
	};

2)
	class A
	{
		private void addThatThing() { ... }
		int a = 0;
		void *getThatProperty() { ... }
		template A(T) { ... }

		int getThing()
		{
			template A(int) { possiblyReturn(1); }
			if(itWasNot!(int()) return somethingElse();
			return module.x.thatThing();
		}
	}

Which one is easier to understand? Yes, I know that the examples are very extreme, but think about this; if you allow the intermediate code to exist, does it really make the language better? Consider that both of the examples were made by a very experienced D programmer.

From the compiler's point of view, what is the big difference? Nothing. As long as the complex thing follows the correct syntax, the compiler is happy; but does that mean the source is really understandable by other readers? Does it really follow good programming practices - and more importantly, do we need to push people to follow good programming practices? Please, tell me that complex source code is not more error-prone, and I will stay silent on this issue for the rest of my life!

---

For me, the answer is very clear; yes, we need to guide people to follow good practices. From a human point of view (and I have lots of experience of reading other people's sources), there is a big difference between writing an infinite loop in the following ways;

1)
	for(;;) { ... }

2)
	int a = -1;
	int b = (a - 2);
	for(uint i = 0; (a*2) + (b*3) < i; i++) { ... }

The first one you recognize as an infinite loop in less than a millisecond. The second one requires careful examination - let's say a minute, which is 60,000 times longer than the first one. From the point of view of the compiler, they make no difference. The compiler easily detects the infinite loop and drops all unnecessary parts.

Could you resolve the situation by adding more syntactic salt? Like I said earlier, I think that there is a certain limit to adding that. It is not useful to tell the compiler everything with reserved keywords; instead the compiler could steer you towards common good programming practices - those messages are called warnings; they do not necessarily prevent the compiler from generating code, but they flag code that does not follow the guidelines of human understandability.

In the end, I know that there is great resistance against C/C++ in the D community (I understand it, but I'm not signing on to it). What I am asking for is to follow the principles stated in the "D Overview", and common sense; the language is not meant for language purists, instead it is meant for those of us trying to write something useful, as easily as it can be achieved. That is the real power of D; and it really DOES NOT mean that D compilers should not have warnings!
Jul 10 2008
parent reply BCS <ao pathlink.com> writes:
Reply to Markus,

 On Thu, 10 Jul 2008 12:51:47 -0700, Walter Bright wrote:
 
 Robert Fraser wrote:
 
 In a final release, unused things are signs of errors. When writing
 code, unused variables (perhaps they were used in a commented-out
 section?) are a dime a dozen.
 
Yes, that's why I find the warning to be a nuisance, not a help.
 I have now thought about this issue more closely for a few hours. I'll
 try to write a more comprehensive answer on my home pages, but here is
 something quickly thought out;
 
 * What is the purpose of source code?
 
 There are probably lots of answers, but I will present just one; source
 code is aimed at human readers & writers.
[...]
 I know that sounds somewhat humanist, but really - if source is not
 meant to be understood by the limited capabilities of humans, why the
 hell are we using (1) modular languages, (2) high-level languages, (3)
 _indentation_, (4) __COMMENTS?!?__?!?
 
counter point: perl/regex

counter counter point: smalltalk (I've never seen it, but I've been told...) <g>

all (most?) programming languages are designed to be written (lisp?), some are designed to be read. The purpose of a programming language is to tell the compiler what to do in a way that humans /can deal with/, saying nothing about how well.
Jul 11 2008
parent reply "Nick Sabalausky" <a a.a> writes:
"BCS" <ao pathlink.com> wrote in message 
news:55391cb32f16f8cab1577d9fd978 news.digitalmars.com...
 Reply to Markus,

 On Thu, 10 Jul 2008 12:51:47 -0700, Walter Bright wrote:

 Robert Fraser wrote:

 In a final release, unused things are signs of errors. When writing
 code, unused variables (perhaps they were used in a commented-out
 section?) are a dime a dozen.
Yes, that's why I find the warning to be a nuisance, not a help.
 I have now thought about this issue more closely for a few hours. I'll
 try to write a more comprehensive answer on my home pages, but here is
 something quickly thought out;
 
 * What is the purpose of source code?
 
 There are probably lots of answers, but I will present just one; source
 code is aimed at human readers & writers.
[...]
 I know that sounds somewhat humanist, but really - if source is not
 meant to be understood by the limited capabilities of humans, why the
 hell are we using (1) modular languages, (2) high-level languages, (3)
 _indentation_, (4) __COMMENTS?!?__?!?
counter point: perl/regex
I'm not sure that's much of a counterpoint, since those are widely considered by everyone except total die-hards to be unsuitable for most non-trivial tasks.
Jul 11 2008
parent BCS <ao pathlink.com> writes:
Reply to Nick,

 "BCS" <ao pathlink.com> wrote in message
 news:55391cb32f16f8cab1577d9fd978 news.digitalmars.com...
 
 counter point: perl/regex
 
I'm not sure that's much of a counterpoint since those are widely considered by everyone exept total die-hards to be unsuitable for most non-trivial tasks.
but it is still a programming language (and I was sort of making a joke)
Jul 11 2008
prev sibling parent Bruno Medeiros <brunodomedeiros+spam com.gmail> writes:
Walter Bright wrote:
 Robert Fraser wrote:
 In a final release, unused things are signs of errors. When writing
 code, unused variables (perhaps they were used in a commented-out
 section?) are a dime a dozen.
Yes, that's why I find the warning to be a nuisance, not a help.
That's because you think warnings should be errors, which they shouldn't be (see my other post about "cautions"). Unused variables in code should only generate "caution" messages, not errors.

-- 
Bruno Medeiros - Software Developer, MSc. in CS/E graduate
http://www.prowiki.org/wiki4d/wiki.cgi?BrunoMedeiros#D
Jul 27 2008
prev sibling next sibling parent reply Era Scarecrow <rtcvb32 yahoo.com> writes:
 Hmmh, if I would do something like that, I would do it like
 this:
 
 bool isPrime(int number, int[] /* primesList */)
 {
 	bool prime = true;
 
 	static if(false)
 	{
 		// Partial implementation; should still be
 		// syntactically valid D
 		foreach(pnum; primesList) { ... }
 	}
 	else
 	{
 		for(int cnt = 2; ...) { ... }
 	}
 
 	return prime;
 }
Closer to what I mean. Wouldn't it throw an error, though, that you have foreach(pnum; primesList) where primesList is commented out? Of course, a smart compiler will probably see it as 'don't compile this code' and treat it as a large comment attached to the else.
  In the middle of my project, I'll make sure all my block
  openings/closings are good, and then compile the code, which I will
  never run; instead I use the errors and warnings to find certain
  bugs and simple problems early, rather than do an entire file and
  then trace all the bugs at once.
That's exactly the same way I work with code. I keep adding blocks and continuously compile to see if there are warnings or errors. If there were a situation like above, and I had just added that list-scanning part but was still unsure if it works, I would add an assert or similar to stop the program immediately if it goes there:

	bool isPrime(int number, int[] primeList)
	{
		if(primeList)
		{
			assert(false); // Don't go here at the moment
			foreach(pnum; primeList) { ... }
		}
		else
		{
			for(int cnt; ...;)
				if(!(number%cnt)) return false;
			return true;
		}
	}
The reason I had the 'primesList = null;' is so the functionality of isPrime() could start to be used and implemented right away, checking for a prime, since primesList is only to be used as a speedup later; the purpose of the code is to check whether the number is a prime. If I wanted to use isPrime (before I got the other half implemented), I'd have to intentionally pass a null, thereby removing the intended purpose of its later existence: speed. As the example stands, I could implement all my code with primesList as an array filled with an appropriate number of primes to give me the result for a prime very fast, but if it isn't implemented, it still works the way I had it.

	assert(false); //automatically makes the function fail regardless.

Doing it this way, I can only run it if I pass a null. Perhaps 'static if(false){...}' is better in this scenario, as you said before. But I also wanted to ensure I had my variable names' spelling and connections working right, even if the rest of the code wasn't ready just yet.
 But anyway, if I were starting to make that part, build-stopping
 warnings about unused code and similar would not be a big issue any
 more, since I would not be executing the program until I got the
 errors & warnings away.
Agreed. Build code, get it to work quickly and easily. Then refactor out un-needed parts, and remove as many warnings as possible without making it ugly just to get the compiler to shut up about said warning.
 Static if is something like #if-#endif in C/C++, but since it is in
 the compiler, it knows all about the constants & types in the code
 (unlike the C preprocessor) \o/
That's nice to know. I'll start using it where applicable.
 Assume that (a+b) is always smaller than 8, which causes an infinite
 loop. The compiler is able to know that. If the programmer was
 intentionally making an infinite loop, why didn't he use "for(;;)" or
 "while(true)"? If the compiler gives an error that the expression is
 always true, I think there are two possibilities:
 
 1) The programmer was checking out how smart the compiler is, by
 expressing an infinite loop in a hard way; to get the code compiled,
 he should change it to a regular "for(;;)" or "while(true)".
I can mostly see that only in instances of people trying to make better checkers by making them fail any way possible, or making sure the error comes up in a later build. Either way, if (a+b) always evaluates to less than 8, and there's no way for a or b to go up or change to become something higher (or a break somewhere to get out of the loop), most likely something is missing - unless they have some 'clever code', which then becomes a pain to debug later.
 2) The programmer didn't notice that the values he was using are
 never evaluated greater than the intended loop termination condition.
 He re-examines the code and probably finds something that he was
 missing in the first attempt (forgot to add some other value, forgot
 to change the loop termination condition, ...)
 
 A win-win situation, isn't it?
Only if he's actually warned about the infinite loop. Would this warn you? (I will check myself later)

	int a,b;

	for (;(a+b)<8;)
	{
	    if (a>=0)
	        a++;
	    if (b<=0)
	        b--;
	}

Both a and b change. Is it smart enough to know (a+b) should ALWAYS be 0? (or -1 for a brief time) Or...

	{
	    //...somewhere in the code.
	    a++;
	    b-=a;
	    a=b;
	}

I'm not sure if the compiler will be quiet about this either, but logically something's changing constantly, so we know there's a potential for an eventual false statement somewhere.
 Yes, right. Of course, the compiler should not give any kind of
 error if the loop is clearly intentionally made infinite, or if the
 condition is clearly intentionally made constant:
 
 	while(true) ...		// No error
 	if(true) ...		// No error
 
 	auto a=file.getSize("myfile.txt");	// ulong
 	...
 	while( a != -1) ...	// Error
 
 In fact, I would change the default behavior of the compiler so that
 it uses all warnings and regards them as errors. For testing out
 incomplete code, a "--cmon-relax" flag could be used :)
Sounds good to me. It just seems like there's a delicate balance where all warnings are errors, and not all warnings can be dealt with without some type of ugly workaround (in some situations) where a warning would be better than an error.

I honestly believe that unused variables shouldn't be an error, at least until I've finished building a working version of the function. Then I start refactoring and removing un-needed parts. Once ready for release, all warnings that can be removed are treated as errors.

The last thing I personally want to do is put in variables I know I need, and then have to comment them out just to get the compiler to be quiet because they aren't used, just so I can confirm some basic logic right before working on the block of code that would have held the variables I just commented out. The annoyance of jumping back and forth, when I could have just left them alone to begin with, is mostly what I'm getting at. Production and final code? Variables that aren't used should be errors.
Jul 10 2008
parent Markus Koskimies <markus reaaliaika.net> writes:
On Thu, 10 Jul 2008 15:05:30 -0700, Era Scarecrow wrote:

 Hmmh, if I would do something like that, I would do it like this:
 
 bool isPrime(int number, int[] /* primesList */) {
 	bool prime = true;
 
 	static if(false)
 	{
 		// Partial implementation; should still be
 		// syntactically valid D
 		foreach(pnum; primesList) { ... }
 	}
 	else
 	{
 		for(int cnt = 2; ...) { ... }
 	}
 
 	return prime;
 }
 Closer to what I mean. Wouldn't it throw an error, though, that you have foreach(pnum; primesList) where primesList is commented out?
The block inside a static if is only syntactically checked. It does not have to have valid references to variables, classes or anything like that. As I said, it is just like #if...#endif, with the exception that (1) static if knows all about the code around it (i.e. you can use e.g. sizes of some aliased types in the expression) and (2) static if allows only syntactically correct blocks inside it (that block is parsed, but nothing else is done with it).
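A tiny sketch of that property (the symbol is deliberately undeclared):

	void sketch()
	{
	    static if(false)
	    {
	        // Must parse, but is never semantically checked, so
	        // this compiles even though noSuchSymbol is never
	        // declared anywhere:
	        noSuchSymbol(42);
	    }
	}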
 Of course, a smart compiler will probably see it as 'don't compile this
 code' and treat it as a large comment attached to the else.
A smart compiler recognizes when the attempt is intentional and when it is more likely spurious. Like I said, there is a big difference between making infinite loops like:

	1) while(true) { ... }
	2) while(-4 < -2) { ... }

..or commenting out:

	1) if(false) { ... }
	2) if(-1 > 0) { ... }

(You catch the point? If you didn't, consider that the compiler has evaluated a complex expression down to something corresponding to the second examples.)
  The reason I had the 'primesList = null;' is so the functionality
  of isPrime() could start to be used and implemented right away,
  checking for a prime, since primesList is only to be used as a speedup
  later; the purpose of the code is to check whether the number is a
  prime.
Yes, I understood that. There are plenty of possibilities to do that even with "full C++-like warnings" (D has only a ridiculously small set of them), even if warnings are treated as errors.
 assert(false); //automatically makes the function fail regardless.
 
  Doing it this way, I can only run it if I pass a null.
Yes, true, but if it is not working, why would you want execution to go there? That was just an example of how to enable code, and how to guard it from being executed while testing other parts. [...]
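Concretely, the guard looks like this in the isPrime sketch (the trial-division body is my own assumption, not the original code):

    // Guarding an unfinished path: the assert makes sure nobody reaches
    // the not-yet-written speedup while the rest is being tested.
    bool isPrime(int number, int[] primesList)
    {
        if(primesList.length != 0)
            assert(false, "primesList speedup not implemented yet");

        // slow path, known to work, gets exercised in the meantime
        for(int cnt = 2; cnt * cnt <= number; cnt++)
            if(number % cnt == 0)
                return false;
        return number >= 2;
    }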
  Agreed. Build code, get it to work quickly and easily. Then refactor
  out un-needed parts, and remove as many warnings as possible without
  making the code ugly just to get the compiler to shut up about a given
  warning.
Yes, that's exactly how I work!
 Assume that (a+b) is always smaller than 8, which causes an infinite
 loop. The compiler is able to know that. If the programmer was
 intentionally making an infinite loop, why didn't he use "for(;;)" or
 "while(true)"? If the compiler gives an error that the expression is
 always true, I think there are two possibilities:
 
 1) The programmer was checking how smart the compiler is, by expressing
 an infinite loop in a hard way; to get the code compiled, he should
 change it to a regular "for(;;)" or "while(true)".
I can mostly see that happening when someone is trying to make a better checker by making it fail in every way possible, or making sure the error comes up in a later build. Either way, if (a+b) < 8 always evaluates to true, and there's no way for a or b to go up or otherwise become something higher (and no break somewhere to get out of the loop), most likely something is missing; unless they have some 'clever code', which then becomes a pain to debug later.
I completely agree!
 2) The programmer didn't notice that the values he was using never
 evaluate to something greater than the intended loop termination
 condition. He re-examines the code and probably finds something that he
 missed in the first attempt (forgot to add some other value, forgot to
 change the loop termination condition, ...)
 
 A win-win situation, isn't it?
Only if he's actually warned about the infinite loop.
In the case that the compiler has produced an infinite loop by optimizing away unnecessary parts, and it recognizes that the source says something else (i.e. not "while(true)"), wouldn't it be polite to tell that to the programmer? If the compiler is dumb enough not to fold expressions like:

	while(-2 < -1) { ... }

...then there is of course no need to produce a warning (i.e. if the compiler generates code that loads -2 into one register and -1 into another and really performs a cmp between them).
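Put side by side, the distinction looks like this (a sketch; the breaks are only there so it terminates if actually run):

    void loops()
    {
        while(true)       // clearly intentional; a compiler should stay quiet
        {
            break;        // body elided for the sketch
        }

        while(-2 < -1)    // folds to a constant true; worth a complaint
        {
            break;        // body elided for the sketch
        }
    }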
 int a, b;
 
 for (; (a + b) < 8; )
 {
     if (a >= 0)
         a++;
     if (b <= 0)
         b--;
 }
 
 Both a and b change. Is the compiler smart enough to know (a+b) should
 ALWAYS be 0? (Or +1 for a brief moment between the two ifs.)
Oh my god. It would take me a while to really analyze that kind of code. If that kind of loop turned up in a review I was doing, I would ask the programmer to make some simplifications to the loop... In any case, quickly analyzed, it would easily produce an infinite loop: a grows as long as it is at least zero, while b decreases as long as it is at most zero. Likely outcomes:

1) a < 0 ==> if b < 0, almost infinite; otherwise, if (a+b) >= 8, it would not take a single round;
2) a > 0 ==> if b < 0, an infinite loop; a and b compensate each other;
3) otherwise, many nondeterministic behaviors: from 0 to 8 rounds, and if more than 8, it would lead to an almost infinite loop.
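Quickly checked with a capped trace (a sketch; note that D default-initializes ints to zero, so here both start at 0, and the cap is only there to keep the demo finite):

    import std.stdio;

    void main()
    {
        int a, b;      // both default to 0 in D
        int rounds;
        for (; (a + b) < 8; )
        {
            if (a >= 0) a++;
            if (b <= 0) b--;
            writeln("a=", a, " b=", b, " a+b=", a + b);
            if (++rounds == 5) break;   // cap: the real loop never ends
        }
    }

Every line of output reads a+b=0, so the condition (a+b) < 8 never fails.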
 or...
 
 {
     // ...somewhere in the code.
     a++;
     b -= a;
     a = b;
 }
The behaviour would depend hugely on the initial values of a and b. Many non-deterministic results, and because of the b -= a line it could lead to infinite loops.
 Not sure if the compiler would be quiet about this either, but logically
 something is changing constantly, so we know there's at least the
 potential for the condition to eventually become false.
Compilers cannot generally catch the things you presented above. They catch much simpler things, most likely mistakes made because you forgot something.
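The kind of thing they do catch reliably looks more like this (a sketch, echoing the getSize example earlier in the thread):

    void check()
    {
        ulong size = 100;   // imagine a file size from some API
        if (size != -1)     // -1 converts to ulong.max, so this test is
        {                   // almost always true; not what was meant
            // ...
        }
    }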
  The last thing I personally want to do is put in variables I know I
  need, then have to comment them out just to get the compiler to be
  quiet because they aren't used, just so I can confirm some basic logic
  right before working on the block of code that would have used those
  variables. The annoyance of jumping back and forth, when I could have
  just left them alone to begin with, is mostly what I'm getting at.
  Production code and final code? Variables that aren't used should be
  errors there.
As I said, I have used "g++ -Wall" and "lint" so much while earning my salary in private companies (which generally don't give a shit about fancy things; it just needs to work, and you are forced to write code like you're in the army: no will of your own, and suggestions may be considered next year) that I have gotten used to sketching code so that it does not generate warnings. My bad, indeed, although I'm not the biggest fan of complete freedom in how code is written either (i.e. I like good practices).
Jul 10 2008
prev sibling next sibling parent reply Era Scarecrow <rtcvb32 yahoo.com> writes:
 superdan Wrote:
 a / b is never larger than a (cept for signed/unsigned mixed shit).
a = -10; b = -5
result is -15, done on a simple calculator (unless Win32 Calc.exe is wrong)

Wow, this just reminds me how much I prefer using unsigned numbers for all my personal work.

This reminds me, just as a little question, nothing big: is there a BigInt class we're supposed to be using for D? (Besides the one in trunk; there was talk of not being sure about its license.)
Jul 11 2008
next sibling parent Markus Koskimies <markus reaaliaika.net> writes:
On Fri, 11 Jul 2008 13:04:04 -0700, Era Scarecrow wrote:

 superdan Wrote:
 a / b is never larger than a (cept for signed/unsigned mixed shit).
 a = -10; b = -5
 result is -15, done on a simple calculator (unless Win32 Calc.exe is wrong)
$ python -c "print -10/-5" 2 $
Jul 11 2008
prev sibling next sibling parent Bill Baxter <dnewsgroup billbaxter.com> writes:
Era Scarecrow wrote:
 superdan Wrote:
 a / b is never larger than a (cept for signed/unsigned mixed shit).
 a = -10; b = -5
 result is -15, done on a simple calculator (unless Win32 Calc.exe is wrong)
Uhh, I think your method of entry was wrong. It's +2. Y'know I don't think this problem was meant to require a calculator... --bb
Jul 11 2008
prev sibling parent Robert Fraser <fraserofthenight gmail.com> writes:
Era Scarecrow Wrote:

 superdan Wrote:
 a / b is never larger than a (cept for signed/unsigned mixed types).
 a = -10; b = -5
 result is -15, done on a simple calculator (unless Win32 Calc.exe is wrong)
You sure you're not using one of these calc.exe versions?: http://thedailywtf.com/Articles/OMG-Finalist-Week-Conclusion--Voting.aspx
Jul 11 2008
prev sibling next sibling parent Robert Fraser <fraserofthenight gmail.com> writes:
superdan Wrote:
 Robert Fraser Wrote:
 
 Don Wrote:
 
 superdan wrote:
 4. warning - switch statement has no default
 
 another example of a motherfuck. just require total coverage. in
 closed-set cases i routinely write anyway:
 
 switch (crap) 
 {
 case a: ...; break;
 case b: ...; break;
 default: assert(crap == c); ...; break;
 }
 
 again: vast majority of code already has a default. the minority just
 has to add a little code. make it an error.
Yup. Make it an error.
I agree with everything else, but this one I think shouldn't be an error or warning (the implicit assert(0) is enough). This is because the vast majority of switch statements I use (and many I see) are over enums, and if every branch in the enumeration is covered, a pointless "default" will just complicate code.
you are not disagreeing. switching over an enum is already closed if you mention all cases. the compiler knows that. it should indeed just throw an error if you have an out-of-range value that you forged from an int. but that's an uncommon case. don't make all pay for a rare bug.
Ah, in that case we are in agreement. But as of a few versions ago, DMD still gives a warning even if you use all the possible values of an enum in a switch statement.
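For reference, superdan's closed-set pattern over an enum looks something like this (a sketch; the enum and names are invented):

    enum Color { red, green, blue }

    // Every member is listed, so the default can only fire for a value
    // forged from an int -- the rare bug mentioned above.
    string name(Color c)
    {
        switch (c)
        {
            case Color.red:   return "red";
            case Color.green: return "green";
            case Color.blue:  return "blue";
            default: assert(false, "forged enum value");
        }
    }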
Jul 11 2008
prev sibling parent Era Scarecrow <rtcvb32 yahoo.com> writes:
 $ python -c "print -10/-5"
 2
 $
 Uhh, I think your method of entry was wrong.  It's +2.
 Y'know I don't think this problem was meant to
 require a calculator...
No, I don't work much with negative numbers. It wasn't until 3 minutes after I posted that I tried another negative division to see its reaction, only to realize that when I did 100 / -3, the result was 97. M$ Calc said 'oh, you meant minus, not divide' and my result was utterly wrong. My bad on that part.
Jul 11 2008