
digitalmars.D - Why D doesn't have warnings

reply "Walter" <newshound digitalmars.com> writes:
Check out this excerpt from:

http://acmqueue.com/modules.php?name=Content&pa=showpage&pid=160

"Of course, this has been tried before-most compilers generate various
warnings when they encounter questionable code. Old-time Unix/C programmers
will certainly recall lint(1), a code-checker that did cross-file error
checking and parameter type matching. These tools have existed for years but
are not popular. Why? Because they generate a lot of warnings, and, as
countless software engineers have pointed out, it's time-consuming to sift
through the spurious warnings looking for the ones that really matter. I've
got news for them: there is no such thing as a warning that doesn't matter.
That's why it warns you. Anyone who has worked with enough code will tell
you that, generally, software that compiles without warnings crashes less
often. As far as I'm concerned, warnings are for wimps. Tools such as
lint(1) and DevStudio should not issue warnings: they should decide if
they've found an error and stop the build process, or they should shut up
and generate code."
Jun 28 2004
next sibling parent reply J C Calvarese <jcc7 cox.net> writes:
Walter wrote:
 Check out this excerpt from:
 
 http://acmqueue.com/modules.php?name=Content&pa=showpage&pid=160
 
 "Of course, this has been tried before-most compilers generate various
 warnings when they encounter questionable code. Old-time Unix/C programmers
 will certainly recall lint(1), a code-checker that did cross-file error
 checking and parameter type matching. These tools have existed for years but
 are not popular. Why? Because they generate a lot of warnings, and, as
 countless software engineers have pointed out, it's time-consuming to sift
 through the spurious warnings looking for the ones that really matter. I've
 got news for them: there is no such thing as a warning that doesn't matter.
 That's why it warns you. Anyone who has worked with enough code will tell
 you that, generally, software that compiles without warnings crashes less
 often. As far as I'm concerned, warnings are for wimps. Tools such as
 lint(1) and DevStudio should not issue warnings: they should decide if
 they've found an error and stop the build process, or they should shut up
 and generate code."
I much prefer how D makes me fix irregularities in my code by using only errors and no warnings. During the little bit of time that I've worked with C, I didn't often have the self-discipline to start fixing the warnings until they started to scroll off the screen. (tsk! tsk!)

--
Justin (a/k/a jcc7)
http://jcc_7.tripod.com/d/
Jun 28 2004
parent reply Daniel Horn <hellcatv hotmail.com> writes:
I agree for the most part... There's one exception to your steadfast 
rule if the following two conditions apply:
a) the warning may be turned off (and probably is off by default)
b) the compiler errors on said warning.

Then it's just the compiler helping out the user who WANTS to be helped.

Case in point of course is boolean logic.

We should have an optional warning flag passed into the compiler when 
the user uses a boolean expression as an int or vice versa... the 
warning may be off for most devels, and for those who deign to turn it 
on, it would error on said warning.  Of course libs that shipped with 
the compiler would have it enabled for compatibility...

the flag could look like
-Wstrong-boolean -Werror
;-)
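
To make that concrete, here is a minimal sketch (invented code; the flag itself is hypothetical) of the int/boolean mixing such a mode would reject:

void example()
{
    int count = 3;
    if (count)             // int used as a boolean: legal D, but the
        count = 0;         // hypothetical -Wstrong-boolean -Werror mode
                           // above would reject it
    int n = (count == 0);  // boolean result stored in an int: also legal
                           // today, also rejected under the strict mode
}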

J C Calvarese wrote:
 Walter wrote:
 
 Check out this excerpt from:

 http://acmqueue.com/modules.php?name=Content&pa=showpage&pid=160

 "Of course, this has been tried before-most compilers generate various
 warnings when they encounter questionable code. Old-time Unix/C 
 programmers
 will certainly recall lint(1), a code-checker that did cross-file error
 checking and parameter type matching. These tools have existed for 
 years but
 are not popular. Why? Because they generate a lot of warnings, and, as
 countless software engineers have pointed out, it's time-consuming to 
 sift
 through the spurious warnings looking for the ones that really matter. 
 I've
 got news for them: there is no such thing as a warning that doesn't 
 matter.
 That's why it warns you. Anyone who has worked with enough code will tell
 you that, generally, software that compiles without warnings crashes less
 often. As far as I'm concerned, warnings are for wimps. Tools such as
 lint(1) and DevStudio should not issue warnings: they should decide if
 they've found an error and stop the build process, or they should shut up
 and generate code."
I much prefer how D makes me fix irregularities in my code by using only errors and no warnings. During the little bit of time that I've worked with C, I didn't often have the self-discipline to start fixing the warnings until they started to scroll off the screen. (tsk! tsk!)
Jun 28 2004
parent reply "Walter" <newshound digitalmars.com> writes:
The difficulty with your approach is that when you pass on the code to
someone else, they are faced with "is it a bug or is it ok". Compilation
should be a binary pass/fail, not something that sort of seems to compile,
but who knows if it really did <g>.


"Daniel Horn" <hellcatv hotmail.com> wrote in message
news:cbq6rb$2ckl$1 digitaldaemon.com...
 I agree for the most part... There's one exception to your steadfast
 rule if the following two conditions apply:
 a) the warning may be turned off (and probably is off by default)
 b) the compiler errors on said warning.

 Then it's just the compiler helping out the user who WANTS to be helped.

 Case in point of course is boolean logic.

 We should have an optional warning flag passed into the compiler when
 the user uses a boolean expression as an int or vice versa... the
 warning may be off for most devels, and for those who deign to turn it
 on, it would error on said warning.  Of course libs that shipped with
 the compiler would have it enabled for compatibility...

 the flag could look like
 -Wstrong-boolean -Werror
 ;-)
Jun 28 2004
parent reply Arcane Jill <Arcane_member pathlink.com> writes:
In article <cbqauv$2ilu$1 digitaldaemon.com>, Walter says...
The difficulty with your approach is that when you pass on the code to
someone else, they are faced with "is it a bug or is it ok". Compilation
should be a binary pass/fail, not something that sort of seems to compile,
but who knows if it really did <g>.
The problem, Walter, is that we don't all agree on what is or is not an error. I believe that the following SHOULD be an error:

Now, we all know that a strong boolean type would render that an error, but even /that/ wouldn't catch ALL boolean type errors. I wrote a line of code the other day which contained a subtle bug. I wrote:

Now, the second == should have read =, obviously, but it compiled fine. And EVEN IF we'd have had a strong boolean type, it would STILL have compiled fine. But it /was/ a bug, and I think that it COULD have been spotted by a sufficiently intelligent compiler.

Now, suppose, along comes a second, rival D compiler, which is intelligent enough to spot this circumstance and point it out to the user. Should it do so? Should it say "warning - using an equality test as a statement is a really dumb idea"? Or should it go with the idea that DMD's behavior is definitive, because "conflicting standards are bad"?

So long as there exists only one D compiler, there is no use for warnings. But, enter Jill's imagined, hypothetical super-strict compiler designed to catch as many bugs as possible at compile-time - it would be quite reasonable for that hypothetical compiler to offer an "allow anything that DMD allows" option, in which case, that would be a good time to issue warnings. No?

Jill
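
(The snippets themselves are missing above; the following is an invented illustration, not Jill's original code, of the bug class she describes:)

void update(int a, int b)
{
    int c;
    if (a == b)
        c == b;   // meant "c = b": the second "==" should have been "=".
                  // In the D of the time this compiled cleanly, the
                  // comparison result silently discarded; later DMD
                  // releases reject such statements as having no effect.
}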
Jun 28 2004
next sibling parent Sean Kelly <sean f4.ca> writes:
In article <cbr37b$l31$1 digitaldaemon.com>, Arcane Jill says...
So long as there exists only one D compiler, there is no use for warnings. But,
enter Jill's imagined, hypothetical super-strict compiler designed to catch as
many bugs as possible at compile-time - it would be quite reasonable for that
hypothetical compiler to offer a "allow anything that DMD allows" option, in
which case, that would be a good time to issue warnings. No?
I tend to think of Walter's compiler as a reference implementation. Your example is a completely legitimate extension for a third-party compiler, provided it defaults to off. The problem in my mind is when third-party compilers display different behavior from one another. This is the case with current C++ compilers, and as someone who treats all warnings as errors this drives me crazy :)

Sean
Jun 29 2004
prev sibling parent reply "Walter" <newshound digitalmars.com> writes:
"Arcane Jill" <Arcane_member pathlink.com> wrote in message
news:cbr37b$l31$1 digitaldaemon.com...
 In article <cbqauv$2ilu$1 digitaldaemon.com>, Walter says...
The difficulty with your approach is that when you pass on the code to
someone else, they are faced with "is it a bug or is it ok". Compilation
should be a binary pass/fail, not something that sort of seems to
compile,
but who knows if it really did <g>.
The problem, Walter, is that we don't all agree what is or is not an
error. I
 believe that the following SHOULD be an error:



We obviously disagree, because I don't consider that an error. There's no way any of us will agree 100% on the feature set of D.
 Now, we all know that a strong boolean type would render that an error,
but even
 /that/ wouldn't catch ALL boolean type errors. I wrote a line of code the
 other day which contained a subtle bug. I wrote:



 Now, the second == should have read =, obviously, but it compiled fine.
And EVEN
 IF we'd have had a strong boolean type, it would STILL have compiled fine.
But
 it /was/ a bug, and I think that it COULD have been spotted by a
sufficiently
 intelligent compiler.

 Now, suppose, along comes a second, rival, D-compiler, which is
intelligent
 enough to spot this circumstance and point it out to the user. Should it
do so?
 Should it say "warning - using an equality test as a statement is a really
dumb
 idea"? Or should it go with the idea that DMD's behavior is definitive,
because
 "conflicting standards are bad"?
I'll go with my experience implementing the C and C++ standards - it's better to conform to the standards. Fixing what I consider to be suboptimal decisions in those standards has turned out to be a failure. Changing semantics from one compiler to the next will cause no end of grief and will impede the adoption of D - look at what happened with the varying interpretations of template rules in C++.

That said, having an optional 'lint mode' offered by a particular D implementation can be an appealing feature for that implementation, as long as it is both optional and compiles a strict subset of D.
 So long as there exists only one D compiler, there is no use for warnings.
But,
 enter Jill's imagined, hypothetical super-strict compiler designed to
catch as
 many bugs as possible at compile-time - it would be quite reasonable for
that
 hypothetical compiler to offer a "allow anything that DMD allows" option,
in
 which case, that would be a good time to issue warnings. No?
Jun 29 2004
parent reply Rex Couture <Rex_member pathlink.com> writes:
In article <cbsfns$2n3g$1 digitaldaemon.com>, Walter says...

I'll go with my experience implementing the C and C++ standards - it's
better to conform to the standards. Fixing what I consider to be suboptimal
decisions in those standards has turned out to be a failure.
I'm shocked.  Is this another C/C++ compiler?  I thought it was supposed to be something better.

And pardon my ignorance, but as a mere mortal programmer, I haven't the slightest idea why

should compile.  Sooner or later, someone is going to die over errors like that.
Jun 29 2004
parent reply "Walter" <newshound digitalmars.com> writes:
"Rex Couture" <Rex_member pathlink.com> wrote in message
news:cbt6f5$l92$1 digitaldaemon.com...
 In article <cbsfns$2n3g$1 digitaldaemon.com>, Walter says...
I'll go with my experience implementing the C and C++ standards - it's
better to conform to the standards. Fixing what I consider to be
suboptimal
decisions in those standards has turned out to be a failure.
 I'm shocked.  Is this another C/C++ compiler?  I thought it was supposed
 to be something better.
Yup - here I was referring not to C/C++ in particular, but to my experience with the utility of implementing slightly non-standard compilers.
 And pardon my ignorance, but as a mere mortal programmer, I
 haven't the slightest idea why



 should compile.  Sooner or later, someone is going to die over errors like
 that.
If a==c actually calls a function opCmp, then one may be writing code that wants to 'tickle' that function. This comes up sometimes in writing test coverage code, or in profiling code. This can also come up in generic code if one wants to verify that a and b are "comparable", but not care what the result is. It can also come up as the result of the combination of various optimizations and function inlining.
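
A rough sketch of that 'tickle' pattern, with invented types (shown with <, which dispatches to opCmp for class objects):

class Pair
{
    int x, y;

    override int opCmp(Object o)
    {
        Pair p = cast(Pair) o;
        if (x != p.x) return x - p.x;
        return y - p.y;
    }
}

void tickle(Pair a, Pair c)
{
    // The comparison result is deliberately discarded: the point is only
    // to execute opCmp, so coverage and profiling tools see it run, and
    // to prove at compile time that a and c are comparable at all.
    cast(void)(a < c);
}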
Jun 29 2004
next sibling parent Rex Couture <Rex_member pathlink.com> writes:
In article <cbti1b$1765$1 digitaldaemon.com>, Walter says...
"Rex Couture" <Rex_member pathlink.com> wrote in message
news:cbt6f5$l92$1 digitaldaemon.com...
 In article <cbsfns$2n3g$1 digitaldaemon.com>, Walter says...
I'll go with my experience implementing the C and C++ standards - it's
better to conform to the standards. Fixing what I consider to be
suboptimal
decisions in those standards has turned out to be a failure.
 I'm shocked.  Is this another C/C++ compiler?  I thought it was supposed
 to be something better.
Yup - here I was referring not to C/C++ in particular, but to my experience with the utility of implementing slightly non-standard compilers.
Oh. Sorry, I misunderstood.
 And pardon my ignorance, but as a mere mortal programmer, I
 haven't the slightest idea why



 should compile.  Sooner or later, someone is going to die over errors like
 that.
If a==c actually calls a function opCmp, then one may be writing code that wants to 'tickle' that function. This comes up sometimes in writing test coverage code, or in profiling code. This can also come up in generic code if one wants to verify that a and b are "comparable", but not care what the result is. It can also come up as the result of the combination of various optimizations and function inlining.
Pardon me. I guess I'll have to defer to you on that one.
Jun 30 2004
prev sibling parent reply Rex Couture <Rex_member pathlink.com> writes:
In article <cbti1b$1765$1 digitaldaemon.com>, Walter says...
"Rex Couture" <Rex_member pathlink.com> wrote in message
news:cbt6f5$l92$1 digitaldaemon.com...
 In article <cbsfns$2n3g$1 digitaldaemon.com>, Walter says...
Oh, and thanks for replying. Sorry I didn't see your reply sooner. There are a lot of messages here. Some of my later messages are a little tart.
Jun 30 2004
parent "Walter" <newshound digitalmars.com> writes:
"Rex Couture" <Rex_member pathlink.com> wrote in message
news:cbtvc7$23pn$1 digitaldaemon.com...
 In article <cbti1b$1765$1 digitaldaemon.com>, Walter says...
"Rex Couture" <Rex_member pathlink.com> wrote in message
news:cbt6f5$l92$1 digitaldaemon.com...
 In article <cbsfns$2n3g$1 digitaldaemon.com>, Walter says...
 Oh, and thanks for replying. Sorry I didn't see your reply sooner. There
 are a lot of messages here.  Some of my later messages are a little tart.
No worries. I'm at least a couple thousand messages behind :-(
Jun 30 2004
prev sibling next sibling parent reply Derek Parnell <derek psych.ward> writes:
On Mon, 28 Jun 2004 12:49:56 -0700, Walter wrote:

 Check out this excerpt from:
 
 http://acmqueue.com/modules.php?name=Content&pa=showpage&pid=160
 
 "Of course, this has been tried before-most compilers generate various
 warnings when they encounter questionable code. Old-time Unix/C programmers
 will certainly recall lint(1), a code-checker that did cross-file error
 checking and parameter type matching. These tools have existed for years but
 are not popular. Why? Because they generate a lot of warnings, and, as
 countless software engineers have pointed out, it's time-consuming to sift
 through the spurious warnings looking for the ones that really matter. I've
 got news for them: there is no such thing as a warning that doesn't matter.
 That's why it warns you. Anyone who has worked with enough code will tell
 you that, generally, software that compiles without warnings crashes less
 often. As far as I'm concerned, warnings are for wimps. Tools such as
 lint(1) and DevStudio should not issue warnings: they should decide if
 they've found an error and stop the build process, or they should shut up
 and generate code."
In the code below, is the non-use of the function argument 'a' an error or not? If it's an error, then why does D allow it?

If it's not an error, wouldn't it be 'nice' to inform the coder of a POTENTIAL error or not? I work with a language that, by default, informs me about this type of coding. I have the option to turn the warning off (on a per-function basis if required).

--
Derek
Melbourne, Australia
29/Jun/04 10:50:48 AM
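
(Derek's code did not survive here; the shape in question is presumably along these lines -- a sketch, not his original:)

int foo(int a)      // the argument 'a' is declared but never read
{
    return 1;       // D compiles this without complaint
}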
Jun 28 2004
next sibling parent reply "Walter" <newshound digitalmars.com> writes:
"Derek Parnell" <derek psych.ward> wrote in message
news:cbqep2$2nut$1 digitaldaemon.com...
 In the code below, is the non-use of the function argument 'a' an error or
 not? If its an error then why does D allow it?
It is not an error. There are many legitimate cases where one would have unused arguments.
 If its not an error, wouldn't it be 'nice' to inform the coder of a
 POTENTIAL error or not?
Let's say you know it is not an error in a particular case, and turn off or ignore the warning messages. Now you pass the code on to the maintainers, post it on the internet, sell it to a customer. They try to compile the code, and get the warning. What do they do now?

From personal experience, I've found it leaves a bad impression, coupled with confusion and uncertainty, when users of the code compile it and it generates warnings. That leaves one's best option as "compile with warnings flagged as errors" so the customer doesn't see them, which is essentially what D does. (Another thing D does is adjust the syntax in a few cases so that many typical C warnings can't happen.)
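
One concrete example of that syntax adjustment: C's classic "assignment used as a condition" is not a warning in D but a hard error, so it can never ship as a mere warning. The sketch below is deliberately invalid D:

void check(int a, int b)
{
    if (a = b)   // classic C typo for "a == b": many C compilers only
        a = 0;   // warn here; D refuses to compile it at all
}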
 I work with a language that, by default, informs me about this type of
 coding. I have the option to turn the warning off (on a per function basis
 if required).
I've shipped a lot of code with funky pragmas that turn off specific warnings in specific parts of the code. It's a kludge at best. I'm trying to do better with D.
Jun 28 2004
next sibling parent reply Derek Parnell <derek psych.ward> writes:
On Mon, 28 Jun 2004 19:01:52 -0700, Walter wrote:

 "Derek Parnell" <derek psych.ward> wrote in message
 news:cbqep2$2nut$1 digitaldaemon.com...
 In the code below, is the non-use of the function argument 'a' an error or
 not? If its an error then why does D allow it?
It is not an error. There are many legitimate cases where one would have unused arguments.
I think at best one could say that it is not necessarily an error. Yes, there are many legit cases, and many non-legit cases too.
 If its not an error, wouldn't it be 'nice' to inform the coder of a
 POTENTIAL error or not?
Let's say you know it is not an error in a particular case, and turn off or ignore the warning messages. Now you pass the code on to the maintainers, post it on the internet, sell it to a customer. They try to compile the code, and get the warning. What do they do now? From personal experience, I've found it leaves a bad impression, coupled with confusion and uncertainty, when users of the code compile it and it generates warnings. That leaves one's best option as "compile with warnings flagged as errors" so the customer doesn't see them, which is essentially what D does. (Another thing D does is adjust the syntax in a few cases so that many typical C warnings can't happen.)
 I work with a language that, by default, informs me about this type of
 coding. I have the option to turn the warning off (on a per function basis
 if required).
I've shipped a lot of code with funky pragmas that turn off specific warnings in specific parts of the code. It's a kludge at best. I'm trying to do better with D.
I would prefer that by default 'D', or any language for that matter, behaves like D does now. That is, allow coders to do this sort of thing. However, I'd also like to be able to turn on such a checking process once in a while, to make sure I didn't *accidentally* forget a piece of poor coding.

So maybe a lint-like application is not such a silly idea. It is an additional review of my code; one that has better "eyes" than me or my colleagues.

--
Derek
Melbourne, Australia
29/Jun/04 2:07:46 PM
Jun 28 2004
parent reply Mike Swieton <mike swieton.net> writes:
On Tue, 29 Jun 2004 14:16:00 +1000, Derek Parnell wrote:
 I would prefer that by default 'D', or any language for that matter,
 behaves like D does now. That is, allow coders to do this sort of thing.
 However, I'd also like to be able to turn on such a checking process once
 in a while, to make sure I didn't *accidentally* forget a piece of poor
 coding. So maybe a lint-like application is not such a silly idea. It is an
 additional review of my code; one that has better "eyes" than me or my
 colleagues.
I'll throw in my opinion here, because I know you're all dying to hear it ;)

My problem with warnings in most languages is that they are not all the same. That is, if I have a product which needs to build on several GCC versions along with VC6 (this is the case right now, actually), there are several places where, due to differences in the warnings/errors of the compiler, certain C++ code simply cannot be done the same way on both compilers, and that's ridiculous.

I don't mind the concept of warnings, because it really can tell you when you've done something that may be wrong. After all, a compiler shouldn't be so pedantic as to croak on *every* little thing. One consideration is this: the D standard could specify 'official' warnings, which are the only ones competing implementations may throw. Optionally, I feel a lint program could be valuable. Just as long as my different compilers work the same.

Note: even some Java compilers/VMs can be bitchy, even with the same language version. Just some brain food for yah.

Mike Swieton
__
Things won are done, joy's soul lies in the doing.
	- William Shakespeare
Jun 28 2004
next sibling parent "Walter" <newshound digitalmars.com> writes:
"Mike Swieton" <mike swieton.net> wrote in message
news:pan.2004.06.29.04.31.05.718520 swieton.net...
 My problem with warnings in most languages is that they are not all the
same.
 That is, if I have a product which needs to build on several GCC versions
 along with VC6 (this is the case right now, actually), there are several
 places where, due to differences in the warnings/errors of the compiler,
 certain C++ code simply cannot be done the same way on both compilers, and
 that's ridiculous.
I've run into that, too <g>.
Jun 29 2004
prev sibling parent Regan Heath <regan netwin.co.nz> writes:
On Tue, 29 Jun 2004 00:31:06 -0400, Mike Swieton <mike swieton.net> wrote:

 On Tue, 29 Jun 2004 14:16:00 +1000, Derek Parnell wrote:
 I would prefer that by default 'D', or any language for that matter,
 behaves like D does now. That is, allow coders to do this sort of thing.
 However, I'd also like to be able to turn on such a checking process 
 once
 in a while, to make sure I didn't *accidentally* forget a piece of poor
 coding. So maybe a lint-like application is not such a silly idea. It 
 is an
 additional review of my code; one that has better "eyes" than me or my
 colleagues.
I'll throw in my opinion here, because I know you're all dying to hear it ;) My problem with warnings in most languages is that they are not all the same. That is, if I have a product which needs to build on several GCC versions along with VC6 (this is the case right now, actually), there are several places where, due to differences in the warnings/errors of the compiler, certain C++ code simply cannot be done the same way on both compilers, and that's ridiculous. I don't mind the concept of warnings, because it really can tell you when you've done something that may be wrong. After all, a compiler shouldn't be so pedantic as to croak on *every* little thing. One consideration is this: the D standard could specify 'official' warnings, which are the only ones competing implementations may throw. Optionally, I feel a lint program could be valuable. Just as long as my different compilers work the same. Note: even some Java compilers/VMs can be bitchy, even with the same language version. Just some brain food for yah.
From what you've said, I think the best solution is that the compiler doesn't ever throw warnings, just errors. If you want to be 'extra safe', you run a lint-like program after it compiles.

To me they are separate and distinct things. One is compiling a program from instructions: either it will go, or it won't. The other is double-checking potential mistakes in those instructions. A lint-like program, or your development IDE, or whatever, can do this step.

Regan

--
Using M2, Opera's revolutionary e-mail client: http://www.opera.com/m2/
Jun 29 2004
prev sibling next sibling parent reply Sean Kelly <sean f4.ca> writes:
In article <cbqmu9$2e7$1 digitaldaemon.com>, Walter says...
"Derek Parnell" <derek psych.ward> wrote in message
news:cbqep2$2nut$1 digitaldaemon.com...
 In the code below, is the non-use of the function argument 'a' an error or
 not? If its an error then why does D allow it?
It is not an error. There are many legitimate cases where one would have unused arguments.
I had thought that D treated unused variables as errors. Is this not the case for unused arguments? Not that I'm complaining--I agree that there are legitimate uses for this technique.

Sean
Jun 29 2004
parent reply J C Calvarese <jcc7 cox.net> writes:
Sean Kelly wrote:
 In article <cbqmu9$2e7$1 digitaldaemon.com>, Walter says...
 
"Derek Parnell" <derek psych.ward> wrote in message
news:cbqep2$2nut$1 digitaldaemon.com...

In the code below, is the non-use of the function argument 'a' an error or
not? If its an error then why does D allow it?
It is not an error. There are many legitimate cases where one would have unused arguments.
I had thought that D treated unused variables as errors. Is this not the case for unused arguments? Not that I'm complaining--I agree that there are legitimate uses for this technique. Sean
DMD compiles either unused variables or unused arguments without complaint:

int whatever(int i, int j, int k)
{
    return 1;
}

void main()
{
    int m;
    int n;
    n = whatever(n, n, n);
}

That's fine with me. I might not want to use all of the variables or arguments.

--
Justin (a/k/a jcc7)
http://jcc_7.tripod.com/d/
Jun 29 2004
parent reply Rex Couture <Rex_member pathlink.com> writes:
In article <cbtc94$toq$1 digitaldaemon.com>, J C Calvarese says...
DMD compiles either unused variables or unused arguments without complaint:
I think that's just plain nuts. Unused variables usually imply a programming error, and a warning is most appropriate. Of course, sometimes it means you have just commented out some code for debugging. Sometimes it means that you have no use for a returned argument, but I have rarely seen code with enough warnings to be a problem.

I give up. For a language that's supposed to be not for purists, it seems like D is getting to be a very strange mixture of fire and purity. Typesafe conditional statements are too pure, but warnings are not pure enough.

I guess real programmers don't need no stinkin' warnings to tell them they screwed up. I'm pretty sure by that criterion I'll never qualify as a real programmer.
Jun 29 2004
parent reply Derek Parnell <derek psych.ward> writes:
On Wed, 30 Jun 2004 04:12:11 +0000 (UTC), Rex Couture wrote:

 In article <cbtc94$toq$1 digitaldaemon.com>, J C Calvarese says...
DMD compiles either unused variables or unused arguments without complaint:
I think that's just plain nuts. Unused variables usually implies a programming error, and a warning is most appropriate. Of course, sometimes it means you have just commented out some code for debugging. Sometimes it means that you have no use for a returned argument, but I have rarely seen code with enough warnings to be a problem. I give up. For a language that's supposed to be not for purists, it seems like D is getting to be a very strange mixture of fire and purity. Typesafe conditional statements are too pure, but warnings are not pure enough. I guess real programmers don't need no stinkin' warnings to tell them they screwed up. I'm pretty sure by that criterion I'll never qualify as a real programmer.
I'm with you, Rex. I'm a firm believer in peer reviews and automated tools to *assist* the coder. A compiler, to me, is supposed to be a tool to help us poor humans. By all means, don't have the compiler /stop/ us doing stupid things, but at least let us know when we might be doing such.

--
Derek
Melbourne, Australia
30/Jun/04 2:44:28 PM
Jun 29 2004
parent reply Regan Heath <regan netwin.co.nz> writes:
On Wed, 30 Jun 2004 14:46:38 +1000, Derek Parnell <derek psych.ward> wrote:

 On Wed, 30 Jun 2004 04:12:11 +0000 (UTC), Rex Couture wrote:

 In article <cbtc94$toq$1 digitaldaemon.com>, J C Calvarese says...
 DMD compiles either unused variables or unused arguments without 
 complaint:
I think that's just plain nuts. Unused variables usually implies a programming error, and a warning is most appropriate. Of course, sometimes it means you have just commented out some code for debugging. Sometimes it means that you have no use for a returned argument, but I have rarely seen code with enough warnings to be a problem. I give up. For a language that's supposed to be not for purists, it seems like D is getting to be a very strange mixture of fire and purity. Typesafe conditional statements are too pure, but warnings are not pure enough. I guess real programmers don't need no stinkin' warnings to tell them they screwed up. I'm pretty sure by that criterion I'll never qualify as a real programmer.
I'm with you, Rex. I'm a firm believer in peer reviews and automated tools to *assist* the coder. A compiler, to me, is supposed to be a tool to help us poor humans. By all means, don't have the compiler /stop/ us doing stupid things, but at least let us know when we might be doing such.
I think you should give Walter a chance to give you an example where you'd want to have an un-used parameter. I suspect the times you'd want one have all been solved by having default function parameters; see my post asking Walter for examples, for an example of this.

To be fair, an un-used parameter does not cause a crash. It might not cause the desired behaviour, as it's not being used to do whatever it is supposed to do, but you should notice this either in a DBC out block OR in a unittest OR the first time you run your code.

Regan.

--
Using M2, Opera's revolutionary e-mail client: http://www.opera.com/m2/
Jun 29 2004
next sibling parent Rex Couture <Rex_member pathlink.com> writes:
In article <opsad6jgkc5a2sq9 digitalmars.com>, Regan Heath says...
On Wed, 30 Jun 2004 14:46:38 +1000, Derek Parnell <derek psych.ward> wrote:

 On Wed, 30 Jun 2004 04:12:11 +0000 (UTC), Rex Couture wrote:

 In article <cbtc94$toq$1 digitaldaemon.com>, J C Calvarese says...
 DMD compiles either unused variables or unused arguments without 
 complaint:
I think that's just plain nuts. Unused variables usually implies a programming error, and a warning is most appropriate. Of course, sometimes it means you have just commented out some code for debugging. Sometimes it means that you have no use for a returned argument, but I have rarely seen code with enough warnings to be a problem. I give up. For a language that's supposed to be not for purists, it seems like D is getting to be a very strange mixture of fire and purity. Typesafe conditional statements are too pure, but warnings are not pure enough. I guess real programmers don't need no stinkin' warnings to tell them they screwed up. I'm pretty sure by that criterion I'll never qualify as a real programmer.
I'm with you, Rex. I'm a firm believer in peer reviews and automated tools to *assist* the coder. A compiler, to me, is supposed to be a tool to help us poor humans. By all means, don't have the compiler /stop/ us doing stupid things, but at least let us know when we might be doing such.
I think you should give Walter a chance to give you an example where you'd want to have an un-used parameter, I suspect the times you'd want one have all been solved by having default function parameters, see my post asking walter for examples, for an example if this. To be fair an un-used parameter does not cause a crash, it might not cause the desired behaviour, as it's not being used to do whatever it is supposed to do, but, you should notice this either in a DBC out block OR in a unittest OR the first time you run your code. Regan.
By all means, if there is some guaranteed mechanism to warn you of unused variables -- whatever it is -- that's fine. But I wonder if there is any practical difference between this and a compiler warning. It probably has to warn you every time to be of any significant use.

I think Walter has a different objective than most programmers. He wants to sell software to someone, and doesn't want it to ever generate a warning. Most of us need the compiler to tell us about unused variables, most of the time. There are simple ways of suppressing warnings if you really must.

By the way, the strict boolean issue is not going to go away.
Jun 29 2004
prev sibling next sibling parent reply Arcane Jill <Arcane_member pathlink.com> writes:
Not in reply to anyone in particular....

In C and C++, the function:






will compile without error or warning, as I believe it should. That strange void
line tells the compiler *I DON'T WANT TO USE THIS VARIABLE*.

I'm in favor of unused variables being a compile-error, unless explicitly
indicated by the programmer, as above (or using some other, D-specific, syntax).

Arcane Jill
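
If D adopted that rule, one possible D spelling of the same escape hatch might be a cast to void (a sketch only, not an existing D convention):

int foo(int a)
{
    cast(void) a;   // explicitly discard 'a': "unused, and I know it"
    return 1;
}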
Jun 30 2004
parent Rex Couture <Rex_member pathlink.com> writes:
In article <cbtqrd$1lhq$1 digitaldaemon.com>, Arcane Jill says...
Not in reply to anyone in particular....

In C and C++, the function:






will compile without error or warning, as I believe it should. That strange void
line tells the compiler *I DON'T WANT TO USE THIS VARIABLE*.

I'm in favor of unused variables being a compile-error, unless explicitly
indicated by the programmer, as above (or using some other, D-specific, syntax).

Arcane Jill
An elegant solution. Can I assume a similar void statement would also work outside the function, to discard an unwanted parameter?
Jun 30 2004
prev sibling parent reply Russ Lewis <spamhole-2001-07-16 deming-os.org> writes:
Regan Heath wrote:
 I think you should give Walter a chance to give you an example where 
 you'd want to have an un-used parameter, I suspect the times you'd want 
 one have all been solved by having default function parameters, see my 
 post asking walter for examples, for an example if this.
 
 To be fair an un-used parameter does not cause a crash, it might not 
 cause the desired behaviour, as it's not being used to do whatever it is 
 supposed to do, but, you should notice this either in a DBC out block OR 
 in a unittest OR the first time you run your code.
After defining an API, you may change it in the future, and choose to ignore one or more arguments. Or, consider the situation where you have an API which has different implementations. Some of the implementations might make use of certain parameters which are ignored in other implementations.
Jul 01 2004
parent reply Regan Heath <regan netwin.co.nz> writes:
On Thu, 01 Jul 2004 21:53:18 -0700, Russ Lewis 
<spamhole-2001-07-16 deming-os.org> wrote:

 Regan Heath wrote:
 I think you should give Walter a chance to give you an example where 
 you'd want to have an un-used parameter, I suspect the times you'd want 
 one have all been solved by having default function parameters, see my 
 post asking walter for examples, for an example if this.

 To be fair an un-used parameter does not cause a crash, it might not 
 cause the desired behaviour, as it's not being used to do whatever it 
 is supposed to do, but, you should notice this either in a DBC out 
 block OR in a unittest OR the first time you run your code.
After defining an API, you may change it in the future, and choose to ignore one or more arguments.
Don't you then end up with an API that does not behave the same as it used to? i.e.

int doStuff(bool sort, bool interleave, bool capitalise) { .. }

if you start to ignore one or more of those, then the function will be doing either one or the other (sort or not sort ..etc..), and not what is specified.
 Or, consider the situation where you have an API which had different 
 implementations.  Some of the implementations might make use of certain 
 parameters which are ignored in other implementations.
In this case, unless you can choose the implementation, then it's the same as above: you get different results and have no control over when/where, so the API will be inconsistent.

Basically, if you start to ignore a parameter you change the behaviour, and that's bad.. right? Do you have a specific example where it doesn't change the behaviour?

Regan.

--
Using M2, Opera's revolutionary e-mail client: http://www.opera.com/m2/
Jul 01 2004
parent reply Derek Parnell <derek psych.ward> writes:
On Fri, 02 Jul 2004 17:47:02 +1200, Regan Heath wrote:

 On Thu, 01 Jul 2004 21:53:18 -0700, Russ Lewis 
 <spamhole-2001-07-16 deming-os.org> wrote:
 
 Regan Heath wrote:
 I think you should give Walter a chance to give you an example where 
 you'd want to have an un-used parameter, I suspect the times you'd want 
 one have all been solved by having default function parameters, see my 
 post asking walter for examples, for an example if this.

 To be fair an un-used parameter does not cause a crash, it might not 
 cause the desired behaviour, as it's not being used to do whatever it 
 is supposed to do, but, you should notice this either in a DBC out 
 block OR in a unittest OR the first time you run your code.
After defining an API, you may change it in the future, and choose to ignore one or more arguments.
Don't you then end up with an API that does not behave the same as it used to? i.e. int doStuff(bool sort, bool interleave, bool capitalise) { .. } if you start to ignore one or more of those, then the function will be doing either one or the other (sort or not sort ..etc..), and not what is specified.
 Or, consider the situation where you have an API which had different 
 implementations.  Some of the implementations might make use of certain 
 parameters which are ignored in other implementations.
In this case, unless you can choose the implementation then it's the same as above, you get different results and have no control over when/where so the API will be incosistent. Basically if you start to ignore a parameter you change the behaviour, and that's bad.. right? Do you have a specific example where it doesn't change the behaviour? Regan.
I have a real-world example where occasionally it is right to ignore a parameter.

I use a library that is a Windows GUI development tool (BTW, I must think about porting it to D) and the way it handles events is that the application sets up an event handler for a control/event combination. The library calls your event handler with exactly three parameters: the ID of the control the event happened to, the ID of the event type, and a dynamic array of parameters specific to the event.

Not every event handler needs all this information, but it is given to each and every event handler. So frequently, the code in the event handler ignores one or more of the parameters supplied to it. An example might be useful (not D code) ...

Here the 'event' parameter is not needed even though it is supplied by the GUI library. It would be nice to tell the compiler that I'm deliberately not using that parameter. The 'expire' idea would suffice.

--
Derek
Melbourne, Australia
2/Jul/04 3:53:09 PM
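
Derek's listing was in another language and is not reproduced here; a D-flavored sketch of the same pattern, with all names invented, might look like this:

// The GUI library calls every handler with the same three parameters.
alias void function(int ctrlId, int event, int[] parms) EventHandler;

void destroyWindow(int id) { /* stand-in for the real library call */ }

void onCloseClicked(int ctrlId, int event, int[] parms)
{
    // Only the control ID matters to this handler; 'event' and 'parms'
    // are supplied by the library but deliberately ignored.
    destroyWindow(ctrlId);
}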
Jul 01 2004
parent Regan Heath <regan netwin.co.nz> writes:
On Fri, 2 Jul 2004 16:17:39 +1000, Derek Parnell <derek psych.ward> wrote:

 On Fri, 02 Jul 2004 17:47:02 +1200, Regan Heath wrote:

 On Thu, 01 Jul 2004 21:53:18 -0700, Russ Lewis
 <spamhole-2001-07-16 deming-os.org> wrote:

 Regan Heath wrote:
 I think you should give Walter a chance to give you an example where
 you'd want to have an un-used parameter, I suspect the times you'd 
 want
 one have all been solved by having default function parameters, see my
 post asking walter for examples, for an example if this.

 To be fair an un-used parameter does not cause a crash, it might not
 cause the desired behaviour, as it's not being used to do whatever it
 is supposed to do, but, you should notice this either in a DBC out
 block OR in a unittest OR the first time you run your code.
After defining an API, you may change it in the future, and choose to ignore one or more arguments.
Don't you then end up with an API that does not behave the same as it used to? i.e. int doStuff(bool sort, bool interleave, bool capitalise) { .. } if you start to ignore one or more of those, then the function will be doing either one or the other (sort or not sort ..etc..), and not what is specified.
 Or, consider the situation where you have an API which had different
 implementations.  Some of the implementations might make use of certain
 parameters which are ignored in other implementations.
In this case, unless you can choose the implementation then it's the same as above, you get different results and have no control over when/where so the API will be incosistent. Basically if you start to ignore a parameter you change the behaviour, and that's bad.. right? Do you have a specific example where it doesn't change the behaviour? Regan.
I have an real-world example where occasionally it is right to ignore a parameter. I use a library that is a Windows GUI development tool (BTW, I must think about porting it to D) and the way it handles events is that the application sets up an event handler for a control/event combination. The library calls your event handler with exactly three parameters : The ID of the control the event happened to, the id of the event type, and a dynamic array of parameters specific to the event. Not every event handler needs all this information, but it is given to each and every event handler. So frequently, the code in the event handler ignores one or more of the parameters supplied to it.
Ahh.. yes, good example. :) Regan
 An example might be useful (not D code) ...

 Here the 'event' parameter is not needed even though it is supplied by the
 GUI library.

 It would be nice to tell the compiler that I'm deliberately not using that
 parameter. The 'expire' idea would suffice.
-- Using M2, Opera's revolutionary e-mail client: http://www.opera.com/m2/
Jul 02 2004
prev sibling next sibling parent Rex Couture <Rex_member pathlink.com> writes:
Good point.  You convinced me about warnings.  But see Daniel Horn's point
below.

What about a compromise?  A standard compiler directive (or metacode, or
whatever you wish to call it) which can turn strict booleans off and on for
those who need that feature.  If you put it right in the code, before and
after the line(s) in question, maintainers will not be in doubt.


In article <cbqmu9$2e7$1 digitaldaemon.com>, Walter says...
Let's say you know it is not an error in a particular case, and turn off or
ignore the warning messages. Now you pass the code on to the maintainers,
post it on the internet, sell it to a customer. They try to compile the
code, and get the warning. What do they do now?
I've shipped a lot of code with funky pragmas that turn off specific
warnings in specific parts of the code. It's a kludge at best. I'm trying to
do better with D.
==============================
On Monday, June 28, Daniel Horn wrote:

"Case in point of course is boolean logic.

"We should have an optional warning flag passed into the compiler when the
user uses a boolean expression as an int or vice versa... the warning may
be off for most devels, and for those who deign to turn it on, it would
error on said warning."
Jun 29 2004
prev sibling parent Regan Heath <regan netwin.co.nz> writes:
On Mon, 28 Jun 2004 19:01:52 -0700, Walter <newshound digitalmars.com> 
wrote:
 "Derek Parnell" <derek psych.ward> wrote in message
 news:cbqep2$2nut$1 digitaldaemon.com...
 In the code below, is the non-use of the function argument 'a' an error 
 or
 not? If its an error then why does D allow it?
It is not an error. There are many legitimate cases where one would have unused arguments.
Can you give us one or two? The only ones I can think of are for example...

void doSomething(char *foo, int bar, int reserved)
{
    ..use foo and bar..
}

where reserved is for a future possible extension to this function. Default function parameters *solve* this case IMO. Instead of the above you have...

void doSomething(char *foo, int bar)
{
    ..use foo and bar..
}

then you can extend it...

void doSomething(char *foo, int bar, long[] baz = null)
{
    ..use foo and bar and baz..
}

Regan

--
Using M2, Opera's revolutionary e-mail client: http://www.opera.com/m2/
Jun 29 2004
prev sibling parent reply Russ Lewis <spamhole-2001-07-16 deming-os.org> writes:
Derek Parnell wrote:
 On Mon, 28 Jun 2004 12:49:56 -0700, Walter wrote:
 
 In the code below, is the non-use of the function argument 'a' an error or
 not? If its an error then why does D allow it?
 
 If its not an error, wouldn't it be 'nice' to inform the coder of a
 POTENTIAL error or not? 
 









Ok, I'm not going to take sides on the warning issue, because frankly I agree with both sides here. But I am going to jump in and say that you could treat this as an error, and still handle the legitimate cases, if you had an 'expire' construct, that worked like this:

int foo(int a)
{
    expire a;
    return 1;
}

'expire' would simply make a contract that a variable must not be used later in the function. So, it would be an error to have a function which did not use one of its arguments - unless it explicitly expired them. It is an explicit contract of "I don't care about this value."

Or, perhaps, maybe expire should be simply a statement that tells the compiler "pretend as though this is uninitialized data." So try out this code:

void bar(int b)
{
    int arg_save = b;
    printf("Old value of b = %d\n", b);
    expire b;
    int a = b;                // SYNTAX ERROR, b is expired
    b = arg_save*arg_save;
    printf("New value of b = %d\n", b);   // OK, b is valid again
}

Thoughts?
Jul 01 2004
next sibling parent Derek Parnell <derek psych.ward> writes:
On Thu, 01 Jul 2004 17:34:15 -0700, Russ Lewis wrote:

 Derek Parnell wrote:
 On Mon, 28 Jun 2004 12:49:56 -0700, Walter wrote:
 
 In the code below, is the non-use of the function argument 'a' an error or
 not? If its an error then why does D allow it?
 
 If its not an error, wouldn't it be 'nice' to inform the coder of a
 POTENTIAL error or not? 
 









Ok, I'm not going to take sides on the warning issue, because frankly I agree with both sides here. But I am going to jump in and say that you could treat this as an error, and still handle the legitimate cases, if you had an 'expire' construct, that worked like this:

int foo(int a)
{
    expire a;
    return 1;
}

'expire' would simply make a contract that a variable must not be used later in the function. So, it would be an error to have a function which did not use one of its arguments - unless it explicitly expired them. It is an explicit contract of "I don't care about this value."

Or, perhaps, maybe expire should be simply a statement that tells the compiler "pretend as though this is uninitialized data." So try out this code:

void bar(int b)
{
    int arg_save = b;
    printf("Old value of b = %d\n", b);
    expire b;
    int a = b;                // SYNTAX ERROR, b is expired
    b = arg_save*arg_save;
    printf("New value of b = %d\n", b);   // OK, b is valid again
}

Thoughts?
I like this idea a lot. It is brief, explicit, permissive, parsable, and doesn't break any existing code.

--
Derek
Melbourne, Australia
2/Jul/04 10:39:43 AM
Jul 01 2004
prev sibling parent reply Sean Kelly <sean f4.ca> writes:
In article <cc2ai7$25r$1 digitaldaemon.com>, Russ Lewis says...
Ok, I'm not going to take sides on the warning issue, because frankly I 
agree with both sides here.  But I am going to jump in and say that you 
could treat this as an error, and still handle the legitimate cases, if 
you had an 'expire' construct, that worked like this:

   int foo(int a)
   {
     expire a;
     return 1;
   }

'expire' would simply make a contract that a variable must not be used 
later in the function.  So, it would be an error to have a function 
which did not use one of its arguments - unless it explicitly expired 
them.  It is an explicit contract of "I don't care about this value."
The equivalent thing in C/C++ would be to define foo this way:

int foo(int)
{
    return 1;
}

Thus indicating to the compiler that the parameter is not used in the function body. D does not allow this syntax, but I favor it over the "expire" idea since it avoids the creation of a new keyword.

Sean
Jul 02 2004
next sibling parent ANT <ANT_member pathlink.com> writes:
In article <cc4bb3$ap1$1 digitaldaemon.com>, Sean Kelly says...
In article <cc2ai7$25r$1 digitaldaemon.com>, Russ Lewis says...
Ok, I'm not going to take sides on the warning issue, because frankly I 
agree with both sides here.  But I am going to jump in and say that you 
could treat this as an error, and still handle the legitimate cases, if 
you had an 'expire' construct, that worked like this:

   int foo(int a)
   {
     expire a;
     return 1;
   }

'expire' would simply make a contract that a variable must not be used 
later in the function.  So, it would be an error to have a function 
which did not use one of its arguments - unless it explicitly expired 
them.  It is an explicit contract of "I don't care about this value."
The equivalent thing in C/C++ would be to define foo this way:

int foo(int)
{
    return 1;
}

Thus indicating to the compiler that the parameter is not used in the function body. D does not allow this syntax, but I favor it over the "expire" idea since it avoids the creation of a new keyword.
And we can make intellisense for IDEs aware of it.

Ant
Jul 02 2004
prev sibling parent Russ Lewis <spamhole-2001-07-16 deming-os.org> writes:
Sean Kelly wrote:
 In article <cc2ai7$25r$1 digitaldaemon.com>, Russ Lewis says...
 
Ok, I'm not going to take sides on the warning issue, because frankly I 
agree with both sides here.  But I am going to jump in and say that you 
could treat this as an error, and still handle the legitimate cases, if 
you had an 'expire' construct, that worked like this:

  int foo(int a)
  {
    expire a;
    return 1;
  }

'expire' would simply make a contract that a variable must not be used 
later in the function.  So, it would be an error to have a function 
which did not use one of its arguments - unless it explicitly expired 
them.  It is an explicit contract of "I don't care about this value."
The equivalent thing in C/C++ would be to define foo this way: int foo(int) { return 1; } Thus indicating to the compiler that the parameter is not used in the function body. D does not allow this syntax, but I favor it over the "expire" idea since it avoids the creation of a new keyword.
I agree that it is desirable to avoid a new keyword. However, let me point out another use for expire that is hard (sometimes impossible) to do currently:

void bar()
{
    int rc = GoDoStuff();
    if(rc != 0)
        throw SomeSortOfException;
    expire rc;   // we're not going to use this value again

    ...more code...
}

Thus, 'expire' would help us write more self-documenting code.
Jul 02 2004
prev sibling next sibling parent reply "Bruno A. Costa" <bruno codata.com.br> writes:
In part I agree with you, but I think there are some special cases where
warnings are welcome. A simple example: In big projects, unused variables
may happen and they tend to obfuscate the code, IMHO. In those cases we
could have an option like "-Wall" from gcc to instruct the compiler to
generate warnings.

And of course, bugs in the library should not be treated by the compiler.

Greetings,

Bruno.


Walter wrote:

 Check out this excerpt from:
 
 http://acmqueue.com/modules.php?name=Content&pa=showpage&pid=160
 
 "Of course, this has been tried before-most compilers generate various
 warnings when they encounter questionable code. Old-time Unix/C
 programmers will certainly recall lint(1), a code-checker that did
 cross-file error checking and parameter type matching. These tools have
 existed for years but are not popular. Why? Because they generate a lot of
 warnings, and, as countless software engineers have pointed out, it's
 time-consuming to sift through the spurious warnings looking for the ones
 that really matter. I've got news for them: there is no such thing as a
 warning that doesn't matter. That's why it warns you. Anyone who has
 worked with enough code will tell you that, generally, software that
 compiles without warnings crashes less often. As far as I'm concerned,
 warnings are for wimps. Tools such as lint(1) and DevStudio should not
 issue warnings: they should decide if they've found an error and stop the
 build process, or they should shut up and generate code."
Jun 29 2004
parent reply Sam McCall <tunah.d tunah.net> writes:
Bruno A. Costa wrote:

 In part I agree with you, but I think there are some special cases where
 warnings are welcome. A simple example: In big projects, unused variables
 may happen and they tend to obfuscate the code, IMHO. In those cases we
 could have an option like "-Wall" from gcc to instruct the compiler to
 generate warnings.
Hmm, in my code, unused member variables and local variables are usually bugs, and unused parameters are usually not; there are exceptions to both. However, I don't see why searching for these should happen at compile time; the two processes interfere with each other. Perhaps we need a lint and a compiler, likely sharing some code?

Sam
 
 And of course, bugs in the library should not be treated by the compiler.
 
 Greetings,
 
 Bruno.
 
 
 Walter wrote:
 
 
Check out this excerpt from:

http://acmqueue.com/modules.php?name=Content&pa=showpage&pid=160

"Of course, this has been tried before-most compilers generate various
warnings when they encounter questionable code. Old-time Unix/C
programmers will certainly recall lint(1), a code-checker that did
cross-file error checking and parameter type matching. These tools have
existed for years but are not popular. Why? Because they generate a lot of
warnings, and, as countless software engineers have pointed out, it's
time-consuming to sift through the spurious warnings looking for the ones
that really matter. I've got news for them: there is no such thing as a
warning that doesn't matter. That's why it warns you. Anyone who has
worked with enough code will tell you that, generally, software that
compiles without warnings crashes less often. As far as I'm concerned,
warnings are for wimps. Tools such as lint(1) and DevStudio should not
issue warnings: they should decide if they've found an error and stop the
build process, or they should shut up and generate code."
Jun 29 2004
parent reply Regan Heath <regan netwin.co.nz> writes:
On Wed, 30 Jun 2004 03:52:48 +1200, Sam McCall <tunah.d tunah.net> wrote:

 Bruno A. Costa wrote:

 In part I agree with you, but I think there are some special cases where
 warnings are welcome. A simple example: In big projects, unused
 variables
 may happen and they tend to obfuscate the code, IMHO. In those cases we
 could have an option like "-Wall" from gcc to instruct the compiler to
 generate warnings.
Hmm, in my code, unused member variables and local variables are usually bugs, and unused parameters are usually not bugs, there are exceptions to both. However I don't see why searching for these should happen at compile time, the two processes interfere with each other. Perhaps we need a lint and a compiler, likely sharing some code?
I agree, they are 2 different processes; they are only linked in that they both look at the same input. I think the best solution is to separate them.

It paves the way for a competitive market for good lint-like applications.

Regan
 And of course, bugs in the library should not be treated by the 
 compiler.

 Greetings,

 Bruno.


 Walter wrote:


 Check out this excerpt from:

 http://acmqueue.com/modules.php?name=Content&pa=showpage&pid=160

 "Of course, this has been tried before-most compilers generate various
 warnings when they encounter questionable code. Old-time Unix/C
 programmers will certainly recall lint(1), a code-checker that did
 cross-file error checking and parameter type matching. These tools have
 existed for years but are not popular. Why? Because they generate a 
 lot of
 warnings, and, as countless software engineers have pointed out, it's
 time-consuming to sift through the spurious warnings looking for the 
 ones
 that really matter. I've got news for them: there is no such thing as a
 warning that doesn't matter. That's why it warns you. Anyone who has
 worked with enough code will tell you that, generally, software that
 compiles without warnings crashes less often. As far as I'm concerned,
 warnings are for wimps. Tools such as lint(1) and DevStudio should not
 issue warnings: they should decide if they've found an error and stop 
 the
 build process, or they should shut up and generate code."
-- 
Using M2, Opera's revolutionary e-mail client: http://www.opera.com/m2/
Jun 29 2004
parent "Walter" <newshound digitalmars.com> writes:
"Regan Heath" <regan netwin.co.nz> wrote in message
news:opsadrjmq55a2sq9 digitalmars.com...
 I agree: they are 2 different processes, linked only in that they look at
 the same input. I think the best solution is to separate them.

 It paves the way for a competitive market for good lint-like applications.
It wouldn't be hard to use the existing DMD front end source as a starting point for such an app.
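To give a flavour of what such an app might do, here is a deliberately 
tiny, hypothetical sketch of a standalone lint pass in D. It does not use 
the DMD front end at all: it just pattern-matches one suspicious construct 
(a bare comparison used as a statement, like the `b == a++;` that appears 
later in this thread). Everything in it, including the heuristic and the 
message format, is invented for illustration; a real tool would parse the 
source properly, ideally by reusing a compiler front end as Walter 
suggests.

import std.algorithm : canFind, endsWith, startsWith;
import std.file : readText;
import std.stdio : writefln;
import std.string : splitLines, strip;

void main(string[] args)
{
    foreach (name; args[1 .. $])           // each file named on the command line
    {
        size_t lineNo;
        foreach (line; readText(name).splitLines())
        {
            ++lineNo;
            auto s = line.strip();
            // Crude heuristic (with false positives): a whole statement
            // containing "==" that is not a condition probably discards
            // the comparison's result, i.e. "=" may have been meant.
            if (s.endsWith(";") && s.canFind("==")
                && !s.startsWith("if") && !s.startsWith("while")
                && !s.startsWith("assert"))
                writefln("%s(%s): comparison result may be discarded",
                         name, lineNo);
        }
    }
}

Run it as, say, `dlint foo.d bar.d`; because it is a separate program, it 
can be run once, with whatever options, on whatever machine, independently 
of compilation.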
Jun 29 2004
prev sibling next sibling parent reply Juliano Ravasi Ferraz <contact write-my-first-name-here.info> writes:
Walter wrote:
 "...Why? Because they generate a lot of warnings, and, as
 countless software engineers have pointed out, it's time-consuming to sift
 through the spurious warnings looking for the ones that really matter..."
I think it is a matter of when and how you want to spend that time.
 "...As far as I'm concerned, warnings are for wimps..."
I can't take this seriously. I won't hire a programmer who ignores or disables compiler warnings when they are available.
 Tools such as
 lint(1) and DevStudio should not issue warnings: they should decide if
 they've found an error and stop the build process, or they should shut up
 and generate code."
There are many warnings that are real life-savers. I usually compile my 
own programs with gcc's -Wall -Werror (enable all common warnings and 
turn them into errors). There were countless times those warnings saved 
me. I know that D aims to fix the syntax where it may lead to bugs (like 
not allowing assignment expressions where a boolean result is expected), 
but this is still far away from "safe". There is an ancient programmer 
wisdom that says that if you manage to create an idiot-proof application, 
an enhanced type of idiot will be born on the day after.

BTW, both dmd and the latest version of gcc with `-W -Wall` compile this 
without any warning (nested functions are a gcc extension):

int main() {
     int a, b;
     void f(int a) {
         if (a) {
             int a = a+1;   /* the new a shadows the parameter and is
                               initialized from itself */
             b == a++;      /* comparison result is discarded;
                               "=" was probably meant */
         }
     }
     f(a);
     return 0;
}

<fun> how much time for someone popping up with a www.iodcc.org? :-D </fun>

-- 
Juliano
Jun 29 2004
parent Sean Kelly <sean f4.ca> writes:
In article <cbspu4$427$1 digitaldaemon.com>, Juliano Ravasi Ferraz says...
There are many warnings that are real life-savers. I usually compile my 
own programs with gcc's -Wall -Werror (enable all common warnings and 
turn them into errors). There were countless times those warnings saved 
me. I know that D aims to fix the syntax where it may lead to bugs (like 
not allowing assignment expressions where a boolean result is expected), 
but this is still far away from "safe".
It sounds like you agree, but the D compiler always runs as if you had -Wall -Werror set. I think this is fantastic. Working with third-party libraries in C++ is a headache for folks like me who compile with those options set: it's often not feasible to fix errors in third-party code, and wrapping everything in a million pragmas is not a fun way to code :)
There is an ancient programmer wisdom that says that if you manage to 
create an idiot-proof application, an enhanced type of idiot will be 
born on the day after.
I'm in the "safety through documentation" camp. If the user compiles without DBC enabled and didn't read the docs then it's not my problem if he's being an idiot.
BTW, both dmd and the latest version of gcc with `-W -Wall` compile this 
without any warning (nested functions are a gcc extension):

int main() {
     int a, b;
     void f(int a) {
         if (a) {
             int a = a+1;   /* the new a shadows the parameter and is
                               initialized from itself */
             b == a++;      /* comparison result is discarded;
                               "=" was probably meant */
         }
     }
     f(a);
     return 0;
}
Kind of an odd construct, but I don't see anything wrong with an expression 
as a statement. And this one has a side effect, so a compiler might not 
consider it unnecessary code anyway.

Sean
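Worth noting against Juliano's point above: the neighbouring typo, an 
assignment where a condition is expected, is one that D does turn into a 
hard error rather than a warning. A tiny sketch (the exact error wording 
varies by compiler version):

import std.stdio;

void main()
{
    int a = 1;
    int b = 2;
    // if (a = b) { ... }   // dmd rejects this outright:
    //                      // "assignment cannot be used as a condition"
    if (a == b)             // the comparison that was presumably intended
        writeln("equal");
}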
Jun 29 2004
prev sibling parent reply Stewart Gordon <smjg_1998 yahoo.com> writes:
Walter wrote:
 Check out this excerpt from:
 
 http://acmqueue.com/modules.php?name=Content&pa=showpage&pid=160
<snip> Arcane Jill has half taken the words out of my mouth, but I'll say it anyway. The idea of a _language_ having warnings or not having warnings makes little or no sense to me. Indeed, does the C specification say anything about whether the compiler should complain about an unused function parameter, a comparison with an inherently constant truth value, or the common typo of if (qwert = yuiop) { ... } ...? Different compiler writers, and hence different compilers, have different conceptions of what is probably a coding error and what isn't. For example, Borland C++ has warnings that GCC doesn't, and vice versa. Is this meant to be a decree that any D compiler written by anybody should never output a warning in its life? This has its own problems. Three options remain: (a) let the dodgy code silently slip through, no matter how dodgy it is. This would hinder the principle of eliminating common bugs from the start, and consequently reduce the scope for competition in implementation quality. Since D aims to help get rid of common bugs, it seems silly to try and stop different compilers from helping further. (b) take anything the implementer feels is dodgy as an error. This could seriously break the other bit of D philosophy: portability. (c) have (a) and (b) as separate compilation modes. (a) would be used to compile programs written by someone else, (b) would be used to compile your own stuff. But what about using libraries? You'd end up mixing the two compilation modes in the process of building a project, even within a single module and its imports. I for one don't know what I'd do if I decided to write a D compiler.... Stewart. -- My e-mail is valid but not my primary mailbox, aside from its being the unfortunate victim of intensive mail-bombing at the moment. Please keep replies on the 'group where everyone may benefit.
Jul 05 2004
parent reply Regan Heath <regan netwin.co.nz> writes:
On Mon, 05 Jul 2004 11:53:13 +0100, Stewart Gordon <smjg_1998 yahoo.com> 
wrote:
 Walter wrote:
 Check out this excerpt from:

 http://acmqueue.com/modules.php?name=Content&pa=showpage&pid=160
<snip> Arcane Jill has half taken the words out of my mouth, but I'll say it anyway. The idea of a _language_ having warnings or not having warnings makes little or no sense to me. Indeed, does the C specification say anything about whether the compiler should complain about an unused function parameter, a comparison with an inherently constant truth value, or the common typo of if (qwert = yuiop) { ... } ...? Different compiler writers, and hence different compilers, have different conceptions of what is probably a coding error and what isn't. For example, Borland C++ has warnings that GCC doesn't, and vice versa. Is this meant to be a decree that any D compiler written by anybody should never output a warning in its life? This has its own problems. Three options remain: (a) let the dodgy code silently slip through, no matter how dodgy it is. This would hinder the principle of eliminating common bugs from the start, and consequently reduce the scope for competition in implementation quality. Since D aims to help get rid of common bugs, it seems silly to try and stop different compilers from helping further. (b) take anything the implementer feels is dodgy as an error. This could seriously break the other bit of D philosophy: portability. (c) have (a) and (b) as separate compilation modes. (a) would be used to compile programs written by someone else, (b) would be used to compile your own stuff. But what about using libraries? You'd end up mixing the two compilation modes in the process of building a project, even within a single module and its imports. I for one don't know what I'd do if I decided to write a D compiler....
It's my impression that compilation and lint-like processing are 2 
different steps. The compiler does the first; it does not do the second.

What this means is that regardless of which D *compiler* you use, you will 
get the exact same errors. This gives the code greater portability: no 
more weird errors on system X.

In addition, imagine you are writing a cross-platform app, compiling it on 
Windows, Linux, FreeBSD, MacOSX, etc. Why do the lint-like process X times 
(where X is the number of operating systems you compile for)? You only 
*need* to do it once.

Furthermore, you want to be able to choose which lint-like program to run, 
with your favourite config options, and you want to run it on the fastest 
hardware, not that old Mac you use for your MacOSX compilations.

Walter has implied that it would be easy to write a lint-like program 
using the DMD front end. I suggest that anyone who is really worried about 
catching these sorts of 'possibly a coding error' errors starts a 
lint-like project using the dmd front end.

Regan.

-- 
Using M2, Opera's revolutionary e-mail client: http://www.opera.com/m2/
Jul 05 2004
parent reply Arcane Jill <Arcane_member pathlink.com> writes:
In article <opsany5h0b5a2sq9 digitalmars.com>, Regan Heath says...
It's my impression that compilation and lint-like processing are 2 
different steps.
The compiler does the first; it does not do the second.

What this means is that regardless of which D *compiler* you use, you will 
get the exact same errors. This gives the code greater portability: no 
more weird errors on system X.

In addition, imagine you are writing a cross-platform app, compiling it on 
Windows, Linux, FreeBSD, MacOSX, etc. Why do the lint-like process X 
times (where X is the number of operating systems you compile for)? You 
only *need* to do it once.
Furthermore, you want to be able to choose which lint-like program to run, 
with your favourite config options and you want to run it on the fastest 
hardware, not that old Mac you use for your MacOSX compilations.

Walter has implied that it would be easy to write a lint-like program 
using the DMD front end. I suggest that anyone who is really worried about 
catching these sorts of 'possibly a coding error' errors starts a 
lint-like project using the dmd front end.

Regan.
An interesting idea, but it sounds like solving the problem just by calling 
things by different names. Which is okay, of course.

What I mean is, if a third party brought out the ACME-D compiler which 
generated warnings, it could get away with it just by calling itself, not a 
compiler, but a lint-tool and compiler combined. It could simply state that 
it was the lint-component, not the compiler-component, which was generating 
the warnings, and everyone would be happy. Well, I'd buy it.

However, I imagine that those purists who argue that D is so inherently 
well-defined that there should never be any such thing as a warning (and I 
am not one of them) might see this as just re-introducing warnings through 
the back door.

Well, whatever. I'd be happy to call it a lint add-on if it did the job and 
kept everyone happy.

Arcane Jill
Jul 05 2004
parent Regan Heath <regan netwin.co.nz> writes:
On Mon, 5 Jul 2004 20:51:12 +0000 (UTC), Arcane Jill 
<Arcane_member pathlink.com> wrote:
 In article <opsany5h0b5a2sq9 digitalmars.com>, Regan Heath says...
 It's my impression that compilation and lint-like processing are 2
 different steps.
 The compiler does the first; it does not do the second.

 What this means is that regardless of which D *compiler* you use, you 
 will
 get the exact same errors. This gives the code greater portability: no
 more weird errors on system X.

 In addition, imagine you are writing a cross-platform app, compiling it 
 on
 Windows, Linux, FreeBSD, MacOSX, etc. Why do the lint-like process X
 times (where X is the number of operating systems you compile for)? You 
 only *need* to do it once.

 Furthermore, you want to be able to choose which lint-like program to 
 run,
 with your favourite config options and you want to run it on the fastest
 hardware, not that old Mac you use for your MacOSX compilations.

 Walter has implied that it would be easy to write a lint-like program
 using the DMD front end. I suggest that anyone who is really worried 
 about
 catching these sorts of 'possibly a coding error' errors starts a
 lint-like project using the dmd front end.

 Regan.
An interesting idea, but it sounds like solving the problem just by calling things by different names. Which is okay, of course.
It is... and it isn't. I think the core idea is that to be called a D 
compiler it must behave as do all other D compilers (if only by default). 
This gives all the benefits I have mentioned above:
 - portability of code
 - speed/efficiency of compilation
 - flexibility to choose (the lint process/options)

This does not mean a D compiler cannot have a non-conformant 'mode'; that 
mode should just not be the default.
 What I mean is, if a third party brought out the ACME-D compiler which 
 generated
 warnings, it could get away with it just by calling itself, not a 
 compiler, but
 a lint-tool and compiler combined. It could simply state that it was the
 lint-component, not the compiler-component, which was generating the 
 warnings,
 and everyone would be happy. Well, I'd buy it.
As long as ACME-D did not do the lint-processing by default, no problem. 
Making the 2 processes part of the same executable does have benefits:
 - smaller than 2 separate executables containing the same dmd front-end 
   or similar syntax-processing code.
 - you only need one executable to do both things, so it is more likely 
   to be installed on system X.
 - you can do both in 1 step.

So as long as lint processing is not the default behaviour, you only get 
benefits from doing it this way.
 However, I imagine that those purists who argue that D is so inherently
 well-defined that there should never be any such thing as a warning (and 
 I am
 not one of them)
I don't believe this is even possible. The compiler can and does know the syntax, and it can verify that. The compiler cannot and does not know your intent; it only knows what you wrote, not what you meant to write.
 might see this as just re-introducing warnings through the back door.
Not the secretive back door; rather, the more obvious and well-signposted service entrance.
 Well, whatever. I'd be happy to call it a lint add-on if it did the job 
 and kept everyone happy.
An 'add-on' by its very nature is not 'default', so by renaming it we have 
defined behaviour, which I believe is the best way to handle this 
desire/situation.

Regan

-- 
Using M2, Opera's revolutionary e-mail client: http://www.opera.com/m2/
Jul 05 2004