
digitalmars.D - const, final, scope function parameters

reply Walter Bright <newshound1 digitalmars.com> writes:
It looks like making "const final scope" be the default for function 
parameters is going to be infeasible. The troubles are that:

1) It seems to knock a lot of people for a loop, who will be assuming 
that an undecorated name would be like an undecorated name for a local 
or global variable.

2) Having to turn off one of the const, final, or scope, introduces the 
need for some sort of "not" keyword, like mutable, !const, !final, etc. 
It comes off looking bizarre.

However, making "in" be equivalent to "const final scope" does seem to 
work fine, requires no new keywords, and doesn't seem to confuse anyone.

On a related note, "cstring" has received universal condemnation <g>, so 
I'll just have to make "string" work.
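As a sketch, the proposal reads like this in code (an editorial illustration, not the 2007 compiler's actual behavior; the exact enforcement rules were still being designed):

```d
// Illustration of "in" == "const final scope": the parameter is
// read-only (const), cannot be rebound (final), and no reference
// to it may escape the call (scope).
void process(in char[] text)
{
    // text[0] = 'x';  // would be an error: const contents
    // text = "other"; // would be an error: final binding
    assert(text.length == 5); // reading is always fine
}

void main()
{
    process("hello");
}
```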
May 26 2007
next sibling parent reply Chris Nicholson-Sauls <ibisbasenji gmail.com> writes:
Walter Bright wrote:
 It looks like making "const final scope" be the default for function 
 parameters is going to be infeasible. The troubles are that:
 
 1) It seems to knock a lot of people for a loop, who will be assuming 
 that an undecorated name would be like an undecorated name for a local 
 or global variable.
 
 2) Having to turn off one of the const, final, or scope, introduces the 
 need for some sort of "not" keyword, like mutable, !const, !final, etc. 
 It comes off looking bizarre.
 
 However, making "in" be equivalent to "const final scope" does seem to 
 work fine, requires no new keywords, and doesn't seem to confuse anyone.
I like the 'in' method myself.
 On a related note, "cstring" has received universal condemnation <g>, so 
   I'll just have to make "string" work.
Or maybe something like:

 alias const( char)[] utf8  ; // or even u8string
 alias const(wchar)[] utf16 ; // or even u16string
 alias const(dchar)[] utf32 ; // or even u32string

-- Chris Nicholson-Sauls
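In today's alias syntax the suggestion would be written as follows (a sketch; utf8/utf16/utf32 are the names proposed above, not actual library declarations):

```d
// Sketch of the proposed aliases, modern alias syntax:
alias utf8  = const(char)[];
alias utf16 = const(wchar)[];
alias utf32 = const(dchar)[];

void main()
{
    utf8 s = "hello";
    assert(s.length == 5);
    // s[0] = 'H'; // error: elements are const
}
```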
May 26 2007
next sibling parent Lars Ivar Igesund <larsivar igesund.net> writes:
Chris Nicholson-Sauls wrote:

 Walter Bright wrote:
 It looks like making "const final scope" be the default for function
 parameters is going to be infeasible. The troubles are that:
 
 1) It seems to knock a lot of people for a loop, who will be assuming
 that an undecorated name would be like an undecorated name for a local
 or global variable.
 
 2) Having to turn off one of the const, final, or scope, introduces the
 need for some sort of "not" keyword, like mutable, !const, !final, etc.
 It comes off looking bizarre.
 
 However, making "in" be equivalent to "const final scope" does seem to
 work fine, requires no new keywords, and doesn't seem to confuse anyone.
I like the 'in' method myself.
 On a related note, "cstring" has received universal condemnation <g>, so
   I'll just have to make "string" work.
Or maybe something like:

 alias const( char)[] utf8  ; // or even u8string
 alias const(wchar)[] utf16 ; // or even u16string
 alias const(dchar)[] utf32 ; // or even u32string

-- Chris Nicholson-Sauls
Hear hear :)

-- 
Lars Ivar Igesund
blog at http://larsivi.net
DSource, #d.tango & #D: larsivi
Dancing the Tango
May 27 2007
prev sibling next sibling parent reply "Jarrett Billingsley" <kb3ctd2 yahoo.com> writes:
"Chris Nicholson-Sauls" <ibisbasenji gmail.com> wrote in message 
news:f3b74k$bbk$2 digitalmars.com...
 Or maybe something like:
 alias const( char)[] utf8  ; // or even u8string
 alias const(wchar)[] utf16 ; // or even u16string
 alias const(dchar)[] utf32 ; // or even u32string
Yes, I really like those :)
May 27 2007
parent Traveler Hauptman <none none.com> writes:
Jarrett Billingsley wrote:
 "Chris Nicholson-Sauls" <ibisbasenji gmail.com> wrote in message 
 news:f3b74k$bbk$2 digitalmars.com...
 Or maybe something like:
 alias const( char)[] utf8  ; // or even u8string
 alias const(wchar)[] utf16 ; // or even u16string
 alias const(dchar)[] utf32 ; // or even u32string
Yes, I really like those :)
Sort of a newb question: why put stuff like this in the core "language"? Phobos can have one, Tango can have another, and other runtime libs can have whatever suits. Ignore this if that's what is being proposed and I misunderstood.
May 27 2007
prev sibling parent Charles D Hixson <charleshixsn earthlink.net> writes:
Chris Nicholson-Sauls wrote:
 Walter Bright wrote:
 ...
 On a related note, "cstring" has received universal condemnation <g>, 
 so   I'll just have to make "string" work.
Or maybe something like:

 alias const( char)[] utf8  ; // or even u8string
 alias const(wchar)[] utf16 ; // or even u16string
 alias const(dchar)[] utf32 ; // or even u32string

-- Chris Nicholson-Sauls
+1 on the short forms, -1 on the long forms. OTOH, this isn't that significant, as your implementation is quite short. :-) Still, if the names are to get used in library routines... then perhaps it is significant.
Oct 22 2007
prev sibling next sibling parent reply janderson <askme me.com> writes:
Walter Bright wrote:
 It looks like making "const final scope" be the default for function 
 parameters is going to be infeasible. The troubles are that:
 
 1) It seems to knock a lot of people for a loop, who will be assuming 
 that an undecorated name would be like an undecorated name for a local 
 or global variable.
I don't disagree here.
 
 2) Having to turn off one of the const, final, or scope, introduces the 
 need for some sort of "not" keyword, like mutable, !const, !final, etc. 
 It comes off looking bizarre.
On this point, couldn't it be something like, if you define const, final or scope then the default "const final scope" is removed? [snip]
May 27 2007
next sibling parent reply "Rioshin an'Harthen" <rharth75 hotmail.com> writes:
"janderson" <askme me.com> kirjoitti viestissä 
news:f3bijn$t1h$1 digitalmars.com...
 Walter Bright wrote:
 [snip]
 2) Having to turn off one of the const, final, or scope, introduces the 
 need for some sort of "not" keyword, like mutable, !const, !final, etc. 
 It comes off looking bizarre.
On this point, couldn't it be something like, if you define const, final or scope then the default "const final scope" is removed? [snip]
I find myself thinking that this might be a better way. So something akin to this (for in-only parameters):

 void foo(int bar)       - bar is const final scope
 void foo(in int bar)    - bar is normal
 void foo(const int bar) - bar is const
 void foo(final int bar) - bar is final
 void foo(scope int bar) - bar is scope

etc. with the combinations. Any specifier of {in|const|final|scope} cancels the const final scope default of the parameter; mixing "in" with "const", "final" or "scope" is no problem, since the only use for "in" is to cancel the default of const final scope, while any of those three cancels the default and toggles itself on. (Basically, in and no specifier swap places compared to Walter's suggestion.)
May 27 2007
parent reply Johan Granberg <lijat.meREM OVEgmail.com> writes:
Rioshin an'Harthen wrote:

 
 "janderson" <askme me.com> kirjoitti viestissä
 news:f3bijn$t1h$1 digitalmars.com...
 Walter Bright wrote:
 [snip]
 2) Having to turn off one of the const, final, or scope, introduces the
 need for some sort of "not" keyword, like mutable, !const, !final, etc.
 It comes off looking bizarre.
On this point, couldn't it be something like, if you define const, final or scope then the default "const final scope" is removed? [snip]
I find myself thinking that this might be a better way. So something akin to this (for in-only parameters):

 void foo(int bar)       - bar is const final scope
 void foo(in int bar)    - bar is normal
 void foo(const int bar) - bar is const
 void foo(final int bar) - bar is final
 void foo(scope int bar) - bar is scope

etc. with the combinations. Any specifier of {in|const|final|scope} cancels the const final scope default of the parameter; mixing "in" with "const", "final" or "scope" is no problem, since the only use for "in" is to cancel the default of const final scope, while any of those three cancels the default and toggles itself on. (Basically, in and no specifier swap places compared to Walter's suggestion.)
I like this one; basically it's safe by default and saves typing in the most common case. It would also avoid the problem I see in C++ sometimes, that people don't write const because it's more typing. Walter, please reconsider; const by default will be worth any initial hassle.
May 27 2007
next sibling parent reply Frank Benoit <keinfarbton googlemail.com> writes:
 I like this one, basically it's safe by default and saves typing in the most
 common case, it would also avoid the problem I see in c++ sometimes, that
 people don't write const because it's more typing. Walter please
 reconsider, const by default will be worth any initial hassle.
I still second that. With the safe default, the compiler will force the user to make the param mutable if needed. Without the safe default, it will not complain when mutable is not needed.
May 27 2007
parent Henning Hasemann <hhasemann web.de> writes:
On Sun, 27 May 2007 11:59:52 +0200
Frank Benoit <keinfarbton googlemail.com> wrote:

 
 I like this one, basically it's safe by default and saves typing in the most
 common case, it would also avoid the problem I see in c++ sometimes, that
 people don't write const because it's more typing. Walter please
 reconsider, const by default will be worth any initial hassle.
I still second that. With the safe default, the compiler will force the user to make the param mutable if needed.
Ack. Default to const seems reasonable to me too.

Henning

-- 
GPG Public Key: http://keyserver.veridis.com:11371/search?q=0x41911851
Fingerprint: 344F 4072 F038 BB9E B35D E6AB DDD6 D36D 4191 1851
May 27 2007
prev sibling parent gareis <dhasenan gmail.com> writes:
Johan Granberg wrote:
 Rioshin an'Harthen wrote:
 (Basically, in and no specifier swap places compared to Walter's
 suggestion.)
I like this one, basically it's safe by default and saves typing in the most common case, it would also avoid the problem I see in c++ sometimes, that people don't write const because it's more typing. Walter please reconsider, const by default will be worth any initial hassle.
I agree -- as long as there's good documentation on the meaning of each, the worst I have to deal with is an occasional compiler warning and a quick look at the docs. It also makes everything very consistent in that no changes get propagated unless I specify that they should be. I was one of the people getting confused, and that was mostly over the definitions of scope, const, and final. Once I have a compiler yelling at me, I can either find out with a quick test or, if the compiler warnings are good, just change the attributes as the warning says I should. Or I could just, y'know, read the documentation.
May 27 2007
prev sibling next sibling parent reply Walter Bright <newshound1 digitalmars.com> writes:
janderson wrote:
 On this point, couldn't it be something like, if you define const, final 
 or scope then the default "const final scope" is removed?
That was my thought, too, but that just confused things for others even more.
May 27 2007
parent janderson <askme me.com> writes:
Walter Bright wrote:
 janderson wrote:
 On this point, couldn't it be something like, if you define const, 
 final or scope then the default "const final scope" is removed?
That was my thought, too, but that just confused things for others even more.
"in" seems reasonable to me. Imagine if you could require either "in", or const/final/scope, in the definition, or the compiler would complain.

-Joel
May 27 2007
prev sibling parent reply Myron Alexander <someone somewhere.com> writes:
janderson wrote:
 Walter Bright wrote:
 2) Having to turn off one of the const, final, or scope, introduces 
 the need for some sort of "not" keyword, like mutable, !const, !final, 
 etc. It comes off looking bizarre.
On this point, couldn't it be something like, if you define const, final or scope then the default "const final scope" is removed? [snip]
I agree with this suggestion. I always insist on the most restrictions possible, relaxing restrictions on a case-by-case basis. This is what I call tight code :)

I understand scope within the function body, but I do not understand scope for parameters. How does scope affect class references, arrays, and primitives (like int)?

Regards,
Myron.
May 27 2007
parent reply Regan Heath <regan netmail.co.nz> writes:
Myron Alexander Wrote:
 janderson wrote:
 Walter Bright wrote:
 2) Having to turn off one of the const, final, or scope, introduces 
 the need for some sort of "not" keyword, like mutable, !const, !final, 
 etc. It comes off looking bizarre.
On this point, couldn't it be something like, if you define const, final or scope then the default "const final scope" is removed? [snip]
I agree with this suggestion. I always insist on the most restrictions possible, relaxing restrictions on a case-by-case basis. This is what I call tight code :)

I understand scope within the function body, but I do not understand scope for parameters. How does scope affect class references, arrays, and primitives (like int)?
It's my understanding that if you have:

 class A { int a; }

 void foo(A aa, char[] bb, int cc) {}

 void main()
 {
   A a = new A();
   char[] b = "b";
   int c = 1;
   foo(a, b, c);
 }

when you call 'foo' you get a copy of a, b, and c called aa, bb, and cc. These copies are naturally 'scope' because they only exist in the function scope, after which they vanish, because they are copies on the stack.

Therefore having a foo like this would be illegal:

 //variables in a higher scope
 A* pa;
 char[]* pb;
 int* pc;

 void foo(A aa, char[] bb, int cc)
 {
   pa = &aa;
   pb = &bb;
   pc = &cc;
 }

you would be violating 'scope' and should get errors. Likewise, returning the address of one of these would be illegal:

 A* foo(A aa) { return &aa; }

Because classes and arrays are reference types and int is a value type, I believe this code is legal and does not violate scope:

 //variables in a higher scope
 A* aaa;
 char[]* bbb;
 int* ccc;

 void foo(A aa, char[] bb, int cc)
 {
   aaa = aa;
   bbb = bb;
   ccc = cc;
 }

I think I may have muddied the waters with bad examples prior to this point.

Regan Heath
May 27 2007
parent reply Regan Heath <regan netmail.co.nz> writes:
Regan Heath Wrote:
 Because classes and arrays are reference types and int is a value type I
believe this code is legal and does not violate scope.
<snip typo!>
 void foo(A aa, char[] bb, int cc)
 {
   aaa = aa;
   bbb = bb;
   ccc = cc;
 }
Gah! Sorry, typo there; try these:

 //variables in a higher scope
 A aaa;
 char[] bbb;
 int ccc;

Regan
May 27 2007
parent Myron Alexander <someone somewhere.com> writes:
Regan Heath wrote:
 Regan Heath Wrote:
 Because classes and arrays are reference types and int is a value type I
believe this code is legal and does not violate scope.
<snip typo!>
 void foo(A aa, char[] bb, int cc)
 {
   aaa = aa;
   bbb = bb;
   ccc = cc;
 }
Gah! Sorry, typo there; try these:

 //variables in a higher scope
 A aaa;
 char[] bbb;
 int ccc;

Regan
Thanks Regan. Makes more sense now.

Regards,
Myron.
May 27 2007
prev sibling next sibling parent reply Dave <Dave_member pathlink.com> writes:
Walter Bright wrote:
 It looks like making "const final scope" be the default for function 
 parameters is going to be infeasible. The troubles are that:
 
 1) It seems to knock a lot of people for a loop, who will be assuming 
 that an undecorated name would be like an undecorated name for a local 
 or global variable.
 
I can understand that concern, but who have you been bouncing the beta off of ("It seems to knock people for a loop")?

It seems that over in d.D.announce the response was the opposite (IIRC, most were in favor of 'in' by default, at least to try with 2.0 to start off with).

That said, I have a nagging suspicion you'd be right about the most likely people to try D.
 2) Having to turn off one of the const, final, or scope, introduces the 
 need for some sort of "not" keyword, like mutable, !const, !final, etc. 
 It comes off looking bizarre.
 
Already suggested, but maybe consider making 'in' mean what it does in 1.x, undecorated mean 'final const scope', and any other specifier(s) override the default group (const would be just const, etc.)?
 However, making "in" be equivalent to "const final scope" does seem to 
 work fine, requires no new keywords, and doesn't seem to confuse anyone.
 
 On a related note, "cstring" has received universal condemnation <g>, so 
   I'll just have to make "string" work.
May 27 2007
parent Walter Bright <newshound1 digitalmars.com> writes:
Dave wrote:
 Walter Bright wrote:
 It looks like making "const final scope" be the default for function 
 parameters is going to be infeasible. The troubles are that:

 1) It seems to knock a lot of people for a loop, who will be assuming 
 that an undecorated name would be like an undecorated name for a local 
 or global variable.
I can understand that concern, but who've you been bouncing the beta off of ("It seems to knock people for a loop")?
C++ people. I regularly go to the nwcpp meetings, and we like to talk about D afterwards <g>. The nwcpp people are experienced C++ programmers, and a lot of them are opinion leaders (for example, a number of them have regular C++ articles and papers published).
 It seems that over in d.D.announce the response was the opposite (IIRC, 
 most were in favor of 'in' by default, at least to try with 2.0 to start 
 off with).
 
 That said, I have a nagging suspicion you'd be right for the most likely 
 people to try D.
Yup. First impressions count, and C++ people's first impressions of that were just universally bad. After some explaining, they understood what was going on and the rationale, but still thought the confusion just wasn't worth it.

BTW, the next D compiler will be an 'alpha' with this stuff in it, mainly to try these things out and see how they work in practice. If it just isn't going to work, we'll try to fix it.
May 27 2007
prev sibling next sibling parent reply Frank Benoit <keinfarbton googlemail.com> writes:
Perhaps we look at it from the wrong side.

If we want to change the D language to make it more const, the keywords
'const', 'invariant'... are probably the wrong choice.

How about restricting keywords and adding their opposites: 'mutable',
'once' (write once), and then making every variable declaration const by
default? Each variable/parameter needs to be made modifiable with
modifiers if needed.
May 27 2007
next sibling parent Frank Benoit <keinfarbton googlemail.com> writes:
 How about restricting keywords
How about /removing/ restricting keywords
May 27 2007
prev sibling parent reply Walter Bright <newshound1 digitalmars.com> writes:
Frank Benoit wrote:
 Perhaps we look at it from the wrong side.
 
 If we want to change the D language to make it more const, the keywords
 'const', 'invariant'... are probably the wrong choice.
 
 How about restricting keywords and add their opposites: 'mutable',
 'once' (write once) and then make every variable declaration const by
 default? Each variable/parameter needs to be made modifyable with
 modifiers if needed.
I think having to write:

 mutable int x;

instead of:

 int x;

just isn't going to please people.
May 27 2007
next sibling parent James Dennett <jdennett acm.org> writes:
Walter Bright wrote:
 Frank Benoit wrote:
 Perhaps we look at it from the wrong side.

 If we want to change the D language to make it more const, the keywords
 'const', 'invariant'... are probably the wrong choice.

 How about restricting keywords and add their opposites: 'mutable',
 'once' (write once) and then make every variable declaration const by
 default? Each variable/parameter needs to be made modifyable with
 modifiers if needed.
I think having to write:

 mutable int x;

instead of:

 int x;

just isn't going to please people.
Maybe not. It would please me; in C++ right now, I usually have to write "int const x = ...;" whereas if the default were the "safe" form I could write just "int x = ...;". (As usual, "int" is just an example, of course.) Writing

 var int x;

would be just fine by me; I find it more readable than using "mutable" (and the C++ community already has a similar-but-different meaning for mutable, as you know, so using a different term might be helpful).

It's often been said that if C++ were being designed from a clean start, const would be the default. D has had that clean start -- and has made various changes that C++ would make but cannot for backwards compatibility reasons. It would be nice to make some more steps in the right direction while D still has a _relatively_ small existing user base and code base.

(One area where C++ and D are going in different directions is default definitions for special member functions; most of those involved in C++ would seemingly like fewer defined by default, whereas, if I remember, D tends to define more, such as memberwise comparison. C++ defined those that it has implicitly largely for C compatibility. The best option, currently under discussion for C++, seems to be to allow users to explicitly request that normal forms of certain operations be provided or excluded. But I digress.)

-- James
May 27 2007
prev sibling parent reply Regan Heath <regan netmail.co.nz> writes:
Walter Bright Wrote:
 Frank Benoit wrote:
 Perhaps we look at it from the wrong side.
 
 If we want to change the D language to make it more const, the keywords
 'const', 'invariant'... are probably the wrong choice.
 
 How about restricting keywords and add their opposites: 'mutable',
 'once' (write once) and then make every variable declaration const by
 default? Each variable/parameter needs to be made modifyable with
 modifiers if needed.
I think having to write:

 mutable int x;

instead of:

 int x;

just isn't going to please people.
Ahhh, I think I see what you're concerned about. As in this example?

 mutable int gx;

 void foo(int y) {  //y is scope const final
   mutable int z;
 }

where the global and local scope ints 'gx' and 'z' are not supposed to be const scope final.

Why can't we apply 'scope const final' to function parameters only? In fact, that's what I was proposing when I said implicit 'in' should be 'const scope final'. Global and local scope variables are not 'in', therefore they are not 'const scope final'.

Regan Heath
May 27 2007
next sibling parent reply Walter Bright <newshound1 digitalmars.com> writes:
Regan Heath wrote:
 Ahhh, I think I see what you're concerned about.  As in this example?
 
 mutable int gx;
 void foo(int y) {  //y is scope const final
   mutable int z;
 }
Yes. Even I wouldn't use such a language :-(
 Why can't we apply 'scope const final' to function parameters only?
Because it knocks people for a loop - gives a bad first impression.
May 27 2007
parent reply Regan Heath <regan netmail.co.nz> writes:
Walter Bright Wrote:
 Regan Heath wrote:
 Ahhh, I think I see what you're concerned about.  As in this example?
 
 mutable int gx;
 void foo(int y) {  //y is scope const final
   mutable int z;
 }
Yes. Even I wouldn't use such a language :-(
I agree.
 Why can't we apply 'scope const final' to function parameters only?
Because it knocks people for a loop - gives a bad first impression.
Really? Have these people tried using it, in the beta itself? Or have you just explained it to them? I would hope that once someone actually uses it, it would come quite naturally.

The hope is that these automatic restrictions will prevent bad programming practices. Assuming they do, does that mean these people like to use bad programming practices? Or maybe we're preventing things which aren't bad programming practices? If so, what?

As this is going to be a 'beta', can we just give it a go anyway? I mean, once people start using it we can get concrete examples of where it doesn't work, or is a hindrance, or whatever. I know "having things work the way you'd expect them to" is something you want for D, but surely there are ingrained but 'bad' expectations, which we should really contradict in as definite a fashion as possible in order to evolve.

Regan Heath
May 27 2007
next sibling parent reply Walter Bright <newshound1 digitalmars.com> writes:
Regan Heath wrote:
 Walter Bright Wrote:
 Why can't we apply 'scope const final' to function parameters only?
Because it knocks people for a loop - gives a bad first impression.
Have these people tried using it, in the beta itself?
No. But you can't get people to try something they take an instant dislike to.
May 27 2007
next sibling parent reply Myron Alexander <someone somewhere.com> writes:
James Dennett wrote:
 Walter Bright wrote:
 I think having to write:

     mutable int x;

 instead of:

     int x;

 just isn't going to please people.
Maybe not. It would please me; in C++ right now, I usually have to write "int const x = ...;" whereas if the default were the "safe" form I could write just "int x = ...;". (As usual, "int" is just an example, of course.) Writing var int x; would be just fine by me; I find it more readable than using "mutable" (and the C++ community already has a similar-but-different meaning for mutable, as you know, so using a different term might be helpful).
James,

Having const default for all variable declarations would be problematic in that most variables are expected to be mutable. The whole point of variables is that they are variable :)

I think the one exception is strings, in that the D implementation is an array but conceptually there is a difference of opinion over whether strings are buffers or values. I am firmly on the "string is a value" bench and have my wet trout ready to go slap the "other" side when the mother of all wars begins :) (Actually, I have a nuke-lee-are trout that is const, non-static, and very final ;))

Walter Bright wrote:
 Regan Heath wrote:
 Walter Bright Wrote:
 Why can't we apply 'scope const final' to function parameters only?
Because it knocks people for a loop - gives a bad first impression.
Have these people tried using it, in the beta itself?
No. But you can't get people to try something they take an instant dislike to.
I am of the opinion that function parameters that are, by default, "scope const final" will improve code quality. When coding (even in Python and Java), I use that style. My reasoning is that values passed in are part of the interface state, and that rebinding or modifying them can lead to confusion or errors.

A quick example that actually happened to me: two people working on the same code needed to modify a method. The method was quite long, so when the other programmer added some code in the middle that modified the parameter value before I used it, there was a problem on merge. The other person did not need to modify the actual parameter value but was too lazy to assign it to a temp variable, hence the problem. (BTW, this is not how I work; it was the team leader's idea of "collaboration". We weren't even using CVS; the merge was manual.) I have seen similar bugs on more than one occasion.

BTW, if I declare a class parameter to be const, what happens if I try to call a method that will change the class state?

Regards,
Myron.
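For the closing question, a sketch of how D2-style const answers it (an editorial illustration; Counter, bump, and get are hypothetical names, not from the thread):

```d
// Calling a mutating method through a const reference is rejected
// at compile time; only methods marked const remain callable.
class Counter
{
    int n;
    void bump() { ++n; }          // mutates state: not callable on const
    int get() const { return n; } // const method: promises no mutation
}

void inspect(const Counter c)
{
    // c.bump(); // error: bump() is not const
    assert(c.get() == 1); // const methods are still callable
}

void main()
{
    auto c = new Counter;
    c.bump();
    inspect(c);
}
```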
May 27 2007
parent reply James Dennett <jdennett acm.org> writes:
Myron Alexander wrote:
 James Dennett wrote:
 Walter Bright wrote:
 I think having to write:

     mutable int x;

 instead of:

     int x;

 just isn't going to please people.
Maybe not. It would please me; in C++ right now, I usually have to write "int const x = ...;" whereas if the default were the "safe" form I could write just "int x = ...;". (As usual, "int" is just an example, of course.) Writing var int x; would be just fine by me; I find it more readable than using "mutable" (and the C++ community already has a similar-but-different meaning for mutable, as you know, so using a different term might be helpful).
James, Having const default for all variable declarations would be problematic in that most variables are expected to be mutable. The whole point of variables is that they are variable :)
I disagree! Experience (from C++) has shown me that a large proportion of "variables" are not changed after their initialization. I find that it's true of most local variables (including function parameters), and using immutable objects with reference semantics is also fairly common.

The advantages in compiler checking mean that even if there is a (sufficiently small) additional cost in having the default be immutability, my experience strongly suggests that it would be worthwhile. As I've said, I also find that the cost would be small, given how many "variables" aren't variable in clean code. (Who knows, maybe this default could even encourage people to use new variables when they have a different value with a different meaning, rather than reusing a "convenient" local variable which was used for something else. But now I'm dreaming.)
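A minimal D sketch of that pattern (my illustration; the examples under discussion were C++): locals that are bound once and only read afterwards can be declared that way, and the compiler then enforces it.

```d
void main()
{
    const greeting = "hello";  // bound once, read thereafter
    const n = greeting.length; // derived value, never reassigned
    immutable limit = 10;      // a true constant

    // greeting = "bye"; // error: cannot modify const
    assert(n == 5 && n < limit);
}
```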
 I think the one exception is strings in that the D implementation is an
 array but conceptually, there is a difference of opinion for whether
 strings are buffers or values. I am firmly on the "string is a value"
 bench and have my wet trout ready to go slap the "other" side when the
 mother of all wars begins :) (Actually, I have a nuke-lee-are trout that
 is const, non-static, and very final ;))
My understanding is that in D, arrays (almost) have reference semantics. It's a shame, IMO, but languages have to make choices and there's no way one language is going to make them all the way I would unless I design it myself (and I don't believe that I could successfully bring a language to a mass audience as Walter is attempting to do). -- James
May 27 2007
next sibling parent "David B. Held" <dheld codelogicconsulting.com> writes:
James Dennett wrote:
 Myron Alexander wrote:
 [...]
 I think the one exception is strings in that the D implementation is an
 array but conceptually, there is a difference of opinion for whether
 strings are buffers or values. I am firmly on the "string is a value"
 bench and have my wet trout ready to go slap the "other" side when the
 mother of all wars begins :) (Actually, I have a nuke-lee-are trout that
 is const, non-static, and very final ;))
My understanding is that in D, arrays (almost) have reference semantics. It's a shame, IMO, but languages have to make choices and there's no way one language is going to make them all the way I would unless I design it myself (and I don't believe that I could successfully bring a language to a mass audience as Walter is attempting to do).
Actually, the intent is for string literals to be invariant arrays, which means that they will be values implemented as reference types. ;) Hey, Java does it and it works just fine!

Walter chose to make D's arrays be references for performance reasons. It is still possible to create C-style arrays with value semantics using pointers and structs (almost, as soon as structs get fixed). However, since D relies heavily on arrays, it was important to make them fast by default (and passing arrays more than about 16 bytes large is almost certainly faster by reference than by value).

I've seen plenty of C++ noobs (and experienced coders) passing around std::vector<> by value out of laziness or ignorance or both. At least users can't make that mistake in D.

Dave
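The reference semantics mentioned above can be seen in a small program (my sketch of the behavior, not from the original post):

```d
// A D array variable is a small (length, pointer) slice, so
// passing or assigning one shares the underlying data rather
// than copying the elements.
void fill(int[] a)
{
    a[0] = 42; // writes through to the caller's elements
}

void main()
{
    int[] x = [1, 2, 3];
    fill(x);
    assert(x[0] == 42); // the caller observes the change

    int[] y = x; // copies only the slice header
    y[1] = 99;
    assert(x[1] == 99); // same underlying data
}
```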
May 28 2007
prev sibling parent reply Myron Alexander <someone somewhere.com> writes:
James Dennett wrote:
 
 I disagree!  Experience (from C++) has shown me that a
 large proportion of "variables" are not changed after
 their initialization.  I find that it's true of most
 local variables (including function parameters), and
 using immutable objects with reference semantics is
 also fairly common.
 
Would you please post some examples? My experience in C, C++, and Java is different from yours. I'm not saying you are exactly wrong; I understand where you are coming from, especially with class instance references, but my way of coding has more mutable variables than constants. I think it is a matter of programming style, and I don't mind writing "final Xx x = new Xx();", which I do often in Java.
 The advantages in compiler checking mean that even if
 there is a (sufficiently small) additional cost in
 having the default be immutability, my experience
 strongly suggests that it would be worthwhile.  As
 I've said, I also find that the cost would be small,
 given how many "variables" aren't variable in clean
 code.  (Who knows, maybe this default could even
 encourage people to use new variables when they have
 a different value with a different meaning, rather
 than reusing a "convenient" local variable which was
 used for something else.  But now I'm dreaming.)
Once again, please post an example. I am curious to see your code style; maybe I can learn from it.

Regards,
Myron.
May 28 2007
next sibling parent Derek Parnell <derek psych.ward> writes:
On Mon, 28 May 2007 13:05:47 +0200, Myron Alexander wrote:

 
 Once again, please post an example. I am curious to see your code style, 
 maybe I can learn from it.
I have a style in which the program gathers a lot of its "literals" from optional data supplied to it at run time. In other words, I prefer to write highly user-customisable applications. This has the effect that a lot of variables are set to default values and then overridden as nominated by the user running the program. This makes using 'final' and 'invariant' problematic, I think. I might have to change my style to something like ...

string[string] UserOptions;
UserOptions = CollectUserOptions();

// Move them to final vars for performance reasons.
final string OptionOne = UserOptions["One"].dup;
final string OptionTwo = UserOptions["Two"].dup;
UserOptions = null; // get rid of mutable stuff.
. . .
// From here on the options can't be changed.

string[string] CollectUserOptions()
{
    string[string] temp;

    // Set default values.
    temp["One"] = "aaaa";
    temp["Two"] = "bbbb";

    // Who knows where these are collected from,
    // as it doesn't matter for this example.
    if (<option one supplied>) temp["One"] = <suppliedvalue>;
    if (<option two supplied>) temp["Two"] = <suppliedvalue>;

    return temp;
}

-- 
Derek Parnell
Melbourne, Australia
"Justice for David Hicks!"
skype: derek.j.parnell
May 28 2007
prev sibling parent reply Frank Benoit <keinfarbton googlemail.com> writes:
 Would you please post some example. My experience in C, C++, and Java is
 different from yours. I'm not saying you are exactly wrong, I understand
 where you are coming from, especially with class instance references but
 my way of coding has more mutable variables than constants. I think it
 is a matter of program style and I don't mind writing "final Xx x = new
 Xx ();" which I do often in Java.
 
The idea of a safe default is that, if you don't care, the compiler raises an error if you change the value after the initial assignment. The compiler can only detect a missing "mutable" (or call it 'var'); there is no way the compiler can detect a missing "const". And that is the advantage of const by default. So I think it is not a matter of "I use more const than non-const" or vice versa; it is an issue of making mutable things explicit.

An example I see here:

foreach( m; myCollection ){
  C c = m.getC();
  c.doA();
  c.doB();
  C c2 = m.getTheOtherC();
}

m is const, because its life cycle starts and ends with each iteration. Most of my object references are initialized once; I avoid reusing reference variables. Object member variables that are not initialized in the ctor are worth marking as mutable, because they are the complicated states in the object, which need special care.
May 28 2007
parent reply Myron Alexander <someone somewhere.com> writes:
Frank Benoit wrote:
 
 foreach( m; myCollection ){
   C c = m.getC();
   c.doA();
   c.doB();
   C c2 = m.getTheOtherC();
 }
 
I think that, as defined before, immutable would mean that the doA, doB calls would fail unless they do not affect state as well. What you seem to be talking about is "final", which is that the reference is immutable but the instance is not. If I am right, then your example would have to be rewritten as such:

foreach( m; myCollection ){
  var C c = m.getC();
  c.doA();
  c.doB();
  var C c2 = m.getTheOtherC();
}

but then you could update the value of "c" to point to another reference. Thus the semantic is too coarse. We are back to having const, final, invariant. In Walter's proposal:

foreach( m; myCollection ){
  final C c = m.getC();
  c.doA();
  c.doB();
  final C c2 = m.getTheOtherC();
}

would do what you want. So are you suggesting that final be the default storage attribute?

Regards,
Myron.
May 28 2007
parent Frank Benoit <keinfarbton googlemail.com> writes:
 So are you suggesting that final be the default
 storage attribute?
Oh, so I have a wrong understanding of the discussion; sorry about that. I need to read more.
May 28 2007
prev sibling parent Regan Heath <regan netmail.co.nz> writes:
Walter Bright Wrote:
 Regan Heath wrote:
 Walter Bright Wrote:
 Why can't we apply 'scope const final' to function parameters only?
Because it knocks people for a loop - gives a bad first impression.
Have these people tried using it, in the beta itself?
No. But you can't get people to try something they take an instant dislike to.
I don't want to offend these people you're referring to, but I find their attitude quite .. how to put this .. closed-minded, and suggest you find people more willing to try new things .. like me! And basically everyone here in the NG. After all, D is a new thing (relatively speaking), so everyone here is willing to try new things. And I think the general consensus is that we're all willing to try this new 'const scope final' by default thing and see how it goes.

Regan Heath
May 28 2007
prev sibling parent Georg Wrede <georg nospam.org> writes:
Regan Heath wrote:
 Walter Bright Wrote:
 Regan Heath wrote:
 
 Why can't we apply 'scope const final' to function parameters
 only?
Because it knocks people for a loop - gives a bad first impression.
Really? Have these people tried using it, in the beta itself? Or have you just explained it to them? I would hope that once someone actually uses it, it would come quite naturally.
I agree. As a matter of fact, I originally took it for granted.
May 28 2007
prev sibling parent reply Bill Baxter <dnewsgroup billbaxter.com> writes:
Regan Heath wrote:
 Walter Bright Wrote:
 Frank Benoit wrote:
 Perhaps we look at it from the wrong side.

 If we want to change the D language to make it more const, the keywords
 'const', 'invariant'... are probably the wrong choice.

 How about restricting keywords and add their opposites: 'mutable',
 'once' (write once) and then make every variable declaration const by
 default? Each variable/parameter needs to be made modifyable with
 modifiers if needed.
I think having to write:

   mutable int x;

instead of:

   int x;

just isn't going to please people.
Ahhh, I think I see what you're concerned about. As in this example?

mutable int gx;

void foo(int y) { //y is scope const final
    mutable int z;
}

where the global and local scope ints 'gx' and 'z' are not supposed to be const scope final. Why can't we apply 'scope const final' to function parameters only?
Because he said "that seems to throw people for a loop." I'm guessing the qualm is that in this: int* gx; void foo(int* y) { int* z; } all three of those declarations sure look the same to an uninitiated C++ (or current D) user. So I think by "throws people for a loop", Walter means "isn't obvious to C++ converts". But well... all I have to say to that is why would you expect any form of const by default to look natural to a C++ user? It's just not going to. But the hope is they'll thank you for it in the long run. --bb
May 27 2007
parent reply Reiner Pope <some address.com> writes:
Bill Baxter wrote:
 Regan Heath wrote:
 Walter Bright Wrote:
 Frank Benoit wrote:
 Perhaps we look at it from the wrong side.

 If we want to change the D language to make it more const, the keywords
 'const', 'invariant'... are probably the wrong choice.

 How about restricting keywords and add their opposites: 'mutable',
 'once' (write once) and then make every variable declaration const by
 default? Each variable/parameter needs to be made modifyable with
 modifiers if needed.
I think having to write: mutable int x; instead of: int x; just isn't going to please people.
Ahhh, I think I see what you're concerned about. As in this example? mutable int gx; void foo(int y) { //y is scope const final mutable int z; } where the global and local scope ints 'gx' and 'z' are not supposed to be const scope final. Why can't we apply 'scope const final' to function parameters only?
Because he said "that seems to throw people for a loop." I'm guessing the qualm is that in this: int* gx; void foo(int* y) { int* z; } all three of those declarations sure look the same to an uninitiated C++ (or current D) user. So I think by "throws people for a loop", Walter means "isn't obvious to C++ converts". But well... all I have to say to that is why would you expect any form of const by default to look natural to a C++ user? It's just not going to. But the hope is they'll thank you for it in the long run. --bb
I don't think it's a rule you're likely to forget, either, because it makes sense and can be concisely phrased: "because the safest and most common way to deal with function parameters is by not modifying or retaining them outside the function scope, function parameters default to 'const scope final'" Also, const won't affect the behaviour of your code, so the only place you will run into difficulties is when trying to compile bad code. In that case, we hope the compiler can give a nice error message. :) -- Reiner
May 27 2007
parent Derek Parnell <derek psych.ward> writes:
On Mon, 28 May 2007 07:43:12 +1000, Reiner Pope wrote:


 In that case, we hope the compiler can give a nice error message. :)
And start another new trend? <G> I like it!

-- 
Derek Parnell
Melbourne, Australia
"Justice for David Hicks!"
skype: derek.j.parnell
May 27 2007
prev sibling next sibling parent Henning Hasemann <hhasemann web.de> writes:
On Sat, 26 May 2007 22:35:30 -0700
Walter Bright <newshound1 digitalmars.com> wrote:

 2) Having to turn off one of the const, final, or scope, introduces the 
 need for some sort of "not" keyword, like mutable, !const, !final, etc. 
 It comes off looking bizarre.
Maybe noconst, nofinal, etc. might be a better idea, at the cost of introducing extra keywords.

Henning

-- 
GPG Public Key: http://keyserver.veridis.com:11371/search?q=0x41911851
Fingerprint: 344F 4072 F038 BB9E B35D E6AB DDD6 D36D 4191 1851
May 27 2007
prev sibling next sibling parent reply Jason House <jason.james.house gmail.com> writes:
For those of us who haven't read all the threads on this stuff... Is 
there a page I can go to and read about the planned changes to D for 
this stuff?

I guess I wonder which combinations of qualifiers would make sense as 
in, out, and inout parameters.  I'd then try to figure out how one would 
write out any of the variations and try to make the most common ones as 
short and understandable as possible.


Walter Bright wrote:
 It looks like making "const final scope" be the default for function 
 parameters is going to be infeasible. The troubles are that:
 
 1) It seems to knock a lot of people for a loop, who will be assuming 
 that an undecorated name would be like an undecorated name for a local 
 or global variable.
 
 2) Having to turn off one of the const, final, or scope, introduces the 
 need for some sort of "not" keyword, like mutable, !const, !final, etc. 
 It comes off looking bizarre.
 
 However, making "in" be equivalent to "const final scope" does seem to 
 work fine, requires no new keywords, and doesn't seem to confuse anyone.
 
 On a related note, "cstring" has received universal condemnation <g>, so 
   I'll just have to make "string" work.
May 27 2007
parent Traveler Hauptman <none none.com> writes:
Yes please! An explanation page for those of us that haven't been
following this for the last year, or don't code in D yet.


Jason House wrote:
 For those of us who haven't read all the threads on this stuff... Is
 there a page I can go to and read about the planned changes to D for
 this stuff?
 
 I guess I wonder which combinations of qualifiers would make sense as
 in, out, and inout parameters.  I'd then try to figure out how one would
 write out any of the variations and try to make the most common ones as
 short and understandable as possible.
 
 
 Walter Bright wrote:
 It looks like making "const final scope" be the default for function
 parameters is going to be infeasible. The troubles are that:

 1) It seems to knock a lot of people for a loop, who will be assuming
 that an undecorated name would be like an undecorated name for a local
 or global variable.

 2) Having to turn off one of the const, final, or scope, introduces
 the need for some sort of "not" keyword, like mutable, !const, !final,
 etc. It comes off looking bizarre.

 However, making "in" be equivalent to "const final scope" does seem to
 work fine, requires no new keywords, and doesn't seem to confuse anyone.

 On a related note, "cstring" has received universal condemnation <g>,
 so   I'll just have to make "string" work.
May 27 2007
prev sibling next sibling parent Regan Heath <regan netmail.co.nz> writes:
Walter Bright Wrote:
 It looks like making "const final scope" be the default for function 
 parameters is going to be infeasible. The troubles are that:
 
 1) It seems to knock a lot of people for a loop, who will be assuming 
 that an undecorated name would be like an undecorated name for a local 
 or global variable.
So why not make globals 'const scope final' also, 'scope' in this case being the global program scope? Many globals are initialised and then simply read from; the rest would be marked 'mutable', which would indicate a variable that could change and, in the case of multithreaded apps, clearly indicate a variable that needed locking/protection.
 2) Having to turn off one of the const, final, or scope, introduces the 
 need for some sort of "not" keyword, like mutable, !const, !final, etc. 
 It comes off looking bizarre.
Using 'mutable' where you want a mutable parameter/global doesn't look bizarre to me. Do you have a specific example in mind where it looks bizarre?
 However, making "in" be equivalent to "const final scope" does seem to 
 work fine, requires no new keywords, and doesn't seem to confuse anyone.
Who are you testing it on? Are they a) long time D users or b) new users to D, perhaps long time C++ users? c) some other group?
 On a related note, "cstring" has received universal condemnation <g>, so 
    I'll just have to make "string" work.
Excellent. Regan Heath
May 27 2007
prev sibling next sibling parent reply Robert Fraser <fraserofthenight gmail.coim> writes:
Heh; this is quite the interesting thread, though I seem to feel quite
different than most of you. I agree most function parameters should be final,
but const, and especially scope, just seem very wrong to me to be a default.
Maybe it's my Java background, but there are quite a few times I want users to
pass in reference to classes that implement a particular interface, or
otherwise have some sort of "setter" function for reference types, and having
to rewrite all my existing code to somehow allow that just seems like a total
waste.

Also, think about all the copying that would have to be made if scope was the
default - if I was passing in references to big objects into a collection, I
sure as hell wouldn't want a copy of each one made... one of D's advantages is
runtime performance. I know that scope could be turned off, but if on was the
default, a lot of coders would be more likely to use it, so I'd be much more
worried about using someone else's code for a performance-critical task.

In an ideal world, most functions would indeed be "pure" functions, and if your
code has a lot of math going on or something, I can surely see that being the
case. However, real code is ugly, and programmers do break rules, usually
knowing full-well what we're doing. Reassigning a parameter does occasionally
introduce bugs, but it's a useful feature that we've come to accept and use
when it should be used, and avoid when it shouldn't be. I'm willing to put "in"
on API functions, and functions I expect others to use, but for private methods
and internal functions, let me code it how I want, and how I expect it to work,
without having to think about what I'm going to do with every parameter.

Eek, on re-reading, that came off a bit combative.... Sorry... I'll retreat to
my Virtual-Machine-managed world, and let you systems-level coders duke this
one out.

All the best,
Fraser

Walter Bright Wrote:

 It looks like making "const final scope" be the default for function 
 parameters is going to be infeasible. The troubles are that:
 
 1) It seems to knock a lot of people for a loop, who will be assuming 
 that an undecorated name would be like an undecorated name for a local 
 or global variable.
 
 2) Having to turn off one of the const, final, or scope, introduces the 
 need for some sort of "not" keyword, like mutable, !const, !final, etc. 
 It comes off looking bizarre.
 
 However, making "in" be equivalent to "const final scope" does seem to 
 work fine, requires no new keywords, and doesn't seem to confuse anyone.
 
 On a related note, "cstring" has received universal condemnation <g>, so 
    I'll just have to make "string" work.
May 28 2007
next sibling parent Myron Alexander <someone somewhere.com> writes:
Robert Fraser wrote:
 Heh; this is quite the interesting thread, though I seem to feel
 quite different than most of you. I agree most function parameters
 should be final, but const, and especially scope, just seem very
 wrong to me to be a default. Maybe it's my Java background, but there
 are quite a few times I want users to pass in reference to classes
 that implement a particular interface, or otherwise have some sort of
 "setter" function for reference types, and having to rewrite all my
 existing code to somehow allow that just seems like a total waste.
 
 Also, think about all the copying that would have to be made if scope
 was the default - if I was passing in references to big objects into
 a collection, I sure as hell wouldn't want a copy of each one made...
 one of D's advantages is runtime performance. I know that scope could
 be turned off, but if on was the default, a lot of coders would be
 more likely to use it, so I'd be much more worried about using
 someone else's code for a performance-critical task.
I have changed my mind and now think that Walter is right. I originally thought that const meant invariant, but now know that you can rebind a const reference to a non-const variable and effect changes. In my original assumption, a const value would not incur a performance hit with scope, as the optimizer could make assumptions and no copy would be necessary. I now know this to be incorrect.

Even though I prefer all parameters to be, at least, final, this would mean I am imposing my methodology on others, and D is not supposed to be about enforcing a religion ;) I guess I am going to have to accept that there will be libraries/code that are almost impossible to read, or that have undocumented/unintended side-effects. This is something that Darwinian forces will have to sort out. This was the reason I left C/C++ and joined the Java circus; well, that and I hate #define, #import :)

On a side note, I originally considered D because I saw C+=1 done right, and with modules instead of source import. Of course, since then, the other lush goodies have their warm place in me heart.
 
 In an ideal wold, most functions would indeed be "pure" functions,
 and if your code has a lot of math going on or something, I can
 surely see that being the case. However, real code is ugly, and
 programmers do break rules, usually knowing full-well what we're
 doing. Reassigning a parameter does occasionally introduce bugs, but
 it's a useful feature that we've come to accept and use when it
 should be used, and avoid when it shouldn't be. I'm willing to put
 "in" on API functions, and functions I expect others to use, but for
 private methods and internal functions, let me code it how I want,
 and how I expect it to work, without having to think about what I'm
 going to do with every parameter.
I agree with you 100%.
 
 Eek, on re-reading, that came off a bit combative.... Sorry... I'll
 retreat to my Virtual-Machine-managed world, and let you
 systems-level coders duke this one out.
 
I don't think you presented a combative tone, just laying it down like it is.
 All the best, Fraser
 
 Walter Bright Wrote:
 
 It looks like making "const final scope" be the default for
 function parameters is going to be infeasible. The troubles are
 that:
 
 1) It seems to knock a lot of people for a loop, who will be
 assuming that an undecorated name would be like an undecorated name
 for a local or global variable.
 
 2) Having to turn off one of the const, final, or scope, introduces
 the need for some sort of "not" keyword, like mutable, !const,
 !final, etc. It comes off looking bizarre.
 
 However, making "in" be equivalent to "const final scope" does seem
 to work fine, requires no new keywords, and doesn't seem to confuse
 anyone.
 
 On a related note, "cstring" has received universal condemnation
 <g>, so I'll just have to make "string" work.
Regards, Myron.
May 28 2007
prev sibling parent reply Regan Heath <regan netmail.co.nz> writes:
Robert Fraser Wrote:
 Heh; this is quite the interesting thread, though I seem to feel quite
different than most of you. I agree most function parameters should be final,
but const, and especially scope, just seem very wrong to me to be a default.
Maybe it's my Java background, but there are quite a few times I want users to
pass in reference to classes that implement a particular interface, or
otherwise have some sort of "setter" function for reference types, and having
to rewrite all my existing code to somehow allow that just seems like a total
waste.
I think you may have a misunderstanding of one or more of the terms and the changes Walter suggested. If I'm wrong, feel free to ignore my reply, or read it out of interest only ;) I don't think a re-write would be necessary in the case you describe above, just removal of 'const' from the input parameter (as you intend to call methods which modify the data to which the reference refers).
 Also, think about all the copying that would have to be made if scope was the
default - if I was passing in references to big objects into a collection, I
sure as hell wouldn't want a copy of each one made... 
But that's already what happens. When you pass anything to a function a copy is made (unless you use 'ref' or 'out'). The important thing to realise is _what_ is copied. In the case of a value type the entire value type is copied, but, in the case of a reference/array only the reference is copied. The data to which a reference refers is never copied and 'scope' does not change that.
one of D's advantages is runtime performance. I know that scope could be turned
off, but if on was the default, a lot of coders would be more likely to use it,
so I'd be much more worried about using someone else's code for a
performance-critical task.
'scope' poses no performance problems as it doesn't change the existing behaviour; it is actually just a formalisation of existing behaviour. Take this example:

class A { ..large object, lots of members.. }
void foo(A b) {}
A a = new A();
foo(a);

when foo is called a copy of the reference 'a' is made and it is called 'b'. It is 'scope' because it exists solely in the function scope; once the function returns, 'b' ceases to exist (because it was a copy of 'a' on the stack and the stack is reclaimed at function exit). This is what currently happens in D, and is what happens in C and C++ too. 'scope' is intended to detect and prevent this:

A* pa;
void foo(A b) { pa = &b; }

In the above, pa is set to the address of the reference 'b'; this address becomes invalid when 'foo' returns and thus violates 'scope'. Likewise this:

A* foo(A b) { return &b; }

will be caught and prevented. This code is currently valid D and the compiler gives no errors; once 'scope' is added to D the compiler will detect it and give an error.

As for the others... 'final' means you cannot reassign the reference/array parameter during the function call, e.g.

void foo(A b) { b = new A(); } // violates 'final'

Initially you might not think this re-assignment is dangerous and needs preventing; after all, 'b' is a copy and changing it does not affect the original reference 'a' in any way. However, in a large function, re-using an input parameter can introduce hard-to-find bugs, especially if 2+ programmers are working on the same code.

Lastly 'const'. I think this one is the most important, especially given that array data is referenced. By this I mean that when you call 'foo' here:

void foo(char[] b) { b[0] = 'a'; }

'b' may be a copy of the original array reference 'a', but they both refer to the same array data, and therefore changes to that data using either reference affect the other.
This behaviour is often called 'aliasing', as 'b' is effectively an alias of 'a'; changes via 'b' are the same as changes via 'a'. This is a source of many bugs, and providing some protection by default will prevent a large number of 'aliasing' bugs.

Wow, that became a novel almost, sorry.

Regan
May 28 2007
next sibling parent reply Robert Fraser <fraserofthenight gmail.com> writes:
Ah, thanks, now I understand scope (I understood the other two; I thought scope
would prevent copies of references). Still, const by default seems a bit odd -
make it explicit. For example, if a file stream is being passed to a function,
could I write to it inside the function if it was const?

Regan Heath Wrote:

 Robert Fraser Wrote:
 Heh; this is quite the interesting thread, though I seem to feel quite
different than most of you. I agree most function parameters should be final,
but const, and especially scope, just seem very wrong to me to be a default.
Maybe it's my Java background, but there are quite a few times I want users to
pass in reference to classes that implement a particular interface, or
otherwise have some sort of "setter" function for reference types, and having
to rewrite all my existing code to somehow allow that just seems like a total
waste.
I think you may have a missunderstanding of one or more of the terms and the changes Walter suggested. If I'm wrong feel free to ignore my reply, or read it out of interest only ;) I don't think a re-write would be necessary in the case you describe above, just removal of 'const' from the input parameter (as you intend to call methods which modify the data to which the refernece refers).
 Also, think about all the copying that would have to be made if scope was the
default - if I was passing in references to big objects into a collection, I
sure as hell wouldn't want a copy of each one made... 
But that's already what happens. When you pass anything to a function a copy is made (unless you use 'ref' or 'out'). The important thing to realise is _what_ is copied. In the case of a value type the entire value type is copied, but, in the case of a reference/array only the reference is copied. The data to which a reference refers is never copied and 'scope' does not change that.
one of D's advantages is runtime performance. I know that scope could be turned
off, but if on was the default, a lot of coders would be more likely to use it,
so I'd be much more worried about using someone else's code for a
performance-critical task.
'scope' poses no performance problems as it doesn't change the existing behaviour, it is actually just a formalisation of existing behaviour. Take this example: class A { ..large object, lots of members.. } void foo(A b) {} A a = new A(); foo(a); when foo is called a copy of the reference 'a' is made and it is called 'b'. It is 'scope' because it exists solely in the function scope, once the function returns 'b' ceases to exist (because it was a copy of 'a' on the stack and the stack is reclaimed at function exit). This is what currently happens in D, and is what happens in C and C++ too. 'scope' is intended to detact and prevent this: A* pa; void foo(A b) { pa = &b; } In the above pa is set to the address of the reference 'b', this address becomes invalid when 'foo' returns and thus violates 'scope'. Likewise this: A* foo(A b) { return &b; } will be caught and prevented. This above code is currently valid D and the compiler gives no errors, once 'scope' is added to D the compiler will detect this and give an error. As for the others... 'final' means you cannot reassign the reference/array parameter during the function call. eg. void foo(A b) { b = new A(); } //voilates 'final' Initially you might not think this re-assign was dangerous and needed preventing, after all 'b' is a copy and changing it does not affect the original reference 'a' in any way. However, in a large function re-using an input parameter can introduce hard to find bugs, especially if 2+ programmers are working on the same code. Lastly 'const'. I think this one is the most important, especially given that arrays data is referenced. By this I mean when you call 'foo' here: void foo(char[] b) { b[0] = 'a'; } 'b' may be a copy of the original array reference 'a', but they both refer to the same array data and therefore changes to that data using either reference affect the other. 
This behaviour is often called 'aliasing' as 'b' is effectively an alias of 'a', changes via 'b' are the same as changed via 'a'. This is a source of many bugs and providing some protection, and by default, will prevent a large number of 'aliasing' bugs. Wow, that became a novel almost, sorry. Regan
May 28 2007
parent reply Denton Cockburn <diboss hotmail.com> writes:
On Mon, 28 May 2007 18:56:00 -0400, Robert Fraser wrote:

 Ah, thanks, now I understand scope (I understood the other two; I thought
scope would prevent copies of references). Still, const by default seems a bit
odd - make it explicit. For example, if a file stream is being passed to a
function, could I write to it inside the function if it was const?
 
No, not if it changes the internals of the object. If you want that, then simply specify the parameter as 'ref':

void foo(ref Stream x) { ...blah... }

or

void foo(scope final Stream x) { ...blah... }

I'm trying to understand this too, so hopefully I didn't just tell you the wrong thing.
May 29 2007
parent Regan Heath <regan netmail.co.nz> writes:
Denton Cockburn Wrote:
 On Mon, 28 May 2007 18:56:00 -0400, Robert Fraser wrote:
 
 Ah, thanks, now I understand scope (I understood the other two; I thought
scope would prevent copies of references). Still, const by default seems a bit
odd - make it explicit. For example, if a file stream is being passed to a
function, could I write to it inside the function if it was const?
 
No, not if it changes the internals of the object. If you want that, then simply specify the parameter as 'ref' void foo(ref Stream x) { ...blah... } or void foo(scope final Stream x) { ...blah... } I'm trying to understand this too, so hopefully I didn't just tell you the wrong thing.
Your answer looks good to me.

Passing with 'ref' means that the Stream 'x' is _not_ a copy of the passed reference, but is actually the exact reference passed. This is useful when you might want to re-assign it inside the function and expect that change to be reflected outside the function.

Passing with 'scope final' gives a copy of the passed reference. 'final' prevents/detects reassigning it (catching the bug where you want the change to be reflected and actually meant to use 'ref', and preventing the bug where a parameter is re-used several times in a large function, which can result in bugs due to programmers expecting it to have its initial value). 'scope' prevents/detects any attempt to store its address in an external variable (which would be an ugly bug to find), because as a copy it ceases to exist after the function returns.

I'm just re-iterating the behaviour for anyone still coming to grips with this. A good understanding of what goes on in the background (function arguments being copies, or not, etc.) makes for better programmers.

Regan
May 29 2007
prev sibling parent reply Bruno Medeiros <brunodomedeiros+spam com.gmail> writes:
Regan Heath wrote:
 
 'scope' poses no performance problems as it doesn't change the existing
behaviour, it is actually just a formalisation of existing behaviour.  Take
this example:
 
 class A { ..large object, lots of members.. }
 void foo(A b) {}
 A a = new A();
 foo(a);
 
 when foo is called a copy of the reference 'a' is made and it is called 'b'. 
It is 'scope' because it exists solely in the function scope, once the function
returns 'b' ceases to exist (because it was a copy of 'a' on the stack and the
stack is reclaimed at function exit).
 
 This is what currently happens in D, and is what happens in C and C++ too.
'scope' is intended to detect and prevent this:
 
 A* pa;
 void foo(A b) { pa = &b; }
 
 In the above pa is set to the address of the reference 'b', this address
becomes invalid when 'foo' returns and thus violates 'scope'.  Likewise this:
 
 A* foo(A b) { return &b; }
 
 will be caught and prevented.
 
 This above code is currently valid D and the compiler gives no errors, once
'scope' is added to D the compiler will detect this and give an error.
 
It should be noted, however, that it is not the value of b that is intrinsically 'scope'; it is the value of &b that is. The same happens with any local variable, parameter or not.

-- 
Bruno Medeiros - MSc in CS/E student
http://www.prowiki.org/wiki4d/wiki.cgi?BrunoMedeiros#D
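[The point that it is &b (the parameter slot), rather than the referenced object, that is 'scope' can be sketched with a C++ analogue (illustrative names only): the parameter is a distinct slot on the callee's stack, so its address is not the caller's variable, and that slot is what vanishes on return:]

```cpp
#include <cassert>

struct A { int x; };

A shared{42};
A* caller_ref = &shared;

// 'b' plays the role of D's copied reference parameter. '&b' names a
// slot on this function's stack; that slot, not the A object it refers
// to, is what ceases to exist when the function returns. Storing '&b'
// anywhere longer-lived is exactly the bug 'scope' is meant to catch.
bool param_slot_is_callers_slot(A* b) {
    return &b == &caller_ref;
}
```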
May 30 2007
parent Regan Heath <regan netmail.co.nz> writes:
Bruno Medeiros Wrote:
 Regan Heath wrote:
 
 'scope' poses no performance problems as it doesn't change the existing
behaviour, it is actually just a formalisation of existing behaviour.  Take
this example:
 
 class A { ..large object, lots of members.. }
 void foo(A b) {}
 A a = new A();
 foo(a);
 
 when foo is called a copy of the reference 'a' is made and it is called 'b'. 
It is 'scope' because it exists solely in the function scope, once the function
returns 'b' ceases to exist (because it was a copy of 'a' on the stack and the
stack is reclaimed at function exit).
 
 This is what currently happens in D, and is what happens in C and C++ too.
'scope' is intended to detect and prevent this:
 
 A* pa;
 void foo(A b) { pa = &b; }
 
 In the above pa is set to the address of the reference 'b', this address
becomes invalid when 'foo' returns and thus violates 'scope'.  Likewise this:
 
 A* foo(A b) { return &b; }
 
 will be caught and prevented.
 
 This above code is currently valid D and the compiler gives no errors, once
'scope' is added to D the compiler will detect this and give an error.
 
It should be noted, however, that it is not the value of b that is intrinsically 'scope'; it is the value of &b that is. The same happens with any local variable, parameter or not.
True, which is why I look at 'scope' as less of a "new feature" and more of a "formalisation" of what actually happens, plus some help from the compiler at finding and preventing bugs related to it.

In a sense all variables at whatever level are 'scope', that is to say they exist within a given scope (program, class, function) and cease to exist above/outside that scope. Also, as you say, it's their address, rather than their value, that becomes invalid when execution leaves their scope.

Regan
May 30 2007
prev sibling next sibling parent reply Robert Fraser <fraserofthenight gmail.com> writes:
I understand that part, but I don't necessarily want to reassign the reference,
just change something in the state of the object being referred to. For example:

struct Foo
{
    int x;
}

void bar(in Foo var)
{
    var.x = 10;
}

Allowed?

Regan Heath Wrote:

 Denton Cockburn Wrote:
 On Mon, 28 May 2007 18:56:00 -0400, Robert Fraser wrote:
 
 Ah, thanks, now I understand scope (I understood the other two; I thought
scope would prevent copies of references). Still, const by default seems a bit
odd - make it explicit. For example, if a file stream is being passed to a
function, could I write to it inside the function if it was const?
 
No, not if it changes the internals of the object. If you want that, then simply specify the parameter as 'ref' void foo(ref Stream x) { ...blah... } or void foo(scope final Stream x) { ...blah... } I'm trying to understand this too, so hopefully I didn't just tell you the wrong thing.
Your answer looks good to me. Passing with 'ref' means that the Stream 'x' is _not_ a copy of the passed reference, but is actually the exact reference passed. This is useful when you want to re-assign it inside the function and expect that change to be reflected outside the function. Passing with 'scope final' gives a copy of the passed reference. 'final' prevents/detects reassigning it (catching the bug where you want the change to be reflected and actually meant to use 'ref', and preventing the bug where a parameter is re-used several times in a large function and programmers later assume it still has its initial value). 'scope' prevents/detects any attempt to store its address in an external variable (which would be an ugly bug to find) because, as a copy, it ceases to exist after the function returns. I'm just re-iterating the behaviour for anyone still coming to grips with this. A good understanding of what goes on in the background (function arguments being copies, or not, etc.) makes for better programmers. Regan
May 29 2007
parent reply Robert Fraser <fraserofthenight gmail.com> writes:
Robert Fraser Wrote:
 
 struct Foo
 {
     int x;
 }
 
 void bar(in Foo var)
 {
     var.x = 10;
 }
 
 Allowed?
And also, what if it was a reference type instead of a value type: class Foo { int x; } void bar(Foo var) { var.x = 5; // Is this allowed? }
May 29 2007
parent reply Regan Heath <regan netmail.co.nz> writes:
Robert Fraser Wrote:
 Robert Fraser Wrote:
 
 struct Foo
 {
     int x;
 }
 
 void bar(in Foo var)
 {
     var.x = 10;
 }
 
 Allowed?
Assuming 'in' means 'scope, const, final' then "No". 'const' prevents you modifying the data to which the variable refers. In this case that is the contents of the struct; in the case of a class reference, the contents of the class to which it refers; in the case of a pointer, the data to which it points.

You might think that with a value type there is no harm in modifying it, because the change is not reflected back to the original variable passed, but consider:

1. What if the programmer wanted that change to be reflected back to the original variable and has forgotten to use 'ref'?

2. Or, in a large function maintained by more than one developer, someone changes the input variable and someone else doesn't realise, and makes an assumption about its value, perhaps..

void bar(in Foo var)
{
    assert(var.x == 0);
    // many lines of code
    var.x = 10;
    // many lines of code
    // line which assumes var.x == 0; BANG!
}
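[The value-type case above maps onto a const-by-value parameter in C++; a minimal sketch with made-up names, not code from the thread — the copy can be read freely, while the accidental write would be rejected at compile time:]

```cpp
#include <cassert>

struct Foo { int x; };

// Rough C++ analogue of 'void bar(in Foo var)' under the proposal:
// the parameter is a const copy of the caller's struct.
int bar(const Foo var) {
    // var.x = 10;  // would not compile: 'var' is read-only
    return var.x;   // reading is fine
}
```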
 And also, what if it was a reference type instead of a value type:
 
 class Foo
 {
     int x;
 }
 
 void bar(Foo var)
 {
     var.x = 5; // Is this allowed?
 }
"No", 'const' prevents this too. It's perhaps more important when reference types are involved because the change will be reflected back to the original variable passed. Without 'const' someone who wants to call this function cannot guarantee what state their object will be in after the function has finished. With 'const' you can guarantee the object has not changed. Regan Heath
May 29 2007
parent reply Robert Fraser <fraserofthenight gmail.com> writes:
Thanks for explaining that to me (I've only ever worked in Java, so const is a
foreign concept). Then doing that by default seems like it'd not only break a
great deal of existing code, but also just sound weird to me. I'm all for explicit
"in," but I guess we'll just have to wait and see what Walter decides.

Regan Heath Wrote:

 Robert Fraser Wrote:
 Robert Fraser Wrote:
 
 struct Foo
 {
     int x;
 }
 
 void bar(in Foo var)
 {
     var.x = 10;
 }
 
 Allowed?
Assuming 'in' means 'scope, const, final' then "No". 'const' prevents you modifying the data to which the variable refers. In this case that is the contents of the struct; in the case of a class reference, the contents of the class to which it refers; in the case of a pointer, the data to which it points. You might think that with a value type there is no harm in modifying it, because the change is not reflected back to the original variable passed, but consider: 1. What if the programmer wanted that change to be reflected back to the original variable and has forgotten to use 'ref'? 2. Or, in a large function maintained by more than one developer, someone changes the input variable and someone else doesn't realise, and makes an assumption about its value, perhaps.. void bar(in Foo var) { assert(var.x == 0); //many lines of code var.x = 10; //many lines of code //line which assumes var.x == 0; BANG! }
 And also, what if it was a reference type instead of a value type:
 
 class Foo
 {
     int x;
 }
 
 void bar(Foo var)
 {
     var.x = 5; // Is this allowed?
 }
"No", 'const' prevents this too. It's perhaps more important when reference types are involved because the change will be reflected back to the original variable passed. Without 'const' someone who wants to call this function cannot guarantee what state their object will be in after the function has finished. With 'const' you can guarantee the object has not changed. Regan Heath
May 29 2007
parent Regan Heath <regan netmail.co.nz> writes:
Robert Fraser Wrote:

 Thanks for explaining that to me (I've only ever worked in Java, so const is a
foreign concept). Then doing that by default seems like it'd not only break a
great deal of existing code, and just sounds weird to me. 
Can you give an example of something you think would break? When you say break, do you mean won't compile, or will compile and behave differently? The former is fine, as (I'd like to think) it breaks because it's something that shouldn't be done. The latter is not acceptable, ever.
I'm all for explicit "in," but I guess we'll just have to wait and see what
Walter decides.
Yep, as always. (I don't want to imply any dislike about this fact; it doesn't really bother me when Walter makes a decision I don't like about Walter's programming language.. it is, after all, Walter's.)

Regan
May 30 2007
prev sibling parent reply Bruno Medeiros <brunodomedeiros+spam com.gmail> writes:
Walter Bright wrote:
 It looks like making "const final scope" be the default for function 
 parameters is going to be infeasible. The troubles are that:
 
 1) It seems to knock a lot of people for a loop, who will be assuming 
 that an undecorated name would be like an undecorated name for a local 
 or global variable.
 
 2) Having to turn off one of the const, final, or scope, introduces the 
 need for some sort of "not" keyword, like mutable, !const, !final, etc. 
 It comes off looking bizarre.
 
 However, making "in" be equivalent to "const final scope" does seem to 
 work fine, requires no new keywords, and doesn't seem to confuse anyone.
 
 On a related note, "cstring" has received universal condemnation <g>, so 
   I'll just have to make "string" work.
I'm gonna repost my question in this thread: What is the reasoning behind the idea of 'scope' being the default together with 'const' and 'final'? I understand (and agree) why 'final' and 'const' should be the default type modifiers for function parameters, but why 'scope' as well? Does it look like 'scope' would be more common than non-scope? -- Bruno Medeiros - MSc in CS/E student http://www.prowiki.org/wiki4d/wiki.cgi?BrunoMedeiros#D
May 30 2007
parent reply Bill Baxter <dnewsgroup billbaxter.com> writes:
Bruno Medeiros wrote:
 Walter Bright wrote:
 It looks like making "const final scope" be the default for function 
 parameters is going to be infeasible. The troubles are that:

 1) It seems to knock a lot of people for a loop, who will be assuming 
 that an undecorated name would be like an undecorated name for a local 
 or global variable.

 2) Having to turn off one of the const, final, or scope, introduces 
 the need for some sort of "not" keyword, like mutable, !const, !final, 
 etc. It comes off looking bizarre.

 However, making "in" be equivalent to "const final scope" does seem to 
 work fine, requires no new keywords, and doesn't seem to confuse anyone.

 On a related note, "cstring" has received universal condemnation <g>, 
 so   I'll just have to make "string" work.
I'm gonna repost my question in this thread: What is the reasoning behind the idea of 'scope' being the default together with 'const' and 'final'? I understand (and agree) why 'final' and 'const' should be the default type modifiers for function parameters, but why 'scope' as well? Does it look like 'scope' would be more common than non-scope?
Do a majority of the parameters you pass to functions get stored somewhere that will last beyond the scope of the function? From what I understand scope on a parameter will not mean that class objects get passed on the stack, if that's what you're thinking. They'll still be reference parameters, just it will be an error for the function to store that reference in a global or a local static variable. So I guess that means most setter methods that take objects will have to declare away the scope part (or in Walter's current way be 'const' rather than 'in'), because storing a reference is their whole reason for being. --bb
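[The setter problem raised above can be sketched with a C++ analogue (illustrative names only): a setter's whole purpose is to let the passed reference escape into a longer-lived member, which is exactly what 'scope' would forbid:]

```cpp
#include <cassert>

struct Widget { int id; };

struct Holder {
    Widget* stored = nullptr;
    // Under scope-by-default this setter would have to opt out of
    // 'scope': it deliberately escapes the passed reference into a
    // member that outlives the call.
    void set(Widget* w) { stored = w; }
};

Widget w{9};

int demo() {
    Holder h;
    h.set(&w);           // the reference escapes the call's scope
    return h.stored->id; // and is used after set() has returned
}
```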
May 30 2007
parent reply Bruno Medeiros <brunodomedeiros+spam com.gmail> writes:
Bill Baxter wrote:
 Bruno Medeiros wrote:
 Walter Bright wrote:
 It looks like making "const final scope" be the default for function 
 parameters is going to be infeasible. The troubles are that:

 1) It seems to knock a lot of people for a loop, who will be assuming 
 that an undecorated name would be like an undecorated name for a 
 local or global variable.

 2) Having to turn off one of the const, final, or scope, introduces 
 the need for some sort of "not" keyword, like mutable, !const, 
 !final, etc. It comes off looking bizarre.

 However, making "in" be equivalent to "const final scope" does seem 
 to work fine, requires no new keywords, and doesn't seem to confuse 
 anyone.

 On a related note, "cstring" has received universal condemnation <g>, 
 so   I'll just have to make "string" work.
I'm gonna repost my question in this thread: What is the reasoning behind the idea of 'scope' being the default together with 'const' and 'final'? I understand (and agree) why 'final' and 'const' should be the default type modifiers for function parameters, but why 'scope' as well? Does it look like 'scope' would be more common than non-scope?
Do a majority of the parameters you pass to functions get stored somewhere that will last beyond the scope of the function? From what I understand scope on a parameter will not mean that class objects get passed on the stack, if that's what you're thinking. They'll still be reference parameters, just it will be an error for the function to store that reference in a global or a local static variable. So I guess that means most setter methods that take objects will have to declare away the scope part (or in Walter's current way be 'const' rather than 'in'), because storing a reference is their whole reason for being. --bb
Hum, in my experience, I wouldn't say the majority of them get stored somewhere, but I wouldn't say the majority of them *don't* get stored either. I guess both cases are more or less equal (unlike const & final, which are the majority), even if perhaps the scope case is slightly more common.

However, given that both cases occur similarly often, I would say that the default case should be the same as not having a keyword (i.e., the default should be the same as not having scope). But this is a very early opinion; I'm nowhere near sure.

-- 
Bruno Medeiros - MSc in CS/E student
http://www.prowiki.org/wiki4d/wiki.cgi?BrunoMedeiros#D
Jun 01 2007
parent Bill Baxter <dnewsgroup billbaxter.com> writes:
Bruno Medeiros wrote:
 Bill Baxter wrote:
 Bruno Medeiros wrote:
 Walter Bright wrote:
 It looks like making "const final scope" be the default for function 
 parameters is going to be infeasible. The troubles are that:

 1) It seems to knock a lot of people for a loop, who will be 
 assuming that an undecorated name would be like an undecorated name 
 for a local or global variable.

 2) Having to turn off one of the const, final, or scope, introduces 
 the need for some sort of "not" keyword, like mutable, !const, 
 !final, etc. It comes off looking bizarre.

 However, making "in" be equivalent to "const final scope" does seem 
 to work fine, requires no new keywords, and doesn't seem to confuse 
 anyone.

 On a related note, "cstring" has received universal condemnation 
 <g>, so   I'll just have to make "string" work.
I'm gonna repost my question in this thread: What is the reasoning behind the idea of 'scope' being the default together with 'const' and 'final'? I understand (and agree) why 'final' and 'const' should be the default type modifiers for function parameters, but why 'scope' as well? Does it look like 'scope' would be more common than non-scope?
Do a majority of the parameters you pass to functions get stored somewhere that will last beyond the scope of the function? From what I understand scope on a parameter will not mean that class objects get passed on the stack, if that's what you're thinking. They'll still be reference parameters, just it will be an error for the function to store that reference in a global or a local static variable. So I guess that means most setter methods that take objects will have to declare away the scope part (or in Walter's current way be 'const' rather than 'in'), because storing a reference is their whole reason for being. --bb
Hum, in my experience, I wouldn't say the majority of them get stored somewhere, but I wouldn't say the majority of them *don't* get stored either. I guess both cases are more or less equal (unlike const&final which are the majority), even if perhaps the scope case is slightly more common. However, being that both cases are similar occurrences, I would say that the default case should be same as not having a keyword. (i.e., the default case should be the same as not having scope). But this is very early opinion, I'm nowhere near sure.
I think the ratio of stored vs not stored is probably very different for member functions and free functions. I'm not sure I believe it's close to 50% for member functions[*], but it's certainly much higher than for the free funcs. Presumably Walter's going through and making all the necessary changes to get Phobos working and from that he'll get some idea of how painful the changes are. But Phobos doesn't have a lot of classes, so he may get a skewed view. :-/ Hopefully he'll take that into account. [*] My logic is that it seems reasonable to have about a 50/50 mix of accessors and mutators. None of the accessors will store anything. And only some fraction of the mutators will, since only mutators taking reference types will be affected. Value types are just copied so no scope problems. Put another way, I think the things that will need to be un-scoped are the exact same things with which you have ownership issues in C++. That's certainly not the majority of methods in my experience. --bb
Jun 01 2007