
digitalmars.D.announce - Re: preparing for const, final, and invariant

reply Regan Heath <regan netmail.co.nz> writes:
Walter Bright Wrote:
 Do not use 'in' if you wish to do any of these operations on a 
 parameter. Using 'in' has no useful effect on D 1.0 code, so it'll be 
 backwards compatible.
 
 Adding in all those 'in's is tedious, as I'm finding out :-(, but I 
 think the results will be worth the effort.

Perhaps I have missed the discussion (being away for the last 7 months) which covered why we don't want 'scope const final' applied to implicit 'in' parameters, as opposed to requiring the explicit 'in' which Walter is proposing.

To explain... Currently a parameter is implicitly 'in', i.e.

  void foo(int a) {} //a is 'in'

Why not make this implicit 'in' parameter 'scope const final', avoiding the need to explicitly say 'in' everywhere?

My reasoning is that I think 'scope const final' should be the default for parameters, as it's the most commonly used and safest option for parameters. People should use it by default/accident and should have to explicitly opt out in cases where it makes sense. These cases would then be clearly, visibly marked with 'out', 'ref', etc.

From what I can see we currently have 2 reasons to require explicit 'in' (from Walter's post and others in this thread):

1. Avoid breaking backward compatibility with D 1.0.
2. Parameters will all have parameter specifiers, and pedantic people (no offence intended) will enjoy fully specified function parameters all the time.

If so, isn't part of the point of making this change in a beta that we can ignore reason #1? Granted, if the feature was integrated into the mainstream version it would then break backward compatibility, but imagine the situation: old code would cease to compile and would need modification, but the modifications required would for the most part simply be the addition of 'out', 'ref', or an added .dup or copy, which in actual fact should have been there in the first place. Meaning, the existing code is unsafe and the resulting code after these changes would be much safer. In other words, applying 'scope const final' to implicit 'in' will catch existing bugs, but requiring explicit 'in' will not, right?

As for reason #2, I think it's largely aesthetic. Applying it to implicit 'in' is less typing for those non-pedantic programmers who can live without all parameters having a specifier. I personally don't think the presence of an explicit 'in' results in clearer code, as it is clear to me that unspecified parameters are 'in' (and with these changes would be 'scope const final' too).

Regan Heath
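(A rough sketch of what the proposal would mean in practice. The class, the global, and the function names below are invented for illustration, the commented errors reflect the proposed 'scope const final' semantics rather than the behaviour of any compiler at the time, and 'ref' is used as the explicit opt-out the thread discusses.)

  class Widget { int count; }
  Widget g_last;               // module-level reference, only to illustrate 'scope'

  void process(Widget w)       // unannotated: 'scope const final' under the proposal
  {
      // w.count = 1;          // would violate 'const': cannot modify what w refers to
      // w = new Widget;       // would violate 'final': cannot rebind the parameter
      // g_last = w;           // would violate 'scope': the reference may not escape
  }

  void replace(ref Widget w)   // explicit opt-out, visible at the declaration
  {
      w = new Widget;          // fine: 'ref' exists precisely for this
  }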
May 20 2007
next sibling parent Frank Benoit <keinfarbton googlemail.com> writes:
I second that.
Make the safe thing the default, and the unsafe the explicit case.
May 20 2007
prev sibling next sibling parent Bill Baxter <dnewsgroup billbaxter.com> writes:
Regan Heath wrote:
 Walter Bright Wrote:
 Perhaps I have missed the discussion (being away for the last 7 months) which
discussed why we don't want 'scope const final' applied to implicit 'in'
parameters?

Makes sense to me too. I'd at least like to try it out for a few months and see how the shoe fits.

I found this big thread about "const by default":
http://lists.puremagic.com/pipermail/digitalmars-d/2006-July/005626.html

--bb
May 20 2007
prev sibling next sibling parent reply Manuel König <ManuelK89 gmx.net> writes:
I second this.

Doing it this way 'in' also keeps its expressive character of saying 
"Hey, I am only the input and not that bunch of scope const final!", 
which especially makes sense when compared to 'out' in terms of data 
flow. And dismissing all of 'scope const final' just requires you to 
declare your params as 'in', which will rarely be the case.
May 20 2007
next sibling parent reply Bill Baxter <dnewsgroup billbaxter.com> writes:
Manuel König wrote:
 I second this.
 
 Doing it this way 'in' also keeps its expressive character of saying 
 "Hey, I am only the input and not that bunch of scope const final!", 
 which especially makes sense when compared to 'out' in terms of data 
 flow. And dismissing all of 'scope const final' just requires you to 
 declare your params as 'in', which will rarely be the case.

Does nobody quote any more? What are you seconding? --bb
May 20 2007
next sibling parent Manuel König <ManuelK89 gmx.net> writes:
Bill Baxter wrote:
 Manuel König wrote:
 I second this.

 Doing it this way 'in' also keeps its expressive character of saying 
 "Hey, I am only the input and not that bunch of scope const final!", 
 which especially makes sense when compared to 'out' in terms of data 
 flow. And dismissing all of 'scope const final' just requires you to 
 declare your params as 'in', which will rarely be the case.

Does nobody quote any more? What are you seconding? --bb

I'm seconding the whole proposal. Quoting something didn't come to mind because I'm not referring to anything in particular, but to the whole thing :P

Anyhow, are there thoughts or comments from anyone who does/does not like the behaviour of an omitted 'in' meaning 'scope const final'? Otherwise the 'in' behaviour proposed by Regan should really be part of the language, IMHO.

greetings,
manuel
May 20 2007
prev sibling parent reply Regan Heath <regan netmail.co.nz> writes:
Bill Baxter Wrote:
 Manuel König wrote:
 I second this.
 
 Doing it this way 'in' also keeps its expressive character of saying 
 "Hey, I am only the input and not that bunch of scope const final!", 
 which especially makes sense when compared to 'out' in terms of data 
 flow. And dismissing all of 'scope const final' just requires you to 
 declare your params as 'in', which will rarely be the case.

Does nobody quote any more? What are you seconding?

Are you using a news reader which displays posts in threads? When you're not, it can be annoying to find a post with no quotation, I agree.

The web interface (which I am using until I have my own PC) seems to have trouble correctly threading all the posts too. Opera didn't seem to have any trouble when I last used it. I suspect either some of our newsreaders/posters do not correctly format the headers in replies, and/or the web interface isn't looking for some headers which Opera did, and/or there is some clever trick Opera used to thread them correctly.

Anyway, enough rambling. ;)

Regan Heath
May 20 2007
parent Bill Baxter <dnewsgroup billbaxter.com> writes:
Regan Heath wrote:
 Bill Baxter Wrote:
 Manuel König wrote:
 I second this.

 Doing it this way 'in' also keeps its expressive character of saying 
 "Hey, I am only the input and not that bunch of scope const final!", 
 which especially makes sense when compared to 'out' in terms of data 
 flow. And dismissing all of 'scope const final' just requires you to 
 declare your params as 'in', which will rarely be the case.


Are you using a news reader which displays posts in threads? When you're not it can be annoying to find a post with no quotation, I agree.

I use Thunderbird, but since the D groups have so much traffic, and since I read it from multiple different computers, I end up reading in sort-by-date mode so that all the recent stuff is at the bottom and easy to find. Oh for the day when I can store my Thunderbird settings on the net. I wonder if there's a Google Thunderbird Sync plugin on the way... --bb
May 20 2007
prev sibling parent reply Regan Heath <regan netmail.co.nz> writes:
Manuel König Wrote:
 Doing it this way 'in' also keeps its expressive character of saying 
 "Hey, I am only the input and not that bunch of scope const final!", 
 which especially makes sense when compared to 'out' in terms of data 
 flow. And dismissing all of 'scope const final' just requires you to 
 declare your params as 'in', which will rarely be the case.

To clarify, I was actually proposing that 'in' would be 'scope const final' and there would be no difference between explicit and implicit 'in'. I think it's a bad idea to have implicit and explicit 'in' mean different things; it would just be confusing to me.

That said, you could decide/declare that:

  void foo(int a) {}

was actually equivalent to:

  void foo(scope const final int a) {}

and that:

  void foo(in int a) {}

was something entirely different. In that case, what do you want 'in' to mean? I get the impression you'd like it to behave as the current implicit/explicit 'in' does, right?

My only question is: why do you want the current behaviour? I'm guessing you want to be able to do the things that 'scope const final' will protect against, but surely those things are dangerous and shouldn't be done?

Is there a specific case you are thinking of where you need to do these things? One where there is no decent work-around in the presence of 'scope const final' as default?

Regan Heath
May 20 2007
parent Manuel König <ManuelK89 gmx.net> writes:
Regan Heath wrote:
 Manuel König Wrote:
 Doing it this way 'in' also keeps its expressive character of saying 
 "Hey, I am only the input and not that bunch of scope const final!", 
 which especially makes sense when compared to 'out' in terms of data 
 flow. And dismissing all of 'scope const final' just requires you to 
 declare your params as 'in', which will rarely be the case.

To clarify, I was actually proposing that 'in' would be 'scope const final' and there would be no difference between explicit and implicit 'in'. I think it's a bad idea to have implicit and explicit 'in' mean different things, it would just be confusing to me.

Yes, the difference between implicit/explicit 'in' can be confusing. But that's only because we're used to data 'flowing in' to a function, and marking it 'out' if that's not the case. In fact, either 'in' or 'out' is obligatory for every parameter (when 'in'/'out' are considered in terms of data flow). So you can say that it is only by accident that 'in' became the default when not explicitly specified.
 That said, you could decide/declare that:
 
   void foo(int a) {}
 
 was actually equivalent to:
 
   void foo(scope const final int a) {}
 
 and that:
 
   void foo(in int a) {}
 
 was something entirely different.

True.
 
 In that case what do you want 'in' it to mean?  I get the impression you'd
like it to behave as the current implicit/explicit 'in' does, right?
 

Yes, I "like" it. But I would put it more as: 'in' and 'out' are two complementary attributes, one of which has to be present.
 My only question is, why do you want the current behaviour?  
 
 I'm guessing you want to be able to do the things that 'scope const final'
will protect against, but surely those things are dangerous and shouldn't be
done?
 

Yes, that's exactly what I want. I think everyone can agree that 'scope' and 'const' are not always well appreciated. So the one left is the 'final' attribute. Ok, I think I really can live with declaring my params 'final' when I don't want all the other things. But sometimes it would be nice if the parameter could be reassigned to something, like this:

  void log(LogObject msg)
  {
      msg.isLogged = true; // => no const
      lastLoggedMsg = msg; // => no scope

      // preprocessing => no final
      if (systemIsServer)
      {
          msg = "Server-log: " ~ msg;
      }
      else
      {
          msg = "Client-log: " ~ msg;
      }

      /* actually doing something with msg */
      writef(logfile, msg);
  }

Here msg gets preprocessed first, and then used by whatever you want (in this case it gets logged). If you could not rebind 'msg' to another symbol, you would have to declare a new variable, think of a new name for it, e.g. 'msg_ex', and finally assign the result of the preprocessing to it. But that seems to me like a workaround for not being able to just reassign 'msg', and it also bloats your code.

Not too much of an argument? Declaring new variables really isn't that bad compared to all the issues you run into when code gets rewritten and suddenly something does not work, because you assign to a renamed parameter that was a local variable before? Ok, I hear you. Maybe 'in' being 'scope final const' isn't that bad at all :). I'm just stuck on 'in' meaning more than just the data flow.

In the end I would prefer 'in' being only 'in' a little bit over 'scope final const (in)', because there seems to be no reason for me to use 'in' when I have the opportunity to just write nothing. And when I would use 'in', I would seriously know what I'm typing there! But that's only my personal opinion. I would be totally fine with 'in' meaning 'scope const final'.
 Is there a specific case you are thinking of where you need to do these
things?  One where there is no decent work-around in the presense of 'scope
const final' as default?
 

Look above. (ok, there actually IS a decent work-around... :) )
 Regan Heath

May 20 2007
prev sibling next sibling parent Johan Granberg <lijat.meREM OVEgmail.com> writes:
Regan Heath wrote:

 Walter Bright Wrote:
 Do not use 'in' if you wish to do any of these operations on a
 parameter. Using 'in' has no useful effect on D 1.0 code, so it'll be
 backwards compatible.
 
 Adding in all those 'in's is tedious, as I'm finding out :-(, but I
 think the results will be worth the effort.

 Perhaps I have missed the discussion (being away for the last 7 months) which
 covered why we don't want 'scope const final' applied to implicit 'in'
 parameters, as opposed to requiring the explicit 'in' which Walter is proposing.
 [...]
 My reasoning is that I think 'scope const final' should be the default for
 parameters, as it's the most commonly used and safest option for parameters.
 [...]

I agree with this, const by default is a real good idea (TM)
May 20 2007
prev sibling next sibling parent reply Derek Parnell <derek nomail.afraid.org> writes:
On Sun, 20 May 2007 14:53:54 -0400, Regan Heath wrote:

 Walter Bright Wrote:
 Do not use 'in' if you wish to do any of these operations on a 
 parameter. Using 'in' has no useful effect on D 1.0 code, so it'll be 
 backwards compatible.
 
 Adding in all those 'in's is tedious, as I'm finding out :-(, but I 
 think the results will be worth the effort.


...
 Why not make this implicit 'in' parameter 'scope const final' avoiding
 the need to explicity say 'in' everywhere?
 
 My reasoning is that I think 'scope const final' should be the default
 for parameters as it's the most commonly used and safest option for
 parameters.  People should use it by default/accident and should have
 to explicitly opt out in cases where it makes sense.  These cases
 would then be clearly, visibly marked with 'out', 'ref' etc

Thanks Regan, your proposal sits very comfortably with me. I have no problems with adding the 'ref' (whatever) keyword in those cases where an implicit 'in' (a.k.a. 'scope const final') causes the compiler to complain.

-- 
Derek
(skype: derek.j.parnell)
Melbourne, Australia
"Justice for David Hicks!"
21/05/2007 9:59:51 AM
May 20 2007
parent reply Daniel Keep <daniel.keep.lists gmail.com> writes:
Derek Parnell wrote:
 On Sun, 20 May 2007 14:53:54 -0400, Regan Heath wrote:
 
 Walter Bright Wrote:
 Do not use 'in' if you wish to do any of these operations on a 
 parameter. Using 'in' has no useful effect on D 1.0 code, so it'll be 
 backwards compatible.

 Adding in all those 'in's is tedious, as I'm finding out :-(, but I 
 think the results will be worth the effort.


....
 Why not make this implicit 'in' parameter 'scope const final' avoiding
 the need to explicity say 'in' everywhere?

 My reasoning is that I think 'scope const final' should be the default
 for parameters as it's the most commonly used and safest option for
 parameters.  People should use it by default/accident and should have
 to explicitly opt out in cases where it makes sense.  These cases
 would then be clearly, visibly marked with 'out', 'ref' etc

Thanks Regan, your proposal sits very comfortably with me. I have no problems with adding the 'ref' (whatever) keyword in those cases where an implicit 'in' (a.k.a. 'scope const final') causes the compiler to complain.

The only thing I'm concerned about is having a way of specifying "not scope const final". I'm happy to have "safe-by-default", but there should be a way to escape it.

Maybe lack of type annotations on an argument could be taken as "scope const final"; adding any annotations manually disables this (so if you use "const int foo", then it really means "const int foo" and not "scope const final const int foo"). Then, we can use "in" to mean "just in; nothing else."

-- 

int getRandomNumber()
{
    return 4; // chosen by fair dice roll.
              // guaranteed to be random.
}

http://xkcd.com/

v2sw5+8Yhw5ln4+5pr6OFPma8u6+7Lw4Tm6+7l6+7D i28a2Xs3MSr2e4/6+7t4TNSMb6HTOp5en5g6RAHCP  http://hackerkey.com/
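(A small sketch of how Daniel's override rule might read, using hypothetical semantics; the exact keywords were still under discussion, and the class and function names are invented.)

  class C { int x; }

  void takesDefault(C c) {}      // no annotation: treated as 'scope const final' under this rule
  void takesConst(const C c) {}  // explicit 'const' only: no implied 'scope' or 'final'
  void takesIn(in C c) {}        // 'in' as "just in; nothing else", i.e. plain D 1.0-style passing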
May 20 2007
parent reply Bill Baxter <dnewsgroup billbaxter.com> writes:
Daniel Keep wrote:
 
 Derek Parnell wrote:
 On Sun, 20 May 2007 14:53:54 -0400, Regan Heath wrote:

 Walter Bright Wrote:
 Do not use 'in' if you wish to do any of these operations on a 
 parameter. Using 'in' has no useful effect on D 1.0 code, so it'll be 
 backwards compatible.

 Adding in all those 'in's is tedious, as I'm finding out :-(, but I 
 think the results will be worth the effort.


 Why not make this implicit 'in' parameter 'scope const final' avoiding
 the need to explicity say 'in' everywhere?

 My reasoning is that I think 'scope const final' should be the default
 for parameters as it's the most commonly used and safest option for
 parameters.  People should use it by default/accident and should have
 to explicitly opt out in cases where it makes sense.  These cases
 would then be clearly, visibly marked with 'out', 'ref' etc

I have no problems with adding the 'ref' (whatever) keyword in those cases where an implicit 'in' (a.k.a. 'scope const final') causes the compiler to complain.

The only thing I'm concerned about is having a way of specifying "not scope const final". I'm happy to have "safe-by-default", but there should be a way to escape it. Maybe lack of type annotations on an argument could be taken as "scope const final"; adding any annotations manually disables this (so if you use "const int foo", then it really means "const int foo" and not "scope const final const int foo"). Then, we can use "in" to mean "just in; nothing else."

Ideally, starting from a blank slate, I'd say 'inout' should remove the const/final/scope-ness, instead of what it does now, which is to make the *pointer* to the object itself modifiable. Then 'ref' could be used to mean 'I want to pass the pointer by reference'. So...

  void foobulate(MyObject o)
  {
      o = new MyObject(); // bad
      o.member = 42;      // bad
  }

  void foobulate(inout MyObject o)
  {
      o = new MyObject(); // bad!
      o.member = 42;      // ok!
  }

  void foobulate(ref MyObject o)
  {
      o = new MyObject(); // ok!
      o.member = 42;      // ok!
  }

But that's just my top-of-the-head reaction. I'm sure there are ramifications that I haven't considered.

I just wouldn't like to see 'in' take on a meaning other than "this parameter is being passed _in_ and you shouldn't expect to be able to get any information _out_ using it". If it has a meaning of "it's ok to modify the thing being passed in", then pretty much the only time it will be used is when you are interested in the modification, so basically 'in' would mean "we want to get something out", which is nonsensical.

--bb
May 20 2007
parent Walter Bright <newshound1 digitalmars.com> writes:
Bill Baxter wrote:
 I just wouldn't like to see 'in' take on a meaning other than "this 
 parameter is being passed _in_ and you shouldn't expect to be able to 
 get any information _out_ using it"

I also think that would be a disastrously confusing change from what people having been currently using 'in' for. It'd be like wearing those funny glasses that turn the world upside down.
May 20 2007
prev sibling parent reply Walter Bright <newshound1 digitalmars.com> writes:
Regan Heath wrote:
 Why not make this implicit 'in' parameter 'scope const final' avoiding the
need to explicity say 'in' everywhere?

It's a good idea, but then there needs to be some way to pass a mutable reference. Such as:

  class C { int x; }

  void foo(C c)
  {
      c.x = 3;
  }

That doesn't work if 'const' is the default. Using out or inout doesn't work either, as those have slightly different semantics (an extra level of indirection).
May 20 2007
next sibling parent reply Derek Parnell <derek nomail.afraid.org> writes:
On Sun, 20 May 2007 20:10:44 -0700, Walter Bright wrote:

 Regan Heath wrote:
 Why not make this implicit 'in' parameter 'scope const final' avoiding the
need to explicity say 'in' everywhere?

It's a good idea, but then there needs to be some way to pass a mutable reference. Such as:

  class C { int x; }

  void foo(C c)
  {
      c.x = 3;
  }

That doesn't work if 'const' is the default. Using out or inout doesn't work either, as those have slightly different semantics (an extra level of indirection).

What about 'const ref' meaning that a reference is being passed and that reference is constant but not the data being referred to...?

  class C { int x; }

  void foo(const ref C c)
  {
      c.x = 3;   // okay
      c = new C; // fail
  }

I know I wouldn't mind changing my code to do this. It is just a lot safer and I'd rather "get it right" now than some years down the road with D.

-- 
Derek
(skype: derek.j.parnell)
Melbourne, Australia
"Justice for David Hicks!"
21/05/2007 1:53:21 PM
May 20 2007
parent reply Walter Bright <newshound1 digitalmars.com> writes:
Derek Parnell wrote:
 What about 'const ref' meaning that a reference is being passed and that
 reference is constant but not the data being referred to...?
 
  class C { int x; }
  void foo(const ref C c)
  {
       c.x = 3; // okay
       c = new C; // fail
  }

The trouble is that ref adds an extra level of indirection, and it would be confusing to say that in this case it didn't. Another option is to reuse 'inout' to mean 'mutable', since 'inout' is replaced by 'ref'.
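(For readers unfamiliar with the extra level of indirection Walter refers to, a minimal self-contained example; the names are invented, and 'ref' is used as the replacement for 'inout' discussed in this thread.)

  class C { int x; }

  void byValue(C c)   { c = new C; }  // rebinds only the local copy of the reference
  void byRef(ref C c) { c = new C; }  // one more level of indirection: rebinds the caller's variable

  void main()
  {
      C a = new C;
      byValue(a);   // a still refers to the original object
      byRef(a);     // a now refers to a freshly allocated object
  }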
May 20 2007
parent reply Bill Baxter <dnewsgroup billbaxter.com> writes:
Walter Bright wrote:
 Derek Parnell wrote:
 What about 'const ref' meaning that a reference is being passed and that
 reference is constant but not the data being referred to...?

  class C { int x; }
  void foo(const ref C c)
  {
       c.x = 3; // okay
       c = new C; // fail
  }

The trouble is that ref adds an extra level of indirection, and it would be confusing to say that in this case it didn't. Another option is to reuse 'inout' to mean 'mutable', since 'inout' is replaced by 'ref'.

...which is what my last message was suggesting. Any reason why that wouldn't work?

There is the question of what would happen to "out" and how you'd get out behavior applied to the pointer rather than the value.

And while "mutable" is on the table, is D going to have a story for private mutable members that don't affect the interface? Like the classic private mutable cache member in C++.

--bb
May 20 2007
parent reply Walter Bright <newshound1 digitalmars.com> writes:
Bill Baxter wrote:
 Walter Bright wrote:
 Another option is to reuse 'inout' to mean 'mutable', since 'inout' is 
 replaced by 'ref'.


You're right, I read your posting too quickly.
 Any reason why that wouldn't work?

Breaking existing code.
 There is the question of what would happen to "out" and 
 how you'd get out behavior applied to the pointer rather than the value.

I'd leave 'out' as it is.
 And while "mutable" is on the table, is D going to have a story for 
 private mutable members that don't affect the interface?  Like the 
 classic private mutable cache member in C++.

Ah, the "logical constness" design pattern. I personally loathe that <g>. Const but mutable data just smacks of being far too clever.
May 21 2007
parent reply Regan Heath <regan netmail.co.nz> writes:
Walter Bright Wrote:
 Bill Baxter wrote:
 Walter Bright wrote:
 Another option is to reuse 'inout' to mean 'mutable', since 'inout' is 
 replaced by 'ref'.


You're right, I read your posting too quickly.
 Any reason why that wouldn't work?

Breaking existing code.
 There is the question of what would happen to "out" and 
 how you'd get out behavior applied to the pointer rather than the value.

I'd leave out as it is.
 And while "mutable" is on the table, is D going to have a story for 
 private mutable members that don't affect the interface?  Like the 
 classic private mutable cache member in C++.

Ah, the "logical constness" design pattern. I personally loathe that <g>. Const but mutable data just smacks of being far too clever.

My feeling is that if we have 'scope const final' as the default and implicit, then we do need some way to escape it, as we've all suggested. I think the best way is, as Daniel suggested, that any explicit keyword overrides the implicit/default ones, so:

  void foo(int i) {}        //scope, const, final
  void foo(const int i) {}  //just const

..etc..

So, that just leaves the problem you (Walter) proposed of:
class C { int x; }
void foo(C c)
{
     c.x = 3;
}

and being able to pass a mutable reference.

Would this reference be 'scope' or 'final'? My understanding is that 'final' means the reference itself could not be changed and 'scope' means an outside reference cannot be given its value. It seems to me you want both of these ('scope' because the reference will persist outside the function and 'final' because the very point of 'ref' is to be able to modify the reference) except in cases where you pass it by 'ref', in which case you want neither.

Assuming we want 'scope' and 'final' applied to these mutable references, then I dislike re-using 'inout', because (and perhaps this is ingrained thinking due to having used inout and out) the 'out' part of the keyword doesn't immediately appear to be happening. We're not setting the reference to something which is then used 'out'side the function; instead (as Bill mentioned) we're changing it internally, and only in this way is the change reflected outside the function. I think I'd prefer to use a new keyword like 'mutable', which in our case would be a shortcut for 'scope final'.

In a general sense it seems we have 2 classes of keyword here, the base ones:

  const, final, scope, ref

and these handy, shortcut, combination ones:

  in (default) = const, scope, final
  mutable      = scope, final

The question I think we need to ask before we decide what keyword to use is: do we want/need to have opposites for all the base keywords? Or do we want to use !<keyword>? Or do we want something else? I dislike !<keyword> purely for aesthetic reasons; to me it looks *ick*.

So, if we had opposites, what would they be?

  const - mutable?
  scope - global?
  final - mutable?

I seem to have hit a little wall here: we can't use 'mutable' for both the opposite of 'const' and of 'final', and then also for the combination of 'scope final', can we? It seems I have asked more questions than given answers; hopefully someone else can come up with a few solutions :)

Regan Heath
May 21 2007
next sibling parent reply Regan Heath <regan netmail.co.nz> writes:
Regan Heath Wrote:
 It seems to me you want both of these ('scope' because the reference will
persist outside the function and 'final' because the very point of 'ref' is to
be able to modify the reference) except in cases where you pass it by 'ref', in
which case you want neither.  

Re-reading this, it appears I have made a mistake and worded it terribly to boot. To clarify...

What I was trying to say is twofold:

1. Because we have 'ref' as an option, in the cases where we do not use 'ref' we do not need to modify the reference, and therefore it should be 'final'.
2. Because the reference is not passed by 'ref', it is a copy and will not persist outside the function, and therefore it is 'scope'.

In short, unless you use 'ref' you want 'scope final' applied to these references.

Fingers crossed I haven't made any more mistakes there.

Regan Heath
May 21 2007
parent reply gareis <dhasenan gmail.com> writes:
== Quote from Regan Heath (regan netmail.co.nz)'s article
 Regan Heath Wrote:
 It seems to me you want both of these ('scope' because the reference will
 persist outside the function and 'final' because the very point of 'ref' is to
 be able to modify the reference) except in cases where you pass it by 'ref', in
 which case you want neither.

 Re-reading this, it appears I have made a mistake and worded it terribly to
 boot. To clarify...

 What I was trying to say is twofold:
 1. Because we have 'ref' as an option, in the cases where we do not use 'ref'
 we do not need to modify the reference, and therefore it should be 'final'.
 2. Because the reference is not passed by 'ref', it is a copy and will not
 persist outside the function, and therefore it is 'scope'.

 In short, unless you use 'ref' you want 'scope final' applied to these references.

 Fingers crossed I haven't made any more mistakes there.

 Regan Heath

So wait... if I have a ref parameter, can I change the value of the reference locally without global changes? I like passing mutable copies of references. It's simple and expected behavior that I can count on.

So will there be syntax that, for example, would give me the following?

---
void func(char[] a) {
    a = a[1..$]; // good
    a[1] = 'f';  // error
}
---

For that, I'd just use final, correct?
May 24 2007
parent Regan Heath <regan netmail.co.nz> writes:
gareis Wrote:
 [...]

 So wait... if I have a ref parameter, can I change the value of the reference
 locally without global changes?

No, as that's the point of the 'ref' (the new name for 'inout') keyword. To explain:

  import std.stdio;

  void foo(ref char[] a)
  {
      a = "1,2,3";
  }

  void main()
  {
      char[] b = "testing";
      foo(b);
      writefln(b);
  }

In the above, 'b' is passed by reference to 'foo' (not a copy of 'b'), which changes the value of the reference itself. This change can be seen when foo returns and 'b' is written to the console, resulting in "1,2,3" instead of "testing". Remove 'ref' and you see "testing" on the console, as "a = .." only modifies the copy of the original reference.

In comparison, in Walter's new scheme, assuming implicit 'in' means 'final const scope', e.g.

  import std.stdio;

  void foo(char[] a)
  {
      a = "1,2,3";
  }

  void main()
  {
      char[] b = "testing";
      foo(b);
      writefln(b);
  }

you would get an error, as the "a = .." line would violate the 'final' protection.
 I like passing mutable copies of references. It's simple and expected behavior
 that I can count on.
 
 So will there be syntax that, for example, would give me the following?
 ---
 void func(char[] a) {
    a = a[1..$]; // good
    a[1] = 'f';  // error
 }
 ---
 
 For that, I'd just use final, correct?

No, I think you'd use 'const scope'. In this thread we talked about having a new 'mutable' keyword, which would mean 'const scope', e.g.

  //these would be identical declarations
  void func(mutable char[] a)
  void func(const scope char[] a)

My understanding, and I could be wrong here, is that 'final' protects the reference and 'const' protects the thing to which it refers.

In the case of arrays:

  char[] aa; //global

  void func(char[] a)
  {
      a = a[1..$]; // violates final
      a[1] = 'f';  // violates const
      aa = a;      // violates scope
  }

In the case of classes:

  class A { int b; }
  A aa; //global

  void foo(A a)
  {
      a.b = 1;     // violates const
      a = new A(); // violates final
      aa = a;      // violates scope
  }

Someone please correct me if I have this wrong/backward.

Regan
May 24 2007
prev sibling parent reply Bill Baxter <dnewsgroup billbaxter.com> writes:
Regan Heath wrote:
 Walter Bright Wrote:
 Bill Baxter wrote:
 Walter Bright wrote:
 Another option is to reuse 'inout' to mean 'mutable', since 'inout' is 
 replaced by 'ref'.


 Any reason why that wouldn't work?

 There is the question of what would happen to "out" and 
 how you'd get out behavior applied to the pointer rather than the value.

 And while "mutable" is on the table, is D going to have a story for 
 private mutable members that don't affect the interface?  Like the 
 classic private mutable cache member in C++.

 Ah, the "logical constness" design pattern. I personally loathe that <g>. Const but mutable data just smacks of being far too clever.

My feeling is that if we have 'scope const final' as the default and implicit, then we do need some way to escape it, as we've all suggested. I think the best way is, as Daniel suggested, that any explicit keyword overrides the implicit/default ones, so:

  void foo(int i) {}        //scope, const, final
  void foo(const int i) {}  //just const

..etc..

That makes sense to me too. If you don't say anything it's 'scope const final'. But if you do specify something then it's only that.

I'm not wild about the aesthetics of !const for parameters, and even less wild about the possibility that !const could become a common idiom for modifiable parameters. If it's a common way to pass a parameter, then there should be a way to express the attribute positively (like "mutable" or "variable" or "inout") in terms of what it does do, rather than what it doesn't.

--bb
May 21 2007
parent reply Walter Bright <newshound1 digitalmars.com> writes:
Bill Baxter wrote:
 That makes sense to me too.  If you don't say anything it's 'scope const 
 final'.  But if you do specify something then it's only that.

Right. There are too many qualifiers to do otherwise.
 I'm not 
 wild about the aesthetics of !const for parameters, and even less wild 
 about the possibility that !const could become common idiom for 
 modifiable parameters.  If it's a common way to pass a parameter, then 
 there should be a way to express the attribute positively (like 
 "mutable" or "variable" or "inout") in terms of what it does do, rather 
 than what it doesn't.

Uh, I think you put your finger on just where I was getting a bad feeling about !const. It's generally confusing to use negatives as attributes; i.e., having a state variable named "notFull" is a bad idea.

I'm at the moment thinking we should just bite the bullet and introduce 'mutable' as a keyword.
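(Purely hypothetical syntax, since 'mutable' was not a keyword in any released compiler: a sketch of how the positive form might read against Walter's earlier example.)

  class C { int x; }

  void foo(mutable C c)  // hypothetical: opts the parameter out of the const default
  {
      c.x = 3;           // the mutation the earlier example needs, stated positively rather than as !const
  }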
May 21 2007
next sibling parent reply Derek Parnell <derek nomail.afraid.org> writes:
On Mon, 21 May 2007 19:11:01 -0700, Walter Bright wrote:

 I'm at the moment thinking we should just bite the bullet and introduce 
 'mutable' as a keyword.

Excellent! And could the opposite of 'scope' be 'persist', maybe?

-- 
Derek
(skype: derek.j.parnell)
Melbourne, Australia
"Justice for David Hicks!"
22/05/2007 3:37:15 PM
May 21 2007
parent Walter Bright <newshound1 digitalmars.com> writes:
Derek Parnell wrote:
 On Mon, 21 May 2007 19:11:01 -0700, Walter Bright wrote:
 
 I'm at the moment thinking we should just bite the bullet and introduce 
 'mutable' as a keyword.

Excellent! And could the opposite of 'scope' be 'persist', maybe?

No, the opposite of scope would just be - nothing.
May 21 2007
prev sibling parent Dave <Dave_member pathlink.com> writes:
Walter Bright wrote:
 Bill Baxter wrote:
 That makes sense to me too.  If you don't say anything it's 'scope 
 const final'.  But if you do specify something then it's only that.


So the new 'in' would be the default if not specified, right?
 Right. There are too many qualifies to do otherwise.
 
 I'm not wild about the aesthetics of !const for parameters, and even 
 less wild about the possibility that !const could become common idiom 
 for modifiable parameters.  If it's a common way to pass a parameter, 
 then there should be a way to express the attribute positively (like 
 "mutable" or "variable" or "inout") in terms of what it does do, 
 rather than what it doesn't.

Uh, I think you put a finger on just where I was getting a bad feeling about !const. It's generally confusing to use negatives as attributes, i.e., having state variables named: "notFull" is a bad idea. I'm at the moment thinking we should just bite the bullet and introduce 'mutable' as a keyword.

A little late and FWIW, but this all sounds great to me!
May 25 2007
prev sibling parent reply Reiner Pope <some address.com> writes:
Walter Bright wrote:
 Regan Heath wrote:
 Why not make this implicit 'in' parameter 'scope const final' avoiding 
 the need to explicity say 'in' everywhere?

It's a good idea, but then there needs to be some way to pass a mutable reference. Such as:

  class C { int x; }

  void foo(C c)
  {
      c.x = 3;
  }

That doesn't work if 'const' is the default. Using out or inout doesn't work either, as those have slightly different semantics (an extra level of indirection).

At the risk of aggravating more people by adding Yet Another Keyword, how about 'const scope final' by default, with some negative keywords to undo it? Like, say, unconst and unscope:

  void foo(unconst C c)
  {
      c.x = 3;
  }

or why not use the ! operator, which already means 'not':

  void foo(!const C c)
  {
      c.x = 3;
  }

Cheers,
Reiner
May 20 2007
next sibling parent Reiner Pope <some address.com> writes:
Reiner Pope wrote:
 void foo(!const C c)
 {
     c.x = 3;
 }
 

Although, on second thoughts, the ! there looks quite invisible...
May 20 2007
prev sibling parent reply Walter Bright <newshound1 digitalmars.com> writes:
Reiner Pope wrote:
 or why not use the ! operator, which already means 'not':
 
 void foo(!const C c)
 {
     c.x = 3;
 }

That's Andrei's suggestion, too. It would take some getting used to.
May 21 2007
parent Deewiant <deewiant.doesnotlike.spam gmail.com> writes:
Walter Bright wrote:
 Reiner Pope wrote:
 or why not use the ! operator, which already means 'not':

 void foo(!const C c)
 {
     c.x = 3;
 }

That's Andrei's suggestion, too. It would take some getting used to.

I'd prefer it, though. That, or some other mechanism for const by default, but I like this syntax because no new keywords are needed and it's not overly verbose. Reiner's thought that the ! looks invisible doesn't matter, in my opinion, because you wouldn't ever write "const" without the ! for a function parameter.

A 2.0 release is the time to break existing code, and I don't see why you shouldn't do so. Existing projects can go on with only 1.0 support or convert to 2.0 as they will.
May 21 2007