
digitalmars.D - shared switch

Imperatorn <johan_forsberg_86 hotmail.com> writes:
Regarding shared, why is preview=nosharedaccess not the default?

Instead it could have been preview=sharedaccess.

I thought the point was that you want protection, otherwise you 
would use __gshared?

Input?
Oct 07 2023
"Richard (Rikki) Andrew Cattermole" <richard cattermole.co.nz> writes:
Preview switches are there to allow people to migrate to new behaviors of the language that could cause code breakage.

A preview is meant to become the default once it has matured and the ecosystem has had a chance to adapt to it.
Oct 07 2023
Nicholas Wilson <iamthewilsonator hotmail.com> writes:
On Saturday, 7 October 2023 at 09:26:09 UTC, Imperatorn wrote:
> Regarding shared, why is preview=nosharedaccess not the default?

Because it breaks a bunch of stuff.

> Instead it could have been preview=sharedaccess.
>
> I thought the point was that you want protection, otherwise you
> would use __gshared?
>
> Input?

The idea is that access to shared variables is unsafe, so in order to access them you need to go through a trusted interface that does the appropriate thing (atomics, locks, whatever). To force you to use the correct interface, -preview=nosharedaccess forbids both reads and writes of shared variables (hence "no access"), so in the implementation you must `cast()` away shared (which is a safety violation), and those functions must be `@trusted` to be usable from `@safe` code.
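To illustrate the pattern, here is a minimal sketch (the names and the mutex-vs-atomics choices are made up for illustration) of what such a trusted interface tends to look like under -preview=nosharedaccess:

```d
// Hypothetical example: a shared counter whose only access paths are small
// wrappers that either use atomics or cast away shared under a lock.
import core.atomic : atomicLoad;
import core.sync.mutex : Mutex;

shared int counter;
__gshared Mutex counterLock;

shared static this()
{
    counterLock = new Mutex;
}

// With -preview=nosharedaccess, a direct `counter += 1;` is rejected,
// so we lock and cast away shared inside a @trusted function instead.
void increment() @trusted
{
    counterLock.lock();
    scope (exit) counterLock.unlock();
    int* local = cast(int*) &counter;
    *local += 1;
}

// Reads can go through core.atomic, which accepts shared operands
// directly and is callable from @safe code.
int current() @safe
{
    return atomicLoad(counter);
}
```

The point is that the `cast()` and the `@trusted` are confined to one small, auditable spot, and everything else stays `@safe`.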
Oct 07 2023
Imperatorn <johan_forsberg_86 hotmail.com> writes:
On Sunday, 8 October 2023 at 05:27:54 UTC, Nicholas Wilson wrote:
> On Saturday, 7 October 2023 at 09:26:09 UTC, Imperatorn wrote:
>> Regarding shared, why is preview=nosharedaccess not the default?
>
> Because it breaks a bunch of stuff.
>
>> Instead it could have been preview=sharedaccess.
>>
>> I thought the point was that you want protection, otherwise you
>> would use __gshared?
>>
>> Input?
>
> The idea is that access to shared variables is unsafe, so in order to access them you need to go through a trusted interface that does the appropriate thing (atomics, locks, whatever). To force you to use the correct interface, -preview=nosharedaccess forbids both reads and writes of shared variables (hence "no access"), so in the implementation you must `cast()` away shared (which is a safety violation), and those functions must be `@trusted` to be usable from `@safe` code.
Thanks for the explanation. I'm trying to explore what to do to get the maximum safety features for D. Not only safety, but also "logical safety": `@safe`, `pure`, etc. It seems I should then also use nosharedaccess, and probably dip1000, although we won't use pointers at all if possible.
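For reference, a rough sketch of what that combination looks like (the module contents are made up; the flag spellings are the current DMD ones):

```d
// Hypothetical module compiled with:
//   dmd -preview=nosharedaccess -preview=dip1000 app.d
// Module-wide attributes: everything below must be @safe and pure.
@safe pure:

int clampSpeed(int value, int minValue, int maxValue)
{
    // Purely value-based logic, so no pointers, no shared, nothing for
    // dip1000 or nosharedaccess to complain about.
    return value < minValue ? minValue : (value > maxValue ? maxValue : value);
}
```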
Oct 07 2023
Jonathan M Davis <newsgroup.d jmdavisprog.com> writes:
On Sunday, October 8, 2023 12:27:20 AM MDT Imperatorn via Digitalmars-d wrote:
> I'm trying to explore what to do to get the maximum safety features for D.
>
> Not only safety, but also "logical safety": `@safe`, `pure`, etc.
>
> It seems I should then also use nosharedaccess, and probably dip1000,
> although we won't use pointers at all if possible.
Just be aware that when you're using -preview switches, you're typically using features that are still changing as bugs (and sometimes even how the feature works) get ironed out. So, there is a much higher risk of your code breaking when using such switches, and depending on what happens with bugs in and changes to those features, the changes that they force you to make to your code may or may not actually be required in the long run.

As for those specific switches, issues with shared should be very rare in your code, because it's almost certainly the case that very little of your code should be using shared. So, while it may be worthwhile to see what the compiler says when enabling the related switch, for most programs, it shouldn't matter one way or the other - and when it does, it will typically be for a very small part of the program.

DIP 1000 is a thornier issue, because what they do with it seems to keep changing, and dealing with it when stuff gets marked with scope (either explicitly by you or implicitly by the compiler) can get annoying _really_ fast. Depending on what your code is doing, it could help you find issues, or it could just be forcing you to mark a bunch of stuff with scope to shut the compiler up about stuff that you're doing which is just fine. So, enabling it might be valuable, or it could be almost purely irritating. That being said, if you're not doing much with taking the address of local variables or slicing static arrays and the like, there really isn't going to be much for DIP 1000 to catch.

So, use the switches if you think that they'll benefit you, but be aware that they're not yet standard D, and how stable they are can vary. So, depending on what you're doing, they could easily cause more trouble than they're worth - or they could catch issues for you and save you time and trouble. It's hard to know which ahead of time.

- Jonathan M Davis
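(To make the scope/DIP 1000 point above concrete, here is a small sketch, with hypothetical functions, of the kind of escape it checks for:)

```d
// With -preview=dip1000, @safe code may take the address of a local,
// but the resulting pointer is scope and must not leave the function.
@safe int useLocal()
{
    int x = 42;
    scope int* p = &x;  // allowed under dip1000; p is confined to this frame
    return *p;          // fine: only the value escapes
}

@safe int* leakLocal()
{
    int x = 42;
    scope int* p = &x;
    // return p;        // error under dip1000: a scope pointer would escape
    return null;
}
```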
Oct 08 2023
Imperatorn <johan_forsberg_86 hotmail.com> writes:
On Sunday, 8 October 2023 at 08:26:53 UTC, Jonathan M Davis wrote:
> On Sunday, October 8, 2023 12:27:20 AM MDT Imperatorn via
> Digitalmars-d wrote:
>> [...]
> Just be aware that when you're using -preview switches, you're typically using features that are still changing as bugs (and sometimes even how the feature works) get ironed out. So, there is a much higher risk of your code breaking when using such switches, and depending on what happens with bugs in and changes to those features, the changes that they force you to make to your code may or may not actually be required in the long run. [...]
Thanks for your input. What would you personally do if you had to write an application in D with the risk of loss of life if you got a runtime error? What can be done to minimize that risk, basically, by using D?
Oct 08 2023
Jonathan M Davis <newsgroup.d jmdavisprog.com> writes:
On Sunday, October 8, 2023 8:02:22 AM MDT Imperatorn via Digitalmars-d wrote:
> On Sunday, 8 October 2023 at 08:26:53 UTC, Jonathan M Davis wrote:
>> On Sunday, October 8, 2023 12:27:20 AM MDT Imperatorn via
>> Digitalmars-d wrote:
>>> [...]
>> Just be aware that when you're using -preview switches, you're typically using features that are still changing as bugs (and sometimes even how the feature works) get ironed out. So, there is a much higher risk of your code breaking when using such switches, and depending on what happens with bugs in and changes to those features, the changes that they force you to make to your code may or may not actually be required in the long run. [...]
>
> Thanks for your input. What would you personally do if you had to write an application in D with the risk of loss of life if you got a runtime error? What can be done to minimize that risk, basically, by using D?
In general, the biggest thing there would be to try to be _very_ thorough with unit tests (and integration tests and the like, etc). The better your testing, the more issues that you'll find. Arguably, one of the biggest features of D is that it has built-in unit testing with unittest blocks, making it really easy to write tests without having to jump through a bunch of extra hoops like you have to do in most languages. It's great to find as many bugs as you can via the type system and language features, but ultimately, it's testing that's going to find most of the issues, since the language itself can't verify that your logic is correct for what you're doing.

Similarly, you should probably make liberal use of assertions (and potentially contracts, though those are mostly just a way to group assertions to be called when entering or exiting a function) where reasonable to catch issues - though if you're dealing with stuff that you want to catch in release builds, then checking and throwing exceptions on failure would be a better choice.

And of course, the big thing that usually comes up in discussions of systems where there's a real potential of loss of life is to have redundancy so that you can afford for some parts to fail. But that's obviously less of a language concern and more of a general design issue. And if you're working on such systems, you probably know more about that than I do.

As far as things like scope and shared go, restricting how much you even need them will buy you more than any feature designed to make sure that you use them correctly. In most applications, very little should be shared across threads, and restricting what is to small portions of the code base will make it much easier to both find and avoid bugs related to it. Similarly, if you're typically avoiding taking the address of local variables or slicing static arrays, scope won't matter much. scope is supposed to find bugs with regards to escaping references to local data, and there's nothing to find if you're not doing anything with references to local data. Sometimes, you do need to do that sort of thing for performance (or because of how you have to interact with C code), but minimizing how much you use risky features like that will go a long way in avoiding bugs related to them. Part of what makes D generally safer than C or C++ is how it reduces how much you're doing risky things with memory (e.g. by having the GC worry about freeing stuff).

It may make sense to at least periodically use the preview flags in a build to see if you might need to update your code, but how much sense it's going to make to do that is really going to depend on what your code is doing. If you're in an environment where you actually need to take the address of locals all over the place for performance reasons or whatnot, then it could be worth the pain of just turning on the switch for DIP 1000 and using it all the time, whereas if you're doing relatively little with taking the address of locals or slicing static arrays, worrying about DIP 1000 could just be wasting your time.

As for shared, it may be worth just turning on the switch, because you want the compiler to basically not let you do anything with shared other than store a variable that way. Typically, the two ways that shared needs to be handled are:

1. You have an object which is shared and which you do no operations on while it's shared. In the sections where you need to operate on it, you either use atomics on it, or you protect that section of code with a mutex and cast away shared to get a thread-local reference to the data. You can then do whatever you need to do with that thread-local reference, making sure that it doesn't escape anywhere (which scope may or may not help with), and then when you're done, make sure that no thread-local references exist before releasing the mutex. Because of the casts, the code in question will need to be @trusted if you want to use it with @safe code, which should help segregate that section of the code.

2. You have a type which is designed to be operated on as shared. It has shared member functions, and you use it normally as if it weren't shared. Internally, whenever the object needs to do anything with its data, it either uses atomics, or it locks a mutex and temporarily casts away shared, ensuring that no thread-local references escape.

Whichever way you handle it, you basically want the compiler complaining any time you do anything with shared that isn't guaranteed to be thread-safe, which functionally means that you want it to complain when you do just about anything with it other than call a shared member function. So, if the switch means that the compiler complains more, that's probably a good thing. However, the exact meaning of the switch does risk changing over time (e.g. there's been some discussion about changing it so that shared integer types just do the atomics for you automatically for basic operations, whereas for types that won't directly work with atomics, it would just be an error to do anything with them as shared other than call shared member functions), so depending on exactly what happens, the switch could get annoying, but in general, if it's just going to flag more stuff as an error with shared, then that's usually a good thing. And since in the vast majority of programs, shared should be in a very small portion of your code base, any issues with a compiler switch should be of minimal cost.

But I'd have to see exactly what the preview flag for shared was complaining about (or not complaining about) in a code base to see whether it really made sense to enable it normally or not. Honestly, it wouldn't surprise me if the primitives intended to be used with shared (such as Mutex in core.sync.mutex) were the most annoying parts with regards to shared, simply because they were originally written before shared, and shared hasn't necessarily been applied to them correctly. That's the sort of thing that needs to be sorted out along with the exact behavior of the switch.

In any case, in general, I would say that you should use @safe as much as reasonably possible (with @trusted used as little as reasonably possible) and test, test, test. How much the type system will be able to help you catch stuff will change over time (hopefully exclusively for the better) as features like scope and shared are better sorted out, but ultimately, you're going to need to catch anything that gets through the cracks by testing - and with how much of the code's behavior depends on logic that only the folks working on it are going to know and understand, and the compiler can't possibly catch, those cracks will always be large. And that's true of any code base.

- Jonathan M Davis
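(As a rough sketch of option 2 above: a made-up `SharedCounter` type, using atomics rather than a mutex for brevity, together with one of the built-in `unittest` blocks mentioned earlier:)

```d
// Hypothetical shared-aware type: the only operations are shared member
// functions, and they do the synchronization (here, atomics) internally.
import core.atomic : atomicLoad, atomicOp;

struct SharedCounter
{
    private int count;

    void increment() shared @safe
    {
        atomicOp!"+="(count, 1);   // count is shared(int) here
    }

    int get() const shared @safe
    {
        return atomicLoad(count);
    }
}

unittest
{
    shared SharedCounter c;
    c.increment();
    c.increment();
    assert(c.get() == 2);
}
```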
Oct 08 2023
Imperatorn <johan_forsberg_86 hotmail.com> writes:
On Monday, 9 October 2023 at 02:07:50 UTC, Jonathan M Davis wrote:
> On Sunday, October 8, 2023 8:02:22 AM MDT Imperatorn via
> Digitalmars-d wrote:
>> [...]
> In general, the biggest thing there would be to try to be _very_ thorough with unit tests (and integration tests and the like, etc). The better your testing, the more issues that you'll find. Arguably, one of the biggest features of D is that it has built-in unit testing with unittest blocks, making it really easy to write tests without having to jump through a bunch of extra hoops like you have to do in most languages. [...]
Whoa, I didn't expect such a long and well-formulated answer. I will have to digest this for some time. But yes, I agree. Even languages that are formally verified can have logical errors, so in that sense testing is the best you can do. Thanks for your answer.
Oct 09 2023