
digitalmars.D - ARC on Objects not Class Definitions

reply IbanezDavy <anonymous anonymous.com> writes:
I actually asked this on reddit but I figured I'd pose the 
question here because it seems like a more fitting place for the 
discussion. I'd actually very much like to see ARC as an 
alternative to GC in D. I'd literally have no more excuses to use 
C++ anymore if that was the case. But after reading DIP74, I 
can't help but think the same mistake is being repeated with ARC 
that was repeated with the GC.

Usually, how you allocate an object has very little to do with 
the implementation of the object itself. So why have the @safe 
property specified when defining the class? Why not have it as a 
type qualifier on the object instead? This would allow people to 
opt in and out of both GC and ARC where they see fit, and not 
force some arbitrary allocation scheme that could function just 
as well with GC as with ARC. Furthermore, taking this idea to the 
logical extreme (this part isn't necessarily required), you could 
have another qualifier that lets you manage the reference 
type's memory yourself. Something like @nogc or @unsafe would 
seem appropriate.

I figured I'd ask because it seemed to make a lot of sense to me. 
Maybe one of you could enlighten me why this wouldn't work in D's 
case?
Nov 04 2015
parent reply Adam D. Ruppe <destructionator gmail.com> writes:
On Wednesday, 4 November 2015 at 19:33:07 UTC, IbanezDavy wrote:
 Usually how you allocate the object has very little to do with 
 implementation of the object itself.
The question isn't really allocation, but instead when to 
deallocate. The object itself needs to know, because the object 
may pass out references to itself. Consider something like this:

class MyString {
   private char[128] data_;
   private int length_;

   char[] getData() { return this.data_[0 .. length_]; }
}

How you allocate that isn't terribly important. You might malloc 
it, you might GC it, you might stick it on the stack. (This is 
all possible, even easy, with D today, btw.)

But when should it be freed?

char[] getString() {
   MyString s = make!MyString;
   return s.getData();
   // does s get freed here? what about the returned member, though?
}

GC solves it by keeping the object alive as long as there's any 
reference to it in living memory, determined by scanning at 
collection time. ARC injects code. If it were just an allocation 
decision, the getData function wouldn't know it needs to increase 
the reference count... and s would be freed too soon, while the 
data is still in use. The object needs to be aware, either to a) 
forbid that, making it an error to return a reference to itself, 
or b) make the return value a refcounting-aware type too.

You can get by without that stuff if you leave matters entirely 
in the programmer's hands, but the point of @safe is to take them 
out of your hands and into the compiler's.
Nov 04 2015
parent NX <nightmarex1337 hotmail.com> writes:
On Thursday, 5 November 2015 at 01:42:18 UTC, Adam D. Ruppe wrote:
 The object itself needs to know it because the object may pass 
 out references to itself.

 Consider something like this:

 class MyString {
    private char[128] data_;
    private int length_;

    char[] getData() { return this.data_[0 .. length_]; }
 }

 How you allocate that isn't terribly important. You might 
 malloc it, you might gc it, you might stick it on the stack. 
 (This is all possible, even easy, with D today btw.)

 But, when should it be freed?

 char[] getString() {
 MyString s = make!MyString;
    return s.getData();
    // does s get freed here? what about the returned member 
 though?
 }
How can the class author possibly know what's happening outside 
of his implementation, and how can the compiler understand that a 
reference to a class member is being returned? Doesn't this 
require full control flow analysis? Enlighten me please...
Nov 07 2015