
digitalmars.D - strong typing

reply florian <florian_member pathlink.com> writes:
I really like the fact that typedef creates strong types, and not merely aliases
as in C. However, some very simple tests I just ran show some limitations. Here
they are:
void main()
{
typedef int foo;
typedef int bar;
foo f=1;
bar b=2;
int i=3;
long l=4;
f = i;		// error: cannot implicitly convert expression (i) of type int to foo
f = b;		// error: cannot implicitly convert expression (b) of type bar to foo
i= f;		// no error
l= f;
f=l;		// cannot implicitly convert expression (l) of type long to foo
b = f + b;	// error: cannot implicitly convert expression (f + b) of type foo to bar
// note: somehow, dmd didn't mind converting b from type bar to type foo in order to add it

b = b + f;	// no error
b = b * f;	// no error either, so it is nothing special about the + operator
}

So to sum up, strong typing works in most cases, except:
* assigning a personal type (foo, bar) to a native one (int, long, ...)
* converting the personal type of the right operand to the type needed to
perform the operation

For the first one, the explanation is simply that implicitly casting to a native type is OK.

For the second one, my guess at the explanation is:
* implicitly casting to a native type is OK
* the signature of the + operator of type bar is "bar bar.opAdd(int)"

This doesn't feel right to me. While I understand that implicitly casting
personal types to native ones could be considered a useful thing (though I don't
really like it), I can't find any good reason why the operator should take an int
as its right argument (and, through implicit casting, any personal type).

So, is there some reason I missed for this behavior, or did I just find a bug?

btw, I got this behaviour with dmd 0.125, and didn't try it on other versions.
May 22 2005
parent reply "Unknown W. Brackets" <unknown simplemachines.org> writes:
I think some of these make sense.

 f = i;		// error: cannot implicitly convert
 f = b;		// error: cannot implicitly convert

I like those two. You should have to cast from int to foo.
 i = f;		// no error

This makes some sense to me, as f can implicitly cast to an int - it is one. The reverse just isn't true (an int is not necessarily a foo.)
 b = f + b;

I agree the addition should have given an error; although, I suppose, maybe it cast b implicitly to an int and then added it to f. That's logical.
 b = b + f;

See above logic.

Anyway, I think implicitly casting back to the "origin type" makes sense. After all, that's not what you're trying to avoid - you're trying to avoid the origin type being treated as it. Example:

typedef void* HANDLE;

Now, if I want to pass a HANDLE to:

HANDLE someWeirdFunc(void* pointer)

Seems like I should be able to. But, if I had this function:

void CloseHandle(HANDLE h)

It should *only* work on HANDLEs, not on void*s or anything like that.

-[Unknown]
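The HANDLE case spelled out as one compilable sketch (again assuming D1-era `typedef`; `someWeirdFunc` and `CloseHandle` are illustrative names from the post, not a real API):

```d
typedef void* HANDLE;

HANDLE someWeirdFunc(void* pointer) { return cast(HANDLE)pointer; }
void CloseHandle(HANDLE h) { /* release the handle */ }

void main()
{
    void* p;
    HANDLE h = someWeirdFunc(p); // OK: a HANDLE implicitly converts to void*
    CloseHandle(h);              // OK: h really is a HANDLE
    // CloseHandle(p);           // should be an error: a void* is not a HANDLE
}
```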
May 23 2005
parent florian <florian_member pathlink.com> writes:
I think some of these make sense.

 f = i;		// error: cannot implicitly convert
 f = b;		// error: cannot implicitly convert

I like those two. You should have to cast from int to foo.

 i = f;		// no error

This makes some sense to me, as f can implicitly cast to an int - it is one. The reverse just isn't true (an int is not necessarily a foo.)

 b = f + b;

I agree the addition should have given an error; although, I suppose, maybe it cast b implicitly to an int and then added it to f. That's logical.

"int * foo" has type int - doesn't make sense. "foo * bar" has type foo - doesn't make sense.

The current behavior is that "foo op bar" has type foo, no matter what the op is, and no matter what foo and bar are (native type or typedef of a native type), because "foo op int" is always defined and bar can be cast to int implicitly. While this does make sense in a few subcases, I don't think it should be the rule.

Moreover, when you say that 3 foo is a foo, you're thinking in terms of dimension (in the physics sense of the term). By the same logic, foo * foo is not of type foo but of type foo_squared, and foo * bar is of type foobar. Should types match dimensions? I don't think so - it would require the compiler to generate new types every time you multiply something.

I am not sure what the proper solution is here, but something feels wrong.
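If one really did want dimension-style types, the way to get them without asking the compiler to invent types on the fly would be to drop typedef and name each dimension as its own struct - a hypothetical sketch, not something the thread settles on (the names `foo_squared` and `mul` are made up for illustration):

```d
struct foo         { int value; }
struct foo_squared { int value; }

// foo times foo yields foo_squared, and no other combination compiles,
// because the types are fully distinct structs rather than typedefs of int.
foo_squared mul(foo a, foo b)
{
    foo_squared r;
    r.value = a.value * b.value;
    return r;
}
```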
May 23 2005