
digitalmars.D.learn - convert char[4] to uint at compile time

reply Moritz Warning <moritzwarning web.de> writes:
Hi,

I have problems converting a char[4] to a uint at compile time.
All variations (I've tried) of using an enum crash dmd:

union pp { char[4] str; uint num; }
const uint x = pp("abcd").num;

This also doesn't work:

const uint x = cast(uint) x"aa aa aa aa";


Any ideas?
Dec 16 2008
parent reply BCS <ao pathlink.com> writes:
Reply to Moritz,

 Hi,
 
 I have problems to convert a char[4] to an uint at compile time. All
 variations (I've tried) of using an enum crashes dmd:
 
 union pp { char[4] str; uint num; }
 const uint x = pp("abcd").num
 This does also doesn't work:
 
 const uint x = cast(uint) x"aa aa aa aa";
 
 Any ideas?
 
template Go (char[4] arg)
{
    const uint Go = (arg[0] << 24) | (arg[1] << 16) | (arg[2] << 8) | arg[3];
}

import std.stdio;

void main()
{
    writef("%x\n", Go!("Good"));
}
Dec 16 2008
parent reply Moritz Warning <moritzwarning web.de> writes:
On Tue, 16 Dec 2008 19:54:11 +0000, BCS wrote:

 [...]
template Go (char[4] arg)
{
    const uint Go = (arg[0] << 24) | (arg[1] << 16) | (arg[2] << 8) | arg[3];
}

import std.stdio;

void main()
{
    writef("%x\n", Go!("Good"));
}
Thanks! That workaround should do it. Maybe it will be possible to just do cast(uint) "abcd" in the future. :>
Dec 16 2008
parent reply Janderson <ask me.com> writes:
Moritz Warning wrote:
 On Tue, 16 Dec 2008 19:54:11 +0000, BCS wrote:
 
 [...]
Thanks! That workaround should do it. Maybe it will be possible to just do cast(uint) "abcd" in the future. :>
That would only cast the pointer. It should be something like: cast(uint)(*"abcd") or *cast(uint*) "abcd". -Joel
Dec 23 2008
next sibling parent reply "Denis Koroskin" <2korden gmail.com> writes:
On Tue, 23 Dec 2008 11:07:08 +0300, Janderson <ask me.com> wrote:

 [...]
That would only cast the pointer. It should be something like: cast(uint)(*"abcd") or *cast(uint*) "abcd". -Joel
And what about endianness? You can't have a feature in a language that gives different results in different environments.
Dec 23 2008
parent Moritz Warning <moritzwarning web.de> writes:
On Tue, 23 Dec 2008 13:16:28 +0300, Denis Koroskin wrote:

 [...]
And what about endianness? You can't have a feature in a language that gives different results in different environments.
The use of uint in my example might be confusing. I only needed an environment-independent bit pattern of 4 bytes. An integer is used because comparing it is faster than comparing a char[4] with DMD. :/ (GDC doesn't show this behavior)
Dec 23 2008
prev sibling parent Moritz Warning <moritzwarning web.de> writes:
On Tue, 23 Dec 2008 00:07:08 -0800, Janderson wrote:

 [...]
That would only cast the pointer. It should be something like: cast(uint)(*"abcd") or *cast(uint*) "abcd". -Joel
I'd like to see "abcd" be a value type, like a decimal or hex literal. A cast(uint) would then be possible and nice.
Dec 23 2008