
digitalmars.D - Motivation for compile time function execution

reply Walter Bright <newshound digitalmars.com> writes:
The sqrt() example I gave is cute, but not a very interesting reason for 
doing compile time function execution. So what's the real reason?

A while back, Eric Anderton and Don Clugston showed how one could use 
template metaprogramming techniques to do, for example, compile time 
regular expressions. It was a marvelous technical demonstration, and 
showed that essentially any computation could be done, using templates, 
at compile time.

There were some serious problems, though:

1) They are hard and unintuitive to write (for those not comfortable 
with functional programming, which is most of us).

2) The result looks awful - templates are just syntactically unsuited to 
this, even though they're a lot easier on the eye than C++ template 
metaprograms.

3) This is off-putting enough that some question even having such 
facilities in the language, as it results in impenetrable code 
unmaintainable by joe coders.

4) While theoretically capable of any computation, such template 
metaprogramming had severe practical limitations. Every step of every 
computation required generating a unique template. This naturally is 
incredibly slow and memory consumptive (C++ metaprogramming is notorious 
for taking all night to do a build). Even worse, if you're going to use 
string templates to parse, say, a 1000 character DSL, the template name 
generated must include its arguments, so that template identifier will 
be 1000 characters long. Then, it slices off the first character, and 
generates another template for the rest (999), then 998, then 997, etc., 
until 1000 templates are generated averaging 500 characters long. It 
doesn't take much of that before the whole thing collapses.

5) In casting about for a solution, the compile time regex came up 
again. This was still strongly disliked.

6) I promised to try and make template metaprogramming less obtuse, and 
what better way to do that than to obsolete a whole class of templates, 
replacing them with ordinary functions?
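
For a concrete, if contrived, picture of the difference (the names below 
are made up, and counting characters stands in for real parsing):

// Template recursion: every step instantiates a new template whose
// mangled name embeds the remaining string.
template Count(char[] s)
{
    static if (s.length == 0)
        const int Count = 0;
    else
        const int Count = 1 + Count!(s[1 .. $]);
}

// The same computation as an ordinary function. With compile time
// function execution it is simply interpreted wherever a constant is
// required - no instantiations, no giant symbols.
int count(char[] s)
{
    int n = 0;
    for (int i = 0; i < s.length; i++)
        n++;
    return n;
}

const int a = Count!("hello");  // a chain of template instantiations
const int b = count("hello");   // one interpreted call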
Feb 15 2007
next sibling parent "Lionello Lunesu" <lionello lunesu.remove.com> writes:
Shhh.. You had me at "cute" ...

L. 
Feb 15 2007
prev sibling next sibling parent reply Sean Kelly <sean f4.ca> writes:
Walter Bright wrote:
 The sqrt() example I gave is cute, but not a very interesting reason for 
 doing compile time function execution. So what's the real reason?
 
 A while back, Eric Anderton and Don Clugston showed how one could use 
 template metaprogramming techniques to do, for example compile time 
 regular expressions. It was a marvelous technical demonstration, and 
 showed that essentially any computation could be done, using templates, 
 at compile time.
 
 There were some serious problems, though:
 
 1) They are hard and unintuitive to write (for those not comfortable 
 with functional programming, which is most of us).
 
 2) The result looks awful - templates are just syntactically unsuited to 
 this, even though they're a lot easier on the eye than C++ template 
 metaprograms.
 
 3) This is off-putting enough that some question even having such 
 facilities in the language, as it results in impenetrable code 
 unmaintainable by joe coders.
 
 4) While theoretically capable of any computation, such template 
 metaprogramming had severe practical limitations. Every step of every 
 computation required generating a unique template. This naturally is 
 incredibly slow and memory consumptive (C++ metaprogramming is notorious 
 for taking all night to do a build). Even worse, if you're going to use 
 string templates to parse, say, a 1000 character DSL, the template name 
 generated must include its arguments, so that template identifier will 
 be 1000 characters long. Then, it slices off the first character, and 
 generates another template for the rest (999), then 998, then 997, etc., 
 until 1000 templates are generated averaging 500 characters long. It 
 doesn't take much of that before the whole thing collapses.
 
 5) In casting about for a solution, the compile time regex came up 
 again. This was still strongly disliked.
 
 6) I promised to try and make template metaprogramming less obtuse, and 
 what better way to do that than to obsolete a whole class of templates, 
 replacing them with ordinary functions?
I think this is a fantastic feature. It uses plain old D syntax and offers all the computational power of templates. Will this work for template functions that follow the rules as well? Also, is there any way to test whether a function is being executed at run time? Perhaps by placing a pragma(msg) inside the function body? Sean
Feb 15 2007
parent reply Walter Bright <newshound digitalmars.com> writes:
Sean Kelly wrote:
 I think this is a fantastic feature.  It uses plain old D syntax and 
 offers all the computational power of templates.  Will this work for 
 template functions that follow the rules as well?
Yes.
 Also, is there any 
 way to test whether a function is being executed at run time?
Yes, it will always be executed at runtime if it is not in a context where it is required to be folded to a constant.
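
For illustration (a made-up example, not from the compiler documentation): 
the same function serves both contexts, and only the context decides.

int square(int x) { return x * x; }

const int c = square(4);   // a constant is required: folded at compile time

void main()
{
    int y = 4;
    int r = square(y);     // ordinary call, executed at runtime
}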
Feb 15 2007
parent reply Kevin Bealer <kevinbealer gmail.com> writes:
Walter Bright wrote:
 Sean Kelly wrote:
 I think this is a fantastic feature.  It uses plain old D syntax and 
 offers all the computational power of templates.  Will this work for 
 template functions that follow the rules as well?
Yes.
 Also, is there any way to test whether a function is being executed at 
 run time?
Yes, it will always be executed at runtime if it is not in a context where it is required to be folded to a constant.
This looks awesome, by the way! I don't have this version of dmd yet, but it looks like I can convert any expression to be run at compile time or runtime with not too much extra syntax.

int hack(int a, int b)
{
    return sqrt(a) + b*b;
}

int heck()
{   // at runtime
    return hack(10, 20);
}

int hick()
{   // compile time
    mixin("return "~ToString!(hack(11,22))~";");
}

Kevin
Feb 15 2007
parent Sean Kelly <sean f4.ca> writes:
Kevin Bealer wrote:
 Walter Bright wrote:
 Sean Kelly wrote:
 I think this is a fantastic feature.  It uses plain old D syntax and 
 offers all the computational power of templates.  Will this work for 
 template functions that follow the rules as well?
Yes.
 Also, is there any way to test whether a function is being executed 
 at run time?
Yes, it will always be executed at runtime if it is not in a context where it is required to be folded to a constant.
This looks awesome, by the way! I don't have this version of dmd yet, but it looks like I can convert any expression to be run at compile time or runtime with not too much extra syntax.

int hack(int a, int b)
{
    return sqrt(a) + b*b;
}

int heck()
{   // at runtime
    return hack(10, 20);
}
I think the above will actually be folded to a constant value at compile time, assuming sqrt is defined properly? Sean
Feb 15 2007
prev sibling next sibling parent reply Miles <_______ _______.____> writes:
Walter Bright wrote:
 Even worse, if you're going to use
 string templates to parse, say, a 1000 character DSL, the template name
 generated must include its arguments, so that template identifier will
 be 1000 characters long.
Wait... this sounds too wrong... do instantiated templates with string arguments hold the whole string in the template symbol?

(checking myself...) Ugh... yes, they do. Worse, the string is in hexadecimal...

Using the whole string in the template symbol limits the parameter size. On Linux, IIRC, the maximum symbol size is 4096 bytes, which limits strings to a little less than 2000 bytes under the current scheme.

Is there some rationale for not using an MD5 hash of the string?

1. Although MD5 is no longer a good cryptographic hash, it is fast and good at avoiding collisions;
2. collisions should not be a problem unless the programmer really wants to collide symbols, for whatever strange reason;
3. it uses a fixed amount of space in the symbol name, allowing parameters of arbitrary size;
4. it reduces the object file's text space for long strings;
5. it should reduce the overhead on the compiler and the linker.

Better yet would be to encode the MD5 hash using a variant of Base64 instead of hexadecimal digits, to make it shorter.
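
To put rough numbers on it (back-of-the-envelope only, assuming a 
1000-character template argument and a 16-byte digest):

import std.stdio;

void main()
{
    int n = 1000;  // length of the string template argument
    writefln("hex-encoded argument: %d chars in the symbol", n * 2);
    writefln("MD5 digest, hex:      %d chars", 16 * 2);
    writefln("MD5 digest, base64:   %d chars (with padding)", ((16 + 2) / 3) * 4);
}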
Feb 15 2007
parent reply Walter Bright <newshound digitalmars.com> writes:
Miles wrote:
 Is there some rationale for not using an MD5 hash of the string?
It's not reversible - no pretty printing, no demangling, etc.
Feb 15 2007
parent reply Miles <_______ _______.____> writes:
Walter Bright wrote:
 Miles wrote:
 Is there some rationale for not using an MD5 hash of the string?
It's not reversible - no pretty printing, no demangling, etc.
For common usage, does it really need to be reversible? Pretty printing and demangling are only used for debugging purposes, so this information should go into the debug sections of the object file, which are not loaded into memory unless you are running the debugger. And let the debugger use that information to extract what the symbol means.
Feb 15 2007
parent reply Walter Bright <newshound digitalmars.com> writes:
Miles wrote:
 Walter Bright wrote:
 Miles wrote:
 Is there some rationale for not using an MD5 hash of the string?
It's not reversible - no pretty printing, no demangling, etc.
For common usage, does it really need to be reversible?
Problem is, I can't tell when the uncommon use is required.
 Pretty printing and demangling are only used for debugging purposes, so,
 this information should go to debug sections of the object file, that
 are not loaded into memory unless you are running the debugger. And let
 the debugger use that information to extract what the symbol means.
Don has based some runtime reflection code on demangling names.
Feb 15 2007
next sibling parent reply Robby <robby.lansaw gmail.com> writes:
Walter Bright wrote:
 Miles wrote:
 Walter Bright wrote:
 Miles wrote:
 Is there some rationale for not using an MD5 hash of the string?
It's not reversible - no pretty printing, no demangling, etc.
For common usage, does it really need to be reversible?
Problem is, I can't tell when the uncommon use is required.
 Pretty printing and demangling are only used for debugging purposes, so,
 this information should go to debug sections of the object file, that
 are not loaded into memory unless you are running the debugger. And let
 the debugger use that information to extract what the symbol means.
Don has based some runtime reflection code on demangling names.
Considering that, would it be fair to say that future reflection capabilities will be based on demangling? Or are there any other ideas up your sleeve about how it's going to be implemented? (trying not to go way off subject, but feeding general curiosity)
Feb 15 2007
parent Walter Bright <newshound digitalmars.com> writes:
Robby wrote:
 Walter Bright wrote:
 Don has based some runtime reflection code on demangling names.
Considering that, would it be fair to say that future reflection capabilities will be based on demangling? Or are there any other ideas up your sleeve about how it's going to be implemented? (trying not to go way off subject, but feeding general curiosity)
It might be, too soon to tell.
Feb 15 2007
prev sibling parent Don Clugston <dac nospam.com.au> writes:
Walter Bright wrote:
 Miles wrote:
 Walter Bright wrote:
 Miles wrote:
 Is there some rationale for not using an MD5 hash of the string?
It's not reversible - no pretty printing, no demangling, etc.
For common usage, does it really need to be reversible?
Problem is, I can't tell when the uncommon use is required.
 Pretty printing and demangling are only used for debugging purposes, so,
 this information should go to debug sections of the object file, that
 are not loaded into memory unless you are running the debugger. And let
 the debugger use that information to extract what the symbol means.
Don has based some runtime reflection code on demangling names.
I wonder how relevant that still is.... <g>
Feb 15 2007
prev sibling next sibling parent reply Hasan Aljudy <hasan.aljudy gmail.com> writes:
You know .. I proposed this idea a looooooong time ago
http://www.digitalmars.com/d/archives/digitalmars/D/announce/3244.html

I wonder what was your motivation for ignoring me back then <g>

P.S. My terms were stupid .. this kind of meta-programming is not a 
special case of constant folding; it's a *general* case of constant folding.

Walter Bright wrote:
 The sqrt() example I gave is cute, but not a very interesting reason for 
 doing compile time function execution. So what's the real reason?
 
 A while back, Eric Anderton and Don Clugston showed how one could use 
 template metaprogramming techniques to do, for example compile time 
 regular expressions. It was a marvelous technical demonstration, and 
 showed that essentially any computation could be done, using templates, 
 at compile time.
 
 There were some serious problems, though:
 
 1) They are hard and unintuitive to write (for those not comfortable 
 with functional programming, which is most of us).
 
 2) The result looks awful - templates are just syntactically unsuited to 
 this, even though they're a lot easier on the eye than C++ template 
 metaprograms.
 
 3) This is off-putting enough that some question even having such 
 facilities in the language, as it results in impenetrable code 
 unmaintainable by joe coders.
 
 4) While theoretically capable of any computation, such template 
 metaprogramming had severe practical limitations. Every step of every 
 computation required generating a unique template. This naturally is 
 incredibly slow and memory consumptive (C++ metaprogramming is notorious 
 for taking all night to do a build). Even worse, if you're going to use 
 string templates to parse, say, a 1000 character DSL, the template name 
 generated must include its arguments, so that template identifier will 
 be 1000 characters long. Then, it slices off the first character, and 
 generates another template for the rest (999), then 998, then 997, etc., 
 until 1000 templates are generated averaging 500 characters long. It 
 doesn't take much of that before the whole thing collapses.
 
 5) In casting about for a solution, the compile time regex came up 
 again. This was still strongly disliked.
 
 6) I promised to try and make template metaprogramming less obtuse, and 
 what better way to do that than to obsolete a whole class of templates, 
 replacing them with ordinary functions?
Feb 15 2007
next sibling parent "Andrei Alexandrescu (See Website For Email)" <SeeWebsiteForEmail erdani.org> writes:
Hasan Aljudy wrote:
 You know .. I proposed this idea a looooooong time ago
 http://www.digitalmars.com/d/archives/digitalmars/D/announce/3244.html
 
 I wonder what was your motivation for ignoring me back then <g>
You weren't ignored; a synapse was reinforced there, and with time, enough neurons fired. :o) Andrei
Feb 15 2007
prev sibling parent reply Walter Bright <newshound digitalmars.com> writes:
Hasan Aljudy wrote:
 You know .. I proposed this idea a looooooong time ago
 http://www.digitalmars.com/d/archives/digitalmars/D/announce/3244.html
 
 I wonder what was your motivation for ignoring me back then <g>
Lots of great ideas get posted here, it just takes time to get to them. This one wasn't simple to implement.
Feb 15 2007
parent Hasan Aljudy <hasan.aljudy gmail.com> writes:
Walter Bright wrote:
 Hasan Aljudy wrote:
 You know .. I proposed this idea a looooooong time ago
 http://www.digitalmars.com/d/archives/digitalmars/D/announce/3244.html

 I wonder what was your motivation for ignoring me back then <g>
Lots of great ideas get posted here, it just takes time to get to them. This one wasn't simple to implement.
Yeah heheh .. I was kinda just joking ..
Feb 15 2007
prev sibling parent reply Pragma <ericanderton yahoo.removeme.com> writes:
Walter Bright wrote:
 The sqrt() example I gave is cute, but not a very interesting reason for 
 doing compile time function execution. So what's the real reason?
 
 A while back, Eric Anderton and Don Clugston showed how one could use 
 template metaprogramming techniques to do, for example compile time 
 regular expressions. It was a marvelous technical demonstration, and 
 showed that essentially any computation could be done, using templates, 
 at compile time.
 
 There were some serious problems, though:
 
 1) They are hard and unintuitive to write (for those not comfortable 
 with functional programming, which is most of us).
 
 2) The result looks awful - templates are just syntactically unsuited to 
 this, even though they're a lot easier on the eye than C++ template 
 metaprograms.
 
 3) This is off-putting enough that some question even having such 
 facilities in the language, as it results in impenetrable code 
 unmaintainable by joe coders.
 
 4) While theoretically capable of any computation, such template 
 metaprogramming had severe practical limitations. Every step of every 
 computation required generating a unique template. This naturally is 
 incredibly slow and memory consumptive (C++ metaprogramming is notorious 
 for taking all night to do a build). Even worse, if you're going to use 
 string templates to parse, say, a 1000 character DSL, the template name 
 generated must include its arguments, so that template identifier will 
 be 1000 characters long. Then, it slices off the first character, and 
 generates another template for the rest (999), then 998, then 997, etc., 
 until 1000 templates are generated averaging 500 characters long. It 
 doesn't take much of that before the whole thing collapses.
 
 5) In casting about for a solution, the compile time regex came up 
 again. This was still strongly disliked.
 
 6) I promised to try and make template metaprogramming less obtuse, and 
 what better way to do that than to obsolete a whole class of templates, 
 replacing them with ordinary functions?
I mentioned this over in announce, but I'm working on taking this to the next level.

I've found that treating meta programming like lisp (thanks to everyone who drew those parallels this week) is the first step. Adopting this mindset is important, as you'll wind up in trouble if you try to approach it like normal D code (DMD 1.006 is going to change that quite a bit). Then, I found that aggressive use of tuples makes a lot of things much easier to handle and grok; thanks BCS! Again, this kind of list passing is very lisp-ish and makes for some very straightforward templates.

To that end, I went about writing a parsing lib in the same vein as Enki. I currently have a tokenizer (attached) that will turn an input string into a tuple of token 'structs', complete with line/col information. By this, I've found that using tuples has addressed point 4 slightly by avoiding the kludge I added to the regex lib. Passing such large tuples around still bloats the template "manglespace" horribly, and generating the list is about as bad.

That said, Walter, I think you're on the right track by making functions compile-time capable. I might just refactor my code to take advantage of this.

This leads me to two questions for DMD 1.006: Say I convert my tuple-processing code into compile-time functions that use const lists instead. What is the primary tradeoff: increased compile time for unlimited call depth due to non-bloated namespaces? Is there another tradeoff that is more dominant here, and not as obvious?

-- 
- EricAnderton at yahoo
Feb 16 2007
parent reply Walter Bright <newshound digitalmars.com> writes:
Pragma wrote:
 I've found that treating meta programming like lisp (thanks to everyone 
 who drew those parallels this week) is the first step.  Adopting this 
 mindset is important, as you'll wind up in trouble 
 if you try to approach it like normal D code (DMD 1.006 is going to 
 change that quite a bit).
To someone who is comfortable with Lisp, I agree that 1, 2, and 3 are moot. But most of us aren't, and a very common request I'd get is to make metaprogramming look "like normal D code." The eventual metaprogramming goal is to get the power of Lisp expressible in the "normal D" syntax.
 To that end, I went about writing a parsing lib in the same vein as 
 Enki.  I currently have a tokenizer (attached) that will turn an input 
 string into a tuple of token 'structs', complete with line/col 
 information.
I was thinking of writing one myself, you beat me to it.
 By this, I've found that using tuples has addressed point 4 slightly by 
 avoiding the kludge I added to the regex lib.
Right, tuples don't fix the mangling problem at all.
 Passing such large tuples around still bloats the template "manglespace" 
 horribly, and generating the list is about as bad.
Yes, this is the motivation for compile time interpretation.
 That said, Walter, I think you're on the right track by making functions 
 compile-time capable.  I might just refactor my code to take advantage 
 of this.
I think for parsing strings, this will vastly improve things.
 This leads me to two questions for DMD 1.006:
 
 Say I convert my tuple-processing code into compile-time functions that 
 use const lists instead.
I think the right result would be array literals.
 What is the primary tradeoff, increased 
 compile-time for unlimited call-depth due to non-bloated namespaces?  Is 
 there another tradeoff that is more dominant here, and not as obvious?
The tradeoff is you won't be able to generate tuples with functions, but you will be able to handle a couple orders of magnitude larger datasets to work on.
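
As a rough sketch of what that refactoring might look like (hypothetical 
names, untested, and assuming array building is allowed under the new rules):

// Returns the whitespace-separated words of src as an array, instead of
// building a tuple one template instantiation at a time.
char[][] words(char[] src)
{
    char[][] result;
    char[] cur;
    for (int i = 0; i < src.length; i++)
    {
        if (src[i] == ' ')
        {
            if (cur.length)
                result ~= cur;
            cur = null;
        }
        else
            cur = cur ~ src[i];
    }
    if (cur.length)
        result ~= cur;
    return result;
}

// A constant initializer forces the call to happen at compile time.
const char[][] toks = words("parse me at compile time");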
Feb 16 2007
next sibling parent reply Pragma <ericanderton yahoo.removeme.com> writes:
Walter Bright wrote:
 Pragma wrote:
 I've found that treating meta programming like lisp (thanks to 
 everyone who drew those parallels this week) is the first step.  
 Adopting this mindset is important, as you'll wind 
 up in trouble if you try to approach it like normal D code (DMD 1.006 
 is going to change that quite a bit).
To someone who is comfortable with Lisp, I agree that 1, 2, and 3 are moot. But most of us aren't, and a very common request I'd get is to make metaprogramming look "like normal D code." The eventual metaprogramming goal is to get the power of Lisp expressible in the "normal D" syntax.
Actually, FWIW, I'm not comfortable with Lisp. D template coding is just "noisy" enough for me to grok as a list processing grammar. In contrast, my comprehension of Lisp breaks down after about three or four nestings. Yea, I'm probably a huge weirdo in that respect. ;p But I'm all for the changes and improvements. Your rationale for adding to things is dead on.
 
 
 To that end, I went about writing a parsing lib in the same vein as 
 Enki.  I currently have a tokenizer (attached) that will turn an input 
 string into a tuple of token 'structs', complete with line/col 
 information.
I was thinking of writing one myself, you beat me to it.
Well, consider it public domain then. Everyone, hack away as you wish.
 

 By this, I've found that using tuples has addressed point 4 
 slightly by avoiding the kludge I added to the regex lib.
Right, tuples don't fix the mangling problem at all.
 Passing such large tuples around still bloats the template 
 "manglespace" horribly, and generating the list is about as bad.
Yes, this is the motivation for compile time interpretation.
 That said, Walter, I think you're on the right track by making 
 functions compile-time capable.  I might just refactor my code to take 
 advantage of this.
I think for parsing strings, this will vastly improve things.
 This leads me to two questions for DMD 1.006:

 Say I convert my tuple-processing code into compile-time functions 
 that use const lists instead.
I think the right result would be array literals.
You're right. My fingertips weren't connected to my brain in that sentence.
 
 What is the primary tradeoff, increased compile-time for unlimited 
 call-depth due to non-bloated namespaces?  Is there another tradeoff 
 that is more dominant here, and not as obvious?
The tradeoff is you won't be able to generate tuples with functions, but you will be able to handle a couple orders of magnitude larger datasets to work on.
Fantastic. D is starting to look like one of those all-in-one Swiss army knives. -- - EricAnderton at yahoo
Feb 16 2007
next sibling parent reply BCS <BCS pathlink.com> writes:
Pragma wrote:
 
 
 Fantastic.  D is starting to look like one of those all-in-one Swiss 
 army knives.
 
http://img.timeinc.net/popsci/images/2006/08/giantknife_485.jpg <G>
Feb 16 2007
parent Pragma <ericanderton yahoo.removeme.com> writes:
BCS wrote:
 Pragma wrote:
 Fantastic.  D is starting to look like one of those all-in-one Swiss 
 army knives.
http://img.timeinc.net/popsci/images/2006/08/giantknife_485.jpg <G>
Thank you! That's *exactly* the knife I was thinking of. D - Before you get around to shooting yourself in the foot, you manage to lop your fingers off with the 99 other attachments on your gun. -- - EricAnderton at yahoo
Feb 16 2007
prev sibling parent Walter Bright <newshound digitalmars.com> writes:
Pragma wrote:
 Walter Bright wrote:
 To someone who is comfortable with Lisp, I agree that 1, 2, and 3 are 
 moot. But most of us aren't, and a very common request I'd get is to 
 make metaprogramming look "like normal D code." The eventual 
 metaprogramming goal is to get the power of Lisp expressible in the 
 "normal D" syntax.
Actually, FWIW, I'm not comfortable with Lisp. D template coding is just "noisy" enough for me to grok as a list processing grammar. In contrast, my comprehension of Lisp breaks down after about three or four nestings. Yea, I'm probably a huge weirdo in that respect. ;p But I'm all for the changes and improvements. Your rationale for adding to things is dead on.
You'll have to thank Andrei and Bartosz for that; they kept hating my proposals for new syntax to support metaprogramming <g>. They kept sending me back until I understood the obvious.
Feb 16 2007
prev sibling parent BCS <BCS pathlink.com> writes:
Walter Bright wrote:
 
 
 The tradeoff is you won't be able to generate tuples with functions, but 
 you will be able to handle a couple orders of magnitude larger datasets 
 to work on.
That's something I'm wondering about: how to get static foreach from a compile time function?

One option that would be nice is to give static foreach full status as a static operator. This would require the static keyword (at least in some cases) but allow it in all the same places static if is allowed. Also this could allow const arrays to be used just like tuples:

static foreach(char[] string; strings(some, const, data))
{
    callMe!(string)();
}
Feb 16 2007