
digitalmars.D - A Perspective on D from game industry

reply "Peter Alexander" <peter.alexander.au gmail.com> writes:
http://c0de517e.blogspot.ca/2014/06/where-is-my-c-replacement.html?m=1

The arguments against D are pretty weak if I'm honest, but I 
think it's important we understand what people think of D. I can 
confirm this sentiment is fairly common in the industry.

Watch out for the little jab at Andrei :-P
Jun 15 2014
next sibling parent reply "Dicebot" <public dicebot.lv> writes:
On Sunday, 15 June 2014 at 11:28:12 UTC, Peter Alexander wrote:
 http://c0de517e.blogspot.ca/2014/06/where-is-my-c-replacement.html?m=1

 The arguments against D are pretty weak if I'm honest, but I 
 think it's important we understand what people think of D. I 
 can confirm this sentiment is fairly common in the industry.

 Watch out for the little jab at Andrei :-P
I like how he says that productivity is important and mentions fear of meta-programming in the same article ;) Interesting though, I had a totally different set of demands and expectations when I used to work with C/C++. Feels like the industry matters much more than the language here.
Jun 15 2014
next sibling parent reply "Peter Alexander" <peter.alexander.au gmail.com> writes:
On Sunday, 15 June 2014 at 11:45:30 UTC, Dicebot wrote:
 I like how he says that productivity is important and mentions 
 fear of meta-programming in the same article ;)
That's true, but meta programming is just a tool. Would be nice to implement dynamic visualisation and interactivity with it though.

The fear of meta programming comes from Boost, and rightly so in my opinion. Boost is written with the assumption that users will never have to read its source code. When it comes to debugging and performance tuning, however, that assumption is shattered.

Fortunately, D makes meta programming simpler, but it's something to keep in mind.
 Interesting though, I had totally different set of demands and 
 expectation when used to work with C/C++. Feels like industry 
 matters much more than a language here.
Absolutely. I'm beginning to learn of these differences since leaving gamedev :-)
Jun 15 2014
next sibling parent reply "Dicebot" <public dicebot.lv> writes:
On Sunday, 15 June 2014 at 13:50:10 UTC, Peter Alexander wrote:
 On Sunday, 15 June 2014 at 11:45:30 UTC, Dicebot wrote:
 I like how he says that productivity is important and mentions 
 fear of meta-programming in the same article ;)
That's true, but meta programming is just a tool. Would be nice to implement dynamic visualisation and interactivity with it though.
Well I'd call it productivity tool number one when it comes to language features (as opposed to external tools). Best way to be productive at writing code is to not write it at all :)
 The fear of meta programming comes from Boost, and rightly so in
 my opinion. Boost is written with the assumption that users will
 never have to read its source code. When it comes to debugging
 and performance tuning however, that assumption is shattered.

 Fortunately, D makes meta programming more simple, but it's
 something to keep in mind.
C++ has done huge damage to the meta-programming paradigm, resulting in many programmers thinking first about how horrible the actual implementation is and rarely even considering what it could help achieve with a more reasonable design. Had some good times of my own debugging Boost::Spirit2 >_<
Jun 15 2014
parent reply "Paolo Invernizzi" <paolo.invernizzi no.address> writes:
On Sunday, 15 June 2014 at 16:06:08 UTC, Dicebot wrote:
 On Sunday, 15 June 2014 at 13:50:10 UTC, Peter Alexander wrote:

 Had some good time of my own debugging Boost::Spirit2 >_<
That's simply an impossible task! ;-P --- Paolo
Jun 15 2014
parent reply "Dicebot" <public dicebot.lv> writes:
On Sunday, 15 June 2014 at 21:11:38 UTC, Paolo Invernizzi wrote:
 On Sunday, 15 June 2014 at 16:06:08 UTC, Dicebot wrote:
 On Sunday, 15 June 2014 at 13:50:10 UTC, Peter Alexander wrote:

 Had some good time of my own debugging Boost::Spirit2 >_<
That's simply an impossible task! ;-P
If spending only a reasonable amount of time is in question - oh yes. If you are eager to spend months of spare time - there are some possibilities ;)

Btw, ironically this is when I fell in love with the generic paradigm, reasoning "wow, this is kind of cool despite all the C++ madness, how cool could it be with a sane language implementation?"

..or it is just a form of Stockholm syndrome.
Jun 15 2014
parent "ponce" <contact gam3sfrommars.fr> writes:
On Sunday, 15 June 2014 at 21:18:10 UTC, Dicebot wrote:
 If spending only reasonable time is in question - oh yes. If 
 you are eager to spend months of spare time - there are some 
 possibilities ;) btw ironically this is when I have felt in 
 love with generic paradigm, with reasoning "wow, this is kind 
 of cool despite all C++ madness, how cool it can be with same 
 language implementation?".

 ..or it is just form of Stockholm syndrome.
C++ meta-programming is already quite cool when you get how to make duck-typed algorithms, STL-style. It's the C++ culture that might be a problem: the STL style is rarely used by most practitioners.
Jun 15 2014
prev sibling next sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 6/15/2014 6:50 AM, Peter Alexander wrote:
 The fear of meta programming comes from Boost, and rightly so in
 my opinion. Boost is written with the assumption that users will
 never have to read its source code. When it comes to debugging
 and performance tuning however, that assumption is shattered.
For years I avoided C++ templates (even though I implemented them in DMC++) because they were just so dang hard to read. D originally was not going to have templates for that reason.

But I eventually discovered that hiding behind the complexity of C++ templates was a very simple idea - templates are just functions with compile time rather than run time arguments. (Isn't it amazing that I could implement C++ without figuring this out? I still don't understand that.) That was the enabling breakthrough for D templates.

In fact, templates engender such an "OMG! Templates! I don't get Templates!" aura about them that I convinced Andrei to not even use the word "template" in his book about D!
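For illustration, a minimal D sketch of that idea (the names here are made up, not from the post): the only difference between the two functions below is whether the argument is bound at run time or at compile time.

    import std.stdio;

    // Run-time argument: n is supplied when the program executes.
    int powerOfTwoRT(int n)
    {
        return 1 << n;
    }

    // Compile-time argument: n is supplied when the code is compiled.
    // Syntactically it is just another parameter list.
    int powerOfTwoCT(int n)()
    {
        return 1 << n;
    }

    void main()
    {
        writeln(powerOfTwoRT(4));  // argument bound at run time
        writeln(powerOfTwoCT!4()); // argument bound at compile time, per instantiation
    }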
Jun 15 2014
next sibling parent reply "Burp" <brpy yahoo.com> writes:
  I work in the game industry so I'm familiar with this type of 
mindset. Not everyone in my industry is like this, but 
unfortunately many are (I avoid working with them).

  He doesn't understand metaprogramming and so dismisses it. He 
also assumes C++ is all about Java-style OOP, when modern style 
is wildly different from Java.

  And yes, the game industry will likely *never* produce its own 
language or tools. Why? Because it is very short-term goal 
oriented, focusing almost entirely on the current project with 
little thought for long-term growth. Most companies are 
relatively small, and even large ones like EA are very 
fragmented (although EA did produce its own version of the STL).

  Basically, this guy is a *rendering engineer*, likely good at 
math and algorithms, but not so hot with design.



 For years I avoided C++ templates (even though I implemented 
 them in DMC++) because they were just so dang hard to read. D 
 originally was not going to have templates for that reason.

 But I eventually discovered that hiding behind the complexity 
 of C++ templates was a very simple idea - templates are just 
 functions with compile time rather than run time arguments. 
 (Isn't it amazing that I could implement C++ without figuring 
 this out? I still don't understand that.) That was the enabling 
 breakthrough for D templates.

 In fact, templates engender such an "OMG! Templates! I don't 
 get Templates!" aura about them that I convinced Andrei to not 
 even use the word "template" in his book about D!
Jun 15 2014
parent reply Nick Sabalausky <SeeWebsiteToContactMe semitwist.com> writes:
On 6/15/2014 4:26 PM, Burp wrote:
   I work in the game industry so I'm familiar with this type of mindset.
 Not everyone in my industry is like this, but unfortunately many are(I
 avoid working with them).

   He doesn't understand metaprogramming and so dismisses it. He also
 assumes C++ is all about Java style OOP, when modern style is wildly
 different from Java.

   And yes the game industry will likely *never* produce its own language
 or tools. Why? Because it is very short-term goal oriented, focusing
 almost entirely on the current project with little thought for long term
 growth. Most companies are relatively small, and even large ones like EA
 are very fragmented(although EA did produce its own version of the STL).

   Basically, this guy is a *rendering engineer*, likely good at math and
 algorithms, but not so hot with design.
Interesting to hear, thanks for sharing your perspective.

There's one thing I'd like to ask about though, not intending to argue, but just for clarification: You say the industry isn't likely to produce its own tools. While I'm in no position to disagree, I am surprised to hear that since the industry is known to produce some of its own middleware. EA is said to have a fairly sophisticated in-house UI authoring system, and of course they have Frostbite. Various studios have developed in-house engines, and many of the big-name ones (ex, Unreal Engine, Source, CryEngine) started out as in-house projects.

Would you say those are more exceptional cases, or did you mean something more specific by "tools"?
Jun 15 2014
next sibling parent "Burp" <brpy yahoo.com> writes:
C++'s lack of finally didn't do any favors for exception 
handling's popularity, either. (Has "finally" finally been 
added?)
Just noting: exceptions are rarely used in gamedev. Also, I agree with Bjarne that RAII is preferable to finally in the C++ context; finally makes more sense in a language with GC, for dealing with non-memory resources.
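For what it's worth, a minimal D sketch of the two styles being contrasted (the file name and usage are made up): cleanup tied to a scope versus cleanup written out by hand.

    import core.stdc.stdio : FILE, fopen, fclose;

    void raiiStyle()
    {
        auto f = fopen("data.txt", "r");
        if (f is null) return;
        scope(exit) fclose(f);   // cleanup bound to the scope, RAII-style
        // ... use f ...
    }

    void finallyStyle()
    {
        FILE* f = fopen("data.txt", "r");
        if (f is null) return;
        try
        {
            // ... use f ...
        }
        finally
        {
            fclose(f);           // cleanup spelled out explicitly
        }
    }

    void main()
    {
        raiiStyle();
        finallyStyle();
    }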
 You say the industry isn't likely to produce its own tools. 
 While I'm in no position to disagree, I am surprised to hear 
 that since the industry is known to produce some of its own 
 middleware. EA is said to have a fairly sophisticated in-house 
 UI authoring system, and of course they have Frostbite. Various 
 studios have developed in-house engines, and many of the 
 big-name ones (ex, Unreal Engine, Source, CryEngine) started 
 out as in-house projects.

 Would you say those are more exceptional cases, or did you mean 
 something more specific by "tools"?
Yeah, I should clarify. I'm not really speaking of middleware or engines; gamedev produces plenty of that, but always in bog-standard C or C++.

What I don't see is the game industry producing a programming language that would be adopted outside of the company that produced it, or assisting in the development of an existing one. Despite there being (guessing here) tens of thousands of professional C++ game developers, are any of them attending ISO C++ meetings? I doubt it. If a game studio does produce anything resembling a language and associated tools, it would very likely be proprietary.

- Take Epic: they created UnrealScript (barf), nobody else uses it, and Epic has abandoned it in UE4.
- Naughty Dog had a custom Lisp-based development system at one point; nobody else used it, and I believe they now use Racket to generate C++.
That is some *crazy*, impressive, *herculean*-effort stuff. 
CLEARLY, significant parts of the game industry genuinely 
understand the importance of investments into technology. And 
yet...all the complaining they do about C++ and they *still* 
won't write the language they want?
Some of this comes from the proprietary tooling they end up using on each platform. It is supplied by the platform owner. Language-wise, you get a C++ compiler, and not necessarily a very good one. Making a clean replacement for C++ isn't really enough. Any C++ replacement has to interop well with C++ because of the existing mountain of C++-based middleware, libraries, and engines.
Several *years* ago, I was under the impression that problem had 
finally been changing? Is that not so?
My experience is that it has changed for the better. I'm in the Western US though, and Manu is (I believe) in Australia. If a studio tried to make me crunch extra hours without pay I'd just refuse; finding a different job isn't that hard /shrug.
 I switched to web development, where I work roughly 9-5 for a 
 good
 salary, and I never looked back.
The state of California passed laws after the EA spouse case, so if you work in California or for a CA-based company they cannot legally make you work more than 40 hrs/week. Scummy places may try to get more hours out of you by applying peer pressure or some such crap, but they cannot legally do so.
Jun 16 2014
prev sibling parent reply Xavier Bigand <flamaros.xavier gmail.com> writes:
Le 16/06/2014 08:20, Nick Sabalausky a écrit :
 On 6/15/2014 4:26 PM, Burp wrote:
   I work in the game industry so I'm familiar with this type of mindset.
 Not everyone in my industry is like this, but unfortunately many are(I
 avoid working with them).

   He doesn't understand metaprogramming and so dismisses it. He also
 assumes C++ is all about Java style OOP, when modern style is wildly
 different from Java.

   And yes the game industry will likely *never* produce its own language
 or tools. Why? Because it is very short-term goal oriented, focusing
 almost entirely on the current project with little thought for long term
 growth. Most companies are relatively small, and even large ones like EA
 are very fragmented(although EA did produce its own version of the STL).

   Basically, this guy is a *rendering engineer*, likely good at math and
 algorithms, but not so hot with design.
Interesting to hear, thanks for sharing your perspective. There's one thing I'd like to ask about though, not intending to argue, but just for clarification: You say the industry isn't likely to produce its own tools. While I'm in no position to disagree, I am surprised to hear that since the industry is known to produce some of its own middleware. EA is said to have a fairly sophisticated in-house UI authoring system, and of course they have Frostbite. Various studios have developed in-house engines, and many of the big-name ones (ex, Unreal Engine, Source, CryEngine) started out as in-house projects. Would you say those are more exceptional cases, or did you mean something more specific by "tools"?
A language needs to be open, which is not the case for all middleware and game engines. Game companies very much like to keep their sources closed and not share anything... It's a pain for small video game companies; we can't get access to good articles... So everybody learns in their own little corner.
Jun 16 2014
parent "Paulo Pinto" <pjmlp progtools.org> writes:
On Monday, 16 June 2014 at 18:55:11 UTC, Xavier Bigand wrote:
 Le 16/06/2014 08:20, Nick Sabalausky a écrit :
 On 6/15/2014 4:26 PM, Burp wrote:
  I work in the game industry so I'm familiar with this type 
 of mindset.
 Not everyone in my industry is like this, but unfortunately 
 many are(I
 avoid working with them).

  He doesn't understand metaprogramming and so dismisses it. 
 He also
 assumes C++ is all about Java style OOP, when modern style is 
 wildly
 different from Java.

  And yes the game industry will likely *never* produce its 
 own language
 or tools. Why? Because it is very short-term goal oriented, 
 focusing
 almost entirely on the current project with little thought 
 for long term
 growth. Most companies are relatively small, and even large 
 ones like EA
 are very fragmented(although EA did produce its own version 
 of the STL).

  Basically, this guy is a *rendering engineer*, likely good 
 at math and
 algorithms, but not so hot with design.
Interesting to hear, thanks for sharing your perspective. There's one thing I'd like to ask about though, not intending to argue, but just for clarification: You say the industry isn't likely to produce its own tools. While I'm in no position to disagree, I am surprised to hear that since the industry is known to produce some of its own middleware. EA is said to have a fairly sophisticated in-house UI authoring system, and of course they have Frostbite. Various studios have developed in-house engines, and many of the big-name ones (ex, Unreal Engine, Source, CryEngine) started out as in-house projects. Would you say those are more exceptional cases, or did you mean something more specific by "tools"?
A language needs to be open, which is not the case for all middleware and game engines. Game companies very much like to keep their sources closed and not share anything... It's a pain for small video game companies; we can't get access to good articles... So everybody learns in their own little corner.
It is part of the culture. Those of us that grew up in Europe and got into computers in the mid-80s know the demoscene culture pretty well, which grew out of the games development culture.

The goal was to impress other sceners with how you managed to push the hardware to the limits, beyond what anyone thought was possible. Not sharing how you managed to do it was part of the implicit rules.

--
Paulo
Jun 16 2014
prev sibling next sibling parent reply "Sean Kelly" <sean invisibleduck.org> writes:
On Sunday, 15 June 2014 at 19:51:08 UTC, Walter Bright wrote:
 In fact, templates engender such an "OMG! Templates! I don't 
 get Templates!" aura about them that I convinced Andrei to not 
 even use the word "template" in his book about D!
That's precisely the reason I wrote a chapter on templates in Tango With D despite Don's suggestion that I talk about the far sexier CTFE. People have an unreasonable fear of templates and when you get down to it they're terribly simple to understand.
Jun 15 2014
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 6/15/2014 5:41 PM, Sean Kelly wrote:
 That's precisely the reason I wrote a chapter on templates in Tango With D
 despite Don's suggestion that I talk about the far sexier CTFE. People have an
 unreasonable fear of templates and when you get down to it they're terribly
 simple to understand.
Ever thought about doing an update to your book?
Jun 16 2014
parent "Sean Kelly" <sean invisibleduck.org> writes:
On Monday, 16 June 2014 at 07:08:05 UTC, Walter Bright wrote:
 On 6/15/2014 5:41 PM, Sean Kelly wrote:
 That's precisely the reason I wrote a chapter on templates in 
 Tango With D
 despite Don's suggestion that I talk about the far sexier  
 CTFE. People have an
 unreasonable fear of templates and when you get down to it  
 they're terribly
 simple to understand.
Ever thought about doing an update to your book?
I'm too far removed from Tango for that to be practical. Online writing makes more sense, but even that requires time I don't often seem to have these days. Baby steps :-)
Jun 16 2014
prev sibling next sibling parent reply "H. S. Teoh via Digitalmars-d" <digitalmars-d puremagic.com> writes:
On Sun, Jun 15, 2014 at 12:51:12PM -0700, Walter Bright via Digitalmars-d wrote:
 On 6/15/2014 6:50 AM, Peter Alexander wrote:
The fear of meta programming comes from Boost, and rightly so in
my opinion. Boost is written with the assumption that users will
never have to read its source code. When it comes to debugging
and performance tuning however, that assumption is shattered.
For years I avoided C++ templates (even though I implemented them in DMC++) because they were just so dang hard to read. D originally was not going to have templates for that reason. But I eventually discovered that hiding behind the complexity of C++ templates was a very simple idea - templates are just functions with compile time rather than run time arguments. (Isn't it amazing that I could implement C++ without figuring this out? I still don't understand that.) That was the enabling breakthrough for D templates.
I think you may have missed the fact that your very realization was a further development in itself. The term "template" comes from the C++ idea of having a pre-written piece of code with some blanks in a few places, that will be filled in to make the actual code. But the concept of "compile-time parameter" is something conceptually different, and more powerfully unifying IMO. It digs at the root of C++'s template nastiness, which ultimately comes from treating template parameters as something fundamentally distinct from function parameters. The ugly syntax is but the superficial expression of this underlying difference in conception. D's superior template syntax is not merely a better dressed syntax; it ultimately stems from treating template parameters as being the *same* thing as function parameters -- except they are evaluated at compile-time rather than runtime.

This insight therefore causes D's templates to mesh very nicely with CTFE to form a beautifully-integrated whole.
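To make that concrete, a small sketch (my own illustration, not from TDPL): the same ordinary D function can be evaluated at run time or, through CTFE, at compile time, and a template value parameter is just an argument supplied at compile time.

    ulong factorial(ulong n)
    {
        return n <= 1 ? 1 : n * factorial(n - 1);
    }

    // An eponymous enum template: 'n' is a compile-time parameter, and the
    // initializer is computed by CTFE at each instantiation.
    enum staticFactorial(ulong n) = factorial(n);

    void main()
    {
        import std.stdio : writeln;
        enum atCompileTime = factorial(10); // CTFE: evaluated by the compiler
        auto atRunTime = factorial(10);     // ordinary run-time call
        writeln(atCompileTime, " ", atRunTime, " ", staticFactorial!5);
    }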
 In fact, templates engender such an "OMG! Templates! I don't get
 Templates!" aura about them that I convinced Andrei to not even use
 the word "template" in his book about D!
[...]

I like how TDPL introduces templates by not introducing them at all, but just talks about compile-time arguments. By the time the reader gets to the chapter on templates, he's already been using them comfortably.

But on that note, perhaps it's not altogether a bad thing for the word "template" to have a negative connotation; perhaps we should be pushing the term "compile-time parameter" instead. It engenders an IMO superior way of thinking about these things that may help newcomers overcome the fear of metaprogramming.

T

--
MACINTOSH: Most Applications Crash, If Not, The Operating System Hangs
Jun 15 2014
next sibling parent Walter Bright <newshound2 digitalmars.com> writes:
On 6/15/2014 6:12 PM, H. S. Teoh via Digitalmars-d wrote:
 I think you may have missed the fact that your very realization was a
 further development in itself. The term "template" comes from the C++
 idea of having a pre-written piece of code with some blanks in a few
 places, that will be filled in to make the actual code. But the concept
 of "compile-time parameter" is something conceptually different, and
 more powerfully unifying IMO. It digs at the root of C++'s template
 nastiness, which ultimately comes from treating template parameters as
 something fundamentally distinct from function parameters. The ugly
 syntax is but the superficial expression of this underlying difference
 in conception. D's superior template syntax is not merely a better
 dressed syntax; it ultimately stems from treating template parameters as
 being the *same* thing as function parameters -- except they are
 evaluated at compile-time rather than runtime.

 This insight therefore causes D's templates to mesh very nicely with
 CTFE to form a beautifully-integrated whole.
I like the way you think. Can I subscribe to your newsletter? :-)
 I like how TDPL introduces templates by not introducing them at all, but
 just talks about compile-time arguments. By the time the reader gets to
 the chapter on templates, he's already been using them comfortably.
Our eevil plan at work!
 But on that note, perhaps it's not altogether a bad thing for the word
 "template" to have a negative connotation; perhaps we should be pushing
 the term "compile-time parameter" instead. It engenders an IMO superior
 way of thinking about these things that may help newcomers overcome the
 fear of metaprogramming.
!
Jun 16 2014
prev sibling next sibling parent Rikki Cattermole <alphaglosined gmail.com> writes:
On 16/06/2014 1:12 p.m., H. S. Teoh via Digitalmars-d wrote:
 On Sun, Jun 15, 2014 at 12:51:12PM -0700, Walter Bright via Digitalmars-d
wrote:
 On 6/15/2014 6:50 AM, Peter Alexander wrote:
 The fear of meta programming comes from Boost, and rightly so in
 my opinion. Boost is written with the assumption that users will
 never have to read its source code. When it comes to debugging
 and performance tuning however, that assumption is shattered.
For years I avoided C++ templates (even though I implemented them in DMC++) because they were just so dang hard to read. D originally was not going to have templates for that reason. But I eventually discovered that hiding behind the complexity of C++ templates was a very simple idea - templates are just functions with compile time rather than run time arguments. (Isn't it amazing that I could implement C++ without figuring this out? I still don't understand that.) That was the enabling breakthrough for D templates.
I think you may have missed the fact that your very realization was a further development in itself. The term "template" comes from the C++ idea of having a pre-written piece of code with some blanks in a few places, that will be filled in to make the actual code. But the concept of "compile-time parameter" is something conceptually different, and more powerfully unifying IMO. It digs at the root of C++'s template nastiness, which ultimately comes from treating template parameters as something fundamentally distinct from function parameters. The ugly syntax is but the superficial expression of this underlying difference in conception. D's superior template syntax is not merely a better dressed syntax; it ultimately stems from treating template parameters as being the *same* thing as function parameters -- except they are evaluated at compile-time rather than runtime. This insight therefore causes D's templates to mesh very nicely with CTFE to form a beautifully-integrated whole.
 In fact, templates engender such an "OMG! Templates! I don't get
 Templates!" aura about them that I convinced Andrei to not even use
 the word "template" in his book about D!
[...] I like how TDPL introduces templates by not introducing them at all, but just talks about compile-time arguments. By the time the reader gets to the chapter on templates, he's already been using them comfortably.
Personally, I put CTFE ahead of meta programming in D. I don't think we have quite cracked its true power yet. By the time you've used CTFE a bit, you almost beg for templated functions and classes.
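To give a flavour of what I mean (a made-up example, not taken from any library): CTFE lets you build data at compile time with perfectly ordinary code.

    // Plain D function, no special annotations needed for CTFE.
    int[256] makeSquares()
    {
        int[256] table;
        foreach (i; 0 .. 256)
            table[i] = i * i;
        return table;
    }

    // Initializing a module-level immutable forces the call through CTFE,
    // so the finished table is baked straight into the binary.
    immutable int[256] squares = makeSquares();

    void main()
    {
        assert(squares[12] == 144);
    }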
 But on that note, perhaps it's not altogether a bad thing for the word
 "template" to have a negative connotation; perhaps we should be pushing
 the term "compile-time parameter" instead. It engenders an IMO superior
 way of thinking about these things that may help newcomers overcome the
 fear of metaprogramming.


 T
Jun 16 2014
prev sibling parent reply Timon Gehr <timon.gehr gmx.ch> writes:
On 06/16/2014 03:12 AM, H. S. Teoh via Digitalmars-d wrote:
 This insight therefore causes D's templates to mesh very nicely with
 CTFE to form a beautifully-integrated whole.
I wouldn't go exactly that far. For one thing, CTFE cannot be used to manipulate types.
Jun 16 2014
parent reply Rikki Cattermole <alphaglosined gmail.com> writes:
On 16/06/2014 11:39 p.m., Timon Gehr wrote:
 On 06/16/2014 03:12 AM, H. S. Teoh via Digitalmars-d wrote:
 This insight therefore causes D's templates to mesh very nicely with
 CTFE to form a beautifully-integrated whole.
I wouldn't go exactly that far. For one thing, CTFE cannot be used to manipulate types.
I would go that far, when combining string mixins, templates and CTFE. Together they can produce a whole new set of code from a given set. Just look at dvorm and its Query class, which generates methods like: Query!Book edition_eq(ubyte)
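Roughly like this -- not dvorm's actual code, just a sketch of the general technique of generating one "<field>_eq" method per member via CTFE and a string mixin (all names here are invented):

    struct Book
    {
        string title;
        ubyte edition;
    }

    // CTFE helper that builds the method declarations as a single string.
    string generateEqMethods(T)()
    {
        string code;
        foreach (name; __traits(allMembers, T))
        {
            code ~= "ref Query!T " ~ name ~ "_eq(typeof(T." ~ name ~ ") value)"
                  ~ " { /* record the filter somewhere */ return this; }\n";
        }
        return code;
    }

    struct Query(T)
    {
        mixin(generateEqMethods!T());
    }

    void main()
    {
        Query!Book q;
        q.edition_eq(2).title_eq("Dune"); // methods that never appear in the source
    }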
Jun 16 2014
parent reply "Ola Fosheim =?UTF-8?B?R3LDuHN0YWQi?= writes:
On Monday, 16 June 2014 at 11:49:11 UTC, Rikki Cattermole wrote:
 I would go that far, when combining string mixins,
As far as I can tell string mixins have the same bad properties that macros have. They make automatic translation very difficult and make reasoning about code more difficult. They are a cheap and effective solution, but without any trace of beauty... A design blunder IMHO.
Jun 16 2014
next sibling parent reply "H. S. Teoh via Digitalmars-d" <digitalmars-d puremagic.com> writes:
On Mon, Jun 16, 2014 at 12:44:05PM +0000, via Digitalmars-d wrote:
 On Monday, 16 June 2014 at 11:49:11 UTC, Rikki Cattermole wrote:
I would go that far, when combining string mixins,
As far as I can tell string mixins have the same bad properties that macros have. It makes automatic translation very difficult and makes reasoning about code more difficult. It is a cheap and effective solution, but without any trace of beauty... A design blunder IMHO.
Actually, IIRC, string mixins were never designed to be nice -- they started as a kind of temporary last-resort kludge that got put in, in lieu of a true AST macro system, with the view that it would meet the current metaprogramming needs until the latter, ostensibly superior, solution came along. Unfortunately, AST macros never happened, and string mixins kinda took hold in the D codebase, so that's what we have now.

I would personally avoid using string mixins unless there's absolutely no other way to achieve what you want -- they're a kind of last-resort nuclear warhead that you don't bring out unless all the other guns fail to win the battle.

Having said that, though, proper use of string mixins with CTFE and templates ('scuse me, *compile-time arguments* ;)) can be extremely powerful, and one of the things that make D metaprogramming so awesome.

T

--
"The whole problem with the world is that fools and fanatics are always so certain of themselves, but wiser people so full of doubts." -- Bertrand Russell.
"How come he didn't put 'I think' at the end of it?" -- Anonymous
Jun 16 2014
next sibling parent reply "Ola Fosheim =?UTF-8?B?R3LDuHN0YWQi?= writes:
On Monday, 16 June 2014 at 15:07:08 UTC, H. S. Teoh via 
Digitalmars-d wrote:
 Having said that, though, proper use of string mixins with CTFE 
 and
 templates ('scuse me, *compile-time arguments* ;)) can be 
 extremely
 powerful, and one of the things that make D metaprogramming so 
 awesome.
Sure, just like m4 and cpp can be extremely powerful. Too powerful…
Jun 16 2014
next sibling parent Timon Gehr <timon.gehr gmx.ch> writes:
On 06/16/2014 05:18 PM, "Ola Fosheim Grøstad" 
<ola.fosheim.grostad+dlang gmail.com>" wrote:
 On Monday, 16 June 2014 at 15:07:08 UTC, H. S. Teoh via Digitalmars-d
 wrote:
 Having said that, though, proper use of string mixins with CTFE and
 templates ('scuse me, *compile-time arguments* ;)) can be extremely
 powerful, and one of the things that make D metaprogramming so awesome.
Sure, just like m4 and cpp can be extremely powerful. Too powerful…
I wouldn't go as far as comparing mixins to a text macro preprocessor either. At least they are integrated into the language.
Jun 16 2014
prev sibling parent Walter Bright <newshound2 digitalmars.com> writes:
On 6/16/2014 8:18 AM, "Ola Fosheim Grøstad" 
<ola.fosheim.grostad+dlang gmail.com>" wrote:
 Sure,  just like m4 and cpp can be extremely powerful. Too powerful…
One of the sins of cpp is it is not powerful enough, forcing a lot of awkward usages.
Jun 16 2014
prev sibling parent reply Rikki Cattermole <alphaglosined gmail.com> writes:
On 17/06/2014 3:05 a.m., H. S. Teoh via Digitalmars-d wrote:
 On Mon, Jun 16, 2014 at 12:44:05PM +0000, via Digitalmars-d wrote:
 On Monday, 16 June 2014 at 11:49:11 UTC, Rikki Cattermole wrote:
 I would go that far, when combining string mixins,
As far as I can tell string mixins have the same bad properties that macros have. It makes automatic translation very difficult and makes reasoning about code more difficult. It is a cheap and effective solution, but without any trace of beauty... A design blunder IMHO.
Actually, IIRC, string mixins were never designed to be nice -- they started as a kind of temporary last-resort kludge that got put in, in lieu of a true AST macro system, with the view that it would meet the current metaprogramming needs until the latter, ostensibly superior, solution came along. Unfortunately, AST macros never happened, and string mixins kinda took hold in the D codebase, so that's what we have now. I would personally avoid using string mixins unless there's absolutely no other way to achieve what you want -- they're a kind of last-resort nuclear warhead that you don't bring out unless all the other guns fail to win the battle.
I have a rule about string mixins which I believe to be a good one to follow: if using string mixins, don't expose them to client code unless it explicitly wants them. Basically what this means is, use something to wrap them, like a mixin template, but don't just say "hey, call this function and use it as a string mixin!" It's a little nicer.

Also, explicitly getting the string is important for debugging. Sometimes it's needed for the more complex cases, in edge cases or broken functionality. I use it during development (a simple pragma(msg) inside the mixin template, for example).

But in saying this, some of this could be handled by opDispatch. It's just a shame that both approaches currently aren't handled for auto-completion by any IDEs. I would expect the string mixin would be one day.
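A tiny sketch of what I mean (names invented for the example): the raw string generator stays an implementation detail, and clients only ever see the mixin template wrapper.

    string generateProperty(string type, string name)()
    {
        return "private " ~ type ~ " _" ~ name ~ ";\n"
             ~ type ~ " " ~ name ~ "() { return _" ~ name ~ "; }\n";
    }

    mixin template Property(string type, string name)
    {
        // pragma(msg, generateProperty!(type, name)()); // handy while debugging
        mixin(generateProperty!(type, name)());
    }

    struct Point
    {
        mixin Property!("int", "x");   // client code never touches the raw string
        mixin Property!("int", "y");
    }

    void main()
    {
        Point p;
        assert(p.x == 0 && p.y == 0);
    }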
 Having said that, though, proper use of string mixins with CTFE and
 templates ('scuse me, *compile-time arguments* ;)) can be extremely
 powerful, and one of the things that make D metaprogramming so awesome.


 T
Jun 16 2014
parent reply "H. S. Teoh via Digitalmars-d" <digitalmars-d puremagic.com> writes:
On Tue, Jun 17, 2014 at 04:16:43PM +1200, Rikki Cattermole via Digitalmars-d
wrote:
 On 17/06/2014 3:05 a.m., H. S. Teoh via Digitalmars-d wrote:
[...]
I would personally avoid using string mixins unless there's
absolutely no other way to achieve what you want -- they're a kind of
last-resort nuclear warhead that you don't bring out unless all the
other guns fail to win the battle.
I have a rule about string mixins which I believe to be a good one to follow: If using string mixins, don't expose it to client code unless they explicitly want it. Basically what this means is, use something to wrap it like a mixin template but don't just say hey call this function and use it as a string mixin! Its a little nicer. Also the explicit getting of the string is important for debugging. Sometimes its needed for the more complex cases in edge cases/broken functionality. I use it during development (a simple pragma msg inside the mixin template for example).
Hmm, you know what would be really nice? If there was a way to get at the string representation of the fully-expanded form of instantiated templates as the compiler sees them before handing them off to codegen. Well, OK, I know that's not how the compiler does it, but still, something equivalent to this would be very handy for debugging deeply-nested templates that currently would just spew walls of incomprehensible errors.
 But in saying this, some of this could be handled by opDispatch. Its
 just a shame that both approaches currently aren't handled for
 auto-completion by any IDE's. I would expect the string mixin would be
 one day.
[...]

String mixins? Auto-completion? I dunno, that sounds like a stretch to me. How would an IDE handle autocompletion for things like:

    string generateCode() {
        string code = "int x=";
        if (solveFermatsLastTheorem()) {
            code ~= "1";
        } else {
            code ~= "2";
        }
        code ~= ";";
        return code;
    }
    int func() {
        mixin(generateCode());
    }

?

T

--
Which is worse: ignorance or apathy? Who knows? Who cares? -- Erich Schubert
Jun 16 2014
next sibling parent Rikki Cattermole <alphaglosined gmail.com> writes:
On 17/06/2014 4:44 p.m., H. S. Teoh via Digitalmars-d wrote:
 On Tue, Jun 17, 2014 at 04:16:43PM +1200, Rikki Cattermole via Digitalmars-d
wrote:
 On 17/06/2014 3:05 a.m., H. S. Teoh via Digitalmars-d wrote:
[...]
 I would personally avoid using string mixins unless there's
 absolutely no other way to achieve what you want -- they're a kind of
 last-resort nuclear warhead that you don't bring out unless all the
 other guns fail to win the battle.
I have a rule about string mixins which I believe to be a good one to follow: If using string mixins, don't expose it to client code unless they explicitly want it. Basically what this means is, use something to wrap it like a mixin template but don't just say hey call this function and use it as a string mixin! Its a little nicer. Also the explicit getting of the string is important for debugging. Sometimes its needed for the more complex cases in edge cases/broken functionality. I use it during development (a simple pragma msg inside the mixin template for example).
Hmm, you know what would be really nice? If there was a way to get at the string representation of the fully-expanded form of instantiated templates as the compiler sees them before handing them off to codegen. Well, OK, I know that's not how the compiler does it, but still, something equivalent to this would be very handy for debugging deeply-nested templates that currently would just spew walls of incomprehensible errors.
A feature that I would love is the full code output post-CTFE, which is kinda what you're saying. It would be lovely to see just _exactly_ what is going into the binary and, more importantly, where! I would be very happy even if comments weren't there.
 But in saying this, some of this could be handled by opDispatch. Its
 just a shame that both approaches currently aren't handled for
 auto-completion by any IDE's. I would expect the string mixin would be
 one day.
[...] String mixins? Auto-completion? I dunno, that sounds like a stretch to me. How would an IDE handle autocompletion for things like like: string generateCode() { string code = "int x="; if (solveFermatsLastTheorem()) { code ~= "1"; } else { code ~= "2"; } code ~= ";"; return code; } int func() { mixin(generateCode()); } ?
I would assume a full frontend would be required for this.
 T
Jun 16 2014
prev sibling parent reply Jacob Carlborg <doob me.com> writes:
On 17/06/14 06:44, H. S. Teoh via Digitalmars-d wrote:

 String mixins? Auto-completion? I dunno, that sounds like a stretch to
 me. How would an IDE handle autocompletion for things like like:

 	string generateCode() {
 		string code = "int x=";
 		if (solveFermatsLastTheorem()) {
 			code ~= "1";
 		} else {
 			code ~= "2";
 		}
 		code ~= ";";
 		return code;
 	}
 	int func() {
 		mixin(generateCode());
 	}
That would require semantic analysis. Basically, evaluate the string mixin and autocomplete on the resulting code. -- /Jacob Carlborg
Jun 17 2014
parent reply "Ola Fosheim =?UTF-8?B?R3LDuHN0YWQi?= writes:
On Tuesday, 17 June 2014 at 08:23:08 UTC, Jacob Carlborg wrote:
 That would require semantic analysis. Basically evaluate the 
 string mixin and to autocomplete on the resulted code.
But consider something like gofix/dfix where you have to propagate changes back to the original prefix string. What do you do when the same prefix is used differently in two different mixin contexts? And even more importantly: how can you be certain that you discover all possible deprecated uses of a string-mixin'ed feature when you have the ability to do versioning? You either have to explore the "combinatorial explosion" of versioning constants or restrict the search to a fixed set.
Jun 17 2014
parent reply Nick Sabalausky <SeeWebsiteToContactMe semitwist.com> writes:
On 6/17/2014 4:43 AM, "Ola Fosheim Grøstad" 
<ola.fosheim.grostad+dlang gmail.com>" wrote:
 On Tuesday, 17 June 2014 at 08:23:08 UTC, Jacob Carlborg wrote:
 That would require semantic analysis. Basically evaluate the string
 mixin and to autocomplete on the resulted code.
But consider something like gofix/dfix where you have to propagate changes back to the original prefix string. What do you do when the same prefix is used differently in two different mixin contexts? And even more important: how can you be certain that you discover all possible deprecated uses of a string-mixin'ed feature when you have the ability to do versioning. You either have to explore the "combinatorial explosion" of versioning constants or restrict the search to a fixed set.
I think you're hitting on the fundamental limitations of automated code-updating tools here: They can't be treated as trusted black-boxes. They may very well be a handy tool, but by their very nature they're always going to need some degree of manual oversight, the amount and nature of which could vary depending on the exact update being attempted.
Jun 17 2014
parent reply "Ola Fosheim =?UTF-8?B?R3LDuHN0YWQi?= writes:
On Tuesday, 17 June 2014 at 09:17:21 UTC, Nick Sabalausky wrote:
 I think you're hitting on the fundamental limitations of 
 automated code-updating tools here: They can't be treated as 
 trusted black-boxes.
I don't think this is a fundamental limitation of tools, but a consequence of language design. I also think that features that make it difficult to write programs that analyze the semantics also make it difficult for humans to understand the code and verify its correctness.

Programming languages are in general still quite primitive (not specific to D); they still rely on convention rather than formalisms. Semaphores and macro-like features are pretty good examples where convention has been more convenient than supporting machine reasoning, but it has consequences when we demand better tooling, smarter compilers and faster code! Semaphores cannot be turned into transactional memory code paths… Macro-like features prevent high-level transforms and optimizations, source code restructuring/refactoring etc.
Jun 17 2014
next sibling parent reply "H. S. Teoh via Digitalmars-d" <digitalmars-d puremagic.com> writes:
On Tue, Jun 17, 2014 at 11:16:22AM +0000, via Digitalmars-d wrote:
 On Tuesday, 17 June 2014 at 09:17:21 UTC, Nick Sabalausky wrote:
I think you're hitting on the fundamental limitations of automated
code-updating tools here: They can't be treated as trusted
black-boxes.
I don't think this is a fundamental limitation of tools, but a consequence of language design. I also think that features that makes it difficult to write programs that analyze the semantics also makes it difficult for humans to understand the code and verify the correctness of the code. Programming languages are in general still quite primitive (not specific to D), they still rely on convention rather than formalisms. Semaphores and macro-like features are pretty good examples where convention has been more convenient than supporting machine reasoning, but it has consequences when we demand better tooling, smarter compilers and faster code! Semaphores cannot be turned into transactional memory code paths… Macro like features prevent high level transforms and optimizations, source code restructuring/refactoring etc.
I think you are underestimating the complexity of programming. Automated tools can only go so far -- ultimately, human intervention is needed for certain code transformations, and perhaps even that can only go so far, because programming is inherently complex! Turing-complete languages exhibit unsolvable problems like the halting problem (which even humans can't solve), besides also exhibiting intermediate intractable complexities like non-primitive-recursive functionality and the expression of NP-complete problems that are inherently irreducible to simpler constructs.

Granted, 90% of application code these days is nothing but trivial variations on trivial computational tasks, like sorting, shuffling data around, etc., so this part is easily automatable. But I think you're deceiving yourself if you think that automation is possible beyond the trivialities of programming.

T

--
If it tastes good, it's probably bad for you.
Jun 17 2014
parent reply "Ola Fosheim =?UTF-8?B?R3LDuHN0YWQi?= writes:
On Tuesday, 17 June 2014 at 16:08:18 UTC, H. S. Teoh via
Digitalmars-d wrote:
 I think you are underestimating the complexity of programming.
No need to go ad hominem. I don't underestimate anything. What makes you think so?
 Automated
 tools can only go so far -- ultimately, human intervention is 
 needed for
 certain code transformations, and perhaps even that can only go 
 so far,
 because programming is inherently complex!
No more complex than computer-assisted proof systems.
 Turing-complete languages
 exhibit unsolvable problems like the halting problem (that even 
 humans can't solve),
How does the halting problem relate to anything practical? It's a fun example of a proof where you reason about one infinite dimension being larger than another infinite dimension, but that is about all it provides.
 besides also exhibiting intermediate intractible
 complexities like non-primitive-recursive functionality and the
 expression of NP-complete problems that are inherently 
 irreducible to simpler constructs.
Most NP-complete problems are NP-complete because they are expressed as a decision problem (boolean function). The moment you take the quantifiable sub-problem and formulate it as something reasonable it often ceases to be NPC. You almost never want an optimal result, you want a good result that requires fewer resources. NPC is a funny construct, but reasoning about NP-hard problems has very little practical value and that's the only thing NPC is good for.

NPC has little to do with automated code translation. E.g. if you for some reason want to remove all "while(){}" loops you can easily replace them with "if(){do{}while()}".
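The kind of rewrite I mean is purely mechanical, e.g. (my own trivial example, the two functions are equivalent):

    // Original form.
    void countdown(int n)
    {
        while (n > 0)
        {
            n--;
        }
    }

    // Same behaviour with the while loop removed.
    void countdownRewritten(int n)
    {
        if (n > 0)
        {
            do
            {
                n--;
            } while (n > 0);
        }
    }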
 around, etc.. So this part is easily automatable. But I think 
 you're
 deceiving yourself if you think that automation is possible 
 beyond the
 trivialities of programming.
Gofix worked out ok in a macro-free environment. It cannot work in a macro-heavy environment. Of course, code transformation is easier in a pure functional language, but that does not change the fact that having a transformation-friendly imperative language is desirable.
Jun 17 2014
parent reply "H. S. Teoh via Digitalmars-d" <digitalmars-d puremagic.com> writes:
On Tue, Jun 17, 2014 at 04:50:07PM +0000, via Digitalmars-d wrote:
 On Tuesday, 17 June 2014 at 16:08:18 UTC, H. S. Teoh via
 Digitalmars-d wrote:
I think you are underestimating the complexity of programming.
No need to go ad hominem. I don't underestimate anything. What makes you think so?
I did not intend that as an ad hominem attack, I apologize if it came across that way. My point was just that automation, while desirable, isn't always possible, and that shouldn't be the only reason for rejecting certain language features.
Automated tools can only go so far -- ultimately, human intervention
is needed for certain code transformations, and perhaps even that can
only go so far, because programming is inherently complex!
No more complex then computer assisted proof systems.
Yes, and such proof systems, AFAIK, are limited to highly-constrained subsets of what programming languages today offer in general, because automated proving becomes intractable once you reach a certain point.
Turing-complete languages exhibit unsolvable problems like the
halting problem (that even humans can't solve),
How does the halting problem relate to anything practical? It's a fun example of a proof where you reason about one infinite dimension being larger than another infinite dimension, but that is about all it provides.
The halting problem is equivalent to Kolmogorov complexity, which in turn relates to optimal compression, which has applications in global optimization problems in compiler technology. Sure, the way some textbooks present it, it's just an obscure academic exercise, but it does have far-reaching implications.

But the point is, Turing-complete languages are capable of expressing problems in the spectrum that spans from trivial problems to something of the complexity of the unsolvable halting problem, and while that endpoint (i.e., the halting problem) itself may not be interesting in practical applications, the stuff that lies in between does, and it can take on any level of complexity up to the halting problem, so many of those problems are intractable to automate.
besides also exhibiting intermediate intractible complexities like
non-primitive-recursive functionality and the expression of
NP-complete problems that are inherently irreducible to simpler
constructs.
Most NP-complete problems are NP-complete because they are expressed as a decision problem (boolean function). The moment you take the quantifiable sub problem and formulate it as something reasonable it often seize to be NPC. You almost never want an optimal result, you want a good result that require less resources. NPC is a funny construct, but reasoning about NP-hard problems has very little practical value and that's the only thing NPC is good for.
That depends on your specific application. There are applications for which the whole point *is* to find the optimal solution. I don't think you can simply write off the whole thing as "unnecessary".
 NPC has little to do with automated code translation. E.g. if you for
 some reason want to remove all "while(){}" loops you can easily
 replace them with "if(){do{}while()}".
Some forms of code translation may very well be NP-complete, or worse. Like running global optimization on a set of mutually-recursive functions. And some instances of optimal register allocation, IIRC.
around, etc.. So this part is easily automatable. But I think you're
deceiving yourself if you think that automation is possible beyond
the trivialities of programming.
Gofix worked out ok in a macro-free environment. It cannot work in a macro-heavy environment. Of course, code transformations is easier in a pure functional language, but that does not change the fact that having a transformation-friendly imperative language is desirable.
True, but as I said, string mixins really should only be used as a last resort, so 99% of the time you don't need to deal with them anyway. So that shouldn't stop you from implementing automated code transformation on the subset of D that doesn't include string mixins -- and it will work just fine on code that doesn't use string mixins (or whatever else there is in the language that makes automation hard/impossible). There's no need to get rid of string mixins just because of that 1% of code that actually needs to use them. Nobody says that the transformation must work on 100% of all D programs, otherwise we can't have it at all.

It's the same thing as computing a "good enough" solution to an NP-complete problem that isn't necessarily globally optimal. I think it's "good enough" for automated code transformation to work on a subset of D that doesn't include string mixins -- there are use cases where string mixins are the cleanest solution, but nobody is saying that code transformation MUST work on those cases too.

T

--
Жил-был король когда-то, при нём блоха жила.
Jun 17 2014
parent reply "Ola Fosheim =?UTF-8?B?R3LDuHN0YWQi?= writes:
On Tuesday, 17 June 2014 at 17:19:25 UTC, H. S. Teoh via
Digitalmars-d wrote:
 The halting problem is equivalent to Kolgomorov complexity, 
 which in
 turn relates to optimal compression, which has applications in 
 global
 optimization problems in compiler technology. Sure, the way some
 textbooks present it, it's just an obscure academic exercise, 
 but it
 does have far-reaching implications.
I don't understand why, since in compiler optimization you almost always just aim for best effort.
 can take on any level of complexity up to the halting problem, 
 so many of them are intractible to automate.
Well, but in that case they are also intractable for human beings. Our brains can be simulated in finite space/time IMO.
 That depends on your specific application. There are 
 applications for
 which the whole point *is* to find the optimal solution. I 
 don't think
 you can simply write off the whole thing as "unnecessary".
In games, most certainly. In fact, the games AI has to be dumbed down and made emotionally interesting instead.
 Some forms of code translation may very well be NP-complete, or 
 worse.
Sure. But they are only NP-complete if you are dealing with an infinite dimension. The moment all dimensions are finite (bounded by a constant) the problem is most certainly P? Of course, P can be terrible in terms of performance. So it says little. What complexity is good for is aiding strategies when doing algorithm design. What it is not good for is preventing exploration of algorithms. You can often get good results for computationally intensive problems using randomization/stochastic strategies.
 Like running global optimization on a set of mutually-recursive
 functions. And some instances of optimal register allocation, 
 IIRC.
You almost never want optimal, you want to balance off resource usage with what you gain.
 There's no need to get rid of string mixins just because of 
 that 1% of
 code that actually needs to use them. Nobody says that the
 transformation must work on 100% of all D programs, otherwise 
 we can't
 have it at all.
The language should guarantee that you can detect deprecated features with 100% certainty. Otherwise you risk distributing template-libraries that don't work for certain configurations. If you want to reduce the deployed code size you might want to translate code into something that can be transferred fast to web-browsers. That means you have to be able to translate 100%. Of course, it is possible to avoid string mixins, but the same holds for #define(x…) in CPP.
Jun 17 2014
parent reply "H. S. Teoh via Digitalmars-d" <digitalmars-d puremagic.com> writes:
On Tue, Jun 17, 2014 at 06:08:57PM +0000, via Digitalmars-d wrote:
 On Tuesday, 17 June 2014 at 17:19:25 UTC, H. S. Teoh via
 Digitalmars-d wrote:
[...]
There's no need to get rid of string mixins just because of that 1%
of code that actually needs to use them. Nobody says that the
transformation must work on 100% of all D programs, otherwise we
can't have it at all.
The language should guarantee that you can detect deprecated features with 100% certainty. Otherwise you risk distributing template-libraries that don't work for certain configurations.
Unfortunately, CTFE makes this task impossible. Consider this:

    int ctfeFunc() {
        if (solveHaltingProblem()) {
            useDeprecatedFeature();
        } else {
            useNonDeprecatedFeature();
        }
    }
    enum x = ctfeFunc();

How would the compiler (or any tool!) detect the use (or non-use) of deprecated features here?
 If you want to reduce the deployed code size you might want to
 translate code into something that can be transferred fast to
 web-browsers. That means you have to be able to translate 100%.
If you want it to run in a browser, and the browser doesn't support certain language features, then you'll just have to restrict your code to the subset of the language that's implementable in a browser, no?
 Of course, it is possible to avoid string mixins, but the same holds
 for #define(x…) in CPP.
I think that's a gross exaggeration. #define is ubiquitous in C/C++: you can hardly find any non-trivial program that doesn't depend on it, because the language doesn't provide a way to express certain things otherwise. In D, however, we have version, static if, and a whole bunch of other niceties that make string mixins unnecessary in 99% of cases. String mixins are MUCH easier to avoid in D than #define's are in C/C++.

T

--
Notwithstanding the eloquent discontent that you have just respectfully expressed at length against my verbal capabilities, I am afraid that I must unfortunately bring it to your attention that I am, in fact, NOT verbose.
Jun 17 2014
parent reply "Ola Fosheim =?UTF-8?B?R3LDuHN0YWQi?= writes:
On Tuesday, 17 June 2014 at 18:24:22 UTC, H. S. Teoh via 
Digitalmars-d wrote:
 How would the compiler (or any tool!) detect the use (or 
 non-use) of
 deprecated features here?
You compile it or detect AST-node presence.
 If you want it to run in a browser, and the browser doesn't 
 support
 certain language features, then you'll just have to restrict 
 your code
 to the subset of the language that's implementable in a 
 browser, no?
And avoid standard libraries?

Javascript supports eval() so it can support string mixins in theory, even at runtime, but if you cannot easily translate the mixin content then you have a challenge.

Not that D is meant for web browsers, but my point is more that macro-like features have consequences that go beyond the compiler internals.
Jun 17 2014
parent reply "H. S. Teoh via Digitalmars-d" <digitalmars-d puremagic.com> writes:
On Tue, Jun 17, 2014 at 06:58:27PM +0000, via Digitalmars-d wrote:
 On Tuesday, 17 June 2014 at 18:24:22 UTC, H. S. Teoh via Digitalmars-d
 wrote:
How would the compiler (or any tool!) detect the use (or non-use) of
deprecated features here?
You compile it or detect AST-node presence.
You can also compile a string mixin to detect if it uses deprecated features, no?

But that's missing my point. My point was that CTFE makes automated detection of deprecated features (or any property of a particular piece of code, really) a rather tricky proposition. For example:

    static if (longComplicatedComputation())
        badFeature();
    else
        goodFeature();

What if badFeature() is never compiled because longComplicatedComputation() always returns 0? But the compiler may not be able to statically prove that this is always the case -- in the worst case, it's tantamount to solving the halting problem.

Plus, attempting CTFE in real-time inside an IDE may not be a good idea -- a longish compile-time computation (say 10 seconds long) may be OK for batch compilation, but it's not acceptable for an IDE to pause 10 seconds every time you browse that piece of code just because the IDE needs to compute whether or not it should highlight badFeature() as a deprecated feature.

Also, detecting AST node presence is unreliable. What if it's needed for compatibility with older compilers?

    static if (__VERSION__ < 2064L)
        useDeprecatedFeature();
    else
        useNewFeatureNotIn2064();

This piece of code may be absolutely fine because both older and newer compilers will accept it. But AST node presence would flag the code as problematic because of useDeprecatedFeature. And if you ignore that branch by assuming __VERSION__ == the latest compiler version, then you are no longer validating *all* branches of the code, which you stated in your previous post was an important goal.

Basically, metaprogramming does come with a price -- some things may become difficult/impractical to automate. But it also enables use cases that a language without any metaprogramming features would be unable to handle (or would require much uglier ways to implement).
If you want it to run in a browser, and the browser doesn't support
certain language features, then you'll just have to restrict your code
to the subset of the language that's implementable in a browser, no?
And avoid standard libraries?
AFAIK, Phobos doesn't heavily use string mixins, does it?
 Javascript supports eval() so it can support string mixins in theory,
 even at runtime, but if you cannot easily translate the mixin content
 then you have a challenge.
 
 Not that D is meant for web browsers, but my point is more that
 macro-like features has consequences that go beyond the compiler
 internals.
I'm certainly not saying that string mixins don't have consequences beyond compiler internals. In fact, I don't particularly like them myself, but they do fill a gap that would otherwise be in the language for certain cases of metaprogramming. The original intention, AFAICT, was to replace string mixins with AST macros, but since the latter never materialized, string mixins are what we're left with.

In any case, they're relatively rarely used, so I don't see them as the big problem that you seem to consider them to be.

T

-- 
Ignorance is bliss... until you suffer the consequences!
Jun 17 2014
parent "Ola Fosheim =?UTF-8?B?R3LDuHN0YWQi?= writes:
On Tuesday, 17 June 2014 at 19:24:54 UTC, H. S. Teoh via 
Digitalmars-d wrote:
 You can also compile a string mixin to detect if it uses 
 deprecated features, no?
Not if it depends on configuration.
 Also, detecting AST node presence is unreliable. What if it's 
 needed for
 compatibility with older compilers?

 	static if (__VERSION__ < 2064L)
 		useDeprecatedFeature();
 	else
 		useNewFeatureNotIn2064();
I don't think that should be legal, that is macro-like. The syntax should follow the language spec through and through.
 In any case, they're relatively rarely used, so I don't see 
 them as the
 big problem that you seem to consider them to be.
They are a big problem for gofix/dfix. I don't like them and I don't use them, so they are not a problem for me… but like "pure", they are a feature that counts against the language design.
Jun 17 2014
prev sibling parent reply Timon Gehr <timon.gehr gmx.ch> writes:
On 06/17/2014 01:16 PM, "Ola Fosheim Grøstad" 
<ola.fosheim.grostad+dlang gmail.com>" wrote:
 On Tuesday, 17 June 2014 at 09:17:21 UTC, Nick Sabalausky wrote:
 I think you're hitting on the fundamental limitations of automated
 code-updating tools here: They can't be treated as trusted black-boxes.
I don't think this is a fundamental limitation of tools, but a consequence of language design. I also think that features that make it difficult to write programs that analyze the semantics also make it difficult for humans to understand the code and verify its correctness.

Programming languages are in general still quite primitive (not specific to D); they still rely on convention rather than formalisms.
...
That's a very odd statement to make about programming languages in general.
Jun 17 2014
parent reply "Ola Fosheim =?UTF-8?B?R3LDuHN0YWQi?= writes:
On Tuesday, 17 June 2014 at 16:34:23 UTC, Timon Gehr wrote:
 On 06/17/2014 01:16 PM, "Ola Fosheim Grøstad" 
 <ola.fosheim.grostad+dlang gmail.com>" wrote:
 Programming languages are in general still quite primitive 
 (not specific
 to D), they still rely on convention rather than formalisms.
 ...
That's a very odd statement to make about programming languages in general.
This is in the context of imperative languages that are used for writing the majority of deployed applications. Do you disagree?
Jun 17 2014
parent reply Timon Gehr <timon.gehr gmx.ch> writes:
On 06/17/2014 06:53 PM, "Ola Fosheim Grøstad" 
<ola.fosheim.grostad+dlang gmail.com>" wrote:
 On Tuesday, 17 June 2014 at 16:34:23 UTC, Timon Gehr wrote:
 On 06/17/2014 01:16 PM, "Ola Fosheim Grøstad"
 <ola.fosheim.grostad+dlang gmail.com>" wrote:
 Programming languages are in general still quite primitive (not specific
 to D), they still rely on convention rather than formalisms.
 ...
That's a very odd statement to make about programming languages in general.
This is in the context of imperative languages that are used for writing the majority of deployed applications. Do you disagree?
If you are only talking about those languages, not at all.
Jun 17 2014
parent "Ola Fosheim =?UTF-8?B?R3LDuHN0YWQi?= writes:
On Tuesday, 17 June 2014 at 17:03:34 UTC, Timon Gehr wrote:
 If you are only talking about those languages, not at all.
Yes, I was only talking about the ones that are suitable for creating commercial games given the topic of the thread. (Languages that are based on Horn-clauses and the like are in a different league).
Jun 17 2014
prev sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 6/16/2014 5:44 AM, "Ola Fosheim Grøstad" 
<ola.fosheim.grostad+dlang gmail.com>" wrote:
 As far as I can tell string mixins have the same bad properties that macros
 have.
Assuming you are talking about C macros:

Having implemented the C preprocessor (multiple times) and make's macro system, and designed and implemented ABEL's macro system, Ddoc's macro system, and string mixins, I can unequivocally object to that opinion!

The C macro system is awesome in its awfulness. Let me count the ways:

1. Probably < 10 people in the world actually understand it all the way down. It is absurdly complex for what little it does. Paul Mensonidas is usually acknowledged as the "world's leading expert on the C preprocessor." Why would a freakin' macro processor even have an ecological niche for a world-leading expert on it? The mind boggles.

2. C preprocessor tokens are not the same thing as C tokens!

3. The "phases of translation" of the preprocessor guarantee slow compilation.

4. The way it works makes any sort of parallel processing of source code impossible. It must be done serially.

5. There is no scoping of names of any sort - no hygiene whatsoever. Any #include file can trash any subsequent, unrelated, #include file in any way.

6. No syntax highlighter can work entirely correctly without having a full preprocessor.

7. Because of the preprocessor, valid C code need not look remotely like C code according to the C grammar.

8. There are no looping macros, no CAR/CDR capabilities (Ddoc has the latter).

So there!
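As a counterpoint to item 5, a minimal D sketch (made-up names, purely illustrative) of how D's declarative metaprogramming stays scoped where a #define cannot:

    // Unlike a #define, a mixin template only injects symbols where it is
    // explicitly mixed in; nothing leaks into unrelated scopes.
    mixin template Counter()
    {
        int count;
        void increment() { ++count; }
    }

    struct Widget
    {
        mixin Counter;   // count/increment exist only inside Widget
    }

    void main()
    {
        Widget w;
        w.increment();
        assert(w.count == 1);
        // At module scope there is no `count` and no `increment`, so an
        // unrelated import cannot be trashed the way an #include can.
    }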
Jun 16 2014
parent "Ola Fosheim =?UTF-8?B?R3LDuHN0YWQi?= writes:
On Tuesday, 17 June 2014 at 03:08:48 UTC, Walter Bright wrote:
 Assuming you are talking about C macros:
I was talking about macros in general. :-)
 expert on the C preprocessor." Why would a freakin' macro 
 processor even have an ecological niche for a world leading 
 expert on it? The mind boggles.
You could say the same about Turing-machines, Lisp, template programming and propositional calculus? The world of computers is mind boggling!
 6. Any syntax highlighter cannot work entirely correctly 
 without having a full preprocessor.
This is the point I was aiming at. Automatic translation becomes more difficult if you cannot deal with "meaningful units" at the parsing level. Take for instance gofix/dfix. How on earth are you going to detect a deprecated feature in a string mixin and replace it with a new construct? It might be possible in some cases, but you actually have to explore all versioning possibilities, meaning an exhaustive search. That sounds veeery challenging!
 8. There are no looping macros, no CAR/CDR capabilities (Ddoc 
 has the latter).

 So there!
So it only goes to 8? Then CPP can't be all that loud.
Jun 17 2014
prev sibling parent reply Manu via Digitalmars-d <digitalmars-d puremagic.com> writes:
On 16 June 2014 11:12, H. S. Teoh via Digitalmars-d
<digitalmars-d puremagic.com> wrote:
 On Sun, Jun 15, 2014 at 12:51:12PM -0700, Walter Bright via Digitalmars-d
wrote:
 On 6/15/2014 6:50 AM, Peter Alexander wrote:
The fear of meta programming comes from Boost, and rightly so in
my opinion. Boost is written with the assumption that users will
never have to read its source code. When it comes to debugging
and performance tuning however, that assumption is shattered.
For years I avoided C++ templates (even though I implemented them in DMC++) because they were just so dang hard to read. D originally was not going to have templates for that reason. But I eventually discovered that hiding behind the complexity of C++ templates was a very simple idea - templates are just functions with compile time rather than run time arguments. (Isn't it amazing that I could implement C++ without figuring this out? I still don't understand that.) That was the enabling breakthrough for D templates.
I think you may have missed the fact that your very realization was a further development in itself. The term "template" comes from the C++ idea of having a pre-written piece of code with some blanks in a few places, that will be filled in to make the actual code. But the concept of "compile-time parameter" is something conceptually different, and more powerfully unifying IMO. It digs at the root of C++'s template nastiness, which ultimately comes from treating template parameters as something fundamentally distinct from function parameters. The ugly syntax is but the superficial expression of this underlying difference in conception. D's superior template syntax is not merely a better dressed syntax; it ultimately stems from treating template parameters as being the *same* thing as function parameters -- except they are evaluated at compile-time rather than runtime. This insight therefore causes D's templates to mesh very nicely with CTFE to form a beautifully-integrated whole.
 In fact, templates engender such an "OMG! Templates! I don't get
 Templates!" aura about them that I convinced Andrei to not even use
 the word "template" in his book about D!
[...] I like how TDPL introduces templates by not introducing them at all, but just talks about compile-time arguments. By the time the reader gets to the chapter on templates, he's already been using them comfortably. But on that note, perhaps it's not altogether a bad thing for the word "template" to have a negative connotation; perhaps we should be pushing the term "compile-time parameter" instead. It engenders an IMO superior way of thinking about these things that may help newcomers overcome the fear of metaprogramming.
Good points. One thing I do think needs more emphasis though, is the true costs of 'compile time parameters'. People tend to only consider runtime performance, and rarely consider it in relation to binary bloat, or more subtle forms of performance impact, like icache misses, which are more difficult to measure, and rarely express themselves in benchmark environments.
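To make the trade-off concrete, a minimal sketch (my own illustration, not from the thread) of the same function with a compile-time and with a runtime parameter; each distinct compile-time argument produces its own instantiation in the binary, which is where the bloat comes from:

    import std.stdio;

    // Compile-time parameter: one copy of the machine code per distinct
    // value of `scale` used anywhere in the program.
    int scaledCT(int scale)(int x)
    {
        return x * scale;
    }

    // Runtime parameter: a single copy serves all callers.
    int scaledRT(int x, int scale)
    {
        return x * scale;
    }

    void main()
    {
        // Three instantiations end up in the binary: scaledCT!2, !3 and !4.
        writeln(scaledCT!2(10), " ", scaledCT!3(10), " ", scaledCT!4(10));

        // One function handles every case, at the cost of passing `scale`
        // (and possibly branching on it) at runtime.
        writeln(scaledRT(10, 2), " ", scaledRT(10, 3), " ", scaledRT(10, 4));
    }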
Jun 15 2014
parent reply "Burp" <brpy yahoo.com> writes:
Gamedev is about solutions being given (MSVC, closed console 
platform
tools, etc), and they are just waiting for the package to appear.
It's not entirely unreasonable either. Most people probably don't
realise how high-stress and unfair the gamedev industry is when 
it
comes to engineers time and commitment to their work.
To say that they literally have no time to spend on 
extra-curricular
projects is an understatement, and risk-aversion is a key form of
self-defence. I know many gamedev's who are frequently expected 
to
choose between their life and families, or their jobs.
There are some gamedev jobs that aren't like this. My last one was 40 hrs/week and never a minute more.
 One thing I do think needs more emphasis though, is the true 
 costs of
 'compile time parameters'.
 People tend to only consider runtime performance, and rarely 
 consider
 it in relation to binary bloat, or more subtle forms of 
 performance
 impact, like icache misses, which are more difficult to 
 measure, and
 rarely express themselves in benchmark environments.
I see gamedevs use this as an excuse too much; it is mostly bunk. They would have manually written it N times anyway, or they simply put code into a template that didn't need to be.

Any way to make D attractive to game development?

1. VisualD needs to be on par with Visual Assist
2. D needs to figure out what the hell it is doing with GC/RC/Memory
3. Target all common platforms
4. Allow for C++ and D to call each other without requiring a C interop layer (not going to happen but would help immensely)
Jun 15 2014
next sibling parent Manu via Digitalmars-d <digitalmars-d puremagic.com> writes:
On 16 June 2014 12:44, Burp via Digitalmars-d
<digitalmars-d puremagic.com> wrote:
 Gamedev is about solutions being given (MSVC, closed console platform
 tools, etc), and they are just waiting for the package to appear.
 It's not entirely unreasonable either. Most people probably don't
 realise how high-stress and unfair the gamedev industry is when it
 comes to engineers time and commitment to their work.
 To say that they literally have no time to spend on extra-curricular
 projects is an understatement, and risk-aversion is a key form of
 self-defence. I know many gamedev's who are frequently expected to
 choose between their life and families, or their jobs.
There are some gamedev jobs that aren't like this. My last one was 40 hrs/week and never a minute more.
I think it's the exception though, not the rule. And did that make the employees tend towards being OSS enthusiasts, or did the proprietary nature of the industry still shield them from OSS on average? You're posting here now, that could be taken as support for my theory :)
 One thing I do think needs more emphasis though, is the true costs of
 'compile time parameters'.
 People tend to only consider runtime performance, and rarely consider
 it in relation to binary bloat, or more subtle forms of performance
 impact, like icache misses, which are more difficult to measure, and
 rarely express themselves in benchmark environments.
I see gamedev's use this as an excuse too much, it is mostly bunk. They would have manually written it N times anyway, or they simply put code into a template that didn't need to be.
Maybe. But often not. I regularly balance a runtime branch vs a template. Binary size is still a significant concern on many platforms (phones, Nintendo platforms, handhelds).
  Anyway to make D attractive to game development?
 1. VisualD needs to be on par with Visual Assist
 2. D needs to figure out what the hell it is doing with GC/RC/Memory
 3. Target all common platforms
 4. Allow for C++ and D to call each other without requiring a C interop
 layer.(not going to happen but would help immensely)
Are you deliberately repeating my list from before, or are these your independent findings too? :) I don't find 4 to be a significant problem in practice. And it will erode in relevance as a commitment is made and more code transitions to D.
Jun 15 2014
prev sibling next sibling parent "c0de517e" <kenpex tin.it> writes:
  Anyway to make D attractive to game development?
 1. VisualD needs to be on par with Visual Assist
 2. D needs to figure out what the hell it is doing with 
 GC/RC/Memory
 3. Target all common platforms
 4. Allow for C++ and D to call each other without requiring a C 
 interop layer.(not going to happen but would help immensely)
3. should probably be done with a backend that compiles to C. It's not the default backend anybody would ever use, but it would give studios the peace of mind of not losing their investment when moving to a new platform. We deal with new, proprietary platforms all the time, and no other backend will ever be ported in time to ship a title on a "nextgen" console otherwise.

1. and 4. are needed if you want a general C++ replacement, I agree, but if instead of being a general C++ replacement it started to conquer a niche where it can prove to be incredibly useful, we could tolerate even 2. That's what I tried to say in the blog post: we use Lua, and Lua has a horrible GC (well, the GC is not really the problem, but that everything needs allocations), which requires workarounds to be in a shippable state, yet we do use Lua a lot across the industry, because it's the best we can find for livecoding.

Another example could be ISPC: it serves a niche, but it's really useful in that one, and we might consider integrating it for tight numerical kernels. It's a small section of code where a new language could start insinuating itself...
Jun 17 2014
prev sibling parent "Daniel Murphy" <yebbliesnospam gmail.com> writes:
"Burp"  wrote in message news:dcykcbonpududgkdritc forum.dlang.org...

 4. Allow for C++ and D to call each other without requiring a C interop 
 layer.(not going to happen but would help immensely)
What exactly are you looking for here? D currently supports quite a bit of direct C++ interop, and while it's not as complete as the C interop it is quite usable.
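For reference, a minimal sketch of the kind of direct interop meant here (hypothetical function name; the C++ side is compiled separately and linked in, with no C shim in between):

    // C++ side, built with a C++ compiler:
    //
    //     int addSquares(int a, int b) { return a*a + b*b; }
    //
    // D side: the extern(C++) declaration links directly against the
    // mangled C++ symbol -- no extern(C) wrapper needed.
    extern (C++) int addSquares(int a, int b);

    void main()
    {
        import std.stdio : writeln;
        writeln(addSquares(3, 4));   // prints 25
    }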
Jun 17 2014
prev sibling parent reply "c0de517e" <kenpex tin.it> writes:
On Sunday, 15 June 2014 at 13:50:10 UTC, Peter Alexander wrote:
 On Sunday, 15 June 2014 at 11:45:30 UTC, Dicebot wrote:
 I like how he says that productivity is important and mentions 
 fear of meta-programming in the same article ;)
That's true, but meta programming is just a tool. Would be nice to implement dynamic visualisation and interactivity with it though. The fear of meta programming comes from Boost, and rightly so in my opinion. Boost is written with the assumption that users will never have to read its source code. When it comes to debugging and performance tuning however, that assumption is shattered. Fortunately, D makes meta programming more simple, but it's something to keep in mind.
 Interesting though, I had totally different set of demands and 
 expectation when used to work with C/C++. Feels like industry 
 matters much more than a language here.
Absolutely. I'm beginning to learn of these differences since leaving gamedev :-)
Boost is horrible, but even the STL proved to be quite problematic.

Visualization would be a great tool. It's quite surprising, if you think about it, that we can't in any mainstream debugger just graph the state of objects over time, create UIs and so on.

I recently did write a tiny program that does live inspection of memory areas as bitmaps. I needed that to debug image algorithms, so it's quite specialized... but once you do stuff like that it comes naturally to think that we should have the ability to script visualizers in debuggers and have them update continuously at runtime: http://c0de517e.blogspot.ca/2013/05/peeknpoke.html
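The idea is simple enough to sketch; the following is an illustration of the technique, not the tool from the post -- dump a block of memory as a greyscale PGM so that any image viewer becomes a poor man's visualizer:

    import std.stdio;

    // Write a block of bytes as an 8-bit greyscale PGM image of the given
    // width, so it can be eyeballed in any image viewer.
    void dumpAsPgm(const(ubyte)[] bytes, size_t width, string path)
    {
        auto f = File(path, "wb");
        immutable height = bytes.length / width;
        f.writef("P5\n%s %s\n255\n", width, height);
        f.rawWrite(bytes[0 .. width * height]);
    }

    void main()
    {
        // Stand-in for a real buffer under inspection.
        auto fake = new ubyte[256 * 256];
        foreach (i, ref b; fake)
            b = cast(ubyte) i;
        dumpAsPgm(fake, 256, "dump.pgm");
    }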
Jun 17 2014
parent reply "Joakim" <dlang joakim.airpost.net> writes:
On Tuesday, 17 June 2014 at 22:24:06 UTC, c0de517e wrote:
 Visualization would be a great tool, it's quite surprising if 
 you think about it that we can't in any mainstream debugger 
 just graph over time the state of objects, create UIs and so on.

 I recently did write a tiny program that does live inspection 
 of memory areas as bitmaps, I needed that to debug image 
 algorithms so it's quite specialized... but once you do stuff 
 like that it comes natural to think that we should have the 
 ability of scripting visualizers in debuggers and have them 
 update continuously in runtime 
 http://c0de517e.blogspot.ca/2013/05/peeknpoke.html
Man, I expressed similar thoughts years ago. Software pumps data in, operates on it, and pumps new data out: why don't we have proper visualization tools for those data flows? Only being able to freeze program state and inspect it at repeated snapshots in time with a debugger is so backwards: it's like we're still stuck in the '80s.

Then, right after I see you mention it too, I happen to run across a recent lldb frontend for OSX/iOS - he gave up on Android ;) - that does exactly that: https://github.com/meeloo/xspray
Jun 18 2014
parent reply Nick Sabalausky <SeeWebsiteToContactMe semitwist.com> writes:
On 6/18/2014 5:39 PM, Joakim wrote:
 Software pumps data in,
 operates on it, and pumps new data out: why don't we have proper
 visualization tools for those data flows?  Only being able to freeze
 program state and inspect it at repeated snapshots in time with a
 debugger is so backwards:
That's why I inadvertently learned to love printf debugging. I get to see the whole "chart" at once. Granted, it's in a bit of a "The Matrix"-style "only comprehensible if you know what you're looking at" kind of way. Actual GUI graphs would certainly be nice. But all the data's there at once, so no need for constant fast-forwarding and rewindi...oh wait, that's right, debuggers can't rewind either. ;)

Honestly, I *have* used and loved debuggers, and I still appreciate them. I do think they're great tools. But...I rarely use them anymore: After several years of being forced into printf-debugging (or worse!!) for various reasons, every time I go back to a debugger I feel like I'm debugging with my hands tied behind my back. Or rather, finding a needle in a haystack using only a microscope that's stuck on max magnification and can only ever move to the right. And it's exactly because of the debugger's "temporal blinders" - the inability to ever see more than one *instant* at a time.
Jun 18 2014
next sibling parent "Kapps" <opantm2+spam gmail.com> writes:
On Thursday, 19 June 2014 at 05:35:06 UTC, Nick Sabalausky wrote:
 That's why I inadvertently learned to love printf debugging. I 
 get to see the whole "chart" at one. Granted, it's in a bit of 
 a "The Matrix"-style "only comprehensible if you know what 
 you're looking at" kind of way. Actual GUI graphs would 
 certainly be nice. But all the data's there at once,  so no 
 need for constant fast-fowarding and rewindi...oh wait, that's 
 right, debuggers can't rewind either. ;)

 Honestly, I *have* used and loved debuggers, and I still 
 appreciate them. I do think they're great tools. But...I rarely 
 use them anymore: After several years of being forced into 
 printf-debugging (or worse!!) for various reasons, every time I 
 go back to a debugger I feel like I'm debugging with my hands 
 tied behind my back. Or rather, finding a needle in a haystack 
 using only a microscope that's stuck on max magnification and 
 can only ever move to the right. And it's exactly because of 
 the debugger's "temporal blinders" - the inability to ever see 
 more than one *instant* at a time.
There's a time for both. Being able to step into each method with a debugger, execute code, inspect variables, etc., is very very useful in certain situations. However, in some situations (particularly multi-threaded ones, I find), printf debugging is simply easier as you don't have to stop your program to examine state. Visual Studio 2012 has IntelliTrace, which in theory could be promising for these situations, but in reality I've never even tried it.
Jun 18 2014
prev sibling next sibling parent "deadalnix" <deadalnix gmail.com> writes:
On Thursday, 19 June 2014 at 05:35:06 UTC, Nick Sabalausky wrote:
 That's why I inadvertently learned to love printf debugging. I 
 get to see the whole "chart" at one. Granted, it's in a bit of 
 a "The Matrix"-style "only comprehensible if you know what 
 you're looking at" kind of way. Actual GUI graphs would 
 certainly be nice. But all the data's there at once,  so no 
 need for constant fast-fowarding and rewindi...oh wait, that's 
 right, debuggers can't rewind either. ;)
use it as it sounded like black magic to me.
Jun 18 2014
prev sibling next sibling parent "Kagamin" <spam here.lot> writes:
On Thursday, 19 June 2014 at 05:35:06 UTC, Nick Sabalausky wrote:
 That's why I inadvertently learned to love printf debugging. I 
 get to see the whole "chart" at one. Granted, it's in a bit of 
 a "The Matrix"-style "only comprehensible if you know what 
 you're looking at" kind of way. Actual GUI graphs would 
 certainly be nice. But all the data's there at once,  so no 
 need for constant fast-fowarding and rewindi...oh wait, that's 
 right, debuggers can't rewind either. ;)
The .NET debugger can arbitrarily move the instruction pointer. It's not really an unwind (for a true unwind you need a tracing debugger, and those are commercial because they are so advanced), more like a sudden goto: it doesn't unwind memory, but in practice there are many functions which can be rerun this way.
Jun 19 2014
prev sibling next sibling parent reply "Wyatt" <wyatt.epp gmail.com> writes:
On Thursday, 19 June 2014 at 05:35:06 UTC, Nick Sabalausky wrote:
 certainly be nice. But all the data's there at once,  so no 
 need for constant fast-fowarding and rewindi...oh wait, that's 
 right, debuggers can't rewind either. ;)
Oh?

https://www.gnu.org/software/gdb/news/reversible.html
http://rr-project.org/

Debuggers, like most aspects of the C tooling ecosystem, have lain stagnant for a long time, but it's not for lack of enhancement opportunities. I think this is starting to change since LLVM has forced everyone to shake the rust off.

-Wyatt
Jun 19 2014
next sibling parent "Paulo Pinto" <pjmlp progtools.org> writes:
On Thursday, 19 June 2014 at 19:22:23 UTC, Wyatt wrote:
 On Thursday, 19 June 2014 at 05:35:06 UTC, Nick Sabalausky 
 wrote:
 certainly be nice. But all the data's there at once,  so no 
 need for constant fast-fowarding and rewindi...oh wait, that's 
 right, debuggers can't rewind either. ;)
Oh? https://www.gnu.org/software/gdb/news/reversible.html http://rr-project.org/ Debuggers, like most aspects of the C tooling ecosystem, have lain stagnant for a long time, but its not for lack of enhancement opportunities. I think this is starting to change since LLVM has forced everyone to shake the rust off. -Wyatt
Yes, LLVM has been a great contribution to advancing C-language tooling beyond its PDP-era architecture constraints.
Jun 19 2014
prev sibling parent "H. S. Teoh via Digitalmars-d" <digitalmars-d puremagic.com> writes:
On Thu, Jun 19, 2014 at 07:22:22PM +0000, Wyatt via Digitalmars-d wrote:
 On Thursday, 19 June 2014 at 05:35:06 UTC, Nick Sabalausky wrote:
certainly be nice. But all the data's there at once,  so no need for
constant fast-fowarding and rewindi...oh wait, that's right,
debuggers can't rewind either. ;)
Oh? https://www.gnu.org/software/gdb/news/reversible.html http://rr-project.org/ Debuggers, like most aspects of the C tooling ecosystem, have lain stagnant for a long time, but its not for lack of enhancement opportunities. I think this is starting to change since LLVM has forced everyone to shake the rust off.
Wow. This is Very Cool(tm). I shall have to start using this!

The linked website says that all syscalls have to be emulated. Sounds like, if the debugger's idea of what a particular syscall does is different from what it actually does, you may get some strange results. Which is a bit scary...

T

-- 
A mathematician is a device for turning coffee into theorems. -- P. Erdos
Jun 19 2014
prev sibling parent reply "Sean Kelly" <sean invisibleduck.org> writes:
On Thursday, 19 June 2014 at 05:35:06 UTC, Nick Sabalausky wrote:
 That's why I inadvertently learned to love printf debugging. I 
 get to see the whole "chart" at one.
Yep. A lot of this is probably because as a server programmer I've just gotten used to finding bugs this way as a matter of necessity, but in many cases I actually prefer it to interactive debugging. For example, build core.demangle with -debug=trace and -debug=info set.
Jun 26 2014
next sibling parent reply "H. S. Teoh via Digitalmars-d" <digitalmars-d puremagic.com> writes:
On Thu, Jun 26, 2014 at 10:57:28PM +0000, Sean Kelly via Digitalmars-d wrote:
 On Thursday, 19 June 2014 at 05:35:06 UTC, Nick Sabalausky wrote:
That's why I inadvertently learned to love printf debugging. I get to
see the whole "chart" at one.
Yep. A lot of this is probably because as a server programmer I've just gotten used to finding bugs this way as a matter of necessity, but in many cases I actually prefer it to interactive debugging. For example, build core.demangle with -debug=trace and -debug=info set.
Over the years, I've come to prefer printf debugging too. At my job I work with headless embedded systems, and interactive debugging can only be done remotely. Unfortunately, remote debugging is rather flaky -- gdbserver does work wonders sometimes, but due to quirky system library setups and older system software (we don't have the luxury of always running the latest and greatest), it works poorly when the application in question calls fork() or is multithreaded or loads dynamic libraries at runtime.

One especially unhelpful scenario is when the failure happens at boot-time, before the network devices have been initialized, so you can't even ssh into the machine to start gdbserver -- no debugger magic will help you there! And even in the cases where I do manage to get gdbserver to work, all too often it's unable to match addresses to symbols in the source tree (due to differing runtime environments on the machine vs. my PC), so it ends up not being very informative. While there *are* ways of setting things up so that they will work, it's a lot of trouble.

I've since developed a crude printf-based debug logging system which lets me insert debug() lines anywhere in the code, and it gets printf-formatted, prefixed with the process ID, with a simple fcntl file lock to ensure atomic output when multiple processes / threads call debug() at the same time. Armed with this, inserting debug() into a few strategic places does wonders -- the output goes in a file that can be examined long after the problem has occurred (say at boot time when the machine is inaccessible), contains PIDs that can be used to trace exactly which instance of a program failed, can be reviewed at leisure, etc.

T

-- 
It's bad luck to be superstitious. -- YHL
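Since this pattern keeps coming up in the thread, here is a minimal POSIX-only sketch of such a logger -- my own illustration with made-up names (debugLog, because `debug` is a D keyword, and a hard-coded /tmp path), not the actual code described above:

    module debuglog;

    import core.sys.posix.fcntl;    // open, fcntl, flock, F_WRLCK, F_SETLKW, ...
    import core.sys.posix.unistd;   // write, close, getpid
    import std.conv : octal;
    import std.format : format;
    import std.string : toStringz;

    // Append one formatted, PID-prefixed line to a shared log file.  An
    // fcntl() write lock keeps lines from different processes/threads
    // from interleaving.
    void debugLog(Args...)(string fmt, Args args)
    {
        enum logPath = "/tmp/debug.log";              // illustrative path
        string line = format("[%d] " ~ fmt ~ "\n", getpid(), args);

        int fd = open(logPath.toStringz, O_WRONLY | O_APPEND | O_CREAT, octal!644);
        if (fd < 0)
            return;                                   // never crash the target
        scope(exit) close(fd);

        flock fl;                                     // lock the whole file
        fl.l_type   = F_WRLCK;
        fl.l_whence = 0;                              // SEEK_SET: from start of file
        fcntl(fd, F_SETLKW, &fl);                     // block until we own the lock
        scope(exit) { fl.l_type = F_UNLCK; fcntl(fd, F_SETLKW, &fl); }

        write(fd, line.ptr, line.length);
    }

    void main()
    {
        debugLog("booted, about to fork, x = %s", 42);
    }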
Jun 26 2014
parent reply Nick Sabalausky <SeeWebsiteToContactMe semitwist.com> writes:
On 6/26/2014 7:24 PM, H. S. Teoh via Digitalmars-d wrote:
 On Thu, Jun 26, 2014 at 10:57:28PM +0000, Sean Kelly via Digitalmars-d wrote:
 On Thursday, 19 June 2014 at 05:35:06 UTC, Nick Sabalausky wrote:
 That's why I inadvertently learned to love printf debugging. I get to
 see the whole "chart" at one.
Yep. A lot of this is probably because as a server programmer I've just gotten used to finding bugs this way as a matter of necessity, but in many cases I actually prefer it to interactive debugging. For example, build core.demangle with -debug=trace and -debug=info set.
Over the years, I've come to prefer printf debugging too. At my job I work with headless embedded systems, and interactive debugging can only be done remotely.
Aye. Sometimes in embedded work, you're *lucky* if you can even do printf at all, let alone a debugger. I've had to debug with as little as one LED. It's...umm..."interesting". And time consuming. Especially when it's ASM. (But somewhat of a proud-yet-twisted rite of passage though ;) ) There's other times I've had to get by without debuggers too. Like, in the earlier days of web dev, it was common to not have a debugger. Or debugging JS problems that only manifested on Safari (I assume Safari probably has JS diagnostics/debugging now, but it didn't always. That was a pain.)
Jun 26 2014
next sibling parent reply "H. S. Teoh via Digitalmars-d" <digitalmars-d puremagic.com> writes:
On Thu, Jun 26, 2014 at 09:16:27PM -0400, Nick Sabalausky via Digitalmars-d
wrote:
[...]
 Aye. Sometimes in embedded work, you're *lucky* if you can even do
 printf at all, let alone a debugger. I've had to debug with as little
 as one LED.  It's...umm..."interesting". And time consuming.
 Especially when it's ASM.  (But somewhat of a proud-yet-twisted rite
 of passage though ;) )
Reminds me of time I hacked an old Apple II game's copy protection by using a disk editor and writing in the instruction opcodes directly. :-)
 There's other times I've had to get by without debuggers too. Like, in
 the earlier days of web dev, it was common to not have a debugger. Or
 debugging JS problems that only manifested on Safari (I assume Safari
 probably has JS diagnostics/debugging now, but it didn't always. That
 was a pain.)
Argh... you remind me of times when I had to debug like 50kloc of Javascript for a single typo on IE6, when IE6 has no debugger, not even a JS error console, or anything whatsoever that might indicate something went wrong except for a blank screen where there should be JS-rendered content.

It wasn't so bad when the same bug showed up in Firefox or Opera, which do have sane debuggers; but when the bug is specific to IE, it feels like shooting a gun blindfolded in pitch darkness and hoping you'll hit the bulls-eye by pure dumb luck.

T

-- 
Marketing: the art of convincing people to pay for what they didn't need before which you can't deliver after.
Jun 26 2014
next sibling parent "Paulo Pinto" <pjmlp progtools.org> writes:
On Friday, 27 June 2014 at 02:11:50 UTC, H. S. Teoh via 
Digitalmars-d wrote:
 On Thu, Jun 26, 2014 at 09:16:27PM -0400, Nick Sabalausky via 
 Digitalmars-d wrote:
 [...]
 Aye. Sometimes in embedded work, you're *lucky* if you can 
 even do
 printf at all, let alone a debugger. I've had to debug with as 
 little
 as one LED.  It's...umm..."interesting". And time consuming.
 Especially when it's ASM.  (But somewhat of a 
 proud-yet-twisted rite
 of passage though ;) )
Reminds me of time I hacked an old Apple II game's copy protection by using a disk editor and writing in the instruction opcodes directly. :-)
 There's other times I've had to get by without debuggers too. 
 Like, in
 the earlier days of web dev, it was common to not have a 
 debugger. Or
 debugging JS problems that only manifested on Safari (I assume 
 Safari
 probably has JS diagnostics/debugging now, but it didn't 
 always. That
 was a pain.)
Argh... you remind of times when I had to debug like 50kloc of Javascript for a single typo on IE6, when IE6 has no debugger, not even a JS error console, or anything whatsoever that might indicate something went wrong except for a blank screen where there should be JS-rendered content. It wasn't so bad when the same bug showed up in Firefox or Opera, which do have sane debuggers; but when the bug is specific to IE, it feels like shooting a gun blindfolded in pitch darkness and hoping you'll hit bulls-eye by pure dumb luck. T
IE6 had a debugger, it just wasn't installed by default. You needed to install the debugger for Windows Scripting Host. -- Paulo
Jun 27 2014
prev sibling parent reply Nick Sabalausky <SeeWebsiteToContactMe semitwist.com> writes:
On 6/26/2014 10:10 PM, H. S. Teoh via Digitalmars-d wrote:
 On Thu, Jun 26, 2014 at 09:16:27PM -0400, Nick Sabalausky via Digitalmars-d
wrote:
 [...]
 Aye. Sometimes in embedded work, you're *lucky* if you can even do
 printf at all, let alone a debugger. I've had to debug with as little
 as one LED.  It's...umm..."interesting". And time consuming.
 Especially when it's ASM.  (But somewhat of a proud-yet-twisted rite
 of passage though ;) )
Reminds me of time I hacked an old Apple II game's copy protection by using a disk editor and writing in the instruction opcodes directly. :-)
Cool. I once tried to hack a game I'd bought to change/remove the part where it took my name directly from the payment method and displayed that it was registered to "Nicolas" instead of "Nick" in big bold letters on the title screen. I didn't quite get that adjusted, but I did wind up with a tool (in D) to pack/unpack the game's resource file format.
Jun 27 2014
parent "H. S. Teoh via Digitalmars-d" <digitalmars-d puremagic.com> writes:
On Fri, Jun 27, 2014 at 03:36:08PM -0400, Nick Sabalausky via Digitalmars-d
wrote:
 On 6/26/2014 10:10 PM, H. S. Teoh via Digitalmars-d wrote:
On Thu, Jun 26, 2014 at 09:16:27PM -0400, Nick Sabalausky via Digitalmars-d
wrote:
[...]
Aye. Sometimes in embedded work, you're *lucky* if you can even do
printf at all, let alone a debugger. I've had to debug with as
little as one LED.  It's...umm..."interesting". And time consuming.
Especially when it's ASM.  (But somewhat of a proud-yet-twisted rite
of passage though ;) )
Reminds me of time I hacked an old Apple II game's copy protection by using a disk editor and writing in the instruction opcodes directly. :-)
Cool. I once tried to hack a game I'd bought to change/remove the part where it took my name directly from the payment method and displayed that it was registered to "Nicolas" instead of "Nick" in big bold letters on the title screen. I didn't quite get that adjusted, but I did wind up with a tool (in D) to pack/unpack the game's resource file format.
Heh, nice! :) On another note, something more recent that I'm quite proud of, was to fix a bug that I couldn't reproduce locally, for which the only information I have was the segfault stacktrace the customer gave in the bug report (which had no symbols resolved, btw, just raw hex addresses). I looked up the exact firmware build number he was using, and got myself a copy of the binary from the official release firmware FTP server. Of course, that didn't have any symbols either (it's a release build), but at least the addresses on the stacktrace matched up with the addresses in the disassembly of the binary. So I had to check out the precise revision of the source tree used to make that build from revision control, build it with symbols, then match up the function addresses so that I could identify them. However, the last few frames on the stacktrace are static functions, which have no symbols in the binary even in my build, so I had to trace through the stacktrace by comparing the disassembly with the source code to find the offending function, then find the offending line by tracing through the disassembly and matching it up with the source code, up to the point of the segfault. Once I found the exact source line, the register values on the stacktrace indicated that it was a null dereference, so I worked backwards, in the source code now, until I identified the exact variable corresponding to the register that held the NULL pointer (the compiler's optimizer shuffled the variable around between RAM and various registers as the function progressed, so all of that had to be unravelled before the exact variable could be identified). After that, I could resume the regular routine of tracing the paths through which the NULL could have come. You have no idea how awesome it felt when my test image (which I couldn't test locally since I couldn't reproduce the bug), installed on the customer's backup test environment, worked the first time. T -- Claiming that your operating system is the best in the world because more people use it is like saying McDonalds makes the best food in the world. -- Carl B. Constantine
Jun 27 2014
prev sibling next sibling parent Jacob Carlborg <doob me.com> writes:
On 2014-06-27 03:16, Nick Sabalausky wrote:

 There's other times I've had to get by without debuggers too. Like, in
 the earlier days of web dev, it was common to not have a debugger. Or
 debugging JS problems that only manifested on Safari (I assume Safari
 probably has JS diagnostics/debugging now, but it didn't always. That
 was a pain.)
These days there is something called Firebug Lite [1]. It's like Firebug but it's written purely in JavaScript. That means you can use it like a bookmarklet in browsers like IE6, iPhone or other phones that don't have a debugger. I think it's even better than the one in the latest IE. The downside is, if there's a JavaScript error the debugger might not run :(.

[1] https://getfirebug.com/firebuglite

-- 
/Jacob Carlborg
Jun 27 2014
prev sibling parent Manu via Digitalmars-d <digitalmars-d puremagic.com> writes:
On 27 June 2014 11:16, Nick Sabalausky via Digitalmars-d
<digitalmars-d puremagic.com> wrote:
 On 6/26/2014 7:24 PM, H. S. Teoh via Digitalmars-d wrote:
 On Thu, Jun 26, 2014 at 10:57:28PM +0000, Sean Kelly via Digitalmars-d
 wrote:
 On Thursday, 19 June 2014 at 05:35:06 UTC, Nick Sabalausky wrote:
 That's why I inadvertently learned to love printf debugging. I get to
 see the whole "chart" at one.
Yep. A lot of this is probably because as a server programmer I've just gotten used to finding bugs this way as a matter of necessity, but in many cases I actually prefer it to interactive debugging. For example, build core.demangle with -debug=trace and -debug=info set.
Over the years, I've come to prefer printf debugging too. At my job I work with headless embedded systems, and interactive debugging can only be done remotely.
Aye. Sometimes in embedded work, you're *lucky* if you can even do printf at all, let alone a debugger. I've had to debug with as little as one LED. It's...umm..."interesting". And time consuming. Especially when it's ASM. (But somewhat of a proud-yet-twisted rite of passage though ;) ) There's other times I've had to get by without debuggers too. Like, in the earlier days of web dev, it was common to not have a debugger. Or debugging JS problems that only manifested on Safari (I assume Safari probably has JS diagnostics/debugging now, but it didn't always. That was a pain.)
Aye, I wrote my former company's PSP engine with nothing more than the unit's power light as a debugging tool (at least until I managed to initialise the display hardware and render something). I would while(1) around the place... if it reached that point, the power light stayed on. If it crashed before it reached that point, the power light went off (after a 20 second delay, which made every single execution a suspenseful experience!).
Jun 27 2014
prev sibling parent Jacob Carlborg <doob me.com> writes:
On 2014-06-27 00:57, Sean Kelly wrote:

 Yep.  A lot of this is probably because as a server programmer
 I've just gotten used to finding bugs this way as a matter of
 necessity, but in many cases I actually prefer it to interactive
 debugging.  For example, build core.demangle with -debug=trace
 and -debug=info set.
I don't know about other debuggers but with LLDB you can set a breakpoint, add commands to that breakpoint, which will be executed when the breakpoint is hit. Then continue the execution. This means you don't need to use the debugger interactively, if you don't want to. -- /Jacob Carlborg
Jun 27 2014
prev sibling next sibling parent reply Nick Sabalausky <SeeWebsiteToContactMe semitwist.com> writes:
On 6/15/2014 7:45 AM, Dicebot wrote:
 I like how he says that productivity is important and mentions fear of
 meta-programming in the same article ;)
Or how productivity is important, but fixing C++'s death-by-a-thousand-cuts productivity killers by...fixing those many little cuts as he already admits D *does*...isn't good enough because it doesn't come in the form of one big silver bullet. Like OO. Except, oops!, he doesn't like big silver bullets like OO either.

And every time a language *does* fix a C++ problem, it's suddenly backtracking time: "Oh, well, we already addressed that in C++ via some paperclips and chewing gum."

I don't think he even knows what he wants. He just wants to sit there and let somebody else wave a wand to magically solve ALL his problems simultaneously. Nothing less will do.

It really gets me that the same industry which created Frostbite 3, Unreal Engine 4, GTA5, Steam (obviously all enormous investments), mostly done *in* C++ which makes them that much MORE effort, will bitch *soo* much about C++ and STILL won't get off their asses enough to write, or even contribute to, a mere language.
Jun 15 2014
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 6/15/2014 12:27 PM, Nick Sabalausky wrote:
 It really gets me that the same industry which created Frostbite 3, Unreal
 Engine 4, GTA5, Steam (obviously all enormous investments), mostly done *in*
C++
 which makes them that much MORE effort, will bitch *soo* much about C++ and
 STILL won't get off their asses enough to write, or even contribute to, a mere
 language.
It's all about comfort zone. It is much easier to continue doing what one is familiar with than to try something new. It's also fair to say that some people have learned D, and gone back to C++.
Jun 15 2014
next sibling parent Nick Sabalausky <SeeWebsiteToContactMe semitwist.com> writes:
On 6/15/2014 3:53 PM, Walter Bright wrote:
 On 6/15/2014 12:27 PM, Nick Sabalausky wrote:
 It really gets me that the same industry which created Frostbite 3,
 Unreal
 Engine 4, GTA5, Steam (obviously all enormous investments), mostly
 done *in* C++
 which makes them that much MORE effort, will bitch *soo* much about
 C++ and
 STILL won't get off their asses enough to write, or even contribute
 to, a mere
 language.
It's all about comfort zone. It is much easier to continue doing what one is familiar with than to try something new.
While I do agree completely with that, any suggestion of *C++* being in someone's "comfort zone" just sounds like "Yea, swimming in a pool of razor blades isn't ideal, but I've just gotten so darn comfortable with it! Like an old pair of shoes! With lava inside them!"
Jun 15 2014
prev sibling next sibling parent reply "w0rp" <devw0rp gmail.com> writes:
I'm going to try my hand at making a game with 2.066, because I 
believe @nogc is a final piece in a puzzle of making doing that 
easy. Much like writing bare metal D code without the runtime, 
I'm going to try my hand at writing D code with the main function 
marked as @nogc, because I reckon it's going to leave me with a 
saner set of syntax and semantics than either C or C++ in the 
end, with none of the drawbacks from the stop-the-world effect 
in a game loop.

Having said that, I do think there's some kind of brain 
malfunction on the part of games programmers that makes them 
think "is slow and can't escape from" when they hear "garbage 
collector" and "makes things more complicated and slower" when 
they hear "template." Neither of these things are true.
Jun 15 2014
next sibling parent reply "MattCoder" <idonthaveany mail.com> writes:
On Sunday, 15 June 2014 at 20:53:58 UTC, w0rp wrote:
 I'm going to try my hand at making a game with 2.066...
It will be open-source? Can you tell what type of game you have in mind? Matheus.
Jun 15 2014
parent "w0rp" <devw0rp gmail.com> writes:
On Sunday, 15 June 2014 at 21:00:23 UTC, MattCoder wrote:
 On Sunday, 15 June 2014 at 20:53:58 UTC, w0rp wrote:
 I'm going to try my hand at making a game with 2.066...
It will be open-source? Can you tell what type of game you have in mind? Matheus.
Yeah. I'll put it all on GitHub. I did a little bit of work on it before. My past experience is just writing graphics code at university with the legacy OpenGL API. I don't want to make some AAA video game or something. I'm just going to make a dumb game where you walk around little mazes and diamonds try to kill you.
Jun 15 2014
prev sibling next sibling parent "bearophile" <bearophileHUGS lycos.com> writes:
w0rp:

 I'm going to try my hand at writing D code with the main 
 function marked as  nogc,
That's going to be a lot of fun! Are you going to need Unicode text? One report:

https://d.puremagic.com/issues/show_bug.cgi?id=12768

Bye,
bearophile
Jun 15 2014
prev sibling next sibling parent reply "Brian Rogoff" <brogoff gmail.com> writes:
On Sunday, 15 June 2014 at 20:53:58 UTC, w0rp wrote:
 I'm going to try my hand at making a game with 2.066, because I 
 believe  nogc is a final piece in a puzzle of making doing that 
 easy. Much like writing bare metal D code without the runtime, 
 I'm going to try my hand at writing D code with the main 
 function marked as  nogc, because I reckon it's going to leave 
 me with a saner set of syntax and semantics than either C or 
 C++ in the end, with none of the drawbracks from the stop the 
 world effect in a game loop.
Good luck, I'm sure a lot of people are interested.
 Having said that, I do think there's some kind of brain 
 malfunction on the part of games programmers that makes them 
 think "is slow and can't escape from" when they hear "garbage 
 collector"
There's a similar brain malfunction on the part of GC advocates that makes them think "GC programs are just as performant as non GC'ed programs at no cost". (**) On the Lang.Next panel, Andrei said something like "For the same payload the garbage collected program uses 3 times as much memory" and researcher Emery Berger writes that http://people.cs.umass.edu/~emery/plasma/emery/memory-management-studies.html "... a good GC can match the performance of a good allocator, but it takes 5X more space. If physical memory is tight, however, conventional garbage collectors suffer an order-of-magnitude performance penalty." That's not even taking into account the non-determinism of tracing GC or the issues of finalizers vs destructors or ... Being able to turn off GC, and having the libraries not hit the GC, are all important.
 and "makes things more complicated and slower" when they hear 
 "template." Neither of these things are true.
D metaprogramming is a winning feature. But GC has a number of costs as well as a number of benefits. (**) I much prefer working in a GC'ed language with high level features that require a GC (full closures!) but if resources are tight I think you need to give up some pleasant features.
Jun 15 2014
parent Manu via Digitalmars-d <digitalmars-d puremagic.com> writes:
On 16 June 2014 09:25, Brian Rogoff via Digitalmars-d
<digitalmars-d puremagic.com> wrote:
 On Sunday, 15 June 2014 at 20:53:58 UTC, w0rp wrote:
 I'm going to try my hand at making a game with 2.066, because I believe
  nogc is a final piece in a puzzle of making doing that easy. Much like
 writing bare metal D code without the runtime, I'm going to try my hand at
 writing D code with the main function marked as  nogc, because I reckon it's
 going to leave me with a saner set of syntax and semantics than either C or
 C++ in the end, with none of the drawbracks from the stop the world effect
 in a game loop.
Good luck, I'm sure a lot of people are interested.
 Having said that, I do think there's some kind of brain malfunction on the
 part of games programmers that makes them think "is slow and can't escape
 from" when they hear "garbage collector"
There's a similar brain malfunction on the part of GC advocates that makes them think "GC programs are just as performant as non GC'ed programs at no cost". (**) On the Lang.Next panel, Andrei said something like "For the same payload the garbage collected program uses 3 times as much memory" and researcher Emery Berger writes that http://people.cs.umass.edu/~emery/plasma/emery/memory-management-studies.html "... a good GC can match the performance of a good allocator, but it takes 5X more space. If physical memory is tight, however, conventional garbage collectors suffer an order-of-magnitude performance penalty." That's not even taking into account the non-determinism of tracing GC or the issues of finalizers vs destructors or ... Being able to turn off GC, and having the libraries not hit the GC, are all important.
Thank you. I was very worried I was about to start another holy war responding to that post ;)
 and "makes things more complicated and slower" when they hear "template."
 Neither of these things are true.
D metaprogramming is a winning feature. But GC has a number of costs as well as a number of benefits. (**) I much prefer working in a GC'ed language with high level features that require a GC (full closures!) but if resources are tight I think you need to give up some pleasant features.
Closures are so few and infrequent, they could trivially be ARC allocated, and the language dependence on GC would be meaningfully reduced.
Jun 15 2014
prev sibling parent reply Nick Sabalausky <SeeWebsiteToContactMe semitwist.com> writes:
On 6/15/2014 4:53 PM, w0rp wrote:
 I'm going to try my hand at making a game with 2.066, because I believe
  nogc is a final piece in a puzzle of making doing that easy. Much like
 writing bare metal D code without the runtime, I'm going to try my hand
 at writing D code with the main function marked as  nogc, because I
 reckon it's going to leave me with a saner set of syntax and semantics
 than either C or C++ in the end, with none of the drawbracks from the
 stop the world effect in a game loop.

 Having said that, I do think there's some kind of brain malfunction on
 the part of games programmers that makes them think "is slow and can't
 escape from" when they hear "garbage collector" and "makes things more
 complicated and slower" when they hear "template." Neither of these
 things are true.
I think C++ has caused a lot of brain damage to a lot of unfortunate souls. In addition to giving templates a bad name, I think C++ has done a lot to damage the reputations of both static typing and native compilation. I attribute the VM obsession of the early 2000's, and the big surge in dynamic-language popularity, largely to C and C++ for being misleading-but-popular examples of static typing and native compilation.

C++'s lack of finally didn't do any favors for exception handling's popularity, either. (Has "finally" finally been added?)
Jun 15 2014
next sibling parent reply "H. S. Teoh via Digitalmars-d" <digitalmars-d puremagic.com> writes:
On Mon, Jun 16, 2014 at 12:18:26AM -0400, Nick Sabalausky via Digitalmars-d
wrote:
[...]
 C++'s lack of finally didn't do any favors for exception handling's
 popularity, either. (Has "finally" finally been added?)
http://stackoverflow.com/questions/7779652/try-catch-finally-construct-is-it-in-c11

Apparently, C++ *still* doesn't have finally, preferring RAII instead. Yet another nail in the too-little-too-late coffin that is C++11.

T

-- 
He who is everywhere is nowhere.
Jun 15 2014
next sibling parent "deadalnix" <deadalnix gmail.com> writes:
On Monday, 16 June 2014 at 06:24:47 UTC, H. S. Teoh via 
Digitalmars-d wrote:
 On Mon, Jun 16, 2014 at 12:18:26AM -0400, Nick Sabalausky via 
 Digitalmars-d wrote:
 [...]
 C++'s lack of finally didn't do any favors for exception 
 handling's
 popularity, either. (Has "finally" finally been added?)
http://stackoverflow.com/questions/7779652/try-catch-finally-construct-is-it-in-c11 Apparently, C++ *still* doesn't have finally, preferring RAII instead. Yet another nail in the too-little-too-late coffin that is C++11. T
Who needs finally when you have scope(exit) ?
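For anyone coming from C++, a minimal sketch (illustrative file names) of what scope guards buy you in place of finally:

    import std.stdio;

    void copyFile(string from, string to)
    {
        auto src = File(from, "rb");
        scope(exit) src.close();       // runs on every exit path, like finally

        auto dst = File(to, "wb");
        scope(exit) dst.close();
        scope(failure) writeln("copy failed, ", to, " may be incomplete");

        foreach (chunk; src.byChunk(64 * 1024))
            dst.rawWrite(chunk);
    }

    void main()
    {
        copyFile("a.bin", "b.bin");
    }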
Jun 15 2014
prev sibling parent reply Nick Sabalausky <SeeWebsiteToContactMe semitwist.com> writes:
On 6/16/2014 2:23 AM, H. S. Teoh via Digitalmars-d wrote:
 On Mon, Jun 16, 2014 at 12:18:26AM -0400, Nick Sabalausky via Digitalmars-d
wrote:
 [...]
 C++'s lack of finally didn't do any favors for exception handling's
 popularity, either. (Has "finally" finally been added?)
http://stackoverflow.com/questions/7779652/try-catch-finally-construct-is-it-in-c11 Apparently, C++ *still* doesn't have finally, preferring RAII instead. Yet another nail in the too-little-too-late coffin that is C++11.
Ahh, ouch. Some years back, I was very surprised when I came across Brian Hook's old "Book of Hook" article denouncing exceptions as a bad approach to error handling. Then I realized C++ didn't have "finally". All of a sudden his perspective made a lot more sense. :)

But wait...Hasn't Andrei created library-based scope guards for C++? (Or am I remembering something wrong?) How would that be possible without "finally"?
Jun 16 2014
parent reply "safety0ff" <safety0ff.dev gmail.com> writes:
On Monday, 16 June 2014 at 07:27:16 UTC, Nick Sabalausky wrote:
 But wait...Hasn't Andrei created library-based scope guards for 
 C++? (Or am I remembering something wrong?) How would that 
 possible without "finally"?
Skip to 19:00 http://vimeo.com/97329153
Jun 16 2014
parent "safety0ff" <safety0ff.dev gmail.com> writes:
On Monday, 16 June 2014 at 07:41:03 UTC, safety0ff wrote:
 On Monday, 16 June 2014 at 07:27:16 UTC, Nick Sabalausky wrote:
 But wait...Hasn't Andrei created library-based scope guards 
 for C++? (Or am I remembering something wrong?) How would that 
 possible without "finally"?
Skip to 19:00 http://vimeo.com/97329153
Oops, scope exit is at 16:00, just watch the whole video though, it's good :o)
Jun 16 2014
prev sibling parent "Paulo Pinto" <pjmlp progtools.org> writes:
On Monday, 16 June 2014 at 04:18:28 UTC, Nick Sabalausky wrote:
 On 6/15/2014 4:53 PM, w0rp wrote:
 I'm going to try my hand at making a game with 2.066, because 
 I believe
  nogc is a final piece in a puzzle of making doing that easy. 
 Much like
 writing bare metal D code without the runtime, I'm going to 
 try my hand
 at writing D code with the main function marked as  nogc, 
 because I
 reckon it's going to leave me with a saner set of syntax and 
 semantics
 than either C or C++ in the end, with none of the drawbracks 
 from the
 stop the world effect in a game loop.

 Having said that, I do think there's some kind of brain 
 malfunction on
 the part of games programmers that makes them think "is slow 
 and can't
 escape from" when they hear "garbage collector" and "makes 
 things more
 complicated and slower" when they hear "template." Neither of 
 these
 things are true.
I think C++ has caused a lot of brain damage to a lot of unfortunate souls. ....
One consequence was making those souls think that the C and C++ compilation model is a synonym for all languages that compile to native code. -- Paulo
Jun 16 2014
prev sibling next sibling parent reply Manu via Digitalmars-d <digitalmars-d puremagic.com> writes:
On 16 June 2014 05:53, Walter Bright via Digitalmars-d
<digitalmars-d puremagic.com> wrote:
 On 6/15/2014 12:27 PM, Nick Sabalausky wrote:
 It really gets me that the same industry which created Frostbite 3, Unreal
 Engine 4, GTA5, Steam (obviously all enormous investments), mostly done
 *in* C++
 which makes them that much MORE effort, will bitch *soo* much about C++
 and
 STILL won't get off their asses enough to write, or even contribute to, a
 mere
 language.
 It's all about comfort zone. It is much easier to continue doing
 what one is familiar with than to try something new. It's also
 fair to say that some people have learned D, and gone back to
 C++.
I think the reason is mostly like I said in my other post; that gamedev is a strictly closed and proprietary industry. Open Source is synonymous with "flakey, barely working, non-windows-compatible, probably-linux-nonsense, rubbish", if you ask most gamedevs. They don't understand OSS, and the industry doesn't support any knowledge in the field. I think this is changing, but it hasn't pervasively affected gamedev culture yet...

Gamedev is about solutions being given (MSVC, closed console platform tools, etc), and they are just waiting for the package to appear. It's not entirely unreasonable either. Most people probably don't realise how high-stress and unfair the gamedev industry is when it comes to engineers' time and commitment to their work. To say that they literally have no time to spend on extra-curricular projects is an understatement, and risk-aversion is a key form of self-defence. I know many gamedevs who are frequently expected to choose between their lives and families, or their jobs.

If they can't see the package and toolset nicely integrated, they can't imagine the workflow as realistic. I often make the point of how important VisualD is, and I don't say that lightly; it is everything to this community. And I must re-iterate, it's a _gigantic_ community!
Jun 15 2014
parent reply Nick Sabalausky <SeeWebsiteToContactMe semitwist.com> writes:
On 6/15/2014 9:55 PM, Manu via Digitalmars-d wrote:
 On 16 June 2014 05:53, Walter Bright via Digitalmars-d
 <digitalmars-d puremagic.com> wrote:
 On 6/15/2014 12:27 PM, Nick Sabalausky wrote:
 It really gets me that the same industry which created Frostbite 3, Unreal
 Engine 4, GTA5, Steam (obviously all enormous investments), mostly done
 *in* C++
 which makes them that much MORE effort, will bitch *soo* much about C++
 and
 STILL won't get off their asses enough to write, or even contribute to, a
 mere
 language.
 It's all about comfort zone. It is much easier to continue doing
 what one is familiar with than to try something new. It's also
 fair to say that some people have learned D, and gone back to
 C++.

 I think the reason is mostly like I said in my other post; that
 gamedev is a strictly closed and proprietary industry. Open
 Source is a synonym with "flakey, barely working,
 non-windows-compatible, probably-linux-nonsense, rubbish", if
 you ask most gamedev's. They don't understand OSS, and the
 industry doesn't support any knowledge
Interesting. That explains a chat I had a few years back, that had been puzzling me ever since, with a gamedev guy. I'd known him for a long time, and I *know* he's a very intelligent guy, but when the subject changed to OSS, suddenly it felt like, uhh, *ahem*...like the LA Times was trying to tell me about Nintendo's PlayStation 4 ;). *Zero* awareness of the real-world commercial contributions to OSS (Almost as if Mozilla didn't even exist).

But I *knew* this guy was smart enough to know better. I just couldn't figure it out. But if that's a prevalent belief in the industry, then that would explain what felt like an almost surreal conversation.
 in the field. I think this is changing, but it hasn't pervasively
 affected gamedev culture yet...
I've been watching Unity3D pretty closely as of late, and I predict that it, plus its Asset Store (or similar competitors), is going to start forcing the issue of AAA collaboration/openness more and more. That company seems to be built, in no small part, on putting indies closer and closer to competing with AAAs. And they have a history of making some real eyebrow-raising steps in that direction, with no signs of slowing down. I'm convinced Epic's already taken notice of that, as UE4 seems to be directly targeted at both Frostbite and Unity (not that Frostbite has gone commercial, AFAIK).

Related to this whole topic of openness in gamedev, Slightly Mad's Project C.A.R.S. is really going to be something to keep an eye on. I'd imagine the success or failure of that could very well trigger at least a few ripples.
 To say that they literally have no time to spend on extra-curricular
 projects is an understatement, and risk-aversion is a key form of
 self-defence. I know many gamedev's who are frequently expected to
 choose between their life and families, or their jobs.
Geezus, that garbage is still going on? "EA Spouse" alone was well over a decade ago. That, and all the many, many other examples (often less extreme, but still entirely unacceptable IMO), were exactly the reason I decided at the last minute (in college) to change my long-standing plans and not pursue a career in that industry after all.

Several *years* ago, I was under the impression that problem had finally been changing? Is that not so?
 If they can't see the package and toolset nicely integrated, they
 can't imagine the workflow as realistic. I often make the point how
 important VisualD is, and I don't say that lightly, it is everything
 to this community. And I must re-iterate, it's a _gigantic_ community!
Yea. Even as a non-IDE user (but former Visual Studio fan), I do sympathize with that. Naturally it's an unfortunate chicken and egg problem. Those who want it the most aren't really contributing (can't/won't/etc/whatever, either way it just hasn't been happening AFAIK), and the rest of us are still too busy scratching our own itches (and arguing with Walter/Andrei ;), myself *not* excluded).

But here's the part I have trouble understanding. Actually, I haven't been able to get it out of my mind all day: Look at Frostbite 3, the entire front-to-back of it, from authoring to runtime. Look at Unreal Engine 4. And look at...whatever crazy tech Rockstar must have had for GTA5 (and it runs playably on a *PS3*?!). And everything that goes into any MMO. And Steam/SteamBox. Etc.

That is some *crazy*, impressive, *herculean*-effort stuff. CLEARLY, significant parts of the game industry genuinely understand the importance of investments into technology. And yet...all the complaining they do about C++ and they *still* won't write the language they want? Or even take one that's close and bring it up-to-snuff? Undergrad students write their own languages! It almost sounds like an army of Conan the Barbarians complaining that a 5lb sack of potatoes is blocking their way.

Granted, I don't mean to trivialize designing/writing/maintaining a language. I know it's non-trivial even compared to the impressive tech the industry does produce. But, to my mind, it still just doesn't add up. I've been trying to wrap my brain around it all day, and I just don't get it.

I'd be very interested to hear your perspective on that. Is the idea of language design or compiler front-end work just intimidating? Is LLVM unknown/unused? Maybe it does get pitched, but so far no manager's gone for it? Something else?
Jun 15 2014
next sibling parent reply "w0rp" <devw0rp gmail.com> writes:
On Monday, 16 June 2014 at 05:46:22 UTC, Nick Sabalausky wrote:
 On 6/15/2014 9:55 PM, Manu via Digitalmars-d wrote:
 To say that they literally have no time to spend on 
 extra-curricular
 projects is an understatement, and risk-aversion is a key form 
 of
 self-defence. I know many gamedev's who are frequently 
 expected to
 choose between their life and families, or their jobs.
Geezus, that garbage is still going on? "EA Spouse" alone was well over a decade ago. That, and all the many, many other examples (often less extreme, but still entirely unacceptable IMO) was exactly the reason I decided at the last minute (in college), to change my long-standing plans and not pursue a career in that industry after all. Several *years* ago, I was under the impression that problem had finally been changing? Is that not so?
I was considering getting a job in the games industry, so I applied to a bunch of places in the UK during my final year of university. When you filtered out the jobs that were looking for years of industry experience, then filtered out the jobs that expected you to work terribly long hours, you got to the positions that said, "We'll get you started as a tester."

I switched to web development, where I work roughly 9-5 for a good salary, and I never looked back.
Jun 15 2014
next sibling parent Nick Sabalausky <SeeWebsiteToContactMe semitwist.com> writes:
On 6/16/2014 2:56 AM, w0rp wrote:
 On Monday, 16 June 2014 at 05:46:22 UTC, Nick Sabalausky wrote:
 Geezus, that garbage is still going on? "EA Spouse" alone was well
 over a decade ago. That, and all the many, many other examples (often
 less extreme, but still entirely unacceptable IMO) was exactly the
 reason I decided at the last minute (in college), to change my
 long-standing plans and not pursue a career in that industry after all.

 Several *years* ago, I was under the impression that problem had
 finally been changing? Is that not so?
I was considering getting a job in the games industry, so I applied to a bunch of places in the UK during my final year of university. When you filtered out the jobs that were looking for years of industry experience, then filtered out the jobs that expected you to work terribly long hours, you got to the positions that said, "We'll get you started as a tester." I switched to web development, where I work roughly 9-5 for a good salary, and I never looked back.
Yea. I never even bothered applying anywhere in gamedev (although nothing exists in Ohio anyway and I didn't particularly want to move, but still). So instead, my first summer in college I got an internship with one of the web teams at a major corp around here and learned webdev on the job (ASP, back in the pre-.NET days; even did some WAP/WML). The 8-5 on that (incl lunch) was enough of a hell for me (even despite being a rather decent company), so I certainly wouldn't want anything in an industry that does crunch mode. Been kinda stuck in web dev ever since.

It's not all bad though; as much as I hate about the web, there are some aspects of webdev I've come to enjoy. For example, the various problems of making web dev less painful have gone from survival to a genuine interest.

I've known some people who did go into AAA games though. One guy rose the ranks from tester to full-fledged programmer back on the PS1 (and later worked on Undying, which was a pretty sweet FPS). And an old college friend of mine joined up with Volition as a character designer/artist for several of their games. He's not there now though, and we pretty much lost contact after college, so no idea what he's up to now. Although if things are going well for him, then I have a couple good guesses.

But anyway, I'm rambling again. :)
Jun 16 2014
prev sibling next sibling parent "Paulo Pinto" <pjmlp progtools.org> writes:
On Monday, 16 June 2014 at 06:56:22 UTC, w0rp wrote:
 On Monday, 16 June 2014 at 05:46:22 UTC, Nick Sabalausky wrote:
 On 6/15/2014 9:55 PM, Manu via Digitalmars-d wrote:
 To say that they literally have no time to spend on 
 extra-curricular
 projects is an understatement, and risk-aversion is a key 
 form of
 self-defence. I know many gamedev's who are frequently 
 expected to
 choose between their life and families, or their jobs.
Geezus, that garbage is still going on? "EA Spouse" alone was well over a decade ago. That, and all the many, many other examples (often less extreme, but still entirely unacceptable IMO) was exactly the reason I decided at the last minute (in college), to change my long-standing plans and not pursue a career in that industry after all. Several *years* ago, I was under the impression that problem had finally been changing? Is that not so?
I was considering getting a job in the games industry, so I applied to a bunch of places in the UK during my final year of university. When you filtered out the jobs that were looking for years of industry experience, then filtered out the jobs that expected you to work terribly long hours, you got to the positions that said, "We'll get you started as a tester." I switched to web development, where I work roughly 9-5 for a good salary, and I never looked back.
Same here. I did manage to get into some interviews at a few AAA studios, attended two GDCEs, and got to know some people in the industry.

But the salary that gets paid, alongside the number of hours one is forced to work, only to be rewarded by being fired at the end of the project, made me choose to work in the regular software industry instead.

-- Paulo
Jun 16 2014
prev sibling parent David Gileadi <gileadis NSPMgmail.com> writes:
On 6/15/14, 11:56 PM, w0rp wrote:
 I was considering getting a job in the games industry, so I applied to a
 bunch of places in the UK during my final year of university. When you
 filtered out the jobs that were looking for years of industry
 experience, then filtered out the jobs that expected you to work
 terribly long hours, you got to the positions that said, "We'll get you
 started as a tester."

 I switched to web development, where I work roughly 9-5 for a good
 salary, and I never looked back.
Pretty similar to me, although like Nick I never even tried to interview. Now I write iPhone games as a hobby with no pressure to try to make a living from it, and enjoy it quite a bit!
Jun 16 2014
prev sibling parent reply Manu via Digitalmars-d <digitalmars-d puremagic.com> writes:
On 16 June 2014 15:46, Nick Sabalausky via Digitalmars-d
<digitalmars-d puremagic.com> wrote:
 On 6/15/2014 9:55 PM, Manu via Digitalmars-d wrote:
 On 16 June 2014 05:53, Walter Bright via Digitalmars-d
 <digitalmars-d puremagic.com> wrote:
 On 6/15/2014 12:27 PM, Nick Sabalausky wrote:
 It really gets me that the same industry which created Frostbite 3,
 Unreal
 Engine 4, GTA5, Steam (obviously all enormous investments), mostly done
 *in* C++
 which makes them that much MORE effort, will bitch *soo* much about C++
 and
 STILL won't get off their asses enough to write, or even contribute to,
 a
 mere
 language.
It's all about comfort zone. It is much easier to continue doing what one is familiar with than to try something new. It's also fair to say that some people have learned D, and gone back to C++.
I think the reason is mostly like I said in my other post; that gamedev is a strictly closed and proprietary industry. Open Source is a synonym with "flakey, barely working, non-windows-compatible, probably-linux-nonsense, rubbish", if you ask most gamedev's. They don't understand OSS, and the industry doesn't support any knowledge
Interesting. That explains a chat I had a few years back, that had been puzzling me ever since, with a gamedev guy. I'd known him for a long time, and I *know* he's a very intelligent guy, but when the subject changed to OSS, suddenly it felt like, uhh, *ahem*...like the LA Times was trying to tell me about Nintendo's PlayStation 4 ;). *Zero* awareness of the real-word commercial contributions to OSS (Almost as if Mozilla didn't even exist). But I *knew* this guy was smart enough to know better. I just couldn't figure it out. But if that's a prevalent belief in the industry, then that would explain what felt like an almost surreal conversation.
If you ask the average programmer, chances are they'll have a strong opinion on the matter, probably based on nothing more than hearsay. Sadly, for such a huge industry, it fosters some really immature engineers.
 in the field. I think this is changing, but it hasn't pervasively
 affected gamedev culture yet...
I've been watching Unity3D pretty closely as of late, and I predict that it, plus it's Asset Store (or similar competitors) are going to start forcing the issue of AAA collaboration/openness more and more. That company seems to be built, in no small part, on putting indies closer and closer to competing with AAAs. And they have a history of making some real eyebrow-raising steps in that direction, with no signs of slowing down. I'm convinced Epic's already taken notice of that, as UE4 seems to be directly targeted at both Frostbite and Unity (Not that Frostbite has gone commercial, AFAIK). Related to this whole topic of openness in gamedev, Slightly Mad's Project C.A.R.S. is really going to be something to keep an eye on. I'd imagine the success or failure of that could very well trigger at least a few ripples.
I think these sorts of ripples are starting to have some significant effect, but it's not pervasive yet. Many companies and engineers don't take the time out to look at the bigger picture; it can take quite some time for new ideas to invade their mind-sets.
 To say that they literally have no time to spend on extra-curricular
 projects is an understatement, and risk-aversion is a key form of
 self-defence. I know many gamedev's who are frequently expected to
 choose between their life and families, or their jobs.
Geezus, that garbage is still going on? "EA Spouse" alone was well over a decade ago. That, and all the many, many other examples (often less extreme, but still entirely unacceptable IMO) was exactly the reason I decided at the last minute (in college), to change my long-standing plans and not pursue a career in that industry after all. Several *years* ago, I was under the impression that problem had finally been changing? Is that not so?
Well, depends who you ask. Some have worked it out and acted on it; others have worked it out and don't have the luxury to act (or face a terminal threat to their company). I think it's getting better slowly, but that's coming at the cost of big game studios failing all over the world, resulting in a high level of occupational burnout, and employees so badly scarred they will never work like that again, which reinforces the movement ;)

There's a serious problem when companies' business models depend on working their staff into the ground to remain competitive. It's a race to the bottom, and plenty of companies won the prize...
 If they can't see the package and toolset nicely integrated, they
 can't imagine the workflow as realistic. I often make the point how
 important VisualD is, and I don't say that lightly, it is everything
 to this community. And I must re-iterate, it's a _gigantic_ community!
Yea. Even as a non-IDE user (but former Visual Studio fan), I do sympathize with that. Naturally it's an unfortunate chicken and egg problem. Those who want it the most aren't really contributing (can't/won't/etc/whatever, either way it just hasn't been happening AFAIK), and the rest of us are still too busy scratching our own itches (and arguing with Walter/Andrei ;), myself *not* excluded). But here's the part I have trouble understanding. Actually, I haven't been able to get it out of my mind all day: Look at Frostbite 3, the entire front-to-back of it, from authoring to runtime. Look at Unreal Engine 4. And look at...whatever crazy tech Rockstar must have had for GTA5 (and it runs playably on a *PS3*?!). And everything that goes into any MMO. And Steam/SteamBox. Etc. That is some *crazy*, impressive, *herculean*-effort stuff. CLEARLY, significant parts of the game industry genuinely understand the importance of investments into technology. And yet...all the complaining they do about C++ and they *still* won't write the language they want? Or even take one that's close and bring it up-to-snuff? Undergrad students write their own languages! It almost sounds like an army of Conan the Barbarians complaining that a 5lb sack of potatoes is blocking their way. Granted, I don't mean to trivialize designing/writing/maintaining a language. I know it's non-trivial even compared the impressive tech the industry does produce. But, to my mind, it still just doesn't add up. I've been trying to wrap my brain around it all day, and I just don't get it. I've be very interested to hear your perspective on that. Is the idea of language design or compiler front-end just intimidating? Is LLVM unknown/unused? Maybe it does get pitched, but so far no manager's gone for it? Something else?
Well, first hurdle: closed platform holders provide tooling for their platforms. XBox360 has arch-specific instructions which aren't widely supported in GCC/LLVM backends. Same goes for Nintendo, and Sony used to be a problem but got past that with the PS3.

Secondly, there isn't really budget allocated for the would-be compiler team. Perhaps there should be, but it would be an unconventional move by the first mover, and the games industry is so risk-averse; good luck trying to convince the suits to get on board with that...

Finally, games are so complex that most staff available tend to become highly specialised. There are usually a relatively small number of generalists capable of such a task in any one company, and such staff tend to find themselves becoming mission-critical resources, often exploited to burnout ;)

I'm heavily generalising; obviously every company is different, but it's a hard industry in which to manifest the right set of circumstances where the idea would be taken seriously.

It's been done before though. Insomniac were well known for their invention of their internal language 'goal', used by designers and scripters to produce game logic. It generated a lot of discussion, but most people dismissed the idea, stating that they didn't have the same resources available that Insomniac had, and Lua eventually won that war.
Jun 16 2014
parent Nick Sabalausky <SeeWebsiteToContactMe semitwist.com> writes:
On 6/16/2014 3:54 AM, Manu via Digitalmars-d wrote:
 Well, first hurdle, closed platform holders provide tooling for their
 platforms. XBox360 has arch-specific instructions, which aren't widely
 supported in GCC/LLVM backends. Same goes for Nintendo and
Sounds like being able to compile down to C/C++ could help here? Although, looking forward, I would imagine this would be less of an issue for PS4/XB3 since they're apparently x64?
 Sony used
 to be a problem, but got past that with the PS3.
Interesting. I've been impressed with how Sony turned things around within the PS3's lifetime. It came out of the starting gate with both shoes tied together and a faceplant into the dirt, but they really did a lot to undo the damage within the constraints they had. ('Course, their biggest competitor's quality control problems probably didn't hurt, either.)
 Secondly, there isn't really budget allocated for the would-be compiler team.
 Perhaps there should be, but it would be an unconventional move by the
 first mover, and games is so risk-adverse; good luck trying to
 convince the suits to get on board with that...
Hmm, I guess that would be budgeted separately from any existing "tools" projects.
 Finally, games are so complex that most staff available tend to become
 highly specialised. There are usually a relatively small number of
 generalists capable of such a task in any one company,
I see. I'd heard about that before, but tend to forget it.
 and such staff
 tend to find themselves becoming mission-critical resources, often
 exploited to burnout ;)

 I'm heavily generalising, obviously every company is different, but
 it's a hard industry to manifest the right set of circumstances where
 the idea would be taken seriously.

 It's been done before though. Insomniac were well known for their
 invention of their internal language 'goal' used by designers and
 scripters to produce game logic. It generated a lot of discussion, but
 most people dismissed the idea stating that they didn't have the same
 resources available that insomniac had, and Lua eventually won that
 war.
Yea. Insomniac, along with their "friend" company, Naughty Dog, have always seemed to stand out as being fairly forward-thinking and more tech-driven than others. While I haven't usually been into their respective games, I've always had good reason to respect them both. The way they've pushed PS1/PS2 hardware, the scripting as you mentioned (which I had actually forgotten about), PS2-era advancements in follow-cameras, the more recent animation work, and just generally high level of polish on everything.
Jun 16 2014
prev sibling parent "c0de517e" <kenpex tin.it> writes:
On Sunday, 15 June 2014 at 19:53:54 UTC, Walter Bright wrote:
 On 6/15/2014 12:27 PM, Nick Sabalausky wrote:
 It really gets me that the same industry which created 
 Frostbite 3, Unreal
 Engine 4, GTA5, Steam (obviously all enormous investments), 
 mostly done *in* C++
 which makes them that much MORE effort, will bitch *soo* much 
 about C++ and
 STILL won't get off their asses enough to write, or even 
 contribute to, a mere
 language.
 It's all about comfort zone. It is much easier to continue doing
 what one is familiar with than to try something new. It's also
 fair to say that some people have learned D, and gone back to
 C++.
To be fair, in the industry there are -many- internal languages; most are very bad, some good. Even for internal languages, though, adoption is not trivial, and I've seen many valiant efforts fail.

These languages don't emerge outside a given company because, well, most of them are not really great, and anyhow, which companies share code and projects? id Software does, years down the line; it's the most prominent example, and indeed you can see the little languages id crafted over time in their sources...
Jun 17 2014
prev sibling parent reply "c0de517e" <kenpex tin.it> writes:
On Sunday, 15 June 2014 at 11:45:30 UTC, Dicebot wrote:
 On Sunday, 15 June 2014 at 11:28:12 UTC, Peter Alexander wrote:
 http://c0de517e.blogspot.ca/2014/06/where-is-my-c-replacement.html?m=1

 The arguments against D are pretty weak if I'm honest, but I 
 think it's important we understand what people think of D. I 
 can confirm this sentiment is fairly common in the industry.

 Watch out for the little jab at Andrei :-P
 I like how he says that productivity is important and mentions
 fear of meta-programming in the same article ;)

 Interesting though, I had totally different set of demands and
 expectation when used to work with C/C++. Feels like industry
 matters much more than a language here.
For a personal perspective on meta-programming risks, here is a write-up:
http://c0de517e.blogspot.ca/2014/06/bonus-round-languages-metaprogramming.html

The issue I have with metaprogramming (and overloading and some other similar ideas) is that it makes a statement dependent on a lot of context. This is tricky in a large team, as now just reading a change doesn't really tell you much. Ours is an industry where we still exercise a lot of control; we want to know exactly what a statement does in terms of how it's executed.

I won't replicate what I wrote on the blog here, so if you're interested I'd love to have more comments on that aspect. But that is why I care about productivity, yet I'd rather gain it with faster iteration and language features that don't make semantics more flexible, than with metaprogramming.

Then of course a tool is a tool, and I'd always love to have -more- tools, so I'm not saying metaprogramming is a bad thing to have. OO is not really a bad thing to have either. But these are tools to be used with certain care. The metaprogramming -mentality- is scary to me, like OO-heavy thinking is.
Jun 17 2014
next sibling parent reply "Araq" <rumpf_a web.de> writes:
 The issue I have with metaprogramming (and overloading and some 
 other similar ideas) is that it makes a statement dependent on 
 a lot of context, this is tricky in a large team as now just 
 reading a change doesn't really tell much. Our is an industry 
 where we still exercise a lot of control, we want to know 
 exactly what a statement does in terms of how it's executed.
Can be easily solved by better tooling, esp since it's all done at compile-time. ("Show code after some/all transformations.") The lack of imagination among programmers (and even professional game developers) is just sad.
Jun 17 2014
parent "c0de517e" <kenpex tin.it> writes:
On Tuesday, 17 June 2014 at 22:58:58 UTC, Araq wrote:
 The issue I have with metaprogramming (and overloading and 
 some other similar ideas) is that it makes a statement 
 dependent on a lot of context, this is tricky in a large team 
 as now just reading a change doesn't really tell much. Our is 
 an industry where we still exercise a lot of control, we want 
 to know exactly what a statement does in terms of how it's 
 executed.
 Can be easily solved by better tooling, esp since it's all done
 at compile-time. ("Show code after some/all transformations.")
 The lack of imagination among programmers (and even professional
 game developers) is just sad.
Well, we're still writing code as static text, and that's sad (actually, not true for languages like Mathematica), but on the other hand it's an entire infrastructure that needs to change, and that's huge. Is this proposed tooling, for example, going to be available in all the IDEs we use, the diff tools, the version control, the code-review tools...
Jun 17 2014
prev sibling next sibling parent reply "H. S. Teoh via Digitalmars-d" <digitalmars-d puremagic.com> writes:
On Tue, Jun 17, 2014 at 10:20:59PM +0000, c0de517e via Digitalmars-d wrote:
[...]
 The issue I have with metaprogramming (and overloading and some other
 similar ideas) is that it makes a statement dependent on a lot of
 context, this is tricky in a large team as now just reading a change
 doesn't really tell much. Our is an industry where we still exercise a
 lot of control, we want to know exactly what a statement does in terms
 of how it's executed.
You don't need metaprogramming to have this problem.

In my current job, for example, there is a recent push to move things away from hard-wired APIs toward more generic APIs, in order to make things more uniform and easier to use -- instead of memorizing 15 different sets of functions to use, one for each possible object stored in the database (read_obj, add_obj, delete_obj, read_file, add_file, delete_file, add_table, read_table, delete_table, ... ad nauseam), use a common set of accessor functions (read, add, delete, update, etc.) under a unified generic interface.

There was also the complaint from some developers that C is "superior" to C++ because in C++, a method call in a complex class hierarchy can theoretically end up "anywhere", whereas in C, you at least know exactly which function will get called, since there is no overloading.

Well guess what? In order to implement the unified generic interface, we ended up with tables of function pointers that get passed around, and now you have code that looks like this:

    /* Note: this is not the real code, I just made it up to
     * illustrate the problem */
    int my_generic_func(generic_container *container)
    {
        if (container->ops.find("some_key")) {
            generic_item *item = container->ops.read("some_key");
            value_type *value = item->ops.read("some_field");
            item->ops.add("new_field", &value);
            return 1;
        } else {
            generic_item *item = container->ops.make_new_item("new_data");
            container->ops.add("new_key", item);
            return 0;
        }
    }

Now you discover a bug somewhere in this function. How would you go about finding where it is? Well, since this is C, which allegedly doesn't have any polymorphism, that should be trivial, right? Just trace through all of the function calls and narrow it down. Except... all of those ops.xxx() calls are function pointers. So OK, we need to find out where they point to. But they are set in the *container struct somewhere far, far away from this function, so how do we trace them? Now we have to search through the entire source tree to find every place where container structs have their function pointers set.

Except... there are several different kinds of containers, and they all have radically different implementations of each function. So how do we know exactly which container type got passed in? Worse yet, even if you manage to narrow that down, you still can't resolve where item->ops.add points to, because containers may contain different kinds of items, and what type of item gets put in there is only known at runtime... Aaargh...

If C, which purportedly is unambiguous as to exactly what each statement does -- there is no polymorphism (allegedly), no function overloading, no operator overloading, etc. -- still exhibits exactly the same context dependence that you object to, then I'm forced to conclude that this context dependence is a red herring in your argument against metaprogramming.
 I won't replicate what I wrote on the blog here, so if you're
 interested I'd love to have more comments on that aspect, but that is
 why I care about productivity but I'd rather prefer to gain that with
 faster iteration and language features that don't make semantics more
 flexible, than metaprogramming.
 
 Then of course a tool is a tool and I'd always love to have -more-
 tools, so I'm not saying metaprogramming is a bad thing to have. Like
 OO is not really a bad thing to have either. But there are tools to be
 used with certain care. Metaprogramming -mentality- is scary to me
 like OO-heavy thinking is.
Believe me, I totally sympathize with where you're coming from -- especially after having to deal with C code like I illustrated above, where every other line is a call to a function pointer that points who knows where! It used to be, in the supposedly bad ole days of having hundreds of similarly-named functions (add_file, add_table, add_obj, delete_file, delete_table, delete_obj, etc.), that I can just run ctags and use vim's tagging function to follow function calls with a single keystroke, and I can rest reasonably assured that it will take me to the function in question, and that it represents the sequence of execution at runtime. Nowadays, tagging is basically useless, because every other line calls container->ops.add() or container->ops.delete(), and there are 15 different implementations of add() and delete(), who knows which one it ends up calling at runtime?!

OTOH, OO is a huge time saver, when used correctly, because it allows you to reason in the abstract instead of losing sight of the forest for the trees of details you have to wade through, just to do one single simple task. It reduces cognitive load: instead of memorizing the names of add_obj, add_file, add_table, delete_obj, delete_file, etc., you only need to remember add and delete, and the abstraction takes care of itself by resolving to the correct overload based on the argument types.

Metaprogramming goes one step further and lets you reduce boilerplate -- which is the source of subtle bugs caused by typos, not to mention a maintenance nightmare when fixing a bug in one instance of boilerplate doesn't fix all other instances of the same bug in the same boilerplate that's copied over 50 other places in your code -- while presenting an easy to remember abstract interface that you can remember in your sleep. In fact, it allows you to centralize several different implementations of the same logical operation under a single function, so even if you don't necessarily know exactly which version of the function will get called at runtime, you still know exactly what the code looks like, 'cos they all come from the same template. :)

At the end of the day, *all* tools must be used with care. Even in C, a language with neither OO nor metaprogramming support, you can code yourself into a nasty mess by using built-in language constructs like function pointers, like I showed above. Just because you *can* cause serious injury to yourself with a hammer, doesn't mean that hammers are inherently evil.

T

--
My program has no bugs! Only undocumented features...
Jun 17 2014
parent reply "c0de517e" <kenpex tin.it> writes:
 You don't need metaprogramming to have this problem.
True, in fact it was an example of a more general idea.
 There was also the complaint from some developers that C is
 "superior" to C++ because in C++, a method call in a complex 
 class
 hierarchy can theoretically end up "anywhere", whereas in C, 
 you at
 least know exactly which function will get called, since there 
 is no
 overloading.
That's an extreme opinion. Tools are tools; it's not that we always have to operate in the strictest ways possible, but yes, I'd say you have to be conscious of the consequences.

Nowadays my rule of thumb is to ask myself -do I need this- when thinking of grabbing a given language feature for a given implementation. Do I need a class? What would it give me? If it ends up with considerable savings, then go ahead.
 Now you discover a bug somewhere in this function. How would 
 you go
 about finding where it is? Well, since this is C, which 
 allegedly
 doesn't have any polymorphism, that should be trivial, right? 
 Just trace
 through all of the function calls and narrow it down. Except... 
 all of
 those ops.xxx() calls are function pointers. So OK, we need to 
 find out
 where they point to.
Yeah... That decision was quite bad (especially today!). As you point out, they didn't really avoid polymorphism, they just re-implemented it in a language that doesn't support it natively in its type system, and that is often trouble.

That's why I actually am wary of certain forms of metaprogramming: I don't think it's often a good idea to try to extend a language beyond what its syntax natively supports, not only because it surprises people, but also because it will surprise tools and so on...
 [...] you only
 need to remember add and delete, and the abstraction takes care 
 of
 itself by resolving to the correct overload based on the 
 argument types.
That is not really OO though, or at least not what I mean by OO/OOD. The OO that I would avoid is the "thinking in objects" mindset. Associating functions with types is the simplest form of polymorphism and it's totally fine; you don't even need classes for that!

But even if you face a problem where you do actually need interfaces over objects, go ahead, I'm not saying it's never useful. It shouldn't be the "default" state of mind, though, and certainly how to split computation into objects shouldn't be the way we think about solving problems; that's the bad side of the OO mentality that degenerated into horrors like "patterns". Computation is algorithms that change bits of data.
 Metaprogramming goes one step further and lets you reduce 
 boilerplate --
True, I just wanted to show the tradeoffs that are entailed in that. It can still be useful, but a lot of the time it is just abused, and many, many times we would be better served by a better type system than by having to hack features in via metaprogramming.

I would be much more open to generics if we had bounded templates in C++. And I would use C++11 lambdas, even though I wouldn't have touched the Boost::lambda stuff with a mile-long pole, and so on...
 At the end of the day, *all* tools must be used with care. Even 
 in C, a
 language with neither OO nor metaprogramming support, you can 
 code
 yourself into a nasty mess by using built-in language 
 constructs like
 function pointers, like I showed above. Just because you *can* 
 cause
 serious injury to yourself with a hammer, doesn't mean that 
 hammers
 are inherently evil.
Right, but we live in a world where it seems to me lots of people are hammer happy :)
Jun 17 2014
parent reply "H. S. Teoh via Digitalmars-d" <digitalmars-d puremagic.com> writes:
On Wed, Jun 18, 2014 at 12:13:49AM +0000, c0de517e via Digitalmars-d wrote:
[...]
There was also the complaint from some developers that C is
"superior" to C++ because in C++, a method call in a complex class
hierarchy can theoretically end up "anywhere", whereas in C, you at
least know exactly which function will get called, since there is no
overloading.
That's an extreme opinion, tools are tools, it's not that we always have to operate in the strictest ways possible, but yes I'd say you have to be conscious of the consequences.
Yeah, every time I recall the incident when that person told me that, I regret not taking the chance to point out to him that the same thing happens in C with passing around tables of function pointers everywhere.
 Nowadays my rule of thumb is to ask myself -do I need this- when
 thinking of grabbing a given language feature for a given
 implementation. Do I need a class? What would it give me? If it ends
 up with considerable savings then go ahead.
In Java, you don't have a choice. *Everything* must be in a class, even things that blatantly obviously belong to the global scope, like MyLameApp.main(), Math.sin(), etc. Every time I see a class with only static members, I cringe.
Now you discover a bug somewhere in this function. How would you go
about finding where it is? Well, since this is C, which allegedly
doesn't have any polymorphism, that should be trivial, right? Just
trace through all of the function calls and narrow it down. Except...
all of those ops.xxx() calls are function pointers. So OK, we need to
find out where they point to.
Yeah... That decision was quite bad (especially today!), as you point out they didn't avoid polymorphism really, just re-implemented it in a language that doesn't support it natively in its type system, and that is often trouble.
And this is why D rawks. ;-) All the tools are there, but unlike Java, it doesn't force you to use them where they don't fit. Need to use OO? Sure, D supports classes and interfaces. Need global functions? No problem, we have those too. Need generics? Absolutely, check. None of the above? No problem, C-style coding works too -- even raw pointers!
 That's why I actually am wary of certain forms of metaprogramming, I
 don't think it's often a good idea to try to extend a language beyond
 what its syntax natively supports, not only because it surprises
 people, but also because it will surprise tools and so on...
Actually, D is quite carefully designed in this area. For example, in C++ you can overload <, <=, ==, >=, > to do absolutely crazy things... y'know, like compute your mom's monthly telephone bills by writing a < b, solve the halting problem by writing a <= b, and so on. In D, however, you can't overload <, <=, ==, >=, > to do inconsistent things; instead, you overload opCmp(), and the compiler translates these operators in terms of calls to opCmp().

And while D does support operator overloading, it's deliberately designed to make it easy to support numerical types (which is the whole point of operator overloading, and I'm sure you'll agree, is the one case where operator overloading actually makes sense), but somewhat painful to abuse for other purposes. Instead, if you want a DSL that looks nothing like normal mathematical expressions, D offers you metaprogramming in the form of compile-time string arguments instead. So instead of the C++ horror known as Xpressive, where you write regexes in a way that neither looks like C++ nor regexes, D's std.regex library lets you write:

    auto myRegex = ctRegex!`([^)]+)*`;

which gets processed at compile-time into optimal runtime code. No need for expression templates or any of that insanity so prevalent in C++.
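And using it looks just like using a runtime regex; a tiny sketch with a made-up pattern and input:

    unittest
    {
        import std.regex : ctRegex, matchFirst;

        // The pattern is turned into a matching engine at compile time.
        enum phone = ctRegex!`(\d{3})-(\d{4})`;

        auto m = matchFirst("call 555-1234 tomorrow", phone);
        assert(m[1] == "555" && m[2] == "1234");
    }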
[...]
you only need to remember add and delete, and the abstraction takes
care of itself by resolving to the correct overload based on the
argument types.
That is not really OO though or well not what I mean with OO/OOD. The OO that I would avoid is the "thinking in objects" mindset. Associating functions with types is the simplest form of polymorphism and it's totally fine, you don't even need classes for that! But even if you face a problem where you do actually need interfaces over objects, go ahead, I'm not saying it's never useful. But it shouldn't be the "default" state of mind and certainly how to split computation in objects shouldn't be the way we think about solving problems, that's the bad of OO mentality that degenerated into horrors like "patterns". Computation is algorithms that change bits of data.
I disagree. There are some problem classes for which OO is the best approach. Designing GUI widget hierarchies comes to mind. However, that doesn't mean OO is *always* the right approach, and on that point I agree with you. Nastiness like Java's public static void main() comes to mind -- it's really *not* OO -- it's a global function!!!! even a kid could tell you that -- but it's shoehorned into the OO model because supposedly Java is a "purely OO" language, and therefore shoehorning everything into the OO model is somehow "good".

Similarly, functional programming has a lot going for it -- but it earned a reputation of being obscure and hard to understand, because you are forced to write *everything* in the functional paradigm, even when it's very unnatural to do so -- like I/O-heavy computation that's most straightforward to implement in imperative style. Having to turn that intuitive, easy-to-understand algorithm into functional style requires unnatural convolutions like introducing tail-recursion, monads, etc., akin to being forced to write singleton classes with only static members in Java. You can certainly make it work, but it's just very unnatural.

Thankfully, D is acquiring some rather nice functional-style capabilities, and it's now possible to write in functional style for many cases if you so choose. D's purity system also allows you to use performant, imperative implementations for functional code, which is quite an awesome innovation IMO. One of the things I really like about D is that it doesn't force you to code in a particular style, but it does provide you with the tools to code in that style if you choose to. And different pieces of code written in different styles can interoperate with each other nicely.

[...]
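A small sketch of that purity point (made-up function, not from Phobos): the body is a plain imperative loop, but the signature gives callers the compiler-checked guarantee that no global state is touched:

    // Imperative on the inside, pure to the outside: the compiler
    // rejects any read or write of mutable global state in here.
    pure nothrow @safe
    int sumOfSquares(const(int)[] xs)
    {
        int total = 0;
        foreach (x; xs)
            total += x * x;
        return total;
    }

    unittest
    {
        assert(sumOfSquares([1, 2, 3]) == 14);
    }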
 True, I just wanted to show the tradeoff that are entailed in that.
 Then it can still be useful, but a lot of times is just abused, and
 many, many times we would be better served by a better type system
 than having to hack features via metaprogramming.
 
 I would be much more open to generics if we had in C++ bounded
 templates. And I would use C++11 lambdas, even if I wouldn't have
 touched with a mile-long pole the Boost::lambda stuff, and so on...
[...]

I think your perception is heavily colored by your bad experience with C++. :) Seriously, you should try some metaprogramming in D sometime, and experience for yourself what *real* metaprogramming feels like. Forget about that sorry mess that is C++; try it out afresh in D and see, you might even like it afterwards. ;)

I totally sympathize, 'cos I came from a strong C/C++ background, and C++ templates are just... well, I can say they deserve their bad reputation. But unfortunately, that made people associate metaprogramming with C++'s poor implementation of it, when actually, metaprogramming done right is very pleasant to use.

T

--
Everybody talks about it, but nobody does anything about it! -- Mark Twain
Jun 17 2014
parent reply "c0de517e" <kenpex tin.it> writes:
 I think your perception is heavily colored by your bad 
 experience with
 C++. :)  Seriously, you should try some metaprogramming in D 
 sometime,
 and experience for yourself what *real* metaprogramming feels 
 like.
 Forget about that sorry mess that is C++; try it out afresh in 
 D and
 see, you might even like it afterwards. ;)

 I totally sympathize, 'cos I came from a strong C/C++ 
 background, and
 C++ templates are just... well, I can say they deserve their bad
 reputation. But unfortunately, that made people associate
 metaprogramming with C++'s poor implementation of it, when 
 actually,
 metaprogramming done right is actually very pleasant to use.


 T
I don't doubt that there are forms of metaprogramming that are MUCH better than C++'s; actually, C++ is among the worst. But even in, dunno, a Lisp with hygienic macros, metaprogramming should imho be used with caution, because it's so easy to introduce all kinds of new constructs that look nifty and shorten the code. One has to understand that each time you add one, it's one more foreign syntax that is local to a context and that people have to learn and recognize.
Jun 17 2014
parent reply "H. S. Teoh via Digitalmars-d" <digitalmars-d puremagic.com> writes:
On Wed, Jun 18, 2014 at 01:21:34AM +0000, c0de517e via Digitalmars-d wrote:
[...]
 I don't doubt that there are forms of metaprogramming that are MUCH
 better than C++, actually C++ is among the worst. But even in dunno, a
 lisp with hygienic macros, metaprogramming should be used imho with
 caution, because it's so easy to introduce all kind of new constructs
 that look nifty and shorten the code, but one has to understand that
 each time you add one it's one more foreign syntax that is local to a
 context and people have to learn and recognize
Isn't it the same with having many different functions that do almost exactly the same thing? You're just shifting the context dependency onto the additional cognitive load to remember and recognize all those little variations on the same abstract operation - findInIntArray, findInStringArray, findInFloatArray, findInIntTree, findInFloatTree, findInStringTree, etc.. Why not rather combine them all into a single find() function that works for all those cases?

Not to mention, having to separately implement findInIntArray, findInFloatArray, findInDoubleArray, etc., just increased the amount of almost-duplicate code n times, which means n times more opportunities for typos and bugs. As we all know, humans are very error-prone, so minimizing the opportunities for error is a significant benefit.

And arguably, the metaprogramming solution actually *reduces* the amount people need to learn and recognize, because you learn the abstract operation "find" once, and then you can apply it to all sorts of containers (as long as the find() implementation supports the underlying concrete types). Without metaprogramming you have to implement find() n times, and the people who come after you have to learn it n times.

Now, granted, it *is* possible to abuse metaprogramming by having the int overload of find() do something completely unrelated to the float overload of find(), like one returning the first matching element and the other returning the last matching element, but that's a problem caused by the improper use of metaprogramming, not an inherent problem of metaprogramming itself. You can't blame a hammer for not being able to hammer a nail just because somebody decided to use the wrong end for hammering.

T

--
Perhaps the most widespread illusion is that if we were in power we would behave very differently from those who now hold it---when, in truth, in order to get power we would have to become very much like them. -- Unknown
Jun 17 2014
parent reply "c0de517e" <kenpex tin.it> writes:
On Wednesday, 18 June 2014 at 01:46:02 UTC, H. S. Teoh via 
Digitalmars-d wrote:
 On Wed, Jun 18, 2014 at 01:21:34AM +0000, c0de517e via 
 Digitalmars-d wrote:
 [...]
 I don't doubt that there are forms of metaprogramming that are 
 MUCH
 better than C++, actually C++ is among the worst. But even in 
 dunno, a
 lisp with hygienic macros, metaprogramming should be used imho 
 with
 caution, because it's so easy to introduce all kind of new 
 constructs
 that look nifty and shorten the code, but one has to 
 understand that
 each time you add one it's one more foreign syntax that is 
 local to a
 context and people have to learn and recognize
Isn't it the same with having many different functions that do almost exactly the same thing? You're just shifting the context dependency onto the additional cognitive load to remember and recognize all those little variations on the same abstract operation - findInIntArray, findInStringArray, findInFloatArray, findInIntTree, findInFloatTree, findInStringTree, etc.. Why not rather combine them all into a single find() function that works for all those cases? Not to mention, having to separately implement findInIntArray, findInFloatArray, findInDoubleArray, etc., just increased the amount of almost-duplicate code n times, which means n times more opportunities for typos and bugs. As we all know, humans are very error-prone, so minimizing the opportunities for error is a significant benefit. And arguably, the metaprogramming solution actually *reduces* the amount people need to learn and recognize, because you learn the abstract operation "find" once, and then you can apply it to all sorts of containers (as long as the find() implementation supports the underlying concrete types). Without metaprogramming you have to implement find() n times, and the people who come after you have to learn it n times. Now, granted, it *is* possible to abuse metaprogramming by having the int overload of find() do something completely unrelated to the float overload of find(), like one returning the first matching element and the other returning the last matching element, but that's a problem caused by the improper use of metaprogramming, not an inherent problem of metaprogramming itself. You can't blame a hammer for not being able to hammer a nail just because somebody decided to use the wrong end for hammering. T
Now you're talking about polymorphism again, but I already said polymorphism is rather OK; it's actually not really metaprogramming, it's just fancier typing.
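To show what I mean with a tiny sketch (D syntax, made-up names): the first two functions are just overloading, "fancier typing"; the mixin at the bottom is the kind of compile-time code generation I'd handle with more care:

    // Overloading: each call still resolves to one hand-written body.
    int count(int[] a)  { return cast(int) a.length; }
    int count(string s) { return cast(int) s.length; }

    // Metaprogramming proper: a function that builds code as a string,
    // mixed in at compile time.
    string makeGetter(string field)
    {
        return "int " ~ field ~ "() { return _" ~ field ~ "; }";
    }

    struct Point
    {
        private int _x, _y;
        mixin(makeGetter("x"));   // generates: int x() { return _x; }
        mixin(makeGetter("y"));
    }

    unittest
    {
        assert(count([1, 2, 3]) == 3);
        assert(count("abc") == 3);
        assert(Point(1, 2).y == 2);
    }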
Jun 17 2014
parent reply "H. S. Teoh via Digitalmars-d" <digitalmars-d puremagic.com> writes:
On Wed, Jun 18, 2014 at 02:18:47AM +0000, c0de517e via Digitalmars-d wrote:
 On Wednesday, 18 June 2014 at 01:46:02 UTC, H. S. Teoh via Digitalmars-d
 wrote:
On Wed, Jun 18, 2014 at 01:21:34AM +0000, c0de517e via Digitalmars-d
wrote:
[...]
I don't doubt that there are forms of metaprogramming that are MUCH
better than C++, actually C++ is among the worst. But even in dunno,
a lisp with hygienic macros, metaprogramming should be used imho
with caution, because it's so easy to introduce all kind of new
constructs that look nifty and shorten the code, but one has to
understand that each time you add one it's one more foreign syntax
that is local to a context and people have to learn and recognize
Isn't it the same with having many different functions that do almost exactly the same thing? You're just shifting the context dependency onto the additional cognitive load to remember and recognize all those little variations on the same abstract operation - findInIntArray, findInStringArray, findInFloatArray, findInIntTree, findInFloatTree, findInStringTree, etc.. Why not rather combine them all into a single find() function that works for all those cases? Not to mention, having to separately implement findInIntArray, findInFloatArray, findInDoubleArray, etc., just increased the amount of almost-duplicate code n times, which means n times more opportunities for typos and bugs. As we all know, humans are very error-prone, so minimizing the opportunities for error is a significant benefit. And arguably, the metaprogramming solution actually *reduces* the amount people need to learn and recognize, because you learn the abstract operation "find" once, and then you can apply it to all sorts of containers (as long as the find() implementation supports the underlying concrete types). Without metaprogramming you have to implement find() n times, and the people who come after you have to learn it n times. Now, granted, it *is* possible to abuse metaprogramming by having the int overload of find() do something completely unrelated to the float overload of find(), like one returning the first matching element and the other returning the last matching element, but that's a problem caused by the improper use of metaprogramming, not an inherent problem of metaprogramming itself. You can't blame a hammer for not being able to hammer a nail just because somebody decided to use the wrong end for hammering. T
Now you're talking polymorphism again, but I already said polymorphism is rather ok, and it's actually not really metaprogramming, it's just fancier typing.
Actually I was talking about templates:

    R find(R,T)(R range, T element)
        if (isInputRange!R && is(ElementType!R : T))
    {
        while (!range.empty)
        {
            if (range.front == element)
                break;
            range.popFront();
        }
        return range;
    }

That's a template function that searches an arbitrary input range for an arbitrary element type. It captures the essence of linear search in a generic form, and thereafter you never have to write linear search again; you just implement types that conform to the input range API (i.e., with .empty, .front, .popFront members with the appropriate semantics), or appropriate helper functions on native types, and call find() on it. It can search arrays of any type, linked lists, input streams, network sockets, *any* type that implements input range primitives.

Without metaprogramming, you'd have to implement n versions of find() for arrays -- one for each element type you'd want to support, n versions for linked lists, n versions for network sockets, etc., with k*n opportunities for bugs, typos, and boilerplate.

Is this the kind of metaprogramming you're referring to, or did you have something else in mind?

T

--
Recently, our IT department hired a bug-fix engineer. He used to work for Volkswagen.
Jun 17 2014
next sibling parent "xenon325" <anm programmer.net> writes:
On Wednesday, 18 June 2014 at 05:20:39 UTC, H. S. Teoh via 
Digitalmars-d wrote:
 On Wed, Jun 18, 2014 at 02:18:47AM +0000, c0de517e via
 Now you're talking polymorphism again, [...] and it's actually 
 not really metaprogramming, it's just
 fancier typing.
Actually I was talking about templates:
You can do that with interfaces. Though there are problems:
1. built-in and value types
2. indirection, compared to templates.
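A rough sketch of the interface route, using Phobos' InputRange!int and inputRangeObject (both reachable through std.range; findObj is just a made-up name): the built-in array has to be boxed into a class object first, and every front/popFront afterwards is a virtual call.

	import std.range : InputRange, inputRangeObject;
	import std.stdio;

	// One compiled body for all int ranges, but element access is indirect.
	InputRange!int findObj(InputRange!int range, int element)
	{
		while (!range.empty)              // virtual call
		{
			if (range.front == element)   // virtual call
				break;
			range.popFront();             // virtual call
		}
		return range;
	}

	void main()
	{
		auto r = inputRangeObject([1, 2, 3, 4]);  // box the value-typed array
		writeln(findObj(r, 3).front);             // prints 3
	}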
Jun 18 2014
prev sibling parent reply "c0de517e" <kenpex tin.it> writes:
 Actually I was talking about templates:

 	R find(R,T)(R range, T element)
 		if (isInputRange!R && is(ElementType!R : T))
 	{
 		while (!range.empty)
 		{
 			if (range.front == element)
 				break;
 			range.popFront();
 		}
 		return range;
 	}
http://en.wikipedia.org/wiki/Parametric_polymorphism

C++ templates are more general than that; they do express this kind of polymorphism, but that can be done without "full" metaprogramming (the Turing-complete ability to generate code at compile time). A bounded parametric type is better than templates (C++ concepts are an attempt to patch templates with bounds).

Also notice that really all this is expressible even in languages that have only dynamic polymorphism (subtyping) without performance penalties (YES REALLY).

People think that implementing interfaces is for some reason inherently slower than templates, the same way they believe function pointers are slower than functors. It's FALSE. The ONLY reason why templates and functors can be faster is because they are always inline: the compiler knows exactly what to call without indirections. But that's only WORSE than the "indirect" alternatives, because interfaces and pointers afford you the option not to resolve everything statically, yet if you want you can always inline everything (put the implementation in headers) and the compiler will know perfectly well that it can directly call a given function without overhead...
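In D terms the claim might look like the sketch below (illustrative only; whether a given compiler actually devirtualizes the interface call depends on the optimizer and its flags):

	import std.stdio;

	interface Shape
	{
		double area();
	}

	final class Circle : Shape
	{
		double radius;
		this(double radius) { this.radius = radius; }
		double area() { return 3.14159 * radius * radius; }
	}

	// Static polymorphism: the concrete type is a template parameter,
	// so the call to area() is direct and trivially inlinable.
	double total(S)(S[] shapes)
	{
		double sum = 0;
		foreach (s; shapes)
			sum += s.area();
		return sum;
	}

	// Dynamic polymorphism: the call goes through the interface. When the
	// optimizer can see the concrete type (final class, visible body), it
	// is free to devirtualize and inline this just the same.
	double totalDyn(Shape[] shapes)
	{
		double sum = 0;
		foreach (s; shapes)
			sum += s.area();
		return sum;
	}

	void main()
	{
		auto circles = [new Circle(1.0), new Circle(2.0)];
		Shape[] shapes;
		foreach (c; circles)
			shapes ~= c;
		writeln(total(circles), " ", totalDyn(shapes));
	}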
Jun 18 2014
next sibling parent "Kagamin" <spam here.lot> writes:
On Wednesday, 18 June 2014 at 07:58:57 UTC, c0de517e wrote:
 People think that implementing interfaces is for some reason 
 inherently slower than templates, the same they believe 
 function pointers are slower than functors. It's FALSE. The 
 ONLY reason why templates and functors can be faster is because 
 they are always inline, the compiler knows exactly what to call 
 without indirections.
The processor needs to know what to call too; when it doesn't, it stalls. That means the code executes slower.
Jun 18 2014
prev sibling next sibling parent reply "H. S. Teoh via Digitalmars-d" <digitalmars-d puremagic.com> writes:
On Wed, Jun 18, 2014 at 07:58:56AM +0000, c0de517e via Digitalmars-d wrote:
 
Actually I was talking about templates:

	R find(R,T)(R range, T element)
		if (isInputRange!R && is(ElementType!R : T))
	{
		while (!range.empty)
		{
			if (range.front == element)
				break;
			range.popFront();
		}
		return range;
	}
http://en.wikipedia.org/wiki/Parametric_polymorphism

C++ Templates are more general than that, they do express this kind of polymorphism but that can be done without "full" metaprogramming (turing-complete ability of generating code at compile time).
You're talking about compile-time codegen? Like D's ctRegex perhaps?

	import std.regex;

	// Statically generates a regex engine that matches the given
	// expression.
	auto r = ctRegex!`(a+b(cd*)+)?z`;

I find this extremely awesome, actually. It's self-documenting (ctRegex tells you it's a compile-time generated regex engine; the binary '!' tells you it's a compile-time argument; the regex syntax is confined inside the quoted ``-string, and doesn't spill out into language-level operators, unlike C++'s Xpressive horror), and it's maximally efficient because the matching engine optimization happens at compile-time, whereas most regex libraries do the regex compilation at runtime.

[...]
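For comparison, a minimal sketch of the compile-time and run-time engines side by side (the date-ish pattern is just an arbitrary example):

	import std.regex;
	import std.stdio;

	void main()
	{
		// Matching engine generated and optimized at compile time.
		auto ctr = ctRegex!`^(\d{4})-(\d{2})-(\d{2})$`;
		// Same pattern, compiled when this line runs.
		auto rtr = regex(`^(\d{4})-(\d{2})-(\d{2})$`);

		auto m = matchFirst("2014-06-18", ctr);
		if (!m.empty)
			writeln(m[1], " / ", m[2], " / ", m[3]);   // 2014 / 06 / 18

		assert(!matchFirst("2014-06-18", rtr).empty);  // same result, built at runtime
	}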
 Also notice that really all this is expressible even in languages that
 have only dynamic polymorphism (subtyping) without performance
 penalties (YES REALLY).
 
 People think that implementing interfaces is for some reason
 inherently slower than templates, the same they believe function
 pointers are slower than functors. It's FALSE. The ONLY reason why
 templates and functors can be faster is because they are always
 inline, the compiler knows exactly what to call without indirections.
 But that's only WORSE than the "indirect" alternatives, because
 interfaces and pointers afford you the option not to resolve
 everything statically, but if you want you can always inline
 everything (put the implementation in headers) and the compiler will
 perfectly know that it can directly call a given function without
 overhead...
This is a strawman argument. A template *can* be instantiated with base class (or interface) arguments, and then you get *both* compile-time *and* runtime polymorphism from the same template, i.e., the best of both worlds.

T

-- 
Beware of bugs in the above code; I have only proved it correct, not tried it. -- Donald Knuth
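A minimal sketch of that point, with made-up Animal/Dog/Cat types: the same template body serves both the concrete-type and the interface-type instantiation.

	import std.stdio;

	interface Animal
	{
		string noise();
	}

	class Dog : Animal { string noise() { return "woof"; } }
	class Cat : Animal { string noise() { return "meow"; } }

	// One template; T can be a concrete class or the interface itself.
	void speak(T)(T[] animals)
	{
		foreach (a; animals)
			writeln(a.noise());
	}

	void main()
	{
		speak([new Dog, new Dog]);   // T = Dog: one instantiation per concrete type

		Animal[] zoo = [cast(Animal) new Dog, new Cat];
		speak(zoo);                  // T = Animal: one instantiation, runtime dispatch inside
	}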
Jun 18 2014
parent "c0de517e" <kenpex tin.it> writes:
 You're talking about compile-time codegen? Like D's ctRegex 
 perhaps?

 	import std.regex;

 	// Statically generates a regex engine that matches the given
 	// expression.
 	auto r = ctRegex!`(a+b(cd*)+)?z`;
Looks nifty. As I said it's not that I want to ban a given technique from ever being used.
 This is a strawman argument. A template *can* be instantiated 
 with base
 class (or interface) arguments, and then you get *both* 
 compile-time
 *and* runtime polymorphism from the same template, i.e., the 
 best of
 both worlds.
Which benefits? Given that if I call an inline function with a type that derives from an interface, the compiler knows the concrete type and doesn't need to go through the interface indirection, the performance of the "dynamic" approach is the same as the "static" one, so why would you need the static at all?

But in C++ both approaches are actually very weak attempts at emulating better polymorphism. Templates are a loose, complex copy'n'paste engine that has no constraints on the types you instance them with. Interfaces give you the constraints, but force you to go through classes.

See some alternatives:
- http://www.haskell.org/tutorial/classes.html
- http://caml.inria.fr/pub/docs/oreilly-book/html/book-ora018.html
Jun 18 2014
prev sibling next sibling parent "Dicebot" <public dicebot.lv> writes:
On Wednesday, 18 June 2014 at 07:58:57 UTC, c0de517e wrote:
 People think that implementing interfaces is for some reason 
 inherently slower than templates, the same they believe 
 function pointers are slower than functors. It's FALSE. The 
 ONLY reason why templates and functors can be faster is because 
 they are always inline, the compiler knows exactly what to call 
 without indirections. But that's only WORSE than the "indirect" 
 alternatives, because interfaces and pointers afford you the 
 option not to resolve everything statically, but if you want 
 you can always inline everything (put the implementation in 
 headers) and the compiler will perfectly know that it can 
 directly call a given function without overhead...
I don't think it is that simple. What you speak about is only possible if the compiler knows the full source code of the whole application, including all possible dynamically loaded libraries. For compiled languages that puts such optimization out of practical consideration.
Jun 18 2014
prev sibling parent "Dicebot" <public dicebot.lv> writes:
Also I think all this discussion about templates and generics totally misses the point about meta-programming. It is not just about code generation or replacing a few type declarations; the main thing is compile-time reflection. The fact that we use templates is just a mere implementation detail. What is important is being able to express complicated abstract relations between parts of your program and allowing the compiler to both verify them and optimize based on that information. Boilerplate elimination without such verification is not nearly as tempting.
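A small sketch of what that buys in practice (Vec3 and dump are made-up names; .tupleof and __traits are the compile-time reflection primitives): the compiler checks every member access while it generates the code, and nothing is left to runtime type information.

	import std.conv : to;
	import std.stdio;

	// Walk the fields of any struct and build a textual dump of it.
	string dump(T)(T value) if (is(T == struct))
	{
		string s = T.stringof ~ "(";
		foreach (i, member; value.tupleof)   // unrolled at compile time
		{
			static if (i > 0)
				s ~= ", ";
			s ~= __traits(identifier, T.tupleof[i]) ~ "=" ~ to!string(member);
		}
		return s ~ ")";
	}

	struct Vec3 { float x, y, z; }

	void main()
	{
		writeln(dump(Vec3(1, 2, 3)));   // prints: Vec3(x=1, y=2, z=3)
	}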
Jun 18 2014
prev sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 6/17/2014 3:20 PM, c0de517e wrote:
 The issue I have with metaprogramming (and overloading and some other similar
 ideas) is that it makes a statement dependent on a lot of context, this is
 tricky in a large team as now just reading a change doesn't really tell much.
 Our is an industry where we still exercise a lot of control, we want to know
 exactly what a statement does in terms of how it's executed.
It's a fair criticism. On the other hand, we've already given up on a great deal of knowing exactly what a statement does, even in C. How many of us program in assembly anymore? How many of us can even make sense of assembly code?

It is absolutely necessary to move to higher levels of abstraction in order to handle the increasing complexity of modern programs. Proper use of metaprogramming reduces complexity and reduces programming bugs. And yes, the price paid for that is you'll need to put more trust in the metaprogramming tools to "do the right thing" with your intention, just like modern programs now trust the compiler.
Jun 17 2014
next sibling parent reply "c0de517e" <kenpex tin.it> writes:
 On the other hand, we've already given up on a great deal of 
 knowing exactly what a statement does, even in C. How many of 
 us program in assembly anymore? How many of us can even make 
 sense of assembly code?

 It is absolutely necessary to move to higher levels of 
 abstraction in order to handle the increasing complexity of 
 modern programs. Proper use of metaprogramming reduces 
 complexity and reduces programming bugs. And yes, the price 
 paid for that is you'll need to put more trust in the 
 metaprogramming tools to "do the right thing" with your 
 intention, just like modern programs now trust the compiler.
Mine is not a campaign to eliminate metaprogramming, nor even OO. Nor was the language post a critique of D, Rust or Go. The intention is to make people -aware- of certain issues, to then make better motivated choices and not end up thinking stuff like this is cool:

http://www.boost.org/doc/libs/1_55_0b1/libs/geometry/doc/html/geometry/design.html

(sorry, I've linked this a few times now but it's really so outrageous I want to punch people in the face - also notice how after all that crap the example code manages to forget about http://en.cppreference.com/w/cpp/numeric/math/hypot)

Also from a language perspective I'd say that if certain things can be part of the type system instead of done via metaprogramming, that is much better (boost::lambda vs C++11 lambda), because it becomes standard, it can have a specific syntax to give meaning to certain statements, tools can recognize it, and so on.
Jun 17 2014
parent Walter Bright <newshound2 digitalmars.com> writes:
On 6/17/2014 11:54 PM, c0de517e wrote:
 The intention is to make people -aware- of certain issues, to then make better
 motivated choices and not end up thinking stuff like this is cool
 http://www.boost.org/doc/libs/1_55_0b1/libs/geometry/doc/html/geometry/design.html
 (sorry, I've linked this a few times now but it's really so outrageous I want
to
Thanks, that link is pure awesomeness in its awfulness!
 punch people in the face - also notice how after all that crap the example code
 manages to forget about http://en.cppreference.com/w/cpp/numeric/math/hypot)
 Also on a language perspective I'd say that if certain things can be part of
the
 type system instead of done via metaprogramming, that is much better
 (boost::lambda vs c++11 lambda) because it becomes standard, it can have a
 specific syntax to give meaning to certain statements, tools can recognize it
 and so on.
I disagree with that because it makes the language into a kitchen sink grab bag of features. It's better to put enabling features into the language and have the standard library define standard forms.
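One way to read that split in today's D (a minimal sketch: lambdas are the enabling language feature, map/filter are standard forms defined entirely in the library):

	import std.algorithm : filter, map;
	import std.range : iota;
	import std.stdio;

	void main()
	{
		// No boost::lambda-style expression templates needed: the language
		// supplies real lambdas, the library composes them.
		auto squaresOfOdds = iota(1, 10)
			.filter!(x => x % 2 == 1)
			.map!(x => x * x);
		writeln(squaresOfOdds);   // [1, 9, 25, 49, 81]
	}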
Jun 18 2014
prev sibling parent reply "c0de517e" <kenpex tin.it> writes:
 It is absolutely necessary to move to higher levels of 
 abstraction in order to handle the increasing complexity of 
 modern programs.
And this is 100% right, but people should be educated about "premature abstraction". Have you seen the horrors of "generalization"? Especially C++ neophytes get so excited by pointless generalization, then some grow out of it (studying other languages also helps) but some never do. We write a lot about powerful techniques and neat applications, but often forget to show the perils and tradeoffs.
Jun 18 2014
next sibling parent reply "deadalnix" <deadalnix gmail.com> writes:
On Wednesday, 18 June 2014 at 07:00:44 UTC, c0de517e wrote:
 It is absolutely necessary to move to higher levels of 
 abstraction in order to handle the increasing complexity of 
 modern programs.
And this is 100% right, but people should be educated about "premature abstraction". Have you seen the horrors of "generalization"?
I call them architecture astronauts. To avoid that pitfall, I have adopted the following method:
- Do whatever you need to get to the next step toward your goal.
- Pause and observe. Are there repeated patterns? Is some part of the code growing in complexity? Does part of the code rely on fragile hacks? Is part of the code doing useless work? etc...
- If yes, refactor accordingly, either by adding a new level of abstraction or, better, by changing the interface of some abstraction to better fit its actual usage (and not the one you were thinking you'd get at first).

This has allowed me to avoid creating useless abstractions, and pushed me to refine existing ones.
 Especially C++ neophytes get so excited by pointless 
 generalization, then some grow out of it (studying other 
 languages also helps) but some never do.
I used to be like that :D
Jun 18 2014
parent "Daniel Murphy" <yebbliesnospam gmail.com> writes:
"deadalnix"  wrote in message news:qawhxkzqdgzjwylzrhdf forum.dlang.org...

 I call them architecture astronautes. To avoid that pitfall, I have a 
 adopted the following method :
   - Do whatever you need to to get to the next step toward you goal.
   - Make a pause and observe. Is there some repeated patterns ? Is there 
 some part of the code that is growing in complexity ? Do part of the code 
 rely on fragile hacks ? Is part of the code doing useless work ? etc...
   - If yes, refactor accordingly, either by adding new level of 
 abstraction, but better, by changing the interface of some abstraction to 
 better fit its actual usage (and not the you were thinking you'd get at 
 first).

 This allowed me to avoid creating useless abstraction, and push me to 
 refine existing ones.
This is exactly the approach I use and I've found the extra time spent on refactoring is well worth the time saved on not implementing things you don't need or that don't match the needs of the problem you're solving. It does help that D is rather easy to refactor, and I'm much better at coming up with a good design after I've half-implemented it.
Jun 18 2014
prev sibling parent "H. S. Teoh via Digitalmars-d" <digitalmars-d puremagic.com> writes:
On Wed, Jun 18, 2014 at 07:00:43AM +0000, c0de517e via Digitalmars-d wrote:
It is absolutely necessary to move to higher levels of abstraction in
order to handle the increasing complexity of modern programs.
And this is 100% right, but people should be educated about "premature abstraction". Have you seen the horrors of "generalization"?
There's no need to call a hammer useless because (some) people insist on using the wrong end to hit the nail. It's not the language's job to educate people how to do things right; you should direct your complaints at CS instructors instead. The language's job is to provide the necessary tools to get the task done, whatever it may be. Excluding some tools because some people don't know how to use them properly just handicaps the language unnecessarily.
 Especially C++ neophytes get so excited by pointless generalization,
 then some grow out of it (studying other languages also helps) but
 some never do.
 
 We write a lot about powerful techniques and neat applications, but
 often forget to show the perils and tradeoffs.
So your complaints are really directed at how people use the language, rather than the language itself. I don't think it's the language's job to dictate to the user what to do -- Java tried to do that with OO, and the result is so straitjacketed I feel like pulling out my hair every time I use it. I find D far better in the sense of providing all the tools to get the job done, and then STANDING BACK and letting me use the tools as I see fit, instead of dictating a particular way of doing things.

Now whether people are competent enough to use the language properly -- that's not really the concern of language design, it's a problem of education. These are really two distinct issues. Using the lack of education as evidence for poor language design -- I just don't follow this kind of reasoning.

T

-- 
Life would be easier if I had the source code. -- YHL
Jun 18 2014
prev sibling next sibling parent reply "bearophile" <bearophileHUGS lycos.com> writes:
Peter Alexander:
 http://c0de517e.blogspot.ca/2014/06/where-is-my-c-replacement.html?m=1
From the post:
If I have to point at what is most needed for productivity, I'd 
say interactivity. Interactive visualization, manipulation, 
REPLs, exploratory programming, live-coding.<
A language that has both ~native efficiency and is usable for that level of interactivity is Julia :-)

Bye,
bearophile
Jun 15 2014
parent reply "Brian Rogoff" <brogoff gmail.com> writes:
On Sunday, 15 June 2014 at 12:20:13 UTC, bearophile wrote:
 Peter Alexander:
 http://c0de517e.blogspot.ca/2014/06/where-is-my-c-replacement.html?m=1
From the post:
If I have to point at what is most needed for productivity, I'd 
say interactivity. Interactive visualization, manipulation, 
REPLs, exploratory programming, live-coding.<
A language has both ~native efficiency and is usable for that level of interactivity is Julia :-)
My own experiments with Julia massively contradict that statement. I wrote some basic scripting programs that read large text files into hash tables which count word occurrences and Julia's performance was abysmal compared to D and Java.
Jun 15 2014
parent "bearophile" <bearophileHUGS lycos.com> writes:
Brian Rogoff:

 On Sunday, 15 June 2014 at 12:20:13 UTC, bearophile wrote:
 A language has both ~native efficiency and is usable for that 
 level of interactivity is Julia :-)
My own experiments with Julia massively contradict that statement. I wrote some basic scripting programs that read large text files into hash tables which count word occurrences and Julia's performance was abysmal compared to D and Java.
Julia is in its infancy, while the JavaVM is the product of a lot of work and tuning. I even expect Julia associative arrays to be currently a little slower than CPython dicts :-) So it will take years. But neither D nor Java is as dynamic as Julia.

Bye,
bearophile
Jun 15 2014
prev sibling next sibling parent reply "Abdulhaq" <alynch4047 gmail.com> writes:
On Sunday, 15 June 2014 at 11:28:12 UTC, Peter Alexander wrote:
 http://c0de517e.blogspot.ca/2014/06/where-is-my-c-replacement.html?m=1

 The arguments against D are pretty weak if I'm honest, but I 
 think it's important we understand what people think of D. I 
 can confirm this sentiment is fairly common in the industry.

 Watch out for the little jab at Andrei :-P
Reading his summary of the alternatives I felt D came out clearly on top; it's just that he didn't have the motivation to switch.

Towards the end he mentions the web. For me (as an application developer rather than a systems-level guy) Android/iOS is the fly in the ointment - I'm torn as to whether to invest my energies in following D through its explorations or knuckling down and learning the Android API - after all, JDK8 + tooling is bearable now.
Jun 15 2014
parent reply Russel Winder via Digitalmars-d <digitalmars-d puremagic.com> writes:
On Sun, 2014-06-15 at 12:30 +0000, Abdulhaq via Digitalmars-d wrote:
[…]
 learning the Android API - after all, JDK8 + tooling is bearable 
 now.
On the other hand Android API is Apache Harmony which is Java 6.

Of note: Groovy finally works on Android, so you can use what Java 8 brings, on Java 6 and Java 7, using Groovy. And note Groovy may be a dynamic language, but it is also a static language.

-- 
Russel.
=============================================================================
Dr Russel Winder      t: +44 20 7585 2200   voip: sip:russel.winder ekiga.net
41 Buckmaster Road    m: +44 7770 465 077   xmpp: russel winder.org.uk
London SW11 1EN, UK   w: www.russel.org.uk  skype: russel_winder
Jun 15 2014
parent reply "Abdulhaq" <alynch4047 gmail.com> writes:
On Sunday, 15 June 2014 at 13:19:12 UTC, Russel Winder via 
Digitalmars-d wrote:
 On Sun, 2014-06-15 at 12:30 +0000, Abdulhaq via Digitalmars-d 
 wrote:
 […]
 learning the Android API - after all, JDK8 + tooling is 
 bearable now.
On the other hand Android API is Apache Harmony which is Java 6.
Yes I keep forgetting that - wishful thinking maybe.
 Of note: Groovy finally works on Android, so you can use what 
 Java 8
 brings, on Java 6 and Java 7 using Groovy. And note Groovy may 
 be a
 dynamic language, but it is also a static language.
I'll look into it. Perhaps this question is just too broad, but if you wanted to develop an application on the Android platform right now, what approach would you take? Java, Groovy, web-based?
Jun 15 2014
parent reply "Paulo Pinto" <pjmlp progtools.org> writes:
On Sunday, 15 June 2014 at 16:42:22 UTC, Abdulhaq wrote:
 On Sunday, 15 June 2014 at 13:19:12 UTC, Russel Winder via 
 Digitalmars-d wrote:
 On Sun, 2014-06-15 at 12:30 +0000, Abdulhaq via Digitalmars-d 
 wrote:
 […]
 learning the Android API - after all, JDK8 + tooling is 
 bearable now.
On the other hand Android API is Apache Harmony which is Java 6.
Yes I keep forgetting that - wishful thinking maybe.
 Of note: Groovy finally works on Android, so you can use what 
 Java 8
 brings, on Java 6 and Java 7 using Groovy. And note Groovy may 
 be a
 dynamic language, but it is also a static language.
I'll look into it. Perhaps this question is just too broad, but if you wanted to develop an application on the Android platform right now, what approach would you take? Java, Groovy, web-based?
I have played around with C++ for a small graphics application, but note that the NDK only supports game-related APIs.

When using middleware like Qt, you have access to the majority of APIs, but then you have to pay the JNI marshaling cost.

--
Paulo
Jun 16 2014
parent "Ola Fosheim =?UTF-8?B?R3LDuHN0YWQi?= writes:
On Monday, 16 June 2014 at 08:22:59 UTC, Paulo Pinto wrote:
 I have played around with C++ for a small graphics application, 
 but note that the NDK does only support game related APIs.

 When using middleware like Qt, you have access to the majority 
 of APIs but then have to pay the JNI marshaling cost.
Yep, this is true. For a while I believed that the Swedish cross-platform product MoSync would pull it off by compiling C++ to Java etc. It was pretty nice for what it aimed to do, but apparently the market was not ready for it, and the Android/iPhone/Windows platforms started to diverge their UIs at a fast rate, making cross-platform design difficult.

You also have the Marmalade SDK, which allows cross-platform game coding in C/C++, but it costs real money (coming from a game studio).

Anyway, with Swift out, writing regular non-visual apps in that language makes the most sense, then porting them to Java. C++ can be used for backend engines, but for anything in the user interface it makes more sense to either special-case it for the native APIs or go HTML5.
Jun 16 2014
prev sibling next sibling parent reply "Brian Rogoff" <brogoff gmail.com> writes:
On Sunday, 15 June 2014 at 11:28:12 UTC, Peter Alexander wrote:
 http://c0de517e.blogspot.ca/2014/06/where-is-my-c-replacement.html?m=1

 The arguments against D are pretty weak if I'm honest, but I 
 think it's important we understand what people think of D. I 
 can confirm this sentiment is fairly common in the industry.
Notice that in his post and the comments, a recurring (negative) issue is garbage collection. This is pretty common with mentions of D on reddit too, always a few posters mentioning D's GC as a negative. So many of those comments could be made obsolete by a decent precise garbage collector, and perhaps a compiler switch like the 'noruntime' one that Walter proposed a few months ago. On the plus side, D is mentioned prominently and in the comparison with C++ template programming D really shines.
Jun 15 2014
next sibling parent "Peter Alexander" <peter.alexander.au gmail.com> writes:
On Sunday, 15 June 2014 at 15:31:40 UTC, Brian Rogoff wrote:
 On Sunday, 15 June 2014 at 11:28:12 UTC, Peter Alexander wrote:
 http://c0de517e.blogspot.ca/2014/06/where-is-my-c-replacement.html?m=1

 The arguments against D are pretty weak if I'm honest, but I 
 think it's important we understand what people think of D. I 
 can confirm this sentiment is fairly common in the industry.
Notice that in his post and the comments, a recurring (negative) issue is garbage collection. This is pretty common with mentions of D on reddit too, always a few posters mentioning D's GC as a negative. So many of those comments could be made obsolete by a decent precise garbage collector, and perhaps a compiler switch like the 'noruntime' one that Walter proposed a few months ago.
I don't think a precise GC would fix this particular complaint. Games industry folks just don't like GC, mostly because of the pause, but also because of memory scarcity.

I believe @nogc is going into 2.066. Maybe that will help, but we also need to make sure Phobos actually compiles with @nogc :-)
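A minimal sketch of what the attribute buys, assuming a 2.066-or-later compiler (the commented-out line is the kind of thing it rejects):

	import std.stdio;

	// GC allocations inside this function become compile errors.
	@nogc int sum(const int[] values)
	{
		int total = 0;
		foreach (v; values)
			total += v;
		return total;
		// auto tmp = new int[8];   // error: cannot use 'new' in a @nogc function
	}

	void main()
	{
		int[4] buf = [1, 2, 3, 4];
		writeln(sum(buf[]));   // 10 (writeln itself isn't @nogc, so main isn't either)
	}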
Jun 15 2014
prev sibling next sibling parent reply "Dicebot" <public dicebot.lv> writes:
On Sunday, 15 June 2014 at 15:31:40 UTC, Brian Rogoff wrote:
 On Sunday, 15 June 2014 at 11:28:12 UTC, Peter Alexander wrote:
 http://c0de517e.blogspot.ca/2014/06/where-is-my-c-replacement.html?m=1

 The arguments against D are pretty weak if I'm honest, but I 
 think it's important we understand what people think of D. I 
 can confirm this sentiment is fairly common in the industry.
Notice that in his post and the comments, a recurring (negative) issue is garbage collection. This is pretty common with mentions of D on reddit too, always a few posters mentioning D's GC as a negative. So many of those comments could be made obsolete by a decent precise garbage collector, and perhaps a compiler switch like the 'noruntime' one that Walter proposed a few months ago. On the plus side, D is mentioned prominently and in the comparison with C++ template programming D really shines.
Another thing I have found funny is how he both mentions GC as an issue and favors Go (with mandatory GC) over Rust (dismissing its memory model as irrelevant).

This post really reads more like a casual rant than a well-established opinion.
Jun 15 2014
parent "Brian Rogoff" <brogoff gmail.com> writes:
On Sunday, 15 June 2014 at 16:02:18 UTC, Dicebot wrote:
 Another thing I have found funny is that how he both mentions 
 GC as an issue an favors Go (with mandatory GC) over Rust 
 (dismissing it memory model as irrelevant).
Well, he mentioned that Go's mandatory GC is a negative in game dev, and only a positive vis-a-vis Rust in that Rust requires some advanced type machinery to ensure memory safety sans GC. GC does have large pluses and minuses, so it's easy to contradict oneself when discussing it.

I think D would have been better off not requiring it, but trying to be GC-friendly (like Rust and Ada); but that ship sailed a long time ago. Now I'd just like to see D acquire a very good GC, plus the ability to easily write code which doesn't use it, or uses a specialized one, or turns it off, etc.
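For what it's worth, a couple of those knobs exist today; a rough sketch using core.memory.GC.disable and plain C malloc (allocator design work left aside):

	import core.memory : GC;
	import core.stdc.stdlib : free, malloc;
	import std.stdio;

	void main()
	{
		// Switch collections off around a latency-sensitive section...
		GC.disable();
		scope (exit) GC.enable();

		// ...and/or take some allocations into your own hands entirely.
		enum n = 1024;
		int* buf = cast(int*) malloc(n * int.sizeof);
		assert(buf !is null);
		scope (exit) free(buf);
		buf[0 .. n] = 0;

		writeln("filled ", n, " ints");
	}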
 This post really reads more like a casual rant than 
 well-established opinion.
Agreed, but it's on a topic dear to all of us :-) I'm pretty optimistic about D's future. The negatives in that rant weren't so bad.
Jun 15 2014
prev sibling parent "justme" <justme example.com> writes:
On Sunday, 15 June 2014 at 15:31:40 UTC, Brian Rogoff wrote:
 Notice that in his post and the comments, a recurring 
 (negative) issue is garbage collection. This is pretty common 
 with mentions of D on reddit too, always a few posters 
 mentioning D's GC as a negative. So many of those comments 
 could be made obsolete by a decent precise garbage collector,
D prides itself on being a "no forced paradigms" language. You don't have to write everything in OO, or everything in functional style, or do everything with metaprogramming. D is a "practical programming language for practical programmers".

Now, D still not working properly and fluently without the GC negates the above sales talk. OS people and games people, as well as real-time people, all have legitimate reasons for needing to ditch the GC in their work.

I know the GC has been a religious issue from day 1 with D, but that has to change. We got rid of the bit data type, so we should be able to relegate the GC to the optional category. Having done that, we will find most programs will use the GC, but not all, and not everywhere, just like there will be some OO and some functional style and some templates in almost all programs.
Jun 16 2014
prev sibling next sibling parent reply Caligo via Digitalmars-d <digitalmars-d puremagic.com> writes:
I can't take a blog post seriously when it's poorly written and full of
grammatical errors.  If you are in an engineering field of any kind, and
you can't construct a paragraph in your favorite natural language, you're
not worth anyone's time.  The author of that blog is nothing but a
sophisticated idiot who should not be taken seriously.

I'm so sick of watching narcissistic cunts who just love to broadcast their
opinions, enough said.


On Sun, Jun 15, 2014 at 6:28 AM, Peter Alexander via Digitalmars-d <
digitalmars-d puremagic.com> wrote:

 http://c0de517e.blogspot.ca/2014/06/where-is-my-c-replacement.html?m=1

 The arguments against D are pretty weak if I'm honest, but I think it's
 important we understand what people think of D. I can confirm this
 sentiment is fairly common in the industry.

 Watch out for the little jab at Andrei :-P
Jun 15 2014
next sibling parent reply "Xinok" <xinok live.com> writes:
On Sunday, 15 June 2014 at 15:37:51 UTC, Caligo via Digitalmars-d 
wrote:
 I can't take a blog post seriously when it's poorly written and 
 full of
 grammatical errors.  If you are in an engineering field of any 
 kind, and
 you can't construct a paragraph in your favorite natural 
 language, you're
 not worth anyone's time.  The author of that blog is nothing 
 but a
 sophisticated idiot who should not be taken seriously.
Location: Italy
Qualifications: Rendering Engineer
https://www.blogger.com/profile/01477408942876127202

Given that he lives in Italy, it's safe to assume that English is not his first language. But rather than consider what he has to say or dispute his arguments, you completely dismissed his point of view because his level of writing doesn't meet your standards. Furthermore, you unjustly called him a "sophisticated idiot" and "narcissistic cunt". You've only shown yourself to be the ignorant one.
Jun 15 2014
next sibling parent Caligo via Digitalmars-d <digitalmars-d puremagic.com> writes:
I didn't make the assumption that English is his mother tongue; however,
judging by his writing, I can tell that he's been using the English
language for at least a few years.  In any case, the idea of a sentence or
a paragraph is not unique to the English language.  You learn what a
sentence is and how to construct a paragraph at an early age, regardless of
the language.  I think it would be difficult to argue that producing a
correct paragraph in a natural language is harder than producing a
"paragraph" in an artificial language.  If someone like him is still
struggling with sentences and paragraphs in a natural language, why should
anyone pay attention to what he has to say about programming languages?


On Sun, Jun 15, 2014 at 11:20 AM, Xinok via Digitalmars-d <
digitalmars-d puremagic.com> wrote:

 Location: Italy
 Qualifications: Rendering Engineer
 https://www.blogger.com/profile/01477408942876127202

 Given that he lives in Italy, it's safe to assume that English is not his
 first language. But rather than consider what he has to say or dispute his
 arguments, you completely dismissed his point of view because his level of
 writing doesn't meet your standards. Furthermore, you unjustly called him a
 "sophisticated idiot" and "narcissistic cunt". You've only shown yourself
 to be the ignorant one.
Jun 15 2014
prev sibling next sibling parent "Jesse Phillips" <Jesse.K.Phillips+D gmail.com> writes:
On Sunday, 15 June 2014 at 16:20:28 UTC, Xinok wrote:
 Location: Italy
 Qualifications: Rendering Engineer
 https://www.blogger.com/profile/01477408942876127202

 Given that he lives in Italy, it's safe to assume that English 
 is not his first language. But rather than consider what he has 
 to say or dispute his arguments, you completely dismissed his 
 point of view because his level of writing doesn't meet your 
 standards. Furthermore, you unjustly called him a 
 "sophisticated idiot" and "narcissistic cunt". You've only 
 shown yourself to be the ignorant one.
Agreed, this culture of "be perfect or I won't listen to you" is annoying. Natural language is not a well-specified language like programming (where the computer truly can't do anything but exactly what you tell it). Natural language is only useful as a means of communication. If you're only concerned with the grammar and ignore the communication, you've missed the point. (This is coming from someone who's worked on getting a computer to understand that communication and respond appropriately. Note: proper grammar does not remove ambiguity.)

Instead, concentrate on what was communicated and write a retort for that. For example: in his referenced 2011 post he says that a new C++ is needed because no one fully understands it. In this post he says D is of no value because no one needs to fully understand C++. He obviously doesn't understand the value of putting metaprogramming into the hands of the common programmer, and I believe D does this even though it can get more complex.

Also, in the 2011 post D should have been listed in all three sections: Scripting, High-level, System. This suggests he doesn't really see D as bridging the gap and uniting all layers.
Jun 15 2014
prev sibling next sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 6/15/2014 9:20 AM, Xinok wrote:
 Given that he lives in Italy, it's safe to assume that English is not his first
 language. But rather than consider what he has to say or dispute his arguments,
 you completely dismissed his point of view because his level of writing doesn't
 meet your standards.
Xinok does have a point that we all should be aware of. I've found a very strong correlation between poor writing skills and disorganized thinking. (Your point about non-native English speakers is well taken, one must not confuse unfamiliarity with English with disorganized thinking.) I'm hardly the only one.

If one wants their views to be taken seriously, pay attention to spelling, grammar, paragraphs, organized writing, etc. There's an awful lot of stuff to read on the internet, and poor writing often elicits a "meh, I'll skip this one and move on" reaction.
Jun 15 2014
next sibling parent Walter Bright <newshound2 digitalmars.com> writes:
On 6/15/2014 1:10 PM, Walter Bright wrote:
 Xinok does have a point that we all should be aware of.
I meant Caligo. My mistake.
Jun 15 2014
prev sibling parent reply "Abdulhaq" <alynch4047 gmail.com> writes:
On Sunday, 15 June 2014 at 20:10:34 UTC, Walter Bright wrote:
 On 6/15/2014 9:20 AM, Xinok wrote:
 Given that he lives in Italy, it's safe to assume that English 
 is not his first
 language. But rather than consider what he has to say or 
 dispute his arguments,
 you completely dismissed his point of view because his level 
 of writing doesn't
 meet your standards.
Xinok does have a point that we all should be aware of. I've found a very strong correlation between poor writing skills and disorganized thinking. (Your point about non-native English speakers is well taken, one must not confuse unfamiliarity with English with disorganized thinking.) I'm hardly the only one. If one wants their views to be taken seriously, pay attention to spelling, grammar, paragraphs, organized writing, etc. There's an awful lot of stuff to read on the internet, and poor writing often elicits a "meh, I'll skip this one and move on" reaction.
True but if I'm going to judge a comment by the way it's written I'll take a second language piece over a foul and insulting rant any day of the week.
Jun 15 2014
parent reply simendsjo <simendsjo gmail.com> writes:
On 06/15/2014 11:16 PM, Abdulhaq wrote:
 On Sunday, 15 June 2014 at 20:10:34 UTC, Walter Bright wrote:
 On 6/15/2014 9:20 AM, Xinok wrote:
 Given that he lives in Italy, it's safe to assume that English is not
 his first
 language. But rather than consider what he has to say or dispute his
 arguments,
 you completely dismissed his point of view because his level of
 writing doesn't
 meet your standards.
Xinok does have a point that we all should be aware of. I've found a very strong correlation between poor writing skills and disorganized thinking. (Your point about non-native English speakers is well taken, one must not confuse unfamiliarity with English with disorganized thinking.) I'm hardly the only one. If one wants their views to be taken seriously, pay attention to spelling, grammar, paragraphs, organized writing, etc. There's an awful lot of stuff to read on the internet, and poor writing often elicits a "meh, I'll skip this one and move on" reaction.
True but if I'm going to judge a comment by the way it's written I'll take a second language piece over a foul and insulting rant any day of the week.
And my guess is the people doing the insults never use a language other than their native language on a day-to-day basis. Not being a native English speaker myself, I too got offended by this rant. It's naive to assume everyone should be as fluent in English as native speakers. Some countries even dub all English shows and movies, so people there are not exposed to much English outside some forums - where many people might not be native speakers themselves and thus teach you faulty grammar. Luckily most people understand this issue.
Jun 15 2014
next sibling parent reply "Paolo Invernizzi" <paolo.invernizzi no.address> writes:
On Monday, 16 June 2014 at 06:40:39 UTC, simendsjo wrote:
 Some countries are even dubbing all English shows and movies, 
 so they're not exposed to much English outside some forums
That's exactly what's happening in Italy... --- Paolo
Jun 16 2014
parent "deadalnix" <deadalnix gmail.com> writes:
On Monday, 16 June 2014 at 07:17:32 UTC, Paolo Invernizzi wrote:
 On Monday, 16 June 2014 at 06:40:39 UTC, simendsjo wrote:
 Some countries are even dubbing all English shows and movies, 
 so they're not exposed to much English outside some forums
That's exactly what's happening in Italy... --- Paolo
https://www.youtube.com/watch?v=VdjhzMVY9T4
Jun 16 2014
prev sibling parent reply Caligo via Digitalmars-d <digitalmars-d puremagic.com> writes:
My rant wasn't about his lack of fluency in the English language.  You
only learn once what a sentence is, and the concept translates over to
most other natural languages.  The same is true with the concept of
constructing a paragraph.  Even if he's not a native English speaker,
I'm willing to bet that his writings in his mother tongue are just as
bad.  Just ask professors how often they encounter poor quality
writings that were produced by native speakers.  And FWIW, I'm not a
native English speaker either.  I'm multilingual, and I don't use that
fact as an excuse for anything.



On Mon, Jun 16, 2014 at 1:41 AM, simendsjo via Digitalmars-d
<digitalmars-d puremagic.com> wrote:
 And my guess is the people doing the insults never use another language
 than their native language on a day-to-day basis. Not being a native
 English speaker myself, I too got offended by this rant. It's naive to
 assume everyone should be as fluent in English as native speakers. Some
 countries are even dubbing all English shows and movies, so they're not
 exposed to much English outside some forums - where many people might
 not be native speakers themselves and thus learning you faulty grammar.
 Luckily most people understands this issue.
Jun 16 2014
next sibling parent reply Mike Parker <aldacron gmail.com> writes:
On 6/17/2014 12:16 PM, Caligo via Digitalmars-d wrote:
 My rant wasn't about his lack of fluency in the English language.  You
 only learn once what a sentence is, and the concept translates over to
 most other natural languages.  The same is true with the concept of
 constructing a paragraph.  Even if he's not a native English speaker,
 I'm willing to bet that his writings in his mother tongue are just as
 bad.  Just ask professors how often they encounter poor quality
 writings that were produced by native speakers.  And FWIW, I'm not a
 native English speaker either.  I'm multilingual, and I don't use that
 fact as an excuse for anything.
I completely disagree with all this. I've been teaching English (and also Debate) in Korea for 20 years at all levels of ability, from beginner to advanced. I've taught preschoolers, primary school students, university students, housewives, laborers, office workers, teachers, business executives and more. I also frequently edit documents that have already been translated from Korean to English, cleaning them up to make them more readable to native speakers.

I can tell you without hesitation that there are a great many people who write very well in Korean and have a good spoken command of English, but who manage to construct some unintelligible English sentences when they write. The ability to write well in a native language and/or to speak well in a foreign language does not translate to an equivalent ability in a foreign language (particularly when there is an extreme difference in grammar between the two).
Jun 16 2014
next sibling parent reply Mike Parker <aldacron gmail.com> writes:
On 6/17/2014 1:03 PM, Mike Parker wrote:
 On 6/17/2014 12:16 PM, Caligo via Digitalmars-d wrote:
 My rant wasn't about his lack of fluency in the English language.  You
 only learn once what a sentence is, and the concept translates over to
 most other natural languages.  The same is true with the concept of
 constructing a paragraph.  Even if he's not a native English speaker,
 I'm willing to bet that his writings in his mother tongue are just as
 bad.  Just ask professors how often they encounter poor quality
 writings that were produced by native speakers.  And FWIW, I'm not a
 native English speaker either.  I'm multilingual, and I don't use that
 fact as an excuse for anything.
I completely disagree with all this. I've been teaching English (and also Debate) in Korea for 20 years at all levels of ability, from beginner to advanced. I've taught preschoolers, primary school students, university students, housewives, laborers, office workers, teachers, business executives and more. I also frequently edit documents that have already been translated from Korean to English, cleaning them up to make them more readable to native speakers. I can tell you without hesitation that there are a great many people who write very well in Korean and have a good spoken command of English, but who manage to construct some unintelligible English sentences when they write. The ability to write well in a native language and/or to speak well in a foreign language does not translate to an equivalent ability in a foreign language (particularly when there is an extreme difference in grammar between the two).
"an equivalent ability *to write well* in a foreign language" --- This email is free from viruses and malware because avast! Antivirus protection is active. http://www.avast.com
Jun 16 2014
parent reply "c0de517e" <kenpex tin.it> writes:
Hi everybody. I'm Angelo Pesce, the author of the post on 
c0de517e.

First I have to apologize for my bad English and disorganized 
thoughts. It's not even that my language abilities are too poor, 
but mostly that I keep my blog as a place to dump some mostly 
incoherent thoughts. They are not really articles and the posts 
will evolve over time as I fix things and if a discussion entails 
I try to make certain points more clear and certain others more 
coincise and so on.

That said I also have to clarify that I didn't really mean to 
diss D. A tl;dr version of the post would be: better is better, 
but doesn't guarantee success, especially when a lot of inertia 
is present you need disruptions. I think we don't see disruptions 
in languages used by us (AAA, console game devs) because we're a 
niche that is not that interesting, so most exciting things happen 
outside. If I had to place a bet on what -could- be a disruption 
that would see adoption for us that would be live-coding (that 
works in our environments). Proof is that we already do it albeit 
very poorly, which shows how desperate we are for it.

About GC, someone mentioned it. I actually am not against GC, I 
think it got a bad rep because most languages that use it don't 
have good control over heap allocations, and that is bad for us. 
If we had good control over that then we would manage GC 
perfectly well (as we already do with malloc, which is not 
without performance issues).

About metaprogramming - 
http://c0de517e.blogspot.ca/2014/06/bonus-round-languages-metaprogramming.html

I reply (nearly) all comments on my blog.
Jun 16 2014
next sibling parent Nick Sabalausky <SeeWebsiteToContactMe semitwist.com> writes:
On 6/17/2014 12:24 AM, c0de517e wrote:
 Hi everybody. I'm Angelo Pesce, the author of the post on c0de517e.
Cool of you to drop in! Welcome!
 First I have to apologize for my bad English and disorganized thoughts.
FWIW, I thought your english was fine (and I'm a native speaker). But my thoughts and writing are often disorganized, too :)
 It's not even that my language abilities are too poor, but mostly that I
 keep my blog as a place to dump some mostly incoherent thoughts.
I use mine the same way ;)
Jun 16 2014
prev sibling next sibling parent "Dicebot" <public dicebot.lv> writes:
On Tuesday, 17 June 2014 at 04:24:54 UTC, c0de517e wrote:
 Hi everybody. I'm Angelo Pesce, the author of the post on 
 c0de517e.

 ...
Thanks for coming here and clarifying your point of view despite our zealous bashing :) Welcome!
Jun 16 2014
prev sibling next sibling parent Walter Bright <newshound2 digitalmars.com> writes:
On 6/16/2014 9:24 PM, c0de517e wrote:
 Hi everybody. I'm Angelo Pesce, the author of the post on c0de517e.
Welcome - nice of you to drop by!
Jun 17 2014
prev sibling next sibling parent reply Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
On 6/16/14, 9:24 PM, c0de517e wrote:
 Hi everybody. I'm Angelo Pesce, the author of the post on c0de517e.
[snip] Thanks for chiming in! -- Andrei
Jun 17 2014
parent reply "c0de517e" <kenpex tin.it> writes:
On Wednesday, 18 June 2014 at 05:48:14 UTC, Andrei Alexandrescu 
wrote:
 On 6/16/14, 9:24 PM, c0de517e wrote:
 Hi everybody. I'm Angelo Pesce, the author of the post on 
 c0de517e.
[snip] Thanks for chiming in! -- Andrei
Hi Andrei! I had a little stab at your hugely influential book in the post, which I read with interest at the time. Unfortunately I do think it gets abused, which clearly is not the book's fault, to the point that I'm persuaded it originated more byzantine, messy code than it solved problems. I might be biased by the fact that I work in a very specific industry, though.

The same in my mind goes for the GOF Patterns book (actually the patterns one is just bad, damaging and boring at the same time; it just renames existing language concepts and casts them in an OO mantle). Certain concepts gained so much hype that people started applying them mindlessly. Christer Ericson writes it well:

http://realtimecollisiondetection.net/blog/?p=44
http://realtimecollisiondetection.net/blog/?p=81

So I'm curious: do you think certain concepts went too far, that we should educate against some hypes and abuses, or do you think that it's just my very partial view of the world, and that looking at the C++ community at large, template metaprogramming is not abused?

What would you think of stuff like this?
http://www.boost.org/doc/libs/1_55_0b1/libs/geometry/doc/html/geometry/design.html
(if you can share...)
Jun 17 2014
parent "deadalnix" <deadalnix gmail.com> writes:
On Wednesday, 18 June 2014 at 06:28:16 UTC, c0de517e wrote:
 So I'm curious, do you think certain concepts went too far, 
 that we should educate against some hypes and abuses, or you 
 think that it's just my very partial view of the world and if 
 looking at the C++ community at large, template metaprogramming 
 is not abused?
Everything is a cost/benefit ratio. In C++, templates are very complex beasts, and because of things like SFINAE, small errors can backfire quite badly. In D, templates are both simpler and more capable. It is therefore rational to use them more in D than in C++; the cost/benefit equilibrium is at a different place.

I agree that templates are overused in some C++ codebases. On the contrary, I would probably use even more templates if I had to port such code to D.
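For example, a minimal sketch of how a D template constraint keeps the error at the call site instead of deep inside the instantiation (firstOf is just a made-up name):

	import std.range;   // isInputRange, plus front/empty for arrays
	import std.stdio;

	// The constraint is part of the signature: a bad argument is rejected
	// with a short "cannot deduce function from argument types" error
	// rather than pages of SFINAE-style fallout.
	auto firstOf(R)(R range) if (isInputRange!R)
	{
		return range.front;
	}

	void main()
	{
		writeln(firstOf([10, 20, 30]));   // 10
		// firstOf(42);                   // rejected at compile time: int is not an input range
	}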
Jun 17 2014
prev sibling parent "ponce" <contact gam3sfrommars.fr> writes:
On Tuesday, 17 June 2014 at 04:24:54 UTC, c0de517e wrote:
 Hi everybody. I'm Angelo Pesce, the author of the post on 
 c0de517e.
Hi, I think the general idea of your post is 100% accurate: the bigger risk for D is people not willing to move from C++. I work in C++ full-time and it's an additional cost to use it, and I believe a huge one. But essentially it's hidden behind the daily challenges of specific domain X.
Jun 18 2014
prev sibling next sibling parent "MattCoder" <idonthaveany mail.com> writes:
On Tuesday, 17 June 2014 at 04:03:23 UTC, Mike Parker wrote:
 On 6/17/2014 12:16 PM, Caligo via Digitalmars-d wrote:
 My rant wasn't about his lack of fluency in the English 
 language.  You
 only learn once what a sentence is, and the concept translates 
 over to
 most other natural languages.  The same is true with the 
 concept of
 constructing a paragraph.  Even if he's not a native English 
 speaker,
 I'm willing to bet that his writings in his mother tongue are 
 just as
 bad.  Just ask professors how often they encounter poor quality
 writings that were produced by native speakers.  And FWIW, I'm 
 not a
 native English speaker either.  I'm multilingual, and I don't 
 use that
 fact as an excuse for anything.
I completely disagree with all this. I've been teaching English (and also Debate) in Korea for 20 years at all levels of ability, from beginner to advanced. I've taught preschoolers, primary school students, university students, housewives, laborers, office workers, teachers, business executives and more. I also frequently edit documents that have already been translated from Korean to English, cleaning them up to make them more readable to native speakers. I can tell you without hesitation that there are a great many people who write very well in Korean and have a good spoken command of English, but who manage to construct some unintelligible English sentences when they write. The ability to write well in a native language and/or to speak well in a foreign language does not translate to an equivalent ability in a foreign language (particularly when there is an extreme difference in grammar between the two).
Completely agree with Parker! And thanks for writing this down. Matheus.
Jun 16 2014
prev sibling parent "Dicebot" <public dicebot.lv> writes:
On Tuesday, 17 June 2014 at 04:03:23 UTC, Mike Parker wrote:
 On 6/17/2014 12:16 PM, Caligo via Digitalmars-d wrote:
 My rant wasn't about his lack of fluency in the English 
 language.  You
 only learn once what a sentence is, and the concept translates 
 over to
 most other natural languages.  The same is true with the 
 concept of
 constructing a paragraph.  Even if he's not a native English 
 speaker,
 I'm willing to bet that his writings in his mother tongue are 
 just as
 bad.  Just ask professors how often they encounter poor quality
 writings that were produced by native speakers.  And FWIW, I'm 
 not a
 native English speaker either.  I'm multilingual, and I don't 
 use that
 fact as an excuse for anything.
I completely disagree with all this. I've been teaching English (and also Debate) in Korea for 20 years at all levels of ability, from beginner to advanced. I've taught preschoolers, primary school students, university students, housewives, laborers, office workers, teachers, business executives and more. I also frequently edit documents that have already been translated from Korean to English, cleaning them up to make them more readable to native speakers. I can tell you without hesitation that there are a great many people who write very well in Korean and have a good spoken command of English, but who manage to construct some unintelligible English sentences when they write. The ability to write well in a native language and/or to speak well in a foreign language does not translate to an equivalent ability in a foreign language (particularly when there is an extreme difference in grammar between the two).
Very true. I often find myself stuck in the middle of a spoken sentence because I start building up a phrase in a similar way to how I would have done it in Russian, and suddenly realize that the result does not make any sense at all when applied to English.
Jun 16 2014
prev sibling parent "Kagamin" <spam here.lot> writes:
On Tuesday, 17 June 2014 at 03:16:16 UTC, Caligo via 
Digitalmars-d wrote:
 My rant wasn't about his lack of fluency in the English 
 language.  You
 only learn once what a sentence is, and the concept translates 
 over to
 most other natural languages.
How would you translate an arbitrary sentence to another language?
Jun 18 2014
prev sibling parent reply "c0de517e" <kenpex tin.it> writes:
 Given that he lives in Italy, it's safe to assume that English 
 is not his first language. But rather than consider what he has 
 to say or dispute his arguments, you completely dismissed his 
 point of view because his level of writing doesn't meet your 
 standards. Furthermore, you unjustly called him a 
 "sophisticated idiot" and "narcissistic cunt". You've only 
 shown yourself to be the ignorant one.
Thanks for taking the time to look into my profile. I'm actually not as bad at English as it shows in my blog posts, but it is indeed my second language, and I'm not happy that I don't speak it quite as well as I'd like; I'm even in the process of losing a bit of my Italian. I don't live in Italy anymore; I'm a rendering technical director for Activision|Blizzard.
Jun 17 2014
next sibling parent Nick Sabalausky <SeeWebsiteToContactMe semitwist.com> writes:
On 6/17/2014 6:27 PM, c0de517e wrote:
 I'm actually not as
 bad at English as it shows in my blog posts, but indeed is my second
 language and I'm not happy that I don't speak it quite as well as I'd
 like
Meh, lots of native English speakers are pretty bad at English ;) It is a goofy language, though.
Jun 17 2014
prev sibling parent Manu via Digitalmars-d <digitalmars-d puremagic.com> writes:
On 18 June 2014 08:27, c0de517e via Digitalmars-d
<digitalmars-d puremagic.com> wrote:
 Given that he lives in Italy, it's safe to assume that English is not his
 first language. But rather than consider what he has to say or dispute his
 arguments, you completely dismissed his point of view because his level of
 writing doesn't meet your standards. Furthermore, you unjustly called him a
 "sophisticated idiot" and "narcissistic cunt". You've only shown yourself to
 be the ignorant one.
Thanks for taking the time to look into my profile, I'm actually not as bad at English as it shows in my blog posts, but indeed is my second language and I'm not happy that I don't speak it quite as well as I'd like and I'm in the process even losing a bit my italian. I don't live in Italy anymore, I'm a rendering technical director for Activision|Blizzard.
Ah yeah. Do you work anywhere near a bloke named Rowan Hamilton? Say gday for me if you do :)
Jun 18 2014
prev sibling parent "Alessandro Ogheri" <ogheri alessandroogheri.com> writes:
On Sunday, 15 June 2014 at 15:37:51 UTC, Caligo via Digitalmars-d 
wrote:
 I can't take a blog post seriously when it's poorly written and 
 full of
 grammatical errors.  If you are in an engineering field of any 
 kind, and
 you can't construct a paragraph in your favorite natural 
 language, you're
 not worth anyone's time.  The author of that blog is nothing 
 but a
 sophisticated idiot who should not be taken seriously.

 I'm so sick of watching narcissistic cunts who just love to 
 broadcast their
 opinions, enough said.


 On Sun, Jun 15, 2014 at 6:28 AM, Peter Alexander via 
 Digitalmars-d <
 digitalmars-d puremagic.com> wrote:

 http://c0de517e.blogspot.ca/2014/06/where-is-my-c-replacement.html?m=1

 The arguments against D are pretty weak if I'm honest, but I 
 think it's
 important we understand what people think of D. I can confirm 
 this
 sentiment is fairly common in the industry.

 Watch out for the little jab at Andrei :-P
I do not get the impression that his English is thaaaat terrible... Or should I first of all apologize for not being English myself and having had the arrogance to enter this forum?
Jun 26 2014
prev sibling next sibling parent reply "Meta" <jared771 gmail.com> writes:
On Sunday, 15 June 2014 at 11:28:12 UTC, Peter Alexander wrote:
 http://c0de517e.blogspot.ca/2014/06/where-is-my-c-replacement.html?m=1

 The arguments against D are pretty weak if I'm honest, but I 
 think it's important we understand what people think of D. I 
 can confirm this sentiment is fairly common in the industry.

 Watch out for the little jab at Andrei :-P
I wonder where he got the idea that D isn't high performance... Perhaps the fact that it has a GC?
Jun 15 2014
next sibling parent reply "Joakim" <dlang joakim.airpost.net> writes:
On Sunday, 15 June 2014 at 18:50:14 UTC, Meta wrote:
 On Sunday, 15 June 2014 at 11:28:12 UTC, Peter Alexander wrote:
 http://c0de517e.blogspot.ca/2014/06/where-is-my-c-replacement.html?m=1

 The arguments against D are pretty weak if I'm honest, but I 
 think it's important we understand what people think of D. I 
 can confirm this sentiment is fairly common in the industry.

 Watch out for the little jab at Andrei :-P
I wonder where he got the idea that D isn't high performance... Perhaps the fact that it has a GC?
He clarifies in the comments: "D is not 'high-performance' the same way as C and C++ are not. Systems is not the same as high-performance. Fortran always has been more 'high-performance' than C/C++ as it doesn't have pointer aliasing (think that C++ introduced restrict, which is the bread and butter of a HPC language only in C++11, same for threading, still no vector types...) for example. ISPC is a HPC language or Julia, Fortran, even Numpy if you want, not D or C or C++" http://c0de517e.blogspot.in/2014/06/where-is-my-c-replacement.html?showComment=1402865174608#c415780017887651116
Jun 15 2014
parent reply Sean Cavanaugh <WorksOnMyMachine gmail.com> writes:
On 6/15/2014 4:34 PM, Joakim wrote:
 He clarifies in the comments:

 "D is not 'high-performance' the same way as C and C++ are not. Systems
 is not the same as high-performance. Fortran always has been more
 'high-performance' than C/C++ as it doesn't have pointer aliasing (think
 that C++ introduced restrict, which is the bread and butter of a HPC
 language only in C++11, same for threading, still no vector types...)
 for example. ISPC is a HPC language or Julia, Fortran, even Numpy if you
 want, not D or C or C++"
 http://c0de517e.blogspot.in/2014/06/where-is-my-c-replacement.html?showComment=1402865174608#c415780017887651116
I had a nice sad 'ha ha' moment when I realized that msvc can't cope with restrict on the pointers feeding into the simd intrinsics; you have to cast it away. So much for that perf :)
Jun 17 2014
parent reply "c0de517e" <kenpex tin.it> writes:
On Wednesday, 18 June 2014 at 03:28:48 UTC, Sean Cavanaugh wrote:
 On 6/15/2014 4:34 PM, Joakim wrote:
 He clarifies in the comments:

 "D is not 'high-performance' the same way as C and C++ are 
 not. Systems
 is not the same as high-performance. Fortran always has been 
 more
 'high-performance' than C/C++ as it doesn't have pointer 
 aliasing (think
 that C++ introduced restrict, which is the bread and butter of 
 a HPC
 language only in C++11, same for threading, still no vector 
 types...)
 for example. ISPC is a HPC language or Julia, Fortran, even 
 Numpy if you
 want, not D or C or C++"
 http://c0de517e.blogspot.in/2014/06/where-is-my-c-replacement.html?showComment=1402865174608#c415780017887651116
I had a nice sad 'ha ha' moment when I realized that msvc can't cope with restrict on the pointers feeding into the simd intrinsics; you have to cast it away. So much for that perf :)
http://blogs.msdn.com/b/vcblog/archive/2013/07/12/introducing-vector-calling-convention.aspx
Jun 17 2014
parent Sean Cavanaugh <WorksOnMyMachine gmail.com> writes:
On 6/18/2014 1:05 AM, c0de517e wrote:
 On Wednesday, 18 June 2014 at 03:28:48 UTC, Sean Cavanaugh wrote:
 I had a nice sad 'ha ha' moment when I realized that msvc can't cope
 with restrict on the pointers feeding into the simd intrinsics; you
 have to cast it away.  So much for that perf :)
http://blogs.msdn.com/b/vcblog/archive/2013/07/12/introducing-vector-calling-convention.aspx
VectorCall is all about working around the original x64 ABI, which only lets you officially pass float and double scalars around in the XMM registers. Vectors require writing a bunch of helper forceinline functions that always operate on pointers or references, since pass-by-value lacked vectorcall, and on x86 pass-by-value for XMM types won't even compile. Ultimately the code ends up with calls to something like _mm_store_ps or _mm_stream_ps etc.; those are the functions that take pointers, and you have to cast away volatile (and afaik restrict is ignored on them as well, but you don't have to cast it away).
Jun 18 2014
prev sibling next sibling parent Manu via Digitalmars-d <digitalmars-d puremagic.com> writes:
On 16 June 2014 04:50, Meta via Digitalmars-d
<digitalmars-d puremagic.com> wrote:
 On Sunday, 15 June 2014 at 11:28:12 UTC, Peter Alexander wrote:
 http://c0de517e.blogspot.ca/2014/06/where-is-my-c-replacement.html?m=1

 The arguments against D are pretty weak if I'm honest, but I think it's
 important we understand what people think of D. I can confirm this sentiment
 is fairly common in the industry.

 Watch out for the little jab at Andrei :-P
I wonder where he got the idea that D isn't high performance... Perhaps the fact that it has a GC?
Definitely.
Jun 15 2014
prev sibling next sibling parent reply "John" <john.joyus gmail.com> writes:
On Sunday, 15 June 2014 at 18:50:14 UTC, Meta wrote:
 I wonder where he got the idea that D isn't high performance... 
 Perhaps the fact that it has a GC?
He probably went to http://dlang.org/ and clicked the Run button on the code example there.
Jun 16 2014
parent "John" <john.joyus gmail.com> writes:
On Monday, 16 June 2014 at 21:00:59 UTC, John wrote:
 On Sunday, 15 June 2014 at 18:50:14 UTC, Meta wrote:
 I wonder where he got the idea that D isn't high 
 performance... Perhaps the fact that it has a GC?
He probably went to http://dlang.org/ and clicked the Run button on the code example there.
It would be nice if it showed how long the executable ran to get the results.
Jun 16 2014
prev sibling parent "c0de517e" <kenpex tin.it> writes:
On Sunday, 15 June 2014 at 18:50:14 UTC, Meta wrote:
 On Sunday, 15 June 2014 at 11:28:12 UTC, Peter Alexander wrote:
 http://c0de517e.blogspot.ca/2014/06/where-is-my-c-replacement.html?m=1

 The arguments against D are pretty weak if I'm honest, but I 
 think it's important we understand what people think of D. I 
 can confirm this sentiment is fairly common in the industry.

 Watch out for the little jab at Andrei :-P
I wonder where he got the idea that D isn't high performance... Perhaps the fact that it has a GC?
I didn't mean that it doesn't produce fast code, and I have nothing against GC (if you can precisely control when you heap allocate). I meant not high-performance as in not an HPC language (numerical computation) like Julia, ISPC and the like, languages that focus on executing parallel code. It seems to me that D comes from the C lineage of "systems" low-level languages, but there is always this confusion that low-level means made for demanding computations. I distinguish the two also because lots of C++ aficionados think C++ is just the "fastest" language, made for performance, but it's clearly not; in fact before C++11 it didn't even know what a thread was, couldn't restrict pointer aliasing, and it still today doesn't know about SIMD (great that D does - even if I'd rather have arbitrary sized vectors nowadays).
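For reference, the built-in SIMD support mentioned here lives in core.simd. A minimal sketch, assuming an x86_64 target where the float4 type is defined (this example is illustrative, not taken from the blog or the thread):

import core.simd;
import std.stdio : writeln;

void main()
{
    float4 a = [1.0f, 2.0f, 3.0f, 4.0f]; // four lanes in one SSE register
    float4 b = 2.0f;                     // a scalar is broadcast to every lane
    float4 c = a * b;                    // one vector multiply, no per-element loop

    writeln(c.array);                    // prints [2, 4, 6, 8]
}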
Jun 17 2014
prev sibling next sibling parent reply Brad Roberts via Digitalmars-d <digitalmars-d puremagic.com> writes:
You'll likely toss me into the same boat as the post you're ranting about, but
please, watch the 
misogynistic language here.

On 6/15/14, 8:37 AM, Caligo via Digitalmars-d wrote:
 I'm so sick of watching narcissistic <edited> who just love to broadcast their
opinions, enough said.
Jun 15 2014
next sibling parent Walter Bright <newshound2 digitalmars.com> writes:
On 6/15/2014 1:20 PM, Brad Roberts via Digitalmars-d wrote:
 You'll likely toss me into the same boat as the post you're ranting about, but
 please, watch the misogynistic language here.
I agree. It was not necessary to make his point.
Jun 15 2014
prev sibling next sibling parent reply "Nick Sabalausky" <a a.a> writes:
On Sunday, 15 June 2014 at 20:20:37 UTC, Brad Roberts via
Digitalmars-d wrote:
 You'll likely toss me into the same boat as the post you're 
 ranting about, but please, watch the misogynistic language here.
Yes. Misogyny is clearly the issue here. Because it obviously would have been SOOO completely different and much more tame to mention narcissistic cocks. As in "People who can't tell misogyny from cussing are a bunch of stupid dicks." Or "Viewing 'cunt' as a worse thing than 'cock' is uniquely characteristic of misogynistic dickheaded hypocrites." You see, everything I said is all ok because I didn't use any *female* terms negatively. Now that we've both accepted the virtues of equality, maybe later this evening we can join up and take back the night from all those EVIL MEN that are always waging war on poor defenseless women. Because apparently this is still 1950.
 On 6/15/14, 8:37 AM, Caligo via Digitalmars-d wrote:
 I'm so sick of watching narcissistic <edited> who just love to 
 broadcast their opinions, enough said.
Jun 15 2014
parent reply Walter Bright <newshound2 digitalmars.com> writes:
Professionals at work use and rely on this forum, and NSFW content shouldn't be 
posted here. I also request a professional standard of decorum here.
Jun 15 2014
parent Brad Roberts via Digitalmars-d <digitalmars-d puremagic.com> writes:
I'd reply to those that choose to nit pick the specific choice of words rather
than the underlying 
message, but please, this forum devolves into rants and childish behavior often
enough already.  Try 
to take to heart Walter's words and underlying intent.  A little more
professionalism and care 
towards welcoming communication styles please.

On 6/15/14, 3:48 PM, Walter Bright via Digitalmars-d wrote:
 Professionals at work use and rely on this forum, and NSFW content shouldn't
be posted here. I also
 request a professional standard of decorum here.
Jun 15 2014
prev sibling next sibling parent reply "John Colvin" <john.loughran.colvin gmail.com> writes:
On Sunday, 15 June 2014 at 20:20:37 UTC, Brad Roberts via 
Digitalmars-d wrote:
 You'll likely toss me into the same boat as the post you're 
 ranting about, but please, watch the misogynistic language here.
Unnecessarily offensive in the context, yes, but reasonable people can and do disagree on whether it's misogynistic. In a lot of places it's not even a gendered* insult**, despite its meaning. Most swearing refers to genitalia or a specific subset (or characteristic of that subset) of humanity, but that doesn't necessarily make them active words of current oppression that should be avoided. *i.e. it's not targeted at women specifically and doesn't imply any negative message about women. YMMV by location and social group. **it's barely an insult at all if you've got the right(?) Irish friends.
Jun 15 2014
parent reply "w0rp" <devw0rp gmail.com> writes:
On Sunday, 15 June 2014 at 22:40:53 UTC, John Colvin wrote:
 On Sunday, 15 June 2014 at 20:20:37 UTC, Brad Roberts via 
 Digitalmars-d wrote:
 You'll likely toss me into the same boat as the post you're 
 ranting about, but please, watch the misogynistic language 
 here.
Unnecessarily offensive in the context, yes, but reasonable people can and do disagree on whether it's misogynistic. In a lot of places it's not even a gendered* insult**, despite its meaning. Most swearing refers to genitalia or a specific subset (or characteristic of that subset) of humanity, but that doesn't necessarily make them active words of current oppression that should be avoided. *i.e. it's not targeted at women specifically and doesn't imply any negative message about women. YMMV by location and social group. **it's barely an insult at all if you've got the right(?) Irish friends.
It's totally not misogynistic, but it's also not appropriate. If nothing else, it's just not a creative use of the English language. It's certainly better to criticise an opinion by pointing out its flaws than to just curse someone.
Jun 15 2014
parent Nick Sabalausky <SeeWebsiteToContactMe semitwist.com> writes:
On 6/15/2014 7:08 PM, w0rp wrote:
 On Sunday, 15 June 2014 at 22:40:53 UTC, John Colvin wrote:
 *i.e. it's not targeted at women specifically and doesn't imply any
 negative message about women. YMMV by location and social group.

 **it's barely an insult at all if you've got the right(?) Irish friends.
It's totally not misogynistic, but it's also not appropriate. If nothing else, it's just not a creative use of the English language. It's certainly better to criticise an opinion by pointing out its flaws than to just curse someone.
Yes. Until the equivalent "male" words start getting labeled as misandrist (which I hope never happens), accusing the "female" version of being sexist is *itself* an extremely flagrant display of misogyny, vastly more misogynistic than the word itself could ever be.
Jun 15 2014
prev sibling parent "deadalnix" <deadalnix gmail.com> writes:
On Sunday, 15 June 2014 at 20:20:37 UTC, Brad Roberts via 
Digitalmars-d wrote:
 You'll likely toss me into the same boat as the post you're 
 ranting about, but please, watch the misogynistic language here.

 On 6/15/14, 8:37 AM, Caligo via Digitalmars-d wrote:
 I'm so sick of watching narcissistic <edited> who just love to 
 broadcast their opinions, enough said.
It's only misogynistic if you think that word generally applies to women. I don't think it does. Do you?
Jun 15 2014
prev sibling next sibling parent reply Manu via Digitalmars-d <digitalmars-d puremagic.com> writes:
On 15 June 2014 21:28, Peter Alexander via Digitalmars-d
<digitalmars-d puremagic.com> wrote:
 http://c0de517e.blogspot.ca/2014/06/where-is-my-c-replacement.html?m=1

 The arguments against D are pretty weak if I'm honest, but I think it's
 important we understand what people think of D. I can confirm this sentiment
 is fairly common in the industry.

 Watch out for the little jab at Andrei :-P
He was basically turned off by the GC, just like most people I introduce D to. If he could replace it with ARC I bet he would switch immediately. $100 says he already uses ref counting extensively, and would love for the compiler to eliminate all that boilerplate! I predict, AAA gamedev will switch to D the same day that: 1) ARC is an option 2) LDC can produce fully working, MSC compatible Win64 COFF + CV8/PDB debug output *** this is the single biggest thing holding D back in my opinion *** It's also a significant roadblock for many that Android + iOS still aren't working (gamedev's don't care so much about the OS API's, but the compiler must produce working code for all language features). I've said from the start, and his blog clearly reflects my opinion, that D is the language gamedev's are waiting for. It's _so_ close, but just needs some deliberate care by the core devs to get it across the line.
From my time here, I think one of the most significant problems is
lack of gamedev contributors. And I think there is a simple reason for this, which I can draw from my own experience; gamedev's largely are NOT accustomed to open source workflow or even OSS thinking in principle. Gamedev is strictly a proprietary, and usually very closed and tightly controlled industry, and it's not within most gamedev's daily operating discipline to think in the way that would lead them to become D language contributors. So, that might lead many in this community to say "what a bunch of dicks! fuck 'em!", and that might be fair, but I think it's also frequently underestimated just how big the industry is. Gamedev is _gigantic_, and if D were to secure a win in gamedev, I think that would firmly secure it as a relevant modern language, and kick it off for adoption by everyone else. Facebook pushing D is awesome, but facebook is just one company, and their competitors (google?) are likely to shun their commitment to D in principle, particularly since they have their own competing solutions. Gamedev is also a very 'trendy' industry... all it would take is for one significant company to flirt with D and do a talk at GDC about it. Practically everyone would jump on the wagon overnight, I've seen the pattern over and over. I'm sure I'm biased, but when considering potential for large scale adoption by key industries, I think gamedev is the easiest sell (by far!), and also the closest to the goal. All it would take is a deliberate focus by the mainly non-gamedev contributors to get it across the line, and it could probably be done in a matter of weeks if it were made a priority...
Jun 15 2014
parent Xavier Bigand <flamaros.xavier gmail.com> writes:
Le 16/06/2014 03:28, Manu via Digitalmars-d a écrit :
 On 15 June 2014 21:28, Peter Alexander via Digitalmars-d
 <digitalmars-d puremagic.com> wrote:
 http://c0de517e.blogspot.ca/2014/06/where-is-my-c-replacement.html?m=1

 The arguments against D are pretty weak if I'm honest, but I think it's
 important we understand what people think of D. I can confirm this sentiment
 is fairly common in the industry.

 Watch out for the little jab at Andrei :-P
He was basically turned off by the GC, just like most people I introduce D to. If he could replace it with ARC I bet he would switch immediately. $100 says he already uses ref counting extensively, and would love for the compiler to eliminate all that boilerplate! I predict, AAA gamedev will switch to D the same day that: 1) ARC is an option 2) LDC can produce fully working, MSC compatible Win64 COFF + CV8/PDB debug output *** this is the single biggest thing holding D back in my opinion *** It's also a significant roadblock for many that Android + iOS still aren't working (gamedev's don't care so much about the OS API's, but the compiler must produce working code for all language features). I've said from the start, and his blog clearly reflects my opinion, that D is the language gamedev's are waiting for. It's _so_ close, but just needs some deliberate care by the core devs to get it across the line.
From my time here, I think one of the most significant problems is
lack of gamedev contributors. And I think there is a simple reason for this, which I can draw from my own experience; gamedev's largely are NOT accustomed to open source workflow or even OSS thinking in principle. Gamedev is strictly a proprietary, and usually very closed and tightly controlled industry, and it's not within most gamedev's daily operating discipline to think in the way that would lead them to become D language contributors. So, that might lead many in this community to say "what a bunch of dicks! fuck 'em!", and that might be fair, but I think it's also frequently underestimated just how big the industry is. Gamedev is _gigantic_, and if D were to secure a win in gamedev, I think that would firmly secure it as a relevant modern language, and kick it off for adoption by everyone else. Facebook pushing D is awesome, but facebook is just one company, and their competitors (google?) are likely to shun their commitment to D in principle, particularly since they have their own competing solutions. Gamedev is also a very 'trendy' industry... all it would take is for one significant company to flirt with D and do a talk at GDC about it. Practically everyone would jump on the wagon overnight, I've seen the pattern over and over. I'm sure I'm biased, but when considering potential for large scale adoption by key industries, I think gamedev is the easiest sell (by far!), and also the closest to the goal. All it would take is a deliberate focus by the mainly non-gamedev contributors to get it across the line, and it could probably be done in a matter of weeks if it were made a priority...
+1

I just can't stop dreaming about doing my app (architecture software in 3D; it's pretty much the same as a game) in D for Android, iOS, ...

I just want a fast build. I fixed some issues in our polygon methods; all the templated classes are horrible to build.
Jun 16 2014
prev sibling next sibling parent Manu via Digitalmars-d <digitalmars-d puremagic.com> writes:
On 15 June 2014 21:28, Peter Alexander via Digitalmars-d
<digitalmars-d puremagic.com> wrote:
 http://c0de517e.blogspot.ca/2014/06/where-is-my-c-replacement.html?m=1

 The arguments against D are pretty weak if I'm honest, but I think it's
 important we understand what people think of D. I can confirm this sentiment
 is fairly common in the industry.

 Watch out for the little jab at Andrei :-P
Classic comment: "I really really want to like D, but the tools are so bad right now. Tools matter. Alot. D also has a pretty horrible GC design--" Primary concern followed by secondary concern. I've been saying this for years... ;)
Jun 15 2014
prev sibling next sibling parent "deadalnix" <deadalnix gmail.com> writes:
On Sunday, 15 June 2014 at 11:28:12 UTC, Peter Alexander wrote:
 http://c0de517e.blogspot.ca/2014/06/where-is-my-c-replacement.html?m=1

 The arguments against D are pretty weak if I'm honest, but I 
 think it's important we understand what people think of D. I 
 can confirm this sentiment is fairly common in the industry.

 Watch out for the little jab at Andrei :-P
I happen to be a regular reader of this blog. Usually quite interesting, but this one is a bit weak. Poorly structured, and the author contradicts himself several times in the article. It also seems poorly informed (I wouldn't put Go and Rust in the same bag, for instance).
Jun 15 2014
prev sibling next sibling parent Caligo via Digitalmars-d <digitalmars-d puremagic.com> writes:
I didn't know that the use of the c-word was considered misogynous,
and I don't consider it to be.  It's just an insult, and you're not
fighting sexism.  The software industry being what it is, one of the
most racist and sexist industries, your time is better spent writing
about that.  I do agree that it was inappropriate to insult him, but
it felt right.


On Sun, Jun 15, 2014 at 3:20 PM, Brad Roberts via Digitalmars-d
<digitalmars-d puremagic.com> wrote:
 You'll likely toss me into the same boat as the post you're ranting about,
 but please, watch the misogynistic language here.
Jun 16 2014
prev sibling next sibling parent reply "H. S. Teoh via Digitalmars-d" <digitalmars-d puremagic.com> writes:
On Mon, Jun 16, 2014 at 08:08:50PM -0700, Walter Bright via Digitalmars-d wrote:
 On 6/16/2014 5:44 AM, "Ola Fosheim Grøstad"
 <ola.fosheim.grostad+dlang gmail.com>" wrote:
As far as I can tell string mixins have the same bad properties that
macros have.
Assuming you are talking about C macros: Having implemented the C preprocessor (multiple times), make's macro system, designed and implemented ABEL's macro system, Ddoc's macro system, and string mixins, I can unequivocally object to that opinion!

The C macro system is awesome in its awfulness. Let me count the ways:

1. Probably < 10 people in the world actually understand it all the way down. It is absurdly complex for what little it does. Paul Mensonidas is usually acknowledged as the "world's leading expert on the C preprocessor." Why would a freakin' macro processor even have an ecological niche for a world leading expert on it? The mind boggles.
So that you can write IOCCC entries that abuse the dark corners of cpp, of course. ;-) [...]
 5. There is no scoping of names of any sort - no hygiene whatsoever.
 Any #include file can trash any subsequent, unrelated, #include file
 in any way.
Do string mixins have scoping of names?

I do agree, though, that D's string mixins eliminated a whole class of cpp abuses by requiring that the input string be a set of complete statements or declarations -- things like:

	mixin("writeln(");
	mixin(");");

are rejected by the compiler, whereas in C:

	#define MIXIN_1 writeln(
	#define MIXIN_2 );

	MIXIN_1 MIXIN_2

are happily accepted, leading to IOCCC tricks like:

	#define block(x) { x }
	#define split_block(y) } y {

	void func() block(
		printf("a");
	split_block(void main())
		printf("b");
	)

	// The above nastiness translates to:
	void func() { printf("a"); }
	void main() { printf("b"); }

(To my shame, I must admit that I actually did this in an IOCCC entry -- in fact, in an even worse form than shown above. :P)

None of this insanity is permitted by D's string mixins, which is a big plus, in my book.
 6. Any syntax highlighter cannot work entirely correctly without
 having a full preprocessor.
And how would you syntax-highlight a string mixin that's assembled from arbitrary string fragments?
 7. Because of the preprocessor, valid C code need not look remotely
 like C code according to the C grammar.
Ah yes, this brings back sweet memories of this amusing little IOCCC entry: http://www.ioccc.org/2005/chia/chia.c :-)
 8. There are no looping macros, no CAR/CDR capabilities (Ddoc has the
 latter).
[...] On Mon, Jun 16, 2014 at 08:10:40PM -0700, Walter Bright via Digitalmars-d wrote:
 On 6/16/2014 8:18 AM, "Ola Fosheim Grøstad"
 <ola.fosheim.grostad+dlang gmail.com>" wrote:
Sure,  just like m4 and cpp can be extremely powerful. Too powerful…
One of the sins of cpp is it is not powerful enough, forcing a lot of awkward usages.
I thought cpp's non-Turing-completeness was actually intentional? As if cpp nastiness isn't already bad enough, as you pointed out above... I dread to imagine a cpp that is "powerful enough"(!). In any case, string mixins are better than cpp in several ways, but they still do suffer from some of cpp's problems. T -- Stop staring at me like that! It's offens... no, you'll hurt your eyes!
Jun 16 2014
next sibling parent Nick Sabalausky <SeeWebsiteToContactMe semitwist.com> writes:
On 6/17/2014 12:32 AM, H. S. Teoh via Digitalmars-d wrote:
 I do agree, though, that D's string mixins eliminated a whole class of
 cpp abuses by requiring that the input string be a set of complete
 statements or declarations -- things like:

 	mixin("writeln(");
 	mixin(");");

 are rejected by the compiler, whereas in C:

 	#define MIXIN_1 writeln(
 	#define MIXIN_2 );

 	MIXIN_1 MIXIN_2

 are happily accepted
OTOH, this does make it harder for D to emulate the nifty "stackless fibers" trick used by C's sweet ProtoThreads library (The one thing I'm envious of C over): http://dunkels.com/adam/pt/
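For context, a rough sketch (not from the post; names and structure are illustrative) of what the ProtoThreads macros boil down to once written as complete statements in D: a resumable function whose "program counter" is an explicit state variable driving a switch. The C macros hide this machinery by pasting partial statements, which is exactly what D's string mixins disallow.

struct ProtoThread { int state = 0; }

// returns true while there is more work to do
bool blink(ref ProtoThread pt)
{
    switch (pt.state)
    {
    case 0:
        // ... first chunk of work ...
        pt.state = 1;       // "yield": remember where to resume on the next call
        return true;
    case 1:
        // ... second chunk of work ...
        pt.state = 0;       // reset so the protothread can run again
        return false;       // done for this round
    default:
        assert(0);
    }
}

void main()
{
    ProtoThread pt;
    while (blink(pt)) {}    // drive it to completion
}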
Jun 16 2014
prev sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 6/16/2014 9:32 PM, H. S. Teoh via Digitalmars-d wrote:
 Do string mixins have scoping of names?
Yes, of course, since they are D code.
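To make that concrete, a minimal sketch (not from the thread) showing that a mixed-in declaration follows ordinary D scoping rules:

void foo()
{
    mixin("int x = 42;");   // declares x in foo's local scope, like any other declaration
    assert(x == 42);
}

void main()
{
    foo();
    static assert(!__traits(compiles, x));   // x does not leak out of foo's scope
}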
 And how would you syntax-highlight a string mixin that's assembled from
 arbitrary string fragments?
You wouldn't need to, since the text editor sees only normal D code.
Jun 17 2014
next sibling parent reply dennis luehring <dl.soluz gmx.net> writes:
Am 17.06.2014 11:30, schrieb Walter Bright:
 And how would you syntax-highlight a string mixin that's assembled from
 arbitrary string fragments?
You wouldn't need to, since the text editor sees only normal D code.
the text editor sees just D-code-Strings - so no syntax-highlight except that for Strings
Jun 17 2014
next sibling parent "Dicebot" <public dicebot.lv> writes:
On Tuesday, 17 June 2014 at 10:25:39 UTC, dennis luehring wrote:
 Am 17.06.2014 11:30, schrieb Walter Bright:
 And how would you syntax-highlight a string mixin that's 
 assembled from
 arbitrary string fragments?
You wouldn't need to, since the text editor sees only normal D code.
the text editor sees just D-code-Strings - so no syntax-highlight except that for Strings
All editors with D support highlight token strings as if they were code.
Jun 17 2014
prev sibling next sibling parent "Kapps" <opantm2+spam gmail.com> writes:
On Tuesday, 17 June 2014 at 10:25:39 UTC, dennis luehring wrote:
 Am 17.06.2014 11:30, schrieb Walter Bright:
 And how would you syntax-highlight a string mixin that's 
 assembled from
 arbitrary string fragments?
You wouldn't need to, since the text editor sees only normal D code.
the text editor sees just D-code-Strings - so no syntax-highlight except that for Strings
The q{ } syntax is meant to work around this.

mixin(q{
    int foo() { return 8; }
});

Should be highlighted.
Jun 17 2014
prev sibling parent Walter Bright <newshound2 digitalmars.com> writes:
On 6/17/2014 3:25 AM, dennis luehring wrote:
 the text editor sees just D-code-Strings - so no syntax-highlight except that
 for Strings
The syntax highlighter could highlight q{ ... } strings differently, if it chose to, with little difficulty.
Jun 17 2014
prev sibling parent reply Artur Skawina via Digitalmars-d <digitalmars-d puremagic.com> writes:
On 06/17/14 11:30, Walter Bright via Digitalmars-d wrote:
 On 6/16/2014 9:32 PM, H. S. Teoh via Digitalmars-d wrote:
 
 And how would you syntax-highlight a string mixin that's assembled from
 arbitrary string fragments?
You wouldn't need to, since the text editor sees only normal D code.
That's just because D lacks string interpolation and static-foreach. These two things take meta programming to a whole new level. The difference is similar to the C++- vs D-templates case. Superficially "it's just a neater syntax", but in practice it allows perfectly readable code to be written for cases where traditional (both template- and ctfe-based) meta programs become an almost unmaintainable mess. And the fact that those meta-programs can then be properly syntax-highlighted in an editor is a nice extra.

artur (who implemented both features last weekend; it started out as a fun "let's-see-how-D-would-look-if-it-had-this"-project, but after making them work and then converting a few small programs, almost immediately realized that he now does not want to live w/o this functionality)
Jun 17 2014
next sibling parent reply "Dicebot" <public dicebot.lv> writes:
On Tuesday, 17 June 2014 at 13:13:00 UTC, Artur Skawina via 
Digitalmars-d wrote:
 artur (who implemented both features last weekend; it started 
 out as a
 fun "let's-see-how-D-would-look-if-it-had-this"-project, but 
 after making
 them work and then converting a few small programs, almost 
 immediately
 realized that he now does not want to live w/o this 
 functionality)
I'd be very interested in seeing PR for the static foreach at the very least ;)
Jun 17 2014
parent reply "John Colvin" <john.loughran.colvin gmail.com> writes:
On Tuesday, 17 June 2014 at 13:24:11 UTC, Dicebot wrote:
 On Tuesday, 17 June 2014 at 13:13:00 UTC, Artur Skawina via 
 Digitalmars-d wrote:
 artur (who implemented both features last weekend; it started 
 out as a
 fun "let's-see-how-D-would-look-if-it-had-this"-project, but 
 after making
 them work and then converting a few small programs, almost 
 immediately
 realized that he now does not want to live w/o this 
 functionality)
I'd be very interested in seeing PR for the static foreach at the very least ;)
also, foreach that works outside of function scope would be awesome:

mixin template A(TL ...)
{
	foreach(i, T; TL)
	{
		mixin("T v" ~ i.to!string);
	}
}
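For context, a minimal sketch of one workaround that compiles today (illustrative, not part of the post): generate the whole set of declarations as a single string and mix it in once, so the loop runs during CTFE instead of at declaration scope.

import std.conv : to;

// CTFE helper that writes out all the declarations in one go
string makeFields(size_t n)
{
    string code;
    foreach (i; 0 .. n)
        code ~= "TL[" ~ i.to!string ~ "] v" ~ i.to!string ~ ";\n";
    return code;
}

mixin template A(TL...)
{
    mixin(makeFields(TL.length));   // TL is visible where the string is mixed in
}

struct S
{
    mixin A!(int, double);          // gives S an `int v0` and a `double v1`
}

void main()
{
    S s;
    s.v0 = 1;
    s.v1 = 2.5;
}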
Jun 17 2014
next sibling parent reply "Dicebot" <public dicebot.lv> writes:
On Tuesday, 17 June 2014 at 13:36:48 UTC, John Colvin wrote:
 also, foreach that works outside of function scope would be 
 awesome:

 mixin template A(TL ...)
 {
 	foreach(i, T; TL)
 	{
 		mixin("T v" ~ i.to!string);
 	}
 }
It is not "also", it is primary use case of static foreach
Jun 17 2014
parent reply "John Colvin" <john.loughran.colvin gmail.com> writes:
On Tuesday, 17 June 2014 at 13:52:48 UTC, Dicebot wrote:
 On Tuesday, 17 June 2014 at 13:36:48 UTC, John Colvin wrote:
 also, foreach that works outside of function scope would be 
 awesome:

 mixin template A(TL ...)
 {
 	foreach(i, T; TL)
 	{
 		mixin("T v" ~ i.to!string);
 	}
 }
It is not "also", it is primary use case of static foreach
I thought the primary use of static foreach was to force the compiler to attempt compile-time iteration even for non-TemplateArgList arguments like arrays known at compile-time

e.g.

static foreach(el; [1,2,3,4])
{
      pragma(msg, el);
}

or

static foreach(el; 5 .. 8)
{
      pragma(msg, el);
}

The mixin template example I gave is already a "static" foreach, just not explicitly so.
Jun 17 2014
next sibling parent "Dicebot" <public dicebot.lv> writes:
On Tuesday, 17 June 2014 at 14:00:44 UTC, John Colvin wrote:
 I thought the primary use of static foreach was to force the
 compiler to attempt compile-time iteration even for
 non-TemplateArgList arguments like arrays known at compile-time
If static foreach acts as a code generator there is no practical difference. Also, I don't believe the compile-time iteration use case is even remotely as common as declaration injection.
Jun 17 2014
prev sibling parent Timon Gehr <timon.gehr gmx.ch> writes:
On 06/17/2014 04:00 PM, John Colvin wrote:
 I thought the primary use of static foreach was to force the
 compiler to attempt compile-time iteration even for
 non-TemplateArgList arguments like arrays known at compile-time

 e.g.

 static foreach(el; [1,2,3,4])
 {
       pragma(msg, el);
 }

 or

 static foreach(el; 5 .. 8)
 {
       pragma(msg, el);
 }
No, that's a distinct use and IMO shouldn't be called static foreach (it would be inconsistent with static if in how scopes are handled). In any case, it is quite a boring use case as well; one can write a template that converts a range into such a list, and then plain foreach will work.
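A minimal sketch of that template route (illustrative, written against 2014-era std.typetuple; newer Phobos offers std.meta.aliasSeqOf for the same job):

import std.typetuple : TypeTuple;

// expand a compile-time array into a value tuple
template staticArray(int[] arr, size_t i = 0)
{
    static if (i == arr.length)
        alias staticArray = TypeTuple!();
    else
        alias staticArray = TypeTuple!(arr[i], staticArray!(arr, i + 1));
}

void main()
{
    // plain foreach over a tuple is unrolled at compile time,
    // so el is a compile-time constant in each iteration
    foreach (el; staticArray!([1, 2, 3, 4]))
        pragma(msg, el);
}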
Jun 17 2014
prev sibling next sibling parent "Kiith-Sa" <kiithsacmp gmail.com> writes:
On Tuesday, 17 June 2014 at 13:36:48 UTC, John Colvin wrote:
 On Tuesday, 17 June 2014 at 13:24:11 UTC, Dicebot wrote:
 On Tuesday, 17 June 2014 at 13:13:00 UTC, Artur Skawina via 
 Digitalmars-d wrote:
 artur (who implemented both features last weekend; it started 
 out as a
 fun "let's-see-how-D-would-look-if-it-had-this"-project, but 
 after making
 them work and then converting a few small programs, almost 
 immediately
 realized that he now does not want to live w/o this 
 functionality)
I'd be very interested in seeing PR for the static foreach at the very least ;)
also, foreach that works outside of function scope would be awesome: mixin template A(TL ...) { foreach(i, T; TL) { mixin("T v" ~ i.to!string); } }
This would drastically improve readability of some of my code.
Jun 17 2014
prev sibling parent Timon Gehr <timon.gehr gmx.ch> writes:
On 06/17/2014 03:36 PM, John Colvin wrote:

 also, foreach that works outside of function scope would be awesome:

 mixin template A(TL ...)
 {
      foreach(i, T; TL)
      {
          mixin("T v" ~ i.to!string);
      }
 }
Also, identifier mixins might then somewhat clean up a lot of code. The cases where a declaration name needs to be generated and this forces the whole declaration to be written in awkward string interpolation style are just too common, even more so if static foreach is supported (if there is any named declaration inside the static foreach body at all and the loop loops more than once, mixins will be required to prevent name clashes). Eg:

mixin template A(T...){
    static foreach(i,S;T){
        S mixin(`v`~i.to!string);
        auto mixin(`fun`~i.to!string)(S s){
            // lots of code potentially using `i' without first
            // converting it to a string only for it to be parsed back.
            // ...
            return s.mixin(`member`~i); // I've wanted this too
        }
    }
}

Also, there may be cases where one really wants to have local declarations (eg. enums) inside the static foreach loop.

(I really need to get around to polishing/formalizing that static foreach DIP!)
Jun 17 2014
prev sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 6/17/2014 6:12 AM, Artur Skawina via Digitalmars-d wrote:
 rtur (who implemented both features last weekend; it started out as a
 fun "let's-see-how-D-would-look-if-it-had-this"-project, but after making
 them work and then converting a few small programs,
Taking action rather than debating - I love it!
 immediately realized that he now does not want to live w/o this functionality)
I don't think I can take that kind of pressure!
Jun 17 2014
parent Artur Skawina via Digitalmars-d <digitalmars-d puremagic.com> writes:
On 06/17/14 22:15, Walter Bright via Digitalmars-d wrote:
 On 6/17/2014 6:12 AM, Artur Skawina via Digitalmars-d wrote:
 immediately realized that he now does not want to live w/o this functionality)
I don't think I can take that kind of pressure!
I was responding to "text editor sees only normal D code" -- my point was just that it doesn't have to be like that. But it's something that can be hard to realize or even imagine, until one sees the whole picture, with several language features playing well together. Just like otherwise very experienced C++ programmers often completely fail to appreciate certain D features, which only really start to make sense in context and combination.

The difference that these two features made certainly surprised me; suddenly I didn't have to write unreadable lambdas and mixins, the code shrank by a factor of ~three and became beautiful, even properly syntax highlighted after a few tweaks to the editor settings. It became very obvious that this is not just something-that-would-be-neat-to-have-but-D-has-so-many-other-more-important-problems, but that it is a must-have. And it's really trivial to add - I did it /within/ the language, took ~100LOC; a proper built-in implementation wouldn't be significantly harder.

But I'll shut up now, as apparently meta programming is considered harmful in certain industries, at least from what I read here. :) Wouldn't want to scare anybody away. I'll post in a new thread instead, in a few days, once I find the time to construct a proper example and write at least a few sentences explaining the syntax.

artur
Jun 18 2014
prev sibling next sibling parent "c0de517e" <kenpex tin.it> writes:
On Sunday, 15 June 2014 at 11:28:12 UTC, Peter Alexander wrote:
 http://c0de517e.blogspot.ca/2014/06/where-is-my-c-replacement.html?m=1

 The arguments against D are pretty weak if I'm honest, but I 
 think it's important we understand what people think of D. I 
 can confirm this sentiment is fairly common in the industry.

 Watch out for the little jab at Andrei :-P
Don't take the writing too literally, I improvise a lot. The point of the post was not an analysis of D or of its language features (for that I wrote a fairly big post in 2011 http://c0de517e.blogspot.ca/2011/04/2011-current-and-future-programming.html that also came with a survey among videogame professionals http://c0de517e.blogspot.ca/2011/05/2011-future-programming-languages-for.html). I like D and I think Rust is very nifty and interesting as well, FWIW.

This post was trying to (badly) say something else: given that we don't see much language activity in videogames (compared to other engineering fields), why is it that even when we have good languages, better versions of C++, we don't switch? And the point was that maybe "better" is not enough...
Jun 17 2014
prev sibling parent reply "Wanderer" <no-reply no-reply.org> writes:
On Sunday, 15 June 2014 at 11:28:12 UTC, Peter Alexander wrote:
 http://c0de517e.blogspot.ca/2014/06/where-is-my-c-replacement.html?m=1

 The arguments against D are pretty weak if I'm honest, but I 
 think it's important we understand what people think of D. I 
 can confirm this sentiment is fairly common in the industry.

 Watch out for the little jab at Andrei :-P
My opinion: if you want D to smoothly replace both C++ and Java, simply do the following:

1. Sane language specification (which doesn't allow a slice of a stack-allocated array to escape to other part of a program, doesn't allow an object to contain garbage under ANY circumstances etc).

2. Workable compiler (that doesn't crash on 20% of code it tries to compile :-P).

3. Stable, efficient and well-documented runtime library, including collection classes, IO, date/time, concurrency, GUI, graphics, sound etc.

4. A well-designed IDE written purely in D, which allows analysis and refactoring (like IntelliJ IDEA which is written in Java), free of course.

Believe me, after the step 4 is finished, MANY, if not most, of C++ and Java programmers will switch to D in no time. The language already provides many nice improvements, it's just not practical to use D yet (because RTL is still under development, no IDE etc).
Jun 18 2014
next sibling parent reply Rikki Cattermole <alphaglosined gmail.com> writes:
On 18/06/2014 8:21 p.m., Wanderer wrote:
 On Sunday, 15 June 2014 at 11:28:12 UTC, Peter Alexander wrote:
 http://c0de517e.blogspot.ca/2014/06/where-is-my-c-replacement.html?m=1

 The arguments against D are pretty weak if I'm honest, but I think
 it's important we understand what people think of D. I can confirm
 this sentiment is fairly common in the industry.

 Watch out for the little jab at Andrei :-P
My opinion: if you want D to smoothly replace both C++ and Java, simply do the following: 1. Sane language specification (which doesn't allow a slice of a stack-allocated array to escape to other part of a program, doesn't allow an object to contain garbage under ANY circumstances etc). 2. Workable compiler (that doesn't crash on 20% of code it tries to compile :-P).
I've only found that when using CTFE + templates in a big way. Any other time... it's like 1% if that.
 3. Stable, efficient and well-documented runtime library, including
 collection classes, IO, date/time, concurrency, GUI, graphics, sound etc.

 4. A well-designed IDE written purely in D, which allows analysis and
 refactoring (like IntelliJ IDEA which is written in Java), free of course.

 Believe me, after the step 4 is finished, MANY, if not most, of C++ and
 Java programmers will switch to D in no time. The language already
 provides many nice improvements, it's just not practical to use D yet
 (because RTL is still under development, no IDE etc).
Something that I was thinking about, was about building the ecosystem up but not in a purely free way. Dual licensing. Free for opensource, education and personal use. Not free for commercial use. Buy the IDE, buy the lot kind of deal. I know this is a little taboo in the D community, but it would help considerably I think.
Jun 18 2014
parent "Ola Fosheim =?UTF-8?B?R3LDuHN0YWQi?= writes:
On Wednesday, 18 June 2014 at 08:27:57 UTC, Rikki Cattermole 
wrote:
 On 18/06/2014 8:21 p.m., Wanderer wrote:
 3. Stable, efficient and well-documented runtime library, 
 including
 collection classes, IO, date/time, concurrency, GUI, graphics, 
 sound etc.
I don't really think big standard libraries are all that important. You need the basic ADTs that cover the holes in the language and some basic interfaces for streams. The other stuff is too system specific and will come when the language is stable and capable and the runtime/GC is at (commercial) production level. I think it is wrong for a system level language to create emulation layers in the runtime to even out OS differences (which only work for Posixy OSes). It is better to have semi-official OS-X bindings, Windows bindings, Posix bindings etc. Look at the std C libs, which are pretty small, but quite obsolete due to their CLI/unix roots. std libs should never be obsolete due to changes in the environment.
 4. A well-designed IDE written purely in D, which allows 
 analysis and
 refactoring (like IntelliJ IDEA which is written in Java), 
 free of course.
The low hanging fruit is a community effort towards Eclipse.
 Something that I was thinking about, was about building the 
 ecosystem up but not in a purely free way.
 Dual licensing. Free for opensource, education and personal 
 use. Not free for commercial use. Buy the IDE, buy the lot kind 
 of deal.
The basics have to be open source and free, meaning at least an Eclipse level IDE. Then you can have commercial fine tuned tools in addition to that (like a commercial vendor targeting PNACL, Windows or iOS). I don't think dual licensing through dlang.org is a good idea. It erodes the perception of dlang.org being a "foundation" and turns it into "freeloading company". That's usually bad if you want volunteers. SUN was quite nice with open source, but received almost no external contribution (compared to BSD/Linux). The original source should be perceived as altruistic. I think Walter Bright does that part quite well. Better to have external entities do the commercial heavy lifting IMO.
Jun 18 2014
prev sibling parent reply "c0de517e" <kenpex tin.it> writes:
 My opinion: if you want D to smoothly replace both C++ and 
 Java, simply do the following:

 1. Sane language specification (which doesn't allow a slice of 
 a stack-allocated array to escape to other part of a program, 
 doesn't allow an object to contain garbage under ANY 
 circumstances etc).

 2. Workable compiler (that doesn't crash on 20% of code it 
 tries to compile :-P).

 3. Stable, efficient and well-documented runtime library, 
 including collection classes, IO, date/time, concurrency, GUI, 
 graphics, sound etc.

 4. A well-designed IDE written purely in D, which allows 
 analysis and refactoring (like IntelliJ IDEA which is written 
 in Java), free of course.
In my domain 4. is totally unnecessary: we use Visual Studio, or no IDE at all, on 99% of the projects. VisualD is the best thing that could have been done. 3. and 1. are quite unnecessary too; 2. of course is a must.

But as I wrote I doubt that people will think at a point that yes, now D is 100% a better version of C++/Java/younameit, let's switch. I don't think it's how things go, I think successful languages find one thing a community really can't live without and get adopted there and from there expand. E.G. JavaScript is horribly broken, but some people really needed to be able to put code client-side on web pages, so now JS is everywhere...
Jun 18 2014
parent reply "Dicebot" <public dicebot.lv> writes:
On Wednesday, 18 June 2014 at 16:19:25 UTC, c0de517e wrote:
 But as I wrote I doubt that people will think at a point that 
 yes, now D is 100% a better version of C++/Java/younameit, 
 let's switch. I don't think it's how things go, I think 
 successful languages find one thing a community really can't 
 live without and get adopted there and from there expand. E.G. 
 JavaScript is horribly broken, but some people really needed to 
 be able to put code client-side on web pages, so now JS is 
 everywhere...
I think this is actually a flawed mentality that causes a lot of long-term problems to all programmers. By resisting to switch to languages simply because those are good we inevitably get to the point of switching because it is forced by some corporation that has bucks to create an intrusive ecosystem. And despite the fact language itself can be horrible no choice remains by then.
Jun 18 2014
next sibling parent reply "deadalnix" <deadalnix gmail.com> writes:
On Wednesday, 18 June 2014 at 16:55:53 UTC, Dicebot wrote:
 On Wednesday, 18 June 2014 at 16:19:25 UTC, c0de517e wrote:
 But as I wrote I doubt that people will think at a point that 
 yes, now D is 100% a better version of C++/Java/younameit, 
 let's switch. I don't think it's how things go, I think 
 successful languages find one thing a community really can't 
 live without and get adopted there and from there expand. E.G. 
 JavaScript is horribly broken, but some people really needed 
 to be able to put code client-side on web pages, so now JS is 
 everywhere...
I think this is actually a flawed mentality that causes a lot of long-term problems to all programmers. By resisting to switch to languages simply because those are good we inevitably get to the point of switching because it is forced by some corporation that has bucks to create an intrusive ecosystem. And despite the fact language itself can be horrible no choice remains by then.
This is, but that's how it works nevertheless. You don't succeed by arguing what the reality should be, but by accepting what it is and act accordingly.
Jun 18 2014
next sibling parent reply "Dicebot" <public dicebot.lv> writes:
On Wednesday, 18 June 2014 at 18:17:03 UTC, deadalnix wrote:
 This is, but that's how it works nevertheless. You don't 
 succeed by arguing what the reality should be, but by accepting 
 what it is and act accordingly.
Being ashamed of it instead of glorifying such attitude is one way to motivate a change :)
Jun 18 2014
parent reply "c0de517e" <kenpex tin.it> writes:
On Wednesday, 18 June 2014 at 18:18:28 UTC, Dicebot wrote:
 On Wednesday, 18 June 2014 at 18:17:03 UTC, deadalnix wrote:
 This is, but that's how it works nevertheless. You don't 
 succeed by arguing what the reality should be, but by 
 accepting what it is and act accordingly.
Being ashamed of it instead of glorifying such attitude is one way to motivate a change :)
You can't fight human psychology, but if you're -really- smart you strive to understand it and work with it.
Jun 18 2014
next sibling parent Nick Sabalausky <SeeWebsiteToContactMe semitwist.com> writes:
On 6/18/2014 3:09 PM, c0de517e wrote:
 On Wednesday, 18 June 2014 at 18:18:28 UTC, Dicebot wrote:
 On Wednesday, 18 June 2014 at 18:17:03 UTC, deadalnix wrote:
 This is, but that's how it works nevertheless. You don't succeed by
 arguing what the reality should be, but by accepting what it is and
 act accordingly.
Being ashamed of it instead of glorifying such attitude is one way to motivate a change :)
You can't fight human psychology, but if you're -really- smart you strive to understand it and work with it.
There's a *big* difference between "human psychology" and "being an idiot who makes decisions poorly". For the former, unconditional acceptance is the only possible option. But for the latter, unconditional acceptance is nothing more than a convenient way to justify (and in effect, encourage) idiocy; it's both self-destructive and entirely avoidable given the actual willingness to avoid it.

The belief that "No amount of improvement is worthwhile unless it comes with some single killer feature" might be common, but it definitely is NOT an immutable aspect of human psychology: It's just plain being an idiot who's trying to rationalize their own laziness and fear of change, instead of doing a programmer's/engineer's JOB of making decisions based on valid reasoning. It's NOT an immutable "human psychology" belief until someone's DECIDED to rationalize it as such and make excuses for it.

The world, and especially the tech sector, has become so pathetically inundated with morons and idiocy: because instead of fighting and condemning stupidity, it's excused, accepted and even rewarded. That's exactly why so much has gone soooo fucking wrong.
Jun 18 2014
prev sibling parent reply "Dicebot" <public dicebot.lv> writes:
On Wednesday, 18 June 2014 at 19:09:08 UTC, c0de517e wrote:
 On Wednesday, 18 June 2014 at 18:18:28 UTC, Dicebot wrote:
 On Wednesday, 18 June 2014 at 18:17:03 UTC, deadalnix wrote:
 This is, but that's how it works nevertheless. You don't 
 succeed by arguing what the reality should be, but by 
 accepting what it is and act accordingly.
Being ashamed of it instead of glorifying such attitude is one way to motivate a change :)
You can't fight human psychology, but if you're -really- smart you strive to understand it and work with it.
No, this is what you do to _pretend_ to be a smart and pragmatic person, an approach so popularized by modern culture that I sometimes start thinking it is intentional.

You see, while fighting human psychology (actually "mentality" is the correct term here, I think) definitely does not work, influencing it is not only possible but has in fact happened all the time throughout human history. Mentality is largely shaped by aggregated culture, and any public action you take affects that aggregated culture in some tiny way. You can't force people to start thinking in a different way, but you can start being an example of a different attitude yourself, popularizing and encouraging it. You can stop referring to that unfortunate trait of mentality as an excuse for not adopting the language in your blog posts - it will do fine without your help. You can casually mention how much wasted effort and daily inconvenience such an attitude causes your co-workers (in a gentle, non-intrusive way!). You can start acting _as if_ the mentality were different instead of going the route of imaginary pragmatism.

In practice, acting intentionally irrational is the only way to break the prisoner's dilemma, and it is how people have influenced culture and mentality all along. It may not change the thinking process of contemporary adults, but a few people doing stupid things here and there can accumulate enough cultural change to influence the future.

Considering the amount of "not smart" things I have done through my life, by now it must have been totally fucked up. Failing to notice that indicates that something is fundamentally wrong with the popular image of pragmatism.
Jun 18 2014
parent "c0de517e" <kenpex tin.it> writes:
 You can casually mention how much wasted effort and daily 
 inconvenience such an attitude causes your co-workers (in a 
 gentle, non-intrusive way!). You can start acting _as if_ the 
 mentality were different instead of going the route of 
 imaginary pragmatism.
 
 In practice, acting intentionally irrational is the only way to 
 break the prisoner's dilemma, and it is how people have 
 influenced culture and mentality all along.
I would fight irrational choices, that's agreeable. But the thing is that the technical plane is not the only thing to consider when making rational choices. It is totally rational to understand that things like proficiency, education, legacy, familiarity, environment, and future-proofing affect the decision of which language to use. It's totally rational, and a reason why adoption needs to climb a much higher barrier than simply noting: oh, this is much better, just switch. It's like going to a guitarist and trying to get him to switch from a guitar he has played his whole life just by saying: here, this one has less noise, why are you so irrational, it's clearly better.
Jun 18 2014
prev sibling parent reply "c0de517e" <kenpex tin.it> writes:
 I think this is actually a flawed mentality that causes a lot 
 of long-term problems to all programmers. By resisting to 
 switch to languages simply because those are good we 
 inevitably get to the point of switching because it is forced 
 by some corporation that has bucks to create an intrusive 
 ecosystem. And despite the fact language itself can be 
 horrible no choice remains by then.
This is, but that's how it works nevertheless. You don't succeed by arguing what the reality should be, but by accepting what it is and act accordingly.
Exactly. When I write that engineers have to understand how market works it's not that I don't understand what's technically good and bad, but that's not how things become successful. And there's nothing wrong with the fact that soft factors matter more than technical perfection, at all, because we make machines and programs for people, not to look at how pretty they seem.
Jun 18 2014
parent reply "Kagamin" <spam here.lot> writes:
On Wednesday, 18 June 2014 at 19:08:17 UTC, c0de517e wrote:
 Exactly. When I write that engineers have to understand how 
 market works it's not that I don't understand what's 
 technically good and bad, but that's not how things become 
 successful. And there's nothing wrong with the fact that soft 
 factors matter more than technical perfection, at all, because 
 we make machines and programs for people, not to look at how 
 pretty they seem.
And technologies should be for machines and for people, but C++ is not for machines and not for people; it's only for compatibility with itself. BTW, modules break that compatibility, which makes it impossible to migrate to them, because then you would throw away or rewrite your whole codebase, and that still doesn't guarantee the result will fly; that is, they destroy the whole reason for C++'s existence.
Jun 19 2014
parent "Paulo Pinto" <pjmlp progtools.org> writes:
On Thursday, 19 June 2014 at 13:52:12 UTC, Kagamin wrote:
 On Wednesday, 18 June 2014 at 19:08:17 UTC, c0de517e wrote:
 Exactly. When I write that engineers have to understand how 
 market works it's not that I don't understand what's 
 technically good and bad, but that's not how things become 
 successful. And there's nothing wrong with the fact that soft 
 factors matter more than technical perfection, at all, because 
 we make machines and programs for people, not to look at how 
 pretty they seem.
And technologies should be for machines and for people, but C++ is not for machines and not for people; it's only for compatibility with itself. BTW, modules break that compatibility, which makes it impossible to migrate to them, because then you would throw away or rewrite your whole codebase, and that still doesn't guarantee the result will fly; that is, they destroy the whole reason for C++'s existence.
Modules are still being discussed. Besides the prototype implementation in LLVM, there are other proposals being discussed. There will be a meeting in a few weeks' time about the existing proposals.

As for C++'s reason for existence, I think it is still very valuable. Only recently have OS and compiler vendors started to move from C to C++. How long will it take for them to move from C++ to something else like D?

-- Paulo
Jun 19 2014
prev sibling parent "Paulo Pinto" <pjmlp progtools.org> writes:
On Wednesday, 18 June 2014 at 16:55:53 UTC, Dicebot wrote:
 On Wednesday, 18 June 2014 at 16:19:25 UTC, c0de517e wrote:
 But as I wrote I doubt that people will think at a point that 
 yes, now D is 100% a better version of C++/Java/younameit, 
 let's switch. I don't think it's how things go, I think 
 successful languages find one thing a community really can't 
 live without and get adopted there and from there expand. E.G. 
 JavaScript is horribly broken, but some people really needed 
 to be able to put code client-side on web pages, so now JS is 
 everywhere...
I think this is actually a flawed mentality that causes a lot of long-term problems to all programmers. By resisting to switch to languages simply because those are good we inevitably get to the point of switching because it is forced by some corporation that has bucks to create an intrusive ecosystem. And despite the fact language itself can be horrible no choice remains by then.
Especially important for systems programming languages, as the majority of developers only use what is available in the OS/hardware vendor's SDK.

-- Paulo
Jun 18 2014