
digitalmars.D - How does D improve design practices over C++?

reply Janderson <ask me.com> writes:
Hi,

I was talking with some colleagues at work and they asked me how D 
enforces good programming practices.  Of course I mentioned a couple 
of the ones I knew off hand -

- Unit checking
- Design by contract
- Invariant checks
- Stronger const
- Modules
- Garbage collection
- No automatic copy constructor
- More restrictive operators
- Delegates (Specifically, encouraging their use by making them simple 
to use)
- Specific constructs such as Interfaces
- More restrictive casting
- No C style Macros

I'm sure I've missed a lot in this area.  I'd like to email them a good 
list of "good coding design" that D promotes through syntax.  Note: The 
C++ versus D page is not what I'm looking for.  I'm more interested in 
coding practices than anything else.  For instance things that you can 
mess up in C++ but D won't let you.

Cheers
-Joel
Oct 28 2008
next sibling parent reply Walter Bright <newshound1 digitalmars.com> writes:
Janderson wrote:
 Hi,
 
 I was talking with some colleagues at work and they asked me how D 
 enforces good programming practices.  Of course I mentioned a couple 
 of the ones I knew off hand -
 
 - Unit checking
 - Design by contract
 - Invariant checks
 - Stronger const
 - Modules
 - Garbage collection
 - No automatic copy constructor
 - More restrictive operators
 - Delegates (Specifically, encouraging their use by making them simple 
 to use)
 - Specific constructs such as Interfaces
 - More restrictive casting
 - No C style Macros
 
 I'm sure I've missed a lot in this area.  I'd like to email them a good 
 list of "good coding design" that D promotes through syntax.  Note: The 
 C++ versus D page is not what I'm looking for.  I'm more interested in 
 coding practices than anything else.  For instance things that you can 
 mess up in C++ but D won't let you.
It's a good idea to come up with such a list. Let me add:

- Overload sets which prevent function call hijacking
- Nested functions
- Structs with value semantics, Classes with reference semantics
- Ability to move (bitcopy) a struct without invoking construction
- Alias parameters to templates
- Compile time function evaluation
- Function, array, and struct literals
- String mixins
- In, reference, and out function parameters
- Lazy function parameters
- Standardized way of doing conditional compilation
- Standardized way of debug conditionals
- Warnings on implicit casts that lose bits
- No uninitialized data
- User defined default initializers for typedefs and fields
- Template constraints
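For instance, two of those -- no uninitialized data and template constraints -- look roughly like this in D2 (a minimal sketch; the names are made up):

import std.stdio;

// No uninitialized data: module-level x gets int.init (0), never garbage.
int x;

// Template constraint: twice() only instantiates for types that
// implicitly convert to long.
T twice(T)(T value) if (is(T : long))
{
    return cast(T)(value * 2);
}

void main()
{
    writefln("%d", x);          // 0
    writefln("%d", twice(21));  // 42
}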
Oct 29 2008
next sibling parent reply "Bill Baxter" <wbaxter gmail.com> writes:
On Wed, Oct 29, 2008 at 4:13 PM, Walter Bright
<newshound1 digitalmars.com> wrote:
 Janderson wrote:
- The override keyword helps you make sure you're really overriding something. Been fighting with some bugs in C++ that would have been prevented by that one.

--bb
Oct 29 2008
parent Walter Bright <newshound1 digitalmars.com> writes:
Bill Baxter wrote:
 On Wed, Oct 29, 2008 at 4:13 PM, Walter Bright
 <newshound1 digitalmars.com> wrote:
 Janderson wrote:
- The override keyword helps you make sure you're really overriding something. Been fighting with some bugs in C++ that would have been prevented by that one.
Add the corresponding final keyword in then, too, which *prevents* overriding.
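In D that pair looks something like this (a minimal sketch; class names made up):

class Base
{
    void draw() { }
    final void refresh() { }       // final: cannot be overridden
}

class Derived : Base
{
    override void draw() { }       // ok: really does override Base.draw
    // override void drow() { }    // error: overrides nothing (typo caught)
    // void refresh() { }          // error: cannot override the final function
}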
Oct 29 2008
prev sibling parent reply Janderson <ask me.com> writes:
Walter Bright wrote:
 Janderson wrote:
 Hi,

 I was talking with some colleagues at work and they asked me how D 
 enforces good programming practices.  Of course I mentioned a couple 
 of the ones I knew off hand -

 - Unit checking
 - Design by contract
 - Invariant checks
 - Stronger const
 - Modules
 - Garbage collection
 - No automatic copy constructor
 - More restrictive operators
 - Delegates (Specifically, encouraging their use by making them simple 
 to use)
 - Specific constructs such as Interfaces
 - More restrictive casting
 - No C style Macros

 I'm sure I've missed a lot in this area.  I'd like to email them a 
 good list of "good coding design" that D promotes through syntax.  
 Note: The C++ versus D page is not what I'm looking for.  I'm more 
 interested in coding practices than anything else.  For instance 
 things that you can mess up in C++ but D won't let you.
It's a good idea to come up with such a list. Let me add:

- Overload sets which prevent function call hijacking
- Nested functions
- Structs with value semantics, Classes with reference semantics
- Ability to move (bitcopy) a struct without invoking construction
- Alias parameters to templates
- Compile time function evaluation
- Function, array, and struct literals
- String mixins
- In, reference, and out function parameters
- Lazy function parameters
- Standardized way of doing conditional compilation
- Standardized way of debug conditionals
- Warnings on implicit casts that lose bits
- No uninitialized data
- User defined default initializers for typedefs and fields
- Template constraints
Here's another big one:

- encourages less frequent use of pointers

-Joel
Oct 29 2008
next sibling parent Janderson <ask me.com> writes:
Janderson wrote:
 Walter Bright wrote:
 Janderson wrote:
 Hi,

 I was talking with some colleagues at work and they asked me how D 
 enforces good programming practices.  Of course I mentioned a 
 couple of the ones I knew off hand -

 - Unit checking
 - Design by contract
 - Invariant checks
 - Stronger const
 - Modules
 - Garbage collection
 - No automatic copy constructor
 - More restrictive operators
 - Delegates (Specifically, encouraging their use by making them 
 simple to use)
 - Specific constructs such as Interfaces
 - More restrictive casting
 - No C style Macros

 I'm sure I've missed a lot in this area.  I'd like to email them a 
 good list of "good coding design" that D promotes through syntax.  
 Note: The C++ versus D page is not what I'm looking for.  I'm more 
 interested in coding practices than anything else.  For instance 
 things that you can mess up in C++ but D won't let you.
It's a good idea to come up with such a list. Let me add:

- Overload sets which prevent function call hijacking
- Nested functions
- Structs with value semantics, Classes with reference semantics
- Ability to move (bitcopy) a struct without invoking construction
- Alias parameters to templates
- Compile time function evaluation
- Function, array, and struct literals
- String mixins
- In, reference, and out function parameters
- Lazy function parameters
- Standardized way of doing conditional compilation
- Standardized way of debug conditionals
- Warnings on implicit casts that lose bits
- No uninitialized data
- User defined default initializers for typedefs and fields
- Template constraints
Here's another big one:

- encourages less frequent use of pointers

-Joel
- No friend, enforcing better encapsulation. Things can be shared only in the same module.
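A small sketch of what that buys you (module and type names made up):

module shapes;

class Circle
{
    private double radius = 1.0;   // private, yet visible below
}

// Legal: code in the same module sees Circle's private members,
// so no 'friend' declaration is ever needed.
double area(Circle c)
{
    return 3.14159 * c.radius * c.radius;
}

void main()
{
    auto a = area(new Circle);
}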
Oct 29 2008
prev sibling parent reply Walter Bright <newshound1 digitalmars.com> writes:
Janderson wrote:
 Here's another big one:
 - encourages less frequent use of pointers
I didn't include that because many more advanced C++ programmers eschew pointers, and there is enough in the language to support that.
Oct 29 2008
parent Janderson <ask me.com> writes:
Walter Bright wrote:
 Janderson wrote:
 Here's another big one:
 - encourages less frequent use of pointers
I didn't include that because many more advanced C++ programmers eschew pointers, and there is enough in the language to support that.
For me, having alternatives to pointers for the most common operations prevents (or reduces) the risk of memory corruption, which I think is a good practice. I'm not against pointers either. However, I think a modern programmer should always encapsulate their use of pointers where they need them.

-Joel
Oct 29 2008
prev sibling next sibling parent "Denis Koroskin" <2korden gmail.com> writes:
On Wed, 29 Oct 2008 08:53:47 +0300, Janderson <ask me.com> wrote:

 Hi,

 I was talking with some colleagues at work and they asked me how D 
 enforces good programming practices.  Of course I mentioned a couple 
 of the ones I knew off hand -

 - Unit checking
 - Design by contract
 - Invariant checks
 - Stronger const
 - Modules
 - Garbage collection
 - No automatic copy constructor
 - More restrictive operators
 - Delegates (Specifically, encouraging their use by making them simple 
 to use)
 - Specific constructs such as Interfaces
 - More restrictive casting
 - No C style Macros

 I'm sure I've missed a lot in this area.  I'd like to email them a good  
 list of "good coding design" that D promotes through syntax.  Note: The  
 C++ versus D page is not what I'm looking for.  I'm more interested in 
 coding practices than anything else.  For instance things that you can 
 mess up in C++ but D won't let you.

 Cheers
 -Joel
// The following is valid C/C++ but not valid D (note the semicolon)
for (int i = 0; i < 100; ++i);
    printf("%d", i);
Oct 29 2008
prev sibling next sibling parent reply Paul D. Anderson <paul.d.removethis.anderson comcast.andthis.net> writes:
Walter Bright Wrote:

 Bill Baxter wrote:
 On Wed, Oct 29, 2008 at 4:13 PM, Walter Bright
 <newshound1 digitalmars.com> wrote:
 Janderson wrote:
- The override keyword helps you make sure you're really overriding something. Been fighting with some bugs in C++ that would have been prevented by that one.
Add the corresponding final keyword in then, too, which *prevents* overriding.
This is great. I've submitted an abstract for an oral presentation on "Compiler-Assisted Quality" at an upcoming Boeing software conference. (I haven't heard yet whether it's been accepted.) I was going to ask for a list like this to help prepare the presentation.

The theme of the conference is software quality, which is why I had to give the presentation that dorky name. The presentations are not supposed to be vendor-specific so I hesitated to use "D Programming Language" in the title (even though D is free as in free beer) but I will be using D for all the examples.

Thanks again.

Paul
Oct 29 2008
parent reply bearophile <bearophileHUGS lycos.com> writes:
Paul D. Anderson:
I've submitted an abstract for an oral presentation on "Compiler-Assisted
Quality" at an upcoming Boeing software conference. (I haven't heard yet
whether it's been accepted.) I was going to ask for a list like this to help
prepare the presentation.<
D1 is often safer than C and C++, but regarding safety there are several things that can still be improved, often with no/little performance penalty (unsafe casting (automatic and manual), integral overflows, GC pointers Vs other pointers, nonnull types, named arguments, fallthrough switch cases, multi var assigns syntax missing, octal literals, bitwise operators as symbols instead of English words, and many other smaller things I have listed in the past). You may want to tell them about the idea of SafeD too (Java-esque D).

Bye,
bearophile
Oct 29 2008
parent reply Walter Bright <newshound1 digitalmars.com> writes:
bearophile wrote:
 D1 is often safer than C and C++, but regarding safety there are
 several things that can still be improved, often with no/little
 performance penalty (unsafe casting (automatic and manual), integral
 overflows, GC pointers Vs other pointers, nonnull types, named
 arguments, fallthrough switch cases, multi var assigns syntax
 missing, octal literals, bitwise operators as symbols instead of
 English words, and many other smaller things I have listed in the
 past). You may want to tell them about the idea of SafeD too (Java-esque D).
"Safety" in programming languages does not refer to program correctness, but absence of bugs that could result in memory corruption. The agenda of SafeD is to find the subset of D that can guarantee there is no memory corruption. Null pointer dereferencing, for example, is a program bug but is not a safety issue because it cannot cause memory corruption.
Oct 29 2008
next sibling parent reply bearophile <bearophileHUGS lycos.com> writes:
Walter Bright:
 "Safety" in programming languages does not refer to program correctness, 
 but absence of bugs that could result in memory corruption. The agenda 
 of SafeD is to find the subset of D that can guarantee there is no 
 memory corruption.
Yes, you are right, I have mixed two different things. They are almost orthogonal. The final purpose of a good language is to allow you to write, in a short enough time, programs that give the correct output. But the things I was referring to are helpers to avoid putting bugs into the code, while SafeD is a way to not have really bad memory consequences if a certain class of errors is present anyway in the code :-)

Bye,
bearophile
Oct 29 2008
parent reply Paul D. Anderson <paul.d.removethis.anderson comcast.andthis.net> writes:
bearophile Wrote:

 Walter Bright:
 "Safety" in programming languages does not refer to program correctness, 
 but absence of bugs that could result in memory corruption. The agenda 
 of SafeD is to find the subset of D that can guarantee there is no 
 memory corruption.
Yes, you are right, I have mixed two different things. They are almost orthogonal. The final purpose of a good language is to allow you to write, in a short enough time, programs that give the correct output. But the things I was referring to are helpers to avoid putting bugs into the code, while SafeD is a way to not have really bad memory consequences if a certain class of errors is present anyway in the code :-) Bye, bearophile
Since Boeing is a defense contractor many projects require safety in the "safe from accidentally doing something that could hurt somebody" sense. Some projects require a safety review (in the above sense) and an airworthiness review (in an obviously related sense). The current practice is an elaborate line-by-line, change-by-change review process, as well as an extensive test program.

I doubt that the DoD will ever do away with these reviews (nor do I think they should) but any help the programmers can get from the compiler to avoid unsafe programming "gotchas" has a potential for real cost savings -- finding the problem when it is cheap to fix, rather than in a costly review, revise, recheck loop at the end of the design effort. And Boeing likes cost savings.

This is why I'm trying to get D noticed here at Boeing. It's a very good fit for the things we do -- safe (memory safe and physically safe), efficient, powerful without being too complex, and a capable systems language. We still do a lot of bit-twiddling programming and need to be able to get to the hardware.

Paul

p.s. I'd forgotten about SafeD. Thanks for the reminder.
Oct 29 2008
parent Walter Bright <newshound1 digitalmars.com> writes:
Paul D. Anderson wrote:
 This is why I'm trying to get D noticed here at Boeing. It's a very
 good fit for the things we do -- safe (memory safe and physically
 safe), efficient, powerful without being too complex, and a capable
 systems language. We still do a lot of bit-twiddling programming and
 need to be able to get to the hardware.
I used to work at Boeing (757 stab trim design), so I have a particular interest in how this works out. Please keep us up to date! If I can help, let me know. I'm also available to give a presentation at Boeing if invited <g>.
Oct 29 2008
prev sibling next sibling parent reply Brad Roberts <braddr puremagic.com> writes:
On Wed, 29 Oct 2008, Walter Bright wrote:

 bearophile wrote:
 D1 is often safer than C and C++, but regarding safety there are
 several things that can still be improved, often with no/little
 performance penalty (unsafe casting (automatic and manual), integral
 overflows, GC pointers Vs other pointers, nonnull types, named
 arguments, fallthrough switch cases, multi var assigns syntax
 missing, octal literals, bitwise operators as symbols instead of
 English words, and many other smaller things I have listed in the
 past). You may want to tell them about the idea of SafeD too (Java-esque D).
"Safety" in programming languages does not refer to program correctness, but absence of bugs that could result in memory corruption. The agenda of SafeD is to find the subset of D that can guarantee there is no memory corruption. Null pointer dereferencing, for example, is a program bug but is not a safety issue because it cannot cause memory corruption.
Actually, that's not true. Dereferencing null _can_ corrupt memory. As you well know, ptr[index] is just ptr + index. Use a large and accurate enough index and you're out of that first page of memory and back into application memory space. Find the address of a key stack variable and you've got room for all sorts of fun and mayhem.

These are the sorts of bugs that, in popular enough applications, end up costing companies lots of money in emergency fixes. One of the recent Flash exploits was exactly this type of bug.

Later,
Brad
Oct 29 2008
parent Walter Bright <newshound1 digitalmars.com> writes:
Brad Roberts wrote:
 On Wed, 29 Oct 2008, Walter Bright wrote:
 Null pointer dereferencing, for example, is a program bug but is not a safety
 issue because it cannot cause memory corruption.
Actually, that's not true.  Dereferencing null _can_ corrupt memory. As you well know, ptr[index] is just ptr + index.  Use a large and accurate enough index and you're out of that first page of memory and back into application memory space.  Find the address of a key stack variable and you've got room for all sorts of fun and mayhem.
True, but technically that is not a null pointer dereference. There are also ways to deal with it. One is to disallow fixed offsets exceeding the protected null space (Java prohibits objects > 64Kb in size for this reason). Next is to disallow pointer arithmetic (which is what SafeD proposes).
 These are the sorts of bugs in popular enough applications are the things 
 that end up costing companies lots of money to emergency fix.  One of the 
 few recent flash exploits were exactly this type of bug.
You're right, and SafeD should make such exploits impossible.
Oct 29 2008
prev sibling next sibling parent reply "Jarrett Billingsley" <jarrett.billingsley gmail.com> writes:
On Wed, Oct 29, 2008 at 5:43 PM, Brad Roberts <braddr puremagic.com> wrote:
 On Wed, 29 Oct 2008, Walter Bright wrote:

 bearophile wrote:
 D1 is often safer than C and C++, but regarding safety there are
 several things that can still be improved, often with no/little
 performance penalty (unsafe casting (automatic and manual), integral
 overflows, GC pointers Vs other pointers, nonnull types, named
 arguments, fallthrough switch cases, multi var assigns syntax
 missing, octal literals, bitwise operators as symbols instead of
 English words, and many other smaller things I have listed in the
 past). You may want to tell them about the idea of SafeD too (Java-esque D).
"Safety" in programming languages does not refer to program correctness, but absence of bugs that could result in memory corruption. The agenda of SafeD is to find the subset of D that can guarantee there is no memory corruption. Null pointer dereferencing, for example, is a program bug but is not a safety issue because it cannot cause memory corruption.
Actually, that's not true. Dereferencing null _can_ corrupt memory. As you well know, ptr[index] is just ptr + index. Use a large and accurate enough index and you're out of that first page of memory and back into application memory space. Find the address of a key stack variable and you've got room for all sorts of fun and mayhem.

These are the sorts of bugs that, in popular enough applications, end up costing companies lots of money in emergency fixes. One of the recent Flash exploits was exactly this type of bug.

Later, Brad
Interestingly, although null dereferences are unsafe, in a safe language like SafeD it's not actually possible to do so. There are no pointers and arrays are bounds-checked. So with the combination of the typing system and the runtime checks, null can never actually be dereferenced, so no special consideration has to be given to it.
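A rough sketch of what the runtime check buys you (bounds checks are on by default with dmd and stripped with -release):

void main()
{
    int[] a = new int[4];
    a[10] = 1;   // no silent corruption: throws an array-bounds error at runtime
}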
Oct 29 2008
parent reply Frits van Bommel <fvbommel REMwOVExCAPSs.nl> writes:
Jarrett Billingsley wrote:
 Interestingly, although null dereferences are unsafe, in a safe
 language like SafeD it's not actually possible to do so.  There are no
 pointers and arrays are bounds-checked.  So with the combination of
 the typing system and the runtime checks, null can never actually be
 dereferenced, so no special consideration has to be given to it.
Assuming it still allows heap-allocated objects, something like this will still work:
----
class C {
    ubyte[16 * 1024 * 1024 - 1] memory;
}

void poke(size_t intptr, ubyte b) {
    C c; // kept at null deliberately
    c.memory[intptr - c.memory.offsetof] = b;
}

ubyte peek(size_t intptr) {
    C c; // kept at null deliberately
    return c.memory[intptr - c.memory.offsetof];
}
----
(That is, unless it emits 'this' null-checks for object field accesses as well)
Oct 29 2008
next sibling parent reply Walter Bright <newshound1 digitalmars.com> writes:
Frits van Bommel wrote:
 Assuming it still allows heap-allocated objects, something like this 
 will still work:
 ----
 class C {
     ubyte[16 * 1024 * 1024 - 1] memory;
 }
The fix for that is to disallow objects with a large static size.
Oct 29 2008
parent Sean Kelly <sean invisibleduck.org> writes:
Walter Bright wrote:
 Frits van Bommel wrote:
 Assuming it still allows heap-allocated objects, something like this 
 will still work:
 ----
 class C {
     ubyte[16 * 1024 * 1024 - 1] memory;
 }
The fix for that is to disallow objects with a large static size.
DMD assumes this anyway for its invariant -> object conversion routine, if I remember correctly, but the max size there is still pretty considerable. Sean
Oct 29 2008
prev sibling parent reply "Jarrett Billingsley" <jarrett.billingsley gmail.com> writes:
On Wed, Oct 29, 2008 at 7:15 PM, Frits van Bommel
<fvbommel remwovexcapss.nl> wrote:
 Jarrett Billingsley wrote:
 Interestingly, although null dereferences are unsafe, in a safe
 language like SafeD it's not actually possible to do so.  There are no
 pointers and arrays are bounds-checked.  So with the combination of
 the typing system and the runtime checks, null can never actually be
 dereferenced, so no special consideration has to be given to it.
Assuming it still allows heap-allocated objects, something like this will still work:
----
class C {
    ubyte[16 * 1024 * 1024 - 1] memory;
}

void poke(size_t intptr, ubyte b) {
    C c; // kept at null deliberately
    c.memory[intptr - c.memory.offsetof] = b;
}

ubyte peek(size_t intptr) {
    C c; // kept at null deliberately
    return c.memory[intptr - c.memory.offsetof];
}
----
(That is, unless it emits 'this' null-checks for object field accesses as well)
I kind of imagined it would. I thought the entire point of SafeD would be that the language would completely disallow you from touching memory that you don't own. Which would include Java-like null reference checks.
Oct 29 2008
parent Walter Bright <newshound1 digitalmars.com> writes:
Jarrett Billingsley wrote:
 I kind of imagined it would.  I thought the entire point of SafeD
 would be that the language would completely disallow you from touching
 memory that you don't own.  Which would include Java-like null
 reference checks.
Fortunately, the hardware does the checking for null pointers for you.
Oct 29 2008
prev sibling parent Brad Roberts <braddr puremagic.com> writes:
On Wed, 29 Oct 2008, Jarrett Billingsley wrote:

 On Wed, Oct 29, 2008 at 5:43 PM, Brad Roberts <braddr puremagic.com> wrote:
 On Wed, 29 Oct 2008, Walter Bright wrote:

 bearophile wrote:
 D1 is often safer than C and C++, but regarding safety there are
 several things that can still be improved, often with no/little
 performance penalty (unsafe casting (automatic and manual), integral
 overflows, GC pointers Vs other pointers, nonnull types, named
 arguments, fallthrough switch cases, multi var assigns syntax
 missing, octal literals, bitwise operators as symbols instead of
 English words, and many other smaller things I have listed in the
 past). You may want to tell them about the idea of SafeD too (Java-esque D).
"Safety" in programming languages does not refer to program correctness, but absence of bugs that could result in memory corruption. The agenda of SafeD is to find the subset of D that can guarantee there is no memory corruption. Null pointer dereferencing, for example, is a program bug but is not a safety issue because it cannot cause memory corruption.
Actually, that's not true. Dereferencing null _can_ corrupt memory. As you well know, ptr[index] is just ptr + index. Use a large and accurate enough index and you're out of that first page of memory and back into application memory space. Find the address of a key stack variable and you've got room for all sorts of fun and mayhem.

These are the sorts of bugs that, in popular enough applications, end up costing companies lots of money in emergency fixes. One of the recent Flash exploits was exactly this type of bug.

Later, Brad
Interestingly, although null dereferences are unsafe, in a safe language like SafeD it's not actually possible to do so. There are no pointers and arrays are bounds-checked. So with the combination of the typing system and the runtime checks, null can never actually be dereferenced, so no special consideration has to be given to it.
Unless something's changed, pointers aren't even a part of SafeD, so that line of reasoning is largely irrelevant.

I was also talking about the larger context of the thread, which includes what 'safety' means in the context of programming in general. Since this thread included C and C++, it's a legit concern. If we're going to include the broader D language, the bounds checking is only on for some builds, not all builds, so the problem still exists there too.

Later,
Brad
Oct 29 2008
prev sibling next sibling parent reply Paul D. Anderson <paul.d.removethis.anderson comcast.andthis.net> writes:
Walter Bright Wrote:

 Paul D. Anderson wrote:
 This is why I'm trying to get D noticed here at Boeing. It's a very
 good fit for the things we do -- safe (memory safe and physically
 safe), efficient, powerful without being too complex, and a capable
 systems language. We still do a lot of bit-twiddling programming and
 need to be able to get to the hardware.
I used to work at Boeing (757 stab trim design), so I have a particular interest in how this works out. Please keep us up to date! If I can help, let me know. I'm also available to give a presentation at Boeing if invited <g>.
I will certainly keep you posted. This particular software conference is a Boeing-only, behind-closed-doors thing, otherwise I would have recommended that you give the presentation, not me. I'll see if I can find a venue for you here. Unfortunately I'm a very small cog in a big machine, but I'll give it a shot. Paul
Oct 30 2008
parent Walter Bright <newshound1 digitalmars.com> writes:
Paul D. Anderson wrote:
 Walter Bright Wrote:
 I'm also available to give a presentation at Boeing if invited <g>.
 
I will certainly keep you posted. This particular software conference is a Boeing-only, behind-closed-doors thing, otherwise I would have recommended that you give the presentation, not me. I'll see if I can find a venue for you here. Unfortunately I'm a very small cog in a big machine,
I remember what that's like!
 but I'll give it a shot.
Thanks. No sweat if it doesn't work out.
Oct 30 2008
prev sibling next sibling parent reply ore-sama <spam here.lot> writes:
Walter Bright Wrote:

 Janderson wrote:
 - More restrictive casting
what's this? From my point of view C++ has more restrictive casting.
 - Warnings on implicit casts that lose bits
O_O
Oct 30 2008
parent reply Janderson <ask me.com> writes:
ore-sama wrote:
 Walter Bright Wrote:
 
 Janderson wrote:
 - More restrictive casting
what's this? From my point of view C++ has more restrictive casting.
Well, C++ allows C casts, for instance. Also, in C++ you can static_cast something that should be a dynamic_cast. Also, in D you can't easily cast an array to a pointer (although you can go array.ptr). There are fewer implicit casts allowed (particularly for unsigned/signed). See: http://www.digitalmars.com/d/2.0/type.html

I think there are a couple of others but I can't remember them at the moment.
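The array one and the lossy-narrowing case look roughly like this (a minimal sketch):

void main()
{
    int[] arr = new int[3];

    // int* p = arr;      // rejected: a D slice doesn't decay to a pointer
    int* p = arr.ptr;     // has to be spelled out

    long big = 70_000;
    // int n = big;       // rejected: implicit conversion could lose bits
    int n = cast(int)big;
}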
 
 - Warnings on implicit casts that lose bits
O_O
Oct 30 2008
parent ore-sama <spam here.lot> writes:
Janderson Wrote:

 I think there are a couple of others but I can't remember them at the 
 moment.
I can remind: reinterpret_cast is explicit, type casts preserve constness, unsuccessful downcast to reference throws.
Oct 31 2008
prev sibling next sibling parent reply "Tony" <tonytech08 gmail.com> writes:
Let me be facetious with Janderson's list plz...

"Janderson" <ask me.com> wrote in message 
news:ge8tpd$1f6b$1 digitalmars.com...
 Hi,

 I was talking with some colleagues at work and they asked me how D enforces 
 good programming practices.  Of course I mentioned a couple of the ones 
 I knew off hand -

 - Unit checking
Not sure what is meant by this, but it sounds minor.
 - Design by contract
Overblown concept, but can be done with C++ also to a more than adequate degree (heard of assertions?).
 - Invariant checks
Part of DbC concepts. See Koenig and Moo's array example in "Accelerated C++". Which, btw, leads me to believe that there are few instances "where the stars line up just right" for invariant checking to be useful.
 - Stronger const
Insignificant. I still use many #defines just because I know that const vars take space and #defines are a pre-compile-time thing (yes, I value the preprocessor for some uses, this being one of them).
 - Modules
If that means doing away with header files, I don't think I like it. I rely on headers as the engineer's blueprint (of course you have to write very clean code to have that make sense).
 - Garbage collection
That's a major deal breaker for me.
 - No automatic copy constructor
Can't comment.
 - More restrictive operators
I'm not really concerned about that. I'd avoid them unless doing numerical programming.
 - Delegates (Specifically, encouraging their use by making them simple to 
 use)
Can't comment.
 - Specific constructs such as Interfaces
C++ has interfaces. Should it be a keyword? Maybe. How are D's interfaces different from C++'s?
 - More restrictive casting
Ouch!! I prefer to slice bread with a knife rather than having a machine do it. (Bad analogy, but y'all get the point).
 - No C style Macros
Implementing a template or template system with a good preprocessor is something completely different than macros. I value the preprocessor for such uses (I wish it was more powerful than in C++ though). Tony
Nov 03 2008
next sibling parent reply bearophile <bearophileHUGS lycos.com> writes:
Tony Wrote:
 Let me be facetious with Janderson's list plz...
Do you want a list of serious answers to your comments? (generally from your comments I'd say that D is the wrong language for you, and you want C or maybe C++).

Bye,
bearophile
Nov 04 2008
next sibling parent reply Derek Parnell <derek psych.ward> writes:
On Tue, 04 Nov 2008 05:34:13 -0500, bearophile wrote:

 Tony Wrote:
 Let me be facetious with Janderson's list plz...
 Do you want a list of serious answers to your comments? (generally from your comments I'd say that D is the wrong language for you, and you want C or maybe C++).
LOL ... That is almost exactly what I was going to write too.

Tony, it seems that the differences between C++ and D are not significant for you, so you may as well keep clear of D for now.

-- 
Derek Parnell
Melbourne, Australia
skype: derek.j.parnell
Nov 04 2008
next sibling parent reply Janderson <ask me.com> writes:
Derek Parnell wrote:
 On Tue, 04 Nov 2008 05:34:13 -0500, bearophile wrote:
 
 Tony Wrote:
 Let me be facetious with Janderson's list plz...
 Do you want a list of serious answers to your comments? (generally from your comments I'd say that D is the wrong language for you, and you want C or maybe C++).
 LOL ... That is almost exactly what I was going to write too. Tony, it seems that the differences between C++ and D are not significant for you, so you may as well keep clear of D for now.
Maybe Tony will come back when he's discovered how imperfect C++ really is. Reminds me of a good C++ book, what was it called... :)

-Joel
Nov 04 2008
parent "Tony" <tonytech08 gmail.com> writes:
"Janderson" <ask me.com> wrote in message 
news:gept45$24cs$1 digitalmars.com...
 Derek Parnell wrote:
 On Tue, 04 Nov 2008 05:34:13 -0500, bearophile wrote:

 Tony Wrote:
 Let me be facetious with Janderson's list plz...
 Do you want a list of serious answers to your comments? (generally from your comments I'd say that D is the wrong language for you, and you want C or maybe C++).
 LOL ... That is almost exactly what I was going to write too. Tony, it seems that the differences between C++ and D are not significant for you, so you may as well keep clear of D for now.
 Maybe Tony will come back when he's discovered how imperfect C++ really is.
But I thought I already made it clear that C++ is not my ideal of a language. Not that I can't use it or that I don't, but that I'd love to have the capability to evolve my own language. Tony
Nov 04 2008
prev sibling parent "Tony" <tonytech08 gmail.com> writes:
"Derek Parnell" <derek psych.ward> wrote in message 
news:wzxbs6vmtu21.belg3400a3iw.dlg 40tude.net...
 On Tue, 04 Nov 2008 05:34:13 -0500, bearophile wrote:

 Tony Wrote:
 Let me be facetious with Janderson's list plz...
 Do you want a list of serious answers to your comments? (generally from your comments I'd say that D is the wrong language for you, and you want C or maybe C++).
 LOL ... That is almost exactly what I was going to write too. Tony, it seems that the differences between C++ and D are not significant for you, so you may as well keep clear of D for now.
I know. There are good aspects to each though. I know too much about C++ to move to D and relearn stuff. That and the fact that I'm not seeking a solution to memory management, for example.

I sure wish that language implementation wasn't so damn hard! (The solution starts at the language definition though, for sure.)

Tony
Nov 04 2008
prev sibling parent "Tony" <tonytech08 gmail.com> writes:
"bearophile" <bearophileHUGS lycos.com> wrote in message 
news:gep8f5$htn$1 digitalmars.com...
 Tony Wrote:
 Let me be facetious with Janderson's list plz...
 Do you want a list of serious answers to your comments?
I don't remember asking any questions, but maybe I did.
 (generally from your comments I'd say that D is the wrong language for 
 you, and you want C or maybe C++).
I'm using C++ now and it works for me, but I'd sure like to strip out a ton and a half of "features" that I won't ever use again. That and evolve the object model a bit. I have a feeling that D's template system is more of what I'd opt for than C++'s. C++ is just too damn big. D is too much policy for my liking (yes, GC again and the object model). Tony
Nov 04 2008
prev sibling next sibling parent reply "Denis Koroskin" <2korden gmail.com> writes:
On Tue, 04 Nov 2008 07:13:24 +0300, Tony <tonytech08 gmail.com> wrote:

 Let me be facetious with Janderson's list plz...

 "Janderson" <ask me.com> wrote in message
 news:ge8tpd$1f6b$1 digitalmars.com...
 Hi,

 I was talking with some colleagues at work and they asked me how D 
 enforces good programming practices.  Of course I mentioned a couple of 
 the ones I knew off hand -

 - Unit checking
Not sure what is meant by this, but it sounds minor.
It is a recommended practice to supply tests with your source code. These tests are put into a special section and run at program startup (activated by a compiler switch). In D, code is not considered reliable unless it has a comprehensive set of tests.
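A minimal sketch of what that looks like (the function name is made up; build with dmd -unittest to run the tests):

int square(int x)
{
    return x * x;
}

unittest
{
    // lives next to the code it tests; runs before main() when the
    // -unittest switch is used
    assert(square(3) == 9);
    assert(square(-4) == 16);
}

void main() { }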
 - Design by contract
Overblown concept, but can be done with C++ also to a more than adequate degree (heard of assertions?).
No, it isn't. Asserts are here too (at a language level) and this is another plus:

int foo() {
    if (condition) {
        return 42;
    }

    assert(false); // C++ generates a "function doesn't return value" warning
}

int result = foo();
assert(result == Success); // C++ generates a "variable is unused" warning
 - Invariant checks
Part of DbC concepts. See Koenig and Moo's array example in "Accelerated C++". Which, btw, leads me to believe that there are few instances "where the stars line up just right" for invariant checking to be useful.
 - Stronger const
Insignificant. I still use many #defines just because I know that const vars take space and #defines are a pre-compile-time thing (yes, I value the preprocessor for some uses, this being one of them).
I believe you didn't understand what "Stronger const" means. Const is transitive in D and it is not in C++. In C++, even if you pass a const object to some function, you can't be sure that it won't be changed:

struct Child;
struct Parent {
    Child* child;
};
struct Child {
    Parent* parent;
    int data;
};

void foo(const Parent* parent) {
    Child* child = parent->child;
    child->data = 42;
    child->parent->child = 0; // const object is changed
}

Using macros for const objects is a bad idea, too; you should use enums instead. Too bad they are limited to numerical values in C++. In D you can use whatever you want:

enum Constants {
    Text = "Hello, World",
    Result = 42,
    Pi = 3.1415926,
}

etc.
 - Modules
If that means doing away with header files, I don't think I like it. I rely on headers as the engineer's blueprint (of course you have to write very clean code to have that make sense).
You missed that one, again. You can have headers in D, too.
 - Garbage collection
That's a major deal breaker for me.
You can turn it off and do the memory management by yourself, if you wish:

struct Foo {}

Foo* foo = new Foo();
delete foo;
 - No automatic copy constructor
Can't comment.
 - More restrictive operators
I'm not really concerned about that. I'd avoid them unless doing numerical programming.
 - Delegates (Specifically, encouraging their use by making them simple 
 to
 use)
Can't comment.
This is an awesome feature, too bad you don't have this one in C++. You will like it, believe me!
 - Specific constructs such as Interfaces
C++ has interfaces. Should it be a keyword? Maybe. How are D's interfaces different from C++'s?
C++ doesn't have interfaces, but you can emulate them (pretty badly) by inheriting virtually from structs.
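In D it's a first-class construct; a minimal sketch (names made up):

interface Drawable
{
    void draw();
}

class Square : Drawable
{
    void draw() { }
}

void render(Drawable d)
{
    d.draw();
}

void main()
{
    render(new Square);
}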
 - More restrictive casting
Ouch!! I prefer to slice bread with a knife rather than having a machine do it. (Bad analogy, but y'all get the point).
No, I don't get your point. Do you mean that you don't use C++ casts either (dynamic cast, int->float casts etc)? Why comment, then?
 - No C style Macros
Implementing a template or template system with a good preprocessor is something completely different than macros. I value the preprocessor for such uses (I wish it was more powerful than in C++ though).
You don't need a preprocessor in most cases, but C++ lacks an alternative. You may want to read this paper, too: http://www.digitalmars.com/d/sdwest/paper.html
Nov 04 2008
next sibling parent Christopher Wright <dhasenan gmail.com> writes:
Denis Koroskin wrote:
 On Tue, 04 Nov 2008 07:13:24 +0300, Tony <tonytech08 gmail.com> wrote:
 
 Let me be facetious with Janderson's list plz...

 "Janderson" <ask me.com> wrote in message
 - More restrictive casting
Ouch!! I prefer to slice bread with a knife rather than having a machine do it. (Bad analogy, but y'all get the point).
No, I don't get your point. Do you mean that you don't use C++ casts either (dynamic cast, int->float casts etc)? Why comment, then?
I think it's more to do with casts that the compiler wouldn't consider possible (like integer to a struct {int}) but the programmer knows is safe.

I would generally prefer to avoid casts for this. It's more future proof and safer:

struct Foo
{
    int value;

    static Foo opCall(int v)
    in
    {
        assert (v < 5_000_000); // db can't handle more than that
    }
    body
    {
        ...
    }
}

I grant that it's slower, but I really don't care about efficiency except in extreme cases (4MB executable for a simple example, for instance).
Nov 04 2008
prev sibling parent reply "Tony" <tonytech08 gmail.com> writes:
"Denis Koroskin" <2korden gmail.com> wrote in message 
news:op.uj3j16t7o7cclz worker...
 On Tue, 04 Nov 2008 07:13:24 +0300, Tony <tonytech08 gmail.com> wrote:

 Let me be facetious with Janderson's list plz...

 "Janderson" <ask me.com> wrote in message
 news:ge8tpd$1f6b$1 digitalmars.com...
 Hi,

 I was talking with some colleagues at work and they asked me how D 
 enforces good programming practices.  Of course I mentioned a couple of 
 the ones I knew off hand -

 - Unit checking
Not sure what is meant by this, but it sounds minor.
It is a recommended practice to supply tests with your source code. These tests are put into a special section and run at program startup (activated by a compiler switch). In D, code is not considered reliable unless it has a comprehensive set of tests.
Yes, unit testing is necessary. I see no need to make it a language feature though.
 - Design by contract
Overblown concept, but can be done with C++ also to a more than adequate degree (heard of assertions?).
No, it isn't.
What I meant was that formalism is not necessary for the concepts. Eiffel wasn't the creator of DbC techniques. Do precondition checking? Sure. In the rare instances that an invariant can be had, use it? Sure.
 Asserts are here too (at a language level) and this is another plus:

 int foo() {
     if (condition) {
         return 42;
     }

     assert(false); // C++ generates a "function doesn't return value" 
 warning
 }

 int result = foo();
 assert(result == Success); // C++ generates a "variable is unused" warning

 - Invariant checks
Part of DbC concepts. See Koenig and Moo's array example in "Accelerated C++". Which, btw, leads me to believe that there are few instances "where the stars line up just right" for invariant checking to be useful.
 - Stronger const
Insignificant. I still use many #defines just because I know that const vars take space and #defines are a pre-compile-time thing (yes, I value the preprocessor for some uses, this being one of them).
 I believe you didn't understand what "Stronger const" means. Const is transitive in D and it is not in C++. In C++, even if you pass a const object to some function, you can't be sure that it won't be changed:
Ah, OK, I get it. Yeah, that is good. But it is a minor point compared to, say, memory management, the OO model, and generic programming facilities, for example. The BIG features are what I am concerned about.
 - Modules
If that means doing away with header files, I don't think I like it. I rely on headers as the engineer's blueprint (of course you have to write very clean code to have that make sense).
You missed that one, again. You can have headers in D, too.
Yes, I don't really know "modules" other than what they obviously imply: separate compilation units/namespaces?
 - Garbage collection
That's a major deal breaker for me.
You can turn it off and do the memory management by yourself, if you wish:

struct Foo {}

Foo* foo = new Foo();
delete foo;
And if I don't want to use new and delete?
 - Delegates (Specifically, encouraging their use by making them simple 
 to
 use)
Can't comment.
This is an awesome feature, too bad you don't have this one in C++. You will like it, believe me!
Well what the heck is it?!
 - Specific constructs such as Interfaces
C++ has interfaces. Should it be a keyword? Maybe. How are D's interfaces different from C++'s?
 C++ doesn't have interfaces, but you can emulate them (pretty badly) by inheriting virtually from structs.
Pure virtual abstract base classes are interfaces in C++. They work fine for me and I use them.
 - More restrictive casting
Ouch!! I prefer to slice bread with a knife rather than having a machine do it. (Bad analogy, but y'all get the point).
No, I don't get your point. Do you mean that you don't use C++ casts either (dynamic cast, int->float casts etc)?
I haven't found the need for them, so no.
 Why comment, then?
Because if I can't use C-style casts, it would be a bummer.
 - No C style Macros
Implementing a template or template system with a good preprocessor is something completely different than macros. I value the preprocessor for such uses (I wish it was more powerful than in C++ though).
You don't need a preprocessor in most cases, but C++ lacks an alternative.
I find the preprocessor useful as a template implementation machine.
 You may want to read this paper, too: 
 http://www.digitalmars.com/d/sdwest/paper.html
OK. Not right away though. I have to rework more of my C++ framework. Tony
Nov 04 2008
parent reply Robert Fraser <fraserofthenight gmail.com> writes:
Tony wrote:
 - Modules
If that means doing away with header files, I don't think I like it. I rely on headers as the engineer's blueprint (of course you have to write very clean code to have that make sense).
You missed that one, again. You can have headers in D, too.
Yes, I don't really know "modules" other than what they obviously imply: separate compilation units/namespaces?
It means you don't have to specify everything 2 times if you don't want to (you can if you feel the need). So if you change a function's signature, you just friggin' change it, you don't change it at the function and in the header.

This also makes compilation times much faster, since the header files don't need to be included & recompiled during every compilation -- if you're compiling a group of files that depend on one another at once, the compiler will only compile each one once.

And, yes, better/automated namespacing.
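A rough sketch of the shape of it (two hypothetical files; compile them together with dmd app.d mathutil.d):

// mathutil.d
module mathutil;

int scale(int x) { return x * 2; }   // change the signature here, and only here

// app.d
module app;
import mathutil;                     // no separate header to keep in sync

void main()
{
    auto y = scale(21);
}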
 - Garbage collection
That's a major deal breaker for me.
You can turn it off and do the memory management by yourself, if you wish:

struct Foo {}

Foo* foo = new Foo();
delete foo;
And if I don't want to use new and delete?
Then you use the GC. Or malloc/free. Or placement new. Or... seriously, what do you want here?
 - Delegates (Specifically, encouraging their use by making them simple 
 to
 use)
Can't comment.
This is an awesome feature, too bad you don't have this one in C++. You will like it, believe me!
Well what the heck is it?!
It's a function pointer with context. So you can point to a member function, for example. In D2, there's also closures so...

import std.stdio;

void main(string[] args)
{
    auto dg = getADelegate();
    writefln("%d", dg());
    writefln("%d", dg());
    writefln("%d", dg());
}

int delegate() getADelegate()
{
    int i = 0;
    return delegate int() { return ++i; };
}

Will yield:
1
2
3
Nov 04 2008
parent reply "Tony" <tonytech08 gmail.com> writes:
"Robert Fraser" <fraserofthenight gmail.com> wrote in message 
news:gergmk$107u$1 digitalmars.com...
 Tony wrote:
 - Modules
If that means doing away with header files, I don't think I like it. I rely on headers as the engineer's blueprint (of course you have to write very clean code to have that make sense).
You missed that one, again. You can have headers in D, too.
Yes, I don't really know "modules" other than what they obviously imply: separate compilation units/namespaces?
It means you don't have to specify everything 2 times if you don't want to (you can if you feel the need). So if you change a function's signature, you just friggin' change it, you don't change it at the function and in the header.
I like the header as the high level view of the code, be it classes or functions or whatever.
 This also makes compilation times much faster, since the header files 
 don't need to be included & recompiled during every compilation -- if 
 you're compiling a group of files that depend on one another at once, the 
 compiler will only compile each one once.
Compile times aren't a concern, as I am not doing large scale development. And with so much processor power available these days, I really don't see a problem with compile times. Some care in laying out code and headers goes a long way.
 And, yes, better/automated namespacing.
If I was using other people's code or language standard library code, namespaces could be mildly convenient. As it is though, I'm not doing either of those things.
 - Garbage collection
That's a major deal breaker for me.
You can turn it off and do the memory management by yourself, if you wish:

struct Foo {}

Foo* foo = new Foo();
delete foo;
And if I don't want to use new and delete?
Then you use the GC. Or malloc/free. Or placement new. Or... seriously, what do you want here?
OK. I was just wondering how C++-like those things were in D. Apparently pretty much the same, if not exactly so even.
 - Delegates (Specifically, encouraging their use by making them simple 
 to
 use)
Can't comment.
This is an awesome feature, too bad you don't have this one in C++. You will like it, believe me!
Well what the heck is it?!
It's a function pointer with context. So you can point to a member function, for example. In D2, there's also closures so...

import std.stdio;

void main(string[] args)
{
    auto dg = getADelegate();
    writefln("%d", dg());
    writefln("%d", dg());
    writefln("%d", dg());
}

int delegate() getADelegate()
{
    int i = 0;
    return delegate int() { return ++i; };
}

Will yield:
1
2
3
In C++:

int GetAnInt()
{
    static int i = 0;
    return ++i;
}

So what kind of major programming problems do delegates and closures solve? Are they just syntactic sugar?

Tony
Nov 06 2008
next sibling parent reply "Jarrett Billingsley" <jarrett.billingsley gmail.com> writes:
On Thu, Nov 6, 2008 at 4:18 PM, Tony <tonytech08 gmail.com> wrote:
 I like the header as the high level view of the code, be it classes or
 functions or whatever.
Documentation works quite nicely for that too. Or an editor that's smart enough to collapse function/class bodies or to give you a list of declarations.
 This also makes compilation times much faster, since the header files
 don't need to be included & recompiled during every compilation -- if
 you're compiling a group of files that depend on one another at once, the
 compiler will only compile each one once.
Compile times as I am not doing large scale development. And with so much processor power available these days, I really don't see a problem with compile times. Some care in laying out code and headers goes a long way.
Boost.
 Then you use the GC. Or malloc/free. Or placement new. Or... seriously,
 what do you want here?
OK. I was just wondering how C++ like those things were in D. Apparently pretty much the same, if not exactly so even.
Yeah. D uses GC by default, but there's nothing stopping you from turning it off, or compiling your app with a stub GC that doesn't actually do GC, or using malloc/free, or whatever floats your boat. It has custom allocators/deallocators just like C++ as well.
 In C++:

 int GetAnInt()
 {
    static int i = 0;
    return ++i;
 }

 So what kind of major programming problems do delegates and closures solve?
 Are they just syntactic sugar?
No, not at all.  Your example works in this case, but the point of a closure is that it is almost like an object - it is allocated on the heap, and has its own state.  So while your example will indeed return a sequence of integers upon successive calls, what the closure version does that your code can't do is have _multiple_ counters.  You call getADelegate() multiple times, and each delegate will have _its own state_, meaning that each delegate is a different object, and will generate its own sequence of numbers upon successive calls.  Your function only has a single global state variable and cannot do the same.

Delegates are kind of similar to the new [](){} function literals that are coming out in C++0x, except more automatic and, from what I understand, more powerful.  For example, and correct me if I'm wrong, C++0x's function literals cannot directly access the stack frame of the enclosing function; you have to "capture" them inside the square brackets.  D's nested functions (which are delegates) are allowed to access and modify their enclosing function's stack frame.  This makes writing things like callbacks _so_ much less painful.

Lastly delegates don't have to be just free functions; they also act as a stand-in for pointers-to-member-functions.  If you do something like

auto obj = new Class();
auto dg = &obj.someMethod;

dg is a delegate that points to 'obj' and 'someMethod'.  When you call dg(), it's the same as doing obj.someMethod().
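To make the "multiple counters" point concrete, a small sketch (D2 closures; names made up):

import std.stdio;

int delegate() makeCounter()
{
    int i = 0;                              // hoisted to the heap by the closure
    return delegate int() { return ++i; };
}

void main()
{
    auto a = makeCounter();
    auto b = makeCounter();
    a();                  // a's count: 1
    a();                  // a's count: 2
    writefln("%d", a());  // prints 3
    writefln("%d", b());  // prints 1 -- b's counter is independent of a's
}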
Nov 06 2008
parent "Tony" <tonytech08 gmail.com> writes:
"Jarrett Billingsley" <jarrett.billingsley gmail.com> wrote in message 
news:mailman.354.1226008916.3087.digitalmars-d puremagic.com...
 On Thu, Nov 6, 2008 at 4:18 PM, Tony <tonytech08 gmail.com> wrote:
 I like the header as the high level view of the code, be it classes or
 functions or whatever.
Documentation works quite nicely for that too. Or an editor that's smart enough to collapse function/class bodies or to give you a list of declarations.
A header file IS documentation more often than formal documentation. That said, I have produced structural documentation for my own code with doxygen, but mostly I don't need that unless I get away from working with the code for a long period of time or I just want to see the diagrams. More times, the design and architecture documentation that I have in the headers for now is much more useful. I was building an outlining code editor until MS VS 7 came out. I like that IDE a lot now that I have my preferences set in it. Out of the box settings are all about marketing MS technologies that I don't use (.net etc).
 This also makes compilation times much faster, since the header files
 don't need to be included & recompiled during every compilation -- if
 you're compiling a group of files that depend on one another at once, 
 the
 compiler will only compile each one once.
Compile times as I am not doing large scale development. And with so much processor power available these days, I really don't see a problem with compile times. Some care in laying out code and headers goes a long way.
Boost.
What about it?
 Then you use the GC. Or malloc/free. Or placement new. Or... seriously,
 what do you want here?
OK. I was just wondering how C++ like those things were in D. Apparently pretty much the same, if not exactly so even.
Yeah. D uses GC by default, but there's nothing stopping you from turning it off, or compiling your app with a stub GC that doesn't actually do GC, or using malloc/free, or whatever floats your boat. It has custom allocators/deallocators just like C++ as well.
 In C++:

 int GetAnInt()
 {
    static int i = 0;
    return ++i;
 }

 So what kind of major programming problems do delegates and closures 
 solve?
 Are they just syntactic sugar?
No, not at all. Your example works in this case, but the point of a closure is that it is almost like an object - it is allocated on the heap, and has its own state. So while your example will indeed return a sequence of integers upon successive calls, what the closure version does that your code can't do is have _multiple_ counters. You call getADelegate() multiple times, and each delegate will have _its own state_, meaning that each delegate is a different object, and will generate its own sequence of numbers upon successive calls. Your function only has a single global state variable and cannot do the same.
That's what I saw also. Thx for confirming. I guess I'd have to actually have that available to me to get my mind to consider it as I was coding. Would I put it in my language? I dunno.

class GetAnIntClass
{
    int X;
  public:
    GetAnIntClass(){ X = 0; }
    int GetAnInt(){ return ++X; }
};
 Delegates are kind of similar to the new [](){} function literals that
 are coming out in C++0x, except more automatic and, from what I
 understand, more powerful.  For example, and correct me if I'm wrong,
 C++0x's function literals cannot directly access the stack frame of
 the enclosing function; you have to "capture" them inside the square
 brackets.  D's nested functions (which are delegates) are allowed to
 access and modify their enclosing function's stack frame.  This makes
 writing things like callbacks _so_ much less painful.

 Lastly delegates don't have to be just free functions; they also act
 as a stand-in for pointers-to-member-functions.  If you do something
 like

 auto obj = new Class();
 auto dg = &obj.someMethod;

 dg is a delegate that points to 'obj' and 'someMethod'.  When you call
 dg(), it's the same as doing obj.someMethod(). 
Nov 06 2008
prev sibling parent reply "Jarrett Billingsley" <jarrett.billingsley gmail.com> writes:
On Thu, Nov 6, 2008 at 5:01 PM, Jarrett Billingsley
<jarrett.billingsley gmail.com> wrote:
 On Thu, Nov 6, 2008 at 4:18 PM, Tony <tonytech08 gmail.com> wrote:
 Compile times as I am not doing large scale development. And with so much
 processor power available these days, I really don't see a problem with
 compile times. Some care in laying out code and headers goes a long way.
Boost.
I did also want to make another point about this. Processors are not getting that much faster; it's not 2001 anymore. We've pretty much hit the wall for single-core performance, and compilation is an extremely difficult problem to parallelize. Yes, you can compile multiple files at once, but if a single compilation unit takes 20 minutes to compile, it doesn't matter how many cores you've got. The shortest compile time you can get is 20 minutes. Even if your compiles are "only" three minutes, if you spend three minutes compiling, followed by ten minutes of testing, that means you're spending almost a quarter of your time compiling. That doesn't seem like an efficient use of time.
Nov 06 2008
next sibling parent reply "Tony" <tonytech08 gmail.com> writes:
"Jarrett Billingsley" <jarrett.billingsley gmail.com> wrote in message 
news:mailman.356.1226010025.3087.digitalmars-d puremagic.com...
 On Thu, Nov 6, 2008 at 5:01 PM, Jarrett Billingsley
 <jarrett.billingsley gmail.com> wrote:
 On Thu, Nov 6, 2008 at 4:18 PM, Tony <tonytech08 gmail.com> wrote:
 Compile times as I am not doing large scale development. And with so 
 much
 processor power available these days, I really don't see a problem with
 compile times. Some care in laying out code and headers goes a long way.
Boost.
I did also want to make another point about this. Processors are not getting that much faster; it's not 2001 anymore. We've pretty much hit the wall for single-core performance, and compilation is an extremely difficult problem to parallelize. Yes, you can compile multiple files at once, but if a single compilation unit takes 20 minutes to compile, it doesn't matter how many cores you've got. The shortest compile time you can get is 20 minutes. Even if your compiles are "only" three minutes, if you spend three minutes compiling, followed by ten minutes of testing, that means you're spending almost a quarter of your time compiling. That doesn't seem like an efficient use of time.
I have designed/coded for months without ever pressing the compile button. Some programmers use the compiler in a very iterative way: using it as a debugger is bad practice. Tony
Nov 06 2008
next sibling parent mgen <bmeck stedwards.edu> writes:
I would argue that using it as a debugger for dependency reasons is bad practice, but using it to test smaller modules is good practice. On a side note, it's a waste not to have the compiler running if you are not using up resources doing something else while coding (then it's like a free check on yourself that takes no time if you glance at it when it is done). That way I get the info for a minimal price, and sometimes that info is very useful; after writing 1000 lines of code I would be worried about debugging it if it was not segmented out in steps as I coded it.

Tony Wrote:

 
 "Jarrett Billingsley" <jarrett.billingsley gmail.com> wrote in message 
 news:mailman.356.1226010025.3087.digitalmars-d puremagic.com...
 On Thu, Nov 6, 2008 at 5:01 PM, Jarrett Billingsley
 <jarrett.billingsley gmail.com> wrote:
 On Thu, Nov 6, 2008 at 4:18 PM, Tony <tonytech08 gmail.com> wrote:
 Compile times as I am not doing large scale development. And with so 
 much
 processor power available these days, I really don't see a problem with
 compile times. Some care in laying out code and headers goes a long way.
Boost.
I did also want to make another point about this. Processors are not getting that much faster; it's not 2001 anymore. We've pretty much hit the wall for single-core performance, and compilation is an extremely difficult problem to parallelize. Yes, you can compile multiple files at once, but if a single compilation unit takes 20 minutes to compile, it doesn't matter how many cores you've got. The shortest compile time you can get is 20 minutes. Even if your compiles are "only" three minutes, if you spend three minutes compiling, followed by ten minutes of testing, that means you're spending almost a quarter of your time compiling. That doesn't seem like an efficient use of time.
I have designed/coded for months without ever pressing the compile button. Some programmers use the compiler in a very iterative way: using it as a debugger is bad practice. Tony
Nov 06 2008
prev sibling next sibling parent Janderson <ask me.com> writes:
Tony wrote:
 "Jarrett Billingsley" <jarrett.billingsley gmail.com> wrote in message 
 news:mailman.356.1226010025.3087.digitalmars-d puremagic.com...
 On Thu, Nov 6, 2008 at 5:01 PM, Jarrett Billingsley
 <jarrett.billingsley gmail.com> wrote:
 On Thu, Nov 6, 2008 at 4:18 PM, Tony <tonytech08 gmail.com> wrote:
 Compile times as I am not doing large scale development. And with so 
 much
 processor power available these days, I really don't see a problem with
 compile times. Some care in laying out code and headers goes a long way.
Boost.
I did also want to make another point about this. Processors are not getting that much faster; it's not 2001 anymore. We've pretty much hit the wall for single-core performance, and compilation is an extremely difficult problem to parallelize. Yes, you can compile multiple files at once, but if a single compilation unit takes 20 minutes to compile, it doesn't matter how many cores you've got. The shortest compile time you can get is 20 minutes. Even if your compiles are "only" three minutes, if you spend three minutes compiling, followed by ten minutes of testing, that means you're spending almost a quarter of your time compiling. That doesn't seem like an efficient use of time.
I have designed/coded for months without ever pressing the compile button. Some programmers use the compiler in a very iterative way: using it as a debugger is bad practice. Tony
I think this might be a symptom of working on a small project. On a large project, when you are working with many others, it is very dangerous to hoard code for too long. You really need to check in as frequently as possible (and that means test frequently), otherwise:

1) QA and test systems can't perform regression tests appropriately.
2) Merging becomes a nightmare for everyone.
3) People don't get to see your code frequently, so they can't reuse it and improve it.
4) It becomes difficult to back out individual changes.

(Note there are some good books on this; I forget the names.)
Nov 06 2008
prev sibling next sibling parent superdan <super dan.org> writes:
Tony Wrote:

 
 "Jarrett Billingsley" <jarrett.billingsley gmail.com> wrote in message 
 news:mailman.356.1226010025.3087.digitalmars-d puremagic.com...
 On Thu, Nov 6, 2008 at 5:01 PM, Jarrett Billingsley
 <jarrett.billingsley gmail.com> wrote:
 On Thu, Nov 6, 2008 at 4:18 PM, Tony <tonytech08 gmail.com> wrote:
 Compile times as I am not doing large scale development. And with so 
 much
 processor power available these days, I really don't see a problem with
 compile times. Some care in laying out code and headers goes a long way.
Boost.
I did also want to make another point about this. Processors are not getting that much faster; it's not 2001 anymore. We've pretty much hit the wall for single-core performance, and compilation is an extremely difficult problem to parallelize. Yes, you can compile multiple files at once, but if a single compilation unit takes 20 minutes to compile, it doesn't matter how many cores you've got. The shortest compile time you can get is 20 minutes. Even if your compiles are "only" three minutes, if you spend three minutes compiling, followed by ten minutes of testing, that means you're spending almost a quarter of your time compiling. That doesn't seem like an efficient use of time.
I have designed/coded for months without ever pressing the compile button.
tony could u pretty please with sugar on top: cut. the. fucking. bullshit. crap like this says tony is a self-important teen who knows next to nothin' but has an opinion on everythin'. among other funny comments: on compilation speed or on processor speed or on #define or on. well shit. pretty much everything he comments on. like any teen he thrives thru the attention given. so the more replies he gets the more annoying he comes. but the one on "designing" n coding for months without ever compiling. gotta be involuntary funny post o' the year.
Nov 07 2008
prev sibling parent reply Bruno Medeiros <brunodomedeiros+spam com.gmail> writes:
Tony wrote:
 
 I have designed/coded for months without ever pressing the compile button. 
 Some programmers use the compiler in a very iterative way: using it as a 
 debugger is bad practice.
 
 Tony 
 
 
You are either greatly mis-representing that situation, or that simply is, like superdan said, bullshit. Was the code in C/C++? Was there really no compile information, or did you get syntax or semantic errors from a tool other than the compiler, like the IDE/editor? That "for months" of yours translates to how much actual time? Was it several hours a day (4-8 hours) like in a job, or were you talking more like a hobbyists 30 mins per weekend? -- Bruno Medeiros - Software Developer, MSc. in CS/E graduate http://www.prowiki.org/wiki4d/wiki.cgi?BrunoMedeiros#D
Nov 07 2008
parent reply Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
Bruno Medeiros wrote:
 Tony wrote:
 I have designed/coded for months without ever pressing the compile 
 button. Some programmers use the compiler in a very iterative way: 
 using it as a debugger is bad practice.

 Tony
You are either greatly mis-representing that situation, or that simply is, like superdan said, bullshit. Was the code in C/C++? Was there really no compile information, or did you get syntax or semantic errors from a tool other than the compiler, like the IDE/editor? That "for months" of yours translates to how much actual time? Was it several hours a day (4-8 hours) like in a job, or were you talking more like a hobbyists 30 mins per weekend?
I, too, found that claim quite amusing. I mean, the Usenet was getting boring with the usual suspects: CIA undercover agents, black belt owners, and people who predicted various market tops and bottoms. Writing code for months without compiling - that's original :o). Andrei
Nov 07 2008
next sibling parent reply "Tony" <tonytech08 gmail.com> writes:
"Andrei Alexandrescu" <SeeWebsiteForEmail erdani.org> wrote in message 
news:gf2dor$146g$1 digitalmars.com...
 Bruno Medeiros wrote:
 Tony wrote:
 I have designed/coded for months without ever pressing the compile 
 button. Some programmers use the compiler in a very iterative way: using 
 it as a debugger is bad practice.

 Tony
You are either greatly mis-representing that situation, or that simply is, like superdan said, bullshit. Was the code in C/C++? Was there really no compile information, or did you get syntax or semantic errors from a tool other than the compiler, like the IDE/editor? That "for months" of yours translates to how much actual time? Was it several hours a day (4-8 hours) like in a job, or were you talking more like a hobbyists 30 mins per weekend?
I, too, found that claim quite amusing. I mean, the Usenet was getting boring with the usual suspects: CIA undercover agents, black belt owners, and people who predicted various market tops and bottoms. Writing code for months without compiling - that's original :o).
Deal with it. I am back in a compile/debug mode now after months of reworking and evolving my development framework. Research a little, engineer/architect/design a lot, then implement. Pretty much full time work for me this summer save for the PC support calls I handle. If you wanna be cocky though, so can I: I don't "write" software, I build it. :P Tony
Nov 07 2008
parent reply BCS <ao pathlink.com> writes:
Reply to Tony,

 I don't "write" software, I build it.
 
OK, how much time have you spent writing code, i.e. primarily typing actual code rather than designing what you will write? I can believe going for months without compiling if you are not actually writing code (I have done this myself), but if you are actually writing code for months without compiling then you are extraordinarily good (99.99+ percentile) or rather foolish. (Any system that takes more than a few hours to build is too complex for most people to completely understand with any degree of reliability without actually seeing it operate.)
Nov 10 2008
parent Walter Bright <newshound1 digitalmars.com> writes:
BCS wrote:
 I can believe going for months without compiling if you are not actually 
 writing code (I have done this my self) but if you are actually writing 
 actual code for months without compiling than you are extraordinarily 
 good (99.99+ percentile) or rather foolish. (Any system that takes more 
 than a few hours to build is to complex for most people to completely 
 understand with any degree of reliability without actually seeing it 
 operate)
Even at Boeing they build a mockup before going into production.
Nov 10 2008
prev sibling parent Walter Bright <newshound1 digitalmars.com> writes:
Andrei Alexandrescu wrote:
 I, too, found that claim quite amusing. I mean, the Usenet was getting 
 boring with the usual suspects: CIA undercover agents, black belt 
 owners, and people who predicted various market tops and bottoms. 
 Writing code for months without compiling - that's original :o).
I didn't even do that in the punch card days! (Yes, I am that old.) Back in the 70's when I'd go home on break from college, I'd write Fortran in a spiral notebook, then punch it in to try it out the next semester. I'm not sure there were any months gap, and it was purely because my family couldn't afford a $50,000 computer. I still have that notebook.
Nov 10 2008
prev sibling parent Janderson <ask me.com> writes:
Jarrett Billingsley wrote:
 On Thu, Nov 6, 2008 at 5:01 PM, Jarrett Billingsley
 <jarrett.billingsley gmail.com> wrote:
 On Thu, Nov 6, 2008 at 4:18 PM, Tony <tonytech08 gmail.com> wrote:
 Compile times as I am not doing large scale development. And with so much
 processor power available these days, I really don't see a problem with
 compile times. Some care in laying out code and headers goes a long way.
Boost.
I did also want to make another point about this. Processors are not getting that much faster; it's not 2001 anymore. We've pretty much hit the wall for single-core performance, and compilation is an extremely difficult problem to parallelize. Yes, you can compile multiple files at once, but if a single compilation unit takes 20 minutes to compile, it doesn't matter how many cores you've got. The shortest compile time you can get is 20 minutes. Even if your compiles are "only" three minutes, if you spend three minutes compiling, followed by ten minutes of testing, that means you're spending almost a quarter of your time compiling. That doesn't seem like an efficient use of time.
I agree. Particularly when your code gets to a point where it starts generating its own code. If you have good coding practices you can cut down on header bloat and other things to make iterative programming faster, but at some point good code will start generating code itself (particularly with templates). You end up with more code, but it's also doing a huge amount of things. That's on large scale development. The same can't be said for small scale.
Nov 06 2008
prev sibling parent reply Janderson <ask me.com> writes:
Tony wrote:
 Let me be facetious with Janderson's list plz...
 
 "Janderson" <ask me.com> wrote in message 
 news:ge8tpd$1f6b$1 digitalmars.com...
 Hi,

 I was talking with some collages at work and they asked me how D enforces 
 good programming practices.   For course I mentioned a couple of the ones 
 I knew of hand -

 - Unit checking
Not sure what is meant by this, but it sounds minor.
Sure, C++ can do unit checking, but it's not built in. You have to use macros or templates in something that was not really designed for it. Even if you ignore that, there's a barrier to entry in not having something like this in the language. By having it in the language, good coding practices are encouraged.
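To make that concrete, a minimal sketch of the built-in form -- the test sits right next to the code and runs when you compile with -unittest:

int square(int x) { return x * x; }

unittest
{
    assert(square(3) == 9);
    assert(square(-4) == 16);
}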
 
 - Design by contract
Overblown concept, but can be done with C++ also to a more than adequate degree (heard of assertions?).
Yes, this can be done with C++, but D takes it many steps further.
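A minimal sketch of what "many steps further" looks like -- the preconditions and postconditions are part of the function declaration rather than ad-hoc asserts scattered through the body:

int divide(int a, int b)
in
{
    assert(b != 0, "divisor must not be zero");
}
out (result)
{
    assert(result * b + a % b == a);   // the usual integer-division identity
}
body
{
    return a / b;
}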
 
 - Invariant checks
Part of DbC concepts. See Koenig and Moo's array example in "Accelerated C++". Which, btw, leads me to believe that there are few instances "where the stars line up just right" for invariant checking to be useful.
Invariant checks can be done in C++ but it's very unwieldy. It is very annoying to have to instrument each function with scope guards. D encourages good invariant checking by making it easy.
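To contrast with the C++ scope-guard approach, a minimal sketch of the D form (D2 syntax; D1 writes invariant without the parentheses) -- the check is declared once and is run around every public method call:

class Account
{
    int balance;
    int limit = -500;

    invariant()
    {
        assert(balance >= limit, "account over its limit");
    }

    this(int start) { balance = start; }

    void withdraw(int amount) { balance -= amount; }  // invariant checked on entry and exit
}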
 
 - Stronger const
Insignificant. I still use many #defines just because I know that const vars take space and #defines are a pre-compile-time thing (yes, I value the preprocessor for some uses, this being one of them).
Actually, any good compiler will inline const variables, but I'm not talking about that sort of const. Also, you pay two costs for using #define: 1) it's not typesafe, and 2) it adds to your compile time because the preprocessor has to do more. I'm talking about the const you put in function declarations, which is very important.
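A small sketch of the sort of signature being talked about, using D2's const (the const design was still settling down at the time, so treat the details as approximate) -- the declaration itself promises the argument won't be modified, and the compiler enforces it:

size_t countSpaces(const(char)[] text)
{
    size_t n = 0;
    foreach (c; text)
        if (c == ' ')
            ++n;
    // text[0] = 'x';   // error: cannot modify const data
    return n;
}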
 
 - Modules
If that means doing away with header files, I don't think I like it. I rely on headers as the engineer's blueprint (of course you have to write very clean code to have that make sense).
I think this means you simply haven't run up against any of the problems with header files.
 
 - Garbage collection
That's a major deal breaker for me.
Garbage collection can be turned off in D if you don't need it. However, for me (even as a game programmer, for whom performance is very important) it is something I can deal with.
 
 - No automatic copy constructor
Can't comment.
I'd encourage you to read "C++ Coding Standards: 101 Rules, Guidelines, and Best Practices (C++ In-Depth Series)" by Herb Sutter and Andrei Alexandrescu (http://www.amazon.com/Coding-Standards-Guidelines-Practices-Depth/dp/0321113586), one of the only books that has Bjarne Stroustrup's seal of approval. Effective C++ is another good read.
 
 - More restrictive operators
I'm not really concerned about that. I'd avoid them unless doing numerical programming.
The point here is that in C++ operators were used for all sorts of things that they were not designed for. This makes code hard to follow. D restricts operators, making them more difficult to use for something they are not really designed for, i.e. encouraging better design.
 
 - Delegates (Specifically, encouraging there use by making them simple to 
 use)
Can't comment.
The C++ forms of delegates/functors are horrible to debug and use. Even the ones in Andrei Alexandrescu's Loki, which are nicer, are still not as easy to use as D's. Delegates are a powerful tool and are very useful in decoupling code. They should be encouraged by being easy to use.
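To illustrate the decoupling point with hypothetical names (copyFiles and ProgressBar are made up for the sketch): the worker routine knows nothing about the caller, it just invokes whatever delegate it was handed, and a method bound to an object works as-is:

void copyFiles(int total, void delegate(int percent) onProgress)
{
    for (int done = 1; done <= total; ++done)
    {
        // ... copy one file ...
        onProgress(done * 100 / total);
    }
}

class ProgressBar
{
    void show(int percent) { /* draw it */ }
}

void main()
{
    auto bar = new ProgressBar;
    copyFiles(10, &bar.show);   // member function + object, no functor class needed
}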
 
 - Specific constructs such as Interfaces
C++ has interfaces. Should it be a keyword? Maybe. How are D's interfaces different from C++'s?
C++ has interfaces, but they can easily drift into being something more than an interface (an abstract class carrying implementation). This is not good. By saying something is an interface, you're documenting: this is an interface, don't change me. It's much better when the code can enforce rules rather than just comments. You look at an interface in D and you know it's an interface; in C++ you have to read through the code or hope someone has documented it. I'm not the best explainer in the world, so maybe someone else can explain this better.
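A minimal sketch of the enforcement difference -- the keyword itself stops an interface from quietly growing state the way a C++ "interface class" can:

interface Renderer
{
    void draw();
    // int cachedFrame;   // error: interfaces cannot have fields
}

class GLRenderer : Renderer
{
    void draw() { /* ... */ }   // the implementation lives in the class
}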
 
 - More restrictive casting
Ouch!! I prefer to slice bread with a knife rather than having a machine do it. (Bad analogy, but y'all get the point).
I'd rather the machine catch something at compile time than at runtime.
 
 - No C style Macros
Implementing a template or template system with a good preprocessor is something completely different than macros. I value the preprocessor for such uses (I wish it was more powerful than in C++ though).
Macros in C++ are powerful, yes, but I think they are overused. D has a replacement for the macro system which is more powerful, and most of it is not done in a preprocessor. I've seen more horrible C++ macros than I can count. They are definitely not a good practice. Any good C++ book will talk about "macro side effects": they aren't typesafe, there's a possibility of having an operation performed in a place you don't expect, they don't work well across multiple lines (I hate \ because it's not maintainable and is error prone), and doing things like string operations with them is just non-intuitive.

The other bad thing about macros is that they are extremely difficult to debug.

D provides many of the functionalities of macros in a nicer form:
- Better templates
- Mixins
- Version

Side note: About ~80% of the macros I've seen in C++ could have been done with templates in C++, and they would have been much better in so many ways.
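A couple of tiny sketches of the version and template replacements (string mixins come up again further down the thread):

// C: #ifdef _WIN32 ... #else ... #endif
version (Windows)
    const char pathSep = '\\';
else
    const char pathSep = '/';

// C: #define MAX(a, b) ((a) > (b) ? (a) : (b)) -- with the usual
// double-evaluation trap. The template is typesafe and evaluates each argument once.
T max(T)(T a, T b)
{
    return a > b ? a : b;
}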
 
 Tony 
 
I hope this has been helpful. BTW: How do I put this lightly. I think you'd be able to find some C++ experts that will disagree with a couple of the things I've said but on the whole this is pretty elementary stuff. -Joel
Nov 04 2008
parent reply "Tony" <tonytech08 gmail.com> writes:
"Janderson" <ask me.com> wrote in message 
news:gepsn2$21jr$1 digitalmars.com...
 Tony wrote:
 Let me be facetious with Janderson's list plz...

 "Janderson" <ask me.com> wrote in message 
 news:ge8tpd$1f6b$1 digitalmars.com...
 Hi,

 I was talking with some collages at work and they asked me how D 
 enforces good programming practices.   For course I mentioned a couple 
 of the ones I knew of hand -

 - Unit checking
Not sure what is meant by this, but it sounds minor.
Sure C++ can do unit checking, but its not built in. You have to use macros or templates in something that is not really designed to work correctly with the language. Even if you ignore that there's a barrior to entry by not having something like this in the language. By having it in the language good coding practices are encouraged.
I write unit tests. I don't know why I'd need or want language support for that.
 - Design by contract
Overblown concept, but can be done with C++ also to a more than adequate degree (heard of assertions?).
Yes this can be done with C++ but D takes it it many steps further.
Like I said, I find the techniques more important than some implementation of them (mechanism rather than policy?).
 - Invariant checks
Part of DbC concepts. See Koenig and Moo's array example in "Accelerated C++". Which, btw, leads me to believe that there are few instances "where the stars line up just right" for invariant checking to be useful.
Invariant checks can be done in C++ but its very unweildly. It is very annoying to have to instruct each function with scope guards. D encourages good invariant checking by making it easy.
But again, I am thinking that the classes where invariants can usefully be established are a very small subset.
 - Stronger const
Insignificant. I still use many #defines just because I know that const vars take space and #defines are a pre-compile-time thing (yes, I value the preprocessor for some uses, this being one of them).
Actually any good compiler will inline const variables but I'm not talking about those sort of const. Also you pay 2 costs for using #define: 1) its not typesafe 2) it adds to your compile time because the pre-processor has to do more.
Coming from the Windows world, one isn't "afraid" of 1 above whatsoever. Compile time? CPUs are evolving faster than I'll ever be able to outpace them with the complexity or volume of my software.
 I'm talking about the const you put in function declarations which are 
 very important.


 - Modules
If that means doing away with header files, I don't think I like it. I rely on headers as the engineer's blueprint (of course you have to write very clean code to have that make sense).
I think this means you simply haven't run up against any of the problems with header files.
That's probably exactly what it means. But maybe I'm tooling up to write utility software rather than large scale software.
 - Garbage collection
That's a major deal breaker for me.
Garbage collection can be turned off in D if you don't need it.
So I've been told. But I think the default should be to include it when you like rather than the other way around. Obviously, I can use a GC library in C++ if I was so inclined.
 However for me (even when performance is very important and a game 
 programmer) I can deal with it.

 - No automatic copy constructor
Can't comment.
I'd encourage you to read "See C++ Coding Standards: 101 Rules, Guidelines, and Best Practices (C++ In-Depth Series)" by Herb Sutter, Andrei Alexandrescu http://www.amazon.com/Coding-Standards-Guidelines-Practices-Depth/dp/0321113586 one of the only books that has Bjarne Stroustrups seal of approval.
 Effective C++ is another good read.
Been there a number of times. Automatic copy constructor issue didn't catch my eye though obviously.
 - More restrictive operators
I'm not really concerned about that. I'd avoid them unless doing numerical programming.
The point here is that in C++ operators where used for all sorts of things that they where not designed for. This makes code hard to follow.
I wholeheartedly agree!
 D restricts operators making them more difficult to use for something they 
 are not really designed for, ie encouraging better design.
Minor. I know when not to use operators (read: hardly ever!).
 - Delegates (Specifically, encouraging there use by making them simple 
 to use)
Can't comment.
C++ form of delegates/functors are horrible to debug and use. Even the ones in Andrei Alexandrescu loki which are nicer are still not as easy to use as D's. Delegates are a powerful tool and are very useful in decoupling code. They should be encouraged by being easy to use.
Apparently I should look that one up, cuz I don't even know what they are. But that you said "decoupling", makes me interested.
 - Specific constructs such as Interfaces
C++ has interfaces. Should it be a keyword? Maybe. How are D's interfaces different from C++'s?
C++ has interfaces which can easily become abstractions. This is not good. By saying something is an interface, your documenting -> this is an interface don't change me. Its much better when the code can enforce rules rather then by just comments. You look at an interface in D and you know its an interface, C++ you have to read though the code or hope someone has documented it. I'm not the best explainer in the world so maybe someone else can explain this better.
I use naming standards for interfaces: iSomeClass, for example. I'm not sure what problem D's interfaces solve. I find no problem with C++'s interface techniques.
 - More restrictive casting
Ouch!! I prefer to slice bread with a knife rather than having a machine do it. (Bad analogy, but y'all get the point).
I'd rather the machine catch something at compile time rather then runtime.
As long as I'm not prevented from doing casting I know is safe, it's fine.
 - No C style Macros
Implementing a template or template system with a good preprocessor is something completely different than macros. I value the preprocessor for such uses (I wish it was more powerful than in C++ though).
Macros in C++, powerful yes but I think they are over used.
Macros and using the preprocessor as a template machine are apples and oranges. Every use of the C++ preprocessor does not fit the definition of "macro", though everyone pounces on the obvious as you do below:
 D has a replacement for the macro system which is more powerful.  Most of 
 which is not done in the preprocess.  I've seen more horrible C++ macros 
 then I can count.  They are definably not a good practice.   Any good C++ 
 books will talk about "Macro side effects".  They arn't typesafe, there's 
 a possibility of having an operation performed in a place you don't 
 expect.  They don't work will on multilines (i hate \ because its not 
 maintainable and error prone).  Doing things like string operations are 
 just non-intuitive.

 The other bad thing about macros is they are extremely difficult to debug.

 D provides many of the functionalities of macros in a nicer form:
 - Better Templates
 - Mixins
 - Version

 Site note: About ~80% of macros I've seen in C++ could have been done with 
 templates in C++ and they would have been much better in so many ways.
Tony
Nov 04 2008
next sibling parent reply Don <nospam nospam.com> writes:
Tony wrote:
 "Janderson" <ask me.com> wrote in message 
 news:gepsn2$21jr$1 digitalmars.com...
 Tony wrote:
 Let me be facetious with Janderson's list plz...

 "Janderson" <ask me.com> wrote in message 
 news:ge8tpd$1f6b$1 digitalmars.com...
 Hi,

 I was talking with some collages at work and they asked me how D 
 enforces good programming practices.   For course I mentioned a couple 
 of the ones I knew of hand -

 - Unit checking
Not sure what is meant by this, but it sounds minor.
Sure C++ can do unit checking, but its not built in. You have to use macros or templates in something that is not really designed to work correctly with the language. Even if you ignore that there's a barrior to entry by not having something like this in the language. By having it in the language good coding practices are encouraged.
I write unit tests. I don't know why I'd need or want language support for that.
Yes, it's simple syntax sugar. But in practice, it really seems to dramatically affect the number of unit tests that actually get written.
 - More restrictive casting
Ouch!! I prefer to slice bread with a knife rather than having a machine do it. (Bad analogy, but y'all get the point).
I'd rather the machine catch something at compile time rather then runtime.
As long as I'm not prevented from doing casting I know is safe, it's fine.
D doesn't restrict your ability to do casting in any way. It just makes it a bit more difficult to cast by accident.
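A small sketch of that "harder by accident" point -- the C-style spelling simply doesn't exist, and a bad downcast gives you null rather than a garbage pointer:

class Animal {}
class Dog : Animal {}
class Cat : Animal {}

void main()
{
    Animal a = new Cat;
    // Dog d1 = a;          // error: no implicit downcast
    // Dog d2 = (Dog) a;    // error: C-style casts don't parse
    Dog d = cast(Dog) a;    // explicit, greppable, and checked:
    assert(d is null);      // a Cat is not a Dog, so you get null
}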
 
 - No C style Macros
Implementing a template or template system with a good preprocessor is something completely different than macros. I value the preprocessor for such uses (I wish it was more powerful than in C++ though).
Macros in C++, powerful yes but I think they are over used.
Macros and using the preprocessor as a template machine are apples and oranges. Every use of the the C++ preprocessor does not fit the definition of "macro", thought everyone pounces on the obvious as you do below:
One of the main original motivations of the C++ template system was to provide a typesafe equivalent to the preprocessor. It turned out to be much more powerful than the preprocessor, but there's still major functionality which you can only get from the preprocessor. D templates are _significantly_ more powerful than C++ templates, so they are capable of doing almost all preprocessor jobs. D also has string mixins, which can do the stuff you'd do in C with preprocessor token pasting, but typesafe and a hundred times more powerful.

So I wouldn't list "no C-style macros" as the benefit. I'd rather say the benefit is that the preprocessor tasks are integrated into the main language (they are not simply discarded, the way they are in Java etc).

Bear in mind that the list was not "what's better about D" but rather "what DESIGN PRACTICES are better in D". Which is not exactly the same question.
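A sketch of that point, using a hypothetical property-generating helper -- the kind of boilerplate C stamps out with ## token pasting, done here with a compile-time string and mixin():

// Evaluated at compile time when used inside mixin().
string property(string type, string name)
{
    return type ~ " _" ~ name ~ ";\n"
         ~ type ~ " " ~ name ~ "() { return _" ~ name ~ "; }";
}

class Point
{
    mixin(property("int", "x"));   // declares _x and x()
    mixin(property("int", "y"));   // declares _y and y()
}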
Nov 05 2008
parent reply Walter Bright <newshound1 digitalmars.com> writes:
Don wrote:
 Tony wrote:
 I write unit tests. I don't know why I'd need or want language support 
 for that.
Yes, it's simple syntax sugar. But in practice, it really seems to dramatically affect the number of unit tests that actually get written.
It is amazing what a difference just a little sugar will do. It puts it past the tipping point. I've run across an awful lot of C/C++ code in my career written by professionals. Very few of those had any sort of organized test code. But the opposite seems to be true with D. By supporting it directly in the syntax, code just looks half-baked if it doesn't have unittests and Ddoc documentation. That little push makes all the difference. It also helps with the management of code. It makes it easy to enforce a rule of "public functions shall have unit tests and ddoc comments." Before ddoc, the Phobos documentation was an embarrassing mess. I wouldn't exactly call it the greatest now, but the improvement has been spectacular, and ddoc is the driver behind that.
Nov 06 2008
parent reply bearophile <bearophileHUGS lycos.com> writes:
Walter Bright:

It is amazing what a difference just a little sugar will do. It puts it past
the tipping point. I've run across an awful lot of C/C++ code in my career
written by professionals. Very few of those had any sort of organized test
code. But the opposite seems to be true with D. By supporting it directly in
the syntax, code just looks half-baked if it doesn't have unittests and Ddoc
documentation. That little push makes all the difference.
It also helps with the management of code. It makes it easy to enforce a rule of "public functions shall have unit tests and ddoc comments." Before ddoc, the Phobos documentation was an embarrassing mess. I wouldn't exactly call it the greatest now, but the improvement has been spectacular, and ddoc is the driver behind that.<

I use ddoc and unittests all the time, so I love them a lot. Every little bit, function, template, and class in my dlibs is heavily unittested (probably more than Phobos); I have tried to cover all the weird corner cases too, and that has made the code quite reliable. I have found just a few bugs in my code so far (probably about 1 bug every 20_000 lines of code, but this sounds too good to be true...). Unfortunately I have seen lots of D code that doesn't use unittests, for example the code I'm talking about in the announce group:
http://team0xf.com:8080/omg/file/aca17fefefc1/core/Algebra.d

I agree a lot with what you say: lowering the laziness barrier to do something, and adding some standard sugar, helps a lot in getting things used. In the Python world they have invented something that lowers that barrier even more, namely doctests:
http://www.python.org/doc/2.5.2/lib/module-doctest.html
(It's wonderful and allows even a lazy person to write lots of tests. But it requires capabilities from the language that I think are still absent from D.)

In a recent post of mine I was discussing removing ddoc and unittest from the D language (while putting range controls on all integral values) AND adding features that allow library code to replace them while keeping them equally handy (this also allows such features to be debugged and improved by the D community). At the moment both the ddoc and unittest features of DMD have several bugs and limitations. For example, I'd like to give names to unittests, to run just part of them in a handy way, and to get a log of how many of them have failed at the end, instead of stopping at the first wrong assert, etc. I'd like to have a way to catch static asserts or something similar, so inside the unit tests I can also put code that is supposed to fail statically; this helps me test both positive and negative cases. Etc.

So now the interesting question is: what does D need to allow moving ddoc and unittest into its standard library while keeping a handy syntax?

ddoc: the -D compilation switch of DMD can be removed, as well as the code it contains to generate the HTML page. All such code can be moved into a library (and later improved/debugged). A little program can be added into the "bin" directory that is designed just to create such HTML pages. But there's a problem here: such a tool has to parse D code, etc., so it's a duplication of effort, with a risk of growing little differences in the way D code is parsed, and that's bad. So the DMD compiler can grow a switch that makes it spit out the result of parsing a D module (for example in JSON format), which such a ddoc tool can load (through a pipe too, to avoid putting another file on disk) and use to create the HTML without doing any parsing itself.

unittest: here I am less sure about what needs to be done (besides removing the -unittest compilation switch from DMD). I think D has to grow a few more handy reflection capabilities that can be used to write a short and simple unittest library for the standard library. A unittest(name) {} syntax may be kept in the language... I am not sure.

Bye and thank you,
bearophile
Nov 06 2008
next sibling parent reply "Bill Baxter" <wbaxter gmail.com> writes:
On Fri, Nov 7, 2008 at 4:34 PM, bearophile <bearophileHUGS lycos.com> wrote:

 ddoc: the -D compilation switch of DMD can be removed, as well as the code
that it contains to generate the HTML page. All such code can be moved into a
library (and later improved/debugged). A little program can be added into the
"bin" directory that is designed just to create such HTML pages. But there's a
problem here: such tool has to parse D code, etc, so it's a duplication of
efforts, with a risk of growing little differences in the way D code is parsed,
etc., that's bad. So the DMD compiler can grow a switch that makes it spit out
the result of parsing a D module (for example in Json file format), that such
ddoc tool can load (with a pipe too, to avoid putting another file on disk) and
use avoiding all the parsing and creating the HTML.
Ddoc's output is supposed to be entirely determined by the macro set that you give it. In theory anyway. So it should be possible to write a ddoc macro set that can spit out your Json format version of all entities and their DDoc documentation strings. It may not be possible in practice, but I think bug reports on what's missing to make that a reality are less likely to get ignored than ones suggesting that DDoc be removed.
 unittest: here I am less sure about what it needs to be done (beside removing
the -unittest compilation switch from DMD). I think D has to grow few more
handy reflection capabilities, that can be used to write short and a simple
unittest library for the standard library. A unittests(name) {} syntax may be
kept in the language... I am not sure.
Someone who's a big unittesting fan should write up a proposal on this. I think unittests are neat and all -- I probably don't use them as much as I should -- but I don't really know what's so great about named unittests or other things people mention that D's unittests lack. I suspect Walter may be in the same boat. You can't address a problem if you don't really understand it. --bb
Nov 06 2008
next sibling parent reply Walter Bright <newshound1 digitalmars.com> writes:
Bill Baxter wrote:
 Someone who's a big unittesting fan should write up a proposal on
 this.  I think unittests are neat and all -- I probably don't use them
 as much as I should -- but I don't really know what's so great about
 named unittests or other things people mention that D's unittests
 lack.  I suspect Walter may be in the same boat.  You can't address a
 problem if you don't really understand it.
I guess I am in that boat. It reminds me of a couple decades ago, when you could buy profilers for your programs. Everybody bought them, but nobody used them. The inch thick manual remained in its shrinkwrap. I suspected the problem was the manual. The profile tool was packed with every feature imaginable, all thoroughly configurable. Unfortunately, actually running the profiler and getting a result took a considerable investment of time by the programmer trying to figure out how to do it. He rarely bothered. That's why the profiler for dmd is just -profile. Nothing to learn. Same goes for the coverage analyzer. Experience has led me to believe that unit tests are extremely valuable, but I rarely see them used - even by professionals. I wanted to make them so easy to use in D that it would hook people in. That's why they are the way they are - super simple, next to nothing to learn, and they work.
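For the record, the invocations being referred to are just the following (the output file names are from memory, so treat them as approximate):

dmd -profile app.d    (running the resulting program writes trace.log with call counts and timings)
dmd -cov app.d        (running it writes app.lst, marking how many times each line executed)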
Nov 10 2008
next sibling parent reply bearophile <bearophileHUGS lycos.com> writes:
Walter Bright:

Experience has led me to believe that unit tests are extremely valuable, but I
rarely see them used - even by professionals.<
In languages with dynamic typing (Python, Ruby, etc.) they are used quite often, partially to catch the errors that static typing would otherwise catch, and partially for other purposes (but after writing about 110,000 lines of D code I have seen that unit tests turn out to be very useful in D code too). In dynamic languages they have even invented a programming style (TDD, Test-Driven Development) that is strictly based on unit tests: you write a unit test first, see it fail, you fix the code to make it pass, you add another unit test, you add a little more code, you see it fail, etc. I know it sounds a little crazy, but if you use dynamic languages to write certain classes of programs (surely it's not fit for every kind of code) it seems to work well enough (for some kinds of programmers, I presume).
I wanted to make them so easy to use in D that it would hook people in. That's
why they are the way they are - super simple, next to nothing to learn, and
they work.<
I understand. The profilers you are talking about push too much complexity onto the final user. But ergonomics shows there are other possible designs for the interfaces of tools: sometimes you can push some more complexity into the product even if its interface is kept simple enough, making it flexible only where it counts most. So I think there are a "few things" that can be added to the current unit test system that would increase its usefulness and make it more handy while keeping a simple user interface. It's not easy to find and list such things, but I can try:

1) I'd like a way to state that an expression throws one or more specified exception(s), at runtime, for example:

Throws!(ArgumentException)(foo("hello", -5));

It also has to print the line number of the caller. I have created something similar, but it's quite a bit less nice:

assert( Throws!(ArgumentException)(foo("hello", -5)) );

See my Throws!() here:
http://www.fantascienza.net/leonardo/so/dlibs/func.html

2) The same at compile time. I think it's impossible to do currently:

static Throws!(AssertError)(foo(5, -5));

3) I need ways to unittest a specified module only. And I'd like to be able to do it even if the main is missing. Having a compiler-managed "mainmodule" boolean constant that is true only in the main module may help.

4) I'd like to unittest nested functions too.

5) A few reflective capabilities can be added to D to help the handy creation of an external unittest system, for the people that need something more refined and complex.

--------------------------

I have already given links twice to the wonderful doctest system of Python, but it seems no one has read them; I have seen no one comment on it. So I try a third time, and this time I explain a little more.

Note that doctests are unfit for the current D language, but if D gains some runtime capabilities (like the ones I have seen shown here twice), then their philosophy may become usable.

Note that I am not talking about Test-Driven Development here; this is a "normal" way of coding.

This is a little useful Python function that returns true if the given iterable contains items that are all equal. If given, an optional mapping function is used to transform items before comparing them:

def allequal(iterable, key=None):
    """allequal(iterable, key=None): return True if all the items
    of iterable are equal. If key is specified it returns True if
    all the key(item) are equal.
    """
    iseq = iter(iterable)
    try:
        first = iseq.next()
    except StopIteration:
        return True
    if key is None:
        for el in iseq:
            if el != first:
                return False
    else:
        key_first = key(first)
        for el in iseq:
            if key(el) != key_first:
                return False
    return True

My D1 version of the same function (you can find it in the "func" module of my dlibs):

/*********************************************
Return true if all the items of the iterable 'items' are equal.
If 'items' is empty return true.
If the optional 'key' callable is specified, it returns true if all
the key(item) are equal. If 'items' is an AA, scans its keys.
*/
bool allEqual(TyIter, TyKey=void*)(TyIter items, TyKey key=null) {
    static if (!IsCallable!(TyKey))
        if (key !is null)
            throw new ArgumentException("allEqual(): key must "
                                        "be a callable or null.");

    bool isFirst = true;

    static if (!is( TyIter == void[0] )) {
        static if (IsCallable!(TyKey)) {
            ReturnType!(TyKey) keyFirstItem;

            static if (IsAA!(TyIter)) {
                foreach (el, _; items)
                    if (isFirst) {
                        isFirst = false;
                        keyFirstItem = key(el);
                    } else {
                        if (key(el) != keyFirstItem)
                            return false;
                    }
            } else static if (IsArray!(TyIter)) {
                if (items.length > 1) {
                    keyFirstItem = key(items[0]);
                    foreach (el; items[1 .. $])
                        if (key(el) != keyFirstItem)
                            return false;
                }
            } else {
                foreach (el; items)
                    if (isFirst) {
                        isFirst = false;
                        keyFirstItem = key(el);
                    } else {
                        if (key(el) != keyFirstItem)
                            return false;
                    }
            }
        } else {
            BaseType1!(TyIter) firstItem;

            static if (IsAA!(TyIter)) {
                return items.length < 2 ? true : false;
            } else static if (IsArray!(TyIter)) {
                if (items.length > 1) {
                    firstItem = items[0];
                    foreach (el; items[1 .. $])
                        if (el != firstItem)
                            return false;
                }
            } else {
                foreach (el; items)
                    if (isFirst) {
                        isFirst = false;
                        firstItem = el;
                    } else {
                        if (el != firstItem)
                            return false;
                    }
            }
        }
    }

    return true;
} // end allEqual()

Its unit tests:

unittest { // Tests of allEqual()
    // array
    assert(allEqual([]));
    assert(allEqual(new int[0]));
    assert(allEqual([1]));
    assert(!allEqual([1, 1, 2]));
    assert(allEqual([1, 1, 1]));
    assert(allEqual("aaa"));
    assert(!allEqual("aab"));

    // array with key function
    int abs(int x) { return x >= 0 ? x : -x; }
    assert(allEqual([], &abs));
    assert(allEqual(new int[0], &abs));
    assert(allEqual([1], &abs));
    assert(allEqual([1, -1], &abs));
    assert(!allEqual([1, -2], &abs));

    // AA
    assert(allEqual(AA!(int, int)));
    assert(allEqual([1: 1]));
    assert(!allEqual([1: 1, 2: 2]));
    assert(!allEqual([1: 1, 2: 2, 3: 3]));

    // AA with key function
    assert(allEqual(AA!(int, int)));
    assert(allEqual([1: 1], &abs));
    assert(!allEqual([1: 1, 2: 2], &abs));
    assert(allEqual([1: 1, -1: 2], &abs));
    assert(!allEqual([1: 1, -1: 2, 2: 3], &abs));

    // with an iterable
    struct IterInt { // iterable wrapper
        int[] items;
        int opApply(int delegate(ref int) dg) {
            int result;
            foreach (el; this.items) {
                result = dg(el);
                if (result)
                    break;
            }
            return result;
        }
    }

    assert(allEqual(IterInt(new int[0])));
    assert(allEqual(IterInt([1])));
    assert(!allEqual(IterInt([1, 1, 2])));
    assert(allEqual(IterInt([1, 1, 1])));

    // iterable with key function
    assert(allEqual(IterInt(new int[0]), &abs));
    assert(allEqual(IterInt([1]), &abs));
    assert(allEqual(IterInt([1, -1]), &abs));
    assert(!allEqual(IterInt([1, -2]), &abs));
} // End tests of allEqual()

In Python, if I want to use doctests, I can start the Python shell, import a module that contains that allequal() function, and try it in various ways. If I find bugs or strange outputs I can also debug it, etc. Let's say there are no bugs; then this can be the log of the usage of allequal() in that shell:
>>> from util import allequal
>>> allequal()
Traceback (most recent call last):
  ...
TypeError: allequal() takes at least 1 argument (0 given)
>>> allequal([])
True
>>> allequal([1])
True
>>> allequal([1, 1L, 1.0, 1.0+0.0J])
True
>>> allequal([1, 1, 2])
False
>>> allequal([1, -1, -1.0], key=abs)
True
>>> allequal(iter([]))
True
>>> allequal(iter([]), key=abs)
True
>>> allequal(iter([1]), key=abs)
True
>>> allequal(iter([1, 2]), key=abs)
False

Then I can just copy and paste that log into the docstring of the function (the docstring is similar to the /** ... */ or /// of D):

def allequal(iterable, key=None):
    """allequal(iterable, key=None): return True if all the items
    of iterable are equal. If key is specified it returns True if
    all the key(item) are equal.

    >>> allequal()
    Traceback (most recent call last):
      ...
    TypeError: allequal() takes at least 1 argument (0 given)
    >>> allequal([])
    True
    >>> allequal([1])
    True
    >>> allequal([1, 1L, 1.0, 1.0+0.0J])
    True
    >>> allequal([1, 1, 2])
    False
    >>> allequal([1, -1, -1.0], key=abs)
    True
    >>> allequal(iter([]))
    True
    >>> allequal(iter([]), key=abs)
    True
    >>> allequal(iter([1]), key=abs)
    True
    >>> allequal(iter([1, 2]), key=abs)
    False
    """
    iseq = iter(iterable)
    try:
        first = iseq.next()
    except StopIteration:
        return True
    if key is None:
        for el in iseq:
            if el != first:
                return False
    else:
        key_first = key(first)
        for el in iseq:
            if key(el) != key_first:
                return False
    return True

At the end of the module where allequal() lives I just need to add:

if __name__ == "__main__":
    import doctest
    doctest.testmod()

(where the if __name__ == ... test is true only in the main module). Note the Traceback example: that call fails on purpose, to show that expected exceptions can be tested too. Now that shell log is run, and the result of each expression is compared to the recorded output. If they are different, a test has failed, and at the end a list of the failed ones is shown. If the module is only imported, the tests aren't run.

doctests can't be used for everything, and there are more complex and refined ways to test in Python, but for quicker/smaller purposes they are a godsend. I've never found anything as handy for writing tests in the 20+ other languages I've used.

Bye,
bearophile
Nov 10 2008
next sibling parent reply Jason House <jason.james.house gmail.com> writes:
bearophile Wrote:

 I have already given two times links to the wonderful doctest system of
Python, but it seems no one has read it, I have seen no one comment on it. So I
try a third time...
Where's the link? Also, if you want D programmers to read about it, you probably shouldn't say up front that it's incompatible with D ;)
Nov 10 2008
parent bearophile <bearophileHUGS lycos.com> writes:
Jason House:
 Where's the link?
http://www.python.org/doc/2.5.2/lib/module-doctest.html
 Also, if you want D programmers to read about it, you probably shouldn't say
up front that it's incompatible with D ;)
This is a technical forum for mature people, so I think telling the truth up front is better.
Nov 10 2008
prev sibling parent reply Christopher Wright <dhasenan gmail.com> writes:
bearophile wrote:
 Walter Bright:
 
 Experience has led me to believe that unit tests are extremely valuable, but I
rarely see them used - even by professionals.<
In languages with dynamic typing (Python, Ruby, etc) they are used quite often, partially to replace errors that the static typing catches, and partially for other purposes (but after writing about 110.000 lines of D code I have seen that unit tests result very useful in D code too). In dynamic languages they have even invented a programming style (TDD, Test-Driven Development) that is strictly based on unit tests: you write a unit test first, see it fail, you fix the code to make it pass, you add another unit test, you add a little more code, you see it fail, etc etc. I know it sounds a little crazy, but if you use dynamic languages to write certain classes of programs (surely it's not fit for every kind of code) it seem to work well enough (for some kinds of programmers, I presume).
 I wanted to make them so easy to use in D that it would hook people in. That's
why they are the way they are - super simple, next to nothing to learn, and
they work.<
I understand. The profilers you are talking about push too much complexity to the final user. But ergonomics shows there are other possible designs for the interfaces of tools: sometimes you can push some more complexity into the product even if its interface is kept simple enough, making it flexible only where it more counts. So I think there are "few things" that can be added to the current unit test system that can increase its usefulness and make it more handy while keeping a simple user interface. It's not easy to find and list such few things, I can try list something: 1) I'd like a way to state that an expression throws one or more specified exception(s), at runtime, for example: Throws!(ArgumentException)(foo("hello", -5)); It also has to print that line number of the caller. I have created something similar, but it's quite less nice: assert( Throws!(ArgumentException)(foo("hello", -5)) ); See my Throws!() here: http://www.fantascienza.net/leonardo/so/dlibs/func.html
For what it's worth, dunit supports this:

tests["no expected exception"] = {};
tests["fails if it doesn't throw"] = expectedException!(AssertError) =
{
    assert(false);
};

I was attempting to channel downs when I came up with this syntax.
 2) The same at compile time. I think it's impossible to do currently:
 static Throws!(AssertError)(foo(5, -5));
 
 3) I need ways to unittest a specified module only. And I'd like to be able to
do it even if the main is missing. Having a compiler-managed "mainmodule"
boolean constant that is true only in the main module may help.
Dunit also supports unittesting a single module, but since it replaces main, you can't do without it. That said, I'm looking for workarounds, such as outputting a module sufficient to compile and run tests in the given paths.
 4) I'd like to unittest nested functions too.
That's not going to be easy.
 5) Few reflective capabilities can be added to D to help the handy creation of
an external unittest system, for the people that need something quite more
refined and complex.
D2 will get some improvements that I really would like for dunit. Runtime reflection to get a callable list of the methods on a class is a big one -- the current syntax is a hack. But that won't be sufficient to move to an NUnit/JUnit style syntax; you won't get filters (parameterized tests, expected exceptions, and the like). The really big thing after runtime reflection is user-defined metadata; that would serve a lot of other use cases as well, so I'm hoping it makes it into the language within a reasonable amount of time.
 --------------------------
 
 I have already given two times links to the wonderful doctest system of
Python, but it seems no one has read it, I have seen no one comment on it. So I
try a third time, this time I explain a little more.
 
 Note that doctests are unfit for the current D language, but if D gains some
runtime capabilities (like I have seen shown here two times), then its
phylosophy may become usable.
 
 Note that I am not talking about Test-Driven Development here, this is
"normal" way of coding.
 
 
 This is a little useful Python function that returns true if the given
iterable contains items that are all equal. If given an optional mapping
function is used to transform items before comparing them:
This is interesting. It's not as flexible as dunit or D's unittest blocks -- it'll complain about any user-visible changes to a function. It also looks like it'd be annoying to use, say, mock objects with it. I would have no use for doctests, but I think it's a neat hack.
Nov 10 2008
parent reply bearophile <bearophileHUGS lycos.com> writes:
Christopher Wright:

 For what it's worth, dunit supports this:
 tests["no expected exception"] = {};
 tests["fails if it doesn't throw"] = expectedException!(AssertError) = {
 assert(false); };
I don't understand that syntax.
4) I'd like to unittest nested functions too.<<
That's not going to be easy.<
It's not too much important.
This is interesting. It's not as flexible as dunit or D's unittest blocks --
it'll complain about any user-visible changes to a function. It also looks like
it'd be annoying to use, say, mock objects with it. I would have no use for
doctests, but I think it's a neat hack.<
I use it every day and I find it very useful, but note it's not meant to replace normal unittests (in Python for those you use the unittest module of the std lib, or a system you can find online, like "nose"). It's mainly meant to write "documentation tests", that is, to write normal documentation that also contains and shows some usage examples: with doctests you can be sure that documentation never goes out of sync with the code, because it's documentation that runs.
Nov 10 2008
parent reply Christopher Wright <dhasenan gmail.com> writes:
bearophile wrote:
 Christopher Wright:
 
 For what it's worth, dunit supports this:
 tests["no expected exception"] = {};
 tests["fails if it doesn't throw"] = expectedException!(AssertError) = {
 assert(false); };
I don't understand that syntax.
It's motivated by the lack of reflection in D1. In dunit, you first make a test fixture:

class FooTests : TestFixture
{
}

Then in the constructor, you define tests:

class FooTests : TestFixture
{
    this ()
    {
        tests["test 1"] =
        {
            assert (1 < 2);
        };
    }
}

A filter just goes in between the test name and the test body:

class FooTests : TestFixture
{
    this ()
    {
        tests["test 1"] = expectedException!(AssertError) =
        {
            assert (!(1 < 2));
        };
    }
}
 4) I'd like to unittest nested functions too.<<
That's not going to be easy.<
It's not too much important.
 This is interesting. It's not as flexible as dunit or D's unittest blocks --
it'll complain about any user-visible changes to a function. It also looks like
it'd be annoying to use, say, mock objects with it. I would have no use for
doctests, but I think it's a neat hack.<
I use it every day and I find it very useful, but note it's not meant to replace normal unittests (in Python for them you use the unittest module of the std lib, or a system you can find online, like "nose"), it's mailing meant to write "documentation tests", that is to write normal documentation that also contains and shows some usage examples: with doctests you can be sure that documentation never goes out of sync with the code, because it's documentation that runs.
For that, it looks like it would work quite well, as long as it's kept up to date. And being machine verifiable, it should be easy to find out what's outdated.
 Bye,
 bearophile
Nov 11 2008
parent reply Janderson <ask me.com> writes:
Christopher Wright wrote:
 bearophile wrote:
 Christopher Wright:

 For what it's worth, dunit supports this:
 tests["no expected exception"] = {};
 tests["fails if it doesn't throw"] = expectedException!(AssertError) = {
 assert(false); };
I don't understand that syntax.
It's motivated by the lack of reflection in D1. In dunit, you first make a test fixture: class FooTests : TestFixture { } Then in the constructor, you define tests: class FooTests : TestFixture { this () { tests["test 1"] = { assert (1 < 2); }; } } A filter just goes in between the test name and the test body: class FooTests : TestFixture { this () { tests["test 1"] = expectedException!(AssertError) = { assert (!(1 < 2)); }; } }
*sigh* This is smart and all. However this is the sort of thing that puts me off unit tests that are not part of the language (or their own language). It feels like a big hack to me. -Joel
Nov 11 2008
parent Christopher Wright <dhasenan gmail.com> writes:
Janderson wrote:
 *sigh* This is smart and all.  However this is the sort of thing that 
 puts me off unit tests that are not part of the language (or their own 
 language).  It feels like a big hack to me.
 
 -Joel
I sympathize. D needs better runtime reflection with user-defined metadata for a reasonable solution. Unit testing is sufficiently important for me that I'm willing to deal with odd syntax.
Nov 12 2008
prev sibling next sibling parent mgen <bmeck stedwards.edu> writes:
I agree that writing unit tests in D is much easier, and I use them every so often,
but I think most people just don't see that they are there. I think the unittests
should be given a small page to themselves in the spec. Naming them, and having
scope(exit) { } work on them to do things such as post an email or a log from
custom code, would be ideal for my purposes.
Nov 10 2008
prev sibling parent Jason House <jason.james.house gmail.com> writes:
Walter Bright Wrote:

 Experience has led me to believe that unit tests are extremely valuable, 
 but I rarely see them used - even by professionals. I wanted to make 
 them so easy to use in D that it would hook people in. That's why they 
 are the way they are - super simple, next to nothing to learn, and they 
 work.
D's unit tests are _harder_ to use when there's a failure!

Once upon a time, I upgraded to a new version of Tango. It had undocumented breaking changes in it that broke several unit tests in the same module. It took me a while to realize the full scope of the problem because I was unable to see the full impact of the upgrade.

I've also had issues where I've had known unit test failures while working on code. In the interim, I'm left with two bad options:
1. Comment out the unit test
2. Stop running unit tests

I hope these two small examples give a flavor for why I'd like the ability to get status of all unit tests in a module.

It'd be nice if one could set up test-driven development with D's unit tests. Right now, I'm in the process of making an adapter around 3 APIs. What I'd like to do is get the first API wrapped up and write a reusable tester object that works on the wrapped interface. Then, I'd like to turn that tester on the other two APIs as I wrap them and get incremental updates on what does and does not work. For this to work, I'd want to be able to see unit test failures without stopping. I'd also need more than just line numbers for a failure, since the same test object will be used for different APIs. This applies generally to any templated object with unit tests. With an increased number of test failures, I'll really want to start differentiating them in terms of things that I can easily read and understand. Stuff like "Test Failure: API3: Liberties Test: Incorrect liberties after merger" etc...
Nov 10 2008
prev sibling parent reply Ary Borenszweig <ary esperanto.org.ar> writes:
Bill Baxter wrote:
 On Fri, Nov 7, 2008 at 4:34 PM, bearophile <bearophileHUGS lycos.com> wrote:
 
 ddoc: the -D compilation switch of DMD can be removed, as well as the code
that it contains to generate the HTML page. All such code can be moved into a
library (and later improved/debugged). A little program can be added into the
"bin" directory that is designed just to create such HTML pages. But there's a
problem here: such tool has to parse D code, etc, so it's a duplication of
efforts, with a risk of growing little differences in the way D code is parsed,
etc., that's bad. So the DMD compiler can grow a switch that makes it spit out
the result of parsing a D module (for example in Json file format), that such
ddoc tool can load (with a pipe too, to avoid putting another file on disk) and
use avoiding all the parsing and creating the HTML.
Ddoc's output is supposed to be entirely determined by the macro set that you give it. In theory anyway. So it should be possible to write a ddoc macro set that can spit out your Json format version of all entities and their DDoc documentation strings. It may not be possible in practice, but I think bug reports on what's missing to make that a reality are less likely to get ignored than ones suggesting that DDoc be removed.
 unittest: here I am less sure about what it needs to be done (beside removing
the -unittest compilation switch from DMD). I think D has to grow few more
handy reflection capabilities, that can be used to write short and a simple
unittest library for the standard library. A unittests(name) {} syntax may be
kept in the language... I am not sure.
Someone who's a big unittesting fan should write up a proposal on this. I think unittests are neat and all -- I probably don't use them as much as I should -- but I don't really know what's so great about named unittests or other things people mention that D's unittests lack. I suspect Walter may be in the same boat. You can't address a problem if you don't really understand it. --bb
I write unit tests all the time in Java, both for projects at work and for projects outside of work (Descent, for instance). I make a class named FooFixture or FooTests to test the Foo class. Each method of that class is a test, and its name shows the intention of the test. If a test "ShouldDoSomething()" fails, I can see in the UI "ShouldDoSomething() failed", together with a stack trace. So if I change something in my code, I can know at a glance which tests failed, and probably understand why. This is not possible to achieve if you get "assert failed, blah blah, at line 8 of foo.d". If I see that many tests failed, I can get a better understanding of what I broke and why. In D I see just one failed assertion and that's it; I can't see all of the parts that the problem has affected. That's why I think named unittest blocks and continuing after an assertion failure are useful.

As for ddoc, I wish there were a way to say "link to some declaration" (other documentation tools have this, I think). When you write documentation, seeing the declarations it relates to helps understanding the overall picture. There's currently no way to insert links to other declarations, and that makes it hard to navigate ddocs. This also applies for declarations found in parameters, base classes and implemented interfaces, etc.
Nov 10 2008
parent Bruno Medeiros <brunodomedeiros+spam com.gmail> writes:
Ary Borenszweig wrote:
 
 As for ddoc, I wish there were a way to say "link to some declaration". 

 think). When you write documentation, understating declarations it 
 participates with helps understanding the overall picture. There's 
 currently no way to insert links to other declarations, and that makes 
 it hard to navigate ddocs. This also applies for declarations found in 
 parameters, base classes and implemented interfaces, etc.
It also has the benefit of enabling automatic refactoring of that reference, whenever the actual (non-ddoc) code is refactored. -- Bruno Medeiros - Software Developer, MSc. in CS/E graduate http://www.prowiki.org/wiki4d/wiki.cgi?BrunoMedeiros#D
Nov 28 2008
prev sibling parent "Bill Baxter" <wbaxter gmail.com> writes:
On Fri, Nov 7, 2008 at 4:48 PM, Bill Baxter <wbaxter gmail.com> wrote:
 On Fri, Nov 7, 2008 at 4:34 PM, bearophile <bearophileHUGS lycos.com> wrote:

 ddoc: the -D compilation switch of DMD can be removed, as well as the code
that it contains to generate the HTML page. All such code can be moved into a
library (and later improved/debugged). A little program can be added into the
"bin" directory that is designed just to create such HTML pages. But there's a
problem here: such tool has to parse D code, etc, so it's a duplication of
efforts, with a risk of growing little differences in the way D code is parsed,
etc., that's bad. So the DMD compiler can grow a switch that makes it spit out
the result of parsing a D module (for example in Json file format), that such
ddoc tool can load (with a pipe too, to avoid putting another file on disk) and
use avoiding all the parsing and creating the HTML.
Ddoc's output is supposed to be entirely determined by the macro set that you give it. In theory anyway. So it should be possible to write a ddoc macro set that can spit out your Json format version of all entities and their DDoc documentation strings. It may not be possible in practice, but I think bug reports on what's missing to make that a reality are less likely to get ignored than ones suggesting that DDoc be removed.
 unittest: here I am less sure about what it needs to be done (beside removing
the -unittest compilation switch from DMD). I think D has to grow few more
handy reflection capabilities, that can be used to write short and a simple
unittest library for the standard library. A unittests(name) {} syntax may be
kept in the language... I am not sure.
Someone who's a big unittesting fan should write up a proposal on this. I think unittests are neat and all -- I probably don't use them as much as I should -- but I don't really know what's so great about named unittests or other things people mention that D's unittests lack. I suspect Walter may be in the same boat. You can't address a problem if you don't really understand it.
BTW, another thing I think would be cool to do with DDoc (or try to, anyway) is to create a macro set that generates something like header files with documentation. Basically .di files, but .di files meant for humans to read, including all ddoc comments and formatted sensibly. --bb
Nov 07 2008
prev sibling parent reply Janderson <ask me.com> writes:
Tony wrote:
 "Janderson" <ask me.com> wrote in message 
news:gepsn2$21jr$1 digitalmars.com...
 Tony wrote:
 Let me be facetious with Janderson's list plz...

 "Janderson" <ask me.com> wrote in message 
news:ge8tpd$1f6b$1 digitalmars.com...
 Hi,

 I was talking with some collages at work and they asked me how D 
enforces good programming practices. For course I mentioned a couple of the ones I knew of hand -
 - Unit checking
Not sure what is meant by this, but it sounds minor.
Sure C++ can do unit checking, but its not built in. You have to
use macros or templates in something that is not really designed to work correctly with the language. Even if you ignore that there's a barrior to entry by not having something like this in the language. By having it in the language good coding practices are encouraged.
 I write unit tests. I don't know why I'd need or want language 
support for that. What api do you use? All the api's I've used are not as nice as the built in one for D.
 - Design by contract
Overblown concept, but can be done with C++ also to a more than
adequate degree (heard of assertions?).
Yes this can be done with C++ but D takes it many steps further.
Like I said, I find the techniques more important than some
implementation of them (mechanism rather than policy?).

D contracts take this one step further than C++. They allow one to decouple the contracts from the code itself:

long square_root(long x)
    in
    {
    assert(x >= 0);
    }
    out (result)
    {
    assert((result * result) <= x && (result+1) * (result+1) >= x);
    }
    body
    {
    return cast(long)std.math.sqrt(cast(real)x);
    }

D also has better compile-time messages. It also supports static asserts.
 - Invariant checks
Part of DbC concepts. See Koenig and Moo's array example in
"Accelerated C++". Which, btw, leads me to believe that there are few instances "where the stars line up just right" for invariant checking to be useful.
 Invariant checks can be done in C++ but it's very unwieldy.  It is
very annoying to have to instrument each function with scope guards. D encourages good invariant checking by making it easy.
 But again, I am thinking that the scenarios where invariants can be 
established is a very small subset of classes.

I think the best programmers use invariant checks a whole lot.

 - Stronger const
Insignificant. I still use many #defines just because I know that
const vars take space and #defines are a pre-compile-time thing (yes, I value the preprocessor for some uses, this being one of them).
 Actually any good compiler will inline const variables, but I'm not
talking about that sort of const.  Also, you pay two costs for using #define:
 1) it's not typesafe
 2) it adds to your compile time because the pre-processor has to do
more.
 Coming from the Windows world, one isn't "afraid" of 1 above 
whatsoever. Compile time? CPUs are evolving faster than I'll ever be able to outpace them with the complexity or volume of my software.

I guess for you compilation time isn't a problem. It is for me. Every large project I've worked on gets to a point where compile time is a problem, partly because of use of code generation (templates), which yes, slows things down, but is also generating a load of code I don't have to write.

Also, #defines really aren't a problem for the CPU at compile time, you're right there. I'm trying to point out that using #defines for const is totally ridiculous (sorry for being so harsh). No C++ book or expert would recommend it, and it doesn't result in any run-time optimisation whatsoever.
 I'm talking about the const you put in function declarations which 
are very important.
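
For example, a rough sketch of what I mean (untested, made-up names, D2 const syntax):

const int MAX_USERS = 64;          // typed, scoped constant; no #define needed

void greet(const(char)[] name)     // const on the parameter: the function
{                                  // promises not to modify the argument
    // name[0] = 'X';              // error: cannot modify const data
}

void main()
{
    greet("world");
    assert(MAX_USERS == 64);
}

A #define gives you neither the type checking on the constant nor the guarantee on the parameter.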
 - Modules
If that means doing away with header files, I don't think I like
it. I rely on headers as the engineer's blueprint (of course you have to write very clean code to have that make sense).
 I think this means you simply haven't run up against any of the 
problems with header files.
 That's probably exactly what it means. But maybe I'm tooling up to 
write utility software rather than large scale software. That could be so. D does seem targeted at large scale software. For me that's great. That's the thing with C versus C++: I think C is great until you have to write something large that is maintainable. C++/D has a lot of scaffolding, and that only starts to pay dividends when the code gets larger. At some point a C++ program will be smaller than a C program.
 - Garbage collection
That's a major deal breaker for me.
Garbage collection can be turned off in D if you don't need it.
So I've been told. But I think the default should be to include it
when you like rather than the other way around. Obviously, I can use a GC library in C++ if I was so inclined.

This is a good debatable point. I don't agree, but that's just because of all the headaches I've had to track down with manual memory management.
 However for me (even when performance is very important and a game 
programmer) I can deal with it.
 - No automatic copy constructor
Can't comment.
I'd encourage you to read "See C++ Coding Standards: 101 Rules,
Guidelines, and Best Practices (C++ In-Depth Series)" by Herb Sutter, Andrei Alexandrescu
 
http://www.amazon.com/Coding-Standards-Guidelines-Practices-Depth/dp/0321113586
 one of the only books that has Bjarne Stroustrups seal of approval.
 Effective C++ is another good read.
Been there a number of times. Automatic copy constructor issue didn't
catch my eye though obviously. In C++ it's standard practice for most programmers to disable the copy constructor for many of the classes they create. At some companies it's mandatory to either disable it or implement one.
 - More restrictive operators
I'm not really concerned about that. I'd avoid them unless doing
numerical programming.
 The point here is that in C++, operators were used for all sorts of
things that they were not designed for. This makes code hard to follow.
 I whole heartedly agree!

 D restricts operators making them more difficult to use for 
something they are not really designed for, ie encouraging better design.
 Minor. I know when not to use operators (read: hardly ever!).
For you this might not be a useful feature, however for many others it is, and that is whom I was addressing when I wrote this document. With small apps with short development cycles you have intimate knowledge about everything that makes them tick. With large apps you have to communicate through code. A language that enforces some sort of standards and documentation (i.e. even keywords as simple as interface) will ultimately help others with that communication. Also, when I go back to the code in 6 months I will have a better understanding of it because the compiler prevented me from writing something that was nonsensical.
 - Specific constructs such as Interfaces
C++ has interfaces. Should it be a keyword? Maybe. How are D's
interfaces different from C++'s?
 C++ has interfaces which can easily become abstractions.  This is
not good. By saying something is an interface, you're documenting: this is an interface, don't change me. It's much better when the code can enforce rules rather than just comments. You look at an interface in D and you know it's an interface; in C++ you have to read through the code or hope someone has documented it. I'm not the best explainer in the world, so maybe someone else can explain this better.
 I use naming standards for interfaces: iSomeClass, for example. I'm 
not sure what problem D's interfaces solve. I find no problem with C++'s interface techniques.

Compiler checked documentation.  This is an interface and that's what it 
is.  Don't make it abstract.  It's a built-in naming and enforced 
convention.  Also, I know there's lots of heat around naming styles, but 
having things like "i" in front of classes means that you can't change 
what that thing is (well, you can if you have access to the user code and 
want to do a find-replace).
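
A rough sketch of what I mean (made-up names):

interface Renderer           // the compiler guarantees this stays a pure interface
{
    void draw();
}

class GlRenderer : Renderer
{
    void draw() { /* ... */ }
}

void main()
{
    Renderer r = new GlRenderer;
    r.draw();
}

In C++ the "interface" is just an abstract class by convention, so nothing stops 
someone from quietly adding data members or implementation to it later; in D the 
interface keyword makes that a compile error.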

 - More restrictive casting
Ouch!! I prefer to slice bread with a knife rather than having a
machine do it. (Bad analogy, but y'all get the point).
 I'd rather the machine catch something at compile time rather than at
runtime.
 As long as I'm not prevented from doing casting I know is safe, it's 
fine. That's the point of casts, right? :) You can always cast to what you want in D; however, the more dangerous ones can be slightly harder to do.
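
For instance, a quick sketch of the difference with class casts (hypothetical classes):

class Base { }
class Derived : Base { }

void main()
{
    Base b = new Base;
    auto d = cast(Derived) b;   // checked at runtime: yields null here,
    assert(d is null);          // not a bogus Derived reference

    // Reinterpreting unrelated pointer types still works, it just has to be
    // spelled out, e.g. cast(int*) cast(void*) b.
}

A C-style cast in C++ would happily "succeed" and hand you garbage.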
 - No C style Macros
Implementing a template or template system with a good preprocessor
is something completely different than macros. I value the preprocessor for such uses (I wish it was more powerful than in C++ though).
 Macros in C++, powerful yes but I think they are over used.
Macros and using the preprocessor as a template machine are apples
and oranges. Every use of the C++ preprocessor does not fit the definition of "macro", though everyone pounces on the obvious as you do below:
 D has a replacement for the macro system which is more powerful, 
and most of it is not done in a preprocessor. I've seen more horrible C++ macros than I can count. They are definitely not a good practice. Any good C++ book will talk about "macro side effects": they aren't typesafe, and there's a possibility of having an operation performed in a place you don't expect. They don't work well across multiple lines (I hate \ because it's not maintainable and error prone). Doing things like string operations with them is just non-intuitive.
 The other bad thing about macros is they are extremely difficult to 
debug.
 D provides many of the functionalities of macros in a nicer form:
 - Better Templates
 - Mixins
 - Version

 Side note: About ~80% of the macros I've seen in C++ could have been
done with templates in C++, and they would have been much better in so many ways.
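
A small sketch of what I mean (made-up example): where C++ reaches for #ifdef and a function-like macro, D uses version blocks and templates, both of which respect types and scoping:

version (Windows)
{
    const string lineEnding = "\r\n";
}
else
{
    const string lineEnding = "\n";
}

// replaces #define MAX(a, b) ((a) > (b) ? (a) : (b))
T maxOf(T)(T a, T b)
{
    return a > b ? a : b;   // arguments evaluated exactly once, T is type checked
}

void main()
{
    assert(maxOf(3, 4) == 4);
}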
 Tony
Nov 05 2008
next sibling parent reply bearophile <bearophileHUGS lycos.com> writes:
Janderson:
 D contracts take this one step futher then C++.  They allow one to 
 decouple the contracts from the code itself:
I think D contracts have to be improved to become really useful. I have used them very little so far in D because of that. At the moment the only contracts I see as useful in D are the class invariants. The good thing about contracts is that the compiler can use them to infer several useful things. You can look at Eiffel to see how much more useful they can become. If you want, we can discuss this topic a little; it seems to have been largely ignored in the recent discussions that pass through this newsgroup. Bye, bearophile
Nov 05 2008
parent reply Janderson <ask me.com> writes:
bearophile wrote:
 Janderson:
 D contracts take this one step futher then C++.  They allow one to 
 decouple the contracts from the code itself:
I think D contracts have to be improved to become really useful. I have used them very little so far in D because of that. At the moment the only contracts I see useful in D are the class invariants. The good thing of contracts is that the compiler can use them to infer several useful things. You can look at Eiffel to see how much more useful they can become. If you want we can discuss a little about this topic, that seems too much ignored in the last discussion that pass in this newsgroup.
What sort of things? I think improving D's contracts is a great idea!
 Bye,
 bearophile
Nov 05 2008
parent Janderson <ask me.com> writes:
Janderson wrote:
 bearophile wrote:
 Janderson:
 D contracts take this one step futher then C++.  They allow one to 
 decouple the contracts from the code itself:
I think D contracts have to be improved to become really useful. I have used them very little so far in D because of that. At the moment the only contracts I see useful in D are the class invariants. The good thing of contracts is that the compiler can use them to infer several useful things. You can look at Eiffel to see how much more useful they can become. If you want we can discuss a little about this topic, that seems too much ignored in the last discussion that pass in this newsgroup.
What sort of things. I think improving D's contracts is a great idea!
BTW, I find contracts very useful for validating input to functions and validating mathematical problems, i.e. an area which is not addressed well by invariants.
 
 Bye,
 bearophile
Nov 05 2008
prev sibling next sibling parent reply Christopher Wright <dhasenan gmail.com> writes:
Janderson wrote:
 Tony wrote:
  > "Janderson" <ask me.com> wrote in message 
 news:gepsn2$21jr$1 digitalmars.com...
  >> Tony wrote:
  >>> Let me be facetious with Janderson's list plz...
  >>>
  >>> "Janderson" <ask me.com> wrote in message 
 news:ge8tpd$1f6b$1 digitalmars.com...
  >>>> Hi,
  >>>>
  >>>> I was talking with some collages at work and they asked me how D 
 enforces good programming practices.   For course I mentioned a couple 
 of the ones I knew of hand -
  >>>>
  >>>> - Unit checking
  >>> Not sure what is meant by this, but it sounds minor.
  >> Sure C++ can do unit checking, but its not built in.  You have to 
 use macros or templates in something that is not really designed to work 
 correctly with the language.  Even if you ignore that there's a barrior 
 to entry by not having something like this in the language.  By having 
 it in the language good coding practices are encouraged.
  >
  > I write unit tests. I don't know why I'd need or want language 
 support for that.
 
 
 What api do you use?  All the api's I've used are not as nice as the 
 built in one for D.
This is a joke, right? The code you need to write for a unittest is a lot smaller for D's builtin unittests than for any other testing system I've seen. That's the only category in which D's unittests win. This is mostly the fault of the unittest runners in Tango and Phobos, I'd wager, but you still won't be able to name unittests without changing the compiler, and that's a major issue, in my opinion.
Nov 05 2008
next sibling parent Robert Fraser <fraserofthenight gmail.com> writes:
Christopher Wright wrote:
 This is mostly the fault of the unittest runners in Tango and Phobos, 
 I'd wager, but you still won't be able to name unittests without 
 changing the compiler, and that's a major issue, in my opinion.
A while ago I did some work regarding this (the beginnings of it can be seen in the Descent trunk). In a nutshell, it is possible to get D's unit tests working just like any unit testing system (i.e. run specific tests, run them all & track failures, etc., etc.), but it would be a huge project. In that framework, all tests were assigned an automatic name based on their scope/namespace. So if you had module foo.bar with class Baz, the unit tests in that class would be named foo.bar.Baz.0, foo.bar.Baz.1, etc., based on their lexical ordering. I also added the ability to name a test by adding mixin(TestName!"whatever"), but your tests would work with the framework without any modification if you so chose. However, all this depended on Flectioned. As Flectioned got out of date, it became less attractive to work on the framework, since I would basically need to re-write/maintain Flectioned along with the test runner.
Nov 05 2008
prev sibling next sibling parent Janderson <ask me.com> writes:
Christopher Wright wrote:
 Janderson wrote:
 Tony wrote:
  > "Janderson" <ask me.com> wrote in message 
 news:gepsn2$21jr$1 digitalmars.com...
  >> Tony wrote:
  >>> Let me be facetious with Janderson's list plz...
  >>>
  >>> "Janderson" <ask me.com> wrote in message 
 news:ge8tpd$1f6b$1 digitalmars.com...
  >>>> Hi,
  >>>>
  >>>> I was talking with some collages at work and they asked me how D 
 enforces good programming practices.   For course I mentioned a couple 
 of the ones I knew of hand -
  >>>>
  >>>> - Unit checking
  >>> Not sure what is meant by this, but it sounds minor.
  >> Sure C++ can do unit checking, but its not built in.  You have to 
 use macros or templates in something that is not really designed to 
 work correctly with the language.  Even if you ignore that there's a 
 barrior to entry by not having something like this in the language.  
 By having it in the language good coding practices are encouraged.
  >
  > I write unit tests. I don't know why I'd need or want language 
 support for that.


 What api do you use?  All the api's I've used are not as nice as the 
 built in one for D.
This is a joke, right? The code you need to write for a unittest is a lot smaller for D's builtin unittests than for any other testing system I've seen. That's the only category in which D's unittests win. This is mostly the fault of the unittest runners in Tango and Phobos, I'd wager, but you still won't be able to name unittests without changing the compiler, and that's a major issue, in my opinion.
That's a good point. I've never really used the advanced unit-test features of other unit test programs. All I really used was sets of tests which I could run automatically. D provides this without all the rest of the cruft that comes along with the APIs I've used. Note I'm referring to implementation details, not features. For small projects I'm not going to go out of my way to install unit tests, but having them already there means that I can still use them. -Joel
Nov 05 2008
prev sibling parent Walter Bright <newshound1 digitalmars.com> writes:
Christopher Wright wrote:
 The code you need to write for a unittest is a lot smaller for D's 
 builtin unittests than for any other testing system I've seen. That's 
 the only category in which D's unittests win.
Oh, I fully agree that D's unittests are not an advanced or comprehensive framework for unit testing. But D's builtin ones have a huge advantage - they are built in. That seems to make an awful lot of difference in encouraging the writing of unit tests, whereas most C/C++ projects I've run across have no unit tests at all. I know that unit tests have made for a big improvement in the quality of Phobos.
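
For reference, this is about all it takes (a trivial made-up example):

int square(int x)
{
    return x * x;
}

unittest
{
    assert(square(3) == 9);
    assert(square(-4) == 16);
}

Compile with -unittest and the tests run automatically before main().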
Nov 06 2008
prev sibling parent reply "Tony" <tonytech08 gmail.com> writes:
"Janderson" <ask me.com> wrote in message 
news:gesi8g$4ii$3 digitalmars.com...
 Tony wrote:
 "Janderson" <ask me.com> wrote in message
news:gepsn2$21jr$1 digitalmars.com...
 Tony wrote:
 Let me be facetious with Janderson's list plz...

 "Janderson" <ask me.com> wrote in message
news:ge8tpd$1f6b$1 digitalmars.com...
 Hi,

 I was talking with some collages at work and they asked me how D
enforces good programming practices. For course I mentioned a couple of the ones I knew of hand -
 - Unit checking
Not sure what is meant by this, but it sounds minor.
Sure C++ can do unit checking, but its not built in. You have to
use macros or templates in something that is not really designed to work correctly with the language. Even if you ignore that there's a barrior to entry by not having something like this in the language. By having it in the language good coding practices are encouraged.
 I write unit tests. I don't know why I'd need or want language
support for that. What api do you use? All the api's I've used are not as nice as the built in one for D.
In-house/proprietary, but it's hardly anything automated such as I'm sure there are commercial offerings for. I'm not close enough to deployment to worry about detailed testing: there are still some architectural issues to solve for higher level framework components.
 - Design by contract
Overblown concept, but can be done with C++ also to a more than
adequate degree (heard of assertions?).
 Yes this can be done with C++ but D takes it it many steps further.
Like I said, I find the techniques more important that some
implementation of them (mechanism rather than policy?). D contracts take this one step futher then C++. They allow one to decouple the contracts from the code itself: long square_root(long x) in { assert(x >= 0); } out (result) { assert((result * result) <= x && (result+1) * (result+1) >= x); } body { return cast(long)std.math.sqrt(cast(real)x); }
Or if one wanted something like that in C++:

class MyInvariant
{
public:
    explicit MyInvariant(long& x) : x_(x)
    {
        // do a check on entry, e.g. assert(x_ >= 0);
    }

    ~MyInvariant()
    {
        // do a check on exit
    }

private:
    long& x_;   // keep the reference so the exit check can see the value
};
 D also has better compile time messages.  It also support static asserts.


 - Invariant checks
Part of DbC concepts. See Koenig and Moo's array example in
"Accelerated C++". Which, btw, leads me to believe that there are few instances "where the stars line up just right" for invariant checking to be useful.
 Invariant checks can be done in C++ but its very unweildly.  It is
very annoying to have to instruct each function with scope guards. D encourages good invariant checking by making it easy.
 But again, I am thinking that the scenarios where invariants can be
established is a very small subset of classes.

 I think the best programmers use invariant checks a whole lot.
If one has a lot of opportunity to use invariants, I'll bet it has to do with the domain. Such as numerical programming: that would seem to have a lot of those opportunities for use of invariant checking. In other domains (mainstream GUI dev?), much less so or even rarely so.
 - Stronger const
Insignificant. I still use many #defines just because I know that
const vars take space and #defines are a pre-compile-time thing (yes, I value the preprocessor for some uses, this being one of them).
 Actually any good compiler will inline const variables but I'm not
talking about those sort of const. Also you pay 2 costs for using #define:
 1) its not typesafe
 2) it adds to your compile time because the pre-processor has to do
more.
 Coming from the Windows world, one isn't "afraid" of 1 above
whatsoever. Compile time? CPUs are evolving faster than I'll ever be able to outpace them with the complexity or volume of my software. I guess for u compilation time isn't a problem. It is for me. Every large project I've worked on gets to a point where compile time is a problem. Partly because of use of code-generation (templates) which yes slow things down but is also generating a load of code I don't have to.
I'll bet it's a design problem rather than a template machinery problem. Templates are way overused IMO, and few people know how to or just don't bother architecting nice generic classes and functions.
 Also #defines really arn't a problem for cpu at compile time, your right 
 there.  I'm trying to point out that using #defines for const is totally 
 rediculus (sorry for being so harsh).  No C++ book or expert would 
 recommend it and it doesn't result in any run-time optimisation what so 
 ever.
Obviously I take language feature "recommendations" with a grain of salt. I do #define because I've never had any problem with it (and look at the Windows header files sometime!) and #defines don't create a data object in memory. It's simply never been a problem. Now if one is "hell bent/anal" about doing away with the preprocessor, power to them; "it ain't me" though. I don't want the template machinery taking over the capability of the preprocessor: I use it to mutate the language and experiment. I will probably implement a preprocessor to replace or add to what I have with C++ before I jump into compiler development for my language that is evolving.
 I'm talking about the const you put in function declarations which
are very important.
 - Modules
If that means doing away with header files, I don't think I like
it. I rely on headers as the engineer's blueprint (of course you have to write very clean code to have that make sense).
 I think this means you simply haven't run up against any of the
problems with header files.
 That's probably exactly what it means. But maybe I'm tooling up to
write utility software rather than large scale software. That could be so. D does seem targeted at large scale software. For me that's great. That's the thing with C verse C++. I think C is great until you have to write something large that is maintainable. C++/D has a lot of scaffolding and that only starts to pay dividends when the code gets larger. At some point a C++ program will be smaller then a C program.
 - Garbage collection
That's a major deal breaker for me.
Garbage collection can be turned off in D if you don't need it.
So I've been told. But I think the default should be to include it
when you like rather than the other way around. Obviously, I can use a GC library in C++ if I was so inclined. This is a good debatable point. I don't agree but that's just because of all the headakes I've had to track down with manual management.
 However for me (even when performance is very important and a game
programmer) I can deal with it.
 - No automatic copy constructor
Can't comment.
I'd encourage you to read "See C++ Coding Standards: 101 Rules,
Guidelines, and Best Practices (C++ In-Depth Series)" by Herb Sutter, Andrei Alexandrescu

http://www.amazon.com/Coding-Standards-Guidelines-Practices-Depth/dp/0321113586
 one of the only books that has Bjarne Stroustrups seal of approval.
 Effective C++ is another good read.
Been there a number of times. Automatic copy constructor issue didn't
catch my eye though obviously. In C++ its standard practice by most programmers to disable the copy constructor for many of the classes they create. Some companies it mandatory to either disable it or implement one.
I do that too: I "disable" (declare private and don't supply an implementation) the compiler-called class functions by default when designing a class, and put them back if they are needed.
 - More restrictive operators
I'm not really concerned about that. I'd avoid them unless doing
numerical programming.
 The point here is that in C++ operators where used for all sorts of
things that they where not designed for. This makes code hard to follow.
 I whole heartedly agree!

 D restricts operators making them more difficult to use for
something they are not really designed for, ie encouraging better design.
 Minor. I know when not to use operators (read: hardly ever!).
For you this might not be a useful feature however for many others it is, and that is who I was address when I wrote this document. With small apps with short development cycles you have intimate knowledge about everything that makes it tick. With large apps you have to communicate though code. A language that enforces some sort of standards and documentation (ie even key words as simple as interface) will ultimately help others with that communication. Also when I go back to the code in 6months I will have a better understanding of it because the compiler prevented me from writting something that was nonsensical.
 - Specific constructs such as Interfaces
C++ has interfaces. Should it be a keyword? Maybe. How are D's
interfaces different from C++'s?
 C++ has interfaces which can easily become abstractions.  This is
not good. By saying something is an interface, your documenting -> this is an interface don't change me. Its much better when the code can enforce rules rather then by just comments. You look at an interface in D and you know its an interface, C++ you have to read though the code or hope someone has documented it. I'm not the best explainer in the world so maybe someone else can explain this better.
I guess for D it is a bigger issue because header files are frowned upon? The header file in C++ is the first level of documentation, and if you code cleanly, maybe all that is required. Combine that with some design/architecture description and most of the documentation chore is done.
 I use naming standards for interfaces: iSomeClass, for example. I'm
not sure what problem D's interfaces solve. I find no problem with C++'s interface techniques.

 Compiler checked documentation.  This is an interface and that's what it 
 is.  Don't make it abstract.  It a built in naming and enforced 
 convention.  Also I know there's lots of heat around styles however having 
 things like "i" in front of classes means that you can't change what that 
 thing is (well you can in you have access to the user-code and what to do 
 a find-replace).

 - More restrictive casting
Ouch!! I prefer to slice bread with a knife rather than having a
machine do it. (Bad analogy, but y'all get the point).
 I'd rather the machine catch something at compile time rather then
runtime.
 As long as I'm not prevented from doing casting I know is safe, it's
fine. That's the point of casts right :) You can always cast to what you want in D however the more dangerous ones can be slightly harder to do.
 - No C style Macros
Implementing a template or template system with a good preprocessor
is something completely different than macros. I value the preprocessor for such uses (I wish it was more powerful than in C++ though).
 Macros in C++, powerful yes but I think they are over used.
Macros and using the preprocessor as a template machine are apples
and oranges. Every use of the the C++ preprocessor does not fit the definition of "macro", thought everyone pounces on the obvious as you do below:
 D has a replacement for the macro system which is more powerful.
Most of which is not done in the preprocess. I've seen more horrible C++ macros then I can count. They are definably not a good practice. Any good C++ books will talk about "Macro side effects". They arn't typesafe, there's a possibility of having an operation performed in a place you don't expect. They don't work will on multilines (i hate \ because its not maintainable and error prone). Doing things like string operations are just non-intuitive.
 The other bad thing about macros is they are extremely difficult to
debug.
 D provides many of the functionalities of macros in a nicer form:
 - Better Templates
 - Mixins
 - Version

 Site note: About ~80% of macros I've seen in C++ could have been
done with templates in C++ and they would have been much better in so many ways.
Again though, "macros" and "preprocessor-implemented templates" are not the same thing, but I'd rather not go round and round on this. Tony
Nov 06 2008
next sibling parent reply Christopher Wright <dhasenan gmail.com> writes:
Tony wrote:
 Or if one wanted something like that in C++:
 
 class MyInvariant
 {
     MyInvariant(long& x)
     {
         // do a check on entry
     }
 
     ~MyInvariant()
     {
         // do a check on exit
     }
 };
There are a couple problems with that solution:

1. The assertions are nowhere near the code they apply to. This makes it more difficult to read the code -- both the assertions and the method they apply to.
2. It's more code to write. That makes it less likely that you're going to write it. In D, I use contracts because they're so easy to use. Your C++ alternative isn't nearly as easy, so I wouldn't use it.
3. There is no straightforward, simple way to remove these contracts in a release build, if I want extra efficiency. You have to use #ifdef DEBUG in a lot of places to get that effect.
4. There's no consistent structure. This also hurts readability.

Personally, I really hate the idea of using RAII for this kind of thing. It's just such a hack. I only want to use RAII for things that should only live for the length of a given function.

Of course, C++ is Turing complete, so there's no feature in D that you can't replicate in C++. In the worst case, you can alter DMD (which is written in C++) to compile and run your D program.
Nov 06 2008
parent Walter Bright <newshound1 digitalmars.com> writes:
Christopher Wright wrote:
 Of course, C++ is Turing complete, so there's no feature in D that you 
 can't replicate in C++. In the worst case, you can alter DMD (which is 
 written in C++) to compile and run your D program.
The same goes for C. I've seen professional code that did OOP in C using manually generated function pointer tables.
Nov 06 2008
prev sibling next sibling parent reply "Bill Baxter" <wbaxter gmail.com> writes:
On Fri, Nov 7, 2008 at 6:57 AM, Tony <tonytech08 gmail.com> wrote:

 Obviously I take language feature "recommendation" with a grain of salt. I
 do #define because I've never had any problem with it (and look at the
 Windows header files sometime!) and #defines don't create a data object in
 memory. It's simply never been a problem.
Ack! Windows header files are probably the best example in the world of why using the preprocessor is a BAD idea! The main reason the preprocessor is evil is because it completely ignores scoping rules. I guess you've never had the misfortune of trying to name a method inside a class "GetObject", or any of the other various common words and phrases the Windows headers #define. --bb
Nov 06 2008
parent reply Walter Bright <newshound1 digitalmars.com> writes:
Bill Baxter wrote:
  The main reason the preprocessor is evil is because it does
 completely ignores scoping rules.
It's worse than that. The preprocessor is a completely separate and distinct language from the rest of C++. They share nothing. The symbol tables are distinct and inaccessible between them. The tokens are different; even the rules for parsing expressions are different. They originally even were separate programs.
Nov 06 2008
parent Bruno Medeiros <brunodomedeiros+spam com.gmail> writes:
Walter Bright wrote:
 Bill Baxter wrote:
  The main reason the preprocessor is evil is because it does
 completely ignores scoping rules.
It's worse than that. The preprocessor is a completely separate and distinct language from the rest of C++. They share nothing. The symbol tables are distinct and inaccessible between them. The tokens are different; even the rules for parsing expressions are different.
And it makes IDE functionality much more difficult to implement... ;) -- Bruno Medeiros - Software Developer, MSc. in CS/E graduate http://www.prowiki.org/wiki4d/wiki.cgi?BrunoMedeiros#D
Nov 07 2008
prev sibling next sibling parent reply Janderson <ask me.com> writes:
Tony wrote:

 long square_root(long x)
     in
     {
     assert(x >= 0);
     }
     out (result)
     {
     assert((result * result) <= x && (result+1) * (result+1) >= x);
     }
     body
     {
     return cast(long)std.math.sqrt(cast(real)x);
     }
Or if one wanted something like that in C++: class MyInvariant { MyInvariant(long& x) { // do a check on entry } ~MyInvariant() { // do a check on exit } };
You should know that the syntax I presented above is not an invariant in D (that's a contractual check). An invariant check in D looks like this:

class Foo
{
    public void f() { }
    private void g() { }

    invariant()
    {
        // Checked for both f and g
    }
}

The invariant example was what I originally said in my first reply was a lot of extra work in C++. Note I used invariant checks a lot in C++ and D's invariant checks are far better.

You might want to read through the D documentation so that what I've said makes more sense. You seem to reply only half reading what I've said (case in point above). The documentation can be found here: http://www.digitalmars.com/d/2.0/lex.html and here for 1.0 http://www.digitalmars.com/d/1.0/lex.html

-Joel
Nov 06 2008
parent "Tony" <tonytech08 gmail.com> writes:
"Janderson" <ask me.com> wrote in message 
news:gf0e9n$31ff$1 digitalmars.com...
 Tony wrote:

 long square_root(long x)
     in
     {
     assert(x >= 0);
     }
     out (result)
     {
     assert((result * result) <= x && (result+1) * (result+1) >= x);
     }
     body
     {
     return cast(long)std.math.sqrt(cast(real)x);
     }
Or if one wanted something like that in C++: class MyInvariant { MyInvariant(long& x) { // do a check on entry } ~MyInvariant() { // do a check on exit } };
You should know that the syntax I presented above is not an invariant in D (that's a contractual check). An invariant check in D looks like this: class Foo { public void f() { } private void g() { } invariant() { //Checked for both f and g } } The invariant example was what I originally said in my first reply was a lot of extra work in C++. Note I used invariant checks a lot in C++ and D's invariant checks are far better. You might want to read though the D documentation so that what I've said makes more sense. You seem to reply only half reading what I've said (case-in-point above). The documentation can be found here: http://www.digitalmars.com/d/2.0/lex.html and here for 1.0 http://www.digitalmars.com/d/1.0/lex.html
I need a large list of examples where invariants were used because I just don't see many opportunities to use them in my own codebase. Also, they seem mainly useful as a development aid rather than as production code (similar to turning off assertions in release code). Tony
Nov 07 2008
prev sibling next sibling parent reply Janderson <ask me.com> writes:
Tony wrote:
 In C++ its standard practice by most programmers to disable the copy 
 constructor for many of the classes they create.  Some companies it 
 mandatory to either disable it or implement one.
I do that too: I "disable" (declare private and don't supply an implementation") the compiler-called class functions by default when designing a class and putting them back if they are needed.
You are repeating what I just said. The point is that in D it's opt-in rather than opt-out, which is the point of the original thread, "improve design practices". In C++, if you didn't know you had to do that, it's something you'd need to learn; in D it's not.

C++ is a huge language, and not many know the entire language. Case in point: you didn't know what delegates were, yet many C++ programmers use them frequently. It's better if the language makes it easy rather than requiring the programmer to do something extra to be correct. Just like expecting an email program to spell-check your emails; modern languages should do the same.

-Joel
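
P.S. To illustrate the copy constructor point with a toy example: in C++ you have to remember to write something like "private: Foo(const Foo&);" in every class you don't want copied, whereas a D class has no implicit object copy to forget about in the first place:

class Foo
{
    int x;
}

void main()
{
    Foo a = new Foo;
    Foo b = a;          // copies the reference, not the object
    b.x = 5;
    assert(a.x == 5);   // a and b are the same object
}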
Nov 06 2008
next sibling parent reply "Bill Baxter" <wbaxter gmail.com> writes:
 C++ is a huge language, and not many know the entire language.  Case in
 point, you didn't know what Delegates where yet many C++ programmers use
 them frequently.  Its better if the language makes it easy rather then
 requiring the programmer to do something to be correct.  Just like expecting
 an email program to have spell check your emails.  Modern languages should
 do the same.
C++ doesn't have "delegates". It has member function pointers. I don't think that's changed. boost::bind (now std::tr1::bind in some places) gives you a way to bundle a member function pointer with an object pointer in a delegate-like way, but I don't think anybody calls those delegates. At least they didn't used to. I don't know who came up with the word "delegate" but I find it to be a terrible match for what they actually are. - "one appointed or elected to represent others"? It's a kind of a stretch. [/rant] --bb
Nov 06 2008
next sibling parent Janderson <ask me.com> writes:
Bill Baxter wrote:
 C++ is a huge language, and not many know the entire language.  Case in
 point, you didn't know what Delegates where yet many C++ programmers use
 them frequently.  Its better if the language makes it easy rather then
 requiring the programmer to do something to be correct.  Just like expecting
 an email program to have spell check your emails.  Modern languages should
 do the same.
C++ doesn't have "delegates". It has member function pointers. I don't think that's changed. boost::bind (now std::tr1::bind in some places) gives you a way to bundle a member function pointer with an object pointer in a delegate-like way, but I don't think anybody calls those delegates. At least they didn't used to. I don't know who came up with the word "delegate" but I find it to be a terrible match for what they actually are. - "one appointed or elected to represent others"? It's a kind of a stretch. [/rant] --bb
http://www.codeguru.com/cpp/cpp/cpp_mfc/pointers/article.php/c4135

Delegates in D are a little different from the original delegates, which "delegate" to different methods (i.e. they could have more than one destination). D delegates are more like C++ functors (which aren't built in either).

-Joel
Nov 06 2008
prev sibling parent reply Walter Bright <newshound1 digitalmars.com> writes:
Bill Baxter wrote:
 C++ doesn't have "delegates".  It has member function pointers.  I
 don't think that's changed.  boost::bind (now std::tr1::bind in some
 places) gives you a way to bundle a member function pointer with an
 object pointer in a delegate-like way, but I don't think anybody calls
 those delegates.  At least they didn't used to.
The difference between D delegates and boost::bind for member functions is that D delegates bind to the specific virtual function when the delegate is created, while boost::bind binds when the delegate is called. The former is, of course, more efficient when the delegate gets called more than once.
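
To illustrate (a made-up example): taking the address of a virtual method through a base reference does the vtbl lookup right there, and the resulting delegate keeps calling that resolved function:

class Base
{
    int f() { return 1; }
}

class Derived : Base
{
    override int f() { return 2; }
}

void main()
{
    Base b = new Derived;
    auto dg = &b.f;      // virtual lookup happens here, once
    assert(dg() == 2);   // every call goes straight to Derived.f
}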
Nov 06 2008
parent reply Michel Fortin <michel.fortin michelf.com> writes:
On 2008-11-07 02:42:20 -0500, Walter Bright <newshound1 digitalmars.com> said:

 The difference between D delegates and boost::bind for member functions 
 is that D delegates bind to the specific virtual function when the 
 delegate is created, while boost::bind binds when the delegate is 
 called. The former is, of course, more efficient when the delegate gets 
 called more than once.
But the latter makes it possible to call the same member function on various object instances (which may resolve to different code for virtual functions). I find that capability lacking in D. -- Michel Fortin michel.fortin michelf.com http://michelf.com/
Nov 09 2008
parent reply Christopher Wright <dhasenan gmail.com> writes:
Michel Fortin wrote:
 On 2008-11-07 02:42:20 -0500, Walter Bright <newshound1 digitalmars.com> 
 said:
 
 The difference between D delegates and boost::bind for member 
 functions is that D delegates bind to the specific virtual function 
 when the delegate is created, while boost::bind binds when the 
 delegate is called. The former is, of course, more efficient when the 
 delegate gets called more than once.
But the later makes it possible to call the same member function on various object instances (which may resolve to different code for virtual functions). I find that capability lacking in D.
You can do it in D, but only with templates. And it's ugly. I must admit, I've never encountered a situation in which I wanted a pointer to a member function. What situations did you encounter this in? Why were, say, interfaces insufficient?
Nov 09 2008
next sibling parent Janderson <ask me.com> writes:
Christopher Wright wrote:
 Michel Fortin wrote:
 On 2008-11-07 02:42:20 -0500, Walter Bright 
 <newshound1 digitalmars.com> said:

 The difference between D delegates and boost::bind for member 
 functions is that D delegates bind to the specific virtual function 
 when the delegate is created, while boost::bind binds when the 
 delegate is called. The former is, of course, more efficient when the 
 delegate gets called more than once.
But the later makes it possible to call the same member function on various object instances (which may resolve to different code for virtual functions). I find that capability lacking in D.
You can do it in D, but only with templates. And it's ugly. I must admit, I've never encountered a situation in which I wanted a pointer to a member function. What situations did you encounter this in? Why were, say, interfaces insufficient?
I've found it useful to call different object instances with the same member function pointer on very few occasions in C++. However, I'd point out that interfaces in C++ mean you have to change another file (or use a bolt-on). So sometimes it's difficult to modify or even wrap another API's class in that way. In these cases (and others) delegates (or member function pointers) are better than interfaces. -Joel
Nov 09 2008
prev sibling parent Michel Fortin <michel.fortin michelf.com> writes:
On 2008-11-09 09:04:00 -0500, Christopher Wright <dhasenan gmail.com> said:

 Michel Fortin wrote:
 On 2008-11-07 02:42:20 -0500, Walter Bright <newshound1 digitalmars.com> said:
 
 The difference between D delegates and boost::bind for member functions 
 is that D delegates bind to the specific virtual function when the 
 delegate is created, while boost::bind binds when the delegate is 
 called. The former is, of course, more efficient when the delegate gets 
 called more than once.
But the later makes it possible to call the same member function on various object instances (which may resolve to different code for virtual functions). I find that capability lacking in D.
You can do it in D, but only with templates. And it's ugly. I must admit, I've never encountered a situation in which I wanted a pointer to a member function. What situations did you encounter this in? Why were, say, interfaces insufficient?
In the D/Objective-C bridge, when I receive a call from the Objective-C side and I need to dispatch it to the corresponding method of the given D object. -- Michel Fortin michel.fortin michelf.com http://michelf.com/
Nov 12 2008
prev sibling next sibling parent reply "Jarrett Billingsley" <jarrett.billingsley gmail.com> writes:
On Thu, Nov 6, 2008 at 11:32 PM, Bill Baxter <wbaxter gmail.com> wrote:
 C++ is a huge language, and not many know the entire language.  Case in
 point, you didn't know what Delegates where yet many C++ programmers use
 them frequently.  Its better if the language makes it easy rather then
 requiring the programmer to do something to be correct.  Just like expecting
 an email program to have spell check your emails.  Modern languages should
 do the same.
C++ doesn't have "delegates". It has member function pointers. I don't think that's changed. boost::bind (now std::tr1::bind in some places) gives you a way to bundle a member function pointer with an object pointer in a delegate-like way, but I don't think anybody calls those delegates. At least they didn't used to. I don't know who came up with the word "delegate" but I find it to be a terrible match for what they actually are. - "one appointed or elected to represent others"? It's a kind of a stretch. [/rant]
I think the name comes from C#, where delegates are combined with a signals and slots implementation (events). So you can think of a delegate as not being a method itself, but rather a representative to all the objects+methods that have subscribed to it. It's still not really a good fit ;) But you make do with what you have.
Nov 06 2008
next sibling parent Janderson <ask me.com> writes:
Jarrett Billingsley wrote:
 On Thu, Nov 6, 2008 at 11:32 PM, Bill Baxter <wbaxter gmail.com> wrote:
 C++ is a huge language, and not many know the entire language.  Case in
 point, you didn't know what Delegates where yet many C++ programmers use
 them frequently.  Its better if the language makes it easy rather then
 requiring the programmer to do something to be correct.  Just like expecting
 an email program to have spell check your emails.  Modern languages should
 do the same.
C++ doesn't have "delegates". It has member function pointers. I don't think that's changed. boost::bind (now std::tr1::bind in some places) gives you a way to bundle a member function pointer with an object pointer in a delegate-like way, but I don't think anybody calls those delegates. At least they didn't used to. I don't know who came up with the word "delegate" but I find it to be a terrible match for what they actually are. - "one appointed or elected to represent others"? It's a kind of a stretch. [/rant]
delegates combined with a signals and slots implementation. So you can think of a delegate as not being a method itself, but rather a representative to all the objects+methods that have subscribed to it. It's still not really a good fit ;) But you make do with what you have.
also the "Delegation pattern" which offers another clue: http://en.wikipedia.org/wiki/Delegation_pattern -Joel
Nov 06 2008
prev sibling parent reply Christopher Wright <dhasenan gmail.com> writes:
Jarrett Billingsley wrote:


 delegates combined with a signals and slots implementation.  So you
 can think of a delegate as not being a method itself, but rather a
 representative to all the objects+methods that have subscribed to it.
 It's still not really a good fit ;)  But you make do with what you
 have.
C# has delegates built into the language (backed by System.Delegate). A delegate is pretty much the same as it is in D. An event is a collection of delegates that can be called as one function. You can add and remove delegates from it. Delegates are used in the signals and slots implementation, but so what? You can get the same in D.
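A rough C++ sketch of that "collection of delegates called as one" idea (names are illustrative only, using boost rather than any particular signals/slots library):

#include <iostream>
#include <vector>
#include <boost/bind.hpp>
#include <boost/function.hpp>

struct Listener {
    void onTick() { std::cout << "tick\n"; }
};

int main() {
    // An "event" modelled as a list of delegate-like callables:
    // handlers can be added (and removed) and are all fired together.
    std::vector<boost::function<void()> > handlers;

    Listener a, b;
    handlers.push_back(boost::bind(&Listener::onTick, &a));
    handlers.push_back(boost::bind(&Listener::onTick, &b));

    for (std::size_t i = 0; i < handlers.size(); ++i)
        handlers[i]();   // every subscriber gets the call
}

D's delegates plus std.signals (or simply an array of delegates) give the equivalent, which is the point being made above.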
Nov 07 2008
parent reply "Jarrett Billingsley" <jarrett.billingsley gmail.com> writes:
On Fri, Nov 7, 2008 at 11:13 PM, Christopher Wright <dhasenan gmail.com> wrote:
 Jarrett Billingsley wrote:


 delegates combined with a signals and slots implementation.  So you
 can think of a delegate as not being a method itself, but rather a
 representative to all the objects+methods that have subscribed to it.
 It's still not really a good fit ;)  But you make do with what you
 have.
(System.Delegate). A delegate is pretty much the same as it is in D. An event is a collection of delegates that can be called as one function. You can add and remove delegates from it. Delegates are used in the signals and slots implementation, but so what? You can get the same in D.
All I was trying to do was explain where the name "delegate" may have come from.
Nov 07 2008
parent Christopher Wright <dhasenan gmail.com> writes:
Jarrett Billingsley wrote:
 On Fri, Nov 7, 2008 at 11:13 PM, Christopher Wright <dhasenan gmail.com> wrote:
 Jarrett Billingsley wrote:


 delegates combined with a signals and slots implementation.  So you
 can think of a delegate as not being a method itself, but rather a
 representative to all the objects+methods that have subscribed to it.
 It's still not really a good fit ;)  But you make do with what you
 have.
(System.Delegate). A delegate is pretty much the same as it is in D. An event is a collection of delegates that can be called as one function. You can add and remove delegates from it. Delegates are used in the signals and slots implementation, but so what? You can get the same in D.
All I was trying to do was explain where the name "delegate" may have come from.
Ah, sorry. /me gives an embarrassed shrug. Wikipedia mentions this 1986 paper: http://web.media.mit.edu/~lieber/Lieberary/OOP/Delegation/Delegation.html It speaks of delegating messages to other objects, using that term.
Nov 08 2008
prev sibling parent "Tony" <tonytech08 gmail.com> writes:
"Janderson" <ask me.com> wrote in message 
news:gf0fkd$4qh$1 digitalmars.com...
 Tony wrote:
 In C++ it's standard practice by most programmers to disable the copy 
 constructor for many of the classes they create.  Some companies make it 
 mandatory to either disable it or implement one.
I do that too: I "disable" (declare private and don't supply an implementation) the compiler-called class functions by default when designing a class, and put them back if they are needed.
You are repeating what I just said. The point is that in D it's opt-in rather than opt-out, which is the point of the original thread: "improve design practices". In C++, if you didn't know you had to do that, it's something you'd need to learn. In D it's not.
That's good, but it's the major features and their implementations that bug me, not the dozen or hundred nuisance things.
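For reference, the "disable the compiler-generated functions" idiom being described looks roughly like this in pre-C++0x code (the class name is illustrative):

class Connection {
public:
    Connection() {}
    // ... the class's real interface ...

private:
    // Declared private and never defined: any attempt to copy or assign
    // a Connection fails to compile (or to link, if attempted from a
    // member or friend).
    Connection(const Connection&);
    Connection& operator=(const Connection&);
};

Deriving from boost::noncopyable is a common shorthand for the same thing; the contrast being drawn above is that D makes this opt-in rather than opt-out.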
 C++ is a huge language, and not many know the entire language.  Case in 
 point, you didn't know what Delegates where
hehe, and still don't.
 yet many C++ programmers use them frequently.  Its better if the language 
 makes it easy rather then requiring the programmer to do something to be 
 correct.  Just like expecting an email program to have spell check your 
 emails.  Modern languages should do the same.
Tony
Nov 07 2008
prev sibling parent reply Janderson <ask me.com> writes:
Tony wrote:
 "Janderson" <ask me.com> wrote in message
 Also #defines really aren't a problem for the CPU at compile time, you're right
there.  I'm trying to point out that using #defines for constants is totally
ridiculous (sorry for being so harsh).  No C++ book or expert would recommend it,
and it doesn't result in any run-time optimisation whatsoever.
Obviously I take language feature "recommendations" with a grain of salt. I do #define because I've never had any problem with it (and look at the Windows header files sometime!) and #defines don't create a data object in memory. It's simply never been a problem. Now if one is "hell bent/anal" about doing away with the preprocessor, power to them; "it ain't me" though. I don't want the template machinery taking over the capability of the preprocessor: I use it to mutate the language and experiment. I will probably implement a preprocessor to replace or add to what I have with C++ before I jump into compiler development for my own evolving language.
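To make the trade-off concrete, a small sketch of the styles under discussion (identifier names are illustrative):

// Preprocessor constant: plain textual substitution, no scope, no type.
#define BUFFER_SIZE 256

// Typed alternatives; neither normally occupies storage at run time,
// because the compiler folds the value in where it is used.
const int BufferSize = 256;
enum { BufferSizeEnum = 256 };

char a[BUFFER_SIZE];      // all three work as compile-time array bounds
char b[BufferSize];
char c[BufferSizeEnum];

So the "no data object in memory" argument applies about equally to const and enum constants; the real differences are scoping, type checking, and visibility to the debugger.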
It seems you might need to think outside of the C++ box. There are much more powerful preprocessors than the one that comes with C++. In many respects D has a huge advantage with its pre-processing replacements. As to the Windows headers: these were designed for C, not C++. Even MFC is old and something Microsoft has not supported for a long time. It was created during the transition time when Microsoft was switching from C to C++. Look at what Microsoft uses now (managed C++ and the like): what do they no longer use? Look at C++0x: what are they changing, and what are they adding to reduce macro use? I'll leave this as an exercise to you because I fear that if I tell you, you'll just want to argue more. If you find it for yourself you might be more inclined to be open minded, or at least present some more informed arguments. -Joel
Nov 06 2008
parent reply "Tony" <tonytech08 gmail.com> writes:
"Janderson" <ask me.com> wrote in message 
news:gf0hf3$9nf$3 digitalmars.com...
 Tony wrote:
 "Janderson" <ask me.com> wrote in message
 Also #defines really arn't a problem for cpu at compile time, your 
 right there.  I'm trying to point out that using #defines for const is 
 totally rediculus (sorry for being so harsh).  No C++ book or expert 
 would recommend it and it doesn't result in any run-time optimisation 
 what so ever.
Obviously I take language feature "recommendation" with a grain of salt. I do #define because I've never had any problem with it (and look at the Windows header files sometime!) and #defines don't create a data object in memory. It's simply never been a problem. Now if one is "hell bent/anal" about "doing away with the preprocessor, power to them, "it ain't me" though. I don't want the template machinery taking over the capability of the preprocessor: I use it to mutate the language and experiment. I will probably implement a preprocessor to replace or add to what I have with C++ before I jump into compiler development for my language that is evolving.
It seems you might need to think outside of the C++ box.
That's an odd statement considering that I came here to investigate D and am defining what to put into my own language (or one I wish someone would implement) because of the issues I have with major "features" of C++ (and D).
 There are much more powerful preprocessors than the one that comes with C++.
And that's probably the first step I'll be taking: creating a preprocessor or pre-preprocessor.
 As to the Windows headers: these were designed for C, not C++.
Indeed.
 Even MFC is old and something Microsoft has not supported for a long time. 
 It was created during the transition time when Microsoft was switching 
 from C to C++.  Look at what Microsoft uses now (managed C++ and the like): 
 what do they no longer use?  Look at C++0x, what are they 
 changing, what are they adding to reduce macro use?

 I'll leave this as an exercise to you because I fear that if I tell you, 
 you'll just want to argue more.
I used to code professionally using MFC and Borland's OWL (in the mid-nineties), and the Win32 API directly before that. Believe me, I know what to hate about those things. Using #define is relatively minor compared to major design issues. Debating whether or not to use #define for constants is right up there with where to put opening braces. It's a waste of time.
 If you find it for yourself you might be more inclined to be more open 
 minded or at least present some more informed arguments.
You're the one arguing, not me. Time to look in the mirror dude. Tony
Nov 07 2008
parent reply Jesse Phillips <jessekphillips gmail.com> writes:
On Fri, 07 Nov 2008 16:09:58 -0600, Tony wrote:

 "Janderson" <ask me.com> wrote in message
 news:gf0hf3$9nf$3 digitalmars.com...
 Tony wrote:
 "Janderson" <ask me.com> wrote in message
 Also #defines really arn't a problem for cpu at compile time, your
 right there.  I'm trying to point out that using #defines for const
 is totally rediculus (sorry for being so harsh).  No C++ book or
 expert would recommend it and it doesn't result in any run-time
 optimisation what so ever.
Obviously I take language feature "recommendation" with a grain of salt. I do #define because I've never had any problem with it (and look at the Windows header files sometime!) and #defines don't create a data object in memory. It's simply never been a problem. Now if one is "hell bent/anal" about "doing away with the preprocessor, power to them, "it ain't me" though. I don't want the template machinery taking over the capability of the preprocessor: I use it to mutate the language and experiment. I will probably implement a preprocessor to replace or add to what I have with C++ before I jump into compiler development for my language that is evolving.
It seems you might need to think outside of the C++ box.
That's an odd statement considering that I came here to investigate D and am defining what to put into my own language (or one I wish someone would implement) because of the issues I have with major "features" of C++ (and D).
It sounds to me that D is not the "improvement" over C++ you are looking for. It sounds like you have done a lot of work in C++ and developed good coding practices for yourself, and probably for those you work with. I would guess that it has taken some time to develop the design habits you use. It seems to me that the improvements D gives are ones you have already worked out through convention in C++. D presents a much clearer path on what conventions should be used in programming. D provides other benefits, such as GC, that it appears you are not looking for. I say D is not for you because, from what I have read, you are looking to have features removed. I see complaint after complaint about D/C++ having something you don't want to use, but very little/nothing on what you actually want added to the language. Take GC, for example: rather than saying "I don't want it", you can go with "I want total control of memory management". If you still don't want to learn how to manipulate the D GC, then D is not for you. I might suggest not commenting on something you find unimportant: if "stronger const" is insignificant, then ignore it and move on.
Nov 07 2008
parent reply "Tony" <tonytech08 gmail.com> writes:
"Jesse Phillips" <jessekphillips gmail.com> wrote in message 
news:gf2j4a$187s$1 digitalmars.com...
 On Fri, 07 Nov 2008 16:09:58 -0600, Tony wrote:

 "Janderson" <ask me.com> wrote in message
 news:gf0hf3$9nf$3 digitalmars.com...
 Tony wrote:
 "Janderson" <ask me.com> wrote in message
 Also #defines really arn't a problem for cpu at compile time, your
 right there.  I'm trying to point out that using #defines for const
 is totally rediculus (sorry for being so harsh).  No C++ book or
 expert would recommend it and it doesn't result in any run-time
 optimisation what so ever.
Obviously I take language feature "recommendation" with a grain of salt. I do #define because I've never had any problem with it (and look at the Windows header files sometime!) and #defines don't create a data object in memory. It's simply never been a problem. Now if one is "hell bent/anal" about "doing away with the preprocessor, power to them, "it ain't me" though. I don't want the template machinery taking over the capability of the preprocessor: I use it to mutate the language and experiment. I will probably implement a preprocessor to replace or add to what I have with C++ before I jump into compiler development for my language that is evolving.
It seems you might need to think outside of the C++ box.
That's an odd statement considering that I came here to investigate D and am defining what to put into my own language (or one I wish someone would implement) because of the issues I have with major "features" of C++ (and D).
It sounds to me that D is not the "improvement" over C++ you are looking for.
That's not to say that there is nothing interesting about D, though; that's why I came in to investigate a bit.
 It sounds like you have done a lot of work in C++ and developed good
 coding practices for yourself and probably those you work with. I would
 guess that it has taken some time to develop the design habits you use.
Yep, to the point where the "everything including the kitchen sink" languages are getting kind of long in the tooth.
 It seems to me that the improvements D gives you have already worked out
 through convention in C++.
Probably some of them and there's always room for improvement.
 D presents a much clearer path on what
 conventions should be used in programming.
What I like about C++ is that it is more mechanism than policy.
 D provides other benefits,
 such as GC, that it appears you are not looking for.
Right.
 I say D is not for you because, from what I have read, you are looking to
 have features removed.
Not at all. I'm just creating a vision of what I'd like my ideal language to be. Given those things, I can assess if it's worth developing a new language or not. It's just research for the future when I may have more resources.
 I see complaint after complaint about D/C++ having
 something you don't want to use, but very little/none on what you
 actually want added to the language.
No complaints from me here: I'm not evaluating the language for usage anymore, but rather for elements that I'd put on my list of top language features and how they fit in (or not) with completely new features not in any language. Also, going "back and forth" with people on some of the features solidifies in my mind what kind of capability and what kind of implementation I would do. So I don't want anything added to D since I'm not a D user. I'd like some things added to C++ though, and some removed, but since that is like pulling teeth and the whole shebang is way more complex than necessary, a new language or a preprocessor to make it more palatable may be in order. What drew me to peek at D years ago was the comment that it was much easier to implement, and the template implementation example. I looked at it again lately because I had forgotten some things about it, such as what the object model was like.
 Take GC for example, rather than
 saying, "I don't want it" you can go with, "I want total control of
 memory management"
I thought that was pretty much clear (both/either).
  If you still don't want to learn how to manipulate the
 D GC, then D is not for you.
I pretty much knew that after my reread of the website. Reading responses in defense of the features though does prompt some deeper thoughts about them.
 I might suggest not commenting on something you find unimportant, if
 "stronger const" is insignificant, then ignore it and move on.
Stronger const is good. It's just not one of those "deal maker/breaker" things, like the object model for example (something I am still going to look at more closely in D). Tony
Nov 08 2008
parent Jesse Phillips <jessekphillips gmail.com> writes:
On Sat, 08 Nov 2008 03:48:01 -0600, Tony wrote:

 "Jesse Phillips" <jessekphillips gmail.com> wrote in message
 news:gf2j4a$187s$1 digitalmars.com...
 On Fri, 07 Nov 2008 16:09:58 -0600, Tony wrote:

 "Janderson" <ask me.com> wrote in message
 news:gf0hf3$9nf$3 digitalmars.com...
 Tony wrote:
 "Janderson" <ask me.com> wrote in message
 Also #defines really arn't a problem for cpu at compile time, your
 right there.  I'm trying to point out that using #defines for
 const is totally rediculus (sorry for being so harsh).  No C++
 book or expert would recommend it and it doesn't result in any
 run-time optimisation what so ever.
Obviously I take language feature "recommendation" with a grain of salt. I do #define because I've never had any problem with it (and look at the Windows header files sometime!) and #defines don't create a data object in memory. It's simply never been a problem. Now if one is "hell bent/anal" about "doing away with the preprocessor, power to them, "it ain't me" though. I don't want the template machinery taking over the capability of the preprocessor: I use it to mutate the language and experiment. I will probably implement a preprocessor to replace or add to what I have with C++ before I jump into compiler development for my language that is evolving.
It seems you might need to think outside of the C++ box.
That's an odd statement considering that I came here to investigate D and am defining what to put into my own language (or one I wish someone would implement) because of the issues I have with major "features" of C++ (and D).
It sounds to me that D is not the "improvement" over C++ you are looking for.
That's not to say that there is nothing interesting about D though, that's why I came in to investigate a bit.
 It sounds like you have done a lot of work in C++ and developed good
 coding practices for yourself and probably those you work with. I would
 guess that it has taken some time to develop the design habits you use.
Yep, to the point where the "everything including the kitchen sink" languages are getting kind of long in the tooth.
 It seems to me that the improvements D gives you have already worked
 out through convention in C++.
Probably some of them and there's always room for improvement.
 D presents a much clearer path on what conventions should be used in
 programming.
What I like about C++ is that it is more mechanism than policy.
 D provides other benefits,
 such as GC, that it appears you are not looking for.
Right.
 I say D is not for you because, from what I have read, you are looking
 to have features removed.
Not at all. I'm just creating a vision of what I'd like my ideal language to be. Given those things, I can assess if it's worth developing a new language or not. It's just research for the future when I may have more resources.
 I see complaint after complaint about D/C++ having something you don't
 want to use, but very little/none on what you actually want added to
 the language.
No complaints from me here: I'm not evaluating the language for usage anymore, but rather for elements that I'd put on my list of top language features and how they fit in or not with completely new features not in any language. Also, going "back and forth" with people on some of features solidifies in my mind what kind of capability and what kind of implementation I would do. So I don't want anything added to D since I'm not a D user. I'd like some things added to C++ though and some removed, but since that is like pulling teeth and the whole shebang is way more complex than necessary, a new language or a preprocessor to make it more palatable may be in order. What drew me to peek at D years ago was the comment that it was much easier to implement and the template implementation example. I looked at again lately because I had forgotten some things about it such as what the object model was like.
 Take GC for example, rather than
 saying, "I don't want it" you can go with, "I want total control of
 memory management"
I thought that was pretty much clear (both/either).
  If you still don't want to learn how to manipulate the
 D GC, then D is not for you.
I pretty much knew that after my reread of the website. Reading responses in defense of the features though does prompt some deeper thoughts about them.
 I might suggest not commenting on something you find unimportant, if
 "stronger const" is insignificant, then ignore it and move on.
stronger const is good. It's just not one of those "deal maker/breaker" things like the object model for example (something I am still going to look more closely at in D). Tony
It seems I have misunderstood your purpose here. Carry on. :)
Nov 08 2008
prev sibling next sibling parent Jussi Jumppanen <jussij zeusedit.com> writes:
Janderson Wrote:

 Even MFC is old and something Microsoft has not supported 
 for a long time.  
A few years ago MFC was definitely close to death. But the patient seems to have recovered and appears to be doing well: http://msdn.microsoft.com/en-us/library/bb982354.aspx Cheers Jussi
Nov 09 2008
prev sibling parent reply mgen <bmeck stedwards.edu> writes:
They build models still though...
Nov 11 2008
parent Janderson <ask me.com> writes:
mgen wrote:
 They build models still though...
?
Nov 11 2008