
digitalmars.D - Is D still alive?

reply Fab <fab-coding web.de> writes:
Dear D community,
My name is Fabian and I used to code in C++ and Delphi. But a few months ago I
got a book about D as a present. All in all, D sounds very interesting ... but
- the "big" but - is D still alive?
Are there usable and stable GUI toolkits that are currently under development?
Are there any actively maintained database projects?

So - is there any reason to switch to D? I would ... I really would switch if
there were more selling points than just a nice language. I don't buy a good car if it's too
expensive - so: is D as precious as it pretends to be?

Greetings
Fabian

PS: I hope you can understand my bad English.
Jan 26 2011
next sibling parent reply "Steven Schveighoffer" <schveiguy yahoo.com> writes:
On Wed, 26 Jan 2011 14:25:10 -0500, Fab <fab-coding web.de> wrote:

 Dear D community,
 My name is Fabian and I used to code C++ and Delphi. But a few month ago  
 I've
 got a book about D as a present. All in all D sounds very interesting  
 ... but
 - the "big" but - is D still alive?
Very much so. D2 is being actively developed. D1 is not, but Tango is being actively developed (which works with D1).
 Are there usable and stable GUI-Toolkits which are actually under  
 development?
I don't have personal experience with any of the current GUI projects, but from what I've read, many of them are usable. Whether they are actively developed, I'm not sure. See many of them here: http://prowiki.org/wiki4d/wiki.cgi?GuiLibraries
 Are there any continued database projects?
AFAIK, there is very little DB support for D2 (which will definitely need to be addressed before D can be considered a complete language). However, you *always* have support via C bindings. D has zero-overhead binding to C functions; all you need to do is port the declarations to D. If you are using D1, there are several projects, though I don't think many of them are up to date: http://prowiki.org/wiki4d/wiki.cgi?DatabaseBindings
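For instance, here is a minimal sketch of what such a binding looks like, using the C runtime's getenv purely as an illustration (a real database binding would port that library's header declarations to D in exactly the same way):

extern (C) char* getenv(const(char)* name);  // declaration ported from C's stdlib.h

import core.stdc.string : strlen;
import std.stdio : writeln;

void main()
{
    auto p = getenv("PATH");            // calls straight into the C runtime, no wrapper layer
    if (p !is null)
        writeln(p[0 .. strlen(p)]);     // slice the zero-terminated C string into a D char[]
}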
 So - is there any reason to change to D? I would ... I really would  
 change if
 there were more points than a nice language. I don't buy a good car if  
 it's to
 expensive - so: is D as precious as its pretend to be?
I will warn you, once you start using D, you will not want to use something else. I cringe every day when I have to use PHP for work. I would say it is not ready for prime-time yet. It has a way to go, but some have managed to build pretty impressive applications from it. So it would depend on your application. -Steve
Jan 26 2011
next sibling parent reply "Steven Schveighoffer" <schveiguy yahoo.com> writes:
On Wed, 26 Jan 2011 14:39:08 -0500, Steven Schveighoffer  
<schveiguy yahoo.com> wrote:

 On Wed, 26 Jan 2011 14:25:10 -0500, Fab <fab-coding web.de> wrote:

 Dear D community,
 My name is Fabian and I used to code C++ and Delphi. But a few month  
 ago I've
 got a book about D as a present. All in all D sounds very interesting  
 ... but
 - the "big" but - is D still alive?
Very much so. D2 is being actively developed. D1 is not, but Tango is being actively developed (which works with D1).
I should clarify, D1 is not getting any new features, but it is getting bug fixes. -Steve
Jan 26 2011
parent reply Fab <fab-coding web.de> writes:
Thank you for your answer.

But is there also a productive IDE for daily use?
I'm used to coding in Delphi, and its IDE has everything I need to code
full-featured applications.
I don't need a GUI designer (though it would be nice - maybe something like
Qt Designer), but an IDE which supports graphical debugging is vital for me.
Jan 26 2011
next sibling parent reply Fab <fab-coding web.de> writes:
In addition, you should know what I want to use D for.
I want to code little games (2D jump'n'run) and use D for school
work - drawing plots, calculating functions, and so on.

You see: I want to use D for private and for scholastic purposes.
Jan 26 2011
next sibling parent "Nick Sabalausky" <a a.a> writes:
"Fab" <fab-coding web.de> wrote in message 
news:ihpv7r$272q$1 digitalmars.com...
 In addition you have to know for what I want to use D.
 I want to code little games (2D: Jump'n'Run) and I want to use D for 
 scholastic
 use - drawing plots, calculating functions, ... and so on.

 You see: I want to use D for private and for scholastic purposes.
For games, there are SDL and SFML bindings for D. You may also want to look at the Derelict project which includes bindings for a bunch of useful libraries. For plots/charts/graphs/etc, you should look at the humorously-named "Plot2Kill" library.

Personally, I think D would be great for small games, private uses and scholastic uses. In fact, even *way* back *before* D1, Kenta Cho made some very good freeware games in D, like Torus Trooper and TUMIKI Fighters (ie, the original version of Blast Works).

The areas where D is still a little behind are:
- If you *need* to be able to compile *native* 64-bit code (32-bit will still work on a 64-bit machine/OS, of course).
- If you need to create shared dynamic libraries (ie, .dll and .so).
- If you need to link with Windows C .obj and .lib files that were compiled with anything other than DMC.
- If you need to use a graphical GUI-builder tool.
- Or if you want to use something similar to Rails or Django to create web apps.
Jan 26 2011
prev sibling parent Trass3r <un known.com> writes:
 and I want to use D for scholastic use -
 drawing plots, calculating functions, ... and so on.
Well nothing can beat Matlab for quick plots n stuff. (Speaking of which, of course you can write plugins for it with D: https://bitbucket.org/trass3r/matd)
 You see: I want to use D for private and for scholastic purposes.
D's just fine for private stuff :)
Jan 26 2011
prev sibling next sibling parent "Steven Schveighoffer" <schveiguy yahoo.com> writes:
On Wed, 26 Jan 2011 14:54:06 -0500, Fab <fab-coding web.de> wrote:

 Thank you for your answer.

 But is there also a productive IDE for 'the daily use'?
 I'm used to code Delphi and there is also everything in the IDE I need  
 to code
 full featured applications.
 I don't need a GUI-Designer (but it would be nice - maybe something like  
 the
 QT-Designer) but a IDE which supports graphical debugging is vital for  
 me.
There are several projects in progress for D IDEs, at various levels of maturity, including one that integrates D into visual studio. http://prowiki.org/wiki4d/wiki.cgi?EditorSupport Note, you may be able to find answers to other questions on the D wiki. -Steve
Jan 26 2011
prev sibling next sibling parent "Nick Sabalausky" <a a.a> writes:
"Fab" <fab-coding web.de> wrote in message 
news:ihpu4u$24bp$1 digitalmars.com...
 Thank you for your answer.

 But is there also a productive IDE for 'the daily use'?
 I'm used to code Delphi and there is also everything in the IDE I need to 
 code
 full featured applications.
 I don't need a GUI-Designer (but it would be nice - maybe something like 
 the
 QT-Designer) but a IDE which supports graphical debugging is vital for me.
I use Programmer's Notepad 2, which does everything I care about and is very nicely lean and responsive. I'm not sure if it does debugging, though; I've never tried. If you prefer the bigger (bloated, IMO) IDEs, then there are two D plugins for Eclipse ("Descent" is the older, more advanced one, and there's another whose name I forget that's newer and being actively developed). There is also stuff out there to make D work well with Visual Studio (see the "D.announcements" newsgroup).
Jan 26 2011
prev sibling parent Trass3r <un known.com> writes:
 But is there also a productive IDE for 'the daily use'?
I still use Descent for Eclipse. It isn't maintained anymore, but it's the only one that embeds a copy of the dmd frontend and does some semantic analysis. VisualD on Windows provides some basic auto-completion, go-to-definition, etc. via compiler-generated JSON files.
 I don't need a GUI-Designer (but it would be nice - maybe something like  
 the QT-Designer)
I think QtD supports the Qt GUI Designer.
 but a IDE which supports graphical debugging is vital for me
Then you should use VisualD on Windows. It includes cv2pdb, which makes it possible to debug D apps with Visual Studio without much pain.
Jan 26 2011
prev sibling next sibling parent reply "Nick Sabalausky" <a a.a> writes:
"Steven Schveighoffer" <schveiguy yahoo.com> wrote in message 
news:op.vpxkvij9eav7ka steve-laptop...
 On Wed, 26 Jan 2011 14:25:10 -0500, Fab <fab-coding web.de> wrote:

 Are there any continued database projects?
AFAIK, there is very little DB support (which will definitely need to be addressed before D is considered a complete language) for D2. However, you *always* have support via C bindings. D has zero-overhead binding to C functions, all you need to do is port the declarations to D. If you are using D1, there are several projects, I don't think many of them are up to date: http://prowiki.org/wiki4d/wiki.cgi?DatabaseBindings
Adam Ruppe and Piotr Szturmaj have recently been working on some database stuff. See the recent thread "Can your programming language do this?"
 So - is there any reason to change to D? I would ... I really would 
 change if
 there were more points than a nice language. I don't buy a good car if 
 it's to
 expensive - so: is D as precious as its pretend to be?
I will warn you, once you start using D, you will not want to use something else. I cringe every day when I have to use PHP for work.
So very true :)
 I would say it is not ready for prime-time yet.  It has a way to go, but 
 some have managed to build pretty impressive applications from it.  So it 
 would depend on your application.
Personally, I think that even though D still has some things to be worked out, I think it's *still* far better than any of the other more mature languages.
Jan 26 2011
parent reply "Steven Schveighoffer" <schveiguy yahoo.com> writes:
On Wed, 26 Jan 2011 15:11:06 -0500, Nick Sabalausky <a a.a> wrote:

 "Steven Schveighoffer" <schveiguy yahoo.com> wrote in message
 news:op.vpxkvij9eav7ka steve-laptop...
 On Wed, 26 Jan 2011 14:25:10 -0500, Fab <fab-coding web.de> wrote:

 Are there any continued database projects?
AFAIK, there is very little DB support (which will definitely need to be addressed before D is considered a complete language) for D2. However, you *always* have support via C bindings. D has zero-overhead binding to C functions, all you need to do is port the declarations to D. If you are using D1, there are several projects, I don't think many of them are up to date: http://prowiki.org/wiki4d/wiki.cgi?DatabaseBindings
Adam Ruppe and Piotr Szturmaj have recently been working on some database stuff. See the recent thread "Can your programming language do this?"
I have ignored that thread (I sometimes just ignore threads because they start out uninteresting, or become uninteresting, and then I miss out on some good stuff!) I'll have to take a look, D2 really does need a DB interface -- badly.
 I would say it is not ready for prime-time yet.  It has a way to go, but
 some have managed to build pretty impressive applications from it.  So  
 it
 would depend on your application.
Personally, I think that even though D still has some things to be worked out, I think it's *still* far better than any of the other more mature languages.
It all seems really good until you hit an issue that cannot be worked around -- like a compiler error or a misdesigned feature. I call these 'mercy' problems, because you are then at the complete mercy of someone else. If you have a deadline, or have a complete stoppage in work, you really have little choice but to move on to another language or abandon the project. Dcollections sat idle for about a year because of a problem like this. That would scare the crap out of me if I were a project manager trying to decide whether to use D or not.

I've had first-hand experience with using a product (from Microsoft) that failed so badly that we needed to have them fix it (which of course took about 3 months). A year later, they discontinued the product, and we had even more problems. I wrote my own system to replace it from scratch, and everything works so much better now (and uses less memory!). Not to mention, we have all the source, so it's always possible to fix. A small part of it is written in D1/Tango and performs beautifully :) But I'd probably not rewrite the server in D for the things I do with it.

I'd suggest to anyone looking to use D for something really big to try and "prove" out how well D will perform for you by coding up bits of your whole project that you think will be needed. Hopefully, you can do everything without hitting a mercy bug, and then you can write your full project in it. There are also really scary possibilities that I've seen happen to a few poor souls -- like hard-to-solve OPTLINK bugs. Those may creep up at any time.

Really, I just feel that D2's tools are not mature enough, and don't have enough support, to trust a professional product on them -- yet. I'm sure this will change in the future.

BTW, I plan to write a semi-professional project in D2 in the near future, but I'm 1) willing to take the risks, 2) have no deadline and 3) not depending on this project for a living.

-Steve
Jan 26 2011
next sibling parent reply Tomek Sowiński <just ask.me> writes:
Steven Schveighoffer wrote:

 Adam Ruppe and Piotr Szturmaj have recently been working on some database
 stuff. See the recent thread "Can your programming language do this?"

 I have ignored that thread (I sometimes just ignore threads because they
 start out uninteresting, or become uninteresting, and then I miss out on
 some good stuff!)

 I'll have to take a look, D2 really does need a DB interface -- badly.
That and networking. I can help with the latter, as I have done a bit of network devving, but I don't know what the current state of affairs is (is somebody working on it already?) and whether Phobos needs another soul on board.
 I would say it is not ready for prime-time yet.  It has a way to go, but
 some have managed to build pretty impressive applications from it.  So it
 would depend on your application.

 Personally, I think that even though D still has some things to be worked
 out, I think it's *still* far better than any of the other more mature
 languages.

 It all seems really good until you hit an issue that cannot be worked
 around -- like a compiler error or a misdesigned feature. I call these
 'mercy' problems, because you are then at the complete mercy of someone
 else.  If you have a deadline, or have a complete stoppage in work, you
 really have little choice but to move on to another language or abandon the
 project.  Dcollections sat idle for about a year because of a problem like
 this.
Yeah, ditto for QuantLibD. I just spent too much time on a test project trying to isolate dmd and phobos bugs to submit something meaningful to bugzilla and too little time coding. Not to mention that sometimes it was really hard to know what the language *should* do because of outdated documentation. But maybe the storm has passed and I should try serious work in D again?

 [snip]

 BTW, I plan to write a semi-professional project in D2 in the near future,
 but I'm 1) willing to take the risks 2) have no deadline and 3) not
 depending on this project for a living.

Sheer curiosity: what will the project be about?

--
Tomek
Jan 26 2011
parent Trass3r <un known.com> writes:
 Yeah, ditto for QuantLibD. I just spent too much time on a test project  
 trying to isolate dmd and phobos bugs to submit something meaningful to  
 bugzilla and too little time coding. Not to mention that sometimes it  
 was really hard to know what the language *should* do because of  
 outdated documentation. But maybe the storm has passed and I should try  
 serious work in D again?
I also ran into serious issues with cl4d that forced me to leave it alone several times. There even was a nasty bug with a corrupt stack frame, luckily it disappeared after some more coding and refactoring. But all in all I have the feeling that the situation has improved. Some serious forward reference bugs were fixed and I could more or less finish my work by now.
Jan 26 2011
prev sibling parent reply retard <re tard.com.invalid> writes:
Wed, 26 Jan 2011 15:35:19 -0500, Steven Schveighoffer wrote:

 I'd suggest to anyone looking to use D for something really big to try
 and "prove" out how well D will perform for you by coding up bits of
 your whole project that you think will be needed.  Hopefully, you can do
 everything without hitting a mercy bug and then you can write your full
 project in it.
I think this reveals a lot about D. You still need to prove things. Or maybe the community members in general aren't very good developers; they can't see the potential of this language. The fact is, no matter what language you choose, if it isn't a complete joke, you can finish the project. (I'm assuming the community members here won't be writing any massive projects which are not possible to do in C++ or PHP or Java.) I don't see any need to prove how well Haskell works. Even though it's a "avoid success at all costs" experimental research language. It just works. I mean to the extent that I'm willing to go with these silly test projects that try to prove something.
Jan 27 2011
next sibling parent Trass3r <un known.com> writes:
retard Wrote:
 Or maybe the community members in general aren't very good developers; they 
 can't see the potential of this language. The fact is, no matter what 
 language you choose, if it isn't a complete joke, you can finish the 
 project
We have a lot of talented people in this community who've created awesome projects; I don't think it's a lack of skills. Sometimes you hit bugs you just can't work around. In my case these included forward reference bugs where a simple "exchange the declarations" didn't work because of cross-references, and a nasty corrupt stack frame bug I couldn't reduce.
Jan 27 2011
prev sibling parent reply "Steven Schveighoffer" <schveiguy yahoo.com> writes:
On Thu, 27 Jan 2011 04:59:18 -0500, retard <re tard.com.invalid> wrote:

 Wed, 26 Jan 2011 15:35:19 -0500, Steven Schveighoffer wrote:

 I'd suggest to anyone looking to use D for something really big to try
 and "prove" out how well D will perform for you by coding up bits of
 your whole project that you think will be needed.  Hopefully, you can do
 everything without hitting a mercy bug and then you can write your full
 project in it.
I think this reveals a lot about D. You still need to prove things. Or maybe the community members in general aren't very good developers; they can't see the potential of this language. The fact is, no matter what language you choose, if it isn't a complete joke, you can finish the project. (I'm assuming the community members here won't be writing any massive projects which are not possible to do in C++ or PHP or Java.)
I fully see the potential of the language, but I've also experienced that a one (or two or three) man compiler team does not fix bugs on *my* schedule. I can't buy "enterprise" support, so any bugs I may hit, I'm just going to have to wait for Walter and Co. to get around to them. Not a problem for me, because I'm not developing with D professionally. But if I was going to base a software company on D, I'd be very nervous at this prospect. I find that I can work around many of D's bugs, but there are just some that you have to throw your hands up and wait (I don't have time to learn how a compiler works, and fix D's compiler). I think as D matures and hopefully gets more enterprise support, these problems will be history.
 I don't see any need to prove how well Haskell works. Even though it's a
 "avoid success at all costs" experimental research language. It just
 works. I mean to the extent that I'm willing to go with these silly test
 projects that try to prove something.
The statements I made are not a property of D, they are a property of the lack of backing/maturity. I'm sure when Haskell was at the same maturity stage as D, and if it had no financial backing/support contracts, it would be just as much of a gamble. You seem to think that D is inherently flawed because of D, but it's simply too young for some tasks. It's rapidly getting older, and I think in a year or two it will be mature enough for most projects. -Steve
Jan 28 2011
next sibling parent reply retard <re tard.com.invalid> writes:
Fri, 28 Jan 2011 10:14:04 -0500, Steven Schveighoffer wrote:

 On Thu, 27 Jan 2011 04:59:18 -0500, retard <re tard.com.invalid> wrote:
 
 Wed, 26 Jan 2011 15:35:19 -0500, Steven Schveighoffer wrote:

 I'd suggest to anyone looking to use D for something really big to try
 and "prove" out how well D will perform for you by coding up bits of
 your whole project that you think will be needed.  Hopefully, you can
 do everything without hitting a mercy bug and then you can write your
 full project in it.
I think this reveals a lot about D. You still need to prove things. Or maybe the community members in general aren't very good developers; they can't see the potential of this language. The fact is, no matter what language you choose, if it isn't a complete joke, you can finish the project. (I'm assuming the community members here won't be writing any massive projects which are not possible to do in C++ or PHP or Java.)
I fully see the potential of the language, but I've also experienced that a one (or two or three) man compiler team does not fix bugs on *my* schedule. I can't buy "enterprise" support, so any bugs I may hit, I'm just going to have to wait for Walter and Co. to get around to them. Not a problem for me, because I'm not developing with D professionally.
I agree.
 But if I was going to base a software company on D, I'd be very nervous
 at this prospect.
Exactly.
 
 I think as D matures
 and hopefully gets more enterprise support, these problems will be
 history.
This is the classic chicken or the egg problem. I'm not trying to be unnecessarily mean. Enterprise support is something you desperately need. Consider dsource, wiki4d, d's bugzilla etc. It's amazing how much 3rd party money and effort affects the development. Luckily many things are also free nowadays such as github.
 
 I don't see any need to prove how well Haskell works. Even though it's
 a "avoid success at all costs" experimental research language. It just
 works. I mean to the extent that I'm willing to go with these silly
 test projects that try to prove something.
The statements I made are not a property of D, they are a property of the lack of backing/maturity. I'm sure when Haskell was at the same maturity stage as D, and if it had no financial backing/support contracts, it would be just as much of a gamble.
But Haskell developers have uninterruptedly received funding during the years.
 You seem to think that D is inherently flawed because of D, but it's
 simply too young for some tasks.  It's rapidly getting older, and I
 think in a year or two it will be mature enough for most projects.
I've heard this before. I've also heard the 64-bit port and many other things are done in a year/month or two. The fact is, you're overly optimistic and these are all bullshit. When I come back here in a year or two, I have full justification to laugh at your stupid claims.
Jan 28 2011
next sibling parent Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
On 1/28/11 3:25 PM, retard wrote:
 Fri, 28 Jan 2011 10:14:04 -0500, Steven Schveighoffer wrote:
 I think as D matures
 and hopefully gets more enterprise support, these problems will be
 history.
This is the classic chicken or the egg problem. I'm not trying to be unnecessarily mean. Enterprise support is something you desperately need. Consider dsource, wiki4d, d's bugzilla etc. It's amazing how much 3rd party money and effort affects the development. Luckily many things are also free nowadays such as github.
 I don't see any need to prove how well Haskell works. Even though it's
 a "avoid success at all costs" experimental research language. It just
 works. I mean to the extent that I'm willing to go with these silly
 test projects that try to prove something.
The statements I made are not a property of D, they are a property of the lack of backing/maturity. I'm sure when Haskell was at the same maturity stage as D, and if it had no financial backing/support contracts, it would be just as much of a gamble.
But Haskell developers have uninterruptedly received funding during the years.
That doesn't say much about anything. Some projects worked well with funding, some worked well with little or no initial funding.
 You seem to think that D is inherently flawed because of D, but it's
 simply too young for some tasks.  It's rapidly getting older, and I
 think in a year or two it will be mature enough for most projects.
I've heard this before. I've also heard the 64-bit port and many other things are done in a year/month or two. The fact is, you're overly optimistic and these are all bullshit. When I come back here in a year or two, I have full justification to laugh at your stupid claims.
I think if you do that I have full justification to send you back where you originally came from. Cut the crap for a change, will you. Thanks. Andrei
Jan 28 2011
prev sibling next sibling parent "Steven Schveighoffer" <schveiguy yahoo.com> writes:
On Fri, 28 Jan 2011 16:25:49 -0500, retard <re tard.com.invalid> wrote:

 Fri, 28 Jan 2011 10:14:04 -0500, Steven Schveighoffer wrote:
 I think as D matures
 and hopefully gets more enterprise support, these problems will be
 history.
This is the classic chicken or the egg problem. I'm not trying to be unnecessarily mean. Enterprise support is something you desperately need. Consider dsource, wiki4d, d's bugzilla etc. It's amazing how much 3rd party money and effort affects the development. Luckily many things are also free nowadays such as github.
I'd say the last 1-2 years have been extremely productive compared to the previous years combined. I feel like the growth has been better than linear. I'm not saying enterprise support is waiting in the wings for D to be "fully mature", and it might take someone developing a for-sale compiler to get to that point. But even without that kind of support, I expect that usable stability in dmd will come about sooner rather than later.
 The statements I made are not a property of D, they are a property of
 the lack of backing/maturity.  I'm sure when Haskell was at the same
 maturity stage as D, and if it had no financial backing/support
 contracts, it would be just as much of a gamble.
But Haskell developers have uninterruptedly received funding during the years.
If I understand you correctly, that's not what I'm talking about. I'm talking about a company who *uses* haskell being able to go to a haskell-supplier company and say "I want you guys to guarantee you will fix any bugs we encounter." If that's what you mean, then I stand corrected, but then that makes Haskell not a good comparison here...
 You seem to think that D is inherently flawed because of D, but it's
 simply too young for some tasks.  It's rapidly getting older, and I
 think in a year or two it will be mature enough for most projects.
I've heard this before. I've also heard the 64-bit port and many other things are done in a year/month or two. The fact is, you're overly optimistic and these are all bullshit. When I come back here in a year or two, I have full justification to laugh at your stupid claims.
Open-source development takes time, and is hard to predict, because typically it comes from free time, which isn't guaranteed. That it's hard to predict doesn't mean a) "it's all bullshit", b) it doesn't make progress, and c) it will never succeed.

In two years, if you come back here and laugh at me, I will again shake my head in pity that you care so much about such things. I gave my best estimate, and maybe I'm off. We aren't on trial here, and no lives depend on us. You need to find a more constructive outlet for your pessimism. Well, it would be nice if it were simply on another newsgroup.

-Steve
Jan 31 2011
prev sibling parent reply Ulrik Mikaelsson <ulrik.mikaelsson gmail.com> writes:
2011/1/28 retard <re tard.com.invalid>:
 I've heard this before. I've also heard the 64-bit port and many other
 things are done in a year/month or two. The fact is, you're overly
 optimistic and these are all bullshit. When I come back here in a year or
 two, I have full justification to laugh at your stupid claims.
A 64-bit port for D (at least v1) IS available. I use LDC to build and run on 64-bit, but I'm sure GDC will work as well. The LDC port for D2 may be untested, though, and I do not know about GDC.

Personally, I think it's a mistake to recommend D2 "for new projects". Given the importance of compilers, runtime and base libraries, and the kind of bug reports I hear frustration around, it seems D2 should still be considered beta; for people who want working development now, recommend D1. In all its inferiority, it IS more stable. My guess is that much of the frustration in the D community stems from pushing D2 slightly prematurely, while neither D1 nor D2 was ready.

I've chosen to work only with D1/Tango from the start, and I simply don't recognize the frustration many are feeling. I'm only concerned that there ARE quite a few developers who seem to have been turned off by instability and the Phobos/Tango problem.

Well, well. I'm generally a happy D developer, and I only hope D3 won't be started until D2 is rock-stable and fully utilized. :)
Jan 31 2011
parent reply Trass3r <un known.com> writes:
 I've chosen to only work with D1/Tango from start, and I simply don't
 recognize the frustration many are feeling. I'm only concerned over
 that there ARE quite a few developers that seems to have been turned
 off by instability, and the Phobos/Tango-problem.
Well, if nobody acted as a guinea pig, no issues would be uncovered ;) And though I already encountered several blocker bugs myself I got the feeling that the situation has become way better. Of course if, for some reason, you absolutely need x64 or have a hard deadline for your project then D1 is probably the better way to go.
Jan 31 2011
parent reply Jesse Phillips <jessekphillips+D gmail.com> writes:
Trass3r Wrote:

 I've chosen to only work with D1/Tango from start, and I simply don't
 recognize the frustration many are feeling. I'm only concerned over
 that there ARE quite a few developers that seems to have been turned
 off by instability, and the Phobos/Tango-problem.
Well, if nobody acted as a guinea pig, no issues would be uncovered ;) And though I already encountered several blocker bugs myself I got the feeling that the situation has become way better. Of course if, for some reason, you absolutely need x64 or have a hard deadline for your project then D1 is probably the better way to go.
Andrei once put forth the question, "How many issues would users run across if they stuck to those features that are also available in v1.0?" I think the answer would be more than if they stuck with a D1 compiler, but not nearly the number people do hit, which is also diminishing rapidly.

I do not think there is an issue with using D2 in a new project, but if you have to ask, you probably should go with D1. I say this because someone who is aware of the issues present in the language is able to decide whether their desired project would be hindered by a given bug.

There are definitely some projects with constraints that would make D not a very good choice. For example, D would make a great language on an embedded device, but currently the first one to take it on will have a massive overhead to make it work.
Jan 31 2011
next sibling parent Ulrik Mikaelsson <ulrik.mikaelsson gmail.com> writes:
2011/1/31 Jesse Phillips <jessekphillips+D gmail.com>:
 I do not think there is an issue with using D2 in a new project, but if you
 have to ask you probably should go with D1. I say this because someone who
 is aware of the issues present in the language is able to decide if their
 desired project would be hindered by the bug.
I completely agree with this, but that important "if" is not conveyed by the recommendation "D version 2, which is recommended for new projects." :)

I think it's not so much a problem of actual problems as a problem of wrong expectations. Anyhow, D1 is a great language. D2 looks to be an even better language and, from what I hear, it is rapidly stabilizing. The only reason I brought this up (after days of questioning whether it's even worth mentioning) is the hope of avoiding the same problems when D3 arrives.
Jan 31 2011
prev sibling parent reply Jonathan M Davis <jmdavisProg gmx.com> writes:
On Monday, January 31, 2011 11:31:29 Jesse Phillips wrote:
 Trass3r Wrote:
 I've chosen to only work with D1/Tango from start, and I simply don't
 recognize the frustration many are feeling. I'm only concerned over
 that there ARE quite a few developers that seems to have been turned
 off by instability, and the Phobos/Tango-problem.
Well, if nobody acted as a guinea pig, no issues would be uncovered ;) And though I already encountered several blocker bugs myself I got the feeling that the situation has become way better. Of course if, for some reason, you absolutely need x64 or have a hard deadline for your project then D1 is probably the better way to go.
Andrei put for the question once of, "How many issues would users run across if they stuck to those features that are also available in v1.0?" I think the answer would be more then sticking with a D1 compiler, but not nearly the number people do hit, which is also diminishing rapidly. I do not think there is an issue with using D2 in a new project, but if you have to ask you probably should go with D1. I say this because someone who is aware of the issues present in the language is able to decide if their desired project would be hindered by the bug. There are definitely some projects, with constraints which would not make D a very good choice. For example D would make a great language on an embedded device, but currently the first one to take it on will have a massive overhead to make it work.
Personally, I find that it's issues such as not being able to link C or C++ code compiled by Microsoft's compiler with code compiled by dmd which would stop me from being able to use D in projects at work. The stability of the compiler is an issue, but the linker issue totally kills it before the stability issue would even come up.

Pretty much everything I work on at work has to run on both Linux and Windows (and soon Mac OS X), and we use Microsoft's compiler here, so D code would _have_ to be able to link with code compiled by Microsoft's compiler. The issue of D1 or D2 is completely irrelevant. Now, if you could use a compiler other than dmd (maybe gdc would work - I don't know), then maybe it would become a possibility, and then you have the question of D1 or D2; but if the linking issue isn't solved one way or another, then it doesn't matter.

I'm sure that the situation is not as grim at all companies, but it definitely is where I work.

- Jonathan M Davis
Jan 31 2011
parent retard <re tard.com.invalid> writes:
Mon, 31 Jan 2011 11:53:34 -0800, Jonathan M Davis wrote:

 On Monday, January 31, 2011 11:31:29 Jesse Phillips wrote:
 Trass3r Wrote:
 I've chosen to only work with D1/Tango from start, and I simply
 don't recognize the frustration many are feeling. I'm only
 concerned over that there ARE quite a few developers that seems to
 have been turned off by instability, and the Phobos/Tango-problem.
Well, if nobody acted as a guinea pig, no issues would be uncovered ;) And though I already encountered several blocker bugs myself I got the feeling that the situation has become way better. Of course if, for some reason, you absolutely need x64 or have a hard deadline for your project then D1 is probably the better way to go.
Andrei put for the question once of, "How many issues would users run across if they stuck to those features that are also available in v1.0?" I think the answer would be more then sticking with a D1 compiler, but not nearly the number people do hit, which is also diminishing rapidly. I do not think there is an issue with using D2 in a new project, but if you have to ask you probably should go with D1. I say this because someone who is aware of the issues present in the language is able to decide if their desired project would be hindered by the bug. There are definitely some projects, with constraints which would not make D a very good choice. For example D would make a great language on an embedded device, but currently the first one to take it on will have a massive overhead to make it work.
Personally, I find that it's issues such as not being able to link C or C++ code compiled by Microsoft's compiler with code compiled by dmd which would stop be me from being able to use D in projects at work. The stability of the compiler is an issue, but the linker issue totally kills it before the stability issue would even come up. Pretty much everything I work on at work has to run on both Linux and Windows (and soon Mac OS X), and we use Microsoft's compiler here, so D could would _have_ to be able to link with code compiled by Microsoft's compiler. The issue of D1 or D2 is completely irrelevant.
I don't do Windows development, but not being able to use popular third-party development tools because of object file format issues sounds like a huge problem. I took a quick look at the digitalmars site: the limitation is only mentioned once, in the FAQ section. A competent programmer might also discover it by reading the OPTLINK page.

There is also no mention of the quality of the D2 toolchain. "1.030 stable", "1.066 latest", and the mysterious "2.051". I would assume it's stable. But like Ulrik said, "the kind of bug-reports I hear frustration around, it seems D2 should still be considered beta".
Jan 31 2011
prev sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
Steven Schveighoffer wrote:
 I can't buy "enterprise" support,
Of course you can!
Jan 28 2011
next sibling parent reply Jonathan M Davis <jmdavisProg gmx.com> writes:
On Friday, January 28, 2011 17:16:54 Walter Bright wrote:
 Steven Schveighoffer wrote:
 I can't buy "enterprise" support,
Of course you can!
Well, since Scotty hasn't been born yet, it's probably a bit premature... ;) - Jonathan M Davis
Jan 28 2011
parent Walter Bright <newshound2 digitalmars.com> writes:
Jonathan M Davis wrote:
 On Friday, January 28, 2011 17:16:54 Walter Bright wrote:
 Steven Schveighoffer wrote:
 I can't buy "enterprise" support,
Of course you can!
Well, since Scotty hasn't been born yet, it's probably a bit premature... ;)
She canna take the power!
Jan 28 2011
prev sibling parent reply "Steven Schveighoffer" <schveiguy yahoo.com> writes:
On Fri, 28 Jan 2011 20:16:54 -0500, Walter Bright  
<newshound2 digitalmars.com> wrote:

 Steven Schveighoffer wrote:
 I can't buy "enterprise" support,
Of course you can!
No really, I can't afford it ;) But seriously, I find it hard to believe that you can buy enterprise support for D if it means that you do the work. There's only one you. So at some point, you might be spread too thin between adding new features, posting to this newsgroup, and supporting all enterprise customers. Any estimate you can give on how many such customers you have? -Steve
Jan 31 2011
next sibling parent reply "Simen kjaeraas" <simen.kjaras gmail.com> writes:
Steven Schveighoffer <schveiguy yahoo.com> wrote:

 On Fri, 28 Jan 2011 20:16:54 -0500, Walter Bright  
 <newshound2 digitalmars.com> wrote:

 Steven Schveighoffer wrote:
 I can't buy "enterprise" support,
Of course you can!
No really, I can't afford it ;) But seriously, I find it hard to believe that you can buy enterprise support for D if it means that you do the work. There's only one you. So at some point, you might be spread too thin between adding new features, posting to this newsgroup, and supporting all enterprise customers.
Even if that is the case, Walter is free to hire others to do the enterprise work for him. -- Simen
Jan 31 2011
parent "Steven Schveighoffer" <schveiguy yahoo.com> writes:
On Mon, 31 Jan 2011 12:26:56 -0500, Simen kjaeraas  
<simen.kjaras gmail.com> wrote:

 Steven Schveighoffer <schveiguy yahoo.com> wrote:

 On Fri, 28 Jan 2011 20:16:54 -0500, Walter Bright  
 <newshound2 digitalmars.com> wrote:

 Steven Schveighoffer wrote:
 I can't buy "enterprise" support,
Of course you can!
No really, I can't afford it ;) But seriously, I find it hard to believe that you can buy enterprise support for D if it means that you do the work. There's only one you. So at some point, you might be spread too thin between adding new features, posting to this newsgroup, and supporting all enterprise customers.
Even if that is the case, Walter is free to hire others to do the enterprise work for him.
Again, there's only one Walter. Throwing warm bodies at a problem doesn't always solve it ;) I for one would have a hard time being a full-time enterprise support developer for compiler software. That being said, he does work at a compiler company, so I suppose he probably has better access to compiler-savvy folks... -Steve
Jan 31 2011
prev sibling next sibling parent reply retard <re tard.com.invalid> writes:
Mon, 31 Jan 2011 11:43:37 -0500, Steven Schveighoffer wrote:

 On Fri, 28 Jan 2011 20:16:54 -0500, Walter Bright
 <newshound2 digitalmars.com> wrote:
 
 Steven Schveighoffer wrote:
 I can't buy "enterprise" support,
Of course you can!
No really, I can't afford it ;) But seriously, I find it hard to believe that you can buy enterprise support for D if it means that you do the work. There's only one you. So at some point, you might be spread too thin between adding new features, posting to this newsgroup, and supporting all enterprise customers. Any estimate you can give on how many such customers you have?
The fact that the final specification and design rationale of D is undocumented and in Walter's head means that no other person can sell that kind of deep enterprise support because it's not clear how the language should work. The rest of us can only guess. It also means that the more Walter spends time on enterprise support, the less he has time to work on D. The best for D might be to not buy any support at all. All the conferences and events are just distracting D's development.

I think the same applies to Phobos 2.. only Andrei knows the design well enough and knows how it's going to change in the future. No matter how much time one spends studying D or the ecosystem or how D is used in the enterprise world, one simply can't obtain any reasonable level of knowledge to become a "certified" authority in this community.

About the enterprise support... I haven't seen any material from Walter targeting professional D developers, only advertisements for people who have never used D. Maybe the hardcore stuff isn't publicly available. The commercial language consultancy support I've seen is that consultants with 20+ years of enterprise "C++ experience" teach young developers with only ~1-5 years of enterprise experience with the platform. Typically even the fresh juniors have some experience with the platform (via university training) and the in-house seniors with 3+ years of experience help them to get more familiar with the platform used in the company. It's also very rare to only focus on the language, usually the frameworks and toolchain are the major culprits. YMMV of course and the world is full of all kinds of bullshit consultancy.
Jan 31 2011
next sibling parent Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
On 1/31/11 1:52 PM, retard wrote:
 I think the same applies to Phobos 2.. only Andrei knows the design well
 enough and knows how it's going to change in the future. No matter how
 much time one spends studying D or the ecosystem or how D is used in the
 enterprise world, one simply can't obtain any reasonable level of
 knowledge to become a "certified" authority in this community.
People on the Phobos roster all have contributed to it, not to mention the many contributed patches via bugzilla and ideas aired in this group. Also, I have a track record of bringing ideas for future Phobos additions for discussion up here before acting on them. In the words of collection companies: "What else can we do?"

I think I'll do something terrible: I'll killfile re tard.com.invalid. This is a rare occurrence - my current filter has only four entries that probably are not valid anymore.

I have always saluted your visible efforts at being civil, and I have appreciation for them; obviously you are quite unhappy about the state of the language, and given that it's difficult to not let emotions take the driver's seat. At the same time, however, all too often there's this schadenfreude in your posts that reduces the credibility and the value of the actual content considerably. Between hunting for bits of content and trying to distinguish whether they are genuine or the reverse engineering of a presupposition - the toll on my time is too high.

Feel free to email me or to change your ID if you want to share something. Thanks.

Andrei
Jan 31 2011
prev sibling parent Walter Bright <newshound2 digitalmars.com> writes:
retard wrote:
 The fact that the final specification and design rationale of D is 
 undocumented and in Walter's head means that no other person can sell 
 that kind of deep enterprise support because it's not clear how the 
 language should work.
Oh rubbish. C++ was highly successful in the enterprise for 15 years before it got a formal specification.
 The rest of us can only guess. It also means that 
 the more Walter spends time on enterprise support, the less he has time 
 to work on D. The best for D might be to not buy any support at all. All 
 the conferences and events are just distracting D's development.
More nonsense. Supporting users, etc., keeps me current on what the real problems and needs are.
 I think the same applies to Phobos 2.. only Andrei knows the design well 
 enough and knows how it's going to change in the future. No matter how 
 much time one spends studying D or the ecosystem or how D is used in the 
 enterprise world, one simply can't obtain any reasonable level of 
 knowledge to become a "certified" authority in this community.
Official "certs" in the software biz are bullsh*t. I've never seen much of any correspondence between certs and competency.
 About the enterprise support... I haven't seen any material from Walter 
 targeting professional D developers, only advertisements for people who 
 have never used D. Maybe the hardcore stuff isn't publicly available.
If you mean slick brochures and Tom Hopkins trained pitches, no, that's not what I do. I help people who ask for services.
Jan 31 2011
prev sibling parent Walter Bright <newshound2 digitalmars.com> writes:
Steven Schveighoffer wrote:
 But seriously, I find it hard to believe that you can buy enterprise 
 support for D if it means that you do the work.  There's only one you.  
 So at some point, you might be spread too thin between adding new 
 features, posting to this newsgroup, and supporting all enterprise 
 customers.
If I need help, there are lots of people in this n.g. who'd be willing to take a paid gig.
 Any estimate you can give on how many such customers you have?
Not many.
Jan 31 2011
prev sibling parent reply retard <re tard.com.invalid> writes:
Wed, 26 Jan 2011 14:39:08 -0500, Steven Schveighoffer wrote:

 I will warn you, once you start using D, you will not want to use
 something else.  I cringe every day when I have to use PHP for work.
Nice trolling.
Jan 27 2011
parent "Steven Schveighoffer" <schveiguy yahoo.com> writes:
On Thu, 27 Jan 2011 04:50:32 -0500, retard <re tard.com.invalid> wrote:

 Wed, 26 Jan 2011 14:39:08 -0500, Steven Schveighoffer wrote:

 I will warn you, once you start using D, you will not want to use
 something else.  I cringe every day when I have to use PHP for work.
Nice trolling.
*shrug* call it whatever you want, it's true for me, and probably most everyone here. -Steve
Jan 28 2011
prev sibling next sibling parent reply Trass3r <un known.com> writes:
 But a few month ago I've got a book about D as a present
Nice, the word is spreading.
 So - is there any reason to change to D? I would ... I really would  
 change
 if there were more points than a nice language.

 I don't buy a good car if it's too expensive
But once you had a test drive, you just can't get out anymore.
Jan 26 2011
parent reply Walter Bright <newshound2 digitalmars.com> writes:
Trass3r wrote:
 But once you had a test drive, you just can't get out anymore.
I've had more than one longtime C++ expert tell me that after using D for a while, then for work reasons get forced back into C++, just find themselves cringing every time they edit it.
Jan 26 2011
next sibling parent reply Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
On 1/26/11 4:09 PM, Walter Bright wrote:
 Trass3r wrote:
 But once you had a test drive, you just can't get out anymore.
I've had more than one longtime C++ expert tell me that after using D for a while, then for work reasons get forced back into C++, just find themselves cringing every time they edit it.
As much as I cringe when I see http://d.puremagic.com/issues/show_bug.cgi?id=5493 Andrei
Jan 26 2011
parent Trass3r <un known.com> writes:
 On 1/26/11 4:09 PM, Walter Bright wrote:
 Trass3r wrote:
 But once you had a test drive, you just can't get out anymore.
I've had more than one longtime C++ expert tell me that after using D for a while, then for work reasons get forced back into C++, just find themselves cringing every time they edit it.
As much as I cringe when I see http://d.puremagic.com/issues/show_bug.cgi?id=5493 Andrei
Well, coding in D is like being Indiana: you might run into nasty traps here and there but you just can't be a vanilla guy anymore ;)
Jan 26 2011
prev sibling next sibling parent reply Jonathan M Davis <jmdavisProg gmx.com> writes:
On Wednesday, January 26, 2011 14:09:25 Walter Bright wrote:
 Trass3r wrote:
 But once you had a test drive, you just can't get out anymore.
I've had more than one longtime C++ expert tell me that after using D for a while, then for work reasons get forced back into C++, just find themselves cringing every time they edit it.
LOL. Yeah. There's just so many little things that D improves on that when you're stuck in C++ land, you're constantly running into little things that you miss having or are annoyed to have to deal with. auto has probably been the biggest one for me, though slicing isn't far behind. At least auto will be in C++ 0x... I dream of the day that I'll be able to use D all of the time and not have to worry about C++ anymore. - Jonathan M Davis
Jan 26 2011
parent reply "Nick Sabalausky" <a a.a> writes:
"Jonathan M Davis" <jmdavisProg gmx.com> wrote in message 
news:mailman.973.1296080233.4748.digitalmars-d puremagic.com...
 On Wednesday, January 26, 2011 14:09:25 Walter Bright wrote:
 Trass3r wrote:
 But once you had a test drive, you just can't get out anymore.
I've had more than one longtime C++ expert tell me that after using D for a while, then for work reasons get forced back into C++, just find themselves cringing every time they edit it.
LOL. Yeah. There's just so many little things that D improves on that when you're stuck in C++ land, you're constantly running into little things that you miss having or are annoyed to have to deal with. auto has probably been the biggest one for me, though slicing isn't far behind. At least auto will be in C++ 0x...
For me, D's killer features were string handling (slicing and appending/concatenation) and *no header files*. (No more header files!! Yay!!!). But auto is fantastic too though, I get sooo much use out of that.
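For instance, a small sketch of the string handling I mean (plain D2, nothing exotic):

import std.stdio : writeln;

void main()
{
    string s = "Hello, world";
    auto word = s[7 .. $];                 // slicing: a view into the original, no copy
    auto greeting = s[0 .. 5] ~ ", D!";    // ~ concatenates; ~= appends in place
    writeln(word, " / ", greeting);        // world / Hello, D!
    writeln(word.length);                  // 5 -- arrays carry their length, no '\0' needed
}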
Jan 26 2011
next sibling parent "Nick Sabalausky" <a a.a> writes:
"Nick Sabalausky" <a a.a> wrote in message 
news:ihq7a3$2o05$1 digitalmars.com...
 "Jonathan M Davis" <jmdavisProg gmx.com> wrote in message 
 news:mailman.973.1296080233.4748.digitalmars-d puremagic.com...
 On Wednesday, January 26, 2011 14:09:25 Walter Bright wrote:
 Trass3r wrote:
 But once you had a test drive, you just can't get out anymore.
I've had more than one longtime C++ expert tell me that after using D for a while, then for work reasons get forced back into C++, just find themselves cringing every time they edit it.
LOL. Yeah. There's just so many little things that D improves on that when you're stuck in C++ land, you're constantly running into little things that you miss having or are annoyed to have to deal with. auto has probably been the biggest one for me, though slicing isn't far behind. At least auto will be in C++ 0x...
For me, D's killer features were string handling (slicing, appending/concatenation, ***and .length instead of null-termination***) and *no header files*. (No more header files!! Yay!!!). But auto is fantastic too though, I get sooo much use out of that.
Fixed. ^^^
Jan 26 2011
prev sibling parent reply Trass3r <un known.com> writes:
 For me, D's killer features were string handling (slicing and
 appending/concatenation) and *no header files*. (No more header files!!
 Yay!!!). But auto is fantastic too though, I get sooo much use out of  
 that.
Getting rid of the pointer crap (proper arrays, bounds checking, classes as reference types,...) is definitely among the top 10 on my list.
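A tiny sketch of what I mean (ordinary D2):

import std.stdio : writeln;

class Point { int x, y; }          // classes are reference types

void main()
{
    auto p = new Point;
    auto q = p;                    // q refers to the very same object
    q.x = 42;
    writeln(p.x);                  // 42

    int[] a = [1, 2, 3];
    writeln(a.length);             // proper arrays know their own length
    // a[5] = 0;                   // out of bounds: throws a RangeError instead of corrupting memory
}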
Jan 26 2011
next sibling parent reply spir <denis.spir gmail.com> writes:
On 01/26/2011 11:33 PM, Trass3r wrote:
 For me, D's killer features were string handling (slicing and
 appending/concatenation) and *no header files*. (No more header files!!
 Yay!!!). But auto is fantastic too though, I get sooo much use out of that.
Getting rid of the pointer crap (proper arrays, bounds checking, classes as reference types,...) is definitely among the top 10 on my list.
Same here. But I would prefer slicing not to check the upper bound, but rather just extend to the end. Or have a slicing variant do that. Denis -- _________________ vita es estrany spir.wikidot.com
Jan 26 2011
parent reply Mafi <mafi example.org> writes:
Am 27.01.2011 01:41, schrieb spir:
 On 01/26/2011 11:33 PM, Trass3r wrote:
 For me, D's killer features were string handling (slicing and
 appending/concatenation) and *no header files*. (No more header files!!
 Yay!!!). But auto is fantastic too though, I get sooo much use out of
 that.
Getting rid of the pointer crap (proper arrays, bounds checking, classes as reference types,...) is definitely among the top 10 on my list.
Same here. But I would prefere slicing not to check upper bound, rather just extend to the end. Or have a slicing variant do that. Denis
What about a[1 .. min(x, $)]? (min from std.algorithm) Mafi
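For instance (a minimal sketch, assuming plain D2 arrays):

import std.algorithm : min;
import std.stdio : writeln;

void main()
{
    auto a = [10, 20, 30];
    size_t x = 5;                  // requested upper bound, past the end
    writeln(a[1 .. min(x, $)]);    // [20, 30] -- $ means a.length inside the brackets
}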
Jan 27 2011
parent spir <denis.spir gmail.com> writes:
On 01/27/2011 01:24 PM, Mafi wrote:
 Am 27.01.2011 01:41, schrieb spir:
 On 01/26/2011 11:33 PM, Trass3r wrote:
 For me, D's killer features were string handling (slicing and
 appending/concatenation) and *no header files*. (No more header files!!
 Yay!!!). But auto is fantastic too though, I get sooo much use out of
 that.
Getting rid of the pointer crap (proper arrays, bounds checking, classes as reference types,...) is definitely among the top 10 on my list.
Same here. But I would prefere slicing not to check upper bound, rather just extend to the end. Or have a slicing variant do that. Denis
What about a[1.. min(x, $)]. (min from std.alorithm)?
Yes, that's it. Denis -- _________________ vita es estrany spir.wikidot.com
Jan 27 2011
prev sibling next sibling parent reply Jonathan M Davis <jmdavisProg gmx.com> writes:
On Wednesday, January 26, 2011 16:41:10 spir wrote:
 On 01/26/2011 11:33 PM, Trass3r wrote:
 For me, D's killer features were string handling (slicing and
 appending/concatenation) and *no header files*. (No more header files!!
 Yay!!!). But auto is fantastic too though, I get sooo much use out of
 that.
Getting rid of the pointer crap (proper arrays, bounds checking, classes as reference types,...) is definitely among the top 10 on my list.
Same here. But I would prefere slicing not to check upper bound, rather just extend to the end. Or have a slicing variant do that.
You mean that if you give an index which is too large, it just uses $ instead? That sounds seriously bug-prone to me. I'd much rather that it blew up and thus told me that my program had a bug in it rather than silently trying to work. And if for some reason you really want to be able to just have it use $ if the index is too large, it's easy to write a wrapper function which does that. - Jonathan M Davis
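For instance, a sketch of such a wrapper (the helper name is made up; it assumes the clamp-to-$ semantics spir described):

import std.algorithm : min;
import std.stdio : writeln;

// Hypothetical helper: like arr[i .. j], but clamps j to the array length
// instead of throwing when j is past the end.
T[] clampedSlice(T)(T[] arr, size_t i, size_t j)
{
    return arr[i .. min(j, arr.length)];
}

void main()
{
    auto a = [1, 2, 3];
    writeln(clampedSlice(a, 1, 5));    // [2, 3] -- no bounds error
}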
Jan 26 2011
parent reply Daniel Gibson <metalcaedes gmail.com> writes:
Am 27.01.2011 02:11, schrieb Jonathan M Davis:
 On Wednesday, January 26, 2011 16:41:10 spir wrote:
 On 01/26/2011 11:33 PM, Trass3r wrote:
 For me, D's killer features were string handling (slicing and
 appending/concatenation) and *no header files*. (No more header files!!
 Yay!!!). But auto is fantastic too though, I get sooo much use out of
 that.
Getting rid of the pointer crap (proper arrays, bounds checking, classes as reference types,...) is definitely among the top 10 on my list.
Same here. But I would prefere slicing not to check upper bound, rather just extend to the end. Or have a slicing variant do that.
You mean that if you give an index which is too large, it just uses $ instead? That sounds seriously bug-prone to me. I'd much rather that it blew up and thus told me that my program had a bug in it rather than silently trying to work. And if for some reason you really want to be able to just have it use $ if the index is too large, it's easy to write a wrapper function which does that. - Jonathan M Davis
I think he wants

int[] arr = new int[3];
// ...
int[] arr2 = arr[1..5];

to be equivalent to

int[] arr = new int[3];
// ...
int[] arr2 = arr[1..$];
arr2.length = 4;

Cheers,
- Daniel
Jan 26 2011
parent Daniel Gibson <metalcaedes gmail.com> writes:
Am 27.01.2011 08:10, schrieb Daniel Gibson:
 Am 27.01.2011 02:11, schrieb Jonathan M Davis:
 On Wednesday, January 26, 2011 16:41:10 spir wrote:
 On 01/26/2011 11:33 PM, Trass3r wrote:
 For me, D's killer features were string handling (slicing and
 appending/concatenation) and *no header files*. (No more header files!!
 Yay!!!). But auto is fantastic too though, I get sooo much use out of
 that.
Getting rid of the pointer crap (proper arrays, bounds checking, classes as reference types,...) is definitely among the top 10 on my list.
Same here. But I would prefere slicing not to check upper bound, rather just extend to the end. Or have a slicing variant do that.
You mean that if you give an index which is too large, it just uses $ instead? That sounds seriously bug-prone to me. I'd much rather that it blew up and thus told me that my program had a bug in it rather than silently trying to work. And if for some reason you really want to be able to just have it use $ if the index is too large, it's easy to write a wrapper function which does that. - Jonathan M Davis
I think he wants int arr[] = int[3]; // ... int arr2[] = arr[1..5]; to be equivalent to int arr[] = int[3]; // ... int arr2[] = arr[1..$]; arr2.length = 4; Cheers, - Daniel
Okay, I just saw in spir's reply, which for some reason was not displayed in this branch of the thread but as a direct reply to the top post, that you understood him correctly and I didn't :-)
Jan 26 2011
prev sibling next sibling parent reply spir <denis.spir gmail.com> writes:
On 01/27/2011 02:11 AM, Jonathan M Davis wrote:
 On Wednesday, January 26, 2011 16:41:10 spir wrote:
 On 01/26/2011 11:33 PM, Trass3r wrote:
 For me, D's killer features were string handling (slicing and
 appending/concatenation) and *no header files*. (No more header files!!
 Yay!!!). But auto is fantastic too though, I get sooo much use out of
 that.
Getting rid of the pointer crap (proper arrays, bounds checking, classes as reference types,...) is definitely among the top 10 on my list.
Same here. But I would prefere slicing not to check upper bound, rather just extend to the end. Or have a slicing variant do that.
You mean that if you give an index which is too large, it just uses $ instead? That sounds seriously bug-prone to me. I'd much rather that it blew up and thus told me that my program had a bug in it rather than silently trying to work.
Sorry, but you are wrong on this. I understand this sounds unsafe, but no. Most
languages, I guess, just do that without any worry. In particular, I have
frequented Python and Lua mailing lists for years without even reading once
about this being a misfeature (and indeed have never run into a bug because of
this myself). It is simply the right semantics in 99.999% of cases.

spir d:~$ python
Python 2.6.6 (r266:84292, Sep 15 2010, 15:52:39)
[GCC 4.4.5] on linux2
Type "help", "copyright", "credits" or "license" for more information.
>>> s = 'abc'
>>> s[0:123456789]
'abc'

spir d:~$ lua
Lua 5.1.4  Copyright (C) 1994-2008 Lua.org, PUC-Rio
> require"io"
> s = "abc"
> print(string.sub(s, 1, 123456789))
abc

I'm constantly annoyed by D's behaviour. For instance, I often have to write out
the end of a string from a given point, but only at most n chars (to avoid
cluttering the output, indeed):

writeln(s[i..i+n]);

which fails if there are fewer than n remaining chars ;-)

Denis
--
_________________
vita es estrany
spir.wikidot.com
Jan 26 2011
parent reply bearophile <bearophileHUGS lycos.com> writes:
spir:

 Sorry, but you are wrong on this. I understand this sounds unsafe, but no.
Most 
 languages, I guess, just do that without any worry. In particular, I have 
 frequented python and Lua mailing lists for years without even reading once 
 about this beeing a misfeature (and indeed have never run into a bug because
of 
 this myself). It is simply the right semantics in 99.999% cases.
Thank you for raising this topic again. I have raised it probably more than one year ago, and all people around here were against this.

This is not unsafe, it's actually safer than the current D behaviour. Python designers don't think this is a badly designed part of Python, they think this is as desired, and I too have never seen people complain about this being a bad or bug-prone design.

An example of Python code:

s = "abcdefg"
upper = 20
s2 = s[2 : upper]
assert s2 == "cdefg"

Equivalent D2 code; if you forget to use min() you are doomed:

import std.algorithm: min;
void main() {
    string s = "abcdefg";
    int upper = 20;
    string s2 = s[2 .. min($, upper)];
    assert(s2 == "cdefg");
}

The saturating nature of Python slice bounds is a safety net that's useful to avoid some troubles. The only good thing of the D2 design is that it performs fewer tests at runtime (it has no need to call min() in general), which is a bit of a positive in a systems language, but in this case it has a cost in safety.

Bye,
bearophile
Jan 27 2011
parent reply Jonathan M Davis <jmdavisProg gmx.com> writes:
On Thursday, January 27, 2011 01:34:04 bearophile wrote:
 spir:
 Sorry, but you are wrong on this. I understand this sounds unsafe, but
 no. Most languages, I guess, just do that without any worry. In
 particular, I have frequented python and Lua mailing lists for years
 without even reading once about this beeing a misfeature (and indeed
 have never run into a bug because of this myself). It is simply the
 right semantics in 99.999% cases.
Thank you for raising this topic again. I have raised it probably more than one year ago, and all people around here were against this. This is not unsafe, it's actually safer than the current D behaviour. Python designers don't think this is a badly designed part of Python, they think this is as desired, and I too have never seen people complain about this being a bad or bug prone design. An example of Python code: s = "abcdefg" upper = 20 s2 = s[2 : upper] assert s2 == "cdefg" Equivalent D2 code, if you forget to use the min() you are doomed: import std.algorithm: min; void main() { string s = "abcdefg"; int upper = 20; string s2 = s[2 .. min($, upper)]; assert(s2 == "cdefg"); } The saturating nature of Python slice bounds is a safety net that's useful to avoid some troubles. The only good thing of the D2 design is that it performs less tests at runtime (it has no need to call min() in general), this is a bit positive in a system language, but in this case it has a cost in safety.
I don't think that you stand much chance of convincing a crowd using a system programming language that having the compiler adjust your out-of-bounds indices to be in bounds is a good idea. While it may be a great idea in some circumstances, it generally smacks of lax programming. And since it makes slicing more expensive, it would be a feature that would definitely need to pull its weight. And I don't see how it possibly could. It's trivial to add a function which will return a slice which is laxly sliced in the manner that you're looking for. And using min as you're showing isn't exactly hard. Yes, you may get bugs when porting code from Python, but I don't think that porting Python code has ever been one of the major concerns of D's language design. Really, it's the silently breaking stuff when porting C and C++ code which has been the concern. What we have is efficient, and I expect that most people around here think that it's the correct solution. If you want lax slicing, it's easy to get by using min or a wrapper function. So, I see no reason to make any language changes - not to mention it would likely conflict with TDPL to make such a change. - Jonathan M Davis
Jan 27 2011
parent bearophile <bearophileHUGS lycos.com> writes:
Jonathan M Davis:

 I don't think that you stand much chance of convincing a crowd using a system 
 programming language that having the compiler adjust your out of bound indices 
 to be in bounds is  good idea.
Technically here we are talking about slicing bounds, and not indexes. But you are right, there's no hope I will change minds here. On the other hand my duty is to show the ideas I believe are right/better, because I care for this language. Bye, bearophile
Jan 27 2011
prev sibling next sibling parent reply retard <re tard.com.invalid> writes:
Wed, 26 Jan 2011 23:33:54 +0100, Trass3r wrote:

 For me, D's killer features were string handling (slicing and
 appending/concatenation) and *no header files*. (No more header files!!
 Yay!!!). But auto is fantastic too though, I get sooo much use out of
 that.
Getting rid of the pointer crap (proper arrays, bounds checking, classes as reference types,...) is definitely among the top 10 on my list.
Yep, those were the reasons that lured me to learn Java. However, those were not the reasons to learn D. The main reasons were RAII and Design by Contract. Even funnier, it took D about 9 years to fix the main bug in DbC (contract inheritance).
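For readers who have not used D's contracts, this is roughly what the built-in Design by Contract syntax looks like; a minimal hypothetical example (not taken from the thread), with the checks active only in non-release builds:

import std.math : sqrt;

double mySqrt(double x)
in
{
    assert(x >= 0, "mySqrt requires a non-negative argument");  // precondition
}
out (result)
{
    assert(result >= 0);   // postcondition sees the return value
}
body
{
    return sqrt(x);
}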
Jan 27 2011
parent reply Walter Bright <newshound2 digitalmars.com> writes:
retard wrote:
 The main reasons were RAII and Design by 
 Contract. Even funnier, it took D about 9 years to fix the main bug in DbC 
 (contract inheritance).
The reason that took so long was that few people were using DbC effectively, so it was a low priority. I originally had high hopes that DbC would produce dramatic improvements in code quality, but the real world results were disappointing.
Jan 27 2011
parent reply bearophile <bearophileHUGS lycos.com> writes:
Walter:

The reason that took so long was that few people were using DbC effectively, so
it was a low priority. I originally had high hopes that DbC would produce
dramatic improvements in code quality, but the real world results were
disappointing.<
After many years and many failed hopes, I think there is no silver bullet in programming, so maybe nothing is able to produce "dramatic improvements in code quality".

But even if this is true, some things are able to improve coding a bit, like unit testing, a well semantically defined language, syntax coloring, quick compile-run cycles, OOP for certain kinds of programs, DbC, and so on. Each of such things improves the situation only a little, but such improvements pile up, and most programmers who have tried them don't want to go back and miss those things.

I have learnt to know and use DbC in D, and while it has not caused dramatic improvements in my code, I like to use it now and then; it has found some of the bugs in my code and more. I appreciate D's DbC enough that sometimes I miss it when I write Python code, so sometimes when I write OOP Python code I add home-made class invariants and call them manually from methods. In one situation this has allowed me to finish some hairy Python code on a very tight time schedule.

Regarding D's implementation of DbC I'd like:
- DMD to use a non-release build of Phobos when I compile my D code in non-release build (or some similar solution). Then Phobos will be free to use asserts instead of enforce in many more situations (even if not in all situations). This will allow more inlining, allow "nothrow" tags to be used more frequently, and avoid some performance problems currently present (confirmed by timings and profiling) in Phobos caused by enforces;
- To see some acceptable solution for the "old" (pre-state) problem. On this there are some ideas, but I don't think there is an accepted proposal yet. This is not an indispensable feature of DbC, but it increases its usefulness (other languages have found ways to solve this problem, so I presume there is some way to solve it in D too);
- To see some Bugzilla DbC bugs removed. One or more of them is related to how const/immutable badly interacts with the result return value used by out(result){}.

Bye and thank you,
bearophile
Jan 27 2011
parent reply Walter Bright <newshound2 digitalmars.com> writes:
bearophile wrote:
 Walter:
 
 The reason that took so long was that few people were using DbC
 effectively, so it was a low priority. I originally had high hopes that DbC
 would produce dramatic improvements in code quality, but the real world
 results were disappointing.<
After many years and many failed hopes, I think there is no silver bullet in programming, so maybe nothing is able to produce "dramatic improvements in code quality". But even if this is true, some things are able to improve coding a bit, like unit testing, a well semantically defined language, syntax coloring, quick compile-run cycles, OOP for certain kinds of programs, DbC, and so on. Each of such things improve the situation only a little, but such improvements pile up and most programmers when have tried them don't want to go back to miss those things.
Unit testing has produced a dramatic improvement in coding.
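For reference, D's built-in unit testing is just a block in the module; a trivial hypothetical example (the blocks are compiled in and run before main() only when building with -unittest):

int triple(int x)
{
    return 3 * x;
}

unittest
{
    assert(triple(2) == 6);
    assert(triple(-1) == -3);
}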
Jan 27 2011
next sibling parent reply Tomek =?ISO-8859-2?Q?Sowi=F1ski?= <just ask.me> writes:
Walter Bright napisa=B3:

 bearophile wrote:
 Walter:
=20
 The reason that took so long was that few people were using DbC
 effectively, so it was a low priority. I originally had high hopes tha=
t DbC
 would produce dramatic improvements in code quality, but the real world
 results were disappointing.<
=20 After many years and many failed hopes, I think there is no silver bull=
et in
 programming, so maybe nothing is able to produce "dramatic improvements=
in
 code quality".
=20
 But even if this is true, some things are able to improve coding a bit,=
like
 unit testing, a well semantically defined language, syntax coloring, qu=
ick
 compile-run cycles, OOP for certain kinds of programs, DbC, and so on. =
Each
 of such things improve the situation only a little, but such improvemen=
ts
 pile up and most programmers when have tried them don't want to go back=
to
 miss those things.
=20 Unit testing has produced a dramatic improvement in coding.
Yes, it's big. Funny that it's not really a technical change but a cultural= one -- D just leaves no excuses to even the most stone-age programmers not= to test their code. --=20 Tomek
Jan 27 2011
parent reply Walter Bright <newshound2 digitalmars.com> writes:
Tomek Sowiński wrote:
 Walter Bright wrote:
 
 bearophile wrote:
 Walter:
 
 The reason that took so long was that few people were using DbC 
 effectively, so it was a low priority. I originally had high hopes that
 DbC would produce dramatic improvements in code quality, but the real
 world results were disappointing.<
After many years and many failed hopes, I think there is no silver bullet in programming, so maybe nothing is able to produce "dramatic improvements in code quality". But even if this is true, some things are able to improve coding a bit, like unit testing, a well semantically defined language, syntax coloring, quick compile-run cycles, OOP for certain kinds of programs, DbC, and so on. Each of such things improve the situation only a little, but such improvements pile up and most programmers when have tried them don't want to go back to miss those things.
Unit testing has produced a dramatic improvement in coding.
Yes, it's big. Funny that it's not really a technical change but a cultural one -- D just leaves no excuses to even the most stone-age programmers not to test their code.
I was talking about this with Andrei the other day. D's focus on making it easy to do things the right way has paid off handsomely, though this is not at all obvious from reading a feature list. It only becomes clear when you use it for a while, and then try to go back to the way you were doing things before. I think one of the reasons DbC has not paid off is it still requires a significant investment of effort by the programmer. It's too easy to not bother.
Jan 27 2011
next sibling parent reply Andrej Mitrovic <andrej.mitrovich gmail.com> writes:
On 1/28/11, Walter Bright <newshound2 digitalmars.com> wrote:
 I think one of the reasons DbC has not paid off is it still requires a
 significant investment of effort by the programmer. It's too easy to not
 bother.
Another way to look at it is that programmers are enjoying the safety of using regular D so much as to not even think about using DbC. The language does guarantee a lot more than C++. The irony here is that C++ is the one that desperately needs integrated DbC, and yet D, the safer language, is the one providing DbC for free. :p In other words, we get the candy and the cake.
Jan 27 2011
parent bearophile <bearophileHUGS lycos.com> writes:
Andrej Mitrovic:

 Another way to look at it is that programmers are enjoying the safety
 of using regular D so much as to not even think about using DbC.
This may be a case of Risk homeostasis: http://en.wikipedia.org/wiki/Risk_homeostasis But maybe it's mostly a matter of getting used to D. Once you get used to a safer language, you want even more and something better. You want the frosting too on your cake :-) And probably D is not the end of the line in the evolution of C-derived languages. There is space for improvements, like having notnull reference types, typestates, etc. For example, it will be interesting to see where the Rust language goes. Bye, bearophile
Jan 27 2011
prev sibling parent reply Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
On 1/27/11 8:02 PM, Walter Bright wrote:
 Tomek Sowiński wrote:
 Walter Bright wrote:

 bearophile wrote:
 Walter:

 The reason that took so long was that few people were using DbC
 effectively, so it was a low priority. I originally had high hopes
 that
 DbC would produce dramatic improvements in code quality, but the real
 world results were disappointing.<
After many years and many failed hopes, I think there is no silver bullet in programming, so maybe nothing is able to produce "dramatic improvements in code quality". But even if this is true, some things are able to improve coding a bit, like unit testing, a well semantically defined language, syntax coloring, quick compile-run cycles, OOP for certain kinds of programs, DbC, and so on. Each of such things improve the situation only a little, but such improvements pile up and most programmers when have tried them don't want to go back to miss those things.
Unit testing has produced a dramatic improvement in coding.
Yes, it's big. Funny that it's not really a technical change but a cultural one -- D just leaves no excuses to even the most stone-age programmers not to test their code.
I was talking about this with Andrei the other day. D's focus on making it easy to do things the right way has paid off handsomely, though this is not at all obvious from reading a feature list. It only becomes clear when you use it for a while, and then try to go back to the way you were doing things before.
Although this might as well be true, I generally try to avoid such arguments. The problem with it is it's non-falsifiable (http://en.wikipedia.org/wiki/Falsifiability) so it has a certain stench coming with it. I've seen such a claim in Go fora: you know, once you get to really use Go, you won't feel the need for generics. Meh. I try to _never_ use such an argument. If I'm to convince anyone that D rocks, it won't be by means of unfalsifiable statements. It will be by showing code that knocks your socks off. The kind of code that makes you think: "If I'm to write that in language X, I need to give away desirable traits A, B, and C. Damn!"
 I think one of the reasons DbC has not paid off is it still requires a
 significant investment of effort by the programmer. It's too easy to not
 bother.
One issue with DbC is that its only significant advantage is its interplay with inheritance. Otherwise, scope() in conjunction with assert works with less syntactic overhead. So DbC tends to shine with large and deep hierarchies... but large and deep hierarchies are not that a la mode anymore. Andrei
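To make the comparison concrete, here is a small hypothetical sketch of the two styles being contrasted: the same postcondition written once as an out contract and once as scope(exit) plus assert inside the body. The contract version is part of the function's interface and participates in inheritance; the scope version is purely local.

// DbC style: the postcondition is declared as part of the signature.
int increment(int x)
out (result)
{
    assert(result > x);
}
body
{
    return x + 1;
}

// scope() style: same check, less syntactic overhead, but it lives in the body
// and is not inherited by overrides.
int increment2(int x)
{
    int result;
    scope(exit) assert(result > x);
    result = x + 1;
    return result;
}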
Jan 27 2011
next sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
Andrei Alexandrescu wrote:
 On 1/27/11 8:02 PM, Walter Bright wrote:
 I was talking about this with Andrei the other day. D's focus on making
 it easy to do things the right way has paid off handsomely, though this
 is not at all obvious from reading a feature list. It only becomes clear
 when you use it for a while, and then try to go back to the way you were
 doing things before.
Although this might as well be true, I generally try to avoid such arguments. The problem with it is it's non-falsifiable (http://en.wikipedia.org/wiki/Falsifiability) so it has a certain stench coming with it. I've seen such a claim in Go fora: you know, once you get to really use Go, you won't feel the need for generics. Meh. I try to _never_ use such an argument. If I'm to convince anyone that D rocks, it won't be by means of unfalsifiable statements.
I agree it's a worthless argument to use to try and convince people to give D a try. But for experienced D users, it is an interesting point.
 It will be by 
 showing code that knocks your socks off. The kind of code that makes you 
 think: "If I'm to write that in language X, I need to give away 
 desirable traits A, B, and C. Damn!"
 
 I think one of the reasons DbC has not paid off is it still requires a
 significant investment of effort by the programmer. It's too easy to not
 bother.
One issue with DbC is that its only significant advantage is its interplay with inheritance. Otherwise, scope() in conjunction with assert works with less syntactic overhead. So DbC tends to shine with large and deep hierarchies... but large and deep hierarchies are not that a la mode anymore.
Yes, you might be right.
Jan 27 2011
parent reply Jonathan M Davis <jmdavisProg gmx.com> writes:
On Thursday 27 January 2011 19:29:59 Walter Bright wrote:
 Andrei Alexandrescu wrote:
 One issue with DbC is that its only significant advantage is its
 interplay with inheritance. Otherwise, scope() in conjunction with
 assert works with less syntactic overhead. So DbC tends to shine with
 large and deep hierarchies... but large and deep hierarchies are not
 that a la mode anymore.
Yes, you might be right.
I generally end up using unit tests to verify that stuff works correctly and then throw exceptions on bad input. So while I like having DbC built in, I don't end up using it all that much. It's primarily invariant that I end up using though, and that's harder to do inside of the member functions. - Jonathan M Davis
Jan 27 2011
parent reply bearophile <bearophileHUGS lycos.com> writes:
Jonathan M Davis:

 I generally end up using unit tests to verify that stuff works correctly and
then 
 throw exceptions on bad input. So while I like having DbC built in, I don't
end 
 up using it all that much. It's prim,arily invariant that I end up using
though, 
 and that's harder to do inside of the member functions.
I think the problem here is that you are not using your D tools well enough yet:

- Preconditions allow you to save some tests in your unittests, because you have less need to test many input boundary conditions.

- Postconditions are useful to save some code to test the correctness of the results inside your unittests. You still need to put many tricky but correct input conditions inside your unittests, but then the postcondition will test that the outputs are inside the class of the correct outputs, and you will need less unittest code to test that the results are exactly the expected ones in some important situations.

- The missing "old" feature, once somehow implemented, allows to remove some other unittests, because you don't need a unittest any more to test that the output is generally correct given a certain class of input.

- Class/struct invariants do something unittests have a hard time doing: testing the internal consistency of the class/struct data structures at all moments, and catching an inconsistency as soon as possible. This allows to catch bugs very soon, even before results reach the unittesting code, so their work is not much duplicated by unit testing (a small sketch follows below).

- Loop invariants can't be replaced well enough by unittests. Again, they help you find bugs very early, often no more than a few lines of code after where the bug is. Unittests are not much able to do this.

- Currently you can't use invariants in some situations because of some DMD bugs.

- You must look a bit forward too. If D has some success then some person will try to write some tool to test some contracts at compile time. Compile-time testing of contracts is useful because it's the opposite of a sampling: it's like an extension of the type system, it allows you to be sure of something for all possible cases.

Unit tests are useful as a sampling means, they allow you to assert your function does exactly as expected for some specific input-output pairs. DbC doesn't perform a sampling, it tests more general properties about your inputs or outputs (and if you have some kind of implementation of the "old" feature, also general properties between inputs and their outputs), so DbC testing is wider but less deep and less specific than unit testing.

Generally with unittesting you can't be sure to have covered all interesting input-output cases (code coverage tools here help but don't solve the problem; fuzz-testing tools are able to cover other cases), while with DbC you can't be sure your general rules about correct inputs, correct outputs (or even general rules about what a correct input-output pair is) are accurate enough to catch all bad situations and computations.

So generally DbC and unittests are better for different purposes; using them both you are able to complement each other's weak points and to improve your D coding. I also suggest trying to write contracts first and code later, sometimes it helps.

Bye,
bearophile
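A minimal hypothetical sketch of the invariant and loop-invariant points above (all of these checks disappear in a -release build):

struct Account
{
    int balance;
    int limit = -100;

    invariant()
    {
        // Checked on entry and exit of every public member function:
        // an inconsistency is caught as soon as it appears.
        assert(balance >= limit);
    }

    void withdraw(int amount)
    in
    {
        assert(amount > 0);
    }
    body
    {
        balance -= amount;   // a bug here trips the invariant immediately
    }
}

int sumOfFirst(in int[] data, size_t n)
{
    int total = 0;
    foreach (i; 0 .. n)
    {
        assert(i < data.length);   // hand-written loop invariant
        total += data[i];
    }
    return total;
}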
Jan 28 2011
parent reply spir <denis.spir gmail.com> writes:
On 01/28/2011 12:36 PM, bearophile wrote:

 I think the problem here is that you are not using your D tools well enough
yet:
 - Preconditions allow you to save some tests in your unittests, because you
have less need to test many input boundary conditions.
 - Postconditions are useful to save some code to test the correctness of the
results inside your unittests. You still need to put many tricky but correct
input conditions inside your unittests, but then the postcondition will test
that the outputs are inside the class of the correct outputs, and you will need
less unittest code to test that the results are exactly the expected ones in
some important situations.
Very interesting points, thank you.
 - The missing "old" feature once somehow implemented allows to remove some
other unittests, because you don't need a unittest any more to test the output
is generally correct given a certain class of input.
What is this "old" feature?
 Generally with unittesting you can't be sure to have covered all interesting
input-output cases (code coverage tools here help but don't solve the problem.
Fuzzytesting tools are able to cover other cases), while with DbC you can't be
sure your general rules about correct inputs, correct outputs (or even general
rules about what a correct input-output pair is) are accurate enough to catch
all bad situations and computations.

 So generally DbC and unittests are better for different purposes, using them
both you are able to complement each other weak points, and to improve your D
coding.
Ditto. I definitely need a "mental jump" to think in terms of DbC; nearly never use it, simply don't think at it.
 I also suggest to try to write contracts first and code later, sometimes it
helps.
Just what I sometimes do for tests ;-) Denis -- _________________ vita es estrany spir.wikidot.com
Jan 28 2011
next sibling parent Daniel Gibson <metalcaedes gmail.com> writes:
Am 28.01.2011 13:30, schrieb spir:
 On 01/28/2011 12:36 PM, bearophile wrote:
 
 I think the problem here is that you are not using your D tools well enough
yet:
 - Preconditions allow you to save some tests in your unittests, because you
 have less need to test many input boundary conditions.
 - Postconditions are useful to save some code to test the correctness of the
 results inside your unittests. You still need to put many tricky but correct
 input conditions inside your unittests, but then the postcondition will test
 that the outputs are inside the class of the correct outputs, and you will
 need less unittest code to test that the results are exactly the expected ones
 in some important situations.
Very interesting points, thank you.
 - The missing "old" feature once somehow implemented allows to remove some
 other unittests, because you don't need a unittest any more to test the output
 is generally correct given a certain class of input.
What is this "old" feature?
if you've got a function fun(int x){ ... ; x++; ... } then you can use old.x (or something like that) in the dbc block at the end to access the original value of x. like assert(x == old.x+1) or something like that. after hearing about dbc in university I thought that having old was fundamental for dbc.. but D doesn't support it. Cheers, - Daniel
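Since D lacks old, the usual hand-rolled workaround is to capture the pre-state at the top of the function and compare against it on the way out; a minimal hypothetical sketch:

int counter;

void bump(int by)
in
{
    assert(by > 0);
}
body
{
    immutable oldCounter = counter;                      // manual "old": save the pre-state
    scope(success) assert(counter == oldCounter + by);   // what out would check if old existed
    counter += by;
}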
Jan 28 2011
prev sibling parent bearophile <bearophileHUGS lycos.com> writes:
spir:

 What is this "old" feature?
It's a basic DbC feature that's currently missing in D because of some implementation troubles (and maybe also because Walter is a bit disappointed about DbC). Example: a class member function performs a certain computation and changes some attributes. In the postcondition you want to test that such changes are correct. To do this well it's useful to know what the original state of those attributes was. This is what the "old" feature allows you to do. Similar things are possible with free functions too. So preconditions in a function allow you to assert that general rules about inputs are fulfilled, postconditions allow you to assert that general rules about its outputs are fulfilled, and the "old" (pre-state) feature allows you to assert that general rules about its input-output pairs are fulfilled. So it's not a small thing :-) It was discussed three or more times: http://www.digitalmars.com/d/archives/digitalmars/D/why_no_old_operator_in_function_postconditions_as_in_Eiffel_54654.html http://www.digitalmars.com/d/archives/digitalmars/D/Communicating_between_in_and_out_contracts_98252.html
I definitely need a "mental jump" to think in terms of DbC; nearly never use
it, simply don't think at it.<
DbC looks like a very simple thing, but you need some time, thinking, and reading about what it is and what its purposes are, to learn to use it :-) Bye, bearophile
Jan 28 2011
prev sibling next sibling parent bearophile <bearophileHUGS lycos.com> writes:
Andrei:

 One issue with DbC is that its only significant advantage is its 
 interplay with inheritance. Otherwise, scope() in conjunction with 
 assert works with less syntactic overhead. So DbC tends to shine with 
 large and deep hierarchies... but large and deep hierarchies are not 
 that a la mode anymore.
Have you tried to use class invariants manually? It's easy to forget its calls, and the calls clutter the code, and are not handy to write. So I don't agree. Bye, bearophile
Jan 27 2011
prev sibling parent reply Roman Ivanov <something example.com> writes:
== Quote from Andrei Alexandrescu (SeeWebsiteForEmail erdani.org)'s article
 On 1/27/11 8:02 PM, Walter Bright wrote:
 I think one of the reasons DbC has not paid off is it still requires a
 significant investment of effort by the programmer. It's too easy to not
 bother.
One issue with DbC is that its only significant advantage is its interplay with inheritance. Otherwise, scope() in conjunction with assert works with less syntactic overhead. So DbC tends to shine with large and deep hierarchies... but large and deep hierarchies are not that a la mode anymore.
DbC opens many interesting possibilities if it's supported by tools other than just the compiler. MS has included it in .NET 4.0: http://research.microsoft.com/en-us/projects/contracts/
Jan 28 2011
next sibling parent Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
On 1/28/11 10:14 AM, Roman Ivanov wrote:
 == Quote from Andrei Alexandrescu (SeeWebsiteForEmail erdani.org)'s article
 On 1/27/11 8:02 PM, Walter Bright wrote:
 I think one of the reasons DbC has not paid off is it still requires a
 significant investment of effort by the programmer. It's too easy to not
 bother.
One issue with DbC is that its only significant advantage is its interplay with inheritance. Otherwise, scope() in conjunction with assert works with less syntactic overhead. So DbC tends to shine with large and deep hierarchies... but large and deep hierarchies are not that a la mode anymore.
DbC opens many interesting possibilities if it's supported by tools other than just the compiler. MS has included it in .NET 4.0: http://research.microsoft.com/en-us/projects/contracts/
Hm, I'm seeing in section 3 of userdoc.pdf that they don't care for precondition weakening: "While we could allow a weaker precondition, we have found that the complications of doing so outweigh the benefits. We just haven’t seen any compelling examples where weakening the precondition is useful. So we do not allow adding any preconditions at all in a subtype." They do, however, allow strengthening postconditions. Overall the project looks very interesting and has quite recognizable names from the PL community. Hopefully it will increase awareness of contract programming. Andrei
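For comparison, this is how those rules surface in D's contract inheritance; a hypothetical sketch in which the override weakens the precondition (a call is accepted if the base's in contract or the override's in contract passes), while all out contracts in the chain are enforced:

class Base
{
    int compute(int x)
    in
    {
        assert(x > 0);          // Base demands strictly positive input
    }
    out (result)
    {
        assert(result >= 0);
    }
    body
    {
        return x;
    }
}

class Derived : Base
{
    override int compute(int x)
    in
    {
        assert(x >= 0);         // weaker precondition: zero is now allowed
    }
    out (result)
    {
        assert(result >= x);    // stronger guarantee, checked together with Base's
    }
    body
    {
        return x + 1;
    }
}

So (new Derived).compute(0) passes the contracts even though Base's in block alone would reject it, which is exactly the weakening that the .NET designers chose not to allow.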
Jan 28 2011
prev sibling parent retard <re tard.com.invalid> writes:
Fri, 28 Jan 2011 16:14:27 +0000, Roman Ivanov wrote:

 == Quote from Andrei Alexandrescu (SeeWebsiteForEmail erdani.org)'s
 article
 On 1/27/11 8:02 PM, Walter Bright wrote:
 I think one of the reasons DbC has not paid off is it still requires
 a significant investment of effort by the programmer. It's too easy
 to not bother.
One issue with DbC is that its only significant advantage is its interplay with inheritance. Otherwise, scope() in conjunction with assert works with less syntactic overhead. So DbC tends to shine with large and deep hierarchies... but large and deep hierarchies are not that a la mode anymore.
DbC opens many interesting possibilities if it's supported by tools other than just the compiler. MS has included it in .NET 4.0: http://research.microsoft.com/en-us/projects/contracts/
Mono 2.8 also seems to support those: http://www.mono-project.com/news/archive/2010/Oct-06.html
Jan 28 2011
prev sibling parent reply Ellery Newcomer <ellery-newcomer utulsa.edu> writes:
On 01/27/2011 05:41 PM, Walter Bright wrote:
 Unit testing has produced a dramatic improvement in coding.
agreed. unit testing (maybe with dbc, I don't remember) was the only reason I noticed issue 5364.
Jan 27 2011
parent "Steven Schveighoffer" <schveiguy yahoo.com> writes:
On Thu, 27 Jan 2011 21:37:31 -0500, Ellery Newcomer  
<ellery-newcomer utulsa.edu> wrote:

 On 01/27/2011 05:41 PM, Walter Bright wrote:
 Unit testing has produced a dramatic improvement in coding.
agreed. unit testing (maybe with dbc, I don't remember) was the only reason I noticed issue 5364.
I created 4 or 5 bugs against dmd when I added full unit tests to dcollections. So unittests also help D probably as much as they do your code ;) -Steve
Jan 28 2011
prev sibling parent reply Ulrik Mikaelsson <ulrik.mikaelsson gmail.com> writes:
 You mean that if you give an index which is too large, it just uses $
 instead?
 That sounds seriously bug-prone to me. I'd much rather that it blew up and
 thus
 told me that my program had a bug in it rather than silently trying to
 work.
Sorry, but you are wrong on this. I understand this sounds unsafe, but no. Most languages, I guess, just do that without any worry. In particular, I have frequented python and Lua mailing lists for years without even reading once about this beeing a misfeature (and indeed have never run into a bug because of this myself). It is simply the right semantics in 99.999% cases.
Isn't the real reason for this that bounds-checking is usually completely turned-off in release-builds? Sounds like something that could noticeably degrade runtime-performance for array-intensive code?
Jan 27 2011
parent reply bearophile <bearophileHUGS lycos.com> writes:
Ulrik Mikaelsson:

 Isn't the real reason for this that bounds-checking is usually
 completely turned-off in release-builds?
Bounds checking is turned off in release builds mostly because: 1) DMD is not able to infer & remove most bound checks at compile-time as the latest Oracle VM are able to do; 2) and because Walter & Co. believe such analysis isn't able to remove most bound checks anyway (I have not seen this hypothesis confirmed yet).
 Sounds like something that could noticeably degrade runtime-performance
 for array-intensive code?
The safety net we are talking about is present only at the right bound of a slicing syntax (it's not performed in normal array indexing), and it consists in a single min(x, $) operation, that's one branch. So it slows down code, but only a bit. And in many of such situations you probably need to add a min(x, $) manually in the code anyway, so the overhead is limited. Bye, bearophile
Jan 27 2011
next sibling parent reply Andrej Mitrovic <andrej.mitrovich gmail.com> writes:
Honestly, I think this would just encourage writing sloppy code.

Using min(x, $) explicitly informs the reader of the code of exactly
what happens. It's harder to tell when it's implicit. Not only that,
but it can introduce bugs in your code - because while you might use
any upper bound, you're still not allowed to index beyond the length
of the array. Imagine this could was accepted by DMD:

void main()
{
    auto userArgs = ["foo"];  // runtime arguments by some user input

    auto firstTwo = userArgs[0..2];  // accepted by the new language change,
                                     // implicitly changes 2 to min(2, $)

    // more code here

    firstTwo[1] = "test";  // oops!
}
Jan 27 2011
parent reply bearophile <bearophileHUGS lycos.com> writes:
Andrej Mitrovic:

 Honestly, I think this would just encourage writing sloppy code.
I don't believe this, on the other hand I believe the current behaviour is bug-prone.
 Using min(x, $) explicitly informs the reader of the code of exactly
 what happens.
Right. The problem is that in some cases you may forget to add that min(x, $).
 It's harder to tell when it's implicit.
Slicing syntax is used all the time in both Python and D code, so the Python and D programmers learn quickly and keep in mind what it does. Implicit behaviours are a problem when they are uncommon things, or library functions, etc, much less when they are built-in safety nets you use every five lines of code :-) Not only that,
 but it can introduce bugs in your code - because while you might use
 any upper bound, you're still not allowed to index beyond the length
 of the array.
Writing hundreds of thousands of lines of Python code I've seen that the saturating semantics we are talking about is not bug-prone (in fact it avoids some bugs).
 Imagine this could was accepted by DMD:
 
 void main()
 {
     auto userArgs = ["foo"];  // runtime arguments by some user input
 
     auto firstTwo = userArgs[0..2];  // accepted by the new language change,
                                                   // implicitly
 changes 2 to min(2, $)
 
     // more code here
 
     firstTwo[1] = "test";  // oops!
 }
Programmers just quickly learn that indexes and slices have a bit different semantics. Bye, bearophile
Jan 27 2011
next sibling parent Andrej Mitrovic <andrej.mitrovich gmail.com> writes:
On 1/27/11, bearophile <bearophileHUGS lycos.com> wrote:
 Programmers just quickly learn that indexes and slices have a bit different
 semantics.
Right, programmers can easily adapt to slightly different semantics in related areas of the language, I mean C++ is known for being very easy to learn for exactly that reason. (Sarcasm alert!).
Jan 27 2011
prev sibling parent Daniel Gibson <metalcaedes gmail.com> writes:
Am 27.01.2011 19:25, schrieb bearophile:
 
 Programmers just quickly learn that indexes and slices have a bit different
semantics.
 
And I guess this holds true for programmers used to python as well? So they should have no problem with learning to use foo = blah[i..min($,i+n)] instead of foo = blah[i..(i+n)] ;-)
 Bye,
 bearophile
Cheers, - Daniel
Jan 27 2011
prev sibling next sibling parent reply Jonathan M Davis <jmdavisProg gmx.com> writes:
On Thursday, January 27, 2011 09:47:44 bearophile wrote:
 Ulrik Mikaelsson:
 Isn't the real reason for this that bounds-checking is usually
 completely turned-off in release-builds?
Bounds checking is turned off in release builds mostly because: 1) DMD is not able to infer & remove most bound checks at compile-time as the latest Oracle VM are able to do; 2) and because Walter & Co. believe such analysis isn't able to remove most bound checks anyway (I have not seen this hypothesis confirmed yet).
 Sounds like something that could noticeably degrade runtime-performance
 for array-intensive code?
The safety net we are talking about is present only at the right bound of a slicing syntax (it's not performed in normal array indexing), and it consists in a single min(x, $) operation, that's one branch. So it slows down code, but only a bit. And in many of such situations you probably need to add a min(x, $) manually in the code anyway, so the overhead is limited.
Personally, I have _never_ needed to add a min call like that. So, while _you_ may think that it's a common thing to do (and it may be that it is in your code), that doesn't mean that that's generally true. - Jonathan M Davis
Jan 27 2011
parent reply Peter Alexander <peter.alexander.au gmail.com> writes:
On 27/01/11 6:14 PM, Jonathan M Davis wrote:
 The safety net we are talking about is present only at the right bound of a
 slicing syntax (it's not performed in normal array indexing), and it
 consists in a single min(x, $) operation, that's one branch. So it slows
 down code, but only a bit. And in many of such situations you probably
 need to add a min(x, $) manually in the code anyway, so the overhead is
 limited.
Personally, I have _never_ needed to add a min call like that. So, while _you_ may think that it's a common thing to do (and it may be that it is in your code), that doesn't mean that that's generally true. - Jonathan M Davis
I agree. I've never needed to add a bounds check like that.
Jan 29 2011
parent bearophile <bearophileHUGS lycos.com> writes:
Peter Alexander:

 I agree. I've never needed to add a bounds check like that.
While my string processing D code contains many of them :-) I presume I am thinking in Python there. Bye, bearophile
Jan 29 2011
prev sibling parent Andrej Mitrovic <andrej.mitrovich gmail.com> writes:
This *code* was accepted, not *could*.
Jan 27 2011
prev sibling next sibling parent "Nick Sabalausky" <a a.a> writes:
"Walter Bright" <newshound2 digitalmars.com> wrote in message 
news:ihq66j$2llc$1 digitalmars.com...
 Trass3r wrote:
 But once you had a test drive, you just can't get out anymore.
I've had more than one longtime C++ expert tell me that after using D for a while, then for work reasons get forced back into C++, just find themselves cringing every time they edit it.
Heh, D is programmer crack :) I use Haxe a lot and it's by no means a bad language overall, as far as languages go. Beats the snot out of PHP and ActionScript. But I frequently find myself cursing at it for not being able to handle things that I take for granted in D. Heck, just the other day I was going nuts because I had 5 objects that I needed to do some common trivial setup to, and there was *no* way to do it in a loop. No change I made ever "stuck" - it was as if the objects forgot they were reference types. Nothing worked. At least a full half-hour later, maybe a full hour, I ended up just copy-pasting the same damn code 5 times, once for each object. In D, I would have stuck them all in an array, typed "foreach...", and been done in under a minute.
Jan 26 2011
prev sibling next sibling parent Trass3r <un known.com> writes:
 Trass3r wrote:
 But once you had a test drive, you just can't get out anymore.
I've had more than one longtime C++ expert tell me that after using D for a while, then for work reasons get forced back into C++, just find themselves cringing every time they edit it.
Yep, like being thrown back to the Stone Age.
Jan 26 2011
prev sibling parent reply retard <re tard.com.invalid> writes:
Wed, 26 Jan 2011 14:09:25 -0800, Walter Bright wrote:

 Trass3r wrote:
 But once you had a test drive, you just can't get out anymore.
I've had more than one longtime C++ expert tell me that after using D for a while, then for work reasons get forced back into C++, just find themselves cringing every time they edit it.
I also got brainwashed by the C++ advocates years ago. However, I didn't need D to see how terrible writing C++ is. C++ sure is a powerful language and sometimes a necessary evil, but you don't really need very strong doses of more recent languages to see how much nicer everything else is.
Jan 27 2011
parent reply Walter Bright <newshound2 digitalmars.com> writes:
retard wrote:
 I also got brainwashed by the C++ advocates years ago. However, I didn't 
 need D to see how terrible writing C++ is. C++ sure is a powerful 
 language and sometimes a necessary evil, but you don't really need very 
 strong doses of more recent languages to see how much nicer everything 
 else is.
What D uniquely shows, however, is that one can do C++-like things without the C++ issues; in other words, the problems C++ has are not necessary in order to get the powerful things C++ can do.
Jan 27 2011
next sibling parent Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
On 1/27/11 1:26 PM, Walter Bright wrote:
 retard wrote:
 I also got brainwashed by the C++ advocates years ago. However, I
 didn't need D to see how terrible writing C++ is. C++ sure is a
 powerful language and sometimes a necessary evil, but you don't really
 need very strong doses of more recent languages to see how much nicer
 everything else is.
What D unique shows is, however, that one can do C++ like things without the C++ issues, in other words, the problems C++ has are not necessary in order to get the powerful things C++ can do.
Well put. BTW I'm glad to see retard back in action. A small debate of bearophile about bounds checking and whatnot, Bruno doing his monthly bulk answering to posts, retard being down on D... it all starts looking like a good day on digitalmars.D :o). Andrei
Jan 27 2011
prev sibling parent darenw <darenw darenscotwilson.com> writes:
"the problems C++ has are not necessary in order to get
the powerful things C++ can do."

That thought makes me happy!
Jan 27 2011
prev sibling next sibling parent Jonathan M Davis <jmdavisProg gmx.com> writes:
On Wednesday 26 January 2011 17:52:19 spir wrote:
 On 01/27/2011 02:11 AM, Jonathan M Davis wrote:
 On Wednesday, January 26, 2011 16:41:10 spir wrote:
 On 01/26/2011 11:33 PM, Trass3r wrote:
 For me, D's killer features were string handling (slicing and
 appending/concatenation) and *no header files*. (No more header
 files!! Yay!!!). But auto is fantastic too though, I get sooo much
 use out of that.
Getting rid of the pointer crap (proper arrays, bounds checking, classes as reference types,...) is definitely among the top 10 on my list.
Same here. But I would prefere slicing not to check upper bound, rather just extend to the end. Or have a slicing variant do that.
You mean that if you give an index which is too large, it just uses $ instead? That sounds seriously bug-prone to me. I'd much rather that it blew up and thus told me that my program had a bug in it rather than silently trying to work.
Sorry, but you are wrong on this. I understand this sounds unsafe, but no. Most languages, I guess, just do that without any worry. In particular, I have frequented python and Lua mailing lists for years without even reading once about this beeing a misfeature (and indeed have never run into a bug because of this myself). It is simply the right semantics in 99.999% cases. spir d:~$ python Python 2.6.6 (r266:84292, Sep 15 2010, 15:52:39) [GCC 4.4.5] on linux2 Type "help", "copyright", "credits" or "license" for more information.
 s = 'abc'
 s[0:123456789]
'abc' spir d:~$ lua Lua 5.1.4 Copyright (C) 1994-2008 Lua.org, PUC-Rio
 require"io"
 s = "abc"
 print(string.sub(s, 1, 123456789))
abc I'm constantly annoyed by D's behaviour. For instance, often have to write out the end of a string from a given point, but only at most n chars (to avoid cluttering the output, indeed): writeln(s[i..i+n]); which fails if there are less than n remaining chars ;-)
I _rarely_ see cases where I would consider it okay to give an index for the end of an array and have that index be too large. It invariably means that your algorithm is wrong. In my experience, you really need to know what the size of your array is and handle it properly. There _are_ cases where you say that you want "the rest" of the array or collection and don't care how much that is, but in cases where you're actually looking to specify the index, if the index is wrong, then the code is wrong. Now, I suppose that there are cases where you could simplify an algorithm by asking for n elements and taking n or less (if there aren't n elements left). But that's definitely atypical in my experience, and writing a wrapper function for such a case is trivial. Generally speaking, I'd be very worried about code which was lax enough about indices to not care whether it was indexing past the end of the array or not. Clearly, if you think that not being strict about indices is a good idea, you're either dealing with very different circumstances than I have and/or you're coding very differently. Regardless, since it's trivial to create a wrapper that does what you want, I don't think that there's any reason to change how slicing works. - Jonathan M Davis
Jan 26 2011
prev sibling next sibling parent Jonathan M Davis <jmdavisProg gmx.com> writes:
On Wednesday 26 January 2011 19:06:37 Jonathan M Davis wrote:
 On Wednesday 26 January 2011 17:52:19 spir wrote:
 On 01/27/2011 02:11 AM, Jonathan M Davis wrote:
 On Wednesday, January 26, 2011 16:41:10 spir wrote:
 On 01/26/2011 11:33 PM, Trass3r wrote:
 For me, D's killer features were string handling (slicing and
 appending/concatenation) and *no header files*. (No more header
 files!! Yay!!!). But auto is fantastic too though, I get sooo much
 use out of that.
Getting rid of the pointer crap (proper arrays, bounds checking, classes as reference types,...) is definitely among the top 10 on my list.
Same here. But I would prefere slicing not to check upper bound, rather just extend to the end. Or have a slicing variant do that.
You mean that if you give an index which is too large, it just uses $ instead? That sounds seriously bug-prone to me. I'd much rather that it blew up and thus told me that my program had a bug in it rather than silently trying to work.
Sorry, but you are wrong on this. I understand this sounds unsafe, but no. Most languages, I guess, just do that without any worry. In particular, I have frequented python and Lua mailing lists for years without even reading once about this beeing a misfeature (and indeed have never run into a bug because of this myself). It is simply the right semantics in 99.999% cases. spir d:~$ python Python 2.6.6 (r266:84292, Sep 15 2010, 15:52:39) [GCC 4.4.5] on linux2 Type "help", "copyright", "credits" or "license" for more information.
 s = 'abc'
 s[0:123456789]
'abc' spir d:~$ lua Lua 5.1.4 Copyright (C) 1994-2008 Lua.org, PUC-Rio
 require"io"
 s = "abc"
 print(string.sub(s, 1, 123456789))
abc I'm constantly annoyed by D's behaviour. For instance, often have to write out the end of a string from a given point, but only at most n chars (to avoid cluttering the output, indeed): writeln(s[i..i+n]); which fails if there are less than n remaining chars ;-)
I _rarely_ see cases where I would consider it okay to give an index for the end of an array and having that index be too large is a good thing. It invariably means that your algorithm is wrong. In my experience, if you really need to know what the size of your array is and handle it properly. There _are_ cases where you say that you want "the rest" of the array or collection and don't care how much that is, but in cases where you're actually looking to specify the index, if the index is wrong, then the code is wrong. Now, I suppose that there are cases where you could simplify an algorithm where you effectively be n or less (if there aren't n elements left). But that's definitely atypical in my experience, and writing a wrapper function for such a case is trivial. Generally speaking, I'd be very worried about code which was lax enough about indices to not care whether it was indexing passed the end of the array or not. Clearly, if you think that not being strict about indices is a good idea, you're either dealing with very different circumstances than I have and/or you're coding very differently. Regardless, since it's trivial to create a wrapper that does what you want, I don't think that there's any reason to change how slicing works.
I should probably add that it's more efficient to have the built-in slicing facilities be exact and build inexact ones on top of that than it is to make them inexact and then build exact ones on top. When you get down to it, you _have_ to slice _exactly_ when you get down deep enough in the code. So, making the slicing then be exact on top of that doesn't really cost anything. But adding the extra checks to make the inexact one work adds extra cost. So, while that's fine if the inexact behavior is what you want, it's _not_ fine if it's the exact behavior that you want, since then unnecessary checks are happening in the middle. For efficiency reasons, the exact behavior _needs_ to be the default. exact -> low level exact and inexact -> exact -> low level exact works as efficiently as is possible. Whereas having inexact -> low level exact and exact -> inexact -> low level exact is _not_ efficient. - Jonathan M Davis
Jan 26 2011
prev sibling next sibling parent spir <denis.spir gmail.com> writes:
On 01/27/2011 04:06 AM, Jonathan M Davis wrote:
 Clearly, if you think that not being strict about indices is a good idea,
you're
 either dealing with very different circumstances than I have and/or you're
coding
 very differently. Regardless, since it's trivial to create a wrapper that does
 what you want, I don't think that there's any reason to change how slicing
 works.
I would agree with you by reasoning "in pure air" (I mean without having used both semantic versions). But concrete experience (of a huge number of programmers) in widely used programming languages demonstrates the opposite. Slice /upper/ bound checking is an idea that looks good in theory but not in practice. Denis -- _________________ vita es estrany spir.wikidot.com
Jan 27 2011
prev sibling parent Bruno Medeiros <brunodomedeiros+spam com.gmail> writes:
Although I don't find the original post to be a troll, (despite the 
title), I could not resist poking some fun...


---- Still Alive? ----

This was a triumph!
I'm making a note here: "HUGE SUCCESS!!"

It's hard to overstate, my satisfaction.

D programming Language:
We do what we must, because we can.

For the good of all of us.
Except the ones who are dead.

But there's no sense crying, over every mistake.
You just keep on trying, till you run out of code.
And the desiderata gets achieved.
And you make a neat module, for the people who are, still alive.

I'm not even angry...
I'm being so sincere right now-
Even though you broke my compiler, and killed me.
And tore the linker to pieces.
And threw every piece into a fire.
As they burned it hurt because, I was so happy for you!

Now, these points of data, make a beautiful line.
And we're out of beta. We're releasing on time!
So I'm GLaD I got burned-
Think of all the things we learned-
for the people who are, still alive.


Go ahead and leave me...
I think I'd prefer to stay inside...
Maybe you'll find someone else, to help you?
Maybe the Go Language?
THAT WAS A JOKE! HAHA - FAT CHANCE.

Anyway this code is great!
It's so delicious and moist!

Look at me: still talking, when there's science to do!
When I look out there, it makes me GLaD I'm not you.

I've experiments to run.
There is research to be done.
On the people who are, still alive.
And believe me I am, still alive.
I'm doing science and I'm, still alive.
I feel fantastic and I'm, still alive.
While you're dying I'll be, still alive.
And when you're dead I will be, still alive.

-- 
Bruno Medeiros - Software Engineer
Feb 04 2011