
digitalmars.D - D is crap

reply D is crap <D crap.com> writes:
Sorry, I've spent the last month trying my best to get simple 
shit done. At every turn there is some problem that has to be 
dealt with that is unrelated to my actual work. Be it the IDE, 
debugging, the library, or user shared code, it is just crap. D 
cannot be used successfully for semi-large projects without 
significant man hours invested in getting everything to work.

I'm sorry, but it seems there is too much focus on enhancements 
while there are too many bugs and too little tooling to actually 
do anything useful.

I'm sorry if this hurts some of you guys' feelings, but it is a 
fact that D sucks as a whole. A modern programming language 
should be modern, and D is not one of those languages. It is 
built by cobbling together disparate parts that don't work 
together. The more it is used and added on to, the worse it gets.

I'm sorry many of you have spent your lives working on something 
that won't amount to much in the real world. It was a valiant 
effort. Maybe it will seed new languages that will actually work 
and have the tooling to be useful. D itself is not a practical 
tool for general real world software development. I suppose it is 
obvious considering there is no significant use of it except by 
the diehards.

I hope I am wrong, but facts are facts. I would like to list 
these facts so potential users can be informed before they 
embark on a likely dead end.


1. The language is not completely well defined. While the 
language itself contains many nice features, which are what make 
D appealing, too many features are cobbled together and don't 
work 100% properly. This creates very subtle bugs or problem 
areas that can stop one in their tracks. One can see how these 
things are cobbled together by observing the forums and the 
discussions about how to proceed in certain areas.

2. The compilation process is riddled with meaningless error 
messages, and a simple missing ';' can launch one off to the 
moon trying to figure out where it is missing. The error 
messages can cascade: fix the ';' and 20 messages disappear. 
Usually each message is 100+ characters when it involves 
templates.

Rather than just telling you what is grammatically missing, like 
any normal modern compiler does, you have to hunt and peck and 
fill your head with meaningless information.

3. The compilers are designed as if they come straight out of 
the 70's. The setup is obscure, relies on assumptions that are 
not true, and, just like the compilation process, if you're 
unlucky you could be working for a few days just to try to get 
dmd to compile.

4. Object code issues, again stuff from the 70's, are still 
present. Rather than fix the shit outright, knowing it is 
problematic, the can is kicked down the road to the unsuspecting 
users, who generally know less about what is going on than the 
people who design the software. The people who could fix the 
problem directly and save everyone a lot of grief don't, because 
they feel it isn't a big deal and they have more important 
things to do.

5. The documentation is somewhat crappy. While it is extensive 
and auto generated, it is generally only a "template" of what a 
real user needs. Rather than anything beyond the function 
declaration, usually with nondescript template names like R, S, 
U, etc., about half the functions are missing use cases. I 
realize this takes work, but it could be done by the people 
writing the code; again, they know best, right?

6. The library is meandering in its design. It feels very 
convoluted at times, cobbled together rather than designed 
properly from the get go. Updated language features create a 
cascade of library modifications. "Let's move this to this and 
remove that and then add this... oh, but we gotta update the 
docs, we'll do that tomorrow...".

7. The library uses shit for names. Ok, so strip isn't too bad, 
but why not trim? That's what everyone else uses. Ok, what about 
chomp? munch? squeeze? What the fuck is going on? Did the 
perverted Cookie Monster write this shit? What about the 
infamous tr? Yeah, just because POSIX said it was ok, it must be 
so. I'd say we call it t instead.

I could go on and on about stuff like this but I have more 
important things to do, like

8. Let's use vim or emacs. I liked the 70's, it was great. So 
great that I want to continue using the same editors, because we 
know them well and they work... and they're free! I like coding 
at the pace of a turtle with minimal information because that's 
hard core leet style and makes my balls bigger, which my wife 
likes.

Oh, what about Visual Studio? Don't get me started! Maybe if 
Visual D/Mago actually worked half the time and didn't slow me 
down I'd use that. Xamarin? Worse!

Maybe it's time to get out of the dark ages and actually design 
a program that is designed for creating programs? Not just a 
fucking text editor that has a few helpful things that 
programmers might use. Do we still have to code in text files? 
How about we just write everything in binary? Ok, sorry... 
getting OT.

Basically there is no great IDE for D; in fact, there is none. 
They are all general purpose IDEs that have been configured to 
compile D code. Great! Except they don't work well, because they 
weren't designed for D (e.g., template debugging? mixins? error 
messages? code maps? refactoring?). All the stuff that more 
modern languages and IDEs are using is lacking for D.

9. What I do like about D is that it can compile for various 
platforms rather easily. Usually I do something like -m64 and 
run the app, then it crashes. I don't know why, because there 
was no error message. The point is that while D can "compile" 
for various platforms, it is always an "ongoing process".

Because 9/10 D programmers program on linux, windows support is 
minimal and buggy. Since I don't use linux, because windows has 
a much larger market share, maybe D is great on linux. On 
windows, though, it's a literal pain in the ass. All the time I 
spend trying to figure out how to get things to work properly 
has given me hemorrhoids. God did not design Man's ass to sit in 
front of a computer all day. BTW, a program can't just "work" 
for me: I have clients that have a certain level of expectation, 
like no seg faults. Just because it works for me, or for you, is 
not good enough. It has to work for everyone.

10. Most user contributed D packages are outdated. They simply 
don't work anymore due to all the language changes. Instead of 
culling the crap, it persists and the user has to wade through it 
all. It's every man for himself when it comes to D.

11. D has no proper GUI. WTF? This isn't the 70's, no matter how 
much you want to relive peace and sex. Oh, did I hear someone 
say bindings? WTF?

12. D has no proper logging system. I don't just want to log a 
message, I want a well designed and easily manageable way to 
understand problems my program is experiencing. Given that D has 
so many latent issues, it's nice to have some way to deal with 
the "Bigfoot" (the bug that you only see when you're driving 
down a windy dark road in Nebraska).

13. Windows interfacing. Thanks for the bindings! The most used 
OS in the world with the largest commercial share only gets 
bindings, which is actually like programming in win32. Rather 
than at least wrapping them in a working OOP design that hides 
away the abacus feel, we are stuck with bindings. The D 
community loves bindings; I have no fucking clue why. It just 
means more work. At least if I didn't have the bindings I 
wouldn't have to implement anything.

14. Gaming? It can be done, not by me or you, but by geniuses 
who live in their basements with no one to support and nothing 
else to do but hash out how to get it done. But while they might 
share their results, don't get your hopes up and expect it to 
work for you.

15. Cross platform design? Maybe. Supposedly it works, but I 
have too much trouble with windows to care about adding another 
layer of crap on top.

16. The community. While not worse than most others, it doesn't 
really give a shit about the real world programmer. The elite 
are too busy thinking of ways to add the latest and greatest 
feature, thinking it will attract more buyers. The rabble 
rousers like myself don't actually do much. Ultimately things 
get done but nothing useful happens. Kinda like those jackasses 
that floor it with their foot on the brake. A lot of smoke, but 
pointless. D's been going how long? 10+ years?

The majority of you guys here don't realize that the average 
programmer can't waste time creating a GUI, implementing a 
design for the bindings you created, patching together different 
packages that don't work together, etc.

While there are, I'm sure, a lot of good intentions, they are 
meaningless when it comes to putting food on the table. If you 
are going to do something, do it with gusto, not half-assed. If 
you are going to design a package, do it right! Not something 
that continually requires fixing and affects every person that 
uses it exponentially. Every minute I spend fixing someone 
else's shit takes a minute away from my life. For N-1 other 
users that's N minutes wasted because the original creator 
didn't take the extra minute. Thanks for wasting all of our 
time. That's a factor of N. Now when we have to do that for M 
packages, that's M*N people's shit we have to fix. All because 
one person didn't want to spend one extra minute fixing their 
own shit. Ok, so it might not be exponential, but it's still 
time that could be better spent on more important things.



17 ...


18. As you can see, I've run out of steam. My butt hurts and I 
have better things to do... like delete dmd from my computer. At 
least that's simple and works! (I hope; maybe it will seg fault 
on me, or I'll have to run some special command line switch.)


19. PS. Ok, so, D isn't as terrible as I'm making it out. It's 
free. And as they say, you get what you pay for ;)

20. I hope the D community can come together at some point and 
work towards a common goal that will benefit humanity. It's a 
mini-cosmos of what is going on in the world today. Everyone is 
in it for themselves and they don't realize the big picture and 
how every little thing they do has an enormous impact on the 
human species.  We aren't doing this stuff for fun, we do it to 
make a better life for ourselves, which means we also have to do 
it for everyone else (because we are all in it together).

There is so much time and life wasted by so many people for so 
many useless reasons that we could have built a bridge, brick by 
brick, to the moon and back, a dozen fold. Humanity is an 
amazing species capable of unimaginable things. By extension, so 
is the D community. I just hope D doesn't end up like the 
Kardashians, as it has so much more use for humanity.

00. Please get your shit together! I mean that in the best 
possible way!
Jul 02 2016
next sibling parent rikki cattermole <rikki cattermole.co.nz> writes:
Here is what I'm going to say and people will probably just reiterate it.

1) Report problems. Can't be fixed if we don't know it.
2) We're working on it. No really, I get it, you want to have a GUI 
toolkit yesterday but I don't think you understand just how expensive it 
is, dependency-wise. It's not like we only have one platform to worry 
about; Windows is the easiest platform of all to target here.
3) What did I say about working on it? No really, I do mean it. Adam D 
Ruppe is working on a replacement docs generation system[0] which is 
significantly better than what we currently have.
4) No one here is paid. It is truly amazing that D can do what it does 
do. We do our best with what we have, even if it means we must look long 
term instead of short term.

[0] http://dpldocs.info/experimental-docs/std.string.html
Jul 02 2016
prev sibling next sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
Thanks for taking the time to write this. Let me see if I can help.

On 7/2/2016 9:37 PM, D is crap wrote:
 1. The language is not completely well defined. While the language itself
 contains many nice features and what makes D appealing, too many features are
 cobbled together and don't completely work 100% properly. This creates very
 subtle bugs or problem areas that can stop one in their tracks. One can see how
 these things are cobbled together by observing the forums and the discussions
 about how to proceed in certain areas.
This is true. I'm interested in which of these have caused you problems. Naturally, people in the forum are going to debate the edge cases, as they do in every language. It isn't necessary to use those edge cases to write very successful code, however.
 2. The compilation process is riddled with meaningless error messages and a
 simple missing ';' can launch one off to the moon to figure out it is missing.
 The error messages can cascade. Fix the ';' and 20 messages disappear.
I agree this is true for the syntax checker, but also I'm not aware of any compiler that has successfully dealt with this problem other than simply stopping after the first message. The semantic pass is much better at having one and only one error message per actual error.
 Usually each message is 100 characters+ when it involves templates.
Again, you are right, but this is a consequence of templates being complex. The compiler tries to emit all the relevant context to enable the user to figure out the right fix.
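
For instance (an illustrative snippet; the exact message text varies by compiler version), misusing a Phobos template triggers one of those long messages, because the compiler prints the candidate and its failed constraints:

    import std.algorithm : sort;

    void main()
    {
        int x = 42;
        sort(x);   // not a range: dmd lists sort's signature and the
                   // template constraints that failed to match, easily
                   // 100+ characters of context
    }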
 Rather than just telling you what is grammatically missing, like any normal
 modern compiler does, you have to hunt and peck and fill your head with
 meaningless information.
Most of the error messages out of the syntax checker are of the form "xxx expected", so if you could present an example of a bad error message I'd appreciate it so it can be improved.
 3. The compilers are designed as if they come straight out of the 70's.  The
 setup is obscure, relies on assumptions that are not true, and just like the
 compilation process, if you're unlucky you could be working for a few days just
to
 try to get dmd to compile.
I'd like to help, but I need more specific information.
 4. Object code issues, again, stuff from the 70's are still present.  Rather
 than fix the shit outright, knowing they are problematic, the can is kicked
down
 the road to the unsuspecting users. The users, who generally know less about
 what is going on than the people who design the software. Those people who can
 fix the problem directly and save a lot of grief for people don't because they
 feel it isn't a big deal and they have more important things to do.
I don't understand the issue here. On Windows, it generates the Microsoft COFF object file, the same as Microsoft VC++. On the Mac, it generates Mach-O object files, the same as the Apple compilers do. On Linux/FreeBSD, it generates ELF object files, the same as gcc/clang do. The linkers, debuggers, librarians, and other object code tools used are all the standard ones on those systems. The D tools are designed to fit right in with the usual command line ecosystem on the particular host system, and to be immediately comfortable for those used to those systems.
 5. The documentation is somewhat crappy. While it is extensive and auto
 generated it generally is only a "template" of what a real user needs. Rather
 than just the function declaration, usually with nondescript template names
like
 R, S, U, etc about half the functions are missing use cases.
You're correct about that, and I've ranted about it often enough. If you have a specific function that has caused you particular grief, please let us know!
 6. The library is meandering in its design. Feels very convoluted at times,
 cobbled together rather than designed properly from the get go. Updated
language
 features creates a cascade of library modifications. "Lets move this to this
and
 remove that and then add this... oh, but we gotta update the docs, we'll do
that
 tomorrow...".
The good news is that no release is done unless the library works with it.
 7. The library uses shit for names. Ok, so strip isn't too bad but why not
trim?
 That's what every one else uses. Ok, what about chomp? munch? squeeze? What the
 fuck is going on? Did the perverted Cookie Monster write this shit?
 What about the infamous tr? Yeah, just cause posix said it was ok then it must
 be so. I'd say we call it t instead.
strip, chomp, squeeze, tr all come from existing functions in the Python, Ruby, and JavaScript standard libraries.
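
For reference, here is roughly how those names behave in std.string (illustrative asserts, written from memory):

    import std.string;

    unittest
    {
        assert("  hi  ".strip == "hi");          // trim both ends
        assert("hello\n".chomp == "hello");      // drop a trailing newline
        assert("aabbcc".squeeze == "abc");       // collapse adjacent repeats
        assert("abc".tr("abc", "xyz") == "xyz"); // translate characters
    }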
 Basically there is no great IDE for D, in fact, there is none. They are all
 general purpose IDE's that have been configured to compile D code. Great!
Except
 they don't work well because they weren't designed for D. (e.g., template
 debugging? mixins? Error messages? Code maps? refactoring? All the stuff that
 more modern languages and IDE's are using is lacking for D.
You're right, there is no modern IDE for D. It's not an easy thing to deal with, however. Doing one is a major project.
 9. What I do like about D is that it can compile for various platforms rather
 easy. Usually I do something like -m64 and run the app then it crashes. I don't
 know why because there was no error message. The point is that while D can
 "compile" for various platforms it is always an "on going process".
If the program is compiled with -g and it crashes (seg faults) you'll usually at least get a stack trace. Running it under a debugger will get you much more information.
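
A minimal sketch of that workflow (the file name is made up):

    dmd -g -m64 app.d    # -g emits debug info
    ./app                # a seg fault now usually prints a stack trace
    gdb ./app            # or inspect the crash under a debugger:
                         #   (gdb) run
                         #   (gdb) bt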
 Because 9/10 D programmers program in linux, windows support is minimal and
 buggy. Since I don't use linux, because windows has a much larger market share,
 maybe D is great on linux. On windows though, it's a literal pain in the ass.
I actually develop dmd primarily on Windows. I'd like some specifics on how dmd for Windows is an inferior experience. One thing that the D tools and libraries are very good at is smoothly handling the differences in the file systems (case, \, /, line endings, etc.), far better than most cross-platform tools. I'm looking at you, git, for a rotten Windows experience :-)
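
For instance, std.path abstracts away the separator differences (a small illustration):

    import std.path : buildPath;

    // yields "src\main.d" on Windows and "src/main.d" elsewhere
    auto path = buildPath("src", "main.d");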
 10. Most user contributed D packages are outdated. They simply don't work
 anymore due to all the language changes. Instead of culling the crap, it
 persists and the user has to wade through it all. It's every man for himself
 when it comes to D.
There's been discussion of automating DUB so that it will mark packages that no longer build. I don't know what the state of that is.
 11. D has no proper Gui. WTF?  This isn't the 70's no matter how much you want to
 relive peace and sex. Oh, did I hear someone say bindings? WTF?
Many people have embarked down that road over the years, and all have failed. The problems are:

1. Which GUI? Nobody agrees on that.
2. Any GUI is simply an enormous amount of work; the realization of this comes when the GUI projects fail.

Bindings are the best we can do for now.
 12. D has no proper logging system. I just don't want to log a message, I want
a
 well designed and easily manageable way to understand problems my program is
 experiencing.
There was an std.logger proposed, but I don't know the state of it.
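
For what it's worth, a proposed design had by then landed as std.experimental.logger; a minimal sketch of its use, assuming the default stderr logger:

    import std.experimental.logger;

    void main()
    {
        info("starting up");
        warningf("disk at %s%%", 95);  // printf-style variants exist too
        // destination and level are configurable via sharedLog
    }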
 13. Windows interfacing. Thanks for the bindings! The most used OS in the world
 with the largest commercial share only gets bindings that is actually like
 programming in win32. Rather than at least wrap them in a working oop design
 that hides away the abacus feel, we are stuck with bindings.
Pretty much the same issue as the GUI library.
 14. Gaming? It can be done, not by me or you but by geniuses who live in their
 basement and no one to support or nothing else to do but hash out how to get it
 done. But while they might share their results, don't get your hopes up and
 expect it to work for you.
Games are hard, any way you look at it.
 15. Cross platform design? Maybe, Supposedly it works but I have too much
 trouble with windows to care about adding another layer of crap on top.
One thing Phobos does well is work smoothly across the supported platforms.
 16. The community. While not worse than most others, doesn't really give a shit
 about the real world programmer. The elite are too busy thinking of ways to add
 the latest and greatest feature, thinking it will attract more buyers. The
 rabble rousers like myself don't actually do much. Ultimately things get done
 but nothing useful happens. Kinda like those jackasses that floor it with their
 foot on the brake. A lot of smoke but pointless. D's been going how long? 10+
 years?
While a good point, on the other hand every language in wide use is relentlessly adopting new features (C++ just announced C++17 with quite a boatload of new stuff). It's just the way of things, otherwise we'd be stuck with a language from the 70's :-)
 19. PS. Ok, so, D isn't as terrible as I'm making it out. It's free. And as
they
 say, you get what you pay for ;)
Sometimes I idly wonder what would have happened if D were available in the 80's. Sort of like if you put a modern car for sale in the 1960's.
 20. I hope the D community can come together at some point and work towards a
 common goal that will benefit humanity. It's a mini-cosmos of what is going on
 in the world today. Everyone is in it for themselves and they don't realize the
 big picture and how every little thing they do has an enormous impact on the
 human species.  We aren't doing this stuff for fun, we do it to make a better
 life for ourselves, which means we also have to do it for everyone else(because
 we are all in it together).
I think we do remarkably well considering that D is an effort by self-motivated enthusiasts, not by bored people working on it just because they're paid to.
Jul 02 2016
next sibling parent reply Charles Hixson via Digitalmars-d <digitalmars-d puremagic.com> writes:
FWIW, I feel that the elaboration of the template language doesn't serve 
me well.  That's my use case, so I try to ignore it as much as possible, 
but Phobos has been re-written to be much less intelligible to me.  I'm 
sure that many people find the inclusion of ranges into everything 
useful, but it rarely helps me.

I really like the way D handles unicode.  D and Vala are the two 
languages that seem to handle it well, and Vala isn't portable. And I 
really like having garbage collection, and especially the syntax that it 
enables.  I was just considering a hash table (associative array) in 
C++, and realized that unless I allocated on the heap I couldn't 
easily do an increment of a struct variable.  (Well, I'm 
thinking about QHash, because walking the files of a directory path in 
standard C++ is unpleasant, and Qt makes both that and unicode tests 
[the ones I need to do] pretty simple.)  But if I allocate structs on 
the heap I have to make sure that everything gets released when I'm 
through, so I need to use an enhanced pointer construct, so....  It's a 
lot simpler in D.
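
For contrast, a minimal sketch of the D version of both chores 
(counting into a hash table while walking a directory; the counting 
key is made up):

    import std.file : dirEntries, SpanMode;
    import std.path : extension;
    import std.stdio : writeln;

    void main()
    {
        int[string] counts;                  // GC-managed hash table
        foreach (entry; dirEntries(".", SpanMode.shallow))
            counts[entry.name.extension]++;  // absent keys start at 0
        writeln(counts);
    }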

I do wish that Phobos included a D wrapper around SQLite, something 
object oriented.  I'd also like to be able to depend on class finalizers 
being called.  Sometimes I wrap a class in a struct just so I can depend 
on the finalizer.

The last time I checked, DDoc stopped working after encountering an 
extern(C) block in a file.  This is not desirable, even though one can 
work around it by moving all the extern routines to another file.  DDoc 
could use considerable work in formatting. I'd like to be able to 
globally control the font attributes of classes, structs, aliases, 
enums.  I'd like to be able to document private routines or not 
depending on a (compiler) switch.  I frequently end up declaring things 
to be protected rather than private just so I can generate documentation.

Most of my needs are for run time flexibility rather than for more 
compile time flexibility.  E.g., I'd like to be able to declare a 
statically sized array from a parameter.  I do appreciate D's speed, but 
complex templates aren't something I ever use.   (Truth is, I was pretty 
well satisfied with D1, though perhaps I'm forgetting some limitations, 
but even then I'm pretty sure that I felt the main limitation was a 
lack of well interfaced libraries.)

Too many of D's libraries seem to try to re-invent the wheel. When a 
library is working well and has been debugged, the best thing is to 
create a wrapper around it.  The wrapper *does* need to adapt the 
library to the syntax of the language, but that's not a huge problem.  A 
major part of Python's success is "batteries included".

I feel sort of guilty for "complaining" this way when I've been devoting 
all my efforts to my own projects, but you did, essentially, invite 
comments.

On 07/02/2016 11:23 PM, Walter Bright via Digitalmars-d wrote:
 [...]
Jul 03 2016
next sibling parent reply Ola Fosheim Grøstad writes:
On Sunday, 3 July 2016 at 07:21:15 UTC, Charles Hixson wrote:
 portable. And I really like having garbage collection, and 
 especially the syntax that it enables.  I was just considering 
 a hash table (associative array) in C++, and realized that I 
 had to either allocate on the heap, or I couldn't easily do an 
 increment of a struct variable.  (Well, I'm thinking about 
 QHash, because walking the files of a directory path in 
 standard C++ is unpleasant, and Qt makes both that and unicode 
 tests [the ones I need to do] pretty simple.)  But if I 
 allocate structs on the heap I have to make sure that 
 everything gets released when I'm through, so I need to use an 
 enhanced pointer construct, so....  It's a lot simpler in D.

 I do wish that phobos included a D wrapper around SQLite, 
 something object oriented.  I'd also like to be able to depend 
 on class finalizers being called.  Sometimes I wrap a class in 
 a struct just so I can depend on the finalizer.
[...]
 Most of my needs are for run time flexibility rather than for 
 more compile time flexibility.  E.g., I'd like to be able to 
 declare a statically sized array from a parameter.  I do 
 appreciate D's speed, but complex templates aren't something I 
 ever use.   (Truth is, I was pretty well satisfied with D1, 
 though perhaps I'm forgetting some limitations, but even then 
 I'm pretty much sure that I felt the main limitation was a lack 
 of well interfaced libraries.)

 Too many of D's libraries seem to try to re-invent the wheel. 
 When a library is working well and has been debugged, the best 
 thing is to create a wrapper around it.  The wrapper *does* 
 need to adapt the library to the syntax of the language, but 
 that's not a huge problem.  A major part of Python's success is 
 "batteries included".
Included after 20 years of massive adoption. Something that would never have happened if it had a more solid competitor than Perl... Anyway, writing a library for a dynamic scripting language with no performance requirements is a different challenge. Python 2 has major issues with its "batteries included" approach; just look at the fragmentation in Python's HTTP support libraries. That kind of coverage takes adoption over a long period of time and massive backing.

If you are looking for GC, more runtime flexibility and libraries, .NET Core (Linux, OS-X, Windows) is going to be the best fit:

https://blogs.msdn.microsoft.com/dotnet/2016/06/27/announcing-net-core-1-0/

There are currently many smaller languages that are very interesting and increasingly competitive: golang.org, rust.org, loci-lang.org, ponylang.org, whiley.org.

So D is better off focusing on doing better at what it is already doing well (compile time) rather than expanding even more. And even then a language like Whiley is more advanced. Whiley has a floating type system with static typing, where you can write functions that return either strings or integers, and a builtin prover. But Whiley is currently in phase (1)... so no adoption.

I agree that a production-ready D1 in 2010 would have cut into what is now the market of Go. But that train has passed.
Jul 03 2016
parent Ola Fosheim Grøstad writes:
On Sunday, 3 July 2016 at 08:01:45 UTC, Ola Fosheim Grøstad wrote:
 even then a language like Whiley is more advanced.  Whiley has 
 a floating type system with static typing where you can write
Typo, not «floating type system», but flow-typing: https://en.wikipedia.org/wiki/Flow-sensitive_typing
Jul 03 2016
prev sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 7/3/2016 12:21 AM, Charles Hixson via Digitalmars-d wrote:
 I do wish that phobos included a D wrapper around SQLite,
There was one recently announced.
 I'd also like to be able to depend on class finalizers being called.
 Sometimes I wrap a class in a struct just so I can depend on the finalizer.
That's the way to do it.
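
A minimal sketch of that idiom (the names are made up); the struct's deterministic destructor finalizes the class instance when the struct goes out of scope:

    class Resource
    {
        ~this() { /* release non-memory resources here */ }
    }

    struct Handle
    {
        Resource r;
        @disable this(this);  // forbid copies, so no double destroy
        ~this() { if (r !is null) destroy(r); }  // runs deterministically
        alias r this;         // use the handle as if it were the class
    }

    void use()
    {
        auto h = Handle(new Resource);
    } // h's destructor fires here, finalizing the Resource

std.typecons.scoped packages up much the same idea.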
 The last time I checked, DDoc stopped working after encountering an extern(C)
 block in a file.
I'm not aware of this. Is there a bug report filed?
 DDoc could use considerable work in formatting. I'd like to be able to
globally control the font attributes
 of classes, structs, aliases, enums.
DDoc relies on having a user-supplied xxx.ddoc file to provide the formatting, css, etc.
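
A minimal sketch of such a file (the styling is made up; DDOC is the standard top-level macro, and $(BODY) expands to the generated page body):

    DDOC = <html><head><style>
               body { font-family: sans-serif; }
           </style></head>
           <body>$(BODY)</body></html>

It is picked up by passing it on the command line, e.g. dmd -D foo.d mystyle.ddoc.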
 I'd like to be able to document private
 routines or not depending on a (compiler) switch.
I'm not sure what the point of that would be. Private functions shouldn't be called by anything outside of the file, and you already have the file open in the editor when working on it. But you can always submit an enhancement request for it on bugzilla.
 Most of my needs are for run time flexibility rather than for more compile time
 flexibility.  E.g., I'd like to be able to declare a statically sized array
from
 a parameter.
You can do it more or less with alloca(). I know C99 has this feature, and some people swear by it, but I just don't think it pulls its weight.
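
A minimal sketch of the alloca() route, with the usual caveat that the memory only lives until the enclosing function returns:

    import core.stdc.stdlib : alloca;

    void process(size_t n)
    {
        // carve a runtime-sized buffer off the stack...
        int* p = cast(int*) alloca(n * int.sizeof);
        int[] buf = p[0 .. n];  // ...and slice it for bounds-checked use
        buf[] = 0;
        // note: call alloca in the function that uses the buffer, not in
        // a helper, since the memory vanishes when that frame unwinds
    }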
 I feel sort of guilty for "complaining" this way when I've been devoting all my
 efforts to my own projects, but you did, essentially, invite comments.
If something good can come from it, it's good.
Jul 03 2016
parent Charles Hixson via Digitalmars-d <digitalmars-d puremagic.com> writes:
On 07/03/2016 01:31 AM, Walter Bright via Digitalmars-d wrote:
 On 7/3/2016 12:21 AM, Charles Hixson via Digitalmars-d wrote:
 I do wish that phobos included a D wrapper around SQLite,
There was one recently announced.
Thanks. I missed the announcement.
 I'd also like to be able to depend on class finalizers being called.
 Sometimes I wrap a class in a struct just so I can depend on the 
 finalizer.
That's the way to do it.
It works, but it's a bit of added friction when I'm coding. Especially if I need access to the internals of the class...then I need to manually mangle the names. (I'm not implying I need to match the compiler's version of the name.)
 The last time I checked DDoc stopped working after encountering an 
 extern(C)
 block in a file.
I'm not aware of this. Is there a bug report filed?
I wasn't sure it was a bug. And it may not still be present. Since I figured out the cause I've just put the C externals in a separate file.
 DDoc could use considerable work in formatting. I'd like to be able 
 to globally control the font attributes
 of classes, structs, aliases, enums.
DDoc relies on having a user-supplied xxx.ddoc file to provide the formatting, css, etc.
I wasn't able to figure out how to highlight classes differently from structs, etc. I have built small .ddoc files to make custom changes, but that was for defining macros.
 I'd like to be able to document private
 routines or not depending on a (compiler) switch.
I'm not sure what the point of that would be. Private functions shouldn't be called by anything outside of the file, and you already have the file open in the editor when working on it. But you can always submit an enhancement request for it on bugzilla.
That's not for export. The documentation is written for me to use a month or more later.
 Most of my needs are for run time flexibility rather than for more 
 compile time
 flexibility.  E.g., I'd like to be able to declare a statically sized 
 array from
 a parameter.
You can do it more or less with alloca(). I know C99 has this feature, and some people swear by it, but I just don't think it pulls its weight.
Yes, I could manually allocate them. But that adds a lot of resistance. If they are too much effort, well, OK, but that's an example of the kind of thing that I want that D doesn't address. No language does everything, and D is one of the two or three languages I consider best. Python is another, but slow. Vala shows promise, but it's not portable, and the development is glacial. Also, it's poorly documented. Its advantage is that it's fast AND has a lot of wrapped libraries. (But last I checked it seemed FAR from 1.0.)
 I feel sort of guilty for "complaining" this way when I've been 
 devoting all my
 efforts to my own projects, but you did, essentially, invite comments.
If something good can come from it, it's good.
My chances of getting any usable result from my work are *very* low. OTOH, if I do the social payoff will be huge. So I persevere. Except when I get discouraged.
Jul 03 2016
prev sibling next sibling parent reply "Schrom, Brian T via Digitalmars-d" <digitalmars-d puremagic.com> writes:
On 7/2/16 11:27 PM, Walter Bright via Digitalmars-d wrote:
 Thanks for taking the time to write this. Let me see if I can help.
 This is true. I'm interested in which of these have caused you problems.
 Naturally, people in the forum are going to debate the edge cases, as they do
in
 every language. It isn't necessary to use those edge cases to write very
 successful code, however.
A semi-related experience is that I was using dlangui successfully on the Mac, and ported to a windows system I had, and it worked with no issues. However, when I then tried to deploy to the 1/2 dozen windows setups we have, it segfaulted in a different manner on each setup. I'm sure it's related to dlls, initialization, and packaging, but it makes for a difficult and unfavorable experience. This is reproducible with the example programs distributed with dlangui (some crash, some run). If the CI testers had covered a reasonable plethora of windows configurations/versions, this would have been identified. (One can make the same argument about my development environment too.) I then thought that maybe LDC would have been a better experience, but on windows it comes with its own can of worms... so I ran out of time before getting that to work either. I've not had time to investigate the underlying cause, and when I do, I'll submit a bug report. I desperately want better tools and think D has potential, I just needed them yesterday. ;)
Jul 03 2016
parent Mike Parker <aldacron gmail.com> writes:
On Sunday, 3 July 2016 at 17:55:05 UTC, Schrom, Brian T wrote:

 A semi related experience is that I was using dlangui 
 successfully on the Mac, and ported to a windows system I had 
 and it worked no issues. However, then I tried to deploy to the 
 1/2 dozen windows setups we have and it segfaults in a 
 different manner on each setup.  I'm sure it's related to dlls, 
 initialization, and packaging but it makes for a difficult and 
 unfavorable experience.  This is reproducible with the example 
 programs distributed with dlangui (some crash, some run).  If 
 the CI testers had covered a reasonable plethora of windows 
 configurations/versions, this would have been identified.  (One 
 can make the same argument about my development environment 
 too.)
So did you report an issue to the DLangUI bug tracker? https://github.com/buggins/dlangui/issues
Jul 03 2016
prev sibling next sibling parent reply D <D D.com> writes:
 Sometimes I idly wonder what would have happened if D were 
 available in the 80's. Sort of like if you put a modern car for 
 sale in the 1960's.
Sure, but that violates causality and means nothing. Instead, let's go into the future, which we can do (with our minds at least), see what D is (or can be), and then bring that into the present (which is the future, because tomorrow becomes today). I see D as the singular greatest language man has seen since 2025. No one even bothers writing or studying compilers any more because D is the bee's knees. The reason people create something new is because the old isn't working for them. Let's make D work for everyone instead of the few anointed.
 20. I hope the D community can come together at some point and 
 work towards a
 common goal that will benefit humanity. It's a mini-cosmos of 
 what is going on
 in the world today. Everyone is in it for themselves and they 
 don't realize the
 big picture and how every little thing they do has an enormous 
 impact on the
 human species.  We aren't doing this stuff for fun, we do it 
 to make a better
 life for ourselves, which means we also have to do it for 
 everyone else(because
 we are all in it together).
I think we do remarkably well considering that D is an effort by self-motivated enthusiasts, not by bored people working on it just because they're paid to.
Yes, but that is an excuse not to do better. It leads to laziness, lack of focus, disorganization, weakness, atrophy, and eventually death (it is the way of life). You obviously spent nearly a lifetime creating D. Do you want to see it die out like the dodo? D is your brain child and you gave it a great beginning, but I see its mid-life as pointing towards an untimely death. It's hard to create something great in the first place, but it takes even more work to keep it there. Have you read Christopher Alexander's `A Pattern Language`? It applies not only to architecture but to computer programming, compilers, etc., and to life itself.

I'd like to see D live, because all the alternatives are substandard. D needs leadership, specific goals, and an organizational structure, with a plan of action to achieve these as soon as possible. If we were in a desert without water, would we want to find water as soon as possible, or take our time? D's lackadaisical approach leads to lackadaisical results. There is an abundance of free labor trying to make it work, but it is inefficient and usually requires highly motivated and intelligent individuals to add to D. Instead, how about a more efficient approach that maximizes leverage? The top of the organizational structure, the most intelligent, motivated, and dedicated, create the goals and structures that will knowingly lead to the best results.

E.g. (just a simple off-the-cuff type of breakdown), goals:

The D language must be robust, capable of all the desirable behaviors that everyone expects, such as intelligibility (readable), efficiency (no code duplication), coherent implementability (no piecemeal implementations), thorough documentation, etc.

The D compiler must be robust, efficient, cleanly implementable, stable, easily maintainable, extensible, expandable, usable, platform agnostic/independent (not the binary, the core design), etc.

The D library must be stable, efficient, coherent, logical, nested/hierarchical, and platform agnostic/independent, and must support a variety of usages (GUI, graphics, games, devices, core, systems, etc.).

The D tools must be stable, effective, specific, organized, etc.

The D IDE must be stable, effective, friendly, and must increase efficiency as much as possible for the user. It should work to increase productivity on all levels of the D regime, since the IDE is the portal into the D world. Hence, it can include forum connections, bug reporting, organizational contribution interfacing, etc.

Everything must work together in a cohesive way. The only way this can happen is if it is designed that way from the get go. D has some/many of these things, but they are due to people with different ideas of the goal (since it hasn't been laid out). When the goal is understood, it is generally very easy for anyone to accomplish it. When it is not, don't expect the goal to be reached.

Once the goal has been well defined, it can be broken down into parts and each part tackled by individuals. A program could be written to help manage the complexities and make it efficient for users to help tackle them. This might be difficult to do initially but will pay off in the long run. If designed properly, it could be integrated or built in to the IDE. Everything should be modular as much as possible. Any change at any point in the goals (a design change) affects everything further down the structure.
As the structure becomes larger, the weaknesses show through, and the ripples in the space-time continuum can lead to a total collapse (humpty dumpty still exists, he's just in a lot of pieces and no one cares about him any more). Regardless, I'm sure you know about all this type of stuff. It's more about the drive to implement it. I personally would like to see a whole new paradigm shift in programming. Something involving direct brain-to-computer interfacing... unfortunately I need D to get there. ...Anyways, don't take my complaints too harshly. I realize it's a work in progress. I'm trying to `inspire` (ok, maybe not the best word) you to grow some balls (or bigger balls) and lead the way. D is your child; it's ultimately up to you to make it grow into the person (or, uh, the language/compiler tool set) that you desire it to be. If you can't do it on your own (too old and tired, lack of motivation, lack of time, etc.), find the people you need to make it happen, then let them do it, and you can sit back and crack the whip every once in a while while you sip on your latte. The DeLorean won't build itself, and someone has to make/find the flux capacitor if it's going to work. Maybe go talk to Microsoft and see if they will fund you? Tell Trump you'll vote for him if he supports the D foundation!
Jul 03 2016
parent Ola Fosheim Grøstad writes:
On Sunday, 3 July 2016 at 21:23:56 UTC, D wrote:
 Have you read Christopher Alexander's `A Pattern Language`? It 
 applies not only to architecture, computer programming, 
 compilers, etc but life itself.
I hope not. It is an opinionated book on what they thought were good aspects of architecture. The roots were in the DIY movement, where the general audience should be educated so they could design their own environment. The equivalent in programming languages would be to post a disconnected list of programming language features and let people design their own programming languages based on what they like most. If anything, D is there already.
Jul 03 2016
prev sibling next sibling parent reply =?UTF-8?B?THXDrXM=?= Marques <luis luismarques.eu> writes:
On Sunday, 3 July 2016 at 06:23:05 UTC, Walter Bright wrote:
 Thanks for taking the time to write this. Let me see if I can 
 help.
Wow, this was very well handled. Thanks for keeping your head cool and answering in a constructive, friendly and informative manner. It's even more admirable coming from someone who says he used not to be exactly the most "people person" / good boss / group leader, or whatever the expression was.
 Sometimes I idly wonder what would have happened if D were 
 available in the 80's. Sort of like if you put a modern car for 
 sale in the 1960's.
I've also thought about that from time to time. I think D would have been very "mainstream-successful". Starting from where it actually started, I think things have worked out well for D, despite its still limited success. Looking back on all of these years, I think that D's marketing mistake was the garbage collection. Given its target audience and design trade-offs, I believe adoption of the language was disproportionately affected by that choice. If D had started with stronger support for @nogc, even at the cost of delaying some other nice features, I believe adoption would have been quite a bit stronger (and would more easily have snowballed), irrespective of the actual engineering merit of that D variant vs the true D. (It would also have avoided all the current piecemeal work of trying to remove GC allocation from Phobos, etc.; also, notice that no-GC marketing would probably have been even more important in the 80s.)
Jul 07 2016
parent reply Chris <wendlec tcd.ie> writes:
On Friday, 8 July 2016 at 01:17:55 UTC, Luís Marques wrote:

 Sometimes I idly wonder what would have happened if D were 
 available in the 80's. Sort of like if you put a modern car 
 for sale in the 1960's.
I've also thought about that from time to time. I think D would have been very "mainstream-successful". Starting from where it actually started, I think things have worked out well for D, despite its still limited success. Looking back all of these years I think that D's marketing mistake was the garbage collection. Given its target audience and design trade-offs, I believe adoption of the language was disproportionally affected by that choice. If D had started with stronger support for nogc, even at the cost of delaying some other nice features, I believe adoption would have been quite stronger (and more easily snowballed) -- irrespective of the actual engineering merit of that D variant vs the true D. (it would also have avoided all the current piecemeal work of trying to remove GC allocation from Phobos, etc.; also, notice that nogc marketing would probably have been even more important in the 80s).
This is a futile discussion. D is in many respects a "hindsight language" as regards C/C++.[1] People naturally lacked hindsight back in the 80ies, and a lot of D's features would have been frowned upon as "Don't need it!" (templates), "Waste of memory!" (e.g. `array.length`), etc. And remember, computers and computing power were not as common as they are today. You were also dealing with a different crowd; there are by far more programmers around now than there used to be in the 80ies, with different expectations. In the 80ies most programmers were either hard core nerds (hence the nerdy image programmers have) or people who had lost their jobs elsewhere and had gone through re-educational programs to become programmers and thus were not really interested in the matter. As for GC, it's hard to tell. When D was actually (not hypothetically) created, GC was _the_ big thing. Java had just taken off, people were pissed off with C/C++, and programming and coding were becoming more and more common. Not having GC might actually have been a drawback back in the day. People would have complained: "Ah, D is like C++, no automatic memory management, I might as well stick to C++ or go for Java!" So no, I think D is where it is because things are like they are, and "what if" discussions are useless. D has to keep on keeping on, there's no magic. [1] Sometimes I think that D should be careful not to become a language looked down on by yet another "hindsight language".
Jul 08 2016
next sibling parent reply bachmeier <no spam.com> writes:
On Friday, 8 July 2016 at 12:46:03 UTC, Chris wrote:

 As for GC, it's hard to tell. When D was actually (not 
 hypothetically) created, GC was _the_ big thing. Java had just 
 taken off, people were pissed off with C/C++, programming and 
 coding was becoming more and more common. Not having GC might 
 actually have been a drawback back in the day. People would 
 have complained that "Ah, D is like C++, no automatic memory 
 management, I might as well stick to C++ or go for Java!" So 
 no, I think D is where it is, because things are like they are, 
 and "what if" discussions are useless. D has to keep on keeping 
 on, there's no magic.
Yep. If you're going to pick any feature to use to sell a new language, lack of GC is the worst. The only ones that care (and it's a small percentage) are the ones that are least likely to switch due to their existing tools, libraries, and knowledge.
Jul 08 2016
next sibling parent Chris <wendlec tcd.ie> writes:
On Friday, 8 July 2016 at 16:08:42 UTC, bachmeier wrote:
 On Friday, 8 July 2016 at 12:46:03 UTC, Chris wrote:

 As for GC, it's hard to tell. When D was actually (not 
 hypothetically) created, GC was _the_ big thing. Java had just 
 taken off, people were pissed off with C/C++, programming and 
 coding was becoming more and more common. Not having GC might 
 actually have been a drawback back in the day. People would 
 have complained that "Ah, D is like C++, no automatic memory 
 management, I might as well stick to C++ or go for Java!" So 
 no, I think D is where it is, because things are like they 
 are, and "what if" discussions are useless. D has to keep on 
 keeping on, there's no magic.
Yep. If you're going to pick any feature to use to sell a new language, lack of GC is the worst. The only ones that care (and it's a small percentage) are the ones that are least likely to switch due to their existing tools, libraries, and knowledge.
True. The last sentence is something to bear in mind whenever we discuss attracting more people. If someone is really into C++ bare-metal micro-optimization kinda stuff, we won't win him/her over with "no GC". As you said, they're the least likely to switch, for said reasons. Being able to opt out of the GC is still important, but it's not as if we'll attract thousands and thousands of new users because of it.
Jul 08 2016
prev sibling parent Luís Marques <luis luismarques.eu> writes:
On Friday, 8 July 2016 at 16:08:42 UTC, bachmeier wrote:
 Yep. If you're going to pick any feature to use to sell a new 
 language, lack of GC is the worst. The only ones that care (and 
 it's a small percentage) are the ones that are least likely to 
 switch due to their existing tools, libraries, and knowledge.
I said strong support for @nogc (e.g. easy to do things with the stdlib without allocating), not that GC would not be available.
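(To illustrate, a minimal sketch of what that kind of support looks like; the file name and numbers are invented, and it assumes the Phobos range algorithms infer @nogc for non-allocating lambdas, which they should:)

=== nogc_stdlib.d ===
import std.algorithm : map, sum;

// @nogc is enforced at compile time: any hidden GC allocation
// (`new`, `~=`, closures, array literals) becomes a compile error.
@nogc nothrow int sumOfSquares(const(int)[] xs)
{
    // map's non-capturing lambda needs no closure, and sum just
    // folds the lazy range: nothing here allocates.
    return xs.map!(x => x * x).sum;
}

void main() @nogc
{
    int[4] buf = [1, 2, 3, 4]; // fixed-size array lives on the stack
    assert(sumOfSquares(buf[]) == 30);
}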
Jul 08 2016
prev sibling parent reply Ola Fosheim Grøstad writes:
On Friday, 8 July 2016 at 12:46:03 UTC, Chris wrote:
 As for GC, it's hard to tell. When D was actually (not 
 hypothetically) created, GC was _the_ big thing. Java had just 
 taken off, people were pissed off with C/C++, programming and 
 coding was becoming more and more common.
Errr... Garbage collection has been common since the 60s. One problem with GC in the late 80s and early 90s is that it requires twice as much memory and memory was scarce so reference counting was/is the better option. You could make the same argument about templates, memory... I also don't recall anyone being in awe of Java having GC. The big selling point was portability and the very hyped up idea that Java would run well in the browser, which did not materialize. Another selling point was that it wasn't Microsoft...
Jul 08 2016
parent reply Chris <wendlec tcd.ie> writes:
On Friday, 8 July 2016 at 21:53:58 UTC, Ola Fosheim Grøstad wrote:
 On Friday, 8 July 2016 at 12:46:03 UTC, Chris wrote:
 As for GC, it's hard to tell. When D was actually (not 
 hypothetically) created, GC was _the_ big thing. Java had just 
 taken off, people were pissed off with C/C++, programming and 
 coding was becoming more and more common.
Errr... Garbage collection has been common since the 60s.
Which is not the point. My point was that everybody wanted GC after Java. And D was invented when GC was expected by many people.
 One problem with GC in the late 80s and early 90s is that it 
 requires twice as much memory and memory was scarce so 
 reference counting was/is the better option. You could make the 
 same argument about templates, memory...
Which is why D wouldn't have taken off in the 80ies (see my post above).
 I also don't recall anyone being in awe of Java having GC. The 
 big selling point was portability and the very hyped up idea 
 that Java would run well in the browser, which did not 
 materialize. Another selling point was that it wasn't 
 Microsoft...
GC was a big selling point. Every Java book went on about how much safer it is, that you have more time for productive code, blah ... Apple even added GC to Objective-C to appease the GC crowd.
Jul 08 2016
parent reply Ola Fosheim Grøstad writes:
On Friday, 8 July 2016 at 22:25:37 UTC, Chris wrote:
 after Java. And D was invented when GC was expected by many 
 people.
The GC was by far the most criticised feature of D...
 GC was a big selling point. Every Java book went on about how
Err... no, the big selling point that gave Java traction was portability and Java being marketed as designed for the internet and web. GC languages were already available and in use, but the JVM/.NET made it difficult for commercial development platforms. Portability and Microsoft's dominance were big issues back then.
 blah ... Apple even added GC to Objective-C to appease the GC 
 crowd.
Apple removed the GC rather quickly, for the same reasons that make GC a bad choice for D, and replaced it with automatic reference counting.
Jul 09 2016
next sibling parent reply ketmar <ketmar ketmar.no-ip.org> writes:
On Saturday, 9 July 2016 at 07:52:57 UTC, Ola Fosheim Grøstad 
wrote:
 removed the GC
...
 replaced it with automatic reference counting.
you *do* know that refcounting *is* GC, don't you? ;-)
Jul 09 2016
next sibling parent reply bachmeier <no spam.net> writes:
On Saturday, 9 July 2016 at 08:06:54 UTC, ketmar wrote:
 On Saturday, 9 July 2016 at 07:52:57 UTC, Ola Fosheim Grøstad 
 wrote:
 removed the GC
...
 replaced it with automatic reference counting.
you *do* know that refcounting *is* GC, don't you? ;-)
And that's a very important point, because the choice of RC vs other types of GC ignores the fact that they're both GC, and old school programmers didn't want anything to do with a "feature" that would slow down their code. RC would have been an even worse choice when D started because it is [claimed to be] slower than other types of GC. It's been a long time now, but I don't recall many arguments against Java's GC because of pauses. The objection was always that it would make the code run more slowly.
Jul 09 2016
next sibling parent ketmar <ketmar ketmar.no-ip.org> writes:
On Saturday, 9 July 2016 at 11:10:22 UTC, bachmeier wrote:
 The objection was always that it would make the code run more 
 slowly.
i tend to ignore such persons completely after such a claim: they are obviously incompetent as programmers. i also tend to ignore the whole "@nogc" movement: it is just a failed marketing strategy, which (sadly) tends to consume a lot of resources even today.
Jul 09 2016
prev sibling next sibling parent reply ketmar <ketmar ketmar.no-ip.org> writes:
On Saturday, 9 July 2016 at 11:10:22 UTC, bachmeier wrote:
p.s. also, it is funny that D's GC is actually *better* if one wants to avoid GC completely, yet people continue to ask for refcounting.

i mean: if i don't want to use GC in D, it is as easy as avoiding `new` (and delegates with closures). any code that processes allocated objects but never allocates itself doesn't need to be changed at all.

and with refcounting i have to *explicitly* mark all the code as "no refcounting here", or accept refcounting overhead for nothing.
Jul 09 2016
parent reply Ola Fosheim Grøstad writes:
On Saturday, 9 July 2016 at 11:27:13 UTC, ketmar wrote:
 and with refcounting i have to *explicitly* mark all the code 
 as "no refcounting here", or accept refcounting overhead for 
 nothing.
That would be automatic reference counting ;-)... Reference counting is ok for shared ownership, but in most cases overkill. Garbage collection is also useful in some settings, e.g. in some types of graph manipulation. Where things go wrong for D is to use primitive global garbage collection. It would have worked out ok if it provided only primitive local garbage collection.

So what D needs is:

1. local garbage collection (for a single fiber or a facade to a graph).

2. solid global ownership management (for both resources and memory).

Most newbies can then happily write single-threaded code as usual. More advanced programmers need to deal with shared ownership. Which they might have to do anyway, since garbage collection does not handle resources.
Jul 09 2016
parent ketmar <ketmar ketmar.no-ip.org> writes:
On Sunday, 10 July 2016 at 02:28:58 UTC, Ola Fosheim Grøstad 
wrote:
 So what D needs is:

 1. local garbage collection (for a single fiber or a facade to 
 a graph).

 2. solid global ownership management (for both resources and 
 memory).
ketmar doesn't need that. even for his real-time audio engine and videogame engines. not a high priority then, and it adds A LOT of complexity (think about the complexity of passing values out of a thread/fiber). no, thanks.
Jul 09 2016
prev sibling next sibling parent Chris <wendlec tcd.ie> writes:
On Saturday, 9 July 2016 at 11:10:22 UTC, bachmeier wrote:
 On Saturday, 9 July 2016 at 08:06:54 UTC, ketmar wrote:
 On Saturday, 9 July 2016 at 07:52:57 UTC, Ola Fosheim Grøstad 
 wrote:
 removed the GC
...
 replaced it with automatic reference counting.
you *do* know that refcounting *is* GC, don't you? ;-)
And that's a very important point, because the choice of RC vs other types of GC ignores the fact that they're both GC, and old school programmers didn't want anything to do with a "feature" that would slow down their code. RC would have been an even worse choice when D started because it is [claimed to be] slower than other types of GC. It's been a long time now, but I don't recall many arguments against Java's GC because of pauses. The objection was always that it would make the code run more slowly.
I remember reading an article by Apple about their GC in Objective-C and they said that it was a generational GC and that some objects would not be collected at all (if too old), if I remember correctly. Apparently it wasn't good enough, but that's about 7 years ago, so my memory might have been freed of some details :-)
Jul 09 2016
prev sibling parent reply Ola Fosheim Grøstad writes:
On Saturday, 9 July 2016 at 11:10:22 UTC, bachmeier wrote:
 On Saturday, 9 July 2016 at 08:06:54 UTC, ketmar wrote:
 On Saturday, 9 July 2016 at 07:52:57 UTC, Ola Fosheim Grøstad 
 wrote:
 removed the GC
...
 replaced it with automatic reference counting.
you *do* know that refcounting *is* GC, don't you? ;-)
And that's a very important point, because the choice of RC vs other types of GC ignores the fact that they're both GC, and old school programmers didn't want anything to do with a "feature" that would slow down their code. RC would have been an even worse choice when D started because it is [claimed to be] slower than other types of GC.
No, manual reference counting is not particularly slow. Automatic reference counting is also not considered to be slower than GC. Reference counting is not capable of catching cyclic references, which is why garbage collection is considered to be a more general solution to the problem. This is pretty much memory management 101.
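(A hand-rolled illustration of the cycle problem, since it fits in a few lines; this is a sketch, not library code, and all names are invented:)

=== rc_cycle.d ===
import core.stdc.stdlib : malloc, free;

struct Node
{
    int rc = 1;     // one reference: the creator
    Node* other;    // possibly cyclic reference
}

Node* acquire(Node* n) { if (n) n.rc++; return n; }

void release(Node* n)
{
    if (n && --n.rc == 0)
    {
        release(n.other);
        free(n);
    }
}

void main()
{
    auto a = cast(Node*) malloc(Node.sizeof); *a = Node.init;
    auto b = cast(Node*) malloc(Node.sizeof); *b = Node.init;
    a.other = acquire(b);
    b.other = acquire(a);   // cycle: a <-> b
    release(a);             // drops a.rc to 1, frees nothing
    release(b);             // drops b.rc to 1, frees nothing
    // both nodes leak: each keeps the other's count above zero.
    // a tracing collector would reclaim the pair; counting cannot.
}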
Jul 09 2016
parent ketmar <ketmar ketmar.no-ip.org> writes:
On Sunday, 10 July 2016 at 02:08:36 UTC, Ola Fosheim Grøstad 
wrote:
 No, manual reference counting is not particularly slow. 
 Automatic reference counting is also not considered to be 
 slower than GC.
i keep insisting that refcounting IS GC. please stop calling it something else.
Jul 09 2016
prev sibling parent reply Ola Fosheim Grøstad writes:
On Saturday, 9 July 2016 at 08:06:54 UTC, ketmar wrote:
 On Saturday, 9 July 2016 at 07:52:57 UTC, Ola Fosheim Grøstad 
 wrote:
 removed the GC
...
 replaced it with automatic reference counting.
you *do* know that refcounting *is* GC, don't you? ;-)
Reference counting is a technique for collecting garbage, but the term «garbage collection» is typically used for techniques that catch cycles by tracing down chains of pointers: https://en.wikipedia.org/wiki/Garbage_collection_(computer_science)#Tracing_garbage_collectors
Jul 09 2016
parent reply ketmar <ketmar ketmar.no-ip.org> writes:
On Sunday, 10 July 2016 at 02:02:23 UTC, Ola Fosheim Grøstad 
wrote:
 Reference counting is a technique for collecting garbage, but 
 the term «garbage collection» is typically used for techniques 
 that catch cycles by tracing down chains of pointers:
i don't care about hipsters redefining the terms for arbitrary reasons. refcounting IS GC.
Jul 09 2016
parent reply Ola Fosheim Grøstad writes:
On Sunday, 10 July 2016 at 06:19:28 UTC, ketmar wrote:
 On Sunday, 10 July 2016 at 02:02:23 UTC, Ola Fosheim Grøstad 
 wrote:
 Reference counting is a technique for collecting garbage, but 
 the term «garbage collection» is typically used for techniques 
 that catch cycles by tracing down chains of pointers:
i don't care about hipsters redefining the terms for arbitrary reasons. refcounting IS GC.
Nothing to do with hipsters. The common interpretation for «garbage collection» in informal context has always been a tracing collector. I've never heard anything else in any informal CS context.
Jul 10 2016
parent reply ketmar <ketmar ketmar.no-ip.org> writes:
On Sunday, 10 July 2016 at 09:04:25 UTC, Ola Fosheim Grøstad 
wrote:
 Nothing to do with hipsters. The common interpretation for 
 «garbage collection» in informal context has always been a 
 tracing collector. I've never heard anything else in any 
 informal CS context.
i always heard that "garbage collection" is garbage collection, and it is irrelevant to algorithms used.
Jul 10 2016
parent reply Ola Fosheim Grøstad writes:
On Sunday, 10 July 2016 at 09:05:47 UTC, ketmar wrote:
 On Sunday, 10 July 2016 at 09:04:25 UTC, Ola Fosheim Grøstad 
 wrote:
 Nothing to do with hipsters. The common interpretation for 
 «garbage collection» in informal context has always been a 
 tracing collector. I've never heard anything else in any 
 informal CS context.
i always heard that "garbage collection" is garbage collection, and it is irrelevant to algorithms used.
I've never been to a lecture/presentation where "garbage collection" did not mean "tracing garbage collection". Attribute this to culture if you don't like it...
Jul 10 2016
parent reply ketmar <ketmar ketmar.no-ip.org> writes:
On Sunday, 10 July 2016 at 16:58:49 UTC, Ola Fosheim Grøstad 
wrote:
 I've never been to a lecture/presentation where "garbage 
 collection" did not mean "tracing garbage collection".
then you probably watched some... wrong lectures. ;-)
Jul 10 2016
parent reply Ola Fosheim Grøstad writes:
On Sunday, 10 July 2016 at 17:03:26 UTC, ketmar wrote:
 On Sunday, 10 July 2016 at 16:58:49 UTC, Ola Fosheim Grøstad 
 wrote:
 I've never been to a lecture/presentation where "garbage 
 collection" did not mean "tracing garbage collection".
then you probably watched some... wrong lectures. ;-)
Nah, they were experienced language designers and researchers.
Jul 10 2016
parent reply ketmar <ketmar ketmar.no-ip.org> writes:
On Sunday, 10 July 2016 at 17:06:20 UTC, Ola Fosheim Grøstad 
wrote:
 On Sunday, 10 July 2016 at 17:03:26 UTC, ketmar wrote:
 On Sunday, 10 July 2016 at 16:58:49 UTC, Ola Fosheim Grøstad 
 wrote:
 I've never been to a lecture/presentation where "garbage 
 collection" did not mean "tracing garbage collection".
then you probably watched some... wrong lectures. ;-)
Nah, they were experienced language designers and researchers.
then i won't trust a word they said.
Jul 10 2016
parent reply Ola Fosheim Grøstad writes:
On Sunday, 10 July 2016 at 19:12:46 UTC, ketmar wrote:
 then i won't trust a word they said.
There aren't many people you trust then... Seriously, in academic contexts a statement like «X is a garbage collected language» always means tracing. It would be very odd to assume that X used reference counting.
Jul 11 2016
parent reply ketmar <ketmar ketmar.no-ip.org> writes:
On Monday, 11 July 2016 at 07:16:57 UTC, Ola Fosheim Grøstad 
wrote:
 There aren't many people you trust then...
exactly. 99% of people are idiots.
Jul 11 2016
parent reply Ola Fosheim Grøstad writes:
On Monday, 11 July 2016 at 08:43:03 UTC, ketmar wrote:
 On Monday, 11 July 2016 at 07:16:57 UTC, Ola Fosheim Grøstad 
 wrote:
 There aren't many people you trust then...
exactly. 99% of people are idiots.
100%
Jul 11 2016
parent reply ketmar <ketmar ketmar.no-ip.org> writes:
On Monday, 11 July 2016 at 08:45:21 UTC, Ola Fosheim Grøstad 
wrote:
 On Monday, 11 July 2016 at 08:43:03 UTC, ketmar wrote:
 On Monday, 11 July 2016 at 07:16:57 UTC, Ola Fosheim Grøstad 
 wrote:
 There aren't many people you trust then...
exactly. 99% of people are idiots.
100%
it depends on the rounding mode.
Jul 11 2016
parent Ola Fosheim Grøstad writes:
On Monday, 11 July 2016 at 08:55:06 UTC, ketmar wrote:
 On Monday, 11 July 2016 at 08:45:21 UTC, Ola Fosheim Grøstad 
 wrote:
 On Monday, 11 July 2016 at 08:43:03 UTC, ketmar wrote:
 On Monday, 11 July 2016 at 07:16:57 UTC, Ola Fosheim Grøstad 
 wrote:
 There aren't many people you trust then...
exactly. 99% of people are idiots.
100%
it depends on the rounding mode.
101%
Jul 11 2016
prev sibling parent reply Chris <wendlec tcd.ie> writes:
On Saturday, 9 July 2016 at 07:52:57 UTC, Ola Fosheim Grøstad 
wrote:
 On Friday, 8 July 2016 at 22:25:37 UTC, Chris wrote:
 after Java. And D was invented when GC was expected by many 
 people.
The GC was by far the most criticised feature of D...
 GC was a big selling point. Every Java book went on about how
Err... no, the big selling point that gave Java traction was portability and Java being marketed as designed for the internet and web. GC languages were already available and in use, but the JVM/.NET made it difficult for commercial development platforms. Portability and Microsoft's dominance were big issues back then.
Yes, of course the "write-once-run-everywhere" fairy tale helped to spread Java, but while it was gaining traction GC became a feature everybody wanted. Sorry, but there is not a single book or introduction to Java that doesn't go on about how great GC is. Java was the main catalyst for GC - or at least for people demanding it. Practically everybody who had gone through IT courses, college etc. with Java (and there were loads) wanted GC. It was a given for many people.
 blah ... Apple even added GC to Objective-C to appease the GC 
 crowd.
Apple removed the GC rather quickly, for the same reasons that make GC a bad choice for D, and replaced it with automatic reference counting.
Yes, it didn't last long. But the fact that they bothered to introduce it, shows you how big GC was/is.
Jul 09 2016
parent reply Ola Fosheim Grøstad writes:
On Saturday, 9 July 2016 at 09:15:19 UTC, Chris wrote:
 Yes, of course the "write-once-run-everywhere" fairy tale 
 helped to spread Java, but while it was gaining traction GC 
 became a feature everybody wanted. Sorry, but there is not a 
 single book or introduction to Java that doesn't go on about 
 how great GC is.
Just like there is no C++ book that does not rant about how great RAII is... What do you expect from a language evangelic? The first Java implementation Hotspot inherited its technology from StrongTalk, a Smalltalk successor. It was not a Java phenomenon, and FWIW both Lisp, Simula and Algol68 were garbage collected. What was "new" with Java was compile-once-run-everywhere. Although, that wasn't new either, but it was at least marketable as new.
 Java was the main catalyst for GC - or at least for people 
 demanding it. Practically everybody who had gone through IT 
 courses, college etc. with Java (and there were loads) wanted 
 GC. It was a given for many people.
Well, yes, of course Java being used in universities created a demand for Java and similar languages. But GC languages were extensively used in universities before Java.
 Yes, it didn't last long. But the fact that they bothered to 
 introduce it, shows you how big GC was/is.
No, it shows how demanding manual reference counting was in Objective-C on regular programmers. GC is the first go to solution for easy memory management, and has been so since the 60s. Most high level languages use garbage collection.
Jul 09 2016
parent reply Chris <wendlec tcd.ie> writes:
On Sunday, 10 July 2016 at 03:25:16 UTC, Ola Fosheim Grøstad 
wrote:
 Just like there is no C++ book that does not rant about how 
 great RAII is... What do you expect from a language evangelic? 
 The first Java implementation Hotspot inherited its technology 
 from StrongTalk, a Smalltalk successor. It was not a Java 
 phenomenon, and FWIW both Lisp, Simula and Algol68 were garbage 
 collected.
Please stop intentionally missing the point. I don't care if Leonardo Da Vinci already had invented GC - which wouldn't surprise me - but this is not the point. My point is that GC became a big thing in the late 90ies early 2000s which is in part owed to Java having become the religion of the day (not Lisp or SmallTalk)[1]. D couldn't have afforded not to have GC when it first came out. It was expected of a (new) language to provide GC by then - and GC had become a selling point for new languages. [1] And of course computers had become more powerful and could handle the overhead of GC better than in the 80ies.
 What was "new" with Java was compile-once-run-everywhere. 
 Although, that wasn't new either, but it was at least 
 marketable as new.

 Java was the main catalyst for GC - or at least for people 
 demanding it. Practically everybody who had gone through IT 
 courses, college etc. with Java (and there were loads) wanted 
 GC. It was a given for many people.
Well, yes, of course Java being used in universities created a demand for Java and similar languages. But GC languages were extensively used in universities before Java.
 Yes, it didn't last long. But the fact that they bothered to 
 introduce it, shows you how big GC was/is.
No, it shows how demanding manual reference counting was in Objective-C on regular programmers. GC is the first go to solution for easy memory management, and has been so since the 60s. Most high level languages use garbage collection.
It wasn't demanding. I wrote a lot of code in Objective-C and it was perfectly doable. You even have features like `autorelease` for return values. The thing is that Apple had become an increasingly popular platform and more and more programmers were writing code for OS X. So they thought, they'd make it easier and reduce potential memory leaks (introduced by not so experienced Objective-C coders) by adding GC, especially because a lot of programmers expected GC "in this day and age".
Jul 11 2016
next sibling parent reply Ola Fosheim Grøstad writes:
On Monday, 11 July 2016 at 09:30:37 UTC, Chris wrote:
 Lisp or SmallTalk)[1]. D couldn't have afforded not to have GC 
 when it first came out. It was expected of a (new) language to 
 provide GC by then - and GC had become a selling point for new 
 languages.
This is not true, it is just wishful thinking. D was harmed by the GC, not propelled by it. I am not missing any point, sorry. Just go look at what people who gave up on D claim to be a major reason, the GC scores high...
 It wasn't demanding. I wrote a lot of code in Objective-C and 
 it was perfectly doable.
Of course it was doable, but developers had trouble getting it right. In Objective-C Foundation you have to memorize what kind of ownership each function's return value carries, a responsibility ARC relieves the developer of. Autorelease pools do not change that, and you have to take special measures to avoid running out of memory with autorelease pools, as an autorelease pool is a very simple region allocator (what Walter calls a bump allocator), so autorelease pools are not a generic solution. Objective-C had a very primitive manual RC solution that relied on conventions. They added a GC and ARC and only kept ARC. As simple as that. C++ actually has much more robust memory management than what Objective-C had.
Jul 11 2016
next sibling parent reply ketmar <ketmar ketmar.no-ip.org> writes:
On Monday, 11 July 2016 at 11:59:51 UTC, Ola Fosheim Grøstad 
wrote:
 Just go look at what people who gave up on D claim to be a 
 major reason, the GC scores high...
and most of those people never even started to use D. took a brief look, maybe wrote "helloworld", and that's all. it doesn't matter in this case which reason made 'em "turn away". if not GC, it would be something else: they just wanted their Ideal Language, and found that D is not it. those people just can't be satisfied, 'cause they are looking for something D isn't at all. D *can* be used without GC. and it will still be "better C". it still will be less painful than C, but this is the price of doing "low-level things". or it can be used on a much higher level, where GC doesn't really matter anymore (and is actually desirable).
Jul 11 2016
parent reply Ola Fosheim Grøstad writes:
On Monday, 11 July 2016 at 12:18:26 UTC, ketmar wrote:
 and most of those people never even started to use D. took a 
 brief look, maybe wrote "helloworld", and that's all. it
Where do you get this from? Quite a few D programmers have gone to C++ and Rust.
 D *can* be used without GC. and it will still be "better C". it 
 still will be less painful than C, but this is the price of 
 doing "low-level things".
C is primarily used for portability/system support/interfacing or because you have an existing codebase. Even Microsoft is now using higher-level languages than C in parts of their system-level code (operating system). Btw, C has changed quite a bit, it is at C11 now and even has "generics"... but I doubt many will use it. C is increasingly becoming a marginal language (narrow application area).
Jul 11 2016
parent ketmar <ketmar ketmar.no-ip.org> writes:
On Monday, 11 July 2016 at 13:56:30 UTC, Ola Fosheim Grøstad 
wrote:
 On Monday, 11 July 2016 at 12:18:26 UTC, ketmar wrote:
 and most of those people never even started to use D. took a 
 brief look, maybe wrote "helloworld", and that's all. it
Where do you get this from?
from reading this NG and other parts of teh internets.
 Quite a few D programmers have gone to C++ and Rust.
quite a few people who tried D... and anyway, the reasons were more complex, and GC was usually just a nice excuse.
  C is primarily used for portability/system support/interfacing 
 or because you have an existing codebase.
this is mostly what "i can't stand GC" people want to do.
 C is increasingly becoming a marginal language (narrow 
 application area).
'cause manual memory management is a PITA. not only due to this, of course, but this is still something.
Jul 11 2016
prev sibling parent reply Chris <wendlec tcd.ie> writes:
On Monday, 11 July 2016 at 11:59:51 UTC, Ola Fosheim Grøstad 
wrote:
 On Monday, 11 July 2016 at 09:30:37 UTC, Chris wrote:
 Lisp or SmallTalk)[1]. D couldn't have afforded not to have GC 
 when it first came out. It was expected of a (new) language to 
 provide GC by then - and GC had become a selling point for new 
 languages.
This is not true, it is just wishful thinking. D was harmed by the GC, not propelled by it. I am not missing any point, sorry. Just go look at what people who gave up on D claim to be a major reason, the GC scores high...
No. Having GC attracts more users, because they either explicitly want it or they don't care about the overhead. To have GC was definitely a good decision. What was not so good was that it was not optional with a simple on/off switch. Neither was it a good idea not to spend more time on ways to optimize GC, so it was comparatively slow. Keep in mind that the no-GC crowd has very specialized needs (games, real time systems). Then again, to win this crowd over from C/C++ is not easy, regardless. And ... let's not forget that GC is often used as a handy excuse not to use D. "You don't use D because of a, b, c or because of GC?" - "Yeah, that one." I bet you that if D hadn't had GC when it first came out, people would've mentioned manual memory management as a reason not to use GC. I never claimed that D was _propelled_ by GC, but that it was a feature that most users would expect. Not having it would probably have done more harm than having it. By the way, have you ever designed a language, I'd love to see how it would look like ;) [snip]
Jul 11 2016
next sibling parent Chris <wendlec tcd.ie> writes:
 I bet you that if D hadn't had GC when it first came out, 
 people would've mentioned manual memory management as a reason 
 not to use GC. I never claimed that D was _propelled_ by GC, 
 but that it was a feature that most users would expect. Not 
 having it would probably have done more harm than having it.

 By the way, have you ever designed a language, I'd love to see 
 how it would look like ;)

 [snip]
s/not to use GC/not to use D
Jul 11 2016
prev sibling next sibling parent reply Ola Fosheim Grøstad writes:
On Monday, 11 July 2016 at 13:24:14 UTC, Chris wrote:
 I bet you that if D hadn't had GC when it first came out, 
 people would've mentioned manual memory management as a reason 
 not to use GC. I never claimed that D was _propelled_ by GC, 
 but that it was a feature that most users would expect. Not 
 having it would probably have done more harm than having it.
Actually, I am certain that GC is a feature that _nobody_ would expect from a system level language, outside the Go-crowd.
 By the way, have you ever designed a language, I'd love to see 
 how it would look like ;)
Most programmers have designed DSL, so yes, obviously. If you are talking about a general purpose language then I wouldn't want to announce it until I was certain I got the basics right, like memory management.
Jul 11 2016
next sibling parent reply Chris <wendlec tcd.ie> writes:
On Monday, 11 July 2016 at 14:02:09 UTC, Ola Fosheim Grøstad 
wrote:
 On Monday, 11 July 2016 at 13:24:14 UTC, Chris wrote:
 I bet you that if D hadn't had GC when it first came out, 
 people would've mentioned manual memory management as a reason 
 not to use GC. I never claimed that D was _propelled_ by GC, 
 but that it was a feature that most users would expect. Not 
 having it would probably have done more harm than having it.
Actually, I am certain that GC is a feature that _nobody_ would expect from a system level language, outside the Go-crowd.
Most certainly from a multi-purpose language. GC would have been demanded sooner or later. The mistake was not to make it optional from the beginning. You focus on a small niche where people use all kinds of performance tricks even in C and C++. A lot of software doesn't care about GC overheads, however, and without GC a lot of people wouldn't even have considered it.
 By the way, have you ever designed a language, I'd love to see 
 how it would look like ;)
Most programmers have designed DSL, so yes, obviously. If you are talking about a general purpose language then I wouldn't want to announce it until I was certain I got the basics right, like memory management.
Go ahead, I'm sure it's fun. ;)
Jul 11 2016
next sibling parent Ola Fosheim Grøstad writes:
On Monday, 11 July 2016 at 14:12:35 UTC, Chris wrote:
 Most certainly from a multi-purpose language. GC would have 
 been demanded sooner or later. The mistake was not to make it 
 optional from the beginning.
If D was designed as a high level language then it would be a mistake not to provide a GC in most scenarios. Yes.
 care about GC overheads, however, and without GC a lot of 
 people wouldn't even have considered it.
Lots of people have been happy with Perl and Python before they added GC to catch cycles... Most applications don't leak a lot of memory to cyclic references and they usually have to run for a while. (But constructing a worst case is easy, of course.) (Btw, didn't mean to say that autorelease pools are the same as a region allocator, but they are similar in spirit.)
 Go ahead, I'm sure it's fun. ;)
Oh, I didn't mean to say I have designed a language. I have many ideas and sketches, but far too many to implement and polish ;-). I have started extending my knowledge on type systems, though, quite interesting. I think the change in computing power we now have is opening up for many new interesting opportunities.
Jul 11 2016
prev sibling parent Guillaume Piolat <first.last gmail.com> writes:
On Monday, 11 July 2016 at 14:12:35 UTC, Chris wrote:
 You focus on a small niche where people use all kinds of 
 performance tricks even in C and C++. A lot of software doesn't 
 care about GC overheads, however, and without GC a lot of 
 people wouldn't even have considered it.
+1

A large majority of performance-heavy software can live with the GC. GC is a blocker for people using micro-controllers with little memory, who usually don't get to choose a compiler.
Jul 11 2016
prev sibling next sibling parent reply ketmar <ketmar ketmar.no-ip.org> writes:
On Monday, 11 July 2016 at 14:02:09 UTC, Ola Fosheim Grøstad 
wrote:
 Actually, I am certain that GC is a feature that _nobody_ would 
 expect from a system level language, outside the Go-crowd.
hello. i am the man born to ruin your world.
Jul 11 2016
parent Ola Fosheim Grøstad writes:
On Monday, 11 July 2016 at 14:19:07 UTC, ketmar wrote:
 On Monday, 11 July 2016 at 14:02:09 UTC, Ola Fosheim Grøstad 
 wrote:
 Actually, I am certain that GC is a feature that _nobody_ 
 would expect from a system level language, outside the 
 Go-crowd.
hello. i am the man born to ruin your world.
Of course, you are the extra 1% that comes on top of the other 100%.
Jul 11 2016
prev sibling parent reply Paulo Pinto <pjmlp progtools.org> writes:
On Monday, 11 July 2016 at 14:02:09 UTC, Ola Fosheim Grøstad 
wrote:
 On Monday, 11 July 2016 at 13:24:14 UTC, Chris wrote:
 I bet you that if D hadn't had GC when it first came out, 
 people would've mentioned manual memory management as a reason 
 not to use GC. I never claimed that D was _propelled_ by GC, 
 but that it was a feature that most users would expect. Not 
 having it would probably have done more harm than having it.
Actually, I am certain that GC is a feature that _nobody_ would expect from a system level language, outside the Go-crowd.
I am no longer dabbling in D, but could not resist:

- UK Royal Navy with Algol 68 RS
- Xerox PARC with Mesa/Cedar
- DEC/Olivetti/Compaq with Modula-3
- ETHZ with Oberon, Oberon-2, Active Oberon, Component Pascal
- Microsoft with Midori, whose learnings are feeding C# 7.0+ features (http://joeduffyblog.com/2015/12/19/safe-native-code/, https://www.infoq.com/news/2016/06/systems-programming-qcon)
- Astrobe with Oberon for micro-controllers (ARM Cortex-M4, Cortex-M3 and Xilinx FPGA Systems)
- PTC Perc Ultra with Java
- IS2T with their MicroEJ OS Java/C platform

The biggest problem with D isn't the GC, is lack of focus to make it stand out versus .NET Native, Swift, Rust, Ada, SPARK, Java, C++17.
Jul 11 2016
next sibling parent reply Ola Fosheim Grøstad writes:
On Monday, 11 July 2016 at 14:45:56 UTC, Paulo Pinto wrote:
 The biggest problem with D isn't the GC, is lack of focus to 
 make it stand out versus .NET Native, Swift, Rust, Ada, SPARK, 
 Java, C++17.
I knew you would chime in... Neither .NET, Swift nor Java should be considered system level tools. Ada/Spark has a very narrow use case. Rust is still in its infancy. C++17 is not yet finished. But yes, C++ currently owns system level programming, C is losing terrain and Rust has an uncertain future. The biggest problem with D is not GC, because we now have @nogc. But D is still lacking in memory management.
Jul 11 2016
parent reply Paulo Pinto <pjmlp progtools.org> writes:
On Monday, 11 July 2016 at 14:58:16 UTC, Ola Fosheim Grøstad 
wrote:
 On Monday, 11 July 2016 at 14:45:56 UTC, Paulo Pinto wrote:
 The biggest problem with D isn't the GC, is lack of focus to 
 make it stand out versus .NET Native, Swift, Rust, Ada, SPARK, 
 Java, C++17.
I knew you would chime in... Neither .NET, Swift nor Java should be considered system level tools. Ada/Spark has a very narrow use case. Rust is still in its infancy. C++17 is not yet finished. But yes, C++ currently owns system level programming, C is losing terrain and Rust has an uncertain future. The biggest problem with D is not GC, because we now have @nogc. But D is still lacking in memory management.
Happy not to disappoint. :) OS vendors are the ones that eventually decided what is a systems programming language on their OSes. And if they say so, like Apple is nowadays doing with Swift, developers will have no option other than accept it or move to other platform, regardless of their opinion what features a systems programming languages should offer. Just like C developers that used to bash C++, now have to accept the two biggest C compilers are written in the language they love to hate.
Jul 11 2016
parent reply Ola Fosheim Grøstad writes:
On Monday, 11 July 2016 at 16:26:11 UTC, Paulo Pinto wrote:
 Happy not to disappoint.  :)
You never disappoint in the GC department ;-)
 OS vendors are the ones that eventually decided what is a 
 systems programming language on their OSes.
To a large extent on Apple and Microsoft OSes. Not so much on open source OSes as you are not tied down to binary blobs.
 And if they say so, like Apple is nowadays doing with Swift, 
 developers will have no option other than accept it or move to 
 other platform, regardless of their opinion what features a 
 systems programming languages should offer.
It is true that there have been policy changes which make it difficult to access features like GPU and Audio on OS-X/iOS without touching Objective-C or Swift. You don't have to use it much, but you need some binding stubs in Objective-C or Objective-C++ if you want to be forward compatible (i.e. link on future versions of the OS without recompiling). But I _have_ noticed that Apple is increasingly making low level setup only available through Objective-C/Swift. It is probably a lock-in strategy to raise porting costs to Android.
 Just like C developers that used to bash C++, now have to 
 accept the two biggest C compilers are written in the language 
 they love to hate.
There was a thread on reddit recently where some Microsoft employees admitted that parts of Windows are now implemented as separate processes, but still...
Jul 11 2016
parent reply Paulo Pinto <pjmlp progtools.org> writes:
On Monday, 11 July 2016 at 16:44:27 UTC, Ola Fosheim Grøstad 
wrote:
 On Monday, 11 July 2016 at 16:26:11 UTC, Paulo Pinto wrote:
 Happy not to disappoint.  :)
You never disappoint in the GC department ;-)
 OS vendors are the ones that eventually decided what is a 
 systems programming language on their OSes.
To a large extent on Apple and Microsoft OSes. Not so much on open source OSes as you are not tied down to binary blobs.
 And if they say so, like Apple is nowadays doing with Swift, 
 developers will have no option other than accept it or move to 
 other platform, regardless of their opinion what features a 
 systems programming languages should offer.
It is true that there have been policy changes which make it difficult to access features like GPU and Audio on OS-X/iOS without touching Objective-C or Swift. You don't have to use it much, but you need some binding stubs in Objective-C or Objective-C++ if you want to be forward compatible (i.e. link on future versions of the OS without recompiling). But I _have_ noticed that Apple is increasingly making low level setup only available through Objective-C/Swift. It is probably a lock-in strategy to raise porting costs to Android.
Actually NeXTStep drivers were written in Objective-C.

http://www.cilinder.be/docs/next/NeXTStep/3.3/nd/OperatingSystem/Part3_DriverKit/Concepts/1_Overview/Overview.htmld/

They are not alone; as of Android N, Google is making it pretty clear that if one tries to circumvent the constrained set of NDK APIs and work around the JNI to access existing shared objects, the application will simply be killed.

http://android-developers.blogspot.de/2016/06/android-changes-for-ndk-developers.html

Which basically boils down to OEMs, 3D rendering and low-level audio.
 Just like C developers that used to bash C++, now have to 
 accept the two biggest C compilers are written in the language 
 they love to hate.
There was a thread on reddit recently where some Microsoft employees admitted that parts of Windows are now implemented as separate processes, but still...
Yes, the trend started with Windows 8 and the new application model based on the initial design of the COM+ Runtime, which was the genesis of .NET before they decided to ditch it for the CLR. If you check the latest BUILD, the current approach being evangelised is .NET Native for 90% of the code, with C++/CX or plain C++ for the rest. On the UWP model, DirectX is probably the only user-space API that doesn't have a WinRT projection fully available, but they have been slowly surfacing it in each release. The WinRT, the User Driver Framework, the new container model and Linux subsystem, Checked C, the input to the C++ Core Guidelines: all of it has roots in Singularity, Midori and Drawbridge.
Jul 11 2016
parent reply Ola Fosheim Grøstad writes:
On Monday, 11 July 2016 at 18:14:11 UTC, Paulo Pinto wrote:
 Actually NeXTStep drivers were written in Objective-C.
NeXT was a cool concept, but it was sad that they picked such an annoying language to build it.
 They are not alone; as of Android N, Google is making it pretty 
 clear that if one tries to circumvent the constrained set of NDK 
 APIs and work around the JNI to access existing shared objects, 
 the application will simply be killed.
I don't do Android programming, but NDK is actually fairly rich in comparison to Apple OSes without Objective-C bindings AFAIK. The problem seems to be more in the varying hardware configurations / quality of implementation. Not using Java on Android sounds like a PITA to be honest.
 If you check the latest BUILD, the current approach being 
 evangelised is .NET Native for 90% of the code, with C++/CX or 
 plain C++ for the rest.


I don't know much about .NET Native, does it apply to or will they bring it to .NET Core? A change in recent years is that Microsoft appears to invest more in C++ rather than pushing .NET as a wholesale replacement.
 The WinRT, the User Driver Framework, the new container model 
 and Linux subsystem, Checked C, the input to the C++ Core 
 Guidelines...
I haven't paid much attention to WinRT lately, they have a Linux subsystem?
Jul 11 2016
parent reply Paulo Pinto <pjmlp progtools.org> writes:
On Tuesday, 12 July 2016 at 03:25:38 UTC, Ola Fosheim Grøstad 
wrote:
 On Monday, 11 July 2016 at 18:14:11 UTC, Paulo Pinto wrote:
 I don't do Android programming, but NDK is actually fairly rich 
 in comparison to Apple OSes without Objective-C bindings AFAIK. 
 The problem seems to be more in the varying hardware 
 configurations / quality of implementation.
Not really, it is a real pain to use and feels like a half-baked solution developed by people who were forced by their manager to support anything other than Java. The iOS and WP SDKs have much better support for C++, especially the integration with native APIs via Objective-C++ and C++/CX and the debugging tools.
 Not using Java on Android sounds like a PITA to be honest.
Yes, the Android team goes to great lengths to make it feel like that.
 I don't know much about .NET Native, does it apply to or will 
 they bring it to .NET Core?
Yes, it is called CoreRT.
 A change in recent years is that Microsoft appears to invest more 
 in C++ rather than pushing .NET as a wholesale replacement.
Not really, the big loser is C. After the OS Dev team won the political war against the DevTools team, thanks to the Longhorn debacle, the wind changed into the whole going-native theme. Parallel to that, the whole Midori effort was ramped down and its learnings brought back to the production side of Microsoft. Also, contrary to what Microsoft tried to push with C++/CX on WinRT, besides game developers not many decided to embrace it; hence .NET Native, with compilation to native code via the Visual C++ backend. At the same time, the internal efforts to clean up C++ code were taken outside and the C++ Core Guidelines were born. Also Kenny Kerr, a very vocal C++ MVP (and MSDN Magazine collaborator) against C++/CX, was hired, and is now driving the effort to create a WinRT projection using plain standard modern C++.
 The WinRT, the User Driver Framework, the new container model 
 and Linux subsystem, Checked C, the input to the C++ Core 
 Guidelines...
I haven't paid much attention to WinRT lately, they have a Linux subsystem?
Yes, it will be available in the upcoming Windows 10 Anniversary edition. It is built on top of the Drawbridge picoprocesses that are now a Windows 10 feature. Basically it only supports x64 ELF binaries and makes use of the picoprocess infrastructure to redirect Linux syscalls into NT ones. It is a collaboration between Microsoft and Ubuntu and there are quite a few Channel 9 videos describing how the whole stack works.
Jul 12 2016
parent Alex <alex alex.com> writes:
On Tuesday, 12 July 2016 at 07:01:05 UTC, Paulo Pinto wrote:

 Also contrary to what Microsoft tried to push with C++/CX on 
 WinRT, besides
 game developers not many decided to embrace it.
We didn't embrace it at all, we just have no choice but to use it for a lot of XBoxOne SDK calls. Any files with CX enabled compile much slower and we try to encapsulate them as much as possible.
Jul 12 2016
prev sibling parent Paolo Invernizzi <paolo.invernizzi no.address> writes:
On Monday, 11 July 2016 at 14:45:56 UTC, Paulo Pinto wrote:
 The biggest problem with D isn't the GC, is lack of focus to 
 make it stand out versus .NET Native, Swift, Rust, Ada, SPARK, 
 Java, C++17.
How true! That's the only real problem with this beautiful language! /P
Jul 11 2016
prev sibling parent reply Infiltrator <Lt.Infiltrator gmail.com> writes:
On Monday, 11 July 2016 at 13:24:14 UTC, Chris wrote:
 ...
 To have GC was definitely a good decision. What was not so
 good was that it was not optional with a simple on/off switch.
 ...
I know that I'm missing something here, but what's wrong with the functions provided in core.memory? Specifically, GC.disable()?
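(For readers skimming this later, a minimal sketch of how those functions are used; note that GC.disable only suspends automatic collections at runtime, it is not a compile-time guarantee:)

=== gc_switch.d ===
import core.memory : GC;

void main()
{
    GC.disable();              // suspend automatic collections
    scope (exit) GC.enable();  // re-enable when this section ends

    // allocation still works while disabled - the heap just grows,
    // and no collection pause can interrupt this section
    auto buf = new ubyte[](64 * 1024);
    buf[0] = 1;

    GC.collect();              // an explicit collection can still be requested
}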
Jul 11 2016
parent Chris <wendlec tcd.ie> writes:
On Monday, 11 July 2016 at 14:03:36 UTC, Infiltrator wrote:
 On Monday, 11 July 2016 at 13:24:14 UTC, Chris wrote:
 ...
 To have GC was definitely a good decision. What was not so
 good was that it was not optional with a simple on/off switch.
 ...
I know that I'm missing something here, but what's wrong with the functions provided in core.memory? Specifically, GC.disable()?
I was thinking of a compiler switch (as they did in Objective-C), and had D been designed with `-nogc` in mind from the start, Phobos would be GC free too. No GC is still a bit rough around the edges.
Jul 11 2016
prev sibling next sibling parent Charles Hixson via Digitalmars-d <digitalmars-d puremagic.com> writes:
Garbage collection allows many syntax "liberalizations" that lack of 
garbage collection renders either impossible or highly dangerous.  (In 
this definition of "garbage collection" I'm including variations like 
reference counting.)  For an example of this consider the dynamic array 
type.  You MUST have garbage collection to use that safely...unless you 
require the freeing of memory with every change in size.  C++ does that 
with the STL, but if you want the dynamic types built into the language, 
then you need garbage collection built into the language.  (This is 
different from saying it needs to be active everywhere, but once you've 
got it, good places to use it keep showing up.)

One of the many advantages of the dynamic array type being built into 
the language is that arrays of different sizes are reasonably comparable 
by methods built into the language.  This is used all over the place.  
In D I almost never need to use "unchecked conversion".
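(An illustrative sketch of that point in D, my example rather than the poster's: the built-in `~=` may reallocate as the array grows, and returning the result is only safe because the GC owns the memory.)

=== dynarray.d ===
int[] squares(int n)
{
    int[] result;
    foreach (i; 0 .. n)
        result ~= i * i;   // may reallocate several times as it grows
    return result;         // safe: the GC owns the buffer
}

void main()
{
    auto s = squares(5);
    assert(s == [0, 1, 4, 9, 16]);
    // no explicit free anywhere; the buffer lives exactly as long
    // as something references it, which is what lets `~=` and
    // slicing be built-in operations in the first place.
}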

On 07/11/2016 02:30 AM, Chris via Digitalmars-d wrote:
 On Sunday, 10 July 2016 at 03:25:16 UTC, Ola Fosheim Grøstad wrote:
 Just like there is no C++ book that does not rant about how great 
 RAII is... What do you expect from a language evangelic? The first 
 Java implementation Hotspot inherited its technology from StrongTalk, 
 a Smalltalk successor. It was not a Java phenomenon, and FWIW both 
 Lisp, Simula and Algol68 were garbage collected.
Please stop intentionally missing the point. I don't care if Leonardo Da Vinci already had invented GC - which wouldn't surprise me - but this is not the point. My point is that GC became a big thing in the late 90ies early 2000s which is in part owed to Java having become the religion of the day (not Lisp or SmallTalk)[1]. D couldn't have afforded not to have GC when it first came out. It was expected of a (new) language to provide GC by then - and GC had become a selling point for new languages. [1] And of course computers had become more powerful and could handle the overhead of GC better than in the 80ies.
 What was "new" with Java was compile-once-run-everywhere. Although, 
 that wasn't new either, but it was at least marketable as new.

 Java was the main catalyst for GC - or at least for people demanding 
 it. Practically everybody who had gone through IT courses, college 
 etc. with Java (and there were loads) wanted GC. It was a given for 
 many people.
Well, yes, of course Java being used in universities created a demand for Java and similar languages. But GC languages were extensively used in universities before Java.
 Yes, it didn't last long. But the fact that they bothered to 
 introduce it, shows you how big GC was/is.
No, it shows how demanding manual reference counting was in Objective-C on regular programmers. GC is the first go to solution for easy memory management, and has been so since the 60s. Most high level languages use garbage collection.
It wasn't demanding. I wrote a lot of code in Objective-C and it was perfectly doable. You even have features like `autorelease` for return values. The thing is that Apple had become an increasingly popular platform and more and more programmers were writing code for OS X. So they thought, they'd make it easier and reduce potential memory leaks (introduced by not so experienced Objective-C coders) by adding GC, especially because a lot of programmers expected GC "in this day and age".
Jul 11 2016
prev sibling parent reply Alessandro Ogheri <ogheri alessandroogheri.com> writes:
On Monday, 11 July 2016 at 09:30:37 UTC, Chris wrote:
 On Sunday, 10 July 2016 at 03:25:16 UTC, Ola Fosheim Grøstad 
 wrote:
 Just like there is no C++ book that does not rant about how 
 great RAII is... What do you expect from a language evangelic? 
 The first Java implementation Hotspot inherited its technology 
 from StrongTalk, a Smalltalk successor. It was not a Java 
 phenomenon, and FWIW both Lisp, Simula and Algol68 were 
 garbage collected.
Please stop intentionally missing the point. I don't care if Leonardo Da Vinci already had invented GC - which wouldn't surprise me -
Leonardo Da Vinci was coding in Haskell but he was calling it Haskellius idioma programatoribus...

 but this is not the point. My point is that GC
 became a big thing in the late 90ies early 2000s which is in 
 part owed to Java having become the religion of the day (not 
 Lisp or SmallTalk)[1]. D couldn't have afforded not to have GC 
 when it first came out. It was expected of a (new) language to 
 provide GC by then - and GC had become a selling point for new 
 languages.

 [1] And of course computers had become more powerful and could 
 handle the overhead of GC better than in the 80ies.

 What was "new" with Java was compile-once-run-everywhere. 
 Although, that wasn't new either, but it was at least 
 marketable as new.

 Java was the main catalyst for GC - or at least for people 
 demanding it. Practically everybody who had gone through IT 
 courses, college etc. with Java (and there were loads) wanted 
 GC. It was a given for many people.
Well, yes, of course Java being used in universities created a demand for Java and similar languages. But GC languages were extensively used in universities before Java.
 Yes, it didn't last long. But the fact that they bothered to 
 introduce it, shows you how big GC was/is.
No, it shows how demanding manual reference counting was in Objective-C on regular programmers. GC is the first go to solution for easy memory management, and has been so since the 60s. Most high level languages use garbage collection.
It wasn't demanding. I wrote a lot of code in Objective-C and it was perfectly doable. You even have features like `autorelease` for return values. The thing is that Apple had become an increasingly popular platform and more and more programmers were writing code for OS X. So they thought, they'd make it easier and reduce potential memory leaks (introduced by not so experienced Objective-C coders) by adding GC, especially because a lot of programmers expected GC "in this day and age".
Apr 11 2021
parent Paul Backus <snarwin gmail.com> writes:
On Sunday, 11 April 2021 at 15:42:55 UTC, Alessandro Ogheri wrote:
 On Monday, 11 July 2016 at 09:30:37 UTC, Chris wrote:
 Please stop intentionally missing the point. I don't care if 
 Leonardo Da Vinci already had invented GC - which wouldn't 
 surprise me -
Leonardo Da Vinci was coding in Haskell but he was calling it Haskellius idioma programatoribus...
I appreciate a good joke as much as the next guy, but was it really necessary to reply to a four-year-old thread for this?
Apr 11 2021
prev sibling parent reply Luís Marques <luis luismarques.eu> writes:
On Sunday, 3 July 2016 at 06:23:05 UTC, Walter Bright wrote:
 If the program is compiled with -g and it crashes (seg faults) 
 you'll usually at least get a stack trace. Running it under a 
 debugger will get you much more information.
Only on Windows, and that's a common source of frustration for me :(
Jul 08 2016
next sibling parent reply ketmar <ketmar ketmar.no-ip.org> writes:
On Friday, 8 July 2016 at 15:17:33 UTC, Luís Marques wrote:
 On Sunday, 3 July 2016 at 06:23:05 UTC, Walter Bright wrote:
 If the program is compiled with -g and it crashes (seg faults) 
 you'll usually at least get a stack trace. Running it under a 
 debugger will get you much more information.
Only on Windows, and that's a common source of frustration for me :(
=== z00.d ===
void func () { assert(0, "BOOM!"); }
void main () { func(); }

core.exception.AssertError z00.d(2): BOOM!
----------------
??:? _d_assert_msg [0xb7534687]
z00.d:2 void z00.func() [0x80489f2]
z00.d:6 _Dmain [0x80489ff]
??:? rt.dmain2._d_run_main(int, char**, extern (C) int function(char[][])*).runAll().__lambda1() [0xb7566326]
??:? void rt.dmain2._d_run_main(int, char**, extern (C) int function(char[][])*).tryExec(scope void delegate()) [0xb75661a0]
??:? void rt.dmain2._d_run_main(int, char**, extern (C) int function(char[][])*).runAll() [0xb75662d3]
??:? void rt.dmain2._d_run_main(int, char**, extern (C) int function(char[][])*).tryExec(scope void delegate()) [0xb75661a0]
??:? _d_run_main [0xb75660ff]
??:? main [0x8048a83]
??:? __libc_start_main [0xb6f3f696]

what am i doing wrong? O_O
Jul 08 2016
parent reply Luís Marques <luis luismarques.eu> writes:
On Friday, 8 July 2016 at 15:31:53 UTC, ketmar wrote:
 core.exception.AssertError z00.d(2): BOOM!
 ----------------

 what am i doing wrong? O_O
That's an exception, not a segfault. Try something like int* x; *x = 42;
Jul 08 2016
parent reply ketmar <ketmar ketmar.no-ip.org> writes:
On Friday, 8 July 2016 at 17:04:04 UTC, Luís Marques wrote:
 On Friday, 8 July 2016 at 15:31:53 UTC, ketmar wrote:
 core.exception.AssertError z00.d(2): BOOM!
 ----------------

 what am i doing wrong? O_O
That's an exception, not a segfault. Try something like int* x; *x = 42;
segfault is impossible to catch outside of a debugger. any hackish "solution" to this is WRONG.
Jul 08 2016
parent ketmar <ketmar ketmar.no-ip.org> writes:
p.s. it's not something specific to D. any program that 
"catches" a segfault by itself should be burnt with fire.
Jul 08 2016
prev sibling next sibling parent reply "Schrom, Brian T via Digitalmars-d" <digitalmars-d puremagic.com> writes:
On 7/8/16 8:22 AM, Luís Marques via Digitalmars-d wrote:
 On Sunday, 3 July 2016 at 06:23:05 UTC, Walter Bright wrote:
 If the program is compiled with -g and it crashes (seg faults)
 you'll usually at least get a stack trace. Running it under a
 debugger will get you much more information.
Only on Windows, and that's a common source of frustration for me :(
I've had reasonable success using lldb on mac.
Jul 08 2016
parent reply Luís Marques <luis luismarques.eu> writes:
On Friday, 8 July 2016 at 15:30:12 UTC, Schrom, Brian T wrote:
 I've had reasonable success using lldb on mac.
I was referring to the stack trace on segfault, but regarding the use of debuggers on a Mac with D, most of the time it doesn't work very well for me. I think the last time I used lldb (maybe last week), when I tried to print something in a D program nothing would happen, not even an error. Now that LDC is more up-to-date I'll check if that helps lldb get less confused.
Jul 08 2016
parent reply Jacob Carlborg <doob me.com> writes:
On 2016-07-08 19:07, Luís Marques wrote:

 I was referring to the stack trace on segfault, but regarding the use
 of debuggers on a Mac with D, most of the time it doesn't work very well
 for me. I think the last time I used lldb (maybe last week), when I tried
 to print something in a D program nothing would happen, not even an error.
 Now that LDC is more up-to-date I'll check if that helps lldb get less
 confused.
On OS X when an application segfaults a crash report will be generated. It's available in the Console application. -- /Jacob Carlborg
Jul 10 2016
parent reply Luís Marques <luis luismarques.eu> writes:
On Sunday, 10 July 2016 at 18:53:52 UTC, Jacob Carlborg wrote:
 On OS X when an application segfaults a crash report will be 
 generated. It's available in the Console application.
Doesn't seem to work for me on 10.11.5. Maybe you need to enable that on the latest OSes? In any case, that will probably get you a mangled stack trace, right? It would still be useful (especially if the stack trace is correct; in LLDB I get some crappy ones sometimes) but it would not be as convenient as the stack trace on Windows generated by the druntime.
Jul 11 2016
parent Jacob Carlborg <doob me.com> writes:
On 2016-07-11 14:23, Luís Marques wrote:

 Doesn't seem to work for me on 10.11.5. Maybe you need to enable that on
 the latest OSes?
It works for me. I don't recall specifically enabling crash reports. Are you looking at "All Messages"? You can also look at ~/Library/Logs/DiagnosticReports to see if a new file shows up.
 In any case, that will probably get you a mangled stack
 trace, right?
Well, OS X doesn't know anything about D mangling ;). But it will demangle C++ symbols.
 It would still be useful (especially if the stack trace is
 correct; in LLDB I get some crappy ones sometimes) but it would not be
 as convenient as the stack trace on Windows generated by the druntime.
Yes, of course. -- /Jacob Carlborg
Jul 11 2016
prev sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 7/8/2016 8:17 AM, Luís Marques wrote:
 On Sunday, 3 July 2016 at 06:23:05 UTC, Walter Bright wrote:
 If the program is compiled with -g and it crashes (seg faults) you'll usually
 at least get a stack trace. Running it under a debugger will get you much more
 information.
Only on Windows, and that's a common source of frustration for me :(
Linux too.
Jul 08 2016
parent reply Luís Marques <luis luismarques.eu> writes:
On Friday, 8 July 2016 at 21:26:19 UTC, Walter Bright wrote:
 Only on Windows, and that's a common source of frustration for 
 me :(
Linux too.
Not by default, right? Only with the magic import and call. That's certainly better than on OS X, where there's no segfault handler at all (I don't think there's anything wrong with using it for a debug build), but it's something a bit obscure that is often not enabled when a segfault crash appears by surprise.
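Assuming the "magic import and call" refers to druntime's etc.linux.memoryerror (Linux x86/x86_64 only), a minimal sketch:

=== segv.d ===
import etc.linux.memoryerror;

void main ()
{
    registerMemoryErrorHandler(); // the "magic call"
    int* x;
    *x = 42;  // now surfaces as a thrown NullPointerError with a trace
}

Without that call the process just dies with SIGSEGV, which is the surprise being described.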
Jul 08 2016
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 7/8/2016 2:36 PM, Luís Marques wrote:
 On Friday, 8 July 2016 at 21:26:19 UTC, Walter Bright wrote:
 Only on Windows, and that's a common source of frustration for me :(
Linux too.
Not by default, right?
-g
Jul 09 2016
parent Luís Marques <luis luismarques.eu> writes:
On Saturday, 9 July 2016 at 08:40:00 UTC, Walter Bright wrote:
 On 7/8/2016 2:36 PM, Luís Marques wrote:
 On Friday, 8 July 2016 at 21:26:19 UTC, Walter Bright wrote:
 Only on Windows, and that's a common source of frustration 
 for me :(
Linux too.
Not by default, right?
-g
Well, it doesn't work for me on Linux with the latest DMD, even with -g. To be clear, the whole context was "Not by default, right? Only with the magic import and call."
Jul 11 2016
prev sibling next sibling parent Satoshi <satoshi gshost.eu> writes:
On Sunday, 3 July 2016 at 04:37:02 UTC, D is crap wrote:
 Basically there is no great IDE for D, in fact, there is none. 
 They are all general purpose IDE's that have been configured to 
 compile D code. Great! Except they don't work well because they 
 wern't designed for D. (e.g., template debugging? mixins? Error 
 messages? Code maps? refactoring? All the stuff that more 
 modern languages and IDE's are using is lacking for D.

 11. D has no proper Gui. WTF?  This isn't the 70's no matter 
 how much you to relive peace and sex. Oh, did I hear someone 
 say bindings? WTF?
I have been working on a GUI framework and IDE for D for a year now. BTW, there is still DlangUI.
Jul 02 2016
prev sibling next sibling parent reply ketmar <ketmar ketmar.no-ip.org> writes:
small summary of the starting post: "i am an idiot, and i can't 
use D. of course, you all guilty of my idiocity."
Jul 02 2016
parent reply Ola Fosheim Grøstad writes:
On Sunday, 3 July 2016 at 06:42:23 UTC, ketmar wrote:
 small summary of the starting post: "i am an idiot, and i can't 
 use D. of course, you all guilty of my idiocity."
No, it is the result of beta-quality. You see the same reaction in other languages/frameworks when they claim to be near-production ready while not having a stable state. Here's the core problem:

1. Define a stable set of functionality.
2. Implement it.
3. Improve on it for 3-10 years without changing the functionality.
4. Obtain extensive tooling.

Developers generally underestimate how much work it is to bring something from stable to what is commonly expected from production quality tooling. D is pretty much somewhere between (1) and (2), will most likely not reach (3), and is far away from (4). In C++ it is somewhat different: C/old C++ is at (4), C++11 is at (3), C++14 is at (2/3), C++17 is at (1).
Jul 03 2016
parent reply ketmar <ketmar ketmar.no-ip.org> writes:
On Sunday, 3 July 2016 at 07:04:58 UTC, Ola Fosheim Grøstad wrote:
 3. Improve on it for 3-10 years without changing the 
 functionality.
if i want to use "stable language", i know where to download C89 compiler.
Jul 03 2016
parent Ola Fosheim Grøstad writes:
On Sunday, 3 July 2016 at 07:48:39 UTC, ketmar wrote:
 On Sunday, 3 July 2016 at 07:04:58 UTC, Ola Fosheim Grøstad 
 wrote:
 3. Improve on it for 3-10 years without changing the 
 functionality.
if i want to use "stable language", i know where to download C89 compiler.
That's ok for spare-time use, but when developer time and maintenance matter, stable is more important than features. Without a stable branch, any development platform becomes much less attractive.
Jul 03 2016
prev sibling next sibling parent Israel <robindahoods gmail.com> writes:
On Sunday, 3 July 2016 at 04:37:02 UTC, D is crap wrote:
 Sorry, I've spend the last month trying my best to get simple 
 shit done. At every turn there is some problem that has to be 
 dealt with that is unrelated to my actual work.  Be it the IDE, 
 debugging, the library, or user shared code, it is just crap. D 
 cannot be used successfully for semi-large projects without 
 significant man hours invested in getting everything to work.

 [...]
Bait
Jul 03 2016
prev sibling next sibling parent reply Bauss <jj_1337 live.dk> writes:
On Sunday, 3 July 2016 at 04:37:02 UTC, D is crap wrote:
 Sorry, I've spend the last month trying my best to get simple 
 shit done. At every turn there is some problem that has to be 
 dealt with that is unrelated to my actual work.  Be it the IDE, 
 debugging, the library, or user shared code, it is just crap. D 
 cannot be used successfully for semi-large projects without 
 significant man hours invested in getting everything to work.
Say what? I have used it for multiple big projects of my own ranging from 40000-100000 lines of code? Nor have I ever had issues with IDEs or debugging. I rarely use other people's code, so I can't comment on the quality of that. However, with your biased points I'm sure it's just empty rants anyway.
 I'm sorry but it seems too much focus on enhancements while 
 there are too many bugs and lack of tools to actually do 
 anything useful.
What bugs? What lack of tools? "Oh look, there is a bug, help." Well, report the bug so it can be fixed, moron. "There are no tools." Tell us what tools you need or start making them yourself.
 I'm sorry if this hurts some of you guys feelings but it is 
 fact that D sucks as a whole. A modern programming language 
 should be modern, and D is not one of those languages. It is 
 built as from cobbling together disparate parts that don't work 
 together. The more it is used and added on to the worse it gets.
You clearly don't follow the forum posts, D's future visions, or any sort of discussion regarding D's state and Phobos.
 I'm sorry many of you have spent your lives working on 
 something that won't amount to much in the real world. It was a 
 valiant effort. Maybe it will seed new languages that will 
 actually work and have the tooling to be useful. D itself is 
 not a practical tool for general real world software 
 development. I suppose it is obvious considering there is no 
 significant use of it except by the diehards.
I use it in the real world. I don't know what real world you live in, but okay? If you're too incompetent to use it, then so be it.
 I hope I am wrong, but facts are facts. I would like to list 
 these facts so potential users can be informed before the 
 embark on a likely dead end.
Biased opinions are not facts.
 1. The language is not completely well defined. While the 
 language itself contains many nice features and what makes D 
 appealing, too many features are cobbled together and don't 
 completely work 100% properly. This creates very subtle bugs or 
 problem areas that can stop one in their tracks. One can see 
 how these things are cobbled together by observing the forms 
 and the discussions about how to proceed in certain areas.
I agree with some of this, but it's slowly getting better. It's damage that takes a while to repair, but it's happening. That's something that may come with open-source projects like this, where different people, with different intentions and visions, start working on it without a specific guideline. (One has, however, been made by this point, which is why this isn't so relevant anymore.)
 2. The compilation process is riddled with meaningless error 
 messages and a simple missing ';' can launch one off to the 
 moon to figure out it is missing. The error messages can 
 cascade. Fix the ';' and 20 messages disappear. Usually each 
 message is 100 characters+ when it involves templates.

 Rather than just telling you what is grammatically missing, 
 like any normal modern compiler does, you have to hunt and peck 
 and fill your head with meaningless information.
Any examples of that? Otherwise this doesn't help anything.
 3. The compilers are designed as if they come straight out of 
 the 70's.  The setup is obscure, relies on assumptions that are 
 not true, and just like the compilation process, if your 
 unlucky you could be working for a few days just to try to get 
 dmd to compile.
How is the setup obscure? It's a single installation, what? I believe the issue is on your end if you spend a long time getting DMD to work. That MUST be your configuration. I have installed it on multiple machines and it has worked on the first go.
 4. Object code issues, again, stuff from the 70's are still 
 present.  Rather than fix the shit outright, knowing they are 
 problematic, the can is kicked down the road to the 
 unsuspecting users. The users, who generally know less about 
 what is going on than the people who design the software. Those 
 people who can fix the problem directly and save a lot of grief 
 for people don't because they feel it isn't a big deal and they 
 have more important things to do.
Any examples? Else this one can be blown away too.
 5. The documentation is somewhat crappy. While it is extensive 
 and auto generated it generally is only a "template" of what a 
 real user needs. Rather than just the function declaration, 
 usually with nondescript template names like R, S, U, etc about 
 half the functions are missing use cases. I realize this takes 
 work but it could be done by the people writing the code, 
 again, they know best, right?
Well, if the documentation is crappy, go ahead and document it yourself. It's open-source, and usually documentation happens voluntarily, not because someone is paid to write some formal description of the entire library.
 6. The library is meandering in its design. Feels very 
 convoluted at times, cobbled together rather than designed 
 properly from the get go. Updated language features creates a 
 cascade of library modifications. "Lets move this to this and 
 remove that and then add this... oh, but we gotta update the 
 docs, we'll do that tomorrow...".
I believe the documentation is now updated along with the source? I could be wrong.
 7. The library uses shit for names. Ok, so strip isn't too bad 
 but why not trim? That's what every one else uses. Ok, what 
 about chomp? munch? squeeze? What the fuck is going on? Did the 
 perverted Cookie Monster write this shit?
 What about the infamous tr? Yeah, just cause posix said it was 
 ok then it must
 be so. I'd say we call it t instead.
Again a biased opinion. Strip sounds better in my opinion.
 I could go on and on about stuff like this but I have more 
 important things to do, like
Okay, but nobody cares.
 8. Lets use vim or emacs. I liked the 70's it was great. So 
 great that I want to continue using the same editors because we 
 know them well and they work... and their free!  I like coding 
 at the pace of a turtle with minimal information because that's 
 hard core leet style and makes my balls bigger, which my wife 
 likes.
I use Atom, what's your point with old editors?
 Oh, what about visual studio? Don't get me started! Maybe if 
 Visual D/Mago actually worked half the time and didn't slow me 
 down I'd use that. Xmarian? Worse!
I have had zero problems with Visual D and if you have one, why not report it instead of crying like a little girl?
 Maybe it's time to get out of the dark ages and actually design 
 a program that is designed for creating programs? Not just a 
 fucking text editor that has a few helpful things that programs 
 might use. Do we still have to code in text files? How about we 
 just write everything in binary? Ok, sorry... getting OT.
Go buy a crying cookie.
 Basically there is no great IDE for D, in fact, there is none. 
 They are all general purpose IDE's that have been configured to 
 compile D code. Great! Except they don't work well because they 
 wern't designed for D. (e.g., template debugging? mixins? Error 
 messages? Code maps? refactoring? All the stuff that more 
 modern languages and IDE's are using is lacking for D.
What? https://wiki.dlang.org/IDEs
 9. What I do like about D is that it can compile for various 
 platforms rather easy. Usually I do something like -m64 and run 
 the app then it crashes. I don't know why because their was no 
 error message. The point is that while D can "compile" for 
 various platforms it is always an "on going process".
You can specify compiler flags to get debug information, you can attempt to catch the error, or you can attach a debugger. How hard can it be?
 Because 9/10 D programmers program in linux, windows support is 
 minimal and buggy. Since I don't use linux, because windows has 
 a much larger market share, maybe D is great on linux. On 
 windows though, it's a literal pain in the ass. All the time I 
 spend trying to figure out how to get things to work properly 
 has given me hemorrhoids. God did not design Man's ass to sit 
 in front of a computer all day. BTW, a program can't just 
 "work", I have clients that have a certain level of 
 expectation, like no seg faults. Just because it works for me, 
 or for you is not good enough. It has to work for everyone.
I program on Windows and Windows only. I have never had a single issue because of that and I have been programming D for roughly 4 years now, pretty much every day.
 10. Most user contributed D packages are outdated. They simply 
 don't work anymore due to all the language changes. Instead of 
 culling the crap, it persists and the user has to wade through 
 it all. It's every man for himself when it comes to D.
Well, instead of crying, fix them and make them work again. Nobody is obligated to keep their projects up to date.
 11. D has no proper Gui. WTF?  This isn't the 70's no matter 
 how much you to relive peace and sex. Oh, did I hear someone 
 say bindings? WTF?
Uhmm, DlangUI?
 12. D has no proper logging system. I just don't want to log a 
 message, I want a well designed and easily manageable way to 
 understand problems my program is experiencing. Given that D 
 has so many latent issues, it's nice to have some way to deal 
 with the "Big foot"(But bug that you only see when when your 
 driving down a windy dark road in Nebraska).
Say what? https://dlang.org/phobos/std_experimental_logger.html There's also profiling logs, code coverage logs etc. that can be specified with compiler flags.
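For reference, a minimal std.experimental.logger sketch (the file name and messages are mine):

=== logdemo.d ===
import std.experimental.logger;

void main ()
{
    sharedLog = new FileLogger("app.log"); // route messages to a file
    info("starting up");                   // each entry gets time, file and line
    warning("something looks off");
}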
 13. Windows interfacing. Thanks for the bindings! The most used 
 OS in the would with the largest commercial share only gets 
 bindings that is actually like programming in win32. Rather 
 than at least wrap them in a working oop design that hides away 
 the abacus feel, we are stuck with bindings. The D community 
 loves bindings, I have no fucking clue why. It just means more 
 work. At least if I didn't have the bindings I wouldn't have to 
 implement anything.
If you're missing Win API calls, simply add them yourself and do a pull request, or stop crying. I believe there was an open-source project that had bindings to most Win API calls, though. But again, D is open-source and mostly worked on voluntarily, so if something is missing, add it. Most people don't go adding a billion features that may or may not be used. They usually add something because they need it.
 14. Gaming? It can be done, not by me or you but by geniuses 
 who live in their basement and no one to support or nothing 
 else to do but hash out how to get it done. But while they 
 might share their results, don't get your hopes up and expect 
 it to work for you.
What are you talking about?
 15. Cross platform design? Maybe, Supposedly it works but I 
 have too much trouble with windows to care about adding another 
 layer of crap on top.
You have too much trouble with yourself. Go cry me a river, because you sound too incompetent to achieve anything even if it worked for you.
 16. The community. While not worse than most others, doesn't 
 really give a shit about the real world programmer. The elite 
 are too busy thinking of ways to add the latest and greatest 
 feature, thinking it will attract more buyers. The rabble 
 rousers like myself don't actually do much. Ultimately things 
 get done but nothing useful happens. Kinda like those jackasses 
 that floor it with their foot on the break. A lot of smoke but 
 pointless. D's been going how long? 10+ years?
Such a biased opinion, but how do you expect people to treat you when you're just acting like an ass? I have never had any issues with people in the D community, in fact I have felt pretty welcomed and generally most people around here are nice.
 The majority of you guys here don't realize that the average 
 programming can't waste time creating a gui, implementing a 
 design for the bindings you created, patching together 
 different packages that don't work together, etc.
I work 8+ hours every day, not working with D. I believe that I understand. D is open-source and people work on it when they have time. If you have a problem with that, then D is not the language for you.
 While there are, I'm sure, a lot of good intentions, they are 
 meaningless when it comes to putting food on the table.  If you 
 are going to do something, do it with gusto, not half ass. If 
 you are going to design a package, do it right! Not something 
 that continually requires fixing and effects every person that 
 uses it exponentially. Every minute I spend fixing someone 
 else's shit takes a minute away from my life. For N-1 other 
 users that's N minutes wasted because the original creator 
 didn't take the extra minute. Thanks for wasting all of our 
 time. That's a factor of N. Now when we have to do that for M 
 packages, that's M*N's people shit we have to fix. All because 
 one person didn't want to spend one extra minute fixing their 
 own shit. Ok, so, it might not be exponentially but it's still 
 time that could be better spent on more important things.



 17 ...
Blah...
 18. As you can see, I've ran out of steam. My butt hurts and I 
 have better things to do... like delete dmd from my computer. 
 At least that's simple and works! (I hope, maybe it will seg 
 fault on me or I have to run some special command line switch).
Well please don't come again.
 19. PS. Ok, so, D isn't as terrible as I'm making it out. It's 
 free. And as they say, you get what you pay for ;)
Most programming languages are free; how the hell does that make any sense? D's quality is pretty good. Just because you're too idiotic to understand its semantics doesn't mean its quality is bad.
 20. I hope the D community can come together at some point and 
 work towards a common goal that will benefit humanity. It's a 
 mini-cosmos of what is going on in the world today. Everyone is 
 in it for themselves and they don't realize the big picture and 
 how every little thing they do has an enormous impact on the 
 human species.  We aren't doing this stuff for fun, we do it to 
 make a better life for ourselves, which means we also have to 
 do it for everyone else(because we are all in it together).

 Their is so much wasted time and life by so many people for so 
 many useless reasons that we could have built a bridge, brick 
 by brick, to moon and back, a dozen fold.  Humanity is an 
 amazing species capable of unimaginable things. By extension, 
 so is the D community. I just hope D doesn't end up like the 
 Kardashians as it has so much more use for humanity.

 00. Please get your shit together! I mean that in the best 
 possible way!
The Kardashians? Really...
Jul 03 2016
parent Icecream Bob <icecream icecream.com> writes:
On Sunday, 3 July 2016 at 07:16:17 UTC, Bauss wrote:
 On Sunday, 3 July 2016 at 04:37:02 UTC, D is crap wrote:
 [...]
Say what? I have used it for multiple big projects of my own ranging from 40000-100000 lines of code? [...]
 [...]
That's adorable. You think that's a big project :D
Jul 05 2016
prev sibling next sibling parent reply tester <tester noreply.com> writes:
On Sunday, 3 July 2016 at 04:37:02 UTC, D is crap wrote:
 Sorry, I've spend the last month trying my best to get simple 
 shit done. At every turn there is some problem that has to be 
 dealt with that is unrelated to my actual work.  Be it the IDE, 
 debugging, the library, or user shared code, it is just crap. D 
 cannot be used successfully for semi-large projects without 
 significant man hours invested in getting everything to work.
couldn't agree more with all the points in that post! reactions are as expected - dumb. +100
Jul 03 2016
parent ketmar <ketmar ketmar.no-ip.org> writes:
 reactions are as expected - dumb.
nice signature.
Jul 03 2016
prev sibling next sibling parent reply Guillaume Piolat <first.last gmail.com> writes:
On Sunday, 3 July 2016 at 04:37:02 UTC, D is crap wrote:
 I'm sorry but it seems too much focus on enhancements while 
 there are too many bugs and lack of tools to actually do 
 anything useful.

 I'm sorry if this hurts some of you guys feelings but it is 
 fact that D sucks as a whole. A modern programming language 
 should be modern, and D is not one of those languages. It is 
 built as from cobbling together disparate parts that don't work 
 together. The more it is used and added on to the worse it gets.
That was an entertaining rant, dare I say well written, and maybe with a few actionable points. I fear that your perception of the problems is worse than the problems themselves in most cases though, perhaps to make a stronger point.

For example, choosing the 1970s compilation model with object files was probably the only possible decision. Windows support isn't that bad -except- for the occasional sc.ini tweak when you don't use the VS and DMD installers in the right order, and VisualD works well. I don't really see what is missing when compared to the C++ experience. You are right about documentation, but there are efforts in this very direction.

The fact is that rants about D complain about more and more elaborated and "end-userish" issues. People now expects... D is 10 years old, perhaps more, but refining the end-user experience is much more recent than that. It has all come together recently, so your expectations went up at the same time.

I feel the heart of the argument is that D is something that you have to get into to use; it's not really something that you can almost avoid learning and just use (like, say, you could write bad Python scripts pretty quickly). In other words, D has high interest rates but there is an initial entry cost.
Jul 03 2016
next sibling parent Guillaume Piolat <first.last gmail.com> writes:
On Sunday, 3 July 2016 at 09:42:35 UTC, Guillaume Piolat wrote:
 People now expects
an experience without surprise.
Jul 03 2016
prev sibling parent Guillaume Piolat <first.last gmail.com> writes:
On Sunday, 3 July 2016 at 09:42:35 UTC, Guillaume Piolat wrote:
 The fact is that rants about D complain about more and more 
 elaborated and "end-userish" issues.
For example, look back at the "Angry Emo D Rant" from 2010: http://dlang.group.iteye.com/group/topic/20404 Of the 17 bullet-point complaints the author made, 12 have been addressed since.
Jul 03 2016
prev sibling next sibling parent reply bachmeier <no spam.net> writes:
On Sunday, 3 July 2016 at 04:37:02 UTC, D is crap wrote:
 I'm sorry many of you have spent your lives working on 
 something that won't amount to much in the real world. It was a 
 valiant effort. Maybe it will seed new languages that will 
 actually work and have the tooling to be useful. D itself is 
 not a practical tool for general real world software 
 development.
What is "real world software development"? Of course it is exactly what I happen to do and it is done exactly the way I do it. Nobody uses scripting languages, nobody uses Python, all programming is web development, real development is done in an IDE, real development is done in a corporate setting as part of a team of at least 15 software engineers and involves automated testing and dependency checking and code reviews. Nobody uses garbage collection. Real developers don't use a compiler. I have nothing useful to add to this discussion. It's Sunday morning and this lecture about how I'm not a real programmer annoyed me.
Jul 03 2016
parent Guillaume Piolat <first.last gmail.com> writes:
On Sunday, 3 July 2016 at 11:20:14 UTC, bachmeier wrote:
 What is "real world software development"? Of course it is 
 exactly what I happen to do and it is done exactly the way I do 
 it. Nobody uses scripting languages, nobody uses Python, all 
 programming is web development, real development is done in an 
 IDE, real development is done in a corporate setting as part of 
 a team of at least 15 software engineers and involves automated 
 testing and dependency checking and code reviews. Nobody uses 
 garbage collection. Real developers don't use a compiler.

 I have nothing useful to add to this discussion. It's Sunday 
 morning and this lecture about how I'm not a real programmer 
 annoyed me.
Time to quote DHH in his book Rework: http://e-venue.org/wp/ignore-the-real-world/
Jul 03 2016
prev sibling next sibling parent Karabuta <karabutaworld gmail.com> writes:
On Sunday, 3 July 2016 at 04:37:02 UTC, D is crap wrote:
 Sorry, I've spend the last month trying my best to get simple 
 shit done. At every turn there is some problem that has to be 
 dealt with that is unrelated to my actual work.  Be it the IDE, 
 debugging, the library, or user shared code, it is just crap. D 
 cannot be used successfully for semi-large projects without 
 significant man hours invested in getting everything to work.

 [...]
I do believe you are a die-hard user of Visual Studio. These guys cannot live outside their comfort zone. Ha ha. +1 for stability though. Ha ha.
Jul 03 2016
prev sibling next sibling parent singingbush <singingbush hotmail.com> writes:
On Sunday, 3 July 2016 at 04:37:02 UTC, D is crap wrote:
 10. Most user contributed D packages are outdated. They simply 
 don't work anymore due to all the language changes.
This is the only point in the rant that I can agree with. I've been stung a few times by dub dependencies preventing me from building a project. It's nothing to do with the language itself though, and I think complaining about it is completely wrong. Rather than moaning here you should be creating a bug report for whichever project you find problems with, or even better, contributing a fix. I spend most of my time working with Java, so I have gotten used to having a huge amount of free libs & tools, because the ecosystem around Java is huge. The D community is tiny by comparison, so there's far less in that respect. That hasn't stopped me wanting to use D though; I can still see benefits to using it and I prefer it to Rust or Go.
Jul 03 2016
prev sibling next sibling parent reply Adam D. Ruppe <destructionator gmail.com> writes:
On Sunday, 3 July 2016 at 04:37:02 UTC, D is crap wrote:
 I'm sorry many of you have spent your lives working on 
 something that won't amount to much in the real world.
A friend asked me yesterday if I feel any kind of fulfillment from my job, like in a religious "this is what I am meant to do" way. I said "lol no, it is just a day job". D is similar to me: it makes things easier for me, so I use it. Saves a lot of time which is a reward itself. If it doesn't work for you, meh, whatever.
Jul 03 2016
parent reply ketmar <ketmar ketmar.no-ip.org> writes:
On Monday, 4 July 2016 at 00:17:20 UTC, Adam D. Ruppe wrote:
 D is similar to me: it makes things easier for me, so I use it. 
 Saves a lot of time which is a reward itself. If it doesn't 
 work for you, meh, whatever.
yep. D is just a tool. a great tool, way better than most tools in its group (for me, at least), but not something i will worship. like any good engineer, i love my tools: they make my life easier. but if i find some tool that makes one of my current tools obsolete, and makes me way more productive with less effort (mental included ;-)... i'll switch in the blink of an eye. like i did with Basic->Pascal, Pascal->C, then C->D. and mind you, i had huge codebases at least with Pascal and C. that didn't stop me. it won't stop me from D->???, if anything.

still, there is something one should consider: sometimes engineers defend their tools with what may look like a religious passion. and often it is not that, but we just got tired of reading "your toolbox sux, fix it, switch to XYZ, blah-blah-blah", and explaining again and again that "we are fully aware about this, but we have reasons 'a', 'b', 'c', ... 'many' to use this toolbox. and we are working on making it better." such chats may be fun the first three or five times, but then you just skip them, possibly writing short answers like: "bwah, another dumb pseudo-guru. please, get lost." so please, people, stop acting like you are the ones who see the light of Ultimate Truth. we *know* your Truth for *ages*. write bug reports instead of rants! ;-)

i'm guilty of this myself (ranting and so on), so i've been on both sides. when i just came to D, i've immediately noticed alot of obvious things that should be fixed for good, and sometimes wrote about that in "hey, listen to me, i know what to do!" manner. but as time passed, i've seen many other posts similar to mines. if it was so obvious, D devs must be really dumb to not see that by theirselves, right? and if they are so dumb, i don't think that they can program at all, so D just can't exist! but D exists, so something is wrong in this logical chain. ;-)
Jul 04 2016
parent reply Ola Fosheim Grøstad writes:
On Monday, 4 July 2016 at 07:20:05 UTC, ketmar wrote:
 i'm guilty of this myself (ranting and so on), so i've been on 
 both sides. when i just came to D, i've immediately noticed 
 alot of obvious things that should be fixed for good, and 
 sometimes wrote about that in "hey, listen to me, i know what 
 to do!" manner. but as time passed, i've seen many other posts 
 similar to mines. if it was so obvious, D devs must be really 
 dumb to not see that by theirselves, right? and if they are so 
 dumb, i don't think that they can program at all, so D just 
 can't exist! but D exists, so something is wrong in this 
 logical chain. ;-)
Nah, this just means that you have gotten used to the quirks and learned to work around them and therefore no longer notice them as much. That does not mean that newbies are wrong in complaining about obstacles. Lack of libraries is not the most reasonable thing to complain about, but language and tooling issues are fair game.
Jul 04 2016
parent reply ketmar <ketmar ketmar.no-ip.org> writes:
On Monday, 4 July 2016 at 07:53:13 UTC, Ola Fosheim Grøstad wrote:
 Nah, this just means that you have gotten used to the quirks 
 and learned to work around them and therefore no longer notice 
 them as much.
nope, i still hate all the things i hated years ago, with the same passion. after all, that's why i'm slowly working on aliced. ;-)
Jul 04 2016
parent reply Ola Fosheim Grøstad writes:
On Monday, 4 July 2016 at 07:58:29 UTC, ketmar wrote:
 On Monday, 4 July 2016 at 07:53:13 UTC, Ola Fosheim Grøstad 
 wrote:
 Nah, this just means that you have gotten used to the quirks 
 and learned to work around them and therefore no longer notice 
 them as much.
nope, i still hate all the things i hated years ago, with the same passion. after all, that's why i'm slowly working on aliced. ;-)
But that is actually good news, I thought you had lost your spark of passion ;-).
Jul 04 2016
next sibling parent ketmar <ketmar ketmar.no-ip.org> writes:
On Monday, 4 July 2016 at 16:25:24 UTC, Ola Fosheim Grøstad wrote:
 But that is actually good news, I thought you had lost your 
 spark of passion ;-).
i lost interest in trying to improve mainline: it is more like C++ now, with legacy features "untouchable", and new breaking features unwelcome. while i understand the reasons, i still hate 'em as much as i always hated 'em. so i still have various branches in the aliced repo, where i'm trying various features (or feature cuts ;-), and eventually promoting some of them to "main aliced". like -- i really can't understand how people write code without named arguments. or, rather, i *can* understand, and it is a PITA. from the other side -- reviving `typedef` was too much work, so we lost it.
Jul 04 2016
prev sibling parent reply ketmar <ketmar ketmar.no-ip.org> writes:
On Monday, 4 July 2016 at 16:25:24 UTC, Ola Fosheim Grøstad wrote:
p.s. as nobody in his sane mind is reading this thread
anymore, i can even reveal that my work on SSA-based backend is 
not completely stalled. it will be universal library, but i'm 
planning to augment DMD with it too. while i don't think that it 
will be better than current dmd codegen, it will generate "good 
enough" code without optimizations. and yeah, non-proprietary 
license too. and, maybe, CTFE jitted with the same backend.

ambitious plans, i know, but hey! also, with this new backend 
aliced may finally leave underground and present herself to the 
world! ;-)

as for the current state -- library does basic copy elimination, 
value spilling and register selection. so it's not just a 
daydreaming. ;-)
Jul 04 2016
parent reply Ola Fosheim Grøstad writes:
On Tuesday, 5 July 2016 at 05:14:48 UTC, ketmar wrote:
 anymore, i can even reveal that my work on SSA-based backend is 
 not completely stalled. it will be universal library, but i'm 
 planning to augment DMD with it too. while i don't think that 
 it will be better than current dmd codegen, it will generate 
 "good enough" code without optimizations. and yeah, 
 non-proprietary license too. and, maybe, CTFE jitted with the 
 same backend.
Sounds like a good project. A non-optimizing backend that retains type information and asserts could be very useful. And keeping it simple is valuable, it makes it easier for others to understand the front-end/back-end connection.
 ambitious plans, i know, but hey! also, with this new backend 
 aliced may finally leave underground and present herself to the 
 world! ;-)
Hey, I'm sure others will chime in on code-gen, just for fun.
 as for the current state -- library does basic copy 
 elimination, value spilling and register selection. so it's not 
 just a daydreaming. ;-)
:-)
Jul 04 2016
parent reply ketmar <ketmar ketmar.no-ip.org> writes:
On Tuesday, 5 July 2016 at 06:27:17 UTC, Ola Fosheim Grøstad 
wrote:
 Sounds like a good project. A non-optimizing backend that 
 retains type information and asserts could be very useful. And 
 keeping it simple is valuable, it makes it easier for others to 
 understand the front-end/back-end connection.
but... but it IS optimizing backend! once i have SSA, i can do alot of optimizations on that -- i just didn't implemented 'em all, but it doesn't matter. and any such optimizations are completely independent of frontend code anyway.
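For readers who haven't met SSA: once every value is assigned exactly once, a pass like copy elimination degenerates into a def-use rewrite. A toy illustration (my sketch, unrelated to aliced's actual IR):

=== ssa_toy.d ===
import std.stdio;

struct Inst { string def; string op; string[] args; }

void main ()
{
    // x1 = 1; x2 = x1 (a copy); y1 = x2 + x2
    Inst[] code = [
        Inst("x1", "const 1", []),
        Inst("x2", "copy",    ["x1"]),
        Inst("y1", "add",     ["x2", "x2"]),
    ];

    // a copy "a = b" dies by rewriting every later use of "a" to "b";
    // this is only sound because each SSA name has a single definition:
    string[string] replaced;
    Inst[] optimized;
    foreach (ins; code)
    {
        foreach (ref a; ins.args)
            if (auto p = a in replaced)
                a = *p;
        if (ins.op == "copy")
            replaced[ins.def] = ins.args[0];
        else
            optimized ~= ins;
    }
    foreach (ins; optimized)
        writeln(ins.def, " = ", ins.op, " ", ins.args);
    // prints:
    //   x1 = const 1 []
    //   y1 = add ["x1", "x1"]
}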
Jul 04 2016
parent reply Ola Fosheim Grøstad writes:
On Tuesday, 5 July 2016 at 06:38:18 UTC, ketmar wrote:
 On Tuesday, 5 July 2016 at 06:27:17 UTC, Ola Fosheim Grøstad 
 wrote:
 Sounds like a good project. A non-optimizing backend that 
 retains type information and asserts could be very useful. And 
 keeping it simple is valuable, it makes it easier for others 
 to understand the front-end/back-end connection.
but... but it IS optimizing backend! once i have SSA, i can do alot of optimizations on that -- i just didn't implemented 'em all, but it doesn't matter. and any such optimizations are completely independent of frontend code anyway.
Ohoh, sorry :-) But quite frankly, good SIMD support is more important to me than a good optimizer. So if you extend aliced with good SIMD support and do it well in the backend then it could take off, IMO.
Jul 04 2016
next sibling parent ketmar <ketmar ketmar.no-ip.org> writes:
On Tuesday, 5 July 2016 at 06:44:55 UTC, Ola Fosheim Grøstad 
wrote:
 Ohoh, sorry :-) But quite frankly, good SIMD support is more 
 important to me than a good optimizer. So if you extend aliced 
 with good SIMD support and do it well in the backend then it 
 could take off, IMO.
heh. that is something i completely don't care about. but still, it is doable from within the SSA engine too! you can actually write a vectorization pass over SSA.
Jul 04 2016
prev sibling parent reply ZombineDev <petar.p.kirov gmail.com> writes:
On Tuesday, 5 July 2016 at 06:44:55 UTC, Ola Fosheim Grøstad 
wrote:
 On Tuesday, 5 July 2016 at 06:38:18 UTC, ketmar wrote:
 On Tuesday, 5 July 2016 at 06:27:17 UTC, Ola Fosheim Grøstad 
 wrote:
 Sounds like a good project. A non-optimizing backend that 
 retains type information and asserts could be very useful. 
 And keeping it simple is valuable, it makes it easier for 
 others to understand the front-end/back-end connection.
but... but it IS optimizing backend! once i have SSA, i can do alot of optimizations on that -- i just didn't implemented 'em all, but it doesn't matter. and any such optimizations are completely independent of frontend code anyway.
Ohoh, sorry :-) But quite frankly, good SIMD support is more important to me than a good optimizer. So if you extend aliced with good SIMD support and do it well in the backend then it could take off, IMO.
https://gist.github.com/9il/a167e56d7923185f6ce253ee14969b7f https://gist.github.com/9il/58c1b80110de2db5f2eff6999346a928 available today with LDC ;) See also https://github.com/ldc-developers/ldc/issues/1438
Jul 05 2016
parent reply Ola Fosheim Grøstad writes:
On Tuesday, 5 July 2016 at 09:23:42 UTC, ZombineDev wrote:
 https://gist.github.com/9il/a167e56d7923185f6ce253ee14969b7f
 https://gist.github.com/9il/58c1b80110de2db5f2eff6999346a928

 available today with LDC ;)
I meant good manual SIMD support in the language, not vectorization.
Jul 05 2016
next sibling parent Guillaume Piolat <first.last gmail.com> writes:
On Tuesday, 5 July 2016 at 09:51:01 UTC, Ola Fosheim Grøstad 
wrote:
 On Tuesday, 5 July 2016 at 09:23:42 UTC, ZombineDev wrote:
 https://gist.github.com/9il/a167e56d7923185f6ce253ee14969b7f
 https://gist.github.com/9il/58c1b80110de2db5f2eff6999346a928

 available today with LDC ;)
I meant good manual SIMD support in the language, not vectorization.
On that note, I made this to use the Intel intrinsic names: https://github.com/p0nce/intel-intrinsics
Jul 05 2016
prev sibling parent reply ZombineDev <petar.p.kirov gmail.com> writes:
On Tuesday, 5 July 2016 at 09:51:01 UTC, Ola Fosheim Grøstad 
wrote:
 On Tuesday, 5 July 2016 at 09:23:42 UTC, ZombineDev wrote:
 https://gist.github.com/9il/a167e56d7923185f6ce253ee14969b7f
 https://gist.github.com/9il/58c1b80110de2db5f2eff6999346a928

 available today with LDC ;)
I meant good manual SIMD support in the language, not vectorization.
Have you put any enhancement request on https://issues.dlang.org or written a DIP? If not, I can guarantee with almost 100% that it will not get worked on because no one knows what you need. If you really want good SIMD support in D, you should look at https://dlang.org/spec/simd.html and the list of intrinsics that GDC and LDC provide, and write a list of things that you find missing in terms of language support. Otherwise your claims are vague and non-actionable.
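For the record, here is what the existing language support looks like on a 64-bit target, as a minimal core.simd sketch (the file name is mine):

=== simd_demo.d ===
import core.simd;

void main ()
{
    float4 a = [1, 2, 3, 4];
    float4 b = 2;      // a scalar broadcasts to all four lanes
    float4 c = a * b;  // element-wise multiply, one SSE instruction
    assert(c.array == [2f, 4f, 6f, 8f]);
}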
Jul 05 2016
next sibling parent reply ketmar <ketmar ketmar.no-ip.org> writes:
On Tuesday, 5 July 2016 at 11:27:33 UTC, ZombineDev wrote:
 https://dlang.org/spec/simd.html and the list of intrinsics
core.simd is completely unusable on any 32-bit targets except hipsteros.
Jul 05 2016
parent reply ZombineDev <petar.p.kirov gmail.com> writes:
On Tuesday, 5 July 2016 at 12:40:57 UTC, ketmar wrote:
 On Tuesday, 5 July 2016 at 11:27:33 UTC, ZombineDev wrote:
 https://dlang.org/spec/simd.html and the list of intrinsics
core.simd is completely unusable on any 32-bit targets except hipsteros.
Why? I only found this issue: https://issues.dlang.org/show_bug.cgi?id=16092
Jul 05 2016
parent reply ketmar <ketmar ketmar.no-ip.org> writes:
On Tuesday, 5 July 2016 at 14:52:33 UTC, ZombineDev wrote:
 On Tuesday, 5 July 2016 at 12:40:57 UTC, ketmar wrote:
 On Tuesday, 5 July 2016 at 11:27:33 UTC, ZombineDev wrote:
 https://dlang.org/spec/simd.html and the list of intrinsics
core.simd is completely unusable on any 32-bit targets except hipsteros.
Why?
'cause even documentation says so: "The vector extensions are currently implemented for the OS X 32 bit target, and all 64 bit targets.". and this time documentation is correct.
Jul 05 2016
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 7/5/2016 9:11 AM, ketmar wrote:
 'cause even documentation says so: "The vector extensions are currently
 implemented for the OS X 32 bit target, and all 64 bit targets.". and this time
 documentation is correct.
This is because:

1. so D can run on earlier 32 bit processors without SIMD
2. SIMD support requires the stack be aligned everywhere to 128 bits. This can be a bit burdensome for 32 bit targets.
3. (1) and (2) are not issues on OSX 32, because their memory model requires it
4. people wanting high performance are going to be using 64 bits anyway
Jul 05 2016
parent reply ketmar <ketmar ketmar.no-ip.org> writes:
On Tuesday, 5 July 2016 at 20:27:58 UTC, Walter Bright wrote:
 1. so D can run on earlier 32 bit processors without SIMD
this is something the programmer should check at runtime if he is using SIMD.
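druntime already ships the pieces for such a check; a minimal core.cpuid sketch (the file name is mine):

=== cpucheck.d ===
import core.cpuid : avx, sse2, sse41;
import std.stdio : writefln;

void main ()
{
    // pick a code path (or refuse to run) based on the actual CPU:
    writefln("SSE2: %s, SSE4.1: %s, AVX: %s", sse2, sse41, avx);
    if (!sse2)
        writefln("no SSE2 -- taking the scalar fallback");
}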
 2. SIMD support requires the stack be aligned everywhere to 128 
 bits. This can be a bit burdensome for 32 bit targets.
but...
 3. (1) and (2) are not issues on OSX 32, because their memory 
 model requires it
so the code is already there, but only for osx. fun. anyway, fixing long-standing bug with `align()` being ignored on stack variables will allow to use SIMD on x86.
 4. people wanting high performance are going to be using 64 
 bits anyway
so i'm not in a set of "people". ok.
Jul 05 2016
next sibling parent reply Seb <seb wilzba.ch> writes:
On Tuesday, 5 July 2016 at 21:44:17 UTC, ketmar wrote:
 On Tuesday, 5 July 2016 at 20:27:58 UTC, Walter Bright wrote:
 4. people wanting high performance are going to be using 64 
 bits anyway
so i'm not in a set of "people". ok.
It might be a good time to think about your hardware. Btw there is a recent announcement that Ubuntu and others will drop 32-bit support quite soon. http://slashdot.org/story/313313 Here is a copy - the same arguments apply also for performance features.
 Major Linux distributions are in agreement: it's time to stop 
 developing new versions for 32-bit processors. Simply: it's a 
 waste of time, both to create the 32-bit port, and to keep 
 32-bit hardware around to test it on. At the end of June, 
 Ubuntu developer Dimitri Ledkov chipped into the debate with 
 this mailing list post, saying bluntly that 32-bit ports are a 
 waste of resources. "Building i386 images is not 'for free', it 
 comes at the cost of utilising our build farm, QA and 
 validation time. Whilst we have scalable build-farms, i386 
 still requires all packages, autopackage tests, and ISOs to be 
 revalidated across our infrastructure." His proposal is that 
 Ubuntu version 18.10 would be 64-bit-only, and if users 
 desperately need to run 32-bit legacy applications, they'll have 
 to do so in containers or virtual machines. [...] In a forum 
 thread, the OpenSUSE Chairman account says 32-bit support 
 "doubles our testing burden (actually, more so, do you know how 
 hard it is to find 32-bit hardware these days?). It also 
 doubles our build load on OBS".
Jul 05 2016
parent reply Basile B. <b2.temp gmx.com> writes:
On Tuesday, 5 July 2016 at 22:38:29 UTC, Seb wrote:
 On Tuesday, 5 July 2016 at 21:44:17 UTC, ketmar wrote:
 On Tuesday, 5 July 2016 at 20:27:58 UTC, Walter Bright wrote:
 4. people wanting high performance are going to be using 64 
 bits anyway
so i'm not in a set of "people". ok.
It might be a good time to think about your hardware. Btw there is a recent announcement that Ubuntu and others will drop 32-bit support quite soon. http://slashdot.org/story/313313 Here is a copy - the same arguments apply also for performance features.
 Major Linux distributions are in agreement: it's time to stop 
 developing new versions for 32-bit processors. Simply: it's a 
 waste of time, both to create the 32-bit port, and to keep 
 32-bit hardware around to test it on. At the end of June, 
 Ubuntu developer Dimitri Ledkov chipped into the debate with 
 this mailing list post, saying bluntly that 32-bit ports are a 
 waste of resources. "Building i386 images is not 'for free', 
 it comes at the cost of utilising our build farm, QA and 
 validation time. Whilst we have scalable build-farms, i386 
 still requires all packages, autopackage tests, and ISOs to be 
 revalidated across our infrastructure." His proposal is that 
 Ubuntu version 18.10 would be 64-bit-only, and if users 
 desperately need to run 32-bit legacy applications, they'll 
 have to do so in containers or virtual machines. [...] In a 
 forum thread, the OpenSUSE Chairman account says 32-bit 
 support "doubles our testing burden (actually, more so, do you 
 know how hard it is to find 32-bit hardware these days?). It 
 also doubles our build load on OBS".
I bet it's not a hardware thing but rather an OS thing. People on Windows mostly use 32-bit DMD because the 64-bit version requires the MS VS environment. Are you on Windows, Ketmar?
Jul 05 2016
parent reply ketmar <ketmar ketmar.no-ip.org> writes:
On Tuesday, 5 July 2016 at 23:50:35 UTC, Basile B. wrote:
 Major Linux distributions...
...
 Are you on Windows, Ketmar?
no. GNU/Linux here. and i don't care what shitheads from "major linux distributions" may think.
Jul 05 2016
parent reply Basile B. <b2.temp gmx.com> writes:
On Wednesday, 6 July 2016 at 01:27:11 UTC, ketmar wrote:
 On Tuesday, 5 July 2016 at 23:50:35 UTC, Basile B. wrote:
 Major Linux distributions...
...
 Are you on Windows, Ketmar?
no. GNU/Linux here. and i don't care what shitheads from "major linux distributions" may think.
ok, bad bet, but why do you insist on DMD 32-bit SIMD support? can't you use a 64-bit Linux distribution?
Jul 05 2016
parent reply ketmar <ketmar ketmar.no-ip.org> writes:
On Wednesday, 6 July 2016 at 02:10:09 UTC, Basile B. wrote:
 ok, bad bet, but why do you insist on DMD 32-bit SIMD support?
 can't you use a 64-bit Linux distribution?
i can even dance naked on the street, no problems. but i just can't see a reason to do that.
Jul 05 2016
parent reply Basile B. <b2.temp gmx.com> writes:
On Wednesday, 6 July 2016 at 02:34:04 UTC, ketmar wrote:
 On Wednesday, 6 July 2016 at 02:10:09 UTC, Basile B. wrote:
 ok, bad bet, but why do you insist on DMD 32-bit SIMD support?
 can't you use a 64-bit Linux distribution?
i can even dance naked on the street, no problems. but i just can't see a reason to do that.
That's a bad analogy. You layer two unrelated possibilities in order to show that the first one is stupid. But it's the opposite: the analogy is stupid, not the proposition you refer to, because while using a 64 bit linux will help you, dancing naked in the street won't. This kind of reasoning may impress a child or someone a bit weak, but seriously, you can't win when the solution is so obvious.
Jul 05 2016
parent reply ketmar <ketmar ketmar.no-ip.org> writes:
On Wednesday, 6 July 2016 at 03:23:18 UTC, Basile B. wrote:
 while using a 64 bit linux will help you, dancing naked in the 
 street won't.
it will really help me to head my house, yes. so you're proposing that i rebuild the world, and all my custom-built software (a lot!), for... for nothing, as (i said it before) none of the apps i'm using require more than 2GB of RAM. and SIMD instructions are perfectly usable in 32-bit mode too -- with anything except DMD. so you're proposing that i fuck up my whole system to work around a DMD bug/limitation. great. this is even more stupid than naked dances.
Jul 05 2016
parent ketmar <ketmar ketmar.no-ip.org> writes:
p.s. *heat. ;-)

p.p.s. and i can use SIMD with DMD built-in asm, of course. 
that's what i did in Follin, and it works like a charm. but, of 
course, the code is completely unportable -- and this is 
something i wanted to avoid...
Jul 05 2016
prev sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 7/5/2016 2:44 PM, ketmar wrote:
 anyway, fixing long-standing bug with `align()` being ignored on stack
variables
 will allow to use SIMD on x86.
Not really. The alignment requirement has to be done by all functions, whether they use SIMD or not.
 4. people wanting high performance are going to be using 64 bits anyway
so i'm not in a set of "people". ok.
I'm curious about why you require 32 bits.
Jul 05 2016
next sibling parent reply deadalnix <deadalnix gmail.com> writes:
On Tuesday, 5 July 2016 at 23:56:48 UTC, Walter Bright wrote:
 On 7/5/2016 2:44 PM, ketmar wrote:
 anyway, fixing long-standing bug with `align()` being ignored 
 on stack variables
 will allow to use SIMD on x86.
Not really. The alignment requirement has to be done by all functions, whether they use SIMD or not.
The Intel performance optimization manual has some nice tricks for mixing code with different stack alignments. You may want to check that out. Sadly, I can't find the article right now, but mostly it boils down to:

- as the stack grows down, you can mask the stack pointer at function entry to get it aligned.
- if both caller and callee need alignment, the callee can call/jump over the masking instructions directly into the meat of the callee.
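A sketch of the first trick in DMD inline assembler (32-bit x86; illustrative only, not the manual's exact sequence):

=== align_entry.d ===
void alignedEntry ()
{
    asm
    {
        naked;
        push EBP;              // save the caller's frame pointer
        mov  EBP, ESP;         // parameters stay addressable via EBP
        and  ESP, 0xFFFFFFF0;  // mask ESP down to a 16-byte boundary
        sub  ESP, 32;          // aligned room for SIMD locals
        // ... body: locals via ESP, parameters via EBP ...
        mov  ESP, EBP;         // undo the masking
        pop  EBP;
        ret;
    }
}

Note that EBP is pinned to the unaligned frame for the whole function, which is exactly the cost Walter describes below.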
Jul 05 2016
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 7/5/2016 6:06 PM, deadalnix wrote:
 On Tuesday, 5 July 2016 at 23:56:48 UTC, Walter Bright wrote:
 On 7/5/2016 2:44 PM, ketmar wrote:
 anyway, fixing long-standing bug with `align()` being ignored on stack
variables
 will allow to use SIMD on x86.
Not really. The alignment requirement has to be done by all functions, whether they use SIMD or not.
The Intel performance optimization manual has some nice tricks for mixing code with different stack alignments. You may want to check that out. Sadly, I can't find the article right now, but mostly it boils down to: as the stack grows down, you can mask the stack pointer at function entry to get it aligned; if both caller and callee need alignment, the callee can call/jump over the masking instructions directly into the meat of the callee.
The trouble with that is you lose the ability to use EBP as a general purpose register, as you'll need EBP to point to the parameters and ESP to point to the locals. It's a complex and disruptive change to the code generator. It's certainly doable, but in an age of priorities I suspect the time is better spent on improving 64 bit code generation.
Jul 05 2016
next sibling parent reply Guillaume Piolat <first.last gmail.com> writes:
On Wednesday, 6 July 2016 at 04:56:07 UTC, Walter Bright wrote:
 It's certainly doable, but in an age of priorities I suspect 
 the time is better spent on improving 64 bit code generation.
It's not like it is a blocker for anyone; there are:
- assembly
- auto-vectorization
- LDC

Not worth your time (and the backend risk!) imho.
Jul 06 2016
parent ketmar <ketmar ketmar.no-ip.org> writes:
$subj.
Jul 06 2016
prev sibling parent deadalnix <deadalnix gmail.com> writes:
On Wednesday, 6 July 2016 at 04:56:07 UTC, Walter Bright wrote:
 It's certainly doable, but in an age of priorities I suspect 
 the time is better spent on
\o/
 improving 64 bit code generation.
/o\
Jul 07 2016
prev sibling parent reply ketmar <ketmar ketmar.no-ip.org> writes:
On Tuesday, 5 July 2016 at 23:56:48 UTC, Walter Bright wrote:
 On 7/5/2016 2:44 PM, ketmar wrote:
 anyway, fixing long-standing bug with `align()` being ignored 
 on stack variables
 will allow to use SIMD on x86.
Not really. The alignment requirement has to be done by all functions, whether they use SIMD or not.
nope. it should be done only for the data participating in SIMD, and this can be perfectly solved by fixing the `align()` issues. some bit operations to adjust ESP won't hurt, if people want SIMD: SIMD will give a much bigger win.
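Until then, the same bit operations can be done by hand in user code: over-allocate a buffer and round the address up (a hedged workaround sketch, names mine):

=== byhand.d ===
void main ()
{
    // room for four floats plus worst-case misalignment:
    ubyte[4 * float.sizeof + 15] raw = void;
    auto p = (cast(size_t) raw.ptr + 15) & ~cast(size_t) 15;
    float* lane = cast(float*) p;  // 16-byte aligned, movaps-safe
    lane[0 .. 4] = 0.0f;
}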
 4. people wanting high performance are going to be using 64 
 bits anyway
so i'm not in a set of "people". ok.
I'm curious about why you require 32 bits.
and i'm curious why everyone is so amazed by 64-bit systems. none of my software is using more than 2GB of RAM. why should i pay for something i don't need? like, all pointers are magically twice bigger. hello, cache lines, i have a present for you!
Jul 05 2016
next sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 7/5/2016 6:30 PM, ketmar wrote:
 I'm curious about why you require 32 bits.
and i'm curious why everyone is so amazed by 64-bit systems. none of my software is using more than 2GB of RAM. why should i pay for something i don't need? like, all pointers are magically twice bigger. hello, cache lines, i have a present for you!
I agree, 64 bits does have its drawbacks. Apple has dropped all 32 bit support, Ubuntu is preparing to. I wonder when Windows will drop it like they did 16 bit support - they'll probably be the last to do so, as they value legacy compatibility more than anyone else. But it's coming. We should be investing in the future with our limited resources.
Jul 05 2016
parent Jacob Carlborg <doob me.com> writes:
On 06/07/16 07:01, Walter Bright wrote:

 Apple has dropped all 32 bit support
No. For ARM, 32-bit is still relevant. On OS X the Simulator (used to test iOS applications) runs the iOS applications as x86 (both 32- and 64-bit) even though the iOS devices are running ARM. Apparently some users are still running 32-bit applications on OS X because they have plugins that are only available as 32-bit; think audio and music software. -- /Jacob Carlborg
Jul 07 2016
prev sibling parent reply qznc <qznc web.de> writes:
On Wednesday, 6 July 2016 at 01:30:46 UTC, ketmar wrote:
 and i'm curious why everyone is so amazed by 64-bit systems. 
 none of my software is using more than 2GB of RAM. why should i 
 pay for something i don't need? like, all pointers are 
 magically twice as big. hello, cache lines, i have a present 
 for you!
The advantage of compiling for AMD64 is that the compiler can assume a lot of extensions, like the SSE bunch. If you want to distribute a binary for x86 you only have the 386 instructions. Ok, 686 is probably common enough today. For more special instructions, you could guard them and provide a fallback.

One example: to convert a floating point value to an integer on a 386, you need to store it to memory and load it again. That makes sense if your floating point stuff is handled by a co-processor, but today this is completely integrated. SSE added an extra instruction to avoid the memory detour.

GCC has a switch (-mx32) to store pointers as 32bit on a 64bit system. That is probably very close to what you want.
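A minimal sketch of the guard-and-fallback idea, using druntime's core.cpuid for the runtime check (the sum functions are invented for illustration, and the SSE2 variant is left as a stub):

    import core.cpuid : sse2;

    int sumScalar(const int[] a) { int s; foreach (x; a) s += x; return s; }
    int sumSse2(const int[] a)   { /* real SSE2 code would go here */ return sumScalar(a); }

    int function(const int[]) sum;

    static this()
    {
        // pick one implementation at startup; callers just use sum(...)
        sum = sse2() ? &sumSse2 : &sumScalar;
    }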
Jul 06 2016
parent ketmar <ketmar ketmar.no-ip.org> writes:
On Wednesday, 6 July 2016 at 10:26:27 UTC, qznc wrote:
 If you want to distribute a binary
gods save me! why should i do that? i am a GPL fanatic. and if i'm doing contract work, i know what machines my contractor will have.
 for x86 you only have the 386 instructions. Ok, 686 is probably 
 common enough today. For more special instructions, you could 
 guard them and provide a fallback.
nope. just write in the readme: "you need at least a Nehalem-grade CPU to run that". maybe check it at startup time and fail if the CPU is too old. that's all, several lines of code. not any different from demanding a 64-bit system. also note that some 64-bit systems can run 32-bit apps, but not vice versa.
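Those "several lines of code" could look roughly like this, using druntime's core.cpuid (SSE4.2 is used as the Nehalem marker; treat the exact feature check as an assumption):

    import core.cpuid : sse42;
    import core.stdc.stdlib : exit;
    import std.stdio : stderr;

    shared static this()
    {
        if (!sse42())  // SSE4.2 first shipped with Nehalem
        {
            stderr.writeln("you need at least a Nehalem-grade CPU to run this.");
            exit(1);
        }
    }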
 GCC has a switch (-mx32) to store pointers as 32bit on a 64bit 
 system. That is probably very close to what you want.
except that i should either build everything with that flag and hope for the best, or pay for the things i don't need.
Jul 06 2016
prev sibling parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= writes:
On Tuesday, 5 July 2016 at 11:27:33 UTC, ZombineDev wrote:
 Have you put any enhancement request on 
 https://issues.dlang.org or written a DIP? If not, I can 
 guarantee with almost 100% certainty that it will not get 
 worked on because no one knows what you need.
SIMD support has been discussed and shot down before. I don't need anything and see no point in a SIMD DIP before getting floats fixed. But it would make the language more attractive.
Jul 05 2016
parent reply ZombineDev <petar.p.kirov gmail.com> writes:
On Tuesday, 5 July 2016 at 12:59:27 UTC, Ola Fosheim Grøstad 
wrote:
 On Tuesday, 5 July 2016 at 11:27:33 UTC, ZombineDev wrote:
 Have you put any enhancement request on 
 https://issues.dlang.org or written a DIP? If not, I can 
 guarantee with almost 100% certainty that it will not get 
 worked on because no one knows what you need.
SIMD support has been discussed and shot down before.
The fact that core.simd exists (regardless of how well it works) contradicts your statement. I still can't see what *you* find missing in the current implementation. Do you have any particular SIMD enhancement request that was declined? The floats problem you talk about does not affect SIMD, so to me it seems that you're just looking for excuses for not working on a solid proposal.
Jul 05 2016
parent Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= writes:
On Tuesday, 5 July 2016 at 14:38:07 UTC, ZombineDev wrote:
 The fact that core.simd exists (regardless of how well it 
 works) contradicts your statement.
Of course not. "core.simd" has been an excuse for not doing better.
 The floats problem you talk about does not affect SIMD, so to
Of course it does. You need to take the same approach to floats on both scalars and vectors. I think you are mixing up SIMD with machine language.
Jul 05 2016
prev sibling parent reply Chris <wendlec tcd.ie> writes:
On Sunday, 3 July 2016 at 04:37:02 UTC, D is crap wrote:
[snip]

I seriously don't know what to make of this post. It's a random 
compilation of complaints made about D over the last couple of 
years that lacks depth and useful information. As such, it could 
be mere trolling.

If the poster on the other hand is sincere, then I suppose s/he's 
a programming novice that is confused and put off by anything 
that doesn't have an IDE to hold his or her hand at every step - 
which is not uncommon. But to get going with D you just need 
`dmd` and a text editor, possibly `dub` for convenience. IMO, D 
doesn't need IDEs as much as other languages (Java needs an IDE 
because of tons of boilerplate, C++ because of C++).

Also, error messages are often the same, so after a while you 
know exactly what's wrong when a certain message appears. This is 
also true of Objective-C, C, Java etc. If compilation fails 
immediately, it's most likely a syntax error and you will see 
something like  `Error: found '{' when expecting ')'` somewhere. 
A bit of patience and common sense can get you a long way.
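
For instance, a slip like the following (an invented snippet, deliberately broken) is enough to produce that class of message:

    void main()
    {
        if (true {  // missing ')': dmd reports the stray '{',
        }           // roughly: Error: found '{' when expecting ')'
    }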

The reason why this post is not taken too seriously is not 
because people here are religious about D - a reproach that is 
often used to prevent any real discussion. No, it is because it 
sounds like an Amazon review where someone ordered a book in a 
foreign language and gives it one star saying "I'm a beginner and 
the vocabulary used is too difficult, this book sucks!" Sorry, 
but that's how it came across to me.
Jul 04 2016
next sibling parent Andrea Fontana <nospam example.com> writes:
On Monday, 4 July 2016 at 09:37:41 UTC, Chris wrote:
 On Sunday, 3 July 2016 at 04:37:02 UTC, D is crap wrote:
 The reason why this post is not taken too seriously is not 
 because people here are religious about D - a reproach that is 
 often used to prevent any real discussion. No, it is because it 
 sounds like an Amazon review where someone ordered a book in a 
 foreign language and gives it one star saying "I'm a beginner 
 and the vocabulary used is too difficult, this book sucks!" 
 Sorry, but that's how it came across to me.
When I read this post I thought the same.
Jul 04 2016
prev sibling parent Bauss <jj_1337 live.dk> writes:
On Monday, 4 July 2016 at 09:37:41 UTC, Chris wrote:
 On Sunday, 3 July 2016 at 04:37:02 UTC, D is crap wrote:
 [snip]

 I seriously don't know what to make of this post. It's a random 
 compilation of complaints made about D over the last couple of 
 years that lacks depth and useful information. As such, it 
 could be mere trolling.

 [...]
Probably the realest post on this forum.
Jul 04 2016