
digitalmars.D - Java > Scala

reply bearophile <bearophileHUGS lycos.com> writes:
A recently written report from a firm that has switched back from Scala to Java:

https://raw.github.com/gist/1406238/72ade1a89004a9a7d705b00cfd14b90b2b6a26bd/gistfile1.txt

Some people say that programmers often show a religious-like attachment to
their preferred languages, but I suspect the truth is often just that new
languages are not good enough for practical work. Even a language like Scala,
which seems very carefully designed by geniuses and integrates easily with
Java, one of the most successful and widely used languages in the world,
risks being a failure for a good number of people. Designing a good enough
new language is hard; maybe 99% of newly designed languages fail, and
creating a language that is also usable in daily work is much harder.

Regarding D2, I think in the last year it has come out of one phase of its
development: I no longer find a new compiler bug every time I write 20 lines
of D2 code. It still happens, but it's now uncommon.

I don't have wide experience designing new languages, so it's not easy for me
to give good suggestions. But for now I suggest keeping some focus on removing
important/basic design bugs/faults of D, like the recent removal of the
covariance-related array problem. Example: D2 foreach is currently broken in
two different ways. On the other hand, there are examples of successful
languages that contain several basic design faults, like PHP and JavaScript.
So I don't know.

----------------

From that text:

5. Avoid closures. [...] At some point, we stopped seeing lambdas as free and
started seeing them as syntactic sugar on top of anonymous classes and thus
acquired the same distaste for them as we did anonymous classes.
D2 closures are probably better; they aren't syntactic sugar on top of anonymous classes. On the other hand, invisible sources of low performance are a bad thing in a language like D2. This is why I have suggested adding a compiler switch that lists all the closures of a module (or other related ideas about no-heap-activity tests or enforcement).

Bye,
bearophile
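For illustration, a minimal sketch of such a hidden allocation (function names invented for the example): returning a delegate that captures a local forces the enclosing stack frame onto the GC heap, while a plain function allocates nothing.

```d
import std.stdio;

// makeAdder returns a delegate that captures `base`. Because the
// delegate can outlive makeAdder's stack frame, the compiler moves
// that frame onto the GC heap: the hidden allocation discussed above.
int delegate(int) makeAdder(int base)
{
    return (int x) => x + base;
}

// A plain function captures nothing, so calling it allocates nothing.
int twice(int x) { return 2 * x; }

void main()
{
    auto add5 = makeAdder(5);
    writeln(add5(10)); // 15
    writeln(twice(7)); // 14
}
```

A switch like the one suggested would flag makeAdder's literal as a closure while leaving twice alone.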
Nov 29 2011
next sibling parent reply "Nick Sabalausky" <a a.a> writes:
"bearophile" <bearophileHUGS lycos.com> wrote in message 
news:jb417r$2dhj$1 digitalmars.com...
A recently written report from a firm that has switched back from Scala to 
Java:

 https://raw.github.com/gist/1406238/72ade1a89004a9a7d705b00cfd14b90b2b6a26bd/gistfile1.txt
Just skimmed through that. Some interesting stuff.

Having never touched Scala (although I've been meaning to at least look into it more...), it sounds like D is way ahead in many ways. Although it's possible that could just be my own bias from being very familiar with and accustomed to D.

I wonder to what extent the inefficiencies he mentioned (such as the lambdas being sugar for anonymous classes) could be due to the JVM itself, or if the reason is primarily something else, such as Scala's internal design or just its implementation. Maybe Scala tries to maximize compatibility with Java, and if so, maybe that's the main underlying cause? Or again, maybe it's just inherent attributes of the JVM itself (although that would run contrary to what I've heard many people claim about the modern JVM)?
Nov 29 2011
parent reply Jacob Carlborg <doob me.com> writes:
On 2011-11-30 03:41, Nick Sabalausky wrote:
 "bearophile"<bearophileHUGS lycos.com>  wrote in message
 news:jb417r$2dhj$1 digitalmars.com...
 A recently written report from a firm that has switched back from Scala to
 Java:

 https://raw.github.com/gist/1406238/72ade1a89004a9a7d705b00cfd14b90b2b6a26bd/gistfile1.txt
Just skimmed through that. Some interesting stuff.

Having never touched Scala (although I've been meaning to at least look into it more...), it sounds like D is way ahead in many ways. Although it's possible that could just be my own bias from being very familiar with and accustomed to D.

I wonder to what extent the inefficiencies he mentioned (such as the lambdas being sugar for anon classes) could be due to the JVM itself. Or if the reason is primarily something else, such as something about Scala's internal design or just its implementation. Maybe Scala tries to maximize compatibility with Java, and if so, maybe that's the main underlying cause? Or again, maybe just inherent attributes of the JVM itself (although that would run contrary to what I've heard many people claim about the modern JVM)?
I think it has something to do with Scala trying to be compatible with Java. -- /Jacob Carlborg
Nov 29 2011
next sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 11/29/2011 11:42 PM, Jacob Carlborg wrote:
 I think it has something to do with Scala trying to be compatible with Java.
It has to run on the JVM, which is a large and heavy rock.
Nov 30 2011
next sibling parent reply Russel Winder <russel russel.org.uk> writes:
Walter,

On Wed, 2011-11-30 at 00:17 -0800, Walter Bright wrote:
 On 11/29/2011 11:42 PM, Jacob Carlborg wrote:
 I think it has something to do with Scala trying to be compatible with Java.

 It has to run on the JVM, which is a large and heavy rock.
I think the only response possible to this is "bollocks". It may be what you believe, but that doesn't make it true as an abstract statement.

--
Russel.

Dr Russel Winder      t: +44 20 7585 2200   voip: sip:russel.winder ekiga.net
41 Buckmaster Road    m: +44 7770 465 077   xmpp: russel russel.org.uk
London SW11 1EN, UK   w: www.russel.org.uk  skype: russel_winder
Nov 30 2011
next sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 11/30/2011 12:29 AM, Russel Winder wrote:
 Walter,

 On Wed, 2011-11-30 at 00:17 -0800, Walter Bright wrote:
 On 11/29/2011 11:42 PM, Jacob Carlborg wrote:
 I think it has something to do with Scala trying to be compatible with Java.
It has to run on the JVM, which is a large and heavy rock.
I think the only response possible to this is "bollocks". It may be what you believe, but that doesn't make it true as an abstract statement.
I used to be intimately familiar with the JVM; I even wrote a gc for it. The bytecode ops in it are designed for Java, nothing more. Worse, it's a primitive stack machine. To generate even passably good native code, the JVM has to do a lot of reverse engineering of the bytecode.

For example, you cannot pass by value anything other than the primitive Java data types. There are no pointers. Want an unsigned int? Forget it. Arrays of anything but class references? Nyuk nyuk nyuk. Etc.
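Concretely, in D both of these exist natively, while the JVM must emulate them (a small sketch; the struct and values are illustrative only):

```d
import std.stdio;

// A struct is a value type: an array of Points is one contiguous block
// of doubles, not an array of references to boxed heap objects.
struct Point { double x, y; }

void main()
{
    uint u = uint.max;   // a native unsigned int; Java has no equivalent
    writeln(u);          // 4294967295
    assert(u + 1 == 0);  // well-defined unsigned wraparound

    Point[3] pts;        // fixed-size array of values, no indirection
    pts[1].x = 2.5;
    writeln(pts.sizeof); // 48 (3 points x 2 doubles x 8 bytes)
}
```

On the JVM an unsigned int has to be faked with a signed int plus masking, and an array of points becomes an array of object references, each element a separate heap allocation.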
Nov 30 2011
next sibling parent reply Paulo Pinto <pjmlp progtools.org> writes:
Are you not being a bit simplistic here?

There are several JVM implementations around not just one.

Plus, if I understand correctly some complaints from people using D in real
projects, in many cases JVM JITs are able to generate better code than
D. At least for the time being.


 I used to be intimately familiar with the JVM, I even wrote a gc for it.
 The bytecode ops in it are designed for Java, nothing more. Worse, it's
 a primitive stack machine. To generate even passably good native code,
 the JVM has to do a lot of reverse engineering of the bytecode.

 For example, you cannot pass by value anything other than the primitive
 Java data types. There are no pointers. Want an unsigned int? Forget it.
 Arrays of anything but class references? Nyuk nyuk nyuk. Etc.
Nov 30 2011
next sibling parent reply Timon Gehr <timon.gehr gmx.ch> writes:
On 11/30/2011 09:56 PM, Paulo Pinto wrote:
 Are you not being a bit simplistic here?

 There are several JVM implementations around not just one.
Where did he talk about implementations? He only described the _design_ of the JVM.
 Plus if I understand correctly some complains of people using D in real
 projects, in many cases JVM JITs are able to generate better code than
 D. At least for the time being.
Nope. (Even when interpreting 'D' as 'DMD'). Except when the D code is written badly.
 I used to be intimately familiar with the JVM, I even wrote a gc for it.
 The bytecode ops in it are designed for Java, nothing more. Worse, it's
 a primitive stack machine. To generate even passably good native code,
 the JVM has to do a lot of reverse engineering of the bytecode.

 For example, you cannot pass by value anything other than the primitive
 Java data types. There are no pointers. Want an unsigned int? Forget it.
 Arrays of anything but class references? Nyuk nyuk nyuk. Etc.
Nov 30 2011
parent reply Jude <10equals2 gmail.com> writes:
On 11/30/2011 03:33 PM, Timon Gehr wrote:
 On 11/30/2011 09:56 PM, Paulo Pinto wrote:
 Are you not being a bit simplistic here?
 
 There are several JVM implementations around not just one.
 
Where did he talk about implementations? He only described the _design_ of the JVM.
 Plus if I understand correctly some complains of people using D
 in real projects, in many cases JVM JITs are able to generate
 better code than D. At least for the time being.
 
Nope. (Even when interpreting 'D' as 'DMD'). Except when the D code is written badly.
There was recently a test case where D outperformed C++. I would be VERY surprised if a JVM JIT could outperform D, excepting the occasional corner case of course. I'd love to see any tests that prove that a JIT could generally generate better code.
 
 I used to be intimately familiar with the JVM, I even wrote a
 gc for it. The bytecode ops in it are designed for Java,
 nothing more. Worse, it's a primitive stack machine. To
 generate even passably good native code, the JVM has to do a
 lot of reverse engineering of the bytecode.
 
 For example, you cannot pass by value anything other than the
 primitive Java data types. There are no pointers. Want an
 unsigned int? Forget it. Arrays of anything but class
 references? Nyuk nyuk nyuk. Etc.
 
Nov 30 2011
parent reply bearophile <bearophileHUGS lycos.com> writes:
Jude:

 I would be VERY surprised if a JVM JIT could outperform D, excepting
 the occasional corner case of course.
Be prepared to be surprised again and again... :-)

Bye,
bearophile
Nov 30 2011
parent reply Jude <10equals2 gmail.com> writes:
On 11/30/2011 06:06 PM, bearophile wrote:
 Jude:
 
 I would be VERY surprised if a JVM JIT could outperform D,
 excepting the occasional corner case of course.
Be prepared to be surprised again and again... :-) Bye, bearophile
Oh, I'm constantly surprised. I don't have a great programming background; I haven't been at it for nearly as long as most here, I don't believe.

Now I'm interested in trying this out. Got any ideas for code that is currently way less than optimal in D? You don't have to give me anything, but an idea about where to start would be nice.

thanks Mr. Bear!
Nov 30 2011
parent reply bearophile <bearophileHUGS lycos.com> writes:
Jude:

 Got any ideas for code that is currently way less than optimal in D?
Compared to Java running on the OracleVM:

- D is usually slower for heavily garbage-collected code, and often for floating-point-heavy code.
- Exceptions (and synchronized methods) are faster than the D-DMD ones.
- Sometimes textual I/O is faster in Java compared to D-DMD-Phobos.
- Often the JavaVM is able to de-virtualize and inline virtual calls, while D-DMD is not able to do this (maybe LDC2-LLVM3 will be able to do this a bit), making such code faster.
- The JavaVM is usually able to dynamically unroll loops whose loop count is known only at runtime; this sometimes speeds up loops a lot compared to D-DMD.
- Some Java libraries implement important data structures or other things that are currently sometimes significantly faster than the equivalent D ones.

This is probably an incomplete list.

Bye,
bearophile
Nov 30 2011
parent reply "David Eagen" <spam_me_here mailinator.com> writes:

Recently I needed to analyze some mail logs. I needed to find the hosts that 
were sending mail and how many lines in the log existed for each host. 
Thinking this would be perfect for natively compiled code in D, I first wrote 
a D app. I then wrote it in perl and was amazed at how much faster perl was. 
I expanded out to Java and Scala. For all four I used the same source files, 
and the same output was created.

The four source files totalled 430MB together with 1.69 million lines. In 
all four implementations the file was read a line at a time and the same 
regex was applied to extract the desired data. Output was 6,857 bytes.

Here are the run results on a 32-bit linux 3.0.0 system. The absolute 
numbers are not important since I ran this on a very old system. It is the 
relative numbers that matter here.

Java (JVM 7 update 1)
real    0m56.465s
user    0m51.911s
sys     0m3.344s

Perl (5.12.4)
real    1m22.256s
user    1m19.773s
sys     0m2.212s

Scala (2.9.1 on JVM 7 update 1)
real    1m41.187s
user    1m36.566s
sys     0m3.892s

D (2.0.56 compiled with -O -release -inline -noboundscheck)
real    4m21.255s
user    4m14.216s
sys     0m5.940s

Java is the fastest, even faster than perl. The D version, which is the only 
natively compiled version, is over 4.6 times slower than the Java version, 
even when including overhead like the JVM startup time.

The source for each of the four implementations is attached. I admit to 
being very new to D, so perhaps I'm really doing something wrong.

"bearophile" <bearophileHUGS lycos.com> wrote in message 
news:jb6j8h$12js$1 digitalmars.com...
 Jude:

 Got any ideas for code that is currently way less than optimal in D?
[snip]
Nov 30 2011
next sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
The difference likely has little to do with native vs JVM, but more likely is 
heavily dependent on how well regex is implemented.
Nov 30 2011
next sibling parent Russel Winder <russel russel.org.uk> writes:
On Wed, 2011-11-30 at 21:52 -0800, Walter Bright wrote:
 The difference likely has little to do with native vs JVM, but more likely is heavily dependent on how well regex is implemented.
It might be interesting to expose this regex implementation aspect as a benchmark in itself. And add Go to the mix: the Go folk generally deride everyone else's regex implementations, boasting that theirs is the "one true way".

--
Russel.
Dec 01 2011
prev sibling parent reply Jonathan M Davis <jmdavisProg gmx.com> writes:
On Wednesday, November 30, 2011 21:52:38 Walter Bright wrote:
 The difference likely has little to do with native vs JVM, but more likely
 is heavily dependent on how well regex is implemented.
Which begs the question as to how well D would do with the new std.regex. - Jonathan M Davis
Dec 01 2011
next sibling parent Walter Bright <newshound2 digitalmars.com> writes:
On 12/1/2011 1:58 AM, Jonathan M Davis wrote:
 On Wednesday, November 30, 2011 21:52:38 Walter Bright wrote:
 The difference likely has little to do with native vs JVM, but more likely
 is heavily dependent on how well regex is implemented.
Which begs the question as to how well D would do with the new std.regex.
Sure. Wanna give it a go? (The older regex is an old-fashioned bytecode engine I wrote the core of 10+ years ago in a weekend. It's not a bit surprising that the Java/Perl ones, where resources have been poured into them, would run faster.)
Dec 01 2011
prev sibling parent reply Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
On 12/1/11 1:58 AM, Jonathan M Davis wrote:
 On Wednesday, November 30, 2011 21:52:38 Walter Bright wrote:
 The difference likely has little to do with native vs JVM, but more likely
 is heavily dependent on how well regex is implemented.
Which begs the question as to how well D would do with the new std.regex.
My understanding is that his code already uses the new regex. Andrei
Dec 01 2011
parent reply "Jonathan M Davis" <jmdavisProg gmx.com> writes:
On Thursday, December 01, 2011 09:07:30 Andrei Alexandrescu wrote:
 On 12/1/11 1:58 AM, Jonathan M Davis wrote:
 On Wednesday, November 30, 2011 21:52:38 Walter Bright wrote:
 The difference likely has little to do with native vs JVM, but more
 likely is heavily dependent on how well regex is implemented.
Which begs the question as to how well D would do with the new std.regex.
My understanding is that his code already uses the new regex.
He said that he was using 2.056, and I thought that the new std.regex wasn't in that release. I could be wrong though. - Jonathan M Davis
Dec 01 2011
parent Somedude <lovelydear mailmetrash.com> writes:
On 01/12/2011 19:28, Jonathan M Davis wrote:
 On Thursday, December 01, 2011 09:07:30 Andrei Alexandrescu wrote:
 On 12/1/11 1:58 AM, Jonathan M Davis wrote:
 On Wednesday, November 30, 2011 21:52:38 Walter Bright wrote:
 The difference likely has little to do with native vs JVM, but more
 likely is heavily dependent on how well regex is implemented.
Which begs the question as to how well D would do with the new std.regex.
My understanding is that his code already uses the new regex.
He said that we was using 2.056, and I thought that the new std.regex wasn't in that release. I could be wrong though. - Jonathan M Davis
I believe you're correct.
Dec 02 2011
prev sibling next sibling parent reply bearophile <bearophileHUGS lycos.com> writes:
David Eagen:

 I admit to being very new to D so perhaps I'm really doing something wrong.
This program uses string hashing, regular expressions, and by-line file iteration. Those are among the most common operations done by script-like programs.

Some suggestions for your D code:
- Use the new regex engine.
- Use File.byLine instead of BufferedFile.
- Don't use the built-in sort; use std.algorithm.sort.
- Try to minimize the number of memory allocations.
- Try to disable the GC if and when you think it's a good thing to do.
- Avoid casts where possible.
- writef("%s,%d\n" ==> writefln("%s,%d"

For Phobos devs: is the new regex engine able to support verbose regexes (meaning regexes that allow newlines, whitespace, and even comments)? In practice I only use them in Python.

Bye,
bearophile
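For illustration, a sketch of the log-counting loop with those suggestions applied (File.byLine, the new std.regex, std.algorithm.sort). The pattern and file name are stand-ins, not David's actual ones, and modern std.regex names like matchFirst are assumed:

```d
import std.algorithm : sort;
import std.regex;
import std.stdio;

void main()
{
    size_t[string] counts;
    auto re = regex(r"relay=([\w\-.]+)");    // stand-in pattern
    foreach (line; File("syslog").byLine())  // lazy, reuses one buffer per line
        if (auto m = matchFirst(line, re))
            counts[m[1].idup]++;             // idup: byLine recycles its buffer
    foreach (host; counts.keys.sort())       // std.algorithm.sort, not the built-in
        writefln("%s,%s", host, counts[host]);
}
```

The .idup is the one allocation per distinct host; everything else reuses byLine's internal buffer, which is where most of the savings over the BufferedFile version come from.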
Dec 01 2011
parent reply "Marco Leise" <Marco.Leise gmx.de> writes:
On 01.12.2011 14:29, bearophile <bearophileHUGS lycos.com> wrote:

 David Eagen:

 I admit to being very new to D so perhaps I'm really doing something  
 wrong.
[snip]
Ok, so this is how you can optimize the program. In general we have to keep in mind that D is supposed to be convenient for scripting tasks, and you don't usually optimize there, nor should in-depth knowledge of different sort algorithms, the GC, or text I/O functions be required. This is something for an online how-to-optimize :)

Btw, write(relay, to!string(relayHosts[relay]), '\n'); is probably the "fastest" :p
Dec 01 2011
parent reply Jesse Phillips <jessekphillips+D gmail.com> writes:
On Thu, 01 Dec 2011 19:54:48 +0100
"Marco Leise" <Marco.Leise gmx.de> wrote:

 Ok, so this is how you can optimize the program. 
Actually these suggestions weren't about optimization. Bearophile is big on consistency and style. For example, you don't use the built-in sort because it should not exist, and std.algorithm.sort returns a SortedRange, which other functions (e.g. std.algorithm.find) can then use to pick a faster algorithm (binary search).

So basically, do it the D way; then it will be easier to look at improving performance. One statement I liked from, I believe, Andrei was "the easy way should be the right way, and the [wrong way] should be possible" (not an exact quote, and "wrong way" was actually something else).
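In miniature: std.algorithm.sort's return type carries the "is sorted" guarantee, so later lookups can binary-search instead of scanning. A small sketch (the array contents are arbitrary):

```d
import std.algorithm : sort;
import std.stdio;

void main()
{
    int[] a = [5, 2, 9, 1, 7];
    auto s = sort(a);       // returns a SortedRange, not a plain array
    writeln(s.contains(7)); // true: contains() does a binary search
    writeln(s.contains(4)); // false
    writeln(a);             // [1, 2, 5, 7, 9]: sorted in place
}
```

On a plain array the same lookup would have to be a linear find; the type returned by sort is what lets the library choose the faster algorithm safely.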
Dec 01 2011
parent "Marco Leise" <Marco.Leise gmx.de> writes:
On 01.12.2011 21:29, Jesse Phillips <jessekphillips+D gmail.com> wrote:

 On Thu, 01 Dec 2011 19:54:48 +0100
 "Marco Leise" <Marco.Leise gmx.de> wrote:

 Ok, so this is how you can optimize the program.
Actually these suggestions weren't about optimization. Bearophile is big on consistency and style. For example you don't use the built in sort because it should not exist, and std.algorithm.sort returns a SortedRange which will then be usable by other functions (std.algorithm.find) to use a faster algorithm (bubble search). So basically, do it the D way, then it will be easier to look at improving performance. One statement I liked from, I believe, Andrei was "the easy way should be the right way, and the [wrong way] should be possible" not an exact quote and "wrong way" was actually something else.
Stopping the GC is hardly just doing it the D way, but I get your point.
Dec 01 2011
prev sibling parent reply Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
On 11/30/11 9:47 PM, David Eagen wrote:
 Recently I needed to analyze some mail logs. I needed to find the hosts
 that were sending mail and how many lines in the log existed for each
 host. Thinking this would be perfect for natively compiled code in D I
 first wrote a D app. I then wrote it in perl and was amazed at how much
 faster perl was. I expanded out to Java and Scala. For all four I used
 the same source files and the same output was created.
[snip] This is a good benchmark for I/O and a practical regex. David, could you please send (privately if you want) the file or some statistics about it (bytes, lines, a representative sample)? Thanks! Andrei
Dec 01 2011
next sibling parent reply Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
On 12/1/11 9:03 AM, Andrei Alexandrescu wrote:
 On 11/30/11 9:47 PM, David Eagen wrote:
 Recently I needed to analyze some mail logs. I needed to find the hosts
 that were sending mail and how many lines in the log existed for each
 host. Thinking this would be perfect for natively compiled code in D I
 first wrote a D app. I then wrote it in perl and was amazed at how much
 faster perl was. I expanded out to Java and Scala. For all four I used
 the same source files and the same output was created.
[snip] This is a good benchmark for I/O and a practical regex. David, could you please send (privately if you want) the file or some statistics about it (bytes, lines, a representative sample)? Thanks! Andrei
One more thing before I forget - you may want to use byLine() for input. In case the issue turns out to be related to I/O, it's much better we improve byLine() instead of the streams library. Thanks, Andrei
Dec 01 2011
next sibling parent reply Andrej Mitrovic <andrej.mitrovich gmail.com> writes:
On 12/1/11, Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> wrote:
 One more thing before I forget - you may want to use byLine() for input.
Boo byLine! http://d.puremagic.com/issues/show_bug.cgi?id=7022
Dec 01 2011
parent Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
On 12/1/11 10:51 AM, Andrej Mitrovic wrote:
 On 12/1/11, Andrei Alexandrescu<SeeWebsiteForEmail erdani.org>  wrote:
 One more thing before I forget - you may want to use byLine() for input.
Boo byLine! http://d.puremagic.com/issues/show_bug.cgi?id=7022
The implementation has been historically warped to account for bugs in postblit. Should be easy to fix. Andrei
Dec 01 2011
prev sibling parent reply "David Eagen" <spam_me_here mailinator.com> writes:

"Andrei Alexandrescu" <SeeWebsiteForEmail erdani.org> wrote in message 
news:jb8hvh$2sdl$1 digitalmars.com...
 This is a good benchmark for I/O and a practical regex. David, could you
 please send (privately if you want) the file or some statistics about it
 (bytes, lines, a representative sample)? Thanks!
One more thing before I forget - you may want to use byLine() for input. In case the issue turns out to be related to I/O, it's much better we improve byLine() instead of the streams library.
I implemented the various suggestions (File.byLine, writeln instead of writefln, std.algorithm.sort), except using FReD. FReD wouldn't compile on the linux box I am using; the error was:

/phobos/std/file.d(537): Error: undefined identifier package c.stdio

Previous timing:
real 4m21.255s
user 4m14.216s
sys 0m5.940s

New timing after the changes:
real 2m15.840s
user 2m12.700s
sys 0m2.760s

So it's nearly twice as fast but still the slowest of the four. I was able to compile with FReD on a 32-bit Windows system, and it performed 15% faster than std.regex processing these same test files. I would love to try the precompiled regex code for FReD, but the compile throws an out-of-memory error when I try it.

The source files are /var/log/syslog files from sendmail on a Solaris 10 box. I can't make them available because they are mail logs from our company, but here are the sizes and line counts along with example entries.

$ wc -l syslog syslog.0 syslog.1 syslog.2
280618 syslog
331609 syslog.0
535035 syslog.1
543241 syslog.2
1690503 total

-rw-r--r-- 1 david david 86244537 2011-11-30 21:26 syslog.0
-rw-r--r-- 1 david david 146156778 2011-11-30 21:26 syslog.1
-rw-r--r-- 1 david david 143481904 2011-11-30 21:26 syslog.2
-rw-r--r-- 1 david david 73030898 2011-11-30 21:26 syslog

The entries look like this:

Oct 27 03:10:01 thehost sendmail[3248]: [ID 801593 mail.info] p9R8A0MJ003245: to=user somewhere.com, delay=00:00:01, xdelay=00:00:01, mailer=esmtp, pri=120773, relay=some.host.com. [5.6.7.8], dsn=2.0.0, stat=Sent (ok 1319703001 qp 25319 the.mail.host.com!1319703000!80184558!1)

Oct 27 03:10:04 thehost sendmail[3289]: [ID 801593 mail.info] p9R8A3Nr003289: from=sender senderbox, size=765, class=0, nrcpts=1, msgid=<201110270810.p9R8A3QA021419 senderbox>, proto=ESMTP, daemon=MTA, relay=senderbox.foo.com [1.2.3.4]

-Dave
Dec 01 2011
next sibling parent reply "Marco Leise" <Marco.Leise gmx.de> writes:
On 02.12.2011 06:17, David Eagen <spam_me_here mailinator.com> wrote:

I did a test run with 430 MB of the two sample lines you gave in the last  
post and got even worse results, where Java is 6.8x faster than D.

Code: your original program
DMD : 2.054 64-bit
Java: IcedTea JDK (6)

A profile run without inlining using OProfile revealed statistics that  
come down to this (roughly):

1/10 of the time is spent reading the file by line
1/3 is spent appending to arrays
1/2 is the regex
Dec 02 2011
parent reply "Marco Leise" <Marco.Leise gmx.de> writes:
The import problem in std.file has been fixed on GitHub, but I couldn't  
get FReD to compile this regex:

enum regex = ctRegex!r"relay=([\w\-\.]+[\w]+)[\.\,]*\s";

Instead I'm using this one:

enum regex = ctRegex!r"relay=([A-Za-z0-9_\-.]+[A-Za-z0-9_]+)[.,]*\s";

Both \. and \w inside seem to cause problems. \- was also troublesome, but  
it was easy to add a case in the parser by looking at how \r is handled.

Then I started optimizing with these steps:

1. Run a 64-bit build instead of a 32-bit build :D
    30.2 s => 14.4 s

2. use "auto regex = ctRegex!..." instead of "enum regex = ctRegex!..."
    14.4 s => 6.4 s

For comparison: the Java version takes 5.3 s here.

That left me with the following profile chart of function calls taking > 1%  
of the time. The percentages don't accumulate subroutine calls, so main()  
is fairly low in the list:

samples        %  source                      function
6934     16.7800  uni.d:601                   const( trusted bool  
function(dchar)) std.internal.uni.CodepointTrie!(8).CodepointTrie.opIndex
4235     10.2485  (no location information)   pure  safe dchar  
std.utf.decode(const(char[]), ref ulong)
3807      9.2128  regex.d:6395                 trusted bool  
std.regex.ctRegexImpl!("relay=([A-Za-z0-9_\-.]+[A-Za-z0-9_]+)[.,]*\s",  
[]).func(ref  
std.regex.BacktrackingMatcher!(true).BacktrackingMatcher!(char).BacktrackingMatcher)
2240      5.4207  regex.d:3232                 property  trusted bool  
std.regex.BacktrackingMatcher!(true).BacktrackingMatcher!(char).BacktrackingMatcher.atEnd()
2151      5.2053  regex.d:2932                 safe bool  
std.regex.Input!(char).Input.nextChar(ref dchar, ref ulong)
1812      4.3850  exception.d:486             pure  safe bool  
std.exception.enforceEx!(std.utf.UTFException, bool).enforceEx(bool, lazy  
immutable(char)[], immutable(char)[], ulong)
1686      4.0801  regex.d:6490                 trusted bool  
std.regex.ctRegexImpl!("relay=([A-Za-z0-9_\-.]+[A-Za-z0-9_]+)[.,]*\s",  
[]).func(ref  
std.regex.BacktrackingMatcher!(true).BacktrackingMatcher!(char).Back
rackingMatcher).int  
test_11()
1409      3.4097  regex.d:6450                 safe  
std.regex.__T10RegexMatchTAaS613std5regex28__T19BacktrackingMatcherVb1Z19Backtrackin
MatcherZ.RegexMatch  
std.regex.match!(char[],  
std.regex.StaticRegex!(char).StaticRegex).match(char[],  
std.regex.StaticRegex!(char).StaticRegex)
1335      3.2306  regex.d:6272                 trusted  
std.regex.__T10RegexMatchTAaS613std5regex28__T19BacktrackingMatcherVb1Z19BacktrackingMatcherZ.RegexMatch  
std.regex.__T10RegexMatchTAaS613std5regex28__T19BacktrackingMatcherVb1Z19BacktrackingMatcherZ.RegexMatch.__ctor!(std.regex.StaticRegex!(char).StaticRegex).__ctor(std.regex.StaticRegex!(char).StaticRegex,  
char[])
1224      2.9620  regex.d:3234                 trusted void  
std.regex.BacktrackingMatcher!(true).BacktrackingMatcher!(char).BacktrackingMatcher.next()
1212      2.9330  regex.d:2951                 property  safe ulong  
std.regex.Input!(char).Input.lastIndex()
1202      2.9088  regex.d:2744                 trusted ulong  
std.regex.ShiftOr!(char).ShiftOr.search(const(char)[], ulong)
1051      2.5434  regex.d:3717                 trusted void  
std.regex.BacktrackingMatcher!(true).BacktrackingMatcher!(char).BacktrackingMatcher.stackPush!(int).stackPush(int)
973       2.3546  regex.d:3717                 trusted void  
std.regex.BacktrackingMatcher!(true).BacktrackingMatcher!(char).BacktrackingMatcher.stackPush!(ulong).stackPush(ulong)
884       2.1392  main.d:22                   _Dmain
618       1.4955  regex.d:3726                 trusted void  
std.regex.BacktrackingMatcher!(true).BacktrackingMatcher!(char).BacktrackingMatcher.stackPush!(std.regex.Group!(ulong).Group).stackPush(std.regex.Group!(ulong).Group[])
466       1.1277  (no location information)   _d_arraysetlengthiT

These functions sum up to ~80%. And if the profile is correct, the garbage  
collector functions each rank low in the table. At this point I'd  
probably recommend an ASCII regex, but I'd like to know how Java can still  
be substantially faster with library routines. :)

- Marco
Dec 02 2011
next sibling parent Kagamin <spam here.lot> writes:
Marco Leise Wrote:

 These functions sum up to ~80%. And if it is correct, the garbage  
 collector functions each take a low place in the table. At this point I'd  
 probably recommend an ASCII regex, but I'd like to know how Java can still  
 be substantially faster with library routines. :)
Java probably uses UTF-16: strings are already in UTF-16 as they're read from the file, which means the text is transcoded eagerly in chunks, not lazily char by char. The lazy decoding may cause icache misses or branch mispredictions, dunno.
Dec 02 2011
prev sibling parent reply Dmitry Olshansky <dmitry.olsh gmail.com> writes:
On 02.12.2011 15:32, Marco Leise wrote:
 The import problem in std.file has been fixed on GitHub, but I couldn't
 get FReD to compile this regex:

 enum regex = ctRegex!r"relay=([\w\-\.]+[\w]+)[\.\,]*\s";

 Instead I'm using this one:

 enum regex = ctRegex!r"relay=([A-Za-z0-9_\-.]+[A-Za-z0-9_]+)[.,]*\s";

 Both \. and \w inside seem to cause problems. \- was also troublesome,
 but easy to add a case in the parser looking at how \r is handled.
First of all, sorry for some messy problems with escapes in character classes. If we all agree to just treat anything non-special after \ as is, then I'll add it. Second, I might take a shot at optimizing the engine, once the OS X problem is figured out.
 Then I started optimizing with these steps:

 1. Run a 64-bit build instead of a 32-bit build :D
 30.2 s => 14.4 s

 2. use "auto regex = ctRegex!..." instead of "enum regex = ctRegex!..."
 14.4 s => 6.4 s
Well, another thing to try is gdc/ldc. Last time I succeeded in this endeavor, -O3 yielded a small boost of ~5%.
 For comparison: the Java version takes 5.3 s here.
Don't kill me ;) Seriously... they must be doing no decoding of UTF. Another option is Boyer-Moore on "relay=". It would be interesting to search for something a little bit more fussy, e.g. "r[eE]lay=" or something like that, just to see if it has any effect.
 That left me with the following profile chart of function calls > %1
 time. The percentages don't accumulate subroutine calls. So main() is
 fairly low in the list:
From this short list I'd say that opIndex could be sped up a bit. But nothing else catches my eye. Except for that 4% enforceEx on the UTF exception.
 samples % source function
 6934 16.7800 uni.d:601 const( trusted bool function(dchar))
 std.internal.uni.CodepointTrie!(8).CodepointTrie.opIndex
 4235 10.2485 (no location information) pure  safe dchar
 std.utf.decode(const(char[]), ref ulong)
 3807 9.2128 regex.d:6395  trusted bool
 std.regex.ctRegexImpl!("relay=([A-Za-z0-9_\-.]+[A-Za-z0-9_]+)[.,]*\s",
 []).func(ref
 std.regex.BacktrackingMatcher!(true).BacktrackingMatcher!(char).BacktrackingMatcher)

 2240 5.4207 regex.d:3232  property  trusted bool
 std.regex.BacktrackingMatcher!(true).BacktrackingMatcher!(char).BacktrackingMatcher.atEnd()

 2151 5.2053 regex.d:2932  safe bool
 std.regex.Input!(char).Input.nextChar(ref dchar, ref ulong)
 1812 4.3850 exception.d:486 pure  safe bool
 std.exception.enforceEx!(std.utf.UTFException, bool).enforceEx(bool,
 lazy immutable(char)[], immutable(char)[], ulong)
 1686 4.0801 regex.d:6490  trusted bool
 std.regex.ctRegexImpl!("relay=([A-Za-z0-9_\-.]+[A-Za-z0-9_]+)[.,]*\s",
 []).func(ref
 std.regex.BacktrackingMatcher!(true).BacktrackingMatcher!(char).BacktrackingMatcher).int
 test_11()
 1409 3.4097 regex.d:6450  safe
 std.regex.__T10RegexMatchTAaS613std5regex28__T19BacktrackingMatcherVb1Z19BacktrackingMatcherZ.RegexMatch
 std.regex.match!(char[],
 std.regex.StaticRegex!(char).StaticRegex).match(char[],
 std.regex.StaticRegex!(char).StaticRegex)
 1335 3.2306 regex.d:6272  trusted
 std.regex.__T10RegexMatchTAaS613std5regex28__T19BacktrackingMatcherVb1Z19BacktrackingMatcherZ.RegexMatch
 std.regex.__T10RegexMatchTAaS613std5regex28__T19BacktrackingMatcherVb1Z19BacktrackingMatcherZ.RegexMatch.__ctor!(std.regex.StaticRegex!(char).StaticRegex).__ctor(std.regex.StaticRegex!(char).StaticRegex,
 char[])
 1224 2.9620 regex.d:3234  trusted void
 std.regex.BacktrackingMatcher!(true).BacktrackingMatcher!(char).BacktrackingMatcher.next()

 1212 2.9330 regex.d:2951  property  safe ulong
 std.regex.Input!(char).Input.lastIndex()
 1202 2.9088 regex.d:2744  trusted ulong
 std.regex.ShiftOr!(char).ShiftOr.search(const(char)[], ulong)
 1051 2.5434 regex.d:3717  trusted void
 std.regex.BacktrackingMatcher!(true).BacktrackingMatcher!(char).BacktrackingMatcher.stackPush!(int).stackPush(int)

 973 2.3546 regex.d:3717  trusted void
 std.regex.BacktrackingMatcher!(true).BacktrackingMatcher!(char).BacktrackingMatcher.stackPush!(ulong).stackPush(ulong)

 884 2.1392 main.d:22 _Dmain
 618 1.4955 regex.d:3726  trusted void
 std.regex.BacktrackingMatcher!(true).BacktrackingMatcher!(char).BacktrackingMatcher.stackPush!(std.regex.Group!(ulong).Group).stackPush(std.regex.Group!(ulong).Group[])

 466 1.1277 (no location information) _d_arraysetlengthiT

 These functions sum up to ~80%. And if it is correct, the garbage
 collector functions each take a low place in the table. At this point
 I'd probably recommend an ASCII regex, but I'd like to know how Java can
 still be substantially faster with library routines. :)
Dec 02 2011
parent reply "Marco Leise" <Marco.Leise gmx.de> writes:
Cool, thx for your answers. The source code for OpenJDK can be downloaded  
if you want to take a look at it. You are probably right about them not  
decoding the characters lazily since their strings are UTF-16.
The commented version of opIndex is a bit faster on my Core 2. This is the  
first time that I witnessed such speed differences between processors. :)
Also I found that the trie is usually queried twice for each matching  
character in the input string. You can't optimize opIndex any further (but  
try size_t in there instead of uint, it helped here) unless you make some  
changes on the larger scale. So if you should find out that the second  
query isn't required, that would help more than anything else.
I said it on IRC today: This library will be my reference for compile time  
code generation in D. There is a lot of expertise in it, good work!

P.S.: I'm fine with treating anything that is escaped, but not special, as  
is. \w did cause an infinite loop though, so you may want to test with the  
original regex. For \. you can assert(false, "\. is not a valid escape  
sequence") or just ignore the backslash. Personally I usually don't escape  
anything just to be on the safe side. :p
Dec 02 2011
parent reply Dmitry Olshansky <dmitry.olsh gmail.com> writes:
On 03.12.2011 1:08, Marco Leise wrote:
 Cool, thx for your answers. The source code for OpenJDK can be
 downloaded if you want to take a look at it. You are probably right
 about them not decoding the characters lazily since their strings are
 UTF-16.
 The commented version of opIndex is a bit faster on my Core 2. This is
 the first time that I witnessed such speed differences between
 processors. :)
Wow. I knew something was wrong with non-BT test code, from what I heard it should have been faster but it wasn't for me :)
 Also I found that the trie is usually queried twice for each matching
 character in the input string. You can't optimize opIndex any further
 (but try size_t in there instead of uint, it helped here) unless you
 make some changes on the larger scale. So if you should find out that
 the second query isn't required, that would help more than anything else.
 I said it on IRC today: This library will be my reference for compile
 time code generation in D. There is a lot of expertise in it, good work!
There I have two options to work through:
- separate negative and positive character classes; it would kill possible branching here.
- and now, looking at test_11 in your profile output, I see the likely culprit: I should re-think lookahead tests; they used to reduce the number of savepoints during matching.
 P.S.: I'm fine with treating anything that is escaped, but not special,
 as is. \w did cause an infinite loop though, so you may want to test
Hm can't reproduce.
 with the original regex. For \. you can assert(false, "\. is not a valid
 escape sequence")
No, that was a bad idea ... and I planned to change that exception. Now I'm more into ignoring the backslash.
 or just ignore the backslash. Personally I usually
 don't escape anything just to be on the safe side. :p
Worthy of a small community poll.
Dec 02 2011
parent reply "Jonathan M Davis" <jmdavisProg gmx.com> writes:
On Saturday, December 03, 2011 01:44:33 Dmitry Olshansky wrote:
 On 03.12.2011 1:08, Marco Leise wrote:
 Cool, thx for your answers. The source code for OpenJDK can be
 downloaded if you want to take a look at it. You are probably right
 about them not decoding the characters lazily since their strings are
 UTF-16.
 The commented version of opIndex is a bit faster on my Core 2. This is
 the first time that I witnessed such speed differences between
 processors. :)
Wow. I knew something was wrong with non-BT test code, from what I heard it should have been faster but it wasn't for me :)
 Also I found that the trie is usually queried twice for each matching
 character in the input string. You can't optimize opIndex any further
 (but try size_t in there instead of uint, it helped here) unless you
 make some changes on the larger scale. So if you should find out that
 the second query isn't required, that would help more than anything
 else.
 I said it on IRC today: This library will be my reference for compile
 time code generation in D. There is a lot of expertise in it, good work!
There I have two options to work through:
- separate negative and positive character classes; it would kill possible branching here.
- and now, looking at test_11 in your profile output, I see the likely culprit: I should re-think lookahead tests; they used to reduce the number of savepoints during matching.
 P.S.: I'm fine with treating anything that is escaped, but not special,
 as is. \w did cause an infinite loop though, so you may want to test
Hm can't reproduce.
 with the original regex. For \. you can assert(false, "\. is not a valid
 escape sequence")
No, that was a bad idea ... and I planned to change that exception. Now I'm more into ignoring the backslash.
 or just ignore the backslash. Personally I usually
 don't escape anything just to be on the safe side. :p
Worthy of a small community poll.
Also, in case you didn't see it: http://d.puremagic.com/issues/show_bug.cgi?id=7045 - Jonathan M Davis
Dec 02 2011
parent Dmitry Olshansky <dmitry.olsh gmail.com> writes:
On 03.12.2011 2:28, Jonathan M Davis wrote:
 On Saturday, December 03, 2011 01:44:33 Dmitry Olshansky wrote:
 On 03.12.2011 1:08, Marco Leise wrote:
 Cool, thx for your answers. The source code for OpenJDK can be
 downloaded if you want to take a look at it. You are probably right
 about them not decoding the characters lazily since their strings are
 UTF-16.
 The commented version of opIndex is a bit faster on my Core 2. This is
 the first time that I witnessed such speed differences between
 processors. :)
Wow. I knew something was wrong with non-BT test code, from what I heard it should have been faster but it wasn't for me :)
 Also I found that the trie is usually queried twice for each matching
 character in the input string. You can't optimize opIndex any further
 (but try size_t in there instead of uint, it helped here) unless you
 make some changes on the larger scale. So if you should find out that
 the second query isn't required, that would help more than anything
 else.
 I said it on IRC today: This library will be my reference for compile
 time code generation in D. There is a lot of expertise in it, good work!
There I have two options to work through:
- separate negative and positive character classes; it would kill possible branching here.
- and now, looking at test_11 in your profile output, I see the likely culprit: I should re-think lookahead tests; they used to reduce the number of savepoints during matching.
 P.S.: I'm fine with treating anything that is escaped, but not special,
 as is. \w did cause an infinite loop though, so you may want to test
Hm can't reproduce.
 with the original regex. For \. you can assert(false, "\. is not a valid
 escape sequence")
No, that was a bad idea ... and I planned to change that exception. Now I'm more into ignoring the backslash.
 or just ignore the backslash. Personally I usually
 don't escape anything just to be on the safe side. :p
Worthy of a small community poll.
Also, in case you didn't see it: http://d.puremagic.com/issues/show_bug.cgi?id=7045 - Jonathan M Davis
It should work now. The issue at large remains. -- Dmitry Olshansky
Dec 03 2011
prev sibling next sibling parent bearophile <bearophileHUGS lycos.com> writes:
David Eagen:

 So, it's nearly twice as fast but still the slowest of the four.
I guess that a Python program too will be faster than the D code. Your D code looks good enough now. The problem is that to create fast programs you need well-tuned libraries and a good compiler back-end, and this requires lots of work and time. Currently steering too much D development effort toward tuning is a bad idea, because there are several more important issues, even design ones (like applying the already written patches present on GitHub).

Cosmetic matters about your code:

args[1 .. args.length]          ==>  args[1 .. $]
endsWith(hostName, "foo.com")   ==>  hostName.endsWith("foo.com")

Now I suggest to compile your D program with:
-O -release -inline -profile
and run it on a smaller input (because it will run slower or much slower). Rename the profiling output files (otherwise they get corrupted; I don't know if this problem is in Bugzilla), and profile it again with:
-O -release -profile

Taking a look at the profiler output will help find where the problems are. Maybe that profiling will generate material for a Bugzilla entry too.

If you also want to try disabling the GC in the inflationary phase of the program:

import core.memory: GC;
...
GC.disable();
foreach (arg; args[1 .. args.length])
    ...
GC.enable();
/* Sort the host names */
...

Bye,
bearophile
Dec 02 2011
prev sibling parent reply Jonathan M Davis <jmdavisProg gmx.com> writes:
On Thursday, December 01, 2011 23:17:30 David Eagen wrote:
 "Andrei Alexandrescu" <SeeWebsiteForEmail erdani.org> wrote in message
 news:jb8hvh$2sdl$1 digitalmars.com...
 
 This is a good benchmark for I/O and a practical regex. David, could you
 please send (privately if you want) the file or some statistics about it
 (bytes, lines, a representative sample)? Thanks!
One more thing before I forget - you may want to use byLine() for input. In case the issue turns out to be related to I/O, it's much better we improve byLine() instead of the streams library.
I implemented the various suggestions (File.byLine, writeln instead of writefln, std.algorithm.sort), except using FReD. FReD wouldn't compile on the linux box I am using. The error was:

/phobos/std/file.d(537): Error: undefined identifier package c.stdio

Previous timing:
real    4m21.255s
user    4m14.216s
sys     0m5.940s

New timing after the changes:
real    2m15.840s
user    2m12.700s
sys     0m2.760s

So, it's nearly twice as fast but still the slowest of the four. I was able to compile with FReD on a 32-bit Windows system and it performed 15% faster than std.regex processing these same test files. I would love to try the precompiled regex code for FReD but the compile throws an out of memory error when I try it.

The source files are /var/log/syslog files from sendmail on a Solaris 10 box. I can't make them available because they are mail logs from our company, but here are the sizes and line counts along with example entries.

$ wc -l syslog syslog.0 syslog.1 syslog.2
  280618 syslog
  331609 syslog.0
  535035 syslog.1
  543241 syslog.2
 1690503 total

-rw-r--r-- 1 david david  86244537 2011-11-30 21:26 syslog.0
-rw-r--r-- 1 david david 146156778 2011-11-30 21:26 syslog.1
-rw-r--r-- 1 david david 143481904 2011-11-30 21:26 syslog.2
-rw-r--r-- 1 david david  73030898 2011-11-30 21:26 syslog

The entries look like this:

Oct 27 03:10:01 thehost sendmail[3248]: [ID 801593 mail.info] p9R8A0MJ003245: to=user somewhere.com, delay=00:00:01, xdelay=00:00:01, mailer=esmtp, pri=120773, relay=some.host.com. [5.6.7.8], dsn=2.0.0, stat=Sent (ok 1319703001 qp 25319 the.mail.host.com!1319703000!80184558!1)
Oct 27 03:10:04 thehost sendmail[3289]: [ID 801593 mail.info] p9R8A3Nr003289: from=sender senderbox, size=765, class=0, nrcpts=1, msgid=<201110270810.p9R8A3QA021419 senderbox>, proto=ESMTP, daemon=MTA, relay=senderbox.foo.com [1.2.3.4]
The performance boost would likely be minimal, since the vast majority of the speed problem is in std.regex, but I would point out that endsWith can take multiple arguments. - Jonathan M Davis
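To illustrate the multi-needle form (a minimal sketch with a made-up host name; `endsWith` from std.algorithm returns an index rather than a bool when given several needles):

```d
import std.algorithm : endsWith;
import std.stdio;

void main()
{
    string host = "relay.foo.com"; // hypothetical host name

    // With several needles, endsWith returns 0 on no match, or the
    // 1-based index of the needle that matched.
    auto which = host.endsWith("foo.com", "bar.org", "baz.net");
    writeln(which); // 1: the first needle matched
}
```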
Dec 02 2011
parent "Marco Leise" <Marco.Leise gmx.de> writes:
Am 02.12.2011, 11:15 Uhr, schrieb Jonathan M Davis <jmdavisProg gmx.com>:

 The performance boost would likely be minimal, since the vast majority  
 of the
 speed problem is in std.regex, but I would point out that endsWith can  
 take
 multiple arguments.

 - Jonathan M Davis
I used a single GC.disable() right at the start of main() and it slowed the program down by ~100ms. Could be noise, but your expectation is correct.
Dec 02 2011
prev sibling parent "Marco Leise" <Marco.Leise gmx.de> writes:
Am 01.12.2011, 18:03 Uhr, schrieb Andrei Alexandrescu  
<SeeWebsiteForEmail erdani.org>:

 On 11/30/11 9:47 PM, David Eagen wrote:
 Recently I needed to analyze some mail logs. I needed to find the hosts
 that were sending mail and how many lines in the log existed for each
 host. Thinking this would be perfect for natively compiled code in D I
 first wrote a D app. I then wrote it in perl and was amazed at how much
 faster perl was. I expanded out to Java and Scala. For all four I used
 the same source files and the same output was created.
[snip] This is a good benchmark for I/O and a practical regex. David, could you please send (privately if you want) the file or some statistics about it (bytes, lines, a representative sample)? Thanks! Andrei
Hey, I'm also interested. I *love* optimization problems, although it may be just the regex in this case and I'll keep my hands off regex engines. Perhaps rapidshare or similar services can host that?
Dec 01 2011
prev sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 11/30/2011 12:56 PM, Paulo Pinto wrote:
 Are you not being a bit simplistic here?

 There are several JVM implementations around not just one.
It's not the implementation that's the problem, it's the *definition* of the bytecode for the JVM.
 Plus if I understand correctly some complains of people using D in real
 projects, in many cases JVM JITs are able to generate better code than D. At
 least for the time being.
Only if you're writing "Java" code in D. If you write using value structs, for example, or use array slicing, things don't work out so well in Java, which can do neither.
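A minimal sketch of the point above (the `Point` type is hypothetical, not from the thread): value structs sit inline in an array with no per-object heap indirection, and slices are cheap views into the same memory, neither of which Java's reference-only object model can express directly.

```d
import std.stdio;

// A value struct: stored inline, no per-object heap allocation or
// reference indirection as with Java classes.
struct Point
{
    double x, y;
}

void main()
{
    auto points = new Point[1000]; // one contiguous allocation

    // Array slicing: a window into the same memory, no copying.
    Point[] window = points[10 .. 20];
    window[0] = Point(1.0, 2.0);

    // The slice aliases the original storage.
    writeln(points[10]);
}
```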
Nov 30 2011
next sibling parent reply Paulo Pinto <pjmlp progtools.org> writes:
This is not what I understand from bearophile's comments every now and
then, when he compares dmd with other language implementations.

Now even if the JVM design is not the best in the world, most
implementations nowadays recompile on demand and are able to:

- devirtualize method calls
- replace heap allocation by stack allocation thanks to escape analysis
- re-JIT code and replace calls by inline code
- perform loop unrolling
- ...

Here are a few papers about JVM's implementations that I am aware of:

http://www.ecst.csuchico.edu/~juliano/csci693/Presentations/2008w/Materials/Swisher/DOCS/sources/Escape%20Analysis%20for%20Java.pdf

http://www.research.ibm.com/people/h/hind/ACACES06.pdf

http://webdocs.cs.ualberta.ca/~amaral/IBM-Stoodley-talks/UofAKASWorkshop.pdf

http://wikis.sun.com/display/HotSpotInternals/EscapeAnalysis

http://www.slideshare.net/jbaruch/jvm-magic

I do think there is a lack of language implementations that compile
directly to native code nowadays, but I am also quite aware that the
millions spent on VM research over the years have borne quite a few
fruits.

Am 30.11.2011 22:50, schrieb Walter Bright:
 On 11/30/2011 12:56 PM, Paulo Pinto wrote:
 Are you not being a bit simplistic here?

 There are several JVM implementations around not just one.
It's not the implementation that's the problem, it's the *definition* of the bytecode for the JVM.
 Plus if I understand correctly some complains of people using D in real
 projects, in many cases JVM JITs are able to generate better code than
 D. At
 least for the time being.
Only if you're writing "Java" code in D. If you write using value structs, for example, or use array slicing, things don't work out so well in Java which can't do either.
Nov 30 2011
next sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 11/30/2011 10:19 PM, Paulo Pinto wrote:
 This is not what I understand from bearophile's comments every now and then,
 when we compares dmd with other languages implementations.

 Now even if the JVM design is not the best in the world, most implementations
 nowadays recompile on demand are are able to:

 - devirtualize method calls
 - replace heap allocation by stack allocation thanks to escape analysis
 - re-JIT code and replace calls by inline code
 - perform loop unrolling
 - ...

 Here are a few papers about JVM's implementations that I am aware of:

 http://www.ecst.csuchico.edu/~juliano/csci693/Presentations/2008w/Materials/Swisher/DOCS/sources/Escape%20Analysis%20for%20Java.pdf


 http://www.research.ibm.com/people/h/hind/ACACES06.pdf

 http://webdocs.cs.ualberta.ca/~amaral/IBM-Stoodley-talks/UofAKASWorkshop.pdf

 http://wikis.sun.com/display/HotSpotInternals/EscapeAnalysis

 http://www.slideshare.net/jbaruch/jvm-magic

 I do think there is a lack of language implementations that
 compile directly to native code nowadays, but I am also quite aware that the
 millions spent in VM research along the years have provided quite a few fruits.
When you can implement a competitive malloc() using Java, I'll believe it has reached parity. There's a reason why the JVM is itself implemented in C, not Java. D's runtime is implemented in D.

Most Java benchmarks I've seen that showed Java being competitive were written in Java (or at least Java style) and then ported to other languages. The reason is that if you want to convert C, C++, or D code to Java, you have to re-engineer it.

The reason escape analysis is used in the JVM is that the Java bytecode is severely limited in what it can express. So a language's bytecode generator has to bash its semantics somehow into that tiny vocabulary, and then the JVM has to "reverse engineer" the intent back out of it. The effort poured into the JVM is to recognize higher level Java constructs, not higher level Scala constructs, hence the poor results from Scala mentioned in the article.
Nov 30 2011
next sibling parent reply Russel Winder <russel russel.org.uk> writes:
On Wed, 2011-11-30 at 23:08 -0800, Walter Bright wrote:
[...]
 When you can implement a competitive malloc() using Java, I'll believe it
 has reached parity. There's a reason why the JVM is itself implemented in
 C, not Java. D's runtime is implemented in D.
This is like trying to compare apples and dog excrement. Clearly malloc will always be written in C.

I think this thread has shown that D folk need to accept that Java is a critical platform out there and will be for many years to come. Languages such as Groovy, JRuby and Clojure -- the jury is still out on Scala, and the jury cannot yet even compare Ceylon and Kotlin -- have evolved the milieu into an active and efficacious one.

The point is that the JVM arena, the CLR arena and the native arena are three separate ones these days, with little or no crossover. D's fight is with C, C++, Go, not with Java. D needs to make inroads into areas currently dominated by C and C++ and those being swept up in the tide of Go.

If D is to be anything other than an interesting blip in the history of programming languages it needs to gain traction from more than just the core aficionados. So which areas can D compete well in, and who are the people and organizations who can show that D is better than C, C++, and Go in those areas? Why are they not out there doing guerilla marketing of D?
 Most Java benchmarks I've seen that showed Java being competitive were
 written in Java (or at least Java style) and then ported to other
 languages. The reason is because if you want to convert C, C++, or D code
 to Java, you have to re-engineer it.
So the people doing the benchmarks you have seen are substandard and don't realize you are supposed to write the best idiomatic version of the algorithm in each of the languages under test. This is not a stick to beat Java with.
 The reason escape analysis is used in the JVM is because the Java bytecode
 is severely limited in what it can express. So, a language bytecode
 generator has to bash its semantics somehow into that tiny vocabulary, and
 then the JVM has to "reverse engineer" the intent back out of it. The
 effort poured into the JVM is to recognize higher level Java constructs,
 not higher level Scala constructs, hence the poor results from Scala
 mentioned in the article.
Fortran compiler writers have been doing this sort of thing very successfully for years: 1960s Fortran 4 serial code gets converted into parallel code by clever inferences and "reverse engineering". It has always amazed me that owners of these old Fortran codes think it is more important to expend resources on clever compiler trickery than just to rewrite the codes in a modern language, like Fortran 2009.

Of course rewriting would give an opportunity to change language. I bet they would go to C++ not D. Though staying with Fortran 2009 may be even better.

So the real question here is to get some benchmarks together to show that D outshines C, C++ and Fortran 2009 -- with of course the benchmarks being written properly in idiomatic language for each language, not, as you noted earlier, transliterations from one language to all the others.

--
Russel.
===========================================================================
Dr Russel Winder      t: +44 20 7585 2200   voip: sip:russel.winder ekiga.net
41 Buckmaster Road    m: +44 7770 465 077   xmpp: russel russel.org.uk
London SW11 1EN, UK   w: www.russel.org.uk  skype: russel_winder
Dec 01 2011
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 12/1/2011 1:26 AM, Russel Winder wrote:
 I think this thread has shown that D folk need to accept that Java is a
 critical platform out there and will be for many years to come.
I understand that. Java isn't going anywhere. I was only addressing the idea that the Java bytecode is a burden for compiler developers or not.
 D's fight is with C, C++, Go, not with Java.  D needs to make inroads
 into areas currently dominated by C and C++ and those being swept up in
 the tide of Go.
I suspect Go's market is more the Java market than the C/C++ one.
 So people doing the benchmarks you have seen are substandard and don't
 realize you are supposed to write the best idiomatic version of the
 algorithm in each of the languages under test.  This is not a stick to
 beat Java with.
It is when they are using a Java stick to beat D with :-)
 Fortran compiler writers have been doing this sort of thing very
 successfully for years:  1960s Fortran 4 serial code gets converted into
 parallel code by clever inferences and "reverse engineering.

 It has always amazed me that owners of these old Fortran codes think it
 is more important to expend resources on clever compiler trickery that
 just to rewrite the codes in a modern language, like Fortran 2009.

 Of course rewriting would give an opportunity to change language.  I bet
 they would go to C++ not D.  Though staying with Fortran 2009 may be
 even better.
The C++ folks do the same thing. Rather than add vector operations to the core language, they rely on "vectorizing" compilers that reverse engineer loops into a higher level construct.
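For contrast, D's whole-array operations make the elementwise intent explicit in the language instead of relying on a vectorizer to reverse engineer a loop (a minimal sketch with made-up data):

```d
void main()
{
    double[] a = new double[4];
    double[] b = [1.0, 2.0, 3.0, 4.0];
    double[] c = [10.0, 20.0, 30.0, 40.0];

    // Whole-array (vector) operation: no loop to reverse engineer; the
    // compiler sees the elementwise computation directly.
    a[] = b[] + c[] * 2.0;

    assert(a == [21.0, 42.0, 63.0, 84.0]);
}
```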
 So the real question here is to get some benchmarks together to show
 that D outshines C, C++ and Fortran 2009 -- with of course the
 benchmarks being written properly in idiomatic language for each
 language not, as you noted earlier, transliterations from one language
 to all the others.
I've done a lot of benchmarks in the past. The universal reaction to them is one of:

1. There's a bug in my compiler that "deleted" the benchmark code (what actually happened is it was dead code, and the compiler deleted dead code).

2. I somehow cheated, because the results cannot possibly be true.

3. I used a sabotaged compiler for language X to compare against.

4. I didn't write the code correctly for language X. This charge is leveled at me even when I'm careful to use OTHER peoples' published X code, and provide links to the originals.

So color me a bit weary of the abuse. I tend to let people do their own benchmarks. It's also why I stopped publishing language comparison charts.
Dec 01 2011
parent reply Russel Winder <russel russel.org.uk> writes:
On Thu, 2011-12-01 at 02:09 -0800, Walter Bright wrote:
[...]
 I understand that. Java isn't going anywhere. I was only addressing the i=
dea=20
 that the Java bytecode is a burden for compiler developers or not.
I disagree that Java isn't going anywhere. The hassles over the last year with Oracle are now resolving themselves as IBM influence gains ground. With the publication of the timetable and part road map for Javas 8, 9, 10 11 and 12, the Java community is hugely re-energized. The opening up of the JCP and the voting in of a couple of user groups to the executive committee has made a significant change to the management of Java. Whether this is positive we shall see. The Java bytecodes and JVM are no longer the fixed point they were. Change is now possible. Clearly a zero address stack machine has some issue, I never disagreed with you on that, but I don't see it as the infinite brick wall you were seeming to portray it as.=20 [...]
 I suspect Go's market is more the Java market than the C/C++ one.
I don't think that is the complete story. Go's initial market is cloud systems programming. To date this has been a mishmash of C, C++ and Java with a dash of Python. Go's marketing clearly sets it up against C and somewhat against Python. They are ignoring the JVM arena (at least for now) as they don't see how to get traction there quickly enough to make things work for them. [...]
 It is when they are using a Java stick to beat D with :-)
:-) I still think that in the short term there is no value in D trying to address markets currently dominated by JVM or CLR. Much better to carve out a presence in an area with a lower barrier to entry. Python attacked the Perl market and made headway. It then attacked C as a GUI development language and basically won. Now it is attacking the HPC data visualization arena and is winning. D needs to spot an angle and go for it. [...]
 The C++ folks do the same thing. Rather than add vector operations to the core
 language, they rely on "vectorizing" compilers that reverse engineer loops into
 a higher level construct.
The Fortran folk introduced "whole array operations" to avoid having to do that -- even though they did all the inferencing on loops for legacy code. Best move yet in the Fortran world, since now people write higher level code and let the code generator do all the nifty optimizations. C++ must have done the same by now: there must be good BLAS, and high level vector/matrix systems, especially with GPGPU being the driving force these days. [...]
 So color me a bit weary of the abuse. I tend to let people do their own
 benchmarks. It's also why I stopped publishing language comparison charts.
Sad, but understandable. If you still have some of the codes, perhaps there is a way that this can be turned into something? Clearly the Alioth shootout is one possible model. -- Russel. Dr Russel Winder t: +44 20 7585 2200 voip: sip:russel.winder ekiga.net 41 Buckmaster Road m: +44 7770 465 077 xmpp: russel russel.org.uk London SW11 1EN, UK w: www.russel.org.uk skype: russel_winder
Dec 02 2011
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 12/2/2011 12:34 AM, Russel Winder wrote:
 On Thu, 2011-12-01 at 02:09 -0800, Walter Bright wrote:
 [...]
 I understand that. Java isn't going anywhere. I was only addressing the idea
 that the Java bytecode is a burden for compiler developers or not.
I disagree that Java isn't going anywhere. The hassles over the last year with Oracle are now resolving themselves as IBM influence gains ground. With the publication of the timetable and partial road map for Java 8, 9, 10, 11 and 12, the Java community is hugely re-energized. The opening up of the JCP and the voting in of a couple of user groups to the executive committee has made a significant change to the management of Java. Whether this is positive we shall see.
I meant it wasn't going away. I didn't mean that it would no longer be improved.
 The Java bytecodes and JVM are no longer the fixed point they were.
 Change is now possible.  Clearly a zero address stack machine has some
 issue, I never disagreed with you on that, but I don't see it as the
 infinite brick wall you were seeming to portray it as.
I think it's a disastrous problem as it stands now. A lot of very useful things simply cannot be reasonably expressed in it. But if new instructions are added, anything is possible.
 [...]
 I suspect Go's market is more the Java market than the C/C++ one.
I don't think that is the complete story. Go's initial market is cloud systems programming. To date this has been a mishmash of C, C++ and Java with a dash of Python. Go's marketing clearly sets it up against C and somewhat against Python. They are ignoring the JVM arena (at least for now) as they don't see how to get traction there quickly enough to make things work for them.
I know their marketing is not directed against Java, but I was referring to what Go is technically. It's like C++ spawned Java, and C spawned Go. That stacks Go up against Java.
 I still think that in the short term there is no value in D trying to
 address markets currently dominated by JVM or CLR.  Much better to carve
 out a presence in an area with a lower barrier to entry.
I used to think that too, until I found out that half of D users came from the Java world. (The other half are from C++.)
 C++ must have done the same by now:  there must be good BLAS, and high
 level vector/matrix systems, especially with GPGPU being the driving
 force these days.
Sure, but none of that is standard C++.
 If you still have some of the codes, perhaps there is a way that this
 can be turned into something?  Clearly the Alioth shootout is one
 possible model.
I don't have anything that's up to date.
Dec 02 2011
parent reply Gour <gour atmarama.net> writes:
On Fri, 02 Dec 2011 00:42:36 -0800
Walter Bright <newshound2 digitalmars.com> wrote:

 I used to think that too, until I found out that half of D users came
 from the Java world. (The other half are from C++.)
What about those not coming from Java (never touched it) and coming from Haskell, Python... (although I used Zortech++)? Are we still in the latter category? Sincerely, Gour -- One who is not connected with the Supreme can have neither transcendental intelligence nor a steady mind, without which there is no possibility of peace. And how can there be any happiness without peace? http://atmarama.net | Hlapicina (Croatia) | GPG: 52B5C810
Dec 02 2011
parent Walter Bright <newshound2 digitalmars.com> writes:
On 12/2/2011 4:00 AM, Gour wrote:
 What about those not coming from Java (never touched it) and coming from
 Haskell, Python...(although I used Zortech++)?. Are we still in the
 latter category?
Not too many of those :-)
Dec 02 2011
prev sibling next sibling parent reply Caligo <iteronvexor gmail.com> writes:
On Thu, Dec 1, 2011 at 3:26 AM, Russel Winder <russel russel.org.uk> wrote:

 On Wed, 2011-11-30 at 23:08 -0800, Walter Bright wrote:
 [...]
 When you can implement a competitive malloc() using Java, I'll believe
it has
 reached parity. There's a reason why the JVM is itself implemented in C,
not
 Java. D's runtime is implemented in D.
This is like trying to compare apples and dog excrement. Clearly malloc will always be written in C. I think this thread has shown that D folk need to accept that Java is a critical platform out there and will be for many years to come. Languages such as Groovy, JRuby and Clojure -- the jury is still out on Scala, and the jury cannot yet even compare Ceylon and Kotlin -- have evolved the milieu to be an active and efficacious one. The point is that the JVM arena, the CLR arena and the native arena are three separate ones these days, with little or no crossover.
Java is a joke. Get over it. Java is a toy language and the only reason people and companies have taken it seriously is because so many kids have decided (been brainwashed) to play with it. With enough marketing and propaganda you can make people believe whatever you want them to believe. Java is also the "blue collar" programming language, which means corporations get to exploit people all around the world in order to make even more profits.
 D's fight is with C, C++, Go, not with Java.  D needs to make inroads
 into areas currently dominated by C and C++ and those being swept up in
 the tide of Go.

 If D is to be anything other than a interesting blip in the history of
 programming languages it needs to gain traction from more than just the
 core aficionados.

 So which area can D compete well in, who are the people and
 organizations who can show that D is better than C, C++, and Go in
 these areas.  Why are they not out there doing guerilla marketing of D?

 Most Java benchmarks I've seen that showed Java being competitive were
written
 in Java (or at least Java style) and then ported to other languages. The
reason
 is because if you want to convert C, C++, or D code to Java, you have to
 re-engineer it.
So people doing the benchmarks you have seen are substandard and don't realize you are supposed to write the best idiomatic version of the algorithm in each of the languages under test. This is not a stick to beat Java with.
So far I have competed in the ACM ICPC regional programming contests twice. I've met many students there and I've had many teammates, most if not all of them Java programmers. Besides me (I've never actually done any Java), I don't know any other C++ programmer there. I have seen countless problems solved in Java and C++, with Java always being 10-20 times slower: same problem, same algorithms and/or data structures. Whenever I find an article that talks about Java being faster than C++, I know it's BS. You can find fair comparisons at http://www.spoj.pl/ Java is also very sluggish. I don't exactly know why, but I'm sure it has something to do with the JVM and/or GC. Just look at Android and compare it to the iPhone to see what I mean. Apps running on the PC written in Java are also sluggish: things like Eclipse and Open Office come to mind. Java is a joke. It's a faith-based ideology. Get over it.
 The reason escape analysis is used in the JVM is because the Java
bytecode is
 severely limited in what it can express. So, a language bytecode
generator has
 to bash its semantics somehow into that tiny vocabulary, and then the
JVM has to
 "reverse engineer" the intent back out of it. The effort poured into the
JVM is
 to recognize higher level Java constructs, not higher level Scala
constructs,
 hence the poor results from Scala mentioned in the article.
Fortran compiler writers have been doing this sort of thing very successfully for years: 1960s Fortran 4 serial code gets converted into parallel code by clever inferences and "reverse engineering". It has always amazed me that owners of these old Fortran codes think it is more important to expend resources on clever compiler trickery than just to rewrite the codes in a modern language, like Fortran 2009. Of course rewriting would give an opportunity to change language. I bet they would go to C++ not D. Though staying with Fortran 2009 may be even better. So the real question here is to get some benchmarks together to show that D outshines C, C++ and Fortran 2009 -- with of course the benchmarks being written properly in idiomatic language for each language not, as you noted earlier, transliterations from one language to all the others. -- Russel.
Dec 01 2011
next sibling parent "Marco Leise" <Marco.Leise gmx.de> writes:
Am 01.12.2011, 20:39 Uhr, schrieb Caligo <iteronvexor gmail.com>:

 Java is a joke. [...] Java is a toy language [...] so many kids
 [...] (brainwashed) [...] propaganda [...] exploit people all
 around the world [...] (I've never actually done any Java) [...]Java  
 always being 10-20 times slower [...] Java is also very
 sluggish. [...] Java is a joke.
Amen.
 Apps running on the PC written in Java are also sluggish:Things like  
 Eclipse and Open Office come to mind.
Open Office is *not* written in Java. You can actually build it without dependencies on Java and not miss anything. http://wiki.services.openoffice.org/wiki/Java_and_OpenOffice.org If you want to ignore a perfectly fine Office suite because there was some notion of Java somewhere, just go ahead :D
Dec 02 2011
prev sibling parent reply "Nick Sabalausky" <a a.a> writes:
(Weird - my reply seems to have never appeared. Sorry if this is a dup...)

From: "Caligo" <iteronvexor gmail.com>
 Java is a joke.  Get over it.  Java is a toy language and the only reason
 people and companies have taken it seriously is because so many kids have
 have decided (brainwashed) to play with it.  With enough marketing and
 propaganda you can make people believe whatever you want them to believe.
 Java is also the "blue collar" programming languages, which means
 corporations get to exploit people all around the world in order to make
 even more profits.
I like your bluntness :) And I'd tend to agree: Java has always been basically an OO VisualBasic sans MS (But nobody's been allowed to actually say so because it's been popular.) And of course VisualBasic itself has always been the 90's version of Cobol (although VB has at least had some value as a learning language; I'm not so sure the same could be said of Cobol, although I wouldn't know).
 So far I have competed in the ACM ICPC regional programming contests
 twice.  I've met many students there and I've had many teammates, most if
 not all of them Java programmers.  Besides me (I've never actually done
 any
 Java), I don't know any other C++ programmer in there.  I have seen
 countless problems solved in Java and C++, with Java always being 10-20
 times slower: same problem, same algorithms and/or data structures.
 Whenever I find an article that talks about Java being faster than C++, I
 know it's BS.  You can find fair comparisons at http://www.spoj.pl/

 Java is also very sluggish.  I don't exactly know why, but I'm sure it has
 something to do with JVM and/or GC.  Just look at Android and compare it
 to
 iPhone to see what I mean.  Apps running on the PC written in Java are
 also
 sluggish: Things like Eclipse and Open Office come to mind.

 Java is a joke.  It's a faith-based ideology.  Get over it.
While I admit it's anecdotal, this has always been my experience with Java, too. Java did help show me some of the downsides of C++ (ex, headers never bothered me until I looked at Java), but that's pretty much been the extent of my appreciation for Java.
Dec 02 2011
parent Walter Bright <newshound2 digitalmars.com> writes:
On 12/2/2011 9:38 AM, Nick Sabalausky wrote:
 While I admit it's anecdotal, this has always been my experience with Java,
 too. Java did help show me some of the downsides of C++ (ex, headers never
 bothered me until I looked at Java), but that's pretty much been the extent
 of my appreciation for Java.
I actually learned a lot from Java.
Dec 02 2011
prev sibling parent reply Russel Winder <russel russel.org.uk> writes:
On Thu, 2011-12-01 at 13:39 -0600, Caligo wrote:

I am forced to treat this response as the troll that it is.

[...]
 Java is a joke.  Get over it.  Java is a toy language and the only reason
 people and companies have taken it seriously is because so many kids have
 have decided (brainwashed) to play with it.  With enough marketing and
 propaganda you can make people believe whatever you want them to believe.
 Java is also the "blue collar" programming languages, which means
 corporations get to exploit people all around the world in order to make
 even more profits.
Java is the main language of development just now. D is a tiny little backwater in the nether regions of obscurity. If any language is a joke here, it is D since it is currently unable to claim any serious market share in the world of development. The sooner you accept this, the sooner you can discuss the shortcomings of a language you have no experience of, by your own admission. Your point about how languages become popular has some merit, albeit stated in an overly bigoted fashion. Your point about exploitation should be aimed at the entirety of the economic systems of the world. The systems in the USA, India and China (the three main economies of the world) rest completely and solely on exploitation. It's called capitalism. [...]
 So far I have competed in the ACM ICPC regional programming contests
 twice.  I've met many students there and I've had many teammates, most if
 not all of them Java programmers.  Besides me (I've never actually done a=
ny
 Java), I don't know any other C++ programmer in there.  I have seen
 countless problems solved in Java and C++, with Java always being 10-20
 times slower: same problem, same algorithms and/or data structures.
 Whenever I find an article that talks about Java being faster than C++, I
 know it's BS.  You can find fair comparisons at http://www.spoj.pl/
If you have never used Java or never actually investigated the issues as to when Java is significantly slower than C++ and when it is as fast as C++, then clearly you have no grounds on which to express any opinion based on facts; it is just prejudice and bigotry. Such comments have no place in any discussion.
 Java is also very sluggish.  I don't exactly know why, but I'm sure it has
 something to do with JVM and/or GC.  Just look at Android and compare it to
 iPhone to see what I mean.  Apps running on the PC written in Java are also
 sluggish: Things like Eclipse and Open Office come to mind.
If you don't know why, how can you make claims that you cannot substantiate in any way, shape or form? Your qualitative assessment of applications such as Eclipse and OpenOffice relates to the codebase and not the language.
 Java is a joke.  It's a faith-based ideology.  Get over it.
[...] Clearly you are having a crisis of faith, and so are having to lash out to protect your ideology. I am entirely comfortable with my perceptions of languages, so have no need for such behaviours. I analyse languages, consider use in context, and use the most appropriate language for the job at hand, be it Fortran, C++, C, Ada, Haskell, OCaml, Java, Groovy, Python, Ruby, Clojure, Lisp, Go. Perhaps even D. -- Russel.
Dec 02 2011
parent reply "Nick Sabalausky" <a a.a> writes:
"Russel Winder" <russel russel.org.uk> wrote in message 
news:mailman.1242.1322814007.24802.digitalmars-d puremagic.com...
 Java is the main language of development just now. D is a tiny little
 backwater in the nether regions of obscurity. If any language is a joke
 here, it is D since it is currently unable to claim any serious market
 share in the world of development.
I see, so popularity is the primary determining factor of quality and validity. Right? What's right is inherently popular and what's popular is inherently right.
 If you don't know why, how can you make claims that you cannot
 substantiate in any way shape or form.
That's just BS. Even the scientific method starts with *observations*, not the "how" or "why". Those come later. Observations without knowing the underlying cause are perfectly valid. Hell, if JVM is slow, then it doesn't really even matter why (unless you're optimizing it or trying to avoid the same pitfalls), now does it?
 I am entirely comfortable with my perceptions
 of languages, so have no need for such behaviours.
That's BS posturing and chest-thumping. What is this, some damn new agers group where nobody's allowed to dislike anything and feel strongly about it? The reason you feel no need for such things is because you don't appear to find significant fault with Java/JVM. Other people do. And unlike you, those people have to put up with a world heavily infected by it. *Of course*, you don't feel a need to complain, you're comfortable equating popularity with validity. I'm not trying to say you're not entitled to be happy with Java/JVM, but when someone who likes the status quo sees someone who dislikes it and then says, "Hey, how dare you be unhappy about it! After all, I'm happy! It's the popular thing, therefore it must be ok and you should like it!", and starts preaching, that's just asinine.
Dec 02 2011
next sibling parent reply Somedude <lovelydear mailmetrash.com> writes:
Le 02/12/2011 18:08, Nick Sabalausky a écrit :
 "Russel Winder" <russel russel.org.uk> wrote in message 
 news:mailman.1242.1322814007.24802.digitalmars-d puremagic.com...
 Java is the main language of development just now. D is a tiny little
 backwater in the nether regions of obscurity. If any language is a joke
 here, it is D since it is currently unable to claim any serious market
 share in the world of development.
I see, so popularity is the primary determining factor of quality and validity. Right? What's right is inherently popular and what's popular is inherently right.
 If you don't know why, how can you make claims that you cannot
 substantiate in any way shape or form.
That's just BS. Even the scientific method starts with *observations*, not the "how" or "why". Those come later. Observations without knowing the underlying cause are perfectly valid. Hell, if JVM is slow, then it doesn't really even matter why (unless you're optimizing it or trying to avoid the same pitfalls), now does it?
 I am entirely comfortable with my perceptions
 of languages, so have no need for such behaviours.
That's BS posturing and chest-thumping. What is this, some damn new agers group where nobody's allowed to dislike anything and feel strongly about it? The reason you feel no need for such things is because you don't appear to find significant fault with Java/JVM. Other people do. And unlike you, those people have to put up with a world heavily infected by it. *Of course*, you don't feel a need to complain, you're comfortable equating popularity with validity. I'm not trying to say you're not entitled to be happy with Java/JVM, but when someone who likes the status quo sees someone who dislikes it and then says, "Hey, how dare you be unhappy about it! After all, I'm happy! It's the popular thing, therefore it must be ok and you should like it!", and starts preaching, that's just asinine.
In what way is Eclipse sluggish? The Java language is slower than C++, but Eclipse happily compiles hundreds of thousands of lines or millions of lines of Java code in a few seconds or at most tens of seconds. Try to do that even with C, not talking about C++. The fact is, you are more productive in Java than in C++ by nearly an order of magnitude. Because:
0) the language is easier, has far less idiosyncrasies
1) the IDEs are extremely helpful
2) the API is extremely complete and reliable
3) there are libraries for nearly everything
4) debugging is usually easier than in C++
5) you have less bugs (especially hard-to-find bugs like uninitialized variables, for instance, or race conditions)
6) porting is easier
7) it is safer in the sense that you have less security holes
These qualities largely compensate for the defects and shortcomings of the language, and I can attest from experience that because of its massive toolset and libraries as well as static typing, Java is comparable in productivity with Python. Besides, with a little attention to what you do, you can extract very decent performance out of it. For instance embedded Java databases like H2 and HSQLDB are demonstrably faster than MySQL and PostgreSQL or Oracle on small to average sized disk-based databases, and they were written by a single guy. In many environments where it is massively used, Java is *not* the bottleneck; the JVM is fast enough. Rather the network or the database is. This is enough to convince most companies to invest massively into Java. So saying that Java is a toy language is ridiculous.
Dec 02 2011
next sibling parent reply "Nick Sabalausky" <a a.a> writes:
"Somedude" <lovelydear mailmetrash.com> wrote in message 
news:jbbarp$1kp6$1 digitalmars.com...
 In what way is Eclipse sluggish ?
Have you used it? OTOH, have you used...pretty much anything that uses Scintilla? There's no comparison.
 The Java language is slower than C++,
 but Eclipse happily compiles hundreds of thousands of lines or millions
 of lines of Java code in a few seconds or at most tens of seconds.
Eclipse doesn't compile Java. 'javac' does.
Dec 02 2011
parent reply Somedude <lovelydear mailmetrash.com> writes:
Le 02/12/2011 21:24, Nick Sabalausky a écrit :
 
 Eclipse doesn't compile Java. 'javac' does.
 
 
Huh, when was the last time you used Eclipse? Eclipse has always compiled Java without javac. It compiles incrementally and shows your errors while you are coding, i.e. as soon as you've finished coding, you can run your executable. That's actually a faster turnaround than with an interpreted language because the IDE shows you most of your mistakes at code time.
Dec 02 2011
parent reply "Nick Sabalausky" <a a.a> writes:
"Somedude" <lovelydear mailmetrash.com> wrote in message 
news:jbbcni$1s2j$1 digitalmars.com...
 Le 02/12/2011 21:24, Nick Sabalausky a écrit :
 Eclipse doesn't compile Java. 'javac' does.
Huh, when was the last time you used eclipse ? eclipse has always compiled Java without javac. It compiles incrementally and shows your errors while you are coding. i.e as soon as you've finished coding, you can run your executable. That's actually a faster turnaround than with an interpreted language because the IDE shows you most of your mistakes at code time.
Ok, my mistake I guess. But pretty much any interaction with it (such as typing) is unforgivably sluggish, even if it's just plain text. Coding in it is like running through a foot of water.
Dec 02 2011
parent reply Somedude <lovelydear mailmetrash.com> writes:
Le 02/12/2011 23:03, Nick Sabalausky a écrit :
 "Somedude" <lovelydear mailmetrash.com> wrote in message 
 Ok, my mistake I guess. But pretty much any intereraction with it (such as 
 typing) is unforgivably sluggish, even if it's just plain text. Coding in it 
 is like running through a foot of water.
 
Maybe there was a problem with your project or installation? I agree it is sluggish for XML, and also for multi-megabyte text files; I prefer to use a proper text editor (I like Notepad++ on Win32). But for code, that's not my experience: it is responsive, even with fairly large projects (i.e. projects with thousands of classes and dozens of Mb of class libraries).
Dec 02 2011
parent "Nick Sabalausky" <a a.a> writes:
"Somedude" <lovelydear mailmetrash.com> wrote in message 
news:jbbjfj$2r5v$1 digitalmars.com...
 Le 02/12/2011 23:03, Nick Sabalausky a écrit :
 "Somedude" <lovelydear mailmetrash.com> wrote in message
 Ok, my mistake I guess. But pretty much any intereraction with it (such 
 as
 typing) is unforgivably sluggish, even if it's just plain text. Coding in 
 it
 is like running through a foot of water.
Maybe there was a problem with your project or installation ?
Every project/installation I've tried.
 I agree it
 is sluggish for XML, and also for multi megabytes text files, I prefer
 to use a proper text editor (I like Notepad++ on Win32).
 But for code, that's not my experience, it is responsive, even with
 fairly large projects (i.e projects with thousands of classes and dozens
 of Mb of class libraries). 
Dec 02 2011
prev sibling next sibling parent reply Timon Gehr <timon.gehr gmx.ch> writes:
On 12/02/2011 09:01 PM, Somedude wrote:
 Le 02/12/2011 18:08, Nick Sabalausky a écrit :
 "Russel Winder"<russel russel.org.uk>  wrote in message
 news:mailman.1242.1322814007.24802.digitalmars-d puremagic.com...
 Java is the main language of development just now. D is a tiny little
 backwater in the nether regions of obscurity. If any language is a joke
 here, it is D since it is currently unable to claim any serious market
 share in the world of development.
I see, so popularity is the primary determining factor of quality and validity. Right? What's right is inherently popular and what's popular is inherently right.
 If you don't know why, how can you make claims that you cannot
 substantiate in any way shape or form.
That's just BS. Even the scientific method starts with *observations*, not the "how" or "why". Those come later. Observations without knowing the underlying cause are perfectly valid. Hell, if JVM is slow, then it doesn't really even matter why (unless you're optimizing it or trying to avoid the same pitfalls), now does it?
 I am entirely comfortable with my perceptions
 of languages, so have no need for such behaviours.
That's BS posturing and chest-thumping. What is this, some damn new agers group where nobody's allowed to dislike anything and feel strongly about it? The reason you feel no need for such things is because you don't appear to find significant fault with Java/JVM. Other people do. And unlike you, those people have to put up with a world heavily infected by it. *Of course*, you don't feel a need to complain, you're comfortable equating popularity with validity. I'm not trying to say you're not entitled to be happy with Java/JVM, but when someone who likes the status quo sees someone who dislikes it and then says, "Hey, how dare you be unhappy about it! After all, I'm happy! It's the popular thing, therefore it must be ok and you should like it!", and starts preaching, that's just asinine.
In what way is Eclipse sluggish ? The Java language is slower than C++, but Eclipse happily compiles hundreds of thousands of lines or millions of lines of Java code in a few seconds or at most tens of seconds. Try to do that even with C, not talking about C++.
Except that _Eclipse_ does not do anything to achieve this. It just invokes ant, which invokes javac, which is presumably written in C and C++. I can do that in a console without waiting 5 minutes until the IDE has finished starting. Furthermore, if you add up all those startup times of every Java application and add that to the total time used for compilation, I am not convinced the compilation time/LOC ratio is still that impressive (yes, JIT compilation is compilation too!)
 The fact is, you are more productive in Java than in C++ by nearly an
 order of magnitude.
 Because:
 0) the language is easier, has far less idiosyncrasies
Simpler language implies higher complexity to adapt it to your problem domain. It is a matter of trade-offs (but I am not the one to argue that C++ got all of those right.)
 1) the IDEs are extremely helpful,
I don't usually program in Java, but when I do, I use a simple text editor.
 2) the API is extremely complete and reliable
Yes it is, and that is certainly a good thing, but: result = x.add(y.multiply(BigInteger.valueOf(7))).pow(3).abs().setBit(27); ExtremelyDetailedClassName extremelyDetailedClassName = new ExtremelyDetailedClassName(); I guess you get used to it, and that those things are the reason why the IDE is extremely helpful.
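For what it's worth, that `BigInteger` chain compiles as-is. A minimal runnable version, with hypothetical inputs x = 2 and y = 3 chosen purely for illustration, shows what the absence of operator overloading looks like in practice:

```java
import java.math.BigInteger;

// Runnable sketch of the chained BigInteger expression above.
// With x = 2 and y = 3: (2 + 3*7)^3 = 23^3 = 12167, abs() is a no-op,
// and setBit(27) adds 2^27 = 134217728, giving 134229895.
public class BigIntegerChain {
    public static void main(String[] args) {
        BigInteger x = BigInteger.valueOf(2);
        BigInteger y = BigInteger.valueOf(3);
        BigInteger result =
            x.add(y.multiply(BigInteger.valueOf(7))).pow(3).abs().setBit(27);
        System.out.println(result); // prints 134229895
    }
}
```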
 3) there are libraries for nearly everything
More than for C?
 4) debugging is usually easier than in C++
Yes, and that is a big win for productivity.
 5) you have less bugs (especially, hard to find bugs like unitialized
 variables for instance, or race conditions)
You can have race conditions perfectly fine in Java code. The Java memory model is very complicated. There has long been a common practice of using double-checked locking, e.g. for singleton initialization while avoiding inefficient volatile variables. Until somebody figured out that the Java memory model did not actually support the idiom, and that all that code that did the supposedly right and clever thing was buggy. ;)
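The idiom and its post-JSR-133 fix can be sketched as follows; `Singleton` is a hypothetical class, and the key point is that without the `volatile` on the field, the unsynchronized first read could observe a non-null reference to a not-yet-constructed object under the old memory model:

```java
// Double-checked locking, in the form that is correct since Java 5:
// the volatile field establishes the happens-before edge that the
// pre-JSR-133 memory model lacked.
public class Singleton {
    private static volatile Singleton instance; // volatile is essential here

    private Singleton() {}

    public static Singleton getInstance() {
        Singleton local = instance;            // first, unsynchronized check
        if (local == null) {
            synchronized (Singleton.class) {
                local = instance;              // second check, under the lock
                if (local == null) {
                    instance = local = new Singleton();
                }
            }
        }
        return local;
    }

    public static void main(String[] args) {
        // Both calls must yield the same instance.
        System.out.println(Singleton.getInstance() == Singleton.getInstance());
    }
}
```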
 6) porting is easier
 7) it is safer in the sense that you have less security holes

 These qualities largely compensate the defects and shortcomings of the
 language, and I can attest from experience that because of its massive
 toolset and libraries
I agree. If productivity is important and efficiency is not an issue then writing the project in Java is maybe a better option than writing it in C++. (especially if you can't get hold of a decent number of C++ good C++ programmers to carry out the project, which is hard.) But writing Java code is not very pleasant imho. (But there are other fine languages, like the one we like to discuss here ;))
 as well as static typing,
Java is halfway dynamically checked. Did you know that every store into an array of class references performs a runtime downcast check?
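The runtime check referred to here comes from Java's covariant arrays: a `String[]` is assignable to an `Object[]`, so the compiler cannot rule out type-mismatched stores and the JVM must check every reference-array write at run time. A minimal sketch:

```java
// Java arrays are covariant: String[] is assignable to Object[].
class ArrayStoreDemo {
    public static void main(String[] args) {
        Object[] cells = new String[1]; // statically fine
        try {
            // The compiler accepts this store, but the JVM checks the
            // element's dynamic type against the array's actual
            // component type on every reference-array write...
            cells[0] = Integer.valueOf(42);
        } catch (ArrayStoreException e) {
            // ...and rejects the mismatch here, at run time.
            System.out.println("caught ArrayStoreException");
        }
    }
}
```

So even "statically typed" Java pays a dynamic check on this path; generics avoided the problem by making collections invariant instead.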
 Java is comparable in productivity with Python.
 Besides, with a little attention to what you do, you can extract very
 decent performance out of it.
How many Java programmers know what to pay attention to?
 For instance embedded Java databases like
 H2 and HSQLDB are demonstrably faster than MySQL and PostgreSQL or
 Oracle on small to average sized disk-based databases, and they were
 written by a single guy.
 In many environments where it is massively used, Java is *not* the
 bottleneck, the JVM is fast enough. Rather the network or the database
 are. This is enough to convince most companies to invest massively into
 Java. So saying that Java is a toy language is ridiculous.
Java means a lot of different things:
- Language
- Libraries
- Security Model
- Virtual Machine
- ...
The issue is that in discussions about Java, this often leads to misunderstandings.
Dec 02 2011
next sibling parent reply Timon Gehr <timon.gehr gmx.ch> writes:
On 12/02/2011 09:44 PM, Timon Gehr wrote:
 Except that _Eclipse_ does not do anything to achieve this. It just
 invokes ant, which invokes javac, which is presumably written in C and
 C++.
Seems like I was wrong about this.
 I can do that in a console without waiting 5 minutes until the IDE
 has finished starting.
But this is still true.
Dec 02 2011
next sibling parent reply "Marco Leise" <Marco.Leise gmx.de> writes:
Am 02.12.2011, 21:50 Uhr, schrieb Timon Gehr <timon.gehr gmx.ch>:

 On 12/02/2011 09:44 PM, Timon Gehr wrote:
 Except that _Eclipse_ does not do anything to achieve this. It just
 invokes ant, which invokes javac, which is presumably written in C and
 C++.
Seems like I was wrong about this.
 I can do that in a console without waiting 5 minutes until the IDE
 has finished starting.
But this is still true.
No you are all wrong :p, it takes half a minute for Eclipse to start up. Yes, this is still more than a native executable would need. Still I use it for Java, D, JavaScript and PHP. It's awesome! I also sneak in Eclipse project files into github repositories to make the awesomeness available to others: https://github.com/aichallenge/aichallenge/tree/epsilon/eclipse_projects
Dec 02 2011
parent reply Timon Gehr <timon.gehr gmx.ch> writes:
On 12/02/2011 10:38 PM, Marco Leise wrote:
 Am 02.12.2011, 21:50 Uhr, schrieb Timon Gehr <timon.gehr gmx.ch>:

 On 12/02/2011 09:44 PM, Timon Gehr wrote:
 Except that _Eclipse_ does not do anything to achieve this. It just
 invokes ant, which invokes javac, which is presumably written in C and
 C++.
Seems like I was wrong about this.
 I can do that in a console without waiting 5 minutes until the IDE
 has finished starting.
But this is still true.
No you are all wrong :p, it takes half a minute for Eclipse to start up. Yes, this is still more than a native executable would need. Still I use it for Java, D, JavaScript and PHP. It's awesome! I also sneak in Eclipse project files into github repositories to make the awesomeness available to others: https://github.com/aichallenge/aichallenge/tree/epsilon/eclipse_projects
It feels like 5 minutes if you are accustomed to open the text editor and start working. But I am sure there is something to IDE's, as many programmers seem to like them.
Dec 02 2011
next sibling parent reply Somedude <lovelydear mailmetrash.com> writes:
Le 02/12/2011 22:44, Timon Gehr a écrit :
 It feels like 5 minutes if you are accustomed to open the text editor
 and start working.
 
 But I am sure there is something to IDE's, as many programmers seem to
 like them.
The thing is, when you work in Java, you need 2Gb of RAM to be comfortable. Then you simply never close your IDE, so that's really not an issue at all (we don't turn off the PC at work).
Dec 02 2011
next sibling parent reply Timon Gehr <timon.gehr gmx.ch> writes:
On 12/02/2011 10:50 PM, Somedude wrote:
 Le 02/12/2011 22:44, Timon Gehr a écrit :
 It feels like 5 minutes if you are accustomed to open the text editor
 and start working.

 But I am sure there is something to IDE's, as many programmers seem to
 like them.
The thing is, when you work in Java, you need 2Gb of RAM to be comfortable. Then you simply never close your IDE, so that's really not an issue at all (we don't turn off the PC at work).
So you waste even more energy? How is that not an issue?
Dec 02 2011
parent reply Somedude <lovelydear mailmetrash.com> writes:
Le 02/12/2011 23:27, Timon Gehr a écrit :
 On 12/02/2011 10:50 PM, Somedude wrote:
 Le 02/12/2011 22:44, Timon Gehr a écrit :
 It feels like 5 minutes if you are accustomed to open the text editor
 and start working.

 But I am sure there is something to IDE's, as many programmers seem to
 like them.
The thing is, when you work in Java, you need 2Gb of RAM to be comfortable. Then you simply never close your IDE, so that's really not an issue at all (we don't turn off the PC at work).
So you waste even more energy? How is that not an issue?
Even with a simple text editor, I wouldn't turn it off, because I don't feel like having to reopen every single window that was open the day before each morning. At best, I would put it in "hibernate" mode (or whatever that's called), i.e. the RAM is still alive while the rest of the computer is off, so I don't have to reboot. That's what I usually do at home. I know it's not a very good habit, yet I am one of the most conscientious at work. Some others don't even bother to turn off the screen.
Dec 02 2011
parent reply "Nick Sabalausky" <a a.a> writes:
"Somedude" <lovelydear mailmetrash.com> wrote in message 
news:jbbk0c$2ug3$1 digitalmars.com...
 Le 02/12/2011 23:27, Timon Gehr a écrit :
 On 12/02/2011 10:50 PM, Somedude wrote:
 Le 02/12/2011 22:44, Timon Gehr a écrit :
 It feels like 5 minutes if you are accustomed to open the text editor
 and start working.

 But I am sure there is something to IDE's, as many programmers seem to
 like them.
The thing is, when you work in Java, you need 2Gb of RAM to be comfortable. Then you simply never close your IDE, so that's really not an issue at all (we don't turn off the PC at work).
So you waste even more energy? How is that not an issue?
Even with a simple text editor, I wouldn't turn it off, because I don't feel like having to reopen every single window that was open the day before each morning. At best, I would put it in "hibernate" mode (or whatever that's called), i.e the RAM is still alive while the rest of the computer is off, so I don't have to reboot. That's what I usually do at home. I know it's not a very good habit, yet I am one of the most conscious at work. Some others don't even bother to turn off the screen.
Hibernate saves the RAM (and presumably other hardware state) to HDD and then turns the machine entirely off, RAM and all. Then, when you turn it back on, it just restores it all from the disk, which is much faster than letting everything go through the usual startup routines. It is pretty nifty. I don't use it personally because I've had problems with it (possibly b/c I'm on XP), but it is pretty clever.
Dec 02 2011
parent reply Somedude <lovelydear mailmetrash.com> writes:
Le 02/12/2011 23:44, Nick Sabalausky a écrit :
 "Somedude" <lovelydear mailmetrash.com> wrote in message 
 news:jbbk0c$2ug3$1 digitalmars.com...
 Le 02/12/2011 23:27, Timon Gehr a écrit :
 On 12/02/2011 10:50 PM, Somedude wrote:
 Le 02/12/2011 22:44, Timon Gehr a écrit :
 It feels like 5 minutes if you are accustomed to open the text editor
 and start working.

 But I am sure there is something to IDE's, as many programmers seem to
 like them.
The thing is, when you work in Java, you need 2Gb of RAM to be comfortable. Then you simply never close your IDE, so that's really not an issue at all (we don't turn off the PC at work).
So you waste even more energy? How is that not an issue?
Even with a simple text editor, I wouldn't turn it off, because I don't feel like having to reopen every single window that was open the day before each morning. At best, I would put it in "hibernate" mode (or whatever that's called), i.e the RAM is still alive while the rest of the computer is off, so I don't have to reboot. That's what I usually do at home. I know it's not a very good habit, yet I am one of the most conscious at work. Some others don't even bother to turn off the screen.
Hibernate saves the RAM (and presumably other hardware state) to HDD and then turns the machine entirely off, RAM and all. Then, when you turn it back on, it just restores it all from the disk, which is much faster than letting everything go through the usual startup routines. It is pretty nifty. I don't use it personally because I've had problems with it (possibly b/c I'm on XP), but it is pretty clever.
I'm on XP SP3 too and it works. Maybe it's not Hibernate I use, because it doesn't save on disk: it's much faster than writing (or reading) 2 Gb on disk, and if I unplug, I need to reboot and it says that Windows wasn't turned off properly. I'm pretty certain the RAM is still on, and the rest of the computer is off. When I turn it on, it's ready in a matter of 2 or 3 seconds.
Dec 02 2011
parent "Nick Sabalausky" <a a.a> writes:
"Somedude" <lovelydear mailmetrash.com> wrote in message 
news:jbbkss$22n$1 digitalmars.com...
 Le 02/12/2011 23:44, Nick Sabalausky a écrit :
 "Somedude" <lovelydear mailmetrash.com> wrote in message
 news:jbbk0c$2ug3$1 digitalmars.com...
 Le 02/12/2011 23:27, Timon Gehr a écrit :
 On 12/02/2011 10:50 PM, Somedude wrote:
 Le 02/12/2011 22:44, Timon Gehr a écrit :
 It feels like 5 minutes if you are accustomed to open the text editor
 and start working.

 But I am sure there is something to IDE's, as many programmers seem 
 to
 like them.
The thing is, when you work in Java, you need 2Gb of RAM to be comfortable. Then you simply never close your IDE, so that's really not an issue at all (we don't turn off the PC at work).
So you waste even more energy? How is that not an issue?
Even with a simple text editor, I wouldn't turn it off, because I don't feel like having to reopen every single window that was open the day before each morning. At best, I would put it in "hibernate" mode (or whatever that's called), i.e the RAM is still alive while the rest of the computer is off, so I don't have to reboot. That's what I usually do at home. I know it's not a very good habit, yet I am one of the most conscious at work. Some others don't even bother to turn off the screen.
Hibernate saves the RAM (and presumably other hardware state) to HDD and then turns the machine entirely off, RAM and all. Then, when you turn it back on, it just restores it all from the disk, which is much faster than letting everything go through the usual startup routines. It is pretty nifty. I don't use it personally because I've had problems with it (possibly b/c I'm on XP), but it is pretty clever.
I'm on XP SP3 too and it works. Maybe it's not Hibernate I use because it doesn't save on disk: it's much faster than writing (or reading) 2 Gb on disk, and if I unplug, I need to reboot and it says that there windows wasn't turned off properly. I'm pretty certain the RAM is still on, and the rest of the computer is off. When I turn it on, it's ready in matters of 2 or 3 seconds.
That's just sleep mode then.
Dec 02 2011
prev sibling parent reply "Marco Leise" <Marco.Leise gmx.de> writes:
Am 02.12.2011, 22:50 Uhr, schrieb Somedude <lovelydear mailmetrash.com>:

 Le 02/12/2011 22:44, Timon Gehr a écrit :
 It feels like 5 minutes if you are accustomed to open the text editor
 and start working.

 But I am sure there is something to IDE's, as many programmers seem to
 like them.
The thing is, when you work in Java, you need 2Gb of RAM to be comfortable. Then you simply never close your IDE, so that's really not an issue at all (we don't turn off the PC at work).
Does that mean you have no excuse to go drink a coffee as the first step each morning?
Dec 02 2011
parent Somedude <lovelydear mailmetrash.com> writes:
Le 02/12/2011 23:36, Marco Leise a écrit :
 Am 02.12.2011, 22:50 Uhr, schrieb Somedude <lovelydear mailmetrash.com>:
 
 Does that mean you have no excuse to go drink a coffee as the first step
 each morning?
This reminds me of the xkcd where you see two developers playing knights with wooden swords while the project is compiling. That's pretty much accurate in C++, where full recompilation can take hours. That or waste time on the net. Not so much in Java. :)
Dec 02 2011
prev sibling parent reply "Jonathan M Davis" <jmdavisProg gmx.com> writes:
On Friday, December 02, 2011 22:44:41 Timon Gehr wrote:
 On 12/02/2011 10:38 PM, Marco Leise wrote:
 Am 02.12.2011, 21:50 Uhr, schrieb Timon Gehr <timon.gehr gmx.ch>:
 On 12/02/2011 09:44 PM, Timon Gehr wrote:
 Except that _Eclipse_ does not do anything to achieve this. It just
 invokes ant, which invokes javac, which is presumably written in C
 and
 C++.
Seems like I was wrong about this.
 I can do that in a console without waiting 5 minutes until the IDE
 has finished starting.
But this is still true.
No you are all wrong :p, it takes half a minute for Eclipse to start up. Yes, this is still more than a native executable would need. Still I use it for Java, D, JavaScript and PHP. It's awesome! I also sneak in Eclipse project files into github repositories to make the awesomeness available to others: https://github.com/aichallenge/aichallenge/tree/epsilon/eclipse_projects
It feels like 5 minutes if you are accustomed to open the text editor and start working. But I am sure there is something to IDE's, as many programmers seem to like them.
They can do wonders with code completion and making it easy to hop to declarations and the like. They also are often able to point out errors in your code as you're typing it, which can be quite helpful. I frequently miss many of the features that IDEs like eclipse have when I code (I do all of my coding in vim these days, regardless of the language). But I _really_ value the power that vim provides in terms of text editing, and I haven't found an IDE yet which I can get to emulate vim well enough to be acceptable in that regard, so I don't use them. I'd definitely like to though. - Jonathan M Davis
Dec 02 2011
next sibling parent reply Timon Gehr <timon.gehr gmx.ch> writes:
On 12/02/2011 11:28 PM, Jonathan M Davis wrote:
 On Friday, December 02, 2011 22:44:41 Timon Gehr wrote:
 On 12/02/2011 10:38 PM, Marco Leise wrote:
 Am 02.12.2011, 21:50 Uhr, schrieb Timon Gehr<timon.gehr gmx.ch>:
 On 12/02/2011 09:44 PM, Timon Gehr wrote:
 Except that _Eclipse_ does not do anything to achieve this. It just
 invokes ant, which invokes javac, which is presumably written in C
 and
 C++.
Seems like I was wrong about this.
 I can do that in a console without waiting 5 minutes until the IDE
 has finished starting.
But this is still true.
No you are all wrong :p, it takes half a minute for Eclipse to start up. Yes, this is still more than a native executable would need. Still I use it for Java, D, JavaScript and PHP. It's awesome! I also sneak in Eclipse project files into github repositories to make the awesomeness available to others: https://github.com/aichallenge/aichallenge/tree/epsilon/eclipse_projects
It feels like 5 minutes if you are accustomed to open the text editor and start working. But I am sure there is something to IDE's, as many programmers seem to like them.
They can do wonders with code completion and making it easy to hop to declarations and the like. They also are often able to point out errors in your code as you're typing it, which can be quite helpful. I frequently miss many of the features that IDEs like eclipse have when I code (I do all of my coding in vim these days, regardless of the language). But I _really_ value the power that vim provides in terms of text editing, and I haven't found an IDE yet which I can get to emulate vim well enough to be acceptable in that regard, so I don't use them. I'd definitely like to though. - Jonathan M Davis
I'm more an emacs guy, and I jump to declarations by (maybe C-x C-f filename ENTER) M-s \w+ identifier ENTER (and a few C-s for the occasional false positives), and I can use similar techniques to not only reach a specific declaration, but any specific position in the whole code. I don't think that it is any slower than always lifting your hands from the keyboard in order to be able to use the mouse and slow IDE functionality.
Dec 02 2011
next sibling parent reply Jeff Nowakowski <jeff dilacero.org> writes:
On 12/02/2011 05:38 PM, Timon Gehr wrote:
 I'm more an emacs guy, and I jump to declarations by (maybe C-x C-f
 filename ENTER) M-s \w+ identifier ENTER (and a few C-s for the
 occasional false positives), and I can use similar techniques to not
 only reach a specific declaration, but any specific position in the
 whole code. I don't think that it is any slower than always lifting your
 hands from the keyboard in order to be able to use the mouse and slow
 IDE functionality.
Try "F3" in Eclipse. You know, it has keyboard shortcuts too, and user-definable ones at that, and it is smart about Java. I'm an Emacs guy too, but Eclipse blows Emacs out of the water when it comes to programming in Java.
Dec 02 2011
next sibling parent Russel Winder <russel russel.org.uk> writes:
On Fri, 2011-12-02 at 17:54 -0500, Jeff Nowakowski wrote:
[...]
 I'm an Emacs guy too, but Eclipse blows Emacs out of the water when it
 comes to programming in Java.
No-one has mentioned NetBeans (which uses Ant, Maven or Gradle to manage the build) or IntelliJ IDEA (which, although "pay for", many people do buy because it is very good -- it has built-in compilation, but can also use Gradle, Maven, Ant, or Leiningen for the build), as well as Eclipse (which has built-in compilation or can use Gradle, Maven or Ant for the build). I am fundamentally an Emacs + Bash person, but these IDEs are getting to the stage where they are now doing things that make them usable. Actually for anything to do with Java ME or Android the IDEs are nigh on essential due to the connection to the simulators. They all take an age to start, they all take far too much memory (if you haven't got >8GB don't think of starting all three) and they are slow at times. The speed issue on typing is because of all the type checking, immediate parsing and manual checking that goes on. Despite the slow typing, I value the popup manual stuff -- though you have to set the timing right or it gets immensely annoying.
--
Russel.
===========================================================================
Dr Russel Winder      t: +44 20 7585 2200   voip: sip:russel.winder ekiga.net
41 Buckmaster Road    m: +44 7770 465 077   xmpp: russel russel.org.uk
London SW11 1EN, UK   w: www.russel.org.uk  skype: russel_winder
Dec 03 2011
prev sibling parent Russel Winder <russel russel.org.uk> writes:
On Fri, 2011-12-02 at 17:54 -0500, Jeff Nowakowski wrote:
[...]
 I'm an Emacs guy too, but Eclipse blows Emacs out of the water when it
 comes to programming in Java.
Forgot to mention that all of the IDEs are good for Python and C++ as well. Peter Sommerlad and his students have done well integrating the CUTE unit test framework into CDT (Eclipse C/C++ mode) along with creating a way of using SCons instead of make for the build.
--
Russel.
Dec 03 2011
prev sibling parent reply "Jonathan M Davis" <jmdavisProg gmx.com> writes:
On Friday, December 02, 2011 23:38:43 Timon Gehr wrote:
 On 12/02/2011 11:28 PM, Jonathan M Davis wrote:
 On Friday, December 02, 2011 22:44:41 Timon Gehr wrote:
 On 12/02/2011 10:38 PM, Marco Leise wrote:
 Am 02.12.2011, 21:50 Uhr, schrieb Timon Gehr<timon.gehr gmx.ch>:
 On 12/02/2011 09:44 PM, Timon Gehr wrote:
 Except that _Eclipse_ does not do anything to achieve this. It
 just
 invokes ant, which invokes javac, which is presumably written in
 C
 and
 C++.
Seems like I was wrong about this.
 I can do that in a console without waiting 5 minutes until the
 IDE
 has finished starting.
But this is still true.
No you are all wrong :p, it takes half a minute for Eclipse to start up. Yes, this is still more than a native executable would need. Still I use it for Java, D, JavaScript and PHP. It's awesome! I also sneak in Eclipse project files into github repositories to make the awesomeness available to others: https://github.com/aichallenge/aichallenge/tree/epsilon/eclipse_projects
It feels like 5 minutes if you are accustomed to open the text editor and start working. But I am sure there is something to IDE's, as many programmers seem to like them.
They can do wonders with code completion and making it easy to hop to declarations and the like. They also are often able to point out errors in your code as you're typing it, which can be quite helpful. I frequently miss many of the features that IDEs like eclipse have when I code (I do all of my coding in vim these days, regardless of the language). But I _really_ value the power that vim provides in terms of text editing, and I haven't found an IDE yet which I can get to emulate vim well enough to be acceptable in that regard, so I don't use them. I'd definitely like to though. - Jonathan M Davis
I'm more an emacs guy, and I jump to declarations by (maybe C-x C-f filename ENTER) M-s \w+ identifier ENTER (and a few C-s for the occasional false positives), and I can use similar techniques to not only reach a specific declaration, but any specific position in the whole code. I don't think that it is any slower than always lifting your hands from the keyboard in order to be able to use the mouse and slow IDE functionality.
It's more a question of functionality. I cannot acceptably jump to declarations in vim _period_. Stuff like ctags and cscope absolutely suck in comparison to a decent IDE, and AFAIK that's all vim really has for enabling the ability to do stuff like jump to declarations. I don't know if emacs uses the same underlying programs or whether it does it on its own, so I don't know how it compares. I'd gladly be hopping to declarations using vim with whatever shortcut it is if it could actually do it right, but ctags just isn't smart enough to do it accurately based on function overloading and the like, and I have to constantly worry about updating it, making sure that the vim instance that I'm using points to the right ctags file, etc. It just isn't acceptable IMHO, so I don't bother with it, but vim's other benefits outweigh the overall benefits of the IDE for me, so I still use vim. In any case, what I listed were just examples of what a good IDE can do. There's plenty of other stuff that you'd probably miss if you had been heavily using them in an IDE before. But I can completely understand thinking that a solid text editor is overall better than an IDE, since I use vim for precisely that reason. - Jonathan M Davis
Dec 02 2011
parent "Nick Sabalausky" <a a.a> writes:
"Jonathan M Davis" <jmdavisProg gmx.com> wrote in message 
news:mailman.1262.1322866645.24802.digitalmars-d puremagic.com...
 It's more a question of functionality. I cannot acceptably jump to
 declarations in vim _period_. Stuff like ctags and cscope absolutely suck 
 in
 comparison to a decent IDE, and AFAIK that's all vim really has for 
 enabling
 the ability to do stuff like jump to declarations. I don't know if emacs 
 uses
 the same underlying programs or whether it does it on its own, so I don't 
 know
 how it compares. I'd gladly be hopping to declarations using vim with 
 whatever
 shortcut it is if it could actually do it right, but ctags just isn't 
 smart
 enough to do it accurately based on function overloading and the like, and 
 I
 have to constantly worry about updating it, making sure that the vim 
 instance
 that I'm using points to the right ctags file, etc. It just isn't 
 acceptable
 IMHO, so I don't bother with it, but vim's other benefits outweigh the 
 overall
 benefits of the IDE for me, so I still use vim.

 In any case, what I listed were just examples of what a good IDE can do.
 There's plenty of other stuff that you'd probably miss if you had been 
 heavily
 using them in an IDE before. But I can completely understand thinking that 
 a
 solid text editor is overall better than an IDE, since I use vim for 
 precisely
 that reason.
I think the line between IDE and text editor is really blurring these days. Things like Eclipse and VS.NET are clearly IDE's yea, but then there's a lot of stuff like Programmer's Notepad 2, Code::Blocks, etc, that are not quite as fully-featured as Eclipse/VS, but yet they're much more like lightweight IDEs than text editors per se.
Dec 02 2011
prev sibling next sibling parent Somedude <lovelydear mailmetrash.com> writes:
Le 02/12/2011 23:28, Jonathan M Davis a écrit :
 On Friday, December 02, 2011 22:44:41 Timon Gehr wrote:
 On 12/02/2011 10:38 PM, Marco Leise wrote:
 Am 02.12.2011, 21:50 Uhr, schrieb Timon Gehr <timon.gehr gmx.ch>:
 On 12/02/2011 09:44 PM, Timon Gehr wrote:
 Except that _Eclipse_ does not do anything to achieve this. It just
 invokes ant, which invokes javac, which is presumably written in C
 and
 C++.
Seems like I was wrong about this.
 I can do that in a console without waiting 5 minutes until the IDE
 has finished starting.
But this is still true.
No you are all wrong :p, it takes half a minute for Eclipse to start up. Yes, this is still more than a native executable would need. Still I use it for Java, D, JavaScript and PHP. It's awesome! I also sneak in Eclipse project files into github repositories to make the awesomeness available to others: https://github.com/aichallenge/aichallenge/tree/epsilon/eclipse_projects
It feels like 5 minutes if you are accustomed to open the text editor and start working. But I am sure there is something to IDE's, as many programmers seem to like them.
They can do wonders with code completion and making it easy to hop to declarations and the like. They also are often able to point out errors in your code as you're typing it, which can be quite helpful. I frequently miss many of the features that IDEs like eclipse have when I code (I do all of my coding in vim these days, regardless of the language). But I _really_ value the power that vim provides in terms of text editing, and I haven't found an IDE yet which I can get to emulate vim well enough to be acceptable in that regard, so I don't use them. I'd definitely like to though. - Jonathan M Davis
The thing is, since it compiles while you type, refactoring is mostly safe because it has all the references in memory. Another cool thing with bytecode (not related to IDEs, but they can take advantage of it) is, bytecode reversing is VERY effective. You get pretty much the original code from the bytecode unless it has been obfuscated on purpose.
Dec 02 2011
prev sibling parent simendsjo <simendsjo gmail.com> writes:
On 02.12.2011 23:28, Jonathan M Davis wrote:
 But I_really_  value
 the power that vim provides in terms of text editing, and I haven't found an
 IDE yet which I can get to emulate vim well enough to be acceptable in that
 regard, so I don't use them. I'd definitely like to though.
I'm using ViEmu for VS. Works pretty well. http://www.viemu.com/
Dec 02 2011
prev sibling parent Jacob Carlborg <doob me.com> writes:
On 2011-12-02 21:50, Timon Gehr wrote:
 On 12/02/2011 09:44 PM, Timon Gehr wrote:
 Except that _Eclipse_ does not do anything to achieve this. It just
 invokes ant, which invokes javac, which is presumably written in C and
 C++.
Seems like I was wrong about this.
Eclipse has its own Java compiler, completely written in Java. If you're using Eclipse you don't need JDK, you only need the JRE because Eclipse ships with a Java compiler.
 I can do that in a console without waiting 5 minutes until the IDE
 has finished starting.
But this is still true.
-- /Jacob Carlborg
Dec 04 2011
prev sibling parent reply Somedude <lovelydear mailmetrash.com> writes:
Le 02/12/2011 21:44, Timon Gehr a écrit :
 In what way is Eclipse sluggish ? The Java language is slower than C++,
 but Eclipse happily compiles hundreds of thousands of lines or millions
 of lines of Java code in a few seconds or at most tens of seconds. Try
 to do that even with C, not talking about C++.
Except that _Eclipse_ does not do anything to achieve this. It just invokes ant, which invokes javac, which is presumably written in C and C++. I can do that in a console without waiting 5 minutes until the IDE has finished starting.
Wow. Sorry, but that's wrong on all counts. Eclipse integrates its own compiler, which isn't javac; it's an incremental compiler that was written by IBM, part of JDT Core: http://www.eclipse.org/jdt/core/index.php And I would believe it doesn't invoke Ant at all, as Eclipse manages the project itself. There is no Ant file anywhere in your project AFAIK. And finally, the javac compiler is written in Java, not in C nor C++, and I believe that's been true since the first release. Only the JVM is written in C.
 Furthermore, if you add up all those startup times of every java
 application and add that to the total time used for compilation I am not
 convinced the compilation time/LOC ratio is still that impressive (yes,
 JIT compilation is compilation too!)
 
Having programmed as a contractor for years on projects of hundreds of KLOC both in C++ and in Java, what I can recall is, in C++ turnaround is a PITA because of heavy recompilation, even when using idioms like PIMPL. It's one of the main issues of this language. While in Java, the compilation time is near zero. The launch time of applications entirely depends on what you do with them: if it has to open several DB connections to initialize itself, yes it's sluggish, but that doesn't have anything to do with the language, rather with the application. In the end, you may want to use a local or an embedded database for testing. I've seen C++ applications with starting times just as awful, for the same reasons. And JIT compilation, you don't feel it, so it doesn't matter.
 The fact is, you are more productive in Java than in C++ by nearly an
 order of magnitude.
 Because:
 0) the language is easier, has far less idiosyncrasies
Simpler language implies higher complexity to adapt it to your problem domain. It is a matter of trade-offs (but I am not the one to argue that C++ got all of those right.)
 1) the IDEs are extremely helpful,
I don't usually program in Java, but when I do, I use a simple text editor.
If you are doing serious work with this language, you're simply wasting your time with a simple text editor.
 2) the API is extremely complete and reliable
Yes it is, and that is certainly a good thing, but: result = x.add(y.multiply(BigInteger.valueOf(7))).pow(3).abs().setBit(27); ExtremelyDetailedClassName extremelyDetailedClassName = new ExtremelyDetailedClassName().
I agree it's annoying and ugly, I hate it as much as you. But in the end, in terms of productivity, that doesn't matter much. The coding time is a small part of the coder's activity time. However, I agree a badly designed API can make you lose a lot of time.
 I guess you get used to it, and that those things are the reason why the
 IDE is extremely helpful.
 
Yes, the IDE takes care of a lot of boilerplate code. It's ugly, but it's hardly a productivity issue. One other thing that's cool is refactoring is no longer an issue, like it is in C or C++. With powerful IDEs, you can refactor without fearing too much regression, and that's a very important advantage, especially in a heavily OO language.
 3) there are libraries for nearly everything
More than for C?
It's hard to compare, as C and Java are definitely not targeted at the same kind of applications, but certainly much more than for C++. And the good thing is, many of them are of high quality. If what you do is server-side applications, the Java ecosystem is second to none. On complex command line tools, it's also adequate and can show some very good performance. On other uses, it depends on the requirements.
 
 4) debugging is usually easier than in C++
Yes, and that is a big win for productivity.
 5) you have less bugs (especially, hard to find bugs like uninitialized
 variables for instance, or race conditions)
You can have race conditions perfectly fine in Java code. The Java memory model is very complicated. There has long been a common practice of using double-checked locking, e.g. for singleton initialization while avoiding inefficient volatile variables. Until somebody figured out that the Java memory model does not actually support the idiom, and that all the code that did the supposedly right and clever thing was buggy. ;)
Yes, I agree you can. But you most often want to avoid using low level API and only resort to high level synchronization that the JDK offers, which greatly reduces risks. D has the same policy, to an even better extent.
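For readers who haven't seen it, the idiom in question looks like this; this is a sketch of my own showing the post-Java-5 corrected form (the `volatile` is exactly what the broken pre-Java-5 version omitted):

```java
// Double-checked locking: lazily create a singleton while taking the
// lock only on the slow path. Before Java 5 this was broken no matter
// how you wrote it; since the revised memory model, marking the field
// 'volatile' makes it correct.
public class Lazy {
    private static volatile Lazy instance; // 'volatile' is the essential fix

    private Lazy() {}

    public static Lazy getInstance() {
        Lazy local = instance;           // one volatile read on the fast path
        if (local == null) {
            synchronized (Lazy.class) {  // slow path: at most one thread constructs
                local = instance;
                if (local == null) {
                    instance = local = new Lazy();
                }
            }
        }
        return local;
    }
}
```

Without the `volatile`, the write `instance = new Lazy()` could be reordered so that another thread saw a non-null reference to a not-yet-fully-constructed object, which is exactly the bug described above.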
 6) porting is easier
 7) it is safer in the sense that you have less security holes

 These qualities largely compensate the defects and shortcomings of the
 language, and I can attest from experience that because of its massive
 toolset and libraries
I agree. If productivity is important and efficiency is not an issue, then writing the project in Java is maybe a better option than writing it in C++ (especially if you can't get hold of a decent number of good C++ programmers to carry out the project, which is hard). But writing Java code is not very pleasant imho. (But there are other fine languages, like the one we like to discuss here ;))
 as well as static typing,
Java is halfway dynamically checked. Did you know that every field update of an array of class references performs a downcast check?
Yes. That's because of type erasure. Java containers only contain Object objects, for backward compatibility with Java 1.4 and earlier, which didn't have generics. Type erasure is a pain, even the language designers agree on this, but they didn't have a choice if they wanted to avoid forcing hundreds of millions of lines of customer code to be rewritten.
 Java is comparable in productivity with Python.
 Besides, with a little attention to what you do, you can extract very
 decent performance out of it.
How many Java programmers know what to pay attention to?
I don't know, but don't assume Java programmers are all stupid. It is as wrong an idea as saying all Indian programmers are lousy. There are far more Java (or Indian) programmers than D programmers, so even though a majority are the average Joe (and I count myself in them), many of them are good. Given the state of the industry, suffice it to say there are jobs and good pay to attract all sorts of programmers, good and bad alike. It's just the bell curve applied to a larger community. What I can say, though, is that a bad C++ programmer can cause much worse problems than a bad Java programmer.
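One concrete example (mine, not from the post) of the kind of thing to pay attention to: accidental autoboxing in a hot loop allocates a wrapper object per iteration, where the primitive version allocates nothing at all:

```java
// Same arithmetic, very different allocation behaviour: the boxed Long
// accumulator forces an unbox/rebox (and usually a heap allocation)
// on every iteration.
public class Boxing {
    public static long boxedSum(int n) {
        Long sum = 0L;                 // boxed: each += creates a new Long
        for (int i = 0; i < n; i++) sum += i;
        return sum;
    }

    public static long primitiveSum(int n) {
        long sum = 0L;                 // primitive: stays in a register
        for (int i = 0; i < n; i++) sum += i;
        return sum;
    }

    public static void main(String[] args) {
        // Results are identical; only the hidden allocations differ.
        System.out.println(boxedSum(1000) == primitiveSum(1000)); // prints "true"
    }
}
```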
 For instance embedded Java databases like
 H2 and HSQLDB are demonstrably faster than MySQL and PostgreSQL or
 Oracle on small to average sized disk-based databases, and they were
 written by a single guy.
 In many environments where it is massively used, Java is *not* the
 bottleneck, the JVM is fast enough. Rather the network or the database
 are. This is enough to convince most companies to invest massively into
 Java. So saying that Java is a toy language is ridiculous.
Java means a lot of different things:
- Language
- Libraries
- Security Model
- Virtual Machine
- ...
The issue is that in discussions about Java, this often leads to misunderstandings.
Yes.
Dec 02 2011
next sibling parent Timon Gehr <timon.gehr gmx.ch> writes:
On 12/02/2011 10:40 PM, Somedude wrote:
 Le 02/12/2011 21:44, Timon Gehr a écrit :
 In what way is Eclipse sluggish ? The Java language is slower than C++,
 but Eclipse happily compiles hundreds of thousands of lines or millions
 of lines of Java code in a few seconds or at most tens of seconds. Try
 to do that even with C, not talking about C++.
Except that _Eclipse_ does not do anything to achieve this. It just invokes ant, which invokes javac, which is presumably written in C and C++. I can do that in a console without waiting 5 minutes until the IDE has finished starting.
Wow. Sorry, but that's wrong on all counts. Eclipse integrates its own compiler, which isn't javac; it's an incremental compiler that was written by IBM, part of JDT Core: http://www.eclipse.org/jdt/core/index.php And I would believe it doesn't invoke ant at all, as Eclipse manages the project itself. There is no ant file anywhere in your project AFAIK. And finally, the javac compiler is written in Java, not in C nor C++. And I believe that's been true since the first release. Only the JVM is written in C.
Yes, I figured that out about one hour ago and corrected my statement ;).
 Furthermore, if you add up all those startup times of every java
 application and add that to the total time used for compilation I am not
 convinced the compilation time/LOC ratio is still that impressive (yes,
 JIT compilation is compilation too!)
Having programmed as a contractor for years on projects of hundreds of KLOC both in C++ and in Java, what I can recall is, in C++ turnaround is a PITA because of heavy recompilation, even when using idioms like PIMPL. It's one of the main issues of this language. While in Java, the compilation time is near zero. The launch time of applications entirely depends on what you do with them: if it has to open several DB connections to initialize itself, yes it's sluggish, but that doesn't have anything to do with the language, rather with the application. In the end, you may want to use a local or an embedded database for testing. I've seen C++ applications with starting times just as awful, for the same reasons.
Yes. You can write sluggish code in any language.
 And JIT compilation, you don't feel it, so it doesn't matter.
It is a waste of resources/energy imho. The fact that those are cheaper than programmers does not make it right.
 The fact is, you are more productive in Java than in C++ by nearly an
 order of magnitude.
 Because:
 0) the language is easier, has far less idiosyncrasies
Simpler language implies higher complexity to adapt it to your problem domain. It is a matter of trade-offs (but I am not the one to argue that C++ got all of those right.)
 1) the IDEs are extremely helpful,
I don't usually program in Java, but when I do, I use a simple text editor.
If you are doing serious work with this language, you're simply wasting your time with a simple text editor.
That is why I don't think it is pleasant to program in Java.
 2) the API is extremely complete and reliable
Yes it is, and that is certainly a good thing, but:
result = x.add(y.multiply(BigInteger.valueOf(7))).pow(3).abs().setBit(27);
ExtremelyDetailedClassName extremelyDetailedClassName = new ExtremelyDetailedClassName();
I agree it's annoying and ugly, I hate it as much as you. But in the end, in terms of productivity, that doesn't matter much. The coding time is a small part of the coder's activity time. However, I agree a badly designed API can make you lose a lot of time.
 I guess you get used to it, and that those things are the reason why the
 IDE is extremely helpful.
Yes, the IDE takes care of a lot of boilerplate code. It's ugly, but it's hardly a productivity issue. One other thing that's cool is that refactoring is no longer an issue, like it is in C or C++. With powerful IDEs, you can refactor without fearing regressions too much, and that's a very important advantage, especially in a heavily OO language.
 3) there are libraries for nearly everything
More than for C?
It's hard to compare, as C and Java are definitely not targeted at the same kind of applications, but certainly much more than for C++.
C++ libraries are a superset of C libraries.
 And the good thing is, many of them are of high quality. If what you do is
 server-side applications, the Java ecosystem is second to none. On complex
 command line tools, it's also adequate and can show some very good
 performance. On other uses, it depends on the requirements.

 4) debugging is usually easier than in C++
Yes, and that is a big win for productivity.
 5) you have less bugs (especially, hard to find bugs like uninitialized
 variables for instance, or race conditions)
You can have race conditions perfectly fine in Java code. The Java memory model is very complicated. There has long been a common practice of using double-checked locking, e.g. for singleton initialization while avoiding inefficient volatile variables. Until somebody figured out that the Java memory model does not actually support the idiom, and that all the code that did the supposedly right and clever thing was buggy. ;)
Yes, I agree you can. But you most often want to avoid using low level API and only resort to high level synchronization that the JDK offers, which greatly reduces risks. D has the same policy, to an even better extent.
 6) porting is easier
 7) it is safer in the sense that you have less security holes

 These qualities largely compensate the defects and shortcomings of the
 language, and I can attest from experience that because of its massive
 toolset and libraries
I agree. If productivity is important and efficiency is not an issue, then writing the project in Java is maybe a better option than writing it in C++ (especially if you can't get hold of a decent number of good C++ programmers to carry out the project, which is hard). But writing Java code is not very pleasant imho. (But there are other fine languages, like the one we like to discuss here ;))
 as well as static typing,
Java is halfway dynamically checked. Did you know that every field update of an array of class references performs a downcast check?
Yes. That's because of type erasure. Java containers only contain Object objects, for backward compatibility with Java 1.4 and earlier, which didn't have generics.
I think it is unrelated to type erasure in this case, because arrays are not generic. The issue is that they are treated as covariant, which is _not_ statically type safe. Generics solve the same problem in a much nicer way (wildcards ftw!), but there you get the type check because of type erasure. So you get the worst possible solution: dynamic checks on arrays because there were no proper generics some time ago, and dynamic checks on generics, for the same reason. A similar type check happens when you assign an object to an interface reference IIRC, because the bytecode verifier cannot check multiple subtyping relationships.
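The array covariance problem described above can be demonstrated in a few lines (a sketch of mine): the compiler accepts the aliasing, so the store has to be checked dynamically instead.

```java
// Arrays are covariant in Java: String[] is a subtype of Object[].
// That makes the assignment below legal at compile time, so every
// store into 'objs' carries a hidden run-time element-type check.
public class Covariance {
    public static boolean storeIsCheckedAtRuntime() {
        Object[] objs = new String[1];     // legal, but dangerous aliasing
        try {
            objs[0] = Integer.valueOf(42); // compiles fine...
            return false;
        } catch (ArrayStoreException e) {
            return true;                   // ...but the hidden check fires here
        }
    }

    public static void main(String[] args) {
        System.out.println(storeIsCheckedAtRuntime()); // prints "true"
        // The generics equivalent is rejected at compile time instead:
        //   java.util.List<Object> l = new java.util.ArrayList<String>(); // error
    }
}
```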
 Type erasure is a pain, even the language
 designers agree on this, but they didn't have a choice if they wanted
 to avoid forcing hundreds of millions of lines of customer code to be rewritten.
I would rather have forked the language, but I understand why they did not.
 Java is comparable in productivity with Python.
 Besides, with a little attention to what you do, you can extract very
 decent performance out of it.
How many Java programmers know what to pay attention to?
I don't know, but don't assume Java programmers are all stupid.
I certainly don't.
 It is as wrong an idea as saying all Indian programmers are lousy. There are far
 more Java (or Indian) programmers than D programmers, so even though a
 majority are the average Joe (and I count myself in them), many of them
 are good. Given the state of the industry, suffice it to say there are
 jobs and good pay to attract all sorts of programmers, good and bad
 alike. It's just the bell curve applied to a larger community.
A good programmer can develop efficiently in any halfway decent language.
 What I can say though, is a bad C++ programmer can cause much worse
 problems than a bad Java programmer.
That is true, and I don't particularly like C++ either. It is kinda nice to exploit some special case to confuse programmers who like C++ and thought they knew C++ though ;) For example:

#define NAME someValidIdentifier
struct B{ int foo; };
template<class T> int NAME(T x){ bool y = x.foo<1>(0); return y; }
int main(){ int c = NAME(B()); }

For what definitions of NAME does the program compile?
 For instance embedded Java databases like
 H2 and HSQLDB are demonstrably faster than MySQL and PostgreSQL or
 Oracle on small to average sized disk-based databases, and they were
 written by a single guy.
 In many environments where it is massively used, Java is *not* the
 bottleneck, the JVM is fast enough. Rather the network or the database
 are. This is enough to convince most companies to invest massively into
 Java. So saying that Java is a toy language is ridiculous.
Java means a lot of different things:
- Language
- Libraries
- Security Model
- Virtual Machine
- ...
The issue is that in discussions about Java, this often leads to misunderstandings.
Yes.
Dec 02 2011
prev sibling next sibling parent reply "Nick Sabalausky" <a a.a> writes:
"Somedude" <lovelydear mailmetrash.com> wrote in message 
news:jbbglp$2cp0$1 digitalmars.com...
 While in Java, the
 compilation time is near zero.
If you're using Eclipse, in which case the cost isn't gone at all, it's simply shifted to slowed down interaction with the IDE.
 The launch time of applications entirely
 depends on what you do with them: if it has to open several DB
 connections to initialize itself, yes it's sluggish, but that doesn't
 have anything to do with the language, rather with the application.
So how many hundreds of DB connections is Eclipse apparently opening upon startup?
 And JIT compilation, you don't feel it, so it doesn't matter.
Yea you do.
Dec 02 2011
parent reply Somedude <lovelydear mailmetrash.com> writes:
Le 02/12/2011 23:25, Nick Sabalausky a écrit :
 "Somedude" <lovelydear mailmetrash.com> wrote in message 
 news:jbbglp$2cp0$1 digitalmars.com...
 While in Java, the
 compilation time is near zero.
If you're using Eclipse, in which case the cost isn't gone at all, it's simply shifted to slowed down interaction with the IDE.
No.
 The launch time of applications entirely
 depends on what you do with them: if it has to open several DB
 connections to initialize itself, yes it's sluggish, but that doesn't
 have anything to do with the language, rather with the application.
So how many hundreds of DB connections is Eclipse apparently opening upon startup?
 And JIT compilation, you don't feel it, so it doesn't matter.
Yea you do.
No.
Dec 03 2011
parent reply Don <nospam nospam.com> writes:
On 03.12.2011 21:45, Somedude wrote:
 Le 02/12/2011 23:25, Nick Sabalausky a écrit :
 "Somedude"<lovelydear mailmetrash.com>  wrote in message
 news:jbbglp$2cp0$1 digitalmars.com...
 While in Java, the
 compilation time is near zero.
If you're using Eclipse, in which case the cost isn't gone at all, it's simply shifted to slowed down interaction with the IDE.
No.
 The launch time of applications entirely
 depends on what you do with them: if it has to open several DB
 connections to initialize itself, yes it's sluggish, but that doesn't
 have anything to do with the language, rather with the application.
So how many hundreds of DB connections is Eclipse apparently opening upon startup?
 And JIT compilation, you don't feel it, so it doesn't matter.
Yea you do.
No.
If you work in an environment where practically all apps are fast, Eclipse stands out as being slow. The startup time is particularly striking. I don't see any reason for this. Mostly when you open an IDE you want to first open a few files, look at them, maybe do some editing. It ought to be possible to do that within 2 secs of starting the IDE, while everything else continues to load. It's unusual to perform a major refactoring of your code base within 10 secs of opening your IDE, but it seems you can't do anything at all, until everything has been loaded.
Dec 03 2011
next sibling parent Jonathan M Davis <jmdavisProg gmx.com> writes:
On Sunday, December 04, 2011 03:40:18 Don wrote:
 On 03.12.2011 21:45, Somedude wrote:
 Le 02/12/2011 23:25, Nick Sabalausky a écrit :
 "Somedude"<lovelydear mailmetrash.com>  wrote in message
 news:jbbglp$2cp0$1 digitalmars.com...
 While in Java, the
 compilation time is near zero.
If you're using Eclipse, in which case the cost isn't gone at all, it's simply shifted to slowed down interaction with the IDE.
No.
 The launch time of applications entirely
 depends on what you do with them: if it has to open several DB
 connections to initialize itself, yes it's sluggish, but that doesn't
 have anything to do with the language, rather with the application.
So how many hundreds of DB connections is Eclipse apparently opening upon startup?
 And JIT compilation, you don't feel it, so it doesn't matter.
Yea you do.
No.
 If you work in an environment where practically all apps are fast, Eclipse stands out as being slow. The startup time is particularly striking. I don't see any reason for this. Mostly when you open an IDE you want to first open a few files, look at them, maybe do some editing. It ought to be possible to do that within 2 secs of starting the IDE, while everything else continues to load. It's unusual to perform a major refactoring of your code base within 10 secs of opening your IDE, but it seems you can't do anything at all, until everything has been loaded.
It's certainly a problem if the IDE loads slowly, but in my experience, most people open it and leave it open, so while it _is_ annoying when you open it, and it _does_ give the IDE a bad first impression, the load time often really doesn't matter much as far as really affecting normal work goes.
- Jonathan M Davis
Dec 03 2011
prev sibling parent reply Somedude <lovelydear mailmetrash.com> writes:
Le 04/12/2011 03:40, Don a écrit :
 If you work in an environment where practically all apps are fast,
 Eclipse stands out as being slow. The startup time is particularly
 striking.
 I don't see any reason for this. Mostly when you open an IDE you want to
 first open a few files, look at them, maybe do some editing.
 It ought to be possible to do that within 2 secs of starting the IDE,
 while everything else continues to load.
 It's unusual to perform a major refactoring of your code base within 10
 secs of opening your IDE, but it seems you can't do anything at all,
 until everything has been loaded.
 
I stopped bothering to respond to Nick Sabalausky, as obviously, he is not trying to discuss, he just throws his opinions around without any substance. As for startup time, who cares really, as you open it only once and leave it open afterwards? As Jonathan and I have said now at least 3 times, you don't close it as it's your primary tool. And the reason it's slow is, at startup time, it loads:
- the GUI toolkit SWT and the interface manager
- the customized interface (called "perspective" in eclipse)
- hundreds of plugins
- the compiler
- your open projects
- all the files that were open last time
As you may have noticed, almost all the tools that have their own non-native GUI toolkit are slower to load. Any Gtk tool for instance. Even worse when you have plugins. Try to start the Gimp or Photoshop, and tell me if it's fast. And Emacs is slow to start as well. But who cares really? They are not meant to be started 10 times a day. On my C2D, a fresh install of eclipse Indigo starts in about 12 seconds, with 340 plugins totaling 138 Mb in the plugins directory, most of them being actually loaded at startup time. Apart from that, eclipse happily handles projects with 2 million lines without a sweat on an average PC, so no, I don't think it's sluggish. If it *was* the sluggish chore Nick Sabalausky pretends it to be, eclipse wouldn't have been chosen as the main platform by Zend, Adobe Flex, QNX, Altera, Aptana, etc. for their own products, there wouldn't be more than 5 million downloads for each release of the Java platform alone (i.e. not counting all the said customizations for other languages), and Java users would instead flock to Netbeans or Idea, which both have their strengths and are free IDEs as well.
Now Idea (also written in Java) has a reputation for being actually a bit snappier (at the cost of a much longer startup time, however, because Idea constructs the code index before opening the project while eclipse does it in the background) as well as having many more functionalities, but I personally haven't had the urge to switch so far. Conclusion on this pretty boring subject: Eclipse being slow is about as old a rant as saying Java is slow.
Dec 03 2011
next sibling parent Jacob Carlborg <doob me.com> writes:
On 2011-12-04 06:09, Somedude wrote:
 Le 04/12/2011 03:40, Don a écrit :
 If you work in an environment where practically all apps are fast,
 Eclipse stands out as being slow. The startup time is particularly
 striking.
 I don't see any reason for this. Mostly when you open an IDE you want to
 first open a few files, look at them, maybe do some editing.
 It ought to be possible to do that within 2 secs of starting the IDE,
 while everything else continues to load.
 It's unusual to perform a major refactoring of your code base within 10
 secs of opening your IDE, but it seems you can't do anything at all,
 until everything has been loaded.
I stopped bothering to respond to Nick Sabalausky, as obviously, he is not trying to discuss, he just throws his opinions around without any substance. As for startup time, who cares really, as you open it only once and leave it open afterwards? As Jonathan and I have said now at least 3 times, you don't close it as it's your primary tool. And the reason it's slow is, at startup time, it loads:
- the GUI toolkit SWT and the interface manager
- the customized interface (called "perspective" in eclipse)
- hundreds of plugins
- the compiler
- your open projects
- all the files that were open last time
As you may have noticed, almost all the tools that have their own non-native GUI toolkit are slower to load. Any Gtk tool for instance. Even worse when you have plugins. Try to start the Gimp or Photoshop, and tell me if it's fast.
Eclipse uses SWT as its GUI toolkit which uses native widgets. -- /Jacob Carlborg
Dec 04 2011
prev sibling next sibling parent reply Timon Gehr <timon.gehr gmx.ch> writes:
On 12/04/2011 06:09 AM, Somedude wrote:
 [...] And Emacs is slow to start as well. [...]
oO? No. It is up and running in less than 0.3s. Have you tested it?
Dec 04 2011
parent Somedude <lovelydear mailmetrash.com> writes:
Le 04/12/2011 15:45, Timon Gehr a écrit :
 On 12/04/2011 06:09 AM, Somedude wrote:
 [...] And Emacs is slow to start as well. [...]
oO? No. It is up and running in less than 0.3s. Have you tested it?
See ? I can be as wrong as Nick Sabalausky when I base my prejudice from dated experience. Haven't launched it for at least 15 years.
Dec 04 2011
prev sibling parent reply "Nick Sabalausky" <a a.a> writes:
"Somedude" <lovelydear mailmetrash.com> wrote in message 
news:jbeva3$2785$1 digitalmars.com...
 Le 04/12/2011 03:40, Don a écrit :
 If you work in an environment where practically all apps are fast,
 Eclipse stands out as being slow. The startup time is particularly
 striking.
 I don't see any reason for this. Mostly when you open an IDE you want to
 first open a few files, look at them, maybe do some editing.
 It ought to be possible to do that within 2 secs of starting the IDE,
 while everything else continues to load.
 It's unusual to perform a major refactoring of your code base within 10
 secs of opening your IDE, but it seems you can't do anything at all,
 until everything has been loaded.
I stopped bothering to respond to Nick Sabalausky, as obviously, he is not trying to discuss, he just throws his opinions around without any substance.
Sounds like some dude I know...
 As for startup time, who cares really, as you open it only once and
 leave it open afterwards ? As Jonathan and I have said now at least 3
 times, you don't close it as it's your primary tool.
And then every time I work on something else and don't want Eclipse continuing to suck up half my resources? I'm expected to just leave it running anyway?
 And the reason it's
 slow is, at startup time, it loads:
 - the GUI toolkit SWT and the interface manager
If SWT is slow to load, that's another strike against it, not a defense.
 - the customized interface (called "perspective" in eclipse)
Although SWT uses native widgets, Eclipse does seem to do a lot of non-standard stuff, too, like the oversized clearly-non-native tabs. So I don't know how much this affects performance. But even if it does, that's just another strike against Eclipse. I don't want non-native, *especially* if it slows things down.
 - hundreds of plugins
If it needs that many plug-ins then something is very, very wrong. For example, maybe some of those should be built-in. And if you meant "hundreds" literally (but I'll give you the benefit of the doubt that you're just exaggerating since it definitely sounds like an exaggeration), then..."Wow, that's just insane. What, does it need a separate plugin for each letter of the alphabet it supports?".
 - the compiler
That should only be needed if you're using the compile-as-you-type feature (which I'd rather not since it slows down basic typing and UI interaction to an unacceptable degree), and on a language for which Eclipse supports it.
 - your open projects
 - all the files that were open last time
See, I don't even want that anyway. I don't like how Eclipse insists on keeping every project I've ever touched open all the time. And automatically resuming the last session, while a nice feature for those who want it, is not something I've ever personally felt a need for. So 1. I'm supposed to pay the price for that? and 2. Seriously, how long does it take to open a few text files?
 On my C2D, a
 fresh install of eclipse Indigo starts in about 12 seconds,
I assume you're on some sort of 10GB multi-core machine as most Java users have to be on, in which case: 12 sec startup is ridiculously slow. Even on 2GB x64 dual-core, that's still very, very slow.
 with 340
 plugins totaling 138 Mb in the plugins directory, most of them being
 actually loaded at startup time.
Oh my god, you were actually serious about "hundreds"...?!? Most of them being needed all the time? (Then why the hell are they plug-ins in the first place?) I knew there was something wrong about how Eclipse was designed, and this just proves it.
 Apart from that, eclipse happily handles projects with 2 million lines
 without a sweat on an average PC, so no, I don't think it's sluggish.
It's certainly sluggish compared to Scintilla-based programs. Even with all the fancy stuff turned off (which I have tried - it does make a difference, but not enough).
 If it *was* the sluggish chore Nick Sabalausky pretends it to be,
 eclipse wouldn't be chosen as the main platform by Zend, Adobe Flex,
 QNX, Altera, Aptana, etc for their own product, there wouldn't be more
 than 5 million downloads for each release of the Java platform only (i.e
 not counting all the said customisations for other languages), and Java
 users would instead flock to Netbeans or Idea, which both have their
 strengths and are free IDEs as well.
Argumentum ad populum, huh? That's one of the worst fallacies I've ever heard. "If Nazis weren't right there wouldn't have been so many of them", huh? There's so many things wrong with that argument it's not even worth validating it by going into them.
 Conclusion on this pretty boring subject: Eclipse being slow is about as
 old a rant as saying Java is slow.
Saying they're slow may be old, but it's still true no matter how stubbornly you refuse to acknowledge it.
Dec 04 2011
next sibling parent reply Mirko Pilger <mirko.pilger gmail.com> writes:
 I assume you're on some sort of 10GB multi-core machine as most Java users
 have to be on, in which case: 12 sec startup is ridiculously slow. Even on
 2GB x64 dual-core, that's still very, very slow.
Just to throw some numbers in: out of curiosity I have done a fresh install of eclipse on a 2GB, 3GHz, dual-core, x86, Windows XP SP3 machine. Starting up and opening an empty project takes about 4 sec with eclipse. In comparison, Visual D (plugin for Visual Studio Shell 2008) starts immediately, without any noticeable delay.
Dec 04 2011
next sibling parent reply Caligo <iteronvexor gmail.com> writes:
On Fri, Dec 2, 2011 at 2:19 AM, Russel Winder <russel russel.org.uk> wrote:

 Java is the main language of development just now.  D is a tiny little
 backwater in the nether regions of obscurity.  If any language is a joke
 here, it is D since it is currently unable to claim any serious market
 share in the world of development.  The sooner you accept this, the
 sooner you can discuss the shortcomings of a language you have no
 experience of, by your own admission.

 Your point about how languages become popular has some merit, albeit
 stated in an overly bigoted fashion.
That's like saying people should take Coke and Pepsi more seriously because they have bigger market shares when in reality all you need is water. Money isn't real, you know? D is already a success, a BIG success. Walter and Andrei (and the amazing community, of course) have created a programming language that is light years ahead of C++, Java and Go.

I don't think you know this, but every high school student who takes a computer science course is required to learn Java. It doesn't stop there: in college and university it's all Java, too, and this has been going on for almost two decades. And before Java it was mostly C++, but it was phased out. Unless the course specifically requires a different programming language (which is rare), you have to beg to use a different programming language (which I did). Sometimes professors do allow other programming languages, but they mostly limit it to C/C++. In most cases students either have to accept it and do what they are told to do, or fail the course. If that's not indoctrination, I don't know what is.

Also, the reason they restrict education to things like Java and C++ has very little to do with the fact that those languages have claimed big market share; rather, it's because corporations have had a vested interest in universities in the first place and they receive what they order. Just look at what Microsoft has been doing in universities: everything from "free" gifts such as free copies of Windows OS and Visual Studio Ultimate that cost thousands of dollars to sponsoring various kinds of events. The students who are influenced by such tactics, to whom do you think they are going to be loyal?

The _main point_ here is that if students had been given the choice to learn a programming language of their choosing, many of the so-called "successful" programming languages would not have been so "successful" today. So next time you decide to lecture someone on how popular or "successful" Java is, just remember how it got to be so "successful".

 Your point about exploitation should be aimed at the entirety of the
 economic systems of the world.  The systems in the USA, India and China
 (the three main economies of the world) rest completely and solely on
 exploitation.  It's called capitalism.
I do see the entirety of the economic system of the world, and, no, it's NOT called capitalism. It's called the Monetary System. Capitalism, Socialism, Communism, etc. ... they are all inherently the same because they are all based on the Monetary System. Money is created out of debt, and money is inherently scarce. Differential advantage and exploitation is the name of the game, regardless of the form of government you have. And as far as I know, India isn't even in the top five; USA, China, and Japan are in the top three.
 [...]

 So far I have competed in the ACM ICPC regional programming contests
 twice.  I've met many students there and I've had many teammates, most if
 not all of them Java programmers.  Besides me (I've never actually done
 any Java), I don't know any other C++ programmer in there.  I have seen
 countless problems solved in Java and C++, with Java always being 10-20
 times slower: same problem, same algorithms and/or data structures.
 Whenever I find an article that talks about Java being faster than C++, I
 know it's BS.  You can find fair comparisons at http://www.spoj.pl/
If you have never used Java, or never actually investigated when Java is significantly slower than C++ and when it is as fast as C++, then clearly you have no grounds on which to express any opinion based on facts; it is just prejudice and bigotry. Such comments have no place in any discussion.
I choose to ignore Java for technical and non-technical reasons. Unlike you, I don't need to spend years of my life doing Java programming to realize what a joke it is, and I have never seen a case where Java was just as fast as C++. This is one of those myths, or corporate propaganda, that's been propagated by educated idiots. I and the teams I've been a member of have solved countless CS problems that have required every kind of data structure and algorithm, and not once have I seen Java come close to C/C++. On average, Java has been about 20 times slower than C++ and has required on average 50 times more memory when it came to solving those problems. If you honestly believe that Java can be just as fast as C++, then go to http://www.spoj.pl/ and pick a problem and submit a solution in Java that's no more than 3 times slower than C/C++ and requires no more than 10 times more memory.
 Java is also very sluggish.  I don't exactly know why, but I'm sure it
 has something to do with JVM and/or GC.  Just look at Android and
 compare it to iPhone to see what I mean.  Apps running on the PC written
 in Java are also sluggish: Things like Eclipse and Open Office come to
 mind.
If you don't know why, how can you make claims that you cannot substantiate in any way, shape or form? Your qualitative assessment of applications such as Eclipse and OpenOffice relates to the codebase and not the language.
 Java is a joke.  It's a faith-based ideology.  Get over it.
[...] Clearly you are having a crisis of faith, and so are having to lash out to protect your ideology. I am entirely comfortable with my perceptions of languages, so have no need for such behaviours. I analyse languages, consider use in context, and use the most appropriate language for the job at hand, be it Fortran, C++, C, Ada, Haskell, OCaml, Java, Groovy, Python, Ruby, Clojure, Lisp, Go. Perhaps even D.
I'm not easily offended, and I've learned to let go. I love to be proven wrong because that's when I learn something new. I think you are having a harder time with this than you realize, and it's easy to understand why: you have spent years of your life with pointless creations such as Java, and they are now part of your identity. Of course you are going to get upset when someone labels Java as something of a joke because you take that statement personally and see it as an attack on who you are. It's okay. Just learn to let go. You still have time.
Dec 17 2011
next sibling parent reply Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
On 12/17/11 10:45 PM, Caligo wrote:
 D is already a success, a BIG success.  Walter and Andrei (and the
 amazing community, of course) have created a programming language that
 is light years ahead of C++, Java and Go.
Well if by success you mean "we didn't find totally embarrassing flaws in its design yet", then yes, D is a success. If, however, you meant "it is used by a large number of major projects", then I'd disagree. We want to get there, but we're not already there.
 I don't think you know this, but every high school student who takes a
 computer science course is required to learn Java.  It doesn't stop
 there: in college and university it's all Java, too, and this has been
 going on for almost two decades.  And before Java it was mostly C++, but
 it was phased out.  Unless the course specifically requires a different
 programming language (which is rare), you have to beg to use a different
 programming language (which I did).  Sometimes professors do allow other
 programming languages, but they mostly limit it to C/C++.  In most cases
 students either have to accept it and do what they are told to do, or
 fail the course.  If that's not indoctrination, I don't know what is.
 Also, the reason they restrict education to things like Java and C++ has
 very little to do with the fact that those languages have claimed big
 market share; rather, it's because corporations have had a vested
 interest in universities in the first place and they receive what they
 order.  Just look at what Microsoft has been doing in universities:
 everything from "free" gifts such as free copies of Windows OS and
 Visual Studio Ultimate that cost thousands of dollars to sponsoring
 various kinds of events.  The students who are influenced by such
 tactics, to whom do you think they are going to be loyal?

 The _main point_ here is that if students had been given the choice to
 learn a programming language of their choosing, many of the so called
 "successful" programming languages would not have been so "successful"
 today.  So next time you decide to lecture someone on how popular or
 "successful" Java is, just remember how it got to be so "successful".
I think you're moseying around a solid point without quite nailing it; you're still doing a lot better than most. It's quite amazing how many discussions a la "Java is successful because..." completely neglect an essential point: one BILLION dollars was poured into Java, a significant fraction of which was put in branding, marketing, and PR. The sheer fact that many of us - even those who actually _lived_ through the Java marketing bonanza - tend to forget about it echoes many studies in marketing: people believe they are making rational and logical choices and refuse to admit and understand they are influenced by marketing, even when they fall prey to textbook marketing techniques.

It's easy to forget now, but in the craze of late 1990s, Java was so heavily and so successfully advertised, I remember there were managers who were desperate to adopt Java, and were convinced it would be a strategic disaster if they failed to do so. That weirdly applied even to managers who knew nothing about programming - they were as confused as people who lined up to buy a Windows 95 CD that they couldn't install because they didn't have a computer. It was incredible - a manager would tell me how vital Java adoption is, but had no idea what Java really was. There were Java commercials on the TV! (http://www.youtube.com/watch?v=pHxtB8zr8UM)

Back then people were made to believe pretty much anything and everything good about Java. Some believed Java was small and great for limited-memory embedded systems. Some believed there's no real Internet without Java. Some believed Java was awesomely fast. Most importantly, a lot of people in decision positions believed jumping on the Java bandwagon was an absolute necessity. And this gushing of social proof became a self-fulfilling prophecy because with many people working on Java an entire web of tools, libraries, and applications sprung to life, creating offer and demand for more of the same. Andy Warhol would have loved the stunt.
Except jumpstarting this gigantic engine wasn't free - it cost Sun one billion dollars. (It could be speculated that ultimately this was part of the reason for Sun's demise, because other companies, not Sun, were able to capitalize on Java.) Forgetting the role that that billion dollars played in the success of Java would miss probably the single most important reason, and by far.

Right now I'm begging and cajoling Facebook and Microsoft for 5K-10K to organize a conference on D in 2012. I'll say D is successful when many companies would be honored to offer that level of sponsorship.

Andrei
Dec 18 2011
next sibling parent reply Russel Winder <russel russel.org.uk> writes:
On Sun, 2011-12-18 at 03:57 -0600, Andrei Alexandrescu wrote:
[...]
 It's quite amazing how many discussions a la "Java is successful
 because..." completely neglect an essential point: one BILLION dollars
 was poured into Java, a significant fraction of which was put in
 branding, marketing, and PR.
Not all of it from Sun -- they didn't have pockets that deep.
 The sheer fact that many of us - even those who actually _lived_ through
 the Java marketing bonanza - tend to forget about it echoes many studies
 in marketing: people believe they are making rational and logical
 choices and refuse to admit and understand they are influenced by
 marketing, even when they fall prey to textbook marketing techniques.
Corollary: You have to have new product on the shelves every 6 months or people stop buying your product. Just look in the supermarket shelves for the use of "new". The product may be the old product but the packaging is different so it is "new".
 It's easy to forget now, but in the craze of late 1990s, Java was so
 heavily and so successfully advertised, I remember there were managers
 who were desperate to adopt Java, and were convinced it would be a
 strategic disaster if they failed to do so. That weirdly applied even to
 managers who knew nothing about programming - they were as confused as
 people who lined up to buy a Windows 95 CD that they couldn't install
 because they didn't have a computer. It was incredible - a manager would
 tell me how vital Java adoption is, but had no idea what Java really
 was. There were Java commercials on the TV!
 (http://www.youtube.com/watch?v=pHxtB8zr8UM)
I was in academia at the time so don't know what was happening in the real world, but there certainly was a manic aspect to the Java snowball -- and I use this metaphor advisedly, when you roll a snowball in snow it gets bigger, but when the temperature rises snowballs melt away. Publishers as well as academics were culpable in the mass mania. A revamp in the university curriculum meant new books and new sales, so they pushed it as hard as possible. Deitel, once a prominent operating systems author, created a not so great programming languages publishing empire out of it.
 Back then people were made to believe pretty much anything and
 everything good about Java. Some believed Java was small and great for
 limited-memory embedded systems. Some believed there's no real Internet
 without Java. Some believed Java was awesomely fast. Most importantly, a
 lot of people in decision positions believed jumping on the Java
 bandwagon was an absolute necessity. And this gushing of social proof
 became a self-fulfilling prophecy because with many people working on
 Java an entire web of tools, libraries, and applications sprung to
 life, creating offer and demand for more of the same.
And then there was JavaCard. One of the biggest con jobs of all time. Fundamentally a good idea, badly executed and managed because it became a cash cow for Sun. Now I suspect a blip in history. Which is a shame, as smartcards are now powerful enough to run something along the JavaCard lines that would really do something useful with smartcards.

I see JavaME is being re-raised as useful technology. Great if I can run courses, but it would be a bad move. JavaSE Embedded is actually a different kettle of fish. Not useful everywhere, but in certain use cases far better than using C or C++. Or D, except that there aren't enough backends for D to make that viable.
 Andy Warhol would have loved the stunt. Except jumpstarting this
 gigantic engine wasn't free - it cost Sun one billion dollars. (It could
 be speculated that ultimately this was part of the reason for Sun's
 demise, because other companies, not Sun, were able to capitalize on
 Java.) Forgetting the role that that billion dollars played in the
 success of Java would miss probably the single most important reason,
 and by far.
Whilst I can believe the $1bn overall, not all of it was Sun, and not all of it was Java. cf. the Self language episode. I bet IBM were happy.
 Right now I'm begging and cajoling Facebook and Microsoft for 5K-10K to
 organize a conference on D in 2012. I'll say D is successful when many
 companies would be honored to offer that level of sponsorship.
Musicians are coming up with new ways of funding things that are working very well. Pre-sales. Put out the road-map and business plan for an album or concert. Take bookings and money before committing to anything, then you have the cash float to make commitments. Organizing it from cash flow means no need for sponsors. Except that once the show realization is on the road you can inform the sponsors of what a successful event this is going to be and how they are going to look bad if they are not there.

PyCon UK (un)conferences tend to get organized on this model these days.

Obviously though it is all about having the contacts who can commit budget.

--
Russel.
Dr Russel Winder      t: +44 20 7585 2200   voip: sip:russel.winder ekiga.net
41 Buckmaster Road    m: +44 7770 465 077   xmpp: russel russel.org.uk
London SW11 1EN, UK   w: www.russel.org.uk  skype: russel_winder
Dec 20 2011
next sibling parent Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
On 12/20/11 2:09 AM, Russel Winder wrote:
 On Sun, 2011-12-18 at 03:57 -0600, Andrei Alexandrescu wrote:
 [...]
 It's quite amazing how many discussions a la "Java is successful
 because..." completely neglect an essential point: one BILLION dollars
 was poured into Java, a significant fraction of which was put in
 branding, marketing, and PR.
Not all of it from Sun -- they didn't have pockets that deep.
 The sheer fact that many of us - even those who actually _lived_ through
 the Java marketing bonanza - tend to forget about it echoes many studies
 in marketing: people believe they are making rational and logical
 choices and refuse to admit and understand they are influenced by
 marketing, even when they fall prey to textbook marketing techniques.
Corollary: You have to have new product on the shelves every 6 months or people stop buying your product. Just look in the supermarket shelves for the use of "new". The product may be the old product but the packaging is different so it is "new".
Confusion. Product != brand. There doesn't have to be a new brand to replace Starbucks or Coca-Cola every six months. Andrei
Dec 20 2011
prev sibling parent reply Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
On 12/20/11 2:09 AM, Russel Winder wrote:
 Publishers as well as academics were culpable in the mass mania.  A
 revamp in the university curriculum meant new books and new sales, so
 they pushed it as hard as possible.  Deitel, once a prominent operating
 systems author, created a not so great programming languages publishing
 empire out of it.
I was also in academia, doing PL research no less. The academic interest was not of commercial nature for the most part - Java _is_ a clean language great for doing research of both kinds: (a) research that studies programs written in that language, (b) research that adds a little feature to the language and proves its properties. The fact that Java is underpowered has no import for that kind of work. Andrei
Dec 20 2011
parent Russel Winder <russel russel.org.uk> writes:
On Tue, 2011-12-20 at 06:45 -0600, Andrei Alexandrescu wrote:
[...]
 I was also in academia, doing PL research no less. The academic
 interest was not of commercial nature for the most part - Java _is_ a
 clean language great for doing research of both kinds: (a) research that
 studies programs written in that language, (b) research that adds a
 little feature to the language and proves its properties. The fact that
 Java is underpowered has no import for that kind of work.
Indeed, academic research has a different set of constraints and must be handled differently to the commercial/industrial setting. Hence languages such as Clean are relevant in an academic setting where they have little or no penetration in "the real world". Professional publishing, which is still going, just, is also very different from textbook publishing, which by all accounts has nigh on disappeared.

--
Russel.
Dec 21 2011
prev sibling parent reply Caligo <iteronvexor gmail.com> writes:
On Tue, Dec 20, 2011 at 2:09 AM, Russel Winder <russel russel.org.uk> wrote:
 Musicians are coming up with new ways of funding things that is working
 very well.  Pre-sales.  Put out the road-map and business plan for an
 album or concert.  Take bookings and money before committing to
 anything, then you have the cash float to make commitments.  Organizing
 it from cash flow means no need for sponsors.  Except that once the show
 realization is on the road you can inform the sponsors of what a
 successful event this is going to be and how they are going to look bad
 if they are not there.

 PyCon UK (un)conferences tend to get organized on this model these days.

 Obviously though it is all about having the contacts who can commit
 budget.
I don't understand why Walter, Andrei, or other D experts aren't going to universities to give talks. As far as I know it costs no money. At least it didn't cost us anything to set up an event when we were activists. You just need to ask and reserve a room, such as an auditorium. It doesn't have to be some official corporate-sponsored DCon. Don't forget a YouTube version :-)
Dec 20 2011
parent Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
On 12/20/11 2:26 AM, Caligo wrote:
 On Tue, Dec 20, 2011 at 2:09 AM, Russel Winder <russel russel.org.uk
 <mailto:russel russel.org.uk>> wrote:

     Musicians are coming up with new ways of funding things that is working
     very well.  Pre-sales.  Put out the road-map and business plan for an
     album or concert.  Take bookings and money before committing to
     anything, then you have the cash float to make commitments.  Organizing
     it from cash flow means no need for sponsors.  Except that once the show
     realization is on the road you can inform the sponsors of what a
     successful event this is going to be and how they are going to look bad
     if they are not there.

     PyCon UK (un)conferences tend to get organized on this model these days.

     Obviously though it is all about having the contacts who can commit
     budget.


 I don't understand why Walter, Andrei, or other D experts aren't going
 to universities to give talks.
We do, and at corporations as well as universities. There are a variety of scheduling issues, but they can be worked out. The main reason we're not doing more is there are not many invitations. Andrei
Dec 20 2011
prev sibling parent reply Somedude <lovelydear mailmetrash.com> writes:
On 18/12/2011 05:45, Caligo wrote:
 
 I choose to ignore Java for technical and non-technical reasons.  Unlike
 you, I don't need to spend years of my life doing Java programming to
 realize what a joke it is, and I have never seen a case where Java was
 just as fast as C++.  This is one of those myths, or corporate
 propaganda, that's been propagated by educated idiots.  I and the teams
 I've been a member of have solved countless CS problems that have
 required every kind of data structure and algorithm, and not once have I
 seen Java come close to C/C++.  On average, Java has been about 20 times
 slower than C++ and requiring on average 50 times more memory when it
 came to solving those problems.  If you honestly believe that Java can
 be just as fast as C++, then go to http://www.spoj.pl/ and pick a
 problem and submit a solution in Java that's no more than 3 times slower
 than C/C++ and requires no more than 10 times more memory.
  
I'm sorry for being blunt, but I know bullshit when I see it, and that's a load of it. Here is the kind of performance you can expect from the JVM: a factor of 2.5x to native C++. That's from the Box2D physics game engine.

http://blog.j15r.com/2011/12/for-those-unfamiliar-with-it-box2d-is.html

This is very much in line with what the Computer Language Benchmarks Game gives (and I've come to realize by experience that it's actually quite accurate when evaluating maximum speed for languages) and very much what I've come to expect in practice.
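Part of the gap between these 2.5x figures and the 10-20x figures reported earlier in the thread comes down to measurement methodology: the JVM runs bytecode interpreted at first and only JIT-compiles hot paths after warmup, so a single-shot timing (as contest judges typically produce, since they time the whole process including JVM startup) charges Java for compilation work that a steady-state benchmark amortizes away. A minimal sketch of the warmup effect (illustrative only, not code from the thread):

```java
// Demonstrates why naive single-run JVM timings mislead: the first pass of a
// hot loop runs interpreted, later passes run JIT-compiled code, so a fair
// Java-vs-C++ comparison must discard warmup iterations.
public class WarmupDemo {
    // A small numeric workload: sum of i*i for i in [0, n).
    static long work(int n) {
        long s = 0;
        for (int i = 0; i < n; i++) s += (long) i * i;
        return s;
    }

    public static void main(String[] args) {
        int n = 1_000_000;
        long expected = work(n); // reference value for a correctness check

        // Time several passes; the earliest include interpretation and JIT
        // compilation cost, the later ones reflect steady-state speed.
        for (int pass = 0; pass < 5; pass++) {
            long t0 = System.nanoTime();
            long r = work(n);
            long t1 = System.nanoTime();
            if (r != expected) throw new AssertionError("bad result");
            System.out.printf("pass %d: %d us%n", pass, (t1 - t0) / 1_000);
        }
    }
}
```

On most JVMs the last passes are markedly faster than the first; serious Java benchmarking (as the Benchmarks Game does) reports steady-state numbers, while one-shot judging also pays JVM startup, which inflates Java's apparent slowdown.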
Dec 18 2011
parent reply Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
On 12/18/11 5:15 PM, Somedude wrote:
 Here is the kind of performance you can expect from the JVM: a factor of
 2.5x to native C++.
 That's from the Box2D physics game engine.

 http://blog.j15r.com/2011/12/for-those-unfamiliar-with-it-box2d-is.html

 This is very much in line with what the Computer Language Benchmarks
 Game gives (and I've come to realize by experience that it's actually
 quite accurate when evaluating maximum speed for languages) and very
 much what I've come to expect in practice.
2.5x-3x in speed is the number that I've seen most aired. I'm not sure about memory consumption. Andrei
Dec 18 2011
parent Somedude <lovelydear mailmetrash.com> writes:
On 19/12/2011 00:26, Andrei Alexandrescu wrote:
 On 12/18/11 5:15 PM, Somedude wrote:
 Here is the kind of performance you can expect from the JVM: a factor of
 2.5x to native C++.
 That's from the Box2D physics game engine.

 http://blog.j15r.com/2011/12/for-those-unfamiliar-with-it-box2d-is.html

 This is very much in line with what the Computer Language Benchmarks
 Game gives (and I've come to realize by experience that it's actually
 quite accurate when evaluating maximum speed for languages) and very
 much what I've come to expect in practice.
2.5x-3x in speed is the number that I've seen most aired. I'm not sure about memory consumption. Andrei
It's very dependent on how you program, of course, but I'd say the ballpark is usually at least an order of magnitude more. Java wastes a LOT of memory.
Dec 18 2011
prev sibling next sibling parent Jonathan M Davis <jmdavisProg gmx.com> writes:
On Saturday, December 17, 2011 22:45:51 Caligo wrote:
 On Fri, Dec 2, 2011 at 2:19 AM, Russel Winder <russel russel.org.uk> wrote:
 Java is the main language of development just now.  D is a tiny little
 backwater in the nether regions of obscurity.  If any language is a joke
 here, it is D since it is currently unable to claim any serious market
 share in the world of development.  The sooner you accept this, the
 sooner you can discuss the shortcomings of a language you have no
 experience of, by your own admission.
 
 Your point about how languages become popular has some merit, albeit
 stated in an overly bigoted fashion.
 That's like saying people should take Coke and Pepsi more seriously
 because they have bigger market shares when in reality all you need is
 water.  Money isn't real, you know? [...] The _main point_ here is that
 if students had been given the choice to learn a programming language of
 their choosing, many of the so-called "successful" programming languages
 would not have been so "successful" today.  So next time you decide to
 lecture someone on how popular or "successful" Java is, just remember
 how it got to be so "successful".
In my experience, it's the professors who get to choose what they're teaching, and the main reason that Java is used is a combination of its simplicity and the fact that it's heavily used in the industry. C and C++ have a lot more pitfalls which make learning harder for newbie programmers. Java does more for you (like garbage collection instead of manual memory management) and has fewer ways to completely screw yourself over, so it makes more sense as a teaching language than C or C++. And since the primary focus is teaching the principles of computer science rather than a particular programming language, the exact language usually doesn't matter much.

Now, this _does_ have the effect that the majority of college students are most familiar and comfortable with Java, and so that's what they're generally going to use (so there _is_ a lot of indoctrination in that sense), but that's pretty much inevitable. You use what you know. Ultimately though, that's what's likely to happen with most any university simply because teaching programming languages is not the focus - teaching computer science is. And for most of that, the language isn't particularly relevant.

And Java was successful before they started using it in universities, or it likely wouldn't have been used much in them in the first place. It's just that that has a feedback effect, since the increased use in universities tends to increase its use in industry, which tends to then make the universities more likely to select it or to stick with it as long as they don't have a solid reason to switch. But I believe that the initial switch was a combination of the fact that its popularity in industry was increasing and the fact that it works much better as a teaching language than C or C++. It's not because of anything that corporations did (aside from saying that they used it), since Java isn't a language where donating stuff or discounting really helps (unlike C++), since almost all of the tools for Java are free.
- Jonathan M Davis
Dec 17 2011
prev sibling next sibling parent Russel Winder <russel russel.org.uk> writes:
On Sat, 2011-12-17 at 22:45 -0600, Caligo wrote:
[...]

I thought this thread had finished, but...
 That's like saying people should take Coke and Pepsi more seriously
 because they have bigger market shares when in reality all you need is
 water.  Money isn't real, you know?
Taking that paragraph out of the context of the previous emails in the thread leads to misinterpretation of what was being said.
 D is already a success, a BIG success.  Walter and Andrei (and the
 amazing community, of course) have created a programming language that
 is light years ahead of C++, Java and Go.

 I don't think you know this, but every high school student who takes a
 computer science course is required to learn Java.  It doesn't stop
 there:
I didn't know this, but I guess it is only a factor in the USA. In the rest of the world, it is almost certainly not the case. It definitely isn't in the UK.
 in college and university it's all Java, too, and this has been going on
 for almost two decades.  And before Java it was mostly C++, but it was
 phased out.  Unless the course specifically requires a different
Well, 15 anyway :-) Up until 1985 or thereabouts all educational institutions used Pascal. The global mindset was that Pascal was THE language for teaching. If you weren't teaching and learning using Pascal, you were deemed improperly educated.

At UCL, Paul Otto and myself set about revamping the teaching of programming, initially using Scheme + C++ and later using Miranda + C++. This proved massively effective. Students were forced early to appreciate different paradigms of computation as well as learning a language that was increasingly important in the real world. We used SICP for Scheme, which was great. C++ had no books so I wrote one, and that turned out well. Miranda had no books originally so a couple of colleagues at UCL wrote one, and that turned out well.

Then came 1995 and The Great Revolution: Java hit the streets, Web browsers got funky, and the international educators' mindset spontaneously switched to "virtual machines are great" with a smattering of "programming using applets is so cool". Almost overnight all educators switched to Java. I had moved to KCL, which was still using Modula-2, and revamped the programming. I offered the folks the choice of C++, Ada or Java as the first programming language, knowing that Clean was used later in the course. They chose Java. Meanwhile at UCL they switched to Java in 1997. Many people were writing many books on educating people using Java. Graham Roberts at UCL and I at KCL needed a book different to the rubbish that was being published in the switch-to-Java fashion world-wide, so ended up writing our own book, which proved very successful.

The problem here is that educators forgot the importance of learning multiple languages and especially multiple paradigms. Java was used for all teaching and students suffered. If they had used Java and Haskell and Prolog things would be much better.
I exited academia in 2000 as it was fairly obvious that the sector was going to be ruined, at least in the UK, due to the dreadful rhetoric all political parties were issuing. Graham stayed at UCL though and has managed to switch the early programming teaching to use Groovy and Prolog followed by Java. This produces far better programmers than any other sequence I know of just now. Obviously the Hawthorne effect matters: students are enthused and made better by having enthusiastic and knowledgeable teachers who inspire.
 programming language (which is rare), you have to beg to use a different
 programming language (which I did).  Sometimes professors do allow other
 programming languages, but they mostly limit it to C/C++.  In most cases
 students either have to accept it and do what they are told to do, or
 fail the course.  If that's not indoctrination, I don't know what is.
 Also, the
In a sense all education is indoctrination, but we probably don't want to go there. I suspect the major problem with most educators -- only in USA and Italy are all educators called professors, in places like UK, France and Germany, professor is a title that has to be earned and is a matter of status within the system -- is that they are themselves under-educated. Far too many educators teaching programming cannot themselves program. Their defensive reaction is to enforce certain choices on their students. Sometimes there is a reasonable rationale -- you don't write device drivers in Prolog, well unless you are using ICL CAFS -- but generally the restriction is because the educator doesn't know any other programming language than the one they enforce.
 reason they restrict education to things like Java and C++ has very littl=
e
 to do with the fact that those languages have claimed big market share;
 rather, it's because corporations have had a vested interest in
 universities in the first place and they receive what they order.  Just
 look at what Microsoft has been doing in universities: everything from
 "free" gifts such as free copies of Windows OS and Visual Studio Ultimate
 that cost thousands of dollars to sponsoring various kinds of events.  Th=
e
 students who are influenced by such tactics, to whom do you think they ar=
e
 going to be loyal to?
I worry more about what McDonald's does in primary schools. Forcing pupils to learn to count by counting Big Macs strikes me as the worst sort of indoctrination. Your description of Microsoft's behaviour is a natural consequence of the economic system. Using marketing budget to indoctrinate people into buying your product is all that is going on. Some companies realize that spending that money on molding the minds of 2--12 year olds is the way of creating an income stream in the future.
 The _main point_ here is that if students had been given the choice to learn
 a programming language of their choosing, many of the so called
 "successful" programming languages would not have been so "successful"
 today.  So next time you decide to lecture someone on how popular or
 "successful" Java is, just remember how it got to be so "successful".
The single most important factor here is that people learn to program using more than one paradigm of computation and thus more than one language -- FOOPLOG and Scala are special cases that are interesting for later study but not for initial study. The importance of multiple paradigms isn't just waffle: the psychology of programming folk have been doing longitudinal studies over the last 20 years which show that such people learn more, faster, and end up being better programmers. People who learn with one language and use that language for the rest of their programming lives are in general poorer programmers because of it. [...]
 I do see the entirety of the economic system of the world, and, no, it's
 NOT called capitalism.  It's called the Monetary System.  Capitalism,
 Socialism, Communism, etc,... they are all inherently the same because they
 are all based on the Monetary System.  Money is created out of debt, and
 money is inherently scarce.  Differential advantage and exploitation are the
 name of the game, regardless of the form of government you have.  And as
 far as I know, India isn't even in the top five; USA, China, and Japan are
 in the top three.
I beg to differ on the detail, but this is almost certainly not the forum to have this debate. You are right that the obvious political labelling is not the whole story, but neither is the analysis using the monetary system. It is a multi-dimensional problem. In the end, though, people in positions of power will do their very best to exploit those who are not. [...]
 I choose to ignore Java for technical and non-technical reasons.  Unlike
 you, I don't need to spend years of my life doing Java programming to
 realize what a joke it is, and I have never seen a case where Java was just
 as fast as C++.  This is one of those myths, or corporate propaganda, that's
 been propagated by educated idiots.  I and the teams I've been a member of
 have solved countless CS problems that have required every kind of data
 structure and algorithm, and not once have I seen Java come close to
 C/C++.  On average, Java has been about 20 times slower than C++ and
 required on average 50 times more memory when it came to solving those
 problems.  If you honestly believe that Java can be just as fast as C++,
 then go to http://www.spoj.pl/ and pick a problem and submit a solution in
 Java that's no more than 3 times slower than C/C++ and requires no more
 than 10 times more memory.
Have you actually done the benchmarks to back up this claim? I have. I really rather object to being labelled an educated idiot.
Java certainly does use appalling amounts of memory, no argument there. For reasons you (should) know perfectly well, Java is never going to perform well on the sort of benchmarks these sites look at, because they use a protocol of testing that is inherently and systematically biased in favour of running a program from a cold start. C, C++ and D will always thrash Java in this context. Where Java competes very well with C, C++ and D is in long-running computations. If you want to look at even more biased benchmarking, look at http://shootout.alioth.debian.org/ -- it is fundamentally designed to show that C is the one true language for writing performance computation. [...]
 I'm not easily offended, and I've learned to let go.  I love to be proven
 wrong because that's when I learn something new.  I think you are having a
 harder time with this than you realize, and it's easy to understand why:
 you have spent years of your life with pointless creations such as Java,
 and they are now part of your identity.  Of course you are going to get
 upset when someone labels Java as something of a joke because you take that
 statement personally and see it as an attack on who you are.  It's okay.
 Just learn to let go.  You still have time.
You really have totally missed where I am coming from, so I think it best you simply stop trying to analyse my position. I am not even going to try to defend myself against the slur you have made against me; it is not true, and if you had read my emails rather than trying to be an amateur psycho-analyst, you would already know this.

--
Russel.
=============================================================================
Dr Russel Winder      t: +44 20 7585 2200   voip: sip:russel.winder ekiga.net
41 Buckmaster Road    m: +44 7770 465 077   xmpp: russel russel.org.uk
London SW11 1EN, UK   w: www.russel.org.uk  skype: russel_winder
Dec 17 2011
prev sibling next sibling parent reply Russel Winder <russel russel.org.uk> writes:
On Sat, 2011-12-17 at 21:01 -0800, Jonathan M Davis wrote:
[...]
 In my experience, it's the professors who get to choose what they're teaching
 and the main reason that Java is used is a combination of its simplicity and
 the fact that it's heavily used in the industry. C and C++ have a lot more
 pitfalls which make learning harder for newbie programmers. Java does more for
 you (like garbage collection instead of manual memory management) and has
 fewer ways to completely screw yourself over, so it makes more sense as a
 teaching language than C or C++. And since the primary focus is teaching the
 principles of computer science rather than a particular programming language,
 the exact language usually doesn't matter much.
Not sure about the USA, but in the UK and other places where I have been an examiner, educators (not all of whom are professors, since that is a high-level position in many places, not just a synonym for educator) do not have a totally free choice. There are processes in place through which choices have to be put and hence validated by more than just the individual.

I suspect far too many educators use Java because everyone else does, and that they don't actually think about the choice they are making. Increasingly, from what I hear, educators are moving away from Java as a first programming language. I think this is a good move. Using languages like Python makes for much easier learning of programming and the principles of programming. UCL looked at this, but had the constraint of having to teach Java in the second year, so went with Groovy rather than Python. Followed by Prolog. It is using more than one language that makes for the best education.
 Now, this _does_ have the effect that the majority of college students are most
 familiar and comfortable with Java, and so that's what they're generally going
 to use (so there _is_ a lot of indoctrination in that sense), but that's
 pretty much inevitable. You use what you know. Ultimately though, that's
 what's likely to happen with most any university simply because teaching
 programming languages is not the focus - teaching computer science is. And for
 most of that, the language isn't particularly relevant.
Sadly too little computer science gets taught in most universities and colleges, as well as too little programming. And yes, if these institutions teach with Java then the bias is towards Java in the job market. Corollary to all this: everyone on this list should go into academia and start teaching all the introductory programming courses using D. Except at Texas A&M, where C++ will continue to be used. :-)
 And Java was successful before they started using it in universities, or it
 likely wouldn't have been used much in them in the first place. It's just that
 that has a feedback effect, since the increased use in universities tends to
 increase its use in industry, which tends to then make the universities more
 likely to select it or to stick with it as long as they don't have a solid
 reason to switch. But I believe that the initial switch was a combination of
 the fact that its popularity in industry was increasing and the fact that it
 works much better as a teaching language than C or C++. It's not because of
 anything that corporations did (aside from saying that they used it), since
 Java isn't a language where donating stuff or discounting really helps (unlike
 C++), since almost all of the tools for Java are free.
At UCL and KCL we had switched to using Java long before it was an industry requirement for jobs. Overall though I think this is a chicken and egg situation.

In all of this, the issue of portability of code has seemingly been missed. One of the main reasons for Java in 1995 (other than the trendiness of Web browser programming) was portability across all platforms. This made the sys admin burden of providing resources for programming classes significantly less than it was. C, C++ and D cannot match this even today. Back then it was a Big Win (tm). Interesting to note how Intel put much marketing and sales resource into C++ and associated tools. It's all about lock-in. Which is fine if portability is not an issue.

Someone once coined the term WORA (write once, run anywhere) -- which really is a (tm) phrase -- and yet this is a total lie with respect to Java. It sort of worked when there was only Java 1.0, but already when 1.2 came out it was clearly a fib. Now with Java 5, 6, 7, ... it is a clear lie. Hence OSGi and Project Jigsaw. The problem is basically the same as with dynamic linking in C, C++ and D: you have to know exactly the right soname for the library you use. The Go folk have got round this by ignoring dynamic linking and insisting on static linking of all code.

--
Russel.
Dec 17 2011
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 12/17/2011 10:36 PM, Russel Winder wrote:
 In all of this, the issue of portability of code has seemingly been
 missed.  One of the main reasons for Java in 1995 (other than the
 trendiness of Web browser programming) was portability across all
 platforms.  This made the sys admin of provision of resources  for
 programming classes significantly less than it was.  C, C++ and D cannot
 match this even today.  Back then it was a Big Win (tm).
I find this an odd statement because the Java VM is written in C, so therefore it is on the same or fewer platforms than C. BTW, if I was King of the World, universities would teach assembler programming first. I learned BASIC first, then FORTRAN, then I learned assembler (6800) and it was like someone turned the lights on. I liken it to trying to teach kids algebra first, give them a calculator, and never bother teaching them arithmetic. A programmer who doesn't know assembler is never going to write better than second rate programs.
Dec 17 2011
next sibling parent reply Russel Winder <russel russel.org.uk> writes:
On Sat, 2011-12-17 at 23:09 -0800, Walter Bright wrote:
[...]
 I find this an odd statement because the Java VM is written in C, so therefore
 it is on the same or fewer platforms than C.
It's the indirection thing again: rather than provide a C toolchain for each platform, you load Java (or Python, Ruby, ...) which is already precompiled for the platform which then allows a single toolchain across all platforms.
 BTW, if I was King of the World, universities would teach assembler programming
 first.
I think that sort of worked in the 1980s when computers were (relatively) simple, but I don't think it works now. Clearly any self-respecting programmer should be able to work with assembly language, so it needs to be taught, but these days it comes as the link between hardware and software rather than being the language of software.
 I learned BASIC first, then FORTRAN, then I learned assembler (6800) and it was
 like someone turned the lights on.
It's all about the operational semantics. Some people are happy with very abstract semantics and so can work with the likes of Fortran very well without knowing assembly language. For others the link to how the computer actually works is critically important.
 I liken it to trying to teach kids algebra first, give them a calculator, and
 never bother teaching them arithmetic.

 A programmer who doesn't know assembler is never going to write better than
 second rate programs.
I am not sure I'd go quite that far but I agree that all programmers really ought to have worked with assembly language at least once in their lives.

--
Russel.
Dec 17 2011
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 12/17/2011 11:23 PM, Russel Winder wrote:
 It's the indirection thing again:  rather than provide a C toolchain for
 each platform, you load Java (or Python, Ruby, ...) which is already
 precompiled for the platform which then allows a single toolchain across
 all platforms.
If you can compile the JVM for a machine, then C exists on that machine (even if you do not make the C tools available).
 BTW, if I was King of the World, universities would teach assembler programming
 first.
I think that sort of worked in the 1980s when computers were (relatively) simple, but I don't think it works now. Clearly any self-respecting programmer should be able to work with assembly language, so it needs to be taught, but these days it comes as the link between hardware and software rather than being the language of software.
The ones that don't know assembler tend to have peculiar deficits and blind spots when they program. This is as true today on modern machines as it was 30 years ago. I see it over and over.
 I learned BASIC first, then FORTRAN, then I learned assembler (6800) and it was
 like someone turned the lights on.
It's all about the operational semantics. Some people are happy with very abstract semantics and so can work with the likes of Fortran very well without knowing assembly language. For others the link to how the computer actually works is critically important.
Yes, they can program, but they have peculiar deficits. It's like someone whose keyboard has a broken 'q' key and who has learned to avoid using any words that contain 'q'. It takes a while to notice it.
 I liken it to trying to teach kids algebra first, give them a calculator, and
 never bother teaching them arithmetic.

 A programmer who doesn't know assembler is never going to write better than
 second rate programs.
I am not sure I'd go quite that far but I agree that all programmers really ought to have worked with assembly language at least once in their lives.
Exactly, which is why I'd make it a first or second course in programming for a professional programming education.
Dec 18 2011
parent reply Russel Winder <russel russel.org.uk> writes:
On Sun, 2011-12-18 at 01:51 -0800, Walter Bright wrote:
 On 12/17/2011 11:23 PM, Russel Winder wrote:
 It's the indirection thing again:  rather than provide a C toolchain for
 each platform, you load Java (or Python, Ruby, ...) which is already
 precompiled for the platform which then allows a single toolchain across
 all platforms.

 If you can compile the JVM for a machine, then C exists on that machine (even if
 you do not make the C tools available).
I think this is moving away from the reason. True, to have a JVM you need a compilation language -- usually C++, but sometimes C -- for the bootstrap bit. That is not the problem that I was referring to. In the context of teaching with a heterogeneous set of platforms, it is easier to give one set of instructions on how to compile and run a program. Trying to tell people how to compile C can get very complicated; telling people how to run javac has much less difficulty. It is the uniformity of the development platform that favours a virtual machine based language over a native code system. Also there is the issue that code compiled on one platform works when executed on another. But then this may be considered a positive, especially if you are Intel, Microsoft or Apple. In teaching Python, all platforms are the same -- except when it comes to doing native code extensions, and that is where the pain starts. [...]
 The ones that don't know assembler tend to have peculiar deficits and blind
 spots when they program. This is as true today on modern machines as it was 30
 years ago.

 I see it over and over.
I think this might be more true of native code languages than virtual machine languages. Java programmers generally don't know the bytecodes, Python programmers generally don't know the bytecodes, Ruby programmers generally don't know the bytecodes (Ruby 1.8 may have been interpreted, but 1.9 is a bytecode based system). [...]
 I am not sure I'd go quite that far but I agree that all programmers
 really ought to have worked with assembly language at least once in
 their lives.

 Exactly, which is why I'd make it a first or second course in programming for a
 professional programming education.
I am not sure about the last 10 years, but until then a hardware course and an assembly language programming course were effectively mandatory in the first year of a computing course in the UK. The problem was that all too often the staff teaching the courses didn't really know what they were talking about :-((

--
Russel.
Dec 19 2011
parent Walter Bright <newshound2 digitalmars.com> writes:
On 12/19/2011 11:42 PM, Russel Winder wrote:
 I think this might be more true of native code languages than virtual
 machine languages.  Java programmers generally don't know the bytecodes,
 Python programmers generally don't know the bytecodes, Ruby programmers
 generally don't know the bytecodes (Ruby 1.8 may have been interpreted,
 but 1.9 is a bytecode bases system).
I don't mean knowing the bytecode. Knowing assembler means you develop a feel for what has to happen at the machine level for various constructs. Knowing bytecode doesn't help with that.
 The problem was that all too often the staff teaching the courses didn't
 really know what they were talking about :-((
I learned programming from my peers in college who took pity on my ignorance and kindly helped out. I remember Larry Zwick, who said "good gawd, don't you know what tables are?" after looking at some coding horror listing of mine. I said "whut's dat?" and he proceeded to teach me table-driven state machines on the spot. I remember learning OOP (though I didn't learn the term for it until years later) by reading through the listing for the ADVENT game, and there was the comment "a troll is a modified dwarf". It was one of those lightbulb moments.
Dec 20 2011
prev sibling next sibling parent reply Andrej Mitrovic <andrej.mitrovich gmail.com> writes:
On 12/18/11, Walter Bright <newshound2 digitalmars.com> wrote:
 A programmer who doesn't know assembler is never going to write better than
 second rate programs.
I dare you to say that Optlink is a first-rate program.
Dec 18 2011
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 12/18/2011 9:01 AM, Andrej Mitrovic wrote:
 On 12/18/11, Walter Bright<newshound2 digitalmars.com>  wrote:
 A programmer who doesn't know assembler is never going to write better than
 second rate programs.
I dare you to say that Optlink is a first-rate program.
It is. In its heyday it beat the pants off of any other linker, and was the standard replacement for MS-Link. Its author retired young off of the proceeds from selling it. How many people do you know who did that?
Dec 18 2011
parent Andrej Mitrovic <andrej.mitrovich gmail.com> writes:
On 12/18/11, Walter Bright <newshound2 digitalmars.com> wrote:
 It is. In its heyday..
s/is/was. My PC doesn't have a turbo button anymore. ;)
Dec 18 2011
prev sibling next sibling parent dsimcha <dsimcha yahoo.com> writes:
On 12/18/2011 2:09 AM, Walter Bright wrote:
 A programmer who doesn't know assembler is never going to write better
 than second rate programs.
I don't even know assembler that well and I agree 100%. I can read bits of assembler and recognize compiler optimizations and could probably mechanically translate C code to x86 assembler, but I'd be lost if asked to write anything more complicated than a small function from scratch or do anything without some reference material. Even this basic level of knowledge has given me insights into language design. For example: I'd love to be asked in an interview whether default arguments to virtual functions are determined by the compile time or runtime type of the object. To someone who knows nothing about assembler this seems like the most off-the-wall language-lawyer minutiae imaginable. To someone who knows assembler, the answer is obviously the compile time type. Otherwise, you'd have to store the function's default arguments in the virtual function table somehow, then look each one up and push it onto the stack at the call site. This would get very hairy and inefficient very fast.
Dec 18 2011
prev sibling next sibling parent reply Paulo Pinto <pjmlp progtools.org> writes:
Walter Bright Wrote:

 On 12/17/2011 10:36 PM, Russel Winder wrote:
 In all of this, the issue of portability of code has seemingly been
 missed.  One of the main reasons for Java in 1995 (other than the
 trendiness of Web browser programming) was portability across all
 platforms.  This made the sys admin of provision of resources  for
 programming classes significantly less than it was.  C, C++ and D cannot
 match this even today.  Back then it was a Big Win (tm).
I find this an odd statement because the Java VM is written in C, so therefore it is on the same or fewer platforms than C.
Which specific Java VM are you talking about? They come in all flavors, written in Assembly, C, C++ and even Java.
Dec 18 2011
next sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 12/18/2011 11:00 AM, Paulo Pinto wrote:
 Walter Bright Wrote:
 I find this an odd statement because the Java VM is written in C, so therefore
 it is on the same or fewer platforms than C.
Which specific Java VM are you talking about? They come in all flavors, written in Assembly, C, C++ and even Java.
I thought they were all written in C.
Dec 18 2011
parent reply Paulo Pinto <pjmlp progtools.org> writes:
Walter Bright Wrote:

 On 12/18/2011 11:00 AM, Paulo Pinto wrote:
 Walter Bright Wrote:
 I find this an odd statement because the Java VM is written in C, so therefore
 it is on the same or fewer platforms than C.
Which specific Java VM are you talking about? They come in all flavors, written in Assembly, C, C++ and even Java.
I thought they were all written in C.
The SunSpot VM is written in Java with a very small subset of C code. http://www.sunspotworld.com http://labs.oracle.com/projects/squawk/squawk-rjvm.html The Jikes RVM is written mostly in Java. http://jikesrvm.org/Presentations The Maxine VM is written mostly in Java http://148.87.46.199/projects/maxine/ The Oracle/Sun HotSpot is written in C++ http://en.wikipedia.org/wiki/HotSpot And this is just a small list, as there are quite a few JVMs around.
Dec 18 2011
next sibling parent reply so <so so.so> writes:
On Sun, 18 Dec 2011 22:08:54 +0200, Paulo Pinto <pjmlp progtools.org>  
wrote:

 The SunSpot VM is written in Java with a very small subset of C code.
 http://www.sunspotworld.com
 http://labs.oracle.com/projects/squawk/squawk-rjvm.html

 The Jikes RVM is written mostly in Java.
 http://jikesrvm.org/Presentations

 The Maxime VM is written mostly in Java
 http://148.87.46.199/projects/maxine/

 The Oracle/Sun HotSpot is written in C++
 http://en.wikipedia.org/wiki/HotSpot

 And this is just a small list, as there are quite a few JVMs around.
In each of these 4 cases you support Walter's point. He didn't say you can't write programs in Java or that you can't interoperate with other languages.
Dec 18 2011
parent reply Paulo Pinto <pjmlp progtools.org> writes:
so Wrote:

 On Sun, 18 Dec 2011 22:08:54 +0200, Paulo Pinto <pjmlp progtools.org>  
 wrote:
 
 The SunSpot VM is written in Java with a very small subset of C code.
 http://www.sunspotworld.com
 http://labs.oracle.com/projects/squawk/squawk-rjvm.html

 The Jikes RVM is written mostly in Java.
 http://jikesrvm.org/Presentations

 The Maxime VM is written mostly in Java
 http://148.87.46.199/projects/maxine/

 The Oracle/Sun HotSpot is written in C++
 http://en.wikipedia.org/wiki/HotSpot

 And this is just a small list, as there are quite a few JVMs around.
Each of these 4 cases you support Walter's point. He didn't say you can't write programs in Java or you can't interoperate with other languages.
quote: "... I find this an odd statement because the Java VM is written in C, so therefore it is on the same or fewer platforms than C. ..." That means a VM written in 100% C code, which is not the case for the VMs I have listed. For some of them the only C code is there to provide direct access to the hardware via JNI; even the JIT and Garbage Collector are written in Java.
Dec 18 2011
next sibling parent Andrew Wiley <wiley.andrew.j gmail.com> writes:
On Sun, Dec 18, 2011 at 3:14 PM, Paulo Pinto <pjmlp progtools.org> wrote:
 so Wrote:

 On Sun, 18 Dec 2011 22:08:54 +0200, Paulo Pinto <pjmlp progtools.org>
 wrote:

 The SunSpot VM is written in Java with a very small subset of C code.
 http://www.sunspotworld.com
 http://labs.oracle.com/projects/squawk/squawk-rjvm.html

 The Jikes RVM is written mostly in Java.
 http://jikesrvm.org/Presentations

 The Maxime VM is written mostly in Java
 http://148.87.46.199/projects/maxine/

 The Oracle/Sun HotSpot is written in C++
 http://en.wikipedia.org/wiki/HotSpot

 And this is just a small list, as there are quite a few JVMs around.
Each of these 4 cases you support Walter's point. He didn't say you can't write programs in Java or you can't interoperate with other languages.
quote: "... I find this an odd statement because the Java VM is written in C, so therefore it is on the same or fewer platforms than C. ..." Means a VM written in 100% C code, which is not the case for the VMs I have listed. Some of them the only C code is to provide direct access to the hardware via JNI, even the JIT and Garbage Collector are written in Java.
That's not the point. The point was that to get any of those VMs running on a given target platform, you have to start with C at some level. The end result may not be a pure-C VM depending on how many bootstrapping steps you go through, but you don't have any hope of running Java on a platform you can't target with a C compiler. Whether or not the VM you actually want to use is written in C doesn't really matter from this perspective. You need C to get a VM at all, so JVMs will always be available on the same or fewer platforms than C.
Dec 18 2011
prev sibling parent Walter Bright <newshound2 digitalmars.com> writes:
On 12/18/2011 1:14 PM, Paulo Pinto wrote:
 quote: "... I find this an odd statement because the Java VM is written in C,
 so therefore it is on the same or fewer platforms than C. ..."

 Means a VM written in 100% C code, which is not the case for the VMs I have
 listed. Some of them the only C code is to provide direct access to the
 hardware via JNI, even the JIT and Garbage Collector are written in Java.
If there is any C at all, it implies the existence of C on that platform, making the incidence of Java <= the incidence of C.
Dec 18 2011
prev sibling parent =?ISO-8859-1?Q?Alex_R=F8nne_Petersen?= <xtzgzorex gmail.com> writes:
On 18-12-2011 21:08, Paulo Pinto wrote:
 Walter Bright Wrote:

 On 12/18/2011 11:00 AM, Paulo Pinto wrote:
 Walter Bright Wrote:
 I find this an odd statement because the Java VM is written in C, so therefore
 it is on the same or fewer platforms than C.
Which specific Java VM are you talking about? They come in all flavors, written in Assembly, C, C++ and even Java.
I thought they were all written in C.
The SunSpot VM is written in Java with a very small subset of C code. http://www.sunspotworld.com http://labs.oracle.com/projects/squawk/squawk-rjvm.html The Jikes RVM is written mostly in Java. http://jikesrvm.org/Presentations The Maxime VM is written mostly in Java http://148.87.46.199/projects/maxine/ The Oracle/Sun HotSpot is written in C++ http://en.wikipedia.org/wiki/HotSpot And this is just a small list, as there are quite a few JVMs around.
It is also worth mentioning Joeq (not a Java VM per se, but it does support Java bytecode loading), which is written in Java. - Alex
Dec 18 2011
prev sibling parent reply Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
On 12/18/11 1:00 PM, Paulo Pinto wrote:
 Walter Bright Wrote:

 On 12/17/2011 10:36 PM, Russel Winder wrote:
 In all of this, the issue of portability of code has seemingly been
 missed.  One of the main reasons for Java in 1995 (other than the
 trendiness of Web browser programming) was portability across all
 platforms.  This made the sys admin of provision of resources  for
 programming classes significantly less than it was.  C, C++ and D cannot
 match this even today.  Back then it was a Big Win (tm).
I find this an odd statement because the Java VM is written in C, so therefore it is on the same or fewer platforms than C.
Which specific Java VM are you talking about? They come in all flavors, written in Assembly, C, C++ and even Java.
Bootstrapping the Java implementation regresses to the same question, so it can be safely discounted. Far as I can tell C is on more platforms than C++ so we can discount C++ as well. It would be interesting to hear about Java implementations written entirely in assembler. Andrei
Dec 18 2011
parent Jacob Carlborg <doob me.com> writes:
On 2011-12-18 21:29, Andrei Alexandrescu wrote:
 On 12/18/11 1:00 PM, Paulo Pinto wrote:
 Walter Bright Wrote:

 On 12/17/2011 10:36 PM, Russel Winder wrote:
 In all of this, the issue of portability of code has seemingly been
 missed. One of the main reasons for Java in 1995 (other than the
 trendiness of Web browser programming) was portability across all
 platforms. This made the sys admin burden of provisioning resources for
 programming classes significantly less than it was. C, C++ and D cannot
 match this even today. Back then it was a Big Win (tm).
I find this an odd statement because the Java VM is written in C, so therefore it is on the same or fewer platforms than C.
Which specific Java VM are you talking about? They come in all flavors, written in Assembly, C, C++ and even Java.
Bootstrapping the Java implementation regresses to the same question, so it can be safely discounted. Far as I can tell C is on more platforms than C++ so we can discount C++ as well. It would be interesting to hear about Java implementations written entirely in assembler. Andrei
JNode is: "Java technology based operating system implemented in the Java language with a very small assembler nano-kernel". http://www.jnode.org/ -- /Jacob Carlborg
Dec 18 2011
prev sibling parent reply "ddverne" <droidevr hotmail.com> writes:
On Sunday, 18 December 2011 at 07:09:21 UTC, Walter Bright wrote:
 A programmer who doesn't know assembler is never going to write 
 better than second rate programs.
Please, I don't want to flame this thread or anything like that, but isn't this a lack of modesty, or at least a little odd? The phrase "Whoever never wrote anything in ASM will never make a first-rate program" is a bit odd, because to me it's like saying: "A programmer who never programs on punched cards is never going to write a first-rate program". Finally, what I mean is: will saying that bring something good for the community? Or should a new programmer stop his D programming studies and start with Assembly?
Dec 19 2011
next sibling parent reply "dsimcha" <dsimcha yahoo.com> writes:
On Monday, 19 December 2011 at 19:52:41 UTC, ddverne wrote:
 On Sunday, 18 December 2011 at 07:09:21 UTC, Walter Bright 
 wrote:
 A programmer who doesn't know assembler is never going to 
 write better than second rate programs.
Please, I don't want to flame this thread or anything like that, but isn't this a lack of modesty, or at least a little odd? The phrase "Whoever never wrote anything in ASM will never make a first-rate program" is a bit odd, because to me it's like saying: "A programmer who never programs on punched cards is never going to write a first-rate program". Finally, what I mean is: will saying that bring something good for the community? Or should a new programmer stop his D programming studies and start with Assembly?
That misses the point. Assembly language teaches the fundamentals of how a computer works at a low level. It's similar to learning Lisp in that it makes you better able to reason about programming even if you never actually program in it. The only difference is that Lisp stretches your reasoning ability towards the highest abstraction levels, while assembly language does it for the lowest. Programming on punchcards is equivalent to typing: it is/was sometimes a necessary practical skill, but there's nothing conceptually deep about it that makes it worth learning even if it's not immediately practical.
Dec 19 2011
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 12/19/2011 12:38 PM, dsimcha wrote:
 Programming on punchcards is equivalent to typing: It is/was sometimes a
 necessary practical skill, but there's nothing conceptually deep about it that
 makes it worth learning even if it's not immediately practical.
The only worthwhile skill with punchcards is trying to delicately punch out every hole without breaking any of the bridges between holes. Well, that and throwing handfuls of chad at other people. (Punchcard chad is particularly annoying because it has sharp corners and hooks onto everything, making it hard to clean up.) I should have kept some of my old punchcard decks.
Dec 19 2011
next sibling parent reply bearophile <bearophileHUGS lycos.com> writes:
Walter:

 The only worthwhile skill with punchcards is trying to delicately punch out 
 every hole without breaking any of the bridges between holes.
Given the amount of time it takes to punch the cards, waiting for your turn to run the program, and reading the printouts, I think punchcards also teach you to use your brain first and to think before doing/trying things, instead of going by trial and error. Trial and error is an efficient strategy only if you have interactive tools that speed up the cycle and the problems to solve are not too hard. Bye, bearophile
Dec 19 2011
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 12/19/2011 1:35 PM, bearophile wrote:
 Given the amount of time it takes to punch the cards, waiting for your turn
 to run the program, and reading the printouts, I think punchcards also teach
 you to use your brain first and to think before doing/trying things, instead
 of going by trial and error. Trial and error is an efficient strategy only if
 you have interactive tools that speed up the cycle and the problems to solve
 are not too hard.
I've never seen any evidence that punchcards made one a better programmer. For sure, one wrote far fewer programs, and infinitely shorter ones, with punchcards, and so simply lack of experience would make one worse. As a programmer who initially learned with punchcards, using an interactive tty is far, far, FAR more productive. And using a full screen editor is another HUGE jump in productivity. Ditto for going to big screens and multiple windows. There are many things I miss about the olden days of programming, but punchcards, paper tape, and ASR-33 teletypes are not among them. While I'm at it, cassette tapes, floppies and modems I always hated and am glad to be done with.
Dec 19 2011
next sibling parent bearophile <bearophileHUGS lycos.com> writes:
Walter:

 I've never seen any evidence that punchcards made one a better programmer. For 
 sure, one wrote far fewer programs, and infinitely shorter ones, with 
 punchcards, and so simply lack of experience would make one worse.
This is right. Nowadays chess players across the world are better than they were some time ago, just because they have more opportunities to play hard games against very good players (thanks to the web and thanks to very good chess programs). As you say, modern computers allow much more programming practice, and this makes better programmers. So overall I agree that modern computers/software are much better. On the other hand, I think punchcards were able to teach some self-discipline. I have seen programmers (myself too, sometimes) waste 30 minutes trying and trying again, when five minutes of focused thinking would probably have been enough to solve the problem. Somehow you need to learn when it's the right time to step away from the computer and think, maybe with some paper and pencil too, inventing little graphical abstractions to aid your thinking. Bye, bearophile
Dec 19 2011
prev sibling parent Russel Winder <russel russel.org.uk> writes:
On Mon, 2011-12-19 at 14:39 -0800, Walter Bright wrote:
 On 12/19/2011 1:35 PM, bearophile wrote:
 Given the amount of time it takes to punch the cards, waiting for your turn
 to run the program, and reading the printouts, I think punchcards also teach
 you to use your brain first and to think before doing/trying things, instead
 of going by trial and error. Trial and error is an efficient strategy only if
 you have interactive tools that speed up the cycle and the problems to solve
 are not too hard.
 I've never seen any evidence that punchcards made one a better programmer. For
 sure, one wrote far fewer programs, and infinitely shorter ones, with
 punchcards, and so simply lack of experience would make one worse.

 As a programmer who initially learned with punchcards, using an interactive tty
 is far, far, FAR more productive.
Definitely. There were techniques and skills for working at the time, but these have long since become unnecessary. The past has a lot to teach (cf. actors, dataflow, CSP, etc.) but we need to be selective so as to avoid too much of the "rose coloured spectacles" effect.
 And using a full screen editor is another HUGE jump in productivity. Ditto for
 going to big screens and multiple windows.
cards < teletype < monitor terminal < windowing system
 There are many things I miss about the olden days of programming, but
 punchcards, paper tape, and ASR-33 teletypes are not among them. While I'm at
 it, cassette tapes, floppies and modems I always hated and am glad to be
 done with.
Indeed.
--
Russel.
======================================================================
Dr Russel Winder      t: +44 20 7585 2200   voip: sip:russel.winder ekiga.net
41 Buckmaster Road    m: +44 7770 465 077   xmpp: russel russel.org.uk
London SW11 1EN, UK   w: www.russel.org.uk  skype: russel_winder
Dec 19 2011
prev sibling next sibling parent Russel Winder <russel russel.org.uk> writes:
On Mon, 2011-12-19 at 13:26 -0800, Walter Bright wrote:
[...]
 I should have kept some of my old punchcard decks.
I have about 7,000 Fortran cards left, they are all pristine. They are great for shopping lists and writing reminders during meetings. I haven't tried actually punching holes in them since 1985.
--
Russel.
Dec 19 2011
prev sibling parent Jonathan M Davis <jmdavisProg gmx.com> writes:
On Tuesday, December 20, 2011 06:40:40 Russel Winder wrote:
 On Mon, 2011-12-19 at 13:26 -0800, Walter Bright wrote:
 [...]
 
 I should have kept some of my old punchcard decks.
I have about 7,000 Fortran cards left, they are all pristine. They are great for shopping lists and writing reminders during meetings. I haven't tried actually punching holes in them since 1985.
LOL. That's all I've ever used them for (born in '82). My grandfather had a data processing business back in the '70s, so he had a lot of punchcards laying around that we used for taking notes and shopping lists and the like. - Jonathan M Davis
Dec 19 2011
prev sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 12/19/2011 11:52 AM, ddverne wrote:
 On Sunday, 18 December 2011 at 07:09:21 UTC, Walter Bright wrote:
 A programmer who doesn't know assembler is never going to write better than
 second rate programs.
Please, I don't want to flame this thread or anything like that, but isn't this a lack of modesty, or at least a little odd? The phrase "Whoever never wrote anything in ASM will never make a first-rate program" is a bit odd, because to me it's like saying: "A programmer who never programs on punched cards is never going to write a first-rate program". Finally, what I mean is: will saying that bring something good for the community? Or should a new programmer stop his D programming studies and start with Assembly?
Sure, you can interpret that as arrogance on my part, with justification. On the other hand, I have a lot of experience working with people who do know assembler, and who do not. I see the effects of not knowing it in their code, and in the types of problems they are unable to solve without assistance. You are going to be a better C, C++, or D programmer if you're comfortable with assembler.
Dec 19 2011
next sibling parent reply "ddverne" <droidevr hotmail.com> writes:
On Monday, 19 December 2011 at 20:48:28 UTC, Walter Bright wrote:
 Sure, you can interpret that as arrogance on my part, with 
 justification.

 On the other hand, I have a lot of experience working with 
 people who do know assembler, and who do not. I see the effects 
 of not knowing it in their code, and in the types of problems 
 they are unable to solve without assistance.

 You are going to be a better C, C++, or D programmer if you're 
 comfortable with assembler.
Just a note, and I will rest my case. I'm not saying that what you said about assembler programmers writing better programs/code is right or wrong; instead, I'm saying that you should avoid this type of commentary in a community where you are one of the leaders, because some people may feel offended and your image can be tarnished. And no matter what anyone says, I respect you a lot for your work.
Dec 19 2011
parent "Vladimir Panteleev" <vladimir thecybershadow.net> writes:
On Tuesday, 20 December 2011 at 00:55:35 UTC, ddverne wrote:
 instead, I'm saying that you should avoid this type of 
 commentary on community where you are one of the leaders 
 because some people may felt offended and you can have your 
 image tarnished.
http://www.youtube.com/watch?v=cycXuYzmzNg http://i.imgur.com/Dhq6B.jpg :)
Dec 19 2011
prev sibling next sibling parent reply Paulo Pinto <pjmlp progtools.org> writes:
Better than learning Assembly, people should also spend some time learning data
structures and advanced algorithms. Have a read of Knuth's books or similar.

Being 35, I grew up with Assembly, but nowadays what makes my heart cry is the
quality of code I see written every day by our junior programmers. Sometimes I
ask myself if they know how to program at all. In their case I have nightmares
about letting them anywhere near a C or C++ compiler!

I think that it is more important that developers learn proper data structures
and algorithms together with computer architecture than just Assembly,
especially if you are dealing with heterogeneous computing as it is becoming
standard nowadays.

--
Paulo


Walter Bright Wrote:

 On 12/19/2011 11:52 AM, ddverne wrote:
 On Sunday, 18 December 2011 at 07:09:21 UTC, Walter Bright wrote:
 A programmer who doesn't know assembler is never going to write better than
 second rate programs.
Please, I don't want to flame this thread or anything like that, but isn't this a lack of modesty, or at least a little odd? The phrase "Whoever never wrote anything in ASM will never make a first-rate program" is a bit odd, because to me it's like saying: "A programmer who never programs on punched cards is never going to write a first-rate program". Finally, what I mean is: will saying that bring something good for the community? Or should a new programmer stop his D programming studies and start with Assembly?
Sure, you can interpret that as arrogance on my part, with justification. On the other hand, I have a lot of experience working with people who do know assembler, and who do not. I see the effects of not knowing it in their code, and in the types of problems they are unable to solve without assistance. You are going to be a better C, C++, or D programmer if you're comfortable with assembler.
Dec 19 2011
next sibling parent Walter Bright <newshound2 digitalmars.com> writes:
On 12/19/2011 11:41 PM, Paulo Pinto wrote:
 I think that it is more important that developers learn proper data
 structures and algorithms together with computer architecture than just
 Assembly, especially if you are dealing with heterogeneous computing as it is
 becoming standard nowadays.
There's no way I would advocate learning "just assembly". Learning assembler is a very important component of mastering programming, but there are many other components.
Dec 20 2011
prev sibling parent Russel Winder <russel russel.org.uk> writes:
On Tue, 2011-12-20 at 02:41 -0500, Paulo Pinto wrote:
[...]
 I think that it is more important that developers learn proper data
 structures and algorithms together with computer architecture than
 just Assembly, especially if you are dealing with heterogeneous
 computing as it is becoming standard nowadays.
This last is a crucial factor in the next 2-5 years. Intel has on its road map processor chips with multiple CPUs and GPUs. This means languages have to be ready for heterogeneous execution models. The JVM is completely and totally unready for this shift in processor architectures, and is unlikely to be able to do anything useful before Java 11 or Java 12.

C and C++ currently have the advantage of being the languages of choice for Intel, and so integration of CPU code and GPU kernels (most likely via OpenCL) will get their wholehearted support, along with Apple's. Does D have a position with respect to OpenCL?

We really ought to change the Subject if this thread goes any further.
--
Russel.
Dec 22 2011
prev sibling parent reply J Arrizza <cppgent0 gmail.com> writes:
On Mon, Dec 19, 2011 at 12:48 PM, Walter Bright
<newshound2 digitalmars.com>wrote:

 On 12/19/2011 11:52 AM, ddverne wrote:

 On Sunday, 18 December 2011 at 07:09:21 UTC, Walter Bright wrote:

 A programmer who doesn't know assembler is never going to write better
 than
 second rate programs.
  You are going to be a better C, C++, or D programmer if you're
 comfortable with assembler.
In my university the assembler course was a weeder course. If you passed it you got into second year (750 entrants, 150 openings). My point is that being comfortable with assembler is likely an effect, not a cause. If you have the motivation and skills to pick up assembler in a semester then you are probably going to be a better programmer in the end simply because of your motivation and skills, not necessarily from knowing assembler. OTOH my first exposure to programming was hand assembly of machine code on a MIKBUG-based SWTPC. When I used an actual assembler it was, "thank you gxd for making my life a whole hell of a lot easier!" C was the next step in ease. You mean I don't have to actually keep track of every register's content? And so on up the tree of abstraction I went. In the end, this progression has been extremely beneficial in visualizing how all that abstract source code translates down into machine code. Memory allocation, speed and size optimization, etc. etc. make a lot more sense when you know how the machine behaves at a fundamental level. And on the other-other hand, the bottom line is this: wetware causes the problems in sw development. How can a language feature help fix or prevent those problems? And of course all that balanced against the need for some developers to break the speed limit. John
Dec 20 2011
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 12/20/2011 8:29 AM, J Arrizza wrote:
 My point is being comfortable with assembler is likely an effect not a cause.
If
 you have the motivation and skills to pick up assembler in a semester then you
 are probably going to be a better programmer in the end simply because of your
 motivation and skills, not necessarily from knowing assembler.
I don't agree, as I had been programming for two years before I learned assembler. My high level code made dramatic improvements after that.
 In the end, this progression has been extremely beneficial in visualizing how
 all that abstract source code translates down into machine code. Memory
 allocation, speed and size optimization, etc. etc. make a lot more sense when
 you know how the machine behaves at a fundamental level.
Yes, exactly. Also, knowing assembler can get you out of many jams that otherwise would stymie you - such as running into a code gen bug. Code gen bugs are not a thing of the past. I just ran into one with lcc on the mac.
Dec 20 2011
next sibling parent reply "ddverne" <droidevr hotmail.com> writes:
On Tuesday, 20 December 2011 at 19:18:30 UTC, Walter Bright wrote:
 I don't agree, as I had been programming for two years before I 
 learned assembler. My high level code made dramatic 
 improvements after that.
I'm really curious, could you give us some examples of those improvements?
Dec 20 2011
parent Walter Bright <newshound2 digitalmars.com> writes:
On 12/20/2011 11:29 AM, ddverne wrote:
 On Tuesday, 20 December 2011 at 19:18:30 UTC, Walter Bright wrote:
 I don't agree, as I had been programming for two years before I learned
 assembler. My high level code made dramatic improvements after that.
I'm really curious, could you give us some examples of those improvements?
Performance.
Dec 20 2011
prev sibling parent J Arrizza <cppgent0 gmail.com> writes:
On Tue, Dec 20, 2011 at 11:18 AM, Walter Bright
 I don't agree, as I had been programming for two years before I learned
 assembler. My high level code made dramatic improvements after that.
It's not my style to pass out compliments, but, well, hey, you can't really use yourself as a typical example of a typical learning curve. And neither can Andrei, or many others on this list. So, I'll stick to my point. In fact, you all provide strong evidence for it.
Dec 20 2011
prev sibling next sibling parent Caligo <iteronvexor gmail.com> writes:
On Sat, Dec 17, 2011 at 11:01 PM, Jonathan M Davis <jmdavisProg gmx.com>wrote:

 In my experience, it's the professors who get to choose what they're
 teaching
 and the main reason that Java is used is a combination of its simplicity
 and
 the fact that it's heavily used in the industry. C and C++ have a lot more
 pitfalls which make learning harder for newbie programmers. Java does more
 for
 you (like garbage collection instead of manual memory management) and has
 fewer ways to completely screw yourself over, so it makes more sense as a
 teaching language than C or C++. And since the primary focus is teaching
 the
 principles of computer science rather than a particular programming
 language,
 the exact language usually doesn't matter much.
In my experience professors only get to choose what to wear to class, lol. It's interesting how many professors choose the same exact textbook for the same courses they teach. And it's also interesting how those textbooks cost 10 times more than the equivalent book covering the same material. Some professors even give out the same exams as other professors in different universities. So, no, I don't think professors get to choose either. It's as if they are given a script, and they have to follow it pretty closely (ABET might have something to do with this, idk). I've had many professors who severely rejected the idea of using something else besides Java for a given project, and I never understood why (even in junior and senior years).

Python is just as simplistic as Java, used heavily in the industry, and a more elegant language. So, what's the excuse for not allowing something like Python? Oh, maybe because it's an open source project and no corporation has direct control over it, no?

It's also interesting to see how the choice of Java in schools and universities has NOT produced better computer scientists and software engineers. I've lost count of the people I've worked with in group projects who had no freaking clue as to what they were doing. I've even had TAs working on their PhDs who couldn't compile 200 lines of code written in *sigh* Maybe I should have gone to a private school.
 Now, this _does_ have the effect that the majority of college students are
 most
 familiar and comfortable with Java, and so that's what they're generally
 going
 to use (so there _is_ a lot of indoctrination in that sense), but that's
 pretty much inevitable. You use what you know. Ultimately though, that's
 what's likely to happen with most any university simply because teaching
 programming languages is not the focus - teaching computer science is. And
 for
 most of that, the language isn't particularly relevant.

 And Java was successful before they started using it in universities, or it
 likely wouldn't have been used much in them in the first place. It's just
 that
 that has a feedback effect, since the increased used in universities tends
 to
 increase its use in industry, which tends to then make the universities
 more
 likely to select it or to stick with it as long as they don't have a solid
 reason to switch. But I believe that the initial switch was a combination
 of
 the fact that its popularity in industry was increasing and the fact that
 it
 works much better as a teaching language than C or C++. It's not because of
 anything that corporations did (aside from saying that they used it), since
 Java isn't a language where donating stuff or discounting really helps
 (unlike
 C++), since almost all of the tools for Java are free.

 - Jonathan M Davis
Well, I disagree because Java in the beginning was a complete failure as a language, and they looked for ways to market it. To them it was a product rather than a programming language that was going to help them make money and have control over the industry. Nearly the same exact thing happened with Microsoft Windows: an inferior OS that suddenly became popular and has helped generate billions of dollars of profit and control over 90% of the desktop market share. Java being a great teaching language is something that not everyone will accept. Allowing diversity in schools so that students and professors get to choose what programming language they want to learn and teach, without pressure from the industry, is something that I think most will agree needs to happen.
Dec 17 2011
prev sibling next sibling parent reply Russel Winder <russel russel.org.uk> writes:
On Sun, 2011-12-18 at 00:38 -0600, Caligo wrote:
[...]
 In my experience professors only get to choose what to wear to class, lol.
:-)
 It's interesting how many professors choose the same exact text book for
 the same courses they teach.  And it's also interesting how those textbooks
 cost 10 times more than the equivalent book covering the same material.
 Some professors even give out the same exams as other professors in
 different universities.  So, no, I don't think professors get to choose
 either.  It's as if they are given a script, and they have to follow it
 pretty closely (ABET might have something to do with this, idk).  I've had
 many professors who severely rejected the idea of using something else
 besides Java for a given project, and I never understood why (even in
 junior and senior years).
Most students no longer buy textbooks at all. The bottom has fallen out of the market.
 Python is just as simplistic as Java, used heavily in the industry, and a
 more elegant language.  So, what's the excuse for not allowing something
 like Python?  oh, maybe because it's an open source project and no
 corporation has direct control over it, no?
Python is simple but not simplistic. Many educators are now turning to using Python as the first teaching language. Python is also used in industry and commerce, so it is not just a teaching language. Almost all post-production software uses C++ and Python. Most HPC is now Fortran, C++ and Python.

This latter would be a great area for D to try and break into, but sadly I don't think it would now be possible.

[...]
 Well, I disagree because Java in the beginning was a complete failure as a
 language, and they looked for ways to market it.  To them it was a product
 rather than a programming language that was going to help them make money
 and have control over the industry.  Nearly the same exact thing happened
 with Microsoft Windows: an inferior OS that suddenly became popular and has
 helped generate billions of dollars of profit and control over 90% of the
 desktop market share.
I am not sure this analysis works; certainly Java was not a failure from the outset. If it were Oracle in 1995 then yes, the market-driven analysis might work, but Sun didn't really work that way at that time. They were then still a hardware company that did software to sell more hardware. Later things changed, cf. JavaCard. HotJava was certainly an innovation, but it ultimately failed. Java switched to traditional client and, more effectively, server. Though you can trace the effects of HotJava through all the browsers and to HTML5.
 Java being a great teaching language is something that not everyone will
 accept.  Allowing diversity in schools so that students and professors get
 to choose what programming language they want to learn and teach, without
 pressure from the industry, is something that I think most will agree needs
 to happen.
Having been in the vanguard of using Java as a first teaching language in 1995--1996, I am now very much of the view that to use Java as a first teaching language now is a gross error. Second or third language, no problem, just not the first.
--
Russel.
Dec 17 2011
parent reply dsimcha <dsimcha yahoo.com> writes:
On 12/18/2011 2:14 AM, Russel Winder wrote:
 Python is also used in industry and commerce, so it is not just a
 teaching language.  Almost all post-production software uses C++ and
 Python.  Most HPC is now Fortran, C++ and Python.

 This latter would be a great area for D to try and break into, but sadly
 I don't hink it would now be possible.
Please elaborate. I think D for HPC is a terrific idea. It's the only language I know of with all of the following four attributes: 1. Allows you to program straight down to the bare metal with zero or close to zero overhead, like C and C++. 2. Interfaces with C and Fortran legacy code with minimal or no overhead. 3. Has modern convenience/productivity features like GC, a real module system and structural typing via templates. 4. Has support for parallelism in the standard library. (I'm aware of OpenMP, but in my admittedly biased opinion std.parallelism is orders of magnitude more flexible and easier to use.)
Dec 18 2011
parent reply Russel Winder <russel russel.org.uk> writes:
On Sun, 2011-12-18 at 10:30 -0500, dsimcha wrote:
[...]
 Please elaborate.  I think D for HPC is a terrific idea.  It's the only
 language I know of with all of the following four attributes:
For years HPC was Fortran only (*). Many of the codes used today were written in the 1970s, and Fortran compiler technology revolves around making those sequential codes run on parallel kit. During the 1990s there was a move to writing new codes in C++ -- especially in the HEP area. Of course many of these codes rely on being able to call into the Fortran codes.

With the rise of data visualization there was a need for a language for writing GUIs and the like, and they have decided on Python since it connects to C and C++ trivially. They also found SciPy and NumPy, which actually leads to prototyping new codes in Python and then rewriting in Fortran or C++ if there is a need to -- which often there is not. With PyPy now running faster than CPython there is a possibility that new codes other than core computation frameworks will always be in Python.

Given the time and effort it took to get C++ accepted by this Fortran-oriented community, I am not hopeful about another language's chances.
 1.  Allows you to program straight down to the bare metal with zero or
 close to zero overhead, like C and C++.
Or Fortran.
 2.  Interfaces with C and Fortran legacy code with minimal or no overhead.
This is essential. It is also one of Go's biggest problems: it cannot easily link to existing codes.
 3.  Has modern convenience/productivity features like GC, a real module
 system and structural typing via templates.
The bulk of the HEP community might know the words but only if it is in Fortran -- or C++.
 4.  Has support for parallelism in the standard library.  (I'm aware of
 OpenMP, but in my admittedly biased opinion std.parallelism is orders of
 magnitude more flexible and easier to use.)
The main current tools are MPI for cluster and multicore parallelism using SPMD and MIMD models, and OpenMP for thread management on multicore systems. Fortran, C, C++, and Python all have MPI capability, and Fortran, C, and C++ have OpenMP -- OpenMP isn't relevant to Python. This has been the model for 20 years and is likely immovable. No matter how good a different model is, it is fighting a war against an entrenched technology. The issue is not really a technical one; it is one of inertia and expectation, even a political one.

C++ broke into the Fortran-dominated area by fiat of a group of people in the HEP community. Barton & Nackman's "Scientific and Engineering C++: An Introduction with Advanced Techniques and Examples" was also an important factor.

The question is: what is the need, and the vector, for D to gain traction in this area where it really does have a USP?

(*) FORTRAN and Fortran are different languages. The case was changed formally with the Fortran 1995 standard.

--
Russel.
=============================================================================
Dr Russel Winder      t: +44 20 7585 2200   voip: sip:russel.winder ekiga.net
41 Buckmaster Road    m: +44 7770 465 077   xmpp: russel russel.org.uk
London SW11 1EN, UK   w: www.russel.org.uk  skype: russel_winder
Dec 18 2011
next sibling parent Walter Bright <newshound2 digitalmars.com> writes:
On 12/18/2011 8:20 AM, Russel Winder wrote:
 (*) FORTRAN and Fortran are different languages.  The case was changed
 formally with the Fortran 1995 standard.
*Finally*!
Dec 18 2011
prev sibling parent Paulo Pinto <pjmlp progtools.org> writes:
I saw it live in CERN when I stayed there during 2003-04 timeframe.

Depending on the research group, the code was either mostly Fortran
or C++.

Python is used everywhere from running builds, automate data acquisition
or show nice data GUIs. On my group, TDAQ-Atlas, Java was actually used for
the data rendering part.

--
Paulo

Dec 18 2011
prev sibling next sibling parent Caligo <iteronvexor gmail.com> writes:
On Sun, Dec 18, 2011 at 12:17 AM, Russel Winder <russel russel.org.uk>wrote:

 On Sat, 2011-12-17 at 22:45 -0600, Caligo wrote:
 [...]

 I thought this thread had finished, but...
 That's like saying people should take Coke and Pepsi more seriously
because
 they have bigger market shares when in reality all you need is water.
 Money isn't real, you know?
Taking that paragraph out of the context of the previous emails in the thread leads to misinterpretation of what was being said.
Taking things out of context wasn't my intention. I said what I said because, based on what you said, it seemed to me that your definition of the word "success" depended heavily on the size of the market share claimed by something.
 D is already a success, a BIG success.  Walter and Andrei (and the
amazing
 community, of course) have created a programming language that is light
 years ahead of C++, Java and Go.

 I don't think you know this, but every high school student who takes a
 computer science course is required to learn Java.  It doesn't stop there:
I didn't know this, but I guess it is only a factor in the USA. In the rest of the world, it is almost certainly not the case. It definitely isn't in the UK.
The education system is really bad here in the U.S. It's a joke, really. If you question the education system, or anything the government does for that matter, you are being un-American. At one point I was asked to leave the country when I challenged the education system at the university. Watch the documentary "Waiting for Superman" to get an idea of how bad it is, and that's everything before university.

I know certain countries offer better education, but I still think the issue is global. I've had professors who have PhDs from big universities like Princeton, yet they can't speak or write proper English. How the hell does that happen? And then there are those teaching data structures and algorithms, or software engineering, who wouldn't have a clue what they are teaching if it weren't for the textbooks.
 I suspect the major problem with most educators -- only in USA and Italy
 are all educators called professors, in places like UK, France and
 Germany, professor is a title that has to be earned and is a matter of
 status within the system -- is that they are themselves under-educated.
 Far too many educators teaching programming cannot themselves program.
 Their defensive reaction is to enforce certain choices on their
 students.  Sometimes there is a reasonable rationale -- you don't write
 device drivers in Prolog, well unless you are using ICL CAFS -- but
 generally the restriction is because the educator doesn't know any other
 programming language than the one they enforce.
I agree.
 reason they restrict education to things like Java and C++ has very
little
 to do with the fact that those languages have claimed big market share;
 rather, it's because corporations have had a vested interest in
 universities in the first place and they receive what they order.  Just
 look at what Microsoft has been doing in universities: everything from
 "free" gifts such as free copies of Windows OS and Visual Studio Ultimate
 that cost thousands of dollars to sponsoring various kinds of events.
The
 students who are influenced by such tactics, to whom do you think they
are
 going to be loyal to?
I worry more about what McDonald's does in primary schools. Getting pupils to learn to count by counting Big Macs strikes me as the worst sort of indoctrination.

Your description of Microsoft's behaviour is a natural consequence of the economic system. Using a marketing budget to indoctrinate people into buying your product is all that is going on. Some companies realize that spending that money on molding the minds of 2--12 year olds is the way of creating an income stream in the future.
I agree.
 The _main point_ here is that if students had been give the choice to
learn
 a programming language of their choosing, many of the so called
 "successful" programming languages would not have been so "successful"
 today.  So next time you decide to lecture someone on how popular or
 "successful" Java is, just remember how it got to be so "successful".
The single most important factor here is that people learn to program using more than one paradigm of computation, and thus more than one language -- FOOPLOG and Scala are special cases that are interesting for later study but not for initial study. The importance of multiple paradigms isn't just waffle: the psychology-of-programming folk have been doing longitudinal studies over the last 20 years that show that people learn more, faster, and end up being better programmers. People who learn with one language and use that language for the rest of their programming lives are in general poorer programmers because of it.
I agree. The same is true of natural languages. I speak three natural languages (and 4 artificial ones: C++, D, Python, Haskell), and I'm trying to learn French in my free time. It wasn't until I discovered D that I realized how broken C++ is, and it wasn't until I discovered Haskell that I realized how much I'd been missing out on.

I think in the future, say 500 years from now, there will exist only one natural language for people to speak, because the people of the world will have united as one. But I can't begin to imagine what artificial languages will look like in 100 years. I think D has a really good chance of being The Hundred-Year Language: http://www.paulgraham.com/hundred.html :-)
 [...]
 I do see the entirety of the economic system of the world, and, no, it's
 NOT called capitalism.  It's called the Monetary System.  Capitalism,
 Socialism, Communism, etc,... they are all inherently the same because
they
 are all based on the Monetary System.  Money is created out of debt, and
 money is inherently scarce.  Differential advantage and exploitation is
 name of the game, regardless of the form of government you have.  And as
 far as I know, India isn't even in the top five;  USA, China, and Japan
are
 in the top three.
I beg to differ on the detail, but this is almost certainly not the forum to have this debate. You are right that the obvious political labelling is not the whole story, but neither is the analysis using the monetary system; it is a multi-dimensional problem. In the end, though, people in positions of power will do their very best to exploit those who are not.
The Monetary System is in fact the root cause of all the world's problems. It's an invention that has outlived its usefulness, and it needs to go. But you are right, this isn't the place for such discussions.
 [...]

 I choose to ignore Java for technical and non-technical reasons.  Unlike
 you, I don't need to spend years of my life doing Java programming to
 realize what a joke it is, and I have never seen a case where Java was
just
 as fast as C++.  This is one of those myths, or corporate propaganda, that's
 been propagated by educated idiots.  I and the teams I've been a member of
Have you actually done the benchmarks to back up this claim? I have. I really rather object to being labelled an educated idiot.
Everything I have experienced has shown that Java sucks when it comes to performance: everything from my Android phones (I've owned three so far, and I'll be switching to either iPhone or landline next time) to all the CS problems that I've solved, to all the desktop applications I've used that are written in Java. I mean, how much more proof does one need? I can agree that in certain situations Java is just as fast as C++, but I don't understand why it's being used for almost everything else (oh wait, I forgot, corporations and money have control over common sense). It just produces bad results.

I mean, I have a Samsung Galaxy S running Android 2.2. More than half the time the phone is a pain in the butt to use: it's sluggish, and sometimes I have to wait 10-20 seconds for the thing to become responsive. It's a bloody slide show. The iPhone, on the other hand, is smooth and usable. I hate to advertise the iPhone, but I think Google made a mistake by going with Java for Android.
 have solved countless CS problems that have required every kind of data
 structure and algorithm, and not once have I seen Java come close to
 C/C++.  On average, Java has been about 20 times slower than C++ and
 requiring on average 50 times more memory when it came to solving those
 problems.  If you honestly believe that Java can be just as fast as C++,
 then go to http://www.spoj.pl/ and pick a problem and submit a solution
in
 Java that's no more than 3 times slower than C/C++ and requires no more
 than 10 times more memory.
Java certainly does use appalling amounts of memory, no argument there.

For the reason you (should) know perfectly well, Java is never going to perform well on the sort of benchmarks these sites look at, because they use a testing protocol that is inherently and systematically biased in favour of running a program from a cold start. C, C++, and D will always thrash Java in this context. Where Java competes very well with C, C++ and D is in long-running computations.

If you want to look at even more biased benchmarking, look at http://shootout.alioth.debian.org/ -- it is fundamentally designed to show that C is the one true language for writing performance computation.
o.k. I don't know what else to say, really. [...]
 I'm not easily offended, and I've learned to let go.  I love to be proven
 wrong because that's when I learn something new.  I think you are having
a
 harder time with this than you realize, and it's easy to understand why:
 you have spent years of your life with pointless creations such as Java,
 and they are now part of your identity.  Of course you are going to get
 upset when someone labels Java as something of a joke because you take
that
 statement personally and see it as an attack on who you are.  It's okay.
 Just learn to let go.  You still have time.
You really have totally missed where I am coming from, so I think it best you simply stop trying to analyse my position. I am not even going to try and defend myself against the slur you have made against me, it is not true, and if you had read my emails rather than trying to be an amateur psycho-analyst, you would already know this.
You reacted to my original comment about Java the same way Bob reacted when I told him that Ford pick-up trucks suck. If someone takes something personally, that tells me that it's part of their identity. That's pretty good for an amateur, wouldn't you say? But I'm glad I was wrong, because I've learned something new.
Dec 18 2011
prev sibling parent Russel Winder <russel russel.org.uk> writes:
On Sun, 2011-12-18 at 03:03 -0600, Caligo wrote:
[...]
 You reacted to my original comment about Java the same way Bob reacted when
 I told him that Ford pick-up trucks suck.  If someone takes something
 personally, that tells me that it's part of their identity.  That's pretty
 good for an amateur, wouldn't you say?  But I'm glad I was wrong, because
 I've learned something new.
Excellent. I think the moral here is that "first impressions aren't always correct". I will try and watch "Waiting for Superman", thanks for the pointer.

With Paul Graham and now Uncle Bob Martin doing the circuit pushing Lisp and Clojure as the language to end all languages, everything really rests on whether Clojure really does the job. It is actually a good language; I like it and use it, but I am not sure it will supplant Java as the language on the JVM, for most of the reasons we have passed through in this thread previously: inertia, lack of education, politics, management incompetence, etc.

--
Russel.
Dec 19 2011
prev sibling next sibling parent Mike Parker <aldacron gmail.com> writes:
On 12/5/2011 2:51 AM, Nick Sabalausky wrote:
 "Somedude"<lovelydear mailmetrash.com>  wrote in message
 news:jbeva3$2785$1 digitalmars.com...
 - hundreds of plugins
If it needs that many plug-ins then something is very, very wrong. For example, maybe some of those should be built-in. And if you meant "hundreds" literally (but I'll give you the benefit of the doubt that you're just exaggerating since it definitely sounds like an exaggeration), then..."Wow, that's just insane. What, does it need a separate plugin for each letter of the alphabet it supports?".
To be fair, you've got a lot of what I perceive to be exaggerations in your arguments as well. And to throw my 2 cents in, what you say just doesn't jibe with my experience. Eclipse used to feel bloated and slow several years ago, but I've not thought that about it for a while now. I have no problem typing with background compilation going on, no issues with resource consumption, or anything else you mention except a slow start-up time.

I use Eclipse frequently, on a 5-year-old machine, for both C and Java. I'd use it for D, too, if the plugins were in a better state. But I've fallen in love with VisualD for that, so I may never go back. But when it comes to Java, Eclipse has me spoiled.
Dec 04 2011
prev sibling parent Somedude <lovelydear mailmetrash.com> writes:
Le 04/12/2011 18:51, Nick Sabalausky a écrit :
 "Somedude" <lovelydear mailmetrash.com> wrote in message 
 news:jbeva3$2785$1 digitalmars.com...
 Le 04/12/2011 03:40, Don a écrit :
 If you work in an environment where practically all apps are fast,
 Eclipse stands out as being slow. The startup time is particularly
 striking.
 I don't see any reason for this. Mostly when you open an IDE you want to
 first open a few files, look at them, maybe do some editing.
 It ought to be possible to do that within 2 secs of starting the IDE,
 while everything else continues to load.
 It's unusual to perform a major refactoring of your code base within 10
 secs of opening your IDE, but it seems you can't do anything at all,
 until everything has been loaded.
I stopped bothering to respond to Nick Sabalausky, as obviously, he is not trying to discuss, he just throws his opinions around without any substance.
Sounds like some dude I know...
 As for startup time, who cares really, as you open it only once and
 leave it open afterwards ? As Jonathan and I have said now at least 3
 times, you don't close it as it's your primary tool.
And then every time I work on something else and don't want Eclipse continuing to suck up half my resources? I'm expected to just leave it running anyway?
 And the reason it's
 slow is, at startup time, it loads:
 - the GUI toolkit SWT and the interface manager
If SWT is slow to load, that's another strike against it, not a defense.
 - the customized interface (called "perspective" in eclipse)
Although SWT uses native widgets, Eclipse does seem to do a lot of non-standard stuff, too, like the oversized clearly-non-native tabs. So I don't know how much this affects performance. But even if it does, that's just another strike against Eclipse. I don't want non-native, *especially* if it slows things down.
 - hundreds of plugins
If it needs that many plug-ins then something is very, very wrong. For example, maybe some of those should be built-in. And if you meant "hundreds" literally (but I'll give you the benefit of the doubt that you're just exaggerating since it definitely sounds like an exaggeration), then..."Wow, that's just insane. What, does it need a separate plugin for each letter of the alphabet it supports?".
 - the compiler
That should only be needed if you're using the compile-as-you-type feature (which I'd rather not since it slows down basic typing and UI interaction to an unacceptable degree), and on a language for which Eclipse supports it.
 - your open projects
 - all the files that were open last time
See, I don't even want that anyway. I don't like how Eclipse insists on keeping every project I've ever touched open all the time. And automatically resuming the last session, while a nice feature for those who want it, is not something I've ever personally felt a need for. So 1. I'm supposed to pay the price for that? and 2. Seriously, how long does it take to open a few text files?
 On my C2D, a
 fresh install of eclipse Indigo starts in about 12 seconds,
I assume you're on some sort of 10GB multi-core machine as most Java users have to be on, in which case: 12 sec startup is ridiculously slow. Even on 2GB x64 dual-core, that's still very, very slow.
 with 340
 plugins totaling 138 Mb in the plugins directory, most of them being
 actually loaded at startup time.
Oh my god, you were actually serious about "hundreds"...?!? Most of them being needed all the time? (Then why the hell are they plug-ins in the first place?) I knew there was something wrong about how Eclipse was designed, and this just proves it.
 Apart from that, eclipse happily handles projects with 2 million lines
 without a sweat on an average PC, so no, I don't think it's sluggish.
It's certainly sluggish compared to Scintilla-based programs. Even with all the fancy stuff turned off (which I have tried - it does make a difference, but not enough).
 If it *was* the sluggish chore Nick Sabalausky pretends it to be,
 eclipse wouldn't be chosen as the main platform by Zend, Adobe Flex,
 QNX, Altera, Aptana, etc for their own product, there wouldn't be more
 than 5 million downloads for each release of the Java platform only (i.e
 not counting all the said customisations for other languages), and Java
 users would instead flock to Netbeans or Idea, which both have their
 strengths and are free IDEs as well.
Argumentum ad populum, huh? That's one of the worst fallacies I've ever heard. "If Nazis weren't right there wouldn't have been so many of them", huh? There's so much wrong with that argument it's not even worth validating it by going into it.
 Conclusion on this pretty boring subject: Eclipse being slow is about as
 old a rant as saying Java is slow.
Saying they're slow may be old, but it's still true no matter how stubbornly you refuse to acknowledge it.
Just FYI, you can close your projects if you want to. The reason you can open several projects at the same time is that you may want to develop libraries and an application using them at the same time, or a server and a client sharing some common code (that would be three projects).

For the rest of your post, I humbly suggest you give it another try (and no, my own machine is 5 years old, not the latest technology, and many people still work daily with Eclipse on a Pentium 4 without much trouble).
Dec 04 2011
prev sibling parent reply Jacob Carlborg <doob me.com> writes:
On 2011-12-02 22:40, Somedude wrote:

 Yes, the IDE takes care of a lot of boilerplate code. It's ugly, but
 it's hardly a productivity issue. One other thing that's cool is
 refactoring is no longer an issue, like it is in C or C++. With powerful
 IDEs, you can refactor without fearing too much regression, and that's a
 very important advantage, especially in a heavily OO language.
That's no longer the case if you take a look at newer C/C++ IDEs, like the latest versions of Xcode, which uses Clang to handle basically all language-related features of the IDE: autocompletion, syntax highlighting, refactoring and similar. -- /Jacob Carlborg
Dec 04 2011
parent reply "Marco Leise" <Marco.Leise gmx.de> writes:
Am 04.12.2011, 13:41 Uhr, schrieb Jacob Carlborg <doob me.com>:

 On 2011-12-02 22:40, Somedude wrote:

 Yes, the IDE takes care of a lot of boilerplate code. It's ugly, but
 it's hardly a productivity issue. One other thing that's cool is
 refactoring is no longer an issue, like it is in C or C++. With powerful
 IDEs, you can refactor without fearing too much regression, and that's a
 very important advantage, especially in a heavily OO language.
That's no longer the case if you take a look at newer C/C++ IDE's like the latest versions of XCode that uses Clang to handle basically all language related features of the IDE, autocompletion, syntax highlighting, refactoring and similar.
So we want a flexible compiler front end for D written in D as well, right ?
Dec 04 2011
parent reply Jacob Carlborg <doob me.com> writes:
On 2011-12-04 15:38, Marco Leise wrote:
 Am 04.12.2011, 13:41 Uhr, schrieb Jacob Carlborg <doob me.com>:

 On 2011-12-02 22:40, Somedude wrote:

 Yes, the IDE takes care of a lot of boilerplate code. It's ugly, but
 it's hardly a productivity issue. One other thing that's cool is
 refactoring is no longer an issue, like it is in C or C++. With powerful
 IDEs, you can refactor without fearing too much regression, and that's a
 very important advantage, especially in a heavily OO language.
That's no longer the case if you take a look at newer C/C++ IDE's like the latest versions of XCode that uses Clang to handle basically all language related features of the IDE, autocompletion, syntax highlighting, refactoring and similar.
So we want a flexible compiler front end for D written in D as well, right ?
Exactly, one that can be used as a library. -- /Jacob Carlborg
Dec 04 2011
parent reply Timon Gehr <timon.gehr gmx.ch> writes:
On 12/04/2011 08:50 PM, Jacob Carlborg wrote:
 On 2011-12-04 15:38, Marco Leise wrote:
 Am 04.12.2011, 13:41 Uhr, schrieb Jacob Carlborg <doob me.com>:

 On 2011-12-02 22:40, Somedude wrote:

 Yes, the IDE takes care of a lot of boilerplate code. It's ugly, but
 it's hardly a productivity issue. One other thing that's cool is
 refactoring is no longer an issue, like it is in C or C++. With
 powerful
 IDEs, you can refactor without fearing too much regression, and
 that's a
 very important advantage, especially in a heavily OO language.
That's no longer the case if you take a look at newer C/C++ IDE's like the latest versions of XCode that uses Clang to handle basically all language related features of the IDE, autocompletion, syntax highlighting, refactoring and similar.
So we want a flexible compiler front end for D written in D as well, right ?
Exactly, one that can be used as a library.
What would be the interface to such a library?
Dec 04 2011
parent Jacob Carlborg <doob me.com> writes:
On 2011-12-04 20:53, Timon Gehr wrote:
 On 12/04/2011 08:50 PM, Jacob Carlborg wrote:
 On 2011-12-04 15:38, Marco Leise wrote:
 Am 04.12.2011, 13:41 Uhr, schrieb Jacob Carlborg <doob me.com>:

 On 2011-12-02 22:40, Somedude wrote:

 Yes, the IDE takes care of a lot of boilerplate code. It's ugly, but
 it's hardly a productivity issue. One other thing that's cool is
 refactoring is no longer an issue, like it is in C or C++. With
 powerful
 IDEs, you can refactor without fearing too much regression, and
 that's a
 very important advantage, especially in a heavily OO language.
That's no longer the case if you take a look at newer C/C++ IDE's like the latest versions of XCode that uses Clang to handle basically all language related features of the IDE, autocompletion, syntax highlighting, refactoring and similar.
So we want a flexible compiler front end for D written in D as well, right ?
Exactly, one that can be used as a library.
What would be the interface to such a library?
That would take some thought if one were to develop a library like that, and it would take some time and space to explain here. Preferably there would be APIs to interact with the library for all phases of the compilation:

* Lexing
* Parsing
* Semantic analysis
* AST traversal
* Both machine-dependent and machine-independent optimization
* Code generation
* Linking

These could be low-level APIs. Then there could be higher-level APIs built on top of the low-level ones for specific tasks, like autocompletion, syntax highlighting, refactoring and so on.

An alternative would be to just use the same API as Clang does. Clang also has two levels of APIs: the unstable low-level API is the C++ code, and on top of that sits the stable C API, built for more specific tasks like autocompletion, syntax highlighting and refactoring.

--
/Jacob Carlborg
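As a rough illustration of the phase-by-phase idea above, here is a hypothetical sketch in D of what the low-level interfaces might look like. All names here (Token, AstNode, Frontend, etc.) are invented for illustration; no such library exists.

```d
module frontend;

// A minimal token produced by the lexing phase.
struct Token
{
    string text;
    size_t line;
}

// AST nodes support traversal via the visitor pattern.
interface Visitor
{
    void visit(AstNode node);
}

interface AstNode
{
    void accept(Visitor v);
    AstNode[] children();
}

// Low-level API: one entry point per compilation phase, so a tool
// can stop after whichever phase it needs (an IDE rarely needs codegen).
interface Frontend
{
    Token[] lex(string source);           // lexing
    AstNode parse(Token[] tokens);        // parsing
    AstNode analyze(AstNode root);        // semantic analysis
    ubyte[] generate(AstNode typedRoot);  // code generation
}

// Higher-level API built on top, for IDE tasks such as
// autocompletion and refactoring.
interface IdeServices
{
    string[] complete(string source, size_t cursorOffset);
    string rename(string source, string oldName, string newName);
}
```

The split mirrors the Clang approach mentioned above: tools that need stability would code against the small high-level interface, while the phase-level interfaces could evolve with the compiler.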
Dec 04 2011
prev sibling parent reply "Nick Sabalausky" <a a.a> writes:
"Somedude" <lovelydear mailmetrash.com> wrote in message 
news:jbbarp$1kp6$1 digitalmars.com...
 The fact is, you are more productive in Java than in C++ by nearly an
 order of magnitude.
C++ is a pretty bad example to demonstrate Java's "productivity". That's kinda like saying "I can build a house much faster with a rubber hammer than with a handful of oatmeal." It'll be a long painful experience either way (Although I have to admit DMD's source isn't too bad, but I think that's because it seems to be more a 90's C++ than a 2000's C++, back before STL iterators, etc ;) ).
Dec 02 2011
next sibling parent reply Andrej Mitrovic <andrej.mitrovich gmail.com> writes:
What's so special about WPF? I'm asking, since I've never used it.
Isn't it basically XML? wxWidgets has XRC which is the declarative way
of making the UI.
Dec 02 2011
parent reply "Adam Wilson" <flyboynw gmail.com> writes:
On Fri, 02 Dec 2011 14:32:44 -0800, Andrej Mitrovic  
<andrej.mitrovich gmail.com> wrote:

 What's so special about WPF? I'm asking, since I've never used it.
 Isn't it basically XML? wxWidgets has XRC which is the declarative way
 of making the UI.
I'd have to say that the most interesting thing about it is the separation of Look from Implementation. You can create any look you want without changing the implementation at all. The fact that it uses XML is mostly to make it easy to use existing XML parsers to load and instantiate the UI. That's just my 0.02$ -- Adam Wilson Project Coordinator The Horizon Project http://www.thehorizonproject.org/
Dec 02 2011
parent reply "Marco Leise" <Marco.Leise gmx.de> writes:
Am 03.12.2011, 01:16 Uhr, schrieb Adam Wilson <flyboynw gmail.com>:

 On Fri, 02 Dec 2011 14:32:44 -0800, Andrej Mitrovic  
 <andrej.mitrovich gmail.com> wrote:

 What's so special about WPF? I'm asking, since I've never used it.
 Isn't it basically XML? wxWidgets has XRC which is the declarative way
 of making the UI.
I'd have to say that the most interesting thing about it is the separation of Look from Implementation. You can create any look you want without changing the implementation at all. The fact that it uses XML is mostly to make it easy to use existing XML parsers to load and instantiate the UI. That's just my 0.02$
So is it like the Flash GUI where you can skin every control in a way similar to using CSS on a web site?
Dec 03 2011
parent "Adam Wilson" <flyboynw gmail.com> writes:
On Sat, 03 Dec 2011 00:23:50 -0800, Marco Leise <Marco.Leise gmx.de> wrote:

 Am 03.12.2011, 01:16 Uhr, schrieb Adam Wilson <flyboynw gmail.com>:

 On Fri, 02 Dec 2011 14:32:44 -0800, Andrej Mitrovic  
 <andrej.mitrovich gmail.com> wrote:

 What's so special about WPF? I'm asking, since I've never used it.
 Isn't it basically XML? wxWidgets has XRC which is the declarative way
 of making the UI.
I'd have to say that the most interesting thing about it is the separation of Look from Implementation. You can create any look you want without changing the implementation at all. The fact that it uses XML is mostly to make it easy to use existing XML parsers to load and instantiate the UI. That's just my 0.02$
So is it like the Flash GUI where you can skin every control in a way similar to using CSS on a web site?
Well, I don't know Flash, but it is a bit like CSS in that you can completely change the look; WPF is just more flexible than CSS about how you do it. -- Adam Wilson Project Coordinator The Horizon Project http://www.thehorizonproject.org/
Dec 03 2011
prev sibling parent Walter Bright <newshound2 digitalmars.com> writes:
On 12/2/2011 2:15 PM, Nick Sabalausky wrote:
 (Although I have to admit DMD's source isn't too bad, but I think that's
 because it seems to be more a 90's C++ than a 2000's C++, back before STL
 iterators, etc ;) ).
I started on DMD back when many C++ compilers did not implement templates properly. The other reason is I find C++ templates to be very hard to read & use. A couple of templates have since crept in :-)
Dec 03 2011
prev sibling parent Walter Bright <newshound2 digitalmars.com> writes:
On 12/2/2011 9:08 AM, Nick Sabalausky wrote:
 That's BS posturing and chest-thumping.
Of course. Russell even said so! I think it's a hilarious post.
Dec 02 2011
prev sibling parent Walter Bright <newshound2 digitalmars.com> writes:
On 11/30/2011 10:19 PM, Paulo Pinto wrote:
 This is not what I understand from bearophile's comments every now and then,
 when he compares dmd with other language implementations.
P.S. I wrote a Java compiler back in the 90's (Symantec Visual Cafe), and did a lot of work on a JVM. (Steve Russell worked on the JIT.) The work I did is (of course) hopelessly obsolete by today's standards, and I've forgotten a great deal of the details, but the fundamental issues haven't changed.
Nov 30 2011
prev sibling parent reply Kagamin <spam here.lot> writes:
Walter Bright Wrote:

 On 11/30/2011 12:56 PM, Paulo Pinto wrote:
 Are you not being a bit simplistic here?

 There are several JVM implementations around not just one.
It's not the implementation that's the problem, it's the *definition* of the bytecode for the JVM.
To think, LLVM devs complain about LLVM IR being so low-level, and it would be so nice to have something as high-level as Java bytecode, which is so sweet for optimizers and JITs.
Dec 02 2011
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 12/2/2011 11:56 AM, Kagamin wrote:
 To think, LLVM devs complain about LLVM IR being so low-level, and it would
 be so nice to have something as high-level as Java bytecode, which is so sweet
 for optimizers and JITs.
The people I know who have written professional Jits for the Java bytecode don't think it is very amenable to it. The Java bytecode is designed to be amenable for writing a simple interpreter for it. Not a Jit.
Dec 02 2011
parent Russel Winder <russel russel.org.uk> writes:
On Fri, 2011-12-02 at 12:44 -0800, Walter Bright wrote:
 On 12/2/2011 11:56 AM, Kagamin wrote:
 To think, LLVM devs complain about LLVM IR being so low-level, and it would
 be so nice to have something as high-level as Java bytecode, which is so sweet
 for optimizers and JITs.

 The people I know who have written professional Jits for the Java bytecode
 don't think it is very amenable to it.

 The Java bytecode is designed to be amenable for writing a simple interpreter
 for it. Not a Jit.
Zero address stack machines do not map easily to two or three address register architectures such as Intel's and old IBMs. It's easier on RISC architectures such as SPARC. The JVM interpreter is close to trivial, at least for evaluation. The real problem comes with class loading. Class loaders can be a real pain. -- Russel. Dr Russel Winder t: +44 20 7585 2200 voip: sip:russel.winder ekiga.net 41 Buckmaster Road m: +44 7770 465 077 xmpp: russel russel.org.uk London SW11 1EN, UK w: www.russel.org.uk skype: russel_winder
Dec 03 2011
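Russel's point about zero-address machines is easy to see in a toy interpreter: every operation takes its operands from the operand stack and pushes the result back, naming no registers, which is exactly the evaluation model JVM bytecode uses (iconst, iadd, imul). The textual instruction names below are made up for this sketch, not real JVM opcodes:

```java
import java.util.ArrayDeque;
import java.util.Deque;

// Toy zero-address stack machine. Operations name no registers:
// they pop their operands from the stack and push the result.
public class StackMachine {
    public static int run(String[] program) {
        Deque<Integer> stack = new ArrayDeque<>();
        for (String insn : program) {
            switch (insn) {
                case "add": stack.push(stack.pop() + stack.pop()); break;
                case "mul": stack.push(stack.pop() * stack.pop()); break;
                default:    stack.push(Integer.parseInt(insn));    // push constant
            }
        }
        return stack.pop();
    }

    public static void main(String[] args) {
        // (2 + 3) * 4 becomes: push 2, push 3, add, push 4, mul
        System.out.println(run(new String[]{"2", "3", "add", "4", "mul"}));
    }
}
```

A JIT targeting a two- or three-address register architecture has to recover which stack slots are live and map them onto registers, which is the "reverse engineering of the bytecode" Walter mentions elsewhere in the thread.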
prev sibling parent reply Russel Winder <russel russel.org.uk> writes:
On Wed, 2011-11-30 at 00:48 -0800, Walter Bright wrote:
[...]
 I used to be intimately familiar with the JVM, I even wrote a gc for it. The
 bytecode ops in it are designed for Java, nothing more. Worse, it's a primitive
 stack machine. To generate even passably good native code, the JVM has to do a
 lot of reverse engineering of the bytecode.
I am still intimately familiar with the JVM, I even wrote an implementation of one for smartcards. I guess we should stop the pissing contest ;-) The JVM and the Java programming language were clearly designed together in the early 1990s. However, the JVM now recognizes the importance of Groovy, JRuby, Clojure, Jython, etc. and the JVM has evolved to better support the needs of these languages. Also changes have been made to the JVM to improve its support for Java as Java has evolved. Java 9 or 10 may even reify generic type parameters and ditch type erasure. Which will be great for Java and a real pain for Scala. Yes, the JVM is a zero address stack machine. Yes the JIT has to do a lot of work to compile native code. But it does it. It is very successful. I am sure it could be better. It remains the market leader. Promoting D gains nothing by pointing out things about Java -- however interesting the points may be to some of us.
 For example, you cannot pass by value anything other than the primitive Java
 data types. There are no pointers. Want an unsigned int? Forget it. Arrays of
 anything but class references? Nyuk nyuk nyuk. Etc.
So f$$$$$$ what. The computational model of Java is as it is, it is not trying to be a native code system such as C, C++, D, Go. To each its own. The skill of the programmer is to use the computational model of the platform they have to use to create realizations of algorithms. Either that, or change the platform. Moaning about the deficiencies of one platform compared to another achieves nothing positive. Java's handling of bytes and shorts is derisible, the lack of unsigned may be laughable, but there is a huge amount of Java activity out there and, let's be serious, next to no D. Is the desire to make D a well used language? If so the tenor of threads like this in this news group needs to change dramatically. Bitching about things is the sign of a community ill at ease with its own failure to become part of the mainstream. Just look at the Scala mailing lists for a classic example. It is hugely counter-productive to the uptake of Scala. The danger is that the D community is its own worst enemy, much like the Scala community has a reputation for being. I put forward vague proposals for how to promote D in my earlier email -- wherever that ends up in people's threads :-) -- basically do some comparative work to show D's efficacy, get some high profile projects to market the use of D. Basically get outward looking rather than inward looking. This activity almost started with the effort to create new websites and new imagery around D, but it all seems to have stalled. -- Russel.
Dec 01 2011
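On the lack of unsigned types Russel mentions: within Java's model, the standard workaround is to hold the bits in a wider signed type and mask, and later Java versions (Java 8 onward) added unsigned helper methods on Integer and Long. A small sketch; the class and method names here are just for illustration:

```java
// Java has no unsigned types; the idiomatic workaround is masking into
// a wider signed type. Java 8 added library helpers such as
// Integer.toUnsignedLong and Integer.compareUnsigned.
public class UnsignedDemo {
    // Interpret a byte's bits as an unsigned value in 0..255.
    static int unsignedByte(byte b) { return b & 0xFF; }

    public static void main(String[] args) {
        byte b = (byte) 0xF0;                          // -16 as a signed byte
        System.out.println(unsignedByte(b));           // 240

        int x = 0xFFFFFFFF;                            // -1 as a signed int
        System.out.println(Integer.toUnsignedLong(x)); // 4294967295
        // Unsigned comparison: 0xFFFFFFFF > 1 when treated as unsigned.
        System.out.println(Integer.compareUnsigned(x, 1) > 0); // true
    }
}
```

This works, but every call site has to remember the mask, which is exactly the kind of "rock" a language with real unsigned types avoids carrying.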
next sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 12/1/2011 1:44 AM, Russel Winder wrote:
 Promoting D gains nothing by pointing out things about Java -- however
 interesting the points may be to some of us.
We're talking on the n.g. here, I'm not writing articles comparing Java to D.
 For example, you cannot pass by value anything other than the primitive Java
 data types. There are no pointers. Want an unsigned int? Forget it. Arrays of
 anything but class references? Nyuk nyuk nyuk. Etc.
So f$$$$$$ what.
It matters if you are trying to implement a language that needs those features, but must run on the JVM. That is what I meant by a "rock" a language that targets the JVM must carry.
 I put forward vague proposals for how to promote D in my earlier email
 -- wherever that ends up in people's threads :-) -- basically do some
 comparative work to show D's efficacy, get some high profile projects to
 market the use of D.  Basically get outward looking rather than inward
 looking.

 This activity almost started with the effort to create new websites and
 new imagery around D, but it all seems to have stalled.
Which ones do you want to help with?
Dec 01 2011
next sibling parent reply Gour <gour atmarama.net> writes:
On Thu, 01 Dec 2011 02:19:01 -0800
Walter Bright <newshound2 digitalmars.com> wrote:

 This activity almost started with the effort to create new websites
 and new imagery around D, but it all seems to have stalled.
 Which ones do you want to help with?
I'd like to help with GUI bindings if the D community would come closer together here with some people ready to lead the herd... Sincerely, Gour -- Those who are on this path are resolute in purpose, and their aim is one. O beloved child of the Kurus, the intelligence of those who are irresolute is many-branched. http://atmarama.net | Hlapicina (Croatia) | GPG: 52B5C810
Dec 01 2011
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 12/1/2011 2:42 AM, Gour wrote:
 I'd like to help with GUI bindings if D community would come more close
 together here with some people ready to lead the herd...
Why not you lead the effort?
Dec 01 2011
next sibling parent reply Gour <gour atmarama.net> writes:
On Thu, 01 Dec 2011 02:59:45 -0800
Walter Bright <newshound2 digitalmars.com> wrote:

 Why not you lead the effort?
Lack of skills: both in D and with GUI toolkits... let's hope someone more capable will chime in. Sincerely, Gour -- Never was there a time when I did not exist, nor you, nor all these kings; nor in the future shall any of us cease to be. http://atmarama.net | Hlapicina (Croatia) | GPG: 52B5C810
Dec 01 2011
next sibling parent reply Jacob Carlborg <doob me.com> writes:
On 2011-12-01 12:44, Gour wrote:
 On Thu, 01 Dec 2011 02:59:45 -0800
 Walter Bright<newshound2 digitalmars.com>  wrote:

 Why not you lead the effort?
Lack of skills: both D and with GUI toolkits...let's hope someone more capable will chime in.
I hope that I will eventually have time to continue developing DWT. -- /Jacob Carlborg
Dec 01 2011
parent reply Gour <gour atmarama.net> writes:
On Thu, 01 Dec 2011 13:33:30 +0100
Jacob Carlborg <doob me.com> wrote:

 I hope that I will eventually have time to continue developing DWT.
I also have to receive a definite answer on which toolkit to focus on for our project... If it becomes DWT... Sincerely, Gour -- From anger, complete delusion arises, and from delusion bewilderment of memory. When memory is bewildered, intelligence is lost, and when intelligence is lost one falls down again into the material pool. http://atmarama.net | Hlapicina (Croatia) | GPG: 52B5C810
Dec 01 2011
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 12/1/2011 6:10 AM, Gour wrote:
 On Thu, 01 Dec 2011 13:33:30 +0100
 Jacob Carlborg<doob me.com>  wrote:

 I hope that I will eventually have time to continue developing DWT.
I also have to receive definite answer on which toolkit to focus for our project...If it becomes DWT...
At some point, it doesn't matter. Just pick the one that is the most amenable to getting it done.
Dec 01 2011
parent Gour <gour atmarama.net> writes:
On Thu, 01 Dec 2011 09:59:46 -0800
Walter Bright <newshound2 digitalmars.com> wrote:

 At some point, it doesn't matter. Just pick the one that is the most
 amenable to getting it done.
Well, it should get a green light that it's good-enough (looking) for Mac. We'll see... Sincerely, Gour -- The embodied soul may be restricted from sense enjoyment, though the taste for sense objects remains. But, ceasing such engagements by experiencing a higher taste, he is fixed in consciousness. http://atmarama.net | Hlapicina (Croatia) | GPG: 52B5C810
Dec 02 2011
prev sibling next sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 12/1/2011 3:44 AM, Gour wrote:
 Walter Bright<newshound2 digitalmars.com>  wrote:
 Why not you lead the effort?
Lack of skills: both D and with GUI toolkits
Don't let that stop you - I'm serious. The best way to learn is by diving in.
Dec 01 2011
next sibling parent Jacob Carlborg <doob me.com> writes:
On 2011-12-01 18:58, Walter Bright wrote:
 On 12/1/2011 3:44 AM, Gour wrote:
 Walter Bright<newshound2 digitalmars.com> wrote:
 Why not you lead the effort?
Lack of skills: both D and with GUI toolkits
Don't let that stop you - I'm serious. The best way to learn is by diving in.
I completely agree. With most of the projects I've done lately I had no experience with that particular subject before starting them. -- /Jacob Carlborg
Dec 01 2011
prev sibling parent reply Gour <gour atmarama.net> writes:
On Thu, 01 Dec 2011 09:58:26 -0800
Walter Bright <newshound2 digitalmars.com> wrote:

 Don't let that stop you - I'm serious. The best way to learn is by
 diving in.
Thank you for encouragement. I'm determined to do the project in D and we'll do the needful. ;) Sincerely, Gour -- One who is able to withdraw his senses from sense objects, as the tortoise draws its limbs within the shell, is firmly fixed in perfect consciousness. http://atmarama.net | Hlapicina (Croatia) | GPG: 52B5C810
Dec 02 2011
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 12/2/2011 4:03 AM, Gour wrote:
 On Thu, 01 Dec 2011 09:58:26 -0800
 Walter Bright<newshound2 digitalmars.com>  wrote:

 Don't let that stop you - I'm serious. The best way to learn is by
 diving in.
Thank you for encouragement. I'm determined to do the project in D and we'll do the needful. ;)
The most important characteristic of a champion is enthusiasm and commitment. The expertise will follow naturally, it is not a prerequisite. Champions are self-selected. To be the D Gui Champion, select yourself to be it, and Just Do It. Don't let anything stand in your way. In general, that's how things happen in the D ecosystem.
Dec 02 2011
parent Gour <gour atmarama.net> writes:
On Fri, 02 Dec 2011 12:11:47 -0800
Walter Bright <newshound2 digitalmars.com> wrote:

Dear Walter,

 The most important characteristic of a champion is enthusiasm and
 commitment. The expertise will follow naturally, it is not a
 prerequisite.
 Champions are self-selected. To be the D Gui Champion, select
 yourself to be it, and Just Do It. Don't let anything stand in your
 way.
Thank you very much for your wise words, which are relevant in a much wider context than developing the D ecosystem! I saved them in my mail archive and they rapidly increase the S/N ratio of this newsgroup. Sincerely, Gour -- As the ignorant perform their duties with attachment to results, the learned may similarly act, but without attachment, for the sake of leading people on the right path. http://atmarama.net | Hlapicina (Croatia) | GPG: 52B5C810
Dec 03 2011
prev sibling parent reply "Adam Wilson" <flyboynw gmail.com> writes:
On Thu, 01 Dec 2011 03:44:17 -0800, Gour <gour atmarama.net> wrote:

 On Thu, 01 Dec 2011 02:59:45 -0800
 Walter Bright <newshound2 digitalmars.com> wrote:

 Why not you lead the effort?
Lack of skills: both D and with GUI toolkits...let's hope someone more capable will chime in. Sincerely, Gour
Gour, I'd love to talk to you more about GUIs. I am new to D, but I have spent years working with GUI toolkits and studying their construction. My company would like to move to D in the future but, among other things, the lack of a first-class GUI toolkit makes that a non-starter at the moment. If such a thing existed my company would be very willing to jump ship. And I have permission to use some limited company resources, mostly just web hosting for the project right now, but the ability to expand that later if the project shows progress. I would also be up for leading the project, but a project of this size would need lots of contributors. And there still need to be serious discussions about how to design such a project. Personally, my UI design background tends away from traditional style toolkits like wxD and DWT, and as such I would probably want to take the project in a different direction than those. For example, all of our software at work is built on WPF and I can say that I completely believe that WPF style UI toolkits are the way of the future. Besides, why cover the same ground that those two projects are already covering? I could list all the pros and cons that we've discovered in actual usage of WPF but I don't want to needlessly clutter up this thread which has little to do with UI. :-) You can find me on IRC as LightBender and the email account I list here is actively monitored. -- Adam Wilson Project Coordinator The Horizon Project http://www.thehorizonproject.org/
Dec 02 2011
parent reply Gour <gour atmarama.net> writes:
On Fri, 02 Dec 2011 10:56:41 -0800
"Adam Wilson" <flyboynw gmail.com> wrote:

Hello Adam,

 Gour, I'd love to talk to you more about GUI's. I am new to D, but I
 have spent years working with GUI toolkits and studying their
 construction.
Well, I'm just someone with not-so-much free time looking to help some GUI bindings project in order to be able to use it for open-source project(s).
 My company would like to move to D in the future but, among other
 things, the lack of a first class GUI toolkit makes that a non-starter
 at the moment. If such a thing existed my company would be very
 willing to jump ship.
This is one of the best paragraphs I have read in this newsgroup in recent times. I hope it will be loud enough.
 And I have permission to use some limited company resources, mostly
 just web hosting for the project right now, but ability to expand that
 later if the project shows progress.
I believe that stuff like bitbucket/github should be enough...
 I would also be up for leading the project, but a project of this
 size would need lots of contributors. And there still needs to be
 serious discussions about how to design such a project.
/me nods
 Personally, my UI design background tends away from traditional style
 toolkits like wxD and DWT, and as such I would probably want to take
 the project in a different direction than those.
Interesting...
 For example, all of our software at work is built on WPF and I can say
 that I completely believe that WPF style UI toolkits are the way of
 the future.
Hmm... but WPF is Windows-only, right? Moreover, developing something from scratch would require an enormous amount of time in comparison with *just* providing a higher-level D-ish API for one of the already available GUI toolkits. On top of that, I believe that the best forces available within the D army are now focused on improving DMD/Phobos, so I don't know how many soldiers are ready to go develop something new. Let me say that when the time matures, I'd definitely like to have some solution more suitable for D and its advantages over e.g. C(++), but for now I believe in just having some pragmatic solution in the form of an actively-worked-on project covering one of {gtk,qt,wx}.
 Besides, why cover the same ground that those two projects are
 already covering? I could list all the pro's and con's that we've
 discovered in actual usage of WPF but I don't want to needlessly
 clutter up this thread which has little to do with UI. :-)
Time & effort, which are limited in the D community right now?
 You can find me on IRC as LightBender and the email account I list
 here is actively monitored.
I'm available as 'gour' on IRC. Sincerely, Gour -- As a strong wind sweeps away a boat on the water, even one of the roaming senses on which the mind focuses can carry away a man's intelligence. http://atmarama.net | Hlapicina (Croatia) | GPG: 52B5C810
Dec 02 2011
next sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 12/2/2011 11:29 AM, Gour wrote:
 Moreover, developing something from scratch would require enormous
 amount of time in comparison with *just* providing higher-level D-ish
 API for some of the already available GUI toolkit.
Developing a D GUI from scratch is way beyond our reach at the moment. People have spent enormous efforts developing GUI libraries for other platforms, there's no good reason for not leveraging their efforts. It's not just the code involved. It's the tutorials, web sites, manuals, support, etc., that would have to be reinvented. By developing a D interface to an existing one, none of that has to be developed.
Dec 02 2011
parent reply "Adam Wilson" <flyboynw gmail.com> writes:
On Fri, 02 Dec 2011 12:15:55 -0800, Walter Bright  
<newshound2 digitalmars.com> wrote:

 On 12/2/2011 11:29 AM, Gour wrote:
 Moreover, developing something from scratch would require enormous
 amount of time in comparison with *just* providing higher-level D-ish
 API for some of the already available GUI toolkit.
Developing a D GUI from scratch is way beyond our reach at the moment. People have spent enormous efforts developing GUI libraries for other platforms, there's no good reason for not leveraging their efforts.
I absolutely agree. However, I don't think that we should exclude the possibility of building a scratch library either.
 It's not just the code involved. It's the tutorials, web sites, manuals,  
 support, etc., that would have to be reinvented. By developing a D  
 interface to an existing one, none of that has to be developed.
This is too true. But if it was easy, everybody would be doing it. You could say the same thing about compilers, but that didn't stop you ... :-) My intention is not to draw away any devs who could potentially work on DMD/Phobos, in fact I want them working hard on those because without them any work I do is pointless, and in some cases impossible (showstopper bugs and ICE's are rather annoying like that). I suspect that it'll be a case of "me, myself, and I" working on a native UI for D for quite some time. But at the same time, I want to continue to have conversations with the community at large, probably mostly about design and whatnot. If there are people who really want to help I won't turn them away, but I'll avoid actively recruiting to make sure that DMD/Phobos gets first pick, as they should. Sound good? -- Adam Wilson Project Coordinator The Horizon Project http://www.thehorizonproject.org/
Dec 02 2011
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 12/2/2011 2:15 PM, Adam Wilson wrote:
 On Fri, 02 Dec 2011 12:15:55 -0800, Walter Bright <newshound2 digitalmars.com>
 wrote:

 On 12/2/2011 11:29 AM, Gour wrote:
 Moreover, developing something from scratch would require enormous
 amount of time in comparison with *just* providing higher-level D-ish
 API for some of the already available GUI toolkit.
Developing a D GUI from scratch is way beyond our reach at the moment. People have spent enormous efforts developing GUI libraries for other platforms, there's no good reason for not leveraging their efforts.
I absolutely agree. However, I don't think that we should exclude the possibility of building a scratch library either.
At some point, a decision has to be made. Consider that existing successful GUI libraries have had *enormous* resources poured into them. That just is not possible in the D community right now. And even if it were, do we really want to wait 5 years for it to be built? Of course, if someone still wants to develop a D GUI from scratch, nobody is going to stop them.
 It's not just the code involved. It's the tutorials, web sites, manuals,
 support, etc., that would have to be reinvented. By developing a D interface
 to an existing one, none of that has to be developed.
This is too true. But if it was easy, everybody would be doing it. You could say the same thing about compilers, but that didn't stop you ... :-)
Frankly, I think a compiler is much easier to build.
 My intention is not to draw away any devs who could potentially work on
 DMD/Phobos, in fact I want them working hard on those because without them any
 work I do is pointless, and in some cases impossible (showstopper bugs and
ICE's
 are rather annoying like that). I suspect that it'll be a case of "me, myself,
 and I" working on a native UI for D for quite some time. But at the same time,
I
 want to continue to have conversations with the community at large, probably
 mostly about design and whatnot. If there are people who really want to help I
 won't turn them away, but I'll avoid actively recruiting to make sure that
 DMD/Phobos gets first pick, as they should. Sound good?
That's fine if you want to do that.
Dec 02 2011
next sibling parent "Jonathan M Davis" <jmdavisProg gmx.com> writes:
On Friday, December 02, 2011 14:57:56 Walter Bright wrote:
 It's not just the code involved. It's the tutorials, web sites,
 manuals,
 support, etc., that would have to be reinvented. By developing a D
 interface to an existing one, none of that has to be developed.
This is too true. But if it was easy, everybody would be doing it. You could say the same thing about compilers, but that didn't stop you ... :-)
Frankly, I think a compiler is much easier to build.
I'd have to agree on that one. Compilers are much more straightforward than GUI toolkits - given all of the crazy, non-deterministic interactions that you have to deal with in GUIs. Compilers are by no means easy to write - especially with regards to the optimizer and error handling - but they're much more straightforward IMHO. As for GUIs written in D, we just don't have the manpower for doing that at this point. There's no reason why it couldn't be done or shouldn't be done eventually, but that's a _huge_ task, and we get most of the gain by simply making it possible to interact well with existing C/C++ GUI toolkits in D - even that is a _lot_ of work. So, while it would be fantastic to have a solid GUI toolkit written in D, doing it in the short term doesn't really make much sense IMHO, and there are so many people already pouring thousands of hours into existing, mature, C/C++ GUI toolkits, that I think that we'd be remiss to not take advantage of that through D's interoperability with C/C++. But I certainly have no problem with us having a D-based GUI toolkit in the long term. It's the short term which is the problem. - Jonathan M Davis
Dec 02 2011
prev sibling next sibling parent reply Andrej Mitrovic <andrej.mitrovich gmail.com> writes:
On 12/2/11, Walter Bright <newshound2 digitalmars.com> wrote:
 Consider that existing successful GUI libraries have had *enormous*
 resources poured into them.
I think a vast majority of that time was spent dealing with OS-specific bugs due to the requirement that widgets must look and feel native to each OS. That, and dealing with C++ shenanigans. But I think the whole "must look native" stance is now wearing off though (sorry Nick). Last I heard, Qt5 was moving to an OpenGL backend; I don't know how that will compare with native widgets. But hey, even Microsoft wants to kill off GDI (they've put its docs in the "deprecated technologies" section). Virtually every company that wants to differentiate its product uses some form of a custom GUI that isn't 100% native. Even Microsoft does that with Office and IE. So I don't see non-native widgets as a big problem. Anyway my point was if you don't care about native looks then implementing a GUI library doesn't have to be a 5-year project, especially if you have more than one person working on it. People have already done it before (in D!) - Harmonia, Hybrid and the more recent Rae (all 3 are in a dead state but they're open-source).
Dec 02 2011
parent reply Jacob Carlborg <doob me.com> writes:
On 2011-12-03 00:24, Andrej Mitrovic wrote:
 On 12/2/11, Walter Bright<newshound2 digitalmars.com>  wrote:
 Consider that existing successful GUI libraries have had *enormous*
 resources poured into them.
I think a vast majority of that time was spent dealing with OS-specific bugs due to the requirement that widgets must look and feel native to each OS. That, and dealing with C++ shenanigans. But I think the whole "must look native" stance is now wearing off though (sorry Nick). Last I heard, Qt5 was moving to an OpenGL backend; I don't know how that will compare with native widgets. But hey, even Microsoft wants to kill off GDI (they've put its docs in the "deprecated technologies" section).
If Microsoft kills GDI and uses some DirectX/OpenGL backend for its GUI, then that will be the native GUI. Just because GDI/Win32 has been the native GUI for Windows doesn't mean it can't be replaced with a new native GUI. -- /Jacob Carlborg
Dec 04 2011
parent reply Andrej Mitrovic <andrej.mitrovich gmail.com> writes:
On 12/4/11, Jacob Carlborg <doob me.com> wrote:
 If Microsoft kills GDI and uses some DirectX/OpenGL backend for its GUI,
 then that will be the native GUI. Just because GDI/Win32 has been the
 native GUI for Windows doesn't mean it can't be replaced with a new
 native GUI.
My point was popular apps rarely have a native UI.
Dec 04 2011
parent reply Jacob Carlborg <doob me.com> writes:
On 2011-12-04 22:07, Andrej Mitrovic wrote:
 On 12/4/11, Jacob Carlborg<doob me.com>  wrote:
 If Microsoft kills GDI and uses some DirectX/OpenGL backend for its GUI,
 then that will be the native GUI. Just because GDI/Win32 has been the
 native GUI for Windows doesn't mean it can't be replaced with a new
 native GUI.
My point was popular apps rarely have a native UI.
I see. BTW, do you have any examples? -- /Jacob Carlborg
Dec 04 2011
next sibling parent Kagamin <spam here.lot> writes:
Jacob Carlborg Wrote:

 On 2011-12-04 22:07, Andrej Mitrovic wrote:
 On 12/4/11, Jacob Carlborg<doob me.com>  wrote:
 If Microsoft kills GDI and uses some DirectX/OpenGL backend for its GUI,
 then that will be the native GUI. Just because GDI/Win32 has been the
 native GUI for Windows doesn't mean it can't be replaced with a new
 native GUI.
My point was popular apps rarely have a native UI.
I see. BTW, do you have any examples?
FF UI designers particularly suck at this.
Dec 04 2011
prev sibling parent reply Mirko Pilger <mirko.pilger gmail.com> writes:
 I see. BTW, do you have any examples?
entertainment software like media players. and interestingly enough tools for artists often come with non-native guis. e.g. 3d modeler applications (like autodesk maya, blender, luxology modo), pixologic zbrush, adobe after effects, the foundry nuke, ableton live. it almost seems like developers of those applications believe traditional native guis interfere with creativity or are unsuitable for creative workflows.
Dec 05 2011
parent Jacob Carlborg <doob me.com> writes:
On 2011-12-05 12:01, Mirko Pilger wrote:
 I see. BTW, do you have any examples?
entertainment software like media players. and interestingly enough tools for artists often come with non-native guis. e.g. 3d modeler applications (like autodesk maya, blender, luxology modo), pixologic zbrush, adobe after effects, the foundry nuke, ableton live.
I can understand that these applications need to use non-native widgets, but there is no reason to use a completely non-native GUI. For example, iTunes uses a non-native scroll bar for no reason whatsoever. Nothing is better about this non-native scroll bar. BTW, Blender has the most horrible GUI I have ever used.
 it almost seems like developers of those applications believe
 traditional native guis interfere with creativity or are unsuitable for
 creative workflows.
-- /Jacob Carlborg
Dec 05 2011
prev sibling next sibling parent reply "Adam Wilson" <flyboynw gmail.com> writes:
On Fri, 02 Dec 2011 14:57:56 -0800, Walter Bright  
<newshound2 digitalmars.com> wrote:

 On 12/2/2011 2:15 PM, Adam Wilson wrote:
 On Fri, 02 Dec 2011 12:15:55 -0800, Walter Bright  
 <newshound2 digitalmars.com>
 wrote:

 On 12/2/2011 11:29 AM, Gour wrote:
 Moreover, developing something from scratch would require an enormous
 amount of time in comparison with *just* providing a higher-level D-ish
 API for some of the already available GUI toolkits.
Developing a D GUI from scratch is way beyond our reach at the moment. People have spent enormous efforts developing GUI libraries for other platforms, there's no good reason for not leveraging their efforts.
I absolutely agree. However, I don't think that we should exclude the possibility of building a library from scratch either.
At some point, a decision has to be made. Consider that existing successful GUI libraries have had *enormous* resources poured into them. That just is not possible in the D community right now. And even if it were, do we really want to wait 5 years for it to be built? Of course, if someone still wants to develop a D GUI from scratch, nobody is going to stop them.
I wouldn't suggest that the community wait for a native UI; only that because it takes so long, now is the best time to get started. Full speed ahead with wxD and DWT! They represent a path of least resistance that should absolutely be exploited.
 It's not just the code involved. It's the tutorials, web sites,  
 manuals,
 support, etc., that would have to be reinvented. By developing a D  
 interface
 to an existing one, none of that has to be developed.
This is too true. But if it was easy, everybody would be doing it. You could say the same thing about compilers, but that didn't stop you ... :-)
Frankly, I think a compiler is much easier to build.
Hehe, well, I've found myself completely unable to wrap my head around compilers ... but I studied game development in school and I find graphics pretty easy to work with. I guess it comes down to where our expertise lies. To each his own?
 My intention is not to draw away any devs who could potentially work on
 DMD/Phobos, in fact I want them working hard on those because without  
 them any
 work I do is pointless, and in some cases impossible (showstopper bugs  
 and ICE's
 are rather annoying like that). I suspect that it'll be a case of "me,  
 myself,
 and I" working on a native UI for D for quite some time. But at the  
 same time, I
 want to continue to have conversations with the community at large,  
 probably
 mostly about design and whatnot. If there are people who really want to  
 help I
 won't turn them away, but I'll avoid actively recruiting to make sure  
 that
 DMD/Phobos gets first pick, as they should. Sound good?
That's fine if you want to do that.
Already started to; I've been laying down the skeleton and learning D at the same time. I like the language. But I think I'll leave language design to those who understand it best and stick to what I know. I suspect that this is going to be a "me, myself, and I" project for some time. I'm ok with that. :-) -- Adam Wilson Project Coordinator The Horizon Project http://www.thehorizonproject.org/
Dec 02 2011
next sibling parent reply Andrej Mitrovic <andrej.mitrovich gmail.com> writes:
On 12/3/11, Adam Wilson <flyboynw gmail.com> wrote:
 Already started to; I've been laying down the skeleton and learning D at
 the same time. I like the language. But I think I'll leave language design
 to those who understand it best and stick to what I know. I suspect that
 this is going to be a "me, myself, and I" project for some time. I'm ok
 with that. :-)
Yea, I had something in the works as well (and then a million other side-projects besides that, just like every D programmer out there..). The most difficult part to implement for me was the layouts. Actually I only got that half-way done. But having a backbuffer system, dirty-region blits, alpha blending, mouse tracking (e.g. select and move widget), signals/slots, those things were done in a matter of weeks (mostly because there's already usable code out there, e.g. a newer std.signals, the new CairoD bindings for drawing, win32 bindings, etc.). It's just a fun project I like to play with once in a while. But now that wxD is on github I'm going to use that for some of my projects. It's all too easy getting worked up with an "engine" instead of the actual "game", if you know what I mean ;).
Dec 02 2011
next sibling parent reply Gour <gour atmarama.net> writes:
On Sat, 3 Dec 2011 01:27:23 +0100
Andrej Mitrovic <andrej.mitrovich gmail.com> wrote:

 But now that wxD is on github I'm going to use that for some of
 my projects.
Do you plan working on 2.9/3.0 and/or using SWIG for it?
 It's all too easy getting worked up with an "engine"
 instead of the actual "game", if you know what I mean ;).
He he. ;) Sincerely, Gour -- A person who is not disturbed by the incessant flow of desires, that enter like rivers into the ocean, which is ever being filled but is always still, can alone achieve peace, and not the man who strives to satisfy such desires. http://atmarama.net | Hlapicina (Croatia) | GPG: 52B5C810
Dec 02 2011
parent reply Andrej Mitrovic <andrej.mitrovich gmail.com> writes:
On 12/3/11, Gour <gour atmarama.net> wrote:
 Do you plan working on 2.9/3.0 and/or using SWIG for it?
I can link to 2.8.4 and add more wrapper code in wxc so I can call into 2.8.4. I don't know just how different 2.9 is; if it's not too different maybe I'll just jump straight to wrapping that. I'll be doing this in a fork on github anyway so we'll see how that goes. As for SWIG, I guess I'd better explore that option first before I do everything manually. But I either have to fix some const issues in the swig generator, or I'll have to let klickverbot know about it, or as a last resort I'll use a version of an older compiler compatible with swig output. wxWidgets is a pretty big codebase, I don't know how well swig will handle this (or how well I will handle swig).. Anyway I'll see how it goes the next couple of days..
Dec 03 2011
parent reply Gour <gour atmarama.net> writes:
On Sat, 3 Dec 2011 09:29:45 +0100
Andrej Mitrovic <andrej.mitrovich gmail.com> wrote:

 I can link to 2.8.4 and add more wrapper code in wxc so I can call
 into 2.8.4. I don't know just how different 2.9 is, if it's not
 too different maybe I'll just jump straight to wrapping that. I'll be
 doing this in a fork on github anyway so we'll see how that goes.
OK.
 As for SWIG, I guess I better explore that option first before I do
 everything manually.
I agree.
 But I either have to fix some const issues in the
 swig generator, or I'll have to let klickverbot know about it, or as a
 last resort I'll use a version of an older compiler compatible with
 swig output.
Why not report it to klickverbot?
 wxWidgets is a pretty big codebase, I don't know how well
 swig will handle this (or how well I will handle swig).. Anyway I'll
 see how it goes the next couple of days..
OK. When I get a definite answer about which toolkit to use, I'll join you if it is going to be wx. ;) Sincerely, Gour -- When your intelligence has passed out of the dense forest of delusion, you shall become indifferent to all that has been heard and all that is to be heard. http://atmarama.net | Hlapicina (Croatia) | GPG: 52B5C810
Dec 03 2011
parent reply Andrej Mitrovic <andrej.mitrovich gmail.com> writes:
On 12/3/11, Gour <gour atmarama.net> wrote:
 Why not reporting it to klickverbot?
I just did. The issue was just that of conversion between function pointer types. DMD used to allow invalid conversions (e.g. const to non-const, and vice-versa) until this was fixed a release or two ago. SWIG will have to be fixed but I don't think it's a huge deal.
Dec 03 2011
next sibling parent Gour <gour atmarama.net> writes:
On Sat, 3 Dec 2011 18:25:19 +0100
Andrej Mitrovic <andrej.mitrovich gmail.com> wrote:

 I just did. The issue was just that of conversion between function
 pointer types. DMD used to allow invalid conversions (e.g. const to
 non-const, and vice-versa) until this was fixed a release or two ago.
 SWIG will have to be fixed but I don't think it's a huge deal.
Cool, cool... Thank you for taking care of it. Sincerely, Gour -- In this endeavor there is no loss or diminution, and a little advancement on this path can protect one from the most dangerous type of fear. http://atmarama.net | Hlapicina (Croatia) | GPG: 52B5C810
Dec 03 2011
prev sibling parent reply David Nadlinger <see klickverbot.at> writes:
On 12/3/11 6:25 PM, Andrej Mitrovic wrote:
 On 12/3/11, Gour<gour atmarama.net>  wrote:
 Why not reporting it to klickverbot?
I just did. The issue was just that of conversion between function pointer types. DMD used to allow invalid conversions (e.g. const to non-const, and vice-versa) until this was fixed a release or two ago. SWIG will have to be fixed but I don't think it's a huge deal.
Thanks for letting me know, this was an oversight while porting the D1 version. Embarrassingly, I was even one of the authors of the DMD patch that added strict function pointer type checking, but I haven't touched SWIG in quite a while. Anyway, fixed in trunk along with a few other issues: [D] Fix exception glue code for newer DMD 2 versions. [D] Do not default to 32 bit glue code for DMD anymore. [D] Improved allprotected test case error messages. [D] Test case fix: Aliases now required for non-overridden base class overloads. [D] Test case fix: IntVector holds ints, not size_t. [D] Use stdc.config.c_long/c_ulong to represent C long types. David
Dec 03 2011
next sibling parent Gour <gour atmarama.net> writes:
On Sat, 03 Dec 2011 20:51:52 +0100
David Nadlinger <see klickverbot.at> wrote:

 Thanks for letting me know, this was an oversight while porting the
 D1 version. Embarrassingly, I was even one of the authors of the DMD
 patch that added strict function pointer type checking, but I haven't
 touched SWIG in quite a while.
[...] Thank you very much. It looks like a step forward in making SWIG and its D support ready to bind wx. ;) Sincerely, Gour -- http://atmarama.net | Hlapicina (Croatia) | GPG: 52B5C810
Dec 03 2011
prev sibling next sibling parent Andrej Mitrovic <andrej.mitrovich gmail.com> writes:
On 12/3/11, David Nadlinger <see klickverbot.at> wrote:
 Anyway, fixed in trunk along with a few other issues
Thanks! When I find new issues should I file them to the svn bug tracker or contact you directly?
Dec 03 2011
prev sibling parent Andrej Mitrovic <andrej.mitrovich gmail.com> writes:
On 12/3/11, David Nadlinger <see klickverbot.at> wrote:
 Anyway, fixed in trunk along with a few other issues:
Thanks again. I've had some trouble building SWIG, but that was because libpcre3 was missing; I got it to build, and the samples seem to work now with 2.056.
Dec 14 2011
prev sibling parent reply "Adam Wilson" <flyboynw gmail.com> writes:
On Fri, 02 Dec 2011 16:27:23 -0800, Andrej Mitrovic  
<andrej.mitrovich gmail.com> wrote:

 On 12/3/11, Adam Wilson <flyboynw gmail.com> wrote:
 Already started to; I've been laying down the skeleton and learning D at
 the same time. I like the language. But I think I'll leave language  
 design
 to those who understand it best and stick to what I know. I suspect that
 this is going to be a "me, myself, and I" project for some time. I'm ok
 with that. :-)
Yea, I had something in the works as well (and then a million other side-projects besides that, just like every D programmer out there..). The most difficult part to implement for me was the layouts. Actually I only got that half-way done. But having a backbuffer system, dirty-region blits, alpha blending, mouse tracking (e.g. select and move widget), signals/slots, those things were done in a matter of weeks (mostly because there's already usable code out there, e.g. a newer std.signals, the new CairoD bindings for drawing, win32 bindings, etc.). It's just a fun project I like to play with once in a while. But now that wxD is on github I'm going to use that for some of my projects. It's all too easy getting worked up with an "engine" instead of the actual "game", if you know what I mean ;).
Hehe, I have work projects to keep me in the "game", but engines always have to come before the game. Would you be willing to send me your code? I don't know how much of it I'd end up using, but it would be really helpful in understanding how you got as far as you did and where the trouble points were ... maybe it'll give me a head-start on the project. -- Adam Wilson Project Coordinator The Horizon Project http://www.thehorizonproject.org/
Dec 03 2011
parent reply Andrej Mitrovic <andrej.mitrovich gmail.com> writes:
On 12/4/11, Adam Wilson <flyboynw gmail.com> wrote:
 Would you be willing to send me your code? I don't know how much of it I'd
 end up using, but it would be really helpful in understanding how you got
 as far as you did and where the trouble points were ... maybe it'll give
 me a head-start on the project.
This is still a WIP project with re-architecting on a day-to-day basis. Highly experimental stuff. I'm not willing to share the code yet, sorry. But you are much better off looking at a solid GUI framework such as Harmonia (http://harmonia.terrainformatica.com/doku.php). Even though it's D1 it looks pretty neat; it's a windowless GUI and has everything modularized. It even has theming support and HTML rendering. It is Windows-only though, but I think other OSes could be supported if there were people willing to port it (it isn't explicitly tied to win32).
Dec 05 2011
next sibling parent reply "Adam Wilson" <flyboynw gmail.com> writes:
On Mon, 05 Dec 2011 14:55:55 -0800, Andrej Mitrovic  
<andrej.mitrovich gmail.com> wrote:

 On 12/4/11, Adam Wilson <flyboynw gmail.com> wrote:
 Would you be willing to send me your code? I don't know how much of it  
 I'd
 end up using, but it would be really helpful in understanding how you  
 got
 as far as you did and where the trouble points were ... maybe it'll give
 me a head-start on the project.
This is still a WIP project with re-architecturing on a day-to-day basis. Highly experimental stuff. I'm not willing to share the code yet, sorry. But, you are much better off looking at a solid GUI framework such as Harmonia (http://harmonia.terrainformatica.com/doku.php). Even though it's D1 it looks pretty neat, it's a windowless GUI and has everything modularized. It even has theming support and HTML rendering. It is Windows-only though, but I think other OSes could have been supported if there were people willing to port it (it isn't explicitly tied to win32).
No worries, had to ask. Thanks for the link though, it looks promising. :-) -- Adam Wilson Project Coordinator The Horizon Project http://www.thehorizonproject.org/
Dec 05 2011
parent reply Andrej Mitrovic <andrej.mitrovich gmail.com> writes:
On 12/6/11, Adam Wilson <flyboynw gmail.com> wrote:
 No worries, had to ask. Thanks for the link though, it looks promising. :-)
Listen, if you ever need help I'm in #d, nickname drey_. I think we talked before. It's never a bad idea to exchange ideas, so I'll be there.
Dec 06 2011
parent "Adam Wilson" <flyboynw gmail.com> writes:
On Tue, 06 Dec 2011 01:31:45 -0800, Andrej Mitrovic  
<andrej.mitrovich gmail.com> wrote:

 On 12/6/11, Adam Wilson <flyboynw gmail.com> wrote:
 No worries, had to ask. Thanks for the link though, it looks promising.  
 :-)
Listen, if you ever need help I'm in #d, nickname drey_. I think we talked before. It's never a bad idea to exchange ideas, so I'll be there.
I do believe that we have talked. I go by LightBender in #d. That sounds like a plan. The project is currently in the idea stage and I want to know what other developers think. -- Adam Wilson Project Coordinator The Horizon Project http://www.thehorizonproject.org/
Dec 06 2011
prev sibling parent reply Jacob Carlborg <doob me.com> writes:
On 2011-12-05 23:55, Andrej Mitrovic wrote:
 On 12/4/11, Adam Wilson<flyboynw gmail.com>  wrote:
 Would you be willing to send me your code? I don't know how much of it I'd
 end up using, but it would be really helpful in understanding how you got
 as far as you did and where the trouble points were ... maybe it'll give
 me a head-start on the project.
This is still a WIP project with re-architecturing on a day-to-day basis. Highly experimental stuff. I'm not willing to share the code yet, sorry. But, you are much better off looking at a solid GUI framework such as Harmonia (http://harmonia.terrainformatica.com/doku.php). Even though it's D1 it looks pretty neat, it's a windowless GUI and has everything modularized. It even has theming support and HTML rendering. It is Windows-only though, but I think other OSes could have been supported if there were people willing to port it (it isn't explicitly tied to win32).
Why would one want to use HTML to create a GUI like this? I see that it has something called a "sinking-bubbling event propagation schema". This seems to work backwards compared to how (I assume) all of the OSes implement events in their GUIs. It seems like the library would need to emulate this event schema on top of the native one. That doesn't sound very effective. -- /Jacob Carlborg
Dec 05 2011
parent reply Andrej Mitrovic <andrej.mitrovich gmail.com> writes:
On 12/6/11, Jacob Carlborg <doob me.com> wrote:
 Doesn't sound very effective.
I don't know what that bubbling is all about. You can easily intercept a signal to a child window via std.signals; in Qt this would be installing an event filter of some sort. So sink/bubble seems unnecessary. The library is far from being efficient. It recreates the main window memory buffer on each paint message from the OS (IOW very often); also, unless I'm mistaken, the widgets themselves don't have a backbuffer, so the library assumes their paint routines are not expensive. But that's fixable. I don't know why HTML was used, although that's just an alternative front-end to the library as far as I can tell. It does seem like this library died pretty quickly; I don't recall any projects that used it. Anyway, I don't know if Adam wants to work on a native or non-native GUI; for native ones in pure D (not a wrapper over existing GUIs) he can look at DFL or something else from here: http://prowiki.org/wiki4d/wiki.cgi?GuiLibraries But the library is modular enough and could be used as a starting point, imo.
Dec 06 2011
parent reply "Adam Wilson" <flyboynw gmail.com> writes:
On Tue, 06 Dec 2011 00:48:44 -0800, Andrej Mitrovic  
<andrej.mitrovich gmail.com> wrote:

 On 12/6/11, Jacob Carlborg <doob me.com> wrote:
 Doesn't sound very effective.
I don't know what that bubbling is all about. You can easily intercept a signal to a child window via std.signals, in Qt this would be installing an event filter of some sort. So sink/bubble seems unnecessary. The library is far from being efficient. It recreates a main window memory buffer on each paint message from the OS (IOW very often), also unless I'm mistaken the widgets themselves don't have a backbuffer so the library assumes their paint routines are not expensive. But that's fixable. I don't know why HTML was used, although that's just an alternative front-end to the library as far as I can tell. It does seem like this library died pretty quickly, I don't recall of any projects that used it. Anyway, I don't know if Adam wants to work on a native or non-native GUI, for native ones in pure-D (not a wrapper over existing GUIs) he can look at DFL or something else from here: http://prowiki.org/wiki4d/wiki.cgi?GuiLibraries But the library is modular enough and could be used as a starting point, imo.
The use of HTML is a bit contrived, I think. HTML is a document markup language that was primarily intended to format research papers for transmission and viewing over the early internet. To build a good UI, you need a much more expressive design language. My goal for the project is what you would term non-native in that it does not make use of the OS widgets; however, the plan is to provide native-looking skins for the widgets. I'd like to design something that interfaces with the machine at a lower level than widgets. On Windows I am targeting Direct2D; on Linux, OpenGL is the best candidate; and on OSX, well, OpenGL might work, but OSX has a lot of options to explore, and I don't have access to a Mac. (Anybody know how to get OSX working on VirtualBox?) That wiki link is fantastic, I had no idea there were that many GUI projects going for D! Although it looks like the only ones that did what I'd like to do are dead (Rae and Harmonia)... Guess it's time to give it another shot. -- Adam Wilson Project Coordinator The Horizon Project http://www.thehorizonproject.org/
Dec 06 2011
next sibling parent Andrej Mitrovic <andrej.mitrovich gmail.com> writes:
On 12/6/11, Adam Wilson <flyboynw gmail.com> wrote:
 My goal for the project is what you would term non-native in that it does
 not make use of the OS widgets; however the plan is to provide native
 looking skins for the widgets.
That's what the theming API is for on Windows. OSX might have something similar. Harmonia uses the theming API, for OSX/Linux you can take a look at Qt and how they skin their widgets.
Dec 06 2011
prev sibling next sibling parent Jacob Carlborg <doob me.com> writes:
On 2011-12-06 20:14, Adam Wilson wrote:
 My goal for the project is what you would term non-native in that it
 does not make use of the OS widgets; however the plan is to provide
 native looking skins for the widgets. I'd like to design something that
 interfaces with the machine at a lower level than widgets. On Windows I
 am targeting Direct2D, on Linux, OpenGL is the best candidate, and on
 OSX, well OpenGL might work but OSX has a lot of options to explore, and
 I don't have access to a Mac.
On Mac OS X you don't have many choices on that level, basically only OpenGL. There's Quartz but that is a higher level than OpenGL (I think).
 (Anybody know how to get OSX working on VirtualBox?)
You're only allowed to install Mac OS X virtually on a physical Mac. For a version below 10.7 you are only allowed to use Mac OS X Server. You are allowed to install the client version of 10.7 or higher virtually, but that still needs to be installed on a physical Mac. That said, I read that VMware Fusion 4.1 allows installing client versions of 10.5 and 10.6 because of a mistake made by VMware (I don't know if that allows installation on a physical PC). Another alternative is to install Mac OS X on a physical PC anyway (virtually or natively). How to do that is quite simple and easy to find by searching on Google or similar. Note that you're not allowed to do any of this according to Apple's licenses. I'm just saying it's possible. -- /Jacob Carlborg
Dec 06 2011
prev sibling parent reply Andrej Mitrovic <andrej.mitrovich gmail.com> writes:
On 12/7/11, Andrej Mitrovic <andrej.mitrovich gmail.com> wrote:
 That's what the theming API is for on Windows. OSX might have
 something similar. Harmonia uses the theming API, for OSX/Linux you
 can take a look at Qt and how they skin their widgets.
By theming API I mean Visual Styles on Windows. A simple example is online here: https://github.com/AndrejMitrovic/DWinProgramming/tree/master/Samples/Extra/VisualStyles (run via rdmd ..\..\..\build.d "%CD%") You can also fetch a hit test clip of a control to figure out which part of your theme-drawn control is clickable. I don't know whether OSX has something similar.
Dec 11 2011
parent Jacob Carlborg <doob me.com> writes:
On 2011-12-12 07:54, Andrej Mitrovic wrote:
 On 12/7/11, Andrej Mitrovic<andrej.mitrovich gmail.com>  wrote:
 That's what the theming API is for on Windows. OSX might have
 something similar. Harmonia uses the theming API, for OSX/Linux you
 can take a look at Qt and how they skin their widgets.
By theming API I mean Visual Styles on Windows. A simple example is online here: https://github.com/AndrejMitrovic/DWinProgramming/tree/master/Samples/Extra/VisualStyles (run via rdmd ..\..\..\build.d "%CD%") You can also fetch a hit test clip of a control to figure out which part of your theme-drawn control is clickable. I don't know whether OSX has something similar.
Mac OS X comes bundled with two themes, Blue and Graphite. I know there are applications available that provide more themes; I don't know if they work with the latest versions of Mac OS X though. -- /Jacob Carlborg
Dec 12 2011
prev sibling next sibling parent "Jonathan M Davis" <jmdavisProg gmx.com> writes:
On Saturday, December 03, 2011 01:27:23 Andrej Mitrovic wrote:
 On 12/3/11, Adam Wilson <flyboynw gmail.com> wrote:
 Already started to; i've been laying down the skeleton and learning D at
 the same time. I like the language. But I think I'll leave language
 design to those who understand it best and stick to what I know. I
 suspect that this is going to be a "me, myself, and I" project for some
 time. I'm ok with that. :-)
Yea, I had something in the works as well (and then a million other side-projects beside that, just like every D programmer out there..). The most difficult part to implement for me was the layouts. Actually I only got that half-way done. But having a backbuffer system, dirty-region blits, alpha blending, mouse tracking (e.g. select and move widget), signals/slots, those things were done in a matter of weeks (mostly because there's already usable code out there, e.g. a newer std.signals, the new CairoD bindings for drawing, win32 bindings, etc) . It's just a fun project I like to play with once in a while. But now that wxD is on github I'm going to use that for some of my projects. It's all too easy getting worked up with an "engine" instead of the actual "game", if you know what I mean ;).
If you have a considerably better proposal for std.signals, or know of one and are willing to champion it, then you should try to get it into Phobos as a replacement. If the current std.signals is good enough, then that's unnecessary, of course; but if there's a proposal that's considerably better, then we should at least look at it. - Jonathan M Davis
Dec 02 2011
prev sibling parent reply Andrej Mitrovic <andrej.mitrovich gmail.com> writes:
On 12/3/11, Jonathan M Davis <jmdavisProg gmx.com> wrote:
 If you have a considerably better proposal for std.signals
Johannes Pfau made an updated one, not me. And I'd rather use it for a while, hack in new features when necessary, and debug it properly than shove it into Phobos and make it another DOA module that nobody uses.
Dec 04 2011
parent reply Johannes Pfau <spam example.com> writes:
Andrej Mitrovic wrote:
On 12/3/11, Jonathan M Davis <jmdavisProg gmx.com> wrote:
 If you have a considerably better proposal for std.signals
Johannes Pfau made an updated one, not me. And I'd rather use it for a while and hack in new features when necessary and debug it properly than shove it into phobos and make it another DOA module that nobody uses.
It's not really ready for Phobos. It's probably better than the current std.signals, but still not ready. The API seems to be OK, but it can be extended with some ideas from boost.signals2, and the implementation should be replaced with boost.signals2 code as well. When I find some time I might clean it up and propose it for Phobos, but don't hold your breath. -- Johannes Pfau
Dec 05 2011
parent Jonathan M Davis <jmdavisProg gmx.com> writes:
On Monday, December 05, 2011 09:13:57 Johannes Pfau wrote:
 Andrej Mitrovic wrote:
On 12/3/11, Jonathan M Davis <jmdavisProg gmx.com> wrote:
 If you have a considerably better proposal for std.signals
Johannes Pfau made an updated one, not me. And I'd rather use it for a while and hack in new features when necessary and debug it properly than shove it into phobos and make it another DOA module that nobody uses.
It's not really ready for phobos. Probably better than the current std.signals, but still not ready. The API seems to be ok, but can be extended with some ideas from boost.signals2 and the implementation should be replaced with boost.signals2 code as well. When I find some time I might clean it up and propose it for phobos, but don't hold your breath.
I've never used std.signals, so I don't really know how good it is, but if std.signals is not up to snuff, then it should be replaced with something which is. So, if someone is willing to champion that effort, then that would be great, but unless someone does so, std.signals is likely to stay just as it is. But ideally, if std.signals is poor, it should be replaced with something better. - Jonathan M Davis
Dec 05 2011
prev sibling parent Jacob Carlborg <doob me.com> writes:
On 2011-12-02 23:57, Walter Bright wrote:
 On 12/2/2011 2:15 PM, Adam Wilson wrote:
 On Fri, 02 Dec 2011 12:15:55 -0800, Walter Bright
 <newshound2 digitalmars.com>
 wrote:

 On 12/2/2011 11:29 AM, Gour wrote:
 Moreover, developing something from scratch would require an enormous
 amount of time in comparison with *just* providing a higher-level D-ish
 API for some of the already available GUI toolkits.
Developing a D GUI from scratch is way beyond our reach at the moment. People have spent enormous efforts developing GUI libraries for other platforms, there's no good reason for not leveraging their efforts.
I absolutely agree. However, I don't think that we should exclude the possibility of building a library from scratch either.
At some point, a decision has to be made. Consider that existing successful GUI libraries have had *enormous* resources poured into them. That just is not possible in the D community right now. And even if it were, do we really want to wait 5 years for it to be built? Of course, if someone still wants to develop a D GUI from scratch, nobody is going to stop them.
 It's not just the code involved. It's the tutorials, web sites, manuals,
 support, etc., that would have to be reinvented. By developing a D
 interface
 to an existing one, none of that has to be developed.
This is too true. But if it was easy, everybody would be doing it. You could say the same thing about compilers, but that didn't stop you ... :-)
Frankly, I think a compiler is much easier to build.
Well, that's only you who thinks like that :) -- /Jacob Carlborg
Dec 04 2011
prev sibling parent reply "Adam Wilson" <flyboynw gmail.com> writes:
On Fri, 02 Dec 2011 11:29:18 -0800, Gour <gour atmarama.net> wrote:

 On Fri, 02 Dec 2011 10:56:41 -0800
 "Adam Wilson" <flyboynw gmail.com> wrote:

 Hello Adam,

 Gour, I'd love to talk to you more about GUI's. I am new to D, but I
 have spent years working with GUI toolkits and studying their
 construction.
Well, I'm just someone with not-so-much free time looking to help some GUI bindings project in order to be able to use it for open-source project(s).
Completely understandable. I'm not exactly swimming in free time either ... :-S
 My company would like to move to D in the future but, among other
 things, the lack of a first class GUI toolkit makes that a non-starter
 at the moment. If such a thing existed my company would be very
 willing to jump ship.
This is one of the best paragraphs I have read in this newsgroup in recent times. I hope it will be loud enough.
I hope it will too, but at the same time, wxD and DWT are inadequate for our needs, and will probably always be so. I consider "first-class" something like a QML/JavaFX/WPF framework, and a lot of other companies are in the same boat. I have nothing against wxD or DWT, they just don't work for us.
 And I have permission to use some limited company resources, mostly
 just web hosting for the project right now, but ability to expand that
 latter if the project shows progress.
I believe that stuff like bitbucket/github should be enough...
I agree; I was mostly stating that as an indication that my company is willing to help out if progress is being made.
 I would also be up for leading the project, but a project of this
 size would need lots of contributors. And there still needs to be
 serious discussions about how to design such a project.
/me nods
 Personally, my UI design background tends away from traditional style
 toolkits like wxD and DWT, and as such I would probably want to take
 the project in a different direction than those.
Interesting...
 For example, all of our software at work is built on WPF and I can say
 that I completely believe that WPF style UI toolkits are the way of
 the future.
Hmm...but WPF is Windows-only, right? Moreover, developing something from scratch would require an enormous amount of time in comparison with *just* providing a higher-level D-ish API for some already available GUI toolkit. On top of that, I believe that the best forces available within the D-army are now focused on improving DMD/Phobos, so I don't know how many soldiers are ready to go off developing something new. Let me say that when the time matures, I'd definitely like to have some solution more suitable for D and its advantages over e.g. C(++), but for now I believe in just having some pragmatic solution in the form of an actively-worked-on project covering one of {gtk,qt,wx}.
WPF is Windows only, and that is probably my biggest gripe with it outside of the numerous, and serious, implementation flaws. The closest you can get to WPF on Linux is Moonlight, and that is limited compared to what WPF can do. Argh!

That is true, but there are already two projects out there to accomplish that, and I personally would have no problem with anyone who wanted to work on those instead; they are useful and allow something to be built in the near-term, which will significantly help D. But neither I nor my company can do much of anything short term in D. Specifically, we are still developing, and a GUI toolkit isn't the only thing missing in terms of what we need to port. So I am thinking much longer term here. It absolutely will take time, lots of it, but in the end, I feel that D will be much better positioned in the long-run to take serious market share if it has a state-of-the-art UI toolkit. Toolkits like wxD and DWT absolutely have a place, but the big players are moving away from that model of UI creation, as the news about QML demonstrates. QML is obviously early, but it's headed down the same path as JavaFX and WPF/Silverlight.

The reason I want to start now is that it's still early in the evolution of declarative UIs. WPF is only 5 years old, but it took MS about 300 programmers and 4 years to build, and it's the oldest implementation of that type of UI that I know of outside academia. Given how long it took, starting now is better than starting later. The earlier you show up to market with what people want, the bigger market share you grab. That's good for D. :-)

My intention is not to draw away any devs who could potentially work on DMD/Phobos; in fact I want them working hard on those, because without them my work is pointless, and in some cases impossible (showstopper bugs and ICEs are rather annoying like that). I suspect that it'll be a case of "me, myself, and I" working on a declarative UI for D for quite some time.
But at the same time, I want to continue to have conversations with the community at large, probably mostly about design and whatnot. If there are people who really want to help I won't turn them away, but I'll avoid actively recruiting to make sure that DMD/Phobos gets first pick, as they should. Sound good?
 Besides, why cover the same ground that those two projects are
 already covering? I could list all the pro's and con's that we've
 discovered in actual usage of WPF but I don't want to needlessly
 clutter up this thread which has little to do with UI. :-)
Time & effort which are limited in D community right now?
That is a legitimate assertion, and GUI toolkits are among the most complicated undertakings a group can attempt to carry out.
 You can find me on IRC as LightBender and the email account I list
 here is actively monitored.
I'm available as 'gour' on IRC. Sincerely, Gour
-- Adam Wilson Project Coordinator The Horizon Project http://www.thehorizonproject.org/
Dec 02 2011
parent reply Don <nospam nospam.com> writes:
On 02.12.2011 22:02, Adam Wilson wrote:
 On Fri, 02 Dec 2011 11:29:18 -0800, Gour <gour atmarama.net> wrote:

 On Fri, 02 Dec 2011 10:56:41 -0800
 "Adam Wilson" <flyboynw gmail.com> wrote:

 Hello Adam,

 Gour, I'd love to talk to you more about GUI's. I am new to D, but I
 have spent years working with GUI toolkits and studying their
 construction.
Well, I'm just someone with not-so-much free time looking to help some GUI bindings project in order to be able to use it for open-source project(s).
Completely understandable. I'm not exactly swimming in free time either ... :-S
 My company would like to move to D in the future but, among other
 things, the lack of a first class GUI toolkit makes that a non-starter
 at the moment. If such a thing existed my company would be very
 willing to jump ship.
This is one of the best paragraphs I have read in this newsgroup in recent times. I hope it will be loud enough.
I hope it will too, but at the same time, wxD and DWT are inadequate for our needs, and will probably always be so. I consider "first-class" something like a QML/JavaFX/WPF framework, and a lot of other companies are in the same boat. I have nothing against wxD or DWT, they just don't work for us.
 And I have permission to use some limited company resources, mostly
 just web hosting for the project right now, but ability to expand that
 latter if the project shows progress.
I believe that stuff like bitbucket/github should be enough...
I agree; I was mostly stating that as an indication that my company is willing to help out if progress is being made.
 I would also be up for leading the project, but a project of this
 size would need lots of contributors. And there still needs to be
 serious discussions about how to design such a project.
/me nods
 Personally, my UI design background tends away from traditional style
 toolkits like wxD and DWT, and as such I would probably want to take
 the project in a different direction than those.
Interesting...
 For example, all of our software at work is built on WPF and I can say
 that I completely believe that WPF style UI toolkits are the way of
 the future.
Hmm...but WPF is Windows-only, right? Moreover, developing something from scratch would require an enormous amount of time in comparison with *just* providing a higher-level D-ish API for some already available GUI toolkit. On top of that, I believe that the best forces available within the D-army are now focused on improving DMD/Phobos, so I don't know how many soldiers are ready to go off developing something new. Let me say that when the time matures, I'd definitely like to have some solution more suitable for D and its advantages over e.g. C(++), but for now I believe in just having some pragmatic solution in the form of an actively-worked-on project covering one of {gtk,qt,wx}.
WPF is Windows only, and that is probably my biggest gripe with it outside of the numerous, and serious, implementation flaws. The closest you can get to WPF on Linux is Moonlight, and that is limited compared to what WPF can do. Argh! That is true, but there are already two projects out there to accomplish that, and I personally would have no problem with anyone who wanted to work on those instead; they are useful and allow something to be built in the near-term, which will significantly help D. But neither I nor my company can do much of anything short term in D. Specifically, we are long (10+ years) and a GUI toolkit isn't the only thing missing in terms of what we need to port. So I am thinking much longer term here. It absolutely will take time, lots of it, but in the end, I feel that D will be much better positioned in the long-run to take serious market share if it has a state-of-the-art UI toolkit. Toolkits like wxD and DWT absolutely have a place, but the big players are moving away from that model of UI creation, as the news about QML demonstrates. QML is obviously early, but it's headed down the same path as JavaFX and WPF/Silverlight. The reason I want to start now is that it's still early in the evolution of declarative UIs. WPF is only 5 years old, but it took MS about 300 programmers and 4 years to build, and it's the oldest implementation of that type of UI that I know of outside academia. Given how long it took, starting now is better than starting later. The earlier you show up to market with what people want, the bigger market share you grab. That's good for D. :-)
Not sure about that. If you start later, you can learn more from the mistakes of others.
 My intention is not to draw away any devs who could potentially work on
 DMD/Phobos, in fact I want them working hard on those because without
 them my work is pointless, and in some cases impossible (showstopper
 bugs and ICE's are rather annoying like that). I suspect that it'll be a
 case of "me, myself, and I" working on a declarative UI for D for quite
 some time. But at the same time, I want to continue to have
 conversations with the community at large, probably mostly about design
 and whatnot. If there are people who really want to help I won't turn
 them away, but I'll avoid actively recruiting to make sure that
 DMD/Phobos gets first pick, as they should. Sound good?
The history of D libraries is tragic -- there have been many ambitious projects which have ended up abandoned. (Actually I think this is pretty widespread in open source, not specific to D). I would hate for that list to get any longer. The successful D projects have, as far as I know, always used a bottom-up approach. So I would recommend trying to carve a small aspect off from the large GUI problem, and implementing that. And do it so well that everyone wants to use it.
 Besides, why cover the same ground that those two projects are
 already covering? I could list all the pro's and con's that we've
 discovered in actual usage of WPF but I don't want to needlessly
 clutter up this thread which has little to do with UI. :-)
Time & effort which are limited in D community right now?
That is a legitimate assertion, and GUI toolkits are among the most complicated undertakings a group can attempt to carry out.
 You can find me on IRC as LightBender and the email account I list
 here is actively monitored.
I'm available as 'gour' on IRC. Sincerely, Gour
Dec 04 2011
parent "Adam Wilson" <flyboynw gmail.com> writes:
On Sun, 04 Dec 2011 06:26:39 -0800, Don <nospam nospam.com> wrote:

 On 02.12.2011 22:02, Adam Wilson wrote:
 On Fri, 02 Dec 2011 11:29:18 -0800, Gour <gour atmarama.net> wrote:

 On Fri, 02 Dec 2011 10:56:41 -0800
 "Adam Wilson" <flyboynw gmail.com> wrote:

 Hello Adam,

 Gour, I'd love to talk to you more about GUI's. I am new to D, but I
 have spent years working with GUI toolkits and studying their
 construction.
Well, I'm just someone with not-so-much free time looking to help some GUI bindings project in order to be able to use it for open-source project(s).
Completely understandable. I'm not exactly swimming in free time either ... :-S
 My company would like to move to D in the future but, among other
 things, the lack of a first class GUI toolkit makes that a non-starter
 at the moment. If such a thing existed my company would be very
 willing to jump ship.
This is one of the best paragraphs I have read in this newsgroup in recent times. I hope it will be loud enough.
I hope it will too, but at the same time, wxD and DWT are inadequate for our needs, and will probably always be so. I consider "first-class" something like a QML/JavaFX/WPF framework, and a lot of other companies are in the same boat. I have nothing against wxD or DWT, they just don't work for us.
 And I have permission to use some limited company resources, mostly
 just web hosting for the project right now, but ability to expand that
 latter if the project shows progress.
I believe that stuff like bitbucket/github should be enough...
I agree; I was mostly stating that as an indication that my company is willing to help out if progress is being made.
 I would also be up for leading the project, but a project of this
 size would need lots of contributors. And there still needs to be
 serious discussions about how to design such a project.
/me nods
 Personally, my UI design background tends away from traditional style
 toolkits like wxD and DWT, and as such I would probably want to take
 the project in a different direction than those.
Interesting...
 For example, all of our software at work is built on WPF and I can say
 that I completely believe that WPF style UI toolkits are the way of
 the future.
Hmm...but WPF is Windows-only, right? Moreover, developing something from scratch would require an enormous amount of time in comparison with *just* providing a higher-level D-ish API for some already available GUI toolkit. On top of that, I believe that the best forces available within the D-army are now focused on improving DMD/Phobos, so I don't know how many soldiers are ready to go off developing something new. Let me say that when the time matures, I'd definitely like to have some solution more suitable for D and its advantages over e.g. C(++), but for now I believe in just having some pragmatic solution in the form of an actively-worked-on project covering one of {gtk,qt,wx}.
WPF is Windows only, and that is probably my biggest gripe with it outside of the numerous, and serious, implementation flaws. The closest you can get to WPF on Linux is Moonlight, and that is limited compared to what WPF can do. Argh! That is true, but there are already two projects out there to accomplish that, and I personally would have no problem with anyone who wanted to work on those instead; they are useful and allow something to be built in the near-term, which will significantly help D. But neither I nor my company can do much of anything short term in D. Specifically, we are long (10+ years) and a GUI toolkit isn't the only thing missing in terms of what we need to port. So I am thinking much longer term here. It absolutely will take time, lots of it, but in the end, I feel that D will be much better positioned in the long-run to take serious market share if it has a state-of-the-art UI toolkit. Toolkits like wxD and DWT absolutely have a place, but the big players are moving away from that model of UI creation, as the news about QML demonstrates. QML is obviously early, but it's headed down the same path as JavaFX and WPF/Silverlight. The reason I want to start now is that it's still early in the evolution of declarative UIs. WPF is only 5 years old, but it took MS about 300 programmers and 4 years to build, and it's the oldest implementation of that type of UI that I know of outside academia. Given how long it took, starting now is better than starting later. The earlier you show up to market with what people want, the bigger market share you grab. That's good for D. :-)
Not sure about that. If you start later, you can learn more from the mistakes of others.
By that line of logic, the ideal time to start would be never. There are always mistakes to learn from; sometimes you have to be willing to be the one who makes them so that others can learn. Fortunately for this project, there is a conceptual foundation that has already been laid out by WPF, and believe me, after years of fighting with it, I'm intimately familiar with its mistakes. It's time to move on, fix those mistakes, and make new ones so that someone else can come along and fix those. The only way progress happens is by learning from the mistakes of the past. Time to make some new mistakes! :-)
 My intention is not to draw away any devs who could potentially work on
 DMD/Phobos, in fact I want them working hard on those because without
 them my work is pointless, and in some cases impossible (showstopper
 bugs and ICE's are rather annoying like that). I suspect that it'll be a
 case of "me, myself, and I" working on a declarative UI for D for quite
 some time. But at the same time, I want to continue to have
 conversations with the community at large, probably mostly about design
 and whatnot. If there are people who really want to help I won't turn
 them away, but I'll avoid actively recruiting to make sure that
 DMD/Phobos gets first pick, as they should. Sound good?
The history of D libraries is tragic -- there have been many ambitious projects which have ended up abandoned. (Actually I think this is pretty widespread in open source, not specific to D). I would hate for that list to get any longer. The successful D projects have, as far as I know, always used a bottom-up approach. So I would recommend trying to carve a small aspect off from the large GUI problem, and implementing that. And do it so well that everyone wants to use it.
I agree, and I have to admit that this is of some concern to me as well. Fortunately for me, my company has a financial interest in seeing a useful WPF-like UI library available for a language that isn't controlled by large corporate interests that don't have their developers' best interests in mind (see the current firestorm over Silverlight). D has serious advantages for my company: it's way more productive than C++ yet retains much of the native speed of C-like compiled languages. It's just that there are a couple of large holes in the available toolkits that we need filled; WPF-like UI and WCF-like SOA architecture support come to mind. We'd also like to see a few releases of the compiler that are free of any ICEs in general usage, and a better GC. Compilers and GCs aren't really my capability set, but libraries, particularly of the UI type, are much more my speed. I'm just trying to help where I can. And we have a reason to stick around.

Who knows, maybe someone comes along with something better, but since I don't see that happening any time soon, I don't see any reason not to get started. At the moment I am trying to tackle two small pieces of the puzzle: Property Update Notifications (primarily used for data-binding, but useful elsewhere), and designing a generic drawing interface that different drawing back-ends could be plugged into, like DirectX/OpenGL/others.
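The "Property Update Notifications" piece described above is essentially the observer pattern that data-binding frameworks (including WPF's change-notification mechanism) are built on. Since much of this thread compares D with the JVM world, here is a minimal sketch using Java's standard java.beans.PropertyChangeSupport; the Person class and the "name" property are hypothetical examples of my own, and a D version would presumably use delegates or std.signals instead:

```java
import java.beans.PropertyChangeListener;
import java.beans.PropertyChangeSupport;

// Hypothetical model class: the names ("Person", "name") are illustrative only.
class Person {
    private final PropertyChangeSupport pcs = new PropertyChangeSupport(this);
    private String name = "";

    public void addPropertyChangeListener(PropertyChangeListener listener) {
        pcs.addPropertyChangeListener(listener);
    }

    public String getName() {
        return name;
    }

    public void setName(String newName) {
        String oldName = this.name;
        this.name = newName;
        // firePropertyChange is a no-op when old and new values are equal,
        // so listeners only hear about real changes.
        pcs.firePropertyChange("name", oldName, newName);
    }
}

public class BindingDemo {
    public static void main(String[] args) {
        Person model = new Person();
        // A UI widget would subscribe like this and refresh itself on change.
        model.addPropertyChangeListener(evt ->
            System.out.println(evt.getPropertyName() + " changed to " + evt.getNewValue()));
        model.setName("Gour"); // prints: name changed to Gour
    }
}
```

The point of routing every setter through a single notification hub like this is that the binding layer stays generic: it only needs property names and listeners, not knowledge of each widget.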
 Besides, why cover the same ground that those two projects are
 already covering? I could list all the pro's and con's that we've
 discovered in actual usage of WPF but I don't want to needlessly
 clutter up this thread which has little to do with UI. :-)
Time & effort which are limited in D community right now?
That is a legitimate assertion, and GUI toolkits are among the most complicated undertakings a group can attempt to carry out.
 You can find me on IRC as LightBender and the email account I list
 here is actively monitored.
I'm available as 'gour' on IRC. Sincerely, Gour
-- Adam Wilson Project Coordinator The Horizon Project http://www.thehorizonproject.org/
Dec 04 2011
prev sibling parent reply Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
On 12/1/11 2:59 AM, Walter Bright wrote:
 On 12/1/2011 2:42 AM, Gour wrote:
 I'd like to help with GUI bindings if D community would come more close
 together here with some people ready to lead the herd...
Why not you lead the effort?
http://goo.gl/g60RV Andrei
Dec 01 2011
parent Gour <gour atmarama.net> writes:
On Thu, 01 Dec 2011 09:11:21 -0800
Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> wrote:

 Why not you lead the effort?
 http://goo.gl/g60RV
Thanks. ;) Sincerely, Gour -- You have a right to perform your prescribed duty, but you are not entitled to the fruits of action. Never consider yourself the cause of the results of your activities, and never be attached to not doing your duty. http://atmarama.net | Hlapicina (Croatia) | GPG: 52B5C810
Dec 02 2011
prev sibling parent reply Russel Winder <russel russel.org.uk> writes:
On Thu, 2011-12-01 at 02:19 -0800, Walter Bright wrote:
[...]
 It matters if you are trying to implement a language that needs those features,
 but must run on the JVM. That is what I meant by a "rock" a language that
 targets the JVM must carry.
Indeed. Jython and JRuby have suffered an impedance mismatch of types which leads to lots of problems. However, Charlie Nutter, Ola Bini, et al. did a great job with JRuby, so that it is faster than Ruby -- which is a C code system!
 Which ones do you want to help with?
What can be treated as a loss leader to gain an income stream? Writing a few emails here and there to try and be constructive in the evolution of D is one thing; to actually do something for the codebase or somesuch needs time and effort. I have to focus on what brings in consultancy, analysis, training or expert witness gigs. This all seems to be in the Java, Python, C++ and C arena just now. It therefore makes sense to be active in FOSS projects there. If we can see somewhere where D can gain real traction and become a serious player, and where there is an opportunity for some income, I am entirely happy to chip in my own time -- as I have with Python, PyPy, Groovy, Gant, GPars, Gradle.
--
Russel.
=============================================================================
Dr Russel Winder t: +44 20 7585 2200 voip: sip:russel.winder ekiga.net
41 Buckmaster Road m: +44 7770 465 077 xmpp: russel russel.org.uk
London SW11 1EN, UK w: www.russel.org.uk skype: russel_winder
Dec 02 2011
parent Walter Bright <newshound2 digitalmars.com> writes:
On 12/2/2011 12:46 AM, Russel Winder wrote:
 Which ones do you want to help with?
What can be treated as a loss leader to gain an income stream? Writing a few emails here and there to try and be constructive in the evolution of D is one thing, to actually do something for the codebase or somesuch needs time and effort. I have to focus on what brings in consultancy, analysis, training or expert witness gigs. This all seems to be in the Java, Python, C++ and C arena just now. It therefore makes sense to be active in FOSS projects there. If we can see somewhere where D can gain real traction and become a serious player, and where there is an opportunity for some income, I am entirely happy to chip in my own time -- as I have with Python, PyPy, Groovy, Gant, GPars, Gradle.
I was just asked today to do a 2 day D training seminar, so the demand is out there. I suggest that a great start would be to simply list D as one of the languages you offer consultancy on. I also know that concurrent programming is one area where you have particular expertise, so advice on how to improve D's support of that can help a lot, too. Writing an article about it can really help to raise the awareness of D in the larger community.
Dec 02 2011
prev sibling parent Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
On 12/1/11 1:44 AM, Russel Winder wrote:
 Is the desire to make D a well used
 language?  If so the tenor of threads like this in this news group needs
 to change dramatically.  Bitching about things is the sign of a
 community ill at ease with its own failure to become part of the
 mainstream.  Just look at the Scala mailing lists for a classic example.
 It is hugely counter-productive to the uptake of Scala.  The danger is
 that the D community is its own worst enemy, much like the Scala
 community has a reputation for being.
Very true words. Andrei
Dec 01 2011
prev sibling parent reply Paulo Pinto <pjmlp progtools.org> writes:
I am with Russel here.

I work mostly in JVM and .Net environments, and although I am currently of the
opinion that there are too many VM based applications, we hardly have any
performance issues.

When they do happen we are able to track them mostly to bad coding practices.

JNI or P/Invoke are seldom used for performance reasons and mostly to integrate
with some specific OS feature.


Russel Winder Wrote:

 Walter,
 
 On Wed, 2011-11-30 at 00:17 -0800, Walter Bright wrote:
 On 11/29/2011 11:42 PM, Jacob Carlborg wrote:
 I think it has something to do with Scala trying to be compatible with Java.
It has to run on the JVM, which is a large and heavy rock.
I think the only response possible to this is "bollocks". It may be what you believe, but that doesn't make it true as an abstract statement. -- Russel. ============================================================================= Dr Russel Winder t: +44 20 7585 2200 voip: sip:russel.winder ekiga.net 41 Buckmaster Road m: +44 7770 465 077 xmpp: russel russel.org.uk London SW11 1EN, UK w: www.russel.org.uk skype: russel_winder
Nov 30 2011
parent reply "Regan Heath" <regan netmail.co.nz> writes:
On Wed, 30 Nov 2011 14:46:10 -0000, Paulo Pinto <pjmlp progtools.org>  
wrote:
 I am with Russel here.

 I work mostly in JVM and .Net environments and although currently I am of
 the opinion that there are too many VM based applications, we hardly
 have any performance issues.
Then you're not doing enough :p Seriously tho, I think it's a fairly accurate (as generalisations go) statement that you /can/ get more performance out of natively compiled code. If you don't need that extra performance, ever, then you're not losing anything by using a virtual machine style language.
 When they do happen we are able to track them mostly to bad coding  
 practices.
Yeah, this is the more/most common cause of performance issues (in any language). Perhaps a useful measure of a language is how easy/hard it is to write bad performing code.
 JNI or P/Invoke are seldom used for performance reasons and mostly to  
 integrate with some specific OS feature.
This is one of the downsides to using Java or other sandboxed/JVM style languages: when you actually need bare metal access (and in some domains that is very rarely needed) you have to take a hit in performance and/or functionality. It's another cost you pay, but one which may not be relevant depending on the domain you're in. The ivory tower of performance with safety with quick development time with easy to understand code and tools is simply not possible; there are too many decisions in language development which boil down to a trade-off between one or more of these ideals. Regan
Nov 30 2011
parent reply Paulo Pinto <pjmlp progtools.org> writes:
Well doing lots of transactions per second while aggregating data
from network elements scattered across mobile network stations
seems quite a lot of work to me.

I worked in several projects from quite a few big mobile companies and I 
can say that most code that runs on the network side doing analysis of 
your mobile activities is either JVM or .Net based.

Since 2004 the middleware used by mobile operators has been being 

devices with hardware restrictions are left untouched.

Most of the operators software is now Web, Eclipse/Netbeans or Windows 
Forms/WPF based.

Polyglot programming is preferred to one single language (hammer), and 
as such, most project managers don't see it as a downside that you need to 
mix languages.

--
Paulo

Am 30.11.2011 16:38, schrieb Regan Heath:
 On Wed, 30 Nov 2011 14:46:10 -0000, Paulo Pinto <pjmlp progtools.org>
 wrote:
 I am with Russel here.

 I work mostly in JVM and .Net environments and although currently I am
 the opinion that there are too many VM based applications, we hardly
 have any performance issues.
Then you're not doing enough :p Seriously tho, I think it's a fairly accurate (as generalisations go) statement that you /can/ get more performance out of natively compiled code. If you don't need that extra performance, ever, then you're not losing anything by using a virtual machine style language.
 When they do happen we are able to track them mostly to bad coding
 practices.
Yeah, this is the more/most common cause of performance issues (in any language). Perhaps a useful measure of a language is how easy/hard it is to write bad performing code.
 JNI or P/Invoke are seldom used for performance reasons and mostly to
 integrate with some specific OS feature.
This is one of the downsides to using Java or other sandboxed/JVM style languages: when you actually need bare metal access (and in some domains that is very rarely needed) you have to take a hit in performance and/or functionality. It's another cost you pay, but one which may not be relevant depending on the domain you're in. The ivory tower of performance with safety with quick development time with easy to understand code and tools is simply not possible; there are too many decisions in language development which boil down to a trade-off between one or more of these ideals. Regan
Nov 30 2011
next sibling parent "Nick Sabalausky" <a a.a> writes:
"Paulo Pinto" <pjmlp progtools.org> wrote in message 
news:jb654s$1uu7$1 digitalmars.com...
 Polyglot programming is preferred to one single language (hammer), and as 
 such, most project managers don't see it as a downside that you need to mix 
 languages.
That's one of the reasons I hate doing employment gigs as opposed to contract work or good old "make & sell direct": There are a *lot* of things most managers are completely blind to.
Nov 30 2011
prev sibling parent reply "Regan Heath" <regan netmail.co.nz> writes:
On Wed, 30 Nov 2011 20:52:49 -0000, Paulo Pinto <pjmlp progtools.org>  
wrote:
 Well doing lots of transactions per second while aggregating data
 from network elements scattered across mobile network stations
 seems quite a lot of work to me.
I wasn't suggesting it wasn't "a lot" but that you could "do more" with the same hardware using a compiled language. My actual point was that the deciding factor (as to what language to use) is actually "do you need to do more?" not "can you do more?".
 I worked in several projects from quite a few big mobile companies and I  
 can say that most code that runs on the network side doing analysis of  
 your mobile activities is either JVM or .Net based.
That doesn't surprise me. I have a friend in financial programming and they're primarily using Java also. But... they have rewritten performance critical sections in C/C++ where they discovered they needed to "do more".
 Since 2004 the middleware used by mobile operators has been being  

 devices with hardware restrictions are left untouched.

 Most of the operators software is now Web, Eclipse/Netbeans or Windows  
 Forms/WPF based.

 Polyglot programming is preferred to one single language (hammer), and  
 as such, most project managers don't see as a downside that you need to  
 mix languages.
Oh, I agree, definitely use the right tool for the job. I wasn't saying Java or a JVM language was the wrong tool.. *unless* you actually hit the performance wall and "need more". In which case you can either buy better hardware or move to a more performant tool - like a compiled language. I think one reason for the movement toward Java and JVM style languages is that hardware is getting cheaper and cheaper, and developers cost the same or more. With a 'simpler to write' 'quicker to write' language like Java (where you don't have to learn things like manual memory management) you can more easily train programmers, and they will be cheaper also. Then, you can 'fix' any performance issues you have with better hardware, for less than the cost of training/paying a C/C++ developer to re-develop it. It makes business sense. Regan -- Using Opera's revolutionary email client: http://www.opera.com/mail/
Dec 01 2011
next sibling parent reply Patrick Stewart <ncc1701d starfed.com> writes:
 I think one reason for the movement toward Java and JVM style languages is  
 that hardware is getting cheaper and cheaper, and developers cost the same  
 or more.  With a 'simpler to write' 'quicker to write' language like Java  
 (where you don't have to learn things like manual memory management) you  
 can more easily train programmers, and they will be cheaper also.  Then,  
 you can 'fix' any performance issues you have with better hardware, for  
 less than the cost of training/paying a C/C++ developer to re-develop it.   
 It makes business sense.
 
 Regan
Bingo. Give the man a cookie. Anyway, if there was no C/C++, in what language would we build compilers :) ?
Dec 01 2011
next sibling parent "Regan Heath" <regan netmail.co.nz> writes:
On Thu, 01 Dec 2011 11:59:38 -0000, Patrick Stewart <ncc1701d starfed.com>  
wrote:

 I think one reason for the movement toward Java and JVM style languages is
 that hardware is getting cheaper and cheaper, and developers cost the same
 or more.  With a 'simpler to write' 'quicker to write' language like Java
 (where you don't have to learn things like manual memory management) you
 can more easily train programmers, and they will be cheaper also.  Then,
 you can 'fix' any performance issues you have with better hardware, for
 less than the cost of training/paying a C/C++ developer to re-develop it.
 It makes business sense.

 Regan
Bingo. Give the man a cookie. Anyway, if there was no C/C++, in what language would we build compilers :) ?
Exactly: compilers. And developers having to wait for compilers wastes money and development time. Another win for D vs C/C++ :p

Regan
Dec 01 2011
prev sibling parent reply Paulo Pinto <pjmlp progtools.org> writes:
On 01.12.2011 12:59, Patrick Stewart wrote:
 I think one reason for the movement toward Java and JVM style languages is
 that hardware is getting cheaper and cheaper, and developers cost the same
 or more.  With a 'simpler to write' 'quicker to write' language like Java
 (where you don't have to learn things like manual memory management) you
 can more easily train programmers, and they will be cheaper also.  Then,
 you can 'fix' any performance issues you have with better hardware, for
 less than the cost of training/paying a C/C++ developer to re-develop it.
 It makes business sense.

 Regan
Bingo. Give the man a cookie. Anyway, if there was no C/C++, in what language would we build compilers :) ?
In Ada, Modula-2, Modula-3, Oberon, Component Pascal, Pascal, Delphi, Bartok, just as possible examples? There were programming languages before C and C++ existed, and surely there will be other systems programming languages. D might be such a successor.
Dec 01 2011
parent reply Patrick Stewart <ncc1701d starfed.com> writes:
Paulo Pinto Wrote:

 On 01.12.2011 12:59, Patrick Stewart wrote:
 I think one reason for the movement toward Java and JVM style languages is
 that hardware is getting cheaper and cheaper, and developers cost the same
 or more.  With a 'simpler to write' 'quicker to write' language like Java
 (where you don't have to learn things like manual memory management) you
 can more easily train programmers, and they will be cheaper also.  Then,
 you can 'fix' any performance issues you have with better hardware, for
 less than the cost of training/paying a C/C++ developer to re-develop it.
 It makes business sense.

 Regan
Bingo. Give the man a cookie. Anyway, if there was no C/C++, in what language would we build compilers :) ?
In Ada, Modula-2, Modula-3, Oberon, Component Pascal, Pascal, Delphi, Bartok, just as possible examples? There were programming languages before C and C++ existed, and surely there will be other systems programming languages. D might be such a successor.
Perl, Python, PHP, Java, Haskell, Lua, Ruby... Not quite sure, but these come to mind as languages which are written mostly or completely in C. I guess that beats by far any other language listed here that we could use for building compilers. Correct me if I'm wrong; it is a nice day for learning something new.
Dec 01 2011
next sibling parent reply Russel Winder <russel russel.org.uk> writes:
On Thu, 2011-12-01 at 19:26 -0500, Patrick Stewart wrote:
[...]
 Perl, Python, PHP, Java, Haskell, Lua, Ruby... Not quite sure, but this comes
 to my mind as languages which are written mostly or completely in C. I guess
 it beats by far any other listed language we can use for building compilers.
 Correct me if I'm wrong, it is a nice day for learning something new.

CPython is written in C but PyPy is written in RPython (*). PyPy is about 5 times faster than CPython on most of the performance benchmarks CPython has.

Wasn't the latest Perl initially written in Haskell?

(*) RPython is a subset of Python which allows for the creation of native code executables of interpreters, compilers, etc. that are provably faster than hand written C. http://pypy.org/

--
Russel.
Dr Russel Winder      t: +44 20 7585 2200      voip: sip:russel.winder ekiga.net
41 Buckmaster Road    m: +44 7770 465 077      xmpp: russel russel.org.uk
London SW11 1EN, UK   w: www.russel.org.uk     skype: russel_winder
Dec 01 2011
next sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 12/1/2011 11:59 PM, Russel Winder wrote:
 (*) RPython is a subset of Python which allows for the creation of
 native code executables of interpreters, compilers, etc. that are
 provably faster than hand written C.  http://pypy.org/
Provably faster? I can't find support for that on http://pypy.org
Dec 02 2011
next sibling parent Patrick Stewart <ncc1701d starfed.com> writes:
Walter Bright Wrote:

 On 12/1/2011 11:59 PM, Russel Winder wrote:
 (*) RPython is a subset of Python which allows for the creation of
 native code executables of interpreters, compilers, etc. that are
 provably faster than hand written C.  http://pypy.org/
Provably faster? I can't find support for that on http://pypy.org
"... faster than C/C++" has became marketing term... A lot of BS if you ask me.
Dec 02 2011
prev sibling next sibling parent reply dsimcha <dsimcha yahoo.com> writes:
On 12/2/2011 3:08 AM, Walter Bright wrote:
 On 12/1/2011 11:59 PM, Russel Winder wrote:
 (*) RPython is a subset of Python which allows for the creation of
 native code executables of interpreters, compilers, etc. that are
 provably faster than hand written C. http://pypy.org/
Provably faster? I can't find support for that on http://pypy.org
http://speed.pypy.org/

Not exactly rigorous mathematical proof, but pretty strong evidence. Also, I use PyPy once in a while for projects where speed matters a little but I want to share my code with Python people or want to use Python's huge standard library. Anecdotally, it's definitely faster.

The reason has nothing to do with the language it's written in. It's because PyPy JIT-compiles a lot of the Python code instead of interpreting it.
Dec 02 2011
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 12/2/2011 6:19 AM, dsimcha wrote:
 On 12/2/2011 3:08 AM, Walter Bright wrote:
 On 12/1/2011 11:59 PM, Russel Winder wrote:
 (*) RPython is a subset of Python which allows for the creation of
 native code executables of interpreters, compilers, etc. that are
 provably faster than hand written C. http://pypy.org/
Provably faster? I can't find support for that on http://pypy.org
http://speed.pypy.org/
The charts on that page refuse to display in IE. Nevertheless, it doesn't seem to compare against C, but against CPython.
 Not exactly rigorous mathematical proof, but pretty strong evidence. Also, I use
 PyPy once in a while for projects where speed matters a little but I want to
 share my code with Python people or want to use Python's huge standard library.
 Anecdotally, it's definitely faster. The reason has nothing to do with the
 language it's written in. It's because PyPy JIT compiles a lot of the Python
 code instead of interpreting it.
Dec 02 2011
next sibling parent Somedude <lovelydear mailmetrash.com> writes:
On 02/12/2011 20:51, Walter Bright wrote:
 The charts on that page refuse to display in IE.
 
 Nevertheless, it doesn't seem to compare against C, but against CPython.
 
One of the interesting ideas in this chart is the timeline, which allows one to detect any performance regression/improvement with every new version of PyPy. The test battery is (automatically?) executed and the chart updated.
Dec 02 2011
prev sibling parent reply Russel Winder <russel russel.org.uk> writes:
On Fri, 2011-12-02 at 11:51 -0800, Walter Bright wrote:
[...]
 The charts on that page refuse to display in IE.
You probably need to tell them about this, they don't have resources to check stuff other than cursorily, they rely on user feedback.
 Nevertheless, it doesn't seem to compare against C, but against CPython.
I think I may be responsible for a slight misunderstanding/misrepresentation here. Apologies.

PyPy's main short-term goal is to become the best realization of the Python standard; to supplant CPython as the reference implementation. A few weeks ago Laura Creighton announced on the PyPy Dev email list that they had achieved the "5 times faster than CPython" goal on the critical benchmarks. The comparison is the RPython implementation of Python vs the CPython implementation.

It is an avowed goal of the PyPy project that its performance should be close enough to C that people use PyPy rather than C. Currently people can use Cython to write (slightly annotated) Python code that gets compiled to C and thence to native code to get performance as close to C as makes no difference. The intention is to avoid having to annotate and compile to get the performance. This is a long-term goal for PyPy and not yet near fruition, hence people using Cython for the performance critical sections.

This is an area where Go and PyPy will be fighting head on.
 Not exactly rigorous mathematical proof, but pretty strong evidence. Also, I use
 PyPy once in a while for projects where speed matters a little but I want to
 share my code with Python people or want to use Python's huge standard library.
 Anecdotally, it's definitely faster. The reason has nothing to do with the
 language it's written in. It's because PyPy JIT compiles a lot of the Python
 code instead of interpreting it.

The PyPy JIT is clearly a "big win". I am sure Armin will come up with more stuff :-)

--
Russel.
Dec 03 2011
next sibling parent reply Somedude <lovelydear mailmetrash.com> writes:
On 03/12/2011 10:02, Russel Winder wrote:
 On Fri, 2011-12-02 at 11:51 -0800, Walter Bright wrote:
 [...]
 The charts on that page refuse to display in IE.
You probably need to tell them about this, they don't have resources to check stuff other than cursorily, they rely on user feedback.
 Nevertheless, it doesn't seem to compare against C, but against CPython.
 I think I may be responsible for a slight misunderstanding/misrepresentation
 here. Apologies.

 PyPy's main short-term goal is to become the best realization of the Python
 standard; to supplant CPython as the reference implementation. A few weeks ago
 Laura Creighton announced on the PyPy Dev email list that they had achieved
 the "5 times faster than CPython" goal on the critical benchmarks. The
 comparison is the RPython implementation of Python vs the CPython
 implementation.

 It is an avowed goal of the PyPy project that its performance should be close
 enough to C that people use PyPy rather than C. Currently people can use
 Cython to write (slightly annotated) Python code that gets compiled to C and
 thence to native code to get performance as close to C as makes no difference.
 The intention is to avoid having to annotate and compile to get the
 performance. This is a long-term goal for PyPy and not yet near fruition,
 hence people using Cython for the performance critical sections.

 This is an area where Go and PyPy will be fighting head on.
 Not exactly rigorous mathematical proof, but pretty strong evidence. Also, I use
 PyPy once in a while for projects where speed matters a little but I want to
 share my code with Python people or want to use Python's huge standard library.
 Anecdotally, it's definitely faster. The reason has nothing to do with the
 language it's written in. It's because PyPy JIT compiles a lot of the Python
 code instead of interpreting it.
The PyPy JIT is clearly a "big win". I am sure Armin will come up with more stuff :-)
Well, right now, PyPy is indeed 5 times faster than CPython, but it's still about 4 times slower than LuaJIT and JavaScript V8, which are comparable in speed with Java (especially LuaJIT). And basically, this sort of speed is the best that has been done with a JIT so far, and it seems to be quite hard to improve on.

So maybe they can hope to reach LuaJIT speed in the near future, but before reaching C speed, I suppose there is a long way to go.
Dec 03 2011
next sibling parent Somedude <lovelydear mailmetrash.com> writes:
On 03/12/2011 10:19, Somedude wrote:
 
 Well, right now, PyPy is indeed 5 times faster than CPython, but it's
 still about 4 times slower than LuaJIT and JavaScript V8, which are
 comparable in speed with Java (especially LuaJIT). And basically, this
 sort of speed is the best that has been done with JIT so far and it
 seems to be quite hard to improve on.
 
 So maybe they can hope to reach LuaJIT speed in the near future, but
 before reaching C speed, I suppose there is a long way to go.
Still, it's quite remarkable that scripting languages have come from 2 to 3 orders of magnitude slower than native code to only 1 to 2 orders of magnitude slower, and closer to 1 order with LuaJIT. Also, automatic memory management doesn't have to be wasteful, as Lisp's SBCL and LuaJIT show. According to the Computer Language Benchmarks Game, they both consume at least 10 times less memory than Java (more like 20 times for Lua).
Dec 03 2011
prev sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 12/3/2011 1:19 AM, Somedude wrote:
 So maybe they can hope to reach LuaJIT speed in the near future, but
 before reaching C speed, I suppose there is a long way to go.
About 7-8 years ago, I attended a conference where a Java guru made a nice presentation about how Java was unfairly characterized as slow, and that improved JIT technology had solved the speed problems. But he also said that closing the performance gap with C was 10 years out, and laughingly said it would likely always be 10 years out :-)
Dec 03 2011
next sibling parent reply Jonathan M Davis <jmdavisProg gmx.com> writes:
On Saturday, December 03, 2011 10:55:22 Walter Bright wrote:
 On 12/3/2011 1:19 AM, Somedude wrote:
 So maybe they can hope to reach LuaJIT speed in the near future, but
 before reaching C speed, I suppose there is a long way to go.
About 7-8 years ago, I attended a conference where a Java guru made a nice presentation about how Java was unfairly characterized as slow, and that improved JIT technology had solved the speed problems. But he also said that closing the performance gap with C was 10 years out, and laughingly said it would likely always be 10 years out :-)
I think that what it comes down to is that Java used to be a lot slower than it is now but is now plenty fast for a lot of stuff, and that original reputation for slowness has stuck on some level. In some cases, Java will be as fast as C. In many cases it will not. But it really isn't particularly slow for a lot of what it's used for. So, it really does have an unfair rap at this point for being slow. It really isn't. However, it will never match C for speed such that anyone looking to get every ounce of speed from their CPU is going to find it acceptable. - Jonathan M Davis
Dec 03 2011
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 12/3/2011 11:28 AM, Jonathan M Davis wrote:
 I think that what it comes down to is that Java used to be a lot slower than
 it is now but is now plenty fast for a lot of stuff, and that original
 reputation for slowness has stuck on some level. In some cases, Java will be
 as fast as C. In many cases it will not. But it really isn't particularly slow
 for a lot of what it's used for. So, it really does have an unfair rap at this
 point for being slow. It really isn't. However, it will never match C for
 speed such that anyone looking to get every ounce of speed from their CPU is
 going to find it acceptable.
Java is as fast as C (excluding the slow Java startup time) when the C code is written as Java code (i.e. using types equivalent to Java types, pointers to structs, everything on the heap, etc.). It's when you take advantage of things C has to offer, like user defined value types, pointers, etc., that C pulls way ahead.
Dec 03 2011
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 12/3/2011 12:33 PM, Walter Bright wrote:
 It's when you take advantage of things C has to offer, like user defined value
 types, pointers, etc., that C pulls way ahead.
A couple examples. Take a linked list:

    struct List
    {
        struct List *prev, *next;
        ...payload...
    };

We can do that in Java:

    class List
    {
        List prev;
        List next;
        ...payload...
    }

Right? But hidden in the Java class are two extra entries, a pointer to the vtbl[] and a mutex. Every one of the Java List instances is going to consume 8 more bytes than the C version. Consuming more memory has performance costs. No JIT I know of can fix that.

Secondly, consider the small string optimization that is common in C. Can't do it in Java, and no credible JIT technology can fix that, either.
Dec 03 2011
parent Jonathan M Davis <jmdavisProg gmx.com> writes:
On Saturday, December 03, 2011 12:44:55 Walter Bright wrote:
 On 12/3/2011 12:33 PM, Walter Bright wrote:
 It's when you take advantage of things C has to offer, like user defined
 value types, pointers, etc., that C pulls way ahead.
 A couple examples. Take a linked list:

     struct List
     {
         struct List *prev, *next;
         ...payload...
     };

 We can do that in Java:

     class List
     {
         List prev;
         List next;
         ...payload...
     }

 Right? But hidden in the Java class are two extra entries, a pointer to the
 vtbl[] and a mutex. Every one of the Java List instances is going to consume
 8 more bytes than the C version. Consuming more memory has performance costs.
 No JIT I know of can fix that.

 Secondly, consider the small string optimization that is common in C. Can't
 do it in Java, and no credible JIT technology can fix that, either.
Oh, I completely agree. I'm just saying that Java has an undeserved bad rap for slowness at this point given that it's quite fast in comparison to many languages and is often plenty fast for what most programmers do with it. It's when you _really_ need speed that it can't deliver. - Jonathan M Davis
Dec 03 2011
prev sibling parent Somedude <lovelydear mailmetrash.com> writes:
On 03/12/2011 19:55, Walter Bright wrote:
 On 12/3/2011 1:19 AM, Somedude wrote:
 So maybe they can hope to reach LuaJIT speed in the near future, but
 before reaching C speed, I suppose there is a long way to go.
About 7-8 years ago, I attended a conference where a Java guru made a nice presentation about how Java was unfairly characterized as slow, and that improved JIT technology had solved the speed problems. But he also said that closing the performance gap with C was 10 years out, and laughingly said it would likely always be 10 years out :-)
Yeah, when I wrote "I suppose there is a long way to go", I really thought "it's not going to happen". The overall Java/HotSpot performance hasn't really improved since version 1.4.2, which is almost 10 years old now. The only noticeable performance improvements are in the area of better, more tunable GCs. The JVMs from IBM and Oracle (before it bought Sun) aren't particularly faster. I think this is a good indication that we have reached some limit in what the current generation of bytecode interpreters can do. I don't expect any JIT language to top that.
Dec 03 2011
prev sibling parent reply Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
On 12/3/11 3:02 AM, Russel Winder wrote:
 The PyPy JIT is clearly a "big win".  I am sure Armin will come up with
 more stuff :-)
Do they do anything about the GIL? Andrei
Dec 03 2011
next sibling parent dsimcha <dsimcha yahoo.com> writes:
On 12/3/2011 10:39 AM, Andrei Alexandrescu wrote:
 On 12/3/11 3:02 AM, Russel Winder wrote:
 The PyPy JIT is clearly a "big win". I am sure Armin will come up with
 more stuff :-)
Do they do anything about the GIL? Andrei
Unfortunately, no. I checked into this at one point because I basically use parallelism for everything in D and have an 8-core computer at work. Therefore, if PyPy is a factor of 5 (just making up numbers) slower than D for equivalently written code, it's 40x slower once you consider that parallelism is easy in D and really hard except at the coarsest grained levels in PyPy.
Dec 03 2011
prev sibling parent reply Russel Winder <russel russel.org.uk> writes:
On Sat, 2011-12-03 at 09:39 -0600, Andrei Alexandrescu wrote:
 On 12/3/11 3:02 AM, Russel Winder wrote:
 The PyPy JIT is clearly a "big win".  I am sure Armin will come up with
 more stuff :-)
 Do they do anything about the GIL?
Soon. Having failed to convince Guido he had to remove the GIL from CPython, my message has been taken up by the PyPy folk and they are looking at using STM as a technique to be able to remove the GIL. Not sure whether this will hit performance. The multiprocessing package will need a severe rewrite if STM works to replace the GIL.

--
Russel.
Dec 04 2011
parent reply Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
On 12/4/11 3:51 AM, Russel Winder wrote:
 On Sat, 2011-12-03 at 09:39 -0600, Andrei Alexandrescu wrote:
 On 12/3/11 3:02 AM, Russel Winder wrote:
 The PyPy JIT is clearly a "big win".  I am sure Armin will come up with
 more stuff :-)
Do they do anything about the GIL?
Soon. Having failed to convince Guido he had to remove the GIL from CPython, my message has been taken up by the PyPy folk and they are looking at using STM as a technique to be able to remove the GIL. Not sure whether this will hit performance. The multiprocessing package will need a severe rewrite if STM works to replace the GIL.
What's the plan with doing I/O and other irreversible actions during STM transactions then? Andrei
Dec 04 2011
parent reply Brad Anderson <eco gnuk.net> writes:
On Sun, Dec 4, 2011 at 8:07 AM, Andrei Alexandrescu <
SeeWebsiteForEmail erdani.org> wrote:

 On 12/4/11 3:51 AM, Russel Winder wrote:

 On Sat, 2011-12-03 at 09:39 -0600, Andrei Alexandrescu wrote:

 On 12/3/11 3:02 AM, Russel Winder wrote:

 The PyPy JIT is clearly a "big win".  I am sure Armin will come up with
 more stuff :-)
Do they do anything about the GIL?
Soon. Having failed to convince Guido he had to remove the GIL from CPython, my message has been taken up by the PyPy folk and they are looking at using STM as a technique to be able to remove the GIL. Not sure whether this will hit performance. The multiprocessing package will need a severe rewrite if STM works to replace the GIL.
What's the plan with doing I/O and other irreversible actions during STM transactions then? Andrei
One of the PyPy guys briefly answers this in the comments of this article on the subject of PyPy and removal of the GIL. http://morepypy.blogspot.com/2011/06/global-interpreter-lock-or-how-to-kill.html
Dec 04 2011
parent reply Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
On 12/4/11 10:34 PM, Brad Anderson wrote:
     What's the plan with doing I/O and other irreversible actions during
     STM transactions then?

     Andrei


 One of the PyPy guys briefly answers this in the comments of this
 article on the subject of PyPy and removal of the GIL.
 http://morepypy.blogspot.com/2011/06/global-interpreter-lock-or-how-to-kill.html
Unfortunately the post and the discussion are not very informative. The approaches discussed (write() finishes a transaction and starts another one) are rather naive and fail to address even the simplest scenarios involving e.g. interleaving input and output.

Anyway, my comment is strictly on that discussion, not on the entire effort within the Python community. When we were working on D's concurrency model Bartosz pushed for a while quite strongly in favor of STM, but I predicted it wouldn't pan out, at least not in time for us to rely on it. It was 2008 and the issues with STM were by then understood but not resolved properly.

It's still too early to make a verdict one way or another. There's a steady stream of related publications coming. It will be interesting to see what happens.

Andrei
Dec 04 2011
parent Russel Winder <russel russel.org.uk> writes:
On Sun, 2011-12-04 at 23:08 -0600, Andrei Alexandrescu wrote:
[...]
 Anyway, my comment is strictly on that discussion, not on the entire
 effort within the Python community. When we were working on D's
 concurrency model Bartosz pushed for a while quite strongly in favor of
 STM, but I predicted it won't pan out, at least not in time for us to
 rely on it. It was 2008 and the issues with STM were by then understood
 but not resolved properly. It's still too early to make a verdict one
 way or another. There's a steady stream of related publications coming.
 It will be interesting to see what happens.

The current hints appear to be that STM does not scale well. I haven't done any experiments myself, and I haven't seen real data from people who do claim to have. The languages with STM that I know are Haskell and Clojure. Haskell has fundamental problems with parallelism because it is a lazy language -- though there is a lot of work trying to deal with this generally, and Simon Peyton Jones and Simon Marlow have their data parallel stuff which works well from what I can see. Clojure will have to be investigated further...

--
Russel.
Dec 05 2011
prev sibling parent so <so so.so> writes:
On Fri, 02 Dec 2011 10:08:38 +0200, Walter Bright  
<newshound2 digitalmars.com> wrote:

 On 12/1/2011 11:59 PM, Russel Winder wrote:
 (*) RPython is a subset of Python which allows for the creation of
 native code executables of interpreters, compilers, etc. that are
 provably faster than hand written C.  http://pypy.org/
Provably faster? I can't find support for that on http://pypy.org
Maybe because it is in itself an oxymoron. We all know ASM is irrelevant because every language is able to outperform it in every possible way.
Dec 03 2011
prev sibling next sibling parent reply Patrick Stewart <ncc1701d starfed.com> writes:
Russel Winder Wrote:

| CPython is written in C but PyPy is written in RPython (*).  PyPy is
| about 5 times faster than CPython on most of the performance benchmarks
| CPython has.

CPython is the main implementation and the first Python that came out. It is still
the bleeding edge. I think that counts as a big win for C.

| Wasn't the latest Perl initially written in Haskell?

And Haskell in C?

Besides, any compiler capable of bootstrapping itself has to be written in some
other language at the beginning.

It just makes me laugh when I see statements written by people using language
Y, whose implementation (or even worse - whose interpreter or VM) is written in
language X:
 "Y is faster than X!" or "X is crap and outdated!". It is just a load of BS.
Dec 02 2011
next sibling parent reply Jonathan M Davis <jmdavisProg gmx.com> writes:
On Friday, December 02, 2011 05:18:15 Patrick Stewart wrote:
 Russel Winder Wrote:
 | CPython is written in C but PyPy is written in RPython (*).  PyPy is
 | about 5 times faster than CPython on most of the performance benchmarks
 | CPython has.
 
 CPython is the main implementation and the first Python that came out. It is
 still the bleeding edge. I think that counts as a big win for C.
 | Wasn't the latest Perl initially written in Haskell?
 
 And Haskell in C?
 
 Besides, any compiler capable of bootstrapping itself has to be written in
 some other language at the beginning.
 
 It just makes me laugh when I see statements written by people using
 language Y, whose implementation (or even worse - whose interpreter or VM)
 is written in language X: "Y is faster than X!" or "X is crap and
 outdated!". It is just a load of BS.
Really what it comes down to is that many languages are geared more towards something other than performance - e.g. programmer productivity - so they're not really performant enough to really be the best choice for a compiler. And in some cases, they just don't have the features that it takes. But they're still useful for many programming tasks and are therefore well-worth using in those circumstances. However, it's certainly short-sighted to say that language Y is better than X at performance when language X is needed in order to implement language Y. At best, language Y is generally better for many tasks due to features other than performance and therefore obsoletes language X for many tasks. But there's no way that a language that isn't performant enough to actually implement a compiler in is going to fully replace those which _are_ that performant. - Jonathan M Davis
Dec 02 2011
parent Paulo Pinto <pjmlp progtools.org> writes:
Still, you can then make use of native compilers instead of targeting a specific
VM and still achieve quite good performance.

Like Microsoft does with the Bartok compiler for .NET, or Mono does together with
Unity when targeting iPhone and Android, just as simple examples.



Jonathan M Davis Wrote:

 On Friday, December 02, 2011 05:18:15 Patrick Stewart wrote:
 Russel Winder Wrote:
 | CPython is written in C but PyPy is written in RPython (*).  PyPy is
 | about 5 times faster than CPython on most of the performance benchmarks
 | CPython has.
 
 CPython is the main implementation and the first Python that came out. It is
 still the bleeding edge. I think that counts as a big win for C.
 | Wasn't the latest Perl initially written in Haskell?
 
 And Haskell in C?
 
 Besides, any compiler capable of bootstrapping itself has to be written in
 some other language at the beginning.
 
 It just makes me laugh when I see statements written by people using
 language Y, whose implementation (or even worse - whose interpreter or VM)
 is written in language X: "Y is faster than X!" or "X is crap and
 outdated!". It is just a load of BS.
Really what it comes down to is that many languages are geared more towards something other than performance - e.g. programmer productivity - so they're not really performant enough to really be the best choice for a compiler. And in some cases, they just don't have the features that it takes. But they're still useful for many programming tasks and are therefore well-worth using in those circumstances. However, it's certainly short-sighted to say that language Y is better than X at performance when language X is needed in order to implement language Y. At best, language Y is generally better for many tasks due to features other than performance and therefore obsoletes language X for many tasks. But there's no way that a language that isn't performant enough to actually implement a compiler in is going to fully replace those which _are_ that performant. - Jonathan M Davis
Dec 02 2011
prev sibling parent reply Paulo Pinto <pjmlp progtools.org> writes:
The GHC compiler is written in Haskell, which is the reference implementation.

Patrick Stewart Wrote:

 Russel Winder Wrote:
 
 | CPython is written in C but PyPy is written in RPython (*).  PyPy is
 | about 5 times faster than CPython on most of the performance benchmarks
 | CPython has.
 
 CPython is the main implementation and the first Python that came out. It is
still bleeding edge. I think that counts as a big win for C.
 
 | Wasn't the latest Perl initially written in Haskell?
 
 And Haskell in C?
 
 Besides, any compiler capable of bootstrapping itself has to be written in
some other language at the beginning.
 
 It just makes me laugh when I see statements written by people using language
Y, whose implementation (or even worse - whose interpreter or VM) is written in
language X: 
  "Y is faster than X!" or "X is crap and outdated!". It is just a load of BS.
Dec 02 2011
parent reply Timon Gehr <timon.gehr gmx.ch> writes:
On 12/02/2011 02:09 PM, Paulo Pinto wrote:
 The GHC compiler is written in Haskell, which is the reference implementation.
GHC has long been completely dependent on a C compiler though, because it could only compile the Haskell code to C and then invoked GCC afaik.
Dec 02 2011
parent reply Paulo Pinto <pjmlp progtools.org> writes:
Am 02.12.2011 18:49, schrieb Timon Gehr:
 On 12/02/2011 02:09 PM, Paulo Pinto wrote:
 The GHC compiler is written in Haskell, which is the reference
 implementation.
GHC has long been completely dependent on a C compiler though, because it could only compile the Haskell code to C and then invoked GCC afaik.
So what? Many compiler implementors use a target language instead of pure native code generation to spare effort developing a proper compiler backend. Current GHC versions compile directly to native code, even though you can still use the old C code generation backend.
Dec 02 2011
parent reply Timon Gehr <timon.gehr gmx.ch> writes:
On 12/02/2011 09:45 PM, Paulo Pinto wrote:
 Am 02.12.2011 18:49, schrieb Timon Gehr:
 On 12/02/2011 02:09 PM, Paulo Pinto wrote:
 The GHC compiler is written in Haskell, which is the reference
 implementation.
GHC has long been completely dependent on a C compiler though, because it could only compile the Haskell code to C and then invoked GCC afaik.
So what?
Nothing, just trivia.
 Many compiler implementors use a target language instead of pure native
 code generation to spare effort developing a proper compiler backend.

 Current GHC versions compile directly to native code, even though you
 can still use the old C code generation backend.
I know.
Dec 02 2011
parent reply Paulo Pinto <pjmlp progtools.org> writes:
Ah ok, I thought you were starting a "C above all" thread. :)

Am 02.12.2011 21:51, schrieb Timon Gehr:
 On 12/02/2011 09:45 PM, Paulo Pinto wrote:
 Am 02.12.2011 18:49, schrieb Timon Gehr:
 On 12/02/2011 02:09 PM, Paulo Pinto wrote:
 The GHC compiler is written in Haskell, which is the reference
 implementation.
GHC has long been completely dependent on a C compiler though, because it could only compile the Haskell code to C and then invoked GCC afaik.
So what?
Nothing, just trivia.
 Many compiler implementors use a target language instead of pure native
 code generation to spare effort developing a proper compiler backend.

 Current GHC versions compile directly to native code, even though you
 can still use the old C code generation backend.
I know.
Dec 02 2011
next sibling parent Timon Gehr <timon.gehr gmx.ch> writes:
On 12/02/2011 09:58 PM, Paulo Pinto wrote:
 Ah ok, I thought you were starting a "C above all" thread. :)
No way. I really like Haskell too ;).
Dec 02 2011
prev sibling parent reply so <so so.so> writes:
On Fri, 02 Dec 2011 22:58:32 +0200, Paulo Pinto <pjmlp progtools.org>  
wrote:

 Ah ok, I thought you were starting a "C above all" thread. :)
Well, that would not be a discussion, would it? Anyone against it? C is above all for the things it was developed for, simple as that.
Dec 03 2011
parent reply Don <nospam nospam.com> writes:
On 03.12.2011 15:30, so wrote:
 On Fri, 02 Dec 2011 22:58:32 +0200, Paulo Pinto <pjmlp progtools.org>
 wrote:

 Ah ok, I thought you were starting a "C above all" thread. :)
Well, that would not be a discussion, would it? Anyone against it? C is above all for the things it was developed for, simple as that.
I'm against it. C's machine model is outdated, and it has never performed as well on floating point code as Fortran does. It gets trounced by a language designed for 1950's computers!!! I find it quite bizarre that both C and Fortran are used as if they were absolute performance standards, a theoretical upper bound.
Dec 04 2011
parent so <so so.so> writes:
On Sun, 04 Dec 2011 17:00:44 +0200, Don <nospam nospam.com> wrote:

 On 03.12.2011 15:30, so wrote:
 On Fri, 02 Dec 2011 22:58:32 +0200, Paulo Pinto <pjmlp progtools.org>
 wrote:

 Ah ok, I thought you were starting a "C above all" thread. :)
Well, that would not be a discussion, would it? Anyone against it? C is above all for the things it was developed for, simple as that.
I'm against it. C's machine model is outdated, and it has never performed as well on floating point code as Fortran does. It gets trounced by a language designed for 1950's computers!!!
Neither C nor C++ was designed for scientific purposes. AFAIK this is explicitly stated in both their bibles.
 I find it quite bizarre that both C and Fortran are used as if they were  
 absolute performance standards, a theoretical upper bound.
It is not the point I was trying to make. We know that, in practice, they are both the best performing languages of their kind. Can one make a better language? Absolutely! Have we seen one yet? Not really. And this is the main reason I was arguing all along for backwards compatibility with C. The potential to make a better language is big, but the taboos are much bigger.
Dec 04 2011
prev sibling parent Gour <gour atmarama.net> writes:
On Fri, 02 Dec 2011 07:59:03 +0000
Russel Winder <russel russel.org.uk> wrote:

 Wasn't the latest Perl initially written in Haskell?
Yes, Pugs, but not maintained any longer, afaik. Sincerely, Gour -- When your intelligence has passed out of the dense forest of delusion, you shall become indifferent to all that has been heard and all that is to be heard. http://atmarama.net | Hlapicina (Croatia) | GPG: 52B5C810
Dec 02 2011
prev sibling next sibling parent Gour <gour atmarama.net> writes:
On Thu, 01 Dec 2011 19:26:59 -0500
Patrick Stewart <ncc1701d starfed.com> wrote:

 Perl, Python, PHP, Java, Haskell, Lua, Ruby... Not quite sure, but
 this comes to my mind as languages which  are written mostly or
 completely in C.=20
Haskell (GHC) 50:50 Haskell/C... Sincerely, Gour -- A person who is not disturbed by the incessant flow of desires - that enter like rivers into the ocean, which is ever being filled but is always still - can alone achieve peace, and not the man who strives to satisfy such desires. http://atmarama.net | Hlapicina (Croatia) | GPG: 52B5C810
Dec 02 2011
prev sibling parent Paulo Pinto <pjmlp progtools.org> writes:
I was referring to languages that have existing native implementations and have
been used in the industry to develop systems software like operating systems,
compilers, drivers and so on. 

Most developers tend to use C just because it is what is available and because it
is among the skills one is expected to have when looking for jobs.

In the early MS-DOS and Windows 3.x days I used Turbo Pascal to develop quite a
few system-level applications before switching to C and C++ for some years.

Patrick Stewart Wrote:

 Paulo Pinto Wrote:
 
 Am 01.12.2011 12:59, schrieb Patrick Stewart:
 I think one reason for the movement toward Java and JVM style languages is
 that hardware is getting cheaper and cheaper, and developers cost the same
 or more.  With a 'simpler to write' 'quicker to write' language like Java
 (where you don't have to learn things like manual memory management) you
 can more easily train programmers, and they will be cheaper also.  Then,
 you can 'fix' any performance issues you have with better hardware, for
 less than the cost of training/paying a C/C++ developer to re-develop it.
 It makes business sense.

 Regan
Bingo. Give the man a cookie. Anyway, if there was no C/C++, in what language would we build compilers :) ?
 In Ada, Modula-2, Modula-3, Oberon, Component Pascal, Pascal, Delphi, or Bartok, just as possible examples? There were programming languages before C and C++ existed, and surely there will be other systems programming languages. D might be such a successor.
 Perl, Python, PHP, Java, Haskell, Lua, Ruby... Not quite sure, but these come to my mind as languages which are written mostly or completely in C. I guess it beats by far any other listed language we can use for building compilers. Correct me if I'm wrong; it is a nice day for learning something new.
Dec 02 2011
prev sibling parent reply Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
On 12/1/11 3:38 AM, Regan Heath wrote:
 I think one reason for the movement toward Java and JVM style languages
 is that hardware is getting cheaper and cheaper, and developers cost the
 same or more.
Well, that trend may be stalling. Andrei
Dec 01 2011
next sibling parent reply Paulo Pinto <pjmlp progtools.org> writes:
Am 01.12.2011 18:12, schrieb Andrei Alexandrescu:
 On 12/1/11 3:38 AM, Regan Heath wrote:
 I think one reason for the movement toward Java and JVM style languages
 is that hardware is getting cheaper and cheaper, and developers cost the
 same or more.
Well, that trend may be stalling. Andrei
I would like that as well, actually. One reason I have been paying more attention to Go and D, among others, is that even though I admire all the advantages of the current VM environments, I am also starting to miss the times of native compilation, being old enough to still remember Z80 assembly. Recently I tried to call our local team's attention to the fact that alongside JVM and .Net projects we could tackle some C++. I even gave WinRT, iPhone, Facebook's PHP->C++ compiler, Qt and the Android NDK as examples. The guy's response was 'Oh, I thought C++ was no longer relevant', and the matter was closed. -- Paulo
Dec 01 2011
parent reply "Nick Sabalausky" <a a.a> writes:
"Paulo Pinto" <pjmlp progtools.org> wrote in message 
news:jb93uf$2eq5$1 digitalmars.com...
 Recently I tried to call for attention to our local team that alongside
 JVM and .Net projects, we could tackle some C++. I even gave WinRT,
 iPhone, Facebook's PHP->C++ compiler, Qt and Android NDK as examples.

 The guy's response was 'Oh I thought C++ was no longer relevant', and the
 matter was closed.
Oh my god. That's genuinely disturbing to think that such enormously ignorant people are out there in the dev world. Our field really *has* turned into the fashion industry. :(
Dec 01 2011
next sibling parent Jonathan M Davis <jmdavisProg gmx.com> writes:
On Friday, December 02, 2011 01:07:51 Nick Sabalausky wrote:
 "Paulo Pinto" <pjmlp progtools.org> wrote in message
 news:jb93uf$2eq5$1 digitalmars.com...
 
 Recently I tried to call for attention to our local team that alongside
 JVM and .Net projects, we could tackle some C++. I even gave WinRT,
 iPhone, Facebook's PHP->C++ compiler, Qt and Android NDK as examples.
 
 The guy's response was 'Oh I thought C++ was no longer relevant', and the
 matter was closed.
Oh my god. That's genuinely disturbing to think that such enormously ignorant people are out there in the dev world. Our field really *has* turned into the fashion industry. :(
Well, if you work in a part of the software industry that hasn't used C++ in years, and you don't mess around with it at home, then there's a good chance that you're not going to be very aware of how much C++ is still used. There are plenty of developers who never liked C++ in the first place and so have stayed away from it, and there are plenty of developers who really don't do much with software outside of their jobs. So, it's not that hard to have reasonably competent developers who really don't know where C++ stands right now, and it's _very_ easy to have incompetent developers who have no idea where it stands. For the most part, I would expect good developers to at least be aware that C++ is still very much used in some areas, but less competent ones are less likely to, and even good developers don't always know everything that they should (though thinking that C++ is irrelevant is pretty bad). Really, I don't find their response all that surprising. Sad yes, but not entirely surprising. However, I _am_ frequently surprised at how incompetent many programmers are. - Jonathan M Davis
Dec 01 2011
prev sibling parent so <so so.so> writes:
On Fri, 02 Dec 2011 08:07:51 +0200, Nick Sabalausky <a a.a> wrote:

 "Paulo Pinto" <pjmlp progtools.org> wrote in message
 news:jb93uf$2eq5$1 digitalmars.com...
 Recently I tried to call for attention to our local team that alongside
 JVM and .Net projects, we could tackle some C++. I even gave WinRT,
 iPhone, Facebook's PHP->C++ compiler, Qt and Android NDK as examples.

 The guy's response was 'Oh I thought C++ was no longer relevant', and the
 matter was closed.
Oh my god. That's genuinely disturbing to think that such enormously ignorant people are out there in the dev world. Our field really *has* turned into the fashion industry. :(
Don't lose hope! There are also those that resisted the C to C++ transition; the reason was that the transition was not able to sustain itself. It didn't solve many of the problems C was facing, and it came with its own baggage of problems. You can argue about their concern but, IMO, time proved them right. I was late to that transition, and when I started, I started with C++ and have been stuck there since. The reason is the exact same reason C programmers were stuck with C: there was/is NO other language able to fill their shoes. Now you can pick one of three: . Do nothing at all by using existing languages. . Do what C++ did and cut the community in two. . Do what C++ tried, but do it right. Don't inherit the predecessors' problems. Come with facts rather than fiction. You can find tons of resources on why C developers acted the way they did. You don't necessarily see only Linus and company beating a newbie; there are resources where you actually find pretty solid arguments.
Dec 03 2011
prev sibling parent reply Don <nospam nospam.com> writes:
On 01.12.2011 18:12, Andrei Alexandrescu wrote:
 On 12/1/11 3:38 AM, Regan Heath wrote:
 I think one reason for the movement toward Java and JVM style languages
 is that hardware is getting cheaper and cheaper, and developers cost the
 same or more.
Well, that trend may be stalling. Andrei
The cheaper hardware, or the expensive developers?
Dec 02 2011
parent Jonathan M Davis <jmdavisProg gmx.com> writes:
On Friday, December 02, 2011 10:44:27 Don wrote:
 On 01.12.2011 18:12, Andrei Alexandrescu wrote:
 On 12/1/11 3:38 AM, Regan Heath wrote:
 I think one reason for the movement toward Java and JVM style
 languages
 is that hardware is getting cheaper and cheaper, and developers cost
 the
 same or more.
Well, that trend may be stalling. Andrei
The cheaper hardware, or the expensive developers?
The hardware. Essentially, we're running into more situations where energy and machine costs exceed developer costs (e.g. server farms) or where performance matters more than developer costs due to more restrictive hardware (e.g. smartphones). Java and JVM style languages favor developer productivity - which is one of the reasons that they've been used so heavily over the past decade or so - but in those situations where performance and resource utilization are much more critical, more performant languages - such as C++ - are better. http://channel9.msdn.com/posts/C-and-Beyond-2011-Herb-Sutter-Why-C Of course, if D is performant enough, it potentially manages to be both performant enough for such situations _and_ highly productive for developers. But given the various issues and complaints with regards to GC performance and unnecessary heap allocations, I'm not sure that it's really there yet unless you go out of your way to avoid the GC. Hopefully, we'll get there though. - Jonathan M Davis
Dec 02 2011
prev sibling parent reply Jeff Nowakowski <jeff dilacero.org> writes:
On 11/30/2011 03:17 AM, Walter Bright wrote:
 It has to run on the JVM, which is a large and heavy rock.
You should check the beams in your eyes before talking about the motes in others. Did you see this recent post? "I don't think porting any game to D is a good idea right now. I've done some major game development in D. Half my code uses manual memory management, and still the D garbage collector is a major performance issue. Unless you want to do all of the memory management yourself, which pretty much results in not using Phobos and most of the cool features in D, I wouldn't recommend porting a bigger game to D." The JVM garbage collector is miles ahead of D's. I think it's pitiful that a language that aims to be a C++ replacement has been an utter failure in the gaming area, the one area where C++ has reigned supreme.
Nov 30 2011
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 11/30/2011 10:12 AM, Jeff Nowakowski wrote:
 On 11/30/2011 03:17 AM, Walter Bright wrote:
 It has to run on the JVM, which is a large and heavy rock.
The JVM garbage collector is miles ahead of D's.
Yes, it is. What I meant by the "large and heavy rock" is the difficulty of expressing any sort of semantics that are not Java semantics in the JVM bytecode.
 I think it's pitiful that a
 language that aims to be a C++ replacement has been an utter failure in the
 gaming area, the one area where C++ has reigned supreme.
In C++, one does all the memory management manually.
Nov 30 2011
parent reply Jeff Nowakowski <jeff dilacero.org> writes:
On 11/30/2011 01:38 PM, Walter Bright wrote:
 Yes, it is. What I meant by the "large and heavy rock" is the difficulty
 of expressing any sort of semantics that are not Java semantics in the
 JVM bytecode.
Fair enough.
 In C++, one does all the memory management manually.
But in C++ libraries are designed with this in mind. You didn't address his point: "Unless you want to do all of the memory management yourself, which pretty much results in not using phobos and most of the cool features in D." And isn't the point of D to relieve you of the burden of doing stuff like memory management? You should read Tim Sweeney's (Gears of War developer) "The Next Mainstream Programming Language", where the slide for Gameplay Simulation says, "Usually garbage-collected." I assume by this he means that for C++ the developers end up writing their own garbage collector inside the program. http://www.st.cs.uni-saarland.de/edu/seminare/2005/advanced-fp/docs/sweeny.pdf If D could demonstrably solve the problems outlined in these slides, you'd have a whole industry at your fingertips.
Nov 30 2011
next sibling parent Gour <gour atmarama.net> writes:
On Wed, 30 Nov 2011 15:11:53 -0500
Jeff Nowakowski <jeff dilacero.org> wrote:

 And isn't the point of D to relieve you of the burden of doing stuff
 like memory management? You should read Tim Sweeney's (Gears of War
 developer) "The Next Mainstream Programming Language", where the
 slide for Gameplay Simulation says, "Usually garbage-collected."
Wasn't he thinking about Haskell or an FP language in that presentation? Sincerely, Gour -- In this endeavor there is no loss or diminution, and a little advancement on this path can protect one from the most dangerous type of fear. http://atmarama.net | Hlapicina (Croatia) | GPG: 52B5C810
Nov 30 2011
prev sibling next sibling parent Walter Bright <newshound2 digitalmars.com> writes:
On 11/30/2011 12:11 PM, Jeff Nowakowski wrote:
 But in C++ libraries are designed with this in mind. You didn't address his
 point: "Unless you want to do all of the memory management yourself, which
 pretty much results in not using phobos and most of the cool features in D."
As in C++, you do have to take considerable care with the memory management.
 And isn't the point of D to relieve you of the burden of doing stuff like
memory
 management?
When you're managing a lot of memory, and performance is critical, nobody has invented a magic bullet for that where you can "fire and forget" memory consumption.
 You should read Tim Sweeney's (Gears of War developer) "The Next
 Mainstream Programming Language", where the slide for Gameplay Simulation says,
 "Usually garbage-collected." I assume by this he means that for C++ the
 developers end up writing their own garbage collector inside the program.

 http://www.st.cs.uni-saarland.de/edu/seminare/2005/advanced-fp/docs/sweeny.pdf

 If D could demonstrably solve the problems outlined in these slides, you'd have
 a whole industry at your fingertips.
Yes, I've read that. That all said, Andrei and I have been investigating using reference counting for the collection classes.
Nov 30 2011
prev sibling parent reply bigsandwich <bigsandwich gmail.com> writes:
Jeff Nowakowski Wrote:

 On 11/30/2011 01:38 PM, Walter Bright wrote:
 Yes, it is. What I meant by the "large and heavy rock" is the difficulty
 of expressing any sort of semantics that are not Java semantics in the
 JVM bytecode.
Fair enough.
 In C++, one does all the memory management manually.
But in C++ libraries are designed with this in mind. You didn't address his point: "Unless you want to do all of the memory management yourself, which pretty much results in not using phobos and most of the cool features in D." And isn't the point of D to relieve you of the burden of doing stuff like memory management? You should read Tim Sweeney's (Gears of War developer) "The Next Mainstream Programming Language", where the slide for Gameplay Simulation says, "Usually garbage-collected." I assume by this he means that for C++ the developers end up writing their own garbage collector inside the program. http://www.st.cs.uni-saarland.de/edu/seminare/2005/advanced-fp/docs/sweeny.pdf If D could demonstrably solve the problems outlined in these slides, you'd have a whole industry at your fingertips.
I don't usually post, but "someone is wrong on the internet" (http://xkcd.com/386/) :) "Usually garbage collected" in the case of Unreal refers to UnrealScript, which is not C++ at all. It's a language similar to Java that is compiled into bytecode. Most games use different allocation schemes for different parts of the game, including garbage collection. You wouldn't want to use GC in performance-critical code anyway, so it probably doesn't matter that it's that slow. What does matter is having a way to isolate GC to the few libraries where you're willing to pay for it and turn it off for everything else. It would be great if there was a way to build a static library and just pass a switch that would make compilation fail if there was GC allocation in the library. Other good features if you want to push into game dev: 1) Be able to override allocators (i.e. new, delete, or whatever they are called in D now) so that you can allocate out of different heaps. 2) Be able to control when GC occurs (i.e. every x frames, or in between levels, or only when there is a lull in the action).
Nov 30 2011
next sibling parent reply Jeff Nowakowski <jeff dilacero.org> writes:
On 11/30/2011 03:58 PM, bigsandwich wrote:
 "Usually garbage collected" in the case of Unreal refers to Unreal
 Script which is not C++ at all.  Its a language similar to Java that
 is compiled into bytecode.
It doesn't say that in the slides. It says that they use C++ *and* script code. The slides are also talking about Gears of War, too, not just Unreal.
 Most games use allocations schemes for different parts of the game,
 including garbage collection.
Which just repeats what I said, "I assume by this he means that for C++ the developers end up writing their own garbage collector inside the program."
 You wouldn't want to use GC in performance critical code anyway, so
 it probably doesn't matter that its that slow.
Check the slides again. It has to run at 30-60 frames per second. This is "soft" real-time.
 What does matter is having a way to isolate GC to the few libraries
 where your willing to pay for it and turn it off for everything
 else.
Interestingly enough, under Musings he says, "Memory model: Garbage collection should be the only option". Real-time garbage collection that actually works well in a game setting would be the ideal.
Nov 30 2011
next sibling parent bigsandwich <bigsandwich gmail.com> writes:
Jeff Nowakowski Wrote:

 On 11/30/2011 03:58 PM, bigsandwich wrote:
 "Usually garbage collected" in the case of Unreal refers to Unreal
 Script which is not C++ at all.  Its a language similar to Java that
 is compiled into bytecode.
It doesn't say that in the slides. It says that they use C++ *and* script code. The slides are also talking about Gears of War, too, not just Unreal.
No, it says "Simulation Code" which in every Unreal game I've worked on is all written in Unreal Script, which is garbage collected. The C++ code isn't, and the simulation is not written in C++. I'm not saying there aren't games that use some form of GC in C++, I'm just saying that the kind of code he's talking about in the slide isn't written in C++ AT ALL if you are using Unreal.
 Most games use allocations schemes for different parts of the game,
 including garbage collection.
Which just repeats what I said, "I assume by this he means that for C++ the developers end up writing their own garbage collector inside the program."
Why would you assume this?
 You wouldn't want to use GC in performance critical code anyway, so
 it probably doesn't matter that its that slow.
Check the slides again. It has to run at 30-60 frames per second. This is "soft" real-time.
Yes, that's why you can get away with a scripting language and a GC for "some" parts of the game, the parts that are not perf intensive. Also notice that he talks about concurrency. Modern game engines need to be multi-threaded, and the sim may update at a different rate than, for example, rendering.
 What does matter is having a way to isolate GC to the few libraries
 where your willing to pay for it and turn it off for everything
 else.
Interestingly enough, under Musings he says, "Memory model: Garbage collection should be the only option". Real-time garbage collection that actually works well in a game setting would be the ideal.
I think you are reading too much into those slides. He's talking about an ideal DSL for game programming that can cover all the bases. That doesn't exist, and I'm not sure it's even possible to create such a thing. I think you could probably get close, though.
Nov 30 2011
prev sibling parent reply Kagamin <spam here.lot> writes:
Jeff Nowakowski Wrote:

 You wouldn't want to use GC in performance critical code anyway, so
 it probably doesn't matter that its that slow.
Check the slides again. It has to run at 30-60 frames per second. This is "soft" real-time.
How does it work in Supreme Commander, for example (.net)?
Nov 30 2011
parent reply Somedude <lovelydear mailmetrash.com> writes:
Le 01/12/2011 08:54, Kagamin a écrit :
 Jeff Nowakowski Wrote:
 
 You wouldn't want to use GC in performance critical code anyway, so
 it probably doesn't matter that its that slow.
Check the slides again. It has to run at 30-60 frames per second. This is "soft" real-time.
How does it work in Supreme Commander, for example (.net)?
If you want something that is comparable, you can have a look at the code of the Spring RTS game engine. I used to hack a little on it a long while ago.
Dec 02 2011
parent reply "Marco Leise" <Marco.Leise gmx.de> writes:
Am 02.12.2011, 18:54 Uhr, schrieb Somedude <lovelydear mailmetrash.com>:

 Le 01/12/2011 08:54, Kagamin a écrit :
 Jeff Nowakowski Wrote:

 You wouldn't want to use GC in performance critical code anyway, so
 it probably doesn't matter that its that slow.
Check the slides again. It has to run at 30-60 frames per second. This is "soft" real-time.
How does it work in Supreme Commander, for example (.net)?
If you want something that is comparable, you can have a look at the code of the Spring RTS game engine. I used to hack a little on it a long while ago.
I could imagine it is most important for fast online games, like first person shooters or racing games. The GC stopping the application for a couple of ms could mean your aim 'jumps over' your opponent or you drive into a wall.
Dec 02 2011
parent Somedude <lovelydear mailmetrash.com> writes:
Le 02/12/2011 19:30, Marco Leise a écrit :
 Am 02.12.2011, 18:54 Uhr, schrieb Somedude <lovelydear mailmetrash.com>:
 
 Le 01/12/2011 08:54, Kagamin a écrit :
 Jeff Nowakowski Wrote:

 You wouldn't want to use GC in performance critical code anyway, so
 it probably doesn't matter that its that slow.
Check the slides again. It has to run at 30-60 frames per second. This is "soft" real-time.
How does it work in Supreme Commander, for example (.net)?
If you want something that is comparable, you can have a look at the code of the Spring RTS game engine. I used to hack a little on it a long while ago.
I could imagine it is most important for fast online games, like first person shooters or racing games. The GC stopping the application four a couple ms could mean your aim 'jumps over' your opponent or you drive into a wall.
I believe games like Supreme Commander or Spring RTS are very stressful on the CPU, actually more so than FPS games (but probably less on the GPU). One has to manage the pathfinding and collisions, and simulate the scripted behaviour of thousands of units and bullets in real time. Plus the scripted AI. Supreme Commander is multithreaded (I would guess pathfinding in its own thread, sound too, and maybe the graphics engine), but last time I checked (a while ago now), Spring RTS still wasn't.
Dec 02 2011
prev sibling parent reply Timon Gehr <timon.gehr gmx.ch> writes:
On 11/30/2011 09:58 PM, bigsandwich wrote:
 Jeff Nowakowski Wrote:

 On 11/30/2011 01:38 PM, Walter Bright wrote:
 Yes, it is. What I meant by the "large and heavy rock" is the difficulty
 of expressing any sort of semantics that are not Java semantics in the
 JVM bytecode.
Fair enough.
 In C++, one does all the memory management manually.
But in C++ libraries are designed with this in mind. You didn't address his point: "Unless you want to do all of the memory management yourself, which pretty much results in not using phobos and most of the cool features in D." And isn't the point of D to relieve you of the burden of doing stuff like memory management? You should read Tim Sweeney's (Gears of War developer) "The Next Mainstream Programming Language", where the slide for Gameplay Simulation says, "Usually garbage-collected." I assume by this he means that for C++ the developers end up writing their own garbage collector inside the program. http://www.st.cs.uni-saarland.de/edu/seminare/2005/advanced-fp/docs/sweeny.pdf If D could demonstrably solve the problems outlined in these slides, you'd have a whole industry at your fingertips.
I don't usually post, but "someone is wrong on the internet" (http://xkcd.com/386/) :) "Usually garbage collected" in the case of Unreal refers to Unreal Script which is not C++ at all. Its a language similar to Java that is compiled into bytecode.
Code of that kind could be written in (Safe)D perfectly fine, and it would certainly perform better as well.
 Most games use allocations schemes for different parts of the game, including
garbage collection.  You wouldn't want to use GC in performance critical code
anyway, so it probably doesn't matter that its that slow.

 What does matter is having a way to isolate GC to the few libraries where your
willing to pay for it and turn it off for everything else.  It would be great
if there was a way to build a static library and just pass a switch that would
make compilation fail if there was GC allocation in the library.
Afaik this is planned.
 Other good features if you want to push into game dev:
 1) Be able to override allocators (ie new, delete, or whatever they are called
in D now) so that you can allocate out of different heaps.
http://www.d-programming-language.org/class.html#allocators

Andrei wants to deprecate them in favour of std.conv.emplace. I have already used std.conv.emplace to good effect for some simple custom class allocators. Every allocation in my performance-sensitive code looks like this:

New!Class(params)

I can then replace the allocator for a whole module by changing one small alias at the top of the module, to quickly test the performance of different allocation schemes on typical workloads.
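A minimal sketch of what such a New!Class(params) helper might look like, assuming malloc-backed storage (the names heapNew and New are hypothetical illustrations, not from the post):

```d
import std.conv : emplace;
import core.stdc.stdlib : malloc;

// Hypothetical helper: grab raw memory from the C heap and construct
// a class instance in place with std.conv.emplace.
T heapNew(T, Args...)(Args args) if (is(T == class))
{
    enum size = __traits(classInstanceSize, T);
    void[] mem = malloc(size)[0 .. size];
    return emplace!T(mem, args);
}

// One alias at the top of a module selects the allocation scheme;
// swapping it out is how different schemes can be benchmarked quickly.
alias New = heapNew;
```

Usage would then read `New!SomeClass(args)` wherever the post writes `New!Class(params)`; a pool or arena allocator could be substituted behind the same alias.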
 2) Be able to control when GC occurs (ie every x frames, or in between levels,
or only when there is a lull in the action).
http://www.d-programming-language.org/phobos/core_memory.html

In D, you have full control over the GC. You can disable and enable it, you can explicitly start a collection, and you can even ask it to free as much of its internal data structures as possible, etc. (And there is still a huge potential for improvements of its implementation! :o))

The quite common claim that you cannot use Phobos without cluttering up all your code with allocations is just not true. Phobos is not an Object-oriented class library. There are seldom hidden allocations, and it is almost always obvious when allocations must happen. The most useful high-level features of D are compile time features that do not harm performance (on the contrary).
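The control described above maps directly onto core.memory.GC; a small sketch (the frame-loop framing is illustrative only):

```d
import core.memory : GC;

void main()
{
    GC.disable();   // suppress implicit collections, e.g. during gameplay;
                    // allocations still succeed, but no collection interrupts
    // ... performance-critical work ...
    GC.enable();

    GC.collect();   // explicitly collect at a convenient moment,
                    // e.g. between levels or during a lull in the action
    GC.minimize();  // return unused pools to the operating system
}
```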
Nov 30 2011
parent reply "Regan Heath" <regan netmail.co.nz> writes:
On Wed, 30 Nov 2011 22:04:13 -0000, Timon Gehr <timon.gehr gmx.ch> wrote:

 The quite common claim that you cannot use Phobos without cluttering up  
 all your code with allocations is just not true. Phobos is not an  
 Object-oriented class library. There are seldom hidden allocations, and  
 it is almost always obvious when allocations must happen. The most  
 useful high-level features of D are compile time features that do not  
 harm performance (on the contrary).
But, it is true you cannot use the ~ operator on strings without involving the GC, right?

Maybe this isn't a problem for people who want to manually manage memory for performance reasons, but it is right at the very lowest levels of "what is D" and it seems like something where it should be possible to choose how the memory for a string is allocated etc.

R

--
Using Opera's revolutionary email client: http://www.opera.com/mail/
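As an aside, a hedged sketch: run-time `~` does allocate from the GC heap on each use, but building into a growable buffer amortizes those allocations (it still uses the GC, just far less often; fully GC-free strings need manual buffers). The joinWords helper here is illustrative only:

```d
import std.array : appender;

// Build a string with a growable buffer instead of repeated `a ~ b`:
// appender reallocates geometrically rather than once per concatenation.
string joinWords(string[] words)
{
    auto buf = appender!string();
    foreach (i, w; words)
    {
        if (i) buf.put(' ');
        buf.put(w);
    }
    return buf.data;
}
```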
Dec 01 2011
next sibling parent reply Jacob Carlborg <doob me.com> writes:
On 2011-12-01 12:26, Regan Heath wrote:
 On Wed, 30 Nov 2011 22:04:13 -0000, Timon Gehr <timon.gehr gmx.ch> wrote:

 The quite common claim that you cannot use Phobos without cluttering
 up all your code with allocations is just not true. Phobos is not an
 Object-oriented class library. There are seldom hidden allocations,
 and it is almost always obvious when allocations must happen. The most
 useful high-level features of D are compile time features that do not
 harm performance (on the contrary).
But, it is true you cannot use the ~ operator on strings without involving the GC, right? Maybe this isn't a problem for people who want to manually manage memory for performance reasons but it is right at the very lowest levels of "what is D" and it seems like something where it should be possible to choose how the memory for a string is allocated etc. R
You can replace the garbage collector with your own implementation. -- /Jacob Carlborg
Dec 01 2011
parent "Regan Heath" <regan netmail.co.nz> writes:
On Thu, 01 Dec 2011 12:24:13 -0000, Jacob Carlborg <doob me.com> wrote:

 On 2011-12-01 12:26, Regan Heath wrote:
 On Wed, 30 Nov 2011 22:04:13 -0000, Timon Gehr <timon.gehr gmx.ch>  
 wrote:

 The quite common claim that you cannot use Phobos without cluttering
 up all your code with allocations is just not true. Phobos is not an
 Object-oriented class library. There are seldom hidden allocations,
 and it is almost always obvious when allocations must happen. The most
 useful high-level features of D are compile time features that do not
 harm performance (on the contrary).
But, it is true you cannot use the ~ operator on strings without involving the GC, right? Maybe this isn't a problem for people who want to manually manage memory for performance reasons but it is right at the very lowest levels of "what is D" and it seems like something where it should be possible to choose how the memory for a string is allocated etc. R
You can replace the garbage collector with your own implementation.
True, and any serious game dev or similar workshop using D for performance-critical code probably will. And I realise that improving the existing GC, and writing several more GC implementations so that swapping them in and out is as easy as a compile-time switch, is still low on the priority list, for good reason.

But it does mean that writing code in D without the GC is not trivial, and the problems are sometimes subtle. Problems that can be glossed over by a statement saying you can manually manage memory in D, which while technically true is not actually true in any meaningful way.

Ideally, and I'm sure we'll get there eventually, we should be able to turn the GC on and off with a flag at compile time, with no other code changes (WRT how Phobos and druntime handle memory). Which probably means that memory allocated within Phobos/druntime needs a 'delete' or similar statement which is a no-op in the presence of the GC, and a 'free' otherwise. Or some similar construct.

Regan

--
Using Opera's revolutionary email client: http://www.opera.com/mail/
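One hedged sketch of such a construct, using a version switch (the version identifier and the release name are hypothetical, not an existing feature):

```d
// Hypothetical: compile with -version=NoGC to make release an explicit free;
// in the default GC build it is a no-op and the collector reclaims the memory.
version (NoGC)
{
    void release(T)(ref T obj) if (is(T == class))
    {
        import core.stdc.stdlib : free;
        destroy(obj);            // run the destructor
        free(cast(void*) obj);   // return the memory (assumes malloc'd storage)
        obj = null;
    }
}
else
{
    void release(T)(ref T obj) if (is(T == class))
    {
        obj = null;              // GC build: drop the reference, nothing to free
    }
}
```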
Dec 01 2011
prev sibling next sibling parent Timon Gehr <timon.gehr gmx.ch> writes:
On 12/01/2011 12:26 PM, Regan Heath wrote:
 On Wed, 30 Nov 2011 22:04:13 -0000, Timon Gehr <timon.gehr gmx.ch> wrote:

 The quite common claim that you cannot use Phobos without cluttering
 up all your code with allocations is just not true. Phobos is not an
 Object-oriented class library. There are seldom hidden allocations,
 and it is almost always obvious when allocations must happen. The most
 useful high-level features of D are compile time features that do not
 harm performance (on the contrary).
But, it is true you cannot use the ~ operator on strings without involving the GC, right?
I mostly use that feature at compile time.
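For illustration, a concatenation forced to compile time involves no run-time GC activity at all (a small sketch):

```d
// '~' evaluated during compilation (CTFE): the result is baked into the
// binary as a single literal, so no GC allocation happens at run time.
enum string greeting = "Hello, " ~ "world";
static assert(greeting == "Hello, world");

// By contrast, concatenating two run-time strings with '~' allocates
// a fresh array from the GC heap each time.
```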
 Maybe this isn't a problem for people who want to manually manage memory
 for performance reasons but it is right at the very lowest levels of
 "what is D" and it seems like something where it should be possible to
 choose how the memory for a string is allocated etc.

 R
Dec 01 2011
prev sibling parent reply Somedude <lovelydear mailmetrash.com> writes:
Le 01/12/2011 12:26, Regan Heath a écrit :
 On Wed, 30 Nov 2011 22:04:13 -0000, Timon Gehr <timon.gehr gmx.ch> wrote:
 
 The quite common claim that you cannot use Phobos without cluttering
 up all your code with allocations is just not true. Phobos is not an
 Object-oriented class library. There are seldom hidden allocations,
 and it is almost always obvious when allocations must happen. The most
 useful high-level features of D are compile time features that do not
 harm performance (on the contrary).
But, it is true you cannot use the ~ operator on strings without involving the GC, right? Maybe this isn't a problem for people who want to manually manage memory for performance reasons but it is right at the very lowest levels of "what is D" and it seems like something where it should be possible to choose how the memory for a string is allocated etc. R
OTOH, string concatenation is not the most critical thing for game writers, is it? I would see it more likely in XML parsing libraries, but D is already the fastest performer in this area (but maybe it could be faster without the GC?).
Dec 02 2011
next sibling parent Don <nospam nospam.com> writes:
On 02.12.2011 19:02, Somedude wrote:
 Le 01/12/2011 12:26, Regan Heath a écrit :
 On Wed, 30 Nov 2011 22:04:13 -0000, Timon Gehr<timon.gehr gmx.ch>  wrote:

 The quite common claim that you cannot use Phobos without cluttering
 up all your code with allocations is just not true. Phobos is not an
 Object-oriented class library. There are seldom hidden allocations,
 and it is almost always obvious when allocations must happen. The most
 useful high-level features of D are compile time features that do not
 harm performance (on the contrary).
But, it is true you cannot use the ~ operator on strings without involving the GC, right? Maybe this isn't a problem for people who want to manually manage memory for performance reasons but it is right at the very lowest levels of "what is D" and it seems like something where it should be possible to choose how the memory for a string is allocated etc. R
OTOH, string concatenation is not the most critical thing for game writers, is it ? I would see it more likely in XML parsing libraries, but D is already the fastest performer in this area (but maybe it could be faster without GC ?).
I believe the high speed was achieved without any memory allocation at all. D's slices make this possible. You're more likely to see concatenation in XML _creation_ rather than _parsing_.
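A tiny sketch of why slices make allocation-free scanning possible (the tagName helper is illustrative, not taken from any real parser):

```d
// Returns the element name of an opening tag as a slice into the input:
// no copy, no allocation -- just a view of the original buffer.
string tagName(string xml)
{
    size_t i = 1;                                       // skip '<'
    while (i < xml.length && xml[i] != '>' && xml[i] != ' ')
        ++i;
    return xml[1 .. i];
}
```

For example, `tagName("<name attr='1'>")` yields a slice equal to "name" without copying a single character.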
Dec 02 2011
prev sibling next sibling parent Timon Gehr <timon.gehr gmx.ch> writes:
On 12/02/2011 07:02 PM, Somedude wrote:
 Le 01/12/2011 12:26, Regan Heath a écrit :
 On Wed, 30 Nov 2011 22:04:13 -0000, Timon Gehr<timon.gehr gmx.ch>  wrote:

 The quite common claim that you cannot use Phobos without cluttering
 up all your code with allocations is just not true. Phobos is not an
 Object-oriented class library. There are seldom hidden allocations,
 and it is almost always obvious when allocations must happen. The most
 useful high-level features of D are compile time features that do not
 harm performance (on the contrary).
But, it is true you cannot use the ~ operator on strings without involving the GC, right? Maybe this isn't a problem for people who want to manually manage memory for performance reasons but it is right at the very lowest levels of "what is D" and it seems like something where it should be possible to choose how the memory for a string is allocated etc. R
OTOH, string concatenation is not the most critical thing for game writers, is it ? I would see it more likely in XML parsing libraries, but D is already the fastest performer in this area (but maybe it could be faster without GC ?).
I would claim that it is fast there mainly because of the combination of a humble dependence on the GC and array slicing. Array slicing is a lot less powerful without any GC support. (Of course, the particular way the parser is implemented matters a great deal. The soon-to-be-replaced std.xml module does not hold a candle to Tango's XML parsers, performance-wise.)
Dec 02 2011
prev sibling parent so <so so.so> writes:
On Fri, 02 Dec 2011 20:02:05 +0200, Somedude <lovelydear mailmetrash.com>  
wrote:

 OTOH, string concatenation is not the most critical thing for game
 writers, is it ?
 I would see it more likely in XML parsing libraries, but D is already
 the fastest performer in this area (but maybe it could be faster without
 GC ?).
XML has almost no value whatsoever. The reasons are tools; this includes manual memory management and GC issues, and library "development" issues. There are also "must have" features provided either by the language or by the compiler (which then leads to platform dependency). I know some of you (Walter is one) don't like the idea. All you need to do is just check "any" popular open-source HP library and see how wrong you are. I have absolutely no idea why people are against this idea. Maybe before " " you were right, but now?
Dec 03 2011
prev sibling parent Russel Winder <russel russel.org.uk> writes:
On Wed, 2011-11-30 at 08:42 +0100, Jacob Carlborg wrote:
[...]
 I wonder to what extent the inefficiencies he mentioned (such as the lambdas
 being sugar for anon classes) could be due to the JVM itself. Or if the
 reason is primarily something else, such as something about Scala's internal
 design or just its implementation. Maybe Scala tries to maximize
 compatibility with Java, and if so, maybe that's the main underlying cause?
 Or again, maybe just inherent attributes of the JVM itself (although that
 would run contrary to what I've heard many people claim about the modern
 JVM)?
I think it has something to do with Scala trying to be compatible with Java. There are lots of other issues, many relating to "type erasure", which is a millstone for Java and something Scala now depends on -- all the Manifest stuff is there to replace reification of type parameters for generic types. Java 9, 10, or 11 may well remove type erasure and reify the type parameters in the class file and hence in the verifier, running class, and type checking. CLR has already done this, of course.

Java 8 will bring lambda functions as a part of the language, not realized via instantiating anonymous classes; this may well cause Scala to rethink many things. On the other hand, Java will already have rewritten its data structures library to allow for lambda functions. Java with lambdas: looks like object-oriented programming is finally dying ;-)

--
Russel.
===========================================================================
Dr Russel Winder      t: +44 20 7585 2200   voip: sip:russel.winder ekiga.net
41 Buckmaster Road    m: +44 7770 465 077   xmpp: russel russel.org.uk
London SW11 1EN, UK   w: www.russel.org.uk  skype: russel_winder
Nov 30 2011
prev sibling next sibling parent reply Jesse Phillips <jessekphillips+d gmail.com> writes:
On Tue, 29 Nov 2011 20:34:51 -0500, bearophile wrote:

 A recently written report from a firm that has switched back from Scala
 to Java:
 
 https://raw.github.com/
gist/1406238/72ade1a89004a9a7d705b00cfd14b90b2b6a26bd/gistfile1.txt

Interesting. The main key points I see coming from this:

* The community wasn't helpful/consistent
* Implementation problems
* Being built on top of the JVM

I think D has a good community behind it, but I suppose only the new guys can really give a usable opinion. There are definitely implementation problems in D, and some design issues. But as you mention, this gap is closing, and at an increasing rate. D has done well with being independent but inter-operable with C.
Nov 29 2011
parent reply Jason House <jason.james.house gmail.com> writes:
Jesse Phillips Wrote:

 On Tue, 29 Nov 2011 20:34:51 -0500, bearophile wrote:
 
 A recently written report from a firm that has switched back from Scala
 to Java:
 
 https://raw.github.com/
gist/1406238/72ade1a89004a9a7d705b00cfd14b90b2b6a26bd/gistfile1.txt

 Interesting. The main key points I see coming from this:

 * The community wasn't helpful/consistent
 * Implementation problems
 * Being built on top of the JVM
I looked at the issues much differently. I read it hours ago, but here are what I remember as big issues:

* immature / incomplete libraries
* immature toolchain
* poor performance with common patterns
* not widely adopted / hard to teach

I think D shares all of these problems to differing degrees.
Nov 29 2011
next sibling parent reply Steve Teale <steve.teale britseyeview.com> writes:
On Tue, 29 Nov 2011 23:22:16 -0500, Jason House wrote:

 Jesse Phillips Wrote:
 
 On Tue, 29 Nov 2011 20:34:51 -0500, bearophile wrote:
 
 A recently written report from a firm that has switched back from
 Scala to Java:
 
 https://raw.github.com/
gist/1406238/72ade1a89004a9a7d705b00cfd14b90b2b6a26bd/gistfile1.txt
I looked at the issues much differently. I read it hours ago, but here are what I remember as big issues:

* immature / incomplete libraries
* immature toolchain
* poor performance with common patterns
* not widely adopted / hard to teach

I think D shares all of these problems to differing degrees.
There's a good deal of activity on the D library front, but the toolchain situation seems to be stuck. The big deals as I see them, for ages, have been the flaky debugging and inability to generate shared libraries in Linux, and the COFF/OMF divide in Windows. Steve
Nov 29 2011
next sibling parent reply Gour <gour atmarama.net> writes:
On Wed, 30 Nov 2011 06:28:59 +0000 (UTC)
Steve Teale <steve.teale britseyeview.com> wrote:

 There's a good deal of activity on the D library front
Indeed... things are improving, but it is a fact that none of the bindings for GUI libs are actively developed to support the last stable upstream version.

There was (and is) a lot of talk about how e.g. Haskell is a beautiful language, how it would take over etc., but can someone name a few desktop apps written in it? (I know about darcs & xmonad)

So, the design of the language is one issue, and here D really excels, but having a developed ecosystem is another thing, and it would be nice if the D community organized itself (more) to fill some holes instead of having lots of individual efforts on half-baked projects. Cleaning, as someone wrote, the 'cemetery' of dsource.org would be helpful, to make clear what is still alive. Putting stuff on github with the mantra 'fork it' is not enough...

Sincerely,
Gour

--
In this endeavor there is no loss or diminution,
and a little advancement on this path can protect
one from the most dangerous type of fear.

http://atmarama.net | Hlapicina (Croatia) | GPG: 52B5C810
Nov 29 2011
parent Paulo Pinto <pjmlp progtools.org> writes:
Am 30.11.2011 07:52, schrieb Gour:
 On Wed, 30 Nov 2011 06:28:59 +0000 (UTC)
 Steve Teale<steve.teale britseyeview.com>  wrote:

 There's a good deal of activity on the D library front
Indeed...things are improving, but it is a fact that none of the bindings for GUI libs are actively developed supporting last stable upstream version. There (was)is lot of talk how e.g. Haskell is beautiful language, how it would take over etc., but can someone name few desktop apps written in it? (I know about darcs& xmonad)
Not sure. But at least Microsoft and Intel do support the language's development; Intel even had some open positions in their HPC lab some months ago.

Maybe some of the companies below also have internal desktop applications in Haskell, although I would assume most would code hybrid solutions:

http://www.haskell.org/haskellwiki/Haskell_in_industry

Anyway, I can imagine that even in the dry corporate world I move in, I would collect more support for Haskell than for D, just because of the companies involved with the language, even if there isn't a proper desktop application I can show.

--
Paulo
Nov 29 2011
prev sibling parent reply Jacob Carlborg <doob me.com> writes:
On 2011-11-30 07:28, Steve Teale wrote:
 On Tue, 29 Nov 2011 23:22:16 -0500, Jason House wrote:

 Jesse Phillips Wrote:

 On Tue, 29 Nov 2011 20:34:51 -0500, bearophile wrote:

 A recently written report from a firm that has switched back from
 Scala to Java:

 https://raw.github.com/
gist/1406238/72ade1a89004a9a7d705b00cfd14b90b2b6a26bd/gistfile1.txt
I looked at the issues much differently. I read it hours ago, but here are what I remember as big issues: * immature / incomplete libraries * immature toolchain * poor performance with common patterns * not widely adopted / hard to teach I think D shares all of these problems to differing degrees.
There's a good deal of activity on the D library front, but the toolchain situation seems to be stuck. The big deals as I see them, for ages, have been the flaky debugging and inability to generate shared libraries in Linux, and the COFF/OMF divide in Windows. Steve
I agree. The shared library problem is blocked by DMD not being able to correctly generate PIC. Lately it seems that mostly new bugs get attention and the old ones are almost forgotten.

--
/Jacob Carlborg
Nov 29 2011
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 11/29/2011 11:46 PM, Jacob Carlborg wrote:
 I agree. The shared library problem is blocked by DMD not being able to
 correctly generate PIC.
The compiler does correctly generate PIC code on Linux. The problem is nobody has figured out the details of making Phobos/Druntime a shared library. I.e. there's more to creating a shared library than throwing -fPIC.
Nov 30 2011
next sibling parent reply Jacob Carlborg <doob me.com> writes:
On 2011-11-30 09:19, Walter Bright wrote:
 On 11/29/2011 11:46 PM, Jacob Carlborg wrote:
 I agree. The shared library problem is blocked by DMD not being able to
 correctly generate PIC.
The compiler does correctly generate PIC code on Linux.
So you're saying this issue has already been fixed: http://d.puremagic.com/issues/show_bug.cgi?id=4583 ?
 The problem is nobody has figured out the details of making Phobos/Druntime a
shared
 library.
I'm trying to. I've started to read about TLS to try to fix that on Mac OS X for dynamic libraries.
 I.e. there's more to creating a shared library than throwing -fPIC.
Yeah, I know that. I've already implemented dynamic libraries for D1 with Tango on Mac OS X. Trying to do the same thing now for D2. It's just a bit problematic if the compiler doesn't generate the correct code. Note that it's been a while since I tried to implement dynamic libraries on Linux so I don't know if the above mentioned issue is fixed or not, it's still open though. -- /Jacob Carlborg
Nov 30 2011
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 11/30/2011 4:18 AM, Jacob Carlborg wrote:
 On 2011-11-30 09:19, Walter Bright wrote:
 On 11/29/2011 11:46 PM, Jacob Carlborg wrote:
 I agree. The shared library problem is blocked by DMD not being able to
 correctly generate PIC.
The compiler does correctly generate PIC code on Linux.
So you're saying this issue has already been fixed: http://d.puremagic.com/issues/show_bug.cgi?id=4583 ?
I still need to investigate that one.
Nov 30 2011
parent Jacob Carlborg <doob me.com> writes:
On 2011-11-30 19:40, Walter Bright wrote:
 On 11/30/2011 4:18 AM, Jacob Carlborg wrote:
 On 2011-11-30 09:19, Walter Bright wrote:
 On 11/29/2011 11:46 PM, Jacob Carlborg wrote:
 I agree. The shared library problem is blocked by DMD not being able to
 correctly generate PIC.
The compiler does correctly generate PIC code on Linux.
So you're saying this issue has already been fixed: http://d.puremagic.com/issues/show_bug.cgi?id=4583 ?
I still need to investigate that one.
Ok, thanks. -- /Jacob Carlborg
Nov 30 2011
prev sibling parent reply "Martin Nowak" <dawg dawgfoto.de> writes:
On Wed, 30 Nov 2011 09:19:59 +0100, Walter Bright  
<newshound2 digitalmars.com> wrote:

 On 11/29/2011 11:46 PM, Jacob Carlborg wrote:
 I agree. The shared library problem is blocked by DMD not being able to
 correctly generate PIC.
The compiler does correctly generate PIC code on Linux. The problem is nobody has figured out the details of making Phobos/Druntime a shared library. I.e. there's more to creating a shared library than throwing -fPIC.
I just had a look at creating shared libraries yesterday. The dynamic loader was complaining about some unsupported relocations (R_X86_64_32 with -fPIC). What are the known remaining issues?

I was also wondering whether using the gc_proxy is the best decision for Windows. On POSIX the GC symbols can be left undefined in the shared library and are resolved by the dynamic loader. The module infos could be added to a global list during _init and removed during _fini. This approach works for runtime loading as well as for linking to shared libs.

martin
Nov 30 2011
parent reply Jacob Carlborg <doob me.com> writes:
On 2011-11-30 16:36, Martin Nowak wrote:
 On Wed, 30 Nov 2011 09:19:59 +0100, Walter Bright
 <newshound2 digitalmars.com> wrote:

 On 11/29/2011 11:46 PM, Jacob Carlborg wrote:
 I agree. The shared library problem is blocked by DMD not being able to
 correctly generate PIC.
The compiler does correctly generate PIC code on Linux. The problem is nobody has figured out the details of making Phobos/Druntime a shared library. I.e. there's more to creating a shared library than throwing -fPIC.
I just had a look at creating shared libraries yesterday. The dynamic loader was complaining about some unsupported relocations (R_X86_64_32 with fPIC). What are the known remaining issues? I was also wondering whether using the gc_proxy is the best decision for Windows. On POSIX the GC symbols can be left undefined in the shared library and are resolved by the dynamic loader. The module infos could be added to a global list during _init and removed during _fini. This approach works for runtime loading as well as for linking to shared libs. martin
We also need to make the runtime aware of module infos from libraries (and some other data) when they're dynamically loaded, i.e. via dlopen. This is quite easy on Mac OS X using the "_dyld_register_func_for_add_image" function; I have no idea how to do the same on Linux or FreeBSD.

I'm pretty sure the compiler needs to generate different code for TLS when the variable to access is in a dynamic library. There are a couple of different models available for TLS; I don't fully understand all the differences.

http://developer.apple.com/library/mac/#documentation/Darwin/Reference/ManPages/man3/dyld.3.html
http://www.akkadia.org/drepper/tls.pdf

--
/Jacob Carlborg
Nov 30 2011
parent Walter Bright <newshound2 digitalmars.com> writes:
On 11/30/2011 9:36 AM, Jacob Carlborg wrote:
 I'm pretty sure the compiler needs to generate different code for TLS when the
 variable to access is in a dynamic library.
That is correct, and you'll see the difference when you use -fPIC.
Nov 30 2011
prev sibling parent Jesse Phillips <jessekphillips+d gmail.com> writes:
On Tue, 29 Nov 2011 23:22:16 -0500, Jason House wrote:

 I looked at the issues much differently. I read it hours ago, but here
 are what I remember as big issues:
 * immature / incomplete libraries 
To me it wasn't the question of "what do we use" but "how do we use" that was giving them challenges. The quote that stands out to me is, "Not being able to rely on a strong community presence meant we had to fend for ourselves in figuring out what "good" Scala was." Which then leads into your "hard to teach" point.
 * immature toolchain 
They did complain about the "official" toolchain, but found something that worked: "using Maven really highlighted the second-class status assigned to it in the Scala ecosystem."
 * poor performance with common patterns 
This is where my mention of being built on the JVM/implementation comes into play, along with the comments about needing to think in the Java language, Java bytecode, and Scala. D does have performance problems due to implementation.
 * not widely adopted / hard to teach
This did come up, but I attributed it to the community. When you have a best practice of "ignore the community entirely", I feel that signifies a huge issue and doesn't lend itself well to self-teaching. I think D has a lower entry barrier and a community to guide more idiomatic D. There is much to learn about writing D, but at least there are some lines already in the sand.
Nov 29 2011
prev sibling next sibling parent reply Paulo Pinto <pjmlp progtools.org> writes:
Quite an interesting read, but I cannot stop thinking that this is again one of
the typical "blame the tool" things.

In my line of business we only allow employees with a proper university
background to enter the company, and even so, we get developers who make me
keep asking whether they learned anything at all while at university.

Even for them Java is a very complex language. I have already spent quite
some hours explaining programming concepts in a way that made me feel as if I
were explaining programming to children. But then again, I don't have any
control over the hiring process in these companies.


--
Paulo

Am 30.11.2011 02:34, schrieb bearophile:
 A recently written report from a firm that has switched back from Scala to
Java:

 https://raw.github.com/gist/1406238/72ade1a89004a9a7d705b00cfd14b90b2b6a26bd/gistfile1.txt

 Some people say that programmers often show a religious-like attachment to
their preferred languages, but I suspect that often the truth is just that new
languages are not good enough for practical work. Even languages like Scala
that seem very carefully designed by geniuses, a language that is also easily
integrated with Java code, that is one of the most successful and used
languages of the world, risk to be a failure for a good number of people.
Designing a good enough new language is hard, maybe 99% of the newly designed
languages fail, and creating a language that is also usable in daily work is
much harder.

 Regarding D2, I think in the last year it is coming out of a phase of its
development: I no longer find a new compiler bug every time I write 20 lines of
D2 code. It happens still, but it's now an uncommon thing.

 I don't have a wide experience about designing new languages, so it's not easy
to give good suggestions. But now I suggest to keep some focus about removing
important/basic design bugs/faults of D, like the recent removal of covariance
-related array problem. Example: D2 foreach is currently broken in two
different ways. On the other hand there are examples of successful languages
that contain several basic design faults, like PHP and JavaScript. So I don't
know.

 ----------------

  From that text:

 5. Avoid closures. [...] At some point, we stopped seeing lambdas as free and
started seeing them as syntactic sugar on top of anonymous classes and thus
acquired the same distaste for them as we did anonymous classes.<
D2 closures are probably better, they aren't syntactic sugar on top of anonymous classes. On the other hand invisible sources of low performance are a bad thing in a language as D2. This is why I have suggested to add a compiler switch that lists all the closures of a module (or other related ideas about no heap activity tests or enforcement). Bye, bearophile
Nov 29 2011
next sibling parent "Marco Leise" <Marco.Leise gmx.de> writes:
Am 30.11.2011, 08:21 Uhr, schrieb Paulo Pinto <pjmlp progtools.org>:

 Quite an interesting read, but I cannot stop to think than again is one  
 of the typical "blame the tool" thing.

 In my line of business we only allow employees with proper university  
 background to enter the company, and even so, we get developers which I  
 keep asking if they learned anything at all while at the university.

 Even for them Java is a very complex language. I already spent quite  
 some hours explaining programming concepts that made me think if I was
 explaining programming to children. But then again, I don't have any  
 control about the hiring process in these companies.


 --
 Paulo
There are schools that teach programming from the practical point of view, as opposed to the theoretical university background. While you have to set a higher bar in regards to grades there, those people may know Java from example projects. Since such a school is in close contact with actual companies, they know best what a typical programmer needs to learn in 2-3 years to get some work done in a company.

The trade-off is the lack of academic background. They may not have heard much about big-O notation and other CS terms, or how algorithms other than bubble-sort and quick-sort work. But more often than not, an employer just wants someone they can give a workspace with Eclipse or Visual Studio and let them do some small task on an existing code base right away. (my experience)

I also had my time explaining objects to a trainee. But it was a relief for me when she started to understand what a pointer is and how two of them can refer to the same object.
 Am 30.11.2011 02:34, schrieb bearophile:
 A recently written report from a firm that has switched back from Scala  
 to Java:

 https://raw.github.com/gist/1406238/72ade1a89004a9a7d705b00cfd14b90b2b6a26bd/gistfile1.txt

 Some people say that programmers often show a religious-like attachment  
 to their preferred languages, but I suspect that often the truth is  
 just that new languages are not good enough for practical work. Even  
 languages like Scala that seem very carefully designed by geniuses, a  
 language that is also easily integrated with Java code, that is one of  
 the most successful and used languages of the world, risk to be a  
 failure for a good number of people. Designing a good enough new  
 language is hard, maybe 99% of the newly designed languages fail, and  
 creating a language that is also usable in daily work is much harder.

 Regarding D2, I think in the last year it is coming out of a phase of  
 its development: I no longer find a new compiler bug every time I write  
 20 lines of D2 code. It happens still, but it's now an uncommon thing.

 I don't have a wide experience about designing new languages, so it's  
 not easy to give good suggestions. But now I suggest to keep some focus  
 about removing important/basic design bugs/faults of D, like the recent  
 removal of covariance -related array problem. Example: D2 foreach is  
 currently broken in two different ways. On the other hand there are  
 examples of successful languages that contain several basic design  
 faults, like PHP and JavaScript. So I don't know.

 ----------------

  From that text:

 5. Avoid closures. [...] At some point, we stopped seeing lambdas as  
 free and started seeing them as syntactic sugar on top of anonymous  
 classes and thus acquired the same distaste for them as we did  
 anonymous classes.<
D2 closures are probably better; they aren't syntactic sugar on top of anonymous classes. On the other hand, invisible sources of low performance are a bad thing in a language like D2. This is why I have suggested adding a compiler switch that lists all the closures of a module (or other related ideas about no-heap-activity tests or enforcement). Bye, bearophile
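For context on the "syntactic sugar" complaint: in the Java of that era (before Java 8 lambdas), a callback passed to an API was literally an anonymous class instance, i.e. one object allocation per creation, with captured variables stored in synthetic fields. A minimal sketch; the Predicate interface and filter helper here are illustrative, not a specific library API:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class ClosureCost {
    interface Predicate<T> { boolean test(T t); }

    // Keep only the elements accepted by the predicate.
    static <T> List<T> filter(List<T> xs, Predicate<T> p) {
        List<T> out = new ArrayList<T>();
        for (T x : xs) if (p.test(x)) out.add(x);
        return out;
    }

    public static void main(String[] args) {
        final int threshold = 2; // a captured local must be final here
        // The "lambda" is an anonymous class: a heap-allocated object whose
        // synthetic field holds the captured `threshold`.
        Predicate<Integer> aboveThreshold = new Predicate<Integer>() {
            public boolean test(Integer x) { return x > threshold; }
        };
        System.out.println(filter(Arrays.asList(1, 2, 3, 4), aboveThreshold)); // prints [3, 4]
    }
}
```

This per-callback allocation is the cost the Yammer report reacted to; a D2 delegate, by contrast, only needs a heap-allocated environment when it may outlive its enclosing scope, which is exactly the "invisible" allocation bearophile wants the compiler to be able to report.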
Nov 30 2011
prev sibling parent reply "Nick Sabalausky" <a a.a> writes:
"Paulo Pinto" <pjmlp progtools.org> wrote in message 
news:jb4lj2$ulq$1 digitalmars.com...
 In my line of business we only allow employees with a proper university
 background to enter the company, and even so, we get developers whom I
 keep asking whether they learned anything at all while at university.

 Even for them Java is a very complex language. I have already spent quite some
 hours explaining programming concepts in a way that made me wonder if I was
 explaining programming to children. But then again, I don't have any
 control over the hiring process in these companies.
That's just like complaining "I went to Japan and every time I asked for a 'taco', they gave me octopus instead!" Your company is getting *exactly* what they're looking for.

Universities DO NOT create programmers. Period. (*Especially* liberal arts schools.) Most CS courses don't even *try* to create programmers. Hell, half of them are taught by people who can barely code (I have stories...). One could argue that universities either should or shouldn't try to create programmers, but either way: If you're specifically looking for "proper university background" then naturally you're going to have to sort through a lot of academic weenies *if you're lucky*. If you're not particularly lucky, you'll get a bunch of the millions upon millions who went through the revolving academic door either:

A. Merely because society told them they should get a degree, or
B. Because mommy and daddy paid them to go.

You can be sure of one thing: If the majority of a candidate's experience is in Uni, then they don't know what they're doing.

Obviously that's not to say that there aren't good or even great programmers who have been through college/uni/academia/etc. Hell, Walter and Andrei are among the best programmers I know. The point is, you're optimizing for the wrong metric, and that's giving you truckloads of both false positives and false negatives. You're winding up with exactly the "tako" you've asked for.
Nov 30 2011
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 11/30/2011 11:37 PM, Nick Sabalausky wrote:
 Obviously that's not to say that there aren't good or even great programmers
 who have been through college/uni/academia/etc. Hell, Walter and Andrei are
 among the best programmers I know.
I don't know about Andrei, but my degree is in Mechanical Engineering. I don't have any formal training in programming. On the other hand, I was fortunate to have friends at Caltech who were very good programmers, and were kind enough to show me the ropes.
Dec 01 2011
parent Steve Teale <steve.teale britseyeview.com> writes:
 
 I don't know about Andrei, but my degree is in Mechanical Engineering. I
 don't have any formal training in programming.
 
 On the other hand, I was fortunate to have friends at Caltech who were
 very good programmers, and were kind enough to show me the ropes.
I was once a chemist - but they always change into something else. I think it's immaterial. Computer programming is one of those subjects where you have to keep learning new skills throughout your career, probably at a rate in excess of the learning rate at a university. Then, to actually do it, you also have to learn a lot of other stuff about application domains.

I suspect you'd do just as well if you recruited programmers on the basis of some simple intelligence test and a corresponding test for pragmatism. Then you give them a book to read and a computer to experiment on, and you don't treat them like a fool when they come and ask foolish questions - just be supportive.
Dec 02 2011
prev sibling next sibling parent reply Jacob Carlborg <doob me.com> writes:
On 2011-11-30 02:34, bearophile wrote:
 A recently written report from a firm that has switched back from Scala to
Java:

 https://raw.github.com/gist/1406238/72ade1a89004a9a7d705b00cfd14b90b2b6a26bd/gistfile1.txt

 Some people say that programmers often show a religious-like attachment to
their preferred languages, but I suspect that often the truth is just that new
languages are not good enough for practical work. Even languages like Scala
that seem very carefully designed by geniuses, a language that is also easily
integrated with Java code, that is one of the most successful and used
languages of the world, risk to be a failure for a good number of people.
Designing a good enough new language is hard, maybe 99% of the newly designed
languages fail, and creating a language that is also usable in daily work is
much harder.

 Regarding D2, I think in the last year it is coming out of a phase of its
development: I no longer find a new compiler bug every time I write 20 lines of
D2 code. It happens still, but it's now an uncommon thing.

 I don't have a wide experience about designing new languages, so it's not easy
to give good suggestions. But now I suggest to keep some focus about removing
important/basic design bugs/faults of D, like the recent removal of covariance
-related array problem. Example: D2 foreach is currently broken in two
different ways. On the other hand there are examples of successful languages
that contain several basic design faults, like PHP and JavaScript. So I don't
know.

 ----------------

  From that text:

 5. Avoid closures. [...] At some point, we stopped seeing lambdas as free and
started seeing them as syntactic sugar on top of anonymous classes and thus
acquired the same distaste for them as we did anonymous classes.<
D2 closures are probably better; they aren't syntactic sugar on top of anonymous classes. On the other hand, invisible sources of low performance are a bad thing in a language like D2. This is why I have suggested adding a compiler switch that lists all the closures of a module (or other related ideas about no-heap-activity tests or enforcement). Bye, bearophile
Seems they're complaining about the libraries and the tool chain. I don't understand the problem; just use the Java libraries. As for the language, shouldn't it be possible to just use the parts of Scala that also exist in Java, and then pick a few Scala features here and there that make things easier?

-- 
/Jacob Carlborg
Nov 29 2011
parent reply Somedude <lovelydear mailmetrash.com> writes:
Le 30/11/2011 08:45, Jacob Carlborg a écrit :
 
 Seems they're complaining about the libraries and the tool chain. I don't
 understand the problem; just use the Java libraries. As for the language,
 shouldn't it be possible to just use the parts of Scala that also exist
 in Java, and then pick a few Scala features here and there that make things
 easier?
 
Your argument is like saying D users who complain that the D libraries are not mature enough should stick to C libraries.
Dec 02 2011
next sibling parent "Nick Sabalausky" <a a.a> writes:
"Somedude" <lovelydear mailmetrash.com> wrote in message 
news:jbbb2b$1l6q$1 digitalmars.com...
 Le 30/11/2011 08:45, Jacob Carlborg a écrit :
 Seems they're complaining about the libraries and the tool chain. I don't
 understand the problem; just use the Java libraries. As for the language,
 shouldn't it be possible to just use the parts of Scala that also exist
 in Java, and then pick a few Scala features here and there that make things
 easier?
Your argument is like saying D users who complain that the D libraries are not mature enough should stick to C libraries.
Personally, I'd rather use C libs in D than use them in C. But I do get your point ;)
Dec 02 2011
prev sibling parent Jacob Carlborg <doob me.com> writes:
On 2011-12-02 21:04, Somedude wrote:
 Le 30/11/2011 08:45, Jacob Carlborg a écrit :
 Seems they're complaining about the libraries and the tool chain. I don't
 understand the problem; just use the Java libraries. As for the language,
 shouldn't it be possible to just use the parts of Scala that also exist
 in Java, and then pick a few Scala features here and there that make things
 easier?
Your argument is like saying D users who complain that the D libraries are not mature enough should stick to C libraries.
In some cases that's true. -- /Jacob Carlborg
Dec 04 2011
prev sibling next sibling parent Dejan Lekic <dejan.lekic gmail.com> writes:
There is also this article about a similar case:
http://www.infoq.com/news/2011/11/yammer-scala

As a Java programmer I can only say one thing - I hate Java's shortcomings, 
but the simplicity pays off.

Second, this is just one case of a Scala -> Java transition. I bet the number 
of "Java -> Scala" companies is way larger than "Scala -> Java". :)
Nov 30 2011
prev sibling next sibling parent "Jonathan M Davis" <jmdavisProg gmx.com> writes:
On Sunday, December 18, 2011 06:17:22 Russel Winder wrote:
 The problem here is that educators forgot the importance of learning
 multiple languages and especially multiple paradigms. Java was used for
 all teaching and students suffered. If they had used Java and Haskell
 and Prolog things would be much better.
In my experience, it's fairly common for there to be _one_ required class which is intended to teach about other paradigms - primarily functional languages - so I think that it's fairly typical for students to be exposed to such languages. But given how foreign they are, I think that the typical reaction is that the students don't want to touch such languages again unless they have to. C (and maybe C++) stand a good chance of being used for classes like those on networking and operating systems, so a fair number of students will have some exposure to those, but I think that their typical reaction is to dislike those languages and avoid them unless they have to use them (though I think that they're more likely to use them than a functional language). Naturally, every student is unique, but most of them seem to prefer what they know best - and that's Java.

For most classes though, the focus is on the concepts, not the language, which on the whole is exactly where it should be. Any halfway decent programmer should be able to learn a new language, and the concepts of computer science apply to all of them. So, on the whole, that approach is a solid one IMHO. The problem is that it does lead to programmers who are versed primarily in one language rather than being familiar with several, unless they take the initiative and learn them on their own.

- Jonathan M Davis
Dec 17 2011
prev sibling parent reply Isaac Gouy <igouy2 yahoo.com> writes:
 From: Russel Winder
 Subject: Re: Java > Scala
 Newsgroups: gmane.comp.lang.d.general
 Sat, 17 Dec 2011 22:18:26 -0800
 I really rather object to being labelled an educated idiot.
...
 If you want to look at even more biased benchmarking look at
 http://shootout.alioth.debian.org/ it is fundamentally designed to show
 that C is the one true language for writing performance computation.
I rather object to the baseless accusation that the benchmarks game is "designed to show that C is the one true language for writing performance computation." Your accusation is false. Your accusation is ignorant (literally).
Dec 18 2011
next sibling parent reply Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
On 12/18/11 1:08 PM, Isaac Gouy wrote:
 From: Russel Winder
 Subject: Re: Java>  Scala
 Newsgroups: gmane.comp.lang.d.general
 Sat, 17 Dec 2011 22:18:26 -0800
 I really rather object to being labelled an educated idiot.
....
 If you want to look at even more biased benchmarking look at
 http://shootout.alioth.debian.org/ it is fundamentally designed to show
 that C is the one true language for writing performance computation.
I rather object to the baseless accusation that the benchmarks game is "designed to show that C is the one true language for writing performance computation." Your accusation is false. Your accusation is ignorant (literally).
It also strikes me as something rather random to say. Far as I can tell the shootout comes with plenty of warnings and qualifications and uses a variety of tests that don't seem chosen to favor C or generally systems programming languages. But I'm sure Russel had something in mind. Russel, would you want to expand a bit? Thanks, Andrei
Dec 18 2011
next sibling parent reply Somedude <lovelydear mailmetrash.com> writes:
Le 19/12/2011 00:08, Andrei Alexandrescu a écrit :
 On 12/18/11 1:08 PM, Isaac Gouy wrote:
 From: Russel Winder
 Subject: Re: Java>  Scala
 Newsgroups: gmane.comp.lang.d.general
 Sat, 17 Dec 2011 22:18:26 -0800
 I really rather object to being labelled an educated idiot.
....
 If you want to look at even more biased benchmarking look at
 http://shootout.alioth.debian.org/ it is fundamentally designed to show
 that C is the one true language for writing performance computation.
I rather object to the baseless accusation that the benchmarks game is "designed to show that C is the one true language for writing performance computation." Your accusation is false. Your accusation is ignorant (literally).
It also strikes me as something rather random to say. Far as I can tell the shootout comes with plenty of warnings and qualifications and uses a variety of tests that don't seem chosen to favor C or generally systems programming languages. But I'm sure Russel had something in mind. Russel, would you want to expand a bit? Thanks, Andrei
Not only is it random and baseless; my own personal experience is that the shootout actually gives a fairly accurate picture of what one can expect in the areas of speed and memory usage. Less so on the side of code size, though, because the programs are still too small to take advantage of some language features designed for large-scale programs.

And I still pray to see D back in the shootout.
Dec 18 2011
parent Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
On 12/18/11 5:40 PM, Somedude wrote:
 And I still pray to see D back in the shootout.
Praying might help. Working on it may actually be more effective :o). Andrei
Dec 18 2011
prev sibling next sibling parent reply Russel Winder <russel russel.org.uk> writes:
On Sun, 2011-12-18 at 17:08 -0600, Andrei Alexandrescu wrote:
 On 12/18/11 1:08 PM, Isaac Gouy wrote:
 From: Russel Winder
 Subject: Re: Java>  Scala
 Newsgroups: gmane.comp.lang.d.general
 Sat, 17 Dec 2011 22:18:26 -0800
 I really rather object to being labelled an educated idiot.
....
 If you want to look at even more biased benchmarking look at
 http://shootout.alioth.debian.org/ it is fundamentally designed to show
 that C is the one true language for writing performance computation.
I rather object to the baseless accusation that the benchmarks game is "designed to show that C is the one true language for writing performance computation."

Overstated perhaps, baseless, no. But this is a complex issue.
 Your accusation is false.

 Your accusation is ignorant (literally).
The recent thread between Caligo, myself and others on this list should surely have displayed the futility of arguing in this form.
 It also strikes me as something rather random to say. Far as I can tell
 the shootout comes with plenty of warnings and qualifications and uses a
 variety of tests that don't seem chosen to favor C or generally systems
 programming languages.
The Shootout infrastructure and overall management is great. Isaac has done a splendid job there. The data serves a purpose for people who read between the lines and interpret the results with intelligence. The opening page does indeed set out that you have to be very careful with the data to avoid comparing apples and oranges. The data is presented in good faith.

The system as set out is biased though, systematically so. This is not a problem per se, since all the micro-benchmarks are about computationally intensive activity. Native code versions are therefore always going to appear better. But then this is fine; the Shootout is about computationally intensive comparison. Actually I am surprised that Java does so well in this comparison, given its start-up time issues.

Part of the "problem" I alluded to was people using the numbers without thinking. No amount of words on pages affects these people; they take the numbers as is and make decisions based solely on them. C, C++ and Fortran win on most of them and so become the only choice of language. (OK, so Haskell wins on the quad-core thread-ring, which I find very interesting.)

As I understand it, Isaac runs this basically single-handed, relying on folk providing versions of the code. This means there is a highly restricted resource issue in managing the Shootout. Hence a definite set of problems and a restricted set of languages to make management feasible. This leads to interesting situations such as D not being part of the set while Clean and Mozart/Oz are. But then Isaac is the final arbiter here, as it is his project, and what he says goes.

I looked at the Java code and the Groovy code a couple of years back (I haven't re-checked the Java code recently), and it was more or less a transliteration of the C code. This meant that the programming languages were not being shown off at their best. I started a project with the Groovy community to provide reasonable versions of the Groovy code and was getting some take-up.
Groovy was always going to be with Python and Ruby and nowhere near C, C++, Fortran, or Java, but the results being displayed at the time were orders of magnitude slower than Groovy could be, as shown by the Java results. The most obvious problem was that the original Groovy code was written so as to avoid any parallelism at all.

Of course Groovy (like Python) would never be used directly for this sort of computation; a mixed Groovy/Java or Python/C (or Python/C++, Python/Fortran) approach would be, with the "tight loop" being coded in the static language and the rest in the dynamic language. Isaac said though that this was not permitted, that only pure single-language versions were allowed. Entirely reasonable in one sense, unfair in another: fair because it is about language performance in the abstract, unfair because it compares languages out of their real-world use context.

(It is worth noting that Python is represented by CPython, and I suspect PyPy would be a lot faster for these micro-benchmarks. But only when PyPy is Python 3 compliant, since Python 3 and not Python 2 is the representative in the Shootout. A comparison here is between using Erlang and Erlang HiPE.)

In the event, Isaac took Groovy out of the Shootout, so the Groovy rewrite effort was disbanded. I know Isaac says run your own site, but that rather misses the point, and leads directly to the sort of hassles Walter had when providing a benchmark site. There is no point in a language development team running a benchmark. The issue is perceived, if not real, bias in the numbers. Benchmarks have to be run by an independent party, even if the contributions are from language development teams.
 But I'm sure Russel had something in mind. Russel, would you want to
 expand a bit?

Hopefully the above does what you ask. The summary is that Isaac is running this in good faith, but there are systematic biases in the whole thing, which is entirely fine as long as you appreciate that.

-- 
Russel.
Dr Russel Winder          t: +44 20 7585 2200   voip: sip:russel.winder ekiga.net
41 Buckmaster Road        m: +44 7770 465 077   xmpp: russel russel.org.uk
London SW11 1EN, UK       w: www.russel.org.uk  skype: russel_winder
Dec 19 2011
parent Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
On 12/20/11 1:29 AM, Russel Winder wrote:
 The system as set out is biased though, systematically so.  This is not
 a problem per se since all the micro-benchmarks are about
 computationally intensive activity.  Native code versions are therefore
 always going to appear better.  But then this is fine the Shootout is
 about computationally intensive comparison.
This is fine, so no bias so far. It's a speed benchmark, so it's supposed to measure speed. It says as much. If native code comes usually in top places, the word is "expected", not "biased".
 Actually I am surprised
 that Java does so well in this comparison due to its start-up time
 issues.
I suppose this is because the run time of the tests is long enough to bury VM startup time. Alternatively, the benchmark may only measure the effective execution time.
 Part of the "problem" I alluded to was people using the numbers without
 thinking.  No amount of words on pages affect these people, they take
 the numbers as is and make decisions based solely on them.
Well, how is that a bias of the benchmark?
 C, C++ and
 Fortran win on most of them and so are the only choice of language.
The benchmark measures speed. If one is looking for speed wouldn't the choice of language be in keeping with these results? I'd be much more suspicious of the quality and/or good will of the benchmark if other languages would frequently come to the top.
 As I understand it, Isaac ruins this basically single handed, relying of
 folk providing versions of the code.  This means there is a highly
 restricted resource issue in managing the Shootout.  Hence a definite
 set of problems and a restricted set of languages to make management
 feasible.  This leads to interesting situation such as D is not part of
 the set but Clean and Mozart/Oz are.  But then Isaac is the final
 arbiter here, as it is his project, and what he says goes.
If I recall things correctly, Isaac dropped the D code because it was 32-bit only, which was too much trouble for his setup. Now we have good 64 bit generation, so it may be a good time to redo D implementations of the benchmarks and submit it again to Isaac for inclusion in the shootout. Quite frankly, however, your remark (which I must agree, for all respect I hold for you, is baseless) is a PR faux pas - and unfortunately not the only one of our community. I'd find it difficult to go now and say, "by the way, Isaac, we're that community that insulted you on a couple of occasions. Now that we got to talk again, how about putting D back in the shootout?"
 I looked at the Java code and the Groovy code a couple of years back (I
 haven't re-checked the Java code recently), and it was more or less a
 transliteration of the C code.
That is contributed code. In order to demonstrate bias you'd need to show that faster code was submitted and refused.
 This meant that the programming
 languages were not being shown off at their best.  I started a project
 with the Groovy community to provide reasonable version of Groovy codes
 and was getting some take up.  Groovy was always going to be with Python
 and Ruby and nowhere near C, C++, and Fortran, or Java, but the results
 being displayed at the time were orders of magnitude slower than Groovy
 could be, as shown by the Java results.  The most obvious problem was
 that the original Groovy code was written so as to avoid any parallelism
 at all.
Who wrote the code? Is the owner of the shootout site responsible for those poor results?
 Of course Groovy (like Python) would never be used directly for this
 sort of computation, a mixed Groovy/Java or Python/C (or Python/C++,
 Python/Fortran) would be -- the "tight loop" being coded in the static
 language, the rest in the dynamic language.   Isaac said though that
 this was not permitted, that only pure single language versions were
 allowed.  Entirely reasonable in one sense, unfair in another: fair
 because it is about language performance in the abstract, unfair because
 it is comparing languages out of real world use context.
I'd find it a stretch to label that as unfair, for multiple reasons. The shootout measures speed of programming languages, not speed of systems languages wrapped in shells of other languages. The simpler reason is that it's the decision of the site owner to choose the rules. I happen to find them reasonable, but I get your point too (particularly if the optimized routines are part of the language's standard library).
 (It is worth noting that the Python is represented by CPython, and I
 suspect PyPy would be a lot faster for these micro-benchmarks.  But only
 when PyPy is Python 3 compliant since Python 3 and not Python 2 is the
 representative in the Shootout.  A comparison here is between using
 Erlang and Erlang HiPE.)

 In the event, Isaac took Groovy out of the Shootout, so the Groovy
 rewrite effort was disbanded.  I know Isaac says run your own site, but
 that rather misses the point, and leads directly to the sort of hassles
 Walter had when providing a benchmark site.
That actually hits the point so hard, the point is blown into so little pieces, you'd think it wasn't there in the first place. It's a website. If it doesn't do what you want, at the worst case that would be "a bummer". But it's not "unfair" as the whole notion of fairness is inappropriate here. Asking for anything including fairness _does_ miss the point.
 There is no point in a
 language development team running a benchmark.  The issues are
 perceived, if not real, bias in the numbers.  Benchmarks have to be run
 by an independent even if the contributions are from language
 development teams.

 But I'm sure Russel had something in mind. Russel, would you want to
 expand a bit?
Hopefully the above does what you ask. The summary is that Isaac is running this in good faith, but there are systematic biases in the whole thing, which is entirely fine as long as you appreciate that.
Well, to me your elaboration seems like one of those delicious monologues Ricky Gervais gets into in the show "Extras". He makes some remark, figures it's a faux pas, and then tries to mend it but instead it all gets worse and worse. Andrei
Dec 20 2011
prev sibling next sibling parent Isaac Gouy <igouy2 yahoo.com> writes:
 From: Russel Winder <russel russel.org.uk>
 Sent: Monday, December 19, 2011 11:29 PM

  If you want to look at even more biased benchmarking look at
  http://shootout.alioth.debian.org/ it is fundamentally designed to
  show that C is the one true language for writing performance
  computation.

 Overstated perhaps, baseless, no. But this is a complex issue.

False and baseless, and a simple issue.

Your words are clear - "... designed to show ...".

Your false accusation is about purpose and intention - you should take back that accusation.
Dec 20 2011
prev sibling next sibling parent Isaac Gouy <igouy2 yahoo.com> writes:
 From: Russel Winder <russel russel.org.uk>
 Sent: Monday, December 19, 2011 11:29 PM

As for your other comments:

 The opening page does indeed set out that you have to be very careful with
 the data to avoid comparing apples and oranges.

No, the opening page says - "A comparison between programs written in such different languages *is* a comparison between apples and oranges..."

 Actually I am surprised that Java does so well in this comparison due
 to its start-up time issues.

Perhaps the start-up time issues are less than you suppose.

The Help page shows 4 different measurement approaches for the Java program, and for these tiny tiny programs, with these workloads, the "excluding start-up" "Warmed" times really aren't much different from the usual times that include all the start-up costs -

http://shootout.alioth.debian.org/help.php#java

 Part of the "problem" I alluded to was people using the numbers
 without thinking.

Do you include yourself among that group of people?

 I started a project with the Groovy community to provide reasonable version
 of Groovy codes and was getting some take up.

You took on the task in the first week of March 2009

   http://groovy.329449.n5.nabble.com/the-benchmarks-game-Groovy-programs-td366268.html#a366290

and iirc 6 months later not a single program had been contributed!

   http://groovy.329449.n5.nabble.com/Alioth-Shootout-td368794.html

 In the event, Isaac took Groovy out of the Shootout, so the Groovy
 rewrite effort was disbanded.

Your "Groovy rewrite effort" didn't contribute a single program in 6 months!

 There is no point in a language development team running a benchmark.

Tell that to the PyPy developers http://speed.pypy.org/

Tell that to Mike Pall http://luajit.org/performance_x86.html

Tell that to the Go developers
Dec 20 2011
prev sibling next sibling parent Russel Winder <russel russel.org.uk> writes:
On Tue, 2011-12-20 at 23:13 -0800, Isaac Gouy wrote:
 From: Russel Winder <russel russel.org.uk>
[...]
 Actually I am surprised that Java does so well in this comparison due
 to its start-up time issues.

 Perhaps the start-up time issues are less than you suppose.
Very possibly the case; I have only switched to Java 7 recently and haven't had time to assess start-up or JIT kick-in times. Great strides in start-up time have been made with each release of Java, at least on Linux, using mmap and preloaded runtime infrastructure.
 The Help page shows 4 different measurement approaches for the Java
 program, and for these tiny tiny programs, with these workloads, the
 "excluding start-up" "Warmed" times really aren't much different from
 the usual times that include all the start-up costs -
 http://shootout.alioth.debian.org/help.php#java
My experience is similar, that JIT warmup is only a small effect when using int and yet quite dramatic when using long. This is to be expected though because a long is not an atomic type in the JVM but is two ints. [...]
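The "two ints" remark corresponds to JLS §17.7: a write to a non-volatile long (or double) may legally be split into two separate 32-bit writes, so on a 32-bit JVM a racing reader can observe a torn value. Declaring the field volatile, or using AtomicLong, restores atomicity. A minimal sketch of the safe options (class and field names are my own, purely illustrative):

```java
import java.util.concurrent.atomic.AtomicLong;

public class LongWrites {
    long plain;          // JLS 17.7: writes to this MAY be split into two 32-bit halves
    volatile long safe;  // volatile longs are always read and written atomically

    public static void main(String[] args) throws InterruptedException {
        final AtomicLong counter = new AtomicLong(); // atomic read-modify-write, no tearing
        Thread[] workers = new Thread[4];
        for (int i = 0; i < workers.length; i++) {
            workers[i] = new Thread(new Runnable() {
                public void run() {
                    for (int j = 0; j < 10000; j++) counter.incrementAndGet();
                }
            });
            workers[i].start();
        }
        for (Thread t : workers) t.join();
        System.out.println(counter.get()); // prints 40000: no lost or torn updates
    }
}
```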
 Your "Groovy rewrite effort" didn't contribute a single program in 6 months!

In the interim Groovy had been ejected, so there was no point. No real need, and probably highly inappropriate, to rehearse all the arguments here. It's water under the bridge. There is no enthusiasm for getting back into the shootout, since Groovy is not a language for computationally intensive code; that is the realm of D, C, C++, Fortran, and sometimes Haskell. If we want to progress this point we should do so on the Groovy lists, or privately, rather than here.
 There is no point in a language development team running a benchmark.

 Tell that to the PyPy developers http://speed.pypy.org/
That is only a comparison of PyPy against CPython using the CPython benchmarks. Those are internal Python benchmarks. So yes very reasonable.
 Tell that to Mike Pall http://luajit.org/performance_x86.html
An internal comparison of Lua engines. Entirely reasonable.
 Tell that to the Go developers
Do you have a particular URL in mind?

My point though was that (and this is where Walter received so much flak) where a language vendor tries to do comparisons with other languages there are always claims of bias, whether true or not. In this sense the Alioth Shootout has the benefit of being clearly independent of any language vendor. The examples above are not at all inconsistent with my point.

--
Russel.
===========================================================================
Dr Russel Winder      t: +44 20 7585 2200   voip: sip:russel.winder ekiga.net
41 Buckmaster Road    m: +44 7770 465 077   xmpp: russel russel.org.uk
London SW11 1EN, UK   w: www.russel.org.uk  skype: russel_winder
Dec 21 2011
prev sibling next sibling parent Isaac Gouy <igouy2 yahoo.com> writes:
 From: Russel Winder <russel russel.org.uk>
 Sent: Wednesday, December 21, 2011 1:43 AM
 Subject: Re: Java > Scala
 Your "Groovy rewrite effort" didn't contribute a single program in 6 months ! In the interim Groovy had been ejected so there was no point.
That is not true - Groovy had not been ejected. Your 6 month failure to contribute a single program was a strong reason measurements were not made for Groovy programs on the new hardware following September 2009.
 If we want to progress this point we should do
 so on the Groovy lists, or privately, rather than here.
Don't send me email - I want nothing to do with you.
Dec 21 2011
prev sibling next sibling parent Russel Winder <russel russel.org.uk> writes:
On Tue, 2011-12-20 at 10:31 -0800, Isaac Gouy wrote:
[...]
 Your words are clear - "... designed to show ...".
 Your false accusation is about purpose and intention - you should take back that accusation.
As you have interpreted it, that is true. I take back the accusation you have observed; it is not true.

--
Russel.
Dec 21 2011
prev sibling parent Russel Winder <russel russel.org.uk> writes:
On Wed, 2011-12-21 at 08:05 -0800, Isaac Gouy wrote:
[...]
 That is not true - Groovy had not been ejected.
 Your 6 month failure to contribute a single program was a strong reason measurements were not made for Groovy programs on the new hardware following September 2009.
The lack of interest in the Groovy community in doing the work caused me to lose energy. In the end the removal of Groovy from the Shootout was a shame but not really that big a deal. [...]
 Don't send me email - I want nothing to do with you.
Be that as it may, there *must* be no negative effect on the D, Python, or Groovy communities and their position within the Shootout.

--
Russel.
Dec 21 2011
prev sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 12/18/2011 11:08 AM, Isaac Gouy wrote:
 I rather object to the baseless accusation that the benchmarks game is
 "designed to show that C is the one true language for writing performance
 computation."

 Your accusation is false.

 Your accusation is ignorant (literally).
This is why I quit posting any benchmark results. Someone was always accusing me of bias, sabotage, etc.
Dec 18 2011
parent Isaac Gouy <igouy2 yahoo.com> writes:
 From: Walter Bright <newshound2 digitalmars.com>
 Sent: Sunday, December 18, 2011 10:46 PM
 On 12/18/2011 11:08 AM, Isaac Gouy wrote:
  I rather object to the baseless accusation that the benchmarks game is
  "designed to show that C is the one true language for writing 
performance
  computation."
 
  Your accusation is false.
 
  Your accusation is ignorant (literally).
This is why I quit posting any benchmark results. Someone was always accusing me of bias, sabotage, etc.
My feeling is that this used to happen much more often 4 or 5 years ago; these days a third party has usually jumped in to challenge ignorant comments about the benchmarks game before I even notice. Such ignorant comments are just seen to reflect badly on the person who made them.
Dec 19 2011