
digitalmars.D.announce - Interesting rant about Scala's issues

reply Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
A lot of them could apply to us as well.

https://www.youtube.com/watch?v=TS1lpKBMkgg


Andrei
Apr 02 2014
next sibling parent "bearophile" <bearophileHUGS lycos.com> writes:
Andrei Alexandrescu:

 A lot of them could apply to us as well.

 https://www.youtube.com/watch?v=TS1lpKBMkgg
The slides: http://www.slideshare.net/extempore/keynote-pnw-scala-2013 Bye, bearophile
Apr 02 2014
prev sibling next sibling parent Walter Bright <newshound2 digitalmars.com> writes:
On 4/2/2014 6:55 PM, Andrei Alexandrescu wrote:
 A lot of them could apply to us as well.

 https://www.youtube.com/watch?v=TS1lpKBMkgg
Reminds me of our empty-front-popFront discussion. Trying to support all kinds of variations on that results in unoptimizable code.
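For reference, a minimal sketch of the protocol in question (illustrative only, not the Phobos definitions):

struct Counter
{
    int front;   // current element
    int end;
    bool empty() const { return front >= end; }
    void popFront() { ++front; }
}

void main()
{
    for (auto r = Counter(0, 5); !r.empty; r.popFront())
        assert(r.front < 5);   // visits 0, 1, 2, 3, 4
}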
Apr 02 2014
prev sibling next sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 4/2/2014 6:55 PM, Andrei Alexandrescu wrote:
 A lot of them could apply to us as well.

 https://www.youtube.com/watch?v=TS1lpKBMkgg
at about 44:00: "I begged them not to do them [AST macros]." :-)
Apr 02 2014
parent reply Timon Gehr <timon.gehr gmx.ch> writes:
On 04/03/2014 04:45 AM, Walter Bright wrote:
 On 4/2/2014 6:55 PM, Andrei Alexandrescu wrote:
 A lot of them could apply to us as well.

 https://www.youtube.com/watch?v=TS1lpKBMkgg
at about 44:00: "I begged them not to do them [AST macros]." :-)
(This is a misquote.)
Apr 05 2014
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 4/5/2014 10:10 AM, Timon Gehr wrote:
 On 04/03/2014 04:45 AM, Walter Bright wrote:
 On 4/2/2014 6:55 PM, Andrei Alexandrescu wrote:
 A lot of them could apply to us as well.

 https://www.youtube.com/watch?v=TS1lpKBMkgg
at about 44:00: "I begged them not to do them [AST macros]." :-)
(This is a misquote.)
Yeah, I should have been more accurate. In response to a question about macros & reflection: "I begged them not to, not to just export the compiler to I begged them I begged them not to do it."
Apr 05 2014
next sibling parent "Seth Tisue" <seth tisue.net> writes:
On Saturday, 5 April 2014 at 18:47:50 UTC, Walter Bright wrote:
 In response to a question about macros & reflection:

 "I begged them not to, not to just export the compiler to I 
 begged them I begged them not to do it."
A reboot is in progress on this, too: http://scalareflect.org
Apr 05 2014
prev sibling parent "deadalnix" <deadalnix gmail.com> writes:
On Saturday, 5 April 2014 at 18:47:50 UTC, Walter Bright wrote:
 On 4/5/2014 10:10 AM, Timon Gehr wrote:
 On 04/03/2014 04:45 AM, Walter Bright wrote:
 On 4/2/2014 6:55 PM, Andrei Alexandrescu wrote:
 A lot of them could apply to us as well.

 https://www.youtube.com/watch?v=TS1lpKBMkgg
at about 44:00: "I begged them not to do them [AST macros]." :-)
(This is a misquote.)
Yeah, I should have been more accurate. In response to a question about macros & reflection: "I begged them not to, not to just export the compiler to I begged them I begged them not to do it."
Which is a very different statement.
Apr 09 2014
prev sibling next sibling parent "bearophile" <bearophileHUGS lycos.com> writes:
Andrei Alexandrescu:

 A lot of them could apply to us as well.

 https://www.youtube.com/watch?v=TS1lpKBMkgg
I agree that D language/compiler could improve its integration with versioning systems (slide 31). How to design this? Bye, bearophile
Apr 02 2014
prev sibling next sibling parent reply "Bienlein" <jeti789 web.de> writes:
My knowledge of compiler constructions is fairly limited and I 
might be wrong, but it seems to me that the Scala compiler is 
broken. Scala has gained some bad reputation for long build times 
(just google for Scala and build time) which IMHO cannot be 
explained by the large number of language features. D has a 
comparably large number of language features and compiles much 
faster than Scala.

D has been designed from the beginning with attention to 
compilation speed and with thought about how to keep it fast. 
Not only in that respect is D a language that was thought 
through. Scala, on the contrary, seems to me to be a language 
where many features of various languages were thrown into one 
and then a compiler was built for it. The incremental Scala 
compiler pretty much mitigates the build time problem, though, 
so they mostly get by now. Also, IMHO, implicits are really 
crazy and it should have been clear from the beginning that they 
would become a problem for scalable build times, see 
http://java.dzone.com/articles/implicits-scala-conversion. 
Interestingly, Martin Odersky got his Ph.D. from Niklaus Wirth 
at the ETH, and I don't want to know what Wirth would say about 
implicits.

The presentation by Paul Phillips was discussed in the Scala 
forums at great length:

What's up with Paul Phillips?
https://groups.google.com/forum/?hl=de#!topic/scala-debate/IgrKCdConlA
54 replies

What's up with Paul Phillips?
https://groups.google.com/forum/?hl=de#!topic/scala-user/ImqlClXTrS4[201-225-false]
201 replies

Sadly, the only serious language on the JVM besides Java8 is 
Scala. Ceylon has not taken off at all after becoming 1.0. 
Groovy's language extensions are basically AST transformations 
and not truly baked into a "real" language. Nobody knows how 
Kotlin will be doing when it turns 1.0, maybe sometime in 
autumn/winter this year.

To get a plus for your skill set when applying for Java jobs you 
will have to learn Scala. For a Java developer like me any 
chances for a job doing D are very slim. But I keep looking into 
D just out of interest and to get some food for my mind. There is 
so much to learn from looking at D and playing with it that I 
keep doing it just on a fun & interest basis.
Apr 03 2014
next sibling parent reply "Rikki Cattermole" <alphaglosined gmail.com> writes:
On Thursday, 3 April 2014 at 08:18:01 UTC, Bienlein wrote:
 My knowledge of compiler constructions is fairly limited and I 
 might be wrong, but it seems to me that the Scala compiler is 
 broken. Scala has gained some bad reputation for long build 
 times (just google for Scala and build time) which IMHO cannot 
 be explained by the large number of language features. D has a 
 comparably large number of language features and compiles much 
 faster than Scala.

 D has been designed from the beginning with attention to 
 compilation speed and with thought about how to keep it fast. 
 Not only in that respect is D a language that was thought 
 through. Scala, on the contrary, seems to me to be a language 
 where many features of various languages were thrown into one 
 and then a compiler was built for it. The incremental Scala 
 compiler pretty much mitigates the build time problem, though, 
 so they mostly get by now. Also, IMHO, implicits are really 
 crazy and it should have been clear from the beginning that 
 they would become a problem for scalable build times, see 
 http://java.dzone.com/articles/implicits-scala-conversion. 
 Interestingly, Martin Odersky got his Ph.D. from Niklaus Wirth 
 at the ETH, and I don't want to know what Wirth would say about 
 implicits.

 The presentation by Paul Phillips was discussed in the Scala 
 forums at great length:

 What's up with Paul Phillips?
 https://groups.google.com/forum/?hl=de#!topic/scala-debate/IgrKCdConlA
 54 replies

 What's up with Paul Phillips?
 https://groups.google.com/forum/?hl=de#!topic/scala-user/ImqlClXTrS4[201-225-false]
 201 replies

 Sadly, the only serious language on the JVM besides Java8 is 
 Scala. Ceylon has not taken off at all after becoming 1.0. 
 Groovy's language extensions are basically AST transformations 
 and not truly baked into a "real" language. Nobody knows how 
 Kotlin will be doing when it turns 1.0, maybe sometime in 
 autumn/winter this year.

 To get a plus for your skill set when applying for Java jobs 
 you will have to learn Scala. For a Java developer like me any 
 chances for a job doing D are very slim. But I keep looking 
 into D just out of interest and to get some food for my mind. 
 There is so much to learn from looking at D and playing with it 
 that I keep doing it just on a fun & interest basis.
If I remember the state of Groovy correctly (around 2012), the compiler devs focused quite heavily on functionality, not performance. They even refused to go in that direction. It was quite bad. It's a real shame; I liked it. Although if they had, and had unsigned types, I probably wouldn't be in D!
Apr 03 2014
parent reply "Bienlein" <jeti789 web.de> writes:
 If I remember the state of Groovy correctly (around 2012), the 
 compiler devs focused quite heavily on functionality, not 
 performance. They even refused to go in that direction.
 It was quite bad.

 It's a real shame; I liked it. Although if they had, and had 
 unsigned types, I probably wouldn't be in D!
Since Groovy 2.0 there is optional static type checking and when using it performance is much better. When Groovy is run over the Havlak benchmark it is only 10% behind in speed compared to Java with static typing and only about 40% behind when purely dynamic as with pre-2.0 Groovy. See the bottommost paragraph in the readme of https://github.com/oplohmann/havlak-jvm-languages The benchmark in this article (http://java.dzone.com/articles/groovy-20-performance-compared) only measures method invocation time, but it also gives some idea that performance in Groovy is really good now. What Scala is really good at is concurrency. You must give them that. Akka (akka.io) and new ideas about futures and promises really started in the Scala community. Some of that stuff also made it into JDK8. Something like Akka for D will be a killer app for D. It can't be done as a spare time activity, otherwise I would already have embarked on it ;-).
Apr 03 2014
parent "Rikki Cattermole" <alphaglosined gmail.com> writes:
On Thursday, 3 April 2014 at 08:43:33 UTC, Bienlein wrote:
 If I remember the state of Groovy correctly (around 2012), the 
 compiler devs focused quite heavily on functionality, not 
 performance. They even refused to go in that direction.
 It was quite bad.

 It's a real shame; I liked it. Although if they had, and had 
 unsigned types, I probably wouldn't be in D!
Since Groovy 2.0 there is optional static type checking and when using it performance is much better. When Groovy is run over the Havlak benchmark it is only 10% behind in speed compared to Java with static typing and only about 40% behind when purely dynamic as with pre-2.0 Groovy. See the bottommost paragraph in the readme of https://github.com/oplohmann/havlak-jvm-languages The benchmark in this article (http://java.dzone.com/articles/groovy-20-performance-compared) only measures method invocation time, but it also gives some idea that performance in Groovy is really good now.
Sounds like a lot has changed since I was in it then.
 What Scala is really good at is concurrency. You must give them 
 that. Akka (akka.io) and new ideas about futures and promises 
 really started in the Scala community. Some of that stuff also 
 made it into JDK8. Something like Akka for D will be a killer 
 app for D. It can't be done as a spare time activity, otherwise 
 I would already have embarked on it ;-).
Yes, Akka is definitely a rather neat and great technology. It also would be great to have in D, and I would love to help get something like this working in D. But time. It's bad enough with Cmsed in its current state, let alone if I were to meet its goals of providing pretty much everything under the sun, like node communication between the frontend and backend of a web service. That also would be rather a killer feature. But having said this, it would actually probably be better if it were built like Akka. So on second thought, guess what I'll be working on soon: something like Akka. If you hear from me within a week in the form of an announcement, please help :)
Apr 03 2014
prev sibling parent reply "bachmeier" <no spam.net> writes:
On Thursday, 3 April 2014 at 08:18:01 UTC, Bienlein wrote:
 My knowledge of compiler constructions is fairly limited and I 
 might be wrong, but it seems to me that the Scala compiler is 
 broken. Scala has gained some bad reputation for long build 
 times (just google for Scala and build time) which IMHO cannot 
 be explained by the large number of language features. D has a 
 comparably large number of language features and compiles much 
 faster than Scala.

 D has been designed from the beginning with attention to 
 compilation speed and with thought about how to keep it fast. 
 Not only in that respect is D a language that was thought 
 through. Scala, on the contrary, seems to me to be a language 
 where many features of various languages were thrown into one 
 and then a compiler was built for it. The incremental Scala 
 compiler pretty much mitigates the build time problem, though, 
 so they mostly get by now. Also, IMHO, implicits are really 
 crazy and it should have been clear from the beginning that 
 they would become a problem for scalable build times, see 
 http://java.dzone.com/articles/implicits-scala-conversion. 
 Interestingly, Martin Odersky got his Ph.D. from Niklaus Wirth 
 at the ETH, and I don't want to know what Wirth would say about 
 implicits.

 The presentation by Paul Phillips was discussed in the Scala 
 forums at great length:

 What's up with Paul Phillips?
 https://groups.google.com/forum/?hl=de#!topic/scala-debate/IgrKCdConlA
 54 replies

 What's up with Paul Phillips?
 https://groups.google.com/forum/?hl=de#!topic/scala-user/ImqlClXTrS4[201-225-false]
 201 replies

 Sadly, the only serious language on the JVM besides Java8 is 
 Scala. Ceylon has not taken off at all after becoming 1.0. 
 Groovy's language extensions are basically AST transformations 
 and not truly baked into a "real" language. Nobody knows how 
 Kotlin will be doing when it turns 1.0, maybe sometime in 
 autumn/winter this year.
What about Clojure? It is getting real world use. The recent release makes it easier to call Clojure from Java. Example:

IFn map = Clojure.var("clojure.core", "map");
IFn inc = Clojure.var("clojure.core", "inc");
map.invoke(inc, Clojure.read("[1 2 3]"));

is all you need to use Clojure's map from a Java program.

https://github.com/clojure/clojure/blob/master/changes.md
Apr 03 2014
parent reply "Bienlein" <jeti789 web.de> writes:
On Thursday, 3 April 2014 at 11:03:56 UTC, bachmeier wrote:
 What about Clojure? It is getting real world use. The recent
 release makes it easier to call Clojure from Java. Example:

 IFn map = Clojure.var("clojure.core", "map");
 IFn inc = Clojure.var("clojure.core", "inc");
 map.invoke(inc, Clojure.read("[1 2 3]"));

 is all you need to use Clojure's map from a Java program.

 https://github.com/clojure/clojure/blob/master/changes.md
Yeah, you might be right. I was maybe too focused on imperative/OO languages. It is now especially easy to call Clojure from Kotlin. Have a look: http://blog.jetbrains.com/kotlin/2014/04/kotlin-gets-support-for-s-expressions
Apr 03 2014
parent reply "Paulo Pinto" <pjmlp progtools.org> writes:
On Thursday, 3 April 2014 at 11:19:14 UTC, Bienlein wrote:
 On Thursday, 3 April 2014 at 11:03:56 UTC, bachmeier wrote:
 What about Clojure? It is getting real world use. The recent
 release makes it easier to call Clojure from Java. Example:

 IFn map = Clojure.var("clojure.core", "map");
 IFn inc = Clojure.var("clojure.core", "inc");
 map.invoke(inc, Clojure.read("[1 2 3]"));

 is all you need to use Clojure's map from a Java program.

 https://github.com/clojure/clojure/blob/master/changes.md
Yeah, you might be right. I was maybe too focused on imperative/OO languages. It is now especially easy to call Clojure from Kotlin. Have a look: http://blog.jetbrains.com/kotlin/2014/04/kotlin-gets-support-for-s-expressions
I think you missed the post date.
Apr 03 2014
parent "Bienlein" <jeti789 web.de> writes:
On Thursday, 3 April 2014 at 13:23:16 UTC, Paulo Pinto wrote:

 I think you missed the post date.
I think so too ...
Apr 03 2014
prev sibling next sibling parent reply "w0rp" <devw0rp gmail.com> writes:
I notice that he mentioned the objection to defining equality and 
so on for the root object. I have heard this before from Philip 
Wadler, and the more I think about it, the more it makes sense. 
This is essentially the idea of removing every method from 
Object, which we have discussed before.
Apr 03 2014
parent reply "Paulo Pinto" <pjmlp progtools.org> writes:
On Thursday, 3 April 2014 at 22:58:24 UTC, w0rp wrote:
 I notice that he mentioned the objection to defining equality 
 and so on for the root object. I have heard this before from 
 Philip Wadler, and the more I think about it, the more it makes 
 sense. This is essentially the idea of removing every method 
 from Object, which we have discussed before.
I used to argue against it, but came to realize it actually does make sense. Java came up with it most likely as it was the way in Smalltalk. But nowadays we know better: those concepts are better expressed via interfaces/traits/protocols, or whatever they are called in each language. In languages with generics support and some form of interface definitions, there is no need for root objects. -- Paulo
Apr 04 2014
parent "Bienlein" <jeti789 web.de> writes:
On Friday, 4 April 2014 at 07:43:22 UTC, Paulo Pinto wrote:

 Java came up with it most likely as it was the way in Smalltalk.
That's right. As Smalltalk is dynamically typed it is not an issue there anyway, and Java to begin with had no parameterized types until after JDK 1.4.
I guess you need to be more up to date on Scala news. :)
https://groups.google.com/forum/m/#!msg/scala-internals/6HL6lVLI3bQ/IY4gEyOwFhoJ
https://github.com/lampepfl/dotty
Interesting. But it kind of looks like yet another academic thing similar to Scala. To me Kotlin is the better "Scala done right" language. But I won't play with it until I see its compiler speed being a lot better than Scala's ...
Apr 04 2014
prev sibling next sibling parent reply "Meta" <jared771 gmail.com> writes:
On Thursday, 3 April 2014 at 01:55:48 UTC, Andrei Alexandrescu 
wrote:
 A lot of them could apply to us as well.

 https://www.youtube.com/watch?v=TS1lpKBMkgg


 Andrei
His examination of the compare function was interesting. I think, though, that it's misguided, and not one of Scala's problems. Returning an int to denote less than, equal, and greater than is a very small complexity, and makes it very fast to check the result.
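For comparison, this is also the convention D itself uses for opCmp (a small sketch):

struct Point
{
    int x, y;
    int opCmp(const Point rhs) const
    {
        if (x != rhs.x) return x < rhs.x ? -1 : 1;
        if (y != rhs.y) return y < rhs.y ? -1 : 1;
        return 0;
    }
}

unittest
{
    assert(Point(1, 2) < Point(2, 0));           // lowered to opCmp(...) < 0
    assert(Point(1, 2).opCmp(Point(1, 2)) == 0); // equal
}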
Apr 03 2014
next sibling parent reply "Meta" <jared771 gmail.com> writes:
A more interesting point of his is the limitation of Scala's 
ability to optimize functions like filter... This is also a 
problem in D, but not as visible as we do not have macros to 
perform the sort of transformation he describes (turning filter 
f1, filter f2, filter f3 into filter f1 f2 f3). Maybe we should 
think about enforcing that lambdas passed to higher order 
functions are pure, when we can (not in the compiler, of course. 
In the library.)
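A rough D illustration of the transformation in question (written by hand here; neither the compiler nor Phobos does this automatically):

import std.algorithm : filter;
import std.array : array;

void main()
{
    auto data = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10];

    // Three wrappers, each with its own empty/front/popFront.
    auto chained = data.filter!(a => a > 2)
                       .filter!(a => a % 2 == 0)
                       .filter!(a => a < 9);

    // The fused form: one wrapper, one combined predicate. Only
    // equivalent when the predicates are pure.
    auto fused = data.filter!(a => a > 2 && a % 2 == 0 && a < 9);

    assert(chained.array == fused.array);
}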
Apr 03 2014
next sibling parent reply "Meta" <jared771 gmail.com> writes:
On Friday, 4 April 2014 at 01:14:37 UTC, Meta wrote:
 A more interesting point of his is the limitation of Scala's 
 ability to optimize functions like filter... This is also a 
 problem in D, but not as visible as we do not have macros to 
 perform the sort of transformation he describes (turning filter 
 f1, filter f2, filter f3 into filter f1 f2 f3). Maybe we should 
 think about enforcing that lambdas passed to higher order 
 functions are pure, when we can (not in the compiler, of 
 course. In the library.)
And unfortunately, his next example also compiles in D. At least 
D has some rationale for allowing this in the fact that it's a 
systems-level language, but this is still awful.

import std.stdio;

void main()
{
    float f = long.max;
    int n = int.max;
    auto x = f - n;
    writeln(typeof(x).stringof, " ", x);
}
Apr 03 2014
parent "Meta" <jared771 gmail.com> writes:
Whoops, should be:

import std.stdio;

void main()
{
	float x1 = long.max;
	float x2 = long.max - int.max;
	writeln(typeof(x2).stringof, " ", x2);
}

Not that it makes a difference.
Apr 03 2014
prev sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 4/3/2014 6:14 PM, Meta wrote:
 A more interesting point of his is the limitation of Scala's ability to
optimize
 functions like filter... This is also a problem in D, but not as visible as we
 do not have macros to perform the sort of transformation he describes (turning
 filter f1, filter f2, filter f3 into filter f1 f2 f3). Maybe we should think
 about enforcing that lambdas passed to higher order functions are pure, when we
 can (not in the compiler, of course. In the library.)
Since in D you can detect if a function is pure, and specialize accordingly, it is not necessary to require that the filter function be pure.
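A minimal sketch of that kind of detect-and-specialize (illustrative only; not how Phobos actually implements filter):

import std.traits : FunctionAttribute, functionAttributes;

int moduleState;   // global, so reading it breaks purity

bool purePred(int a) pure { return a > 1; }
bool impurePred(int a) { return a > moduleState; }

enum isPureCallable(alias f) =
    (functionAttributes!f & FunctionAttribute.pure_) != 0;

size_t count(alias pred, R)(R r)
{
    static if (isPureCallable!pred)
    {
        // Free to fuse, reorder or cache calls to pred here.
    }
    size_t n;
    foreach (e; r)
        if (pred(e))
            ++n;
    return n;
}

unittest
{
    static assert( isPureCallable!purePred);
    static assert(!isPureCallable!impurePred);
    assert(count!purePred([1, 2, 3]) == 2);
}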
Apr 03 2014
next sibling parent reply "Meta" <jared771 gmail.com> writes:
On Friday, 4 April 2014 at 01:51:58 UTC, Walter Bright wrote:
 Since in D you can detect if a function is pure, and specialize 
 accordingly, it is not necessary to require that the filter 
 function be pure.
That's true, but then somebody somewhere accidentally passes in a delegate that references some outside state, and performance is suddenly shot for no apparent reason. The upside in D is that you can explicitly mark delegates as pure and have the compiler check for you, but that still puts the onus on the user to be disciplined and not forget.
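A small sketch of that explicit marking (the moduleState global here is just for illustration):

import std.algorithm : filter;

int moduleState;   // mutable global state

void main()
{
    // The pure annotation is checked by the compiler:
    auto ok = [1, 2, 3].filter!(function bool(int a) pure { return a > 1; });
    assert(!ok.empty);

    // This variant is rejected, because a pure literal may not read moduleState:
    // auto bad = [1, 2, 3].filter!(function bool(int a) pure { return a > moduleState; });
}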
Apr 03 2014
parent Walter Bright <newshound2 digitalmars.com> writes:
On 4/3/2014 7:00 PM, Meta wrote:
 The upside in D is that you can explicitly mark delegates as pure and
 have the compiler check for you, but that still puts the onus on the user to be
 disciplined and not forget.
It's really like everything else in programming - at some point, if you don't avail yourself of the checking features, you have to check it yourself.
Apr 03 2014
prev sibling parent reply Ben Boeckel <mathstuf gmail.com> writes:
On Thu, Apr 03, 2014 at 18:51:56 -0700, Walter Bright wrote:
 Since in D you can detect if a function is pure, and specialize
 accordingly, it is not necessary to require that the filter function
 be pure.
Is there a built-in compose operator or function (Haskell's (.) operator)? How would you copy the common attributes of the composed functions to the new function (if not builtin)? --Ben
Apr 03 2014
parent Walter Bright <newshound2 digitalmars.com> writes:
On 4/3/2014 7:01 PM, Ben Boeckel wrote:
 Is there a built-in compose operator or function (Haskell's (.)
 operator)? How would you copy the common attributes of the composed
 functions to the new function (if not builtin)?
The compiler does attribute inference for template functions and lambdas.
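On the compose part of the question: Phobos has std.functional.compose, and because it is a template the attributes of the composed function are inferred rather than copied by hand. A small sketch:

import std.functional : compose;

int inc(int x) pure nothrow @safe { return x + 1; }
int twice(int x) pure nothrow @safe { return 2 * x; }

alias incThenTwice = compose!(twice, inc);   // evaluates twice(inc(x))

unittest
{
    assert(incThenTwice(3) == 8);
}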
Apr 03 2014
prev sibling next sibling parent reply "bearophile" <bearophileHUGS lycos.com> writes:
Meta:

 Returning an int to denote less than, equal, and greater than 
 is a very small complexity, and makes it very fast to check the 
 result.
The point of that part of the rant is that using an integer is very imprecise, typing-wise. Having more precise typing sometimes helps. In a little higher level language using a 3-value enum (as in Haskell, more or less) is still sufficiently efficient. And Ada language shows that often you can have both precise types (strong typing) and almost C-like efficiency. Bye, bearophile
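P.S. A sketch of the 3-value alternative (hypothetical; modeled on Haskell's Ordering, not something in Phobos or D today):

enum Ordering { less, equal, greater }

Ordering cmp3(T)(T a, T b)
{
    if (a < b) return Ordering.less;
    if (b < a) return Ordering.greater;
    return Ordering.equal;
}

unittest
{
    final switch (cmp3(1, 2))   // final switch: every case must be handled
    {
        case Ordering.less:    break;
        case Ordering.equal:   assert(0);
        case Ordering.greater: assert(0);
    }
}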
Apr 03 2014
parent reply "Meta" <jared771 gmail.com> writes:
On Friday, 4 April 2014 at 01:31:20 UTC, bearophile wrote:
 The point of that part of the rant is that using an integer is 
 very imprecise, typing-wise. Having more precise typing 
 sometimes helps.

 In a little higher level language using a 3-value enum (as in 
 Haskell, more or less) is still sufficiently efficient. And Ada 
 language shows that often you can have both precise types 
 (strong typing) and almost C-like efficiency.

 Bye,
 bearophile
I would agree if D actually had type-safe enums.

enum a
{
    val = 1
}

enum b
{
    val = 1
}

assert(a.val - b.val == 0);
Apr 03 2014
parent reply "bearophile" <bearophileHUGS lycos.com> writes:
Meta:

 I would agree if D actually had type-safe enums.

 enum a
 {
     val = 1
 }
 	
 enum b
 {
     val = 1
 }

 assert(a.val - b.val == 0);
C enums are mostly type unsafe. C++11 has enum class, which is strongly typed. D enums are intermediate (and D has final switch). I have asked for fully typesafe enums in D, but in several years I think Walter has never answered, nor has he explained why D has chosen such an intermediate point. I presume this choice is based on practical reasons, but I don't know exactly what they are (perhaps to minimize the number of casts). D used to have several corners of weak typing (like a partial confusion between pointers and dynamic arrays) that have later been (painfully and slowly) fixed (and this despite the D Zen supposedly preferring a strict design first, followed by some relaxations later). Bye, bearophile
Apr 03 2014
next sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 4/3/2014 7:19 PM, bearophile wrote:
 I have asked for fully typesafe enums in D, but in several years I think
 Walter has never answered, nor has he explained why D has chosen such an
 intermediate point. I presume this choice is based on practical reasons, but
 I don't know exactly what they are (perhaps to minimize the number of casts).
Because every cast breaks the type system. A type system that requires too many casts for normal things is NOT a type safe system. I have explained this on numerous occasions.
Apr 03 2014
parent reply "bearophile" <bearophileHUGS lycos.com> writes:
Walter Bright:

 Because every cast breaks the type system. A type system that 
 requires too many casts for normal things is NOT a type safe 
 system.

 I have explained this on numerous occasions.
You have discussed the unsafety of casts many times and I agree with your point of view. I try to reduce the number of casts as much as possible in my code (and recently I have replaced many "cast(double)x" with nice "double(x)"; as time passes D allows removing more and more casts from the code, and this is an improvement). You see I care of casts also from the little casts statistic I've done on your Warp: http://forum.dlang.org/thread/lhf0u6$2r80$1 digitalmars.com?page=3#post-wjjivmmeyeismgkntwsj:40forum.dlang.org Or from answers I keep giving in D.learn, where I suggest minimizing the usage of casts: http://forum.dlang.org/thread/efbjrtwqywkhfybmyvxy forum.dlang.org But since I follow D development and I write D code I don't remember any kind of discussion regarding the specific disadvantages of a stronger typed enum. This means answering questions like: what does it happen if D enums become strongly typed? How many casts is this going to cause in D code? Is it true that such enum casts are going to be worse than type unsafety of the current design? I don't remember seeing any little study that shows that stronger enums in D would greatly increase the number of casts. Perhaps this little study was done before I started to use D1 and I have missed it. I suspect that in my D code most cast usage does not need casts if you replace them with stronger casts as the "enum class" of C++11, but I have no proof of this. Bye, bearophile
Apr 04 2014
next sibling parent "bearophile" <bearophileHUGS lycos.com> writes:
 I suspect that in my D code most cast usage does not need casts 
 if you replace them with stronger casts as the "enum class" of 
 C++11, but I have no proof of this.
Too much casting. I meant to say: I suspect that in my D code most usages of enum don't need casts if you replace them with stronger enums (like the "enum class" of C++11), but I have no proof of this. Bye, bearophile
Apr 04 2014
prev sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 4/4/2014 3:24 AM, bearophile wrote:
 You see I care of casts also from the little casts statistic I've done on your
 Warp:
 http://forum.dlang.org/thread/lhf0u6$2r80$1 digitalmars.com?page=3#post-wjjivmmeyeismgkntwsj:40forum.dlang.org
Most of the casts in Warp come from the workarounds I had to do to get around the auto-decode of std.array.front(). I have designed byChar, byWchar and byDchar ranges for Phobos to get around this issue, but that is stalled now because of the messed up design of ranges. None of that has anything to do with enums.
 But since I follow D development and I write D code I don't remember any kind
of
 discussion regarding the specific disadvantages of a stronger typed enum.
Here's one:

  enum Index { A, B, C }
  T[Index.max] array; // Error: Index.max is not an int
  ...
  array[B] = t;   // Error: B is not an int


And another:

  array[A + 1] = t; // Error: incompatible types Index and int

And another:

  enum Mask { A=1,B=4 }

  Mask m = A | B;   // Error: incompatible operator | for enum

and on it goes. These are routine and normal uses of enums.
 This means answering questions like: what does it happen if D enums become
strongly
 typed? How many casts is this going to cause in D code? Is it true that such
 enum casts are going to be worse than type unsafety of the current design?
Yes, because I have to fill the above code with cast(int), and you are well aware that such blunt casting destroys all type safety. And besides, even if such strongly typed enums were a good idea, making such a change would be an utter disaster for existing code. It is out of the question.
Apr 04 2014
next sibling parent reply "Meta" <jared771 gmail.com> writes:
On Friday, 4 April 2014 at 18:02:02 UTC, Walter Bright wrote:
 Here's one:

   enum Index { A, B, C }
   T[Index.max] array; // Error: Index.max is not an int
   ...
   array[B] = t;   // Error: B is not an int


 And another:

   array[A + 1] = t; // Error: incompatible types Index and int

 And another:

   enum Mask { A=1,B=4 }

   Mask m = A | B;   // Error: incompatible operator | for enum

 and on it goes. These are routine and normal uses of enums.
It's trivial to write an EnumMemberValue template that converts the enum member to its underlying value at compile time. Also, if I remember correctly, min and max are terribly broken for enums in the first place.
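For instance, a minimal sketch of such a template (the name EnumMemberValue is hypothetical; this is not in Phobos):

import std.traits : OriginalType;

enum EnumMemberValue(alias member) =
    cast(OriginalType!(typeof(member))) member;

enum Index { A, B, C }
static assert(EnumMemberValue!(Index.B) == 1);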
Apr 04 2014
next sibling parent "Meta" <jared771 gmail.com> writes:
On Friday, 4 April 2014 at 18:15:43 UTC, Meta wrote:
 And another:

  array[A + 1] = t; // Error: incompatible types Index and int

 And another:

  enum Mask { A=1,B=4 }

  Mask m = A | B;   // Error: incompatible operator | for enum

 and on it goes. These are routine and normal uses of enums.
https://github.com/D-Programming-Language/phobos/pull/2058 Perhaps we *will* get typesafe enums of a sort via a library implementation.
Apr 04 2014
prev sibling parent reply "bearophile" <bearophileHUGS lycos.com> writes:
Meta:

 Also, if I remember correctly, min and max are terribly
 broken for enums in the first place.
Yes, perhaps they need to be deprecated. Also, there is an ugly name clash between enum field names and the enum properties. The solution is to group them into a single namespace (like "meta"), and then forbid an enum member with the name of the namespace. https://d.puremagic.com/issues/show_bug.cgi?id=4997 Unfortunately overall the design of D enums has more holes than swiss cheese. This is why in a recent post I said to Andrei that perhaps there are still several little breaking changes to do to D, and they need priority over additive enhancements. Bye, bearophile
Apr 04 2014
parent Walter Bright <newshound2 digitalmars.com> writes:
On 4/4/2014 11:47 AM, bearophile wrote:
 Also, there is an ugly name clash between enum field names and the enum
 properties. The solution is to group them into a single namespace (like
"meta"),
 and then forbid an enum member with the name of the namespace.

 https://d.puremagic.com/issues/show_bug.cgi?id=4997
Actually, that was intentional, which is why the issue is marked as "enhancement". The builtin properties are override-able.
 Unfortunately overall the design of D enums has more holes than swiss cheese.
Enums are not meant to be rigidly typed.
 This is why in a recent post I said to Andrei that perhaps there are still
 several little breaking changes to do to D, and they need priority over
additive
 enhancements.
I understand your concerns, but I don't share your opinion that they need fixing. Their behaviors were deliberately designed, and in my experience work out nicely.
Apr 04 2014
prev sibling next sibling parent reply "bearophile" <bearophileHUGS lycos.com> writes:
Walter Bright:

Thank you for the answers.

 Here's one:

   enum Index { A, B, C }
   T[Index.max] array; // Error: Index.max is not an int
   ...
   array[B] = t;   // Error: B is not an int
In the last months I've grown a moderate desire for optionally strongly typed array indexes in D (as seen in Ada, but with a different syntax) (it's optional, so it's meant to be an additive change, that causes no harm to existing D code). With them code like yours becomes OK (as it's OK in Ada). Such optional strong typing for array indexes is not meant for script-like D programs, but for the medium-integrity D programs.
 And another:

   array[A + 1] = t; // Error: incompatible types Index and int
This can be solved with optionally strongly typed array indexes plus a succ/prec property for enums. I have asked for such a property years ago. In Ada you use the built-in attribute 'Succ. Alternatively, in D you can also use a library-defined group of little functions/templates succ/prec/Succ/Prec (they contain a cast, but it's in Phobos, so it's less dangerous than a cast in user code):

array[Succ!(Index.A)] = t;
auto i = Index.A;
array[i.succ] = t;
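A minimal sketch of such a library-side succ (hypothetical, not in Phobos); the cast lives in one audited place instead of in user code:

import std.traits : OriginalType;

E succ(E)(E e) if (is(E == enum))
{
    return cast(E)(cast(OriginalType!E) e + 1);
}

enum Index { A, B, C }

unittest
{
    auto i = Index.A;
    assert(i.succ == Index.B);   // UFCS makes it read like a property
}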
 And another:

   enum Mask { A=1,B=4 }

   Mask m = A | B;   // Error: incompatible operator | for enum
On GitHub there is a patch that is meant to implement Flags (as in [Flags]): https://github.com/D-Programming-Language/phobos/pull/2058 If such a Flags type is implemented with enums, then it contains casts, but again casts in Phobos are less dangerous than casts in user code. Bye, bearophile
Apr 04 2014
next sibling parent "Araq" <rumpf_a web.de> writes:
 And another:

  enum Mask { A=1,B=4 }

  Mask m = A | B;   // Error: incompatible operator | for enum
That would be a 'set of enum' in Pascal/Delphi.
Apr 04 2014
prev sibling next sibling parent reply "bearophile" <bearophileHUGS lycos.com> writes:
 array[Succ!(Index.A)] = t;
 auto i = Index.A;
 array[i.succ] = t;
And with "enum precondition" in the succ() function you can do both cases with a single function: array[Index.A.succ] = t; auto i = Index.A; array[i.succ] = t; Bye, bearophile
Apr 04 2014
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 4/4/2014 12:05 PM, bearophile wrote:
 And with "enum precondition" in the succ() function you can do both cases with
a
 single function:

 array[Index.A.succ] = t;
 auto i = Index.A;
 array[i.succ] = t;
What about i+10? Do you expect the person to write i.succ.succ.succ.succ.succ.succ.succ.succ.succ.succ? Sorry, that sux!

And what about:

    int j;
    array[i+j]

? And forcing the user to use templates to do any logical or arithmetic operations on enum operands? It's just awful.
Apr 04 2014
parent Ben Boeckel <mathstuf gmail.com> writes:
On Fri, Apr 04, 2014 at 13:04:28 -0700, Walter Bright wrote:
 On 4/4/2014 12:05 PM, bearophile wrote:
And with "enum precondition" in the succ() function you can do both cases with a
single function:

array[Index.A.succ] = t;
auto i = Index.A;
array[i.succ] = t;
What about i+10? Do you expect the person to write i.succ.succ.succ.succ.succ.succ.succ.succ.succ.succ? Sorry, that sux!
You say that, I see a pattern. Presumably, you'd have something like:

    iter :: Int -> (a -> a) -> a -> a

so you'd have:

    array[iter(j, succ, i)]

But really, I think using an interface (cf. Ix, referenced elsewhere in the thread) rather than a concrete type for array indices may have been better, but I also think that ship has sailed here.
 And what about:
 
     int j;
     array[i+j]
Why would you be adding arbitrary integers to enumerations and expecting a valid result?
 And forcing the user to use templates to do any logical or arithmetic
 operations on enum operands? It's just awful.
Again, interfaces :) . --Ben
Apr 04 2014
prev sibling parent reply Leandro Lucarella <luca llucax.com.ar> writes:
bearophile, el  4 de April a las 18:39 me escribiste:
 Walter Bright:
 
 Thank you for the answers.
 
Here's one:

  enum Index { A, B, C }
  T[Index.max] array; // Error: Index.max is not an int
  ...
  array[B] = t;   // Error: B is not an int
In the last months I've grown a moderate desire for optionally strongly typed array indexes in D (as seen in Ada, but with a
What about:

enum Int : int { One = 1, Two, Three }  // implicitly casteable to int

enum Symbolic { Dogs, Cars, Trees }     // not implicitly casteable (and
                                        // maybe not even expose the
                                        // internal value)

?

-- 
Leandro Lucarella (AKA luca)
http://llucax.com.ar/
Apr 05 2014
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 4/5/2014 2:40 AM, Leandro Lucarella wrote:
 enum Symbolic { Dogs, Cars, Trees }    // not implicitly casteable (and
 				       // maybe not even expose the
 				       // internal value)

 ?
struct Symbolic
{
    private static struct _impl { private int x; }

    enum Dogs = _impl(0);
    enum Cars = _impl(1);
    enum Trees = _impl(2);
}

Of course, you can hide all this in a template.
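For instance, a sketch of what such a template might look like (hypothetical code, assuming a compiler with static foreach; not necessarily the exact shape intended here):

import std.conv : to;

template Symbolic(names...)
{
    struct Symbolic
    {
        private static struct _impl { private int x; }
        static foreach (i, name; names)
        {
            mixin("enum " ~ name ~ " = _impl(" ~ i.to!string ~ ");");
        }
    }
}

alias Things = Symbolic!("Dogs", "Cars", "Trees");

unittest
{
    assert(Things.Dogs != Things.Cars);
    static assert(!__traits(compiles, { int n = Things.Dogs; }));  // no implicit int
}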
Apr 05 2014
parent reply Leandro Lucarella <luca llucax.com.ar> writes:
Walter Bright, el  5 de April a las 11:04 me escribiste:
 On 4/5/2014 2:40 AM, Leandro Lucarella wrote:
enum Symbolic { Dogs, Cars, Trees }    // not implicitly casteable (and
				       // maybe not even expose the
				       // internal value)

?
struct Symbolic
{
    private static struct _impl { private int x; }

    enum Dogs = _impl(0);
    enum Cars = _impl(1);
    enum Trees = _impl(2);
}

Of course, you can hide all this in a template.
Well, you can "emulate" enums as they are now with structs too, so that doesn't change anything in the argument about why to provide syntax sugar for one and not the other. -- Leandro Lucarella (AKA luca) http://llucax.com.ar/
Apr 05 2014
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 4/5/2014 6:28 PM, Leandro Lucarella wrote:
 Walter Bright, el  5 de April a las 11:04 me escribiste:
 Of course, you can hide all this in a template.
Well, you can "emulate" enums as they are now with structs too, so that doesn't change anything in the argument about why to provide syntax sugar for one and not the other.
The argument for syntactic sugar is it must show a very large benefit over using a template. Having special syntax for everything makes the language unusable.
Apr 05 2014
next sibling parent reply Leandro Lucarella <luca llucax.com.ar> writes:
Walter Bright, el  5 de April a las 21:15 me escribiste:
 On 4/5/2014 6:28 PM, Leandro Lucarella wrote:
Walter Bright, el  5 de April a las 11:04 me escribiste:
Of course, you can hide all this in a template.
Well, you can "emulate" enums as they are now with structs too, so that doesn't change anything in the argument about why to provide syntax sugar for one and not the other.
The argument for syntactic sugar is it must show a very large benefit over using a template. Having special syntax for everything makes the language unusable.
What I mean is the current semantics of enum are as they are for historical reasons, not because they make (more) sense (than other possibilities). You showed a lot of examples that makes sense only because you are used to the current semantics, not because they are the only option or the option that makes the most sense. Is it better to redesign enum semantics now? Probably not, but I'm just saying :) -- Leandro Lucarella (AKA luca) http://llucax.com.ar/ ---------------------------------------------------------------------- El techo de mi cuarto lleno de cometas
Apr 06 2014
next sibling parent reply Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
On 4/6/14, 3:31 AM, Leandro Lucarella wrote:
 What I mean is the current semantics of enum are as they are for
 historical reasons, not because they make (more) sense (than other
 possibilities). You showed a lot of examples that makes sense only
 because you are used to the current semantics, not because they are the
 only option or the option that makes the most sense.

 Is it better to redesign enum semantics now? Probably not, but I'm just
 saying :)
I fully agree. In my opinion, too, the enum design in D is suboptimal. Andrei
Apr 06 2014
parent "Eric" <eric makechip.com> writes:
On Sunday, 6 April 2014 at 16:46:12 UTC, Andrei Alexandrescu 
wrote:
 On 4/6/14, 3:31 AM, Leandro Lucarella wrote:
 What I mean is the current semantics of enum are as they are 
 for
 historical reasons, not because they make (more) sense (than 
 other
 possibilities). You showed a lot of examples that makes sense 
 only
 because you are used to the current semantics, not because 
 they are the
 only option or the option that makes the most sense.

 Is it better to redesign enum semantics now? Probably not, but 
 I'm just
 saying :)
I fully agree. In my opinion, too, the enum design in D is suboptimal. Andrei
Hey bearophile - I rest my case... -Eric
Apr 06 2014
prev sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 4/6/2014 3:31 AM, Leandro Lucarella wrote:
 What I mean is the current semantics of enum are as they are for
 historical reasons, not because they make (more) sense (than other
 possibilities). You showed a lot of examples that makes sense only
 because you are used to the current semantics, not because they are the
 only option or the option that makes the most sense.
I use enums a lot in D. I find they work very satisfactorily. The way they work was deliberately designed, not a historical accident.
Apr 06 2014
next sibling parent reply "Araq" <rumpf_a web.de> writes:
On Sunday, 6 April 2014 at 17:52:19 UTC, Walter Bright wrote:
 On 4/6/2014 3:31 AM, Leandro Lucarella wrote:
 What I mean is the current semantics of enum are as they are 
 for
 historical reasons, not because they make (more) sense (than 
 other
 possibilities). You showed a lot of examples that makes sense 
 only
 because you are used to the current semantics, not because 
 they are the
 only option or the option that makes the most sense.
I use enums a lot in D. I find they work very satisfactorily. The way they work was deliberately designed, not a historical accident.
The fact that you are unaware of how it's properly done (hint: Pascal got it right with 'set of enum' being distinct from 'enum') makes it a historical accident.
Apr 06 2014
parent Walter Bright <newshound2 digitalmars.com> writes:
On 4/6/2014 2:26 PM, Araq wrote:
 The fact that you are unaware of how it's properly done (hint: Pascal got it right
 with 'set of enum' being distinct from 'enum') makes it a historical accident.
I wrote a Pascal compiler before the C one.
Apr 06 2014
prev sibling parent reply Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
On 4/6/14, 10:52 AM, Walter Bright wrote:
 On 4/6/2014 3:31 AM, Leandro Lucarella wrote:
 What I mean is the current semantics of enum are as they are for
 historical reasons, not because they make (more) sense (than other
 possibilities). You showed a lot of examples that makes sense only
 because you are used to the current semantics, not because they are the
 only option or the option that makes the most sense.
I use enums a lot in D. I find they work very satisfactorily. The way they work was deliberately designed, not a historical accident.
Sorry, I think they ought to have been better. -- Andrei
Apr 06 2014
next sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 4/6/2014 4:17 PM, Andrei Alexandrescu wrote:
 On 4/6/14, 10:52 AM, Walter Bright wrote:
 I use enums a lot in D. I find they work very satisfactorily. The way
 they work was deliberately designed, not a historical accident.
Sorry, I think they ought to have been better. -- Andrei
Sorry, yer wrong!
Apr 06 2014
parent reply Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
On 4/6/14, 6:49 PM, Walter Bright wrote:
 On 4/6/2014 4:17 PM, Andrei Alexandrescu wrote:
 On 4/6/14, 10:52 AM, Walter Bright wrote:
 I use enums a lot in D. I find they work very satisfactorily. The way
 they work was deliberately designed, not a historical accident.
Sorry, I think they ought to have been better. -- Andrei
Sorry, yer wrong!
This program compiles flag-free and with no cast in sight, but fails at runtime. Textbook example of unsound type design.

import std.stdio;

enum A { x = 2, y = 4 }

void main()
{
    A a = A.x | A.y;
    final switch (a)
    {
        case A.x: break;
        case A.y: break;
    }
}

The "|" operator converts back to an A. It shouldn't. In this case it provides a value not only outside the enum range, but even greater than A.max (when converted to integer).

I'm fine with "yes, it's unsound, but we wanted to do flags and we couldn't find a better solution", but this "it's deliberate and it's good" I just find difficult to get behind.


Andrei
Apr 07 2014
parent "w0rp" <devw0rp gmail.com> writes:
On Monday, 7 April 2014 at 21:02:04 UTC, Andrei Alexandrescu 
wrote:
 This program compiles flag-free and with no cast in sight, but 
 fails at runtime. Textbook example of unsound type design.

 import std.stdio;

 enum A { x = 2, y = 4 }

 void main()
 {
     A a = A.x | A.y;
     final switch (a)
     {
         case A.x: break;
         case A.y: break;
     }
 }

 The "|" operator converts back to an A. It shouldn't. In this 
 case it provides a value not only outside the enum range, but 
 even greater than A.max (when converted to integer).

 I'm fine with "yes, it's unsound, but we wanted to do flags and 
 we couldn't find a better solution", but this "it's deliberate 
 and it's good" I just find difficult to get behind.


 Andrei
Yeah, I've seen this happen before. I think we could actually introduce a little more type safety on enums without a great deal of breakage. It would be nice to have a final switch give you as much of a guarantee about what it's doing as it can.
Apr 07 2014
prev sibling parent reply "Regan Heath" <regan netmail.co.nz> writes:
On Mon, 07 Apr 2014 00:17:45 +0100, Andrei Alexandrescu  
<SeeWebsiteForEmail erdani.org> wrote:

 On 4/6/14, 10:52 AM, Walter Bright wrote:
 On 4/6/2014 3:31 AM, Leandro Lucarella wrote:
 What I mean is the current semantics of enum are as they are for
 historical reasons, not because they make (more) sense (than other
 possibilities). You showed a lot of examples that makes sense only
 because you are used to the current semantics, not because they are the
 only option or the option that makes the most sense.
I use enums a lot in D. I find they work very satisfactorily. The way they work was deliberately designed, not a historical accident.
Sorry, I think they ought to have been better. -- Andrei
Got a DIP/spec/design to share? R -- Using Opera's revolutionary email client: http://www.opera.com/mail/
Apr 07 2014
next sibling parent reply "Dicebot" <public dicebot.lv> writes:
On Monday, 7 April 2014 at 10:07:03 UTC, Regan Heath wrote:
 Got a DIP/spec/design to share?

 R
I think biggest mistake of D enums is merging constants and actual enumerations into single entity which has resulted in weak typing of enumerations.
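For illustration, the two uses that share the one keyword:

enum bufferSize = 4096;          // a manifest constant; no new type involved
enum Color { red, green, blue }  // an actual enumeration type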
Apr 07 2014
next sibling parent "Eric" <eric makechip.com> writes:
On Monday, 7 April 2014 at 12:04:15 UTC, Dicebot wrote:
 On Monday, 7 April 2014 at 10:07:03 UTC, Regan Heath wrote:
 Got a DIP/spec/design to share?

 R
I think biggest mistake of D enums is merging constants and actual enumerations into single entity which has resulted in weak typing of enumerations.
Which leads to things like not being able to use enums of class type or struct type in switch statements. -Eric
Apr 07 2014
prev sibling next sibling parent Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
On 4/7/14, 5:04 AM, Dicebot wrote:
 On Monday, 7 April 2014 at 10:07:03 UTC, Regan Heath wrote:
 Got a DIP/spec/design to share?

 R
I think biggest mistake of D enums is merging constants and actual enumerations into single entity which has resulted in weak typing of enumerations.
That ain't the biggest. Biggest is unsound operations. -- Andrei
Apr 07 2014
prev sibling parent Leandro Lucarella <luca llucax.com.ar> writes:
Dicebot, el  7 de April a las 12:04 me escribiste:
 On Monday, 7 April 2014 at 10:07:03 UTC, Regan Heath wrote:
Got a DIP/spec/design to share?

R
I think biggest mistake of D enums is merging constants and actual enumerations into single entity which has resulted in weak typing of enumerations.
Yeah,

enum E { A = 1, B = 2 }

is the same as:

struct E { immutable A = 1, B = 2; }

(leaving storage aside), which to me doesn't make any sense, even considering the argument that you should have a big gain before introducing syntax sugar. enums should be enums, flags should be flags, and manifest constants should be... well, you got the idea. enum is to D what const is to C++ :P

-- 
Leandro Lucarella (AKA luca)
http://llucax.com.ar/
----------------------------------------------------------------------
The average person laughs 13 times a day
Apr 08 2014
prev sibling next sibling parent reply Paulo Pinto <pjmlp progtools.org> writes:
Am 07.04.2014 12:07, schrieb Regan Heath:
 On Mon, 07 Apr 2014 00:17:45 +0100, Andrei Alexandrescu
 <SeeWebsiteForEmail erdani.org> wrote:

 On 4/6/14, 10:52 AM, Walter Bright wrote:
 On 4/6/2014 3:31 AM, Leandro Lucarella wrote:
 What I mean is the current semantics of enum are as they are for
 historical reasons, not because they make (more) sense (than other
 possibilities). You showed a lot of examples that makes sense only
 because you are used to the current semantics, not because they are the
 only option or the option that makes the most sense.
I use enums a lot in D. I find they work very satisfactorily. The way they work was deliberately designed, not a historical accident.
Sorry, I think they ought to have been better. -- Andrei
Got a DIP/spec/design to share? R
How they work in languages like Ada. -- Paulo
Apr 07 2014
parent reply "Regan Heath" <regan netmail.co.nz> writes:
On Mon, 07 Apr 2014 16:15:41 +0100, Paulo Pinto <pjmlp progtools.org>  
wrote:

 Am 07.04.2014 12:07, schrieb Regan Heath:
 On Mon, 07 Apr 2014 00:17:45 +0100, Andrei Alexandrescu
 <SeeWebsiteForEmail erdani.org> wrote:

 On 4/6/14, 10:52 AM, Walter Bright wrote:
 On 4/6/2014 3:31 AM, Leandro Lucarella wrote:
 What I mean is the current semantics of enum are as they are for
 historical reasons, not because they make (more) sense (than other
 possibilities). You showed a lot of examples that makes sense only
 because you are used to the current semantics, not because they are  
 the
 only option or the option that makes the most sense.
I use enums a lot in D. I find they work very satisfactorily. The way they work was deliberately designed, not a historical accident.
Sorry, I think they ought to have been better. -- Andrei
Got a DIP/spec/design to share? R
How they work in languages like Ada.
Ok, a brief look at those shows me enums can be converted to a "Pos" index, but otherwise you cannot associate a numeric value with them, right? So if we had that in D, Walter's examples would look like..

1)

  enum Index { A, B, C }
  T[Index.C.pos + 1] array; // perhaps?
  ...
  array[Index.B.pos] = t;   // yes?

2)

  array[Index.A.pos + 1] = t; // yes?

3)

  enum Mask { A=1,B=4 } // not possible?

  Mask m = A | B;   // Error: incompatible operator | for enum

Have I got that right?

For a proposal like this to even be considered I would imagine it would have to be backward compatible with existing uses, so you would have to be proposing a new keyword or syntax on "enum" to trigger typesafe enums, perhaps "typesafe" is a good keyword, e.g.

typesafe enum Index { A, B, C } // requires use of .pos to convert to int 0, 1, or 2.
enum Index { A, B, C }          // existing pragmatic behaviour

R

-- 
Using Opera's revolutionary email client: http://www.opera.com/mail/
Apr 07 2014
parent "Paulo Pinto" <pjmlp progtools.org> writes:
Using Ada code examples below:

On Monday, 7 April 2014 at 16:25:45 UTC, Regan Heath wrote:
 On Mon, 07 Apr 2014 16:15:41 +0100, Paulo Pinto 
 <pjmlp progtools.org> wrote:

 Am 07.04.2014 12:07, schrieb Regan Heath:
 On Mon, 07 Apr 2014 00:17:45 +0100, Andrei Alexandrescu
 <SeeWebsiteForEmail erdani.org> wrote:

 On 4/6/14, 10:52 AM, Walter Bright wrote:
 On 4/6/2014 3:31 AM, Leandro Lucarella wrote:
 What I mean is the current semantics of enum are as they 
 are for
 historical reasons, not because they make (more) sense 
 (than other
 possibilities). You showed a lot of examples that makes 
 sense only
 because you are used to the current semantics, not because 
 they are the
 only option or the option that makes the most sense.
I use enums a lot in D. I find they work very satisfactorily. The way they work was deliberately designed, not a historical accident.
Sorry, I think they ought to have been better. -- Andrei
Got a DIP/spec/design to share? R
How they work in languages like Ada.
Ok, a brief look at those shows me enums can be converted to a "Pos" index, but otherwise you cannot associate a numeric value with them, right? So if we had that in D, Walter's examples would look like..

1)

  enum Index { A, B, C }
  T[Index.C.pos + 1] array; // perhaps?
type Index is (A, B, C);
d_array : array (Index) of T;
   ...
   array[Index.B.pos] = t;   // yes?
d_array(B) := t;
 2)

   array[Index.A.pos + 1] = t; // yes?
d_array(Index'Succ(A)) := t;
 3)

   enum Mask { A=1,B=4 } // not possible?

   Mask m = A | B;   // Error: incompatible operator | for enum
type Mask is (A, B); for Mask use (A => 1, B => 4); m : Mask := Mask'Pos(A) or Mask'Pos(B);
 Have I got that right?

 For a proposal like this to even be considered I would imagine 
 it would have to be backward compatible with existing uses, so 
 you would have to be proposing a new keyword or syntax on 
 "enum" to trigger typesafe enums, perhaps "typesafe" is a good 
 keyword, e.g.

 typesafe enum Index { A, B, C } // requires use of .pos to 
 convert to int 0, 1, or 2.
 enum Index { A, B, C }          // existing pragmatic behaviour

 R
This is the C++ approach with enum class for strongly typed enums.
Apr 08 2014
prev sibling parent Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
On 4/7/14, 3:07 AM, Regan Heath wrote:
 On Mon, 07 Apr 2014 00:17:45 +0100, Andrei Alexandrescu
 <SeeWebsiteForEmail erdani.org> wrote:

 On 4/6/14, 10:52 AM, Walter Bright wrote:
 On 4/6/2014 3:31 AM, Leandro Lucarella wrote:
 What I mean is the current semantics of enum are as they are for
 historical reasons, not because they make (more) sense (than other
 possibilities). You showed a lot of examples that makes sense only
 because you are used to the current semantics, not because they are the
 only option or the option that makes the most sense.
I use enums a lot in D. I find they work very satisfactorily. The way they work was deliberately designed, not a historical accident.
Sorry, I think they ought to have been better. -- Andrei
Got a DIP/spec/design to share?
No. -- Andrei
Apr 07 2014
prev sibling parent reply "bearophile" <bearophileHUGS lycos.com> writes:
Walter Bright:

 Having special syntax for everything makes the language 
 unusable.
While there are ways to reach excesses in every design direction, and make things unusable, the risk discussed here seems remote to me. So do you have an example of this risk? Or examples of languages that have fallen into this trap? Perhaps Ada? Bye, bearophile
Apr 06 2014
next sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 4/6/2014 4:26 AM, bearophile wrote:
 So do you have an example of this risk?
Algol is a rather famous one. A counterexample is Go, which has gotten a lot of traction with a simple syntax.
Apr 06 2014
parent reply Paulo Pinto <pjmlp progtools.org> writes:
Am 06.04.2014 19:54, schrieb Walter Bright:
 On 4/6/2014 4:26 AM, bearophile wrote:
 So do you have an example of this risk?
Algol is a rather famous one. A counterexample is Go, which has gotten a lot of traction with a simple syntax.
It has more to do with Google than with the language's design.
Apr 06 2014
parent "Ola Fosheim =?UTF-8?B?R3LDuHN0YWQi?= writes:
On Sunday, 6 April 2014 at 19:53:43 UTC, Paulo Pinto wrote:
 A counterexample is Go, which has gotten a lot of traction 
 with a simple
 syntax.
It has more to do with Google than with the language's design.
That, and being perceived as an HTTP-server language and having standard libraries and a threading model geared towards web servers. In addition, Go has managed to improve on the C syntax by removing syntax that is in most cases redundant, which is quite nice for readability, IMO.
Apr 06 2014
prev sibling parent "Ola Fosheim =?UTF-8?B?R3LDuHN0YWQi?= writes:
On Sunday, 6 April 2014 at 11:26:41 UTC, bearophile wrote:
 Walter Bright:
 Having special syntax for everything makes the language 
 unusable.
While there are ways to reach excesses in every design direction, and make things unusable, the risk discussed here seems remote to me.
Too much syntax diversity for almost the same things leads to a language that is harder to learn, but I think readability has less to do with special syntax than with how it is done and how frequently those constructs are used. You can get syntax diversity with a simple formal syntax too; Lisp code often shows signs of this. D and C++ show signs of this with overuse of templates. I find template-heavy code to be very poor in terms of readability, and well-designed special syntax would have been much better in terms of usability.
Apr 06 2014
prev sibling parent reply Ben Boeckel <mathstuf gmail.com> writes:
On Fri, Apr 04, 2014 at 11:02:01 -0700, Walter Bright wrote:
 Most of the casts in Warp come from the workarounds I had to do to
 get around the auto-decode of std.array.front(). I have designed
 byChar, byWchar and byDchar ranges for Phobos to get around this
 issue, but that is stalled now because of the messed up design of
 ranges.
Sorry, I'm a D noob; what's 'auto-decode'?
 Here's one:
 
   enum Index { A, B, C }
   T[Index.max] array; // Error: Index.max is not an int
   ...
   array[B] = t;   // Error: B is not an int
Maybe instead of having array indices be int, have them specify some interface (akin to Ix[1] used by Haskell's Array). Not that this is likely fixable at this point.
 And another:
 
   array[A + 1] = t; // Error: incompatible types Index and int
 
 And another:
 
   enum Mask { A=1,B=4 }
 
   Mask m = A | B;   // Error: incompatible operator | for enum
I like Qt's Q_FLAG and Q_FLAGS where you have a separate type for the flags and combined flags. Maybe something like:

    enum MaskBits { mixin EnumBits!MaskBits; A=1, B=4 }
    alias Flags!MaskBits Mask;

where EnumBits would define the binary operations, would be possible?
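For illustration, a minimal D sketch of such a Flags wrapper (the Flags and MaskBits names are just the ones proposed above, not an existing library API; the EnumBits mixin is left out):

    // A wrapper that only combines members of one enum type, so unrelated
    // enums cannot be mixed by accident.
    struct Flags(E) if (is(E == enum))
    {
        private int bits;

        this(E e) { bits = cast(int) e; }

        // Only allow bitwise combination with members of the same enum type.
        Flags opBinary(string op)(E e)
            if (op == "|" || op == "&" || op == "^")
        {
            Flags result;
            result.bits = mixin("bits " ~ op ~ " cast(int) e");
            return result;
        }

        // Test whether a given flag is set.
        bool has(E e) { return (bits & cast(int) e) != 0; }
    }

    enum MaskBits { A = 1, B = 4 }

    unittest
    {
        auto m = Flags!MaskBits(MaskBits.A) | MaskBits.B;
        assert(m.has(MaskBits.A) && m.has(MaskBits.B));
        // auto bad = m | 2;  // would not compile: a bare int is not a MaskBits
    }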
 And besides, even if such strongly typed enums were a good idea,
 making such a change would be an utter disaster for existing code. It
 is out of the question.
Agreed. There's also Haskell's 'newtype' which might be useful to have (strongalias? strictalias?). I guess this is no different than something like:

    class NewType(T) {
    private:
        T store;

        public this(T store) {
            this.store = store;
        }

        package T unT() {
            return store;
        }
    }

and if you want unT to be public:

    public T unT(NewType!T nt) {
        return nt.unT();
    }

--Ben

[1] http://www.haskell.org/ghc/docs/latest/html/libraries/base/Data-Ix.html
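A hypothetical usage of the NewType sketch above (assuming it lives in the same module, since unT is package-protected): the wrapper does not implicitly convert back to T, so a plain int cannot be substituted by accident.

    alias UserId = NewType!int;

    void takesInt(int) { }

    unittest
    {
        auto id = new UserId(42);
        // takesInt(id);     // does not compile: no implicit conversion to int
        takesInt(id.unT());  // explicit unwrapping is required
    }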
Apr 04 2014
parent reply "John Colvin" <john.loughran.colvin gmail.com> writes:
On Friday, 4 April 2014 at 18:57:44 UTC, Ben Boeckel wrote:
 On Fri, Apr 04, 2014 at 11:02:01 -0700, Walter Bright wrote:
 Most of the casts in Warp come from the workarounds I had to 
 do to
 get around the auto-decode of std.array.front(). I have 
 designed
 byChar, byWchar and byDchar ranges for Phobos to get around 
 this
 issue, but that is stalled now because of the messed up design 
 of
 ranges.
Sorry, I'm a D noob; what's 'auto-decode'?
Unicode decoding: front decodes a code point from a string instead of returning a single code unit (char).
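A small illustration (not from the thread) of what that means in practice, using the std.array front mentioned above:

    import std.array;   // brings front/popFront/empty for arrays and strings into scope

    void main()
    {
        string s = "\u00E9";    // 'é': one code point, two UTF-8 code units
        dchar c = s.front;      // front yields a decoded code point (dchar)
        assert(c == 0x00E9);    // U+00E9, 'é'
        assert(s.length == 2);  // but .length still counts UTF-8 code units
    }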
Apr 04 2014
parent Ben Boeckel <mathstuf gmail.com> writes:
On Fri, Apr 04, 2014 at 20:11:39 +0000, John Colvin wrote:
 unicode decoding. front decodes a code-point from a string instead of
 a code-unit (single char)
Another reason for separating 'byte' and 'char' types? --Ben
Apr 04 2014
prev sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 4/3/2014 7:19 PM, bearophile wrote:
 I have asked for fully typesafe enums in D,
You can do this:

    struct MyInt
    {
        int x;
        alias x this;
        ... put your various constraints here ...
    }

to get typesafe enums. In fact, you can use this construct to create a type that overrides selected behaviors of any other type.
Apr 03 2014
next sibling parent reply "Meta" <jared771 gmail.com> writes:
On Friday, 4 April 2014 at 04:31:41 UTC, Walter Bright wrote:
 On 4/3/2014 7:19 PM, bearophile wrote:
 I have asked for fully typesafe enums in D,
You can do this:

    struct MyInt
    {
        int x;
        alias x this;
        ... put your various constraints here ...
    }

to get typesafe enums. In fact, you can use this construct to create a type that overrides selected behaviors of any other type.
Combined with your other post about casts, I'm not sure we're talking about the same kind of type-safety. In the case of your example, alias this does not make it typesafe, as a MyInt can still be implicitly converted to int.

    struct MyInt
    {
        int x;
        alias x this;
    }

    void takesInt(int n) { }

    void main()
    {
        // Fine
        takesInt(MyInt(1));
    }

Implicit conversions are generally not a facet of type-safe systems. Saying that too-strong typing is bad because casts break the type system is a strawman, although I agree that there is a balance that must be struck.
Apr 03 2014
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 4/3/2014 9:54 PM, Meta wrote:
 In the case of your example, alias this does not make
 it typesafe, as a MyInt can still be implicitly converted to int.
You can disable the implicit conversion to int with this scheme. The alias this only takes effect if there is no other member that will take the operation.
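A minimal sketch (mine, not Walter's code) of the member-precedence rule described here, assuming current alias this lookup: an operation the struct defines itself is chosen before the compiler falls back to the aliased int.

    struct MyInt
    {
        int x;
        alias x this;

        // Because this member exists, MyInt + MyInt resolves here instead of
        // decaying both operands to int through alias this.
        MyInt opBinary(string op : "+")(MyInt rhs) const
        {
            return MyInt(x + rhs.x);
        }
    }

    unittest
    {
        auto a = MyInt(1), b = MyInt(2);
        static assert(is(typeof(a + b) == MyInt)); // the member wins over alias this
        assert((a + b).x == 3);
    }

Note that this only covers operations you define explicitly; as pointed out above, passing a MyInt to a function that takes an int still goes through the implicit conversion.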
Apr 04 2014
parent reply Rory McGuire <rjmcguire gmail.com> writes:
On Fri, Apr 4, 2014 at 9:05 AM, Walter Bright <newshound2 digitalmars.com>wrote:

 You can disable the implicit conversion to int with this scheme. The alias
 this only takes effect if there is no other member that will take the
 operation.
What is the exact method of disabling the implicit cast? I had a look after you sent your last mail. Didn't find anything in the spec.
Apr 04 2014
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 4/4/2014 12:23 AM, Rory McGuire wrote:
 On Fri, Apr 4, 2014 at 9:05 AM, Walter Bright <newshound2 digitalmars.com
 <mailto:newshound2 digitalmars.com>> wrote:

     You can disable the implicit conversion to int with this scheme. The alias
     this only takes effect if there is no other member that will take the
operation.

 What is the exact method of disabling the implicit cast? I had a look after you
 sent your last mail. Didn't find anything in the spec.
It's supposed to be by adding your own opImplicitCast overload, but that isn't implemented yet.
Apr 04 2014
next sibling parent Rory McGuire <rjmcguire gmail.com> writes:
okay, awesome, I guessed correctly then. (on where to find it anyway).


On Fri, Apr 4, 2014 at 11:16 AM, Walter Bright
<newshound2 digitalmars.com>wrote:

 On 4/4/2014 12:23 AM, Rory McGuire wrote:

 On Fri, Apr 4, 2014 at 9:05 AM, Walter Bright <newshound2 digitalmars.com
 <mailto:newshound2 digitalmars.com>> wrote:

     You can disable the implicit conversion to int with this scheme. The
 alias
     this only takes effect if there is no other member that will take the
 operation.

 What is the exact method of disabling the implicit cast? I had a look
 after you
 sent your last mail. Didn't find anything in the spec.
It's supposed to be by adding your own opImplicitCast overload, but that isn't implemented yet.
Apr 04 2014
prev sibling next sibling parent reply Dmitry Olshansky <dmitry.olsh gmail.com> writes:
04-Apr-2014 13:16, Walter Bright writes:
 On 4/4/2014 12:23 AM, Rory McGuire wrote:
 On Fri, Apr 4, 2014 at 9:05 AM, Walter Bright <newshound2 digitalmars.com
 <mailto:newshound2 digitalmars.com>> wrote:

     You can disable the implicit conversion to int with this scheme.
 The alias
     this only takes effect if there is no other member that will take
 the operation.

 What is the exact method of disabling the implicit cast? I had a look
 after you
 sent your last mail. Didn't find anything in the spec.
It's supposed to be by adding your own opImplicitCast overload, but that isn't implemented yet.
The difference between opImplicitCast and alias this being ... ? -- Dmitry Olshansky
Apr 04 2014
parent Rory McGuire <rjmcguire gmail.com> writes:
First use alias this to "import" functionality transparently. Then use
opImplicitCast to set what can be implicitly cast to.
On 04 Apr 2014 12:45 PM, "Dmitry Olshansky" <dmitry.olsh gmail.com> wrote:

04-Apr-2014 13:16, Walter Bright writes:

 On 4/4/2014 12:23 AM, Rory McGuire wrote:

 On Fri, Apr 4, 2014 at 9:05 AM, Walter Bright <
 newshound2 digitalmars.com
 <mailto:newshound2 digitalmars.com>> wrote:

     You can disable the implicit conversion to int with this scheme.
 The alias
     this only takes effect if there is no other member that will take
 the operation.

 What is the exact method of disabling the implicit cast? I had a look
 after you
 sent your last mail. Didn't find anything in the spec.
It's supposed to be by adding your own opImplicitCast overload, but that isn't implemented yet.
The difference between opImplicitCast and alias this being ... ? -- Dmitry Olshansky
Apr 04 2014
prev sibling parent "Jesse Phillips" <Jesse.K.Phillips+D gmail.com> writes:
On Friday, 4 April 2014 at 09:16:26 UTC, Walter Bright wrote:
 It's supposed to be by adding your own opImplicitCast overload, 
 but that isn't implemented yet.
Wait, this is back? What else did the community get wrong when trying to interpret discussions?

http://prowiki.org/wiki4d/wiki.cgi?LanguageDevel#FutureDirections

Previously proposed, but now dropped:

* opImplicitCast
* Separating arrays and slices (T[new]) (see NG:digitalmars.D/95225); Demise of T[new] NG:digitalmars.D/98602
* Make references and arrays library types (see NG discussion)
* this() for structs.
* Make array literals immutable.
* Remove 'new'.
* Remove C-style struct initializers.
Apr 04 2014
prev sibling parent =?UTF-8?B?U2ltZW4gS2rDpnLDpXM=?= <simen.kjaras gmail.com> writes:
On 2014-04-04 04:31, Walter Bright wrote:
 On 4/3/2014 7:19 PM, bearophile wrote:
 I have asked for fully typesafe enums in D,
You can do this:

    struct MyInt
    {
        int x;
        alias x this;
        ... put your various constraints here ...
    }

to get typesafe enums. In fact, you can use this construct to create a type that overrides selected behaviors of any other type.
For a more complete implementation of typesafe enums, here's my take: https://github.com/Biotronic/Collectanea/blob/master/biotronic/enumeration.d -- Simen
Apr 04 2014
prev sibling parent reply Ben Boeckel <mathstuf gmail.com> writes:
On Fri, Apr 04, 2014 at 00:59:23 +0000, Meta wrote:
 His examination of the compare function was interesting. I think,
 though, that it's misguided, and not one of Scala's problems.
Maybe not major, but it's not completely ignorable.
 Returning an int to denote less than, equal, and greater than is a
 very small complexity, and makes it very fast to check the result.
See, this is *exactly* his point. You're basically offering "well, C does it" and "int is fast" as your rationale. I think by this point, we (as a collective community) have seen that C has some serious flaws when you start allowing user input from untrusted sources into your code. The latter is easily classified as premature optimization. There is *zero* rationale as to why this would be a compilable implementation of comparison:

    int compare(int a, int b) {
        return a * b;
    }

The fact that this compiles when used as a comparison is *insane* when you take a fresh look at how you can construct a language. If you have sum types, you can both deny the above silliness and represent it as an integer and be just fine. In fact, you're possibly better off, since you can now do:

    add $offset $compare_result
    jmp *$offset

rather than doing 2 comparisons and a branch, since you *know* the result will never be more than 2.

--Ben
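A hedged D sketch (not Ben's code) of the sum-type version of compare: the result type has only three values, yet it can still carry the traditional -1/0/1 representation underneath, so checking it stays cheap.

    enum Ordering : int { Less = -1, Equal = 0, Greater = 1 }

    Ordering compare(int a, int b)
    {
        if (a < b) return Ordering.Less;
        if (a > b) return Ordering.Greater;
        return Ordering.Equal;
    }

    unittest
    {
        assert(compare(1, 2) == Ordering.Less);
        assert(compare(3, 3) == Ordering.Equal);
        assert(compare(5, 4) == Ordering.Greater);
        // A body like "return a * b;" can no longer type-check as an Ordering.
    }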
Apr 03 2014
parent "Marc =?UTF-8?B?U2Now7x0eiI=?= <schuetzm gmx.net> writes:
On Friday, 4 April 2014 at 01:54:16 UTC, Ben Boeckel wrote:
 There is *zero* rationale as to why this would be a compilable
 implementation of comparison:

     int compare(int a, int b) {
         return a * b;
     }

 The fact that this compiles when used as a comparison is 
 *insane* when
 you take a fresh look at how you can construct a language.
You're actually not restricted to int; you can also return float, or in fact any type that can be compared to 0, including user-defined types that implement their own `opCmp`.
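A small illustration of that point (assuming D's usual lowering of a < b to a.opCmp(b) < 0): opCmp here returns a double rather than an int, and ordering comparisons still work.

    struct Half
    {
        double value;

        // The result is only ever compared against 0, so double is fine.
        double opCmp(Half rhs) const
        {
            return value - rhs.value;
        }
    }

    unittest
    {
        assert(Half(1.0) < Half(2.0));
        assert(!(Half(2.0) < Half(1.0)));
    }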
Apr 05 2014
prev sibling next sibling parent reply "Bienlein" <jeti789 web.de> writes:
On Thursday, 3 April 2014 at 01:55:48 UTC, Andrei Alexandrescu
wrote:
 A lot of them could apply to us as well.

 https://www.youtube.com/watch?v=TS1lpKBMkgg


 Andrei
He's got a point in mentioning things like "def equals(x: Any): Boolean" and "def compare(x: T, y: T): Int" (although the latter is not the worst problem I can think of). But the real message to me is what is said starting from 24:20:

"There remain those periodic stepwise jumps in performance taking place in the compiler. ... There is a factor of 10 lying around. It's that bad. It's so hard to pinpoint what is doing what and why that ain't nothing possible to modify. You can't make it fast if you can't change it."

So the build time performance problems in Scala are not simply because the language has so many more features than Java. There are real problems in the compiler.

What was done in D was to "stabilize" D and call it D1 and then start on D2. I think this was a wise thing to do. Maybe for the Scala compiler guys it's time to stabilize Scala and call it Scala1 and start with Scala2.
Apr 04 2014
next sibling parent "Paulo Pinto" <pjmlp progtools.org> writes:
On Friday, 4 April 2014 at 07:43:18 UTC, Bienlein wrote:
 On Thursday, 3 April 2014 at 01:55:48 UTC, Andrei Alexandrescu
 wrote:
 A lot of them could apply to us as well.

 https://www.youtube.com/watch?v=TS1lpKBMkgg


 Andrei
He's got a point in mentioning things like "def equals(x: Any): Boolean" and "def compare(x: T, y: T): Int" (although the latter is not the worst problem I can think of). But the real message to me is what is said starting from 24:20: "There remain those periodic stepwise jumps in performance taking place in the compiler. ... There is a factor of 10 lying around. It's that bad. It's so hard to pinpoint what is doing what and why that ain't nothing possible to modify. You can't make it fast if you can't change it." So the build time performance problems in Scala are not simply because the language has so many more features than Java. There are real problems in the compiler. What was done in D was to "stabilize" D and call it D1 and then start on D2. I think this was a wise thing to do. Maybe for the Scala compiler guys it's time to stabilize Scala and call it Scala1 and start with Scala2.
I guess you need to be more up to date on Scala news. :)

https://groups.google.com/forum/m/#!msg/scala-internals/6HL6lVLI3bQ/IY4gEyOwFhoJ
https://github.com/lampepfl/dotty

--
Paulo
Apr 04 2014
prev sibling parent reply "w0rp" <devw0rp gmail.com> writes:
On Friday, 4 April 2014 at 07:43:18 UTC, Bienlein wrote:
 On Thursday, 3 April 2014 at 01:55:48 UTC, Andrei Alexandrescu
 wrote:
 A lot of them could apply to us as well.

 https://www.youtube.com/watch?v=TS1lpKBMkgg


 Andrei
He's got a point in mentioning things like "def equals(x: Any): Boolean" and "def compare(x: T, y: T): Int" (although the latter is not the worst problem I can think of). But the real message to me is what is said starting from 24:20: "There remain those periodic stepwise jumps in performance taking place in the compiler. ... There is a factor of 10 lying around. It's that bad. It's so hard to pinpoint what is doing what and why that ain't nothing possible to modify. You can't make it fast if you can't change it." So the build time performance problems in Scala are not simply because the language has so many more features than Java. There are real problems in the compiler. What was done in D was to "stabilize" D and call it D1 and then start on D2. I think this was a wise thing to do. Maybe for the Scala compiler guys it's time to stabilize Scala and call it Scala1 and start with Scala2.
Yeah, generally the message I was getting was that you should be fighting against piling on new features, fighting against inelegant hacks for performance, and working on improving the things that you have.

It's like with his argument for compare. Suppose you had a typesafe enum which was tri-state. Like a type class. Less, Equal, More. (Basically 'Ordering' from Haskell, though without dumb unreadable abbreviations.) You get programs which are more obviously correct, and there's an obvious efficiency gain to be had there. If you can prove that your values are only Less, Equal, or More, you could represent that with exactly -1, 0, 1 internally and then it would be obviously better than the apparently faster C-like thing.

I think this is a really interesting argument. Don't write ugly things to get performance. Instead write obviously correct things and then make obvious optimisations. This argument makes me think a lot about component programming and then optimising that after you can prove interesting things about it, like inlining lambdas, etc.
Apr 04 2014
parent reply "renoX" <renozyx gmail.com> writes:
On Friday, 4 April 2014 at 08:00:09 UTC, w0rp wrote:
 I think this is a really interesting argument. Don't write ugly 
 things to get performance. Instead write obviously correct 
 things and then make obvious optimisations.
Bah, except that if you use big ints everywhere, floating-point intervals instead of floating point (the former is the correct representation of reals, the latter isn't), and normalized strings, the obvious optimisations won't necessarily be enough to avoid being very slow...
Apr 04 2014
parent reply "Paulo Pinto" <pjmlp progtools.org> writes:
On Friday, 4 April 2014 at 08:05:58 UTC, renoX wrote:
 On Friday, 4 April 2014 at 08:00:09 UTC, w0rp wrote:
 I think this is a really interesting argument. Don't write 
 ugly things to get performance. Instead write obviously 
 correct things and then make obvious optimisations.
Bah, except that if you use everywhere big ints, floating point intervals instead of floating points (the former is the correct representation of reals, the latter isn't), normalized strings, the obvious optimisations won't necessarily be enough to avoid being very slow..
Says who? And slow to whom?

1 - Write correct code

2 - Use a profiler, if the code isn't fast enough for the use case being written for

3 - If desired use case isn't there, use the profiler information to improve the specific hotpaths in need of tuning.

I see too many people micro-optimize for nothing.

--
Paulo
Apr 04 2014
parent reply "Dicebot" <public dicebot.lv> writes:
On Friday, 4 April 2014 at 08:16:20 UTC, Paulo Pinto wrote:
 Says who? And slow to whom?

 1 - Write correct code

 2 - Use a profiler, if the code isn't fast enough for the use 
 case being written for

 3 - If desired use case isn't there, use the profiler 
 information to improve the specific hotpaths in need of tuning.

 I see too many people micro-optimize for nothing.

 --
 Paulo
While this is true in general, spotting the performance overhead of using bigints everywhere in a profiler can be rather tricky, because it will be evenly spread across the program. Micro-optimizations are bad, but this is not a very practical example.
Apr 04 2014
parent "Paulo Pinto" <pjmlp progtools.org> writes:
On Friday, 4 April 2014 at 09:07:31 UTC, Dicebot wrote:
 On Friday, 4 April 2014 at 08:16:20 UTC, Paulo Pinto wrote:
 Says who? And slow to whom?

 1 - Write correct code

 2 - Use a profiler, if the code isn't fast enough for the use 
 case being written for

 3 - If desired use case isn't there, use the profiler 
 information to improve the specific hotpaths in need of tuning.

 I see too many people micro-optimize for nothing.

 --
 Paulo
While this is true in general, spotting the performance overhead of using bigints everywhere in a profiler can be rather tricky, because it will be evenly spread across the program. Micro-optimizations are bad, but this is not a very practical example.
To pick up on the bigints example, most compilers only fall back to big integers when the values no longer fit into a machine register. -- Paulo
Apr 04 2014
prev sibling parent reply Bruno Medeiros <bruno.do.medeiros+dng gmail.com> writes:
On 03/04/2014 02:55, Andrei Alexandrescu wrote:
 A lot of them could apply to us as well.

 https://www.youtube.com/watch?v=TS1lpKBMkgg


 Andrei
One interesting point near the end. He glossed over it since he was running out of time, but this was in the slides:

"
What I'm after
* I don't need a programming language.
* I need a coherent set of tools for creating software. A "language" is incidental.
"

I totally agree. Sure, the language may be the core, and one of the most important aspects, but the rest of the tool-chain is extremely important too.

I don't think everyone in the D community (and outside it too) fully stands behind this idea.

--
Bruno Medeiros
https://twitter.com/brunodomedeiros
Apr 09 2014
next sibling parent reply Nick Sabalausky <SeeWebsiteToContactMe semitwist.com> writes:
On 4/9/2014 4:21 PM, Bruno Medeiros wrote:
 Sure, the language may be the core, and one of the most
 important aspects, but the rest of the tool-chain is extremely important
 too.

 I don't think everyone in the D community (and outside it too) fully
 stands behind this idea.
I think a big part of that is because there's been a lot of work done using languages where good tooling is used as a substitute for a good language (*cough*java*cough*) - to predictably painful results. Tooling is certainly very important, but until someone comes up with a substitute for "programming languages" that actually *works well* as a *complete* substitute (decades of attempts, still zero successes), then unlike tooling, the language is still the one thing that's absolutely *mandatory*.
Apr 09 2014
parent Walter Bright <newshound2 digitalmars.com> writes:
On 4/9/2014 1:58 PM, Nick Sabalausky wrote:
 Tooling is certainly very important, but until someone comes up with a
 substitute for "programming languages" that actually *works well* as a
 *complete* substitute (decades of attempts, still zero successes), then unlike
 tooling, the language is still the one thing that's absolutely *mandatory*.
Yeah, I've seen the "programming without programming" tools come and go over the decades. I'm not holding my breath. To me, they always seem like "learn without effort" and "get fit without exercise" pitches.
Apr 10 2014
prev sibling parent "Christof Schardt" <csnews schardt.info> writes:
"Bruno Medeiros" <bruno.do.medeiros+dng gmail.com> schrieb im Newsbeitrag 
news:li4a40$tn2> What I'm after
 * I don't need a programming language.
 * I need a coherent set of tools for creating software. A "language" is 
 incidental.
 "

 I totally agree. Sure, the language may be the core, and one of the most 
 important aspects, but the rest of the tool-chain is extremely important 
 too.

 I don't think everyone in the D community (and outside it too) fully 
 stands behind this idea.
At least I do. Let me explain:

My "life project" is represented by 450,000 lines of C++ (music notation) written over 14 years. I learned to hate C++ because of its unproductivity. I'm desperately waiting for something better. (That's why I have been following the D development from the beginning.)

BUT: the tool "VisualAssist", which I have now used for a long time, is so tremendously useful that it often makes programming fun and a joy for me!!! Enjoy programming C++! Crazy, isn't it?

Crucial, at least, are:
- easy navigation to definitions/declarations/locations of usage
- hyper-intelligent completion (VA does a superior job)
- reasonable refactoring

Visual Assist does a perfect job of providing this level of comfort to C++ programmers. I have not yet found anything comparable in the D toolbox.

I feel D is superior to C++ in almost all fields, but if I have to change to a half-intelligent IDE, which gives me only reduced orientation/control/completion in a half-million-line project, then it is no alternative for me. That's why I'm still standing at the fence and watching all the exciting development from outside.

Regards
Christof Schardt
Apr 09 2014