
digitalmars.D - Google's Go

reply Steve Teale <steve.teale britseyeview.com> writes:

realize of course does not mean anything. But I'd be interested to hear what
the D aficionados think of Go.

It probably would not suit Andrei.
Jan 23 2010
next sibling parent reply "Adam D. Ruppe" <destructionator gmail.com> writes:
On Sat, Jan 23, 2010 at 12:00:04PM -0500, Steve Teale wrote:

realize of course does not mean anything. But I'd be interested to hear what
the D aficionados think of Go.
There's been a couple threads about it before. My opinion: it is garbage. It has maybe two or three good ideas in there, but on the whole, it is a very poor showing. The arrogant developers didn't do any research into prior art when designing it, and it shows - good ideas are absent without even a mention. Bad ideas remain in there saying "this is the best we could do". The only reason it gets any attention at all is because of the names attached to it.

-- 
Adam D. Ruppe
http://arsdnet.net
Jan 23 2010
next sibling parent reply retard <re tard.com.invalid> writes:
Sat, 23 Jan 2010 12:11:37 -0500, Adam D. Ruppe wrote:

 On Sat, Jan 23, 2010 at 12:00:04PM -0500, Steve Teale wrote:

 I realize of course does not mean anything. But I'd be interested to
 hear what the D aficionados think of Go.
There's been a couple threads about it before. My opinion: it is garbage. It has maybe two or three good ideas in there, but on the whole, it is a very poor showing. The arrogant developers didn't do any research into prior art when designing it, and it shows - good ideas are absent without even a mention. Bad ideas remain in there saying "this is the best we could do". The only reason it gets any attention at all is because of the names attached to it.
Now this is a valuable comment - thanks for sharing it :) I wonder how much D rides on Walter's and your fame. I mean the "D marketing" I see all over the web doesn't often build on facts. People just like the C++ look'n'feel so they can write the same (P)OOP code on native level to gain a constant efficiency bonus.

Some old farts use D1 because they highly respect the D-man art and Walter's ability to co-operate and communicate with the community (which indeed feels really good if you have zero experience on other language communities). They do not fancy the new D2 features that much. And let's be honest, D1 is terribly

In other words, the professional developers often know what to expect from a tool, and D2 is a pretty awesome tool for a professional. But the large masses gather around languages that allow implementing buggy, sub-optimal toy projects easily. If we look at e.g. Haskell, building a simple tic tac toe turns out to be impossible for most programmers. Even a hello world seems rather complex since you need to understand the monads. Go isn't especially good for building large enterprise software or operating systems, but you can easily build a text mode tic tac toe game, and a large company supports the language ecosystem. That draws a lot of attention.
Jan 23 2010
parent reply Bane <branimir.milosavljevic gmail.com> writes:
... Some old farts use D1 because they 
 highly respect the D-man art and Walter's ability to co-operate and 
 communicate with the community (which indeed feels really good if you 
 have zero experience on other language communities). They do not fancy 
 the new D2 features that much. And let's be honest, D1 is terribly 

Hey! This old fart here prefers D1 instead of D2 because:
- it has enough features to satisfy his needs, both in the language and in Phobos
- it is known how it works, it works correctly, and there are docs describing it
- there are fewer things in it, so it is easier to learn and play with

Hell, if C is a useful tool and you can't get simpler than it (assembler excluded), then D1 is a full-fledged corporate tool with a great std lib. So please, don't flame D1, or you'll have some angry old men on your back :)
Jan 23 2010
parent retard <re tard.com.invalid> writes:
Sat, 23 Jan 2010 14:38:20 -0500, Bane wrote:

... Some old farts use D1 because they
 highly respect the D-man art and Walter's ability to co-operate and
 communicate with the community (which indeed feels really good if you
 have zero experience on other language communities). They do not fancy
 the new D2 features that much. And let's be honest, D1 is terribly

Hey! This old fart here prefers D1 instead of D2 because:
- it has enough features to satisfy his needs, both in the language and in Phobos
- it is known how it works, it works correctly, and there are docs describing it
- there are fewer things in it, so it is easier to learn and play with

Hell, if C is a useful tool and you can't get simpler than it (assembler excluded), then D1 is a full-fledged corporate tool with a great std lib. So please, don't flame D1, or you'll have some angry old men on your back :)
Please don't take my posts too seriously =)
Jan 23 2010
prev sibling parent reply grauzone <none example.net> writes:
Adam D. Ruppe wrote:
 On Sat, Jan 23, 2010 at 12:00:04PM -0500, Steve Teale wrote:

realize of course does not mean anything. But I'd be interested to hear what
the D aficionados think of Go.
There's been a couple threads about it before. My opinion: it is garbage. It has maybe two or three good ideas in there, but on the whole, it is a very poor showing. The arrogant developers didn't do any research into prior art when designing it, and it shows - good ideas are absent without even a mention. Bad ideas remain in there saying "this is the best we could do".
Oh well, D isn't that great either. While it doesn't have such big names attached to it (although "Andrei" is not that small of a name), it had more time.

What worries one most is how D rushes "to completion", just because of the deadline of that one book. It's obvious that some features are half-cooked at best. Just look at the features added in D 2.038 (auto ref, DIP2): terrible hacks to get some broken language features barely to work before the deadline is over.

Also notice how the compiler, after all these years, still chokes up on basic language features. The struggling of the QtD developers is a major sign of this. I wonder what the heck is wrong with dmd's internal design that the situation is that bad. I mean, Walter could get a freaking C++ compiler right, why not the compiler for his own language?

/rant
 The only reason it gets any attention at all is because of the names attached
 to it.
 
Jan 23 2010
parent reply Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
grauzone wrote:
 Adam D. Ruppe wrote:
 On Sat, Jan 23, 2010 at 12:00:04PM -0500, Steve Teale wrote:

 which I realize of course does not mean anything. But I'd be 
 interested to hear what the D aficionados think of Go.
There's been a couple threads about it before. My opinion: it is garbage. It has maybe two or three good ideas in there, but on the whole, it is a very poor showing. The arrogant developers didn't do any research into prior art when designing it, and it shows - good ideas are absent without even a mention. Bad ideas remain in there saying "this is the best we could do".
Oh well, D isn't that great either. While it doesn't have such big names attached on it (although "Andrei" is not that small of a name), it had more time. What worries one most is how D rushes "to completion", just because of the deadline of that one book. It's obvious that some features are half cooked at best. Just look at the features added in D 2.038 (auto ref, DIP2): terrible hacks to get some broken language features barely to work before the deadline is over.
Meh. It's all so subjective, calling some language design "a hack" is nonfalsifiable. That reminds me of some of the Walter/Bartosz/me discussions. The first person to call something a hack usually won any argument because there was no way to counter that label sensibly. Prepend "terrible" and we have a winner. Andrei
Jan 24 2010
next sibling parent reply Daniel Keep <daniel.keep.lists gmail.com> writes:
Andrei Alexandrescu wrote:
 Meh. It's all so subjective, calling some language design "a hack" is
 nonfalsifiable. That reminds me of some of the Walter/Bartosz/me
 discussions. The first person to call something a hack usually won any
 argument because there was no way to counter that label sensibly.
 Prepend "terrible" and we have a winner.
 
 
 Andrei
The solution is obvious: "The use of 'hack' or 'terrible hack' to describe something is a terrible hack." That'll either end the argument or cause everyone to stackfault.
Jan 24 2010
parent Walter Bright <newshound1 digitalmars.com> writes:
Daniel Keep wrote:
 "The use of 'hack' or 'terrible hack' to describe something is a
 terrible hack."
Ah, for the good ol' days when things were simply described as a "kludge" or a "super kludge."
Jan 24 2010
prev sibling parent reply grauzone <none example.net> writes:
Andrei Alexandrescu wrote:
 grauzone wrote:
 Adam D. Ruppe wrote:
 On Sat, Jan 23, 2010 at 12:00:04PM -0500, Steve Teale wrote:

 which I realize of course does not mean anything. But I'd be 
 interested to hear what the D aficionados think of Go.
There's been a couple threads about it before. My opinion: it is garbage. It has maybe two or three good ideas in there, but on the whole, it is a very poor showing. The arrogant developers didn't do any research into prior art when designing it, and it shows - good ideas are absent without even a mention. Bad ideas remain in there saying "this is the best we could do".
Oh well, D isn't that great either. While it doesn't have such big names attached on it (although "Andrei" is not that small of a name), it had more time. What worries one most is how D rushes "to completion", just because of the deadline of that one book. It's obvious that some features are half cooked at best. Just look at the features added in D 2.038 (auto ref, DIP2): terrible hacks to get some broken language features barely to work before the deadline is over.
Meh. It's all so subjective, calling some language design "a hack" is nonfalsifiable. That reminds me of some of the Walter/Bartosz/me discussions. The first person to call something a hack usually won any argument because there was no way to counter that label sensibly. Prepend "terrible" and we have a winner.
I guess that's true. I also don't want to start a discussion about "taste" or whatever. Let me just say that those are non-orthogonal, single-trick-pony features. They don't add too much value to the language, and are specially designed to cover for some annoying corner cases of other features.

Please tell me how auto ref template parameters are universally useful? Or how inout(T) isn't just a shortcut to avoid writing const/immutable-aware code 3 times or putting it into a template? What's the use of auto ref returns, other than a small performance optimization? (Though I admit that min() example in the spec is cute.) Especially inout(T) seems to be only useful in one specialized situation.

Could be that I'm lacking foresight, but then those questions above should be simple to answer. I don't quite get your point about non-falsifiability: you could put any optional feature into a language with that argument.
 
 Andrei
Jan 24 2010
next sibling parent reply "Steven Schveighoffer" <schveiguy yahoo.com> writes:
On Sun, 24 Jan 2010 04:25:21 -0500, grauzone <none example.net> wrote:

 Or how inout(T) isn't just a shortcut to avoid writing  
 const/immutable-aware code 3 times or putting it into a template?
The benefits are:

- Single implementation where all that is different is the type qualifier. (also allows the function to be virtual)
- No propagation of contract through an accessor. In other words, using an inout accessor on an object or struct does not alter the caller's contract with the data itself.

The latter function is almost essential for properties, for without such a mechanism, you are forced to write your property definitions in triplicate. i.e.

  class C {}

  struct S
  {
      C c;
  }

  immutable s1 = S(new C);
  S s2 = S(new C);

  immutable c1 = s1.c;
  C c2 = s2.c;

Now, change S.c into a property. The first line of thinking is, "well, accessing c doesn't change the object itself, so it should be const." But that means you must return a const(C), so it breaks defining c1 and c2 (can't assign immutable or mutable from const). So, you say, "I'll just define it without const," but then you can't call the property unless S is a mutable type, so that only works in c2's case.

Maybe you think you can get away with just mutable and immutable, but again, it doesn't work if the whole object is const, since you can't call either function from there. Templates won't work here, you cannot template the 'this' pointer. So you end up with 3 identical implementations, and *no* const guarantee on the mutable one:

  @property C c() { return _c; }
  @property const(C) c() const { return _c; }
  @property immutable(C) c() immutable { return _c; }

Repeat this for all the properties in your object and you have a freaking mess. You can't even compose these into a common implementation unless you are willing to do some casting.

The inout solution is simple to understand, does exactly what you want, and provides the most efficient binary representation (1 compiled function instead of 3 identical compiled functions). It works because const and immutable are *compile-time* restrictions, not runtime restrictions.

I came up with the idea because I was trying to port Tango to D2, and I realized, with the amount of properties and object use that Tango has, it would be a nightmare. It is not a hack, it is a complete, elegant solution to a very nasty problem. I'm actually quite surprised that something like this was possible, and how easy it would be to use.

I predict that most functions where you would normally write const, you should write inout, and for free you get implementations for mutable, const, and immutable that work exactly how you want them to (and guarantee exactly what you want them to guarantee).

-Steve
Jan 25 2010
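To make the idiom concrete, here is a minimal sketch of the kind of accessor described above (illustrative names, not code from the thread); it assumes a compiler where inout on the hidden this parameter works as intended, which later posts in this thread show was not yet the case in dmd 2.039:

  class C {}

  struct S
  {
      private C _c;

      // One body: 'inout' on the hidden 'this' parameter propagates the
      // caller's qualifier (mutable, const, or immutable) to the result.
      @property inout(C) c() inout { return _c; }
  }

  void main()
  {
      S s2 = S(new C);
      C c2 = s2.c;          // mutable in, mutable out

      const S s3 = s2;
      const(C) c3 = s3.c;   // const in, const out
  }

The single body is compiled once; the qualifier is resolved at each call site.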
next sibling parent reply "Lars T. Kyllingstad" <public kyllingen.NOSPAMnet> writes:
Steven Schveighoffer wrote:
 On Sun, 24 Jan 2010 04:25:21 -0500, grauzone <none example.net> wrote:
 
 Or how inout(T) isn't just a shortcut to avoid writing 
 const/immutable-aware code 3 times or putting it into a template?
The benefits are:

- Single implementation where all that is different is the type qualifier. (also allows the function to be virtual)
- No propagation of contract through an accessor. In other words, using an inout accessor on an object or struct does not alter the caller's contract with the data itself.

The latter function is almost essential for properties, for without such a mechanism, you are forced to write your property definitions in triplicate. i.e.

  class C {}

  struct S
  {
      C c;
  }

  immutable s1 = S(new C);
  S s2 = S(new C);

  immutable c1 = s1.c;
  C c2 = s2.c;

Now, change S.c into a property. The first line of thinking is, "well, accessing c doesn't change the object itself, so it should be const." But that means you must return a const(C), so it breaks defining c1 and c2 (can't assign immutable or mutable from const). So, you say, "I'll just define it without const," but then you can't call the property unless S is a mutable type, so that only works in c2's case.

Maybe you think you can get away with just mutable and immutable, but again, it doesn't work if the whole object is const, since you can't call either function from there. Templates won't work here, you cannot template the 'this' pointer. So you end up with 3 identical implementations, and *no* const guarantee on the mutable one:

  @property C c() { return _c; }
  @property const(C) c() const { return _c; }
  @property immutable(C) c() immutable { return _c; }
Out of curiosity: How does inout(T) fix this? I thought inout was all about transporting the const-ness of the input type to the return type, and in this example there are no input parameters. -Lars
Jan 25 2010
parent reply "Steven Schveighoffer" <schveiguy yahoo.com> writes:
On Mon, 25 Jan 2010 09:05:11 -0500, Lars T. Kyllingstad  
<public kyllingen.nospamnet> wrote:

 Steven Schveighoffer wrote:
 On Sun, 24 Jan 2010 04:25:21 -0500, grauzone <none example.net> wrote:

 Or how inout(T) isn't just a shortcut to avoid writing  
 const/immutable-aware code 3 times or putting it into a template?
The benefits are:

- Single implementation where all that is different is the type qualifier. (also allows the function to be virtual)
- No propagation of contract through an accessor. In other words, using an inout accessor on an object or struct does not alter the caller's contract with the data itself.

The latter function is almost essential for properties, for without such a mechanism, you are forced to write your property definitions in triplicate. i.e.

  class C {}

  struct S
  {
      C c;
  }

  immutable s1 = S(new C);
  S s2 = S(new C);

  immutable c1 = s1.c;
  C c2 = s2.c;

Now, change S.c into a property. The first line of thinking is, "well, accessing c doesn't change the object itself, so it should be const." But that means you must return a const(C), so it breaks defining c1 and c2 (can't assign immutable or mutable from const). So, you say, "I'll just define it without const," but then you can't call the property unless S is a mutable type, so that only works in c2's case.

Maybe you think you can get away with just mutable and immutable, but again, it doesn't work if the whole object is const, since you can't call either function from there. Templates won't work here, you cannot template the 'this' pointer. So you end up with 3 identical implementations, and *no* const guarantee on the mutable one:

  @property C c() { return _c; }
  @property const(C) c() const { return _c; }
  @property immutable(C) c() immutable { return _c; }
Out of curiosity: How does inout(T) fix this? I thought inout was all about transporting the const-ness of the input type to the return type, and in this example there are no input parameters.
inout is applied to the hidden input parameter -- this:

  @property inout(C) inout { return _c; }

-Steve
Jan 25 2010
parent "Steven Schveighoffer" <schveiguy yahoo.com> writes:
On Mon, 25 Jan 2010 09:26:55 -0500, Steven Schveighoffer  
<schveiguy yahoo.com> wrote:

 inout is applied to the hidden input parameter -- this:

  @property inout(C) inout { return _c; }
Wow, forgot the function name there :)

  @property inout(C) c() inout { return _c; }

-Steve
Jan 25 2010
prev sibling parent reply Ali Çehreli <acehreli yahoo.com> writes:
Steven Schveighoffer wrote:
 On Sun, 24 Jan 2010 04:25:21 -0500, grauzone <none example.net> wrote:
 Templates won't work here, you cannot template the 'this' pointer.  So
 you end up with 3 identical implementations, and *no* const guarantee on
 the mutable one:

  @property C c() { return _c; }
  @property const(C) c() const { return _c; }
  @property immutable(C) c() immutable { return _c; }
Is there a known bug in 2.039 about this feature? It doesn't seem to consider the hidden this parameter:

  class C {}

  struct S
  {
      C _c;

      @property inout(C) c() inout
      {
          return _c;
      }
  }

Error: inout on return means inout must be on a parameter as well for inout inout(C)()

Ali
Jan 25 2010
parent "Steven Schveighoffer" <schveiguy yahoo.com> writes:
On Mon, 25 Jan 2010 22:45:10 -0500, Ali Çehreli <acehreli yahoo.com> wrote:

 Steven Schveighoffer wrote:
  > On Sun, 24 Jan 2010 04:25:21 -0500, grauzone <none example.net> wrote:

  > Templates won't work here, you cannot template the 'this' pointer.  So
  > you end up with 3 identical implementations, and *no* const guarantee  
 on
  > the mutable one:
  >
  >  @property C c() { return _c; }
  >  @property const(C) c() const { return _c; }
  >  @property immutable(C) c() immutable { return _c; }

 Is there a known bug in 2.039 about this feature? It doesn't seem to  
 consider the hidden this parameter:

 class C {}

 struct S
 {
     C _c;

       @property inout(C) c() inout
      {
          return _c;
      }
 }

 Error: inout on return means inout must be on a parameter as well for  
 inout inout(C)()
Yes, this is a bug. inout is so fundamentally broken right now, I didn't bother reporting all the deficiencies in bugzilla, but I did mention them on the announce newsgroup. I hope within the next couple releases, it is addressed. In fact, if it is not addressed in the next release, I should probably report all the issues to bugzilla. -Steve
Jan 26 2010
prev sibling parent Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
grauzone wrote:
 Andrei Alexandrescu wrote:
 grauzone wrote:
 Adam D. Ruppe wrote:
 On Sat, Jan 23, 2010 at 12:00:04PM -0500, Steve Teale wrote:

 which I realize of course does not mean anything. But I'd be 
 interested to hear what the D aficionados think of Go.
There's been a couple threads about it before. My opinion: it is garbage. It has maybe two or three good ideas in there, but on the whole, it is a very poor showing. The arrogant developers didn't do any research into prior art when designing it, and it shows - good ideas are absent without even a mention. Bad ideas remain in there saying "this is the best we could do".
Oh well, D isn't that great either. While it doesn't have such big names attached on it (although "Andrei" is not that small of a name), it had more time. What worries one most is how D rushes "to completion", just because of the deadline of that one book. It's obvious that some features are half cooked at best. Just look at the features added in D 2.038 (auto ref, DIP2): terrible hacks to get some broken language features barely to work before the deadline is over.
Meh. It's all so subjective, calling some language design "a hack" is nonfalsifiable. That reminds me of some of the Walter/Bartosz/me discussions. The first person to call something a hack usually won any argument because there was no way to counter that label sensibly. Prepend "terrible" and we have a winner.
I guess that's true. I also don't want to start a discussion about "taste" or whatever. Let me just say that those are non-orthogonal, single-trick-pony features. They don't add too much value to the language, and are specially designed to cover for some annoying corner cases of other features. Please tell me how auto ref template parameters are universally useful?
Sorry, I forgot to answer these questions earlier. First off, going from unwarranted presuppositions to actual questions is definite progress.

Auto ref is a long-standing need that D has had, and it has particular applications to SafeD. An example is getopt():

  string user, site;
  getopt("user", &user, "site", &site);

This call has a mix of rvalues and pointers in the call. In SafeD there is pressure for eliminating pointers, so ideally getopt() should be rewritten to accept:

  getopt("user", user, "site", site);

In that case, getopt() must properly deal with mixed by-value and by-reference arguments.

Another issue is perfect forwarding. If you have a function fun you can't define another function gun that forwards to fun and has the same effect. This is the hallmark of functional composition, and in a language with the option of passing by reference it is a big challenge. C++ does not allow perfect forwarding, which turned out to be a crippling problem that has spurred a Sisyphean quest for partially assuaging it. Without auto ref and auto ref returns, D did not have perfect forwarding, and much nontrivial code was in a world of pain. A simple example was that ranges that enhance other ranges can't forward front() to the ranges they manage. I mentioned this before - just search this file:

  http://www.dsource.org/projects/phobos/browser/trunk/phobos/std/range.d

for "mixin". I'd mentioned this rationale once.
 Or how inout(T) isn't just a shortcut to avoid writing 
 const/immutable-aware code 3 times or putting it into a template?
I don't even understand the question. You're asking "how motherhood and apple pie aren't just great things?" I'm glad Steve took the time to answer that in detail.
 What's 
 the use of auto ref returns, other than a small performance 
 optimization? (Though I admit that min() example in the spec is cute.) 
See above.
 Especially inout(T) seems to be only useful in one specialized situation.
It's a family of situations that experience with C++ has made clear is a big issue.
 Could be that I'm lacking foresight, but then those questions above 
 should be simple to answer.
I wouldn't put it on lack of foresight more than attitude. The questions above are indeed simple to answer, and the answer does occur easily to the willing. Andrei
Jan 25 2010
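As a small illustration of the forwarding point above, a hypothetical sketch (fun and gun are placeholder names, not the Phobos range code referred to): without auto ref, gun must choose between ref (rejecting rvalues) and by-value (losing the reference for lvalues); with auto ref it forwards both.

  int fun(ref int x) { return ++x; }

  // 'auto ref' on a template function parameter: lvalue arguments bind
  // by reference, rvalues get a temporary. Either way x is an lvalue
  // inside gun, so forwarding to fun(ref int) compiles for both.
  auto gun()(auto ref int x)
  {
      return fun(x);
  }

  void main()
  {
      int a = 1;
      gun(a);                  // forwarded by reference: a becomes 2
      assert(a == 2);
      assert(gun(41) == 42);   // rvalue accepted via a temporary
  }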
prev sibling next sibling parent reply Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
Steve Teale wrote:

realize of course does not mean anything. But I'd be interested to hear what
the D aficionados think of Go.
 
 It probably would not suit Andrei.
What wouldn't? (Honest question - I don't understand.) Andrei
Jan 23 2010
parent Steve Teale <steve.teale britseyeview.com> writes:
Andrei Alexandrescu Wrote:

 Steve Teale wrote:

realize of course does not mean anything. But I'd be interested to hear what
the D aficionados think of Go.
 
 It probably would not suit Andrei.
What wouldn't? (Honest question - I don't understand.) Andrei
I just meant no templates - forgive me ;=)
Jan 23 2010
prev sibling next sibling parent reply dsimcha <dsimcha yahoo.com> writes:
== Quote from Steve Teale (steve.teale britseyeview.com)'s article

realize of course does not mean anything. But I'd be interested to hear what the D aficionados think of Go.
 It probably would not suit Andrei.
Well, Go's garbage collector clearly isn't very good because the language doesn't delete itself.

Seriously, it may eventually evolve into a decent language, but right now it's just too minimalistic to be at all practical. Just off the top of my head from reading about it a few months ago, the lack of asserts, exceptions and Windows support is enough to turn me off to it.

Multiple return values are a horrible substitute for exceptions, because they require the programmer to explicitly check the return value. (When's the last time you checked the return value of printf, or even malloc?) IMHO the best thing about exceptions is that they provide a sane default for error handling: If you don't handle them then you've effectively asserted that they can't happen in your situation. If this "assertion" fails, then our program fails fast and with an error message that massively narrows down where the problem is. I flat-out refuse to program in a language where the default is for errors to be ignored and I have to constantly write explicit error-checking boilerplate even if I don't care about handling the error.
Jan 23 2010
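To illustrate the contrast, a small D sketch; tryParsePort is a hypothetical helper, not a library function:

  import std.conv : to, ConvException;
  import std.stdio;

  // Return-value style: the error is just data; nothing forces a check.
  bool tryParsePort(string s, out ushort port)
  {
      try { port = s.to!ushort; return true; }
      catch (ConvException) { return false; }
  }

  void main()
  {
      ushort p;
      tryParsePort("oops", p);   // result silently ignored; p stays 0
      writeln(p);

      // Exception style: if the caller does nothing, bad input
      // terminates the program with a message naming the failure.
      auto q = "8080".to!ushort;
      writeln(q);
  }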
next sibling parent reply bearophile <bearophileHUGS lycos.com> writes:
dsimcha:
 Multiple return values are a horrible substitute for exceptions,
But I am waiting for multiple return values in D3, because they are quite handy if implemented with a nice syntax :-) Bye, bearophile
Jan 23 2010
parent reply retard <re tard.com.invalid> writes:
Sat, 23 Jan 2010 14:16:47 -0500, bearophile wrote:

 dsimcha:
 Multiple return values are a horrible substitute for exceptions,
But I am waiting for multiple return values in D3, because they are quite handy if implemented with a nice syntax :-) Bye, bearophile
You can write a string mixin that converts a string like:

  (a,b) = fun_call(c,d);

into

  fun_call(c,d,a,b);

and another one that converts a string for defining functions:

  (int, int) fun_call(int c, int d) { ... }

into

  void fun_call(int c, int d, out int ret1, out int ret2) { ... }

I've personally used the import expression and string mixins + ctfe instead of ordinary D imports to implement rather nice language extensions by parsing the language on module level.
Jan 23 2010
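For concreteness, a runnable sketch of the rewritten form such a mixin would emit; the arithmetic body is just a placeholder:

  // What  (a,b) = fun_call(c,d);  would be rewritten into:
  void fun_call(int c, int d, out int ret1, out int ret2)
  {
      ret1 = c + d;   // placeholder body for the sketch
      ret2 = c - d;
  }

  void main()
  {
      int a, b;
      fun_call(3, 4, a, b);     // i.e. (a, b) = fun_call(3, 4)
      assert(a == 7 && b == -1);
  }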
parent bearophile <bearophileHUGS lycos.com> writes:
retard:
 You can write a string mixin that converts a string like:
   (a,b) = fun_call(c,d);
 into
   fun_call(c,d,a,b);
That's silly. Bye, bearophile
Jan 23 2010
prev sibling next sibling parent reply Walter Bright <newshound1 digitalmars.com> writes:
dsimcha wrote:
 IMHO the best thing about exceptions is that they provide a sane default for error handling: If you don't handle them then you've effectively asserted that they can't happen in your situation. If this "assertion" fails, then our program fails fast and with an error message that massively narrows down where the problem is.
It's even better than that. Since the default handling for exceptions is to print a pretty message, like "cannot open file xxxxx", for many utility programs that is all you need. You don't have to write any error handling code, and yet your program handles errors correctly and gracefully reports them to the user.
Jan 23 2010
parent reply Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
Walter Bright wrote:
 dsimcha wrote:
 IMHO the best thing
 about exceptions is that they provide a sane default for error 
 handling:  If you
 don't handle them then you've effectively asserted that they can't 
 happen in your
 situation.  If this "assertion" fails, then our program fails fast and 
 with an
 error message that massively narrows down where the problem is.
It's even better than that. Since the default handling for exceptions is to print a pretty message, like "cannot open file xxxxx", for many utility programs that is all you need. You don't have to write any error handling code, and yet your program handles errors correctly and gracefully reports them to the user.
I wouldn't go that far. Unfortunately, writing even exception-neutral code still changes the way one writes code even if you don't need to handle errors explicitly (fortunately scope statements help with that). Andrei
Jan 23 2010
parent reply Walter Bright <newshound1 digitalmars.com> writes:
Andrei Alexandrescu wrote:
 Walter Bright wrote:
 Since the default handling for exceptions 
 is to print a pretty message, like "cannot open file xxxxx", for many 
 utility programs that is all you need. You don't have to write any 
 error handling code, and yet your program handles errors correctly and 
 gracefully reports them to the user.
I wouldn't go that far. Unfortunately, writing even exception-neutral code still changes the way one writes code even if you don't need to handle errors explicitly (fortunately scope statements help with that).
I agree that scope statements are only necessary if you need to recover from errors. I don't know what you mean by how it changes the way one writes code.
Jan 23 2010
parent reply Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
Walter Bright wrote:
 Andrei Alexandrescu wrote:
 Walter Bright wrote:
 Since the default handling for exceptions is to print a pretty 
 message, like "cannot open file xxxxx", for many utility programs 
 that is all you need. You don't have to write any error handling 
 code, and yet your program handles errors correctly and gracefully 
 reports them to the user.
I wouldn't go that far. Unfortunately, writing even exception-neutral code still changes the way one writes code even if you don't need to handle errors explicitly (fortunately scope statements help with that).
I agree that scope statements are only necessary if you need to recover from errors. I don't know what you mean by how it changes the way one writes code.
I mean even if ostensibly you don't want to handle errors, you still need to mind the multiple hidden exit paths in your code to achieve even the most intuitive guarantees (such as temporarily changing a global for the duration of a function). Such a style of coding blindsides old-style SESE programmers (and is the main reason for which I unrecommend SESE). Andrei
Jan 23 2010
parent Walter Bright <newshound1 digitalmars.com> writes:
Andrei Alexandrescu wrote:
 I mean even if ostensibly you don't want to handle errors, you still 
 need to mind the multiple hidden exit paths in your code to achieve even 
 the most intuitive guarantees (such as temporarily changing a global for 
 the duration of a function). Such a style of coding blindsides old-style 
 SESE programmers (and is the main reason for which I unrecommend SESE).
I agree. That makes sense.
Jan 23 2010
prev sibling next sibling parent reply grauzone <none example.net> writes:
dsimcha wrote:
 Multiple return values are a horrible substitute for exceptions, because they require the programmer to explicitly check the return value. (When's the last time you checked the return value of printf, or even malloc?) IMHO the best thing about exceptions is that they provide a sane default for error handling: If you don't handle them then you've effectively asserted that they can't happen in your situation. If this "assertion" fails, then our program fails fast and with an error message that massively narrows down where the problem is. I flat-out refuse to program in a language where the default is for errors to be ignored and I have to constantly write explicit error-checking boilerplate even if I don't care about handling the error.
Exception handling in D (or C++/Java for that matter) isn't that great either. In D, you don't even know what exceptions a function may throw. It's completely dynamic. You may even accidentally catch unknown exceptions, because your catch-filter isn't "narrow" enough.

Java has checked exceptions, but they are misdesigned, get in the programmer's way, and generally suck. Feels almost like D copied the standard Java feature by the letter, removing checked exceptions, and calling that an improvement.

Also, let's have a look at how exceptions improve the robustness of most Java code:

  try { something(); } catch (Exception e) { Logger.getLogger().log(e); }

I guess D programmers are just better than Java programmers, which makes this example out of place, of course. Or they just let it fall through, until it gets caught by a similar catch-all statement.

Generally, exceptions as they are implemented in some popular languages seem to give the impression that you don't need to care about error handling. That isn't always true and probably only works in small command line tools.
Jan 23 2010
parent Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
grauzone wrote:
 dsimcha wrote:
 Multiple return values are a horrible substitute for exceptions, 
 because they
 require the programmer to explicitly check the return value.  (When's 
 the last
 time you checked the return value of printf, or even malloc?)  IMHO 
 the best thing
 about exceptions is that they provide a sane default for error 
 handling:  If you
 don't handle them then you've effectively asserted that they can't 
 happen in your
 situation.  If this "assertion" fails, then our program fails fast and 
 with an
 error message that massively narrows down where the problem is.  I 
 flat-out refuse
 to program in a language where the default is for errors to be ignored 
 and I have
 to constantly write explicit error-checking boilerplate even if I 
 don't care about
 handling the error.
Exception handling in D (or C++/Java for that matter) isn't that great either. In D, you don't even know what exceptions a function may throw. It's completely dynamic. You may even accidentally catch unknown exceptions, because your catch-filter isn't "narrow" enough. Java has checked exceptions, but they are misdesigned, get in the programmers way, and generally suck. Feels almost like D copied the standard Java feature by the letter, removing checked exceptions, and call that an improvement.
For my money, D2 exceptions (as designed and described in TDPL) are better than all other exception frameworks, by a mile. This is mainly because D offers a practical method of handling collateral exceptions (exceptions thrown during the stack unwinding caused by other exceptions).
 Also, let's have a look how exceptions improve the robustness of most 
 Java code:
 
 try { something(); } catch (Exception e) { Logger.getLogger().log(e); }
 
 I guess D programmers are just better than Java programmers, which makes 
 this example out of place, of course. Or they just let it fall through, 
 until it gets caught by a similar catch-all statement.
 
 Generally, exceptions as they are implemented in some popular languages 
 seem to give the impression that you don't need to care about error 
 handling. That isn't always true and probably only works in small 
 command line tools.
I agree! I'd put it even stronger. Andrei
Jan 24 2010
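For reference, a sketch of the collateral-exception chaining alluded to here, assuming the behavior as TDPL describes it: an exception thrown during unwinding is chained to the one already in flight rather than replacing it (the messages are illustrative):

  import std.stdio;

  void main()
  {
      try
      {
          try
          {
              throw new Exception("primary failure");
          }
          finally
          {
              // Cleanup itself fails while the first exception unwinds.
              throw new Exception("collateral failure during cleanup");
          }
      }
      catch (Exception e)
      {
          // The primary exception is caught; the collateral one is
          // reachable through the 'next' chain instead of being lost.
          for (Throwable t = e; t !is null; t = t.next)
              writeln(t.msg);
      }
  }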
prev sibling parent reply Justin Johansson <no spam.com> writes:
dsimcha wrote:
 
 Multiple return values are a horrible substitute for exceptions, because they require the programmer to explicitly check the return value. (When's the last time you checked the return value of printf, or even malloc?) IMHO the best thing about exceptions is that they provide a sane default for error handling: If you don't handle them then you've effectively asserted that they can't happen in your situation. If this "assertion" fails, then our program fails fast and with an error message that massively narrows down where the problem is. I flat-out refuse to program in a language where the default is for errors to be ignored and I have to constantly write explicit error-checking boilerplate even if I don't care about handling the error.
Well, back in C++ land, as an occasional alternative to using exceptions, I use a templated "checked_value" structure for returning an error code / function result pair. The C++ templated structure is shown below.

A function returning a "checked_value" uses the first constructor form to return a valid result and the second form to return an erroneous result (using an enum to designate the error condition). If client code tries to access the value (via the overloaded cast operator, or you could have a getter function for the value instead), and an error is in effect that you didn't check for, then shit (an assert or other severity) happens.

I'm sure there will be lots of religious comments about this idiom, but it works well for me by forcing a check for an error result before otherwise using the return value.

Of course, this idiom only works if the function in question "returns something", a value, that the client code would by necessity have to use. It wouldn't (and couldn't) work if ValueType is "void". In the exceptional case (pun intended) of functions returning void, one may have to resort to throwing an exception instead to signal an error.

So to counter dsimcha's point, my solution does not assume a default situation of ignoring errors. The thrust of my argument is that exceptions are not an all-or-nothing approach in sane dealing with errors.

  template <typename ValueType>
  struct checked_value
  {
  public:
      checked_value( ValueType value)
          : _value( value), _rcode( RC_OK) {}

      checked_value( ResultCode rcode)
          : _value( ValueType::PROP_INIT), _rcode( rcode) {}

      operator ValueType() const
      {
          // The value should not be accessible if the function failed.
          // Choose your poison with either assert or something
          // more severe in release compile if you don't like asserts
          // being preprocessed out.
          assert( !failed());
          return _value;
      }

      int failed() const { return (_rcode < RC_OK); }

      ResultCode rcode() const { return _rcode; }

  private:
      ValueType  const _value;
      ResultCode const _rcode;
  };

Cheers
Justin Johansson
Jan 26 2010
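For comparison, roughly the same idiom could be sketched in D (hypothetical names; alias this stands in for the C++ conversion operator, and the sketch assumes a copyable value type):

  enum ResultCode { RC_ERROR = -1, RC_OK = 0 }

  struct CheckedValue(T)
  {
      private T _value;
      private ResultCode _rcode = ResultCode.RC_ERROR;

      this(T value) { _value = value; _rcode = ResultCode.RC_OK; }
      this(ResultCode rcode) { _rcode = rcode; }

      bool failed() const { return _rcode < ResultCode.RC_OK; }
      ResultCode rcode() const { return _rcode; }

      // Accessing the value of a failed result trips the assert,
      // mirroring the overloaded cast operator in the C++ version.
      T value() const
      {
          assert(!failed());
          return _value;
      }
      alias value this;
  }

  unittest
  {
      auto ok = CheckedValue!int(42);
      int x = ok;                  // implicit, checked access
      assert(x == 42);

      auto bad = CheckedValue!int(ResultCode.RC_ERROR);
      assert(bad.failed());        // caller can test before using
  }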
next sibling parent reply bearophile <bearophileHUGS lycos.com> writes:
Justin Johansson:
 So to counter the dsimcha's point, my solution does
 not assume a default situation of ignoring errors. The thrust
 of my argument is that exceptions are not an all-or-nothing
 approach in sane dealing in errors.
In GCC there is also the "warn_unused_result" function attribute that raises a compile time warning if you don't use the return value of a function: http://gcc.gnu.org/onlinedocs/gcc/Function-Attributes.html#index-g_t_0040code_007bwarn_005funused_005fresult_007d-attribute-2444 You can use it to be more sure you are reading and using an error return value. Ideally you can invent a similar "error_unused_result" that raises an error instead. (Some other of those GCC attributes can be useful in D2). Bye, bearophile
Jan 26 2010
parent reply Walter Bright <newshound1 digitalmars.com> writes:
bearophile wrote:
 In GCC there is also the "warn_unused_result" function attribute that
 raises a compile time warning if you don't use the return value of a
 function: 
That may wind up suffering similar problems as checked exceptions do - just do a quick and dirty assign the return value to a hastily declared temp and ignore it, meaning to fix it later. Of course, it doesn't actually get fixed later. Even worse, during code reviews, it will look like the code is paying attention to the error code, but it won't be.
Jan 26 2010
next sibling parent bearophile <bearophileHUGS lycos.com> writes:
Walter Bright:
 That may wind up suffering similar problems as checked exceptions do - 
 just do a quick and dirty assign the return value to a hastily declared 
 temp and ignore it, meaning to fix it later.
Oh, I agree that exceptions are generally better (but that warn_unused_result of GCC can be used in C too where you don't have D-like exceptions). Regarding exceptions, I don't think you can find many of them in game engines written in C++, where performance matters.
 Of course, it doesn't actually get fixed later. Even worse, during code 
 reviews, it will look like the code is paying attention to the error 
 code, but it won't be.
Something similar can happen with exceptions, you can wrap something in a try-catch to silence the possible exceptions. Bye, bearophile
Jan 26 2010
prev sibling next sibling parent reply Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
Walter Bright wrote:
 bearophile wrote:
 In GCC there is also the "warn_unused_result" function attribute that
 raises a compile time warning if you don't use the return value of a
 function: 
That may wind up suffering similar problems as checked exceptions do - just do a quick and dirty assign the return value to a hastily declared temp and ignore it, meaning to fix it later. Of course, it doesn't actually get fixed later. Even worse, during code reviews, it will look like the code is paying attention to the error code, but it won't be.
I'd think in a code review a variable assigned and subsequently never used is bound to raise a red flag. Andrei
Jan 26 2010
parent reply Walter Bright <newshound1 digitalmars.com> writes:
Andrei Alexandrescu wrote:
 Walter Bright wrote:
 bearophile wrote:
 In GCC there is also the "warn_unused_result" function attribute that
 raises a compile time warning if you don't use the return value of a
 function: 
That may wind up suffering similar problems as checked exceptions do - just do a quick and dirty assign the return value to a hastily declared temp and ignore it, meaning to fix it later. Of course, it doesn't actually get fixed later. Even worse, during code reviews, it will look like the code is paying attention to the error code, but it won't be.
I'd think in a code review a variable assigned and subsequently never used is bound to raise a red flag.
I agree, if it was noticed. For me,

  int result = foo();

in the code implies that 'result' is later used. I'd have to search to see if it wasn't, which is harder than noticing that:

  foo();

ignores the result. This goes back to my theory that a feature that encourages the programmer to insert misleading dead code to shut the compiler up is a misfeature.
Jan 26 2010
parent reply "John D" <jdean googling.com> writes:
"Walter Bright" <newshound1 digitalmars.com> wrote in message 
news:hjni3o$r26$1 digitalmars.com...
 Andrei Alexandrescu wrote:
 Walter Bright wrote:
 bearophile wrote:
 In GCC there is also the "warn_unused_result" function attribute 
 that
 raises a compile time warning if you don't use the return value of a
 function:
That may wind up suffering similar problems as checked exceptions do - just do a quick and dirty assign the return value to a hastily declared temp and ignore it, meaning to fix it later. Of course, it doesn't actually get fixed later. Even worse, during code reviews, it will look like the code is paying attention to the error code, but it won't be.
I'd think in a code review a variable assigned and subsequently never used is bound to raise a red flag.
I agree, if it was noticed. For me,

  int result = foo();

in the code implies that 'result' is later used. I'd have to search to see if it wasn't, which is harder than noticing that:

  foo();

ignores the result. This goes back to my theory that a feature that encourages the programmer to insert misleading dead code to shut the compiler up is a misfeature.
A programmer who does the above is not a good programmer. That said, a compiler that doesn't warn that variables or arguments have gone unused in a function is not a good compiler. That said, a language that doesn't enforce return val checking is not a good programming language.
Jan 29 2010
next sibling parent reply Walter Bright <newshound1 digitalmars.com> writes:
John D wrote:
 "Walter Bright" <newshound1 digitalmars.com> wrote in message 
 This goes back to my theory that a feature that encourages the 
 programmer to insert misleading dead code to shut the compiler up is a 
 misfeature.
A programmer who does the above is not a good programmer.
The reality is, even the best programmers will do such things if it is convenient to, even if they preach against it and know it is wrong.
 That said, a 
 compiler that doesn't warn that variables or arguments have gone unused 
 in a function is not a good compiler. That said, a language that doesn't 
 enforce return val checking is not a good programming language. 
I often have unused variables and such when developing and debugging code, and when using conditional compilation, and having the compiler nag me about them would be very irritating.
Jan 30 2010
next sibling parent "Nick Sabalausky" <a a.a> writes:
"Walter Bright" <newshound1 digitalmars.com> wrote in message 
news:hk1v87$1pem$1 digitalmars.com...
 John D wrote:
 That said, a compiler that doesn't warn that variables or arguments have 
 gone unused in a function is not a good compiler. That said, a language 
 that doesn't enforce return val checking is not a good programming 
 language.
I often have unused variables and such when developing and debugging code, and when using conditional compilation, and having the compiler nag me about them would be very irritating.
Only if they're fatal conditions. As non-fatal warnings, that irritation disappears.
Jan 30 2010
prev sibling parent reply "John D" <jdean googling.com> writes:
"Walter Bright" <newshound1 digitalmars.com> wrote in message 
news:hk1v87$1pem$1 digitalmars.com...
 John D wrote:
 "Walter Bright" <newshound1 digitalmars.com> wrote in message
 This goes back to my theory that a feature that encourages the 
 programmer to insert misleading dead code to shut the compiler up is 
 a misfeature.
A programmer who does the above is not a good programmer.
The reality is, even the best programmers will do such things if it is convenient to, even if they preach against it and know it is wrong.
That is oxymoronic: one can't be "the best" without discipline and self-control. Also "know thyself" is important: if a programmer really doesn't feel like programming and would rather be out playing softball, he should go out and play softball instead of just hacking. IMO, but ideals and abstract discussions like this, especially in text, are not very useful, so let's end what I started.
 That said, a compiler that doesn't warn that variables or arguments 
 have gone unused in a function is not a good compiler. That said, a 
 language that doesn't enforce return val checking is not a good 
 programming language.
I often have unused variables and such when developing and debugging code, and when using conditional compilation, and having the compiler nag me about them would be very irritating.
So you are indeed saying that the D compiler gives no such warnings, or that you are programming at a lower warning level? Personally, I always program with the highest warning level turned on. I don't regularly generate so many warnings, but rather errors, and that seems somehow correct to me.

If I made a compiler for my proprietary business, there would be no switching to a lower warning level, but then I'm a hard-ass director I guess. Please don't assume that I'm suggesting that others should use my methods, for I am not. I'm just stating what I do and what I would/will require of others working for me ("with" me is a whole new ballgame for me).
Jan 30 2010
parent reply Walter Bright <newshound1 digitalmars.com> writes:
John D wrote:
 So you are indeed saying that the D compiler gives no such warnings,
That's right.
Jan 30 2010
parent "John D" <jdean googling.com> writes:
"Walter Bright" <newshound1 digitalmars.com> wrote in message 
news:hk3bjj$1k1q$1 digitalmars.com...
 John D wrote:
 So you are indeed saying that the D compiler gives no such warnings,
That's right.
Because it is "D" ?
Jan 31 2010
prev sibling parent reply dsimcha <dsimcha yahoo.com> writes:
== Quote from John D (jdean googling.com)'s article
 A programmer who does the above is not a good programmer. That said, a
 compiler that doesn't warn that variables or arguments have gone unused
 in a function is not a good compiler. That said, a language that doesn't
 enforce return val checking is not a good programming language.
Think about it for a minute. When is the last time you checked the return value for printf() in C? Would you really want the compiler nagging you about something like this?
Jan 30 2010
parent "John D" <jdean googling.com> writes:
"dsimcha" <dsimcha yahoo.com> wrote in message 
news:hk24v0$23di$1 digitalmars.com...
 == Quote from John D (jdean googling.com)'s article
 A programmer who does the above is not a good programmer. That said, a
 compiler that doesn't warn that variables or arguments have gone 
 unused
 in a function is not a good compiler. That said, a language that 
 doesn't
 enforce return val checking is not a good programming language.
Think about it for a minute. When is the last time you checked the return value for printf() in C? Would you really want the compiler nagging you about something like this?
I have thought about it; that's why I stated it instead of writing it as a question. (I'm being a little bit facetious, cuz there is more than one way to solve a problem). I can restate it as a Q though, for it is a language design question: Requiring that all return values must be checked by the developer and having it enforced by the language specification and compiler: good or bad?
Jan 30 2010
prev sibling parent reply "Nick Sabalausky" <a a.a> writes:
"Walter Bright" <newshound1 digitalmars.com> wrote in message 
news:hjnd9r$dtp$1 digitalmars.com...
 bearophile wrote:
 In GCC there is also the "warn_unused_result" function attribute that
 raises a compile time warning if you don't use the return value of a
 function:
That may wind up suffering similar problems as checked exceptions do - just do a quick and dirty assign the return value to a hastily declared temp and ignore it, meaning to fix it later.
If it were an error (or treated as an error or a D-style "warning"), then yes. But as a true warning, not so much.
Jan 26 2010
parent reply Walter Bright <newshound1 digitalmars.com> writes:
Nick Sabalausky wrote:
 If it were an error (or treated as an error or a D-style "warning"), then 
 yes. But as a true warning, not so much. 
Many shops require that warnings be treated as hard errors. Furthermore, a warning that is not an error tends to become ignored background noise over time. (The infamous Windows Vista permission box is a fine example of that.)
Jan 26 2010
parent reply "Nick Sabalausky" <a a.a> writes:
"Walter Bright" <newshound1 digitalmars.com> wrote in message 
news:hjnidq$s0p$1 digitalmars.com...
 Nick Sabalausky wrote:
 If it were an error (or treated as an error or a D-style "warning"), then 
 yes. But as a true warning, not so much.
Many shops require that warnings be treated as hard errors.
In which case it's no longer a "true warning", so that doesn't really affect what I said.

Besides, if a shop does make that a hard-and-fast requirement, then by doing so they are creating the situation you described of causing programmers to hide problems in code without verifying first that it's not actually a problem. So a compiler like DMD obviously doesn't do a damn thing to change that. In fact, it just encourages that scenario in not just those shops, but in all the other ones as well. Sure, those other shops could just leave warnings off, but then you're recreating the scenario where the warnings become background noise, except it's worse because nobody even has a chance to pay attention to them.
 Furthermore, a warning that is not an error tends to become ignored 
 background noise over time. (The infamous Windows Vista permission box is 
 a fine example of that.)
I *do* investigate and handle all my warnings before checking code in (note that that's significantly different from warnings being unoptionally coerced into errors by the compiler, because in the latter case I may need to silence a warning because I'm momentarily in the middle of trying to deal with something else). And even if a time constraint requires me to just get it checked in before I can deal with the warnings, I still make damn sure to get around to it (and, to date, I always have). When I investigate a warning, if it turns out to be a problem, then I fix the problem. If it turns out to not be a problem, then I do go and do what I need to halt the warning, *but* it's perfectly fine because *I've already verified there's no problem being covered up*.

Any time any coder compiles their code and sees a stream of white noise warnings, that should be a red flag that they're approaching development wrong. Now yes, that *does* happen, and many coders *do* ignore that stuff, and yes, if there is something we can do to improve that situation without causing additional problems, then we should. But DMD's approach *doesn't actually do a damn thing to improve the situation*. In fact, it just *encourages* it:

DMD's philosophy with warnings: It's bad to go blindly covering up warnings because they might be indicating a real problem, and it's bad to ignore your warnings as white noise for the same reason. Therefore, we should provide the user with two options: Use "-w" to be forced to silence the warnings before proceeding, and thus be provided with *more* motivation/temptation to blindly cover them up, or omit "-w" to make it even easier to ignore all warnings.

Ie, all the problems you've identified with warnings are valid concerns, but DMD's "solution" clearly exacerbates those problems rather than alleviating them.
Jan 26 2010
parent reply Walter Bright <newshound1 digitalmars.com> writes:
Nick Sabalausky wrote:
 I *do* investigate and handle all my warnings before checking code in (note 
 that that's significantly different from warnings being unoptionally coerced 
 into errors by the compiler, because in the latter case I may need to 
 silence a warning because I'm momentarily in the middle of trying to deal 
 with something else). And even if a time constraint requires me to just get 
 it checked in before I can deal with the warnings, I still make damn sure to 
 get around to it (and, to date, I always have). When I investigate a 
 warning, if it turns out to be a problem, then I fix the problem. If it 
 turns out to not be a problem, then I do go and do what I need to halt the 
 warning, *but* it's perfectly fine because *I've already verified there's no 
 problem being covered up*.
You might know that, but the maintainer wouldn't. Furthermore, when such warnings pop up in code you didn't write and don't understand but was dumped in your lap, you'd do what everyone else does in such situations, which is throw in the minimal code necessary to get rid of the warning, and move on.

I agree that a conscientious and careful developer would never do such things. But they do. Even the ones who write articles and books saying "do not do this" will admit to doing it. When it's convenient and easy to do the wrong thing, people will do it. Even the ones who know better, and promise themselves they won't do it, do it.

The language should be designed as much as possible so that the right thing is the easy and convenient thing to do. Using exceptions for error handling fits right in with that. There's no incentive in D, for example, of doing the Java thing of "swallow and ignore all exceptions."
Jan 26 2010
next sibling parent "Nick Sabalausky" <a a.a> writes:
"Walter Bright" <newshound1 digitalmars.com> wrote in message 
news:hjnr0l$1jl5$1 digitalmars.com...
 Nick Sabalausky wrote:
 I *do* investigate and handle all my warnings before checking code in 
 (note that that's significantly different from warnings being 
 unoptionally coerced into errors by the compiler, because in the latter 
 case I may need to silence a warning because I'm momentarily in the 
 middle of trying to deal with something else). And even if a time 
 constraint requires me to just get it checked in before I can deal with 
 the warnings, I still make damn sure to get around to it (and, to date, I 
 always have). When I investigate a warning, if it turns out to be a 
 problem, then I fix the problem. If it turns out to not be a problem, 
 then I do go and do what I need to halt the warning, *but* it's perfectly 
 fine because *I've already verified there's no problem being covered up*.
You might know that, but the maintainer wouldn't.
Informing maintainers of things that aren't obvious from the code is what comments are for. And yes, yes, yes, comments don't always get made or updated, but 1. that's a different issue, and much more importantly, 2. there's nothing about DMD's philosophy of warnings that gets around that problem anyway.

Sure, D designs things to avoid the need for a warning in the first place, whenever possible (and that's great), but the problem is, it's not always possible. And for the cases when it isn't possible, and the need for a warning does arise, and the original coder gets that warning, verifies that it's ok, and then quiets it to prevent future warnings from being overlooked in the midst of a pile of old-but-ok warnings (which is exactly what he should do in that situation), then we're still right back to "There isn't a damn thing that can be done to help the maintainer know that besides hoping an appropriate comment was made".
 Furthermore, when such warnings pop up in code you didn't write and don't 
 understand but was dumped in your lap, you'd do what everyone else does in 
 such situations, which is throw in the minimal code necessary to get rid 
 of the warning, and move on.
I might be tempted to do that *if* warnings are both fatal and non-optional, but other than that, I can't imagine why in the world I, or even any random lazy programmer for that matter, would go around blindly silencing non-fatal warnings in unfamiliar code. There's no potential benefit in doing that, not even a perceived benefit.
 I agree that a conscientious and careful developer would never do such 
 things. But they do. Even the ones who write articles and books saying "do 
 not do this" will admit to doing it.

 When it's convenient and easy to do the wrong thing, people will do it. 
 Even the ones who know better, and promise themselves they won't do it, do 
 it.
Yes, that's bad, and it's a problem. But as I already explained, forcing all warnings to be either fatal or disabled can do nothing but *encourage* such bad practices even *more*.
 The language should be designed as much as possible so that the right 
 thing is the easy and convenient thing to do.
Right, which is why the language/compiler shouldn't make it *more* tempting to blindly silence warnings by preventing the programmer from continuing their immediate task until all warnings are either hidden (ie turned off) or squelched.
 Using exceptions for error handling fits right in with that.
 There's no incentive in D, for example, of doing the Java thing of 
 "swallow and ignore all exceptions."
I absolutely agree with that. The thing I was originally objecting to was the idea that GCC's warn_unused_result warning, when used *as* a warning (ie, not in a "warnings are/must be treated as errors" context) "may wind up suffering similar problems as checked exceptions do". Then I guess I got sidetracked on DMD's warnings philosophy.
Jan 26 2010
prev sibling parent reply Rainer Deyke <rainerd eldwood.com> writes:
Walter Bright wrote:
 When it's convenient and easy to do the wrong thing, people will do it.
 Even the ones who know better, and promise themselves they won't do it,
 do it.
WRT warnings, DMD doesn't just make it convenient and easy to do the wrong thing, it /forces/ you to do the wrong thing. -- Rainer Deyke - rainerd eldwood.com
Jan 26 2010
parent Don <nospam nospam.com> writes:
Rainer Deyke wrote:
 Walter Bright wrote:
 When it's convenient and easy to do the wrong thing, people will do it.
 Even the ones who know better, and promise themselves they won't do it,
 do it.
WRT warnings, DMD doesn't just make it convenient and easy to do the wrong thing, it /forces/ you to do the wrong thing.
Except that DMD doesn't really have any incorrect warnings any more. The only time that you get a warning that isn't necessarily a bug in your code is with implicit conversions. Once the range-tracking implicit conversion rules for arithmetic operations are fully implemented, all of the warnings could be turned into errors, and DMD could lose the -w switch.
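As a small sketch of the range tracking Don mentions (behavior assumed from dmd's value-range propagation): a narrowing conversion is accepted when the expression's possible values provably fit the target type.

int someRuntimeValue() { return 1000; }

void main()
{
    int i = someRuntimeValue();
    byte small = i & 0x7F; // accepted: the expression's range is 0..127
    // byte bad = i;       // rejected: the full int range doesn't fit
}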
Jan 27 2010
prev sibling parent reply dsimcha <dsimcha yahoo.com> writes:
== Quote from Justin Johansson (no spam.com)'s article
 dsimcha wrote:
 Multiple return values are a horrible substitute for exceptions, because they
 require the programmer to explicitly check the return value.  (When's the last
 time you checked the return value of printf, or even malloc?)  IMHO the best
 thing about exceptions is that they provide a sane default for error handling:
 If you don't handle them then you've effectively asserted that they can't
 happen in your situation.  If this "assertion" fails, then our program fails
 fast and with an error message that massively narrows down where the problem
 is.  I flat-out refuse to program in a language where the default is for
 errors to be ignored and I have to constantly write explicit error-checking
 boilerplate even if I don't care about handling the error.
Well, back in C++ land, as an occasional alternative to using exceptions, I use a templated "checked_value" structure for returning an error code / function result pair. The C++ templated structure is shown below.

A function returning a "checked_value" uses the first constructor form to return a valid result and the second form to return an erroneous result (using an enum to designate the error condition). If client code tries to access the value (via the overloaded cast operator, or you could have a getter function for the value instead), and an error is in effect that you didn't check for, then shit (an assert or other severity) happens.

I'm sure there will be lots of religious comments about this idiom, but it works well for me by forcing a check for an error result before otherwise using the return value. Of course, this idiom only works if the function in question "returns something", a value, that the client code would by necessity have to use. It wouldn't (and couldn't) work if ValueType is "void". In the exceptional case (pun intended) of functions returning void, one may have to resort to throwing an exception instead to signal an error.

So to counter dsimcha's point, my solution does not assume a default situation of ignoring errors. The thrust of my argument is that exceptions are not an all-or-nothing approach to sane dealing with errors.

template <typename ValueType>
struct checked_value
{
public:
   checked_value( ValueType value)
      : _value( value), _rcode( RC_OK) {}

   checked_value( ResultCode rcode)
      : _value( ValueType::PROP_INIT), _rcode( rcode) {}

   operator ValueType() const
   {
      // The value should not be accessible if the function failed.
      // Choose your poison with either assert or something
      // more severe in release compile if you don't like asserts
      // being preprocessed out.
      assert( !failed());
      return _value;
   }

   bool failed() const { return (_rcode < RC_OK); }

   ResultCode rcode() const { return _rcode; }

private:
   ValueType  const _value;
   ResultCode const _rcode;
};

Cheers
Justin Johansson
Nice. This is probably a good idea in situations where it **always** makes sense for the immediate caller to handle the error. Only problem is that Go! doesn't have asserts either.
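For comparison, a rough D rendering of Justin's idiom (a sketch only; the names and the negative-error-code convention are assumed from the C++ version):

struct CheckedValue(T)
{
    private T _value;
    private int _rcode = 0;  // >= 0 means OK, mirroring RC_OK

    this(T value) { _value = value; }

    static CheckedValue failure(int rcode)
    {
        CheckedValue cv;
        cv._rcode = rcode;
        return cv;
    }

    bool failed() const { return _rcode < 0; }

    T value() const
    {
        // As in the C++ version, the value should not be
        // accessible if the function failed.
        assert(!failed());
        return _value;
    }
}

unittest
{
    auto good = CheckedValue!int(42);
    assert(!good.failed && good.value == 42);

    auto bad = CheckedValue!int.failure(-1);
    assert(bad.failed);
}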
Jan 26 2010
parent reply Justin Johansson <no spam.com> writes:
dsimcha wrote:
 == Quote from Justin Johansson (no spam.com)'s article
 dsimcha wrote:
 Multiple return values are a horrible substitute for exceptions, because they
 require the programmer to explicitly check the return value.  (When's the last
 time you checked the return value of printf, or even malloc?)  IMHO the best
 thing about exceptions is that they provide a sane default for error handling:
 If you don't handle them then you've effectively asserted that they can't
 happen in your situation.  If this "assertion" fails, then our program fails
 fast and with an error message that massively narrows down where the problem
 is.  I flat-out refuse to program in a language where the default is for
 errors to be ignored and I have to constantly write explicit error-checking
 boilerplate even if I don't care about handling the error.
Well, back in C++ land, as an occasional alternative to using exceptions, I use a templated "checked_value" structure for returning an error code / function result pair. The C++ templated structure is shown below.

A function returning a "checked_value" uses the first constructor form to return a valid result and the second form to return an erroneous result (using an enum to designate the error condition). If client code tries to access the value (via the overloaded cast operator, or you could have a getter function for the value instead), and an error is in effect that you didn't check for, then shit (an assert or other severity) happens.

I'm sure there will be lots of religious comments about this idiom, but it works well for me by forcing a check for an error result before otherwise using the return value. Of course, this idiom only works if the function in question "returns something", a value, that the client code would by necessity have to use. It wouldn't (and couldn't) work if ValueType is "void". In the exceptional case (pun intended) of functions returning void, one may have to resort to throwing an exception instead to signal an error.

So to counter dsimcha's point, my solution does not assume a default situation of ignoring errors. The thrust of my argument is that exceptions are not an all-or-nothing approach to sane dealing with errors.

template <typename ValueType>
struct checked_value
{
public:
   checked_value( ValueType value)
      : _value( value), _rcode( RC_OK) {}

   checked_value( ResultCode rcode)
      : _value( ValueType::PROP_INIT), _rcode( rcode) {}

   operator ValueType() const
   {
      // The value should not be accessible if the function failed.
      // Choose your poison with either assert or something
      // more severe in release compile if you don't like asserts
      // being preprocessed out.
      assert( !failed());
      return _value;
   }

   bool failed() const { return (_rcode < RC_OK); }

   ResultCode rcode() const { return _rcode; }

private:
   ValueType  const _value;
   ResultCode const _rcode;
};

Cheers
Justin Johansson
Nice.
Thanks.
 This is probably a good idea in situations where it **always** makes sense
 for the immediate caller to handle the error.
Yes, like any tool you would use it when it is appropriate for the task at hand. AAMOF, I use it as a static function in a class wrapping the constructor to prevalidate arguments passed to the constructor. Now:

(1) For some reason (possibly valid only in an historic context), I have this great aversion to throwing exceptions from inside C++ constructors. From memory, I once threw an exception from inside a constructor with an early C++ compiler and wound up with stack corruption or something like that, and consequently I developed the practice of forever more avoiding throwing from inside a C++ constructor.

(2) Also, and this might be urban myth, I'm of the belief that a try block does not come for free at runtime even if an exception is not thrown (i.e. there's a cost to establishing the try block itself), and that for performance-critical code the old-fashioned return and check error code idiom is faster.

I wonder if our resident C++ compiler writer can shine any truth on (1).

With (2), can people either confirm (A) that it is true that try{} is not completely and absolutely free, or (B) that (2) is urban myth.
 Only problem is that Go! doesn't have asserts either.
I'm sure there must be a spoken language somewhere in the universe in which the word "Go" translates to "joke" in English. I tried it for a few days when it was first announced and no matter how much I tried to give Go a "fair go", as we say in Australia, the experience felt like going back 30 years in terms of it's language maturity. Cheers Justin Johansson
Jan 26 2010
next sibling parent reply Walter Bright <newshound1 digitalmars.com> writes:
Justin Johansson wrote:
 (1) For some reason (possibly valid only in an historic context), I have 
 this great aversion to throwing exceptions from inside C++ constructors. 
  From memory, I once threw an exception from inside a constructor
 with an early C++ compiler and wound up with stack corruption or 
 something like that, and consequently I developed the practice of 
 forever more avoiding throwing from inside a C++ constructor.
I'm a believer in the methodology that a constructor should be "trivial" in that it cannot fail (i.e. cannot throw). I'm probably in the minority with that view, but you shouldn't feel like you're doing the wrong thing when you stick to such a style. C++ constructors throwing should work according to the C++ standard.
 (2) Also, and this might be urban myth, I'm of the belief that a try
 block does not come for free at runtime even if an exception is not 
 thrown (i.e. there's a cost to establishing the try block itself),
 and that for performance-critical code the old-fashioned return and check
 error code idiom is faster.
With 32 bit Windows, there is a small penalty even when not throwing. With Linux and 64 bit Windows, there is no penalty. dmd will attempt to optimize away try blocks if the statements in them are all nothrow.
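A sketch of the nothrow case (whether the scaffolding is actually elided is up to the compiler; this just shows the shape of code the optimization applies to):

int parseDigit(char c) nothrow
{
    return (c >= '0' && c <= '9') ? c - '0' : -1;
}

void sum(string s)
{
    int total = 0;
    try
    {
        // Everything in this block is nothrow, so the compiler is
        // free to drop the try/catch scaffolding entirely.
        foreach (c; s)
            total += parseDigit(c);
    }
    catch (Exception e)
    {
        // unreachable for a nothrow body
    }
}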
Jan 26 2010
next sibling parent reply dsimcha <dsimcha yahoo.com> writes:
== Quote from Walter Bright (newshound1 digitalmars.com)'s article
 Justin Johansson wrote:
 (1) For some reason (possibly valid only in an historic context), I have
 this great aversion to throwing exceptions from inside C++ constructors.
  From memory, I once threw an exception from inside a constructor
 with an early C++ compiler and wound up with stack corruption or
 something like that, and consequently I developed the practice of
 forever more avoiding throwing from inside a C++ constructor.
I'm a believer in the methodology that a constructor should be "trivial" in that it cannot fail (i.e. cannot throw). I'm probably in the minority with that view, but you shouldn't feel like you're doing the wrong thing when you stick to such a style.
I've never understood the recommendation not to put complex logic in the constructor. If you need complex logic to establish the class's invariant, or need to initialize complicated immutable data structures, where else is that logic supposed to go?
Jan 26 2010
next sibling parent reply Walter Bright <newshound1 digitalmars.com> writes:
dsimcha wrote:
 I've never understood the recommendation not to put complex logic in the
 constructor.
Complex logic is fine, just logic that cannot fail.
 If you need complex logic to establish the class's invariant, or
 need to initialize complicated immutable data structures, where else is that
logic
 supposed to go?
For example, allocating resources that might fail can be done with a separate initialization function:

   auto c = new Class;
   c.create();
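A slightly fuller sketch of that split, using an invented Connection class: the constructor cannot fail, and everything fallible is deferred to create().

class Connection
{
    private bool ready;

    this() nothrow {}          // trivial: cannot fail

    void create(string host)   // the fallible step
    {
        if (host.length == 0)
            throw new Exception("no host given");
        // ... acquire the actual resource here ...
        ready = true;
    }
}

void use()
{
    auto c = new Connection;   // default initialization cannot fail
    c.create("example.org");   // failure surfaces here, as an exception
}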
Jan 26 2010
next sibling parent "Nick Sabalausky" <a a.a> writes:
"Walter Bright" <newshound1 digitalmars.com> wrote in message 
news:hjnt03$1od9$1 digitalmars.com...
 dsimcha wrote:
 I've never understood the recommendation not to put complex logic in the
 constructor.
Complex logic is fine, just logic that cannot fail.
 If you need complex logic to establish the class's invariant, or
 need to initialize complicated immutable data structures, where else is 
 that logic
 supposed to go?
 For example, allocating resources that might fail can be done with a separate initialization function:

    auto c = new Class;
    c.create();
But then you're right back into manual-resource-management-land (yes, I know, not for the GCed memory used to store the object, but besides that, yea, manual-resource-management-land). More specifically, you'll need to (or at least, *should*) set up a way for the object to keep track of whether or not it's been properly initialized, and then have all other accesses to it throw exceptions or something until it is properly inited.

And since it's useless until it's inited (or at least not fully-useful), plus the fact that those "accessed an uninited Foo" exceptions will be popping up in places *other* than where you tried-to-create-the-Foo-but-forgot-to-init (which is where you'd ideally want the exception to originate from, because that's where the real problem is), this means we should ideally create a factory function to encapsulate both the instantiation and initialization. And the constructor and init function should now be private to force all instantiations through the safer factory function.

And now, finally, our reward for all that work: We've reinvented the wheel^H^H^H^H^H^H constructor! And with a syntax that's inconsistent with all the other classes! Wheee!!!
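Spelled out, the boilerplate described above looks something like this (all names hypothetical):

class Foo
{
    private bool inited;

    private this() {}               // trivial constructor, hidden

    private void initialize(int n)  // the fallible second step
    {
        if (n < 0)
            throw new Exception("bad argument");
        inited = true;
    }

    int use()
    {
        // Guard every access, far from where the mistake was made.
        if (!inited)
            throw new Exception("accessed uninited Foo");
        return 42;
    }

    // The factory that glues the two steps back together:
    // a hand-rolled constructor with nonstandard syntax.
    static Foo create(int n)
    {
        auto f = new Foo;
        f.initialize(n);
        return f;
    }
}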
Jan 26 2010
prev sibling next sibling parent Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
Walter Bright wrote:
 dsimcha wrote:
 I've never understood the recommendation not to put complex logic in the
 constructor.
Complex logic is fine, just logic that cannot fail.
 If you need complex logic to establish the class's invariant, or
 need to initialize complicated immutable data structures, where else 
 is that logic
 supposed to go?
 For example, allocating resources that might fail can be done with a separate initialization function:

    auto c = new Class;
    c.create();
I think this approach is stilted. Andrei
Jan 26 2010
prev sibling parent reply "John D" <jdean googling.com> writes:
"Walter Bright" <newshound1 digitalmars.com> wrote in message 
news:hjnt03$1od9$1 digitalmars.com...
 dsimcha wrote:
 I've never understood the recommendation not to put complex logic in 
 the
 constructor.
Complex logic is fine, just logic that cannot fail.
 If you need complex logic to establish the class's invariant, or
 need to initialize complicated immutable data structures, where else 
 is that logic
 supposed to go?
 For example, allocating resources that might fail can be done with a separate initialization function:

    auto c = new Class;
    c.create();
Commonly called "2 step construction". But you forgot to add the check on the return val from c.create().
Jan 29 2010
parent reply Walter Bright <newshound1 digitalmars.com> writes:
John D wrote:
 Commonly called "2 step construction". But you forgot to add the check on 
 the return val from c.create(). 
No need to - c.create() should throw if it fails.
Jan 30 2010
parent reply "John D" <jdean googling.com> writes:
"Walter Bright" <newshound1 digitalmars.com> wrote in message 
news:hk1uso$1omt$2 digitalmars.com...
 John D wrote:
 Commonly called "2 step construction". But you forgot to add the check 
 on the return val from c.create().
No need to - c.create() should throw if it fails.
What's the point of doing 2-step construction if not to avoid exceptions?
Jan 30 2010
next sibling parent "Nick Sabalausky" <a a.a> writes:
"John D" <jdean googling.com> wrote in message 
news:hk33sm$186d$1 digitalmars.com...
 "Walter Bright" <newshound1 digitalmars.com> wrote in message 
 news:hk1uso$1omt$2 digitalmars.com...
 John D wrote:
 Commonly called "2 step construction". But you forgot to add the check 
 on the return val from c.create().
No need to - c.create() should throw if it fails.
What's the point of doing 2-step construction if not to avoid exceptions?
To avoid having a constructor that throws exceptions...Although I still fail to see how it's useful to avoid that in D.
Jan 30 2010
prev sibling parent reply Walter Bright <newshound1 digitalmars.com> writes:
John D wrote:
 "Walter Bright" <newshound1 digitalmars.com> wrote in message 
 news:hk1uso$1omt$2 digitalmars.com...
 John D wrote:
 Commonly called "2 step construction". But you forgot to add the check 
 on the return val from c.create().
No need to - c.create() should throw if it fails.
What's the point of doing 2-step construction if not to avoid exceptions?
It's so the default initialization cannot fail, which is handy in cases where dealing with failure is problematic.
Jan 30 2010
parent reply "John D" <jdean googling.com> writes:
"Walter Bright" <newshound1 digitalmars.com> wrote in message 
news:hk3bld$1k1q$2 digitalmars.com...
 John D wrote:
 "Walter Bright" <newshound1 digitalmars.com> wrote in message 
 news:hk1uso$1omt$2 digitalmars.com...
 John D wrote:
 Commonly called "2 step construction". But you forgot to add the 
 check on the return val from c.create().
No need to - c.create() should throw if it fails.
What's the point of doing 2-step construction if not to avoid exceptions?
It's so the default initialization cannot fail, which is handy in cases where dealing with failure is problematic.
Pffft. Said the backend compiler writer. So let it be written.
Jan 31 2010
parent "Nick Sabalausky" <a a.a> writes:
"John D" <jdean googling.com> wrote in message 
news:hk3f9m$1t0f$1 digitalmars.com...
 "Walter Bright" <newshound1 digitalmars.com> wrote in message 
 news:hk3bld$1k1q$2 digitalmars.com...
 John D wrote:
 "Walter Bright" <newshound1 digitalmars.com> wrote in message 
 news:hk1uso$1omt$2 digitalmars.com...
 John D wrote:
 Commonly called "2 step construction". But you forgot to add the check 
 on the return val from c.create().
No need to - c.create() should throw if it fails.
What's the point of doing 2-step construction if not to avoid exceptions?
It's so the default initialization cannot fail, which is handy in cases where dealing with failure is problematic.
Pffft. Said the backend compiler writer. So let it be written.
Why are you even here?
Jan 31 2010
prev sibling parent Justin Johansson <no spam.com> writes:
dsimcha wrote:
 == Quote from Walter Bright (newshound1 digitalmars.com)'s article
 Justin Johansson wrote:
 (1) For some reason (possibly valid only in an historic context), I have
 this great aversion to throwing exceptions from inside C++ constructors.
  From memory, I once threw an exception from inside a constructor
 with an early C++ compiler and wound up with stack corruption or
 something like that, and consequently I developed the practice of
 forever more avoiding throwing from inside a C++ constructor.
I'm a believer in the methodology that a constructor should be "trivial" in that it cannot fail (i.e. cannot throw). I'm probably in the minority with that view, but you shouldn't feel like you're doing the wrong thing when you stick to such a style.
I've never understood the recommendation not to put complex logic in the constructor. If you need complex logic to establish the class's invariant, or need to initialize complicated immutable data structures, where else is that logic supposed to go?
I do (especially when working with immutable value classes). A good use of OO classes is to bind/glue/cement a few pieces of your programming-task jigsaw puzzle together. This you do with a constructor, meaning that once you have a few pieces of the puzzle nutted out (and prevalidated), you use a constructor to associate the pieces together under the umbrella of a single object, which can be thought of as a tuple of values or a row in a table.

It would be unusual in relational DB terms to create a new row in a table and validate the columns after the event. Forms are filled out, fields are validated, and only when all's good does one commit that to a database (by creating a new row in a table). The same philosophy holds in an OO programming language. At best, it's a hint of code that could be better refactored, and, at worst, it's a sure sign of a bad design to check any of the following after the event (that is, after deciding to create a new instance of a class, tuple or row as the case may be):

(1) parameters passed to a constructor, inside the constructor
(2) individual values in a constructed tuple
(3) columns in the row of a DB table
(4) attribute values of an object, inside the constructor

This argument is programming language agnostic, and leans on the acknowledged benefits of immutable structures and functional programming.

Furthermore (and I think this is the most compelling point for not throwing exceptions in constructors), consider when a class is declared with const members. When declaring such a class, the const members of the class need to be assigned up front from the arguments passed to the constructor. This means that there would not be any opportunity to throw an exception anyway until execution enters the body of the constructor code. In C++ it's done like this (and surely in D it's similar... I forget, not having done D for six months):

class Foo {
public:
   Bar const bar;   // const means need to write
                    // : bar( aBar) after constructor
                    // argument list.
                    // cannot write bar = aBar in ctor body

   Foo( Bar aBar)
      : bar( aBar)
   {
      // Do "complex logic" here and
      // check Bar value now for validity (after
      // it's already been assigned to the member)
      // and throw an exception now if it's no
      // good, the devil asks?
   }
};

Cheers
Justin Johansson
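In D terms, the same prevalidate-then-bind style might look like this (a sketch; the Temperature type is invented for illustration):

struct Temperature
{
    immutable double celsius;

    private this(double c) { celsius = c; }  // merely binds

    static Temperature of(double c)
    {
        // All validation happens before construction...
        if (c < -273.15)
            throw new Exception("below absolute zero");
        // ...so the constructor itself never needs to throw.
        return Temperature(c);
    }
}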
Jan 26 2010
prev sibling next sibling parent reply Ary Borenszweig <ary esperanto.org.ar> writes:
Walter Bright wrote:
 Justin Johansson wrote:
 (1) For some reason (possibly valid only in an historic context), I 
 have this great aversion to throwing exceptions from inside C++ 
 constructors.  From memory, I once threw an exception from inside a 
 constructor
 with an early C++ compiler and wound up with stack corruption or 
 something like that, and consequently I developed the practice of 
 forever more avoiding throwing from inside a C++ constructor.
I'm a believer in the methodology that a constructor should be "trivial" in that it cannot fail (i.e. cannot throw). I'm probably in the minority with that view, but you shouldn't feel like you're doing the wrong thing when you stick to such a style.
auto x = new BigInt(someString);

How do you implement BigInt's constructor without being able to throw an exception? Or would you do it like:

auto x = BigInt.fromString(someString);

to be able to throw? (just to obey the "no throw in constructors"... but that's not as trivial as the previous form)
Jan 26 2010
next sibling parent reply Justin Johansson <no spam.com> writes:
Ary Borenszweig wrote:
 Walter Bright wrote:
 Justin Johansson wrote:
 (1) For some reason (possibly valid only in an historic context), I 
 have this great aversion to throwing exceptions from inside C++ 
 constructors.  From memory, I once threw an exception from inside a 
 constructor
 with an early C++ compiler and wound up with stack corruption or 
 something like that, and consequently I developed the practice of 
 forever more avoiding throwing from inside a C++ constructor.
I'm a believer in the methodology that a constructor should be "trivial" in that it cannot fail (i.e. cannot throw). I'm probably in the minority with that view, but you shouldn't feel like you're doing the wrong thing when you stick to such a style.
auto x = new BigInt(someString); How do you implement BigInt's constructor without being able to throw an exception? Or would you do it like: auto x = BigInt.fromString(someString); to be able to throw? (just to obey the "no throw in constructors"... but that's not as trivial as the previous form)
A factory method is the way to go. Different languages give you different means for achieving this design pattern but nevertheless all such means make for the better factoring of code.

In C++ there are three means:

(1) Use of a static class member, so your example would look like this:

   BigInt x = BigInt::fromString( someString);

(2) Use of the () function call operator overload on a factory class, so your example would now look like this:

   BigIntFactory bigIntFactory; // may be statically declared
   BigInt x = bigIntFactory( someString);

(3) A global function, which I won't discuss any further for obvious reasons.

D is similar to C++, though the function call () operator overload is effected in much cleaner fashion with D's absolutely wonderful static opCall. So your example would look something like this (as said earlier I haven't done D for 6 months so pls forgive any error in detail):

class BigInt
{
   static BigInt opCall( string someString)
   {
      if (!validate( someString))
         throw someError;

      // extract data for BigInt instance somehow
      // from string .. maybe tied into validate function
      byte[] bigIntData = ...

      return new BigInt( bigIntData);
   }

   this( byte[] bigIntData)
   {
      this.bigIntData = bigIntData;
   }

private:
   byte[] bigIntData;

   // other BigInt methods ...
}

Now in D:

   BigInt x = BigInt( someString);

In Java, well, let's not discuss that here. :-)

In Scala you have "companion classes" that go hand-in-hand with the regular class. Scala uses companion classes to reduce the noise that the static class members introduce in other languages. (Example anybody?)

Summary for D: It really isn't that much work to use D's static opCall() to good effect and, IMHO, complex designs do end up a lot cleaner. As they say, necessity is the mother of invention. It seems to me that both Scala and D have been driven by necessity in the design of companion classes and static opCall respectively.

Cheers
Justin Johansson
Jan 26 2010
next sibling parent reply "Nick Sabalausky" <a a.a> writes:
"Justin Johansson" <no spam.com> wrote in message 
news:hjo31b$275o$1 digitalmars.com...
 Ary Borenszweig wrote:
 Walter Bright wrote:
 Justin Johansson wrote:
 (1) For some reason (possibly valid only in an historic context), I 
 have this great aversion to throwing exceptions from inside C++ 
 constructors.  From memory, I once threw an exception from inside a 
 constructor
 with an early C++ compiler and wound up with stack corruption or 
 something like that, and consequently I developed the practice of 
 forever more avoiding throwing from inside a C++ constructor.
I'm a believer in the methodology that a constructor should be "trivial" in that it cannot fail (i.e. cannot throw). I'm probably in the minority with that view, but you shouldn't feel like you're doing the wrong thing when you stick to such a style.
auto x = new BigInt(someString); How do you implement BigInt's constructor without being able to throw an exception? Or would you do it like: auto x = BigInt.fromString(someString); to be able to throw? (just to obey the "no throw in constructors"... but that's not as trivial as the previous form)
A factory method is the way to go. Different languages give you different means for achieving this design pattern but nevertheless all such means make for the better factoring of code.
Still within the context of this D BigInt example, what benefit does using a factory method provide over a throwing constructor? I've always seen factories as falling into one of two categories: 1. A hack to get around a limitation in a language's constructor feature. or 2. Part of a helper API (I guess the kids are calling those "facades" these days...) for pre-configuring a new instance in a commonly-useful, but non-default way.
Jan 26 2010
parent reply Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
Nick Sabalausky wrote:
 "Justin Johansson" <no spam.com> wrote in message 
 news:hjo31b$275o$1 digitalmars.com...
 Ary Borenszweig wrote:
 Walter Bright wrote:
 Justin Johansson wrote:
 (1) For some reason (possibly valid only in an historic context), I 
 have this great aversion to throwing exceptions from inside C++ 
 constructors.  From memory, I once threw an exception from inside a 
 constructor
 with an early C++ compiler and wound up with stack corruption or 
 something like that, and consequently I developed the practice of 
 forever more avoiding throwing from inside a C++ constructor.
I'm a believer in the methodology that a constructor should be "trivial" in that it cannot fail (i.e. cannot throw). I'm probably in the minority with that view, but you shouldn't feel like you're doing the wrong thing when you stick to such a style.
auto x = new BigInt(someString); How do you implement BigInt's constructor without being able to throw an exception? Or would you do it like: auto x = BigInt.fromString(someString); to be able to throw? (just to obey the "no throw in constructors"... but that's not as trivial as the previous form)
A factory method is the way to go. Different languages give you different means for achieving this design pattern but nevertheless all such means make for the better factoring of code.
Still within the context of this D BigInt example, what benefit does using a factory method provide over a throwing constructor? I've always seen factories as falling into one of two categories: 1. A hack to get around a limitation in a language's constructor feature. or 2. Part of a helper API (I guess the kids are calling those "facades" these days...) for pre-configuring a new instance in a commonly-useful, but non-default way.
Factories are mostly (imho) important in the fourth scenario: when you want to create an object from data. Andrei
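A sketch of that scenario (types invented): the input data picks the concrete class at run time, which a plain constructor cannot do.

abstract class Shape
{
    abstract double area();
}

class Circle : Shape
{
    double r;
    this(double r) { this.r = r; }
    override double area() { return 3.14159 * r * r; }
}

class Square : Shape
{
    double s;
    this(double s) { this.s = s; }
    override double area() { return s * s; }
}

// The factory: "new Circle" vs. "new Square" is decided by the data.
Shape shapeFromData(string kind, double size)
{
    switch (kind)
    {
        case "circle": return new Circle(size);
        case "square": return new Square(size);
        default: throw new Exception("unknown shape: " ~ kind);
    }
}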
Jan 26 2010
parent reply "John D" <jdean googling.com> writes:
"Andrei Alexandrescu" <SeeWebsiteForEmail erdani.org> wrote in message 
news:hjo75f$2g30$1 digitalmars.com...
 Nick Sabalausky wrote:
 "Justin Johansson" <no spam.com> wrote in message 
 news:hjo31b$275o$1 digitalmars.com...
 Ary Borenszweig wrote:
 Walter Bright wrote:
 Justin Johansson wrote:
 (1) For some reason (possibly valid only in an historic context), 
 I have this great aversion to throwing exceptions from inside C++ 
 constructors.  From memory, I once threw an exception from inside 
 a constructor
 with an early C++ compiler and wound up with stack corruption or 
 something like that, and consequently I developed the practice of 
 forever more avoiding throwing from inside a C++ constructor.
I'm a believer in the methodology that a constructor should be "trivial" in that it cannot fail (i.e. cannot throw). I'm probably in the minority with that view, but you shouldn't feel like you're doing the wrong thing when you stick to such a style.
auto x = new BigInt(someString); How do you implement BigInt's constructor without being able to throw an exception? Or would you do it like: auto x = BigInt.fromString(someString); to be able to throw? (just to obey the "no throw in constructors"... but that's not as trivial as the previous form)
A factory method is the way to go. Different languages give you different means for achieving this design pattern but nevertheless all such means make for the better factoring of code.
Still within the context of this D BigInt example, what benefit does using a factory method provide over a throwing constructor? I've always seen factories as falling into one of two categories: 1. A hack to get around a limitation in a language's constructor feature. or 2. Part of a helper API (I guess the kids are calling those "facades" these days...) for pre-configuring a new instance in a commonly-useful, but non-default way.
Factories are mostly (imho) important in the fourth scenario: when you want to create an object from data.
dynamic libraries.
Jan 29 2010
parent reply "Nick Sabalausky" <a a.a> writes:
"John D" <jdean googling.com> wrote in message 
news:hk0mph$2dt6$1 digitalmars.com...
 "Andrei Alexandrescu" <SeeWebsiteForEmail erdani.org> wrote in message 
 news:hjo75f$2g30$1 digitalmars.com...
 Nick Sabalausky wrote:
 Still within the context of this D BigInt example, what benefit does 
 using a factory method provide over a throwing constructor?

 I've always seen factories as falling into one of two categories: 1. A 
 hack to get around a limitation in a language's constructor feature. or 
 2. Part of a helper API (I guess the kids are calling those "facades" 
 these days...) for pre-configuring a new instance in a commonly-useful, 
 but non-default way.
Factories are mostly (imho) important in the fourth scenario: when you want to create an object from data.
dynamic libraries.
Can you elaborate on this?
Jan 30 2010
parent reply "John D" <jdean googling.com> writes:
"Nick Sabalausky" <a a.a> wrote in message 
news:hk3bij$1k4q$1 digitalmars.com...
 "John D" <jdean googling.com> wrote in message 
 news:hk0mph$2dt6$1 digitalmars.com...
 "Andrei Alexandrescu" <SeeWebsiteForEmail erdani.org> wrote in message 
 news:hjo75f$2g30$1 digitalmars.com...
 Nick Sabalausky wrote:
 Still within the context of this D BigInt example, what benefit does 
 using a factory method provide over a throwing constructor?

 I've always seen factories as falling into one of two categories: 1. 
 A hack to get around a limitation in a language's constructor 
 feature. or 2. Part of a helper API (I guess the kids are calling 
 those "facades" these days...) for pre-configuring a new instance in 
 a commonly-useful, but non-default way.
Factories are mostly (imho) important in the fourth scenario: when you want to create an object from data.
of dynamic libraries.
Can you elaborate on this?
I can. But I shouldn't have to. So shut up.
Jan 31 2010
parent "Nick Sabalausky" <a a.a> writes:
"John D" <jdean googling.com> wrote in message 
news:hk3f9n$1t0f$2 digitalmars.com...
 "Nick Sabalausky" <a a.a> wrote in message 
 news:hk3bij$1k4q$1 digitalmars.com...
 "John D" <jdean googling.com> wrote in message 
 news:hk0mph$2dt6$1 digitalmars.com...
 "Andrei Alexandrescu" <SeeWebsiteForEmail erdani.org> wrote in message 
 news:hjo75f$2g30$1 digitalmars.com...
 Nick Sabalausky wrote:
 Still within the context of this D BigInt example, what benefit does 
 using a factory method provide over a throwing constructor?

 I've always seen factories as falling into one of two categories: 1. A 
 hack to get around a limitation in a language's constructor feature. 
 or 2. Part of a helper API (I guess the kids are calling those 
 "facades" these days...) for pre-configuring a new instance in a 
 commonly-useful, but non-default way.
Factories are mostly (imho) important in the fourth scenario: when you want to create an object from data.
of dynamic libraries.
Can you elaborate on this?
I can. But I shouldn't have to. So shut up.
I see we have another superdan on our hands. If it even is another... This screwball did show up around the time superdan disappeared.
Jan 31 2010
prev sibling parent Yigal Chripun <yigal100 gmail.com> writes:
On 27/01/2010 02:57, Justin Johansson wrote:
 Ary Borenszweig wrote:
 Walter Bright wrote:
 Justin Johansson wrote:
 (1) For some reason (possibly valid only in an historic context), I
 have this great aversion to throwing exceptions from inside C++
 constructors. From memory, I once threw an exception from inside a
 constructor
 with an early C++ compiler and wound up with stack corruption or
 something like that, and consequently I developed the practice of
 forever more avoiding throwing from inside a C++ constructor.
I'm a believer in the methodology that a constructor should be "trivial" in that it cannot fail (i.e. cannot throw). I'm probably in the minority with that view, but you shouldn't feel like you're doing the wrong thing when you stick to such a style.
auto x = new BigInt(someString); How do you implement BigInt's constructor without being able to throw an exception? Or would you do it like: auto x = BigInt.fromString(someString); to be able to throw? (just to obey the "no throw in constructors"... but that's not as trivial as the previous form)
A factory method is the way to go. Different languages give you different means for achieving this design pattern but nevertheless all such means make for the better factoring of code.

In C++ there are three means:

(1) Use of a static class member, so your example would look like this:

   BigInt x = BigInt::fromString( someString);

(2) Use of the () function call operator overload on a factory class, so your example would now look like this:

   BigIntFactory bigIntFactory; // may be statically declared
   BigInt x = bigIntFactory( someString);

(3) A global function, which I won't discuss any further for obvious reasons.

D is similar to C++, though the function call () operator overload is effected in much cleaner fashion with D's absolutely wonderful static opCall. So your example would look something like this (as said earlier I haven't done D for 6 months so pls forgive any error in detail):

class BigInt
{
   static BigInt opCall( string someString)
   {
      if (!validate( someString))
         throw someError;

      // extract data for BigInt instance somehow
      // from string .. maybe tied into validate function
      byte[] bigIntData = ...

      return new BigInt( bigIntData);
   }

   this( byte[] bigIntData)
   {
      this.bigIntData = bigIntData;
   }

private:
   byte[] bigIntData;

   // other BigInt methods ...
}

Now in D:

   BigInt x = BigInt( someString);

In Java, well, let's not discuss that here. :-)

In Scala you have "companion classes" that go hand-in-hand with the regular class. Scala uses companion classes to reduce the noise that the static class members introduce in other languages. (Example anybody?)

Summary for D: It really isn't that much work to use D's static opCall() to good effect and, IMHO, complex designs do end up a lot cleaner. As they say, necessity is the mother of invention. It seems to me that both Scala and D have been driven by necessity in the design of companion classes and static opCall respectively.

Cheers
Justin Johansson
Factories are a hack to overcome limitations of the language, mainly the fact that constructors aren't virtual. The above solution(s) have two main drawbacks: testability and multi-threading are both adversely affected.
Jan 31 2010
prev sibling parent reply Walter Bright <newshound1 digitalmars.com> writes:
Ary Borenszweig wrote:
 auto x = new BigInt(someString);
 
 How do you implement BigInt's constructor without being able to throw an 
 exception?
I presume you mean how can it work without allocating memory. In D, a memory allocation failure is generally not a recoverable exception, it's a fatal exception. A better example would be one that attempted to open a file in the constructor.
Jan 26 2010
parent reply "Nick Sabalausky" <a a.a> writes:
"Walter Bright" <newshound1 digitalmars.com> wrote in message 
news:hjo5t3$2dqq$1 digitalmars.com...
 Ary Borenszweig wrote:
 auto x = new BigInt(someString);

 How do you implement BigInt's constructor without being able to throw an 
 exception?
I presume you mean how can it work without allocating memory. In D, a memory allocation failure is generally not a recoverable exception, it's a fatal exception. A better example would be one that attempted to open a file in the constructor.
I think he meant that passing it something like "Hello, Fred." should be disallowed.
Jan 26 2010
parent Ary Borenszweig <ary esperanto.org.ar> writes:
Nick Sabalausky wrote:
 "Walter Bright" <newshound1 digitalmars.com> wrote in message 
 news:hjo5t3$2dqq$1 digitalmars.com...
 Ary Borenszweig wrote:
 auto x = new BigInt(someString);

 How do you implement BigInt's constructor without being able to throw an 
 exception?
I presume you mean how can it work without allocating memory. In D, a memory allocation failure is generally not a recoverable exception, it's a fatal exception. A better example would be one that attempted to open a file in the constructor.
I think he meant that passing it something like "Hello, Fred." should be disallowed.
Yes, that. I don't see another way other than throwing an exception. Or maybe do it like Ruby: when you do "hello".to_i it gives you zero, but then you have to check it later.

I don't like factory methods like BigInt.fromString. If I want to create a BigInt from a string I should use the constructor, why not?
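The two styles side by side in D (a sketch, with to!int standing in for the parse):

import std.conv;

// Default D style: bad input fails fast at the call site.
int strict(string s)
{
    return to!int(s);  // throws on "hello"
}

// Ruby-ish style: never throws, but pushes checking onto every caller.
int lenient(string s)
{
    try { return to!int(s); }
    catch (Exception e) { return 0; }  // "hello" quietly becomes 0
}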
Jan 27 2010
prev sibling parent "John D" <jdean googling.com> writes:
"Walter Bright" <newshound1 digitalmars.com> wrote in message 
news:hjnrej$1kpc$1 digitalmars.com...
 Justin Johansson wrote:
 (1) For some reason (possibly valid only in an historic context), I 
 have this great aversion to throwing exceptions from inside C++ 
 constructors. From memory, I once threw an exception from inside a 
 constructor
 with an early C++ compiler and wound up with stack corruption or 
 something like that, and consequently I developed the practice of 
 forever more avoiding throwing from inside a C++ constructor.
I'm a believer in the methodology that a constructor should be "trivial" in that it cannot fail (i.e. cannot throw).
Me too. (It's not "a methodology" though, just a style or coding guideline/requirement).
 I'm probably in the minority with that view,
C++ is called a "multi-paradigm language", but one of the most widely used paradigms of C++ is never included in the list along with OO, generic, procedural, etc. The one curiously omitted from the lists is RAII. That "lil thing" will PERMEATE a program's architecture.

I'm of the persuasion that Initialization Is Not Resource Acquisition (IINRA). At the very least, RAII violates the principle of Separation of Concerns. More of a problem though is that the RAII paradigm FORCES you to use exceptions rather than another/other error handling technique(s): it takes away your choice and sucks in a complex language feature and all its surrounding issues.
 but you shouldn't feel like you're doing the wrong thing when you stick 
 to such a style.
(That's better: "style" instead of "methodology").
Jan 29 2010
prev sibling parent reply "Jérôme M. Berger" <jeberger free.fr> writes:
Justin Johansson wrote:
 (1) For some reason (possibly valid only in an historic context), I have
 this great aversion to throwing exceptions from inside C++ constructors.
  From memory, I once threw an exception from inside a constructor
 with an early C++ compiler and wound up with stack corruption or
 something like that, and consequently I developed the practice of
 forever more avoiding throwing from inside a C++ constructor.

 (2) Also, and this might be urban myth, I'm of the belief that a try
 block does not come for free at runtime even if an exception is not
 thrown (i.e. there's a cost to establishing the try block itself),
 and that for performance-critical code the old-fashioned return and check
 error code idiom is faster.

 I wonder if our resident C++ compiler writer can shine any truth on (1).

 With (2), can people either confirm (A) that it is true that try{}
 is not completely and absolutely free, or (B) that (2) is urban myth.

	Throwing exception inside constructors should be avoided because then the destructor is never called and you risk leaking like crazy.
	Regarding point (2), it depends on your compiler. Recent g++ versions may use something called "DWARF exception unwinding" which costs nothing so long as you don't throw. Older versions (or when there is a risk that some stack frames won't be in the right format, like on Windows) use setjmp and longjmp. In that case, the try block costs a setjmp (plus whatever is required for making the jmp_buf thread-local). I don't know how other compilers handle it.

		Jerome
-- 
mailto:jeberger free.fr
http://jeberger.free.fr
Jabber: jeberger jabber.fr
Jan 26 2010
next sibling parent reply Ali Çehreli <acehreli yahoo.com> writes:
Jérôme M. Berger wrote:
 	Throwing exception inside constructors should be avoided because
 then the destructor is never called and you risk leaking like crazy.
That's necessarily so, because calling the destructor on the incomplete object might cause other troubles.

On the other hand, the destructors of all of the constructed members are called when the encapsulating object's constructor throws.

Exception-safe programming in C++ consists of a handful of guidelines that take care of throwing from constructors.

Ali
Jan 26 2010
parent reply "Jérôme M. Berger" <jeberger free.fr> writes:
Ali Çehreli wrote:
 Jérôme M. Berger wrote:
     Throwing exception inside constructors should be avoided because
 then the destructor is never called and you risk leaking like crazy.

 That's necessarily so, because calling the destructor on the incomplete
 object might cause other troubles.

 On the other hand, the destructors of all of the constructed members are
 called when the encapsulating object's constructor throws.

	Does that include the parent destructor?

		Jerome
-- 
mailto:jeberger free.fr
http://jeberger.free.fr
Jabber: jeberger jabber.fr
Jan 27 2010
parent "Jérôme M. Berger" <jeberger free.fr> writes:
Jérôme M. Berger wrote:
 Ali Çehreli wrote:
 Jérôme M. Berger wrote:
     Throwing exception inside constructors should be avoided because
 then the destructor is never called and you risk leaking like crazy.

 That's necessarily so, because calling the destructor on the incomplete
 object might cause other troubles.

 On the other hand, the destructors of all of the constructed members are
 called when the encapsulating object's constructor throws.

 Does that include the parent destructor?

	I just checked and it seems to.

		Jerome
-- 
mailto:jeberger free.fr
http://jeberger.free.fr
Jabber: jeberger jabber.fr
Jan 27 2010
prev sibling next sibling parent Rainer Deyke <rainerd eldwood.com> writes:
Jérôme M. Berger wrote:
 	Throwing exception inside constructors should be avoided because
 then the destructor is never called and you risk leaking like crazy.
That's not an issue if you properly encapsulate your resources. One class, one responsibility. Consider (C++):

  class C {
    C() { ... }
    Resource a, b, c;
  };

If any of the resource constructors throw, construction of the 'C' object is aborted and any Resource objects that have already been constructed are properly destroyed. By the time the body of the 'C' constructor is entered, all the resources already exist and are guaranteed to be properly disposed of.

If you have a struct/class that performs more than one logical action in its constructor and/or destructor, refactor it into several structs/classes.

-- 
Rainer Deyke - rainerd eldwood.com
Jan 26 2010
prev sibling parent reply "Nick Sabalausky" <a a.a> writes:
""Jérôme M. Berger"" <jeberger free.fr> wrote in message 
news:hjnrhv$1l0k$1 digitalmars.com...
Throwing exception inside constructors should be avoided because
then the destructor is never called and you risk leaking like crazy.
That's a risk outside of constructors too. Hence: scope(failure) {}
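For instance (a sketch with an invented acquire/release pair):

int acquire() { return 1; }
void release(int h) {}
void riskyStep() { throw new Exception("boom"); }

void work()
{
    auto handle = acquire();
    scope(failure) release(handle);  // runs only if we leave by a throw

    riskyStep();                     // may throw; handle still released
    release(handle);                 // normal path
}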
Jan 26 2010
parent reply "Jérôme M. Berger" <jeberger free.fr> writes:
Nick Sabalausky wrote:
 ""Jérôme M. Berger"" <jeberger free.fr> wrote in message 
 news:hjnrhv$1l0k$1 digitalmars.com...
 Throwing exception inside constructors should be avoided because
 then the destructor is never called and you risk leaking like crazy.

 That's a risk outside of constructors too. Hence:

 scope(failure) {}

	- This doesn't exist in C++;
	- Outside of a constructor, there is no problem because the destructor is called.

		Jerome
-- 
mailto:jeberger free.fr
http://jeberger.free.fr
Jabber: jeberger jabber.fr
Jan 27 2010
parent reply "Nick Sabalausky" <a a.a> writes:
Nick Sabalausky wrote:
 ""J?r?me M. Berger"" <jeberger free.fr> wrote in message
 news:hjnrhv$1l0k$1 digitalmars.com...
 Throwing exception inside constructors should be avoided because
 then the destructor is never called and you risk leaking like crazy.
That's a risk outside of constructors too. Hence: scope(failure) {}
- This doesn't exist in C++;
Well, I was really only talking about D. I guess the answer to "Should throwing exceptions in a constructor be avoided?" is "Depends on the language." For C++, constructors have historically had a number of tricky edge cases (though I couldn't really remember what), so I'm not surprised that throwing exceptions is one of them. For D though, I'm not sure I see that there's a problem.
- Outside of constructor, there is no problem because the destructor
is called.
1. Not always true in D (though that's a more general problem with D destructors).
2. There can be times when some cleanup is needed that isn't in a destructor (especially in D, since destructors can't be relied upon to actually be called).
Jan 27 2010
parent Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
Nick Sabalausky wrote:
 Nick Sabalausky wrote:
 ""J?r?me M. Berger"" <jeberger free.fr> wrote in message
 news:hjnrhv$1l0k$1 digitalmars.com...
 Throwing exception inside constructors should be avoided because
 then the destructor is never called and you risk leaking like crazy.
That's a risk outside of constructors too. Hence: scope(failure) {}
- This doesn't exist in C++;
Well, I was really only talking about D. I guess the answer to "Should throwing exceptions in a constructor be avoided?" is "Depends on the language." For C++, constructors have historically had a number of tricky edge cases (though I couldn't really remember what), so I'm not surprised that throwing exceptions is one of them. For D though, I'm not sure I see that there's a problem.
The edge cases involve member initialization that throws. The root of the matter is that in C++ you can't have a type with a guaranteed nothrow constructor. That makes a lot of things very difficult when it comes to object composition.

I've aired a number of times the idea that the parameterless constructor should not throw. That would simplify many issues, but there are still cases when that doesn't work (non-null!!!)

Andrei
Jan 27 2010
prev sibling next sibling parent reply "Nick Sabalausky" <a a.a> writes:
"Steve Teale" <steve.teale britseyeview.com> wrote in message 
news:hjf9uk$1trp$1 digitalmars.com...

realize of course does not mean anything. But I'd be interested to hear 
what the D aficionados think of Go.

 It probably would not suit Andrei.
It's a gimped, obfuscated and immature imitation of D. It's little more than a concurrency-model experiment masquerading as a real language. Also:

- As far as I'm concerned, its real name is "Issue 9" (search "google go issue 9").
- It's the Buick/Cadillac/Oldsmobile of computer languages: Garbage that gets attention solely because of the name(s) attached.
- Does nothing to change my opinion that Google has done nothing noteworthy outside of search engines and maybe their ad service.
Jan 23 2010
next sibling parent reply "Nick Sabalausky" <a a.a> writes:
"Nick Sabalausky" <a a.a> wrote in message 
news:hjff0j$27i2$1 digitalmars.com...
 - Does nothing to change my opinion that Google has done nothing 
 noteworthy outside of search engines and maybe their ad service.
And their maps, of course. But then again, how google, of all companies, can't manage to get the non-JS fallback to work cleanly and reliably is beyond me (or how they can't manage to *start* the map centered around the search point instead of starting it elsewhere and slowly scrolling it to the center as it does now, or can't figure out that not everybody needs that damn pop-up bubble on the search point *every* single time they do a map search). Even mapquest was able to pull that stuff off years ago.
Jan 23 2010
parent grauzone <none example.net> writes:
Nick Sabalausky wrote:
 "Nick Sabalausky" <a a.a> wrote in message 
 news:hjff0j$27i2$1 digitalmars.com...
 - Does nothing to change my opinion that Google has done nothing 
 noteworthy outside of search engines and maybe their ad service.
And their maps, of course. But then again, how google, of all companies, can't manage to get the non-JS fallback to work cleanly and reliabily is
I don't think Google _wants_ it to work. Whenever I use something from Google (other than the web search), I get the feeling they want to force me to enable JS. Maybe disabling JS gets in the way of their ad- and data-mining features, or they want to force Web 2.0 on everyone.
 beyond me (or how they can't manage to *start* the map centered around the 
 search point instead of starting it elsewhere and slowly scrolling it to
 the center as it does now, or can't figure out that not everybody needs that 
 damn pop-up bubble on the search point *every* single time they do a map 
 search). Even mapquest was able to pull that stuff off years ago. 
 
 
Jan 23 2010
prev sibling next sibling parent reply Bane <branimir.milosavljevic gmail.com> writes:
Nick Sabalausky Wrote:

 "Steve Teale" <steve.teale britseyeview.com> wrote in message 
 news:hjf9uk$1trp$1 digitalmars.com...

realize of course does not mean anything. But I'd be interested to hear 
what the D aficionados think of Go.

 It probably would not suit Andrei.
It's a gimped, obfuscated and immature imitation of D. It's little more than a concurrency-model experiment masquerading as a real language. Also: - As far as I'm concerned, its real name is "Issue 9" (search "google go issue 9"). - It's the Buick/Cadillac/Oldsmobile of computer languages: Garbage that gets attention solely because of the name(s) attached. - Does nothing to change my opinion that Google has done nothing noteworthy outside of search engines and maybe their ad service.
It looks to me like they are making Google Goo for prestige. Search engine, browser, now a programming language... What's next? An OS? Laptops? A fast-food franchise?
Jan 23 2010
next sibling parent reply Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
Bane wrote:
 Nick Sabalausky Wrote:
 
 "Steve Teale" <steve.teale britseyeview.com> wrote in message 
 news:hjf9uk$1trp$1 digitalmars.com...

 realize of course does not mean anything. But I'd be interested to hear 
 what the D aficionados think of Go.

 It probably would not suit Andrei.
It's a gimped, obfuscated and immature imitation of D. It's little more than a concurrency-model experiment masquerading as a real language. Also: - As far as I'm concerned, its real name is "Issue 9" (search "google go issue 9"). - It's the Buick/Cadillac/Oldsmobile of computer languages: Garbage that gets attention solely because of the name(s) attached. - Does nothing to change my opinion that Google has done nothing noteworthy outside of search engines and maybe their ad service.
It looks like to me they are making Google Goo for prestige. Search engine, browser, now programming language... Whats next? OS? Laptops? Fast food franchise?
I don't understand all the criticism of Google's products. Of corporate software producers, Apple and Google are the only two making products that work reliably and are carefully designed.

Besides, there's not much conspiracy going on. People at Google go off and do their own projects all the time. I don't see Go as part of a careful ploy. (For one thing, it would be much more polished in that case.) It is also known inside Google circles that Go's authors are not language designers, and that is visible in the quality of the language. (That could change any time; Google does employ quite a few good language designers who may contribute to Go in the future.)

I think Tiobe's method of measuring language popularity is rather noisy. Go is unfinished on Unix, not present on Windows, and all around severely lacking in vision and originality. There's no way in hell it will become a top-ten language (which Tiobe has predicted will happen starting this year.)

Andrei
Jan 23 2010
next sibling parent reply Roman Ivanov <isroman-del ete-km.ru> writes:
Andrei Alexandrescu Wrote:

 Bane wrote:
 Nick Sabalausky Wrote:
 
 "Steve Teale" <steve.teale britseyeview.com> wrote in message 
 news:hjf9uk$1trp$1 digitalmars.com...

 realize of course does not mean anything. But I'd be interested to hear 
 what the D aficionados think of Go.

 It probably would not suit Andrei.
It's a gimped, obfuscated and immature imitation of D. It's little more than a concurrency-model experiment masquerading as a real language. Also: - As far as I'm concerned, its real name is "Issue 9" (search "google go issue 9"). - It's the Buick/Cadillac/Oldsmobile of computer languages: Garbage that gets attention solely because of the name(s) attached. - Does nothing to change my opinion that Google has done nothing noteworthy outside of search engines and maybe their ad service.
It looks like to me they are making Google Goo for prestige. Search engine, browser, now programming language... Whats next? OS? Laptops? Fast food franchise?
I don't understand all the criticism behind Google's product. Of corporate software producers, Apple and Google are the two ones making products that work reliably and are carefully designed.
They get lots and lots of undeserved attention. Even when the final products are not that great, and occasionally when the people praising them would be hostile towards the same kinds of products from smaller companies.

Reception often borders on outright hysteria. It's mostly the fault of the people who react this way, but both companies put a lot of effort into creating this effect via various kinds of marketing too.
Jan 23 2010
parent reply Roman Ivanov <isroman-del ete-km.ru> writes:
Roman Ivanov Wrote:

 Andrei Alexandrescu Wrote:
 
 Bane wrote:
 Nick Sabalausky Wrote:
 
 "Steve Teale" <steve.teale britseyeview.com> wrote in message 
 news:hjf9uk$1trp$1 digitalmars.com...

 realize of course does not mean anything. But I'd be interested to hear 
 what the D aficionados think of Go.

 It probably would not suit Andrei.
It's a gimped, obfuscated and immature imitation of D. It's little more than a concurrency-model experiment masquerading as a real language. Also: - As far as I'm concerned, its real name is "Issue 9" (search "google go issue 9"). - It's the Buick/Cadillac/Oldsmobile of computer languages: Garbage that gets attention solely because of the name(s) attached. - Does nothing to change my opinion that Google has done nothing noteworthy outside of search engines and maybe their ad service.
It looks like to me they are making Google Goo for prestige. Search engine, browser, now programming language... Whats next? OS? Laptops? Fast food franchise?
I don't understand all the criticism behind Google's product. Of corporate software producers, Apple and Google are the two ones making products that work reliably and are carefully designed.
They get lots and lots of undeserved attention. Even when the final products are not that great, and occasionally when the people praising them would be hostile towards the same kinds of products from smaller companies. Reception often border on being an outright hysteria. It's mostly the fault of the people who react this way, but both companies put a lot of effort in creating this effect via various kind of marketing too.
Also, a lot of Google's recent software initiatives are really weird stuff with highly questionable value. However, because of the reception mentioned above, they kind of bend the existing software infrastructure and culture around themselves. Not in a good way, either.

For example, I really don't like the idea that Wave is going to be a replacement for the aging email infrastructure. (Which might not happen, but that's how it's marketed.) I don't like the idea of an 8-gig operating system that's designed to run one application. (Not entirely true, but close enough to reality.)

Those things might be of high quality, and they may be reliable in their own way, while still having a negative effect on the software industry as a whole.
Jan 23 2010
parent reply "Nick Sabalausky" <a a.a> writes:
"Roman Ivanov" <isroman-del ete-km.ru> wrote in message 
news:hjftkk$3up$1 digitalmars.com...
 Roman Ivanov Wrote:
 They get lots and lots of undeserved attention. Even when the final 
 products are not that great, and occasionally when the people praising 
 them would be hostile towards the same kinds of products from smaller 
 companies.

 Reception often border on being an outright hysteria. It's mostly the 
 fault of the people who react this way, but both companies put a lot of 
 effort in creating this effect via various kind of marketing too.
Also, a lot of Google's recent software initiatives are really weird stuff with highly questionable value. However, because of the reception mentioned above, they kind of bend the existing software infrastructure and culture around themselves. Not in a good way too. For example, I really don't like the idea that Wave is going to be a replacement for the aging email infrastructure. (Which might not happen, but that's how it's marketed.) I don't like the idea of an 8-gig operating system that's designed to run one application. (Not entirely true, but close enough to reality.) Those things might be of high quality, they may be reliable in their own way, while still having negative effect on software industry as a whole.
Agreed on all the above. And personally, I'd add a few more points:

1. I'd add "cloud computing" to the list of questionable initiatives (or at least questionable outside of certain niche use-cases that I'm sure probably do exist).

2. Their software generally reminds me of Apple software (no offense, or bait, intended to any Mac-users here) in that, IMO:

2.1. They tend to have lousy attention to detail (Google Code looks clean and pretty, and maybe it's reliable, but trying to use it absolutely drives me nuts. Plus, the stuff I mentioned about google maps in another post).

2.2. They're annoyingly slim on configurable settings (stuff I mentioned about google maps in another post, and why in the world they think I should be force-fed a non-standard custom skin in Chrome).

2.3. For their desktop apps, they like to force useless always-resident services onto my system.

3. Heck, Google is a web-oriented company, and I just hate modern web technology. ;)
Jan 23 2010
parent reply Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
Nick Sabalausky wrote:
 "Roman Ivanov" <isroman-del ete-km.ru> wrote in message 
 news:hjftkk$3up$1 digitalmars.com...
 Roman Ivanov Wrote:
 They get lots and lots of undeserved attention. Even when the final 
 products are not that great, and occasionally when the people praising 
 them would be hostile towards the same kinds of products from smaller 
 companies.

 Reception often border on being an outright hysteria. It's mostly the 
 fault of the people who react this way, but both companies put a lot of 
 effort in creating this effect via various kind of marketing too.
Also, a lot of Google's recent software initiatives are really weird stuff with highly questionable value. However, because of the reception mentioned above, they kind of bend the existing software infrastructure and culture around themselves. Not in a good way too. For example, I really don't like the idea that Wave is going to be a replacement for the aging email infrastructure. (Which might not happen, but that's how it's marketed.) I don't like the idea of an 8-gig operating system that's designed to run one application. (Not entirely true, but close enough to reality.) Those things might be of high quality, they may be reliable in their own way, while still having negative effect on software industry as a whole.
Agreed on all the above. And personally, I'd add a few more points: 1. I'd add "cloud computing" to the list of questionable initiatives (or at least questionable outside of certain niche use-cases that I'm sure probably do exist). 2. Their software generally reminds me of Apple software (no offense, or bait, intended to any Mac-users here) in that, IMO: 2.1. They tend to have lousy attention to detail (Google Code looks clean and pretty, and maybe it's reliable, but trying to use it absolutely drives me nuts. Plus, the stuff I mentioned about google maps in another post).
Google and Apple on which planet are you referring to? Far as I can tell they set the _standard_ on attention to detail, and Microsoft and others are desperately catching up!
 2.2. They're annoyingly slim on configurable settings (stuff I mentioned 
 about google maps in another post, and why in the world they think I should 
 be force-fed a non-standard custom skin in Chrome).
Well yeah a better maps application is... oh, wait. Google maps is the best by a huge margin.
 2.3. For their desktop apps, they like to force useless always-resident 
 services onto my system.

 3. Heck, Google is a web-oriented company, and I just hate modern web 
 technology. ;)
3 contradicts 2.3. Did you mean to be sarcastic? Andrei
Jan 23 2010
next sibling parent Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
Andrei Alexandrescu wrote:
 2.2. They're annoyingly slim on configurable settings (stuff I 
 mentioned about google maps in another post, and why in the world they 
 think I should be force-fed a non-standard custom skin in Chrome).
Well yeah a better maps application is... oh, wait. Google maps is the best by a huge margin.
Not to mention the blessed calendar. Google Calendar is the best, though Yahoo's is pretty mean as well. Recently I've needed to use Outlook Calendar and it's like time travel to 1997. "Look, we can offer HTML rendering of your calendar! It has _colors_ in it!!"

Andrei
Jan 23 2010
prev sibling next sibling parent reply "Nick Sabalausky" <a a.a> writes:
"Andrei Alexandrescu" <SeeWebsiteForEmail erdani.org> wrote in message 
news:hjgopn$1i7g$1 digitalmars.com...
 Nick Sabalausky wrote:
 "Roman Ivanov" <isroman-del ete-km.ru> wrote in message 
 news:hjftkk$3up$1 digitalmars.com...
 Roman Ivanov Wrote:
 They get lots and lots of undeserved attention. Even when the final 
 products are not that great, and occasionally when the people praising 
 them would be hostile towards the same kinds of products from smaller 
 companies.

 Reception often border on being an outright hysteria. It's mostly the 
 fault of the people who react this way, but both companies put a lot of 
 effort in creating this effect via various kind of marketing too.
Also, a lot of Google's recent software initiatives are really weird stuff with highly questionable value. However, because of the reception mentioned above, they kind of bend the existing software infrastructure and culture around themselves. Not in a good way too. For example, I really don't like the idea that Wave is going to be a replacement for the aging email infrastructure. (Which might not happen, but that's how it's marketed.) I don't like the idea of an 8-gig operating system that's designed to run one application. (Not entirely true, but close enough to reality.) Those things might be of high quality, they may be reliable in their own way, while still having negative effect on software industry as a whole.
Agreed on all the above. And personally, I'd add a few more points: 1. I'd add "cloud computing" to the list of questionable initiatives (or at least questionable outside of certain niche use-cases that I'm sure probably do exist). 2. Their software generally reminds me of Apple software (no offense, or bait, intended to any Mac-users here) in that, IMO: 2.1. They tend to have lousy attention to detail (Google Code looks clean and pretty, and maybe it's reliable, but trying to use it absolutely drives me nuts. Plus, the stuff I mentioned about google maps in another post).
Google and Apple on which planet are you referring to? Far as I can tell they set the _standard_ on attention to detail, and Microsoft and others are desperately catching up!
First of all, I never said that MS or other companies were good at attention-to-detail, so let's not get into that strawman. Apple *used to* set the standard on attention to detail (or so I've heard), but that era was over when OSX came around. Also, any current attention-to-detail from either Google or Apple is typically limited to visual style. Just some off-the-top-of-my-head examples of shitty attention-to-detail from Apple and Google:

Apple:

- iPods have a Power button, but they cannot be turned off via the so-called Power button. They are turned off by holding "Up" for a few seconds. Easily wins the lifetime award for "Dumbest Interface Design Choice I've Ever Seen".

- How long did it take the iPhone to get basic copy/paste? (I'm not sure, but I know it was a ridiculously long time for such a basic feature. Handspring, for instance, had smartphones with copy/paste from day one, and that was years before the iPhone was ever announced.)

- Last I looked, OSX didn't have any way to set up a light-on-dark color scheme, or any color scheme for that matter, at least not beyond wallpaper and selection color.

- iTunes installs an iPod service which it restarts whenever it detects it's not running (and it polls for it every few seconds), and there's no clean way for people who never use an iPod on their system to permanently disable it.

- iTunes: Right-click a song, and "Show in Windows Explorer" opens *two* Explorer windows to the given directory (with the second one opened roughly 10 seconds or so after the first). That's persisted for quite a while, and I've never seen it in any other program that launches an Explorer window.

- iTunes: Took forever before it finally supported audio CDs that had data for Track 1. (Ex: Many early CD-based videogames.) Never saw another CD player (software or hardware) that had that problem.

- "Error code -(some number here)"

- The "Hover Zoom" dock effect that Apple's been so proud of essentially amounts to turning some of the most frequently-used buttons into moving targets. They obviously didn't think that one through.

Google:

- Google web-apps in general: There have been many times I've had that "Loading" box at the top of the page (the one Google often uses on their web-apps) just hang without ever loading a thing.

- Maps: Search for an address, and when it shows up, it will be centered wrong. *Then* it will slowly scroll to the correct position. Unbelievably hacky "solution" to a trivial bug, and it's persisted for a long time.

- Maps: Cannot switch between the JS version and the non-JS version without going through contortions to force a new session. I've never seen a non-Google website have that problem. Everywhere else, all that's needed is a page refresh.

- Maps: If I search for one address, and get one tick-mark result, why would I want a quarter of the resulting map to be obscured by a bubble telling me exactly what I already know darn well that I *just* searched for? And why do I need that to happen *every single time*? At the very least they could have had the sense to provide a user setting to disable obnoxiousness like that, but...nope, can't have that either.

- Code: See the wonderful attention-to-detail in this screenshot (been like this for quite a while): http://www.semitwist.com/download/GoogleCode.png

- Code: Has "search within" constraints for all issues, open issues, and any of your own issues that are still open, but there's nothing in that box for simply searching your own issues (which is what I want 90% of the time I do a ticket search on a Google Code project). Also, you just plain can't search closed issues without also searching open ones (or if you can, the UI makes it pretty damn obscure).

- Code: When the results of a ticket search span multiple pages, there are no links to get to any page other than "Next" and "Prev". No "Last", no "Page 3", just "Next". Those are pretty damn standard links for paged results, but Google Code doesn't have them.

- Code: A number of very-trivial-to-get-right things are broken when JS is off. Submitting a ticket, for example, is rather buggy with JS off even though there's nothing needed beyond the same damn HTML forms that even novices have been able to write ever since the 90's.

There are other things for both Apple and Google, but those are just the (non-hardware) ones off the top of my head.
 2.2. They're annoyingly slim on configurable settings (stuff I mentioned 
 about google maps in another post, and why in the world they think I 
 should be force-fed a non-standard custom skin in Chrome).
Well yeah a better maps application is... oh, wait. Google maps is the best by a huge margin.
Since when does "best" automatically imply "good"? And sure, I'll grant there *is* a lot of good about google maps (although most of the good stuff has already been copied by the competitors by now), but there's also a lot that's not good (see above).
 2.3. For their desktop apps, they like to force useless always-resident 
 services onto my system.

 3. Heck, Google is a web-oriented company, and I just hate modern web 
 technology. ;)
3 contradicts 2.3.
Let's not get overly literal. "Google is *primarily* a web-oriented company." Better now?
Jan 24 2010
parent reply Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
Nick Sabalausky wrote:
 "Andrei Alexandrescu" <SeeWebsiteForEmail erdani.org> wrote in message 
 news:hjgopn$1i7g$1 digitalmars.com...
 - iPods have a Power button, but they cannot be turned off via the so-called 
 Power button. They are turned off by holding "Up" for a few seconds. Easily 
 wins the lifetime award for "Dumbest Interface Design Choice I've Ever 
 Seen".
http://www.macnn.com/articles/06/05/25/apple.wins.dad.awards/
 - How long did it take the iPhone to get basic copy/paste? (I'm not sure, 
 but I know it was a ridiculously long time for such a basic feature. 
 Handspring, for instance, had smartphones with copy/paste from day one, and 
 that was years before the iPhone was ever announced.)
http://www.appletell.com/apple/comment/apple-wins-eight-product-design-awards-at-cebit/

Brownie points for not going with the mainstream opinion.

Andrei

P.S. Never heard of Handspring.
Jan 24 2010
parent "Nick Sabalausky" <a a.a> writes:
"Andrei Alexandrescu" <SeeWebsiteForEmail erdani.org> wrote in message 
news:hjhl98$932$1 digitalmars.com...
 Nick Sabalausky wrote:
 "Andrei Alexandrescu" <SeeWebsiteForEmail erdani.org> wrote in message 
 news:hjgopn$1i7g$1 digitalmars.com...
 - iPods have a Power button, but they cannot be turned off via the 
 so-called Power button. They are turned off by holding "Up" for a few 
 seconds. Easily wins the lifetime award for "Dumbest Interface Design 
 Choice I've Ever Seen".
http://www.macnn.com/articles/06/05/25/apple.wins.dad.awards/
 - How long did it take the iPhone to get basic copy/paste? (I'm not sure, 
 but I know it was a ridiculously long time for such a basic feature. 
 Handspring, for instance, had smartphones with copy/paste from day one, 
 and that was years before the iPhone was ever announced.)
http://www.appletell.com/apple/comment/apple-wins-eight-product-design-awards-at-cebit/ Brownie points for not going with the mainstream opinion.
"Gone With The Wind" won prestigious awards too. I still hated every minute of it.
 P.S. Never heard of Handspring.
They were one of the biggest manufacturers of third-party PalmOS devices (I miss PalmOS PDAs). My first PDA was a Handspring Visor. Handspring's Treo was one of the earliest smartphones, although that was before people were calling them smartphones.
Jan 24 2010
prev sibling parent reply "Steven E. Harris" <seh panix.com> writes:
Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:

 Far as I can tell they set the _standard_ on attention to detail, and
 Microsoft and others are desperately catching up!
One point comes to mind: keyboard shortcuts. OSX is maddeningly inconsistent and patchy when it comes to driving applications with just the keyboard. Windows (and, hence, Microsoft) is the best in this area. There's usually some way to do everything with the keyboard in a given application. In OSX, there's usually some way to do /something/ with the keyboard.

-- 
Steven E. Harris
Jan 24 2010
next sibling parent reply retard <re tard.com.invalid> writes:
Sun, 24 Jan 2010 10:46:34 -0500, Steven E. Harris wrote:

 Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
 
 Far as I can tell they set the _standard_ on attention to detail, and
 Microsoft and others are desperately catching up!
One point comes to mind: keyboard shortcuts. OSX is maddeningly inconsistent and patchy with being able to use applications using just the keyboard. Windows (and, hence, Microsoft) is the best in this area. There's usually some way to do everything with the keyboard in a given application. In OSX, there's usually some way to do /something/ with the keyboard.
Has Windows really improved that much in that area? I use Linux + xmonad + vim/emacs + mostly command-line applications in xterms. Web browsing is one of the only tasks that seems to require some kind of pointing device, because of Flash crap and crazy navigation systems.
Jan 24 2010
next sibling parent reply "Simen kjaeraas" <simen.kjaras gmail.com> writes:
retard <re tard.com.invalid> wrote:

 Web browsing is
 one of the only tasks that seems to require some kind of pointing device
 because of flash crap and crazy navigation systems.
Use Opera, then. a/q to move between links, tab to change between input fields, shift+arrow keys for spatial navigation. And bunches and bunches of other shortcuts. http://www.opera.com/browser/tutorials/nomouse/ -- Simen
Jan 24 2010
parent retard <re tard.com.invalid> writes:
Sun, 24 Jan 2010 21:34:56 +0100, Simen kjaeraas wrote:

 retard <re tard.com.invalid> wrote:
 
 Web browsing is
 one of the only tasks that seems to require some kind of pointing
 device because of flash crap and crazy navigation systems.
Use Opera, then. a/q to move between links, tab to change between input fields, shift+arrow keys for spatial navigation. And bunches and bunches of other shortcuts. http://www.opera.com/browser/tutorials/nomouse/
Flash applets can't convert key presses into mouse events. Often the applet also needs to be activated by clicking on it with the mouse.
Jan 24 2010
prev sibling parent reply "Steven E. Harris" <seh panix.com> writes:
retard <re tard.com.invalid> writes:

 Has Windows really improved that much on that area?
I don't recall the keyboard support in Windows 3.1 -- the first version of Windows I had used -- but from Windows 95 onward, the basic system, and hence the situation, has been the same: better than anything else I've tried.

You mentioned Web browsers. I can drive Firefox very well with just the keyboard. Sometimes pointing with the mouse is more convenient, but it's never made necessary by the browser.

-- 
Steven E. Harris
Jan 24 2010
parent dsimcha <dsimcha yahoo.com> writes:
== Quote from Steven E. Harris (seh panix.com)'s article
 retard <re tard.com.invalid> writes:
 Has Windows really improved that much on that area?
I don't recall the keyboard support in Windows 3.1 -- the first version of Windows I had used
It was pretty good. I know this because back then mice weren't all that reliable and you had to be able to operate Windows with just the keyboard to fix your mouse driver when something went wrong.
Jan 24 2010
prev sibling parent reply KennyTM~ <kennytm gmail.com> writes:
On Jan 24, 10 23:46, Steven E. Harris wrote:
 Andrei Alexandrescu<SeeWebsiteForEmail erdani.org>  writes:

 Far as I can tell they set the _standard_ on attention to detail, and
 Microsoft and others are desperately catching up!
One point comes to mind: keyboard shortcuts. OSX is maddeningly inconsistent and patchy with being able to use applications using just the keyboard. Windows (and, hence, Microsoft) is the best in this area. There's usually some way to do everything with the keyboard in a given application. In OSX, there's usually some way to do /something/ with the keyboard.
Users can reassign most keyboard shortcuts in Mac OS X.
Jan 24 2010
parent reply "Steven E. Harris" <seh panix.com> writes:
KennyTM~ <kennytm gmail.com> writes:

 Users can reassign most keyboard shortcut in Mac OS X.
That's not what I mean. There are applications where it's not possible to bind any key to certain essential operations, such as "move the cursor from the tree pane thing on the left to the list-like pane in the center, and back again". I played with iCal for a while in the Apple Store one day, amusing myself with punishment. There's a keyboard shortcut to bring up a small "go to date" dialog, which allows one to type in a date. But the enter key doesn't "confirm" the dialog. I could not find any keystroke that had the same effect as pressing "OK", or whatever the accepting button is labeled. Is that a cruel joke? -- Steven E. Harris
Jan 26 2010
parent Michel Fortin <michel.fortin michelf.com> writes:
On 2010-01-26 22:34:38 -0500, "Steven E. Harris" <seh panix.com> said:

 KennyTM~ <kennytm gmail.com> writes:
 
 Users can reassign most keyboard shortcut in Mac OS X.
That's not what I mean. There are applications where it's not possible to bind any key to certain essential operations, such as "move the cursor from the tree pane thing on the left to the list-like pane in the center, and back again". I played with iCal for a while in the Apple Store one day, amusing myself with punishment. There's a keyboard shortcut to bring up a small "go to date" dialog, which allows one to type in a date. But the enter key doesn't "confirm" the dialog. I could not find any keystroke that had the same effect as pressing "OK", or whatever the accepting button is labeled. Is that a cruel joke?
That looks like an oversight: the date entry field seems to eat all keyboard events. I agree it's particularly bad, especially since it's exactly the kind of interface element you'd want to drive with the keyboard. But I must also say I don't see that often in OS X applications.

There's still a way out with the keyboard: activate the VoiceOver spoken interface (Cmd-Fn-F5), then use its shortcuts (Ctrl-Alt-Arrows) to navigate to the control you want. But it's definitely overkill for something that should just work.

-- 
Michel Fortin
michel.fortin michelf.com
http://michelf.com/
Jan 27 2010
prev sibling parent reply Bane <branimir.milosavljevic gmail.com> writes:
 I don't understand all the criticism behind Google's product. Of 
 corporate software producers, Apple and Google are the two ones making 
 products that work reliably and are carefully designed.
 
 Besides, there's not much conspiracy going on. People at Google go off 
 and do their own projects all the time. I don't see Go part of a careful 
 ploy. (For one thing it would be much more polished in that case.) It is 
 also known inside Google circles that Go's authors are not language 
 designers, and that is visible in the quality of the language. (That 
 could change any time; Google does employ quite a few good language 
 designers who may contribute to Go in the future.)
 
I naively and firmly believe, from my standpoint as an individual, that any large corporation is evil. Google is no exception. My understanding is that maybe when they were a startup, they were idealistic. Once they grew to a certain size and amassed a certain amount of cash, everything went the same way as with M$ or any other company. Lots of money comes with lots of power, people attracted to it, etc. The game is to stay on top, which means eating smaller fishes and expanding into other areas (Google burger, anyone?). If not, then they would not be No. 1 and some bigger fish would eat them. That's capitaljism, as Arnold from Red Heat would say.

There is a silly movie I watched a long time ago, with a big evil corporation named EES (Everything Except Shoes). They did just that. I wonder what Google will do in 10 yrs.

Anyway, Go might be one of those few-good-ideas-but-lots-of-crap-recognized-from-the-beginning languages, and that would not be a problem in itself; there are many languages like that, more or less known. But it is the 'Google' prefix that hurts my eyes. Does this make Go any better? Nope. More advertised? Hell yes. It doesn't matter if the authors are doing it in their spare time.

Similarly, there was Chrome, pushed by Google as a better-than-the-rest-but-nobody-asked-for-it browser, as 'we make the world a better place with this'. But there were some issues with privacy and copyright for people who use it, and after reading enough about it, I decided not to use it.

So, what we have here, I think, is one powerful corporation, advertising itself as 'we are not evil like M$', with a lot of influence, pushing a lot of things and recruiting a lot of people for their projects, for their goals. Somebody finances all that. And I bet they have some long-term plans for it. And I don't think they plan to lose money doing it; I think that is their priority. So, conspiracy? Maybe.

As for Go being crappy because its designers are not good enough for the task, and the possibility that Google will at a certain point say 'OK, enough embarrassment, here comes our crack team of developers to help fix it' -- I don't think it's likely. If something is crappy from the beginning, it is easier to start again from scratch than to fix it much later.
Jan 23 2010
next sibling parent "Nick Sabalausky" <a a.a> writes:
"Bane" <branimir.milosavljevic gmail.com> wrote in message 
news:hjfvv2$85c$1 digitalmars.com...
 I naively and firmly believe, from my standpoint as a individual, that any 
 large corporation is evil. Google is no exception.
Hear hear!
 My understanding is that maybe when they were startup, they were 
 idealistic. Once they grew to a certain size, and amounted certain cash, 
 everything goes the same way as with M$ or any other company. Lots of 
 money comes with lots of power, people attracted to it etc. Game is to 
 stay on top, which means eating smaller fishes and expanding to other 
 areas (Google burger anyone?). If not so, then they would not be No1 and 
 some bigger fish would eat them. Thats capitaljism, as Arnold from Red 
 Heat would say.
Yea, something I find quite interesting is that, originally, I liked Google very much. They made the smartest and best web search engine out there, and married it with what, to this day, I consider one of the best examples of web interface design around. But then as soon as they had their IPO they started pumping out 95% junk (though at least Google Maps has some good points), but they were hailed by everyone anyway because "Hey! It's the same great company as before!". The hell it is.
Jan 23 2010
prev sibling next sibling parent reply Rainer Deyke <rainerd eldwood.com> writes:
Bane wrote:
 I naively and firmly believe, from my standpoint as a individual,
 that any large corporation is evil. Google is no exception.
I think there is some truth to that, in the same way that all governments, political parties, and other large organizations are evil. Still, this doesn't mean that all governments or all large corporations are morally equivalent. I'll take the evil of a secular liberal democracy over the evil of a fundamentalist theocratic dictatorship any day.

Google provides free products and services, uses unobtrusive advertising, promotes open standards, and contributes to open source projects. Compared to other corporations of a similar size, they're practically saints. Even if I don't care for their products, or for the direction in which they are trying to push the computer industry.

-- 
Rainer Deyke - rainerd eldwood.com
Jan 23 2010
parent "Nick Sabalausky" <a a.a> writes:
"Rainer Deyke" <rainerd eldwood.com> wrote in message 
news:hjg5gp$i7p$1 digitalmars.com...
 Bane wrote:
 I naively and firmly believe, from my standpoint as a individual,
 that any large corporation is evil. Google is no exception.
I think there is some truth to that, in the same way as all governments, political parties, and other large organizations are evil. Still, this doesn't mean that are governments or all large corporations are morally equivalent. I'll take the evil of a secular liberal democracy over the evil of a fundamentalist theocratic dictatorship any day. Google provides free products and services, uses unobtrusive advertising, promotes open standards, and contributes to open source projects. Compared to other corporations of a similar size, they're practically saints. Even if I don't care for their products, or for the direction in which they are trying to push the computer industry.
That is a good point. (FWIW, The only reason I feel compelled to "me too" this post is as a disclaimer for all my other Google-bashing posts here.)
Jan 23 2010
prev sibling next sibling parent reply Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
Bane wrote:
 I naively and firmly believe
How could these ever go together? I literally stopped reading here. Yet I saw snippets in the reply-to posts and - well I can't tell much about firmness but the naivety is there. Andrei
Jan 23 2010
next sibling parent "Nick Sabalausky" <a a.a> writes:
"Andrei Alexandrescu" <SeeWebsiteForEmail erdani.org> wrote in message 
news:hjgof3$1gqe$2 digitalmars.com...
 Bane wrote:
 I naively and firmly believe
How could these ever go together?
Humor. Deliberate irony. Humility. Disclaimer.
 I literally stopped reading here. Yet I saw snippets in the reply-to posts 
 and - well I can't tell much about firmness but the naivety is there.
Did I miss the memo for National Take-Everything-Literally Day?
Jan 24 2010
prev sibling parent reply Bane <branimir.milosavljevic gmail.com> writes:
Andrei Alexandrescu Wrote:

 Bane wrote:
 I naively and firmly believe
How could these ever go together? I literally stopped reading here. Yet I saw snippets in the reply-to posts and - well I can't tell much about firmness but the naivety is there. Andrei
Well, I am naive when it comes to all that stuff; I haven't been a CEO or a lawyer of any large corporation, so I don't have first-hand info, so I might be wrong or too simplistic. So yep, naive = humble.

On the other hand, what I feel/believe is dictated by my guts, and they say there is nasty shit going on, so I believe it firmly, just because my gut detector is pretty reliable. So, my firm belief is my choice based on what I know and trust - my gut.

So Andrei, are you saying naive people can't have strong beliefs, or just that strong beliefs can't be based on assumptions that are naive? If we put the topic aside here, this might easily become a religious discussion.
Jan 24 2010
parent reply Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
Bane wrote:
 Andrei Alexandrescu Wrote:
 
 Bane wrote:
 I naively and firmly believe
How could these ever go together? I literally stopped reading here. Yet I saw snippets in the reply-to posts and - well I can't tell much about firmness but the naivety is there. Andrei
Well, i am naive when it comes to all that stuff, I havent been CO or lawyer of any large corporation so i dont have first hand info, so I might be wrong or too simplistic. So yep, naive=humble. On the other hand, what I feel/believe is dictated by my guts, and it says there is nasty shit going on, so I believe it firmly, just because my gut detector is pretty reliable. So, my firmly belief is my choice based on what i know and trust - my gut. So Andrei, you say naive people cant have strong beliefs, or just that strong beliefs cant be based on assumptions that are naive? If we put aside topic here, this might easily become religious discussion.
Good point. I guess I said that only because often my own naive beliefs tend to lack strength. I might be wrong here, but naivety and lack of evidence kind of go together. Andrei
Jan 24 2010
parent Bane <branimir.milosavljevic gmail.com> writes:
 Good point. I guess I said that only because often my own naive beliefs 
 tend to lack strength. I might be wrong here, but naivety and lack of 
 evidence kind of go together.
 
 Andrei
A lot of rock-solid or fanatical stuff is based on a lack of evidence, so you can call it naive. Religions are one thing. Mathematics comes to mind (axioms are things that are taken for granted as truth and can't be proven, so there is definitely a lack of evidence there). And most extreme of all is zen, which (in my understanding) states that, whatever conclusion or belief you come up with, you should be 100% fanatical about it and forget anything else. Wanna call some religious freak, Nobel-prize mathematician or angry samurai naive? :D

Well, I've contributed enough off-topic philosophical waste. Sorry guys :)
Jan 24 2010
prev sibling parent "John D" <jdean googling.com> writes:
"Bane" <branimir.milosavljevic gmail.com> wrote in message 
news:hjfvv2$85c$1 digitalmars.com...

 I naively and firmly believe, from my standpoint as a individual, that 
 any large corporation is evil.
If you remove "naively" and change "corporation" to "institution" or even "organization", I would be inclined to agree with you. If you want to do more research on that, go interview some people from the countries of Europe who are faced with the threat of an European Union and I think you will find many who will echo similar sentiments ("We're losing our individuality", etc.).
Jan 29 2010
prev sibling parent reply Michiel Helvensteijn <m.helvensteijn.remove gmail.com> writes:
Bane wrote:

 It looks like to me they are making Google Goo for prestige. Search
 engine, browser, now programming language... Whats next? OS?
Google has designed two operating systems already.
 Laptops?
Google Laptops, Google Phones. Sure. They're just not building the hardware themselves. I myself have a Google Experience phone.
 Fast food franchise?
I was at a programming contest once where Google sponsored the catering. Does that count? :-) -- Michiel Helvensteijn
Jan 23 2010
parent reply Ary Borenszweig <ary esperanto.org.ar> writes:
Michiel Helvensteijn wrote:
 Bane wrote:
 
 It looks like to me they are making Google Goo for prestige. Search
 engine, browser, now programming language... Whats next? OS?
Google has designed two operating systems already.
They also made GWT, which is not a language but a compiler. They are not just engine and browser. :-P
Jan 24 2010
parent reply Bane <branimir.milosavljevic gmail.com> writes:
Ary Borenszweig Wrote:

 Michiel Helvensteijn wrote:
 Bane wrote:
 
 It looks like to me they are making Google Goo for prestige. Search
 engine, browser, now programming language... Whats next? OS?
Google has designed two operating systems already.
They also made GWT, which is not a language but a compiler. They are not just engine and browser. :-P
OMG! (Didn't know that!) They are everywhere! Oh well, they can't be good at everything they do... Either tasty catering or a good language, can't have both...
Jan 23 2010
parent reply "Nick Sabalausky" <a a.a> writes:
"Bane" <branimir.milosavljevic gmail.com> wrote in message 
news:hjg0i8$97j$1 digitalmars.com...
 Ary Borenszweig Wrote:

 Michiel Helvensteijn wrote:
 Bane wrote:

 It looks like to me they are making Google Goo for prestige. Search
 engine, browser, now programming language... Whats next? OS?
Google has designed two operating systems already.
They also made GWT, which is not a language but a compiler. They are not just engine and browser. :-P
OMG! (Didn't know that!) They are everywhere! Oh well, they can't be good at everything they do... Either tasty catering or good language, cant have both...
You know, even though I'm one of the resident Google-haters here, I have to admit, I saw a thing on TV about Google's company cafeteria, and - OMG, I'm jealous of it! So yea, "Yay Google! King of Cafeterias!" Heh :)
Jan 23 2010
parent reply Walter Bright <newshound1 digitalmars.com> writes:
Nick Sabalausky wrote:
 You know, even though I'm one of the resident Google-haters here, I have to 
 admit, I saw a thing on TV about Google's company cafeteria, and - OMG, I'm 
 jealous of it!
I've eaten at the Google cafeteria. It's very nice, and would be a compelling perq to work there. BTW, while I understand your concern about corporations inevitably growing in size until they rule the world, the historical experience is otherwise. Once they reach a certain size, they tend to collapse from inefficiency and bureaucratic ossification. For example, GM! I could list dozens of others.
Jan 23 2010
next sibling parent reply "Nick Sabalausky" <a a.a> writes:
"Walter Bright" <newshound1 digitalmars.com> wrote in message 
news:hjg3b0$elr$1 digitalmars.com...
 Nick Sabalausky wrote:
 You know, even though I'm one of the resident Google-haters here, I have 
 to admit, I saw a thing on TV about Google's company cafeteria, and - 
 OMG, I'm jealous of it!
I've eaten at the Google cafeteria. It's very nice, and would be a compelling perq to work there.
Speaking of good corporate cafeterias, I had an internship a number of years ago at Progressive Insurance, and say what you will about insurance (I know I sure do ;) ), but I loved that cafeteria. Then again, from what I've seen of Google's cafeteria, the Google one puts even Progressive's to shame. And having worked both there and at places that had no cafeteria (not even a third-party one nearby), I feel fairly strongly that the benefit of having something that reliable for lunch cannot be over-emphasized, compared with worrying on a daily basis about "where/when am I going to be able to grab something?" or spending the time to prepare one's own lunches. (But then again, I can be a bit of a worry-wart anyway ;) )
 BTW, while I understand your concern about corporations inevitably growing 
 in size until they rule the world,
Well, I can't speak for other people, but that's not quite the [ethical] problem I have with large corporations. In fact, it's not so much "large corporations" per se that I take issue with as it is "publicly-traded corporations". I just find the whole idea of businesses being literally owned by people whose sole interest in the company is purely financial (not to mention laws that essentially mandate that the corporation hold shareholder financial interests above all other concerns) to be a recipe for social irresponsibility. (Not that I'm socialist or communist or anything like that; I haven't studied or been around such systems enough to form any coherent opinion on them.)
 the historical experience is otherwise. Once they reach a certain size, 
 they tend to collapse from inefficiency and bureaucratic ossification.

 For example, GM! I could list dozens of others.
Well, yea, that is another problem with big corporations. By the way, did you spell that right? That wasn't supposed to be "bureaucratic assification"? ;)
Jan 23 2010
next sibling parent reply Walter Bright <newshound1 digitalmars.com> writes:
Nick Sabalausky wrote:
 By the way, did you spell that right? That wasn't supposed to be 
 "bureaucratic assification"? ;)
"ossification" means turn to stone.
Jan 23 2010
next sibling parent bearophile <bearophileHUGS lycos.com> writes:
Walter Bright:
 "ossification" means turn to stone.
It means turning into bone; for example, when you don't move a hand for months, the joint cartilage starts to turn into bone (os means bone in Latin, and today oss-like roots mean bone in Spanish, Italian, French, etc.).

Bye,
bearophile
Jan 24 2010
prev sibling parent Chad J <chadjoan __spam.is.bad__gmail.com> writes:
Walter Bright wrote:
 Nick Sabalausky wrote:
 By the way, did you spell that right? That wasn't supposed to be
 "bureaucratic assification"? ;)
"ossification" means turn to stone.
It's a joke dude ;)
Jan 24 2010
prev sibling parent reply Bane <branimir.milosavljevic gmail.com> writes:
 By the way, did you spell that right? That wasn't supposed to be 
 "bureaucratic assification"? ;)
 
 
Am I the only Freud person here seeing the ass and not the oss in the above statement?
Jan 24 2010
parent "Nick Sabalausky" <a a.a> writes:
"Bane" <branimir.milosavljevic gmail.com> wrote in message 
news:hjh7o9$2f4h$1 digitalmars.com...
 By the way, did you spell that right? That wasn't supposed to be
 "bureaucratic assification"? ;)
Am I only Freud person here seeing ass and not the oss in above statement?
I'm Sabalausky, not Freud, but I just love the new word "assification" :) (And yes, I know what you meant by Freud. Just another bad joke on my part. ;) )
Jan 24 2010
prev sibling parent reply "John D" <jdean googling.com> writes:
"Walter Bright" <newshound1 digitalmars.com> wrote in message 
news:hjg3b0$elr$1 digitalmars.com...
 Nick Sabalausky wrote:
 You know, even though I'm one of the resident Google-haters here, I 
 have to admit, I saw a thing on TV about Google's company cafeteria, 
 and - OMG, I'm jealous of it!
I've eaten at the Google cafeteria. It's very nice, and would be a compelling perq to work there. BTW, while I understand your concern about corporations inevitably growing in size until they rule the world, the historical experience is otherwise.
 Once they reach a certain size,
And what size is that? And more importantly, how long does it take?
 they tend to collapse from inefficiency and bureaucratic ossification.

 For example, GM! I could list dozens of others.
Is that supposed to be consolation to the little sapling that couldn't grow or have a life because the big tree's roots sucked up all the nutrients and ominous branches blocked the sun?
Jan 29 2010
parent reply Walter Bright <newshound1 digitalmars.com> writes:
John D wrote:
 "Walter Bright" <newshound1 digitalmars.com> wrote in message 
 news:hjg3b0$elr$1 digitalmars.com...
 Nick Sabalausky wrote:
 You know, even though I'm one of the resident Google-haters here, I 
 have to admit, I saw a thing on TV about Google's company cafeteria, 
 and - OMG, I'm jealous of it!
I've eaten at the Google cafeteria. It's very nice, and would be a compelling perq to work there. BTW, while I understand your concern about corporations inevitably growing in size until they rule the world, the historical experience is otherwise.
 Once they reach a certain size,
And what size is that? And more importantly, how long does it take?
I don't have the list handy, but take a look at the largest companies in America (by market capitalization) for each decade. You don't have to go very far back before you stop even recognizing the names.
 they tend to collapse from inefficiency and bureaucratic ossification.

 For example, GM! I could list dozens of others.
Is that supposed to be consolation to the little sapling that couldn't grow or have a life because the big tree's roots sucked up all the nutrients and ominous branches blocked the sun?
The book "The Innovators' Dilemma" lists many stories about small companies successfully competing with established dominant ones. http://www.amazon.com/exec/obidos/ASIN/0060521996/classicempire
Jan 30 2010
next sibling parent reply Bill Baxter <wbaxter gmail.com> writes:
On Sat, Jan 30, 2010 at 10:46 AM, Walter Bright
<newshound1 digitalmars.com> wrote:
 John D wrote:
 "Walter Bright" <newshound1 digitalmars.com> wrote in message
 news:hjg3b0$elr$1 digitalmars.com...
 Nick Sabalausky wrote:
 You know, even though I'm one of the resident Google-haters here, I have
 to admit, I saw a thing on TV about Google's company cafeteria, and - OMG,
 I'm jealous of it!
I've eaten at the Google cafeteria. It's very nice, and would be a compelling perq to work there. BTW, while I understand your concern about corporations inevitably growing in size until they rule the world, the historical experience is otherwise.
 Once they reach a certain size,
And what size is that? And more importantly, how long does it take?
I don't have the list handy, but take a look at the largest companies in America (by market capitalization) for each decade. You don't have to go very far back before you stop even recognizing the names.
 they tend to collapse from inefficiency and bureaucratic ossification.

 For example, GM! I could list dozens of others.
Is that supposed to be consolation to the little sapling that couldn't grow or have a life because the big tree's roots sucked up all the nutrients and ominous branches blocked the sun?
The book "The Innovators' Dilemma" lists many stories about small companies successfully competing with established dominant ones.
Definitely. As soon as a company gets beyond a certain size it is pretty much inevitable that they spend a significant amount of their efforts protecting their current business model. This will almost always mean they will fail to react fast enough to compete with two guys in a garage with a really good idea. --bb
Jan 30 2010
parent reply Bane <branimir.milosavljevic gmail.com> writes:
 Definitely.  As soon as a company gets beyond a certain size it is
 pretty much inevitable that they spend a significant amount of their
 efforts protecting their current business model.  This will almost
 always mean they will fail to react fast enough to compete with two
 guys in a garage with a really good idea.
 
 --bb
Exactly the reason big companies should relocate their employees from cubicles to garages and force them to surf the net all day.
Jan 30 2010
parent "Nick Sabalausky" <a a.a> writes:
"Bane" <branimir.milosavljevic gmail.com> wrote in message 
news:hk2302$2017$1 digitalmars.com...
 Definitely.  As soon as a company gets beyond a certain size it is
 pretty much inevitable that they spend a significant amount of their
 efforts protecting their current business model.  This will almost
 always mean they will fail to react fast enough to compete with two
 guys in a garage with a really good idea.

 --bb
Exactly the reason big companies should relocate their employees from cubicles to garages and force them to surf the net whole day.
I like it! Sign me up for one of those big companies!
Jan 30 2010
prev sibling parent "John D" <jdean googling.com> writes:
"Walter Bright" <newshound1 digitalmars.com> wrote in message 
news:hk1uqu$1omt$1 digitalmars.com...
 John D wrote:
 "Walter Bright" <newshound1 digitalmars.com> wrote in message 
 news:hjg3b0$elr$1 digitalmars.com...
 Nick Sabalausky wrote:
 You know, even though I'm one of the resident Google-haters here, I 
 have to admit, I saw a thing on TV about Google's company cafeteria, 
 and - OMG, I'm jealous of it!
I've eaten at the Google cafeteria. It's very nice, and would be a compelling perq to work there. BTW, while I understand your concern about corporations inevitably growing in size until they rule the world, the historical experience is otherwise.
 Once they reach a certain size,
And what size is that? And more importantly, how long does it take?
I don't have the list handy,
Your "theory" is quite the strawman. Don't worry about it, I'm not calling you out on it. You can "have the last word" if it makes you feel good.
 but take a look at the largest companies in America (by market 
 capitalization) for each decade. You don't have to go very far back 
 before you stop even recognizing the names.
Well, I don't care to continue this line of discussion, for this is a tech group, but doing the "referring thing" rather than addressing the key points is worse than not saying anything, FYI. Your weak "theory" just dropped to zero. (No offense.)

(It's not actually your "theory" that I have an issue with, but rather that you take some surface statistics, like a manager who just reads the monthly reports of numbers and bases decisions on those "controls", when that has little meaning compared to understanding the real issues and knowing how to solve problems or make things better.)
Jan 30 2010
prev sibling parent downs <default_357-line yahoo.de> writes:
Nick Sabalausky wrote:
 - Does nothing to change my opinion that Google has done nothing noteworthy 
 outside of search engines and maybe their ad service.
 
 
Google Maps (not sure if you counted that under search); but their real strength is context-aware integration of different services. Like the way you can get Google Maps results for local businesses from a Google query that includes a street name.
Jan 25 2010
prev sibling next sibling parent Bane <branimir.milosavljevic gmail.com> writes:
retard Wrote:

 Sat, 23 Jan 2010 14:38:20 -0500, Bane wrote:
 
... Some old farts use D1 because they
 highly respect the D-man art and Walter's ability to co-operative and
 communicate with the community (which indeed feels really good if you
 have zero experience on other language communities). They do not fancy
 the new D2 features that much. And let's be honest, D1 is terribly

Hey! This old fart here prefers D1 instead of D2 because: - it has enough features to satisfy his needs, both in language and in Phobos - it is known how it works, it works correct, and there are docs describing it - there are less things in it, so it is easier to learn and play with it Hell, if C is useful tool and you cant get simpler than it (asembler excluded), then D1 is full fledged corporate tool with great std lib. So please, don't flame D1, or youll have some angry old men on your back :)
Please don't take my posts too seriously =)
Darn! I have just canceled tonight's plans and got worked up to write some anti-retard flaming posts... Oh well, hope there is something on TV worth watching... :)
Jan 23 2010
prev sibling next sibling parent reply Walter Bright <newshound1 digitalmars.com> writes:
Steve Teale wrote:
 But I'd be
 interested to hear what the D aficionados think of Go.
http://www.digitalmars.com/d/archives/digitalmars/D/Go_rant_103530.html
Jan 23 2010
parent Steve Teale <steve.teale britseyeview.com> writes:
Walter Bright Wrote:

 Steve Teale wrote:
 But I'd be
 interested to hear what the D aficionados think of Go.
http://www.digitalmars.com/d/archives/digitalmars/D/Go_rant_103530.html
Thanks Walter, but I've had my fill now. I started a thread on golang-nuts asking for a poll on the most irritating feature of Go syntax - got a very defensive response. So we can all participate - that would be fun ;=)
Jan 23 2010
prev sibling next sibling parent reply bearophile <bearophileHUGS lycos.com> writes:
Recently I've seen this: is it possible to write an equally short & safe & easy
program in D2 (using the concurrency features being developed)?
http://grammerjack.blogspot.com/2010/01/writing-bittorrent-client-in-go.html
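
I haven't tried it, but from the message-passing primitives that have been described for std.concurrency (spawn, send, receive), I'd imagine the fetch-pieces-in-parallel part could be sketched roughly like this. To be clear, fetchPiece, numPieces and the fake payload are just placeholders I invented - only the three primitives are the proposed API, and this is nothing like a full client:

import std.concurrency;
import std.stdio;

// Hypothetical worker: fetches one piece and mails it back to the
// spawning thread as an immutable message (no shared mutable state).
void fetchPiece(Tid owner, int index)
{
    // Real network I/O would go here; we just fake a payload.
    send(owner, index, "piece data");
}

void main()
{
    enum numPieces = 4;

    // One isolated thread per piece.
    foreach (i; 0 .. numPieces)
        spawn(&fetchPiece, thisTid, i);

    // Collect the results; receive() blocks until a message arrives.
    foreach (collected; 0 .. numPieces)
        receive((int index, string data) {
            writefln("got piece %s (%s bytes)", index, data.length);
        });
}

If it really ends up looking like that, the collecting loop needs no locks at all, which is - on paper at least - quite close to the goroutines-plus-channels version.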

(In computer language history it has often happened that a worse language
(let's say Go) becomes more widespread than a "better" one (let's say D2), so
there's no guarantee that D2 will beat Go even if D2 is "better". But being
about as good or better is not bad anyway :-) ).

Bye,
bearophile
Jan 24 2010
parent reply "John D" <jdean googling.com> writes:
"bearophile" <bearophileHUGS lycos.com> wrote in message 
news:hjh61f$2c62$1 digitalmars.com...
 Recently I've seen this, is it possible to write an equally short & 
 safe & easy program in D2 (using the concurrency features being 
 developed)?
 http://grammerjack.blogspot.com/2010/01/writing-bittorrent-client-in-go.html

 (In computer language history it's happened often that a worse language 
 (let's say Go) becomes more widespread than a "better" one ).
I'm not sure that is true. Actually, I think it is not true. People say that about Windows all the time too. While I KNOW much better OSes are possible, there really is (or was all this time - I haven't used the new Macs, but I wouldn't want to do Unix-style programming anyway, so its underpinnings do not appeal to me, FWIW) no comparable competitor. On the flip side, I think C++ is a crappy language, but it's still the best one out there (IMO, until I implement my own, of course), and I use it - though I liked using it better before I knew so much about it.
Jan 30 2010
parent reply Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
John D wrote:
 "bearophile" <bearophileHUGS lycos.com> wrote in message 
 news:hjh61f$2c62$1 digitalmars.com...
 Recently I've seen this, is it possible to write an equally short &
  safe & easy program in D2 (using the concurrency features being 
 developed)? 
 http://grammerjack.blogspot.com/2010/01/writing-bittorrent-client-in-go.html
 
 
 (In computer language history it's happened often that a worse
 language (let's say Go) becomes more widespread than a "better" one
 ).
 
I'm not sure that is true. Actually, I think it is not true. People say that about Windows all the time too. While I KNOW much better OSes are possible, there really is (or was all this time - I haven't used the new Macs, but I wouldn't want to do Unix-style programming anyway, so its underpinnings do not appeal to me, FWIW) no comparable competitor. On the flip side, I think C++ is a crappy language, but it's still the best one out there (IMO, until I implement my own, of course), and I use it - though I liked using it better before I knew so much about it.
As Napoleon said - every soldier carries a marshal's baton :o).

Andrei
Jan 30 2010
parent reply "John D" <jdean googling.com> writes:
"Andrei Alexandrescu" <SeeWebsiteForEmail erdani.org> wrote in message 
news:hk1ksq$153t$1 digitalmars.com...
 John D wrote:
 "bearophile" <bearophileHUGS lycos.com> wrote in message 
 news:hjh61f$2c62$1 digitalmars.com...
 Recently I've seen this, is it possible to write an equally short &
  safe & easy program in D2 (using the concurrency features being 
 developed)? 
 http://grammerjack.blogspot.com/2010/01/writing-bittorrent-client-in-go.html


 (In computer language history it's happened often that a worse
 language (let's say Go) becomes more widespread than a "better" one
 ).
I'm not sure that is true. Actually, I think it is not true. People say that about Windows all the time too. While I KNOW much better OSes are possible, there really is (or was all this time - I haven't used the new Macs, but I wouldn't want to do Unix-style programming anyway, so its underpinnings do not appeal to me, FWIW) no comparable competitor. On the flip side, I think C++ is a crappy language, but it's still the best one out there (IMO, until I implement my own, of course), and I use it - though I liked using it better before I knew so much about it.
As Napoleon said - every soldier carries a marshal's baton :o).
I could look that up, but it's easier to just ask you to explain your curt comment and how it relates to my post. Pray tell!
Jan 30 2010
parent reply Ellery Newcomer <ellery-newcomer utulsa.edu> writes:
On 01/30/2010 11:46 PM, John D wrote:
 best one out there (IMO, until I implement my own, of course) <<<<<<
http://en.wikipedia.org/wiki/Lake_Wobegone_effect
Jan 31 2010
parent Bane <branimir.milosavljevic gmail.com> writes:
Ellery Newcomer Wrote:

 On 01/30/2010 11:46 PM, John D wrote:
 best one out there (IMO, until I implement my own, of course) <<<<<<
http://en.wikipedia.org/wiki/Lake_Wobegone_effect
Very interesting and funny article.
Feb 01 2010
prev sibling parent Bane <branimir.milosavljevic gmail.com> writes:
Rainer Deyke Wrote:

 Bane wrote:
 I naively and firmly believe, from my standpoint as an individual,
 that any large corporation is evil. Google is no exception.
I think there is some truth to that, in the same way that all governments, political parties, and other large organizations are evil. Still, this doesn't mean that all governments or all large corporations are morally equivalent. I'll take the evil of a secular liberal democracy over the evil of a fundamentalist theocratic dictatorship any day.

Google provides free products and services, uses unobtrusive advertising, promotes open standards, and contributes to open source projects. Compared to other corporations of a similar size, they're practically saints - even if I don't care for their products, or for the direction in which they are trying to push the computer industry.
When it comes to the search engine, mail, and maps, I agree 100%. I use all of them exclusively because they are higher quality and less intrusive than the alternatives. OTOH, I am very careful and aware of what mail I send/receive and what search strings I type in, just in case 'free' in terms of money turns out to 'cost too much' in terms of privacy and freedom.
Jan 24 2010