
digitalmars.D - D - more power than (really) needed!

reply bls <lietz wanadoo.fr> writes:
Since I am always looking for new friends <g>, and because my opinion 
does not fit into the "D - more or less power than C++" discussion, 
here is another opinion: D - more power than (really) needed!

Preface:
I guess most people here come from C++/Java .. my background 
(Modula/Oberon/4GL) is different.
Ergo: I have another view regarding D.
So why does D have enough power?

Templates are nice to have, but necessary? No. Let me quote Bertrand Meyer:
~~~ "Inheritance is more powerful than using templates"
Yes, I know Eiffel supports templates, but the point is that everything 
can also be done using inheritance. And hey, we have GC.
Also, I feel using templates is like learning a dialect. (Well, D's 
implementation is at least readable.)

The Oberon language spec has 18 pages, is extremely simple, and
enables you to create a combined code/text/graphics editor, using 
inheritance and a good choice of patterns, which has much more power 
than, let's say, Scintilla.
So I wonder: how can this be done with such a simple language, and
what does it mean for D? D has more power than REALLY needed!

One of the things that I found really impressive (and unexpected) is 
that D code is readable. Good Design!

Just my opinion.
Björn Lietz-Spendig
SIMPLE IS BEAUTIFUL
Mar 04 2006
next sibling parent reply Hasan Aljudy <hasan.aljudy gmail.com> writes:
bls wrote:
 Since I am always looking for new friends <g>, and because my opinion 
 does not fit into the "D - more or less power than C++" discussion, 
 here is another opinion: D - more power than (really) needed!
 
 Preface:
 I guess most people here come from C++/Java .. my background 
 (Modula/Oberon/4GL) is different.
 Ergo: I have another view regarding D.
 So why does D have enough power?
 
 Templates are nice to have, but necessary? No. Let me quote Bertrand Meyer:
 ~~~ "Inheritance is more powerful than using templates"
 Yes, I know Eiffel supports templates, but the point is that everything 
 can also be done using inheritance. And hey, we have GC.
 Also, I feel using templates is like learning a dialect. (Well, D's 
 implementation is at least readable.)
 
 The Oberon language spec has 18 pages, is extremely simple, and
 enables you to create a combined code/text/graphics editor, using 
 inheritance and a good choice of patterns, which has much more power 
 than, let's say, Scintilla.
 So I wonder: how can this be done with such a simple language, and
 what does it mean for D? D has more power than REALLY needed!
 
 One of the things that I found really impressive (and unexpected) is 
 that D code is readable. Good design!
 
 Just my opinion.
 Björn Lietz-Spendig
 SIMPLE IS BEAUTIFUL
 
 
 

It's ironic that the overview page on the Digital Mars website makes comments about C++ having many "islands" .. I think back then D didn't have templates. Well, now it does, and those comments are as applicable to D as they are to C++.

<quote src=http://www.digitalmars.com/d/overview.html>
C++ programmers tend to program in particular islands of the language, i.e. getting very proficient using certain features while avoiding other feature sets. While the code is usually portable from compiler to compiler, it can be hard to port it from programmer to programmer. A great strength of C++ is that it can support many radically different styles of programming - but in long term use, the overlapping and contradictory styles are a hindrance.
</quote>
Mar 04 2006
parent Lars Ivar Igesund <larsivar igesund.net> writes:
Hasan Aljudy wrote:

 bls wrote:
 Since I am always looking for new friends <g>, and because my opinion
 does not fit into the "D - more or less power than C++" discussion,
 here is another opinion: D - more power than (really) needed!
 
 Preface:
 I guess most people here come from C++/Java .. my background
 (Modula/Oberon/4GL) is different.
 Ergo: I have another view regarding D.
 So why does D have enough power?
 
 Templates are nice to have, but necessary? No. Let me quote Bertrand Meyer:
 ~~~ "Inheritance is more powerful than using templates"
 Yes, I know Eiffel supports templates, but the point is that everything
 can also be done using inheritance. And hey, we have GC.
 Also, I feel using templates is like learning a dialect. (Well, D's
 implementation is at least readable.)
 
 The Oberon language spec has 18 pages, is extremely simple, and
 enables you to create a combined code/text/graphics editor, using
 inheritance and a good choice of patterns, which has much more power
 than, let's say, Scintilla.
 So I wonder: how can this be done with such a simple language, and
 what does it mean for D? D has more power than REALLY needed!
 
 One of the things that I found really impressive (and unexpected) is
 that D code is readable. Good design!
 
 Just my opinion.
 Björn Lietz-Spendig
 SIMPLE IS BEAUTIFUL
 
 
 

It's ironic that the overview page on the Digital Mars website makes comments about C++ having many "islands" .. I think back then D didn't have templates. Well, now it does, and those comments are as applicable to D as they are to C++.

<quote src=http://www.digitalmars.com/d/overview.html>
C++ programmers tend to program in particular islands of the language, i.e. getting very proficient using certain features while avoiding other feature sets. While the code is usually portable from compiler to compiler, it can be hard to port it from programmer to programmer. A great strength of C++ is that it can support many radically different styles of programming - but in long term use, the overlapping and contradictory styles are a hindrance.
</quote>

I have to disagree that this is yet a problem. In C++, templates look radically different from the rest of the language. Templates in D actually look like the rest of the language, except that you have to use recursion to get the advanced stuff working. (I know that I made a similar argument against templates three years ago, but I've changed my mind...)

I think that Walter has been good at adding features that are easy to grok even for those new to the language; they don't really clash syntactically.
Mar 04 2006
prev sibling next sibling parent Stewart Gordon <smjg_1998 yahoo.com> writes:
bls wrote:
<snip>
 Templates are nice to have, but necessary? No. Let me quote Bertrand Meyer:
 ~~~ "Inheritance is more powerful than using templates"

Having both must be even more powerful then.
 Yes, I know Eiffel supports templates, but the point is that everything 
 can also be done using inheritance. And hey, we have GC.

Would you care to supply an example?

<snip>
 The Oberon language spec has 18 pages, is extremely simple, and
 enables you to create a combined code/text/graphics editor, using 
 inheritance and a good choice of patterns, which has much more power 
 than, let's say, Scintilla.
 So I wonder: how can this be done with such a simple language, and
 what does it mean for D? D has more power than REALLY needed!

There are Turing-complete languages that are even simpler than this.

Stewart.

-- 
-----BEGIN GEEK CODE BLOCK-----
Version: 3.1
GCS/M d- s:- C++ a->--- UB P+ L E W++ N+++ o K- w++ O? M V? PS- PE- Y? PGP- t- 5? X? R b DI? D G e++>++++ h-- r-- !y
------END GEEK CODE BLOCK------

My e-mail is valid but not my primary mailbox. Please keep replies on the 'group where everyone may benefit.
Mar 06 2006
prev sibling parent reply "Craig Black" <cblack ara.com> writes:
"bls" <lietz wanadoo.fr> wrote in message 
news:dubqrg$10ad$1 digitaldaemon.com...
 Since I am always looking for new friends <g>, and because my opinion 
 does not fit into the "D - more or less power than C++" discussion, 
 here is another opinion: D - more power than (really) needed!

 Preface:
 I guess most people here come from C++/Java .. my background 
 (Modula/Oberon/4GL) is different.
 Ergo: I have another view regarding D.
 So why does D have enough power?

 Templates are nice to have, but necessary? No. Let me quote Bertrand Meyer:
 ~~~ "Inheritance is more powerful than using templates"
 Yes, I know Eiffel supports templates, but the point is that everything can 
 also be done using inheritance. And hey, we have GC.
 Also, I feel using templates is like learning a dialect. (Well, D's 
 implementation is at least readable.)

 The Oberon language spec has 18 pages, is extremely simple, and
 enables you to create a combined code/text/graphics editor, using 
 inheritance and a good choice of patterns, which has much more power than, 
 let's say, Scintilla.
 So I wonder: how can this be done with such a simple language, and
 what does it mean for D? D has more power than REALLY needed!

 One of the things that I found really impressive (and unexpected) is that 
 D code is readable. Good design!

 Just my opinion.
 Björn Lietz-Spendig
 SIMPLE IS BEAUTIFUL

If inheritance is so powerful, then why did Java adopt generics? If Oberon is so great, why doesn't everyone use it? The purist OOP ideology is a narrow-minded perspective on programming. Just because you don't use or understand generic programming does not mean that it is not incredibly useful and/or necessary.

A lot of the bias towards OOP purism comes from Java vs. C++ comparisons, which are much more convincing than Anything vs. D. Hopefully the simplicity and power of D can help to eliminate OOP purism.

-Craig
Mar 08 2006
parent reply "Walter Bright" <newshound digitalmars.com> writes:
"Craig Black" <cblack ara.com> wrote in message 
news:dun68p$30kv$1 digitaldaemon.com...
 A lot of the bias towards OOP purism comes from Java vs. C++ comparisons, 
 much more convincing than Anything vs. D.  Hopefully the simplicity and 
 power of D can help to eliminate OOP purism.

The trouble with OOP is, well, not everything is an object. For example, take the trig function sin(x). It's not an object. Of course, we could bash it into being an object, but that doesn't accomplish anything but obfuscation.

Sometimes, I think C++ went hard over in the opposite direction - nothing is an object. OOP programming seems to be regarded as "oh, so 90's" by the modern C++ crowd.

Every decade seems to have its programming buzzword:

60's: artificial intelligence
70's: structured programming
80's: user-friendly
90's: object oriented programming
00's: metaprogramming
Mar 08 2006
next sibling parent reply Sebastián E. Peyrott <as7cf yahoo.com> writes:
In article <duogj1$22ij$1 digitaldaemon.com>, Walter Bright says...
"Craig Black" <cblack ara.com> wrote in message 
news:dun68p$30kv$1 digitaldaemon.com...
 A lot of the bias towards OOP purism comes from Java vs. C++ comparisons, 
 much more convincing than Anything vs. D.  Hopefully the simplicity and 
 power of D can help to eliminate OOP purism.

The trouble with OOP is, well, not everything is an object. For example, take the trig function sin(x). It's not an object. Of course, we could bash it into being an object, but that doesn't accomplish anything but obfuscation.

Sometimes, I think C++ went hard over in the opposite direction - nothing is an object. OOP programming seems to be regarded as "oh, so 90's" by the modern C++ crowd.

Every decade seems to have its programming buzzword:

60's: artificial intelligence
70's: structured programming
80's: user-friendly
90's: object oriented programming
00's: metaprogramming

Then, it may be in D's best interest to start thinking about which will be the 10's buzzword ;)

-- 
Sebastián.
Mar 08 2006
next sibling parent Sean Kelly <sean f4.ca> writes:
Sebastián E. Peyrott wrote:
 In article <duogj1$22ij$1 digitaldaemon.com>, Walter Bright says...
 
 Then, it may be in D's best interest to start thinking which will be the 10's
 buzzword ;)

"Concurrent." ;-) Sean
Mar 09 2006
prev sibling parent reply Anders F Björklund <afb algonet.se> writes:
Sebastián E. Peyrott wrote:

Every decade seems to have its programming buzzword:

60's: artificial intelligence
70's: structured programming
80's: user-friendly
90's: object oriented programming
00's: metaprogramming 

Then, it may be in D's best interest to start thinking which will be the 10's buzzword ;)

I am afraid it will be: "virtualization"

--anders
Mar 09 2006
parent reply Bruno Medeiros <daiphoenixNO SPAMlycos.com> writes:
Anders F Björklund wrote:
 Sebastián E. Peyrott wrote:
 
 Every decade seems to have its programming buzzword:

 60's: artificial intelligence
 70's: structured programming
 80's: user-friendly
 90's: object oriented programming
 00's: metaprogramming 

Then, it may be in D's best interest to start thinking which will be the 10's buzzword ;)

I am afraid it will be: "virtualization" --anders

What do you mean? Isn't virtualization at the core a hardware concept?

-- 
Bruno Medeiros - CS/E student
http://www.prowiki.org/wiki4d/wiki.cgi?BrunoMedeiros#D
Mar 09 2006
next sibling parent reply Anders F Björklund <afb algonet.se> writes:
Bruno Medeiros wrote:

 I am afraid it will be: "virtualization"

What do you mean? Isn't virtualization at the core a hardware concept?

Like this: http://www.kernelthread.com/publications/virtualization/

In this case: D#, or other bytecode?

--anders
Mar 09 2006
parent reply Bruno Medeiros <daiphoenixNO SPAMlycos.com> writes:
Anders F Björklund wrote:
 Bruno Medeiros wrote:
 
 I am afraid it will be: "virtualization"

What do you mean? Isn't virtualization at the core a hardware concept?

Like this: http://www.kernelthread.com/publications/virtualization/ In this case: D#, or other bytecode ? --anders

D progs running in a VM? It's an issue orthogonal to the programming language itself.

-- 
Bruno Medeiros - CS/E student
http://www.prowiki.org/wiki4d/wiki.cgi?BrunoMedeiros#D
Mar 11 2006
parent Anders F Björklund <afb algonet.se> writes:
Bruno Medeiros wrote:

 D progs running in a VM? It's an issue orthogonal to the programming 
 language itself.

Maybe so, but both Java and C# make a big deal out of running in a VM. If everything is "going managed", then that leaves D "with assembler"?

What I meant was that the trend seems to be towards scripting languages and virtual machines. Especially with WinFX looming around the corner? (What if C++ and Win32 become "the sandbox", and C# and WinFX are the real OS interface? That could leave current apps where 16-bit ones are now.)

Just a thought. (I happen to like assembler, plain old C - and D too...) But a D frontend to a CLR/JVM backend would be a neat thing to have?

--anders
Mar 12 2006
prev sibling parent Sean Kelly <sean f4.ca> writes:
Bruno Medeiros wrote:
 
 What do you mean? Isn't virtualization at the core a hardware concept?

Generalized abstraction (i.e. virtualization) has become a pretty popular term recently. I think this may be a product of security paranoia, as "virtualized" is becoming synonymous with "safe."

Sean
Mar 09 2006
prev sibling next sibling parent reply Georg Wrede <georg.wrede nospam.org> writes:
Walter Bright wrote:
 "Craig Black" <cblack ara.com> wrote
 
 A lot of the bias towards OOP purism comes from Java vs. C++ 
 comparisons, much more convincing than Anything vs. D. Hopefully 
 the simplicity and power of D can help to eliminate OOP purism.

The trouble with OOP is, well, not everything is an object. For example, take the trig function sin(x). It's not an object. Of course, we could bash it into being an object, but that doesn't accomplish anything but obfuscation. Sometimes, I think C++ went hard over in the opposite direction - nothing is an object. OOP programming seems to be regarded as "oh, so 90's" by the modern C++ crowd. Every decade seems to have its programming buzzword: 60's: artificial intelligence 70's: structured programming 80's: user-friendly 90's: object oriented programming 00's: metaprogramming

I sometimes view the programming language scene as a small forest. We have massive cross-pollination between species there, new "pure" species come up every now and then, and once they grow, other new species start getting DNA from them too. In time most of the good ideas get spread around. And those that can receive new DNA in adulthood stay fit and prosper.

Conversely, what one does see is old, big trees that are dry and hollow, just rotting or drying as they stand. One can even see trees die suddenly, felled by the wind.

But like in the real world, one never sees the moment a language is born. Only when it has grown enough to get one's attention do we see it. With D we're pretty close. With Perl, I happened to stumble on a copy of Perl when it was where D is now, some 4 years ago. I thought "gee, nice idea", but never did it cross my mind that Perl "would take the world".

Heh, Practical Extraction and Reporting was all over the pages; the word Perl looked sort-of incidental in comparison. In spite of being big (as in huge) today, Perl is also a thriving language. One thing that [come to think of it] advertises this is the version numbering. On my 0.8GHz laptop, Perl is 5.8.6. Being the pathological anarchist that Larry Wall is, he couldn't care less about striving for nice round version numbers. Which I think is actually an excellent principle. As a matter of fact, such numbers would only become a dragstone if one hopes for unhindered renewal and improvement.

---

Today, advertising major version numbers (and shoving the minors under the rug) is corporate. As in Oracle, Windows, Solaris. Nobody seems to mind if this or that script contains a line requiring at least version x.x.374 of Perl. Folks just shrug and download it. Look at Apple.

If we, with D, can maintain backwards compatibility (no more than what Walter -- and we all -- currently aim for), then such would not be a problem here either. We could skip the "major version every 2 years, bug fixes in between" mantra, and make D more into a pipeline -- like a continuous process. (We're on the Net: no need to plan the manuals, the boxes, and the ad campaign 18 months in advance.)

This sounds daring, I admit, but we've seen it done. And if we do it on purpose (vs. ending up there), then we can do it in style and elegance. It is for this that we have the /deprecated/ keyword. If we only made major steps, who'd care about that -- there'd be so much else to plan ahead when switching to the new version at the sweat shop. (And most of the existing software would stay with the old version anyway.)

A three-level numbering is handy. The last position changes each time folks would want to read the Fixed list, the middle changes whenever they might want to review the readme, and of course the first position changes whenever major things are implemented. Odd middles for developers, even for everyone.

---

But anyhow, the fact remains: we don't need to do D in discrete steps. We can do a slope. Any grad school math teacher can write a *proof* (using differential calculus) that steps lose compared to the continuous. So it's not just good practice (especially in the modern, exponentially accelerating world), but also theoretically provable.
Mar 09 2006
next sibling parent Anders F Björklund <afb algonet.se> writes:
Georg Wrede wrote:

 Heh, Practical Extraction and Reporting was all over the pages, the word 
 Perl was sort-of incidental looking in comparison. In spite of being big 
 (as in huge) today, Perl is also a thriving language. One thing that 
 [come to think of it] advertises this is the version numbering. On my 
 0.8GHz laptop, Perl is 5.8.6. Being the pathological anarchist that 
 Larry Wall is, he couldn't care less about striving to nice round 
 version numbers. Which I think is actually an excellent principle. As a 
 matter of fact, such would only become a dragstone if one hopes for 
 unhindered renewal and improvement.

Larry Wall is actually quite careful about numbers and language...
As in http://www.perl.com/pub/a/2000/10/23/soto2000.html ("TPC4")

But unfortunately, Perl has been losing some ground to Python lately. At least as a system administration language, which is where I use it. Hopefully, Parrot will change all of that nonsense :-)
http://dev.perl.org/perl6/

--anders
Mar 09 2006
prev sibling parent Georg Wrede <georg.wrede nospam.org> writes:
Georg Wrede wrote:
 
 If we, with D, can maintain backwards compatibility (no more than what 
 Walter -- and we all, currently aim to), then such would not be a 
 problem here either. We could skip the "Major version every 2 years, bug 
 fixes in between" mantra, and make D more into a pipeline -- like a 
 continuous process. (We're on the Net: no need to plan the manual books 
 and boxes, and the ad campaign 18 months in advance.)
 
 This sounds daring, I admit, but we've seen it done. And if we do it on 
 purpose (vs. ending up there), then we can do it in style and elegance. 
 It is for this thing we have the /deprecated/ keyword. If we only made 
 major steps, who'd care about that -- there'd be so much else to plan 
 ahead when switching to the new version at the sweat shop. (And most of 
 the existing software would stay updated with the old version anyway.)

Knowing that all post-1.0 compilers are _guaranteed_ to be available on the net _indefinitely_ is one major factor for today's code shops when they decide on the next language.
Mar 10 2006
prev sibling next sibling parent reply Deewiant <deewiant.doesnotlike.spam gmail.com> writes:
Walter Bright wrote:
 The trouble with OOP is, well, not everything is an object. For example, 
 take the trig function sin(x). It's not an object. Of course, we could bash 
 it into being an object, but that doesn't accomplish anything but 
 obfuscation.
 

sin(x) isn't an object, but x is, and sin can be made a property of that object.
Mar 09 2006
parent reply Oskar Linde <oskar.lindeREM OVEgmail.com> writes:
Deewiant wrote:
 Walter Bright wrote:
 The trouble with OOP is, well, not everything is an object. For example, 
 take the trig function sin(x). It's not an object. Of course, we could bash 
 it into being an object, but that doesn't accomplish anything but 
 obfuscation.

sin(x) isn't an object, but x is, and sin can be made a property of that object.

How would you define a binary mathematical function then? atan2(y,x) and binomial(n,k), for instance.

/Oskar
Mar 09 2006
next sibling parent Hasan Aljudy <hasan.aljudy gmail.com> writes:
Oskar Linde wrote:
 Deewiant wrote:
 
 Walter Bright wrote:

 The trouble with OOP is, well, not everything is an object. For 
 example, take the trig function sin(x). It's not an object. Of 
 course, we could bash it into being an object, but that doesn't 
 accomplish anything but obfuscation.

sin(x) isn't an object, but x is, and sin can be made a property of that object.

 How would you define a binary mathematical function then? atan2(y,x) and binomial(n,k), for instance. /Oskar

Sure .. (y,x) is an object, and (n,k) is also an object.

Not that I know what binomial(n,k) means, but since (n,k) is the data you're working with, this data can be the object.

If you're referring to this: http://mathworld.wolfram.com/BinomialCoefficient.html
then binomial(n,k) means "n choose k", which reads like the message "choose k" sent to the object "n" ;)

n.choose(k);

I think, for atan2, x and y are coordinates, no? Hence (x,y) is a point.
Mar 09 2006
prev sibling parent reply Deewiant <deewiant.doesnotlike.spam gmail.com> writes:
Oskar Linde wrote:
 Deewiant wrote:
 Walter Bright wrote:
 The trouble with OOP is, well, not everything is an object. For
 example, take the trig function sin(x). It's not an object. Of
 course, we could bash it into being an object, but that doesn't
 accomplish anything but obfuscation.

sin(x) isn't an object, but x is, and sin can be made a property of that object.

 How would you define a binary mathematical function then? atan2(y,x) and binomial(n,k), for instance. /Oskar

My reply was somewhat tongue-in-cheek - I suppose I should have added a smiley. Of course, not everything is an object. Although binomial(n, k) I'd define as n.choose(k). <g>
Mar 09 2006
parent reply Hasan Aljudy <hasan.aljudy gmail.com> writes:
Deewiant wrote:
 Oskar Linde wrote:
 
Deewiant wrote:

Walter Bright wrote:

The trouble with OOP is, well, not everything is an object. For
example, take the trig function sin(x). It's not an object. Of
course, we could bash it into being an object, but that doesn't
accomplish anything but obfuscation.

sin(x) isn't an object, but x is, and sin can be made a property of that object.

 How would you define a binary mathematical function then? atan2(y,x) and binomial(n,k), for instance. /Oskar

My reply was somewhat tongue in cheek - I suppose I should have added a smiley. Of course everything is not an object. Although binomial(n, k) I'd define as n.choose(k). <g>

n.choose(k) is actually /less/ confusing than binomial(n,k)
Mar 09 2006
parent Kyle Furlong <kylefurlong gmail.com> writes:
Hasan Aljudy wrote:
 Deewiant wrote:
 Oskar Linde wrote:

 Deewiant wrote:

 Walter Bright wrote:

 The trouble with OOP is, well, not everything is an object. For
 example, take the trig function sin(x). It's not an object. Of
 course, we could bash it into being an object, but that doesn't
 accomplish anything but obfuscation.

sin(x) isn't an object, but x is, and sin can be made a property of that object.

 How would you define a binary mathematical function then? atan2(y,x) and binomial(n,k), for instance. /Oskar

My reply was somewhat tongue in cheek - I suppose I should have added a smiley. Of course everything is not an object. Although binomial(n, k) I'd define as n.choose(k). <g>

n.choose(k) is actually /less/ confusing than binomial(n,k)

I disagree; think about how you would do a math problem. It would not involve this sort of syntax.
Mar 09 2006
prev sibling parent reply Hasan Aljudy <hasan.aljudy gmail.com> writes:
Walter Bright wrote:
 "Craig Black" <cblack ara.com> wrote in message 
 news:dun68p$30kv$1 digitaldaemon.com...
 
A lot of the bias towards OOP purism comes from Java vs. C++ comparisons, 
much more convincing than Anything vs. D.  Hopefully the simplicity and 
power of D can help to eliminate OOP purism.

The trouble with OOP is, well, not everything is an object. For example, take the trig function sin(x). It's not an object. Of course, we could bash it into being an object, but that doesn't accomplish anything but obfuscation.

That's because we've been taught math in a procedural way ;)

Ideally, x, the angle, would be an object, and sin would be a method on that object. However, we're so used to thinking about math functions in a procedural way that it's better they stay procedural. Like you said, it would be a bit confusing if sin were an object; but that's not because it can't be an object, it's mainly because that's not how we think about it.
 Sometimes, I think C++ went hard over in the opposite direction - nothing is 
 an object. OOP programming seems to be regarded as "oh, so 90's" by the 
 modern C++ crowd.

C++ doesn't really support OOP .. it's just a myth :(
Mar 09 2006
next sibling parent reply Lars Ivar Igesund <larsivar igesund.net> writes:
Hasan Aljudy wrote:

 Sometimes, I think C++ went hard over in the opposite direction - nothing
 is an object. OOP programming seems to be regarded as "oh, so 90's" by
 the modern C++ crowd.

C++ doesn't really support OOP .. it's just a myth :(

Hmm, I don't get it. Has anyone told all those making OO C++ libraries this?

I get a feeling sometimes that there is a belief that the languages preaching OOP (Java, Eiffel, etc.) define OO. Object orientation is an idea about how to group together things that belong to each other. There are umpteen books on best practices for how to do this, and AFAICS, the Java books from the past two years differ hugely from those I used back in 98-99 when I started at the University. In particular, I read the other day that although inheritance solves a lot, it is often bad practice if you want a system that is reusable and extensible (Head First Design Patterns, one of the current top books in the Java category).

There are no rules to OOP! There are guidelines, and depending on your project, it might be smart to follow one of those, or guidelines that are totally different.
Mar 09 2006
parent reply Hasan Aljudy <hasan.aljudy gmail.com> writes:
Lars Ivar Igesund wrote:
<snip>
 
 I get a feeling sometimes that there is a belief that the languages
 preaching OOP (Java, Eiffel, etc) defines OO. Object orientation is an idea
 of how to group what belong to each other together. There are umpteen books
 on best practices how to do this

True, OO is not so well defined, mainly because a lot of people think they understand OOP when they really don't.

I wish the whole world learned object-oriented principles from Schock:
http://pages.cpsc.ucalgary.ca/~schock/courses/w05/cpsc233/lectures.html
, and AFAICS, the Java books from the past
 two years differs hugely from those I used back in 98-99 when I started at
 the University. Especially I read the other day that although inheritance
 solves a lot, it is often a bad practice if you want a system that is
 reusable and extensible (Head First Design Patterns, one of the current top
 books in the Java category.)

The problem is when people look at OOP superficially as a set of tools, e.g. inheritance! OOP is not a set of tools; it's a set of principles. Inheritance is not there just for the sake of it; it serves a very good purpose: increasing cohesion by pulling common/shared code into a superclass.

You can (hypothetically, at least) write object-oriented code that doesn't use inheritance, and you can write non-object-oriented code that uses inheritance & classes (this happens a lot).
Mar 09 2006
parent reply Lars Ivar Igesund <larsivar igesund.net> writes:
Hasan Aljudy wrote:

 I wish the whole world learned object oriented principles from Schock
 http://pages.cpsc.ucalgary.ca/~schock/courses/w05/cpsc233/lectures.html

I doubt he is the only lecturer out there to have understood object orientation ;)
 
 The problem is when people look at OOP superficially as a set of tools,
 e.g. inheritance!

OOP _is_ a set of tools. Nothing more, nothing less.
 OOP is not a set of tools, it's a set of principles. Inheritance is not
 there just for the sake of it, it serves a very good purpose: increasing
 cohesion by pulling common/shared code in a super class.

Yes, and by this you get tight coupling and reduced reusability and extensibility. Principles just get in the way of practicality. Using inheritance just for the sake of it, or because of principles, is nothing even close to practical.

But then my programming principles encompass pragmatism :)
Mar 09 2006
parent reply Hasan Aljudy <hasan.aljudy gmail.com> writes:
Lars Ivar Igesund wrote:
 Hasan Aljudy wrote:
 
 
I wish the whole world learned object oriented principles from Schock
http://pages.cpsc.ucalgary.ca/~schock/courses/w05/cpsc233/lectures.html

I doubt he is the only lecturer out there to have understood object orientation ;)

Yeah .. of course! But I found no resource on the net which can give the kind of understanding of OOP that he's given me.
 
 
The problem is when people look at OOP superficially as a set of tools,
e.g. inheritance!

OOP _is_ a set of tools. Nothing more, nothing less.

No my friend, it's a lot more! OOP is a way of thinking (in terms of coding, of course). It's a different way of doing analysis and design. Object-oriented analysis and design produce completely different results from procedural analysis and design.

You can do procedural analysis and design but write code that uses classes and employs inheritance; your code will still be procedural, because when you thought about the problem you did so procedurally, and the code will inevitably reflect that.
OOP is not a set of tools, it's a set of principles. Inheritance is not
there just for the sake of it, it serves a very good purpose: increasing
cohesion by pulling common/shared code in a super class.

Yes, and by this you get tight coupling and reduced reusability and extendability.

If that happens, it generally means you don't know what you're doing; you're using inheritance the wrong way.
 Principles just get in the way of practicality. Using
 inheritance just for the sake of it, or because of principles, is nothing
 even close to practical.

You don't use inheritance because it's a principle (that's the same as using it just for its own sake). You use inheritance when you find that you need to, i.e. when you find yourself writing too much duplicate code. A common *wrong* way of using inheritance is deciding on a class hierarchy before doing any kind of analysis of objects.
 But then my programming principles encompass pragmatism :)

Funny you should say so. Is skipping the analysis and design phases (i.e. diving right into writing code) pragmatic or not? You can say it's pragmatic .. if you're too lazy to do it! However, you'll actually get the most out of your time if you do a proper analysis and design first.
Mar 09 2006
next sibling parent reply Georg Wrede <georg.wrede nospam.org> writes:
Hasan Aljudy wrote:
 Is skipping the analysis and design phases (i.e. diving right into 
 writing code) pragmatic or not?
 You can say it's pragmatic .. if you're too lazy to do it! However, 
 you'll actually get the most out of your time if you do a proper 
 analysis and design first.

I must live in the wrong country, or spend too much time with self-taught programmers earning their living programming. Very seldom do I see one who actually thinks before he hits the keyboard. When I ask, they frown and say "I've already got it covered!" Yeah, right. And just looking at their code proves them wrong. With all the UML and stuff floating around, one would think they would have understood the meaning of planning and analysis. And the funniest thing is, on all these programmers' doors it says "analyst", not "programmer". But "UML is for suits." Glad at least the bosses try to get something thought out before it's done. :-(
Mar 10 2006
parent Hasan Aljudy <hasan.aljudy gmail.com> writes:
Georg Wrede wrote:
 Hasan Aljudy wrote:
 
 Is skipping the analysis and design phases (i.e. diving right into 
 writing code) pragmatic or not?
 You can say it's pragmatic .. if you're too lazy to do it! However, 
 you'll actually get the most out of your time if you do a proper 
 analysis and design first.

I must live in the wrong country, or spend too much time with self-taught programmers earning their living programming. Very seldom do I see one who actually thinks before he hits the keyboard. When I ask, they frown and say "I've already got it covered!" Yeah, right. And just looking at their code proves them wrong.

Haha. I myself don't apply what I'm saying. Seldom do I plan my projects on paper. Even when I do put something on paper, it's minimal. I usually have everything in my head! Which is probably why it takes me a loooong time to get things done!! Fortunately I'm still a student and not in the industry, yet.
 
 With all the UML and stuff floating around, one would think they would 
 have understood the meaning of planning and analysis. And the funniest 
 thing is, on all these programmer's door it says "analyst", not 
 "programmer".
 
 But "UML is for suits." Glad at least the bosses try to get something 
 thought out before it's done. :-(

Mar 10 2006
prev sibling parent reply Lars Ivar Igesund <larsivar igesund.net> writes:
Hasan Aljudy wrote:
 
 OOP _is_ a set of tools. Nothing more, nothing less.

no my friend, it's a lot more! OOP is a way of thinking (in terms of coding, of course). It's a different way of doing analysis and design. Object oriented analysis and design produce completely different results from procedural analysis and design. You can do procedural analysis and design but write code that uses classes and employs inheritance; your code will still be procedural, because when you thought about the problem you did so procedurally, and the code will inevitably reflect that.

OOP is a toolbox in a programming toolshed (and inheritance is a tool in that particular toolbox). Just to make this great analogy complete ;) Nothing is as simple as you try to put it. When analyzing/designing, you should create structures, objects and functions so they reflect the topology of the problem at hand. It IS wrong to decide which tool to use before you know the problem. But of course one knows how to use some tools better than others, and this will reflect the way one works.
 
OOP is not a set of tools, it's a set of principles. Inheritance is not
there just for the sake of it, it serves a very good purpose: increasing
cohesion by pulling common/shared code into a super class.

Yes, and by this you get tight coupling and reduced reusability and extensibility.

If that happens, it generally means you don't know what you're doing; you're using inheritance the wrong way.

One can be lucky, but usually strong cohesion reduce flexibility. Using inheritance in all cases is the wrong way, use it where it helps toward a solution!
 
 Principles just get in the way of practicality. Using
 inheritance just for the sake of it, or because of principles, is nothing
 even close to practical.

You don't use inheritance because it's a principle (that's the same as using just for its own sake).

Yes, which is what I said.
 You use inheritance when you find that you need to. i.e. when you find
 yourself writing too much duplicate code.

Yes, I'm glad you agree. And usually you need very little inheritance, even though you might program everything using objects, whether it is to encapsulate state, functionality or resources.
 But then my programming principles encompass pragmatism :)

Funny you should say so. Is skipping the analysis and design phases (i.e. diving right into writing code) pragmatic or not?

Of course not, I've never said that. In your previous posts you have preached the view that we should use OOP (and with it inheritance), and that there never is a need to use anything else. I am just politely disagreeing with you.
 You can say it's pragmatic .. if you're too lazy to do it! However,
 you'll actually get the most out of your time if you do a proper
 analysis and design first.

You might get an even better result if you analyze, design and implement (and refactor) continuously to get a better understanding of the system and problems at hand, and some OOP ways of doing things (especially inheritance) make this difficult. The waterfall method is a software engineering fossil.
Mar 10 2006
parent Hasan Aljudy <hasan.aljudy gmail.com> writes:
Seems like we agree on principles, but we don't agree on what OOP is.

See, you think OOP is a set of tools, when it's actually a set of 
principles.

It's like when you think about books as a set of words rather than a set 
of ideas.
You won't see the benefit of writing books or reading them! 
They'd be just a useless set of words!


Lars Ivar Igesund wrote:
 Hasan Aljudy wrote:
 
OOP _is_ a set of tools. Nothing more, nothing less.

no my friend, it's a lot more! OOP is a way of thinking (in terms of coding, of course). It's a different way of doing analysis and design. Object oriented analysis and design produce completely different results from procedural analysis and design. You can do procedural analysis and design but write code that uses classes and employs inheritance; your code will still be procedural, because when you thought about the problem you did so procedurally, and the code will inevitably reflect that.

OOP is a toolbox in a programming toolshed (and inheritance is a tool in that particular toolbox). Just to make this great analogy complete ;)

I have to disagree! (duh)
 
 Nothing is as simple as you try to put it. 

I'm *not* trying to make it look simple. It *is* complicated.
 When analyzing/designing, you
 should create structures, objects and functions so they reflect the
 topology of the problem at hand. It IS wrong to decide which tool to use
 before you know the problem. But of course one know how to use some tools
 better than others, and this will reflect the way one works.

ok ..
 
OOP is not a set of tools, it's a set of principles. Inheritance is not
there just for the sake of it, it serves a very good purpose: increasing
cohesion by pulling common/shared code into a super class.

Yes, and by this you get tight coupling and reduced reusability and extensibility.

If that happens, it generally means you don't know what you're doing; you're using inheritance the wrong way.

One can be lucky, but usually strong cohesion reduce flexibility. Using inheritance in all cases is the wrong way, use it where it helps toward a solution!

I'm not promoting inheritance. It's quite the opposite. I'm saying that OOP is *not* about inheritance. Inheritance is just a tool. Just because your code uses inheritance doesn't mean it's object oriented. If you don't know how to apply object oriented principles, don't blame the paradigm. When you design your object model, your goal should be to achieve a high level of cohesion and a low level of coupling. Sometimes you can use inheritance to achieve that goal. Sometimes inheritance can stand in your way. That's why OOP is *not* a set of tools. If you keep thinking about object orientation as a set of tools then you will not get much out of it.
 
 
Principles just get in the way of practicality. Using
inheritance just for the sake of it, or because of principles, is nothing
even close to practical.

You don't use inheritance because it's a principle (that's the same as using just for its own sake).

Yes, which is what I said.

See. We agree on principles ;)
 
 
You use inheritance when you find that you need to. i.e. when you find
yourself writing too much duplicate code.

Yes, I'm glad you agree. And usually you need very little inheritance, even though you might program everything using objects, whether it is to encapsulate state, functionality or resources.
But then my programming principles encompass pragmatism :)

Funny you should say so. Is skipping the analysis and design phases (i.e. diving right into writing code) pragmatic or not?

Of course not, I've never said that. In your previous posts you have preached the view that we should use OOP (and with it inheritance), and that there never is a need to use anything else. I am just politely disagreeing with you.

I'm just trying to change your opinion on OOP. I'm not saying you should use it all the time. I myself don't use OO all the time. I'm not a purist. However, I often find myself converting procedural code to OO code, as part of the refactoring process.
 
 
You can say it's pragmatic .. if you're too lazy to do it! However,
you'll actually get the most out of your time if you do a proper
analysis and design first.

You might get an even better result if you analyze, design and implement (and refactor) continuously to get a better understanding of the system and problems at hand, and some OOP ways of doing things (especially inheritance) make this difficult. The waterfall method is a software engineering fossil.

The waterfall method is stupid. I totally agree. Like I said above, I often find myself converting procedural code to object oriented code during the refactoring process.
Mar 10 2006
prev sibling next sibling parent reply Don Clugston <dac nospam.com.au> writes:
Hasan Aljudy wrote:
 Walter Bright wrote:
 "Craig Black" <cblack ara.com> wrote in message 
 news:dun68p$30kv$1 digitaldaemon.com...

 A lot of the bias towards OOP purism comes from Java vs. C++ 
 comparisons, much more convincing than Anything vs. D.  Hopefully the 
 simplicity and power of D can help to eliminate OOP purism.

The trouble with OOP is, well, not everything is an object. For example, take the trig function sin(x). It's not an object. Of course, we could bash it into being an object, but that doesn't accomplish anything but obfuscation.

That's because we've been taught math in a procedural way ;)

Not at all. It's based on functions. Ever heard of functional programming languages? <g>
 Ideally, x, the angle, would be an object, and sin is a method on that 
 object.

???? How about the derivative function? It operates on functions. Consider something as simple as "+":

    real a;
    ireal b;
    creal c = a + b + a;

Should "+" be a member of a, b, or c?
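Don's two counter-examples can be made concrete. A sketch in Python (the function names here are mine, purely illustrative): sin is naturally a free function on a number, and the derivative is naturally a higher-order function that takes and returns functions; neither wants to be a method of any one class.

```python
import math

# sin is naturally a free function on a number, not a method of an "angle object".
def sin(x: float) -> float:
    return math.sin(x)

# The derivative operates on *functions*, not on instances of some class.
# A numerical sketch using a central difference (the step size h is arbitrary):
def derivative(f, h=1e-6):
    return lambda x: (f(x + h) - f(x - h)) / (2 * h)

cos_approx = derivative(sin)   # derivative(sin) should approximate cos
```

Forcing either into a class buys nothing: there is no obvious owner for the "+" in a + b + a, just as there is no obvious owner for derivative.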
 Like you said, it'll be a bit confusing if it was an object, but that's 
 not because it can't be an object, but mainly because that's not how we 
 think about it.

No, it's because the only way that pure OOP can make sense is when EVERYTHING is an object. In particular, functions must be objects. (And then 'object' is a poor word to use). OO has some great ideas. But overuse of OOP is a disaster. It's unbelievably difficult to write good OO code on a large scale.
Mar 10 2006
parent reply Thomas Kuehne <thomas-dloop kuehne.cn> writes:

Don Clugston schrieb am 2006-03-10:
 OO has some great ideas. But overuse of OOP is a disaster. It's 
 unbelievably difficult to write good OO code on a large scale.

So, how do you design/write large scale projects? Thomas
Mar 10 2006
next sibling parent Don Clugston <dac nospam.com.au> writes:
Thomas Kuehne wrote:
 Don Clugston schrieb am 2006-03-10:
 OO has some great ideas. But overuse of OOP is a disaster. It's 
 unbelievably difficult to write good OO code on a large scale.

So, how do you design/write large scale projects?

Being careful to avoid over-zealous application of OO. In particular, avoiding inheritance as far as possible; not defining something as an object unless it really is an object. I see inheritance hierarchies as being like goto. Seductive, looks good in small examples, very popular at one time, still occasionally useful -- but very dangerous.
Mar 10 2006
prev sibling parent reply David Medlock <noone nowhere.com> writes:
Thomas Kuehne wrote:

 Don Clugston schrieb am 2006-03-10:
 
OO has some great ideas. But overuse of OOP is a disaster. It's 
unbelievably difficult to write good OO code on a large scale.

So, how do you design/write large scale projects? Thomas

using XML or other external format (scripting falls under this in some ways). And by OOP I assume Don meant inheritance, since the other aspects are not unique to it. Outside of simple Shape examples in textbooks, implementation inheritance has largely been a failure for reusable or easy to maintain code. OOP is supposed to be about black box reusability of objects, i.e. components, but then its main claim to fame is a way to break that abstraction. see: http://okmij.org/ftp/Computation/Subtyping/#Problem -DavidM
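For readers who skip the link: the CBag/CSet problem the article describes can be sketched in a few lines (Python here; class and method names are mine, not the article's C++):

```python
class Bag:
    """Contract: put(x) always increases the total count of x by one."""
    def __init__(self):
        self._items = []
    def put(self, x):
        self._items.append(x)
    def count(self, x):
        return self._items.count(x)

class Set(Bag):
    """'Is-a' Bag? Overriding put to drop duplicates breaks Bag's contract."""
    def put(self, x):
        if self.count(x) == 0:
            super().put(x)

def add_twice(bag, x):
    # Written against the Bag contract: expects the count to rise by two.
    bag.put(x)
    bag.put(x)
    return bag.count(x)

# add_twice(Bag(), "a") == 2, but add_twice(Set(), "a") == 1:
# Set is not substitutable for Bag, even though it inherits from it.
```

This is exactly the abstraction-breaking David mentions: code written against the black-box Bag contract silently misbehaves when handed a Set.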
Mar 10 2006
next sibling parent Sean Kelly <sean f4.ca> writes:
David Medlock wrote:
 
 Outside of simple Shape examples in textbooks, implementation 
 inheritance has largely been a failure for reusable or easy to maintain 
 code.  OOP is supposed to be about black box reusability of objects,ie. 
 components but then its main claim to fame is a way to break that 
 abstraction.

Inheritance isn't even much of a problem so long as the trees remain very shallow, but things tend to fall apart beyond that. I think it says something about the deep inheritance model that the design work is usually done via an external tool such as UML. Also, I believe that it should be possible to figure out what's going on through code inspection, and complex object hierarchies make this extremely difficult. Sean
Mar 10 2006
prev sibling next sibling parent Bruno Medeiros <daiphoenixNO SPAMlycos.com> writes:
David Medlock wrote:
 
 see:
 http://okmij.org/ftp/Computation/Subtyping/#Problem
 
 -DavidM

Hum, nice article. -- Bruno Medeiros - CS/E student http://www.prowiki.org/wiki4d/wiki.cgi?BrunoMedeiros#D
Mar 11 2006
prev sibling parent reply Hasan Aljudy <hasan.aljudy gmail.com> writes:
David Medlock wrote:
<snip>
 see:
 http://okmij.org/ftp/Computation/Subtyping/#Problem
 
 -DavidM

A nice example of not understanding OOP :) WOW .. I think I'm beginning to understand what Schock means when he says everyone out there thinks they are doing object oriented programming, but only very few of them really are. There's no "book" that you can follow literally to produce good code. It's a matter of trial and error. You try to subclass CSet from CBag, and discover that it doesn't work, or produces more problems than it solves; then don't whine about it, just implement CSet differently. It's not OOP's fault, nor is it inheritance's fault. If you try to open a door with a screw-driver and it doesn't work, should you blame the screw-driver, or blame yourself for not understanding how doors work? Apparently subclassing CSet from CBag wasn't such a good idea. Don't blame the object oriented paradigm for it. Nowhere in the paradigm does it say that you should subclass CSet from CBag! Aside from that, the real culprit here is C++, which allows you to deal with objects BY VALUE!! polymorphism dies at that point. And, really, C++ doesn't support OO, it just presents an awfully complicated set of features!! I came to really hate C++ lately.
Mar 11 2006
next sibling parent reply Don Clugston <dac nospam.com.au> writes:
Hasan Aljudy wrote:
 David Medlock wrote:
 <snip>
 see:
 http://okmij.org/ftp/Computation/Subtyping/#Problem

 -DavidM

A nice example of not understanding OOP :) WOW .. I think I'm beginning to understand what Schock means when he says everyone out there thinks they are doing object oriented programming, but only very few of them really are. There's no "book" that you can follow literally to produce good code. It's a matter of trial and error. You try to subclass CSet from CBag, and discover that it doesn't work, or produces more problems than it solves; then don't whine about it, just implement CSet differently. It's not OOP's fault, nor is it inheritance's fault.

I disagree, I think this is absolutely OOP's fault. 'Trial and error' seems to be a fundamental feature of OOP. If you use OOP, you are committing yourself to continually restructuring your code. (OOP is the worst thing that ever happened to code reuse!)
 If you try to open a door with a screw-driver and it doesn't work, 
 should you blame the screw-driver, or blame yourself for not 
 understanding how doors work?
 
 Apparently subclassing CSet from CBag wasn't such a good idea. Don't 
 blame the object oriented paradigm for it. Nowhere in the paradigm does 
 it say that you should subclass CSet from CBag!

And this is the problem. It gives no guidance for when you should use subclassing. That turns out to be a fantastically difficult problem, and OOP just glossed over it. Fundamentally, I think the fatal flaw in OOP is that "is-a" relationships do not exist. (Well, _maybe_ for abstract geometrical shapes, but definitely not for anything useful). The text book examples are wrong: a 'Manager' is NOT a type of 'Employee'. An 'Employee' is NOT a type of 'Person'. Actually, 'Object' works as a base class, because every object really IS-A block of bytes in RAM, some of which are organised into a virtual function table. But note that it's not some kind of abstract Platonic entity. It's a bunch of transistors. In OOP, you spend your time trying to find Is-A relationships, but they don't exist. Unsurprisingly, hardly anyone is "really" doing OOP. It seems to be unimplementable.
Mar 13 2006
next sibling parent reply Lars Ivar Igesund <larsivar igesund.net> writes:
Don Clugston wrote:

 Hasan Aljudy wrote:
 David Medlock wrote:
 <snip>
 see:
 http://okmij.org/ftp/Computation/Subtyping/#Problem

 -DavidM

A nice example of not understanding OOP :) WOW .. I think I'm beginning to understand what Schock means when he says everyone out there thinks they are doing object oriented programming, but only very few of them really are. There's no "book" that you can follow literally to produce good code. It's a matter of trial and error. You try to subclass CSet from CBag, and discover that it doesn't work, or produces more problems than it solves; then don't whine about it, just implement CSet differently. It's not OOP's fault, nor is it inheritance's fault.

I disagree, I think this is absolutely OOP's fault. 'Trial and error' seems to be a fundamental feature of OOP. If you use OOP, you are committing yourself to continually restructuring your code. (OOP is the worst thing that ever happened to code reuse!)
 If you try to open a door with a screw-driver and it doesn't work,
 should you blame the screw-driver, or blame yourself for not
 understanding how doors work?
 
 Apparently subclassing CSet from CBag wasn't such a good idea. Don't
 blame the object oriented paradigm for it. Nowhere in the paradigm does
 it say that you should subclass CSet from CBag!

And this is the problem. It gives no guidance for when you should use subclassing. That turns out to be a fantastically difficult problem, and OOP just glossed over it. Fundamentally, I think the fatal flaw in OOP is that "is-a" relationships do not exist. (Well, _maybe_ for abstract geometrical shapes, but definitely not for anything useful). The text book examples are wrong: a 'Manager' is NOT a type of 'Employee'. An 'Employee' is NOT a type of 'Person'. Actually, 'Object' works as a base class, because every object really IS-A block of bytes in RAM, some of which are organised into a virtual function table. But note that it's not some kind of abstract Platonic entity. It's a bunch of transistors. In OOP, you spend your time trying to find Is-A relationships, but they don't exist. Unsurprisingly, hardly anyone is "really" doing OOP. It seems to be unimplementable.

It might have been this way, but newer OOP books actually prefer has-a relationships over is-a.
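The has-a preference Lars mentions is the classic "composition over inheritance" advice; a minimal sketch (the Stack class is a hypothetical of mine, not from any book cited here):

```python
# is-a (fragile): "class Stack(list)" would also inherit insert(), sort(),
# etc., letting callers break the stack discipline.
# has-a (robust): Stack *holds* a list and exposes only push/pop.
class Stack:
    def __init__(self):
        self._items = []          # has-a list
    def push(self, x):
        self._items.append(x)
    def pop(self):
        return self._items.pop()
```

The has-a version can swap its internal representation freely, because no caller depends on it being a list.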
Mar 13 2006
parent "Derek Parnell" <derek psych.ward> writes:
On Tue, 14 Mar 2006 03:18:30 +1100, Lars Ivar Igesund  
<larsivar igesund.net> wrote:


 In OOP, you spend your time trying to find Is-A relationships, but they
 don't exist. Unsurprisingly, hardly anyone is "really" doing OOP. It
 seems to be unimplementable.

It might have been this way, but newer OOP books actually prefer has-a relationships over is-a.

That's interesting. I've been thinking of classes as having a set of behaviours and a set of properties for ages. -- Derek Parnell Melbourne, Australia
Mar 18 2006
prev sibling next sibling parent Hasan Aljudy <hasan.aljudy gmail.com> writes:
Don Clugston wrote:
 Hasan Aljudy wrote:
 
 David Medlock wrote:
 <snip>

 see:
 http://okmij.org/ftp/Computation/Subtyping/#Problem

 -DavidM

A nice example of not understanding OOP :) WOW .. I think I'm beginning to understand what Schock means when he says everyone out there thinks they are doing object oriented programming, but only very few of them really are. There's no "book" that you can follow literally to produce good code. It's a matter of trial and error. You try to subclass CSet from CBag, and discover that it doesn't work, or produces more problems than it solves; then don't whine about it, just implement CSet differently. It's not OOP's fault, nor is it inheritance's fault.

I disagree, I think this is absolutely OOP's fault. 'Trial and error' seems to be a fundamental feature of OOP. If you use OOP, you are committing yourself to continually restructuring your code. (OOP is the worst thing that ever happened to code reuse!)

Are you saying you have a mechanical way to write perfect code?!
 
 If you try to open a door with a screw-driver and it doesn't work, 
 should you blame the screw-driver, or blame yourself for not 
 understanding how doors work?

 Apparently subclassing CSet from CBag wasn't such a good idea. Don't 
 blame the object oriented paradigm for it. Nowhere in the paradigm 
 does it say that you should subclass CSet from CBag!

And this is the problem. It gives no guidance for when you should use subclassing. That turns out to be a fantastically difficult problem, and OOP just glossed over it.

OOP's main guide is cohesion: whenever you have a class that's doing too much, you should redesign your code and split your class into multiple classes. When you see more than one class using pretty much the same code, you should think about pulling this common code into a super class. etc.
 
 Fundamentally, I think the fatal flaw in OOP is that "is-a" 
 relationships do not exist. (Well, _maybe_ for abstract geometrical 
 shapes, but definitely not for anything useful). The text book examples 
 are wrong: a 'Manager' is NOT a type of 'Employee'. An 'Employee' is NOT 
 a type of 'Person'. Actually, 'Object' works as a base class, because 
 every object really IS-A block of bytes in RAM, some of which are 
 organised into a virtual function table. But note that it's not some 
 kind of abstract Platonic entity. It's a bunch of transistors.

A Manager doesn't always have to be an Employee; it depends on what you're doing. However, most of the time, if you find yourself writing the same code in the Employee and Manager classes, then you should probably think about pulling some code up the class hierarchy. You can create an AbstractEmployee, for example, and put all the duplicate code in there, then let Employee and Manager inherit from there. That's just one possibility; it may be a good choice in some situations, but it could be a very bad idea in some other situations. It all depends on what you're doing.
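Hasan's AbstractEmployee suggestion, sketched in Python (the names and the pay example are mine; as he says, this is just one possibility, not a recipe):

```python
from abc import ABC, abstractmethod

class AbstractEmployee(ABC):
    """Duplicate code pulled up from Employee and Manager."""
    def __init__(self, name, base_pay):
        self.name = name
        self.base_pay = base_pay

    @abstractmethod
    def pay(self):
        ...

class Employee(AbstractEmployee):
    def pay(self):
        return self.base_pay

class Manager(AbstractEmployee):
    def __init__(self, name, base_pay, bonus):
        super().__init__(name, base_pay)
        self.bonus = bonus

    def pay(self):
        return self.base_pay + self.bonus
```

Note that neither concrete class inherits from the other; the hierarchy exists only to hold the shared code.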
 
 In OOP, you spend your time trying to find Is-A relationships, but they 
 don't exist. 

dude, you're trying to come up with classes and/or class hierarchies before examining the problem at hand!! You can't do that!! If at any time your design goal becomes to "find a use" for inheritance, i.e. find is-a relationships, then you're going down the wrong path. In OOP, you analyze for /objects/ .. objects, not classes.
 Unsurprisingly, hardly anyone is "really" doing OOP. It 
 seems to be unimplementable.

No, they just don't understand it.
Mar 13 2006
prev sibling parent "Walter Bright" <newshound digitalmars.com> writes:
"Don Clugston" <dac nospam.com.au> wrote in message 
news:dv3bm6$24ed$1 digitaldaemon.com...
 In OOP, you spend your time trying to find Is-A relationships, but they 
 don't exist. Unsurprisingly, hardly anyone is "really" doing OOP. It seems 
 to be unimplementable.

The dmd compiler front end seems to be a natural fit for OOP.
Mar 13 2006
prev sibling parent David Medlock <noone nowhere.com> writes:
Hasan Aljudy wrote:
 David Medlock wrote:
 <snip>
 see:
 http://okmij.org/ftp/Computation/Subtyping/#Problem

 -DavidM

A nice example of not understanding OOP :) WOW .. I think I'm beginning to understand what Schock means when he says everyone out there thinks they are doing object oriented programming, but only very few of them really are. There's no "book" that you can follow literally to produce good code. It's a matter of trial and error. You try to subclass CSet from CBag, and discover that it doesn't work, or produces more problems than it solves; then don't whine about it, just implement CSet differently. It's not OOP's fault, nor is it inheritance's fault. If you try to open a door with a screw-driver and it doesn't work, should you blame the screw-driver, or blame yourself for not understanding how doors work?

Sorry, Hasan, this is called 'moving the goalposts'. -DavidM
Mar 13 2006
prev sibling parent "Walter Bright" <newshound digitalmars.com> writes:
"Hasan Aljudy" <hasan.aljudy gmail.com> wrote in message 
news:dupijv$isn$1 digitaldaemon.com...
 Walter Bright wrote:
 The trouble with OOP is, well, not everything is an object. For example, 
 take the trig function sin(x). It's not an object. Of course, we could 
 bash it into being an object, but that doesn't accomplish anything but 
 obfuscation.

That's because we've been taught math in a procedural way ;) Ideally, x, the angle, would be an object, and sin is a method on that object. However, I think we're so used to thinking about math functions in a procedural way that it's better they stay procedural. Like you said, it'll be a bit confusing if it was an object, but that's not because it can't be an object, but mainly because that's not how we think about it.

If we think of a number as a number rather than an object, then it's just obfuscation to force it into being an object.
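Walter's point in a sketch (Python; the Angle wrapper is a deliberately bad hypothetical): wrapping the number in an object so that sin becomes a method adds ceremony without adding meaning.

```python
import math

# Plain: the number is just a number.
y1 = math.sin(0.5)

# "Everything is an object": wrap the angle so that sin becomes a method.
class Angle:
    def __init__(self, radians):
        self.radians = radians
    def sin(self):
        return math.sin(self.radians)

y2 = Angle(0.5).sin()   # same result, more ceremony
```

Both compute the same value; the class version only obscures the function call.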
Mar 12 2006