
digitalmars.D - Let Go, Standard Library From Community

reply Davidl <Davidl 126.com> writes:
Personally, I'm not familiar with Tango, but the following is based on these
thoughts:
1. one man's effort vs. a team's effort
2. the growing body of D code needs only one base standard library.

I don't think it's fun to switch from Phobos to Tango, or to switch from
Tango back to Phobos.

I think the standard library should be provided by the D community. I
appreciate that Walter gave us Phobos; we needed this babysitter. But the D
community is growing bigger and bigger, and I wonder whether Walter can put as
much effort into Phobos as he used to, to compete with Tango. The endless
arguing over Phobos vs. Tango is meaningless and annoying.

Once Tangobos is out, I hope a standard DMD package will be released by the
Tango team and Walter, with Tango being the default base library. Users can
use Tangobos for legacy code.
Apr 18 2007
next sibling parent reply Daniel Keep <daniel.keep.lists gmail.com> writes:
Davidl wrote:
 Personally, I'm not familiar with Tango, but the following is based on
 these thoughts:
 1. one man's effort vs. a team's effort
 2. the growing body of D code needs only one base standard library.
 
 I don't think it's fun to switch from Phobos to Tango, or to switch from
 Tango back to Phobos.
If you haven't used Tango, then how can you make this comment?
 I think the standard library should be provided by the D community. I
 appreciate that Walter gave us Phobos; we needed this babysitter. But the
 D community is growing bigger and bigger, and I wonder whether Walter can
 put as much effort into Phobos as he used to, to compete with Tango. The
 endless arguing over Phobos vs. Tango is meaningless and annoying.
 Once Tangobos is out, I hope a standard DMD package will be released by
 the Tango team and Walter, with Tango being the default base library.
 Users can use Tangobos for legacy code.
I really don't see the need for this. Phobos is a lovely minimalist library
that simply gets the job done with a minimum of fuss. Tango is a nicely
structured library full of functionality. They fill different needs.

What's more, switching between them is a *total* non-issue. Here's how I
compile Phobos apps:

$ rebuild foo

Here's how I compile Tango apps:

$ rebuild -dcdmd-tango bar

Heck, I usually don't even do that; I just dump the switches into a text file
and use them as response files. If rebuild grew bud-style +target args, it'd
be perfect :)

It's not like it's rocket science any more. I honestly can't see a reason why
the two can't simply coexist.

	-- Daniel

-- 
int getRandomNumber()
{
    return 4; // chosen by fair dice roll.
              // guaranteed to be random.
}

http://xkcd.com/

v2sw5+8Yhw5ln4+5pr6OFPma8u6+7Lw4Tm6+7l6+7D i28a2Xs3MSr2e4/6+7t4TNSMb6HTOp5en5g6RAHCP
http://hackerkey.com/
Apr 18 2007
next sibling parent reply eao197 <eao197 intervale.ru> writes:
On Wed, 18 Apr 2007 11:55:55 +0400, Daniel Keep  
<daniel.keep.lists gmail.com> wrote:

 What's more, switching between them is a *total* non-issue.  Here's how
 I compile phobos apps:

 $ rebuild foo

 Here's how I compile Tango apps

 $ rebuild -dcdmd-tango bar

 Heck, I usually don't even do that; I just dump the switches into a text
 file and use them as response files.  If rebuild grew bud-style +target
 args, it'd be perfect :)

 It's not like it's rocket science any more.  I honestly can't see a
 reason why the two can't simply coexist.
I think the problem of Phobos/Tango incompatibility affects not application
writers but library writers. If someone writes an application-domain library
(XML parsing, SOAP support, an event-driven framework, and so on), then he/she
can use Phobos or Tango, but this choice divides the users of that library
into Phobos-only or Tango-only. An attempt to write a big library which can
work with either Phobos or Tango would lead to an undesirable amount of
version statements in the code (as in the good old C/C++ days with many
#if/#else).

-- 
Regards,
Yauheni Akhotnikau
Apr 18 2007
next sibling parent reply Daniel Keep <daniel.keep.lists gmail.com> writes:
eao197 wrote:
 On Wed, 18 Apr 2007 11:55:55 +0400, Daniel Keep
 <daniel.keep.lists gmail.com> wrote:
 
 What's more, switching between them is a *total* non-issue.  Here's how
 I compile phobos apps:

 $ rebuild foo

 Here's how I compile Tango apps

 $ rebuild -dcdmd-tango bar

 Heck, I usually don't even do that; I just dump the switches into a text
 file and use them as response files.  If rebuild grew bud-style +target
 args, it'd be perfect :)

 It's not like it's rocket science any more.  I honestly can't see a
 reason why the two can't simply coexist.
I think the problem of Phobos/Tango incompatibility affects not application
writers but library writers. If someone writes an application-domain library
(XML parsing, SOAP support, an event-driven framework, and so on), then he/she
can use Phobos or Tango, but this choice divides the users of that library
into Phobos-only or Tango-only. An attempt to write a big library which can
work with either Phobos or Tango would lead to an undesirable amount of
version statements in the code (as in the good old C/C++ days with many
#if/#else).

-- Regards,
Yauheni Akhotnikau
I've been writing an XML SAX parser based on Expat, and it supports both
Phobos and Tango; it wasn't much work at all. A few aliases and version
statements, and I'm good.

Also, this seems a bit like the difference between writing a library for just
the standard C++ libraries and, say, MFC or wxWidgets. I still think it's a
little early to be shoving Phobos out the door into the cold, dark night with
nary more than a "thanks, it's been great but I'm seeing someone else now"
based on what we think might happen at some indeterminable point in the
future. I think it would be better to wait until Tango hits 1.0, and a few
large libraries have actually been written, so we can get a feel for how this
is all panning out.

Besides, I quite like Phobos :)

	-- Daniel
"There, there, Phobos. *I* still <3 you..."

-- 
int getRandomNumber()
{
    return 4; // chosen by fair dice roll.
              // guaranteed to be random.
}

http://xkcd.com/

v2sw5+8Yhw5ln4+5pr6OFPma8u6+7Lw4Tm6+7l6+7D i28a2Xs3MSr2e4/6+7t4TNSMb6HTOp5en5g6RAHCP
http://hackerkey.com/
Apr 18 2007
parent eao197 <eao197 intervale.ru> writes:
On Wed, 18 Apr 2007 14:27:10 +0400, Daniel Keep  
<daniel.keep.lists gmail.com> wrote:

 I think the problem of Phobos/Tango incompatibility affects not
 application writers but library writers. If someone writes an
 application-domain library (XML parsing, SOAP support, an event-driven
 framework, and so on), then he/she can use Phobos or Tango, but this
 choice divides the users of that library into Phobos-only or Tango-only.
 An attempt to write a big library which can work with either Phobos or
 Tango would lead to an undesirable amount of version statements in the
 code (as in the good old C/C++ days with many #if/#else).

 --Regards,
 Yauheni Akhotnikau
I've been writing an XML SAX parser based on Expat, and it supports both Phobos and Tango; it wasn't much work at all. A few aliases and version statements, and I'm good.
I was talking not about D wrappers around existing C/C++ libraries, but about
'native' D libraries. For example, to use D at my work I need to make D ports
of (at least) three of our C++ libraries (for configuration files, data
serialization, and event-oriented programming). Because the existing C++
libraries make heavy use of C++ templates, it isn't possible to make a D
wrapper around the existing code.
 Also, this seems a bit like the difference between writing a library for
 just the standard C++ libraries and, say, MFC or wxWidgets.  I still
 think it's a little early to be shoving Phobos out the door into the
 cold, dark night with nary more than a "thanks, it's been great but I'm
 seeing someone else now" based on what we think might happen at some
 indeterminable point in the future.
I didn't speak about throwing Phobos away. Phobos is a good library, and it
should be here: because D is a multi-paradigm language, in some circumstances
the C-like Phobos library is much more appropriate than the fully
object-oriented Tango library. But it is a sad situation when an application
can't link against Phobos-based and Tango-based libraries at the same time. I
hope that Walter and the Tango team find a solution to that problem.

-- 
Regards,
Yauheni Akhotnikau
Apr 18 2007
prev sibling parent Mike Parker <aldacron71 yahoo.com> writes:
eao197 wrote:

 
 I think the problem of Phobos/Tango incompatibility affects not
 application writers but library writers. If someone writes an
 application-domain library (XML parsing, SOAP support, an event-driven
 framework, and so on), then he/she can use Phobos or Tango, but this
 choice divides the users of that library into Phobos-only or Tango-only.
 An attempt to write a big library which can work with either Phobos or
 Tango would lead to an undesirable amount of version statements in the
 code (as in the good old C/C++ days with many #if/#else).
Or they can do what I did with Derelict and add a module to the library that
wraps the Phobos/Tango calls. It only needs to wrap the functions/objects the
library actually uses.

Admittedly, there is a problem if you use standard library calls extensively,
or if you use features of one library that aren't (yet) available in the
other (such as the Zlib and ZipArchive modules in Phobos). But in that case,
unless your library is intended to make money for you, why worry? And if it
is commercial, then you have a good incentive to implement the missing
functionality anyway.
Apr 18 2007
prev sibling next sibling parent reply Davidl <Davidl 126.com> writes:
 Davidl wrote:
 Personally, I'm not familiar with Tango, but the following is based on
 these thoughts:
 1. one man's effort vs. a team's effort
 2. the growing body of D code needs only one base standard library.

 I don't think it's fun to switch from Phobos to Tango, or to switch from
 Tango back to Phobos.
If you haven't used Tango, then how can you make this comment?
I use it a little, but the points I listed are so obvious that I think
there's no doubt about them.
 I think the standard library should be provided by the D community. I
 appreciate that Walter gave us Phobos; we needed this babysitter. But the
 D community is growing bigger and bigger, and I wonder whether Walter can
 put as much effort into Phobos as he used to, to compete with Tango. The
 endless arguing over Phobos vs. Tango is meaningless and annoying.
 Once Tangobos is out, I hope a standard DMD package will be released by
 the Tango team and Walter, with Tango being the default base library.
 Users can use Tangobos for legacy code.
I really don't see the need for this. Phobos is a lovely minimalist library
that simply gets the job done with a minimum of fuss. Tango is a nicely
structured library full of functionality. They fill different needs.

What's more, switching between them is a *total* non-issue. Here's how I
compile Phobos apps:

$ rebuild foo

Here's how I compile Tango apps:

$ rebuild -dcdmd-tango bar

Heck, I usually don't even do that; I just dump the switches into a text file
and use them as response files. If rebuild grew bud-style +target args, it'd
be perfect :)
I don't think it's handy at all, and I don't like the feeling of forcing
users to choose between two base libraries at the very start. And I'm quite
sure they would *IN MOST CASES* be misled into Phobos without any
consideration; even if they don't know whether Phobos is what they want,
they'll just assume it's the one because it comes with the compiler. And I
think this creates a *GREAT UNFAIRNESS* in library competition.
 It's not like it's rocket science any more.  I honestly can't see a
 reason why the two can't simply coexist.

 	-- Daniel
No, for D's own good, one of them should die peacefully.
Apr 18 2007
parent Sean Kelly <sean f4.ca> writes:
Davidl wrote:

 I don't think it's handy at all, and I don't like the feeling of forcing
 users to choose between two base libraries at the very start. And I'm
 quite sure they would *IN MOST CASES* be misled into Phobos without any
 consideration; even if they don't know whether Phobos is what they want,
 they'll just assume it's the one because it comes with the compiler. And
 I think this creates a *GREAT UNFAIRNESS* in library competition.
The Tango binary distribution now includes the compiler as well.

Sean
Apr 18 2007
prev sibling parent reply Davidl <Davidl 126.com> writes:
And also, learning two completely different APIs ain't that easy.
With the powerful, expressive D language, users already have many things
to learn before they can use it practically, to say nothing of the base
libraries.

I'm not saying Phobos is *BAD*. What I mean is that the coexistence of
Phobos and Tango just confuses users.
Apr 18 2007
parent reply Lionello Lunesu <lio lunesu.remove.com> writes:
Davidl wrote:
 And also, learning two completely different APIs ain't that easy.
 With the powerful, expressive D language, users already have many things
 to learn before they can use it practically, to say nothing of the base
 libraries.
 
 I'm not saying Phobos is *BAD*. What I mean is that the coexistence of
 Phobos and Tango just confuses users.
And I agree with you. L.
Apr 18 2007
parent reply Dan <murpsoft hotmail.com> writes:
I just use Phobos.  It gets the job done, instead of trying to be everything
and anything the user might possibly want to interact with, like the Java
libraries.

I find the Java libraries are so freakin' huge and multipurpose that they
couldn't possibly be even close to efficient or practical for any one solution.

Likewise I fear for Tango, as it's got that OO gleam in its eye and is
implementing classes for crypto.. I mean.. crypto!?  What's next, a math class?
 *shudders*
Apr 18 2007
next sibling parent reply Brad Anderson <brad dsource.org> writes:
Dan wrote:
 I just use phobos.  It gets the job done, instead of trying to be
 everything and anything the user might possibly want to interact with; like
 the Java libraries.
 
 I find the Java libraries are so freakin' huge and multipurpose that they
 couldn't possibly be even close to efficient or practical for any one
 solution.
 
 Likewise I fear for Tango, as it's got that OO gleam in its eye and is
 implementing classes for crypto.. I mean.. crypto!?  What's next, a math
 class?  *shudders*
Oh to be a fly on the wall when the current lib doesn't have a function you need...
Apr 18 2007
parent reply Dan <murpsoft hotmail.com> writes:
Brad Anderson Wrote:
 Oh to be a fly on the wall when the current lib doesn't have a function you
 need...
I can write assembler. I can write D. I can write ECMAScript. If there's
something I need that isn't there, I write it.

I personally think there's something wrong with someone who claims to be a
programmer but *can't* solve a trivial puzzle. Not knowing what address to
write to for direct access to the screen buffer is forgivable: look it up.
Not being able to implement Huffman compression, or a quicksort, or a binary
search, or a jump gate once you know what you need to achieve... well, that
means you've lost that gleam in your eye you had back in kindergarten.

- Dan
Apr 18 2007
next sibling parent Brad Anderson <brad dsource.org> writes:
Dan wrote:
 Brad Anderson Wrote:
 Oh to be a fly on the wall when the current lib doesn't have a function
 you need...
I can write assembler. I can write D. I can write ECMAScript. If there's
something I need that isn't there, I write it. I personally think there's
something wrong with someone who claims to be a programmer but *can't* solve
a trivial puzzle. Not knowing what address to write to for direct access to
the screen buffer is forgivable: look it up. Not being able to implement
Huffman compression, or a quicksort, or a binary search, or a jump gate once
you know what you need to achieve... well, that means you've lost that gleam
in your eye you had back in kindergarten. - Dan
That's fine. You choose to keep your feet on the ground, as opposed to
standing on the shoulders of giants. No biggie. It's just kind of reinventing
the wheel.

I know it's fun to implement things yourself at times, and a lot of it has to
do with what you're trying to accomplish. When time and all the -abilities
matter most, I think your approach is wrong. Or rather, not the one I would
choose.

BA
Apr 18 2007
prev sibling next sibling parent janderson <askme me.com> writes:
Dan wrote:
 Brad Anderson Wrote:
 Oh to be a fly on the wall when the current lib doesn't have a function you
 need...
I can write assembler. I can write D. I can write ECMAScript. If there's something I need that isn't there, I write it. I personally think there's something wrong with someone who claims to be a programmer but *can't* solve a trivial puzzle.
The problem is that every code monkey ends up re-writing the same code. That
means a lot of time wasted focusing on writing solutions to problems that
have already been solved. It also means that when reading someone else's code
there's more to learn, rather than there being a standard way of writing
something (more time wasted). Furthermore, things that have been in some
public library (not necessarily the standard one) generally receive a high
level of free testing from the community, and that means I save more time.

My approach to coding is to try to do as much re-use as is feasible. It means
I can focus more on the actual problem, not on how I get there.
 Not knowing what address to write directly to the screenbuffer is forgiveable.
 Look it up.  Not being able to implement huffman compression or a quicksort or
binary search, or a jump gate after knowing what you need to achieve... well,
that means you lost that gleam in your eye you had back in kindergarten.
I can, I have, and I believe anyone on this newsgroup could as well. That
doesn't mean I should do it again. It's a time investment, even if it only
takes you five minutes to write a quicksort.

I'm sure Michael Abrash could write an ASM-optimized quicksort that could
rival anything normal people could write (and sure, given time, any code
monkey(s) could produce something better). Yet even Michael Abrash is an avid
code reuser. That's an advantage of using code that has been worked on by a
community or been available forever.

That's this code monkey's view. Hmm, although I respect your opinion, you
should know that this is one common interview question.

PS - Yes, I'm a Michael Abrash fan-boy.
 
 - Dan
Apr 18 2007
prev sibling parent reply Alexander Panek <a.panek brainsware.org> writes:
Dan wrote:
 [...]
 Not knowing what address to write directly to the screenbuffer is forgiveable.
 Look it up.  Not being able to implement huffman compression or a quicksort or
binary search, or a jump gate after knowing what you need to achieve... well,
that means you lost that gleam in your eye you had back in kindergarten.
This is utterly stupid. I am an undergraduate student, claiming to be quite a
good programmer. I have never implemented one of those, yet I can produce
proper library & application code... why? Oh... it seems someone else solved
this already, and I can head straight for the solution to my actual problem,
without having to care about the obstacles that might be in my way if I
didn't have a pre-made solution.

Wait... this is kindergarten? I wonder how you pay your rent and earn money
for food... oh well, you wouldn't need to, when you're in kindergarten...

Kind regards,
Alex
Apr 18 2007
next sibling parent Alexander Panek <a.panek brainsware.org> writes:
Alexander Panek wrote:
 [...]
...I'm sorry for my harsh response. I didn't mean to let my really-bad-day
mood out like this... please don't take it personally.
Apr 18 2007
prev sibling parent reply Dan <murpsoft hotmail.com> writes:
Alexander Panek Wrote:

 Dan wrote:
  > [...]
 Not knowing what address to write directly to the screenbuffer is forgiveable.
 Look it up.  Not being able to implement huffman compression or a quicksort or
binary search, or a jump gate after knowing what you need to achieve... well,
that means you lost that gleam in your eye you had back in kindergarten.
This is utterly stupid. I am an undergraduate student, claiming to be quite a
good programmer. I have never implemented one of those, yet I can produce
proper library & application code... why? Oh... it seems someone else solved
this already, and I can head straight for the solution to my actual problem,
without having to care about the obstacles that might be in my way if I
didn't have a pre-made solution. Wait... this is kindergarten? I wonder how
you pay your rent and earn money for food... oh well, you wouldn't need to,
when you're in kindergarten...
*sigh*

A good programmer needs to be able to do more than simply type down what's
blatantly obvious. They need to be able to overcome obstacles as simple as
these. That was the point of the message. You need to be able to overcome
things and be creative: even if your solution isn't a perfect qsort, maybe
you just need to sort, and you recreate a decent shellsort instead. If you
can't approach a problem and overcome it, you shouldn't be programming; you
should be in data entry, speaking Vogon (for those who've read/seen The
Hitchhiker's Guide to the Galaxy).

You obviously were quite offended by this line of thought.
Apr 18 2007
parent reply Alexander Panek <a.panek brainsware.org> writes:
Dan wrote:
 Alexander Panek Wrote:
 [...]
 You obviously were quite offended by this line of thought.
Might be. The reason for this may lie in the circumstances of my "career" as
a programmer. I'm not attending university yet; I'm just taking my last
year's finals at school. This means problems of an academic level are still
new to me, but I'm constantly gaining bits of knowledge about how to achieve
different kinds of goals.

However, I love beautifully designed systems in general, solving problems in
the most elegant way. This *may* involve implementing a sorting algorithm at
times, to achieve the goal of providing an elegant solution for a given
problem. Yet this is not an every-day obstacle. Good programmers are no
longer those who can write the fastest algorithms in as few lines as
possible, using almost no memory... programming is more than that. In fact,
it's not programming alone.

I've attended a school for Electrical Engineering for the last five years,
where I've learned different kinds of ways to solve problems. There are
mathematical ways, graphical ways, weirdo-graphical ways (yeah... VBA-like
:P), semi-mathematical ways, and analog electronics that can achieve what you
want. Still, it's not detailed knowledge alone that makes a good EE; it's
knowing what tools and environments are available that makes a good engineer
in general.

Apart from that... I think programmers, and engineers in general, are the
laziest people you can find when it comes to solutions, but that doesn't make
them bad engineers.

Kind regards,
Alex
Apr 18 2007
parent reply janderson <askme me.com> writes:
Alexander Panek wrote:
 Dan wrote:
 Alexander Panek Wrote:
the knowledge in detail alone that makes a good EE, it's the knowledge of what tools and environments are available that makes a good engineer, in general. Apart from that.. I think programmers, and engineers in general are the laziest people you can find, when it comes to solutions, but that doesn't make them bad engineers. Kind regards, Alex
Agreed, the best engineers are the laziest ones. They normally write the most productive code. -Joel
Apr 18 2007
parent Brad Anderson <brad dsource.org> writes:
janderson wrote:
 Alexander Panek wrote:
 Dan wrote:
 Alexander Panek Wrote:
the knowledge in detail alone that makes a good EE, it's the knowledge of what tools and environments are available that makes a good engineer, in general. Apart from that.. I think programmers, and engineers in general are the laziest people you can find, when it comes to solutions, but that doesn't make them bad engineers. Kind regards, Alex
Agreed, the best engineers are the laziest ones. They normally write the most productive code. -Joel
http://www.lazyway.net/sample_chapter_page1.html BA
Apr 18 2007
prev sibling next sibling parent reply Lars Ivar Igesund <larsivar igesund.net> writes:
Dan wrote:

 
 I just use phobos.  It gets the job done, instead of trying to be
 everything and anything the user might possibly want to interact with;
 like the Java libraries.
 
 I find the Java libraries are so freakin' huge and multipurpose that they
 couldn't possibly be even close to efficient or practical for any one
 solution.
 
 Likewise I fear for Tango, as it's got that OO gleam in its eye and is
 implementing classes for crypto.. I mean.. crypto!?  What's next, a math
 class?  *shudders*
Seriously, classes don't inherently mean inefficient, as you seem to believe.
Tango takes pride in being an efficient library, and if it is provably slower
than other solutions, whether they are OO or procedural, then we would like
to know, as it is most likely a bug.

Note that the overhead with classes is most often related to the allocation
cost. In very many cases, when it makes sense to keep the object alive for a
period, keeping state can help you make many operations more efficient than
if they had to be done repeatedly through a free function, quickly overcoming
the initial allocation cost. This also often leads to easier overall use.
Then you have the cases where the related operations (especially disk I/O)
are so expensive in themselves that whether classes are used or not won't be
noticed in the grand scheme of things. They may, however, ease use through
encapsulation.

As for the String class: yes, Tango has one, but we also have free functions
for the same operations, and in most cases we provide versions that do not
induce heap activity. Additionally, we try to design the library to be
flexible and easy to use, and coherent with itself. We even worry about
binary sizes.

-- 
Lars Ivar Igesund

blog at http://larsivi.net
DSource, #d.tango & #D: larsivi
Dancing the Tango
Apr 18 2007
parent reply Dan <murpsoft hotmail.com> writes:
Lars Ivar Igesund Wrote:

 Dan wrote:
 
 
 I just use phobos.  It gets the job done, instead of trying to be
 everything and anything the user might possibly want to interact with;
 like the Java libraries.
 
 I find the Java libraries are so freakin' huge and multipurpose that they
 couldn't possibly be even close to efficient or practical for any one
 solution.
 
 Likewise I fear for Tango, as it's got that OO gleam in its eye and is
 implementing classes for crypto.. I mean.. crypto!?  What's next, a math
 class?  *shudders*
Seriously, classes don't inherently mean inefficient, as you seem to believe.
Tango takes pride in being an efficient library, and if it is provably slower
than other solutions, whether they are OO or procedural, then we would like
to know, as it is most likely a bug. Note that the overhead with classes is
most often related to the allocation cost. In very many cases, when it makes
sense to keep the object alive for a period, keeping state can help you make
many operations more efficient than if they had to be done repeatedly through
a free function, quickly overcoming the initial allocation cost. This also
often leads to easier overall use. Then you have the cases where the related
operations (especially disk I/O) are so expensive in themselves that whether
classes are used or not won't be noticed in the grand scheme of things. They
may, however, ease use through encapsulation. As for the String class: yes,
Tango has one, but we also have free functions for the same operations, and
in most cases we provide versions that do not induce heap activity.
Additionally, we try to design the library to be flexible and easy to use,
and coherent with itself. We even worry about binary sizes.
~~ That's respectable.

On that note, I think I'll examine the Tango sources. Whoever is working on
Tangobos is definitely doing us all a favor.

So, if I could demonstrate that many (if not all) of these implementations
should really be using a struct, for strictly performance reasons, for
example, would that have any weight? I find that structs tend to be able to
do a lot in D, with interfaces being the only remaining thing I miss.
Apr 18 2007
next sibling parent reply Dan <murpsoft hotmail.com> writes:
Dan Wrote:
 On that note, I think I'll examine the Tango sources.  Whomever is working on
Tangobos is definitely doing us all a favor.
Oh my...

I went through tango.core.Array and tango.math.Math and personally found
several rather junior mistakes like:

int abs(int x){
  return x > 0? x : -x;
  // should be: return x &= 0x7FFF_FFFF;
}

and all sorts of template mess in Array and String just to implement things
like find, as well as several equivalents to opAssign, opAdd, opCat, etc.

I spent three minutes quickly trying out a template that could implement find
on an array of any type using delegates or functions, as well as trying to
remember what an operator overload would be classified as; and to be honest I
did find it somewhat disorienting, so I can understand that part.

I wonder if Phobos is likewise written...
Apr 18 2007
next sibling parent reply Frits van Bommel <fvbommel REMwOVExCAPSs.nl> writes:
Dan wrote:
 
 Oh my...
 
 I went through tango.core.Array and tango.math.Math and personally found
several rather junior mistakes like:
 
 int abs(int x){
   return x > 0? x : -x;
  // should be: return x &= 0x7FFF_FFFF;
 }
... That's *so* wrong... You may want to save yourself some embarrassment by
checking code before you post it here:

=====
$ cat test.d
import std.stdio;

int abs(int x) { return x &= 0x7fff_ffff; }

void main() { writefln(abs(-2)); }

$ dmd -run test.d
2147483646
=====

hint: abs(-2) != 2147483646 :P

(Most computers use a representation called "two's complement" to store
signed integers. See http://en.wikipedia.org/wiki/Two's_complement for
details)

Though returning an _unsigned_ integer instead of a signed one would arguably
improve the "abs" function. (Try calling the current version with int.min to
see what I mean)
Apr 18 2007
parent reply Dan <murpsoft hotmail.com> writes:
Frits van Bommel Wrote:

 Dan wrote:
 
 Oh my...
 
 I went through tango.core.Array and tango.math.Math and personally found
several rather junior mistakes like:
 
 int abs(int x){
   return x > 0? x : -x;
  // should be: return x &= 0x7FFF_FFFF;
 }
... That's *so* wrong... You may want to save yourself some embarrassment by checking code before you post it here:
Bah... you know what I meant. Unpredicted branches on an x86 cost roughly 6
cycles, not including the neg; we assume for both cases that we're inlining
the function (no call overhead). What I intended to show was that you could
do it better with a touch of bit math.

I think the right way is actually to use something like asm's:

rol EAX, 1; // 1000_0001 becomes 0000_0011
shr EAX, 1; // 0000_0011 becomes 0000_0001

Of course, I can't test this. I'm at work, not at home with my development
box. I may have also used the wrong right-shift instruction, or used the
rotate that goes through the sign bit in EFLAGS. Surely one of the asm gurus
has presented this already?

So I looked around a bit. Some further reading:
http://www.azillionmonkeys.com/qed/2scomp.html
http://graphics.stanford.edu/~seander/bithacks.html#IntegerAbs
Apr 18 2007
parent Don Clugston <dac nospam.com.au> writes:
Dan wrote:
 Frits van Bommel Wrote:
 
 Dan wrote:
 Oh my...

 I went through tango.core.Array and tango.math.Math and personally found
several rather junior mistakes like:

 int abs(int x){
   return x > 0? x : -x;
  // should be: return x &= 0x7FFF_FFFF;
 }
... That's *so* wrong... You may want to save yourself some embarrassment by checking code before you post it here:
Bah... you know what I meant. Unpredicted branches on an x86 cost roughly 6 cycles, not including the neg; we assume for both cases that we're inlining the function (no call overhead) What I intended to show was that you could do it better with a touch of bit math. I think the right way is actually to use something like asm's: rol EAX, 1; // 1000_0001 becomes 0000_0011 shr EAX, 1; // 0000_0011 becomes 0000_0001 Of course, I can't test this. I'm at work, not at home with my development box. I may have also used the wrong right shift instruction, or used the rotate that goes through the sign bit in EFLAGS. Surely one of the asm gurus has presented this already? So I looked around a bit. Some further reading: http://www.azillionmonkeys.com/qed/2scomp.html http://graphics.stanford.edu/~seander/bithacks.html#IntegerAbs
BTW I'm mentioned on that second page <g>. Seriously, the emphasis to this point in Tango has been getting the interface and the implementation correct, rather than low-level optimisation (since it can be done later). There's a lot still to be done. Any contributions are very welcome.
Apr 19 2007
prev sibling next sibling parent Sean Kelly <sean f4.ca> writes:
Dan wrote:
 
 I went through tango.core.Array and tango.math.Math and personally found
several rather junior mistakes like:
 
 int abs(int x){
   return x > 0? x : -x;
  // should be: return x &= 0x7FFF_FFFF;
 }
 
 and all sorts of template mess in Array and String just to implement things
like find as well as several equivalents to opAssign, opAdd, opCat etc.
 
 I spent three minutes quickly trying out a template that could implement find
on an array of any type using delegates or functions, as well as trying to
remember what an operator overload would be classified as; and to be honest I
did find it somewhat disorienting so I can understand that part.
You're welcome to submit suggestions if you think the current implementation can be improved. Please note that any array routine must work for array literals, static arrays, and dynamic arrays, that duplicate generated code should be avoided whenever possible, and any default predicate should have its code inlined. Also, predicate support must allow for any callable type, not just delegates and functions. Sean
Apr 18 2007
prev sibling parent reply Derek Parnell <derek psych.ward> writes:
On Wed, 18 Apr 2007 15:09:12 -0400, Dan wrote:

 I went through tango.core.Array and tango.math.Math and personally found
 several rather junior mistakes like:
 
 int abs(int x){
   return x > 0? x : -x;
  // should be: return x &= 0x7FFF_FFFF;
 }
I disagree strongly with your "improvement" on the grounds that it assumes a particular implementation of signed integers. The compiler knows better. If I was to suggest any change it would be return x >= 0 ? x : -x; because that way if zero is supplied, it won't try to evaluate -zero. -- Derek Parnell Melbourne, Australia "Justice for David Hicks!" skype: derek.j.parnell
Apr 18 2007
parent reply Stephen Waits <steve waits.net> writes:
Derek Parnell wrote:
   return x >= 0 ? x : -x;
Or, if we're going to get that picky, and you want to prefer +0 over -0, then you can also eliminate the equality test. return x < 0 ? -x : x; I agree with you.. portability is preferred here, considering the non-existent performance difference. I'd even go as far as saying that using bit twiddling in this specific case is the "junior mistake". Why? Because it assumes that the compiler isn't already taking care of this for you. In C++, my gcc4 already optimizes both -x and x *= -1 into optimal code, similar to the bit-twiddling Dan suggested. I have to assume we can get the same under D, whether it exists today or not. --Steve
Apr 18 2007
parent reply "David B. Held" <dheld codelogicconsulting.com> writes:
Stephen Waits wrote:
 [...]
 I agree with you.. portability is preferred here, considering the 
 non-existent performance difference.
 
 I'd even go as far as saying that using bit twiddling in this specific 
 case is the "junior mistake".
 
 Why?  Because it assumes that the compiler isn't already taking care of 
 this for you.
 [...]
To highlight this point, I interviewed a candidate the other day that agonized over the best way to return a vector from a C++ function in an efficient manner. He considered heap-allocating it and returning a pointer (apparently, he thinks 'new' is fast), passing an out reference, and a few other things. And yet, he didn't once mention the fact that he was passing *in* two vectors *by value*. When I asked him the most efficient way to pass in the input vectors, I asked if he should do it by value, by reference, or by pointer. He said: "By pointer". Oy vai! Apparently, he wasn't familiar with [N]RVO, and was optimizing the wrong thing. 90% of any given program contributes only 10% to its overall performance, typically. That's why C.A.R. Hoare warned against the dangers of premature optimization (and making sure you optimize the *right* thing). Dave
Apr 19 2007
parent reply Daniel Keep <daniel.keep.lists gmail.com> writes:
David B. Held wrote:
 Stephen Waits wrote:
 [...]
 I agree with you.. portability is preferred here, considering the
 non-existent performance difference.

 I'd even go as far as saying that using bit twiddling in this specific
 case is the "junior mistake".

 Why?  Because it assumes that the compiler isn't already taking care
 of this for you.
 [...]
To highlight this point, I interviewed a candidate the other day that agonized over the best way to return a vector from a C++ function in an efficient manner. He considered heap-allocating it and returning a pointer (apparently, he thinks 'new' is fast), passing an out reference, and a few other things. And yet, he didn't once mention the fact that he was passing *in* two vectors *by value*. When I asked him the most efficient way to pass in the input vectors, I asked if he should do it by value, by reference, or by pointer. He said: "By pointer". Oy vai! Apparently, he wasn't familiar with [N]RVO, and was optimizing the wrong thing. 90% of any given program contributes only 10% to its overall performance, typically. That's why C.A.R. Hoare warned against the dangers of premature optimization (and making sure you optimize the *right* thing). Dave
This is one thing I really lament about my uni education thus far. Two topics that basically have *never* been covered in even minimal detail have been optimisation and debugging. To be honest, I wouldn't know the most efficient way to return or pass out a vector because I've never had any kind of grounding in the effects of the various ways of doing it. For instance: at what point does passing by reference become faster than by value? Wish I knew :) -- Daniel -- int getRandomNumber() { return 4; // chosen by fair dice roll. // guaranteed to be random. } http://xkcd.com/ v2sw5+8Yhw5ln4+5pr6OFPma8u6+7Lw4Tm6+7l6+7D i28a2Xs3MSr2e4/6+7t4TNSMb6HTOp5en5g6RAHCP http://hackerkey.com/
Apr 19 2007
next sibling parent reply "David B. Held" <dheld codelogicconsulting.com> writes:
Daniel Keep wrote:
 [...]
 This is one thing I really lament about my uni education thus far.  Two
 topics that basically have *never* been covered in even minimal detail
 have been optimisation and debugging.
 
 To be honest, I wouldn't know the most efficient way to return or pass
 out a vector because I've never had any kind of grounding in the effects
 of the various ways of doing it.  For instance: at what point does
 passing by reference become faster than by value?
This is where it really helps to have some understanding of what a compiler actually does. The "Java Syndrome" helps students treat the compiler like a magical black box where text goes in and executables come out. While this is useful for cranking out thousands of entry-level programmers that assiduously follow design specs, it's not very good for creating engineers who are also artists. If you want to understand pass-by-value vs. pass-by-reference, it helps to understand how C works, and then how assembly works. Only when you get an idea for the actual instructions involved and how much time they take, relatively speaking, does it become practical to estimate these things. You can never be 100% sure what code a compiler is going to generate without looking at the output, but after you learn a few languages, you can make pretty good guesses. For function call performance, you need to understand registers, memory, stacks, call frames, and calling conventions. You also need to understand that references are just pointers that get automagically dereferenced on use. Once you put these things together, it starts to become fairly clear when one is preferred over the other (though there is definitely a gray area where it depends on the actual code generation and hardware characteristics). If you really want to be a good programmer, you need to take your education into your own hands. I would venture to claim that the majority of experts in the world learned most of what they know from self-study, and not from classroom lectures and textbooks. You are particularly lucky to be alive at this point in time because you have ridiculously more free information available to you than I did when I was growing up. I would have killed to have as much of the internet to access as you do. If you are really serious about your education, you will soon find your classes boring, because most of your learning will be self-directed. 
But you have to be fairly motivated to engage in that kind of learning, and everyone has different personalities. In the end, a diploma only proves that you will jump through hoops for The Man (which is important for getting a job in the corporate world, where docility is rewarded and innovation is discouraged). Dave
Apr 19 2007
next sibling parent Sean Kelly <sean f4.ca> writes:
David B. Held wrote:
 Daniel Keep wrote:
 [...]
 This is one thing I really lament about my uni education thus far.  Two
 topics that basically have *never* been covered in even minimal detail
 have been optimisation and debugging.
I took a university course on software testing a while back and found it to be fairly useful. It opened with proofs of correctness and went from there. Not directly related to debugging, but close enough to bear mentioning.
 If you really want to be a good programmer, you need to take your 
 education into your own hands.  I would venture to claim that the 
 majority of experts in the world learned most of what they know from 
 self-study, and not from classroom lectures and textbooks.
Definitely.
 If you are really serious about your education, you will soon find your 
 classes boring, because most of your learning will be self-directed. But 
 you have to be fairly motivated to engage in that kind of learning, and 
 everyone has different personalities.  In the end, a diploma only proves 
 that you will jump through hoops for The Man (which is important for 
 getting a job in the corporate world, where docility is rewarded and 
 innovation is discouraged).
True enough, but a lot of importance is placed on degrees nevertheless. My advice would be not to go to college until you're ready to take it seriously, because there's no point in paying for a service you're not going to use. But don't look to formal education as the beginning and end of the learning process. In the professional sector, credentials might get you an interview but they won't get you a job. Sean
Apr 20 2007
prev sibling next sibling parent Daniel Keep <daniel.keep.lists gmail.com> writes:
David B. Held wrote:
 Daniel Keep wrote:
 [...]
 This is one thing I really lament about my uni education thus far.  Two
 topics that basically have *never* been covered in even minimal detail
 have been optimisation and debugging.

 To be honest, I wouldn't know the most efficient way to return or pass
 out a vector because I've never had any kind of grounding in the effects
 of the various ways of doing it.  For instance: at what point does
 passing by reference become faster than by value?
This is where it really helps to have some understanding of what a compiler actually does. The "Java Syndrome" helps students treat the compiler like a magical black box where text goes in and executables come out. While this is useful for cranking out thousands of entry-level programmers that assiduously follow design specs, it's not very good for creating engineers who are also artists.
Well, it's a good thing that I loathe Java, then ;)
 If you want to understand pass-by-value vs. pass-by-reference, it helps
 to understand how C works, and then how assembly works.  Only when you
 get an idea for the actual instructions involved and how much time they
 take, relatively speaking, does it become practical to estimate these
 things.  You can never be 100% sure what code a compiler is going to
 generate without looking at the output, but after you learn a few
 languages, you can make pretty good guesses.
I think I've got a fairly good understanding of *how* things work at the instruction level. My current failing is, I believe, the part about understanding their relative speed under different circumstances. The example I gave in my previous posting was in regards to my vector struct; I don't understand the relative speeds of copying the vectors onto the stack and dereferencing pointers to them to be able to comfortably make a decision on which to use.
 For function call performance, you need to understand registers, memory,
 stacks, call frames, and calling conventions.  You also need to
 understand that references are just pointers that get automagically
 dereferenced on use.  Once you put these things together, it starts to
 become fairly clear when one is preferred over the other (though there
 is definitely a gray area where it depends on the actual code generation
 and hardware characteristics).
 
 If you really want to be a good programmer, you need to take your
 education into your own hands.  I would venture to claim that the
 majority of experts in the world learned most of what they know from
 self-study, and not from classroom lectures and textbooks.  You are
 particularly lucky to be alive at this point in time because you have
 ridiculously more free information available to you than I did when I
 was growing up.  I would have killed to have as much of the internet to
 access as you do.
 
 If you are really serious about your education, you will soon find your
 classes boring, because most of your learning will be self-directed. But
 you have to be fairly motivated to engage in that kind of learning, and
 everyone has different personalities.  In the end, a diploma only proves
 that you will jump through hoops for The Man (which is important for
 getting a job in the corporate world, where docility is rewarded and
 innovation is discouraged).
 
 Dave
Well, I basically taught myself to program from books and online tutorials. Sadly, my first languages were all BASICs (GWBASIC, QBASIC and then Visual Basic 5 & 6), although I hope I've made up for that by taking an interest in a wider variety of languages. My comp.sci. degree was boring until about the third year when it finally started to get interesting :) That said, I don't think I'll ever really be able to learn enough... all I can do is continue to read and experiment, hopefully bettering myself in the process. -- Daniel -- int getRandomNumber() { return 4; // chosen by fair dice roll. // guaranteed to be random. } http://xkcd.com/ v2sw5+8Yhw5ln4+5pr6OFPma8u6+7Lw4Tm6+7l6+7D i28a2Xs3MSr2e4/6+7t4TNSMb6HTOp5en5g6RAHCP http://hackerkey.com/
Apr 20 2007
prev sibling parent reply Stephen Waits <steve waits.net> writes:
David B. Held wrote:
 
 compiler actually does.  The "Java Syndrome" helps students treat the 
Ahh, thanks for giving it a name. I'll add that to our vocabulary here. We've been seeing this get worse and worse in the past 5 years. It's to the point now where entry-level candidates we interview, from some high-profile schools, cannot write a basic recursive function (factorial) or demonstrate any knowledge about pointers, memory, or anything to do with bits. That's been my experience anyway. Basically, seems like the students aren't learning much about the machine any more. Were they ever? Or were us "old-timers" (I'm 35, not quite an old timer, but whatever) just so excited about the whole thing that we all spent way too much time learning stuff on our own? (I also quit college so I could learn more) I fear for some of the guys coming through here, that some day they may find themselves inside a paper bag. --Steve
Apr 20 2007
next sibling parent Sean Kelly <sean f4.ca> writes:
Stephen Waits wrote:
 David B. Held wrote:
 compiler actually does.  The "Java Syndrome" helps students treat the 
Ahh, thanks for giving it a name. I'll add that to our vocabulary here. We've been seeing this get worse and worse in the past 5 years. It's to the point now where entry-level candidates we interview, from some high-profile schools, cannot write a basic recursive function (factorial) or demonstrate any knowledge about pointers, memory, or anything to do with bits.
I have had similar problems interviewing new graduates who learned with Java rather than C++. Fortunately, it's not universal. One interviewee I remember was taught using Java but reasoned correctly about the C/C++ questions using their knowledge of architecture. I've never had an interviewee do that before, even the ones who know C/C++.
 Basically, seems like the students aren't learning much about the 
 machine any more.  Were they ever?  Or were us "old-timers" (I'm 35, not 
 quite an old timer, but whatever) just so excited about the whole thing 
 that we all spent way too much time learning stuff on our own?  (I also 
 quit college so I could learn more)
Well, I think it is less important what a student is taught than what they learn ;-) But simple exposure to explicit memory management, stack-based vs. dynamic data, etc, makes a noticeable difference. If nothing else, the average graduate taught with C, C++, or Pascal will have some basic concept of what it means for data to be on the stack vs. the heap.
 I fear for some of the guys coming through here, that some day they may 
 find themselves inside a paper bag.
Same here. However, if a field doesn't interest someone enough to inspire them to learn about it on their own time then they are probably in the wrong field. If someone is pursuing CS for the money, they are in for a disappointment. Sean
Apr 20 2007
prev sibling parent reply Walter Bright <newshound1 digitalmars.com> writes:
Stephen Waits wrote:
 Basically, seems like the students aren't learning much about the 
 machine any more.  Were they ever?  Or were us "old-timers" (I'm 35, not 
 quite an old timer, but whatever) just so excited about the whole thing 
 that we all spent way too much time learning stuff on our own?  (I also 
 quit college so I could learn more)
A guy in my dorm in college built a CPU board out of random logic (NAND, NOR gates) just for fun. Those are the kind of guys you want to hire! I think it was Niven who wrote that a real scientist was one who'd fearlessly peer through the gates of hell if he thought he could learn something. It isn't hard to tell the real engineers from the "just a job" folks in a job interview. The real engineers: 1) did weird projects in their spare time, just for fun, not for credit 2) regard getting a degree as incidental, and wind up leaving it forgotten in the bottom of a drawer 3) took the hard classes that weren't required 4) didn't duck the calculus classes 5) can enthusiastically describe their projects 6) can tell you how bright an LED can glow if you file down the housing a bit and stick it in liquid nitrogen The charlatans: 1) did nothing that wasn't required 2) generally complain that their hard work goes unrecognized 3) are much more interested in the salary & benefits rather than what the work is 4) have difficulty describing just what their last project was and what their contribution to it was 5) complain about outsourcing or foreigners taking their jobs 6) never made beersicles from pouring beer into liquid nitrogen
Apr 20 2007
next sibling parent reply Stephen Waits <steve waits.net> writes:
Walter Bright wrote:
 It isn't hard to tell the real engineers from the "just a job" folks in 
 a job interview. The real engineers:
 
 1) did weird projects in their spare time, just for fun, not for credit
 2) regard getting a degree as incidental, and wind up leaving it 
 forgotten in the bottom of a drawer
 3) took the hard classes that weren't required
 4) didn't duck the calculus classes
 5) can enthusiastically describe their projects
 6) can tell you how bright an LED can glow if you file down the housing 
 a bit and stick it in liquid nitrogen
 
 The charlatans:
 
 1) did nothing that wasn't required
 2) generally complain that their hard work goes unrecognized
 3) are much more interested in the salary & benefits rather than what 
 the work is
 4) have difficulty describing just what their last project was and what 
 their contribution to it was
 5) complain about outsourcing or foreigners taking their jobs
 6) never made beersicles from pouring beer into liquid nitrogen
Nicely said. This is basically what we go by too; except, maybe for the liquid nitrogen parts. Sounds like somebody had some fun back in the day... :) --Steve
Apr 20 2007
parent reply Walter Bright <newshound1 digitalmars.com> writes:
Stephen Waits wrote:
 This is basically what we go by too; except, maybe for the liquid 
 nitrogen parts.  Sounds like somebody had some fun back in the day...  :)
There was making a flamethrower out of a lawnmower, too, and the time some friends discovered how a stereo could be used to shake the whole building, and lots of other stuff <g>.
Apr 20 2007
parent reply Dan <murpsoft hotmail.com> writes:
Walter Bright Wrote:

 Stephen Waits wrote:
 This is basically what we go by too; except, maybe for the liquid 
 nitrogen parts.  Sounds like somebody had some fun back in the day...  :)
There was making a flamethrower out of a lawnmower, too, and the time some friends discovered how a stereo could be used to shake the whole building, and lots of other stuff <g>.
Yeah, I enjoyed playing with thermite, and would maim to get my hands on a huge roll of aerogel. Apart from women, it's software and physics that interest me the most; and I'm not much for small talk. : )
Apr 20 2007
parent BCS <BCS pathlink.com> writes:
Dan wrote:
 
 Apart from women, it's software and physics that interest me the most; and I'm
not much for small talk.
 
 : )
"Science is like sex: sometimes something useful comes out, but that is not the reason we are doing it" -- Richard Feynman
Apr 23 2007
prev sibling parent reply Chris Nicholson-Sauls <ibisbasenji gmail.com> writes:
Walter Bright wrote:
 Stephen Waits wrote:
 Basically, seems like the students aren't learning much about the 
 machine any more.  Were they ever?  Or were us "old-timers" (I'm 35, 
 not quite an old timer, but whatever) just so excited about the whole 
 thing that we all spent way too much time learning stuff on our own?  
 (I also quit college so I could learn more)
A guy in my dorm in college built a CPU board out of random logic (NAND, NOR gates) just for fun. Those are the kind of guys you want to hire! I think it was Niven who wrote that a real scientist was one who'd fearlessly peer through the gates of hell if he thought he could learn something. It isn't hard to tell the real engineers from the "just a job" folks in a job interview. The real engineers: 1) did weird projects in their spare time, just for fun, not for credit 2) regard getting a degree as incidental, and wind up leaving it forgotten in the bottom of a drawer 3) took the hard classes that weren't required 4) didn't duck the calculus classes 5) can enthusiastically describe their projects 6) can tell you how bright an LED can glow if you file down the housing a bit and stick it in liquid nitrogen The charlatans: 1) did nothing that wasn't required 2) generally complain that their hard work goes unrecognized 3) are much more interested in the salary & benefits rather than what the work is 4) have difficulty describing just what their last project was and what their contribution to it was 5) complain about outsourcing or foreigners taking their jobs 6) never made beersicles from pouring beer into liquid nitrogen
Ironically, a friend of mine just recently acquired some liquid nitrogen and we've been chattering away at work planning all the things we intend to do with it. Time to add beersicles to the list... made from home-brewed beer no less. ;) Bananas and marshmallows do fun things too... especially if loaded into certain old nerf guns, but I digress. -- Chris Nicholson-Sauls -- who still dreams of cubicle nerf wars gone awry
Apr 20 2007
parent Clay Smith <clayasaurus gmail.com> writes:
Chris Nicholson-Sauls wrote:
 Walter Bright wrote:
 Stephen Waits wrote:
 Basically, seems like the students aren't learning much about the 
 machine any more.  Were they ever?  Or were us "old-timers" (I'm 35, 
 not quite an old timer, but whatever) just so excited about the whole 
 thing that we all spent way too much time learning stuff on our own?  
 (I also quit college so I could learn more)
A guy in my dorm in college built a CPU board out of random logic (NAND, NOR gates) just for fun. Those are the kind of guys you want to hire! I think it was Niven who wrote that a real scientist was one who'd fearlessly peer through the gates of hell if he thought he could learn something. It isn't hard to tell the real engineers from the "just a job" folks in a job interview. The real engineers: 1) did weird projects in their spare time, just for fun, not for credit 2) regard getting a degree as incidental, and wind up leaving it forgotten in the bottom of a drawer 3) took the hard classes that weren't required 4) didn't duck the calculus classes 5) can enthusiastically describe their projects 6) can tell you how bright an LED can glow if you file down the housing a bit and stick it in liquid nitrogen The charlatans: 1) did nothing that wasn't required 2) generally complain that their hard work goes unrecognized 3) are much more interested in the salary & benefits rather than what the work is 4) have difficulty describing just what their last project was and what their contribution to it was 5) complain about outsourcing or foreigners taking their jobs 6) never made beersicles from pouring beer into liquid nitrogen
Ironically, a friend of mine just recently acquired some liquid nitrogen and we've been chattering away at work planning all the things we intend to do with it. Time to add beersicles to the list... made from home-brewed beer no less. ;) Bananas and marshmallows do fun things too... especially if loaded into certain old nerf guns, but I digress. -- Chris Nicholson-Sauls -- who still dreams of cubicle nerf wars gone awry
I heard that ice cream created with liquid nitrogen tastes very good and it is very quick to make as well.
Apr 24 2007
prev sibling next sibling parent reply Stephen Waits <steve waits.net> writes:
Daniel Keep wrote:
 
 To be honest, I wouldn't know the most efficient way to return or pass
 out a vector because I've never had any kind of grounding in the effects
 of the various ways of doing it.  For instance: at what point does
 passing by reference become faster than by value?
10 second lesson on optimization: * Memory access is slow. Both reading and writing. It's generally been that way for a long time, and appears to be staying that way. * Optimize what is slowest. * Know what's slowest by profiling. * The largest gains generally come from larger algorithmic changes. --Steve
Apr 20 2007
next sibling parent Dan <murpsoft hotmail.com> writes:
Stephen Waits Wrote:
 * Memory access is slow.  Both reading and writing.  It's generally been 
 that way for a long time, and appears to be staying that way.
 
 * Optimize what is slowest.
Optimize what consumes the most time when the user needs it most. Having a slow idle process that consumes CPU time doesn't mean you need to make the idle process more efficient. : )
 * Know what's slowest by profiling.
This involves more than invoking a program. It involves some heuristics.
 * The largest gains generally come from larger algorithmic changes.
I find trivial things like aligning heavily used structs and using in/out/inout parameters in the right places make a big difference and don't even require significant thought. ~~~ To answer his question: a reference is typically a void**, which essentially means a pair of 32-bit (on a 32-bit system) values stored somewhere. Those values are used as indexes into the massive array that is your program's memory space (a.k.a. pointers). If both pointers are already loaded into cache, by virtue of being on the same memory page as previously executed data, then dereferencing can take as few as 4 cycles. In the event of a cache miss, it can take several hundred. Typically they're kept on the stack, so you almost always land towards the low end. Passing by value tends to involve moving the data itself, either through registers or the stack (at one mov instruction per 32-bit dword). If the data is less than 128 bits, it tends to be cheaper to pass by value than by reference; with SSE2, that can be true for data up to 512 bits. Additionally, once the data has been passed by value, you can manipulate it on the spot: use inout parameters for direct manipulation and in parameters when you only want to touch a copy of the data (copies are as cheap as the original, unlike when passed by reference). I may have missed some things. Regardless, those are my thoughts on it atm. Sincerely, Dan
Apr 20 2007
prev sibling next sibling parent Sean Kelly <sean f4.ca> writes:
Stephen Waits wrote:
 Daniel Keep wrote:
 To be honest, I wouldn't know the most efficient way to return or pass
 out a vector because I've never had any kind of grounding in the effects
 of the various ways of doing it.  For instance: at what point does
 passing by reference become faster than by value?
10 second lesson on optimization: * Memory access is slow. Both reading and writing. It's generally been that way for a long time, and appears to be staying that way. * Optimize what is slowest.
Optimizers (within the compiler) put a weird spin on some of these however, because they take care of most fine-grained optimizations automatically. I think these are good things to keep in mind, but they are subordinate to what you mention below:
 * Know what's slowest by profiling.
 
 * The largest gains generally come from larger algorithmic changes.
Sean
Apr 20 2007
prev sibling parent reply Derek Parnell <derek psych.ward> writes:
On Fri, 20 Apr 2007 10:27:14 -0700, Stephen Waits wrote:

 Daniel Keep wrote:
 
 To be honest, I wouldn't know the most efficient way to return or pass
 out a vector because I've never had any kind of grounding in the effects
 of the various ways of doing it.  For instance: at what point does
 passing by reference become faster than by value?
10 second lesson on optimization: * Memory access is slow. Both reading and writing. It's generally been that way for a long time, and appears to be staying that way.
'Slow' compared to Register access rather than Disk access <G> In other words, try to use the data that is in registers rather than get it (again) from RAM, and try to use registers as temporary/intermediate areas.
 * Optimize what is slowest.
 
 * Know what's slowest by profiling.
And concentrate on slow functions that are used frequently rather than those that are rarely used. I've seen people spend way too much time shaving milliseconds off a program's initialization section rather than dealing with areas that run thousands of times each time the application is executed.
 * The largest gains generally come from larger algorithmic changes.
YES! Time spent working on getting the best algorithm is always going to pay dividends. Improving the efficiency of a bubble-sort for a million-element array is probably a waste of time. -- Derek Parnell Melbourne, Australia "Justice for David Hicks!" skype: derek.j.parnell
Apr 20 2007
parent Stephen Waits <steve waits.net> writes:
Derek Parnell wrote:
 On Fri, 20 Apr 2007 10:27:14 -0700, Stephen Waits wrote:
 * Know what's slowest by profiling.
A concentrate on slow functions that are used frequently rather those that are rarely used. I've seen people spend way to much time on shaving off milliseconds from a program's initialization section rather than deal with other areas that are run thousands of times each time the application is executed.
Yes, exactly what I meant; only said more eloquently. There's only so much I can teach in 10 seconds! --Steve
Apr 20 2007
prev sibling parent reply Walter Bright <newshound1 digitalmars.com> writes:
Daniel Keep wrote:
 This is one thing I really lament about my uni education thus far.  Two
 topics that basically have *never* been covered in even minimal detail
 have been optimisation and debugging.
Most great programmers didn't learn programming by taking college courses. They learned it on their own. Programming has the nice characteristic that it is fairly straightforward to learn on your own. Want to take advantage of what a university can offer? Taking the basic course in each of the following will pay you lifelong dividends:

1) Calculus
2) Accounting
3) Physics
4) Chemistry
5) Statistics
6) Electronics
Apr 20 2007
parent reply Dave <Dave_member pathlink.com> writes:
Walter Bright wrote:
 Daniel Keep wrote:
 This is one thing I really lament about my uni education thus far.  Two
 topics that basically have *never* been covered in even minimal detail
 have been optimisation and debugging.
Most great programmers didn't learn programming by taking college courses. They learned it on their own. Programming has the nice characteristic that it is fairly straightforward to learn on your own. Want to take advantage of what a university can offer? Taking the basic course in each of the following will pay you lifelong dividends:

1) Calculus
2) Accounting
3) Physics
4) Chemistry
5) Statistics
6) Electronics
I couldn't agree more. Software development is and always has been a craft(*) - part science, part art. Colleges have never been really good at teaching crafts -- no such thing as a "Bachelor of Crafts in Software Development" <g>

Personally, the best developers I've ever known (again, personally -- please note the next paragraph) have almost without exception been formally trained/educated for something else, and most of the CS majors went into management (but then again, maybe they're truly the smart ones <g>).

That said, there are *a lot* of super-smart CS students and graduates in this group, I gather. If it's truly what you love doing, a CS or related degree can only help, because you get to goof off for 4 years doing what you like and learning too <G>

(*) Credit to Alan Cooper ("The Father of Visual Basic") for the insight that software development is really more of a craft than a science.
Apr 20 2007
parent reply Chris Nicholson-Sauls <ibisbasenji gmail.com> writes:
Dave wrote:
 Walter Bright wrote:
 Daniel Keep wrote:
 This is one thing I really lament about my uni education thus far.  Two
 topics that basically have *never* been covered in even minimal detail
 have been optimisation and debugging.
Most great programmers didn't learn programming by taking college courses. They learned it on their own. Programming has the nice characteristic that it is fairly straightforward to learn on your own. Want to take advantage of what a university can offer? Taking the basic course in each of the following will pay you lifelong dividends:

1) Calculus
2) Accounting
3) Physics
4) Chemistry
5) Statistics
6) Electronics
I couldn't agree more. Software development is and always has been a craft(*) - part science, part art. Colleges have never been really good at teaching crafts -- no such thing as a "Bachelor of Crafts in Software Development" <g>

Personally, the best developers I've ever known (again, personally -- please note the next paragraph) have almost without exception been formally trained/educated for something else, and most of the CS majors went into management (but then again, maybe they're truly the smart ones <g>).

That said, there are *a lot* of super-smart CS students and graduates in this group, I gather. If it's truly what you love doing, a CS or related degree can only help, because you get to goof off for 4 years doing what you like and learning too <G>

(*) Credit to Alan Cooper ("The Father of Visual Basic") for the insight that software development is really more of a craft than a science.
I tend to tell people that all forms of art seemingly arise from some form of science. Programming just happens to be an art form still closely linked to its base science. And our own Walter -- if I recall right -- is a prime example of a major developer whose background is in something else. I'm pretty sure those airplanes didn't require new compilers.

-- Chris Nicholson-Sauls
Apr 20 2007
next sibling parent reply Walter Bright <newshound1 digitalmars.com> writes:
Chris Nicholson-Sauls wrote:
 I tend to tell people that all forms of art seemingly arise from some 
 form of science. Programming just happens to be an artform still closely 
 linked to its base science.  And our own Walter -- if I recall right -- 
 is a prime example of a major developer whose background is in something 
 else.  I'm pretty sure those airplanes didn't require new compilers.
My training is as a mechanical engineer, with an emphasis on jet engines. I was fortunate enough to attend a university (Caltech) that thoroughly believed that all their sci/eng majors should be well grounded in a broad range of fields, and as I've gotten older and wiser I see the value in it now. Caltech requires of all its graduates:

o    3 years of calculus
o    2 years physics
o    1 year chemistry

among other courses.
Apr 22 2007
parent reply Bill Baxter <dnewsgroup billbaxter.com> writes:
Walter Bright wrote:
 Chris Nicholson-Sauls wrote:
 I tend to tell people that all forms of art seemingly arise from some 
 form of science. Programming just happens to be an artform still 
 closely linked to its base science.  And our own Walter -- if I recall 
 right -- is a prime example of a major developer whose background is 
 in something else.  I'm pretty sure those airplanes didn't require new 
 compilers.
My training is as a mechanical engineer, with an emphasis on jet engines. I was fortunate enough to attend a university (Caltech) that thoroughly believed that all their sci/eng majors should be well grounded in a broad range of fields, and as I've gotten older and wiser I see the value in it now. Caltech requires of all its graduates:

o    3 years of calculus
o    2 years physics
o    1 year chemistry

among other courses.
If all you know is CS, then I think you're restricting the kind of work you can do. It's not too tough to figure out how to be a competent programmer coming from a hard science or engineering discipline. But going the other way is pretty much impossible. My tack was to take a lot of CS courses, because they were fun and relatively easy, but go with EE as the major. It was much more difficult, but I'm glad I did it that way. The decent grounding in calculus, linear algebra, Fourier analysis etc. that I got from that has allowed me to do things I never would have been able to consider had I just gotten the CS education.

I've heard that CS departments at schools these days are suffering from a big drop in the number of majors. But that seems to me to be as it should be. The IT boom brought on a lot of silliness. You really don't need a CS degree to do most IT jobs. Yes, *everybody* needs to know how to work with computers these days to varying degrees. Just like everyone needs math to varying degrees. But that doesn't mean there need to be a lot of math majors, or CS majors. Almost everyone takes a class or two from the math department, but very few major in it. Likewise, pretty much everyone these days should have a class or two from the CS dept, but we don't really need that many majors.

--bb
Apr 22 2007
next sibling parent reply Walter Bright <newshound1 digitalmars.com> writes:
Bill Baxter wrote:
 If all you know is CS, then I think you're restricting the kind of work 
 you can do.  It's not too tough to figure out how to be a competent 
 programmer coming from a hard science or engineering discipline.  But 
 going the other way is pretty much impossible.  My tack was to take a 
 lot of CS courses, because they were fun and relatively easy, but go 
 with EE as the major.  It was much more difficult, but I'm glad I did it 
 that way.  The decent grounding in calculus, linear algebra, Fourier 
 analysis etc that I got from that has allowed me to do things I never 
 would have been able to consider had I just gotten the CS education.
I agree. When I worked at Boeing, it was in the early days of using computers for engineering analysis. There were problems because the programmers didn't understand engineering, and the engineers didn't understand programming. So there'd be programs that worked great but solved the wrong problem. My lead engineer wryly remarked once that I was the only one he'd worked with who brought back numbers from the computer that weren't garbage. (Things have changed a lot since then; my friends who work there tell me that everything is done on computers now.)

I've never seen anyone learn calculus outside of the classroom, but I've seen plenty of people who learned programming outside of one.
 I've heard that CS departments at schools these days are suffering from 
 a big drop in the number of majors.  But that seems to me to be as it 
 should be.  The IT boom brought on a lot of silliness.  You really don't 
 need a CS degree to do most IT jobs.  Yes, *everybody* needs to know how 
 to work with computers these days to varying degrees.  Just like everyone 
 needs math to varying degrees.  But that doesn't mean there need to be a 
 lot of math majors, or CS majors.   Almost everyone takes a class or two 
 from the math department, but very few major in it.  Likewise, pretty 
 much everyone these days should have a class or two from the CS dept, 
 but we don't really need that many majors.
In defense of CS majors, Andrei has an academic CS background, and he's been a huge help in taking my limited back-of-envelope approach to the next level. I once saw a news program on cheaters in universities. Students would shamelessly state on camera that they cheated whenever they could get away with it. I just find that stunning. Several would justify it with the claim that since they'll never use 97% of what they learn in college, there was no point in learning it, and therefore it was fine to cheat. Just, wow. How pathetic and contemptible. A friend of mine went through MIT, and he told me that after he was there for a while he had an epiphany. MIT wasn't teaching him things. MIT was teaching him how to think. And that's what the calculus, physics, etc., classes will give you. Sure it's hard, but that's what it takes to rewire your brain <g>.
Apr 23 2007
parent reply Bill Baxter <dnewsgroup billbaxter.com> writes:
Walter Bright wrote:
 Bill Baxter wrote:
 I've heard that CS departments at schools these days are suffering 
 from a big drop in the number of majors.  But that seems to me to be 
 as it should be.  The IT boom brought on a lot of silliness.  You 
 really don't need a CS degree to do most IT jobs.  Yes, *everybody* 
 needs to know how to work with computers these days to varying degrees.  
 Just like everyone needs math to varying degrees.  But that doesn't 
 mean there need to be a lot of math majors, or CS majors.   Almost 
 everyone takes a class or two from the math department, but very few 
 major in it.  Likewise, pretty much everyone these days should have a 
 class or two from the CS dept, but we don't really need that many majors.
In defense of CS majors, Andrei has an academic CS background, and he's been a huge help in taking my limited back-of-envelope approach to the next level.
Yeh, I'm not saying there's anything inherently wrong with a CS degree. But you may be better off with something else, unless what you really want to do is take fundamental concepts of computing to the next level. Just like you don't major in math just to learn how to use math. You major in math if you're interested in creating *new* math. Or discovering it. Whatever you want to call it. If what you want to do is create advanced new compilers or new languages, or new algorithms, then yeh, the CS degree is a good thing, provided it's done right. --bb
Apr 23 2007
parent reply Sean Kelly <sean f4.ca> writes:
Bill Baxter wrote:
 
 Yeh, I'm not saying there's anything inherently wrong with a CS degree. 
 But you may be better off with something else, unless what you really 
 want to do is take fundamental concepts of computing to the next level. 
  Just like you don't major in math just to learn how to use math.  You 
 major in math if you're interested in creating *new* math.  Or 
 discovering it.  Whatever you want to call it.
 
 If what you want to do is create advanced new compilers or new 
 languages, or new algorithms, then yeh, the CS degree is a good thing, 
 provided it's done right.
If nothing else, a CS background should provide someone with the tools necessary to solve complex problems and prevent too many wheels from being reinvented in the process. An ignorance of basic algorithms and data structures is inexcusable for someone creating software. Sean
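A tiny illustration of why knowing the basic data structures matters -- the same membership test costs O(n) against a list and O(1) on average against a hash set, purely as a consequence of the structure chosen (a Python sketch; the data and names are made up for illustration):

```python
# Hypothetical example: the data and names are invented for illustration.
items = list(range(100_000))
as_list = items        # membership test scans the whole list: O(n)
as_set = set(items)    # membership test hashes the key: O(1) on average

# Same answers either way; only the cost per lookup differs.
assert (99_999 in as_list) and (99_999 in as_set)
assert (-1 not in as_list) and (-1 not in as_set)
```

Someone who has never met a hash table will happily scan the list a million times and then reach for micro-optimizations; someone who knows the fundamentals swaps the structure and moves on.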
Apr 23 2007
parent Dan <murpsoft hotmail.com> writes:
Sean Kelly Wrote:
 being reinvented in the process.  An ignorance of basic algorithms and 
 data structures is inexcusable for someone creating software.
Heh. : D

I agree. In fact I was arguing the same point last week and someone was coming down on me for it.

That said, I'm guilty of a few cases of ignorance, and I learn by reinventing the wheel - I also occasionally invent things that haven't been invented yet... interestingly, the process of learning that way has let me become good at (re)invention. I think to myself "self, what would be a really effective way to do this" and then I answer, "well, if I did this it would work" and then two weeks later go "eureka! It really should be this way! I wonder if anyone has done that yet..." : D

Well, at least I ask if it's been done before instead of just assuming I invented it.

Sincerely,
Dan
Apr 23 2007
prev sibling next sibling parent reply Jeff Nowakowski <jeff dilacero.org> writes:
Walter Bright wrote:
 Caltech requires of all its graduates:

 o    3 years of calculus
 o    2 years physics
 o    1 year chemistry
Bill Baxter wrote:
 If all you know is CS, then I think you're restricting the kind of work 
 you can do.
I have a Computer Science degree, and I have never needed calculus, physics, chemistry, etc. in any of my programming jobs. I resent all the time I was forced to waste taking these courses, instead of learning about my trade. Sure, these topics would have been useful if I wanted to get a programming job in a field that made use of it, but I don't want these kinds of jobs, and there are plenty of programming jobs that don't need them. I even wouldn't have minded being exposed to the topics, but I was forced to take the same physics and calculus courses as mechanical engineers. I wasted so much time memorizing formulas and learning how to solve problems that I never touched again. I'm not into heavy math or physics, though I like the concepts at a high level. This doesn't make me a bad coder.
 I've heard that CS departments at schools these days are suffering from 
 a big drop in the number of majors.
My understanding is that there is a greater emphasis on Software Engineering as a degree for those who want to actually code instead of doing academic research. This is a good thing. Too many programmers graduate from college woefully unprepared for working in the industry. -Jeff
Apr 23 2007
next sibling parent reply Walter Bright <newshound1 digitalmars.com> writes:
Jeff Nowakowski wrote:
 Walter Bright wrote:
 Caltech requires of all its graduates:

 o    3 years of calculus
 o    2 years physics
 o    1 year chemistry
Bill Baxter wrote:
 If all you know is CS, then I think you're restricting the kind of 
 work you can do.
I have a Computer Science degree, and I have never needed calculus, physics, chemistry, etc. in any of my programming jobs. I resent all the time I was forced to waste taking these courses, instead of learning about my trade.
I have (such as using physics in game code), and, of course, in the engineering coding I've done. At worst, I simply have the pleasure of knowing the basics of that stuff, and can enjoy things like the dinner I once attended, put on by the JPL mission director for a Mars probe, where I was able to follow what he was talking about. I can appreciate what the Wright bros did and why they were successful while their contemporaries failed. I can read about technical things happening and be able to understand what they're talking about. I know why those 9/11 conspiracy theories are hokum (all the "anomalies" are easily explained if you have even an elementary knowledge of physics and chemistry). It enabled me to correct a severe structural flaw in my house that the architect, structural engineer, and builder failed to notice. I have yet to find a roofer who understands what "galvanic corrosion" is, and I always check what kind of nails they use on the flashing (they're always wrong), saving me a ton of maintenance costs.

The downside (if you could call it that) is that knowledge of real physics takes away from enjoying movies that have "Hollywood physics".
 Sure, these topics would have been useful if I wanted to get a 
 programming job in a field that made use of it, but I don't want these 
 kinds of jobs, and there are plenty of programming jobs that don't need 
 them.  I even wouldn't have minded being exposed to the topics, but I 
 was forced to take the same physics and calculus courses as mechanical 
 engineers.  I wasted so much time memorizing formulas and learning how 
 to solve problems that I never touched again.
It's too bad you were made to memorize formulas. To me, that isn't what physics is about, and at Caltech we were never made to memorize/regurgitate formulas. It's about learning how to solve complex problems. That skill comes in very handy with programming. For example, a common programming problem is your program doesn't behave. How do you go about fixing it? It's the same organized way of thinking as solving a physics or calculus problem.
Apr 23 2007
parent Daniel Keep <daniel.keep.lists gmail.com> writes:
Walter Bright wrote:
 Jeff Nowakowski wrote:
 Sure, these topics would have been useful if I wanted to get a
 programming job in a field that made use of it, but I don't want these
 kinds of jobs, and there are plenty of programming jobs that don't
 need them.  I even wouldn't have minded being exposed to the topics,
 but I was forced to take the same physics and calculus courses as
 mechanical engineers.  I wasted so much time memorizing formulas and
 learning how to solve problems that I never touched again.
It's too bad you were made to memorize formulas. To me, that isn't what physics is about, and at Caltech we were never made to memorize/regurgitate formulas. It's about learning how to solve complex problems. That skill comes in very handy with programming.
Maybe that's why I'm stuck with this burning desire to *never* get within a hundred kilometres of any maths course ever again. I know *why* it's useful, but having to sit down and memorise huge numbers of formulae so that I can write down answers for one test and then never look at it again was driving me crazy. But I'm not crazy. He he.
 For example, a common programming problem is your program doesn't
 behave. How do you go about fixing it? It's the same organized way of
 thinking as solving a physics or calculus problem.
Really? I'd always been taught that combining "hit it with a hammer until it sees your way" reasoning and chemistry was A Bad Idea™. And now I find out they were wrong! Cool! Now, if you'll excuse me, I'm gonna go hammer some red phosphorus and potassium chlorate together--what's the worst that could happen?*

-- Daniel

* Serious obligatory disclaimer: Do not try this. Ever. See the third last paragraph of http://www.theodoregray.com/PeriodicTable/Stories/015.8/index.html

-- 
int getRandomNumber()
{
    return 4; // chosen by fair dice roll.
              // guaranteed to be random.
}

http://xkcd.com/

v2sw5+8Yhw5ln4+5pr6OFPma8u6+7Lw4Tm6+7l6+7D
i28a2Xs3MSr2e4/6+7t4TNSMb6HTOp5en5g6RAHCP
http://hackerkey.com/
Apr 23 2007
prev sibling parent reply Sean Kelly <sean f4.ca> writes:
Jeff Nowakowski wrote:
 
 I have a Computer Science degree, and I have never needed calculus, 
 physics, chemistry, etc. in any of my programming jobs.  I resent all 
 the time I was forced to waste taking these courses, instead of learning 
 about my trade.
But many people do. That aside, science and math courses are invaluable for teaching problem solving skills, which are useful regardless of the problem domain.
 Sure, these topics would have been useful if I wanted to get a 
 programming job in a field that made use of it, but I don't want these 
 kinds of jobs, and there are plenty of programming jobs that don't need 
 them.
I think that's likely true for entry level jobs, but over time I've been surprised at just how useful math and science knowledge has been. Even seemingly straightforward topics like economics and accounting involve a decent bit of calculus.
 I even wouldn't have minded being exposed to the topics, but I
 was forced to take the same physics and calculus courses as mechanical 
 engineers.  I wasted so much time memorizing formulas and learning how 
 to solve problems that I never touched again.
That's more specific to the school than anything. One thing I've found is that the larger state schools seem to favor testing information retention rather than problem solving. One mathematics professor I talked to recently lamented this, saying he had to teach this way because it was what students expected at the school, and not doing so would have generated a lot of complaints. Interestingly, he has also seen a steady decline in his students' performance over the years. I would guess this is because students are busier today than in the past, but it's interesting nevertheless.
 My understanding is that there is a greater emphasis on Software 
 Engineering as a degree for those who want to actually code instead of 
 doing academic research.  This is a good thing.  Too many programmers 
 graduate from college woefully unprepared for working in the industry.
Personally, I'm more interested in finding people with solid problem solving skills, good communication skills, and an ability to write clear, maintainable code than a knowledge of UML, a facility with specific tools, etc. The last bit is more related to job-specific knowledge anyway. Sean
Apr 23 2007
parent reply Jeff Nowakowski <jeff dilacero.org> writes:
Sean Kelly wrote:
 But many people do.  That aside, science and math courses are invaluable 
 for teaching problem solving skills, which are useful regardless of the 
 problem domain.
I see this idea mentioned over and over: "You won't need all this baroque knowledge we're feeding you, yet it will help your problem-solving skills." Guess what I do when I write programs? Yep, I solve problems. I learned this skill while learning to program, and every time I program it is reinforced.
 I think that's likely true for entry level jobs, but over time I've been 
 surprised at just how useful math and science knowledge has been.
I've been programming in the industry since 1993. Most people just don't need the math, and if you do, find a converted math major or mechanical engineer to help you out (unless they have forgotten all their math skills, as many of them do since they never use this stuff!).
 Personally, I'm more interested in finding people with solid problem 
 solving skills, good communication skills, and an ability to write 
 clear, maintainable code than a knowledge of UML, a facility with 
 specific tools, etc.  The last bit is more related to job-specific 
 knowledge anyway.
You can learn good problem solving skills, communication, etc. while actually learning valuable software engineering techniques. I'm not talking about learning Java or UML. I'm talking about learning how to handle errors (this topic was completely ignored during my education), distributed programming, concurrency, keeping a service up 24/7, testing, source control, project management, etc. Solving physics problems is a waste of time, unless that is something that appeals to you. -Jeff
Apr 23 2007
next sibling parent reply Stephen Waits <steve waits.net> writes:
Jeff Nowakowski wrote:
 Sean Kelly wrote:
 I see this idea mentioned over and over.  "you won't need all this 
 baroque knowledge we're feeding you, yet it will help your problem 
 solving skills".  Guess what I do when I write programs?  Yep, I solve 
 problems.  I learned this skill while learning to program, and every 
 time I program it is reinforced.
In this age of specialization, I can understand how you may not use math in your job. And, while math might help in problem solving, I don't think that's why it should be taught. Math is useful. When you're tasked with implementing the backpropagation algorithm to train neural networks, and you get to the part that says you need to choose an easily differentiable activation function, then you need some Calculus knowledge. This is a trivial example, but the type of thing I run into all the time.
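For the curious, the usual activation in that setting is the logistic sigmoid, chosen precisely because its derivative falls straight out of its own output -- a quick Python sketch of the standard formulas (Python used only for compactness):

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def sigmoid_prime(x):
    # The "easily differentiable" property: s'(x) = s(x) * (1 - s(x)),
    # so backprop can reuse the value already computed in the forward pass.
    s = sigmoid(x)
    return s * (1.0 - s)

# Sanity check against a central finite difference.
x, h = 0.7, 1e-6
numeric = (sigmoid(x + h) - sigmoid(x - h)) / (2 * h)
assert abs(sigmoid_prime(x) - numeric) < 1e-8
```

Without the calculus you can still type the formula in, but you can't pick a different activation and derive its gradient when the standard one doesn't fit.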
 I've been programming in the industry since 1993.  Most people just 
 don't need the math, and if you do, find a converted math major or 
 mechanical engineer to help you out (unless they have forgotten all 
 their math skills, as many of them do since they never use this stuff!).
Your "most people" is completely different from mine. In my line of work, *every* programmer, at minimum, applies linear algebra *every* day. Many make frequent use of calculus too. You may not have needed it in your job, but if you don't have these skills, don't apply for a job with me. :)
 You can learn good problem solving skills, communication, etc. while 
 actually learning valuable software engineering techniques.  I'm not 
Agree. --Steve
Apr 23 2007
next sibling parent Jeff Nowakowski <jeff dilacero.org> writes:
Stephen Waits wrote:
 Your "most people" is completely different from mine.  In my line of 
 work, *every* programmer, at minimum, applies linear algebra *every* 
 day.  Many make frequent use of calculus too.
Yes, in *your* line of work. There's tons and tons of software written that does not require heavy math skills. There's no reason to train all programmers in linear algebra and calculus. I think they should be exposed to it at a high level, and if they want to go deeper that should be their option.
 You may not have needed it in your job, but if you don't have these 
 skills, don't apply for a job with me.  :)
Indeed, and I never would. Isn't the free market wonderful? -Jeff
Apr 23 2007
prev sibling parent reply Derek Parnell <derek nomail.afraid.org> writes:
On Mon, 23 Apr 2007 15:00:06 -0700, Stephen Waits wrote:

 Jeff Nowakowski wrote:
 In this age of specialization, I can understand how you may not use math 
 in your job.  And, while math might help in problem solving, I don't 
 think that's why it should be taught.
 
 Math is useful.
Absolutely! The most compelling reason to learn maths in college is so that you can one day help your kids with their college homework. <G> -- Derek (skype: derek.j.parnell) Melbourne, Australia "Justice for David Hicks!" 24/04/2007 10:45:47 AM
Apr 23 2007
parent Bill Baxter <dnewsgroup billbaxter.com> writes:
Derek Parnell wrote:
 On Mon, 23 Apr 2007 15:00:06 -0700, Stephen Waits wrote:
 
 Jeff Nowakowski wrote:
 In this age of specialization, I can understand how you may not use math 
 in your job.  And, while math might help in problem solving, I don't 
 think that's why it should be taught.

 Math is useful.
Absolutely! The most compelling reason to learn maths in college is so that you can one day help your kids with their college homework. <G>
Heh heh. That's a similar reason why we need literature majors. Or classics majors. You have to have those majors in order to train new literature and classics professors for the next generation. --bb
Apr 23 2007
prev sibling parent reply Stephen Waits <steve waits.net> writes:
Jeff Nowakowski wrote:
 I see this idea mentioned over and over.  "you won't need all this 
 baroque knowledge we're feeding you, yet it will help your problem 
 solving skills".  Guess what I do when I write programs?  Yep, I solve 
 problems.  I learned this skill while learning to program, and every 
 time I program it is reinforced.
This blog post just hit my radar yesterday. Thought it was appropriate to this conversation. http://ihurl.com/2f --Steve
Apr 24 2007
parent reply Jeff Nowakowski <jeff dilacero.org> writes:
Stephen Waits wrote:
 This blog post just hit my radar yesterday.  Thought it was appropriate 
 to this conversation.
 
   http://ihurl.com/2f
And the full URL (I really dislike hidden URLs):
http://steve-yegge.blogspot.com/2006/03/math-for-programmers.html

That was a good blog article, thanks for posting it. I agree with a lot of what he says, in particular about the way math is taught. As I have said, I didn't mind the concepts so much as the attention to arcane detail. I actually do what he's been doing, that is, revisiting math as I need it when running across it in research papers.

I still think, however, that the vast majority of programmers can get by with basic algebra, discrete math, and a high-level understanding of calculus.

-Jeff
Apr 24 2007
parent Sean Kelly <sean f4.ca> writes:
Jeff Nowakowski wrote:
 Stephen Waits wrote:
 This blog post just hit my radar yesterday.  Thought it was 
 appropriate to this conversation.

   http://ihurl.com/2f
And the full URL (I really dislike hidden URLs): http://steve-yegge.blogspot.com/2006/03/math-for-programmers.html That was a good blog article, thanks for posting it. I agree with a lot of what he says, in particular about the way math is taught. As I have said, I didn't mind the concepts so much as attention to arcane detail. I actually do what he's been doing, that is revisiting math as I need it while running across it in research papers.
Donald Knuth has a book called "Concrete Mathematics" that is fairly useful for programmers. It's also very well written, which is rare.

Sean
Apr 24 2007
prev sibling next sibling parent reply Lars Ivar Igesund <larsivar igesund.net> writes:
Bill Baxter wrote:
 
 If all you know is CS, then I think you're restricting the kind of work
 you can do.  It's not too tough to figure out how to be a competent
 programmer coming from a hard science or engineering discipline.  But
 going the other way is pretty much impossible.  My tack was to take a
 lot of CS courses, because they were fun and relatively easy, but go
 with EE as the major.  It was much more difficult, but I'm glad I did it
 that way.  The decent grounding in calculus, linear algebra, Fourier
 analysis etc that I got from that has allowed me to do things I never
 would have been able to consider had I just gotten the CS education.
I have a CS master degree myself (is that what you call a major?), and I had all of the subjects you mention above. Over 5 years, I had calculus, linear algebra, the Fourier stuff, Laplace and friends (this was actually a math course especially for the computer students), statistics, physics, a tad of chemistry, discrete mathematics, basic electronics and some more digital techniques, molecular biology, pencil drawing (!), a tree/wood facade project, and introductory philosophy. In total, this amounted to almost 2 years, I think. The rest was CS related: the basics (computers, programming, etc), intermediate subjects (software engineering/planning, databases, programming languages, etc), and my chosen subjects/projects (natural languages, algorithms, graphics, and a whole year spent on projects (including the master thesis) for privately held companies). All in all, I think this is as good as it will get at a university.

Most of my fellow students ended up as consultants in Accenture and friends; very clever people, but at least a few chose their route more from the career outlook back when we started than from a true interest in computers. Those of us who chose due to the latter mostly have jobs in other areas, and quite a few have started their own companies.
 I've heard that CS departments at schools these days are suffering from
 a big drop in the number of majors.  But that seems to me to be as it
 should be.  The IT boom brought on a lot of silliness.  You really don't
 need a CS degree to do most IT jobs.  Yes, *everybody* needs to know how to
 work with computers these days to varying degrees.  Just like everyone
 needs math to varying degrees.  But that doesn't mean there need to be a
 lot of math majors, or CS majors.   Almost everyone takes a class or two
 from the math department, but very few major in it.  Likewise, pretty
 much everyone these days should have a class or two from the CS dept,
 but we don't really need that many majors.
At my university, they are seeing slightly lower grades among incoming students, but they're far from having empty seats. Norway seems to be in a somewhat special situation at the moment, though, with companies sucking up all the technical engineers coming out of Norwegian institutions. Also, Trondheim being a small city that still hosts major employers like Google, Yahoo, Sun, ARM and Atmel makes CS majors a highly sought-after group of employees. -- Lars Ivar Igesund blog at http://larsivi.net DSource, #d.tango & #D: larsivi Dancing the Tango
Apr 23 2007
parent reply Dan <murpsoft hotmail.com> writes:
When I went to university, my first year offered an extensive and exhaustive
examination of what an array is and where the sign bit can be located. The
first course was in Pascal, the second in Java.

Can you see why I didn't continue?

I had been programming since I was 12, and I understood the course well enough
that a friend of mine actually passed after a 20-minute crash course: I showed
him a language reference and explained memory in terms of a giant row of boxes.

That said, if I didn't have to subject myself to that again, I would love to
someday actually learn something. Until an institution offers learning, rather
than edumacation, I'll stick with reinventing the wheel and asking whether
someone's discovered it yet.

I learned D, jump tables, level-order binary search arrays, as well as geodesic
binary search arrays in the past six months that way.
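For the curious: a "level-order binary search array" (often called the Eytzinger layout) stores a sorted set in breadth-first order of an implicit binary tree, so a search walks indices 1, then 2 or 3, then 4..7, and so on, keeping the early comparisons on the same few cache lines. A minimal sketch in C++ — the function names (`to_level_order`, `contains`) are mine and this is one plausible reading of the structure Dan mentions, not his code:

```cpp
#include <vector>
#include <cstddef>

// Fill out[k] for the implicit tree rooted at index k. An in-order walk of
// the implicit tree (left child 2k, right child 2k+1) visits nodes in sorted
// order, so consuming the sorted input in-order yields the level-order layout.
static void build(const std::vector<int>& sorted, std::vector<int>& out,
                  std::size_t& i, std::size_t k) {
    if (k <= sorted.size()) {
        build(sorted, out, i, 2 * k);      // left subtree: smaller keys
        out[k] = sorted[i++];              // this node gets the next key
        build(sorted, out, i, 2 * k + 1);  // right subtree: larger keys
    }
}

// Convert a sorted array into level-order layout; index 0 is unused so the
// root lives at index 1 and child arithmetic stays simple.
std::vector<int> to_level_order(const std::vector<int>& sorted) {
    std::vector<int> out(sorted.size() + 1);
    std::size_t i = 0;
    build(sorted, out, i, 1);
    return out;
}

// Search by walking the implicit tree from the root; each step is a single
// compare plus an index doubling, no pointer chasing.
bool contains(const std::vector<int>& lo, int key) {
    std::size_t k = 1;
    while (k < lo.size()) {
        if (lo[k] == key) return true;
        k = 2 * k + (lo[k] < key);  // go right when the node is smaller
    }
    return false;
}
```

For `{1,3,5,7,9,11,13}` the layout comes out as `{_,7,3,11,1,5,9,13}`: the median at the root, each level filling left to right.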

Sincerely,
Dan
Apr 23 2007
parent Lars Ivar Igesund <larsivar igesund.net> writes:
Dan wrote:
 
 That said, if I didn't have to subject myself to that again, I would love
 to someday actually learn something.  Until an institution offers
 learning, rather than edumacation, I'll stick with reinventing the wheel
 and asking if someone's discovered that yet.
I'm uncertain which technical knowledge I truly gained during my studies and also find useful in my work. There were a couple of reasons for this, though, not least the post-dotcom job market, which led me to work quite far from my specialization. I _do_ think that the diversity of the studies helped when going for different jobs, and I know that the team and social skills I learned are invaluable where I am now. -- Lars Ivar Igesund blog at http://larsivi.net DSource, #d.tango & #D: larsivi Dancing the Tango
Apr 23 2007
prev sibling parent reply Jascha Wetzel <"[firstname]" mainia.de> writes:
 You really don't need a CS degree to do most IT jobs
100% agreed

i'll have to stand up for the CS majors here, though... ;)
i think these (fairly typical) statements about CS majors are highly
dependent on the university. i attended exactly one lecture during the
first semester of my CS major that was supposed to teach you something
about programming. it was actually more an overview of programming
paradigms and language types. that's it. no more programming taught for
the rest of the at least 4.5 years.
therefore, arguing that you can learn programming on your own and don't
need a CS major doesn't make much sense to me, since you have to learn
it on your own even if you do have a CS major.

what i've learned instead:
- calculus (incl. numerical and differential)
- mathematical logic, knowledge representation
- designing, analysing and proving correctness of algorithms
- loads of algorithms and data structures
- linear algebra, 3d geometry (incl. curves and surfaces), rendering,
lighting simulation
- image analysis, compression, etc. (incl. fourier analysis, btw)
- some machine vision, pattern recognition
- language theory, compiler construction and optimization
- processor and operating system concepts
- analog and digital electronics basics

i don't regret having spent time with any of this.
what i regret is not having taken classes in cryptography. i hope i'll
find the time to make up for that...

Bill Baxter wrote:
 Walter Bright wrote:
 Chris Nicholson-Sauls wrote:
 I tend to tell people that all forms of art seemingly arise from some
 form of science. Programming just happens to be an artform still
 closely linked to its base science.  And our own Walter -- if I
 recall right -- is a prime example of a major developer whose
 background is in something else.  I'm pretty sure those airplanes
 didn't require new compilers.
My training is as a mechanical engineer, with an emphasis on jet engines. I was fortunate enough to attend a university (Caltech) that thoroughly believed that all their sci/eng majors should be well grounded in a broad range of fields, and as I've gotten older and wiser I see the value in it now. Caltech requires of all its graduates:
o 3 years of calculus
o 2 years physics
o 1 year chemistry
among other courses.
If all you know is CS, then I think you're restricting the kind of work you can do. It's not too tough to figure out how to be a competent programmer coming from a hard science or engineering discipline. But going the other way is pretty much impossible. My tack was to take a lot of CS courses, because they were fun and relatively easy, but go with EE as the major. It was much more difficult, but I'm glad I did it that way. The decent grounding in calculus, linear algebra, Fourier analysis etc that I got from that has allowed me to do things I never would have been able to consider had I just gotten the CS education. I've heard that CS departments at schools these days are suffering from a big drop in the number of majors. But that seems to me to be as it should be. The IT boom brought on a lot of silliness. You really don't need a CS degree to do most IT jobs. Yes, *everybody* needs to know how work with computers these days to varying degrees. Just like everyone needs math to varying degrees. But that doesn't mean there need to be a lot of math majors, or CS majors. Almost everyone takes a class or two from the math department, but very few major in it. Likewise, pretty much everyone these days should have a class or two from the CS dept, but we don't really need that many majors. --bb
Apr 23 2007
parent reply Dan <murpsoft hotmail.com> writes:
Jascha Wetzel <[firstname] mainia.de> Wrote:

 You really don't need a CS degree to do most IT jobs
100% agreed
I agree too. I'm not saying "anyone can do it"; I'm saying that a degree doesn't provide as much as it ought to in this particular field. I found my experience almost debilitating, and I think there are a few reasons why:

1) By the time you're done with 4th year, the exact implementation you were taught starting in 2nd year is already obsolete.

2) Professors were taught by their professors, who were taught by their professors. The teacher has typically never been in the industry, and doesn't really understand programming beyond trivial examples in a theoretical capacity. They're also typically still stuck in the same paradigms and with the same tools as their professor's professor (the 70's).

3) First-year professors are really there to study for a master's, not to teach. They typically suck at teaching, not being able to frame the paradigm with the right analogies, but merely having acceptable technical knowledge.
 
 i'll have to stand up for the CS majors here, though... ;)
 i think, these (fairly typical) statements about CS majors are highly
 dependent on the university. i attended exactly one lecture during 
For anyone interested, don't go to the University of Calgary for CPSC.
 the
 first semester of my CS major that was supposed to teach you something
 about programming. it was actually more an overview of programming
 paradigms and languages types. that's it. no more programming taught for
 the rest of the at least 4.5 years.
 therefore, argumenting that you can learn programming on your own and
 don't need to have a CS major doesn't make much sense to me, since you
 have to learn it on your own even if you do have a CS major.
 
 what i've learned instead:
 - calculus (incl. numerical and differential)
 - mathematical logic, knowledge representation
 - designing, analysing and proving correctness of algorithms
 - loads of algorithms and data structures
 - linear algebra, 3d geometry (incl. curves and surfaces), rendering,
 lighting simulation
 - image analysis, compression, etc. (incl. fourier analysis, btw)
 - some machine vision, pattern recognition
 - language theory, compiler construction and optimization
 - processor and operating system concepts
 - analog and digital electronics basics
 
 i don't regret having spent time with any of this.
 what i regret not having taken classes in is cryptography. i hope i'll
 find the time to make up for that...
I wanted to fill the gaps in my skillset, such as learning x86-64 assembler, networking and cryptography, and advanced data structures and algorithms (geodesics, graph theory, neural net algorithms, min-max trees, etc.). No dice, but I've since learned a couple of those on my own. I also wanted a piece of paper. No dice.
 Chris Nicholson-Sauls wrote:
 I tend to tell people that all forms of art seemingly arise from some
 form of science. Programming just happens to be an artform still
It really is a trivial science, if you can see through it. We've built such an abstraction over it that it's hard to see that, underneath it all, a kindergartener is learning to use the same concepts as a first-year university student.
 closely linked to its base science.  And our own Walter -- if I
 recall right -- is a prime example of a major developer whose
 background is in something else.  I'm pretty sure those airplanes
 didn't require new compilers.
My training is as a mechanical engineer, with an emphasis on jet engines. I was fortunate enough to attend a university (Caltech) that thoroughly believed that all their sci/eng majors should be well grounded in a broad range of fields, and as I've gotten older and wiser I see the value in it now. Caltech requires of all its graduates: o 3 years of calculus o 2 years physics o 1 year chemistry among other courses.
If all you know is CS, then I think you're restricting the kind of work you can do. It's not too tough to figure out how to be a competent programmer coming from a hard science or engineering discipline. But going the other way is pretty much impossible. My tack was to take a lot of CS courses, because they were fun and relatively easy, but go with EE as the major. It was much more difficult, but I'm glad I did it that way. The decent grounding in calculus, linear algebra, Fourier analysis etc that I got from that has allowed me to do things I never would have been able to consider had I just gotten the CS education.
Good stuff. We can't just "program", we need to have a field in which we are good at programming. For me, I'm particularly good with regular expressions and data mining, and I like refactoring code.
 
 I've heard that CS departments at schools these days are suffering from
 a big drop in the number of majors.  But that seems to me to be as it
 should be.  The IT boom brought on a lot of silliness.  You really don't
 need a CS degree to do most IT jobs.  Yes, *everybody* needs to know how to
 work with computers these days to varying degrees.  Just like everyone
 needs math to varying degrees.  But that doesn't mean there need to be a
 lot of math majors, or CS majors.   Almost everyone takes a class or two
 from the math department, but very few major in it.  Likewise, pretty
 much everyone these days should have a class or two from the CS dept,
 but we don't really need that many majors.
 
 --bb
We actually do need those few people who are exceptional at CS. Desperately. A genius in CS who implements, say, BitTorrent or Linux or google.com can change how the world works and save us all billions and billions of dollars.

The unfortunate fact is that unless you're in the top 1% at it, you really shouldn't call yourself a programmer; or rather, a distinction should be drawn between being able to hack a computer into doing something and actually understanding the machine and breathing it.

Sincerely, Dan
Apr 23 2007
parent reply Don Clugston <dac nospam.com.au> writes:
Dan wrote:
 Jascha Wetzel <[firstname] mainia.de> Wrote:
 
 You really don't need a CS degree to do most IT jobs
100% agreed
I agree too. I'm not saying "anyone can do it"; I'm saying that a degree doesn't provide as much as it ought in this particular field. I found my experience almost debilitating and I think there are a few reasons why: 1) By the time you're done 4th year, the exactly implementation you were taught starting 2nd year is already obsolete. 2) Professors were taught by their professors were taught by their professors. The teacher typically has never been in the industry, and doesn't really understand programming beyond trivial examples on a theoretical capacity. They're also typically still stuck in the same paradigms and with the same tools as their professor's professor (the 70's). 3) First year professors are really there to study a master's, not to teach. They typically suck at teaching, not being able to frame the paradigm with the right analogies, but merely having acceptable technical knowledge.
 i'll have to stand up for the CS majors here, though... ;)
 i think, these (fairly typical) statements about CS majors are highly
 dependent on the university. i attended exactly one lecture during 
For anyone interested, don't go to University of Calgary for CPSC.
And definitely do not go to the University of Sydney for CS. In the early 90's, they were still asking exam questions about I/O port addresses for teletype machines. Unbelievable. I came outright first in a course called "System Structures" for which I did not attend a single lecture. I've no idea what it was about. Seriously, I reckon I learn more CS every month just browsing the net than I did in two years of CS. Fortunately my degree was in physics.

I think "Computer Science" is one of the all-time classic misnomers. There's hardly any science in CS. (Virtually no experimentation, for example.)
Apr 23 2007
parent reply Bill Baxter <dnewsgroup billbaxter.com> writes:
Don Clugston wrote:
 
 I think "Computer Science" is one of the all-time classic misnomers. 
 There's hardly any science in CS. (Virtually no experimentation, for 
 example).
Prof. Kenneth Yip used to start his intro to CS lectures out by writing "Computer Science" on the blackboard in giant letters. Then he'd start the lecture by saying the name is kind of odd for the subject because it's really not a science. And he'd put a big "X" over the science part. Next he'd say, "and what's more it's not actually about computers either". Another big "X". --bb
Apr 23 2007
parent reply Alexander Panek <alexander.panek brainsware.org> writes:
On Tue, 24 Apr 2007 13:28:43 +0900
Bill Baxter <dnewsgroup billbaxter.com> wrote:

 Don Clugston wrote:
 
 I think "Computer Science" is one of the all-time classic
 misnomers. There's hardly any science in CS. (Virtually no
 experimentation, for example).
Prof. Kenneth Yip used to start his intro to CS lectures out by writing "Computer Science" on the blackboard in giant letters. Then he'd start the lecture by saying the name is kind of odd for the subject because it's really not a science. And he'd put a big "X" over the science part. Next he'd say, "and what's more it's not actually about computers either". Another big "X".
Heh.. I kept telling my colleagues at school something similar (I'm always the one helping those who are "not capable" of "programming" in my class..): "It's not about the programming language.. it's just a problem and a solution!"
Apr 23 2007
parent reply Jascha Wetzel <"[firstname]" mainia.de> writes:
the german name of this subject is more appropriate.
"informatik" suggests the science of information.

just because there are no experiments doesn't mean it's not a science,
though. mathematics is usually considered a science although it has no
experiments either. that's because both aren't natural sciences, where
there is a given real-world complex that we try to understand by
sampling it with experiments.

in mathematics as well as "informatics" we try to find models of the way
we think. therefore all the experiments take place in our heads.
"informatics" additionally is an engineering science, trying to
build/program machines that mimic the way we think in those models we
have found. in many cases we find that the way we think isn't the most
effective, so we change the models to fit the machines we've already
built, which makes us drift a little more in the engineering direction.

Alexander Panek wrote:
 On Tue, 24 Apr 2007 13:28:43 +0900
 Bill Baxter <dnewsgroup billbaxter.com> wrote:
 
 Don Clugston wrote:
 I think "Computer Science" is one of the all-time classic
 misnomers. There's hardly any science in CS. (Virtually no
 experimentation, for example).
Prof. Kenneth Yip used to start his intro to CS lectures out by writing "Computer Science" on the blackboard in giant letters. Then he'd start the lecture by saying the name is kind of odd for the subject because it's really not a science. And he'd put a big "X" over the science part. Next he'd say, "and what's more it's not actually about computers either". Another big "X".
Heh.. I kept telling my colleagues at school something similar, when (I'm always the one helping those who are "not capable" of "programming", in my class..) "It's not about the programming language..it's just.. a problem and a solution!"
Apr 24 2007
next sibling parent Dan <murpsoft hotmail.com> writes:
Jascha Wetzel <[firstname] mainia.de> Wrote:

 the german name of this subject is more appropriate.
 "informatik" suggests the science of information.
 
 just because there are no experiments doesn't mean it's not a science,
 though. mathematics is usually considered a science although has no
 experiments either. that's because both aren't natural sciences 
... [and the rest of his post]

Well said, sir. If you think the understanding behind computer science is trivial, then being a mathematician must be easy! All there is to it is a few variables and formulas and stuff.

That said, I actually find myself pretty decent at understanding math; I'm just highly inconsistent at doing it - which is why I have a computer. : )
Apr 24 2007
prev sibling parent reply Don Clugston <dac nospam.com.au> writes:
Jascha Wetzel wrote:
 the german name of this subject is more appropriate.
 "informatik" suggests the science of information.
It's better -- works well for most business apps, but it's a bit of a stretch for things like games. IMHO, "software engineering" is a much better term.
 just because there are no experiments doesn't mean it's not a science,
 though. mathematics is usually considered a science although has no
 experiments either. that's because both aren't natural sciences where
 there is a given real world complex that we try to understand by
 sampling it with experiments.
In science, we're always trying to answer the "why?" question. Mathematics is no exception; that's what proofs are about. (*Why* are there no integral solutions to x^n + y^n = z^n where n > 2?) But in CS, the question is almost always "how?". While doing CS, you almost never come away with more understanding about how the universe behaves. Most of the actual computer *science* is done in mathematics departments.
Apr 24 2007
parent reply Jascha Wetzel <"[firstname]" mainia.de> writes:
hm, how about...
why is problem X undecidable?
why does solving problem Y take at least O(...) time/space?
why are some problems NP and others P or aren't they?
why do these weights and activation functions make a recursive neural
network X do what it does?

"software engineering" is clearly a large subset, but you would have to
stretch the term pretty much to make most theoretical CS fit in there.

Don Clugston wrote:
 Jascha Wetzel wrote:
 the german name of this subject is more appropriate.
 "informatik" suggests the science of information.
It's better -- works well for most business apps, but it's a bit of a stretch for things like games. IMHO, "software engineering" is a much better term.
 just because there are no experiments doesn't mean it's not a science,
 though. mathematics is usually considered a science although has no
 experiments either. that's because both aren't natural sciences where
 there is a given real world complex that we try to understand by
 sampling it with experiments.
In science, we're always trying to answer the "why?" question. Mathematics is no exception; that's what proofs are about.(*Why* are there no integral solutions to x^n+y^n=z^n where n>2 ?) But in CS, the question is almost always "how?". While doing CS, you almost never come away with more understanding about how the universe behaves. Most of the actual computer *science* is done in mathematics departments.
Apr 24 2007
next sibling parent reply Dan <murpsoft hotmail.com> writes:
Jascha Wetzel <[firstname] mainia.de> Wrote:

 hm, how about...
 why is problem X undecidable?
 why does solving problem Y take at least O(...) time/space?
 why are some problems NP and others P or aren't they?
 why do these weights and activation functions make a recursive neural
 network X do what it does?
 
 "software engineering" is clearly a large subset, but you would have to
 stretch the term pretty much to make most theoretical CS fit in there.
 
 Don Clugston wrote:
 Jascha Wetzel wrote:
 the german name of this subject is more appropriate.
 "informatik" suggests the science of information.
It's better -- works well for most business apps, but it's a bit of a stretch for things like games. IMHO, "software engineering" is a much better term.
 just because there are no experiments doesn't mean it's not a science,
 though. mathematics is usually considered a science although has no
 experiments either. that's because both aren't natural sciences where
 there is a given real world complex that we try to understand by
 sampling it with experiments.
In science, we're always trying to answer the "why?" question. Mathematics is no exception; that's what proofs are about.(*Why* are there no integral solutions to x^n+y^n=z^n where n>2 ?) But in CS, the question is almost always "how?". While doing CS, you almost never come away with more understanding about how the universe behaves. Most of the actual computer *science* is done in mathematics departments.
Software engineering, IMHO, is the application of the science to solve a given problem. This is the "how" question he refers to.

Computer science, IMHO, is the study of a subset of mathematics in which we are bound to discrete sets and algorithms, and in which we have a set of typically consistent objectives, such as minimizing time and space complexity, that other branches of mathematics don't explicitly define.

On another note, you cannot understand "why" an abstraction is the way it is by examining the formula; you can merely understand "what" it is. Why can we not understand "why" a formula behaves a certain way? Because the concept of why is based on the concept of purpose, and purpose is subordinate to existence; it is defined by its users rather than being a necessary property of something that exists.
Apr 24 2007
parent 0ffh <spam frankhirsch.net> writes:
Dan wrote:
 Jascha Wetzel <[firstname] mainia.de> Wrote:
 Don Clugston wrote:
 Jascha Wetzel wrote:
 just because there are no experiments doesn't mean it's not a science,
There are experiments, but they're mostly done ad hoc and small-scale - like timing loops instead of trying to count cycles and predict cache behaviour, which is plain impossible on some architectures. Also, from a software-engineering POV, every new programming language must be viewed as an experiment. D is one. So was Java (and it still is, a bit).
 Software Engineering IMHO is the application of the science to solve a given
problem.  This is the "how" question he refers to.
 Computer Science IMHO is the study of a subset of Mathematics such that we are
bound to discrete sets and algorithms; and we have a set of typically
consistent objectives, such as minimizing time and space complexity; which
other studies in Mathematics don't explicitly define.
Well, I don't know about computer science, but "Informatik" is that plus electronics and electrical engineering aspects. This has traditional reasons: mathematicians (and people working in other math-heavy fields) and electrical engineers were the most important "early adopters" of computers. They were also the ones who built and programmed them, so their academia teamed up and founded "Informatik". Therefore there is a strong hardware aspect if you study informatics (though by now we also have lots of more specialised courses that drop these aspects). Regards, Frank
Apr 24 2007
prev sibling parent reply Don Clugston <dac nospam.com.au> writes:
Jascha Wetzel wrote:
 hm, how about...
 why is problem X undecidable?
 why does solving problem Y take at least O(...) time/space?
 why are some problems NP and others P or aren't they?
 why do these weights and activation functions make a recursive neural
 network X do what it does?
Agreed. Those are all mathematical, science questions. And the CS department at the university I went to thought that was what they should be teaching. But there isn't really very much of it, and most of it was developed by regular mathematicians (Goedel, etc.) without much connection to actual computers.
 "software engineering" is clearly a large subset, but you would have to
 stretch the term pretty much to make most theoretical CS fit in there.
Yup. But theoretical CS is pretty much a specialised branch of mathematics, which I think is not how the CS term is generally used. But we're stuck with the name now!
Apr 24 2007
parent Lars Ivar Igesund <larsivar igesund.net> writes:
Don Clugston wrote:

 Jascha Wetzel wrote:
 hm, how about...
 why is problem X undecidable?
 why does solving problem Y take at least O(...) time/space?
 why are some problems NP and others P or aren't they?
 why do these weights and activation functions make a recursive neural
 network X do what it does?
Agreed. Those are all mathematical, science questions. And the CS department at the university I went to, thought that was what they should be teaching. But there isn't really very much of it, and most of it was developed by regular mathematicians (Goedel, etc), without much connection to actual computers.
 "software engineering" is clearly a large subset, but you would have to
 stretch the term pretty much to make most theoretical CS fit in there.
Yup. But theoretical CS is pretty much a specialised branch of mathematics, which I think is not how the CS term is generally used.
The algorithmic parts of CS are mostly math, yes. There's quite a bit of software-specific stuff, though, like research into information systems and the methodologies used to build them. This is again closely related to software engineering, in which research is also performed. Then you have the hardware part of it (computers are as much hardware as software).

My impression of those doing research at my university was that they _are_ highly professional scientists, doing everything from exceptionally boring stuff to exceptionally exciting stuff. I do to some degree agree that for what we as students did, science is a misnomer, and that is not a word used in the Norwegian description of the studies either. In Norwegian terms, I'm an engineer in what we call datateknikk, meaning whatever means there are to treat data in a meaningful way. -- Lars Ivar Igesund blog at http://larsivi.net DSource, #d.tango & #D: larsivi Dancing the Tango
Apr 24 2007
prev sibling parent BCS <BCS pathlink.com> writes:
Chris Nicholson-Sauls wrote:
 Dave wrote:
 
 I tend to tell people that all forms of art seemingly arise from some 
 form of science. Programming just happens to be an artform still closely 
 linked to its base science.  And our own Walter -- if I recall right -- 
 is a prime example of a major developer whose background is in something 
 else.  I'm pretty sure those airplanes didn't require new compilers.
 
 -- Chris Nicholson-Sauls
I tend to see it the other way, or maybe more often the other way around.

My continuum of understanding:
chaos: no understanding.
magic: you know how to get a finite set of results by rote execution.
art: a skilled practitioner can get what they want some/most of the time, and can teach it to some people, but can't really say how they do it.
engineering: for most useful cases a solution can be derived from the desired result ("for a span of not more than X and a load of not more than Y, use a Z-type beam").
science: for all cases within lax bounds, an exact description of what is happening can be given.

Ironically, I think Computer "Science" is generally somewhere between magic (for the "bad" programmers) and engineering (for the stuff Boeing, NASA and the NSA do); most of the hobby stuff is art.
Apr 23 2007
prev sibling parent Lars Ivar Igesund <larsivar igesund.net> writes:
Dan wrote:
 
 So if I could demonstrate that many (if not all) of these implementations
 should really be using a struct for strictly performance reasons, for
 example, would that have any weight?
Certainly, it would definitely be an interesting exercise.
 
 I find that structs tend to be able to do a lot in D, with interfaces being
 the only remaining thing I miss.
We do have some classes that we are aware would fit better as structs, but the lack of struct constructors prevents that for now. Similarly, the lack of an interface notion for structs prevents some usages. -- Lars Ivar Igesund blog at http://larsivi.net DSource, #d.tango & #D: larsivi Dancing the Tango
Apr 18 2007
prev sibling next sibling parent reply freeagle <dalibor.free gmail.com> writes:
Dan wrote:
 I just use phobos.  It gets the job done, instead of trying to be everything
and anything the user might possibly want to interact with; like the Java
libraries.
I think there is a reason why Java's library tries to provide all the standard things that coders might need. If you have a small standard library, like Phobos or the C stdlib, then there will be several groups of people trying to code the missing pieces, and so you end up with a bazillion different libraries trying to solve the same problem. Trying to find the one that best fits your needs is a pain - like trying to find the best Linux distro. The analogy with Linux and FreeBSD fits well here: it's very much the same situation, and a majority of people will tell you that one instance of FreeBSD is a better thing than hundreds of instances of Linux. Freeagle
Apr 18 2007
parent reply Stephen Waits <steve waits.net> writes:
freeagle wrote:
 And majority of people will tell you that only 
 one instance of FreeBSD is a better thing than hundreds of instances of 
 linux
Me included. But that's OT.

The thing about a "standard library": first of all, it's "standard"; second, it's singular.

I want to (eventually) use D on the PS3. If you've got some giant standard library, then porting it to new platforms becomes burdensome. Welcome to smelly town.

In my job, we don't need tons of library support. Basic math, basic containers, basic algorithms. As a matter of fact, we (and about a billion other people) have been using C and C++ and the associated stdlibs (relatively small, other than that iostreams crap) for a long time now. Crypto in the standard library? NFW!

KISS, guys, KISS.. why not retarget Tango as an addition to Phobos? Something more kitchen-sink'ish? (ala Python)

Community or not, it's Walter's language. Luckily for us, Walter is really freakin' smart about this kind of stuff, and he'll decide, and we'll all thank him for it. --Steve
Apr 18 2007
next sibling parent reply Bill Baxter <dnewsgroup billbaxter.com> writes:
Stephen Waits wrote:
 freeagle wrote:
 And majority of people will tell you that only one instance of FreeBSD 
 is a better thing than hundreds of instances of linux
Me included. But that's OT. The things about a "standard library".. first of all, it's "standard", second, it's singular. I want to (eventually) use D on the PS3. If you've got some giant standard library, then porting it to new platforms becomes burdensome. Welcome to smelly town. In my job, we don't need tons of library support. Basic math, basic containers, basic algorithms. As a matter of fact, we (and about a billion other people) have been using C and C++ and the associated (relatively small other than that iostreams crap) stdlibs for a long time now. Crypto in the standard library? NFW!
Then don't port the crypto functions. Seems like the obvious answer. I'm sure Phobos also has plenty of stuff you won't need. Whichever you go with, just port the parts you need.
 KISS guys, KISS..  why not retarget Tango as an addition to Phobos? 
 Something more kitchen-sink'ish?  (ala Python)
 Community or not, it's Walter's language.  Luckily for us, Walter is 
 really freakin' smart about this kind of stuff, and he'll decide, and 
 we'll all thank him for it.
It's pretty clear that Walter would prefer other people to take over the job of library design so he can concentrate on what he does best: compiler design. But it seems clear also that he wants to keep Phobos around as a place where he can dump in little additions now and then, like std.metastrings. I wonder if there might be some benefits to separating phobos and tango both out into a core lib containing only the barest essentials required to run D code (e.g. object and gc and not much else), and everything else. Seems like it would be helpful in your case, at least. It would seem somehow reassuring to me to be able to clearly see what's essential and what's optional. But in the end the linker should discard whatever you don't use, so, from a practical standpoint, it's not so necessary. --bb
Apr 18 2007
next sibling parent reply Lars Ivar Igesund <larsivar igesund.net> writes:
Bill Baxter wrote:

 I wonder if there might be some benefits to separating phobos and tango
 both out into a core lib containing only the barest essentials required
 to run D code (e.g. object and gc and not much else), and everything
 else.  Seems like it would be helpful in your case, at least.  It would
 seem somehow reassuring to me to be able to clearly see what's essential
 and what's optional.  But in the end the linker should discard whatever
 you don't use, so, from a practical standpoint, it's not so necessary.
This is what Tango already does. Tango's phobos.lib is just the language essentials; the only external interface coming from there is for the GC, threading, and Object. -- Lars Ivar Igesund blog at http://larsivi.net DSource, #d.tango & #D: larsivi Dancing the Tango
Apr 18 2007
parent jcc7 <technocrat7 gmail.com> writes:
== Quote from Lars Ivar Igesund (larsivar igesund.net)'s article
 Bill Baxter wrote:
 I wonder if there might be some benefits to separating phobos and
 tango both out into a core lib containing only the barest
 essentials required to run D code (e.g. object and gc and not much
 else), and everything else.  Seems like it would be helpful in
 your case, at least.  It would seem somehow reassuring to me to be
 able to clearly see what's essential and what's optional.  But in
 the end the linker should discard whatever you don't use, so, from
 a practical standpoint, it's not so necessary.
This is what Tango already do. Tango's phobos.lib is just the language essentials. The only external interface coming from there is for the GC, threading and Object.
Okay, but we can't mix and match the "core" of Tango with the "core" of Phobos, since Phobos hasn't been broken apart (DMD's phobos.lib contains the language essentials and the fun extra stuff, too). Can we just use the Tango source with the standard phobos.lib? I wouldn't think so, since object is defined differently and parts of Tango probably depend on the GC changes that Sean has made. (Or is this what the "PhobosCompatibility" version is for?) Maybe it's time to try to convince Walter to separate Phobos into at least two parts, to make it easier for us to plug in our own GC, etc. In the meantime, I'll probably be using Tangobos (http://www.dsource.org/projects/tangobos) rather than kicking the Phobos habit cold turkey.
Apr 18 2007
prev sibling parent Sean Kelly <sean f4.ca> writes:
Bill Baxter wrote:
 
 I wonder if there might be some benefits to separating phobos and tango 
 both out into a core lib containing only the barest essentials required 
 to run D code (e.g. object and gc and not much else), and everything 
 else.  Seems like it would be helpful in your case, at least.  It would 
 seem somehow reassuring to me to be able to clearly see what's essential 
 and what's optional.  But in the end the linker should discard whatever 
 you don't use, so, from a practical standpoint, it's not so necessary.
Tango is already designed this way. The compiler runtime and garbage collector are each separate libraries with no compile-time dependencies on one another or on any standard library code. See: http://www.dsource.org/projects/tango/wiki/TopicAdvancedConfiguration Sean
Apr 18 2007
prev sibling parent reply Chad J <gamerChad _spamIsBad_gmail.com> writes:
Stephen Waits wrote:
 freeagle wrote:
 
 And majority of people will tell you that only one instance of FreeBSD 
 is a better thing than hundreds of instances of linux
Me included. But that's OT. The things about a "standard library".. first of all, it's "standard", second, it's singular. I want to (eventually) use D on the PS3. If you've got some giant standard library, then porting it to new platforms becomes burdensome. Welcome to smelly town. In my job, we don't need tons of library support. Basic math, basic containers, basic algorithms. As a matter of fact, we (and about a billion other people) have been using C and C++ and the associated (relatively small other than that iostreams crap) stdlibs for a long time now. Crypto in the standard library? NFW! KISS guys, KISS.. why not retarget Tango as an addition to Phobos? Something more kitchen-sink'ish? (ala Python) Community or not, it's Walter's language. Luckily for us, Walter is really freakin' smart about this kind of stuff, and he'll decide, and we'll all thank him for it. --Steve
Size of library and porting difficulty are pretty much unrelated if the library is written well, and I do believe Tango is written well. The porting difficulty comes when OS- or machine-dependent features are used (system function calls, asm use), since all of those have to be replaced. When you start to talk about a large do-everything library, a couple of categories of code become very common:

- Things like crypto that need only one dependency: a Turing-complete language. (Such code can be asm-optimized, but intrinsics and abstraction kill that issue.)
- Things that sit on top of OS abstractions. Once you write a File class that handles file-level I/O, you no longer need any OS-dependent functions to handle file use. Port one and you port them all.

Now, I remember reading that Tango is interested in doing some media library stuff (SDL-kinda stuff: low-level graphics, sound and all), and that might be trouble to port. Of course, I doubt much of the other stuff will depend on it, so just don't port that part of Tango if you are strapped for time. If they decide to handle GUI at some point, it might be the same deal (especially for app GUIs where native look-and-feel matters).

I've talked with kris some about the idea of porting Tango to arm-wince-pe, a popular PDA platform. Realize I have done this with Phobos, so I have some experience in the matter. I liked what I heard. It is very layered: get the bottom layers and the rest falls into place. Minimal OS-function dependence, no asm deps - dude, that is /nice/.
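The layering Chad describes can be sketched like this, in C++ for illustration (D would look much the same). All names here are invented, not Tango's; C stdio stands in for the real OS calls. The point is that only the small shim needs rewriting on a new platform:

```cpp
#include <cassert>
#include <cstdio>
#include <string>

// The only platform-dependent surface. Porting to a new OS means
// reimplementing just these three functions.
namespace oslayer {
    using Handle = std::FILE*;
    Handle open_read(const std::string& path) { return std::fopen(path.c_str(), "rb"); }
    size_t read(Handle h, char* buf, size_t n) { return std::fread(buf, 1, n, h); }
    void   close(Handle h)                     { if (h) std::fclose(h); }
}

// Portable code: depends only on oslayer, never on the OS directly,
// so it ports for free once the shim exists.
std::string read_all(const std::string& path) {
    std::string out;
    oslayer::Handle h = oslayer::open_read(path);
    if (!h) return out;
    char buf[4096];
    size_t n;
    while ((n = oslayer::read(h, buf, sizeof buf)) > 0)
        out.append(buf, n);
    oslayer::close(h);
    return out;
}
```

Everything like File, buffered readers, and so on builds on the portable layer; "port one and you port them all."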
Apr 18 2007
parent reply Stephen Waits <steve waits.net> writes:
Chad J wrote:
 
 I've talked with kris some about the idea of porting Tango to 
 arm-wince-pe, a popular PDA platform.  Realize I have done this with 
 phobos, so I have some experience in the matter.  I liked what I heard. 
  It is very layered.  Get the bottom layers and the rest falls into 
 place.  Minimal OS function dependance, no ASM deps - dude that is /nice/.
Ok. Now that does sound nice. Having not looked inside Tango, I appreciate your experience and insight. However, my first problem is getting a compiler to emit PS3 code. :) --Steve
Apr 19 2007
parent reply Sean Kelly <sean f4.ca> writes:
Stephen Waits wrote:
 
 However, my first problem is getting a compiler to emit PS3 code.  :)
Does GCC target Cell yet? Sean
Apr 19 2007
parent Stephen Waits <steve waits.net> writes:
Sean Kelly wrote:
 
 Does GCC target Cell yet?
Yah. The official toolchain is based on gcc and there's one in the wild too, used on PS3 Linux. --Steve
Apr 19 2007
prev sibling parent Oskar Linde <oskar.lindeREM OVEgmail.com> writes:
Dan wrote:

 Likewise I fear for Tango, as it's got that OO gleam in it's eye and is
implementing classes for crypto.. I mean.. crypto!?  What's next, a math class?
 *shudders*
Please propose a better way to implement stateful algorithms, such as block-based ciphers, cryptographic hashes, or compression algorithms, in a way that doesn't require all the source data to be present in memory at once and that supports working on data streams. /Oskar
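The shape Oskar is defending is the classic init/update/final interface: the object carries the state, so a stream can be fed chunk by chunk. A minimal sketch in C++ (D syntax would be near-identical), using the simple Adler-32 checksum as a stand-in for a real cipher or hash; the names are illustrative, not Tango's:

```cpp
#include <cassert>
#include <cstdint>
#include <cstddef>

// Stateful streaming checksum: feed data in as many update() calls as
// you like, then read the digest. No need to hold the whole input.
struct Adler32 {
    uint32_t a = 1, b = 0;

    void update(const uint8_t* buf, size_t len) {
        for (size_t i = 0; i < len; ++i) {
            a = (a + buf[i]) % 65521;  // 65521: largest prime below 2^16
            b = (b + a) % 65521;
        }
    }
    uint32_t digest() const { return (b << 16) | a; }
};
```

Feeding "Wiki" then "pedia" yields the same digest as feeding "Wikipedia" in one call, which is exactly the property a stream-oriented design buys you.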
Apr 19 2007
prev sibling next sibling parent Robert Fraser <fraserofthenight gmail.com> writes:
It's not about what can be done in theory, or if you have enough time; it's about
what can be done quickly by (probably slightly less intelligent/educated) programmers.
If you ran a company, would you want to pay your employees $30 an hour
to reimplement functionality provided in a standard library? What about
debugging it?

Dan Wrote:

 Brad Anderson Wrote:
 Oh to be a fly on the wall when the current lib doesn't have a function you
 need...
I can write assembler. I can write D. I can write ECMAScript. If there's something I need that isn't there, I write it. I personally think there's something wrong with someone who claims to be a programmer but *can't* solve a trivial puzzle. Not knowing what address to write to for direct screenbuffer access is forgivable; look it up. Not being able to implement Huffman compression, or a quicksort, or a binary search, or a jump gate after knowing what you need to achieve... well, that means you lost that gleam in your eye you had back in kindergarten. - Dan
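For the record, the kind of "trivial puzzle" Dan means, a plain iterative binary search, fits in a dozen lines. A sketch in C++ for illustration (nobody's library code):

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

// Returns the index of key in a sorted vector, or -1 if absent.
int binary_search_index(const std::vector<int>& v, int key) {
    size_t lo = 0, hi = v.size();
    while (lo < hi) {
        size_t mid = lo + (hi - lo) / 2;  // avoids overflow of lo + hi
        if (v[mid] < key)       lo = mid + 1;
        else if (key < v[mid])  hi = mid;
        else return static_cast<int>(mid);
    }
    return -1;  // not present
}
```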
Apr 18 2007
prev sibling next sibling parent kmk <kmk200us yahoo.com> writes:
I can't really say whether Phobos or Tango is better, but I do think there
should be one standard library, and that standard library should be
maintained by the community. Walter has done a good job with
Phobos, but I think it's too much to have one person maintain both the reference
compiler and the entire standard library.


Davidl Wrote:

 Personally, I'm not familiar with tango, but the following is based on the  
 thought of
 1. one man's effort vs. a team's effort
 2. growing D code need only 1 base standard library.
 
 I don't think it's funny to switch from phobos to tango or switch tango  
 back to phobos.
 
 I think standard library should be provided by D community, I appreciate  
 Walter gave us phobos.
 We needed this babysitter. But now D community is growing bigger & bigger.  
 I'm wondering if Walter
 can put as much effort as he used to put on Phobos to compete with Tango.   
 And endless arguing of
 Phobos vs. Tango is somewhat meaningless & annoying.
 Once tangobos out, I hope standard DMD package would be released by tango  
 team & Walter, with Tango
 being the base default library. Users can use tangobos for legacy code.
 
 
Apr 18 2007
prev sibling parent reply Dan <murpsoft hotmail.com> writes:
janderson Wrote:
 The problem is that every code monkey ends up re-writing the same code. 
      That means a lot of time wasted focusing on writing solutions to 
 problems that have already been solved.  It also means when reading 
 someone else's code there's more to learn, rather then there being a 
 standard way of writing something (more time wasted).  
To that end, I find it takes me more time to learn an entirely new set of types, method parameters, interfaces, and a class hierarchy that follows a convention completely unlike the D language itself (plus the intricacies of the classes I'm working with, when they're opaque) than it takes me to perform a mere cast on that void[], or to merely use char[] rather than String. Furthermore
 things that have been in some public library (not necessarily standard) 
 generally receive a high level of free testing from the community, that 
 means I save more time.
Yes, I suppose it would save you a deal of time debugging if your code has obvious bugs like the two examples I provided for abs(). Library code would also be prone to gradual improvement across many applications, and therefore be more maintainable than reinventing the wheel each time.
 My approach to coding is to try to do as much re-use as is feasible.  It 
 means I can focus on the actual problem more, not how I get there.
design++; code--; Sounds like a good objective.
 Humm, although I respect your option:  You should know that this is one 
 common interview question.
Interview question or not, my abilities revolve around making code smaller, faster, cleaner, and more robust within the environment it targets. Using library code may or may not have something to do with that, but my mentality when I code is KISS, and it's quite ingrained.

When I see a bunch of code that works, such as raw D source using language features, and I compare it with a library made out of template and class hierarchies, I'm very hesitant. When people start suggesting that we abandon the simple, working code in favor of the complex, I insist on it being justified: that being the stem of this topic. So, I tried poking holes. : )

As for abs(): the IEEE 754 standard is applied across most if not all platforms, and proposes much the same for 32-bit, 64-bit, 43-bit, 79/80-bit, doubles, floats, and all. The sign bit is the high bit. Replicate the sign bit across a register to form a mask, xor it with the original value, and subtract the mask: that gives you the absolute value of every input but one. Try int.min; then try it with your other solutions...

Sincerely, Dan
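The sign-bit trick Dan is alluding to, and its int.min edge case, look like this in C++ (D would be nearly identical). A sketch, not anyone's library code; note that abs(int.min) is not representable in 32 bits, so the trick hands int.min straight back:

```cpp
#include <cassert>
#include <cstdint>

// Branchless absolute value via the sign bit.
// x >> 31 is an arithmetic shift on essentially every platform,
// giving an all-ones mask for negative x and zero otherwise.
int32_t branchless_abs(int32_t x) {
    uint32_t u    = static_cast<uint32_t>(x);
    uint32_t mask = static_cast<uint32_t>(x >> 31);
    // (u ^ mask) - mask is a two's-complement negate when x is negative,
    // done in unsigned arithmetic to sidestep signed-overflow issues.
    return static_cast<int32_t>((u ^ mask) - mask);
}
```

This is Dan's point: branchless_abs(INT32_MIN) quietly returns INT32_MIN, the one input with no positive counterpart, and a naive `x < 0 ? -x : x` fails the same way.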
Apr 19 2007
parent reply Sean Kelly <sean f4.ca> writes:
Dan wrote:
 When I see a bunch of code that works, such as raw D
  source with language features, and I compare it with a
 library made out of template and class hierarchies, I'm
 very hesitant.
Some points of analysis worth considering are average lines of code, number of parameters, cyclomatic complexity, etc. One can't infer much from the simple presence of templates or a class hierarchy. Heck, the presence of templates can suggest more robust code than average, because they tend to prevent code duplication and force a focus on algorithm design rather than type-specific optimizations (which can be a source of bugs). Similar things could be said for classes. Sean
Apr 19 2007
parent "David B. Held" <dheld codelogicconsulting.com> writes:
Sean Kelly wrote:
 [...]
 Some points of analysis worth considering are average lines of code, 
 number of parameters, cyclometric complexity, etc.  One can't infer much 
 from the simple presence of templates or a class hierarchy.  Heck, the 
 presence of templates can suggest more robust code than average, because 
 they tend to prevent code duplication and force a focus on algorithm 
 design rather than type-specific optimizations (which can be a source of 
 bugs).  Similar things could be said for classes.
 [...]
Personally, I find that libraries that *don't* use templates are primitive, because they probably aren't very configurable. You're stuck with what the library designer chose for you. I think in the future, this is just going to become more and more pronounced. All of the most interesting libraries I can think of are extremely generic. I would be interested to hear about counter-examples.

As far as reinventing wheels vs. code reuse, I think it is instructive to look at other engineering disciplines. Which engineers are praised for designing custom bolts and I-beams and steel pipes and wires? The ones who get the recognition are the ones who take commodity items for granted, who choose the best available, and focus on the things that matter, which are the high-level problems. That is, the best engineers are the ones that can abstract away the trivialities and introduce something compelling and new using off-the-shelf parts. Given that architect-level engineers usually need a decent variety of parts to choose from, I think it makes perfect sense that one would have alternatives in libraries.

There is also the aspect of healthy competition and alternative tastes. Obviously, Phobos appeals to C-philes like Dan, while Tango appeals to Java-philes like... well, like all the Java-philes out there. ;) I think making them interchangeable is the most worthwhile course of action.

Rewriting basic libraries is a good exercise for the learning programmer. I reinvented plenty of good wheels in my day, and had a lot of fun doing it. But when you get old and tired, you lose the zeal for wheel-making, and you start to build cars, using parts that other people made.

Dave
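The configurability David points at is easy to see in miniature: one generic function covers every ordered type, where a non-template library would have picked the type for you. A hypothetical C++ sketch (templates work much the same way in D), not drawn from any library:

```cpp
#include <cassert>
#include <string>

// One definition, usable for ints, strings, or any type with operator<.
// A non-generic library would ship a fixed-type version and leave you stuck.
template <typename T>
T clamp_to(const T& v, const T& lo, const T& hi) {
    return v < lo ? lo : (hi < v ? hi : v);
}
```

The same source works for `clamp_to(5, 0, 3)` and for `clamp_to(std::string("m"), std::string("a"), std::string("z"))`; the library designer never had to anticipate either use.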
Apr 19 2007