
digitalmars.D - A proper language comparison...

"Xinok" <xinok live.com> writes:
Once in a while, a thread pops up in the newsgroups pitting D 
against some other language. More often than not, these 
comparisons are flawed, non-encompassing, and uninformative. Most 
recently with the article comparing D with Go and Rust, the 
community pointed out a few flaws involving a late addition of 
one of the D compilers, build configurations (-noboundscheck?), 
and the random number generator used.

Then when I think about how web browsers are compared, there are 
conventional measures and standard benchmarking tools (e.g. 
sunspider). They measure performance for javascript, rendering, 
HTML5, etc. They also measure startup times (hot/cold boot), 
memory usage, etc. Finally, there are feature comparisons, such 
as what HTML5 features each browser supports.

These are the type of comparisons I'd like to see with 
programming languages. For starters, there should be standard 
"challenges" (algorithms and such) implemented in each language 
designed to measure various aspects of the language, such as 
sorting, number crunching, and string processing. However, rather 
than leave it to a single individual to implement the algorithm 
in several different languages, it should be left to the 
community to collaborate and produce an "ideal" implementation of 
the algorithm in their language. We could analyze factors other 
than performance, such as the ease of implementation (how many 
lines? does it use safe/unsafe features? Was it optimized using 
unsafe / difficult features?).


What can we do about it? I propose we come together as a 
community, design challenges that are actually relevant and 
informative, and release the first implementations in D. Then we 
let the battle commence and invite other communities to 
contribute their own implementations in other languages. I think 
we should give it a try; start off small with just a few moderate 
challenges (not too simple or complex) and see where it goes from 
there.
Jul 25 2013
"Brad Anderson" <eco gnuk.net> writes:
On Thursday, 25 July 2013 at 18:23:19 UTC, Xinok wrote:
 [...]

 These are the type of comparisons I'd like to see with 
 programming languages. For starters, there should be standard 
 "challenges" (algorithms and such) implemented in each language 
 designed to measure various aspects of the language, such as 
 sorting, number crunching, and string processing. However, 
 rather than leave it to a single individual to implement the 
 algorithm in several different languages, it should be left to 
 the community to collaborate and produce an "ideal" 
 implementation of the algorithm in their language. We could 
 analyze factors other than performance, such as the ease of 
 implementation (how many lines? does it use safe/unsafe 
 features? Was it optimized using unsafe / difficult features?).


 What can we do about it? I propose we come together as a 
 community, design challenges that are actually relevant and 
 informative, and release the first implementations in D. Then 
 we let the battle commence and invite other communities to 
 contribute their own implementations in other languages. I 
 think we should give it a try; start off small with just a few 
 moderate challenges (not too simple or complex) and see where 
 it goes from there.

Sounds somewhat like Rosetta Code: http://rosettacode.org/wiki/Rosetta_Code

Bearophile spends a lot of time adding D entries there.
Jul 25 2013
"qznc" <qznc web.de> writes:
On Thursday, 25 July 2013 at 18:23:19 UTC, Xinok wrote:
 [...]

 These are the type of comparisons I'd like to see with 
 programming languages. For starters, there should be standard 
 "challenges" (algorithms and such) implemented in each language 
 designed to measure various aspects of the language, such as 
 sorting, number crunching, and string processing. However, 
 rather than leave it to a single individual to implement the 
 algorithm in several different languages, it should be left to 
 the community to collaborate and produce an "ideal" 
 implementation of the algorithm in their language. We could 
 analyze factors other than performance, such as the ease of 
 implementation (how many lines? does it use safe/unsafe 
 features? Was it optimized using unsafe / difficult features?).

Sounds very much like this: http://benchmarksgame.alioth.debian.org/

You can compare code size, memory use, and execution time for various programs across lots of languages. Safety is not considered, though; how would you measure that?

It is called a "game" because you can adapt the weights until your favorite language is the winner. ;)

D entries were provided, but were removed at some point because the code looked too much like the C code.
Jul 25 2013
Manfred Nowak <svv1999 hotmail.com> writes:
"qznc" <qznc web.de> wrote in news:buxislkfauizlnrvoyzv forum.dlang.org:

 until your favorite language is the winner.

This is the wrong path. The winner should be found by weighting the results in such a way that all tested languages come out at least close to equal. If such equality can be achieved by some canonical weighting, then the languages are all equal with respect to the test battery, because they all fall into the same equivalence class. Only if there is more than one equivalence class can an extreme participant be singled out, which might be the winner---or the loser as well.

-manfred
Jul 25 2013
"Peter Alexander" <peter.alexander.au gmail.com> writes:
On Thursday, 25 July 2013 at 18:23:19 UTC, Xinok wrote:
 These are the type of comparisons I'd like to see with 
 programming languages. For starters, there should be standard 
 "challenges" (algorithms and such) implemented in each language 
 designed to measure various aspects of the language, such as 
 sorting, number crunching, and string processing. However, 
 rather than leave it to a single individual to implement the 
 algorithm in several different languages, it should be left to 
 the community to collaborate and produce an "ideal" 
 implementation of the algorithm in their language. We could 
 analyze factors other than performance, such as the ease of 
 implementation (how many lines? does it use safe/unsafe 
 features? Was it optimized using unsafe / difficult features?).

The problem is all those last bits:

- Line counts aren't a good measure of anything.
- What's safe and unsafe is very subjective.
- What's difficult is even more subjective.

There are also other variables:

- How idiomatic is the code?
- How well does it scale?
- How much headroom is there for more optimisation?
- How predictable is the performance?
- How well do you need to know the language's implementation to do optimisations?

For example, optimised Haskell will often add strict evaluation hints and strict type hints, but these are very non-idiomatic and work against the lazy nature of the language.

D, on the other hand, does quite well with all these variables: idiomatic D code is generally quite fast, and its abstractions scale quite well. Performance is predictable, and the language has plenty of features allowing you to do unsafe things for more performance if you so desire.

The problem with measuring all this stuff is that it's very subjective, so I don't think there can be any standardised way of assessing language performance.
Jul 25 2013
Walter Bright <newshound2 digitalmars.com> writes:
On 7/25/2013 1:28 PM, bearophile wrote:
 there is no significant stack overflow protection,

It's done by the hardware (putting a "no-access" page at the end of the stack). There's nothing unsafe about it.
 no variable-sized stack-allocated arrays that help a bit 
 creating bounded collections,

I don't see how that is a safety issue.
Jul 25 2013
Walter Bright <newshound2 digitalmars.com> writes:
On 7/25/2013 7:19 PM, bearophile wrote:
 If you allocate too much data on the stack this could cause stack overflow. As 
 you say a stack overflow is memory safe, but if your program is doing something 
 important, a sudden crash could be regarded as dangerous for the user. You 
 don't want a stack overflow in the code that controls your car brakes (this is 
 not a totally invented example).

If you are writing a program that, if it fails, will cause your car to crash, then you are a bad engineer and you need to report to the woodshed.

As I've written before, imagining you can write a program that cannot fail, coupled with coming up with a requirement that a program cannot fail, is BAD ENGINEERING.

ALL COMPONENTS FAIL. The way you make a system safe is to design it so that it can withstand failure, BECAUSE THE FAILURE IS GOING TO HAPPEN. I cannot emphasize this enough.
 Having variable-sized stack-allocated arrays encourages you to put more data on
 the stack, increasing the risk of stack overflows.

 On the other hand, if you only have fixed-sized stack-allocated arrays, you
 could allocate a fixed size array on the stack and then use only part of it.
 This waste of stack space increases the probability of stack overflow. A
 variable-sized stack-allocated array allows you to waste no stack space, and
 avoid those stack overallocations.

On the other hand, fixed size stack allocations are more predictable and hence a stack overflow is more likely to be detected during testing.
 If you are using a segmented stack as Rust does, stack overflows become less 
 probable (it becomes more like a malloc failure), because the stack is able to 
 become very large when needed. I think Rust needs that elastic stack because in 
 such a language it's easy to allocate all kinds of stuff on the stack (unlike 
 in D).
Segmented stacks were a great idea 20 years ago. 64 bit code has rendered the idea irrelevant - you can allocate 4 billion byte stacks for each of 4 billion threads. You've got other problems that'll happen centuries before that limit is reached.

(Segmented stacks are also a performance problem, and don't interact well with compiled C code.)
Jul 25 2013
Walter Bright <newshound2 digitalmars.com> writes:
On 7/25/2013 8:39 PM, H. S. Teoh wrote:
 How would a D program recover from stack overflow?

The program doesn't. In a safe system, there'd be a backup in case the main program failed (which it inevitably will).
 Isn't it possible to allocate the stack at the far end of the program's
 address space, so that it can grow as needed?

That works until you have more than one thread.
Jul 25 2013
Walter Bright <newshound2 digitalmars.com> writes:
On 7/26/2013 4:28 AM, Dicebot wrote:
 Also, nothing that is powered from the same power source can be considered 
 fail-safe. ;)

Yup! I had to laugh at the New Orleans practice of putting emergency generators in the basements.

Q. When do you need the emergency generators?
A. When the power grid goes down.

Q. What does the power grid power?
A. The pumps that keep the basements from flooding.

Q. When is the power grid most likely to fail?
A. When there's a storm that's likely to flood the basements.

Ya gotta just cry at the logic.

The basement of my house is designed so gravity will drain water out of it rather than requiring a sump pump. A sump pump was cheaper than the trenching required for a gravity drain, but losing everything in the basement would be far more expensive than trenching.
Jul 26 2013
Walter Bright <newshound2 digitalmars.com> writes:
On 7/26/2013 5:28 AM, bearophile wrote:
 I agree. On the other hand in important systems you usually also try to use 
 more reliable single components, like military-grade resistors able to stand 
 bigger temperature fluctuations. Safety must be pursued at all levels. That's 
 why in both automotive and aeronautics, for certain safety-critical routines, 
 they forbid recursion and require a static analysis of the max stack space the 
 subprogram will require in all possible usages, to greatly reduce the 
 probability of stack overflows.

Yes, and that's why your analysis of Rust's stack usage is inadequate in demonstrating it is safer.
 In some situations stack overflows are a security problem. Several persons have
 written programs to analyse the stack usage of Ada-SPARK programs. Ignoring the
 safety hazards caused by stack overflows, and ignoring the tools to avoid them
 in critical-purpose routines, is very bad engineering.

You can't have an undetected stack overflow if you use guard pages.
 I don't know the current situation on this, but I think they are trying to 
 solve this problem in Rust, with some workaround.

I'll add that segmented stacks are a compiler feature, not a language feature. A D compiler could support segmented stacks without changing the language, provided calling C functions still works.

But I see no point. 32 bit code is already dead on OSX, and is rapidly dying on Linux and Windows. I hear from more and more outfits that they've transitioned to 64 bits and are not looking back.
Jul 26 2013
Walter Bright <newshound2 digitalmars.com> writes:
On 7/26/2013 2:18 PM, Brad Roberts wrote:
 On 7/26/13 12:50 PM, Walter Bright wrote:
 On 7/26/2013 5:28 AM, bearophile wrote:

 In some situations stack overflows are a security problem. Several persons have
 written programs to analyse the stack usage of Ada-SPARK programs. Ignoring the
 safety hazards caused by stack overflows, and ignoring the tools to avoid them
 in critical-purpose routines, is very bad engineering.

You can't have an undetected stack overflow if you use guard pages.

If you use guard pages AND guarantee that no object exceeds the size of the guard page. Without the latter, you can only catch a subset (though a large subset).

True. I've often thought it would be reasonable to restrict object sizes on the stack.
Jul 26 2013
Walter Bright <newshound2 digitalmars.com> writes:
On 7/26/2013 2:42 PM, Walter Bright wrote:
 On 7/26/2013 2:18 PM, Brad Roberts wrote:
 On 7/26/13 12:50 PM, Walter Bright wrote:
 On 7/26/2013 5:28 AM, bearophile wrote:

 In some situations stack overflows are a security problem. Several persons have
 written programs to analyse the stack usage of Ada-SPARK programs. Ignoring the
 safety hazards caused by stack overflows, and ignoring the tools to avoid them
 in critical-purpose routines, is very bad engineering.

You can't have an undetected stack overflow if you use guard pages.

If you use guard pages AND guarantee that no object exceeds the size of the guard page. Without the latter, you can only catch a subset (though a large subset).

True. I've often thought it would be reasonable to restrict object sizes on the stack.

No, I was wrong - that's false. Stack frames larger than 4K are sequentially "probed", so they'll fault on overflow.
Jul 26 2013
Walter Bright <newshound2 digitalmars.com> writes:
On 7/26/2013 3:32 PM, Brad Roberts wrote:
 No, I was wrong. False. Stack frames larger than 4K are sequentially "probed"
 so they'll fault on overflow.

Are or could be?

Yes and yes. https://github.com/D-Programming-Language/dmd/blob/master/src/backend/cod3.c#L3050
Jul 26 2013
Walter Bright <newshound2 digitalmars.com> writes:
On 7/26/2013 11:20 PM, Brad Roberts wrote:
 Um.. unless I'm reading that maze of #if's and conditionals wrong.. that's 
 only being done in a few cases, specifically never on Linux. And either way, 
 are you asserting that all compilers do that?

No. I'm asserting that it is a compiler issue, and an easily dealt-with one, not a language issue.

As for why the backend does it for some platforms and not for others, I was merely mimicking what other compilers did for each platform. I know that Win32 requires this behavior; I was unsure about Linux. We can certainly investigate turning it on for Linux.
Jul 26 2013
Walter Bright <newshound2 digitalmars.com> writes:
On 7/27/2013 3:24 AM, Jacob Carlborg wrote:
 On Friday, 26 July 2013 at 19:50:22 UTC, Walter Bright wrote:

 But I see no point. 32 bit code is already dead on OSX, and is rapidly dying
 on Linux and Windows. I hear from more and more outfits that they've
 transitioned to 64 bits and are not looking back.

32bit is far from dead on ARM.

True - and DMD is far from an ARM back end :-)
Jul 27 2013
Walter Bright <newshound2 digitalmars.com> writes:
On 7/26/2013 10:25 AM, deadalnix wrote:
 You emphasis it quite well, and that is certainly true for a car, a plane, or
 anything potentially dangerous.

 Different tradeoffs apply when you talk about a video game, a media player or 
 an IRC client.

Of course. There is a cost of failure, though, to things like video games and media players: annoying your customers.

I've dumped many media players because of their tendency to freeze up. I like to set my music going in the morning and run it all day. Having to regularly restart it means "abandon it and try a different one."

My current media player freezes about once every couple of weeks. That's infrequent enough to be tolerable. The Ubuntu one died about once an hour; I gave up on that long ago.
Jul 26 2013
Walter Bright <newshound2 digitalmars.com> writes:
On 7/27/2013 12:40 AM, deadalnix wrote:
 This kind of software can leverage ways to recover that would be intolerable 
 in an airplane (for instance because they only work most of the time, or would 
 produce erratic behavior for a short period of time, like an audio glitch).
 
 D right now is not very friendly to such use cases, as it is designed to crash 
 hard as soon as something wrong happens.

I think you're seriously mistaken about this not being "friendly". I don't think there's anything "friendly" about a program that goes wild and keeps on running. My experience with such programs (DOS programs would not crash, they'd just run wild) is universally unfriendly.

The way to deal tolerantly with errant processes is to have an "executive" process that spawns the worker process. It monitors the worker, and if the worker crashes, the executive simply respawns it. This is a reasonably friendly way to do things.

Continuing to run already-crashed programs is a very bad idea. After all, what if your corrupted program now proceeds to corrupt all your user's profile data? I don't think the user would consider that friendly. What if your media player scrambles the playlists? (Happened to me. Oh joy, I loved that one.) No thanks.

(I've been arguing for decades against the idea that somehow crashed programs should keep on running. I keep hearing all kinds of explanations for why zombies should keep on running, even though you have no idea what they will do - except that it will be bad.)
Jul 27 2013
Walter Bright <newshound2 digitalmars.com> writes:
On 7/27/2013 3:31 AM, Jacob Carlborg wrote:
 On Friday, 26 July 2013 at 19:54:58 UTC, Walter Bright wrote:

 My current media player freezes about once every couple weeks. It's infrequent
 enough to be tolerable. The Ubuntu one dies about once an hour. I gave up on
 that long ago.

Then you should use a Mac. They're (in)famous for keeping the music playing even when the whole computer freezes.

I use a Turtlebeach Audiotron.
Jul 27 2013
Walter Bright <newshound2 digitalmars.com> writes:
On 7/27/2013 11:50 AM, Meta wrote:
 On Saturday, 27 July 2013 at 17:52:20 UTC, Walter Bright wrote:
 I use a Turtlebeach Audiotron.

That may be your problem right there. Turtle Beach seems to make high-quality products, but in reality the quality is extremely poor. I have 5 (5!) friends now, as well as myself, who bought a Turtle Beach product, and had it break within a year and a half (conveniently for Turtle Beach, the warranty only lasts a year), or even be DOA. I would never buy a Turtle Beach product again, ever.

I've been running the Audiotron for maybe 12 years now, all day every day. I like it enough that I've bought two others. I look for a replacement now and then, but none offer the Audiotron's mix of features:

1. operable from the front panel or a remote
2. provides a web interface so it can be controlled from any computer on the LAN
3. does not require a server daemon - it can simply read shared directories on the LAN for the music
4. very low power
5. plays internet radio stations too
6. has shuffle

For example, the Roku box I have can play music it sucks from the LAN. However:

1. it needs to be connected to a TV in order to control it
2. no web interface
3. can't use the Roku remote without a TV display
4. requires a server daemon
5. does not have shuffle

Thirteen years after the Audiotron, the Roku is apparently the zenith of loser technology. I mean, it doesn't have shuffle? What the hell?
Jul 27 2013
Walter Bright <newshound2 digitalmars.com> writes:
On 7/25/2013 7:19 PM, bearophile wrote:
 You don't want a stack overflow in the code that controls your car brakes 
 (this is not a totally invented example).

Sadly, it isn't:

http://www.forbes.com/sites/andygreenberg/2013/07/24/hackers-reveal-nasty-new-car-attacks-with-me-behind-the-wheel-video/

Software-controlled brakes with no override? Madness!
Jul 25 2013
Walter Bright <newshound2 digitalmars.com> writes:
On 7/26/2013 3:54 AM, Max Samukha wrote:
 Only death statistics for a sufficiently long usage period could tell whether 
 software + override is safer than purely software. Note that software + 
 override is significantly more complex, which means a decrease in the 
 reliability of the system as a whole.

Airliners make comprehensive use of overrides and redundancy. The proof that this works is how incredibly reliable they are overall, despite regular failures of individual components.

Note that before the software brake madness, there were 3 redundant braking systems on a typical car. The front and back brakes were independent - one could lose all fluid and the other would still work. A third independent system was the mechanical emergency brake. This arrangement has been in place for 50 years and works great.

Hell, even if all three failed, you could still put the car in gear and turn the ignition off. It'll slow down pretty rapidly.
Jul 26 2013
Walter Bright <newshound2 digitalmars.com> writes:
On 7/26/2013 12:30 PM, Chris Cain wrote:
 I wouldn't recommend turning the ignition off. Most cars lose power steering 
 in that situation, which can be just as bad as or worse than losing brakes.

The power steering is driven by a belt connected to the crankshaft. You won't lose power steering with the ignition off if the engine is turning. But you need to be careful not to engage the steering lock. That would be a big problem. And also, I suggest this as a last resort if your other braking systems all failed.
 Most cars (including automatics) allow you to manually switch to lower gears
 which will also slow you down.

I have little experience with automatics.
Jul 26 2013
Walter Bright <newshound2 digitalmars.com> writes:
On 7/26/2013 2:15 PM, H. S. Teoh wrote:
 I think most automatics lock the steering wheel upon power off (probably
 as some kind of safety guard, maybe against inadvertent damage by some
 parts that expect power to be running when the wheel is turned?).

It's an anti-theft feature.
 I also use manual downshifting on my car (auto transmission) to force it
 to slow down -- e.g., down a hill, when the automatic transmission will
 often blindly shift to a high gear and you'll find yourself having to
 burn up much of your brakes to keep the speed under control. My car has
 a button that locks the maximum gear to 3rd, which is useful for keeping
 within city street limits when going downhill. It also has gear
 positions to force a switch to 2nd or 1st gear, though I rarely use
 those since at lower speeds there's generally no need to bother with
 them. In an emergency situation, forcing it to 1st gear would help
 reduce the speed. (But it does take a few seconds before the auto
 transmission kicks in to effect the switch -- and a few extra seconds at
 high speed can be too long in an emergency situation.)

Although commonplace, it is poor practice to use the engine to slow the car down (unless you're dealing with brake fade from overheating).

1. Brake pads are cheap compared with engine rebuilds.
2. Using the engine as a brake can cause unburned gas to wash the oil off of the cylinder walls, resulting in excessive wear.
3. The engine is not designed to be a brake.

Use the brakes. Brake pads are not precious :-)
 I think the one time when forcing 1st gear proved useful was when I had
 to drive downhill after a heavy snowstorm -- you do *not* want to go any
 higher in that situation otherwise you could easily lose friction and
 slide down to a nasty crunch at the bottom. (Well, the general advice
 is, don't drive in such conditions in the first place -- but then guys
 like me are often rather foolhardy. :-P)

I prefer a manual trans in slippery conditions - more control.
Jul 26 2013
Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
On 7/26/13 3:52 PM, Walter Bright wrote:
 Although commonplace, it is poor practice to use the engine to slow the
 car down (unless you're dealing with brake fade from overheating).

I know next to nothing about cars so take this destruction with a grain of salt.
 1. Brake pads are cheap compared with engine rebuilds.

My understanding is that engine brake does not destroy the engine. It does not involve friction. Indeed Wikipedia agrees: http://en.wikipedia.org/wiki/Engine_braking and even mentions "Engine braking is a generally accepted practice and can help save wear on friction brakes".
 2. Using the engine as a brake can cause unburned gas to wash the oil
 off of the cylinder walls, resulting in excessive wear.

[citation needed]
 3. The engine is not designed to be a brake. Use the brakes. Brake pads
 are not precious :-)

Engine braking is a natural artifact of the engine's design. I don't think you can build an argument around "it wasn't designed to do that, so don't". Engine braking is a widespread and common technique.

I use engine braking most of the time (I always drive manual, so that's easy). It saves gas, and I've never had a mechanic tell me "you better go easy with that engine brake, look at them cylinder walls!" My brake pads reach a state of immortality.

Andrei
Jul 26 2013
Walter Bright <newshound2 digitalmars.com> writes:
On 7/26/2013 4:07 PM, Andrei Alexandrescu wrote:
 On 7/26/13 3:52 PM, Walter Bright wrote:
 Although commonplace, it is poor practice to use the engine to slow the
 car down (unless you're dealing with brake fade from overheating).

I know next to nothing about cars so take this destruction with a grain of salt.
 1. Brake pads are cheap compared with engine rebuilds.

My understanding is that engine brake does not destroy the engine. It does not involve friction.

It's news to me that engines are frictionless! (The braking effect is only partially due to engine friction - the pumping of the air is most of it. But the engine WEAR is due to friction.)
 Indeed Wikipedia agrees: http://en.wikipedia.org/wiki/Engine_braking and even 
 mentions "Engine braking is a generally accepted practice and can help save 
 wear on friction brakes".

Of course it saves wear on the brakes. The issue is do you prefer wear on your engine?
 2. Using the engine as a brake can cause unburned gas to wash the oil
 off of the cylinder walls, resulting in excessive wear.

[citation needed]

Mechanics at the dealer told me this. They had no reason to lie to me.
 3. The engine is not designed to be a brake. Use the brakes. Brake pads
 are not precious :-)

Engine brake is a natural artifact of its design. I don't think you can build an argument around "wasn't design to do that, so don't". Engine braking is a widespread and common technique.

I agree it is widespread and commonplace. That's why the mechanics felt it necessary to tell me not to do it. I was also told not to do it when I took two different courses in track driving - the Bob Bondurant and Skip Barber ones.
 I use engine braking most of the time (I always drive manual so that's easy).
 Saves gas and I've never had a mechanic tell me "you better go easy with that
 engine brake, look at them cylinder walls!" My brake pads reach a state of
 immortality.

The object isn't to save brake pads, it's to reduce the wear and tear on your engine.
Jul 26 2013
Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
On 7/26/13 4:25 PM, Walter Bright wrote:
 On 7/26/2013 4:07 PM, Andrei Alexandrescu wrote:
 On 7/26/13 3:52 PM, Walter Bright wrote:
 Although commonplace, it is poor practice to use the engine to slow the
 car down (unless you're dealing with brake fade from overheating).

I know next to nothing about cars so take this destruction with a grain of salt.
 1. Brake pads are cheap compared with engine rebuilds.

My understanding is that engine brake does not destroy the engine. It does not involve friction.

It's news to me that engines are frictionless! (The braking effect is only partially due to engine friction - the pumping of the air is most of it. But the engine WEAR is due to friction.)
 Indeed Wikipedia agrees:
 http://en.wikipedia.org/wiki/Engine_braking and even mentions "Engine
 braking is
 a generally accepted practice and can help save wear on friction brakes".

Of course it saves wear on the brakes. The issue is do you prefer wear on your engine?
 2. Using the engine as a brake can cause unburned gas to wash the oil
 off of the cylinder walls, resulting in excessive wear.

[citation needed]

Mechanics at the dealer told me this. They had no reason to lie to me.
 3. The engine is not designed to be a brake. Use the brakes. Brake pads
 are not precious :-)

Engine brake is a natural artifact of its design. I don't think you can build an argument around "wasn't design to do that, so don't". Engine braking is a widespread and common technique.

I agree it is widespread and commonplace. That's why the mechanics felt it necessary to tell me not to do it. I was also told not to do it when I took two different courses in track driving - the Bob Bondurant and Skip Barber ones.
 I use engine braking most of the time (I always drive manual so that's
 easy).
 Saves gas and I've never had a mechanic tell me "you better go easy
 with that
 engine brake, look at them cylinder walls!" My brake pads reach a
 state of
 immortality.

The object isn't to save brake pads, it's to reduce the wear and tear on your engine.

I stand by my opinion and practice, and I consider yours completely unsubstantiated, to the extent it doesn't need further rebuttal. FWIW I've heard stuff from mechanics (about e.g. how ABS works) that would make a physicist blush. Andrei
Jul 26 2013
parent Walter Bright <newshound2 digitalmars.com> writes:
On 7/26/2013 4:45 PM, Andrei Alexandrescu wrote:
 I stand by my opinion and practice, and I consider yours completely
 unsubstantiated, to the extent it doesn't need further rebuttal.

It's your car!
 FWIW I've heard
 stuff from mechanics (about e.g. how ABS works) that would make a physicist
blush.

I've heard **** from mechanics, too.
Jul 26 2013
prev sibling next sibling parent reply Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
On 7/26/13 11:23 PM, Jordi Sayol wrote:
 On 27/07/13 01:25, Walter Bright wrote:
 2. Using the engine as a brake can cause unburned gas to wash
 the oil off of the cylinder walls, resulting in excessive
 wear.

[citation needed]

Mechanics at the dealer told me this. They had no reason to lie to me.

This is absolutely true. About twenty years ago my friend's car broke down in a remote location. To bring the car to the nearest mechanic (2 or 3 kilometers), we tied it to another car with a rope and used engine braking without ignition (the engine was damaged) to prevent the spring effect. The result: pistons melted by excessive friction. This was due to the effect that Walter's mechanics clearly explained.

Thanks for this anecdote. It's at the very best circumstantial. (With the engine off, the oil pump wasn't even started!) I've asked Walter for one credible source on the entire Internet documenting the case against engine braking. He was unable to produce one. Instead, he attempted to explain how an increase in hysteresis can cause additional wear on the engine (the parts not worn under forward use). However, this is what one poster in http://goo.gl/Ys099U had to say about that: ================= Most of the time when you drive, you're putting a load (and causing wear) on what I'm going to call the "forward" face of each tooth on each gear in your drivetrain. The front of a tooth on the crankshaft pushes against the back of a tooth on the next gear in line, which pushes the next gear, etc. When you use "engine braking", all you are doing is engaging the teeth in the opposite direction, and putting force and wear on the faces that normally are just along for the ride. Now, does that mean you're wearing your engine out faster? Marginally... but the parts you're wearing out would normally have to be replaced (if at all) because they'd worn out from the other side; you're wearing surfaces that would usually be thrown out with hardly any wear at all. To borrow a phrase from the medical field, your engine/transmission will die with that wear, not of it. ================= Of course, that's just some guy on the Internet. That's why I am asking for a _credible_ source (e.g. expert mechanic, respected auto magazine etc) that explains why and how engine brake causes problems. I for one looked for a while without finding one. On the contrary, many vehicle manuals (I've seen Audi and Honda) advise using engine brake. Andrei
Jul 30 2013
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 7/30/2013 11:18 AM, Andrei Alexandrescu wrote:
 Thanks for this anecdote. It's at the very best circumstantial. (With the
engine
 off, the oil pump wasn't even started!)

The oil pump is driven by the crankshaft, so if the engine is turning, the oil pump is. (There are some highly specialized race engines with an electric oil pump, but that is highly unlikely here.) I was told by U-Haul that when towing a car long distance, you couldn't just put the manual transmission in neutral. You had to take the driveshaft out, because the transmission was designed to circulate the oil based on the front shaft turning, not the back shaft. It would seize after a while if you only turned the back shaft.
 I've asked Walter for one credible source on the entire Internet documenting
the
 case against engine braking. He was unable to produce one. Instead, he
attempted
 to explain how an increase in hysteresis can cause additional wear on the
engine
 (the parts not worn under forward use). However, this is what one poster in
 http://goo.gl/Ys099U had to say about that:

 =================
 Most of the time when you drive, you're putting a load (and causing wear) on
 what I'm going to call the "forward" face of each tooth on each gear in your
 drivetrain. The front of a tooth on the crankshaft pushes against the back of a
 tooth on the next gear in line, which pushes the next gear, etc. When you use
 "engine braking", all you are doing is engaging the teeth in the opposite
 direction, and putting force and wear on the faces that normally are just along
 for the ride.

 Now, does that mean you're wearing your engine out faster? Marginally... but
the
 parts you're wearing out would normally have to be replaced (if at all) because
 they'd worn out from the other side; you're wearing surfaces that would usually
 be thrown out with hardly any wear at all. To borrow a phrase from the medical
 field, your engine/transmission will die with that wear, not of it.
 =================

I also pointed out the "hammering" effect of alternately forward driving then back driving the rotating parts, as the parts forcefully take up the slack of hysteresis. I also pointed out the effect of unburned gas from backdriving washing oil off of the cylinder walls causing undue wear. This definitely happens with carbureted cars, but with modern fuel injection the fuel is shut off when backdriving.
Jul 30 2013
next sibling parent reply Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
On 7/30/13 11:35 AM, Walter Bright wrote:
 On 7/30/2013 11:18 AM, Andrei Alexandrescu wrote:
 Thanks for this anecdote. It's at the very best circumstantial. (With
 the engine
 off, the oil pump wasn't even started!)

The oil pump is driven by the crankshaft, so if the engine is turning, the oil pump is. (There are some highly specialized race engines with an electric oil pump, but that is highly unlikely here.) I was told by U-Haul that when towing a car long distance, you couldn't just put the manual transmission in neutral. You had to take the driveshaft out, because the transmission was designed to circulate the oil based on the front shaft turning, not the back shaft. It would seize after a while if you only turned the back shaft.

So that invalidates the anecdote.
 I also pointed out the "hammering" effect of alternately forward driving
 then back driving the rotating parts, as the parts forcefully take up
 the slack of hysteresis.

I guess any brisk adjustment of throttle would be unadvisable, one direction or another (i.e. releasing the clutch with a large difference in rotation). Back driving, however, happens as soon as one just lifts the foot off the pedal - the inertia of the car pushes on the engine.
 I also pointed out the effect of unburned gas from backdriving washing
 oil off of the cylinder walls causing undue wear. This definitely
 happens with carbureted cars, but with modern fuel injection the fuel is
 shut off when backdriving.

That's my understanding as well. With fuel injection, essentially backdriving is rolling on zero gas consumption while preserving some mechanical energy - aweee-sooome. Andrei
Jul 30 2013
next sibling parent Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
On 7/30/13 1:01 PM, H. S. Teoh wrote:
 My only regret was paying for the fuel plan (full tank of gas), because
 I underestimated the car's efficiency, when I could've just let them
 fill up half the tank at the end for a lower total price instead.

Yah, never do that. Whoever came up with that idea was a marketing genius. Must have made a bunch of extra $ to the rental companies. Andrei
Jul 30 2013
prev sibling parent Walter Bright <newshound2 digitalmars.com> writes:
On 7/30/2013 12:16 PM, Adam Wilson wrote:
 Back driving ("compression braking" in the automotive world) is indeed a
 recommended procedure in modern cars. My dad (ASE Master Tech) recommends it as a
 way to save wear on the brakes and is as you've noted, quite an efficient use
of
 energy. Heck, it's one of the first things he taught me how to do when I was
 learning how to drive.

 Toyota took it one step further and built a capability into the Prius where the
 electric driveline reverses its polarity and uses motors to slow down the car
 while simultaneously recharging the battery as the car slows down instead of
 using the brakes. It's called regenerative braking. Needless to say, we don't
do
 brakes very often on Prius'.

If the engine *is designed for it*, that's a different story entirely. The engines I work on were not designed for it.
Jul 30 2013
prev sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 7/30/2013 12:06 PM, Adam Wilson wrote:
 My dad has been an ASE Master Technician for my entire life and teaches
 Emissions Certification classes for our state. What I am about to say is based on
 stuff I've picked up from him.

 I would go one step further and point out that in modern vehicles, those made
 after the EPA catalytic converter and air quality mandates of the early 80's,
 that any oil in the combustion chamber is a Very Bad Thing. Unburned
 hydrocarbons are highly destructive to catalytic converters and oil never burns
 completely during combustion. In fact we rebuilt the engine on my 1996 Honda
 Accord in 2010 precisely because it was starting to burn oil. And indeed, a
year
 later the catalytic converter failed anyway due to the excessive strain placed
 on it by the partially burned oil that was forced through it prior to the
rebuild.

 My dad actually recommended engine braking (the correct term is "compression
 braking" btw, Thanks Dad!) as a way to reduce wear on the brakes. The google
 poster is correct in this statement that all you're doing is putting strain on
 parts that aren't used that way much, unless you reverse a lot. We see cars
 ranging from the early 80's on up, including carbureted, and we've NEVER once
 seen a car with a transmission or engine that died because of compression
 braking. Given our sample size of somewhere over 10,000 ... :-)

How would you know if excessive wear was caused by engine braking or not? Excessive wear can be caused by all kinds of things, like not letting the engine warm up before driving it hard, or running long between oil changes, shifting prematurely or too late, etc.
 The automotive industry has spent obscene amounts of money getting the absolute
 cleanest burn they can to meet CAFE standards, and the very first thing they
did
 was get the oil out of the combustion chamber. I'll also say that based on my
 dad's experience's with the Emissions class that even competent techs are
having
 a VERY difficult time understanding this stuff, the chemistry involved is Ph.D
 stuff, and now ignition systems are getting that way too. My dad has often
 lamented that working on cars is now more about understanding the computer
 control systems than it is the mechanics of it. Your average dealer tech
 probably has no clue what they are talking about since they have no reason to
 invest in learning this stuff. They don't see the car again after the warranty
 runs out and these systems rarely fail in five years. At least that's been my
 dad's experience with them.

I'll have to add that my knowledge of these things is pre-1990. So are the cars I work on :-)
Jul 30 2013
next sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 7/30/2013 4:22 PM, Adam Wilson wrote:
 Indeed, the other things you listed are quite evil on the internals of the
 engine. Particularly going too long between oil changes. But compression
braking
 isn't on the list from an engineering standpoint. The components of the
 transmission and engine are much beefier than they strictly need to be.

Eh, I'm less convinced about that. I've had two transmissions shatter going steady speed at 30 mph. I doubled the horsepower in my dodge, the first thing that needed upgrading was the transmission (replaced the whole thing). I also upgraded the springs, driveshaft, bell housing (don't want my feet cut off), flywheel & clutch, brakes, and mounts. Not to mention everything inside the engine is upgraded, such as going from a cast to a forged crank (3x stronger). I didn't upgrade the differential and rear axle. Those do tend to be beefier than necessary. If I went to more than double the power, I'd have to do things like weld extra bracing into the frame, "tub" the rear chassis, go to fat tires, put in a roll cage, etc.
 No manufacturer wants THAT recall at 5k per repair. Essentially, it's not any
 different than driving forward, you are just reversing the stress on components
 that were engineered to handle it moving forward.

It also assumes that the profile of the gears and the hardening on them is symmetric. It probably is - but I don't know that for a fact.
 And most people drive cars newer than 15 years, unlike the Crazy Leader of D
Who
 Shall Remain Nameless. ;-)

There's just something about a hotrodder doing it by reflashing the SD memory that leaves me cold :-) I just don't care for new cars. The only ones that piqued my interest are the retro Mustang and the retro Challenger. Not even the new Ferraris look interesting. I'll rent cars on trips, and I can't even recall what brand they were. Zzzzzzz. I'll just conclude with a video on why electric cars will always suck and why Detroit has never made anything worth buying since 1972: http://www.youtube.com/watch?v=PsUnBQE8jhE
Jul 30 2013
parent reply Jeff Nowakowski <jeff dilacero.org> writes:
On 07/30/2013 09:14 PM, Walter Bright wrote:
 There's just something about a hotrodder doing it by reflashing the SD
 memory that leaves me cold :-)

 I just don't care for new cars. The only ones that piqued my interest
 are the retro Mustang and the retro Challenger. Not even the new
 Ferraris look interesting. I'll rent cars on trips, and I can't even
 recall what brand they were. Zzzzzzz.

 I'll just conclude with a video on why electric cars will always suck
 and why Detroit has never made anything worth buying since 1972:

 http://www.youtube.com/watch?v=PsUnBQE8jhE

Grow up, Walter. You're not a teenager anymore. Driving a noisy, inefficient car doesn't make you cool. It makes you a pathetic man trying to recapture his youth while annoying the neighbors and making a fool of himself.
Aug 01 2013
next sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 8/1/2013 3:52 AM, Jeff Nowakowski wrote:
 Grow up, Walter. You're not a teenager anymore. Driving a noisy, inefficient
car
 doesn't make you cool. It makes you a pathetic man trying to recapture his
youth
 while annoying the neighbors and making a fool of himself.

One advantage to growing older is ceasing to care what others think about the things I enjoy. You might be amused to know that I've been driving muscle cars since I was a kid, and was the only one at college with one, the only one at any of the companies I've worked for, the only one in my neighborhood, etc. Motorheads are few and far between. The only time I run into others is at a meet or the drag strip. I'd drive my other car on dates because girls didn't like them. It's pretty rare to see one on the road. You could say those cars were never cool among my peers, who'd often just look at the car blankly, and pine for a BMW or Porsche. BTW, you don't think the Prius is a status symbol? :-)
Aug 01 2013
parent reply Jeff Nowakowski <jeff dilacero.org> writes:
On 08/01/2013 02:14 PM, Walter Bright wrote:
 BTW, you don't think the Prius is a status symbol? :-)

Nope, they're an affordable and practical car, and quite common these days. Tesla, now that's a status symbol.
Aug 01 2013
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 8/1/2013 11:37 AM, Jeff Nowakowski wrote:
 On 08/01/2013 02:14 PM, Walter Bright wrote:
 BTW, you don't think the Prius is a status symbol? :-)

Nope, they're an affordable and practical car, and quite common these days. Tesla, now that's a status symbol.

"as long as the hybrid remains a symbol of a driver’s commitment to the environment, especially among the nation’s wealthiest, the future of the Prius should be secure." http://www.forbes.com/sites/eco-nomics/2012/08/09/is-the-toyota-prius-the-latest-status-symbol-of-the-wealthy/ The Prius isn't very green, either: "When you factor in all the energy it takes to drive and build a Prius it takes almost 50% more energy than a Hummer. In a study by CNW Marketing called "Dust to Dust", researchers discovered that the Prius costs and average of $3.25 per mile driven over a lifetime of 100,000 miles (the expected lifespan of a hybrid). On the other hand the Hummer costs $1.95 per mile over an expected 300,000 miles. Which means that the Hummer will last three times as long and use less energy than the Prius." http://www.thetorquereport.com/2007/03/toyotas_prius_is_less_efficien.html It's not easy being green :-)
Aug 01 2013
next sibling parent reply Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
On 8/1/13 12:05 PM, Adam Wilson wrote:
 If we've learned anything at the shop it's that people can't be bothered
 with the facts. They seriously don't care if you have studies backing up
 the environmental damage, they believe they are green and will take
 those beliefs to their graves. Ideology is funny that way. :-)

You betcha. Related, you destroyed the myth that engine braking is any bad, but I bet money nobody changed opinions. About green driving, Prius, and Tesla - it's all about what industry you want to sustain. Everything that stands behind the Hummer as a road car is an abomination, pure and simple. Of course I'd agree plenty of Prius drivers are as snooty as it gets in a different way. Yet the reality remains that the Hummer is an evolutionary dead end, and hybrids are a stepping stone to a better future. My current car is a nice and economic Honda Fit. It is the very last internal combustion engine I'll ever own - I hope my next car will be a Tesla (regardless of what anyone thinks about it being a status symbol). Buying a dinosaur juice-based engine at this point is as much fail as buying a carriage with horses in 1915. I predict that internal combustion engines will be seen in less than a hundred years as weird inefficient contraptions, like we think of steam engines today. Also, there is a beauty about electrical engines - their theoretical efficiency is 100%, they are simple, principled, entropy-neutral, and work on conservative laws. (Batteries are more unwieldy though.) Andrei
Aug 01 2013
next sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 8/1/2013 12:39 PM, Andrei Alexandrescu wrote:
 You betcha. Related, you destroyed the myth that engine braking is any bad, but
 I bet money nobody changed opinions.

For an engine designed for it, sure. For an engine not designed for it, no. A carbureted engine is still going to have the unburned gas problem (and you're not going to be very green pumping out semi-burned hydrocarbons out the tailpipe). I don't know at what point injected systems began shutting off the fuel when backdriving.
 Also, there is a beauty about electrical engines - their theoretical efficiency
 is 100%, they are simple, principled, entropy-neutral, and work on conservative
 laws. (Batteries are more unwieldy though.)

You're right, it's all about the batteries. They're a gigantic problem that, while there are incremental improvements, is still far from a solution. But gasoline engines are also getting incremental improvements. Modern ones are way, way better than the ones from the 60's in just about every aspect. There's an inherent efficiency in gas cars in that the energy is generated on site. For electric cars, the energy is generated elsewhere (at the power plant), and then you're faced with all the losses from transmitting the energy, storing it, and recovering it. It's a tough hill to climb. Gasoline is pretty remarkable in its energy density and portability. BTW, with a manual trans, you can get quite a bit better mileage than the EPA ratings. Google "hypermiling" for ways. I do that stuff routinely.
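A rough back-of-the-envelope on the energy-density point: the figures below are ballpark textbook numbers I'm supplying, not anything from this thread, and real pack-level values vary quite a bit:

```python
# Rough specific-energy comparison of gasoline vs. lithium-ion storage.
# These are approximate reference values, not measurements from the thread.
GASOLINE_MJ_PER_KG = 46.0   # chemical energy stored in gasoline
LI_ION_MJ_PER_KG = 0.9      # optimistic lithium-ion pack figure

ratio = GASOLINE_MJ_PER_KG / LI_ION_MJ_PER_KG
print(f"Gasoline stores roughly {ratio:.0f}x more energy per kilogram")

# Electric drivetrains recover much of this gap by converting stored
# energy to motion several times more efficiently than a gas engine,
# but the raw storage disparity is why batteries remain the hard part.
```

That storage gap, minus the drivetrain-efficiency advantage, is the hill to climb that the post describes.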
Aug 01 2013
next sibling parent Walter Bright <newshound2 digitalmars.com> writes:
On 8/1/2013 1:53 PM, Adam Wilson wrote:
 You did forget to mention that you piss off everyone behind though... ;-P

I do pay attention to what's behind me when doing it. I'll hypermile much more aggressively when there's nobody behind me.
Aug 01 2013
prev sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 8/1/2013 2:30 PM, H. S. Teoh wrote:
 The disadvantage of gasoline is that in a sense we're "cheating",
 because the energy stored in it was built up over millions of years by
 ancient organisms that have long decayed, and we're only now discharging
 all that build-up. We didn't pay anything to put that energy there,
 that's why it's so economical.

Yet the electric power to charge the batteries comes from burning coal and natural gas :-) Yeah, I know, solar, wind, etc. But that's still way off in providing base power. Like I said, it ain't easy being green. It's hard to do a "dust to dust" analysis, and most of the time people simply choose to ignore costs that are hard to calculate.
Aug 01 2013
parent Walter Bright <newshound2 digitalmars.com> writes:
On 8/1/2013 11:46 PM, monarch_dodra wrote:
 That's in the US. Most of Europe is on nuclear power.

 If we set aside controversies on the dangers of plant meltdown (BTW, Fukushima
 was hit by a mag 9 *and* a tidal wave, just saying), it *is* about over 9000
 times greener.

 IMO, nuclear power is like airplanes: Spectacular when an accidents happen, but
 at the end of the day (IMO) safer: Coal miners die by the 100's when a cave in
 happens, and thousands of people die in china due to coal pollution.

I know. The problems with nukes are political, not technological. All the plant safety problems can be solved. I also don't really understand the issue with radiation that lasts 10,000 years. It, by definition, will be extremely weak radiation.
Aug 02 2013
prev sibling next sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 8/1/2013 12:39 PM, Andrei Alexandrescu wrote:
 (regardless of what anyone thinks about it being a status symbol).

Nobody admits that they select a car based on its status signals, even the people who pick anti-status symbols, as that's its own status signal! Reminds me of that old Dr. Pepper commercial: "Join the non-conformists!"
Aug 01 2013
parent Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
On 8/1/13 1:29 PM, Walter Bright wrote:
 On 8/1/2013 12:39 PM, Andrei Alexandrescu wrote:
 (regardless of what anyone thinks about it being a status symbol).

Nobody admits that they select a car based on its status signals, even the people who pick anti-status symbols, as that's its own status signal!

Good point. Andrei
Aug 01 2013
prev sibling parent =?UTF-8?B?QWxpIMOHZWhyZWxp?= <acehreli yahoo.com> writes:
On 08/01/2013 12:39 PM, Andrei Alexandrescu wrote:

 you destroyed the myth that engine braking is any bad, but I bet money
 nobody changed opinions.

You owe Adam a dollar! :) I have been engine braking since the day I started driving. Then I started following the AudiWorld forums, where I learned that "brakes were for stopping and the engine was for going." There were a lot of anecdotes told. Anyway, I am consciously readjusting to engine braking again after reading this thread. Ali [OT OT] While I have the microphone, let me rant about automatic transmissions: They are undrivable for me because they are not responsive and because they never know that the imminent road condition requires shifting down, now. Every time I have to drive an automatic I have to try to soothe myself. :)
Aug 01 2013
prev sibling parent reply Jeff Nowakowski <jeff dilacero.org> writes:
On 08/01/2013 02:56 PM, Walter Bright wrote:
 "as long as the hybrid remains a symbol of a driver’s commitment to the
 environment, especially among the nation’s wealthiest, the future of the
 Prius should be secure."

 http://www.forbes.com/sites/eco-nomics/2012/08/09/is-the-toyota-prius-the-latest-status-symbol-of-the-wealthy/

I wonder how old most of those cars are, because these days there are lots of alternative hybrids, with the Prius being one of the cheaper ones. I think this meme has an expiration date, and is already starting to taste sour.
 The Prius isn't very green, either:

 "When you factor in all the energy it takes to drive and build a Prius
 it takes almost 50% more energy than a Hummer. In a study by CNW
 Marketing called "Dust to Dust", researchers discovered that the Prius
 costs an average of $3.25 per mile driven over a lifetime of 100,000
 miles (the expected lifespan of a hybrid). On the other hand the Hummer
 costs $1.95 per mile over an expected 300,000 miles. Which means that
 the Hummer will last three times as long and use less energy than the
 Prius."

 http://www.thetorquereport.com/2007/03/toyotas_prius_is_less_efficien.html


 It's not easy being green :-)

http://google.com/search?q=prius+hummer --> http://www.thecarconnection.com/tips-article/1010861_prius-versus-hummer-exploding-the-myth "But Toyota also says that the study uses an unrealistically low estimated lifetime for hybrids, and that there's no data to support its assumptions in this. For instance, according to the study the average Prius is expected to go 109,000 miles over its lifetime, while a Hummer H1 would go 379,000 miles. CNW says about hybrids: “…these are generally secondary vehicles in a household OR they are driven in restricted or short range environments such as college campuses or retirement neighborhoods.”" So even assuming CNW is correct about the buyer and usage, if that same buyer had bought a Hummer instead it would have been driven the same miles as the Prius. There are a lot of other disputes pointed out in the article. The CNW study looks like a hit piece.
Aug 01 2013
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 8/1/2013 5:17 PM, Jeff Nowakowski wrote:
 On 08/01/2013 02:56 PM, Walter Bright wrote:
 http://www.forbes.com/sites/eco-nomics/2012/08/09/is-the-toyota-prius-the-latest-status-symbol-of-the-wealthy/

I think this meme has an expiration date, and is already starting to taste sour.

The article is a year old, not that ancient. It's far from the only one on the topic. Google "prius status symbol". Even South Park famously did an episode on it.
 "But Toyota also says that the study uses an unrealistically low estimated
 lifetime for hybrids, and that there's no data to support its assumptions in
 this. For instance, according to the study the average Prius is expected to go
 109,000 miles over its lifetime, while a Hummer H1 would go 379,000 miles. CNW
 says about hybrids: “…these are generally secondary vehicles in a
household OR
 they are driven in restricted or short range environments such as college
 campuses or retirement neighborhoods.”"

So the Prius is more cost effective because you drive it less?
 So even assuming CNW is correct about the buyer and usage, if that same buyer
 had bought a Hummer instead it would have been driven the same miles as the
 Prius. There are a lot of other disputes pointed out in the article. The CNW
 study looks like a hit piece.

The Hummer is the poster boy for polluting Americans, and the Prius the poster boy for enlightened environmental consciousness. The truth is a lot harder to get at than that. What does work is, of course, orienting your life so you drive less. Like living closer to work, combining errands into one trip, carpooling, biking, using Amazon instead of going to the mall, etc.
Aug 01 2013
parent reply Jeff Nowakowski <jeff dilacero.org> writes:
On 08/01/2013 09:24 PM, Walter Bright wrote:
The article is a year old, not that ancient. It's far from the only one on the topic. Google "prius status symbol". Even South Park famously did an episode on it.
 one on the topic. Google "prius status symbol".

That's why I wondered how old the cars were. You don't buy cars like groceries. If the same stat was true in 5-10 years I'd be surprised.
 Even South Park famously did an episode on it.

Yes, in 2006! The market has changed a lot since then, from many more hybrid models to choose from, to all-electric vehicles. If you want to flaunt wealth and eco-smugness, get a Tesla.
 So the prius is more cost effective because you drive it less?

It's about an apples-to-apples comparison. There's a lot of upfront cost to creating a car. If you drove it for a year and trashed it your environmental numbers would look horrible. Alternatively, you could put a lot more miles on the Prius if you wanted to. The Prius has taxis with over 200,000 miles on them. And that was just one point of contention. You don't seem very interested in looking critically at this report, instead going for Glenn Beck level analysis.
 The Hummer is the poster boy for polluting Americans, and the Prius
 the poster boy for enlightened environmental consciousness. The
 truth is a lot harder to get at than that.

 What does work is, of course, orienting your life so you drive less.
 Like living closer to work, combining errands into one trip,
 carpooling, biking, using Amazon instead of going to the mall, etc.

So here you are advocating to drive less, unlike the 379,000 mile Hummer. Make up your mind.
Aug 01 2013
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 8/1/2013 9:46 PM, Jeff Nowakowski wrote:
 So here you are advocating to drive less, unlike the 379,000 mile
 Hummer. Make up your mind.

We could go on for weeks back and forth with clever ripostes. I don't really care to, this is the wrong forum for that.
Aug 01 2013
parent reply Jeff Nowakowski <jeff dilacero.org> writes:
On 08/02/2013 01:30 AM, Walter Bright wrote:
 We could go on for weeks back and forth with clever ripostes. I don't
 really care to, this is the wrong forum for that.

I wasn't interested in "clever ripostes". I was interested in intellectual honesty. And while this is certainly the wrong forum, it hasn't stopped you before when discussing other off-topic stuff. Of course, you need not reply, or you could invoke your ownership privs and either delete my posts or demand I make no further ones.
Aug 02 2013
parent Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
On 8/2/13 12:23 AM, Jeff Nowakowski wrote:
 On 08/02/2013 01:30 AM, Walter Bright wrote:
 We could go on for weeks back and forth with clever ripostes. I don't
 really care to, this is the wrong forum for that.

I wasn't interested in "clever ripostes". I was interested in intellectual honesty. And while this is certainly the wrong forum, it hasn't stopped you before when discussing other off-topic stuff. Of course, you need not reply, or you could invoke your ownership privs and either delete my posts or demand I make no further ones.

Whoa, what's the matter here? Andrei
Aug 02 2013
prev sibling next sibling parent "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Thu, Aug 01, 2013 at 02:40:57PM -0700, Walter Bright wrote:
 On 8/1/2013 2:30 PM, H. S. Teoh wrote:
The disadvantage of gasoline is that in a sense we're "cheating",
because the energy stored in it was built up over millions of years
by ancient organisms that have long decayed, and we're only now
discharging all that build-up. We didn't pay anything to put that
energy there, that's why it's so economical.

Yet the electric power to charge the batteries comes from burning coal and natural gas :-) Yeah, I know, solar, wind, etc. But that's still way off in providing base power.

The ancient organisms collected all that solar energy for us in a convenient, easy-to-use, highly efficient form. We're still "cheating" if we're merely diverting some of that pre-collected energy into another form just so we can make convincing presentations about being green.
 Like I said, it ain't easy being green. It's hard to do a "dust to
 dust" analysis, and most of the time people simply choose to ignore
 costs that are hard to calculate.

The bottom line is that to be truly green, we have to spend millions of years building up reservoirs of fuel built from solar energy. The rate at which we're burning up energy in today's society is simply untenable in the long run. (Well, there's always nuclear energy, of which there is plenty to go around, but it comes with other disadvantages. :-P Some days you win, most days you lose.) T -- Beware of bugs in the above code; I have only proved it correct, not tried it. -- Donald Knuth
Aug 01 2013
prev sibling next sibling parent "John Colvin" <john.loughran.colvin gmail.com> writes:
On Thursday, 1 August 2013 at 21:52:19 UTC, H. S. Teoh wrote:
 The bottom line is that to be truly green, we have to spend 
 millions of
 years building up reservoirs of fuel built from solar energy.
 The rate
 at which we're burning up energy in today's society is simply 
 untenable
 in the long run.

I don't believe this is true. It's a technological hurdle, but I don't see any reason that it's an insurmountable one.
 (Well, there's always nuclear energy, of which there is
 plenty to go around, but it comes with other disadvantages. :-P
  Some
 days you win, most days you lose.)


 T

Surely you mean most days you win, some days you lose? Nuclear is great 99.9% of the time, then someone doesn't do their job properly and whoops.... My hopes are with fusion. Specifically ITER. My research is on diagnostics/data analysis for tokamaks, working with the guys at Culham, UK (MAST and JET). They are cautiously optimistic.
Aug 01 2013
prev sibling next sibling parent "monarch_dodra" <monarchdodra gmail.com> writes:
On Thursday, 1 August 2013 at 22:10:07 UTC, John Colvin wrote:
 My hopes are with fusion. Specifically ITER. My research is on 
 diagnostics/data analysis for tokamaks, working with the guys 
 at Culham, UK (MAST and JET). They are cautiously optimistic.

When you read about fusion energy, you *can't* not be optimistic about it. At times though, it feels like it's *such* a technological and *financial* hurdle that the question is more like "Will we have mastered this technology before gas prices get so high that a global recession will prevent us from ever financing the research?"
Aug 01 2013
prev sibling parent "John Colvin" <john.loughran.colvin gmail.com> writes:
On Friday, 2 August 2013 at 06:50:46 UTC, monarch_dodra wrote:
 On Thursday, 1 August 2013 at 22:10:07 UTC, John Colvin wrote:
 My hopes are with fusion. Specifically ITER. My research is on 
 diagnostics/data analysis for tokamaks, working with the guys 
 at Culham, UK (MAST and JET). They are cautiously optimistic.

When you read about fusion energy, you *can't* not be optimistic about it. At times though, it feels like its *such* a technological and *financial* hurdle, that the question is more like "Will we have mastered this technology before gas prices get so high a global recession will prevent us from ever financing the research?"

When I say they are cautiously optimistic, I mean that they are optimistic that ITER will be the last big pre-prototype tokamak. Next step: a prototype power plant.
Aug 02 2013
prev sibling next sibling parent "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Thu, Aug 01, 2013 at 01:17:51PM -0700, Walter Bright wrote:
[...]
 There's an inherent efficiency in gas cars in that the energy is
 generated on site. For electric cars, the energy is generated
 elsewhere (at the power plant), and then you're faced with all the
 losses from transmitting the energy, storing it, and recovering it.
 It's a tough hill to climb. Gasoline is pretty remarkable in its
 energy density and portability.

Your comparison isn't totally accurate. Gasoline stores energy in the form of chemical bonds, and batteries store energy in the form of electrical charge. Both release the energy on site. The advantage of gasoline is that the chemical bonds in gasoline are far more persistent than the electrical charge in batteries, and they are also denser in terms of energy per unit volume than batteries made with current technology. That's why gasoline is so much easier to store and transport, and why it is so efficient. The disadvantage of gasoline is that in a sense we're "cheating", because the energy stored in it was built up over millions of years by ancient organisms that have long decayed, and we're only now discharging all that build-up. We didn't pay anything to put that energy there; that's why it's so economical. If we had to live on synthetic gasoline, it'd be a totally different story (it *is* possible to synthesize this stuff, y'know, and attain the same efficiency, if not better; the problem is that this costs far too much to compete with the stuff we "stole" from ancient organisms). T -- People tell me that I'm skeptical, but I don't believe it.
Aug 01 2013
prev sibling parent "monarch_dodra" <monarchdodra gmail.com> writes:
On Thursday, 1 August 2013 at 21:40:57 UTC, Walter Bright wrote:
 On 8/1/2013 2:30 PM, H. S. Teoh wrote:
 The disadvantage of gasoline is that in a sense we're 
 "cheating",
 because the energy stored in it was built up over millions of 
 years by
 ancient organisms that have long decayed, and we're only now 
 discharging
 all that build-up. We didn't pay anything to put that energy 
 there,
 that's why it's so economical.

Yet the electric power to charge the batteries comes from burning coal and natural gas :-) Yeah, I know, solar, wind, etc. But that's still way off in providing base power. Like I said, it ain't easy being green. It's hard to do a "dust to dust" analysis, and most of the time people simply choose to ignore costs that are hard to calculate.

That's in the US. Most of Europe is on nuclear power. If we set aside controversies on the dangers of plant meltdown (BTW, Fukushima was hit by a mag 9 *and* a tidal wave, just saying), it *is* over 9000 times greener. IMO, nuclear power is like airplanes: spectacular when accidents happen, but at the end of the day safer: coal miners die by the hundreds when a cave-in happens, and thousands of people die in China due to coal pollution.
Aug 01 2013
prev sibling parent "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Wed, Jul 31, 2013 at 12:18:34PM +0200, monarch_dodra wrote:
 On Tuesday, 30 July 2013 at 21:58:04 UTC, Andrei Alexandrescu wrote:
On 7/30/13 1:01 PM, H. S. Teoh wrote:
My only regret was paying for the fuel plan (full tank of gas),
because I underestimated the car's efficiency, when I could've just
let them fill up half the tank at the end for a lower total price
instead.

Yah, never do that. Whoever came up with that idea was a marketing genius. Must have made a bunch of extra $ to the rental companies. Andrei

What exactly is the "fuel plan"? Every time I've ever rented a car, it was "here is a car with a full tank. You must return it with a full tank. (or pay for the missing fuel)".

Like Andrei said, it's a marketing genius' idea. Basically they say, pay us a full tank of gas up front, and you can bring the car in without worrying about filling up at the end. Of course, nobody would fall for it if they don't also include a discounted gas price for the full tank. The temptation then becomes trying to run the tank as low as possible before returning it in order to "get your money's worth". T -- Just because you survived after you did it, doesn't mean it wasn't stupid!
Jul 31 2013
prev sibling next sibling parent "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Tue, Jul 30, 2013 at 12:16:11PM -0700, Adam Wilson wrote:
[...]
 Toyota took it one step further and built a capability into the
Prius where the electric driveline reverses its polarity and uses
 motors to slow down the car while simultaneously recharging the
 battery as the car slows down instead of using the brakes. It's
 called regenerative braking. Needless to say, we don't do brakes
 very often on Prius'.

I got a Prius when renting a car once, and it was incredibly fuel-efficient. I had it for a week, and drove it all over the place including from LAX all the way to Irvine and back, and the tank was still half full by the time I returned the car. I found that when braking, the dashboard display shows that the battery is recharging. It even has a diagram to show you how efficient your driving is, and from experimentation, I found that gradual acceleration/braking resulted in the highest efficiency (keeps the battery bar near the top) -- probably because it was maximally postponing fuel consumption and using regenerative braking instead of the brake pads. My only regret was paying for the fuel plan (full tank of gas), because I underestimated the car's efficiency, when I could've just let them fill up half the tank at the end for a lower total price instead. T -- If you think you are too small to make a difference, try sleeping in a closed room with a mosquito. -- Jan van Steenbergen
Jul 30 2013
prev sibling parent "monarch_dodra" <monarchdodra gmail.com> writes:
On Tuesday, 30 July 2013 at 21:58:04 UTC, Andrei Alexandrescu 
wrote:
 On 7/30/13 1:01 PM, H. S. Teoh wrote:
 My only regret was paying for the fuel plan (full tank of 
 gas), because
 I underestimated the car's efficiency, when I could've just 
 let them
 fill up half the tank at the end for a lower total price 
 instead.

Yah, never do that. Whoever came up with that idea was a marketing genius. Must have made a bunch of extra $ to the rental companies. Andrei

What exactly is the "fuel plan"? Every time I've ever rented a car, it was "here is a car with a full tank. You must return it with a full tank. (or pay for the missing fuel)".
Jul 31 2013
prev sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 7/26/2013 4:10 AM, John Colvin wrote:
 On Friday, 26 July 2013 at 05:13:37 UTC, Walter Bright wrote:
 Software controlled brakes with no override? Madness!

I presume you always had some override system with anything critical at Boeing?

ALWAYS! This was hammered into us. Millions of flights without fatalities? That couldn't possibly happen without such systematic design. I am continually amazed at critical systems design in Fukushima and Deepwater Horizon that have no backups or overrides. I'd fire any engineer that came to me the second time with a critical system design he argues "can't fail" and doesn't need a backup/override.
 P.S. I don't suppose you happened to work with Bogdan Hnat there? It's a long
 shot as I think you would have left before he started.

I left Boeing around 1982. I've never heard of Bogdan Hnat, and I think I would have remembered an unusual name like that :-)
Jul 26 2013
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 7/26/2013 3:03 PM, Joseph Rushton Wakeling wrote:
 I hope I'm not being unfair, but my impression was that the very impressive
 modern safety record of air travel is at least partly down to lessons learned
 from some major historical catastrophes.

Designers make mistakes even in redundant systems - sometimes they turn out to be coupled so a failure in one causes a failure in the backup. Sometimes certain failure modes are not anticipated. But one thing they do NOT do is assume that component X cannot fail.
 The one that always springs to mind is
 the De Havilland jets breaking apart mid-flight due to metal fatigue.

Boeing's fix for that not only involved fixing the particular fatigue problem, but designing the structure so WHEN IT DOES CRACK the crack will not bring the airplane down. This design has been proven through a handful of incidents where an airliner has lost whole panels due to cracking and yet the structure remained sound.
 The number of flights and resulting near misses surely helps to battle test
 safely procedures and designs. That volume of learning opportunities can't
 readily be matched in many other industries.

The most important lesson learned from aviation accidents is that all components can and will fail, so you need layers of redundancy. The airplane is far too complicated to rely on crash investigations to identify problems. I watched a show on the Concorde the other day, and was shocked to learn that there'd been an earlier incident where a tire burst on takeoff, the tire parts had penetrated the wing fuel tank, and the fuel drained away. The industry decided to ignore fixing it - and a few years later, it happened again, but this time the leak caught fire and killed everybody.
Jul 26 2013
parent Walter Bright <newshound2 digitalmars.com> writes:
On 7/30/2013 5:16 PM, Joseph Rushton Wakeling wrote:
 I have to say, one of these days I'd really like to buy you a beer (or two, or
 three...) and have a long, long conversation about these (and other) aspects of
 aerospace engineering.  I imagine it would be fascinating. :-)

So there, Andrei!
 But I do think I'm correct in asserting that the particular disaster with the
 Comet didn't just result in learning about a new mode of failure and how to
cope
 with it, but in an awful lot of new knowledge about designing safety
procedures,
 analysing faults and crash data, and so on?

The disaster did usher in the modern era of crash investigation.
 What I mean is that I would have thought that with the number of flights taking
 place, there would be a continuous stream of data available about individual
 component failures and other problems arising in the course of flights, and
that
 tracking and analysing that data would play a major role in anticipating
 potential future issues, such as modes of failure that hadn't previously been
 anticipated.  The example you give with the concorde is exactly the sort of
 thing that one would expect _should_ have prevented the later fatal accident.

You're right in that there's a flood of service data coming back, and there's an engineering team that is constantly improving the parts based on that service data. They track every single part, where it came from, what batch it was in, who inspected it, its service history, what airplane it's on, etc.
 My point was that this volume of data isn't necessarily available in other
 engineering situations, so one might anticipate that in these areas it's more
 likely that minor failures will be overlooked rather than learned from, as they
 are rarer and possibly not numerous enough to build up enough data to make
 predictions.

I know that car companies will buy cars out of junkyards and take them apart looking for service issues, but yeah, tracking everything is too expensive for them.
 Of course, even if sufficient data was available, it wouldn't save them if the
 design (or management) culture didn't take into account the basic principles
 you've described.

Boeing (and every other airframe company) would be out of business if they didn't have that culture. What I find surprising is that other industries seem completely unaware of this methodology. They're stuck in the naive "the design requires that this part cannot fail" mindset.
Jul 30 2013
prev sibling next sibling parent "bearophile" <bearophileHUGS lycos.com> writes:
Peter Alexander:

 - What's safe and unsafe is very subjective.

There are large bodies of people that count bugs in code and correlate them with coding practices. They have created language subsets like C for the automotive industry, C++ for aviation, code for space missions, the Ada language and its successive refinements like Ada2012, and the SPARK subset of Ada. There are lots of people trying sideways solutions, at Microsoft (Spec#, liquid typing, etc.), dependent typing (the ATS language), and so on and on, even Haskell variants. A lot of this stuff is not based on statistical data, but there is also some hard data that has shaped some of those very strict coding guidelines. There are several serious studies in the field of coding safety. Dismissing all that decades-old work with a 'very subjective' is unjust. As usual, D code safety is mostly correlated to the coding style you are using, how you write your unittests and code contracts, how good your code reviews are, how careful your programmers are, etc. But the language design is also a factor. To me D safety looks about intermediate between C and Ada-SPARK. D code normally has undetected integral overflows, it doesn't help a lot against null pointers (Nullable is not so good yet), there is no significant stack overflow protection, no variable-sized stack-allocated arrays that help a bit to create bounded collections, the management of reference escaping is planned but not yet implemented (scope), and so on. Overall to me D coding seems significantly safer than C coding, and perhaps it's a little safer than C++11 coding too. I know no studies about the safety of D code compared to C++11 code or Ada2012 code, or compared to other languages. Bye, bearophile
Jul 25 2013
prev sibling next sibling parent "Peter Alexander" <peter.alexander.au gmail.com> writes:
On Thursday, 25 July 2013 at 20:28:54 UTC, bearophile wrote:
 Peter Alexander:

 - What's safe and unsafe is very subjective.

There are large bodies of people that count bugs in code and correlate them with coding practices. They have created language subsets like C for the automotive industry, C++ for aviation, code for space missions, the Ada language and its successive refinements like Ada2012, and the SPARK subset of Ada. There are lots of people trying sideways solutions, at Microsoft (Spec#, liquid typing, etc.), dependent typing (the ATS language), and so on and on, even Haskell variants. A lot of this stuff is not based on statistical data, but there is also some hard data that has shaped some of those very strict coding guidelines. There are several serious studies in the field of coding safety. Dismissing all that decades-old work with a 'very subjective' is unjust.

Allow me to put it another way, by way of analogy: health. We know from medical studies what kinds of things are healthy, and what things are unhealthy. However, if I presented 10 people and we witnessed their actions for a week, would anyone be able to accurately order them by their "healthiness"? Would every medical expert arrive at the same ordering? Maybe subjective is the wrong word to use. Maybe what I meant was "difficult to quantify".
Jul 25 2013
prev sibling next sibling parent "bearophile" <bearophileHUGS lycos.com> writes:
Walter Bright:

 It's done by the hardware (putting a "no-access" page at the 
 end of the stack). There's nothing unsafe about it.


 no variable-sized stack-allocated arrays that help a bit
 created bounded collections,

I don't see how that is a safety issue.

In my opinion where you allocate your data influences the "safety" of your program, but it's not easy for me to tell exactly in what direction such influence goes. If you allocate too much data on the stack this could cause a stack overflow. As you say a stack overflow is memory safe, but if your program is doing something important, a sudden crash could be regarded as dangerous for the user. You don't want a stack overflow in the code that controls your car brakes (this is not a totally invented example). Having variable-sized stack-allocated arrays encourages you to put more data on the stack, increasing the risk of stack overflows. On the other hand, if you only have fixed-sized stack-allocated arrays, you could allocate a fixed-size array on the stack and then use only part of it. This waste of stack space also increases the probability of stack overflow. A variable-sized stack-allocated array allows you to waste no stack space, and avoid those overallocations. If you are using a segmented stack as Rust does, stack overflows become less probable (it becomes more like a malloc failure), because the stack is able to become very large when needed. I think Rust needs that elastic stack because in such a language it's easy to allocate all kinds of stuff on the stack (unlike in D).

- - - - - - - - - -

Ada is safer compared to D in other ways. One of them is the little mess of integer division that D has inherited from C. This is how the Ada compiler forces you to write a certain division:

   procedure Strong_Typing is
      Alpha  : Integer := 1;
      Beta   : Integer := 10;
      Result : Float;
   begin
      Result := Float (Alpha) / Float (Beta);
   end Strong_Typing;

In D you could write:

   int a = 1;
   int b = 10;
   double r = a / b;

and in r you will not see the 0.1 value. This is a C design mistake that I have seen bite even programmers with more than 2 years of programming experience with C-like languages. Perhaps having "/" and "div" for floating-point and integer divisions in D could avoid those bugs. Another mistake is D inheriting the C99 semantics of %, which is suboptimal and bug-prone. (Both mistakes are fixed in Python3, by the way, though I don't fully like the Python3 division.)

- - - - - - - - - -

In Ada there is 'others' to define the value of the array items that you are not specifying; this removes the bugs discussed in Issue 3849:

   declare
      type Arr_Type is array (Integer range <>) of Integer;
      A1 : Arr_Type := (1, 2, 3, 4, 5, 6, 7, 8, 9);
   begin
      A1 := (1, 2, 3, others => 10);
   end;

'others' is usable even for struct literals, when you don't want to specify all fields:

   type R is record
      A, B : Integer := 0;
      C    : Float   := 0.0;
   end record;

   V3 : R := (C => 1.0, others => <>);

For D I suggested array syntax like:

   int[$] a1 = [1, 2, 3];
   int[10] a2 = [1, 2, 3, ...];
   void main() {}

where the "..." tells the compiler the programmer wants it to fill the missing values with their default init.

Ada concurrency is quite refined, and it's kind of safe.

Bye,
bearophile
Jul 25 2013
prev sibling next sibling parent "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Thu, Jul 25, 2013 at 07:39:15PM -0700, Walter Bright wrote:
 On 7/25/2013 7:19 PM, bearophile wrote:
If you allocate too much data on the stack this could cause stack
overflow. As you say a stack overflow is memory safe, but if your
program is doing something important, a sudden crash could be
regarded as dangerous for the user. You don't want a stack overflow
in the code that controls your car brakes (this is not a totally
invented example).


And how does stack overflow differ from heap overflow? Either way, your program can't continue normal execution (e.g., issue a command to engage all brakes).
 If you are writing a program that, if it fails will cause your car to
 crash, then you are a bad engineer and you need to report to the
 woodshed.
 
 As I've written before, imagining you can write a program that
 cannot fail, coupled with coming up with a requirement that a
 program cannot fail, is BAD ENGINEERING.
 
 ALL COMPONENTS FAIL.
 
 The way you make a system safe is design it so that it can withstand
 failure BECAUSE THE FAILURE IS GOING TO HAPPEN. I cannot emphasize
 this enough.

How would a D program recover from stack overflow? [...]
If you are using a segmented stack as Rust, stack overflows become
less probable (it becomes more like a malloc failure), because the
stack is able to become very large when needed. I think Rust needs
that elastic stack because in such language it's easy to allocate all
kind of stuff on the stack (unlike in D).

Segmented stacks were a great idea 20 years ago. 64-bit code has rendered the idea irrelevant - you can allocate 4-billion-byte stacks for each of 4 billion threads. You've got other problems that'll happen centuries before that limit is reached.

Isn't it possible to allocate the stack at the far end of the program's address space, so that it can grow as needed? Apparently Linux doesn't do that, though. Or at least, not by default. I can sorta guess why: allowing the stack to grow arbitrarily large has the possibility of a buggy program consuming all memory resources before getting terminated by running out of memory (say the program has an infinite loop allocating huge gobs of stack space per iteration). On a moderately-sized stack, the program will get killed by stack overflow before it can do much harm; a program with unconstrained stack size may well take down the system with it by eating up all memory (both RAM *and* swap) before it crashes. The system will likely grind to a halt as it starts thrashing from lack of RAM, and then overflowing the swap, and basically make a mess of everything before the OS finally kills off the buggy program (assuming that the OS can recover, of course). Not unlike a fork bomb in some ways. :-P T -- I am Ohm of Borg. Resistance is voltage over current.
Jul 25 2013
prev sibling next sibling parent "Max Samukha" <maxsamukha gmail.com> writes:
On Friday, 26 July 2013 at 05:13:37 UTC, Walter Bright wrote:
 On 7/25/2013 7:19 PM, bearophile wrote:
 You don't want a stack overflow in the code that controls your 
 car brakes (this is not a
 totally invented example).

Sadly, it isn't: http://www.forbes.com/sites/andygreenberg/2013/07/24/hackers-reveal-nasty-new-car-attacks-with-me-behind-the-wheel-video/ Software controlled brakes with no override? Madness!

Only death statistics for a sufficiently long usage period could tell whether software + override is safer than software alone. Note that software + override is significantly more complex, which means a decrease in the reliability of the system as a whole.
Jul 26 2013
prev sibling next sibling parent "John Colvin" <john.loughran.colvin gmail.com> writes:
On Friday, 26 July 2013 at 05:13:37 UTC, Walter Bright wrote:
 On 7/25/2013 7:19 PM, bearophile wrote:
 You don't want a stack overflow in the code that controls your 
 car brakes (this is not a
 totally invented example).

Sadly, it isn't: http://www.forbes.com/sites/andygreenberg/2013/07/24/hackers-reveal-nasty-new-car-attacks-with-me-behind-the-wheel-video/ Software controlled brakes with no override? Madness!

I presume you always had some override system with anything critical at Boeing? P.S. I don't suppose you happened to work with Bogdan Hnat there? It's a long shot as I think you would have left before he started.
Jul 26 2013
prev sibling next sibling parent "Dicebot" <public dicebot.lv> writes:
On Friday, 26 July 2013 at 02:39:15 UTC, Walter Bright wrote:
 ALL COMPONENTS FAIL.

 The way you make a system safe is design it so that it can 
 withstand failure BECAUSE THE FAILURE IS GOING TO HAPPEN. I 
 cannot emphasize this enough.

So very true; this is rule number one for designing high-availability systems. One must assume that any single program is always doomed to fail and focus on designing a whole system that will not. Duplication, various heartbeat watchdogs, and fast recovery times for processes are key here. Also, anything that is powered from the same power source can't be considered fail-safe. ;)
Jul 26 2013
prev sibling next sibling parent "bearophile" <bearophileHUGS lycos.com> writes:
Walter Bright:

 As I've written before, imagining you can write a program that 
 cannot fail, coupled with coming up with a requirement that a 
 program cannot fail, is BAD ENGINEERING.

 ALL COMPONENTS FAIL.

 The way you make a system safe is design it so that it can 
 withstand failure BECAUSE THE FAILURE IS GOING TO HAPPEN. I 
 cannot emphasize this enough.

I agree. On the other hand, in important systems you usually also try to use more reliable single components, like military-grade resistors able to withstand bigger temperature fluctuations. Safety must be pursued at all levels. That's why in both automotive and aeronautics, for certain safety-critical routines, they forbid recursion and require a static analysis of the max stack space the subprogram will require in all possible usages, to greatly reduce the probability of stack overflows. In some situations stack overflows are a security problem. Several people have written programs to analyse the stack usage of Ada-SPARK programs. Ignoring the safety hazards caused by stack overflows, and ignoring the tools to avoid them in critical-purpose routines, is very bad engineering.
 On the other hand, fixed size stack allocations are more 
 predictable and hence a stack overflow is more likely to be 
 detected during testing.

I agree. Here the interactions are not linear :-)
 Segmented stacks are a great idea for 20 years ago. 64 bit code 
 has rendered the idea irrelevant - you can allocate 4 billion 
 byte stacks for each of 4 billion threads. You've got other 
 problems that'll happen centuries before that limit is reached.

Rust designers should comment on this :-) I am not expert enough on this.
 (Segmented stacks are also a performance problem, and don't 
 interact well with compiled C code.)

I don't know the current situation on this, but I think they are trying to solve this problem in Rust, with some workaround. Bye, bearophile
Jul 26 2013
prev sibling next sibling parent "Chris" <wendlec tcd.ie> writes:
On Thursday, 25 July 2013 at 18:23:19 UTC, Xinok wrote:
 Once in a while, a thread pops up in the newsgroups pitting D 
 against some other language. More often than not, these 
 comparisons are flawed, non-encompassing, and uninformative. 
 Most recently with the article comparing D with Go and Rust, 
 the community pointed out a few flaws involving a late addition 
 of one of the D compilers, build configurations 
 (-noboundscheck?), and the random number generator used.

 Then when I think about how web browsers are compared, there 
 are conventional measures and standard benchmarking tools (e.g. 
 sunspider). They measure performance for javascript, rendering, 
 HTML5, etc. They also measure startup times (hot/cold boot), 
 memory usage, etc. Finally, there are feature comparisons, such 
 as what HTML5 features each browser supports.

 These are the type of comparisons I'd like to see with 
 programming languages. For starters, there should be standard 
 "challenges" (algorithms and such) implemented in each language 
 designed to measure various aspects of the language, such as 
 sorting, number crunching, and string processing. However, 
 rather than leave it to a single individual to implement the 
 algorithm in several different languages, it should be left to 
 the community to collaborate and produce an "ideal" 
 implementation of the algorithm in their language. We could 
 analyze factors other than performance, such as the ease of 
 implementation (how many lines? does it use safe/unsafe 
 features? Was it optimized using unsafe / difficult features?).


 What can we do about it? I propose we come together as a 
 community, design challenges that are actually relevant and 
 informative, and release the first implementations in D. Then 
 we let the battle commence and invite other communities to 
 contribute their own implementations in other languages. I 
 think we should give it a try; start off small with just a few 
 moderate challenges (not too simple or complex) and see where 
 it goes from there.

I have learned to be wary of comparisons like that. Any language that is sponsored or owned by a big company always "outperforms" other languages, and at the end of the day they only want to bind you to their products, no matter how "open source" they are. You can basically prove whatever you want. Most of the discussions I have had don't revolve around whether the language is good or not; they're about what people have heard/read: "Who uses it?", "I've heard ...", "Someone said ..." I once told a guy about D. He said "Ah, D, old-fashioned!" and he showed me a link that said "C# has a more modern feature ... bla bla". How ... scientific!
Jul 26 2013
prev sibling next sibling parent "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Fri, Jul 26, 2013 at 03:18:03PM +0200, Chris wrote:
[...]
 I have learned to be wary of comparisons like that. Any language
 that is sponsored or owned by a big company always "outperforms"
 other languages, and at the end of the day they only want to bind
 you to their products, no matter how "open source" they are.

+1. I'm skeptical of attempts to reduce everything down to a single number or three that can serve as a basis for numerical (or hand-waving) comparisons. As if programming languages were so simple that one could place them neatly on what is effectively a scale of 1 to 10!
 You can basically proof whatever you want. Most of the discussions I
 have had don't revolve around whether the language is good or not,
 it's about what people have heard/read, "Who uses it?", "I've heard
 ..." "Someone said ..." I once told a guy about D. He said "Ah, D,
 old-fashioned!" and he showed me a link that said "C# has a more
 modern feature ... bla bla". How ... scientific!

I get that a lot from Java fanboys. They make bold-sounding statements like "Java is the future!", "Java is the best!", "D sucks, nobody uses it!", "Java will get you a job easily!". But I've yet to hear any factual evidence to back up these claims. Well, maybe except the last one -- it's true that, given Java's popularity, it has a high chance of landing you a job. But the point is, just because it will land you a job doesn't necessarily make it a *good* language; it merely shows that it's a *popular* one. Popular doesn't imply good. T -- "I'm running Windows '98." "Yes." "My computer isn't working now." "Yes, you already said that." -- User-Friendly
Jul 26 2013
"Chris" <wendlec tcd.ie> writes:
On Friday, 26 July 2013 at 14:05:12 UTC, H. S. Teoh wrote:
 On Fri, Jul 26, 2013 at 03:18:03PM +0200, Chris wrote:
 [...]
 I have learned to be wary of comparisons like that. Any language that is sponsored or owned by a big company always "outperforms" other languages, and at the end of the day they only want to bind you to their products, no matter how "open source" they are.

+1. I'm skeptical of attempts to reduce everything down to a single number or three that can serve as a basis for numerical (or hand-waving) comparisons. As if programming languages were so simple that one could place them neatly on what is effectively a scale of 1 to 10!
 You can basically prove whatever you want. Most of the discussions I have had don't revolve around whether the language is good or not; it's about what people have heard or read: "Who uses it?", "I've heard ...", "Someone said ..." I once told a guy about D. He said "Ah, D, old-fashioned!" and showed me a link that said "C# has a more modern feature ... bla bla". How ... scientific!

I get that a lot from Java fanboys. They make bold-sounding statements like "Java is the future!", "Java is the best!", "D sucks, nobody uses it!", "Java will get you a job easily!". But I've yet to hear any factual evidence to back up these claims. Well, maybe except the last one -- it's true that, given Java's popularity, it has a high chance of landing you a job. But the point is, just because it will land you a job doesn't necessarily make it a *good* language; it merely shows that it's a *popular* one. Popular doesn't imply good. T

Yep. And I think that someone who knows D or any other language that is not mainstream is seriously into programming. If you know Java or Python, what does that mean? That you are a good programmer? If you know how to program, you can learn any language you want. The question is usually not "I wonder if I can write the program"; the question is usually "I know what I have to do. But how do I do it in D, C, Java ...?" It's the how, not the if.
Jul 26 2013
"Chris" <wendlec tcd.ie> writes:
On Friday, 26 July 2013 at 14:17:45 UTC, Chris wrote:
 On Friday, 26 July 2013 at 14:05:12 UTC, H. S. Teoh wrote:
 On Fri, Jul 26, 2013 at 03:18:03PM +0200, Chris wrote:
 [...]
 I have learned to be wary of comparisons like that. Any language that is sponsored or owned by a big company always "outperforms" other languages, and at the end of the day they only want to bind you to their products, no matter how "open source" they are.

+1. I'm skeptical of attempts to reduce everything down to a single number or three that can serve as a basis for numerical (or hand-waving) comparisons. As if programming languages were so simple that one could place them neatly on what is effectively a scale of 1 to 10!
 You can basically prove whatever you want. Most of the discussions I have had don't revolve around whether the language is good or not; it's about what people have heard or read: "Who uses it?", "I've heard ...", "Someone said ..." I once told a guy about D. He said "Ah, D, old-fashioned!" and showed me a link that said "C# has a more modern feature ... bla bla". How ... scientific!

I get that a lot from Java fanboys. They make bold-sounding statements like "Java is the future!", "Java is the best!", "D sucks, nobody uses it!", "Java will get you a job easily!". But I've yet to hear any factual evidence to back up these claims. Well, maybe except the last one -- it's true that, given Java's popularity, it has a high chance of landing you a job. But the point is, just because it will land you a job doesn't necessarily make it a *good* language; it merely shows that it's a *popular* one. Popular doesn't imply good. T

Yep. And I think that someone who knows D or any other language that is not mainstream is seriously into programming. If you know Java or Python, what does that mean? That you are a good programmer? If you know how to program, you can learn any language you want. The question is usually not "I wonder if I can write the program"; the question is usually "I know what I have to do. But how do I do it in D, C, Java ...?" It's the how, not the if.

If I think about it, learning and knowing languages make you a better programmer. If you know D, you become a better programmer in general. If you learn Objective-C or C or whatever, you become a better programmer. You learn new concepts and know what works and what doesn't, rather than sheepishly following rules as if they were universal laws. OK, that's a bit off topic now.
Jul 26 2013
"deadalnix" <deadalnix gmail.com> writes:
On Friday, 26 July 2013 at 02:39:15 UTC, Walter Bright wrote:
 If you are writing a program that, if it fails will cause your 
 car to crash, then you are a bad engineer and you need to 
 report to the woodshed.

 As I've written before, imagining you can write a program that 
 cannot fail, coupled with coming up with a requirement that a 
 program cannot fail, is BAD ENGINEERING.

 ALL COMPONENTS FAIL.

 The way you make a system safe is design it so that it can 
 withstand failure BECAUSE THE FAILURE IS GOING TO HAPPEN. I 
 cannot emphasize this enough.

You emphasize it quite well, and that is certainly true for a car, a plane, or anything potentially dangerous. Different tradeoffs apply when you talk about a video game, a media player or an IRC client.
Jul 26 2013
"SomeDude" <lovelydear mailmetrash.com> writes:
On Thursday, 25 July 2013 at 20:03:52 UTC, Peter Alexander wrote:
 The problem is all those last bits:

 - Line counts aren't a good measure of anything.

That's why some people prefer to compare a gzipped version of the source code. The gzipped version gives a fairer account of the code size.
Jul 26 2013
"H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Fri, Jul 26, 2013 at 08:21:45PM +0200, SomeDude wrote:
 On Thursday, 25 July 2013 at 20:03:52 UTC, Peter Alexander wrote:
The problem is all those last bits:

- Line counts aren't a good measure of anything.

That's why some people prefer to compare a gzipped version of the source code. The gzipped version gives a more fair account of the code size.

That's not true. Comments, esp. extensive DDoc comments, can carry a lot of information that isn't part of the code itself. That will contribute to the gzipped size. At the very least, you'd have to strip out comments before compressing to get an accurate idea of "how much code" there is.

The idea of using compression to quantify the amount of code is a clever one, though. Perhaps one way to improve it might be to have the compiler serialize the AST of the completely-parsed code, then we compress that serialized AST.

Still, it's hard to get from this (or any other) measurement of code size to the "expressiveness" of the language. An inexperienced coder might write rather verbosely where an expert coder would write in just a few concise lines; the quantity of code would differ in each case, so without knowing the level of mastery the coder has over the language, it's still difficult to quantify expressivity. Furthermore, one could deliberately code in such a way as to maximize (or minimize) whatever chosen measure is used to evaluate the language, but that by no means reflects the *typical* usage of the language.

And even then, we have to deal with the problem space: not all languages are best at addressing all classes of programming challenges. It would be unfair, for example, to judge Java on the basis of how well one could write an OS in it, since the language just isn't designed for that sort of thing. You'd be running into all sorts of roadblocks everywhere that aren't present when you write other applications that Java is more suitable for.

T -- Never step over a puddle, always step around it. Chances are that whatever made it is still dripping.
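The strip-comments-then-compress idea can be sketched like this (Python for illustration; the deliberately naive `//`-comment stripper stands in for a real lexer, so comment markers inside string literals would be mishandled):

```python
import gzip
import re


def code_size(source: str) -> int:
    """Gzipped byte count of source with line comments and blank lines removed.

    Naive on purpose: a regex strips only //-style comments; a real tool
    would use the language's own lexer so that comment markers inside
    string literals are not mangled.
    """
    kept = []
    for line in source.splitlines():
        line = re.sub(r"//.*", "", line).rstrip()
        if line:  # drop lines that were pure comment or whitespace
            kept.append(line)
    return len(gzip.compress("\n".join(kept).encode("utf-8")))


sample = (
    "int x = 1; // set x\n"
    "// A long DDoc-style comment that inflates the gzipped size\n"
    "int y = 2;\n"
)
full = len(gzip.compress(sample.encode("utf-8")))
print(code_size(sample), "<", full)
```

On this toy input the comment-stripped size comes out smaller, which is the whole objection: heavily documented code looks "bigger" under plain gzip even when the code itself is tiny.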
Jul 26 2013
"Tofu Ninja" <emmons0 purdue.edu> writes:
On Friday, 26 July 2013 at 18:46:06 UTC, H. S. Teoh wrote:
 ...for example, to judge Java on the basis of how well one 
 could write an OS in it

http://jos.sourceforge.net/
Jul 26 2013
"Chris Cain" <clcain uncg.edu> writes:
On Friday, 26 July 2013 at 19:24:44 UTC, Walter Bright wrote:
 Hell, even if all three failed, you could still put the car in 
 gear and turn the ignition off. It'll slow down pretty rapidly.

I wouldn't recommend turning the ignition off. Most cars lose power steering in that situation, which can be just as bad as or worse than losing the brakes. Most cars (including automatics) let you manually shift to lower gears, which will also slow you down.
Jul 26 2013
"H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Fri, Jul 26, 2013 at 08:58:27PM +0200, Tofu Ninja wrote:
 On Friday, 26 July 2013 at 18:46:06 UTC, H. S. Teoh wrote:
...for example, to judge Java on the basis of how well one could
write an OS in it

http://jos.sourceforge.net/

Wait... that's an OS that runs in a JVM? Wouldn't it need another OS to act as host? My mind is boggled... T -- Always remember that you are unique. Just like everybody else. -- despair.com
Jul 26 2013
"H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Fri, Jul 26, 2013 at 01:58:33PM -0700, Walter Bright wrote:
 On 7/26/2013 12:30 PM, Chris Cain wrote:
I wouldn't recommend turning the ignition off. Most cars lose power steering in that situation, which can be just as bad as or worse than losing brakes.

The power steering is driven by a belt connected to the crankshaft. You won't lose power steering with the ignition off if the engine is turning. But you need to be careful not to engage the steering lock. That would be a big problem. And also, I suggest this as a last resort if your other braking systems all failed.
Most cars (including automatics) allow you to manually switch to
lower gears which will also slow you down.

I have little experience with automatics.

I think most automatics lock the steering wheel upon power off (probably as some kind of safety guard, maybe against inadvertent damage by some parts that expect power to be running when the wheel is turned?).

I also use manual downshifting on my car (auto transmission) to force it to slow down -- e.g., down a hill, when the automatic transmission will often blindly shift to a high gear and you'll find yourself having to burn up much of your brakes to keep the speed under control. My car has a button that locks the maximum gear to 3rd, which is useful for keeping within city street limits when going downhill. It also has gear positions to force a switch to 2nd or 1st gear, though I rarely use those since at lower speeds there's generally no need to bother with them. In an emergency situation, forcing it to 1st gear would help reduce the speed. (But it does take a few seconds before the auto transmission kicks in to effect the switch -- and a few extra seconds at high speed can be too long in an emergency situation.)

I think the one time when forcing 1st gear proved useful was when I had to drive downhill after a heavy snowstorm -- you do *not* want to go any higher in that situation otherwise you could easily lose friction and slide down to a nasty crunch at the bottom. (Well, the general advice is, don't drive in such conditions in the first place -- but then guys like me are often rather foolhardy. :-P)

T -- 2+2=4. 2*2=4. 2^2=4. Therefore, +, *, and ^ are the same operation.
Jul 26 2013
Brad Roberts <braddr puremagic.com> writes:
On 7/26/13 12:50 PM, Walter Bright wrote:
 On 7/26/2013 5:28 AM, bearophile wrote:

 In some situations stack overflows are a security problem. Several persons have
 written programs to analyse the stack usage of Ada-SPARK programs. Ignoring the
 safety hazards caused by stack overflows, and ignoring the tools to avoid them
 in critical-purpose routines, is very bad engineering.

You can't have an undetected stack overflow if you use guard pages.

If you use guard pages AND guarantee that no object exceeds the size of the guard page. Without the latter, you can only catch a subset (though a large subset).
Jul 26 2013
"bearophile" <bearophileHUGS lycos.com> writes:
Walter Bright:

 Yes, and that's why your analysis of Rust's stack usage is 
 inadequate in demonstrating it is safer.

There I was not talking about Rust, but about more constrained systems (where maybe Rust will someday run, but it will need some changes). In desktop PCs there is usually plenty of RAM to grow a stack safely for a long time. So I think Go and Rust programs have a low risk of stack overflows if you run them on normal PCs.
 You can't have an undetected stack overflow if you use guard 
 pages.

They don't care much about this. For those high-integrity systems they prevent stack overflows from happening in the first place, sizing the stack with a careful static analysis, because for those software usages a stack overflow is dangerous :-) Their main problem is not detecting it, it's avoiding it. And for other systems a stack overflow can be a nuisance/problem.
 I'll add that segmented stacks are a compiler feature, not a 
 language feature. A D compiler could support segmented stacks 
 without changing the language, provided calling C functions 
 still works.

I see.
 But I see no point.

I have never asked for a segmented stack in D. But both the Go and Rust developers are smart people, running code mostly on 64-bit systems, and both have designed their languages in recent years with segmented stacks. So perhaps you could go read their motivations. My guess is that Rust programs can allocate a lot of stuff on the stack, and just like a heap, a larger input causes a larger stack to be used. Having an extensible stack probably keeps the stack from being exhausted too easily on normal PCs. But parallelism could be another cause (that you already answered). Bye, bearophile
Jul 26 2013
"H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Fri, Jul 26, 2013 at 12:54:57PM -0700, Walter Bright wrote:
 On 7/26/2013 10:25 AM, deadalnix wrote:
You emphasize it quite well, and that is certainly true for a car, a plane, or anything potentially dangerous.

Different tradeoffs apply when you talk about a video game, a media player or an IRC client.

Of course. There is a cost of failure, though, to things like video games and media players. Annoying your customers. I've dumped using many media players because of their tendency to freeze up. I like to set my music on in the morning and run it all day. Having to regularly restart it means "abandon it and try a different one."

Yeah, I had that experience with my old iPod. There was no (obvious) way to control what's running, so every once in a while, some wayward app would start consuming all system resources and the music would start to skip and eventually the thing would freeze. It was very annoying, especially since there was no way to tell *what* was causing the problem. I stopped using the iPod as a music player (sigh...). On my new Android phone, things are slightly better -- at least there's a task manager that I can use to kill off misbehaving programs or apps known to exhibit erratic behaviour, before I set up the music to play all day. That way, things are less likely to fail.
 My current media player freezes about once every couple weeks. It's
 infrequent enough to be tolerable. The Ubuntu one dies about once an
 hour. I gave up on that long ago.

That sounds pretty bad. I think my HTC phone only ever froze once since late last year when I first got it. Every couple weeks sounds almost as bad as my iPod (which I no longer use). T -- Computers are like a jungle: they have monitor lizards, rams, mice, c-moss, binary trees... and bugs.
Jul 26 2013
"Joseph Rushton Wakeling" <joseph.wakeling webdrake.net> writes:
On Friday, 26 July 2013 at 19:38:06 UTC, Walter Bright wrote:
 I am continually amazed at critical systems design in Fukushima 
 and Deep Water Horizon that have no backups or overrides.

I hope I'm not being unfair, but my impression was that the very impressive modern safety record of air travel is at least partly down to lessons learned from some major historical catastrophes. The one that always springs to mind is the De Havilland Comet jets breaking apart mid-flight due to metal fatigue. The sheer number of flights and resulting near misses surely helps to battle-test safety procedures and designs. That volume of learning opportunities can't readily be matched in many other industries. That's not to defend the examples you cite -- the holes in safety provision in both of them were pretty shocking.
Jul 26 2013
Brad Roberts <braddr puremagic.com> writes:
On 7/26/13 2:43 PM, Walter Bright wrote:
 On 7/26/2013 2:42 PM, Walter Bright wrote:
 On 7/26/2013 2:18 PM, Brad Roberts wrote:
 On 7/26/13 12:50 PM, Walter Bright wrote:
 On 7/26/2013 5:28 AM, bearophile wrote:

 In some situations stack overflows are a security problem. Several persons have
 written programs to analyse the stack usage of Ada-SPARK programs. Ignoring the
 safety hazards caused by stack overflows, and ignoring the tools to avoid them
 in critical-purpose routines, is very bad engineering.

You can't have an undetected stack overflow if you use guard pages.

If you use guard pages AND guarantee that no object exceeds the size of the guard page. Without the latter, you can only catch a subset (though a large subset).

True. I've often thought it would be reasonable to restrict object sizes on the stack.

No, I was wrong. False. Stack frames larger than 4K are sequentially "probed" so they'll fault on overflow.

Are or could be?
Jul 26 2013
Brad Roberts <braddr puremagic.com> writes:
On 7/26/13 5:38 PM, Walter Bright wrote:
 On 7/26/2013 3:32 PM, Brad Roberts wrote:
 No, I was wrong. False. Stack frames larger than 4K are sequentially "probed"
 so they'll fault on overflow.

Are or could be?

Yes and yes. https://github.com/D-Programming-Language/dmd/blob/master/src/backend/cod3.c#L3050

Um... unless I'm reading that maze of #if's and conditionals wrong, that's only being done in a few cases, and specifically never on Linux. And either way, are you asserting that all compilers do that?
Jul 26 2013
Jordi Sayol <g.sayol yahoo.es> writes:
On 27/07/13 01:25, Walter Bright wrote:
 On 7/26/2013 4:07 PM, Andrei Alexandrescu wrote:
 On 7/26/13 3:52 PM, Walter Bright wrote:
 Although commonplace, it is poor practice to use the engine to slow the
 car down (unless you're dealing with brake fade from overheating).

I know next to nothing about cars so take this destruction with a grain of salt.
 1. Brake pads are cheap compared with engine rebuilds.

My understanding is that engine brake does not destroy the engine. It does not involve friction.

It's news to me that engines are frictionless! (The braking effect is only partially due to engine friction - the pumping of the air is most of it. But the engine WEAR is due to friction.)
 Indeed Wikipedia agrees: http://en.wikipedia.org/wiki/Engine_braking and even mentions "Engine braking is a generally accepted practice and can help save wear on friction brakes".

Of course it saves wear on the brakes. The issue is do you prefer wear on your engine?
 2. Using the engine as a brake can cause unburned gas to wash the oil
 off of the cylinder walls, resulting in excessive wear.

[citation needed]

Mechanics at the dealer told me this. They had no reason to lie to me.

This is absolutely true. About twenty years ago my friend's car broke down in a remote location. To bring the car to the nearest mechanic (2 or 3 kilometers), we tied it to another car with a rope and used engine braking without ignition (the engine was damaged) to prevent the spring effect. The result: pistons melted by excessive friction. This was due to the effect that Walter's mechanics clearly explained.
 
 
 3. The engine is not designed to be a brake. Use the brakes. Brake pads
 are not precious :-)

Engine brake is a natural artifact of its design. I don't think you can build an argument around "wasn't design to do that, so don't". Engine braking is a widespread and common technique.

I agree it is widespread and commonplace. That's why the mechanics felt it necessary to tell me not to do it. I was also told not to do it when I took two different courses in track driving - the Bob Bondurant and Skip Barber ones.
 I use engine braking most of the time (I always drive manual so that's easy).
 Saves gas and I've never had a mechanic tell me "you better go easy with that
 engine brake, look at them cylinder walls!" My brake pads reach a state of
 immortality.

The object isn't to save brake pads, it's to reduce the wear and tear on your engine.

-- Jordi Sayol
Jul 26 2013
"deadalnix" <deadalnix gmail.com> writes:
On Friday, 26 July 2013 at 19:54:58 UTC, Walter Bright wrote:
 On 7/26/2013 10:25 AM, deadalnix wrote:
 You emphasize it quite well, and that is certainly true for a car, a plane, or anything potentially dangerous.

 Different tradeoffs apply when you talk about a video game, a media player or an IRC client.

Of course. There is a cost of failure, though, to things like video games and media players. Annoying your customers. I've dumped using many media players because of their tendency to freeze up. I like to set my music on in the morning and run it all day. Having to regularly restart it means "abandon it and try a different one."

This kind of software can leverage ways to recover that would be intolerable in an airplane (for instance because they only work most of the time, or would produce erratic behavior for a short period, like an audio glitch). D right now is not very friendly to such use cases, as it is designed to crash hard as soon as something goes wrong.
 My current media player freezes about once every couple weeks. 
 It's infrequent enough to be tolerable. The Ubuntu one dies 
 about once an hour. I gave up on that long ago.

On Linux, I use Audacious for music; it never crashes.
Jul 27 2013
"Jacob Carlborg" <doob me.com> writes:
On Friday, 26 July 2013 at 19:50:22 UTC, Walter Bright wrote:

 But I see no point. 32 bit code is already dead on OSX, and is 
 rapidly dying on Linux and Windows. I hear from more and more 
 outfits that they've transitioned to 64 bits and are not 
 looking back.

32-bit is far from dead on ARM. -- /Jacob Carlborg
Jul 27 2013
"Jacob Carlborg" <doob me.com> writes:
On Friday, 26 July 2013 at 19:54:58 UTC, Walter Bright wrote:

 My current media player freezes about once every couple weeks. 
 It's infrequent enough to be tolerable. The Ubuntu one dies 
 about once an hour. I gave up on that long ago.

Then you should use a Mac. They're (in)famous for when the whole computer freezes the music keeps playing. -- /Jacob Carlborg
Jul 27 2013
"John Colvin" <john.loughran.colvin gmail.com> writes:
On Saturday, 27 July 2013 at 10:31:10 UTC, Jacob Carlborg wrote:
 On Friday, 26 July 2013 at 19:54:58 UTC, Walter Bright wrote:

 My current media player freezes about once every couple weeks. 
 It's infrequent enough to be tolerable. The Ubuntu one dies 
 about once an hour. I gave up on that long ago.

Then you should use a Mac. They're (in)famous for when the whole computer freezes the music keeps playing. -- /Jacob Carlborg

Or, in one memorable case for me, freezing and entering some sort of feedback loop, ending with my sub shaking things off my desk.
Jul 27 2013
"Meta" <jared771 gmail.com> writes:
On Saturday, 27 July 2013 at 17:52:20 UTC, Walter Bright wrote:
 I use a Turtlebeach Audiotron.

That may be your problem right there. Turtle Beach seems to make high-quality products, but in reality the quality is extremely poor. I have 5 (5!) friends now, as well as myself, who bought a Turtle Beach product, and had it break within a year and a half (conveniently for Turtle Beach, the warranty only lasts a year), or even be DOA. I would never buy a Turtle Beach product again, ever.
Jul 27 2013
"Adam Wilson" <flyboynw gmail.com> writes:
On Tue, 30 Jul 2013 11:35:08 -0700, Walter Bright  
<newshound2 digitalmars.com> wrote:

 On 7/30/2013 11:18 AM, Andrei Alexandrescu wrote:
 Thanks for this anecdote. It's at the very best circumstantial. (With the engine off, the oil pump wasn't even started!)

The oil pump is driven by the crankshaft, so if the engine is turning, the oil pump is. (There are some highly specialized race engines with an electric oil pump, but that is highly unlikely here.) I was told by U-Haul that when towing a car long distance, you couldn't just put the manual transmission in neutral. You had to take the driveshaft out, because the transmission was designed to circulate the oil based on the front shaft turning, not the back shaft. It would seize after a while if you only turned the back shaft.

That depends entirely on your specific car and how you want to tow it. Four-down towing is the preferred method, and since all four wheels are touching the ground, all you need to do is make sure the transmission is self-lubricating. For example, my wife's manual 2002 Honda CR-V is ideal for towing even though it's AWD, because both the transmission and rear differential are self-lubricating. You have to change the fluid more often, every 40k instead of 120k, but that's about it. There are whole websites devoted to which cars are best for this and how to do it in the RV world. :-)
 I've asked Walter for one credible source on the entire Internet documenting the case against engine braking. He was unable to produce one. Instead, he attempted to explain how an increase in hysteresis can cause additional wear on the engine (the parts not worn under forward use). However, this is what one poster in http://goo.gl/Ys099U had to say about that:

 =================
 Most of the time when you drive, you're putting a load (and causing wear) on what I'm going to call the "forward" face of each tooth on each gear in your drivetrain. The front of a tooth on the crankshaft pushes against the back of a tooth on the next gear in line, which pushes the next gear, etc. When you use "engine braking", all you are doing is engaging the teeth in the opposite direction, and putting force and wear on the faces that normally are just along for the ride.

 Now, does that mean you're wearing your engine out faster? Marginally... but the parts you're wearing out would normally have to be replaced (if at all) because they'd worn out from the other side; you're wearing surfaces that would usually be thrown out with hardly any wear at all. To borrow a phrase from the medical field, your engine/transmission will die with that wear, not of it.
 =================

I also pointed out the "hammering" effect of alternately forward driving then back driving the rotating parts, as the parts forcefully take up the slack of hysteresis. I also pointed out the effect of unburned gas from backdriving washing oil off of the cylinder walls causing undue wear. This definitely happens with carbureted cars, but with modern fuel injection the fuel is shut off when backdriving.

My dad has been an ASE Master Technician for my entire life and teaches Emissions Certification classes for our state. What I am about to say is based on stuff I've picked up from him.

I would go one step further and point out that in modern vehicles, those made after the EPA catalytic converter and air quality mandates of the early 80's, any oil in the combustion chamber is a Very Bad Thing. Unburned hydrocarbons are highly destructive to catalytic converters, and oil never burns completely during combustion. In fact we rebuilt the engine on my 1996 Honda Accord in 2010 precisely because it was starting to burn oil. And indeed, a year later the catalytic converter failed anyway, due to the excessive strain placed on it by the partially burned oil that was forced through it prior to the rebuild.

My dad actually recommended engine braking (the correct term is "compression braking" btw, thanks Dad!) as a way to reduce wear on the brakes. The Google poster is correct in his statement that all you're doing is putting strain on parts that aren't used that way much, unless you reverse a lot. We see cars ranging from the early 80's on up, including carbureted ones, and we've NEVER once seen a car with a transmission or engine that died because of compression braking. Given our sample size of somewhere over 10,000 ... :-)

The automotive industry has spent obscene amounts of money getting the absolute cleanest burn they can to meet CAFE standards, and the very first thing they did was get the oil out of the combustion chamber. I'll also say that, based on my dad's experiences with the Emissions class, even competent techs are having a VERY difficult time understanding this stuff -- the chemistry involved is Ph.D. stuff, and now ignition systems are getting that way too. My dad has often lamented that working on cars is now more about understanding the computer control systems than the mechanics.

Your average dealer tech probably has no clue what they are talking about, since they have no reason to invest in learning this stuff. They don't see the car again after the warranty runs out, and these systems rarely fail in five years. At least that's been my dad's experience with them. -- Adam Wilson IRC: LightBender Project Coordinator The Horizon Project http://www.thehorizonproject.org/
Jul 30 2013
"Adam Wilson" <flyboynw gmail.com> writes:
On Tue, 30 Jul 2013 12:02:46 -0700, Andrei Alexandrescu  
<SeeWebsiteForEmail erdani.org> wrote:

 On 7/30/13 11:35 AM, Walter Bright wrote:
 On 7/30/2013 11:18 AM, Andrei Alexandrescu wrote:
 Thanks for this anecdote. It's at the very best circumstantial. (With the engine off, the oil pump wasn't even started!)

The oil pump is driven by the crankshaft, so if the engine is turning, the oil pump is. (There are some highly specialized race engines with an electric oil pump, but that is highly unlikely here.) I was told by U-Haul that when towing a car long distance, you couldn't just put the manual transmission in neutral. You had to take the driveshaft out, because the transmission was designed to circulate the oil based on the front shaft turning, not the back shaft. It would seize after a while if you only turned the back shaft.

So that invalidates the anecdote.
 I also pointed out the "hammering" effect of alternately forward driving
 then back driving the rotating parts, as the parts forcefully take up
 the slack of hysteresis.

I guess any brisk adjustment of throttle would be unadvisable, one direction or another (i.e. releasing the clutch with a large difference in rotation). Back driving, however, happens as soon as one just lifts the foot off the pedal - the inertia of the car pushes on the engine.
 I also pointed out the effect of unburned gas from backdriving washing
 oil off of the cylinder walls causing undue wear. This definitely
 happens with carbureted cars, but with modern fuel injection the fuel is
 shut off when backdriving.

That's my understanding as well. With fuel injection, essentially backdriving is rolling on zero gas consumption while preserving some mechanical energy - aweee-sooome. Andrei

Back driving ("compression braking" in the automotive world) is indeed a recommended procedure in modern cars. My dad (ASE Master Tech) recommends it as a way to save wear on the brakes, and as you've noted, it's quite an efficient use of energy. Heck, it's one of the first things he taught me when I was learning how to drive. Toyota took it one step further and built a capability into the Prius where the electric driveline reverses its polarity and uses the motors to slow the car, simultaneously recharging the battery, instead of using the brakes. It's called regenerative braking. Needless to say, we don't do brakes very often on Priuses. -- Adam Wilson IRC: LightBender Project Coordinator The Horizon Project http://www.thehorizonproject.org/
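[Editor's note: the regenerative-braking idea above can be put in rough numbers with back-of-the-envelope physics. The figures below (vehicle mass, speed, round-trip recovery efficiency) are illustrative assumptions, not Toyota specifications.]

```python
# Rough estimate of the energy a regenerative braking system can recover
# when slowing a car down. All figures are illustrative assumptions.

def regen_energy_kwh(mass_kg, v_initial_ms, v_final_ms, efficiency):
    """Kinetic energy released during the slowdown, times the fraction
    the motor/inverter/battery chain actually captures. Returns kWh."""
    delta_ke_joules = 0.5 * mass_kg * (v_initial_ms**2 - v_final_ms**2)
    return delta_ke_joules * efficiency / 3.6e6  # joules -> kWh

# A ~1400 kg car slowing from 100 km/h (27.8 m/s) to a stop,
# assuming ~60% round-trip recovery efficiency.
recovered = regen_energy_kwh(1400, 27.8, 0.0, 0.60)
print(f"{recovered:.3f} kWh recovered")  # prints "0.090 kWh recovered"
```

On the order of a tenth of a kilowatt-hour per full stop, which is why regenerative braking pays off mostly in stop-and-go driving, where those stops accumulate.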
Jul 30 2013
prev sibling next sibling parent Jordi Sayol <g.sayol yahoo.es> writes:
On 30/07/13 21:02, Andrei Alexandrescu wrote:
 On 7/30/13 11:35 AM, Walter Bright wrote:
 On 7/30/2013 11:18 AM, Andrei Alexandrescu wrote:
 Thanks for this anecdote. It's at the very best circumstantial. (With
 the engine
 off, the oil pump wasn't even started!)

The oil pump is driven by the crankshaft, so if the engine is turning, the oil pump is. (There are some highly specialized race engines with an electric oil pump, but that is highly unlikely here.) I was told by U-Haul that when towing a car long distance, you couldn't just put the manual transmission in neutral. You had to take the driveshaft out, because the transmission was designed to circulate the oil based on the front shaft turning, not the back shaft. It would seize after a while if you only turned the back shaft.

So that invalidates the anecdote.

I feel guilty...
 
 I also pointed out the "hammering" effect of alternately forward driving
 then back driving the rotating parts, as the parts forcefully take up
 the slack of hysteresis.

I guess any brisk adjustment of throttle would be unadvisable, one direction or another (i.e. releasing the clutch with a large difference in rotation). Back driving, however, happens as soon as one just lifts the foot off the pedal - the inertia of the car pushes on the engine.
 I also pointed out the effect of unburned gas from backdriving washing
 oil off of the cylinder walls causing undue wear. This definitely
 happens with carbureted cars, but with modern fuel injection the fuel is
 shut off when backdriving.

That's my understanding as well. With fuel injection, essentially backdriving is rolling on zero gas consumption while preserving some mechanical energy - aweee-sooome. Andrei

-- Jordi Sayol
Jul 30 2013
prev sibling next sibling parent "Adam Wilson" <flyboynw gmail.com> writes:
On Tue, 30 Jul 2013 15:40:52 -0700, Walter Bright  
<newshound2 digitalmars.com> wrote:

 On 7/30/2013 12:06 PM, Adam Wilson wrote:
 My dad has been an ASE Master Technician for my entire life and teaches
 Emissions Certification classes for our state. What I am about to say  
 is based
 stuff I've picked up from him.

 I would go one step further and point out that in modern vehicles,  
 those made
 after the EPA catalytic converter and air quality mandates of the early  
 80's,
 that any oil in the combustion chamber is a Very Bad Thing. Unburned
 hydrocarbons are highly destructive to catalytic converters and oil  
 never burns
 completely during combustion. In fact we rebuilt the engine on my 1996  
 Honda
 Accord in 2010 precisely because it was starting to burn oil. And  
 indeed, a year
 later the catalytic converter failed anyway due to the excessive strain  
 placed
 on it by the partially burned oil that was forced through it prior to  
 the rebuild.

 My dad actually recommended engine braking (the correct term is  
 "compression
 braking" btw, Thanks Dad!) as a way to reduce wear on the brakes. The  
 google
 poster is correct in this statement that all you're doing is putting  
 strain on
 parts that aren't used that way much, unless you reverse a lot. We see  
 cars
 ranging from the early 80's on up, including carbureted, and we've  
 NEVER once
 seen a car with a transmission or engine that died because of  
 compression
 braking. Given our sample size of somewhere over 10,000 ... :-)

How would you know if excessive wear was caused by engine braking or not? Excessive wear can be caused by all kinds of things, like not letting the engine warm up before driving it hard, or running long between oil changes, shifting prematurely or too late, etc.

Personally, I wouldn't. :-) But my Dad studied metallurgy as a minor at UW, and let's just say that he enjoys Metal Failure Analysis WAY more than one could charitably consider normal for a human. Aviation sidebar: his favorite metallurgy class was taught by an active duty Boeing incident response team member, and his favorite story was his teacher dragging the failed main gear bogey of a 727 into the room and asking the class to figure out what went wrong. If my dad says he has never seen that type of problem, he probably hasn't. Based on watching him do failure analyses on cars, my guess is that there would be telltales that clue him in to what was happening.

As far as the combustion chamber goes, when we rebuilt my 96 Accord it had some minor pitting from the repeated explosions but nothing else of note at 225k. We resurfaced the barrels and moved on. And I'm a consummate compression braker. :-)

Indeed, the other things you listed are quite evil on the internals of the engine, particularly going too long between oil changes. But compression braking isn't on the list from an engineering standpoint. The components of the transmission and engine are much beefier than they strictly need to be. No manufacturer wants THAT recall at 5k per repair. Essentially, it's not any different than driving forward; you are just reversing the stress on components that were engineered to handle it moving forward. And in the case of automatics, the Torque Converter acts as a buffer between the engine and transmission at cruising speeds (varying by model) until it hits the lock-up.
 The automotive industry has spent obscene amounts of money getting the  
 absolute
 cleanest burn they can to meet CAFE standards, and the very first thing  
 they did
 was get the oil out of the combustion chamber. I'll also say that based  
 on my
 dad's experience's with the Emissions class that even competent techs  
 are having
 a VERY difficult time understanding this stuff, the chemistry involved  
 is Ph.D
 stuff, and now ignition system are getting they way too. My dad has  
 often
 lamented that working on cars is now more about understanding the  
 computer
 control systems than it is the mechanics of it. Your average dealer tech
 probably has no clue what they are talking about since they have no  
 reason to
 invest in learning this stuff. They don't see the car again after the  
 warranty
 runs out and these systems rarely fail in five years. At least that's  
 been my
 dad's experience with them.

I'll have to add that my knowledge of these things is pre-1990. So are the cars I work on :-)

Hehe, Emissions Control only really got complicated in the last 15 years or so. And most people drive cars newer than 15 years, unlike the Crazy Leader of D Who Shall Remain Nameless. ;-) -- Adam Wilson IRC: LightBender Project Coordinator The Horizon Project http://www.thehorizonproject.org/
Jul 30 2013
prev sibling next sibling parent "Adam Wilson" <flyboynw gmail.com> writes:
On Tue, 30 Jul 2013 15:43:36 -0700, Walter Bright  
<newshound2 digitalmars.com> wrote:

 On 7/30/2013 12:16 PM, Adam Wilson wrote:
 Back driving ("compression braking" in the automotive world) is indeed a
 recommend procedure in modern cars. My dad (ASE Master Tech) recommends  
 it as a
 way to save wear on the brakes and is as you've noted, quite an  
 efficient use of
 energy. Heck, it's one of the first things he taught me how to do when  
 I was
 learning how to drive.

 Toyota took it one step further and built a capability into the Prius  
 where the
 electric driveline reverses it's polarity and uses motors to slow down  
 the car
 while simultaneously recharging the battery as the car slows down  
 instead of
 using the brakes. It's called regenerative braking. Needless to say, we  
 don't do
 brakes very often on Prius'.

If the engine *is designed for it*, that's a different story entirely. The engines I work on were not designed for it.

Well Toyota's Prius engine is just a simple powerplant with no connection to the road whatsoever, it's just a really cool technology. And electric motors are very different beasts from IC motors. :-) -- Adam Wilson IRC: LightBender Project Coordinator The Horizon Project http://www.thehorizonproject.org/
Jul 30 2013
prev sibling next sibling parent Joseph Rushton Wakeling <joseph.wakeling webdrake.net> writes:
On 07/27/2013 01:39 AM, Walter Bright wrote:
 Designers make mistakes even in redundant systems - sometimes they turn out to
 be coupled so a failure in one causes a failure in the backup. Sometimes
certain
 failure modes are not anticipated.
 
 But one thing they do NOT do is assume that component X cannot fail.
 
 The one that always springs to mind is
 the De Havilland jets breaking apart mid-flight due to metal fatigue.

Boeing's fix for that not only involved fixing the particular fatigue problem, but designing the structure so WHEN IT DOES CRACK the crack will not bring the airplane down. This design has been proven through a handful of incidents where an airliner has lost whole panels due to cracking and yet the structure remained sound.

I have to say, one of these days I'd really like to buy you a beer (or two, or three...) and have a long, long conversation about these (and other) aspects of aerospace engineering. I imagine it would be fascinating. :-) But I do think I'm correct in asserting that the particular disaster with the Comet didn't just result in learning about a new mode of failure and how to cope with it, but in an awful lot of new knowledge about designing safety procedures, analysing faults and crash data, and so on?
 The number of flights and resulting near misses surely helps to battle test
 safely procedures and designs. That volume of learning opportunities can't
 readily be matched in many other industries.

The most important lesson learned from aviation accidents is that all components can and will fail, so you need layers of redundancy. The airplane is far too complicated to rely on crash investigations to identify problems. I watched a show on the Concorde the other day, and was shocked to learn that there'd been an earlier incident where a tire burst on takeoff, the tire parts had penetrated the wing fuel tank, and the fuel drained away. The industry decided to ignore fixing it - and a few years later, it happened again, but this time the leak caught fire and killed everybody.

I want to stress that I never suggested relying on crash investigations! I said "near misses" ... :-)

What I mean is that I would have thought that with the number of flights taking place, there would be a continuous stream of data available about individual component failures and other problems arising in the course of flights, and that tracking and analysing that data would play a major role in anticipating potential future issues, such as modes of failure that hadn't previously been anticipated. The example you give with the Concorde is exactly the sort of thing that one would expect _should_ have prevented the later fatal accident.

My point was that this volume of data isn't necessarily available in other engineering situations, so one might anticipate that in these areas it's more likely that minor failures will be overlooked rather than learned from, as they are rarer and possibly not numerous enough to build up enough data to make predictions. Of course, even if sufficient data was available, it wouldn't save them if the design (or management) culture didn't take into account the basic principles you've described.
Jul 30 2013
prev sibling next sibling parent "Adam Wilson" <flyboynw gmail.com> writes:
On Tue, 30 Jul 2013 18:14:25 -0700, Walter Bright  
<newshound2 digitalmars.com> wrote:

 On 7/30/2013 4:22 PM, Adam Wilson wrote:
 Indeed, the other things you listed are quite evil on the internals of  
 the
 engine. Particularly going too long between oil changes. But  
 compression braking
 isn't on the list from an engineering standpoint. The components of the
 transmission and engine and much beefier than they strictly need to be.

Eh, I'm less convinced about that. I've had two transmissions shatter going at a steady 30 mph. I doubled the horsepower in my Dodge; the first thing that needed upgrading was the transmission (replaced the whole thing). I also upgraded the springs, driveshaft, bell housing (don't want my feet cut off), flywheel & clutch, brakes, and mounts. Not to mention everything inside the engine is upgraded, such as going from a cast to a forged crank (3x stronger).

Huh, I can't recall a story of that ever happening to a Honda or Toyota. We've had people install tow kits on minivans without the required oil cooler and set their transmissions on fire, but never shattering... Now, the Japanese tend to source higher quality metal than the American manufacturers do, so that might be it...
 I didn't upgrade the differential and rear axle. Those do tend to be  
 beefier than necessary.

 If I went to more than double the power, I'd have to do things like weld  
 extra bracing into the frame, "tub" the rear chassis, go to fat tires,  
 put in a roll cage, etc.


 No manufacturer wants THAT recall at 5k per repair. Essentially, it's  
 not any
 different than driving forward, you are just reversing the stress on  
 components
 that were engineered to handle it moving forward.

It also assumes that the profile of the gears and the hardening on them is symmetric. It probably is - but I don't know that for a fact.
 And most people drive cars newer than 15 years, unlike the Crazy Leader  
 of D Who
 Shall Remain Nameless. ;-)

There's just something about a hotrodder doing it by reflashing the SD memory that leaves me cold :-)

It's kind of hard to be proud of 5 minutes of effort. :-D
 I just don't care for new cars. The only ones that piqued my interest  
 are the retro Mustang and the retro Challenger. Not even the new  
 Ferraris look interesting. I'll rent cars on trips, and I can't even  
 recall what brand they were. Zzzzzzz.

I have to admit the tech in new cars is very appealing to me. But at this point now we're talking about taste, which I try not to debate people on. :-)
 I'll just conclude with a video on why electric cars will always suck  
 and why Detroit has never made anything worth buying since 1972:

 http://www.youtube.com/watch?v=PsUnBQE8jhE

I'm with you on the electric cars. I'll proudly drive my oil burning pollution machines till I die. But if we want to make money in the automotive maintenance world, we gotta follow the crowd... *sigh* -- Adam Wilson IRC: LightBender Project Coordinator The Horizon Project http://www.thehorizonproject.org/
Jul 30 2013
prev sibling next sibling parent "Tofu Ninja" <emmons0 purdue.edu> writes:
On Thursday, 1 August 2013 at 18:56:03 UTC, Walter Bright wrote:
 It's not easy being green :-)

ALL I CAN THINK ABOUT IS THE CHEETOS COMMERCIALS! "It's not easy being cheesy"
Aug 01 2013
prev sibling next sibling parent "Adam Wilson" <flyboynw gmail.com> writes:
On Thu, 01 Aug 2013 11:56:02 -0700, Walter Bright  
<newshound2 digitalmars.com> wrote:

 On 8/1/2013 11:37 AM, Jeff Nowakowski wrote:
 On 08/01/2013 02:14 PM, Walter Bright wrote:
 BTW, you don't think the Prius is a status symbol? :-)

 Nope, they're an affordable and practical car, and quite common these days.

 Tesla, now that's a status symbol.

 "as long as the hybrid remains a symbol of a driver's commitment to the environment, especially among the nation's wealthiest, the future of the Prius should be secure."

 http://www.forbes.com/sites/eco-nomics/2012/08/09/is-the-toyota-prius-

 The Prius isn't very green, either:

 "When you factor in all the energy it takes to drive and build a Prius, it takes almost 50% more energy than a Hummer. In a study by CNW Marketing called "Dust to Dust", researchers discovered that the Prius costs an average of $3.25 per mile driven over a lifetime of 100,000 miles (the expected lifespan of a hybrid). On the other hand the Hummer costs $1.95 per mile over an expected 300,000 miles. Which means that the Hummer will last three times as long and use less energy than the Prius."

 http://www.thetorquereport.com/2007/03/toyotas_prius_is_less_efficien.

 It's not easy being green :-)

If we've learned anything at the shop it's that people can't be bothered with the facts. They seriously don't care if you have studies backing up the environmental damage, they believe they are green and will take those beliefs to their graves. Ideology is funny that way. :-) -- Adam Wilson IRC: LightBender Project Coordinator The Horizon Project http://www.thehorizonproject.org/
Aug 01 2013
prev sibling next sibling parent "Craig Dillabaugh" <cdillaba cg.scs.carleton.ca> writes:
On Thursday, 1 August 2013 at 19:40:36 UTC, Andrei Alexandrescu
wrote:
clip

 My current car is a nice and economic Honda Fit. It is the very 
 last internal combustion engine I'll ever own - I hope my next 
 car will be a Tesla (regardless of what anyone thinks about it 
 being a status symbol). Buying a dinosaur juice-based engine at 
 this point is as much fail as buying a carriage with horses in 
 1915. I predict that internal combustion engines will be seen 
 in less than a hundred years as weird inefficient contraptions, 
 like we think of steam engines today.

If you really want to go green, you should get an electric bike and ride that :o) Then no one can accuse you of buying it as a status symbol, but of course you will likely be killed by one of the Hummer drivers. Craig
Aug 01 2013
prev sibling next sibling parent "Adam Wilson" <flyboynw gmail.com> writes:
On Thu, 01 Aug 2013 13:17:51 -0700, Walter Bright  
<newshound2 digitalmars.com> wrote:

 On 8/1/2013 12:39 PM, Andrei Alexandrescu wrote:
 You betcha. Related, you destroyed the myth that engine braking is any  
 bad, but
 I bet money nobody changed opinions.

For an engine designed for it, sure. For an engine not designed for it, no. A carbureted engine is still going to have the unburned gas problem (and you're not going to be very green pumping out semi-burned hydrocarbons out the tailpipe). I don't know at what point injected systems began shutting off the fuel when backdriving.

I think most manufacturers made that change in the early 80's with all the EPA mandates. So basically anything still on the road that isn't a Classic. :-)
 Also, there is a beauty about electrical engines - their theoretical  
 efficiency
 is 100%, they are simple, principled, entropy-neutral, and work on  
 conservative
 laws. (Batteries are more unwieldy though.)

You're right, it's all about the batteries. They're a gigantic problem that, while there are incremental improvements, is still far from a solution. But gasoline engines are also getting incremental improvements. Modern ones are way, way better than the ones from the 60's in just about every aspect.

And getting better every year. We're starting to see widespread use of Gasoline Direct Injection and better ignition technologies. The reason we aren't seeing major improvement in gas mileage is because every year new government safety mandates add an average of 30lbs to the car.
 There's an inherent efficiency in gas cars in that the energy is  
 generated on site. For electric cars, the energy is generated elsewhere  
 (at the power plant), and then you're faced with all the losses from  
 transmitting the energy, storing it, and recovering it. It's a tough  
 hill to climb. Gasoline is pretty remarkable in its energy density and  
 portability.

In fact, in raw terms, gasoline has significantly higher energy density than batteries. And that matters more than you'd think. The Prius, for example, weighs something like 2900 lbs where the typical gasoline powered car of the same size weighs around 2000 lbs. This is due to the need for a large array of batteries and carrying around both a gasoline motor and electrical motors. That has a direct effect on the amount of energy required to move its mass. So per pound, gasoline has batteries thoroughly beaten, and according to a friend of mine who works in the field (he designs the inverters that convert battery energy into energy usable by things like automotive electric motors) this will remain so for the foreseeable future. (AKA, nothing groundbreaking on the horizon.)
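[Editor's note: the mass penalty described above can be made concrete with rough specific-energy figures. The numbers used here (~12.9 kWh/kg for gasoline, ~0.2 kWh/kg for an automotive lithium-ion pack of that era) are ballpark assumptions for illustration, not measured data.]

```python
# Compare how much mass of each storage medium is needed to hold the
# same amount of energy. Both figures are rough, assumed ballpark values.

GASOLINE_KWH_PER_KG = 12.9  # approximate chemical energy density of gasoline
BATTERY_KWH_PER_KG = 0.2    # approximate early-2010s automotive Li-ion pack

def mass_for_energy(kwh, kwh_per_kg):
    """Mass (kg) of a storage medium needed to hold `kwh` of energy."""
    return kwh / kwh_per_kg

target_kwh = 50.0  # roughly one mid-size EV pack's worth of energy
gas_kg = mass_for_energy(target_kwh, GASOLINE_KWH_PER_KG)
batt_kg = mass_for_energy(target_kwh, BATTERY_KWH_PER_KG)

print(f"gasoline: {gas_kg:.1f} kg, battery: {batt_kg:.0f} kg")
print(f"battery is roughly {batt_kg / gas_kg:.0f}x heavier for the same energy")
```

Even granting that an electric drivetrain converts its stored energy far more efficiently than an engine burns gasoline, a gap of this size in raw storage mass is why the battery pack dominates an EV's weight budget.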
 BTW, with a manual trans, you can get quite a bit better mileage than  
 the EPA ratings. Google "hypermiling" for ways. I do that stuff  
 routinely.

You did forget to mention that you piss off everyone behind you though... ;-P -- Adam Wilson IRC: LightBender Project Coordinator The Horizon Project http://www.thehorizonproject.org/
Aug 01 2013
prev sibling next sibling parent "Adam Wilson" <flyboynw gmail.com> writes:
On Thu, 01 Aug 2013 12:39:52 -0700, Andrei Alexandrescu  
<SeeWebsiteForEmail erdani.org> wrote:

 On 8/1/13 12:05 PM, Adam Wilson wrote:
 If we've learned anything at the shop it's that people can't be bothered
 with the facts. They seriously don't care if you have studies backing up
 the environmental damage, they believe they are green and will take
 those beliefs to their graves. Ideology is funny that way. :-)

You betcha. Related, you destroyed the myth that engine braking is any bad, but I bet money nobody changed opinions.

Indeed. :-)
 About green driving, Prius, and Tesla - it's all about what industry you  
 want to sustain. Everything that stands behind the Hummer as a road car  
 is an abomination, pure and simple. Of course I'd agree plenty of Prius  
 drivers are as snooty as it gets in a different way. Yet the reality  
 remains that the Hummer is an evolutionary dead end, and hybrids are a  
 stepping stone to a better future.

The most efficient/effective method would be to power the roads and then have cars draw energy from that. With battery storage for where the roads are unpowered. That way you could draw on the power generation capacities of Fission or Fusion devices without needing to stick one in every car. That would greatly reduce the amount of battery capacity needed for the average trips.
 My current car is a nice and economic Honda Fit. It is the very last  
 internal combustion engine I'll ever own - I hope my next car will be a  
 Tesla (regardless of what anyone thinks about it being a status symbol).  
 Buying a dinosaur juice-based engine at this point is as much fail as  
 buying a carriage with horses in 1915. I predict that internal  
 combustion engines will be seen in less than a hundred years as weird  
 inefficient contraptions, like we think of steam engines today.

Personally, I am hoping for Zero-Point Energy powered cars, or if not that, then at least a Mr. Fusion (apologies to all who don't get the somewhat dated cultural reference).
 Also, there is a beauty about electrical engines - their theoretical  
 efficiency is 100%, they are simple, principled, entropy-neutral, and  
 work on conservative laws. (Batteries are more unwieldy though.)


 Andrei

-- Adam Wilson IRC: LightBender Project Coordinator The Horizon Project http://www.thehorizonproject.org/
Aug 01 2013
prev sibling next sibling parent "Adam Wilson" <flyboynw gmail.com> writes:
On Thu, 01 Aug 2013 14:11:03 -0700, Walter Bright  
<newshound2 digitalmars.com> wrote:

 On 8/1/2013 1:53 PM, Adam Wilson wrote:
 You did forget to mention that you piss off everyone behind though...  
 ;-P

I do pay attention to what's behind me when doing it. I'll hypermile much more aggressively when there's nobody behind me.

That just makes you a very rare person. ;-) -- Adam Wilson IRC: LightBender Project Coordinator The Horizon Project http://www.thehorizonproject.org/
Aug 01 2013
prev sibling parent "deadalnix" <deadalnix gmail.com> writes:
On Friday, 2 August 2013 at 01:24:31 UTC, Walter Bright wrote:
 What does work is, of course, orienting your life so you drive 
 less. Like living closer to work, combining errands into one 
 trip, carpooling, biking, using Amazon instead of going to the 
 mall, etc.

Good luck with that in a country where cars are religious material.
Aug 01 2013