
digitalmars.D - Re: Null references (oh no, not again!)

reply Kagamin <spam here.lot> writes:
 I call it my billion-dollar mistake. It was the invention of the null
 reference in 1965. [...] This has led to innumerable errors,
 vulnerabilities, and system crashes, which have probably caused a
 billion dollars of pain and damage in the last forty years.

-- Sir Charles Hoare, Inventor of QuickSort, Turing Award Winner
 * Accessing arrays out-of-bounds
 * Dereferencing null pointers
 * Integer overflow
 * Accessing uninitialized variables

 50% of the bugs in Unreal can be traced to these problems!


I doubt that bluntly forcing non-null will solve this problem. If you're forced to use non-null, you'll invent a means to fool the compiler - some analogue of a null reference, such as a stub object whose use will result in the same bug, with the difference that the application won't crash immediately but will behave in an unpredictable way, at some point causing some other exception, so eventually you'll get your crash. The profit will be infinitesimal, if any.
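A rough sketch of that failure mode, in Rust (used here purely as a stand-in, since D has no non-null reference types; `User`, `stub_user`, and `greeting` are all made up for illustration):

```rust
// A "stub object" smuggled past a non-null type system: the compiler is
// satisfied, but the value is as meaningless as a null would have been.
struct User {
    name: String,
    id: u64,
}

// The stub: statically non-null, semantically "no user".
fn stub_user() -> User {
    User { name: String::new(), id: 0 }
}

fn greeting(user: &User) -> String {
    // With a real null this line would crash immediately; with the stub it
    // "works" and the nonsense output surfaces as a bug somewhere else.
    format!("Hello, {}!", user.name)
}

fn main() {
    let u = stub_user(); // compiler is happy: u can never be null
    assert_eq!(greeting(&u), "Hello, !"); // silently wrong, no crash
}
```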
Mar 06 2009
parent reply "Nick Sabalausky" <a a.a> writes:
"Kagamin" <spam here.lot> wrote in message 
news:goqoup$jta$1 digitalmars.com...
 I call it my billion-dollar mistake. It was the invention of the null
 reference in 1965. [...] This has led to innumerable errors,
 vulnerabilities, and system crashes, which have probably caused a
 billion dollars of pain and damage in the last forty years.

-- Sir Charles Hoare, Inventor of QuickSort, Turing Award Winner
 * Accessing arrays out-of-bounds
 * Dereferencing null pointers
 * Integer overflow
 * Accessing uninitialized variables

 50% of the bugs in Unreal can be traced to these problems!


I doubt that bluntly forcing non-null will solve this problem. If you're forced to use non-null, you'll invent a means to fool the compiler - some analogue of a null reference, such as a stub object whose use will result in the same bug, with the difference that the application won't crash immediately but will behave in an unpredictable way, at some point causing some other exception, so eventually you'll get your crash. The profit will be infinitesimal, if any.

The idea is that non-null would not be forced, but rather be the default with an optional nullable for the times when it really is needed.
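For illustration, this is how the idea could look, sketched in Rust (again only a stand-in for the D discussion; Rust happens to work exactly this way - plain references are non-null by default, and `Option<T>` is the opt-in nullable):

```rust
// Non-null is the default; nullability is stated explicitly in the type.
// The caller cannot forget the "no value" case: the compiler enforces it.
fn find_even(xs: &[i32]) -> Option<&i32> {
    xs.iter().find(|&&x| x % 2 == 0) // may find nothing, and the type says so
}

fn main() {
    let xs = [1, 3, 4, 5];
    // Handling both cases is compulsory before the value can be used:
    match find_even(&xs) {
        Some(x) => assert_eq!(*x, 4),
        None => unreachable!("xs contains an even number"),
    }
    assert_eq!(find_even(&[1, 3]), None);
}
```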
Mar 06 2009
parent reply Georg Wrede <georg.wrede iki.fi> writes:
Nick Sabalausky wrote:
 "Kagamin" <spam here.lot> wrote in message 
 I doubt that bluntly forcing non-null will solve this problem. If you're 
 forced to use non-null, you'll invent a means to fool the compiler - some 
 analogue of a null reference, such as a stub object whose use will result 
 in the same bug, with the difference that the application won't crash 
 immediately but will behave in an unpredictable way, at some point causing 
 some other exception, so eventually you'll get your crash. The profit will 
 be infinitesimal, if any.

The idea is that non-null would not be forced, but rather be the default with an optional nullable for the times when it really is needed.

This is interesting. I wonder what the practical result of non-null as the default will be. Will programmers bother to specify nullable when needed, or will they "try to do the [perceived] Right Thing" by assigning stupid default values? If the latter happens, then we really are worse off than with nulls, and searching for the elusive bug will be much more work.
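That worry can be made concrete with a small sketch (Rust again as the stand-in; the `Config` struct and the -1.0 sentinel are invented for the example):

```rust
// The "stupid default value" anti-pattern: instead of declaring the field
// nullable, the programmer fills it with a sentinel. The missing-data bug
// no longer crashes; it propagates a plausible-looking wrong answer.
struct Config {
    timeout_secs: f64, // arguably should have been Option<f64>
}

fn load_config(user_supplied: Option<f64>) -> Config {
    // A None would fail fast at first use; the sentinel sails on quietly.
    Config { timeout_secs: user_supplied.unwrap_or(-1.0) }
}

fn main() {
    let cfg = load_config(None);
    // Downstream code happily computes with the nonsense value:
    let deadline = 100.0 + cfg.timeout_secs;
    assert_eq!(deadline, 99.0); // no crash, just a quietly wrong deadline
}
```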
Mar 06 2009
parent reply "Nick Sabalausky" <a a.a> writes:
"Georg Wrede" <georg.wrede iki.fi> wrote in message 
news:gor5ft$1d6c$1 digitalmars.com...
 Nick Sabalausky wrote:
 "Kagamin" <spam here.lot> wrote in message
 I doubt that bluntly forcing non-null will solve this problem. If you're 
 forced to use non-null, you'll invent a means to fool the compiler - some 
 analogue of a null reference, such as a stub object whose use will result 
 in the same bug, with the difference that the application won't crash 
 immediately but will behave in an unpredictable way, at some point causing 
 some other exception, so eventually you'll get your crash. The profit will 
 be infinitesimal, if any.

The idea is that non-null would not be forced, but rather be the default with an optional nullable for the times when it really is needed.

This is interesting. I wonder what the practical result of non-null as the default will be. Will programmers bother to specify nullable when needed, or will they "try to do the [perceived] Right Thing" by assigning stupid default values? If the latter happens, then we really are worse off than with nulls, and searching for the elusive bug will be much more work.

Interesting point. We should probably keep an eye on the languages that use the "Foo" vs "Foo?" syntax for non-null vs nullable to see what usage patterns arise. That said, I generally have little more than contempt for programmers who blindly do what they were taught (by other amateurs) is usually "the right thing" without considering whether it really is appropriate for the situation at hand. Then again, there must be plenty of things we already use that could make matters worse if people used them improperly.
Mar 06 2009
parent Georg Wrede <georg.wrede iki.fi> writes:
Nick Sabalausky wrote:
 "Georg Wrede" <georg.wrede iki.fi> wrote in message 
 news:gor5ft$1d6c$1 digitalmars.com...
 Nick Sabalausky wrote:
 "Kagamin" <spam here.lot> wrote in message
 I doubt that bluntly forcing non-null will solve this problem. If you're 
 forced to use non-null, you'll invent a means to fool the compiler - some 
 analogue of a null reference, such as a stub object whose use will result 
 in the same bug, with the difference that the application won't crash 
 immediately but will behave in an unpredictable way, at some point causing 
 some other exception, so eventually you'll get your crash. The profit will 
 be infinitesimal, if any.

The idea is that non-null would not be forced, but rather be the default with an optional nullable for the times when it really is needed.

This is interesting. I wonder what the practical result of non-null as the default will be. Will programmers bother to specify nullable when needed, or will they "try to do the [perceived] Right Thing" by assigning stupid default values? If the latter happens, then we really are worse off than with nulls, and searching for the elusive bug will be much more work.

Interesting point. We should probably keep an eye on the languages that use the "Foo" vs "Foo?" syntax for non-null vs nullable to see what usage patterns arise. That said, I generally have little more than contempt for programmers who blindly do what they were taught (by other amateurs) is usually "the right thing" without considering whether it really is appropriate for the situation at hand. Then again, there must be plenty of things we already use that could make matters worse if people used them improperly.

An interesting thought occurred to me just now. IIRC, Walter's argument for always zeroing memory at allocation was to give "sensible starting values" and to "more easily see if data is uninitialised". If assignment before use is compulsory, then we don't need to zero out memory anymore. This ought to speed up data-intensive tasks.
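Sketched in Rust, which enforces assignment-before-use at compile time (a stand-in for the D discussion; `first_byte` is an invented example):

```rust
// Because the compiler rejects any read of an uninitialized local, the
// runtime never needs to zero memory "just in case".
fn first_byte() -> u8 {
    let x: u8; // declared but not initialized; reading it here won't compile
    x = 42;    // the compulsory assignment before any use

    // with_capacity reserves 1024 bytes without zeroing them; the type
    // system prevents reading the unwritten part, so no memset is needed -
    // the speedup predicted above for data-intensive tasks.
    let mut v: Vec<u8> = Vec::with_capacity(1024);
    v.push(x); // bytes only become readable once written
    v[0]
}

fn main() {
    assert_eq!(first_byte(), 42);
}
```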
Mar 06 2009