
D - Case against C

Mark Evans <Mark_member pathlink.com> writes:
Perspectives from a language author and CS professor who harks back to the days
of C's invention.  He also discusses C++.

Mark
http://murray.newcastle.edu.au/users/staff/peter/Moylan.html
ftp://murray.newcastle.edu.au/pub/reports/CaseAgainstC.txt

---------------------------------------------------------
C enthusiasts seem to be largely in ignorance of the advances which have been
made in language design in the last 20 years.

C was not the only such language [invented at that time], and certainly not the
earliest. In fact, a whole rash of machine-oriented languages appeared at about
that time. (I was the author of one...) These languages had a strong family
resemblance to one another; not because the authors were copying from one
another ... but because they were all influenced by the same pool of ideas...

Most other machine-oriented languages which appeared at about the same time as C
are now considered to be obsolete. Why, then, has C survived? ... Do these
reasons look familiar? Yes, they are almost identical to the arguments which
were being trotted out a few years ago in favour of BASIC.

We *have* learnt some new things about language design in the last 20 years, and
we do know that some of the things that seemed like a good idea at the time are
in fact not such good ideas. Is it not time to move on to D, or even E?
[Evidently! -Mark]

The choice of a programming language is often an emotional issue which is not
subject to rational discussion. [Right. -Mark] Nevertheless it is hoped to show
here that there are good objective reasons why C is not a good choice for large
programming projects. These reasons are related primarily to the issues of
software readability and programmer productivity.

Needless to say, there were those who felt that "real" programmers would
continue to work in machine language. Those "real" programmers are still among
us, and are still arguing that their special skills and superior virtue somehow
compensate for their poor productivity.

For anyone working with almost any reasonably advanced application, it is hard
to avoid the use of pointers. This does not mean that we have to like them.

There appears to be a widespread belief among C programmers that – because the
language is close to machine language – a C program will produce more efficient
object code than an equivalent program written in a high-level language.

[S]peed of a program tends to depend more on the global strategies adopted –
what sort of data structures to use, what sorting algorithms to use, and so on –
than the micro-efficiency issues related to precisely how each line of code is
written. When working in a low-level language like C, it becomes harder to keep
track of the global issues.
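
[A minimal C sketch of that point, not from Moylan's essay; the duplicate-counting
task and all names below are invented for illustration. The move from pairwise
comparison to sort-then-scan is the kind of "global strategy" change that dwarfs
any micro-tuning of the first version's inner loop.]

/* Count values that occur more than once in an array, two ways. */
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

/* O(n^2): for each element, scan all earlier elements for a match. */
static size_t dup_count_quadratic(const int *a, size_t n)
{
    size_t count = 0;
    for (size_t i = 1; i < n; i++)
        for (size_t j = 0; j < i; j++)
            if (a[j] == a[i]) { count++; break; }
    return count;
}

static int cmp_int(const void *x, const void *y)
{
    int a = *(const int *)x, b = *(const int *)y;
    return (a > b) - (a < b);
}

/* O(n log n): sort a copy, then count adjacent equal pairs in one pass. */
static size_t dup_count_sorted(const int *a, size_t n)
{
    size_t count = 0;
    int *copy = malloc(n * sizeof *copy);
    if (copy == NULL) return 0;
    memcpy(copy, a, n * sizeof *copy);
    qsort(copy, n, sizeof *copy, cmp_int);
    for (size_t i = 1; i < n; i++)
        if (copy[i] == copy[i - 1]) count++;
    free(copy);
    return count;
}

int main(void)
{
    int data[] = { 3, 7, 3, 1, 9, 7, 7, 2 };
    size_t n = sizeof data / sizeof data[0];
    /* Both report 3 duplicate occurrences; only the algorithm differs. */
    printf("%zu %zu\n", dup_count_quadratic(data, n), dup_count_sorted(data, n));
    return 0;
}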

Nothing in this document should be interpreted as a criticism of the original
designers of C. I happen to believe that the language was an excellent invention
for its time. I am simply suggesting that there have been some advances in the
art and science of software design since that time, and that we ought to be
taking advantage of them.

I am not so naive as to expect that diatribes such as this will cause the
language to die out. Loyalty to a language is very largely an emotional issue
which is not subject to rational debate. I would hope, however, that I can
convince at least some people to re-think their positions.

I recognise, too, that factors other than the inherent quality of a language can
be important. Compiler availability is one such factor. Re-use of existing
software is another; it can dictate the continued use of a language even when it
is clearly not the best choice on other grounds. (Indeed, I continue to use the
language myself for some projects, mainly for this reason.) What we need to
guard against, however, is making inappropriate choices through simple inertia.

[And that includes language design choices. -Mark]
Jan 25 2003
Garen Parham <garen_nospam_ wsu.edu> writes:
Nothing really new there.  When he said:

"What we need to guard against, however, is making inappropriate choices
through simple inertia."

I have to wonder exactly how we're supposed to do that.  It's really true
that the network effect has an enormous influence, but I can't see any way
of preventing it from happening; it's inevitable.  Just speaking about it in
general doesn't add anything constructive.  It's kind of like anti-technology
doomsayers telling us we should "be careful" but having nothing else to say
beyond that (which is obvious).
Jan 25 2003