
digitalmars.D - What Makes A Programming Language Good

reply Walter Bright <newshound2 digitalmars.com> writes:
http://urbanhonking.com/ideasfordozens/2011/01/18/what-makes-a-programming-language-good/
Jan 17 2011
next sibling parent reply "Vladimir Panteleev" <vladimir thecybershadow.net> writes:
On Tue, 18 Jan 2011 07:20:56 +0200, Walter Bright  
<newshound2 digitalmars.com> wrote:

 http://urbanhonking.com/ideasfordozens/2011/01/18/what-makes-a-programming-language-good/

So, why do users still get a scary linker error when they try to compile a program with more than 1 module? IMO, sticking to the C-ism of "one object file at a time" and the dependency on external build tools / makefiles is the biggest mistake DMD made in this regard. Practically everyone to whom I recommended trying D hit this obstacle. rdmd is nice, but I see no reason why this shouldn't be in the compiler. Think of the time wasted by build tool authors (bud, rebuild, xfbuild and others, and now rdmd), which could have been put to better use if this were handled by the compiler, which could do it much more easily (until relatively recently it was very hard to track dependencies correctly).

-- 
Best regards,
 Vladimir                          mailto:vladimir thecybershadow.net
Jan 18 2011
next sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
Vladimir Panteleev wrote:
 On Tue, 18 Jan 2011 07:20:56 +0200, Walter Bright 
 <newshound2 digitalmars.com> wrote:
 
 http://urbanhonking.com/ideasfordozens/2011/01/18/what-makes-a-programming-language-good/

So, why do users still get a scary linker error when they try to compile a program with more than 1 module?

What is that message?
 IMO, sticking to the C-ism of "one object file at a time" and dependency 
 on external build tools / makefiles is the biggest mistake DMD did in 
 this regard. Practically everyone to whom I recommended to try D hit 
 this obstacle. rdmd is nice but I see no reason why this shouldn't be in 
 the compiler. Think of the time wasted by build tool authors (bud, 
 rebuild, xfbuild and others, and now rdmd), which could have been put to 
 better use if this were handled by the compiler, who could do it much 
 easier (until relatively recently it was very hard to track dependencies 
 correctly).

dmd can build entire programs with one command:

    dmd file1.d file2.d file3.d ...etc...
Jan 18 2011
next sibling parent reply Christopher Nicholson-Sauls <ibisbasenji gmail.com> writes:
On 01/18/11 03:11, Vladimir Panteleev wrote:
 On Tue, 18 Jan 2011 11:05:34 +0200, Walter Bright
 <newshound2 digitalmars.com> wrote:
 
 Vladimir Panteleev wrote:
 On Tue, 18 Jan 2011 07:20:56 +0200, Walter Bright
 <newshound2 digitalmars.com> wrote:

 http://urbanhonking.com/ideasfordozens/2011/01/18/what-makes-a-programming-language-good/

 So, why do users still get a scary linker error when they try to compile a program with more than 1 module?

What is that message?

C:\Temp\D\Build> dmd test1.d
OPTLINK (R) for Win32  Release 8.00.8
Copyright (C) Digital Mars 1989-2010  All rights reserved.
http://www.digitalmars.com/ctg/optlink.html
test1.obj(test1)  Error 42: Symbol Undefined _D5test21fFZv
--- errorlevel 1

1) The error message is very technical:
   a) it does not indicate what exactly is wrong (module not passed to linker, not that the linker knows that)
   b) it does not give any indication of what the user has to do to fix it
2) OPTLINK doesn't demangle D mangled names, when it could, and it would improve the readability of its error messages considerably. (I know not all mangled names are demangleable, but it'd be a great improvement regardless.)
 dmd can build entire programs with one command:

     dmd file1.d file2.d file3.d ...etc...

That doesn't scale anywhere. What if you want to use a 3rd-party library with a few dozen modules?

Then I would expect the library vendor to provide either a pre-compiled binary library, or the means to readily generate same -- whether that means a Makefile, a script, or what have you. At that time, there is no need to provide DMD with anything -- unless you are one-lining it a la 'dmd file1 file2 file3 third_party_stuff.lib'.

Forgive me if I misunderstand, but I really don't want a language/compiler that goes too far into hand-holding. Let me screw up if I want to.

-- Chris N-S
Jan 18 2011
next sibling parent reply Trass3r <un known.com> writes:
 Then I would expect the library vendor provides either a pre-compiled
 binary library

As soon as you provide templates in your library this isn't sufficient anymore.
 or the means to readily generate same -- whether that
 means a Makefile, a script, or what have you.

We must avoid having the same disastrous situation as C/C++, where everyone uses a different system: CMake, make, scons, blabla. Makefiles aren't portable (imo stuff like msys is no solution, it's a hack), and especially for small or medium-sized projects it's often enough to compile a single main file and all of its dependencies. We really need a standard, portable way to compile D projects, be it implemented in the compiler or in some tool everyone uses. dsss was kind of promising but, as you know, it's dead.
Jan 18 2011
parent reply Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
On 1/18/11 4:32 AM, Trass3r wrote:
 Then I would expect the library vendor provides either a pre-compiled
 binary library

As soon as you provide templates in your library this isn't sufficient anymore.
 or the means to readily generate same -- whether that
 means a Makefile, a script, or what have you.

We must avoid having the same disastrous situation like C/C++ where everyone uses a different system, CMake, make, scons, blabla. Makefiles aren't portable (imo stuff like msys is no solution, it's a hack) and especially for small or medium-sized projects it's often enough to compile a single main file and all of its dependencies. We really need a standard, portable way to compile D projects, be it implemented in the compiler or in some tool everyone uses. dsss was kind of promising but as you know it's dead.

You may add to bugzilla the features that rdmd needs to acquire. Andrei
Jan 18 2011
parent reply Trass3r <un known.com> writes:
 the features that rdmd needs to acquire

Well something that's also missing in xfBuild is a proper way to organize different build types: (debug, release) x (x86, x64) x ... But that would require config files similar to dsss' ones I think.
Jan 18 2011
parent reply "Nick Sabalausky" <a a.a> writes:
"Trass3r" <un known.com> wrote in message 
news:ih4ij7$1g01$1 digitalmars.com...
 the features that rdmd needs to acquire

Well something that's also missing in xfBuild is a proper way to organize different build types: (debug, release) x (x86, x64) x ... But that would require config files similar to dsss' ones I think.

FWIW, stbuild (part of semitwist d tools) exists to do exactly that:

http://www.dsource.org/projects/semitwist/browser/trunk/src/semitwist/apps/stmanage/stbuild
http://www.dsource.org/projects/semitwist/browser/trunk/bin

Although I'm thinking of replacing it with something more rake-like.
Jan 18 2011
parent "Nick Sabalausky" <a a.a> writes:
"Nick Sabalausky" <a a.a> wrote in message 
news:ih4p4o$1r1o$1 digitalmars.com...
 "Trass3r" <un known.com> wrote in message 
 news:ih4ij7$1g01$1 digitalmars.com...
 the features that rdmd needs to acquire

Well something that's also missing in xfBuild is a proper way to organize different build types: (debug, release) x (x86, x64) x ... But that would require config files similar to dsss' ones I think.

 FWIW, stbuild (part of semitwist d tools) exists to do exactly that:

 http://www.dsource.org/projects/semitwist/browser/trunk/src/semitwist/apps/stmanage/stbuild
 http://www.dsource.org/projects/semitwist/browser/trunk/bin

Oh, and an example of the config file:

http://www.dsource.org/projects/semitwist/browser/trunk/stbuild.conf
 Although I'm thinking of replacing it with something more rake-like.

 

Jan 18 2011
prev sibling parent reply bearophile <bearophileHUGS lycos.com> writes:
Vladimir Panteleev:

 So, you want D to force people to do more work, out of no practical reason?

When you develop a large system, the nice hand-holding that works for small systems often stops working (because the whole language ecosystem is often not much designed for hierarchical decomposition of problems). In this situation you are often on your own, and the automatic features often work against you, because their work and actions are opaque. So those programmers develop a mistrust toward a compiler and tools that hold their hand too much.

A related problem is visible in old automatic pilot systems. They are very useful, but when their operative limits are reached (because some emergency has pushed the plane's state outside them), they suddenly stop working and leave the human pilots in bad waters, because the humans don't have a lot of time to wake from their drowsy state and understand the situation well enough to face the problems. So those old automatic pilot systems were actively dangerous (newer automatic pilot systems have found ways to reduce such problems).

To solve the situation, future automatic D tools need to work in a very transparent way, giving all their information in an easy-to-understand form and showing everything they do very clearly. Then, when they fail or stop being enough, the programmer doesn't need to work three times harder to solve the problems manually.

Bye,
bearophile
Jan 18 2011
parent el muchacho <nicolas.janin gmail.com> writes:
Le 18/01/2011 11:45, bearophile a écrit :
 Vladimir Panteleev:
 
 So, you want D to force people to do more work, out of no practical reason?

When you develop a large system, the nice hand holding that works with small systems often stops working (because the whole language ecosystem is often not much designed for hierarchical decomposition of problems). In this situation you are often on your own, and often the automatic features work against you because their work and actions are often opaque. So those programmer develop a mistrust toward a compiler+tools that hold too much your hand. A related problem is visible in old automatic pilot systems. They are very useful, but when their operative limits are reached (because some emergency has pushed the plane state outside them), they suddenly stop working, and leave the human pilots in bad waters because the humans don't have a lot of time to awake from their sleepy state and understand the situation well enough to face the problems. So those old automatic pilot systems were actively dangerous (new automatic pilot systems have found ways to reduce such problems). To solve the situation, the future automatic D tools need to work in a very transparent way, giving all the information in a easy to use and understand way, showing all they do in a very clear way. So when they fail or when they stop being enough, the programmer doesn't need to work three times harder to solve the problems manually. Bye, bearophile

My 2 cents: there is no need for transparency in the compilation and linking processes if things are well defined. Armies of developers in Java shops, including banks, trust their IDE to do almost everything, be it Eclipse, NetBeans or IntelliJ, sometimes all 3 at the same time in the same team. This is the case in my team, where some developers use IntelliJ while others use Eclipse, out of the same source code repository. Both IDEs can compile and debug the software, and the final build is made by a big ant file which can check out, generate code, build with javac and run tests. So there are 3 build systems in parallel.

One task of the ant file is run once by each developer to generate the code; after that, the build is entirely handled by the build system, i.e. the compiler of the IDE. There is no need to specify any dependency in the ant file. Of course, the IDE's compiler needs to be told where to find the library dependencies, because we don't use Maven yet, but apart from that there is no need to specify anything else.

This is in contrast with the horrible makefiles that still cripple most C++ projects, and still prevent C++ shops from benefiting from efficient IDEs. Having worked both on large C++ systems and Java systems, my only conclusion is: make is a huge waste of time.
Jan 29 2011
prev sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
Vladimir Panteleev wrote:
 On Tue, 18 Jan 2011 11:05:34 +0200, Walter Bright 
 <newshound2 digitalmars.com> wrote:
 
 Vladimir Panteleev wrote:
 On Tue, 18 Jan 2011 07:20:56 +0200, Walter Bright 
 <newshound2 digitalmars.com> wrote:

 http://urbanhonking.com/ideasfordozens/2011/01/18/what-makes-a-programming-language-good/

 So, why do users still get a scary linker error when they try to compile a program with more than 1 module?

What is that message?

C:\Temp\D\Build> dmd test1.d
OPTLINK (R) for Win32  Release 8.00.8
Copyright (C) Digital Mars 1989-2010  All rights reserved.
http://www.digitalmars.com/ctg/optlink.html
test1.obj(test1)  Error 42: Symbol Undefined _D5test21fFZv
--- errorlevel 1

1) The error message is very technical:
   a) it does not indicate what exactly is wrong (module not passed to linker, not that the linker knows that)

There could be many reasons for the error; see:

http://www.digitalmars.com/ctg/OptlinkErrorMessages.html#symbol_undefined

which is linked from the url listed:

http://www.digitalmars.com/ctg/optlink.html

and more directly from the FAQ:

http://www.digitalmars.com/faq.html
   b) does not give any indication of what the user has to do to fix it

The link above does give such suggestions, depending on what the cause of the error is.
 2) OPTLINK doesn't demangle D mangled names, when it could, and it would 
 improve the readability of its error messages considerably.
    (I know not all mangled names are demangleable, but it'd be a great 
 improvement regardless)

The odd thing is that Optlink did demangle the C++ mangled names, and people actually didn't like it that much.
 dmd can build entire programs with one command:

     dmd file1.d file2.d file3.d ...etc...

That doesn't scale anywhere. What if you want to use a 3rd-party library with a few dozen modules?

Just type the filenames and library names on the command line. You can put hundreds if you like. If you do blow up the command line processor (nothing dmd can do about that), you can put all those files in a file, say "cmd", and invoke with:

    dmd @cmd

The only limit is the amount of memory in your system.
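As a concrete sketch of that workflow (the file names below are made up for the demo, and the dmd invocation itself is left commented out since it assumes dmd is installed):

```shell
# Hypothetical two-module project, just to demonstrate the workflow.
printf 'import file2;\nvoid main() { helper(); }\n' > file1.d
printf 'void helper() {}\n' > file2.d

# Collect every source file into a command file, one name per line...
ls file*.d > cmd
cat cmd    # prints file1.d and file2.d, one per line

# ...then hand the whole list to the compiler in one invocation:
# dmd @cmd
```

The command file can be regenerated by the same one-liner whenever files are added, which is the closest a plain shell gets to the "no build tool" approach described above.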
Jan 18 2011
next sibling parent reply Jim <bitcirkel yahoo.com> writes:
      dmd @cmd

 The only limit is the amount of memory in your system.

That's not what I meant - I meant it doesn't scale as far as user effort is concerned. There is no reason why D should force users to maintain response files, makefiles, etc. D (the language) doesn't need them, and neither should the reference implementation.

I have to second that. Your main.d imports abc.d which, in turn, imports xyz.d. Why can't the compiler traverse this during compilation in order to find all the relevant modules and compile them if needed? I imagine such a compiler could also do some interesting optimisations based on its greater perspective.

The single file as a compilation unit seems a little myopic to me. Its reasons are historic, I bet.
Jan 18 2011
next sibling parent reply Adam Ruppe <destructionator gmail.com> writes:
Jim wrote:
 Why can't the compiler traverse this during compilation in order to
 find all relevant modules and compile them if needed?

How will it find all the modules? Since modules and files don't have to have matching names, it can't assume "import foo;" will necessarily be found in "foo.d". I use this fact a lot to get all a program's dependencies in one place. The modules don't necessarily have to be under the current directory either. It'd have a lot of files to search, which might be brutally slow. ... but, if you do want that behavior, you can get it today somewhat easily: dmd *.d, which works quite well if all the things are in one folder anyway.
Jan 18 2011
parent reply Adam Ruppe <destructionator gmail.com> writes:
Vladimir Panteleev:
 I think [file/module name mismatches] is a misfeature.

Maybe. 9/10 times they match anyway, but I'd be annoyed if the package names had to match the containing folder. Here's what I think might work: just use the existing import path rule. If it gets a match, great. If not, the user can always manually add the other file to the command line anyway.
 I suppose you avoid using build tools and
 prefer makefiles/build scripts for some reason?

Yeah, makefiles and build scripts already fit the bill adequately. That is, they don't suck enough to justify the effort of getting something new. I've thought about making an automatic build+download thing myself in the past, but the old way has been good enough for me.

(If I were to do it, I'd take rdmd and add a little http download facility to it. If you reference a module that isn't already there, it'd look up the path to download it from a config file, grab it, and try the compile. If the config file doesn't exist, it can grab one automatically from a central location. That way, it'd be customizable and extensible by anyone, but still just work out of the box. But, like I said, it stalled out because my classic makefile and simple scripts have been good enough for me.)
 ...which won't work on Windows, for projects with packages, and if
 you have any unrelated .d files (backups, test programs) in your
 directory (which I almost always do).

Indeed.
Jan 18 2011
next sibling parent reply Jim <bitcirkel yahoo.com> writes:
Adam Ruppe Wrote:
 Maybe. 9/10 times they match anyway, but I'd be annoyed if
 the package names had to match the containing folder.

This is enforced in some languages, and I like it. It'd be confusing if they didn't match when I would go to look for something. I think it would be a good idea for D to standardise this. Not only so that the compiler can traverse and compile but for all dev tools (static analysers, package managers, etc). Standardisation makes it easier to create toolchains, which I believe are essential for the growth of any language use.
Jan 18 2011
next sibling parent reply bearophile <bearophileHUGS lycos.com> writes:
spir:

 The D styleguide requires on one hand capitalised names for types, and 
 lowercase for filenames on the other. How are we supposed to make them 
 match?

Why do you want them to match? Bye, bearophile
Jan 18 2011
parent reply bearophile <bearophileHUGS lycos.com> writes:
spir:

 Because when a module defines a type Foo (or rather, it's what is 
 exported), I like it to be called Foo.d.

Generally D modules contain many types. Bye, bearophile
Jan 19 2011
parent bearophile <bearophileHUGS lycos.com> writes:
spir:

 Yep, but often one is the main exported element.

That's not true for Phobos, my dlibs1, and a lot of my code that uses those libs.

 When there are several, hopefully sensibly related, exported things, then it's
 easy to indicate: mathFuncs, stringTools, bitOps... while still following D
 naming conventions.

D module names are better fully lowercase. This is their D standard...

Bye,
bearophile
Jan 19 2011
prev sibling next sibling parent Daniel Gibson <metalcaedes gmail.com> writes:
Am 18.01.2011 18:41, schrieb spir:
 On 01/18/2011 06:33 PM, Jim wrote:
 Adam Ruppe Wrote:
 Maybe. 9/10 times they match anyway, but I'd be annoyed if
 the package names had to match the containing folder.

This is enforced in some languages, and I like it. It'd be confusing if they didn't match when I would go to look for something. I think it would be a good idea for D to standardise this. Not only so that the compiler can traverse and compile but for all dev tools (static analysers, package managers, etc). Standardisation makes it easier to create toolchains, which I believe are essential for the growth of any language use.

 The D styleguide requires on one hand capitalised names for types, and
 lowercase for filenames on the other. How are we supposed to make them match?

 Denis
 _________________
 vita es estrany
 spir.wikidot.com

Filenames should match the module they contain, not the contained class(es).
Jan 18 2011
prev sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
Jim wrote:
 Adam Ruppe Wrote:
 Maybe. 9/10 times they match anyway, but I'd be annoyed if the package
 names had to match the containing folder.

This is enforced in some languages, and I like it. It'd be confusing if they didn't match when I would go to look for something. I think it would be a good idea for D to standardise this. Not only so that the compiler can traverse and compile but for all dev tools (static analysers, package managers, etc). Standardisation makes it easier to create toolchains, which I believe are essential for the growth of any language use.

Forcing the module name to match the file name sounds good, but in practice it makes it hard to debug modules. What I like to do is to copy a suspicious module to foo.d (or whatever.d) and link it in explicitly, which will override the breaking one. Then, I hack away at it until I discover the problem, then fix the original.
Jan 18 2011
next sibling parent reply Jim <bitcirkel yahoo.com> writes:
Walter Bright Wrote:
 Forcing the module name to match the file name sounds good, but in practice it
 makes it hard to debug modules. What I like to do is to copy a suspicious
 module to foo.d (or whatever.d) and link it in explicitly, which will override
 the breaking one. Then, I hack away at it until I discover the problem, then
 fix the original.

This would admittedly impose some constraints, but I think it would ultimately be worth it. It makes everything much clearer and creates a bunch of opportunities for further development.

I'd create a branch (in git or Mercurial) for that task; it's quick and dirt cheap, very easy to switch to and from, and you get the diff for free.
Jan 18 2011
parent Jesse Phillips <jessekphillips+D gmail.com> writes:
Jim Wrote:

 Walter Bright Wrote:
 Forcing the module name to match the file name sounds good, but in practice it
 makes it hard to debug modules. What I like to do is to copy a suspicious
 module to foo.d (or whatever.d) and link it in explicitly, which will override
 the breaking one. Then, I hack away at it until I discover the problem, then
 fix the original.

This would admittedly impose some constraints, but I think it would ultimately be worth it. It makes everything much clearer and creates a bunch of opportunities for further development.

I don't see such a benefit. First off, I don't see file/module names not matching very often. Tools can be developed that assume such structure exists, which means more incentive to keep such structure; I believe rdmd already makes this assumption. It also wouldn't be hard to make a program that takes a list of files and renames and places them into the proper structure.
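Such a re-structuring program really is only a few lines of shell; here is a minimal sketch (the sample module name is made up, and it assumes each file's module declaration sits on a single line):

```shell
# A sample file whose name doesn't match its module declaration.
printf 'module std.demo.util;\nvoid f() {}\n' > misnamed.d

# For each listed file, read its module declaration and move the file to
# the matching path: module a.b.c  ->  a/b/c.d
for f in misnamed.d; do
    mod=$(sed -n 's/^module[[:space:]]*\([A-Za-z0-9_.]*\);.*/\1/p' "$f")
    path="$(echo "$mod" | tr '.' '/').d"
    mkdir -p "$(dirname "$path")"
    mv "$f" "$path"
done

ls std/demo    # prints util.d
```

A real tool would also handle files with no module declaration (where the module name defaults to the file name) and name collisions, but the core transformation is just this.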
 I'd create a branch (in git or mercury) for that task, it's quick and dirt
cheap, very easy to switch to and from, and you get the diff for free.

Right, using such tools is great. But what if you are like me and don't have a dev environment set up for Phobos, but want to fix some module? Do I have to set up such an environment, or can I just throw the file in a folder std/ and do some work on it? I don't really know how annoying I would find such a change, but I don't think I would ever see it as a feature.
Jan 18 2011
prev sibling parent reply Thias <void invalid.com> writes:
On 18/01/11 20:26, Walter Bright wrote:
 Jim wrote:
 Adam Ruppe Wrote:
 Maybe. 9/10 times they match anyway, but I'd be annoyed if the package
 names had to match the containing folder.

This is enforced in some languages, and I like it. It'd be confusing if they didn't match when I would go to look for something. I think it would be a good idea for D to standardise this. Not only so that the compiler can traverse and compile but for all dev tools (static analysers, package managers, etc). Standardisation makes it easier to create toolchains, which I believe are essential for the growth of any language use.

Forcing the module name to match the file name sounds good, but in practice it makes it hard to debug modules. What I like to do is to copy a suspicious module to foo.d (or whatever.d) and link it in explicitly, which will override the breaking one. Then, I hack away at it until I discover the problem, then fix the original.

Couldn’t you do exactly the same thing by just copying the file?

   cp suspicious.d suspicious.orig
   edit suspicious.d
Jan 18 2011
parent "Nick Sabalausky" <a a.a> writes:
"Thias" <void invalid.com> wrote in message 
news:ih52a8$2bba$1 digitalmars.com...
 On 18/01/11 20:26, Walter Bright wrote:
 Jim wrote:
 Adam Ruppe Wrote:
 Maybe. 9/10 times they match anyway, but I'd be annoyed if the package
 names had to match the containing folder.

This is enforced in some languages, and I like it. It'd be confusing if they didn't match when I would go to look for something. I think it would be a good idea for D to standardise this. Not only so that the compiler can traverse and compile but for all dev tools (static analysers, package managers, etc). Standardisation makes it easier to create toolchains, which I believe are essential for the growth of any language use.

Forcing the module name to match the file name sounds good, but in practice it makes it hard to debug modules. What I like to do is to copy a suspicious module to foo.d (or whatever.d) and link it in explicitly, which will override the breaking one. Then, I hack away at it until I discover the problem, then fix the original.

 Couldn’t you do exactly the same thing by just copying the file?

    cp suspicious.d suspicious.orig
    edit suspicious.d

That's what I do. Works fine. (Although I keep the .d extension, and do like "suspicious orig.d")
Jan 18 2011
prev sibling next sibling parent Adam Ruppe <destructionator gmail.com> writes:
Vladimir Panteleev wrote:
 Then the question is: does the time you spent writing and maintaining
 makefiles and build scripts exceed the time it would take you to
 set up a build tool?

I never spent too much time on it anyway, but this thread prompted me to write my own build thing. It isn't 100% done yet, but it does basically work in just 100 lines of code:

http://arsdnet.net/dcode/build.d

Also depends on these:

http://arsdnet.net/dcode/exec.d
http://arsdnet.net/dcode/curl.d

exec.d is Linux only, so this program is Linux only too. When the new std.process gets into Phobos, exec.d will be obsolete and we'll be cross platform. I borrowed some code from rdmd, so thanks to Andrei for that. I didn't use rdmd directly, though, since it seems more script oriented than I wanted.

The way it works:

   build somefile.d

It uses dmd -v (same as rdmd) to get the list of files it tries to import. It watches dmd's error output for files it can't find. It then tries to fetch those files from my dpldocs.info http folder and tries again (http://dpldocs.info/repository/FILE). If dmd -v completes without errors, it moves on to run the actual compile. All of build's arguments are passed straight to dmd.

In my other post, I talked about a configuration file. That would be preferred over just using my own http server, so we can spread out our efforts. I just wanted something simple now to see if it actually works well.

It worked on my simple program, but on my more complex program the linker failed... complaining about the stupid associative array opApply. Usually my hack of adding object_.d from druntime fixes that, but not here. I don't know why.

undefined reference to `_D6object30__T16AssociativeArrayTAyaTyAaZ16AssociativeArray7opApplyMFMDFKAyaKyAaZiZi'

Meh, I should get to my real work anyway; maybe I'll come back to it. The stupid AAs give me more linker errors than anything else, and they are out of my control!
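The "watch dmd's error output" step in a tool like this boils down to one pattern match. A sketch against a captured error line (the exact message wording is an assumption here, since it varies across dmd versions; the repository URL is the dpldocs.info path mentioned above):

```shell
# A captured compiler error line (assumed sample, not live dmd output):
err="main.d(1): Error: module bar is in file 'foo/bar.d' which cannot be read"

# Pull the missing file's path out from between the quotes:
missing=$(echo "$err" | sed -n "s/.*in file '\([^']*\)'.*/\1/p")
echo "$missing"    # prints foo/bar.d

# A tool like the one described above would then fetch the file and retry:
# curl -o "$missing" "http://dpldocs.info/repository/$(basename "$missing")"
# dmd -v main.d
```

Looping "compile, fetch whatever was missing, recompile" until the error output is clean is the whole resolution strategy.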
Jan 18 2011
prev sibling parent reply Austin Hastings <ah08010-d yahoo.com> writes:
On 1/18/2011 10:31 AM, Vladimir Panteleev wrote:
 Then the question is: does the time you spent writing and maintaining
 makefiles and build scripts exceed the time it would take you to set up
 a build tool?

For D, no.

When I tried to get started with D2, there were a lot of pointers to kewl build utilities on d-source. None of them worked. None of them that needed to self-build were capable of it. (Some claimed to "just run," which was also false.) So I wound up pissing away about two days (spread out here and there, as one library or another would proudly report "this uses build tool Z - isn't it cool?!" and I'd chase down another failure).

On the other hand, Gnu Make works. And Perl works. And the dmd2 compiler spits out a dependency list that, with a little bit of perl foo, turns into a makefile fragment nicely. So now I have a standard makefile package that knows about parsing D source to figure out all the incestuous little details about what calls what. And I'm able, thanks to the miracle of "start here and recurse," to move this system from project to project with about 15 minutes of tweaking. Sometimes more, if there's a whole bunch of targets getting built.

What's more, of course, is that my little bit of Makefile foo is portable. I can use make with C, D, Java, C++, Perl, XML, or whatever language-of-the-week I'm playing with. Which is certainly not true of "L33+ build tool Z." And make is pretty much feature-complete at this point, again unlike any of the D build tools. Which means that investing in knowing how to tweak make pays off way better than time spent learning BTZ.

=Austin
Jan 18 2011
parent reply Austin Hastings <ah08010-d yahoo.com> writes:
On 1/19/2011 12:50 AM, Vladimir Panteleev wrote:
 On Wed, 19 Jan 2011 07:16:40 +0200, Austin Hastings
 <ah08010-d yahoo.com> wrote:

 None of them worked.

Most of those build utilities do exactly what make + your perl-foo do.

No, they don't.

That's the point: I was _getting started_ with D2. I had no strong desire to reinvent the wheel, build-tool-wise. But the tools I was pointed at just didn't work. I don't mean in a theoretical way - "this tool doesn't detect clock skew on a network that spans the international date line!" - I mean they wouldn't compile, or would compile but couldn't parse the D2 source files, or would compile but then crashed when I ran them.

=Austin
Jan 18 2011
next sibling parent reply Daniel Gibson <metalcaedes gmail.com> writes:
Am 19.01.2011 07:35, schrieb Vladimir Panteleev:
 On Wed, 19 Jan 2011 08:09:11 +0200, Austin Hastings <ah08010-d yahoo.com>
wrote:

 On 1/19/2011 12:50 AM, Vladimir Panteleev wrote:
 On Wed, 19 Jan 2011 07:16:40 +0200, Austin Hastings
 <ah08010-d yahoo.com> wrote:

 None of them worked.

Most of those build utilities do exactly what make + your perl-foo do.

No, they don't.

Actually, you're probably right here. To my knowledge, there are only two build tools that take advantage of the -deps compiler option - rdmd and xfbuild. Older ones were forced to parse the source files - rebuild even used DMD's frontend for that. There's also a relatively new tool (dbuild oslt?) which generates makefiles.
 That's the point: I was _getting started_ with D2. I had no strong desire to
 reinvent the wheel, build tool-wise. But the tools I was pointed at just
 didn't work.

When a tool works for the author and many other users but not for you, you have to wonder where the fault really is. Besides, aren't all these tools open-source? The one time I had a problem with DSSS, it was easy to fix, and I sent the author a patch and everyone was better off from it. Isn't that how open-source works? :)

When you're learning a language, you want to get familiar with it before starting to fix stuff.
Jan 19 2011
parent Walter Bright <newshound2 digitalmars.com> writes:
Daniel Gibson wrote:
 When you're learning a language, you want to get familiar with it before 
 starting to fix stuff.

I tend to learn things by fixing them :-)
Feb 07 2011
prev sibling parent reply "nedbrek" <nedbrek yahoo.com> writes:
"Vladimir Panteleev" <vladimir thecybershadow.net> wrote in message 
news:op.vpjlwrletuzx1w cybershadow.mshome.net...
 On Wed, 19 Jan 2011 08:09:11 +0200, Austin Hastings <ah08010-d yahoo.com> 
 wrote:

 On 1/19/2011 12:50 AM, Vladimir Panteleev wrote:

Actually, you're probably right here. To my knowledge, there are only two build tools that take advantage of the -deps compiler option - rdmd and xfbuild. Older ones were forced to parse the source files - rebuild even used DMD's frontend for that. There's also a relatively new tool (dbuild oslt?) which generates makefiles.

Can someone tell me the corner case that requires a build tool to parse the whole source file? My make helper is awk, it just looks for the "import" and strips out the needed info... Thanks, Ned
Jan 19 2011
parent reply "Nick Sabalausky" <a a.a> writes:
"nedbrek" <nedbrek yahoo.com> wrote in message 
news:ih6o0g$2geu$1 digitalmars.com...
 "Vladimir Panteleev" <vladimir thecybershadow.net> wrote in message 
 news:op.vpjlwrletuzx1w cybershadow.mshome.net...
 On Wed, 19 Jan 2011 08:09:11 +0200, Austin Hastings <ah08010-d yahoo.com> 
 wrote:

 On 1/19/2011 12:50 AM, Vladimir Panteleev wrote:

Actually, you're probably right here. To my knowledge, there are only two build tools that take advantage of the -deps compiler option - rdmd and xfbuild. Older ones were forced to parse the source files - rebuild even used DMD's frontend for that. There's also a relatively new tool (dbuild oslt?) which generates makefiles.

Can someone tell me the corner case that requires a build tool to parse the whole source file? My make helper is awk, it just looks for the "import" and strips out the needed info...

Just as a few examples:

mixin("import foo.bar;");

// or

enum a = "import ";
enum b = "foo.";
enum c = "bar;";
mixin(a~b~c);

// or

static if(/+some fancy condition here+/)
    import foo.bar;
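To make the failure mode concrete, here is a small sketch (Python for illustration; the scanner and the file contents are made up) of an awk-style scanner that only matches literal import statements, run against inputs like the mixin examples above:

```python
import re

# A naive awk-style dependency scanner: it only sees literal
# "import foo.bar;" statements, one per line.
def naive_imports(source):
    deps = []
    for line in source.splitlines():
        m = re.match(r'\s*(?:static\s+)?import\s+([\w.]+)\s*;', line)
        if m:
            deps.append(m.group(1))
    return deps

plain = "import foo.bar;\n"
mixed = 'mixin("import foo.bar;");\n'  # string-mixin import
built = 'enum a = "import "; enum b = "foo."; enum c = "bar;"; mixin(a~b~c);\n'

print(naive_imports(plain))  # -> ['foo.bar']  (found)
print(naive_imports(mixed))  # -> []  missed: the import is inside a string
print(naive_imports(built))  # -> []  missed: the import only exists after CTFE
```

The dependency in the mixin cases simply does not exist as text until the compiler evaluates the strings, which is why only the compiler (via -deps) can report it reliably.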
Jan 19 2011
next sibling parent "nedbrek" <nedbrek yahoo.com> writes:
"Nick Sabalausky" <a a.a> wrote in message 
news:ih7dj0$s4j$1 digitalmars.com...
 "nedbrek" <nedbrek yahoo.com> wrote in message 
 news:ih6o0g$2geu$1 digitalmars.com...
 "Vladimir Panteleev" <vladimir thecybershadow.net> wrote in message 
 news:op.vpjlwrletuzx1w cybershadow.mshome.net...
 On Wed, 19 Jan 2011 08:09:11 +0200, Austin Hastings 
 <ah08010-d yahoo.com> wrote:

 On 1/19/2011 12:50 AM, Vladimir Panteleev wrote:

Actually, you're probably right here. To my knowledge, there are only two build tools that take advantage of the -deps compiler option - rdmd and xfbuild. Older ones were forced to parse the source files - rebuild even used DMD's frontend for that. There's also a relatively new tool (dbuild oslt?) which generates makefiles.

Can someone tell me the corner case that requires a build tool to parse the whole source file? My make helper is awk, it just looks for the "import" and strips out the needed info...

Just as a few examples: mixin("import foo.bar;"); // or enum a = "import "; enum b = "foo."; enum c = "bar;"; mixin(a~b~c); // or static if(/+some fancy condition here+/) import foo.bar;

Thanks! Fortunately, I am the only one on this project, so I will be careful to avoid such things! :) Ned
Jan 19 2011
prev sibling parent el muchacho <nicolas.janin gmail.com> writes:
Le 19/01/2011 20:20, Nick Sabalausky a écrit :
 "nedbrek" <nedbrek yahoo.com> wrote in message 
 news:ih6o0g$2geu$1 digitalmars.com...
 "Vladimir Panteleev" <vladimir thecybershadow.net> wrote in message 
 news:op.vpjlwrletuzx1w cybershadow.mshome.net...
 On Wed, 19 Jan 2011 08:09:11 +0200, Austin Hastings <ah08010-d yahoo.com> 
 wrote:

 On 1/19/2011 12:50 AM, Vladimir Panteleev wrote:

Actually, you're probably right here. To my knowledge, there are only two build tools that take advantage of the -deps compiler option - rdmd and xfbuild. Older ones were forced to parse the source files - rebuild even used DMD's frontend for that. There's also a relatively new tool (dbuild oslt?) which generates makefiles.

Can someone tell me the corner case that requires a build tool to parse the whole source file? My make helper is awk, it just looks for the "import" and strips out the needed info...

Just as a few examples: mixin("import foo.bar;"); // or enum a = "import "; enum b = "foo."; enum c = "bar;"; mixin(a~b~c); // or static if(/+some fancy condition here+/) import foo.bar;

This is exactly the reason why the build system must be included in the compiler and not in external tools.
Jan 29 2011
prev sibling next sibling parent spir <denis.spir gmail.com> writes:
On 01/18/2011 06:33 PM, Jim wrote:
 Adam Ruppe Wrote:
 Maybe. 9/10 times they match anyway, but I'd be annoyed if
 the package names had to match the containing folder.

This is enforced in some languages, and I like it. It'd be confusing if they didn't match when I go to look for something. I think it would be a good idea for D to standardise this. Not only so that the compiler can traverse and compile, but for all dev tools (static analysers, package managers, etc). Standardisation makes it easier to create toolchains, which I believe are essential for the growth of any language.

The D style guide requires capitalised names for types on the one hand, and lowercase filenames on the other. How are we supposed to make them match? Denis _________________ vita es estrany spir.wikidot.com
Jan 18 2011
prev sibling next sibling parent reply spir <denis.spir gmail.com> writes:
On 01/18/2011 07:10 PM, bearophile wrote:
 spir:

 The D styleguide requires on one hand capitalised names for types, and
 lowercase for filenames on the other. How are we supposed to make them
 match?

Why do you want them to match?

Because when a module defines a type Foo (or rather, it's what is exported), I like it to be called Foo.d. A module called doFoo.d would certainly mainly define a func doFoo. So, people directly know what's in there (and this, from D's own [supposed] naming rules :-). Simple, no? Denis _________________ vita es estrany spir.wikidot.com
Jan 19 2011
parent "Nick Sabalausky" <a a.a> writes:
"spir" <denis.spir gmail.com> wrote in message 
news:mailman.710.1295434677.4748.digitalmars-d puremagic.com...
 On 01/18/2011 07:10 PM, bearophile wrote:
 spir:

 The D styleguide requires on one hand capitalised names for types, and
 lowercase for filenames on the other. How are we supposed to make them
 match?

Why do you want them to match?

Because when a module defines a type Foo (or rather, it's what is exported), I like it to be called Foo.d. A module called doFoo.d would certainly mainly define a func doFoo. So, people directly know what's in there (and this, from D's own [supposed] naming rules :-). Simple, no?

If I have a class Foo in its own module, I call the module (and file) "foo". I find this to be simple too, because this way types are always capitalized and modules are always non-capitalized. Plus, like Vladimir indicated, this makes it a lot easier to distinguish between the type ("Foo") and the module ("foo").
Jan 19 2011
prev sibling next sibling parent spir <denis.spir gmail.com> writes:
On 01/19/2011 12:56 PM, bearophile wrote:
 spir:

 Because when a module defines a type Foo (or rather, it's what is
 exported), I like it to be called Foo.d.

Generally D modules contain many types.

Yep, but often one is the main exported element. When there are several, hopefully sensibly related, exported things, then it's easy to indicate: mathFuncs, stringTools, bitOps... while still following D naming conventions. Was it me or you who heavily & repeatedly insisted on the importance of consistent style, in particular naming, in a programming community (I strongly support you on this point)? Why should modules not benefit from this? For sure, there are case-insensitive filesystems, but that only prevents using _in the same dir_ (or package) module names that differ only in case, I guess. Denis _________________ vita es estrany spir.wikidot.com
Jan 19 2011
prev sibling parent spir <denis.spir gmail.com> writes:
On 02/07/2011 10:06 AM, Walter Bright wrote:
 Daniel Gibson wrote:
 When you're learning a language, you want to get familiar with it before
 starting to fix stuff.

I tend to learn things by fixing them :-)

¡ great ! Though original authors often do not appreciate this attitude very much, early fans even less ;-) [whatever your true interest, humility, and, hum, say, "good will"] Denis -- _________________ vita es estrany spir.wikidot.com
Feb 07 2011
prev sibling parent Walter Bright <newshound2 digitalmars.com> writes:
Andrej Mitrovic wrote:
 On 1/18/11, Walter Bright <newshound2 digitalmars.com> wrote:
 You can put
 hundreds if you like.

DMD can, but Optlink can't handle long arguments.

Example?
Jan 18 2011
prev sibling next sibling parent reply bearophile <bearophileHUGS lycos.com> writes:
Vladimir Panteleev:

 IMO, sticking to the C-ism of "one object file at a time" and dependency  
 on external build tools / makefiles is the biggest mistake DMD did in this  
 regard.

A Unix philosophy is to create tools that do only one thing well, and rdmd uses DMD to do its job of helping compile small projects automatically. Yet the D compiler is not following that philosophy in many situations, because it does a lot of stuff besides compiling D code: profiler, code coverage analyser, unittester, docs generator, JSON summary generator, and more. The D1 compiler used to have a cute literate programming feature too, of the kind often used by Haskell blogs. Here Walter is pragmatic: the docs generator happens to be quicker to create and maintain if it's built into the compiler. So is it right to fold this rdmd functionality into the compiler? Is this practically useful, e.g. is it going to increase rdmd speed? Folding rdmd functionality into the compiler may risk freezing the future evolution of D build tools, so it has risks/costs too. Bye, bearophile
Jan 18 2011
parent Jacob Carlborg <doob me.com> writes:
On 2011-01-18 10:22, bearophile wrote:
 Vladimir Panteleev:

 IMO, sticking to the C-ism of "one object file at a time" and dependency
 on external build tools / makefiles is the biggest mistake DMD did in this
 regard.

A Unix philosophy is to create tools that are able to do only one thing well, and rdmd uses DMD to do its job of helping compile small projects automatically. Yet the D compiler is not following that philosophy in many situations because it is doing lot of stuff beside compiling D code, like profiler, code coverage analyser, unittester, docs generator, JSON summary generator, and more. D1 compiler used to have a cute literary programming feature too, that's often used by Haskell blogs. Here Walter is pragmatic: docs generator happens to be quicker to create and maintain if it's built inside the compiler. So it's right to fold this rdmd functionality inside the compiler? Is this practically useful, like is this going to increase rdmd speed? Folding rdmd functionality inside the compiler may risk freezing the future evolution of future D build tools, so it has a risks/costs too. Bye, bearophile

I would say that in this case the LLVM/Clang approach would be the best. Build a solid compiler library that other tools can be built upon, including the compiler. -- /Jacob Carlborg
Jan 18 2011
prev sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
Vladimir Panteleev wrote:
 IMO, sticking to the C-ism of "one object file at a time" and dependency 
 on external build tools / makefiles is the biggest mistake DMD did in 
 this regard.

You don't need such a tool with dmd until your project exceeds a certain size. Most of my little D projects' "build tool" is a one line script that looks like: dmd foo.d bar.d There's just no need to go farther than that.
Jan 18 2011
parent reply Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
On 1/18/11 11:37 PM, Vladimir Panteleev wrote:
 On Tue, 18 Jan 2011 22:17:08 +0200, Walter Bright
 <newshound2 digitalmars.com> wrote:

 Vladimir Panteleev wrote:
 IMO, sticking to the C-ism of "one object file at a time" and
 dependency on external build tools / makefiles is the biggest mistake
 DMD did in this regard.

You don't need such a tool with dmd until your project exceeds a certain size. Most of my little D projects' "build tool" is a one line script that looks like: dmd foo.d bar.d There's just no need to go farther than that.

Let's review the two problems discussed in this thread:

1) Not passing all modules to the compiler results in a nearly-incomprehensible (for some) linker error.
2) DMD's inability (or rather, unwillingness) to build the whole program when it's in the position to, which creates the dependency on external build tools (or solutions that require unnecessary human effort).

Are you saying that there's no need to fix either of these because they don't bother you personally?

I think the larger picture is even more important. We need a package system that takes Internet distribution into account. I got word on the IRC that dsss was that. It would be great to resurrect that, or start a new project with such goals. Andrei
Jan 18 2011
next sibling parent Jacob Carlborg <doob me.com> writes:
On 2011-01-19 06:55, Andrei Alexandrescu wrote:
 On 1/18/11 11:37 PM, Vladimir Panteleev wrote:
 On Tue, 18 Jan 2011 22:17:08 +0200, Walter Bright
 <newshound2 digitalmars.com> wrote:

 Vladimir Panteleev wrote:
 IMO, sticking to the C-ism of "one object file at a time" and
 dependency on external build tools / makefiles is the biggest mistake
 DMD did in this regard.

You don't need such a tool with dmd until your project exceeds a certain size. Most of my little D projects' "build tool" is a one line script that looks like: dmd foo.d bar.d There's just no need to go farther than that.

Let's review the two problems discussed in this thread: 1) Not passing all modules to the compiler results in a nearly-incomprehensible (for some) linker error. 2) DMD's inability (or rather, unwillingness) to build the whole program when it's in the position to, which creates the dependency on external build tools (or solutions that require unnecessary human effort). Are you saying that there's no need to fix neither of these because they don't bother you personally?

I think the larger picture is even more important. We need a package system that takes Internet distribution into account. I got word on the IRC that dsss was that. It would be great to resurrect that, or start a new project with such goals. Andrei

I've been thinking for a while about doing a package system for D, basically gems but for D. But I first want to finish (finish as in somewhat usable and release it) another project I'm working on. -- /Jacob Carlborg
Jan 19 2011
prev sibling parent reply Adam Ruppe <destructionator gmail.com> writes:
Andrei wrote:
  We need a package system that takes Internet distribution
 into account.

Do you think something like my simple http based system would work? Fetch dependencies. Try to compile. If the linker complains about missing files, download them from http://somewebsite/somepath/filename, try again from the beginning. There's no metadata, no version tracking, nothing like that, but I don't think such things are necessary. Worst case, just download the specific version you need for your project manually.
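The loop being described could be sketched roughly like this (Python for illustration; try_compile and fetch are hypothetical stand-ins for running dmd and doing an HTTP download, and the URL scheme is the one from the post):

```python
# Sketch of the "compile, read linker complaints, fetch, retry" loop.
# try_compile() returns a list of module names the linker could not
# resolve (empty list == success); fetch() downloads one module, e.g.
# via GET http://somewebsite/somepath/<module>.d. Both are stand-ins.

def resolve_and_build(try_compile, fetch, max_rounds=10):
    for _ in range(max_rounds):
        missing = try_compile()
        if not missing:
            return True        # linked successfully
        for module in missing:
            fetch(module)      # download the missing module, then retry
    return False               # gave up: unresolvable or endless fetching

# Tiny simulation: the "linker" stops complaining once files are present.
downloaded = set()
def fake_compile():
    return [m for m in ("foo.bar", "baz.util") if m not in downloaded]
def fake_fetch(module):
    downloaded.add(module)

print(resolve_and_build(fake_compile, fake_fetch))  # -> True
```

The max_rounds cap matters: if a download never satisfies the linker (wrong version, missing transitive dependency), the loop must stop rather than spin forever.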
Jan 19 2011
next sibling parent reply Daniel Gibson <metalcaedes gmail.com> writes:
Am 19.01.2011 14:56, schrieb Adam Ruppe:
 Andrei wrote:
   We need a package system that takes Internet distribution
 into account.

Do you think something like my simple http based system would work? Fetch dependencies. Try to compile. If the linker complains about missing files, download them from http://somewebsite/somepath/filename, try again from the beginning.

That'd suck horribly for bigger projects, and also when you've got a lot of dependencies, I guess.
 There's no metadata, no version tracking, nothing like that, but
 I don't think such things are necessary. Worst case, just download
 the specific version you need for your project manually.

I don't think it's such a big burden to list the dependencies for your project. Or, even better: combine both ideas: Automatically create and save a list of dependencies by trying (like you described). Then when you release your project, the dependency list is there and all dependencies can be fetched before building. Cheers, - Daniel
Jan 19 2011
parent Adam Ruppe <destructionator gmail.com> writes:
Daniel Gibson wrote:
 That'd suck horribly for bigger projects, and also when
 you've got a lot of dependencies, I guess

Maybe, especially if the dependencies have dependencies (it'd have to download one set before knowing what to look for for the next set), but that is a one-time cost - after downloading the files the first time, there's no need to download them again. It could probably cache the dependency list too, though I'm not sure the lag of checking it again is that bad anyway. Though, IMO, the biggest advantage of a system like this is for small programs rather than big ones. If you're doing a big program, it isn't much of an added effort to include the deps or manually script the build/makefile. You probably have some compile switches anyway. For a little program, though, it is somewhat annoying to have to list all that stuff manually. Writing out the command line might take longer than writing the program!
 Or, even better: combine both ideas: Automatically create and
 save a list of dependencies by trying (like you described).

Yea, that'd work too. It could possibly figure out whole packages to grab that way too, instead of doing individual files. It'd be a little extra effort, though.
Jan 19 2011
prev sibling next sibling parent Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
On 1/19/11 7:56 AM, Adam Ruppe wrote:
 Andrei wrote:
   We need a package system that takes Internet distribution
 into account.

Do you think something like my simple http based system would work? Fetch dependencies. Try to compile. If the linker complains about missing files, download them from http://somewebsite/somepath/filename, try again from the beginning. There's no metadata, no version tracking, nothing like that, but I don't think such things are necessary. Worst case, just download the specific version you need for your project manually.

I'm not sure. A friend of mine who is well versed in such issues suggested two sources of inspiration: apt-get and cpan. As a casual user of both, I can indeed say that they are doing a very good job. Andrei
Jan 19 2011
prev sibling next sibling parent Jacob Carlborg <doob me.com> writes:
On 2011-01-19 14:56, Adam Ruppe wrote:
 Andrei wrote:
   We need a package system that takes Internet distribution
 into account.

Do you think something like my simple http based system would work? Fetch dependencies. Try to compile. If the linker complains about missing files, download them from http://somewebsite/somepath/filename, try again from the beginning. There's no metadata, no version tracking, nothing like that, but I don't think such things are necessary. Worst case, just download the specific version you need for your project manually.

That doesn't sound like a good solution. I think you would have to manually specify the dependencies. -- /Jacob Carlborg
Jan 19 2011
prev sibling parent reply Adam Ruppe <destructionator gmail.com> writes:
retard wrote:
 A build tool without any kind of dependency versioning support is a
 complete failure.

You just delete the old files and let it re-download them to update. If the old one is working for you, simply keep it.
Jan 19 2011
next sibling parent reply Adam Ruppe <destructionator gmail.com> writes:
 I meant that if the latest version 0.321 of the project 'foobar'
 depends on 'bazbaz 0.5.8.2'

Personally, I'd just prefer people to package their damned dependencies with their app.... But, a configuration file could fix that easily enough. Set one up like this:

bazbaz = http://bazco.com/0.5.8.2/

Then it'd try to download http://bazco.com/0.5.8.2/bazbaz.module.d instead of the default site (which is presumably the latest version). This approach also makes it easy to add third-party servers and libraries, so you wouldn't be dependent on a central source for your code. Here's a potential problem: what if bazbaz needs some specific version of something too? Maybe it could check for a config file on its server too, and use those directives when getting the library.
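A sketch of how such a config could be resolved (Python for illustration; parse_config, module_url and the default-site URL are made-up names, and the bazbaz/bazco values are the hypothetical ones from the post):

```python
# Sketch of the proposed one-line-per-library config: each line maps a
# library name to the base URL its missing modules are fetched from.
# Names not in the config fall back to a default site (latest version).

DEFAULT_SITE = "http://somewebsite/somepath/"   # stand-in default

def parse_config(text):
    mapping = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        name, _, url = line.partition("=")
        mapping[name.strip()] = url.strip().rstrip("/") + "/"
    return mapping

def module_url(config, filename):
    # "bazbaz.module.d" belongs to library "bazbaz" (first dotted part)
    library = filename.split(".")[0]
    return config.get(library, DEFAULT_SITE) + filename

cfg = parse_config("bazbaz = http://bazco.com/0.5.8.2/")
print(module_url(cfg, "bazbaz.module.d"))
# -> http://bazco.com/0.5.8.2/bazbaz.module.d
print(module_url(cfg, "other.module.d"))
# -> http://somewebsite/somepath/other.module.d  (fallback)
```

Pinning a version is then just editing one URL in the config, which also answers the "third-party servers" point: any host serving the files works.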
Jan 19 2011
parent Adam Ruppe <destructionator gmail.com> writes:
retard wrote:
 How it goes is you come up with more and more features if you spend
 sometime THINKING about the possible functionality for such a tool.

It, as written now, does everything I've ever wanted. If I try to do every possible function, it'll never be done. The question is what's trivially easy to automate, somewhat difficult to do by other means, and fairly useful. I'm not convinced building falls under that *at all*, much less every random edge case under the sun.
Jan 19 2011
prev sibling parent reply Adam Ruppe <destructionator gmail.com> writes:
Vladimir Panteleev wrote:
 Your tool will just download the latest version of Y and the
 whole thing crashes and burns.

My problem is I don't see how that'd happen in the first place. Who would distribute something they've never compiled? If they compiled it, it would have downloaded the other libs already, so any sane distribution *already* has the dependent libraries, making this whole thing moot. The build tool is meant to help the developer, not the user. If the user needs help, it means the developer didn't do his job properly. That said, the configuration file, as described in my last post, seems like it can solve this easily enough.
Jan 19 2011
next sibling parent reply Mafi <mafi example.org> writes:
Am 19.01.2011 21:22, schrieb Andrej Mitrovic:
 Meh.

 Just give us File access in CTFE and we'll be done talking about build
 tools. Just run DMD on the thing and the app automagically tracks and
 downloads all of its dependencies.

 Im kidding. But file access in CTFE would be so damn cool. :)

import("file.ext") // compile-time string of the contents of file.ext

You can do, for example:

mixin(import("special.d")); // C-style import/include
Jan 19 2011
parent Jesse Phillips <jessekphillips+D gmail.com> writes:
Mafi Wrote:

 Am 19.01.2011 21:22, schrieb Andrej Mitrovic:
 Meh.

 Just give us File access in CTFE and we'll be done talking about build
 tools. Just run DMD on the thing and the app automagically tracks and
 downloads all of its dependencies.

 Im kidding. But file access in CTFE would be so damn cool. :)

import("file.ext") //compile time string of the contents of file.ext You can do for example: mixin(import("special.d")); //c-style import/include

Then you have to add -J command line switches for the location of the importable files.
Jan 19 2011
prev sibling parent Jacob Carlborg <doob me.com> writes:
On 2011-01-19 21:13, Adam Ruppe wrote:
 Vladimir Panteleev wrote:
 Your tool will just download the latest version of Y and the
 whole thing crashes and burns.

My problem is I don't see how that'd happen in the first place. Who would distribute something they've never compiled? If they compiled it, it would have downloaded the other libs already, so any sane distribution *already* has the dependent libraries, making this whole thing moot. The build tool is meant to help the developer, not the user. If the user needs help, it means the developer didn't do his job properly.

I would say it's for the user of the library. He only cares about the library he wants to use and not its dependencies.
 That said, the configuration file, as described in my last post,
 seems like it can solve this easily enough.

-- /Jacob Carlborg
Jan 20 2011
prev sibling next sibling parent "Vladimir Panteleev" <vladimir thecybershadow.net> writes:
On Tue, 18 Jan 2011 11:05:34 +0200, Walter Bright  
<newshound2 digitalmars.com> wrote:

 Vladimir Panteleev wrote:
 On Tue, 18 Jan 2011 07:20:56 +0200, Walter Bright  
 <newshound2 digitalmars.com> wrote:

 http://urbanhonking.com/ideasfordozens/2011/01/18/what-makes-a-programming-language-good/

compile a program with more than 1 module?

What is that message?

C:\Temp\D\Build> dmd test1.d
OPTLINK (R) for Win32  Release 8.00.8
Copyright (C) Digital Mars 1989-2010  All rights reserved.
http://www.digitalmars.com/ctg/optlink.html
test1.obj(test1)  Error 42: Symbol Undefined _D5test21fFZv
--- errorlevel 1

1) The error message is very technical:
   a) it does not indicate what exactly is wrong (module not passed to the linker; not that the linker knows that)
   b) it does not give any indication of what the user has to do to fix it
2) OPTLINK doesn't demangle D mangled names, when it could, and doing so would improve the readability of its error messages considerably. (I know not all mangled names are demangleable, but it'd be a great improvement regardless.)
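For simple symbols like the one above, demangling is mechanical. A toy sketch (Python for illustration; it handles only parameterless functions with a basic return type, nothing like the full D mangling grammar):

```python
# Toy demangler for the simplest D symbols, e.g. _D5test21fFZv:
#   _D          mangled-name prefix
#   5test2 1f   length-prefixed qualified name: test2.f
#   F ... Z     function type with an (empty here) parameter list
#   v           return type: void
BASIC_TYPES = {"v": "void", "i": "int", "b": "bool", "a": "char"}

def demangle_simple(sym):
    if not sym.startswith("_D"):
        return sym                       # not a D mangled name
    pos, parts = 2, []
    while pos < len(sym) and sym[pos].isdigit():
        digits = ""
        while pos < len(sym) and sym[pos].isdigit():
            digits += sym[pos]
            pos += 1
        length = int(digits)
        parts.append(sym[pos:pos + length])
        pos += length
    if sym[pos:pos + 2] != "FZ":
        return sym                       # parameters present: beyond this toy
    ret = BASIC_TYPES.get(sym[pos + 2], "?")
    return "%s %s()" % (ret, ".".join(parts))

print(demangle_simple("_D5test21fFZv"))  # -> void test2.f()
```

Even this subset would turn "Symbol Undefined _D5test21fFZv" into "Symbol Undefined void test2.f()", which points straight at the un-passed module.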
 dmd can build entire programs with one command:

     dmd file1.d file2.d file3.d ...etc...

That doesn't scale at all. What if you want to use a 3rd-party library with a few dozen modules? -- Best regards, Vladimir mailto:vladimir thecybershadow.net
Jan 18 2011
prev sibling next sibling parent "Vladimir Panteleev" <vladimir thecybershadow.net> writes:
On Tue, 18 Jan 2011 11:11:01 +0200, Vladimir Panteleev  
<vladimir thecybershadow.net> wrote:

   a) does not indicate what exactly is wrong (module not passed to  
 linker, not that the linker knows that)

By the way, disregarding extern(C) declarations et cetera, the compiler has the ability to detect when such linker errors will appear and take appropriate measures (e.g. suggest using the -c flag, passing the appropriate .d or .obj file on its command line, or using a build tool). -- Best regards, Vladimir mailto:vladimir thecybershadow.net
Jan 18 2011
prev sibling next sibling parent "Vladimir Panteleev" <vladimir thecybershadow.net> writes:
On Tue, 18 Jan 2011 11:22:36 +0200, bearophile <bearophileHUGS lycos.com>  
wrote:

 Folding rdmd functionality inside the compiler may risk freezing the  
 future evolution of future D build tools, so it has a risks/costs too.

Nobody needs more than one (good) D build tool. How many build tools does Go/Scala/Haskell/etc. have? Regardless if it's in the compiler or not, the only real requirement is that the source is maintainable, and the barrier to contribute to it is low enough (hurray for GitHub). -- Best regards, Vladimir mailto:vladimir thecybershadow.net
Jan 18 2011
prev sibling next sibling parent reply bearophile <bearophileHUGS lycos.com> writes:
Walter:

 http://urbanhonking.com/ideasfordozens/2011/01/18/what-makes-a-programming-language-good/

It's a cute blog post. It suggests that it would be good to:

Getting Code:
1) Have a central repository for D modules that is easy to use for both submitters and users.
- D code in such a repository must Just Work.
- I must stress that having a shared community-wide style of writing D code helps a lot when you want to use modules written by other people in your program. Otherwise your program looks like a patchwork of wildly different styles.
- The D language must be designed to help write code that works well on both 32- and 64-bit systems, helping to avoid the traps listed here: http://www.viva64.com/en/a/0065/
- Path-related problems must be minimized.
- Probably the D package system needs to be improved. Some Java people are even talking about introducing means to create superpackages. Some module-system theory from ML-like languages may help here.

Figuring Out Code:
- The D compiler has to improve its error messages a lot. Error messages need to become polished and sharp. Linker errors are bad; they need to be avoided where possible (this means the compiler catches some errors before they become linker errors). The C# compiler also gives a standard error number for each error, which allows you to read about the error and its causes and solutions.

Writing Code:
- Interactive Console: it would be good to have something like this built into the D distribution. From the article:
Further, while I know there are people who like IDEs, for me they serve to
cripple the exploratory play of coding so that it’s about as fun as filling out
tax forms.

In this regard I think D has to take two things into account:
- People today like to use modern IDEs, so the core of the language too needs to be designed to work well with IDEs. Currently D doesn't look much designed to be IDE-friendly.
- On the other hand, I see it as a failure if the language requires a complex IDE to write code. I am able to write complex Python programs with no IDE, which means that Python is well designed.

One more thing: from a recent discussion with Walter about software engineering, it seems that computer languages are both a design tool and an engineering building material. Python is better than D for exploratory programming (http://en.wikipedia.org/wiki/Exploratory_programming ), or even to invent new algorithms and to explore new software ideas. D is probably better than Python for building larger software engineering systems. Lately Python has added several features to improve its "programming in the large" skills (decorators, Abstract Base Classes, optional annotations, etc.); likewise I think D will enjoy some little handy features that help both exploratory programming and "programming in the small", like tuple unpacking syntax (http://d.puremagic.com/issues/show_bug.cgi?id=4579 ). There are features like named arguments (http://en.wikipedia.org/wiki/Parameter_%28computer_programming%29#Named_parameters ) that are useful for both little and large programs; named arguments are one of the things I'd still like in D. Bye, bearophile
Jan 18 2011
next sibling parent reply "Vladimir Panteleev" <vladimir thecybershadow.net> writes:
On Tue, 18 Jan 2011 12:10:25 +0200, bearophile <bearophileHUGS lycos.com>  
wrote:

 Walter:

 http://urbanhonking.com/ideasfordozens/2011/01/18/what-makes-a-programming-language-good/

It's a cute blog post. It suggests that it will be good to: Getting Code: 1) Have a central repository for D modules that is easy to use for both submitters and users.

Forcing a code repository is bad. Let authors use anything that they're comfortable with. The "repository" must be nothing more than a database of metadata (general information about a package, and how to download it).
 - D code in such repository must Just Work.

This is not practical. The only practical way is to put that responsibility on the authors, and to encourage forking and competition.
 - I must stress that having a shared community-wide style to write D  
 code helps a lot when you want to use in your program modules written by  
 other people. Otherwise your program looks like a patchwork of wildly  
 different styles.

I assume you mean naming conventions and not actual code style (indentation etc.)
 - Probably D the package system needs to be improved. Some Java people  
 are even talking about introducing means to create superpackages. Some  
 module system theory from ML-like languages may help here.

Why?
 Writing Code:
 - Interactive Console: it will be good to have sometime like this built  
 in the D distribution.

I don't think this is practical until someone writes a D interpreter. Have you ever seen an interactive console for a purely-compiled language?
 - People today like to use modern IDEs. So the core of the language too  
 needs be designed to work well with IDEs. Currently D doesn't look much  
 designed to be IDE-friendly.

How would DMD become even more IDE-friendly that it already is? What about -X?
 One more thing: from a recent discussion with Walter about software  
 engineering, it seems that computer languages are both a design tool and  
 engineering building material. Python is better than D for exploratory  
 programming (http://en.wikipedia.org/wiki/Exploratory_programming ), or  
 even to invent new algorithms and to explore new software ideas. D is  
 probably better than Python to build larger software engineering  
 systems. Lately Python has added several features to improve its  
 "programming in the large" skills (decorators, Abstract Base Classes,  
 optional annotations, etc), likewise I think D will enjoy some little  
 handy features that help both exploratory programming and "programming  
 in the small", like tuple unpacking syntax  
 (http://d.puremagic.com/issues/show_bug.cgi?id=4579 ). There are  
 features like named arguments  
 (http://en.wikipedia.org/wiki/Parameter_%28computer_programming%29#Named_parameters  
 ) that are useful for both little and large programs, named arguments  
 are one of the things I'd like in D still.

I have to agree that named arguments are awesome; they make the code much more readable and maintainable in many instances. -- Best regards, Vladimir mailto:vladimir thecybershadow.net
Jan 18 2011
next sibling parent reply bearophile <bearophileHUGS lycos.com> writes:
Vladimir Panteleev:

 Forcing a code repository is bad.

In this case I was not suggesting to force things :-) But having a place to find reliable modules is very good.
 This is not practical.

It works in Python, Ruby and often in Perl too, so I don't agree.
 I assume you mean naming conventions and not actual code style (indentation
etc.)

I meant that it's better if D code written by different people looks similar, where possible. C/C++ programmers have too much freedom where freedom is not necessary. Reducing some of that useless freedom helps improve the code ecosystem.
 - Probably D the package system needs to be improved. Some Java people  
 are even talking about introducing means to create superpackages. Some  
 module system theory from ML-like languages may help here.

 Why?

- Currently D packages are not working well yet; there are bug reports on this.
- Something higher level than packages is useful when you build very large systems.
- Module system theory from ML-like languages offers many-years-old ideas that otherwise will need to be painfully re-invented, half-broken, by D language developers. Sometimes wasting three days reading saves you some years of pain.
 I don't think this is practical until someone writes a D interpreter.

The CTFE interpreter is already there :-)
 How would DMD become even more IDE-friendly that it already is?

- error messages that give the column number
- folding annotations?
- less usage of string mixins, more of delegates and normal D code
- more introspection
- etc.
 I have to agree that named arguments are awesome, they make the code much more
readable and maintainable in many instances.<

I have not yet written an enhancement request on this because until a few weeks ago I thought that named arguments only improve the usage of functions with many arguments, so they may encourage D programmers to create more functions like this one from the Windows API: HWND CreateWindow(LPCTSTR lpClassName, LPCTSTR lpWindowName, DWORD style, int x, int y, int width, int height, HWND hWndParent, HMENU hMenu, HANDLE hInstance, LPVOID lpParam); but lately I have understood that this is not the whole truth: named arguments are useful even when your functions have just 3 arguments. They make code more readable in little script-like programs, and help avoid some mistakes in larger programs too. Bye, bearophile
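D had no named arguments when this was written, so the call below uses a purely hypothetical syntax (and an invented createWindow function), just to illustrate the readability argument:

```d
// Hypothetical named-argument syntax -- NOT valid D as of this thread.
auto w = createWindow(className: "EDIT",
                      x: 10, y: 20, width: 200, height: 100);

// The positional equivalent hides what each number means:
auto w2 = createWindow("EDIT", 10, 20, 200, 100);
```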
Jan 18 2011
next sibling parent bearophile <bearophileHUGS lycos.com> writes:
Vladimir Panteleev:

 I think we have a misunderstanding, then? Who ensures that the modules
 "just work"? If someone breaks something, are they thrown out of The Holy  
 Repository?

There is no single solution to such problems. It's a matter of creating rules and lots of work to enforce them as years pass. If you talk about Holy things you are pushing this discussion in a stupid direction.
 It also demotivates and alienates programmers.

Many programmers are able to understand the advantages of removing some unnecessary freedoms. Python has shown me that brace wars are not productive :-)
 I'm curious (not arguing), can you provide examples? I can't think of any
drastic
 improvements to the package system.

I was talking about fixing bugs, improving strength, maybe later adding super-packages, and generally taking a good look at the literature about the damn ML-style module systems and their theory.
 So you think the subset of D that's CTFE-able is good enough to make an
 interactive console that's actually useful?

The built-in interpreter needs some improvements in its memory management, and eventually it may support exceptions and some other missing things. Currently functions can't access global mutable state in the compile-time execution path, even though they don't need to be wholly pure. But in a REPL you may want to do almost everything, like mutating global variables, importing modules, opening a GUI window on the fly, etc. So currently the D CTFE interpreter is not good enough for a console, but I think it's already better than nothing (I'd like right now a console able to run D code with the current limitations of the CTFE interpreter). It will be improved, and it may even be made flexible enough to be usable both for CTFE with pure-ish functions and, in a different modality, for the console. That would give a single interpreter for two purposes. Most modern video games are partially written in a scripting language, like Lua. So a third possible purpose is to allow run-time execution of code (the program then needs to compile at run time too), avoiding the need for a Lua/Python/MiniD interpreter. Bye, bearophile
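As a small sketch of what the CTFE interpreter already covers today: the same function can run both at compile time and at run time.

```d
// A pure-ish function: CTFE-able because it touches no global state.
int fib(int n) { return n < 2 ? n : fib(n - 1) + fib(n - 2); }

enum f10 = fib(10);          // forces evaluation by the CTFE interpreter
static assert(f10 == 55);    // checked entirely at compile time

void main() {
    assert(fib(10) == f10);  // the same code, executed at run time
}
```

A REPL needs strictly more than this (mutable globals, imports, I/O), which is the gap being discussed.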
Jan 18 2011
prev sibling parent reply el muchacho <nicolas.janin gmail.com> writes:
Le 18/01/2011 13:01, Vladimir Panteleev a écrit :
 On Tue, 18 Jan 2011 13:27:56 +0200, bearophile
 <bearophileHUGS lycos.com> wrote:
 
 Vladimir Panteleev:

It also demotivates and alienates programmers.

I don't believe so. I've never seen any C++ programmer who has worked with other languages like Java complain about the Java naming conventions or the obligatory one class = one file. Never. On the contrary, I believe most of them, when going back to C++, try to follow the same conventions as much as possible.
Jan 29 2011
parent reply Daniel Gibson <metalcaedes gmail.com> writes:
Am 29.01.2011 21:21, schrieb el muchacho:
 Le 18/01/2011 13:01, Vladimir Panteleev a écrit :
 On Tue, 18 Jan 2011 13:27:56 +0200, bearophile
 <bearophileHUGS lycos.com>  wrote:

 Vladimir Panteleev:

It also demotivates and alienates programmers.

I don't believe so. I've never seen any C++ programmer who has worked on other languages like Java complain about the Java naming conventions or the obligatory one class = one file. Never. In the contrary, I believe most of them, when going back to C++, try to follow the same conventions as much as possible.

I often find one class = one file annoying. I haven't done much with C++, but some stuff with Java and D1. I mostly agree with Java's naming conventions, though. Cheers, - Daniel
Jan 29 2011
parent foobar <foo bar.com> writes:
Daniel Gibson Wrote:

 Am 29.01.2011 21:21, schrieb el muchacho:
 Le 18/01/2011 13:01, Vladimir Panteleev a écrit :
 On Tue, 18 Jan 2011 13:27:56 +0200, bearophile
 <bearophileHUGS lycos.com>  wrote:

 Vladimir Panteleev:

It also demotivates and alienates programmers.

I don't believe so. I've never seen any C++ programmer who has worked on other languages like Java complain about the Java naming conventions or the obligatory one class = one file. Never. In the contrary, I believe most of them, when going back to C++, try to follow the same conventions as much as possible.

I often find one class = one file annoying. I haven't done much with C++, but some stuff with Java and D1. I mostly agree with javas naming conventions, though. Cheers, - Daniel

I just wanted to point out that the precise Java rule is one _public_ class per file. You can have more than one class in a file as long as only one is declared public. I dunno about your experience, but mine was that this is not a problem in practice, at least not for me. As is usually said about this kind of thing, YMMV.
Jan 29 2011
prev sibling parent reply Lutger Blijdestijn <lutger.blijdestijn gmail.com> writes:
Vladimir Panteleev wrote:

 On Tue, 18 Jan 2011 12:10:25 +0200, bearophile <bearophileHUGS lycos.com>
 wrote:
 
 Walter:

 http://urbanhonking.com/ideasfordozens/2011/01/18/what-makes-a-



 It's a cute blog post. It suggests that it will be good to:

 Getting Code:
 1) Have a central repository for D modules that is easy to use for both
 submitters and users.

Forcing a code repository is bad. Let authors use anything that they're comfortable with. The "repository" must be nothing more than a database of metadata (general information about a package, and how to download it).

I'm pretty happy that my Fedora repositories are just a handful, most of which are set up out of the box. It's a big time saver, one of its best features. I would use / evaluate much less software if I had to read instructions and download each package manually.
 - D code in such repository must Just Work.

This is not practical. The only practical way is to put that responsibility on the authors, and to encourage forking and competition.

True, though one of the cool things Gregor did back in the day with dsss was to automagically run unittests for each package in the repository and publish the results. It wasn't perfect but gave at least some indication.
Jan 18 2011
parent reply Lutger Blijdestijn <lutger.blijdestijn gmail.com> writes:
Vladimir Panteleev wrote:

 On Tue, 18 Jan 2011 13:35:34 +0200, Lutger Blijdestijn
 <lutger.blijdestijn gmail.com> wrote:
 
 I'm pretty happy that my Fedora repositories are just a handful, most of
 which are setup out of the box. It's a big time saver, one of it's best
 features. I would use / evaluate much less software if I had to read
 instructions and download each package manually.

I don't see how this relates to code libraries. Distribution repositories simply repackage and distribute software others have written. Having something like that for D is unrealistic.

Why? It works quite well for Ruby as well as other languages.
Jan 18 2011
next sibling parent Lutger Blijdestijn <lutger.blijdestijn gmail.com> writes:
Vladimir Panteleev wrote:

 On Tue, 18 Jan 2011 14:36:43 +0200, Lutger Blijdestijn
 <lutger.blijdestijn gmail.com> wrote:
 
 Vladimir Panteleev wrote:

 On Tue, 18 Jan 2011 13:35:34 +0200, Lutger Blijdestijn
 <lutger.blijdestijn gmail.com> wrote:

 I'm pretty happy that my Fedora repositories are just a handful, most
 of
 which are setup out of the box. It's a big time saver, one of it's best
 features. I would use / evaluate much less software if I had to read
 instructions and download each package manually.

I don't see how this relates to code libraries. Distribution repositories simply repackage and distribute software others have written. Having something like that for D is unrealistic.

Why? It works quite well for Ruby as well as other languages.

Um? Maybe I don't know enough about RubyGems (I don't use Ruby but used it once or twice for a Ruby app) but AFAIK it isn't maintained by a group of people who select and package libraries from authors' web pages, but it is the authors who publish their libraries directly on RubyGems.

Aha, I've been misunderstanding you all this time, thinking you were arguing against the very idea of a standard repository and package *format*. Then I agree, I also prefer something more decentralized.
Jan 18 2011
prev sibling parent reply Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
On 1/18/11 6:36 AM, Lutger Blijdestijn wrote:
 Vladimir Panteleev wrote:

 On Tue, 18 Jan 2011 13:35:34 +0200, Lutger Blijdestijn
 <lutger.blijdestijn gmail.com>  wrote:

 I'm pretty happy that my Fedora repositories are just a handful, most of
 which are setup out of the box. It's a big time saver, one of it's best
 features. I would use / evaluate much less software if I had to read
 instructions and download each package manually.

I don't see how this relates to code libraries. Distribution repositories simply repackage and distribute software others have written. Having something like that for D is unrealistic.

Why? It works quite well for Ruby as well as other languages.

Package management is something we really need to figure out for D. Question is, do we have an expert on board (apt-get architecture, cpan, rubygems...)? Andrei
Jan 18 2011
next sibling parent reply Jacob Carlborg <doob me.com> writes:
On 2011-01-18 17:29, Andrei Alexandrescu wrote:
 On 1/18/11 6:36 AM, Lutger Blijdestijn wrote:
 Vladimir Panteleev wrote:

 On Tue, 18 Jan 2011 13:35:34 +0200, Lutger Blijdestijn
 <lutger.blijdestijn gmail.com> wrote:

 I'm pretty happy that my Fedora repositories are just a handful,
 most of
 which are setup out of the box. It's a big time saver, one of it's best
 features. I would use / evaluate much less software if I had to read
 instructions and download each package manually.

I don't see how this relates to code libraries. Distribution repositories simply repackage and distribute software others have written. Having something like that for D is unrealistic.

Why? It works quite well for Ruby as well as other languages.

Package management something we really need to figure out for D. Question is, do we have an expert on board (apt-get architecture, cpan, rubygems...)? Andrei

I'm not an expert but I've been thinking for a while about doing a package system for D, basically RubyGems but for D. But I first want to finish (as in: make it somewhat usable and release it) another project I'm working on. -- /Jacob Carlborg
Jan 19 2011
parent reply Jacob Carlborg <doob me.com> writes:
On 2011-01-19 14:39, Gour wrote:
 On Wed, 19 Jan 2011 14:07:27 +0100
 Jacob Carlborg<doob me.com>  wrote:

 I'm not an expert but I've been thinking for a while about doing a
 package system for D, basically RubyGems but for D.

Have you thought about waf, which already has some support for D as a build system, and is intended to be a build framework? (http://waf-devel.blogspot.com/2010/12/make-your-own-build-system-with-waf.html) Sincerely, Gour

Never heard of it, I'll have a look. -- /Jacob Carlborg
Jan 19 2011
parent reply Jacob Carlborg <doob me.com> writes:
On 2011-01-19 18:44, Jacob Carlborg wrote:
 On 2011-01-19 14:39, Gour wrote:
 On Wed, 19 Jan 2011 14:07:27 +0100
 Jacob Carlborg<doob me.com> wrote:

 I'm not an expert but I've been thinking for a while about doing a
 package system for D, basically RubyGems but for D.

Have you thought about waf (which already has some support for D as build system) and it is intended to be build framework? (http://waf-devel.blogspot.com/2010/12/make-your-own-build-system-with-waf.html) Sincerely, Gour

Never heard of it, I'll have a look.

1. it uses python, yet another dependency
2. it seems complicated

-- 
/Jacob Carlborg
Jan 19 2011
next sibling parent Lutger Blijdestijn <lutger.blijdestijn gmail.com> writes:
Russel Winder wrote:

 On Thu, 2011-01-20 at 12:32 +0100, Gour wrote:
 On Thu, 20 Jan 2011 10:13:00 +0000
 Russel Winder <russel russel.org.uk> wrote:
 
 SCons, Waf, and Gradle are currently the tools of choice.

Gradle is (mostly) for Java-based projects, afaict?

It is the case that there are two more or less distinct domains of build -- JVM-oriented, and everything else. There is though nothing stopping a single build system from trying to be more universal. Sadly every attempt to date has failed for one reason or another (not necessarily technical). Basically there seems to be a positive feedback loop in action keeping the two domains separate: basically the tools from one domain don't work well on the opposite domain and so no-one uses them there, so no evolution happens to improve things. In this particular case, Gradle has great support for everything JVM-related and no real support for C, C++, Fortran, etc. All attempts to raise the profile of the Ant C/C++ compilation tasks, which Gradle could use trivially, have come to nothing.

Do you have an opinion for the .NET world? I'm currently just using MSBuild, but know just enough to get it working. It sucks.
Jan 20 2011
prev sibling parent Jacob Carlborg <doob me.com> writes:
On 2011-01-20 10:19, Gour wrote:
 On Wed, 19 Jan 2011 19:40:49 +0100
 Jacob Carlborg<doob me.com>  wrote:

 1. it uses python, yet another dependency

True, but it brings more features over e.g. cmake 'cause you have full language on disposal.

I would go with a tool that uses a dynamic language as a DSL. I'm assuming you can embed the dynamic language completely, without the need for external dependencies.
 2. it seems complicated

Well, build systems are complex... ;) Sincerely, Gour

Hm, right. I was actually kind of thinking about a build tool, not a package system/tool. But it seemed complex anyway; it should be possible to keep it quite simple. -- /Jacob Carlborg
Jan 20 2011
prev sibling parent Bruno Medeiros <brunodomedeiros+spam com.gmail> writes:
On 18/01/2011 16:29, Andrei Alexandrescu wrote:
 Package management something we really need to figure out for D.
 Question is, do we have an expert on board (apt-get architecture, cpan,
 rubygems...)?

 Andrei

I agree. Having worked on Eclipse a lot, which uses OSGi as the underlying package management system, I really stand by its usefulness. For larger projects it is chaos without it (more or less chaos depending on the particular situation). -- Bruno Medeiros - Software Engineer
Feb 04 2011
prev sibling next sibling parent "Simen kjaeraas" <simen.kjaras gmail.com> writes:
Vladimir Panteleev <vladimir thecybershadow.net> wrote:

 - I must stress that having a shared community-wide style to write D  
 code helps a lot when you want to use in your program modules written  
 by other people. Otherwise your program looks like a patchwork of  
 wildly different styles.

I assume you mean naming conventions and not actual code style (indentation etc.)

Likely he meant more than that. At least such is the impression I've had before. I am not vehemently opposed to such an idea, and I definitely agree that naming conventions should be observed, but I have at times had the impression that bearophile wants all aspects of code to be controlled by such a coding style. -- Simen
Jan 18 2011
prev sibling next sibling parent "Vladimir Panteleev" <vladimir thecybershadow.net> writes:
On Tue, 18 Jan 2011 13:27:56 +0200, bearophile <bearophileHUGS lycos.com>  
wrote:

 Vladimir Panteleev:

 Forcing a code repository is bad.

In this case I was not suggesting to force things :-) But having a place to find reliable modules is very good.
 This is not practical.

It works in Python, Ruby and often in Perl too, so I don't agree.

I think we have a misunderstanding, then? Who ensures that the modules "just work"? If someone breaks something, are they thrown out of The Holy Repository?
 I assume you mean naming conventions and not actual code style  
 (indentation etc.)

I meant that D code written by different people is better looking similar, where possible. C/C++ programmers have too much freedom where freedom is not necessary. Reducing some of such useless freedom helps improve the code ecosystem.

It also demotivates and alienates programmers.
 - Currently D packages are not working well yet, there are bug reports  
 on this.
 - Something higher level than packages is useful when you build very  
 large systems.
 - Module system theory from ML-like languages shows many years old ideas  
 that otherwise will need to be painfully re-invented half-broken by D  
 language developers. Sometimes wasting three days reading saves you some  
 years of pain.

I'm curious (not arguing), can you provide examples? I can't think of any drastic improvements to the package system.
 I don't think this is practical until someone writes a D interpreter.

CTFE interpter is already there :-)

So you think the subset of D that's CTFE-able is good enough to make an interactive console that's actually useful? -- Best regards, Vladimir mailto:vladimir thecybershadow.net
Jan 18 2011
prev sibling next sibling parent "Vladimir Panteleev" <vladimir thecybershadow.net> writes:
On Tue, 18 Jan 2011 13:35:34 +0200, Lutger Blijdestijn  
<lutger.blijdestijn gmail.com> wrote:

 I'm pretty happy that my Fedora repositories are just a handful, most of
 which are setup out of the box. It's a big time saver, one of it's best
 features. I would use / evaluate much less software if I had to read
 instructions and download each package manually.

I don't see how this relates to code libraries. Distribution repositories simply repackage and distribute software others have written. Having something like that for D is unrealistic.
 True, though one of the cool things Gregor did back the days with dsss is
 automagically run unittests for each package in the repository and  
 publish
 the results. It wasn't perfect but gave at least some indication.

I think that idea is taken from CPAN. CPAN refuses to install the package if it fails unit tests (unless you force it to). -- Best regards, Vladimir mailto:vladimir thecybershadow.net
Jan 18 2011
prev sibling next sibling parent "Vladimir Panteleev" <vladimir thecybershadow.net> writes:
On Tue, 18 Jan 2011 14:30:53 +0200, bearophile <bearophileHUGS lycos.com>  
wrote:

 Vladimir Panteleev:

 I think we have a misunderstanding, then? Who ensures that the modules
 "just work"? If someone breaks something, are they thrown out of The  
 Holy
 Repository?

There is no single solution to such problems. It's a matter of creating rules and lot of work to enforce them as years pass. If you talk about Holy things you are pushing this discussion toward a stupid direction.

If a single entity controls the inclusion of submissions into an important set, then there will inevitably be conflicts. Also I still have no idea what you meant when you said that Python, Ruby and Perl do it. AFAIK their repositories are open and anyone can submit their project.
 I'm curious (not arguing), can you provide examples? I can't think of  
 any drastic
 improvements to the package system.

I was talking about fixing bugs, improving strength, maybe later adding super-packages, and generally taking a good look at the literature about the damn ML-style module systems and their theory.

I meant examples of why this is useful for D. (Why are you damning the ML-style module systems?) -- Best regards, Vladimir mailto:vladimir thecybershadow.net
Jan 18 2011
prev sibling next sibling parent "Vladimir Panteleev" <vladimir thecybershadow.net> writes:
On Tue, 18 Jan 2011 14:36:43 +0200, Lutger Blijdestijn  
<lutger.blijdestijn gmail.com> wrote:

 Vladimir Panteleev wrote:

 On Tue, 18 Jan 2011 13:35:34 +0200, Lutger Blijdestijn
 <lutger.blijdestijn gmail.com> wrote:

 I'm pretty happy that my Fedora repositories are just a handful, most  
 of
 which are setup out of the box. It's a big time saver, one of it's best
 features. I would use / evaluate much less software if I had to read
 instructions and download each package manually.

I don't see how this relates to code libraries. Distribution repositories simply repackage and distribute software others have written. Having something like that for D is unrealistic.

Why? It works quite well for Ruby as well as other languages.

Um? Maybe I don't know enough about RubyGems (I don't use Ruby but used it once or twice for a Ruby app) but AFAIK it isn't maintained by a group of people who select and package libraries from authors' web pages, but it is the authors who publish their libraries directly on RubyGems. -- Best regards, Vladimir mailto:vladimir thecybershadow.net
Jan 18 2011
prev sibling next sibling parent Russel Winder <russel russel.org.uk> writes:

I missed a lot of this thread and coming in part way through may miss
lots of past nuances, or even major facts.

On Thu, 2011-01-20 at 10:19 +0100, Gour wrote:
 On Wed, 19 Jan 2011 19:40:49 +0100
 Jacob Carlborg <doob me.com> wrote:
 1. it uses python, yet another dependency

True, but it brings more features over e.g. cmake 'cause you have full language on disposal.

Waf and SCons (both Python based) are top of the pile in the C/C ++/Fortran/LaTeX build game, with CMake a far back third and everything else failing to finish. In the Java/Scala/Groovy/Clojure build game Gradle beats Maven beats Gant beats Ant, for the reason that Groovy beats XML as a build specification language. Internal DSLs using dynamic languages just win in this game. (Though the Scala crew are trying to convince people that SBT, a build tool written such that you use Scala code to specify the build is good. It is a priori but it has some really critical negative design issues.)
 2. it seems complicated

Well, build systems are complex... ;)

Definitely. Well, at least for anything other than trivial projects anyway. The trick is to make it as easy as possible to specify the complexity comprehensibly. Make did this in 1978, but is not now the tool of choice. Autotools was a heroic attempt to do something based on Make. CMake likewise. SCons, Waf, and Gradle are currently the tools of choice.

-- 
Russel.
Dr Russel Winder      t: +44 20 7585 2200     voip: sip:russel.winder ekiga.net
41 Buckmaster Road    m: +44 7770 465 077     xmpp: russel russel.org.uk
London SW11 1EN, UK   w: www.russel.org.uk    skype: russel_winder
Jan 20 2011
prev sibling next sibling parent Russel Winder <russel russel.org.uk> writes:

On Thu, 2011-01-20 at 12:32 +0100, Gour wrote:
 On Thu, 20 Jan 2011 10:13:00 +0000
 Russel Winder <russel russel.org.uk> wrote:
 SCons, Waf, and Gradle are currently the tools of choice.

Gradle is (mostly) for Java-based projects, afaict?

It is the case that there are two more or less distinct domains of build -- JVM-oriented, and everything else. There is though nothing stopping a single build system from trying to be more universal. Sadly every attempt to date has failed for one reason or another (not necessarily technical). Basically there seems to be a positive feedback loop in action keeping the two domains separate: the tools from one domain don't work well in the opposite domain and so no-one uses them there, so no evolution happens to improve things. In this particular case, Gradle has great support for everything JVM-related and no real support for C, C++, Fortran, etc. All attempts to raise the profile of the Ant C/C++ compilation tasks, which Gradle could use trivially, have come to nothing.

-- 
Russel.
Jan 20 2011
prev sibling parent Russel Winder <russel russel.org.uk> writes:

On Thu, 2011-01-20 at 19:24 +0100, Lutger Blijdestijn wrote:
[ . . . ]
 Do you have an opinion for the .NET world? I'm currently just using MSBuild, but know just enough to get it working. It sucks.

I thought .NET was dominated by NAnt -- I have no direct personal experience, so am "speaking" from a position of deep ignorance. SCons and Waf should both work.

-- 
Russel.
Jan 21 2011
prev sibling next sibling parent "Vladimir Panteleev" <vladimir thecybershadow.net> writes:
On Tue, 18 Jan 2011 12:07:21 +0200, Christopher Nicholson-Sauls  
<ibisbasenji gmail.com> wrote:

 That doesn't scale anywhere. What if you want to use a 3rd-party library
 with a few dozen modules?

Then I would expect the library vendor to provide either a pre-compiled binary library, or the means to readily generate same -- whether that means a Makefile, a script, or what have you.

Why? You're saying that both the user and every library maintainer must do that additional work. Why should the user care that they have to deal with pre-compiled libraries in general? The only thing the user should bother with is the package name for the library. D can take care of everything else: check out the library sources from version control, build a library and generate .di files. The .di files can include pragmas which specify to link to that library. There are no technical reasons against this. In fact, DSSS already does most of this. AFAIK Ruby takes care of everything else, even when the library isn't installed on your system.
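A minimal sketch of the generated-interface-file idea described above, using D's existing pragma(lib); the module and library names are made up for illustration:

```d
// mylib.di -- hypothetical auto-generated interface file.
// The pragma embeds the link dependency in the object file's link
// information, so a user only imports the module and never has to
// pass linker flags for it by hand.
module mylib.core;

pragma(lib, "mylib");   // ask the compiler/linker to pull in the mylib library

void greet();           // declaration only; the body lives in the library
```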
 Forgive me if I misunderstand, but I really don't want a
 language/compiler that goes too far into hand-holding.  Let me screw up
 if I want to.

So, you want D to force people to do more work, for no practical reason? -- Best regards, Vladimir mailto:vladimir thecybershadow.net
Jan 18 2011
prev sibling next sibling parent Gour <gour atmarama.net> writes:

On Tue, 18 Jan 2011 10:32:53 +0000 (UTC)
Trass3r <un known.com> wrote:

 We must avoid having the same disastrous situation like C/C++ where
 everyone uses a different system, CMake, make, scons, blabla.

I agree (planning not to use a blabla build system, but waf). OTOH, I hope D2 will also be able to avoid things like: http://cdsmith.wordpress.com/2011/01/16/haskells-own-dll-hell/ However, for now I'm more concerned with seeing 64-bit DMD, a complete QtD (or some other workable GUI bindings), some database bindings etc. first... Sincerely, Gour

-- 
Gour | Hlapicina, Croatia | GPG key: CDBF17CA
Jan 18 2011
prev sibling next sibling parent "Vladimir Panteleev" <vladimir thecybershadow.net> writes:
On Tue, 18 Jan 2011 13:28:32 +0200, Walter Bright  
<newshound2 digitalmars.com> wrote:

 What is that message?

OPTLINK (R) for Win32  Release 8.00.8
Copyright (C) Digital Mars 1989-2010  All rights reserved.
http://www.digitalmars.com/ctg/optlink.html
test1.obj(test1)
 Error 42: Symbol Undefined _D5test21fFZv
--- errorlevel 1

1) The error message is very technical:
   a) it does not indicate what exactly is wrong (a module was not passed to the linker -- not that the linker knows that)

There could be many reasons for the error, see:

Sorry, you're missing the point. The toolchain has the ability to output a much more helpful error message (or just do the right thing and compile the whole project, which is obviously what the user intends to do 99% of the time).
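To make the failure concrete, here is a minimal two-module sketch that triggers exactly this error (the mangled name _D5test21fFZv decodes to module test2, function f, taking no arguments and returning void):

```d
// --- test2.d ---
module test2;
void f() {}

// --- test1.d ---
module test1;
import test2;          // the import compiles fine: the compiler sees test2.d
void main() { f(); }

// `dmd test1.d`           -> compiles, but OPTLINK reports
//                            "Error 42: Symbol Undefined _D5test21fFZv",
//                            because test2 was never compiled or linked.
// `dmd test1.d test2.d`   -> builds successfully.
```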
 http://www.digitalmars.com/ctg/OptlinkErrorMessages.html#symbol_undefined

 which is linked from the url listed:

 http://www.digitalmars.com/ctg/optlink.html

 and more directly from the FAQ:

 http://www.digitalmars.com/faq.html

   b) does not give any indication of what the user has to do to fix it

The link above does give such suggestions, depending on what the cause of the error is.

This is not nearly good enough. I can bet you that over 95% of users will Google for the error message instead. Furthermore, that webpage is very technical. Some D users (those wanting a high-performance high-level programming language) don't even need to know what a linker is or does.
 2) OPTLINK doesn't demangle D mangled names, when it could, and it  
 would improve the readability of its error messages considerably.
    (I know not all mangled names are demangleable, but it'd be a great  
 improvement regardless)

The odd thing is that Optlink did demangle the C++ mangled names, and people actually didn't like it that much.

I think we can agree that there is a significant difference between the two audiences (users of your C++ toolchain who need a high-end, high-performance C++ compiler, vs. people who want to try a new programming language). You can make it an option, or just print both mangled and demangled.
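(For concreteness, here is how mechanical the demangling being asked for is. A minimal sketch in Python, for illustration only: it handles just the plain length-prefixed-identifier case of D's mangling and ignores the type suffix ("FZv" = function, no parameters, returning void) as well as the compression/back-reference features of the real ABI.)

```python
def demangle_qualified_name(sym):
    """Extract the dot-qualified name from a simple D mangled symbol.

    D symbols start with "_D" followed by length-prefixed identifier
    segments (e.g. "5test2" -> "test2"). The trailing type signature
    is ignored here. Returns None when sym is not a D mangled name.
    """
    if not sym.startswith("_D"):
        return None
    i, parts = 2, []
    while i < len(sym) and sym[i].isdigit():
        j = i
        while j < len(sym) and sym[j].isdigit():
            j += 1                      # read the decimal length prefix
        n = int(sym[i:j])
        parts.append(sym[j:j + n])      # take that many name characters
        i = j + n
    return ".".join(parts) if parts else None

print(demangle_qualified_name("_D5test21fFZv"))  # -> test2.f
```

So the "Symbol Undefined _D5test21fFZv" above could at least be reported as "test2.f".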
 dmd can build entire programs with one command:

     dmd file1.d file2.d file3.d ...etc...

library with a few dozen modules?

Just type the filenames and library names on the command line. You can
put hundreds if you like.

If you do blow up the command line processor (nothing dmd can do about
that), you can put all those files in a file, say "cmd", and invoke with:

    dmd cmd

The only limit is the amount of memory in your system.

That's not what I meant - I meant it doesn't scale as far as user effort
is concerned. There is no reason why D should force users to maintain
response files, make files, etc. D (the language) doesn't need them, and
nor should the reference implementation.

--
Best regards,
Vladimir                          mailto:vladimir thecybershadow.net
Jan 18 2011
prev sibling next sibling parent "Vladimir Panteleev" <vladimir thecybershadow.net> writes:
On Tue, 18 Jan 2011 14:47:29 +0200, Jim <bitcirkel yahoo.com> wrote:

 I imagine such a compiler could also do some interesting optimisations  
 based on its greater perspective.

Compiling the entire program at once opens the door to much more than
just optimizations. You could have virtual templated methods, for one.

--
Best regards,
Vladimir                          mailto:vladimir thecybershadow.net
Jan 18 2011
prev sibling next sibling parent reply Adam Ruppe <destructionator gmail.com> writes:
Interestingly, my own experience with Ruby, a few years ago, was
almost 180 degrees opposite of the blogger's.

The two most frustrating aspects were documentation and deployment.
The documents were sparse and useless and deployment was the
hugest headache I've ever experienced, in great part due to Rubygems
not working properly!

They've probably improved it a lot since then, but it reinforced
my long-standing belief that third party libraries are, more often
than not, more trouble than they're worth anyway.
Jan 18 2011
parent Brad <brad.lanam.comp_nospam nospam_gmail.com> writes:
In digitalmars.D, you wrote:
 The two most frustrating aspects were documentation and deployment.
 The documents were sparse and useless and deployment was the
 hugest headache I've ever experienced, in great part due to Rubygems
 not working properly!

 They've probably improved it a lot since then, but it reinforced
 my long-standing belief that third party libraries are, more often
 than not, more trouble than they're worth anyway.

I only poked into RubyGems briefly and I had the same impression at the
time. Perl's CPAN is much more mature.

Much of the time, I feel as you do about 3rd party libraries. They try
to do too much, are inflexible and not customizable. But many of the
perl packages on CPAN are written to address a single task, are flexible
and easy to use. I use several and have my favorites. Others are not
worth the trouble.

But this problem is going to happen to any system. Some of the packages
are simply useless, poorly designed, too specific, not supported, out of
date, etc. But other packages are well designed, well supported, and work
great. Some haven't changed in ages and work well.

I think counters can help -- how many downloads, indicating popularity.
How many _recent_ downloads, or a histogram of downloads by month, so
the user can tell if the package is out of date. I don't like rating
systems much, but that's also a possibility.

An integrated bug database and forums similar to sourceforge would be
very useful. You can check activity and see if the author of the package
is active and keeps on top of problems.

-- Brad
Jan 20 2011
prev sibling next sibling parent "Vladimir Panteleev" <vladimir thecybershadow.net> writes:
On Tue, 18 Jan 2011 15:51:58 +0200, Adam Ruppe <destructionator gmail.com>  
wrote:

 Jim wrote:
 Why can't the compiler traverse this during compilation in order to
 find all relevant modules and compile them if needed?

How will it find all the modules? Since modules and files don't have to have matching names, it can't assume "import foo;" will necessarily be found in "foo.d". I use this fact a lot to get all a program's dependencies in one place.

I think this is a misfeature. I suppose you avoid using build tools and prefer makefiles/build scripts for some reason?
 The modules don't necessarily have to be under the current
 directory either. It'd have a lot of files to search, which might
 be brutally slow.

Not if the compiler knows the file name based on the module name.
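(The lookup rule being argued for here is tiny. A sketch in Python for illustration; `find_module` and the import-path list are hypothetical names, and the scheme simply mirrors the dotted module name onto the directory layout:)

```python
import os

def find_module(module_name, import_paths):
    """Resolve a module name like "foo.bar" to a source file by checking
    each import path for foo/bar.d. This is the simple convention under
    discussion: the file layout mirrors the module name, so no directory
    scan is ever needed."""
    rel = os.path.join(*module_name.split(".")) + ".d"
    for root in import_paths:
        candidate = os.path.join(root, rel)
        if os.path.isfile(candidate):
            return candidate
    return None  # module declared under a non-matching file name
```

A module that lives in a file with a non-matching name is exactly the case this scheme cannot find, which is the trade-off being debated.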
 ... but, if you do want that behavior, you can get it today somewhat
 easily: dmd *.d, which works quite well if all the things are in
 one folder anyway.

...which won't work on Windows, for projects with packages, and if you
have any unrelated .d files (backups, test programs) in your directory
(which I almost always do).

--
Best regards,
Vladimir                          mailto:vladimir thecybershadow.net
Jan 18 2011
prev sibling next sibling parent Andrej Mitrovic <andrej.mitrovich gmail.com> writes:
On 1/18/11, Walter Bright <newshound2 digitalmars.com> wrote:
 You can put
 hundreds if you like.

DMD can, but Optlink can't handle long arguments.
Jan 18 2011
prev sibling next sibling parent "Vladimir Panteleev" <vladimir thecybershadow.net> writes:
On Tue, 18 Jan 2011 16:58:31 +0200, Adam Ruppe <destructionator gmail.com>  
wrote:

 Yeah, makefiles and build scripts are adequately fit already.

Then the question is: does the time spent writing and maintaining
makefiles and build scripts exceed the time it would take you to set up
a build tool?

--
Best regards,
Vladimir                          mailto:vladimir thecybershadow.net
Jan 18 2011
prev sibling next sibling parent Andrej Mitrovic <andrej.mitrovich gmail.com> writes:
On 1/18/11, Andrej Mitrovic <andrej.mitrovich gmail.com> wrote:
 On 1/18/11, Walter Bright <newshound2 digitalmars.com> wrote:
 You can put
 hundreds if you like.

DMD can, but Optlink can't handle long arguments.

Although now that I've read the error description I might have passed a wrong argument somehow. I'll take a look.
Jan 18 2011
prev sibling next sibling parent "Vladimir Panteleev" <vladimir thecybershadow.net> writes:
On Tue, 18 Jan 2011 22:17:08 +0200, Walter Bright  
<newshound2 digitalmars.com> wrote:

 Vladimir Panteleev wrote:
 IMO, sticking to the C-ism of "one object file at a time" and  
 dependency on external build tools / makefiles is the biggest mistake  
 DMD did in this regard.

You don't need such a tool with dmd until your project exceeds a certain
size. Most of my little D projects' "build tool" is a one line script
that looks like:

    dmd foo.d bar.d

There's just no need to go farther than that.

Let's review the two problems discussed in this thread:

1) Not passing all modules to the compiler results in a
nearly-incomprehensible (for some) linker error.

2) DMD's inability (or rather, unwillingness) to build the whole program
when it's in the position to, which creates the dependency on external
build tools (or solutions that require unnecessary human effort).

Are you saying that there's no need to fix either of these because they
don't bother you personally?

--
Best regards,
Vladimir                          mailto:vladimir thecybershadow.net
Jan 18 2011
prev sibling next sibling parent "Vladimir Panteleev" <vladimir thecybershadow.net> writes:
On Wed, 19 Jan 2011 07:16:40 +0200, Austin Hastings <ah08010-d yahoo.com>  
wrote:

 None of them worked.

Most of those build utilities do exactly what make + your perl-foo do.

--
Best regards,
Vladimir                          mailto:vladimir thecybershadow.net
Jan 18 2011
prev sibling next sibling parent "Vladimir Panteleev" <vladimir thecybershadow.net> writes:
On Wed, 19 Jan 2011 08:09:11 +0200, Austin Hastings <ah08010-d yahoo.com>  
wrote:

 On 1/19/2011 12:50 AM, Vladimir Panteleev wrote:
 On Wed, 19 Jan 2011 07:16:40 +0200, Austin Hastings
 <ah08010-d yahoo.com> wrote:

 None of them worked.

Most of those build utilities do exactly what make + your perl-foo do.

No, they don't.

Actually, you're probably right here. To my knowledge, there are only two build tools that take advantage of the -deps compiler option - rdmd and xfbuild. Older ones were forced to parse the source files - rebuild even used DMD's frontend for that. There's also a relatively new tool (dbuild oslt?) which generates makefiles.
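(A rough sketch of what "taking advantage of -deps" means: the compiler dumps which module imported which file, and the build tool collects the file set. Python for illustration; the exact line shape below is an assumption based on typical dmd -deps dumps, so treat the regex as illustrative rather than authoritative.)

```python
import re

# Assumed -deps line shape: "module (file) : access : imported (file)"
DEP_LINE = re.compile(r"^\S+\s+\(([^)]+)\)\s*:\s*\S+\s*:\s*\S+\s+\(([^)]+)\)")

def imported_files(deps_text):
    """Collect every imported source file mentioned in a -deps dump,
    i.e. the extra files a whole-program build would also need."""
    files = set()
    for line in deps_text.splitlines():
        m = DEP_LINE.match(line.strip())
        if m:
            files.add(m.group(2))   # the file of the imported module
    return files
```

This is essentially what rdmd and xfbuild do, instead of parsing D source themselves the way the older tools had to.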
 That's the point: I was _getting started_ with D2. I had no strong  
 desire to reinvent the wheel, build tool-wise. But the tools I was  
 pointed at just didn't work.

When a tool works for the author and many other users but not for you,
you have to wonder where the fault really is. Besides, aren't all these
tools open-source? The one time I had a problem with DSSS, it was easy
to fix, and I sent the author a patch and everyone was better off from
it. Isn't that how open-source works? :)

--
Best regards,
Vladimir                          mailto:vladimir thecybershadow.net
Jan 18 2011
prev sibling next sibling parent "Vladimir Panteleev" <vladimir thecybershadow.net> writes:
On Wed, 19 Jan 2011 12:57:42 +0200, spir <denis.spir gmail.com> wrote:

 Because when a module defines a type Foo (or rather, it's what is  
 exported), I like it to be called Foo.d. A module called doFoo.d would  
 certainly mainly define a func doFoo. So, people directly know what's in  
 there (and this, from D's own [supposed] naming rules :-). Simple, no?

I actually tried this convention for a project. It turned out to be a
bad idea, because if you want to access a static member or subclass of
said class, you must specify the type twice (once for the module name,
and again for the type) - e.g. "Foo.Foo.bar()". Besides, it's against
the recommended D code style convention:

http://www.digitalmars.com/d/2.0/dstyle.html

--
Best regards,
Vladimir                          mailto:vladimir thecybershadow.net
Jan 19 2011
prev sibling next sibling parent Gour <gour atmarama.net> writes:

On Wed, 19 Jan 2011 14:07:27 +0100
Jacob Carlborg <doob me.com> wrote:

 I'm not an expert but I've been thinking for a while about doing a
 package system for D, basically RubyGems but for D.

Have you thought about waf, which already has some support for D as a
build system, and is intended to be a build framework?

(http://waf-devel.blogspot.com/2010/12/make-your-own-build-system-with-waf.html)

Sincerely,
Gour

--
Gour | Hlapicina, Croatia | GPG key: CDBF17CA
----------------------------------------------------------------
Jan 19 2011
prev sibling next sibling parent retard <re tard.com.invalid> writes:
Wed, 19 Jan 2011 13:56:17 +0000, Adam Ruppe wrote:

 Andrei wrote:
  We need a package system that takes Internet distribution
 into account.

Do you think something like my simple http based system would work?

Fetch dependencies. Try to compile. If the linker complains about
missing files, download them from http://somewebsite/somepath/filename,
try again from the beginning.

There's no metadata, no version tracking, nothing like that, but I don't
think such things are necessary. Worst case, just download the specific
version you need for your project manually.

A build tool without any kind of dependency versioning support is a complete failure. Especially if it also tries to handle external non-D dependencies. It basically makes supporting all libraries with rapid API changes quite impossible.
Jan 19 2011
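(The loop Adam proposes above is easy to state precisely. A Python sketch; `compile_cmd` and `fetch` are hypothetical injected callables standing in for the real compiler invocation and HTTP download, since the retry policy rather than the tooling is the point:)

```python
def build_with_autofetch(compile_cmd, fetch, max_rounds=10):
    """Try to build; on a missing-symbol failure, derive the missing
    module's name, fetch its source, and retry from the beginning.

    compile_cmd() -> (ok, missing_module_or_None)
    fetch(module_name) -> True if the download succeeded
    """
    for _ in range(max_rounds):
        ok, missing = compile_cmd()
        if ok:
            return True                     # clean build, done
        if missing is None or not fetch(missing):
            return False                    # unrecoverable failure
    return False                            # gave up after max_rounds
```

Note this is exactly the design retard criticizes in the follow-ups: with no version tracking, `fetch` can only ever grab "whatever the server has now".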
prev sibling next sibling parent retard <re tard.com.invalid> writes:
Wed, 19 Jan 2011 19:41:47 +0000, Adam Ruppe wrote:

 retard wrote:
 A build tool without any kind of dependency versioning support is a
 complete failure.

You just delete the old files and let it re-download them to update. If the old one is working for you, simply keep it.

I meant that if the latest version 0.321 of the project 'foobar' depends
on 'bazbaz 0.5.8.2', but versions 0.5.8.4 - 0.5.8.11 (API but not ABI
compatible), 0.5.9 (mostly incompatible) and 0.6 - 0.9.12.3 (totally
incompatible) also exist, the build fails badly when downloading the
latest library. If you don't document the versions of the dependencies
anywhere, it's almost impossible to build the project even manually.
Jan 19 2011
prev sibling next sibling parent "Vladimir Panteleev" <vladimir thecybershadow.net> writes:
On Wed, 19 Jan 2011 21:41:47 +0200, Adam Ruppe <destructionator gmail.com>  
wrote:

 retard wrote:
 A build tool without any kind of dependency versioning support is a
 complete failure.

You just delete the old files and let it re-download them to update. If the old one is working for you, simply keep it.

You're missing the point. You want to install package X (either directly
or as a dependency for something else), which was written for a specific
version of Y. Your tool will just download the latest version of Y and
the whole thing crashes and burns.

Someone posted this somewhere else in this thread, I believe it's quite
relevant:

http://cdsmith.wordpress.com/2011/01/16/haskells-own-dll-hell/

--
Best regards,
Vladimir                          mailto:vladimir thecybershadow.net
Jan 19 2011
prev sibling next sibling parent retard <re tard.com.invalid> writes:
Wed, 19 Jan 2011 20:01:28 +0000, Adam Ruppe wrote:

 I meant that if the latest version 0.321 of the project 'foobar'
 depends on 'bazbaz 0.5.8.2'

Personally, I'd just prefer people to package their damned dependencies
with their app.... But, a configuration file could fix that easily
enough. Set one up like this:

bazbaz = http://bazco.com/0.5.8.2/

Then it'd try to download http://bazco.com/0.5.8.2/bazbaz.module.d
instead of the default site (which is presumably the latest version).

This approach also makes it easy to add third party servers and
libraries, so you wouldn't be dependent on a central source for your
code.

Here's a potential problem: what if bazbaz needs some specific version
of something too? Maybe it could check for a config file on its server
too, and use those directives when getting the library.

This is how it goes: you come up with more and more features once you
spend some time THINKING about the possible functionality for such a
tool. Instead of NIH, why don't you just study what the existing tools
do and pick up all the relevant features? The reason there are so many
open source tools doing exactly the same thing is that developers are
too lazy to study the previous work, and start writing code before
common sense kicks in.
Jan 19 2011
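(For concreteness, the pinning file quoted above — "bazbaz = http://bazco.com/0.5.8.2/" — is trivial to consume. A Python sketch; the default-base URL is a made-up placeholder for the "default site" Adam mentions:)

```python
def parse_pins(text):
    """Parse "name = baseurl" lines into a dict; blank lines and
    '#' comments are skipped."""
    pins = {}
    for line in text.splitlines():
        line = line.split("#", 1)[0].strip()
        if not line:
            continue
        name, sep, url = line.partition("=")
        if sep:
            pins[name.strip()] = url.strip().rstrip("/")
    return pins

def module_url(pins, name, default_base="http://default.example/latest"):
    """Resolve where to fetch a module from: the pinned base URL if one
    exists, else the (hypothetical) default site."""
    return "%s/%s.module.d" % (pins.get(name, default_base), name)

pins = parse_pins("bazbaz = http://bazco.com/0.5.8.2/")
print(module_url(pins, "bazbaz"))
# -> http://bazco.com/0.5.8.2/bazbaz.module.d
```

The hard part, as the thread notes, is not the parsing but propagating such pins transitively and resolving conflicts between them.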
prev sibling next sibling parent Andrej Mitrovic <andrej.mitrovich gmail.com> writes:
Meh.

Just give us File access in CTFE and we'll be done talking about build
tools. Just run DMD on the thing and the app automagically tracks and
downloads all of its dependencies.

I'm kidding. But file access in CTFE would be so damn cool. :)
Jan 19 2011
prev sibling next sibling parent Gour <gour atmarama.net> writes:

On Wed, 19 Jan 2011 19:40:49 +0100
Jacob Carlborg <doob me.com> wrote:

 1. it uses python, yet another dependency

True, but it brings more features than e.g. cmake 'cause you have a full
language at your disposal.
 2. it seems complicated

Well, build systems are complex... ;)

Sincerely,
Gour

--
Gour | Hlapicina, Croatia | GPG key: CDBF17CA
----------------------------------------------------------------
Jan 20 2011
prev sibling next sibling parent Gour <gour atmarama.net> writes:

On Thu, 20 Jan 2011 10:13:00 +0000
Russel Winder <russel russel.org.uk> wrote:

 SCons, Waf, and Gradle are currently the tools of choice.

Gradle is (mostly) for Java-based projects, afaict?

Sincerely,
Gour

--
Gour | Hlapicina, Croatia | GPG key: CDBF17CA
----------------------------------------------------------------
Jan 20 2011
prev sibling next sibling parent reply Bruno Medeiros <brunodomedeiros+spam com.gmail> writes:
On 18/01/2011 05:20, Walter Bright wrote:
 http://urbanhonking.com/ideasfordozens/2011/01/18/what-makes-a-programming-language-good/

"I quit being a professional programmer."

I usually avoid discussions, and dismiss out of hand opinions about
software development from those who no longer develop code (did that
recently with my boss, to cut off a discussion). Mostly for time saving;
it's not that I think they're automatically wrong, or even likely to be
wrong.

Still, I read that article, and it's not bad, there are some good
points. In fact, I strongly agree, in essence, with one of the things he
said: that language ecosystems are what matter, not just the language
itself. At least for most programmers, what you want is to develop
software, software that is useful or interesting; it's not about staring
at the beauty of your code and that's it.

Thus, any language community that focuses excessively on the language
only and forsakes, dismisses, or forgets the rest of the toolchain and
ecosystem will never succeed beyond a niche. (*cough* LISP *cough*)

--
Bruno Medeiros - Software Engineer
Feb 04 2011
parent so <so so.do> writes:
 "I quit being a professional programmer."

 I usually avoid discussions and dismiss out of hand opinions about  
 software development from those who no longer develop code (did that  
 recently with my boss, to cut off a discussion). Mostly for time saving,  
 it's not that I think they automatically wrong, or even likely to be  
 wrong.

We are still in the stone age of programming; what has changed in the last 10 years? Nothing.
Feb 04 2011
prev sibling next sibling parent Ulrik Mikaelsson <ulrik.mikaelsson gmail.com> writes:
2011/2/4 Bruno Medeiros <brunodomedeiros+spam com.gmail>:
 language ecosystems are what matter, not just the language itself. At least
 for most programmers, what you want is to develop software, software that is
 useful or interesting, it's not about staring at the beauty of your code and
 that's it.

if the language itself, or the platforms it's tied to makes me grit my teeth. What good is earning lots of money, to buy the finest food, if I only have my gums left to chew it with? (Figuratively speaking, of course)
Feb 06 2011
prev sibling parent Gour <gour atmarama.net> writes:

On Mon, 07 Feb 2011 01:06:46 -0800
Walter Bright <newshound2 digitalmars.com> wrote:

 I tend to learn things by fixing them :-)

Heh... this is called 'engineer'. ;)

Sincerely,
Gour

--
Gour | Hlapicina, Croatia | GPG key: CDBF17CA
----------------------------------------------------------------
Feb 07 2011