
digitalmars.D - Have Win DMD use gmake instead of a separate DMMake makefile?

reply Nick Sabalausky <SeeWebsiteToContactMe semitwist.com> writes:
Although it took longer than I expected to get around to it, I'm
working on a release-generator tool for DMD. I'm finding that a very
significant amount of the effort involved (much more than I expected)
is discovering and dealing with all the fun little differences between
the posix and win32 makefiles (and now we have some win64 makefiles as
well).

Efforts can be made to decrease these differences, but simply
having separate makefiles in the first place (let alone using
completely different "make"s: GNU make vs DM make) is a natural
invitation for divergence.

No disrespect intended to Digital Mars Make, but since GNU make appears
to be more feature-rich, have wider overall adoption, and is freely
available on Windows as a pre-built binary
<http://gnuwin32.sourceforge.net/packages/make.htm>: Would it be
acceptable to use gmake as *the* make for DMD? Ie, either convert the
windows makefiles to gmake, or expand the posix makefiles to support
windows?

I'd be willing to give it a shot myself, and I could trivially
write a small batch utility to download Win gmake and put it on the
current PATH, so that nobody has to go downloading/installing it
manually. I would do this *after* finishing the release-generator tool,
but afterwards it would allow the tool's implementation to be greatly
simplified.

Is this something that would be acceptable, or does building DMD for
Windows need to stay as DM make?

(Sorry for the accidental cross-post to "announce", can someone delete
that one?)
Aug 10 2013
next sibling parent reply Jonathan M Davis <jmdavisProg gmx.com> writes:
On Saturday, August 10, 2013 14:35:04 Nick Sabalausky wrote:
 Is this something that would be acceptable, or does building DMD for
 Windows need to stay as DM make?
I don't see any problem with it, but that doesn't mean that Walter won't.

Another suggestion that I kind of liked was to just build them all with a single script written in D and ditch make entirely, which would seriously reduce the amount of duplication across platforms. But that's obviously a much bigger change and would likely be much more controversial than simply using a more standard make.

- Jonathan M Davis
Aug 10 2013
next sibling parent reply Nick Sabalausky <SeeWebsiteToContactMe semitwist.com> writes:
On Sat, 10 Aug 2013 16:21:45 -0700
Jonathan M Davis <jmdavisProg gmx.com> wrote:
 
 Another suggestion that I kind of liked was to just build them all
 with a single script written in D and ditch make entirely, which
 would seriously reduce the amount of duplication across platforms.
 But that's obviously a much bigger change and would likely be much
 more controversial than simply using a more standard make.
 
Yea, while I do like that too, it would make bootstrapping difficult. But then again, if parts of DMD start being written in D (as there has been some talk about), then that would have to deal with the exact same bootstrapping issue anyway.

Although, if that does happen (parts of DMD written in D), then I'd imagine it may help a lot to do it *starting* from a really good solid makefile instead of the inconsistent makefiles we have now. *Then* we could transition to a D-based buildscript if we really wanted, but I think starting with a D-based buildscript, or the current posix/win makefiles, could just make everything messier.

The posix makefiles actually aren't too bad at this point (the "generated" directories phobos/druntime use on posix are a big improvement) but the windows makefiles seem to be lagging behind. At the very least, I'd like to see that situation engineered away via common posix/windows makefiles - which of course requires using the same "make".
Aug 10 2013
parent Jonathan M Davis <jmdavisProg gmx.com> writes:
On Saturday, August 10, 2013 20:06:32 Nick Sabalausky wrote:
 On Sat, 10 Aug 2013 16:21:45 -0700
 
 Jonathan M Davis <jmdavisProg gmx.com> wrote:
 Another suggestion that I kind of liked was to just build them all
 with a single script written in D and ditch make entirely, which
 would seriously reduce the amount of duplication across platforms.
 But that's obviously a much bigger change and would likely be much
 more controversial than simply using a more standard make.
Yea, while I do like that too, it would make bootstrapping difficult. But then again, if parts of DMD start being written in D (as there has been some talk about), then that would have to deal with the exact same bootstrapping issue anyway.
Yeah, it introduces a circular dependency, but it's one that we're planning to introduce anyway.
 Although, if that does happen (parts of DMD written in D), then I'd
 imagine it may help a lot to do it *starting* from a really good solid
 makefile instead of the inconsistent makefiles we have now. *Then* we
 could transition to a D-based buildscript if we really wanted, but I
 think starting with a D-based buildscript, or the current posix/win
 makefiles, could just make everything messier.
 
 The posix makefiles actually aren't too bad at this point (the
 "generated" directories phobos/druntime use on posix are a big
 improvement) but the windows makefiles seem to be lagging behind. At
 the very least, I'd like to see that situation engineered away via
 common posix/windows makefiles - which of course requires using the
 same "make".
I actually think that a build script written in D could be quite clean, but it would obviously not be even vaguely standard, which could be viewed as a definite con. Really, I don't care all that much how the build scripts are put together just so long as they're reasonably maintainable, and what we have right now isn't, especially on the Windows side.

If I had to tackle the problem though, I'd likely tackle it with a script written in D. But I've got enough on my plate already without worrying about the build system. Thanks for looking into it.

- Jonathan M Davis
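P.S. Just to illustrate the idea, here is a rough sketch of what such a script could look like. The file names and flags below are placeholders, not the actual DMD/Phobos build rules; the point is only that the platform differences become ordinary D code instead of parallel makefiles:

    // build.d - minimal sketch of a cross-platform build script in D.
    // File names and flags are placeholders, not the real DMD build.
    import std.path : buildPath;
    import std.process : spawnProcess, wait;
    import std.stdio : writeln;

    int main()
    {
        // Platform-specific bits live in ordinary D code instead of
        // separate per-platform makefiles.
        version (Windows) enum exeName = "myapp.exe";
        else              enum exeName = "myapp";

        auto sources = [buildPath("src", "main.d"), buildPath("src", "util.d")];
        auto cmd = ["dmd", "-O", "-release", "-of" ~ exeName] ~ sources;

        writeln("Running: ", cmd);
        auto status = wait(spawnProcess(cmd));
        if (status != 0)
            writeln("Build failed with status ", status);
        return status;
    }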
Aug 10 2013
prev sibling next sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 8/10/2013 4:21 PM, Jonathan M Davis wrote:
 On Saturday, August 10, 2013 14:35:04 Nick Sabalausky wrote:
 Is this something that would be acceptable, or does building DMD for
 Windows need to stay as DM make?
I don't see any problem with it, but that doesn't mean that Walter won't.
Tools built for Unix never work right on Windows. It's why, for example, I run git on Linux and don't use the badly ported Windows versions of git. Tiresome problems revolve around failure to adapt to \ path separators, ; in PATH, CRLF line endings, Windows SEH, case insensitive file names, no symbolic links, no Perl installed, etc.

DMD and Phobos are fairly unusual in how well adapted they are to both Windows and Linux.
 Another suggestion that I kind of liked was to just build them all with a
 single script written in D and ditch make entirely, which would seriously
 reduce the amount of duplication across platforms. But that's obviously a much
 bigger change and would likely be much more controversial than simply using a
 more standard make.
I don't see much point in that. The dmd build is straightforward, and I see no particular gain from reinventing that wheel.
Aug 10 2013
next sibling parent reply "Mike Parker" <aldacron gmail.com> writes:
On Sunday, 11 August 2013 at 05:48:19 UTC, Walter Bright wrote:
 On 8/10/2013 4:21 PM, Jonathan M Davis wrote:
 On Saturday, August 10, 2013 14:35:04 Nick Sabalausky wrote:
 Is this something that would be acceptable, or does building 
 DMD for
 Windows need to stay as DM make?
I don't see any problem with it, but that doesn't mean that Walter won't.
Tools built for Unix never work right on Windows. It's why, for example, I run git on Linux and don't use the Windows badly ported versions of git. Tiresome problems revolve around failure to adapt to \ path separators, ; in PATH, CRLF line endings, Windows SEH, case insensitive file names, no symbolic links, etc., no Perl installed, etc.
Things can be wonky from a vanilla windows command prompt, which is why I never use any Linux tools there. MSYS makes all those problems go away. I use git exclusively on windows, but via gitbash, which is built on top of MSYS.

Of course, it would be silly to require MSYS or Cygwin to build on Windows, but there's always CMake. A number of open source projects use it these days. Ship a configuration file with the source, then the user can generate Makefiles for a number of compilers and platforms.
Aug 10 2013
parent reply Sean Kelly <sean invisibleduck.org> writes:
On Aug 10, 2013, at 11:46 PM, Mike Parker <aldacron gmail.com> wrote:

 Things can be wonky from a vanilla windows command prompt, which is
 why I never use any Linux tools there. MSYS makes all those problems go
 away. I use git exclusively on windows, but via gitbash, which is built
 on top of MSYS.

 Of course, it would be silly to require MSYS or Cygwin to build on
 Windows, but there's always CMake. A number of open source projects use
 it these days. Ship a configuration file with the source, then the user
 can generate Makefiles for a number of compilers and platforms.
I haven't used it recently, but GnuWin32 (http://gnuwin32.sourceforge.net/) provides good ports of *nix apps without the need for MSYS or Cygwin.
Aug 12 2013
parent reply Nick Sabalausky <SeeWebsiteToContactMe semitwist.com> writes:
On Mon, 12 Aug 2013 11:06:37 -0700
Sean Kelly <sean invisibleduck.org> wrote:

 On Aug 10, 2013, at 11:46 PM, Mike Parker <aldacron gmail.com> wrote:
 
 Things can be wonky from a vanilla windows command prompt, which is
 why I never use any Linux tools there. MSYS makes all those
 problems go away. I use git exclusively on windows, but via
 gitbash, which is built on top of MSYS.
 
 Of course, it would be silly to require MSYS or Cygwin to build on
 Windows, but there's always CMake. A number of open source projects
 use it these days. Ship a configuration file with the source, then
 the user can generate Makefiles for a number of compilers and
 platforms.
I haven't used it recently, but GnuWin32 (http://gnuwin32.sourceforge.net/) provides good ports of *nix apps without the need for MSYS or Cygwin.
I can second that. I use GNU's 'grep' and 'tee' from the standard windows command prompt fairly often, and there's some other good stuff in there too. (I've tried other versions of 'tee' for windows, but the GNU one actually seems to be the best. Certainly the fastest by a [...] impressed with how well they turned out to work even without Posix and bash.

Perhaps surprisingly though, I don't actually use ls on windows - but that's only because the win version doesn't give much (any?) visual distinction of directories vs files. Instead, I stuck an "ls.bat" in my windows directory that invokes "dir /w %*". Probably my [...]

My good experience with some of the GnuWin32 tools is part of what led me to suggest gmake. But admittedly I haven't actually tried gmake on windows, so there may very well be problems with that one, for all I know.
Aug 12 2013
next sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 8/12/2013 4:59 PM, Nick Sabalausky wrote:
 Perhaps surprisingly though, I don't actually use ls on windows - but
 that's only because the win version doesn't give much (any?)
 visual distinction of directories vs files. Instead, I stuck an
 "ls.bat" in my windows directory that invokes "dir /w %*". Probably my

You can set the default switches that DIR uses by setting the DIRCMD environment variable:

    set DIRCMD=/w

I use:

    set DIRCMD=/O:D/P
Aug 12 2013
parent reply Nick Sabalausky <SeeWebsiteToContactMe semitwist.com> writes:
On Mon, 12 Aug 2013 17:37:51 -0700
Walter Bright <newshound2 digitalmars.com> wrote:

 On 8/12/2013 4:59 PM, Nick Sabalausky wrote:
 Perhaps surprisingly though, I don't actually use ls on windows -
 but that's only because the win version doesn't give much (any?)
 visual distinction of directories vs files. Instead, I stuck an
 "ls.bat" in my windows directory that invokes "dir /w %*". Probably

You can set the default switches that DIR uses by setting the DIRCMD environment variable: set DIRCMD=/w
Cool, didn't know that. But typing "ls" quickly became second nature to me back when I started learning unix, and then I kept instinctively trying to type it whenever I came back to windows (this despite being primarily a windows guy). So I just made typing "ls" actually work. ;)

I keep trying to type "cp" on windows, too. I'm hesitant to make that one work though, or to put gnu's "cp" on my PATH, because I'm sure I'd end up slipping "cp" into my distributed batch files without thinking.
 I use:
 
      set DIRCMD=/O:D/P
Yea, I used to use /P, too. Back in the MS-DOS/Win3.1 days, I had a "wdir.bat" set up to run "dir /w /p". Back then, "dir" was practically useless without /P. But with the GUI-window terminals now, I find it easier to skip the /P feature and just use the scroll wheel. Or if I really need to (ex, really long output, or a unix vm without X, or ssh terminal via putty) then I'll just pipe into 'less' or redirect into a text file.
Aug 12 2013
parent reply "Adam D. Ruppe" <destructionator gmail.com> writes:
On Tuesday, 13 August 2013 at 01:09:41 UTC, Nick Sabalausky wrote:
 ex, really long output, or a unix vm without X,
Tip: try hitting shift + page up and shift + page down. Works in xterm and the text mode linux console to scroll the terminal.
Aug 12 2013
parent reply Nick Sabalausky <SeeWebsiteToContactMe semitwist.com> writes:
On Tue, 13 Aug 2013 03:12:21 +0200
"Adam D. Ruppe" <destructionator gmail.com> wrote:

 On Tuesday, 13 August 2013 at 01:09:41 UTC, Nick Sabalausky wrote:
 ex, really long output, or a unix vm without X,
Tip: try hitting shift + page up and shift + page down. Works in xterm and the text mode linux console to scroll the terminal.
Whoa, even in text-mode? That's cool. I'll have to also remember to try next time I boot my freebsd vm. I deliberately didn't put X on it, so that one's text mode only. Maybe that trick will work there, too.
Aug 12 2013
next sibling parent "Adam D. Ruppe" <destructionator gmail.com> writes:
On Tuesday, 13 August 2013 at 01:56:19 UTC, Nick Sabalausky wrote:
 Whoa, even in text-mode?
yep. You should try using text mode only for a while - it is amazingly usable.
Aug 12 2013
prev sibling parent reply Jonathan M Davis <jmdavisProg gmx.com> writes:
On Monday, August 12, 2013 21:56:09 Nick Sabalausky wrote:
 On Tue, 13 Aug 2013 03:12:21 +0200
 
 "Adam D. Ruppe" <destructionator gmail.com> wrote:
 On Tuesday, 13 August 2013 at 01:09:41 UTC, Nick Sabalausky wrote:
 ex, really long output, or a unix vm without X,
Tip: try hitting shift + page up and shift + page down. Works in xterm and the text mode linux console to scroll the terminal.
Whoa, even in text-mode? That's cool. I'll have to also remember to try next time I boot my freebsd vm. I deliberately didn't put X on it, so that one's text mode only. Maybe that trick will work there, too.
shift + page up and shift + page down work on the standard linux console you get if you're not in X and are in a tty (though it loses its history if you change to a different tty). I assume that it's the same with FreeBSD, but I don't know.

- Jonathan M Davis
Aug 12 2013
parent "Adam D. Ruppe" <destructionator gmail.com> writes:
On Tuesday, 13 August 2013 at 05:05:35 UTC, Jonathan M Davis 
wrote:
 (though it loses its history if you change to a different tty)
Yeah, though there's an easy solution here too: gnu screen! (Though shift+pageup doesn't quite work even there - if you change ttys it does clear that buffer - screen's own scrollback is still there; hit C-a [ to go into copy mode and you can scroll around it.) There are some minor bugs with screen and gpm (the text mode mouse driver), but overall it works well.

Of course, screen works just as well on putty, xterm, and just about anything else. On my laptop's putty shortcut, I have it automatically run screen for me, so it resumes my session and offers the nice C-a a feature all the time (though on the laptop I rebound that to C-s due to the shape of the keyboard, and to make nesting screens easier). I love it.
Aug 13 2013
prev sibling parent reply Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
On 8/12/13 4:59 PM, Nick Sabalausky wrote:
 On Mon, 12 Aug 2013 11:06:37 -0700
 Sean Kelly <sean invisibleduck.org> wrote:

 On Aug 10, 2013, at 11:46 PM, Mike Parker <aldacron gmail.com> wrote:

 Things can be wonky from a vanilla windows command prompt, which is
 why I never use any Linux tools there. MSYS makes all those
 problems go away. I use git exclusively on windows, but via
 gitbash, which is built on top of MSYS.

 Of course, it would be silly to require MSYS or Cygwin to build on
 Windows, but there's always CMake. A number of open source projects
 use it these days. Ship a configuration file with the source, then
 the user can generate Makefiles for a number of compilers and
 platforms.
I haven't used it recently, but GnuWin32 (http://gnuwin32.sourceforge.net/) provides good ports of *nix apps without the need for MSYS or Cygwin.
I can second that.
Is it possible from a licensing standpoint to just distribute a copy of gmake built by gnuwin?

Andrei
Aug 12 2013
next sibling parent Nick Sabalausky <SeeWebsiteToContactMe semitwist.com> writes:
On Mon, 12 Aug 2013 17:42:26 -0700
Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> wrote:
 
 Is is possible from a licensing standpoint to just distribute a copy
 of gmake built by gnuwin?
 
I don't even pretend to understand one word of any version of the GPL, so I couldn't say. However, if it were my own project, what I would do is provide a trivial script to *download* the gnuwin gmake binary directly from the official gnuwin servers (or another legitimate mirror).

I'm neither a lawyer nor a GPL expert, so I can't say 100% for *certain* (but then, what ever *is* 100% certain in US law without being individually tested in court?), but FWIW I would be very surprised if that approach would be objectionable, since we wouldn't actually be distributing it ourselves, just providing a tool that retrieves it from the actual distributor.
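Roughly what I have in mind, sketched in D rather than batch. The URL below is just a placeholder (not the real GnuWin32 download link) and error handling is omitted:

    // fetch_gmake.d - sketch of "download gmake and put it on the PATH".
    // The URL is a placeholder; point it at the real GnuWin32 mirror.
    import std.file : exists, mkdirRecurse;
    import std.net.curl : download;
    import std.path : absolutePath, buildPath, pathSeparator;
    import std.process : environment;
    import std.stdio : writeln;

    void main()
    {
        enum url = "http://example.com/gnuwin32/make.exe"; // placeholder
        enum dir = "tools";
        auto dest = buildPath(dir, "make.exe");

        if (!exists(dest))
        {
            mkdirRecurse(dir);
            writeln("Downloading ", url, " ...");
            download(url, dest);
        }

        // Prepend the tool directory to PATH for this process and its
        // children, so a later build step can just invoke "make".
        environment["PATH"] = absolutePath(dir) ~ pathSeparator
            ~ environment["PATH"];
        writeln("gmake available at ", dest);
    }

(Admittedly std.net.curl drags in libcurl; shelling out to an existing downloader would work just as well.)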
Aug 12 2013
prev sibling parent reply Jacob Carlborg <doob me.com> writes:
On 2013-08-13 02:42, Andrei Alexandrescu wrote:

 Is is possible from a licensing standpoint to just distribute a copy of
 gmake built by gnuwin?
I don't see why we couldn't do that. It's a completely separate tool and shouldn't "infect" anything else. We might need to accompany it with a license file and a link to the source code to be on the safe side. -- /Jacob Carlborg
Aug 12 2013
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 8/12/2013 11:48 PM, Jacob Carlborg wrote:
 On 2013-08-13 02:42, Andrei Alexandrescu wrote:

 Is is possible from a licensing standpoint to just distribute a copy of
 gmake built by gnuwin?
I don't see why we couldn't do that. It's a completely separate tool and shouldn't "infect" anything else. We might need to accompany it with a license file and a link to the source code to be on the safe side.
Again, read my reply to Brad in this thread.
Aug 13 2013
parent reply "Joakim" <joakim airpost.net> writes:
On Tuesday, 13 August 2013 at 08:30:26 UTC, Walter Bright wrote:
 On 8/12/2013 11:48 PM, Jacob Carlborg wrote:
 On 2013-08-13 02:42, Andrei Alexandrescu wrote:

 Is is possible from a licensing standpoint to just distribute 
 a copy of
 gmake built by gnuwin?
I don't see why we couldn't do that. It's a completely separate tool and shouldn't "infect" anything else. We might need to accompany it with a license file and a link to the source code to be on the safe side.
Again, read my reply to Brad in this thread.
Presumably you are referring to this quote, which does not show up as a reply?

"Oh, I forgot to mention, licensing. We want Phobos to be free of any restrictive licensing. GPL is restrictive, and so is LGPL. We very deliberately picked Boost. Having Phobos be a mix of GPL and Boost would utterly defeat picking Boost."

If you're only talking about distributing a GPL-licensed gmake binary with dmd/phobos, I don't think it has any impact on Phobos licensing, ie the GPL would only apply to the gmake binary. The GPL is a very badly written license, but I think it has been generally established that you can distribute tools like gmake or g++ with your own code and that doesn't make your own code have to be GPL, as long as gmake/g++ are only used to process/compile your code and your own code doesn't integrate the source for gmake/g++, ie gdc, which is almost never the case.

Personally, I like the D-based build system idea. Distribute dmd/phobos with a D-based build tool like dub, which has been compiled ahead of time for each supported platform and will compile D for you, if you want. Of course, this means that you'll still need to maintain makefiles on the D build machines that will build dub for the D maintainers to distribute, but it isolates all the complexity of makefiles from end users, so they don't have to touch any of that stuff. Whether that ease of use is worth the extra effort, I don't know.
Aug 13 2013
next sibling parent reply Iain Buclaw <ibuclaw ubuntu.com> writes:
On 13 August 2013 10:55, Joakim <joakim airpost.net> wrote:
 On Tuesday, 13 August 2013 at 08:30:26 UTC, Walter Bright wrote:
 On 8/12/2013 11:48 PM, Jacob Carlborg wrote:
 On 2013-08-13 02:42, Andrei Alexandrescu wrote:

 Is is possible from a licensing standpoint to just distribute a copy of
 gmake built by gnuwin?
I don't see why we couldn't do that. It's a completely separate tool and shouldn't "infect" anything else. We might need to accompany it with a license file and a link to the source code to be on the safe side.
Again, read my reply to Brad in this thread.
Presumably you are referring to this quote, which does not show up as a reply? "Oh, I forgot to mention, licensing. We want Phobos to be free of any restrictive licensing. GPL is restrictive, and so is LGPL. We very deliberately picked Boost. Having Phobos be a mix of GPL and Boost would utterly defeat picking Boost." If you're only talking about distributing a GPL-licensed gmake binary with dmd/phobos, I don't think it has any impact on Phobos licensing, ie the GPL would only apply to the gmake binary. The GPL is a very badly written license, but I think it has been generally established that you can distribute tools like gmake or g++ with your own code and that doesn't make your own code have to be GPL, as long as gmake/g++ are only used to process/compile your code and your own code doesn't integrate the source for gmake/g++, ie gdc, which is almost never the case.
Pardon? (I don't understand what point you are trying to put across about gdc, so I think it might be wrong ;-) -- Iain Buclaw *(p < e ? p++ : p) = (c & 0x0f) + '0';
Aug 13 2013
parent reply "Joakim" <joakim airpost.net> writes:
On Tuesday, 13 August 2013 at 10:09:11 UTC, Iain Buclaw wrote:
 On 13 August 2013 10:55, Joakim <joakim airpost.net> wrote:
 would only apply to the gmake binary.  The GPL is a very badly 
 written
 license, but I think it has been generally established that 
 you can
 distribute tools like gmake or g++ with your own code and that 
 doesn't make
 your own code have to be GPL, as long as gmake/g++ are only 
 used to
 process/compile your code and your own code doesn't integrate 
 the source for
 gmake/g++, ie gdc, which is almost never the case.
Pardon? (I don't understand what point you are trying to put across about gdc, so I think it might be wrong ;-)
You seem to have a lot of problems reading what's written. ;)

The point was that if you are distributing dmd and phobos source with GPL binary tools like gmake or g++, which are then only compiled by those binaries, there is no problem with the GPL. You only need to provide the source for gmake and g++. If you were distributing gdc, which actually integrates with the same compiler backend source as g++, then you have to include the source for the gdc frontend and whatever other glue files it uses. Since most source code doesn't integrate with the gcc compiler backend, the GPL licensing of gmake/g++ is not a problem for most projects, including dmd.

On Tuesday, 13 August 2013 at 16:11:44 UTC, H. S. Teoh wrote:
 On Tue, Aug 13, 2013 at 11:55:12AM +0200, Joakim wrote:
 [...]
 Personally, I like the D-based build system idea.  Distribute
 dmd/phobos with a D-based build tool like dub, which has been
 compiled ahead of time for each supported platform and will 
 compile
 D for you, if you want.  Of course, this means that you'll 
 still
 need to maintain makefiles on the D build machines that will 
 build
 dub for the D maintainers to distribute, but it isolates all 
 the
 complexity of makefiles from end users, so they don't have to 
 touch
 any of that stuff.  Whether that ease of use is worth the extra
 effort, I don't know.
There's no need to maintain any makefiles once the D build tool is usable. As long as you have a working binary of dmd that can compile the tool, that's good enough.
I thought you'd still need the makefiles around for the rare occasion when you bootstrap to a new platform, as the D-based build tool won't compile there initially. Perhaps I'm wrong, I don't know much about all the vagaries involved with bootstrapping.

On Tuesday, 13 August 2013 at 18:10:06 UTC, Walter Bright wrote:
 On 8/13/2013 2:55 AM, Joakim wrote:
 On Tuesday, 13 August 2013 at 08:30:26 UTC, Walter Bright 
 wrote:
 On 8/12/2013 11:48 PM, Jacob Carlborg wrote:
 On 2013-08-13 02:42, Andrei Alexandrescu wrote:

 Is is possible from a licensing standpoint to just 
 distribute a copy of
 gmake built by gnuwin?
I don't see why we couldn't do that. It's a completely separate tool and shouldn't "infect" anything else. We might need to accompany it with a license file and a link to the source code to be on the safe side.
Again, read my reply to Brad in this thread.
Presumably you are referring to this quote, which does not show up as a reply?
Nobody seems to have read it or be able to find it, it has no replies, so I quote it here:
I think we've all seen that post. The problem is that Andrei, and Jacob later, were only asking about licensing issues with gmake, but your pasted response to Brad didn't mention licensing. You're probably right that distributing gmake is problematic on technical and "ease of use" grounds. I was just making a narrow point that the GPL likely isn't an issue, which is what Andrei and Jacob asked about.
Aug 13 2013
parent Iain Buclaw <ibuclaw ubuntu.com> writes:
On 13 August 2013 20:34, Joakim <joakim airpost.net> wrote:
 On Tuesday, 13 August 2013 at 10:09:11 UTC, Iain Buclaw wrote:
 On 13 August 2013 10:55, Joakim <joakim airpost.net> wrote:
 would only apply to the gmake binary.  The GPL is a very badly written
 license, but I think it has been generally established that you can
 distribute tools like gmake or g++ with your own code and that doesn't
 make
 your own code have to be GPL, as long as gmake/g++ are only used to
 process/compile your code and your own code doesn't integrate the source
 for
 gmake/g++, ie gdc, which is almost never the case.
Pardon? (I don't understand what point you are trying to put across about gdc, so I think it might be wrong ;-)
You seem to have a lot of problems reading what's written. ;) The point was that if you are distributing dmd and phobos source with GPL binary tools like gmake or g++, which are then only compiled by those binaries, there is no problem with the GPL. You only need to provide the source for gmake and g++. If you were distributing gdc, which actually integrates with the same compiler backend source as g++, then you have to include the source for the gdc frontend and whatever other glue files it uses. Since most source code doesn't integrate with the gcc compiler backend, the GPL licensing of gmake/g++ is not a problem for most projects, including dmd.
Right, the way you put it, it looked like you were hinting that gdc was an example of code that doesn't integrate the source for g++...

Anyway, my opinion on the matter: don't distribute binary blobs with source code. If they wish to build a tool/library from source, leave instructions in a README to tell them where to get all prerequisites (eg: link to a tarball) if their distribution does not provide packages for said prerequisites already. -- Iain Buclaw *(p < e ? p++ : p) = (c & 0x0f) + '0';
Aug 13 2013
prev sibling next sibling parent reply "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Tue, Aug 13, 2013 at 11:55:12AM +0200, Joakim wrote:
[...]
 Personally, I like the D-based build system idea.  Distribute
 dmd/phobos with a D-based build tool like dub, which has been
 compiled ahead of time for each supported platform and will compile
 D for you, if you want.  Of course, this means that you'll still
 need to maintain makefiles on the D build machines that will build
 dub for the D maintainers to distribute, but it isolates all the
 complexity of makefiles from end users, so they don't have to touch
 any of that stuff.  Whether that ease of use is worth the extra
 effort, I don't know.
There's no need to maintain any makefiles once the D build tool is usable. As long as you have a working binary of dmd that can compile the tool, that's good enough. T -- For every argument for something, there is always an equal and opposite argument against it. Debates don't give answers, only wounded or inflated egos.
Aug 13 2013
parent reply "Wyatt" <wyatt.epp gmail.com> writes:
On Tuesday, 13 August 2013 at 16:11:44 UTC, H. S. Teoh wrote:
 There's no need to maintain any makefiles once the D build tool 
 is usable.
On this note, I've often wondered why D compilers (or dmd, at least) don't just try to infer the necessary files to compile/link for a project based on the modules it imports. Obviously this breaks down once you need to express linkage with external libraries; but figuring out as much as possible automatically would be neat.

e.g. Say the top of smallProj.d has:

    import common.func;
    import common.data;

...and you have a common/ directory with func.d and data.d, it seems the compiler could accept:

    dmd smallProj.d

...as a shorthand for:

    dmd common/func.d common/data.d smallProj.d

Given that it doesn't work this way, I'm guessing there's some aspect I've missed that throws it into the same sort of hellscape of agony as every other build system in the world, but I can't figure out what it might be...

-Wyatt
Aug 13 2013
parent "Dicebot" <public dicebot.lv> writes:
On Tuesday, 13 August 2013 at 16:39:06 UTC, Wyatt wrote:
 Given that it doesn't work this way, I'm guessing there's some 
 aspect I've missed that throws it into the same sort of 
 hellscape of agony as every other build system in the world, 
 but I can't figure out what it might be...
It's exactly what rdmd does. Merging this functionality into dmd itself was discussed but not done.
Aug 13 2013
prev sibling next sibling parent reply Nick Sabalausky <SeeWebsiteToContactMe semitwist.com> writes:
On Tue, 13 Aug 2013 11:55:12 +0200
"Joakim" <joakim airpost.net> wrote:

 On Tuesday, 13 August 2013 at 08:30:26 UTC, Walter Bright wrote:
 On 8/12/2013 11:48 PM, Jacob Carlborg wrote:
 On 2013-08-13 02:42, Andrei Alexandrescu wrote:

 Is is possible from a licensing standpoint to just distribute 
 a copy of
 gmake built by gnuwin?
I don't see why we couldn't do that. It's a completely separate tool and shouldn't "infect" anything else. We might need to accompany it with a license file and a link to the source code to be on the safe side.
Again, read my reply to Brad in this thread.
Presumably you are referring to this quote, which does not show up as a reply?
I think he's referring to technical issues with gmake apparently not always playing nice on windows.
Aug 13 2013
parent Walter Bright <newshound2 digitalmars.com> writes:
On 8/13/2013 10:14 AM, Nick Sabalausky wrote:
 I think he's referring to technical issues with gmake apparently not
 always playing nice on windows.
There's only one post by Brad in this thread and one reply! What's the mystery?
Aug 13 2013
prev sibling parent Walter Bright <newshound2 digitalmars.com> writes:
On 8/13/2013 2:55 AM, Joakim wrote:
 On Tuesday, 13 August 2013 at 08:30:26 UTC, Walter Bright wrote:
 On 8/12/2013 11:48 PM, Jacob Carlborg wrote:
 On 2013-08-13 02:42, Andrei Alexandrescu wrote:

 Is is possible from a licensing standpoint to just distribute a copy of
 gmake built by gnuwin?
I don't see why we couldn't do that. It's a completely separate tool and shouldn't "infect" anything else. We might need to accompany it with a license file and a link to the source code to be on the safe side.
Again, read my reply to Brad in this thread.
Presumably you are referring to this quote, which does not show up as a reply?
Nobody seems to have read it or be able to find it, it has no replies, so I quote it here:

============================================================

On 8/11/2013 11:49 AM, Brad Roberts wrote:
 Gross over generalization when talking about _one_ app in _one_ scenario.
It happens over and over to me. Most 'ports' to Windows seem to be:

1. get it to compile
2. ship it!
 You're deflecting rather than being willing to discuss a topic that comes up
 regularly.
I'm posting in this thread because I'm willing to discuss it. I've added much more detail in this post.
 You are also well aware of just how often having multiple make files
 has cause pain by them not being updated in sync.
Yes, and I am usually the one who gets to resync them - and I think it's worth it.
 Does gmake have _any_ of those problems?
The last time I tried it, it bombed because the makefiles had CRLF's. Not an auspicious start. This has probably been fixed, but I haven't cared to try again. But ok, it's been a while, let's take a look.

Consider: http://gnuwin32.sourceforge.net/install.html

In the first paragraph, it says the user must have msvcrt.dll, which doesn't come with it and the user must go find it if he doesn't have it. Then "some packages require msvcp60.dll", which the user must also go find elsewhere. Then, it must be "installed". It even is complicated enough to motivate someone to write a "download and maintenance utility."

"Some packages must be installed in their default directories (usually c:\progra~1\<packagename>), or you have to set corresponding environment variables or set options at the command line; see the documentation of the package, or, when available, the installation instructions on the package page."

Oh joy.

I downloaded the zip file, unzipped it, and ran make.exe. I was rewarded with a dialog box: "The program can't start because libintl3.dll is missing from your computer. Try reinstalling the program to fix this problem." This dll isn't included with the zip file, and the install instructions don't mention it, let alone where I can get it.

"The length of the command-line is limited; see MSDN."

DM make solves that problem.

"The MS-Windows command interpreters, command.com and cmd.exe, understand both the backward slash '\' (which is the default) and the forward slash '/' (such as on Unix) in filenames. In general, it is best to use forward slashes, since some programs internally use the filename, e.g. to derive a directory name, and in doing this rely on the forward slash as path separator."

Actually, Windows utilities (even ones provided by Microsoft) sometimes fail to recognize / as a separator. I've not found any consistent rule about this, other than "it's going to suck sooner or later if you try using / instead of \."

I didn't get further, because I don't have libintl3.dll.

------------------------------

Contrast that with DM make:

1. There is no install and no setup. It's just make.exe. Run it, it works. No friction.

2. Don't need no dlls one must search the internet for, and also no worries about "dll hell" from getting the wrong one. DM make runs on a vanilla install of Windows.

3. It's designed from the ground up to work with Windows. For example, it recognizes "del" as a builtin Windows command, not a program, and handles it directly. It does things in the Windows way.

4. It handles arbitrarily long command lines.

5. No worries with people having a different make.exe than the one the makefiles were built for, as make.exe is distributed with dmd.

6. It's a small program, 50K, meaning it fits in a corner and is a trivial part of the dmd package.

------------------------------

If for no other reason, I oppose using gnu make for dmd on Windows because it significantly raises the barrier of entry for anyone who wants to just recompile Phobos. Gratuitously adding friction for users is not what we need - note the regular posts we get from newbies and the existing friction they encounter.
Aug 13 2013
prev sibling next sibling parent Jonathan M Davis <jmdavisProg gmx.com> writes:
On Saturday, August 10, 2013 22:48:14 Walter Bright wrote:
 On 8/10/2013 4:21 PM, Jonathan M Davis wrote:
 Another suggestion that I kind of liked was to just build them all with a
 single script written in D and ditch make entirely, which would seriously
 reduce the amount of duplication across platforms. But that's obviously a
 much bigger change and would likely be much more controversial than
 simply using a more standard make.
I don't see much point in that. The dmd build is straightforward, and I see no particular gain from reinventing that wheel.
Well, make is horrible, and while posix.mak is way better than win32.mak or win64.mak, it's still pretty bad. Personally, I would never use make without something like cmake in front of it. If we were to write up something in D, it could be properly cross-platform (so only one script instead of 3+), and I fully expect that it could be far, far cleaner than what we're forced to do in make. - Jonathan M Davis
Aug 11 2013
prev sibling next sibling parent Nick Sabalausky <SeeWebsiteToContactMe semitwist.com> writes:
On Sat, 10 Aug 2013 22:48:14 -0700
Walter Bright <newshound2 digitalmars.com> wrote:

 On 8/10/2013 4:21 PM, Jonathan M Davis wrote:
 On Saturday, August 10, 2013 14:35:04 Nick Sabalausky wrote:
 Is this something that would be acceptable, or does building DMD
 for Windows need to stay as DM make?
I don't see any problem with it, but that doesn't mean that Walter won't.
Tools built for Unix never work right on Windows. It's why, for example, I run git on Linux and don't use the Windows badly ported versions of git. Tiresome problems revolve around failure to adapt to \ path separators, ; in PATH, CRLF line endings, Windows SEH, case insensitive file names, no symbolic links, etc., no Perl installed, etc.
Fair point.
 
 Another suggestion that I kind of liked was to just build them all
 with a single script written in D and ditch make entirely, which
 would seriously reduce the amount of duplication across platforms.
 But that's obviously a much bigger change and would likely be much
 more controversial than simply using a more standard make.
I don't see much point in that. The dmd build is straightforward, and I see no particular gain from reinventing that wheel.
The current state is fairly awful when trying to do cross-platform automation of anything that involves building DMD. The make targets are completely different, the available configuration options and defaults are completely different, and the output locations are completely different. Trying to deal with and accommodate the divergent behaviors of posix.mak and win*.mak is a minefield that leads to fragile, tangled code even with my best attempts to keep it clean. And this isn't the first time I've automated building DMD, either.

And yea, all those differences can be addressed, but as long as we're maintaining posix/win buildscripts separately - and in essentially two separate languages (two flavors of make) - then divergence is only going to reoccur.
Aug 11 2013
prev sibling next sibling parent reply Brad Roberts <braddr puremagic.com> writes:
On 8/10/13 10:48 PM, Walter Bright wrote:
 On 8/10/2013 4:21 PM, Jonathan M Davis wrote:
 On Saturday, August 10, 2013 14:35:04 Nick Sabalausky wrote:
 Is this something that would be acceptable, or does building DMD for
 Windows need to stay as DM make?
I don't see any problem with it, but that doesn't mean that Walter won't.
Tools built for Unix never work right on Windows. It's why, for example, I run git on Linux and don't use the Windows badly ported versions of git. Tiresome problems revolve around failure to adapt to \ path separators, ; in PATH, CRLF line endings, Windows SEH, case insensitive file names, no symbolic links, etc., no Perl installed, etc. DMD and Phobos are fairly unusual in how well adapted they are to both Windows and Linux.
Gross over-generalization when talking about _one_ app in _one_ scenario. You're deflecting rather than being willing to discuss a topic that comes up regularly. You are also well aware of just how often having multiple make files has caused pain by them not being updated in sync.

Does gmake have _any_ of those problems?
Aug 11 2013
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 8/11/2013 11:49 AM, Brad Roberts wrote:
 Gross over generalization when talking about _one_ app in _one_ scenario.
It happens over and over to me. Most 'ports' to Windows seem to be:

1. get it to compile
2. ship it!
 You're deflecting rather than being willing to discuss a topic that comes up
 regularly.
I'm posting in this thread because I'm willing to discuss it. I've added much more detail in this post.
 You are also well aware of just how often having multiple make files
 has cause pain by them not being updated in sync.
Yes, and I am usually the one who gets to resync them - and I think it's worth it.
 Does gmake have _any_ of those problems?
The last time I tried it, it bombed because the makefiles had CRLF's. Not an auspicious start. This has probably been fixed, but I haven't cared to try again. But ok, it's been a while, let's take a look.

Consider: http://gnuwin32.sourceforge.net/install.html

In the first paragraph, it says the user must have msvcrt.dll, which doesn't come with it and the user must go find it if he doesn't have it. Then "some packages require msvcp60.dll", which the user must also go find elsewhere. Then, it must be "installed". It even is complicated enough to motivate someone to write a "download and maintenance utility."

"Some packages must be installed in their default directories (usually c:\progra~1\<packagename>), or you have to set corresponding environment variables or set options at the command line; see the documentation of the package, or, when available, the installation instructions on the package page."

Oh joy.

I downloaded the zip file, unzipped it, and ran make.exe. I was rewarded with a dialog box: "The program can't start because libintl3.dll is missing from your computer. Try reinstalling the program to fix this problem." This dll isn't included with the zip file, and the install instructions don't mention it, let alone where I can get it.

"The length of the command-line is limited; see MSDN."

DM make solves that problem.

"The MS-Windows command interpreters, command.com and cmd.exe, understand both the backward slash '\' (which is the default) and the forward slash '/' (such as on Unix) in filenames. In general, it is best to use forward slashes, since some programs internally use the filename, e.g. to derive a directory name, and in doing this rely on the forward slash as path separator."

Actually, Windows utilities (even ones provided by Microsoft) sometimes fail to recognize / as a separator. I've not found any consistent rule about this, other than "it's going to suck sooner or later if you try using / instead of \."

I didn't get further, because I don't have libintl3.dll.

------------------------------

Contrast that with DM make:

1. There is no install and no setup. It's just make.exe. Run it, it works. No friction.

2. Don't need no dlls one must search the internet for, and also no worries about "dll hell" from getting the wrong one. DM make runs on a vanilla install of Windows.

3. It's designed from the ground up to work with Windows. For example, it recognizes "del" as a builtin Windows command, not a program, and handles it directly. It does things in the Windows way.

4. It handles arbitrarily long command lines.

5. No worries with people having a different make.exe than the one the makefiles were built for, as make.exe is distributed with dmd.

6. It's a small program, 50K, meaning it fits in a corner and is a trivial part of the dmd package.

------------------------------

If for no other reason, I oppose using gnu make for dmd on Windows because it significantly raises the barrier of entry for anyone who wants to just recompile Phobos. Gratuitously adding friction for users is not what we need - note the regular posts we get from newbies and the existing friction they encounter.
Aug 11 2013
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On the subject of friction, I believe we make a mistake by making a dependency 
on libcurl, a library over which we don't have control. Some issues:

http://d.puremagic.com/issues/show_bug.cgi?id=10710

http://d.puremagic.com/issues/show_bug.cgi?id=8756
Aug 11 2013
next sibling parent reply "bearophile" <bearophileHUGS lycos.com> writes:
Walter Bright:

 On the subject of friction, I believe we make a mistake by 
 making a dependency on libcurl, a library over which we don't 
 have control. Some issues:

 http://d.puremagic.com/issues/show_bug.cgi?id=10710

 http://d.puremagic.com/issues/show_bug.cgi?id=8756
Issue 8756 doesn't seem caused by libcurl. (If the pragma(lib) feature is not portable then perhaps it should become a feature of just DMD and not of the D language. This aligns the theory/dream of D a bit better with the reality of D.)

On the other hand, doing everything yourself/ourselves has some other large disadvantages. Look at the Phobos bigints: the GHC Haskell compiler uses the GMP multi-precision numbers, which are usually faster or much faster than the Phobos ones, have more functions implemented that are missing in Phobos (like power-modulus), and let the GHC developers focus on more Haskell-related issues.

Rust developers don't try to design/develop a language, a linker, a back-end, a run-time and a standard library all at the same time. Restricting the work helps speed up the development of what's more related to D.

Bye,
bearophile
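P.S. For what it's worth, power-modulus can at least be expressed on top of std.bigint in a few lines - a plain square-and-multiply sketch like the one below, presumably much slower than GMP's tuned routine:

    // powmod.d - sketch of the missing power-modulus on top of std.bigint,
    // using plain square-and-multiply. Not tuned for speed.
    import std.bigint : BigInt;
    import std.stdio : writeln;

    BigInt powMod(BigInt base, BigInt exp, BigInt mod)
    {
        BigInt result = 1;
        base %= mod;
        while (exp > 0)
        {
            if (exp % 2 == 1)               // odd exponent: fold in the base
                result = (result * base) % mod;
            base = (base * base) % mod;     // square
            exp >>= 1;
        }
        return result;
    }

    void main()
    {
        writeln(powMod(BigInt(4), BigInt(13), BigInt(497))); // prints 445
    }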
Aug 11 2013
next sibling parent reply "Anon" <z z.z> writes:
On Sunday, 11 August 2013 at 21:21:45 UTC, bearophile wrote:
 Walter Bright:

 On the subject of friction, I believe we make a mistake by 
 making a dependency on libcurl, a library over which we don't 
 have control. Some issues:

 http://d.puremagic.com/issues/show_bug.cgi?id=10710

 http://d.puremagic.com/issues/show_bug.cgi?id=8756
Issue 8756 doesn't seem caused by libcurl. (If the pragma(lib) feature is not portable then perhaps it should become a feature of just DMD and not of the D language. This aligns a bit better the theory/dream of D with the reality of D.)
Does pragma(lib, "curl") not work on Windows/DMD? I know it does in Linux (used in DMD and LDC, ignored under GDC), and was under the impression that that was the portable way to use pragma(lib). If it isn't now, I would argue that naming the library (rather than the file) should be the standard, accepted use of pragma(lib). It neatly avoids cluttering D code with version()s and repeated pragmas to handle the different naming schemes.
Aug 11 2013
parent reply Jacob Carlborg <doob me.com> writes:
On 2013-08-11 23:35, Anon wrote:

 Does pragma(lib, "curl") not work on Windows/DMD? I know it does in
 Linux (used in DMD and LDC, ignored under GDC),
 and was under the impression that that was the portable way to use
 pragma(lib).
No, it's not portable. For example, libraries on Posix are usually named "libfoo.a", while on Windows they're named "foo.lib". As far as I know that is not handled by pragma(lib). -- /Jacob Carlborg
Aug 12 2013
parent "Adam D. Ruppe" <destructionator gmail.com> writes:
On Tuesday, 13 August 2013 at 06:56:11 UTC, Jacob Carlborg wrote:
 No, it's not portable. Example, libraries on Posix are usually 
 named "libfoo.a", on Windows they're named "foo.lib". As far as 
 I know that is not handled by pragma(lib).
Yes, it is. pragma(lib, "foo") works on both systems. It passes the name "foo" to the linker, which knows about the cross platform differences. I use it all the time on Windows and Linux without any problems.

The problem people have with pragma(lib) is that gdc's architecture doesn't allow it (the front end in gcc can't pass an argument to the linker), and also that separate compilation with any compiler doesn't pass the pragma (its instruction is only in the .d file, not the .o). I don't see this as a problem with the feature - it doesn't have to work in every case to be very useful. Worst case, if it doesn't work, you're just doing it the way you would anyway. Even so, pragma(lib) can still serve to document the fact that the library is required.

In summary, pragma(lib) is great and you can have it when you pry it from my cold, dead hands.
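A trivial example of the idiom, for anyone following along (dmd and ldc pass the name to the linker, which resolves the platform-specific library name; gdc ignores the pragma as noted above):

    // curl_get.d - pragma(lib) lets the source itself request the library.
    pragma(lib, "curl");        // honored by dmd/ldc, ignored by gdc

    import std.net.curl : get;
    import std.stdio : writeln;

    void main()
    {
        auto page = get("http://dlang.org");
        writeln("fetched ", page.length, " bytes");
    }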
Aug 13 2013
prev sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 8/11/2013 2:21 PM, bearophile wrote:
 Walter Bright:

 On the subject of friction, I believe we make a mistake by making a dependency
 on libcurl, a library over which we don't have control. Some issues:

 http://d.puremagic.com/issues/show_bug.cgi?id=10710

 http://d.puremagic.com/issues/show_bug.cgi?id=8756
Issue 8756 doesn't seem caused by libcurl. (If the pragma(lib) feature is not portable then perhaps it should become a feature of just DMD and not of the D language. This aligns a bit better the theory/dream of D with the reality of D.) On the other hand doing everything yourself/ourselves has some other large disadvantages, look at the Phobos bigints: the GHC Haskell compiler uses the GMP multi-precision numbers, that are usually faster or much faster than Phobos ones, have more functions implemented that are missing in Phobos (like power-modulus), and help GHC developers focus on more Haskell-related issues.
You might consider that D is designed to be *very* friendly to linking with existing C code libraries for exactly that reason. Haskell is not. You might also recall my steadfast opposition to doing things like rolling our own crypto libraries rather than linking to existing ones.

That said, as soon as the D *package* starts to depend on non-default-installed libraries, trouble happens. With libcurl, the only solution so far seems to be to BUILD OUR OWN LIBCURL binary!

http://d.puremagic.com/issues/show_bug.cgi?id=10710

This is a terrible situation.

Consider things like the trig functions. D started out by forwarding to the C versions. Unfortunately, the C versions are of spotty, unreliable quality (even today!). Because of that, we've been switching to our own implementations.

And, consider that using GMP means CTFE would not be supported.
 Rust developers don't try to design/develop at the same time a language, a
 linker, a back-end, a run-time and a standard library.
Neither did D's developers. Note that D was developed with existing backends and linkers. Rust is not released yet, and given that they just switched to their own runtime, they clearly intend to ship with their own runtime.
 Restricting the work helps speed up the development of what's more related to
D.
We really aren't complete fools, bearophile.
Aug 11 2013
next sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
Oh, I forgot to mention, licensing.

We want Phobos to be free of any restrictive licensing. GPL is restrictive, and 
so is LGPL.

We very deliberately picked Boost. Having Phobos be a mix of GPL and Boost
would 
utterly defeat picking Boost.
Aug 11 2013
parent Sean Kelly <sean invisibleduck.org> writes:
On Aug 11, 2013, at 2:46 PM, Walter Bright <newshound2 digitalmars.com> wrote:

 Oh, I forgot to mention, licensing.

 We want Phobos to be free of any restrictive licensing. GPL is restrictive, and so is LGPL.
Yep. And while LGPL is theoretically fine in most situations, a lot of legal teams still run screaming from any license containing "GPL" regardless of the actual content. The Boost license is the most practical choice for allowing the most people to use our code without restriction.
Aug 12 2013
prev sibling next sibling parent reply Jonathan M Davis <jmdavisProg gmx.com> writes:
On Sunday, August 11, 2013 14:43:13 Walter Bright wrote:
 That said, as soon as the D *package* starts to depend on
 non-default-installed libraries, trouble happens. With libcurl, the only
 solution so far seems to be to BUILD OUR OWN LIBCURL binary!
At this point, I'm inclined to think that while it's great for us to have bindings to C libraries and to have user-friendly, D wrappers around them, it's better that they don't end up in Phobos. - Jonathan M Davis
Aug 11 2013
parent Walter Bright <newshound2 digitalmars.com> writes:
On 8/11/2013 3:18 PM, Jonathan M Davis wrote:
 On Sunday, August 11, 2013 14:43:13 Walter Bright wrote:
 That said, as soon as the D *package* starts to depend on
 non-default-installed libraries, trouble happens. With libcurl, the only
 solution so far seems to be to BUILD OUR OWN LIBCURL binary!
At this point, I'm inclined to think that while it's great for us to have bindings to C libraries and to have user-friendly, D wrappers around them, it's better that they don't end up in Phobos.
My sentiments exactly.
Aug 11 2013
prev sibling next sibling parent reply "bearophile" <bearophileHUGS lycos.com> writes:
Walter Bright:

 as soon as the D *package* starts to depend on 
 non-default-installed libraries, trouble happens. With libcurl, 
 the only solution so far seems to be to BUILD OUR OWN LIBCURL 
 binary!

 http://d.puremagic.com/issues/show_bug.cgi?id=10710

 This is a terrible situation.
For Haskell they release two different kinds of compilers+libraries: one is just a core distribution with the compiler and the standard Haskell modules (including the GMP compiled binaries), and the other contains the compiler with its standard library, plus modules+binaries for the most common libraries.

Python on Windows uses a similar strategy.
 Consider things like the trig functions. D started out by 
 forwarding to the C versions. Unfortunately, the C versions are 
 of spotty, unreliable quality (even today!). Because of that, 
 we've been switching to our own implementations.

 And, consider that using GMP means CTFE would not be supported.
At the moment BigInt doesn't run at compile-time. You could wrap an external fast multi-precision library in Phobos D code that uses __ctfe to switch to a simpler pure D implementation at compile-time.

Is it useful to use BigInts at compile-time? If the answer is very positive then perhaps the D interpreter could be modified to allow calling external numerical libraries even at compile-time.
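The __ctfe split looks like this in practice - here with core.bitop.popcnt standing in for "the fast external routine", since the idea is the same: a plain D branch that the CTFE interpreter can always execute, and whatever is fastest at run time:

    // ctfe_switch.d - the __ctfe idiom: plain D for compile-time evaluation,
    // something faster (here just the core.bitop intrinsic) at run time.
    import core.bitop : popcnt;

    uint bitCount(uint x)
    {
        if (__ctfe)
        {
            // Simple portable loop that CTFE is guaranteed to handle.
            uint n;
            for (; x; x >>= 1)
                n += x & 1;
            return n;
        }
        else
        {
            return popcnt(x);   // run-time path
        }
    }

    enum ct = bitCount(0b1011_0110);   // evaluated at compile time

    void main()
    {
        assert(ct == 5 && bitCount(0b1011_0110) == 5);
    }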
 Note that D was developed with existing backends and linkers.
But isn't optlink being rewritten in C? Perhaps I am just confused, sorry. Bye, bearophile
Aug 11 2013
parent Walter Bright <newshound2 digitalmars.com> writes:
On 8/11/2013 3:40 PM, bearophile wrote:
 For Haskell they release two different kinds of compilers+libraries: one is
just
 a core distribution with the compiler with the standard Haskell modules
 (including the GMP compiled binaries), and the other contains the compiler with
 its standard library, plus modules+binaries for the most common libraries.

 Python on Windows uses a similar strategy.
This is not really a strategy, it addresses none of the issues I raised.
 Is it useful to use BigInts at compile-time? If the answer is very positive
then
 perhaps the D interpreter could be modified to allow calling external numerical
 libraries even at compile-time.
Don keeps extending CTFE to make it work with more stuff, as people find it more and more useful to do things at compile time. I see no reason BigInt should be excluded from that.
 Note that D was developed with existing backends and linkers.
But isn't optlink being rewritten in C? Perhaps I am just confused, sorry.
Optlink was used for D because it already existed, was free, and it worked. You seemed to have the idea that optlink was developed for use with D. Optlink predated D by 12-15 years.
Aug 11 2013
prev sibling parent Sean Kelly <sean invisibleduck.org> writes:
On Aug 11, 2013, at 3:18 PM, Jonathan M Davis <jmdavisProg gmx.com> wrote:

 On Sunday, August 11, 2013 14:43:13 Walter Bright wrote:
 That said, as soon as the D *package* starts to depend on
 non-default-installed libraries, trouble happens. With libcurl, the only
 solution so far seems to be to BUILD OUR OWN LIBCURL binary!

 At this point, I'm inclined to think that while it's great for us to have
 bindings to C libraries and to have user-friendly, D wrappers around them,
 it's better that they don't end up in Phobos.
Once D has a good, well-known package system I think the inclination to make Phobos a "kitchen sink" library may subside somewhat.
Aug 12 2013
prev sibling next sibling parent reply Sean Kelly <sean invisibleduck.org> writes:
On Aug 11, 2013, at 1:36 PM, Walter Bright <newshound2 digitalmars.com> wrote:

 On the subject of friction, I believe we make a mistake by making a
 dependency on libcurl, a library over which we don't have control.

Absolutely. As much as I like libcurl, I was kind of surprised when it was bundled with DMD.
Aug 12 2013
parent reply Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
On 8/12/13 11:08 AM, Sean Kelly wrote:
 On Aug 11, 2013, at 1:36 PM, Walter Bright
 <newshound2 digitalmars.com> wrote:

 On the subject of friction, I believe we make a mistake by making a
 dependency on libcurl, a library over which we don't have control.
Absolutely. As much as I like libcurl, I was kind of surprised when it was bundled with DMD.
On the other hand we instantly got a library that took years in development and testing, that a lot of people are familiar with. C'mon people do I need to spell it out that it's always tradeoffs as opposed to only goods or only bads? Andrei
Aug 12 2013
parent Sean Kelly <sean invisibleduck.org> writes:
On Aug 12, 2013, at 11:26 AM, Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> wrote:

 On 8/12/13 11:08 AM, Sean Kelly wrote:
 On Aug 11, 2013, at 1:36 PM, Walter Bright
 <newshound2 digitalmars.com> wrote:

 On the subject of friction, I believe we make a mistake by making a
 dependency on libcurl, a library over which we don't have control.

 Absolutely. As much as I like libcurl, I was kind of surprised when
 it was bundled with DMD.

 On the other hand we instantly got a library that took years in
 development and testing, that a lot of people are familiar with. C'mon
 people do I need to spell it out that it's always tradeoffs as opposed
 to only goods or only bads?

I don't think anyone was suggesting otherwise.
Aug 12 2013
prev sibling next sibling parent "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Mon, Aug 12, 2013 at 11:08:18AM -0700, Sean Kelly wrote:
 On Aug 11, 2013, at 1:36 PM, Walter Bright <newshound2 digitalmars.com> wrote:
 
 On the subject of friction, I believe we make a mistake by making a
 dependency on libcurl, a library over which we don't have control.
Absolutely. As much as I like libcurl, I was kind of surprised when it was bundled with DMD.
The one saving grace about this situation is that it's possible to build Phobos without libcurl, and still get a working toolchain (just that the stuff that uses libcurl wouldn't work). Or at least, it used to. IIRC with the recent introduction of shared library building for Phobos, this may no longer be true. T -- To provoke is to call someone stupid; to argue is to call each other stupid.
Aug 12 2013
prev sibling next sibling parent reply Sean Kelly <sean invisibleduck.org> writes:
On Aug 12, 2013, at 11:43 AM, "H. S. Teoh" <hsteoh quickfur.ath.cx> wrote:

 On Mon, Aug 12, 2013 at 11:08:18AM -0700, Sean Kelly wrote:
 On Aug 11, 2013, at 1:36 PM, Walter Bright <newshound2 digitalmars.com> wrote:

 On the subject of friction, I believe we make a mistake by making a
 dependency on libcurl, a library over which we don't have control.

 Absolutely. As much as I like libcurl, I was kind of surprised when
 it was bundled with DMD.

 The one saving grace about this situation is that it's possible to build
 Phobos without libcurl, and still get a working toolchain (just that the
 stuff that uses libcurl wouldn't work).

Yep. The current approach still seems kind of confusing though, because to use std.net.curl you have to manually link libcurl, which makes it not feel like an actual part of Phobos. I'd kind of like to see std.net.curl live entirely within etc.curl to clearly indicate that it's a separate package. RDMD should automatically link libcurl as well, if it doesn't already (and I can't recall if it does). I suppose std.zip is in the same boat.
Aug 12 2013
parent Jacob Carlborg <doob me.com> writes:
On 2013-08-12 21:03, Sean Kelly wrote:

 I suppose std.zip is in the same boat.
Not entirely, since we include the source code for libz. -- /Jacob Carlborg
Aug 13 2013
prev sibling parent Jonathan M Davis <jmdavisProg gmx.com> writes:
On Sunday, August 11, 2013 13:36:54 Walter Bright wrote:
 On the subject of friction, I believe we make a mistake by making a
 dependency on libcurl, a library over which we don't have control. Some
 issues:
 
 http://d.puremagic.com/issues/show_bug.cgi?id=10710
 
 http://d.puremagic.com/issues/show_bug.cgi?id=8756
Of course, if having libcurl in Phobos is causing enough problems, that raises the question as to whether we should reverse our decision to include it in Phobos. That would of course create its own set of problems, because it would break existing code, but it would be trivial enough to change such code to import a separate library rather than std.net.curl if we decided that including it in Phobos was causing enough problems that it wasn't worth it and moved std.net.curl to somewhere else outside of Phobos. - Jonathan M Davis
Aug 13 2013
prev sibling parent reply "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Sun, Aug 11, 2013 at 12:11:14AM -0700, Jonathan M Davis wrote:
 On Saturday, August 10, 2013 22:48:14 Walter Bright wrote:
 On 8/10/2013 4:21 PM, Jonathan M Davis wrote:
 Another suggestion that I kind of liked was to just build them all
 with a single script written in D and ditch make entirely, which
 would seriously reduce the amount of duplication across platforms.
 But that's obviously a much bigger change and would likely be much
 more controversial than simply using a more standard make.
I don't see much point in that. The dmd build is straightforward, and I see no particular gain from reinventing that wheel.
Well, make is horrible, and while posix.mak is way better than win32.mak or win64.mak, it's still pretty bad. Personally, I would never use make without something like cmake in front of it. If we were to write up something in D, it could be properly cross-platform (so only one script instead of 3+), and I fully expect that it could be far, far cleaner than what we're forced to do in make.
[...] Maybe my previous post didn't get the idea across clearly, so let me try again. My underlying thrust was: instead of maintaining 3 different makefiles (or more) by hand, have a single source for all of them, and write a small D program to generate posix.mak, win32.mak, win64.mak, whatever, from that source. That way, adding/removing files from the build, etc., involves only editing a single file, and regenerating the makefiles/whatever we use. If there's a problem with a platform-specific makefile, then it's just a matter of fixing the platform-specific output handler in the D program. The way we're currently doing it essentially amounts to the same thing as copy-n-pasting the same piece of code 3 times and trying to maintain all 3 copies separately, instead of writing a template that can be specialized 3 times, thus avoiding boilerplate and maintenance headaches. T -- For every argument for something, there is always an equal and opposite argument against it. Debates don't give answers, only wounded or inflated egos.
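For illustration, a minimal sketch of that single-source idea in D; the source list, compiler names and rule bodies below are placeholders, not the real DMD build rules:

import std.algorithm : map;
import std.array : array, join;
import std.stdio : File;

struct Build
{
    string   target;   // e.g. "dmd" or "dmd.exe"
    string[] sources;  // the one shared list of translation units
    string   compiler; // "g++", "dmc", ...
    string   objExt;   // ".o" or ".obj"
}

void emitMakefile(Build b, string path)
{
    auto f = File(path, "w");
    auto objs = b.sources.map!(s => s ~ b.objExt).array;

    // link rule
    f.writefln("%s: %s", b.target, objs.join(" "));
    f.writefln("\t%s -o %s %s", b.compiler, b.target, objs.join(" "));

    // one compile rule per source file
    foreach (i, src; b.sources)
        f.writefln("%s: %s.c\n\t%s -c %s.c", objs[i], src, b.compiler, src);
}

void main()
{
    auto sources = ["mars", "lexer", "parse"]; // illustrative subset only

    // the same source list drives every platform-specific output
    emitMakefile(Build("dmd",     sources, "g++", ".o"),   "posix.mak");
    emitMakefile(Build("dmd.exe", sources, "dmc", ".obj"), "win32.mak");
}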
Aug 11 2013
parent reply Jacob Carlborg <doob me.com> writes:
On 2013-08-12 00:38, H. S. Teoh wrote:

 Maybe my previous post didn't get the idea across clearly, so let me try
 again. My underlying thrust was: instead of maintaining 3 different
 makefiles (or more) by hand, have a single source for all of them, and
 write a small D program to generate posix.mak, win32.mak, win64.mak,
 whatever, from that source.
If it's written in D it will have the same bootstrap problem. But perhaps that's ok since we're moving DMD to D anyway. -- /Jacob Carlborg
Aug 13 2013
next sibling parent "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Tue, Aug 13, 2013 at 09:03:12AM +0200, Jacob Carlborg wrote:
 On 2013-08-12 00:38, H. S. Teoh wrote:
 
Maybe my previous post didn't get the idea across clearly, so let me
try again. My underlying thrust was: instead of maintaining 3
different makefiles (or more) by hand, have a single source for all
of them, and write a small D program to generate posix.mak,
win32.mak, win64.mak, whatever, from that source.
If it's written in D it will have the same bootstrap problem. But perhaps that's ok since we're moving DMD to D anyway.
[...] Well, yes, we're moving DMD to D anyway, so we're going to face the bootstrap problem regardless. But that's not really what I'm getting at. My whole point was to have a "single source of truth" for how to build what needs to be built, instead of scattering it across 3+ makefiles that need to be maintained separately. T -- Never wrestle a pig. You both get covered in mud, and the pig likes it.
Aug 13 2013
prev sibling parent reply Nick Sabalausky <SeeWebsiteToContactMe semitwist.com> writes:
On Tue, 13 Aug 2013 09:03:12 +0200
Jacob Carlborg <doob me.com> wrote:

 On 2013-08-12 00:38, H. S. Teoh wrote:
 
 Maybe my previous post didn't get the idea across clearly, so let
 me try again. My underlying thrust was: instead of maintaining 3
 different makefiles (or more) by hand, have a single source for all
 of them, and write a small D program to generate posix.mak,
 win32.mak, win64.mak, whatever, from that source.
If it's written in D it will have the same bootstrap problem.
Sort of, but...no, not really. Since this tool would be capable of generating any platform-specific makefile or script or whatever, and there's no reason to restrict it to *only* generate a makefile/script for the current platform, that means it can function much like a cross-compiler: Suppose there's some computer DMD isn't installed on. Maybe it's even a new platform that DMD hasn't been ported to. H.S. Teoh's tool could be run on *any existing* D-capable system to generate the makefile/script for the intended target computer. Maybe that might even require adding a new shell/makefile output to the tool, but it would *not* require running H.S. Teoh's tool (or anything else) on the actual intended target platform. Then, that makefile/script which was generated on...windows or whatever...is then transferred (email, ftp, floppy, whatever) to the new system and DONE - a working buildscript, ready to attempt compiling DMD, without *anything* having been run yet.
 But perhaps that's ok since we're moving DMD to D anyway.
 
Aug 13 2013
parent reply Jacob Carlborg <doob me.com> writes:
On 2013-08-13 19:37, Nick Sabalausky wrote:

 Sort of, but...no, not really.

 Since this tool would be capable of generating any platform-specific
 makefile or script or whatever, and there's no reason to restrict it
 to *only* generate a makefile/script for the current platform, that
 means it can function much like a cross-compiler:
Well if you output a build file for a different tool that's completely different. -- /Jacob Carlborg
Aug 13 2013
parent reply Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
On 8/13/13 10:39 AM, Jacob Carlborg wrote:
 On 2013-08-13 19:37, Nick Sabalausky wrote:

 Sort of, but...no, not really.

 Since this tool would be capable of generating any platform-specific
 makefile or script or whatever, and there's no reason to restrict it
 to *only* generate a makefile/script for the current platform, that
 means it can function much like a cross-compiler:
Well if you output a build file for a different tool that's completely different.
And more complicated. To quote myself (and fix a typo): The margins involved are small enough to make it difficult for the solution to not become worse than the problem. The sheer fact that this thread has been going on for so long without a slam-dunk solution emerging is quite telling. Andrei
Aug 13 2013
parent reply Nick Sabalausky <SeeWebsiteToContactMe semitwist.com> writes:
On Tue, 13 Aug 2013 10:50:28 -0700
Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> wrote:

 On 8/13/13 10:39 AM, Jacob Carlborg wrote:
 Well if you output a build file for a different tool that's
 completely different.
And more complicated. To quote myself (and fix a typo): The margins involved are small enough to make it difficult for the solution to not become worse than the problem. The sheer fact that this thread has been going on for so long without a slam-dunk solution emerging is quite telling.
I think it's more telling of the fact that programmers tend not to agree on things ;) (...Not entirely joking, either.)
Aug 13 2013
parent reply Nick Sabalausky <SeeWebsiteToContactMe semitwist.com> writes:
On Tue, 13 Aug 2013 14:17:55 -0400
Nick Sabalausky <SeeWebsiteToContactMe semitwist.com> wrote:

 On Tue, 13 Aug 2013 10:50:28 -0700
 Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> wrote:
 
 On 8/13/13 10:39 AM, Jacob Carlborg wrote:
 Well if you output a build file for a different tool that's
 completely different.
And more complicated. To quote myself (and fix a typo): The margins involved are small enough to make it difficult for the solution to not become worse than the problem. The sheer fact that this thread has been going on for so long without a slam-dunk solution emerging is quite telling.
I think it's more telling of the fact that programmers tend not to agree on things ;) (...Not entirely joking, either.)
But more seriously though, I think a fairly clear solution *has* emerged: A D-based tool. Everything else, including maintaining the status quo, has pretty much been ruled out by the majority for one reason or another - or at the very least has had serious objections raised. But there haven't been any serious objections to the idea of a D-based tool - just some productive discussion of what form it should take. Besides, a solution doesn't need to be "a slam-dunk" to be worthwhile.
Aug 13 2013
parent "Dicebot" <public dicebot.lv> writes:
Well, there has been announcement of D-based build system (bub) 
recently, maybe it can be used for some dogfooding? :)
Aug 13 2013
prev sibling parent reply "Elie Morisse" <syniurge gmail.com> writes:
Sorry if I missed the point, but wouldn't yet another build 
system be rewriting the wheel in D?

CMake allows you to do a lot more than compiling, all in a cross-platform 
way, and is very fast when coupled with Ninja instead of Make.

Even though D is nicer than the CMake language wouldn't it take 
quite a lot of work to redo its features in D, as well as the IDE 
support? (CMake is extremely well integrated in KDevelop for 
example)
Aug 12 2013
parent "Elie Morisse" <syniurge gmail.com> writes:
Whoops, disregard that; I thought Jonathan was talking about a new 
build system, not just one for DMD.

On Tuesday, 13 August 2013 at 01:27:28 UTC, Elie Morisse wrote:
 Sorry if I missed the point, but wouldn't yet another build 
 system be rewriting the wheel in D?

 CMake allows you to do a lot more than compiling, all in a cross-platform 
 way, and is very fast when coupled with Ninja instead of Make.

 Even though D is nicer than the CMake language wouldn't it take 
 quite a lot of work to redo its features in D, as well as the 
 IDE support? (CMake is extremely well integrated in KDevelop 
 for example)
Aug 12 2013
prev sibling next sibling parent reply "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Sat, Aug 10, 2013 at 04:21:45PM -0700, Jonathan M Davis wrote:
 On Saturday, August 10, 2013 14:35:04 Nick Sabalausky wrote:
 Is this something that would be acceptable, or does building DMD for
 Windows need to stay as DM make?
I don't see any problem with it, but that doesn't mean that Walter won't. Another suggestion that I kind of liked was to just build them all with a single script written in D and ditch make entirely, which would seriously reduce the amount of duplication across platforms. But that's obviously a much bigger change and would likely be much more controversial than simply using a more standard make.
[...] I'm all for ditching make. What about this: - We write a small D app that automatically scans all dependencies and generates a shell script / .BAT file / whatever the target platform uses, that contains compile commands that builds DMD and a make replacement written in D. This is for bootstrapping. - The make replacement written in D can then be used to rebuild DMD, build druntime, Phobos, etc.. The first step is what makes this all work, 'cos you'll need to already have a working D compiler before step 2 is usable. (Either that, or ship binaries, but then you'll get people complaining about their platform of choice not being supported, the binaries being incompatible with their quirky installation of system libraries, etc..) Once DMD is built, we can junk the script / .BAT file and use the D make-replacement from then on. T -- People tell me that I'm skeptical, but I don't believe it.
Aug 10 2013
parent Nick Sabalausky <SeeWebsiteToContactMe semitwist.com> writes:
On Sat, 10 Aug 2013 17:14:35 -0700
"H. S. Teoh" <hsteoh quickfur.ath.cx> wrote:
 
 I'm all for ditching make.  What about this:
 
 - We write a small D app that automatically scans all dependencies and
   generates a shell script / .BAT file / whatever the target platform
   uses, that contains compile commands that builds DMD and a make
   replacement written in D. This is for bootstrapping.
 
 - The make replacement written in D can then be used to rebuild DMD,
   build druntime, Phobos, etc..
 
That's a very interesting idea. Couple thoughts: - It sounds a lot like RDMD, just with shell-script output added (actually, I think RDMD can already generate makefiles). So I'm wondering how much of RDMD could be leveraged for this. But maybe not much since RDMD is specially-tailored to scanning D-based projects, not C/C++ ones like DMD (but druntime and phobos OTOH...). I guess the tool you're talking about would be specifically-designed to scan DMD's sources. Or is there some existing C/C++ tool we could/should leverage? - What happens when DMD starts using D-language sources? The generated bootstrapper shell scripts would no longer be able to compile DMD because DMD hasn't yet been built. So it wouldn't solve *that* issue, but I suppose we can just shell-bootstrap the last pure-C/C++ DMD as an "origin" DMD and use that to compile the next DMD (or the most recent one it's capable of compiling), and then use that to compile the next-in-line DMD, and so on up to whatever's the latest. (Which is probably what would have to happen *anyway* even without the tool you suggest.) Sounds like that should work, and fairly well, too (as long as DMD's D-based sources are careful to be compilable with *sufficiently* older versions of itself). And it eliminates any point in bothering to make any big improvements on the makefiles.
Aug 10 2013
prev sibling next sibling parent reply Jonathan M Davis <jmdavisProg gmx.com> writes:
On Saturday, August 10, 2013 17:14:35 H. S. Teoh wrote:
 On Sat, Aug 10, 2013 at 04:21:45PM -0700, Jonathan M Davis wrote:
 On Saturday, August 10, 2013 14:35:04 Nick Sabalausky wrote:
 Is this something that would be acceptable, or does building DMD for
 Windows need to stay as DM make?
I don't see any problem with it, but that doesn't mean that Walter won't. Another suggestion that I kind of liked was to just build them all with a single script written in D and ditch make entirely, which would seriously reduce the amount of duplication across platforms. But that's obviously a much bigger change and would likely be much more controversial than simply using a more standard make.
[...] I'm all for ditching make. What about this: - We write a small D app that automatically scans all dependencies and generates a shell script / .BAT file / whatever the target platform uses, that contains compile commands that builds DMD and a make replacement written in D. This is for bootstrapping. - The make replacement written in D can then be used to rebuild DMD, build druntime, Phobos, etc.. The first step is what makes this all work, 'cos you'll need to already have a working D compiler before step 2 is usable. (Either that, or ship binaries, but then you'll get people complaining about their platform of choice not being supported, the binaries being incompatible with their quirky installation of system libraries, etc..) Once DMD is built, we can junk the script / .BAT file and use the D make-replacement from then on.
Since we're going to be porting dmd to D, which will force you to have a D compiler to build the D compiler anyway, I don't see any reason to jump through hoops to make anything bootstrappable. In the long run, you're going to have to either start with a dmd which compiled with C++ or cross-compile it from a machine which already has dmd, which is exactly the same boat that C/C++ are in. It's just that they've been around a lot longer, are supported on more platforms, and don't change as much. - Jonathan M Davis
Aug 10 2013
parent Nick Sabalausky <SeeWebsiteToContactMe semitwist.com> writes:
On Sat, 10 Aug 2013 17:23:08 -0700
Jonathan M Davis <jmdavisProg gmx.com> wrote:

 On Saturday, August 10, 2013 17:14:35 H. S. Teoh wrote:
 On Sat, Aug 10, 2013 at 04:21:45PM -0700, Jonathan M Davis wrote:
 On Saturday, August 10, 2013 14:35:04 Nick Sabalausky wrote:
 Is this something that would be acceptable, or does building
 DMD for Windows need to stay as DM make?
I don't see any problem with it, but that doesn't mean that Walter won't. Another suggestion that I kind of liked was to just build them all with a single script written in D and ditch make entirely, which would seriously reduce the amount of duplication across platforms. But that's obviously a much bigger change and would likely be much more controversial than simply using a more standard make.
[...] I'm all for ditching make. What about this: - We write a small D app that automatically scans all dependencies and generates a shell script / .BAT file / whatever the target platform uses, that contains compile commands that builds DMD and a make replacement written in D. This is for bootstrapping. - The make replacement written in D can then be used to rebuild DMD, build druntime, Phobos, etc.. The first step is what makes this all work, 'cos you'll need to already have a working D compiler before step 2 is usable. (Either that, or ship binaries, but then you'll get people complaining about their platform of choice not being supported, the binaries being incompatible with their quirky installation of system libraries, etc..) Once DMD is built, we can junk the script / .BAT file and use the D make-replacement from then on.
Since we're going to be porting dmd to D, which will force you to have a D compiler to build the D compiler anyway, I don't see any reason to jump through hoops to make anything bootstrappable. In the long run, you're going to have to either start with a dmd which compiled with C++ or cross-compile it from a machine which already has dmd, which is exactly the same boat that C/C++ are in. It's just that they've been around a lot longer, are supported on more platforms, and don't change as much.
A fair point, although cross-compiling can be a real pain. But maybe that's not too big of an issue: after all, how often would it need to be done? And we could always still resort to implementing H. S. Teoh's idea if that became necessary. I guess the big question is (largely posed to D's leaders, top dogs and OS-package builders): Have we reached a point where we would be comfortable ditching the makefiles, given a sufficiently well-written D-based alternative?
Aug 10 2013
prev sibling next sibling parent "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Sun, Aug 11, 2013 at 09:26:11AM +0100, Russel Winder wrote:
 On Sat, 2013-08-10 at 14:27 -0400, Nick Sabalausky wrote:
 […]
 is discovering and dealing with all the fun little differences
 between the posix and win32 makefiles (and now we have some win64
 makefiles as well).
[…] Isn't this sort of problem solved by using SCons, Waf or (if you really have to) CMake?
[...] +1. But people around here seem to have a beef against anything that isn't make. *shrug* T -- If you think you are too small to make a difference, try sleeping in a closed room with a mosquito. -- Jan van Steenbergen
Aug 11 2013
prev sibling next sibling parent Jonathan M Davis <jmdavisProg gmx.com> writes:
On Sunday, August 11, 2013 15:38:09 H. S. Teoh wrote:
 Maybe my previous post didn't get the idea across clearly, so let me try
 again. My underlying thrust was: instead of maintaining 3 different
 makefiles (or more) by hand, have a single source for all of them, and
 write a small D program to generate posix.mak, win32.mak, win64.mak,
 whatever, from that source.
 
 That way, adding/removing files from the build, etc., involves only
 editing a single file, and regenerating the makefiles/whatever we use.
 If there's a problem with a platform-specific makefile, then it's just a
 matter of fixing the platform-specific output handler in the D program.
 
 The way we're currently doing it essentially amounts to the same thing
 as copy-n-pasting the same piece of code 3 times and trying to maintain
 all 3 copies separately, instead of writing a template that can be
 specialized 3 times, thus avoiding boilerplate and maintenance
 headaches.
But if you're going that far, why not just do the whole thing with D and ditch make entirely? If it's to avoid bootstrapping issues, we're going to have those anyway once we move the compiler to D (which is already well underway), so that really isn't going to matter. - Jonathan M Davis
Aug 11 2013
prev sibling next sibling parent "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Sun, Aug 11, 2013 at 06:14:18PM -0700, Jonathan M Davis wrote:
 On Sunday, August 11, 2013 15:38:09 H. S. Teoh wrote:
 Maybe my previous post didn't get the idea across clearly, so let me
 try again. My underlying thrust was: instead of maintaining 3
 different makefiles (or more) by hand, have a single source for all
 of them, and write a small D program to generate posix.mak,
 win32.mak, win64.mak, whatever, from that source.
 
 That way, adding/removing files from the build, etc., involves only
 editing a single file, and regenerating the makefiles/whatever we
 use.  If there's a problem with a platform-specific makefile, then
 it's just a matter of fixing the platform-specific output handler in
 the D program.
 
 The way we're currently doing it essentially amounts to the same
 thing as copy-n-pasting the same piece of code 3 times and trying to
 maintain all 3 copies separately, instead of writing a template that
 can be specialized 3 times, thus avoiding boilerplate and
 maintenance headaches.
But if you're going that far, why not just do the whole thing with D and ditch make entirely? If it's to avoid bootstrapping issues, we're going to have those anyway once we move the compiler to D (which is well on is well underway), so that really isn't going to matter.
[...] If you like, think of it this way: the build tool will be written in D, with the option of generating scripts in legacy formats like makefiles or shell scripts so that it can be bootstrapped by whoever needs it to. We pay zero cost for this because the source document is the input format for the D tool, and the D tool takes care of producing the right sequence of commands. There is only one place to update when new files need to be added or old files removed -- or, if we integrate it with rdmd fully, even this may not be necessary. When somebody asks for a makefile, we just run the program with --generate=makefile. When somebody asks for a shell script, we just run it with --generate=shellscript. The generated makefiles/shell scripts are guaranteed to be consistent with the current state of the code, which is the whole point behind this exercise. T -- Designer clothes: how to cover less by paying more.
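The command-line dispatch itself would be tiny; a hedged sketch (the flag name and the messages are only illustrative, the real emit functions are not shown):

import std.getopt : getopt;
import std.stdio : writeln;

void main(string[] args)
{
    string generate; // empty: just run the build in-process
    getopt(args, "generate", &generate);

    switch (generate)
    {
        case "makefile":    writeln("emit posix.mak / win32.mak here");    break;
        case "shellscript": writeln("emit a bootstrap build.sh/.bat here"); break;
        default:            writeln("run the build directly here");         break;
    }
}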
Aug 11 2013
prev sibling next sibling parent Jonathan M Davis <jmdavisProg gmx.com> writes:
On Sunday, August 11, 2013 20:07:17 H. S. Teoh wrote:
 On Sun, Aug 11, 2013 at 06:14:18PM -0700, Jonathan M Davis wrote:
 On Sunday, August 11, 2013 15:38:09 H. S. Teoh wrote:
 Maybe my previous post didn't get the idea across clearly, so let me
 try again. My underlying thrust was: instead of maintaining 3
 different makefiles (or more) by hand, have a single source for all
 of them, and write a small D program to generate posix.mak,
 win32.mak, win64.mak, whatever, from that source.
 
 That way, adding/removing files from the build, etc., involves only
 editing a single file, and regenerating the makefiles/whatever we
 use.  If there's a problem with a platform-specific makefile, then
 it's just a matter of fixing the platform-specific output handler in
 the D program.
 
 The way we're currently doing it essentially amounts to the same
 thing as copy-n-pasting the same piece of code 3 times and trying to
 maintain all 3 copies separately, instead of writing a template that
 can be specialized 3 times, thus avoiding boilerplate and
 maintenance headaches.
But if you're going that far, why not just do the whole thing with D and ditch make entirely? If it's to avoid bootstrapping issues, we're going to have those anyway once we move the compiler to D (which is well on is well underway), so that really isn't going to matter.
[...] If you like, think of it this way: the build tool will be written in D, with the option of generating scripts in legacy formats like makefiles or shell scripts so that it can be bootstrapped by whoever needs it to. We pay zero cost for this because the source document is the input format for the D tool, and the D tool takes care of producing the right sequence of commands. There is only one place to update when new files need to be added or old files removed -- or, if we integrate it with rdmd fully, even this may not be necessary. When somebody asks for a makefile, we just run the program with --generate=makefile. When somebody asks for a shell script, we just run it with --generate=shellscript. The generated makefiles/shell scripts are guaranteed to be consistent with the current state of the code, which is the whole point behind this exercise.
But what is the point of the makefile? As far as I can see, it gains you nothing. It doesn't help bootstrapping at all, because dmd itself will soon be written in D and therefore require that a D compiler already be installed. And the cost definitely isn't zero, because it requires extra code to be able to generate a makefile on top of doing the build purely with the D script (and I fully expect that doing the build with the D script will be simpler than generating the makefile would be). I see no benefit whatsoever in generating makefiles from a D script over simply doing the whole build with the D script. There would be an argument for it if dmd itself were going to stay in C++, because then you could avoid a circular dependency, but dmd is being converted to D, and so we're going to have that circular dependency anyway, negating that argument. - Jonathan M Davis
Aug 11 2013
prev sibling next sibling parent Russel Winder <russel winder.org.uk> writes:
On Sun, 2013-08-11 at 15:41 -0700, H. S. Teoh wrote:
 On Sun, Aug 11, 2013 at 09:26:11AM +0100, Russel Winder wrote:
 On Sat, 2013-08-10 at 14:27 -0400, Nick Sabalausky wrote:
 […]
 is discovering and dealing with all the fun little differences
 between the posix and win32 makefiles (and now we have some win64
 makefiles as well).
[…]

 Isn't this sort of problem solved by using SCons, Waf or (if you
 really have to) CMake?

[...]

 +1. But people around here seem to have a beef against anything that
 isn't make. *shrug*

Make was a revolution and a revelation in 1977, it changed my life. However, it is sad to see projects such as Rust, Julia and D clinging to a 35-year-old build concept when it has been proved time and time again that external DSL frameworks for build do not work for cross-platform working. Only internal DSL build frameworks have succeeded in that arena, cf. Gradle, SBT, SCons, Waf, …

The only part of this thread that has any upside at all is to ditch all build frameworks and write the build in D over the bootstrap D that will be essential for the D build since D is written in D. It's a pity Rust hasn't twigged to this.

I note that the Go tooling is written in C and Go; they ditched make when they realized their vision for packaging – which works very well indeed, particularly pulling in source packages from GitHub, BitBucket and Launchpad, compiling and installing the compiled package into the appropriate place for use.

On the other hand, I bet a cross-platform SCons build of D could be in place and in production within days, as opposed to the <substitute-your-favourite-long-time> that a D rewrite in D will take. It doesn't matter that the SCons build may be thrown away down the line; it solves a problem now for not that much effort.

Still, if the core D community are clinging to "build == make", then they will have to suffer the irritant of having to have a separate build system for each and every platform. That's the way Make is.

--
Russel.
=============================================================================
Dr Russel Winder      t: +44 20 7585 2200   voip: sip:russel.winder ekiga.net
41 Buckmaster Road    m: +44 7770 465 077   xmpp: russel winder.org.uk
London SW11 1EN, UK   w: www.russel.org.uk  skype: russel_winder
Aug 12 2013
prev sibling next sibling parent reply "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Mon, Aug 12, 2013 at 11:16:19AM +0100, Russel Winder wrote:
 On Sun, 2013-08-11 at 15:41 -0700, H. S. Teoh wrote:
 On Sun, Aug 11, 2013 at 09:26:11AM +0100, Russel Winder wrote:
 On Sat, 2013-08-10 at 14:27 -0400, Nick Sabalausky wrote:
 […]
 is discovering and dealing with all the fun little differences
 between the posix and win32 makefiles (and now we have some
 win64 makefiles as well).
[…] Isn't this sort of problem solved by using SCons, Waf or (if you really have to) CMake?
[...] +1. But people around here seem to have a beef against anything that isn't make. *shrug*
Make was a revolution and a revelation in 1977, it changed my life. However, it is sad to see projects such as Rust, Julia and D clinging to a 35 year old build concept when it has been proved time and time again that external DSL frameworks for build do not work for cross-platform working. Only internal DSL build frameworks have succeeded in that arena, cf. Gradle, SBT, SCons, Waf,…
+1. If I were the one making the decisions, I'd go for SCons. Or tup (http://gittup.org/tup/), but tup seems to be currently posix-specific, so SCons still wins if you want cross-platform building.
 The only part of this thread that has any up side at all is to ditch
 all build frameworks and write the build in D over the bootstrap D
 that will be essential for the D build since D is written in D. It's a
 pity Rust hasn't twigged to this.
I think the D build tool should extend / be built on top of rdmd to be able to handle non-D sources. Once we have that, we basically already have a working build system.
 I note that the Go tooling is written is C and Go, they ditched make
 when they realized their vision for packaging – which works very well
 indeed, particularly pulling in source packages from GitHub, BitBucket
 and Launchpad, compiling and installing the compiled package into the
 appropriate place for use.
I ditched make about a decade ago, and I would never go back if I had the choice. Sadly, most of the rest of the world still seems stuck in that quagmire, unable to move on.
 On the other hand, I bet a cross-platform SCons build of D could be in
 place and production within days as opposed to the
 <substitute-your-favourite-long-time> that a D rewrite in D will take.
 It doesn't matter than the SCons build may be thrown away down the
 line, it solves a problem now for not that much effort.
What do you say? Let's throw together an SConstruct for DMD, druntime, and phobos, and submit a pull for it? The only downside is that I can predict people will start complaining about the Python dependency. (Which is why I proposed writing a build system in D -- it will be superior to make (anything would be!), and people will have no excuse about what language it's written in.)
 Still if the core D community are clinging to "build == make", then
 they will have to suffer the irritant of having to have a separate
 build system for each and every platform. That's they way Make is.
[...] I used to evangelize SCons to everybody I meet... but after people adamantly refused to abandon their precious outdated crappy makefiles, I gave up. If they wish to continue suffering, it's not really my business to stop them. T -- WINDOWS = Will Install Needless Data On Whole System -- CompuMan
Aug 12 2013
next sibling parent reply "Wyatt" <wyatt.epp gmail.com> writes:
On Monday, 12 August 2013 at 16:29:36 UTC, H. S. Teoh wrote:
 What do you say? Let's throw together an SConstruct for DMD, 
 druntime,and phobos, and submit a pull for it?
I don't care as long as you're willing to maintain whatever you choose. But SCons? Granted it's been a few years since I deigned to look at it, but it's historically caused a lot of packaging headaches.
 The only downside is that I can predict people will start 
 complaining
 about the Python dependency. (Which is why I proposed writing a 
 build
 system in D -- it will be superior to make (anything would 
 be!), and
 people will have no excuse about what language it's written in.)
Do you plan on bundling it or are you expecting people to install it? Does it properly handle library search yet or is it still using naive name-only lookup? How about environment variables like CC and PATH? Has it become resilient against python version changes completely breaking it? etc. Despite how onerous they are, Autotools and CMake have a better track record that I've seen. -Wyatt
Aug 12 2013
parent reply Russel Winder <russel winder.org.uk> writes:
On Mon, 2013-08-12 at 19:48 +0200, Wyatt wrote:
[…]

 I don't care as long as you're willing to maintain whatever you
 choose.  But SCons?  Granted it's been a few years since I
 deigned to look at it, but it's historically caused a lot of
 packaging headaches.

And Make hasn't ;-)

[…]

 Do you plan on bundling it or are you expecting people to install
 it?  Does it properly handle library search yet or is it still
 using naive name-only lookup? How about environment variables
 like CC and PATH?  Has it become resilient against python version
 changes completely breaking it?  etc.

This is where Waf has a benefit. SCons can put the build system with the project leaving only a Python dependency, but Waf is built for this mode of working.

 Despite how onerous they are, Autotools and CMake have a better
 track record that I've seen.

The QtD CMake reminds me of why I gave up on CMake and switched wholeheartedly to SCons (and sometimes Waf) for native code builds. Autotools was a magnificent piece of macro hacking over Make, but CMake is better. It's just that compared to SCons, at least for me, CMake is fourth division.

SCons is not perfect; it has many problems, the biggest of which is no resources for development.

--
Russel.
=============================================================================
Dr Russel Winder      t: +44 20 7585 2200   voip: sip:russel.winder ekiga.net
41 Buckmaster Road    m: +44 7770 465 077   xmpp: russel winder.org.uk
London SW11 1EN, UK   w: www.russel.org.uk  skype: russel_winder
Aug 12 2013
next sibling parent reply Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
On 8/12/13 11:06 AM, Russel Winder wrote:
 SCons is not perfect; it has many problems, the biggest of which is no
 resources for development.
I'm hanging a general comment here for a lack of a better place. We're far from being enamored to make and we have no vested interest in keeping it. At the same time its place in the dmd foodchain is relatively modest (i.e. it's not a big hindrance to most developers) and replacing it with even the perfect tool is unlikely to make our lives significantly better. Worse, there seems to be no obvious replacement for make - each seems to come with its own issues as you exemplify above for SCons - which further undermines motivation. Yes, there is duplication across posix.mak and winxx.mak. Inside winxx.mak there is yet another level of annoying duplication. But we don't work on those files frequently enough for all that to be a large problem. That being said, yes, I wish that all got improved. But the margins involved are small enough to make it difficult for the solution to become worse than the problem. Andrei
Aug 12 2013
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 8/12/2013 11:13 AM, Andrei Alexandrescu wrote:
 I'm hanging a general comment here for a lack of a better place.

 We're far from being enamored to make and we have no vested interest in keeping
 it. At the same time its place in the dmd foodchain is relatively modest (i.e.
 it's not a big hindrance to most developers) and replacing it with even the
 perfect tool is unlikely to make our lives significantly better. Worse, there
 seems to be no obvious replacement for make - each seems to come with its own
 issues as you exemplify above for SCons - which further undermines motivation.

 Yes, there is duplication across posix.mak and winxx.mak. Inside winxx.mak there
 is yet another level of annoying duplication. But we don't work on those files
 frequently enough for all that to be a large problem. That being said, yes, I
 wish that all got improved. But the margins involved are small enough to make it
 difficult for the solution to become worse than the problem.
Exactly. There's a matter of proportion. We don't need to use a cannon (and all the support a cannon needs) to kill a cockroach. For example: building in a Python dependency just so a user can compile dmd? This is seriously out of place, besides a giant WTF telling anyone who wants to install dmd on Windows that he has to go find Python and install that, too?
Aug 12 2013
next sibling parent reply "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Mon, Aug 12, 2013 at 11:34:42AM -0700, Walter Bright wrote:
 On 8/12/2013 11:13 AM, Andrei Alexandrescu wrote:
I'm hanging a general comment here for a lack of a better place.

We're far from being enamored to make and we have no vested interest
in keeping it. At the same time its place in the dmd foodchain is
relatively modest (i.e.  it's not a big hindrance to most developers)
and replacing it with even the perfect tool is unlikely to make our
lives significantly better. Worse, there seems to be no obvious
replacement for make - each seems to comes with its own issues as you
exemplify above for SCons - which further undermines motivation.

Yes, there is duplication across posix.mak and winxx.mak. Inside
winxx.mak there is yet another level of annoying duplication. But we
don't work on those files frequently enough for all that to be a
large problem. That being said, yes, I wish that all got improved.
But the margins involved are small enough to make it difficult for
the solution to become worse than the problem.
Exactly. There's a matter of proportion. We don't need to use a cannon (and all the support a cannon needs) to kill a cockroach.
But you're missing the bigger picture. What I envision is that this D build tool will go beyond merely building DMD/druntime/Phobos. If it's successful, it can become the *standard* D build tool for all D programs. Having a standard D build tool will go a long way in making D programs portable and easy to install, besides freeing us from a dependency on make.
 For example: building in a Python dependency just so a user can
 compile dmd? This is seriously out of place, besides a giant WTF
 telling anyone who wants to install dmd on Windows that he has to go
 find Python and install that, too?
Which is why I proposed writing the build system in D. Ideally, build scripts would themselves be D programs... dogfooding ftw. :) Objectively speaking, though, this is no different from being required to install make in order to compile dmd. You still have to go out of the way to install a 3rd party program before you can build dmd. The only difference is that make tends to be preinstalled in more systems than python (though nowadays most Linux distros come with python by default, so this factor is becoming much less out of place than you're suggesting). T -- Curiosity kills the cat. Moral: don't be the cat.
Aug 12 2013
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 8/12/2013 3:48 PM, H. S. Teoh wrote:
 Objectively speaking, though, this is no different from being required
 to install make in order to compile dmd. You still have to go out of the
 way to install a 3rd party program before you can build dmd. The only
 difference is that make tends to be preinstalled in more systems than
 python (though nowadays most Linux distros come with python by default,
 so this factor is becoming much less out of place than you're
 suggesting).
1. Python is not preinstalled on Windows. So then the question is, which Python should the user install? What happens if the user doesn't like that version? What if it conflicts with his other Python he's got installed? What if the Python version is different? What if Python releases an upgrade, and our build system needs to be adjusted to account for that? 2. Make comes with g++. g++ is used to build dmd, so if g++ is installed, so is make. 3. Make doesn't come preinstalled on Windows. But we have a make we can throw in the bin directory without issues. It's only 50K. Nobody goes out of their way - it's there on the same path as dmd. It's always the right version of make to use with our makefiles. We're not going to get into the Python distribution business. 4. Having Python as a prerequisite for using D just paints the wrong image for D. 5. Do we really want D to be restricted to only platforms that have the latest Python up on them? So no, I do NOT at all regard the issue of requiring Python to be remotely equivalent to requiring Make. We did have a problem on FreeBSD because the default make on it would not work with posix.mak. gmake had to be explicitly installed. The only saving grace there is that very few people use FreeBSD, and those that do, tend to be pretty handy with installing gmake.
Aug 12 2013
next sibling parent reply "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Mon, Aug 12, 2013 at 04:18:12PM -0700, Walter Bright wrote:
 On 8/12/2013 3:48 PM, H. S. Teoh wrote:
Objectively speaking, though, this is no different from being
required to install make in order to compile dmd. You still have to
go out of the way to install a 3rd party program before you can build
dmd. The only difference is that make tends to be preinstalled in
more systems than python (though nowadays most Linux distros come
with python by default, so this factor is becoming much less out of
place than you're suggesting).
1. Python is not preinstalled on Windows. So then the question is, which Python should the user install? What happens if the user doesn't like that version? What if it conflicts with his other Python he's got installed? What if the Python version is different? What if Python releases an upgrade, and our build system needs to be adjusted to account for that? 2. Make comes with g++. g++ is used to build dmd, so if g++ is installed, so is make. 3. Make doesn't come preinstalled on Windows. But we have a make we can throw in the bin directory without issues. It's only 50K. Nobody goes out of their way - it's there on the same path as dmd. It's always the right version of make to use with our makefiles. We're not going to get into the Python distribution business.
But if we bundle a D-based build tool in the bin directory...
 4. Having Python as a prerequisite for using D just paints the wrong image for D.
That I agree with. :)
 5. Do we really want D to be restricted to only platforms that have
 the latest Python up on them?
I think that's a bit of hyperbole. It's the same as saying "do we want D to be restricted to only platforms that have g++/make installed?" This is where a D-based build tool comes in. We ship the prebuilt build tool, then that can be used to build dmd (which can in turn be used to build newer versions of the build tool). That way we are freed from dependence on anything else, esp. once dmd is fully ported over to D.
 So no, I do NOT at all regard the issue of requiring Python to be
 remotely equivalent to requiring Make.
 
 We did have a problem on FreeBSD because the default make on it
 would not work with posix.mak. gmake had to be explicitly installed.
 The only saving grace there is that very few people use FreeBSD, and
 those that do, tend to be pretty handy with installing gmake.
This is the plague of makefiles. They *almost* always work, but when they don't, it's a mess. If we have a D build tool that we ship with the dmd sources, then we will have (1) a superior build system to make, (2) no external dependence on gmake / python / whatever else. Since we're aiming for dmd to be itself written in D, we might as well just self-host the build system too. And this tool can be used to build other D programs in general, not just dmd / druntime / phobos. T -- Truth, Sir, is a cow which will give [skeptics] no more milk, and so they are gone to milk the bull. -- Sam. Johnson
Aug 12 2013
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 8/12/2013 5:19 PM, H. S. Teoh wrote:
 5. Do we really want D to be restricted to only platforms that have
 the latest Python up on them?
I think that's a bit of hyperbole. It's the same as saying "do we want D to be restricted to only platforms that have g++/make installed?"
No, I don't think it's hyperbole. I would bet that a C++ compiler (which invariably come with a make) exists on essentially every platform 32 bits or larger. Python? Doesn't seem likely.
 If we have a D build tool that we ship with the dmd sources, then we
 will have (1) a superior build system to make, (2) no external
 dependence on gmake / python / whatever else. Since we're aiming for dmd
 to be itself written in D, we might as well just self-host the build
 system too. And this tool can be used to build other D programs in
 general, not just dmd / druntime / phobos.
I think you're underestimating the amount of effort needed to build a "proper" build tool, even to get it up to the level of make. For example, gmake has the -j switch, enabling building using all the cores on your CPU. This is a big deal, but I don't think it's so easy to create a bug-free implementation of it. (DM make does not have -j.)
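For comparison, the naive core of a -j style parallel build is short in D (a sketch only; the dependency ordering, job limiting and failure handling that make -j genuinely hard are not shown):

import std.parallelism : parallel;
import std.process : spawnShell, wait;

// Run a set of *independent* compile commands concurrently,
// using one worker per core by default.
void compileAll(string[] commands)
{
    foreach (cmd; parallel(commands))
    {
        auto status = wait(spawnShell(cmd));
        if (status != 0)
            throw new Exception("command failed: " ~ cmd);
    }
}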
Aug 12 2013
next sibling parent Nick Sabalausky <SeeWebsiteToContactMe semitwist.com> writes:
On Mon, 12 Aug 2013 17:44:35 -0700
Walter Bright <newshound2 digitalmars.com> wrote:

 On 8/12/2013 5:19 PM, H. S. Teoh wrote:
 5. Do we really want D to be restricted to only platforms that have
 the latest Python up on them?
I think that's a bit of hyperbole. It's the same as saying "do we want D to be restricted to only platforms that have g++/make installed?"
No, I don't think it's hyperbole. I would bet that a C++ compiler (which invariably come with a make) exists on essentially every platform 32 bits or larger. Python? Doesn't seem likely.
Yea, while python is indeed hugely widespread, it still doesn't compete with C++/make in availability. There are people who deliberately avoid having scripting packages like python on certain systems for various legitimate reasons. Plus (and this is one of my primary beefs with Python), tools written in Python often have a tendency to expect a certain version of Python (or even certain lib dependencies), and it's up to the person *running* the program to make sure the script is given the right installation of Python. Woe is the user who gets it wrong, because they'll just be greeted with a big internal traceback. It's not like Java where you can install whatever runtimes you need and things will pretty much just run themselves on the correct version automatically, or tell you what expected version you're missing. (Not that I'm advocating make, either)
Aug 12 2013
prev sibling parent "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Mon, Aug 12, 2013 at 05:44:35PM -0700, Walter Bright wrote:
 On 8/12/2013 5:19 PM, H. S. Teoh wrote:
5. Do we really want D to be restricted to only platforms that have
the latest Python up on them?
I think that's a bit of hyperbole. It's the same as saying "do we want D to be restricted to only platforms that have g++/make installed?"
No, I don't think it's hyperbole. I would bet that a C++ compiler (which invariably come with a make) exists on essentially every platform 32 bits or larger. Python? Doesn't seem likely.
OK, perhaps the situation on Windows is far different from Linux, then. I haven't used Windows in any significant way for over a decade, so my perception is probably biased.
If we have a D build tool that we ship with the dmd sources, then we
will have (1) a superior build system to make, (2) no external
dependence on gmake / python / whatever else. Since we're aiming for
dmd to be itself written in D, we might as well just self-host the
build system too. And this tool can be used to build other D programs
in general, not just dmd / druntime / phobos.
I think you're underestimating the amount of effort needed to build a "proper" build tool, even to get it up to the level of make. For example, gmake has the -j switch, enabling building using all the cores on your CPU. This is a big deal, but I don't think it's so easy to create a bug-free implementation of it. (DM make does not have -j.)
[...] SCons supports -j out of the box. You don't even need to write your build rules differently, as you must in make to get it to work properly. Of course, that doesn't really say very much about how much effort it took to implement that. But in this day and age, any build tool that can't parallel build *by default* is simply not worth the effort -- might as well go back to make. One interesting approach is in tup (http://gittup.org/tup/), which actually sandboxes each build command executed, so that it can determine the exact inputs/outputs. This leads to nice features like, when you update your system libraries, it will know to relink your program, or when you #include or import stuff, it knows which exactly files the compiler read (instead of trying to guess by scanning file contents, which sometimes gets it wrong), so when those files are modified it knows what to recompile. T -- By understanding a machine-oriented language, the programmer will tend to use a much more efficient method; it is much closer to reality. -- D. Knuth
Aug 12 2013
prev sibling parent reply Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
On 8/12/13 4:18 PM, Walter Bright wrote:
 3. Make doesn't come preinstalled on Windows. But we have a make we can
 throw in the bin directory without issues. It's only 50K. Nobody goes
 out of their way - it's there on the same path as dmd. It's always the
 right version of make to use with our makefiles. We're not going to get
 into the Python distribution business.
...
 We did have a problem on FreeBSD because the default make on it would
 not work with posix.mak. gmake had to be explicitly installed. The only
 saving grace there is that very few people use FreeBSD, and those that
 do, tend to be pretty handy with installing gmake.
From these two paragraphs above it seems that distributing a statically-linked version of gmake instead of the current make would be a possible solution. It is bigger but that shouldn't matter. Then we get to build on all supported OSs with posix.mak. Given that gmake is GNU-licensed, would that be a problem? Andrei
Aug 12 2013
parent Walter Bright <newshound2 digitalmars.com> writes:
On 8/12/2013 5:41 PM, Andrei Alexandrescu wrote:
  From these two paragraphs above it seems that distributing a statically-linked
 version of gmake instead of the current make would be a possible solution. It is
 bigger but that shouldn't matter.

 Then we get to build on all supported OSs with posix.mak.

 Given that gmake is GNU-licensed, would that be a problem?
I responded to that in some detail in my reply to Brad in this thread.
Aug 12 2013
prev sibling next sibling parent "Jonathan M Davis" <jmdavisProg gmx.com> writes:
On Monday, August 12, 2013 15:48:54 H. S. Teoh wrote:
 Which is why I proposed writing the build system in D. Ideally, build
 scripts would themselves be D programs... dogfooding ftw. :)
I would not want to make any attempt to make dmd, druntime, and Phobos build with a "standard D build tool" until it's been totally sorted out elsewhere. It's one thing to write up a script in D which will handle the specific situation of dmd or druntime or Phobos. It's quite another to write a generic build tool. I think that we could write a nice, clean build script for Phobos (or dmd or druntime) in D which specifically handled it without any attempt to generalize it and end up with something much better than make. But leave the standard D build tool stuff up to the likes of dub and orbit until they are ready to be used with dmd, druntime, and Phobos. - Jonathan M Davis
Aug 12 2013
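[For illustration, a minimal sketch of such a non-generic build script in D: rebuild a library only when a source file is newer than it, essentially one hard-coded makefile rule. The file names and flags are hypothetical, not the actual Phobos layout.]

---
// Minimal sketch of a non-generic build script: rebuild the target only
// when some source file is newer, roughly what a single makefile rule does.
// File names and flags are hypothetical.
import std.algorithm : any;
import std.file : exists, timeLastModified;
import std.process : spawnProcess, wait;
import std.stdio : writeln;

void main()
{
    enum target = "phobos.lib";
    auto sources = ["std/algorithm.d", "std/range.d", "std/stdio.d"];

    // Out of date if the target is missing or older than any source.
    bool stale = !exists(target)
        || sources.any!(s => timeLastModified(s) > timeLastModified(target));

    if (stale)
    {
        writeln("rebuilding ", target);
        auto status = wait(spawnProcess(["dmd", "-lib", "-of" ~ target] ~ sources));
        if (status != 0)
            writeln("build failed with status ", status);
    }
    else
        writeln(target, " is up to date");
}
---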
prev sibling parent Joseph Rushton Wakeling <joseph.wakeling webdrake.net> writes:
On 08/13/2013 12:48 AM, H. S. Teoh wrote:
 But you're missing the bigger picture. What I envision is that this D
 build tool will go beyond merely building DMD/druntime/Phobos. If it's
 successful, it can become the *standard* D build tool for all D
 programs. Having a standard D build tool will go a long way in making D
 programs portable and easy to install, besides freeing us from a
 dependency on make.
Can I suggest then that you write such a tool _first and foremost_ as a build system for arbitrary D programs (or preferably, arbitrary programs without reference to the language...)? Then, if it gets uptake, proves a success, etc., we can consider whether it makes sense for the core D stuff.

The thing is that at the end of the day, make may have its problems, but it is a well understood tool that is readily accessible to many developers and across platforms. It also means that the build system is not dependent on one or two D hackers, who might fall under a bus, get a new job, whatever.

I really don't see the benefits of accepting the maintenance burden for a custom build system, with all the potential bottlenecks that introduces, compared to focusing on the things that really matter -- frontend, runtime, and standard library.
Aug 16 2013
prev sibling parent reply Jacob Carlborg <doob me.com> writes:
On 2013-08-12 20:06, Russel Winder wrote:

 This is where Waf has a benefit. SCons can put the build system with the
 project leaving only a Python dependency, but Waf is built for this mode
 of working.
Why don't they embed Python inside SCons, at least as an option?

-- 
/Jacob Carlborg
Aug 13 2013
parent reply Russel Winder <russel winder.org.uk> writes:
On Tue, 2013-08-13 at 09:09 +0200, Jacob Carlborg wrote:
 On 2013-08-12 20:06, Russel Winder wrote:
 This is where Waf has a benefit. SCons can put the build system with the
 project leaving only a Python dependency, but Waf is built for this mode
 of working.

 Why don't they embed Python inside SCons, at least as an option?

SCons is a Python application and so is normally installed in the Python installation. I am sure there could be a Windows installer created, but I guess most Windows people use Visual Studio (*). No package-based system really needs Python shipped with SCons as it is so trivial to install Python and then SCons. OS X comes with Python pre-installed, though it is an old version (**).

The reality here is that most major Fortran, C and C++ places are either politically opposed to Python, or are already using it. The former are incapable of using SCons and the latter would get annoyed by another Python installation when they already have one. Most HPC places are using SciPy, most banks now use Python, most engineering places are using Python.

(*) SCons can create Visual Studio projects.

(**) Anyone using Python earlier than 2.7 is either contractually obliged or nuts. Most sensible folk are already using Python 3.3 or rapidly moving towards it. SCons sadly is not Python 3 compatible. At least not yet, but there are plans.

-- 
Russel.
===========================================================================
Dr Russel Winder      t: +44 20 7585 2200   voip: sip:russel.winder ekiga.net
41 Buckmaster Road    m: +44 7770 465 077   xmpp: russel winder.org.uk
London SW11 1EN, UK   w: www.russel.org.uk  skype: russel_winder
Aug 13 2013
parent Walter Bright <newshound2 digitalmars.com> writes:
On 8/13/2013 3:54 AM, Russel Winder wrote:
 (**) Anyone using Python earlier than 2.7 is either contractually
 obliged or nuts. Most sensible folk are already using Python 3.3 or
 rapidly moving towards it.  SCons sadly is not Python 3 compatible. At
 least not yet, but there are plans.
So SCons is developed by people who are nuts? :-)
Aug 13 2013
prev sibling next sibling parent "Dejan Lekic" <dejan.lekic gmail.com> writes:
On Monday, 12 August 2013 at 16:29:36 UTC, H. S. Teoh wrote:
 On Mon, Aug 12, 2013 at 11:16:19AM +0100, Russel Winder wrote:
 On Sun, 2013-08-11 at 15:41 -0700, H. S. Teoh wrote:
 On Sun, Aug 11, 2013 at 09:26:11AM +0100, Russel Winder 
 wrote:
 On Sat, 2013-08-10 at 14:27 -0400, Nick Sabalausky wrote:
 […]
 is discovering and dealing with all the fun little 
 differences
 between the posix and win32 makefiles (and now we have 
 some
 win64 makefiles as well).
[…] Isn't this sort of problem solved by using SCons, Waf or (if you really have to) CMake?
[...] +1. But people around here seem to have a beef against anything that isn't make. *shrug*
Make was a revolution and a revelation in 1977, it changed my life. However, it is sad to see projects such as Rust, Julia and D clinging to a 35 year old build concept when it has been proved time and time again that external DSL frameworks for build do not work for cross-platform working. Only internal DSL build frameworks have succeeded in that arena, cf. Gradle, SBT, SCons, Waf,…
+1. If I were the one making the decisions, I'd go for SCons. Or tup (http://gittup.org/tup/), but tup seems to be currently posix-specific, so SCons still wins if you want cross-platform building.
 The only part of this thread that has any up side at all is to 
 ditch
 all build frameworks and write the build in D over the 
 bootstrap D
 that will be essential for the D build since D is written in 
 D. It's a
 pity Rust hasn't twigged to this.
I think the D build tool should extend / be built on top of rdmd to be able to handle non-D sources. Once we have that, we basically already have a working build system.
 I note that the Go tooling is written in C and Go, they 
 ditched make
 when they realized their vision for packaging – which works 
 very well
 indeed, particularly pulling in source packages from GitHub, 
 BitBucket
 and Launchpad, compiling and installing the compiled package 
 into the
 appropriate place for use.
I ditched make about a decade ago, and I would never go back if I had the choice. Sadly, most of the rest of the world still seems stuck in that quagmire, unable to move on.
 On the other hand, I bet a cross-platform SCons build of D 
 could be in
 place and production within days as opposed to the
 <substitute-your-favourite-long-time> that a D rewrite in D 
 will take.
 It doesn't matter that the SCons build may be thrown away down 
 the
 line, it solves a problem now for not that much effort.
What do you say? Let's throw together an SConstruct for DMD, druntime, and phobos, and submit a pull for it? The only downside is that I can predict people will start complaining about the Python dependency. (Which is why I proposed writing a build system in D -- it will be superior to make (anything would be!), and people will have no excuse about what language it's written in.)
 Still if the core D community are clinging to "build == make", 
 then
 they will have to suffer the irritant of having to have a 
 separate
 build system for each and every platform. That's the way Make 
 is.
[...] I used to evangelize SCons to everybody I meet... but after people adamantly refused to abandon their precious outdated crappy makefiles, I gave up. If they wish to continue suffering, it's not really my business to stop them. T
Thanks for the link! I found this excellent paper there: http://gittup.org/tup/build_system_rules_and_algorithms.pdf . :)
Aug 13 2013
prev sibling parent "Dejan Lekic" <dejan.lekic gmail.com> writes:
On Monday, 12 August 2013 at 16:29:36 UTC, H. S. Teoh wrote:
 On Mon, Aug 12, 2013 at 11:16:19AM +0100, Russel Winder wrote:
 On Sun, 2013-08-11 at 15:41 -0700, H. S. Teoh wrote:
 On Sun, Aug 11, 2013 at 09:26:11AM +0100, Russel Winder 
 wrote:
 On Sat, 2013-08-10 at 14:27 -0400, Nick Sabalausky wrote:
 […]
 is discovering and dealing with all the fun little 
 differences
 between the posix and win32 makefiles (and now we have 
 some
 win64 makefiles as well).
[…] Isn't this sort of problem solved by using SCons, Waf or (if you really have to) CMake?
[...] +1. But people around here seem to have a beef against anything that isn't make. *shrug*
Make was a revolution and a revelation in 1977, it changed my life. However, it is sad to see projects such as Rust, Julia and D clinging to a 35 year old build concept when it has been proved time and time again that external DSL frameworks for build do not work for cross-platform working. Only internal DSL build frameworks have succeeded in that arena, cf. Gradle, SBT, SCons, Waf,…
+1. If I were the one making the decisions, I'd go for SCons. Or tup (http://gittup.org/tup/), but tup seems to be currently posix-specific, so SCons still wins if you want cross-platform building.
 The only part of this thread that has any up side at all is to 
 ditch
 all build frameworks and write the build in D over the 
 bootstrap D
 that will be essential for the D build since D is written in 
 D. It's a
 pity Rust hasn't twigged to this.
I think the D build tool should extend / be built on top of rdmd to be able to handle non-D sources. Once we have that, we basically already have a working build system.
 I note that the Go tooling is written in C and Go, they 
 ditched make
 when they realized their vision for packaging – which works 
 very well
 indeed, particularly pulling in source packages from GitHub, 
 BitBucket
 and Launchpad, compiling and installing the compiled package 
 into the
 appropriate place for use.
I ditched make about a decade ago, and I would never go back if I had the choice. Sadly, most of the rest of the world still seems stuck in that quagmire, unable to move on.
 On the other hand, I bet a cross-platform SCons build of D 
 could be in
 place and production within days as opposed to the
 <substitute-your-favourite-long-time> that a D rewrite in D 
 will take.
 It doesn't matter that the SCons build may be thrown away down 
 the
 line, it solves a problem now for not that much effort.
What do you say? Let's throw together an SConstruct for DMD, druntime, and phobos, and submit a pull for it? The only downside is that I can predict people will start complaining about the Python dependency. (Which is why I proposed writing a build system in D -- it will be superior to make (anything would be!), and people will have no excuse about what language it's written in.)
 Still if the core D community are clinging to "build == make", 
 then
 they will have to suffer the irritant of having to have a 
 separate
 build system for each and every platform. That's the way Make 
 is.
[...] I used to evangelize SCons to everybody I meet... but after people adamantly refused to abandon their precious outdated crappy makefiles, I gave up. If they wish to continue suffering, it's not really my business to stop them.
I am one of them. In my real life I use Maven most of the time, but for absolutely everything else, including (small/toy) Java projects, I use GNU Make. I know it does not really matter to many people, but honestly, Make exists on EVERY platform I have ever tried. We have OpenVMS servers here, for example. Guess what, make works like a charm there, and everybody knows (more or less) how to use it. :)
Aug 13 2013
prev sibling next sibling parent Jacob Carlborg <doob me.com> writes:
On 2013-08-10 20:35, Nick Sabalausky wrote:
 Although it took longer than I expected to get around to it, I'm
 working on a release-generator tool for DMD. I'm finding that a very
 significant amount of the effort involved (much more than I expected)
 is discovering and dealing with all the fun little differences between
 the posix and win32 makefiles (and now we have some win64 makefiles as
 well).

 Efforts can be made to decrease these differences, but simply
 having them separate makefiles in the first place (let alone using
 completely different "make"s: GNU make vs DM make) is a natural
 invitation for divergence.

 No disrespect intended to Digital Mars Make, but since GNU make appears
 to be more feature-rich, have wider overall adoption, and is freely
 available on Windows as a pre-built binary
 <http://gnuwin32.sourceforge.net/packages/make.htm>: Would it be
 acceptable to use gmake as *the* make for DMD? Ie, either convert the
 windows makefiles to gmake, or expand the posix makefiles to support
 windows?

 I'd be willing to give it a shot myself, and I could trivially
 write a small batch utility to download Win gmake and put it on the
 current PATH, so that nobody has to go downloading/installing it
 manually. I would do this *after* finishing the release-generator tool,
 but afterwords it would allow the tool's implantation to be greatly
 simplified.

 Is this something that would be acceptable, or does building DMD for
 Windows need to stay as DM make?
This might not be entirely related to the DMD makefiles, but with the druntime and Phobos makefiles I really hate that if I need to add a new module, I have to repeat its name several times. So whatever happens to the makefiles, I would prefer that the files to be compiled not have to be mentioned at all (or at most once): just compile everything in a directory with the correct extension.

-- 
/Jacob Carlborg
Aug 13 2013
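[A minimal sketch of that idea in D, assuming a hypothetical source directory and output name rather than the real Phobos build rules: collect every *.d file and pass the whole list to dmd in one invocation, so adding a module never touches the build script.]

---
// Minimal sketch: gather every .d file under a source tree and hand the
// whole list to dmd, so new modules never have to be listed by hand.
// The directory name and flags are hypothetical.
import std.algorithm : map;
import std.array : array;
import std.file : SpanMode, dirEntries;
import std.process : spawnProcess, wait;
import std.stdio : writeln;

void main()
{
    // Every *.d file below std/, recursively.
    auto sources = dirEntries("std", "*.d", SpanMode.depth)
        .map!(e => e.name)
        .array;

    writeln("compiling ", sources.length, " modules");
    auto status = wait(spawnProcess(["dmd", "-lib", "-ofphobos.lib"] ~ sources));
    if (status != 0)
        writeln("build failed with status ", status);
}
---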
prev sibling parent reply "Kagamin" <spam here.lot> writes:
On Saturday, 10 August 2013 at 18:35:10 UTC, Nick Sabalausky 
wrote:
 Would it be
 acceptable to use gmake as *the* make for DMD? Ie, either 
 convert the
 windows makefiles to gmake, or expand the posix makefiles to 
 support
 windows?
1. expand posix makefiles to support windows
2. leave dm makefile for those who don't have gmake
3. use unified posix/windows makefile
4. everyone is happy
Aug 13 2013
parent reply Nick Sabalausky <SeeWebsiteToContactMe semitwist.com> writes:
On Tue, 13 Aug 2013 15:13:38 +0200
"Kagamin" <spam here.lot> wrote:

 On Saturday, 10 August 2013 at 18:35:10 UTC, Nick Sabalausky 
 wrote:
 Would it be
 acceptable to use gmake as *the* make for DMD? Ie, either 
 convert the
 windows makefiles to gmake, or expand the posix makefiles to 
 support
 windows?
1. expand posix makefiles to support windows 2. leave dm makefile for those who doesn't have gmake 3. use unified posix/windows makefile 4. everyone is happy
That still involves the overhead of maintaining duplicate makefiles and a tendency for gradual divergence.
Aug 13 2013
next sibling parent "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Tue, Aug 13, 2013 at 01:45:41PM -0400, Nick Sabalausky wrote:
 On Tue, 13 Aug 2013 15:13:38 +0200
 "Kagamin" <spam here.lot> wrote:
 
 On Saturday, 10 August 2013 at 18:35:10 UTC, Nick Sabalausky 
 wrote:
 Would it be acceptable to use gmake as *the* make for DMD? Ie,
 either convert the windows makefiles to gmake, or expand the posix
 makefiles to support windows?
1. expand posix makefiles to support windows 2. leave dm makefile for those who doesn't have gmake 3. use unified posix/windows makefile 4. everyone is happy
That still involves the overhead of maintaining duplicate makefiles and a tendency for gradual divergence.
It violates DRY, and thus inherits all of the associated problems.

T

-- 
The peace of mind---from knowing that viruses which exploit Microsoft system vulnerabilities cannot touch Linux---is priceless. -- Frustrated system administrator.
Aug 13 2013
prev sibling parent "Kagamin" <spam here.lot> writes:
On Tuesday, 13 August 2013 at 17:45:52 UTC, Nick Sabalausky wrote:
 That still involves the overhead of maintaining duplicate 
 makefiles and
 a tendency for gradual divergence.
You complained about the incompatible interface of the makefiles for integration with your tool. The unified gnu makefile solves that problem. As was said, the dm makefile has its positive side.
Aug 13 2013