
digitalmars.D - Dependency management in D

reply "Scott Wilson" <scott_wilson_3rd yahoo.com> writes:
I'm running some tests with D and was wondering what the dependency
story is. I can't find any info online; I searched for "dlang
dependency management" and "dlang dependency" and found a bunch of
dub stuff, but not the nitty-gritty.

The unit of compilation is one D file, but I saw that if I pass
several D files to the compiler, only one .o file is generated.
What's the story there?

Plus, if I change a non-template function in one module, there's no
way to tell make not to rebuild the modules importing it. The dmd
-deps call seems to generate all dependencies recursively. With
gcc/makedepend, if only a function body changes, only relinking is
needed, or nothing at all with dynamic loading.

Overall, as far as I understand, the compiler is fast but dependency
management is coarse. Also, what's the deal with cyclic dependencies?
I saw a discussion saying they're not possible, but I wrote a few
test modules and they work fine.

Am I understanding this right, and how can I work around it? Thanks.
Sep 18 2014
next sibling parent reply "Vladimir Panteleev" <vladimir thecybershadow.net> writes:
On Thursday, 18 September 2014 at 16:48:22 UTC, Scott Wilson 
wrote:
 The unit of compilation is one D file, but I saw that if I pass
 several D files to the compiler, only one .o file is generated.
 What's the story there?
DMD will generate one object file per module file, unless you use the -of option. With -of, the compiler will compile all module files to a single object file.
 Plus, if I change a non-template function in one module, there's no
 way to tell make not to rebuild the modules importing it. The dmd
 -deps call seems to generate all dependencies recursively. With
 gcc/makedepend, if only a function body changes, only relinking is
 needed, or nothing at all with dynamic loading.
Yes, currently any changes, including function bodies, cause a rebuild of importing modules. One reason for this is CTFE - changing a function body can affect the code in importing modules if they invoke the function during compilation. You could use .di files to separate declarations from implementations.
 Overall, as far as I understand, the compiler is fast but dependency
 management is coarse. Also, what's the deal with cyclic
 dependencies? I saw a discussion saying they're not possible, but I
 wrote a few test modules and they work fine.
Cyclic dependencies between modules which all have module or static constructors are forbidden, because it is unknown in which order the constructors are expected to run.
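A minimal sketch of the forbidden case (hypothetical file names): two mutually importing modules, each with a module constructor.

```d
// --- a.d ---
module a;
import b;           // a depends on b
static this() {}    // module constructor, runs before main()

// --- b.d ---
module b;
import a;           // b depends on a: the import cycle is closed
static this() {}    // second constructor in the cycle
```

This compiles, but at program startup druntime aborts with a cyclic-dependency error, because neither constructor can safely run first. Remove either `static this()` and the cycle is accepted, which is why Scott's test modules worked.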
Sep 18 2014
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 9/18/2014 9:56 AM, Vladimir Panteleev wrote:
 On Thursday, 18 September 2014 at 16:48:22 UTC, Scott Wilson wrote:
 The unit of compilation is one D file, but I saw that if I pass
 several D files to the compiler, only one .o file is generated.
 What's the story there?
DMD will generate one object file per module file, unless you use the -of option.
This is incorrect. It'll generate one object file per invocation of dmd.
Sep 18 2014
parent reply "Vladimir Panteleev" <vladimir thecybershadow.net> writes:
On Friday, 19 September 2014 at 05:19:07 UTC, Walter Bright wrote:
 On 9/18/2014 9:56 AM, Vladimir Panteleev wrote:
 On Thursday, 18 September 2014 at 16:48:22 UTC, Scott Wilson 
 wrote:
 The unit of compilation is one D file, but I saw that if I pass
 several D files to the compiler, only one .o file is generated.
 What's the story there?
DMD will generate one object file per module file, unless you use the -of option.
This is incorrect. It'll generate one object file per invocation of dmd.
Erm, that's not quite correct either. I meant, one object file per module file passed on the command line.
Sep 18 2014
next sibling parent Walter Bright <newshound2 digitalmars.com> writes:
On 9/18/2014 10:27 PM, Vladimir Panteleev wrote:
 On Friday, 19 September 2014 at 05:19:07 UTC, Walter Bright wrote:
 On 9/18/2014 9:56 AM, Vladimir Panteleev wrote:
 On Thursday, 18 September 2014 at 16:48:22 UTC, Scott Wilson wrote:
 The unit of compilation is one D file, but I saw that if I pass
 several D files to the compiler, only one .o file is generated.
 What's the story there?
DMD will generate one object file per module file, unless you use the -of option.
This is incorrect. It'll generate one object file per invocation of dmd.
Erm, that's not quite correct either. I meant, one object file per module file passed on the command line.
Yeah, you're right. My mistake.
Sep 19 2014
prev sibling parent reply "Scott Wilson" <scott_wilson_3rd yahoo.com> writes:
On Friday, 19 September 2014 at 05:27:33 UTC, Vladimir Panteleev 
wrote:
 On Friday, 19 September 2014 at 05:19:07 UTC, Walter Bright 
 wrote:
 On 9/18/2014 9:56 AM, Vladimir Panteleev wrote:
 On Thursday, 18 September 2014 at 16:48:22 UTC, Scott Wilson 
 wrote:
 The unit of compilation is one D file, but I saw that if I pass
 several D files to the compiler, only one .o file is generated.
 What's the story there?
DMD will generate one object file per module file, unless you use the -of option.
This is incorrect. It'll generate one object file per invocation of dmd.
Erm, that's not quite correct either. I meant, one object file per module file passed on the command line.
I tried this:

 dmd test.d test2.d

Got one test.o and one binary test. No test2.o. Must be the case
where all compiled modules are merged, and the first filename
dictates the .o name. Correct?

Also tried:

 dmd -c test.d test2.d

In that case, yes, both .o files are generated.

So the trick with all this is how to choose between building
everything at once and building each file incrementally. Are there
any tools helping with that? For example, a tool that shows
dependencies. If some files depend on each other, they'd better be
compiled together anyway.

Scott
Sep 19 2014
parent Walter Bright <newshound2 digitalmars.com> writes:
On 9/19/2014 10:34 AM, Scott Wilson wrote:
 I tried this

 dmd test.d test2.d

 Got one test.o and one binary test. No test2.o. Must be the case where all
 compiled modules are merged, and the first filename dictates the .o name. Correct?

 Also tried

 dmd -c test.d test2.d

 In that case yes both .o are generated.
Yes, that's correct. When building an exe directly, there's no point in generating multiple .o files, so only one is generated.
 So the trick with all this is how to choose between building everything at once
 and building each file incrementally. Are there any tools helping with that? For
 example, a tool that shows dependencies. If some files depend on each other,
 they'd better be compiled together anyway.
The -deps switch for dmd will list dependencies.
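For illustration (hypothetical file names, and the exact output format may vary between compiler versions), -deps writes one line per import, which a build script can translate into make prerequisites:

```shell
# app.d imports util.d; write the dependency list to a file
dmd -c -deps=deps.txt app.d

# each line names the importing module, the import protection, and
# the imported module, along with the files involved, roughly:
#   app (app.d) : private : util (util.d)
cat deps.txt
```

Note the recursion Scott observed: the list covers transitive imports too, including druntime/Phobos modules, so a script consuming it usually filters out the standard library paths.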
Sep 19 2014
prev sibling next sibling parent reply ketmar via Digitalmars-d <digitalmars-d puremagic.com> writes:
On Thu, 18 Sep 2014 16:48:21 +0000
Scott Wilson via Digitalmars-d <digitalmars-d puremagic.com> wrote:

 Plus, if I change a non-template function in one module, there's no
 way to tell make not to rebuild the modules importing it. The dmd
 -deps call seems to generate all dependencies recursively. With
 gcc/makedepend, if only a function body changes, only relinking is
 needed, or nothing at all with dynamic loading.
there is no way to tell what exactly was changed. it can be some template, for example, and with a changed template all modules that import the one containing it must be recompiled to accommodate the new version. or think about CTFE.
Sep 18 2014
parent reply "Scott Wilson" <scott_wilson_3rd yahoo.com> writes:
On Thursday, 18 September 2014 at 17:04:43 UTC, ketmar via
Digitalmars-d wrote:
 On Thu, 18 Sep 2014 16:48:21 +0000
 Scott Wilson via Digitalmars-d <digitalmars-d puremagic.com> 
 wrote:

 Plus, if I change a non-template function in one module, there's no
 way to tell make not to rebuild the modules importing it. The dmd
 -deps call seems to generate all dependencies recursively. With
 gcc/makedepend, if only a function body changes, only relinking is
 needed, or nothing at all with dynamic loading.
there is no way to tell what exactly was changed. it can be some template, for example, and with a changed template all modules that import the one containing it must be recompiled to accommodate the new version. or think about CTFE.
Thanks, fellas. But what's the deal with .di files? AFAIU the compiler generates them, and dependent modules can depend on them instead of the .d files directly. Do .di files contain only templates (no comments and plain functions)? How well do they work? Thanks.
Sep 18 2014
parent reply ketmar via Digitalmars-d <digitalmars-d puremagic.com> writes:
On Fri, 19 Sep 2014 01:42:58 +0000
Scott Wilson via Digitalmars-d <digitalmars-d puremagic.com> wrote:

 Do .di files contain only templates (no comments and plain
 functions)? How well do they work? Thanks.
as for the 'how .di files work' question: a '.di' file is just plain D
source with the function bodies stripped. nothing very special about
that. so yes, .di files contain only templates and function
declarations (without bodies). this *can* work, but when it comes to
CTFE... look at the following:

=== z00.d ===
module z00;
string foo(string name) { return `int `~name~`() {return 42;}`; }

=== z01.d ===
import z00;
mixin(foo(`bar`));
void main () {
  import std.stdio;
  writeln(bar());
}

and the .di file generated with `dmd -H -c -o- z00.d`:

=== z00.di ===
// D import file generated from 'z00.d'
module z00;
string foo(string name);

do you see any gotchas? heh:

# dmd z01.d
z01.d(2): Error: foo cannot be interpreted at compile time, because it has no available source code
z01.d(2): Error: argument to mixin must be a string, not (foo("bar")) of type string

the compiler has no source for foo() anymore, so it can't do CTFE. you
can avoid this by turning foo() into a template:

string foo()(string name) { return `int `~name~`() {return 42;}`; }

but then you'd have to turn all your functions that can be used in
CTFE into templates, and there won't be much sense in the .di file
anyway.

to make a long story short: don't use .di files unless you *REALLY*
*KNOW* what you're doing. and even then think twice.
Sep 18 2014
parent reply "Scott Wilson" <scott_wilson_3rd yahoo.com> writes:
On Friday, 19 September 2014 at 02:05:43 UTC, ketmar via
Digitalmars-d wrote:
 On Fri, 19 Sep 2014 01:42:58 +0000
 Scott Wilson via Digitalmars-d <digitalmars-d puremagic.com> 
 wrote:

 Do .di files contain only templates (no comments and plain
 functions)? How well do they work? Thanks.
as for the 'how .di files work' question: a '.di' file is just plain D
source with the function bodies stripped. nothing very special about
that. so yes, .di files contain only templates and function
declarations (without bodies). this *can* work, but when it comes to
CTFE... look at the following:

=== z00.d ===
module z00;
string foo(string name) { return `int `~name~`() {return 42;}`; }

=== z01.d ===
import z00;
mixin(foo(`bar`));
void main () {
  import std.stdio;
  writeln(bar());
}

and the .di file generated with `dmd -H -c -o- z00.d`:

=== z00.di ===
// D import file generated from 'z00.d'
module z00;
string foo(string name);

do you see any gotchas? heh:

# dmd z01.d
z01.d(2): Error: foo cannot be interpreted at compile time, because it has no available source code
z01.d(2): Error: argument to mixin must be a string, not (foo("bar")) of type string

the compiler has no source for foo() anymore, so it can't do CTFE. you
can avoid this by turning foo() into a template:

string foo()(string name) { return `int `~name~`() {return 42;}`; }

but then you'd have to turn all your functions that can be used in
CTFE into templates, and there won't be much sense in the .di file
anyway.

to make a long story short: don't use .di files unless you *REALLY*
*KNOW* what you're doing. and even then think twice.
So CTFE is used randomly everywhere? Otherwise I don't grok what's so complicated. If I want visible code, i.e. templates and CTFE, I put that in the .di. For separation, I put the declaration in the .di and the implementation in the .d. What subtleties am I missing?

Scott
Sep 19 2014
parent reply ketmar via Digitalmars-d <digitalmars-d puremagic.com> writes:
On Fri, 19 Sep 2014 17:38:20 +0000
Scott Wilson via Digitalmars-d <digitalmars-d puremagic.com> wrote:

 So CTFE is used randomly everywhere?
CTFE *can* be used a lot. this is one of D's killer features (our regexp engine, for example, is not only very fast, but regexps can be compiled to native code through D at *compile* *time* without external tools). all in all it heavily depends on your libraries, of course. if you do that with care ;-), it will work. but i just can't see any reason to complicate the build process with .di generation. D compilers are usually fast enough on decent boxes, and building can be done in the background anyway.
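The regexp point refers to std.regex's ctRegex, which parses and compiles the pattern during compilation; a minimal sketch (the pattern here is just an example):

```d
import std.regex : ctRegex, matchFirst;
import std.stdio : writeln;

void main()
{
    // the pattern is parsed at compile time -- exactly the kind of
    // CTFE that a stripped .di file would break
    auto re = ctRegex!`^[a-z]+_\d+$`;
    writeln(!matchFirst("build_42", re).empty);  // true
}
```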
Sep 19 2014
parent reply "Cliff" <cliff.s.hudson gmail.com> writes:
On Friday, 19 September 2014 at 18:56:20 UTC, ketmar via
Digitalmars-d wrote:
 On Fri, 19 Sep 2014 17:38:20 +0000
 Scott Wilson via Digitalmars-d <digitalmars-d puremagic.com> 
 wrote:

 So CTFE is used randomly everywhere?
CTFE *can* be used a lot. this is one of D's killer features (our regexp engine, for example, is not only very fast, but regexps can be compiled to native code through D at *compile* *time* without external tools). all in all it heavily depends on your libraries, of course. if you do that with care ;-), it will work. but i just can't see any reason to complicate the build process with .di generation. D compilers are usually fast enough on decent boxes, and building can be done in the background anyway.
As someone with some expertise in this subject, I can say with certainty that builds can almost never be fast enough. If D becomes successful - something we all desire, I think - then large organizations will use it for large projects, which means large code bases and long(er) compile times. Build labs seem to always be under pressure to churn out official bits as quickly as possible for testing, deployment, analysis, etc.

More holistically, it's important that the bits produced in the official process and the dev box process be as similar as possible, if not entirely identical. You can imagine such builds feeding back into intellisense and analysis locally in the developer's IDE, and these processes need to be fast and (generally) lightweight. Taken to the Nth degree, such work is only ever done once for a change anywhere in the organization, and the results are available for subsequent steps immediately.

I don't know what all of the blockers to good incremental builds under D are, but as D grows in influence, we can be sure people will start to complain about build times, and they will start to ask pointed questions about incrementality, reliability and repeatability in builds. Having a good handle on these issues will allow us to at least plan and give good answers to people who want to take it to the next level.
Sep 19 2014
parent reply ketmar via Digitalmars-d <digitalmars-d puremagic.com> writes:
On Fri, 19 Sep 2014 19:07:16 +0000
Cliff via Digitalmars-d <digitalmars-d puremagic.com> wrote:

that's why dedicating people to work solely on build scripts and
infrastructure is good, yet almost nobody does that. ah, "enterprise
BS" again. fsck "enterprise".

as for build times: we can always write parsed and analyzed ASTs to
disk (something like delphi .dcu), thus skipping most of the work next
time. and then we can compare the cached AST with the new one if the
source was changed, to see what exactly was changed and report that
(oh, there is a new function, one function removed, one turned into a
template, and so on). this will greatly speed up builds without
relying on the ugly "header/implementation" model.

the only good thing "enterprise" can do is to contribute and then
support such code. but i'm sure they never will; they will only
complain about how they want "faster build times". hell with 'em.
Sep 19 2014
parent reply "Cliff" <cliff.s.hudson gmail.com> writes:
On Friday, 19 September 2014 at 19:22:22 UTC, ketmar via
Digitalmars-d wrote:
 On Fri, 19 Sep 2014 19:07:16 +0000
 Cliff via Digitalmars-d <digitalmars-d puremagic.com> wrote:

 that's why dedicating people to work solely on build scripts and
 infrastructure is good, yet almost nobody does that. ah,
 "enterprise BS" again. fsck "enterprise".

 as for build times: we can always write parsed and analyzed ASTs
 to disk (something like delphi .dcu), thus skipping most of the
 work next time. and then we can compare the cached AST with the
 new one if the source was changed, to see what exactly was changed
 and report that (oh, there is a new function, one function
 removed, one turned into a template, and so on). this will greatly
 speed up builds without relying on the ugly
 "header/implementation" model.

 the only good thing "enterprise" can do is to contribute and then
 support such code. but i'm sure they never will; they will only
 complain about how they want "faster build times". hell with 'em.
In a sense I sympathize with your antipathy toward enterprises, but the simple fact is they have a lot of money and command a lot of developers. For us, developers = mind share = more libraries for us to use and more ideas to go around. That's all goodness. Leverage it for what it's worth.

I'm definitely a fan of finding ways to improve build speeds that don't involve the creation of unnecessary (or worse, redundant) user-maintained artifacts. I'm also a fan of simplifying the build process so that we don't have to have experts maintain build scripts.
Sep 19 2014
parent ketmar via Digitalmars-d <digitalmars-d puremagic.com> writes:
On Fri, 19 Sep 2014 19:30:21 +0000
Cliff via Digitalmars-d <digitalmars-d puremagic.com> wrote:

 In a sense I sympathize with your antipathy toward enterprises,
 but the simple fact is they have a lot of money and command a lot
 of developers.  For us, developers = mind share = more libraries
 for us to use and more ideas to go around.  That's all goodness.
 Leverage it for what it's worth.
and what enterprises don't like is sharing. ah, sorry, i'm wrong, they really LOVE sharing. but by "sharing" they mean "you give us everything we want and we tell you to GTFO".
 I'm definitely a fan of finding ways to improve build speeds that
 don't involve the creation of unnecessary (or worse, redundant)
 and user maintained artifacts.  I'm also a fan of simplifying the
 build process so that we don't have to have experts maintain
 build scripts.
actually, with the introduction of ".dpu"s (heh, "D parsed/prepared unit"), the d compiler could be used as a build tool, just like the delphi compiler. and it will be lightning fast, especially if we keep the generated .o files along with the .dpu files. we can even generate some kind of bytecode for CTFE (but i'm not sure it's really worth the effort).

i like fast build times too, but there is a lot of work involved and i don't see an urgent need for this feature. but any enterprise guy who needs it is free to hire me. my salary will not be that big if they take into account how much money and time they will save with my work. ;-)
Sep 19 2014
prev sibling next sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 9/18/2014 9:48 AM, Scott Wilson wrote:
 I'm running some tests with D and was wondering what the dependency
 story is. I can't find any info online; I searched for "dlang
 dependency management" and "dlang dependency" and found a bunch of
 dub stuff, but not the nitty-gritty.

 The unit of compilation is one D file, but I saw that if I pass
 several D files to the compiler, only one .o file is generated.
 What's the story there?
The idea is to essentially "pre-link" the object files that would have been generated if the files were compiled individually into one object file. This is faster and more convenient. It also means that semantic analysis is done only once for each file and its imports, rather than over and over as with separate compilation.
 Plus, if I change a non-template function in one module, there's no
 way to tell make not to rebuild the modules importing it. The dmd
 -deps call seems to generate all dependencies recursively. With
 gcc/makedepend, if only a function body changes, only relinking is
 needed, or nothing at all with dynamic loading.
Dependency management is the same as in C++, if you are willing to use .di files to represent 'headers' of corresponding .d files.
 Overall, as far as I understand, the compiler is fast but dependency
 management is coarse. Also, what's the deal with cyclic
 dependencies? I saw a discussion saying they're not possible, but I
 wrote a few test modules and they work fine.
If two modules import each other, then if one changes, both should get recompiled.
Sep 18 2014
parent reply "Scott Wilson" <scott_wilson_3rd yahoo.com> writes:
On Friday, 19 September 2014 at 05:17:45 UTC, Walter Bright wrote:
 On 9/18/2014 9:48 AM, Scott Wilson wrote:
 I'm running some tests with D and was wondering what the dependency
 story is. I can't find any info online; I searched for "dlang
 dependency management" and "dlang dependency" and found a bunch of
 dub stuff, but not the nitty-gritty.

 The unit of compilation is one D file, but I saw that if I pass
 several D files to the compiler, only one .o file is generated.
 What's the story there?
The idea is to essentially "pre-link" the object files that would have been generated if the files were compiled individually into one object file. This is faster and more convenient. It also means that semantic analysis is done only once for each file and its imports, rather than over and over as with separate compilation.
 Plus, if I change a non-template function in one module, there's no
 way to tell make not to rebuild the modules importing it. The dmd
 -deps call seems to generate all dependencies recursively. With
 gcc/makedepend, if only a function body changes, only relinking is
 needed, or nothing at all with dynamic loading.
Dependency management is the same as in C++, if you are willing to use .di files to represent 'headers' of corresponding .d files.
 Overall, as far as I understand, the compiler is fast but dependency
 management is coarse. Also, what's the deal with cyclic
 dependencies? I saw a discussion saying they're not possible, but I
 wrote a few test modules and they work fine.
If two modules import each other, then if one changes, both should get recompiled.
I'm worried about recursive dependencies. If a.d imports b.d, and b.d imports c.d, and then I change c.d, is a.d compiled again? Sometimes it should be, sometimes it shouldn't.

Scott
Sep 19 2014
parent Walter Bright <newshound2 digitalmars.com> writes:
On 9/19/2014 10:41 AM, Scott Wilson wrote:
 I'm worried about recursive dependencies. If a.d imports b.d, and
 b.d imports c.d, and then I change c.d, is a.d compiled again?
Yes.
 Sometimes it should be, sometimes it shouldn't.
Dependency tracking isn't more fine-grained than who imports whom.
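In make terms, that granularity means every object file depends on the transitive closure of its imports; a sketch with hypothetical files a.d, b.d and c.d:

```make
# a.d imports b.d, and b.d imports c.d, so a.o depends on all three
a.o: a.d b.d c.d
	dmd -c a.d

b.o: b.d c.d
	dmd -c b.d

c.o: c.d
	dmd -c c.d
```

Touching c.d therefore rebuilds c.o, b.o and a.o: the granularity is per-module, not per-declaration, which is exactly Scott's "sometimes it shouldn't" case.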
Sep 19 2014
prev sibling parent "Dicebot" <public dicebot.lv> writes:
In general, D with current compiler technology is not very 
suitable for incremental rebuilds. In C there is a very simple 
separation between the implementation and the imported interface. 
C++ makes it much harder by introducing templates, which must also 
be present in header files - it took quite a while for C++ 
compilers to stop screwing up incremental compilation in the 
presence of templates and optimization. For D it is even harder 
because of CTFE and the fact that .di headers are not generated by 
default.

For now, compiling everything in one go is a much superior 
strategy, assuming you have enough memory. In the future, the 
incremental compilation topic may be revisited when something like 
a compiler daemon becomes feasible - one that can cache AST-level 
entities and their dependencies, as opposed to filesystem-level 
entities.
Sep 19 2014