
digitalmars.D - Package manager - interacting with the compiler

reply Jacob Carlborg <doob me.com> writes:
I think I've come so far in my development of a package manager that 
it's time to think how it should interact with the compiler.

Currently I see two use cases:

1. When the package manager installs (and builds) a package

2. When a user (developer) builds a project and wants to use installed 
packages

In the best of worlds the user wouldn't have to do anything and it just 
works. The package manager needs to somehow pass import paths to the 
compiler and libraries to link with.

I'm not entirely sure what the best method to do this would be. But I'm 
thinking that if the compiler could accept compiler flags passed via 
environment variables use case 1 would be easy to implement.

For use case 2 it would be a bit more problematic. In this use case the 
user would need to somehow tell the package manager that I want to use 
these packages, something like:

// project.obspec
orb "foo"
orb "bar"

$ orb use project.obspec

or for single packages

$ orb use foobar
$ dmd project.d

If environment variables are used in this case, then the package manager 
would need a shell script wrapper, the same way as DVM does it, to be 
able to set environment variables for the parent (the shell). The reason 
for this is that a child process (the package manager) can't set 
environment variables for the parent process (the shell). This 
complicates the implementation and installation of the package manager 
and requires different implementations for Posix and Windows.
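For what it's worth, the DVM-style workaround on Posix boils down to a 
shell function that evals output from the real binary. A minimal sketch, 
assuming a hypothetical "orb-bin" binary with an "env" subcommand that 
prints export statements (none of this is Orbit's actual interface):

```shell
# Hypothetical wrapper installed in the user's shell profile. A child
# process can't modify the parent shell's environment, so the real
# binary (assumed to be "orb-bin") prints "export ..." lines for "use"
# and this function evals them in the current shell.
orb() {
    if [ "$1" = "use" ]; then
        eval "$(orb-bin env "$2")"   # e.g.: export DFLAGS="-I.../import"
    else
        orb-bin "$@"
    fi
}
```

Windows would need a separate batch/PowerShell equivalent, which is 
exactly the duplicated implementation effort described above.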

Another idea would be to manipulate the dmd.conf/sc.ini file but that 
seems to be quite complicated and messy. On the other hand, this 
wouldn't require any changes to the compiler.
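For reference, the dmd.conf route would amount to rewriting the DFLAGS 
line, something like the following (a sketch only; the stock contents of 
dmd.conf vary by platform and release, and the orbit paths are 
illustrative):

```ini
[Environment]
; stock import path, plus per-package paths appended by the package manager
DFLAGS=-I%@P%/../../src/phobos -I/usr/local/orbit/orbs/foo/import -L-L/usr/local/orbit/orbs/foo/lib
```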

Any other ideas?

https://github.com/jacob-carlborg/orbit/wiki/Orbit-Package-Manager-for-D
https://github.com/jacob-carlborg/orbit

-- 
/Jacob Carlborg
Dec 10 2011
next sibling parent reply J Arrizza <cppgent0 gmail.com> writes:

Jacob,

On Sat, Dec 10, 2011 at 12:55 AM, Jacob Carlborg <doob me.com> wrote:

 Currently I see two use cases:

 1. When the package manager installs (and builds) a package

This will have to handle cross-compilations and multiple build variants 
per platform. Multiple platforms are needed especially for embedded work 
(simulation vs real binaries) and multiple build variants are needed, at 
least Debug vs Release variants.

Also multiple projects require a set of config files per project. There 
would be a lot of commonality between project config files but that's 
ok. The idea of "inheriting" from a common config file can cause a lot 
of problems.

In all cases, the config file(s) need to be version controlled per 
project since they are unique to the project generating the build.
 2. When a user (developer) builds a project and wants to use installed
 packages

 If environment variables are used in this case, then the package manager
 would need a shell script wrapper, the same way as DVM does it, to be able
 to set environment variables for the parent (the shell).

Please no environment variables for anything. If all config info is in a 
file and that file is version controlled then keeping tight control over 
build configurations is much easier.

For the same reasons, it would be ideal to have a method that dumps all 
versions of all packages used by a particular build variant. This could 
be generated and saved in version control for audit reasons, or, to go 
the extra step, it could be compared against during every build. This 
ensures that all components have not been inadvertently changed, i.e. 
all config changes are done in a controlled way.

I'm not sure if any of that points the way to implement the builds any 
clearer...

John
Dec 10 2011
parent reply Jacob Carlborg <doob me.com> writes:
On 2011-12-10 22:13, J Arrizza wrote:
 Jacob,

 On Sat, Dec 10, 2011 at 12:55 AM, Jacob Carlborg <doob me.com
 <mailto:doob me.com>> wrote:

     Currently I see two use cases:

     1. When the package manager installs (and builds) a package


 This will have to handle cross-compilations and multiple build variants
 per platform. Multiple platforms are needed especially for embedded work
 (simulation vs real binaries) and multiple build variants are needed, at
 least Debug vs Release variants.

 Also multiple projects require a set of config files per project. There
 would be a lot of commonality between project config files but that's
 ok. The idea of "inheriting" from a common config  file can cause a lot
 of problems.

 In all cases, the config file(s) need to be version controlled per
 project since they are unique to the project generating the build.

The package manager just invokes a build tool, like make, rdmd, dsss, shell script and so on.
     2. When a user (developer) builds a project and wants to use
     installed packages


     If environment variables are used in this case, then the package
     manager would need a shell script wrapper, the same way as DVM does
     it, to be able to set environment variables for the parent (the shell).

 Please no environment variables for anything. If all config info is in a
 file and that file is version controlled then keeping tight control over
 build configurations is much easier.

 For the same reasons, it would be ideal to have a method that dumps all
 versions of all packages used by a particular build variant. This could
 be generated and saved in version control for audit reasons, or, to go
 the extra step, it could be compared against during every build. This
 ensures that all components have not been inadvertently changed, i.e.
 all config changes are done in a controlled way.

 I'm not sure if any of that points the way to implement the builds any
 clearer...

 John

I'm not sure, but I think you missed the point (or I missed your point). 
The projects will have a file indicating which other projects they depend 
on. Then, when the package manager installs a project, it will compile 
it. When it's compiled, the package manager needs to somehow tell the 
compiler what import paths to use and which libraries to link with.

-- 
/Jacob Carlborg
Dec 11 2011
parent Jacob Carlborg <doob me.com> writes:
On 2011-12-12 01:36, Martin Nowak wrote:
 On Sun, 11 Dec 2011 22:15:26 +0100, Jacob Carlborg <doob me.com> wrote:

 On 2011-12-10 22:13, J Arrizza wrote:
 Jacob,

 On Sat, Dec 10, 2011 at 12:55 AM, Jacob Carlborg <doob me.com
 <mailto:doob me.com>> wrote:

 Currently I see two use cases:

 1. When the package manager installs (and builds) a package


 This will have to handle cross-compilations and multiple build variants
 per platform. Multiple platforms are needed especially for embedded work
 (simulation vs real binaries) and multiple build variants are needed, at
 least Debug vs Release variants.

 Also multiple projects require a set of config files per project. There
 would be a lot of commonality between project config files but that's
 ok. The idea of "inheriting" from a common config file can cause a lot
 of problems.

 In all cases, the config file(s) need to be version controlled per
 project since they are unique to the project generating the build.

The package manager just invokes a build tool, like make, rdmd, dsss, shell script and so on.
 2. When a user (developer) builds a project and wants to use
 installed packages


 If environment variables are used in this case, then the package
 manager would need a shell script wrapper, the same way as DVM does
 it, to be able to set environment variables for the parent (the shell).

 Please no environment variables for anything. If all config info is in a
 file and that file is version controlled then keeping tight control over
 build configurations is much easier.

 For the same reasons, it would be ideal to have a method that dumps all
 versions of all packages used by a particular build variant. This could
 be generated and saved in version control for audit reasons, or, to go
 the extra step, it could be compared against during every build. This
 ensures that all components have not been inadvertently changed, i.e.
 all config changes are done in a controlled way.

 I'm not sure if any of that points the way to implement the builds any
 clearer...

 John

I'm not sure, but I think you missed the point (or I missed your point). 
The projects will have a file indicating which other projects they depend 
on. Then, when the package manager installs a project, it will compile 
it. When it's compiled, the package manager needs to somehow tell the 
compiler what import paths to use and which libraries to link with.

I think a useful approach is to implement 
http://www.wikiservice.at/d/wiki.cgi?LanguageDevel/DIPs/DIP13 and map 
import paths to packages. It allows handling different versions and 
hiding undeclared dependencies, i.e. no accidental imports. As this only 
works when building packages, every installed package should have a 
symlink to its most recent version in a common import directory, so that 
plain dmd builds can use the packages.

martin

I don't like the ideas in that DIP. I don't think the packages should 
have symlinks in a common import directory. It will cause problems if 
the top-level package of a library has the same name as in some other 
library, as is the case with DWT:

org.eclipse.swt
org.eclipse.jface

And so on.

-- 
/Jacob Carlborg
Dec 12 2011
prev sibling next sibling parent reply "jdrewsen" <jdrewsen nospam.com> writes:
On Saturday, 10 December 2011 at 08:55:57 UTC, Jacob Carlborg 
wrote:
 I think I've come so far in my development of a package manager 
 that it's time to think how it should interact with the 
 compiler.

 Currently I see two use cases:

 1. When the package manager installs (and builds) a package

 2. When a user (developer) builds a project and wants to use 
 installed packages

 In the best of worlds the user wouldn't have to do anything and 
 it just works. The package manager needs to somehow pass import 
 paths to the compiler and libraries to link with.

 I'm not entirely sure what the best method to do this would be. 
 But I'm thinking that if the compiler could accept compiler 
 flags passed via environment variables use case 1 would be easy 
 to implement.

 For use case 2 it would be a bit more problematic. In this use 
 case the user would need to somehow tell the package manager 
 that I want to use these packages, something like:

 // project.obspec
 orb "foo"
 orb "bar"

 $ orb use project.obspec

 or for single packages

 $ orb use foobar
 $ dmd project.d

 If environment variables are used in this case, then the 
 package manager would need a shell script wrapper, the same way 
 as DVM does it, to be able to set environment variables for the 
 parent (the shell). The reason for this is that a child process 
 (the package manager) can't set environment variables for the 
 parent process (the shell). This complicates the implementation 
 and installation of the package manager and requires different 
 implementations for Posix and Windows.

 Another idea would be to manipulate the dmd.conf/sc.ini file 
 but that seems to be quite complicated and messy. On the other 
 hand, this wouldn't require any changes to the compiler.

 Any other ideas?

 https://github.com/jacob-carlborg/orbit/wiki/Orbit-Package-Manager-for-D
 https://github.com/jacob-carlborg/orbit

For use case 1 the package manager could just as well call dmd directly 
with the correct flags, i.e. there is no need for environment variables.

Use case 2 does not belong to a package manager in my opinion. It is the 
job of a build tool to configure packages for a project. What would be 
nice is support for using packages without a build tool. Maybe something 
like what pkg-config provides:

$ dmd -ofhello `orb -lib foo` hello.d

where "orb -lib foo" returns the flags needed to use the foo package.

/Jonas
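The pkg-config analogy could be prototyped as a tiny flag emitter. A 
sketch, assuming the default install layout mentioned elsewhere in the 
thread ($ORB_HOME/orbs/&lt;pkg&gt; with import/ and lib/ subdirectories); the 
flag format is an assumption, not Orbit's actual behaviour:

```shell
# Hypothetical prototype of "orb -lib <pkg>": print the dmd flags
# needed to compile and link against an installed package.
orb_lib_flags() {
    ORB_HOME=${ORB_HOME:-/usr/local/orbit}
    printf '%s' "-I$ORB_HOME/orbs/$1/import -L-L$ORB_HOME/orbs/$1/lib -L-l$1"
}

# usage: dmd -ofhello `orb_lib_flags foo` hello.d
```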
Dec 10 2011
next sibling parent reply Jacob Carlborg <doob me.com> writes:
On 2011-12-10 22:17, jdrewsen wrote:
 On Saturday, 10 December 2011 at 08:55:57 UTC, Jacob Carlborg wrote:
 I think I've come so far in my development of a package manager that
 it's time to think how it should interact with the compiler.

 Currently I see two use cases:

 1. When the package manager installs (and builds) a package

 2. When a user (developer) builds a project and wants to use
 installed packages

 In the best of worlds the user wouldn't have to do anything and it
 just works. The package manager needs to somehow pass import paths to
 the compiler and libraries to link with.

 I'm not entirely sure what the best method to do this would be. But
 I'm thinking that if the compiler could accept compiler flags passed
 via environment variables use case 1 would be easy to implement.

 For use case 2 it would be a bit more problematic. In this use case
 the user would need to somehow tell the package manager that I want to
 use these packages, something like:

 // project.obspec
 orb "foo"
 orb "bar"

 $ orb use project.obspec

 or for single packages

 $ orb use foobar
 $ dmd project.d

 If environment variables are used in this case, then the package
 manager would need a shell script wrapper, the same way as DVM does
 it, to be able to set environment variables for the parent (the
 shell). The reason for this is that a child process (the package
 manager) can't set environment variables for the parent process (the
 shell). This complicates the implementation and installation of the
 package manager and requires different implementations for Posix and
 Windows.

 Another idea would be to manipulate the dmd.conf/sc.ini file but that
 seems to be quite complicated and messy. On the other hand, this
 wouldn't require any changes to the compiler.

 Any other ideas?

 https://github.com/jacob-carlborg/orbit/wiki/Orbit-Package-Manager-for-D
 https://github.com/jacob-carlborg/orbit

For use case 1 the package manager could just as well call dmd directly 
with the correct flags, i.e. no need for using environment variables.

I was thinking that the package manager just invokes a build tool like make, rdmd, dsss, shell script and so on.
 Use case 2 does not belong to a package manager in my opinion. It is the job
 of a build tool to configure packages for a project. What would be nice
 to have support for using packages without a build tool. Maybe something
 like what pkg-config provides:

 dmd -ofhello `orb -lib foo` hello.d where "org -lib foo" returns the
 flags to use the foo package.

 /Jonas

I would say that the preferred way is to use a build tool; then there is 
no problem. The build tool just asks the package manager which import 
paths to use for the given packages and passes the information to the 
compiler. But I don't want my package manager to depend on a build tool, 
I want it to be usable on its own.

-- 
/Jacob Carlborg
Dec 11 2011
parent Jacob Carlborg <doob me.com> writes:
On 2011-12-12 12:45, jdrewsen wrote:
 On Sunday, 11 December 2011 at 21:22:37 UTC, Jacob Carlborg wrote:
 I would say that the preferred way is to use a build tool then there
 is no problem. The build tool just asks the package manager which
 import paths to use for the given packages and pass the information to
 the compiler. But I don't want my package manager to depend on a build
 tool, I want it to be usable on its own.

And for that I think the pkg-config method is the way to go. Setting 
environment vars brings unneeded state into your development session. 
Another option would be to just wrap dmd in, e.g., an orbdmd command and 
handle it there.
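The orbdmd idea could be as small as a wrapper that recomputes the 
package flags on each invocation and splices them in before the user's 
arguments, so no state lives in the shell session. A sketch (orbdmd, the 
ORB_PACKAGES variable, and the "orb -lib" flag emitter are all 
hypothetical here):

```shell
# Hypothetical "orbdmd": forward everything to dmd, prepending the
# flags for each package listed in ORB_PACKAGES.
orbdmd() {
    _flags=
    for _pkg in $ORB_PACKAGES; do
        _flags="$_flags $(orb -lib "$_pkg")"
    done
    dmd $_flags "$@"   # _flags deliberately unquoted so it word-splits
}

# usage: ORB_PACKAGES="foo bar" orbdmd -ofhello hello.d
```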

Ok.
 Btw: have you considered renaming from orb to something that makes sense
 to newbies e.g. dpack?

 -Jonas

No, I basically just picked a random name. I'm tired of trying to come 
up with good names for tools and libraries.

-- 
/Jacob Carlborg
Dec 12 2011
prev sibling next sibling parent Jacob Carlborg <doob me.com> writes:
On 2011-12-10 23:05, Jonathan M Davis wrote:
 On Saturday, December 10, 2011 22:17:44 jdrewsen wrote:
 For use case 1 the package manager could just as well call dmd
 directly with the correct flags ie. no need for using environment
 variables. Use case 2 does not belong to a package manager in my
 opinion. It is the job of a build tool to configure packages for
 a project.

This brings up an interesting situation. In general, I don't think that 
a package manager has any business building the project which is pulling 
in dependencies. However, it _does_ make some sense to build the 
dependencies on the box that they are being pulled in on, since they're 
going to have to be built for that box natively. And each of those 
projects could be using different build tools. One could be using make. 
Another could be using cmake. Another could be using scons. Etc. So, how 
is that dealt with?

Does each package list its chosen build tool as a dependency, which the 
programmer must then make sure is installed on their system by whatever 
means non-D packages/programs are installed? Or does that mean that 
packages using the package manager all need to use a specific build 
tool? And if they do, should the package manager then be that build 
tool? Or do we make it so that the package manager doesn't actually 
build _anything_? Rather, it pulls in the source along with pre-built 
binaries for your architecture, and if you want to build it for your 
machine specifically, you have to go and build it yourself after it gets 
pulled down?

This is all looking very messy to me. I have no idea how Orbit deals 
with any of this, since I've never really looked at it. But it makes for 
an ugly problem. So, in general, I'd definitely prefer #1, but it may be 
that the issues involved make #2 the more sensible choice; I'd have to 
study Orbit in some detail to give a better opinion.

- Jonathan M Davis

Currently you specify the build tool in the specification file, which 
also contains the dependencies, which files to include in the package 
and so on. The package manager then just invokes the build tool. The 
build tool needs to be supported by the package manager, i.e. the 
package manager needs to know how to invoke it, and currently there is 
no verification that the build tool exists.

The intention was not to choose among these use cases; both of them 
happen and need to be handled.

-- 
/Jacob Carlborg
Dec 11 2011
prev sibling parent reply Jacob Carlborg <doob me.com> writes:
On 2011-12-10 23:32, J Arrizza wrote:
 A few other potential twists.

 - the installation step needs to be portable in that it can install the
 variant build artifacts into non-standard file system locations. For
 example, the build artifacts for the windows build and the build
 artifacts for the linux build need to end up in separate directories and
 the Debug and Release builds need to end up in separate
 directories. Another example is a Build server building multiple projects.

 - the package system itself needs to be portable in that it can be
 installed in any directory. For example, if I want to source control the
 entire package system then it would not be in a standard file-system
 location. Also it implies there may be multiple installations of the
 package system since I can have multiple branches.

 For regulated industries that require the ability to recreate the
 software environment for any released binary image, both of these would
 be a terrific help for doing that.

 John

Currently, by default, the package manager installs everything in (on 
Posix) /usr/local/orbit/orbs. It's possible to override this using the 
environment variable "ORB_HOME"; if this variable is used, packages will 
be installed into $ORB_HOME/orbs.

-- 
/Jacob Carlborg
Dec 11 2011
parent Jacob Carlborg <doob me.com> writes:
On 2011-12-19 08:06, Marco Leise wrote:
 Am 11.12.2011, 23:12 Uhr, schrieb Jacob Carlborg <doob me.com>:

 On 2011-12-10 23:32, J Arrizza wrote:
 A few other potential twists.

 - the installation step needs to be portable in that it can install the
 variant build artifacts into non-standard file system locations. For
 example, the build artifacts for the windows build and the build
 artifacts for the linux build need to end up in separate directories and
 the Debug and Release builds need to end up in separate
 directories. Another example is a Build server building multiple
 projects.

 - the package system itself needs to be portable in that it can be
 installed in any directory. For example, if I want to source control the
 entire package system then it would not be in a standard file-system
 location. Also it implies there may be multiple installations of the
 package system since I can have multiple branches.

 For regulated industries that require the ability to recreate the
 software environment for any released binary image, both of these would
 be a terrific help for doing that.

 John

Currently, by default, the package manager installs everything in (on 
Posix) /usr/local/orbit/orbs. It's possible to override this using the 
environment variable "ORB_HOME"; if this variable is used, packages will 
be installed into $ORB_HOME/orbs.

You have to have super user rights for the default. Maven installs everything to ~/.maven by default, which will work out of the box.

I have thought of that as well; I've just picked a folder for now. It 
can easily be changed in the code.

-- 
/Jacob Carlborg
Dec 19 2011
prev sibling next sibling parent Jonathan M Davis <jmdavisProg gmx.com> writes:
On Saturday, December 10, 2011 22:17:44 jdrewsen wrote:
 On Saturday, 10 December 2011 at 08:55:57 UTC, Jacob Carlborg
 
 wrote:
 I think I've come so far in my development of a package manager
 that it's time to think how it should interact with the
 compiler.
 
 Currently I see two use cases:
 
 1. When the package manager installs (and builds) a package
 
 2. When a user (developer) builds a project and wants to use
 installed packages
 
 In the best of worlds the user wouldn't have to do anything and
 it just works. The package manager needs to somehow pass import
 paths to the compiler and libraries to link with.
 
 I'm not entirely sure what the best method to do this would be.
 But I'm thinking that if the compiler could accept compiler
 flags passed via environment variables use case 1 would be easy
 to implement.
 
 For use case 2 it would be a bit more problematic. In this use
 case the user would need to somehow tell the package manager
 that I want to use these packages, something like:
 
 // project.obspec
 orb "foo"
 orb "bar"
 
 $ orb use project.obspec
 
 or for single packages
 
 $ orb use foobar
 $ dmd project.d
 
 If environment variables are used in this case, then the
 package manager would need a shell script wrapper, the same way
 as DVM does it, to be able to set environment variables for the
 parent (the shell). The reason for this is that a child process
 (the package manager) can't set environment variables for the
 parent process (the shell). This complicates the implementation
 and installation of the package manager and requires different
 implementations for Posix and Windows.
 
 Another idea would be to manipulate the dmd.conf/sc.ini file
 but that seems to be quite complicated and messy. On the other
 hand, this wouldn't require any changes to the compiler.
 
 Any other ideas?
 
 https://github.com/jacob-carlborg/orbit/wiki/Orbit-Package-Manager-for-D
 https://github.com/jacob-carlborg/orbit

For use case 1 the package manager could just as well call dmd directly 
with the correct flags, i.e. there is no need for environment variables. 
Use case 2 does not belong to a package manager in my opinion. It is the 
job of a build tool to configure packages for a project.

This brings up an interesting situation. In general, I don't think that 
a package manager has any business building the project which is pulling 
in dependencies. However, it _does_ make some sense to build the 
dependencies on the box that they are being pulled in on, since they're 
going to have to be built for that box natively. And each of those 
projects could be using different build tools. One could be using make. 
Another could be using cmake. Another could be using scons. Etc. So, how 
is that dealt with?

Does each package list its chosen build tool as a dependency, which the 
programmer must then make sure is installed on their system by whatever 
means non-D packages/programs are installed? Or does that mean that 
packages using the package manager all need to use a specific build 
tool? And if they do, should the package manager then be that build 
tool? Or do we make it so that the package manager doesn't actually 
build _anything_? Rather, it pulls in the source along with pre-built 
binaries for your architecture, and if you want to build it for your 
machine specifically, you have to go and build it yourself after it gets 
pulled down?

This is all looking very messy to me. I have no idea how Orbit deals 
with any of this, since I've never really looked at it. But it makes for 
an ugly problem. So, in general, I'd definitely prefer #1, but it may be 
that the issues involved make #2 the more sensible choice; I'd have to 
study Orbit in some detail to give a better opinion.

- Jonathan M Davis
Dec 10 2011
prev sibling next sibling parent J Arrizza <cppgent0 gmail.com> writes:

A few other potential twists.

- the installation step needs to be portable in that it can install the
variant build artifacts into non-standard file system locations. For
example, the build artifacts for the windows build and the build artifacts
for the linux build need to end up in separate directories and the Debug
and Release builds need to end up in separate directories. Another example
is a Build server building multiple projects.

- the package system itself needs to be portable in that it can be
installed in any directory. For example, if I want to source control the
entire package system then it would not be in a standard file-system
location. Also it implies there may be multiple installations of the
package system since I can have multiple branches.

For regulated industries that require the ability to recreate the software
environment for any released binary image, both of these would be a
terrific help for doing that.

John

Dec 10 2011
prev sibling next sibling parent "Martin Nowak" <dawg dawgfoto.de> writes:
On Sun, 11 Dec 2011 22:15:26 +0100, Jacob Carlborg <doob me.com> wrote:

 On 2011-12-10 22:13, J Arrizza wrote:
 Jacob,

 On Sat, Dec 10, 2011 at 12:55 AM, Jacob Carlborg <doob me.com
 <mailto:doob me.com>> wrote:

     Currently I see two use cases:

     1. When the package manager installs (and builds) a package


 This will have to handle cross-compilations and multiple build variants
 per platform. Multiple platforms are needed especially for embedded work
 (simulation vs real binaries) and multiple build variants are needed, at
 least Debug vs Release variants.

 Also multiple projects require a set of config files per project. There
 would be a lot of commonality between project config files but that's
 ok. The idea of "inheriting" from a common config  file can cause a lot
 of problems.

 In all cases, the config file(s) need to be version controlled per
 project since they are unique to the project generating the build.

The package manager just invokes a build tool, like make, rdmd, dsss, shell script and so on.
     2. When a user (developer) builds a project and wants to use
     installed packages


     If environment variables are used in this case, then the package
     manager would need a shell script wrapper, the same way as DVM does
     it, to be able to set environment variables for the parent (the  
 shell).

 Please no environment variables for anything. If all config info is in a
 file and that file is version controlled then keeping tight control over
 build configurations is much easier.

 For the same reasons, it would be ideal to have a method that dumps all
 versions of all packages used by a particular build variant. This could
 be generated and saved in version control for audit reasons, or, to go
 the extra step, it could be compared against during every build. This
 ensures that all components have not been inadvertently changed, i.e.
 all config changes are done in a controlled way.

 I'm not sure if any of that points the way to implement the builds any
 clearer...

 John

I'm not sure, but I think you missed the point (or I missed your point). Each project will have a file indicating which other projects it depends on. Then, when the package manager installs a project, it will compile it. To compile it, the package manager needs to somehow tell the compiler what import paths to use and which libraries to link with.

I think a useful approach is to implement http://www.wikiservice.at/d/wiki.cgi?LanguageDevel/DIPs/DIP13 and map import paths to packages. It makes it possible to handle different versions and to hide undeclared dependencies, i.e. no accidental imports. As this only works when building packages, every installed package should have a symlink to its most recent version in a common import directory so that plain dmd builds can use the packages.

martin
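Martin's symlink idea could be sketched in shell roughly like this (the package name and directory layout are made up for illustration):

```shell
# Sketch: link each package's newest installed version into a common
# import directory, so a plain `dmd -Iimports` build can find it.
ORBS=orbs                 # e.g. /usr/local/orbit/orbs in a real install
IMPORTS=imports           # the common import directory passed to dmd

mkdir -p "$ORBS/foo/1.2.0" "$IMPORTS"

# pick the most recent version: highest version-sorted directory name
latest=$(ls "$ORBS/foo" | sort -V | tail -n 1)
ln -sfn "../$ORBS/foo/$latest" "$IMPORTS/foo"

ls -l "$IMPORTS"
```

Re-running the snippet after installing a newer version just repoints the symlink, so plain dmd builds always see the latest release.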
Dec 11 2011
prev sibling next sibling parent reply Chad J <chadjoan __spam.is.bad__gmail.com> writes:
On 12/10/2011 03:55 AM, Jacob Carlborg wrote:
 I think I've come so far in my development of a package manager that
 it's time to think how it should interact with the compiler.
 ...

o.O I've read what I could find and I think I like where this is going. I'm not sure where you're drawing your inspiration from, but if this is going to support features similar to Portage then I am willing to give money to help make sure it happens.
Dec 11 2011
parent reply Jacob Carlborg <doob me.com> writes:
On 2011-12-12 04:08, Chad J wrote:
 On 12/10/2011 03:55 AM, Jacob Carlborg wrote:
 I think I've come so far in my development of a package manager that
 it's time to think how it should interact with the compiler.
 ...

o.O I've read what I could find and I think I like where this is going. I'm not sure where you're drawing your inspiration from, but if this is going to support features similar to Portage then I am willing to give money to help make sure it happens.

It's basically RubyGems but for D. It's great to hear that someone likes it. -- /Jacob Carlborg
Dec 12 2011
parent reply Chad J <chadjoan __spam.is.bad__gmail.com> writes:
On 12/12/2011 08:58 AM, Jacob Carlborg wrote:
 On 2011-12-12 04:08, Chad J wrote:
 On 12/10/2011 03:55 AM, Jacob Carlborg wrote:
 I think I've come so far in my development of a package manager that
 it's time to think how it should interact with the compiler.
 ...

o.O I've read what I could find and I think I like where this is going. I'm not sure where you're drawing your inspiration from, but if this is going to support features similar to Portage then I am willing to give money to help make sure it happens.

It's basically RubyGems but for D. It's great to hear that someone likes it.

OK, cool. I should probably mention some of the things I like about portage (off the top of my head), in case it helps:

- The world file: A list of all packages that the /user/ elected to install. It does not contain dependencies. It is the top level.

- use-flags: Flags/keywords associated with packages that allow you to turn specific features within packages on and off.

- Stability levels: Portage has a notion of unstable/untested or "hardmasked" packages at one level, slightly unstable or architecture-specific glitchiness at another level ("keyworded"), and completely stable at another.

---------------------------------

As for why I like these things:

- The world file: This makes it really easy to replicate installations on other machines. It also allows me to cull my tree by removing something from the world file and then telling it to remove all the orphaned packages.

- use-flags: These are super useful when a package has a dependency that just will not compile on my system. In some cases I can disable the feature that causes that dependency and still be able to install the package.

- Stability levels: These can be controlled at different granularities, for example: the system has only stable packages, or all unstable, or stable except for packages in the "keywords" file, and maybe one package in the keywords file has all versions allowed, or just versions 1.3.44 and 1.5.21. This is yet more control over giving troubling packages the boot.

---------------------------------

Things I don't like about portage:

- The portage tree doesn't keep enough old versions around sometimes.

- People who write crappy ebuilds or mark things stable that mess up my system. The quality control used to be better. (It's still my favorite package manager by a wide margin.)
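The world-file culling Chad describes ("remove all the orphaned packages") boils down to a reachability check over the dependency graph; a minimal sketch, with made-up package names:

```python
# Sketch of the "world file" idea: the world file lists only packages the
# user asked for; anything installed that is not reachable from the world
# set through dependencies is an orphan and can be removed.

def orphans(world, deps, installed):
    """Return installed packages not reachable from the world set."""
    reachable = set()
    stack = list(world)
    while stack:
        pkg = stack.pop()
        if pkg in reachable:
            continue
        reachable.add(pkg)
        stack.extend(deps.get(pkg, ()))
    return installed - reachable

deps = {"app": ["libfoo"], "libfoo": ["libbar"]}
installed = {"app", "libfoo", "libbar", "old-tool"}
print(orphans({"app"}, deps, installed))  # → {'old-tool'}
```

Removing "app" from the world set would orphan libfoo and libbar as well, which is exactly the tree-culling behavior Chad likes in portage.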
Dec 13 2011
parent reply Jacob Carlborg <doob me.com> writes:
On 2011-12-13 14:04, Chad J wrote:
 OK, cool.  I should probably mention some of the things I like about
 portage (off the top of my head), incase it helps:

 - The world file: A list of all packages that the /user/ elected to
 install.  It does not contain dependencies.  It is the top level.

That might be a good idea. I had only planned to list all installed packages.
 - use-flags: Flags/keywords associated with packages that allow you to
 turn specific features within packages on and off.

I currently have no plans of configurable packages. Either the complete package is installed or nothing is installed.
 - Stability levels: Portage has a notion of unstable/untested or
 "hardmasked" packages at one level, slightly unstable or
 architecture-specific glitchiness at another level ("keyworded"), and
 completely stable at another.

Orbit uses Semantic Versioning: http://semver.org/
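For reference, the ordering that semver implies can be sketched like this (MAJOR.MINOR.PATCH only; the real spec also defines pre-release and build-metadata rules):

```python
# Minimal sketch of Semantic Versioning ordering: versions compare as
# integer tuples, not as strings (so 1.10.0 > 1.2.3).

def semver_key(version):
    major, minor, patch = version.split(".")
    return (int(major), int(minor), int(patch))

versions = ["1.10.0", "1.2.3", "0.9.9", "1.2.10"]
print(sorted(versions, key=semver_key))
# → ['0.9.9', '1.2.3', '1.2.10', '1.10.0']
```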
 Things I don't like about portage:
 - The portage tree doesn't keep enough old versions around sometimes.

I have no plans of removing old packages as long as it doesn't cause any problems.
 - People who write crappy ebuilds or mark things stable when they mess
 up my system.  The quality control used to be better.
 (It's still my favorite package manager by a wide margin.)

This seems hard to avoid and I don't know what can be done about it. -- /Jacob Carlborg
Dec 13 2011
parent reply Chad J <chadjoan __spam.is.bad__gmail.com> writes:
On 12/13/2011 08:45 AM, Jacob Carlborg wrote:
 On 2011-12-13 14:04, Chad J wrote:
 OK, cool.  I should probably mention some of the things I like about
 portage (off the top of my head), incase it helps:

 - The world file: A list of all packages that the /user/ elected to
 install.  It does not contain dependencies.  It is the top level.

That might be a good idea. I had only planned to list all installed packages.
 - use-flags: Flags/keywords associated with packages that allow you to
 turn specific features within packages on and off.

I currently have no plans of configurable packages. Either the complete package is installed or nothing is installed.

Would you allow others to implement this, or somehow be open to it in the future? Of course, I can definitely understand not wanting to handle this right now, due to scope creep.
 - Stability levels: Portage has a notion of unstable/untested or
 "hardmasked" packages at one level, slightly unstable or
 architecture-specific glitchiness at another level ("keyworded"), and
 completely stable at another.

Orbit uses Semantic Versioning: http://semver.org/

I'll read that when I get a bit of time.
 Things I don't like about portage:
 - The portage tree doesn't keep enough old versions around sometimes.

I have no plans of removing old packages as long as it doesn't cause any problems.

Nice. Thanks.
 - People who write crappy ebuilds or mark things stable when they mess
 up my system.  The quality control used to be better.
 (It's still my favorite package manager by a wide margin.)

This seems hard to avoid and I don't know what can be done about it.

Maintainers being more conservative, I suspect. It's not too bad in Portage, and mostly happens on super large projects with many packages, like KDE. The bread-and-butter linux stuff (kernel, compilers, small apps, drivers, etc) all tends to work out fine. It can also be mitigated a lot by having older versions around. I can easily avoid this by reverting to an earlier version of my system... except I can't sometimes. In a production environment I would probably keep all versions of my stuff packaged locally.
Dec 13 2011
parent Jacob Carlborg <doob me.com> writes:
On 2011-12-13 15:05, Chad J wrote:
 On 12/13/2011 08:45 AM, Jacob Carlborg wrote:
 On 2011-12-13 14:04, Chad J wrote:
 OK, cool.  I should probably mention some of the things I like about
 portage (off the top of my head), incase it helps:

 - The world file: A list of all packages that the /user/ elected to
 install.  It does not contain dependencies.  It is the top level.

That might be a good idea. I had only planned to list all installed packages.
 - use-flags: Flags/keywords associated with packages that allow you to
 turn specific features within packages on and off.

I currently have no plans of configurable packages. Either the complete package is installed or nothing is installed.

Would you allow others to implement this, or somehow be open to it in the future?

It might happen in the future. But currently I think it's unnecessary and too complicated.
 Of course, I can definitely understand not wanting to handle this right
 now, due to scope creep.

Yeah, it won't happen in the first release. -- /Jacob Carlborg
Dec 13 2011
prev sibling next sibling parent "jdrewsen" <jdrewsen nospam.com> writes:
On Sunday, 11 December 2011 at 21:22:37 UTC, Jacob Carlborg wrote:
 On 2011-12-10 22:17, jdrewsen wrote:
 On Saturday, 10 December 2011 at 08:55:57 UTC, Jacob Carlborg wrote:
 I think I've come so far in my development of a package manager that it's time to think how it should interact with the compiler.

 Currently I see two use cases:

 1. When the package manager installs (and builds) a package

 2. When a user (developer) builds a project and wants to use installed packages

 In the best of worlds the user wouldn't have to do anything and it just works. The package manager needs to somehow pass import paths to the compiler and libraries to link with.

 I'm not entirely sure what the best method to do this would be. But I'm thinking that if the compiler could accept compiler flags passed via environment variables use case 1 would be easy to implement.

 For use case 2 it would be a bit more problematic. In this use case the user would need to somehow tell the package manager that I want to use these packages, something like:

 // project.obspec
 orb "foo"
 orb "bar"

 $ orb use project.obspec

 or for single packages

 $ orb use foobar
 $ dmd project.d

 If environment variables are used in this case, then the package manager would need a shell script wrapper, the same way as DVM does it, to be able to set environment variables for the parent (the shell). The reason for this is that a child process (the package manager) can't set environment variables for the parent process (the shell). This complicates the implementation and installation of the package manager and requires different implementations for Posix and Windows.

 Another idea would be to manipulate the dmd.conf/sc.ini file but that seems to be quite complicated and messy. On the other hand, this wouldn't require any changes to the compiler.

 Any other ideas?

 https://github.com/jacob-carlborg/orbit/wiki/Orbit-Package-Manager-for-D
 https://github.com/jacob-carlborg/orbit

For use case 1 the package manager could just as well call dmd directly with the correct flags, i.e. there is no need for environment variables.

I was thinking that the package manager just invokes a build tool like make, rdmd, dsss, shell script and so on.
 Use case 2 does not belong to a package manager in my opinion. It is the job of a build tool to configure packages for a project. What would be nice is to have support for using packages without a build tool. Maybe something like what pkg-config provides:

 dmd -ofhello `orb -lib foo` hello.d, where "orb -lib foo" returns the flags to use the foo package.

 /Jonas

I would say that the preferred way is to use a build tool; then there is no problem. The build tool just asks the package manager which import paths to use for the given packages and passes the information to the compiler. But I don't want my package manager to depend on a build tool, I want it to be usable on its own.

And for that I think the pkg-config method is the way to go. Setting environment variables brings unneeded state into your development session. Another option would be to wrap dmd in a command such as orbdmd and handle it there. Btw: have you considered renaming orb to something that makes more sense to newbies, e.g. dpack? -Jonas
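The pkg-config style Jonas describes might look like this in practice; the `orb` flags and paths below are hypothetical, with a stub standing in for the real tool:

```shell
# Stub for a hypothetical pkg-config-style `orb` subcommand: given a
# package name, print the dmd flags needed to use it (import path plus
# a library passed through to the linker with -L).
orb() {
    echo "-I/usr/local/orbit/orbs/$2/import -L-l$2"
}

flags=$(orb --lib foo)
echo dmd -ofhello $flags hello.d
```

As with pkg-config, the caller never needs to know where packages live; command substitution splices the flags straight into the compile line.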
Dec 12 2011
prev sibling parent "Marco Leise" <Marco.Leise gmx.de> writes:
Am 11.12.2011, 23:12 Uhr, schrieb Jacob Carlborg <doob me.com>:

 On 2011-12-10 23:32, J Arrizza wrote:
 A few other potential twists.

 - the installation step needs to be portable in that it can install the variant build artifacts into non-standard file-system locations. For example, the build artifacts for the Windows build and the build artifacts for the Linux build need to end up in separate directories, and the Debug and Release builds need to end up in separate directories. Another example is a build server building multiple projects.

 - the package system itself needs to be portable in that it can be installed in any directory. For example, if I want to source control the entire package system then it would not be in a standard file-system location. Also it implies there may be multiple installations of the package system since I can have multiple branches.

 For regulated industries that require the ability to recreate the software environment for any released binary image, both of these would be a terrific help for doing that.

 John

Currently, by default, the package manager installs everything in (on Posix) /usr/local/orbit/orbs. It's possible to override this using the environment variable "ORB_HOME"; if this variable is set, packages will be installed into $ORB_HOME/orbs.

You need super-user rights for the default location. Maven installs everything to ~/.maven by default, which works out of the box.
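The lookup rule Jacob describes could be sketched as follows (the paths come from his description; the function name is made up):

```shell
# Sketch: resolve the package root as /usr/local/orbit/orbs unless the
# ORB_HOME environment variable overrides the prefix.
orb_root() {
    echo "${ORB_HOME:-/usr/local/orbit}/orbs"
}

orb_root                              # default, needs root to write
( ORB_HOME="$HOME/.orbit"; orb_root ) # per-user prefix, no root needed
```

A per-user default such as ~/.orbit, as Marco suggests by analogy with Maven, would only change the fallback in the parameter expansion.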
Dec 18 2011