
digitalmars.D - What Julia Does Right

reply Walter Bright <newshound2 digitalmars.com> writes:
Here's a good thought provoking article:

https://viralinstruction.com/posts/goodjulia/

A couple of things stood out for me:


1. https://viralinstruction.com/posts/goodjulia/#the_package_manager_is_amazing

I've never thought of a package manager that way.


2. "Rust, for example, may have a wonderfully expressive type system, but it's 
also boilerplate heavy, and its borrowchecker makes writing any code that 
compiles at all quite a time investment. An investment, which most of the time 
gives no returns when you're trying to figure how to approach the problem in the 
first place. It's also not entirely clear how I would interactively visualise 
and manipulate a dataset using a static language like Rust."

I've always thought that a great strength of D was its plasticity, meaning you 
can easily change data structures and algorithms as you're writing and rewriting 
code. Apparently this is much more difficult in Rust, which will inevitably 
result in less efficiency, even if the compiler for it generates very good code.
Dec 08 2022
next sibling parent ryuukk_ <ryuukk.dev gmail.com> writes:
On Thursday, 8 December 2022 at 17:47:42 UTC, Walter Bright wrote:
 Here's a good thought provoking article:

 https://viralinstruction.com/posts/goodjulia/

 A couple of things stood out for me:


 1. 
 https://viralinstruction.com/posts/goodjulia/#the_package_manager_is_amazing

 I've never thought of a package manager that way.


 2. "Rust, for example, may have a wonderfully expressive type 
 system, but it's also boilerplate heavy, and its borrowchecker 
 makes writing any code that compiles at all quite a time 
 investment. An investment, which most of the time gives no 
 returns when you're trying to figure how to approach the 
 problem in the first place. It's also not entirely clear how I 
 would interactively visualise and manipulate a dataset using a 
 static language like Rust."

 I've always thought that a great strength of D was its 
 plasticity, meaning you can easily change data structures and 
 algorithms as you're writing and rewriting code. Apparently 
 this is much more difficult in Rust, which will inevitably 
 result in less efficiency, even if the compiler for it 
 generates very good code.
I had the same impression: Rust lacks flexibility, but that's due to the goals of the compiler. Enums were nice to work with, though; being able to compose with them via pattern matching helps make the code less rigid and a little more flexible.
Dec 08 2022
prev sibling next sibling parent reply zjh <fqbqrr 163.com> writes:
On Thursday, 8 December 2022 at 17:47:42 UTC, Walter Bright wrote:
 Here's a good thought provoking article:

 https://viralinstruction.com/posts/goodjulia/

 A couple of things stood out for me:


 1. 
 https://viralinstruction.com/posts/goodjulia/#the_package_manager_is_amazing

 I've never thought of a package manager that way.
I just saw the `neat` language, which is based on `D`. I think a `language plug-in` is a really good concept. A `D` plug-in author could specifically `open/close` some `D` features. This way, the plug-in author would not need to maintain a separate branch, but could directly set up a sub-branch under the `plug-in` directory of the `D` language. The plug-in author could use the latest `D` features at any time, and `D` users could also enjoy the plug-in's features. This way, people would not need to establish branches of `D`.
Dec 08 2022
next sibling parent reply zjh <fqbqrr 163.com> writes:
On Friday, 9 December 2022 at 02:10:01 UTC, zjh wrote:
 In this way, people do not need to establish branches of `'D'`.
If a `plug-in` author creates a language alone, they still have to `repeat` many things, and the language has no reputation, which is very `troublesome`. If combined with a language such as `D` (if D had a plug-in system), it would not only strengthen the `D` ecosystem but also build the `plug-in author's` reputation. It is really a combination of `strong and strong`. I think the `D` authors could really try this! Make the `D` language the `base language` of the new language, while other features are implemented by the `D plug-in author`, similar to the relationship between `Microsoft` and a `driver`. This could expand the ecosystem!
Dec 08 2022
parent zjh <fqbqrr 163.com> writes:
On Friday, 9 December 2022 at 02:18:09 UTC, zjh wrote:

 similar to the relationship between `Microsoft` and `driver`. 
 This can expand the ecology!
The language plug-in author is not the language's official team, but a plug-in is an `ideal` place to create new `language` features. If the `language officials` approve, they can also make the `feature` the `default` after its experimental verification is complete, so that it enters the official language automatically. This is really very `cool`!
Dec 08 2022
prev sibling parent reply FeepingCreature <feepingcreature gmail.com> writes:
On Friday, 9 December 2022 at 02:10:01 UTC, zjh wrote:
 On Thursday, 8 December 2022 at 17:47:42 UTC, Walter Bright 
 wrote:
 Here's a good thought provoking article:

 https://viralinstruction.com/posts/goodjulia/

 A couple of things stood out for me:


 1. 
 https://viralinstruction.com/posts/goodjulia/#the_package_manager_is_amazing

 I've never thought of a package manager that way.
I just saw the `neat` language, which is based on `D`. I think a `language plug-in` is a really good concept. A `D` plug-in author could specifically `open/close` some `D` features. This way, the plug-in author would not need to maintain a separate branch, but could directly set up a sub-branch under the `plug-in` directory of the `D` language. The plug-in author could use the latest `D` features at any time, and `D` users could also enjoy the plug-in's features. This way, people would not need to establish branches of `D`.
Note that an important attribute of the way I use macros is that they're pulled into the compiler automatically, but only directly affect code that imports them. So the meaning of existing syntax can be overloaded at the use site by a macro type like all types can, i.e. opCall, opBinary, etc., but you only get *new syntax* if you `macro import`. The goal is to have a rich central type system that macros can internally fall back on.

If you treat macros as compiler plugins that you pass on the command line, for instance, you fracture the ecosystem because you can no longer combine two libraries with macros that occupy the same syntax. The goal is to keep macros, broadly, encapsulated to the site of use.
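For the non-macro half of that sentence, plain D already lets existing syntax be overloaded at the use site through the standard operator hooks. A small D sketch of the `opBinary` case mentioned above:

```d
// Use-site overloading of existing syntax via D's standard operator hooks.
// neat's macros generalize this idea; this sketch shows only plain D.
struct Meters
{
    double value;

    // Overloads the existing `+` syntax, but only for this type.
    Meters opBinary(string op : "+")(Meters rhs)
    {
        return Meters(value + rhs.value);
    }
}

unittest
{
    auto total = Meters(1.5) + Meters(2.5);
    assert(total.value == 4.0);
}
```

The point of the distinction is that `+` keeps its familiar shape here; only genuinely new syntax requires an explicit `macro import` at the use site.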
Dec 08 2022
parent zjh <fqbqrr 163.com> writes:
On Friday, 9 December 2022 at 05:43:56 UTC, FeepingCreature wrote:

 Note that an important attribute of the way I use macros is 
 that they're pulled into the compiler automatically, but only 
 directly affect code that imports them. So the meaning of 
 existing syntax can be overloaded at the use site by a macro 
 type like all types can, ie. opCall, opBinary, etc., but you 
 only get *new syntax* if you `macro import`. The goal is to 
 have a rich central type system that macros can internally fall 
 back on.

 If you treat macros as compiler plugins that you pass on the 
 command line, for instance, you fracture the ecosystem because 
 you can no longer combine two libraries with macros that occupy 
 the same syntax. The goal is to keep macros, broadly, 
 encapsulated to the site of use.
Thank you for your explanation
Dec 08 2022
prev sibling next sibling parent reply FeepingCreature <feepingcreature gmail.com> writes:
On Thursday, 8 December 2022 at 17:47:42 UTC, Walter Bright wrote:
 Here's a good thought provoking article:

 https://viralinstruction.com/posts/goodjulia/

 A couple of things stood out for me:


 1. 
 https://viralinstruction.com/posts/goodjulia/#the_package_manager_is_amazing

 I've never thought of a package manager that way.
I'm not sure what you mean here? As far as I can tell, dub does all of that.
Dec 08 2022
parent reply Paul Backus <snarwin gmail.com> writes:
On Friday, 9 December 2022 at 05:45:36 UTC, FeepingCreature wrote:
 I'm not sure what you mean here? As far as I can tell, dub does 
 all of that.
Some features from the article that, as far as I know, have no equivalent in dub:

1. Caching the registry locally.
 Resolving environments feels instant, as opposed to the 
 glacially slow Conda that Python offers. The global "general" 
 registry is downloaded as a single gzipped tarball, and read 
 directly from the zipped tarball, making registry updates way 
 faster than updating Cargo's crates.io.
2. Multiple registries with namespacing.
 Pkg is federated, and allows you to easily and freely mix 
 multiple public and private package registries, even if they 
 have no knowledge of each others and contain different packages 
 with the same names.
3. Cross compilation.
 The BinaryBuilder package allows you to cross-compile the same 
 program to all platforms supported by Julia
Dec 09 2022
next sibling parent reply "H. S. Teoh" <hsteoh qfbox.info> writes:
On Fri, Dec 09, 2022 at 02:11:15PM +0000, Paul Backus via Digitalmars-d wrote:
[...]
 Some features from the article that as far as I know have no
 equivalent in dub:
 
 1. Caching registry locally.
 
 Resolving environments feels instant, as opposed to the glacially
 slow Conda that Python offers. The global "general" registry is
 downloaded as a single gzipped tarball, and read directly from the
 zipped tarball, making registry updates way faster than updating
 Cargo's crates.io.
IMO speed in this area is critical. Room for dub to improve.
 2. Multiple registries with namespacing.
 
 Pkg is federated, and allows you to easily and freely mix multiple
 public and private package registries, even if they have no
 knowledge of each others and contain different packages with the
 same names.
This is the way to go. Flat package namespaces just don't cut it anymore in this day and age. Some kind of namespacing is necessary.

One may argue: D's ecosystem is so small, what need do we have of package namespacing? Well, think of it this way: if my D program is so small, what need do I have of module namespacing? Answer: it sets the groundwork for future expansion. If we start off on the wrong foot, it will be difficult to add namespacing later when the ecosystem grows.
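To make the idea concrete, here is a sketch of what a namespaced dependency could look like in a dub.json. The `mycorp:` prefix and the `registry` field are invented for illustration; dub has no such feature today, and the internal registry URL is hypothetical:

```json
{
    "name": "myapp",
    "dependencies": {
        "vibe-d": "~>0.9.5",
        "mycorp:vibe-d": {
            "version": "~>1.2.0",
            "registry": "https://dub.mycorp.internal"
        }
    }
}
```

With a scheme like this, a private fork and the public package can coexist under the same base name without either registry knowing about the other, which is essentially what the article describes Julia's Pkg doing.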
 3. Cross compilation.
 
 The BinaryBuilder package allows you to cross-compile the same
 program to all platforms supported by Julia
This IMO would be a big selling point for dub. We need a comprehensive cross-compiling solution that doesn't need manual hacking to work. LDC's built-in Windows target is awesome, but still needs manual setup. Cross-compilation to Mac is still incomplete, and WASM support is separate and needs work.

T

-- 
Frank disagreement binds closer than feigned agreement.
Dec 09 2022
parent Walter Bright <newshound2 digitalmars.com> writes:
On 12/9/2022 9:28 AM, H. S. Teoh wrote:
 The BinaryBuilder package allows you to cross-compile the same
 program to all platforms supported by Julia
This IMO would be a big selling point for dub. We need a comprehensive cross-compiling solution that doesn't need manual hacking to work. LDC's built-in Windows target is awesome, but still needs manual setup. Cross-compilation to Mac is still incomplete, and WASM support is separate and needs work.
Recently, dmd acquired the ability to cross-compile for all its supported platforms. Cross-linking, though, remains a problem.
Dec 09 2022
prev sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 12/9/2022 6:11 AM, Paul Backus wrote:
 Resolving environments feels instant, as opposed to the glacially slow Conda 
 that Python offers. The global "general" registry is downloaded as a single 
 gzipped tarball, and read directly from the zipped tarball, making registry 
 updates way faster than updating Cargo's crates.io.
dmd could be enhanced to read source files from a zip or tarball, so these wouldn't have to be expanded before compilation.
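The compiler integration itself is hypothetical, but Phobos's `std.zip` already makes the archive-reading half straightforward. A minimal sketch of pulling `.d` sources out of a zip entirely in memory:

```d
// Reading .d sources directly out of a zip archive with std.zip,
// without expanding the archive to disk first.
import std.algorithm : endsWith;
import std.file : read;
import std.zip : ZipArchive;

string[string] loadSources(string zipPath)
{
    string[string] sources;  // archive entry name -> source text
    auto zip = new ZipArchive(read(zipPath));
    foreach (name, member; zip.directory)
    {
        if (!name.endsWith(".d"))
            continue;
        zip.expand(member);  // decompress just this entry, in memory
        sources[name] = cast(string) member.expandedData.idup;
    }
    return sources;
}
```

Feeding the resulting buffers into the compiler's import machinery would be the actual enhancement; nothing here requires the archive to ever touch the filesystem unpacked.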
Dec 09 2022
parent reply Greggor <Greggor notareal.email> writes:
On Friday, 9 December 2022 at 19:07:21 UTC, Walter Bright wrote:
 On 12/9/2022 6:11 AM, Paul Backus wrote:
 Resolving environments feels instant, as opposed to the 
 glacially slow Conda that Python offers. The global "general" 
 registry is downloaded as a single gzipped tarball, and read 
 directly from the zipped tarball, making registry updates way 
 faster than updating Cargo's crates.io.
dmd could be enhanced to read source files from a zip or tarball, so these wouldn't have to be expanded before compilation.
I would love that! I usually only read the forums and lurk; I made an account just to say this.

In my case, I don't use dub or any other package manager. I am a strong believer that your project should build without an internet connection. In a lot of my projects, what I do is keep all my dependencies in tar files, and the build script unpacks them and then builds them. It would be lovely if I could skip the unpacking and directly feed them into dmd.

Not only that, this could also be an awesome way of easing the distribution of programs via src. I would love it if I could do something like `dmd myproject.tar.xz -of=myproject`
Dec 09 2022
next sibling parent Walter Bright <newshound2 digitalmars.com> writes:
On 12/9/2022 9:53 PM, Greggor wrote:
 I would love that, I usually only read the forums & lurk, I made an account
just 
 to say this!
Hmm. Looks like I'm not the only one!
Dec 09 2022
prev sibling next sibling parent rikki cattermole <rikki cattermole.co.nz> writes:
On 10/12/2022 6:53 PM, Greggor wrote:
 I would love that, I usually only read the forums & lurk, I made an 
 account just to say this!
You don't need an account to post; it's entirely optional.
Dec 09 2022
prev sibling next sibling parent Hipreme <msnmancini hotmail.com> writes:
On Saturday, 10 December 2022 at 05:53:37 UTC, Greggor wrote:
 On Friday, 9 December 2022 at 19:07:21 UTC, Walter Bright wrote:
 On 12/9/2022 6:11 AM, Paul Backus wrote:
 Resolving environments feels instant, as opposed to the 
 glacially slow Conda that Python offers. The global 
 "general" registry is downloaded as a single gzipped 
 tarball, and read directly from the zipped tarball, making 
 registry updates way faster than updating Cargo's crates.io.
dmd could be enhanced to read source files from a zip or tarball, so these wouldn't have to be expanded before compilation.
I would love that, I usually only read the forums & lurk, I made an account just to say this! In my case, I don't use dub, or any other package manager, I am a strong believer of your project should build without a internet connection. In a lot of my project what I do is have all my dependencies in tar files & the build script unpacks them and then builds them. It would be lovely if I could skip the unpacking and directly feed them into dmd. Not only that, this could also be an awesome way of easing distributing programs via src. I would love it if I could do something like `dmd myproject.tar.xz -of=myproject`
While I think there are many advantages to not using a package manager, I also think that having one is really productive: being able to just add a dependency, have it automatically check the most recent version, and then save that version information is absolutely important, and it is easy. The compilation command would never be that short unless you're working on a really small project.

Although dub has many problems today, I would really like to see it become more and more useful. Currently, for me, as a package manager it is good enough. As a build system, it does not fit all my requirements, for which I have been opening issues on its repo. But unfortunately, moving my project, which is pretty big right now, from dub to any other build system is currently inviable **for me**, because I would lose days getting it stable again, days I could use for coding more features.

I can say that for any newcomer, `dub` is a blessing; not needing to know any build CLI or specific details that shouldn't be required of every developer is a game changer. The build process is a very important step of every project, and having one that is simple to use is the best choice.
Dec 10 2022
prev sibling parent reply max haughton <maxhaton gmail.com> writes:
On Saturday, 10 December 2022 at 05:53:37 UTC, Greggor wrote:
 On Friday, 9 December 2022 at 19:07:21 UTC, Walter Bright wrote:
 [...]
I would love that, I usually only read the forums & lurk, I made an account just to say this! In my case, I don't use dub, or any other package manager, I am a strong believer of your project should build without a internet connection. In a lot of my project what I do is have all my dependencies in tar files & the build script unpacks them and then builds them. It would be lovely if I could skip the unpacking and directly feed them into dmd. Not only that, this could also be an awesome way of easing distributing programs via src. I would love it if I could do something like `dmd myproject.tar.xz -of=myproject`
If you already have the dependencies then it does build without an internet connection. You can also use git submodules with dub if needed.
Dec 10 2022
parent reply Greggor <Greggor notareal.email> writes:
On Saturday, 10 December 2022 at 14:24:53 UTC, max haughton wrote:
 On Saturday, 10 December 2022 at 05:53:37 UTC, Greggor wrote:
 On Friday, 9 December 2022 at 19:07:21 UTC, Walter Bright 
 wrote:
 [...]
I would love that, I usually only read the forums & lurk, I made an account just to say this! In my case, I don't use dub, or any other package manager, I am a strong believer of your project should build without a internet connection. In a lot of my project what I do is have all my dependencies in tar files & the build script unpacks them and then builds them. It would be lovely if I could skip the unpacking and directly feed them into dmd. Not only that, this could also be an awesome way of easing distributing programs via src. I would love it if I could do something like `dmd myproject.tar.xz -of=myproject`
If you already have the dependencies then it does build without an internet connection. You can also use git submodules with dub if needed.
I'm not using git for version control, I'm using fossil, so I'm not sure if this is applicable to me.

It's not just about building without an internet connection; it's more of a measuring stick I use. There are several reasons I do it, mainly "trauma" from the JS/NPM ecosystem. The two stories I'm sharing may not seem relevant, but they are to the point I'm making. Maybe I'm a goof and live in a bubble, but I'd like to argue that D currently has a better dependency experience than most ""nicer"" systems, based on some anecdotal evidence :^)

JS:
My experience with the JS ecosystem, especially with node.js, is 
that anything I have written or used that relied on NPM packages, 
if unmaintained for a little while, will stop working or gain odd 
bugs for no good reason. At one point, while working on a personal 
project in JS/node, I had the freedom to do an experiment: I 
decided not to use NPM and minimized my dependency use, and it 
changed my perspective on JS. The language is fine; it's the 
tooling, culture, and ecosystem around it that make it less so.

Eventually I wrote a single-file library that held a lot of 
functions I'd reuse, and what's funny about this is that I did 
what some C devs do: I created the "single header" library, but in 
the context of JS.
Python:
Please understand that I am not a Python programmer, so I may have 
gone about this the wrong way, but I do have a very negative 
experience with Python as a user, and I feel that it's still 
important to mention.

About two months ago I set up on my machine a distribution of 
Stable Diffusion (an image-gen AI; it's cool, look it up), and 
every step of the way it was a horrid experience. First I tried 
following the instructions provided by the project; they used a 
package manager called Anaconda. It took over 10 minutes for it 
to "resolve", and even then it failed to figure it out.

After throwing away an hour of my life, I decided to do it myself, 
so I figured out the packages I needed and installed them via pip. 
I even thought I was being a smart cookie and used something 
called virtual environments, and yes, it did work well, until my 
distribution updated the Python version and everything exploded. 
I found out that the Python ecosystem uses a lot of native C/C++ 
libs and that a lot of Python libraries are wrappers for them, 
which creates a lot of "fun" when Python updates.

A virtual environment in Python does not include a copy of the 
Python install at the time it was created; instead it symlinks it 
:/, so it's not actually isolated. It's just a crutch for pip, 
because Python, like a lot of languages, follows the horrid 
practice of installing dependencies globally (or user-wide). After 
more hair loss I gave up on updating the deps; nothing seemed to 
work, and I have never written any Python, so I am not the person 
to fix it. What I ended up doing was building Python 3.10 from 
src, creating an install just for this program, and changing the 
symlink in the Python env to point to it. What's even more sad is 
that I found out you can't move a Python environment after you 
make it: when building Python, I learned that you have to 
hard-code the path of where the interpreter lives. What twisted 
mind thought this was a good idea?

I wish the people doing AI had picked a better language. I do not 
understand how they reproduce anything, but I guess they are 
smarter than I am, so maybe I'm just too dumb for Python.
Dlang:
Looking at my currently open D project, I have a couple of 
dependencies, and none of them have any sub-dependencies. That's 
Awesome!

When DMD updates on my system, most things still work; at worst I 
get a deprecation message in my build log and I investigate it, 
and the upgrade tends to be very simple. That is Awesome!

Unlike my C++ experience, most of the time Phobos has what I need, 
and there is zero reason to reach for some 3rd-party nonsense. 
When I do use an external dependency, it tends to be of good 
quality; most don't pull in a ton of sub-dependencies or have 
complicated build steps. In most cases I can just take a DUB 
package, add the src to my src tree, and call it a day.
My goal here is not to appear like a luddite or to tell others to be ones. I think having nice tooling is a good goal, and I'd love to have a good package manager for D with a quality ecosystem. Here is how I would go about it:

* A dependency should always be just a tar/zip file with src code in it.
* The dependency tar/zips should always be stored in the project directory, not in some system or user folder.
* No use of symlinks.
* To help discourage NPM insanity, build in a ""bloat"" measuring tool: how many total dependencies are you using? How many KLOC is it?
* https://code.dlang.org should have a link for the manual direct download of the src zip for all versions.

Github, Gitea instances, and Fossil all have a way of providing a zip file for releases; src zips are already a near-universal method of publishing code. So this theoretical package manager can be really fast and light: it's basically a glorified text search & download tool.
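Under those constraints the fetch step really is small. A minimal sketch in D of the "glorified download tool", assuming a hypothetical direct-download URL layout (the URL scheme below is illustrative, not the registry's actual API):

```d
// fetch_dep.d - hypothetical "glorified download tool" sketch.
import std.algorithm : endsWith;
import std.net.curl : download;
import std.path : buildPath, dirName;
import std.zip : ZipArchive;
static import std.file;

void fetchDep(string name, string ver)
{
    // Hypothetical URL layout; the real registry's download API may differ.
    auto url = "https://code.dlang.org/packages/" ~ name ~ "/" ~ ver ~ ".zip";
    auto archivePath = buildPath("deps", name ~ "-" ~ ver ~ ".zip");
    std.file.mkdirRecurse("deps");          // dependencies live in the project dir
    download(url, archivePath);

    // Unpack next to the archive so the sources join the build's -I path.
    auto zip = new ZipArchive(std.file.read(archivePath));
    foreach (entryName, member; zip.directory)
    {
        if (entryName.endsWith("/"))        // skip directory entries
            continue;
        auto dest = buildPath("deps", name, entryName);
        std.file.mkdirRecurse(dest.dirName);
        zip.expand(member);
        std.file.write(dest, member.expandedData);
    }
}
```

No symlinks, no user-wide cache, and the archive itself stays in the project tree, which matches the wishlist above; the version-resolution problem raised elsewhere in this thread is exactly what this sketch leaves out.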
Dec 10 2022
next sibling parent rikki cattermole <rikki cattermole.co.nz> writes:
On 11/12/2022 2:41 PM, Greggor wrote:
 Github, Gitea instances & Fossil all have a way of providing a zip file 
 for releases, src zips are already a near universal method of publishing 
 code. So this theoretical package manager can be really fast and light, 
 its basically an over gloried text search & download tool.
The dub-registry abstracts that and provides zips for dub to download.

In an ideal world, yes, we could just download zips of sources and build. But we don't live in an ideal world. Builds are complicated. Sometimes you need to specify versions, run pre-build steps, use specific flags, exclude some sources for specific targets, etc. Build managers hide all that from users so it is as simple as specifying a dependency and hitting build.

But the biggest cost in dub is the searching of the registry for packages. ``--skip-registry=all`` is a massive speed-up. There is work going into speeding the registry search up, but the gain won't be quite as massive as caching the registry locally.
Dec 10 2022
prev sibling next sibling parent zjh <fqbqrr 163.com> writes:
On Sunday, 11 December 2022 at 01:41:21 UTC, Greggor wrote:

 ...
Generally speaking, no matter what language you use, packages that `rely on` too many other `packages` are not `good` packages. Try to `avoid using` them.
Dec 10 2022
prev sibling parent reply max haughton <maxhaton gmail.com> writes:
On Sunday, 11 December 2022 at 01:41:21 UTC, Greggor wrote:
 On Saturday, 10 December 2022 at 14:24:53 UTC, max haughton 
 wrote:
 [...]
I'm not using git for version control, I'm using fossil so I'm not sure if this is applicable to me. Its not just about building without an internet connection. It's more of a measuring stick I use. There are several reasons I do it, mainly its "trauma" from the JS/NPM ecosystem. the two stories I'm sharing may not seem relevant, but they are to the point I'm making. Maybe I'm a goof & live in a bubble, but I'd like to argue that D currently has a better dependence experience then most ""nicer"" systems based on some anecdotal evidence :^) JS:
[...]
Python:
[...]
Dlang:
[...]
My goal here is not to appear like a luddite or to tell others to be ones, I think having nice tooling is a good goal, I'd love to have a good package manager for D with a quality ecosystem. Here is how I would go about this: * A dependency should always be just a tar/zip file with src code in it. * The dependency (tar/zip)s should always be stored in the project directory & not in some system or user folder. * No use of symlinks * To help discourage NPM insanity, Build in a ""Bloat"" measuring tool, how many total dependencies are you using? how many KLOC is it? * https://code.dlang.org should have a link for the manual direct download for the src zip for all versions. Github, Gitea instances & Fossil all have a way of providing a zip file for releases, src zips are already a near universal method of publishing code. So this theoretical package manager can be really fast and light, its basically an over gloried text search & download tool.
The way to use a git submodule with dub is just that you can point it at a path; it doesn't care where the path came from as long as there's a dub repo there.

Fetching code and putting it somewhere is relatively trivial; it's not why people use package managers. Dealing with a rat's nest of dependencies and resolving which versions satisfy all of them (if at all) is why you eventually need a "real" package manager, or rather why package managers end up looking the way they do.

Also, saving dependencies in a per-project folder is tempting and sometimes correct, but if you have to build a real product you typically have either subprojects or entirely separate projects with the same dependencies in them. If these are all stored in their own local folder, then you end up fetching and building things multiple times (dub suffered from the latter until very recently, IIRC). You could have a single per-product build dir, but at that point you've lost the locality and might as well just have it in /home/ or wherever.
Dec 10 2022
parent reply bachmeier <no spam.net> writes:
On Sunday, 11 December 2022 at 03:16:24 UTC, max haughton wrote:

 Also saving dependencies in a user folder is tempting and 
 sometimes correct but if you have to build a real product you 
 typically have either subprojects or entirely separate projects 
 with the same dependencies in them — if these are all stored in 
 their own local folder then you end up fetching and building 
 things multiple times (dub suffered from the latter until very 
 recently IIRC). You could have a single per-product build dir 
 but at that point you've lost the locality and might as well 
 just have it in /home/ or wherever.
This language is not helpful. It is used by people with a very narrow set of experiences who do not understand other use cases. That leads to terrible design decisions. I for one have no interest in building what you define to be a "real product".
Dec 10 2022
parent max haughton <maxhaton gmail.com> writes:
On Sunday, 11 December 2022 at 03:49:53 UTC, bachmeier wrote:
 On Sunday, 11 December 2022 at 03:16:24 UTC, max haughton wrote:

 Also saving dependencies in a user folder is tempting and 
 sometimes correct but if you have to build a real product you 
 typically have either subprojects or entirely separate 
 projects with the same dependencies in them — if these are all 
 stored in their own local folder then you end up fetching and 
 building things multiple times (dub suffered from the latter 
 until very recently IIRC). You could have a single per-product 
 build dir but at that point you've lost the locality and might 
 as well just have it in /home/ or wherever.
This language is not helpful. It is used by people with a very narrow set of experiences that do not understand other use cases. That leads to terrible design decisions. I for one have no interest in build what you define to be a "real product".
I understand them; I just don't think it's a useful model to build a package manager around. If you just want the package manager to be a glorified curl/git wrapper that fetches code with no other bells and whistles, why not just use curl/git?

What dub should be able to do, with respect to locality, is initialise and manage submodules for dependencies automatically, as this makes some dependency-safety issues much easier to deal with and makes transitioning to internal forks and the like much easier.
Dec 11 2022
prev sibling next sibling parent Guillaume Piolat <first.last spam.org> writes:
On Thursday, 8 December 2022 at 17:47:42 UTC, Walter Bright wrote:
 1. 
 https://viralinstruction.com/posts/goodjulia/#the_package_manager_is_amazing

 I've never thought of a package manager that way.
At the end of the day, someone who uses the package manager will have much less direct contact with the compiler. The package manager is the primary entry point, and you "speak" more with it than with any other tool. In this regard, DUB having colors is more significant than dmd having colors. The only reason I would type `dmd` or `ldc2` nowadays is to reduce a bug for Bugzilla.

Modern UI toolkits like Flutter come with a command-line tool through which you do most of your work. It's like a nice interface for "doing things with the library".
Dec 09 2022
prev sibling parent Don Allen <donaldcallen gmail.com> writes:
On Thursday, 8 December 2022 at 17:47:42 UTC, Walter Bright wrote:
 Here's a good thought provoking article:

 https://viralinstruction.com/posts/goodjulia/

 A couple of things stood out for me:


 1. 
 https://viralinstruction.com/posts/goodjulia/#the_package_manager_is_amazing

 I've never thought of a package manager that way.


 2. "Rust, for example, may have a wonderfully expressive type 
 system, but it's also boilerplate heavy, and its borrowchecker 
 makes writing any code that compiles at all quite a time 
 investment. An investment, which most of the time gives no 
 returns when you're trying to figure how to approach the 
 problem in the first place. It's also not entirely clear how I 
 would interactively visualise and manipulate a dataset using a 
 static language like Rust."

 I've always thought that a great strength of D was its 
 plasticity, meaning you can easily change data structures and 
 algorithms as you're writing and rewriting code. Apparently 
 this is much more difficult in Rust, which will inevitably 
 result in less efficiency, even if the compiler for it 
 generates very good code.
I speak with a fair bit of experience with both Rust and D. In my opinion, what gives the writer of the Julia article heartburn about Rust has nothing to do with static vs. dynamic typing. Rust is difficult to learn because its insistence upon GC-less memory safety places a significant memory-management burden on the programmer. That's what all the ownership rules are about, and the notorious borrow-checker is relentless in enforcing those rules. This is not a language for prototyping. You have to have a very clear idea of your design decisions and how they relate to the ownership/borrowing rules, or you will find yourself in a world of considerable frustration.

This is much less true of D (and, I'm sure, Go and Nim, with which I have only a little experience). It's also less true of Haskell, with which I have a lot of experience, which also has a demanding compiler. But those demands are mostly about proper use of Haskell's type system and don't off-load work onto the programmer because there's an empty space where a GC ought to be.

Having said all this, once you learn how to deal with Rust, you learn where the land-mines are and how to avoid them. Using it then becomes a more normal experience, but the time and effort to get to that steady state is greater than for any language I've ever used in 60+ years of writing code. I will say that the compiler provides excellent error messages, as well as many Lint-ish suggestions about eliminating unnecessary and/or unused things from your code. Cargo is also very solid: easy to use and well documented. Once you get your code to compile, it's much like Haskell: it works, modulo your own logic errors.
Dec 09 2022