
digitalmars.D.announce - D-lighted, I'm Sure

reply Mike Parker <aldacron gmail.com> writes:
Not long ago, in my retrospective on the D Blog in 2018, I 
invited folks to write about their first impressions of D. Ron 
Tarrant, who you may have seen in the Learn forum, answered the 
call. The result is the latest post on the blog, the first guest 
post of 2019. Thanks, Ron!

As a reminder, I'm still looking for new-user impressions and 
guest posts on any D-related topic. Please contact me if you're 
interested. And don't forget, there's a bounty for guest posts, 
so you can make a bit of extra cash in the process.

The blog:
https://dlang.org/blog/2019/01/18/d-lighted-im-sure/

Reddit:
https://www.reddit.com/r/programming/comments/ahawhz/dlighted_im_sure_the_first_two_months_with_d/
Jan 18 2019
next sibling parent reply Steven Schveighoffer <schveiguy gmail.com> writes:
On 1/18/19 9:29 AM, Mike Parker wrote:
 Not long ago, in my retrospective on the D Blog in 2018, I invited folks 
 to write about their first impressions of D. Ron Tarrant, who you may 
 have seen in the Learn forum, answered the call. The result is the latest 
 post on the blog, the first guest post of 2019. Thanks, Ron!
 
 As a reminder, I'm still looking for new-user impressions and guest 
 posts on any D-related topic. Please contact me if you're interested. 
 And don't forget, there's a bounty for guest posts, so you can make a 
 bit of extra cash in the process.
 
 The blog:
 https://dlang.org/blog/2019/01/18/d-lighted-im-sure/
 
 Reddit:
 https://www.reddit.com/r/programming/comments/ahawhz/dlighted_im_sure_the_first_two_months_with_d/ 
 
Nice read! And welcome to Ron! I, too, started with BASIC, but on a Commodore 64 :) -Steve
Jan 18 2019
parent reply Ron Tarrant <rontarrant gmail.com> writes:
On Friday, 18 January 2019 at 15:08:48 UTC, Steven Schveighoffer 
wrote:

 Nice read! And welcome to Ron! I too, started with BASIC, but 
 on a Commodore 64 :)

 -Steve
Thanks, Steve. Just to set the record straight, I only had access to that Coleco Adam for the few weeks I was in that Newfoundland outport. Within a year, I too had my very own C-64 plugged into a monster Zenith console job. Remember those? I don't remember what I paid for a used C-64, but the Zenith 26" was $5 at a garage sale up the street and another $5 for delivery.
Jan 18 2019
next sibling parent reply Steven Schveighoffer <schveiguy gmail.com> writes:
On 1/18/19 11:42 AM, Ron Tarrant wrote:
 On Friday, 18 January 2019 at 15:08:48 UTC, Steven Schveighoffer wrote:
 
 Nice read! And welcome to Ron! I too, started with BASIC, but on a 
 Commodore 64 :)
Thanks, Steve. Just to set the record straight, I only had access to that Coleco Adam for the few weeks I was in that Newfoundland outport. Within a year, I too had my very own C-64 plugged into a monster Zenith console job. Remember those? I don't remember what I paid for a used C-64, but the Zenith 26" was $5 at a garage sale up the street and another $5 for delivery.
I had to use my parents' TV in the living room :) And I was made to learn typing before I could play games on it, so cruel... -Steve
Jan 18 2019
next sibling parent "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Fri, Jan 18, 2019 at 12:06:54PM -0500, Steven Schveighoffer via
Digitalmars-d-announce wrote:
 On 1/18/19 11:42 AM, Ron Tarrant wrote:
[...]
 Just to set the record straight, I only had access to that Coleco
 Adam for the few weeks I was in that Newfoundland outport. Within a
 year, I too had my very own C-64 plugged into a monster Zenith
 console job.  Remember those? I don't remember what I paid for a
 used C-64, but the Zenith 26" was $5 at a garage sale up the street
 and another $5 for delivery.
I had to use my parents' TV in the living room :) And I was made to learn typing before I could play games on it, so cruel...
[...] Wow, what cruelty! ;-) The Apple II was my first computer ever, and I spent 2 years playing computer games on it until they were oozing out of my ears. Then I got so fed up with them that I decided I'm gonna write my own. So began my journey into BASIC, and then 6502 assembly, etc.. A long road later, I ended up here with D. T -- This is a tpyo.
Jan 18 2019
prev sibling parent Ron Tarrant <rontarrant gmail.com> writes:
On Friday, 18 January 2019 at 17:06:54 UTC, Steven Schveighoffer 
wrote:

 I had to use my parents' TV in the living room :) And I was 
 made to learn typing before I could play games on it, so 
 cruel...
LOL! (Ahem) I feel your pain, sir.
Jan 19 2019
prev sibling parent reply Meta <jared771 gmail.com> writes:
On Friday, 18 January 2019 at 16:42:15 UTC, Ron Tarrant wrote:
 Just to set the record straight, I only had access to that 
 Coleco Adam for the few weeks I was in that Newfoundland 
 outport. Within a year, I too had my very own C-64 plugged into 
 a monster Zenith console job. Remember those? I don't remember 
 what I paid for a used C-64, but the Zenith 26" was $5 at a 
 garage sale up the street and another $5 for delivery.
Great read Ron. Can I ask which town in Newfoundland it was where you stayed back in 1985?
Jan 18 2019
parent reply Ron Tarrant <rontarrant gmail.com> writes:
On Friday, 18 January 2019 at 19:55:34 UTC, Meta wrote:

 Great read Ron. Can I ask which town in Newfoundland it was 
 where you stayed back in 1985?
Sure. I was in St. Lawrence on the Burin Peninsula. Do you know it?
Jan 19 2019
parent Meta <jared771 gmail.com> writes:
On Saturday, 19 January 2019 at 22:09:57 UTC, Ron Tarrant wrote:
 On Friday, 18 January 2019 at 19:55:34 UTC, Meta wrote:

 Great read Ron. Can I ask which town in Newfoundland it was 
 where you stayed back in 1985?
Sure. I was in St. Lawrence on the Burin Peninsula. Do you know it?
Unfortunately (or fortunately?) not. I've spent a good deal of time exploring the western side of NFLD but have only visited Gander and St. John's on the eastern side.
Jan 21 2019
prev sibling next sibling parent reply "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Fri, Jan 18, 2019 at 02:29:14PM +0000, Mike Parker via
Digitalmars-d-announce wrote:
[...]
 The blog:
 https://dlang.org/blog/2019/01/18/d-lighted-im-sure/
[...] Very nice indeed! Welcome aboard, Ron!

And wow... 6502? That's what I grew up on too! I used to remember most of the opcodes by heart... though nowadays that memory has mostly faded away. The thought of it still evokes nostalgic feelings, though.

I'm also not a big fan of dub, but I'm in the minority around these parts. Having grown up on makefiles and dealt with them in a large project at my day job, I've developed a great distaste for them, and nowadays the standard build tool I reach for is SCons. Though possibly in the not-so-distant future I might start using something more scalable like Tup, or Button, written by one of our very own D community members. But for small projects, just plain ole dmd is Good Enough(tm) for me.

I won't bore you with my boring editor, vim (with no syntax highlighting -- yes, I've been told I'm crazy, and in fact I agree -- just plain ole text, with little things like autoindenting, no fancy IDE features; Linux is my IDE, the whole of it :-P). Vim users seem to be out in force around these parts for some reason, besides the people clamoring for a "proper" IDE, but I suspect I'm the only one who deliberately turns *off* syntax highlighting, and indeed any sort of color output from dmd or any other tools (I find it distracting). So don't pay too much heed to what I say, at least on this subject. :-D

T -- You only live once.
Jan 18 2019
next sibling parent reply JN <666total wp.pl> writes:
On Friday, 18 January 2019 at 18:48:00 UTC, H. S. Teoh wrote:
 I'm also not a big fan of dub, but I'm in the minority around 
 these parts.  Having grown up on makefiles and dealt with them 
 in a large project at my day job, I've developed a great 
 distaste for them, and nowadays the standard build tool I reach 
 for is SCons.  Though possibly in the not-so-distant future I 
 might start using something more scalable like Tup, or Button, 
 written by one of our very own D community members. But for 
 small projects, just plain ole dmd is Good Enough(tm) for me.
The trick with makefiles is that they work well for a single developer, or a single project, but become an issue when dealing with multiple libraries, each one coming with its own makefile (if you're lucky; if you're not, you have multiple CMake/SCons/etc. systems to deal with). Makefiles are very tricky to do cross-platform, especially on Windows, and usually they aren't enough; I've often seen people use bash/python/ruby scripts to drive the build process anyway.

The big thing dub provides is package management. Having a package manager is an important thing for a language nowadays. Gone are the days of hunting for library sources and figuring out where to put includes. Just add a line in your dub.json file and you have the library. Need to upgrade to a newer version? Just change the version in the dub.json file. Need to download the project from scratch? No problem, dub can use the json file to download all the dependencies in the proper versions.
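To make that concrete, a dependency declaration in dub.json is a one-liner. The package name and version range below are just an illustration (GtkD's dub package, with a version picked for the example):

```json
{
    "name": "myapp",
    "description": "Example app pulling in a library via dub",
    "dependencies": {
        "gtk-d": "~>3.8.0"
    }
}
```

Running `dub build` (or `dub run`) then fetches a matching version of the library, builds it, and sets up the import paths automatically.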
Jan 18 2019
next sibling parent reply "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Fri, Jan 18, 2019 at 06:59:59PM +0000, JN via Digitalmars-d-announce wrote:
[...]
 The trick with makefiles is that they work well for a single
 developer, or a single project, but become an issue when dealing with
 multiple libraries, each one coming with its own makefile (if you're
 lucky, if you're not, you have multiple CMake/SCons/etc. systems to
 deal with). Makefiles are very tricky to do crossplatform, especially
 on Windows, and usually they aren't enough, I've often seen people use
 bash/python/ruby scripts to drive the building process anyway.
Actually, the problems I had with makefiles come from within single projects. One of the most fundamental problems, which is also a core design decision of Make, is that it's timestamp-based. This means: (1) it often builds unnecessarily -- `touch source.d` and it rebuilds source.d even though the contents haven't changed; and (2) it often fails to build necessary targets -- if for whatever reason your system clock is out of sync, a newer version of source.d can have an earlier date than a previously-built object.

Furthermore, makefiles generally do not have a global view of your workspace, so builds are not reproducible (unless you go out of your way to make them so). Running `make` after editing some source files does not guarantee you'll end up with the same executables as if you checked in your changes, did a fresh checkout, and ran `make`. I've had horrible all-nighters looking for heisenbugs that have no representation in the source code, but are caused by make picking up stale object files from who knows how many builds ago. You end up having to `make clean; make` every other build "just to be sure", which is really stupid in this day and age. (And even `make clean` does not guarantee a clean workspace -- more projects than I care to count exhibit this problem.)

Then there's parallel building, which again requires explicit effort, the macro hell typical of tools from that era, etc.. I've already ranted about this at great length before, so I'm not going to repeat it again. But make is currently near (if not at) the bottom of my list of build tools for many, many reasons.

Ultimately, as I've already said elsewhere, what is needed is a *standard, tool-independent dependency graph declaration* attached to every project, one that captures the project's dependency graph in a way that any tool understanding the standard format can parse and act on. At the core of it, every build system out there is essentially an implementation of a directed acyclic graph walk -- a standard problem with standard algorithms to solve it. But everybody rolls their own implementation, gratuitously incompatible with everything else, and so we find ourselves today with multiple incompatible build systems that, in large-scale software, often have to somehow co-exist within the same project.
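That DAG walk is small enough to sketch. The following toy D program (target names invented for the example, with writeln standing in for the real build action) is the entire algorithmic core that every build tool reimplements:

```d
// A build is just a post-order walk of a directed acyclic
// dependency graph: build prerequisites, then the target.
import std.stdio : writeln;

struct Target
{
    string name;
    string[] deps;
}

void build(string name, Target[string] graph, ref bool[string] done)
{
    if (name in done) return;          // already built this run
    foreach (dep; graph[name].deps)
        build(dep, graph, done);       // prerequisites first
    writeln("building ", name);        // stand-in for the real action
    done[name] = true;
}

void main()
{
    Target[string] graph = [
        "app":    Target("app",    ["main.o", "util.o"]),
        "main.o": Target("main.o", ["main.d"]),
        "util.o": Target("util.o", ["util.d"]),
        "main.d": Target("main.d", []),
        "util.d": Target("util.d", []),
    ];
    bool[string] done;
    build("app", graph, done);  // sources, then objects, then app
}
```

Everything else -- change detection, parallelism, caching -- is bookkeeping layered on top of this walk.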
 The big thing dub provides is package management. Having a package
 manager is an important thing for a language nowadays. Gone are the
 days of hunting for library source, figuring out where to put
 includes. Just add a line in your dub.json file and you have the
 library. Need to upgrade to newer version? Just change the version in
 dub.json file. Need to download the project from scratch? No problem, 
 dub can use the json file to download all the dependencies in proper
 versions.
Actually, I have the opposite problem. All too often, my projects that depend on some external library become uncompilable because said library has upgraded from version X to version Z, and version X doesn't exist anymore (the oldest version is now Y), or upstream made an incompatible change, or the network is down and dub can't download the right version, etc.. These days, I'm very inclined to just download the exact version of the source code that I need and include it as part of my source tree, just so there will be no gratuitous breakage due to upstream changes, old versions being no longer supported, or OS changes that break pre-shipped .so files, and all of that nonsense. Just compile the damn thing from scratch from the exact version of the sources that you KNOW works -- sources that you have in hand RIGHT HERE, instead of somewhere out there in the nebulous "cloud", which happens to be unreachable right now because your network is down, and in order to fix the network you need to compile this tool that depends on said missing sources.

I understand it's convenient for the package manager to "automatically" install dependencies for you, refresh to the latest version, and what-not. But frankly, I find that the amount of effort it takes to download the source code of some library and set up the include paths manually is minuscule compared to the dependency hell I have to deal with in a system like dub. These days I almost automatically write off 3rd party libraries that have too many dependencies. The best kind of 3rd party code is the standalone kind, like the kind Adam Ruppe provides: just copy the lousy source code into your source tree and import that. 10 years later it will still compile and work as before, and won't suddenly die from missing .so files (because they've been replaced by newer, ABI-incompatible ones), new upstream versions that broke the old API, or missing source files that were gone from the dub cache when you migrated to a new hard drive and now can't be fetched because your network is down -- or, worst of all, one of the dependencies of the dependencies of the library you depend on has vanished into the ether and/or become gratuitously incompatible, so now you have to spend 5 hours upgrading the entire codebase and 5 days rewriting your code to work around the missing functionality, etc..

I'm not against fetching new versions of dependencies and what-not, but I'd like to do that as a conscious action, instead of some tool deciding to "upgrade" my project and ending up with something uncompilable -- in the middle of a code-build-debug cycle. I don't *want* stuff upgraded behind my back when I'm trying to debug something! With Adam-style libraries, you just copy the damn file into your source tree and that's all there is to it. When you need to upgrade, just download the new file, copy it into your source, and recompile. If that breaks, roll back your code repo and off you go. I'm sick of the baroque nonsense that is the dependency hell of "modern" package managers, or worse, the *versioned* dependency hell of having to explicitly specify which version of the library you depend on. The source file you copied into the source tree *is* the version you're depending on, period. No fuss, no muss.

(And don't get me started on dub as a *build* tool. It's actually not bad as a *package manager*, but as a build tool I find it essentially unusable, because it does not support many things I need, like codegen and non-code-related build tasks such as data generation. Until dub supports those things, it's not even an option for many of my projects.)

T -- All problems are easy in retrospect.
Jan 18 2019
parent reply Neia Neutuladh <neia ikeran.org> writes:
On Fri, 18 Jan 2019 11:43:58 -0800, H. S. Teoh wrote:
 (1) it often builds unnecessarily -- `touch source.d` and it rebuilds
 source.d even though the contents haven't changed; and
Timestamp-based change detection is simple and cheap. If your filesystem supports a revision id for each file, that might work better, but I haven't heard of such a thing. If you're only dealing with a small number of small files, content-based change detection might be a reasonable option.
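Content-based change detection is straightforward to sketch in D with the standard digest module. This is a toy illustration (file names invented), not any real build tool's implementation:

```d
// Hash a file's contents and compare against the digest recorded
// at the last build. A `touch` (or rewriting identical bytes)
// changes the timestamp but not the digest, so no rebuild is needed.
import std.digest.md5 : md5Of;
import std.digest : toHexString;
import std.file : read, write;
import std.stdio : writeln;

bool needsRebuild(string path, ref string[string] lastDigest)
{
    auto hex = toHexString(md5Of(cast(const(ubyte)[]) read(path))).idup;
    bool changed = (path !in lastDigest) || lastDigest[path] != hex;
    lastDigest[path] = hex;    // remember for the next check
    return changed;
}

void main()
{
    string[string] cache;
    write("demo_source.tmp", "int x = 1;");
    writeln(needsRebuild("demo_source.tmp", cache)); // true: first sighting
    // Rewriting identical content updates the timestamp only,
    // so make would rebuild here, but a content-based check won't:
    write("demo_source.tmp", "int x = 1;");
    writeln(needsRebuild("demo_source.tmp", cache)); // false
}
```

A timestamp-based tool sees the second write as a change; the digest comparison correctly skips the rebuild.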
 (2) it often fails to build necessary targets -- if for whatever reason
 your system clock is out-of-sync or whatever, and a newer version of
 source.d has an earlier date than a previously-built object.
I'm curious what you're doing that you often have clock sync errors.
Jan 18 2019
parent reply "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Fri, Jan 18, 2019 at 08:03:09PM +0000, Neia Neutuladh via
Digitalmars-d-announce wrote:
 On Fri, 18 Jan 2019 11:43:58 -0800, H. S. Teoh wrote:
 (1) it often builds unnecessarily -- `touch source.d` and it
 rebuilds source.d even though the contents haven't changed; and
Timestamp-based change detection is simple and cheap. If your filesystem supports a revision id for each file, that might work better, but I haven't heard of such a thing.
Barring OS/filesystem support, there are recent OS features like inotify that let a build daemon listen for changes to files within a subdirectory. Tup, for example, uses this to make build times proportional to the size of the changeset rather than the size of the entire workspace. I consider this an essential feature of a modern build system.

Timestamp-based change detection also does needless work even when there *is* a change. For example, edit source.c, change a comment, and make will recompile it all the way down -- .o file, .so file or executable, all dependent targets, etc.. Whereas content-based change detection (e.g. based on an md5 checksum) will stop at the .o step, because the comment did not cause the .o file to change, so further actions like linking into the executable are superfluous and can be elided. For small projects the difference is negligible, but for large-scale projects this can mean the difference between a few seconds -- usable for a high-productivity code-compile-test cycle -- and half an hour, which completely breaks the productivity cycle.
 If you're only dealing with a small number of small files,
 content-based change detection might be a reasonable option.
Content-based change detection is essential IMO. It's onerous if you use the old scan-the-entire-source-tree model of change detection; it's actually quite practical if you use a modern inotify- (or equivalent) based system.
 (2) it often fails to build necessary targets -- if for whatever
 reason your system clock is out-of-sync or whatever, and a newer
 version of source.d has an earlier date than a previously-built
 object.
I'm curious what you're doing that you often have clock sync errors.
Haha, that's just an old example from back in the bad ole days when NTP syncing was rare and everyone's PC was slightly off, anywhere from seconds to minutes (or, if it was really badly managed, hours, or maybe the wrong timezone or whatever). The problem was most manifest when networked filesystems were involved.

These days, clock sync isn't really a problem anymore, generally speaking, but there's still something else about make that makes it fail to pick up changes. I still regularly have to `make clean; make` makefile-based projects just to get the lousy system to pick up the changes. I don't have that problem with more modern build systems. Probably it's an issue of undetected dependencies.

T -- I think Debian's doing something wrong, `apt-get install pesticide' doesn't seem to remove the bugs on my system! -- Mike Dresser
Jan 18 2019
parent reply Jacob Carlborg <doob me.com> writes:
On 2019-01-18 21:23, H. S. Teoh wrote:

 Haha, that's just an old example from back in the bad ole days where NTP
 syncing is rare, and everyone's PC is slightly off anywhere from seconds
 to minutes (or if it's really badly-managed, hours, or maybe the wrong
 timezone or whatever).
I had one of those issues at work. One day when I came in to work it was suddenly not possible to SSH into a remote machine; it had worked the day before. It turned out the ntpd daemon was not running on the remote machine (for some reason), and we're using Kerberos with SSH, which means that if the clocks are too far out of sync it will not be possible to log in. That was a ... fun debugging experience. -- /Jacob Carlborg
Jan 18 2019
parent "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Fri, Jan 18, 2019 at 09:41:14PM +0100, Jacob Carlborg via
Digitalmars-d-announce wrote:
 On 2019-01-18 21:23, H. S. Teoh wrote:
 
 Haha, that's just an old example from back in the bad ole days where
 NTP syncing is rare, and everyone's PC is slightly off anywhere from
 seconds to minutes (or if it's really badly-managed, hours, or maybe
 the wrong timezone or whatever).
I had one of those issues at work. One day when I came in to work it was suddenly not possible to SSH into a remote machine. It worked the day before. Turns out the ntpd daemon was not running on the remote machine (for some reason) and we're using Kerberos with SSH, that means if the clocks are too much out of sync it will not be able to login. That was a ... fun, debugging experience.
[...] Ouch. Ouch! That must not have been a pleasant experience in any sense of the word. Knowing all too well how these things tend to go, the errors you got from the SSH log were probably very unhelpful, mostly stemming from C's bad ole practice of returning a generic, unhelpful "failed" error code for all failures indiscriminately. I had to work on SSH-based code recently, and it's just ... not a nice experience overall, due to the way the C code was written. T -- GEEK = Gatherer of Extremely Enlightening Knowledge
Jan 18 2019
prev sibling parent reply Ron Tarrant <rontarrant gmail.com> writes:
On Friday, 18 January 2019 at 18:59:59 UTC, JN wrote:

 Just add a line in your dub.json file and you have the library. 
 Need to upgrade to newer version? Just change the version in 
 dub.json file. Need to download the project from scratch? No 
 problem, dub can use the json file to download all the 
 dependencies in proper versions.
Any idea where we can find a gentle intro to dub?
Jan 19 2019
parent reply Paul Backus <snarwin gmail.com> writes:
On Saturday, 19 January 2019 at 22:07:47 UTC, Ron Tarrant wrote:
 On Friday, 18 January 2019 at 18:59:59 UTC, JN wrote:

 Just add a line in your dub.json file and you have the 
 library. Need to upgrade to newer version? Just change the 
 version in dub.json file. Need to download the project from 
 scratch? No problem, dub can use the json file to download all 
 the dependencies in proper versions.
Any idea where we can find a gentle intro to dub?
It looks like the best one is the "Getting Started" page on code.dlang.org: https://dub.pm/getting_started
Jan 19 2019
parent Ron Tarrant <rontarrant gmail.com> writes:
On Saturday, 19 January 2019 at 22:15:50 UTC, Paul Backus wrote:

 It looks like the best one is the "Getting Started" page on 
 code.dlang.org:

 https://dub.pm/getting_started
Thanks, Paul. I'll take a look.
Jan 20 2019
prev sibling parent Ron Tarrant <rontarrant gmail.com> writes:
On Friday, 18 January 2019 at 18:48:00 UTC, H. S. Teoh wrote:

 Very nice indeed!  Welcome aboard, Ron!
Thanks, H.S.
 I used to remember most of the opcodes by heart... though 
 nowadays that memory has mostly faded away.
I used to write 6502 in my head while riding my bike to school, then write it out, do up a poke statement to jam it into RAM, and most of the time it worked first try. I was so impressed with myself.
 I won't bore you with my boring editor, vim (with no syntax 
 highlighting -- yes I've been told I'm crazy, and in fact I 
 agree
I read somewhere recently that syntax highlighting is considered a distraction, so you're not the only one. I use it mainly as a spellchecker. If it lights up, I know I spelled it right! :)
 Linux is my IDE, the whole of it :-P).
And I thought Atom had overhead! :) I do hope you know I'm kidding. I have been working up to installing Linux on something around here, too. And FreeBSD. I'm seriously short of hardware and space to set up other machines ATM, so it's going to have to wait.
Jan 19 2019
prev sibling next sibling parent reply Jacob Carlborg <doob me.com> writes:
On 2019-01-18 15:29, Mike Parker wrote:
 Not long ago, in my retrospective on the D Blog in 2018, I invited folks 
 to write about their first impressions of D. Ron Tarrant, who you may 
 have seen in the Learn forum, answered the call. The result is the latest 
 post on the blog, the first guest post of 2019. Thanks, Ron!
 
 The blog:
 https://dlang.org/blog/2019/01/18/d-lighted-im-sure/
Regarding Dub: if you only have a project without any dependencies, or perhaps only system dependencies already available on the system, it might not add that much. But as soon as you want to use someone else's D code, it helps tremendously. Dub acts as both a build tool and a package manager. It will automatically download the source code for the dependencies, build them, and handle the import paths. As for JSON files, it's possible to use the alternative format SDL. One extremely valuable feature this has over JSON is that it supports comments.

To address some of the direct questions in the blog post:

"information about how I would go about packaging a D app (with GtkD) for distribution"

When it comes to distributing D applications, there isn't much that is specific to D. Most of the approaches and documentation that apply to any native language apply to D as well. There are two D-specific things (that I can think of for now) worth mentioning:

* When you compile a release build for distribution, use the LDC [1] compiler. It produces better code. You can also add things like LTO (Link Time Optimization) and possibly PGO (Profile Guided Optimization).

* If you have any static assets for your application, like images, sounds, videos, config files or similar, it's possible to embed those directly in the executable using the "import expression" [2] feature. This will read a file, at compile time, into a string literal inside the code.

Some more general things about distribution: I think it's more platform-specific than language-specific. I can only speak for macOS (since that's the main platform I use). There it's expected to have the application distributed as a disk image (DMG). This image would contain an application bundle. An application bundle is a regular directory with the ".app" extension and a specific directory and file structure. Some applications in the OS treat these bundles specially. For example, double-clicking the bundle in the file browser will launch the application. The bundle will contain the actual executable and resources like libraries and assets such as images and audio. In your case, don't expect a Mac user to have GTK installed; bundle it in the application bundle.

Then there's the issue of which versions of the platforms you want to support. For macOS it's possible to specify a minimum deployment target using the "MACOSX_DEPLOYMENT_TARGET" environment variable. This allows you to build the application on the latest version of the OS but still have it work on older versions. On *BSD and Linux it's not that easy, and Linux has the additional axis of distros, which adds another level of issues. The best approach is to compile for each distro and version you want to support, but that's a lot of work. I would provide fully statically linked binaries, including statically linking the C standard library. This way you can provide one binary for all versions and distros of Linux and you know it will always work.

"how to build on one platform for distribution on another (if that's even possible)"

I can say that it's possible, but unless you're targeting a platform that doesn't provide a compiler, like mobile or an embedded platform, I think it's rare to need to cross-compile. I'll tell you why: when building an application that targets multiple platforms, you will need to test it at some point. That means running the application on all the supported platforms, which means you need access to those platforms -- usually a lot in the beginning, when developing the application, and at the end, when doing final verification before a release.

Having said that, I'm all for automating as much as possible. That means automatically running all the tests, building the final release build, and packaging for distribution. For that I recommend hooking up your project to one of the publicly available and free CI services. Travis CI [3] is one of them that supports Linux, macOS and Windows (early release [4]). AppVeyor [5] is an alternative that has much more mature support for Windows.

If you really want to cross-compile, it's possible if you use LDC. DMD can compile for the same platform for either 32 bit or 64 bit, but not for a different platform. I think it's simplest to use Docker. I have two Docker files for containers with LDC set up for cross-compiling to macOS [6] and to Windows [7]. Unfortunately these Docker files pull down the SDKs from someone's (not mine) Dropbox account.

[1] https://github.com/ldc-developers/ldc
[2] https://dlang.org/spec/expression.html#import_expressions
[3] https://travis-ci.com/
[4] https://blog.travis-ci.com/2018-10-11-windows-early-release
[5] https://www.appveyor.com
[6] https://github.com/jacob-carlborg/docker-ldc-darwin/blob/master/Dockerfile
[7] https://github.com/jacob-carlborg/docker-ldc-windows/blob/master/Dockerfile

-- /Jacob Carlborg
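To illustrate the import expression point above: a minimal sketch, assuming a hypothetical asset file assets/banner.txt and a build run with the -J switch (which tells the compiler where string imports may be read from):

```d
// Compile with: dmd -J=assets app.d
// The file's contents become a string literal at compile time,
// so the asset ships inside the executable itself.
immutable string banner = import("banner.txt");

void main()
{
    import std.stdio : write;
    write(banner); // prints the embedded file verbatim
}
```

Without the -J switch the compiler refuses the import expression, which is a safety measure: only directories you explicitly list can be read at compile time.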
Jan 18 2019
parent reply Ron Tarrant <rontarrant gmail.com> writes:
On Friday, 18 January 2019 at 20:30:25 UTC, Jacob Carlborg wrote:

 Regarding Dub.
[stuff deleted]
 [1] https://github.com/ldc-developers/ldc
 [2] https://dlang.org/spec/expression.html#import_expressions
 [3] https://travis-ci.com/
 [4] https://blog.travis-ci.com/2018-10-11-windows-early-release
 [5] https://www.appveyor.com
 [6] 
 https://github.com/jacob-carlborg/docker-ldc-darwin/blob/master/Dockerfile
 [7] 
 https://github.com/jacob-carlborg/docker-ldc-windows/blob/master/Dockerfile
Wow. That's a lot to think about. Thanks, Jacob. Looks like I've got my weekend reading all lined up. :)
Jan 19 2019
parent Jacob Carlborg <doob me.com> writes:
On 2019-01-19 23:13, Ron Tarrant wrote:

 Wow. That's a lot to think about. Thanks, Jacob. Looks like I've got my 
 weekend reading all lined up. :)
:) -- /Jacob Carlborg
Jan 20 2019
prev sibling parent reply Jon Degenhardt <jond noreply.com> writes:
On Friday, 18 January 2019 at 14:29:14 UTC, Mike Parker wrote:
 Not long ago, in my retrospective on the D Blog in 2018, I 
 invited folks to write about their first impressions of D. Ron 
 Tarrant, who you may have seen in the Learn forum, answered the 
 call. The result is the latest post on the blog, the first 
 guest post of 2019. Thanks, Ron!

 As a reminder, I'm still looking for new-user impressions and 
 guest posts on any D-related topic. Please contact me if you're 
 interested. And don't forget, there's a bounty for guest posts, 
 so you can make a bit of extra cash in the process.

 The blog:
 https://dlang.org/blog/2019/01/18/d-lighted-im-sure/

 Reddit:
 https://www.reddit.com/r/programming/comments/ahawhz/dlighted_im_sure_the_first_two_months_with_d/
Nicely done. Very enjoyable, thanks for publishing this! --Jon
Jan 19 2019
parent Ron Tarrant <rontarrant gmail.com> writes:
On Saturday, 19 January 2019 at 20:12:04 UTC, Jon Degenhardt 
wrote:

 Nicely done. Very enjoyable, thanks for publishing this!
Thanks, Jon. Glad you enjoyed it.
Jan 19 2019