
digitalmars.D - Note from a donor

reply Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
A person who donated to the Foundation made a small wish list known. 
Allow me to relay it:

* RSA Digital Signature Validation in Phobos
* std.decimal in Phobos
* better dll support for Windows.


Andrei
Oct 24
next sibling parent reply Mike Parker <aldacron gmail.com> writes:
On Tuesday, 24 October 2017 at 13:20:10 UTC, Andrei Alexandrescu 
wrote:

 * better dll support for Windows.
This one is on a lot of wish lists.
Oct 24
parent reply rikki cattermole <rikki cattermole.co.nz> writes:
On 24/10/2017 2:25 PM, Mike Parker wrote:
 On Tuesday, 24 October 2017 at 13:20:10 UTC, Andrei Alexandrescu wrote:
 
 * better dll support for Windows.
This one is on a lot of wish lists.
It definitely needs to be a target for 2018H1, and I'll be making sure it's added! Too big a blocker, and it comes up a little too often...
Oct 24
parent reply solidstate1991 <laszloszeremi outlook.com> writes:
On Tuesday, 24 October 2017 at 13:37:09 UTC, rikki cattermole 
wrote:
 On 24/10/2017 2:25 PM, Mike Parker wrote:
 On Tuesday, 24 October 2017 at 13:20:10 UTC, Andrei 
 Alexandrescu wrote:
 
 * better dll support for Windows.
This one is on a lot of wish lists.
It definitely needs to be a target for 2018H1, I'll be making sure its added! Too big a blocker and comes up a little too often...
DIP45 has the solution (make export an attribute); from what I heard, it needs to be updated for the new DIP format. It needs to be pushed: Windows is still the most popular OS on the consumer side of things, and with it we could have Phobos and DRuntime as DLLs without using experimental versions of DMD. I have some plans for the better DLL support, such as the possibility of a D-based Python (for better class interoperability with D), or even using a modified subset of D for scripting (e.g. SafeD only).
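As a minimal sketch of what this is about (module and function names are made up), this is roughly what exporting a symbol from a Windows DLL looks like in D today; DIP 45's point is that `export` should compose like other attributes:

```d
// Hypothetical example: a symbol exported from a Windows DLL in D.
module mylib;

// `export` puts add into the DLL's export table. DIP 45 proposes
// treating export as a regular attribute, so it can be applied in
// bulk (`export:`) and handled symmetrically when importing.
export extern (C) int add(int a, int b)
{
    return a + b;
}
```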
Oct 24
parent reply Benjamin Thaut <code benjamin-thaut.de> writes:
On Tuesday, 24 October 2017 at 21:11:38 UTC, solidstate1991 wrote:
 DIP45 has the solution (make export an attribute), it needs to 
 be updated for the new DIP format from what I heard. It needs 
 to be pushed, as Windows still the most popular OS on the 
 consumer side of things, then we can have Phobos and DRuntime 
 as DLLs without using experimental versions of DMD. I have some 
 plans with the better DLL support, such as the possibility of a 
 D based Python (for better class interoperability with D), or 
 even using a modified set of D for scripting (eg. SafeD only).
Unfortunately I currently don't have a lot of spare time to spend on open source projects. I will however have some more time in December. My current plan is to revive DIP 45 and my dll implementation and give it some finishing touches, as discussed with Walter at DConf 2017.
Oct 29
parent Walter Bright <newshound2 digitalmars.com> writes:
On 10/29/2017 1:03 PM, Benjamin Thaut wrote:
 Unfortunately I currenlty don't have a lot of spare time to spend on open
source 
 projets. I will however have some more time in December. My current plan is to 
 revive DIP 45 and my dll implementation and give it some finishing touches as 
 discussed with Walter at DConf 2017.
Looking forward to it!
Oct 29
prev sibling next sibling parent Dejan Lekic <dejan.lekic gmail.com> writes:
On Tuesday, 24 October 2017 at 13:20:10 UTC, Andrei Alexandrescu 
wrote:
 A person who donated to the Foundation made a small wish list 
 known. Allow me to relay it:

 * RSA Digital Signature Validation in Phobos
 * std.decimal in Phobos
 * better dll support for Windows.


 Andrei
First two are in my wish list too!
Oct 24
prev sibling next sibling parent reply Kagamin <spam here.lot> writes:
On Tuesday, 24 October 2017 at 13:20:10 UTC, Andrei Alexandrescu 
wrote:
 * RSA Digital Signature Validation in Phobos
See https://issues.dlang.org/show_bug.cgi?id=16510: the blocker for botan was OMF support.
Oct 24
parent reply Adam Wilson <flyboynw gmail.com> writes:
On 10/24/17 07:14, Kagamin wrote:
 On Tuesday, 24 October 2017 at 13:20:10 UTC, Andrei Alexandrescu wrote:
 * RSA Digital Signature Validation in Phobos
https://issues.dlang.org/show_bug.cgi?id=16510 the blocker for botan was OMF support.
IMO, the correct solution here is to deprecate OMF and use the system linker for 32-bit on Windows, as that is already the default behavior on 64-bit Windows. So instead of -m32 and -m32mscoff, we would have -m32 and -m32omf. I think that this is a reasonable tradeoff. We could leave -m32mscoff in for a while, for backwards compat.

--
Adam Wilson
IRC: LightBender
import quiet.dlang.dev;
Oct 24
parent reply Andre Pany <andre s-e-a-p.de> writes:
On Tuesday, 24 October 2017 at 20:27:26 UTC, Adam Wilson wrote:
 On 10/24/17 07:14, Kagamin wrote:
 On Tuesday, 24 October 2017 at 13:20:10 UTC, Andrei 
 Alexandrescu wrote:
 * RSA Digital Signature Validation in Phobos
https://issues.dlang.org/show_bug.cgi?id=16510 the blocker for botan was OMF support.
IMO, the correct solution here is to deprecate OMF and use the System linker for 32-bit on Windows as that is already the default behavior on 64-bit Windows So instead of -m32 and -m32mscoff, we would have -m32 and -m32omf. I think that this is a reasonable tradeoff. We could leave -m32mscoff in for a while, for backwards compat.
In general I agree with you that COFF is the way to go. I just dislike the consequences. Today you just download the DMD Windows zip, extract it, and you have a running compiler. Nice, self-contained, and a good advertisement for D. On the other side, if the user is forced to install Visual Studio / the C++ build tools, this might put off first-time D users...

If the Microsoft linker could be added to DMD, that would be the best solution. Just dreaming :)

Kind regards,
Andre
Oct 24
parent reply jmh530 <john.michael.hall gmail.com> writes:
On Tuesday, 24 October 2017 at 21:11:37 UTC, Andre Pany wrote:
 In general I agree with you that coff is the way to go. I just 
 dislike the consequences. Today you just download the dmd 
 Windows zip, extract it and you have a running compiler. Nice, 
 self contained and a good advertisement for D. On the other 
 side if the user is forced to install Visual Studio / C++ build 
 pack, this might distract first time D users...

 If the Microsoft linker could be added to dmd that would be the 
 best solution. Just dreaming :)

 Kind regards
 Andre
I'm sympathetic to your point. I think there was/is some effort to allow LLD (the LLVM linker) as a replacement for the MSVC linker in LDC. Perhaps if LLD could be a drop-in for MSVC in DMD for Windows, then eventually it could be the default? Not sure whether that's possible.
Oct 24
next sibling parent reply bitwise <bitwise.pvt gmail.com> writes:
On Tuesday, 24 October 2017 at 22:19:59 UTC, jmh530 wrote:
 On Tuesday, 24 October 2017 at 21:11:37 UTC, Andre Pany wrote:
 [...]
I'm sympathetic to your point. I think there was/is some effort to allow LLD (the LLVM linker) as a replacement for the MSVC linker in LDC. Perhaps if LLD could be a drop-in for MSVC in DMD for Windows, then eventually it could be the default? Not sure that's possible or not.
VC++ command line tools seem to be available on their own: http://landinghub.visualstudio.com/visual-cpp-build-tools
Oct 25
parent reply Mike Parker <aldacron gmail.com> writes:
On Wednesday, 25 October 2017 at 15:00:04 UTC, bitwise wrote:

 VC++ command line tools seem to be available on their own:

 http://landinghub.visualstudio.com/visual-cpp-build-tools
Still a big download and requires the Windows SDK to be downloaded and installed separately.
Oct 25
parent reply Adam Wilson <flyboynw gmail.com> writes:
On 10/25/17 09:34, Mike Parker wrote:
 On Wednesday, 25 October 2017 at 15:00:04 UTC, bitwise wrote:

 VC++ command line tools seem to be available on their own:

 http://landinghub.visualstudio.com/visual-cpp-build-tools
Still a big download and requires the Windows SDK to be downloaded and installed separately.
Speaking from very long experience, 95%+ of Windows devs have VS+WinSDK installed as part of their default system buildout. The few that don't will have little trouble understanding why they need it and acquiring it.

This is one of those breathless "the sky is falling" arguments we hear on these forums sometimes. Usually from Linux devs who are so used to having the GCC tools on every machine that they automatically assume that because Windows doesn't have them by default, getting them will be some insurmountable burden. TBH, the attitudes around here towards Windows devs can be more than a little snobbish.

In reality, it is quite easy to find a Linux distro that doesn't have GCC by default, container distros for example. So the snobby attitude is really quite unfounded.

--
Adam Wilson
IRC: LightBender
import quiet.dlang.dev;
Oct 25
next sibling parent codephantom <me noyb.com> writes:
On Wednesday, 25 October 2017 at 22:36:32 UTC, Adam Wilson wrote:
 TBH, the attitudes around here towards Windows devs can be more 
 than a little snobbish.
Towards the Windows devs themselves, or the innate tendency (that unfortunately comes with using Windows) to blindly use proprietary tool chains (and all the accompanying rubbish that gets imposed along with it)? I tend to think it's the latter. I don't think it's personal (and it better not be!) And such philosophical snobbery (from either end of the spectrum) will not disappear anytime soon... Move on.... find the middle ground...get rid of Windows and Linux.. ...and use FreeBSD ;-)
Oct 25
prev sibling parent reply Jacob Carlborg <doob me.com> writes:
On 2017-10-26 00:36, Adam Wilson wrote:

 Speaking from very long experience, 95%+ of Windows devs have VS+WinSDK 
 installed as part of their default system buildout. The few that don't 
 will have little trouble understanding why they need it and acquiring it.
IIRC, there have been people on these forums asking why they need to download additional software when they already have the compiler. Same on macOS.

--
/Jacob Carlborg
Oct 26
parent reply Adam Wilson <flyboynw gmail.com> writes:
On 10/26/17 00:32, Jacob Carlborg wrote:
 On 2017-10-26 00:36, Adam Wilson wrote:

 Speaking from very long experience, 95%+ of Windows devs have
 VS+WinSDK installed as part of their default system buildout. The few
 that don't will have little trouble understanding why they need it and
 acquiring it.
IIRC, there have been people on these forums that have been asking why they need to download additional software when they already have the compiler. Same on macOS.
How many though? Also, we have to do it for macOS; why is Windows special? The macOS setup was just as hard: download two large packages (Xcode + command-line tools), install, and done.

--
Adam Wilson
IRC: LightBender
import quiet.dlang.dev;
Oct 26
next sibling parent reply Mike Parker <aldacron gmail.com> writes:
On Thursday, 26 October 2017 at 10:16:27 UTC, Adam Wilson wrote:
 On 10/26/17 00:32, Jacob Carlborg wrote:
 IIRC, there have been people on these forums that have been 
 asking why
 they need to download additional software when they already 
 have the
 compiler.

 Same on macOS.
How many though? Also, we have to do it for macOS, why is Windows special? The macOS setup was just as hard. Download two large packages (XCode+Cmd tools), install, and done.
There definitely has been an uptick in that sort of complaint. The question should be, how many aren't coming here to complain? My initial internal reaction has always been, "just download and install -- how hard is it?".

But one day I stopped and asked myself, what if I were coming to D today? I got by just fine for years without having VS installed. Once the 6.0 days were behind me, I neither needed nor wanted VS. I was content with mingw for my C stuff. When I first came to D, I came with the full knowledge that there was no ecosystem, things were rough, and I'd have to do a lot by hand. I stuck around because the language was worth it.

If I came in today and saw that I needed to install VS just to get 64-bit binaries, I doubt I'd stick around long enough to discover how great the language is. I also didn't like that I had to install the Xcode tools on my Mac, but that's needed for any development on Mac from what I can see.
Oct 26
next sibling parent reply Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
On 10/26/17 7:09 AM, Mike Parker wrote:
 On Thursday, 26 October 2017 at 10:16:27 UTC, Adam Wilson wrote:
 On 10/26/17 00:32, Jacob Carlborg wrote:
 IIRC, there have been people on these forums that have been asking why
 they need to download additional software when they already have the
 compiler.

 Same on macOS.
How many though? Also, we have to do it for macOS, why is Windows special? The macOS setup was just as hard. Download two large packages (XCode+Cmd tools), install, and done.
There definitely has been an uptick in that sort of complaint. The question should be, how many aren't coming here to complain? My initial internal reaction has always been, "just download and install -- how hard is it?". But one day I stopped and asked myself, what if I were coming to D today? I got by just fine for years without having VS installed. Once the 6.0 days were behind me, I neither needed nor wanted VS. I was content with mingw for my C stuff. When I first came to D, I came with the full knowledge that there was no ecosystem, things were rough, and I'd have to do a lot by hand. I stuck around because the language was worth it. If I came in today and saw that I needed to install VS just to get 64-bit binaries, I doubt I'd stick around long enough to discover how great the language is. I also didn't like that I had to install the Xcode tools on my Mac, but that's needed for any development on Mac from what I can see.
A wizard-style installation with links to things and a good flow might help a lot here. Is that possible? -- Andrei
Oct 26
next sibling parent reply jmh530 <john.michael.hall gmail.com> writes:
On Thursday, 26 October 2017 at 11:32:26 UTC, Andrei Alexandrescu 
wrote:
 A wizard-style installation with links to things and a good 
 flow might help a lot here. Is that possible? -- Andrei
The DMD installer is already a wizard on Windows. First it checks if you have a current version of D and will uninstall that; then it checks if you want to install D2 along with some extras (Visual D, DMC, D1), and it goes through additional steps to install the extras if you select them.

However, if you need Visual Studio installed, then that takes like half an hour. My recollection is that it's a little tricky if you upgrade to a new version of VS; I usually just uninstall D and reinstall it rather than deal with that. I would have to uninstall MSVC to figure out how annoying it would be to install without one (and that's a bit of a hassle). I can't remember at what point it checks for MSVC, maybe before installing Visual D?

One thing that might slightly simplify things: if no MSVC installation is found, offer the smallest free MSVC as an optional component, with a note like "Required for 64-bit binaries".
Oct 26
next sibling parent reply Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
On 10/26/2017 08:36 AM, jmh530 wrote:
 On Thursday, 26 October 2017 at 11:32:26 UTC, Andrei Alexandrescu wrote:
 A wizard-style installation with links to things and a good flow might 
 help a lot here. Is that possible? -- Andrei
The DMD installer is already a Wizard on Windows. First it checks if you have a current version of D and will uninstall that, then it checks if you want to install D2 along with some extras (Visual D, DMC, D1), and it goes through additional steps to install the extras if you select them. However, if you need Visual Studio installed, then that takes like a half an hour. My recollection is that it's a little tricky if you upgrade to a new version of VS. I usually just uninstall D and reinstall it rather than deal with that. I would have to uninstall MSVC to figure out how annoying it would be to install without one (and that's a bit of a hassle). I can't remember at what point it checks for MSVC, maybe before installing Visual D? One thing that might slightly simplify things is if the smallest free MSVC you can install is provided as optional if no MSVC installation is found. Some note can be added like "Required for 64bit binaries".
I am preparing a request for Microsoft to allow us to redistribute some of their binaries. Of course we want to do that only if deemed necessary (i.e. they are not easily available from their site, etc.). Any help building an exact list of prerequisites would be great. Thanks! -- Andrei
Oct 26
next sibling parent reply Andre Pany <andre s-e-a-p.de> writes:
On Thursday, 26 October 2017 at 15:50:07 UTC, Andrei Alexandrescu 
wrote:
 On 10/26/2017 08:36 AM, jmh530 wrote:
 [...]
I am preparing a request for Microsoft to allow us to redistribute some of their binaries. Of course we want to do that only if deemed necessary (they are not available easily from their site etc). Any help building an exact list of requisites would be great. Thanks! -- Andrei
Thanks a lot Andrei. At my workplace I advocate the use of D. The major competitors are Python, Node.js, Java, and Go. Everything that makes D easier to install and use makes it easier to advocate for D.

Kind regards,
Andre
Oct 26
parent reply Mike Parker <aldacron gmail.com> writes:
On Thursday, 26 October 2017 at 16:27:13 UTC, Andre Pany wrote:
 On Thursday, 26 October 2017 at 15:50:07 UTC, Andrei 
 Alexandrescu wrote:
 On 10/26/2017 08:36 AM, jmh530 wrote:
 [...]
I am preparing a request for Microsoft to allow us to redistribute some of their binaries. Of course we want to do that only if deemed necessary (they are not available easily from their site etc). Any help building an exact list of requisites would be great. Thanks! -- Andrei
Thanks a lot Andrei. At my working environment I advertise the use of DLang. Major competitors are python, nodejs, java and go. Everything which makes the installation and use of DLang more easy makes it easier to advertise D.
That's exactly the kind of developer background I'm thinking of. Getting permission to redistribute from MS would be the ideal solution. If not, I'm sure someone will find a way to make it work with the LLVM or MinGW tools eventually.
Oct 26
parent reply MrSmith <mrsmith33 yandex.ru> writes:
On Thursday, 26 October 2017 at 17:02:40 UTC, Mike Parker wrote:
 That's exactly the kind of developer background I'm thinking 
 of. Getting permission to redistribute from MS would be the 
 ideal solution. If not, I'm sure someone will find a way to 
 make it work with the LLVM or MinGW tools eventually.
Would it be possible to create import libs for all the winapi/CRT libs and redistribute them? Would such libs be legal to redistribute? We have the tools (DMD/LLD), but the dependency on the WinSDK and VS libs is still there, unfortunately.
Oct 26
next sibling parent rikki cattermole <rikki cattermole.co.nz> writes:
On 26/10/2017 10:38 PM, MrSmith wrote:
 On Thursday, 26 October 2017 at 17:02:40 UTC, Mike Parker wrote:
 That's exactly the kind of developer background I'm thinking of. 
 Getting permission to redistribute from MS would be the ideal 
 solution. If not, I'm sure someone will find a way to make it work 
 with the LLVM or MinGW tools eventually.
Would it be possible to create import libs that for all winapi/crt libs, and redistribute them? Will such libs be legal to redist? We have the tools (DMD/LLD), but the dependency on winsdk and VS libs is still there, unfortunatelly.
Those files should be included with the request to Microsoft.
Oct 26
prev sibling parent reply Kagamin <spam here.lot> writes:
On Thursday, 26 October 2017 at 21:38:25 UTC, MrSmith wrote:
 Would it be possible to create import libs that for all 
 winapi/crt libs, and redistribute them? Will such libs be legal 
 to redist?
 We have the tools (DMD/LLD), but the dependency on winsdk and 
 VS libs is still there, unfortunatelly.
MinGW compiles import libraries from text .def files that are lists of exported symbols: https://sourceforge.net/p/mingw-w64/mingw-w64/ci/master/tree/mingw-w64-crt/lib64/
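For illustration, such a .def file is just a plain-text export list; the fragment below is hypothetical (the real, complete lists live in the mingw-w64 tree linked above):

```
; kernel32.def: hypothetical fragment of an export list
LIBRARY "KERNEL32.dll"
EXPORTS
GetLastError
Sleep
CloseHandle
```

binutils' dlltool can then build an import library from it, e.g. `dlltool -d kernel32.def -l kernel32.lib`, without touching any Microsoft binaries.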
Oct 27
parent reply MrSmith <mrsmith33 yandex.ru> writes:
On Friday, 27 October 2017 at 09:56:25 UTC, Kagamin wrote:
 MinGW compiles import libraries from text .def files that are 
 lists of exported symbols: 
 https://sourceforge.net/p/mingw-w64/mingw-w64/ci/master/tree/mingw-w64-crt/lib64/
I will test dmd + lld, using .def files instead of .lib files.
Oct 27
parent reply Kagamin <spam here.lot> writes:
On Friday, 27 October 2017 at 14:20:04 UTC, MrSmith wrote:
 On Friday, 27 October 2017 at 09:56:25 UTC, Kagamin wrote:
 MinGW compiles import libraries from text .def files that are 
 lists of exported symbols: 
 https://sourceforge.net/p/mingw-w64/mingw-w64/ci/master/tree/mingw-w64-crt/lib64/
I will test dmd + lld + use .def files instead of .lib files
With this the only missing piece will be the C startup code (mainCRTStartup in crtexe.c), though not sure where it's compiled.
Oct 27
next sibling parent Kagamin <spam here.lot> writes:
On Friday, 27 October 2017 at 16:05:10 UTC, Kagamin wrote:
 (mainCRTStartup in crtexe.c)
Or crt0.c
Oct 27
prev sibling parent reply MrSmith <mrsmith33 yandex.ru> writes:
On Friday, 27 October 2017 at 16:05:10 UTC, Kagamin wrote:
 With this the only missing piece will be the C startup code 
 (mainCRTStartup in crtexe.c), though not sure where it's 
 compiled.
How do I get lld-link to link .obj files? Clang itself emits .o files, and those link successfully. For .obj files:
./lld-link test.obj
error: test.obj: The file was not recognized as a valid object file
Oct 28
parent MrSmith <mrsmith33 yandex.ru> writes:
On Saturday, 28 October 2017 at 09:20:40 UTC, MrSmith wrote:
 error: test.obj: The file was not recognized as a valid object 
 file
Ah, forgot to pass -m64 to dmd
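For anyone hitting the same error, the working sequence reconstructed from this exchange (paths and the exact link flags will vary) is roughly:

```
# default 32-bit dmd emits OMF objects, which lld-link rejects
dmd -c test.d
./lld-link test.obj        # error: not recognized as a valid object file

# with -m64 (or -m32mscoff) dmd emits COFF, which lld-link accepts
dmd -m64 -c test.d
./lld-link test.obj        # now recognized; runtime libs still needed for a full link
```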
Oct 28
prev sibling parent reply jmh530 <john.michael.hall gmail.com> writes:
On Thursday, 26 October 2017 at 15:50:07 UTC, Andrei Alexandrescu 
wrote:
 I am preparing a request for Microsoft to allow us to 
 redistribute some of their binaries. Of course we want to do 
 that only if deemed necessary (they are not available easily 
 from their site etc). Any help building an exact list of 
 requisites would be great. Thanks! -- Andrei
Cool. The Installing DMD wiki (https://wiki.dlang.org/Installing_DMD) recommends VS 2015 Community for Windows 7/8 users or VS 2017 Community for Windows 10 users. I have Visual Studio Community 2017 installed, available from one of the links here: https://www.visualstudio.com/downloads/

One little trick in the wiki: it says you can uncheck boxes to reduce the size/time of the download. It might be helpful to know the minimum required to get D working and the minimum required to get Visual D working.

It also lists an alternative: install the Microsoft build tools and an appropriate version of the Windows SDK. LDC also refers to the Visual C++ build tools, but does not reference the SDK. I think the VS build tools replace the Visual C++ tools. The VS build tools look like they have an option to install the SDK and other stuff too.
Oct 26
parent reply Mike Parker <aldacron gmail.com> writes:
On Thursday, 26 October 2017 at 16:34:54 UTC, jmh530 wrote:
 One little trick in the wiki is that it says that you can 
 uncheck boxes to reduce the size/time of the download. It might 
 be helpful to know the minimum required to get D working and 
 the minimum required to get Visual D working.
The Visual C++ install is all that's needed. I'll update the wiki tomorrow after I verify the options on my desktop.
 It also lists an alternative to install the Microsoft build 
 tools and an appropriate version of the Windows SDK. LDC also 
 refers to the Visual C++ build tools, but does not reference 
 the SDK. I think the VS build tools replace the Visual C++ 
 tools. The VS build tools look like they have an option to 
 install the SDK and other stuff too.
I have the 2015 tools via the VS 2017 installer (it's an option), but I can't recall ever running the build tools installer directly. If anyone can verify and update the wiki, that would be great.
Oct 26
parent jmh530 <john.michael.hall gmail.com> writes:
On Thursday, 26 October 2017 at 17:10:55 UTC, Mike Parker wrote:

 The Visual C++ install is all that it's needed. I'll update the 
 wiki tomorrow after I verify the options on my desktop.
You know, that wiki is pretty informative, but I don't see a reference or link to it anywhere here: https://dlang.org/download.html. When you click on "more information", it takes you to more information about the compilers; then you have to click on DMD, and then there's a link to Installing DMD. It might make sense (by that I mean, fewer clicks) to have a single installation instructions page for each compiler, with links to them near the downloads.
Oct 26
prev sibling next sibling parent reply Bo <Bo bo.com> writes:
On Thursday, 26 October 2017 at 12:36:40 UTC, jmh530 wrote:
 However, if you need Visual Studio installed, then that takes 
 like a half an hour.
And a gig of space, just because D needs a small part of it. That is why people do not want to install VS. Why install a competing language's IDE when you are installing D?
Oct 26
parent reply Adam Wilson <flyboynw gmail.com> writes:
On 10/26/17 08:51, Bo wrote:
 On Thursday, 26 October 2017 at 12:36:40 UTC, jmh530 wrote:
 However, if you need Visual Studio installed, then that takes like a
 half an hour.
And a gig of space, just because D needs a small part of it. That is why people do not want to install VS. Why install a competing language studio, when you are installing D.
The Xcode installer DMG is 5GB, before unpacking. And unlike VS17, I can't pick and choose. :)

--
Adam Wilson
IRC: LightBender
import quiet.dlang.dev;
Oct 26
next sibling parent reply codephantom <me noyb.com> writes:
On Thursday, 26 October 2017 at 20:44:49 UTC, Adam Wilson wrote:
 The XCode installer DMG is 5GB, before unpacking. And unlike 
 VS17, I can't pick and choose. :)
(trying to install the VS2017 build tools on Win7 SP1...)

vs_BuildTools.exe --layout c:\btoffline
(error: requires .NET Framework 4.6 or higher!!)

(ok, download the web installer for .NET 4.6... 50MB or so, about 5 minutes later)
(done... and installed)
(let's try again)

start: 12:04pm
vs_BuildTools.exe --layout c:\btoffline
..Give us a minute.. we'll be done soon.. (yeah right!!)
total packages to download... 1897. what the f##TG$!
12:08pm: layout progress.. 0.19%

forget it! I'll go use FreeBSD instead.
pkg install ldc
done.. ready to go.. start coding!

VS is the most bloated piece of crap that's ever come out of Microsoft! Why encourage/force D developers to use it? You also need to update your certificates just to install the components, and there are a few other hoops you have to jump through as well....
Oct 26
parent reply Jonathan M Davis <newsgroup.d jmdavisprog.com> writes:
On Friday, October 27, 2017 01:12:53 codephantom via Digitalmars-d wrote:
 VS is the most bloated piece of crap that's ever come out of
 Microsoft!
 Why encourage/force D developers to use it?
The problem is that to reasonably interact with the rest of the Windows C/C++ ecosystem, you're pretty much stuck using Microsoft's linker. If we can get that without pulling in all of VS, all the better, but without the linker, we can't link with most existing C/C++ code, which is a big problem. Before we could use MS' linker, we had complaints for years about not being compatible with other C/C++ stuff on Windows.

If we can make it work by using another linker and have it be compatible with stuff generated by MS' compiler (e.g. if LLVM's linker could be used in that case), then for many of us, that would definitely be superior to having to deal with VS, but for the moment at least, using VS seems to be the only real option if you want to interact with any existing C/C++ libraries or build for 64-bit (since OPTLINK has never been updated for 64-bit).

Now, if you're just using your own code and/or loading dlls at runtime and/or can reasonably build all C/C++ stuff you need with dmc _and_ you don't need 64-bit on Windows, then there's no reason to pull in VS, and it's nice that you don't need to. But for serious Windows projects, there's a good chance that you're going to need MS' linker, much as that sucks, and MS seems to want you to pull in VS to get it. MS simply has not set things up in a way that makes it reasonable to avoid VS if you want to link with C/C++ libraries - especially since VS is all most C/C++ projects on Windows target at this point.

- Jonathan M Davis
Oct 26
next sibling parent reply evilrat <evilrat666 gmail.com> writes:
On Friday, 27 October 2017 at 01:40:07 UTC, Jonathan M Davis 
wrote:
 On Friday, October 27, 2017 01:12:53 codephantom via 
 Digitalmars-d wrote:
 VS is the most bloated piece of crap that's ever come out of
 Microsoft!
 Why encourage/force D developers to use it?
The problem is that to reasonably interact with the rest of the Windows C/C++ ecosystem, you're pretty much stuck using Microsoft's linker. If we can get that without pulling in all of VS, all the better, but without the linker, we can't link with most existing C/C++ code, which is a big problem. Before we could use MS' linker, we had complaints for years about not being compatible with other C/C++ stuff on Windows. MS simply has not set things up in a way that makes it reasonable to avoid VS if you want to link with C/C++ libraries - especially since VS is all most C/C++ projects on Windows target at this point. - Jonathan M Davis
I'm not sure about WinSDK 10, but previous versions have all the libs and tools necessary (linker!) and are a much smaller download (500 MB or so). IIRC the problem is that the DMD installer won't pick up the SDK install path, and most newcomers have neither the knowledge of sc.ini nor the desire to mess with it.
Oct 26
parent reply Jonathan M Davis <newsgroup.d jmdavisprog.com> writes:
On Friday, October 27, 2017 02:00:53 evilrat via Digitalmars-d wrote:
 On Friday, 27 October 2017 at 01:40:07 UTC, Jonathan M Davis

 wrote:
 On Friday, October 27, 2017 01:12:53 codephantom via

 Digitalmars-d wrote:
 VS is the most bloated piece of crap that's ever come out of
 Microsoft!
 Why encourage/force D developers to use it?
The problem is that to reasonably interact with the rest of the Windows C/C++ ecosystem, you're pretty much stuck using Microsoft's linker. If we can get that without pulling in all of VS, all the better, but without the linker, we can't link with most existing C/C++ code, which is a big problem. Before we could use MS' linker, we had complaints for years about not being compatible with other C/C++ stuff on Windows. MS simply has not set things up in a way that makes it reasonable to avoid VS if you want to link with C/C++ libraries - especially since VS is all most C/C++ projects on Windows target at this point. - Jonathan M Davis
I'm not sure about WinSDK 10, but previous versions has all the libs and tools necessary(linker!) and is much smaller download(500 MB or so) IIRC the problem is that DMD installer won't pick up SDK install path, and most newcomers neither has the knowledge of sc.ini nor the desire to mess with it.
Well, if it's possible to use an SDK instead of VS, then ideally we'd support that with the installer rather than requiring VS to be there, but obviously someone will have to do the work to improve the installer. Personally, I'm really not a Windows dev, though I've had to use Visual Studio for work often enough, so my understanding of what other SDKs might exist from Microsoft is quite poor. I've only ever used Windows for development when I've had to. - Jonathan M Davis
Oct 26
parent Jeremy DeHaan <dehaan.jeremiah gmail.com> writes:
On Friday, 27 October 2017 at 02:20:54 UTC, Jonathan M Davis 
wrote:
 Well, if it's possible to use an SDK instead of VS, then 
 ideally, we'd support that with the installer rather than 
 requiring that VS be there, but obviously, someone will have to 
 do the work to improve the installer.

 Personally, I'm really not a Windows dev, though I've had to 
 use Visual Studio for work often enough, so my understanding of 
 what other SDKs might exist from Microsoft is quite poor. I've 
 only ever used Windows for development when I've had to.

 - Jonathan M Davis
A while back I played with the idea of a VS replacement for D and I made some progress using this: http://www.smorgasbordet.com/pellesc/ I managed to swap out the VS linker and use the included linker to build a 64-bit D binary, but that was as far as I got at the time. It's not open source, but the terms of use are pretty permissive. Maybe it's worth taking another look.
Oct 26
prev sibling parent reply Kagamin <spam here.lot> writes:
On Friday, 27 October 2017 at 01:40:07 UTC, Jonathan M Davis 
wrote:
 The problem is that to reasonably interact with the rest of the 
 Windows C/C++ ecosystem, you're pretty much stuck using 
 Microsoft's linker. If we can get that without pulling in all 
 of VS, all the better, but without the linker, we can't link 
 with most existing C/C++ code, which is a big problem.
How so? curl is an import library for libcurl.dll, mingw handles import libraries just fine, same for zlib and wxWidgets. But most of the time you only need to link phobos and some code from dub and that's all.
Oct 27
parent Jonathan M Davis <newsgroup.d jmdavisprog.com> writes:
On Friday, October 27, 2017 09:46:21 Kagamin via Digitalmars-d wrote:
 On Friday, 27 October 2017 at 01:40:07 UTC, Jonathan M Davis

 wrote:
 The problem is that to reasonably interact with the rest of the
 Windows C/C++ ecosystem, you're pretty much stuck using
 Microsoft's linker. If we can get that without pulling in all
 of VS, all the better, but without the linker, we can't link
 with most existing C/C++ code, which is a big problem.
How so? curl is an import library for libcurl.dll, mingw handles import libraries just fine, same for zlib and wxWidgets. But most of the time you only need to link phobos and some code from dub and that's all.
I don't know anything about import libraries, but we need to be able to link against any C/C++ libraries that someone builds with VS. If mingw can do that, then it could be an option, though a lot more Windows devs are going to have VS than mingw, and it's my understanding that you have to pull in a fair bit of stuff for mingw (though presumably, it's not as big as VS), so I don't know how much of an improvement that would be. But the key thing here is that it needs to be easy and straightforward to link against the C/C++ libraries that are generally available for Windows and which folks are writing for their own projects, and that means being compatible with Microsoft and its linker. - Jonathan M Davis
Oct 27
prev sibling next sibling parent codephantom <me noyb.com> writes:
On Thursday, 26 October 2017 at 20:44:49 UTC, Adam Wilson wrote:
 The XCode installer DMG is 5GB, before unpacking. And unlike 
 VS17, I can't pick and choose. :)
14 minutes later... 12:18pm layout progress..8.16%
Oct 26
prev sibling next sibling parent codephantom <me noyb.com> writes:
On Thursday, 26 October 2017 at 20:44:49 UTC, Adam Wilson wrote:
 The XCode installer DMG is 5GB, before unpacking. And unlike 
 VS17, I can't pick and choose. :)
45 minutes later... 12:49pm layout progress..29%
Oct 26
prev sibling next sibling parent codephantom <me noyb.com> writes:
On Thursday, 26 October 2017 at 20:44:49 UTC, Adam Wilson wrote:
 The XCode installer DMG is 5GB, before unpacking. And unlike 
 VS17, I can't pick and choose. :)
1 hour, 7 minutes later... 1:11pm layout progress..45.43% (2.55GB downloaded...so far) I'll go get some lunch and come back to it...
Oct 26
prev sibling next sibling parent reply codephantom <me noyb.com> writes:
On Thursday, 26 October 2017 at 20:44:49 UTC, Adam Wilson wrote:
 The XCode installer DMG is 5GB, before unpacking. And unlike 
 VS17, I can't pick and choose. :)
here is an update... (objective: write some code in D, and build a 64bit .exe)

started downloading the offline install of the vs2017 buildtools at: 12:04pm
finished downloading them at: 3:25pm (actual time it finished, not the time I came back to it).

so... 3 hours and 20 min later... I have an offline installation of build tools: 6.83GB, 3791 files, 1919 folders.

let's try installing those buildtools. 3:55pm (when I got back to the pc):

 vs_BuildTools.exe

(setup window pops up... says "Give us a minute..we'll be done soon."... but then disappears a few seconds later)

what's going on... try again.. same thing.. try again.. nope...

turns out I had disabled my internet connection, and setup needs to use it for some reason, or the setup window just disappears without telling the user anything. (I expect setup needs the internet to verify certificates or something - I think you can install them manually too, in offline mode)... but gee... I wish setup had told me something.. anything!!

ok... so enabled internet connection, and try vs_BuildTools.exe again...

 vs_BuildTools.exe  (ok... we're off.. installing 379MB)

..so what is the other 6.5GB needed for then?

4:03pm - build tools installed. pc needs to restart....

booted up.. ready to go.. let's try installing dmd again..

error: For 64-bit support MSVC and Windows SDK are needed but no compatible versions were found. Do you want to install VS2013?

No. I do not!

Let's continue installing DMD anyway.... oh, another message pops up... Could not detect Visual Studio (2008-2017 are supported). No 64-bit support.

That's it! I've had enough! 4 hours wasted!
Oct 26
next sibling parent reply Bo <Bo bo.com> writes:
On Friday, 27 October 2017 at 05:20:05 UTC, codephantom wrote:
 That's it!

 I've had enough!

 4 hours wasted!
Please try getting some editors going for D on Windows like Visual Studio Code or Atom. That time wasted will skyrocket even more when you run into one of the many issues.

Linux installation is not much better. Brew ... took an hour to install but only had dmd, not dub, for some reason.

The install script on the website:

curl -fsS https://dlang.org/install.sh | bash -s dmd

Well, that forces people to use:

source ~/dlang/dmd-2.076.1/activate

each time they want to work with D. And it does not play nice with WSL, because it never gets loaded, so trying to access dmd from outside the WSL does not work.

Download the Ubuntu/Debian deb file ... well, you better have Google near you. How hard is it to have "sudo dpkg -i DEB_PACKAGE" as an instruction clearly on the website instead of only the deb file link... :) Or write in a clear way simply:

wget http://downloads.dlang.org/releases/2.x/2.076.1/dmd_2.076.1-0_amd64.deb
sudo dpkg -i dmd_2.076.1-0_amd64.deb
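The wget/dpkg pair above follows one fixed URL pattern, so the "clear instructions" could even be generated for any release. A hedged sketch - the `dmd_deb_url` helper is hypothetical, not something dlang.org ships; only the URL pattern is taken from the example above:

```shell
# Hypothetical helper: compose the release .deb URL for a given DMD version.
# Only the URL pattern comes from the example above; the function is made up.
dmd_deb_url() {
    ver="$1"    # e.g. 2.076.1
    echo "http://downloads.dlang.org/releases/2.x/${ver}/dmd_${ver}-0_amd64.deb"
}

# The two-line install instruction then becomes:
#   wget "$(dmd_deb_url 2.076.1)"
#   sudo dpkg -i dmd_2.076.1-0_amd64.deb
```
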
Oct 27
parent Daniel Kozak <kozzi11 gmail.com> writes:
On my Arch Linux, it is really easy to install D.

pacaur -Sy dlang // install dmd, ldc, and some other d tools
pacaur -Sy visual-studio-code // install visual studio code, then I just
install plugins from vscode

So I can't see how this could be easier

On Fri, Oct 27, 2017 at 11:09 AM, Bo via Digitalmars-d <
digitalmars-d puremagic.com> wrote:

 On Friday, 27 October 2017 at 05:20:05 UTC, codephantom wrote:

 That's it!

 I've had enough!

 4 hours wasted!
 Please try getting some editors going for D on Windows like Visual Studio Code or Atom. That time wasted will skyrocket even more when you run into one of the many issues. Linux installation is not much better. Brew ... took an hour to install but only had dmd, not dub, for some reason. The install script on the website: curl -fsS https://dlang.org/install.sh | bash -s dmd Well, that forces people to use: source ~/dlang/dmd-2.076.1/activate each time they want to work with D. And it does not play nice with WSL, because it never gets loaded, so trying to access dmd from outside the WSL does not work. Download the Ubuntu/Debian deb file ... well, you better have Google near you. How hard is it to have "sudo dpkg -i DEB_PACKAGE" as an instruction clearly on the website instead of only the deb file link... :) Or write in a clear way simply: wget http://downloads.dlang.org/releases/2.x/2.076.1/dmd_2.076.1-0_amd64.deb sudo dpkg -i dmd_2.076.1-0_amd64.deb
Oct 27
prev sibling parent reply codephantom <me noyb.com> writes:
On Friday, 27 October 2017 at 05:20:05 UTC, codephantom wrote:
 That's it!
 I've had enough!
 4 hours wasted!
ok... I must have done something wrong.. If I create an offline installation of the VS2017 buildtools, and then install the default 'Visual C++ build tools' selection, and then after that's installed, I install dmd (dmd-2.076.1.exe), then I can straight away compile D code with the -m64 option. But still, I started testing this whole process at 12:04pm today. It's now 10:23PM. All I can say is thank god I used FreeBSD ;-)
 pkg install ldc
(a few seconds later, I can start compiling 64bit D code).
Oct 27
parent reply Mengu <mengukagan gmail.com> writes:
On Friday, 27 October 2017 at 11:25:13 UTC, codephantom wrote:
 On Friday, 27 October 2017 at 05:20:05 UTC, codephantom wrote:
 That's it!
 I've had enough!
 4 hours wasted!
ok... I must have done something wrong.. But still, I started testing this whole process at 12:04pm today. It's now 10:23PM. All I can say is thank god I used FreeBSD ;-)
 pkg install ldc
(a few seconds later, I can start compiling 64bit D code).
looks like d has a long way to go on freebsd as well.
Oct 27
parent reply codephantom <me noyb.com> writes:
On Saturday, 28 October 2017 at 01:08:57 UTC, Mengu wrote:
 looks like d has a long way to go on freebsd as well.
I've had no issues with D in FreeBSD at all... ...and it's been a really smooth transition to D...so far... I have D, Postgresql, and static C/C++ bindings working just fine...and that's really all I need..for now. btw. The FreeBSD platform isn't even mentioned here: https://insights.stackoverflow.com/survey/2017#technology-platforms So I'm just glad it works at all..otherwise I'd have to choose between not using D, or using another platform...and neither choice is appealing.
Oct 27
next sibling parent Jonathan M Davis <newsgroup.d jmdavisprog.com> writes:
On Saturday, October 28, 2017 02:50:39 codephantom via Digitalmars-d wrote:
 On Saturday, 28 October 2017 at 01:08:57 UTC, Mengu wrote:
 looks like d has a long way to go on freebsd as well.
I've had no issues with D in FreeBSD at all... ...and it's been a really smooth transition to D...so far... I have D, Postgresql, and static C/C++ bindings working just fine...and that's really all I need..for now. btw. The FreeBSD platform isn't even mentioned here: https://insights.stackoverflow.com/survey/2017#technology-platforms So I'm just glad it works at all..otherwise I'd have to choose between not using D, or using another platform...and neither choice is appealing.
FreeBSD generally works well, but it hasn't generally been handled quite as well as Linux - in part because of the auto-tester, and in part because a lot fewer people around here use FreeBSD. I've sometimes had problems, because the auto-tester currently uses FreeBSD 8.4 (IIRC), and so breakage on recent versions of FreeBSD isn't always caught (though we're working towards getting the auto-testers updated - there are a few tests that currently fail with newer versions of FreeBSD, but not many). 32-bit in particular has more problems, since I think that most of us who do use FreeBSD are running the 64-bit version, so some of the problems weren't caught until Brad tried to upgrade the auto-tester.

Things are made worse for me by the fact that I run TrueOS, which is essentially a vetted snapshot of the development version of FreeBSD, so things break from time to time. At the moment, I'm hoping that https://issues.dlang.org/show_bug.cgi?id=17596 gets sorted out before December, since the next update for the TrueOS stable branch is coming out then, and I expect that it will have the inode64 changes, which break dmd and pretty much any D program that deals with files.

However, anyone running FreeBSD 11.x is in for a much smoother ride, and the fact that a few of us use TrueOS or FreeBSD CURRENT allows such problems to be caught before they become a problem for the release versions of FreeBSD. Getting the auto-tester updated will definitely help though.

- Jonathan M Davis
Oct 27
prev sibling parent reply Mengu <mengukagan gmail.com> writes:
On Saturday, 28 October 2017 at 02:50:39 UTC, codephantom wrote:
 On Saturday, 28 October 2017 at 01:08:57 UTC, Mengu wrote:
 looks like d has a long way to go on freebsd as well.
I've had no issues with D in FreeBSD at all... ...and it's been a really smooth transition to D...so far... I have D, Postgresql, and static C/C++ bindings working just fine...and that's really all I need..for now. btw. The FreeBSD platform isn't even mentioned here: https://insights.stackoverflow.com/survey/2017#technology-platforms So I'm just glad it works at all..otherwise I'd have to choose between not using D, or using another platform...and neither choice is appealing.
my code that worked amazing on linux and mac os x failed miserably on freebsd which is my server os whenever and wherever possible. i did not have the luxury of days to fix stuff so i simply switched to debian.
Oct 28
parent codephantom <me noyb.com> writes:
On Saturday, 28 October 2017 at 15:20:05 UTC, Mengu wrote:
 my code that worked amazing on linux and mac os x failed 
 miserably on freebsd which is my server os whenever and 
 wherever possible. i did not have the luxury of days to fix 
 stuff so i simply switched to debian.
Would be interested to know what that code was doing... to make it fail. FreeBSD is certainly increasing its share in the server market... particularly for large enterprises... most VM cloud providers now provide them too... which I never expected a decade ago... (I think the change to the GPL a decade ago caused many to consider alternatives to Linux... of which there are very few)... and if D takes off too (as I think it will over the coming years, not just because of the language, but because of its licence too)... then much greater attention will have to be given to D in the FreeBSD environment. Till then... we have what we have... ..and for me.. it's pretty good... so far ;-) Make sure you're on 11.x - x64 though...
Oct 28
prev sibling parent reply codephantom <me noyb.com> writes:
On Thursday, 26 October 2017 at 20:44:49 UTC, Adam Wilson wrote:
 The XCode installer DMG is 5GB, before unpacking. And unlike 
 VS17, I can't pick and choose. :)
btw. (and I do realise we've gone way off the topic of this original thread)...but...

if it interests anyone, this is the outcome of yesterday, where I wasted my whole day trying to get DMD to compile a 64bit .exe on a fresh install of Windows 7. (and... I had to muck around with service packs, and .NET frameworks and stuff beforehand too).

It's the *minimum* 'selection set' you'll need (with regards to the Visual Studio Build Tools 2017) in order to get DMD to successfully compile a 64bit exe (-m64).

Now to be fair, this is assuming you **don't** want and **don't** have VS installed, but just want the necessary 'build tools' so that DMD can build a *64bit* binary on Windows - (in total about 3.5GB).

Code tools
  Static analysis tools
Compilers, build tools, and runtimes
  VC++ 2017 v141 toolset (x86,x64)
SDK's, libraries and frameworks
  Windows 10 SDK (10.0.16299.0) for Desktop C++ [x86 and x64]
  Windows 10 SDK (10.0.16299.0) for UWP: C#, VB, JS
  Windows 10 SDK (10.0.16299.0) for UWP: C++
Oct 28
next sibling parent reply Jerry <hurricane hereiam.com> writes:
On Saturday, 28 October 2017 at 07:39:21 UTC, codephantom wrote:
 btw. (and I do realise we've gone way off the topic of this 
 original thread)...but...

 if it interests anyone, this is the outcome of yesterday, where 
 I wasted my whole day trying to get DMD to compile a 64bit .exe 
 on a fresh install of Windows 7.
Your own incompetence isn't reason enough for everyone else to suffer. I've never had a problem installing Visual Studio, or getting D to work with it.
Oct 28
parent reply codephantom <me noyb.com> writes:
On Saturday, 28 October 2017 at 14:00:14 UTC, Jerry wrote:
 On Saturday, 28 October 2017 at 07:39:21 UTC, codephantom wrote:
 btw. (and I do realise we've gone way off the topic of this 
 original thread)...but...

 if it interests anyone, this is the outcome of yesterday, 
 where I wasted my whole day trying to get DMD to compile a 
 64bit .exe on a fresh install of Windows 7.
Your own incompetence isn't reason enough for everyone else to suffer. I've never had a problem installing Visual Studio, or getting D to work with it.
Nice one Jerry. You're so eager to have a go at me, that you completely missed the point. I explicitly mentioned that I did *******NOT******* want VS installed. All I wanted, was to build a 64bit D binary, and wanted to know what was the minimum components I had to install in order to be able to do that. If I had just wanted VS, I would have just installed that. The majority of time spent was downloading the damn thing! Go trawl somewhere else!
Oct 28
next sibling parent reply codephantom <me noyb.com> writes:
On Saturday, 28 October 2017 at 14:43:38 UTC, codephantom wrote:
 Nice one Jerry.


 Go trawl somewhere else!
I think I meant troll, not trawl ;-)
Oct 28
parent codephantom <me noyb.com> writes:
On Saturday, 28 October 2017 at 14:50:25 UTC, codephantom wrote:
 I think I meant troll, not trawl ;-)
btw... A scientific research paper, titled 'Trolls just want to have fun' found that: - Sadism and Machiavellianism were unique predictors of trolling enjoyment.. - Found clear evidence that sadists tend to troll because they enjoy it.. http://www.sciencedirect.com/science/article/pii/S0191886914000324
Oct 28
prev sibling next sibling parent reply Mengu <mengukagan gmail.com> writes:
On Saturday, 28 October 2017 at 14:43:38 UTC, codephantom wrote:
 On Saturday, 28 October 2017 at 14:00:14 UTC, Jerry wrote:
 On Saturday, 28 October 2017 at 07:39:21 UTC, codephantom 
 wrote:
 btw. (and I do realise we've gone way off the topic of this 
 original thread)...but...

 if it interests anyone, this is the outcome of yesterday, 
 where I wasted my whole day trying to get DMD to compile a 
 64bit .exe on a fresh install of Windows 7.
Your own incompetence isn't reason enough for everyone else to suffer. I've never had a problem installing Visual Studio, or getting D to work with it.
Nice one Jerry. You're so eager to have a go at me, that you completely missed the point. I explicitly mentioned that I did *******NOT******* want VS installed. All I wanted, was to build a 64bit D binary, and wanted to know what was the minimum components I had to install in order to be able to do that. If I had just wanted VS, I would have just installed that. The majority of time spent was downloading the damn thing! Go trawl somewhere else!
but what if that is how you can build a 64 bit binary? with mac os x, we have to download gbs of command line tools library before getting started with any development. if we want to build anything for ios or mac we have to download 5gb xcode. with a fast internet, you get that in a matter of minutes. i don't believe that should be a show stopper or maybe i am missing your point.
Oct 28
parent reply codephantom <me noyb.com> writes:
On Saturday, 28 October 2017 at 15:18:07 UTC, Mengu wrote:
 with mac os x, we have to download gbs of command line tools 
 library before getting started with any development. if we want 
 to build anything for ios or mac we have to download 5gb xcode. 
 with a fast internet, you get that in a matter of minutes. i 
 don't believe that should be a show stopper or maybe i am 
 missing your point.
Yeah.. sadly, we don't have fast internet here in Australia. 1GB takes about an hour (presuming my house mate's not online ;-) And I have a typically average connection. Just the build tools alone (without the VS IDE and stuff) took almost 4 hours for me to download. And all I wanted to do was compile some D code into a 64bit binary. If I were on a mobile wireless internet connection, my next bill would send me into bankruptcy! (lucky I'm on a landline connection). I guess if it took seconds, I'd have a bit less to complain about ;-) But if you really are missing my point.. then let me state it more clearly... (1) I don't like waiting 4 hours to download gigabytes of crap I don't actually want, but somehow need (if I want to compile 64bit D, that is). (2) I like to have choice. A fast internet might help with (1). (2) seems out of reach (and that's why I don't and won't be using D on Windows ;-) (being a recreational programmer, I have that luxury.. I understand that others do not, but that's no reason for 'some' to dismiss my concerns as irrelevant. They're relevant to me, and that's all that matters ;-)
Oct 28
next sibling parent reply Adam D. Ruppe <destructionator gmail.com> writes:
On Saturday, 28 October 2017 at 15:36:38 UTC, codephantom wrote:
 (if I want to compile 64bit D that is).
 (being a recreational programmer
Why do you want 64 bit? I very rarely do 64 bit builds on Windows (mostly just to make sure my crap actually works) since there's not actually that many advantages to it anyway!
Oct 28
next sibling parent reply codephantom <me noyb.com> writes:
On Saturday, 28 October 2017 at 15:42:00 UTC, Adam D. Ruppe wrote:
 Why do you want 64 bit? I very rarely do 64 bit builds on 
 Windows (mostly just to make sure my crap actually works) since 
 there's not actually that many advantages to it anyway!
I'm more of an experimenter than a programmer. I like seeing how code works in different environments. I have several 16-bit computers at home too... but no D for them ;-( I'm used to writing code in a plain text editor (the plainer the better).. and doing everything else at a shell prompt. It's just how I like to 'play'. Perhaps that's why I just see VS as a big scary monster that wants to eat up all my computer resources ;-) The Digital Mars C compiler is 2MB (no... I got that right... MB not GB..) ..think about it...
Oct 28
parent reply Adam D. Ruppe <destructionator gmail.com> writes:
On Saturday, 28 October 2017 at 16:03:15 UTC, codephantom wrote:
 I like seeing how code works in different environments.
The beauty of it is they work basically the same. Especially on Windows, where 32 bit programs just work on almost any installation, 32 or 64 bit.
 The Digital Mars C compiler is 2MB (no... I got that right... 
 MB not GB..)
Yes, I have been using it for a looooong time. And it just works with dmd 32 bit! 64 bit is an added hassle, but an unnecessary one for most uses anyway.
Oct 28
next sibling parent codephantom <me noyb.com> writes:
On Saturday, 28 October 2017 at 16:23:13 UTC, Adam D. Ruppe wrote:
 The beauty of it is they work basically the same. Especially on 
 Windows, where 32 bit programs just work on almost any 
 installation, 32 or 64 bit.
yes. i have dmd on one of my old laptops (it runs XP 32bit) ...works just fine. No VS crap needed. The whole o/s takes up only 2.5GB (about 2GB less than just the VS2017 build tools). Somewhere along the line, software development took a turn for the worse... now we have bloated software with an incredible amount of dependencies on this and that... it's just getting crazy.... IMHO. too big and too slow.. that's why the dinosaurs never survived ;-)
Oct 28
prev sibling parent codephantom <me noyb.com> writes:
On Saturday, 28 October 2017 at 16:23:13 UTC, Adam D. Ruppe wrote:
 64 bit is an added hassle, but an unnecessary one for most uses 
 anyway.
Today I thought I might install DMD on Windows XP 64bit (the Intel one)... just to see if I can compile D with -m64. Well, with the Windows 7 SDK and DMD installed, -m64 worked just fine. So D continues to surprise me, and now even supports (seems to, anyway) a platform that everyone gave up on a long time ago .. isn't that great! 64bit D ... on Windows XP...
Oct 30
prev sibling parent reply Kagamin <spam here.lot> writes:
On Saturday, 28 October 2017 at 15:42:00 UTC, Adam D. Ruppe wrote:
 Why do you want 64 bit? I very rarely do 64 bit builds on 
 Windows (mostly just to make sure my crap actually works) since 
 there's not actually that many advantages to it anyway!
Because native. I believe Linux doesn't have the 32-bit subsystem installed by default, and Windows has started to do the same.
Oct 30
next sibling parent reply Basile B. <b2.temp gmx.com> writes:
On Monday, 30 October 2017 at 10:53:33 UTC, Kagamin wrote:
 On Saturday, 28 October 2017 at 15:42:00 UTC, Adam D. Ruppe 
 wrote:
 Why do you want 64 bit? I very rarely do 64 bit builds on 
 Windows (mostly just to make sure my crap actually works) 
 since there's not actually that many advantages to it anyway!
Because native. I believe Linux doesn't have 32-bit subsystem
Ha i thought the same but... Yes it has one. Just set up the 32 bit version of the devel libraries (likely not necessary for phobos) and you can compile & link with -m32.
Oct 30
parent Kagamin <spam here.lot> writes:
On Monday, 30 October 2017 at 11:18:39 UTC, Basile B. wrote:
 Ha i thought the same but... Yes it has one.
The first 32 bit application will pull it as a dependency. Same can be done for JVM.
 Just set up the 32 bit version of the devel libraries
BTW why are those even needed? Doesn't ld link against the .so directly?
Oct 30
prev sibling parent reply Adam D. Ruppe <destructionator gmail.com> writes:
On Monday, 30 October 2017 at 10:53:33 UTC, Kagamin wrote:
 Because native.
The processor natively supports all 32 bit code when running in 64 bit mode. It just works as far as native hardware goes. You also need your library dependencies installed too, and indeed on Linux that might be an extra install (just like any other dependencies...), but on Windows, the 32 bit core libs are always installed, and with D you don't really use other stuff anyway. D on Windows 32 bit just works and generates an exe that just works on basically any Windows box from the last 15 years and will likely continue to just work for AT LEAST the next 5, probably more. If you're playing around... really no reason not to just use the 32 bit one.
Oct 30
next sibling parent reply codephantom <me noyb.com> writes:
On Monday, 30 October 2017 at 14:46:30 UTC, Adam D. Ruppe wrote:
 If you're playing around... really no reason not to just use 
 the 32 bit one.
Really depends what you're playing with. If you play with large databases, containing a lot of data, then 64-bit memory addressing will give you access to more memory. And more memory means faster operations.
Oct 30
parent reply Adam D Ruppe <destructionator gmail.com> writes:
On Tuesday, 31 October 2017 at 01:00:29 UTC, codephantom wrote:
 If you play with large databases, containing a lot of data, then 
 64-bit memory addressing will give you access to more memory.
That doesn't really matter. If you're IMPLEMENTING the database, sure it can help (but is still not *necessary*), but if you're just playing with it, let the database engine handle that and just query the bits you are actually interested in. People have been working with huge, HUGE databases in 32 bit programs for years.
 And more memory means faster operations.
Not necessarily. There are advantages to 64 bit, but you can live without them. A 32 bit program can do most of the same stuff.
Oct 30
next sibling parent reply Dmitry Olshansky <dmitry.olsh gmail.com> writes:
On Tuesday, 31 October 2017 at 01:25:31 UTC, Adam D Ruppe wrote:
 On Tuesday, 31 October 2017 at 01:00:29 UTC, codephantom wrote:
 If you play with large databases, containing a lot of data, then 
 64-bit memory addressing will give you access to more memory.
That doesn't really matter. If you're IMPLEMENTING the database, sure it can help (but is still not *necessary*),
Kinda important, say your server is 128GB (bigger sizes are quite typical these days).
  but if you're just playing with it, let the database engine 
 handle that and just query the bits you are actually interested 
 in.

 People have been working with huge, HUGE databases in 32 bit 
 programs for years.
Ah yes, we can do the same in 16 bits with an ample 640K bytes. Just window your dataset 64K at a time, trivial. There are advantages in a bigger virtual address space even if you use a tiny fraction of physical memory.
 There are advantages to 64 bit, but you can live without them.
I can live without hot water in my house, but would I want to?
 A 32 bit program can do most of the same stuff.
Client applications probably do not care much. Servers and cluster software can use more RAM and take advantage of huge address space in many interesting ways.
Oct 30
next sibling parent reply Adam D. Ruppe <destructionator gmail.com> writes:
On Tuesday, 31 October 2017 at 06:33:02 UTC, Dmitry Olshansky 
wrote:
 I can live without hot water in my house, but would I want to?
So sad but true... my water heater went down today :( Basement flooded and it is blinking out a bad vapor sensor error code.
 Client applications probably do not care much. Servers and 
 cluster software can use more RAM and take advantage of huge 
 address space in many interesting ways.
Yeah, I know. And if you're writing that kind of software, installing Visual Studio isn't a big deal. But my point is that the kind of typical hobby stuff and a huge (HUGE) subset of other work too functions perfectly well with 32 bit, yes, even with optlink. You can do web applications, desktop applications, games, all kinds of things with the out-of-the-box dmd install and nobody will be the wiser of 32 vs 64 bit unless someone makes a specific stink over it.
Oct 31
next sibling parent reply codephantom <me noyb.com> writes:
On Tuesday, 31 October 2017 at 21:21:46 UTC, Adam D. Ruppe wrote:
 But my point is that the kind of typical hobby stuff and a huge 
 (HUGE) subset of other work too functions perfectly well with 
 32 bit, yes, even with optlink. You can do web applications, 
 desktop applications, games, all kinds of things with the 
 out-of-the-box dmd install and nobody will be the wiser of 32 
 vs 64 bit unless someone makes a specific stink over it.
My 'hobby stuff' involves pushing things to their limit.. and beyond... ;-) Just now, on my 24GB mem desktop, I could malloc 21GB of contiguous memory! If I use -m32, that reduces to 1GB. Anyway... when are you going to give us another sermon? Was Andrei the last angel to come visit Walter?
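The 21GB-vs-1GB gap is just pointer width. As a hedged sketch (shell used here rather than D, so it runs anywhere; the arithmetic is the only claim being made), you can ask the platform directly:

```shell
# Pointer width decides how much a process can even address: 32-bit
# pointers cap the virtual address space at 4 GiB (far less is usable
# in practice), so a 21GB malloc only works in a 64-bit build.
bits=$(getconf LONG_BIT)    # 32 or 64 for the native userland
echo "native word size: ${bits}-bit"
echo "address-space ceiling: $(( 1 << (bits - 30) )) GiB"
```

For bits=32 the ceiling works out to 4 GiB; for bits=64 it is 2^34 GiB - which is why the -m32 build above tops out so much earlier than the machine's 24GB of RAM.
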
Oct 31
parent Adam D. Ruppe <destructionator gmail.com> writes:
On Wednesday, 1 November 2017 at 01:48:13 UTC, codephantom wrote:
 Anyway... when are you going to give us another sermon?
This is WAY off topic so i'ma just leave it at this post (you can email me if you want to go further) but I kinda doubt I'll go to a DConf in Berlin. It is a pain for me. Maybe I'll do it... but don't count on it.
 Was Andrei the last angel to come visit Walter?
No, of course not! Scott Meyers also had to come down to restore the C++hood keys.
Oct 31
prev sibling parent Dmitry Olshansky <dmitry.olsh gmail.com> writes:
On Tuesday, 31 October 2017 at 21:21:46 UTC, Adam D. Ruppe wrote:
 On Tuesday, 31 October 2017 at 06:33:02 UTC, Dmitry Olshansky 
 wrote:
 I can live without hot water in my house, would I?
So sad but true... my water heater went down today :(
Ouch, that analogy got out of hand quick)
 Basement flooded and it is blinking out a bad vapor sensor 
 error code.
Sorry to hear that.
 Client applications probably do not care much. Servers and 
 cluster software can use more RAM and take advantage of huge 
 address space in many interesting ways.
Yeah, I know. And if you're writing that kind of software, installing Visual Studio isn't a big deal. But my point is that the kind of typical hobby stuff and a huge (HUGE) subset of other work too functions perfectly well with 32 bit, yes, even with optlink. You can do web applications, desktop applications, games, all kinds of things with the out-of-the-box dmd install and nobody will be the wiser of 32 vs 64 bit unless someone makes a specific stink over it.
Sure. Even Chrome and its ilk were 32-bit for a super long time. I think 64-bit consumed even more RAM, and that postponed the switch :)
Oct 31
prev sibling parent reply Jonathan M Davis <newsgroup.d jmdavisprog.com> writes:
On Tuesday, October 31, 2017 06:33:02 Dmitry Olshansky via Digitalmars-d 
wrote:
 On Tuesday, 31 October 2017 at 01:25:31 UTC, Adam D Ruppe wrote:
 A 32 bit program can do most of the same stuff.
Client applications probably do not care much. Servers and cluster software can use more RAM and take advantage of huge address space in many interesting ways.
Wait, people run Windows on servers? No one could be that crazy, could they? ;)

I think that Adam has a valid point that there _are_ plenty of applications that can function just fine as 32-bit, and given how much easier it is to build for 32-bit on Windows with D, if you don't need to interact with any 3rd party libraries built with MS' compiler, then simply using the default 32-bit dmd stuff on Windows could be just fine.

But the fact remains that plenty of applications need 64-bit or would benefit from 64-bit, and plenty of applications need access to COFF libraries, and in those cases, you can't do things the easy way on Windows. So, for some stuff, having dmd as it is now with 32-bit works just fine, but for other stuff, it doesn't cut it at all. It really depends on what you're trying to do.

Either way, it's unfortunate that we have to jump through as many hoops as we do in order to interact with the default C/C++ stuff on Windows. Hopefully, we'll be able to improve that over time though - and we already have. Once upon a time, we didn't have an installer on Windows (let alone one that tried to help you with VS), and we couldn't build COFF stuff with dmd at all.

- Jonathan M Davis
Oct 31
next sibling parent codephantom <me noyb.com> writes:
On Wednesday, 1 November 2017 at 03:55:14 UTC, Jonathan M Davis 
wrote:
 I think that Adam has a valid point that there _are_ plenty of 
 applications that can function just fine as 32-bit, and given 
 how much easier it is to build for 32-bit on Windows with D, if 
 you don't need to interact with any 3rd party libraries built 
 with MS' compiler, then simply using the default 32-bit dmd 
 stuff on Windows could be just fine.
Yes. I agree. The point was valid, and it was not a point many would have dared argue .. so good on him ;-) But progress is needed too.. 64bit is to 32bit as D is to C. A new world of possibilities awaits....
Oct 31
prev sibling parent reply Dmitry Olshansky <dmitry.olsh gmail.com> writes:
On Wednesday, 1 November 2017 at 03:55:14 UTC, Jonathan M Davis 
wrote:
 On Tuesday, October 31, 2017 06:33:02 Dmitry Olshansky via 
 Digitalmars-d wrote:
 On Tuesday, 31 October 2017 at 01:25:31 UTC, Adam D Ruppe 
 wrote:
 A 32 bit program can do most of the same stuff.
Client applications probably do not care much. Servers and cluster software can use more RAM and take advantage of huge address space in many interesting ways.
Wait, people run Windows on servers? No one could be that crazy, could they? ;)
You are seriously underestimating Windows Server. Yeah, it has a GUI and remote desktop, but it ticks in at, what, ~200 MB of RAM. Microsoft IIS is still a top server on the web. Also, if you didn't notice, in recent years MS made quite a few performance breakthroughs, e.g. user-mode scheduling and RIO sockets.
 I think that Adam has a valid point that there _are_ plenty of 
 applications that can function just fine as 32-bit, and given 
 how much easier it is to build for 32-bit on Windows with D, if 
 you don't need to interact with any 3rd party libraries built 
 with MS' compiler, then simply using the default 32-bit dmd 
 stuff on Windows could be just fine.
That is ok.
 But the fact remains that plenty of applications need 64-bit or 
 would benefit from 64-bit, and plenty of applications need 
 access to COFF libraries, and in those cases, you can't do 
 things the easy way on Windows.
Like dmd itself!
Oct 31
parent Jonathan M Davis <newsgroup.d jmdavisprog.com> writes:
On Wednesday, November 01, 2017 05:36:21 Dmitry Olshansky via Digitalmars-d 
wrote:
 On Wednesday, 1 November 2017 at 03:55:14 UTC, Jonathan M Davis
 But the fact remains that plenty of applications need 64-bit or
 would benefit from 64-bit, and plenty of applications need
 access to COFF libraries, and in those cases, you can't do
 things the easy way on Windows.
Like dmd itself!
Yeah, given the situation with CTFE, it's kind of atrocious that we don't distribute dmd as a 64-bit binary at least as an option. - Jonathan M Davis
Oct 31
prev sibling next sibling parent Kagamin <spam here.lot> writes:
On Tuesday, 31 October 2017 at 01:25:31 UTC, Adam D Ruppe wrote:
 That doesn't really matter. If you're IMPLEMENTING the 
 database, sure it can help (but is still not *necessary*), but 
 if you're just playing with it, let the database engine handle 
 that and just query the bits you are actually interested in.

 People have been working with huge, HUGE databases in 32 bit 
 programs for years.
It's like clanging rocks together to get fire. It can be done, it's just expensive and doesn't scale when the logic becomes complex.
Oct 31
prev sibling parent Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
On 10/30/2017 09:25 PM, Adam D Ruppe wrote:
 There are advantages to 64 bit, but you can live without them. A 32 bit 
 program can do most of the same stuff.
The differences in performance are large and growing, however. -- Andrei
Oct 31
prev sibling parent Kagamin <spam here.lot> writes:
On Monday, 30 October 2017 at 14:46:30 UTC, Adam D. Ruppe wrote:
 On Monday, 30 October 2017 at 10:53:33 UTC, Kagamin wrote:
 Because native.
The processor natively supports all 32 bit code when running in 64 bit mode. It just works as far as native hardware goes.
For the processor it's a whole compatibility mode of operation. It's fairly deeply integrated, but still...
 You also need your library dependencies installed too, and 
 indeed on Linux that might be an extra install (just like any 
 other dependencies...), but on Windows, the 32 bit core libs 
 are always installed and with D, you don't really use other 
 stuff anyway.
For the OS it's a whole emulated subsystem with a separate collection of compiled code installed and loaded into RAM. On my system it's 1.36 GB, 7000 files. On Windows too, it depends on the installation whether the 32 bit subsystem is installed. Also, if the code can work on 64 bit linux, there's little reason to stretch it to 32 bit.
 If you're playing around... really no reason not to just use 
 the 32 bit one.
A 64 bit system is free from some legacy stuff too; it's just better. So it's better to play with 64 bit than with 32 bit. For example, remember the whole OMF deal.
Oct 31
prev sibling parent reply Jerry <hurricane hereiam.com> writes:
On Saturday, 28 October 2017 at 15:36:38 UTC, codephantom wrote:
 But if you really are missing my point..then let me state it 
 more clearly...

 (1) I don't like waiting 4 hours to download gigabytes of crap 
 I don't actually want, but somehow need (if I want to compile 
 64bit D that is).
Start the download when you go to sleep, when you wake up it will be finished. I did this as a kid when I had internet that was probably even slower than yours right now. It'll be like those 4 hours never even happened.
 (2)I like to have choice.

 A fast internet might help with (1).

 (2) seems out of reach (and that's why I dont' and won't be 
 using D on Windows ;-)
It's probably why you shouldn't be on Windows to begin with..
 (being a recreational programmer, I have that luxury..I 
 understand that others do not, but that's no reason for 'some' 
 to dismiss my concerns as irrelevant. They're relevant to me, 
 and that's all that matters ;-)
Talk about being narcissistic ;)
Oct 28
next sibling parent reply codephantom <me noyb.com> writes:
On Saturday, 28 October 2017 at 19:46:00 UTC, Jerry wrote:
 It's probably why you shouldn't be on Windows to begin with..
I'm not. I'm on FreeBSD.
 Talk about being narcissistic ;)
I wasn't talking about narcissism, I was talking about trolling. Narcissism was not correlated with trolling enjoyment in that study I mentioned. If you want a debate, fine. If you want to troll...go elsewhere.
Oct 28
parent reply Jerry <hurricane hereiam.com> writes:
On Sunday, 29 October 2017 at 00:17:10 UTC, codephantom wrote:
 On Saturday, 28 October 2017 at 19:46:00 UTC, Jerry wrote:
 It's probably why you shouldn't be on Windows to begin with..
I'm not. I'm on FreeBSD.
So why do you care about something that doesn't even affect you?
 Talk about being narcissistic ;)
I wasn't talking about narcissism, I was talking about trolling. Narcissism was not correlated with trolling enjoyment in that study I mentioned.
It is when someone is only thinking about themselves, such as yourself, wanting D to drop tools that you aren't even claiming are inadequate. You are just complaining about it because of its large download size. I find people always label others as trolls instead of offering a counterpoint to their argument. It's a lot easier to label someone a troll and ignore their arguments than to build a case for a baseless request.
Oct 28
parent reply codephantom <me noyb.com> writes:
On Sunday, 29 October 2017 at 01:07:17 UTC, Jerry wrote:
 So why do you care about something that doesn't even affect you?
Well, if you had been following the discussion, instead of just trying to troll, then you would know that I was essentially doing an experiment.

AIM: If I was using Windows, and wanted to compile 64bit D code, but did *NOT* want to install VS, then what would be the minimum components required (of VS's build tools).

The outcome, **for me**, of that experiment, was a whole day wasted - mostly waiting for GBs of build tools to download. Not to mention the service packs, updates and .NET framework installation. Then there's getting your head around the various licence agreements, and also trying to understand what these new processes running on my system are doing..and why they are talking to servers on the internet.....

Coming from a non-windows C background ... the whole thing was a little daunting. I guess if you're used to all the bloat that comes with Windows...(as you seem to be) or have to use it for practical reasons ....then nothing about my little experiment would surprise you..

So I'm exercising my right to free speech, and I'm saying that I don't like it, and I wish it was better. Is that so bad?
Oct 28
next sibling parent Jerry <hurricane hereiam.com> writes:
On Sunday, 29 October 2017 at 01:43:46 UTC, codephantom wrote:
 So I'm executing my right to free speech, and I'm saying that I 
 don't like it, and I wish it was better. Is that so bad?
You are doing more than saying you don't like it. You are requesting and advocating for the removal of a feature that has no reason to be removed. I never said you couldn't say you don't like it, you are free and welcome to do so. I never even said you can't request for a feature to be removed. I'm simply making the counter argument for why it shouldn't be, and I'm free to do that as you are free to make horrible requests. Anyways you keep trying to change your argument and make it appear as something else. Your free speech was never in jeopardy from the beginning, I never told you couldn't say anything. It's clear where this is going and it's clear you have no intentions of actually making any attempt to justify your request as it is unjustifiable. Good day.
Oct 28
prev sibling parent reply 12345swordy <alexanderheistermann gmail.com> writes:
On Sunday, 29 October 2017 at 01:43:46 UTC, codephantom wrote:
 On Sunday, 29 October 2017 at 01:07:17 UTC, Jerry wrote:
 So why do you care about something that doesn't even affect 
 you?
Well, if you had been following the discussion, instead of just trying to troll, then you would know that I was essentially doing an experiment. AIM: If I was using Windows, and wanted to compile 64bit D code, but did *NOT* want to install VS, then what would be the minimum components required (of VS's build tools). The outcome, **for me**, of that experiment, was a whole day wasted - mostly waiting for GBs of build tools to download. Not to mention the service packs, updates and .NET framework installation. Then there's getting your head around the various licence agreements, and also trying to understand what these new processes running on my system are doing..and why they are talking to servers on the internet..... Coming from a non-windows C background ... the whole thing was a little daunting. I guess if you're used to all the bloat that comes with Windows...(as you seem to be) or have to use it for practical reasons ....then nothing about my little experiment would surprise you.. So I'm exercising my right to free speech, and I'm saying that I don't like it, and I wish it was better. Is that so bad?
It seems to me that you have a major case of anti-windows bias here, as I never have any issues on my main windows machine.
Oct 28
next sibling parent reply codephantom <me noyb.com> writes:
On Sunday, 29 October 2017 at 02:09:31 UTC, 12345swordy wrote:
 It seems to me that you have a major case of anti-windows bias 
 here, as I never have any issues on my main windows machine.
Well, throughout this discussion, I have documented *my* experience (not yours) of getting 64bit D on a fresh install of Windows 7. That was my only objective.

I'm really not anti-windows. Windows XP is still one of my favourite all-time OS's! I'm not anti-microsoft either...they have some good stuff too.... I wish they'd get .NET core working on FreeBSD properly though...as I like playing with C# too...

What I am, is:

anti-bloat
anti-too-many-unnecessary-dependencies
anti you-have-no-choice-but-to-download-GB's-stuff-you-really-don't-need

This is not specific to Windows by the looks of the discussion. Apple has similar demands of you apparently (with Xcode). I think it can and should be done better...that's the point I'm really trying to get (push) across. I'm shocked that so many seem to disagree.

The modularisation of the VS install process helps a little (if you can get your head around how it works)...but there are still too many dependencies...and the whole experience should be streamlined a lot more...

Many developers these days learn to program in these bloated IDEs..and they are used to it..I get that. I learnt to program in C, using vi ;-) I guess different experiences lead to different expectations.
Oct 28
parent reply 12345swordy <alexanderheistermann gmail.com> writes:
On Sunday, 29 October 2017 at 02:39:21 UTC, codephantom wrote:
 On Sunday, 29 October 2017 at 02:09:31 UTC, 12345swordy wrote:

 What I am, is:

 anti-bloat
 anti-too-many-unnecessary-dependencies
 anti 
 you-have-no-choice-but-to-download-GB's-stuff-you-really-don't-need
How exactly do you know this? You never justify it! We are living in an age where we have terabyte hard drives! Hell, I even recall that gdb needs python for some strange reason, on my linux machines. I don't know WHY it requires it, but I wouldn't jump to conclusions and think of it as an "unnecessary dependency" simply because I don't understand the rationale behind it.
Oct 29
next sibling parent reply Jonathan M Davis <newsgroup.d jmdavisprog.com> writes:
On Sunday, October 29, 2017 16:14:11 12345swordy via Digitalmars-d wrote:
 On Sunday, 29 October 2017 at 02:39:21 UTC, codephantom wrote:
 On Sunday, 29 October 2017 at 02:09:31 UTC, 12345swordy wrote:

 What I am, is:

 anti-bloat
 anti-too-many-unnecessary-dependencies
 anti
 you-have-no-choice-but-to-download-GB's-stuff-you-really-don't-need
How exactly do you know this? You never justify it! We are living in an age where we have terabyte hard drives! Hell, I even recall that gdb needs python for some strange reason, on my linux machines. I don't know WHY it requires it, but I wouldn't jump to conclusions and think of it as an "unnecessary dependency" simply because I don't understand the rationale behind it.
Really, it doesn't matter. Yes, it would be great if VS didn't take up as much space as it does. It would be great if Microsoft released stuff with the idea that the core compiler command-line tools were what mattered, and the IDE was just an add-on for those who care. I'm sure that we could all come up with reasons to complain about what Microsoft is doing - _lots_ of geeks love to complain about Microsoft.

What matters is that D needs to be able to link and interoperate with C/C++ code generated by Microsoft's compiler, because that's the primary compiler for Windows systems and what most everyone is using if they're developing on Windows. If we can do that in a way that minimizes what needs to be downloaded, then great. If we can't, then well, that sucks, but that's life.

While complaining about what Microsoft is doing with VS may be justified, it doesn't really help anything. I think that we'd all be better off if we just let this topic die.

- Jonathan M Davis
Oct 29
next sibling parent codephantom <me noyb.com> writes:
On Sunday, 29 October 2017 at 16:25:35 UTC, Jonathan M Davis 
wrote:
 While complaining about what Microsoft is doing with VS may be 
 justified, it doesn't really help anything. I think that we'd 
 all be better off if we just let this topic die.

 - Jonathan M Davis
That attitude would have you instantly evicted from my team ;-) It's precisely because of the 'complaining' that Microsoft is changing its ways. Complain even 'louder'...that's my advice. -- The only real problem with Windows, is that you can't fork it --
Oct 29
prev sibling parent codephantom <me noyb.com> writes:
On Sunday, 29 October 2017 at 16:25:35 UTC, Jonathan M Davis 
wrote:
 While complaining about what Microsoft is doing with VS may be 
 justified, it doesn't really help anything. I think that we'd 
 all be better off if we just let this topic die.
Not to push the point too much...but I found this interesting phrase, from Google's 'ten things we know to be true' "Ultimately, our constant dissatisfaction with the way things are becomes the driving force behind everything we do." https://www.google.com/about/philosophy.html
Oct 29
prev sibling next sibling parent codephantom <me noyb.com> writes:
On Sunday, 29 October 2017 at 16:14:11 UTC, 12345swordy wrote:
 How exactly do you know this? You never justify it! We are 
 living in an age where we have terabyte hard drives! Hell, I 
 even recall that gdb needs python for some strange reason, on 
 my linux machines. I don't know WHY it requires it, but I 
 wouldn't jump to conclusions and think of it as an 
 "unnecessary dependency" simply because I don't understand 
 the rationale behind it.
I believe that complexity and unnecessary dependencies on components and other teams are the biggest problem/challenge facing modern software development. It leads to software that is intolerant to change. And it's the primary reason why the Cylons, when they arrive, will defeat us ;-) The D language is so refreshing...that I'd hate to see it get caught up in that mess of complexity. We should all really be on guard against it.
Oct 29
prev sibling parent codephantom <me noyb.com> writes:
On Sunday, 29 October 2017 at 16:14:11 UTC, 12345swordy wrote:
 Hell, I even recall that gdb needs python for some strange 
 reason, on my linux machines. I don't know WHY it requires it, 
 but I wouldn't jump to conclusions and think of it as an 
 "unnecessary dependency" simply because I don't understand 
 the rationale behind it.
Here is an interesting paper that explores the dependencies between modules in some open source kernels (linux vs BSD).

The paper found the linux kernel (compared to the BSD kernels) had far too much dependency between modules, because linux uses far too many global variables, so too many modules are tightly coupled by means of sharing those global variables. They argued that such common coupling (module dependencies) has a deleterious effect on maintainability, and that such common coupling increases exponentially with each subsequent release, further reducing maintainability.

The key take-away point, of course, is that software development really needs to be on guard against the problems associated with module dependencies. It's one of the reasons functional programming is becoming increasingly important (and useful). There is no reason why the principle should not also apply to the distribution of software.

https://dl.acm.org/citation.cfm?id=1150566.1150571
Oct 29
prev sibling parent reply codephantom <me noyb.com> writes:
On Sunday, 29 October 2017 at 02:09:31 UTC, 12345swordy wrote:
 It seems to me that you have a major case of anti-windows bias 
 here, as I never have any issues on my main windows machine.
Actually, it's the very opposite...I'm strongly arguing 'for' D on Windows (otherwise I wouldn't have wasted my time with this).

If you're ok with having VS, then it is not too much of a pain to install..I get it. But if you don't want VS, then it really is a pain. You have to work out what the minimum required components are....all by yourself - like I had to do. That really was a pain!

I want D on Windows (64bit included), and I want it to be a better experience than what I had...that's been the whole point of my involvement in the discussion. In essence, I'm an advocate for D on Windows ;-) (but to do that, without being forced to advocate for VS as well..is kinda challenging..it seems)

It's D I'm interested in. Not VS.
Oct 28
next sibling parent codephantom <me noyb.com> writes:
On Sunday, 29 October 2017 at 03:46:35 UTC, codephantom wrote:
 It's D I'm interested in. Not VS.
btw. since this thread has gone way off topic... I'd suggest this one instead: https://forum.dlang.org/thread/xwuxfcdaqkcealxzgmqk forum.dlang.org
Oct 28
prev sibling parent reply Patrick Schluter <Patrick.Schluter bbox.fr> writes:
On Sunday, 29 October 2017 at 03:46:35 UTC, codephantom wrote:
 On Sunday, 29 October 2017 at 02:09:31 UTC, 12345swordy wrote:
 It seems to me that you have a major case of anti-windows bias 
 here, as I never have any issues on my main windows machine.
Actually, it's the very opposite...I'm strongly arguing 'for' D on Windows (otherwise I wouldn't have wasted my time with this). If you're ok with having VS, then it is not too much of a pain to install..I get it. But if you don't want VS, then it really is a pain. You have to work out what the minimum required components are....all by yourself - like I had to do. That really was a pain! I want D on Windows (64bit included), and I want it to be a better experience than what I had...that's been the whole point of my involvement in the discussion. In essence, I'm an advocate for D on Windows ;-) (but to do that, without being forced to advocate for VS as well..is kinda challenging..it seems) It's D I'm interested in. Not VS.
Just a little answer so that you see that you're not alone with your concerns. I think you're absolutely right, and that your experiment was nicely done and clear from the beginning about what it was about. Reading is a skill that some people seem to have problems with.

Now to my experience. I finally managed to install VS2017 by doing essentially the sleep-during-download thing to get the offline installer. My Internet is not especially bad but not good either (5 Mb down, 1 Mb up ADSL with very fluctuating latencies) and the download took several hours. For 1.6 GB it's really slow. It probably has more to do with the Microsoft download code than anything else (as the discussions in the link someone provided tend to show). The good thing is that it is now possible to install VS2017 on a relatively small system partition, a thing that I didn't manage to do with VS2013 and VS2015. The DMD installer also had no problem installing the Visual-D plug-in, and I managed to build my project in 32 and 64 bit.

This said, it's the whole VS experience that I'm really annoyed with. MS goes really out of its way to make the whole IDE as magical as possible, i.e. everything is set so that the gritty reality of code generation is hidden from the developer. The more it goes, the less obvious it gets to install unconventional things in the environment. Even simple stuff can become a real pain. For instance, I like to have visible white space when editing code (yeah, I hate tabs in program code). In all the editors and IDEs I have tried so far, it was easy to set: when not in an appearance toolbar, it's somewhere in the "view" or "edit" menu. In VS, it was a chore to find, and I had to customize a toolbar through a galore of 5-deep dialog boxes. Annoying.

I can understand how and why MS does it that way. When you work with it a little longer, it is really sleek and nicely integrated into the system. The thing is that it removes the perspective of what really happens when building a program (object files, libs, linking etc.), and that's the reason why we so regularly get complaints about the "Windows experience sucking": MS has nurtured a generation of devs who have no clue what building an app entails.

To conclude: if D wants to cater to that crowd, it will have to bite the bullet and make the Windows experience even smoother than it is now. You won't overcome Windows devs' Stockholm syndrome otherwise, and Windows devs should also peg down a little bit and learn that MS's way of doing things is far from ideal (bloat, loss of control, changing specs every 3 years, programmed obsolescence (Active-X anyone?)).
Oct 29
next sibling parent codephantom <me noyb.com> writes:
On Sunday, 29 October 2017 at 10:21:22 UTC, Patrick Schluter 
wrote:
 Just a little answer so that you see that you're not alone with 
 your concerns. I think you're absolutely right and that your 
 experiment was nicely done and clear from the beginning what it 
 was about. Reading is a skill that some people seem to have 
 problems with.
Thanks for the support ;-)

btw. I was just experimenting with the Windows 7 SDK iso (a 570MB download) https://www.microsoft.com/en-au/download/details.aspx?id=8442

From that ISO, I only need to install two components:
- Headers and Libraries (only the x64 ones are needed) (~180MB)
- Visual C++ Compilers (~637MB)

The total disk space needed to install those 2 components is 810MB. Then I can compile D using -m64.

However DMD won't pick up the SDK during install, so I had to change these two settings in sc.ini:

VCINSTALLDIR=C:\Program Files (x86)\Microsoft Visual Studio 10.0\VC\
WindowsSdkDir=C:\Program Files\Microsoft SDKs\Windows\v7.1\

To my surprise (not really knowing what I was doing), it all worked (so far), and apart from downloading the iso, you don't need to be connected to the internet during the install of the SDK... The SDK requires you to have .NET 4 installed first though, otherwise it won't let you install the Visual C++ Compilers.

btw. The minimum size if you use the VS 2017 build tools was 3.5GB installed. So I have saved myself several GB of download, and several GB of disk space...just by using the Windows 7 SDK instead.
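For reference, dmd's sc.ini groups these variables in its [Environment] section (section name from the stock sc.ini that ships with dmd; the paths are the ones from this particular install, so adjust to match yours):

```ini
[Environment]
; point dmd's linker setup at the standalone Windows 7 SDK
; instead of a full Visual Studio install
VCINSTALLDIR=C:\Program Files (x86)\Microsoft Visual Studio 10.0\VC\
WindowsSdkDir=C:\Program Files\Microsoft SDKs\Windows\v7.1\
```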
Oct 29
prev sibling parent reply Joakim <dlang joakim.fea.st> writes:
On Sunday, 29 October 2017 at 10:21:22 UTC, Patrick Schluter 
wrote:
 To conclude: if D wants to cater to that crowd, it will have to 
 bite the bullet and make the Windows experience even smoother 
 than it is now. You won't overcome Windows dev's Stockholm 
 syndrome otherwise and Windows devs, should also peg down a 
 little bit and learn that MS's way of doing things is far from 
 being ideal (bloat, loss of control, changing specs every 3 
 years, programmed obsolescence (Active-X anyone?)).
Or better yet, don't bother with a dying platform full of whiny devs who are helpless without an IDE. One of D's strengths is that it isn't architected for IDE-driven development and the oft-resulting verbosity, that's a market D should probably just leave alone. Instead, focus on the current major platform which lets you use almost any toolchain you want: http://forum.dlang.org/thread/xgiwhblmkvcgnsktjnoo forum.dlang.org Of course, it is admirable what Rainer and others do to maintain VisualD and other D tools for the Windows platform. I just don't see it mattering much in the next decade.
Oct 29
parent reply 12345swordy <alexanderheistermann gmail.com> writes:
On Sunday, 29 October 2017 at 18:52:06 UTC, Joakim wrote:
 On Sunday, 29 October 2017 at 10:21:22 UTC, Patrick Schluter 
 wrote:
 To conclude: if D wants to cater to that crowd, it will have 
 to bite the bullet and make the Windows experience even 
 smoother than it is now. You won't overcome Windows dev's 
 Stockholm syndrome otherwise and Windows devs, should also peg 
 down a little bit and learn that MS's way of doing things is 
 far from being ideal (bloat, loss of control, changing specs 
 every 3 years, programmed obsolescence (Active-X anyone?)).
Or better yet, don't bother with a dying platform full of whiny devs who are helpless without an IDE. One of D's strengths is that it isn't architected for IDE-driven development and the oft-resulting verbosity, that's a market D should probably just leave alone. Instead, focus on the current major platform which lets you use almost any toolchain you want: http://forum.dlang.org/thread/xgiwhblmkvcgnsktjnoo forum.dlang.org Of course, it is admirable what Rainer and others do to maintain VisualD and other D tools for the Windows platform. I just don't see it mattering much in the next decade.
What makes you think that windows is a "dying platform"!? There is no evidence to suggest this.
Oct 29
next sibling parent reply Joakim <dlang joakim.fea.st> writes:
On Sunday, 29 October 2017 at 20:58:45 UTC, 12345swordy wrote:
 On Sunday, 29 October 2017 at 18:52:06 UTC, Joakim wrote:
 [...]
What makes you think that windows is a "dying platform"!? There is no evidence to suggest this.
Take a look at the links in the thread I linked you, which show PC sales dropping for the last six years and back at the level of a decade ago.
Oct 29
next sibling parent reply 12345swordy <alexanderheistermann gmail.com> writes:
On Sunday, 29 October 2017 at 21:21:58 UTC, Joakim wrote:
 On Sunday, 29 October 2017 at 20:58:45 UTC, 12345swordy wrote:
 On Sunday, 29 October 2017 at 18:52:06 UTC, Joakim wrote:
 [...]
What makes you think that windows is a "dying platform"!? There is no evidence to suggest this.
Take a look at the links in the thread I linked you, which show PC sales dropping for the last six years and back at the level of a decade ago.
Mobile market != Desktop market. Windows still has the majority of the Desktop market. https://www.netmarketshare.com/operating-system-market-share.aspx?qprid=10&qpcustomd=0
Oct 29
parent reply Joakim <dlang joakim.fea.st> writes:
On Sunday, 29 October 2017 at 21:30:06 UTC, 12345swordy wrote:
 On Sunday, 29 October 2017 at 21:21:58 UTC, Joakim wrote:
 On Sunday, 29 October 2017 at 20:58:45 UTC, 12345swordy wrote:
 On Sunday, 29 October 2017 at 18:52:06 UTC, Joakim wrote:
 [...]
What makes you think that windows is a "dying platform"!? There is no evidence to suggest this.
Take a look at the links in the thread I linked you, which show PC sales dropping for the last six years and back at the level of a decade ago.
 Mobile market != Desktop market. Windows still has the majority of the Desktop market. https://www.netmarketshare.com/operating-system-market-share.aspx?qprid=10&qpcustomd=0
Sure, but most people compute and run apps on mobiles. As I pointed out there, mobiles are coming after the desktop and laptop markets, and will likely kill off Wintel in the coming years.
Oct 29
parent reply 12345swordy <alexanderheistermann gmail.com> writes:
On Sunday, 29 October 2017 at 21:36:50 UTC, Joakim wrote:

 pointed out there, mobiles are coming after the desktop and 
 laptop markets, and will likely kill off Wintel in the coming 
 years.
No, they are not "coming after the desktop and laptop markets"; that's a ridiculous claim to make. You know why? Because I heard the same claim being made 5-7 years ago. People are not going to use mobile for typing up their MS Word documents.
Oct 29
parent reply Joakim <dlang joakim.fea.st> writes:
On Sunday, 29 October 2017 at 21:59:36 UTC, 12345swordy wrote:
 On Sunday, 29 October 2017 at 21:36:50 UTC, Joakim wrote:

 pointed out there, mobiles are coming after the desktop and 
 laptop markets, and will likely kill off Wintel in the coming 
 years.
No, they are not "coming after the desktop and laptop markets"; that's a ridiculous claim to make. You know why? Because I heard the same claim being made 5-7 years ago. People are not going to use mobile for typing up their MS Word documents.
Claims that were made 5-7 years ago can often come true later, :) and I already showed you that Samsung is pursuing it. Nobody will type up MS Word docs on their mobile alone, but they can do it with a Galaxy S8 in a DeX dock now, and soon won't be using Word or Office at all. If you don't believe me, I suggest you read up on some computing history, starting with WordStar: https://en.m.wikipedia.org/wiki/WordStar
Oct 29
parent reply 12345swordy <alexanderheistermann gmail.com> writes:
On Sunday, 29 October 2017 at 22:22:23 UTC, Joakim wrote:
I suggest you read up on some computing history, start with
 WordStar:

 https://en.m.wikipedia.org/wiki/WordStar
I fail to see how WordStar is relevant. Regardless, people are not going to use mobile in the workplace. Your crusade against Windows OS support is pointless.
Oct 29
parent reply Joakim <dlang joakim.fea.st> writes:
On Sunday, 29 October 2017 at 22:29:01 UTC, 12345swordy wrote:
 On Sunday, 29 October 2017 at 22:22:23 UTC, Joakim wrote:
I suggest you read up on some computing history, start with
 WordStar:

 https://en.m.wikipedia.org/wiki/WordStar
I fail to see how WordStar is relevant.
Perhaps that's why you're missing every other thing I'm pointing out too.
 Regardless, people are not going to use mobile in the 
 workplace.
If that's so, I suspect the "work place" will become irrelevant.
 Your crusade against Windows OS support is pointless.
I don't crusade against anything. I simply point out that wasting time on a dying platform is not the best use of D tool devs' time. They're then free to do whatever they want with that info.
Oct 29
parent reply 12345swordy <alexanderheistermann gmail.com> writes:
On Sunday, 29 October 2017 at 22:36:04 UTC, Joakim wrote:
 On Sunday, 29 October 2017 at 22:29:01 UTC, 12345swordy wrote:
 On Sunday, 29 October 2017 at 22:22:23 UTC, Joakim wrote:
I suggest you read up on some computing history, start with
 WordStar:

 https://en.m.wikipedia.org/wiki/WordStar
I fail to see how WordStar is relevant.
Perhaps that's why you're missing every other thing I'm pointing out too.
The fact that it is currently abandoned and there isn't anything mobile-related in the article that I just skimmed? Yeah, you need to improve on explaining things.
 Regardless, people are not going to use mobile in the 
 workplace.
If that's so, I suspect the "work place" will become irrelevant.
LOL, OK, now I know you're talking nonsense.
 Your crusade against Windows OS support is pointless.
I don't crusade against anything. I simply point out that wasting time on a dying platform is not the best use of D tool devs' time.
Newsflash: it's not "dying". The mobile market is NOT going to single-handedly replace the laptop and desktop market.
Oct 29
parent reply Joakim <dlang joakim.fea.st> writes:
On Sunday, 29 October 2017 at 22:48:56 UTC, 12345swordy wrote:
 On Sunday, 29 October 2017 at 22:36:04 UTC, Joakim wrote:
 On Sunday, 29 October 2017 at 22:29:01 UTC, 12345swordy wrote:
 On Sunday, 29 October 2017 at 22:22:23 UTC, Joakim wrote:
I suggest you read up on some computing history, start with
 WordStar:

 https://en.m.wikipedia.org/wiki/WordStar
I fail to see how WordStar is relevant.
Perhaps that's why you're missing every other thing I'm pointing out too.
The fact that it is currently abandoned and there isn't anything mobile-related in the article that I just skimmed? Yeah, you need to improve on explaining things.
I'd argue you need to improve on understanding things. Specifically, just as WordStar once dominated the market and is now dead, the same is happening to Word and Windows now.
 Regardless, people are not going to use mobile in the 
 workplace.
If that's so, I suspect the "work place" will become irrelevant.
LOL, OK, now I know you're talking nonsense.
 Your crusade against Windows OS support is pointless.
I don't crusade against anything. I simply point out that wasting time on a dying platform is not the best use of D tool devs' time.
Newsflash: it's not "dying". The mobile market is NOT going to single-handedly replace the laptop and desktop market.
It has not been fully replaced _yet_, but that is precisely what is about to happen.
Oct 29
next sibling parent reply evilrat <evilrat666 gmail.com> writes:
On Sunday, 29 October 2017 at 23:01:37 UTC, Joakim wrote:
 I'd argue you need to improve on understanding things.  
 Specifically, just as WordStar once dominated the market and is 
 now dead, the same is happening to Word and Windows now.

 It has not been fully replaced _yet_, but that is precisely 
 what is about to happen.
Oh, don't worry about that; MS already has an ARM device prototype running a native desktop Windows 10 version (with an x86 -> ARM translator).
Oct 29
parent reply Joakim <dlang joakim.fea.st> writes:
On Monday, 30 October 2017 at 01:51:33 UTC, evilrat wrote:
 On Sunday, 29 October 2017 at 23:01:37 UTC, Joakim wrote:
 I'd argue you need to improve on understanding things.  
 Specifically, just as WordStar once dominated the market and 
 is now dead, the same is happening to Word and Windows now.

 It has not been fully replaced _yet_, but that is precisely 
 what is about to happen.
Oh, don't worry about that; MS already has an ARM device prototype running a native desktop Windows 10 version (with an x86 -> ARM translator).
I know, I alluded to their Windows on ARM effort in the first post in the thread I linked. I suspect it will be far too late and won't matter, just like Windows Phone.

On Monday, 30 October 2017 at 02:39:44 UTC, 12345swordy wrote:
 On Sunday, 29 October 2017 at 23:01:37 UTC, Joakim wrote:

 It has not been fully replaced _yet_, but that is precisely 
 what is about to happen.
You've got to try harder than the "because I say so" routine.
That's funny, because you're the only one using that routine. I've linked to extensive data showing that Windows is already dying in the thread that I gave you, specifically my first and last posts in that thread. You seem not to be able to follow such links and data, as you not only don't acknowledge but seemingly flatly deny the ongoing PC sales decline. The collapse is next.

On Monday, 30 October 2017 at 03:50:43 UTC, codephantom wrote:
 btw. No mobile device will replace my desktop pc ...

 Like the pharaohs..I want access to my desktop pc in the after 
 life too..so it will be buried with me ;-)
Do you see anybody out there using a UNIX workstation? He's out there, that rare person who came up on it and, like you, says he will never replace it. :) Unless you become a relic like that guy, you too will likely simply switch to a mobile device someday, docked like the S8 in its DeX dock.

Maybe you will be one of the few running Microsoft's rumored Andromeda device, as opposed to a mass-market device like the S8: http://www.zdnet.com/article/microsofts-rumored-andromeda-foldable-mobile-device-what-might-be-inside/

More likely you will use something based on Android, iOS, Fuchsia, or some new mobile OS.
Oct 30
parent reply jmh530 <john.michael.hall gmail.com> writes:
On Monday, 30 October 2017 at 08:21:56 UTC, Joakim wrote:
 That's funny, because you're the only one using that routine.  
 I've linked to extensive data showing that Windows is already 
 dying in the thread that I gave you, specifically my first and 
 last posts in that thread.  You seem not to be able to follow 
 such links and data, as you not only don't acknowledge but 
 seemingly flatly deny the ongoing PC sales decline?  The 
 collapse is next.
I don't see why anyone would deny a PC sales decline, but it doesn't imply a collapse. I would expect more of a decline and then a stabilization. One reason is that the enterprise market is slow-moving and a big component of PC sales. I dispute the claim that the average white-collar worker won't be doing a bunch of work in Word/Excel/PowerPoint on a laptop or desktop ten years from now. Maybe twenty years from now, I don't know?

I'm not really sure what the point of all this is. I have no intention of doing data analysis on a smartphone any time in the next ten years. I don't see why anyone would. So my main use case for D is probably not going anywhere. And I get stuck using Windows at work because everything's slow-moving and there's no way I'm gonna be switching to Linux there. The whole "Windows is dying" thing is too far off to be relevant to things I need to actually accomplish.
Oct 30
parent reply Joakim <dlang joakim.fea.st> writes:
On Monday, 30 October 2017 at 12:30:12 UTC, jmh530 wrote:
 On Monday, 30 October 2017 at 08:21:56 UTC, Joakim wrote:
 That's funny, because you're the only one using that routine.  
 I've linked to extensive data showing that Windows is already 
 dying in the thread that I gave you, specifically my first and 
 last posts in that thread.  You seem not to be able to follow 
 such links and data, as you not only don't acknowledge but 
 seemingly flatly deny the ongoing PC sales decline?  The 
 collapse is next.
I don't see why anyone would deny a PC sales decline, but it doesn't imply a collapse. I would expect more of a decline and then a stabilization. One reason is that the enterprise market is slow-moving and a big component of PC sales. I dispute the claim that the average white-collar worker won't be doing a bunch of work in Word/Excel/PowerPoint on a laptop or desktop ten years from now. Maybe twenty years from now, I don't know?
The decline itself doesn't imply a collapse; the collapse is coming because the mobile market is looking for new growth avenues and releasing mobile accessories like Samsung's DeX dock or laptop replacements like the iPad Pro or this laptop shell: https://sentio.com

Mobile convergence killed off standalone mp3 players, e-readers, GPS devices, point-and-shoot cameras, feature phones, a whole host of former single-purpose mobile devices. They're going after the PC now, with all the massive scale of the mobile wave: https://twitter.com/lukew/status/842397687420923904

Can the PC market withstand that tidal wave? I'm betting not.

As for the average white-collar worker in a decade, if they're using Google Docs on their Samsung S18 connected to something like that Sentio laptop shell, do you really imagine they won't be able to get their work done? I think it's more likely they're using software completely different from Office or Docs to get their work done, as those suites are already way outdated by now, but that's a different tangent.
 I'm not really sure what the point of all this is. I have no 
 intention of doing data analysis on a smart phone any time in 
 the next ten years. I don't see why anyone would. So my main 
 use case for D is probably not going anywhere. And I get stuck 
 using Windows at work because everything's slow moving and 
 there's no way I'm gonna be switching to Linux there. The whole 
 Windows is dying is too far off to be relevant to things I need 
 to actually accomplish.
I don't know how intense your data analysis is, but I replaced a Win7 ultrabook that had a dual-core i5 and 4 GBs of RAM with an Android tablet that has a quad-core ARMv7 and 3 GBs of RAM as my daily driver a couple years ago, without skipping a beat. I built large mixed C++/D codebases on my ultrabook; now I do that on my Android/ARM tablet, which has a slightly weaker chip than my smartphone.

The latest ARM-based iPad Pro is notorious for beating low to mid-range Intel Macbooks on benchmarks. It is not difficult to pick up smartphones with 6-8 GBs of RAM nowadays. Unless you need monster machines for your data and aren't allowed to crunch your data on online servers for security reasons, a very niche case, you can very likely do it on a smartphone.

You may be right that your particular workplace moves slowly and they're not going mobile anytime soon. But this is such a big shift that you have to wonder if many such slow-moving workplaces will be able to compete with places that don't: just ask all the taxi companies phoning in rides to their drivers who got put out of business by Lyft, Uber, and their smartphone-wielding hordes of drivers.

There will always be a few Windows cockroaches that survive the mobile nuclear blast, but we're talking about the majority who won't.

As for you particularly, I can't speak to your situation without knowing more, but nobody's saying D should drop Windows support. I started off this OT thread by saying that investing more time in getting D somewhere close to the level of C#/C++ support in Visual Studio or some other IDE is a waste of time. I stand by that. If Rainer or someone else does it anyway, that's up to them how they want to spend their time.
Oct 30
next sibling parent reply jmh530 <john.michael.hall gmail.com> writes:
On Monday, 30 October 2017 at 13:32:23 UTC, Joakim wrote:
 The decline itself doesn't imply a collapse, the collapse is 
 coming because the mobile market is looking for new growth 
 avenues and releasing mobile accessories like Samsung's DeX 
 dock or laptop replacements like the iPad Pro or this laptop 
 shell:

 https://sentio.com
I look at this and just wonder why people wouldn't just have a laptop.
 Mobile convergence killed off standalone mp3 players, 
 e-readers, GPS devices, point-and-shoot cameras, feature 
 phones, a whole host of former mobile single-purpose devices.  
 They're going after the PC now, with all the massive scale of 
 the mobile wave:

 https://twitter.com/lukew/status/842397687420923904

 Can the PC market withstand that tidal wave?  I'm betting not.
And what does this show, a huge increase in smart phone/tablet shipments and a modest decline in desktop sales. I don't dispute this. Smart phones are leading to a huge increase in the amount of people who use computers on a daily basis. A whole bunch of people who use PCs may switch to just using smartphones/tablets. However, some people do need and want them. And they will continue to use them.
 As for the average white collar worker in a decade, if they're 
 using Google Docs on their Samsung S18 connected to something 
 like that Sentio laptop shell, do you really imagine they won't 
 be able to get their work done?  I think it's more likely 
 they're using software completely different than Office or Docs 
 to get their work done, as those suites are already way 
 outdated by now, but that's a different tangent.
Okay, but Google Docs isn't supported at my company. Microsoft Office is. We have a huge number of Excel files that use a lot of features that probably can't be ported over to Google Docs without a bunch of work. They might be able to get us to make new stuff with Google Docs, but we're still gonna need Excel for all the old stuff (so why bother switching). There's a reason why banks still use COBOL. Now I would love to move everything to R/Python. It could be done. But not everyone knows R/Python, whereas everyone knows Excel. If I get hit by a bus, then someone can figure out what I've done and get to work.
 I don't know how intense your data analysis is, but I replaced 
 a Win7 ultrabook that had a dual-core i5 and 4 GBs of RAM with 
 an Android tablet that has a quad-core ARMv7 and 3 GBs of RAM 
 as my daily driver a couple years ago, without skipping a beat.
  I built large mixed C++/D codebases on my ultrabook, now I do 
 that on my Android/ARM tablet, which has a slightly weaker chip 
 than my smartphone.
I would have gobbled up 4GB using MATLAB 5 years ago... Nowadays, I sometimes use a program called Stan. It does Hamiltonian Monte Carlo. Often it takes 10 minutes to run models on my home machine, which has a relatively new i7 processor. It's not unknown for the models to take hours with bigger data sets or more complicated models. I don't even like running the models at work because my work computer sucks compared to my home computer.
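For readers wondering why such models take minutes or hours, the cost comes from drawing huge numbers of samples. A toy, purely illustrative sketch (plain Monte Carlo estimation of pi in Python, not Stan's Hamiltonian variant) shows the shape of the workload: accuracy grows only with the number of samples, so the sampler's inner loop dominates runtime:

```python
import random

def monte_carlo_pi(n_samples: int, seed: int = 0) -> float:
    """Estimate pi by sampling points in the unit square and
    counting the fraction that land inside the quarter circle."""
    rng = random.Random(seed)
    inside = 0
    for _ in range(n_samples):
        x, y = rng.random(), rng.random()
        if x * x + y * y <= 1.0:
            inside += 1
    # Quarter-circle area is pi/4, so scale the hit fraction by 4.
    return 4.0 * inside / n_samples

print(monte_carlo_pi(100_000))  # close to pi; error shrinks like 1/sqrt(n)
```

The error shrinking only as 1/sqrt(n) is exactly why real samplers need millions of iterations, and why the hardware running the loop matters.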
 The latest ARM-based iPad Pro is notorious for beating low to 
 mid-range Intel Macbooks on benchmarks.  It is not difficult to 
 pick up smartphones with 6-8 GBs of RAM nowadays.  Unless you 
 need monster machines for your data and aren't allowed to 
 crunch your data on online servers for security reasons, a very 
 niche case, you can very likely do it on a smartphone.
Doing everything on an AWS instance would be nice.
 You may be right that your particular workplace moves slowly 
 and they're not going mobile anytime soon.  But this is such a 
 big shift that you have to wonder if many such slow-moving 
 workplaces will be able to compete with places that don't: just 
 ask all the taxi companies phoning in rides to their drivers 
 who got put out of business by Lyft, Uber, and their 
 smartphone-wielding hordes of drivers.
We certainly have big competitive issues, but they aren't because our competitors are using Google Docs.
 There will always be a few Windows cockroaches that survive the 
 mobile nuclear blast, but we're talking about the majority who 
 won't.

 As for you particularly, I can't speak to your situation 
 without knowing more, but nobody's saying D should drop Windows 
 support.  I started off this OT thread by saying that investing 
 more time in getting D somewhere close to the level of C#/C++ 
 support in Visual Studio or some other IDE is a waste of time.  
 I stand by that.  If Rainer or someone else does it anyway, 
 that's up to them how they want to spend their time.
Look at the growth of Python. Among the many drivers of that are people who use Numpy and its ecosystem (SciPy, Pandas, etc.). The work that Ilya et al are doing on Mir is a fantastic effort to provide similar functionality for D. More users using Mir will help build out the ecosystem and hopefully get it to a competitive place with Numpy one day. This requires more people using D. If efforts by Rainer or someone else make the Windows experience better and lead to more D users and more Mir users, then I consider that a positive. I don't consider it a waste of time.
Oct 30
parent reply Joakim <dlang joakim.fea.st> writes:
On Monday, 30 October 2017 at 15:46:56 UTC, jmh530 wrote:
 On Monday, 30 October 2017 at 13:32:23 UTC, Joakim wrote:
 The decline itself doesn't imply a collapse, the collapse is 
 coming because the mobile market is looking for new growth 
 avenues and releasing mobile accessories like Samsung's DeX 
 dock or laptop replacements like the iPad Pro or this laptop 
 shell:

 https://sentio.com
I look at this and just wonder why people wouldn't just have a laptop.
Expense is one major reason: just buy a laptop shell for $150 and connect it to the smartphone you already have. Another is that most new apps are developed for mobile nowadays, since the PC market is shrinking.
 Mobile convergence killed off standalone mp3 players, 
 e-readers, GPS devices, point-and-shoot cameras, feature 
 phones, a whole host of former mobile single-purpose devices.  
 They're going after the PC now, with all the massive scale of 
 the mobile wave:

 https://twitter.com/lukew/status/842397687420923904

 Can the PC market withstand that tidal wave?  I'm betting not.
And what does this show, a huge increase in smart phone/tablet shipments and a modest decline in desktop sales. I don't dispute this. Smart phones are leading to a huge increase in the amount of people who use computers on a daily basis. A whole bunch of people who use PCs may switch to just using smartphones/tablets. However, some people do need and want them. And they will continue to use them.
Yes, the question is how big the group that will stick with PCs will be: do you think it will be 5% or 50% of the peak 2011 sales of 350 million PCs by 2027? Right now, it's down to 75%, which I'd call more than "modest," and it keeps heading lower. For comparison, standalone, ie non-smartphone, camera sales are down 80% from their peak and keep plunging lower: https://petapixel.com/2017/03/03/latest-camera-sales-chart-reveals-death-compact-camera/

I don't see how PCs can avoid a similar fate.
 As for the average white collar worker in a decade, if they're 
 using Google Docs on their Samsung S18 connected to something 
 like that Sentio laptop shell, do you really imagine they 
 won't be able to get their work done?  I think it's more 
 likely they're using software completely different than Office 
 or Docs to get their work done, as those suites are already 
 way outdated by now, but that's a different tangent.
Okay, but Google Docs isn't supported at my company. Microsoft Office is. We have a huge number of Excel files that use a lot of features that probably can't be ported over to Google Docs without a bunch of work. They might be able to get us to make new stuff with Google Docs, but we're still gonna need Excel for all the old stuff (so why bother switching). There's a reason why banks still use Cobol.
Excel is available on Android and the Samsung S8 too, along with multiwindow use on the DeX desktop dock. Such legacy use will indeed keep some old tech alive, but just as most don't use COBOL anymore, most won't be using that old tech.
 Now I would love to move everything to R/Python. It could be 
 done. But not everyone knows R/Python, but everyone knows 
 Excel. If I get hit by a bus, then someone can figure out what 
 I've done and get to work.
Sure, and they will likely be able to use it with Excel for Android too. Btw, Python is available as a package in the Termux Android app that I use when programming on my tablet: https://play.google.com/store/apps/details?id=com.termux&hl=en

Some people are working to get R on there too.
 I don't know how intense your data analysis is, but I replaced 
 a Win7 ultrabook that had a dual-core i5 and 4 GBs of RAM with 
 an Android tablet that has a quad-core ARMv7 and 3 GBs of RAM 
 as my daily driver a couple years ago, without skipping a beat.
  I built large mixed C++/D codebases on my ultrabook, now I do 
 that on my Android/ARM tablet, which has a slightly weaker 
 chip than my smartphone.
I would have gobbled up 4GB using Matlab 5 years ago...Nowadays, I sometimes use a program called Stan. It does Hamiltonian Monte Carlo. Often it takes 10 minutes to run models on my home machine that's got a relatively new i7 processor on it. It's not unknown for the models to take hours with bigger data sets or more complicated models. I don't even like running the models at work because my work computer sucks compared to my home computer.
 The latest ARM-based iPad Pro is notorious for beating low to 
 mid-range Intel Macbooks on benchmarks.  It is not difficult 
 to pick up smartphones with 6-8 GBs of RAM nowadays.  Unless 
 you need monster machines for your data and aren't allowed to 
 crunch your data on online servers for security reasons, a 
 very niche case, you can very likely do it on a smartphone.
Doing everything on an AWS instance would be nice.
All the low-hanging fruit is being gobbled up by mobile, and most of the heavy compute by cloud servers. That leaves a narrow niche in between for beefy desktops, since most PCs sold are laptops. Perhaps you are in that desktop niche, but I contend it isn't very big.
 You may be right that your particular workplace moves slowly 
 and they're not going mobile anytime soon.  But this is such a 
 big shift that you have to wonder if many such slow-moving 
 workplaces will be able to compete with places that don't: 
 just ask all the taxi companies phoning in rides to their 
 drivers who got put out of business by Lyft, Uber, and their 
 smartphone-wielding hordes of drivers.
We certainly have big competitive issues, but they aren't because our competitors are using Google Docs.
No, it happens when they streamline and automate their entire workflow much more, to the point where they aren't using antiquated document systems anymore: http://ben-evans.com/benedictevans/2015/5/21/office-messaging-and-verbs

I've never written a single document in the entire time I've contributed to the D open source project. That's because we replace that ancient document workflow with forums, email, gitter, bugzilla, git, and github, some of which is also fairly old tech, but not nearly so old as typing up a bunch of documents or spreadsheets.

Of course, the D OSS project isn't a business, but the point is made in that linked post: most businesses are also about to transition away from that doc workflow altogether, having simply replaced a bunch of printed documents and balance sheets with digital versions of the _same_ documents over the last couple decades. It's time for them to make the true digital transition, or they will lose out to those who did and became more efficient for it.

Lyft and Uber are merely two public examples of the leading edge of this wave.
 There will always be a few Windows cockroaches that survive 
 the mobile nuclear blast, but we're talking about the majority 
 who won't.

 As for you particularly, I can't speak to your situation 
 without knowing more, but nobody's saying D should drop 
 Windows support.  I started off this OT thread by saying that 
 investing more time in getting D somewhere close to the level 
 of C#/C++ support in Visual Studio or some other IDE is a 
 waste of time.  I stand by that.  If Rainer or someone else 
 does it anyway, that's up to them how they want to spend their 
 time.
Look at the growth of Python. Among the many drivers of that are people who use Numpy and its ecosystem (SciPy, Pandas, etc.). The work that Ilya et al are doing on Mir is a fantastic effort to provide similar functionality for D. More users using Mir will help build out the ecosystem and hopefully get it to a competitive place with Numpy one day. This requires more people using D. If efforts by Rainer or someone else make the Windows experience better and lead to more D users and more Mir users, then I consider that a positive. I don't consider it a waste of time.
Do those Python/Numpy users have the level of VS or other Windows IDE support that D currently doesn't? Either way, math modeling is such a small niche that I'm not sure it makes a difference, though I'm glad Ilya and others are pushing D in that direction on all OS's.
Oct 30
next sibling parent reply jmh530 <john.michael.hall gmail.com> writes:
On Monday, 30 October 2017 at 16:50:42 UTC, Joakim wrote:
 [snip]

 No, it happens when they streamline and automate their entire 
 workflow much more, to the point where they aren't using 
 antiquated document systems anymore:

 http://ben-evans.com/benedictevans/2015/5/21/office-messaging-and-verbs

 I've never written a single document in the entire time I've 
 contributed to the D open source project.  That's because we 
 replace that ancient document workflow with forums, email, 
 gitter, bugzilla, git, and github, some of which is also fairly 
 old tech, but not nearly so as typing up a bunch of documents 
 or spreadsheets.

 Of course, the D OSS project isn't a business, but the point is 
 made in that linked post: most businesses are also about to 
 transition away from that doc workflow altogether, where they 
 simply replaced a bunch of printed documents and balance sheets 
 with digital versions of the _same_ documents over the last 
 couple decades.  It's time for them to make the true digital 
 transition, or they will lose out to those who did and became 
 more efficient for it.

 Lyft and Uber are merely two public examples of the leading 
 edge of this wave.
You're making a broader point about Lyft and Uber that I agree with. Automating certain things and providing a digital platform has been very successful for them. But taxicab companies switching from Excel to Google Docs wouldn't have solved anything for them. Taxicab companies in London and other places have found better ways to adapt (aside from increased regulation) by offering their own apps to compete. Similarly, the investment management industry (my industry) has seen a large increase in the share of passive management over the past 10 years (and a corresponding decline in the share of active management). Switching from Excel to Google Docs is irrelevant. There are broader competitive forces at work. Now, these competitive forces have been shaped by computer-driven investing and a reduction in costs. So in this sense, your broader point has validity, but perhaps the way you were expressing it with regard to Office vs. Google Docs was not convincing.
 Do those Python/Numpy users have the level of VS or other 
 Windows IDE support that D currently doesn't?
You don't need VS with Python/Numpy, but Python has a large number of IDEs available. I haven't used them, but they are there. The only thing I ever used was IPython notebooks, which became Jupyter.
Oct 30
parent Joakim <dlang joakim.fea.st> writes:
On Monday, 30 October 2017 at 17:35:51 UTC, jmh530 wrote:
 On Monday, 30 October 2017 at 16:50:42 UTC, Joakim wrote:
 [snip]

 No, it happens when they streamline and automate their entire 
 workflow much more, to the point where they aren't using 
 antiquated document systems anymore:

 http://ben-evans.com/benedictevans/2015/5/21/office-messaging-and-verbs

 I've never written a single document in the entire time I've 
 contributed to the D open source project.  That's because we 
 replace that ancient document workflow with forums, email, 
 gitter, bugzilla, git, and github, some of which is also 
 fairly old tech, but not nearly so as typing up a bunch of 
 documents or spreadsheets.

 Of course, the D OSS project isn't a business, but the point 
 is made in that linked post: most businesses are also about to 
 transition away from that doc workflow altogether, where they 
 simply replaced a bunch of printed documents and balance 
 sheets with digital versions of the _same_ documents over the 
 last couple decades.  It's time for them to make the true 
 digital transition, or they will lose out to those who did and 
 became more efficient for it.

 Lyft and Uber are merely two public examples of the leading 
 edge of this wave.
You're making a broader point about Lyft and Uber that I agree with. Automating certain things and providing a digital platform has been very successful for them. But taxicab companies switching from Excel to Google Docs wouldn't have solved anything for them. Taxicab companies in London and other places have found better ways to adapt (aside from increased regulation) by offering their own apps to compete. Similarly, the investment management industry (my industry) has seen a large increase in the share of passive management over the past 10 years (and a corresponding decline in the share of active management). Switching from Excel to Google Docs is irrelevant. There are broader competitive forces at work. Now, these competitive forces have been shaped by computer-driven investing and a reduction in costs. So in this sense, your broader point has validity, but perhaps the way you were expressing it with regard to Office vs. Google Docs was not convincing.
That's because I never made that Office/Docs comparison in the first place; I merely gave an example of someone plausibly replacing their current Windows/Excel workflow with Android/Docs in a decade. The operative comparison there is mobile Android versus desktop/laptop Windows; Docs doesn't even matter, as Excel also runs on mobile. I was talking about the mobile shift being so big that it takes out a host of Windows PC-driven shops. I also tangentially mentioned that I don't think people will be using Office _or_ Docs in a decade, which is the bigger shift you seemed to want to explore, so I expanded on it. Lyft and Uber are particularly apposite because they've ridden both shifts to quick success.
 Do those Python/Numpy users have the level of VS or other 
 Windows IDE support that D currently doesn't?
You don't need VS with Python/Numpy, but Python has a large number of IDEs available. I haven't used them, but they are there. The only thing I ever used was IPython notebooks, which became Jupyter.
Never used Jupyter but I see that it's a webapp, so it should work fine on mobile, or as a frontend for a cloud instance.
Oct 30
prev sibling parent Paulo Pinto <pjmlp progtools.org> writes:
On Monday, 30 October 2017 at 16:50:42 UTC, Joakim wrote:
 [...]
 Do those Python/Numpy users have the level of VS or other 
 Windows IDE support that D currently doesn't?  Either way, math 
 modeling is such a small niche that I'm not sure it makes a 
 difference, though I'm glad Ilya and others are pushing D in 
 that direction on all OS's.
Microsoft provides first class support for Python and R on Visual Studio. https://www.visualstudio.com/vs/python/ https://www.visualstudio.com/vs/rtvs/
Oct 30
prev sibling next sibling parent reply Mengu <mengukagan gmail.com> writes:
On Monday, 30 October 2017 at 13:32:23 UTC, Joakim wrote:
 I don't know how intense your data analysis is, but I replaced 
 a Win7 ultrabook that had a dual-core i5 and 4 GBs of RAM with 
 an Android tablet that has a quad-core ARMv7 and 3 GBs of RAM 
 as my daily driver a couple years ago, without skipping a beat.
  I built large mixed C++/D codebases on my ultrabook, now I do 
 that on my Android/ARM tablet, which has a slightly weaker chip 
 than my smartphone.
how do you program on your tablet? what are your tools? what is your setup? i also believe laptops are on their way out.
Oct 31
parent reply Joakim <dlang joakim.fea.st> writes:
On Wednesday, 1 November 2017 at 00:16:19 UTC, Mengu wrote:
 On Monday, 30 October 2017 at 13:32:23 UTC, Joakim wrote:
 I don't know how intense your data analysis is, but I replaced 
 a Win7 ultrabook that had a dual-core i5 and 4 GBs of RAM with 
 an Android tablet that has a quad-core ARMv7 and 3 GBs of RAM 
 as my daily driver a couple years ago, without skipping a beat.
  I built large mixed C++/D codebases on my ultrabook, now I do 
 that on my Android/ARM tablet, which has a slightly weaker 
 chip than my smartphone.
how do you program on your tablet? what are your tools? what is your setup? i also believe laptops are on their way out.
I use the Termux app that I mentioned before, along with a Rapoo bluetooth keyboard and a cheap, foldable stand to prop up my tablet: https://play.google.com/store/apps/details?id=com.termux&hl=en `apt install clang ldc vim git gdb cmake ninja python` in Termux and I'm ready to go (well, not quite, as I also need some library packages depending on the project, but you get the idea). You can also install Termux on a Chromebook laptop that runs Android apps: https://mobile.twitter.com/rmloveland/status/908529214357458946 https://mobile.twitter.com/termux It's far from an IDE, but I never used those before anyway. I want to try out something like that Sentio laptop shell one day, as the bigger 11.6" screen does make sense for me. So far, I've been fine with my 8.4" tablet screen though. On Wednesday, 1 November 2017 at 00:30:21 UTC, Tony wrote:
 On Monday, 30 October 2017 at 13:32:23 UTC, Joakim wrote:

 There will always be a few Windows cockroaches that survive 
 the mobile nuclear blast, but we're talking about the majority 
 who won't.
Why do predictions about the future matter when at the present Windows dominates the desktop and is also strong in the server space?
Because that desktop market matters much less than it did before (see the current mobile dominance), yet the D core team still focuses only on that dying x86 market. As for the future, why spend time getting D great Windows IDE support if you don't think Windows has much of a future?
 I have seen conflicting reports about what OS is bigger in the 
 server market, but Windows is substantial and the more frequent 
 winner.

 https://community.spiceworks.com/networking/articles/2462-server-virtualization-and-os-trends

 https://www.1and1.com/digitalguide/server/know-how/linux-vs-windows-the-big-server-check/
I have never seen any report that Windows is "bigger in the server market." Last month's Netcraft survey notes, "which underlying operating systems are used by the world's web facing computers? By far the most commonly used operating system is Linux, which runs on more than two-thirds of all web-facing computers. This month alone, the number of Linux computers increased by more than 91,000; and again, this strong growth can largely be attributed to cloud hosting providers, where Linux-based instances are typically the cheapest and most commonly available." https://news.netcraft.com/archives/2017/09/11/september-2017-web-server-survey.html Your first link is actually a bad sign for Windows, as it's likely just because companies are trying to save money by having their employees run Windows apps off a virtualized Windows Server, rather than buying a ton more Windows PCs. Meanwhile, your second link sees "Linux maintaining a noticeable lead" in the web-hosting market.
 And if desktop OSes were going to go away, the MacOS would go 
 before Windows.
Oh, Apple wants that to happen, one less legacy OS to support, which is why all the Mac-heads are crying, because macOS doesn't get much attention nowadays. Do you know the last time Apple released a standalone desktop computer? 2014, when they last updated the Mac Mini. They haven't updated the Mac Pro since 2013. They see the writing on the wall, which is why they're lengthening their release cycles for such legacy products. On Wednesday, 1 November 2017 at 01:59:19 UTC, codephantom wrote:
 On Monday, 30 October 2017 at 13:32:23 UTC, Joakim wrote:
 I don't know how intense your data analysis is, but I replaced 
 a Win7 ultrabook that had a dual-core i5 and 4 GBs of RAM with 
 an Android tablet that has a quad-core ARMv7 and 3 GBs of RAM 
 as my daily driver a couple years ago, without skipping a beat.
  I built large mixed C++/D codebases on my ultrabook, now I do 
 that on my Android/ARM tablet, which has a slightly weaker 
 chip than my smartphone.
hahhaa hahhaa... I can't stop laughing...hahaaa hahaaaaa. 3GB of ram, 4GB of ram..hahhaa..hahhha.... I'm starting to feel ill too...hahha...hahha..... ok. I'm back...to normal now... Can your tablet run FreeBSD as host, and run multiple vm's at the same time too? Can you put multiple SSD RAID into your tablet? Can you upgrade its ram to 32GB? Can you upgrade its video card to 6GB? Can you overclock its cpu to 4GHz? Can you even replace its cpu? Desktops rule!!!! Tablets are only good for reading pdf's while in bed ;-)
You're right, tablets can't do most of those things, though if you're fine just running FreeBSD in Qemu, that's coming: https://github.com/termux/termux-packages/pull/1329 If your point is that the 1% of PC users who do such things will stick with PCs and the remaining 99% will switch to mobile, I agree with you. :)
Nov 01
next sibling parent Kagamin <spam here.lot> writes:
On Wednesday, 1 November 2017 at 08:49:05 UTC, Joakim wrote:
 If your point is that the 1% of PC users who do such things 
 will stick with PCs and the remaining 99% will switch to 
 mobile, I agree with you. :)
You leave only 0% for DeX :)
Nov 01
prev sibling parent Tony <tonytdominguez aol.com> writes:
On Wednesday, 1 November 2017 at 08:49:05 UTC, Joakim wrote:
 On Wednesday, 1 November 2017 at 00:16:19 UTC, Mengu wrote:
 On Monday, 30 October 2017 at 13:32:23 UTC, Joakim wrote:
 I don't know how intense your data analysis is, but I 
 replaced a Win7 ultrabook that had a dual-core i5 and 4 GBs 
 of RAM with an Android tablet that has a quad-core ARMv7 and 
 3 GBs of RAM as my daily driver a couple years ago, without 
 skipping a beat.
  I built large mixed C++/D codebases on my ultrabook, now I 
 do that on my Android/ARM tablet, which has a slightly weaker 
 chip than my smartphone.
How does the performance compare between an i5 laptop and an Android tablet?
 Why do predictions about the future matter when at the present 
 Windows dominates the desktop and is also strong in the server 
 space?
Because that desktop market matters much less than it did before (see the current mobile dominance), yet the D core team still focuses only on that dying x86 market. As for the future, why spend time getting D great Windows IDE support if you don't think Windows has much of a future?
The concept that you are proposing, that people will get rid of ALL their desktops and laptops for phones or tablets, doesn't seem to be happening right now. At this point, were they to do that, they would end up with a machine that has less power in most cases (there are Atom and Celeron laptops), and probably less memory and disk storage. That solution would be most attractive to Chromebook-type users and very low-end laptop users. And while people buy low-spec laptops and desktops, there are still many laptops and desktops sold with chips that aren't named Atom and Celeron or ARM. If phones and tablets try to get chips as powerful as those for desktops and laptops, they run into the chip maker's problem - the more processing power, the more electricity the chip uses. Phones and tablets don't plug into the wall, and their batteries are smaller than those in laptops. And in order to use a phone/tablet as a "lean forward" device (as opposed to "lean back") and do work, people will have to spend money on a "laptop shell" with a screen, keyboard, and probably an SSD/HD, which will cancel most of the cost savings from not buying a laptop. In the case of trying to court Android development, I read that 95% of Android development is done in Java (and maybe other JVM languages like the now "officially supported" Kotlin) and 5% in C or C++. But that 5% is for applications that have a need for high performance, which is mostly games. Good luck selling game developers on using D to develop for Android, when you can't supply those same game developers a top-notch development environment for the premier platform for performance-critical games - Windows 64-bit.
 I have seen conflicting reports about what OS is bigger in the 
 server market, but Windows is substantial and the more 
 frequent winner.

 https://community.spiceworks.com/networking/articles/2462-server-virtualization-and-os-trends

 https://www.1and1.com/digitalguide/server/know-how/linux-vs-windows-the-big-server-check/
I have never seen any report that Windows is "bigger in the server market."
I linked one that said: "And what OSes are running in virtual machines and on physical servers around the world? It turns out like with client OSes, Microsoft is dominant. Fully 87.7% of the physical servers and VMs in the Spiceworks network (which are mostly on-premises) run Microsoft Windows Server."
 Last month's Netcraft survey notes,

 "which underlying operating systems are used by the world's web 
 facing computers?

 By far the most commonly used operating system is Linux, which 
 runs on more than two-thirds of all web-facing computers. This 
 month alone, the number of Linux computers increased by more 
 than 91,000; and again, this strong growth can largely be 
 attributed to cloud hosting providers, where Linux-based 
 instances are typically the cheapest and most commonly 
 available."
 https://news.netcraft.com/archives/2017/09/11/september-2017-web-server-survey.html
Web-facing servers are a subset of all servers. Shared web hosting services are probably a harder target for native-code applications than internal IT servers. But regardless of whether Windows is dominant, or just widely used, you haven't made predictions that Windows servers are going to die.
 Your first link is actually a bad sign for Windows, as it's 
 likely just because companies are trying to save money by 
 having their employees run Windows apps off a virtualized 
 Windows Server, rather than buying a ton more Windows PCs.
I would say that is an unlikely scenario. Companies use virtual machines for servers because it allows the email server and/or http server and/or database server and/or application server to be on one physical machine, and allows the system administrator to reboot the OS or take a server offline when making an upgrade/bug fix without affecting the applications running on the other servers.
 Meanwhile, your second link sees "Linux maintaining a 
 noticeable lead" in the web-hosting market.
Don't know why I linked that as it doesn't even have a percentage breakdown. My intent was to show a web server breakdown but I will concede that Linux is bigger for web servers. However, Windows is still big and you aren't predicting it will die.
 And if desktop OSes were going to go away, the MacOS would go 
 before Windows.
Oh, Apple wants that to happen, one less legacy OS to support, which is why all the Mac-heads are crying, because macOS doesn't get much attention nowadays. Do you know the last time Apple released a standalone desktop computer? 2014, when they last updated the Mac Mini. They haven't updated the Mac Pro since 2013.
Why do you think it is that they haven't come out with an iOS Mac Mini or iOS MacBook?
 They see the writing on the wall, which is why they're 
 lengthening their release cycles for such legacy products.
Do they want them to go away, or do they see the handwriting on the wall? Given that they still make them, it appears that they don't want them to go away. They can stop making them at any time. And by them, I mean their entire macOS (i.e. their non-mobile) line. I think that the Mac Mini/Mac Pro pale in sales to the iMacs as far as Apple desktop sales go. If you look at the graph in this article, the iPad has declined more as a percentage of Apple revenue than the macOS line has in the last five years. https://www.statista.com/statistics/382260/segments-share-revenue-of-apple/ There is a case to be made for supporting Android/iOS cross-compilation. But it doesn't have to come at the expense of Windows 64-bit integration. I'm not sure they even involve the same skillsets. Embarcadero and RemObjects both now support Android/iOS development from their Windows (and macOS, in the case of RemObjects) IDEs.
Nov 02
prev sibling next sibling parent Tony <tonytdominguez aol.com> writes:
On Monday, 30 October 2017 at 13:32:23 UTC, Joakim wrote:

 There will always be a few Windows cockroaches that survive the 
 mobile nuclear blast, but we're talking about the majority who 
 won't.
Why do predictions about the future matter when at the present Windows dominates the desktop and is also strong in the server space? I have seen conflicting reports about what OS is bigger in the server market, but Windows is substantial and the more frequent winner. https://community.spiceworks.com/networking/articles/2462-server-virtualization-and-os-trends https://www.1and1.com/digitalguide/server/know-how/linux-vs-windows-the-big-server-check/ And if desktop OSes were going to go away, the MacOS would go before Windows.
Oct 31
prev sibling parent codephantom <me noyb.com> writes:
On Monday, 30 October 2017 at 13:32:23 UTC, Joakim wrote:
 I don't know how intense your data analysis is, but I replaced 
 a Win7 ultrabook that had a dual-core i5 and 4 GBs of RAM with 
 an Android tablet that has a quad-core ARMv7 and 3 GBs of RAM 
 as my daily driver a couple years ago, without skipping a beat.
  I built large mixed C++/D codebases on my ultrabook, now I do 
 that on my Android/ARM tablet, which has a slightly weaker chip 
 than my smartphone.
hahhaa hahhaa... I can't stop laughing...hahaaa hahaaaaa. 3GB of ram, 4GB of ram..hahhaa..hahhha.... I'm starting to feel ill too...hahha...hahha..... ok. I'm back...to normal now... Can your tablet run FreeBSD as host, and run multiple vm's at the same time too? Can you put multiple SSD RAID into your tablet? Can you upgrade its ram to 32GB? Can you upgrade its video card to 6GB? Can you overclock its cpu to 4GHz? Can you even replace its cpu? Desktops rule!!!! Tablets are only good for reading pdf's while in bed ;-)
Oct 31
prev sibling parent 12345swordy <alexanderheistermann gmail.com> writes:
On Sunday, 29 October 2017 at 23:01:37 UTC, Joakim wrote:

 It has not been fully replaced _yet_, but that is precisely 
 what is about to happen.
You've got to try harder than the "because I say so" routine.
Oct 29
prev sibling parent reply Shachar Shemesh <shachar weka.io> writes:
On 29/10/17 23:21, Joakim wrote:
 On Sunday, 29 October 2017 at 20:58:45 UTC, 12345swordy wrote:
 On Sunday, 29 October 2017 at 18:52:06 UTC, Joakim wrote:
 [...]
What makes you think that windows is a "dying platform"!? There is no evidence to suggest this.
Take a look at the links in the thread I linked you, which show PC sales dropping for the last six years and back at the level of a decade ago.
Yes, Windows is dying, and has been for a long long time now. And I'll add one or two "good riddance" while we're at it. The point to remember, however, is that it still has a long long time to go before it completely dies. Windows has been somewhat marginalized as a development platform in recent years, but it will be a long time still before it becomes irrelevant for users. I'm not sure it makes sense for D to ignore this platform, despite its bleak future. Shachar
Nov 01
parent reply Joakim <dlang joakim.fea.st> writes:
On Wednesday, 1 November 2017 at 09:14:22 UTC, Shachar Shemesh 
wrote:
 On 29/10/17 23:21, Joakim wrote:
 On Sunday, 29 October 2017 at 20:58:45 UTC, 12345swordy wrote:
 On Sunday, 29 October 2017 at 18:52:06 UTC, Joakim wrote:
 [...]
What makes you think that windows is a "dying platform"!? There is no evidence to suggest this.
Take a look at the links in the thread I linked you, which show PC sales dropping for the last six years and back at the level of a decade ago.
Yes, Windows is dying, and has been for a long long time now. And I'll add one or two "good riddance" while we're at it. The point to remember, however, is that it still has a long long time to go before it completely dies. Windows has been somewhat marginalized as a development platform in recent years, but it will be a long time still before it becomes irrelevant for users. I'm not sure it makes sense for D to ignore this platform, despite its bleak future. Shachar
I don't propose ignoring it, but I suggest not to invest too much more into it, like all the work it would take to get VS or other Windows IDE support up to the level where Windows devs seem to want.
Nov 01
next sibling parent jmh530 <john.michael.hall gmail.com> writes:
On Wednesday, 1 November 2017 at 09:24:57 UTC, Joakim wrote:
 I don't propose ignoring it, but I suggest not to invest too 
 much more into it, like all the work it would take to get VS or 
 other Windows IDE support up to the level where Windows devs 
 seem to want.
I'm not sure how much work that would take, TBH, but I think that this is perhaps a better way to phrase your concerns. Saying that people working on Windows IDEs are wasting their time is probably not going to convince them of your point of view. Rather the argument that investing X hours in improving IDE would be better spent investing it in Y other D project is a much less inflammatory point. Granted, I think people could quibble with this, but it wouldn't drive me nuts to hear it.
Nov 01
prev sibling parent reply Bo <Bo bolang.com> writes:
For a dying platform as so many advocate here, it seems to be 
doing fairly well.

Maybe i am too old, but the whole dying-platform gig has been 
going on all the way back to Windows ME and Vista and 8 and ...

The reality is, for any user that wants to be productive, 
Windows is hard to beat. The only thing that comes close is 
Apple's extremely hardware-restrictive OS X.

Just reading the comments, I do think that people here have a 
massive anti-Microsoft bias.

Mobile will overtake PC for productivity? No ... simply no.
Windows is dying? Hardly...

Has the market changed because some users can use tablets, as 
they are not hardcore users but only want to browse and email? 
Yes... There has been a shift there.

But will Windows be phased out on the corporate floor? No ... 
Will Windows be removed as a gaming platform and replaced by 
Linux / OS X? No ...

While Linux and OS X can be used very well, both platforms have 
too many issues: OS X is hardware-limited by design, as that 
makes testing easier for Apple, and Linux is badly fragmented 
on the desktop.

At times people may want to appreciate how robust Windows is. 
It's easily as stable as Linux, but with support for almost 
every piece of hardware.

With the inclusion of WSL (guess what i use D on, because, 
well, i do not want to install VS!), it combines both worlds.


Maybe the reason some people are being so annoying and frankly 
rude is that their own bias is getting in the way of the 
message. Just because a person wants to write D code does not 
mean they want a multi-GIGABYTE installation just so they can 
compile 64bit programs.

Same with the comments that come down to "i do not see a reason 
why you want 64bit on Windows": that is not a good excuse.

On Wednesday, 1 November 2017 at 09:24:57 UTC, Joakim wrote:
 I don't propose ignoring it, but I suggest not to invest too 
 much more into it, like all the work it would take to get VS or 
 other Windows IDE support up to the level where Windows devs 
 seem to want.
It just shows pure vileness toward Windows users: "We do not care to fix issues on your platform; use our platform, or install VS and have it bit-rot on your hard drive, for no reason beyond that we simply do not want to support Windows on D." No wonder some people think that Windows is a second-tier citizen in the D community. It sure as hell does not feel very welcoming reading this thread. When a person has an issue, the responses seem to be very aggressive, attacking that person and the platform but ignoring the actual issue. How many people posted here claiming that he wanted to have 64bit removed, when that is NOT what he wrote? There is an issue with Windows. The whole attacking of the messenger, the whole idiotic argument that Windows is dying: it is all pure, useless trolling of the people who ask a simple question: how to solve the D 64bit issue so that, like on the Linux or OS X platform, users can have the SAME level of consistency. It is so strange that Go solved 64bit Windows a long time ago. So did C. And C++ ... and so many other compilers that do NOT need VS installed to produce 64bit binaries on the Windows platform. So in other words, all these comments about just installing VS are pure bullshit. If you do not want to answer the question, then do not troll people. And frankly, Walter or whoever, a stop should have been put to this anti-Windows bullshit several days ago, given the level of disrespect people are showing towards community members for not using the "right" platform. /Signed: A pissed off Windows user
Nov 01
next sibling parent reply Adam D. Ruppe <destructionator gmail.com> writes:
On Wednesday, 1 November 2017 at 18:42:07 UTC, Bo wrote:
 There is an issue with Windows. The whole attacking of the 
 messenger, the whole idiotic argument that Windows is dying: it 
 is all pure, useless trolling of the people who ask a simple 
 question: how to solve the D 64bit issue so that, like on the 
 Linux or OS X platform, users can have the SAME level 
 of consistency.
Windows 32 bit is the special one - it is the ONLY platform where D works out of the box without additional downloads. That's one reason why I advocate it for just playing around - it just works. On ALL other platforms for dmd: Win64, Linux 32/64, Mac, freebsd, you require additional downloads from the OS vendor to build your program. The only difference is size of the download from the OS vendor, and odds that it was already installed by something else. But it is the same idea - you use the OS vendor's linker and system libs to facilitate interoperability with other language code.
Nov 01
next sibling parent jmh530 <john.michael.hall gmail.com> writes:
On Wednesday, 1 November 2017 at 18:59:21 UTC, Adam D. Ruppe 
wrote:
 Windows 32 bit is the special one - it is the ONLY platform 
 where D works out of the box without additional downloads. 
 That's one reason why I advocate it for just playing around - 
 it just works.

 On ALL other platforms for dmd: Win64, Linux 32/64, Mac, 
 freebsd, you require additional downloads from the OS vendor to 
 build your program.

 The only difference is size of the download from the OS vendor, 
 and odds that it was already installed by something else. But 
 it is the same idea - you use the OS vendor's linker and system 
 libs to facilitate interoperability with other language code.
DMD with the default -m32 is great for just playing around. No Windows D user would deny it. I compile with that far more than with -m64 (or any LDC, for that matter). It's only really when you need -m64 (or -m32mscoff, for that matter), and I mean really need it, that one needs to bother with Visual Studio. Perhaps that's part of the frustration. Things are so easy with the default and so frustrating when you have to make the change. For me, that point comes when trying to call C libraries in D. Downloading Visual Studio and installing it is a one-time cost, but getting C libraries working with D is something I don't do often enough to remember the tricks. If the C library is compiled with Visual Studio, then you have to use the VS linker. But usually it will be something built with MinGW or Cygwin. From there, there is some series of steps that I would never remember and always need to google to get working. That's the frustrating part. Installing Visual Studio is annoying, but not really a huge deal for me in the grand scheme of things.
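For reference, the switch-to-toolchain pairing behind all this friction can be sketched in a tiny shell helper. This is just a sketch based on how dmd's documented Windows switches behave (the default -m32 emits OMF objects linked with the bundled OPTLINK, while -m32mscoff and -m64 emit COFF objects linked with the Microsoft linker from Visual Studio); the `dmd_link_model` function name is made up for illustration:

```shell
# Sketch: map a dmd Windows switch to the object format and linker it implies.
# (Assumes dmd's documented behavior; dmd_link_model is a made-up helper.)
dmd_link_model() {
    case "$1" in
        -m32)       echo "OMF objects, bundled OPTLINK" ;;
        -m32mscoff) echo "32-bit COFF objects, MS linker (needs VS)" ;;
        -m64)       echo "64-bit COFF objects, MS linker (needs VS)" ;;
        *)          echo "unknown switch" ;;
    esac
}

dmd_link_model -m32mscoff
```

So a C library shipped as COFF (e.g. built with Visual Studio) can only be linked into a D program compiled with -m32mscoff or -m64, never with the default OMF toolchain, which is why the flag change is forced on you the moment C libraries enter the picture.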
Nov 01
prev sibling next sibling parent reply bauss <jj_1337 live.dk> writes:
On Wednesday, 1 November 2017 at 18:59:21 UTC, Adam D. Ruppe 
wrote:
 On Wednesday, 1 November 2017 at 18:42:07 UTC, Bo wrote:
 There is an issue with Windows. The whole attacking of the 
 messenger, the whole idiotic argument that Windows is dying: it 
 is all pure, useless trolling of the people who ask a simple 
 question: how to solve the D 64bit issue so that, like on the 
 Linux or OS X platform, users can have the SAME 
 level of consistency.
Windows 32 bit is the special one - it is the ONLY platform where D works out of the box without additional downloads. That's one reason why I advocate it for just playing around - it just works.
Yes, it works when toying around, but as soon as you want to write actual software you can't use 32 bit anymore, because OPTLINK is just too buggy and will end up unable to link your code correctly. A good example is that mysql-native currently doesn't link properly with OPTLINK. It might link for some, but at least for me, I'm forced to either use an older compiler or compile to 64 bit. See: https://github.com/mysql-d/mysql-native/issues/100 There are also reported issues like this one: https://issues.dlang.org/show_bug.cgi?id=15183 I'm aware that issues like these should be reported more often and as soon as they're discovered, but they're also hard to report, because you get virtually no information about what's wrong and can only guess by commenting out sections of your code until it links. That's not ideal. I'm sure many other similar issues exist. Yes, 32 bit development with D is easy on Windows, but only for toying around, which is no reason to defend it.
Nov 07
next sibling parent reply Jerry <hurricane hereiam.com> writes:
On Tuesday, 7 November 2017 at 19:10:50 UTC, bauss wrote:
 On Wednesday, 1 November 2017 at 18:59:21 UTC, Adam D. Ruppe 
 wrote:
 On Wednesday, 1 November 2017 at 18:42:07 UTC, Bo wrote:
 There is an issue with Windows. The whole attacking of the 
 messenger, the whole idiotic argument that Windows is dying: it 
 is all pure, useless trolling of the people who ask a simple 
 question: how to solve the D 64bit issue so that, like on the 
 Linux or OS X platform, users can have the SAME 
 level of consistency.
Windows 32 bit is the special one - it is the ONLY platform where D works out of the box without additional downloads. That's one reason why I advocate it for just playing around - it just works.
Yes, it works when toying around, but as soon as you want to write actual software you can't use 32 bit anymore, because OPTLINK is just too buggy and will end up unable to link your code correctly. A good example is that mysql-native currently doesn't link properly with OPTLINK. It might link for some, but at least for me, I'm forced to either use an older compiler or compile to 64 bit. See: https://github.com/mysql-d/mysql-native/issues/100 There are also reported issues like this one: https://issues.dlang.org/show_bug.cgi?id=15183 I'm aware that issues like these should be reported more often and as soon as they're discovered, but they're also hard to report, because you get virtually no information about what's wrong and can only guess by commenting out sections of your code until it links. That's not ideal. I'm sure many other similar issues exist. Yes, 32 bit development with D is easy on Windows, but only for toying around, which is no reason to defend it.
You can use -m32mscoff for 32-bit, which uses the Visual Studio linker like the 64-bit version. I've been saying OPTLINK should be removed. Even if you report a bug for OPTLINK, it's never going to get fixed. No one's stupid enough to go digging through that spaghetti code dump. If you're lucky, some limitation might be introduced to DMD that won't trigger the bug in OPTLINK. That's why it shouldn't be supported anymore; it's hindering DMD, not making it better. It's amazing how many people are too lazy to download Visual Studio, and with some of the stupidest reasons for not wanting to download it, to boot.
Nov 07
next sibling parent reply bauss <jj_1337 live.dk> writes:
On Tuesday, 7 November 2017 at 20:44:57 UTC, Jerry wrote:
 On Tuesday, 7 November 2017 at 19:10:50 UTC, bauss wrote:
 On Wednesday, 1 November 2017 at 18:59:21 UTC, Adam D. Ruppe 
 wrote:
 On Wednesday, 1 November 2017 at 18:42:07 UTC, Bo wrote:
 There is an issue with Windows. The whole attacking of the 
 messenger, the whole idiotic argument that Windows is dying: it 
 is all pure, useless trolling of the people who ask a simple 
 question: how to solve the D 64bit issue so that, like on 
 the Linux or OS X platform, users can have the 
 SAME level of consistency.
Windows 32 bit is the special one - it is the ONLY platform where D works out of the box without additional downloads. That's one reason why I advocate it for just playing around - it just works.
Yes, it works when toying around, but as soon as you want to write actual software you can't use 32 bit anymore, because OPTLINK is just too buggy and will end up unable to link your code correctly. A good example is that mysql-native currently doesn't link properly with OPTLINK. It might link for some, but at least for me, I'm forced to either use an older compiler or compile to 64 bit. See: https://github.com/mysql-d/mysql-native/issues/100 There are also reported issues like this one: https://issues.dlang.org/show_bug.cgi?id=15183 I'm aware that issues like these should be reported more often and as soon as they're discovered, but they're also hard to report, because you get virtually no information about what's wrong and can only guess by commenting out sections of your code until it links. That's not ideal. I'm sure many other similar issues exist. Yes, 32 bit development with D is easy on Windows, but only for toying around, which is no reason to defend it.
You can use -m32mscoff for 32-bit, which uses the Visual Studio linker like the 64-bit version. I've been saying OPTLINK should be removed. Even if you report a bug for OPTLINK, it's never going to get fixed. No one's stupid enough to go digging through that spaghetti code dump. If you're lucky, some limitation might be introduced to DMD that won't trigger the bug in OPTLINK. That's why it shouldn't be supported anymore; it's hindering DMD, not making it better. It's amazing how many people are too lazy to download Visual Studio, and with some of the stupidest reasons for not wanting to download it, to boot.
It's not that people don't want to get Visual Studio, but some people have limited space. E.g. until a few months ago I was actually developing all my stuff on a Windows tablet which only had 30 GB of space (the OS etc. also took up part of those 30 GB). It would have been impossible for me to get Visual Studio on it, at least if I wanted to use the tablet for anything else. Of course it's not a problem for me at the moment, as I have a laptop, but at the time it was the only thing I had. At least I wasn't hit by any bugs in OPTLINK back then, or it would have been impossible for me to actually write D code.
Nov 07
parent Jerry <hurricane hereiam.com> writes:
On Tuesday, 7 November 2017 at 23:04:09 UTC, bauss wrote:
 On Tuesday, 7 November 2017 at 20:44:57 UTC, Jerry wrote:
 On Tuesday, 7 November 2017 at 19:10:50 UTC, bauss wrote:
 On Wednesday, 1 November 2017 at 18:59:21 UTC, Adam D. Ruppe 
 wrote:
 On Wednesday, 1 November 2017 at 18:42:07 UTC, Bo wrote:
 There is an issue with Windows. The whole attacking of the 
 messenger, the whole idiotic argument that Windows 
 is dying, it is all pure useless trolling of the people who 
 ask a simple question: How to solve the D 64-bit issue so 
 that, like on the Linux or OSX platforms, the users can have 
 the SAME level of consistency.
Windows 32 bit is the special one - it is the ONLY platform where D works out of the box without additional downloads. That's one reason why I advocate it for just playing around - it just works.
Yes it works when toying around, but as soon as you want to write actual software you can't write 32-bit anymore, because OPTLINK is just too buggy and will end up not being able to link your code correctly. A good example is that mysql-native currently doesn't link properly with OPTLINK. It might link for some, but at least for me, I'm forced to either use an older compiler or compile to 64-bit. See: https://github.com/mysql-d/mysql-native/issues/100 There's also reported issues like this one: https://issues.dlang.org/show_bug.cgi?id=15183 I'm aware that issues like these should be reported more often, and as soon as they're discovered, but they're also hard to report, because you get virtually no information about what's wrong and you can only guess by commenting out sections of your code until it links. That's not ideal. I'm sure many other similar issues exist. Yes, 32-bit development with D is easy on Windows, but only for toying around, which is no reason to defend it.
You can use -m32mscoff for 32-bit, which uses the Visual Studio toolchain like the 64-bit build does. I've been saying OPTLINK should be removed. Even if you report a bug for OPTLINK, it's never going to get fixed. No one's foolish enough to go digging through that spaghetti-code dump. If you're lucky, some limitation might be introduced into DMD that keeps the bug in OPTLINK from triggering. That's why it shouldn't be supported anymore; it's hindering DMD, not making it better. It's amazing how many people are too lazy to download Visual Studio, with some of the stupidest reasons for not wanting to download it, to boot.
It's not that people don't want to get Visual Studio, but some people have limited space. E.g. until a few months ago I was actually developing all my stuff on a Windows tablet which only had 30 GB of space (the OS etc. also took up part of those 30 GB). It would have been impossible for me to get Visual Studio on it, at least if I wanted to use the tablet for anything else. Of course it's not a problem for me at the moment, as I have a laptop, but at the time it was the only thing I had. At least I wasn't hit by any bugs in OPTLINK back then, or it would have been impossible for me to actually write D code.
Well, a tablet isn't really for development; even a cheap laptop would be better. You can't really do much of anything with that little space. I don't think the focus should be on people with niche development hardware like tablets. If you do enough CTFE, DMD's RAM usage shoots through the roof and you'd end up not having enough RAM to compile anyway. Let alone if you have enough RAM but still use the 32-bit version of DMD and hit its address-space limit.
Nov 07
prev sibling parent reply Patrick Schluter <Patrick.Schluter bbox.fr> writes:
On Tuesday, 7 November 2017 at 20:44:57 UTC, Jerry wrote:
 It's amazing how many people are so lazy to download Visual 
 Studio, and some of the stupidest reason for not wanting to 
 download it to boot.
It has nothing to do with laziness. If you're behind a proxy, that abomination of an installer for Visual Studio doesn't work. I tried several times, offline and online setup, and read the Studio forums. The Studio 2017 installer doesn't work inside our environment at work (EU Commission). It might be an issue with our infrastructure, but that's unlikely, as I've managed to install a lot of other things before.
Nov 07
next sibling parent Jerry <hurricane hereiam.com> writes:
On Wednesday, 8 November 2017 at 06:24:38 UTC, Patrick Schluter 
wrote:
 On Tuesday, 7 November 2017 at 20:44:57 UTC, Jerry wrote:
 It's amazing how many people are so lazy to download Visual 
 Studio, and some of the stupidest reason for not wanting to 
 download it to boot.
 It has nothing to do with laziness. If you're behind a proxy, that abomination of an installer for Visual Studio doesn't work. I tried several times, offline and online setup, and read the Studio forums. The Studio 2017 installer doesn't work inside our environment at work (EU Commission). It might be an issue with our infrastructure, but that's unlikely, as I've managed to install a lot of other things before.
You're the first person who's responded to me to say this. Do you use D at your work anyway? If you require Visual Studio at work, the free Community version probably isn't for you anyway. If you say it's not your infrastructure, then they might very well be blocking the download at your workplace, knowing that it's a business. But I'd say it's more likely not them blocking it. Contact IT, if you're even supposed to be installing Visual Studio on the computers at work.
Nov 08
prev sibling parent codephantom <me noyb.com> writes:
On Wednesday, 8 November 2017 at 06:24:38 UTC, Patrick Schluter 
wrote:
 It has nothing to do with lazyness. If you're behind a proxy 
 that abomination of a installer of Visual Studio doesn't work. 
 I tried several times, offline and online setup, read the 
 Studio forums. Studio 2017 installer doesn't work inside our 
 environment at work (EU Commission). Might be an issue with our 
 infrastructure but it's unlikely as I managed to install a lot 
 of things before.
There are many good reasons why certain organisations might block the installation of Visual Studio (and not just due to its ridiculous size). It's a monster of a (potential) threat vector, when you really think about it. If I were managing that organisation, I'd be blocking it too ;-) (admins can deploy internally of course, but should do so only after a detailed risk analysis).
Nov 08
prev sibling parent Kagamin <spam here.lot> writes:
On Tuesday, 7 November 2017 at 19:10:50 UTC, bauss wrote:
 See:
 https://github.com/mysql-d/mysql-native/issues/100

 There's also reported issues like this one:
 https://issues.dlang.org/show_bug.cgi?id=15183
Walter said somewhere that submitting objs should be enough.
Nov 10
prev sibling parent Walter Bright <newshound2 digitalmars.com> writes:
On 11/1/2017 11:59 AM, Adam D. Ruppe wrote:
 Windows 32 bit is the special one - it is the ONLY platform where D works out
of 
 the box without additional downloads. That's one reason why I advocate it for 
 just playing around - it just works.
Yay Digital Mars C++ :-)
Nov 07
prev sibling next sibling parent reply Joakim <dlang joakim.fea.st> writes:
On Wednesday, 1 November 2017 at 18:42:07 UTC, Bo wrote:
 For a dying platform as so many advocate here, it seems to be 
 doing fairly well.

 Maybe I am too old, but the whole "dying platform" gig has 
 been going on all the way since Windows ME and Vista and 8 and ...

 The reality is, for any user that wants to be productive 
 Windows is hard to beat. The only thing that comes close is the 
 extreme hardware restrictive OSx from Apple.

 Just reading the comments, I do think that people here have a 
 massive anti-Microsoft bias.

 Mobile will overtake PC for productivity? No ... simply no.
 Windows is dying? Hardly...

 Has the market changed because some users can use tablets, as 
 they are not hardcore user but only want to simply browse and 
 mail? Yes... There has been a shift there.

 But will Windows be phased out on the corporate floor? No ... 
 Will Windows be removed as a gaming platform and replaced by 
 Linux / OSX? No ...

 While Linux and OSX can be used very well, both platforms 
 have too many issues of their own: OSX being hardware-limited 
 by design, as it makes testing easier for Apple; Linux being a 
 market that is so fragmented on the desktop level.

 At times people may want to appreciate the level of robustness 
 that Windows has. It's easily as stable as Linux but has 
 support for almost every piece of hardware.

 With the inclusion of WSL (guess what I use D on, because, 
 well, I do not want to install VS!), it combines both worlds.


 Maybe for some people the reason why they are being so annoying 
 and frankly rude is that their own bias is getting in the way of 
 the message. It's not because a person wants to write D code 
 that they want to install a multi-GIGABYTE package just so 
 they can compile 64-bit programs.

 Same with the comments that come down to "I do not see a reason 
 why you want 64-bit on Windows"; that is not a good excuse.

 On Wednesday, 1 November 2017 at 09:24:57 UTC, Joakim wrote:
 I don't propose ignoring it, but I suggest not investing too 
 much more into it, like all the work it would take to get VS 
 or other Windows IDE support up to the level that Windows 
 devs seem to want.
It just shows pure vileness toward Windows users, as in "We do not care to fix issues on your platform; use our platform, or install VS and have it bit-rot on your hard drive for no reason beyond that we simply do not want to support Windows on D". No wonder some people think that Windows is a second-tier citizen in the D community. It sure as hell does not feel very welcoming reading this thread. When a person has an issue, the responses seem to be very aggressive, attacking that person and the platform while ignoring the actual issue. How many people posted here claiming that he wanted to have 64-bit removed, when that was NOT what he wrote? There is an issue with Windows. The whole attacking of the messenger, the whole idiotic argument that Windows is dying, it is all pure useless trolling of the people who ask a simple question: How to solve the D 64-bit issue so that, like on the Linux or OSX platforms, the users can have the SAME level of consistency. It's so strange that Go solved 64-bit Windows a long time ago. Or C. Or C++ ... and so many other compilers that do NOT need VS installed to produce 64-bit binaries on the Windows platform. So in other words, all these comments about just installing VS are pure bullshit. If you do not like to answer the question, then do not troll people. And frankly, Walter or whoever needed to put a stop to this anti-Windows bullshit several days ago, as long as people show this level of disrespect towards community members because they are not using the "right" platform. /Signed: A pissed off Windows user
Your rant is rife with mistakes, both factual and perceived. Your main claim seems to be that Windows users who want to use D but don't want VS are being attacked because the D devs are too lazy to find a way to support that. More likely, it's considered a niche use case that the D devs on Windows don't want to bother with, since most people developing for Windows probably use VS already, just like you have to use Xcode on macOS or for iOS. As for saying Windows is dying, that is a factual examination of the data with one recommendation/application: don't bother spending a lot of time on improving D's Windows IDE support. Nobody suggested it had anything to do with supporting Windows users who don't use VS, and it's bonkers to suggest it means we're "anti-Windows" or shows "disrespect" for Windows users. We don't support Haiku OS either; it's not because we're anti-Haiku or disrespect its users. It's just too small and niche for us to care. The evidence is that Windows is heading that way too.
Nov 01
parent reply jmh530 <john.michael.hall gmail.com> writes:
On Wednesday, 1 November 2017 at 19:49:04 UTC, Joakim wrote:
 As for saying Windows is dying, that is a factual examination 
 of the data
When you say it is dying, I (and perhaps most others) would assume the argument you are making is that not only is Windows in decline, but also that it is about to no longer exist as a meaningful platform for programmers to code on. This is a forecast about the future. However, the future is inherently unknowable. Forecasts are opinions. While these forecasts may be based on facts, and people could disagree about the likelihood of the forecast or their confidence in it, it is opinion. It is not fact. I wouldn't dispute that Windows is in decline. I looked up the Stack Overflow survey of platforms that people program on and added up the Windows components from 2013 to 2016. In 2013 it was 60.4%, and it steadily fell to 52.2% in 2016. The largest growth in share was OS X (not Linux). However, even falling from 60% to 50%, it's still 50%. That's huge. And this is programmers who use Stack Overflow, not normal users. Look at the developer environments and it's either Visual Studio or a text editor (Sublime or Notepad++) as most popular. The evidence says it is in decline. And the trend doesn't look good. However, that doesn't mean it's going away. It also doesn't mean you can project the current trend into the future at the current rate, or at a faster or slower rate. Who knows what the rate will be. What matters is that half of all developers (by this measure) use Windows now. Who knows what the equilibrium will be? Maybe it will stabilize at roughly equal shares across Linux/OSX/Windows. Maybe Windows will become niche (in which case you could conceivably make the argument that it's dying). God only knows. But you cannot say that it is all fact and not opinion. It is opinion. It is a forecast. [1] https://insights.stackoverflow.com/survey/2016
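As a back-of-envelope illustration of the projection point above (the only inputs are the survey shares quoted in this post; the linear model itself is purely hypothetical):

```python
# Naive linear extrapolation of the Windows developer-share figures quoted
# above (60.4% in 2013, 52.2% in 2016). Purely illustrative: it shows why
# "in decline" does not by itself imply "about to disappear".
start_year, start_share = 2013, 60.4
end_year, end_share = 2016, 52.2

# Average change in percentage points per year over the survey window
rate = (end_share - start_share) / (end_year - start_year)

def projected_share(year):
    """Extrapolate the share linearly past the last survey year."""
    return end_share + rate * (year - end_year)

print(f"rate: {rate:.2f} points/year")        # about -2.73
print(f"2020: {projected_share(2020):.1f}%")  # still above 40% in this model
```

Even under this crude constant-rate model, the share stays above 40% years later, which is consistent with the "decline, but not imminent death" reading.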
Nov 01
next sibling parent jmh530 <john.michael.hall gmail.com> writes:
On Wednesday, 1 November 2017 at 21:19:55 UTC, jmh530 wrote:
 When you say it is dying, I (and perhaps most others) would 
 assume the argument you are making is that not only is Windows 
 in decline, but also that it is about to no longer exist as a 
 meaningful platform for programmers to code on.
I would rephrase part of this as: "...but also that it is, at some point in the near future, to no longer exist..."
Nov 01
prev sibling parent reply Joakim <dlang joakim.fea.st> writes:
On Wednesday, 1 November 2017 at 21:19:55 UTC, jmh530 wrote:
 On Wednesday, 1 November 2017 at 19:49:04 UTC, Joakim wrote:
 As for saying Windows is dying, that is a factual examination 
 of the data
 When you say it is dying, I (and perhaps most others) would assume the argument you are making is that not only is Windows in decline, but also that it is about to no longer exist as a meaningful platform for programmers to code on. This is a forecast about the future. However, the future is inherently unknowable. Forecasts are opinions. While these forecasts may be based on facts, and people could disagree about the likelihood of the forecast or their confidence in it, it is opinion. It is not fact. I wouldn't dispute that Windows is in decline. I looked up the Stack Overflow survey of platforms that people program on and added up the Windows components from 2013 to 2016. In 2013 it was 60.4%, and it steadily fell to 52.2% in 2016. The largest growth in share was OS X (not Linux). However, even falling from 60% to 50%, it's still 50%. That's huge. And this is programmers who use Stack Overflow, not normal users. Look at the developer environments and it's either Visual Studio or a text editor (Sublime or Notepad++) as most popular. The evidence says it is in decline. And the trend doesn't look good. However, that doesn't mean it's going away. It also doesn't mean you can project the current trend into the future at the current rate, or at a faster or slower rate. Who knows what the rate will be. What matters is that half of all developers (by this measure) use Windows now. Who knows what the equilibrium will be? Maybe it will stabilize at roughly equal shares across Linux/OSX/Windows. Maybe Windows will become niche (in which case you could conceivably make the argument that it's dying). God only knows. But you cannot say that it is all fact and not opinion. It is opinion. It is a forecast. [1] https://insights.stackoverflow.com/survey/2016
I say dying, you say decline, no point in debating the semantics. I will agree with you that we don't know how soon Windows will actually, effectively die: an imminent collapse is merely my forecast, which I tried to back up with data and examples of how mobile is gunning to kill it off. Dying tech can sometimes rebound for some time, so it is certainly possible for Windows. But ultimately all this discussion of market share won't matter if nobody wants to do the work. Windows has historically been the dominant tech platform and D's support for it is much more advanced than its support for the currently dominant platform, Android, which I'm the only person working on. I'm trying to influence people to work more on Android and less on Windows, based on the aforementioned market share and product data. You presumably believe Windows won't fade that fast and should still receive a higher level of investment than I would recommend. We've each made our case. Given the current levels of investment, I'm not sure anybody cares about these market share arguments anyway. ;) More likely, it is completely idiosyncratic, just based on the need, skill, and time of the particular D dev. We can only hope that this data and argument has had some influence on the community.
Nov 01
parent jmh530 <john.michael.hall gmail.com> writes:
On Wednesday, 1 November 2017 at 21:55:56 UTC, Joakim wrote:
 [snip] I'm not sure anybody cares about these market share 
 arguments anyway.
Ha, fair enough.
Nov 01
prev sibling next sibling parent reply codephantom <me noyb.com> writes:
On Wednesday, 1 November 2017 at 18:42:07 UTC, Bo wrote:
 /Signed: A pissed off Windows user
I think you've summed it all up right there ;-) But seriously, Windows rightly has its place... and for good reasons. Most importantly, it provided ease of installation along with an easy-to-use, easy-to-understand GUI that simply could not be provided by open-source alternatives. Things have changed a lot, though, over the last decade. The only thing I can think of that is wrong with Windows is that you can't fork it. Therefore it cannot evolve unless the vendor wants it to, and even then, only in the way the vendor wants it to. But software is for the user, not the vendor. A user should be able to adapt software to meet their own requirements. Closed source prevents that. I think open source really is the future, and Windows will fade into obscurity - but only if open source continues to deliver the benefits that Windows has always been able to deliver. If that keeps occurring, then there is little justification for having a closed-source operating system - whether you call it Windows or whatever. And I think trust will become a bigger issue in the near future too... i.e. how can you trust code you can't view? You can barely even trust code you can view ;-) Having a go at a platform is not the same as having a go at the users of the platform. Please understand the difference. Even in the open-source world, opinions differ... a lot. /Signed: Happy FreeBSD user.
Nov 01
parent reply "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Thu, Nov 02, 2017 at 04:13:39AM +0000, codephantom via Digitalmars-d wrote:
 On Wednesday, 1 November 2017 at 18:42:07 UTC, Bo wrote:
 /Signed: A pissed off Windows user
 I think you've summed it all up right there ;-) But seriously, Windows rightly has its place... and for good reasons. Most importantly, it provided ease of installation along with an easy-to-use, easy-to-understand GUI that simply could not be provided by open-source alternatives.
It's a matter of opinion. *I*, personally, find Windows atrociously hard to do anything useful in. I get an aneurysm trying to click through countless nested menus just to find that one button I need, when I could have typed the right command in 3 seconds in a Bash shell. But OTOH, I also know that my opinion is in the minority (by far!). :-P I think it just boils down to personal preference and habit built from past familiarity. As they say, there is no accounting for taste. One thing is clear, though: claiming that Windows is "dead" is, frankly, ridiculous. Even a non-Windows person like me who rarely has any reason to notice things Windows-related, can see enough circumstantial evidence around me that Windows is still very much alive and kicking. (Even if in my ideal world there would be no Windows... but then, if the world were my ideal, 90% of computer users out there would probably be very angry about being forced to use obscure text-only interfaces that I'm completely comfortable in. So it's probably not a bad thing the real world doesn't match my ideal one. :-D) [...]
 But software is for the user, not the vendor. A user should be able to
 adapt software to meet their own requirements. Closed source prevents
 that.
 
 I think open-source really is the future, and Windows will fade into
 obscurity - but only if open source continues to deliver the benefits
 that Windows has always been able to deliver. If that keeps occuring,
 then there is little justification for having a closed source
 operating system - whether you call it Windows or whatever.
 
 And I think trust will become a bigger issue in the near future
 too...i.e.  how can you trust code you can't view? You can barely even
 trust code you can view ;-)
[...] There is another side to this argument, though. How many times have *you* reviewed the source code of the software that you use on a daily basis? Do you really *trust* the code that you theoretically *can* review, but haven't actually reviewed? Do you trust the code just because some random strangers on the internet say they've reviewed it and it looks OK? It is a common argument among open source proponents that having more eyes will reduce the number of bugs... It sounds convincing, but the problem with that, is that this only works when there is a relatively small amount of code and a very large pool of potential reviewers. Unfortunately, the hard reality today is that there is so much open source code out there, and the rate at which open source code is being written far exceeds the rate of growth of the number of open source reviewers, that I'd venture to say 80-90% of open source code out there has never seen more than a very small number of reviewers. Probably not more than 1 or 2 for a significant fraction of it, if even that. I have seen open source code that probably has *never* been reviewed, because any review would have instantly brought to light the blatantly-obvious bugs and problems that riddle just about every page of code. And some of that code is so ugly that even if I had personally reviewed it to death, I still wouldn't trust anything that depends on it, sorry to say. And among the scant few projects that do get above average contributors (and thus code reviewers), we *still* have bugs like Heartbleed that go undetected for *years*. And this is in cryptographic code that, ostensibly, undergoes far more careful scrutiny than more "ordinary" code. Where does that leave the trust level of said ordinary code? Especially code that comes from lesser projects that don't enjoy the same level of review as high-visibility projects like OpenSSL? That's not to say that proprietary code is any better, though. 
Having worked in proprietary software development ("enterprise" software development) for the past 2 decades or so, I can say that the code quality isn't any better. Just because you pay somebody to do the job doesn't guarantee they'll do a *good* job, let's just put it that way. There's a widespread mentality of "not my problem" that goes around in proprietary software development. You don't want to touch some ugly code that isn't directly your responsibility, because it could break and the blame would fall on you. You often don't know why something was written a certain way -- it could be part of an elaborate bugfix for a critical customer bug, so you really don't want to touch it and break things. So you just work around it in the code you *are* responsible for, and let whoever it is figure out what to do with *their* code. Unfortunately, often this "whoever" is actually "nobody", because said persons have moved on. So things end up never getting fixed. Also, sometimes bad designs are left untouched because of office politics, and code quality can greatly suffer because of that. At least with open source code disinterested 3rd parties can review the code without undue bias and notice problems (and ostensibly, fix them). But let's not kid ourselves that open source is *necessarily* better. It *can* be better in some cases, but it depends. Trust is a far more complex issue than "proprietary is bad, open source is good", as certain open source zealots would have us believe. It takes more than just being open source; other factors also play a critical role, so just because something is open source guarantees nothing. T -- Life would be easier if I had the source code. -- YHL
Nov 01
next sibling parent reply codephantom <me noyb.com> writes:
On Thursday, 2 November 2017 at 05:13:42 UTC, H. S. Teoh wrote:
 There is another side to this argument, though.  How many times 
 have *you* reviewed the source code of the software that you 
 use on a daily basis?  Do you really *trust* the code that you 
 theoretically *can* review, but haven't actually reviewed?  Do 
 you trust the code just because some random strangers on the 
 internet say they've reviewed it and it looks OK?
I did make that point ;-) Of course you can't even view closed source. So there is no way to audit it, and therefore no way to trust it. Full stop. That cannot be argued against. On the other hand, just being open source does not mean it can be trusted - just look at the OpenSSL debacle - that's a great case study if ever there was one. But Ken Thompson summed it all up nicely: "You can't trust code that you did not totally create yourself." http://vxer.org/lib/pdf/Reflections%20on%20Trusting%20Trust.pdf But the key value of open source is not that you can (or cannot) trust it, but that it's an enabler of evolution (even if sometimes a slow one ;-). Linus gave a great talk about this important principle back in 2001: https://www.youtube.com/watch?v=WVTWCPoUt8w
Nov 01
next sibling parent codephantom <me noyb.com> writes:
On Thursday, 2 November 2017 at 06:28:52 UTC, codephantom wrote:
 Linus gave a great talk about this important principle back in 
 2001:

 https://www.youtube.com/watch?v=WVTWCPoUt8w
btw. 36:12 is a good starting point in the video, it's where Ken Thompson (yes, the one and only) asks Linus a question about communal programming.
Nov 01
prev sibling parent reply Patrick Schluter <Patrick.Schluter bbox.fr> writes:
On Thursday, 2 November 2017 at 06:28:52 UTC, codephantom wrote:
 But Ken Thompson summed it all up nicely: "You can't trust code 
 that you did not totally create yourself."
Even that is wrong. You can trust code you create yourself only if it has been reviewed by others as involved as you are. I do not trust the code I write. The code I write generally conforms to the problem I think it solves. More than once I was wrong in my assumptions, and therefore my code was wrong, even if perfectly implemented.
Nov 02
parent reply Dave Jones <dave jones.com> writes:
On Thursday, 2 November 2017 at 08:59:05 UTC, Patrick Schluter 
wrote:
 On Thursday, 2 November 2017 at 06:28:52 UTC, codephantom wrote:
 But Ken Thompson summed it all up nicely: "You can't trust 
 code that you did not totally create yourself."
 Even that is wrong. You can trust code you create yourself only if it has been reviewed by others as involved as you are. I do not trust the code I write. The code I write generally conforms to the problem I think it solves. More than once I was wrong in my assumptions, and therefore my code was wrong, even if perfectly implemented.
He means trust in the sense that there's no nefarious payload hidden in there, not that it works properly.
Nov 02
parent "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Thu, Nov 02, 2017 at 09:16:02AM +0000, Dave Jones via Digitalmars-d wrote:
 On Thursday, 2 November 2017 at 08:59:05 UTC, Patrick Schluter wrote:
 On Thursday, 2 November 2017 at 06:28:52 UTC, codephantom wrote:
 
 But Ken Thompson summed it all up nicely: "You can't trust code
 that you did not totally create yourself."
 Even that is wrong. You can trust code you create yourself only if it has been reviewed by others as involved as you are. I do not trust the code I write. The code I write generally conforms to the problem I think it solves. More than once I was wrong in my assumptions, and therefore my code was wrong, even if perfectly implemented.
He means trust in the sense that there's no nefarious payload hidden in there, not that it works properly.
[...] Sometimes the line is blurry, though. OpenSSL with the Heartbleed bug has no nefarious payload -- but I don't think you could say you "trust" it. Trust is a tricky thing to define.

But more to the original point: Thompson's article on trusting trust goes deeper than mere code. The real point is that ultimately, you have to trust some upstream vendor "by faith", as it were, because if you want to be *really* paranoid, you'll have to question not only whether your compiler comes with a backdoor of the kind Thompson describes in the article, but also whether there's something nefarious going on with the *hardware* your code is running on.

I mean, these days, CPUs come with microcode, so even if you had access to a known-to-be-uncompromised disassembler and reviewed the executable instruction by instruction, in a philosophical sense you *still* cannot be sure that when you hand this machine code to the CPU, it will not do something nefarious. What if the microcode was compromised somewhere along the line? And even if you could somehow review the microcode and verify that it doesn't do anything nefarious, do you really trust that the CPU manufacturer hasn't modified some of the CPU design circuitry to do something nefarious? You can review the VLSI blueprints for the CPU, but how do you know the factory didn't secretly modify the hardware? If you *really* wish to be 100% sure about anything, you'll have to use a scanning electron microscope to verify that the hardware actually does what the manufacturer says it does and nothing else.

(Not to mention, even if you *were* able to review every atom of your CPU to be sure it does what it's supposed to and nothing else, how do you know your hard drive controller isn't somehow compromised to deliver a different, backdoored version of your code when you run the executable, but deliver the innocent reflection of the source code when you're reviewing the binary? So you'll have to use the electron microscope on your HD controller too. And the rest of your motherboard and everything else attached to it.)

Of course, practically speaking, somewhere on the line between reviewing code and using an electron microscope (and even in the latter case, one has to question whether the microscope manufacturer inserted something nefarious to hide a hardware exploit -- so you'd better build your own electron microscope from the ground up), there is a line where you'd just say, OK, this is good enough, I'll just have to take on faith that below this level, everything works as advertised. Otherwise, you'd get nothing done, because nobody has a long enough lifetime, nor patience, nor the requisite knowledge, to review *everything* down to the transistor level. Somewhere along the line you just have to stop and take on faith that everything past that point isn't compromised in some way.

And yes, I said and meant take on *faith* -- because even peer review is a matter of faith -- faith that the reviewers don't have a hidden agenda or are involved in secret collusions to push some agenda. It's very unlikely to happen in practice, but you can't be *sure*. And that's the point Thompson was getting at. You have to build up trust from *somewhere* other than ground zero. And because of that, you should, on the other hand, always be prepared to mitigate unexpected circumstances that may compromise the trust you've placed in something. Rather than becoming paranoid and locking yourself in a Faraday cage inside an underground bunker, isolated from the big bad world, and building everything from scratch yourself, you decide at what level to start building your trust, and prepare ways to mitigate problems when it turns out that what you trust wasn't that trustworthy after all.

So if you want to talk about trust, open source code is only the tip of the iceberg. The recent fiasco about buggy TPM chips generating easily-cracked RSA keys is ample proof of this. Your OS may be fine, but when it relies on a TPM chip that has a bug, you have a problem. And this is just a *bug* we're talking about. What if it wasn't a bug, but a deliberate backdoor inserted by the NSA or some agency with an ulterior motive? Your open source OS won't help you here.

And yes, the argument has been made that if only the TPM code were open source, the bug would have been noticed. But again, that depends. Just because the code is open source doesn't guarantee it's getting the attention it needs. And even if it is, there's always the question of whether the hardware it's running on isn't compromised. At *some* point, you just have to draw the line and take things on faith, otherwise you have no choice but to live in a Faraday cage inside an underground bunker.

T

-- 
2+2=4. 2*2=4. 2^2=4. Therefore, +, *, and ^ are the same operation.
Nov 02
prev sibling next sibling parent reply Patrick Schluter <Patrick.Schluter bbox.fr> writes:
On Thursday, 2 November 2017 at 05:13:42 UTC, H. S. Teoh wrote:
 On Thu, Nov 02, 2017 at 04:13:39AM +0000, codephantom via 
 Digitalmars-d wrote:
 On Wednesday, 1 November 2017 at 18:42:07 UTC, Bo wrote:
 /Signed: A pissed off Windows user
I think you've summed it all up right there ;-) But seriously, Windows rightly has its place... and for good reasons. Most importantly, it provided the ease of installation along with an easy-to-use and easy-to-understand GUI that simply could not be provided by open source alternatives.
It's a matter of opinion. *I*, personally, find Windows atrociously hard to do anything useful in. I get an aneurysm trying to click through countless nested menus just to find that one button I need, when I could have typed the right command in 3 seconds in a Bash shell. But OTOH, I also know that my opinion is in the minority (by far!). :-P I think it just boils down to personal preference and habit built from past familiarity. As they say, there is no accounting for taste.

One thing is clear, though: claiming that Windows is "dead" is, frankly, ridiculous. Even a non-Windows person like me, who rarely has any reason to notice things Windows-related, can see enough circumstantial evidence around me that Windows is still very much alive and kicking. (Even if in my ideal world there would be no Windows... but then, if the world were my ideal, 90% of computer users out there would probably be very angry about being forced to use obscure text-only interfaces that I'm completely comfortable in. So it's probably not a bad thing the real world doesn't match my ideal one. :-D)

[...]
 But software is for the user, not the vendor. A user should be 
 able to adapt software to meet their own requirements. Closed 
 source prevents that.
 
 I think open-source really is the future, and Windows will 
 fade into obscurity - but only if open source continues to 
 deliver the benefits that Windows has always been able to 
 deliver. If that keeps occurring, then there is little 
 justification for having a closed source operating system - 
 whether you call it Windows or whatever.
 
 And I think trust will become a bigger issue in the near 
 future too...i.e.  how can you trust code you can't view? You 
 can barely even trust code you can view ;-)
[...] There is another side to this argument, though. How many times have *you* reviewed the source code of the software that you use on a daily basis? Do you really *trust* the code that you theoretically *can* review, but haven't actually reviewed? Do you trust the code just because some random strangers on the internet say they've reviewed it and it looks OK?
Yes, yes, yes, so true.
 It is a common argument among open source proponents that 
 having more eyes will reduce the number of bugs... It sounds 
 convincing, but the problem with that, is that this only works 
 when there is a relatively small amount of code and a very 
 large pool of potential reviewers. Unfortunately, the hard 
 reality today is that there is so much open source code out 
 there, and the rate at which open source code is being written 
 far exceeds the rate of growth of the number of open source 
 reviewers, that I'd venture to say 80-90% of open source code 
 out there has never seen more than a very small number of 
 reviewers. Probably not more than 1 or 2 for a significant 
 fraction of it, if even that.  I have seen open source code 
 that probably has *never* been reviewed, because any review 
 would have instantly brought to light the blatantly-obvious 
 bugs and problems that riddle just about every page of code.  
 And some of that code is so ugly that even if I had personally 
 reviewed it to death, I still wouldn't trust anything that 
 depends on it, sorry to say.
And that's a nice argument for D (dmd, phobos), as it is quite compact and relatively well written, so that it can be reviewed by mere mortals. Ever tried to read gcc or glibc? Forget about it if you're not an astronaut. Even when not knowing D all too well, I could understand what was going on in phobos and check some of the common pitfalls (file copy functions are hilariously often buggy in books and code samples; half the time they forget the O_TRUNC in open() or the ftruncate() to delete the old content on overwrites).
 And among the scant few projects that do get above average 
 contributors (and thus code reviewers), we *still* have bugs 
 like Heartbleed that go undetected for *years*. And this is in 
 cryptographic code that, ostensibly, undergoes far more careful 
 scrutiny than more "ordinary" code.  Where does that leave the 
 trust level of said ordinary code? Especially code that comes 
 from lesser projects that don't enjoy the same level of review 
 as high-visibility projects like OpenSSL?

 That's not to say that proprietary code is any better, though.  
 Having worked in proprietary software development ("enterprise" 
 software development) for the past 2 decades or so, I can say 
 that the code quality isn't any better.  Just because you pay 
 somebody to do the job doesn't guarantee they'll do a *good* 
 job, let's just put it that way. There's a widespread mentality 
 of "not my problem" that goes around in proprietary software 
 development.  You don't want to touch some ugly code that isn't 
 directly your responsibility, because it could break and the 
 blame would fall on you.  You often don't know why something 
 was written a certain way -- it could be part of an elaborate 
 bugfix for a critical customer bug, so you really don't want to 
 touch it and break things.  So you just work around it in the 
 code you *are* responsible for, and let whoever it is figure 
 out what to do with *their* code. Unfortunately, often this 
 "whoever" is actually "nobody", because said persons have moved 
 on. So things end up never getting fixed.  Also, sometimes bad 
 designs are left untouched because of office politics, and code 
 quality can greatly suffer because of that.
So true. When I worked in embedded industrial appliances, we had to be able to reproduce exactly the same code (i.e. EPROMs) years out. Even if the code had been fixed of some other bugs, when we had to correct a specific bug from one client, it was only that one bug that could be fixed. Any other bug or enhancement was off limits. The hardware and the software were required to behave exactly the same. There was one time when one of our controllers blocked the production line at Opel Rüsselsheim for a morning; I wouldn't wish that day's stress level on anyone.
 At least with open source code disinterested 3rd parties can 
 review the code without undue bias and notice problems (and 
 ostensibly, fix them). But let's not kid ourselves that open 
 source is *necessarily* better. It *can* be better in some 
 cases, but it depends.  Trust is a far more complex issue than 
 "proprietary is bad, open source is good", as certain open 
 source zealots would have us believe.  It takes more than just 
 being open source; other factors also play a critical role, so 
 just because something is open source guarantees nothing.
There are also some open source projects that are maintained by dicks, and working with them makes the whole experience nasty.
Nov 02
parent "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Thu, Nov 02, 2017 at 08:53:07AM +0000, Patrick Schluter via Digitalmars-d
wrote:
 On Thursday, 2 November 2017 at 05:13:42 UTC, H. S. Teoh wrote:
[...]
 And that's a nice argument for D (dmd, phobos) as it is quite compact and
 relatively well written so that it can be reviewed by mere mortals. Ever
 tried to read gcc or glibc ? Forget about it if you're not an astronaut.
Yeah, I've (tried to) read glibc source code before. It's ... not for the uninitiated. :-P Which brings up another point about open source code: just because you can *see* the code, doesn't guarantee you'll *understand* enough of it to verify its correctness.
 Even when not knowing D all to well I could understand what was going
 on in phobos and check some of the common pitfalls [...]
Yeah, that was one thing that totally amazed me about D the first time I looked at the Phobos source code. It's sooo readable!!!!! Very unlike most of the source of standard libraries of other languages that I've tried to read. The fact that D allows the Phobos authors to express complex concepts needed in standard libraries in a readable, maintainable way, was a big selling point of D to me. There *are* some dark, dirty corners in Phobos where the code makes you cringe... but generally speaking, these are restricted to only a few rare places, rather than pervasive throughout the code the way, say, glibc source code is. Or any sufficiently-complex C/C++ library, really, that generally tends to slide into macro spaghetti hell, conditional #ifdef nightmare, and/or non-standard compiler extension soup that drowns out any semblance of "normal" C/C++ syntax.
 At least with open source code disinterested 3rd parties can review
 the code without undue bias and notice problems (and ostensibly, fix
 them).  But let's not kid ourselves that open source is
 *necessarily* better. It *can* be better in some cases, but it
 depends.  Trust is a far more complex issue than "proprietary is
 bad, open source is good", as certain open source zealots would have
 us believe.  It takes more than just being open source; other
 factors also play a critical role, so just because something is open
 source guarantees nothing.
 
There are also some open source projects that are maintained by dicks, and working with them makes the whole experience nasty.
Yeah. There's always the option to fork, of course, which isn't possible with proprietary software. But even then, they can still make your life a living hell if you're unlucky enough to get on their wrong side. T -- Philosophy: how to make a career out of daydreaming.
Nov 02
prev sibling next sibling parent reply Shachar Shemesh <shachar weka.io> writes:
On 02/11/17 07:13, H. S. Teoh wrote:
 There is another side to this argument, though.  How many times have
 *you*  reviewed the source code of the software that you use on a daily
 basis?  Do you really*trust*  the code that you theoretically*can*
 review, but haven't actually reviewed?  Do you trust the code just
 because some random strangers on the internet say they've reviewed it
 and it looks OK?
This question misses the point. The point is not that you, personally, review every piece of code that you use. That is, if not completely impossible, at least highly impractical. The real point is that it is *possible* to review the code you use. You don't have to personally review it, so long as someone did.

I think the best example of how effective this capability is, is when it, supposedly, failed: OpenSSL and Heartbleed.

Recap: some really old code in OpenSSL had a vulnerability that could remotely expose secret keys from within the server. The model came under heavy criticism because it turned out that, despite the fact that OpenSSL is a highly used library, its code was so convoluted that nobody reviewed it.

The result: a massive overhaul effort, led by the OpenBSD team, which resulted in a compatible fork, called LibreSSL.

In other words, even when the "many eyes" assumption fails, the recovery is much faster than when the code is closed.

Shachar
Nov 02
parent "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Thu, Nov 02, 2017 at 11:38:21AM +0200, Shachar Shemesh via Digitalmars-d
wrote:
 On 02/11/17 07:13, H. S. Teoh wrote:
 There is another side to this argument, though.  How many times have
 *you* reviewed the source code of the software that you use on a
 daily basis?  Do you really *trust* the code that you theoretically
 *can* review, but haven't actually reviewed?  Do you trust the code
 just because some random strangers on the internet say they've
 reviewed it and it looks OK?
This question misses the point. The point is not that you, personally, review every piece of code that you use. That is, if not completely impossible, at least highly impractical. The real point is that it is *possible* to review the code you use. You don't have to personally review it, so long as someone did.
That only shifts one question to another, though: do you trust the "someone" who did review the code? That is what I mean by "some random strangers on the internet". When you download, say, glibc, whose authors you presumably never met and probably may not have heard of until that point, you're basically trusting that these authors have done their due diligence in reviewing the code and making sure it meets some standard of quality. But you *don't know* if they reviewed it or not, and even if they did, you don't know whether their standard of quality matches yours. After all, they are just some "random strangers on the internet" whom you've never met, and probably never heard of. Yet you're putting your trust in them to write proper software that will be running on your system.

Please keep in mind, I'm not saying that *in general*, you can't trust the upstream authors. But the issue here, which is also Thompson's point in his article, is: how do you know whether or not your trust is misplaced? You can't know for sure. At some level, you just have to stop and take it *on faith* that these "random online strangers" are giving you good code, because as you said, to go down the complete paranoia road is highly impractical, if not completely impossible.

But if you're going to put your trust in said random online strangers, what makes you think they are more trustworthy than some random anonymous employees of some big corporation, whose proprietary software you're running on your system? Again, you can't know *for sure*. At some point, it just comes down to trusting that they have done their jobs well, and without nefarious motives. So where you put your trust is a matter of faith, not fact, because you can't *objectively* be completely sure unless you go down the paranoia road to personally verifying everything, which is an infeasible, if not outright impossible, task.
 I think the best example of how effective this capability is is when
 it, supposedly, failed: OpenSSL and Heartbleed.
 
 Recap: some really old code in OpenSSL had a vulnerability that could
 remotely expose secret keys from within the server. The model came
 under heavy criticism because it turned out that despite the fact that
 OpenSSL is a highly used library, its code was so convoluted that 
 nobody reviewed it.
And that's the other thing about open source: sure, the code is available for everyone to read. But how many will actually understand it? If it's so convoluted, as you said, nobody will review it. Or if they did, you'd have less confidence whether they caught all of the problems.
 The result: a massive overhaul effort, led by the OpenBSD team, which
 resulted in a compatible fork, called LibreSSL.
 
 In other words, even when the "many eyes" assumption fails, the
 recovery is much faster than when the code is closed.
[...] Ahem. It took the LibreSSL folk *years* to clean up the original OpenSSL code and bring it up to equivalent functionality. That's hardly what I'd call "much faster".

Don't get me wrong; personally I agree with you that open source is better. All I'm saying is that this eventually boils down to a matter of opinion, because ultimately, you're trusting, on faith, that this way of doing things will produce better results. Does it actually? It's hard to say. I like to interpret the evidence as yes, and I think you do too, but I'm not 100% sure it's not just confirmation bias. It's hard to be sure because you can't know until you personally verify everything. But you can't do that, so eventually you have to just trust that it does what you think it does, and hope for the best. How will things pan out eventually? It's anyone's guess.

T

-- 
English is useful because it is a mess. Since English is a mess, it maps well onto the problem space, which is also a mess, which we call reality. Similarly, Perl was designed to be a mess, though in the nicest of all possible ways. -- Larry Wall
Nov 02
prev sibling parent reply Joakim <dlang joakim.fea.st> writes:
On Thursday, 2 November 2017 at 05:13:42 UTC, H. S. Teoh wrote:
 One thing is clear, though: claiming that Windows is "dead" is, 
 frankly, ridiculous.  Even a non-Windows person like me who 
 rarely has any reason to notice things Windows-related, can see 
 enough circumstantial evidence around me that Windows is still 
 very much alive and kicking.  (Even if in my ideal world there 
 would be no Windows... but then, if the world were my ideal, 
 90% of computer users out there would probably be very angry 
 about being forced to use obscure text-only interfaces that I'm 
 completely comfortable in.  So it's probably not a bad thing 
 the real world doesn't match my ideal one. :-D)
Congratulations, you've found a claim that literally nobody has made in this thread to be ridiculous. Next you'll say that Walter's claim that Java will replace COBOL is ridiculous, or Adam's claim that we should write a full crypto stack ourselves in D is a bad idea, both of which neither ever said.

On Friday, 3 November 2017 at 06:20:25 UTC, Tony wrote:
 On Wednesday, 1 November 2017 at 08:49:05 UTC, Joakim wrote:
 On Wednesday, 1 November 2017 at 00:16:19 UTC, Mengu wrote:
 On Monday, 30 October 2017 at 13:32:23 UTC, Joakim wrote:
 I don't know how intense your data analysis is, but I 
 replaced a Win7 ultrabook that had a dual-core i5 and 4 GBs 
 of RAM with an Android tablet that has a quad-core ARMv7 and 
 3 GBs of RAM as my daily driver a couple years ago, without 
 skipping a beat.
  I built large mixed C++/D codebases on my ultrabook, now I 
 do that on my Android/ARM tablet, which has a slightly 
 weaker chip than my smartphone.
How does the performance compare between an i5 laptop and an Android tablet?
My core i5 ultrabook died in late 2015, so I never ran any performance comparisons. I'd say that its 2012 Sandy Bridge dual-core i5 was likely a little faster to compile the same code than the 2014 quad-core Cortex-A15 I'm using in my tablet now. I've recently been trying out AArch64 support for D on a 2017 Android tablet which has one of the fastest quad-core ARMv8 chips from 2016, I'd guess that's faster than the i5. But this is all perception, I don't have measurements.
 Why do predictions about the future matter when at the 
 present Windows dominates the desktop and is also strong in 
 the server space?
Because that desktop market matters much less than it did before, see the current mobile dominance, yet the D core team still focuses only on that dying x86 market. As for the future, why spend time getting D great Windows IDE support if you don't think Windows has much of a future?
The concept that you are proposing, that people will get rid of ALL their desktops and laptops for phones or tablets, doesn't seem to be happening right now.
To begin with, I never said they'd "ALL" be replaced in the paragraph you're quoting above, but yes, that's essentially what will eventually happen. And of course it's happening right now, why do you think PC sales are down 25% over the last six years, after rising for decades? For many people, a PC was overkill but they didn't have a choice of another easier form factor and OS. Now they do.
 At this point, were they to do that, they would end up with a 
 machine that has less power in most cases (there are Atom and 
 Celeron laptops), and probably less memory and disk storage. 
 That solution would be most attractive to Chromebook type users 
 and very low end laptop users. And while people buy low spec 
 laptops and desktops, there are still many laptops and desktops 
 sold with chips that aren't named Atom and Celeron or arm. If 
 phones and tablets try to get chips as powerful as those for 
 the desktop and laptops they run into the chip maker's problem 
 - the more processing power, the more the electricity the chip 
 uses. Phones and tablets don't plug into the wall and they are 
 smaller than the batteries in laptops. And in order to use a 
 phone/tablet as a "lean forward" device (as opposed to "lean 
 back") and do work, they will have to spend money on a "laptop 
 shell" that will have a screen and keyboard and probably an 
 SSD/HD which will cancel most of the cost savings from not 
 buying a laptop.
You seem wholly ignorant of this market and the various points I've made in this thread. Do you know what the median Windows PC sold costs? Around $400. Now shop around, are you finding great high-spec devices at that price? The high-spec market that you focus on is a tiny niche, the bulk of the PC market is easily eclipsed by mobile performance, which is why people are already turning in their PCs for mobile. Battery life on mobile is already much better than laptops, for a variety of reasons including the greater efficiency of mobile ARM chips. And the Sentio laptop shell I already linked in this thread has a screen, keyboard, and battery but no SSD/HD, which is why it only costs $150, much less than a laptop.
 In the case of trying to court Android development, I read that 
 95% of Android is done on Java (and maybe other JVM languages 
 like the now "officially supported" Kotlin) and 5% in C or C++. 
 But that 5% is for applications that have a need for high 
 performance, which is mostly games. Good luck selling game 
 developers on using D to develop for Android, when you can't 
 supply those same game developers a top-notch development 
 environment for the premier platform for performance critical 
 games - Windows 64-bit.
I don't think the numbers favor Java quite so much, especially if you look at the top mobile apps, which are mostly games. I don't know what connection you think there is between the AAA Windows gaming market and mobile games, nobody runs Halo on their mobile device. btw, the mobile gaming market is now larger than the PC gaming market, so to think that they're sitting around using tools and IDEs optimized for that outdated PC platform is silly: https://www.digitaltrends.com/gaming/pc-market-grew-in-2016-led-by-mobile-and-pc-gaming/
 I have seen conflicting reports about what OS is bigger in 
 the server market, but Windows is substantial and the more 
 frequent winner.

 https://community.spiceworks.com/networking/articles/2462-server-virtualization-and-os-trends

 https://www.1and1.com/digitalguide/server/know-how/linux-vs-windows-the-big-server-check/
I have never seen any report that Windows is "bigger in the server market."
I linked one that said: "And what OSes are running in virtual machines and on physical servers around the world? It turns out like with client OSes, Microsoft is dominant. Fully 87.7% of the physical servers and VMs in the Spiceworks network (which are mostly on-premises) run Microsoft Windows Server."
 Last month's Netcraft survey notes,

 "which underlying operating systems are used by the world's 
 web facing computers?

 By far the most commonly used operating system is Linux, which 
 runs on more than two-thirds of all web-facing computers. This 
 month alone, the number of Linux computers increased by more 
 than 91,000; and again, this strong growth can largely be 
 attributed to cloud hosting providers, where Linux-based 
 instances are typically the cheapest and most commonly 
 available."
 https://news.netcraft.com/archives/2017/09/11/september-2017-web-server-survey.html
Web-facing server is a subset of servers. Shared web hosting services are probably a harder target for native-code applications than internal IT servers.
Web servers are a subset but by far the largest one, so any accounting of market share is going to be determined by them. Native code has been dying on the server regardless of web or internal servers, but the real distinction is performance. Facebook writes their backend in C++, the same for any server service that really needs to scale out, which is not likely to be internal IT.
 But regardless of whether Windows is dominant, or just widely 
 used, you haven't made predictions that Windows servers are 
 going to die.
I don't think about niche platforms that hardly anybody uses.
 Your first link is actually a bad sign for Windows, as it's 
 likely just because companies are trying to save money by 
 having their employees run Windows apps off a virtualized 
 Windows Server, rather than buying a ton more Windows PCs.
I would say that is an unlikely scenario. Companies use virtual machines for servers because it allows for the email server and/or http server and/or database server and/or application server to be on one physical machine, and allow for the system administrator to reboot the OS or take the server offline when making an upgrade/bug fix, and not affect the applications running on the other servers.
I see, so your claim is that process or software isolation is so weak on Windows Server that they run multiple virtualized instances of Windows Server just to provide it. Or maybe that Windows Server needs to be patched for security so often, that this helps a little with downtime. I doubt they are running many WinServer instances like you say, given how resource-heavy each Windows Server instance is going to be. But regardless of how you slice it, this isn't a good sign for Windows.
 Meanwhile, your second link sees "Linux maintaining a 
 noticeable lead" in the web-hosting market.
Don't know why I linked that as it doesn't even have a percentage breakdown. My intent was to show a web server breakdown but I will concede that Linux is bigger for web servers. However, Windows is still big and you aren't predicting it will die.
I've actually said elsewhere in this forum that the cloud server market is way overblown and will greatly diminish in the coming years because of greater p2p usage, so yeah, I think both linux and Windows on the server will largely die off.
 And if desktop OSes were going to go away, the MacOS would go 
 before Windows.
Oh, Apple wants that to happen, one less legacy OS to support, which is why all the Mac-heads are crying, because macOS doesn't get much attention nowadays. Do you know the last time Apple released a standalone desktop computer? 2014, when they last updated the Mac Mini. They haven't updated the Mac Pro since 2013.
Why do you think it is that they haven't come out with an iOS Mac Mini or iOS MacBook?
The Mac Mini is easy, they're just winding down that legacy form factor, like they did with the iPod for years. Their only entry in that market is Apple TV running tvOS, which is more iOS than macOS. As for the iOS Macbook, it's out, it's called the iPad Pro. Their CEO, Tim Cook, is always boasting about how it's all he uses these days: https://9to5mac.com/2012/02/14/tim-cook-ipad-80-90-of-tim-cooks-work-is-on-ipad-work-and-consumption/ http://appleinsider.com/articles/15/11/09/apple-ceo-tim-cook-says-he-travels-with-just-an-ipad-pro-and-iphone
 They see the writing on the wall, which is why they're 
 lengthening their release cycles for such legacy products.
Do they want them to go away, or do they see the handwriting on the wall? Given that they still make them, it appears that they don't want them to go away. They can stop making them at any time. And by them, I mean their entire macOS (i.e. their non-mobile) line. I think that the Mac Mini/Mac Pro pale in sales compared to the iMacs as far as Apple desktop sales go.
Simple, they see the writing on the wall, ie much smaller sales than mobile, so they want the legacy product to go away, which means they can focus on the much bigger mobile market. The only reason they still make them is to milk that market and support their legacy userbase, the same reason they were still selling the iPod Touch all these years after the iPhone came out. They don't break out iMac sales but given that it's much more expensive than the Mac Mini, it's doubtful that it sells better. I was only talking about Apple's standalone desktops because they're most comparable to the PC market, but it's true that PC/Mac all-in-ones like the iMac have done better lately, one of the few growing segments. But when the entire desktop/laptop market is shrinking and the much more expensive all-in-one sales are so small, that doesn't mean much.
 If you look at the graph in this article, the iPad has declined 
 more as a percentage of Apple revenue than the macOS line has 
 in the last five years.

 https://www.statista.com/statistics/382260/segments-share-revenue-of-apple/
I don't have access to that chart, but yes, the iPad and tablet markets have been shrinking. It's possible that more people would rather use their smartphone, which usually has a more powerful chip than Android tablets, with the Dex dock or a Sentio-like laptop shell than a tablet. But neither group is using a PC: both are mobile, smartphone even more so.
 There is a case to be made for supporting  Android/iOS 
 cross-compilation. But it doesn't have to come at the expense 
 of Windows 64-bit integration. Not sure they even involve the 
 same skillsets. Embarcadero and Remobjects both now support 
 Android/iOS development from their Windows (and macOS in the 
 case of Remobjects) IDEs.
You're right that some of the skills are different and D devs could develop for mobile from a Windows IDE. But my point was more about general investment and focus, the currently dominant platform, Android, needs it, while the fading platform, Windows, shouldn't get much more. Frankly, I find it tiresome that some Windows devs in this thread think the reason IDE support isn't better is because somebody is listening to me. More likely, Rainer or whoever would do that work is already invested in Windows, but doesn't have the time or interest to do much more. You'd be much better off finding that person and helping or sponsoring them rather than debating me, as I likely have no effect on that person's thinking. I wish it were otherwise, but I doubt it.
Nov 03
parent reply Tony <tonytdominguez aol.com> writes:
On Friday, 3 November 2017 at 09:16:42 UTC, Joakim wrote:
 Why do predictions about the future matter when at the 
 present Windows dominates the desktop and is also strong in 
 the server space?
Because that desktop market matters much less than it did before, see the current mobile dominance, yet the D core team still focuses only on that dying x86 market. As for the future, why spend time getting D great Windows IDE support if you don't think Windows has much of a future?
The concept that you are proposing, that people will get rid of ALL their desktops and laptops for phones or tablets, doesn't seem to be happening right now.
To begin with, I never said they'd "ALL" be replaced in the paragraph you're quoting above, but yes, that's essentially what will eventually happen.
You said 99% would go away. So "almost all".
 And of course it's happening right now, why do you think PC 
 sales are down 25% over the last six years, after rising for 
 decades?  For many people, a PC was overkill but they didn't 
 have a choice of another easier form factor and OS.  Now they 
 do.
There are other reasons for PC sales declining beyond someone just using a phone or a tablet. Some find their current PC fast enough and see no reason to upgrade as frequently as they did in the past - only a hard drive failure will trigger a PC upgrade for them. Some have cut down from a desktop and a laptop to just a laptop as the laptops got faster. Or a family replaces some combination of laptops and desktops with a combination of laptops/desktops/tablets/phones. That 25% is not indicative of 25% of homes getting rid of ALL of their PC/laptops.
 At this point, were they to do that, they would end up with a 
 machine that has less power in most cases (there are Atom and 
 Celeron laptops), and probably less memory and disk storage. 
 That solution would be most attractive to Chromebook type 
 users and very low end laptop users. And while people buy low 
 spec laptops and desktops, there are still many laptops and 
 desktops sold with chips that aren't named Atom and Celeron or 
 arm. If phones and tablets try to get chips as powerful as 
 those for the desktop and laptops they run into the chip 
 maker's problem - the more processing power, the more the 
 electricity the chip uses. Phones and tablets don't plug into 
 the wall and they are smaller than the batteries in laptops. 
 And in order to use a phone/tablet as a "lean forward" device 
 (as opposed to "lean back") and do work, they will have to 
 spend money on a "laptop shell" that will have a screen and 
 keyboard and probably an SSD/HD which will cancel most of the 
 cost savings from not buying a laptop.
You seem wholly ignorant of this market and the various points I've made in this thread. Do you know what the median Windows PC sold costs? Around $400. Now shop around, are you finding great high-spec devices at that price?
You said 99% are going away. You need to talk about a lot more than median prices. But nevertheless, $400 laptops have better specs and performance than $400 tablets and phones. And you are good to go with a laptop. People who want to go down to the coffee shop and work on their term paper on a laptop just take the laptop. People who want to go down to the coffee shop and work on their term paper on a phone or tablet, have to bring a keyboard and monitor (phone) or a keyboard and tablet stand and squint at their screen (tablet).
 The high-spec market that you focus on is a tiny niche, the 
 bulk of the PC market is easily eclipsed by mobile performance, 
 which is why people are already turning in their PCs for mobile.
I don't think that phones/tablets can compete performance-wise with $400 and up machines, which you claim is over 50% of the market.
 Battery life on mobile is already much better than laptops, for 
 a variety of reasons including the greater efficiency of mobile 
 ARM chips.
That is a common belief, but it is referred to as a myth in many places, including this article, which performed tests on different architectures and ends with: "An x86 chip can be more power efficient than an ARM processor, or vice versa, but it’ll be the result of other factors — not whether it’s x86 or ARM." https://www.extremetech.com/extreme/188396-the-final-isa-showdown-is-arm-x86-or-mips-intrinsically-more-power-efficient/3
 And the Sentio laptop shell I already linked in this thread has 
 a screen, keyboard, and battery but no SSD/HD, which is why it 
 only costs $150, much less than a laptop.
I see that 11.6" screen setup with the small storage of a phone as competition for $150 Chromebooks, not $400 Windows laptops. I would prefer to be on my Chromebook and take a call on my cell phone, rather than having my cellphone plugged into a docking station and have to unplug it or put it on speaker phone.
 In the case of trying to court Android development, I read 
 that 95% of Android is done on Java (and maybe other JVM 
 languages like the now "officially supported" Kotlin) and 5% 
 in C or C++. But that 5% is for applications that have a need 
 for high performance, which is mostly games. Good luck selling 
 game developers on using D to develop for Android, when you 
 can't supply those same game developers a top-notch 
 development environment for the premier platform for 
 performance critical games - Windows 64-bit.
I don't think the numbers favor Java quite so much, especially if you look at the top mobile apps, which are mostly games. I don't know what connection you think there is between the AAA Windows gaming market and mobile games, nobody runs Halo on their mobile device.
I am assuming that game developers work in both spaces, if not concurrently, they move between the two. It also may be incorrect to assume that D would be acceptable in its current incarnation for game development due to the non-deterministic activity of the garbage collector. In which case, it would have little rationale for Android development. As far as iOS, there are two native code languages with a large lead, and both use Automatic Reference Counting, rather than garbage collection which would presumably give them the advantage for games. But D could potentially compete for non-game development.
 btw, the mobile gaming market is now larger than the PC gaming 
 market, so to think that they're sitting around using tools and 
 IDEs optimized for that outdated PC platform is silly:

 https://www.digitaltrends.com/gaming/pc-market-grew-in-2016-led-by-mobile-and-pc-gaming/
Are you suggesting they are developing their games for iOS and Android devices ON those devices? Apple has Xcode for developing iOS apps and it runs on macOS machines only. There is also the Xamarin IDE or IDE plug-in from Microsoft that allows C# on iOS, but it runs on macOS or Windows. For Android, there is Android Studio - "The Official IDE of Android" - which runs on Windows, macOS and Linux. There is no Android version.
 But regardless of whether Windows is dominant, or just widely 
 used, you haven't made predictions that Windows servers are 
 going to die.
I don't think about niche platforms that hardly anybody uses.
It is the dominant internal IT platform. That is not niche and not something that is "hardly used". But what you could say is that given your prediction that Windows sales will decline by 99%, Microsoft will go out of business.
 Your first link is actually a bad sign for Windows, as it's 
 likely just because companies are trying to save money by 
 having their employees run Windows apps off a virtualized 
 Windows Server, rather than buying a ton more Windows PCs.
I would say that is an unlikely scenario. Companies use virtual machines for servers because it allows the email server and/or http server and/or database server and/or application server to be on one physical machine, and it allows the system administrator to reboot the OS or take a server offline when making an upgrade/bug fix without affecting the applications running on the other servers.
I see, so your claim is that process or software isolation is so weak on Windows Server that they run multiple virtualized instances of Windows Server just to provide it. Or maybe that Windows Server needs to be patched for security so often, that this helps a little with downtime. I doubt they are running many WinServer instances like you say, given how resource-heavy each Windows Server instance is going to be. But regardless of how you slice it, this isn't a good sign for Windows.
They use virtualization for Linux for the same reason I stated - so the application/http/email/database server can be on an OS that can be rebooted to complete upgrades or a VM can be used as an isolated "sandbox" for testing upgrades of a particular server or some in-house developed software.
 And if desktop OSes were going to go away, the MacOS would 
 go before Windows.
Oh, Apple wants that to happen, one less legacy OS to support, which is why all the Mac-heads are crying, because macOS doesn't get much attention nowadays. Do you know the last time Apple released a standalone desktop computer? 2014, when they last updated the Mac Mini. They haven't updated the Mac Pro since 2013.
Why do you think it is that they haven't come out with an iOS Mac Mini or iOS MacBook?
The Mac Mini is easy, they're just winding down that legacy form factor, like they did with the iPod for years. Their only entry in that market is Apple TV running tvOS, which is more iOS than macOS. As for the iOS Macbook, it's out, it's called the iPad Pro. Their CEO, Tim Cook, is always boasting about how it's all he uses these days: https://9to5mac.com/2012/02/14/tim-cook-ipad-80-90-of-tim-cooks-work-is-on-ipad-work-and-consumption/ http://appleinsider.com/articles/15/11/09/apple-ceo-tim-cook-says-he-travels-with-just-an-ipad-pro-and-iphone
A CEO is a baby user of a PC. What would he do besides email? He has people to do his powerpoint and documents. Not a good endorsement. And the iPad Pro is twice the price of what you say is the average price of a PC laptop. You could buy a Windows laptop and an Android Zenpad tablet and still have paid less than an iPad. I'd like to be there when Cook tells all Apple employees they need to turn in their MacBooks for iPads.
 They see the writing on the wall, which is why they're 
 lengthening their release cycles for such legacy products.
Do they want them to go away, or do they see the handwriting on the wall? Given that they still make them, it appears that they don't want them to go away. They can stop making them at any time. And by them, I mean their entire macOS (i.e. their non-mobile) line. I think that the Mac Mini/Mac Pro pale in sales compared to the iMacs as far as Apple desktop sales go.
Simple, they see the writing on the wall, ie much smaller sales than mobile, so they want the legacy product to go away, which means they can focus on the much bigger mobile market. The only reason they still make them is to milk that market and support their legacy userbase, the same reason they were still selling the iPod Touch all these years after the iPhone came out.
Why did they fund development of a new iMac Pro which is coming this December as well as the new MacBook Pros that came out this June? That's a contradiction of "milk it like an iPod".
Nov 03
parent reply Joakim <dlang joakim.fea.st> writes:
On Friday, 3 November 2017 at 11:57:58 UTC, Tony wrote:
 On Friday, 3 November 2017 at 09:16:42 UTC, Joakim wrote:
 Why do predictions about the future matter when at the 
 present Windows dominates the desktop and is also strong in 
 the server space?
Because that desktop market matters much less than it did before (see the current mobile dominance), yet the D core team still focuses only on that dying x86 market. As for the future, why spend time getting D great Windows IDE support if you don't think Windows has much of a future?
The concept that you are proposing, that people will get rid of ALL their desktops and laptops for phones or tablets, doesn't seem to be happening right now.
To begin with, I never said they'd "ALL" be replaced in the paragraph you're quoting above, but yes, that's essentially what will eventually happen.
You said 99% would go away. So "almost all".
Yes, I was simply noting that I didn't say it "in the paragraph you're quoting above."
 And of course it's happening right now, why do you think PC 
 sales are down 25% over the last six years, after rising for 
 decades?  For many people, a PC was overkill but they didn't 
 have a choice of another easier form factor and OS.  Now they 
 do.
There are other reasons for PC sales declining beyond someone just using a phone or a tablet. Some find their current PC fast enough and see no reason to upgrade as frequently as they did in the past - only a hard drive failure will trigger a PC upgrade for them. Some have cut down from a desktop and a laptop to just a laptop as the laptops got faster. Or a family replaces some combination of laptops and desktops with a combination of laptops/desktops/tablets/phones. That 25% is not indicative of 25% of homes getting rid of ALL of their PC/laptops.
Sure, there are multiple reasons that PC sales are declining and many homes still keep a residual PC to get their work done. With the DeX dock and Sentio shell coming out this year, my prediction is that those residual PCs will get swept out over the coming 5-10 years. But that established PC userbase shrinking is not what you should be worried about. I've talked to multiple middle-class consumers in developing markets- they would be considered poor in the US if you converted their income to dollars- who tell me that they recently got their first smartphone for $150-200 and that it is the first time they ever used the internet, with cheap 3G/4G plans that are only now springing up. They don't use the web, only mobile chat or social apps. Now, do you think these billions of new users of computing and the internet are more likely to buy a cheap laptop shell or dock for their smartphone when they someday need to do some "lean forward" work, as you call it, or spend much more on a Windows PC? I know where my bet is.
 At this point, were they to do that, they would end up with a 
 machine that has less power in most cases (there are Atom and 
 Celeron laptops), and probably less memory and disk storage. 
 That solution would be most attractive to Chromebook type 
 users and very low end laptop users. And while people buy low 
 spec laptops and desktops, there are still many laptops and 
 desktops sold with chips that aren't named Atom and Celeron 
 or arm. If phones and tablets try to get chips as powerful as 
 those for the desktop and laptops they run into the chip 
 maker's problem - the more processing power, the more the 
 electricity the chip uses. Phones and tablets don't plug into 
 the wall and they are smaller than the batteries in laptops. 
 And in order to use a phone/tablet as a "lean forward" device 
 (as opposed to "lean back") and do work, they will have to 
 spend money on a "laptop shell" that will have a screen and 
 keyboard and probably an SSD/HD which will cancel most of the 
 cost savings from not buying a laptop.
You seem wholly ignorant of this market and the various points I've made in this thread. Do you know what the median Windows PC sold costs? Around $400. Now shop around, are you finding great high-spec devices at that price?
You said 99% are going away. You need to talk about a lot more than median prices. But nevertheless, $400 laptops have better specs and performance than $400 tablets and phones. And you are good to go with a laptop. People who want to go down to the coffee shop and work on their term paper on a laptop just take the laptop. People who want to go down to the coffee shop and work on their term paper on a phone or tablet, have to bring a keyboard and monitor (phone) or a keyboard and tablet stand and squint at their screen (tablet).
No, they'll bring a Sentio-like laptop shell, which only costs $150. Your performance or portability arguments for PCs are losers, that's not affecting this mobile trend at all. The biggest issue is that productivity apps have historically been developed for desktop OS's and are only starting to be ported over to or cloned on mobile, like Office Mobile or Photoshop Express.
 The high-spec market that you focus on is a tiny niche, the 
 bulk of the PC market is easily eclipsed by mobile 
 performance, which is why people are already turning in their 
 PCs for mobile.
I don't think that phones/tablets can compete performance-wise with $400 and up machines, which you claim is over 50% of the market.
$400 PCs are vastly over-specced for most of their owners; they won't even use most of the compute headroom on a $200 smartphone, which is why they're already shifting. The only issues holding the remaining 75% back are the need for mobile work accessories like Dex/Sentio and some PC-only apps, both of which are changing this year.
 Battery life on mobile is already much better than laptops, 
 for a variety of reasons including the greater efficiency of 
 mobile ARM chips.
That is a common belief, but it is referred to as a myth in many places, including this article, which performed tests on different architectures and ends with: "An x86 chip can be more power efficient than an ARM processor, or vice versa, but it’ll be the result of other factors — not whether it’s x86 or ARM." https://www.extremetech.com/extreme/188396-the-final-isa-showdown-is-arm-x86-or-mips-intrinsically-more-power-efficient/3
I'm not making theoretical comparisons about RISC versus CISC, but actual power and battery life measurements where mobile ARM devices like the iPad Pro come out way ahead of equivalent x86 PCs like the Surface Pro 4 (scroll down to the sections on Energy Management): https://www.notebookcheck.net/Apple-iPad-Pro-Tablet-Review.156404.0.html https://www.notebookcheck.net/Apple-iPad-Pro-10-5-Tablet-Review.228714.0.html Now, I initially said that ARM efficiency is only one factor in greater battery life; no doubt iOS is much more optimized for battery life than Windows. But all benchmarks pretty much find the same results for just ARM chips. I'm not interested in theories about how CISC x86 could be just as good if Intel just tried harder, especially since they raised the white flag and exited the mobile smartphone/tablet market: https://www.recode.net/2016/5/2/11634168/intel-10-billion-on-mobile-before-giving-up
 And the Sentio laptop shell I already linked in this thread 
 has a screen, keyboard, and battery but no SSD/HD, which is 
 why it only costs $150, much less than a laptop.
I see that 11.6" screen setup with the small storage of a phone as competition for $150 Chromebooks, not $400 Windows laptops. I would prefer to be on my Chromebook and take a call on my cell phone, rather than having my cellphone plugged into a docking station and have to unplug it or put it on speaker phone.
I don't know why you're so obsessed with storage when even midrange smartphones come with 32 GBs nowadays, expandable to much more with an SD card. My tablet has only 16 GBs of storage, with only 10-12 actually accessible, but I've never had a problem building codebases that take up GBs of space with all the object files, alongside a 64 GB microSD card holding many TV shows and movies, mostly HD. You're right that taking calls while using the smartphone to get work done could be a pain for some, but I don't see that being a big issue. Maybe those people will start carrying around cheap $10-20 bluetooth handsets to take calls when their smartphone is tied up doing work, like some rich Chinese supposedly do with their phablets: ;) https://www.theverge.com/2013/1/25/3915700/htc-mini-tiny-phone-companion-for-your-oversized-smartphone
 In the case of trying to court Android development, I read 
 that 95% of Android is done on Java (and maybe other JVM 
 languages like the now "officially supported" Kotlin) and 5% 
 in C or C++. But that 5% is for applications that have a need 
 for high performance, which is mostly games. Good luck 
 selling game developers on using D to develop for Android, 
 when you can't supply those same game developers a top-notch 
 development environment for the premier platform for 
 performance critical games - Windows 64-bit.
I don't think the numbers favor Java quite so much, especially if you look at the top mobile apps, which are mostly games. I don't know what connection you think there is between the AAA Windows gaming market and mobile games, nobody runs Halo on their mobile device.
I am assuming that game developers work in both spaces, if not concurrently, they move between the two.
I think the overlap is much less than you seem to think.
 It also may be incorrect to assume that D  would be acceptable 
 in its current incarnation for game development due to the 
 non-deterministic activity of the garbage collector. In which 
 case, it would have little rationale for Android development. 
 As far as iOS, there are two native code languages with a large 
 lead, and both use Automatic Reference Counting, rather than 
 garbage collection which would presumably give them 
 the advantage for games. But D could potentially compete for 
 non-game development.
Yeah, I already went over some of this in the other dlang forum thread about mobile that I linked initially. Most mobile games would do better if written in D, but we don't yet have the D mobile libraries needed to make that easy on them.
 btw, the mobile gaming market is now larger than the PC gaming 
 market, so to think that they're sitting around using tools 
 and IDEs optimized for that outdated PC platform is silly:

 https://www.digitaltrends.com/gaming/pc-market-grew-in-2016-led-by-mobile-and-pc-gaming/
Are you suggesting they are developing their games for iOS and Android devices ON those devices? Apple has Xcode for developing iOS apps and it runs on macOS machines only. There is also the Xamarin IDE or IDE plug-in from Microsoft that allows C# on iOS, but it runs on macOS or Windows. For Android, there is Android Studio - "The Official IDE of Android" - which runs on Windows, macOS and Linux. There is no Android version.
Yes, of course they're still largely developing mobile games on PCs, though I'm not sure why you think that matters. But your original claim was that they're still using PC-focused IDEs, as opposed to new mobile-focused IDEs like Xcode or Android Studio, which you now highlight. I don't use any IDEs, so I honestly don't care which ones D supports, but my point was that mobile game devs don't need to use outdated PC-focused tools when mobile is a bigger business and they have their own mobile-focused tools nowadays.
 But regardless of whether Windows is dominant, or just widely 
 used, you haven't made predictions that Windows servers are 
 going to die.
I don't think about niche platforms that hardly anybody uses.
It is the dominant internal IT platform. That is not niche and not something that is "hardly used". But what you could say is that given your prediction that Windows sales will decline by 99%, Microsoft will go out of business.
Yes, Windows is dominant, dominant in a niche, internal IT. The consumer mobile market is much larger nowadays, and Windows has almost no market share there. As for Microsoft, Windows is not their only product, they have moved Office onto the dominant mobile platforms. As long as they keep supporting mobile, they could eke out an existence. Their big bet on Azure is going to end badly though.
 Your first link is actually a bad sign for Windows, as it's 
 likely just because companies are trying to save money by 
 having their employees run Windows apps off a virtualized 
 Windows Server, rather than buying a ton more Windows PCs.
I would say that is an unlikely scenario. Companies use virtual machines for servers because it allows the email server and/or http server and/or database server and/or application server to be on one physical machine, and it allows the system administrator to reboot the OS or take a server offline when making an upgrade/bug fix without affecting the applications running on the other servers.
I see, so your claim is that process or software isolation is so weak on Windows Server that they run multiple virtualized instances of Windows Server just to provide it. Or maybe that Windows Server needs to be patched for security so often, that this helps a little with downtime. I doubt they are running many WinServer instances like you say, given how resource-heavy each Windows Server instance is going to be. But regardless of how you slice it, this isn't a good sign for Windows.
They use virtualization for Linux for the same reason I stated - so the application/http/email/database server can be on an OS that can be rebooted to complete upgrades or a VM can be used as an isolated "sandbox" for testing upgrades of a particular server or some in-house developed software.
It seems containerization is taking off more on Linux now for such things, though Windows is trying to get into this too, following far behind as always.
 And if desktop OSes were going to go away, the MacOS would 
 go before Windows.
Oh, Apple wants that to happen, one less legacy OS to support, which is why all the Mac-heads are crying, because macOS doesn't get much attention nowadays. Do you know the last time Apple released a standalone desktop computer? 2014, when they last updated the Mac Mini. They haven't updated the Mac Pro since 2013.
Why do you think it is that they haven't come out with an iOS Mac Mini or iOS MacBook?
The Mac Mini is easy, they're just winding down that legacy form factor, like they did with the iPod for years. Their only entry in that market is Apple TV running tvOS, which is more iOS than macOS. As for the iOS Macbook, it's out, it's called the iPad Pro. Their CEO, Tim Cook, is always boasting about how it's all he uses these days: https://9to5mac.com/2012/02/14/tim-cook-ipad-80-90-of-tim-cooks-work-is-on-ipad-work-and-consumption/ http://appleinsider.com/articles/15/11/09/apple-ceo-tim-cook-says-he-travels-with-just-an-ipad-pro-and-iphone
A CEO is a baby user of a PC. What would he do besides email? He has people to do his powerpoint and documents. Not a good endorsement. And the iPad Pro is twice the price of what you say is the average price of a PC laptop. You could buy a Windows laptop and an Android Zenpad tablet and still have paid less than an iPad.
Sure, are you saying you can't do powerpoint and docs well on an iPad Pro or smartphone/Sentio though? The iPad Pro aims for the high end of this PC-replacing mobile market, with its extremely powerful Apple-designed chip, while a $150 laptop shell combined with the smartphone you already have aims for the low end. That basically leaves no space for a PC, once all the software is ported over.
 I'd like to be there when Cook tells all Apple employees they 
 need to turn in their MacBooks for iPads.
Heh, most would likely rejoice by then. :)
 They see the writing on the wall, which is why they're 
 lengthening their release cycles for such legacy products.
Do they want them to go away, or do they see the handwriting on the wall? Given that they still make them, it appears that they don't want them to go away. They can stop making them at any time. And by them, I mean their entire macOS (i.e. their non-mobile) line. I think that the Mac Mini/Mac Pro pale in sales compared to the iMacs as far as Apple desktop sales go.
Simple, they see the writing on the wall, ie much smaller sales than mobile, so they want the legacy product to go away, which means they can focus on the much bigger mobile market. The only reason they still make them is to milk that market and support their legacy userbase, the same reason they were still selling the iPod Touch all these years after the iPhone came out.
Why did they fund development of a new iMac Pro which is coming this December as well as the new MacBook Pros that came out this June? That's a contradiction of "milk it like an iPod".
Because their userbase was rebelling? I take it you're not that familiar with Mac users, but they were genuinely scared that Apple was leaving them behind, since they weren't refreshing Mac and Macbooks much anymore and all Apple's focus is on iOS: "more and more people point to the current Mac Pro’s stagnation as proof that Apple is abandoning the Mac Pro market." https://daringfireball.net/2017/04/the_mac_pro_lives Apple threw them a bone, because they're long-time users who likely all buy iPhones and iPads too. Pretty soon, there will be so few of these Mac laggards, just like iPod users, that they will stop doing so.
Nov 03
next sibling parent reply jmh530 <john.michael.hall gmail.com> writes:
On Friday, 3 November 2017 at 14:12:56 UTC, Joakim wrote:
 [snip]

 But that established PC userbase shrinking is not what you 
 should be worried about.  I've talked to multiple middle-class 
 consumers in developing markets- they would be considered poor 
 in the US if you converted their income to dollars- who tell me 
 that they recently got their first smartphone for $150-200 and 
 that it is the first time they ever used the internet, with 
 cheap 3G/4G plans that are only now springing up.  They don't 
 use the web, only mobile chat or social apps.

 Now, do you think these billions of new users of computing and 
 the internet are more likely to buy a cheap laptop shell or 
 dock for their smartphone when they someday need to do some 
 "lean forward" work, as you call it, or spend much more on a 
 Windows PC?  I know where my bet is.
It's pretty clear from this and some of the other posts that your primary focus is computer users. The work you've done in getting LDC to compile programs for Android is a good example. You want to be able to compile D programs that go on a smart phone because that's where the growth of computer users is coming from. I get that. 100%. I think a source of pushback on the Windows subject is that programmers are a mere subset of all computer users. Maybe the billions might buy a cheap laptop shell or dock, but that doesn't mean they will be programmers. Thus, it's good to be able to compile programs for that platform, but it doesn't mean that work done to improve the experience of programmers on other platforms is a waste of time.
Nov 03
next sibling parent reply Joakim <dlang joakim.fea.st> writes:
On Friday, 3 November 2017 at 14:29:27 UTC, jmh530 wrote:
 On Friday, 3 November 2017 at 14:12:56 UTC, Joakim wrote:
 [snip]

 But that established PC userbase shrinking is not what you 
 should be worried about.  I've talked to multiple middle-class 
 consumers in developing markets- they would be considered poor 
 in the US if you converted their income to dollars- who tell 
 me that they recently got their first smartphone for $150-200 
 and that it is the first time they ever used the internet, 
 with cheap 3G/4G plans that are only now springing up.  They 
 don't use the web, only mobile chat or social apps.

 Now, do you think these billions of new users of computing and 
 the internet are more likely to buy a cheap laptop shell or 
 dock for their smartphone when they someday need to do some 
 "lean forward" work, as you call it, or spend much more on a 
 Windows PC?  I know where my bet is.
It's pretty clear from this and some of the other posts that your primary focus is computer users. The work you've done in getting LDC to compile programs for Android is a good example. You want to be able to compile D programs that go on a smart phone because that's where the growth of computer users is coming from. I get that. 100%.
Yes, D should aim for the largest platforms first- that includes Android, iOS, and Windows- because that's where programmers want to use D to create software for the most users.
 I think a source of pushback on the Windows subject is that 
 programmers are a mere subset of all computer users. Maybe the 
 billions might buy a cheap laptop shell or dock, but that 
 doesn't mean they will be programmers. Thus, it's good to be 
 able to compile programs for that platform, but it doesn't mean 
 that work done to improve the experience of programmers on 
 other platforms is a waste of time.
Of course those mobile users will be programmers too; why do you think I've built ldc to be used _on_ Android itself? http://forum.dlang.org/thread/antajtnvmavswjvcdoyq forum.dlang.org Most programmers will one day be coding on mobile devices, though I admit I'm in a small, early-adopting minority now: http://bergie.iki.fi/blog/six-weeks-working-android/ For the majority of devs still using PCs to write code, my point was that it's better to invest in improving the experience with D for those targeting mobile than to put more marginal effort into making VisualD and other tools targeting Windows even better, because of the different sizes and trajectories of those OS platforms.
Nov 03
parent reply 12345swordy <alexanderheistermann gmail.com> writes:
On Friday, 3 November 2017 at 17:25:26 UTC, Joakim wrote:
 Most programmers will one day be coding on mobile devices, 
 though I admit I'm in a small, early-adopting minority now:

 http://bergie.iki.fi/blog/six-weeks-working-android/
A blog post is not evidence that the majority of programmers will be coding on mobile devices.
Nov 03
parent reply Joakim <dlang joakim.fea.st> writes:
On Friday, 3 November 2017 at 18:08:54 UTC, 12345swordy wrote:
 On Friday, 3 November 2017 at 17:25:26 UTC, Joakim wrote:
 Most programmers will one day be coding on mobile devices, 
 though I admit I'm in a small, early-adopting minority now:

 http://bergie.iki.fi/blog/six-weeks-working-android/
A blog post is not evidence that the majority of programmers will be coding on mobile devices.
Yes, but it is evidence of what I said, that "I'm in a small, early-adopting minority now." I don't know how you expect evidence for something that _will_ happen, it's a prediction I'm making, though based on current, rising trends like all those in this feed: https://mobile.twitter.com/termux
Nov 03
next sibling parent reply Craig Dillabaugh <craig.dillabaugh gmail.com> writes:
On Friday, 3 November 2017 at 18:26:54 UTC, Joakim wrote:
 On Friday, 3 November 2017 at 18:08:54 UTC, 12345swordy wrote:
 On Friday, 3 November 2017 at 17:25:26 UTC, Joakim wrote:
 Most programmers will one day be coding on mobile devices, 
 though I admit I'm in a small, early-adopting minority now:

 http://bergie.iki.fi/blog/six-weeks-working-android/
A blog post is not evidence that the majority of programmers will be coding on mobile devices.
Yes, but it is evidence of what I said, that "I'm in a small, early-adopting minority now." I don't know how you expect evidence for something that _will_ happen, it's a prediction I'm making, though based on current, rising trends like all those in this feed: https://mobile.twitter.com/termux
I don't really care if the device crunching the numbers is a smartphone or a mainframe as long as it is fast enough and:

1) I can do my work with a regular-size keyboard and large monitor.
2) I can use whatever applications I want, be it a CLI or some GUI app.
3) I can install/execute VMs on my device of choice without running out of memory.
4) My data isn't monitored, controlled, owned, or data-mined by some large corporation.
5) I can easily move my data, etc. to another device if I decide to.
6) I can use it to play any DVDs that I own (don't have a TV).
7) I can't easily lose my computing device :o)

How far off do you think mobile devices are from providing this type of experience, or are they already there in your mind? What about #7?
Nov 03
parent reply Joakim <dlang joakim.fea.st> writes:
On Friday, 3 November 2017 at 19:23:51 UTC, Craig Dillabaugh 
wrote:
 On Friday, 3 November 2017 at 18:26:54 UTC, Joakim wrote:
 On Friday, 3 November 2017 at 18:08:54 UTC, 12345swordy wrote:
 On Friday, 3 November 2017 at 17:25:26 UTC, Joakim wrote:
 Most programmers will one day be coding on mobile devices, 
 though I admit I'm in a small, early-adopting minority now:

 http://bergie.iki.fi/blog/six-weeks-working-android/
A blog post is not evidence that the majority of programmers will be coding on mobile devices.
Yes, but it is evidence of what I said, that "I'm in a small, early-adopting minority now." I don't know how you expect evidence for something that _will_ happen, it's a prediction I'm making, though based on current, rising trends like all those in this feed: https://mobile.twitter.com/termux
I don't really care if the device crunching the numbers is a smartphone or a mainframe as long as it is fast enough and: 1) I can do my work with a regular size keyboard and large monitor.
Check, most mobile devices these days support some form of interfacing with monitors and keyboards.
 2) I can use whatever applications I want be it a CLI or some 
 GUI app.
Depends on precisely what those apps are, ie Office Mobile and Photoshop Express are available on Android, but I'm sure some obscure Win32 CAD app isn't.
 3) I can install/execute VMs on my device of choice without 
 running
 out of memory.
No, only early Qemu support for now, VMs have not really come to mobile yet.
 4) My data isn't monitored, controlled, owned, or data-mined by 
 some large corporation.
Check, especially if you know what you're doing.
 5) I can easily move my data, etc. to another device if I 
 decide to.
Check, mobile devices usually support such transfer better than PCs.
 6) I can use it to play any DVD's that I own (don't have a TV).
Hmm, that is a niche use case these days, guessing no. I don't think I've handled a DVD in more than a decade, like most people, so I'm not sure this matters. However, I just watched a HD movie on my tablet last night, and I find it to be a more engaging experience than any TV. Something about having the screen right in front of you, it's more immersive, particularly if your tablet has decent speakers (though I always care about video more than audio, so don't need the big sound system that people usually hook up to their TVs). I haven't owned a TV for more than a decade, though people I've lived with have usually had one, that I almost never watched.
 7) I can't easily lose my computing device :o)
Obviously any "mobile" device, whether a smartphone or a laptop, is more easily lost than a desktop you keep at home. I've never lost one, but I don't move around that much. I'll say Check, since you can enable device trackers and proximity warnings to help you with this.
 How far off do you think mobile devices are off providing this 
 type of experience, or are they already there in your mind?  
 What about #7.
They're getting there, but not as general-purpose as a PC yet.
Nov 03
parent reply jmh530 <john.michael.hall gmail.com> writes:
On Friday, 3 November 2017 at 20:05:52 UTC, Joakim wrote:
 5) I can easily move my data, etc. to another device if I 
 decide to.
Check, mobile devices usually support such transfer better than PCs.
"mobile devices" meaning Android devices. I can't stick a USB flash drive in an iPad.
Nov 03
parent reply Joakim <dlang joakim.fea.st> writes:
On Friday, 3 November 2017 at 20:36:57 UTC, jmh530 wrote:
 On Friday, 3 November 2017 at 20:05:52 UTC, Joakim wrote:
 5) I can easily move my data, etc. to another device if I 
 decide to.
Check, mobile devices usually support such transfer better than PCs.
"mobile devices" meaning Android devices. I can't stick a USB flash drive in an iPad.
Sure you can, with the right adapter: https://www.lifewire.com/how-to-connect-usb-devices-to-ipad-1999862 I routinely transfer HD video from my Android devices to a couple TB external slim HDs.
Nov 03
parent jmh530 <john.michael.hall gmail.com> writes:
On Friday, 3 November 2017 at 21:33:19 UTC, Joakim wrote:
 Sure you can, with the right adapter:

 https://www.lifewire.com/how-to-connect-usb-devices-to-ipad-1999862

 I routinely transfer HD video from my Android devices to a 
 couple TB external slim HDs.
I'm not disputing Android. Just saying that copying a photo or PowerPoint presentation or something from an iPhone/iPad directly to a USB drive isn't the easiest thing in the world. I haven't tried it, but apparently there are wifi USB drives that'll do it (maybe not the photo, because Apple's a little funky about them).
Nov 03
prev sibling parent reply Computermatronic <computermatronic gmail.com> writes:
On Friday, 3 November 2017 at 18:26:54 UTC, Joakim wrote:
 On Friday, 3 November 2017 at 18:08:54 UTC, 12345swordy wrote:
 On Friday, 3 November 2017 at 17:25:26 UTC, Joakim wrote:
 Most programmers will one day be coding on mobile devices, 
 though I admit I'm in a small, early-adopting minority now:

 http://bergie.iki.fi/blog/six-weeks-working-android/
A blog post is not evidence that the majority of programmers will be coding on mobile devices.
Yes, but it is evidence of what I said, that "I'm in a small, early-adopting minority now." I don't know how you expect evidence for something that _will_ happen, it's a prediction I'm making, though based on current, rising trends like all those in this feed: https://mobile.twitter.com/termux
Can we please get back on topic? Whether or not Windows is 'dying' is irrelevant, since it is not going to die out as a development platform for at least the next 5 years. I, like many other Windows users, want to be able to compile 64-bit binaries on Windows without having to download and install the bloated, time-consuming Visual Studio. I do most of my programming in Sublime Text, and frequently re-install Windows. This may not be the case for many Windows users of D, but clearly many Windows users of D would like to be able to compile x64 out of the box.
Nov 03
next sibling parent codephantom <me noyb.com> writes:
On Saturday, 4 November 2017 at 02:33:35 UTC, Computermatronic 
wrote:
 Can we please get back on topic please?
Umm... we haven't been 'on topic' since about 210 threads ago ;-) When...Adam decided to claim that "..the few that don't will have little trouble understanding why they need it [VS+WinSDK] and acquiring it." And a little after that one, claiming "the attitudes around here towards Windows devs can be more than a little snobbish.". And then a little after that one, implying that installing VS is less of a hassle for a new user to D than installing Xcode, because at least with VS2017 you can pick and choose. Not that I blame Adam for anything ;-) except...that his comments were not as well received as he might have expected ;-) If all this cognitive effort were instead going into writing some code, imagine what could have been achieved by now ;-)
Nov 03
prev sibling next sibling parent reply Joakim <dlang joakim.fea.st> writes:
On Saturday, 4 November 2017 at 02:33:35 UTC, Computermatronic 
wrote:
 On Friday, 3 November 2017 at 18:26:54 UTC, Joakim wrote:
 On Friday, 3 November 2017 at 18:08:54 UTC, 12345swordy wrote:
 On Friday, 3 November 2017 at 17:25:26 UTC, Joakim wrote:
 Most programmers will one day be coding on mobile devices, 
 though I admit I'm in a small, early-adopting minority now:

 http://bergie.iki.fi/blog/six-weeks-working-android/
A blog post is not evidence that the majority of programmers will be coding on mobile devices.
Yes, but it is evidence of what I said, that "I'm in a small, early-adopting minority now." I don't know how you expect evidence for something that _will_ happen, it's a prediction I'm making, though based on current, rising trends like all those in this feed: https://mobile.twitter.com/termux
Can we please get back on topic please?
Yes, it is as simple as changing the topic up top back to the original, like I have now and you didn't, and discussing something else. You don't have to read messages that were marked as OT, like mine were; nobody's making you.
 Whether or not windows is 'dying' is irrelevant, since it is 
 not going to die out as a development platform for at least the 
 next 5 years.

 I, like many other windows users, want to be able to compile 
 64bit binaries in windows, without having to download and 
 install the bloated and time consuming to download and install 
 Visual Studio.

 I do most of my programming in Sublime Text, and frequently 
 re-install windows. This may not be the case for many windows 
 users of D, but clearly many windows users of D would like to 
 be able to compile x64 out of the box.
I was intrigued by someone saying in this thread that Go supports Win64 COFF out of the box, so I just tried it out in wine and indeed it works with their hello world example.  Running "go build -x" shows that they ship a link.exe for Win64 with their Win64 zip; I guess it's the MinGW one? If you want something similar for the D compiler packages for Win64, I suggest you file a bugzilla issue, as that's where the core team and other D devs look for stuff to do: https://issues.dlang.org The more info you have about the linker Go is using, the better. Best if you just submit a pull request for dmd or its installer, making it use this other linker so that VS is not needed: https://github.com/dlang/dmd/pulls https://github.com/dlang/installer/pulls D is a community effort, pitch in to make the things you want happen.
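(For anyone exploring such a pull request: on Windows, dmd reads its linker command from sc.ini, so a sketch of where an alternate-linker override would live is below. The [Environment64] section and LINKCMD key appear in dmd's shipped sc.ini; pointing LINKCMD at an lld-link.exe on PATH is my untested assumption, shown only as illustration, not as a working recipe.)

```ini
; Hypothetical sc.ini fragment: have dmd -m64 invoke lld-link instead of the
; MSVC linker. Assumes an lld-link.exe is available on PATH -- dmd does not
; ship one, so this only relocates the dependency rather than removing it.
[Environment64]
LINKCMD=lld-link.exe
```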
Nov 04
parent reply MrSmith <mrsmith33 yandex.ru> writes:
On Saturday, 4 November 2017 at 08:16:16 UTC, Joakim wrote:
 I was intrigued by someone saying in this thread that Go 
 supports Win64 COFF out of the box, so I just tried it out in 
 wine and indeed it works with their hello world example.  
 Running "go build -x" shows that they ship a link.exe for Win64 
 with their Win64 zip, guess it's the Mingw one?
Does Go need WinSDK though?
Nov 05
next sibling parent Joakim <dlang joakim.fea.st> writes:
On Sunday, 5 November 2017 at 14:19:11 UTC, MrSmith wrote:
 On Saturday, 4 November 2017 at 08:16:16 UTC, Joakim wrote:
 I was intrigued by someone saying in this thread that Go 
 supports Win64 COFF out of the box, so I just tried it out in 
 wine and indeed it works with their hello world example.  
 Running "go build -x" shows that they ship a link.exe for 
 Win64 with their Win64 zip, guess it's the Mingw one?
Does Go need WinSDK though?
Not for the hello world sample I tried in wine, maybe you need to get some libraries for other stuff, dunno.
Nov 05
prev sibling parent Kagamin <spam here.lot> writes:
On Sunday, 5 November 2017 at 14:19:11 UTC, MrSmith wrote:
 On Saturday, 4 November 2017 at 08:16:16 UTC, Joakim wrote:
 I was intrigued by someone saying in this thread that Go 
 supports Win64 COFF out of the box, so I just tried it out in 
 wine and indeed it works with their hello world example.  
 Running "go build -x" shows that they ship a link.exe for 
 Win64 with their Win64 zip, guess it's the Mingw one?
Does Go need WinSDK though?
It looks like integration with lld was fixed in the ldc 1.5 release.
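(For anyone wanting to try that: LDC documents a -link-internally switch that makes it link with its bundled lld instead of an external link.exe. A sketch of enabling it by default via ldc2.conf follows; treating this switch as the lld integration referred to above, and the exact config layout, are my assumptions - check your installed ldc2.conf for the existing entries.)

```conf
// Hypothetical ldc2.conf fragment: pass -link-internally on every build so
// LDC uses its bundled lld rather than an external MSVC link.exe.
// Merge this with the switches already present in the "default" group,
// since assigning here replaces that group's switch list.
default:
{
    switches = [ "-link-internally" ];
};
```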
Nov 10
prev sibling parent Jerry <hurricane hereiam.com> writes:
On Saturday, 4 November 2017 at 02:33:35 UTC, Computermatronic 
wrote:
 I, like many other windows users, want to be able to compile 
 64bit binaries in windows, without having to download and 
 install the bloated and time consuming to download and install 
 Visual Studio.

 I do most of my programming in Sublime Text, and frequently 
 re-install windows. This may not be the case for many windows 
 users of D, but clearly many windows users of D would like to 
 be able to compile x64 out of the box.
So you're fine with reinstalling Windows, going through the entire process of setting it up and configuring it, downloading all the new updates and installing them, then setting up your environment, downloading potentially dozens of applications (git, debuggers, text editors, compilers, etc.) and configuring settings? But downloading Visual Studio is "time consuming"... I don't even. If all you can complain about with Visual Studio is its download size, then I'd say it's doing pretty well as a development tool.
Nov 06
prev sibling parent Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= writes:
On Friday, 3 November 2017 at 14:29:27 UTC, jmh530 wrote:
 You want to be able to compile D programs that go on a smart 
 phone because that's where the growth of computer users is 
 coming from.
That's not all that obvious. I think a lot of the adults who got computers in the mid-'90s did so to be able to access the WWW, and that this segment will be OK with tablets or just phones, and mostly web apps in addition to banking/tickets and social apps. I think the majority of this not-so-sophisticated segment is quite limited in where they go after the novelty of mobile apps wanes. In addition you have the kids/teens/young-adults market that used to be C64, Nintendo, Sega, PlayStation etc. I think that market segment is somewhat stable in what they go for, but will be swayed by the latest fashion/marketing. So, it is increasing because third-world countries get access, but with the same behaviour in some ways. And as such it could move to a completely new platform quite quickly, because kids have a very low threshold for moving to new tech.
 Thus, it's good to be able to compile programs for that 
 platform, but it doesn't mean that work done to improve the 
 experience of programmers on other platforms is a waste of time.
Well, it is possible that web development will move to less demanding platforms, but it is also quite obvious that to get to the next generation of programming languages, with heavy-duty static analysis and software synthesis, you need a magnitude more power than current desktop CPUs offer. Not that I can predict the future, but better tooling means smarter tools, and smarter tools require another level of power. And judging from what is happening in language research, I'd say that is the direction we'll see in the next few decades. But who knows, maybe the next-gen javascript will own the market for decades to come. Hard to tell. What I do see is that neither Apple nor Intel has done a lot of innovation in the past decade. Maybe they don't have to, maybe their margins are too large to care. That opens the door for new players.
Nov 03
prev sibling parent reply Tony <tonytdominguez aol.com> writes:
On Friday, 3 November 2017 at 14:12:56 UTC, Joakim wrote:


 I don't know why you're so obsessed with storage when even 
 midrange smartphones come with 32 GBs nowadays, expandable to 
 much more with an SD card.  My tablet has only 16 GBs of 
 storage, with only 10-12 actually accessible, but I've never 
 had a problem building codebases that take up GBs of space with 
 all the object files, alongside a 64 GB microSD card for many, 
 mostly HD TV shows and movies.
The smallest storage Windows 10/Linux laptops have is a 128GB SSD. Even with a faster 128GB SSD being around the price of a 1TB hard drive, I still see 1TB being the dominant low-end storage. So I am going by what I see being offered as a minimum. It may be that most or even 99% of people can get by with 32GB flash memory, but it isn't being offered (except on Chromebooks which have traditionally only been web browsers, and on Windows 10S machines which can only run Windows Store apps).
 Are you suggesting they are developing their games for iOS and 
 Android devices ON those devices? Apple has XCode for 
 developing iOS apps and it runs on macOS machines only. There 
 is also the Xamarin IDE or IDE plug-in from Microsoft that 
 allows C# on iOS, but it runs on macOS or WIndows. For 
 Android, there is Android Studio - "The Official IDE of 
 Android" - which runs on Windows, macOS and Linux. There is no 
 Android version.
Yes, of course they're still largely developing mobile games on PCs, though I'm not sure why you think that matters. But your original claim was that they're still using PC-focused IDEs, as opposed to new mobile-focused IDEs like XCode or Android Studio, which you now highlight.
I never made any previous claim about what IDEs are being used. The only time I previously mentioned an IDE was with regard to RemObjects and Embarcadero offering cross-compilation to Android/iOS with their products. "There is a case to be made for supporting Android/iOS cross-compilation. But it doesn't have to come at the expense of Windows 64-bit integration. Not sure they even involve the same skillsets. Embarcadero and Remobjects both now support Android/iOS development from their Windows (and macOS in the case of Remobjects) IDEs." That was to highlight that those two compiler companies have seen fit to also cross-compile to mobile - they saw an importance to mobile development. It wasn't about what IDEs are best for mobile or even what IDEs are being used for mobile. Not that it matters, but I don't think that XCode meets the definition of "new mobile-focused IDE" as, as far as I know, it was developed for OS X development and is still used for such. Android Studio may be "new mobile-focused", even though it's based on IntelliJ IDEA.
 Yes, Windows is dominant, dominant in a niche, internal IT.  
 The consumer mobile market is much larger nowadays, and Windows 
 has almost no market share there.
Sad too, because of all the tablet/phone interfaces, the only one that is not just "icons on a background", and my personal preference, is Windows Mobile.
 As for Microsoft, Windows is not their only product, they have 
 moved Office onto the dominant mobile platforms.  As long as 
 they keep supporting mobile, they could eke out an existence.  
 Their big bet on Azure is going to end badly though.
They have Word, Excel, Powerpoint for mobile, but they are free. The Android store mentions "in-app purchases" but I wasn't offered any. Maybe it is for OneDrive storage of files. I already have that so it could be why I don't see anything to purchase in the app.
 Why did they fund development of a new iMac Pro which is 
 coming this December as well as the new MacBook Pros that came 
 out this June? That's a contradiction of "milk it like an 
 iPod".
Because their userbase was rebelling? I take it you're not that familiar with Mac users, but they were genuinely scared that Apple was leaving them behind, since they weren't refreshing Mac and Macbooks much anymore and all Apple's focus is on iOS:
So, let them rebel. You said that they would like to see it go away, and/or they want to milk it. If you have to spend money on development to keep selling it, then you can't "milk it". It is ironic that Microsoft and Ubuntu both saw a convergence of mobile and desktop and began modifying their desktop interface to best suit mobile, and now Ubuntu has abandoned the idea and Microsoft has abandoned the phone market. As it turns out, any convergence will have to come from the two dominant mobile OSes, as it is impossible to go the other direction due to the app catch-22.
Nov 05
parent reply Joakim <dlang joakim.fea.st> writes:
On Monday, 6 November 2017 at 06:37:52 UTC, Tony wrote:
 On Friday, 3 November 2017 at 14:12:56 UTC, Joakim wrote:
 I don't know why you're so obsessed with storage when even 
 midrange smartphones come with 32 GBs nowadays, expandable to 
 much more with an SD card.  My tablet has only 16 GBs of 
 storage, with only 10-12 actually accessible, but I've never 
 had a problem building codebases that take up GBs of space 
 with all the object files, alongside a 64 GB microSD card for 
 many, mostly HD TV shows and movies.
The smallest storage Windows 10/Linux laptops have is a 128GB SSD. Even with a faster 128GB SSD being around the price of a 1TB hard drive, I still see 1TB being the dominant low-end storage. So I am going by what I see being offered as a minimum. It may be that most or even 99% of people can get by with 32GB flash memory, but it isn't being offered (except on Chromebooks which have traditionally only been web browsers, and on Windows 10S machines which can only run Windows Store apps).
The vast majority of users would be covered by 5-10 GBs of available storage, which is why the lowest tier of even the luxury iPhone was 16 GBs until last year. Every time I talk to normal people, ie non-techies unlike us, and ask them how much storage they have in their device, whether smartphone, tablet, or laptop, they have no idea. If I look in the device, I inevitably find they're only using something like 3-5 GBs max, out of the 20-100+ GBs they have available. You only need 32 GBs or more if you're downloading a bunch of HD videos like I do, playing giant AAA games, or setting up a bunch of VMs, like some devs do. These are all niche uses that 99% of users don't partake in, which is why 32 GBs is plenty for them.
 Are you suggesting they are developing their games for iOS 
 and Android devices ON those devices? Apple has XCode for 
 developing iOS apps and it runs on macOS machines only. There 
 is also the Xamarin IDE or IDE plug-in from Microsoft that 
 allows C# on iOS, but it runs on macOS or WIndows. For 
 Android, there is Android Studio - "The Official IDE of 
 Android" - which runs on Windows, macOS and Linux. There is 
 no Android version.
Yes, of course they're still largely developing mobile games on PCs, though I'm not sure why you think that matters. But your original claim was that they're still using PC-focused IDEs, as opposed to new mobile-focused IDEs like XCode or Android Studio, which you now highlight.
I never made any previous claim about what IDEs are being used. The only time I previously mentioned an IDE was with regard to RemObjects and Embarcadero offering cross-compilation to Android/iOS with their products. "There is a case to be made for supporting Android/iOS cross-compilation. But it doesn't have to come at the expense of Windows 64-bit integration. Not sure they even involve the same skillsets. Embarcadero and Remobjects both now support Android/iOS development from their Windows (and macOS in the case of Remobjects) IDEs." That was to highlight that those two compiler companies have seen fit to also cross-compile to mobile - they saw an importance to mobile development. It wasn't about what IDEs are best for mobile or even what IDEs are being used for mobile.
If you look back to the first mention of IDEs, it was your statement, "Good luck selling game developers on using D to develop for Android, when you can't supply those same game developers a top-notch development environment for the premier platform for performance critical games - Windows 64-bit." That at least implies that they're using the same IDE to target both mobile and PC gaming, which is what I was disputing. If you agree that they use completely different toolchains, then it is irrelevant whether D supports Windows-focused IDEs, as it doesn't affect mobile-focused devs.
 Not that it matters, but I don't think that XCode meets the 
 definition of "new mobile-focused IDE" as, as far as I know, it 
 was developed for OS X development and is still used for such. 
 Android Studio may be "new mobile-focused", even though based 
 on IntelliJ IDEA.
Sure, they took existing IDEs and refocused them towards mobile development. XCode better be focused on iOS, as that's pretty much all that devs are using it for these days.
 Yes, Windows is dominant, dominant in a niche, internal IT.  
 The consumer mobile market is much larger nowadays, and 
 Windows has almost no market share there.
Sad too, because of all the tablet/phone interfaces, the only one that is not just "icons on a background", and my personal preference, is Windows Mobile.
I've always thought that flat Metro interface was best suited for mobile displays, the easiest to view, render, and touch. To some extent, all the other mobile interfaces have copied it, with their move to flat UIs over the years. However, it obviously takes much more than a nice GUI to do well in mobile.
 As for Microsoft, Windows is not their only product, they have 
 moved Office onto the dominant mobile platforms.  As long as 
 they keep supporting mobile, they could eke out an existence.  
 Their big bet on Azure is going to end badly though.
They have Word, Excel, Powerpoint for mobile, but they are free. The Android store mentions "in-app purchases" but I wasn't offered any. Maybe it is for OneDrive storage of files. I already have that so it could be why I don't see anything to purchase in the app.
My understanding is that they're not full Office either, that features are still missing that you can only get in the paid desktop version. I don't know how much those missing features matter, as I don't use Office or any such suite, but MS would be making a mistake to not offer those on mobile eventually.
 Why did they fund development of a new iMac Pro which is 
 coming this December as well as the new MacBook Pros that 
 came out this June? That's a contradiction of "milk it like 
 an iPod".
Because their userbase was rebelling? I take it you're not that familiar with Mac users, but they were genuinely scared that Apple was leaving them behind, since they weren't refreshing Mac and Macbooks much anymore and all Apple's focus is on iOS:
So, let them rebel. You said that they would like to see it go away, and/or they want to milk it. If you have to spend money on development to keep selling it, then you can't "milk it".
You and I and Jobs may've let them rebel, but Apple is a public corporation. They can't just let easy money go, their shareholders may not like it. Perhaps you're not too familiar with legacy calculations, but they're probably still making good money off Macs, but it just distracts and keeps good Apple devs off the real cash cow, iPhone. Even if the Mac financials aren't _that_ great anymore, you don't necessarily want to piss off your oldest and most loyal customers, who may stop buying iPhones and iPads too. So they have to constantly make a calculation, has the Mac userbase shrunk enough yet that they can just ditch that legacy desktop OS? Maybe they have a converged device in the works, ie the iPhone XV will ship a macOS GUI/environment as an iOS software upgrade to be used with their version of Dex/Sentio, after which they can tell those users, "Just buy an iPhone and get the Mac software upgrade." ;) Either way, I'm sure they're crunching the numbers every quarter on when to cut bait, but given they've kept the iPod Touch around this long, I doubt the Mac will be axed anytime soon. They've already heavily cut their Mac investment though, as all you hear from Mac users is that the pace of feature development and bug fixes has greatly slowed (this article also dings iOS, but notice that most of the specific criticism is for OS X and its apps): https://pljns.com/blog/2016/02/04/apples-declining-software-quality/
 It is ironic that Microsoft and Ubuntu both saw a convergence 
 of mobile and desktop and began modifying their desktop 
 interface to best suit mobile, and now Ubuntu has abandoned the 
 idea and Microsoft has abandoned the phone market. As it turns 
 out, any convergence will have to come from the two dominant 
 mobile OSes as it is impossible to go the other direction due 
 to the app catch-22.
I think Jobs got it right that you cannot converge too early, ie Apple kept their desktop and mobile OS's separate and are only slowly converging them. One reason is that the mobile hardware was just not powerful and efficient enough back when Windows 8 tried to converge the two UIs. Another is that the mobile market is much more important and far larger, so it's better to focus more on getting that right, then just add a desktop GUI later as a mobile feature. Microsoft was really caught between a rock and a hard place, as that desktop GUI for "lean forward" work is all they knew, what the entire computing market and their dominant business was built on. For MS to rush headlong into mobile-first and leave the desktop behind would've taken a giant push, one that their corporate culture, fat, flush, and arrogant after a decade of minting money, was likely incapable of making. Also, nobody saw mobile growing so gigantic, so fast, not even Jobs by all indications. Mobile has really been a tidal wave over the last decade. Funny how all you hear is bitching and whining from a bunch of devs on proggit/HN about how they missed the '80s PC boom or '90s dot.com boom and how there's nothing fundamentally exciting like that now, all while the biggest boom of them all, the mobile boom, just grew and grew right in front of their faces. :D
Nov 06
next sibling parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= writes:
On Monday, 6 November 2017 at 08:33:16 UTC, Joakim wrote:
 Sure, they took existing IDEs and refocused them towards mobile 
 development.  XCode better be focused on iOS, as that's pretty 
 much all that devs are using it for these days.
iOS has always been mostly a subset of OS-X. There are some differences in the UI components, but the general architecture is the same. I'm not sure why you claim that people aren't writing for OS-X. Just because the iOS space is flooded with simple software does not mean that people don't write complicated applications for OS-X. E.g. there are lots of simple audio applications for iOS, but the complicated ones are on OS-X.
 with legacy calculations, but they're probably still making 
 good money off Macs, but it just distracts and keeps good Apple 
 devs off the real cash cow, iPhone.  Even if the Mac financials 
 aren't _that_ great anymore, you don't necessarily want to piss 
 off your oldest and most loyal customers, who may stop buying 
 iPhones and iPads too.
I don't know if I trust the current management at Apple; they seem too hung up on fashion and on squeezing the market, but fashions change and fashion items are relatively quickly commoditised. It is slightly slower in this space because the upfront investments are high, but it is easier than in the CPU market, where you have some objective measures of performance. This dynamic used to be the case with cell phones too, but eventually Nokia lost that market. Similarly, it used to be the case with Apple's Macintosh line. They approached it as a fashion item and almost folded over it.

One reason Apple could price up their iOS products was that people could justify buying a more expensive phone/tablet since it also replaced their digital camera, then their video camera. You have to view their push of the iPad Pro in the same vein: it is a product that cannot be commoditised yet, and they try to defend the price by convincing people to think of it as a laptop.

It would be a bad idea for Apple to ditch the Mac. It is a product that is much more difficult to commoditise than the iOS products. And their owners tend to have multiple Apple devices, so it does not take away from iOS sales; it comes in addition.

The performance of mobile devices will always be limited by heat. The reason mobile devices perform well is that a lot of effort has been put into making good use of the GPU. The reason desktops are not improving much is probably that AMD has not been able to keep up with Intel, but Intel is now on the market with the i9, so maybe they are feeling threatened by Ryzen.
 Also, nobody saw mobile growing so gigantic,
If you are talking about devices, then this is completely false. "Mobile" was big before iOS. Academic circles were flooded by "mobile this, mobile that" around the year 2000; by 2005 the big thing was AR, which only now is gradually becoming available. (And VR peaked around 1995 and is slowly becoming available now.)

What was unexpected is that Apple and Samsung managed to hold onto such a large segment for so many years. I think Android's initial application inefficiency (Java) has a lot to do with it. Apple chose to limit the hardware to a very narrow architecture and got more performance from that hardware by going binary. That was a gamble too, but they were big enough to take control of it by building their own CPUs.
Nov 06
parent reply Joakim <dlang joakim.fea.st> writes:
On Tuesday, 7 November 2017 at 07:57:11 UTC, Ola Fosheim Grøstad 
wrote:
 On Monday, 6 November 2017 at 08:33:16 UTC, Joakim wrote:
 Sure, they took existing IDEs and refocused them towards 
 mobile development.  XCode better be focused on iOS, as that's 
 pretty much all that devs are using it for these days.
iOS has always been mostly a subset of OS-X. There are some differences in the UI components, but the general architecture is the same.
One is a touch-first mobile OS that heavily restricts what you can do in the background and didn't even have a file manager until this year, while the other is a classic desktop OS, so there are significant differences.
 I'm not sure why you claim that people aren't writing for OS-X. 
 Just because the iOS space is flooded with simple software does 
 not mean that people don't write complicated applications for 
 OS-X.

 E.g. there are lots of simple audio applications for iOS, but 
 the complicated ones are on OS-X.
I never said they don't write apps for macOS, I said iOS is a much bigger market which many more write for.
 with legacy calculations, but they're probably still making 
 good money off Macs, but it just distracts and keeps good 
 Apple devs off the real cash cow, iPhone.  Even if the Mac 
 financials aren't _that_ great anymore, you don't necessarily 
 want to piss off your oldest and most loyal customers, who may 
 stop buying iPhones and iPads too.
I don't know if I trust the current management in Apple, they seem to be too hung up on fashion and squeezing the market, but fashions change and fashion items are relatively quickly commoditised. It is slightly slower in this space because the upfront investments are high, but it is easier than in the CPU market where you have some objective measures for performance.
They have been selling the most popular expensive "fashion item" in the world for a decade now. And according to objective benchmarks, their hardware blows away everybody else in mobile, so they have that going for them too.
 This dynamic used to be the case with cell phones too, but 
 eventually Nokia lost that market. Similarly, this dynamic used 
 to be the case with Apple's MacIntosh line. They approached it 
 as a fashion item and they almost folded over it.
The same may happen to the iPhone some day, but it shows no signs of letting up.
 One reason that Apple could price up their iOS products was 
 that people could justify buying a more expensive phone/tablet 
 since they also replaced their digital camera with it, then the 
 video camera.

 You have to view their push of iPad Pro in the same vein, it is 
 a product that cannot be commoditised yet and they try to 
 defend the price by convincing people to think of it as a 
 laptop.
Since they still have a ways to go to make the cameras or laptop-functionality as good as the standalone products they replaced, it would appear they can still convince their herd to stay on the upgrade cycle.
 It would be a bad idea for Apple to ditch the Mac. It is a 
 product that is much more difficult to commoditise than the iOS 
 products. And their owners tend to have multiple Apple devices, 
 so it does not take away from the iOS sales, it comes in 
 addition.
While I disagree that you can't commoditize the Mac, as you could just bundle most of the needed functionality into an iPhone, I already said that Mac users probably buy iPhones and that Apple's unlikely to kill off the Mac anytime soon, though they've already significantly cut the team working on it.
 The performance of mobile devices will always be limited by 
 heat. The reason mobile devices perform well is that a lot of 
 effort has been put into making good use of the GPU.
Even within that lower power budget, performance is now so good that it rivals laptop CPUs, which is what goes into most PCs sold nowadays, so heat and the GPU are not that much of a concern anymore.
 The reason that desktops are not improving much is probably 
 because AMD has not been able to keep up with Intel, but Intel 
 is now on the market with i9, so maybe they are feeling 
 threatened by Ryzen.
No, the reason they don't improve is consumers don't need the performance.
 Also, nobody saw mobile growing so gigantic,
If you are talking about devices, then this is completely false. "mobile" was big before iOS. The academic circles was flooded by "mobile this - mobile that" around year 2000, by 2005 the big thing was AR which only now is gradually becoming available. (And VR peaked around 1995, and is slowly becoming available now).
You are conflating two different things: fashionable academic topics and industry projections for actual production, which is what I was talking about. I agree that a lot of people were saying mobile was potentially next for a while; Microsoft even came out with their UMPC platform years before the iPhone:

https://en.m.wikipedia.org/wiki/Ultra-mobile_PC

But if you looked at the chart I linked earlier, where mobile sales jumped 25X in a decade, that is extremely difficult to predict, as it was driven by a host of mobile CPU, display, 3G/4G, and power improvements that nobody saw happening so fast. Fashionable tech topics are mostly irrelevant; I'm talking about actual sales projections, especially when you're so confident in them that you bet your company on them. Nobody other than Apple did that, which is why they're still reaping the rewards today.
 What was unexpected is that Apple and Samsung managed to hold 
 onto such a large segment for so many years. I think Android's 
 initial application inefficiency (Java) has a lot to do with 
 it. Apple chose to limit the hardware to a very narrow 
 architecture and got more performance from that hardware by 
 going binary. That was a gamble too, but they were big enough 
 to take control over it by building their own CPUs.
Those two companies still have the best hardware and the multi-billion-dollar marketing budgets to make sure you know it. ;) No doubt that helps them maintain their share.
Nov 07
next sibling parent reply codephantom <me noyb.com> writes:
On Tuesday, 7 November 2017 at 08:53:46 UTC, Joakim wrote:
 No, the reason they don't improve is consumers don't need the 
 performance.
I don't agree. Consumers would welcome more performance, and many of us 'need' it too. But CPUs have hit the heat barrier, so manufacturers tend to focus on more cores, better caching algorithms, and such... but I am sure that consumers would find a 10GHz quad-core processor far more useful than a 4GHz 24-core one.

Then you have the challenges of redesigning programming languages and software development methodologies to take better advantage of the multi-core thing...

There is also the problem of no real competition against Intel, so real innovation is not occurring as rapidly as it once did. What we really need is to get rid of that heat barrier, which means lots and lots of money (potentially billions) into new research... and without competition, why should Intel bother? They can just do a few minor tweaks here and there, increment a number, and call the tweaked i7... the i9.
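Taking better advantage of multi-core doesn't have to wait for a redesigned language, though; D's own std.parallelism already covers the easy cases. A minimal sketch (a hypothetical example, not from this thread; `squaresParallel` is a made-up name):

```d
// A data-parallel loop in D using std.parallelism: the kind of
// language-level support the multi-core shift calls for.
import std.parallelism : parallel;

double[] squaresParallel(size_t n)
{
    auto result = new double[](n);
    // `parallel` carves the array into chunks and hands each chunk to a
    // worker thread (one per core by default); every iteration writes a
    // distinct slot, so no locking is needed.
    foreach (i, ref slot; parallel(result))
        slot = cast(double)(i * i);
    return result;
}

void main()
{
    import std.stdio : writeln;
    auto sq = squaresParallel(1_000);
    writeln(sq[999]); // 999 * 999 = 998001
}
```

The runtime, not the programmer, decides how to split the loop across however many cores the machine has, so the same code runs on a quad-core or a 24-core box.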
Nov 07
next sibling parent reply rikki cattermole <rikki cattermole.co.nz> writes:
On 07/11/2017 11:12 AM, codephantom wrote:
 On Tuesday, 7 November 2017 at 08:53:46 UTC, Joakim wrote:
 No, the reason they don't improve is consumers don't need the 
 performance.
I don't agree. Consumers would welcome more performance - and many of us 'need' it too. But cpu's have hit the heat barrier, and so manufacturers tend to focus on more cores, better caching algorithms, and such... but I am sure that consumers would find a 10GHz quad core processor far more useful than a 4Ghz 24 core one. Then you have the challenges of redesigning programming languages and software development methodologies to take better advantage of the multi-core thing... There is also the problem of no real competition against Intel, so real innovation is not occuring as rapidly as it once did. What we really need, is to get rid of that heat barrier - which means lots and lots  of money (potentially billions) into new research... and without competition, why should Intel bother? They can just do a few minor tweaks here and there, increment a number, and call the tweaked i7 ..the i9.
Not quite, but along the right line of thinking IMO.

Speed-wise we have well and truly hit the limit of what we can do with silicon. The speed improvements today are not the same kind done 20 years ago; today's come from changing what instructions do and making them cheaper to run.

Consumers most definitely would benefit from a higher number of cores even if they are slower. Why? Two reasons. First of all, common programs like web browsers tend to use a LOT of threads, which would mean less context switching overall (quite expensive and slow). Second, most people do not max out their RAM, both speed- and quantity-wise. RAM that matches the CPU clock speed is very expensive compared against high-end CPUs, and RAM is the real bottleneck today. Most people never get close to using a CPU to its maximum capacity; it's sitting idle a good bit of the time.

Intel has competition; ever heard of AMD and ARM? Intel has made a lot of changes to their strategy in the last 10-30 years, e.g. being more energy efficient because of ARM, and AMD64 (with micro-ops to implement it).

I am quite surprised that Intel even created the i9, actually; it just wasn't required. It's as if they took their Xeon lines, removed a bunch of features, and based it only on the higher-end ones. Remember, Xeon = non-consumer (so you get e.g. reliability and performance along with all the new features) and i-series = cheap consumer products.
Nov 07
parent reply Ola Fosheim Grøstad writes:
On Tuesday, 7 November 2017 at 11:31:03 UTC, rikki cattermole 
wrote:
 I am quite surprised that Intel even created i9 actually, it 
 just wasn't required.
AMD Ryzen Threadripper: https://www.cpubenchmark.net/high_end_cpus.html
Nov 07
parent reply rikki cattermole <rikki cattermole.co.nz> writes:
On 07/11/2017 12:58 PM, Ola Fosheim Grøstad wrote:
 On Tuesday, 7 November 2017 at 11:31:03 UTC, rikki cattermole wrote:
 I am quite surprised that Intel even created i9 actually, it just 
 wasn't required.
AMD Ryzen Threadripper: https://www.cpubenchmark.net/high_end_cpus.html
I do not trust that benchmark.

https://www.intel.com/content/www/us/en/products/compare-products.html?productIds=126699,120496,125056

But after looking at those numbers, I have a strange feeling that Intel is pushing those i9's past 'safe' limits. Ah huh, they are messing with threading and CPU clock speeds via Intel Turbo Boost Max Technology 3.0. Nasty.
Nov 07
parent reply Ola Fosheim Grøstad writes:
On Tuesday, 7 November 2017 at 13:29:19 UTC, rikki cattermole 
wrote:
 On 07/11/2017 12:58 PM, Ola Fosheim Grøstad wrote:
 On Tuesday, 7 November 2017 at 11:31:03 UTC, rikki cattermole 
 wrote:
 I am quite surprised that Intel even created i9 actually, it 
 just wasn't required.
AMD Ryzen Threadripper: https://www.cpubenchmark.net/high_end_cpus.html
I do not trust that benchmark.
Well, this is another one, with a comparison of two products at a similar price:

http://cpu.userbenchmark.com/Compare/Intel-Core-i9-7900X-vs-AMD-Ryzen-TR-1950X/3936vs3932

I think the Xeons might be for overcommitted server situations: larger caches and many threads. Sometimes people are more interested in responsiveness (preventing starvation) and not necessarily max speed. So if you do a lot of I/O system calls, you might want the ability to run many threads at the same time and focus less on number crunching, perhaps?
 But after looking at those numbers, I have a strange feeling 
 that Intel is pushing those i9's past 'safe' limits.
I think they just turn off cores that do not work and put those chips into the lower end, and the high end is very expensive at $2000 (so maybe low yield, or just greed :-)…
Nov 07
parent reply rikki cattermole <rikki cattermole.co.nz> writes:
On 07/11/2017 1:48 PM, Ola Fosheim Grøstad wrote:
 On Tuesday, 7 November 2017 at 13:29:19 UTC, rikki cattermole wrote:
 On 07/11/2017 12:58 PM, Ola Fosheim Grøstad wrote:
 On Tuesday, 7 November 2017 at 11:31:03 UTC, rikki cattermole wrote:
 I am quite surprised that Intel even created i9 actually, it just 
 wasn't required.
AMD Ryzen Threadripper: https://www.cpubenchmark.net/high_end_cpus.html
I do not trust that benchmark.
Well, this is another one, with a comparison of two products at a similar price: http://cpu.userbenchmark.com/Compare/Intel-Core-i9-7900X-vs-AMD-Ryzen-TR-1950X/3936vs3932 I think the Xeons might be for overcommitted server situations: larger caches and many threads. Sometimes people are more interested in responsiveness (preventing starvation) and not necessarily max speed. So if you do a lot of I/O system calls, you might want the ability to run many threads at the same time and focus less on number crunching, perhaps?
That sounds an awful lot like the average user too ;)
 But after looking at those numbers, I have a strange feeling that 
 Intel is pushing those i9's past 'safe' limits.
I think they just turn off cores that does not work and put those chips into the lower end, and the high end is very expensive at $2000 (so maybe low yield or just greed :-)…
The way I think of it is that Xeons get all the newest and greatest features, which slowly trickle down to the i-series. Invest in the Xeon production line one generation, and the next use it for i7s, etc. Basically, the R&D costs all go on the Xeons, and eventually, once they are paid off, the features go straight to the consumers.

But the i9 is looking like a completely different beast to the rest of the i-series, with Intel actively adding new unique features to it. Quite scary; this doesn't sound like a good move, especially when those features could very well mean those CPUs don't last very long.

Looks like they are changing tactics after the last 10 years or so. I do wonder if you're on the right track and turning a Xeon into an i9 is just a firmware upgrade...
Nov 07
parent reply Ola Fosheim Grøstad writes:
On Tuesday, 7 November 2017 at 14:03:31 UTC, rikki cattermole 
wrote:
 The way I think of it is that Xeon's get all the newest and 
 greatest features, with them slowly trickling down to the 
 i-series. Invest in the Xeon production line one generation and 
 in next use it for i7's ext. Basically R&D cost go all on the 
 Xeon's and then eventually once its paid off it goes straight 
 to the consumers.
I see that some features, like instructions, are tested out on some Xeons first, but others are really only on Xeons, and not on all Xeons either. So I think the Xeons are primarily a tool for differentiating at the high end to maximize profits (turning different feature sets on/off, possibly on the same die). I think this is largely the case because AMD isn't competitive in that segment.

The CPU-architecture generations follow a tick-tock pattern, where a tick means an improved manufacturing process and a tock means a new architecture. I don't think that has anything to do with Xeon.
 Looks like they are changing tactic after the last 10 years or 
 so. I do wonder if you're on the right track and turning a Xeon 
 into an i9 is just a firmware upgrade...
I imagine that they would try to not use too much die space for the i9s, and the Xeons seem to require stuff that aren't needed, so perhaps not likely, but who knows?
Nov 07
parent jmh530 <john.michael.hall gmail.com> writes:
On Tuesday, 7 November 2017 at 14:43:14 UTC, Ola Fosheim Grøstad 
wrote:
 The CPU-architecture generations follow a tick-tock pattern, 
 where a tick means an improved manufacturing process and a tock 
 means a new architecture. I don't think that has anything to do 
 with Xeon.
They actually changed this last year. Now it's Process (tick) -> Architecture (tock) -> Optimization. Kaby Lake was the first "optimization" step; people were kind of like, it's not THAT much better than Skylake.

I think of Xeon as Intel's brand for servers (though this may not be entirely accurate). The processor families above (Skylake, Kaby Lake, etc.) include Xeon and i7, etc. within them. A Xeon is usually similar to what you'd buy on the consumer side but has more features. My FreeNAS box at home has a Xeon in it so that I can use ECC memory. Other people might use a Xeon because it has more cache or something.
Nov 07
prev sibling parent reply Joakim <dlang joakim.fea.st> writes:
On Tuesday, 7 November 2017 at 11:12:19 UTC, codephantom wrote:
 On Tuesday, 7 November 2017 at 08:53:46 UTC, Joakim wrote:
 No, the reason they don't improve is consumers don't need the 
 performance.
I don't agree. Consumers would welcome more performance - and many of us 'need' it too.
There is an easy test of this: are they running out to upgrade to the latest higher performance x86 CPUs? No, as Tony noted earlier, "Some find their current PC fast enough and see no reason to upgrade as frequently as they did in the past," though I'd modify that "some" to most.
 But cpu's have hit the heat barrier, and so manufacturers tend 
 to focus on more cores, better caching algorithms, and such...

 but I am sure that consumers would find a 10GHz quad core 
 processor far more useful than a 4Ghz 24 core one.
Right before it melted down. :)
 Then you have the challenges of redesigning programming 
 languages and software development methodologies to take better 
 advantage of the multi-core thing...
Since you have tons of background processes or apps running these days, even on Android, you don't really need multi-threaded apps to make good use of multi-core.
 There is also the problem of no real competition against Intel, 
 so real innovation is not occuring as rapidly as it once did.

 What we really need, is to get rid of that heat barrier - which 
 means lots and lots  of money (potentially billions) into new 
 research... and without competition, why should Intel bother? 
 They can just do a few minor tweaks here and there, increment a 
 number, and call the tweaked i7 ..the i9.
Rikki answered all this: the real competition is from below, from ARM, and the performance gains now come from scaling out horizontally with multi-core, not vertically with faster clock speeds. More importantly, the market has settled on cheap, power-sipping chips in mobile devices as the dominant platform. x86 has failed miserably at fitting into that, which is why even MS is moving towards ARM:

https://www.thurrott.com/windows/windows-10/134434/arm-based-always-connected-windows-10-pcs-approach-finish-line

On Tuesday, 7 November 2017 at 11:40:21 UTC, Ola Fosheim Grøstad wrote:
 On Tuesday, 7 November 2017 at 08:53:46 UTC, Joakim wrote:
 One is a touch-first mobile OS that heavily restricts what you 
 can do in the background and didn't even have a file manager 
 until this year, while the other is a classic desktop OS, so 
 there are significant differences.
Yes, there are differences for the end user, such as the sandboxing, but that also applies to applications in the OS-X app store. I don't expect iOS to change much in that department; I think Apple will continue to get people into their iCloud… At the API level iOS is mostly a subset, and features that were only in iOS have been made available on OS-X. The main difference is in some UI classes, but they both use the same tooling and UI design strategies. So in terms of XCode they are kinda similar.
I've never programmed for Apple devices and never would: I got my first and last Apple device more than a decade ago, a Powerbook laptop, and haven't bought any of their stuff since because of their ridiculous patent stance. So I can't speak to the similarity of APIs between macOS and iOS, but obviously there are significant developer and IDE differences in targeting a mobile OS versus a desktop OS, even if iOS was initially forked from macOS.
 I never said they don't write apps for macOS, I said iOS is a 
 much bigger market which many more write for.
Yes, there are more Apple developers in general. Not sure if the number of people doing OS-X development has shrunk, maybe it has.
Let me correct that for you: there are many more iOS developers now, because it is a _much_ bigger market.
 The same may happen to the iPhone some day, but it shows no 
 signs of letting up.
They probably will hold that market for a while as non-techies don't want to deal with a new unfamiliar UI.
 Since they still have a ways to go to make the cameras or 
 laptop-functionality as good as the standalone products they 
 replaced, it would appear they can still convince their herd 
 to stay on the upgrade cycle.
That is probably true, e.g. low light conditions.
 While I disagree that you can't commoditize the Mac, as you 
 could just bundle most of the needed functionality into an 
 iPhone
My point was that it is easier to commoditize the iPhone than the Mac. There is a very limited set of apps that end users must have on a phone.
Just a couple responses above, you say the iPhone UI will keep those users around. I'd say the Mac is actually easier to commoditize, because the iPhone is such a larger market that you can use that scale to pound the Mac apps, _once_ you can drive a multi-window, large-screen GUI with your iPhone, on a monitor or 13" Sentio-like laptop shell. I agree that very few apps are used on phones, and that they aren't as sticky as desktop apps as a result. Hopefully that means we'll see more competition in mobile than just android/iOS in the future.
 they've already significantly cut the team working on it.
Ok, didn't know that. I've only noticed that they stopped providing competitive products after Jobs died.
 No, the reason they don't improve is consumers don't need the 
 performance.
I don't think this is the case. It is because of the monopoly they have in the top segment. Intel was slow at progress until Athlon bit them. If they felt the pressure, they would put their assets into R&D. Remember that new products have to pay off R&D before making a profit, so by pushing the same old thing they get better ROI. Of course, they also have trouble with heat, and developing a new technological platform is very expensive. But if they faced stiff competition, then they certainly would push that harder.

In general, the software market has managed to gobble up any performance improvement for decades. As long as developers spend time optimizing their code, there is a market for faster hardware (which saves development costs).

The Intel i9-7900X sells at $1000 for just the chip. That's pretty steep; I'm sure they have nice profit margins on that one.
Lack of competition at the high end certainly played a role, but as I noted to codephantom above, consumers not needing the performance played a much larger role, which is why Samsung, with their much weaker SoCs, just passed Intel as the largest semiconductor vendor: http://fortune.com/2017/07/27/samsung-intel-chip-semiconductor/
 You are conflating two different things, fashionable academic 
 topics and industry projections for actual production, which 
 is what I was talking about.
What do you mean by industry projections? It was quite obvious by the early 2000s that most people with cell phones (which was basically everyone in Scandinavia) would switch to smart phones. It wasn't a surprise.
Yes, but would that be in 2020 or 2050? Would people who never had a cellphone get a smartphone, driving that market even larger, as is happening today in developing markets? My point is that vague tech chatter about the potential next big thing is irrelevant, what matters is who was actually projecting hard numbers like a billion smartphones sold in 2013: https://mobile.twitter.com/lukew/status/842397687420923904 Jobs certainly wasn't, almost nobody was. If there were a few making wild-eyed claims, how many millions of dollars did they actually bet on it, as Jobs did? Nobody else did that, which shows you how much they believed it.
 confident in them that you bet your company on them.  Nobody 
 other than Apple did that, which is why they're still reaping 
 the rewards today.
Only Microsoft had a comparable starting point. iOS is closely related to OS-X. Not sure if Nokia could have succeeded with scaling up Symbian. Maybe, dunno.
I'm not sure how the starting point matters; Google funded Android from nothing and it now ships on the most smartphones. But even the Google guys never bet the company on it, just gave it away for free for others to build on, which is why they never made as much money as Apple either. On Tuesday, 7 November 2017 at 13:59:26 UTC, codephantom wrote:
 On Monday, 6 November 2017 at 08:33:16 UTC, Joakim wrote:
 Also, nobody saw mobile growing so gigantic, so fast, not even 
 Jobs by all indications.  Mobile has really been a tidal wave 
 over the last decade.  Funny how all you hear is bitching and 
 whining from a bunch of devs on proggit/HN about how they 
 missed the '80s PC boom or '90s dot.com boom and there's 
 nothing fundamentally exciting like that now, all while the 
 biggest boom of them all, the mobile boom, just grew and grew 
 right in front of their faces. :D
Well, I was there in the early nineties when the Microsoft WinPad was being talked about. This was almost 20 years before the iPad came out. I remember going through the '90s with Windows CE iterations, which eventually evolved into Windows Mobile 2003, which is when I purchased my first 'smart phone' and learnt how to write apps for it (actually my current phone still runs Windows Mobile 6.1 ;-).

I tried getting people around me interested in mobile devices, including the business I worked in. Nobody was really interested. They were all happy with their little push-button Nokias.

Microsoft had the vision, though, and they had it earlier than perhaps anyone else. But the vision was too far ahead of its time, and, around the early 2000s, they refused to lose any more money and put it on the back burner, and competitors came in and took over, at a time when 'consumers' were just beginning to share the vision too....
Yes, that is the impression I have too: MS got in too early, got discouraged that consumers didn't want their bulky hardware and weird software, and backed off right when the mobile market took off.
 But I think what really made it take off so fast and 
 unexpectedly was the convergence of mobile devices, mobile 
 communication technology (i.e. wifi, gps and stuff), and of 
 course the internet... as well as the ability to find cheap 
 labour overseas to build the products en masse.

 I doubt anyone could have envisioned that convergence...but 
 some companies were in a better position (more agile) than 
 others, at the time, to capitalise on it.

 But the vision of being mobile was certainly there, back in the 
 early nineties - and Microsoft were leading it.
Right, a significant minority of techies saw mobile coming, but I'm talking about forecasting the giant scope and scale and timing of the actual sales chart above. There was nothing special about the minority who thought mobile could be big; the Nokia 7710 shipped with a touchscreen years before the iPhone:

https://en.m.wikipedia.org/wiki/Nokia_7710

The N800 shipped before the iPhone:

https://en.m.wikipedia.org/wiki/Nokia_N800

Intel had been talking about their MID platform around the same time:

https://gizmodo.com/253189/intel-ultra-mobile-platform-2007-officially-announced-mids-and-menlow-to-follow

Which of them saw that giant tidal wave coming, sank every penny into a surfboard, and swam out to ride it? Almost no one, other than Apple to some extent, and even they seem to have underestimated its size.
Nov 07
next sibling parent reply codephantom <me noyb.com> writes:
On Tuesday, 7 November 2017 at 14:33:28 UTC, Joakim wrote:
 Hopefully that means we'll see more competition in
 mobile than just android/iOS in the future.
Watch out for the MINIX3/NetBSD combo: a microkernel coupled with a BSD Unix that can run on pretty much anything. It may well be the future of consumer mobile platforms, as well as server/cloud platforms.

https://www.youtube.com/watch?v=oS4UWgHtRDw
Nov 07
parent reply Joakim <dlang joakim.fea.st> writes:
On Tuesday, 7 November 2017 at 15:09:05 UTC, codephantom wrote:
 On Tuesday, 7 November 2017 at 14:33:28 UTC, Joakim wrote:
 Hopefully that means we'll see more competition in
 mobile than just android/iOS in the future.
Watch out for the MINIX3/NetBSD combo...a microkernel coupled with a BSD-unix that can run on pretty much anything. It may well be the future of the consumer mobile platforms, as well as server/cloud platforms. https://www.youtube.com/watch?v=oS4UWgHtRDw
That'd be great, but given how long MINIX has languished, I'm doubtful. Maybe Fuchsia, a Google skunkworks OS with a new microkernel called Magenta, has a better shot:

https://arstechnica.com/gadgets/2017/05/googles-fuchsia-smartphone-os-dumps-linux-has-a-wild-new-ui/

Whatever it is, I don't think the current mobile OS duopoly is as unassailable as people seem to think. You'll need some unique angle, though, to cover up for the lack of apps initially, as Jolla found. On Tuesday, 7 November 2017 at 15:21:20 UTC, Ola Fosheim Grøstad wrote:
 On Tuesday, 7 November 2017 at 14:33:28 UTC, Joakim wrote:
 similarity of APIs between macOS and iOS, but obviously there 
 are significant developer and IDE differences in targeting a 
 mobile OS versus a desktop OS, even if iOS was initially 
 forked from macOS.
Not in my experience… There are some things programmers have to be aware of, because some features are not available on iOS, but overall the same deal. Not too surprising as the iOS simulator compiles to X86, so by keeping the code bases similar they make it easier to simulate it on the Mac. So yeah, you kinda run iOS apps on your mac natively. (Not emulated as such.) Only when you go low level (ARM intrinsics) will this be a real problem. So it goes without saying that iOS and OS-X have to be reasonably similar for this to be feasible.
Not at all, it makes things easier certainly, but there's a reason why mobile devs always test on the actual devices, because there are real differences.
 Let me correct that for you: there are many more iOS 
 developers now, because it is a _much_ bigger market.
Yes, but that does not mean that your original core business is no longer important.
When you're making almost 5-10X as much from your new mobile business, of course it isn't: https://www.macrumors.com/2017/11/02/earnings-4q-2017/ Now, they're not going to dump 10-15% of sales because the Mac's a fading business, they'll just keep milking it till it doesn't make any sense, as I already said.
 Just a couple responses above, you say the iPhone UI will keep 
 those users around.  I'd say the Mac is actually easier to 
 commoditize, because the iPhone is such a larger market that 
 you can use that scale to pound the Mac apps, _once_ you can 
 drive a multi-window, large-screen GUI with your iPhone, on a 
 monitor or 13" Sentio-like laptop shell.
By commoditise I mean that you have many competitors in the market because the building blocks are available from many manufacturers (like radios).
Yes, that's what I was referring to also, the hundreds of millions of Android 7.0 smartphones now shipping with built-in multiwindow capability, ie the same building blocks as macOS.
 However, I think "laptop shell" is perceived as clunky. People 
 didn't seem to be very fond of docking-stations for laptops. 
 Quite a few went for impractically large screens on their 
 laptops instead.
There are all kinds of perceptions out there, but cost and "good enough" functionality rule the day, and that's what the mobile laptop shells and docks will provide. I agree that people usually have concerns that lead to large-screened laptops, as I worried that the 15" display on my Powerbook might be too small when I was getting it a decade ago, but I got by just fine. Wondered the same when I got my 13" 1080p Win7 ultrabook five years ago, but ended up thinking that was the perfect size and resolution after using it. I was skeptical that my 8.4" 359 ppi tablet would suffice when I started using it, but haven't had much of an issue over the last two years of daily use. Maybe I'm just very adaptable, but I've increasingly come to the conclusion that smaller works fine, especially with the extremely high ppi on mobile displays these days.
 I agree that very few apps are used on phones, and that they 
 aren't as sticky as desktop apps as a result.  Hopefully that 
 means we'll see more competition in mobile than just 
 android/iOS in the future.
iPhones are easier to displace because the UI is not so intrusive compared to a desktop and the apps people depend on are not so complicated. That might change of course… As people get used to the platform Apple can make things more complicated (less to learn, so you can introduce more features one by one). There are things about modern iOS that I don't find intuitive, but since so many have iPhones they probably get help from people nearby when they run into those issues. Scale matters in many strange ways…
I agree that the simplicity of mobile UIs makes it easier for new mobile entrants, but the much greater demand for mobile and the resulting scale means that desktop OSs will _eventually_ be easier to displace by mobile platforms. That'll happen once all mobile devices ship with easily accessible, desktop-style multi-window UIs built in, which as I said before is starting to happen with Android 7.0 Nougat. Interestingly, this complexity of multi-window UIs might provide mobile platforms the stickiness they've been missing, that they gin up by tricking their users into platform-exclusive apps like iMessage or Facetime.
 Lack of competition at the high end certainly played a role, 
 but as I noted to codephantom above, consumers not needing the 
 performance played a much larger role, which is why Samsung, 
 with their much weaker SoCs, just passed Intel as the largest 
 semiconductor vendor:
I assume those aren't used in desktop computers? Samsung need a lot of SoCs as they manufacture lots of household items…
They're used mostly in mobile devices, which consumers are replacing their desktop, Intel-Inside PCs with, but mostly putting to new uses that PCs could never be put to.
 Yes, but would that be in 2020 or 2050?  Would people who 
 never had a cellphone get a smartphone, driving that market 
 even larger, as is happening today in developing markets?
Ok, I think it was fairly obvious that smart phones would at least for a while be a thing as it was already then fashionable in the high end. What wasn't all that obvious was that people would be willing to carry rather clunky iPhones and Android devices with bad battery life compared to the Symbian phones… Which I think was to a large extent driven by social norms, fashion and the press pushing the story on frontpages over and over… Also, when I think of it, I wonder if Apple would have succeeded if the press had not played them up as an underdog against Microsoft in the preceding decade. The underdog Apple rising from the dust and beating out Microsoft and Nokia made for a good story to push… (in terms of narrative/narratology)
You're not tracking my point, that nebulous claims 15 years ago about how mobile would be "a thing" are irrelevant compared to a projection of a billion mobiles sold in 2013, which is what happened. MS, Nokia, and others linked in this thread clearly thought as you did about mobile, yet they completely missed the boat. Clearly they misjudged the scale, scope, and timing of that coming mobile tidal wave.
 Jobs certainly wasn't, almost nobody was.  If there were a few 
 making wild-eyed claims, how many millions of dollars did they 
 actually bet on it, as Jobs did?  Nobody else did that, which 
 shows you how much they believed it.
Apple had worked on this for a long time and had also already failed at it, but they decided to push it again when touch screen technologies made it possible.
Yes, Apple made a big push, _at the right time_, while everybody else didn't. Google and Samsung followed fast, to their credit, while everybody else fell to the wayside.
 I'm not sure how the starting point matters, google funded 
 Android from nothing and it now ships on the most smartphones.
I don't think Android came from nothing, and it was significantly more clunky than iOS, but Google did this to have an option if other giants would try to block their revenue stream from ads… So it was more passive-aggressive than a business.
I see, please tell me how much market share Android started with then. It was a startup that never released a product before being bought and grown inside google.
 But even the google guys never bet the company on it, just 
 gave it away for free for others to build on, which is why 
 they never made as much money as Apple either.
Well, it was to protect their business, not to develop their business, so I am not sure if Android is a good example.
A good example for what? They started a mobile OS from nothing and grew it to two billion-plus users today, which you implied only those with a "starting point" could do. Their motivations for doing so are irrelevant to that fact, but yeah, that's part of why they gave it away for free and didn't make as much money off it as Apple did, which they're now starting to backtrack with their in-house, high-priced Pixel line.
Nov 07
parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= writes:
On Tuesday, 7 November 2017 at 19:46:04 UTC, Joakim wrote:
 Not at all, it makes things easier certainly, but there's a 
 reason why mobile devs always test on the actual devices, 
 because there are real differences.
Mostly with low level stuff in my experience.
 Now, they're not going to dump 10-15% of sales because the 
 Mac's a fading business, they'll just keep milking it till it 
 doesn't make any sense, as I already said.
Heh, it would be very bad management to take focus off Macs. I doubt Jobs would have allowed that to happen, but as I said, I don't really trust the current management at Apple. So who knows what they will do? You are thinking too much short term here IMHO. The mobile sector is rather volatile.
 Maybe I'm just very adaptable, but I've increasingly come to 
 the conclusion that smaller works fine, especially with the 
 extremely high ppi on mobile displays these days.
Small tablets are ok, for reading, but programming really requires more screen space. Although I guess one external + the builtin one is ok too. I guess it would be possible to create a docking station for phones that was able to transfer heat away from the device so that you could run at higher speed when docked, but then the phone calls and you have to unplug it or use a headset…
 multi-window UIs built in, which as I said before is starting 
 to happen with Android 7.0 Nougat.
I should take a closer look on modern Android… Sounds interesting.
 happened.  MS, Nokia, and others linked in this thread clearly 
 thought as you did about mobile, yet they completely missed the 
 boat.  Clearly they misjudged the scale, scope, and timing of 
 that coming mobile tidal wave.
Yes, but as I said, not many players could have countered this. Microsoft certainly if they had bought up Nokia right away. Nokia alone… probably not. HP or Sony? On a lucky day…
 Yes, Apple made a big push, _at the right time_, while 
 everybody else didn't.  Google and Samsung followed fast, to 
 their credit, while everybody else fell to the wayside.
Well, but Android units did get a bad reputation in the beginning.
 A good example for what?  They started a mobile OS from nothing 
 and grew it to two billion-plus users today, which you implied 
 only those with a "starting point" could do.
The Android makers had a real problem with quality and making a profit. Samsung managed to make a profit, but many others struggled. And it took a long time before Android's reputation caught up with iOS. Most businesses would not have been willing to make that software investment and sustain it until the OS platform would reach a competitive level. So I don't think many could have followed Apple there. Apple recycled a lot of their prior work and experiences. Microsoft could have, sure, and I am sure they regret getting in late. But, they were late with embracing Internet too, so they have always followed their own mindset… and only reluctantly follow new trends. But frankly, I don't think many giants would start with a GPL code base like Linux.
Nov 07
next sibling parent reply codephantom <me noyb.com> writes:
On Wednesday, 8 November 2017 at 00:09:51 UTC, Ola Fosheim 
Grøstad wrote:
 But frankly, I don't think many giants would start with a GPL 
 code base like Linux.
Redhat have demonstrated that it can be done. GPL is not the obstacle. The obstacle is the desire to control/dominate a market. There, GPL will do you harm, because you are required to release your source code changes back to the community - and hence your competitors. That's the only reason why there's no Microsoft Linux.

Oracle is another giant with their 'own' rebranded Linux - they basically took Redhat's stuff... but even then, it was only so they could tie you into their proprietary solutions. Microsoft are porting stuff to Linux too, perhaps for the same reason. (SQL Server for Linux? A few years ago I would have laughed if someone said that would ever happen).

But giants are starting to see that GPL can actually be utilised in their desire to dominate after all, because they can insert their proprietary stuff into it, and so 'domination' is still apparently attainable - even with GPL. And after all, it saves them the trouble of having to write/maintain an operating system.

GPL is not a problem. GPL was specifically designed to benefit 'everyone'. The desire to dominate with proprietary closed source products is the problem - because it benefits who?

Having said all that, I'm still very much an advocate of the BSD style licence ;-)
Nov 07
parent reply Patrick Schluter <Patrick.Schluter bbox.fr> writes:
On Wednesday, 8 November 2017 at 01:13:00 UTC, codephantom wrote:
 On Wednesday, 8 November 2017 at 00:09:51 UTC, Ola Fosheim 
 Grøstad wrote:
 [...]
Redhat have demonstrated that it can be done. GPL is not the obstacle. The obstacle is the desire to control/dominate a market. There, GPL will do you harm, because you are required to release your source code changes back to the community - and hence your competitors. [...]
And it didn't preclude Google from dominating the smartphone market. The Android kernel IS the Linux kernel.
Nov 07
parent Paulo Pinto <pjmlp progtools.org> writes:
On Wednesday, 8 November 2017 at 06:27:15 UTC, Patrick Schluter 
wrote:
 On Wednesday, 8 November 2017 at 01:13:00 UTC, codephantom 
 wrote:
 On Wednesday, 8 November 2017 at 00:09:51 UTC, Ola Fosheim 
 Grøstad wrote:
 [...]
Redhat have demonstrated that it can be done. GPL is not the obstacle. The obstacle is the desire to control/dominate a market. There, GPL will do you harm, because you are required to release your source code changes back to the community - and hence your competitors. [...]
And it didn't preclude Google from dominating the smartphone market. The Android kernel IS the Linux kernel.
The kernel on Android is a heavily customized fork of Linux and probably the only GPL component left in the AOSP source tree, now that GCC has been replaced by clang, just like Apple did on their SDKs. Fuchsia has zero GPL components in it.
Nov 08
prev sibling parent reply Joakim <dlang joakim.fea.st> writes:
On Wednesday, 8 November 2017 at 00:09:51 UTC, Ola Fosheim 
Grøstad wrote:
 On Tuesday, 7 November 2017 at 19:46:04 UTC, Joakim wrote:
 Not at all, it makes things easier certainly, but there's a 
 reason why mobile devs always test on the actual devices, 
 because there are real differences.
Mostly with low level stuff in my experience.
And what experience would that be? I've admitted I've never developed for Apple platforms, but my understanding is that even leaving aside the completely different touch-first UI, there are significant differences. I wonder what Mac apps you simply ported the UI over to iPhone and they just worked.
 Now, they're not going to dump 10-15% of sales because the 
 Mac's a fading business, they'll just keep milking it till it 
 doesn't make any sense, as I already said.
Heh, it would be very bad management to take focus off Macs. I doubt Jobs would have allowed that to happen, but as I said, I don't really trust the current management at Apple. So who knows what they will do?
I just said they're not going to dump it, so I don't know why you're going on about that. If you mean their current lessened investment is not a good idea, it's because the old desktop OS doesn't matter as much, which is the whole point of this thread.
 You are thinking too much short term here IMHO. The mobile 
 sector is rather volatile.
I have no idea what this refers to: you have a bad habit of adding asides without any explanation, or non sequiturs, so that we're left stumped as to what you're talking about.
 Maybe I'm just very adaptable, but I've increasingly come to 
 the conclusion that smaller works fine, especially with the 
 extremely high ppi on mobile displays these days.
Small tablets are ok, for reading, but programming really requires more screen space. Although I guess one external + the builtin one is ok too.
Some will use the small tablet screen like me, many a 11-13" laptop shell like Sentio, and a few a dock like DeX to connect the monitor of their choice.
 I guess it would be possible to create a docking station for 
 phones that was able to transfer heat away from the device so 
 that you could run at higher speed when docked, but then the 
 phone calls and you have to unplug it or use a headset…
I've been using a tablet to compile code for years now, never had a problem with heat. The power budget on these mobile chips is already limited, as they don't have a fan, so you don't have to worry about that. That limits your performance of course, but the point is that most don't compile code or do anything close, so it doesn't matter for them. As for phone calls, I noted earlier in this thread that some already use cheap bluetooth handsets with their phablet, not a headset.
 multi-window UIs built in, which as I said before is starting 
 to happen with Android 7.0 Nougat.
I should take a closer look on modern Android… Sounds interesting.
I've linked it a handful of times in this forum, including the other mobile thread I originally linked: https://arstechnica.com/gadgets/2016/08/android-7-0-nougat-review-do-more-on-your-gigantic-smartphone/3/#h2 Samsung appears to use it for their DeX dock: https://www.androidauthority.com/samsung-dex-pc-replacement-778222/
 happened.  MS, Nokia, and others linked in this thread clearly 
 thought as you did about mobile, yet they completely missed 
 the boat.  Clearly they misjudged the scale, scope, and timing 
 of that coming mobile tidal wave.
Yes, but as I said, not many players could have countered this. Microsoft certainly if they had bought up Nokia right away. Nokia alone… probably not. HP or Sony? On a lucky day…
I see, so your claim is that MS, Nokia, HP, Sony, all much larger companies than Apple or google at the time, could not have countered them even on a lucky day. I wonder why this is, as they certainly had more money, you don't believe they're that bright? :)
 Yes, Apple made a big push, _at the right time_, while 
 everybody else didn't.  Google and Samsung followed fast, to 
 their credit, while everybody else fell to the wayside.
Well, but Android units did get a bad reputation in the beginning.
Again, I have no idea what this refers to or what point you're trying to make here.
 A good example for what?  They started a mobile OS from 
 nothing and grew it to two billion-plus users today, which you 
 implied only those with a "starting point" could do.
The Android makers had a real problem with quality and making a profit. Samsung managed to make a profit, but many others struggled. And it took a long time before Android's reputation caught up with iOS. Most businesses would not have been willing to make that software investment and sustain it until the OS platform would reach a competitive level.
Yet the businesses that did build Android, ie google, HTC, and so on, were much smaller than the corporate behemoths like HP or Sony that you claimed above couldn't do it. Your claims about who could or couldn't do it make absolutely no sense.
 So I don't think many could have followed Apple there. Apple 
 recycled a lot of their prior work and experiences. Microsoft 
 could have, sure, and I am sure they regret getting in late. 
 But, they were late with embracing Internet too, so they have 
 always followed their own mindset… and only reluctantly follow 
 new trends.
As I've linked earlier, MS had already got in early, around 2001 with their Tablet PC platform and 2003 with Windows Mobile: https://en.m.wikipedia.org/wiki/Microsoft_Tablet_PC https://en.m.wikipedia.org/wiki/Windows_Mobile Their problem was likely that they got in too early and got discouraged, not that they were "getting in late."
 But frankly, I don't think many giants would start with a GPL 
 code base like Linux.
As I pointed out above, that may be changing with their developing a non-GPL alternative called Fuchsia now. Maybe they just grabbed linux because it was already built and they were in a hurry, and now plan to remedy that mistake. On Wednesday, 8 November 2017 at 00:49:36 UTC, Jerry wrote:
 Well a tablet isn't really for development. Even a cheap laptop 
 would be better for development. You can't really do much of 
 anything with that little space. I don't think the focus should 
 be people with niche development hardware like tablets. If you 
 do enough CTFE the RAM usage of DMD shoots through the roof and 
 you'd end up not having enough RAM to compile anyways. Let 
 alone if you have enough ram but still use the 32-bit version 
 of DMD and hit that limit.
This is a big problem for MS and Windows, as I've been developing D just fine on an Android tablet with 16 GB storage. You can do the same by installing the Termux app for Android and running a single command: https://play.google.com/store/apps/details?id=com.termux&hl=en https://wiki.dlang.org/Build_D_for_Android#Native_compilation_2 If you require heavy-duty hardware to develop software, there go all the entry-level devs who cannot afford the more expensive stuff and will get going on Android instead. This isn't going to bite Windows tomorrow or even next year, but it will get them eventually. On Wednesday, 8 November 2017 at 07:04:24 UTC, Tony wrote:
 On Monday, 6 November 2017 at 08:33:16 UTC, Joakim wrote:

 The vast majority of users would be covered by 5-10 GBs of 
 available storage, which is why the lowest tier of even the 
 luxury iPhone was 16 GBs until last year.  Every time I talk 
 to normal people, ie non-techies unlike us, and ask them how 
 much storage they have in their device, whether smartphone, 
 tablet, or laptop, they have no idea.  If I look in the 
 device, I inevitably find they're only using something like 
 3-5 GBs max, out of the 20-100+ GBs they have available.
You are making an assumption that people want as much storage for a combo phone/PC as they do for only a phone. You need to also check how much storage they are using on their PCs.
You need to read what I actually wrote, I was talking about laptops too. I don't go to people's homes and check their desktops, but their laptops fall under the same low-storage umbrella, and laptops are 80% of PCs sold these days.
 I never made any previous claim about what IDEs are being 
 used. The only time I previously mentioned an IDE was with 
 regard to RemObjects and Embarcadero offering 
 cross-compilation to Android/iOS with their products.

 "There is a case to be made for supporting  Android/iOS 
 cross-compilation. But it doesn't have to come at the expense 
 of Windows 64-bit integration. Not sure they even involve the 
 same skillsets. Embarcadero and Remobjects both now support 
 Android/iOS development from their Windows (and macOS in the 
 case of Remobjects) IDEs."

 That was to highlight that those two compiler companies have 
 seen fit to also cross-compile to mobile - they saw an 
 importance to mobile development. It wasn't about what IDEs 
 are best for mobile or even what IDEs are being used for 
 mobile.
If you look back to the first mention of IDES, it was your statement, "Good luck selling game developers on using D to develop for Android, when you can't supply those same game developers a top-notch development environment for the premier platform for performance critical games - Windows 64-bit." That at least implies that they're using the same IDE to target both mobile and PC gaming, which is what I was disputing. If you agree that they use completely different toolchains, then it is irrelevant whether D supports Windows-focused IDEs, as it doesn't affect mobile-focused devs.
My statements quoted didn't mention IDEs and they didn't imply IDEs. What was implied was the initial line in the first post "* better dll support for Windows". My assumption is that game developers (or just developers) work on multiple OSes. If you want them to use a language - like D - they should find it compelling to use on all their platforms.
Your statement was made in direct response to my question, "why spend time getting D great Windows IDE support if you don't think Windows has much of a future?" I've already said I don't think there's much overlap between mobile and PC games, the markets are fairly disjoint. The top mobile games are never released for PC and vice versa. As for dll support, that was not mentioned at all in the OT thread to which you were responding, and you never called it out.
 I've always thought that flat Metro interface was best suited 
 for mobile displays, the easiest to view, render, and touch.  
 To some extent, all the other mobile interfaces have copied 
 it, with their move to flat UIs over the years.  However, it 
 obviously takes much more than a nice GUI to do well in mobile.
I don't know what a flat UI is, but every mobile OS I have used - Blackberry 9/10, Nokia Symbian, Nokia Linux, Palm OS, WebOS, Firefox OS, iOS, Android - all have the same essential interface. Icons on a scrolling desktop. Windows 8/10 Mobile, with the resizable live tiles is the only one that does the interface differently, and in my opinion, does it the best.
Yes, icons on a background- not sure how you call it a desktop anymore ;) - are now the default, as opposed to Metro's live tiles. I agree that Metro is better in that regard, though I never handled a WinPhone for more than a couple minutes, but there were all kinds of other problems with it. For example, even in my limited use I remember it had animations when you were jumping into apps or other views, presumably because it was so slow that they wanted to stick something moving in there. And doing one aspect of the UI better is meaningless when you make so many other mistakes, whether supporting multi-core very late or not realizing Continuum is a differentiator and pushing that earlier. As for flat UIs, you really should be aware of the effect your beloved Metro has had: https://en.m.wikipedia.org/wiki/Flat_design
 Why did they fund development of a new iMac Pro which is 
 coming this December as well as the new MacBook Pros that 
 came out this June? That's a contradiction of "milk it like 
 an iPod".
Because their userbase was rebelling? I take it you're not that familiar with Mac users, but they were genuinely scared that Apple was leaving them behind, since they weren't refreshing Mac and Macbooks much anymore and all Apple's focus is on iOS:
So, let them rebel. You said that they would like to see it go away, and/or they want to milk it. If you have to spend money on development to keep selling it, then you can't "milk it".
You and I and Jobs may've let them rebel, but Apple is a public corporation. They can't just let easy money go, their shareholders may not like it. Perhaps you're not too familiar with legacy calculations, but they're probably still making good money off Macs, but it just distracts and keeps good Apple devs off the real cash cow, iPhone. Even if the Mac financials aren't _that_ great anymore, you don't necessarily want to piss off your oldest and most loyal customers, who may stop buying iPhones and iPads too.
It would either be you and Jobs, or just you, letting them rebel. I would keep the line.
That's funny, as I was responding to your statement above, "So, let them rebel." :D
 The large Apple profit comes from offering quality products and 
 then pricing them at the highest gross profit margin in the 
 industry. In order to get people to pay a premium for their 
 products it helps to have a mystique or following, and the 
 macOS line helps to maintain their mystique and it is small 
 potatoes next to their phone business.
I've already said repeatedly that they're not going to drop the Mac line anytime soon, so I don't know why you want to write a paragraph justifying keeping it. As for mystique, it is laughable that you think this outdated Mac line that practically nobody buys compared to the iPhone provides any. :) More likely, they will keep milking the Mac-buying chumps till they stop, or when they can just tell them to buy an iPhone with a multi-window option instead. On Wednesday, 8 November 2017 at 07:33:53 UTC, Walter Bright wrote:
 On 11/1/2017 11:42 AM, Bo wrote:
 And frankly, Walter or whoever, there needed to have been put 
 a stop to this anti Windows bullshit several days ago. As long 
 as people use this level of disrespect towards community 
 members because they are not using the "right" platform.
Don't worry, Windows remains a high priority platform for D. In the not-so-long run, all the platforms are dead. Little to none of D will work on any platform prior to 10 years ago or so. D needs to run on the major platforms of today, and that certainly includes Windows. Nobody is obliged to work on any platform they don't want to work on. And nobody is entitled to berate anyone for working on any platform they want to.
This post contradicts or corrects nothing said in this thread, but simply responds to the crazy, unsupported claims of this guy. I understand that you probably didn't read this OT thread but maybe just saw your name and wanted to reassure this guy, but you should have at least read the responses to him, where I pointed out that it's bonkers to suggest what was written, ie showing that Windows is declining so we should limit our future investment, shows "disrespect" to him or is "anti Windows."
Nov 08
next sibling parent reply codephantom <me noyb.com> writes:
On Wednesday, 8 November 2017 at 09:34:39 UTC, Joakim wrote:
 ...
Companies (along with their technologies and profits) are like waves in the ocean..they come..and they go.. But BSD Unix.. like the energy which binds our molecules...will always be with us... it seems.. So I re-iterate. If we all just used FreeBSD, then we'd all be sitting around a fire singing kumbaya (during our break from writing stuff in D), instead of debating the merits of Microsoft, Apple and Google. ..And btw..we could immediately start writing 64bit code, with only a tiny 16MB download (dmd for freebsd). What operating system can compete with that?
Nov 08
parent reply Jonathan M Davis <newsgroup.d jmdavisprog.com> writes:
On Wednesday, November 08, 2017 10:35:17 codephantom via Digitalmars-d 
wrote:
 On Wednesday, 8 November 2017 at 09:34:39 UTC, Joakim wrote:
 ...
Companies (along with their technologies and profits) are like waves in the ocean..they come..and they go.. But BSD Unix.. like the energy which binds our molecules...will always be with us... it seems.. So I re-iterate. If we all just used FreeBSD, then we'd all be sitting around a fire singing kumbaya (during our break from writing stuff in D), instead of debating the merits of Microsoft, Apple and Google. ..And btw..we could immediately start writing 64bit code, with only a tiny 16MB download (dmd for freebsd). What operating system can compete with that?
Linux. Oh, I'm all for using FreeBSD, but most of the arguments for using FreeBSD over Windows apply to Linux. And if you can't get someone to switch from Windows to Linux, you're not going to get them to switch to FreeBSD. FreeBSD and Linux are definitely different, but the differences are small when compared with Windows.

Personally, I think that the best course of action in general as a developer is to try and make your software as cross-platform as reasonably possible and let folks run whatever they want to run. A lot of the OS-related problems we have stem from the fact that too often, software is written for a specific OS (and not just Windows software is guilty of that). Unfortunately, it's not always reasonable or possible to write cross-platform software, but IMHO, that should at least be the goal, even if you're primarily targeting a single platform for release.

All of the software at one of my previous employers is written for Windows and uses lots of Windows-specific stuff even when the code really has no need to be Windows-specific. They've talked about wanting to run some of their software on Linux, but they can't do it without some major rewrites (to the point that it might actually be better to do it from scratch), and they're far from alone in being in that boat. And it's not like there's something special about Windows that causes the problem. You could just as easily write your software to be Linux or FreeBSD-specific and then want to use it in a Windows application and be screwed.

Writing your software to be platform-agnostic really needs to be a goal from the start, and IMHO, it's really not all that hard in most cases. It's just that too often, folks assume that they're only ever going to target a single platform. But if you write your software to be as platform-agnostic as you reasonably can, then the platform that you're actually using matters a lot less. It also means that you can take advantage of development tools from multiple platforms.
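[Editor's note: a minimal D sketch of the platform-agnostic approach described above, using Phobos's std.path.buildPath so that the platform-specific path separator never appears in application code; the config path is purely illustrative.]

```d
import std.path : buildPath;
import std.stdio : writeln;

void main()
{
    // buildPath joins path segments with the platform's directory
    // separator, so the same source builds and runs on Windows and
    // Posix without scattering version (Windows) blocks through
    // application code.
    string cfg = buildPath("etc", "app", "config.ini");

    version (Windows)
        assert(cfg == `etc\app\config.ini`);
    else
        assert(cfg == "etc/app/config.ini");

    writeln(cfg);
}
```

Keeping the one unavoidable platform difference inside a library call (or, failing that, inside a single version block) is what makes a later port a recompile instead of a rewrite.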
- Jonathan M Davis
Nov 08
next sibling parent reply codephantom <me noyb.com> writes:
On Wednesday, 8 November 2017 at 11:47:32 UTC, Jonathan M Davis 
wrote:
 Personally, I think that the best course of action in general 
 as a developer is to try and make your software as 
 cross-platform as reasonably possible and let folks run 
 whatever they want to run. A lot of the OS-related problems we 
 have stem from the fact that too often, software is written for 
 a specific OS (and not just Windows software is guilty of that).
Well.. that was the role that POSIX was meant to play. Even Windows was on board, sort of, for a short time. What a joke that all turned out to be.

"Perfect application portability across UNIX-based OSes is clearly beyond the realm of possibility." (from the 2016 paper below) - http://www.cs.columbia.edu/~vatlidak/resources/POSIXmagazine.pdf (conclusion: "We believe that a new revision of the POSIX standard is due, and we urge the research community to investigate what that standard should be.")

btw. I wonder if anyone has got the linux version of DMD x64 to run on the Windows Subsystem for Linux (available in Windows 10 I believe).
Nov 08
next sibling parent Jonathan M Davis <newsgroup.d jmdavisprog.com> writes:
On Wednesday, November 08, 2017 12:35:19 codephantom via Digitalmars-d 
wrote:
 On Wednesday, 8 November 2017 at 11:47:32 UTC, Jonathan M Davis

 wrote:
 Personally, I think that the best course of action in general
 as a developer is to try and make your software as
 cross-platform as reasonably possible and let folks run
 whatever they want to run. A lot of the OS-related problems we
 have stem from the fact that too often, software is written for
 a specific OS (and not just Windows software is guilty of that).
Well.. that was the role that POSIX was meant to play. Even Windows was on board, sort of, for a short time. What a joke that all turned out to be. "Perfect application portability across UNIX-based OSes is clearly beyond the realm of possibility." (from the 2016 paper below) - http://www.cs.columbia.edu/~vatlidak/resources/POSIXmagazine.pdf (conclusion: "We believe that a new revision of the POSIX standard is due, and we urge the research community to investigate what that standard should be." btw. I wonder if anyone has got the linux version of DMD x64 to run on the Windows Subsystem for Linux (available in Windows 10 I believe).
POSIX certainly helps, but each OS that implements it adds more stuff on top of it (like extra flags or similar but different system calls that improve on the POSIX ones), and there's plenty of stuff that's simply not part of POSIX but is all over the place in slightly different forms, since it's not part of a standard. Heck, even when something is part of POSIX, that doesn't mean that it's properly and fully supported on a system that supports POSIX - e.g. the stuff that's in librt (like clock_gettime) isn't implemented on Mac OS X even though it's part of POSIX, so the stuff for getting the time in core.time and std.datetime has to be different for Mac OS X. Granted, the Mac OS X calls are actually better, but you're still stuck implementing the code differently for different OSes in spite of a standard. And while historically, Windows implemented some POSIX stuff, they went and slapped an underscore on the front of all of the names, totally breaking compatibility.

The new Windows Subsystem for Linux should be a huge step forward in some regards, but if I understand correctly, it's basically an emulation layer for running Linux programs and not something you'd use as part of a Windows program. So, it only works if you're just looking to run Linux programs under Windows, not if you want to write a program that runs as part of Windows and can take advantage of the Windows stuff where it needs to. So, how useful it is depends on what you're trying to do.

Improvements to standards to allow for more stuff to be written in a cross-platform manner without versioning stuff off for specific OSes are definitely desirable, but the reality of the matter is that even OSes that are very similar end up with differences that occasionally require versioning code - sometimes even when the API being used is part of a standard. And much as things could be improved, I don't see that ever changing.
It sure doesn't help though when each OS goes off and implements something drastically different for core stuff (like opengl vs directx). Some competition is good, but when a major API is platform-specific, it makes it a _lot_ harder to write cross-platform code. Ultimately though, even when dealing with different BSDs, you end up with portability problems if you're not careful. - Jonathan M Davis
Nov 08
prev sibling parent reply jmh530 <john.michael.hall gmail.com> writes:
On Wednesday, 8 November 2017 at 12:35:19 UTC, codephantom wrote:
 btw. I wonder if anyone has got the linux version of DMD x64 to 
 run on the Windows Subsystem for Linux (available in Windows 10 
 I believe).
I'm not that familiar with the Windows Subsystem for Linux, but it looks like it could be very useful. I'll set it up and try to install DMD tonight if I have time.
Nov 08
parent reply Joakim <dlang joakim.fea.st> writes:
On Wednesday, 8 November 2017 at 14:36:11 UTC, jmh530 wrote:
 On Wednesday, 8 November 2017 at 12:35:19 UTC, codephantom 
 wrote:
 btw. I wonder if anyone has got the linux version of DMD x64 
 to run on the Windows Subsystem for Linux (available in 
 Windows 10 I believe).
I'm not that familiar with the Windows Subsystem for Linux, but it looks like it could be very useful. I'll set it up and try to install DMD tonight if I have time.
The linux build of dmd has already been used on WSL to compile ldc without a problem: https://wiki.dlang.org/Build_LDC_for_Android#Notes_for_Bash_on_Ubuntu_on_Windows

On Wednesday, 8 November 2017 at 14:40:11 UTC, Ola Fosheim Grøstad wrote:
 On Wednesday, 8 November 2017 at 09:34:39 UTC, Joakim wrote:
 On Wednesday, 8 November 2017 at 00:09:51 UTC, Ola Fosheim 
 Grøstad wrote:
 On Tuesday, 7 November 2017 at 19:46:04 UTC, Joakim wrote:
 Not at all, it makes things easier certainly, but there's a 
 reason why mobile devs always test on the actual devices, 
 because there are real differences.
Mostly with low level stuff in my experience.
And what experience would that be? I've admitted I've never developed for Apple platforms, but my understanding is that even leaving aside the completely different touch-first UI, there are significant differences. I wonder what Mac apps you simply ported the UI over to iPhone and they just worked.
Writing code from scratch for both. No, of course you cannot port it without a little bit of work as the base UI class is slightly different. However it is overall the same Objective-C framework design. Quoting apple: «If you've developed an iOS app, many of the frameworks available in OS X should already seem familiar to you. The basic technology stack in iOS and OSX are identical in many respects. But, despite the similarities, not all of the frameworks in OS X are exactly the same as their iOS counterparts» https://developer.apple.com/library/content/documentation/MacOSX/Conceptual/OSX_Technology_Overview/MigratingFromCocoaTouch/MigratingFromCocoaTouch.html
This link also notes many other significant differences, such as mobile hardware being much more constrained and "iOS users have no direct access to the file system," as I mentioned.
 You are thinking too much short term here IMHO. The mobile
sector is rather volatile.
I have no idea what this refers to: you have a bad habit of adding asides without any explanation, and non sequiturs, so that we're left stumped as to what you're talking about.
Over-quoting is spammy. So I don't, but here you go: The mobile sector is more volatile than the desktop/laptop sector, hence it would be a risky move to dump it. I think that was quite clear from what I wrote though…
It was not clear because it is divorced from reality: which of these two markets would you rather be in? https://mobile.twitter.com/lukew/status/842397687420923904

In fact, Apple alone will likely sell more mobile iPhones and iPads this year than every PC vendor combined (see third chart): http://www.asymco.com/2016/11/02/wherefore-art-thou-macintosh/

They have already cut investment in Macs and are not bothering to upgrade the existing Mac line for longer and longer, on the way to axing that line altogether. The notion that their iOS line, which now brings in the vast majority of their profits and revenue, is riskier is a joke.
 I see, so your claim is that MS, Nokia, HP, Sony, all much 
 larger companies than Apple or google at the time, could not 
 have countered them even on a lucky day.  I wonder why this 
 is, as they certainly had more money, you don't believe 
 they're that bright? :)
No, it is because they didn't have the resources internally. Money alone does not build teams or knowledge. Apple had worked on similar technology for decades and could recycle the frameworks for their desktop OS.
 Yet the businesses that did build Android, ie google, HTC, and 
 so on, were much smaller than the corporate behemoths like HP 
 or Sony that you claimed above couldn't do it.  Your claims 
 about who could or couldn't do it make absolutely no sense.
Of course it does. They were not into operating systems and frameworks. Sony a little bit by having the Playstation, but that was very narrow and for a very narrow low level segment of programmers.
I see, so MS, Nokia, HP, Sony, and all the rest didn't have "resources internally" or knowledge of "operating systems and frameworks," but the much smaller search startup google did? When google bought Android in 2005, they had yearly revenues of $6 billion, a pittance compared to the PC and mobile giants you are excusing: https://www.informationweek.com/google-revenue-up-93--in-2005/d/d-id/1040162 I don't know if you're trying to make me laugh with these excuses or what.
 Their problem was likely that they got in too early and got 
 discouraged, not that they were "getting in late."
Apple was also in too early and got discouraged, but they reentered when the touch screen tech got better.
Which MS could have done also, but didn't. Any way you slice it, Apple grabbed an opportunity that plenty of other people could have- and according to you had the knowledge to, since you say many knew mobile was next- yet almost none of them did. That speaks to what I was trying to show with that chart of the mobile tidal wave, that everyone, including Apple to some extent, didn't see _that_ coming.
Nov 08
next sibling parent reply jmh530 <john.michael.hall gmail.com> writes:
On Wednesday, 8 November 2017 at 17:51:45 UTC, Joakim wrote:
 The linux build of dmd has already been used on WSL to compile 
 ldc without a problem:

 https://wiki.dlang.org/Build_LDC_for_Android#Notes_for_Bash_on_Ubuntu_on_Windows
Thanks. I'll make use of that. I'll be happy if I can get blas/lapack working.
Nov 08
parent reply jmh530 <john.michael.hall gmail.com> writes:
On Wednesday, 8 November 2017 at 18:06:25 UTC, jmh530 wrote:
 Thanks. I'll make use of that. I'll be happy if I can get 
 blas/lapack working.
I just got DMD set up using those instructions (though not sure all were needed, I followed them anyway). I am probably going to make good use of this, so thanks for highlighting it.
Nov 08
parent reply codephantom <me noyb.com> writes:
On Thursday, 9 November 2017 at 02:23:33 UTC, jmh530 wrote:
 I just got DMD set up using those instructions (though not sure 
 all were needed, I followed them anyway). I am probably going 
 to make good use of this, so thanks for highlighting it.
Thanks for testing it and letting us know. I'll try it out today too....(I just have to wait till the Windows 10 iso finishes downloading...so maybe I should say... I'll try it out 'tomorrow'...
Nov 08
parent codephantom <me noyb.com> writes:
On Thursday, 9 November 2017 at 02:34:35 UTC, codephantom wrote:
 I'll try it out today too....(I just have to wait till the 
 Windows 10 iso finishes downloading...so maybe I should say... 
 I'll try it out 'tomorrow'...
ohhh..wtf...it's still downloading??.....gee... I might go to sleep..and when I wake up it will be finished. It'll be like those hours never even happened.
Nov 08
prev sibling parent reply Ola Fosheim Grøstad writes:
On Wednesday, 8 November 2017 at 17:51:45 UTC, Joakim wrote:
 way to axing that line altogether. The notion that their iOS 
 line, which now brings in the vast majority of their profits 
 and revenue, is riskier is a joke.
That really depends on what you mean by risk. There is no general correlation between high profits and low risk.
 I don't know if you're trying to make me laugh with these 
 excuses or what.
So you don't understand that the foundation that Apple had for building iOS takes time, not only resources. Money does not solve all problems, but you think otherwise. Ok. I strongly disagree. I assume it is a goodhearted laughter you are enjoying…
Nov 08
parent reply Joakim <dlang joakim.fea.st> writes:
On Wednesday, 8 November 2017 at 21:02:26 UTC, Ola Fosheim 
Grøstad wrote:
 On Wednesday, 8 November 2017 at 17:51:45 UTC, Joakim wrote:
 way to axing that line altogether. The notion that their iOS 
 line, which now brings in the vast majority of their profits 
 and revenue, is riskier is a joke.
That really depends on what you mean by risk. There is no general correlation between high profits and low risk.
I'm not saying mobile isn't risky. It's a cutting-edge tech business, of course it's risky. Just look at HTC, LG, and all the other mobile vendors doing badly. However, I'd rather be in a booming risky business rather than a declining risky business, which is what the desktop market is and therefore riskier.
 I don't know if you're trying to make me laugh with these 
 excuses or what.
So you don't understand that the foundation that Apple had for building iOS takes time, not only resources. Money does not solve all problems, but you think otherwise. Ok. I strongly disagree. I assume it is a goodhearted laughter you are enjoying…
I don't know why you go back to Apple, when you clearly cut out the part of the above excuses quote where I pointed out that _google had none of the advantages_ you think were necessary to win mobile, yet created the OS that now ships on the most mobile devices. Of course it's not just a matter of money, but you were the one who mentioned how internal resources are needed, which is belied by the fact that google had much less.

You talk about OS expertise, all while HP has long had their own OS's, HP-UX and later Tru64, same with Sony and the various in-house OS's they've worked on. You don't want to own up to the fact that google succeeded with a lot less resources and OS expertise than the companies you claim couldn't do it, which suggests those factors you think were so important likely weren't.

More likely, it is what I said: the incumbents like MS or Sony just didn't foresee mobile growing so large so fast, at least that was one of the main reasons.
Nov 08
next sibling parent reply Ola Fosheim Grøstad writes:
On Wednesday, 8 November 2017 at 21:36:58 UTC, Joakim wrote:
 I don't know why you go back to Apple, when you clearly cut out 
 the part of the above excuses quote where I pointed out that 
 _google had none of the advantages_ you think were necessary to 
 win mobile, yet created the OS that now ships on the most 
 mobile devices.
Android wasn't all that great in the beginning and most manufacturers didn't make much money off it. Samsung was more the exception than the rule, and no, not only Google is making Android happen. For a single company to go that route alone you better have a good starting point. Microsoft had it, obviously. Apple had it. Maybe the owners of BeOS could have done it, not sure, but there are few companies that actually could have produced a high quality OS + application frameworks + hardware in anything less than a decade. Apple could focus on hardware and drivers and a little bit of fickling with their existing OS-X frameworks. That's a major difference.
 belied by the fact that google had much less.  You talk about 
 OS expertise, all while HP has long had their own OS's, HP-UX
That's only a generic Unix with X11 on top. HP had WebOS, but gave up on it!! I can only assume they realized it would be too time consuming and too expensive to be worthwhile.

Just take a look at how difficult it is to build something as simple as the D or C++ standard library. Then multiply that by the challenges of creating complete application frameworks. Nokia bought up QT (which isn't all that great) for a reason, and for _a lot_ of money! I think you underestimate what it takes to get it all to work together in a reasonable manner.

Anyhow, with Android out there as a possible contender it basically wouldn't make a whole lot of sense to invest in rolling your own OS. I assume that is the reason HP let WebOS stagnate.
Nov 08
next sibling parent Ola Fosheim Grøstad writes:
On Wednesday, 8 November 2017 at 22:28:32 UTC, Ola Fosheim 
Grøstad wrote:
 in anything less than a decade. Apple could focus on hardware 
 and drivers and a little bit of fickling with their existing 
 OS-X frameworks. That's a major difference.
I didn't mean «fickling», that was quasi-norwegian… I meant «tinkering».
Nov 08
prev sibling parent reply Joakim <dlang joakim.fea.st> writes:
On Wednesday, 8 November 2017 at 22:28:32 UTC, Ola Fosheim 
Grøstad wrote:
 On Wednesday, 8 November 2017 at 21:36:58 UTC, Joakim wrote:
 I don't know why you go back to Apple, when you clearly cut 
 out the part of the above excuses quote where I pointed out 
 that _google had none of the advantages_ you think were 
 necessary to win mobile, yet created the OS that now ships on 
 the most mobile devices.
Android wasn't all that great in the beginning and most manufacturers didn't make much money off it. Samsung was more the exception than the rule, and no, not only Google is making Android happen. For a single company to go that route alone you better have a good starting point. Microsoft had it, obviously. Apple had it. Maybe the owners of BeOS could have done it, not sure, but there are few companies that actually could have produced a high quality OS + application frameworks + hardware in anything less than a decade. Apple could focus on hardware and drivers and a little bit of fickling with their existing OS-X frameworks. That's a major difference.
Google pretty much did it on their own in around five years, as all indications are that Android is mostly developed in-house. Yes, the Android hardware vendors add polish, some drivers, and their own skins, but most of the source comes from google.
 belied by the fact that google had much less.  You talk about 
 OS expertise, all while HP has long had their own OS's, HP-UX
That's only a generic Unix with X11 on top. HP had WebOS, but gave up on it!! I can only assume they realized it would be too time consuming and too expensive to be worthwhile.
The point is that HP had plenty of OS expertise. As for WebOS, HP didn't buy it till 2010, when mobile sales were just passing PC sales and it was getting too late. WebOS was not only a dumb idea, just like ChromeOS, it likely had major technical issues, judging from the reviews I read at the time.
 Just take a look at how difficult it is to build something as 
 simple as D or C++ standard library. Then multiply that by the 
 challenges when create complete application frameworks. Nokia 
 bought up QT (which isn't all that great) for a reason, and for 
 _a lot_ of money!
And yet google, much smaller than MS or HP and without the OS expertise you say is needed, did all that mostly by themselves.
 I think you underestimate what it takes to get it all to work 
 together in a reasonably manner. Anyhow, with Android out there 
 as a possible contender it basically wouldn't make a whole lot 
 of sense to invest in rolling your own OS. I assume that is the 
 reason HP let WebOS stagnate.
I think you greatly overestimate what was needed to compete in this mobile market at that time. I'm not saying it was easy, but the PC and mobile giants before iOS/Android clearly didn't have the vision or ability to execute what google, a much smaller search company, did with Android (leaving aside Apple because of your silly claims that their existing software gave them a headstart). That lack of vision is why those former computing giants are all either dead or fading fast.
Nov 08
next sibling parent reply Paulo Pinto <pjmlp progtools.org> writes:
On Thursday, 9 November 2017 at 00:09:32 UTC, Joakim wrote:
 ...
 I think you greatly overestimate what was needed to compete in 
 this mobile market at that time.  I'm not saying it was easy, 
 but the PC and mobile giants before iOS/Android clearly didn't 
 have the vision or ability to execute what google, a much 
 smaller search company, did with Android, leaving aside Apple 
 because of your silly claims that their existing software gave 
 them a headstart, which is why those former computing giants 
 are all either dead or fading fast.
Google bought the company responsible for Hiptop, which was already developing Android, where the majority of employees were former BeOS employees, many of which are still on the Android team.
Nov 09
parent Joakim <dlang joakim.fea.st> writes:
On Thursday, 9 November 2017 at 12:27:49 UTC, Paulo Pinto wrote:
 On Thursday, 9 November 2017 at 00:09:32 UTC, Joakim wrote:
 ...
 I think you greatly overestimate what was needed to compete in 
 this mobile market at that time.  I'm not saying it was easy, 
 but the PC and mobile giants before iOS/Android clearly didn't 
 have the vision or ability to execute what google, a much 
 smaller search company, did with Android, leaving aside Apple 
 because of your silly claims that their existing software gave 
 them a headstart, which is why those former computing giants 
 are all either dead or fading fast.
Google bought the company responsible for Hiptop, which was already developing Android, where the majority of employees were former BeOS employees, many of which are still on the Android team.
Not quite: the company responsible for the Hiptop was Danger, which was acquired by MS in 2008: https://en.m.wikipedia.org/wiki/Danger_Inc. Some key people left Danger to start Android before that, which is what you're thinking of. I mentioned that 2005 google acquisition of Android earlier in this thread.

I'm not sure what point you're trying to make though, as HP, Sony, MS, Nokia, etc. had enough money to buy 50 such companies, ie google didn't have any resource or "OS expertise" advantage over those computing giants. They certainly had a better vision for mobile and arguably other technical skills.

It's funny, everybody is now ridiculing the dismissive statements made by those giants when Android launched a decade ago: https://www.engadget.com/2007/11/05/symbian-nokia-microsoft-and-apple-downplay-android-relevance/
Nov 09
prev sibling parent reply Ola Fosheim Grøstad writes:
On Thursday, 9 November 2017 at 00:09:32 UTC, Joakim wrote:
 smaller search company, did with Android, leaving aside Apple 
 because of your silly claims that their existing software gave 
 them a headstart, which is why those former computing giants 
 are all either dead or fading fast.
It is hardly a silly claim:

NextStep (1989) ==> OS-X (2001) ==> iOS (2007)

That is 18 years of evolution and experience, and it also meant that they had the development tooling ready + experienced developers for their platform (macOS programmers). It also mattered a lot that Apple already had the manufacturing experience with prior attempts and also the streamlining of the iPod line, as well as the infrastructure for distribution and following up customers (again from the iPod line).

So, for Apple it was a relatively modest step to go from iPod + Mac frameworks + standard 3rd party chips + existing tooling + iTunes => iPhone

I think you are forgetting that hardly anyone wanted to develop apps for Android in the first few years. Android was pariah, and everybody did iOS apps first, then if it was a big success then maybe they would try to port it over to Android (but usually not).
Nov 09
next sibling parent reply Ola Fosheim Grøstad writes:
I also think we should add to this discussion that Google was 
hellbent on going forward with Android even when it was clearly 
inferior. Apple tried to squish out Google's services from their 
iOS products for a while. And that is exactly what Google tries 
to prevent by funding things like Chrome and Android.

So for Google Chrome and Android does not have to make sense in 
business terms, it is basically an anti-competitive tool to 
protect their own hegemony (relative monopoly) by retaining 
critical mass and making it difficult for competitors to build up 
a competing product over time (you need a source of income while 
your product is evolving from mediocre to great to do that).
Nov 09
parent reply Joakim <dlang joakim.fea.st> writes:
On Thursday, 9 November 2017 at 14:22:22 UTC, Ola Fosheim Grøstad 
wrote:
 I also think we should add to this discussion that Google was 
 hellbent on going forward with Android even when it was clearly 
 inferior. Apple tried to squish out Google's services from 
 their iOS products for a while. And that is exactly what Google 
 tries to prevent by funding things like Chrome and Android.
Do you blame them, given such anti-competitive measures long undertaken by MS and Apple?
 So for Google Chrome and Android does not have to make sense in 
 business terms, it is basically an anti-competitive tool to 
 protect their own hegemony (relative monopoly) by retaining 
 critical mass and making it difficult for competitors to build 
 up a competing product over time (you need a source of income 
 while your product is evolving from mediocre to great to do 
 that).
There is some truth to this, but if you cannot compete with a free product- cough, cough, Windows Mobile- I don't know what to tell you. In other words, google cannot afford to spend a fraction of the money on Android that Apple spends on iOS, because google makes so little money off of Android by comparison, so there are disadvantages to their free model too. It is one of the reasons why they have now plunged into the high-end smartphone market with their recent Pixel line. I think the lack of a viable business model for Android vendors, other than Samsung, is a huge problem for the platform, as Apple hoovers up two-thirds of the profit with only a tenth of the phones sold: https://www.counterpointresearch.com/80-of-global-handset-profits-comes-from-premium-segment/ As I said earlier, the mobile OS story is not over yet, there are more changes to come.
Nov 09
next sibling parent Jerry <hurricane hereiam.com> writes:
On Thursday, 9 November 2017 at 14:42:41 UTC, Joakim wrote:
 There is some truth to this, but if you cannot compete with a 
 free product- cough, cough, Windows Mobile- I don't know what 
 to tell you.  In other words, google cannot afford to spend a 
 fraction of the money on Android that Apple spends on iOS, 
 because google makes so little money off of Android by 
 comparison, so there are disadvantages to their free model too.
  It is one of the reasons why they have now plunged into the 
 high-end smartphone market with their recent Pixel line.

 I think the lack of a viable business model for Android 
 vendors, other than Samsung, is a huge problem for the 
 platform, as Apple hoovers up two-thirds of the profit with 
 only a tenth of the phones sold:

 https://www.counterpointresearch.com/80-of-global-handset-profits-comes-from-premium-segment/

 As I said earlier, the mobile OS story is not over yet, there 
 are more changes to come.
People that buy Android phones, I find, tend to keep them for longer. People with Apple phones keep buying new ones. Part of that is how many phones Apple claims are on the latest version. So developers only target the latest one, then their apps don't run on old phones and it encourages people to "upgrade". Android apps tend to support more versions as well; it's a more diverse OS.

I've even seen websites that just straight up drop support for old versions of Safari. Can't get the latest version of Safari cause you can't update your phone. Then you go to Firefox just to find out you can't install it cause it's no longer supported on that iOS version. Can't even download an old version of Firefox that did support it cause it's Apple's store and they don't support that.
Nov 09
prev sibling next sibling parent Ola Fosheim Grøstad writes:
On Thursday, 9 November 2017 at 14:42:41 UTC, Joakim wrote:
 Do you blame them, given such anti-competitive measures long 
 undertaken by MS and Apple?
Big businesses do what they can get away with. Once upon a time governments cared about anti-trust (E.g. AT&T and IBM), but nowadays it seems like they don't care much about enabling competition where smaller players get a shot. Governments seem to let the big multi-national corporations do what they want. It's not like MS was punished much for their behaviour… (EU has mounted a little bit of resistance, but only thanks to individuals.)
 There is some truth to this, but if you cannot compete with a 
 free product- cough, cough, Windows Mobile- I don't know what 
 to tell you.
I actually think the Microsoft phones looked quite appealing, but I didn't get the sense that Microsoft would back it up over time. Perception is king. Google had the same problem with Dart. They kept developing Dart, but after they announced that it didn't get into Chrome, many started to wonder if that was the beginning of the end.
  In other words, google cannot afford to spend a fraction of 
 the money on Android that Apple spends on iOS, because google 
 makes so little money off of Android by comparison, so there 
 are disadvantages to their free model too.
As far as I can tell from the iOS APIs, the internals don't seem to change all that much anymore. I'm sure they do a lot on hardware, drivers and tooling.
 As I said earlier, the mobile OS story is not over yet, there 
 are more changes to come.
Yes, that probably is true. The teenager/young-adult segment can shift things real fast if someone pushes out a perfect mobile gaming device.
Nov 09
prev sibling parent codephantom <me noyb.com> writes:
On Thursday, 9 November 2017 at 14:42:41 UTC, Joakim wrote:
 As I said earlier, the mobile OS story is not over yet, there 
 are more changes to come.
Yeah...like more factories making more dongles. You want a dongle? https://www.youtube.com/watch?v=-XSC_UG5_kU
Nov 09
prev sibling parent Joakim <dlang joakim.fea.st> writes:
On Thursday, 9 November 2017 at 14:15:47 UTC, Ola Fosheim Grøstad 
wrote:
 On Thursday, 9 November 2017 at 00:09:32 UTC, Joakim wrote:
 smaller search company, did with Android, leaving aside Apple 
 because of your silly claims that their existing software gave 
 them a headstart, which is why those former computing giants 
 are all either dead or fading fast.
It is hardly a silly claim: NextStep (1989) ==> OS-X (2001) ==> iOS (2007) That is 18 years of evolution and experience, and it also meant that they had the development tooling ready + experienced developers for their platform (macOS programmers). It also mattered a lot that Apple already had the manufacturing experience with prior attempts and also the streamlining of the iPod-line as well as the infrastructure for distribution and following up customers (again from the iPod line). So, for Apple it was a relatively modest step to go from iPod + Mac frameworks + standard 3rd party chips + existing tooling + iTunes => iPhone I think you are forgetting that hardly anyone wanted to develop apps for Android in the first few years. Android was pariah, and everybody did iOS apps first, then if it was a big success then maybe they would try to port it over to Android (but usually not).
I agree that Apple had an advantage in getting into the smartphone market, but MS, RIM, Nokia, etc. had much larger advantages in this regard. And you continue to ignore that Android and Google started their mobile OS from scratch and now ship on the most smartphones. Of course, they just grabbed existing tech like the Linux kernel, Java, and various other OSS projects and put it all together with code of their own, but that's something any of the computing giants and many other upstarts like HTC or Asus could have done. Yet they didn't, which suggests a lack of vision or of some technical ability other than "OS expertise."
Nov 09
prev sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 11/8/2017 1:36 PM, Joakim wrote:
 You don't want to own up to the fact that
Please refrain from berating others here.
Nov 12
parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= writes:
On Monday, 13 November 2017 at 10:26:57 UTC, Joakim wrote:
 I accurately characterized the tenor of their problem
Uhm… «accurately» ?? LOL!! 8'D
 generalize and point that out, ie he _was_ confused in the 
 points he was making.
I am never confused, but this is dlang.org, I've seen worse…
Nov 13
parent Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= writes:
On Monday, 13 November 2017 at 11:46:42 UTC, Joakim wrote:
 Considering you kept ignoring my evidence of Android and 
 jumping to Apple, I'd say that it was perfectly accurate.
Oh well, I'm focusing on what I am interested in… Anyway, it is rather obvious that subjective ad-hominem statements in a debate will hardly be «perfectly accurate» (you were dead wrong, and that is perfectly accurate, of course ;-)
 I didn't say you were confused, the "confused" comments that 
 Walter pasted were made to Tony.  But thank you for 
 demonstrating that it happens to you too. ;)
I am never confused. Get it? NEVER!!!
Nov 13
prev sibling parent reply codephantom <me noyb.com> writes:
On Wednesday, 8 November 2017 at 11:47:32 UTC, Jonathan M Davis 
wrote:

 Oh, I'm all for using FreeBSD, but most of the arguments for 
 using FreeBSD over Windows apply to Linux. And if you can't get 
 someone to switch from Windows to Linux, you're not going to 
 get them to switch to FreeBSD. FreeBSD and Linux are definitely 
 different, but the differences are small when compared with 
 Windows.
Except that Linux/GNU is basically a clone of a clone. BSD is... just BSD... from which all the clones are made ;-)

More important is the GPL vs BSD licence thing. If you examine GPL code and think... mmm... that looks good, I might use it in my app... then you're in trouble if you distribute that app without also distributing your code. BSD gives you 'genuine freedom' to use the code as you see fit - just don't try claiming that you wrote it, or you'll be in trouble.

There is also the 'distribution' thing... FreeBSD is a single, managed, complete distribution. Linux is just a kernel. It's combined with various GNU stuff to make up a distribution, and most distributions make their own little changes here and there, and you never really know what's going on. With FreeBSD there is only the FreeBSD distribution.

So there may be similarities between FreeBSD and Linux/GNU, but their differences are really significant and warrant attention.

Oddly enough, whatever draws me to FreeBSD also draws me to D - I'm still not sure what it is... but the word 'freedom' keeps coming to mind. I cannot say that for Linux as much. I cannot say that for golang. They offer freedom, and at the same time set out to restrict it.
Nov 09
parent reply Jonathan M Davis <newsgroup.d jmdavisprog.com> writes:
On Thursday, November 09, 2017 23:42:37 codephantom via Digitalmars-d wrote:
 On Wednesday, 8 November 2017 at 11:47:32 UTC, Jonathan M Davis

 wrote:
 Oh, I'm all for using FreeBSD, but most of the arguments for
 using FreeBSD over Windows apply to Linux. And if you can't get
 someone to switch from Windows to Linux, you're not going to
 get them to switch to FreeBSD. FreeBSD and Linux are definitely
 different, but the differences are small when compared with
 Windows.
Except that Linux/GNU is basically a clone of a clone. BSD is... just BSD... from which all the clones are made ;-)

More important is the GPL vs BSD licence thing. If you examine GPL code and think... mmm... that looks good, I might use it in my app... then you're in trouble if you distribute that app without also distributing your code. BSD gives you 'genuine freedom' to use the code as you see fit - just don't try claiming that you wrote it, or you'll be in trouble.

There is also the 'distribution' thing... FreeBSD is a single, managed, complete distribution. Linux is just a kernel. It's combined with various GNU stuff to make up a distribution, and most distributions make their own little changes here and there, and you never really know what's going on. With FreeBSD there is only the FreeBSD distribution.

So there may be similarities between FreeBSD and Linux/GNU, but their differences are really significant and warrant attention.

Oddly enough, whatever draws me to FreeBSD also draws me to D - I'm still not sure what it is... but the word 'freedom' keeps coming to mind. I cannot say that for Linux as much. I cannot say that for golang. They offer freedom, and at the same time set out to restrict it.
I don't disagree that there are differences between FreeBSD and Linux, but my point is that for most folks, the differences are small enough that it's not all that different from trying to convince someone to use one Linux distro or another - especially if you're trying to convince a Windows user, since Windows is so drastically different from both. In most cases, whether you run FreeBSD or Linux really comes down to preference. For the most part, they both serve people's needs very well and on the surface aren't very different.

I definitely prefer the BSD license to the GPL, as well as how the BSDs typically go about designing things, but if you don't care about the licensing situation, whether it even matters to you which you're using starts getting down to some pretty specific stuff that would seem fairly esoteric to a lot of folks (especially non-geeks). It's even the case that most software that runs on one runs on the other - including the desktop environments - so while the differences definitely matter, they tend to be pretty small from the end user's point of view.

Plenty of us do get picky about details, which would lead us to one or the other depending on our preferences, but there are way more similarities than differences - to the point that to many folks, the differences seem pretty superficial.

- Jonathan M Davis
Nov 09
next sibling parent codephantom <me noyb.com> writes:
On Friday, 10 November 2017 at 00:23:03 UTC, Jonathan M Davis 
wrote:
 Plenty of us do get picky about details, which would lead us to 
 one or the other, depending on our preferences, but there are 
 way more similarities than differences - to the point that to 
 many folks, the differences seem pretty superficial.

 - Jonathan M Davis
No, the diffs really are considerable. FreeBSD is not Linux. For example, FreeBSD doesn't have systemd ;-) https://www.youtube.com/watch?v=MpDdGOKZ3dg
Nov 09
prev sibling parent reply codephantom <me noyb.com> writes:
On Friday, 10 November 2017 at 00:23:03 UTC, Jonathan M Davis 
wrote:
 I don't disagree that there are differences between FreeBSD and 
 Linux, but my point is that for most folks, the differences are 
 small enough that it's not all that different from trying to 
 convince someone to use one Linux distro or another - 
 especially if you're trying to convince a Windows user, since 
 Windows is so drastically different from both.
My Windows 10 just finished downloading. I installed it, and even a techie nerd like me couldn't work it out. I think Windows 10 is enough to convince users to switch ... to anything ;-) https://www.youtube.com/watch?v=KHG6fXEba0A
Nov 09
next sibling parent jmh530 <john.michael.hall gmail.com> writes:
On Friday, 10 November 2017 at 01:19:06 UTC, codephantom wrote:
 Well, everytime I wanted to find something, I had to google 
 it...

 Then I realised I had to pay for it as well...and, that's when 
 i gave up.
Bill Gates wasn't the richest man in the world for so long without reason. ;)
Nov 09
prev sibling next sibling parent Tony <tonytdominguez aol.com> writes:
Apple had a big benefit on mobile with their iTunes store that 
had already been established on Desktop and the very popular 
iPod. They also had rich USA buyers who bought more apps than 
users of the other platforms which encouraged developers to 
target iOS. And they had the Apple/Jobs mystique.
Nov 10
prev sibling next sibling parent codephantom <me noyb.com> writes:
On Friday, 10 November 2017 at 22:16:55 UTC, Jerry wrote:
 Indeed, you could contact Microsoft for support and know you 
 are talking to professional and not some rabid fanatic that 
 will split hairs over the differences between linux and freebsd.
Well... if MSFT stopped making stupid design decisions, they could invest their money in more innovation, instead of investing it in supporting and correcting those stupid design decisions. Since Windows XP, what have they done:

- they released Vista (people lost their jobs over that, and MSFT had to go back to the drawing board and actually consider what their customers want for a change).
- have you ever compared opening Event Viewer on Windows XP to opening it on every Windows version since XP... it just gets bigger and slower to open.
- then they released Windows 7, with its fancy Aero interface (which I really liked).
- then they took it away.
- then they added all this so-called 'intelligence' into the o/s, which just bloated it and made it slower.
- then they took the Start button away.
- then they thought tiles were a better way to find your programs.
- then they thought preventing users from customising their system is something that should be done.
- then they thought the boring, plain Metro interface is innovative.
- then they thought preventing users from stopping the automatic installation of updates was a good idea.
- then they thought treating the desktop like a mobile tablet is a good idea.
- then they made it so hard for anyone to find anything that users have to revert to using their new little widget that tracks everything the user does and sends it off to MSFT for big-data analysis.

...oh man... I could just go on and on...

The only innovation in software in the last decade or more has come from open-source projects. So anyone who suggests we look to MSFT for design decisions had better think again.

When I joined the forum a little while back, I dared to suggest that D should be able to compile a 64-bit binary on Windows without having to rely on gigabytes of proprietary, closed-source bloat from MSFT. I stand by that comment, despite the harassment from the many MSFT fanboys on these forums.
I've also noticed that since I made that comment, there's been an increase in attempts to do just that. Which is great.
Nov 10
prev sibling parent reply solidstate1991 <laszloszeremi outlook.com> writes:
After all this flaming about Windows and mobile devices (I personally prefer my desktop PC thanks to its "power", or at least what's left of it after a long stretch of unemployment and lack of income; I have a Nokia Lumia which I cannot upgrade to W10 due to BS reasons; and I think it's open-source architectures that will kill off the proprietary ARM and x86 in the long run, not the mobile platform killing off desktops/laptops - funny story: my mother tried to ditch the desktop multiple times for mobile, then went back, and the same happened with one of my cousins after he realized that pay-to-win games suck), can we get back on the rails? While it's true that Windows and the desktop are losing their place, we need to support Windows at a much higher level as long as there's a large number of PCs out there. Game development would highly benefit from D thanks to its all-in-one approach; it could probably cut a few million off AAA game development. Also, audio engineers are switching to Windows, thanks to Apple scrapping the I/O on their products (I'm also a digital artist, and have to stay with Windows due to drivers, software, and ease of use).

Walter Bright: What's the licensing state of DMC and OPTLINK? Can it be made open-source? If yes, we should patch in COFF32/64 support, maybe even port it to D for easier development. I can spend some of my time working on the DLL support if needed.
Nov 14
next sibling parent rikki cattermole <rikki cattermole.co.nz> writes:
On 15/11/2017 3:15 AM, solidstate1991 wrote:
 After all this flaming about Windows, mobile devices (I personally 
 prefer my desktop PC thanks to its "power", or at least what it used to 
 left, thanks to long unemployment time and lack of income, have a Nokia 
 Lumia which I cannot upgrade to W10 due to BS reasons, and I think 
 open-source architectures will kill off the proprietary ARM and x86 in 
 the long run, not the mobile platform the desktops/laptops(funny story 
 is that my mother tried to ditch desktop multiple times for the mobile, 
 then got back, same happened with one of my cousin after he realized 
 that pay-to-win games suck)), can we get back on rails? While its true 
 that Windows and desktop is losing its place, we need to support Windows 
 on a much higher level as long as there's a large number of PCs out 
 there. Game development would highly benefit from D thanks to its 
 all-in-one approach, probably could cut a few millions off from AAA game 
 development. Also audio-engineers are switching to Windows, thanks to 
 Apple scrapping the IO on their products (I'm also a digital artist, 
 have to stay with Windows due to drivers, software, and ease of use).
 
 Walter Bright: What's the licensing state of DMC and OPTLINK? Can it 
 made open-source? If yes, we should patch in a COFF32/64 support, maybe 
 even port it to D for easier development. I can spend some of my time 
 working on the DLL support if needed.
https://github.com/DigitalMars/optlink/pull/19
Nov 14
prev sibling next sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 11/14/2017 7:15 PM, solidstate1991 wrote:
 Walter Bright: What's the licensing state of DMC and OPTLINK?
Boost
 Can it made open-source?
Yes.
 If yes, we should patch in a COFF32/64 support, maybe even port it 
 to D for easier development. I can spend some of my time working on the DLL 
 support if needed.
You're welcome to do it, it's something I've been meaning to do anyway. Optlink will never support MsCoff, you'll realize that when you look at the source :-(
Nov 14
parent reply solidstate1991 <laszloszeremi outlook.com> writes:
On Wednesday, 15 November 2017 at 04:34:09 UTC, Walter Bright 
wrote:
 On 11/14/2017 7:15 PM, solidstate1991 wrote:
 Walter Bright: What's the licensing state of DMC and OPTLINK?
Boost
 Can it made open-source?
Yes.
 If yes, we should patch in a COFF32/64 support, maybe even 
 port it to D for easier development. I can spend some of my 
 time working on the DLL support if needed.
You're welcome to do it, it's something I've been meaning to do anyway. Optlink will never support MsCoff, you'll realize that when you look at the source :-(
It's filled with assembly code, and otherwise not very readable. It would need a lot of work; I don't think it would be worth it. Let's hope that MS will allow us to distribute a linker alongside DMD.
Nov 16
parent reply David Nadlinger <code klickverbot.at> writes:
On Friday, 17 November 2017 at 02:01:41 UTC, solidstate1991 wrote:
 It's filled with Assembly code, and otherwise not very 
 readable. Would need a lot of work, I don't think it would 
 worth it. Let's hope that MS will allow us to distribute a 
 linker alongside DMD.
The more promising avenue would probably be to distribute LLD with DMD. This still leaves the system library licensing to deal with, but if I remember correctly, one of the usual suspects (Rainer? Vladimir?) was working on generating them from MinGW headers. — David
Nov 17
next sibling parent Vladimir Panteleev <thecybershadow.lists gmail.com> writes:
On Friday, 17 November 2017 at 23:31:07 UTC, David Nadlinger 
wrote:
 The more promising avenue would probably be to distribute LLD 
 This still leaves the system library licensing to deal with, 
 but if I remember correctly, one of the usual suspects (Rainer? 
 Vladimir?) was working on generating them from MinGW headers.
Most of the core.sys.windows package is now based on the win32 package from the bindings project. The license headers of the D files used from there claim that they were placed in the public domain, and I believe they were originally translated from the MinGW headers.
Nov 17
prev sibling parent MrSmith <mrsmith33 yandex.ru> writes:
On Friday, 17 November 2017 at 23:31:07 UTC, David Nadlinger 
wrote:
 The more promising avenue would probably be to distribute LLD 
 with DMD. This still leaves the system library licensing to 
 deal with, but if I remember correctly, one of the usual 
 suspects (Rainer? Vladimir?) was working on generating them 
 from MinGW headers.

  — David
Btw, what about LIBC from DMC - is it open-source now too? Can we use it with DMD?
Nov 17
prev sibling parent reply Joakim <dlang joakim.fea.st> writes:
On Wednesday, 15 November 2017 at 03:15:04 UTC, solidstate1991 
wrote:
 After all this flaming about Windows, mobile devices (I 
 personally prefer my desktop PC thanks to its "power", or at 
 least what it used to left, thanks to long unemployment time 
 and lack of income, have a Nokia Lumia which I cannot upgrade 
 to W10 due to BS reasons, and I think open-source architectures 
 will kill off the proprietary ARM and x86 in the long run, not 
 the mobile platform the desktops/laptops(funny story is that my 
 mother tried to ditch desktop multiple times for the mobile, 
 then got back, same happened with one of my cousin after he 
 realized that pay-to-win games suck)), can we get back on 
 rails? While its true that Windows and desktop is losing its 
 place, we need to support Windows on a much higher level as 
 long as there's a large number of PCs out there. Game 
 development would highly benefit from D thanks to its 
 all-in-one approach, probably could cut a few millions off from 
 AAA game development. Also audio-engineers are switching to 
 Windows, thanks to Apple scrapping the IO on their products 
 (I'm also a digital artist, have to stay with Windows due to 
 drivers, software, and ease of use).
I just saw this post about the upcoming Lenovo/AT&T Moto Tab and thought of you:

https://www.phonearena.com/news/Lenovo-Moto-Tab-ATT-features_id99782

For $300, you can buy a tablet that lets you do everything you normally do on a tablet, plus watch TV on the go. If you want to use it for work, you buy the bluetooth accessories shown in that embedded promo youtube video and you can do that too. Want a screen in your kitchen, to control that optional speaker, watch recipe videos while you cook, and do video calls? That's a fairly new use case you can try out too.

So for $300 or a bit more, depending on what accessories you get, you replace your laptop and TV, and have completely new things you can do. While this effort is fairly ambitious- having watched movies on my tablet with family members, similar to how the family in the video does, I can attest that your arms get tired holding the tablet out front like they do- it seems to me that mobile convergence is only increasing.

As for your mom and cousin going back to PCs, let me tell you about my own mom. Five years ago, we were both using Windows laptops: her chunky laptop for her business, my Win7 ultrabook for coding and recreation. Today, we both use Android tablets for these same uses- we're both on our second Android tablet now- plus she'll actually use her tablet at home now because a 10" tablet is nowhere near as bulky as a Windows laptop.

She never typed much in her business use, mostly reading emails and other viewing, so the laptop keyboard was always superfluous, but she had to have one because almost nobody was selling tablets a decade ago when she got it. Whereas I paired a bluetooth keyboard with my tablet and get by just fine with that.

The sales data I've linked shows that there are a lot more people like us than those you point out, and my point is that the mobile market is encroaching even on people like your family, with products like that Moto Tab.
btw, if you want to get back on-topic, simply change the topic of your post up top and write a post about the original topic, rather than posting in an OT thread about what we're talking about.
Nov 15
parent reply solidstate1991 <laszloszeremi outlook.com> writes:
On Wednesday, 15 November 2017 at 11:46:48 UTC, Joakim wrote:
 I just saw this post about the upcoming Lenovo/AT&T Moto Tab 
 and thought of you:

 https://www.phonearena.com/news/Lenovo-Moto-Tab-ATT-features_id99782

 For $300, you can buy a tablet that lets you do everything you 
 normally do on a tablet, plus watch TV on the go.  If you want 
 to use it for work, you buy the bluetooth accessories shown in 
 that embedded promo youtube video and you can do that too.  
 Want a screen in your kitchen, to control that optional 
 speaker, watch recipe videos while you cook, and do video 
 calls?  That's a fairly new use case you can try out too.

 So for $300 or a bit more, depending on what accessories you 
 get, you replace your laptop and TV, and have completely new 
 things you can do.  While this effort is fairly ambitious- 
 having watched movies on my tablet with family members, similar 
 to how the family in the video does, I can attest that your 
 arms get tired holding the tablet out front like they do- seems 
 to me that mobile convergence is only increasing.

 As for your mom and cousin going back to PCs, let me tell you 
 about my own mom.  Five years ago, we were both using Windows 
 laptops: her chunky laptop for her business, my Win7 ultrabook 
 for coding and recreation.  Today, we both use Android tablets 
 for these same uses- we're both on our second Android tablet 
 now- plus she'll actually use her tablet at home now because a 
 10" tablet is nowhere as bulky as a Windows laptop.

 She never typed much in her business use, mostly reading emails 
 and other viewing, so the laptop keyboard was always 
 superfluous, but she had to have one because almost nobody was 
 selling tablets a decade ago when she got it.  Whereas, I 
 paired a bluetooth keyboard with my tablet and get by just fine 
 with that.

 The sales data I've linked shows that there are a lot more 
 people like us than those you point out, and my point is that 
 the mobile market is encroaching even on to people like your 
 family, with products like that Moto Tab.

 btw, if you want to get back on-topic, simply change the topic 
 of your post up top and write a post about the original topic, 
 rather than posting in an OT thread about what we're talking 
 about.
I'm thinking of picking up an Android tablet for development purposes; it would be good to port my game engine to mobile devices. I'd probably have to resort to OpenGL for graphics acceleration instead of using a CPU blitter, although the blitter might work under NEON (currently I'm using SSE2).
Nov 16
parent Joakim <dlang joakim.fea.st> writes:
On Thursday, 16 November 2017 at 23:03:41 UTC, solidstate1991 
wrote:
 On Wednesday, 15 November 2017 at 11:46:48 UTC, Joakim wrote:
 [...]
I'm thinking on picking up some Android tablet for development purposes, would be good to port my game engine for mobile devices, probably have to resort for OpenGL for graphics acceleration instead of using CPU blitter, although that might work under NEON (currently I'm using SSE2).
Great! Let me know if you have any problem using ldc to compile for Android. One caveat: ldc only supports 32-bit ARM chips right now. I've been looking into making it work with 64-bit ARM, but I'm not sure exactly what that platform's doing for TLS, and LLVM will require some modification to make it work with D on AArch64. David has been working on linux/AArch64; you're welcome to chip in to that effort if you like: https://github.com/ldc-developers/ldc/issues/2153

On Friday, 17 November 2017 at 02:01:41 UTC, solidstate1991 wrote:
 On Wednesday, 15 November 2017 at 04:34:09 UTC, Walter Bright 
 wrote:
 [...]
It's filled with Assembly code, and otherwise not very readable. Would need a lot of work, I don't think it would worth it. Let's hope that MS will allow us to distribute a linker alongside DMD.
If you want to help with that, I suggest you see what Go is doing and submit a PR for us to do the same: http://forum.dlang.org/post/bwtknbuhnmadpspaccyt forum.dlang.org
Nov 16
prev sibling next sibling parent Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= writes:
On Wednesday, 8 November 2017 at 09:34:39 UTC, Joakim wrote:
 On Wednesday, 8 November 2017 at 00:09:51 UTC, Ola Fosheim 
 Grøstad wrote:
 On Tuesday, 7 November 2017 at 19:46:04 UTC, Joakim wrote:
 Not at all, it makes things easier certainly, but there's a 
 reason why mobile devs always test on the actual devices, 
 because there are real differences.
Mostly with low level stuff in my experience.
And what experience would that be? I've admitted I've never developed for Apple platforms, but my understanding is that even leaving aside the completely different touch-first UI, there are significant differences. I wonder which Mac apps had UIs you could simply port over to iPhone and have them just work.
Writing code from scratch for both. No, of course you cannot port it without a little bit of work, as the base UI class is slightly different. However, it is overall the same Objective-C framework design. Quoting Apple:

«If you've developed an iOS app, many of the frameworks available in OS X should already seem familiar to you. The basic technology stack in iOS and OSX are identical in many respects. But, despite the similarities, not all of the frameworks in OS X are exactly the same as their iOS counterparts»

https://developer.apple.com/library/content/documentation/MacOSX/Conceptual/OSX_Technology_Overview/MigratingFromCocoaTouch/MigratingFromCocoaTouch.html
 I just said they're not going to dump it, so I don't know why 
 you're going on about that.  If you mean their current lessened 
 investment is not a good idea, it's because the old desktop OS 
 doesn't matter as much, which is the whole point of this thread.
That would be an overall mistake, as they would lose mindshare among programmers, but nevertheless the desktop is a much more mature environment.
 You are thinking too much short term here IMHO. The mobile
sector is rather volatile.
I have no idea what this refers to: you have a bad habit of adding asides without any explation or non sequiturs, so that we're left stumped as to what you're talking about.
Over-quoting is spammy, so I don't, but here you go: the mobile sector is more volatile than the desktop/laptop sector, hence it would be a risky move to dump it. I think that was quite clear from what I wrote, though…
 I see, so your claim is that MS, Nokia, HP, Sony, all much 
 larger companies than Apple or google at the time, could not 
 have countered them even on a lucky day.  I wonder why this is, 
 as they certainly had more money, you don't believe they're 
 that bright? :)
No, it is because they didn't have the resources internally. Money alone does not build teams or knowledge. Apple had worked on similar technology for decades and could recycle the frameworks for their desktop OS.
 Yet the businesses that did build Android, ie google, HTC, and 
 so on, were much smaller than the corporate behemoths like HP 
 or Sony that you claimed above couldn't do it.  Your claims 
 about who could or couldn't do it make absolutely no sense.
Of course it does. They were not into operating systems and frameworks. Sony was, a little bit, with the PlayStation, but that was very narrow and aimed at a very narrow, low-level segment of programmers.
 Their problem was likely that they got in too early and got 
 discouraged, not that they were "getting in late."
Apple also got in too early and got discouraged, but they reentered when touch-screen tech got better.
Nov 08
prev sibling next sibling parent reply Tony <tonytdominguez aol.com> writes:
On Wednesday, 8 November 2017 at 09:34:39 UTC, Joakim wrote:
 Why did they fund development of a new iMac Pro which is 
 coming this December as well as the new MacBook Pros that 
 came out this June? That's a contradiction of "milk it 
 like an iPod".
Because their userbase was rebelling? I take it you're not that familiar with Mac users, but they were genuinely scared that Apple was leaving them behind, since they weren't refreshing Mac and Macbooks much anymore and all Apple's focus is on iOS:
So, let them rebel. You said that they would like to see it go away, and/or they want to milk it. If you have to spend money on development to keep selling it, then you can't "milk it".
You and I and Jobs may've let them rebel, but Apple is a public corporation. They can't just let easy money go, their shareholders may not like it. Perhaps you're not too familiar with legacy calculations, but they're probably still making good money off Macs, but it just distracts and keeps good Apple devs off the real cash cow, iPhone. Even if the Mac financials aren't _that_ great anymore, you don't necessarily want to piss off your oldest and most loyal customers, who may stop buying iPhones and iPads too.
It would either be you and Jobs, or just you, letting them rebel. I would keep the line.
That's funny, as I was responding to your statement above, "So, let them rebel." :D
"Let them rebel" was with regard to your point of view, as demonstrated by the sentence I put after it: "You said that they would like to see it go away, and/or they want to milk it." You said that Apple would be happy to see it go away. Then you added that they were "milking" the line while they could. Satisfying rebelling users doesn't jibe with either position. They rebel and you want to get rid of it - and you get rid of it. They rebel wanting changes, and you only want to keep milking it while you can - then you get rid of it, because you can't milk what you have.
 The large Apple profit comes from offering quality products 
 and then pricing them at the highest gross profit margin in 
 the industry. In order to get people to pay a premium for 
 their products it helps to have a mystique or following, and 
 the macOS line helps to maintain their mystique and it is 
 small potatoes next to their phone business.
I've already said repeatedly that they're not going to drop the Mac line anytime soon, so I don't know why you want to write a paragraph justifying keeping it.
My post was in response to this statement of yours: "Simple, they see the writing on the wall, ie much smaller sales than mobile, SO THEY WANT THE LEGACY PRODUCT TO GO AWAY, which means they can focus on the much bigger mobile market." That seems to be a contradiction of "they're not going to drop the Mac line anytime soon".
 As for mystique, it is laughable that you think this outdated 
 Mac line that practically nobody buys compared to the iPhone 
 provides any. :) More likely, they will keep milking the 
 Mac-buying chumps till they stop, or when they can just tell 
 them to buy an iPhone with a multi-window option instead.
"Nobody buys" Rolls-Royces, but they have a lot of mystique. Mystique isn't measured by sales volume. If people ever get so cost-conscious that they decide to buy a $150 companion for their phone instead of a $400 laptop, it's unlikely they will be using iPhones. You can get a nice Android phone with plenty of RAM/ROM for half the price of an iPhone.
Nov 10
next sibling parent rikki cattermole <rikki cattermole.co.nz> writes:
On 10/11/2017 10:42 AM, Tony wrote:
 If people ever get so cost-conscious that they decide to buy a $150 
 companion for their phone, instead of a $400 laptop, it's unlikely they 
 will be using iPhones. You can get a nice Android phone with plenty of 
 RAM/ROM for half the price of an  iPhone.
You can do pretty decently for $60-80 USD if you know where to look with Android. But the reality is that, for developers, desktops are going nowhere. If anything, we'll see more server workstations becoming standard for developers. I know, I have one. Well worth it if you do anything decent.
Nov 10
prev sibling parent reply Joakim <dlang joakim.fea.st> writes:
On Friday, 10 November 2017 at 10:42:37 UTC, Tony wrote:
 On Wednesday, 8 November 2017 at 09:34:39 UTC, Joakim wrote:
 Why did they fund development of a new iMac Pro which is 
 coming this December as well as the new MacBook Pros that 
 came out this June? That's a contradiction of "milk it 
 like an iPod".
Because their userbase was rebelling? I take it you're not that familiar with Mac users, but they were genuinely scared that Apple was leaving them behind, since it wasn't refreshing Macs and MacBooks much anymore and all Apple's focus was on iOS:
So, let them rebel. You said that they would like to see it go away, and/or they want to milk it. If you have to spend money on development to keep selling it, then you can't "milk it".
You and I and Jobs may have let them rebel, but Apple is a public corporation; they can't just let easy money go, and their shareholders may not like it. Perhaps you're not too familiar with legacy calculations, but they're probably still making good money off Macs; it just distracts and keeps good Apple devs off the real cash cow, the iPhone. Even if the Mac financials aren't _that_ great anymore, you don't necessarily want to piss off your oldest and most loyal customers, who may stop buying iPhones and iPads too.
It would either be you and Jobs, or just you, letting them rebel. I would keep the line.
That's funny, as I was responding to your statement above, "So, let them rebel." :D
"Let them rebel" was with regard to your point of view. As demonstrated by the sentence I put after it: "You said that they would like to see it go away, and/or they want to milk it." You said that Apple would be happy to see it go away. Then you added that they were "milking" the line while they could. Satisfying rebelling users doesn't jive with either position. They rebel and you want to get rid of it - and you get rid of it. They rebel wanting changes, and you only want to keep milk it while you can - then you get rid of it, because you can't milk what you have.
Your logic is extremely confused. Let me spell it out for you: the Mac is all but dead, particularly when compared to the mobile computing tidal wave, since they sell 10 iPhones + iPads for every Mac, according to the sales link I gave you before. They have cut investment in that legacy Mac product, but they would like to keep selling a lower-quality product at high prices to the few chumps that still maintain the old Mac aura in their heads. So that is what they do, milk the suckers still paying high prices for a rarely refreshed product with a lot more bugs. I don't know what's hard to understand about this for you. When the Mac userbase rebels, they try to calm them down and say they're coming out with a new Mac Pro _next year_, five years since the last one!

Apple is a business. As long as the Mac faithful are still willing to pay a lot of money for lower-quality products, they will gladly take their money, even though it's now just a sideline for their real business, the iPhone. Of course, they'd rather just focus on the iPhone, but if they can take a lot of devs off macOS and still milk those suckers, why wouldn't they?

Apple is all about making money, which is why they're the largest company in the world, with some forecasting that they will soon be the first company to have a market cap of... one trillion dollars!!! insertDoctorEvilPinkie();
 The large Apple profit comes from offering quality products 
 and then pricing them at the highest gross profit margin in 
 the industry. In order to get people to pay a premium for 
 their products it helps to have a mystique or following, and 
 the macOS line helps to maintain their mystique and it is 
 small potatoes next to their phone business.
I've already said repeatedly that they're not going to drop the Mac line anytime soon, so I don't know why you want to write a paragraph justifying keeping it.
My post was in response to this statement of yours "Simple, they see the writing on the wall, ie much smaller sales than mobile, SO THEY WANT THE LEGACY PRODUCT TO GO AWAY, which means they can focus on the much bigger mobile market." That seems to be a contradiction to "they're not going to drop the Mac line anytime soon".
No contradiction: they want the Mac to go away, but are happy to keep supplementing their bottom line while pulling engineers off of it, just like the iPod Touch. You seem to be confused by the fact that a business sometimes has contradictory goals (should we focus exclusively on the iPhone and make more money there, or keep the Mac limping along too?) and tries to balance the two as long as it makes sense.
 As for mystique, it is laughable that you think this outdated 
 Mac line that practically nobody buys compared to the iPhone 
 provides any. :) More likely, they will keep milking the 
 Mac-buying chumps till they stop, or when they can just tell 
 them to buy an iPhone with a multi-window option instead.
"Nobody buys" Rolls Royces, but they have a lot of mystique. Mystique isn't measured by sales volume.
On the contrary, Apple people have long talked about a halo effect from the iPod and iPhone, where their new, exciting, and much more popular mobile products have helped raise sales for their old and flagging Mac line: https://www.cultofmac.com/22331/apples-iphone-halo-effect-boosts-mac-sales-16-4-percent/
 If people ever get so cost-conscious that they decide to buy a 
 $150 companion for their phone, instead of a $400 laptop, it's 
 unlikely they will be using iPhones. You can get a nice Android 
 phone with plenty of RAM/ROM for half the price of an  iPhone.
Sure, the hypothetical iPhone with multiwindow/dock and the iPad Pro replace the expensive MacBook or Surface Pro, while the Android phone you already have, along with something like DeX/Sentio, replaces cheaper Windows PCs. I already made this point earlier.

On Friday, 10 November 2017 at 10:50:52 UTC, rikki cattermole wrote:
 On 10/11/2017 10:42 AM, Tony wrote:
 If people ever get so cost-conscious that they decide to buy a 
 $150 companion for their phone, instead of a $400 laptop, it's 
 unlikely they will be using iPhones. You can get a nice 
 Android phone with plenty of RAM/ROM for half the price of an  
 iPhone.
You can do pretty decently for $60-80 USD if you know where to look with Android. But the reality is that, for developers, desktops are going nowhere. If anything, we'll see more server workstations becoming standard for developers. I know, I have one. Well worth it if you do anything decent.
And yet I'd guess that the majority of developers already do most of their work on laptops, which are in turn being eclipsed by mobile chips, as I'm able to get by just fine writing and building C++/D code on an Android/ARM tablet. So while you and a few others may need a Core i7, 32 GB RAM desktop, most devs already don't use those. I'm sure there are a few people out there still buying Sun, HP-UX, or UNIX workstations; the guy running a honking Core i7 desktop PC is going to become like them: an antique.
Nov 10
next sibling parent reply Tony <tonytdominguez aol.com> writes:
On Friday, 10 November 2017 at 11:28:41 UTC, Joakim wrote:
 It would either be you and Jobs, or just you, letting them 
 rebel. I would keep the line.
That's funny, as I was responding to your statement above, "So, let them rebel." :D
"Let them rebel" was with regard to your point of view. As demonstrated by the sentence I put after it: "You said that they would like to see it go away, and/or they want to milk it." You said that Apple would be happy to see it go away. Then you added that they were "milking" the line while they could. Satisfying rebelling users doesn't jive with either position. They rebel and you want to get rid of it - and you get rid of it. They rebel wanting changes, and you only want to keep milk it while you can - then you get rid of it, because you can't milk what you have.
Your logic is extremely confused. Let me spell it out for you: the Mac is all but dead, particularly when compared to the mobile computing tidal wave, since they sell 10 iPhones + iPads for every Mac, according to the sales link I gave you before. They have cut investment in that legacy Mac product, but they would like to keep selling a lower-quality product at high prices to the few chumps that still maintain the old Mac aura in their heads.
You have little company in thinking the Mac line is a "low-quality product". The computer magazine writers gush about the Macbooks. As far as "all but dead", in the most recent quarter, that line did have declining sales from the previous year, but it was "5.6 billion in revenue in Q3 — over 12% of Apple’s total for the quarter".
 So that is what they do, milk the suckers still paying high 
 prices for a rarely refreshed product with a lot more bugs.  I 
 don't know what's hard to understand about this for you.  When 
 the Mac userbase rebels, they try to calm them down and say 
 they're coming out with a new Mac Pro _next year_, five years 
 since the last one!
Your logic seems extremely confused. If they aren't changing the product it won't have a "lot more bugs". With no changes you get fewer bugs over time.
 Apple is a business.  As long as the Mac faithful are still 
 willing to pay a lot of money for lower-quality products, they 
 will gladly take their money, even though it's now just a 
 sideline for their real business, the iPhone.  Of course, 
 they'd rather just focus on the iPhone, but if they can take a 
 lot of devs off macOS and still milk those suckers, why 
 wouldn't they?
What does "take a lot of devs off macOS" refer to?
 Apple is all about making money, which is why they're the 
 largest company in the world, with some forecasting that they 
 will soon be the first company to have a market cap of... one 
 trillion dollars!!! insertDoctorEvilPinkie();
Very few companies are not "all about making money". That is why Americans were laid off by the millions and replaced by workers in countries with much cheaper labor rates. Bad for the workers, good for "making money". Apple isn't unique in making all its products outside the USA. I don't see where it makes sense to call people who buy Mac products suckers (they seem especially popular with software developers) who pay extra for what you call "low-quality equipment", without saying the same thing about the people who buy iPhones. Your mantra is "people need so much less than they are buying". Well, that applies as much to iPhone users as it does to Mac users. People don't need $1,000 phones and they don't need to upgrade a phone every two years.
 The large Apple profit comes from offering quality products 
 and then pricing them at the highest gross profit margin in 
 the industry. In order to get people to pay a premium for 
 their products it helps to have a mystique or following, and 
 the macOS line helps to maintain their mystique and it is 
 small potatoes next to their phone business.
I've already said repeatedly that they're not going to drop the Mac line anytime soon, so I don't know why you want to write a paragraph justifying keeping it.
My post was in response to this statement of yours "Simple, they see the writing on the wall, ie much smaller sales than mobile, SO THEY WANT THE LEGACY PRODUCT TO GO AWAY, which means they can focus on the much bigger mobile market." That seems to be a contradiction to "they're not going to drop the Mac line anytime soon".
No contradiction: they want the Mac to go away, but are happy to keep supplementing their bottom line while pulling engineers off of it, just like the iPod Touch.
If somebody wants something to go away and they can make it go away, they make it go away. It is most certainly a contradiction to say "they want it to go away" and they "want it to not go away so they can milk it".
 You seem to be confused by the fact that a business sometimes 
 has contradictory goals (should we focus exclusively on the 
 iPhone and make more money there, or keep the Mac limping along 
 too?) and tries to balance the two as long as it makes sense.
That doesn't look like contradictory goals. It looks like two choices: only iPhone, or iPhone + macOS. They chose the latter. What exactly would Apple do, if it didn't make Macs, with regard to iPhone development that would allow it to make more money from the iPhone? Their revenue from iPhones is tens of billions of dollars a quarter. I doubt that there is any focus that the iPhone is missing.
 If people ever get so cost-conscious that they decide to buy a 
 $150 companion for their phone, instead of a $400 laptop, it's 
 unlikely they will be using iPhones. You can get a nice 
 Android phone with plenty of RAM/ROM for half the price of an  
 iPhone.
Sure, the hypothetical iPhone with multiwindow/dock and the iPad Pro replace the expensive Macbook or Surface Pro, while the Android phone you already have along with something like Dex/Sentio replaces cheaper Windows PCs. I already made this point earlier.
So Apple users need a tablet and a phone but Android users just need a phone? There are Android phones just as expensive as iPhones, in addition to the ones that are 1/10th to 1/2 the price. Why are you talking about iPads? Why would a $649 and up iPad Pro be something people need when you say they can use their phone instead of a $400 Windows laptop? That is something I would expect Tim Cook to claim.
Nov 10
next sibling parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= writes:
On Friday, 10 November 2017 at 12:55:24 UTC, Tony wrote:
 Very few companies are not "all about making money". That is 
 why Americans were laid off by the millions and replaced by 
 workers in countries with much cheaper labor rates. Bad for the 
 workers, good for "making money". Apple isn't unique in making 
 all its products outside the USA.
I understand what you mean, but I don't think it is a scientific fact that companies are all about making money. They are run by humans with a set of beliefs and desires which they operate under…

Anyway, even companies that are all about making money need to think long term, meaning to take care of their long-term reputation. Microsoft was not all about making money in the 90s, but they were all about growing and retaining market share using bad business practices, and that cost them their reputation among IT professionals. IBM also failed in the PC market by trying to profit on their brand. Apple might face a similar destiny, but maybe there are too many non-techies in their camp for that effect to kick in. Hard to tell.

Companies like Amazon are more about growth than making money… Some banks are more about being big than making money long term… Too big to fail and the government will save your ass. Etc. Family-owned businesses often have their own set of ethics related to the company's history. Same with founder-owned businesses.

I'm pretty sure Steve Jobs had a clear vision for what kind of company Apple should be and what kind of products they should make. I am not sure if the current Apple management has such clear visions…
 I don't see where it makes sense to call people who buy Mac 
 products suckers (they seem especially popular with software 
 developers) who pay extra for what you call "low-quality 
 equipment" without saying the same thing about the people who 
 buy iPhones.
I don't know. I use a Mac daily, but there is not a single product in their line today that is anywhere near good value compared with what you get by building your own Linux/Windows box or buying a quality non-Apple product from Samsung or Asus…

Apple's best desktop offer is a modest 2-core i5 at $1000 (with no screen, keyboard or mouse). Want a 2-core i7 instead? Add $350… You have to be a sucker to do that… Sorry. 2-core i7? WTF? Why is Intel even producing those?

Ok, so Apple wants developers to buy a Mac Pro instead… Let's see, here in Norway the entry-level price for a Mac Pro is… $4200, for a 6-core CPU. Uhm, for that price you could build an 18-core rig… Yes, one has to either be a non-tech sucker or be locked into the Mac ecosystem to buy at those rates.
Nov 10
next sibling parent Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= writes:
Oh wait, I forgot. They have a new 8-core model that is expected 
to sell for $5000… Right… So that would put the 18-core model 
at… $15000?

At what price point is it reasonable to call Apple customers 
suckers? :-)
Nov 10
prev sibling parent reply Tony <tonytdominguez aol.com> writes:
On Friday, 10 November 2017 at 14:28:10 UTC, Ola Fosheim Grøstad 
wrote:
 On Friday, 10 November 2017 at 12:55:24 UTC, Tony wrote:
 Very few companies are not "all about making money". That is 
 why Americans were laid off by the millions and replaced by 
 workers in countries with much cheaper labor rates. Bad for 
 the workers, good for "making money". Apple isn't unique in 
 making all its products outside the USA.
I understand what you mean, but I don't think it is a scientific fact that companies are all about making money. They are run by humans with a set of beliefs and desires which they operate under…
But those humans at the top, working for public companies, are monitored by a board and stockholders who place "making money" as the main, and normally only, measure of their job performance.
 Anyway, even companies that are all about making money need to 
 think long term, meaning to take care of their long term 
 reputation. Microsoft was not all about making money in the 
 90s, but they were all about growing and retaining market share 
 using bad business practices and that cost them their 
 reputation among IT professionals.
"growing and retaining market share" is a part of "all about making money", to me. My definition of "not all about making money" is when a company does things to benefit the environment or citizens or employees that they could have legally avoided, which gives them lower profits than they would have had from the other course of action. There are donations for various causes made by some public companies, but I think those are normally an insignificant percentage of their profits.
 Companies like Amazon are more about growth than making money… 
 Some banks are more about being big than making money long 
 term… Too big to fail and the government will save your ass. 
 Etc.
I see Amazon as foregoing profits now for growth - and also wiping out the competition - in order to reap massive profits in the future. At least, I haven't heard of them foregoing profits in order to benefit employees, citizens or the environment. Their stock price has a very high valuation (P/E ratio of 285.1), reflective of investors expecting massive profits in the future.
 I don't know. I use a Mac daily, but there is not a single 
 product in their line today that is anywhere near good value 
 compared with what you get by building your own Linux/Windows 
 box or buying a quality non-Apple product from Samsung or Asus…
That is what I see as the Apple way of doing things from their beginning back in the late 1970s. They make premium and/or unique products and then mark them up more than anybody in the industry. Their products have always been unique with regard to the OS (except for a year or two when they allowed Mac clones), meaning that no other manufacturer can offer an identical product.
Nov 12
parent Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= writes:
On Sunday, 12 November 2017 at 10:18:09 UTC, Tony wrote:
 But those humans at the top, working for public companies, are 
 monitored by a board and stockholders who place "making money" 
 as the main, and normally only, measure of their job 
 performance.
Sure, when you get a leader who is weak on vision, he or she might opt for milking the customer base to satisfy stockholders and over time erode support… So there most certainly can be radical changes when the original founder («gründer») or a strong visionary is displaced. I think it would have been very difficult to displace Steve Jobs, though. You could probably make the same argument about IKEA. As long as the original vision is strong (good-value, affordable DIY furniture) it will be difficult to displace; with weak leadership that vision could erode, profits would outweigh it, and they would erode their brand (what-we-are-all-about).
 "growing and retaining market share" is a part of "all about 
 making money", to me. My definition of "not all about making 
 money" is when a company does things to benefit the environment 
 or citizens or employees that they could have legally avoided, 
 which gives them lower profits than they would have had from 
 the other course of action.
It all depends. Are the stock markets fully rational? Probably not; many invest based on what they think other investors will like, not by analysing objective measures of profits. Some companies are not even on the stock market (e.g. IKEA is a foundation). Will stock markets only reward companies that have good objective profits to show, or will they also reward companies that have low profit margins but are insanely big?

IBM was insanely big in terms of market dominance. Silicon Graphics and Sun were big in high-performance computing. Where did that go? There is a perception that being big will necessarily mean large profits in the future. That may be the case, but it could also mean that you've got a juggernaut that is difficult to steer…

However, I think it is very difficult for a company over time to retain a strong brand vision if they only care about short-term profits. With weak leaders who are not capable of projecting visions, the shareholders will take control and perhaps send the company in the wrong direction… With good communication of strong visions it is harder to get a majority behind such changes.
 I see Amazon as foregoing profits now for growth - and also 
 wiping out the competition - in order to reap massive profits 
 in the future.  At least, I haven't heard of them foregoing 
 profits in order to benefit employees, citizens or the 
 environment. Their stock price has a very high valuation (P/E 
 ratio of 285.1), reflective of investors expecting massive 
 profits in the future.
Right, but how rational is that analysis? I find better deals and better products on dedicated web shops. If Amazon controlled the search applications, then it would look more certain. But as long as there are free price-comparison applications… Who knows if being that generic will be an advantage. E.g. is it conceivable that Amazon could beat IKEA? And will people in the future buy physical books, music or movies? What is the long-term marketplace for Amazon? (I like Amazon for convenience though.)
 That is what I see as the Apple way of doing things from their 
 beginning back in the late 1970s. They make premium and/or 
 unique products and then mark them up more than anybody in the 
 industry. Their products have always been unique with regard to 
 the OS (except for a year or two when they allowed Mac clones) 
 making the situation that no other manufacturer can offer an 
 identical product.
Sure, but Steve Jobs understood that they should try to make their products available at the grass-roots level as well. So they made a line that was affordable enough for people to buy for school classrooms and teenagers. Those are future customers, so even if you don't make large profit margins it is a good investment. iOS is a bit generic and identity-less compared to, say, macOS. Current Apple management does not understand that, and schools get good deals on Windows PCs instead…
Nov 12
prev sibling parent Joakim <dlang joakim.fea.st> writes:
On Friday, 10 November 2017 at 12:55:24 UTC, Tony wrote:
 On Friday, 10 November 2017 at 11:28:41 UTC, Joakim wrote:
 Your logic is extremely confused.  Let me spell it out for 
 you: the Mac is all but dead, particularly when compared to 
 the mobile computing tidal wave, since they sell 10 iPhones + 
 iPads for every Mac, according to the sales link I gave you 
 before.  They have cut investment in that legacy Mac product, 
 but they would like to keep selling a lower-quality product at 
 high prices to the few chumps that still maintain the old Mac 
 aura in their heads.
You have little company in thinking the Mac line is a "low-quality product". The computer magazine writers gush about the Macbooks.
lol, your own paste of what I wrote says "lower-quality product" above, yet you do not get it right in your quote below and go off on your own error. While you make a few decent points elsewhere, your post is mostly filled with such mistakes, so I'm not going to sit here and argue with stuff you made up or explain basic business concepts to you, like market segmentation or legacy support.
Nov 10
prev sibling parent Walter Bright <newshound2 digitalmars.com> writes:
On 11/10/2017 3:28 AM, Joakim wrote:
 Your logic is extremely confused.
[...]
 You seem to be confused
Please stop berating others here.
Nov 12
prev sibling parent reply Tony <tonytdominguez aol.com> writes:
On Wednesday, 8 November 2017 at 09:34:39 UTC, Joakim wrote:


 I see, so your claim is that MS, Nokia, HP, Sony, all much 
 larger companies than Apple or Google at the time, could not 
 have countered them even on a lucky day.  I wonder why this is, 
 as they certainly had more money, you don't believe they're 
 that bright? :)
Google bought Android from a startup of sharp programmers. There are only so many mobile operating systems, and operating systems are not easy to develop. Jobs got back into Apple because they had failed in an attempt to replace the classic Mac OS, and Jobs had a talented software team and an OS from his failing NeXT company. Nokia had a big internal effort to replace Symbian (which had multitasking from the beginning, unlike iOS) due to a flaw that limited it to 640 x 360 screens (still bigger than the first couple of iPhone generations). But one effort failed, and another, based on Linux, came too late to survive being cut at the same time the new CEO from Microsoft announced that Symbian would be discontinued and replaced by Windows Phone.
 On Wednesday, 8 November 2017 at 07:04:24 UTC, Tony wrote:
 On Monday, 6 November 2017 at 08:33:16 UTC, Joakim wrote:

 The vast majority of users would be covered by 5-10 GBs of 
 available storage, which is why the lowest tier of even the 
 luxury iPhone was 16 GBs until last year.  Every time I talk 
 to normal people, ie non-techies unlike us, and ask them how 
 much storage they have in their device, whether smartphone, 
 tablet, or laptop, they have no idea.  If I look in the 
 device, I inevitably find they're only using something like 
 3-5 GBs max, out of the 20-100+ GBs they have available.
You are making an assumption that people want as much storage for a combo phone/PC as they do for only a phone. You need to also check how much storage they are using on their PCs.
You need to read what I actually wrote; I was talking about laptops too. I don't go to people's homes and check their desktops, but their laptops fall under the same low-storage umbrella, and laptops are 80% of PCs sold these days.
OK, I see you did mention laptops. That isn't the case for me, and I find it hard to believe that people are being sold ever-larger disk drives when they could survive with 32 GB of flash ROM.
 I never made any previous claim about what IDEs are being 
 used. The only time I previously mentioned an IDE was with 
 regard to RemObjects and Embarcadero offering 
 cross-compilation to Android/iOS with their products.

 "There is a case to be made for supporting  Android/iOS 
 cross-compilation. But it doesn't have to come at the 
 expense of Windows 64-bit integration. Not sure they even 
 involve the same skillsets. Embarcadero and Remobjects both 
 now support Android/iOS development from their Windows (and 
 macOS in the case of Remobjects) IDEs."

 That was to highlight that those two compiler companies have 
 seen fit to also cross-compile to mobile - they saw an 
 importance to mobile development. It wasn't about what IDEs 
 are best for mobile or even what IDEs are being used for 
 mobile.
If you look back to the first mention of IDES, it was your statement, "Good luck selling game developers on using D to develop for Android, when you can't supply those same game developers a top-notch development environment for the premier platform for performance critical games - Windows 64-bit." That at least implies that they're using the same IDE to target both mobile and PC gaming, which is what I was disputing. If you agree that they use completely different toolchains, then it is irrelevant whether D supports Windows-focused IDEs, as it doesn't affect mobile-focused devs.
My statements quoted didn't mention IDEs and they didn't imply IDEs. What was implied was the initial line in the first post "* better dll support for Windows". My assumption is that game developers (or just developers) work on multiple OSes. If you want them to use a language - like D - they should find it compelling to use on all their platforms.
Your statement was made in direct response to my question, "why spend time getting D great Windows IDE support if you don't think Windows has much of a future?"
What does IDE support refer to? You didn't say "get good Windows IDEs". In any event, I was talking about DLLs and related Windows issues that you would encounter using Vim and D.
 I've already said I don't think there's much overlap between 
 mobile and PC games, the markets are fairly disjoint.  The top 
 mobile games are never released for PC and vice versa.
I never said the games have overlap. I said the developers have overlap.
 As for dll support, that was not mentioned at all in the OT 
 thread to which you were responding, and you never called it 
 out.
Never called what out? You were saying that Windows was going down by 99% in some unstated timeframe and I challenged that notion. The first and second posts in this thread mention DLL support and I seem to recall people talking about other issues after that besides DLL support - and not about IDEs. You need to clearly demarcate your "OT thread" in a thread and put what context you will consider valid in it.
 As for flat UIs, you really should be aware of the effect your 
 beloved Metro has had:

 https://en.m.wikipedia.org/wiki/Flat_design
I don't see any relationship between that iOS picture in the Wiki article and Metro. The idea is RESIZABLE, LIVE tiles, not effects to make them look 3D or flat.
Nov 10
parent Tony <tonytdominguez aol.com> writes:
On Friday, 10 November 2017 at 11:10:30 UTC, Tony wrote:

 I don't see any relationship between that iOS picture in the 
 Wiki article and Metro. The idea is RESIZABLE, LIVE tiles. Not 
 effects to make them look 3D or not.
"live tile" meaning the underlying app can dynamically put readable information in the tile. Such as the most recent sender of email and subject, the most recent headline, the item at the top of your todo list, a calendar reminder, current weather information.
Nov 10
prev sibling parent Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= writes:
On Tuesday, 7 November 2017 at 14:33:28 UTC, Joakim wrote:
 similarity of APIs between macOS and iOS, but obviously there 
 are significant developer and IDE differences in targeting a 
 mobile OS versus a desktop OS, even if iOS was initially forked 
 from macOS.
Not in my experience… There are some things programmers have to be aware of, because some features are not available on iOS, but overall it's the same deal. Not too surprising, as the iOS simulator compiles to x86, so by keeping the code bases similar they make it easier to simulate it on the Mac. So yeah, you kinda run iOS apps on your Mac natively. (Not emulated as such.) Only when you go low-level (ARM intrinsics) will this be a real problem. So it goes without saying that iOS and OS X have to be reasonably similar for this to be feasible.
 Let me correct that for you: there are many more iOS developers 
 now, because it is a _much_ bigger market.
Yes, but that does not mean that your original core business is no longer important.
 Just a couple responses above, you say the iPhone UI will keep 
 those users around.  I'd say the Mac is actually easier to 
 commoditize, because the iPhone is such a larger market that 
 you can use that scale to pound the Mac apps, _once_ you can 
 drive a multi-window, large-screen GUI with your iPhone, on a 
 monitor or 13" Sentio-like laptop shell.
By commoditise I mean that you have many competitors in the market because the building blocks are available from many manufacturers (like radios). However, I think "laptop shell" is perceived as clunky. People didn't seem to be very fond of docking-stations for laptops. Quite a few went for impractically large screens on their laptops instead.
 I agree that very few apps are used on phones, and that they 
 aren't as sticky as desktop apps as a result.  Hopefully that 
 means we'll see more competition in mobile than just 
 android/iOS in the future.
iPhones are easier to displace because the UI is not so intrusive compared to a desktop and the apps people depend on are not so complicated. That might change of course… As people get used to the platform Apple can make things more complicated (less to learn, so you can introduce more features one by one). There are things about modern iOS that I don't find intuitive, but since so many have iPhones they probably get help from people nearby when they run into those issues. Scale matters in many strange ways…
 Lack of competition at the high end certainly played a role, 
 but as I noted to codephantom above, consumers not needing the 
 performance played a much larger role, which is why Samsung, 
 with their much weaker SoCs, just passed Intel as the largest 
 semiconductor vendor:
I assume those aren't used in desktop computers? Samsung needs a lot of SoCs as they manufacture lots of household items…
 Yes, but would that be in 2020 or 2050?  Would people who never 
 had a cellphone get a smartphone, driving that market even 
 larger, as is happening today in developing markets?
Ok, I think it was fairly obvious that smart phones would at least for a while be a thing, as they were already fashionable in the high end back then. What wasn't all that obvious was that people would be willing to carry rather clunky iPhones and Android devices with bad battery life compared to the Symbian phones… which I think was to a large extent driven by social norms, fashion, and the press pushing the story on front pages over and over… Also, when I think of it, I wonder if Apple would have succeeded if the press had not played them up as an underdog against Microsoft in the preceding decade. The underdog Apple rising from the dust and beating out Microsoft and Nokia made for a good story to push… (in terms of narrative/narratology)
 Jobs certainly wasn't, almost nobody was.  If there were a few 
 making wild-eyed claims, how many millions of dollars did they 
 actually bet on it, as Jobs did?  Nobody else did that, which 
 shows you how much they believed it.
Apple had worked on this for a long time and had also already failed at it, but they decided to push it again when touch screen technologies made it possible.
 I'm not sure how the starting point matters, google funded 
 Android from nothing and it now ships on the most smartphones.
I don't think Android came from nothing, and it was significantly more clunky than iOS, but Google did this to have an option if other giants tried to block their revenue stream from ads… So it was more a defensive move than a business.
 But even the google guys never bet the company on it, just gave 
 it away for free for others to build on, which is why they 
 never made as much money as Apple either.
Well, it was to protect their business, not to develop their business, so I am not sure if Android is a good example.
Nov 07
prev sibling parent Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= writes:
On Tuesday, 7 November 2017 at 08:53:46 UTC, Joakim wrote:
 One is a touch-first mobile OS that heavily restricts what you 
 can do in the background and didn't even have a file manager 
 until this year, while the other is a classic desktop OS, so 
 there are significant differences.
Yes, there are differences for the end user, such as the sandboxing, but that also applies to applications in the OS-X app store. I don't expect iOS to change much in that department; I think Apple will continue to get people into their iCloud… At the API level iOS is mostly a subset, and features that were only in iOS have been made available on OS-X. The main difference is in some UI classes, but they both use the same tooling and UI design strategies. So in terms of XCode they are kinda similar.
 I never said they don't write apps for macOS, I said iOS is a 
 much bigger market which many more write for.
Yes, there are more Apple developers in general. Not sure if the number of people doing OS-X development has shrunk, maybe it has.
 The same may happen to the iPhone some day, but it shows no 
 signs of letting up.
They probably will hold that market for a while as non-techies don't want to deal with a new unfamiliar UI.
 Since they still have a ways to go to make the cameras or 
 laptop-functionality as good as the standalone products they 
 replaced, it would appear they can still convince their herd to 
 stay on the upgrade cycle.
That is probably true, e.g. low light conditions.
 While I disagree that you can't commoditize the Mac, as you 
 could just bundle most of the needed functionality into an 
 iPhone
My point was that it is easier to commoditize the iPhone than the Mac. There is a very limited set of apps that end users must have on a phone.
 they've already significantly cut the team working on it.
Ok, didn't know that. I've only noticed that they stopped providing competitive products after Jobs died.
 No, the reason they don't improve is consumers don't need the 
 performance.
I don't think this is the case. It is because of the monopoly they have in the top segment. Intel was slow at progress until Athlon bit them too. If they felt the pressure they would put their assets into R&D. Remember that new products have to pay off R&D before making a profit, so by pushing the same old they get better ROI. Of course, they also have trouble with heat and developing a new technological platform is very expensive. But if they faced stiff competition, then they certainly would push that harder. In general the software market has managed to gobble up any performance improvements for decades. As long as developers spend time optimizing their code then there is a market for faster hardware (which saves development costs). The Intel i9-7900X sells at $1000 for just the chip. That's pretty steep, I'm sure they have nice profit margins on that one.
 You are conflating two different things, fashionable academic 
 topics and industry projections for actual production, which is 
 what I was talking about.
What do you mean by industry projections? It was quite obvious by early 2000s that most people with cellphones (which basically was everyone in Scandinavia) would switch to smart phones. It wasn't a surprise.
 confident in them that you bet your company on them.  Nobody 
 other than Apple did that, which is why they're still reaping 
 the rewards today.
Only Microsoft had a comparable starting point. iOS is closely related to OS-X. Not sure if Nokia could have succeeded with scaling up Symbian. Maybe, dunno.
Nov 07
prev sibling next sibling parent reply codephantom <me noyb.com> writes:
On Monday, 6 November 2017 at 08:33:16 UTC, Joakim wrote:
 Also, nobody saw mobile growing so gigantic, so fast, not even 
 Jobs by all indications.  Mobile has really been a tidal wave 
 over the last decade.  Funny how all you hear is bitching and 
 whining from a bunch of devs on proggit/HN about how they 
 missed the '80s PC boom or '90s dot.com boom and there's 
 nothing fundamentally exciting like that now, all while the 
 biggest boom of them all, the mobile boom, just grew and grew 
 right in front of their faces. :D
Well, I was there in the early nineties when the Microsoft WinPad was being talked about. This was almost 20 years before the iPad came out. I remember going through the 90's with Windows CE iterations, which eventually evolved into Windows Mobile 2003 - which is when I purchased my first 'smart phone', and learnt how to write apps for it (actually my current phone still runs Windows Mobile 6.1 ;-). I tried getting people around me interested in mobile devices, including the business I worked in. Nobody was really interested. They were all happy with their little push-button Nokias. Microsoft had the vision though, and they had it earlier than perhaps anyone else. But the vision was too far ahead of its time, and, around the early 2000's, they refused to lose any more money, put it on the back burner, and competitors came in and took over - at a time when 'consumers' were just beginning to share the vision too.... But I think what really made it take off so fast and unexpectedly was the convergence of mobile devices, mobile communication technology (i.e. wifi, gps and stuff), and of course the internet... as well as the ability to find cheap labour overseas to build the products en masse. I doubt anyone could have envisioned that convergence... but some companies were in a better position (more agile) than others, at the time, to capitalise on it. But the vision of being mobile was certainly there, back in the early nineties - and Microsoft were leading it.
Nov 07
next sibling parent Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= writes:
On Tuesday, 7 November 2017 at 13:59:26 UTC, codephantom wrote:
 Microsoft had the vision though, and they had it earlier than 
 perhaps anyone else. But the vision was too far ahead of its 
 time, and, around the early 2000's they refused to lose any 
 more money, put it on the back burner, and competitors came in 
 a took over - at a time when 'consumers' were just beginning to 
 share the vision too....
Yes, HP had the IPAQ: https://en.wikipedia.org/wiki/IPAQ It was kinda interesting, but a bit too clunky and a bit expensive for personal use. I guess it was used for things like filling out forms on-site in businesses, or doing measurements and things like that. Touch screen and battery quality were considerations as well, but either way, the Apple App Store was probably a big factor in iOS succeeding. And the iPad was very popular with journalists, who saw it as a device for electronic newspapers and were already in the Apple fold (desktop publishing) I guess, so the iPad 1 got lots of free marketing. So the technology has to be effortless, but there are also such social factors driving free media coverage that come into play. If regular newspaper journalists had not been enamoured by it, then it would have faded away…
 But I think what really made it take off so fast and 
 unexpectadly, was the convergence of mobile devices, mobile 
 communication technology (i.e wifi, gps and stuff), and of 
 course the internet... as well as the ability to find cheap 
 labour overseas to build the produces on mass.
You could attach lots of stuff to IPAQ, just like any laptop (Wifi, probably GPS, etc…)
Nov 07
prev sibling parent Arjan <arjan ask.me.to> writes:
On Tuesday, 7 November 2017 at 13:59:26 UTC, codephantom wrote:

 But I think what really made it take off so fast and 
 unexpectadly, was the convergence of mobile devices, mobile 
 communication technology (i.e wifi, gps and stuff), and of 
 course the internet... as well as the ability to find cheap 
 labour overseas to build the produces on mass.
The thing is Apple combined (at the right time) the various things which made it very successful. - they learned from Napster: people are interested in (buying) a single song => iPod + iTunes - they 'copied' that concept over to the iPhone with the App Store For both, they enabled not only business for themselves but created platforms on which many parties external to Apple were able to build successful businesses. In other words, it's not the device itself that made it successful, it's because it is part of a 'platform'....
Nov 07
prev sibling parent Tony <tonytdominguez aol.com> writes:
On Monday, 6 November 2017 at 08:33:16 UTC, Joakim wrote:

 The vast majority of users would be covered by 5-10 GBs of 
 available storage, which is why the lowest tier of even the 
 luxury iPhone was 16 GBs until last year.  Every time I talk to 
 normal people, ie non-techies unlike us, and ask them how much 
 storage they have in their device, whether smartphone, tablet, 
 or laptop, they have no idea.  If I look in the device, I 
 inevitably find they're only using something like 3-5 GBs max, 
 out of the 20-100+ GBs they have available.
You are making an assumption that people want as much storage for a combo phone/PC as they do for only a phone. You need to also check how much storage they are using on their PCs.
 I never made any previous claim about what IDEs are being 
 used. The only time I previously mentioned an IDE was with 
 regard to RemObjects and Embarcadero offering 
 cross-compilation to Android/iOS with their products.

 "There is a case to be made for supporting  Android/iOS 
 cross-compilation. But it doesn't have to come at the expense 
 of Windows 64-bit integration. Not sure they even involve the 
 same skillsets. Embarcadero and Remobjects both now support 
 Android/iOS development from their Windows (and macOS in the 
 case of Remobjects) IDEs."

 That was to highlight that those two compiler companies have 
 seen fit to also cross-compile to mobile - they saw an 
 importance to mobile development. It wasn't about what IDEs 
 are best for mobile or even what IDEs are being used for 
 mobile.
If you look back to the first mention of IDES, it was your statement, "Good luck selling game developers on using D to develop for Android, when you can't supply those same game developers a top-notch development environment for the premier platform for performance critical games - Windows 64-bit." That at least implies that they're using the same IDE to target both mobile and PC gaming, which is what I was disputing. If you agree that they use completely different toolchains, then it is irrelevant whether D supports Windows-focused IDEs, as it doesn't affect mobile-focused devs.
My statements quoted didn't mention IDEs and they didn't imply IDEs. What was implied was the initial line in the first post "* better dll support for Windows". My assumption is that game developers (or just developers) work on multiple OSes. If you want them to use a language - like D - they should find it compelling to use on all their platforms.
 I've always thought that flat Metro interface was best suited 
 for mobile displays, the easiest to view, render, and touch.  
 To some extent, all the other mobile interfaces have copied it, 
 with their move to flat UIs over the years.  However, it 
 obviously takes much more than a nice GUI to do well in mobile.
I don't know what a flat UI is, but every mobile OS I have used - Blackberry 9/10, Nokia Symbian, Nokia Linux, Palm OS, WebOS, Firefox OS, iOS, Android - all have the same essential interface. Icons on a scrolling desktop. Windows 8/10 Mobile, with the resizable live tiles is the only one that does the interface differently, and in my opinion, does it the best.
 Why did they fund development of a new iMac Pro which is 
 coming this December as well as the new MacBook Pros that 
 came out this June? That's a contradiction of "milk it like 
 an iPod".
Because their userbase was rebelling? I take it you're not that familiar with Mac users, but they were genuinely scared that Apple was leaving them behind, since they weren't refreshing Mac and Macbooks much anymore and all Apple's focus is on iOS:
So, let them rebel. You said that they would like to see it go away, and/or they want to milk it. If you have to spend money on development to keep selling it, then you can't "milk it".
You and I and Jobs may've let them rebel, but Apple is a public corporation. They can't just let easy money go, their shareholders may not like it. Perhaps you're not too familiar with legacy calculations, but they're probably still making good money off Macs, but it just distracts and keeps good Apple devs off the real cash cow, iPhone. Even if the Mac financials aren't _that_ great anymore, you don't necessarily want to piss off your oldest and most loyal customers, who may stop buying iPhones and iPads too.
It would either be you and Jobs, or just you, letting them rebel. I would keep the line. The large Apple profit comes from offering quality products and then pricing them at the highest gross profit margin in the industry. In order to get people to pay a premium for their products it helps to have a mystique or following, and the macOS line helps to maintain their mystique and it is small potatoes next to their phone business.
Nov 07
prev sibling next sibling parent reply codephantom <me noyb.com> writes:
On Wednesday, 1 November 2017 at 18:42:07 UTC, Bo wrote:
 Linux as a market that is so fragmented on the desktop level.
This demonstrates an all-too-common misunderstanding of the purpose of Linux, and open source in general (depending on what licence is used). Fragmentation is an important, necessary, and inevitable outcome of open source. Open source provides the freedom for the user to adapt the software to their own environment. That's the whole point... to enable that kind of 'fragmentation'. (Of course, if your real objective is to dominate a market, then fragmentation is the last thing you want).
Nov 02
parent "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Thu, Nov 02, 2017 at 07:08:51AM +0000, codephantom via Digitalmars-d wrote:
 On Wednesday, 1 November 2017 at 18:42:07 UTC, Bo wrote:
 Linux as a market that is so fragmented on the desktop level.
This demonstrates an all-too-common misunderstanding of the purpose of Linux, and open source in general (depending on what licence is used). Fragmentation is an important, necessary, and inevitable outcome of open source. Open source provides the freedom for the user to adapt the software to their own environment. That's the whole point... to enable that kind of 'fragmentation'.
[...] Yeah I'm not sure 'fragmentation' is the right word to use, but certainly 'customizability' is a big, *huge* factor in my choosing to use Linux instead of Windows. And I don't mean just customization in the way of choosing "themes", which is just purely cosmetic. I mean reaching into the guts of the system and changing how it works, according to my liking. In fact, I despise the so-called "desktop metaphor" -- I think it's a silly idea that doesn't match how the machine works -- so I reconfigure my X server to use Ratpoison instead, a "window" manager that eliminates the mouse and basically maximizes everything into single-screen, single-window, no toolbars, no title decorations, nothing. And keyboard controls for everything. But almost every other Linux user (needless to say Windows user) won't even be able to *breathe* in such a setup, but that's OK, because Linux is not tied to a single UI, whether it be GUI or something else. You can use whatever GUI or "desktop" environment you wish, and it will still all work. This flexibility allows everyone to customize their environment to what suits them best, rather than have some predefined, unchangeable default shoved down everyone's throats. With Windows, there is no way to go that far... even what it *does* allow you to do can cause random stuff to break, 'cos programs are written with the assumption that you *never* change how things work. (Try changing mouse focus to lazy focus sometime... and watch how many applications malfunction, behave oddly, or just plain break. And this is not even a major customization!) Understandably, though, most non-programmer types prefer the familiarity and comfort of Windows' default environment. That's why Windows will still be around for the next little while. :-P T -- I see that you JS got Bach.
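Swapping the desktop environment for Ratpoison as described above is essentially a one-line X session change; a minimal sketch, assuming ratpoison is installed and the X session is launched via startx:

```shell
# ~/.xinitrc -- hypothetical sketch: start the Ratpoison window manager
# instead of a full desktop environment. Ratpoison is keyboard-driven,
# maximizes every window, and draws no decorations.
exec ratpoison
```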
Nov 02
prev sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 11/1/2017 11:42 AM, Bo wrote:
 And 
 frankly, Walter or whoever, there needed to have been put a stop to this anti 
 Windows bullshit several days ago. As long as people use this level of 
 disrespect towards community members because they are not using the "right" 
 platform.
Don't worry, Windows remains a high priority platform for D. In the not-so-long run, all the platforms are dead. Little to none of D will work on any platform prior to 10 years ago or so. D needs to run on the major platforms of today, and that certainly includes Windows. Nobody is obliged to work on any platform they don't want to work on. And nobody is entitled to berate anyone for working on any platform they want to.
Nov 07
parent codephantom <me noyb.com> writes:
On Wednesday, 8 November 2017 at 07:33:53 UTC, Walter Bright 
wrote:
 Nobody is obliged to work on any platform they don't want to 
 work on. And nobody is entitled to berate anyone for working on 
 any platform they want to.
Yeah...right on..! btw. Windows XP is still the best o/s I ever 'owned'. Credit where credit is due. I think we should all just hug..and start using FreeBSD.
Nov 08
prev sibling parent codephantom <me noyb.com> writes:
On Sunday, 29 October 2017 at 20:58:45 UTC, 12345swordy wrote:
 What makes you think that windows is a "dying platform"!? There 
 is no evidence to suggest this.
Windows dying? Perhaps not. But the dominance of Windows is *certainly* under threat. The clear evidence for that is the strategies MS is putting in place to go cross-platform and, increasingly, open-source. Good on 'em for adapting... D is focused on its cross-platform capabilities, which will be really important for its future too... So the evidence is in. Windows is becoming less dominant... and there's no reason to believe that won't continue to be the case... btw. No mobile device will replace my desktop pc ... Like the pharaohs.. I want access to my desktop pc in the afterlife too.. so it will be buried with me ;-)
Oct 29
prev sibling next sibling parent reply codephantom <me noyb.com> writes:
On Saturday, 28 October 2017 at 19:46:00 UTC, Jerry wrote:
 Start the download when you go to sleep, when you wake up it 
 will be finished. I did this as a kid when I had internet that 
 was probably even slower than yours right now. It'll be like 
 those 4 hours never even happened.
That's great advice Jerry. Perhaps that should go up on the wiki... "NOTE: If you're using Windows, and you want to use the -m64 mode to compile D into 64bit code, you will need to download several GB's of other stuff... and if you have a slow internet connection... then just start it when you go to sleep.. and when you wake up it will be finished. It'll be like those 4 hours never even happened."
Oct 28
parent Jerry <hurricane hereiam.com> writes:
On Sunday, 29 October 2017 at 00:45:08 UTC, codephantom wrote:
 On Saturday, 28 October 2017 at 19:46:00 UTC, Jerry wrote:
 Start the download when you go to sleep, when you wake up it 
 will be finished. I did this as a kid when I had internet that 
 was probably even slower than yours right now. It'll be like 
 those 4 hours never even happened.
That's great advice Jerry. Perhaps that should go up on the wiki... "NOTE: If you're using Windows, and you want to use the -m64 mode to compile D into 64bit code, you will need to download several GB's of other stuff... and if you have a slow internet connection... then just start it when you go to sleep.. and when you wake up it will be finished. It'll be like those 4 hours never even happened."
Most developers don't have shitty internet though, and the ones that do don't go whining about it, trying to use something "better", where better is just a substitute for smaller package size. Visual Studio is probably the most reliable and stable toolchain on Windows; the only thing anyone (ehm, you) has to say bad about it is its download size. I'd say that's damn near the best D can do, if the only thing someone has to complain about is the download size.
Oct 28
prev sibling parent Adam Wilson <flyboynw gmail.com> writes:
On 10/28/17 12:46, Jerry wrote:
 On Saturday, 28 October 2017 at 15:36:38 UTC, codephantom wrote:
 But if you really are missing my point..then let me state it more
 clearly...

 (1) I don't like waiting 4 hours to download gigabytes of crap I don't
 actually want, but somehow need (if I want to compile 64bit D that is).
Start the download when you go to sleep, when you wake up it will be finished. I did this as a kid when I had internet that was probably even slower than yours right now. It'll be like those 4 hours never even happened.
 (2)I like to have choice.

 A fast internet might help with (1).

 (2) seems out of reach (and that's why I dont' and won't be using D on
 Windows ;-)
It's probably why you shouldn't be on Windows to begin with..
 (being a recreational programmer, I have that luxury..I understand
 that others do not, but that's no reason for 'some' to dismiss my
 concerns as irrelevant. They're relavent to me, and that's all that
 matters ;-)
Talk about being narcissistic ;)
Hey Jerry, I appreciate what you're trying to accomplish .. but uh ... don't feed the trolls. ;) -- Adam Wilson IRC: LightBender import quiet.dlang.dev;
Oct 28
prev sibling parent Jerry <hurricane hereiam.com> writes:
On Saturday, 28 October 2017 at 14:43:38 UTC, codephantom wrote:
 I explicitly mentioned that I did *******NOT******* want VS 
 installed.
So? If you don't want to use it, then don't use D, or don't use Windows. There's a simple solution to your problem. Rust requires VS; you can't build on Windows without it. It's not that big of a deal to require. If you don't want to use it, then go ahead, D is open source; see how easy it is to replace with something else.
 The majority of time spent was downloading the damn thing!
Again with the size issue, 3.5 GB isn't that big of a file. Start the download and go do something, time management is a skill. You don't have to sit there and watch it download, even if you have shitty internet.
Oct 28
prev sibling parent codephantom <me noyb.com> writes:
On Saturday, 28 October 2017 at 07:39:21 UTC, codephantom wrote:
 It's the *minimum* 'selection set' you'll need (with regards to 
 the Visual Studio Build Tools 2017) in order to get DMD to 
 successfully compile a 64bit exe (-m64)

 Now to be fair, this is assuming you **don't** want and 
 **don't** have VS installed, but just want the necessary 'build 
 tools' so that DMD can build a *64bit* binary on Windows - (in 
 total about 3.5GB).


 Code tools
 	Static analysis tools

 Compilers, build tools, and runtimes
 	VC++ 2017 v141 toolset (x86,x64)

 SDK's, libraries and frameworks
 	Windows 10 SDK (10.0.16299.0) for Desktop C++ [x86 and x64]
 	Windows 10 SDK (10.0.16299.0) for UWP: C#, VB, JS
 	Windows 10 SDK (10.0.16299.0) for UWP: C++
I need to correct my statement above (it was late at night ;-) The actual download size was 977MB The installed size was 3.5GB
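For the record, that selection can also be scripted with the Build Tools bootstrapper's unattended install mode; a minimal sketch, assuming the VS 2017 vs_buildtools.exe bootstrapper and its documented workload/component IDs (the exact SDK component ID may differ by SDK release):

```shell
:: Hypothetical unattended install of the minimal C++ build tools dmd -m64 needs.
:: Assumes vs_buildtools.exe (the VS 2017 Build Tools bootstrapper) has been
:: downloaded; component IDs follow Microsoft's workload/component ID docs.
vs_buildtools.exe --quiet --wait --norestart --nocache ^
    --add Microsoft.VisualStudio.Component.VC.Tools.x86.x64 ^
    --add Microsoft.VisualStudio.Component.Windows10SDK.16299.Desktop
```

This kind of scripted install also sidesteps clicking through the installer UI on each machine, though the download itself is of course the same size.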
Oct 28
prev sibling parent reply Patrick Schluter <Patrick.Schluter bbox.fr> writes:
On Thursday, 26 October 2017 at 12:36:40 UTC, jmh530 wrote:
 On Thursday, 26 October 2017 at 11:32:26 UTC, Andrei 
 Alexandrescu wrote:
 A wizard-style installation with links to things and a good 
 flow might help a lot here. Is that possible? -- Andrei
The DMD installer is already a Wizard on Windows. First it checks if you have a current version of D and will uninstall that, then it checks if you want to install D2 along with some extras (Visual D, DMC, D1), and it goes through additional steps to install the extras if you select them. However, if you need Visual Studio installed, then that takes like a half an hour. My recollection is that it's a little tricky if you upgrade to a new version of VS. I usually just uninstall D and reinstall it rather than deal with that.
There are other issues with the Visual Studio install that D is not responsible for, but which are annoying as hell. It's Microsoft's fault, but the result is that D installation is made difficult. To install VS, it first downloads the installer, which then downloads and installs the software. That installer doesn't work with the proxy setup we have on our work PCs. The only way is to find the offline install package. I know they exist, but finding them on the mess that is Microsoft's website is nearly impossible. The other difficulty: installing Visual Studio when the system partition is small is also nearly impossible. As I said, not the fault of D, but annoying in any case. This said, building apps purely 32-bit on Windows has the advantage of allowing compatibility with legacy systems still running on XP. There are still some of them around.
Oct 26
parent jmh530 <john.michael.hall gmail.com> writes:
On Thursday, 26 October 2017 at 20:24:53 UTC, Patrick Schluter 
wrote:
 [snip] That installer doesn't work  with the proxy installation 
 we have on our work PC.
This is usually the bane of my corporate existence. I have to manually install so much stuff. Anaconda and RStudio's package managers don't work. Dub fetch doesn't work on my work machine. But for whatever reason, I've never had a problem with the Visual Studio installer.
 The other difficulty, installing Visual Studio when the system 
 partition is small is also nearly impossible.
My old home setup was a very small SSD to install Windows and then a bigger 3.5" data drive. I had to get a bigger operating system drive (NVME is pretty cool) when I started using Visual Studio more.
Oct 26
prev sibling next sibling parent Mike Parker <aldacron gmail.com> writes:
On Thursday, 26 October 2017 at 11:32:26 UTC, Andrei Alexandrescu 
wrote:

 A wizard-style installation with links to things and a good 
 flow might help a lot here. Is that possible? -- Andrei
The installer currently offers to install VS 2013. A more recent version might be better for some people, but I don't think it would have done anything for me. Just the idea that it's necessary would be a turn-off. These days especially, I'm less inclined to try new things if the slightest hurdle is presented, simply because I put a higher premium on my time than I used to. More broadly, I think in 2017 it's still true that serious Windows developers/shops will have VS installed, but there's no guarantee that anyone developing on Windows is a "Windows developer". With the myriad languages and VMs to choose from, they don't even have to be aware that VS exists. Anything that hides the dependency from them in a way that minimizes the install would be a win.
Oct 26
prev sibling parent Jacob Carlborg <doob me.com> writes:
On 2017-10-26 13:32, Andrei Alexandrescu wrote:

 A wizard-style installation with links to things and a good flow might 
 help a lot here. Is that possible? -- Andrei
Xcode can only, officially, be obtained from the Mac App Store or Apple's developer web site, which requires a (free) account to access. I guess it's possible to link to the Mac App Store Xcode page, but I don't think it's possible to automatically download Xcode. Then the user usually needs to install the Xcode command line tools as well. -- /Jacob Carlborg
Oct 27
prev sibling parent reply Steven Schveighoffer <schveiguy yahoo.com> writes:
On 10/26/17 7:09 AM, Mike Parker wrote:
 I also didn't like that I had to install the Xcode tools on my Mac, but 
 that's needed for any development on Mac from what I can see.
Want to hear something scary? The autotester does not use xcode tools :) In fact, I've been meaning to bug Brad about checking to see if things have improved (xcode's compiler used to generate a dmd that would fail some of the tests). I've never used gnu gcc, only ever Xcode's compiler (which is llvm). -Steve
Oct 26
parent reply Brad Roberts <braddr puremagic.com> writes:
On 10/26/17 5:23 AM, Steven Schveighoffer via Digitalmars-d wrote:
 On 10/26/17 7:09 AM, Mike Parker wrote:
 I also didn't like that I had to install the Xcode tools on my Mac, but that's
needed for any 
 development on Mac from what I can see.
Want to hear something scary? The autotester does not use xcode tools :) In fact, I've been meaning to bug Brad about checking to see if things have improved (xcode's compiler used to generate a dmd that would fail some of the tests). I've never used gnu gcc, only ever Xcode's compiler (which is llvm). -Steve
Actually, one of the 3 macos boxes is using stock xcode tooling these days. I specifically went that direction when setting up a new system that replaced one that died on me (well, it boots, but if I actually _use_ it, it crashes, sigh). On the older osx releases (not positive of the exact versions off the top of my head), there were issues that forced us back to an old gcc version rather than the default clang compiler.
Oct 26
parent reply Jacob Carlborg <doob me.com> writes:
On 2017-10-27 04:34, Brad Roberts wrote:

 Actually, one of the 3 macos boxes is using stock xcode tooling these 
 days.  I specifically went that direction when setting up a new system 
 that replaced one that died on me (well, it boots but if I actually 
 _use_ it it crashes, sigh).
 
 So, but on the older osx releases (not positive the exact versions off 
 the top of my head) there were issues that forced us back to an old gcc 
 version rather than the default clang compiler.
I haven't been using GCC in years and I never had any problems with compiling DMD using Clang. -- /Jacob Carlborg
Oct 27
parent reply Brad Roberts <braddr puremagic.com> writes:
On 10/27/2017 1:06 AM, Jacob Carlborg via Digitalmars-d wrote:

 On 2017-10-27 04:34, Brad Roberts wrote:

 Actually, one of the 3 macos boxes is using stock xcode tooling these 
 days.  I specifically went that direction when setting up a new 
 system that replaced one that died on me (well, it boots but if I 
 actually _use_ it it crashes, sigh).

 So, but on the older osx releases (not positive the exact versions 
 off the top of my head) there were issues that forced us back to an 
 old gcc version rather than the default clang compiler.
I haven't been using GCC in years and I never had any problems with compiling DMD using Clang.
The issues weren't compiling dmd but passing the full test suite. Both are required.
Oct 27
parent Jacob Carlborg <doob me.com> writes:
On 2017-10-28 08:11, Brad Roberts wrote:

 The issues weren't compiling dmd but passing the full test suite. Both 
 are required.
Yes, I've run the test suite as well, DMD, druntime and Phobos. -- /Jacob Carlborg
Oct 28
prev sibling parent reply Jacob Carlborg <doob me.com> writes:
On 2017-10-26 12:16, Adam Wilson wrote:

 How many though? 
It's not like I've been counting, but more than one.
 Also, we have to do it for macOS, why is Windows 
 special? The macOS setup was just as hard. Download two large packages 
 (XCode+Cmd tools), install, and done.
I'm not saying Windows is special. I tried to use DMD and Visual Studio together, it didn't work that well. I did not use the DMD installation, I already had DMD installed (using DVM). I did not know the exact paths/environment variables to use for DMD to find the Visual Studio tool chain. I also recall finding it very difficult to find the download for the SDK, it was not included in the Visual Studio installation I used. I did not have these problems on macOS, but perhaps that's just me, I'm not a Windows developer. -- /Jacob Carlborg
Oct 27
parent reply Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
On 10/27/17 3:46 AM, Jacob Carlborg wrote:
 I'm not saying Windows is special. I tried to use DMD and Visual Studio 
 together, it didn't work that well. I did not use the DMD installation, 
 I already had DMD installed (using DVM). I did not know the exact 
 paths/environment variables to use for DMD to find the Visual Studio 
 tool chain. I also recall finding it very difficult to find the download 
 for the SDK, it was not included in the Visual Studio installation I used.
This kind of stuff would need to be carefully written down with an eye for improving the experience. Any volunteers? Please help, thanks! -- Andrei
Oct 27
parent reply jmh530 <john.michael.hall gmail.com> writes:
On Friday, 27 October 2017 at 11:47:11 UTC, Andrei Alexandrescu 
wrote:
 On 10/27/17 3:46 AM, Jacob Carlborg wrote:
 I'm not saying Windows is special. I tried to use DMD and 
 Visual Studio together, it didn't work that well. I did not 
 use the DMD installation, I already had DMD installed (using 
 DVM). I did not know the exact paths/environment variables to 
 use for DMD to find the Visual Studio tool chain. I also 
 recall finding it very difficult to find the download for the 
 SDK, it was not included in the Visual Studio installation I 
 used.
This kind of stuff would need to be carefully written down with an eye for improving the experience. Any volunteers? Please help, thanks! -- Andrei
It's in the install wiki (https://wiki.dlang.org/Installing_DMD); the problem (that I mentioned above) is that you have to know where to go to find it. It needs more prominence on the dlang site.
Oct 27
parent reply codephantom <me noyb.com> writes:
On Friday, 27 October 2017 at 12:19:59 UTC, jmh530 wrote:
 It's in the install wiki
Personally, VS is such a pain in the $$ #$# that I would remove any reference to it from the installer. I.e. rather than the installer offering to install VS2013, just have the installer display a shortcut to the wiki if it can't find VS. Don't offer to install it... you'll almost certainly ruin the client's computer with all the various dependencies and crap that VS requires... Let the wiki take care of it all. But gee... what a mess MS have made with VS... have a look at all these people complaining... https://visualstudio.uservoice.com/forums/121579-visual-studio-ide/suggestions/17541385-please-make-iso-files-for-visual-studio-2017?page=1&per_page=20 I don't think trying to dominate the world of software development... with a single app... was a very good strategy. They've stuffed up big time! The less the D language partakes in that stuff up... the better D will be for it.
Oct 27
parent reply Jerry <hurricane hereiam.com> writes:
On Friday, 27 October 2017 at 13:15:38 UTC, codephantom wrote:
 The less the D language partakes in that stuff up.. the better 
 D will be for it.
This mentality is why D is pretty awful on Windows. It's bad enough that DMD doesn't release a 64-bit version on Windows but now you are advocating for the removal of the ability for it to generate 64-bit binaries as well! Yah that won't bring you 10 steps back. Ideals are nice and all, but some people still need to get shit done. This sort of mentality is hurting D, not helping it.
Oct 27
next sibling parent reply codephantom <me noyb.com> writes:
On Friday, 27 October 2017 at 19:44:49 UTC, Jerry wrote:
 On Friday, 27 October 2017 at 13:15:38 UTC, codephantom wrote:
 The less the D language partakes in that stuff up.. the better 
 D will be for it.
This mentality is why D is pretty awful on Windows. It's bad enough that DMD doesn't release a 64-bit version on Windows but now you are advocating for the removal of the ability for it to generate 64-bit binaries as well! Yah that won't bring you 10 steps back. Ideals are nice and all, but some people still need to get shit done. This sort of mentality is hurting D, not helping it.
Rubbish! And get your facts straight! Where did I advocate for the removal of the ability for D to generate 64-bit binaries? 64-bit D on Windows is a problem because of Windows. The fact that D cannot disentangle itself from the monstrosity known as Visual Studio is a problem of Visual Studio. If you really want to get work done, then try wasting 10 hours of your time trying to work out how to install VS, and all the stuff that it depends on - you are even forced to upgrade your operating system too! At a minimum, I had to download 3.5GB of VS build tools just so I could compile a 64-bit D program (and it took me almost a whole day to work out the correct process). Is that a problem of D or of VS? Is it a problem that D should accept, and just impose on its users? Or should D find a better way? Which is the worse mentality? And VS destroys competition and imposes considerable and unacceptable requirements on its users. That is the only mentality you should be questioning.
Oct 27
next sibling parent reply evilrat <evilrat666 gmail.com> writes:
On Saturday, 28 October 2017 at 00:05:53 UTC, codephantom wrote:
 At a minimum, I had to download 3.5GB of VS build tools just so 
 I could compile a 64 bit D program (and it took me almost a 
 whole day to work out the correct process).
At a minimum you'd better try WinSDK first, there should be all necessary tools. After all it is system's development kit, not some fancy junk.
 Is that a problem of D or VS?

 Is is it problem that D should accept, and just impose on it's 
 users?
VS is the standard for C++ on Windows. Period. Not much to discuss here. Why do we need MS native tools? Because D offers C++ FFI. See the connection? But who said that we have to compile/link using VS itself? And again, the DMD installer offers to install the whole of VS most likely because on Windows there are not that many experienced devs on the team. So this was probably overlooked. Also this is why there are some *core* features that never (or almost never) worked on Windows but have worked for ages on Linux, such as "DLL support" or my favorite, "type information across DLL/process boundaries"... Since you're already on that wave, can you test a Windows SDK installation and make DMD's sc.ini use the SDK?
Oct 27
next sibling parent reply codephantom <me noyb.com> writes:
On Saturday, 28 October 2017 at 01:42:52 UTC, evilrat wrote:
 Since you already on that wave, can you test Windows SDK 
 installation and make DMD's sc.ini use the SDK?
nope. not me. I've had enough ;-) I use FreeBSD. I just wanted to see what effort I had to undertake to compile D into a 64-bit binary on Windows - presuming I didn't want Visual Studio too... Needless to say... I'm not impressed. And I'll leave it at that.
Oct 27
parent reply evilrat <evilrat666 gmail.com> writes:
On Saturday, 28 October 2017 at 02:30:50 UTC, codephantom wrote:
 On Saturday, 28 October 2017 at 01:42:52 UTC, evilrat wrote:
 Since you already on that wave, can you test Windows SDK 
 installation and make DMD's sc.ini use the SDK?
nope. not me. I've had enough ;-) I use FreeBSD. I just wanted so see what effort I had to undertake to compile D into a 64bit binary on Windows - presuming I didn't want visual studio too... Needless to say...I'm not impressed. And I'll leave it at that.
No problem. Actually there is a recent post on the blog about D and VS where the WinSDK is mentioned; it might be interesting to read - https://dlang.org/blog/2017/10/25/dmd-windows-and-c/ Some clarifications - VS projects (at least the MS ones, i.e. C++ and C#) are just XML 'build scripts' for msbuild.exe, which itself doesn't have any knowledge of the projects or how to build them; it is plugins that provide such knowledge to it. So in this sense the VS project properties editor is just a nice UI for editing build scripts. And when one hits the build button in VS, it just invokes msbuild with that script (project file). That's why we have the WinSDK, MSBuild tools, and VS as separate downloads, with VS including the former two. More or less like that. This might be helpful for some users.
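To make the "projects are just msbuild scripts" point concrete, here is a heavily trimmed sketch of what a VS C++ project file looks like; a real .vcxproj carries much more boilerplate (configurations, platform props, and so on), so treat this as illustrative rather than buildable.

```xml
<!-- Sketch of a .vcxproj: just an XML build script for msbuild.exe. -->
<Project DefaultTargets="Build"
         xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <ItemGroup>
    <ClCompile Include="main.cpp" />
  </ItemGroup>
  <!-- The imported C++ targets file is the "plugin" that actually
       teaches msbuild how to compile and link. -->
  <Import Project="$(VCTargetsPath)\Microsoft.Cpp.targets" />
</Project>
```

Hitting Build in the IDE is then roughly equivalent to running `msbuild project.vcxproj /p:Configuration=Release` from a developer command prompt.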
Oct 27
parent reply Jonathan M Davis <newsgroup.d jmdavisprog.com> writes:
On Saturday, October 28, 2017 02:48:00 evilrat via Digitalmars-d wrote:
 On Saturday, 28 October 2017 at 02:30:50 UTC, codephantom wrote:
 On Saturday, 28 October 2017 at 01:42:52 UTC, evilrat wrote:
 Since you already on that wave, can you test Windows SDK
 installation and make DMD's sc.ini use the SDK?
nope. not me. I've had enough ;-) I use FreeBSD. I just wanted so see what effort I had to undertake to compile D into a 64bit binary on Windows - presuming I didn't want visual studio too... Needless to say...I'm not impressed. And I'll leave it at that.
No problem. Actually there is a recent post in blog about D and VS where WinSDK is mentioned, might be interested to read - https://dlang.org/blog/2017/10/25/dmd-windows-and-c/ Some clarifications - VS projects(at least MS one's, i.e. C++ and C#) are just xml 'build scripts' for msbuild.exe, which itself don't have the knowledge about project or how to build them, it is plugins that provides such knowledge to it. So in this sense VS project properties editor is just a nice UI for editing build scripts. And when one hit the build button in VS it is just invokes msbuild with that script(project file). That's why we have WinSDK, MSBuild tools, and VS as separate downloads, and VS includes the former two. More or less like that. This might be helpful for some users.
At a previous job where we had both Linux and Windows builds of our libraries (though applications themselves tended to be single platform), I got so sick of dealing with VS and the builds not being consistent across platforms (since Linux used Makefiles, and those obviously had to be edited separately from the VS stuff) that I rewrote our build stuff so that it was all generated with cmake. Then editing the build was the same on both platforms, and building was _almost_ the same. I didn't even need to open up VS anymore - for configuration or for building. It was glorious. I expect that it's the sort of thing that would annoy many Windows devs though, because the fact that the VS files were generated meant that you couldn't make changes in VS and have it stick (which from my perspective was great, but for a hardcore Windows person, probably not so much). - Jonathan M Davis
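A cross-platform build of the sort described can be sketched in a few lines of CMake; the project layout and names below are made up for illustration, not the actual build files from that job.

```cmake
# Hypothetical CMakeLists.txt: one build description that cmake turns
# into Visual Studio project files on Windows and Makefiles on Linux.
cmake_minimum_required(VERSION 3.0)
project(mylib CXX)

# The shared library code, built identically on both platforms.
add_library(mylib src/mylib.cpp)
target_include_directories(mylib PUBLIC include)

# A small test driver linked against the library.
add_executable(mylib_tests test/main.cpp)
target_link_libraries(mylib_tests mylib)
```

On Windows one would run something like `cmake -G "Visual Studio 12 2013" ..` and on Linux `cmake -G "Unix Makefiles" ..`; after that, `cmake --build .` builds on either platform without opening VS at all.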
Oct 27
next sibling parent reply evilrat <evilrat666 gmail.com> writes:
On Saturday, 28 October 2017 at 03:00:16 UTC, Jonathan M Davis 
wrote:
 ... I rewrote our build stuff so that it was all generated with 
 cmake. Then editing the build was the same on both platforms, 
 and building was _almost_ the same. I didn't even need to open 
 up VS anymore - for configuration or for building. It was 
 glorious.

 I expect that it's the sort of thing that would annoy many 
 Windows devs though, because the fact that the VS files were 
 generated meant that you couldn't make changes in VS and have 
 it stick (which from my perspective was great, but for a 
 hardcore Windows person, probably not so much).
Never heard of anyone who is annoyed by the cmake/VS combo. Quite the opposite: there is an issue with "true" hardcore Linux devs who can't get into cmake. They're stuck with autotools, which is not an option on Windows. This is especially true for any C project, plus there's the fact that we're stuck with C89 on Windows. And another side of the problem is commercial middleware crap which is distributed as VS projects only and only supports some "ancient" VS version, though I can't remember such examples.
Oct 27
parent Jonathan M Davis <newsgroup.d jmdavisprog.com> writes:
On Saturday, October 28, 2017 03:45:02 evilrat via Digitalmars-d wrote:
 On Saturday, 28 October 2017 at 03:00:16 UTC, Jonathan M Davis

 wrote:
 ... I rewrote our build stuff so that it was all generated with
 cmake. Then editing the build was the same on both platforms,
 and building was _almost_ the same. I didn't even need to open
 up VS anymore - for configuration or for building. It was
 glorious.

 I expect that it's the sort of thing that would annoy many
 Windows devs though, because the fact that the VS files were
 generated meant that you couldn't make changes in VS and have
 it stick (which from my perspective was great, but for a
 hardcore Windows person, probably not so much).
Never heard of anyone who is annoyed by cmake/vs combo. Quite the opposite, there is an issue with "true" hardcore Linux devs who cannot into cmake. They stuck with autotools, which is not an option on Windows. This especially true for any C projects, and also the fact that we stuck with C89 on Windows. And another side of the problem is commercial middleware carp which distributed as VS projects only and only supports some "ancient" VS version, though I can't remember such examples.
The problem would be Windows devs who wanted to change any settings inside of VS. I don't think that it would have even worked to retain the file that's specific to the user, since it sits next to the normal VS project files, which were in a directory that would be deleted whenever a full rebuild was done. So, as long as you didn't need to configure any aspect of VS where the settings were saved in a file in that directory, you'd be fine, but something like your local debug settings for the project would probably be lost on a regular basis. So, while someone who's more of a Linux dev is likely to be very much in favor of using cmake to control everything, a hardcore Windows dev who uses VS on a regular basis might not view it the same way. Personally, I think that most anyone dealing with VS would be better off using cmake to generate its project files than using VS to control that stuff (it is _so_ easy to do stuff like make it so that the debug and release builds are not in sync if you're configuring VS directly), but I wouldn't have dared to suggest it at my last job, which was a Windows-only shop. The folks there were too Windows-centric and too set in their ways for that to have gone over well at all, even though it likely would have fixed a number of our build-related problems. - Jonathan M Davis
Oct 27
prev sibling parent reply Paulo Pinto <pjmlp progtools.org> writes:
On Saturday, 28 October 2017 at 03:00:16 UTC, Jonathan M Davis 
wrote:
 On Saturday, October 28, 2017 02:48:00 evilrat via 
 Digitalmars-d wrote:
 On Saturday, 28 October 2017 at 02:30:50 UTC, codephantom 
 wrote:
 On Saturday, 28 October 2017 at 01:42:52 UTC, evilrat wrote:
 Since you already on that wave, can you test Windows SDK 
 installation and make DMD's sc.ini use the SDK?
nope. not me. I've had enough ;-) I use FreeBSD. I just wanted so see what effort I had to undertake to compile D into a 64bit binary on Windows - presuming I didn't want visual studio too... Needless to say...I'm not impressed. And I'll leave it at that.
No problem. Actually there is a recent post in blog about D and VS where WinSDK is mentioned, might be interested to read - https://dlang.org/blog/2017/10/25/dmd-windows-and-c/ Some clarifications - VS projects(at least MS one's, i.e. C++ and C#) are just xml 'build scripts' for msbuild.exe, which itself don't have the knowledge about project or how to build them, it is plugins that provides such knowledge to it. So in this sense VS project properties editor is just a nice UI for editing build scripts. And when one hit the build button in VS it is just invokes msbuild with that script(project file). That's why we have WinSDK, MSBuild tools, and VS as separate downloads, and VS includes the former two. More or less like that. This might be helpful for some users.
At a previous job where we had both Linux and Windows builds of our libraries (though applications themselves tended to be single platform), I got so sick of dealing with VS and the builds not being consistent across platforms (since Linux used Makefiles, and those obviously had to be edited separately from the VS stuff) that I rewrote our build stuff so that it was all generated with cmake. Then editing the build was the same on both platforms, and building was _almost_ the same. I didn't even need to open up VS anymore - for configuration or for building. It was glorious. I expect that it's the sort of thing that would annoy many Windows devs though, because the fact that the VS files were generated meant that you couldn't make changes in VS and have it stick (which from my perspective was great, but for a hardcore Windows person, probably not so much). - Jonathan M Davis
Visual Studio 2017 has native support for cmake as project format. It is also the new official format for Android NDK development. So we are quite ok with using cmake. :)
Oct 28
parent Jonathan M Davis <newsgroup.d jmdavisprog.com> writes:
On Saturday, October 28, 2017 07:12:13 Paulo Pinto via Digitalmars-d wrote:
 Visual Studio 2017 has native support for cmake as project format.

 It is also the new official format for Android NDK development.

 So we are quite ok with using cmake. :)
That definitely sounds like an improvement. The place I was working at before is still on VS 2010. :| - Jonathan M Davis
Oct 28
prev sibling parent Bob Arnson <bob joyofsetup.com> writes:
On Saturday, 28 October 2017 at 01:42:52 UTC, evilrat wrote:
 At a minimum you'd better try WinSDK first, there should be all 
 necessary tools. After all it is system's development kit, not 
 some fancy junk.
The Windows SDK hasn't included compilers since Windows 7. Visual C++ is available as a NuGet package, which is just a .zip file. The 2017 version is ~650MB: https://www.nuget.org/packages/VisualCppTools.Community.VS2017Layout/ Unfortunately, it doesn't include the SDK headers or libs. Bob
Oct 27
prev sibling next sibling parent Jerry <hurricane hereiam.com> writes:
On Saturday, 28 October 2017 at 00:05:53 UTC, codephantom wrote:
 Rubbish!

 And get you facts straight!

 Where did I advocate from the removal of the ability for D to 
 generate 64-bit binaries?
So you are saying not to use the platform's tools to generate binaries. That's like saying not to use Linux's tools to generate binaries on that platform, and that D should instead build its own tools in order to be able to. D has a small enough community as it is; it isn't capable of developing such tools. You are advocating for the removal of the only way to currently generate 64-bit binaries in D. The only other solution is MinGW, and honestly those tools aren't nearly as polished as ones run by a company with almost limitless resources. If you don't want to deal with Visual Studio, I'll deal you one better: why are you bothering to deal with Windows at all? If you don't like Microsoft so much, just switch to Linux; there your problem is solved. You can't even install Visual Studio on Linux.
 At a minimum, I had to download 3.5GB of VS build tools just so 
 I could compile a 64 bit D program (and it took me almost a 
 whole day to work out the correct process).
It's really not that difficult, you install it and it pretty much just works. The only problem case is if you install D before you install Visual Studio. Wow 3.5 GB, that's so much! If only there were TB HDDs at an affordable price, oh god why does it have to be so big to install! Anyways maybe I just don't see it as a problem cause I have to download much much bigger files. Good thing you don't play games cause they are getting into the 80 GB range nowadays.
 Is is it problem that D should accept, and just impose on it's 
 users?

 Or should D find a better way?

 Which is the worse mentality?
You're on the Windows platform; not supporting Windows tools is annoying, and you aren't going to find better tools. If you don't like the way Microsoft does business, you have 2 other platforms you can go to. Buy a Mac or boot up Linux. Just stop making Windows a worse platform by suggesting we drop support for the official development tools. There is no "better way". Every other way is going to be worse cause Windows doesn't have as big of a community dabbling in building tools like GCC and Clang for Windows. Why? Cause there's Visual Studio. Like I said, ideals are nice and all but people still need to get shit done. That's what your argument boils down to: the ideal of finding a better way than what is currently available. The problem is you aren't even suggesting a better way, you are just trying to sell it on the false belief that there is something better. But there isn't. This is worse than religion. Why don't you like VS, cause they changed something? Rofl, whenever there is change people hate it. Cause people don't like change, for the only reason that they don't want to learn something new. I don't know how many times I teach someone a hotkey that's way better than their current method and they just keep going with their horribly slow method cause that's what they know. And download size? I could say why are you even on Windows, Linux is like a 20 GB smaller download and takes up less HDD space than Windows. So why the hell are you even on Windows? Oh yah, once you install it you don't have to worry about it for years on end. You want to drop support for VS cause of something you spend time on once and then pretty much never have to do again for years to come. Please no, just switch to Linux and let the people that actually need to use the Windows platform use it effectively.
Oct 28
prev sibling parent Jerry <hurricane hereiam.com> writes:
On Saturday, 28 October 2017 at 00:05:53 UTC, codephantom wrote:
 Is is it problem that D should accept, and just impose on it's 
 users?

 Or should D find a better way?

 Which is the worse mentality?
There is an afterlife with god. There is nothingness after death. Which is the worse mentality? Yet why is it that the more educated someone is, the more likely they are to be atheist?
Oct 28
prev sibling parent codephantom <me noyb.com> writes:
On Friday, 27 October 2017 at 19:44:49 UTC, Jerry wrote:
 This mentality is why D is pretty awful on Windows.
Have read of this... then you will understand things better: https://dlang.org/blog/2017/10/25/dmd-windows-and-c/
Oct 27
prev sibling parent reply Kagamin <spam here.lot> writes:
On Tuesday, 24 October 2017 at 22:19:59 UTC, jmh530 wrote:
 I'm sympathetic to your point.

 I think there was/is some effort to allow LLD (the LLVM linker) 
 as a replacement for the MSVC linker in LDC. Perhaps if LLD 
 could be a drop-in for MSVC in DMD for Windows, then eventually 
 it could be the default? Not sure that's possible or not.
LLD was integrated in ldc 1.3.0 https://github.com/ldc-developers/ldc/pull/2142 but currently has conflicting command line options. I suppose you can still run it separately, for me even ld works.
Oct 26
parent jmh530 <john.michael.hall gmail.com> writes:
On Thursday, 26 October 2017 at 10:19:23 UTC, Kagamin wrote:
 LLD was integrated in ldc 1.3.0 
 https://github.com/ldc-developers/ldc/pull/2142 but currently 
 has conflicting command line options. I suppose you can still 
 run it separately, for me even ld works.
Interesting.
Oct 26
prev sibling next sibling parent reply "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Tue, Oct 24, 2017 at 09:20:10AM -0400, Andrei Alexandrescu via Digitalmars-d
wrote:
 A person who donated to the Foundation made a small wish list known.
 Allow me to relay it:
 
 * RSA Digital Signature Validation in Phobos
[...] This is going to be a tricky one. I'm very wary of implementing cryptographic algorithms without a crypto expert on board. It's just far too easy to get a tiny detail wrong, and open up a gaping security hole as a result. Even though we're not talking about encryption per se, all it takes is for a bug to wrongly validate an invalid signature and we have a problem. And even if there are no bugs, there may be (probably many) inadvertent side-channel attacks opened up if whoever writes the code is unaware of them. The other alternative is to wrap around a reputable crypto library like openssl, but that would mean even more external dependencies of Phobos. And we all know how well that went with libcurl, zlib, etc.: people constantly complain about why this doesn't work and why that breaks. If we build Phobos with an external dependency on openssl, say, that means the installer must make sure it finds the right DLL/.so paths, configure the compiler accordingly, deal with possibly multiple incompatible local versions of the same library on the user's system, etc.. But if we ship openssl with Phobos to avoid this problem, then we have another problem: needing to push out a high-priority security fixes if an exploit is published, etc., which currently we simply don't have the infrastructure to deal with. Neither alternative sounds appealing to me. (Having said all that, though, D is probably a far better language for implementing crypto algorithms -- built-in bounds checking would have prevented some of the worst security holes that have come to light recently, like Heartbleed and Cloudbleed. Still, I wouldn't feel confident about a crypto library written in D unless it was reviewed by someone with crypto expertise. Or preferably, *multiple* crypto experts. It's just far, far too easy to get it wrong, with disastrous consequences.) 
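As a small aside on the bounds-checking remark above, here is a minimal D sketch of the Heartbleed pattern (an attacker-claimed length larger than the actual buffer): with bounds checks on, the over-read is stopped at the slice instead of leaking adjacent memory. The names are made up for illustration.

```d
import core.exception : RangeError;
import std.exception : assertThrown;

void main()
{
    auto payload = new ubyte[](16); // what was actually received
    size_t claimed = 64;            // attacker-claimed payload length

    // With bounds checking enabled (the default), an out-of-bounds
    // slice throws RangeError rather than reading past the buffer,
    // which is exactly the failure mode Heartbleed exploited in C.
    assertThrown!RangeError(payload[0 .. claimed]);
}
```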
T -- A program should be written to model the concepts of the task it performs rather than the physical world or a process because this maximizes the potential for it to be applied to tasks that are conceptually similar and, more important, to tasks that have not yet been conceived. -- Michael B. Allen
Oct 24
next sibling parent reply =?UTF-8?Q?Ali_=c3=87ehreli?= <acehreli yahoo.com> writes:
On 10/24/2017 09:37 AM, H. S. Teoh wrote:
 On Tue, Oct 24, 2017 at 09:20:10AM -0400, Andrei Alexandrescu via 
Digitalmars-d wrote:
 A person who donated to the Foundation made a small wish list known.
 Allow me to relay it:

 * RSA Digital Signature Validation in Phobos
[...] This is going to be a tricky one. I'm very wary of implementing cryptographic algorithms without a crypto expert on board.
deadalnix (Amaury Séchet) is in that field: http://dconf.org/2017/talks/sechet.html Ali
Oct 24
parent Joakim <dlang joakim.fea.st> writes:
On Tuesday, 24 October 2017 at 17:31:06 UTC, Ali Çehreli wrote:
 On 10/24/2017 09:37 AM, H. S. Teoh wrote:
 On Tue, Oct 24, 2017 at 09:20:10AM -0400, Andrei Alexandrescu
via Digitalmars-d wrote:
 A person who donated to the Foundation made a small wish
list known.
 Allow me to relay it:

 * RSA Digital Signature Validation in Phobos
[...] This is going to be a tricky one. I'm very wary of
implementing
 cryptographic algorithms without a crypto expert on board.
deadalnix (Amaury Séchet) is in that field: http://dconf.org/2017/talks/sechet.html Ali
He's a little busy right now: ;) http://cryptotimes.org/alt-coin/amaury-sechet-discusses-the-values-of-bitcoin-abc-development/
Oct 24
prev sibling parent reply Kagamin <spam here.lot> writes:
On Tuesday, 24 October 2017 at 16:37:10 UTC, H. S. Teoh wrote:
 (Having said all that, though, D is probably a far better 
 language for implementing crypto algorithms -- built-in bounds 
 checking would have prevented some of the worst security holes 
 that have come to light recently, like Heartbleed and 
 Cloudbleed.
Those were buffer overflows in parsers, not in cryptographic algorithms.
Oct 25
next sibling parent reply Jonathan M Davis <newsgroup.d jmdavisprog.com> writes:
On Wednesday, October 25, 2017 13:22:46 Kagamin via Digitalmars-d wrote:
 On Tuesday, 24 October 2017 at 16:37:10 UTC, H. S. Teoh wrote:
 (Having said all that, though, D is probably a far better
 language for implementing crypto algorithms -- built-in bounds
 checking would have prevented some of the worst security holes
 that have come to light recently, like Heartbleed and
 Cloudbleed.
Those were buffer overflows in parsers, not in cryptographic algorithms.
The point still stands though that you have to be _very_ careful when implementing anything security related, and it's shockingly easy to do something that actually leaks information even if it's not outright buggy (e.g. the timing of the code indicates something about success or failure to an observer), and someone who isn't an expert in the area is bound to screw something up - and since this is a security issue, it matters that much more than it would with other code. - Jonathan M Davis
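One standard mitigation for the timing leak mentioned above is to compare secrets in constant time, so the comparison's duration doesn't reveal how many leading bytes already match. A minimal sketch in D follows; constantTimeEquals is a hypothetical helper, not a Phobos function, and a vetted crypto library should still be preferred for real use.

```d
bool constantTimeEquals(const(ubyte)[] a, const(ubyte)[] b)
{
    if (a.length != b.length)
        return false;
    ubyte diff = 0;
    foreach (i; 0 .. a.length)
        diff |= cast(ubyte)(a[i] ^ b[i]); // accumulate, no early exit
    return diff == 0;                     // zero only if every byte matched
}

void main()
{
    ubyte[] expected = [0xde, 0xad, 0xbe, 0xef];
    ubyte[] good     = [0xde, 0xad, 0xbe, 0xef];
    ubyte[] bad      = [0xde, 0xad, 0xbe, 0xee];
    assert(constantTimeEquals(expected, good));
    assert(!constantTimeEquals(expected, bad));
}
```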
Oct 25
parent reply Kagamin <spam here.lot> writes:
On Wednesday, 25 October 2017 at 14:17:21 UTC, Jonathan M Davis 
wrote:
 The point still stands though that you have to be _very_ 
 careful when implementing anything security related, and it's 
 shockingly easy to do something that actually leaks information 
 even if it's not outright buggy (e.g. the timing of the code 
 indicates something about success or failure to an observer)
Fun read: http://cr.yp.to/papers.html#cachetiming - a cache timing attack on AES that recovers the full key. This flaw was accounted for in the design of Salsa and ChaCha.
Oct 27
parent "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Fri, Oct 27, 2017 at 05:35:17PM +0000, Kagamin via Digitalmars-d wrote:
 On Wednesday, 25 October 2017 at 14:17:21 UTC, Jonathan M Davis wrote:
 The point still stands though that you have to be _very_ careful
 when implementing anything security related, and it's shockingly
 easy to do something that actually leaks information even if it's
 not outright buggy (e.g. the timing of the code indicates something
 about success or failure to an observer)
Fun read: http://cr.yp.to/papers.html#cachetiming - a cache timing attack on AES recovering full key. This flaw was accounted for in Salsa and Chacha design.
Yes, and this is exactly why I would not trust any D crypto implementation that hasn't been vetted by crypto experts. Nobody would think of such weaknesses when they're writing the code, unless they were already aware of such issues beforehand -- and I doubt many of us here would even be aware of half of the issues crypto implementors must work with on a regular basis. If even the OpenSSL folk didn't manage to avoid this exploit, we non-crypto people wouldn't even stand a chance.

Of course, the larger picture is that crypto algorithms are only a small (albeit critical) part of a larger cryptosystem, and oftentimes the weaknesses come not from the algorithm itself but from issues affecting the other parts of the cryptosystem. You can have the best, most unbreakable crypto (or whatever else) algorithm in your hand, but if you use it incorrectly or just carelessly, you'd still get exploited, and all that crypto strength would be for nought.

T -- Insanity is doing the same thing over and over again and expecting different results.
Oct 27
prev sibling parent reply "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Wed, Oct 25, 2017 at 08:17:21AM -0600, Jonathan M Davis via Digitalmars-d
wrote:
 On Wednesday, October 25, 2017 13:22:46 Kagamin via Digitalmars-d wrote:
 On Tuesday, 24 October 2017 at 16:37:10 UTC, H. S. Teoh wrote:
 (Having said all that, though, D is probably a far better language
 for implementing crypto algorithms -- built-in bounds checking
 would have prevented some of the worst security holes that have
 come to light recently, like Heartbleed and Cloudbleed.
Those were buffer overflows in parsers, not in cryptographic algorithms.
The point still stands though that you have to be _very_ careful when implementing anything security related, and it's shockingly easy to do something that actually leaks information even if it's not outright buggy (e.g. the timing of the code indicates something about success or failure to an observer), and someone who isn't an expert in the area is bound to screw something up - and since this is a security issue, it matters that much more than it would with other code.
[...] Yeah. There have been timing attacks against otherwise-secure crypto algorithms that allow extraction of the decryption key, and other side-channel attacks along the lines of CRIME or BREACH. Even CPU instruction timing attacks have been discovered that can leak which path a branch in a crypto algorithm took, which in turn can reveal information about the decryption key; voltage variations have been used to deduce which bit(s) are 1's and which are 0's. Many of these remain theoretical attacks, but the point is that these weaknesses can come from things you wouldn't even know existed in your code.

Crypto code must be subject to a LOT of scrutiny before it can be trusted. And not just cursory scrutiny like we do with the PR queue on GitHub; we're talking about possibly instruction-by-instruction scrutiny of the kind that can discover vulnerabilities to timing or voltage. I would not be comfortable entrusting any important data to D crypto algorithms if they have not been thoroughly reviewed.

T -- You have to expect the unexpected. -- RL
Oct 25
parent reply Adam Wilson <flyboynw gmail.com> writes:
On 10/25/17 11:23, H. S. Teoh wrote:
 On Wed, Oct 25, 2017 at 08:17:21AM -0600, Jonathan M Davis via Digitalmars-d
wrote:
 On Wednesday, October 25, 2017 13:22:46 Kagamin via Digitalmars-d wrote:
 On Tuesday, 24 October 2017 at 16:37:10 UTC, H. S. Teoh wrote:
 (Having said all that, though, D is probably a far better language
 for implementing crypto algorithms -- built-in bounds checking
 would have prevented some of the worst security holes that have
 come to light recently, like Heartbleed and Cloudbleed.
Those were buffer overflows in parsers, not in cryptographic algorithms.
The point still stands though that you have to be _very_ careful when implementing anything security related, and it's shockingly easy to do something that actually leaks information even if it's not outright buggy (e.g. the timing of the code indicates something about success or failure to an observer), and someone who isn't an expert in the area is bound to screw something up - and since this is a security issue, it matters that much more than it would with other code.
[...] Yeah. There have been timing attacks against otherwise-secure crypto algorithms that allow extraction of the decryption key. And other side-channel attacks along the lines of CRIME or BREACH. Even CPU instruction timing attacks have been discovered that can leak which path a branch in a crypto algorithm took, which in turn can reveal information about the decryption key. And voltage variations to deduce which bit(s) are 1's and which are 0's. Many of these remain theoretical attacks, but the point is that these weaknesses can come from things you wouldn't even know existed in your code. Crypto code must be subject to a LOT of scrutiny before it can be trusted. And not just cursory scrutiny like we do with the PR queue on github; we're talking about possibly instruction-by-instruction scrutiny of the kind that can discover vulnerabilities to timing or voltage. I would not be comfortable entrusting any important data to D crypto algorithms if they have not been thoroughly reviewed. T
I am one-hundred-ten percent in agreement with Mr. Teoh here. Even the .NET Framework and Core forward to the highly vetted system crypto APIs (SChannel on Windows and OpenSSL on Linux/macOS). If you need RSA crypto in D, pull in OpenSSL. Period. Everything else is a good way to run afoul of a security audit, and potentially expose yourself.

Phobos could forward to these system-provided APIs like .NET does and provide an idiomatic D interface, but Phobos itself should absolutely and 110% stay out of the crypto implementation business.

-- Adam Wilson IRC: LightBender import quiet.dlang.dev;
Oct 25
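The "forward to a vetted library" approach suggested above can be a very thin shim. A hedged sketch, assuming OpenSSL's libcrypto is installed and linked (e.g. `dmd app.d -L-lcrypto` on Linux): `SHA256` is the real one-shot libcrypto entry point, while the `sha256Digest` wrapper name is illustrative (Phobos already ships its own `std.digest.sha`; RSA signature validation would forward the same way through libcrypto's EVP functions).

```d
import std.format : format;
import std.stdio : writeln;

// Real OpenSSL libcrypto prototype: one-shot SHA-256 of a buffer.
extern (C) ubyte* SHA256(const(ubyte)* d, size_t n, ubyte* md);

// Illustrative wrapper: D owns the buffer, the vetted C library does the crypto.
ubyte[32] sha256Digest(const(ubyte)[] data)
{
    ubyte[32] digest;
    SHA256(data.ptr, data.length, digest.ptr);
    return digest;
}

void main()
{
    auto d = sha256Digest(cast(const(ubyte)[]) "abc");
    writeln(format("%(%02x%)", d[]));
    // FIPS 180-2 test vector for "abc":
    // ba7816bf8f01cfea414140de5dae2223b00361a396177a9cb410ff61f20015ad
}
```

All the security-critical arithmetic stays inside the audited library; the D side only manages memory and presents an idiomatic interface, which is exactly the split .NET uses.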
next sibling parent reply codephantom <me noyb.com> writes:
On Wednesday, 25 October 2017 at 22:46:27 UTC, Adam Wilson wrote:
 Even .NET Framework and Core forward to the highly vetted 
 system crypto API's (SChannel on Windows and OpenSSL on 
 Linux/macOS). If you need RSA crypto in D, pull in OpenSSL. 
 Period. Everything else is a good way to run afoul of a 
 security audit, and potentially expose yourself.

 Phobos could forward to these system provided API's like .NET 
 does and provide an idiomatic D interface, but Phobos itself 
 should absolutely and 110% stay out of the crypto 
 implementation business.
I agree. D just needs an interface to Encryption providers. I cannot see how anyone would consider anything other than a provider model, for something that is so highly complex and specialised.
Oct 25
parent Cym13 <cpicard openmailbox.org> writes:
On Wednesday, 25 October 2017 at 23:37:37 UTC, codephantom wrote:
 On Wednesday, 25 October 2017 at 22:46:27 UTC, Adam Wilson 
 wrote:
 Even .NET Framework and Core forward to the highly vetted 
 system crypto API's (SChannel on Windows and OpenSSL on 
 Linux/macOS). If you need RSA crypto in D, pull in OpenSSL. 
 Period. Everything else is a good way to run afoul of a 
 security audit, and potentially expose yourself.

 Phobos could forward to these system provided API's like .NET 
 does and provide an idiomatic D interface, but Phobos itself 
 should absolutely and 110% stay out of the crypto 
 implementation business.
I agree. D just needs an interface to Encryption providers. I cannot see how anyone would consider anything other than a provider model, for something that is so highly complex and specialised.
While I agree that we are nowhere near being able to safely integrate crypto in phobos, it is definitely not that specialized. Most communicating programs I can think of use crypto in some form or another: data encryption of course, but also secure random numbers (which we sorely lack atm), signature verification (which was the point here), secure communications (I'm talking protocol, not encryption here)... What communicating program doesn't need to guarantee at least the integrity of its data, not to mention confidentiality?

We do have some elements in phobos right now (basic hashing algorithms for example) and I think we could add more without falling into crypto hell. Something as simple as a standard interface to the system's cryptographically secure random number generator (such as /dev/urandom on linux) would be a valuable addition.

While there is definitely value in not playing with fire, we shouldn't be dismissing all crypto operations as a whole just because we fear the word.
Oct 25
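The "standard interface to the system CSPRNG" mentioned above could start as small as this. A POSIX-only sketch with illustrative names, not a proposed Phobos API: a portable module would also have to cover `getrandom(2)` on newer Linux kernels and `BCryptGenRandom` on Windows.

```d
import std.exception : enforce;
import std.stdio : File, writefln;

// Illustrative sketch: fill a buffer from the kernel's CSPRNG.
// POSIX-only; not suitable as-is for Windows.
ubyte[] systemEntropy(size_t n)
{
    auto urandom = File("/dev/urandom", "rb");
    auto buf = new ubyte[n];
    auto got = urandom.rawRead(buf);
    enforce(got.length == n, "short read from /dev/urandom");
    return got;
}

void main()
{
    auto key = systemEntropy(32);        // e.g. material for an AES-256 key
    assert(key.length == 32);
    writefln("%(%02x%)", key);           // 64 hex chars, different every run
}
```

Note this only hands out entropy the kernel already guarantees; it implements no cryptographic algorithm itself, which is what keeps it on the safe side of the line drawn in this thread.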
prev sibling next sibling parent rikki cattermole <rikki cattermole.co.nz> writes:
On 25/10/2017 11:46 PM, Adam Wilson wrote:
 On 10/25/17 11:23, H. S. Teoh wrote:
 On Wed, Oct 25, 2017 at 08:17:21AM -0600, Jonathan M Davis via 
 Digitalmars-d wrote:
 On Wednesday, October 25, 2017 13:22:46 Kagamin via Digitalmars-d wrote:
 On Tuesday, 24 October 2017 at 16:37:10 UTC, H. S. Teoh wrote:
 (Having said all that, though, D is probably a far better language
 for implementing crypto algorithms -- built-in bounds checking
 would have prevented some of the worst security holes that have
 come to light recently, like Heartbleed and Cloudbleed.
Those were buffer overflows in parsers, not in cryptographic algorithms.
The point still stands though that you have to be _very_ careful when implementing anything security related, and it's shockingly easy to do something that actually leaks information even if it's not outright buggy (e.g. the timing of the code indicates something about success or failure to an observer), and someone who isn't an expert in the area is bound to screw something up - and since this is a security issue, it matters that much more than it would with other code.
[...] Yeah. There have been timing attacks against otherwise-secure crypto algorithms that allow extraction of the decryption key. And other side-channel attacks along the lines of CRIME or BREACH. Even CPU instruction timing attacks have been discovered that can leak which path a branch in a crypto algorithm took, which in turn can reveal information about the decryption key. And voltage variations to deduce which bit(s) are 1's and which are 0's. Many of these remain theoretical attacks, but the point is that these weaknesses can come from things you wouldn't even know existed in your code. Crypto code must be subject to a LOT of scrutiny before it can be trusted. And not just cursory scrutiny like we do with the PR queue on github; we're talking about possibly instruction-by-instruction scrutiny of the kind that can discover vulnerabilities to timing or voltage. I would not be comfortable entrusting any important data to D crypto algorithms if they have not been thoroughly reviewed. T
I am one-hundred-ten percent in agreement with Mr. Teoh here. Even .NET Framework and Core forward to the highly vetted system crypto API's (SChannel on Windows and OpenSSL on Linux/macOS). If you need RSA crypto in D, pull in OpenSSL. Period. Everything else is a good way to run afoul of a security audit, and potentially expose yourself. Phobos could forward to these system provided API's like .NET does and provide an idiomatic D interface, but Phobos itself should absolutely and 110% stay out of the crypto implementation business.
Or mbedtls, which has also been audited (and has much better and nicer code!). Either way: you write it, you pay for auditing, or you get no users. Hence I won't use our port of Botan.
Oct 25
prev sibling next sibling parent Jacob Carlborg <doob me.com> writes:
On 2017-10-26 00:46, Adam Wilson wrote:

 I am one-hundred-ten percent in agreement with Mr. Teoh here. Even .NET 
 Framework and Core forward to the highly vetted system crypto API's 
 (SChannel on Windows and OpenSSL on Linux/macOS). If you need RSA crypto 
 in D, pull in OpenSSL.
I think we should go with what the system provides. Apple abandoned OpenSSL years ago. It's still shipping with the operating system, but if you're using Apple's APIs you're not using OpenSSL, as far as I know. Several BSD variants and Alpine Linux [1] are using LibreSSL, a fork of OpenSSL. Blindly going with only OpenSSL is not a good idea.

[1] https://en.wikipedia.org/wiki/LibreSSL#Adoption

-- /Jacob Carlborg
Oct 27
prev sibling parent Andre Pany <andre s-e-a-p.de> writes:
On Wednesday, 25 October 2017 at 22:46:27 UTC, Adam Wilson wrote:
 On 10/25/17 11:23, H. S. Teoh wrote:
 On Wed, Oct 25, 2017 at 08:17:21AM -0600, Jonathan M Davis via 
 Digitalmars-d wrote:
 [...]
[...] Yeah. There have been timing attacks against otherwise-secure crypto algorithms that allow extraction of the decryption key. And other side-channel attacks along the lines of CRIME or BREACH. Even CPU instruction timing attacks have been discovered that can leak which path a branch in a crypto algorithm took, which in turn can reveal information about the decryption key. And voltage variations to deduce which bit(s) are 1's and which are 0's. Many of these remain theoretical attacks, but the point is that these weaknesses can come from things you wouldn't even know existed in your code. Crypto code must be subject to a LOT of scrutiny before it can be trusted. And not just cursory scrutiny like we do with the PR queue on github; we're talking about possibly instruction-by-instruction scrutiny of the kind that can discover vulnerabilities to timing or voltage. I would not be comfortable entrusting any important data to D crypto algorithms if they have not been thoroughly reviewed. T
I am one-hundred-ten percent in agreement with Mr. Teoh here. Even .NET Framework and Core forward to the highly vetted system crypto API's (SChannel on Windows and OpenSSL on Linux/macOS). If you need RSA crypto in D, pull in OpenSSL. Period. Everything else is a good way to run afoul of a security audit, and potentially expose yourself. Phobos could forward to these system provided API's like .NET does and provide an idiomatic D interface, but Phobos itself should absolutely and 110% stay out of the crypto implementation business.
I think you made a very good point; it was also mentioned by someone else in this thread. Phobos could provide a crypto interface with implementations for SChannel, mbedtls and openssl. On Windows, SChannel would be used as the default implementation, and on the other operating systems either openssl or mbedtls. This would be very convenient and we would avoid opening Pandora's box.

I will close my issue and create a new one with the request for a crypto interface in Phobos.

Kind regards Andre
Oct 29
prev sibling next sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 10/24/2017 6:20 AM, Andrei Alexandrescu wrote:
 * better dll support for Windows.
It's been there, but it breaks repeatedly because it is not in the test suite.
Oct 24
next sibling parent Andre Pany <andre s-e-a-p.de> writes:
On Wednesday, 25 October 2017 at 04:55:16 UTC, Walter Bright 
wrote:
 On 10/24/2017 6:20 AM, Andrei Alexandrescu wrote:
 * better dll support for Windows.
It's been there, but it breaks repeatedly because it is not in the test suite.
Yes, that is right. One of these issues is the Runtime.unload bug: https://issues.dlang.org/show_bug.cgi?id=15443

The wiki page https://wiki.dlang.org/Win32_DLLs_in_D has an example in which the functions gc_getProxy and gc_setProxy are used to inject the garbage collector. The example shows this behavior only for the statically linked case, not the dynamically linked case; in the dynamically linked case there is an error. Ian Hatch found out some more information: http://forum.dlang.org/post/nteskpizpsnxrulcsgri forum.dlang.org

In general, the "export" work of Benjamin Thaut is also highly interesting (D's Import and Export Business, DConf 2016).

Kind regards André
Oct 25
prev sibling parent evilrat <evilrat666 gmail.com> writes:
On Wednesday, 25 October 2017 at 04:55:16 UTC, Walter Bright 
wrote:
 On 10/24/2017 6:20 AM, Andrei Alexandrescu wrote:
 * better dll support for Windows.
It's been there, but it breaks repeatedly because it is not in the test suite.
Including TypeInfo? (classes, casting, all such things...)
Oct 26
prev sibling next sibling parent Gary Willoughby <dev nomad.so> writes:
On Tuesday, 24 October 2017 at 13:20:10 UTC, Andrei Alexandrescu 
wrote:
 A person who donated to the Foundation made a small wish list 
 known. Allow me to relay it:

 * RSA Digital Signature Validation in Phobos
 * std.decimal in Phobos
 * better dll support for Windows.


 Andrei
std.decimal has been on the review list for quite some time: https://wiki.dlang.org/Review_Queue
Oct 27
prev sibling parent Maksim Fomin <mxfm protonmail.com> writes:
On Tuesday, 24 October 2017 at 13:20:10 UTC, Andrei Alexandrescu 
wrote:
 A person who donated to the Foundation made a small wish list 
 known. Allow me to relay it:

 * better dll support for Windows.

 Andrei
This would be better sent to Walter rather than posted here.
Nov 07