
digitalmars.D - OT: 'conduct unbecoming of a hacker'

reply Laeeth Isharc <laeethnospam nospam.laeeth.com> writes:
http://sealedabstract.com/rants/conduct-unbecoming-of-a-hacker/
(His particular suggestion about accepting patches by default is 
not why I post this).
'
We’re all talk

Back when I joined the hacking community, it was about making 
things. There were mailing lists, and before them there were 
dead-tree magazines, and they would be full of things that people 
had made. And there was discussion about those things that people 
made.
...
Now, it has always been a critical culture. The phenomenon of the 
top HN comment that complains about OP is not a new phenomenon, 
it is older than HN and it is older than the Internet. Various 
people occasionally argue that we need to be “less critical” and 
“less abrasive” and I am sorry to be so critical and abrasive, 
but I think that viewpoint is dumb. Making critical remarks is 
how we make code better.

However that only counts if the remarks actually do make the code 
better. At one time, this was more true than it is today. 
Minimally, criticisms were sent to a person who could do 
something about it, not about them, to the Internet in general.

And when that person elected to do nothing in particular, the 
canonical reply was to write a patch yourself. And that patch was 
posted for discussion, and people would poke and prod at it, and 
see how good of a patch it was. Did it solve the problem? Did it 
introduce any new issues?

And if nobody was willing to write the patch, then that was the 
end of the discussion. A problem is not a real problem if nobody 
is willing to solve it.

And if you did continue to whine that nobody else would solve 
your problem for you, you got booted from the list. People did 
whine, in a sense, but it was very different. Stallman is 
probably the canonical example: he whines more than just about 
anyone, but his whining is neatly interspersed by patches, and he 
is probably one of the most prolific hackers of his generation. 
In spite of the fact that his complaints are extremely 
abstract–copyrights, patent law, etc.–they are at once concrete, 
because he has actually built the copyleft utopia he advocates 
for. Not blogged about how other people should build it, or 
gotten into flamewars about how good it might be–he built it. The 
GNU project is an actual project with actual software and actual 
users, and now we can post that to the list for discussion. 
Stallman wrote a patch, and today you can poke and prod at it.

This idea that people just whine and whine until they run out of 
breath is relatively new. One of the first times I really saw it 
in full force, accepted by most of the community, was with the 
App Store Wars in 2008. This was the time that Apple instituted 
app review for all iOS apps and for one reason or another this 
was going to End Computing As We Knew It. Blog articles were 
written and heated debates were had. Even pg got in on the 
action. But as far as I can tell, everybody complaining just 
eventually ran out of oxygen. There was no mass customer exodus, 
and there was not even a mass hacker exodus. Android did 
eventually rise to become a viable competitor to iOS, but it 
seems to me this had more to do with pricing and carrier partners 
than an ideological struggle; certainly the toppling of Apple’s 
kingdom over this issue never happened. At best, a few egregious 
app review problems got fixed.

There was another tempest in a teacup when Apple got serious 
about their 30% cut. Article after article about how now, this 
time, Apple must correct this injustice or face the terrible 
consequences of software developers rising up to demand money 
that is rightfully theirs. Or something. What actually happened 
is that Kindle removed their online store so now you had to use 
(the horror!) Safari to buy books, and everybody else just raised 
their prices 30%. The end. No uprising, no injustices corrected, 
nothing.

Is it my point that Apple was right and everybody was wrong? No, 
not really. My point is that in the 80s and 90s when the hacker 
community was in crisis, the response was to write a patch. Not 
to whine endlessly in the blogosphere. When we were threatened by 
Ma Bell and Microsoft (which, kids these days forget, were way 
way scarier than Tim Cook can even dream of being), we wrote GNU 
and Linux and the BSDs. And you can laugh about things like 
“Linux on the desktop” all you like, but these projects are all 
seriously impressive achievements in their own right, that 
fundamentally shifted the needle on the software industry. 
Hackers didn’t topple Microsoft, but they seriously threatened 
it, and they won several battles, like the battle for servers and 
the battle for the Internet. Hackers of today can’t threaten 
anyone, or win anything. We’re all talk.
...
This “bikeshedding culture” wouldn’t be so bad if you saw it only 
in hacker discussion spaces, like HN, because at worst you can 
just add them to your hosts file. However I am sorry to report 
that the malaise has now infected places of actually writing 
code, so the problem is now unavoidable.

I have been working on a project lately that requires me to rope 
together various FOSS projects and extend them in logical ways. 
And so the last few months have been a lot of this:

     Hello folks, I need [obviously useful feature]. I realize 
that this is a lot of work, and you’re not going to get to it any 
time soon, and I need it, so I’m going to do it this week myself. 
I plan to do the X followed by the Y followed by the Z, and then 
contribute that back under [your license]. Does that basically 
sound mergeable to you, or are there other things we should 
discuss?

(Un)fortunately I have decided not to name the guilty, which 
makes the next part of this narrative unfalsifiable. For those of 
you joining us from very well-run projects like git, the Linux 
Kernel, Firefox, WebKit, etc., it may even be hard to believe. 
However if you ever have the misfortune to venture into the 
GitHub Graveyard of projects that aren’t quite popular (and even 
a few that are, in a way) you will see at once the force I am 
wrestling with.

My email is inevitably met not with acceptance, nor with 
constructive discussion, but with some attempt to derail the 
entire enterprise. Here are some real examples, paraphrased by 
yours truly:

     I think it should be done some other way, even though the 
other way obviously doesn’t work for you and so far nobody has 
ever been found who is willing to implement it that way
     I don’t want to solve this problem without also solving 
[unrelated problem X], your proposal doesn’t address [unrelated 
problem X], therefore I am inclined to reject it
     I don’t know you and there might be a bug in your patch. This 
patch is too important to leave to somebody new. At the same time 
it is not important enough for any of the core committers to get 
to it.
     Defend this proposal. You’re telling me you “need” encryption 
in an internet communications library, or you “need” unicode 
support in an object storage library. I don’t believe you. We’ve 
gotten along just fine for N months without it, and we’ll get 
along for another 2N months just fine thanks.
     Look, we’ve already implemented [sort-of related feature] 
even though it’s buggy and doesn’t cover your usecase. That 
decision was complicated and people were arguing about it for 
years and I really don’t want to go through that jungle again. If 
you wanted to do it this way you should have spoken up two years 
ago.

Common objections to patches on mailing lists

In some cases I have made this proposal to my first-choice 
project, gotten one of the numbered responses, then went to the 
second and third-choice projects only to get a different 
rejection at each one. I track most of these in a spreadsheet, 
and many of them get in long flamewars that run on for months 
after I’ve unsubscribed, forked, and shipped the solution I 
originally proposed. Most of those flamewars come to nothing, but 
a few of them even end up implementing an equivalent solution to 
the one I proposed, all that time later. Exactly one project in 
my dataset ever reached a decision that actually improved on my 
proposed solution, and so far that decision was made in theory 
only–a separate flamewar erupted between the spec writers and the 
implementers, and that flamewar continues to this day. Meanwhile 
my code shipped in 2012.

Is my point that I am the best hacker, mwahahaha? No. These are 
largely pretty easy patches that most software developers could 
write in a few days. I do not claim any special talent in writing 
them. I simply claim that while everybody else was arguing, I was 
the one who did write them. And without exception, every single 
argument over one of these patches caused a delay and produced no 
benefit, at best. I am sorry to report the facts today are that 
flamewars, not patches, are king.
...
Who cares? Let people argue if they think it’s fun

Well, let me be clear: I am in favor of recreational arguing. 
Just read this blog; that is an exercise in the discipline.

However, there are some limits. It’s all fun and games until 
somebody loses an eye. One problem is that when all the hacker 
spaces are infected with patchless argumentation, we discourage 
all the up-and-coming hackers who do, actually, write patches. 
The Rachels of the world are confused: is this what programming 
is about? It’s about winning the mailing list thread? We also 
attract people who are good at arguing, instead of people who are 
good at patches. Those are mistakes, and probably enormous ones, 
but the effect is hard to prove.


...
Hacking should be about making things. And yet a great many of 
our institutions are set up to discourage, distract, destroy, and 
derail the making of anything. It’s time we called it what it is: 
conduct unbecoming of a hacker.
'
Feb 09 2016
next sibling parent Nick B <nick.barbalich gmail.com> writes:
On Wednesday, 10 February 2016 at 02:11:25 UTC, Laeeth Isharc 
wrote:
 http://sealedabstract.com/rants/conduct-unbecoming-of-a-hacker/
 (His particular suggestion about accepting patches by default is 
 not why I post this).
 '
 ...
 Hacking should be about making things. And yet a great many of 
 our institutions are set up to discourage, distract, destroy, 
 and derail the making of anything. It’s time we called it what 
 it is: conduct unbecoming of a hacker.
 '
Great post, and very funny.

Nick
Feb 10 2016
prev sibling next sibling parent reply Nick Sabalausky <SeeWebsiteToContactMe semitwist.com> writes:
On 02/09/2016 09:11 PM, Laeeth Isharc wrote:
 My email is inevitably met not with acceptance, nor with constructive
 discussion, but with some attempt to derail the entire enterprise. Here
 are some real examples, paraphrased by yours truly:

      I think it should be done some other way, even though the other way
 obviously doesn’t work for you and so far nobody has ever been found who
 is willing to implement it that way
      I don’t want to solve this problem without also solving [unrelated
 problem X], your proposal doesn’t address [unrelated problem X],
 therefore I am inclined to reject it
      I don’t know you and there might be a bug in your patch. This patch
 is too important to leave to somebody new. At the same time it is not
 important enough for any of the core committers to get to it.
      Defend this proposal. You’re telling me you “need” encryption in an
 internet communications library, or you “need” unicode support in an
 object storage library. I don’t believe you. We’ve gotten along just
 fine for N months without it, and we’ll get along for another 2N months
 just fine thanks.
      Look, we’ve already implemented [sort-of related feature] even
 though it’s buggy and doesn’t cover your usecase. That decision was
 complicated and people were arguing about it for years and I really
 don’t want to go through that jungle again. If you wanted to do it this
 way you should have spoken up two years ago.
Unfortunately, that sounds very similar to experiences I've had here in D-land :( Gets very frustrating.
Feb 10 2016
next sibling parent "H. S. Teoh via Digitalmars-d" <digitalmars-d puremagic.com> writes:
On Wed, Feb 10, 2016 at 12:17:40PM -0500, Nick Sabalausky via Digitalmars-d
wrote:
 On 02/09/2016 09:11 PM, Laeeth Isharc wrote:
My email is inevitably met not with acceptance, nor with constructive
discussion, but with some attempt to derail the entire enterprise.
Here are some real examples, paraphrased by yours truly:

     I think it should be done some other way, even though the other
way obviously doesn’t work for you and so far nobody has ever been
found who is willing to implement it that way
     I don’t want to solve this problem without also solving
[unrelated problem X], your proposal doesn’t address [unrelated
problem X], therefore I am inclined to reject it
     I don’t know you and there might be a bug in your patch. This
patch is too important to leave to somebody new. At the same time it
is not important enough for any of the core committers to get to it.
     Defend this proposal. You’re telling me you “need” encryption in
an internet communications library, or you “need” unicode support in
an object storage library. I don’t believe you. We’ve gotten along
just fine for N months without it, and we’ll get along for another 2N
months just fine thanks.
     Look, we’ve already implemented [sort-of related feature] even
though it’s buggy and doesn’t cover your usecase. That decision was
complicated and people were arguing about it for years and I really
don’t want to go through that jungle again. If you wanted to do it
this way you should have spoken up two years ago.
Unfortunately, that sounds very similar to experiences I've had here in D-land :( Gets very frustrating.
Have to agree with you there. :-(

While, on the whole, my experience of D has been very pleasant, and I will probably stick to it for the long term, there *are* some rough edges that, arguably, should have been ironed out by now. But every time the topic comes up people get defensive and then the interminable forum threads ensue, and at the end nothing gets done because everyone is spent from all the arguments.

More and more, I've found that time spent participating in forum threads is, in general (there *are* exceptions), inversely proportional to actually getting stuff done. So nowadays I rather just submit a PR instead of getting entangled in the latest Great Debate. OTOH, even PRs can get discouraging sometimes when they touch certain controversial issues and get stonewalled for months on end, a great deterrent for new contributors to join in.

T

-- 
Almost all proofs have bugs, but almost all theorems are true. -- Paul Pedersen
Feb 10 2016
prev sibling parent reply Laeeth Isharc <laeethnospam nospam.laeeth.com> writes:
On Wednesday, 10 February 2016 at 17:17:40 UTC, Nick Sabalausky 
wrote:
 Unfortunately, that sounds very similar to experiences I've had 
 here in D-land :( Gets very frustrating.
Yes - one trigger for posting it was the tone of some messages in some recent forum discussions (although it's really a tiny part of things on the whole). 'D is broken and needs a complete rewrite' - but grumbling isn't going to make it better, once the ground has been covered once.

I should say that although I have read his blog in the past, the trigger for reading that note was his commentary on the nanomsg hoohah (please - let's discuss that on another thread if necessary, not on this one). And that's the project he was referring to in this note. I don't know all the details there, but I think his broader point about the decline of hacker culture is a much stronger one than the specific application he makes with nanomsg (my take would be a bit different).

[I'm not really able to judge the situation as regards pull requests for D, although I wonder sometimes about type 1 vs type 2 errors and the balance between maintaining high standards and letting the best be the enemy of the good, given that something imperfect but useful can be the seeds of something that ends up becoming truly excellent - depending on the intrinsic sunk costs from doing things not perfectly the first time, which might depend on the particular problem but also might not be obvious beforehand.]

Eg for all the time spent arguing about what's holding D back, some of the real progress in terms of making the language attractive to real end-users who are going to hire people and maybe contribute resources came from people just doing stuff: ndslice, PyD Magic (integration with the Python notebook), bachmeier's integration with R (which means access to all the R libraries, which is huge).

I notice Kenji rarely argues about things in the forum. Would we be better off if he were to spend much more time here instead of fixing bugs? :) Would we be better off if some people that like to argue were to pick just one of their points and write some code or a DIP?
Joakim: "Pretty funny that he chose Stallman as his example of a guy who gets stuff done, whose Hurd microkernel never actually got done, :) though certainly ambitious, so Stallman would never have had a FOSS OS on which to run his GNU tools if it weren't for Linus."

No - I think he used Stallman as an example of someone who, although he whined a lot, actually did a hell of a lot of work even so and became the change in the world he wanted. In my view productivity isn't about how many projects you don't manage to finish, but how many you do get done, and I am not sure I am in a position to criticize Stallman from that perspective. And even if his ideological approach isn't entirely my cup of tea, I do recognize he played a critical role there that was necessary.

Nick: "That's a REALLY good article - quoting 'a patch in the hand is better than two in the bush'."

Yes, I think so. Though it's delicate. The problem is that for some things making the wrong decision can be a disaster. But, for example, with dirEntries right now - it's broken for serious use (last I checked; and if it's fixed now, the point stands, as it was broken for a long time). Almost decent solutions have been proposed but fell short of perfection, and then they were kind of dropped. So the consequence is that it's (possibly, and if not then was for a long time) still broken. That's not the first time this kind of thing has happened. Was that the right trade-off between Type 1 and Type 2 errors? If you're not making some of each kind of mistake then it may be the case that the balance is wrong. (Depending on the situation of course - it depends on what the consequences are of making a mistake. Conservatism isn't always the lower-risk option, even though it feels that way.)

"* Bare assertions that there is no need for the feature, when the fact that somebody wrote a patch should be prima facie evidence that the feature was needed"

Yes. 
Though with language features it's delicate, and I respect the taste of Andrei/Walter and other key people.

"Really, what I’m asking is this: Which is more convincing? Concrete computer code authored by someone with first-hand knowledge of the defect? Or the bare assertion that something is wrong with it? I mean, either one might be correct. But the first is better supported."

Yes - quite.

Lobo - thanks for the video link. Will watch. His montage of the shift in the front cover of hacker magazines was rather revealing of broader societal shifts (from Byte and Dr Dobbs in the 80s to their successors becoming more like lifestyle magazines today). I've seen the same thing happen in my lifetime in certain parts of finance. In the beginning you have a bunch of highly unusual people who ended up there almost by accident but really care about things intrinsically and are there as exiles from the rest of the world, driven by social factors. Then success (which comes _because_ they didn't care about social factors) leads to expansion, and it brings in people who are less intrinsically motivated, drawn by money and prestige, and the culture changes. Also because people who were on the outside looking in start to find being accepted into the establishment, where it is warm and cosy, quite appealing, and lose sight of what it was that brought them success. CoC! (I mean that as regards the societal shift he talks about - clearly not really applicable to D, at least not today.)
Feb 10 2016
next sibling parent reply tsbockman <thomas.bockman gmail.com> writes:
On Thursday, 11 February 2016 at 04:27:43 UTC, Laeeth Isharc 
wrote:
 would we be better off if some people that like to argue were 
 to pick just one of their points and write some code or a DIP?
For the most difficult/contentious issues, writing a DIP is just another form of arguing.
Feb 10 2016
parent reply Laeeth Isharc <laeethnospam nospam.laeeth.com> writes:
On Thursday, 11 February 2016 at 04:35:27 UTC, tsbockman wrote:
 On Thursday, 11 February 2016 at 04:27:43 UTC, Laeeth Isharc 
 wrote:
 would we be better off if some people that like to argue were 
 to pick just one of their points and write some code or a DIP?
For the most difficult/contentious issues, writing a DIP is just another form of arguing.
Well, writing code might be better, but writing a DIP is a superior form of arguing to just plain grumbling, as it's more constructive.
Feb 10 2016
next sibling parent reply tsbockman <thomas.bockman gmail.com> writes:
On Thursday, 11 February 2016 at 04:50:04 UTC, Laeeth Isharc 
wrote:
 For the most difficult/contentious issues, writing a DIP is 
 just another form of arguing.
Well writing code might be better, but writing a DIP is a superior form of arguing to just plain grumbling as its more constructive.
True. Just pointing out that for certain recurring issues, the reason that people have fallen back to grumbling is because some DIPs *did* get written, but were rejected for vague, non-constructive reasons, with no (workable) alternative being offered.
Feb 10 2016
parent reply Laeeth Isharc <laeethnospam nospam.laeeth.com> writes:
On Thursday, 11 February 2016 at 04:54:15 UTC, tsbockman wrote:
 On Thursday, 11 February 2016 at 04:50:04 UTC, Laeeth Isharc 
 wrote:
 For the most difficult/contentious issues, writing a DIP is 
 just another form of arguing.
Well writing code might be better, but writing a DIP is a superior form of arguing to just plain grumbling as its more constructive.
True. Just pointing out that for certain recurring issues, the reason that people have fallen back to grumbling is because some DIPs *did* get written, but were rejected for vague, non-constructive reasons, with no (workable) alternative being offered.
Which ones, out of interest? And in your opinion, were they thought through?
Feb 10 2016
parent reply tsbockman <thomas.bockman gmail.com> writes:
On Thursday, 11 February 2016 at 04:59:16 UTC, Laeeth Isharc 
wrote:
 On Thursday, 11 February 2016 at 04:54:15 UTC, tsbockman wrote:
 True. Just pointing out that for certain recurring issues, the 
 reason that people have fallen back to grumbling is because 
 some DIPs *did* get written, but were rejected for vague, 
 non-constructive reasons, with no (workable) alternative being 
 offered.
Which ones, out of interest? And in your opinion, were they thought through?
Specifically, DIP69 and its predecessors, which propose a Rust-inspired lifetime and escape analysis system as a solution to many of D's memory model woes.

It seemed (and still seems) like a good solution to me, but I recognize that I am insufficiently experienced and knowledgeable in the relevant areas to deserve a vote in the matter. So, I'm not necessarily saying that it should have been accepted - but I can definitely understand how frustrating it is for those who worked on it over the course of several months to have it rejected (as far as I can tell) simply because it is "too complicated". This is non-constructive in the sense that it is a subjective judgment which does not point the way to a better solution.

As of today, the "Study" group for safe reference-counting doesn't appear to be going much of anywhere, because Walter and Andrei have rejected the DIP69 approach without having a real alternative in hand. (DIP77 seems better than nothing to me, but has not been well-received by those in the community who are most invested in, and most knowledgeable of, memory management issues.)

In the spirit of the original post, perhaps what is needed is simply for someone to fork DMD and implement DIP69, so that people can actually try it instead of just imagining it. That's a lot of time and effort to invest, though, knowing that your work will most likely be rejected for purely subjective reasons.
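[For readers unfamiliar with the kind of analysis being discussed, here is a minimal sketch in Rust - used purely as illustration, since DIP69 is Rust-inspired; the DIP's actual proposal is annotation-based escape checking for D, and the details differ.]

```rust
// Sketch of lifetime/escape analysis: the compiler tracks how long each
// reference may live and rejects code where a reference escapes, or
// outlives, the data it borrows from.

// The returned slice borrows from `s`, so the compiler ties its lifetime
// to `s` purely from the function signature.
fn first_word(s: &str) -> &str {
    s.split_whitespace().next().unwrap_or("")
}

fn main() {
    let text = String::from("hello world");
    let w = first_word(&text);
    // Uncommenting the next line is rejected at compile time, because
    // `text` is still borrowed by `w` at the point it would be destroyed:
    // drop(text); // error[E0505]: cannot move out of `text` while borrowed
    println!("{}", w); // prints "hello"
}
```

The point of such a system is that the error above is caught statically, with no runtime cost, which is why it was proposed as a foundation for safe reference counting in D.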
Feb 10 2016
parent reply Laeeth Isharc <laeethnospam nospam.laeeth.com> writes:
On Thursday, 11 February 2016 at 05:31:54 UTC, tsbockman wrote:
 On Thursday, 11 February 2016 at 04:59:16 UTC, Laeeth Isharc 
 wrote:
 On Thursday, 11 February 2016 at 04:54:15 UTC, tsbockman wrote:
 True. Just pointing out that for certain recurring issues, 
 the reason that people have fallen back to grumbling is 
 because some DIPs *did* get written, but were rejected for 
 vague, non-constructive reasons, with no (workable) 
 alternative being offered.
Which ones, out if interest ? And in your opinion were they thought through ?
Specifically, DIP69 and its predecessors, which propose a Rust-inspired lifetime and escape analysis system as a solution to many of D's memory model woes. It seemed (and still seems) like a good solution to me, but I recognize that I am insufficiently experienced and knowledgeable in the relevant areas to deserve a vote in the matter. So, I'm not necessarily saying that it should have been accepted - but I can definitely understand how frustrating it is for those who worked on it over the course of several months to have it rejected (as far as I can tell) simply because it is "too complicated". This is non-constructive in the sense that it is a subjective judgment which does not point the way to a better solution. As of today, the "Study" group for safe reference-counting doesn't appear to be going much of anywhere, because Walter and Andrei have rejected the DIP69 approach without having a real alternative in hand. (DIP77 seems better than nothing to me, but has not been well-received by those in the community who are most invested in, and most knowledgeable of, memory management issues.) In the spirit of the original post, perhaps what is needed is simply for someone to fork DMD and implement DIP69, so that people can actually try it instead of just imagining it. That's a lot of time and effort to invest though, knowing that your work will most likely be rejected for purely subjective reasons.
Beyond my pay grade, but it looks to me like the study group is devoted to just this kind of question, and that, in response to the observation that this is something very important to get right and very difficult, the discussion is beginning with a simpler but still important problem: how to do RC strings (there were some stats on Chrome that were quite shocking).

If there's one area where you shouldn't just accept patches, this surely must be it! And I don't see the people that are grumbling participating in the study group...
Feb 10 2016
parent tsbockman <thomas.bockman gmail.com> writes:
On Thursday, 11 February 2016 at 06:01:29 UTC, Laeeth Isharc 
wrote:
 Beyond my pay grade, but looks to me like study group is 
 devoted to just this kind of question
It's not just devoted to this *kind* of question - it's devoted to this *exact* question. It was formed explicitly for the purpose of trying to work out an acceptable solution to the reference-counting safety problem.
 and in response to observation that this is something very 
 important to get right and very difficult the discussion is 
 beginning
"Starting over" or maybe even "sidetracking" would be more accurate - quite a lot of other stuff got posted in the study group before the RCString thing came up. No consensus was emerging, hence the reset.
 with a simpler but important (there were some stats on chrome 
 that were quite shocking) problem of how to do RC strings.

 If there's one area where you shouldn't just accept patches 
 this surely must be it !
True. It's a hard problem, especially since they're shooting for making safe RC a non-breaking change.
 And I don't see the people that are grumbling participating in 
 the study group...
? But some of them are. The grumbling over RC is a reflection of how hard the problem is to solve, not a collective unwillingness to contribute code on the part of either side.
Feb 10 2016
prev sibling parent Laeeth Isharc <laeethnospam nospam.laeeth.com> writes:
On Thursday, 11 February 2016 at 04:50:04 UTC, Laeeth Isharc 
wrote:
 On Thursday, 11 February 2016 at 04:35:27 UTC, tsbockman wrote:
 On Thursday, 11 February 2016 at 04:27:43 UTC, Laeeth Isharc 
 wrote:
 would we be better off if some people that like to argue were 
 to pick just one of their points and write some code or a DIP?
For the most difficult/contentious issues, writing a DIP is just another form of arguing.
Well writing code might be better, but writing a DIP is a superior form of arguing to just plain grumbling as its more constructive.
And not only is it more constructive, but the structured format forces you to think things through and to make explicit what isn't in a forum post (which may reduce contention, since it's much clearer what you mean). And even if this were not the case, there are plenty of things that people like to grumble about that wouldn't be all that contentious once expressed in a thought-through DIP, even if people didn't agree with you. Because then it's all set out clearly and it becomes more of an objective technical debate.
Feb 10 2016
prev sibling parent reply Joakim <dlang joakim.fea.st> writes:
On Thursday, 11 February 2016 at 04:27:43 UTC, Laeeth Isharc 
wrote:
 Joakim:
 "Pretty funny that he chose Stallman as his example of a guy 
 who gets stuff done, whose Hurd microkernel never actually got 
 done, :) though certainly ambitious, so Stallman would never 
 have had a FOSS OS on which to run his GNU tools if it weren't 
 for Linus."

 No - I think he used Stallman as an example of someone who 
 although he whined a lot actually did a hell of a lot of work 
 even so and became the change in the world he wanted.  In my 
 view productivity isn't about how many projects you don't 
 manage to finish, but how many you do get done, and I am not 
 sure I am in a position to criticize Stallman from that 
 perspective
He got some stuff done, which I alluded to, but his big project to build an OS on which to run his tools didn't.
 even if his ideological approach isn't entirely my cup of tea, 
 I do recognize he played a critical role there that was 
 necessary.
Eh, there were always the BSDs, and essentially nobody runs GNU code today. Android, that big open-source success, comes with almost no GNU code, just the linux kernel from Linus and company and a bunch of Apache-licensed code. A lot of the BSD guys went to work at Apple, where they have now spread the permissively-licensed Darwin base of OS X and iOS to more than a billion devices, along with llvm and other permissively-licensed projects.

Stallman's GNU/GPL effort has largely failed, so he was clearly neither critical nor necessary. Was he important, as a vocal proponent of FOSS early on? Perhaps, but things would likely have progressed this way regardless, as his extremist, quasi-religious preaching of "free software" is largely dying out. That religious fervor may even have hurt as much as it helped early on, as that collaborative model only really took off after the more business-friendly rebranding as "open source," which has also led to a move to more permissive licenses, ie not the GPL.

My point is that people see the success of open source and his early role as a vocal proponent and assume he was "critical," when the truth is more complicated, as his extreme formulation of completely "free software" has not done that well.

On Thursday, 11 February 2016 at 05:31:54 UTC, tsbockman wrote:
 So, I'm not necessarily saying that it should have been 
 accepted - but I can definitely understand how frustrating it 
 is for those who worked on it over the course of several months 
 to have it rejected (as far as I can tell) simply because it is 
 "too complicated". This is non-constructive in the sense that 
 it is a subjective judgment which does not point the way to a 
 better solution.

 As of today, the "Study" group for safe reference-counting 
 doesn't appear to be going much of anywhere, because Walter and 
 Andrei have rejected the DIP69 approach without having a real 
 alternative in hand. (DIP77 seems better than nothing to me, 
 but has not been well-received by those in the community who 
 are most invested in, and most knowledgeable of, memory 
 management issues.)
I'll note that not knowing a better solution doesn't mean one must simply accept the solution at hand, especially if that temporary solution will be difficult to unwind later. Sometimes you simply need more time to come up with something better. It all depends on the scale of the project and the suitability of the solution presented; you cannot simply say that "some" solution is better than nothing, as the original quoted post does. But yeah, maybe the reasons for rejection can be communicated better.
 In the spirit of the original post, perhaps what is needed is 
 simply for someone to fork DMD and implement DIP69, so that 
 people can actually try it instead of just imagining it. That's 
 a lot of time and effort to invest though, knowing that your 
 work will most likely be rejected for purely subjective reasons.
This is why you should generally only work on something you actually need, which is a great discipline. Even if it's rejected, you can code it up and use it yourself, though that's not always possible with certain language changes and DIPs. For example, I asked about ARM and mobile support for D in 2011, noting that mobile was starting to take off and that people had been asking for ARM support periodically for years even prior to that. I was told it was one of many priorities, but nobody knew when it'd be worked on. Two years later, seeing mobile still hadn't been done (though others had gotten ldc/gdc working on linux/ARM to some extent), I took it up and, along with Dan, got alpha releases for iOS and Android listed on the main download page. It doesn't matter to me if nobody here uses D on mobile - though I certainly think that would be a huge missed opportunity - as _I_ want to use D on Android and now I can. While this is not generalizable for all D PRs, ie nobody wants to maintain a fork of certain language features, it is for pretty much everything in druntime/phobos and some even do it for dmd. Caring enough about a change to code it yourself is a good test for whether it is worth doing, which is one point the original post alludes to.
Feb 10 2016
next sibling parent tsbockman <thomas.bockman gmail.com> writes:
On Thursday, 11 February 2016 at 06:20:33 UTC, Joakim wrote:
 On Thursday, 11 February 2016 at 05:31:54 UTC, tsbockman wrote:
 As of today, the "Study" group for safe reference-counting 
 doesn't appear to be going much of anywhere, because Walter 
 and Andrei have rejected the DIP69 approach without having a 
 real alternative in hand. (DIP77 seems better than nothing to 
 me, but has not been well-received by those in the community 
 who are most invested in, and most knowledgeable of, memory 
 management issues.)
 I'll note that not knowing a better solution doesn't mean one 
 must simply accept the solution at hand, especially if that 
 temporary solution will be difficult to unwind later. Sometimes 
 you simply need more time to come up with something better. It 
 all depends on the scale of the project and the suitability of 
 the solution presented; you cannot simply say that "some" 
 solution is better than nothing, as the original quoted post 
 does. But yeah, maybe the reasons for rejection can be 
 communicated better.
Although I realize it might sound like I am, I'm not really criticizing either side in this. I don't really know whether either DIP69 or DIP77 actually represents a reasonable solution to the problem; as I said, I am unqualified to make that determination. I was simply giving my impression of where the discussion stands at the moment. I am certainly not advocating that DIP77 be implemented over the objections of so many of the people in the community who *are* qualified.
 In the spirit of the original post, perhaps what is needed is 
 simply for someone to fork DMD and implement DIP69, so that 
 people can actually try it instead of just imagining it. 
 That's a lot of time and effort to invest though, knowing that 
 your work will most likely be rejected for purely subjective 
 reasons.
 This is why you should generally only work on something you 
 actually need, which is a great discipline. Even if it's 
 rejected, you can code it up and use it yourself, though that's 
 not always possible with certain language changes and DIPs.
Definitely.
 For example, I asked about ARM and mobile support for D [...]
Your efforts are appreciated! I don't know if anyone else is using your work *yet*, but give it time and I'm confident that they will. ARM and Android are very important platforms.
Feb 10 2016
prev sibling parent reply Ola Fosheim Grøstad writes:
On Thursday, 11 February 2016 at 06:20:33 UTC, Joakim wrote:
 Eh, there were always the BSDs and essentially nobody runs GNU 
 code today.
Uhm... Many do. And beyond GNU, the GPL/LGPL are the most common licenses in community driven open source productivity applications: Gimp, Inkscape, Blender, Audacity...
 My point is that people see the success of open source and his 
 early role as a vocal proponent and assume he was "critical," 
 when the truth is more complicated, as his extreme formulation 
 of completely "free software" has not done that well.
It has not done well with corporations, but it has done very well with open source end-user software! Even projects that are not GPL tend to use LGPL code. Yet, Linux did manage to scare the juggernauts, so now even Microsoft is starting to publish under liberal licenses (first very restrictive, now very liberal).
 This is why you should generally only work on something you 
 actually need, which is a great discipline.  Even if it's 
 rejected, you can code it up and use it yourself, though that's 
 not always possible with certain language changes and DIPs.
Well, yes. Unless you are designing a standard library. The original open source projects were mostly about recreating existing designs, but with an open source license. So the missing features were obvious. In the case of GNU (Unix), it is a cluster of individual small programs. So if people were unhappy they wrote a new version from scratch. And the most popular ones survived. Creative designs that went open source tended to be research projects... so you had a clear vision (and people who were experts in their field leading the design). So, the linked rant is pretty clueless IMO. You need to form consensus in order to gain focus. If you don't have focus you'll never end up with something coherent. If the author believed in what he wrote, then why did he write it? He obviously wrote it because he believes that communication can lead to change. And thereby he undermines his own argument... What brought the original FOSS projects focus was the fact that they were not implementing original designs. And there has been LOTS of advocacy and arguments on usenet about every nook and cranny in software... since way before the Internet existed. So, it was an entertaining rant... and nothing more.
 For example, I asked about ARM and mobile support for D in 
 2011, noting that mobile was starting to take off and that 
 people had been asking for ARM support periodically for years 
 even prior to that.  I was told it was one of many priorities, 
 but nobody knew when it'd be worked on.
Yes, and this is all good, but it is not a language issue. It fits well with what makes contributing to GNU/Unix easy. You can write an isolated piece.
 While this is not generalizable for all D PRs, ie nobody wants 
 to maintain a fork of certain language features, it is for
Several people have created their own languages because they have given up on the D development process...
 for dmd.  Caring enough about a change to code it yourself is a 
 good test for whether it is worth doing, which is one point the 
 original post alludes to.
Writing code without a fork, when you know it will be rejected, is pointless. Spending days or weeks on a DIP, in order to have it dismissed by a "nice technical DIP, but does not fit with our vision", is very wasteful. So people create their own languages instead, but without building consensus, meaning they end up in an incomplete state or too close to D, with a different set of issues. Which is a key aspect of D's problem too: it is too close to C++ to not be compared to it. So fixing some issues and introducing others isn't good enough for adoption. Building consensus is very important. Just take a look at politics. Communicating a clear vision and a believable path to implementation is essential. And listening. Good leaders are always very good at listening, IMO.
Feb 10 2016
parent reply Joakim <dlang joakim.fea.st> writes:
On Thursday, 11 February 2016 at 07:32:00 UTC, Ola Fosheim 
Grøstad wrote:
 On Thursday, 11 February 2016 at 06:20:33 UTC, Joakim wrote:
 Eh, there were always the BSDs and essentially nobody runs GNU 
 code today.
 Uhm... Many do. And beyond GNU, the GPL/LGPL are the most 
 common licenses in community driven open source productivity 
 applications: Gimp, Inkscape, Blender, Audacity...
All of which are decades-old projects from the heyday of the GPL, when many mistakenly attributed linux's success to the GPL and copied its license blindly. Almost nobody starts new projects with the GPL today, and it seems they've also given up on such OSS productivity apps.
 My point is that people see the success of open source and his 
 early role as a vocal proponent and assume he was "critical," 
 when the truth is more complicated, as his extreme formulation 
 of completely "free software" has not done that well.
 It has not done well with corporations, but it has done very 
 well with open source end-user software! Even projects that are 
 not GPL tend to use LGPL code.
Define "very well." :) Because I've listed multiple permissively-licensed projects that are a couple orders of magnitude more widely used, and I see almost no GPL software even close. It's basically just the linux kernel and that's it, and that only because it was piled high with permissively-licensed and even closed software on top, with Android.
 Yet, Linux did manage to scare the juggernauts, so now even 
 Microsoft is starting to publish under liberal licenses (first 
 very restrictive, now very liberal).
It did nothing of the sort, as GNU/linux basically went nowhere and MS didn't care. It is only once permissively-licensed software like Android and parts of iOS hit billions of devices that MS started doing so, under permissive licenses like those projects, not the GPL.
 This is why you should generally only work on something you 
 actually need, which is a great discipline.  Even if it's 
 rejected, you can code it up and use it yourself, though 
 that's not always possible with certain language changes and 
 DIPs.
 Well, yes. Unless you are designing a standard library.
Anything you want to add to a standard library can easily be maintained in a private fork or put up on dub, as you've suggested doing to all of phobos. So even if you don't get a change into the standard library, there's nothing stopping you from using it or distributing it, so this comment seems pointless.
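Publishing such out-of-tree library code via dub needs little more than a manifest; here is a minimal sketch of a dub.json (the package name and description are hypothetical, only for illustration):

```json
{
    "name": "phobos-extras",
    "description": "Candidate std additions, maintained out of tree",
    "license": "BSL-1.0",
    "targetType": "library"
}
```

With source modules dropped under source/, anyone can then depend on the package and import the code, whether or not it ever lands in phobos itself.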
 The original open source projects were mostly about recreating 
 existing designs, but with a open source license. So the 
 missing features were obvious. In the case of GNU (Unix), it is 
 a cluster of individual small programs. So if people was 
 unhappy they wrote a new version from scratch. And the most 
 popular ones survived.

 Creative designs that went open source tended to be research 
 projects... so you had a clear vision (and people who were 
 experts in their field leading the design).

 So, the linked rant is pretty clueless IMO. You need to form 
 consensus in order to gain focus. If you don't have focus 
 you'll never end up with something coherent. If the author 
 believed in what he wrote, then why did he write it? He 
 obviously wrote it because he believe that communication  can 
 lead to change. And thereby he undermines his own argument...
I agree with everything till you start espousing consensus. If anything, consensus often leads to the most incoherent designs, ie design by committee.
 What brought original FOSS projects focus was the fact that 
 they were not implementing original designs. And there has been 
 LOTS of advocacy and arguments on usenet about every creek and 
 cranny in software... since way before the Internet existed.

 So, it was an entertaining rant... and nothing more.
It had some good points, but was inconsistent. Perhaps one reason OSS projects have become less willing to take his patches is that they've become much more widely used: if your project ships on Android, it will be used by a billion and a half people. Given that these projects are still way understaffed - look at OpenSSL and its Heartbleed bug - it's understandable that some would be conservative. Of course, it's possible that the projects he talked to had almost no users, so it all depends on the scale of the project, as I said before.
 For example, I asked about ARM and mobile support for D in 
 2011, noting that mobile was starting to take off and that 
 people had been asking for ARM support periodically for years 
 even prior to that.  I was told it was one of many priorities, 
 but nobody knew when it'd be worked on.
 Yes, and this is all good, but it is not a language issue. It 
 fits well with what makes contributing to GNU/Unix easy. You 
 can write an isolated piece.
Even with language issues, there is potential for deviation. For example, if you think there might be commercial users for that feature, you could sell them a slightly forked version of dmd with your feature. It's what the zapcc devs did with clang: https://www.phoronix.com/scan.php?page=news_item&px=New-Zapcc-Clang-Benchmarks If it works out well enough, maybe you could integrate it back upstream eventually.
 While this is not generalizable for all D PRs, ie nobody wants 
 to maintain a fork of certain language features, it is for
 Several people have created their own languages because they 
 have given up on the D development process...
And how far have they gotten? Entire forks don't get very far, but a tracking branch with a few additional features can do just fine.
 for dmd.  Caring enough about a change to code it yourself is 
 a good test for whether it is worth doing, which is one point 
 the original post alludes to.
 Writing code without a fork, when you know it will be rejected, 
 is pointless.
Then perhaps you didn't really need that code, if you wouldn't have gotten much use out of it on your own. Yes, I've admitted that some language features are too painful or difficult to maintain in a private branch, but many can be.
 Spending days or weeks on a DIP, in order to have it dismissed 
 by a "nice technical DIP, but does not fit with our vision", is 
 very wasteful.
I agree that it would be better to nip that in the bud where possible or better explain why the DIP doesn't fit, but many times such rejection is an inevitable byproduct of maintaining a high technical standard. Now, maybe it's impossible to maintain a high technical standard with a community-driven OSS project and you need a business model to be able to afford such exploration and possible waste of time. That may be a fundamental tension between technical quality and the OSS development model that cannot be resolved. But I see no way around such rejection if you want to maintain a high level of quality.
 So people create their own languages instead, but without 
 building consensus, meaning they end up in an incomplete state 
 or being to close to D, having a different set of issues. Which 
 is a key aspect of D's problem too, it is too close to C++ to 
 not be compared to it. So fixing some issues and introducing 
 others isn't good enough for adoption.
One of the reasons it's close is that new versions of C++ are copying D. :) I don't blame them for doing so or think there's anything wrong with it, but there are still enough problems with C++ that I don't think it'll be enough.
 Building consensus is very important. Just take a look at 
 politics. Communicating a clear vision and a believable path to 
 implementation is essential. And listening. Good leaders are 
 always very good at listening, IMO.
Yes, politics, look at how much the politicians get done! ;) I think we all know, and Walter and Andrei have said, that they're not managers. But I don't think consensus is useful; what matters is making the right technical decisions. Sometimes that comes from long technical debate where everybody shoots holes in everybody else's ideas and eventually somebody decides to implement the least bullet-ridden idea. Sometimes it comes from one person going down a path only they see the end to. Consensus is for getting everybody doing the same thing, which is not the road to technical quality. Linus has talked about the "wasteful" OSS approach, which he compares to evolution:

http://bobweigel.net/projects/index.php?title=Weigel/Notes#Linus_on_Development

Everything else about vision and listening, sure, but in an OSS project everybody is free to ignore that vision. It's one reason why I wish the official vision statement was much more specific and daring, because no matter what, we're free to ignore it, so there's no risk. At least you laid out a specific, original vision, and we can see if we want to follow it. But if you make bland, shapeless statements like "Improve quality," weren't we trying to do that anyway? It then becomes one of those corporate mission statements, which are notable only for stating the bland and the obvious.
Feb 11 2016
next sibling parent reply Joakim <dlang joakim.fea.st> writes:
On Thursday, 11 February 2016 at 09:51:16 UTC, Joakim wrote:
 Consensus is for getting everybody doing the same thing, which 
 is not the road to technical quality.  Linus has talked about 
 the "wasteful" OSS approach, which he compares to evolution:

 http://bobweigel.net/projects/index.php?title=Weigel/Notes
Btw, in looking for a link with that old Linus quote, I also found this other one, that I'd never read before and is relevant for those who think D should specialize more:

Linus: Quite frankly, Sun is doomed. And it has nothing to do with their engineering practices or their coding style.

Tim: I'd love to hear your thoughts on why.

Linus: You heard them above. Sun is basically inbreeding. That tends to be good to bring out specific characteristics of a breed, and tends to be good for _specialization_. But it's horrible for actual survival, and generates a very one-sided system that does not adapt well to change. Microsoft, for all the arguments against them, is better off simply because of the size of its population - they have a much wider consumer base, which in turn has caused them largely to avoid specialization. As a result, Microsoft has a much wider appeal - and suddenly most of the niches that Sun used to have are all gone, and it's fighting for its life in many of its remaining ones.

Why do you think Linux ends up being the most widely deployed Unix? It's avoided niches, it's avoided inbreeding, and not being too directed means that it doesn't get the problems you see with unbalanced systems. Face it, being one-sided is a BAD THING. Unix was dying because it was becoming much too one-sided.

http://yarchive.net/comp/evolution.html
Feb 11 2016
parent Ola Fosheim Grøstad writes:
On Thursday, 11 February 2016 at 10:21:19 UTC, Joakim wrote:
 You heard them above. Sun is basically inbreeding. That tends 
 to be good
 to bring out specific characteristics of a breed, and tends to 
 be good for
 _specialization_.
Linus is not a very good analyst. All the big iron corporations had transition problems: Cray, SGI, IBM, Sun and many more. It would be very difficult to take the expertise Sun had and rapidly turn Sun into a competitive force in a different field. Of course, none of this is relevant to programming languages. Perl died because Python was better. Not because the niches changed. C++11 is replacing C/C++98 for newer projects that need performance. Better hardware means the niche has shrunk, but it will remain a significant niche.
Feb 11 2016
prev sibling parent reply Ola Fosheim Grøstad writes:
On Thursday, 11 February 2016 at 09:51:16 UTC, Joakim wrote:
 All of which are decades-old projects from the heyday of the 
 GPL, when many mistakenly attributed linux's success to the GPL 
 and copied its license blindly.
Yes, it does take decades to create complicated productivity apps. The only open source productivity app I use that isn't GPL is Open Office (IIRC), but it uses LGPL components, and is more corporate than community...
 It did nothing of the sort, as GNU/linux basically went nowhere 
 and MS didn't care.  It is only once permissively-licensed 
 software like Android and parts of iOS hit billions of devices 
 that MS started doing so, under permissive licenses like those 
 projects, not the GPL.
Android is based on Linux... but this is not how I remember it. Bill Gates went public with statements against FOSS way before Android appeared. He was clearly frustrated by how Linux made inroads in the server market.
 Anything you want to add to a standard library can easily be 
 maintained in a private fork or put up on dub, as you've 
 suggested doing to all of phobos.  So even if you don't get a 
 change into the standard library, there's nothing stopping you 
 from using it or distributing it, so this comment seems 
 pointless.
I've repeatedly stated that libraries, phobos inclusive, are inconsequential. What matters is the language, compiler and runtime. Libraries are luxury items.
 I agree with everything till you start espousing consensus.  If 
 anything, consensus often leads to the most incoherent designs, 
 ie design by committee.
No. A standards committee is a collection of representatives that are trying to balance out different conflicting agendas. If you build a team you are better off establishing consensus. So step one is to attract people with the same agenda. So if you want to fork you should try to build consensus within a faction and then branch off. If only 1 or 2 people create a fork it will most likely go nowhere.
 For example, if you think there might be commercial users for 
 that feature, you could sell them a slightly forked version of 
 dmd with your feature.
Yes, I've given that some thought. But for it to pay off you need a refactored, high-quality, documented code base. If a 1-2 person team is to serve customers you cannot waste hours on poor infrastructure.
 And how far have they gotten?  Entire forks don't get very far, 
 but a tracking branch with a few additional features can do 
 just fine.
One project got pretty far, but then it went dead because their life situation changed as far as I can tell. Another one is alive, but is too close to C++, yet not close enough: http://loci-lang.org/ I think Loci looks quite ok, but it might be too influenced by personal taste. I think the same applies to many languages, even Rust, Go and D. Too opinionated, with not entirely well founded priorities.
 Then perhaps you didn't really need that code, if you wouldn't 
 have gotten much use out of it on your own.  Yes, I've admitted 
 that maintaining some language features is too painful or 
 different to maintain in a private branch, but many can.
Yep, that's true. I don't need any other languages than C++, TypeScript and Python. But what I need is not the same as what I want!
 Now, maybe it's impossible to maintain a high technical 
 standard with a community-driven OSS project and you need a 
 business model to be able to afford such exploration and 
 possible waste of time. That may be a fundamental tension 
 between technical quality and the OSS development model that 
 cannot be resolved.  But I see no way around such rejection if 
 you want to maintain a high level of quality.
Well, I think it matters that people can sit around a table and talk. And one can probably find solutions for doing cooperative work in a better way with the high bandwidths we have on the Internet today. But building a team with a shared vision and the right knowledge/attitude is not so easy either. Doing innovative things as a collective is very challenging. I think it takes much higher communication/people skills than is commonly found in these kinds of projects.
 One of the reasons it's close is that new versions of C++ are 
 copying D. :) I don't blame them for doing so or think there's 
 anything wrong with it, but there are still enough problems 
 with C++ that I don't think it'll be enough.
Which features are you thinking of? I think D has rushed to implement proposed C++ features... (It can take a decade for things to make it into C++)
 Yes, politics, look at how much the politicians get done! ;) I 
 think we all know and Walter and Andrei have said that they're 
 not managers.
So, what is the process? Where is the quality assurance? They have to do management if they want to lead. Architects do manage the process. They don't lay down bricks. That's how great buildings become... great. And no, a programming language cannot be a "bazaar" (that's PHP and Perl).
 But I don't think consensus is useful, what matters is making 
 the right technical decisions.
It matters if you don't want to go solo.
 Consensus is for getting everybody doing the same thing, which 
 is not the road to technical quality.
Consensus is for building a shared vision of what the future will look like, so that people think the project is worth backing and will go somewhere. Then you can have more independent processes on the details.
 Everything else about vision and listening, sure, but in an OSS 
 project everybody is free to ignore that vision.  It's one 
 reason why I wish the official vision statement was much more 
 specific and daring, because no matter what, we're free to 
 ignore it, so there's no risk.
Yes. But it could be simple. Like:

1. full feature freeze
2. heavy refactoring
3. full documentation of module x, y and z

+ some details on goals and planning
 anyway?  It then becomes one of those corporate mission 
 statements, which are notable only for stating the bland and 
 the obvious.
Yes, bland mission statements have little value by themselves, although for very big and fractured groups they can build some sense of a "we". I think the process is often more important, i.e. the social communication between groups can matter more than the resulting mission statement.
Feb 11 2016
parent reply Joakim <dlang joakim.fea.st> writes:
On Thursday, 11 February 2016 at 10:52:31 UTC, Ola Fosheim 
Grøstad wrote:
 On Thursday, 11 February 2016 at 09:51:16 UTC, Joakim wrote:
 All of which are decades-old projects from the heyday of the 
 GPL, when many mistakenly attributed linux's success to the 
 GPL and copied its license blindly.
 Yes, it does take decades to create complicated productivity 
 apps.
That almost nobody uses? I can do that in a day. ;)
 The only open source productivity app I use that isn't GPL is 
 Open Office (IIRC), but it uses LGPL components, and is more 
 corporate than community...
I never use any productivity apps, including an IDE; I never saw the point of them. And almost nobody uses the OSS ones you list either.
 It did nothing of the sort, as GNU/linux basically went 
 nowhere and MS didn't care.  It is only once 
 permissively-licensed software like Android and parts of iOS 
 hit billions of devices that MS started doing so, under 
 permissive licenses like those projects, not the GPL.
 Android is based on Linux... but this is not how I remember it. 
 Bill Gates went public with statements against FOSS way before 
 Android appeared. He was clearly frustrated by how Linux made 
 inroads in the server market.
They may have made statements before, but didn't change their behavior till permissively-licensed projects actually started doing really well in the market, as they're nothing if not observers of market success. As for Android using linux, I addressed that below: Android has much more permissively-licensed and closed software on top of its linux kernel.
 Anything you want to add to a standard library can easily be 
 maintained in a private fork or put up on dub, as you've 
 suggested doing to all of phobos.  So even if you don't get a 
 change into the standard library, there's nothing stopping you 
 from using it or distributing it, so this comment seems 
 pointless.
 I've repeatedly stated that libraries, phobos inclusive, are 
 inconsequential. What matters is the language, compiler and 
 runtime. Libraries are luxury items.
I was responding to your statement that people could code something up and use it themselves, "unless you are designing a standard library." If libraries don't matter, I don't see why you'd bring that up as an exception.
 I agree with everything till you start espousing consensus.  
 If anything, consensus often leads to the most incoherent 
 designs, ie design by committee.
 No. A standards committee is a collection of representatives 
 that are trying to balance out different conflicting agendas.
Which is precisely what leads to incoherence. It is theoretically possible that one can come up with a balanced design by committee that is also the best, but the problem is that many of the representatives are either wrong or don't matter, so by balancing in their concerns, you almost always end up with a product unbalanced for the real world.
 If you build at team you are better off establishing consensus. 
 So step one is to attract people with the same agenda. So if 
 you want to fork you should try to build consensus within a 
 faction and then branch off. If only 1 or 2 people create a 
 fork it will most likely go nowhere.
That's why I differentiated between getting a team on the same page and high-quality coherent designs. The former may get more done, but usually not at high quality. Read up more at the Linus links I gave to get the alternate perspective, of how to do it _without_ consensus.
 For example, if you think there might be commercial users for 
 that feature, you could sell them a slightly forked version of 
 dmd with your feature.
 Yes, I've given that some thought. But for it to pay off you 
 need a refactored, high-quality, documented code base. If a 1-2 
 person team is to serve customers you cannot waste hours on 
 poor infrastructure.
On the other hand, that means only those who really know or are willing to spend the time learning the codebase can compete with you, ie new competition can't get going as fast. There are both pros and cons to being early.
 And how far have they gotten?  Entire forks don't get very 
 far, but a tracking branch with a few additional features can 
 do just fine.
 One project got pretty far, but then it went dead because their 
 life situation changed as far as I can tell. Another one is 
 alive, but is too close to C++, yet not close enough: 
 http://loci-lang.org/ I think Loci looks quite ok, but it might 
 be too influenced by personal taste. I think the same applies 
 to many languages, even Rust, Go and D. Too opinionated, with 
 not entirely well founded priorities.
How is Loci in any way a fork of D? It may be similar in its features and goals, but it doesn't appear to fork any dmd or D code. If you believe those languages' priorities are "not entirely well founded," that's an opportunity for you to get it right. :)
 Then perhaps you didn't really need that code, if you wouldn't 
 have gotten much use out of it on your own.  Yes, I've 
 admitted that maintaining some language features is too 
 painful or different to maintain in a private branch, but many 
 can.
Yep, that's true. I don't need any other languages than C++, TypeScript and Python. But what I need is not the same as what I want!
As the original post noted, both need and want are irrelevant, if you're unwilling to code.
 Now, maybe it's impossible to maintain a high technical 
 standard with a community-driven OSS project and you need a 
 business model to be able to afford such exploration and 
 possible waste of time. That may be a fundamental tension 
 between technical quality and the OSS development model that 
 cannot be resolved.  But I see no way around such rejection if 
 you want to maintain a high level of quality.
Well, I think it matters that people can sit around a table and talk. And one can probably find solutions for doing cooperative work in a better way with the high bandwidths we have on the Internet today. But building a team with a shared vision and the right knowledge/attitude is not so easy either. Doing innovative things as a collective is very challenging. I think it takes much higher communication/people skills than are commonly found in these kinds of projects.
I think the biggest issue is money: you can't pay the bills with open source work. If there were a business model for open source, which I happen to have conveniently provided years ago, :) then things change: http://www.phoronix.com/scan.php?page=article&item=sprewell_licensing
 One of the reasons it's close is that new versions of C++ are 
 copying D. :) I don't blame them for doing so or think there's 
 anything wrong with it, but there are still enough problems 
 with C++ that I don't think it'll be enough.
Which features are you thinking of? I think D has rushed to implement proposed C++ features... (It can take a decade for things to make it into C++)
I don't use or follow C++, but stuff like CTFE has been mentioned in this forum before.
 Yes, politics, look at how much the politicians get done! ;) I 
 think we all know and Walter and Andrei have said that they're 
 not managers.
So, what is the process? Where is the quality assurance? They have to do management if they want to lead. Architects do manage the process. They don't lay down bricks. That's how great buildings become... great. And no, a programming language cannot be a "bazaar" (that's PHP and Perl).
I believe my second Linus link below has the answers to these questions, yes, the bazaar. Now, I agree that the current OSS bazaar usually ends up with low-quality results, which is why I mentioned that high-quality OSS needs the help of another kind of bazaar, the market, where actual money changes hands.
 But I don't think consensus is useful, what matters is making 
 the right technical decisions.
It matters if you don't want to go solo.
A lot of solo devs using D to go in the same general direction will work too, probably a lot better than consensus.
 Consensus is for getting everybody doing the same thing, which 
 is not the road to technical quality.
Consensus is for building a shared vision of what the future will look like, so that people think the project is worth backing and will go somewhere. Then you can have more independent processes on the details.
According to Linus, linux never had such a consensus, why did it succeed?
 Everything else about vision and listening, sure, but in an 
 OSS project everybody is free to ignore that vision.  It's one 
 reason why I wish the official vision statement was much more 
 specific and daring, because no matter what, we're free to 
 ignore it, so there's no risk.
Yes. But it could be simple. Like:

1. full feature freeze
2. heavy refactoring
3. full documentation of module x, y and z

+ some details on goals and planning
Such restarts rarely work out in the best of circumstances, ie a company with lots of money, even more so in a volunteer community. Not saying it can't or shouldn't be done, just that incremental improvement is more likely.

On Thursday, 11 February 2016 at 11:14:13 UTC, Ola Fosheim Grøstad wrote:
 On Thursday, 11 February 2016 at 10:21:19 UTC, Joakim wrote:
 You heard them above. Sun is basically inbreeding. That tends 
 to be good to bring out specific characteristics of a breed, 
 and tends to be good for _specialization_.
Linus is not a very good analyst. All the big iron corporations had transition problems: Cray, SGI, IBM, Sun and many more. It would be very difficult to take the expertise Sun had and rapidly turn Sun into a competitive force in a different field.
That's _exactly_ what he said, not sure what you disagree with.
 Of course, none of this is relevant to programming languages. 
 Perl died because Python was better. Not because the niches 
 changed. C++11 is replacing C/C++98 for newer projects that 
 need performance. Better hardware means the niche has shrunk, 
 but it will remain a significant niche.
But Python has not emerged from that scripting-language niche either, and I think you greatly overestimate how well C++11 is doing. All the interest in Rust, Go, D, and Swift is because C++ is having problems attracting those newer projects. You can call C++ a "niche," but the fact is that it is not specialized for game development or cloud services, ie it's still fairly general purpose. Those who want D to specialize more should heed Linus's words.
Feb 11 2016
parent reply Ola Fosheim Grøstad writes:
On Thursday, 11 February 2016 at 11:46:44 UTC, Joakim wrote:
 That's why I differentiated between getting a team on the same 
 page and high-quality coherent designs.  The former may get 
 more done, but usually not at high quality.  Read up more at 
 the Linus links I gave to get the alternate perspective, of how 
 to do it _without_ consensus.
Linux is not a good example. Linux is too high profile and can afford massive churn. That is highly inefficient use of programmer resources.
 On the other hand, that means only those who really know or are 
 willing to spend the time learning the codebase can compete 
 with you, ie new competition can't get going as fast.  There 
 are both pros and cons to being early.
Mostly cons. There are very few potential customers. And most likely no local customers, which are the most attractive ones.
 How is Loci in any way a fork of D?  It may be similar in its 
 features and goals, but it doesn't appear to fork any dmd or D 
 code.
I didn't say fork. I was talking about people who have given up on the D development process and created their own language in the same category as C++ and D.
 If you believe those languages' priorities are "not entirely 
 well founded," that's an opportunity for you to get it right. :)
Sure, I'm thinking about it. But I currently think WebAssembly/JavaScript + Linux server are the most important targets, so maybe going from scratch is less work, actually. But sure, building a new language over Rust, D or Go is an option.
 As the original post noted, both need and want are irrelevant, 
 if you're unwilling to code.
Nobody is unwilling to code. Most people are unwilling to manage a project or invest in a project that isn't properly managed. What you need is a well-managed project with a clear vision, clear goals and good leadership. And please don't point at Linus, he is not a particularly effective leader, but probably does well as a manager. But Posix was already given... The broad strokes for a monolithic kernel are kinda given. He just happened to whip up something at the right time, that many people had been looking for (free Unix).
 I don't use or follow C++, but stuff like CTFE has been 
 mentioned in this forum before.
Well, constexpr functions can replace convoluted template programming. Not sure if that is related to D.
 A lot of solo devs using D to go in the same general direction 
 will work too, probably a lot better than consensus.
Well, not sure what we are talking about here. Clearly, you need consensus among said devs if you are going to change the language so that it can support either better manual memory management or faster garbage collection?
 According to Linus, linux never had such a consensus, why did 
 it succeed?
Because there was no free Unix on x86 and the CPUs at that point in time had MMUs. Many people who used Unix at work or on campus wanted Unix at home too. Many students used Minix in their OS course, and disliked the non-free license. So you basically had a fairly large group of people willing to throw in weeks and months, if not years in the early stages of the project.
 Yes. But it could be simple. Like.

 1. full feature freeze
 2. heavy refactoring
 3. full documentation of module x, y and z.

 + some details on goals and planning
 Such restarts rarely work out in the best of circumstances, ie a company with lots of money, even more so in a volunteer community. Not saying it can't or shouldn't be done, just that incremental improvement is more likely.
Refactoring and documentation isn't a restart. It is common hygiene!
 But python has not emerged from that scripting language niche 
 either, and I think you greatly overestimate how well C++11 is 
 doing.
Python was inspired by a language used for teaching programming, but was geared to more advanced programmers. Not sure what you mean by Python not having emerged?
 Those who want D to specialize more should heed Linus's words.
Can you paraphrase those words in a condensed manner that is relevant to programming languages? I don't get the argument.
Feb 11 2016
parent reply Joakim <dlang joakim.fea.st> writes:
On Thursday, 11 February 2016 at 12:47:20 UTC, Ola Fosheim 
Grøstad wrote:
 On Thursday, 11 February 2016 at 11:46:44 UTC, Joakim wrote:
 That's why I differentiated between getting a team on the same 
 page and high-quality coherent designs.  The former may get 
 more done, but usually not at high quality.  Read up more at 
 the Linus links I gave to get the alternate perspective, of 
 how to do it _without_ consensus.
 Linux is not a good example. Linux is too high profile and can afford massive churn. That is highly inefficient use of programmer resources.
It was not always high-profile, it started off with one guy and grew big through the same decentralized process.
 On the other hand, that means only those who really know or 
 are willing to spend the time learning the codebase can 
 compete with you, ie new competition can't get going as fast.  
 There are both pros and cons to being early.
 Mostly cons. There are very few potential customers. And most likely no local customers, which are the most attractive ones.
The chicken has to start somewhere, or there will be no eggs. ;)
 How is Loci in any way a fork of D?  It may be similar in its 
 features and goals, but it doesn't appear to fork any dmd or D 
 code.
 I didn't say fork. I was talking about people who have given up on the D development process and created their own language in the same category as C++ and D.
You mentioned it in response to forks and how far they've gotten. That guy gave up on D?
 If you believe those languages' priorities are "not entirely 
 well founded," that's an opportunity for you to get it right. 
 :)
 Sure, I'm thinking about it. But I currently think WebAssembly/JavaScript + Linux server are the most important targets, so maybe going from scratch is less work, actually. But sure, building a new language over Rust, D or Go is an option.
The Loci guy, just a couple years out of university, did it; surely you could too, if nobody else is getting it right.
 As the original post noted, both need and want are irrelevant, 
 if you're unwilling to code.
 Nobody is unwilling to code. Most people are unwilling to manage a project or invest in a project that isn't properly managed. What you need is a well-managed project with a clear vision, clear goals and good leadership. And please don't point at Linus, he is not a particularly effective leader, but probably does well as a manager. But Posix was already given... The broad strokes for a monolithic kernel are kinda given. He just happened to whip up something at the right time, that many people had been looking for (free Unix).
In the emails I linked to, he notes that he didn't have a clear vision or goals and that it is _likely impossible to do so for software_. So he agrees with you that he isn't some great leader, and notes that what's important is the decentralized process, where there is _no clear vision_. Now, you're right that copying UNIX is easier than coming up with an entirely new technical design, but he claims that the UNIX guys themselves didn't "design" it, that that was an evolutionary, decentralized process also.
 A lot of solo devs using D to go in the same general direction 
 will work too, probably a lot better than consensus.
 Well, not sure what we are talking about here. Clearly, you need consensus among said devs if you are going to change the language so that it can support either better manual memory management or faster garbage collection?
Not necessarily, Sociomantic didn't wait for permission to go do their own concurrent GC for D1. One can experiment with various approaches to memory management and come back with actual data. It doesn't take much time to prototype something and test out ideas, before you make a push for it to be included in the language. My point is that we're not going to come to a consensus on the best approach to memory management. Somebody, likely several, will have to try out different approaches locally and then compare results. Perhaps that will lead to several different GCs shipping with D, tuned for different loads.
 According to Linus, linux never had such a consensus, why did 
 it succeed?
 Because there was no free Unix on x86 and the CPUs at that point in time had MMUs. Many people who used Unix at work or on campus wanted Unix at home too. Many students used Minix in their OS course, and disliked the non-free license. So you basically had a fairly large group of people willing to throw in weeks and months, if not years, in the early stages of the project.
That's all well and good, but it doesn't answer the question: how did they succeed without prior consensus, which Linus claims was never there?
 Refactoring and documentation isn't a restart.

 It is common hygiene!
Right, it's a question of whether we stop everything and take a thorough bath, or clean a little here and there on the go. There is refactoring constantly going on, and documentation is always tough for OSS projects.
 But python has not emerged from that scripting language niche 
 either, and I think you greatly overestimate how well C++11 is 
 doing.
 Python was inspired by a language used for teaching programming, but was geared to more advanced programmers. Not sure what you mean by Python not having emerged?
I mean it's still a scripting language used for teaching, scripting, and webapps. Almost nobody is using it for application programming, ie anything outside that scripting niche, say for mobile apps.
 Those who want D to specialize more should heed Linus's words.
Can you paraphase those words in a condensed manner that is relevant to programming languages? I don't get the argument.
He noted that the UNIX vendors failed because they were highly specialized for certain corporate niches, unlike linux or Windows, and couldn't survive a collapse of that niche, because they weren't general enough to survive in new niches. Similarly, I'm saying D shouldn't specialize for the same reasons.
Feb 11 2016
parent reply Ola Fosheim Grøstad writes:
On Thursday, 11 February 2016 at 14:53:37 UTC, Joakim wrote:
 It was not always high-profile, it started off with one guy and 
 grew big through the same decentralized process.
It was fairly popular among students even back when it was not so great. This is not so atypical. Someone fills a void, then it grows. The real enabler was getting access to machines that had MMUs.
 The loci guy, just a couple years out of university did it, 
 surely you could too, if nobody else is getting it right.
I could, in theory. But that would make it my only hobby...
 software_.  So he agrees with you that he isn't some great 
 leader, and notes that what's important is the decentralized 
 process, where there is _no clear vision_.

 Now, you're right that copying UNIX is easier than coming up 
 with an entirely new technical design, but he claims that the 
 UNIX guys themselves didn't "design" it, that that was an 
 evolutionary, decentralized process also.
I find that difficult to follow. A Unix kernel is a pretty clear vision... But Solaris was a much more advanced OS than Linux was, geared towards more complicated setups. I don't think the design of Linux is a major factor, as long as it worked reasonably well. Linux proliferated because it is the path of least resistance and has a high installed base. Distributions like Slackware and Debian were probably very important. It's not like end users cared about the kernel all that much. They wanted convenient distributions. I think people put too much weight on the kernel. It is not all that special.
 compare results.  Perhaps that will lead to several different 
 GCs shipping with D, tuned for different loads.
Well, the problem is that the language itself does not lend itself to effective GC. If you have a modular compiler, well structured and documented, then it would make sense to change the semantics to see what the effect is.
 That's all well and good, but it doesn't answer the question: 
 how did they succeed without prior consensus, which Linus 
 claims was never there?
I have no idea what he means. The basic conceptual design of a monolithic Unix kernel is rather well established.
 Right, it's a question of whether we stop everything and take a 
 thorough bath, or clean a little here and there on the go.  
 There is refactoring constantly going on, and documentation is 
 always tough for OSS projects.
Yes, but I see people repeatedly state in the forums that they want to try to do some work on the compiler, but that they find the code badly structured, undocumented, and the process difficult to grasp... So it probably will pay off, if they actually mean it. For everyone who voices an opinion, we can probably add another 5 who choose to be silent?
 I mean it's still a scripting language used for teaching, 
 scripting, and webapps.  Almost nobody is using it for 
 application programming, ie anything outside that scripting 
 niche, say for mobile apps.
Yes, that's true. Although many people use Python in their workflow as a supporting language or even for meta-programming, like generating source for other languages.
 He noted that the UNIX vendors failed because they were highly 
 specialized for certain corporate niches, unlike linux or 
 Windows, and couldn't survive a collapse of that niche, because 
 they weren't general enough to survive in new niches.  
 Similarly, I'm saying D shouldn't specialize for the same 
 reasons.
I don't think it is comparable, Sun sold hardware and consulting. When the hardware market is undermined by commodity they were left with consulting. It is better for a business to stay focused, and then sell that aspect of their business when the market is shrinking. Sun was also not dissolved, it was picked up and integrated with Oracle, who benefits from Sun's assets.

Microsoft is not a very good counter example either. Nor HP or Motorola. Reorganizing fractured businesses is even more difficult, I would think. It basically means you are trying to make sense of many businesses at once, instead of managing one...

People are not looking for a general purpose language. They are looking for a solution to their particular problem area...

Go
Rust
Swift

All fairly specialized and gaining ground.
Feb 11 2016
parent reply Joakim <dlang joakim.fea.st> writes:
On Thursday, 11 February 2016 at 15:31:02 UTC, Ola Fosheim 
Grøstad wrote:
 People are not looking for a general purpose language. They are 
 looking for a solution to their particular problem area...

 Go
 Rust
 Swift

 All fairly specialized and gaining ground.
I wouldn't call Swift specialized, maybe only because it only runs on OS X, iOS and linux right now. So Linus would predict that Go and Rust may do well now because they're specialized, but will be hit hard if their niche collapses and they don't become more general-purpose before then (which I don't think they can do). You seem to think that's not a real concern, that the growth from specialization is worth it. Let's see who's right. :)

On Thursday, 11 February 2016 at 15:34:47 UTC, Nick Sabalausky wrote:
 On 02/11/2016 06:53 AM, Dejan Lekic wrote:
 I know some will disagree with me, but I will say it anyway: IT
 community, especially developers, are known for poor social 
 skills...
 People tend to forget that...
 There may be a certain *small* level of truth to that, but most of it is nothing more than decades of Hollywood's pejorative stereotyping. And people being naive enough to believe what they see in the fiction that was produced by people who have spent decades proving themselves to have zero comprehension of basic reality, let alone even a basic high-school level research ability.
That's how Hollywood works: they take a well-known trait or stereotype and build a caricature out of it, ie jocks are good-looking and dumb, the President is wise and composed, and so on.
 It's the standard old Hollywood complete and total disconnect 
 with reality - hell, look how they portray Tourette's as having 
 a relationship to swearing (which is just plain bizarre to 
 anyone actually capable of spending a mere one minute on a 
 basic web search), or how cracking security always involves 
 playing a 3D puzzle game. And then there's the oddity that any 
 time a writer or director uses a computer in real life, the 
 machine is clearly built to detect it's being used by Hollywood 
 personnel, so all login systems automatically switch from the 
 normal "Username and Password don't match \ Incorrect login \ 
 Password was incorrect" to a flashing red "ACCESS DENIED". 
 Because presumably they actually see this flashing red "ACCESS 
 DENIED" when they actually do use a computer in real life, 
 because they couldn't really be THAT dumb when producing a 
 film, right? At least that's the only explanation I can come up 
 with for its appearance in otherwise "realistic" movies, at 
 least aside from LSD...which really could explain all the rest 
 of their delusions too...hmm...
A lot of that is about showing simply and visually, or with greater effect, what would be boring if shown realistically. Many watching will not be able to read "Incorrect login," but they can figure out that flashing red is bad. If that person with Tourette's were just twitching uncontrollably, it's not very entertaining, whereas it's funny if they unexpectedly swear like a sailor in front of some prude. :) Watching somebody cracking security or defusing a bomb realistically would be boring and confusing, if not for the 3D puzzles or flashing LED bomb clocks to watch and understand what's going on. They're not that stupid, you know. They're just trying to make as much money as they can, which means dumbing the material down for the lowest common denominator. In fact, I find it astonishing how often they raise issues that later become big in real life.
 Hollywood mental flakes spend decades inventing and reinforcing 
 their own myopic stereotypes, such as "technical ability == 
 dorks with no social skills", most likely because they feel 
 threatened by people with at least half a functioning brain (which 
 most of them clearly lack), and then the masses believe it, and 
 it becomes *cough* "fact". That's all there is to it.
There may be some truth to that, but more likely they're just pandering to the stereotypes of their audience, ie the salesman who snickers at the IT guy who can't get a date but is jealous that he makes more money. I did think the recent movies The Social Network and Jobs, both written by the writer of The West Wing and The Newsroom, showed a concerted effort to cast those tech CEOs in a negative light. Hollywood is likely mad that tech is encroaching on their domain, with YouTube, iTunes, Netflix, etc. There were supposedly characters in The Newsroom who railed against bloggers (never watched the show, heard it was bad), and the writer has done the same in real life. What you say may be true in the last couple years, whereas before they likely didn't see tech as a threat.
Feb 11 2016
parent Ola Fosheim Grøstad writes:
On Thursday, 11 February 2016 at 16:39:14 UTC, Joakim wrote:
 I wouldn't call Swift specialized, maybe only because it only 
 runs on OS X, iOS and linux right now.  So Linus would predict 
 that Go and Rust may do well now because they're specialized, 
 but will be hit hard if their niche collapses and they don't 
 become more general-purpose before then (which I don't think 
 they can do).  You seem to think that's not a real concern, 
 that the growth from specialization is worth it.  Let's see 
 who's right. :)
I don't think there is any such thing as general purpose programming languages. They all get sucked into a niche.
Feb 11 2016
prev sibling next sibling parent reply Joakim <dlang joakim.fea.st> writes:
On Wednesday, 10 February 2016 at 02:11:25 UTC, Laeeth Isharc 
wrote:
 http://sealedabstract.com/rants/conduct-unbecoming-of-a-hacker/
 (His particular suggestion about accept patches by default is 
 not why I post this).
 '
 We’re all talk

 [...]
Pretty funny that he chose Stallman as his example of a guy who gets stuff done, whose Hurd microkernel never actually got done, :) though certainly ambitious, so Stallman would never have had a FOSS OS on which to run his GNU tools if it weren't for Linus.

As for the main point about useless bickering replacing hacking, that's probably because it was a much smaller community back then, so it consisted of only the really hard-core who wanted to _do_ something, whereas now it's expanded outside that group to the more half-hearted. Either that or he has on the usual rose-colored glasses for the past, the usual veteran complaint, "Everything was better when I was young!" :D

I don't think the D community actually has these problems that much, as most of the talk is about technical issues, not whether Apple is doing this or that with their business. The fact is that software was not as big a business back then, whereas the largest companies on the planet are built to write software nowadays, so of course people talk a lot more about how the giant software company du jour's actions affect them.
Feb 10 2016
next sibling parent reply Nick Sabalausky <SeeWebsiteToContactMe semitwist.com> writes:
On 02/10/2016 01:09 PM, Joakim wrote:
 Pretty funny that he chose Stallman as his example of a guy who gets
 stuff done, whose Hurd microkernel never actually got done, :) though
 certainly ambitious, so Stallman would never have had a FOSS OS on which
 to run his GNU tools if it weren't for Linus.
[Unimportant theorizing ahead...] I wouldn't say that's necessarily true: It could be argued the existence and proliferation of the Linux kernel reduced the priority of his Hurd work, even if only to a subconscious extent. If it hadn't been for the Linux kernel, maybe there would have been more drive (and more contributors) to Hurd.
Feb 10 2016
next sibling parent reply Joakim <dlang joakim.fea.st> writes:
On Wednesday, 10 February 2016 at 18:31:22 UTC, Nick Sabalausky 
wrote:
 On 02/10/2016 01:09 PM, Joakim wrote:
 Pretty funny that he chose Stallman as his example of a guy 
 who gets
 stuff done, whose Hurd microkernel never actually got done, :) 
 though
 certainly ambitious, so Stallman would never have had a FOSS 
 OS on which
 to run his GNU tools if it weren't for Linus.
 [Unimportant theorizing ahead...] I wouldn't say that's necessarily true: It could be argued the existence and proliferation of the Linux kernel reduced the priority of his Hurd work, even if only to a subconscious extent. If it hadn't been for the Linux kernel, maybe there would have been more drive (and more contributors) to Hurd.
I've read that the bigger issue was that they couldn't quite get Hurd working on '90s hardware, and the simpler linux kernel outpaced it, ie I doubt linux displaced Hurd contribution as they're different approaches.

On Wednesday, 10 February 2016 at 19:07:27 UTC, Ola Fosheim Grøstad wrote:
 On Wednesday, 10 February 2016 at 18:09:57 UTC, Joakim wrote:
 Pretty funny that he chose Stallman as his example of a guy 
 who gets stuff done, whose Hurd microkernel never actually got 
 done, :) though certainly ambitious, so Stallman would never 
 have had a FOSS OS on which to run his GNU tools if it weren't 
 for Linus.
 Well, 386BSD was there in 1992-1994, and several other OSes, so I don't think Linux is that special. Linux did have the right timing. Amiga and other specialized hardware was becoming less attractive at that point in time, and students were getting x86 PCs with MMUs and wanted an OS that was more like Unix, but less crude than Minix.
Still means he'd have had to rely on others to provide his OS, plus BSD was under a legal cloud at the time, which is one of the reasons people say linux lapped it, and he'd probably resent it not being GPL, so it wouldn't work for him anyway.
 But I don't think Hurd is much of a Stallman coding-project. 
 His core project is the GPL and he did created Emacs and GCC 
 which were very important for the spread of the GPL.
I thought he was intimately involved with Hurd, but I don't follow it.
 Before GPL most academic software had very limiting "free for 
 non-commercial educational use" clauses in their licenses. The 
 GPL itself is much more important than any individual piece of 
 software.
Perhaps historically as a guinea pig, but its use is waning for more permissive licenses, which have been around for decades too.
 As for the main point about useless bickering replacing 
 hacking, that's probably because it was a much smaller 
 community back then, so it consisted of only the really 
 hard-core who wanted to _do_ something, whereas now it's 
 expanded outside that group to the more half-hearted.  Either 
 that or he has on the usual rose-colored glasses for the past, 
 the usual veteran complaint, "Everything was better when I was 
 young!" :D
Well, both Emacs and GCC have had their forks... so. Yes.
Forks are a different issue, as he'd probably say that's real technical disagreement. He's talking more about silly reasons, though I guess forks are sometimes started because of the same dismissiveness he lays out.
Feb 10 2016
parent Ola Fosheim Grøstad writes:
On Wednesday, 10 February 2016 at 19:44:50 UTC, Joakim wrote:
 Perhaps historically as a guinea pig, but its use is waning for 
 more permissive licenses, which have been around for decades 
 too.
Well, they had been around for things like X11, which had a commercial consortium driving the development. X11 was just a reference implementation, and members got early access to it so that they could implement it in their proprietary X11 terminals before the general public got access to it...

But as (even public) universities were pressured to earn money from their research, the heads higher up insisted on anti-commercial licensing. Only when GPL gained traction could the comp. sci. people start to push for something more liberal. IIRC the most standard licensing was educational-use, non-commercial-use or public domain back then.

People had to pay for their compilers and IDEs too... quite a lot... And phone bills from using BBSes. Shareware was a much more accepted concept at that point in time too... GPL changed the world quite significantly.
Feb 10 2016
prev sibling parent reply deadalnix <deadalnix gmail.com> writes:
On Wednesday, 10 February 2016 at 18:31:22 UTC, Nick Sabalausky 
wrote:
 On 02/10/2016 01:09 PM, Joakim wrote:
 Pretty funny that he chose Stallman as his example of a guy 
 who gets
 stuff done, whose Hurd microkernel never actually got done, :) 
 though
 certainly ambitious, so Stallman would never have had a FOSS 
 OS on which
 to run his GNU tools if it weren't for Linus.
[Unimportant theorizing ahead...] I wouldn't say that's necessarily true: It could be argued the existence and proliferation of the Linux kernel reduced the priority of his Hurd work, even if only to a subconscious extent. If it hadn't been for the Linux kernel, maybe there would have been more drive (and more contributors) to Hurd.
Also because context switching went from a handful of cycles back then to about 1000 cycles on modern CPUs, making the idea of a microkernel somewhat less attractive.

But saying Stallman released nothing is unfair. Even if we consider Hurd a failure, he was also behind Emacs, early GCC and other things.
Feb 10 2016
parent Joakim <dlang joakim.fea.st> writes:
On Wednesday, 10 February 2016 at 23:55:17 UTC, deadalnix wrote:
 Also because context switching went from a handful of cycles 
 back then to about 1000 cycles on modern CPUs, making the idea 
 of a microkernel somewhat less attractive.

 But saying Stallman released nothing is unfair. If we can 
 consider hurd a failure, he was also behind emacs, early gcc 
 and other things.
Nobody said "nothing"; I specifically noted his "GNU tools" and that a microkernel was ambitious. I was just remarking that his kernel didn't get done, for a variety of reasons.
Feb 10 2016
prev sibling parent Ola Fosheim Grøstad writes:
On Wednesday, 10 February 2016 at 18:09:57 UTC, Joakim wrote:
 Pretty funny that he chose Stallman as his example of a guy who 
 gets stuff done, whose Hurd microkernel never actually got 
 done, :) though certainly ambitious, so Stallman would never 
 have had a FOSS OS on which to run his GNU tools if it weren't 
 for Linus.
Well, 386BSD was there in 1992-1994, along with several other OSes, so I don't think Linux is that special. Linux did have the right timing: Amiga and other specialized hardware was becoming less attractive at that point, and students were getting x86 PCs with MMUs and wanted an OS that was more like Unix, but less crude than Minix.

But I don't think Hurd is much of a Stallman coding project. His core project is the GPL, and he did create Emacs and GCC, which were very important for the spread of the GPL. Before the GPL, most academic software had very limiting "free for non-commercial educational use" clauses in their licenses. The GPL itself is much more important than any individual piece of software.
 As for the main point about useless bickering replacing 
 hacking, that's probably because it was a much smaller 
 community back then, so it consisted of only the really 
 hard-core who wanted to _do_ something, whereas now it's 
 expanded outside that group to the more half-hearted.  Either 
 that or he has on the usual rose-colored glasses for the past, 
 the usual veteran complaint, "Everything was better when I was 
 young!" :D
Well, both Emacs and GCC have had their forks... so. Yes.
Feb 10 2016
prev sibling next sibling parent Nick Sabalausky <SeeWebsiteToContactMe semitwist.com> writes:
On 02/09/2016 09:11 PM, Laeeth Isharc wrote:
 http://sealedabstract.com/rants/conduct-unbecoming-of-a-hacker/
 (His particular suggestion about accept patches by default is not why I
 post this).
Just read the rest of the article. That's a REALLY good article. Especially these bits:

===============================
"Consider the following outcomes, which happen with some regularity:

* [...]

* Objections that the problem should be solved another way, but that are not accompanied by any volunteers to do it that way. A patch in the hand is better than two in the bush. If somebody does end up doing it the “right” way someday, git revert is only 10 keystrokes. The fact that someday a better patch might appear is not an argument against merging an adequate patch right now.

* Bare assertions that there is no need for the feature, when the fact that somebody wrote a patch should be prima facie evidence that the feature was needed"
-------------------
"Really, what I’m asking is this: Which is more convincing? Concrete computer code authored by someone with first-hand knowledge of the defect? Or the bare assertion that something is wrong with it? I mean, either one might be correct. But the first is better supported."
==============================

Really like: "A patch in the hand is better than two in the bush."
Feb 10 2016
prev sibling next sibling parent lobo <swamplobo gmail.com> writes:
On Wednesday, 10 February 2016 at 02:11:25 UTC, Laeeth Isharc 
wrote:
 http://sealedabstract.com/rants/conduct-unbecoming-of-a-hacker/
 (His particular suggestion about accept patches by default is 
 not why I post this).
 '
 We’re all talk

 [...]
Interesting timing! At my workplace we have a geekfest video session once a month, and last week it was this: https://youtu.be/-F-3E8pyjFo

It's worth a watch and it covers very similar ground. The presenters discuss their experiences developing Subversion and other FOSS projects.

bye,
lobo
Feb 10 2016
prev sibling next sibling parent reply Dejan Lekic <dejan.lekic gmail.com> writes:
I am sure nobody will disagree with this post. Thing is, wherever 
there are people, there will be disagreements. I remember the "final 
by default" vs "virtual by default" thread. I remember people 
whining and leaving the D community for various reasons.

What made me personally stick with D is that I humbly believe the 
people who drive the project have a clear idea of where we are going.

I know some will disagree with me, but I will say it anyway: the IT 
community, especially developers, is known for poor social 
skills... People tend to forget that...
Feb 11 2016
parent Nick Sabalausky <SeeWebsiteToContactMe semitwist.com> writes:
On 02/11/2016 06:53 AM, Dejan Lekic wrote:
 I know some will disagree with me, but I will say it anyway: IT
 community, especially developers, are known for poor social skills...
 People tend to forget that...
There may be a certain *small* level of truth to that, but most of it is nothing more than decades of Hollywood's pejorative stereotyping, and people being naive enough to believe what they see in fiction produced by people who have spent decades proving themselves to have zero comprehension of basic reality, let alone even a basic high-school level research ability.

It's the standard old Hollywood complete and total disconnect with reality. Hell, look how they portray Tourette's as having a relationship to swearing (which is just plain bizarre to anyone actually capable of spending a mere one minute on a basic web search), or how cracking security always involves playing a 3D puzzle game. And then there's the oddity that any time a writer or director uses a computer in real life, the machine is clearly built to detect it's being used by Hollywood personnel, so all login systems automatically switch from the normal "Username and Password don't match \ Incorrect login \ Password was incorrect" to a flashing red "ACCESS DENIED". Because presumably they actually see this flashing red "ACCESS DENIED" when they do use a computer in real life; they couldn't really be THAT dumb when producing a film, right? At least that's the only explanation I can come up with for its appearance in otherwise "realistic" movies, aside from LSD... which really could explain all the rest of their delusions too... hmm...

Hollywood mental flakes spend decades inventing and reinforcing their own myopic stereotypes, such as "technical ability == dorks with no social skills", most likely because they feel threatened by people with at least half a functioning brain (which most of them clearly lack), and then the masses believe it, and it becomes *cough* "fact". That's all there is to it.
Feb 11 2016
prev sibling parent reply w0rp <devw0rp gmail.com> writes:
His article is way too long. It seems like an article about 
whining about how people whine too much.
Feb 11 2016
parent reply Nick Sabalausky <SeeWebsiteToContactMe semitwist.com> writes:
On 02/11/2016 04:54 PM, w0rp wrote:
 His article is way too long. It seems like an article about whining
 about how people whine too much.
It's metawhine! :)
Feb 11 2016
parent Abdulhaq <alynch4047 gmail.com> writes:
On Friday, 12 February 2016 at 03:19:52 UTC, Nick Sabalausky 
wrote:
 On 02/11/2016 04:54 PM, w0rp wrote:
 His article is way too long. It seems like an article about 
 whining
 about how people whine too much.
It's metawhine! :)
These metawhines get on my nerves; everything was much better back in the Usenet days.
Feb 12 2016