
digitalmars.D - Daily downloads in decline

Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
Per http://erdani.com/d/downloads.daily.png, the 28-day moving average 
of daily dmd downloads is in pronounced decline following a peak at the 
2.067 release. It is possible that the recent release of Rust 1.0 has 
caused that, shifting drive-by experimenters to it.

We need to act on this on multiple fronts.

1. It's a big bummer that nothing has happened with chopping up the 
videos over the weekend. Right now DConf is three 6-hour blobs of 
unstructured footage. John has warned us he might not have broadband 
access to do so during his travels. In retrospect, what we should have 
done was immediately arrange for John to give access to the videos to 
someone willing and able to do the postprocessing.

2. It's an equally big bummer that "This Week in D" failed to be there 
on Sunday night. I completely understand Adam's overhead, what with his 
still traveling and all, but the bottom line is: if it's not every Sunday 
it's not steady, and if it's not steady it's not. Again, in retrospect it 
seems we need backup plans for when the protagonist of whatever 
important activity is unable to carry it out. Who'd like to double Adam on this?

3. We've just had a good conference with solid content, but judging by 
our collective actions, we did our best to stay as stealthy as possible. 
Please consider writing blogs, articles, tweets, and posts related to 
all that stuff. Speakers in particular should consider converting their 
good work into articles. Programmer news sites are full of Rust-related 
stuff; we must respond in kind with great D content.

All of us who have an interest in seeing D succeed must understand that 
it carries a proportional sense of duty. If you can do X and don't, it 
can be safely assumed X will just not get done at all. Which means 
whatever you can do, please just do it, do it now, and stay with it 
until it's done.


Thanks,

Andrei
Jun 01 2015
Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
On 6/1/15 11:11 AM, Andrei Alexandrescu wrote:
[...]

Meant to also add

4. We need to marshal our efforts behind 2.068 and clarify the 
big-ticket items accomplished. I'm thinking rangification of Phobos: GC 
no longer needed for most primitives, and documented where needed. As 
much as I want it, ddmd seems not to be happening for 2.068 because of, 
simply put, insufficient resources.
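For readers unfamiliar with what "rangification" buys: the point is replacing eagerly allocating functions with lazy range pipelines, so the GC is touched only when the caller explicitly asks for an array. A minimal illustrative sketch; the function names here are made up for the example, not actual Phobos code:

```d
import std.algorithm : equal, filter, map;
import std.array : array;
import std.range : iota;

// Eager style: every call allocates a new array on the GC heap.
int[] evensDoubledEager(int n)
{
    int[] result;
    foreach (i; 0 .. n)
        if (i % 2 == 0)
            result ~= i * 2; // GC allocation on append
    return result;
}

// "Rangified" style: composes lazy ranges; nothing is allocated
// unless the caller opts in with .array.
auto evensDoubledLazy(int n)
{
    return iota(n).filter!(i => i % 2 == 0).map!(i => i * 2);
}

void main()
{
    assert(evensDoubledLazy(10).equal([0, 4, 8, 12, 16]));
    // The caller decides when (and whether) to allocate:
    assert(evensDoubledEager(10) == evensDoubledLazy(10).array);
}
```

The lazy version also composes with further algorithms without intermediate arrays, which is where most of the GC savings come from.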

What other large topics for 2.068?


Andrei
Jun 01 2015
"weaselcat" <weaselcat gmail.com> writes:
On Monday, 1 June 2015 at 18:14:40 UTC, Andrei Alexandrescu wrote:
 On 6/1/15 11:11 AM, Andrei Alexandrescu wrote:
 [...]

 Meant to also add

 4. We need to marshal our efforts behind 2.068, and clarify the 
 big ticket items accomplished. I'm thinking rangification of 
 Phobos - GC no longer needed for most primitives, and 
 documented where needed. As much as I want it, ddmd seems to 
 not be happening for 2.068 because of, simply put, insufficient 
 resources.

 What other large topics for 2.068?


 Andrei
The unique/shared changes are rather large changes; they've been pretty much revamped.

https://github.com/D-Programming-Language/phobos/pull/3139 - pulled already
https://github.com/D-Programming-Language/phobos/pull/3259 - not done
https://github.com/D-Programming-Language/phobos/pull/3225 - not done

Getting these done before 2.068 should be pretty high priority.
Jun 01 2015
Manu via Digitalmars-d <digitalmars-d puremagic.com> writes:
On 2 June 2015 at 04:14, Andrei Alexandrescu via Digitalmars-d
<digitalmars-d puremagic.com> wrote:
 On 6/1/15 11:11 AM, Andrei Alexandrescu wrote:
 [...]

 Meant to also add

 4. We need to marshal our efforts behind 2.068, and clarify the big ticket
 items accomplished. I'm thinking rangification of Phobos - GC no longer
 needed for most primitives, and documented where needed. As much as I want
 it, ddmd seems to not be happening for 2.068 because of, simply put,
 insufficient resources.

 What other large topics for 2.068?
Based on other conversations, perhaps a good concept to get behind would be: LDC+GDC are first-class toolchains, and DMD devs will work together with the other compilers to release in parallel, making sure distributions for all compilers are available at the same time from the same place.

I think it's been a big problem forever now that nobody can really have any confidence that the language version matches between the available compilers. You've said elsewhere that this is a major issue inhibiting ddmd? Perhaps a focus for 2.068 is to organise and address this general problem by developing better systems of coordination and release?

If that can be achieved, perhaps we could promote LDC as the default go-to compiler, so that newcomers doing experiments and benchmarks will see realistic results against the competition. Constantly having to correct benchmarkers to use LDC and offer the proper flags is a bit embarrassing. I wonder how many times this has happened without being posted on the internet where we could correct it?
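On the "proper flags" point, a sketch of the kind of invocations benchmarkers should be comparing. `bench.d` is a hypothetical benchmark program, and the flag sets below reflect commonly recommended 2015-era optimized builds; exact flags may vary by compiler version:

```shell
# DMD: reference compiler; fast compiles, weaker optimizer.
dmd -O -release -inline -boundscheck=off bench.d -ofbench-dmd

# LDC: LLVM backend; the one to reach for in performance comparisons.
ldc2 -O3 -release -boundscheck=off bench.d -of=bench-ldc

# GDC: GCC backend.
gdc -O3 -frelease -fno-bounds-check bench.d -o bench-gdc
```

Comparing an unoptimized dmd build against `gcc -O3` or `rustc -O` is the mistake this thread keeps running into.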
Jun 01 2015
"Benjamin Thaut" <code benjamin-thaut.de> writes:
On Monday, 1 June 2015 at 18:14:40 UTC, Andrei Alexandrescu wrote:
 On 6/1/15 11:11 AM, Andrei Alexandrescu wrote:

 What other large topics for 2.068?
I'm still working on Windows DLL support whenever I find time, and I hope to get it done for 2.068 or 2.069.

Kind Regards,
Benjamin Thaut
Jun 02 2015
"David Nadlinger" <code klickverbot.at> writes:
On Monday, 1 June 2015 at 18:14:40 UTC, Andrei Alexandrescu wrote:
 As much as I want it, ddmd seems to not be happening for 2.068 
 because of, simply put, insufficient resources.
Why is that? A first test release does not seem to be further away than, say, full rangeification of Phobos. - David
Jun 05 2015
"Joakim" <dlang joakim.fea.st> writes:
On Friday, 5 June 2015 at 20:23:17 UTC, David Nadlinger wrote:
 On Monday, 1 June 2015 at 18:14:40 UTC, Andrei Alexandrescu 
 wrote:
 As much as I want it, ddmd seems to not be happening for 2.068 
 because of, simply put, insufficient resources.
Why is that? A first test release does not seem to be further away than, say, full rangeification of Phobos.
Answered earlier in the thread: http://forum.dlang.org/post/mkibq1$aj9$1 digitalmars.com
Jun 05 2015
"David Nadlinger" <code klickverbot.at> writes:
On Saturday, 6 June 2015 at 03:41:13 UTC, Joakim wrote:
 On Friday, 5 June 2015 at 20:23:17 UTC, David Nadlinger wrote:
 On Monday, 1 June 2015 at 18:14:40 UTC, Andrei Alexandrescu 
 wrote:
 As much as I want it, ddmd seems to not be happening for 
 2.068 because of, simply put, insufficient resources.
Why is that? A first test release does not seem to be further away than, say, full rangeification of Phobos.
Answered earlier in the thread: http://forum.dlang.org/post/mkibq1$aj9$1 digitalmars.com
That's not a full answer. I worked with Daniel to get LDC to successfully compile DDMD the Saturday after DConf, which is part of the reason why we can confidently make the 20% claim in the first place (i.e., be sure that it is not a C++ vs D issue). Still, I'm confident that getting an LDC release ready would be less work than, say, properly refactoring all of Phobos to avoid allocations by using ranges.

Sorry if I appear a bit grumpy, but even though recently a number of people have been clamoring for more focus on high-impact, strategically important work, not a single one of them has shown up at the doorsteps of GDC/LDC with any patches so far. This strikes me as rather schizophrenic and dishonest, especially given that the same people are quick to mention the importance of those compilers in other contexts. Either that, or they seem to maintain the conception that DMD is somehow a viable option for performance-critical code. In the latter case, I don't have much hope for D in the long term, given that this would imply that decisions are made involving an alarming level of delusional double-think.

 - David
Jun 09 2015
"Dennis Ritchie" <dennis.ritchie mail.ru> writes:
On Tuesday, 9 June 2015 at 20:54:00 UTC, David Nadlinger wrote:
 That's not a full answer. I worked with Daniel to get LDC to 
 successfully compile DDMD the Saturday after DConf, which is 
 part of the reason why we can confidently make the 20% claim in 
 the first place (i.e., be sure that is not a C++ vs D issue).
And who will help port GDC to DDMD? It seems to me that GDC and LDC both need serious support.
Jun 09 2015
Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
On 6/9/15 1:53 PM, David Nadlinger wrote:
 On Saturday, 6 June 2015 at 03:41:13 UTC, Joakim wrote:
 On Friday, 5 June 2015 at 20:23:17 UTC, David Nadlinger wrote:
 On Monday, 1 June 2015 at 18:14:40 UTC, Andrei Alexandrescu wrote:
 As much as I want it, ddmd seems to not be happening for 2.068
 because of, simply put, insufficient resources.
Why is that? A first test release does not seem to be further away than, say, full rangeification of Phobos.
Answered earlier in the thread: http://forum.dlang.org/post/mkibq1$aj9$1 digitalmars.com
That's not a full answer. I worked with Daniel to get LDC to successfully compile DDMD the Saturday after DConf, which is part of the reason why we can confidently make the 20% claim in the first place (i.e., be sure that is not a C++ vs D issue). Still, I'm confident that getting a LDC release ready would be less work than, say, properly refactoring all of Phobos to avoid allocations by using ranges.
Yah, well, both need doing.
 Sorry if I appear a bit grumpy, but even though recently a number of
 people have been clamoring for more focus on high-impact, strategically
 important work, not a single one of them has showed up at the doorsteps
 of GDC/LDC with any patches so far. This strikes me as rather
 schizophrenic and dishonest, especially given that the same people are
 quick to mention the importance of those compilers in other contexts.
 Either that, or they seem to maintain the conception that DMD is somehow
 a viable option for performance-critical code. In the latter case, I
 don't have much hope for D in the long term, given that this would imply
 that decisions are made involving an alarming level of delusional
 double-think.
Welcome to my world; we have a table for you right here, sir.

The point is there, just a couple of things that don't sit well: "schizophrenic", "dishonest", "delusional". What's at work here is simple psychology applied to normal and well-meaning human beings, not scheming sociopaths.

Andrei
Jun 09 2015
Walter Bright <newshound2 digitalmars.com> writes:
On 6/9/2015 1:53 PM, David Nadlinger wrote:
 Sorry if I appear a bit grumpy, but even though recently a number of people have been clamoring for more focus on high-impact, strategically important work, not a single one of them has showed up at the doorsteps of GDC/LDC with any patches so far.
This is what we sign up for when working on open source software. I don't sit here only writing fun stuff; I do the dirty work that nobody else wants to do but that is critical. I've been pushing for rangification of Phobos for probably 2 years now, and essentially nothing happened. It has to be done, so I do it myself. It's just how it is when there isn't paid staff one can direct.
 This strikes me as rather schizophrenic and dishonest, especially given that the same people are quick to mention the importance of those compilers in other contexts. Either that, or they seem to maintain the conception that DMD is somehow a viable option for performance-critical code. In the latter case, I don't have much hope for D in the long term, given that this would imply that decisions are made involving an alarming level of delusional double-think.
We wouldn't have had a Win64 version of D without DMD.
Jun 09 2015
"weaselcat" <weaselcat gmail.com> writes:
On Wednesday, 10 June 2015 at 01:21:43 UTC, Walter Bright wrote:
 We wouldn't have had a Win64 version of D without DMD.
There's a 64 bit version of LDC-MSVC right on LDC's github page, does it not work?
Jun 09 2015
Walter Bright <newshound2 digitalmars.com> writes:
On 6/9/2015 6:25 PM, weaselcat wrote:
 On Wednesday, 10 June 2015 at 01:21:43 UTC, Walter Bright wrote:
 We wouldn't have had a Win64 version of D without DMD.
There's a 64 bit version of LDC-MSVC right on LDC's github page, does it not work?
Until recently it did not exist, but Win64 had become critical to D's future.
Jun 09 2015
"Tofu Ninja" <emmons0 purdue.edu> writes:
On Wednesday, 10 June 2015 at 02:25:09 UTC, Walter Bright wrote:
 On 6/9/2015 6:25 PM, weaselcat wrote:
 On Wednesday, 10 June 2015 at 01:21:43 UTC, Walter Bright 
 wrote:
 We wouldn't have had a Win64 version of D without DMD.
There's a 64 bit version of LDC-MSVC right on LDC's github page, does it not work?
Recently it did not exist, but Win64 had become critical to D's future.
Just out of curiosity, if ldc or gdc were able to bring their compile times down, would you ever consider dropping support for dmd?
Jun 09 2015
"weaselcat" <weaselcat gmail.com> writes:
On Wednesday, 10 June 2015 at 02:56:26 UTC, Tofu Ninja wrote:
 On Wednesday, 10 June 2015 at 02:25:09 UTC, Walter Bright wrote:
 On 6/9/2015 6:25 PM, weaselcat wrote:
 On Wednesday, 10 June 2015 at 01:21:43 UTC, Walter Bright 
 wrote:
 We wouldn't have had a Win64 version of D without DMD.
There's a 64 bit version of LDC-MSVC right on LDC's github page, does it not work?
Recently it did not exist, but Win64 had become critical to D's future.
Just out of curiosity, if ldc or gdc was able to bring their compile times down, would you ever consider dropping support for dmd?
ldc already compiles release builds faster than dmd and is well within 10-20% of dmd on debug builds. In my repeated tests, anyways.
Jun 09 2015
"Jack Stouffer" <jack jackstouffer.com> writes:
On Tuesday, 9 June 2015 at 20:54:00 UTC, David Nadlinger wrote:
 Sorry if I appear a bit grumpy, but even though recently a 
 number of people have been clamoring for more focus on 
 high-impact, strategically important work, not a single one of 
 them has showed up at the doorsteps of GDC/LDC with any patches 
 so far. This strikes me as rather schizophrenic and dishonest, 
 especially given that the same people are quick to mention the 
 importance of those compilers in other contexts. Either that, 
 or they seem to maintain the conception that DMD is somehow a 
 viable option for performance-critical code. In the latter 
 case, I don't have much hope for D in the long term, given that 
 this would imply that decisions are made involving an alarming 
 level of delusional double-think.
I think that a lot of the people asking for a 2.067 LDC are just users of D, and (I am including myself in this group) a lot of those people don't know the first thing about LLVM or good compiler design in general. While it may seem dishonest for people to ask for these things and not help, keep in mind that the vast majority of programmers are not even able to help.
Jun 09 2015
Rikki Cattermole <alphaglosined gmail.com> writes:
On 10/06/2015 4:44 p.m., Jack Stouffer wrote:
 On Tuesday, 9 June 2015 at 20:54:00 UTC, David Nadlinger wrote:
 [...]
I think that a lot of the people asking for a 2.067 LDC are just users of D, and (I am including myself in this group) a lot of those people don't know the first thing about LLVM or good complier design in general. While it may seem dishonest for people to ask for these things and not help, keep in mind that the vast majority of programmers are not even able to help.
I for one would love to help. But I barely understand x86. Not to mention having to get a setup going, etc. Not really worth it right now for me.

Although I'd rather work on SDC instead of LDC. Primarily because, well, it's so shinyyyyyy.

I would be happy to write a book teaching compiler development, covering everything from basic x86 encoding to complex optimization strategies. If only I knew it all. And yes, I know such books exist; they just take the wrong approach to teaching it, IMO.
Jun 09 2015
"deadalnix" <deadalnix gmail.com> writes:
On Wednesday, 10 June 2015 at 04:55:43 UTC, Rikki Cattermole 
wrote:
 [...]
I for one would love to help. But I barely understand X86. Not to mention having to get a setup going ext. Not really worth it right now for me. Although I'd rather work on SDC instead of LDC. Primarily because well it's so shinyyyyyy. I would be happy to write a book to teach compiler development from everything from basic x86 encoding to complex optimization strategies. If only I knew it and yes I know they exist just wrong method for teaching it IMO.
Lately, I've been listening to a playlist of interviews, presentations and other things involving Elon Musk. The playlist is hours long and I'm listening to it while doing other things.

After selling PayPal, Musk wanted to use part of his money to revive the desire to explore space. What he planned to do was send a plant to Mars, a very symbolic stunt that would, he hoped, renew interest in space exploration, maybe increase NASA funding or whatnot.

Thing is, he didn't know that much about space. He has a physics major, worked on batteries, and then went on to run a payment processing company. So he could have said, like you guys, "well, I don't know much about space/compilers, let's wait for others to make things happen". But nope, he went to talk to space specialists, engineers and scientists, and then got in touch with some Russians to buy refurbished ICBMs in order to start experimenting.

One of the notable things is how amazed people are that he went to buy ICBMs from the Russians. Well, guess what: that is one of the cheapest things that can go into space, so if you want to make something happen, it is an excellent starting point.

I can continue the story with myself (because everyone knows I compare to Elon in so many ways, and he is greatly inspired by my vision and capability to make things happen). Recently I got to a point on SDC where working on the GC became an important item. Thing is, I know about compilers, not memory allocators. Having low-level knowledge of how the CPU operates does not provide me wisdom about what kinds of algorithms and data structures will behave nicely on a typical workload.

So I went to read the tcmalloc source code, the jemalloc source code, libc's malloc; I read a ton of papers about various allocators, and went after Jason Evans - one of the great perks of working for Facebook is having all these amazing people who can make you feel like an idiot because they know so much more than you do - to get as much of the "why" as possible. Reading the code told me the "what/how", but that is not sufficient to get a good grasp of the matter at hand.

Making things happen is not about waiting for wisdom to fall from the sky and deliver you the deep and arcane knowledge of compilers/memory management/rocketry. It is about learning enough to get started, and then starting to do things while continuing to learn more.

To get back on point: yes, some tasks in LDC or SDC (or DMD, or GDC) require good knowledge of compiler internals. Obviously, these are compilers - and I'd add, D compilers - which involves a certain level of complexity. But let's be honest, a good chunk of the work is not guru-level compiler arcana. Most of the work is actually dumb shit that just needs to be done, like it is for all other software.

You don't wait until you can paint like Rembrandt to start painting. Because that will never happen. You just paint dumb shit again and again, trying to make the new shit a bit less shitty than the old shit. You do that while studying Rembrandt's techniques. And, after thousands of paintings, you finally get there.
Jun 10 2015
Rikki Cattermole <alphaglosined gmail.com> writes:
On 10/06/2015 7:02 p.m., deadalnix wrote:
 [...]
I'm well aware. I've been hammering away over the long term at learning the underlying technologies, for example by writing a PE-COFF linker. My experience has been, well, ugh. Let's just say: if something seems hard and almost impossible, maybe something isn't quite right.

Unfortunately I'm in a war of attrition trying to learn x86 and friends. And it's a long one!
Jun 10 2015
Iain Buclaw via Digitalmars-d <digitalmars-d puremagic.com> writes:
On 10 June 2015 at 09:11, Rikki Cattermole via Digitalmars-d <
digitalmars-d puremagic.com> wrote:

 [...]
I'm well aware. I've been hammering out over long term to learn the underlying technologies. For example writing a PE-COFF linker. My experience is well ugh lets just say, if something seems hard and almost impossible maybe something isn't quite right. Unfortunately I'm in a war of attrition trying to learn x86 and friends. And its a long one!
Good luck with that. :-)
Jun 10 2015
Rikki Cattermole <alphaglosined gmail.com> writes:
On 10/06/2015 7:35 p.m., Iain Buclaw via Digitalmars-d wrote:
 On 10 June 2015 at 09:11, Rikki Cattermole via Digitalmars-d
 <digitalmars-d puremagic.com <mailto:digitalmars-d puremagic.com>> wrote:

     On 10/06/2015 7:02 p.m., deadalnix wrote:

         On Wednesday, 10 June 2015 at 04:55:43 UTC, Rikki Cattermole wrote:

                 I think that a lot of the people asking for a 2.067 LDC
                 are just users
                 of D, and (I am including myself in this group) a lot of
                 those people
                 don't know the first thing about LLVM or good complier
                 design in
                 general. While it may seem dishonest for people to ask
                 for these things
                 and not help, keep in mind that the vast majority of
                 programmers are not
                 even able to help.


             I for one would love to help. But I barely understand X86.
             Not to
             mention having to get a setup going ext. Not really worth it
             right now
             for me.

             Although I'd rather work on SDC instead of LDC. Primarily
             because well
             it's so shinyyyyyy.

             I would be happy to write a book to teach compiler
             development from
             everything from basic x86 encoding to complex optimization
             strategies.
             If only I knew it and yes I know they exist just wrong
             method for
             teaching it IMO.


         Lately, I've been listening to a playlist of interview,
         presentations
         and other thing involving Elon Musk. The playlist is hours long
         and I'm
         listening to it while doing other things.

         After selling PayPal, Musk wanted to use part of his money to
         revive the desire to explore space. What he planned to do was
         to send a plant to Mars, a very symbolic stunt that would, he
         hoped, renew interest in space exploration, maybe increase
         NASA funding or whatnot.

         Thing is, he didn't know that much about space. He has a
         physics major, worked on batteries, and then went on to run a
         payment processing company. So he could have said, like you
         guys, "well, I don't know much about space/compilers, let's
         wait for others to make things happen". But nope, he went to
         talk to space specialists, engineers and scientists, and then
         got in touch with some Russians to buy refurbished ICBMs in
         order to start experimenting.

         One of the notable things is how amazed people are that he
         went to buy ICBMs from the Russians. Well, guess what: those
         are among the cheapest things that can go into space, so if
         you want to make something happen, they are an excellent
         starting point.

         I can continue the story with myself (because everyone knows
         I compare to Elon in so many ways, and he is greatly inspired
         by my vision and capability to make things happen). Recently
         I got to a point on SDC where working on the GC became an
         important item. Thing is, I know about compilers, not memory
         allocators. Having low-level knowledge of how the CPU
         operates does not grant me wisdom about what kinds of
         algorithms and data structures will behave nicely on a
         typical workload.

         So I went to read the tcmalloc source code, the jemalloc
         source code, libc's malloc; I read a ton of papers about
         various allocators; and I went after Jason Evans - one of the
         great perks of working for Facebook is having all these
         amazing people around who can make you feel like an idiot
         because they know so much more than you do - to get as much
         of the "why" as possible. The code told me the "what/how",
         but that is not sufficient to get a good grasp of the matter
         at hand.

         Making things happen is not about waiting for wisdom to fall
         from the sky and deliver you the deep and arcane knowledge of
         compiler/memory management/rocketry. It is about learning
         enough to get started, and then starting to do things while
         continuing to learn more.

         To get back on point, yes, some tasks in LDC or SDC (or DMD,
         or GDC) require good knowledge of compiler stuff. Obviously,
         these are compilers, and D compilers at that, which involves
         a certain level of complexity. But let's be honest, a good
         chunk of the work is not guru-level compiler arcana. Most of
         the work is actually dumb shit that just needs to be done,
         like it is for all other software.

         You don't wait until you know how to paint like Rembrandt to
         start painting. Because that will never happen. You just
         paint dumb shit again and again, trying to make the new shit
         a bit less shitty than the old shit. You do that while
         studying Rembrandt's techniques. And, after thousands of
         paintings, you finally get there.


     I'm well aware.
     I've been hammering out over long term to learn the underlying
     technologies.
     For example writing a PE-COFF linker.

     My experience is, well, ugh, let's just say: if something seems
     hard and almost impossible, maybe something isn't quite right.

     Unfortunately I'm in a war of attrition trying to learn x86 and
     friends. And it's a long one!


 Good luck with that. :-)
Thanks, I just wish we weren't in the current situation hardware-wise.
Jun 10 2015
prev sibling parent Iain Buclaw via Digitalmars-d <digitalmars-d puremagic.com> writes:
On 10 June 2015 at 09:02, deadalnix via Digitalmars-d <
digitalmars-d puremagic.com> wrote:

 On Wednesday, 10 June 2015 at 04:55:43 UTC, Rikki Cattermole wrote:

 I think that a lot of the people asking for a 2.067 LDC are just users
 of D, and (I am including myself in this group) a lot of those people
 don't know the first thing about LLVM or good compiler design in
 general. While it may seem dishonest for people to ask for these things
 and not help, keep in mind that the vast majority of programmers are not
 even able to help.
I for one would love to help. But I barely understand x86. Not to mention having to get a setup going, etc. Not really worth it right now for me.

Although I'd rather work on SDC instead of LDC. Primarily because, well, it's so shinyyyyyy.

I would be happy to write a book teaching compiler development, everything from basic x86 encoding to complex optimization strategies, if only I knew the material. And yes, I know such books exist; they just use the wrong teaching method, IMO.
Lately, I've been listening to a playlist of interviews, presentations and other things involving Elon Musk. The playlist is hours long and I'm listening to it while doing other things.

After selling PayPal, Musk wanted to use part of his money to revive the desire to explore space. What he planned to do was to send a plant to Mars, a very symbolic stunt that would, he hoped, renew interest in space exploration, maybe increase NASA funding or whatnot.

Thing is, he didn't know that much about space. He has a physics major, worked on batteries, and then went on to run a payment processing company. So he could have said, like you guys, "well, I don't know much about space/compilers, let's wait for others to make things happen". But nope, he went to talk to space specialists, engineers and scientists, and then got in touch with some Russians to buy refurbished ICBMs in order to start experimenting.

One of the notable things is how amazed people are that he went to buy ICBMs from the Russians. Well, guess what: those are among the cheapest things that can go into space, so if you want to make something happen, they are an excellent starting point.

I can continue the story with myself (because everyone knows I compare to Elon in so many ways, and he is greatly inspired by my vision and capability to make things happen). Recently I got to a point on SDC where working on the GC became an important item. Thing is, I know about compilers, not memory allocators. Having low-level knowledge of how the CPU operates does not grant me wisdom about what kinds of algorithms and data structures will behave nicely on a typical workload.

So I went to read the tcmalloc source code, the jemalloc source code, libc's malloc; I read a ton of papers about various allocators; and I went after Jason Evans - one of the great perks of working for Facebook is having all these amazing people around who can make you feel like an idiot because they know so much more than you do - to get as much of the "why" as possible.

The code told me the "what/how", but that is not sufficient to get a good grasp of the matter at hand. Making things happen is not about waiting for wisdom to fall from the sky and deliver you the deep and arcane knowledge of compiler/memory management/rocketry. It is about learning enough to get started, and then starting to do things while continuing to learn more.

To get back on point, yes, some tasks in LDC or SDC (or DMD, or GDC) require good knowledge of compiler stuff. Obviously, these are compilers, and D compilers at that, which involves a certain level of complexity. But let's be honest, a good chunk of the work is not guru-level compiler arcana. Most of the work is actually dumb shit that just needs to be done, like it is for all other software.
+1

The current 'arcane' job being done in my camp is encapsulating related codegen routines under a single umbrella.
Jun 10 2015
prev sibling next sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 6/1/2015 11:11 AM, Andrei Alexandrescu wrote:
 3. We've just had a good conference with solid content, but if our collective
 actions are to be interpreted, we did our best to be as stealth as possible.
 Please consider writing blogs, articles, tweets, posts, related to all that
 stuff. Speakers in particular should consider converting their good work into
 articles. Programmer news sites are full of Rust-related stuff; we must respond
 in kind with great D content.
I want to chime in with the fact that our most effective method of getting the word out is to write articles. Far more people will read articles (or at least scan them) than watch videos.

For those writing D tools - sitting and waiting for someone else to write articles about how cool they are IS NOT GOING TO WORK. You simply must write articles about them, or your hard work creating the tool will be in vain. Posting a link to a github repository is NOT sufficient. You have to give people a reason to look at the tool.

I know so many people who wrote cool tools and then gave up in bitterness and frustration because the tools never gained traction. There's a 100% correlation between that and failing to write any articles or attempt any promotion whatsoever. "Build It And They Will Come" is a stupid Hollywood myth. Posting "I wrote a tool, here's a github link to the source code" is doomed to oblivion.

I'll reiterate what Andrei said - speakers, you already have the material. Write a companion article! A bonus will be that it will improve your professional reputation, which is worth $$$.
Jun 01 2015
parent reply "Jonathan M Davis" <jmdavisProg gmx.com> writes:
On Monday, 1 June 2015 at 18:24:07 UTC, Walter Bright wrote:
 I want to chime in with the fact that our most effective method 
 of getting the word out is to write articles. Far more people 
 will read articles (or at least scan them) than watch videos.
I really need to get my website set up so that I can post stuff like this. I've had the domain for a while but just haven't gotten around to setting it up. :( - Jonathan M Davis
Jun 01 2015
next sibling parent reply "extrawurst" <stephan extrawurst.org> writes:
On Monday, 1 June 2015 at 21:47:46 UTC, Jonathan M Davis wrote:
 On Monday, 1 June 2015 at 18:24:07 UTC, Walter Bright wrote:
 I want to chime in with the fact that our most effective 
 method of getting the word out is to write articles. Far more 
 people will read articles (or at least scan them) than watch 
 videos.
I really need to get my website set up so that I can post stuff like this. I've had the domain for a while but just haven't gotten around to setting it up. :( - Jonathan M Davis
Maybe just use https://pages.github.com/ - it is simple to set up and easy to use, if you are used to github that is ;)
Jun 01 2015
parent "extrawurst" <stephan extrawurst.org> writes:
On Monday, 1 June 2015 at 22:22:06 UTC, extrawurst wrote:
 On Monday, 1 June 2015 at 21:47:46 UTC, Jonathan M Davis wrote:
 On Monday, 1 June 2015 at 18:24:07 UTC, Walter Bright wrote:
 I want to chime in with the fact that our most effective 
 method of getting the word out is to write articles. Far more 
 people will read articles (or at least scan them) than watch 
 videos.
I really need to get my website set up so that I can post stuff like this. I've had the domain for a while but just haven't gotten around to setting it up. :( - Jonathan M Davis
Maybe just use https://pages.github.com/ - it is simple to set up and easy to use, if you are used to github that is ;)
here is my site based on it: http://extrawurst.github.io/ (also rather #dlang centric)
Jun 01 2015
prev sibling parent reply "Mattcoder" <stop spam.com.br> writes:
On Monday, 1 June 2015 at 21:47:46 UTC, Jonathan M Davis wrote:
 I really need to get my website set up so that I can post stuff 
 like this. I've had the domain for a while but just haven't 
 gotten around to setting it up. :(

 - Jonathan M Davis
What about writing articles using medium.com? I see many articles on reddit from this site. Matheus.
Jun 01 2015
parent Walter Bright <newshound2 digitalmars.com> writes:
On 6/1/2015 5:19 PM, Mattcoder wrote:
 What about writing articles using medium.com? I see many articles on
 reddit from this site.
I've long maintained that professionals should register yourname.com and use that to publish their articles, resume, etc. It'll put you in control of your professional reputation.
Jun 01 2015
prev sibling next sibling parent reply "Joakim" <dlang joakim.fea.st> writes:
On Monday, 1 June 2015 at 18:11:32 UTC, Andrei Alexandrescu wrote:
 Per http://erdani.com/d/downloads.daily.png, the 28-day moving 
 average of daily dmd downloads is in pronounced decline 
 following a peak at the 2.067 release. It is possible that the 
 recent release of Rust 1.0 has caused that, shifting drive-by 
 experimenters to it.
I don't know that we should over-react to the recent slump, especially since it was presaged by such a large spike. I'm sure it'll pick back up again as people start watching the DConf 2015 videos. Filtering out the noise, the daily download numbers look stable around 1200 for almost the last year.

The real issue is how we take the next jump upwards. Hopefully, mobile support, which only Dan and I are working on right now, can help with that. :)
 1. It's a big bummer that nothing has happened with chopping up 
 the videos over the weekend. Right now DConf is three 6-hour 
 blobs of unstructured footage. John has warned us he might not 
 have broadband access to do so during his travels. In 
 retrospect, what we should have done was to immediately arrange 
 that John gives access to the videos to someone willing and 
 able to do the postprocessing.
Given the subpar quality of the livestream, I'm not sure we should be highlighting those videos. I've watched several hours of the livestream and the frequent audio dropouts are very annoying. What is the plan to put out the better videos recorded by the organizers: put them all out as soon as they're available, or stagger their release?

On Monday, 1 June 2015 at 18:14:40 UTC, Andrei Alexandrescu wrote:
 4. We need to marshal our efforts behind 2.068, and clarify the 
 big ticket items accomplished. I'm thinking rangification of 
 Phobos - GC no longer needed for most primitives, and 
 documented where needed. As much as I want it, ddmd seems to 
 not be happening for 2.068 because of, simply put, insufficient 
 resources.
Can you expand on why ddmd is getting delayed? I, and seemingly many others, were looking forward to ddmd. I did not see Daniel's talk as it wasn't livestreamed. Perhaps we can help get ddmd out the door.
Jun 01 2015
next sibling parent "Jonathan M Davis" <jmdavisProg gmx.com> writes:
On Monday, 1 June 2015 at 18:58:04 UTC, Joakim wrote:
 Can you expand on why ddmd is getting delayed?  I, and 
 seemingly many others, were looking forward to ddmd.  I did not 
 see Daniel's talk as it wasn't livestreamed.  Perhaps we can 
 help get ddmd out the door.
The primary hang-up is that both gdc and ldc need to be at version 2.067 so that they can be used to compile ddmd, and so that we can be sure that having 2.067 as the base for ddmd is going to work with all three compilers (also, we want to release a ddmd binary built by gdc or ldc, not dmd, so that we can avoid a speed drop in the resulting binary).

As I understand it, ldc has a blocker in LLVM which currently prevents it from updating. I don't know about gdc. But regardless, those are going to need to be taken care of first.

- Jonathan M Davis
Jun 01 2015
prev sibling next sibling parent reply Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
On 6/1/15 11:58 AM, Joakim wrote:
 What is the plan to
 put out the better videos recorded by the organizers: put them all out
 as soon as they're available or stagger their release?
All at once.
 On Monday, 1 June 2015 at 18:14:40 UTC, Andrei Alexandrescu wrote:
 4. We need to marshal our efforts behind 2.068, and clarify the big
 ticket items accomplished. I'm thinking rangification of Phobos - GC
 no longer needed for most primitives, and documented where needed. As
 much as I want it, ddmd seems to not be happening for 2.068 because
 of, simply put, insufficient resources.
Can you expand on why ddmd is getting delayed? I, and seemingly many others, were looking forward to ddmd. I did not see Daniel's talk as it wasn't livestreamed. Perhaps we can help get ddmd out the door.
We need either gdc 2.067 or ldc 2.067 released in order to build ddmd with them. Otherwise we're suffering a 20% perf loss. Andrei
Jun 01 2015
next sibling parent reply "weaselcat" <weaselcat gmail.com> writes:
On Monday, 1 June 2015 at 19:29:05 UTC, Andrei Alexandrescu wrote:
 On 6/1/15 11:58 AM, Joakim wrote:
 What is the plan to
 put out the better videos recorded by the organizers: put them 
 all out
 as soon as they're available or stagger their release?
All at once.
 On Monday, 1 June 2015 at 18:14:40 UTC, Andrei Alexandrescu 
 wrote:
 4. We need to marshal our efforts behind 2.068, and clarify 
 the big
 ticket items accomplished. I'm thinking rangification of 
 Phobos - GC
 no longer needed for most primitives, and documented where 
 needed. As
 much as I want it, ddmd seems to not be happening for 2.068 
 because
 of, simply put, insufficient resources.
Can you expand on why ddmd is getting delayed? I, and seemingly many others, were looking forward to ddmd. I did not see Daniel's talk as it wasn't livestreamed. Perhaps we can help get ddmd out the door.
We need either gdc 2.067 or ldc 2.067 released in order to build ddmd with them. Otherwise we're suffering a 20% perf loss. Andrei
at the risk of sounding like a broken record, if ldc/gdc not being 2.067 stops a DDMD release due to dmd's generated code being too slow, maybe it's time to phase dmd out ;)
Jun 01 2015
next sibling parent reply "Meta" <jared771 gmail.com> writes:
On Monday, 1 June 2015 at 19:48:01 UTC, weaselcat wrote:
 at the risk of sounding like a broken record, if ldc/gdc not 
 being 2.067 stops a DDMD release due to dmd's generated code 
 being too slow, maybe it's time to phase dmd out ;)
Once SDC is at a point where it can compile most of or all D code that DMD compiles, it would be a good replacement.
Jun 01 2015
parent reply "deadalnix" <deadalnix gmail.com> writes:
On Monday, 1 June 2015 at 19:51:44 UTC, Meta wrote:
 On Monday, 1 June 2015 at 19:48:01 UTC, weaselcat wrote:
 at the risk of sounding like a broken record, if ldc/gdc not 
 being 2.067 stops a DDMD release due to dmd's generated code 
 being too slow, maybe it's time to phase dmd out ;)
Once SDC is at a point where it can compile most of or all D code that DMD compiles, it would be a good replacement.
Waiting for your PR :)
Jun 01 2015
parent reply "Meta" <jared771 gmail.com> writes:
On Monday, 1 June 2015 at 23:06:28 UTC, deadalnix wrote:
 On Monday, 1 June 2015 at 19:51:44 UTC, Meta wrote:
 On Monday, 1 June 2015 at 19:48:01 UTC, weaselcat wrote:
 at the risk of sounding like a broken record, if ldc/gdc not 
 being 2.067 stops a DDMD release due to dmd's generated code 
 being too slow, maybe it's time to phase dmd out ;)
Once SDC is at a point where it can compile most of or all D code that DMD compiles, it would be a good replacement.
Waiting for your PR :)
I might take you up on that if I knew anything about LLVM or implementing a frontend.
Jun 01 2015
parent "deadalnix" <deadalnix gmail.com> writes:
On Tuesday, 2 June 2015 at 00:04:21 UTC, Meta wrote:
 On Monday, 1 June 2015 at 23:06:28 UTC, deadalnix wrote:
 On Monday, 1 June 2015 at 19:51:44 UTC, Meta wrote:
 On Monday, 1 June 2015 at 19:48:01 UTC, weaselcat wrote:
 at the risk of sounding like a broken record, if ldc/gdc not 
 being 2.067 stops a DDMD release due to dmd's generated code 
 being too slow, maybe it's time to phase dmd out ;)
Once SDC is at a point where it can compile most of or all D code that DMD compiles, it would be a good replacement.
Waiting for your PR :)
I might take you up on that if I knew anything about LLVM or implementing a frontend.
It is called learning by doing.
Jun 01 2015
prev sibling next sibling parent reply "Jonathan M Davis" <jmdavisProg gmx.com> writes:
On Monday, 1 June 2015 at 19:48:01 UTC, weaselcat wrote:
 at the risk of sounding like a broken record, if ldc/gdc not 
 being 2.067 stops a DDMD release due to dmd's generated code 
 being too slow, maybe it's time to phase dmd out ;)
Given how slow they are at compiling? Not a chance. dmd's speed is a huge feature.

What I would recommend (and have heard others recommend) is that development normally be done with dmd, so that you can get the fast compile-test-edit cycle that it enables, and then use gdc or ldc when you generate production code so that it'll actually be optimized properly. That way, you get fast production binaries _and_ fast compilation speed where you need it. But developing code with gdc or ldc would just be painful in comparison to developing with dmd.

- Jonathan M Davis
Jun 01 2015
parent reply "weaselcat" <weaselcat gmail.com> writes:
On Monday, 1 June 2015 at 21:21:58 UTC, Jonathan M Davis wrote:
 On Monday, 1 June 2015 at 19:48:01 UTC, weaselcat wrote:
 at the risk of sounding like a broken record, if ldc/gdc not 
 being 2.067 stops a DDMD release due to dmd's generated code 
 being too slow, maybe it's time to phase dmd out ;)
Given how slow they are at compiling? Not a chance. dmd's speed is a huge feature.
dmd's speed is fast only in comparison with C++ compilers; go runs circles around it.
 What I would recommend (and have heard others recommend) is 
 that development normally be done with dmd so that you can get 
 the fast compile-test-edit cycle that it enables and then use 
 gdc or ldc when you generate production code so that it'll 
 actually then be optimized properly. That way, you get fast 
 production binaries _and_ fast compilation speed where you need 
 it. But developing code with gdc or ldc would just be painful 
 in comparison to developing with dmd.

 - Jonathan M Davis
it seems like it would be easier to fix LDC's compiling speed than to make a 20-year-old ex-C++ backend compete with LLVM/GCC's codegen.

or else LDC and GDC are going to forever lag behind dmd due to a lack of manpower, so you have to pick between being able to have relevant bugfixes, new features, etc. from the past ~12-18, or

The way everyone says "just develop with dmd, then use LDC/GDC for speed!" is ridiculous considering I frequently have to alter my code to even work with LDC/GDC.
Jun 01 2015
next sibling parent Steven Schveighoffer <schveiguy yahoo.com> writes:
On 6/1/15 6:43 PM, weaselcat wrote:
 On Monday, 1 June 2015 at 21:21:58 UTC, Jonathan M Davis wrote:
 On Monday, 1 June 2015 at 19:48:01 UTC, weaselcat wrote:
 at the risk of sounding like a broken record, if ldc/gdc not being
 2.067 stops a DDMD release due to dmd's generated code being too
 slow, maybe it's time to phase dmd out ;)
Given how slow they are at compiling? Not a chance. dmd's speed is a huge feature.
dmd's speed is fast only in comparison with C++ compilers, go runs circles around it.
You are right! go spits out an error that my D code isn't compilable in much faster time than dmd can compile it.
 it seems like it would be easier to fix LDC's compiling speed than make
 a 20-year old ex-C++ backend be able to compete with LLVM/GCC's codegen.

 or else LDC and GDC are going to forever lag behind dmd due to a lack of
 manpower, so you have to pick between being able to have relevant
 bugfixes, new features, etc from the past ~12-18, or having code that


 The way everyone says "just develop with dmd, then use LDC/GDC for
 speed!" is ridiculous considering I frequently have to alter my code to
 even work with LDC/GDC.
This I agree with. Daniel mentioned at dconf that he would like to merge the front-ends, so the issues between compilers would be much easier to solve. -Steve
Jun 01 2015
prev sibling parent "Adam D. Ruppe" <destructionator gmail.com> writes:
On Monday, 1 June 2015 at 22:43:58 UTC, weaselcat wrote:
 dmd's speed is fast only in comparison with C++ compilers, go 
 runs circles around it.
In an apples-to-apples comparison, dmd is faster than go. That comparison can be a bit hard to set up though. If you import std.stdio for example, dmd now has to read about 100k lines of code to get all the phobos dependencies in. D can also get kinda slow with heavy ctfe... but of course, go has no ctfe at all.
Jun 01 2015
prev sibling parent reply Bruno Medeiros <bruno.do.medeiros+dng gmail.com> writes:
On 01/06/2015 20:47, weaselcat wrote:
 at the risk of sounding like a broken record, if ldc/gdc not being 2.067
 stops a DDMD release due to dmd's generated code being too slow, maybe
 it's time to phase dmd out ;)
It's past the time. The traction and support that Rust gained, even before 1.0, showed that having your primary toolchain based on GCC or LLVM is the only sustainable way forward (and even of those two, GCC might not be able to keep up with LLVM). -- Bruno Medeiros https://twitter.com/brunodomedeiros
Jun 05 2015
next sibling parent reply Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
On 6/5/15 12:02 PM, Bruno Medeiros wrote:
 The traction and support that Rust gained, even before 1.0, showed that
 having your primary toolchain based on GCC or LLVM is the only
 sustainable way forward
Well I'm wondering how that inference works. -- Andrei
Jun 05 2015
next sibling parent "deadalnix" <deadalnix gmail.com> writes:
On Friday, 5 June 2015 at 19:09:46 UTC, Andrei Alexandrescu wrote:
 On 6/5/15 12:02 PM, Bruno Medeiros wrote:
 The traction and support that Rust gained, even before 1.0, 
 showed that
 having your primary toolchain based on GCC or LLVM is the only
 sustainable way forward
Well I'm wondering how that inference works. -- Andrei
I have to say i agree with the conclusion, but I'm only puzzled by the reasoning used to come to that conclusion.
Jun 05 2015
prev sibling parent reply Bruno Medeiros <bruno.do.medeiros+dng gmail.com> writes:
On 05/06/2015 20:09, Andrei Alexandrescu wrote:
 On 6/5/15 12:02 PM, Bruno Medeiros wrote:
 The traction and support that Rust gained, even before 1.0, showed that
 having your primary toolchain based on GCC or LLVM is the only
 sustainable way forward
Well I'm wondering how that inference works. -- Andrei
Indeed I wasn't clear on that reasoning. It's not the traction that Rust gained that is the *cause* of LLVM "being the only sustainable way forward". Rather, it's simply what showed me that this was the case. Let me be more clear.

First, when I said "traction and support", I wasn't referring to the Rust user base, but rather to the number of *paid* developers[1] that Mozilla has working on the Rust project. Besides the core team, they also seem to be willing to hire programmers on a contract basis to work on certain aspects of the toolchain (see for example: https://michaelwoerister.github.io/2014/02/28/mozilla-contract.html).

But more importantly: when I finally started looking into Rust, that was also what caused me to look into LLVM properly, particularly by subscribing to the LLVM Weekly newsletter. And that's when I gradually realized all the work LLVM has done, with so many people working on it.

The recent news that Microsoft is going to work on an LLVM-based compiler (http://developers.slashdot.org/story/15/04/14/1529210/microsoft-starts-working-on-an-llvm-based-compiler-for-net) is another sign of this tidal wave we are seeing on the horizon: the future for non-proprietary languages is to be based on LLVM or GCC, or they won't be able to compete. And the toolchain always wins. It won't matter if the language is superior.

[1]: Note the point here isn't that paid developers are better than volunteer ones; the point is that they will usually have more time to work on things than volunteers. It's about the time and effort they are able to dedicate, not that they are paid per se.

-- Bruno Medeiros https://twitter.com/brunodomedeiros
Jun 05 2015
next sibling parent Bruno Medeiros <bruno.do.medeiros+dng gmail.com> writes:
On 05/06/2015 21:41, Bruno Medeiros wrote:
 But more importantly: when I finally started looking into Rust, was also
 what caused me to look into LLVM properly, particularly by subscribing
 the LLVM weekly newsletter. And that's when I gradually realized all the
 work LLVM having done, with so many people working on it.
Some things I forgot to add: it's not just the amount of resources they have, but also the direction they are going in. In particular, unlike GCC, they are not shying away from Windows and are doing a lot of work to support it natively (not just compiling, obviously, but linking with Visual Studio artifacts, and Windows debugging). Once that work matures, I reckon other toolchains will just be left in the dust.

(Windows native support is not the only reason LLVM is so promising - but it is the only one I could think of, at the moment, that could be considered a hard blocker for phasing out the DM backend)

-- Bruno Medeiros https://twitter.com/brunodomedeiros
Jun 05 2015
prev sibling parent "Dennis Ritchie" <dennis.ritchie mail.ru> writes:
On Friday, 5 June 2015 at 20:41:39 UTC, Bruno Medeiros wrote:
 And toolchain always wins. If won't matter if the language is 
 superior.
Yes, official support for iOS and Android is already in Rust, as far as I know. The toolchain for Rust was created MUCH faster than D's.

D is a very good language, but because of this confrontation the toolchain contest turns out to be not `Rust vs D` but `Digital Mars vs Mozilla` :)

The situation could be dramatically reversed in favor of D: Digital Mars + Google = Win. We need some way to attract a large company to develop the toolchain for D. Is it really impossible...
Jun 05 2015
prev sibling parent reply "extrawurst" <stephan extrawurst.org> writes:
On Friday, 5 June 2015 at 19:02:15 UTC, Bruno Medeiros wrote:
 On 01/06/2015 20:47, weaselcat wrote:
 at the risk of sounding like a broken record, if ldc/gdc not 
 being 2.067
 stops a DDMD release due to dmd's generated code being too 
 slow, maybe
 it's time to phase dmd out ;)
It's past the time. The traction and support that Rust gained, even before 1.0, showed that having your primary toolchain based on GCC or LLVM is the only sustainable way forward (and even of those two, GCC might not be able to keep with LLVM).
Nice work on the RustDT IDE btw. (https://users.rust-lang.org/t/rustdt-0-1-0-released-a-new-eclipse-rust-ide)
Jun 05 2015
parent reply Bruno Medeiros <bruno.do.medeiros+dng gmail.com> writes:
On 05/06/2015 21:29, extrawurst wrote:
 On Friday, 5 June 2015 at 19:02:15 UTC, Bruno Medeiros wrote:
 On 01/06/2015 20:47, weaselcat wrote:
 at the risk of sounding like a broken record, if ldc/gdc not being 2.067
 stops a DDMD release due to dmd's generated code being too slow, maybe
 it's time to phase dmd out ;)
It's past the time. The traction and support that Rust gained, even before 1.0, showed that having your primary toolchain based on GCC or LLVM is the only sustainable way forward (and even of those two, GCC might not be able to keep with LLVM).
Nice work on the RustDT IDE btw. (https://users.rust-lang.org/t/rustdt-0-1-0-released-a-new-eclipse-rust-ide)
Thanks! (it already has more likes/stars in Github than DDT, even though it's nowhere near as feature full :S ) -- Bruno Medeiros https://twitter.com/brunodomedeiros
Jun 10 2015
parent reply "Dennis Ritchie" <dennis.ritchie mail.ru> writes:
On Wednesday, 10 June 2015 at 11:36:56 UTC, Bruno Medeiros wrote:
 Thanks! (it already has more likes/stars in Github than DDT, 
 even though it's nowhere near as feature full :S )
It seems to me that many still do not understand what Rust is :) Many have not seen Lisp, so they think that Rust is something innovative. Its developers at least have not rid themselves of the angle-bracket syndrome and other syntactic chaff, but have only made matters worse. This language is no better than the same old C++.
Jun 10 2015
next sibling parent reply "Dicebot" <public dicebot.lv> writes:
On Wednesday, 10 June 2015 at 17:04:56 UTC, Dennis Ritchie wrote:
 On Wednesday, 10 June 2015 at 11:36:56 UTC, Bruno Medeiros 
 wrote:
 Thanks! (it already has more likes/stars in Github than DDT, 
 even though it's nowhere near as feature full :S )
It seems to me that many still do not understand what Rust is :) Many have never seen Lisp, so they think Rust is something innovative. At the very least, its developers have not rid it of the angle-bracket syndrome and other syntactic cruft; they have only made matters worse. This language is no better than C++.
Sorry, but this sounds like an extremely uneducated opinion. Rust has a very clearly defined set of values and goals. It is designed for large-scale projects that need to combine high performance with maintainability, and it does that at the cost of learning curve and rapid prototyping. A very strict and punishing compiler (with a pedantic and complicated type system) ensures that it is much harder to make accidental subtle mistakes. Even generics are completely type-checked (via traits). (Yes, I did spend quite some time playing with it.) There are a few important features missing compared to D, i.e. static reflection, and metaprogramming can only be done via AST macros. But the main issue I see is that there is no reason to pick Rust for a project with less than 50 KLOC unless you want to learn. Productivity feels very low. Still, saying that it is the "same C++" is absolutely missing the point.
Jun 10 2015
next sibling parent reply "Dennis Ritchie" <dennis.ritchie mail.ru> writes:
On Wednesday, 10 June 2015 at 17:20:12 UTC, Dicebot wrote:
 On Wednesday, 10 June 2015 at 17:04:56 UTC, Dennis Ritchie 
 wrote:
 It seems to me that many still do not understand what Rust 
 is :) Many have never seen Lisp, so they think Rust is 
 something innovative. At the very least, its developers have 
 not rid it of the angle-bracket syndrome and other syntactic 
 cruft; they have only made matters worse. This language is no 
 better than C++.
Sorry, but this sounds like an extremely uneducated opinion.
Yes, it is. I have not yet had time to play with Rust, so my opinion of it is not well informed.
 Rust has a very clearly defined set of values and goals. It is 
 designed for large scale projects that need to combine high 
 performance with maintainability and does that at cost of 
 learning curve and rapid prototyping. Very strict and punishing 
 compiler (with a pedantic and complicated type system) ensures 
 that it is much harder to make accidental subtle mistakes. Even 
 generics are completely type-checked (via traits).
OK. But is Rust better than the similarly minimalist Go? Besides, Rust has no garbage collection, at least not to date. No bounds checking of arrays.
 (yes, I did spend quite some time playing with it)
I also plan to play with Rust, but a little later.
 There are few important features missing compared to D, i.e. 
 static reflection and metaprogramming can only be done via AST 
 macros. But primarily the main issue I see is that there is no 
 reason to pick Rust for a project with less than 50 KLOC unless 
 you want to learn. Productivity feels very low.
Well, if Rust was created for huge projects, why these macros? I fear that macros are simply not needed in C-family languages. Do macros help in D? To write unbearable code? :D
 Still, saying that it is "same C++" is absolutely missing the 
 point.
Yes, I admit it is quite wrong to speak of Rust that way, but in that case it is no better than Google's Go. Is Rust better than Go for large projects?
Jun 10 2015
parent reply "Marc =?UTF-8?B?U2Now7x0eiI=?= <schuetzm gmx.net> writes:
On Wednesday, 10 June 2015 at 22:01:22 UTC, Dennis Ritchie wrote:
 No bounds checking of arrays.
Huh? Whatever gave you that impression?
 Well, if Rust was created for huge projects, why these macros? I 
 fear that macros are simply not needed in C-family languages.
 Do macros help in D? To write unbearable code? :D
They're a lot cleaner than C's macros. AFAIK, Rust's writef() equivalent is implemented with them.
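For context, Rust's format!/println! really are defined as macros in the standard library. A minimal sketch of a user-defined formatting-style wrapper (the name logf! is hypothetical, purely for illustration):

```rust
// A minimal macro_rules! macro that forwards its arguments to format!.
// `logf!` is a hypothetical name; format! and println! themselves are
// macros in Rust's standard library, not functions.
macro_rules! logf {
    ($($arg:tt)*) => {
        format!($($arg)*)
    };
}

fn main() {
    let s = logf!("{} + {} = {}", 2, 3, 2 + 3);
    assert_eq!(s, "2 + 3 = 5");
    println!("{}", s); // println! is a macro too
}
```

Unlike C preprocessor macros, these operate on token trees after parsing rather than on raw text, which is what makes them "cleaner".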
Jun 11 2015
next sibling parent reply "weaselcat" <weaselcat gmail.com> writes:
On Thursday, 11 June 2015 at 08:54:54 UTC, Marc SchĂźtz wrote:
 On Wednesday, 10 June 2015 at 22:01:22 UTC, Dennis Ritchie 
 wrote:
 No bounds checking of arrays.
Huh? Whatever gave you that impression?
 Well, if Rust was created for huge projects, why these macros? I 
 fear that macros are simply not needed in C-family languages.
 Do macros help in D? To write unbearable code? :D
They're a lot cleaner than C's macros. AFAIK, Rust's writef() equivalent is implemented with them.
Rust's macros make me wish mixins weren't so ugly to use, or we had proper AST macros.
Jun 11 2015
parent "Dennis Ritchie" <dennis.ritchie mail.ru> writes:
On Thursday, 11 June 2015 at 08:59:18 UTC, weaselcat wrote:
 On Thursday, 11 June 2015 at 08:54:54 UTC, Marc SchĂźtz wrote:
 Rust's macros make me wish mixins weren't so ugly to use, or we 
 had proper AST macros.
Perhaps much will change if D adds the $ symbol for substituting values into mixins: http://www.prowiki.org/wiki4d/wiki.cgi?DanielKeep/shfmt
Jun 11 2015
prev sibling parent "Dennis Ritchie" <dennis.ritchie mail.ru> writes:
On Thursday, 11 June 2015 at 08:54:54 UTC, Marc SchĂźtz wrote:
 On Wednesday, 10 June 2015 at 22:01:22 UTC, Dennis Ritchie 
 wrote:
 No bounds checking of arrays.
Huh? Whatever gave you that impression?
Oops. It turns out that bounds checking really is there. But I think Rust didn't have it before. Or am I wrong?
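For the record, a small sketch of what the bounds checking looks like in stable Rust:

```rust
fn main() {
    let a: &[i32] = &[10, 20, 30];

    // Indexing like a[5] is bounds-checked at runtime and panics with
    // "index out of bounds". The non-panicking alternative, get(),
    // returns an Option instead.
    assert_eq!(a[1], 20);
    assert_eq!(a.get(1), Some(&20));
    assert_eq!(a.get(5), None);
}
```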
 Well, if Rust was created for huge projects, why these macros? I 
 fear that macros are simply not needed in C-family languages.
 Do macros help in D? To write unbearable code? :D
They're a lot cleaner than C's macros. AFAIK, Rust's writef() equivalent is implemented with them.
And yet I'm not sure they are really needed, since they can easily be emulated with mixins. http://forum.dlang.org/thread/jvaizdvgzwqqpxqjiton beta.forum.dlang.org Can anybody give an example of a Rust macro that cannot be emulated with mixins?
Jun 11 2015
prev sibling parent "Dennis Ritchie" <dennis.ritchie mail.ru> writes:
On Wednesday, 10 June 2015 at 17:20:12 UTC, Dicebot wrote:
 It is designed for large scale projects that need to combine 
 high performance with maintainability and does that at cost of 
 learning curve and rapid prototyping.
High performance means LDC, but the problem is that too little is invested in it. It lags behind DMD in development.
Jun 10 2015
prev sibling parent reply "QAston" <qaston gmail.com> writes:
On Wednesday, 10 June 2015 at 17:04:56 UTC, Dennis Ritchie wrote:
 It seems to me that many still do not understand what Rust 
 is :) Many have never seen Lisp, so they think Rust is 
 something innovative. At the very least, its developers have 
 not rid it of the angle-bracket syndrome and other syntactic 
 cruft; they have only made matters worse. This language is no 
 better than C++.
I have seen and used Lisp, and still I think that Rust is innovative. Namely, the combination of a very good type system, the best resource management to date (because fully compiler-verified), and AST macros is innovative. But what I like most about it is that it wasn't created in a stroke of genius by a guy in a basement, but instead by continuous refinement of many ideas with a strong desire to keep the language small and consistent.
Jun 11 2015
parent reply "Dennis Ritchie" <dennis.ritchie mail.ru> writes:
On Thursday, 11 June 2015 at 09:12:09 UTC, QAston wrote:
 I have seen and used lisp, still I think that Rust is 
 innovative. Namely the combination of very good typesystem, 
 best to date (because fully compiler-verified) resource 
 management and AST macros is innovative.

 But what I like the most about is that it wasn't created in a 
 stike of genius by a guy in a basement, but instead by 
 continuous refinement of many ideas with strong desire to keep 
 the language small and consistent.
Yes, those are still strong arguments in favor of Rust. But D has a garbage collector and no angle-bracket syndrome. Unfortunately, I don't know Rust's templates yet, so I have nothing to say about them. Are Rust's templates better than D's?
Jun 11 2015
next sibling parent reply "QAston" <qaston gmail.com> writes:
On Thursday, 11 June 2015 at 13:16:13 UTC, Dennis Ritchie wrote:
 Yes, those are still strong arguments in favor of Rust. But D has a 
 garbage collector and no angle-bracket syndrome. 
 Unfortunately, I don't know Rust's templates yet, so I have 
 nothing to say about them.
 
 Are Rust's templates better than D's?
It's a matter of taste and I won't advocate for Rust on D forums. Syntax doesn't bother me at all as long as it's consistent, and in both D and Rust it is. For me, simple templates + simple macros are clearer than complicated templates + CTFE + mixins. There are tradeoffs there, with CTFE being an optimization at the expense of build time. On the other hand, dmd is much, much faster than the Rust compiler, which doesn't do CTFE. As for GC vs lifetimes: there are well-known tradeoffs there, but it's now at least possible to have lifetime semantics as safe as GC semantics at the expense of fighting the compiler a bit.
Jun 11 2015
next sibling parent "Dennis Ritchie" <dennis.ritchie mail.ru> writes:
On Thursday, 11 June 2015 at 13:35:25 UTC, QAston wrote:
 It's a matter of taste and I won't advocate for Rust on D 
 forums.
That's not required. But it would be nice if you could post a topic like this one in D.learn, just about Rust :) http://forum.dlang.org/thread/ujatnyfraqahrmfokcjx forum.dlang.org
 Syntax doesn't bother me at all as long as it's consistent and 
 in both D and Rust it is.
For me, syntax is very important. I want everyone to be able to understand my code, not just superstar programmers. It's no secret that many people avoid Lisp because of its syntax.
 For me simple templates + simple macros are clearer than 
 complicated templates + ctfe + mixins.
Yes, probably, it's hard to disagree.
 There are tradeoffs there, with ctfe being an optimization at 
 the expense of build time. On the other hand dmd is much much 
 faster than the Rust compiler which doesn't do ctfe.
Yes, but LDC would overtake the Rust compiler even without CTFE. Besides, the Rust compiler is also based on LLVM, so comparing DMD with the Rust compiler makes no sense.
Jun 11 2015
prev sibling next sibling parent "weaselcat" <weaselcat gmail.com> writes:
On Thursday, 11 June 2015 at 13:35:25 UTC, QAston wrote:
 For me simple templates + simple macros are clearer than 
 complicated templates + ctfe + mixins. There are tradeoffs 
 there, with ctfe being an optimization at the expense of build 
 time. On the other hand dmd is much much faster than the Rust 
 compiler which doesn't do ctfe.
all three D compilers are faster than rustc; it's honestly very slow, and that was one of my major gripes with Rust - I get distracted easily and like short build times : )
Jun 11 2015
prev sibling parent reply "Dicebot" <public dicebot.lv> writes:
On Thursday, 11 June 2015 at 13:35:25 UTC, QAston wrote:
 For me simple templates + simple macros are clearer than 
 complicated templates + ctfe + mixins.
My experience of explaining those concepts to other people indicates otherwise. D templates and mixins are dirty but also very simple concepts that pretty much any new programmer gets quickly and intuitively, learning how to do more complicated magic in small steps. Macros are just never really simple, as they operate within the AST domain, and programmers not familiar with language development tend to be more comfortable thinking at the source-code level. It is a more robust power tool for experienced developers, but that does not feel like a good justification.
Jun 11 2015
next sibling parent "Dennis Ritchie" <dennis.ritchie mail.ru> writes:
On Thursday, 11 June 2015 at 15:08:48 UTC, Dicebot wrote:
 My experience of explaining those concepts to other people 
 indicates otherwise. D templates and mixins are dirty but also 
 very simple concepts that pretty much any new programmer gets 
 quickly and intuitively, learning how to do more complicated 
 magic in small steps.
I totally agree with that, because I have checked it on myself. For a long time I did not understand C++ templates, but I grasped D's templates pretty quickly, which then helped me fully understand C++ templates. As it turned out, I'm not the only one :)
Jun 11 2015
prev sibling parent "QAston" <qastonx gmail.com> writes:
On Thursday, 11 June 2015 at 15:08:48 UTC, Dicebot wrote:
 My experience of explaining those concepts to other people 
 indicates otherwise. D templates and mixins are dirty but also 
 very simple concepts that pretty much any new programmer gets 
 quickly and intuitively, learning how to do more complicated 
 magic in small steps. Macros are just never really simple, as 
 they operate within the AST domain, and programmers not 
 familiar with language development tend to be more comfortable 
 thinking at the source-code level. It is a more robust power 
 tool for experienced developers, but that does not feel like a 
 good justification.
I bet you don't show the ranges implementation to those people, or they don't write new ones. Yes, predicates are simple and powerful combined with conditional compilation, and there's even a certain elegance to that. But proper usage, for example to implement a custom range or a wrapper, is difficult, as seen here on the forums. Still, it's way, way easier than in C++. In the end D has much more powerful compile-time capabilities (except encapsulating and manipulating syntax, for lack of macros; D can only encapsulate declarations). The price paid for those is in possible tooling: no analysis can be done on an arbitrary predicate to suggest how to match it, whereas with simple (not Turing-complete) macros and templates that's easy to do. It's a tradeoff; pick what you like. Rust's macros may feel difficult to understand exactly because they're limited and therefore can't use regular runtime APIs. There are macro systems which allow arbitrary functions creating ASTs; Lisp is an example. Yes, Lisps had CTFE before it was cool. Just like you glue strings in D using the runtime API, you glue lists in Lisp using the runtime API. It has the same consequences: Lisp macros can't really be checked without a runtime in the background running the macros, and there are no suggestions available for their usage.
Jun 11 2015
prev sibling parent reply "Dicebot" <public dicebot.lv> writes:
On Thursday, 11 June 2015 at 13:16:13 UTC, Dennis Ritchie wrote:
 Are Rust's templates better than D's?
Those are considerably less powerful:
- can only have type arguments
- no variadic argument list support
- no arbitrary condition constraints (thus only partial duck typing support)

On the other hand they have one important advantage: all type arguments must comply with one or more traits, and thus the bodies of generics are checked before instantiation. You are only allowed to call methods and operations of generic arguments that are defined in the relevant trait. This is a huge win for code hygiene compared to D. Any sort of more advanced metaprogramming can only be done via AST macros, which is currently the biggest downside in my eyes when it comes to features. Though quite some people like that.
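A small sketch of the trait-bound point (function and bound names here are illustrative, not from the thread): the generic body is checked once against its bounds, at the definition, so using anything the bounds don't promise is an error even before any instantiation:

```rust
use std::fmt::Display;
use std::ops::Add;

// The body may only use what the bounds promise: Add, Display, Copy.
// Uncommenting the a.len() line is rejected at the definition site,
// before any instantiation -- unlike D, where template bodies are
// checked per instantiation.
fn describe_sum<T: Add<Output = T> + Display + Copy>(a: T, b: T) -> String {
    // let _ = a.len(); // error: no method `len` found for type `T`
    format!("{}", a + b)
}

fn main() {
    assert_eq!(describe_sum(2, 3), "5");
    assert_eq!(describe_sum(1.5, 2.5), "4");
}
```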
Jun 11 2015
next sibling parent reply "Dennis Ritchie" <dennis.ritchie mail.ru> writes:
On Thursday, 11 June 2015 at 15:03:39 UTC, Dicebot wrote:
 Those are considerably less powerful:
 - can only have type arguments
 - no variadic argument list support
 - no arbitrary condition constraints (thus only partial duck 
 typing support)

 On the other hand they have one important advantage: all type 
 arguments must comply with one or more traits and thus bodies of 
 generics are checked before instantiation. You are only allowed 
 to call methods and operations of generic arguments that are 
 defined in the relevant trait. This is a huge win for code hygiene 
 compared to D.

 Any sort of more advanced meta-programming things can only be 
 done via AST macros which is currently the biggest downside in 
 my eyes when it comes to features. Though quite some people 
 like that.
The fact that there is no support for variadic arguments is really negative. It is possible that Walter and Andrei are against macros because of this:

macro_rules! o_O {
    (
        $(
            $x:expr; [ $( $y:expr ),* ]
        );*
    ) => {
        &[ $($( $x + $y ),*),* ]
    }
}

fn main() {
    let a: &[i32] = o_O!(10; [1, 2, 3];
                         20; [4, 5, 6]);

    assert_eq!(a, [11, 12, 13, 24, 25, 26]);
}

It looks disgusting! ;)
Jun 11 2015
next sibling parent reply Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
On 6/11/15 9:33 AM, Dennis Ritchie wrote:
 On Thursday, 11 June 2015 at 15:03:39 UTC, Dicebot wrote:
 Those are considerably less powerful:
 - can only have type arguments
 - no variadic argument list support
 - no arbitrary condition constraints (thus only partial duck typing
 support)

 On the other hand they have one important advantage: all type
 arguments must comply with one or more traits and thus bodies of
 generics are checked before instantiation. You are only allowed to call
 methods and operations of generic arguments that are defined in
 the relevant trait. This is a huge win for code hygiene compared to D.

 Any sort of more advanced meta-programming things can only be done via
 AST macros which is currently the biggest downside in my eyes when it
 comes to features. Though quite some people like that.
The fact that there is no support for variadic arguments is really negative. It is possible that Walter and Andrei are against macros because of this:

macro_rules! o_O {
    (
        $(
            $x:expr; [ $( $y:expr ),* ]
        );*
    ) => {
        &[ $($( $x + $y ),*),* ]
    }
}

fn main() {
    let a: &[i32] = o_O!(10; [1, 2, 3];
                         20; [4, 5, 6]);

    assert_eq!(a, [11, 12, 13, 24, 25, 26]);
}

It looks disgusting! ;)
Is that actual Rust code that compiles and runs? -- Andrei
Jun 11 2015
parent reply "Dennis Ritchie" <dennis.ritchie mail.ru> writes:
On Thursday, 11 June 2015 at 17:41:49 UTC, Andrei Alexandrescu 
wrote:
 It is possible that Walter and Andrei against macro because of 
 this:

 macro_rules! o_O {
      (
          $(
              $x:expr; [ $( $y:expr ),* ]
          );*
      ) => {
          &[ $($( $x + $y ),*),* ]
      }
 }

 fn main() {
      let a: &[i32]
          = o_O!(10; [1, 2, 3];
                 20; [4, 5, 6]);

      assert_eq!(a, [11, 12, 13, 24, 25, 26]);
 }

 It looks disgusting! ;)
Is that actual Rust code that compiles and runs? -- Andrei
Yes, this code works. I took it from here: https://doc.rust-lang.org/stable/book/macros.html#repetition
Jun 11 2015
parent Bruno Medeiros <bruno.do.medeiros+dng gmail.com> writes:
On 11/06/2015 18:48, Dennis Ritchie wrote:
 On Thursday, 11 June 2015 at 17:41:49 UTC, Andrei Alexandrescu wrote:
 It is possible that Walter and Andrei against macro because of this:

 macro_rules! o_O {
      (
          $(
              $x:expr; [ $( $y:expr ),* ]
          );*
      ) => {
          &[ $($( $x + $y ),*),* ]
      }
 }

 fn main() {
      let a: &[i32]
          = o_O!(10; [1, 2, 3];
                 20; [4, 5, 6]);

      assert_eq!(a, [11, 12, 13, 24, 25, 26]);
 }

 It looks disgusting! ;)
Is that actual Rust code that compiles and runs? -- Andrei
Yes, this code works. I took it from here: https://doc.rust-lang.org/stable/book/macros.html#repetition
OMG, and it's not just an obscure example someone came up with, but actually a snippet from the official documentation! That said, it does look a bit less crazy when looking at it with the syntax highlighting. -- Bruno Medeiros https://twitter.com/brunodomedeiros
Jun 12 2015
prev sibling parent reply "deadalnix" <deadalnix gmail.com> writes:
On Thursday, 11 June 2015 at 16:33:04 UTC, Dennis Ritchie wrote:
 It is possible that Walter and Andrei against macro because of 
 this:

 macro_rules! o_O {
     (
         $(
             $x:expr; [ $( $y:expr ),* ]
         );*
     ) => {
         &[ $($( $x + $y ),*),* ]
     }
 }

 fn main() {
     let a: &[i32]
         = o_O!(10; [1, 2, 3];
                20; [4, 5, 6]);

     assert_eq!(a, [11, 12, 13, 24, 25, 26]);
 }

 It looks disgusting! ;)
This baffles me. It seems that language designers always need to fuck up macros in one of two ways:
- Creating a new API to spawn ASTs, which becomes a burden on compiler development (bonus points if you expose compiler internals).
- Creating a new syntax, preferably completely inscrutable, so you can pretend you are a guru while using it.

There is a good way to express an AST in a language, and it is the way you express everything else in the program: you use the damn language syntax and grammar. On that one, Lisp gets it right, except that its general lack of grammar and syntax (really, it is not that Lisp has a lot of (), it is that everything else has been removed) ends up creating a new set of problems. It is just like constexpr vs CTFE: one now has to learn a new language to do computation at compile time (constexpr), when using the same language (CTFE) reduces the burden on the dev and the compiler writer.
Jun 12 2015
parent reply Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
On 6/12/15 10:26 AM, deadalnix wrote:
 This baffles me. It seems that language designer always need to fuck up
 macros either by:
   - Creating a new API to spawn AST, which become a burden on the
 compiler development (bonus point if you expose compiler internal).
   - Creating a new syntax, preferably completely inscrutable so you can
 pretend you are a guru while using it.

 There is a good way to express AST in a language, and this is how you do
 it for everything else in the program : you use the damn language syntax
 and grammar.
In their defense, there is a need for a metalanguage as well, to manipulate the ASTs themselves. That's why all macro systems just look a bit off. I suggest you give it a shot at defining a natural-looking macro system if you're up to it, but I suspect there are many subtleties to cope with. Andrei
Jun 12 2015
next sibling parent "Tofu Ninja" <emmons0 purdue.edu> writes:
On Friday, 12 June 2015 at 17:58:22 UTC, Andrei Alexandrescu 
wrote:
 On 6/12/15 10:26 AM, deadalnix wrote:
 This baffles me. It seems that language designer always need 
 to fuck up
 macros either by:
  - Creating a new API to spawn AST, which become a burden on 
 the
 compiler development (bonus point if you expose compiler 
 internal).
  - Creating a new syntax, preferably completely inscrutable so 
 you can
 pretend you are a guru while using it.

 There is a good way to express AST in a language, and this is 
 how you do
 it for everything else in the program : you use the damn 
 language syntax
 and grammar.
In their defense, there is a need for a metalanguage as well, to manipulate the ASTs themselves. That's why all macro systems just look a bit off. I suggest you give it a shot at defining a natural-looking macro system if you're up to it, but I suspect there are many subtleties to cope with. Andrei
Good timing. Check the thread I made about statement mixins; it has a pretty natural and simple macro syntax that is an extension of the mixin syntax. Though admittedly there is no way to manipulate the AST.
Jun 12 2015
prev sibling parent reply "deadalnix" <deadalnix gmail.com> writes:
On Friday, 12 June 2015 at 17:58:22 UTC, Andrei Alexandrescu 
wrote:
 In their defense, there is a need for a metalanguage as well do 
 manipulate ASTs themselves. That's why all macro systems just 
 look a bit off. I suggest you give it a shot at defining a 
 natural-looking macro system if you're up to something, but I 
 suspect there are many subtleties to cope with.

 Andrei
I'm not sure this is really needed when you have features like static if. At some point, too much power creates absurd language complication, user bewilderment, and diminishing returns.
Jun 12 2015
parent Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
On 6/12/15 12:04 PM, deadalnix wrote:
 On Friday, 12 June 2015 at 17:58:22 UTC, Andrei Alexandrescu wrote:
 In their defense, there is a need for a metalanguage as well do
 manipulate ASTs themselves. That's why all macro systems just look a
 bit off. I suggest you give it a shot at defining a natural-looking
 macro system if you're up to something, but I suspect there are many
 subtleties to cope with.

 Andrei
I'm not sure this is really needed when you have features like static if. At some point, too much power creates absurd language complication, user bewilderment, and diminishing returns.
Yah, I was thinking in abstract, not for D. D rox. -- Andrei
Jun 12 2015
prev sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 6/11/2015 8:03 AM, Dicebot wrote:
 On the other hand they have one important advantage: all type arguments must
 comply to one or more trairs and thus bodies of generics are checked before
 institation. You are only allowed to call methods and operations of generic
 arguments that are defined in relevan trait. This is huge win for code hygiene
 compared to D.
On the other hand, generic bodies in D can inquire if various additional traits are available, and then adapt:

struct S(R) if (isInputRange!R) {
    ...
    static if (isForwardRange!R) {
        R save() {
            auto result = this;
            result.r = r.save;
            return result;
        }
    }
    ...
}

This kind of thing is used extensively in Phobos generics.
Jun 11 2015
next sibling parent reply "Meta" <jared771 gmail.com> writes:
On Thursday, 11 June 2015 at 19:31:52 UTC, Walter Bright wrote:
 On 6/11/2015 8:03 AM, Dicebot wrote:
 On the other hand they have one important advantage: all type 
 arguments must
 comply with one or more traits and thus bodies of generics are 
 checked before
 instantiation. You are only allowed to call methods and 
 operations of generic
 arguments that are defined in the relevant trait. This is a huge win 
 for code hygiene
 compared to D.
On the other hand, generic bodies in D can inquire if various additional traits are available, and then adapt:

struct S(R) if (isInputRange!R) {
    ...
    static if (isForwardRange!R) {
        R save() {
            auto result = this;
            result.r = r.save;
            return result;
        }
    }
    ...
}

This kind of thing is used extensively in Phobos generics.
It's not *quite* the same. I believe Rust traits are closer to C++ concepts.
Jun 11 2015
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 6/11/2015 1:18 PM, Meta wrote:
 It's not *quite* the same. I believe Rust traits are closer to C++ concepts.
I suspect C++ concepts have the same limitation.
Jun 11 2015
parent Artur Skawina via Digitalmars-d <digitalmars-d puremagic.com> writes:
On 06/11/15 23:44, Walter Bright via Digitalmars-d wrote:
 On 6/11/2015 1:18 PM, Meta wrote:
 It's not *quite* the same. I believe Rust traits are closer to C++ concepts.
I suspect C++ concepts have the same limitation.
It's not a limitation, but a feature. Your example could look more or less like this in a D-with-traits language:

struct S(InputRange R) {
    ...
    static if (alias fr = ForwardRange!r) {
        R save() {
            auto result = this;
            result.r = fr.save;
            return result;
        }
    }
    ...
}

IOW the problem can easily be solved, even if Rust and C++ concepts don't support this functionality. artur
Jun 12 2015
prev sibling parent "Dicebot" <public dicebot.lv> writes:
On Thursday, 11 June 2015 at 19:31:52 UTC, Walter Bright wrote:
 On 6/11/2015 8:03 AM, Dicebot wrote:
 On the other hand they have one important advantage: all type 
 arguments must
 comply with one or more traits and thus bodies of generics are 
 checked before
 instantiation. You are only allowed to call methods and 
 operations of generic
 arguments that are defined in the relevant trait. This is a huge win 
 for code hygiene
 compared to D.
On the other hand, generic bodies in D can inquire if various additional traits are available, and then adapt:

struct S(R) if (isInputRange!R) {
    ...
    static if (isForwardRange!R) {
        R save() {
            auto result = this;
            result.r = r.save;
            return result;
        }
    }
    ...
}

This kind of thing is used extensively in Phobos generics.
That was exactly what I referred to with "no arbitrary condition constraints (thus only partial duck typing)"
Jun 11 2015
prev sibling parent reply "deadalnix" <deadalnix gmail.com> writes:
On Monday, 1 June 2015 at 19:29:05 UTC, Andrei Alexandrescu wrote:
 We need either gdc 2.067 or ldc 2.067 released in order to 
 build ddmd with them. Otherwise we're suffering a 20% perf loss.
Note that this is a general problem, not limited to DDMD. We should definitely tie GDC and LDC closer to DMD development. It is common for many to update to the latest version of DMD, usually because they need a bugfix, and then suffer that 20% slowdown because LDC and GDC are not ready.
Jun 01 2015
parent "Jonathan M Davis" <jmdavisProg gmx.com> writes:
On Monday, 1 June 2015 at 23:05:47 UTC, deadalnix wrote:
 On Monday, 1 June 2015 at 19:29:05 UTC, Andrei Alexandrescu 
 wrote:
 We need either gdc 2.067 or ldc 2.067 released in order to 
 build ddmd with them. Otherwise we're suffering a 20% perf 
 loss.
Note that this is a general problem, not limited to DDMD. We should definitely tie GDC and LDC closer to DMD development. It is common for many to update to the latest version of DMD, usually because they need a bugfix, and then suffer that 20% slowdown because LDC and GDC are not ready.
Well, Daniel's big priority right now seems to be making it so that dmd, gdc, and ldc can have _exactly_ the same frontend code so that the updates required for gdc and ldc for new releases will be relatively minimal. Once that's done, the process for getting all of the backends ready for a new frontend release should be much faster. - Jonathan M Davis
Jun 01 2015
prev sibling parent reply "Israel" <tl12000 live.com> writes:
On Monday, 1 June 2015 at 18:58:04 UTC, Joakim wrote:
 On Monday, 1 June 2015 at 18:11:32 UTC, Andrei Alexandrescu 
 wrote:
 Per http://erdani.com/d/downloads.daily.png, the 28-day moving 
 average of daily dmd downloads is in pronounced decline 
 following a peak at the 2.067 release. It is possible that the 
 recent release of Rust 1.0 has caused that, shifting drive-by 
 experimenters to it.
I don't know that we should over-react to the recent slump, especially since it was presaged by such a large spike. I'm sure it'll pick back up again as people start watching the Dconf 2015 videos. Filtering out the noise, the daily download numbers look stable around 1200 for almost the last year. The real issue is how we take the next jump upwards. Hopefully, mobile support, which only Dan and I are working on right now, can help with that. :)
 1. It's a big bummer that nothing has happened with chopping 
 up the videos over the weekend. Right now DConf is three 
 6-hour blobs of unstructured footage. John has warned us he 
 might not have broadband access to do so during his travels. 
 In retrospect, what we should have done was to immediately 
 arrange that John gives access to the videos to someone 
 willing and able to do the postprocessing.
Given the subpar quality of the livestream, I'm not sure we should be highlighting those videos. I've watched several hours of the livestream and the frequent audio dropouts are very annoying. What is the plan to put out the better videos recorded by the organizers: put them all out as soon as they're available or stagger their release? On Monday, 1 June 2015 at 18:14:40 UTC, Andrei Alexandrescu wrote:
 4. We need to marshal our efforts behind 2.068, and clarify 
 the big ticket items accomplished. I'm thinking rangification 
 of Phobos - GC no longer needed for most primitives, and 
 documented where needed. As much as I want it, ddmd seems to 
 not be happening for 2.068 because of, simply put, 
 insufficient resources.
Can you expand on why ddmd is getting delayed? I, and seemingly many others, were looking forward to ddmd. I did not see Daniel's talk as it wasn't livestreamed. Perhaps we can help get ddmd out the door.
I'm curious, is the effort putting more focus towards iOS or Android? Also, it would probably help if the version numbers didn't increment so slowly. Maybe 2.10 for DDMD?
Jun 01 2015
next sibling parent "Joakim" <dlang joakim.fea.st> writes:
On Monday, 1 June 2015 at 19:35:16 UTC, Israel wrote:
 I'm curious: is the effort putting more focus towards iOS or Android?
Neither: Dan has been working on iOS, while I focus on Android. iOS support has been further along for some time, though, as Dan has said in another thread: http://forum.dlang.org/thread/mailman.284.1432823930.7663.digitalmars-d puremagic.com?page=6#post-m2wpznsx37.fsf:40comcast.net
Jun 01 2015
prev sibling parent Manu via Digitalmars-d <digitalmars-d puremagic.com> writes:
On 2 June 2015 at 05:35, Israel via Digitalmars-d
<digitalmars-d puremagic.com> wrote:
 On Monday, 1 June 2015 at 18:58:04 UTC, Joakim wrote:

 I'm curious: is the effort putting more focus towards iOS or Android?

 Also, it would probably help if the version numbers didn't increment so slowly. Maybe 2.10 for DDMD?
This is a good idea. It's subtle, but probably significant.

My impression from discussing D with colleagues is that it's almost entirely a matter of perception. (...well, that and the hard blockers I've mentioned elsewhere, but they don't often know about those; I keep that to myself ;) There's an important element of psychology in how we present ourselves that we aren't quite getting right.

I think the presentation needs to change so that the general impression moves from an assorted collection of OSS community projects to something more akin to 'a product', with releases that are controlled, deliberate, and coordinated between all the significant parts.

Considering Andrei's comparison to Rust earlier: Rust does present itself as confident, deliberate, and unified. Whether or not that's true, it's the impression I have of Rust, and I'm not alone; my colleagues who have investigated it independently have come to the same conclusion. It's only perception, but perception is very powerful, and people seem to have more confidence in Rust at the moment.

We need to make sure the public-facing media makes the proper impression, and I think the key to that is making sure all the fragments of the ecosystem work properly together. I would say this is more important than language development for the moment. Package management (in one way or another) is probably a very important focus right now.
Jun 01 2015
prev sibling next sibling parent reply Nick Sabalausky <SeeWebsiteToContactMe semitwist.com> writes:
On 06/01/2015 02:11 PM, Andrei Alexandrescu wrote:
 All of us who have an interest in D to succeed must understand there is
 also a proportional sense of duty. If you can do X and don't, it can be
 safely assumed X will just not get done at all. Which means whatever you
 can do, please just do it, do it now, and stay with it until it's done.
https://www.youtube.com/watch?v=6eX3fiQLo84
Jun 01 2015
parent reply Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
On 6/1/15 2:29 PM, Nick Sabalausky wrote:
 On 06/01/2015 02:11 PM, Andrei Alexandrescu wrote:
 All of us who have an interest in D to succeed must understand there is
 also a proportional sense of duty. If you can do X and don't, it can be
 safely assumed X will just not get done at all. Which means whatever you
 can do, please just do it, do it now, and stay with it until it's done.
https://www.youtube.com/watch?v=6eX3fiQLo84
Thought you were busy working on rdmd :o). -- Andrei
Jun 01 2015
parent Nick Sabalausky <SeeWebsiteToContactMe semitwist.com> writes:
On 06/01/2015 05:37 PM, Andrei Alexandrescu wrote:
 On 6/1/15 2:29 PM, Nick Sabalausky wrote:
 On 06/01/2015 02:11 PM, Andrei Alexandrescu wrote:
 All of us who have an interest in D to succeed must understand there is
 also a proportional sense of duty. If you can do X and don't, it can be
 safely assumed X will just not get done at all. Which means whatever you
 can do, please just do it, do it now, and stay with it until it's done.
https://www.youtube.com/watch?v=6eX3fiQLo84
Thought you were busy working on rdmd :o). -- Andrei
There's always time for a Belushi reference! :)
Jun 01 2015
prev sibling next sibling parent reply "Paulo Pinto" <pjmlp progtools.org> writes:
On Monday, 1 June 2015 at 18:11:32 UTC, Andrei Alexandrescu wrote:
 Per http://erdani.com/d/downloads.daily.png, the 28-day moving 
 average of daily dmd downloads is in pronounced decline 
 following a peak at the 2.067 release. It is possible that the 
 recent release of Rust 1.0 has caused that, shifting drive-by 
 experimenters to it.

 We need to act on this on multiple fronts.

 1. It's a big bummer that nothing has happened with chopping up 
 the videos over the weekend. Right now DConf is three 6-hour 
 blobs of unstructured footage. John has warned us he might not 
 have broadband access to do so during his travels. In 
 retrospect, what we should have done was to immediately arrange 
 that John gives access to the videos to someone willing and 
 able to do the postprocessing.

 2. It's an equally big bummer that "This Week in D" failed to 
 be there on Sunday night. I completely understand Adam's 
 overhead, what with his still traveling and all, but the bottom 
 line is if it's not every Sunday it's not steady and if it's 
 not steady it's not. Again, in retrospect it seems we need 
 backup plans for when the protagonist of whatever important 
 activity is unable to carry it. Who'd like to double Adam on 
 this?

 3. We've just had a good conference with solid content, but if 
 our collective actions are to be interpreted, we did our best 
 to be as stealth as possible. Please consider writing blogs, 
 articles, tweets, posts, related to all that stuff. Speakers in 
 particular should consider converting their good work into 
 articles. Programmer news sites are full of Rust-related stuff; 
 we must respond in kind with great D content.

 All of us who have an interest in D to succeed must understand 
 there is also a proportional sense of duty. If you can do X and 
 don't, it can be safely assumed X will just not get done at 
 all. Which means whatever you can do, please just do it, do it 
 now, and stay with it until it's done.


 Thanks,

 Andrei
For me, any language that helps reduce the amount of C-like code that keeps getting written is a good thing, so I cherish all languages pursuing that goal.

Anyway, I was going to post something about IO, but it can wait. I just did a short post to raise awareness of the conference and the availability of the slides and unedited videos.

-- Paulo
Jun 01 2015
parent "Adam D. Ruppe" <destructionator gmail.com> writes:
On Monday, 1 June 2015 at 21:37:49 UTC, Paulo Pinto wrote:
 Just did a short post to create awareness for the conference, 
 slides and unedited videos availability.
What's the link?
Jun 01 2015
prev sibling next sibling parent reply "Adam D. Ruppe" <destructionator gmail.com> writes:
On Monday, 1 June 2015 at 18:11:32 UTC, Andrei Alexandrescu wrote:
 2. It's an equally big bummer that "This Week in D" failed to 
 be there on Sunday night. I completely understand Adam's 
 overhead, what with his still traveling and all, but the bottom 
 line is if it's not every Sunday it's not steady and if it's 
 not steady it's not. Again, in retrospect it seems we need 
 backup plans for when the protagonist of whatever important 
 activity is unable to carry it. Who'd like to double Adam on 
 this?
It is true that I'm extra busy this week (I return from Utah tomorrow; I'm still out here visiting all my peeps); yesterday I was on the computer for about one hour total. But it's also that this week is a gigantic special edition: I'm writing up summaries of all the talks, pointing out important take-aways, and commenting on various between-talk conversations. Alas, this takes a while to write too; I might finish tonight, but might not. It'll be worth it, though.

However, I don't intend to post it without this written up. The piecemeal DConf releases on reddit last year were pretty much a complete failure; the biggest comments they attracted were from people complaining that it was taking too long!
Jun 01 2015
parent Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
On 6/1/15 2:42 PM, Adam D. Ruppe wrote:
 However, I don't intend to post it without this written up.
Alternative: pipeline day 1 while you work on days 2 and 3. -- Andrei
Jun 01 2015
prev sibling next sibling parent reply "John Colvin" <john.loughran.colvin gmail.com> writes:
On Monday, 1 June 2015 at 18:11:32 UTC, Andrei Alexandrescu wrote:
 1. It's a big bummer that nothing has happened with chopping up 
 the videos over the weekend. Right now DConf is three 6-hour 
 blobs of unstructured footage. John has warned us he might not 
 have broadband access to do so during his travels. In 
 retrospect, what we should have done was to immediately arrange 
 that John gives access to the videos to someone willing and 
 able to do the postprocessing.
It's proving impractical to get this done from hotel/coffeeshop internet connections. YouTube's online editor doesn't appear to be able to deal with videos this long, so it's a matter of downloading the entire videos, editing them locally, and then uploading them again. However, what I can do is give links to the start points of every talk, so here they are:

Day 1:
Brian Schott: https://youtu.be/ep5vDQq15as
Liran Zvibel: https://youtu.be/-OCl-jWyT9E?t=3720
David Nadlinger: https://youtu.be/-OCl-jWyT9E?t=3720
Amaury Sechet: https://youtu.be/-OCl-jWyT9E?t=10189
Walter & Andrei AUA: https://youtu.be/-OCl-jWyT9E?t=14477

Day 2:
Chuck Allison: https://youtu.be/AH35IxWkx8M?t=182
Lightning Talks:
Jonathan Crapuchettes: https://youtu.be/AH35IxWkx8M?t=4088
Adam Ruppe: https://youtu.be/AH35IxWkx8M?t=4477
Lionello Lunesu: https://youtu.be/AH35IxWkx8M?t=4929
Erik Smith: https://www.youtube.com/watch?v=AH35IxWkx8M
Walter Bright: https://youtu.be/AH35IxWkx8M?t=5896
Mihails Strasuns: https://youtu.be/AH35IxWkx8M?t=8031
Andy Smith: https://youtu.be/AH35IxWkx8M?t=15825
Jonathan Davis: https://youtu.be/AH35IxWkx8M?t=19410
Mark Isaacson: https://youtu.be/AH35IxWkx8M?t=23056
Andrei Alexandrescu: https://youtu.be/AH35IxWkx8M?t=23056
Open Mic / Q+A: https://youtu.be/AH35IxWkx8M?t=27449

Day 3:
Andrei Alexandrescu: https://youtu.be/oA1exjdEIWw?t=44
Adam Ruppe: https://youtu.be/oA1exjdEIWw?t=4350
Joseph Wakeling: https://youtu.be/oA1exjdEIWw?t=7617
John Colvin: https://youtu.be/oA1exjdEIWw?t=12105
Atila Neves: https://youtu.be/oA1exjdEIWw?t=16190
Erich Gubler: https://youtu.be/oA1exjdEIWw?t=19178
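For anyone assembling these into a playlist or index page, the `?t=` values are plain second offsets. A small illustrative helper (the `hms` function is mine, not anything from the thread; the names and offsets are copied from the Day 3 list) can render them as the H:MM:SS timestamps YouTube displays:

```python
def hms(seconds):
    """Format a second offset as H:MM:SS, as YouTube displays it."""
    h, rem = divmod(seconds, 3600)
    m, s = divmod(rem, 60)
    return f"{h}:{m:02d}:{s:02d}"

# Day 3 talks with their ?t= second offsets, taken from the list above.
day3 = [
    ("Andrei Alexandrescu", 44),
    ("Adam Ruppe", 4350),
    ("Joseph Wakeling", 7617),
    ("John Colvin", 12105),
    ("Atila Neves", 16190),
    ("Erich Gubler", 19178),
]

for name, t in day3:
    print(f"{hms(t)}  {name}  https://youtu.be/oA1exjdEIWw?t={t}")
```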
Jun 01 2015
parent "Vladimir Panteleev" <vladimir thecybershadow.net> writes:
On Tuesday, 2 June 2015 at 03:02:23 UTC, John Colvin wrote:
 It's proving impractical to get this done from hotel/coffeeshop 
 internet connections. Youtube's online editor doesn't appear to 
 be up to be able to deal with videos this long, so it's a 
 matter of downloading the entire videos, editing them locally 
 and then uploading again.
I could do this (I have 100Mbit at home). Should I? And should I just upload them to my personal YouTube account?
Jun 02 2015
prev sibling next sibling parent reply "Vladimir Panteleev" <vladimir thecybershadow.net> writes:
On Monday, 1 June 2015 at 18:11:32 UTC, Andrei Alexandrescu wrote:
 1. It's a big bummer that nothing has happened with chopping up 
 the videos over the weekend. Right now DConf is three 6-hour 
 blobs of unstructured footage. John has warned us he might not 
 have broadband access to do so during his travels. In 
 retrospect, what we should have done was to immediately arrange 
 that John gives access to the videos to someone willing and 
 able to do the postprocessing.
Are the streamed videos the only videos available? What about the original plan to record the talks, were those canceled because someone started streaming with a laptop?
Jun 02 2015
next sibling parent "extrawurst" <stephan extrawurst.org> writes:
On Tuesday, 2 June 2015 at 09:35:52 UTC, Vladimir Panteleev wrote:
 On Monday, 1 June 2015 at 18:11:32 UTC, Andrei Alexandrescu 
 wrote:
 1. It's a big bummer that nothing has happened with chopping 
 up the videos over the weekend. Right now DConf is three 
 6-hour blobs of unstructured footage. John has warned us he 
 might not have broadband access to do so during his travels. 
 In retrospect, what we should have done was to immediately 
 arrange that John gives access to the videos to someone 
 willing and able to do the postprocessing.
Are the streamed videos the only videos available? What about the original plan to record the talks, were those canceled because someone started streaming with a laptop?
No, there are HQ recordings coming: http://forum.dlang.org/thread/sujyaurgyfumoiimixmx forum.dlang.org#post-mkjtgl:242ck2:241:40digitalmars.com
Jun 02 2015
prev sibling parent reply "Jonathan M Davis" <jmdavisProg gmx.com> writes:
On Tuesday, 2 June 2015 at 09:35:52 UTC, Vladimir Panteleev wrote:
 On Monday, 1 June 2015 at 18:11:32 UTC, Andrei Alexandrescu 
 wrote:
 1. It's a big bummer that nothing has happened with chopping 
 up the videos over the weekend. Right now DConf is three 
 6-hour blobs of unstructured footage. John has warned us he 
 might not have broadband access to do so during his travels. 
 In retrospect, what we should have done was to immediately 
 arrange that John gives access to the videos to someone 
 willing and able to do the postprocessing.
Are the streamed videos the only videos available? What about the original plan to record the talks, were those canceled because someone started streaming with a laptop?
The talks were recorded and will be at higher quality than the live stream, but I don't know when they'll be done. Given that postprocessing is being done to insert the slides into them and whatnot, it'll take longer than simply handing us the raw videos. I would have thought that we'd just wait for those rather than chopping up the live stream and presenting that to folks, but I guess Andrei wants to make the talks available as quickly as possible. - Jonathan M Davis
Jun 02 2015
parent reply "wobbles" <grogan.colin gmail.com> writes:
On Tuesday, 2 June 2015 at 09:53:32 UTC, Jonathan M Davis wrote:
 On Tuesday, 2 June 2015 at 09:35:52 UTC, Vladimir Panteleev 
 wrote:
 On Monday, 1 June 2015 at 18:11:32 UTC, Andrei Alexandrescu 
 wrote:
 1. It's a big bummer that nothing has happened with chopping 
 up the videos over the weekend. Right now DConf is three 
 6-hour blobs of unstructured footage. John has warned us he 
 might not have broadband access to do so during his travels. 
 In retrospect, what we should have done was to immediately 
 arrange that John gives access to the videos to someone 
 willing and able to do the postprocessing.
Are the streamed videos the only videos available? What about the original plan to record the talks, were those canceled because someone started streaming with a laptop?
The talks were recorded and will be at higher quality than the live stream, but I don't know when they'll be done. Given that postprocessing is being done to them to insert the slides into them and whatnot, it'll take them longer than simply giving us the videos. I would have thought that we'd just wait for those rather than trying to chop up the live stream and present that to folks, but I guess that Andrei wants to make the talks available as quickly as possible. - Jonathan M Davis
Personally, while I appreciate John's efforts with the livestream and then chopping the videos up, I'm waiting for the professionally done versions. I just can't bring myself to follow the choppy audio :)
Jun 02 2015
parent "extrawurst" <stephan extrawurst.org> writes:
On Tuesday, 2 June 2015 at 10:54:12 UTC, wobbles wrote:
 On Tuesday, 2 June 2015 at 09:53:32 UTC, Jonathan M Davis wrote:
 On Tuesday, 2 June 2015 at 09:35:52 UTC, Vladimir Panteleev 
 wrote:
 On Monday, 1 June 2015 at 18:11:32 UTC, Andrei Alexandrescu 
 wrote:
 1. It's a big bummer that nothing has happened with chopping 
 up the videos over the weekend. Right now DConf is three 
 6-hour blobs of unstructured footage. John has warned us he 
 might not have broadband access to do so during his travels. 
 In retrospect, what we should have done was to immediately 
 arrange that John gives access to the videos to someone 
 willing and able to do the postprocessing.
Are the streamed videos the only videos available? What about the original plan to record the talks, were those canceled because someone started streaming with a laptop?
The talks were recorded and will be at higher quality than the live stream, but I don't know when they'll be done. Given that postprocessing is being done to them to insert the slides into them and whatnot, it'll take them longer than simply giving us the videos. I would have thought that we'd just wait for those rather than trying to chop up the live stream and present that to folks, but I guess that Andrei wants to make the talks available as quickly as possible. - Jonathan M Davis
Personally, while I appreciate Johns efforts with the livestream and then chopping the videos up, I'm waiting for the professionally done versions. I just cant bring myself to follow the choppy audio :)
Yeah, me neither. I mean, it was good as an ad-hoc live stream, but I'd even suggest taking the videos offline, to stop people asking whether those are the only videos and to avoid making an even poorer impression outside this forum.
Jun 02 2015
prev sibling next sibling parent "Binarydepth" <binarydepth gmail.com> writes:
On Monday, 1 June 2015 at 18:11:32 UTC, Andrei Alexandrescu wrote:
 Per http://erdani.com/d/downloads.daily.png, the 28-day moving 
 average of daily dmd downloads is in pronounced decline 
 following a peak at the 2.067 release. It is possible that the 
 recent release of Rust 1.0 has caused that, shifting drive-by 
 experimenters to it.

 We need to act on this on multiple fronts.

 1. It's a big bummer that nothing has happened with chopping up 
 the videos over the weekend. Right now DConf is three 6-hour 
 blobs of unstructured footage. John has warned us he might not 
 have broadband access to do so during his travels. In 
 retrospect, what we should have done was to immediately arrange 
 that John gives access to the videos to someone willing and 
 able to do the postprocessing.

 2. It's an equally big bummer that "This Week in D" failed to 
 be there on Sunday night. I completely understand Adam's 
 overhead, what with his still traveling and all, but the bottom 
 line is if it's not every Sunday it's not steady and if it's 
 not steady it's not. Again, in retrospect it seems we need 
 backup plans for when the protagonist of whatever important 
 activity is unable to carry it. Who'd like to double Adam on 
 this?

 3. We've just had a good conference with solid content, but if 
 our collective actions are to be interpreted, we did our best 
 to be as stealth as possible. Please consider writing blogs, 
 articles, tweets, posts, related to all that stuff. Speakers in 
 particular should consider converting their good work into 
 articles. Programmer news sites are full of Rust-related stuff; 
 we must respond in kind with great D content.

 All of us who have an interest in D to succeed must understand 
 there is also a proportional sense of duty. If you can do X and 
 don't, it can be safely assumed X will just not get done at 
 all. Which means whatever you can do, please just do it, do it 
 now, and stay with it until it's done.


 Thanks,

 Andrei
Hi Andrei,

Regarding the DConf videos: I have only seen them on YouTube, and the problem with YouTube is that the videos are not structured chronologically. I see two options: a YouTube playlist, or an HTML page here at dlang.org that presents them in order.

BD
Jun 10 2015
prev sibling parent reply "Binarydepth" <binarydepth gmail.com> writes:
On Monday, 1 June 2015 at 18:11:32 UTC, Andrei Alexandrescu wrote:
 Per http://erdani.com/d/downloads.daily.png, the 28-day moving 
 average of daily dmd downloads is in pronounced decline 
 following a peak at the 2.067 release. It is possible that the 
 recent release of Rust 1.0 has caused that, shifting drive-by 
 experimenters to it.

 We need to act on this on multiple fronts.

 1. It's a big bummer that nothing has happened with chopping up 
 the videos over the weekend. Right now DConf is three 6-hour 
 blobs of unstructured footage. John has warned us he might not 
 have broadband access to do so during his travels. In 
 retrospect, what we should have done was to immediately arrange 
 that John gives access to the videos to someone willing and 
 able to do the postprocessing.

 Andrei
I can do that. Count me in. BD
Jun 10 2015
parent reply Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
On 6/10/15 10:36 AM, Binarydepth wrote:
 On Monday, 1 June 2015 at 18:11:32 UTC, Andrei Alexandrescu wrote:
 Per http://erdani.com/d/downloads.daily.png, the 28-day moving average
 of daily dmd downloads is in pronounced decline following a peak at
 the 2.067 release. It is possible that the recent release of Rust 1.0
 has caused that, shifting drive-by experimenters to it.

 We need to act on this on multiple fronts.

 1. It's a big bummer that nothing has happened with chopping up the
 videos over the weekend. Right now DConf is three 6-hour blobs of
 unstructured footage. John has warned us he might not have broadband
 access to do so during his travels. In retrospect, what we should have
 done was to immediately arrange that John gives access to the videos
 to someone willing and able to do the postprocessing.

 Andrei
I can do that. Count me in.
Thanks, you're a little late :o). That's been done, and we're waiting for the official videos now. -- Andrei
Jun 10 2015
parent reply "Binarydepth" <binarydepth gmail.com> writes:
On Wednesday, 10 June 2015 at 18:04:45 UTC, Andrei Alexandrescu 
wrote:
 On 6/10/15 10:36 AM, Binarydepth wrote:
 On Monday, 1 June 2015 at 18:11:32 UTC, Andrei Alexandrescu 
 wrote:
 Per http://erdani.com/d/downloads.daily.png, the 28-day 
 moving average
 of daily dmd downloads is in pronounced decline following a 
 peak at
 the 2.067 release. It is possible that the recent release of 
 Rust 1.0
 has caused that, shifting drive-by experimenters to it.

 We need to act on this on multiple fronts.

 1. It's a big bummer that nothing has happened with chopping 
 up the
 videos over the weekend. Right now DConf is three 6-hour 
 blobs of
 unstructured footage. John has warned us he might not have 
 broadband
 access to do so during his travels. In retrospect, what we 
 should have
 done was to immediately arrange that John gives access to the 
 videos
 to someone willing and able to do the postprocessing.

 Andrei
I can do that. Count me in.
Thanks, you're a little late :o). That's been done, and we're waiting for the official videos now. -- Andrei
Good! And what do you think about structuring the videos in a YouTube playlist or an HTML page here on dlang.org?
Jun 10 2015
next sibling parent Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
On 6/10/15 11:10 AM, Binarydepth wrote:
 Good! And what do you think about structuring the videos in a youtube
 playlist or HTML page here on dlang.org ?
That'd be great to do on dconf.org with the final videos. -- Andrei
Jun 10 2015
prev sibling parent "Adam D. Ruppe" <destructionator gmail.com> writes:
On Wednesday, 10 June 2015 at 18:10:55 UTC, Binarydepth wrote:
 Good! And what do you think about structuring the videos in a 
 youtube playlist or HTML page here on dlang.org ?
I'm also writing up articles about each talk in This Week in D, btw: http://arsdnet.net/this-week-in-d/jun-07.html The Wednesday ones are done; I'm trying to finish the rest by this weekend.
Jun 10 2015