
digitalmars.D - Where is the D deep learning library?

reply Guillaume Piolat <first.last gmail.com> writes:
With the recent popularity of Machine Learning, and all the
achievements we see, where is the D alternative in this area?

C++'s offering makes a lot of use of meta-programming already:

https://www.reddit.com/r/programming/comments/4py875/dlib_190_clean_c11_deep_learning_api/?ref=share&ref_source=link

Surely a touch of DbI and D's meta power could help!
Jun 27 2016
next sibling parent reply Jack Stouffer <jack jackstouffer.com> writes:
On Monday, 27 June 2016 at 14:10:15 UTC, Guillaume Piolat wrote:
 With the recent popularity of Machine Learning, and all the
 achievements we see, where is the D alternative in this area?

 C++'s offering makes a lot of use of meta-programming already:

 https://www.reddit.com/r/programming/comments/4py875/dlib_190_clean_c11_deep_learning_api/?ref=share&ref_source=link

 Surely a touch of DbI and D's meta power could help!
Building such a library is a lot of work; if you're only working on it in your free time, it's a multi-year task. Sklearn, for example, took about four years before it was in a state where people wanted to use it, and that's built on top of many C and C++ projects. Sure, there are a lot of C libraries that D could provide wrappers for, but that's not what you're talking about. A true D library that takes advantage of the compile-time features would have to include thousands of new lines of D code. Plus, in order to be taken seriously, any new code must be as performant as possible, so the writer must be skilled at writing low-level code for a lot of platforms, which takes a lot of time.
Jun 27 2016
parent reply Guillaume Piolat <first.last gmail.com> writes:
On Monday, 27 June 2016 at 15:18:09 UTC, Jack Stouffer wrote:
 On Monday, 27 June 2016 at 14:10:15 UTC, Guillaume Piolat wrote:
 With the recent popularity of Machine Learning, and all the
 achievements we see, where is the D alternative in this area?

 C++'s offering makes a lot of use of meta-programming already:

 https://www.reddit.com/r/programming/comments/4py875/dlib_190_clean_c11_deep_learning_api/?ref=share&ref_source=link

 Surely a touch of DbI and D's meta power could help!
Building such a library is a lot of work; if you're only working on it in your free time, it's a multi-year task. Sklearn, for example, took about four years before it was in a state where people wanted to use it, and that's built on top of many C and C++ projects. Sure, there are a lot of C libraries that D could provide wrappers for, but that's not what you're talking about. A true D library that takes advantage of the compile-time features would have to include thousands of new lines of D code. Plus, in order to be taken seriously, any new code must be as performant as possible, so the writer must be skilled at writing low-level code for a lot of platforms, which takes a lot of time.
Well, I get the manpower thing; everything we do is quite labour-intensive. I'm just curious that nobody has started such an effort (there have been such efforts for DlangScience, gamedev, the web...). And with such a hyped area, success is a real possibility for someone who is a domain expert. You don't need to match the manpower and stability of the established solution to make something useful. Perhaps there could be something distinctive enough to make it attractive?
Jun 27 2016
next sibling parent reply jmh530 <john.michael.hall gmail.com> writes:
On Monday, 27 June 2016 at 15:31:07 UTC, Guillaume Piolat wrote:
 Well, I get the manpower thing; everything we do is quite
 labour-intensive. I'm just curious that nobody has started such
 an effort (there have been such efforts for DlangScience,
 gamedev, the web...). And with such a hyped area, success is a
 real possibility for someone who is a domain expert.
Do you have a particular need for a machine learning library? I'm just asking because I feel like the examples you mentioned (DlangScience, gamedev, web) are areas where there are people actively working on things in large part because they want to use them on specific applications. It seems like it is much easier to get people to contribute when they actually need something.
Jun 27 2016
parent reply Guillaume Piolat <first.last gmail.com> writes:
On Monday, 27 June 2016 at 16:41:15 UTC, jmh530 wrote:
 On Monday, 27 June 2016 at 15:31:07 UTC, Guillaume Piolat wrote:
 Well, I get the manpower thing; everything we do is quite
 labour-intensive. I'm just curious that nobody has started such
 an effort (there have been such efforts for DlangScience,
 gamedev, the web...). And with such a hyped area, success is a
 real possibility for someone who is a domain expert.
Do you have a particular need for a machine learning library? I'm just asking because I feel like the examples you mentioned (DlangScience, gamedev, web) are areas where there are people actively working on things in large part because they want to use them on specific applications. It seems like it is much easier to get people to contribute when they actually need something.
Not yet, but it could be useful for new types of audio effects and specific tasks like voiced/unvoiced detection.
Jun 27 2016
parent reply Martin Nowak <code+news.digitalmars dawg.eu> writes:
On 06/27/2016 08:13 PM, Guillaume Piolat wrote:
 Not yet, but it could be useful for new types of audio effects and
 specific tasks like voiced/unvoiced detection.
There are many simpler solutions for that than using machine learning. Writing a simple neural network with backpropagation is fairly trivial; if you had emulating existing audio effects in mind, though, I'm not sure how well that would work.
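For what it's worth, a toy version really does fit on a page. Below is a minimal sketch of a one-hidden-layer network trained with backpropagation on XOR, using plain arrays and no external libraries; the layer size, learning rate and epoch count are arbitrary illustration choices, nothing tuned:

    // A one-hidden-layer network trained with backpropagation on XOR.
    // Plain double arrays; sizes, learning rate and epoch count are
    // arbitrary illustration choices.
    import std.math : exp;
    import std.random : uniform;
    import std.stdio : writefln;

    double sigmoid(double x) { return 1.0 / (1.0 + exp(-x)); }

    void main()
    {
        enum nIn = 2, nHid = 3;
        enum double rate = 0.5;

        // Hidden-layer weights (last column is the bias) and output weights.
        auto w1 = new double[][](nHid, nIn + 1);
        auto w2 = new double[](nHid + 1);
        foreach (row; w1) foreach (ref w; row) w = uniform(-0.5, 0.5);
        foreach (ref w; w2) w = uniform(-0.5, 0.5);

        double[2][4] xs = [[0, 0], [0, 1], [1, 0], [1, 1]];
        double[4] ys = [0, 1, 1, 0];

        foreach (epoch; 0 .. 20_000)
        {
            foreach (i, x; xs)
            {
                // Forward pass.
                double[nHid] hid;
                foreach (j; 0 .. nHid)
                    hid[j] = sigmoid(w1[j][0] * x[0] + w1[j][1] * x[1] + w1[j][2]);
                double net = w2[nHid];
                foreach (j; 0 .. nHid) net += w2[j] * hid[j];
                immutable outp = sigmoid(net);

                // Backward pass: output delta, then hidden deltas (chain rule).
                immutable dOut = (ys[i] - outp) * outp * (1 - outp);
                double[nHid] dHid;
                foreach (j; 0 .. nHid)
                    dHid[j] = dOut * w2[j] * hid[j] * (1 - hid[j]);

                // Gradient step on both layers.
                foreach (j; 0 .. nHid) w2[j] += rate * dOut * hid[j];
                w2[nHid] += rate * dOut;
                foreach (j; 0 .. nHid)
                {
                    w1[j][0] += rate * dHid[j] * x[0];
                    w1[j][1] += rate * dHid[j] * x[1];
                    w1[j][2] += rate * dHid[j];
                }
            }
        }

        // The trained outputs should approximate 0, 1, 1, 0.
        foreach (x; xs)
        {
            double[nHid] hid;
            foreach (j; 0 .. nHid)
                hid[j] = sigmoid(w1[j][0] * x[0] + w1[j][1] * x[1] + w1[j][2]);
            double net = w2[nHid];
            foreach (j; 0 .. nHid) net += w2[j] * hid[j];
            writefln("%s -> %.3f", x, sigmoid(net));
        }
    }

Scaling something like this up to convolutional layers, SIMD kernels and GPU support is where the real manpower problem discussed above starts.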
Jun 27 2016
parent reply jmh530 <john.michael.hall gmail.com> writes:
On Monday, 27 June 2016 at 22:17:55 UTC, Martin Nowak wrote:
 On 06/27/2016 08:13 PM, Guillaume Piolat wrote:
 Not yet, but it could be useful for new types of audio effects 
 and specific tasks like voiced/unvoiced detection.
There are many simpler solutions for that than using machine learning. Writing a simple neural network with backpropagation is fairly trivial; if you had emulating existing audio effects in mind, though, I'm not sure how well that would work.
I could probably write a simple backpropagation one, but I would probably screw something up if I wrote my own convolutional neural network. My suggestion is that anyone interested in deep learning might want to break it up into some more manageable projects. For instance, one of the features of TensorFlow is auto-differentiation: you can provide it arbitrary functions and it will calculate the gradients for you exactly instead of relying on numerical estimates. Autodiff involves building an AST for a function and then walking it to generate the gradient. A D autodiff library would probably be easier to write than a comparable one in other languages, since it could take advantage of all the compile-time functionality. Alternately, TensorFlow also works well with heterogeneous systems, so any work that improves D's capabilities with OpenCL/CUDA or MPI would make it easier to develop a D deep learning library.
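To make the autodiff point concrete, here is a minimal sketch of the simpler forward-mode flavour, using dual numbers and operator overloading rather than the AST-walking approach described above; the names (Dual, dexp, derivative) are made up for this illustration:

    // Forward-mode automatic differentiation with dual numbers.
    // Each Dual carries a value and its derivative; arithmetic on Duals
    // propagates exact derivatives via the chain rule.
    import std.math : exp;
    import std.stdio : writefln;

    struct Dual
    {
        double val;  // function value
        double der;  // derivative with respect to the chosen input

        Dual opBinary(string op)(Dual rhs) const
        {
            static if (op == "+") return Dual(val + rhs.val, der + rhs.der);
            else static if (op == "-") return Dual(val - rhs.val, der - rhs.der);
            else static if (op == "*") // product rule
                return Dual(val * rhs.val, der * rhs.val + val * rhs.der);
            else static if (op == "/") // quotient rule
                return Dual(val / rhs.val,
                            (der * rhs.val - val * rhs.der) / (rhs.val * rhs.val));
            else static assert(0, "unsupported operator " ~ op);
        }
    }

    // exp lifted to Duals, with its exact derivative.
    Dual dexp(Dual x) { return Dual(exp(x.val), x.der * exp(x.val)); }

    // Exact derivative of a function of one Dual argument at x.
    double derivative(alias f)(double x)
    {
        return f(Dual(x, 1.0)).der;  // seed the input's derivative with 1
    }

    void main()
    {
        // f(x) = x * exp(x); its exact derivative is (1 + x) * exp(x).
        static Dual f(Dual x) { return x * dexp(x); }
        writefln("f'(2) = %.6f (exact %.6f)", derivative!f(2.0), 3.0 * exp(2.0));
    }

A reverse-mode (backprop-style) implementation is where building and walking an expression graph pays off, since one backward pass yields gradients with respect to thousands of parameters at once, and that is where D's compile-time introspection could shine.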
Jun 27 2016
parent reply Nicholas Wilson <iamthewilsonator hotmail.com> writes:
On Tuesday, 28 June 2016 at 03:29:46 UTC, jmh530 wrote:
 On Monday, 27 June 2016 at 22:17:55 UTC, Martin Nowak wrote:
 [...]
I could probably write a simple backpropagation one, but I would probably screw something up if I wrote my own convolutional neural network. [...]
I am planning to provide OpenCL and CUDA targets for ldc in the next few weeks, probably starting in earnest next week, allowing code to be compiled for and called on the GPU automagically, with reflection.
Jun 28 2016
parent jmh530 <john.michael.hall gmail.com> writes:
On Tuesday, 28 June 2016 at 08:14:07 UTC, Nicholas Wilson wrote:
 I am planning to provide OpenCL and CUDA targets for ldc in the
 next few weeks, probably starting in earnest next week, allowing
 code to be compiled for and called on the GPU automagically,
 with reflection.
Cool.
Jun 28 2016
prev sibling next sibling parent reply Jack Stouffer <jack jackstouffer.com> writes:
On Monday, 27 June 2016 at 15:31:07 UTC, Guillaume Piolat wrote:
 Well, I get the manpower thing; everything we do is quite
 labour-intensive. I'm just curious that nobody has started such
 an effort (there have been such efforts for DlangScience,
 gamedev, the web...).
D doesn't even have the building blocks in D code. Ndslice and the upcoming BLAS in D are necessary steps if we want an idiomatic D machine learning library. And that's just the foundational work.
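To give a sense of what the ndslice building block already provides, here is a small sketch using the std.experimental.ndslice module that currently ships with DMD; it only demonstrates n-dimensional views over flat data:

    // N-dimensional views over a flat range, without copying the data.
    import std.experimental.ndslice;
    import std.range : iota;
    import std.stdio : writeln;

    void main()
    {
        auto matrix = 12.iota.sliced(3, 4); // view 0 .. 11 as a 3x4 matrix
        writeln(matrix[1, 2]);              // element access: prints 6
        foreach (row; matrix)               // each row is itself a 1-D slice
            writeln(row);                   // [0, 1, 2, 3], [4, 5, 6, 7], ...
    }

Fast kernels (matrix multiplication, convolution) on top of views like these are the missing BLAS-level piece, which is exactly the foundational work mentioned above.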
 You don't need to match the manpower and stability of the 
 established solution to make something useful. Perhaps there 
 could be something distinctive enough to make it attractive?
Unfortunately, academics (who are usually the target audience for these libraries) are very set in their ways. You'll need something that surpasses the functionality of the current solutions and is just as easy to use in order for it to catch on :/
Jun 27 2016
parent Seb <seb wilzba.ch> writes:
On Monday, 27 June 2016 at 18:01:54 UTC, Jack Stouffer wrote:
 On Monday, 27 June 2016 at 15:31:07 UTC, Guillaume Piolat wrote:
 Well, I get the manpower thing; everything we do is quite
 labour-intensive. I'm just curious that nobody has started such
 an effort (there have been such efforts for DlangScience,
 gamedev, the web...).
D doesn't even have the building blocks in D code. Ndslice and the upcoming BLAS in D are necessary steps if we want an idiomatic D machine learning library. And that's just the foundational work.
 You don't need to match the manpower and stability of the 
 established solution to make something useful. Perhaps there 
 could be something distinctive enough to make it attractive?
Unfortunately, academics (who are usually the target audience for these libraries) are very set in their ways. You'll need something that surpasses the functionality of the current solutions and is just as easy to use in order for it to catch on :/
It should be added that mir, the development library where the upcoming BLAS in D will be hosted, also plans to integrate building blocks for ML. In fact, Ilya already pushed an online LDA algorithm [1]. At the moment we lack the manpower to focus more on this, but any help is of course appreciated ;-) See this issue for details: https://github.com/libmir/mir/issues/166

[1] http://docs.mir.dlang.io/latest/mir_model_lda_hoffman.html
Jun 28 2016
prev sibling parent reply Martin Nowak <code+news.digitalmars dawg.eu> writes:
On 06/27/2016 05:31 PM, Guillaume Piolat wrote:
 You don't need to match the manpower and stability of the established
 solution to make something useful. Perhaps there could be something
 distinctive enough to make it attractive?
Much faster turnaround times are a very distinctive feature. Many science people find it somewhat normal to wait a whole day for Matlab. I just rewrote a numpy prototype and achieved a 25x speedup. That's game changing b/c it allowed me to actually iterate on ideas.
Jun 27 2016
parent reply jmh530 <john.michael.hall gmail.com> writes:
On Monday, 27 June 2016 at 22:24:00 UTC, Martin Nowak wrote:
 normal to wait a whole day for Matlab.
Why I started learning D...
Jun 27 2016
parent bachmeier <no spam.net> writes:
On Monday, 27 June 2016 at 22:41:16 UTC, jmh530 wrote:
 On Monday, 27 June 2016 at 22:24:00 UTC, Martin Nowak wrote:
 normal to wait a whole day for Matlab.
Why I started learning D...
Me too, but then I found out that I could write correct code much faster in D. Quite different from C and C++. And also that it is fun writing D.
Jun 27 2016
prev sibling parent reply Dejan Lekic <dejan.lekic gmail.com> writes:
On Monday, 27 June 2016 at 14:10:15 UTC, Guillaume Piolat wrote:
 With the recent popularity of Machine Learning, and all the
 achievements we see, where is the D alternative in this area?

 C++'s offering makes a lot of use of meta-programming already:

 https://www.reddit.com/r/programming/comments/4py875/dlib_190_clean_c11_deep_learning_api/?ref=share&ref_source=link

 Surely a touch of DbI and D's meta power could help!
Thanks for reminding me why I stopped doing C++ programming...

When I saw that...

    using LeNet = loss_multiclass_log<
        fc<10,
        relu<fc<84,
        relu<fc<120,
        max_pool<2,2,2,2,relu<con<16,5,5,1,1,
        max_pool<2,2,2,2,relu<con<6,5,5,1,1,
        input<matrix<unsigned char>>>>>>>>>>>>>>;

... I got a headache...
Jun 28 2016
parent Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= writes:
On Tuesday, 28 June 2016 at 17:17:47 UTC, Dejan Lekic wrote:
 Thanks for reminding me why I stopped doing C++ programming...

 When I saw that...

     using LeNet = loss_multiclass_log<
         fc<10,
         relu<fc<84,
         relu<fc<120,
         max_pool<2,2,2,2,relu<con<16,5,5,1,1,
         max_pool<2,2,2,2,relu<con<6,5,5,1,1,
         input<matrix<unsigned char>>>>>>>>>>>>>>;

 ... I got a headache...
You like this better?

    alias LeNet = loss_multiclass_log!(
        fc!10,
        relu!(fc!84,
        relu!(fc!120,
        max_pool!(2,2,2,2,relu!(con!(16,5,5,1,1,
        max_pool!(2,2,2,2,relu!(con!(6,5,5,1,1,
        input!(matrix!(ubyte)))))))))));
Jun 28 2016