
digitalmars.D.learn - Arrays - Inserting and moving data

reply "MattCodr" <matheus_nab hotmail.com> writes:
I have a question about the best way to insert data into an 
array, moving (not replacing) the existing elements.

For example,

In some cases when I want to do the action above, I loop over 
the array, moving the data past the point that I want, and 
finally I insert the new data there.


In D I did this:

begin code
.
.
.
    int[] arr = [0,1,2,3,4,5,6,7,8,9];

    arr.insertInPlace(position, newValue);
    arr.popBack();
.
.
.
end code


After the insertInPlace my array changed its length to 11, so I 
use arr.popBack(); to keep the array length at 10.

The code above works well; I just want to know whether there is 
a better way.
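For reference, here is a complete, runnable version of the sketch above. The helper name `insertKeepLength` and the concrete values for the elided `position`/`newValue` are only illustrative:

```d
import std.array : insertInPlace;
import std.range : popBack;

// Insert `value` at `pos` while keeping the array at its original
// length: insertInPlace grows the array by one element, popBack then
// drops the trailing element again. (`insertKeepLength` is a made-up
// name; `pos`/`value` stand in for the elided position/newValue.)
int[] insertKeepLength(int[] arr, size_t pos, int value)
{
    arr.insertInPlace(pos, value); // length 10 -> 11
    arr.popBack();                 // back to length 10
    return arr;
}
```

For example, `insertKeepLength([0,1,2,3,4,5,6,7,8,9], 3, 42)` yields `[0,1,2,42,3,4,5,6,7,8]`.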

Thanks,

Matheus.
Feb 09 2012
next sibling parent Pedro Lacerda <pslacerda gmail.com> writes:

I __believe__ that insertInPlace doesn't shift the elements, but uses an
appender to allocate another array instead.
Maybe this function does what you want.


    int[] arr = [0,1,2,3,4,5,6,7,8,9];

    void maybe(T)(T[] arr, size_t pos, T value) {
        size_t i;
        // shift everything after pos one slot to the right,
        // overwriting the last element (the length stays the same)
        for (i = arr.length - 1; i > pos; i--) {
            arr[i] = arr[i-1];
        }
        arr[i] = value; // place the new element at pos
    }

    maybe(arr, 3, 0);
    maybe(arr, 0, 1);
    assert(arr == [1, 0, 1, 2, 0, 3, 4, 5, 6, 7]);
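For comparison, the appender strategy mentioned above could be sketched like this. This is illustrative only; the real insertInPlace implementation may differ, and `insertViaAppender` is a made-up name:

```d
import std.array : appender;

// Build a new array with an appender instead of shifting in place --
// a rough sketch of the "allocate another array" strategy, not
// necessarily how Phobos implements insertInPlace.
int[] insertViaAppender(int[] arr, size_t pos, int value)
{
    auto app = appender!(int[])();
    app.reserve(arr.length + 1);
    app.put(arr[0 .. pos]); // everything before the insertion point
    app.put(value);         // the new element
    app.put(arr[pos .. $]); // the tail, shifted right by one
    return app.data;
}
```

For example, `insertViaAppender([0,1,2,3], 1, 9)` yields `[0, 9, 1, 2, 3]`; note that, unlike the shifting version, the result is one element longer than the input.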



2012/2/9 MattCodr <matheus_nab hotmail.com>

 I have a doubt about the best way to insert and move (not replace) some
 data on an array.

 For example,

 In some cases if I want to do action above, I do a loop moving the data
 until the point that I want and finally I insert the new data there.


 In D I did this:

 begin code
 .
 .
 .
   int[] arr = [0,1,2,3,4,5,6,7,8,9];

   arr.insertInPlace(position, newValue);
   arr.popBack();
 .
 .
 .
 end code


 After the insertInPlace my array changed it's length to 11, so I use
 arr.popBack(); to keep the array length = 10;

 The code above is working well, I just want know if is there a better way?

 Thanks,

 Matheus.

Feb 09 2012
prev sibling next sibling parent "MattCodr" <matheus_nab hotmail.com> writes:
On Thursday, 9 February 2012 at 12:51:09 UTC, Pedro Lacerda wrote:
 I __believe__ that insertInPlace doesn't shift the elements,

Yes, it appears that it really doesn't shift the array; insertInPlace just returns a new array with the new element at position n.
 Maybe this function do what you want.


   int[] arr = [0,1,2,3,4,5,6,7,8,9];

   void maybe(T)(T[] arr, size_t pos, T value) {
       size_t i;
       for (i = arr.length - 1; i > pos; i--) {
           arr[i] = arr[i-1];
       }
       arr[i] = value;
   }

In fact, I usually write functions the way you did. I was just looking for a new way to do that with D and the Phobos library.

Thanks,

Matheus.
Feb 09 2012
prev sibling next sibling parent reply Ali Çehreli <acehreli yahoo.com> writes:
On 02/09/2012 03:47 AM, MattCodr wrote:
 I have a doubt about the best way to insert and move (not replace) some
 data on an array.

 For example,

 In some cases if I want to do action above, I do a loop moving the data
 until the point that I want and finally I insert the new data there.


 In D I did this:

 begin code
 .
 .
 .
 int[] arr = [0,1,2,3,4,5,6,7,8,9];

 arr.insertInPlace(position, newValue);
 arr.popBack();
 .
 .
 .
 end code


 After the insertInPlace my array changed it's length to 11, so I use
 arr.popBack(); to keep the array length = 10;

 The code above is working well, I just want know if is there a better way?

 Thanks,

 Matheus.

Most straightforward that I know of is the following:

    arr = arr[0 .. position] ~ [ newValue ] ~ arr[position + 1 .. $];

But if you don't actually want to modify the data, you can merely access
the elements in-place by std.range.chain:

import std.stdio;
import std.range;

void main()
{
    int[] arr = [0,1,2,3,4,5,6,7,8,9];
    immutable position = arr.length / 2;
    immutable newValue = 42;

    auto r = chain(arr[0 .. position], [ newValue ], arr[position + 1 .. $]);
    writeln(r);
}

'r' above is a lazy range that just provides access to the three ranges
given to it. 'arr' does not change in any way.

Ali
Feb 09 2012
next sibling parent Ali Çehreli <acehreli yahoo.com> writes:
On 02/09/2012 11:03 AM, H. S. Teoh wrote:
 On Thu, Feb 09, 2012 at 10:30:22AM -0800, Ali Çehreli wrote:
 [...]
 But if you don't actually want to modify the data, you can merely
 access the elements in-place by std.range.chain:

 import std.stdio;
 import std.range;

 void main()
 {
      int[] arr = [0,1,2,3,4,5,6,7,8,9];
      immutable position = arr.length / 2;
      immutable newValue = 42;

      auto r = chain(arr[0 .. position], [ newValue ], arr[position +
 1 .. $]);
      writeln(r);
 }

 'r' above is a lazy range that just provides access to the three
 ranges given to it. 'arr' does not change in any way.

Wow! This is really cool. So you *can* have O(1) insertions in the middle of an array after all. :) Of course, you probably want to flatten it once in a while to keep random access cost from skyrocketing.

O(1) would be violated only if there are too many actual ranges.
 (I'm assuming delegates or
 something equivalent are involved in generating the lazy range?)

Simpler than that. :) The trick is that chain() returns a range object
that operates lazily.

I have used chain() as an example for finite RandomAccessRange types (I
used the name 'Together' instead of Chain). Search for "Finite
RandomAccessRange" here:

  http://ddili.org/ders/d.en/ranges.html

And yes, I note there that the implementation is not O(1). Also look
under the title "Laziness" in that chapter.

Ali
Feb 09 2012
prev sibling parent Timon Gehr <timon.gehr gmx.ch> writes:
On 02/09/2012 08:20 PM, MattCodr wrote:
 On Thursday, 9 February 2012 at 18:30:22 UTC, Ali Çehreli wrote:
 On 02/09/2012 03:47 AM, MattCodr wrote:
 I have a doubt about the best way to insert and move (not replace) some
 data on an array.

 For example,

 In some cases if I want to do action above, I do a loop moving the data
 until the point that I want and finally I insert the new data there.


 In D I did this:

 begin code
 .
 .
 .
 int[] arr = [0,1,2,3,4,5,6,7,8,9];

 arr.insertInPlace(position, newValue);
 arr.popBack();
 .
 .
 .
 end code


 After the insertInPlace my array changed it's length to 11, so I use
 arr.popBack(); to keep the array length = 10;

 The code above is working well, I just want know if is there a better
 way?

 Thanks,

 Matheus.

 Most straightforward that I know of is the following:

     arr = arr[0 .. position] ~ [ newValue ] ~ arr[position + 1 .. $];

 But if you don't actually want to modify the data, you can merely access
 the elements in-place by std.range.chain:

 import std.stdio;
 import std.range;

 void main()
 {
     int[] arr = [0,1,2,3,4,5,6,7,8,9];
     immutable position = arr.length / 2;
     immutable newValue = 42;

     auto r = chain(arr[0 .. position], [ newValue ], arr[position + 1 .. $]);
     writeln(r);
 }

 'r' above is a lazy range that just provides access to the three ranges
 given to it. 'arr' does not change in any way.

 Ali

 Hi Ali,

 You gave me a tip with this "chain" feature. I changed a few lines of
 your code, and it worked as I wanted:

 import std.stdio;
 import std.range;
 import std.array;

 void main()
 {
     int[] arr = [0,1,2,3,4,5,6,7,8,9];
     immutable position = arr.length / 2;
     immutable newValue = 42;

     auto r = chain(arr[0 .. position], [ newValue ], arr[position .. $-1]);
     arr = array(r);

     foreach(int i; arr)
         writefln("%d", i);
 }

 Thanks,

 Matheus.

Note that this code does the same, but is more efficient if you don't
actually need the array:

import std.stdio;
import std.range;
import std.array;

void main()
{
    int[] arr = [0,1,2,3,4,5,6,7,8,9];
    immutable position = arr.length / 2;
    immutable newValue = 42;

    auto r = chain(arr[0 .. position], [ newValue ], arr[position .. $-1]);

    foreach(i; r)
        writefln("%d", i);
}
Feb 09 2012
prev sibling next sibling parent "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Thu, Feb 09, 2012 at 10:30:22AM -0800, Ali Çehreli wrote:
[...]
 But if you don't actually want to modify the data, you can merely
 access the elements in-place by std.range.chain:
 
 import std.stdio;
 import std.range;
 
 void main()
 {
     int[] arr = [0,1,2,3,4,5,6,7,8,9];
     immutable position = arr.length / 2;
     immutable newValue = 42;
 
     auto r = chain(arr[0 .. position], [ newValue ], arr[position +
 1 .. $]);
     writeln(r);
 }
 
 'r' above is a lazy range that just provides access to the three
 ranges given to it. 'arr' does not change in any way.

Wow! This is really cool. So you *can* have O(1) insertions in the middle
of an array after all. :) Of course, you probably want to flatten it once
in a while to keep random access cost from skyrocketing. (I'm assuming
delegates or something equivalent are involved in generating the lazy
range?)

T

-- 
Give a man a fish, and he eats once. Teach a man to fish, and he will sit forever.
Feb 09 2012
prev sibling next sibling parent "MattCodr" <matheus_nab hotmail.com> writes:
On Thursday, 9 February 2012 at 18:30:22 UTC, Ali Çehreli wrote:
 On 02/09/2012 03:47 AM, MattCodr wrote:
 I have a doubt about the best way to insert and move (not 
 replace) some
 data on an array.

 For example,

 In some cases if I want to do action above, I do a loop moving 
 the data
 until the point that I want and finally I insert the new data 
 there.


 In D I did this:

 begin code
 .
 .
 .
 int[] arr = [0,1,2,3,4,5,6,7,8,9];

 arr.insertInPlace(position, newValue);
 arr.popBack();
 .
 .
 .
 end code


 After the insertInPlace my array changed it's length to 11, so 
 I use
 arr.popBack(); to keep the array length = 10;

 The code above is working well, I just want know if is there a 
 better way?

 Thanks,

 Matheus.

 Most straightforward that I know of is the following:

     arr = arr[0 .. position] ~ [ newValue ] ~ arr[position + 1 .. $];

 But if you don't actually want to modify the data, you can merely access
 the elements in-place by std.range.chain:

 import std.stdio;
 import std.range;

 void main()
 {
     int[] arr = [0,1,2,3,4,5,6,7,8,9];
     immutable position = arr.length / 2;
     immutable newValue = 42;

     auto r = chain(arr[0 .. position], [ newValue ], arr[position + 1 .. $]);
     writeln(r);
 }

 'r' above is a lazy range that just provides access to the three ranges
 given to it. 'arr' does not change in any way.

 Ali

Hi Ali,

You gave me a tip with this "chain" feature. I changed a few lines of
your code, and it worked as I wanted:

import std.stdio;
import std.range;
import std.array;

void main()
{
    int[] arr = [0,1,2,3,4,5,6,7,8,9];
    immutable position = arr.length / 2;
    immutable newValue = 42;

    auto r = chain(arr[0 .. position], [ newValue ], arr[position .. $-1]);
    arr = array(r);

    foreach(int i; arr)
        writefln("%d", i);
}

Thanks,

Matheus.
Feb 09 2012
prev sibling next sibling parent reply "MattCodr" <matheus_nab hotmail.com> writes:
On Thursday, 9 February 2012 at 19:49:43 UTC, Timon Gehr wrote:
 Note that this code does the same, but is more efficient if you 
 don't actually need the array:

Yes, I know. In fact I need to re-think the way I code with these new 
features of D, like ranges for example.

Thanks,

Matheus.
Feb 09 2012
parent reply Timon Gehr <timon.gehr gmx.ch> writes:
On 02/13/2012 03:19 PM, James Miller wrote:
 On 11 February 2012 10:45, Jonathan M Davis<jmdavisProg gmx.com>  wrote:
 On Friday, February 10, 2012 13:32:56 Marco Leise wrote:
 I know that feeling. I had no exposure to functional programming and
 options like chain never come to my head. Although "map" is a concept that
 I made friends with early.

 It would benefit your programming in general to learn a functional
 programming language and become reasonably proficient in it, even if you
 don't intend to program in it normally. It'll increase the number of
 tools in your programming toolbox and improve your programming in other
 programming languages. It's something that not enough programmers get
 sufficient exposure to IMHO.

 - Jonathan M Davis

 I found that learning Haskell made me significantly better at what I do.
 New paradigms are good for reminding you to think outside the box, I also
 learnt Prolog for a university course (AI) and that was an interesting
 challenge. Logical programming, where you define the boundaries of the
 program and then it works out the possible answers for you, amazingly
 useful for BNF grammars and similar constructs.

 If fact it's got to the point where I feel hamstrung if I can't do at
 least function passing (fortunately C, C++ and D can do this), and I
 prefer to work with languages that support closures and anonymous
 functions, since you can do wonders with simple constructs like map,
 fold (reduce) and filter. In fact a naive implementation of quicksort
 can be done succinctly in any language that supports filter.

     T[] sort(T)(T[] array) {
         pivot = array[array.length/2];
         return sort(filter!("a < " ~ pivot)(array)) ~ pivot
              ~ sort(filter!("a > " ~ pivot)(array));
     }

 (Disclaimer, this is probably a very slow implementation, possibly very
 broken, may cause compiler demons to possess your computer, DO NOT USE!)

 I have left out some details for brevity, and it probably won't work in
 alot of situations, but it demonstrates the power of functional
 programming, quicksort in 4 lines (sort of, its not like Haskell's
 "quicksort in 2 lines" is any better mind you, its slow as balls because
 of all the memory allocation it has to do).

 Anyway, yay for functional programming and thread derailment.

 James

If it is slow and uses an awful lot of auxiliary memory it is not
quicksort as much as it may conceptually resemble quicksort. Try to
implement in-place quicksort in Haskell. It will look like C code. Also see:

http://stackoverflow.com/questions/5268156/how-do-you-do-an-in-place-quicksort-in-haskell
Feb 13 2012
next sibling parent Ali Çehreli <acehreli yahoo.com> writes:
On 02/13/2012 03:34 PM, James Miller wrote:

 Saying "it is not quicksort as much as it may conceptually resemble
 quicksort" is kinda odd, its like saying "it is not a car, as much as
 it may conceptually resemble a car" because it doesn't run on petrol
 or gas, but instead runs on environment destroying orphan tears.

For what it's worth, Andrei uses that argument in his "On Iteration"
article with "For starters, [one implementation of Haskell's] qsort is
not really quicksort. Quicksort, as defined by Hoare in his seminal
paper [8], is an in-place algorithm."

http://www.informit.com/articles/printerfriendly.aspx?p=1407357

Ali
Feb 13 2012
prev sibling parent Timon Gehr <timon.gehr gmx.ch> writes:
On 02/14/2012 12:34 AM, James Miller wrote:
 On 14 February 2012 06:25, Timon Gehr<timon.gehr gmx.ch>  wrote:
 On 02/13/2012 03:19 PM, James Miller wrote:
 On 11 February 2012 10:45, Jonathan M Davis<jmdavisProg gmx.com>    wrote:
 On Friday, February 10, 2012 13:32:56 Marco Leise wrote:

 I know that feeling. I had no exposure to functional programming and
 options like chain never come to my head. Although "map" is a concept
 that
 I made friends with early.

 It would benefit your programming in general to learn a functional
 programming language and become reasonably proficient in it, even if you
 don't intend to program in it normally. It'll increase the number of
 tools in your programming toolbox and improve your programming in other
 programming languages. It's something that not enough programmers get
 sufficient exposure to IMHO.

 - Jonathan M Davis

 I found that learning Haskell made me significantly better at what I do.
 New paradigms are good for reminding you to think outside the box, I also
 learnt Prolog for a university course (AI) and that was an interesting
 challenge. Logical programming, where you define the boundaries of the
 program and then it works out the possible answers for you, amazingly
 useful for BNF grammars and similar constructs.

 If fact it's got to the point where I feel hamstrung if I can't do at
 least function passing (fortunately C, C++ and D can do this), and I
 prefer to work with languages that support closures and anonymous
 functions, since you can do wonders with simple constructs like map,
 fold (reduce) and filter. In fact a naive implementation of quicksort
 can be done succinctly in any language that supports filter.

     T[] sort(T)(T[] array) {
         pivot = array[array.length/2];
         return sort(filter!("a < " ~ pivot)(array)) ~ pivot
              ~ sort(filter!("a > " ~ pivot)(array));
     }

 (Disclaimer, this is probably a very slow implementation, possibly very
 broken, may cause compiler demons to possess your computer, DO NOT USE!)

 I have left out some details for brevity, and it probably won't work in
 alot of situations, but it demonstrates the power of functional
 programming, quicksort in 4 lines (sort of, its not like Haskell's
 "quicksort in 2 lines" is any better mind you, its slow as balls because
 of all the memory allocation it has to do).

 Anyway, yay for functional programming and thread derailment.

 James

 If it is slow and uses an awful lot of auxiliary memory it is not
 quicksort as much as it may conceptually resemble quicksort. Try to
 implement in-place quicksort in Haskell. It will look like C code.
 Also see:
 http://stackoverflow.com/questions/5268156/how-do-you-do-an-in-place-quicksort-in-haskell

It is still conceptually quicksort, the divide-and-conquer method based on partitioning the list.

Hoare's original quicksort algorithm is more detailed than what is sketched here. The main point is the in-place partition operation with the two pointers approaching each other.
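As a concrete reference point, that two-pointer partition can be sketched in D roughly as follows. This is a simplified illustration with a naive pivot choice, not tuned library code, and `hoareSort` is a made-up name:

```d
// In-place quicksort with Hoare's partition scheme: two indices walk
// toward each other, swapping out-of-place pairs; no auxiliary array
// is allocated, unlike the filter-based version above.
void hoareSort(int[] a)
{
    void sortRange(long lo, long hi)
    {
        if (lo >= hi) return;
        immutable pivot = a[lo]; // naive pivot: first element of the range
        long i = lo - 1;
        long j = hi + 1;
        while (true)
        {
            do ++i; while (a[i] < pivot); // left pointer walks right
            do --j; while (a[j] > pivot); // right pointer walks left
            if (i >= j) break;            // pointers met: partition done
            immutable tmp = a[i];
            a[i] = a[j];
            a[j] = tmp;
        }
        sortRange(lo, j);     // left part: elements <= pivot
        sortRange(j + 1, hi); // right part: elements >= pivot
    }
    sortRange(0, cast(long) a.length - 1);
}
```

With Hoare's scheme the returned split point `j` always satisfies `lo <= j < hi`, so both recursive calls shrink and the recursion terminates.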
 I wasn't writing it to show a valid
 implementation (I didn't even test it, it probably doesn't compile), I
 even warned of compiler demons! Its a demonstration of the
 succinctness of functional techniques for certain problems, not a show
 that functional languages "are teh awesum and all other langauges
 suck".

The approach given does not solve the problem (it does not implement Quicksort). Quicksort in Haskell looks like Quicksort in D, because the algorithm depends on destructive updates. Functional techniques can be succinct for certain problems, but implementing Quicksort is not one of them.
 Haskell is almost a pure functional language, therefore, under
 normal circumstances, every change to the array will allocate a new
 array,

Haskell can do destructive array updates that look like pure operations just fine. http://hackage.haskell.org/packages/archive/array/0.2.0.0/doc/html/Data-Array-MArray.html
 this is because of the whole immutability thing that it has going on.

This is confusing the abstraction with its implementation. It is impossible to recreate Haskell's execution semantics in D using only immutable types.
 Of course I would never use such an implementation in real
 life, and Haskellers tend to avoid algorithms that do these kinds of
 things, using sorts like mergesort instead.

Mostly lazy mergesort if I'm not mistaken. And they don't usually use it to sort arrays, they sort lists. Haskell arrays ought to be sorted with introsort if the comparison operation is cheap.
 Saying "it is not quicksort as much as it may conceptually resemble
 quicksort" is kinda odd, its like saying "it is not a car, as much as
 it may conceptually resemble a car" because it doesn't run on petrol
 or gas, but instead runs on environment destroying orphan tears.

It is more like saying "a handcart is not a car, as much as it may conceptually resemble a car" (the engine is missing!).
Feb 13 2012
prev sibling next sibling parent "Marco Leise" <Marco.Leise gmx.de> writes:
Am 09.02.2012, 22:03 Uhr, schrieb MattCodr <matheus_nab hotmail.com>:

 On Thursday, 9 February 2012 at 19:49:43 UTC, Timon Gehr wrote:
 Note that this code does the same, but is more efficient if you don't  
 actually need the array:

Yes I know, In fact I need re-think the way I code with this new features of D, like ranges for example. Thanks, Matheus.

I know that feeling. I had no exposure to functional programming and options like chain never come to my head. Although "map" is a concept that I made friends with early.
Feb 10 2012
prev sibling next sibling parent "Jonathan M Davis" <jmdavisProg gmx.com> writes:
On Friday, February 10, 2012 13:32:56 Marco Leise wrote:
 I know that feeling. I had no exposure to functional programming and
 options like chain never come to my head. Although "map" is a concept that
 I made friends with early.

It would benefit your programming in general to learn a functional
programming language and become reasonably proficient in it, even if you
don't intend to program in it normally. It'll increase the number of
tools in your programming toolbox and improve your programming in other
programming languages. It's something that not enough programmers get
sufficient exposure to IMHO.

- Jonathan M Davis
Feb 10 2012
prev sibling next sibling parent James Miller <james aatch.net> writes:
On 11 February 2012 10:45, Jonathan M Davis <jmdavisProg gmx.com> wrote:
 On Friday, February 10, 2012 13:32:56 Marco Leise wrote:
 I know that feeling. I had no exposure to functional programming and
 options like chain never come to my head. Although "map" is a concept that
 I made friends with early.

It would benefit your programming in general to learn a functional programming language and become reasonably proficient in it, even if you don't intend to program in it normally. It'll increase the number of tools in your programming toolbox and improve your programming in other programming languages. It's something that not enough programmers get sufficient exposure to IMHO. - Jonathan M Davis

I found that learning Haskell made me significantly better at what I do.
New paradigms are good for reminding you to think outside the box, I also
learnt Prolog for a university course (AI) and that was an interesting
challenge. Logical programming, where you define the boundaries of the
program and then it works out the possible answers for you, amazingly
useful for BNF grammars and similar constructs.

If fact it's got to the point where I feel hamstrung if I can't do at
least function passing (fortunately C, C++ and D can do this), and I
prefer to work with languages that support closures and anonymous
functions, since you can do wonders with simple constructs like map,
fold (reduce) and filter. In fact a naive implementation of quicksort
can be done succinctly in any language that supports filter.

    T[] sort(T)(T[] array) {
        pivot = array[array.length/2];
        return sort(filter!("a < " ~ pivot)(array)) ~ pivot
             ~ sort(filter!("a > " ~ pivot)(array));
    }

(Disclaimer, this is probably a very slow implementation, possibly very
broken, may cause compiler demons to possess your computer, DO NOT USE!)

I have left out some details for brevity, and it probably won't work in
alot of situations, but it demonstrates the power of functional
programming, quicksort in 4 lines (sort of, its not like Haskell's
"quicksort in 2 lines" is any better mind you, its slow as balls because
of all the memory allocation it has to do).

Anyway, yay for functional programming and thread derailment.

James
Feb 13 2012
prev sibling next sibling parent James Miller <james aatch.net> writes:
On 14 February 2012 06:25, Timon Gehr <timon.gehr gmx.ch> wrote:
 On 02/13/2012 03:19 PM, James Miller wrote:
 On 11 February 2012 10:45, Jonathan M Davis<jmdavisProg gmx.com> wrote:
 On Friday, February 10, 2012 13:32:56 Marco Leise wrote:

 I know that feeling. I had no exposure to functional programming and
 options like chain never come to my head. Although "map" is a concept
 that I made friends with early.

 It would benefit your programming in general to learn a functional
 programming language and become reasonably proficient in it, even if you
 don't intend to program in it normally. It'll increase the number of
 tools in your programming toolbox and improve your programming in other
 programming languages. It's something that not enough programmers get
 sufficient exposure to IMHO.

 - Jonathan M Davis

 I found that learning Haskell made me significantly better at what I do.
 New paradigms are good for reminding you to think outside the box, I also
 learnt Prolog for a university course (AI) and that was an interesting
 challenge. Logical programming, where you define the boundaries of the
 program and then it works out the possible answers for you, amazingly
 useful for BNF grammars and similar constructs.

 If fact it's got to the point where I feel hamstrung if I can't do at
 least function passing (fortunately C, C++ and D can do this), and I
 prefer to work with languages that support closures and anonymous
 functions, since you can do wonders with simple constructs like map,
 fold (reduce) and filter. In fact a naive implementation of quicksort
 can be done succinctly in any language that supports filter.

     T[] sort(T)(T[] array) {
         pivot = array[array.length/2];
         return sort(filter!("a < " ~ pivot)(array)) ~ pivot
              ~ sort(filter!("a > " ~ pivot)(array));
     }

 (Disclaimer, this is probably a very slow implementation, possibly very
 broken, may cause compiler demons to possess your computer, DO NOT USE!)

 I have left out some details for brevity, and it probably won't work in
 alot of situations, but it demonstrates the power of functional
 programming, quicksort in 4 lines (sort of, its not like Haskell's
 "quicksort in 2 lines" is any better mind you, its slow as balls because
 of all the memory allocation it has to do).

 Anyway, yay for functional programming and thread derailment.

 James

 If it is slow and uses an awful lot of auxiliary memory it is not
 quicksort as much as it may conceptually resemble quicksort. Try to
 implement in-place quicksort in Haskell. It will look like C code.
 Also see:
 http://stackoverflow.com/questions/5268156/how-do-you-do-an-in-place-quicksort-in-haskell

It is still conceptually quicksort, the divide-and-conquer method based
on partitioning the list. I wasn't writing it to show a valid
implementation (I didn't even test it, it probably doesn't compile), I
even warned of compiler demons! Its a demonstration of the succinctness
of functional techniques for certain problems, not a show that functional
languages "are teh awesum and all other langauges suck".

Haskell is almost a pure functional language, therefore, under normal
circumstances, every change to the array will allocate a new array, this
is because of the whole immutability thing that it has going on. Of
course I would never use such an implementation in real life, and
Haskellers tend to avoid algorithms that do these kinds of things, using
sorts like mergesort instead.

Saying "it is not quicksort as much as it may conceptually resemble
quicksort" is kinda odd, its like saying "it is not a car, as much as it
may conceptually resemble a car" because it doesn't run on petrol or gas,
but instead runs on environment destroying orphan tears.

James Miller
Feb 13 2012
prev sibling next sibling parent James Miller <james aatch.net> writes:
On 14 February 2012 12:45, Ali Çehreli <acehreli yahoo.com> wrote:
 On 02/13/2012 03:34 PM, James Miller wrote:

 Saying "it is not quicksort as much as it may conceptually resemble
 quicksort" is kinda odd, its like saying "it is not a car, as much as
 it may conceptually resemble a car" because it doesn't run on petrol
 or gas, but instead runs on environment destroying orphan tears.

 For what its worth, Andrei uses that argument in his "On Iteration"
 article with "For starters, [one implementation of Haskell's] qsort is
 not really quicksort. Quicksort, as defined by Hoare in his seminal
 paper [8], is an in-place algorithm."

 http://www.informit.com/articles/printerfriendly.aspx?p=1407357

 Ali

Fair enough, I didn't realise that Quicksort was defined as in-place; in
that case I retract my opposition to "not really a quicksort". However,
my "missing the point" still stands.

I still prefer arrays over S-lists anyway; how else do I efficiently
implement a heap?
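For what it's worth, the array-backed binary heap needs nothing beyond index arithmetic: the children of the node at index i live at 2i+1 and 2i+2, and its parent at (i-1)/2. A rough sketch (`MaxHeap` and `push` are made-up names here; Phobos also ships std.container.BinaryHeap for real use):

```d
// Minimal array-backed max-heap: push appends the new element, then
// sifts it up by swapping with its parent while it is larger.
struct MaxHeap
{
    int[] data;

    void push(int value)
    {
        data ~= value;
        size_t i = data.length - 1;
        while (i > 0 && data[(i - 1) / 2] < data[i]) // parent smaller?
        {
            immutable p = (i - 1) / 2;
            immutable tmp = data[p];
            data[p] = data[i];
            data[i] = tmp;
            i = p; // continue sifting from the parent's slot
        }
    }

    int top() const { return data[0]; } // largest element sits at the root
}
```

After pushing 3, 5 and 1, `top` returns 5; the whole structure is just one contiguous array, no pointer-chasing involved.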
Feb 13 2012
prev sibling next sibling parent "Jonathan M Davis" <jmdavisProg gmx.com> writes:
On Tuesday, February 14, 2012 13:02:43 James Miller wrote:
 On 14 February 2012 12:45, Ali Çehreli <acehreli yahoo.com> wrote:
 On 02/13/2012 03:34 PM, James Miller wrote:
 Saying "it is not quicksort as much as it may conceptually resemble
 quicksort" is kinda odd, its like saying "it is not a car, as much as
 it may conceptually resemble a car" because it doesn't run on petrol
 or gas, but instead runs on environment destroying orphan tears.

 For what its worth, Andrei uses that argument in his "On Iteration"
 article with "For starters, [one implementation of Haskell's] qsort is
 not really quicksort. Quicksort, as defined by Hoare in his seminal
 paper [8], is an in-place algorithm."

 http://www.informit.com/articles/printerfriendly.aspx?p=1407357

 Ali

Fair enough, I didn't realise that Quicksort was defined as in place, in that case, I retract my opposition to "not really a quicksort" however my "missing the point" still stands. I still prefer arrays over S-lists anyway, how else do I efficiently implement a heap?

Orphan tears. It's the only way to go.

- Jonathan M Davis
Feb 13 2012
prev sibling parent Manfred Nowak <svv1999 hotmail.com> writes:
MattCodr wrote:

 I have a doubt about the best way to insert and move (not 
 replace) some data on an array.

I have the vision that a mapping from a dense range of integers to some
value type, together with vast (i.e. Theta(n)) changes of this mapping,
is a severe hint of a maldesign.

-manfred
Feb 14 2012