## digitalmars.D.learn - Measuring Execution time

• Clayton (18/18) Jul 22 2015 How does one represent Duration in only Micro-seconds, or
• John Colvin (7/25) Jul 22 2015 The normal way of doing this would be using
• Clayton (3/11) Jul 22 2015 Much appreciated, that works well John . Learning goes on...
• Clayton (4/12) Jul 23 2015 Am wondering how possible is to restrict that all algorithms get
• Yazan D (4/18) Jul 24 2015 If you are using Linux, you can use `taskset`.
• Yazan D (2/2) Jul 24 2015 There is also http://linux.die.net/man/2/sched_setaffinity if you want t...
• Steven Schveighoffer (7/25) Jul 23 2015 I know John identified Stopwatch, but just an FYI, Duration has the
"Clayton" <johnsjdsd gmail.com> writes:
```How does one represent a Duration in only microseconds or
milliseconds? Trying to measure the execution time of an
algorithm, I get "4 ms, 619 μs, and 8 hnsecs", but I want to sum
all of these and get the total in hnsecs or μs.

I would also appreciate advice on whether this is the best way
to measure the execution time of an algorithm.

import std.datetime;
import std.stdio;

void algorithm() {
    writeln("Hello!");
}

void main() {
    auto stattime = Clock.currTime();
    algorithm();
    auto endttime = Clock.currTime();

    auto duration = endttime - stattime;

    writeln("Hello Duration ==> ", duration);
}
```
Jul 22 2015
"John Colvin" <john.loughran.colvin gmail.com> writes:
```On Wednesday, 22 July 2015 at 09:23:36 UTC, Clayton wrote:
How does one represent Duration in only Micro-seconds, or
milliseconds. Trying to measure the execution time of an
algorithm and I get "4 ms, 619 μs, and 8 hnsecs" , I want to
sum all these and get total hnsecs or μs .

I would also appreciate advice on whether this is the best
way to measure the execution time of an algorithm.

import std.datetime;
import std.stdio;

void algorithm( ){
writeln("Hello!");
}
void main(){

auto stattime = Clock.currTime();
algorithm( );
auto endttime = Clock.currTime();

auto duration = endttime - stattime;

writeln("Hello Duration ==> ", duration);

}

The normal way of doing this would be using
std.datetime.StopWatch:

StopWatch sw;
sw.start();
algorithm();
long exec_ms = sw.peek().msecs;
```
Jul 22 2015
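For reference, the snippet above is a fragment; a complete, compilable version (a sketch assuming the 2015-era `std.datetime.StopWatch` API, which later D releases moved to `std.datetime.stopwatch`) could look like:

```d
import std.datetime; // 2015-era Phobos: provides StopWatch
import std.stdio;

void algorithm() {
    writeln("Hello!");
}

void main() {
    StopWatch sw;
    sw.start();                         // begin timing
    algorithm();
    sw.stop();                          // stop before reading for a stable value
    long exec_usecs = sw.peek().usecs;  // elapsed time in microseconds
    writeln("Elapsed: ", exec_usecs, " usecs");
}
```

`peek()` can also be called while the watch is still running, which is what John's shorter snippet does.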
"Clayton" <johnsjdsd gmail.com> writes:
```On Wednesday, 22 July 2015 at 09:32:15 UTC, John Colvin wrote:
On Wednesday, 22 July 2015 at 09:23:36 UTC, Clayton wrote:
[...]

The normal way of doing this would be using
std.datetime.StopWatch:

StopWatch sw;
sw.start();
algorithm();
long exec_ms = sw.peek().msecs;

Much appreciated, that works well, John. Learning goes on...
thanks again
```
Jul 22 2015
"Clayton" <johnsjdsd gmail.com> writes:
```On Wednesday, 22 July 2015 at 09:32:15 UTC, John Colvin wrote:
On Wednesday, 22 July 2015 at 09:23:36 UTC, Clayton wrote:
[...]

The normal way of doing this would be using
std.datetime.StopWatch:

StopWatch sw;
sw.start();
algorithm();
long exec_ms = sw.peek().msecs;

I am wondering whether it is possible to restrict all algorithms
to run on a specific core (e.g. CPU 0), since I want my tests to
run in the same environment.
```
Jul 23 2015
Yazan D <invalid email.com> writes:
```On Thu, 23 Jul 2015 16:43:01 +0000, Clayton wrote:

On Wednesday, 22 July 2015 at 09:32:15 UTC, John Colvin wrote:
On Wednesday, 22 July 2015 at 09:23:36 UTC, Clayton wrote:
[...]

The normal way of doing this would be using std.datetime.StopWatch:

StopWatch sw;
sw.start();
algorithm();
long exec_ms = sw.peek().msecs;

I am wondering whether it is possible to restrict all algorithms to run
on a specific core (e.g. CPU 0), since I want my tests to run in the
same environment.

If you are using Linux, you can use `taskset`.
Example: `taskset -c 0 ./program`. This will run your program on the
first CPU only.
```
Jul 24 2015
Yazan D <invalid email.com> writes:
```There is also http://linux.die.net/man/2/sched_setaffinity if you want to
do it programmatically.
```
Jul 24 2015
Steven Schveighoffer <schveiguy yahoo.com> writes:
```On 7/22/15 5:23 AM, Clayton wrote:
How does one represent Duration in only Micro-seconds, or milliseconds.
Trying to measure the execution time of an algorithm and I get "4 ms,
619 μs, and 8 hnsecs" , I want to sum all these and get total hnsecs or
μs .

I would also appreciate advice on whether this is the best way to
measure the execution time of an algorithm.

import std.datetime;
import std.stdio;

void algorithm( ){
writeln("Hello!");
}
void main(){

auto stattime = Clock.currTime();
algorithm( );
auto endttime = Clock.currTime();

auto duration = endttime - stattime;

writeln("Hello Duration ==> ", duration);

}

I know John identified StopWatch, but just an FYI, Duration has the
method total: http://dlang.org/phobos/core_time.html#.Duration.total

I think doing:

writeln("Hello Duration ==> ", duration.total!"usecs");

would also work.

-Steve
```
Jul 23 2015
Jonathan M Davis via Digitalmars-d-learn writes:
```On Thursday, July 23, 2015 13:59:11 Steven Schveighoffer via
Digitalmars-d-learn wrote:
On 7/22/15 5:23 AM, Clayton wrote:
How does one represent Duration in only Micro-seconds, or milliseconds.
Trying to measure the execution time of an algorithm and I get "4 ms,
619 μs, and 8 hnsecs" , I want to sum all these and get total hnsecs or
μs .

I would also appreciate advice on whether this is the best way to
measure the execution time of an algorithm.

import std.datetime;
import std.stdio;

void algorithm( ){
writeln("Hello!");
}
void main(){

auto stattime = Clock.currTime();
algorithm( );
auto endttime = Clock.currTime();

auto duration = endttime - stattime;

writeln("Hello Duration ==> ", duration);

}

I know John identified Stopwatch, but just an FYI, Duration has the
method total: http://dlang.org/phobos/core_time.html#.Duration.total

I think doing:

writeln("Hello Duration ==> ", duration.total!"usecs");

would also work.

Yes, you could do that, but doing timing with the realtime clock is
fundamentally wrong, because the clock can change on you while you're
timing. That's why using a monotonic clock is better, since it's guaranteed
to never move backwards. Unfortunately, while StopWatch does use a monotonic
clock, it currently does that by using TickDuration for that rather than
MonoTime, so its result is a TickDuration rather than a Duration, so it's a
bit harder to use than would be nice, but it is more correct to use
StopWatch than to subtract SysTimes. Alternatively, you could just use
MonoTime directly. e.g.

auto startTime = MonoTime.currTime;
// do stuff
auto endTime = MonoTime.currTime;

auto duration = endTime - startTime;
writeln("Hello Duration ==> ", duration.total!"usecs");

in which case you get a Duration just like with subtracting SysTimes, and the
suggestion of using total works just fine.

I need to put together replacements for the benchmarking functions in
std.datetime (probably in std.benchmark) which use MonoTime and Duration
rather than TickDuration so that we can deprecate the ones in std.datetime
which use TickDuration (and deprecate TickDuration itself).

- Jonathan M Davis
```
Jul 23 2015
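Putting Jonathan's suggestion together, a complete monotonic-clock timing sketch (assuming the `core.time.MonoTime` API he references) could be:

```d
import core.time : MonoTime;
import std.stdio;

void algorithm() {
    writeln("Hello!");
}

void main() {
    // MonoTime is monotonic: unlike Clock.currTime, it never jumps
    // backwards if the system clock is adjusted mid-measurement.
    immutable startTime = MonoTime.currTime;
    algorithm();
    immutable endTime = MonoTime.currTime;

    immutable duration = endTime - startTime; // yields a core.time.Duration
    writeln("Elapsed: ", duration.total!"usecs", " usecs");
}
```

Because the subtraction yields an ordinary Duration, Steve's `total!"usecs"` suggestion applies unchanged.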