[MPlayer-dev-eng] Re: Compile options
Trent Piepho
xyzzy at speakeasy.org
Sat Sep 16 23:01:37 CEST 2006
On Sat, 16 Sep 2006, Andrew Savchenko wrote:
> > -O4:
> > real 2m39.531s
> > user 1m12.721s
> >
> > -O2:
> > real 2m39.629s
> > user 1m9.408s
>
> I think I misunderstood you. According to your test -O2 is slightly worse
> than -O4 (159.017s vs 159.007s); the lower the time, the faster the program.
Of course, it makes no sense to compare the real time; that should be the
same. The user time is the better measure, and -O2 looks faster.
> However, what about the mean deviation?
> Can you provide results with error estimates?
> A result without an error estimate is meaningless.
One almost never sees a computer benchmark done by someone who knows
anything about statistics. Even well-known review sites think they can run
a benchmark just once and then compare it, without any idea of the
benchmark's variance.
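To illustrate the point, here is a minimal sketch (in Python, not from the
original post) of estimating that variance. The five user times are taken
from the mpbench sample output later in this post; the point is simply that
a single run tells you nothing about the spread.

```python
# Sketch: estimate run-to-run noise from repeated benchmark timings.
# The five user times come from the mpbench sample output in this post.
from statistics import mean, stdev

user_times = [7.97, 8.37, 8.16, 8.17, 8.17]  # seconds, five runs

m = mean(user_times)
s = stdev(user_times)  # sample standard deviation

print(f"mean = {m:.3f} s, stdev = {s:.3f} s")
# The spread here (~0.14 s) is larger than many of the differences
# people try to read off a single -O2 vs -O4 comparison.
```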
To get better benchmarks, this script might help. Do something like:
for i in `seq 1 5`;do time mplayer -quiet -benchmark ... 2>&1;done|
mpbench.pl foo
You'll get output like this:
vc.foo <- c( 7.643,8.068,7.849,7.863,7.818 );
vo.foo <- c( 0.005,0.005,0.005,0.005,0.005 );
sys.foo <- c( 0.393,0.421,0.411,0.408,0.406 );
user.foo <- c( 7.97,8.37,8.16,8.17,8.17 );
elapsed.foo <- c( 8.11,8.58,8.34,8.36,8.31 );
i.e., all the different kinds of times for each run, collected into a vector.
If you've used the open-source statistics package R, you might notice that
this output can be cut and pasted directly into R. Then one can analyse the
data with tools like hist(), boxplot(), t.test(), wilcox.test(), and so on.
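For instance, R's t.test() performs a Welch two-sample t-test by default;
the same statistic can be sketched by hand (shown in Python here). The first
sample is the user times from the output above; the second sample is
invented purely for illustration.

```python
# Sketch of the Welch t statistic that R's t.test() computes by default.
# Sample a: user times from the mpbench output in this post.
# Sample b: hypothetical times for a second build (made up for illustration).
from math import sqrt
from statistics import mean, variance

a = [7.97, 8.37, 8.16, 8.17, 8.17]
b = [8.05, 8.21, 8.12, 8.30, 8.25]  # invented data

# Welch t statistic: difference of means over the pooled standard error.
t = (mean(a) - mean(b)) / sqrt(variance(a) / len(a) + variance(b) / len(b))
print(f"t = {t:.3f}")
# |t| is well below ~2 here, so these two samples show no significant
# difference; R's t.test() would additionally report the p-value.
```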
Some useful links google turned up:
http://www.stat.psu.edu/~dhunter/R/2006test.html
http://www.math.csi.cuny.edu/Statistics/R/simpleR/stat011.html
-------------- next part --------------
A non-text attachment was scrubbed...
Name: mpbench.pl
Type: text/x-perl
Size: 1037 bytes
Desc:
URL: <http://lists.mplayerhq.hu/pipermail/mplayer-dev-eng/attachments/20060916/93cee766/attachment.pl>