I have a script that converts video files. I ran it on a server against test data and measured its run time with the time command. The result was:
real 2m48.326s
user 6m57.498s
sys 0m3.120s
Why is the real time so much lower than the user time? Is this related to multithreading, or is something else going on?
Edit: The script did indeed seem to run for about 2m48s of wall-clock time.
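For reference, here is a minimal sketch of how I would try to reproduce user time exceeding real time (a hypothetical example, assuming bash and a machine with at least two free cores; the yes loops and the 3-second duration just stand in for any CPU-bound parallel work):

# Two CPU-bound 'yes' loops run in parallel for ~3 seconds of wall-clock time,
# so user time should come out near 6s while real time stays near 3s.
time bash -c 'yes > /dev/null & p1=$!; yes > /dev/null & p2=$!; sleep 3; kill $p1 $p2; wait'

If multithreading is really the explanation, this command's output should show the same pattern as the numbers above.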
real time is wall-clock time, as explained below (i.e. what we would measure if we had a stopwatch) – Levon Jun 13 '12 at 17:39