Calculated duration != real duration of gif
Posted: 2014-02-04T05:46:27-07:00
Hello,
I have a gif with the following details:
Iterations: 2
Number of Frames: 12
Each frame has the following delays:
If I calculate the duration of the gif from those delays, I get 1900 cs (19 seconds): 2 * (50+50+50+50+50+100+50+150+150+100+50+100) = 2 * 950 = 1900.
Delay: 50x100
Scene: 0 of 12
Delay: 50x100
Scene: 1 of 12
Delay: 50x100
Scene: 2 of 12
Delay: 50x100
Scene: 3 of 12
Delay: 50x100
Scene: 4 of 12
Delay: 100x100
Scene: 5 of 12
Delay: 50x100
Scene: 6 of 12
Delay: 150x100
Scene: 7 of 12
Delay: 150x100
Scene: 8 of 12
Delay: 100x100
Scene: 9 of 12
Delay: 50x100
Scene: 10 of 12
Delay: 100x100
Scene: 11 of 12
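For reference, here is the expected-duration arithmetic as a small Python sketch. It assumes ImageMagick's delay notation, where "50x100" means 50 ticks at 100 ticks per second (so 0.5 s per frame); the delay values are copied from the identify output above.

```python
# Delays in ticks, taken from the identify -verbose output above.
delays_ticks = [50, 50, 50, 50, 50, 100, 50, 150, 150, 100, 50, 100]
ticks_per_second = 100  # the "x100" part of "50x100"
iterations = 2          # the Iterations value reported by identify

one_pass_seconds = sum(delays_ticks) / ticks_per_second  # 950 ticks -> 9.5 s
total_seconds = iterations * one_pass_seconds            # 2 passes -> 19.0 s

print(f"one pass: {one_pass_seconds} s, total: {total_seconds} s")
```

This reproduces the 1900 cs (19 s) figure, which is still well short of the ~30 s observed playback.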
But if I open the gif, it runs for nearly 30 seconds (~3000 cs).
What variable am I missing here? I checked the whole identify -verbose output and couldn't find any other variable that could be responsible for it.
I would be happy to get any advice. :)
Full -verbose info (without colormap and histogram): http://pastebin.com/MHUepNVw