Dividing large image faster ?


Post by Musaran »

Hi

I adapted an example from the help to divide big images into small tiles, each in a file named with its original tile x,y position:

convert test.png -crop 32x32 -set filename:tile "%[fx:page.x/32],%[fx:page.y/32]" +repage +adjoin "c:\result\%[filename:tile].png"

It works fine with my 160x160 pixel test image, but with my actual 3200x3200 file it is painfully slow: about 30 s per tile.
That works out to 3+ days per file, and I have 89 such files...

What is wrong here? I can only assume the file is re-read for each crop...
Is there a fix?


Bonus questions:

1) If my original file is named "[h][v].xcf", is it possible to use the h and v parts in the output name?

2) Is it possible to skip (not output) empty, fully transparent tiles?
(presumably not, as I read there is little support for conditionals)

3) Is it possible to search-and-replace one image within another?
(that part of the help is under construction)
In my case I want to replace a 5x5 "you are here" marker with the background color.

I browsed the help, but ImageMagick is just too big for me. :)

Re: Dividing large image faster ?

Post by fmw42 »

See http://www.imagemagick.org/Usage/files/#massive for large-image processing techniques.

You can use compare to search for where a small image resides within a larger one; see viewtopic.php?f=1&t=14613&p=51076&hilit ... ric#p51076. Then you will have to extract the coordinates and overlay a background-colored image to replace your marker.
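
For example, something along these lines should do it on unix (an untested sketch; it assumes a compare that supports -subimage-search, a saved copy of the 5x5 marker as marker.png, and a placeholder background color):

# find where the marker sits in the big image;
# compare prints something like "1523.9 (0.023) @ 147,27" on stderr
offset=`compare -metric rmse -subimage-search big.png marker.png null: 2>&1 | sed 's/.*@ //'`
x=`echo $offset | cut -d, -f1`
y=`echo $offset | cut -d, -f2`

# paint a 5x5 background-colored rectangle over the match
convert big.png -fill gray75 -draw "rectangle $x,$y $((x+4)),$((y+4))" big_clean.png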

You can get the filename (without the suffix) using

name=`convert rose.png -format "%t" info:`
echo "$name"
rose

Then use ${name}.outputsuffix

This is unix syntax; if you are on Windows, see http://www.imagemagick.org/Usage/windows/
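
The %t escape also works directly inside -set, so the input basename can be carried into the tile names without a shell variable. Something along the lines of your original command (untested, shown with a .png input) should give files like "[h][v]_0,0.png":

convert "[h][v].png" -crop 32x32 -set filename:tile "%t_%[fx:page.x/32],%[fx:page.y/32]" +repage +adjoin "c:\result\%[filename:tile].png"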

Re: Dividing large image faster ?

Post by snibgo »

This Windows command (where %SRC% is about 3500x2500 pixels) would take about 100 hours to complete, if I had the patience:

"%IMG%convert" ^
  %SRC% ^
  -crop 32x32 ^
  -set filename:tile "%%[fx:page.x/32]_%%[fx:page.y/32]" ^
  +repage +adjoin ^
  x.png
But if I remove the line:

-set filename:tile "%%[fx:page.x/32]_%%[fx:page.y/32]" 
It takes about 3 minutes (and creates 8034 files).

I suspect there is a bug. Perhaps it is recalculating "filename:tile" for every pixel.

I have reported this as a bug: viewtopic.php?f=3&t=16463
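
Until that is fixed, one workaround is to let convert number the tiles itself (which, as above, only takes minutes) and rename them afterwards. A rough unix-shell sketch, assuming the tiles come out in row-major order and a 3200-pixel-wide source (100 tiles per row):

convert test.png -crop 32x32 +repage +adjoin tile_%d.png

cols=100
for f in tile_*.png; do
    i=${f#tile_}; i=${i%.png}
    mv "$f" "$((i % cols)),$((i / cols)).png"
done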

Re: Dividing large image faster ?

Post by fmw42 »

On my single-processor Mac OS X Tiger G4, with IM 6.6.2-7 Q16 beta:


convert -size 128x128 xc:red red.png

time convert \
red.png \
-crop 32x32 \
-set filename:tile "%[fx:page.x/32]_%[fx:page.y/32]" \
+repage +adjoin \
x.png

creates 16 tiles in:

real 0m0.666s
user 0m0.217s
sys 0m0.145s

Re: Dividing large image faster ?

Post by snibgo »

A 128x128 source takes me hardly any time, with or without the "-set filename" line. There is a noticeable difference with/without at 500x500, and it gets worse and worse from there.

Re: Dividing large image faster ?

Post by fmw42 »

convert -size 500x500 xc:red red.png
time convert \
red.png \
-crop 32x32 \
-set filename:tile "%[fx:page.x/32]_%[fx:page.y/32]" \
+repage +adjoin \
x.png

created 256 images in:

real 0m33.758s
user 0m29.075s
sys 0m1.892s





convert -size 1000x1000 xc:red red.png
time convert \
red.png \
-crop 32x32 \
-set filename:tile "%[fx:page.x/32]_%[fx:page.y/32]" \
+repage +adjoin \
x.png

496 images were created before I cancelled, with time reporting:

real 9m55.610s
user 8m22.684s
sys 0m7.499s



Perhaps it is a memory issue. Have you tried the suggestions for large-image processing at http://www.imagemagick.org/Usage/files/#massive

Re: Dividing large image faster ?

Post by snibgo »

Memory usage is light. Nowhere near saturation.

-limit area 8192 -limit memory 8192
or
-limit area 2GiB -limit memory 2GiB

makes no apparent difference.

"-debug cache" shows information that might be interesting if I understood it.

Re: Dividing large image faster ?

Post by fmw42 »

The new beta has the fxc: change; see viewtopic.php?f=3&t=16463

OK, it seems to be in the new beta.


time convert \
red.png \
-crop 32x32 \
-set filename:tile "%[fxc:page.x/32]_%[fxc:page.y/32]" \
+repage +adjoin \
x.png

created 1024 images in:


real 0m9.612s
user 0m4.265s
sys 0m3.908s


Fred

Re: Dividing large image faster ?

Post by snibgo »

Thanks Fred. Good stuff. It's still slow, but we are now talking minutes rather than days.

Re: Dividing large image faster ?

Post by fmw42 »

snibgo wrote: Thanks Fred. Good stuff. It's still slow, but we are now talking minutes rather than days.
Alan,

Actually, that was 9 seconds, not 9 minutes.

But I am not on a very fast machine (an original G4 Mac Mini, 1.42 GHz PowerPC), so perhaps it will be faster for you. Probably not by orders of magnitude, though, unless you have multiprocessor capability, which I do not.

Fred