That type of educated guess is the best you can hope for.
But any program would need to adjust its guessing strategy based on previous performance measures. That is, the program should be timing its IM runs and adjusting the 'guess' based on THIS machine's performance.
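For example, here is a minimal sketch (in Python, not the actual program) of that kind of self-calibration: time a few real runs on the current machine and scale up. The run_one callable is a hypothetical placeholder for a single IM invocation.

```python
import time

def estimate_total_seconds(jobs, run_one, sample_size=3):
    """Rough, machine-specific estimate: time a few real runs and scale up.

    `run_one` is a hypothetical callable that does one unit of real work
    (e.g. a single ImageMagick command on one file).
    """
    sample = jobs[:sample_size]
    start = time.monotonic()
    for job in sample:
        run_one(job)
    per_job = (time.monotonic() - start) / max(len(sample), 1)
    return per_job * len(jobs)
```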
I have done similar things with long loops. I run a fast pre-loop to find out how many 'things' it needs to do (the number and sizes of files to process, for example), then run the main loop so it counts its progress and makes a guess at when it will finish.
I can often get a fairly accurate measure of time spent, amount completed, time remaining, and overall process time (how much that last figure varies gives an indication of how accurate it is) as a running status report to the user.
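A rough sketch of that status report, assuming a placeholder process_file() for the per-item work and a file list already gathered by the pre-loop:

```python
import time

def run_with_progress(files, process_file, report_every=100):
    """Main loop with a running status report: time spent, amount done,
    estimated time remaining, and estimated overall run time."""
    total = len(files)            # known from the fast pre-loop count
    start = time.monotonic()
    for done, path in enumerate(files, start=1):
        process_file(path)        # the real (placeholder) work
        if done % report_every == 0 or done == total:
            elapsed = time.monotonic() - start
            per_item = elapsed / done
            remaining = per_item * (total - done)
            overall = elapsed + remaining   # watch how much this drifts
            print(f"{done}/{total} done  "
                  f"elapsed {elapsed:.0f}s  "
                  f"remaining ~{remaining:.0f}s  "
                  f"total ~{overall:.0f}s")
```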
Of course, the programs I add this to generally run from a few hours to a few days. One such program I left running for more than a month, but I knew it would take that long, so I just let it run on the side.
However, if I get a report of 10 to 20 years(!), then I know it is running too slowly.
ASIDE: that wasn't with image processing but with text file comparisons of roughly half a million files! Since then, such runs generally take less than a day, thanks to better processing methods and faster computers. But the technique is the same for any type of 'guess' system.