16bit grayscale to false colors

Posted: 2010-04-26T14:13:39-07:00
by oddparity
Hello all,

I have a 16-bit grayscale TIFF. It's a scan for scientific evaluation. I would like some kind of false-color representation, because users cannot see the full range of the 16-bit values on their monitors. I have read a lot about the -fx operator and -clut, but I'm not sure whether they work on 8-bit or 16-bit values.

Thank you very much in advance.

Regards,
Dirk

Re: 16bit grayscale to false colors

Posted: 2010-04-26T14:52:13-07:00
by fmw42
See my script, pseudocolor, at http://www.fmwconcepts.com/imagemagick/ ... /index.php and also my tidbit at http://www.fmwconcepts.com/imagemagick/ ... hp#rainbow

Code: Select all
convert grayimage \
\( -size 1x600 xc:"rgb(100%,0%,0%)" -colorspace hsl gradient: -compose CopyRed -composite -colorspace RGB -rotate 90 \) \
-clut falsecolorimage

It should not matter whether your image is 8-bit or 16-bit, but it works best if the gray image spans the full dynamic range. If it does not, add -auto-level after grayimage and before the parentheses.
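For readers unfamiliar with -clut, here is a rough sketch of the idea in pure Python (illustrative only, not ImageMagick code): each gray value is scaled to an index into a one-row color lookup table, and the hue-gradient trick above fills that table with a fully saturated HSL ramp. The function names and the table width of 600 are just for this sketch.

```python
import colorsys

def build_hue_lut(width=600):
    """Approximate a fully saturated HSL hue ramp, as produced by
    the gradient / CopyRed trick in the command above.
    Note that hue 0 and hue 1 are both red, so the ramp starts and
    ends on the same color."""
    lut = []
    for i in range(width):
        h = i / (width - 1)                           # hue in 0..1
        r, g, b = colorsys.hls_to_rgb(h, 0.5, 1.0)    # L = 50%, S = 100%
        lut.append((round(r * 255), round(g * 255), round(b * 255)))
    return lut

def apply_clut(gray16, lut):
    """Map a 16-bit gray value (0..65535) to an RGB triple,
    the way -clut indexes the lookup image by intensity."""
    idx = gray16 * (len(lut) - 1) // 65535
    return lut[idx]

lut = build_hue_lut()
print(apply_clut(0, lut))       # darkest input -> first LUT entry (red)
print(apply_clut(65535, lut))   # brightest input -> last LUT entry (red again)
```

This also shows why, as noted in the next post, red appears at both ends of the spectrum with a plain hue ramp.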

Re: 16bit grayscale to false colors

Posted: 2010-04-28T01:28:13-07:00
by oddparity
Hello,

Thank you very much for your help. I picked up your idea and modified it slightly, because I thought it would be nice not to have the color red at both ends of the spectrum.

I've done it with the following command line now, shifting grey colors to a spectrum of black-blue-green-red:

Code: Select all

convert grayimagename.tif -normalize ( xc:black xc:blue xc:green xc:red +append -filter Cubic -resize 600x1! ) -clut falsecolorimagename.tif
For other users reading this: it may well work without -normalize, of course. Also, pick whatever colors you like.

I am not sure whether this procedure preserves the perceived intensity of the pixels for users/viewers (I guess not), but it looks quite OK.
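That hunch can be checked with standard Rec. 601 luma weights. A quick pure-Python calculation (illustrative only, not part of the ImageMagick pipeline) over the four anchor colors of the black-blue-green-red ramp shows that luma rises and then falls, so perceived brightness is indeed not preserved:

```python
# Rec. 601 luma of the four LUT anchor colors (R, G, B in 0..1).
def luma(r, g, b):
    return 0.299 * r + 0.587 * g + 0.114 * b

anchors = [
    ("black", (0, 0, 0)),
    ("blue",  (0, 0, 1)),
    ("green", (0, 1, 0)),
    ("red",   (1, 0, 0)),
]
for name, rgb in anchors:
    print(f"{name:5s} luma = {luma(*rgb):.3f}")
# black 0.000, blue 0.114, green 0.587, red 0.299:
# green is brighter than red, so the ramp is not monotone in brightness.
```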

Regards,
Dirk