


Regarding 15 bits of image and 1 bit of noise: please tell us more in support of your "1-bit of noise" comment. What happens on opening in a "true" 16-bit program? Does it spread out to 16 bits, or does it stay "compressed" while the subsequent program adds a bit of non-image non-information to the image? I'd think, if anything, that the 15-bit image would extrapolate out to 16 bits just like an 8-bit image does, and there are not 8 additional bits of noise in a 16-bit image converted from 8 bits, so why do you think there is one bit of noise in a 15-bit image, especially if it never "lands" in a 16-bit world? Where is your comment coming from? Thanks in advance for a deeper explanation and links or whatever supports your "1-bit of noise" contention.

Real pictures do have smooth gradients like the ones you test with, but they also often have a vastly reduced color count, sometimes "only" 50,000 colors, in spite of the bit depth being able to carry much more than that. I often use a sub-pixel Gaussian blur when converting from 8-bit to 16-bit to help eliminate "picket fencing" in the histogram while keeping the image smooth yet detailed enough for re-sharpening, and to expand the in-between colors and tones to take advantage of the additional bit depth (a rough sketch of this is below).

What screen display bit depth are you looking at the files through? Is it different for each display screen, and is THAT what you are seeing?

Also, just try opening up the smooth file on the other computer to see if it also displays smooth or not.

Also, try printing! What a surprise you may get there!
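
For what it's worth, here is a minimal sketch of that idea (not anyone's actual workflow; the gradient test image and the 0.5-pixel sigma are just made-up values, and it assumes numpy and scipy are available). A straight 8-bit-to-16-bit promotion adds no new information and no noise, it only leaves empty histogram bins; a tiny blur then creates in-between codes that start to use the extra depth:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

# Synthetic 8-bit gradient: only 256 distinct values are possible.
ramp8 = np.tile(np.linspace(0, 255, 512, dtype=np.uint8), (64, 1))

# Plain promotion to 16-bit: value * 257 maps 0..255 onto 0..65535 exactly.
# No extra information and no extra noise appears; the histogram simply shows
# 256 spikes separated by empty bins ("picket fencing" / combing).
ramp16 = ramp8.astype(np.uint16) * 257
print("distinct levels after plain conversion:", np.unique(ramp16).size)  # 256

# A very small (sub-pixel) Gaussian blur in 16-bit space averages neighbouring
# pixels, creating intermediate codes that begin to fill the empty bins.
smoothed = gaussian_filter(ramp16.astype(np.float64), sigma=0.5)
smoothed = np.clip(np.rint(smoothed), 0, 65535).astype(np.uint16)
print("distinct levels after 0.5 px blur:", np.unique(smoothed).size)
```

The same reasoning is why an 8-bit original does not gain 8 bits of noise when saved as 16-bit: the conversion is an exact scaling, and any additional codes only appear once something (blur, retouching, tonal edits) actually computes new in-between values.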
