I scanned a sample image and
saved it as an uncompressed TIF file and at all thirteen levels of jpg
compression. Here are the results.
To spot the differences, I suggest you first look at the worst image to see where the 'compression artifacts' are most noticeable, then track up to where they are no longer noticeable; then start from the best image and track down to where you can spot the first signs of quality degradation.
[The sample images are omitted; their captions, from least compressed (12/12) down to most compressed, were:]
- TIF (won't display).
- Best quality, least compression.
- As good as 12/12 but 20% smaller.
- Slightest hint of loss of detail compared to 12/12, but a massive 33% reduction in size with little loss of quality makes this a good choice.
- Starting to see some roughness in the otherwise smooth tonal areas.
- A bit of roughness and fuzziness now visible, and slight 'halos' around some of the major color changes.
- Good quality for most uses with only the slightest suggestions of artifacts.
- Hard to see any difference between this and 7/12, but slightly better than the next setting down.
- Acceptable quality for most ordinary 'recognition' purposes.
- The artifacts are no longer quite so apparent.
- Poor quality with visible artifacts.
What can we conclude from
the above study?
First, I think most of us
will agree that the smallest file size/highest compression images are of
unacceptably poor quality for most purposes.
Second, I think we can probably also agree that for most purposes, 9/12 is acceptably close to 12/12 in quality while reducing the file size by more than 50%.
The 'sweet zone' for compression seems to be a setting between 9/12 and 6/12. At 5/12, 4/12 and 3/12, the saving in file size is not worth the tradeoff in quality, and 2/12 and below are simply too poor to use.
I personally compress
thumbnails using a 7/12 setting and full size images at a 9/12 setting.
Hopefully the above will help you choose your own preferred settings.
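If you want to run this kind of comparison yourself, a short script can batch-save one image at a range of quality settings and report the file sizes. This is a minimal sketch using Python's Pillow library (an assumption on my part; the document names no tools, and note that Pillow's quality parameter runs roughly 1-95 rather than the 0-12 editor scale discussed above, so the levels are only analogous):

```python
from PIL import Image  # Pillow, a common Python imaging library
import os

def save_quality_series(src_path, out_dir="jpeg_test"):
    """Save one image at several JPEG quality settings; return (quality, bytes) pairs."""
    os.makedirs(out_dir, exist_ok=True)
    img = Image.open(src_path).convert("RGB")  # JPEG has no alpha channel
    results = []
    for q in range(5, 100, 10):  # sample from very low to very high quality
        path = os.path.join(out_dir, f"quality_{q:02d}.jpg")
        img.save(path, "JPEG", quality=q)
        results.append((q, os.path.getsize(path)))
    return results

# Example usage ("sample.tif" is a hypothetical input file):
# for q, size in save_quality_series("sample.tif"):
#     print(f"quality {q:3d}: {size} bytes")
```

You can then open the saved files side by side and repeat the worst-to-best inspection described above on your own images.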