
Gamut of RAW files


31 replies to this topic

#1 craig

craig

    Full Moon Rising

  • Super Mod
  • 2826 posts
  • Gender:Male
  • Location:Austin, TX

Posted 29 January 2004 - 05:30 PM

I'm trying to understand what a typical gamut is for a raw CCD sensor in modern digital cameras. Since RAW files contain only the sensor data (theoretically), the concept of color space doesn't apply. In all the discussions you see on what color space to use, no one seems to understand what the conversion is between the native imager gamut and the chosen space. If sRGB can contain the entire gamut the sensor can produce, why use a wider space? If it is wider than all color spaces (unlikely), then why wouldn't you use the widest one available? Just because a camera has an Adobe RGB option doesn't mean that the camera can fill it.

Since gamut is affected by the color filters used, any of this data would be sensor-specific. Anyone know of any links to discussions on raw sensor gamut? I'd like to see raw sensor gamut graphs compared to common color spaces.
I love it when a plan comes together.
- Col. John "Hannibal" Smith

------
Nikon, Seatool, Nexus, Inon
My Galleries

#2 jimbo1946

jimbo1946

    Tiger Shark

  • Member
  • 638 posts
  • Gender:Male
  • Location:Tucker, Georgia, USA
  • Interests:Scuba, UW photography, land photography, reading, driving my wife/dive buddy nuts!

Posted 30 January 2004 - 05:47 PM

Craig, if I understood one word of that, I would gladly respond. Apparently I'm not the only non-rocket scientist around here!

Call me...
CLUELESS :?: :?: :?:
Jim Chambers
Tucker, Georgia

Nikon D300 in Aquatica housing with housed SB800 flash.

#3 frogfish

frogfish

    Tiger Shark

  • Member
  • 676 posts
  • Gender:Male
  • Location:Indonesia

Posted 30 January 2004 - 06:24 PM

I'm trying to understand what a typical gamut is for a raw CCD sensor in modern digital cameras. Since RAW files contain only the sensor data (theoretically), the concept of color space doesn't apply. In all the discussions you see on what color space to use, no one seems to understand what the conversion is between the native imager gamut and the chosen space. If sRGB can contain the entire gamut the sensor can produce, why use a wider space? If it is wider than all color spaces (unlikely), then why wouldn't you use the widest one available? Just because a camera has an Adobe RGB option doesn't mean that the camera can fill it.

Since gamut is affected by the color filters used, any of this data would be sensor-specific. Anyone know of any links to discussions on raw sensor gamut? I'd like to see raw sensor gamut graphs compared to common color spaces.


I'd also very much like to know more about what RAW image files really are and how the conversion programs work. My uninformed assumption is that the conversion program essentially maps the data in a RAW file to a specific color space, with parameters controlled by adjustment fields in the program. As I understand it, sRGB is a very restricted color space, and converters can map RAW files to much larger spaces (such as Adobe RGB (1998)). But if the target space selected in the converter program/module is sRGB, then the RAW data will be mapped to that smaller space.

Is there anyone who really knows about this stuff watching this thread?
Robert Delfs

Nikon D2X in Subal housing.
Tabula Int'l Ltd.

#4 echeng

echeng

    The Blue

  • Admin
  • 5844 posts
  • Gender:Male
  • Location:San Francisco, CA
  • Interests:photography, ice cream, cello, chamber music, quadcopters

Posted 30 January 2004 - 07:04 PM

I asked Lee Peterson, since he's on various panels and is involved in next-gen chips. Here's what he said:

-----Original Message-----
Sent: Friday, January 30, 2004 10:05 AM
Subject: Answer

The Canon EOS-1D and some Kodak cameras compress their raw data with lossless JPEG. Not all cameras do this.

All RGB cameras use one of these Bayer grids:

    0x16161616:    0x61616161:    0x49494949:    0x94949494:

      0 1 2 3 4 5      0 1 2 3 4 5      0 1 2 3 4 5      0 1 2 3 4 5
    0 B G B G B G    0 G R G R G R    0 G B G B G B    0 R G R G R G
    1 G R G R G R    1 B G B G B G    1 R G R G R G    1 G B G B G B
    2 B G B G B G    2 G R G R G R    2 G B G B G B    2 R G R G R G
    3 G R G R G R    3 B G B G B G    3 R G R G R G    3 G B G B G B

The Bayer grid is the way the color pixels are laid out on the chip. Sony has introduced a 4-color grid in their new 828 camera.



Here is a sample of the code for Canon's raw conversion:

void canon_600_load_raw()
{
  uchar  data[1120], *dp;
  ushort pixel[896], *pix;
  int irow, orow, col;

  for (irow=orow=0; irow < height; irow++)
  {
    /* read one 1120-byte row: every 10 bytes encode 8 packed 10-bit pixels */
    fread (data, 1120, 1, ifp);
    for (dp=data, pix=pixel; dp < data+1120; dp+=10, pix+=8)
    {
      /* unpack: each pixel gets 8 high bits from its own byte plus
         2 low bits shared out of dp[1] and dp[9] */
      pix[0] = (dp[0] << 2) + (dp[1] >> 6    );
      pix[1] = (dp[2] << 2) + (dp[1] >> 4 & 3);
      pix[2] = (dp[3] << 2) + (dp[1] >> 2 & 3);
      pix[3] = (dp[4] << 2) + (dp[1]      & 3);
      pix[4] = (dp[5] << 2) + (dp[9]      & 3);
      pix[5] = (dp[6] << 2) + (dp[9] >> 2 & 3);
      pix[6] = (dp[7] << 2) + (dp[9] >> 4 & 3);
      pix[7] = (dp[8] << 2) + (dp[9] >> 6    );
    }
    /* store each visible pixel under its Bayer color, scaled up to 14 bits */
    for (col=0; col < width; col++)
      image[orow*width+col][FC(orow,col)] = pixel[col] << 4;
    /* pixels beyond the visible width are masked; accumulate them as black level */
    for (col=width; col < 896; col++)
      black += pixel[col];
    /* output rows are written even rows first, then odd rows */
    if ((orow+=2) > height)
      orow = 1;
  }
  black = ((INT64) black << 4) / ((896 - width) * height);
}
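
FC(row,col) just returns which Bayer color sits at a given pixel position. A minimal sketch for a plain RGGB layout (not Canon's actual macro, which packs the whole pattern into words like the grids above) would be:

/* Illustrative filter-color lookup for an RGGB Bayer layout.
   Returns 0 = red, 1 = green, 2 = blue for the pixel at (row, col). */
static int fc_rggb (int row, int col)
{
  static const int pattern[2][2] = {
    { 0, 1 },   /* R G */
    { 1, 2 }    /* G B */
  };
  return pattern[row & 1][col & 1];
}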



The potential gamut of the CCD or CMOS chip is determined by the voltage across the chip, the size of the pixels, and the amount of light striking each pixel through its filter. The more a pixel is excited, the higher the potential gamut (larger signal). Try this: shoot two images, one overexposed and the other underexposed, then check the file sizes. You will notice that the overexposed file is larger. It has a higher chip potential gamut, and more information is on the chip and in the file. The problem with overexposure is that if the surge voltage is exceeded the chip will "flare," where each pixel cannot define detail. Post-production software can process a curve with more information when it has more information to work with. A JPEG file has eliminated 25% of the information from the highlight exposure so it can make the file smaller. Thus, an sRGB-processed file has less information and the gamut has been sacrificed.

The gamut of a raw file, which is just the electronic "0s" and "1s" collected on the chip, has no relationship to RGB other than the Bayer layout. Color and RGB only come from the final processing software. The color gamut of RGB is greater, with more information, than that of an sRGB file.

A TIFF file retains all the information of the processed file. A JPEG file will reduce the file storage size by eliminating a large chunk of highlight information.



Lee Peterson

Marine Camera Distributors

01-29-04

Translation: there is no relationship between the gamut of a raw file and RGB of any kind. Colors come only from software (which we knew before). So I guess it's up to interpretation. For example, if a raw image contains certain low values and high values along a specific axis, the software can decide to extend, contract, or clip the range depending on whether the target gamut is wide enough. You have to profile your camera and raw converter software to get "real," accurate colors.
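
Something like this, roughly (my own sketch of the idea, not code from any actual converter; the value ranges are assumptions):

/* Rough sketch of "extend, contract, or clip": either scale the full
   raw range into the target encoding, or saturate whatever falls
   outside it. */
#include <stdint.h>

static uint16_t map_scale (uint16_t raw, uint16_t raw_max, uint16_t out_max)
{
  /* contract or extend: the whole sensor range fills the target range */
  return (uint16_t)((uint32_t)raw * out_max / raw_max);
}

static uint16_t map_clip (uint16_t raw, uint16_t out_max)
{
  /* clip: anything beyond the target range is simply saturated */
  return raw > out_max ? out_max : raw;
}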

I am not sure this answers the question, though, which is really -- what values of color can the camera capture and write to its RAW file, and what values do they take on in the representation?
eric cheng
publisher/editor, wetpixel
www | journal | photos


#5 craig

craig

    Full Moon Rising

  • Super Mod
  • 2826 posts
  • Gender:Male
  • Location:Austin, TX

Posted 30 January 2004 - 07:29 PM

I'm not sure what to make of Lee's comments. The raw sensor must have a characteristic gamut. Black and white sensors do (it's just not very wide).

We know that monitors have a certain gamut due primarily to their choice of phosphors. Similarly, an imager would have gamut restrictions due to the color filters chosen. It's not at all clear to me whether that gamut is wide compared to sRGB or not. What you want is a color space that encompasses the entire range of your sensor so you don't lose anything in the translation. Anything wider than that introduces unnecessary quantization loss, although I doubt that's a big problem at 16-bit. At 8-bit, an unnecessarily wide color space also causes loss of data. That's my interest.
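
As a back-of-envelope example of what I mean by quantization loss (all the numbers here are made up, just to show the arithmetic):

/* For a fixed bit depth, a wider color space spreads the same number of
   code values over a larger range, so each step is coarser. */
#include <stdio.h>

int main (void)
{
  double narrow = 1.0;   /* treat an sRGB-sized gamut as width 1.0      */
  double wide   = 1.4;   /* assume a space roughly 40% wider (invented) */
  int bits;

  for (bits = 8; bits <= 16; bits += 8) {
    double steps = (double)(1 << bits);
    printf ("%2d-bit: step %.6f (narrow) vs %.6f (wide)\n",
            bits, narrow / steps, wide / steps);
  }
  return 0;
}

At 16-bit the step size is tiny either way, which is why I doubt it matters there; at 8-bit the wider space visibly coarsens the encoding.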

The conventional advice is to use sRGB for anything destined for the screen and something else like Adobe RGB for print. What difference would it make if the camera couldn't produce any colors outside the sRGB range? Of course, some cameras offer Adobe RGB, but that doesn't mean it's useful.

Just wondering if it's really worthwhile to process in a wide-gamut color space for my 16-bit TIFFs.
I love it when a plan comes together.
- Col. John "Hannibal" Smith

------
Nikon, Seatool, Nexus, Inon
My Galleries

#6 tshepherd

tshepherd

    Great Hammerhead

  • Member
  • 880 posts
  • Location:Westfield, NJ, USA

Posted 30 January 2004 - 07:35 PM

Don't feel bad Jimbo, it's out of my reach too, if only barely, and I work in software. The code I mostly get, the hardware side, wellllllll.....

:)

#7 whitey

whitey

    Manta Ray

  • Member
  • 400 posts
  • Gender:Male
  • Location:Port Hedland in Australia's Northwest
  • Interests:All forms of nature photography. Dive medicine. The ocean.

Posted 30 January 2004 - 08:26 PM

I don't think your CCD has a gamut. It's not recording color, just voltages on three different colored sensors. Your monitor, as you mentioned, has a gamut, but it's displaying color. Gamut only works as a concept once the software has converted the sensor data into a representation of color, and even a poor-quality sensor with a limited voltage range could still potentially be converted to the largest color space. Or at least I think it works something like that!

At any rate, in practical terms the choice of color space revolves around the output device, not the input device. Whatever data the sensor and conversion software produce, it's going to be moving around the color space as you post-process it. sRGB, as a narrow-gamut color space, is a poor choice for work that may be printed. Adobe RGB is normally chosen because its moderately sized gamut is a reasonable match for the gamut of the output devices used in printing.

Rob Whitehead

Shooting with Phase One and Canon. EWA-Marine Factory Test Pilot.

www.pilbaraphoto.com


#8 craig

craig

    Full Moon Rising

  • Super Mod
  • 2826 posts
  • Gender:Male
  • Location:Austin, TX

Posted 31 January 2004 - 06:16 AM

A color gamut is the range of all possible colors that can be represented, so of course a CCD has a native color gamut. If it didn't, it would be black and white. The question is whether that native gamut is wider than a given color space used to process our TIFFs and JPEGs. If the native CCD gamut is not broader than sRGB, there would never be any benefit to using Adobe RGB. That's what I want to know, and I want to see color gamut spec sheets on these dSLR imagers so that I can make the best decision on what color space to use.
I love it when a plan comes together.
- Col. John "Hannibal" Smith

------
Nikon, Seatool, Nexus, Inon
My Galleries

#9 bvanant

bvanant

    Giant Squid

  • Team Wetpixel
  • 1580 posts
  • Gender:Male
  • Location:Los Angeles (more or less)
  • Interests:Science, photography, travel

Posted 31 January 2004 - 11:19 AM

Craig: I think that you and Whitey are discussing different things. If you look at the physics of either a CCD or a CMOS sensor (there is a good description of the differences on the Kodak technical website), you see that each pixel has some voltage associated with it, and the dynamic range of each pixel's representation is governed by the input light intensity, the "graininess" of the A/D converter, and the dark current of the device, among other things. It is only when the software interpolates this set of voltages into colors that the gamut is defined. In talking to guys who make medical imaging hardware and software for a living, they suggest that the actual representable color space for modern 6MP cameras should be much wider than sRGB, but a lot depends on how the software guys implement their algorithms. The data that exists in the CCD or CMOS output is capable of representing a very wide gamut (consider the eye's ability to comprehend very wide gamut spaces with only three receptors), but the software will determine the ultimate space that can be represented. I think what you want to see is not the color space gamut of the devices but the software that interprets them.
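
As a back-of-envelope sketch of the dynamic-range part of that (both numbers below are assumptions, not specs for any real sensor):

/* The usable range of a pixel is roughly full-well capacity over the
   noise floor, expressed in stops. */
#include <math.h>
#include <stdio.h>

int main (void)
{
  double full_well_e  = 40000.0;  /* assumed full-well capacity, electrons */
  double read_noise_e = 12.0;     /* assumed read noise, electrons RMS     */

  printf ("approx. dynamic range: %.1f stops\n",
          log2 (full_well_e / read_noise_e));
  return 0;
}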

Bill

Bill
Canon 7d, Nauticam, Lots of glass, Olympus OMD-EM5, Nauticam, 60 macro, 45 macro, 8 mm fisheye, Inon, S&S, Athena Strobes plus lots of fiddly bits.
www.blueviews.net


#10 craig

craig

    Full Moon Rising

  • Super Mod
  • 2826 posts
  • Gender:Male
  • Location:Austin, TX

Posted 31 January 2004 - 12:36 PM

Software doesn't interpret the CCD data "into" colors. The color information is derived from the fact that there are color filters overlaying the imager itself, and those color filters integrated into the imager affect the color space. That's what I want to know about. The demosaic software doesn't matter; it can only subtract from the gamut, not add to it.
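
To illustrate what I mean: each raw channel is roughly the incoming light spectrum weighted by that filter's spectral response and summed over wavelength, and that weighting is what fixes the camera's native color behavior. The response curves below are invented placeholders, not real filter data:

/* Sketch of the "color mixing function" idea: each raw channel is the
   scene spectrum weighted by a filter response and summed. */
#include <stdio.h>

#define N 5   /* coarse wavelength samples: 450, 500, 550, 600, 650 nm */

int main (void)
{
  double spectrum[N]   = { 0.8, 1.0, 0.9, 0.7, 0.5 };      /* scene light (made up) */
  double red_filt[N]   = { 0.02, 0.05, 0.10, 0.60, 0.90 }; /* invented responses    */
  double green_filt[N] = { 0.10, 0.40, 0.90, 0.40, 0.10 };
  double blue_filt[N]  = { 0.90, 0.60, 0.10, 0.05, 0.02 };
  double r = 0, g = 0, b = 0;
  int i;

  for (i = 0; i < N; i++) {
    r += spectrum[i] * red_filt[i];
    g += spectrum[i] * green_filt[i];
    b += spectrum[i] * blue_filt[i];
  }
  printf ("raw responses: R=%.3f G=%.3f B=%.3f\n", r, g, b);
  return 0;
}

Change the filter curves and the same scene produces different raw numbers; no demosaic step afterwards can recover colors the filters never distinguished.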
I love it when a plan comes together.
- Col. John "Hannibal" Smith

------
Nikon, Seatool, Nexus, Inon
My Galleries

#11 craig

craig

    Full Moon Rising

  • Super Mod
  • 2826 posts
  • Gender:Male
  • Location:Austin, TX

Posted 31 January 2004 - 04:21 PM

Here is a thread in another forum that discusses this matter pretty directly. The only answer is a generic "Adobe RGB is the right color space" but I don't know how definitive that is.
I love it when a plan comes together.
- Col. John "Hannibal" Smith

------
Nikon, Seatool, Nexus, Inon
My Galleries

#12 whitey

whitey

    Manta Ray

  • Member
  • 400 posts
  • Gender:Male
  • Location:Port Hedland in Australia's Northwest
  • Interests:All forms of nature photography. Dive medicine. The ocean.

Posted 31 January 2004 - 05:18 PM

OK, I read your thread. They do seem a bit confused about the difference between bit depth and gamut.

I'd have to agree with the comment in your referenced thread though:

"actually cameras don't have gamut's at all but color mixing functions"

although I'd substitute the word 'sensor' for 'camera'.

I agree that we are talking about different things. I'd still argue that gamut only becomes a useable concept once the sensor voltages are combined and mapped to a color space.

Theory aside, as you mentioned, people tend to work in Adobe RGB (1998) because it has a moderately sized gamut. Some people work in broader color spaces than this (I have heard of people using ProPhoto RGB, but haven't ever had need to try it). What would be the potential advantage in working in a small color space such as sRGB?

Rob Whitehead

Shooting with Phase One and Canon. EWA-Marine Factory Test Pilot.

www.pilbaraphoto.com


#13 tshepherd

tshepherd

    Great Hammerhead

  • Member
  • 880 posts
  • Location:Westfield, NJ, USA

Posted 31 January 2004 - 05:30 PM

What would be the potential advantage in working in a small color space such as sRGB?


Reduced file size? Compatibility with the web (if sRGB is the gamut for the web)? I can't see much reason to work in a smaller color space.

#14 whitey

whitey

    Manta Ray

  • Member
  • 400 posts
  • Gender:Male
  • Location:Port Hedland in Australia's Northwest
  • Interests:All forms of nature photography. Dive medicine. The ocean.

Posted 31 January 2004 - 05:47 PM

OK, I reread your post back at the top re: concerns about quantization loss; I'd forgotten where all this started. So ignore my question about why you want to use a small-gamut color space.


The conventional advice is to use sRGB for anything destined for the screen and something else like Adobe RGB for print.


FWIW, Scott Kelby in his book 'Photoshop for Photographers' argues that sRGB isn't really particularly good for screen/web use, and that it was more suited to older monitors with smaller gamuts. He's a fan of Adobe RGB for just about everything.
Even if my camera produced color that could be happily mapped into sRGB, if the gamut of my output device (printer) is larger than the working space, I'm potentially losing information. I figure the colors move around the color space during post-processing, so a color space that matches my printer is more relevant than one that matches the potential gamut of my input device.

Thinking about color management does make my brain hurt, but it is sort of interesting!

Thought for the day: Adobe Gamma is better for calibration purposes than using a spyder, because it calibrates the monitor AND the retina-visual cortex axis. :D

Rob Whitehead

Shooting with Phase One and Canon. EWA-Marine Factory Test Pilot.

www.pilbaraphoto.com


#15 echeng

echeng

    The Blue

  • Admin
  • 5844 posts
  • Gender:Male
  • Location:San Francisco, CA
  • Interests:photography, ice cream, cello, chamber music, quadcopters

Posted 31 January 2004 - 05:51 PM

FWIW, Scott Kelby in his book 'Photoshop for Photographers' argues that sRGB isn't really particularly good for screen/web use, and that it was more suited to older monitors with smaller gamuts. He's a fan of Adobe RGB for just about everything.


I've got that book, but I haven't cracked it open yet. :D

But it is very interesting that he says that. I convert all my images to sRGB before I publish them as JPEGs on the web, because Adobe RGB files I put on the web look really flat. The vast majority of people don't profile their monitors, and I find that sRGB looks right on most people's generic setups. I wish browsers would do the color space conversion to whatever monitor space people are using.
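
The arithmetic behind that conversion is roughly this, assuming the standard published primaries and matrices; real converters go through ICC profiles and rendering intents, so treat it only as a sketch:

/* Rough sketch of an Adobe RGB (1998) to sRGB conversion: linearize,
   go through XYZ with the usual D65 matrices, clip anything that falls
   outside sRGB, and re-encode with the sRGB curve. */
#include <math.h>
#include <stdio.h>

static const double adobe_to_xyz[3][3] = {
  { 0.5767, 0.1856, 0.1882 },
  { 0.2973, 0.6274, 0.0753 },
  { 0.0270, 0.0707, 0.9911 }
};
static const double xyz_to_srgb[3][3] = {
  {  3.2406, -1.5372, -0.4986 },
  { -0.9689,  1.8758,  0.0415 },
  {  0.0557, -0.2040,  1.0570 }
};

static double srgb_encode (double c)
{
  return (c <= 0.0031308) ? 12.92 * c : 1.055 * pow (c, 1.0 / 2.4) - 0.055;
}

static double clip01 (double c)
{
  return c < 0.0 ? 0.0 : (c > 1.0 ? 1.0 : c);
}

/* in[] and out[] are non-linear RGB triples in the 0..1 range */
static void adobe_to_srgb (const double in[3], double out[3])
{
  double lin[3], xyz[3];
  int i, j;

  for (i = 0; i < 3; i++)              /* Adobe RGB uses a ~2.2 gamma */
    lin[i] = pow (in[i], 2.2);

  for (i = 0; i < 3; i++) {            /* Adobe RGB -> XYZ */
    xyz[i] = 0.0;
    for (j = 0; j < 3; j++)
      xyz[i] += adobe_to_xyz[i][j] * lin[j];
  }
  for (i = 0; i < 3; i++) {            /* XYZ -> linear sRGB, clip, encode */
    double s = 0.0;
    for (j = 0; j < 3; j++)
      s += xyz_to_srgb[i][j] * xyz[j];
    out[i] = srgb_encode (clip01 (s));
  }
}

int main (void)
{
  double adobe[3] = { 0.2, 0.5, 0.8 };   /* arbitrary example pixel */
  double srgb[3];

  adobe_to_srgb (adobe, srgb);
  printf ("sRGB: %.3f %.3f %.3f\n", srgb[0], srgb[1], srgb[2]);
  return 0;
}

Browsers that ignore the embedded profile just treat the Adobe RGB numbers as if they were sRGB, which desaturates everything; doing the conversion yourself before exporting is what keeps the web JPEGs from looking flat.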
eric cheng
publisher/editor, wetpixel
www | journal | photos


#16 craig

craig

    Full Moon Rising

  • Super Mod
  • 2826 posts
  • Gender:Male
  • Location:Austin, TX

Posted 31 January 2004 - 06:11 PM

I think the confusion over "gamut" comes from its usage. "Gamut" simply means a range of things. Clearly a hardware device has a finite "range of colors." Whether it has a "color gamut" is a matter of whether you believe that's an "official term" or not. It's just semantics.

The advantage of a restricted color space is the efficiency of the coding. For a given number of bits you get finer granularity in your tones. With 12-bit ADCs and 16-bit files, it doesn't seem to be important. I'm more concerned that my sensor has a greater color range than my color space can encode.

The ProPhoto RGB space is an option in Photoshop and it's incredibly wide. Hard to believe that wouldn't do. You have to convert to sRGB for export to the web if you aren't using it natively. I want to make sure that's worthwhile.
I love it when a plan comes together.
- Col. John "Hannibal" Smith

------
Nikon, Seatool, Nexus, Inon
My Galleries

#17 echeng

echeng

    The Blue

  • Admin
  • 5844 posts
  • Gender:Male
  • Location:San Francisco, CA
  • Interests:photography, ice cream, cello, chamber music, quadcopters

Posted 31 January 2004 - 06:38 PM

Craig: the article at RobGalbraith touched on something that I worry about often. With a wider gamut, I'm worried that the gradients of blue in the image will start to look artificial. I've already seen it with some Adobe RGB files I've looked at, even before I start monkeying around in Photoshop (it's a game-over situation for the blues, I've found). I'm not sure that it's an Adobe RGB vs. sRGB problem, but it certainly is something that we have to deal with. I guess some experimentation is in order. :D
eric cheng
publisher/editor, wetpixel
www | journal | photos


#18 bvanant

bvanant

    Giant Squid

  • Team Wetpixel
  • 1580 posts
  • Gender:Male
  • Location:Los Angeles (more or less)
  • Interests:Science, photography, travel

Posted 31 January 2004 - 07:04 PM

Craig: As far as I understand it, the gamut is the boundary of the color space that a device can represent. If you are asking what the maximum range of colors is that you can represent, then I think you either need to do the experiment (which may not be trivial) or use the biggest space that you can. One of my friends at Kodak says that the PhotoYCC space exceeds all of the current display and capture technologies. I think the bottom line is that most folks believe the range of colors the camera can measure is typically much larger than sRGB. How much larger depends (I think) on the hardware and firmware in the camera.

BVA

Bill
Canon 7d, Nauticam, Lots of glass, Olympus OMD-EM5, Nauticam, 60 macro, 45 macro, 8 mm fisheye, Inon, S&S, Athena Strobes plus lots of fiddly bits.
www.blueviews.net


#19 whitey

whitey

    Manta Ray

  • Member
  • 400 posts
  • Gender:Male
  • Location:Port Hedland in Australia's Northwest
  • Interests:All forms of nature photography. Dive medicine. The ocean.

Posted 31 January 2004 - 07:27 PM

Eric, it's a funny book. I don't know what Scott's on, but I want some! Now, I don't think he is noted for his underwater photography, but FWIW his view is that sRGB is "arguably the worst possible color space for professional photographers... it mimics an el cheapo PC monitor from 4 or 5 years ago... it's fairly ghastly for photographers, especially if their work will wind up in print."

So I don't think he's a big fan of sRGB. :D

Were the files with unpleasant blue gradients 8-bit?

Rob Whitehead

Shooting with Phase One and Canon. EWA-Marine Factory Test Pilot.

www.pilbaraphoto.com


#20 craig

craig

    Full Moon Rising

  • Super Mod
  • 2826 posts
  • Gender:Male
  • Location:Austin, TX

Posted 31 January 2004 - 08:04 PM

I think experimentation on this matter is out of my league.

The disturbing thing about this is how easy it is to find opinions on color spaces but how hard it is to find the real data for hardware. Maybe I'm looking in the wrong place.

I'm concerned that, even though I believe I understand these things, I experience frequent problems and surprises when I use color spaces other than sRGB. Things like Eric mentions. I've convinced myself that anything other than sRGB is not worth the frustration, but I know that shouldn't be right. Why the hell do my images edited on my calibrated system always look wrong on the web, and why is it that when I downsize an image on a calibrated monitor it appears to change gamma? It doesn't happen on my non-calibrated systems.
I love it when a plan comes together.
- Col. John "Hannibal" Smith

------
Nikon, Seatool, Nexus, Inon
My Galleries