Digital cameras often have a setting that allows you to select either sRGB or Adobe RGB. What, exactly, does this do? The answer is that, like white balance and several other camera settings, it affects only the JPEG files the camera produces. If you shoot RAW (and if you don't, you really need to start), the sRGB or Adobe RGB setting has no impact whatsoever on the raw image data stored in the RAW file.

Figure 1 shows how image data is captured in a digital camera. The sensor is an array of photo-sites. A photo-site consists of a photo-detector, which is essentially a photon counter, under a “red,” “green” or “blue” filter. Typically, each pixel comprises one “red,” one “blue” and two “green” photo-sites, so there are alternating rows of blue/green and green/red photo-sites. This is called the Bayer pattern, as shown in Figure 1. The Bayer pattern is very common, but there are variations.
I use “red,” “green” and “blue” in quotes here because I want to emphasize that these are photo-site filter responses as determined by the photo-chemical properties of the dyes used in the sensor’s color filters. These color responses determine the sensor’s native color space. They are not like the red, green and blue primaries of the standard RGB spaces, such as sRGB, Adobe RGB (hereafter aRGB), or ProPhoto RGB.
After an exposure is captured, each photo-site’s photon count is realized as an analog voltage. This voltage is read off the sensor array row-by-row in a raster sequence. The signal is sampled for each photo-site response and converted into digital words by an Analog-to-Digital Converter (ADC). These digitized samples at the ADC output are the sensor’s raw image data.
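To make the capture path concrete, here is a minimal numpy sketch of a Bayer mosaic being read out and quantized by a 14-bit ADC. Everything here (the 4x4 size, the random voltages) is invented for illustration; no real sensor is this small or this simple.

```python
import numpy as np

# Illustrative only: a tiny 4x4 grid of analog photo-site voltages,
# normalized to [0, 1]. Real sensors have millions of photo-sites.
rng = np.random.default_rng(0)
analog = rng.uniform(0.0, 1.0, size=(4, 4))

# The Bayer color filter tiling: alternating blue/green and green/red rows.
pattern = np.array([["B", "G", "B", "G"],
                    ["G", "R", "G", "R"],
                    ["B", "G", "B", "G"],
                    ["G", "R", "G", "R"]])

# A 14-bit ADC maps each analog sample to one of 2**14 integer levels.
# These integers ARE the raw image data: one value per photo-site,
# still in the sensor's native color space.
adc_bits = 14
raw = (analog * (2**adc_bits - 1)).round().astype(np.uint16)

for r in range(4):
    print("  ".join(f"{pattern[r, c]}:{raw[r, c]:5d}" for c in range(4)))
```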
Figure 1 also indicates that the camera’s ISO setting controls an analog amplifier ahead of the ADC. Increasing the amplification, for higher ISO, boosts low-level signals but at the cost of also amplifying sensor noise. That story, the story of dynamic range, we’ll hold for another day.
After the ADC, a processor core called a Digital Signal Processor (DSP), also called an Image Processor, applies in-camera digital processing to the raw image data. Both in-camera and post processing of raw image data include the following:
- Demosaicing: The rearrangement and interpolation of the raw image data (row-by-row sampled photo-site data) into pixel data channels (blue-green-green-red for the Bayer pattern), still in the sensor’s native color space. (See the sketch below.)
- Color Space Transformation: The transformation from raw image data to a standard RGB color space. This step does use the camera’s white balance setting. In camera, the destination space is determined by the s/aRGB camera setting.
- Other Adjustments: Contrast, saturation, sharpening, etc.
- JPEG encoding
- Lossless data compression (typically LZW coding)
Sometimes, the combination of demosaicing and color space transformation is called rendering.
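To illustrate the demosaicing step, here is a toy bilinear demosaic in Python (numpy plus scipy). Real raw converters use far more sophisticated interpolation than this; the sketch, with an assumed blue/green and green/red tiling, only shows the basic idea of filling in each channel from its sampled neighbors.

```python
import numpy as np
from scipy.ndimage import convolve

def demosaic_bilinear(raw, tile=("B", "G", "G", "R")):
    # Toy bilinear demosaic. `tile` is the 2x2 Bayer tile, row by row;
    # ("B", "G", "G", "R") matches the alternating blue/green and
    # green/red rows described above.
    h, w = raw.shape
    rgb = np.zeros((h, w, 3))
    mask = np.zeros((h, w, 3))
    channel = {"R": 0, "G": 1, "B": 2}
    for i, name in enumerate(tile):
        c = channel[name]
        rgb[i // 2::2, i % 2::2, c] = raw[i // 2::2, i % 2::2]
        mask[i // 2::2, i % 2::2, c] = 1.0
    # For each channel, average over the neighborhood, counting only the
    # photo-sites where that channel was actually sampled, then keep the
    # original value at the sampled sites.
    kernel = np.array([[1.0, 2.0, 1.0], [2.0, 4.0, 2.0], [1.0, 2.0, 1.0]])
    for c in range(3):
        num = convolve(rgb[..., c], kernel, mode="mirror")
        den = convolve(mask[..., c], kernel, mode="mirror")
        rgb[..., c] = np.where(mask[..., c] > 0, rgb[..., c], num / den)
    return rgb  # per-pixel RGB, still in the sensor's native color space

mosaic = np.random.default_rng(0).uniform(size=(6, 6))
print(demosaic_bilinear(mosaic).shape)  # (6, 6, 3)
```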
Your camera always creates a JPEG, even if you shoot in RAW-only mode; if nothing else, the LCD preview display is a JPEG. JPEG production can only happen after demosaicing and color space transformation, and any other adjustments are applied prior to JPEG encoding. Nonetheless,
RAW files store raw image data as produced at the ADC, in the sensor’s native color space – hence the name RAW. The only in-camera digital processing applied to the raw image data is lossless data compression, and the only camera setting that affects the raw image data is the ISO setting*.
*There are actually some very minor exceptions to the ‘ISO only rule,’ but they are not generally relevant to modern digital cameras. I’ll get back to those exceptions below.
The color space transformation uses the four (Bayer pattern) samples per pixel to calculate the red, green and blue channels of a standard RGB space. That’s actually a rather messy signal processing operation, as it must compensate for sensor nonlinearities before the linear transformation to a standard RGB space. If you are a digital wonk like me, you can download the open standard Adobe DNG Specification (the main reference for this blog) and read about the gory details, like I did.
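As a sketch of the idea (ignoring the nonlinearity compensation the DNG spec describes), the linear core of the transformation is a per-channel white balance scaling followed by a 3x3 matrix. The matrix and gains below are invented for illustration; they are not any real camera’s calibration data.

```python
import numpy as np

# Simplified linear model of the color space transformation. A real
# pipeline (see the Adobe DNG Specification) also compensates for sensor
# nonlinearities. This matrix is invented for illustration; it is NOT
# any real camera's calibration data. Its rows sum to 1.0 so that a
# white-balanced neutral gray stays neutral.
CAMERA_TO_SRGB = np.array([[ 1.8, -0.6, -0.2],
                           [-0.3,  1.6, -0.3],
                           [ 0.1, -0.7,  1.6]])

def render_linear(camera_rgb, wb_gains):
    # wb_gains: per-channel multipliers derived from the camera's WB
    # setting (this is where the WB setting enters the rendering).
    balanced = np.asarray(camera_rgb) * np.asarray(wb_gains)
    return np.clip(balanced @ CAMERA_TO_SRGB.T, 0.0, 1.0)

pixel = np.array([0.20, 0.35, 0.15])  # one demosaiced, sensor-native pixel
print(render_linear(pixel, wb_gains=(1.9, 1.0, 1.4)))  # "daylight"-ish gains
```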
White balance (WB) is a special adjustment setting. It does not affect the raw image data. However, color space transformation, both in-camera and in post, does use the camera’s WB setting. Thus, the commonly cited statement that camera WB does not affect post-processing of RAW files is not quite correct. The camera’s WB setting is stored in the RAW file and used for color space transformation. That determines the initial preview that you start with in post processing.
If you are a serious post-processing photographer, be careful with auto-WB, because auto sets the WB for each image individually. In post processing, for a group of pictures shot in the same setting, you’ll take the time to get your color balance right once and then apply that adjustment across the entire group. But if the group was shot with auto-WB, then strictly speaking you have to color balance each image separately. It doesn’t really matter what the WB setting is; it only matters that it is the same across the group. I have recently adopted the practice of using the neutral “daylight” WB setting for all shooting. This ensures a uniform WB setting across all images, so I don’t have to check before applying a batch color balance adjustment. The only drawback is that the in-camera previews have the wrong WB rendering, so the color balance is off in the camera previews. I really don’t care; I don’t use the in-camera LCD preview for adjustments. However, I do care about the histogram and blow-out indicators provided on the camera’s LCD to help me set exposures. I want the camera’s histogram to accurately reflect the raw image data, not a white-balanced preview. That’s why I use a neutral in-camera rendering (daylight WB) to calculate histograms.
My Nikon D700 offers many more adjustments under the heading “picture controls.” There are presets: standard, neutral, vivid and monochrome, and you can add custom picture controls that let you specify things like contrast, brightness, saturation, sharpening, etc. We might call this “pre-processing,” because all the adjustments are set prior to capturing the image.
Now, if you’re like me, you agree that the camera’s LCD is not the right place to make adjustments. I do that on my 27” iMac, thank you very much. Nonetheless, with the exception of WB, these in-camera adjustments, along with the camera’s s/aRGB setting, only apply to in-camera JPEG production. I tested this on my D700: I shot my SpyderChecker card with each of the “picture control” presets (neutral, vivid, …), and then opened the RAW files in Lightroom. The big hint was that the monochrome image, which was B&W on the camera’s LCD preview, showed up in glorious color when I opened the RAW file in Lightroom. Indeed, all four images were identical.
Now, I understand why manufacturers offer all of these in-camera adjustments. First, of course, they want pictures captured with their cameras to look great with no post processing! But there is more to it than that. Some photographers need to shoot a lot of photos and process them quickly. Think wedding photography. They can set the white balance, sharpening, that soft wedding-style contrast, etc., for each shooting environment, and then just shoot and deliver the in-camera JPEGs. There’s nothing wrong with that. It is just an efficient way to deliver a quality product to clients quickly.
Nikon also offers something called Active D-Lighting, which supposedly increases dynamic range. Other manufacturers may offer similar gimmicks. I did a little internet searching and found that (apparently) Active-D is a combination of exposure modification and a digital tone mapping algorithm. The exposure modification is one of those “exceptions” to the ‘ISO only rule,’ because it does affect the raw image data. This sounds similar to the recommended practice of expose to the right (ETTR), which I’ll discuss in a blog on dynamic range. I just turn that stuff off! If you are a serious post-processing photographer, you will make your own tone-mapping adjustments (using things like gradient filters and masked layers); I don’t want to rely on a canned algorithm. Fortunately, the tone mapping part of Active-D is digital processing that is only applied to in-camera JPEGs. Still, it might monkey with your exposure. So just don’t do it. Learn the ETTR technique and manage exposure yourself.
Post processing software uses a working RGB space. You should configure your post processor for the largest possible working space, which typically is ProPhoto RGB. ProPhoto is the default setting for Lightroom. In Photoshop, you set the working color space in the Color Settings dialog box, which gives you an incredible array of options. You can even specify your own custom RGB space, such as a monitor’s native RGB space, but ProPhoto is more than sufficient for the vast majority of photographers. Using a wide gamut working space like ProPhoto (which is much wider than can be produced in print or on a monitor) allows processing to create new colors via various adjustments without having to perform the destructive (and nonlinear) out-of-gamut rendering operation after each adjustment. You should apply the out-of-gamut rendering only as the very last step to produce your final product (a JPEG or print), under your control, using tools like soft proofing.
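Here is a numeric sketch of that “clip only once, at the very end” principle. The matrices are approximate values from standard color-science references, and the chromatic adaptation between ProPhoto’s D50 and sRGB’s D65 white points is omitted to keep the sketch short, so the colors are slightly off; the pipeline shape is the point.

```python
import numpy as np

# Approximate ProPhoto RGB (D50) -> XYZ and XYZ (D65) -> linear sRGB
# matrices, as tabulated in standard color-science references. The
# D50 -> D65 chromatic adaptation step is omitted here for brevity.
PROPHOTO_TO_XYZ = np.array([[0.7977, 0.1352, 0.0313],
                            [0.2880, 0.7119, 0.0001],
                            [0.0000, 0.0000, 0.8249]])
XYZ_TO_SRGB = np.array([[ 3.2406, -1.5372, -0.4986],
                        [-0.9689,  1.8758,  0.0415],
                        [ 0.0557, -0.2040,  1.0570]])

def finalize(prophoto_linear):
    # One transform to the output space, then ONE clip at the very end.
    srgb_linear = prophoto_linear @ (XYZ_TO_SRGB @ PROPHOTO_TO_XYZ).T
    return np.clip(srgb_linear, 0.0, 1.0)  # crude out-of-gamut rendering

# A saturated green that ProPhoto can hold but sRGB cannot:
print(finalize(np.array([0.05, 0.60, 0.10])))  # red clips to 0.0 here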
Still, many photographers seem to think that RAW files store image data in sRGB or aRGB, undoubtedly because it is a setting on the camera. If that were true, there would certainly be a massive outcry on photography blogs. It just wouldn’t make sense to compress raw image data to s/aRGB in-camera, and then transform to a larger working space like ProPhoto for post processing. There should be only one transformation from sensor-native to a wide gamut working space, and then, after processing, one transformation from the working space to the production JPEG or print gamut, where you control out-of-gamut rendering.
I recommended above to always use a neutral WB setting in-camera, but I do assume you will post-process in a wide working space. Transforming to sRGB as the working space with the wrong WB setting can produce undesirable results that post adjustments cannot correct, because color information was lost.
Unfortunately, manufacturers tend to obfuscate these facts. My Nikon D700 user’s manual, for example, makes no mention of the fact that the s/aRGB setting, the ‘picture control’ adjustments and the Active-D tone-mapping apply only to in-camera JPEG production. It simply doesn’t say anything about which output format these settings affect – as if JPEG is just assumed to be the default image capture format.
A few years back, Nikon introduced some lossy compression in their RAW file format, NEF. You can now select either 12-bit or 14-bit NEF files. The 12-bit data is slightly compressed to produce a smaller RAW file. While the 12-bit compression distortions may be imperceptible, the public relations blowback was not. For my part, I have my D700 set to 14-bit data to get a true RAW file, as any absolute data integrity purist would do.
More Details, Anyone?
The remainder of this blog is going to be a deeper dive into some details, so feel free to stop reading here.
What exactly is the sensor’s native color space? As stated above, it is determined by the response curves of the “red,” “green” and “blue” color filters on top of the sensor photo-sites. I don’t have color response curves for commercial sensors (they are likely proprietary anyway). However, these color filters play exactly the same role as the opsin proteins in the cone cells of our eyes. The cone cell response curves, and the color space they determine, are called LMS for Long, Medium and Short wave, which only roughly correspond to red, green and blue. The figure below shows the LMS response curves vs. the wavelength of monochromatic light. (Monochromatic means pure light of a single wavelength.) Note that the LMS responses are very broad and have substantial overlap.
In contrast, the red, green and blue colors of standard RGB spaces are nearly monochromatic colors. How do I know this? Because they are primary colors, which have a very special property. While other colors can (generally) be represented as combinations of the red, green and blue primaries, the reverse is not true. Consider yellow, which is a secondary color. Our eyes and brains perceive yellow as a combination of the L and M responses. Thus, we can perceive the same yellow as a mixture of red and green monochromatic light, or as a single monochromatic yellow light. So, why are the standard RGB primaries “nearly monochromatic?” Because you want the widest color gamut that can be produced by just three primaries, and that requires highly saturated (essentially, nearly monochromatic) primary colors.
One should not confuse the sensor photo-site color filter response curves with the primaries of standard RGB spaces. It’s just apples and oranges. There are infinitely many combinations of monochromatic light that produce the same perception of a yellow color, and a color sensing system must properly recognize all of them as yellow. A reproduction system, on the other hand (like a monitor, a post processor working space, or a print), only needs to produce one of the combinations that result in the same color perception.
Still not convinced? Well, suppose we build a sensor whose response curves are the red, green and blue spectra of, say, the sRGB primaries, as illustrated in Figure 3. Case (a) shows what happens when we excite the sensor with equal levels of red and green light. The sensor works fine: the sensed levels for red and green are equal, and our eyes and brains perceive that as yellow. Case (b) shows the same sensor excited with a monochromatic yellow light that we perceive as exactly the same yellow as the red-green combination of (a). In this case, however, there is no sensor response. The narrow sRGB primary spectra, used here as sensor response curves, don’t overlap and don’t cover the entire range of visible wavelengths. In case (b), the sensor reports that the color is black – and that’s just wrong!
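You can run the Figure 3 thought experiment numerically. In this sketch, Gaussian curves stand in for the real response curves (which are measured, not Gaussian); the point is only that broad, overlapping curves respond to a monochromatic yellow while narrow primary-like curves report black.

```python
import numpy as np

wl = np.arange(400.0, 701.0, 1.0)  # visible wavelengths, nm

def gaussian(center, width):
    return np.exp(-0.5 * ((wl - center) / width) ** 2)

# Broad, overlapping LMS-like curves vs. narrow "primary-like" curves.
# All shapes here are Gaussian stand-ins, not measured data.
broad = np.stack([gaussian(565, 50), gaussian(540, 45), gaussian(445, 30)])
narrow = np.stack([gaussian(612, 8), gaussian(549, 8), gaussian(465, 8)])

def sense(curves, spectrum):
    # Each photo-site integrates the incoming spectrum against its filter.
    return curves @ spectrum

red_plus_green = gaussian(612, 3) + gaussian(549, 3)  # case (a)
mono_yellow = gaussian(580, 3)                        # case (b)

for name, curves in [("broad", broad), ("narrow", narrow)]:
    print(f"{name:6s} R+G -> {np.round(sense(curves, red_plus_green), 2)}"
          f"  yellow -> {np.round(sense(curves, mono_yellow), 2)}")
# The narrow-primary sensor reports (near) zero for the monochromatic
# yellow: it would call that color black.
```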

It is, therefore, a source of confusion that camera manufacturers describe their color photo-sites as “red,” “green” and “blue.” It would be better (in my opinion) to call these “sensor LMS” responses.
Ever notice that nobody talks about camera color gamut? You don’t see it discussed in reviews. The reason is that the sensor’s color gamut is likely not a limiting factor. In fact, I can provide a mathematical proof that a camera sensor can accurately sense 100% of humanly perceivable colors if and only if its response curves are a linear combination of the LMS responses. (Send requests to john.sadowsky@me.com.) The sensor issue is not gamut (the range of colors that can be sensed), but rather color accuracy, that is, the degree to which all of the combinations of monochromatic light that are humanly perceived as the same color are sensed correctly as the same color.
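Here is a quick numeric illustration (not the proof itself) of one direction of that claim, again with Gaussian stand-ins for the LMS curves: if a sensor’s curves are a linear combination of LMS, then any two spectra with identical LMS responses (metamers) also produce identical sensor responses.

```python
import numpy as np

wl = np.arange(400.0, 701.0, 1.0)  # visible wavelengths, nm

def gaussian(center, width):
    return np.exp(-0.5 * ((wl - center) / width) ** 2)

# Gaussian stand-ins for the LMS curves (the real curves are measured).
LMS = np.stack([gaussian(565, 50), gaussian(540, 45), gaussian(445, 30)])

# A sensor whose response curves are a linear combination of LMS.
A = np.random.default_rng(1).uniform(0.1, 1.0, size=(3, 3))
sensor = A @ LMS

# Build two different spectra with identical LMS responses (metamers) by
# adding a component from the null space of the LMS integration. (We
# ignore physical nonnegativity of spectra; this is pure linear algebra.)
base = gaussian(580, 40)
_, _, Vt = np.linalg.svd(LMS)
metamer = base + 0.5 * Vt[-1]  # LMS @ Vt[-1] is numerically ~0

print("same LMS response:   ", np.allclose(LMS @ base, LMS @ metamer))
print("same sensor response:", np.allclose(sensor @ base, sensor @ metamer))
# Both print True: this sensor cannot distinguish what the eye cannot.
```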
Alas, much of the literature (and many “expert” blogs on the internet) does seem to confuse a sensor’s native color space with processing/reproduction spaces like the standard RGBs. Of course, different manufacturers use different variations of color filter technology, which leads to different native color spaces. Perhaps this is why the industry has resisted adopting a uniform standard for RAW files. DNG provides all the necessary support for native sensor color space transformation; it continues to be an enigma that many camera manufacturers have resisted DNG.
DNG includes all the raw image data exactly as it exists in the manufacturer’s RAW file. Like all RAW files, DNG also includes metadata and camera settings. This additional data is stored in data fields called tags. DNG is actually an extension of the TIFF (Tagged Image File Format) standard that adds DNG-specific tags, and those tags are used for color space transformation in Adobe Camera Raw. I purchased an app called ExifExtreme from the Apple App Store that reads all the tags in DNG and TIFF files. (There is also a UNIX command line tool called tiffutil that will read TIFF tags, but it doesn’t read the DNG-specific tags or the manufacturer’s specific data.) The DNG tags are listed in the DNG spec. The DNG also includes all of the manufacturer-specific tags (in my case, from the Nikon NEF file). You get all those in-camera adjustment settings (standard, vivid, Active-D, etc.), even though they are not incorporated in the raw image data. If you are a digital wonk, ExifExtreme is a good $5 investment.
Lastly, I need to mention the other exception to my ‘ISO only rule.’ According to the Adobe DNG Specification, some cameras can apply an “analog WB.” This is implemented as different analog gains on different color channels prior to the ADC. Hence, this WB setting does affect the raw image data. Adobe says it can increase dynamic range. My internet searches turned up only scant discussion of this. Analog WB was apparently implemented on the Nikon D1, and perhaps in some Pentax cameras. My guess is that it is a difficult technology to implement, that there are better ways to increase dynamic range via sensor design, and that, hence, it has not been widely adopted. If you know more about cameras with analog white balance, please share. The presence of analog white balance is indicated by the DNG tag AnalogBalance. If your DNG does not include the AnalogBalance tag, or it has the value 1 1 1, then you don’t have analog white balance (as is the case for my D700 files).
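If you’d rather poke at the tags programmatically than buy an app, here is a sketch using the third-party Python package tifffile (assuming its tag API behaves as documented); the filename is a placeholder, and 50727 is the AnalogBalance tag code per the DNG spec.

```python
import numpy as np
import tifffile  # third-party package: pip install tifffile

# Sketch: list the tags in a DNG's first IFD and look for analog WB.
# "example.dng" is a placeholder path; 50727 is the AnalogBalance tag
# code per the Adobe DNG Specification.
with tifffile.TiffFile("example.dng") as dng:
    tags = dng.pages[0].tags
    for tag in tags.values():
        print(tag.code, tag.name, tag.value)
    analog = tags.get(50727)  # AnalogBalance
    if analog is None:
        print("No AnalogBalance tag: no analog white balance.")
    else:
        gains = np.asarray(analog.value, dtype=float)
        print("Analog WB gains:", gains)  # all 1.0 means none applied
```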
John Sadowsky