Blue flowers turn purple in a digital camera

when the blue contains an admixture of purple, as in common bugloss (Anchusa officinalis) and, to a lesser degree, common chicory (Cichorium intybus). I tried switching the white balance from Auto: Ambience priority to Daylight and changing the hue and lightness of the purple range. The flowers look more natural, but still not exactly as they do in the field:


Purple HSL 0 / 0 / 0 (no adjustment)


Purple HSL -10 / 0 / -5, Canon Digital Photo Professional

See also:
https://www.google.com/search?client=firefox-b-d&sca_esv=dccc38f8e2930152&sca_upv=1&sxsrf=ADLYWIJRigU5hDURMNaK4TIOZC-tjspJng:1718791834869&q="Anchusa+officinalis"&udm=2 (some flowers are blue, others purple)
Purple is not a spectral color (https://en.wikipedia.org/wiki/Line_of_purples), and perhaps this is the reason.
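
For anyone adjusting outside DPP, here is a rough Python sketch of the same kind of zone tweak: pick out pixels in a "purple" hue band and nudge their hue toward blue while darkening them slightly. The band limits, shift amounts and file names are only illustrative guesses, not what DPP does internally.

```python
import numpy as np
from matplotlib.colors import rgb_to_hsv, hsv_to_rgb
from PIL import Image

def shift_purple(path_in, path_out, hue_shift=-10 / 360.0, value_shift=-0.05):
    """Shift the hue of 'purple' pixels toward blue and darken them a little."""
    rgb = np.asarray(Image.open(path_in).convert("RGB"), dtype=np.float32) / 255.0
    hsv = rgb_to_hsv(rgb)

    # treat hues between roughly 250 and 300 degrees as the purple zone
    purple = (hsv[..., 0] > 250 / 360.0) & (hsv[..., 0] < 300 / 360.0)

    hsv[..., 0] = np.where(purple, (hsv[..., 0] + hue_shift) % 1.0, hsv[..., 0])
    hsv[..., 2] = np.where(purple, np.clip(hsv[..., 2] + value_shift, 0.0, 1.0), hsv[..., 2])

    out = (hsv_to_rgb(hsv) * 255).round().astype(np.uint8)
    Image.fromarray(out).save(path_out)

# hypothetical file names
shift_purple("anchusa.jpg", "anchusa_fixed.jpg")
```

The band limits usually need tuning per photo, since the problem pixels can spill into neighbouring hues.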


AWB sometimes helps, but not all the time.

related discussion: https://forum.inaturalist.org/t/phone-camera-colours-of-plants-not-accurate/37437


I adjust on a computer using RawTherapee and Paint.NET. I look at other colors in the photo, like the greens. Usually, if you adjust for the greens, you will get decently accurate color for the other colors. Another way to handle the situation is to take multiple photos (like you did) and select the best one to upload. You could also carry a small card with a spectrum of colors and include it in a photo next to the flower; then, when you view it on a computer, you can adjust against the card.
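
A minimal sketch of that card idea in Python (NumPy + Pillow), assuming the simplest case of a neutral gray card in the frame: sample the card and scale the channels so it comes out neutral. The file names and patch coordinates here are made up; a multi-colour spectrum card would need a fuller per-patch fit rather than just three gains.

```python
import numpy as np
from PIL import Image

def white_balance_from_patch(path_in, path_out, box):
    """Scale R, G, B so the patch inside `box` (left, top, right, bottom),
    assumed to be a neutral gray card, comes out neutral."""
    rgb = np.asarray(Image.open(path_in).convert("RGB"), dtype=np.float32)
    patch = rgb[box[1]:box[3], box[0]:box[2]]

    means = patch.reshape(-1, 3).mean(axis=0)  # average R, G, B of the card
    gains = means.mean() / means               # per-channel correction factors

    balanced = np.clip(rgb * gains, 0, 255).astype(np.uint8)
    Image.fromarray(balanced).save(path_out)

# made-up coordinates of where the card sits in this particular photo
white_balance_from_patch("field_shot.jpg", "field_shot_wb.jpg", box=(100, 200, 160, 260))
```

The same gains can be reused for a whole batch of photos taken under the same light.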


Yes, but in this case all colors are correct except the blue(-purple) flowers. So it seems that the camera, and perhaps the software, can't deal with colors between blue and red (purples).


I'm familiar with this as the "ageratum effect."

" a color shift well known to flower photographers, named for the ageratum flower, and acknowledged by Kodak on its website as ā€œanomalous reflectance due to the fact that some pigments reflect infrared light that is picked up as red by the filmā€." (https://photobotanic.com/news/color-shift-ageratum-affect/)

A site that discusses solutions: https://photo.stackexchange.com/questions/22455/how-can-i-avoid-or-correct-the-inverse-ageratum-effect-in-digital-photography


In identifying lots of observations, I've noticed that blue seems like the least stable color. It becomes purple. Gray becomes blue. Annoying.

This is very interesting, and the color shift looks similar, but they are writing about film, not a sensor. They advise changing the hue of one zone of the spectrum.
Maybe taking pictures in cloudy weather (with the infrared absorbed by clouds) would work too, if this is true? Or even in shade?

It's a concern when a person's perception is used to override a mechanical reading: my partner is red/green colorblind, and was totally unaware of the beautiful cardinal vine flowers we had covering a fence! So the retinal perception (brain-polished!) may not be what the physics of reflection is saying.
My phone camera has a tendency to take dusty gray and green it up; plus, depending on the sunlight intensity (or the type of artificial illumination), the colors can also morph for the same specimen.
We should think of primary colors as being in a family; often what we see is a blend of two wavelengths from different microstructures, or a blend of reflections from different hydrations. When you see purple (blue + red) and the camera is saying blue, it is capturing the wavelength input equally for the low-energy red and the high-energy blue (which is why everything deep in the ocean looks blue: the low-energy red drops off in transmission starting at 1 atmosphere, 34 ft). Incandescent illumination is notorious for yellowing indoor shots, which is why a flash should be used.
Other settings you might find surprising in their effect on color are white point, black point and shadow. And, of course, different cameras have different parameters for image capture. Why not try a comparative survey, showing results at the same time and illumination for different cameras and phones? Maybe this is a good variable to consider when test-driving a new camera or phone, rather than focus or editing capabilities. How true-blue is it? Show me your image of Anchusa officinalis!
Note: I'm a Wikipedia editor, and the chromaticity article requires scientific references (peer-reviewed journals preferred). My own comments are from decades in electron microscopy and photography.
AND: my phone images change (brighten) when downloaded from the phone to Chrome drive. AI perception, what a concept.
Not to mention the confusion of afterimages! https://en.wikipedia.org/wiki/Afterimage


Maybe an infrared cut-off filter would block this?


I am not a digital camera expert, but I know someone who helped develop digital camera technology at Kodak. He told me there is a filter in front of the sensor to eliminate infrared. I am thinking that if this filter is not sharp enough, or is not at the proper cutoff wavelength, then this could happen. An interesting experiment might be photographing the signal from a TV remote, since they use an IR LED - although I am not sure what to expect.


https://www.researchgate.net/publication/229913459_Host-plant_finding_and_recognition_by_visual_and_olfactory_floral_cues_in_an_oligolectic_bee
Hannah Burger, Stefan Dötterl and Manfred Ayasse, Host-plant finding and recognition by visual and olfactory floral cues in an oligolectic bee, Functional Ecology 2010, 24, 1234-1240
p. 1238, Fig. 6: Mean spectral reflection of Echium vulgare and Anchusa officinalis petals (…)

They reflect red and perhaps infrared (in addition to blue), but the plot is cut off at 700 nm.
However, images of viper's bugloss (Echium vulgare) look fine in my opinion, or at least not as bad as those of common bugloss (Anchusa officinalis).
Common bugloss reflects less blue in comparison to red and is also darker. Maybe darker red is less visible to the eye?
And this blue/purple color really is a sum of blue and red, and there is a difference between eye and camera sensor perception, at least in the case of common bugloss.

I've noticed more of the opposite: purple flowers coming through as blue in pictures (e.g. violets). I try to play with the white balance in camera on some of these to get more accurate colors, e.g. use the setting for shady or cloudy conditions to capture purple flowers. There are ways to correct it in image processing, but if I have a lot of pictures to process that gets tedious.


Post-process on your computer to get the colours YOU think are correct - unless you are in strictly controlled and calibrated laboratory conditions, there will inevitably be differences between what you think you have seen and what the camera sensor gives you. It is quite subjective.


Humans all perceive colors slightly differently. Colorblindness (NB: I am colorblind) is a more extreme example of this, but even among people with "normal" color vision, there is some variation in how different wavelengths of light activate their visual pigments and in how these are perceived. So relying on a photo-taker's memory of what color they saw, matched against what they see on a (probably non-calibrated) computer screen, is always going to produce mixed and likely unreliable results.

If accurate representation of color/wavelength is important for a photo, the best bet is to include a color standard card in the photo and use that for calibration, as others have mentioned. This doesn't eliminate all error, but it does very well and can certainly standardize the display of color for a given setup.
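
To sketch what "use that for calibration" can look like numerically: with several card patches of known reference colour, one can fit a small 3x3 correction matrix by least squares and apply it to every pixel. All the patch and reference values below are invented just to show the shape of the calculation; a real card (e.g. a 24-patch checker) gives a much better-conditioned fit.

```python
import numpy as np

def fit_color_matrix(measured, reference):
    """Least-squares 3x3 matrix mapping camera RGB to reference RGB.

    measured, reference: (N, 3) arrays of patch colours in the range 0..1."""
    M, _, _, _ = np.linalg.lstsq(measured, reference, rcond=None)
    return M  # apply with corrected = pixels @ M

# invented patch values, purely for illustration
measured = np.array([[0.20, 0.18, 0.35],   # a "blue" patch that leaks some red
                     [0.80, 0.78, 0.75],   # near-white patch
                     [0.40, 0.42, 0.41],   # mid gray
                     [0.55, 0.30, 0.28]])  # reddish patch
reference = np.array([[0.15, 0.18, 0.40],
                      [0.80, 0.80, 0.80],
                      [0.40, 0.40, 0.40],
                      [0.55, 0.28, 0.26]])

M = fit_color_matrix(measured, reference)
corrected = measured @ M  # the patches should now sit close to the reference values
```

A purely linear fit like this will not capture everything a phone's processing does, but it standardizes the output from a given camera and light reasonably well.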

There are also infrared filters that will block near infrared (not too expensive) if that is causing an issue, though I don't know how well they work for the specific wavelengths of the flowers discussed here.

One other major consideration with smartphone cameras is what settings the camera is using. Many apply some color modification by default. Newer phones use AI to change colors without any user input (or, probably, awareness…), so they shouldn't be relied on to represent color/wavelength accurately. In fact, some AI settings/functions can cause real issues for the "truthfulness" of the photos they produce, even apart from the functions that let users manipulate images intentionally with AI tools. See
https://www.wired.com/story/samsungs-moon-shots-force-us-to-ask-how-much-ai-is-too-much/
for an interesting (non-living) example.

FWIW, my right eye has always seen colors a little more brightly than the left eye. And the left eye sees colors cooler, more greenish.


Have you been screened for cataracts?


Oh, good thought. I've had cataracts removed. But this is just something I've noticed since I was a kid. The slight color shift is there even with the new lenses. It's quite subtle.
