when the color blue contains an admixture of purple. For example, common bugloss (Anchusa officinalis), and to a lesser extent common chicory (Cichorium intybus). I tried switching the white balance from Auto: Ambience priority to Daylight and changing the hue and lightness of the purple tones. The flowers look more natural, but still not exactly like in the field:
I adjust on a computer using RawTherapee and Paint.NET. I look at other colors in the photo, like the greens. Usually, if you adjust for the greens, you will get decently accurate color for the rest. Another way to handle the situation is to take multiple photos (like you did) and select the best one to upload. You could also carry a small card with a spectrum of colors and include it in a photo next to the flower. Then, when you view it on a computer, you can adjust for the card, as in the sketch below.
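To make that concrete, here is a minimal Python sketch (numpy + Pillow) of what adjusting for a neutral patch on such a card amounts to; the file name and the patch coordinates are hypothetical placeholders for wherever the gray patch sits in your photo:

    # A minimal sketch of reference-based white balance, assuming an RGB
    # photo that includes a neutral gray patch (e.g. from a color card).
    import numpy as np
    from PIL import Image

    img = np.asarray(Image.open("flower.jpg"), dtype=np.float64)  # placeholder file

    # Average RGB inside a hand-picked box covering the gray patch.
    patch = img[100:140, 200:240].reshape(-1, 3).mean(axis=0)  # placeholder coords

    # Scale each channel so the patch becomes neutral (equal R, G, B).
    gains = patch.mean() / patch
    balanced = np.clip(img * gains, 0, 255).astype(np.uint8)
    Image.fromarray(balanced).save("flower_balanced.jpg")

Tuning "for the greens" by eye is, roughly, a perceptual version of the same per-channel gains, with a green you trust standing in for the gray patch.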
Yes, but in this case all colors are correct except the blue(-purple) flowers. So it seems that the camera, and perhaps the software, can't deal with colors between blue and red (purples).
I'm familiar with this as the "ageratum effect."
" a color shift well known to flower photographers, named for the ageratum flower, and acknowledged by Kodak on its website as āanomalous reflectance due to the fact that some pigments reflect infrared light that is picked up as red by the filmā." (https://photobotanic.com/news/color-shift-ageratum-affect/)
This is very interesting, and the color shift looks similar, but they write about film, not the sensor. They advise changing the hue of one zone of the spectrum.
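For what it's worth, a crude version of "changing the hue of one zone" can be sketched in a few lines of Python (numpy + Pillow); the hue band and the shift amount below are guesses to tune by eye, not values from the article:

    # A minimal sketch of nudging only one hue zone (blues that should be
    # purple) while leaving the rest of the image alone.
    import numpy as np
    from PIL import Image

    hsv = np.asarray(Image.open("flower.jpg").convert("HSV")).copy()
    h = hsv[..., 0].astype(np.int16)  # PIL maps hue 0..360 degrees onto 0..255

    # Select pixels in a rough blue band (~225..275 degrees); placeholder range.
    blues = (h > 160) & (h < 195)

    # Shift that band toward purple/magenta and write the result back.
    hsv[..., 0] = np.where(blues, (h + 12) % 256, h).astype(np.uint8)
    Image.fromarray(hsv, "HSV").convert("RGB").save("flower_hueshift.jpg")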
Maybe taking pictures in cloudy weather (infrared absorbed by clouds) would work too, if this is true? Or even in the shade?
It's a concern when a person's perception is used to override a mechanical reading: my partner is red/green colorblind, and was totally ignorant of the beautiful cardinal vine flowers we had covering a fence! So, the retinal perception (brain-polished!) may not be what the physics of reflection is saying.
My phone camera has a tendency to take dusty gray and green it up; plus, depending on the sunlight intensity (or types of artificial illumination), the colors can also morph for the same specimen.
We should think of primary colors as being in a family; often what we see is a blend of two wavelengths from different microstructures, or a blend of reflections from different hydrations. When you see purple (blue+red) and the camera is saying blue, it's capturing the wavelength input equally for the low-energy red and the high-energy blue (this is why everything deep in the ocean looks blue; low-energy red is absorbed within the first several meters of water). Incandescent illumination is notorious for yellowing indoor shots, which is why a flash should be used.
Other settings you might find surprising in color morphing are white point, black point, and shadows. And, of course, different cameras have different parameters for image capture. Why not try a comparative survey, showing results at the same time and illumination for different cameras and phones? Maybe this is a good variable to consider when test-driving a new camera or phone, rather than focus or editing capabilities. How true-blue is it? Show me your image of Anchusa officinalis!
Note: I'm a Wikipedia editor, and the chromaticity article requires scientific references (peer-reviewed journals preferred). My own comments are from decades in electron microscopy and photography.
AND: My phone images change (brighten) when downloaded from phone to Chrome drive. AI perception, what a concept.
Not to mention: the confusion of afterimages! https://en.wikipedia.org/wiki/Afterimage
I am not a digital camera expert, but I know one who helped develop digital camera technology at Kodak. He told me there is a filter in front of the sensor to eliminate infrared. I am thinking that if this filter is not sharp enough, or is not at the proper cutoff wavelength, then this could happen. An interesting experiment might be photographing the signal from a TV remote, since they use an IR LED - although I am not sure what to expect.
They reflect red and perhaps infrared (in addition to blue), but the plot is cut at 700 nm.
However, images of viper's bugloss (Echium vulgare) look fine in my opinion, or at least not as bad as those of common bugloss (Anchusa officinalis).
Common bugloss reflects less blue in comparison to red and is also darker. Maybe darker red is less visible to the eye?
And this blue/purple color really is a sum of blue and red, and there is a difference between eye and camera-sensor perception, at least in the case of common bugloss.
I've noticed more of the opposite: purple flowers coming through as blue in pictures (e.g. violets). I try to play with the white balance in camera on some of these to get more accurate colors, e.g. use the setting for shady or cloudy conditions to capture purple flowers. There are ways to correct it in image processing, but if I have a lot of pictures to process that gets tedious.
Post-process on your computer to get the colours YOU think are correct - unless you are in strictly controlled and calibrated laboratory conditions, there will inevitably be differences between what you think you have seen and what the camera sensor gives you. It is quite subjective.
Humans all perceive colors slightly differently. Colorblindness (NB: I am colorblind) is a more extreme example of this, but even among people with "normal" color vision, there is some variation in how different wavelengths of light activate their optical pigments and how these are perceived. So relying on a phototaker's memory of what color they saw, and trying to match it on a (probably uncalibrated) computer screen, is always going to produce mixed and likely unreliable results.
If accurate representation of color/wavelength is important for a photo, the best bet is to include a color standard card in the photo and use that for calibration, as others have mentioned. This doesn't eliminate all error, but it does very well and can certainly standardize the display of color for a given setup; see the sketch below.
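One common way to use such a card, sketched here in Python, is to fit a 3x3 correction matrix mapping the camera's readings of the card's patches to the card's published values; every number below is a made-up illustration, not data from any real card:

    import numpy as np

    # Camera readings of a few card patches (rows = patches); illustrative only.
    measured = np.array([[52., 52., 50.],     # dark gray patch
                         [200., 198., 196.],  # light gray patch
                         [180., 60., 55.],    # red patch
                         [60., 70., 160.]])   # blue patch

    # Published reference RGB values for the same patches; also illustrative.
    reference = np.array([[50., 50., 50.],
                          [200., 200., 200.],
                          [175., 54., 60.],
                          [56., 61., 150.]])

    # Least-squares fit of a 3x3 matrix M so that measured @ M ~ reference.
    M, *_ = np.linalg.lstsq(measured, reference, rcond=None)

    def correct(pixels):
        # Apply M to any (..., 3) pixel array, clipping back to 8-bit range.
        return np.clip(pixels.astype(np.float64) @ M, 0, 255).astype(np.uint8)

Real calibration workflows usually work in linear light (before gamma) and may add an offset term, but the least-squares idea is the same.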
There are also filters that will block near-infrared (not too expensive) if that is causing an issue, though I don't know how they perform specifically at the wavelengths relevant to the flowers discussed here.
One other major consideration to take into account with smartphone cameras is what settings the camera is using. Many apply some color modification by default. Newer phones use AI to change colors without any user input (or probably awareness...), so they shouldn't be relied on to accurately represent color/wavelength. In fact, some AI settings/functions can cause real issues for the "truthfulness" of the photos they produce, even apart from those functions that allow users to use AI tools to manipulate images intentionally. See https://www.wired.com/story/samsungs-moon-shots-force-us-to-ask-how-much-ai-is-too-much/ for an interesting (non-living) example.
Oh, good thought. I've had cataracts removed. But this is just something I've noticed since I was a kid. The slight color shift is there even with the new lenses. It's quite subtle.