Picture Brightness

iowahiker

Senior Member
Joined
Jan 21, 2014
Messages
487
Location
iowa
One issue that has nagged me since we got our first digital camera 15 years ago is what "standard" to use for picture brightness. Our pictures are displayed on the camera, two computers, and a TV (HDMI), with no assurance of equivalent brightness. Our camera produces two levels of brightness depending on which of two mostly automatic shooting modes is used (both use auto ISO, aperture, and shutter speed). When choosing between two nearly identical pictures with different brightness, what do you use as your "standard" brightness?
 
My assumption is that you're posting images straight out of the camera (SOOC), not using any post processing. If that is the case, I think the best answer is to choose the image you find most appealing on the display that will be seen most often.

If you are using post processing with something like Lightroom or Elements, color calibrate the display on that computer, then use one of the brightness adjustment tools to bring the image up to a level that you find most pleasing and real.
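Under the hood, a basic brightness adjustment like the ones in Lightroom or Elements is just a gain applied to every pixel and clamped to the valid range. A minimal sketch in plain Python (flat lists stand in for real image data; real editors work in a gamma-aware color space, so treat this as illustrative only):

```python
def adjust_brightness(pixels, gain):
    """Scale 8-bit pixel intensities by `gain`, clamping to 0-255.

    `pixels` is a flat list of 0-255 values; gain > 1 brightens,
    gain < 1 darkens. Values that would exceed 255 are clipped,
    which is why over-brightening blows out highlights.
    """
    return [min(255, max(0, round(p * gain))) for p in pixels]

# Example: brighten a row of tones by 20%
row = [0, 64, 128, 200, 250]
print(adjust_brightness(row, 1.2))  # [0, 77, 154, 240, 255]
```

Note how the brightest pixel (250) clips to 255: once a tone hits the ceiling, detail there is gone, which is the "overexposed highlights" problem mentioned later in the thread.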

So much in photography is (BG)^2. Best Guess By Golly... So in my book, there is no correct answer or standard, just the one that you prefer.
 
Thanks. We do no "picture processing," but posting pictures on WTW has increased my interest in a "standard" brightness. Also, our pictures are managed on one computer yet display better on the TV (HDMI), which adds to that interest.
 
You might enjoy doing a bit of post processing; it gets addictive. A copy of Photoshop Elements or Lightroom should be around $100 or under, and both are available for Windows and Mac. There is a learning curve, but once you start working with it, you learn quickly. That way, you could adjust images for the intended audience or device.
 
I had a conversation with a friend of mine about a month ago about post-processing. He felt that it was somehow cheating the "true" image the camera had taken. I tried to explain to him that the picture the camera takes isn't really what was there but the camera's "interpretation" of what it saw, based on its settings, design, and so forth. Display devices (including photo printers) are also giving "interpretations" of the information presented to them. Now, you can go to a lot of trouble to calibrate displays, but what most of us really want is to create beautiful pictures (however we define that), so the advice Wandering Sagebrush gave about setting up your dominant display sure sounds like the right answer to me.

Not sure I convinced my friend.

Alan
 
alano said:
I had a conversation with a friend of mine about a month ago about post-processing. He felt that it was somehow cheating the "true" image the camera had taken. <snip>
Alan
Ansel Adams (as well as most pro photographers) spent a lot of time in the darkroom to make the camera's rendering look like what he saw. Sometimes you get it right SOOC, but...
 
I calibrate my monitors (Datacolor Spyder) and then adjust the brightness to match what I saw. I shoot in RAW mode with all in-camera "tweaks" turned off except auto WB (and sometimes auto ISO for wildlife shots), and I mostly don't have to adjust the brightness at all unless I've set exposure compensation in camera to avoid overexposing highlights. I use Lightroom for any post processing I do.
 
I recently attended a seminar by several local photographers and saw how poorly digital projectors render colors and destroy shadows. Bottom line: don't trust the camera to decide how to process the image (i.e., its JPEG rendering), and understand that any monitor or print needs to be calibrated to create a true representation within the limits of the exposure and resolution of the sensor.
 
http://www.bhphotovideo.com/explora/photography/tips-and-solutions/how-read-your-cameras-histogram

Your camera very likely has a histogram option. I posted the B&H link because histograms are a bit tricky, but once you understand them, they help you see how your shot was exposed without needing to look at the image itself. Obviously we look at both, but if your histogram looks good, your exposure is correct, and that should fare well in most circumstances. No matter how well we take the picture, it will always "pop" more on a screen because of the backlighting. I adjust until I like it and let it go, but I do watch the histogram in tough lighting situations to see if I am close. :)
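For the curious, a histogram is nothing more than a count of how many pixels sit at each brightness level, and "reading" it for exposure problems mostly means checking for big spikes piled up at pure black or pure white. A small illustrative sketch in Python (the 1% clipping threshold is an arbitrary choice for the example, not a camera standard):

```python
def histogram(pixels, bins=256):
    """Count how many pixels fall at each 0-255 intensity level."""
    counts = [0] * bins
    for p in pixels:
        counts[p] += 1
    return counts

def clipping(pixels, threshold=0.01):
    """Flag (shadow_clipped, highlight_clipped): True when more than
    `threshold` of all pixels are piled up at pure black (0) or pure
    white (255) -- the spikes you'd see at the histogram's edges."""
    counts = histogram(pixels)
    n = len(pixels)
    return counts[0] / n > threshold, counts[255] / n > threshold

# A frame with blown highlights: a large spike of pixels at 255
frame = [255] * 30 + list(range(50, 200))
print(clipping(frame))  # (False, True)
```

A histogram hugging the right edge like this means detail in the highlights is unrecoverable, which is exactly the case the exposure-compensation advice above is guarding against.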
 
Stalking Light said:
+1. Unless you are shooting film pretty much every "straight out of camera" shot has already been manipulated by your in camera firmware. Post processing just gives *you* more control.
Not sure that I'd exempt film. I used to choose Agfachrome for fall photos off the Blue Ridge Pkwy because the fall colors were much better than with Ektachrome. I used Kodachrome on the Outer Banks in bright sunlight because I thought it was the best choice in those conditions. Ektachrome for bird photos in the woods, because I had a better chance of getting a shot in the lower light.

Each film put its own stamp on the result, but which one was true?

Paul
 
You're right about different films; I was just trying to explain that most digital cameras do picture processing internally, so the concept of SOOC as a purist thing isn't valid for them. Even shooting RAW just gives you the option of changing some in-camera adjustments after the fact. And even then you don't capture the sounds, smells, mood, etc., which all contribute to "true". ;)
 
I do use the camera's picture processing, "live" and "vivid", to add color, which is part of what raised the issue of picture brightness. Adding color with "vivid" made pictures look darker relative to fully automatic pictures, and brightening in "live" did not reduce the difference (software bug?), so the question became "which is the right brightness?"
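One way to make the "which is brighter?" question concrete is to compare the average tone of the two renderings numerically instead of by eye, since eyes adapt to whatever display they're on. A hedged sketch in Python (the pixel values below are made-up stand-ins, not data from any real camera mode):

```python
def mean_brightness(pixels):
    """Average 0-255 intensity: a crude but useful single number
    for comparing two renderings of the same scene."""
    return sum(pixels) / len(pixels)

auto_shot  = [120, 130, 140, 150]   # hypothetical auto-mode tones
vivid_shot = [100, 115, 125, 140]   # same scene, darker "vivid" rendering

print(mean_brightness(auto_shot) - mean_brightness(vivid_shot))  # 15.0
```

A consistent gap like this, measured on a calibrated display, would at least tell you the difference is in the files themselves rather than in whichever screen you happen to be viewing them on.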

This picture has "vivid" added but no brightness added:

gallery_5179_260_3498797.jpg
 