Why can’t phone cameras take pictures the way I see things at night, even though they can adjust for brightness?

978 views

In: Technology

5 Answers

Anonymous 0 Comments

A camera is a very dumb tool with lots of limitations. Those limitations can be mitigated with technological advancements, but at the end of the day, it’s just a hunk of metal, glass, and plastic trying to capture photons.

The way you see with your eyes is very nearly magic. Your eyes bring in light, the light is picked up by your rods and cones, and then the information is sent to your brain, where all the magic happens. Your brain can filter things, adjust for things, and even make up information where there really isn’t any. Your eyes and brain also have a very wide “dynamic range,” which is the range of brightness levels you can perceive at once, from really bright lights down into deep, dark shadows. Some of this is the nature of our eyes, and some of it comes down to the enormous power of our brains processing the information quickly.

More importantly, your vision is set up specifically to give you the information needed to hunt, defend yourself, and stay aware of threats or resources. So when there are flaws or deficiencies, your brain doesn’t really alert you to them; it just does what it has to do to keep you safe and healthy. You may think you can see very well in really low light where a camera sees absolutely nothing, but if you really examine the image your brain is perceiving, you may notice that you are essentially seeing in black and white, because we don’t pick up color well in ultra-low light. Yet your brain will do its best to show you “false color,” because you know “that truck is red” even if you can’t see it. Your brain will basically paint the color in for you based on your knowledge.

A camera can’t (yet) apply that kind of intelligence to make those adjustments the way your mind can. It just shows you everything it can technically capture, without any fakery or trickery to help fill in blanks, remove distractions, or make up for deficiencies. Cameras also have a far narrower range between the brightest and darkest spots they can capture at once, so at night, where you have a very bright street lamp and a very dim car interior, your eyes will show you both levels of exposure, but a camera has to pick one or the other: you’ll get a blown-out white light and an interior you can see, or a well-exposed light and a black interior. The technology just isn’t there yet to expand that dynamic range. On cameras with larger sensors, the dynamic range is better, but it’s still very limited compared with the human eye. The camera also can’t fill in the blanks when it comes to colors, so what might look vibrant and detailed to your eye is going to look muddy and merely “accurate” in a photo.
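Here’s a rough sketch of that exposure trade-off in Python (the brightness numbers are invented purely for illustration, not taken from any real sensor):

```python
# Toy model of limited dynamic range: a scene with a very bright street lamp
# and a very dim car interior. The camera has to pick one exposure, and anything
# that falls outside its 8-bit (0-255) range clips to pure white or pure black.

scene = {"street lamp": 50_000, "car interior": 20}  # "true" brightness, arbitrary units

def capture(scene, exposure):
    """Scale the scene by the chosen exposure, then clip each value into 0-255."""
    return {name: min(255, max(0, round(level * exposure))) for name, level in scene.items()}

print(capture(scene, exposure=0.005))  # exposed for the lamp     -> interior rounds to 0 (black)
print(capture(scene, exposure=5.0))    # exposed for the interior -> lamp clips to 255 (white)
```

Your eye plus your brain effectively covers both ends of that range at once; a single camera frame can’t.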

Anonymous 0 Comments

Your eye is not a camera. It works on completely different principles.

The eye adjusts for darkness with its aperture (the pupil), and cameras can do that too. But it also handles dim conditions differently, triggering nerve impulses when even a single sensing cell detects light. Cameras work on a frame-by-frame exposure model that is intrinsically more susceptible to noise.

Anonymous 0 Comments

Some can, but getting it exactly how you see it would be tough. You’d have to go into the “pro” settings and adjust the ISO, aperture, and everything. A typical Galaxy has a dual-pixel sensor, which is awesome for low light. Since the S7, it’s been able to get better low-light pictures than my own eyes can see. Now it’s even better, and the clarity is impressive.
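If you’re curious what those “pro” settings actually trade off, here’s a minimal sketch assuming the standard exposure-value formula (the specific numbers are purely illustrative):

```python
import math

# Rough sketch of the "exposure triangle": aperture (f-number), shutter time, and ISO.
# The result is the scene brightness (EV at ISO 100) these settings would correctly expose;
# lower numbers mean the settings are geared toward darker scenes.

def exposure_value(f_number, shutter_s, iso):
    return math.log2(f_number**2 / shutter_s) - math.log2(iso / 100)

print(exposure_value(f_number=1.8, shutter_s=1/500, iso=50))   # ~11.7: daylight territory
print(exposure_value(f_number=1.8, shutter_s=1/10,  iso=3200)) # ~0: night territory, but the long
                                                               # shutter risks blur and the high
                                                               # ISO adds noise
```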

Anonymous 0 Comments

Camera sensors are typically not sensitive enough to pick out details in very low-light scenes. Human eyes aren’t especially well suited to low light either (plenty of animals have far better night vision than we do), but we still have better eyes than cell phone cameras. There are cameras that are better suited to low light, but they often have very poor *normal*-light performance. So making one that can do it all is pretty difficult.

Anonymous 0 Comments

Your phone camera has no adjustable aperture to control how much light comes into it the way our eyes do. So that’s one means of control gone. Instead, phones have to work purely on their ability to separate signal from noise.

Digital cameras detect light with a sensor (a CCD or, in most phones, a CMOS chip) composed of many tiny cells that convert light into an electrical signal. The retina of the eye works on *vaguely* similar principles.

In both cases, there’s a certain amount of “signal” that comes from a genuine response to light, and “noise” generated by spontaneous fluctuations in the medium.

When you have a lot of light coming in, the signal is very strong, and the noise is very weak. Imagine noise as random numbers from 0-5 being added to data that ranges from 0-10000. But when it’s dark, and the data coming in is only 0-50, noise gets very hard to filter out. The brain is just really, really good at filtering visual noise.
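A tiny sketch of that idea, using the same made-up numbers (0–5 of noise added on top of the real signal):

```python
import random

# Each "pixel reading" is the true light level plus a random 0-5 of noise.
def measure(true_level, readings=5):
    return [true_level + random.randint(0, 5) for _ in range(readings)]

print(measure(true_level=8_000))  # bright scene: the noise barely changes the numbers
print(measure(true_level=12))     # dark scene:   the noise is a big chunk of every reading
```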

Part of that is because a lot of what we see is actually the brain extrapolating from the signal it gets, not a picture-perfect, “pixel”-for-“pixel” true view. This is why optical illusions work on us: they mess with the processing and interpretation we do on the data. Our brain fills in the gaps in the data and gives us an illusion of clarity we don’t actually possess.

**TL;DR, True ELI5** Both struggle to make out shapes in the dark, but our brains are really good at picking out what’s important, and they guess at stuff where they can’t really tell. Phones can’t do that.