dead pixels?

Have questions about the equipment used for macro- or micro- photography? Post those questions in this forum.

Moderators: rjlittlefield, ChrisR, Chris S., Pau

iconoclastica
Posts: 486
Joined: Sat Jun 25, 2016 12:34 pm
Location: Wageningen, Gelderland

dead pixels?

Post by iconoclastica »

For a couple of weeks now I have noticed that the black parts of my photos are contaminated with brightly coloured pixels, most notably red, deep blue and purple. They are always in the same positions, so when stacked they show up as radiating lines. I don't know if this is really something new, because I mostly used to take care to have white backgrounds.

Here is part of such a photo at 100%, with a steep curve applied:

[Image: crop showing brightly coloured pixels scattered across the black background]
Canon 7D; large JPG; ISO 400; in-camera noise reduction on

Does this announce the end of my sensor, or is it something innocent?
--- felix filicis ---

Lou Jost
Posts: 5944
Joined: Fri Sep 04, 2015 7:03 am
Location: Ecuador
Contact:

Post by Lou Jost »

I dare say we all have these. They can usually be permanently hidden in-camera by a menu item named something like "pixel mapping" (depends on brand).

iconoclastica
Posts: 486
Joined: Sat Jun 25, 2016 12:34 pm
Location: Wageningen, Gelderland

Post by iconoclastica »

Lou Jost wrote:I dare say we all have these. They can usually be permanently hidden in-camera by a menu item named something like "pixel mapping" (depends on brand).
Hm, I thought I had that activated. I'll check that again, then.
--- felix filicis ---

rjlittlefield
Site Admin
Posts: 23561
Joined: Tue Aug 01, 2006 8:34 am
Location: Richland, Washington State, USA
Contact:

Post by rjlittlefield »

These look like "hot pixels" or "warm pixels", not really dead but inclined to accumulate charge from leakage as well as light.

If so then they'll be most noticeable with long exposure times and when the camera is physically warm, especially if you've been running in Live View so the sensor is heated.

My usual advice is to add light so that you can use a short exposure. That will keep warm pixels in dark areas from showing up. I don't know if in-camera pixel mapping will get rid of these, since whether you want to map out warm pixels depends on the time and temperature.
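To put rough numbers on that (these are just assumed values, not measurements from any particular sensor), here is a tiny sketch of why short exposures hide the leakage:

```python
# Rough sketch of the warm-pixel argument, with made-up numbers.
dark_rate = 200.0    # e-/s of leakage in a warm photosite (assumed, far above normal)
read_noise = 8.0     # e- RMS read noise of the sensor (assumed)

for t in (10.0, 1.0, 0.4, 1 / 1000):    # exposure time in seconds
    leakage = dark_rate * t             # extra charge the warm pixel piles up
    status = "visible" if leakage > 3 * read_noise else "buried in read noise"
    print(f"t = {t:>7}s  leakage = {leakage:7.2f} e-  {status}")
```

With enough light to expose at 1/1000 s, the leakage never climbs out of the noise floor; at several seconds it dominates any dark area.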

You have more than I'm used to seeing, but I'm not aware that this is a sign of impending disaster.

--Rik

Macro_Cosmos
Posts: 1511
Joined: Mon Jan 15, 2018 9:23 pm
Contact:

Post by Macro_Cosmos »

These are hot pixels, something that happens when long exposures are used or the sensor has been working for an extended time.

They are somewhat similar to dead pixels, in that their positions stay the same for corresponding exposure times. Mirrorless cameras usually yield more hot pixels due to their extremely dense designs, and some have IBIS, which makes heat-dissipation solutions difficult if not impossible to implement. DSLRs do far better in this respect. This is why deep-space astro shooters use cooled CMOS/CCD cameras and take dark frames to map out troublesome pixels.

Most cameras have long-exposure noise reduction (LENR), which basically takes two exposures: the actual exposure, followed by a dark frame of the same length that is used to map out the hot pixels.
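Roughly what that does, sketched with numpy and made-up numbers (real cameras may map and interpolate the offenders rather than literally subtract, so treat this only as an illustration of the idea):

```python
import numpy as np

rng = np.random.default_rng(0)

# Fake 8x8 "sensor" in electrons: a faint scene plus two hot photosites
# at assumed positions.
scene = rng.poisson(5, size=(8, 8)).astype(float)
dark_current = np.zeros((8, 8))
dark_current[2, 3] = dark_current[6, 1] = 400.0

light_frame = scene + dark_current + rng.normal(0, 2, (8, 8))   # shutter open
dark_frame = dark_current + rng.normal(0, 2, (8, 8))            # shutter closed, same length

corrected = light_frame - dark_frame    # the essence of LENR
print("hot pixel before:", round(light_frame[2, 3], 1),
      " after:", round(corrected[2, 3], 1))
```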

Macro_Cosmos
Posts: 1511
Joined: Mon Jan 15, 2018 9:23 pm
Contact:

Post by Macro_Cosmos »

Lou Jost wrote:I dare say we all have these. They can usually be permanently hidden in-camera by a menu item named something like "pixel mapping" (depends on brand).
It's said that typical consumer cameras use at best A grade sensors with up to 15% dead pixels.

A friend also firmly believes that Sony intentionally cripples the Micro Four Thirds sensors by providing only B-grade parts, since their recording capabilities could be very threatening. It might sound conspiratorial, but it's certainly not implausible.

iconoclastica
Posts: 486
Joined: Sat Jun 25, 2016 12:34 pm
Location: Wageningen, Gelderland

Post by iconoclastica »

rjlittlefield wrote:If so then they'll be most noticeable with long exposure times and when the camera is physically warm, especially if you've been running in Live View so the sensor is heated.
That's exactly it.

'Long exposure noise reduction' only becomes effective for exposures over one second. I mostly use about 0.4" and can't go much faster, because then the home-built speedlight synchronization during silent-mode shooting becomes unreliable. To get noise reduction it seems I'd have to expose even longer. Anyway, those pixels are there, albeit less pronounced, even at 1/1000".

I just noticed that even with a ten-second exposure the image downloaded almost immediately, so the noise reduction couldn't possibly have been active. It turned out I had confused the selected and deselected colours in the menu... Now I am getting near-black blacks again.

Thanks for helping!
--- felix filicis ---

MacroLab3D
Posts: 96
Joined: Tue Jan 31, 2017 11:40 am
Location: Ukraine

Post by MacroLab3D »

I bought a 90D recently. It has Canon's best sensor ever. I also have a Canon 5D II and a Canon M3, and ALL of them have the problem you just described.

Solution: never shoot at ISOs higher than 100 or with exposures longer than 1s.
-Oleksandr

iconoclastica
Posts: 486
Joined: Sat Jun 25, 2016 12:34 pm
Location: Wageningen, Gelderland

Post by iconoclastica »

MacroLab3D wrote:Solution: never shoot at ISOs higher than 100 or with exposures longer than 1s.
Oleksandr,

On the 7D I am testing with, I see no difference shooting at ISO 100, except that it takes longer to build up the pixel brightness, as one would expect. My impression is that pixel brightness is something like

c · exposure time · log₂(ISO)

If so, the relation with high ISO mentioned here would hardly exist (given constant lighting) but would be perceived to, because high ISO is so often used in challenging light conditions.
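To see what that conjectured relation would imply (the constant c is arbitrary here, and the formula itself is only my impression, not anything measured), compare the two factors:

```python
from math import log2

c = 1.0   # arbitrary scaling constant (assumed)

for iso in (100, 400, 1600):
    for t in (1 / 1000, 0.4, 10.0):   # exposure time in seconds
        print(f"ISO {iso:>4}  t = {t:>6}s  relative brightness = {c * t * log2(iso):8.3f}")
```

Going from ISO 100 to ISO 1600 only multiplies the value by about 1.6, while going from 1/1000 s to 0.4 s multiplies it by 400, so under this model the exposure time dominates.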

The downside of a lower ISO is that the speedlight cannot produce such ultra-short flashes, or I would need a GN180 speedlight instead of a GN45.


Wim
--- felix filicis ---

rjlittlefield
Site Admin
Posts: 23561
Joined: Tue Aug 01, 2006 8:34 am
Location: Richland, Washington State, USA
Contact:

Post by rjlittlefield »

Macro_Cosmos wrote:It's said that typical consumer cameras use at best A grade sensors with up to 15% dead pixels.
Many things are said. This one strikes me as alarmist. Can someone point to a solid reference, preferably with experimental evidence?

It seems to me that dead pixels would not be difficult to identify experimentally. For example, shoot a fine pattern of random dots, shift the camera or target by some small integral number of pixels, shoot again, then align and compare the result images. Pixel positions where the two images differ significantly must be places where there was a problematic pixel in one of the two images. Repeat this exercise for a number of image pairs, and it seems that the locations of problematic pixels would become clear.
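For the record, here is a minimal numpy sketch of that comparison, using synthetic data with one assumed stuck pixel rather than real shots:

```python
import numpy as np

def suspect_pixels(frame_a, frame_b, shift, threshold=30):
    # frame_b was shot after moving the camera or target by a known integer
    # number of pixels; `shift` = (dy, dx) is the roll that realigns it.
    aligned_b = np.roll(frame_b, shift, axis=(0, 1))
    return np.abs(frame_a.astype(int) - aligned_b.astype(int)) > threshold

# Synthetic random-dot target shot twice, 5 px apart, with one defective
# photosite that reads 255 in both frames (position (40, 40) is assumed).
rng = np.random.default_rng(1)
target = rng.integers(0, 2, size=(100, 100)) * 200        # dots of 0 or 200
shot_a = target.copy()
shot_b = np.roll(target, (0, -5), axis=(0, 1))             # scene shifted 5 px
shot_a[40, 40] = shot_b[40, 40] = 255                      # same stuck pixel in both

print(np.argwhere(suspect_pixels(shot_a, shot_b, shift=(0, 5))))
# The defect is flagged twice: where it sat in shot A, and where shot B's
# copy lands after realignment -- repeating with more pairs pins it down.
```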

I do not recall seeing any such efforts. Does anybody know of one?

--Rik

Stevie
Posts: 95
Joined: Sat Dec 27, 2008 9:17 am
Location: Belgium
Contact:

Post by Stevie »

rjlittlefield wrote:I do not recall seeing any such efforts. Does anybody know of one?
That's how they look for asteroids. ;)
If I were to look for dead pixels, I would shoot a (white) wall with no texture at all.

chris_ma
Posts: 570
Joined: Fri Mar 22, 2019 2:23 pm
Location: Germany

Post by chris_ma »

Stevie wrote:If I were to look for dead pixels, I would shoot a (white) wall with no texture at all.
I don't think that will work very well, since the camera has mapped out the dead/hot pixels at the factory and the RAW converter will interpolate them from the neighbouring pixels. So Rik's method seems more telling to me (although we'd need very high-end optics, I guess, and the OLPF glass might make things hard to spot).
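Just to illustrate why the white-wall shot stays clean (made-up numbers, and assuming the converter simply averages the four neighbours, which is a simplification):

```python
import numpy as np

rng = np.random.default_rng(2)

# Flat "white wall" raw frame with one dead photosite at an assumed position.
flat = rng.normal(1000, 10, size=(50, 50))
flat[20, 20] = 0.0

# Roughly what a factory defect map plus raw conversion does: replace the
# listed pixel with the average of its four neighbours before we see the file.
y, x = 20, 20
flat[y, x] = flat[[y - 1, y + 1, y, y], [x, x, x - 1, x + 1]].mean()

# After that, the dead pixel is statistically invisible in the flat shot:
print(abs(flat[20, 20] - flat.mean()) / flat.std())   # typically well under 3
```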

.chris

Macro_Cosmos
Posts: 1511
Joined: Mon Jan 15, 2018 9:23 pm
Contact:

Post by Macro_Cosmos »

rjlittlefield wrote:
Macro_Cosmos wrote:It's said that typical consumer cameras use at best A grade sensors with up to 15% dead pixels.
Many things are said. This one strikes me as alarmist. Can someone point to a solid reference, preferably with experimental evidence?

It seems to me that dead pixels would not be difficult to identify experimentally. For example, shoot a fine pattern of random dots, shift the camera or target by some small integral number of pixels, shoot again, then align and compare the result images. Pixel positions where the two images differ significantly must be places where there was a problematic pixel in one of the two images. Repeat this exercise for a number of image pairs, and it seems that the locations of problematic pixels would become clear.
--Rik
I had a page where someone analysed some crop-format DSLRs, but I can't find that webpage anymore.

Sensor grading is sort of similar to MTF standards: manufacturers have their own grading systems. Dead pixels aren't usually something to be worried about.

Here's a short article on the grading of CCD sensors, with grade 0 being the best, followed by grades 1, 2, and 3.
https://www.photometrics.com/learn/imag ... cd-grading

Some manufacturers such as Thorlabs prefer to use a "commercial, industrial, scientific" kind of grading to indicate how good the sensor is. The best-in-class CMOS sensors are usually called sCMOS, where the "s" means scientific. There are also, as we all know, BSI sensors, and the lesser-known stacked sensors.

As for the spec of each grade, one would have to ask manufacturers. They likely won't give us average Joe types any insight.
https://www.cloudynights.com/topic/6184 ... facturers/

I do know someone who knows the engineers at QHYCCD (they mostly use Sony sensors); perhaps he can give me more insight. I learned about this grading stuff from him too, and it sounds very esoteric... things like dark current and whatnot. Very complicated. Some cameras even have algorithms that update themselves over time to map out pesky defects.

The user "Tim" in the link above does have a good point, however what did qhyccd mean when they said grade 1?

Kodak also provides sensors in different grades; here's an article discussing hot-pixel noise and dead pixels: https://www.1stvision.com/machine-visio ... uctor.html
I'd imagine this only applies to CCDs... If I recall correctly, Kodak still plays a huge role in CCD sensors specifically. CMOS is listed as a tag, though. The example they used is a Kodak 29MP CCD.

Most of the discussions on this topic happen on forums dedicated to deep-space astrophotography.
https://www.cloudynights.com/topic/6641 ... l-qhy165c/
https://www.cloudynights.com/topic/5304 ... ade-2-ccd/
"Grade 1 sensors have no column defects".
"Column defects can show up over time even on grade 0 sensors".

My 15% figure comes from somewhere; I can't find it, but it's in my mind for a reason. I think it refers to "defective pixels" rather than dead pixels: a dead pixel is a defective pixel, and so are hot pixels. I'll have to go through my bookmarks to find the couple of articles I'm after.

It might sound alarmist, but it's really no big deal IMO.

chris_ma
Posts: 570
Joined: Fri Mar 22, 2019 2:23 pm
Location: Germany

Post by chris_ma »

I agree it's not a big deal since we all know these sensors can make lovely images.

15% sounds way too high to me, though. That's nearly 1 out of 6, so in every 2x3 cluster of pixels roughly one would be defective.
chris

rjlittlefield
Site Admin
Posts: 23561
Joined: Tue Aug 01, 2006 8:34 am
Location: Richland, Washington State, USA
Contact:

Post by rjlittlefield »

chris_ma wrote:I agree it's not a big deal since we all know these sensors can make lovely images.
Yes indeed.

In fact there's an aspect to that loveliness that I have never understood.

It's well known that warm pixels and dust spots cause radial streaks whenever scale correction is required during stacking.

This is because a defect fixed on the sensor lands in a slightly different place in each scale-corrected frame, always displaced along a radius from the center, so scale correction turns isolated dots into radial streaks, which human eyes are very sensitive to.

But note, this same phenomenon will produce radial streaks from any pixels that are systematically brighter or darker than their neighbors, at least using the PMax stacking method.
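To make the mechanism concrete, here is a toy sketch (nothing like the real PMax internals, just a nearest-neighbour rescale and a brightest-pixel projection, with an assumed warm-pixel position):

```python
import numpy as np

def rescale_nearest(img, s):
    # Crude nearest-neighbour rescale about the image centre, for illustration only.
    h, w = img.shape
    yy, xx = np.mgrid[0:h, 0:w]
    cy, cx = (h - 1) / 2, (w - 1) / 2
    src_y = np.clip(np.round(cy + (yy - cy) / s), 0, h - 1).astype(int)
    src_x = np.clip(np.round(cx + (xx - cx) / s), 0, w - 1).astype(int)
    return img[src_y, src_x]

# A stack of dark frames, each with the same warm pixel at (10, 40), rescaled
# a little more each frame as focus stepping changes the magnification.
frames = []
for i in range(30):
    f = np.zeros((101, 101))
    f[10, 40] = 1.0
    frames.append(rescale_nearest(f, 1.0 + 0.003 * i))

streak = np.max(frames, axis=0)     # crude "keep the brightest value" projection
print(np.argwhere(streak > 0))      # the single dot has smeared into a radial line
```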

I had feared, when I first started developing PMax, that this effect would make it impractical to use. But in fact, radial streaks due to random pixel variation happen so seldom that I can't recall seeing one from an actual camera.

I have no good explanation for why the sensors work so well. It's as if either the photosite-to-photosite variation in the sensor itself is much smaller than I'd expect, or some sort of flat-field correction is routinely applied that I've not heard of.

Any ideas about why things work so well?

--Rik
