Pixels for use at 4-5X on an APS-sized sensor


Charles Krebs
Posts: 5865
Joined: Tue Aug 01, 2006 8:02 pm
Location: Issaquah, WA USA
Contact:

Post by Charles Krebs »

From: http://www.microscopyu.com/tutorials/ja ... index.html
Physical sensor dimensions are not all that important here in determining the pixel "count" needed... if you are to record a certain field size that has a specific amount of resolved detail per unit of field size, then the number of pixels needed is the same. (Format aspect ratio makes some difference, but that is not the point here.)

5820 x 4360 = 25,375,200 = 25.4 Mp
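The bookkeeping behind a figure like that can be sketched in a few lines. This is a minimal sketch with hypothetical field and resolution numbers, chosen only to show that the sensor dimensions drop out; they are not the inputs behind the 5820 x 4360 figure above:

Code: Select all

def pixels_needed(field_size_mm, resolved_lp_per_mm, px_per_lp=2):
    """Pixels along one axis: line pairs resolved across the field,
    times the sampling rate (2 is the Nyquist minimum; 3 is safer)."""
    return field_size_mm * resolved_lp_per_mm * px_per_lp

# hypothetical example: a 1.6 mm field with 1100 lp/mm resolved
print(pixels_needed(1.6, 1100))  # 3520 pixels, regardless of sensor size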

Image

Image

naturephoto1
Posts: 509
Joined: Sun Nov 13, 2011 5:37 pm
Location: Breinigsville, PA
Contact:

Post by naturephoto1 »

nernelly wrote:I think there might be two points mainly contributing to the big difference in Mpix numbers mentioned above: 1) different resolution criteria and 2) the squaring effect when converting pixel size into Mpix count.

This reference discusses many relevant aspects of the topic: http://luminous-landscape.com/tutorials ... tion.shtml. At the bottom it presents optimal pixel sizes for two different scenarios based on different resolution criteria (table 2, columns (3)&(4)). The optimal pixel sizes given for e.g. f/11 would be in case A) 7.4 µm, in case B) 3.7 µm (for green light). Converting these numbers to pixel counts on an APS-C sensor results in A) 7 Mpix (cf. table 3) and B) 28 Mpix.

The important difference between the two scenarios is the resolution criterion applied. Case B) seems to correspond to the classical Rayleigh criterion (cf. figure 5). This requires a separation between two Airy discs that is a factor of 2 smaller than the one used in case A) (cf. figure 7). This factor gets carried into the optimal pixel sizes and squared when calculating the Mpix count.

Hope I got this right (and comprehensible). Already quite late here...
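A quick script confirms the squaring effect nernelly describes. A sketch, assuming a ~23.6 x 15.7 mm APS-C sensor, which reproduces the quoted counts:

Code: Select all

def mpix(sensor_w_mm, sensor_h_mm, pixel_um):
    """Megapixel count implied by a sensor size and a square pixel pitch."""
    return (sensor_w_mm * 1000 / pixel_um) * (sensor_h_mm * 1000 / pixel_um) / 1e6

for pitch_um in (7.4, 3.7):  # case A) and case B) pixel sizes
    print(pitch_um, round(mpix(23.6, 15.7, pitch_um), 1))
# halving the pixel size quadruples the count: ~6.8 -> ~27 Mpix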
I was going to comment on this before having read the Luminous Landscape article, which I will now read.

As an example of this, Canon found that larger pixels gave better performance in the sensors of their G-series cameras. They went from 14.7 megapixels in the G10 down to 10 megapixels in the G11 and G12, and found sharper and better rendering of the images.

Oh well, I still have and like my Canon G9 and G10 for their purposes.

Rich

rjlittlefield
Site Admin
Posts: 23605
Joined: Tue Aug 01, 2006 8:34 am
Location: Richland, Washington State, USA
Contact:

Post by rjlittlefield »

Perhaps I've mentioned that I don't trust calculations without experimental confirmation. In fact I'm tired enough at this moment that I don't really trust calculations at all. OK, maybe a little, at the very end. But mostly I'm going to stick with experimental results.

What I've done here is to set up an f/11 lens and capture its image in two different modes: 1) direct projection onto a DSLR sensor, and 2) magnified projection, roughly 10X onto the same DSLR sensor.

Here are the optics. On the front, there's an Olympus 135 mm bellows lens set at f/11, serving as the main image-former. Behind that, as shown here, is a micrometer slide that allows direct measurements in the image plane. Behind that is a 10X microscope objective, feeding back to a Canon T1i. This is mode 2, magnified projection. To do mode 1, a bunch of stuff gets removed and the camera then mounts directly on the back of the first bellows so that the camera's sensor sits where the micrometer slide did in mode 2.

Image

The whole contraption above is pointed down a long hallway, staring at a USAF resolution chart taped to a door. The chart is lost in glare here, but that's a real image of it on the back of the camera. Again, this is still in mode 2, magnified projection.

Image

So much for setup. Now things get interesting.

Here's what the camera sees in mode 2, magnified projection. (You can see the micrometer scale superimposed on the image.) Notice that the lens is just barely resolving Group 1 Element 3, indicated by the red arrow.

Image

In comparison, switching to mode 1, here's what the camera sees when the image goes directly onto the sensor. Notice that in this mode, Group 0 Element 4 is the last resolved set; Element 5 has completely lost it for the vertical lines.

Image

So...

What we have here is an f/11 image that is clearly not fully captured by a 15 mp APS sensor (pixel size 4.7 microns). In fact it's short by a resolution factor of roughly 2^(5/6) = 1.78, that being the ratio in bar spacing between Group 1 Element 3 and Group 0 Element 4.

At this point, I have just about enough faith in calculations to do this one:

15 megapixels current * 1.78 * 1.78 = 47 megapixels needed
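For anyone who wants to replay that arithmetic, here is a minimal sketch. The lp/mm values describe the chart itself rather than the image plane, but only their ratio matters for the scaling:

Code: Select all

def usaf_lp_per_mm(group, element):
    """Bar spacing of a USAF-1951 target element, in line pairs per mm."""
    return 2 ** (group + (element - 1) / 6)

ratio = usaf_lp_per_mm(1, 3) / usaf_lp_per_mm(0, 4)  # 2**(5/6) ~= 1.78
print(round(ratio, 2), round(15 * ratio ** 2, 1))    # 1.78, ~47.6 megapixels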

I'll read the rest of the discussions and comment more later. Other tasks intrude right now...

--Rik

Edit: to add pixel size in microns
Last edited by rjlittlefield on Tue May 01, 2018 2:40 pm, edited 1 time in total.

seta666
Posts: 1071
Joined: Fri Mar 19, 2010 8:50 am
Location: Castellon, Spain

Post by seta666 »

Very interesting Rik; these results made me think of the technical notes in Cambridgeincolour:

-The actual pixels in a camera's digital sensor do not actually occupy 100% of the sensor area, but instead have gaps in between. This calculation assumes that the microlenses are effective enough that this can be ignored.
-Nikon digital SLR cameras have pixels which are slightly rectangular, therefore resolution loss from diffraction may be greater in one direction. This effect should be visually negligible, and only noticeable with very precise measurement software.
-The above chart approximates the aperture as being circular (a common approximation), but in reality apertures are polygonal with 5-8 sides.
-One final note is that the calculation for pixel area assumes that the pixels extend all the way to the edge of each sensor, and that they all contribute to those seen in the final image. In reality, camera manufacturers leave some pixels unused around the edge of the sensor. Since not all manufacturers provide info on the number of used vs. unused pixels, only used pixels were considered when calculating the fraction of total sensor area. The pixel sizes above are thus slightly larger than is actually the case (by no more than 5% in the worst case scenario).


So my conclusion is that sensors are far from optimized; those cambridgeincolour values would apply if pixels occupied the whole sensor area as gapless, perfectly square pixels with no AA filter.
In my opinion the loss in resolution is caused by flaws in pixel design rather than by insufficient pixel count. The gaps with no pixels cause the loss in resolution, and the pixels that do gather light are still affected by diffraction.

But to be fair, those gaps were present in film too
Image
The natural random arrangement of the fine grains of silver halide in film.

That led me to this interesting article about X-ray film (more related to increasing pixel count than to reducing the gaps):
http://www.konicaminolta.com/about/rese ... ilver.html


Anyway, I hope they never make a 47 mpx APS-C or 80+ mpx full frame camera (imagine the size of those RAWs); they would do better to optimize pixel/sensor design (remove the AA filter, improve microlenses, minimize the gap between pixels, optimize the shape of those pixels, etc.).



Regards
Javier

ChrisR
Site Admin
Posts: 8671
Joined: Sat Mar 14, 2009 3:58 am
Location: Near London, UK

Post by ChrisR »

( Rumoured D800 is 36 MPixels on full frame in the two sites I saw. Not as dense as the pixels on a Canon T3i/600D/7D, which work out to about 48 MP on full frame.)
Rik's Requirement ( ;) ) for 48 MP on APS-C would be around 128 MP on full frame. (A "CD" could hold about one 16-bit TIFF image file?)

Would that be "it"? Possibly not. That Oly 135 is an old lens design; I didn't think mine was particularly sharp. The RR density applied to compact sensors of around 4 mm x 6 mm works out to about 4 MP, and we know they can do better, with up to perhaps 3-4 times that. OK, the image circle is smaller; give it time!
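Scaling the density across formats is a one-liner; a sketch, where the sensor dimensions in mm are nominal values rather than exact specs:

Code: Select all

formats = {"APS-C": (22.3, 14.9), "full frame": (36.0, 24.0), "compact": (6.0, 4.0)}
density = 48 / (22.3 * 14.9)  # Rik's Requirement: ~48 MP on APS-C, in MP per mm^2
for name, (w, h) in formats.items():
    print(f"{name}: {density * w * h:.1f} MP")
# APS-C: 48.0, full frame: ~124.9 (i.e. around 128), compact: ~3.5 (about 4)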

rjlittlefield
Site Admin
Posts: 23605
Joined: Tue Aug 01, 2006 8:34 am
Location: Richland, Washington State, USA
Contact:

Post by rjlittlefield »

ChrisR wrote:Would that be "it"? Possibly not. That Oly 135 is an old lens design; I didn't think mine was particularly sharp.
Oh, I didn't mean to imply that f/11 was as sharp as it gets. Heavens no! Here's the f/11 image again, and then the matching f/5.6 that I shot at the same time.

Image Image

So, um, let's see, that's another 3 elements at least, sqrt(2) on each axis, double the pixel count... I think we're up to 96 megapixels now, with lots more room to go still higher from there!

----------------

I've spent quite some time this evening looking at images and reading the cambridgeincolour page. I think I now understand what's going on.

Cambridgeincolour is concerned with a question commonly posed by users: "What apertures can I use, given the sensor I have?" For example their number labeled "Diffraction Limits Standard Grayscale Resolution" reflects the aperture at which artifact-free resolution begins to be impacted.

In contrast, I'm answering a different question: "What sensor would I need, to capture all the detail in the image formed by my optics?" The answer to that question is 3 pixels per line pair for the finest lines resolved in the optical image.

When you work the numbers, it turns out that the answer to my question is a much higher pixel count than you'd get by working backward from cambridgeincolour's answer to their question.

The issue of sensor quality is interesting. Some analyses are based on the minimum sampling rate of 2 pixels per line pair that's implied by Shannon's sampling theorem. But as I've illustrated HERE, it turns out that you really need about 3 pixels per line pair to avoid degrading detail that happens to be positioned badly with respect to pixel boundaries, even with a perfect sensor. So personally, I'm inclined to think that sensors should be evaluated on their performance at 3 pixels per line pair, not 2.
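The effect is easy to demonstrate numerically. Here is a sketch that box-samples a sine pattern at every phase and reports the worst-case contrast that survives:

Code: Select all

import numpy as np

def worst_case_contrast(px_per_lp, n_px=60, oversample=10):
    """Worst-case peak-to-peak contrast of a box-sampled sine, over all phases."""
    # subsample positions in pixel units, centered within each subsample
    x = (np.arange(n_px * oversample) + 0.5) / oversample
    worst = np.inf
    for phase in np.linspace(0, np.pi, 101):
        fine = np.sin(2 * np.pi * x / px_per_lp + phase)
        per_px = fine.reshape(n_px, oversample).mean(axis=1)  # one value per pixel
        worst = min(worst, per_px.max() - per_px.min())
    return worst

for ppl in (2.0, 3.0):
    print(ppl, round(worst_case_contrast(ppl), 2))
# at 2 px/lp a badly phased pattern vanishes (~0.0); at 3 px/lp it survives (~1.24)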

It turns out that the physical sensor in my Canon T1i actually does a pretty good job at 3 pixels per line pair. Here's a sample, processed with Adobe Camera Raw 6.3 and shown here at 200%. In the slanting lines, there's some periodic loss of contrast accompanied by color shift, but still there's no question that the lines are resolved.

Image

So, while I would certainly applaud further improvement in sensor quality, the current one is already pretty good under the conditions where I would want to use it anyway.

OK, let's recap. We got to this discussion because I claimed there was a lot more information in an f/10 image than my 15 mp APS sensor could capture. That prompted a dissent from seta666, who wrote that f/10 "is around the limit to get the most of a 10mpx APS-C sensor". Subsequent discussion referenced the cambridgeincolour diffraction calculator, which provides numbers like these:
For a 15 mpx, 1.6x crop APS-C sensor (Canon) the values are:
Diffraction May Become Visible f/7.1
Diffraction Limits Extinction Resolution f/8.9
Diffraction Limits Standard Grayscale Resolution f/10.7
OVERALL RANGE OF ONSET f/7.1 - f/10.7
A quick reading of these numbers may suggest that the sensor can capture everything there is to see in an image at f/11. However, the experimental images tell a different story. It may be true that f/11 noticeably adds degradation to an image that has already been degraded by the 15 mp sensor. However, there remains much finer detail in an f/11 image than a 15 mp sensor can capture. Cambridgeincolour is simply answering a different question than I am.
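For reference, here is my guess at the kind of arithmetic behind those thresholds. A sketch, assuming green light at 550 nm and the usual rule that diffraction shows up when the Airy disk spans roughly 2 to 3 pixel widths:

Code: Select all

def airy_diameter_um(f_number, wavelength_nm=550):
    """Diameter of the Airy disk (out to the first dark ring), in microns."""
    return 2.44 * wavelength_nm * 1e-3 * f_number

pixel_um = 4.7  # 15 mpx Canon APS-C
for n in (7.1, 8.9, 10.7):
    print(f"f/{n}: {airy_diameter_um(n) / pixel_um:.1f} pixel widths")
# ~2.0, ~2.5, ~3.1 -- in line with the onset apertures quoted above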

I would again like to call everyone's attention to a basic fact of digital sampling. This is discussed in detail HERE, but to quickly summarize:
In order for our digital image to "look sharp", we have to shoot it or render it at a resolution that virtually guarantees some of the detail in the optical image will be lost. If you see some tiny hairs just barely separated at one place in the digital image, it's a safe bet that there are quite similar tiny hairs at other places that did not get separated, just because they happened to line up differently with the pixels.

Conversely, in order to guarantee that all the detail in the optical image gets captured in the digital image, we have to shoot and render at a resolution that completely guarantees the digital image won't look sharp.

So, there's "sharp" and there's "detailed" -- pick one or the other 'cuz you can't have both. What a bummer!
I happen to be a "detailed" guy, so I like lots of pixels. If you're a "sharp" guy, you'll prefer fewer. They both work.

Thanks for the stimulating discussion.

BTW, I think I will split off this discussion of resolution to its own thread. That will be a bit disruptive now, but it will make a lot more sense later.

--Rik

rjlittlefield
Site Admin
Posts: 23605
Joined: Tue Aug 01, 2006 8:34 am
Location: Richland, Washington State, USA
Contact:

Post by rjlittlefield »

seta666 wrote:Also, could the confusion be caused by the way we look at line pairs?
...
I do not remember seeing any drawing explaining it.
A "line pair" means one dark and one light. That is, one cycle of whatever pattern you're looking for. Typically this is luminance -- bright and dark. You can have cycles of color shifts also, but I can't recall ever seeing this discussed. Bayer sensors have much lower resolution for color shifts than for luminance of white light.

--Rik

ChrisLilley
Posts: 674
Joined: Sat May 01, 2010 6:12 am
Location: Nice, France (I'm British)

Post by ChrisLilley »

ChrisR wrote:( Rumoured D800 is 36 MPixels on full frame in the two sites I saw.
Yes, you are right, my bad. It's the same sensor pitch as the D7000, but extended over a full-frame sensor size. I have edited my earlier post.

Sample images from the camera show a resolution of 7360 x 4912.

ChrisLilley
Posts: 674
Joined: Sat May 01, 2010 6:12 am
Location: Nice, France (I'm British)

Post by ChrisLilley »

ChrisR wrote:( Rumoured D800 is 36 MPixels on full frame
By the way the Nikon D800 and D800E were officially announced today. They have the same sensor but the 'E' has no anti-aliasing filter.

It might be interesting to discuss the effect of that, in the current thread.

ChrisLilley
Posts: 674
Joined: Sat May 01, 2010 6:12 am
Location: Nice, France (I'm British)

Post by ChrisLilley »

rjlittlefield wrote: Oh, I didn't mean to imply that f/11 was as sharp as it gets. Heavens no! Here's the f/11 image again, and then the matching f/5.6 that I shot at the same time.

Image Image
That is an interesting pair of images. It is clear that the f/5.6 one is resolving all parts of the test image, including patch six in set one. The pattern also looks clearer because the background is out of focus. On the other hand, the influence of axial CA is much greater: the colours are less neutral (especially on thin shapes, like the numerals) than they are in the f/11 image.

You don't happen to have the f/8 and f/4 still around, by any chance?

rjlittlefield
Site Admin
Posts: 23605
Joined: Tue Aug 01, 2006 8:34 am
Location: Richland, Washington State, USA
Contact:

Post by rjlittlefield »

ChrisLilley wrote:You don't happen to have the f/8 and f/4 still around, by any chance?
Not only that, but by virtue of the short time interval I can even find them!

Image
It is clear that the f/5.6 one is resolving all parts of the test image, including patch six in set one. The pattern also looks clearer because the background is out of focus.
What looks like "background" is actually some sort of smear pattern that's part of the micrometer slide that the aerial image is being projected onto. The real target is a piece of glossy photo paper.

It's probably worth mentioning that I have a note in the box with this lens: "Best at f/11 at low mag". That note does not represent what's contained in the lens's image as we're seeing here, but rather it comes from earlier testing of the lens based on what was seen by a 1.6 crop factor 6.3 mpx camera. The note is consistent with cambridgeincolour's calculator, which says that for a 6.3 mpx camera "Diffraction May Become Visible" at f/11. But again, it's very clear from these images that in fact there's a bunch more detail in the image at f/11 than can be resolved by even a 15 mpx camera, even though at 15 mpx, f/11 "Diffraction Limits Standard Grayscale Resolution". There's a big difference between saying that diffraction further impacts an already degraded image, and saying that you've captured all the detail that's present.

--Rik

rjlittlefield
Site Admin
Posts: 23605
Joined: Tue Aug 01, 2006 8:34 am
Location: Richland, Washington State, USA
Contact:

Post by rjlittlefield »

Filling out the set, here are the f/16 and f/22 aerial images.

Image

And for comparison, here again is the corresponding aerial image at f/11 and as captured by the camera at f/11:

Image Image

This illustrates what I've written elsewhere, that you would have to stop down somewhere between f/16 and f/22, to degrade the optical image to the same extent that a sharper image gets degraded by the 15 mpx sensor. The nature of the falloff is different -- diffraction smoothly degrades contrast as detail becomes finer, while the digital sensor (including its anti-aliasing filter) tends to keep contrast high until it falls quickly at the end. But as shown in the images, the f/16 optical image clearly resolves a block or two beyond the camera image, while the f/22 image barely resolves (at very low contrast) the last block that is clearly resolved by the camera.

--Rik

Blame
Posts: 342
Joined: Fri May 14, 2010 11:56 am

Post by Blame »

Interesting.

I make that f/16 for a 24 MP APS-C.
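That follows directly from scaling Rik's result; a quick check, assuming the required megapixel count scales as 1/N^2:

Code: Select all

import math
# Rik measured ~47 MP needed on APS-C at f/11; required MP scales as 1/N**2,
# so a 24 MP APS-C sensor is "filled" at N = 11 * sqrt(47 / 24)
print(round(11 * math.sqrt(47 / 24), 1))  # ~15.4, i.e. about f/16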

After a bit of thought I reckon that the AA filter, or lack of one, is unlikely to make much of a difference at a resolution of 3 pixels per line pair.

However, I am not so sure that in practice going up to f/16 is going to be worthwhile. While you have shown that some detail is available, it has very poor contrast. Anything less than perfect optics will reduce that contrast even more.

You used black and white lines, but in a feature-rich subject such detail will be lost without sharpening. The catch is that stacking increases noise, and sharpening also increases noise. How much sharpening can be added before the detail is swamped by noise?

Anyway, it doesn't quite solve a more pressing problem for a lot of us, which is: how much magnification should we push for with a given lens? Probably to be answered with: how bad do you want the corners to look?

While clearly no definitive practical answer can be given, we could probably come up with some sort of rule of thumb. I wouldn't be surprised if it were a stop down from your calculations, or as low a magnification as you can get before blur or vignetting totally screws the corners, whichever comes first.

Your calculation would give a modulation of roughly 15%. One stop down would give a maximum MTF of still under 40%, less whatever is lost to imperfect lenses.

These are rough numbers because I am extrapolating from a graph on this interesting page: http://photo.net/learn/optics/mtf/
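The closed-form diffraction MTF for an ideal circular aperture gives much the same numbers. A sketch, assuming 550 nm light, a ~3.9 µm pitch for a 24 MP APS-C sensor, and 3 pixels per line pair:

Code: Select all

import math

def diffraction_mtf(lp_per_mm, f_number, wavelength_mm=550e-6):
    """MTF of an ideal circular aperture at a given spatial frequency."""
    s = lp_per_mm * wavelength_mm * f_number  # frequency / cutoff frequency
    if s >= 1:
        return 0.0
    return (2 / math.pi) * (math.acos(s) - s * math.sqrt(1 - s * s))

pitch_um = 23.5 / 6000 * 1000       # ~3.9 um pixels on a 24 MP APS-C sensor
freq = 1000 / (3 * pitch_um)        # 3 pixels per line pair -> ~85 lp/mm
for n in (16, 11):
    print(f"f/{n}: MTF ~ {diffraction_mtf(freq, n):.0%}")
# f/16: ~15%, f/11: ~37% -- consistent with the rough numbers above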

rjlittlefield
Site Admin
Posts: 23605
Joined: Tue Aug 01, 2006 8:34 am
Location: Richland, Washington State, USA
Contact:

Post by rjlittlefield »

Blame wrote:Anyway, it doesn't quite solve a more pressing problem for a lot of us, which is: how much magnification should we push for with a given lens? Probably to be answered with: how bad do you want the corners to look?

While clearly no definitive practical answer can be given, we could probably come up with some sort of rule of thumb. I wouldn't be surprised if it were a stop down from your calculations, or as low a magnification as you can get before blur or vignetting totally screws the corners, whichever comes first.
I think the answer to all such questions is "Test your own equipment, then determine the best tradeoff to match your own requirements."

The main point of my study in this thread was to take a hard look at the common assertion -- now recognizable as a myth -- that modern digital sensors have enough resolution to capture everything the lens can provide. In fact they're not even close, if by "everything" we include central resolution at the lens's sharpest aperture.

I think the issue you're getting at is quite different. It's more like "What should I do to get the best image quality, integrated with some variable weighting across the entire frame, using some specified lens, if I'm free to adjust the magnification." That's a very important question, but it's not one that can be addressed by calculation, or by much of anything else in this thread.

--Rik

Blame
Posts: 342
Joined: Fri May 14, 2010 11:56 am

Post by Blame »

rjlittlefield wrote: I think the answer to all such questions is "Test your own equipment, then determine the best tradeoff to match your own requirements."

The main point of my study in this thread was to take a hard look at the common assertion -- now recognizable as a myth -- that modern digital sensors have enough resolution to capture everything the lens can provide. In fact they're not even close, if by "everything" we include central resolution at the lens's sharpest aperture.

I think the issue you're getting at is quite different. It's more like "What should I do to get the best image quality, integrated with some variable weighting across the entire frame, using some specified lens, if I'm free to adjust the magnification." That's a very important question, but it's not one that can be addressed by calculation, or by much of anything else in this thread.
--Rik
Well, not as far off as you might think... perhaps 3 years now. As I have pointed out, the Sony-sourced 24 MP sensors drop the limit down to f/16. There is some evidence that the 2 µm microlenses involved are giving problems in the corners, which is lens dependent. Could be that 2 µm is a fairly firm limit for now. However, it is possible to drop the Bayer square pixel, with its 2 greens, for a single green pixel in a hexagonal array. That would drop the 3-pixel distance by exactly 25%, giving f/12 as your limit. I think it will happen.

I still say that resolution alone is not a lot of use unless the sensor has enough dynamic range left to display it after stacking and sharpening have had their way. I don't expect any dramatic improvements there, as the predominant noise is likely to be the randomness of light itself.

My question is fair, because with infinity lenses we are NOT entirely free. If we choose a poor focal length for a tube lens then we may have to go out and buy another one. Better to have at least a realistic guess to start from. I am restarting my microscope project this year and starting to gather lenses. It is a guess I am struggling with.

To an extent, calculation IS useful. There is a known aperture where diffraction starts to become significant; for my 24 MP FF sensor I would think f/11 is a realistic value. To go lower is to decrease the depth of field without significantly increasing the resolution, which will result in larger stacks with the additional noise, time and shutter wear involved. Using your calculation I can put a top end at about f/26, although I personally would reckon a stop down, at f/18, a practical top, on the grounds that going further won't add much detail that can be recovered by sharpening.

There you are. A range of f/11 to f/18 as a rule of thumb. It has to help.
Last edited by Blame on Tue Feb 04, 2014 3:09 pm, edited 3 times in total.
