Buying an M9 in 2021?

Then why has video been dropped if it was the reason for migration from CCD to CMOS? I am lost here, help me out.

Sorry I did not mean Leica specifically. I meant the industry. Leica is special and can do what they want in catering to a niche audience. However, Leica cannot buy what is not being made.
 
...Leica is special and can do what they want in catering to a niche audience. However, Leica cannot buy what is not being made. ...

OK, thanks for straightening that out. My feeble brain was on overload.
 
...

Whatever the CCD vs CMOS difference might be, it has not been particularly evident or significant in anything I've been doing.

...

G

As Godfrey observed, there is no inherent reason for CMOS images and CCD images to be different.

Preferences for CCD images are authentic, but they have nothing whatsoever to do with CMOS vs CCD photo-diode arrays.

In terms of perceived color rendering (or monochrome tonality), CMOS and CCD photosites produce identical analog information: a 2D spatial distribution of electrical charges, which is in turn read out as a 2D spatial distribution of DC voltages. The characteristics of electrical charge and DC voltage are identical regardless of the technology differences between CCD and CMOS.[1]

Perceived image quality can be very different because the light entering a sensor photosite array is affected by the sensor cover glass, IR filter and anti-aliasing layers. Differences in the sensor micro-lens and color-filter array optical properties probably have a much larger impact.

The demosaicking algorithms (both in-camera and off-camera) play a significant role in perceived color rendering.
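To make that concrete, here is a toy bilinear demosaic in Python (numpy only). It is a sketch, not any camera's or converter's actual algorithm; the RGGB layout and the 3x3 averaging are assumptions for illustration. Commercial converters differ precisely in how they fill the gaps (edge-aware interpolation, false-color suppression, and so on), which is one reason different converters render color differently from the same raw data.

```python
import numpy as np

def bilinear_demosaic(bayer):
    """Toy bilinear demosaic of an RGGB Bayer mosaic (H x W, even dims).

    Each output channel is filled in only at its own sample sites, then
    the gaps are interpolated with a normalized 3x3 box filter. Real
    converters use far more elaborate, edge-aware algorithms.
    """
    h, w = bayer.shape
    out = np.zeros((h, w, 3), dtype=np.float64)
    mask = np.zeros((h, w, 3), dtype=np.float64)
    # RGGB layout: R at (even,even), G at (even,odd) and (odd,even), B at (odd,odd)
    out[0::2, 0::2, 0] = bayer[0::2, 0::2]; mask[0::2, 0::2, 0] = 1
    out[0::2, 1::2, 1] = bayer[0::2, 1::2]; mask[0::2, 1::2, 1] = 1
    out[1::2, 0::2, 1] = bayer[1::2, 0::2]; mask[1::2, 0::2, 1] = 1
    out[1::2, 1::2, 2] = bayer[1::2, 1::2]; mask[1::2, 1::2, 2] = 1

    def shifted_sum(a):
        # Sum of the 9 shifted copies of `a` = unnormalized 3x3 box filter
        p = np.pad(a, ((1, 1), (1, 1), (0, 0)), mode="edge")
        return sum(p[i:i + h, j:j + w] for i in range(3) for j in range(3))

    # Dividing by the per-channel sample count normalizes the interpolation
    return shifted_sum(out) / shifted_sum(mask)
```

A uniform gray mosaic demosaics to uniform gray, so the interpolation itself adds no color cast; everything interesting happens on edges and fine detail.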



1. If the sensor cover glass, IR filter layer, anti-aliasing layer, micro-lens array and color filter array are removed from the sensor assemblies, CCD and CMOS photosites respond differently to light: CCD photosites capture slightly more red light and slightly less blue light. The differences are due to the physical distances (layer thicknesses) light travels before entering a photodiode. However, these differences are very small, and known frequency-response differences can be accounted for in the demosaicking algorithms.
 
Collection efficiency vs angle of acceptance differs between CCD arrays and CMOS sensors; back-side-illuminated (BSI) CMOS closed that gap. It will be interesting to see if Leica moves to BSI CMOS for the M11. CMOS processing engines tend to do more processing of the raw image before storing it. The M8 and M9 do very little processing; noise reduction is limited to long exposures, where a "dark image" is captured for hot-pixel correction. In "Button Dance Mode" the M8 and M9 store those dark images as well as the untouched raw files. The M9 stores the button-dance images as DNG files, which are easy to look at.
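The long-exposure dark-image correction can be sketched like this in Python; the threshold and the 3x3 median replacement are invented parameters for illustration, not what the M8/M9 firmware actually does.

```python
import numpy as np

def correct_hot_pixels(raw, dark, threshold=200.0):
    """Hot-pixel correction from a dark frame (same exposure, shutter closed).

    Pixels that read high in the dark frame are 'hot'; each is replaced by
    the median of its 3x3 neighbourhood in the exposed frame. The threshold
    and neighbourhood size are arbitrary sketch values.
    """
    out = raw.astype(np.float64).copy()
    hot = dark > threshold
    padded = np.pad(out, 1, mode="edge")
    for y, x in zip(*np.nonzero(hot)):
        # 3x3 window around (y, x); the median is robust to the hot value itself
        out[y, x] = np.median(padded[y:y + 3, x:x + 3])
    return out
```

A real implementation would also handle clusters of adjacent hot pixels; here each pixel is corrected independently.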
 
...Collection efficiency vs angle of acceptance is different between CCD arrays and CMOS sensors. Back Side Illuminated CMOS closed that gap. ...

I am more pleased with the Sony A7M III than the previous A7M II. Could be just a fantasy, but the newer camera seems better. I will have to compare the Sony to a Leica or two, CCD and CMOS, with the same lens and see what I come away with. A shoestring field test proving virtually nothing, but it will be fun, and it's kind of like kicking over the hornets' nest. Assemble your brickbats.
 
...CMOS processing engines tend to do more processing of the raw image before storing it. ...

Raw data modification is a human decision. This has no impact whatsoever on whether CCD technology has an inherent advantage.

In-camera raw-data modification varies from brand to brand. Here is a link with examples of low-pass and high-pass filtered raw data.

The 2D Fourier transform is commonly used to analyze raw data files. This link demonstrates fixed-pattern noise caused by dark signal non-uniformity, noise filtering, vignetting and banding.
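A minimal numpy illustration of that kind of analysis: synthetic row-wise offset noise (dark-signal non-uniformity) produces horizontal banding, which shows up as an isolated peak in the 2D FFT magnitude. The period and amplitude here are invented for the demo.

```python
import numpy as np

# Synthetic frame: a flat field with Gaussian read noise...
rng = np.random.default_rng(0)
frame = rng.normal(100.0, 1.0, (64, 64))
# ...plus a row-wise sinusoidal offset with an 8-row period (banding)
frame += 5.0 * np.sin(2 * np.pi * np.arange(64) / 8)[:, None]

# Subtract the mean first so the DC bin does not dominate the spectrum
spectrum = np.abs(np.fft.fft2(frame - frame.mean()))
# Banding with an 8-row period peaks at vertical frequency index 64/8 = 8
# (and its mirror at 64 - 8 = 56); both sit in the zero-horizontal-frequency column
peak = np.unravel_index(np.argmax(spectrum), spectrum.shape)
```

The same transform applied to a real raw file reveals fixed-pattern noise, the footprint of in-camera noise filtering, and banding as structure in the spectrum rather than flat white noise.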

Besides lossy or lossless compression, there are other types of raw-file manipulation as well. Sony's infamous raw-file spatial filtering is known as the "star eater" among astrophotographers. Raw-file spatial filtering is not limited to Sony cameras. A summary can be found here (scroll down).
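A sketch of why such filtering "eats" stars: any single-pixel peak, hot pixel or faint star alike, gets clamped down to its brightest neighbour. This illustrates the general idea only; it is not Sony's actual algorithm.

```python
import numpy as np

def suppress_isolated_peaks(raw):
    """Clamp any pixel that exceeds all 8 of its neighbours to the
    neighbourhood maximum. Intended as hot-pixel suppression, but a
    point-like star is indistinguishable from a hot pixel and is
    'eaten' too. A sketch, not any camera's real filter."""
    h, w = raw.shape
    p = np.pad(raw, 1, mode="edge").astype(np.float64)
    # Maximum over the 8 neighbours (centre shift i==1, j==1 excluded)
    neigh_max = np.max(
        [p[i:i + h, j:j + w] for i in range(3) for j in range(3)
         if not (i == 1 and j == 1)], axis=0)
    return np.minimum(raw, neigh_max)
```

Note that a star spanning two adjacent pixels survives, which matches the observation that only the faintest, point-like stars disappear.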
 
CCDs are analog devices; they output a signal that must be digitized externally. The M8, M9, and M Monochrom store the digitized samples with little to no processing. For myself, I prefer this, as I wrote my own DNG processing software.

I have a Nikon Df; it does an amazing job. Its CMOS sensor has big pixels and double the saturation count of my M9. I've shot the Df and M Monochrom side by side at high ISO, and the two cameras give about equivalent performance at ISO 10K.

The CCD vs CMOS debate is over: except for a few small CCDs for the scientific market, CCDs are no longer manufactured by anyone. Dalsa and ONSemi stopped manufacturing full-frame CCDs.
The M9 and M Monochrom are the last of the breed of consumer full-frame CCD-based digital cameras. The BG-55 glass CCDs used in them are the last available for general-purpose photography. I'll use mine until they can no longer be repaired, then probably go mirrorless with a BSI CMOS sensor. I'd prefer a camera that stores images in uncompressed DNG, because it is easy to read the file. My oldest digital camera used the then-new KAF-1600, some 30 years ago. It still works.
 
I bought an M10-P earlier this year, and I have an M9 I got in 2013; its sensor was replaced in 2017 and has worked fine since. But it's definitely an adjustment processing M9 and M10-P images. To me the M10-P needs a bit more contrast added in post than the M9, and a little more sharpening. Sunny white balance is different between the two: the M10-P is warmer by default, the M9 cooler. So it's an adjustment. Color is excellent most of the time; in certain situations I don't like the M9's color, and the M10-P seems more consistent. High ISO is no contest: you can't really shoot above 640 with the M9, while I've pushed the M10-P in pretty extreme situations at 10,000+ ISO and was amazed how well it rendered color. I just love shooting the M10-P; it feels great, like my old M6. Come 2022, I wouldn't buy a used M9 or M240; I'm saving my money and getting an M10, the closest thing to shooting a film Leica. My opinion.
 
CCDs are analog devices; they output a signal that must be digitized externally. The M8, M9, and M Monochrom store the digitized samples with little to no processing. For myself, I prefer this, as I wrote my own DNG processing software.
...

My understanding is that Leica applies the lens profile, if elected, to the raw file data regardless of which M (or SL, CL, TL, etc) camera is creating the file. So the M8/M9/M9M/M-E all perform a fair bit of processing to the digitized CCD output data.

If you've written your own DNG demosaic and gamma correction algorithms, presumably with your own camera calibration profiles, have you compared your baseline output against what other, commercial raw converters' results look like when set to their defaults with their supplied CCPs for your cameras? I would be curious to see differences on test images using exposures made with and without the lens profile applied. In the past (when I had an M9) I did an extensive series of tests comparing exposures made this way with several different lenses and there was definitely a difference evident, using the same commercial raw converter settings.

G
 
I've done some comparisons of the DNG files with lens corrections on and off. With the M9 and M Monochrom, these are mostly corrections for light loss at the corners. I do not use lenses that would cause a color shift in the corners (the "Italian flag" effect), and there is no correction for lens distortion. The processing applied is minimal. On the M8 I have it turned off and have not used it since buying the camera in 2010. I've used the Jupiter-12 and Nikkor 2.8cm F3.5, these being the most "strenuous", and have not seen an issue.
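For illustration, a corner light-loss correction can be as simple as a radial gain map. The quadratic falloff model and the strength value below are assumptions for the sketch, not Leica's measured lens profiles, which are per lens and per aperture.

```python
import numpy as np

def corner_falloff_gain(h, w, strength=0.3):
    """Radial gain map for correcting light loss at the corners.

    Gain is 1.0 at the image centre and (1 + strength) at the extreme
    corners, following a simple r^2 model. 'strength' is a made-up
    illustration value; real profiles are measured, not modelled."""
    y, x = np.mgrid[0:h, 0:w]
    cy, cx = (h - 1) / 2, (w - 1) / 2
    # Normalized squared radius: 0 at centre, 1 at the corners
    r2 = ((y - cy) ** 2 + (x - cx) ** 2) / (cy ** 2 + cx ** 2)
    return 1.0 + strength * r2

# Usage: corrected = raw_frame * corner_falloff_gain(*raw_frame.shape)
```

Because the correction is a smooth multiplicative gain, it is easy to verify with flat-field exposures made with the profile on and off.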

On the M Monochrom, my "trick" is to apply the gamma curve and output a 16-bit (as opposed to 14-bit) pixel. The DNG header is updated, and the file can be opened in Lightroom.
The "test": I like the gamma DNG files better than the straight raw ones, and find I do not need to apply much post-processing to individual frames. I also restore underperforming lines caused by bad pixels, rather than interpolating over them. It just annoyed me to see a bad line on the M Monochrom and know the signal was still there.
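The gamma-to-16-bit step can be sketched with a lookup table. The 2.2 gamma below is an assumed value, not necessarily the curve the author uses, and the code is a Python sketch rather than the original Fortran.

```python
import numpy as np

def gamma_to_16bit(raw14, gamma=1.0 / 2.2):
    """Apply a gamma curve to 14-bit samples and emit 16-bit pixels.

    A 16384-entry lookup table maps each possible 14-bit code through the
    gamma curve into the full 16-bit range. The 2.2 gamma is an assumption
    for the sketch."""
    codes = np.arange(16384) / 16383.0                    # normalize 14-bit range
    lut = np.round(65535.0 * codes ** gamma).astype(np.uint16)
    return lut[raw14]                                     # LUT indexed by raw samples
```

Baking the curve into the stored samples is why the resulting DNGs need little further tone adjustment in Lightroom; the DNG header just has to declare the new bit depth.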

The M8 DNG processor: I have a custom one for using the camera with an orange filter for color IR. With the orange filter, the "blue" channel is infrared. There is enough of a signal to equalize it with green and red, which gives an "IR Ektachrome" look. I used Fortran code written in the 1980s as the base, originally written for custom sensors. I stopped working on those in 1993 and switched over to optical networking.
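The channel equalization can be sketched as a global gain on the IR ("blue") channel; this is a Python simplification of the general idea, not the author's Fortran implementation, which may well work differently.

```python
import numpy as np

def equalize_ir_channel(rgb):
    """False-color IR balance: with an orange filter on the camera the
    'blue' channel records infrared, which is much weaker than red and
    green. Scaling it to their mean level gives an 'IR Ektachrome' style
    rendering. A single global gain is a sketch-level simplification."""
    out = rgb.astype(np.float64).copy()
    target = out[..., :2].mean()        # mean level of red and green
    ir_mean = out[..., 2].mean()
    if ir_mean > 0:
        out[..., 2] *= target / ir_mean  # boost the weak IR channel
    return out
```

A per-channel curve (rather than one gain) would get closer to a film-like response, but the single-gain version already shows why "enough of a signal" in the IR channel is the key requirement.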
 
...I've done some comparisons of the DNG file with lens corrections on/off. ... I used Fortran code written in the 1980s for the base, originally written for custom sensors. ...

Very interesting stuff, thank you for articulating the detail!

I wrote a good bit of code for image processing in FORTRAN way back when (1980s) when I was at JPL ... little of it terribly useful nowadays, and a different imaging domain from photographic camera sensor acquisition (imaging RADARs, SARs (synthetic aperture RADAR systems)) ... so I'm always curious what others are doing when they write their own code. :)

Fun stuff, thanks again!

G
 
Always good to meet another FORTRAN programmer!

I ported most of my DNG processing code from Microway NDP FORTRAN to Open Watcom v2 FORTRAN; the latter supports DOS extenders and Windows. I still use the 1994 NDP FORTRAN v4.4 for embedded projects. I bought perpetual rights from Microway so that I can provide their development tools along with my source code. Best compiler I've ever used.

I've looked at the Nikon NEF files; it would not be hard to write code to unpack them, but the Leica DNG files are so simple you can read them with a few tens of lines of code.
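Since DNG is just a TIFF container, a minimal tag reader really does fit in a few tens of lines. Here is a Python sketch that walks the first IFD of a file; it is deliberately incomplete (no sub-IFDs, no dereferencing of values longer than 4 bytes), just enough to show the structure.

```python
import struct

def read_tiff_tags(data):
    """Minimal TIFF/DNG tag reader for the first IFD.

    Returns {tag_id: (field_type, count, value_or_offset)}. Values longer
    than 4 bytes are left as file offsets; real readers dereference them
    and follow sub-IFDs. A sketch, not a full DNG parser."""
    byte_order = data[:2]
    fmt = "<" if byte_order == b"II" else ">"      # II = little-endian, MM = big
    magic, ifd_offset = struct.unpack(fmt + "HI", data[2:8])
    if magic != 42:
        raise ValueError("not a TIFF/DNG file")
    (n_entries,) = struct.unpack(fmt + "H", data[ifd_offset:ifd_offset + 2])
    tags = {}
    for i in range(n_entries):
        off = ifd_offset + 2 + 12 * i              # each IFD entry is 12 bytes
        tag, typ, count, value = struct.unpack(fmt + "HHII", data[off:off + 12])
        tags[tag] = (typ, count, value)
    return tags
```

With an uncompressed DNG, the interesting tags (ImageWidth 256, ImageLength 257, StripOffsets 273) lead straight to the raw samples, which is what makes the format so pleasant to work with.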
 
Heh.. I'd say "ex-FORTRAN" programmer now. I haven't written any FORTRAN code since about 1987 ... my world switched to C, Objective-C, a little C++, and some assembly after that, and I did more debugging than code writing in all the years from about 1991 onwards. I haven't done any code at all for at least a decade now, finished my career writing technical documentation and then retired in 2016.

I still love studying computer language use and development! Almost as fascinating as camera design and engineering... :D

G
 
I suspect the "Control Freak" is responsible for my taste in cameras, lenses, and computers. Writing device drivers is just like shimming a lens.

And while typing this, just chased a bug to the new FPGA load for the custom hardware that I'm writing FORTRAN code for.
 