As a beginner, this question seemed like a no-brainer.
Why on earth would I want a monochrome camera that produces black and white images? Give me the colour, thanks! Nobody wants to see black and white photos of space! And why do scientists with access to million-dollar telescopes and cameras prefer black and white anyway? Those fools. The colour camera is cheaper.
Then of course, you inevitably try to make your photos better and better and realise that colour cameras are degrading your image, for a number of reasons. Professional deep space and planetary astrophotographers do enjoy colour photos, but to create them they use high-end monochrome cameras and filters to create their images one channel at a time. It’s at least three times as much work and imaging time, but the results are three times better.
Creating colour from monochrome
Let’s go back a step. How do you create a colour image from a monochrome camera?
By capturing three images of the same target, each through a red, green, or blue filter, the resulting channels can be aligned and combined into an RGB colour image.
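If you’re curious what that combination looks like in practice, here’s a minimal Python sketch using numpy and astropy. The filenames are hypothetical placeholders, and it assumes the three frames have already been calibrated and aligned:

```python
# A minimal sketch of RGB combination from three aligned mono frames.
# The filenames are hypothetical; each FITS file is assumed to hold one
# calibrated, registered channel in its primary HDU.
import numpy as np
from astropy.io import fits

def load_channel(path):
    """Load a 2D image and stretch it to the 0..1 range for display."""
    data = fits.getdata(path).astype(np.float64)
    lo, hi = np.percentile(data, (0.5, 99.5))  # clip the extremes
    return np.clip((data - lo) / (hi - lo), 0.0, 1.0)

# Stack the three channels into a (height, width, 3) RGB cube,
# ready for matplotlib's imshow() or saving as a PNG.
rgb = np.dstack([
    load_channel("target_R.fits"),
    load_channel("target_G.fits"),
    load_channel("target_B.fits"),
])
```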
But it’s three times the work – why would you want to do that?
The answer lies in the engineering required to create an RGB (colour) camera chip. Ultimately, all the little light wells on a chip are monochrome. They can’t themselves differentiate between red, green, and blue photons; they just add whatever photon arrives to the well. Colour chips instead place a tiny colour filter over each pixel well, giving you an array of colour-filtered pixels. Herein lies the next problem: three colours don’t divide evenly into the 2×2 tiles that make up the grid, so the array must be biased toward one colour, usually green, to match the eye’s peak sensitivity. This is called a Bayer filter or colour filter array (CFA).
See the problem?
There are way more green pixels than red or blue. This inherent bias is corrected during demosaicing, the process that converts the raw array into a colour digital image. Typically all this happens in-camera, unless you shoot raw and demosaic yourself. Either way, the process interpolates each pixel’s missing colours from the surrounding pixels, which effectively downsamples your image below the actual number of pixels on the chip and creates artifacts such as false colours, colour bleeds, and jagged edges. With a monochrome camera, you get a 1:1 unblemished result based on the number of photons that hit each pixel well, with none of this immediate jiggery-pokery degrading your image from the start.
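To see why that interpolation costs you resolution, here’s a toy sketch. It is nothing like a real camera’s demosaicing algorithm, which is far more sophisticated, but the principle is the same: colour at each pixel is reconstructed from neighbours rather than measured directly.

```python
# Toy RGGB mosaic and naive demosaic, assuming even image dimensions.
# Real demosaicing is far cleverer, but the core trade-off holds: each
# 2x2 tile contains only one R sample, one B sample, and two G samples,
# so full colour at every pixel has to be interpolated.
import numpy as np

def mosaic_rggb(rgb):
    """Flatten an (H, W, 3) colour image into a one-plane RGGB mosaic."""
    h, w, _ = rgb.shape
    mono = np.zeros((h, w))
    mono[0::2, 0::2] = rgb[0::2, 0::2, 0]  # red samples
    mono[0::2, 1::2] = rgb[0::2, 1::2, 1]  # green samples
    mono[1::2, 0::2] = rgb[1::2, 0::2, 1]  # green samples
    mono[1::2, 1::2] = rgb[1::2, 1::2, 2]  # blue samples
    return mono

def naive_demosaic(mono):
    """Rebuild colour by reusing each 2x2 tile's samples for all four of
    its pixels, i.e. colour resolution is a quarter of the pixel count."""
    h, w = mono.shape
    out = np.zeros((h, w, 3))
    for y in range(0, h, 2):
        for x in range(0, w, 2):
            r = mono[y, x]
            g = 0.5 * (mono[y, x + 1] + mono[y + 1, x])
            b = mono[y + 1, x + 1]
            out[y:y + 2, x:x + 2] = (r, g, b)
    return out
```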
But wait. It gets worse.
Quantum efficiency
Pixel wells on a camera chip are microscopic. They are incredible feats of engineering and miniaturisation that allow functional pixels of such small size to sit side by side, but doing so requires a small gap between each pixel. That gap is dead area where photons literally slip between the cracks and are never recorded at all. A colour camera stacks the filter array on top of this: even within its own passband, each tiny filter absorbs some of the incoming light, so a smaller fraction of the photons reaching the chip is ever recorded. The fraction a sensor does record is called its quantum efficiency.
Compared here are the quantum efficiency (QE) graphs for the ZWO ASI174 colour versus the equivalent mono version. You can see the green channel has a slightly higher QE than red or blue, but none of the three can match the high QE of the mono sensor. By virtue of the reduced QE alone, there is a loss of signal. Put simply, it takes a longer exposure to achieve the same pixel saturation you get with a mono camera.
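As a back-of-the-envelope example (the QE figures below are placeholders, not measured values for any particular camera; read the real ones off your sensor’s graph), the exposure time needed scales inversely with QE:

```python
# Hypothetical peak QE values -- substitute the numbers from your own
# sensor's QE graph.
qe_mono = 0.78    # assumed peak QE, mono version
qe_colour = 0.60  # assumed peak QE, colour version (green channel)

# Collecting the same number of electrons takes proportionally longer
# as QE drops, so required exposure scales inversely with QE.
scale = qe_mono / qe_colour
print(f"The colour camera needs about {scale:.2f}x the exposure time")
# -> The colour camera needs about 1.30x the exposure time
```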
But wait, there’s more.
Loss of signal
When a photon arrives at a mono chip, one of two things happens: it either hits a pixel well and is recorded, or it hits a gap (or is otherwise lost). With a colour chip there is a third case: it can hit a filter of the wrong colour in the colour filter array and be blocked before it ever reaches the silicon. On a typical RGGB array, a green photon has a 50% chance of being blocked this way, and red and blue photons a 75% chance. This loss across the surface area of the chip means both detail and signal suffer overall.
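You can sanity-check those odds with an idealised RGGB mosaic, assuming perfect filters that pass their own band completely and block everything else:

```python
# Idealised RGGB array: 25% of pixels pass red, 50% pass green, 25%
# pass blue. A photon of a given colour is only recorded if it happens
# to land on a matching filter.
fraction_passing = {"red": 0.25, "green": 0.50, "blue": 0.25}

for colour, passing in fraction_passing.items():
    print(f"A {colour} photon is blocked {1 - passing:.0%} of the time")
# A red photon is blocked 75% of the time
# A green photon is blocked 50% of the time
# A blue photon is blocked 75% of the time
```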
I know what you’re thinking. Why oh why did I buy this piece-of-crap colour DSLR, high-speed CMOS, or expensive one-shot-colour CCD?
We’ve all been there. Depending on how far you’ve come with your photography, there’s usually a point where you simply must go down the monochrome/filter-wheel upgrade path to get the sharp, precise, high-quality images you see from Hubble and the professionals. You can bet they aren’t using colour cameras. No-sir-ree, Bob.
Best of both worlds – This collaboration between John Mills and Dylan O’Donnell of IC2944 used a mono CCD for the detail, and a one-shot-colour CCD for the colour.
The case for colour
But there is a case for colour. OK, maybe two cases.
The first, most obvious one is simplicity, because a colour camera saves you three or four times the processing work per photo. A good nebula photo requires at least 30-40 subframes to reduce the noise with stacking. Using a mono camera means you’ll need 90-120 subframes, or even more if you shoot a luminance channel in Ha as well as the R, G, and B channels. I shoot with a one-shot-colour CCD, and those who follow my socials know I pump out a lot of images quickly for this reason. I even use it for narrowband work, which is a bit of an astro faux-pas, but I get away with it. On Instagram.
The other case is a practical one. Mono is best for planetary and deep-sky targets alike, but sometimes the object you are imaging is changing quickly, like a shadow transit of Jupiter’s moons, or the planet’s surface itself, which completes a full rotation in under ten hours. Post-processing derotation may help in this case, but it will still “smear” a nearby moon as the colour channels go out of alignment between filter captures. Another example might be a brilliant green comet racing across a star field. Good luck trying to image these in mono and align the channels later. It’s a trade-off between getting enough frames and getting them quickly enough to combine at all, before the target has moved significantly between channels. In these cases, colour cameras are a convenient way to capture the event quickly and still produce a beautiful colour image.
The answer: Get both. Maybe.
The moral to the story, if there is one, is that you need both! At least, that’s what I keep telling myself. I’m not sure my bank balance agrees with me.
If you want to specialise and achieve the best possible solar, planetary, or nebula photos, then monochrome is the way to go. The loss of detail, clarity, and signal that comes with colour imaging cannot be recovered with post-processing alone. For casual imagers, or anyone wanting to view images “live”, colour is still an attractive option.
Now if anyone would like to donate to my observatory grade, liquid nitrogen cooled, scientific large format mono CCD fund, please send me a message via Google+, Instagram or dylan@photographingspace.com! I take any currency including PayPal or bitcoin.
Interesting article Dylan. I definitely learned something. I am also glad I read this before I invested in my first CCD camera. I was leaning toward a one shot colour, but you have swayed my decision! Thanks.
Having only recently taken an interest in astrophotography I congratulate you on these articles. They are complex subjects put over in a very easy to understand way and I am learning a lot very quickly. I hope soon to be having a go and look forward to sending you some of my efforts. Many thanks for your encouragement and for sharing your enthusiasm and knowledge. It is very much appreciated.
What good would the Sigma Foveon sensor be in astrophotography? Three layers, one for each primary color. Best black and white on the market outside of Leica’s specialized monochrome cameras, and from raw you can keep the data from each layer independently, export it, and then merge it in other software. Would that work in astrophotography?
Total noob here…
It certainly seems like promising technology, doesn’t it, Luc! I don’t know much about it, but there is some buzz about it in astro groups, and if it does what it says, it would be excellent for collecting RGB data, and perhaps even for use as a straight single-channel camera for narrowband as well.
Thank you for the breakdown. This helped me a lot.
With the new CMOS cameras, does monochrome still have the same lead?
Hi Jake,
In my opinion — yes. Mono still has the lead over OSC for a few reasons. However, the difference is going to be somewhat negligible for the average user. In terms of image quality, using a good OSC camera to its full potential and doing EVERYTHING right can get you, for argument’s sake, let’s say 80% of the quality of doing the same with a monochrome camera and filters. So, if you want to get that remaining 20% of image quality and are willing and able to spend the money and time to make it happen, you can. BUT, that doesn’t mean you can’t get amazing quality with OSC! There are so many things that have to go right.
All that being said, barring some sort of amazing feat of engineering and physics, a similarly specced monochrome camera will almost always give you that boost in quality that an OSC doesn’t quite hit.
All the best, and clear skies!
Cory