A few weeks ago I posted an image of M33. Then the hydrogen alpha filter I had ordered finally arrived, and I was curious to see what adding some data taken with that filter would do for the image.
For those who don’t know, a hydrogen alpha filter is designed to let in only a narrow sliver of light centered on the wavelength of one of hydrogen’s electron state transitions. In this case that wavelength is a very deep red, around 656 nm, right at the edge of human vision. The advantage of a filter like this is that it passes only the light you want and blocks almost all of the light you don’t.
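For the curious, that deep red line comes from the Balmer-series transition of the hydrogen atom’s electron from n = 3 down to n = 2, and the Rydberg formula gives the wavelength directly:

$$\frac{1}{\lambda} = R_H\left(\frac{1}{2^2}-\frac{1}{3^2}\right) = \frac{5\,R_H}{36}, \qquad R_H \approx 1.097\times10^{7}\ \text{m}^{-1} \;\Rightarrow\; \lambda \approx 656\ \text{nm}.$$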
The hydrogen alpha filter is one example of a class of filters known as narrowband filters. Each narrowband filter is designed to let in a very specific slice of the spectrum. We usually use them to image emission nebulae, since those radiate light only at very specific wavelengths. However, when you use a hydrogen alpha filter on a galaxy it highlights the HII regions: clouds of ionized hydrogen, the same gas that glows in emission nebulae. You can see them as the pink regions in this image, mostly following the curve of the spiral arms.
This was fascinating to see. A few of the larger HII regions showed up in the RGB image, but the H-alpha data revealed so much more (probably because the rest of the galaxy’s light masked the HII regions in the broadband filters). I definitely plan to devote some time to getting H-alpha data on galaxies in the future!
I processed this version somewhat differently. The earlier image used drizzle integration, which effectively increases the resolution. It’s an amazing process, but in this case it was making some noise at the edges of the image more noticeable. Skipping the drizzle this time let me use PixInsight’s MureDenoise script, which does an excellent job of noise reduction.
After that I ran DBE (DynamicBackgroundExtraction) on each channel and then combined the RGB channels. I then ran PCC (PhotometricColorCalibration) on the RGB image.
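DBE models the sky background from sample points you place on blank areas of the frame and then removes that model. Here is a toy numpy illustration of the idea, not DBE’s actual algorithm; the function name, the low-order polynomial surface, and the subtraction approach are just assumptions for the sketch:

```python
import numpy as np

def fit_background(image, sample_yx, order=2):
    """Toy background model: fit a low-order 2D polynomial to hand-picked
    background samples and subtract it from the frame.

    image:     2D float array (one channel)
    sample_yx: list of (row, col) positions placed on blank sky
    order:     polynomial order of the background surface
    """
    h, w = image.shape
    ys, xs = np.array(sample_yx).T
    vals = image[ys, xs]

    # Build the polynomial design matrix for a set of (y, x) positions.
    def design(y, x):
        terms = [(y ** i) * (x ** j)
                 for i in range(order + 1)
                 for j in range(order + 1 - i)]
        return np.stack(terms, axis=-1)

    # Least-squares fit of the surface to the background samples.
    coeffs, *_ = np.linalg.lstsq(design(ys / h, xs / w), vals, rcond=None)

    # Evaluate the fitted surface everywhere and remove it, keeping a
    # median pedestal so the result stays positive.
    yy, xx = np.mgrid[0:h, 0:w]
    background = design(yy / h, xx / w) @ coeffs
    return image - background + np.median(vals)
```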
I tried using the NBRGBCombination script to combine the H-alpha data with the RGB but could not get a result that I liked. Instead I used ArcsinhStretch to stretch the RGB and H-alpha data to make them non-linear. These were not large stretches, and I didn’t try to create lots of contrast here.
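If you’re wondering what an arcsinh stretch actually does, here is a rough numpy sketch of the idea. It is not PixInsight’s implementation; the function name, the simple channel-mean luminance, and the stretch factor are all just illustrative:

```python
import numpy as np

def arcsinh_stretch(img, stretch=50.0):
    """Simple color-preserving arcsinh stretch.

    img:     float RGB array in [0, 1], shape (H, W, 3)
    stretch: larger values lift faint detail more aggressively
    """
    # Use the channel mean as a stand-in for luminance.
    lum = img.mean(axis=-1, keepdims=True)
    lum = np.clip(lum, 1e-6, None)  # avoid division by zero

    # Arcsinh transfer curve, normalized so white stays white; scaling all
    # three channels by the same factor preserves color ratios.
    scale = np.arcsinh(stretch * lum) / (np.arcsinh(stretch) * lum)
    return np.clip(img * scale, 0.0, 1.0)
```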
Next, I created a mask from the H-alpha data and applied it to the RGB image. Then I used PixelMath to substitute the H-alpha data into the RGB image. This targeted approach to adding in the H-alpha struck the right balance for me and didn’t add extra noise to the background.
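Conceptually, the masked substitution works something like the numpy sketch below. This is not the actual PixelMath I used; the function name, the threshold value, and the max-style blend into the red channel are just one plausible way to express the idea:

```python
import numpy as np

def blend_ha_into_red(rgb, ha, threshold=0.15):
    """Substitute H-alpha signal into the red channel only where the
    H-alpha image is bright, leaving the background untouched.

    rgb:       stretched RGB image, float array in [0, 1], shape (H, W, 3)
    ha:        stretched H-alpha image, float array in [0, 1], shape (H, W)
    threshold: H-alpha level below which pixels are treated as background
    """
    out = rgb.copy()

    # Mask built from the H-alpha data: only pixels with real signal change.
    mask = ha > threshold

    # Where the mask is set, keep the brighter of the existing red value
    # and the H-alpha value; elsewhere the red channel is left alone.
    out[..., 0] = np.where(mask, np.maximum(rgb[..., 0], ha), rgb[..., 0])
    return out
```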
From there it was a lot of contrast work. Two rounds of LHE (kernel radii 80 and 180) and then a lot of tweaking with curves to get the global contrast and saturation where I wanted them. I tried the Dark Structure Enhance script, but that didn’t give good results here, so I backed it out. Finally, a touch of the AdvSharpening script and the image was done.
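LHE boosts contrast inside a local neighborhood set by the kernel radius. If you want to experiment with something similar outside PixInsight, CLAHE in scikit-image is a conceptually related local-contrast operation (a rough stand-in, not the same algorithm; the function name and parameter values below are just illustrative):

```python
import numpy as np
from skimage import exposure

def local_contrast(lum, kernel_size=128, clip_limit=0.01):
    """Boost local contrast of a single luminance channel with CLAHE,
    roughly analogous in spirit to one LHE pass.

    lum: 2D float array in [0, 1]
    """
    # equalize_adapthist expects values in [0, 1] and returns the same range.
    return exposure.equalize_adapthist(np.clip(lum, 0.0, 1.0),
                                       kernel_size=kernel_size,
                                       clip_limit=clip_limit)
```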
Altogether this was 21 hours of data, making it my longest integration yet. The data was acquired over four months, starting in October 2019.
These are the kinds of images I was hoping to get when I started astrophotography. The learning curve is steep but worthwhile!
You can find all the technical details on astrobin.