HDR workflow with hugin

From PanoTools.org Wiki
Revision as of 20:41, 9 June 2010

This tutorial doesn't cover reasons why you might want to stitch in HDR format. It is a simple HOWTO listing the tools available and how to use them with hugin.

TODO this is all very out of date, hugin now supports HDR assembly internally.

Working with HDR images is fairly extreme behaviour. If you just want higher quality output than you get with typical 8bit photography, then you probably want to look at a 16bit workflow with hugin.

Still here? There are two basic ways of creating an HDR panorama:

  • Stitch several panoramas of the same scene, each one at a different exposure, and merge them together into a single HDR file.
  • Create a set of HDR shots of the scene and then stitch them together.

Each has trade-offs: the first technique is simpler, and its final HDR step can be skipped and substituted with a Contrast Blending approach, but misalignment between the stitched panoramas can cause ghosting problems. The second technique is presented here since it involves a greater range of tools.

Quick and easy technique

The rest of this tutorial describes generating high quality output using command-line tools. However it is now possible to complete the workflow entirely with GUI tools, something like this:

  1. Take bracketed shots of your scene.
  2. Open the bracketed images in Hugin. Align, say, the middle exposures together and set the stacks in the Images tab.
    1. If your stacks don't align (shot hand-held, sloppy panohead, etc.), set some control points inside the stacks and align them too.
  3. Stitch the panorama to an HDR file with hugin and enblend.
  4. Optionally tonemap the result with qtpfsgui.

Laborious and difficult technique

Preparing the HDR images

Unless you have an expensive HDR camera, you will be merging bracketed shots to create the HDR images. Unfortunately this means that you are limited to static scenes and landscapes.

Taking bracketed shots

The number of shots required depends on the dynamic range of the scene you need to capture and the capabilities of your camera.

Many cameras have an auto-bracketing mode that takes three or five shots two stops apart with one press of the button. This may be adequate, though a typical outdoor scene might have a dynamic range of eighteen stops, which would require eight shots two stops apart.
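The shot count above can be sanity-checked with some back-of-envelope arithmetic. All the numbers here are illustrative assumptions, not measured values; in particular the usable range of a single exposure varies a lot between cameras.

```shell
#!/bin/sh
# How many bracketed shots cover the scene?  Each extra shot extends
# coverage by the bracket spacing, on top of one exposure's own range.
scene=18      # dynamic range of the scene, in stops (assumed)
step=2        # bracket spacing, in stops
per_shot=4    # usable range of a single exposure, in stops (assumed)
shots=$(( (scene - per_shot) / step + 1 ))
echo "$shots"   # prints 8
```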

Whatever method you choose, it should be obvious that you need a good tripod to keep the camera steady.

Correcting chromatic aberration

Now is a good time to correct chromatic aberration and vignetting with fulla, using pre-calibrated data for your lens; it doesn't really work later on.
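A fulla invocation looks something like the sketch below. The coefficients are placeholders standing in for a hypothetical prior calibration of your lens, and the file names are invented; derive real values for your own lens first.

```shell
# -g applies the common radial distortion correction, -r/-b rescale the
# red/blue channels to counter lateral chromatic aberration.
# Coefficients shown are placeholders, not real calibration data.
fulla -g 0:0:-0.015:1.015 \
      -r 0:0:0:1.0005 \
      -b 0:0:0:0.9995 \
      -o corrected_img_01.tif img_01.jpg
```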

Merging bracketed shots with PFScalibration

There are other tools for merging bracketed images, but PFScalibration is Free Software and does the job.

The steps outlined below for assembling HDR images can also be performed with the qtpfsgui GUI tool.

Calibrating the camera response curve

Generally when a digital camera creates a JPEG or TIFF file, it takes a 12bit per-channel dynamic range image captured by the CCD and compresses it using a camera response curve into an 8bit output file.

So JPEG and TIFF files need unrolling with a calibrated camera response curve so they can be mapped into the linear space of the floating-point HDR image.

If you are working with RAW images, the camera response is generally linear and doesn't need calibrating, so you can skip this step.

A quick way to derive the response curve for later use is to take a series of five bracketed JPEG shots, slightly out of focus and one stop apart, e.g. 2, 1, 0.5, 0.25 and 0.125 seconds exposure. First extract the exposure times from the EXIF data:

 jpeg2hdrgen *.jpg > mycamera.hdrgen
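The generated mycamera.hdrgen is a plain text file with one line per image. For the five exposures above it should look roughly like this (file names invented; as far as I know the second column is the inverse of the exposure time in seconds, followed by aperture, ISO and a flag, but verify against your own generated file):

```
img_01.jpg 0.5 2.8 100 0
img_02.jpg 1 2.8 100 0
img_03.jpg 2 2.8 100 0
img_04.jpg 4 2.8 100 0
img_05.jpg 8 2.8 100 0
```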

Then extract the response curve, by comparing the photos, and save it:

 pfsinhdrgen mycamera.hdrgen | pfshdrcalibrate -v -g 6 -s mycamera.response
By default this mycamera.response file contains a weighting table that effectively throws away the brightest and darkest pixels. So open the file in a text editor, find the weighting table at the end, and change the zeroed values (0.000000000) to a positive number (0.001000000).

Aligning the shots

If the pictures were taken hand-held you will need to align the stack of photos using hugin.

Alternatively the hdrprep tool (http://www.luxal.eu/resources/hdr/hdrprep/) can be used to do all this automatically and save a lot of time.
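If you prefer to stay on the command line, hugin also ships the align_image_stack tool, which can align one bracketed stack in a single step; a minimal invocation, with placeholder file names, looks like this:

```shell
# align one bracketed stack; writes the aligned, cropped images as
# aligned_0000.tif, aligned_0001.tif, ...
align_image_stack -v -a aligned_ img_01.jpg img_02.jpg img_03.jpg
```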

Merging the bracketed images to Radiance RGBE .hdr format

Create a hdrgen file listing each of your bracketed photos and their exposure times; you can base this on the mycamera.hdrgen file created earlier.

Then use this and your camera response file to create an RGBE file:

 pfsinhdrgen mypicture.hdrgen | pfshdrcalibrate -v -f mycamera.response | pfsoutrgbe mypicture.hdr

Check the output with pfsview:

 pfsinrgbe mypicture.hdr | pfsview
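For a full panorama you will repeat this merge for every stack. A small shell loop saves typing, assuming (hypothetically) one .hdrgen file per camera position named stack_01.hdrgen, stack_02.hdrgen and so on:

```shell
# merge each bracketed stack into its own Radiance RGBE file
for f in stack_*.hdrgen; do
  pfsinhdrgen "$f" | pfshdrcalibrate -v -f mycamera.response \
    | pfsoutrgbe "${f%.hdrgen}.hdr"
done
```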

Stitching with hugin

The RGBE images can be loaded into hugin as per usual with a couple of caveats:

  • Everything may appear very dark since our images represent linear sensor data. The display of HDR images can be configured in the Hugin Preferences.
  • Information about the Field of View was lost, so this will need to be re-entered manually or re-optimised.

Stitch the images as per usual into a TIFF file; you can use enblend as the final step.
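What hugin runs for you here is essentially nona followed by enblend; done by hand it looks something like this (project file and prefix names are placeholders):

```shell
# remap each source image to the output projection, one TIFF per image
nona -m TIFF_m -o remapped_ project.pto
# blend the remapped layers into the final panorama
enblend -o stitch.tif remapped_*.tif
```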

Post processing

This TIFF file is in floating-point 32bit per channel IEEE format. This is impossible to display in its entirety on a normal monitor, so you will probably want to create final 8bit per channel images for viewing.

Otherwise, typically an HDR panoramic image is used as a lightprobe for 3d rendering, in which case you are now done.

Adjusting in a GUI tool

There are various image editors that can open this file, such as cinepaint, krita, vips and HDRIE. The capabilities vary, so you need to experiment.

Alternatively, use pfstools to manipulate the image: pfstools has the facility to read HDR TIFF files.

An HDR image can be viewed using pfstools using:

 pfsintiff stitch.tif | pfsview

A quick way to create a good usable 8bit per channel image is to select logarithmic mapping, adjust the exposure slider until you see a good range of shadows and highlights, zoom 1:1 and save as PNG.

Tone mapping

Tone mapping operations use HDR compression to compress high dynamic range images.

Note that local tone mapping operators produce strange artefacts in the zenith and nadir of equirectangular images. So either choose a global tone mapping operator or retouch the poles afterwards.

A related package to pfstools called pfstmo can do automatic tone mapping of an HDR image and compress it into a low dynamic range output. There are many options and techniques available; commands look like this:

 pfsinrgbe stitch.hdr | pfstmo_drago03 | pfsgamma -g 2.2 | pfsout stitch.png
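Since the best operator is very scene-dependent, it can help to run several over the same input and compare the results side by side. The operator names below are from typical pfstmo releases; check which ones your build actually ships:

```shell
# produce stitch_drago03.png, stitch_reinhard02.png, stitch_fattal02.png
for op in pfstmo_drago03 pfstmo_reinhard02 pfstmo_fattal02; do
  pfsinrgbe stitch.hdr | "$op" | pfsgamma -g 2.2 \
    | pfsout "stitch_${op#pfstmo_}.png"
done
```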

qtpfsgui is an open source GUI for pfstools and can perform tone mapping interactively.

Photomatix also can perform tone mapping.

This article is out of date. You can help Panotools Wiki by expanding it.
