Linear Workflow is a relatively new feature for Cinema4D. Before its introduction in C4D Release 12, a linear workflow was only possible through the use of third-party plugins such as DeGamma by The Third Party.
This article aims to explain the concept behind Linear Workflow (LWF) and in the process look at the correct techniques, file formats and procedures when using LWF in your studio pipeline.
The information in the following post comes from an extremely informative and helpful document written by Philip Losch, one of the masterminds behind Maxon’s Cinema4D.
I’d like to thank Philip for giving me permission to use his material here.
Today’s monitors are not capable of showing colors how they really are. Let’s take a look at an example: a fine pattern of black and white pixels next to two grey patches. Which grey matches the pattern?
The surprising answer is: not the right one. The left one should – depending on your monitor – be pretty close.
Why is this surprising? We’ll see that when we zoom into the image:
At the top is a regular pattern of black & white pixels. At the bottom left the grey has a value of 192 (= 75%) and at the bottom right of 128 (= 50%).
Now… shouldn’t an even mix of 0% black and 100% white pixels blend into something like the bottom right grey of 50%?
At least that’s what we were taught in math: (0% + 100%) * 1⁄2 = 50%.
Unfortunately monitors do not work that way. They do not show colors linearly. The midpoint between 0% and 100% is not 50% on a monitor. Monitors display the incoming signal with a so-called gamma curve (actually it is even more complicated, but we’ll stick with this easier idea for a moment).
From left to right is your input image color; from bottom to top, the color actually shown. So if the image color is 50% gray, the real color shown is only about 18%. And to have 50% gray displayed, you need an image value of about 75% (see the red dotted line).
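The curve can be sketched in a few lines of Python. This uses the common gamma-2.2 approximation of a monitor’s response (the real sRGB curve is piecewise, so the exact percentages differ slightly from the figures above):

```python
# A simplified gamma-2.2 model of how a monitor maps the image value it
# receives (0.0-1.0) to the light intensity it actually emits.
# (The real sRGB curve is piecewise; gamma 2.2 is a close approximation.)

GAMMA = 2.2

def displayed(image_value: float) -> float:
    """Light intensity the monitor actually shows for a given image value."""
    return image_value ** GAMMA

def required_input(target_intensity: float) -> float:
    """Image value needed so the monitor shows the target intensity."""
    return target_intensity ** (1.0 / GAMMA)

print(displayed(0.5))        # ~0.22 -- a 50% image value shows as ~22% light
print(required_input(0.5))   # ~0.73 -- need ~73% input to display 50% light
```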
As mentioned above, the reality unfortunately is even more complicated. Monitors have so-called ‘color profiles’. A color profile is such a curve definition – however with a possible interaction of the Red, Green and Blue components. A color profile can do anything – e.g. it can substitute red with 50% green, it can add red and green and assign the result to blue… As an image looks widely different on devices with different color profiles, many image formats allow a color profile to be embedded that ‘records’ under which conditions an image was created.
Most images nowadays are created with a so-called sRGB color profile. sRGB roughly looks like the curve above and affects the Red, Green and Blue components independently.
But isn’t it a dumb idea to build monitors that don’t show colors linearly and not “how they really are”? The answer is yes and no – unfortunately our human eye also perceives light intensities nonlinearly and is more sensitive in certain intensity ranges.
This would not matter as much if we had displays with unlimited color resolution (floating point or ‘HDR’ displays), but monitor technology for the most part is still stuck at 8 bit color depth per channel (or sometimes 10 bit for medical displays).
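This is also why gamma encoding makes good use of those 8 bits. A small Python sketch (again using the gamma-2.2 approximation) counts how many of the 256 available codes land in the darkest part of the range, where the eye is most sensitive:

```python
# Why gamma encoding helps 8-bit displays: it spends more of the 256 codes
# on dark tones, where the eye is most sensitive. Here we count how many
# 8-bit codes cover the darkest 4% of linear light for each encoding.
# (Gamma-2.2 approximation of sRGB; illustrative, not C4D-specific.)

def encode_linear(v):
    """Store linear light directly in 8 bits."""
    return round(v * 255)

def encode_gamma(v):
    """Store gamma-encoded (sRGB-like) values in 8 bits."""
    return round((v ** (1 / 2.2)) * 255)

dark = 0.04  # the darkest 4% of linear light
print(encode_linear(dark))  # -> 10: only ~10 codes for the whole dark range
print(encode_gamma(dark))   # -> 59: ~59 codes for the same range
```

With linear storage the shadows get only a handful of distinct values and band visibly; gamma encoding redistributes the codes to roughly match how we see.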
So now that we know we have to make do with this existing technology, we have to look for a better solution. Actually, there is the possibility to correct this with software – this is where Linear Workflow comes in.
The main problem of renderings created without LWF: lights are overblown and colours mix badly (‘unnaturally’). While the artist can compensate for parts of it by adjusting lights, falloffs, colours etc., this is only possible to a certain extent.
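A toy Python example (gamma-2.2 approximation, illustrative values only) shows the effect: summing two light contributions directly on gamma-encoded values clips to white, while the same sum done in linear light gives a plausible brighter grey.

```python
# A sketch of why lights blow out without LWF: adding light contributions
# in display (gamma) space overshoots and clips, while adding them in
# linear light behaves like real light does.
# (Gamma-2.2 approximation of the sRGB curve; values are illustrative.)

GAMMA = 2.2

def to_linear(v):
    return v ** GAMMA

def to_display(v):
    return min(1.0, v) ** (1.0 / GAMMA)

a, b = 0.5, 0.5             # two lights, each a mid-grey contribution

naive = min(1.0, a + b)     # summed in display space: clips to pure white
correct = to_display(to_linear(a) + to_linear(b))  # summed in linear light

print(naive)    # 1.0  -- blown out
print(correct)  # ~0.69 -- a brighter grey, detail preserved
```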
How LWF works
CINEMA 4D is clever enough to offer you a “one button” solution – and by default LWF is automatically activated in new scenes. If you are wondering whereabouts to find this one-button solution, open the Project Settings (Control-D or via the Edit menu).
It helps to understand how this works internally, especially when you need to use your rendered images in other applications later. Let’s focus on the upper path in the following image (LWF on):
LWF works in three steps:
Step 1: Before rendering, all so-called ‘assets’ (which are any colors, bitmaps or external references) are transformed from their colorspace (or sRGB colorspace, if they have no colorspace assigned) into linear colorspace.
Step 2: The render engine does all its calculations in linear colorspace.
Step 3: The rendered image is transformed from linear colorspace into the colorspace that the user chose for image output. The colorspace – if the image format permits – is embedded into the image.
There is no rule without exception: if your texture or colour is in a bump, alpha, normal or displacement channel, the ‘raw’ image data is used (so step 1 is omitted). The reason for this is that other applications like ZBrush expect their images to work a certain way and contain direct height information (so e.g. black equals 0m height, white equals 100m and 50% grey equals 50m). You can also think of it this way: all material channels that have nothing to do with direct material colours (bump and normal channels change lighting only, alpha cuts out parts of the material, and displacement changes the elevation) do not undergo this first conversion step.
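The three steps and the data-channel exception can be sketched as follows. The channel names and function names here are illustrative, not C4D’s actual API, and gamma 2.2 stands in for the sRGB curve:

```python
# A minimal sketch of the three LWF steps, including the exception for
# "data" channels. All names are illustrative, not C4D's real API.

GAMMA = 2.2

# channels whose pixels are data, not colour -- used raw (step 1 skipped)
DATA_CHANNELS = {"bump", "alpha", "normal", "displacement"}

def srgb_to_linear(v):
    return v ** GAMMA

def linear_to_srgb(v):
    return v ** (1.0 / GAMMA)

def load_asset(value, channel):
    """Step 1: linearize colour assets; pass data channels through raw."""
    return value if channel in DATA_CHANNELS else srgb_to_linear(value)

def render(texture, illumination):
    """Step 2: all shading math happens in linear space."""
    return texture * illumination

def save(value):
    """Step 3: convert the result to the chosen output space (sRGB here)."""
    return linear_to_srgb(value)

colour = load_asset(0.75, "color")   # linearized before rendering
height = load_asset(0.75, "bump")    # used raw: stays exactly 0.75
print(save(render(colour, 0.5)))     # the sRGB value written to the file
```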
Here’s a concrete example of this process:
Step 1: A sphere’s texture is an 8-bit image with no color profile embedded.
As the image has no color profile, C4D assumes it is using sRGB (the most commonly used profile).
Let’s also assume the sphere’s texture is 75% blue.
C4D now transforms this color from sRGB colorspace into linear colorspace, so the value of 75% becomes roughly 50%.
Step 2: C4D does all lighting calculations linearly (and this is why the images will be so much superior – there is no color ‘distortion’ happening anymore).
Let’s assume a pixel is 50% illuminated. So we get 50% * 50% = 25% blue.
Step 3: The calculated image is converted into output color space, which in this example shall be sRGB again.
Our 25% becomes roughly 56% blue after transformation by applying the sRGB curve.
So the end result for our pixel is 56%, while in traditional rendering it would have been 75% * 50% = 37%.
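For reference, here is the worked example redone in Python with the exact piecewise sRGB transfer functions (IEC 61966-2-1); small differences from the rounded percentages above are expected:

```python
# The worked example above, computed with the exact piecewise sRGB
# transfer functions rather than the rounded percentages in the text.

def srgb_to_linear(v):
    """Exact sRGB decode (IEC 61966-2-1)."""
    return v / 12.92 if v <= 0.04045 else ((v + 0.055) / 1.055) ** 2.4

def linear_to_srgb(v):
    """Exact sRGB encode (IEC 61966-2-1)."""
    return v * 12.92 if v <= 0.0031308 else 1.055 * v ** (1 / 2.4) - 0.055

texture = srgb_to_linear(0.75)    # step 1: ~0.52 in linear light
shaded  = texture * 0.5           # step 2: 50% illumination, in linear space
result  = linear_to_srgb(shaded)  # step 3: ~0.55 back in sRGB

print(texture, shaded, result)
```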
And so they used LWF and lived happily ever after…
Not so fast! If you thought we only had to deal with !%§!$ hardware, I have to disappoint you – we’re unfortunately also living with !%§!$ software! It will take years and years until most software can properly handle colour profiles, linear workflow etc.
Fortunately Cinema4D, After Effects and Photoshop (and several modern compositing applications, for example Nuke) can play nicely together – as long as you choose the right settings.
Let’s go through the limitations and problems step by step.
1.) Windows and Mac OS X
The operating systems now have some support for colour profiles. While OS X does a pretty good job of evaluating image profiles, Windows 7 works for only some image formats, and older versions of Windows do not evaluate colour profiles at all. To view images, it is best to use C4D’s Picture Viewer, as it always supports colour profiles. If you check the View menu, you will see there is an option to view the image with or without the colour profile. Select the Information tab, and you can see the colour profile assigned to the image currently being viewed.
2.) Image formats
Lots of image formats do not support embedding of colour profiles. While this is no problem when you render images to sRGB colourspace (as most applications naturally assume this), it becomes a big problem in combination with Multipass renderings and linear workflow, where the image colour profiles must be linear.
You can choose your desired output Colour Profile in the Cinema4D Render Settings – Save options.
Always try to choose a format that supports colourspace information, unless
• You render a regular image to sRGB colorspace
• You render Multipasses, but don’t have LWF activated
• You render Multipasses, but know how to adjust the settings in your compositing application
The following image formats do not support colourspace information:
DPX (8- and 16-bit)
OpenEXR (8- and 16-bit)
QuickTime (any format)
3.) 32-bit images
A special case applies when you use 32-bit images for input or output. By definition, a 32-bit image is always saved in linear colourspace. C4D takes care of this for you automatically: if you choose 32-bit output, the colourspace option for the saved image is automatically disabled and linear colourspace is used.
4.) Photoshop CS4 and higher
Photoshop reads and writes regular images without any problems. Single-file Multipass images are also handled correctly, with one exception: if you render without LWF and output to 32-bit, Photoshop can no longer composite the image properly. This comes from the above-mentioned limitation that 32-bit image formats do not contain any colour profiles.
5.) After Effects CS4 and higher
The most important setting in After Effects that needs to be adjusted is “Linearize Working Space” in your project settings.
If you render with LWF in C4D this option needs to be enabled; if you render without LWF in C4D it needs to be disabled.
The reason for this is: Multipasses ‘outsource’ calculations to a compositing application. This only works though if the compositing application does the same mathematical calculations as the render engine, which means the colourspace they’re operating in needs to be identical.
Once you have colour management enabled in your After Effects project, selecting a footage item will show its colour profile at the top of the Project window next to the thumbnail. You can check here to ensure the profile is being read correctly.
Try to use 16-bit colour depth or higher to avoid banding – as multipasses sum up multiple layers and then have the result transformed into monitor colourspace, 8-bit usually isn’t enough.
For special passes like UVW data, Normal maps etc., activate the setting “Preserve RGB” in the “Interpret Footage” dialog to use the “raw” uncorrected data (not influenced by any colour profile).
So there you have it. Linear Workflow with Cinema4D and After Effects in a nutshell. Hopefully this article helps to clear up any questions you may have had regarding LWF and C4D. Once again, huge thanks to Philip Losch for originally preparing this information and allowing me to post it up on helloluxx.