Pikcells are a design-driven studio with 15 years in the industry. Recently they’ve been working on something new for them: creating TV ads for long-term client Wren Kitchens. We caught up with Creative Director Richard Benson to hear how they used Corona to composite actors into a rendered kitchen.
TV ads for Wren Kitchens
How did this project come about?
We have been working with Wren Kitchens since 2013 and have seen them go from industry disruptor to the number one retailer in the UK. Recently they have started pushing their marketing material into TV advertising, enlisting the help of Tribal DBB in London and producing some excellent advertising campaigns.
The guys at Wren are very switched on and quickly realised that they would need a new advert every quarter to keep things fresh, so they began looking for ways to streamline the process and reduce the ongoing costs. I think they must have been watching some behind-the-scenes videos on YouTube, because they asked if it would be possible to create a CG interior with live-action actors composited in. They would then be able to update the room set design regularly without having to reshoot everything.
We had never done this before, but of course we said yes…
Teaming up with Electric Theatre Collective, we designed the interiors and produced all the CG work, and they then took care of rotoscoping the actors and compositing them onto the CG back plate.
How much of it was CGI? How about the shot where the actor closes the dishwasher?
The dishwasher is all CGI, just like the rest of the kitchen. Surprisingly, this was a pretty straightforward shoot; the main issue was making sure the actor’s hand closed the tray and door in a way that did not obstruct the door handle and wouldn’t restrict the type of door – handleless or cup handle etc. – that we put on later in CG.
In case anyone is wondering about the shot of the man “doing the Titanic” on top of the truck, that one was all live action! The actor was leaning against a stand which was edited out in post by ETC.
What was the set-up for recording the real-world footage?
Everything in the space was painted a 50% matte grey so that there was no colour bounce on the actors and they were easier to rotoscope.
Clear 10mm acrylic sheets were added to the top of each work surface so that the actors’ reflections could be extracted in post.
The main thing here was to capture all the data from each shot so we could recreate it afterwards.
Did you take any real-world measurements from the set, in order to build the virtual set?
Our plan was to recreate the real world scene as exactly as possible so we recorded pretty much everything, including the slight changes in lighting for each shot.
We already had a 3D model of the set, which was sent to the set builders beforehand. Nevertheless, we still had to measure everything again to account for slight discrepancies.
As you can see from the photos, the lighting setup is very complicated.
For lighting data we recorded:
- Positions, rotations and heights
- Positions of the “barn doors” on the lights
- Colour temperatures and power values for each light
- The size, position and reflectance of the reflector paper, and the strength of the plastic sheets used to diffuse the lighting
- Objects that are not lights but occlude light
For the camera, we made a note of:
- Shutter speed
- White balance
We also planned to rebuild the actors as 3D characters, so we recorded their dimensions and took some reference photos as well.
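The per-shot data described above could be organised along these lines – a minimal Python sketch with hypothetical field names and example values, since Pikcells did not describe their actual capture format:

```python
from dataclasses import dataclass, field

@dataclass
class LightRecord:
    """One studio light as measured on set (hypothetical fields)."""
    position: tuple       # (x, y, z) in metres
    rotation: tuple       # Euler angles in degrees
    barn_doors: tuple     # opening angle of each flag, in degrees
    colour_temp_k: float  # colour temperature in Kelvin
    power_w: float        # light power in watts

@dataclass
class ShotRecord:
    """Everything needed to rebuild one shot in the 3D scene."""
    name: str
    shutter_speed: float    # seconds
    white_balance_k: float  # Kelvin
    lights: list = field(default_factory=list)

# Example: one shot with a single warm key light (illustrative values)
shot = ShotRecord("dishwasher_close", 1 / 50, 5600.0)
shot.lights.append(LightRecord((1.2, 0.8, 2.4), (0, -35, 0),
                               (20, 20, 45, 45), 3200.0, 650.0))
```

Keeping each shot’s lighting state separate like this matters because, as noted later in the interview, the lighting changed slightly between shots.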
How did you match the camera movements from the set in 3ds Max?
The camera movements were tracked by our pals at Peanut FX using the small yellow squares placed around the set. The footage was sent to them along with our scale model of the grey set.
They then sent us a basic 3D model along with 3ds Max cameras, which we simply added the Corona Camera Modifier to.
How did you go about rebuilding the set and matching the lighting?
The first step was to rebuild what had been shot on set using our recorded data, so we set about reconstructing the matte grey environment as a 3D model.
Once the matte grey set was rebuilt in 3D, we started reconstructing all the lighting: placing lights at the correct position, angle and height, then placing the occluding objects according to the data we had collected.
As we are used to lighting our images in a similar way – pretty much making everything the same as it would be set up in a studio – it wasn’t too much of a step change to set everything back up in the 3D scene. Once we had rebuilt everything in 3D and input all the data, the scene came together almost exactly as it looked in reality.
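Part of inputting the lighting data is turning each light’s recorded colour temperature into an RGB value. Corona lights accept Kelvin values directly, but for anyone reproducing a setup like this in other tools, a common approximation is Tanner Helland’s black-body curve fit – shown here as an illustration, not as anything Pikcells described using:

```python
import math

def kelvin_to_rgb(kelvin):
    """Approximate the sRGB colour of a black body at the given
    temperature in Kelvin (Tanner Helland's curve fit, valid
    roughly from 1000 K to 40000 K)."""
    t = min(max(kelvin, 1000), 40000) / 100.0

    if t <= 66:
        r = 255.0
        g = 99.4708025861 * math.log(t) - 161.1195681661
    else:
        r = 329.698727446 * ((t - 60) ** -0.1332047592)
        g = 288.1221695283 * ((t - 60) ** -0.0755148492)

    if t >= 66:
        b = 255.0
    elif t <= 19:
        b = 0.0
    else:
        b = 138.5177312231 * math.log(t - 10) - 305.0447927307

    clamp = lambda x: int(min(max(x, 0), 255))
    return clamp(r), clamp(g), clamp(b)

print(kelvin_to_rgb(3200))  # tungsten studio light -> warm orange-white
print(kelvin_to_rgb(6600))  # near the fit's neutral point -> pure white
```

A 3200 K tungsten value comes out with red at full strength and blue well below green, matching the warm cast you would expect from a gelled studio light.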
We wanted to recreate the real scene as accurately as possible, even adding barn-door-style boxes around the lights and replicating the plastic sheets covering them, which make the light more diffuse.
Behind the set were two large 3m × 3m acrylic screens, each with a high-powered spotlight behind it shining through. Each light had a coloured gel over it to provide warm and cool light. On the surface of the screens was a fabric grid of 10cm × 10cm squares, about 10cm deep, which channelled the light so it travelled in a straight line.
After initially rebuilding these lights exactly, we found that an easier way to replicate them was to create an HDRI of the screen surface and map it back onto itself. This reduced render times, and the difference in lighting was barely noticeable.
Inputting all the lighting and camera data into Corona Renderer was incredibly easy. Each shot had a slightly different lighting setup, which Corona’s LightMix feature made simple to manage.
The exteriors were built and rendered in a separate file with a tweaked lighting setup so they could be composited in through the refraction pass.
Any feature requests for Corona that you thought of during this project?
Yes – a more thorough matte override material. Currently you have to apply a matte material to every object that needs to catch shadows; it would be good to have a global override material, like V-Ray’s, which keeps the original material properties.
LightMix is great, but it can catch you out if you untick a light without turning it off and then send a render. It would be useful if unticking a light in LightMix meant that the light did not render.
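The pitfall described here – a light unticked in LightMix but still switched on in the scene, so it renders anyway – is the kind of thing a small pre-render sanity check could flag. A sketch over a hypothetical scene description (not Corona’s actual API):

```python
def lightmix_mismatches(lights):
    """Return the names of lights that are unticked in the LightMix
    but still enabled in the scene, and so would render anyway."""
    return [light["name"] for light in lights
            if not light["lightmix_enabled"] and light["scene_on"]]

# Hypothetical per-shot light states exported from the scene
scene_lights = [
    {"name": "KeyLight",  "lightmix_enabled": True,  "scene_on": True},
    {"name": "FillLight", "lightmix_enabled": False, "scene_on": True},
    {"name": "RimLight",  "lightmix_enabled": False, "scene_on": False},
]

# FillLight is unticked but still on, so it would be flagged
print(lightmix_mismatches(scene_lights))
```

Running a check like this before submitting a render would catch exactly the mismatch Richard describes.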