We spoke with John Crawshaw from wearelut about their recent “Turning Petrolheads” project for the Hyundai i30n. This involved creating a 60-second TV ad and two 15-second TV spots, plus a print and billboard campaign (which included lenticular images, which change with the angle of view, on the sides of London buses).
All the work featured a mix of CGI blended with real world footage, and we asked John about the challenges they faced.
Read about wearelut’s work on the Hyundai i30n campaigns
Hi John! Thanks for taking the time to talk with us – tell us a little about yourself
I first had a love of art and design very early on in my life, studying fine art at school and then Art & Design at Barnsley College. I knew I wanted to do something with art after college but wasn’t sure what.
My Dad was a design engineer, designing steel rolling machines and had just moved into using 2D CAD. A colleague of my Dad’s visited the house one night trying to convince him that 3D was the way to go, using a program called 3D Studio, and that was the moment that I knew I wanted to get into 3D.
There were no courses at the time that did 3D specifically, so my only option was to study Multimedia at Leeds College of Art & Design and do all the other stuff that came with the course, like web design, fine art, photography, etc.
Web design was on the up in a big way back then; everyone wanted a website. I was asked to work at a company in Bradford for one day a week, helping out with website design alongside designer Guy Utley (who now runs a successful agency) and he taught me a lot about design and how brands work.
Guy knew I was into 3D, and he knew a couple of people in Manchester running a company doing graphic design and also producing 3D animations for Brother printers. There, Jody Clark and Simon Dixon showed me the ropes of 3D Studio Max; working on the Brother printer animations was good, but I wanted to do more.
I didn’t want to move from Manchester, so I explored my options and found a company doing architectural visualization called AHD Imagery (now MI). Tony Denton was at the helm and business was booming. With short deadlines and lots of images, Tony and the team there gave me a baptism of fire in arch-viz, and this gave me a great foundation in creating realistic CGI.
Tell us a bit about wearelut
wearelut is a small company founded by myself. I set up the company to facilitate the Hyundai i30n project and things have snowballed from there. We currently use a small crew of staff to handle any new briefs that come in.
Having been in the industry for many years, I know a lot of the artists around Manchester and who can help me on certain jobs. Since the Hyundai project, we’ve been able to pitch on some new work and things are heading in the right direction.
How did you come to be involved in the Hyundai i30n project?
The Hyundai project came about in a strange way. Back in November 2017 a photographer friend called Sean Conboy contacted me. He had been working with an agency in Preston called Wash Design, and they had just won a contract to produce a TV commercial and outdoor media for Hyundai. Sean said that they would like to meet in Preston for a drink and talk about the ins and outs of the project.
I met Andy Wamsley, the owner of Wash, and we hit it off from the start. It soon became apparent to me that this was a big project. I said that we would need to get a TV production company involved who could take his idea and make it happen. One month later we met with Chief Productions. Looking at their work, we knew they would nail the project – and they did.
The next step was to do a test for Wash and Hyundai based on some early sketches. The test worked out great in a couple of ways: firstly, it was a great way for Wash to see the first iteration of the petrolheads come to life, and secondly, it showed Wash whether we were up to the challenge. We passed with flying colours, but it was apparent that we would have to go back to the drawing board on the petrolheads themselves. Andy went away, worked up some sketches, and came back to us a month later.
When he did, he wanted us to quote on the whole project. Wow! My first independent job, and it was to be on a major TV commercial for a top brand. The costs were approved and we started the planning process.
What hardware and software did you use for the camera matching?
Running my own company, I ended up becoming a minor expert on computer hardware, reading endless blogs and listening to podcasts. I soon realised that the new i9 chipset would be my weapon of choice, offering a good balance of cost and speed.
For the Hyundai project, we had two i9 7980XE PCs with 64GB of RAM running pretty much constantly. They were able to handle a lot of frames, and having only two machines made management easier.
Regarding camera tracking, we used a package called 3D Equalizer. An expert piece of software in its field, it made camera tracking the faces of the actors a lot easier than some of its competitors. It was a nice, lightweight piece of software that fitted in perfectly.
How hard was it colour grading the renders to match the footage?
With the grade, we had a couple of issues. When we started the project, we were given an edit containing all the signed-off shots. This had just a basic LUT applied so that the Arri Alexa footage was usable.
The problem was that the grade wasn’t going to be done for another two weeks by James Bamford at The Mill in London. We had to show our client something, and it had to sit into the shot so that they didn’t get worried, but only getting the grade halfway through the project wasn’t ideal. It meant we ended up grading the early sequences twice.
To make sure the CGI bedded in well, we used basic techniques to match it to the footage, knowing that if we did that as a base we would be in a good spot when the graded footage finally came back.
Our first action when matching the CGI would be to sample the whitest and darkest points of the footage and then match the black and white levels of our CGI to them. This was a great starting point before we moved on to tweak the white balance of the camera. All our renders from Corona were set at a neutral white balance with no contrast – this way we could control it more in the comp.
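The levels-matching step described here boils down to a simple linear remap: sample the plate’s darkest and brightest values, then stretch the neutral CG render so its black and white points line up with them. A minimal sketch (illustrative only, not wearelut’s actual pipeline code; the function name and sample values are assumptions):

```python
import numpy as np

def match_levels(cg, plate_black, plate_white):
    """Linearly remap a neutral CG render so its darkest and brightest
    values line up with points sampled from the footage.
    cg: float image data in [0, 1]; plate_black / plate_white:
    luminance values sampled from the plate."""
    cg_black, cg_white = cg.min(), cg.max()
    scale = (plate_white - plate_black) / (cg_white - cg_black)
    return plate_black + (cg - cg_black) * scale

# A flat 0..1 gradient remapped into a plate whose sampled
# black point is 0.05 and white point is 0.9
render = np.linspace(0.0, 1.0, 5)
matched = match_levels(render, plate_black=0.05, plate_white=0.9)
# matched now runs from 0.05 to 0.9
```

In practice the white-balance tweak mentioned above would follow this as a separate per-channel adjustment, which is why rendering with neutral settings keeps all of that control in the comp.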
In the breakdowns, we see a wireframe of the car – was this just used for blocking (and the car in the final videos is from the real world footage?)
For all the environments, we had varying levels of detail depending on what each shot required. In the breakdown where Beamer gets out of the car, we used the car for reflections and shadows. The camera tracking guys had used it for scale anyway, so it made sense to add it in.
This was the case with many of the shots. The cafe scenes had tables, chairs, etc. just to add realism to the reflections. We also made a basic body double of the characters so that we could see their clothes in the reflections on their pipes.
How did you first discover Corona Renderer?
I first discovered Corona by following The Boundary on Instagram. At first, I thought it was just another renderer, but as time went on and more people used it, the results got better and better. James Cook, who modelled the petrolheads, was the first person to show me what it was like to use.
I’ll admit that I still wasn’t convinced at the time, but the more I used it, the more I realized how easy it was to work with. I think without the excellent Scene Converter it wouldn’t have had such uptake in the industry. Now I struggle to find many companies in the arch-viz world that don’t use it.
How was it using Corona Renderer to produce animations?
Before the project started I had some concerns about Corona, as it was more of an unknown than anything else in the project. I’d done many animations in V-Ray, and I wanted to make sure I could achieve everything that I might require in Corona.
Two things stood out straight away. The first was speed: the frame times were half of what they were in V-Ray, and the chromes looked better. We were also able to tweak the blooms and glares in a super fast and simple way.
The second point was matte shadows. In a few shots, the petrolheads were interacting with the ground or walls. One watch of a simple tutorial on the Corona website and I was sorted! We really enjoyed screen-projecting the backplate onto tracked geometry, as it gave us perfect shadows with alphas which we could tweak in comp.
I’ve had a few people ask what method we used for primary and secondary GI bounces. The scenes were so quick to render that we just used path tracing for primary and the UHD Cache for secondary. We realise this wasn’t the cleanest or most accurate way, but it saved us loads of time. I’m sure that one day path tracing for both primary and secondary will be practical.
When it came time to render, did you do that locally, or on a render farm?
All the rendering on this project was done locally by us in the end. We were prepared to use a farm if needed, but our two i9 7980XE machines were enough. I would say to anyone out there looking at equipment: the more machines you have, the more headaches you have.
We used Deadline to queue up jobs, and we had custom scripts to make sure all the renders came out the same every time. The script would set the render path and remove any Corona contrast and white balance adjustments. It would also add in all the render elements that we needed for compositing. And that was it!
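The idea behind such a script is simple: normalize every job to the same known-good state before it renders. A minimal sketch of the concept in plain Python (the settings keys, element names, and paths here are hypothetical stand-ins; a real Deadline job script would drive 3ds Max and Corona through their own scripting APIs):

```python
# Hypothetical settings dict standing in for a render setup; a real
# script would read and write the renderer's actual properties.
REQUIRED_ELEMENTS = ["diffuse", "reflect", "refract", "alpha", "zdepth"]

def normalize_render_settings(settings, output_path):
    """Return a copy of the job settings with the output path set,
    tone-mapping neutralized, and all comp elements present."""
    fixed = dict(settings)
    fixed["output_path"] = output_path
    fixed["contrast"] = 1.0        # neutral contrast, graded in comp
    fixed["white_balance"] = 6500  # neutral white balance (Kelvin)
    # Add any missing render elements without duplicating existing ones
    elements = set(fixed.get("elements", []))
    fixed["elements"] = sorted(elements | set(REQUIRED_ELEMENTS))
    return fixed

# An artist's scene file might arrive with stray tweaks like these:
job = {"contrast": 1.2, "white_balance": 5200, "elements": ["alpha"]}
fixed = normalize_render_settings(job, "//server/renders/shot_010/")
```

Running every job through one function like this is what guarantees that, as John says, the renders come out the same every time regardless of what state the scene file was saved in.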
What is your favourite part of this project?
My favourite part of the project was when we finished the cafe scene. This had been the hero scene for Andy (the Wash Design owner) from day one.
It was the first scene to get signed off as it came over from the editor, so it was the first one we started. We had already prepared and tested the HDRI. When the first couple of shots came out of the comp, I knew it was going to work well. At the end of the week we delivered the scene to Andy and Nate Camponi (the director) and they loved it.
What kinds of changes were needed to the scenes, materials and lighting for the print ads vs. the videos?
The main difference between the print and TV campaigns was the detail and size of the textures. Most of the textures were at 15K, but not all of them.
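As a rough sanity check on why print textures end up this large: the pixels needed across a printed surface are simply its physical width in inches times the print DPI. The numbers below are illustrative assumptions (a roughly 4 m wide bus-side panel at 100 DPI), not the project’s actual specs:

```python
def pixels_needed(width_metres, dpi):
    """Pixels across for a surface printed at a given DPI
    (1 inch = 0.0254 m)."""
    inches = width_metres / 0.0254
    return round(inches * dpi)

# A ~4 m wide panel printed at 100 DPI needs roughly 15,748 pixels
# across -- in the same ballpark as the 15K textures mentioned above.
px = pixels_needed(4.0, 100)
```

TV delivery, by contrast, tops out at a few thousand pixels across the whole frame, which is why the same assets could get away with much smaller textures there.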
In terms of lighting, we used the same HDRI technique as on the TV spots. We shot HDRIs on set and then replaced the lights with Corona lights. We found this was a useful way to get the right intensity, and it allowed a little more flexibility. We tried altering the HDRIs in Photoshop, but ultimately it took too long. The only thing we did to the HDRIs was clean up the set so that the reflections were nice and clean.
How is it living and working in Manchester?
I’ve been in Manchester for about 12 years now, and it’s just a great city. It’s the right size to sustain a good creative community in areas such as digital and graphic design, although I feel it has always struggled to compete with London on the VFX/advertising front.
There are a few studios in Manchester doing VFX now, and I really hope things can kick up a notch so we can create a place that attracts people away from London and other major cities.
Any other projects that you are working on, or have lined up, where you will be using Corona Renderer?
To be honest, we can’t talk about many of the projects we have out for tender, but I can confirm that we will use Corona Renderer on all of them. It’s reasonably priced, super easy to use, and gets very quick results. The new advancements in V-Ray integration have made it an even stronger product.
I hope you’ve found this behind-the-scenes look at the project interesting!