Adrien has created a detailed breakdown of the creation process for one of the environments in the short film Initium. Adrien was part of a team of students from ArtFX in France who completed the short, and his final demo reel was selected as the overall winner of last year's 'Student of the Year' award for VFX.
This shot is from Initium, the short film I co-directed during my last year at ArtFX. It depicts the arrival of the hero at the source of a temporal distortion. For the environment work I used Autodesk Maya, V-Ray, ZBrush, Adobe Photoshop and Nuke. The establishing shot took about two to three weeks.
I began by creating the shot very roughly for the previz. At this stage there was a lot of back and forth with animation. I used very simple shapes to design the environment.
I spent a lot of time adjusting proportions and trying to balance the landscape and the buildings.
Once I was satisfied with the blocking of the environment, I started with what seemed likely to take the most time: the icebergs and ice cliffs.
ZBrush was used to design the environment. I sculpted the cliff from a plane, and the icebergs from DynaMesh spheres. It really wasn't a difficult setup. However, as you can't import Maya cameras into ZBrush, I had to go back and forth with Maya several times to check that the composition didn't drift too far from the original layout.
My favourite brushes for environment design.
When my mesh became too stretched, I used retopology tools before going into the details. I projected my original sculpt onto the clean mesh and refined it as best I could.
Retopology of the ice cliff using ZSpheres.
Due to time constraints, I knew I couldn't use a classic CG workflow with UVs and textures, so I prepared my work for camera mapping instead. That is why I split the cliff into several pieces, reduced the polygon count with Decimation Master and exported the meshes to Maya.
For the needs of the film, we wanted a base station with an industrial, unsafe look and an impressive size. It needed to be big enough to suggest that this industrial complex could be the source of the temporal distortion. As the proportions were already set in the previz, it was quick to build. I created a library of props and sci-fi elements, which allowed me to finish the final design of the base station in a couple of days.
Closer look at the modelling / Base station seen from the final camera.
To finish the modelling, I imported some pieces of the base station into ZBrush and refined the contact between the buildings and the icebergs to make it more convincing.
Refining the base station contact in ZBrush
V-Ray was used to render the shots. For this one it was a very simple setup: a directional light to simulate the sun and an HDR image of a sky plugged into a V-Ray dome light. As I said, I had no time for UVs. I duplicated cameras from the animated one and changed their film gates to extend the frame; this later allowed me to create textures covering the whole environment. I made high-resolution renders for each of these cameras. Using the EXR format let me store diffuse, raw lighting, raw GI, reflection, SSS, specular, normal and ID mask passes in one file. It was a practical way to work in Nuke, as it allowed me to replace, in comp, any of the passes that I had improved in Photoshop.
ID masks stored in an EXR file of the base station, ready to use in Photoshop to help the "texturing" process.
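The film-gate trick above amounts to a simple piece of overscan arithmetic: scaling the camera's aperture and the render resolution by the same factor widens the field of view without touching the focal length, so the rendered texture extends past the animated frame. A minimal sketch of that idea (the numbers here are illustrative assumptions, not values from the film):

```python
# Overscan sketch: grow both the pixel resolution and the film gate
# (aperture) by the same factor, so the camera sees a larger frame while
# its focal length, and therefore its perspective, stays unchanged.

def overscan(resolution, film_gate, factor):
    """Return the extended (resolution, film_gate) for a given overscan factor."""
    w, h = resolution
    gw, gh = film_gate
    return (round(w * factor), round(h * factor)), (gw * factor, gh * factor)

# A 2K full-aperture camera pushed out by 20% on each axis (hypothetical values):
res, gate = overscan((2048, 1556), (0.980, 0.735), 1.2)
print(res)  # (2458, 1867)
```

Because only the gate and resolution change, any pixel inside the original frame lands in exactly the same place in the extended render, which is what makes the larger image usable as a projection texture.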
The shading itself was very basic: the metallic objects of the base station had a V-Ray material with a chrome look, and I applied a more matte material to the other pieces.
Render, diffuse painting and composited samples of some projections.
Setting up each of the projections took a bit of time but was really worth it. Even though the geometries were rendered in isolation, I had to be very careful with shadows. For example, I had to keep in mind that the set always casts a shadow on the station; to preserve this, I disabled the primary visibility of the ice elements before rendering the station's layer.
Starting from the beauty renders of the projections, I re-injected new versions of the diffuse (and other passes) into the Nuke tree.
Camera mapping tree of the environment (blue backdrops for the ice, orange for the base station).
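Storing the raw passes in one EXR is what makes this re-injection possible: the beauty can be rebuilt additively in comp, so repainting a single pass in Photoshop re-weights the result without a re-render. A per-pixel sketch of that rebuild, using V-Ray-style pass names (the exact channel set used on the film is an assumption):

```python
# Additive beauty rebuild from render passes, per RGB channel of one pixel.
# Multiplying the raw lighting passes by the diffuse colour reconstructs the
# lit diffuse; reflection, specular and SSS are then added on top.

def rebuild_beauty(p):
    """Rebuild the beauty value of one pixel from its passes."""
    direct = p["raw_lighting"] * p["diffuse"]   # direct diffuse lighting
    indirect = p["raw_gi"] * p["diffuse"]       # indirect (GI) diffuse
    return direct + indirect + p["reflection"] + p["specular"] + p["sss"]

pixel = {"diffuse": 0.5, "raw_lighting": 0.75, "raw_gi": 0.25,
         "reflection": 0.125, "specular": 0.0625, "sss": 0.0}
print(rebuild_beauty(pixel))  # 0.6875

# Repainting only the diffuse pass re-weights both lighting terms:
pixel["diffuse"] = 0.75
print(rebuild_beauty(pixel))  # 0.9375
```

This is why the article calls the multi-channel EXR "a practical way to work in Nuke": the comp tree performs exactly this sum, and swapping one input updates the beauty for free.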
The final composites were then projected, in Nuke's 3D space, onto low-resolution geometries. The alpha channels of my projections, cutting into the geometry, gave the impression that the meshes were quite detailed, when in fact the bridges, for instance, were simple cubes.
Nuke 3D Space: preview of the environment projected on low-resolution meshes
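Under the hood, a camera projection like this simply pushes each point of the low-res mesh through the projection camera to find which texel of the pre-rendered image should stick to it. A minimal pinhole-camera sketch of that mapping (camera at the origin looking down -Z; the values are purely illustrative, not from the film's setup):

```python
# Camera-mapping sketch: map a 3D point (in camera space) to normalized
# image coordinates in [0, 1] via a pinhole perspective projection.

def project_to_uv(point, focal, aperture):
    """Project a camera-space point onto the film plane and return (u, v)."""
    x, y, z = point
    if z >= 0:                       # point is behind the camera
        return None
    # Perspective divide onto the film plane at distance `focal`.
    fx = focal * x / -z
    fy = focal * y / -z
    aw, ah = aperture                # film gate width/height, same units as focal
    return (fx / aw + 0.5, fy / ah + 0.5)

# A point 10 units in front of a 35mm lens with a 36 x 24 aperture:
print(project_to_uv((2.0, 1.0, -10.0), 35.0, (36.0, 24.0)))
```

Because the mapping depends only on the projection camera, the low-res stand-in geometry can be as crude as a cube: wherever a texel lands, it carries the detail of the high-resolution render with it.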
→ It gave me the ability to render the shot in record time. I wouldn't have been able to finish all the environments in the film if I had had to render each frame with a classic full-CG workflow. (Although the very beginning of this shot, with the aircraft crossing the tunnel, was indeed computed that way.)
→ Great flexibility: I could update the camera move (if the changes were not too extreme) at any time without re-rendering the whole shot.
→ I was able to generate a 3D fog in Nuke, along with the various effects the scenario called for (laser beam, temporal sphere...).
Nuke Scanline render of the 3D fog.
There is still a lot more I could say about this shot, but this sums up the main steps and the way I worked on the environments for this film.
I must thank Thomas Mouraille (MPC Lead Environment / DMP artist on World War Z and Guardians of the Galaxy). I had the chance to take part in his DMP workshop while studying at ArtFX; his experience was a precious help and a great source of inspiration.
CG Artist at Bad Robot Production