TERM 3 – PERSONAL PROJECT

In a dystopian future where plant life has become extinct, Dr. Riven, the head scientist at FloraTech Genesis Institute, embarks on a mission to revive vegetation on Earth. The story begins with Dr. Riven recording his experiment logs, expressing both reluctance and determination. He describes various trees that existed long ago, using a holographic display to showcase their importance and the consequences of their loss.

Dr. Riven emphasizes the dire state of the planet, stressing the importance of restoring plant life for ecological balance and survival. He highlights that the success of this project hinges on humanity’s renewed respect and care for the environment. The narrative concludes with a poignant reminder of past mistakes and a hopeful call to action for environmental restoration.

The moral of “Project Terra” is the crucial importance of environmental stewardship. It underscores the devastating effects of deforestation and the urgent need to revive and protect plant life to ensure the survival of our planet. The story advocates for learning from past environmental missteps and fostering a sustainable relationship with nature, emphasizing that collective effort and respect for the ecosystem are vital for a thriving future.

*Submitted on 4th of July, 10:11 am


Working on this individual project has truly been a rollercoaster ride for me. There were moments when I felt completely lost and unsure of what to do next. However, taking short breaks to clear my mind and refocus proved to be immensely helpful. This project has been both challenging and rewarding, pushing me to improve my skills and develop a deeper understanding of compositing.

One of the significant challenges I faced was managing my time effectively while juggling multiple responsibilities. To accelerate the process, I chose to use After Effects for compositing, believing it would be easier and quicker due to its user-friendly interface and my familiarity with the software. Despite this, the project still took considerable time to complete, partly because of the complexity of the visual effects and the need for meticulous attention to detail.

During the project, there were several instances where I hit roadblocks and didn’t know how to proceed. Taking a step back, resting, and then approaching the problem with a fresh perspective allowed me to find solutions more effectively. This approach not only helped in overcoming technical hurdles but also in maintaining my mental well-being.

Another challenge I faced was with the production phase, where I was forced to reshoot my footage thrice, which resulted in a waste of time and effort. However, constant determination and quick thinking helped me finish the project in time and create an output I'm proud of. I also experienced constant lag in After Effects because the files and the effects used were heavy and took a lot of time to load. One solution was to reduce the resolution of the timeline; however, that meant I was working without clearly seeing anything. A better solution I came up with was to render a couple of frames of the portion I was working on just to check if everything worked perfectly. That saved me a lot of time and made my computer happy as well ;).

Another problem I faced was with how to visually present my holograms. It took me a while to figure out, but eventually I solved it. Watching a lot of movies and grabbing references and inspiration from wherever I could, be it an image, a video, anything, helped me visualise and conceptualise what I wanted to create.

This project has been a significant learning experience for me. I have become more proficient in using After Effects, Gaea, Unreal Engine, Maya, Substance Painter and especially Premiere Pro, particularly in handling complex compositing tasks. The process has taught me the importance of patience and persistence, as well as the value of taking breaks to maintain productivity and creativity.

My problem-solving skills have improved as I learned to navigate through challenges and find creative solutions. I have also gained a better understanding of the importance of planning and time management, especially when balancing multiple projects.

One thing I learned from this project is that problems are destined to happen in one way or another, but it is how you overcome them that makes you good at your work!

One of the most gratifying aspects of this project is the realization of how much I have grown since starting my coursework at LCC. Each project I work on contributes to my skill development, and I can confidently say that my abilities have improved significantly. The progress I’ve made from the beginning of my coursework to now is something I’m extremely proud of.

The final outcome of this project is something I am very proud of. Despite the challenges, I was able to create a visually compelling and technically sound composition. The project involved intricate compositing work, seamlessly integrating various elements to create a cohesive and engaging visual narrative. Completing this project has had a positive impact on my confidence and skills. It has reinforced my belief in my ability to handle complex tasks and produce high-quality work. The feedback from peers and mentors has been encouraging, highlighting the technical proficiency and creativity evident in the final product.


So, after much research, thinking and scribbling out scripts, I finally came up with an idea and a script for my project. The theme was still the same: deforestation. The story is set in the future, with the primary location being a botany lab called FloraTech Genesis Institute. (The name actually took a bit of research: Flora means plants, Tech obviously technology, and Genesis means birth or creation. What I basically had in mind was a laboratory where plant rebirth is done, hence the name ;).) The story is about a scientist who is the head of Project Terra Renewal at this lab. His name is Dr. Riven. He is tasked with recording his daily activities and progress on this project in the form of video logs. So the scientist is recording his video for Day One, and he explains the various trees that were important for the well-being of our planet using a HUD, then shows what our planet was like in the past, covered with trees. He goes on to say that this is what he plans to restore, because if this mission fails, our planet won't survive long.

The underlying theme is still the same, but I changed the instances and shortened the dialogue as much as possible. I'm planning to shoot this sometime between 27th and 31st May, as I have asked about the availability of some rooms in LCC. I'm looking for a room that remotely resembles a sci-fi, futuristic lab, with no computers or anything. So, once I hear back about the rooms' availability, I'll shoot it as soon as possible. Meanwhile, I worked on recreating the intro sequence of this project in Unreal Engine by removing the city wall and adding a building that resembles a sci-fi lab. I downloaded a cool-looking building from Sketchfab, but that building alone seemed a bit lonely. So I downloaded another building, but I did not like how it was textured. So I textured it again in Substance Painter and gave it a glass finish, which looked super cool and sci-fi-ish in my opinion. I tweaked the lighting a bit and worked on real-time rendering, and the output was far, far better than what I got in the first scene. It looked super cool and I showed it to Manos, who had the same opinion. So I was really happy with how it turned out; now all I need to do is shoot the film and start working on the VFX. My primary reference was Avatar (2009) and some scenes in it.

I did my reshoot on 31st of May in a room on the 3rd floor at LCC (I forgot the number ;). I was quite happy with how the whole setup of the scene looked. Although there were no objects or subjects in the background, the whole scene felt like something was missing. Anyway, I started working on it to see how the holographic display and the trees should look. But after consulting with Manos, he told me to add some movement to the camera so I could work on motion tracking and matchmoving. I thought that was a good idea too. So I planned to shoot again (for the 3rd and hopefully the final time). I did the shoot at my accommodation building by renting out an office room. I shot the whole sequence on an iPhone 14 at 4K resolution and 24fps. I added a sort of motion-sensing movement to the scene, which I thought would be a cool addition, since the script demands that the lead self-record his video log and the motion-sensing movement would feel sci-fi and futuristic.

My first step, even before starting the shoot, was to create a sci-fi, futuristic, dystopian city environment. This was crucial as it set the tone and atmosphere for the entire project. So, I made it my first task to find and create the necessary assets to bring this vision to life.

I began by downloading some assets from Sketchfab, a website with a lot of 3D models. Although the environment changed quite a bit from my initial idea to the finished product, I put a lot of effort into creating it. Originally, I wanted to make a city with a big wall dividing the rich people inside from the poor people outside. This was meant to show that social hierarchy would still exist in the future. However, after making some changes to the script, I decided to drop the wall idea. Instead, I placed buildings that looked like futuristic science labs directly into the scene.

For the assets, I downloaded models from Sketchfab and textured a few of them myself using Substance Painter. This involved painting the textures and making sure the materials looked right for a gritty, high-tech city. I also downloaded other assets from Quixel Bridge and Marketplace, which provided high-quality models and textures that fit well into my environment.

With all the assets ready, I started building the environment. I began with an empty field and then added lights to create depth and highlight important areas. I used directional lights to simulate sunlight and point lights for specific spots, which helped enhance the overall mood.
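Just as a note for future me, this kind of basic light setup can also be scripted with Unreal's editor Python bindings instead of placed by hand. Here's a rough sketch, assuming Python editor scripting is enabled; the positions, rotation and intensities shown are purely illustrative, not my actual scene values:

```python
# Rough sketch of a sun + fill light setup via Unreal's editor Python API.
# Positions and rotation are illustrative, not the values from my scene.
import unreal

actors = unreal.get_editor_subsystem(unreal.EditorActorSubsystem)

# Directional light standing in for the sun.
sun = actors.spawn_actor_from_class(unreal.DirectionalLight,
                                    unreal.Vector(0.0, 0.0, 1000.0))
sun.set_actor_rotation(unreal.Rotator(pitch=-40.0, yaw=30.0, roll=0.0),
                       teleport_physics=False)

# A point light to pick out a specific spot in the scene.
fill = actors.spawn_actor_from_class(unreal.PointLight,
                                     unreal.Vector(2000.0, -500.0, 300.0))
```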

Next, I placed the 3D models into the scene, arranging them to create a sense of organized chaos typical of a dystopian city. I adjusted their size and position to make sure they looked prominent and realistic. I also tweaked their settings to ensure they were visible and striking, creating a skyline that was both impressive and slightly menacing.

Lighting was key to setting the mood of the scene. Inspired by the movie Blade Runner 2049, I adjusted the lighting to give the scene a yellowish tint, which added a hazy, futuristic feel. To further improve the look, I added a post-process volume, allowing me to apply effects like color grading and bloom, which added depth and richness to the final render.

To bring the scene to life, I created a level sequence and added a camera. I set the camera to mimic a DSLR, which gave the shots a cinematic quality. I manually set the focus to ensure it highlighted the important areas, creating dynamic and engaging visuals. I added keyframes to animate the camera, capturing sweeping shots and intricate details from various angles.

I wanted to render each shot separately without cutting within a single take, so I captured each scene individually. This gave me more control and ensured each frame was polished. For rendering, I used the Movie Render Queue plugin from Unreal Engine 5. I chose the EXR format for its high quality and used additional features like anti-aliasing and deferred output to enhance the visuals, resulting in clean and sharp renders.
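For reference, the same render setup can be queued through the Movie Render Queue's Python bindings rather than the UI. Below is a minimal sketch, assuming the Movie Render Queue plugin and Python editor scripting are enabled; the sequence and map paths are placeholders, not my actual project assets:

```python
# Minimal sketch: queue a level sequence for an EXR render with a deferred
# pass and anti-aliasing via the Movie Render Queue Python API.
# Asset paths below are placeholders, not my actual project files.
import unreal

subsystem = unreal.get_editor_subsystem(unreal.MoviePipelineQueueSubsystem)
queue = subsystem.get_queue()

job = queue.allocate_new_job(unreal.MoviePipelineExecutorJob)
job.sequence = unreal.SoftObjectPath("/Game/Cinematics/IntroSequence")  # placeholder
job.map = unreal.SoftObjectPath("/Game/Maps/DystopianCity")             # placeholder

config = job.get_configuration()
config.find_or_add_setting_by_class(unreal.MoviePipelineDeferredPassBase)         # deferred output
config.find_or_add_setting_by_class(unreal.MoviePipelineImageSequenceOutput_EXR)  # EXR frames
aa = config.find_or_add_setting_by_class(unreal.MoviePipelineAntiAliasingSetting)
aa.spatial_sample_count = 8  # more samples for cleaner, sharper frames

# Kick off the render with the in-process (PIE) executor.
subsystem.render_queue_with_executor(unreal.MoviePipelinePIEExecutor)
```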

Throughout this process, I realized how much I enjoy working in Unreal Engine, especially in creating detailed environments. Bringing complex, immersive worlds to life is incredibly satisfying. This project confirmed my passion for environment design, and I want to focus on becoming a 3D environment artist in the future. The mix of technical skills and creativity needed for this field aligns perfectly with my interests, and I’m excited to keep improving my skills and exploring new possibilities in 3D art.

I encountered challenges when attempting to render the tree models I downloaded from Sketchfab. Despite experimenting with various settings to enhance realism, the results looked artificial and unnatural. This prompted me to explore alternative approaches to integrate the trees into my project.

One option was utilizing Unreal Engine to render the trees and extract their alpha channels for compositing. However, the textures behaved unexpectedly within Unreal Engine, complicating the process further. After several attempts to troubleshoot and refine the settings, I realized the tree models weren’t integrating smoothly with the scene as I had hoped.

Given these difficulties, I made the decision to abandon the idea of incorporating 3D tree models altogether. Instead, I opted to focus on other elements of the project that could be executed more effectively within the available resources and software capabilities. This shift allowed me to prioritize tasks that aligned better with achieving the desired visual quality and coherence in the final output.

This experience underscored the importance of flexibility and adaptation in creative projects, where exploring different solutions and being willing to revise initial plans can lead to more successful outcomes. By reassessing and adjusting my approach, I was able to maintain progress and ultimately enhance the overall quality of the project.

The experience of creating a landscape in Gaea was completely fascinating and awe-inspiring, to say the least. I've watched several videos and tutorials on creating worlds so effortlessly in Gaea and had always wanted to try it. That time came with this project. I created a landscape of mountains in Gaea and, to be really honest, it was the easiest landscape I've ever created in my life. The interface and the features and tools Gaea offers are awesome and easy to work with. It has preset terrains like Rocky, Mountains, Range, etc. to create something from scratch.

Although creating the landscape was easy, I had a lot of trouble exporting the heightmap file from Gaea and importing it into Unreal Engine. Gaea offers export options only in the form of image file formats like PNG, EXR or TIFF. There is no option to export the landscape as a 3D file format like FBX or OBJ. But still, it is what it is. I had also created some textures in Gaea itself. I readied the files for export and exported them. I had set the scale to 2600m and the height to 5000m. So, when importing them into Unreal Engine, I had to use some formulas and values for the landscape to appear correctly. The way to import a heightmap file into Unreal is to select the Landscape mode option and import the heightmap file. Then, in the location section on the X axis, you have to enter the formula 5000 * 100 / 1009, where 5000 is the height in metres and 1009 is the resolution I exported the PNG at, since Gaea only allows free users to export resolutions up to 1009. I applied this formula and imported the file. But the real problem came after this!!! The whole landscape was filled with uneven spikes and looked nothing like what I had created in Gaea!!! I was totally heartbroken and didn't know what to do. I tried several solutions, like re-importing the heightmap and exporting the landscape from Gaea again, among others. The next option for me was to ask people about a possible solution.

I headed over to LinkedIn and posted asking for possible solutions or advice, and some people reached out offering suggestions. But even after trying those out, they didn't quite work. Others responded saying they hadn't worked with Gaea before, so they were unable to help. Overall, I was in a really awful and helpless situation. I searched all over YouTube and the internet for possible solutions, but there was nothing there either. I was completely drained and didn't know what to do. So after a couple of thinking sessions, I thought I could create another landscape and adjust its height values. So I headed over to Gaea again and made a rough landscape. In the export options I adjusted the scale value from 2600 to 2000 and the height to 5000. I then used an auto-level feature on my Erosion node as well as a clamp feature, setting the clamp max value to 25%. I exported everything in PNG format at 1009 resolution. Then I headed over to Unreal Engine and imported the landscape using the value 5000*100/1009, and on the Z axis entered the value 5000*100*0.001953125. And GUESS WHAT!!!! IT WORKED!!!! The whole landscape was exactly like the one I had created in Gaea!!! I was so happy that after several days of attempts at a solution, I had finally found it myself!!!!!
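For my own future reference, the arithmetic behind those two numbers is worth writing down. Here's a quick sketch of the values, assuming a 1009 px heightmap and Unreal's convention that a Landscape Z scale of 100 corresponds to a 512 m height range for 16-bit heightmaps (which is where the 0.001953125, i.e. 1/512, comes from):

```python
# Quick sketch of the landscape scale values I typed into Unreal,
# assuming a 1009 px heightmap exported from Gaea and Unreal's rule
# that a Z scale of 100 covers a 512 m height range for 16-bit heightmaps.

size_m = 5000          # the value I used for the X/Y calculation
height_m = 5000        # total terrain height set in Gaea's export
resolution = 1009      # max export resolution for free users in Gaea

xy_scale = size_m * 100 / resolution   # metres -> centimetres, spread over the pixels
z_scale = height_m * 100 * (1 / 512)   # 1/512 = 0.001953125

print(xy_scale)  # ~495.5
print(z_scale)   # 976.5625
```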

The problem was that in the first landscape, I had created two separate node graphs: one for 3D assets and one for 2D assets like textures. The issue, I think, was that I was trying to export the 2D texture heightmap file and import it into Unreal, which made absolutely no sense, and I don't even know why I did it in the first place! So, I headed over to the first landscape, exported it the same way I did the second one, imported it into Unreal Engine, and that worked too!! So my mission to create a landscape in Gaea was a total success after several days of hard work, research and effort.

What pains me the most to say is that I had to discard the work I did in Gaea, as it did not match the scene. However, I'm happy to have learned something new and am eager to use Gaea in future projects, because creating worlds has never been easier!!!

Before diving into VFX and compositing work, I first needed to enhance the raw footage through color correction and grading. The original clip was overly saturated with uneven lighting, so I turned to Premiere Pro for adjustments. Using LUTs and Curves presets, I aimed to achieve a cinematic look reminiscent of films like Blade Runner or Arrival, whose color palettes I admired. Initially, I attempted to match my footage with a reference clip from Blade Runner using a comparison view, but the results fell short of expectations. Therefore, I opted to manually adjust the levels to create a distinct blue tint that better suited my vision.

After several rounds of trial and error, I achieved an output that closely resembled the atmospheric tones I sought. To preserve the highest quality throughout the post-production process and avoid compromising video clarity, I exported the sequence from Premiere Pro in .exr format. This decision ensured that I worked with uncompressed footage, maintaining the integrity and detail of the video for subsequent VFX and compositing tasks.

This meticulous approach to color correction not only refined the visual aesthetics of the footage but also set the stage for seamless integration of visual effects and compositing elements. By customizing the color grading to evoke the desired cinematic atmosphere, I laid a solid foundation for enhancing the overall narrative and visual impact of the project.

When it came to my compositing work, I opted for After Effects because I believed it would offer a more intuitive and efficient workflow for creating holographic pop-up displays, inspired by references like those seen in Avatar. To start, I researched tutorials online to understand the techniques required. My initial task was to arrange the footage and initiate motion tracking.

I began with After Effects’ built-in Tracker effect, but the camera’s uneven movements posed challenges for achieving smooth tracking. Despite multiple attempts and creating several tracking points, the results were not up to par. This led me to explore Boris FX Mocha AE tracker, which was already installed in my After Effects setup. I was pleasantly surprised by Boris’s robust tracking capabilities—they were precise, clean, and remarkably easy to use. Although the interface felt somewhat dated, the results it delivered were truly impressive.

Using Boris FX Mocha AE, I employed a circular shape masking tool preset to outline the white dome-shaped holographic device within my footage. This method ensured that the tracking data seamlessly followed the device’s motion, maintaining alignment throughout the scene. Once the tracking was complete, I exported the data back into After Effects, integrating it with the footage to achieve a cohesive visual effect.

This experience highlighted the importance of choosing the right tools for specific tasks in the post-production process. While After Effects provided a solid foundation, Boris FX Mocha AE proved indispensable for handling intricate tracking requirements with precision and efficiency. The seamless integration of tracking data enhanced the overall quality and realism of the holographic display, demonstrating the power of leveraging specialized tools in achieving professional-grade compositing results.

The next step in my project involved rotoscoping the character within the scene. Following Manos’s suggestion to enhance the focus on the character and their actions by separating them from the background and adding a slight blur, I turned to After Effects and utilized the Roto Brush tool. Initially, I attempted to use the auto-rotoscoping feature by outlining the character and selecting the Freeze button, but the results were disappointing—the roto was inaccurate and did not meet my expectations. Despite several attempts with different points in the video, the outcome remained unsatisfactory.

Recognizing the need for precision, I made the decision to manually rotoscope each frame of the character. However, I soon realized that simply isolating the character wouldn’t suffice; I also needed to rotoscope the table and the holographic display to ensure consistency in the depth of field effect. This meant painstakingly outlining each element frame by frame to achieve the desired separation and blur effect.

Although the process of rotoscoping was time-consuming, the final results were remarkably effective and visually pleasing. I intentionally left some edges unrefined, as the backgrounds lacked distinct color separation. Moreover, since I planned to use the same footage as the background, rather than inserting a new one, overly precise edges would have seemed unnatural and out of place.

Overall, while the rotoscoping process presented its challenges, particularly in achieving precise separations and maintaining visual coherence, the effort invested in manually detailing each frame paid off in achieving a polished and professional look for the scene. This experience underscored the importance of meticulous attention to detail in post-production work, ensuring that visual effects not only enhance but also seamlessly integrate with the narrative and aesthetic of the project.

The next step was to add the holographic figures of the trees. Since the 3D models did not work out well, I downloaded pictures of trees in PNG format. To give the trees some depth and a 3D sort of feel, I added an effect called Shatter. The Shatter effect can be used to extrude the image and fill the gap created by the extrusion either with the image's own texture or a built-in one. Since it was an image of a tree, I could not extrude much, because it would feel fake and weird. After adding the Shatter effect and creating an output somewhat resembling a 3D tree, I used the Color Key effect to key out the background, which was black. Then I added the Venetian Blinds effect to give the tree a holographic look and adjusted the transition completion to 20%, the direction to 90 degrees, the width to 18 and the feather to 7%. Then I added the Noise effect and adjusted the noise amount to 20%. The tree was still the same colour, so I added the Curves effect and adjusted the Red, Green and Blue curves to give it a blue-green sort of tint, like how a real-life hologram would normally look.
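Since I'll probably want to reuse this hologram look later, here's the effect stack written out as a small settings recipe. It's just a plain Python data structure for my notes (not an After Effects script), with the values I settled on above:

```python
# The hologram "look" as a note-to-self recipe, not an After Effects script.
# Effects are listed in the order they were applied; values are the ones above.
hologram_stack = [
    {"effect": "Shatter",         "note": "slight extrusion for a pseudo-3D feel"},
    {"effect": "Color Key",       "key_color": "black (the PNG's background)"},
    {"effect": "Venetian Blinds", "transition_completion": "20%",
     "direction": "90 degrees", "width": 18, "feather": "7%"},
    {"effect": "Noise",           "amount": "20%"},
    {"effect": "Curves",          "note": "adjust R/G/B curves toward a blue-green tint"},
]

for item in hologram_stack:
    print(item["effect"], {k: v for k, v in item.items() if k != "effect"})
```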

After the look of the tree was complete, I worked on the animations: how the tree would first emerge and how it would transition to the next one according to my hand movements in the footage. For this, I used masks to draw out a random shape and then animated it by making it pop up slowly from the hologram device, then moving the mask completely to the left to shift between the trees in line with the movement. To make it look smoother and more natural, I added Ease In and Ease Out using the F9 shortcut key. Since the masks did not fade away in time with the entry of the next tree, I added keyframes to the image's Opacity. This provided a much smoother, cleaner and more realistic-looking transition between trees, as well as a good emergence from the holographic device.

I repeated the same process for the next two trees as well. The next thing was for the trees to match the movement of the camera. I created a null object and set the tracker's motion target to it. Then I parented the trees to the null object so they matched the movement. Next, my task was to add a glow and a beam of light from the holographic device to illuminate the tree, as well as reflections on the table and the character. For this, I created a solid layer, added a Glow effect, adjusted its values, and added expressions to the layer's opacity. Then I created a mask using the Pen tool and drew a sort of inverted-pyramid shape to show the illumination of the beam light. I duplicated the layer, adjusted its opacity, and added other solid layers with some minor adjustments to portray the reflections of the light on the table and the character.

The main problem I thought the scene had was that the hologram looked really plain and empty with just the tree. So, my idea was to add some sort of 3D mountains onto the table to make it seem like they had been projected from the hologram device. I made the 3D mountains in Gaea, added them to the composition, and transferred the track data to them to keep them stable. I added a few effects like Noise, Curves, Venetian Blinds, etc. to make them feel more holographic. But after consulting a couple of my friends, they told me to add something on the table as well to make it look more realistic. So I added another video of some dials that I downloaded from YouTube. I added some effects to it and also added the Corner Pin effect. The reason I did this is that the table was not really rectangular and had some distortion owing to the angle at which the video was shot. So if I tried to scale the video up and mask it with the Pen tool (which I tried), the edges would be cut off and the perspective of the dials would not match the table. The solution I found was to add a Corner Pin effect and then adjust its corners to match the perspective. That worked in the end as well!!

The final task was to render the output from After Effects. I used its default Render Queue and set the export format to OpenEXR for a lossless export. The render did take quite a while because the file was a bit heavy.

In the later part of my video project, I utilized Nuke specifically for rotoscoping my fingers. This was essential because I wanted the fingers to appear as if they were seamlessly integrated into a holographic display, projecting from the table itself. Considering the complexity involved in accurately rotoscoping hands, especially achieving precise detail around fingers, I opted for Nuke’s Roto node over After Effects’ Roto Brush tool. While it took considerable time and effort to meticulously outline and animate the rotoscoping, the final result matched my vision perfectly. Once completed, I exported the rotoscoped footage as an alpha channel from Nuke and seamlessly integrated it into my project in After Effects, ensuring the holographic effect was both realistic and visually compelling. This workflow not only highlighted the strengths of each software tool but also underscored the importance of precision and attention to detail in achieving a cohesive and professional-grade visual presentation.
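For reference, the Nuke side of that hand-off was essentially a Read, Roto and Write chain. Here's a minimal sketch of how that graph could be built with Nuke's Python API, with placeholder file paths; in practice the roto shapes themselves were drawn and keyframed by hand in the viewer, not in a script:

```python
# Minimal sketch of the Read -> Roto -> Write chain used for the finger roto.
# File paths are placeholders; the roto shapes were drawn and animated by hand.
import nuke

read = nuke.nodes.Read(file="footage/plate.####.exr", first=1, last=240)

roto = nuke.nodes.Roto()        # shapes drawn and keyframed manually in the viewer
roto.setInput(0, read)

write = nuke.nodes.Write(file="renders/fingers_alpha.####.exr",
                         file_type="exr",
                         channels="alpha")  # export just the roto's alpha for After Effects
write.setInput(0, roto)
```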

I transitioned to Premiere Pro for the final stages of video editing, color correction, and sound mixing to bring the project to life. With the visual elements finalized, my focus shifted to refining the narrative and enhancing the cinematic experience.

Firstly, I imported all rendered footage into Premiere Pro, organizing the clips and sequences to ensure a cohesive flow. Sound design played a crucial role in immersing the audience in the dystopian world I had created. To achieve this, I sourced sound effects from platforms like Pixabay and YouTube, selecting audio clips that complemented the visual storytelling and enhanced the atmosphere of the scenes.

Color grading was another vital aspect of post-production. Using Premiere Pro's powerful Lumetri Color panel, I meticulously adjusted curves, hues, saturation, exposure, brightness, and other parameters to achieve a consistent and evocative visual tone throughout the video. I also added an effect called Noise HLS to add grain to the footage, which makes it more cinematic and realistic (Gonzalo always told us that grain adds depth to the composition).

Simultaneously, I conducted sound mixing to ensure that dialogue, music, and sound effects blended seamlessly and contributed to the overall immersive experience. By adjusting levels, EQ settings, and spatial effects, I aimed to create a dynamic audio landscape that mirrored the visual complexity of the footage. The final stages of video editing in Premiere Pro allowed me to fine-tune transitions, pacing, and visual effects, ensuring that each scene flowed smoothly and effectively.

In conclusion, this project represented a culmination of technical skill and creative vision across multiple stages of production, from 3D modeling and texturing in Maya and Substance Painter to environment building and rendering in Unreal Engine, and finally to video editing and sound design in Premiere Pro. Moving forward, I am eager to continue exploring the intersection of digital artistry and narrative storytelling, leveraging these tools and techniques to create compelling visual experiences that resonate with audiences.

Grained vs. ungrained comparison

Reference for the holographic display



I shot this scene in a room at LCC. I just edited a quick draft to check how the hologram would look and also how the whole scene would look. But I didn't quite like how the latter part of the scene looked, especially the render of the holographic video showcasing the forests. Also, after sharing the clip with Manos, he suggested that I add some sort of camera movement to the video. So now, I'm planning to shoot the video again for the 3rd time, and hopefully the last time! ;).

This is what I planned to do for my personal project for term 2. I had a lot of doubts about what to name the project but finally came up with the name TERRA, which in Latin means earth or land; in Roman mythology, Terra or Tellus is another name for the goddess of the Earth. The primary reason I chose this particular name was that the theme of my project is deforestation, and the project is sort of a message to end the cruelty of killing plants and trees, which may lead to a dull, dark and hostile future for the generations to come.

So, the story is really simple. The story is set in the future, in the year 3069, showcasing a dystopian city called HEL, which is Dutch for hell (implying the present condition of the city is similar to that of hell), and a journalist reporting that scientists are on the verge of reintroducing plants and trees, which went extinct a long time ago due to deforestation, back to the planet. As the journalist is broadcasting the news, he spots a young man and asks for his opinion on this phenomenon. The young man (named KRIS; no idea how I got that name, but to be honest I don't like it ;( ) shares his opinions on plants using a holographic heads-up display, or HUD, which appears when he drops his futuristic bracelet on the floor, and tells the journalist that he has been learning about plants and trees quite recently and finds them intriguing. He shows various trees that were once present on our planet and explains their uses, and just as he is explaining, Kris gets sucked into a portal to a green, lush place covered with trees. The journalist follows him, and they have a conversation about what they have missed by not having trees, and how important it is for people to save and care for trees for a better future.

This is basically the gist of the story, but I'm not quite happy with it, as I have a lot of concerns regarding the story and how to film it and make it into a believable product. My primary concern was the question of how the young man and the journalist get sucked into a portal that emerges from a HUD, because that makes no sense at all and it's illogical. My second concern was where to shoot it, as there are two locations portrayed in the story: a dystopian city that has no vegetation, trees or plants, and a lush green place filled with lots of trees. The second location wouldn't be a problem, as London has a lot of parks and a lot of trees, but for the first one, I had to find a place with absolutely no vegetation, plants or trees. It took me a lot of time, effort and research to find such a place. My primary choices were Leake Street and Toynbee Street, but upon further research I found a place called Brick Lane Park that has a cool graffiti wall which sort of looks futuristic. So I decided to proceed with it, and for the green place in the story, there was a park nearby, so I thought it would be perfect to shoot there to reduce the budget ;).

The shoot for the project was scheduled for 16th May. I got in contact with a guy from BA Film and Television, and he helped me shoot the whole thing. For the shoot, I rented a Blackmagic Pocket 4K Cinema Camera from our college, as well as a DJI Ronin gimbal and a RODE wireless mic for recording the voice. I planned to use the gimbal to keep the shots as smooth as possible, avoiding any shaking or jerking, but the cameraman did not want to use it as he was confident without it. So the gimbal was totally useless for the shoot. As far as the camera goes, I could only shoot a couple of trials on it, as the battery went dead within 30 minutes. So I had to shoot the whole thing on an iPhone 14, which amazingly provided great results. The guy set it to Cinematic mode at 24fps and 4K quality, and the output was really great.

As far as the actors were concerned, I took the initiative to act as Kris, just because I love acting and have wanted to be an actor from a very young age, so I wanted to seize every opportunity I could get (also 'cause I thought no one else could really suit the role ;( ), and I called up Minghan from my class to act as the journalist. We had to do a couple of trials before getting the knack of it. The primary problem we encountered was that the whole sequence is set up like a journalist interviewing a stranger, so we could not have any cuts in between. So we had to capture the whole sequence in one take, with all that dialogue. That was a bit difficult to achieve, considering that the dialogue was long and a lot of people kept passing in front of the camera. So a lot of trials had to be made, and a lot of retakes as well. But finally we both managed to finish our sequence in not too bad a manner and moved on to the park to film the green environment scene. That scene was filmed rather quickly, as by then we knew how to get into the scene after all the earlier trials and retakes. But the cameraman had to leave quickly, and maybe because he was getting late, he forgot to shoot the scene in 4K 24fps Cinematic mode and instead shot it as a 720p 14fps normal video. I realized this only when I reached home and looked at the footage while creating a rough cut of the video.

The next problem I encountered was the duration of the whole video. It was over 2 minutes 30 seconds, which is more than double our allotted duration of 1 minute. The duration was just one more problem, as I was deeply unhappy with the output: the whole thing looked unrealistic and fake, and it didn't look like a place from a futuristic dystopian city either. All in all, I was totally pissed off and sad because I didn't get the output I was hoping for. The next day, I had a meeting with Manos and he told me that I needed to cut down the duration by trimming some dialogue from the script, and maybe try to match the lighting of the intro scene I created in Unreal Engine. Overall, I decided I was going to scrap the idea and come up with a new one that would be easier to film and more convincing to the audience.

For a couple of days I was totally blank, with no ideas whatsoever. So I thought of giving myself some time to just relax, chill and ease the pressure. After this small break, I was rejuvenated and felt fresh (I think everybody needs some of this: leaving tensions and work behind for a couple of days to ease off and relax). Finally, I got a new story in my mind.

Maya

My first step was to create a city wall inspired by a scene from The Creator. In that scene, Los Angeles is separated from another city by a huge circular wall to restrict passage. I wanted to replicate that kind of separation in my project. So, I decided to create a similar wall in Maya.

To start, I carefully studied the design and structure of the wall from The Creator. This involved analyzing how the wall fit into the overall cityscape, its scale, and the materials used. Armed with this inspiration, I opened Maya and began modeling the wall. I focused on making it large and imposing, with enough detail to make it look realistic and convincing in a futuristic setting.

I began by creating a basic cylindrical shape to represent the wall’s circular form. From there, I added details like guard towers, entry points, and various other elements to give it an industrial look. I wanted it to feel both futuristic and functional, as if it had been standing for many years, weathered by time and conflict.
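As a rough illustration of that starting point, the initial blockout could be reproduced with a couple of lines of Maya's Python commands; the dimensions below are placeholder proportions rather than my exact values:

```python
# Blockout sketch for the circular city wall in Maya (placeholder dimensions).
import maya.cmds as cmds

# A wide, shallow cylinder reads as a ring-like perimeter wall once scaled up;
# guard towers, entry points, etc. were added afterwards as separate meshes.
wall_transform, wall_node = cmds.polyCylinder(name="cityWall_blockout",
                                              radius=500, height=60,
                                              subdivisionsX=64)

cmds.move(0, 30, 0, wall_transform)  # sit the wall on the ground plane
```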

After completing the initial model, I spent time refining the details. This included adding textures and materials to the wall to give it a sense of depth and realism. Then I exported the whole model as an FBX for texturing in Substance Painter.

Substance Painter

After building the city wall in Maya, my next step was to texture it using Substance Painter. I imported the wall model into Substance Painter as an FBX file, utilizing the Unreal Engine 4 Starter assets templates and enabling the auto unwrap feature to streamline the UV mapping process.

In Substance Painter, my focus was on crafting textures that would convey a sturdy and imposing presence. I meticulously designed realistic concrete and metal textures, paying close attention to details such as weathering, scratches, and imperfections. These elements were crucial to achieving an authentic look, suggesting years of exposure to the elements and perhaps even conflict within the fictional world I was creating.

To further enhance the realism and narrative of the wall, I integrated signage and text elements into the texture. These additions served to reinforce the wall’s function and significance within the urban landscape, adding layers of storytelling and visual interest. Whether they were warnings, propaganda, or informational signs, each element contributed to the immersive experience of the environment.

Once satisfied with the textures, I exported the completed textures from Substance Painter and imported them back into Maya. Integrating these textures with the model allowed me to see how they interacted with the geometry, ensuring everything aligned seamlessly. This iterative process of texturing and refining in Substance Painter and Maya enabled me to achieve the desired aesthetic and narrative impact for the city wall.

Substance Painter proved invaluable in its ability to not only texture but also enhance the realism and detail of my 3D model. The software’s intuitive tools and extensive material libraries empowered me to bring my vision to life effectively. Moving forward, this experience has strengthened my skills in texture creation and reinforced my passion for crafting immersive environments in digital art and design.

Unreal Engine 5

After completing the city wall in Maya and texturing it in Substance Painter, my next task was to import both the 3D model and textures into Unreal Engine and begin constructing a futuristic city environment. Guided by my script’s demands for a barren, desert-themed landscape devoid of vegetation, I set out to create a cityscape that exuded a sense of desolation and technological grandeur.

To populate this desert-themed environment, I downloaded assets such as rocks, cliffs, and other elements from Quixel, leveraging their high-quality textures and models to add realism to the scene. Additionally, I sourced 3D models of buildings and other structures from Sketchfab, ensuring a diverse and visually compelling cityscape.

Once imported into Unreal Engine, I integrated these elements together to form the foundation of my futuristic city. The centerpiece, the textured city wall, was a focal point around which I built the rest of the environment. Using Unreal Engine’s material editor, I created a custom material for the wall, applying the textures I had meticulously crafted in Substance Painter. This process involved adjusting material parameters to achieve the desired look of sturdy concrete and weathered metal, incorporating signage and text elements to enhance the wall’s narrative significance.

Lighting played a crucial role in setting the mood and atmosphere of the scene. Inspired by the iconic visuals of Blade Runner 2049, I configured the lighting to cast a yellowish tint over the environment, evoking a hazy, futuristic ambiance. To further elevate the visual fidelity, I utilized Unreal Engine’s post-process volume to apply effects such as color grading and bloom, enriching the depth and richness of the final render.

To animate and capture the essence of this environment, I created a level sequence and introduced a camera. Mimicking the characteristics of a DSLR camera, I carefully adjusted focal points and angles to highlight key areas of interest within the cityscape. Employing keyframes, I animated the camera to capture sweeping shots and intricate details from various perspectives, enhancing the cinematic quality and storytelling potential of the scene.

In order to maintain control over each shot and ensure a polished final result, I opted to render each scene individually without cuts within a single take. Leveraging the Movie Render Queue plugin from Unreal Engine 5, I selected the EXR format for its high dynamic range and lossless quality, essential for post-production adjustments. Additional features such as anti-aliasing and deferred output were employed to further enhance visual clarity and fidelity, resulting in clean, sharp renders that showcased the painstaking detail and craftsmanship invested in every aspect of the project.

This project not only allowed me to merge technical proficiency in 3D modeling, texturing, and lighting within Unreal Engine but also enabled me to explore and expand upon my creative vision for immersive digital environments. Moving forward, I am eager to continue honing my skills as a 3D environment artist, driven by a passion for crafting visually captivating and emotionally resonant worlds in digital art and design.

The first thing I worked on when this project was announced and preparations began was this scene in Unreal Engine. My primary reference was a scene from the movie The Creator. This whole scene honestly had a lot of time and effort put into it. I modelled the city wall in Maya and then textured it in Substance Painter. I watched a lot of tutorials to create a desert-like environment, sort of a dry, barren land with lots of ups and downs on the surface, making it really uneven. As far as the buildings were concerned, some of them were downloaded from Sketchfab while the others came from Quixel and the Marketplace, but I re-textured every one of them to make them blend into my scene. I put a lot of time and effort into the lighting, trying to achieve a futuristic, dystopian sort of theme; my primary reference for the lighting was Blade Runner 2049. Overall, I was quite happy, but somehow, somewhere, the whole scene lacked depth and realism, which left me quite dissatisfied with it. After a consultation with Emily, she told me she was quite impressed with this output and was happy I had put a lot of effort into it, and Manos told me he was happy with it too, but to try to make it as realistic as possible. So, back to the drawing board again, I researched a whole lot about photorealism and how to render realistic videos in Unreal Engine. But although I tried several different tutorials, none of them made a big difference.

After the script had changed, I changed the whole intro scene as well. So this scene didn’t make it to the final cut.

Discarded Draft

Credits: