One key step in achieving top-notch #VFX interactions with live-action photography is the process of "de-lighting." This involves removing specular reflections so that assets respond appropriately to digital (re)lighting. In CG passes, a #normals pass provides crucial light-direction information. Traditionally, achieving this level of control meant labor-intensive digital replication of physical sets. This new process streamlines the workflow, eliminating much of that modeling, shading, and lighting work. When working with live-action assets that don't match #CG lighting, this #opensource #software can help: it generates normals and #delighting passes for live-action images, allowing seamless integration and control, similar to (and alongside) CG assets. By leveraging this technique, VFX professionals can gain efficiency and flexibility in their projects, with a new way to manage lighting across both live-action and CG environments. #StableDelight on #GitHub: https://lnkd.in/ggDDjJAz #StableNormal on #GitHub: https://lnkd.in/gQiEMC7d _ #VFX #opensource #CG #photogrammetry #lighting #delighting #VFXindustry #ves #ilmvfx #dneg #RodeoFX #mpc #scanlinevfx #TheMill #wpp #amazonstudios #netflix
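To make the idea concrete, here is a minimal sketch of why a normals pass enables relighting: once you have a de-lit albedo and per-pixel normals (the kinds of passes tools like StableDelight and StableNormal produce), you can shade the plate with a simple Lambertian model. All array names, shapes, and the lighting model here are illustrative assumptions, not the tools' actual API.

```python
import numpy as np

# Hypothetical relighting step: given a de-lit albedo pass and a normals
# pass, shade with a basic Lambertian (N . L) model. Real pipelines add
# specular, shadows, and indirect light on top of this.

def relight(albedo, normals, light_dir, light_color=(1.0, 1.0, 1.0)):
    """albedo: HxWx3 in [0, 1]; normals: HxWx3 unit vectors;
    light_dir: direction pointing TOWARD the light."""
    l = np.asarray(light_dir, dtype=np.float64)
    l = l / np.linalg.norm(l)                        # normalize light direction
    ndotl = np.clip(np.einsum("hwc,c->hw", normals, l), 0.0, None)
    return albedo * ndotl[..., None] * np.asarray(light_color)

# Toy 1x1 "image": a surface facing straight at the light
# returns its full albedo.
albedo = np.full((1, 1, 3), 0.5)
normals = np.zeros((1, 1, 3))
normals[..., 2] = 1.0                                # facing +Z
out = relight(albedo, normals, light_dir=(0, 0, 1))
```

Swapping `light_dir` per frame is all it takes to move the light over the de-lit plate, which is exactly the control the post describes.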
Alejandro Franceschi’s Post
More Relevant Posts
-
I was going to write something about NukeDepthCrafter, but sadly the Tencent Research license is for non-commercial use, which makes the exercise pointless. (https://lnkd.in/eR9K9A9W) However, StableDelight (which doesn't quite remove all lighting information, but at least separates the diffuse from the specular) and StableNormal (which infers normals from any image) both use the friendlier Apache license. Being able to relight anything is the key to natural integration of images from different sources, and the recipe is always: Depth + Normals + Albedo + Specular + Lighting = Final image (it's more complex than that, but this is a really good start). All of these are tools that are greatly improved by AI approaches. So where do we find a SOTA depth-map algorithm with a commercial-use license? Anyone? Anyone? Alejandro?
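The recombination step of that recipe can be sketched in a few lines. This is a deliberately simplified model with made-up values: the diffuse and specular contributions are assumed to have been separated already, and real comps add shadowing, GI, Fresnel, and more.

```python
import numpy as np

# Simplified version of the post's recipe for one pixel:
#   albedo * diffuse_lighting + specular  ~=  final image
# The split into these passes is assumed; values are illustrative.

def recombine(albedo, diffuse_light, specular):
    """Recombine separated passes into a final RGB value, clamped to [0, 1]."""
    return np.clip(albedo * diffuse_light + specular, 0.0, 1.0)

albedo = np.array([0.4, 0.4, 0.4])          # de-lit surface color
diffuse_light = np.array([1.0, 0.9, 0.8])   # warm key light
specular = np.array([0.2, 0.2, 0.2])        # isolated specular pass
final = recombine(albedo, diffuse_light, specular)
```

Because the passes are separated, any one of them (the lighting, the specular) can be replaced per source before recombining, which is what makes mixed-source integration tractable.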
-
Camera tracking is essential for digital compositing. We extract the movement of the real camera and apply it to the CG element, which lets us blend real footage and CG elements seamlessly. There are several camera-tracking techniques; for this example I tried 2D tracking first, but it didn't work, so I used planar camera tracking and voilà! Color grading also plays a vital role in the digital compositing pipeline. Here I applied a couple of color corrections to blend the raw image with the real environment. If anyone is curious which software I used for this, it's Nuke. #Cameratracking #CG #CGI #Digitalcompositing #3D #vfx #Nuke
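For readers unfamiliar with planar tracking: the tracker's output is essentially a per-frame 3x3 homography that maps a reference plane into each new frame, which a corner-pin then uses to stick the CG element to that plane. The sketch below applies a homography to the four corners of a reference region; the matrix values are made up for the example.

```python
# Illustrative planar-track output: a 3x3 homography H mapping reference-
# plane points into the current frame. A corner-pin node uses the four
# mapped corners to warp the CG element onto the plane.

def apply_homography(H, pt):
    """Map a 2-D point through a 3x3 homography (projective divide by w)."""
    x, y = pt
    w = H[2][0] * x + H[2][1] * y + H[2][2]
    return ((H[0][0] * x + H[0][1] * y + H[0][2]) / w,
            (H[1][0] * x + H[1][1] * y + H[1][2]) / w)

# A pure-translation homography: the plane moved 10 px right, 5 px down.
H = [[1, 0, 10],
     [0, 1, 5],
     [0, 0, 1]]

corners = [(0, 0), (100, 0), (100, 100), (0, 100)]
pinned = [apply_homography(H, c) for c in corners]
```

In practice the tracker estimates a full projective H (perspective included), but the corner-pin mechanics are the same.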
-
Another example of how compositing can have a dramatic impact on the final render. Even though a lot of things can be adjusted directly in the 3D render, nothing compares to the speed of changing things in comp. The reason the original render is over-exposed is simply that it's easier to darken bright values than to brighten dark ones (especially really dark values, which you have to gain up aggressively). The entire comp process (and all of the 3D work in Houdini) is covered in this course: https://lnkd.in/dkynaj3K
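The "render bright, darken in comp" logic can be shown with toy numbers: gaining up a dark, noisy value multiplies the noise along with the signal, while darkening a bright render scales the noise down. The values below are made up purely to illustrate the asymmetry.

```python
# Toy illustration of why over-exposing the render is the safer direction:
# a multiplicative grade scales noise by the same factor as the signal.

def grade(value, noise, factor):
    """Apply a multiplicative gain; returns (signal, noise) after grading."""
    return value * factor, noise * factor

# Dark render gained up 8x to hit the target level: noise grows 8x too.
dark_signal, dark_noise = grade(0.02, 0.005, 8.0)

# Bright render darkened to the same target level: noise shrinks instead.
bright_signal, bright_noise = grade(0.64, 0.005, 0.25)
```

Both paths land on the same signal level, but the gained-up dark render carries far more amplified noise (and banding, in integer formats), which is exactly why the original render is left over-exposed.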
-
Hi, LinkedIn! I am excited to share my final project from the VFX Master's I have been studying at Lightbox Academy this year. My idea was to recreate a scene from the first season of The Rings of Power. I had a month to do it, and I can finally say it is finished!! The first thing I did was plan the workflow in Notion, organising the tasks and the time. During the first week I modelled the scene, the terrain and the rocks, and downloaded assets from Megascans, like the planks, plus the alien with two animations from Mixamo. Then I started the RBD simulation, and there came my first challenge: naming every piece differently to make the simulation work properly. Next came the Pyro. For that, I made a Vellum simulation with instanced lines, and then worked on the fire and the smoke separately. The last task was the POP simulation: sparks, ashes, and fireballs to create a realistic atmosphere. While working on it, I thought it could be a great idea to develop some procedural lightning for the scene and add a shockwave to the final shot. So, while some simulations were caching, I started a parallel file to work on these elements. Once everything was done, I had to render with Mantra. However, the time per frame was too long, and I probably wasn't going to get it all rendered by the delivery date, so I had to think of another option. Talking with a friend, he advised me to render with VRay in Maya, so with his help I started importing each element into the software. For that reason, I have to thank Joel Alonso Marcos and Jorge Cerrada Sánchez for helping me through the process of rendering with VRay. And this is the final result! I am aware there are still things to improve, and I will work on them over the coming months. Also, I am going to post the breakdown next week!!! #VFX #VFXArtist #Houdini #Maya #VRay #3D #Ringsofthepower #digitalart #art
-
Hi everyone! You might have seen a breakdown of this shot last week, but I wanted to showcase the final result on its own here on LinkedIn. This shot was an exciting challenge in crafting an immersive war scene. Capturing the chaotic, gritty essence of a battlefield was no small feat. The goal was to create a busy yet coherent sequence, where every explosion, missile, and burst of fire felt impactful and added to the rhythm of the scene. From the dynamic interplay of lights and lens effects to the atmospheric smoke and dust, balancing all these elements while keeping the shot visually clear was a rewarding challenge. One of the toughest aspects was ensuring seamless integration: fine-tuning shadows, reflections, and masking to make every detail sit naturally within the environment. It was meticulous work, but it made all the difference. This project pushed me to refine my VFX compositing skills, experiment with new tools, and pick up techniques that helped me improve my craft. I'd love to hear what you think! What grabs your attention the most in shots like this? Any feedback is always welcome! Quick reminder that the CG spaceships and the original plate were provided as part of REBELWAY's advanced compositing course. #VFX #VisualEffects #CGI #3D #compositing #WarScene #CGIntegration #ColorGrading #nuke #thefoundry
-
🚀 New Tutorial Alert! 🚀 Take your VFX shading & lighting skills to the next level with my latest tutorial on Gaffer and Arnold! Perfect for beginners and intermediate artists, this guide dives deep into essential techniques for pro-level 3D renders—all with free, open-source tools. 🌟🎬 Check it out! 🔗 https://lnkd.in/g7zK3qm6 #VFX #3Drendering #Gaffer #Arnold #tutorial
-
Match 3 Rocket VFX. Hey guys! In this project, I focused on efficient techniques. For textures, I created clean, precise ones in Substance Designer. The smoke is hand-drawn, consisting of 4 frames that play at random. I built a shader that applies gradient textures for coloring and uses a tiled noise texture to add variety and smoothly dissolve the smoke at the end. For the rest, I followed a similar approach: coloring via gradient textures and tweaking brightness and saturation by multiplying the "Red" and "Alpha" channels. I mainly used additive and premultiplied blend modes; they work great for most tasks I face. A bit more detail on my ArtStation: https://lnkd.in/eM4ZN7Vv #Match3 #Match3Effects #CasualGame #Rocket #Explosion #GameDev #VFX
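The two tricks described (gradient-texture coloring and noise-driven dissolve) can be sketched in pseudo-shader form. This is a hypothetical CPU-side illustration with a made-up two-entry LUT, not the actual shader code from the project.

```python
# Sketch of two common stylized-VFX tricks:
# (1) color a grayscale smoke frame through a gradient texture (a 1-D LUT),
# (2) dissolve it by testing a tiled noise value against a rising threshold.

def gradient_map(intensity, lut):
    """Linearly interpolate a color from a gradient LUT, intensity in [0, 1]."""
    idx = min(int(intensity * (len(lut) - 1)), len(lut) - 2)
    t = intensity * (len(lut) - 1) - idx
    a, b = lut[idx], lut[idx + 1]
    return tuple(a[c] + (b[c] - a[c]) * t for c in range(3))

def dissolve(alpha, noise_value, threshold):
    """Hard dissolve: the pixel survives only while its noise beats the threshold."""
    return alpha if noise_value > threshold else 0.0

# Dark-to-orange gradient for the rocket smoke.
lut = [(0.0, 0.0, 0.0), (1.0, 0.5, 0.1)]
color = gradient_map(0.5, lut)                          # halfway up the ramp
alpha = dissolve(1.0, noise_value=0.3, threshold=0.6)   # already dissolved
```

Animating `threshold` from 0 to 1 over the particle's lifetime gives the smooth "eaten away" dissolve; swapping the LUT recolors the whole effect without touching the textures.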
-
Boom! Witness the power of EmberGen in action! Excited to share our latest explosion simulation created with #EmberGen! This real-time volumetric simulation software is a game-changer for visual effects artists, bringing stunningly realistic explosions to life in a matter of minutes. We just prepared a slightly different setup using a scene we created for the Primal Predator VFX Masterclass: https://lnkd.in/d7iBHb9K Follow our YouTube channel: www.youtube.com/@NyjoFX What makes EmberGen amazing? • Real-Time Simulations: See your changes instantly, without waiting for renders. • Intuitive Interface: User-friendly tools make it easy to create complex effects. • High-Quality Visuals: Achieve professional-grade results with stunning detail and realism. • Flexible and Fast: Perfect for both quick previews and final production-quality outputs. Why you should try it: Whether you're a seasoned VFX artist or just starting out, EmberGen empowers you to create jaw-dropping simulations with ease. It's time to bring your creative visions to life! Join the VFX revolution: #VFX #EmberGen #ExplosionSimulation #3DGraphics #VisualEffects #CGI #Animation #TechTutorial #CreativeTools #RealTimeVFX
Co-Founder at Carbon Capture Shield Inc.
2mo
Does it simply generate a de-lit pass of the plate, which is then matted into the shadow area? This looks interesting.