HRDC Accredited Trainer | TVET | ACLP | AI Trainer | Helping Corporates & Businesses with Digital Transformation through AI Upskilling, Data Analytics, Process Automation & Content Marketing
ComfyUI with an advanced workflow that allows you to fine-tune video outputs down to the level of expression and head movement. Definitely a game changer for people who are into AI video editing, but not exactly a walk in the park unless you have spent quite some time familiarizing yourself with the ComfyUI interface itself.
Interestingly, in the original post's comments, industry players mention that, ethically, you should not use this technique in a film with The Screen Actors Guild - American Federation of Television and Radio Artists (SAG-AFTRA) actors "just because you can". It also means explicit consent has to be obtained from SAG-AFTRA to use AI in commercial work.
I'll put a link to a relevant article in the comments if anyone is interested in reading more about this.
#artificialintelligence #generativeAI #consent
I'm *obsessed* with #ComfyUI, and this implementation in #Nuke is fascinating. However, I feel compelled to remind users of this #animation technique about the recent #SAGAFTRA contract regarding the permissions required to manipulate a performance in this manner.
It's one thing to slightly adjust a performance to "massage" it better into a take, especially a #VFX shot, but this technique could be used to *completely alter a performance, including lip-sync.*
Yes, sometimes a performance is “found” or “made” in editing, but this is more than editing, this could resculpt an *intended performance* from scratch. Legally, that’s a “no-no.”
Specifically, regarding the use of #GenAI on union actors:
The latest SAG-AFTRA contract, ratified in late 2023, permits the use of #generativeAI to edit performances but with strict rules around consent and compensation.
Actors must provide informed and explicit consent before their performance can be digitally altered or replicated using AI. This applies to both "Employment-Based Digital Replicas," created during an actor's participation in a project, and "Independently Created Digital Replicas," which are generated from pre-existing materials.
For Employment-Based Digital Replicas, studios must give actors clear notice, typically at least 48 hours in advance, and obtain consent for both the initial creation and any subsequent use of the digital version in other projects. Actors are compensated based on the number of days they would have worked if not for the use of AI, and they are entitled to residuals just as they would be for a live performance.
In contrast, for Independently Created Digital Replicas, producers must negotiate compensation and secure explicit consent for use. These provisions remain in place even after the performer's death unless otherwise specified. (No "digital resurrection" without prior approval or representation.)
Such permissions are likely baked into new contracts by default to permit the use of GenAI, but you'll still have to inform the actor of any changes so they get compensated. It's likely many will flout these rules, misunderstand how to apply them properly, or offer some "flat-rate fee" for an umbrella of "default fixes," an approach that is likely to be abused.
This is going to be killer dinner convo with producers and post-production folks.
Install ComfyUI for Nuke: https://lnkd.in/gDy_Qn7G
A how-to video on installing ComfyUI for Nuke: https://lnkd.in/g_GmnhVc
Just finished my first quadruped animation! 😁
Simple walk cycle, but a super interesting exercise to explore the tiger's movement.
Rig by La Salle BCN
Animation in Maya
#maya #3danimation #tiger #walkcycle
Trinity Animation - Your Gateway to Technical Animation Excellence!
Elevate your technical concepts with our cutting-edge animation services. From engineering marvels and intricate processes to complex workflows and product walkthroughs, we bring your ideas to life in vivid, easy-to-understand detail.
Explore our technical animations now by clicking the link below:
#TechnicalAnimation #TrinityAnimation #Animation
Thanks to the effort of Francisco Contreras we can now drive ComfyUI animations from within Nuke. This opens up a whole new level of control and new workflows. Thanks so much for implementing my feature request! 🙏
If you haven't done it yet, check out ComfyUI for Nuke:
👉 https://lnkd.in/e--aSARS
#nuke #comfyui
2D Animator and Illustrator
Cute dog! I'm sure he or she is a great colleague and does brilliant animation :)