Unreal Han
In this video, I share my experience with facial animation using different software and solutions, from ARKit to AI-powered tools. I break down the process into steps to help you understand the components of facial animation capture, so you can make an informed decision on which solution best fits your production needs.
——————–
Resources in the video:
Weta’s Avatar facial paper: https://www.fxguide.com/fxfeatured/exclusive-joe-letteri-discusses-weta-fxs-new-facial-pipeline-on-avatar-2/
Matrix Demo breakdown (must watch): https://www.youtube.com/watch?v=h_dJtk3BCyg&t=1155s
Faceware portal: https://www.youtube.com/watch?v=zG3m0AY3f_4
Ziva face on Siggraph 2022: https://youtu.be/MAXJWEoKbxY?t=2830
DI4D: https://di4d.com/
——————–
Twitter: https://twitter.com/HanYang_VFX
Insta: https://www.instagram.com/unreal_hany/
ArtStation: https://www.artstation.com/hanyang
00:00 Introduction
01:15 ARKit Face in depth
03:42 Use ARKit in Maya with MetaHuman
04:43 AI-driven facial
07:01 AI result review
10:06 What’s the future?
My 60-second ARKit workflow here: https://youtube.com/shorts/XCImJLD9FA4?feature=share
I'm also looking forward to that day, when the full power of ARKit or even an ordinary camera can be unleashed. The era of AIGC will come eventually!!
Amazing explanation and sharing of knowledge! Insta sub!
Does it work on a Mac M1 Max?
So awesome of you to share your knowledge. I really like this "sharing knowledge freely = everyone's ultimate success" philosophy. AI is going to help a lot, but it's not there yet for facial mocap. ARKit is so ho-hum, I'm not convinced it's worth all the effort of incorporating into my workflow just for some eye movements and blinks (I am using stylized in-house characters, so nothing is automatic). I was hoping you would give us your thoughts on NVIDIA's Audio2Face; is that a good option in terms of front-end labor vs. end result? I simply haven't had much success mapping the face (again, for my own characters; for MetaHumans I was pretty happy, at least I thought it was better than ARKit for emotion and speech). Thanks again for taking the time to help us out!
I have an Android, not an iPhone. But I'm working on getting an iPhone (not to have as a phone, just for facial motion capture). Could you please put together a tutorial at some point on facial motion capture in Unreal Engine 5?
Does an ARCore-supported Android phone work here?
Me crying in 3ds Max, RIP me. Honestly I might as well start some sort of R&D for facial tracking for 3ds Max, I need courage!!!!
For me, a big drawback of ARKit is that certain facial expressions aren't captured in the blendshapes at all. For example, you can hardly do an asymmetric brow-raise (🤨), a worried face (😟), or a sad face with a hanging lower lip. I assume this can be solved in software in the future, but so far I haven't seen much progress in ARKit facial tracking over the last few years, unfortunately.
I'd rather it didn't; it's bad enough without getting it to do that.
Does it run on an Apple M1 chip? I heard that it's complicated.
Thanks for sharing, truly the most valuable of valuable content 👍
What about combining and retargeting 3D-scanned per-frame facial animation with morph animation? Maybe it's not as convenient or flexible for Unreal, but it should give a better result.
Hi, thanks for this! Can I get your opinion on upcoming image generators? Specifically BlueWillow, since they are still in the beta testing phase? Are they too late?
The ghetto camera rig was like $4k lol!!
New sub here, looking for good content on Unreal for cinematic purposes.
Hello, Mr. Yang. I'm a student learning UE. I'd like to ask how you made the part of your lip sync where the finger smears across the lips. Could you briefly point me in the right direction? Thank you very much.
top
Very interesting. Thanks for sharing. Great video ❤😮😊
Nice breakdown of what's out there. Facegood looks like a very solid option for what I'm looking for.
Also, not sure if you have covered NVIDIA Audio2Face; it's another option, with varying results.
If you have $7,600 you can buy R3DS Wrap4D and Track, which would give you a 4D facial animation solution on your own hardware, and IMO that's the best facial animation solution, even if it's time-consuming and data-intensive. I'm trying to see if I can use ARKit depth data for that, but I'd need the software to support it.
We really need an AI that analyzes around 10 minutes of video of someone's face and can use that data to make a near-perfect copy of all of that person's emotions in MetaHumans.
If anyone is looking for lip-sync animation for a virtual human's speech, here are some alternative AI animation solutions:
Speech Graphics is amazing
OVR is free
MetaHuman SDK Plugin is free
Thank you SO SO SO much for this research. I know it was a mountain. I will definitely stick with ARKit and additional post animation where needed.
4:17 Super creepy 😎 Great effect to show a robot freaking out and starting to lose control.
11:09 Respect.
What an awesome video!
this just got a huge update
Man... MetaHuman Animator is coming... it uses the iPhone depth sensor. Wow, that was fast.
We are a 2D-to-3D algorithm team from China. We provide high-precision motion capture algorithms; if you're interested, take a look at our products.
Dude, AI is useless. It's guessing + using more power. And people growling isn't a sellable product in entertainment. Entertainment is a break from that.
Now Unreal 5.2 is out. What do you think about MetaHuman Animator?
Is there any Android alternative to ARKit?
Glad I discovered this channel
I think having the Unreal Engine options work entirely on an iOS or Android device is what makes them truly accessible.
Excellent, comprehensive, brilliant, thank you
this is great info.. thanks for sharing!
Hey, have you seen Unreal Engine's 5.2 Realtime Facial Tracking Animation Demo? Are you using the same tech/approach in the video?
Really great content. Thanks for sharing.
Do those MetaHumans' skins naturally deform based on collisions like a real human's? I'm really wondering if this is happening.
What is your opinion of the new Faceware Portal solution?
Great video!
Link to Facegood?
And now MetaHuman Animator is dropping soon. Exciting times!
Hi, good stuff. Which program do you use to do the machine learning from tracking data to retargeting data? And how did you put it back in for automatic retargeting?
What software or tool do you use to track AI-generated facial capture?
What do you think about the new animation in UE 5.2? They have greatly improved it, please record a video!
That video of the robot girl and the girl kissing looks inappropriate honestly
I feel like with this AI matching system, the animations look a lot more emotional and believable!
Please give an alternative, we don't have an iPhone.