Given the buzz (see the viral video below) around Viggle AI recently, I decided to give it a try as a fun little weekend project.
If you haven’t heard of it yet, Viggle AI is essentially a video generation tool focused on 3D character animation. It can make characters move realistically (bending, jumping, dancing) in the kind of short clips you see on TikTok or Instagram. You can guide it with a text prompt, a picture of yourself, or even another video. It’s still in beta, so it’s a bit rough around the edges, but it’s free to try out!
Here’s a quick video overview of what Viggle AI can do:
Our Project – Making “ALTSpock” Dance
First, here’s the finished product, showing our wannabe Vulcan dancing…I’ll then go into a bit of detail about how it was created.
How the “Mix” Video Was Created
Here are the basic ingredients you need to create a mix:
- Image of the person you’d like to animate
- Video featuring the motions you’d like to mimic
- Background music or song to use
Currently, Viggle AI can only be used via Discord…if you’re comfortable with that, the rest of the process is a breeze. You’ll also find FAQs, the available motion templates, and plenty of examples and community posts here.
I created the original image in Midjourney with the following prompt, then used InsightFaceswap to make the character look a bit more like me…let’s call him “ALTSpock”.
For the video I simply used a free stock video site to find a couple of dances:
Once you have your image and video, you can use Viggle on Discord to create the new video, which currently takes 2-3 minutes to generate. I selected the green (screen) background, since it can be easily removed when putting the final video together.
For the audio, I used Udio to create a custom song. Also see this past article for more details:
Finally, it was simply a matter of putting it all together in your favourite video editor…I use Final Cut Pro, but pick whatever works best for you. And there you have it.
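As a side note, if you prefer a command-line route to a GUI editor, the green-screen removal and compositing can be sketched with ffmpeg. This is just a minimal sketch: it assumes ffmpeg is installed on your system, and the file names (background.mp4, viggle_green.mp4, song.mp3, final_video.mp4) are hypothetical placeholders for your own clips and track, not anything Viggle produces by name.

```python
import subprocess

# Key out the green background from the Viggle clip, centre it over a backdrop,
# and add the Udio track as the audio. All file names here are placeholders.
subprocess.run([
    "ffmpeg",
    "-i", "background.mp4",     # backdrop video
    "-i", "viggle_green.mp4",   # green-screen clip from Viggle
    "-i", "song.mp3",           # custom song
    "-filter_complex",
    # chromakey removes the green; overlay centres the keyed character on the backdrop
    "[1:v]chromakey=green:0.1:0.1[fg];"
    "[0:v][fg]overlay=(W-w)/2:(H-h)/2[out]",
    "-map", "[out]", "-map", "2:a",
    "-c:v", "libx264", "-c:a", "aac",
    "-shortest",                # stop when the shortest input ends
    "final_video.mp4",
], check=True)
```

Tweak the chromakey similarity/blend values (the two numbers after "green") if the keying looks too harsh or leaves green fringes.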
Animating Static Characters With Existing Templates
You can also just use a static image and have Viggle animate it using the various templates it has available…for example:
Here are a few such examples, created using the ALTSpock image and Viggle animation templates:
Here, the original character in the video template is replaced by the still picture you provide. You’ll notice the character renders in these aren’t exactly high-resolution, but then again, this is the worst this tech is going to get.
Wrap Up + Disclaimers
More controllable video generation is clearly much needed, and Viggle AI does a great job of showing what’s possible in the future. It’s powered by JST-1, which they claim is the first video-3D foundation model with actual physics understanding, starting from making any character move as you want (but clearly, they won’t be the last…and this tech will only get better).
I think this tech has tremendous potential for content creators of all skill levels. This little weekend project was both fun and educational, but that’s all it is…a test to see what the tech can do. Clearly we’ll need to work through all the ins and outs of how this tech gets moderated, refined and deployed, especially for content that is aimed at commercial use or wider distribution. In the meantime, please experiment responsibly!
Hope you enjoyed the demo.