Hey, folks! Jim here – you might know me from my monthly book Spotlight column, my reporting on Pokémon GO, and my occasional reviews of other things. This time, though, I got offered a great opportunity to interview Hyong Seok Oh, the Head of Growth for PLASK, coinciding with their product launch and booth at this year’s CES.
But wait, I hear you say! What exactly is PLASK?
PLASK offers a different way of providing motion capture for people who animate figures in 3D. To offer a little context: traditional motion capture generally involves multiple steps, from dressing an actor (usually Andy Serkis) in a rig and having them perform movement, to applying that movement to a virtual rig, to linking that rig to a 3D model that provides textures, color, and so on. All of this fuses together to give lifelike movement to a virtual character like Shrek, Jar Jar Binks or Sméagol.
PLASK’s goal is to make this process easier, much cheaper, and more accessible at the individual level, from indie content creators up through game studios. I spent roughly half an hour talking with Hyong to find out more and bring you guys the inside scoop! Read on to see what we spoke about and how you can enjoy this software for free!
Note: Interview edited just slightly for the sake of succinctness.
- Hyong Seok Oh: I’m Hyong Seok Oh, Hyong for short. I lead the growth team at PLASK.
Jim Newman: Very cool!
- Hyong: PLASK incorporated two years ago and we’re finally launching our product in two days, actually. (At CES – Jim)
Jim: Exciting.
- Hyong: PLASK is a South Korean software company. We make motion capture software where people can edit and record motions and turn them into animation all in one browser on the PLASK platform. And so I’m very excited to tell you about that product!
Hyong passed along the link to a video of some animation done in the software. It shows people dancing alongside a virtual model whose motion mirrors their own.
- Hyong: So, you know what motion capture is, right?
Jim: Yeah, so, if I’m understanding you correctly, in terms of what your software does differently, you don’t require the more traditional motion capture rigging, the gloves, the facial rigs and stuff like that. It’s AI-based, it watches video and interprets motion and expression from there – is that a fair assessment?
- Hyong: Yeah, right. Like rigging, the motion capture suit, putting all those dots on actors’ faces. None of that.
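(A quick aside from me, Jim: for readers curious what “AI that watches video” actually looks like under the hood, here’s a minimal sketch of markerless pose estimation using Google’s open-source MediaPipe library. To be clear, this is my own illustration of the general technique, not PLASK’s actual code, and the input filename is made up.)

```python
# Minimal markerless pose estimation sketch using MediaPipe (not PLASK's code).
import cv2
import mediapipe as mp

mp_pose = mp.solutions.pose

cap = cv2.VideoCapture("dance_clip.mp4")  # hypothetical input video
with mp_pose.Pose() as pose:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB; OpenCV reads frames as BGR
        results = pose.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.pose_landmarks:
            # 33 body landmarks per frame, each with normalized x/y/z --
            # per-frame joint positions like these are what get retargeted
            # onto a virtual rig, no suit or face dots required.
            nose = results.pose_landmarks.landmark[mp_pose.PoseLandmark.NOSE]
            print(f"nose: ({nose.x:.2f}, {nose.y:.2f}, {nose.z:.2f})")
cap.release()
```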
Jim: Yeah, that’s actually really, really cool. Especially for people who don’t have… y’know. All that. Because obviously, those professional tools cost quite a bit of money as well. So for people that are new and just getting into that 3D animation space, I feel like this could be a really useful tool for them. Is it just for your indie-level creators or are there applications for your larger production houses to use this tech?
- Hyong: So, for now, you’re exactly right. We do have to compete against these larger, traditional motion capture tools. Like the suit and other rigging products…
Jim: Like Andy Serkis wears?
- Hyong: Right, but you need at least, like, $50,000-plus to actually have those tools, plus animators who are trained to use them. Therefore, you’re exactly right. A lot of these small and medium businesses have a hard time getting past that entry barrier.
Jim: Mmhmm.
- Hyong: And they’re the ones who need the power of motion capture most, to promote their product and make sophisticated content, but that entry barrier is really high. Typically, it takes about a whole workday for one animator to make about one second of full-blown animation with motion capture.
Jim: Oh my gosh!
- Hyong: But tools like this can help small businesses do the same thing. It’s AI-based, so it’s not as accurate as a motion capture suit, but it gets very close in accuracy, and the price is a couple thousand times cheaper!
Jim: Right! I’m sure! Absolutely! So in terms of the features PLASK offers – you have the AI motion capture side of it, but what about things like character design? Because normally you have to create the character model in Maya or Blender and import it into your motion capture program. So, is someone going to be able to pick up PLASK and hit the ground running, as it were, or will they still need support from other programs?
- Hyong: PLASK is designed very similar to Maya and Blender. It has functions that are as sophisticated as those programs but it’s relatively easy to learn. It is designed to be used in the professional animation space, but someone who’s used to designing things in Photoshop or Blender will have a very easy time using it.
Jim: Okay, so, you can have a little knowledge going into it and pick up the intricacies of the program as you go?
- Hyong: Exactly.
Jim: That’s perfect – it sounds like a really useful tool for people. Especially nowadays, you see a lot of, like, the VTubers. I don’t know if you’re familiar with those? The 3D models that a lot of YouTubers are using nowadays. So I think there’s a really big growth industry right there that this will be perfect for! If you’re familiar with what goes on there.
- Hyong: Yeah! I mean, that whole sector will eventually lead into the Metaverse.
Jim: Yes! Exactly!
- Hyong: People will, eventually, adopt… how can I say it? Their physical avatar? They’ll convert, digitize themselves eventually. Simply showing a digitized skin, a digitized version of their appearance, is one thing. Converting that into movement, whether it’s the face or the entire body, that’s a different story. I think that’s like a bridge from the real world to the Metaverse, right? So there’s this iffy area right in the middle, you know?
Jim: Gotcha. So you would say this system was designed with social media in mind? Like as things become more and more connected online, y’know, providing new ways for people to express themselves through their socials?
- Hyong: For sure. Enabling companies to have better access to motion capture, that’s the first part. The second part is enabling anyone – gamers, YouTubers – to digitize their motion. And another key component is to monetize their motion. So, like, with Web 3.0, people will eventually migrate over to decentralized ownership of things. Pretty much everything. And movement, an avatar, will be an integral part of that. And we’re planning to increase the volume of digital movement with a lot of assets that are available. There are a lot of skins, but there are not a lot of movements attached to them, to give them exponential value.
Jim: Sure.
- Hyong: Yeah, so we want to make motion capture available to anyone, and to monetize it, creating a basis for each individual character and their motions to be injected into the Metaverse.
Jim: Right, yeah. That’s a really cool concept, I like that a lot. In terms of your company – let’s circle back to that. What projects have you guys done that our followers might be familiar with already?
- Hyong: This is our first product ever.
Jim: Oh! So it’s your first!? Wow – that’s so exciting!
- Hyong: Oh yeah, so, our founders aren’t even out of college! They’re still in college and they just made this straight out of their basement!
Jim: Wow! Hey, there’s a lot of great stuff coming out of colleges and universities – people just putting their heads together and getting it done! And I really like that, actually – especially because when you’re in that kind of learning environment, you see a lot of needs and challenges, and it’s a really good space to be in to address those. That’s what those institutions are for!
- Hyong: Right.
Jim: Yeah, that’s really awesome. So, I really like the idea of the product – you said it’s launching in two days?
- Hyong: Yeah, January 5th at CES. Our co-founders are actually there to release it there for the first time. And we’re going to make this product available free for about three months.
Jim: Wow!
- Hyong: It’s sort of a beta launch. And within those three months, we’re going to give it to any indie developers or YouTubers who want it, so we can get their feedback and calibrate it to what they want within that time frame, and then release it into the market.
Jim: Oh, that’s cool – so your initial launch is sort of a trial phase. People can get it, try it out, make you aware of potential issues or successes! That’s really fantastic.
- Hyong: Right! I mean, even before launch we’ve been getting a lot of inquiries from people wondering how to get access to this product – and most of them were from indie game developers and, like, freelance animators. But eventually we’re going to target the triple-A game studios.
Jim: Of course!
- Hyong: And animation studios, because their level of expectation with motion capture is really different, right?
Jim: Mmhmm.
- Hyong: So for now, we’re thinking role-playing games. Like idle actions and attack motions, made easier.
Jim: So like Raid: Shadow Legends, Marvel Strike Force – that kind of stuff?
- Hyong: Yeah, exactly. And have gamers make their own movements and import them into the game – allowing gamers to do that, but also letting companies enable and monetize it. So it’s a two-way exchange of money and motion, I guess.
Jim: Yeah, that’s fantastic – especially with video games being the largest sector of the entertainment industry nowadays, that only makes sense. Everyone’s looking for that next level of personalization in their gaming experience as well. You see that with games that allow for character creation and customization, and it’s a really cool idea to extend that all the way down to the way a person moves and interacts with the world on an individual level. I think that’s really cool.
- Hyong: Yeah, for sure. I mean, I’m a gamer myself and I play a lot of… well. I used to play a lot of MMORPGs.
Jim: Yeah, you’re probably too busy right now for that kind of stuff, it sounds like!
- Hyong: Right! For sure! But as a gamer, I fully understand when people talk about the Metaverse nowadays and compare that with games like World of Warcraft. Like, what’s the fundamental difference?
Jim: Right.
- Hyong: Normal people would have trouble controlling their character and talking on Discord with other people. How is that any different than the Metaverse? There’s not a lot of difference.
Jim: Yeah, right, managing multiple tasks at once…
- Hyong: Exactly. But when people are free to express themselves in an environment, to do the things that only they can do – whether it’s movement or voice or whatever can be expressed, and also monetized – I think that will… it sounds really simple! But I think that will fundamentally change the nature of games, and slowly transition the concept of games into the Metaverse. Because it deals with the concept of You and everything about You.
Jim: I agree – it sounds like a fantastic idea. So, you mentioned that people were reaching out, interested in getting the software. Do you want to give us the information on that so the folks reading can get in on the fun?
- Hyong: Yeah, sure! Our website is PLASK.ai – you can go there and, in two days (January 5th), you’ll be able to create an account and access it right away. We’re going to give away the Professional version, with virtually no limits on editing and processing time. The tool will be free by default from now on, but the processing of motion takes a lot of GPU, so we will charge for that eventually – just not for the first three months.
Jim: One other question that I did have, what kind of hardware specs would you be looking at to be able to run it at an optimal level?
- Hyong: You don’t really need a dedicated GPU – you can just use your integrated graphics.
Jim: So your standard AMD or Intel out-of-the-box dealie will be fine?
- Hyong: Exactly. Just a standard computer or MacBook, any browser. It’s really low system requirements. All the processing is done on the cloud, so if you can open a website, you’re good.
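(Another aside from me: “all the processing is done on the cloud” implies a thin-client pattern – the user’s machine only uploads video and downloads lightweight animation data, while the GPU-heavy work happens server-side. Here’s a rough sketch of what that client flow could look like; the endpoint and field names are invented for illustration and are not PLASK’s real API.)

```python
# Hypothetical thin-client flow: upload video, let the cloud GPU do the work,
# then fetch lightweight keyframe data. Endpoint and field names are invented.
import time
import requests

BASE = "https://api.example-mocap.test"  # made-up service URL

with open("dance_clip.mp4", "rb") as f:
    job = requests.post(f"{BASE}/jobs", files={"video": f}).json()

# Poll until the server-side pose extraction finishes
while True:
    status = requests.get(f"{BASE}/jobs/{job['id']}").json()
    if status["state"] == "done":
        break
    time.sleep(2)

# The client only ever handles small JSON keyframes, which is why
# integrated graphics on a standard laptop are enough.
animation = requests.get(f"{BASE}/jobs/{job['id']}/animation").json()
print(len(animation["keyframes"]), "keyframes received")
```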
Jim: That’s perfect! Because, yeah, I think if people didn’t know that it might concern some who would assume you need a fully built gaming tower or something like that.
- Hyong: Right.
Jim: So, is there anything else you think our readers might want to know about?
- Hyong: Yeah. So, this is more on the animator, professional side, but a lot of game studios have decentralized teams, especially with Covid. They’ll have their animation studio in Dublin and HQ in, say… Vancouver. And they have to review, create, and collaborate on motion capture and animation very intensively, but in the current environment, I don’t know how they do it. Whether they have their internal cloud structure or whatever, transferring Maya data and Blender data and FBX/glTF data non-stop just to get one review is a very cumbersome system.
Jim: Oh yeah – those files are very dense.
- Hyong: Right, so, what we’re planning to do is – I’m not sure if you’re familiar with a software called Figma? It’s a collaborative design tool?
Jim: No, I’m not.
- Hyong: Ok, so, it’s like Photoshop but on the internet, on the cloud.
Jim: Ok, so, cloud based design software?
- Hyong: Right. So, it has a really good collaboration feature. You know on Google Docs, you can see your colleagues clicking on something, working on something, commenting on something. We’re planning to do that with PLASK, but with each frame of the animation.
Jim: Wow! That sounds really ambitious.
- Hyong: Yeah, so the producer, director, designers – they can all see it together, and PMs can approve it, comment on it, even adjust it frame by frame, so people don’t have to send these files ad nauseam and make the process really cumbersome.
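(One more aside from me: the frame-anchored review Hyong describes boils down to pinning comments to specific frames of an animation clip rather than emailing whole files around. Here’s a toy sketch of what such a data model might look like – my own invention for illustration, not PLASK’s actual design.)

```python
# Toy model of frame-anchored review comments -- hypothetical, not PLASK's design.
from dataclasses import dataclass, field

@dataclass
class FrameComment:
    frame: int          # which frame of the clip the note is pinned to
    author: str
    text: str
    resolved: bool = False

@dataclass
class AnimationClip:
    name: str
    frame_count: int
    comments: list[FrameComment] = field(default_factory=list)

    def add_comment(self, frame: int, author: str, text: str) -> None:
        if not 0 <= frame < self.frame_count:
            raise ValueError("comment must be pinned to a valid frame")
        self.comments.append(FrameComment(frame, author, text))

# Example: a director flags a pose on frame 42 for the animator to revisit
clip = AnimationClip("idle_loop", frame_count=120)
clip.add_comment(42, "director", "Left arm clips through the torso here.")
```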
Jim: That’s awesome.
- Hyong: Yeah. So that’s one thing. Another thing is, I’m not sure if you’re familiar with pre-visualization?
Jim: No! No, I have an interest in this stuff but am at an amateur level so I’m definitely getting an education today!
- Hyong: So, before people make movies or games, they have to pitch an idea or a concept. Pitch a scene. In those scenes, they’ll use sketches or really low-resolution 3D characters. They don’t even move – people move them around like toys, you know? Without any rendering, just gray figures. With PLASK you can do pretty much the same thing just by acting out the scene in front of a MacBook. So it’s not only the motion capture – the work side, the production side, and the output side are one thing, but beyond that, the actual conceptualization and sharing of your concept will be extremely accelerated.
Jim: Yeah.
- Hyong: Which is the underlying layer of all the production work that people actually see. There are millions of these mock-ups and pre-visualizations required. So I think this will speed up not only the production, but the conceptualization, projection, and sharing with other people. Right now, production of animation and motion is an exclusive thing, available only to animators, but this will eventually bring it down to the consumer level really rapidly.
Jim: Yeah, I really like the idea of creators just having more ability to bring their own vision to life without relying on specialists so much. I think that’s really cool.
And that was it, folks! What do you think? Sounds pretty amazing, right? Chatting with Hyong was a genuine pleasure, and his passion and enthusiasm for PLASK were evident in everything he was willing to share with me today. Be ready for their product launch date – January 5th! The Professional level of the PLASK tool will be available for free for the first three months at PLASK.ai, so be sure to give it a try for yourself!