Her face is a deepfake. Her body belongs to a team of similar-sized actors. But she sings, reads the news, and sells luxury clothes on TV as AI humans go mainstream in South Korea.
Meet Zaein, one of South Korea’s most active virtual humans, who was created by Pulse9, an artificial intelligence company working to bring corporate dreams of the perfect employee to life. Pulse9 has created digital humans for some of South Korea’s largest conglomerates, including Shinsegae, and research indicates the global market for such lifelike creations could reach $527 billion by 2030.
In South Korea, AI humans have enrolled as university students, interned at major companies, and now appear regularly on live television, where they have sold out products ranging from food to luxury handbags. But Pulse9 says this is only the beginning. The company is “working on developing the technology to broaden AI human use”, Park Ji-eun, its CEO, told AFP. “Virtual humans are basically capable of carrying out much of what real people do,” she said, adding that at the current level of AI technology, humans are still needed, at least for now.
The demand for AI humans in South Korea was initially driven by the K-pop industry, with the idea of a virtual idol—not prone to scandals and able to work 24/7—proving popular with the country’s notoriously hard-driving music agencies. But now, Pulse9 is “expanding their roles in society to show that these virtual humans aren’t just fantasy idols but can coexist with humans as colleagues and friends”, Park said.
K-pop face
Zaein’s face was created through deep learning analysis (an AI method that teaches computers to process complex data) of the faces of K-pop stars from the past two decades. Doe-eyed, with delicate features, fair skin and a willowy figure, she is brought to life by overlaying the deepfake on a human actor. More than 10 human actors, each with different talents, from singing and dancing to acting and reporting, help animate Zaein, which is what makes this particular AI creation so “special”, Park said.
On a Monday morning, AFP met with one of the actors as she was preparing to deliver a report as Zaein on a live morning news program on South Korean broadcaster SBS. “I think it can be a good practice for people who want to become celebrities and that’s what appealed to me,” said the actor, who could not be named due to company policy.
A Pulse9 representative said the identities of all the human actors are concealed and their real faces are never shown. Despite the strict measures to keep their profiles hidden, the actor said playing a virtual human has opened new doors. “Typically, a lot of people in their teens and young people become K-pop idols and I’m way past that age, but it’s nice to be able to take on that challenge,” the actor, who is in her 30s, told AFP. “I’d love to try acting as a man if I can manage my voice well, and maybe a foreigner—something that I can’t become in real life.”
Creating artificial humans will continue to require real people “until a really strong AI is created in future which will be able to process everything by itself”, Park said. The potential—and potential perils—of AI have exploded into the public consciousness in recent months, since ChatGPT burst onto the scene at the end of last year. Experts around the world, including AI pioneers, have spoken out about its dangers, and several countries are seeking regulation of the powerful but high-risk invention.
But Park is not concerned. Her company is working on new virtual idols, virtual influencers, and virtual sales agents to take over customer-facing tasks for South Korean conglomerates, which are increasingly struggling with recruitment in the low-birthrate country. South Korea—and the world—needs better, clearer regulations on what AI can do, she said, adding that when done properly, the technology can add to “the richness of life”.
The trouble, however, is that a deepfake can “make it impossible to tell what is real and fake”, Kim Myuhng-joo, a professor of information security at Seoul Women’s University, told AFP. “It’s an egregious tool when used to harm others or putting people in trouble. That’s why it’s becoming a problem,” he added.—AFP