Interactive movies are redefining cinema and storytelling and opening up a world of possibilities in the entertainment industry. There are no “spoilers” for films with no predetermined endings, whose characters and plots develop based on viewers’ real-time direction. Now, what if these viewers became characters?
California-based startup rct studio is leveraging cutting-edge artificial intelligence in its quest for a movie experience that is both interactive and immersive.
Rct studio’s goal is to create a virtual environment where human players wearing VR headsets can explore a multi-ending story while interacting with non-player characters (NPCs) whose behaviors are indistinguishable from those of human players. Think of the American TV series Westworld, in which humans visit a fictional tech-driven, Wild-West-themed amusement park populated by androids with common sense and self-awareness.
Rct studio made its public debut at the Y Combinator Demo Day in San Francisco last month. A 57-second demo video showed how the company’s homegrown Morpheus engine can automatically analyze and directly convert screenplay texts — including characters and dialogue, objects, action lines, and even real-world physics — into real-time 3D simulations. The video went viral on the Chinese social media platform Weibo.
In another four-minute VR demonstration run entirely on the Morpheus engine, a human player acting as a Bank Robber character plays out multiple scenarios, such as “threatens the bank teller and shoots”, “shoots randomly”, or “misfires”. Each scenario leads to different outcomes and endings.
Founded in late 2018, rct bills itself as “a new Pixar for interactive movies” and hopes to follow in the footsteps of the animation studio founded 33 years ago by a team of computer scientists and movie producers. With a vision and a willingness to leverage then-under-appreciated new technologies, Pixar ushered in a golden age of animation and produced blockbusters including Toy Story, Monsters, Inc. and The Incredibles.
The Founder of rct studio is Jesse Lyu (吕骋), a 29-year-old entrepreneur known for his previous AI startup Raven Tech (渡鸦), which developed high-end smart speakers and an Alexa-like, voice-enabled interactive system. Chinese tech giant Baidu acquired Raven Tech in 2017, and Lyu led Baidu’s smart hardware unit for about a year before departing in July 2018.
“I personally think that we will spend more and more time on the screen. The best screen is your vision, and VR is the best screen in the world for us … After ten years, everyone shall have a room called ‘another world’ (for virtual reality),” Lyu tells Synced.
Leveraging AI in interactive movies
The year 2018 saw the rapid rise of interactive movies, also known as choose-your-own-adventure movies or movie games. In the Netflix interactive movie Black Mirror: Bandersnatch, viewers make choices via on-screen prompts to decide what happens next. Sony, working with top game developer Quantic Dream, released the movie-like adventure game Detroit: Become Human, whose story revolves around three androids in a future world.
There is now only a faint, blurry line between interactive games and interactive movies, as advanced computer graphics techniques are used to develop ultra-realistic video game content.
A characteristic shared by today’s interactive movies is that every possible ending has been pre-filmed. Viewers arrive at an ending by clicking through plot options, progressing through a multiple-choice decision tree.
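The structure described above can be sketched as a simple tree whose leaves are the pre-filmed endings. The node names and clip filenames below are hypothetical, invented purely for illustration:

```python
# Minimal sketch of a pre-filmed interactive movie: every ending already
# exists as a video clip, and viewer choices simply walk a decision tree.

class SceneNode:
    def __init__(self, clip, choices=None):
        self.clip = clip                  # pre-filmed video segment
        self.choices = choices or {}      # on-screen prompt -> next SceneNode

    def is_ending(self):
        return not self.choices           # a leaf node is an ending

# A tiny tree: one branch point, two pre-filmed endings.
ending_a = SceneNode("ending_a.mp4")
ending_b = SceneNode("ending_b.mp4")
opening = SceneNode("opening.mp4",
                    {"Accept the offer": ending_a,
                     "Refuse": ending_b})

def play(node, pick):
    """Follow viewer picks until a leaf (an ending) is reached."""
    while not node.is_ending():
        node = node.choices[pick(node)]
    return node.clip

# A viewer who always takes the first on-screen option:
first_option = lambda n: next(iter(n.choices))
print(play(opening, first_option))  # -> ending_a.mp4
```

However many branches such a tree has, the set of outcomes is fixed at filming time, which is exactly the limitation Lyu wants to move past.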
That is not the type of interactive experience Lyu envisions. He dreams of a virtual world with a much broader range of interactions, where NPCs empowered by AI have commonsense and act like real humans. Instead of a list of pre-developed dialogue options on a static screen, Lyu wants to enable in-game interactions to trigger an infinite set of possible endings.
Georgios N. Yannakakis, a Professor and Director of the Institute of Digital Games at the University of Malta, wrote in his book Artificial Intelligence and Games: “AI that plays well as a non-player character … can empower dynamic difficulty adjustment and automatic game balancing mechanisms that will in turn personalize and enhance the experience for the player.”
Advanced AI is already creating a more convincing open world for gamers. In the Old West adventure game Red Dead Redemption 2, a runaway hit last year, NPCs are created with memories. “If you start a fight in a saloon and return another day, the bartender will ask you not to start any trouble. Depending on the severity of past actions, they may tell you to leave or call a Sheriff,” explains a RockstarINTEL.com report. It’s these little details that can push the game experience to the next level.
Danny Lange, VP of AI and Machine Learning at Unity Technologies, tells Synced that rather than programming NPCs using traditional means, today’s advanced AI tech can train them to become increasingly sophisticated: “Machine Learning has this capability of ‘generalization’ that enables NPC to learn through simulation to deal with never before seen situations.”
Inside rct studio technology
Lyu believes AI and machine learning work best in carefully constrained areas with extremely high value, and movies are a perfect example of this. “Take the most complex movies – such as the Harry Potter or the Marvel series — as examples. While these movies have so many characters, they still have a clear storyline, a clear in-and-out. There are a lot of standardized, industry-grade screenplay scripts that data-driven machines can utilize and learn from.”
Built in just eight months, the company’s Morpheus engine drives its futuristic interactive experience. At its core is a natural language processing system designed to automatically analyze screenplays: it divides the text into meaningful units (text segmentation), recognizes key words by category (subjects, actions, and targets), and binds all this information together. The system then instructs an automated game editor to create a real-time 3D simulation.
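The segment-tag-bind flow described above can be illustrated with a toy pipeline. This is not rct studio’s actual system; the keyword lists, rules, and sample script line are all invented for the example:

```python
# Hedged sketch of the screenplay-parsing flow the article describes:
# segment the text, tag key words by category (subject, action, target),
# then bind them into instructions for a scene builder.
import re

ACTIONS = {"shoots", "threatens", "enters"}   # toy vocabulary
OBJECTS = {"gun", "teller", "bank"}

def segment(script: str):
    """Split an action line into sentence-level units."""
    return [s.strip() for s in re.split(r"[.!?]", script) if s.strip()]

def tag(unit: str):
    """Naive keyword tagging: first word is the subject,
    known verbs are actions, known nouns are targets."""
    words = unit.lower().replace(",", "").split()
    return {
        "subject": words[0],
        "action": next((w for w in words if w in ACTIONS), None),
        "target": next((w for w in words if w in OBJECTS), None),
    }

def bind(script: str):
    """Turn a script fragment into per-unit scene-builder instructions."""
    return [tag(u) for u in segment(script)]

for step in bind("Robber enters the bank. Robber threatens the teller!"):
    print(step)
```

A real engine would replace the keyword lookups with trained NLP models, but the shape of the output, structured who-does-what-to-what records that a 3D editor can act on, is the point of the sketch.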
The engine can typically render about 90 percent of a script, but Lyu stresses that human design input is still needed when text descriptions are too ambiguous or describe something physically unrealistic, such as the griffin from Harry Potter.
Lyu says a number of animation studios and TV stations have shown interest in the Morpheus engine. In terms of efficiency, he estimates the engine could save a typical Hollywood action film up to US$20 million by automating key processes.
Sean MacPhedran, a Canada-based creative director and co-founder of the Ottawa International Gamer Conference, tells Synced: “it would definitely be a time saver in creating storyboards, understanding the tempo of a story – in effect it sounds like the tool creates a very rough cut of the narrative.”
Another Morpheus engine innovation is using AI to enhance in-game interactions and create open-ended possibilities. The engine requires no step-by-step handcrafted coding for in-game characters, whose behaviors are grounded in common sense. For example, an NPC police officer understands that guns are dangerous, so when a human player surrenders, the officer demands they put down all their weapons.
To create this commonsense quality, rct studio developers trained a convolutional neural network on a vast number of industry-standard scripts so the AI could predict how characters should react. Developers also integrated the model with two filters packaged as APIs, one for common sense and one for physics, so NPCs would not do anything physically unrealistic.
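The propose-then-filter design described above can be sketched as follows. Everything here is an invented illustration of the architecture, not rct studio’s code; the model is replaced by a stub, and the filter rules are stand-ins:

```python
# Illustrative sketch: a learned model proposes ranked NPC reactions, and
# two filter passes -- common sense and physics -- veto anything implausible
# before it reaches the scene.

def model_propose(situation):
    """Stand-in for the trained network's ranked action predictions."""
    return ["fly_away", "put_down_weapon", "raise_hands"]

def commonsense_ok(action, situation):
    # e.g. an officer knows guns are dangerous and a surrender
    # implies disarming, so ignoring the threat is vetoed
    return action != "ignore_threat"

def physics_ok(action, situation):
    # veto physically impossible behavior
    return action != "fly_away"

def next_action(situation):
    """Return the highest-ranked prediction that survives both filters."""
    for action in model_propose(situation):
        if commonsense_ok(action, situation) and physics_ok(action, situation):
            return action
    return "idle"  # safe fallback if every prediction is vetoed

print(next_action({"npc": "police_officer", "event": "player_surrenders"}))
# -> put_down_weapon
```

Keeping the filters outside the model is a sensible separation: the network can be retrained freely while hard constraints on plausibility stay fixed.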
“I personally believe that in the coming year, AI can pass the Turing test in the game with a limited storyline or a world view — that is, you can’t tell whether the NPC is a computer or a human. Today’s algorithm is technically ready, in my opinion,” says Lyu.
An AI expert with a Machine Learning Master’s who asked not to be identified seconded Lyu’s opinion: “It is definitely possible. From the NLP side I feel like we’re far enough along that it won’t be very limiting, though understanding players’ physical actions is probably going to be a challenge.”
Asked about near-future plans for rct studio, Lyu says the first step is to release its first title on Steam by the end of this year. Rct studio is in talks with some of the big names in the entertainment industry to co-produce a title using its Morpheus engine.
Rct studio is also committed to developing customized VR hardware for the Morpheus engine. “We should first improve our user experience and software. When you are so obsessed with your software to a certain extent, you will definitely want to make a hardware just to run the software,” says Lyu.
Rct studio knows that its bold plan is bound to be challenging, but it has reason to be confident: it has been accepted by the YC Accelerator Program; Sand Hill Road investors called Lyu after the YC demo, and the company recently closed its Series A funding.
Lyu’s ambitious nature and hands-on personality permeate his companies. At Baidu, he developed one of the most stylish but expensive smart speakers on the Chinese market and refused to compromise on its features. He dislikes being labeled as a normal person, and rarely talks to media.
Most rct team members are Lyu’s confidants who worked alongside him at Raven Tech for years and have been with him through it all. Most importantly, they share his vision and are enthusiastic about what they are doing.
“Have you watched the movie Ready Player One (a 2018 sci-fi film whose characters meet in a VR universe)?” CMO Xinjie Ma (马欣洁) asked Synced. “That is exactly what we are building!”
Journalist: Tony Peng | Editor: Michael Sarazen