At Replica, we believe AI Voice technology has the potential to eliminate critical bottlenecks in the development of narrative-heavy games, enabling studios to tell ever bigger and more immersive stories.
While talking to and working with some of the most successful studios and game franchises today, we discovered an interesting problem that many of them have in common.
Despite many of these studios boasting eye-watering production budgets and huge teams of people to co-ordinate voice acting, they are hitting an upper bound on the number of characters and lines of dialogue they can ship in videogames.
The reason is that complexity grows exponentially with scale. Consider a game with several hundred NPC characters, where each NPC requires a voice actor to perform lines, a director for live performance coaching, potential script changes and reshoots, mastering and sound FX, dubbing and localisation, and then animation, SFX, and VFX on top: the effort involved in co-ordinating such a massive project within a limited time frame is enormous.
It is obvious that the current system of populating videogames with characters and creating immersion through storytelling does not scale easily. We see the potential for AI to solve many of these challenges, allowing even small teams to dream big and scale their games into epic adventures.
Introducing Replica Smart NPCs: an AI Experience featuring The Matrix Awakens
We are thrilled to introduce Replica Studios AI-powered Smart NPCs for Unreal Engine. Think of this as some of the foundational work we are doing to help game studios easily scale their games with fully voiced and animated AI-powered NPCs.
Download and install the playable demo featuring Replica Smart NPCs in the Matrix Awakens project - Download it here!
Over the coming weeks we will monitor and improve the demo with update patches. We are working to improve latency, offer integrations with other LLMs, and release plugins for every major game engine, so that all game developers can use Replica Smart NPCs and leverage our unique, high-quality, and ethically licensed library of AI voice actors.
Sign up to our waitlist and we’ll notify you when the Smart NPC Unreal Engine plugin is available
Have we hit an upper limit of voice acting in videogames?
Starfield has over 250,000 lines of dialogue, more than twice that of Fallout 4, which clocked in at 111,000 lines. GTA V has over 160,000 lines. Red Dead Redemption 2 features 1,200 actors, 700 of whom share its 500,000 lines of voiced dialogue. These are really impressive numbers, and the teams behind them deserve massive respect!
Could you imagine a game with 5 million recorded lines of dialogue? What would this game look and feel like? How many characters and NPCs would be required to populate a world so alive? How would teams keep track of so many lines, stories, and characters, and how would they record that many lines within the time it takes to ship a title?
The eventual Metaverse will birth new genres of games and experiences.
Today we are familiar with fast-paced multiplayer shooters like Fortnite, PUBG, or Apex Legends, and with Minecraft-like worlds. We believe a new category of virtual experiences is just around the corner that will rise to immense popularity: it will be known for its depth of narrative and endless storytelling, for having hundreds of characters, and for ever-expanding quests and stories.
Games like Red Dead Redemption 2, The Witcher 3, Grand Theft Auto and the Assassin’s Creed titles have given us a glimpse into what this might look like, but these games have thus far been confined to the lofty realms of AAA studios that can afford massive teams and lengthy production cycles.
We see a future where nimble independent studios can iteratively build live experiences, leveraging the power of AI to bring this kind of storytelling to life with just as much immersion as the big studios.
In order to realise this dream, we need a new system for populating live games with living characters that add to the immersion in a sustainable way. The current system of filming and recording live actors does not scale.
We think one way of shipping games with 10x or 100x more voice acting is to combine AI voices with generative AI models like ChatGPT, so that some percentage of the ‘voice acting’ is done autonomously, by NPCs responding directly to player actions and commands in the game.
For narrative and game design teams to trust the responses of these autonomous AI Smart NPCs, the NPC ‘brains’ would need to be fine-tuned on the lore of the game, its characters, their backstories, and motivations. Teams would also need to set conditions that decide when an NPC delivers a pre-scripted line versus an AI-generated line. We see the role of writers and narrative designers eventually encompassing the management and fine-tuning of the memory and context conditions that power AI Smart NPCs, in addition to the writing and narrative design work they do today.
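As a rough illustration of what such conditions could look like, here is a minimal Python sketch. Every name and structure in it is hypothetical, invented for the example; it is not the Replica API or plugin code:

```python
from dataclasses import dataclass, field

@dataclass
class NPCBrain:
    """Toy model of an NPC choosing between scripted and generated lines."""
    name: str
    lore: str                 # backstory an LLM would be conditioned on
    scripted_lines: dict = field(default_factory=dict)  # trigger -> authored line

    def respond(self, player_input: str) -> str:
        # Condition 1: authored content always wins when a trigger matches,
        # so narrative-critical lines stay under the writers' control.
        for trigger, line in self.scripted_lines.items():
            if trigger in player_input.lower():
                return line
        # Condition 2: otherwise fall back to a generated line, prompted
        # with the character's lore so responses stay in character.
        return self.generate_line(player_input)

    def generate_line(self, player_input: str) -> str:
        # Placeholder for an LLM call, e.g. prompt = self.lore + player_input.
        return f"[generated, in character as {self.name}]"

guard = NPCBrain(
    name="Gate Guard",
    lore="A weary guard who has never left the city.",
    scripted_lines={"quest": "The captain has work for you. Speak to her at the keep."},
)
print(guard.respond("Do you have a quest for me?"))  # scripted line
print(guard.respond("What's the weather like?"))     # generated fallback
```

In a real pipeline the trigger check would be replaced by whatever conditions the narrative team defines, and the generated branch by a lore-conditioned LLM call.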
Replica’s vision is to offer Unreal Engine 5-like capabilities for voice acting.
Our team was inspired by The Matrix Awakens experience when Epic Games released it last year. The impact of UE5 still rings clear: now even a tiny game studio can match the visual fidelity of a well-funded AAA team. At Replica, we are excited to be working towards a future where even small studios can dream of matching the storytelling fidelity and depth of immersive characters of well-funded AAA teams.
So today, we are thrilled to share the work we’ve been doing. Inspired by The Matrix Awakens demo, we are releasing an early Matrix demo powered by Replica Studios AI Smart NPCs.
You can experience the demo for yourself on a modest Windows PC.
Download it here
Sign up to the waitlist to be notified when we release the Smart NPC Plugin
Our playable demo builds on Unreal Engine’s Matrix Awakens sample project and shows off some of the features that will be available in the Smart NPC plugin we will release later this year. Features of the playable demo include:
Talk to any NPC and they respond in real time
Using your microphone, talk to NPCs and they will respond with dialogue in real time.
AI voices with emotion
NPC responses utilise Replica’s range of emotions and adapt in real time.
Scale unique NPC characters for your world
NPC context and backstories can be customised for a unique tailored experience (not available in the playable demo).
NPCs can talk to each other
Ambient NPC interaction is built in, so NPCs will also converse with each other intelligently.
Automated lip-sync and body gestures
A customised blend shape responds to phoneme timing output, creating effortless and accurate real-time lip sync. Animation blueprints also provide suitable body gestures.
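To illustrate the lip-sync idea above, here is a minimal Python sketch of turning phoneme timings into blend-shape weights. The phoneme-to-viseme table, the timing format, and the linear ramp are all invented for the example; they are not Replica's actual schema or the demo's implementation:

```python
# Toy mapping from phonemes to visemes (mouth shapes); a real setup would
# cover the full phoneme set and blend smoothly between neighbours.
PHONEME_TO_VISEME = {
    "AA": "jaw_open", "IY": "mouth_wide", "UW": "lips_pucker",
    "M": "lips_closed", "F": "lip_bite",
}

def viseme_weights(timings, t):
    """Return blend-shape weights at time t from (phoneme, start, end) tuples.

    Each phoneme's weight ramps linearly 0 -> 1 -> 0 over its duration,
    giving a crude ease-in/ease-out on the mouth shape.
    """
    weights = {}
    for phoneme, start, end in timings:
        if start <= t <= end and phoneme in PHONEME_TO_VISEME:
            mid = (start + end) / 2
            half = (end - start) / 2
            weight = 1.0 - abs(t - mid) / half if half else 1.0
            shape = PHONEME_TO_VISEME[phoneme]
            weights[shape] = max(weights.get(shape, 0.0), weight)
    return weights

# "ma" spoken over 0.3 seconds: an M closure followed by an open AA vowel.
timings = [("M", 0.00, 0.10), ("AA", 0.10, 0.30)]
print(viseme_weights(timings, 0.05))  # lips_closed at full weight
print(viseme_weights(timings, 0.20))  # jaw_open at full weight
```

In an engine, the returned weights would be written to the character's morph targets each frame, sampled at the audio playback time.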