Introduction
Welcome to the first Unit of the course!
The integration of cutting-edge AI models in video games is opening up a whole range of exciting new gameplay possibilities. One of them is the ability for Non-Player Characters (NPCs) to understand and respond to the player’s voice or text inputs. And this is what we’re going to do today.
Indeed, in this Unit, you’re going to integrate your first AI model in your Unity game and make it run locally.
You’re going to learn:
- What is Sentence Similarity?
- The difference between running an AI model locally or remotely (using an API).
- What the Hugging Face Hub 🤗 is.
- How to run an AI model locally with Unity Sentis and Sharp Transformers.
And you’re going to make this demo, where a smart robot can understand your orders and perform them.
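Under the hood, this kind of demo works by comparing the player’s command against a list of known actions using sentence similarity. As a rough illustration (written in Python rather than Unity C#, and with made-up toy embeddings instead of real model outputs), picking the closest action boils down to a cosine-similarity argmax:

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors: 1.0 = same direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy 3-dimensional "embeddings" — a real sentence-similarity model
# would produce vectors with hundreds of dimensions.
action_embeddings = {
    "bring the red cube": [0.9, 0.1, 0.0],
    "jump": [0.0, 0.9, 0.2],
    "wave hello": [0.1, 0.0, 0.95],
}

def pick_action(command_embedding):
    # Return the known action whose embedding is closest to the command's.
    return max(
        action_embeddings,
        key=lambda a: cosine_similarity(command_embedding, action_embeddings[a]),
    )

# A command whose (toy) embedding lands near "jump":
print(pick_action([0.05, 0.8, 0.1]))  # → jump
```

In the actual project, the embeddings come from a Transformer model running locally through Unity Sentis, but the matching logic is conceptually this simple.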
You can download the Windows demo 👉 here
To make this project, we’re going to use:
Unity Game Engine (2022.3 or later).
The Jammo Robot asset made by Mix and Jam.
Unity Sentis, the neural network inference library that allows us to run our AI model directly inside our game.
The Hugging Face Sharp Transformers library: a Unity plugin of utilities to run Transformer 🤗 models in Unity games.
You can download the complete Unity Project by clicking 👉 here
At the end of the project, you’ll build your own intelligent robot game demo.
And then, you’ll be able to iterate with other ideas:
For instance, after making this game, I created this Dungeon Escape demo ⚔️ with the same codebase, where your goal is to flee the jail by stealing the 🔑 and the gold without being noticed by the guard.
But I also worked on a stealth game, where you guide your character to sneak into a party and steal some stuff.
You’ll also be able to improve the demo by adding a Speech-to-Text model, so you can give orders to the robot with your voice!
Sounds fun? Let’s get started!