School of Science and Technology 科技學院
Computing Programmes 電腦學系

Yu-Gi-Oh MR Assistance System

OUYANG Xiaolong, Chan Hing Fung, Fan Cheuk Nam, Gu Jiahao

Programme: Bachelor of Arts with Honours in Computing and Interactive Entertainment
Supervisor: Ms. Pasty Yuen, Dr. CM Tang
Areas: Augmented Reality Applications
Year of Completion: 2022


Project Aim

This project aims to enhance the game experience of ‘Yu-Gi-Oh! TCG’ through real-time coexistence and interaction in a mixed reality system.

Our team chose this project to address the game’s lack of newbie-friendliness. The Yu-Gi-Oh! trading card game was launched in 1999 and now has over a thousand cards; new players cannot know every card effect during a game and must study them elsewhere, such as on the internet, so they cannot quickly enjoy the card game. Our mixed reality system helps new players enjoy the game immediately. We decided to build a database of card information and pair it with a UI that supports players during card battles, displaying important information such as life points and the current game phase, so newcomers can easily grasp the basics and enjoy the game.

Nowadays several AR devices can display virtual objects and information in real time. Using such devices, we can reproduce the scenes of the Yu-Gi-Oh! anime series, letting monsters appear in the real world. Users can interact with the monsters through gestures in real time, for example to summon and attack, and watch them fight with striking visual and sound effect feedback, giving a more immersive and exciting gameplay experience.

Project Objectives

To achieve the aim, the project has defined several sub-objectives as follows:

  • To create a real-time database

The database stores the values of monster cards and the players’ life points. Once a value changes during battle (e.g. life points decrease after an enemy’s attack), the data is updated immediately. When a player summons a monster, the system looks up that monster’s id and retrieves all of the monster’s values by the id.

  • To design monster models and special effects

When a player summons a monster card, a 3D model of that monster appears above the card. The monster has several animations, such as appearance, attack and death, and particle effects accompany the animations to give better visual feedback. A few sound effects and background music are also included. These animations and special effects trigger automatically at the appropriate moments.

  • To set up non-interactive visual interfaces

Non-interactive visual interfaces display useful battle information from the real-time database. For example, a monster’s ATK/DEF values are displayed in front of that monster’s 3D model. These interfaces only let players see the state of the whole battle clearly; players cannot interact with them.

  • To set up interactive visual interfaces

The system includes several interactive visual interfaces that players use to interact with it. For example, a user can point at the rulebook UI; when the system receives the action, it opens the rulebook. Interactive visual interfaces therefore help players know what they can do and what they are doing.
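The first sub-objective, the real-time database, could be sketched as follows. This is an illustrative Python sketch, not the project’s actual Unity code: a store keyed by monster id, with change callbacks so UI elements can refresh the moment a life point value changes (the card id and stats below are example data).

```python
class BattleDatabase:
    def __init__(self):
        self.cards = {}                                   # card id -> card values
        self.life_points = {"player": 8000, "enemy": 8000}
        self.listeners = []                               # callbacks fired on change

    def add_card(self, card_id, name, atk, defense):
        self.cards[card_id] = {"name": name, "atk": atk, "def": defense}

    def lookup(self, card_id):
        # Called when a monster is summoned: fetch all values by id.
        return self.cards[card_id]

    def damage(self, side, amount):
        # Revise the data immediately and notify every listening UI element.
        self.life_points[side] = max(0, self.life_points[side] - amount)
        for notify in self.listeners:
            notify(side, self.life_points[side])

db = BattleDatabase()
db.add_card("46986414", "Dark Magician", atk=2500, defense=2100)

updates = []
db.listeners.append(lambda side, lp: updates.append((side, lp)))

db.damage("player", 1800)
print(db.lookup("46986414")["atk"])  # 2500
print(updates)                       # [('player', 6200)]
```

The listener list plays the role of the non-interactive UI described above: it never writes back to the database, it only reflects changes.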


Demonstration Video

Demonstration Video (Gameplay)

Methodologies and Technologies used

We decided to use an AR device with hand tracking and image tracking so that battles like those in the anime can take place in the real world. First, we use Nreal glasses as the AR device; they project virtual objects such as monsters, and display the UI and effects in the real world for the players. Second, hand tracking lets a player use gestures to give orders to the game and the monsters; after detecting a gesture, the game system responds with corresponding feedback, such as changing the battle phase or triggering a monster action. Third, we use image tracking to detect the monster cards in the real world. Just as in the physical game, players must prepare their own card deck before playing; image tracking identifies each card and returns its information from our card database, such as the card details and its virtual model.

We also simplified the original rules and the game flow so that new players can easily learn and handle the game. To support online battles between players, we use Unity NetCode for the networking, letting two different devices connect online to start a battle.
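The gesture-to-command handling described above can be sketched as a dispatch table. This is an illustrative Python sketch; the gesture names and actions come from our design, but this is not the Nreal SDK’s actual API.

```python
# Map (hand, gesture shape) -> game command.
GESTURE_ACTIONS = {
    ("left", "point"):    "toggle_monster_info",
    ("left", "grab"):     "next_phase",
    ("right", "point"):   "select_object",
    ("right", "victory"): "confirm",
    ("right", "grab"):    "cancel",
}

def handle_gesture(hand, shape):
    # Unrecognised gestures are ignored so stray detections do no harm.
    return GESTURE_ACTIONS.get((hand, shape), "ignored")

print(handle_gesture("left", "grab"))   # next_phase
print(handle_gesture("left", "wave"))   # ignored
```

Keeping the mapping in one table makes it easy to add, remove or remap gestures, which we later needed to do when one gesture proved unreliable.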

Combining the above, we expect to create an easier Yu-Gi-Oh! trading card game on an AR device: a mixed reality system with hand tracking and image tracking that lets players interact with virtual monsters in real time, feel like a character in the anime, and have an exciting game experience. If the game runs on Nreal glasses, plays online against other devices, completes the required tasks and gives feedback to the player, the project is a success.


Card image tracking – accuracy of image tracking
Gesture tracking – speed of image tracking
Network connection – accuracy of hand tracking
Data of the cards – data size of the cards
Battle system – UI design
VFX and SFX

Key Supporting Technologies

Image tracking – EasyAR

By using image tracking, the player can summon a specific monster in the game by playing the corresponding card in the real world, and the game system knows which card the player is using. It also helps combine the real world with the game’s virtual world.

Hand tracking – Nreal

This function is provided by the Nreal glasses. It allows players to use their hands to give commands to the game system. It is also a good fit for this game: players can perform gestures just like the characters in the anime series we aim to reproduce, and they do not need to hold extra devices to control the game. This matters because players must hold their cards while playing a physical card game, and putting the cards down to operate a separate controller would be inconvenient.

Networking – Unity NetCode

This function is provided by Unity, which is also our game engine. It allows players to synchronise the game state easily and quickly. Using Unity NetCode, we built a system that automatically pairs players and creates a room for them to play in.
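The auto-pairing behaviour can be sketched as a simple first-come queue: the first waiting player is paired with the next arrival to form a room. This is an illustrative Python sketch of the matchmaking logic, not Unity NetCode’s actual API.

```python
import itertools

class Matchmaker:
    _room_ids = itertools.count(1)   # monotonically increasing room ids

    def __init__(self):
        self.waiting = None          # at most one player waits at a time

    def join(self, player):
        if self.waiting is None:
            self.waiting = player
            return None              # no opponent yet: keep waiting
        room = {"id": next(self._room_ids),
                "players": (self.waiting, player)}
        self.waiting = None
        return room                  # two devices paired into one room

mm = Matchmaker()
print(mm.join("device_A"))   # None (waiting for an opponent)
room = mm.join("device_B")
print(room["players"])       # ('device_A', 'device_B')
```

For a two-player card battle this is all the lobby logic the game needs; the engine’s networking layer then synchronises state inside the room.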

Design and Ideas

This project offers a new way to play the Yu-Gi-Oh! trading card game. We provide animations and sound effects for the models: a sound effect plays when the game starts, and during battle players hear background music from the classic Yu-Gi-Oh! anime series. We have made many sound effects and visual effects for the models and the game.

As a new way of playing this card game, we designed an easier game flow based on the standard Yu-Gi-Oh! trading card game. We keep the same battle phases as the original flow, but ignore some card effects on monsters so that newbies can easily learn how to play. An on-screen UI shows the current battle phase with a short sentence telling players what actions they can take in each phase. Our game flow makes matches quicker than the original, and our tutorial makes the basic game logic easy to understand.
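The simplified turn flow can be sketched as a small phase cycle with a hint per phase. This is an illustrative Python sketch; the exact phase list and hint texts below are assumptions for the example, not the project’s final wording.

```python
PHASES = ["Draw", "Main", "Battle", "End"]

# One short sentence per phase, shown on the battle-phase UI.
HINTS = {
    "Draw":   "Draw one card from your deck.",
    "Main":   "Place a monster card on the field.",
    "Battle": "Point at a monster to attack.",
    "End":    "Your turn ends; play passes to your opponent.",
}

def next_phase(current):
    # The phase-change gesture advances to the next phase in order,
    # wrapping back to the start for the opponent's turn.
    i = PHASES.index(current)
    return PHASES[(i + 1) % len(PHASES)]

phase = "Draw"
print(HINTS[phase])          # Draw one card from your deck.
phase = next_phase(phase)
print(phase)                 # Main
```

Because the phase change is a single forward step with no undo, the gesture that triggers it must be detected reliably, a point that matters in Evaluation Study #3.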

The AR glasses combine the virtual and the real world, so players can see virtual monsters in the real world. This reproduces the anime scenes for players while they play the real-life card game, and it goes beyond simply playing the card game: players use gestures to issue commands, just as the characters in the anime series do to control their monsters, which makes the game more immersive.

Information panels on the right side of the UI guide the player through the game. Together with the sound effects and clear in-game instructions, this creates a good environment for new players joining this complex card game. The UI design also helps new players tell the two sides apart: for example, the HP UI uses red and green, contrasting colours that games often use to distinguish oneself from others. In our playtests, people playing for the first time could easily tell which side was theirs.


Evaluation Study #1

The objective of the first study is to test the detection capability of EasyAR’s image tracking system.


The methodology of the study is to count how many times the system can detect a card within 10 seconds; we ran the test 10 times in total.
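The per-trial counts can be turned into a success rate as follows. This is an illustrative Python sketch; the detection counts below are placeholders, not the actual study data.

```python
def success_rate(detections_per_trial, threshold=1):
    # A trial succeeds if the card was detected at least `threshold`
    # times within the 10-second window.
    successes = sum(1 for n in detections_per_trial if n >= threshold)
    return successes / len(detections_per_trial)

# Placeholder counts for one card over 10 trials.
trials_card_a = [3, 2, 4, 1, 0, 2, 3, 1, 2, 0]
print(f"{success_rate(trials_card_a):.0%}")   # 80%
```

The same function, run per card, gives the easy-to-detect versus hard-to-detect split reported below.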


The result is that two of the cards are easy to detect (more than a 70% success rate), but two of them are hard to detect (less than a 50% success rate).

Figure 1.

Significance of the results:

We therefore needed to find ways to increase the detection capability for those two cards. We added more image sources for them, which successfully made both cards easier to detect.


A limitation is that we only tested in a bright place; the detection capability may be affected by lighting.

Figure 2.

Evaluation Study #2

There are a total of six gestures used in the project. The objective of the second study is to test the usage frequency of the different gestures.

Gesture – Use of the gesture
Left hand point to object – Show/close the monster information UI
Left hand victory – Show/close the rule book
Left hand grab – Change the game phase
Right hand point to object – Select the object
Right hand victory – Confirm the action (after right hand point)
Right hand grab – Cancel the action (after right hand point)


The methodology of the study is to count how many times each gesture is used in a game (we ran the test twice in total).


The “Left hand grab” gesture is used far more often than any other.

(LH = left hand, RH = right hand)

Figure 3.

Significance of the results:

“Left hand grab” is the most important gesture because changing the game phase is a mandatory action in the game, so we need to keep this gesture easy to detect.


A limitation is that we did not test with a newbie; newbies may use “Left hand victory” and “Left hand point” more often because they do not know the game rules and monsters well enough.

Evaluation Study #3

The objective is to test the detection capability of the different gestures.


The methodology of the study is to count how many times each gesture is detected correctly (we ran the test 20 times in total).


Only the victory gestures have a misdetection problem, and the problem is most serious for “Left hand victory”: the device mis-detects the victory gesture as a grab.

(LH = left hand, RH = right hand; blue bars show correct detections, red bars show misdetections)

Significance of the results:

Because the phase change is an irreversible action, it is very troublesome when the system detects “Left hand victory” as “Left hand grab”. To solve this problem, we simply removed the “Left hand victory” gesture and instead provide a UI element that players open with “Right hand point” to read the rule book.

Figure 4.


Conclusion

To conclude this project, we did a good job on it.

By using different methods and technologies to develop the game, the project has achieved its aims.

For the aim of being new-player friendly, our game makes it easy for new players to play the Yu-Gi-Oh! card game. It provides plenty of support to let players know what they are doing during a game and shows important information about the battle.

Also, for the aim of game immersion, the game gives players a strong sense of immersion. We provide different sound effects, 3D models, animations and visual effects to offer several kinds of enjoyment. There are still limitations, such as a limited set of monster 3D models and the fact that no card effects can be activated, but these can be addressed in future development.

Future Development

The game is playable, but there are still several ways we could improve it in the future.

First, for the tutorial, we could build 3D hand models to teach players how to perform each gesture, letting them easily understand and remember the correct gestures. We could also add 3D icons and effects, such as arrows and visual highlights, to show where cards can be placed and what each specific area means, for example by brightening the relevant spots.

Second, we could improve the projection of the play mat by letting the system scan any flat surface to host it. In the current version the play mat floats in the air; we could instead detect a flat plane, such as a table or the floor, and project the virtual play mat onto it.

Third, we could give players more ways to control the virtual monsters. The game currently provides only a few basic gestures. We hope to design more gestures, and we may add voice control to increase interaction with the monsters, making the player feel more engaged, as in the anime series. Some games already provide voice control, such as 8th note and ghost-exorcism titles, and we can study them to see whether voice control fits our game.

Fourth, because of time constraints we did not have enough time to make many monster models. As mentioned in the background, there are over a thousand cards in the game; we hope to build more virtual monsters in the future so that players can play with their favourite monsters.

Finally, this project targets only the Nreal device; we built around Nreal and could not test on other devices. We hope that in the future the game can run on other AR devices such as HoloLens. These are the improvement directions we identified in the project.
