School of Science and Technology 科技學院
Computing Programmes 電腦學系

VR Driving School

Mak Ka Wai, Cheng Pak Yeung, Leung Yiu Man

Programme Bachelor of Computing with Honours in Computing and Interactive Entertainment
Supervisor Dr. William Lai, Dr. Keith Lee
Areas Virtual Reality and Augmented Reality Applications
Year of Completion 2021

Objectives

The aim of this project is to develop software that provides tutorials to users who wish to learn driving and enhances their sense of safety on the road. It also aims to relieve the pressure caused by the shortage of driving teachers, which COVID-19 has made more acute.

We have revised the objectives for the project. The third objective changed the most: after a team discussion, we adopted a more detailed set of rules for judging users' behavior and decided to implement it as a rule-based system.

  1. Provide an eye-tracking function
    • Use a VR headset with an eye-tracking function to track eyeball movement.
  2. Record the movement of the eyeball
    • One of our primary objectives is to record eyeball activity for data collection.
  3. Use a rule-based judgment system
    • Add rule-based judgments to the game system and use them to identify the mistakes students make while learning to drive, so as to improve their driving habits.
  4. Use the Hong Kong street view as one of the backgrounds
    • Reproduce a realistic Hong Kong street view (localization) in the simulation.
  5. Develop a virtual environment for VR simulation
    • Use the virtual world to project and simulate the Hong Kong environment.
  6. Evaluate the effectiveness of the software
    • Evaluate whether the software is as effective as a real driving school's teachers.

Methodologies and Technologies

We will build a virtual world using the Unity or Unreal engine. To communicate with the VR headset and its eye-tracking function, we will follow the manufacturer's manual and write the necessary scripts.

We will also record the data from the VR headset and store it in a file or database.
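
As a concrete illustration of this recording step, below is a minimal C# (Unity) sketch of a per-frame gaze logger writing to a CSV file. The class and method names are our own placeholders, not part of any headset SDK; the real project would feed it the gaze ray reported by the manufacturer's API.

// Minimal sketch of per-session gaze logging (illustrative names, not SDK API).
using System.IO;
using UnityEngine;

public class GazeLogger : MonoBehaviour
{
    private StreamWriter writer;

    void Start()
    {
        // One CSV file per session; persistentDataPath is writable on all platforms.
        string path = Path.Combine(Application.persistentDataPath, "gaze_log.csv");
        writer = new StreamWriter(path, append: false);
        writer.WriteLine("time,origin_x,origin_y,origin_z,dir_x,dir_y,dir_z");
    }

    // Called by the eye-tracking script with the latest gaze ray.
    public void Record(Vector3 origin, Vector3 direction)
    {
        writer.WriteLine($"{Time.time},{origin.x},{origin.y},{origin.z},{direction.x},{direction.y},{direction.z}");
    }

    void OnDestroy()
    {
        writer?.Close();
    }
}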

Start Menu

This is the start menu. Users see it when the game starts, and from here they can choose the level they want, edit settings, or view the controller button placement.

Figure 1: Start Menu UI

VR with Eye-Tracking

For VR with eye tracking, we created an eye-tracking script that detects the direction the user is looking. When the user looks at a road sign or the vehicle panel board, it turns red to show where the user's gaze lands.
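
A minimal sketch of this gaze-highlight behavior follows. It assumes the gaze ray can be approximated by the headset camera's forward vector and that highlightable objects carry a "RoadSign" tag; both are illustrative stand-ins for the actual SDK gaze ray and scene setup.

// Sketch: cast a ray along the gaze and tint the object it hits.
using UnityEngine;

public class GazeHighlighter : MonoBehaviour
{
    private Renderer lastHit;
    private Color lastColor;

    void Update()
    {
        RestoreLast();
        // Camera.main stands in for the SDK's gaze ray in this sketch.
        Ray gaze = new Ray(Camera.main.transform.position, Camera.main.transform.forward);
        if (Physics.Raycast(gaze, out RaycastHit hit, 100f) && hit.collider.CompareTag("RoadSign"))
        {
            lastHit = hit.collider.GetComponent<Renderer>();
            if (lastHit != null)
            {
                lastColor = lastHit.material.color;
                lastHit.material.color = Color.red;   // show where the user is looking
            }
        }
    }

    void RestoreLast()
    {
        if (lastHit != null)
        {
            lastHit.material.color = lastColor;
            lastHit = null;
        }
    }
}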

Figure 2: Player Screen inside vehicle

Vehicle Setting (Steering Wheel Controller, Tuning and Mirrors)

For the vehicle setting and tuning, we re-adjusted the entire vehicle system. First, we adjusted the position of the shifter so that it presents the different gears we want. Second, we re-tuned the force feedback of the steering wheel to bring it closer to realistic feedback. We also adjusted the vehicle throttle and brakes to match the response of a real car.

For the vehicle mirrors, we separated the car mirror meshes and used "Render to Texture" with dedicated cameras, so that the horizontally inverted rear-view mirror reflects the scene for the user.
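
A minimal sketch of the render-to-texture mirror idea, assuming one camera per mirror aimed along the mirror's view direction; the texture resolution and the flip via negative tiling are illustrative choices, not necessarily the project's exact setup.

// Sketch: a camera renders into a RenderTexture shown on the mirror mesh.
using UnityEngine;

public class MirrorSurface : MonoBehaviour
{
    public Camera mirrorCamera;   // placed where the mirror "looks"

    void Start()
    {
        var rt = new RenderTexture(512, 512, 16);
        mirrorCamera.targetTexture = rt;

        var mat = GetComponent<Renderer>().material;
        mat.mainTexture = rt;
        mat.mainTextureScale = new Vector2(-1f, 1f);   // horizontal flip for a mirror image
        mat.mainTextureOffset = new Vector2(1f, 0f);   // keep the flipped texture in frame
    }
}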

Figure 3: Player Screen with vehicle components

Hong Kong Street View

To make the player feel immersed, the simulation must be close to the real environment so that users can learn to drive as they would in reality. We therefore built a virtual 3D world based on the real Ho Man Tin street view for users to practice and test in. In this virtual world we constructed many realistic details, such as buildings, pavement markers, traffic lights, road signs, road props, street lights, parked vehicles, and trees. These details are described below.

Figure 4: Chung Hau Street

Hong Kong Street View – Road base

For the road base, since Ho Man Tin is a valley, the roads all vary in height, which made this scene quite difficult to build. To construct terrain similar to reality, we referred to the elevation data provided by Google Earth and Google Maps while building the entire scene.

Figure 5: Road Simulation

The road base is also created and controlled by dots: each dot controls the position of the road at that point, which lets us shape the road base to match reality more closely.
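
The exact curve through these dots is not specified here, but such control points are typically interpolated by a spline. A sketch under that assumption, sampling a Catmull-Rom curve through hand-placed points to obtain a smooth road centerline whose height follows the terrain:

// Sketch: Catmull-Rom interpolation through hand-placed control dots.
using UnityEngine;

public class RoadCenterline : MonoBehaviour
{
    public Transform[] dots;   // control points placed by hand in the editor

    // Evaluate the Catmull-Rom segment between dots[i] and dots[i+1] at t in [0,1].
    public Vector3 Evaluate(int i, float t)
    {
        Vector3 p0 = dots[Mathf.Max(i - 1, 0)].position;
        Vector3 p1 = dots[i].position;
        Vector3 p2 = dots[i + 1].position;
        Vector3 p3 = dots[Mathf.Min(i + 2, dots.Length - 1)].position;

        return 0.5f * ((2f * p1)
            + (-p0 + p2) * t
            + (2f * p0 - 5f * p1 + 4f * p2 - p3) * t * t
            + (-p0 + 3f * p1 - 3f * p2 + p3) * t * t * t);
    }
}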

Figure 6: Road Simulation

Hong Kong Street View – Pavement markers

To make the pavement markers in the scene match the real ones, we used Google Maps to locate where each marker belongs, making the markers more realistic. We downloaded similar glyphs and used Photoshop to build markers such as "BUS STOP 巴士站", "KEEP CLEAR 請勿停車", "SLOW 慢駛", "LOOK LEFT 望左", and "LOOK RIGHT 望右". We also created an alpha channel for these markers so that, using the alpha clipping function in a Unity material, we could apply them to a Unity plane. Because the scene contains a large number of pavement markers, placing them all also took a lot of time.
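
For reference, a minimal sketch of enabling alpha clipping on such a marker plane using Unity's built-in Standard shader; the project's actual render pipeline and shader setup may differ.

// Sketch: cutout material for a pavement-marker plane (Standard shader).
using UnityEngine;

public class PavementMarker : MonoBehaviour
{
    public Texture2D markerTexture;   // RGBA texture authored in Photoshop

    void Start()
    {
        var mat = GetComponent<Renderer>().material;
        mat.shader = Shader.Find("Standard");
        mat.mainTexture = markerTexture;
        mat.SetFloat("_Cutoff", 0.5f);         // alpha clip threshold
        mat.EnableKeyword("_ALPHATEST_ON");    // enable cutout rendering
    }
}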

Figure 7: Road with Pavement markers

Hong Kong Street View – Building

We referred to Google Earth and Google Maps to adjust the location and scale of the buildings to make them consistent with their actual proportions. All of the building images were captured from Google Maps or downloaded from the Internet, and we used Photoshop to adjust the textures to complete the construction of these buildings.

Figure 8: Building Simulation

Hong Kong Street View – Oi Man Estate Bus Terminus

We built the Oi Man Estate Bus Terminus by hand, modeling it in Blender so that it matches the real Oi Man Estate Bus Terminus.

Figure 9: Bus Stops Simulation

Hong Kong Street View – Traffic lights

In this scene, all of the traffic lights follow the traffic system and automatically cycle through red, green, and yellow in an orderly manner.
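
A minimal sketch of such an automatic cycle as a Unity coroutine. The lights are represented here as three Light components and the phase durations are illustrative; the project's traffic light manager may instead drive emissive materials.

// Sketch: endless red -> green -> yellow cycle as a coroutine.
using System.Collections;
using UnityEngine;

public class TrafficLightCycle : MonoBehaviour
{
    public Light redLight, yellowLight, greenLight;

    IEnumerator Start()
    {
        while (true)
        {
            yield return Phase(redLight, 10f);     // durations are placeholders
            yield return Phase(greenLight, 10f);
            yield return Phase(yellowLight, 3f);
        }
    }

    IEnumerator Phase(Light active, float seconds)
    {
        redLight.enabled = active == redLight;
        yellowLight.enabled = active == yellowLight;
        greenLight.enabled = active == greenLight;
        yield return new WaitForSeconds(seconds);
    }
}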

Figure 10: Traffic Light Simulation

Hong Kong Street View – Road sign

We built all of the road signs ourselves: we captured sign textures from the Internet and combined them into the Ho Man Tin road signs, including the Give Way sign, right-direction signs, the MTR sign, and so on. We also added the eye-tracking script to them, so a road sign turns red when the user looks at it.

Figure 11: Road Signs Simulation

Hong Kong Street View – Road prop

We used many different road props to lay out the scene and reproduce the details of the real world, bringing the whole scene close to reality.

Figure 12: Road Props Simulation

Pedestrians System

The pedestrian system places more than 300 pedestrians in the entire scene. As you can see in the scene, the pedestrians have four types of behavior: walking, running, talking, and audience (standing and watching). Each behavior has its own system and variables. Based on these four behaviors, pedestrians are assigned to suitable places: for example, some use audience mode and line up near the bus stop, while others use talking mode around the plaza and the university.
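
A minimal sketch of per-pedestrian behavior dispatch, assuming each pedestrian carries a NavMeshAgent and an Animator; the speeds and animation state names are placeholders rather than the project's actual values.

// Sketch: assign movement and animation per behavior mode.
using UnityEngine;
using UnityEngine.AI;

public class Pedestrian : MonoBehaviour
{
    public enum Mode { Walking, Running, Talking, Audience }
    public Mode mode;

    void Start()
    {
        var agent = GetComponent<NavMeshAgent>();
        var animator = GetComponent<Animator>();

        switch (mode)
        {
            case Mode.Walking:  agent.speed = 1.4f; animator.Play("Walk"); break;
            case Mode.Running:  agent.speed = 3.5f; animator.Play("Run");  break;
            case Mode.Talking:  agent.isStopped = true; animator.Play("Talk"); break;
            case Mode.Audience: agent.isStopped = true; animator.Play("Idle"); break;
        }
    }
}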

Figure 13: Pedestrians System Simulation

Traffic System

The traffic system is mainly divided into three parts: the waypoint system, the simple traffic system tool installed in the traffic system, and the relationship between routes and waypoints.

To begin with, each AI car mainly follows the waypoints, and the waypoints are ordered by route. There are more than 200 waypoints in the scene. Each waypoint has its own variables: the parent route, the AI car speed limit, connecting route points, lane change points, and yield trigger points. Each AI car's speed follows the speed limit of its current waypoint. Connecting route points and lane change points are explained in the second part.

Second, the simple traffic system tool installed in the traffic system creates the traffic controller, traffic light manager, traffic stop manager, and spline route connector. In configure mode, the tool can also create lane change points, connecting route points, spawn points, and yield trigger areas. A lane change point is a waypoint that lets an AI car change lanes, and a connecting route point links waypoints to the waypoints of another route. In addition, a yield trigger area makes an AI car stop automatically when it detects other vehicles inside the area.
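
A minimal sketch of the yield-trigger idea using a Unity trigger collider; the "Vehicle" tag and the counting scheme are illustrative, since the actual tool implements this internally.

// Sketch: halt AI cars while any vehicle remains inside the trigger volume.
using UnityEngine;

public class YieldTrigger : MonoBehaviour
{
    private int vehiclesInside;

    public bool ShouldYield => vehiclesInside > 0;

    void OnTriggerEnter(Collider other)
    {
        if (other.CompareTag("Vehicle")) vehiclesInside++;
    }

    void OnTriggerExit(Collider other)
    {
        if (other.CompareTag("Vehicle")) vehiclesInside--;
    }
}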

Finally, regarding the relationship between routes and waypoints: all waypoints are sorted according to the requirements of the route. For example, ten waypoints arranged in a straight line form one route. A route can also create spawn points for AI cars at each waypoint, as sketched below.
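
A minimal sketch of waypoint following under these definitions. The project uses the traffic system tool for this; the code below only illustrates the underlying idea, with the 2 m arrival radius and the kinematic movement as simplifying assumptions.

// Sketch: an AI car steers toward the next waypoint on its route and
// inherits that waypoint's speed limit.
using UnityEngine;

public class Waypoint : MonoBehaviour
{
    public float speedLimit = 13.9f;   // meters per second (about 50 km/h)
    public Waypoint next;              // the following point on the parent route
}

public class WaypointFollower : MonoBehaviour
{
    public Waypoint current;

    void Update()
    {
        if (current == null) return;

        Vector3 toTarget = current.transform.position - transform.position;
        if (toTarget.magnitude < 2f)       // close enough: advance along the route
        {
            current = current.next;
            return;
        }
        transform.rotation = Quaternion.LookRotation(toTarget.normalized);
        transform.position += transform.forward * (current.speedLimit * Time.deltaTime);
    }
}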

Figure 14: Traffic System Simulation

Figure 15: Traffic System Simulation

For the signal connector of the traffic system tools in configure mode, clicking "Load Light & Routes" displays the traffic lights and route points; in the picture below, a red line connects a traffic light to its route. An AI car stops at the last waypoint of the route when that traffic light is red.

Figure 16: Signal Connector System Simulation

Sound Effect

The sound effects in this project can be divided into five parts: AI car sound effects, pedestrian sound effects, player vehicle sound effects, start menu and pause menu effects, and the tutorial mode voice navigation.

To begin with, the sound effects of AI vehicles mainly consist of engine sounds. Each engine sound is placed in the vicinity of its AI car and is only triggered when the player comes close, simulating the engine sound of a real vehicle.
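
A minimal sketch of this proximity trigger, assuming one looping AudioSource per AI car; the 30 m audible radius is an illustrative value.

// Sketch: play the engine loop only while the player is nearby.
using UnityEngine;

public class EngineSound : MonoBehaviour
{
    public Transform player;
    public float audibleRadius = 30f;

    private AudioSource engineLoop;

    void Start()
    {
        engineLoop = GetComponent<AudioSource>();
        engineLoop.loop = true;
        engineLoop.spatialBlend = 1f;   // fully 3D, so volume falls off with distance
    }

    void Update()
    {
        bool close = Vector3.Distance(player.position, transform.position) < audibleRadius;
        if (close && !engineLoop.isPlaying) engineLoop.Play();
        else if (!close && engineLoop.isPlaying) engineLoop.Stop();
    }
}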

Figure 17: AI Car Sound Effect

Second, the pedestrian sound effects mainly consist of talking and walking sounds. They are set up in the same way as the AI car engine sounds and simulate pedestrians walking or talking in real life.

Figure 18: Talking pedestrian will have a shared sound effect

Third, the player vehicle sound effects cover the engine sound, blinker sound, horn sound, and brake hiss. These are the defaults provided by the "NWH Vehicle Physics 2" system and simulate the sounds of a real vehicle.

The fourth part covers the start menu and pause menu UI, which is divided into a select sound and an enter-level sound. The enter-level sound uses the button's on-click function, while the select sound uses button input detection.
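
A minimal sketch of wiring these two UI sounds in Unity, with the pointer-enter event standing in for the "button input detection" described above; the clip assignments are placeholders.

// Sketch: click sound via Button.onClick, select sound via pointer enter.
using UnityEngine;
using UnityEngine.EventSystems;
using UnityEngine.UI;

public class MenuButtonSound : MonoBehaviour, IPointerEnterHandler
{
    public AudioSource audioSource;
    public AudioClip selectClip, enterLevelClip;

    void Start()
    {
        GetComponent<Button>().onClick.AddListener(() => audioSource.PlayOneShot(enterLevelClip));
    }

    public void OnPointerEnter(PointerEventData eventData)
    {
        audioSource.PlayOneShot(selectClip);
    }
}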

Evaluation

Evaluation Methods and Design

Due to the impact of COVID-19, we could only conduct the experiment with five people. To test the eye-tracking accuracy inside VR, we borrowed an HTC VIVE Pro Eye from the school. Before the experiment, we performed eye calibration for each tester. Testers then wore the VR headset and watched three designated targets, placed at one meter, three meters, and five meters, to verify the accuracy of eye tracking. For each target, testers stared at the center circle for five seconds, then at the second inner circle for five seconds. After that, they scanned the black-and-white grids three times, looking at each one clockwise, and then the outer red-and-grey grids three times, also clockwise. They repeated these steps for every target.
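
As a guide to reading Table 1 (this is our assumed interpretation, not stated explicitly in the logs), each value can be read as the fraction of the five-second stare during which the detected gaze lay on the target:

  accuracy = t_on-target / t_stare

where t_on-target is the time the gaze ray hit the target and t_stare is the five-second staring window.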
Measurement point   Tester 1   Tester 2   Tester 3   Mean   Median   Mode
1st circle (1 m)    0.99       0.87       0.87       0.91   0.87     0.87
2nd circle (1 m)    0.10       0.67       0.93       0.57   0.67     0.10
3rd circle (1 m)    0.58       0.81       0.88       0.76   0.81     0.58
4th circle (1 m)    0.56       0.54       0.52       0.54   0.54     0.56
1st circle (3 m)    0.80       0.95       1.00       0.92   0.95     0.80
2nd circle (3 m)    0.53       0.79       0.87       0.73   0.79     0.53
3rd circle (3 m)    0.65       0.96       0.88       0.83   0.88     0.65
4th circle (3 m)    0.50       0.40       0.56       0.49   0.50     0.50
1st circle (5 m)    0.81       0.86       0.86       0.84   0.86     0.86
2nd circle (5 m)    0.62       0.98       0.45       0.68   0.62     0.62
3rd circle (5 m)    0.63       0.85       0.90       0.79   0.85     0.63
4th circle (5 m)    0.50       0.46       0.52       0.49   0.50     0.50

(Values are proportions out of 1; multiply by 100 for percentages.)

Table 1: VR eye-tracking dataset from testers

                  First and Second Circle Accuracy   Third and Fourth Grid Circle Accuracy
Tester 1          64.11%                             56.94%
Tester 2          85.61%                             67.01%
Tester 3          83.17%                             70.83%
Avg. percentage   77.62%                             64.93%

Table 2: Accuracy percentage of each section

For the first tester, it is strange that accuracy was only 10% while she stared at the second inner circle of the one-meter target. Even so, her accuracy across the first and second inner circles is over 60%, and her accuracy in the third and fourth grid circles is around 57%.

The first tester commented that the headset did not detect her gaze correctly while she was looking at the fourth grid circle at both ranges.

For the second tester, the data show that accuracy in the first and second inner circles increased significantly to around 85%, and accuracy in the third and fourth grid circles rose to around 67%.

The second tester mentioned the same problem as the first tester. He also commented that the detection would sometimes suddenly jump or skip to a nearby grid he was not staring at.

For the third tester, the data show that accuracy in the first and second inner circles dropped slightly to around 83%, while overall accuracy in the third and fourth grid circles increased slightly to around 70%.

The third tester noted that he had difficulty looking at the outer circle of the one-meter and three-meter targets, as it sits near the edge of the display area. He added that the five-meter target is more centered in the display area, so he could look at the fourth circle more easily. Even though his accuracy was the best overall, he complained that eye detection in the third and fourth grid circles was still not good enough.

The fourth and fifth testers were tested at an earlier time, when no recording was made. Their results were almost the same as the first tester's, and both found detection difficult in the fourth grid circle.

In conclusion, the overall accuracy is acceptable for both circle groups. The results indicate that objects are detected best when the object or hitbox is large and centered in the display. Despite the lower accuracy near the edge of the display area, this is still acceptable as a limitation of the VR headset. We also noted that eye calibration is essential for each tester, since facial and eye structures differ: during the first tester's calibration, the calibration program required a significant change of lens length to fit her eyes, while the other testers needed calibration with smaller adjustments.

Conclusion

In summary, our project is able to use the eye-tracking function of the VR headset to detect objects in the Unity engine. Moreover, based on our dataset, experiments, and other literature references, the accuracy of the eye-tracking function inside the VR headset is above average, which means it is as usable as a traditional projection-screen setup with an eye-tracking module.

Our project is also designed to use a rule-based judgment system to check whether the user is looking and driving correctly. The tutorial mode is a good example of how we check what the user is doing during teaching.

Concerning the virtual environment, our project references the Ho Man Tin street view for the exam route "Chung Yee Street Exam No Two". Our project has thus assembled a virtual environment modeled on Ho Man Tin, ready for VR simulation.

To assess the effectiveness of the software, we invited at least fifteen testers, including experienced drivers and people with no driving experience (no prior knowledge and never having attended a driving school), to evaluate our project. The results showed that at least 50% of them were satisfied with our product.

Future Development

Although our project aim and objectives have been satisfied, the project is not perfect in every aspect. For example, the virtual environment is not as realistic as real life, because building 3D models takes time. Moreover, with a tight schedule we had to focus primarily on functionality, so our user interface and 3D models are not as polished as those of existing VR driving games. In addition, because the exam mode was not yet complete, we could not test it when we invited testers to use our project.

Regarding future work, we would like to finalize the exam mode first, because it plays one of the most important roles in our project. After that, we would like to improve the user interface, as we know the start menu and pause menu do not display very well. Furthermore, we have learned that some testers cannot find the buttons on the steering wheel controller while wearing the VR headset; we would like to address this by rendering a gearbox handle and a steering wheel with buttons inside the game view. We are also seeking an opportunity to develop the project further after we graduate, so we are participating in the OpenInno Challenge held by the Student Affairs Office of Hong Kong Metropolitan University. We will try our best to win the competition and obtain funding for further software and business development.
