4. Implementation - laurafo/Augmented-Flight-AVR Wiki
Since both of us had almost no experience with Unity, we did several tutorials at the beginning to get familiar with the program. For this we installed Unity, Unity Hub and Visual Studio Code. In this phase we also tested transferring projects with GitHub Desktop, so that everything would work smoothly later for our augmented flight project. After this phase, we started to conceptualize a prototype of our idea in Figma.
a. First Prototype in Figma
It was especially important to us that the application can also be hidden, e.g. for passengers who just want to enjoy the view. For this purpose, there is a button with an arrow at the top center that shows or hides the application.
In the hidden state, the general information "Time to destination" and "Altitude" should always remain visible; the space around the hide/show button is perfectly suited for this. The main indicators of our application, however, are the pins that show what lies beneath the plane. Which pins appear depends on the current altitude: at normal cruising altitude, cities, rivers, etc. are shown, whereas during the landing approach, detailed information about the destination city appears, e.g. places of interest, beaches, etc. Points of interest well inside the field of view are shown together with their names; a point of interest at the edge of the field of view is marked with a pin, but without a name.

To provide passengers with even more information, our application is interactive: the airplane window is no longer just a normal window, it is a touchscreen. If passengers are interested in a pin, they can tap on it and a small window with more information opens. The pins are grouped into categories (cities, rivers, sights, etc.), so they can also be filtered. The filter, like the pins, is operated via touch.
For our project, we decided to use Dubai as an example city and Qatar Airways as an example airline. We therefore adopted Qatar Airways' purple color scheme and branding in our interface.
b. Vuforia / Unity / Xcode
To get started with our augmented flight project, we created a new 3D project in Unity. Since we are creating an iOS application for the tablet, we also use Vuforia. For this we downloaded Vuforia Engine 10.3 from https://developer.vuforia.com and added it to our Unity project.
Furthermore, we created a Basic license in the Vuforia Developer Portal under the License Manager. We also created a database under the Target Manager and uploaded various images there. Based on the rating, we could see very clearly which images are better suited for tracking; we therefore initially chose a target with four out of five stars. After that, we downloaded the database for the Unity Editor and added it to our Unity project.
Next, we created an AR camera in Unity (Hierarchy → right click → Vuforia Engine → AR Camera). We then added our license in the Inspector of the AR Camera under Vuforia Behaviour (Script): we copied the license key from the Vuforia Developer Portal and pasted it into Unity under the "Open Vuforia Engine configuration" button in the App License Key field.
In the next step, we recreated the canvas from our Figma prototype. For this we created a canvas in the Hierarchy (Hierarchy → right click → UI → Canvas) and chose the portrait format of an iPad for it. With a further right click on the canvas we created a panel that holds the individual elements of the menu. Since the panel is initially the same size as the canvas, we made it narrower, moved it to the top of the screen, and changed its color and transparency to match our prototype. The decorative strip at the bottom of our panel also consists of a UI panel to which we assigned a graphic: we created a line with a gradient in Adobe Photoshop and saved it as a PNG. In Unity, we created an "Images" folder under Assets and dragged the PNG into it. Then we changed the "Texture Type" of the image in the Inspector to "Sprite (2D and UI)" and applied the setting. Now we could assign the PNG to the panel (Inspector → Image → Source Image → Panel_Decor (our PNG)). After that we adjusted the size and position of the decorative strip.
Time to Destination & Altitude
Then we created the texts "Time to destination" and "Altitude" together with their respective values, and adjusted color, size and position as in our prototype. To ensure that the individual values do not remain static but change accordingly, we created two scripts (Time To Destination and Altitude Value) and wrote the appropriate code for them. We then assigned these scripts to the respective texts (Add Component → Time To Destination (Script) / Altitude Value (Script)).
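Since the original scripts are not shown in the wiki, here is a minimal sketch of what such a value-updating script could look like, assuming the remaining flight time is simply simulated rather than read from real flight data (field names and the starting value are our own assumptions):

```csharp
using UnityEngine;
using UnityEngine.UI;

// Sketch of a "Time To Destination" script: counts down a simulated
// remaining flight time and writes it into a UI Text each frame.
public class TimeToDestination : MonoBehaviour
{
    public Text timeText;                          // assigned in the Inspector
    public float remainingSeconds = 2.5f * 3600f;  // assumed start value

    void Update()
    {
        remainingSeconds = Mathf.Max(0f, remainingSeconds - Time.deltaTime);
        int hours = (int)(remainingSeconds / 3600f);
        int minutes = (int)(remainingSeconds % 3600f / 60f);
        timeText.text = string.Format("{0}:{1:00} h", hours, minutes);
    }
}
```

An Altitude Value script would follow the same pattern, only updating the altitude text instead of a countdown.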
In order to provide the best experience for passengers, we included a filter button. It allows passengers to filter by the five categories Activity, Hotel, Shopping, Sights and Buildings. To activate the filter, the filter button must first be pressed; the categories then appear. If a category is selected, only the corresponding pins are shown. For this we wrote two scripts, Filter Button Click Event and Filter Pins. Filter Button Click Event handles what happens when the filter button is clicked and also makes sure that the previous selection is cancelled when the filter is closed. Its WhenFilterSelected() method tints the icon of the respective category to indicate the selection. The second script, Filter Pins, is responsible for correctly displaying the pins of the chosen category.
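The described behavior could be sketched roughly as follows; the script and method names come from the text above, while the fields and the single-selection bookkeeping are our assumptions:

```csharp
using UnityEngine;
using UnityEngine.UI;

// Sketch of the Filter Button Click Event logic: opening/closing the
// filter panel, and marking exactly one selected category.
public class FilterButtonClickEvent : MonoBehaviour
{
    public GameObject filterOptions;   // panel with the five category buttons
    public Image[] categoryIcons;      // one icon per category, in order
    public Sprite activatedSprite;     // sprite shown for a selected category
    public Sprite deactivatedSprite;

    private int selectedCategory = -1;

    // Wired to the filter button under On Click ()
    public void ToggleFilter()
    {
        bool open = !filterOptions.activeSelf;
        filterOptions.SetActive(open);
        if (!open) ClearSelection();   // closing the filter cancels the selection
    }

    // Wired to each category button, with its index as parameter
    public void WhenFilterSelected(int category)
    {
        ClearSelection();
        selectedCategory = category;
        categoryIcons[category].sprite = activatedSprite;
    }

    private void ClearSelection()
    {
        foreach (var icon in categoryIcons) icon.sprite = deactivatedSprite;
        selectedCategory = -1;
    }
}
```

A separate Filter Pins component would then activate or deactivate the pin GameObjects belonging to the selected category.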
In the Inspector of the filter button, we added the three click events described above with their corresponding methods under On Click (), and assigned the corresponding objects to the different filter options. We then added the Filter Pins script to the individual objects and assigned the corresponding pins to the "Pin deactivated" variables. The WhenFilterSelected() method from the Filter Button Click Event script is also needed here: it assigns the corresponding sprites for the activated and deactivated states.
With filters, however, there is also the option of filtering not just by one category but by several at the same time. Since both variants have advantages and disadvantages, we included this question in our study. We asked the subjects which they liked better: 40% preferred selecting only one filter, while the other 60% preferred multiple selection. Based on these results, we decided to implement multiple filter selection.
The first step was to allow multiple categories to be selected at the same time; for this, a small change in the Filter Button Click Event script was sufficient. Correctly displaying the matching pins proved to be more difficult. During the last four days of the project we worked intensively on a solution, trying different approaches and adapting or changing the code accordingly. Unfortunately, we did not have enough time to find a satisfactory solution.
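One possible approach to the multiple-selection problem, which we describe here only as a hypothetical sketch (all names and the "empty set shows everything" rule are our assumptions, not the project's actual code), is to track the active categories in a set and recompute pin visibility whenever it changes:

```csharp
using System.Collections.Generic;
using UnityEngine;

// Hypothetical multiple-selection filter: a set of active category
// indices decides which pin groups stay visible.
public class MultiFilterPins : MonoBehaviour
{
    public GameObject[] activityPins, hotelPins, shoppingPins, sightsPins, buildingPins;

    private readonly HashSet<int> active = new HashSet<int>();

    // Wired to each category button, with its index as parameter
    public void ToggleCategory(int category)
    {
        if (!active.Remove(category)) active.Add(category);
        Refresh();
    }

    private void Refresh()
    {
        bool showAll = active.Count == 0;   // no filter selected: show everything
        SetGroup(activityPins, showAll || active.Contains(0));
        SetGroup(hotelPins,    showAll || active.Contains(1));
        SetGroup(shoppingPins, showAll || active.Contains(2));
        SetGroup(sightsPins,   showAll || active.Contains(3));
        SetGroup(buildingPins, showAll || active.Contains(4));
    }

    private static void SetGroup(GameObject[] pins, bool visible)
    {
        foreach (var pin in pins) pin.SetActive(visible);
    }
}
```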
The last element of our canvas is the Menu Button, which shows and hides the Augmented Flight application. For this we created a button in our canvas and assigned a graphic to it. Then, in the button's Inspector, we changed the button transition to Sprite Swap. We also wrote a script that changes the button's graphic when clicked and fades our panel in and out. We assigned this script to our Menu Button, selected the various methods under On Click (), and added the appropriate graphics under Button Sprites.
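A minimal sketch of such a toggle script might look like this; using a CanvasGroup for the fade is our assumption, the original script may hide the panel differently:

```csharp
using UnityEngine;
using UnityEngine.UI;

// Sketch of the Menu Button script: swaps the arrow graphic and
// fades the panel in and out via a CanvasGroup.
public class MenuButtonToggle : MonoBehaviour
{
    public CanvasGroup panel;           // the menu panel to fade
    public Image buttonImage;           // the menu button's image
    public Sprite arrowUp, arrowDown;   // "Button Sprites" from the Inspector

    private bool visible = true;

    // Wired to the Menu Button under On Click ()
    public void OnMenuButtonClicked()
    {
        visible = !visible;
        panel.alpha = visible ? 1f : 0f;
        panel.blocksRaycasts = visible;   // ignore touches while hidden
        buttonImage.sprite = visible ? arrowUp : arrowDown;
    }
}
```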
Tracking & Image Targets
To illustrate the tracking, we decided to use four images of Dubai; as a very diverse city, it offers the perfect example. The images are already in our Vuforia database and can now be embedded in Unity via Image Targets. For this we created four Image Targets in the Hierarchy (Hierarchy → right click → Vuforia Engine → Image Target). Then we changed the type in the Inspector to "From Database" and selected our database and the corresponding image. For a better overview we placed the Image Targets next to each other.
The following steps were carried out for all four Image Targets: In the Hierarchy, we created a canvas under the Image Target, rotated it by 90 degrees on the X axis, positioned it slightly above the image, and made it slightly larger than the Image Target. In the Inspector of the canvas we changed the Render Mode from Screen Space - Overlay to World Space and made the AR Camera the Event Camera. Then we picked out different buildings, landmarks and more on each picture and created pins for them. A pin consists of a pin circle, a pin line and a pin label. We created a button for each, inserted the corresponding graphic and adjusted the size and position. The graphic for the pin label has a different icon depending on the category and shows the name of the hotel, sight, etc.

To provide more information about each pin, we made the pin label interactive: when the Pin Label button is clicked, it expands and shows more information. For this we wrote the script Pin_OpenClose with the method whenPinClicked(), which makes the pin show more or less information. In the Inspector of the pin label we added this script as a component and selected the corresponding method under On Click (). If you now start the game in the Game view and hold one of the four images in front of the camera, the pins are displayed in the right places and move along when the image is moved.
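The script and method names Pin_OpenClose and whenPinClicked() come from the text above; the internal logic shown here is a sketch, assuming the expanded information is a child object of the pin label that is simply toggled:

```csharp
using UnityEngine;

// Sketch of Pin_OpenClose: toggles the pin's "more information"
// window each time the pin label is clicked.
public class Pin_OpenClose : MonoBehaviour
{
    public GameObject infoWindow;   // expanded detail panel, hidden by default

    // Wired to the Pin Label button under On Click ()
    public void whenPinClicked()
    {
        infoWindow.SetActive(!infoWindow.activeSelf);
    }
}
```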
(All graphics from the canvas and the pins were created in Adobe Photoshop by ourselves.)
Main Interaction Script
The Main_Interaction script was created while we were trying to solve a problem with the AR camera. Through several tutorials and further research, however, we solved the problem in a different way. We did not delete the script for now, as it might come in handy later.
To get our Augmented Flight application onto the iPad, we first installed Xcode. Via the "File" menu in Unity, we changed the platform to iOS under Build Settings. Under Player Settings, we then set the app logo and the start screen of the app. In addition, we set the target device to iPad and the target minimum iOS version to 12.0, since we would otherwise get an error message in Xcode.
After that, we used the Build and Run button to export our project to Xcode. In Xcode we could then select the iPad, check the box Automatically manage signing under Unity-iPhone → Signing & Capabilities, and select our own team. We then clicked the Play button and the app was installed on the iPad.