09 June Bee - molab-itp/content-2023-Fa GitHub Wiki
Note: some of the demo GIFs take some time to load, please be patient, thank you!
Please Visit this [Repo](https://github.com/junebee66/GEO3-App) for all documentation
Working on saving scanned models and viewing them as a list
This week I went to the Professor's office hours to set up my Firebase database for the app, and I also wrote a little pipeline to set a video in the background. However, I did receive an email saying that my Google API key was exposed on GitHub.
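One common way to avoid exposing the key again is to load it from a local plist that is listed in `.gitignore` instead of hard-coding it. This is just a sketch of that idea; the file name `Secrets.plist` and the key name `GOOGLE_API_KEY` are my own placeholders, not names from the project:

```swift
import Foundation

// Sketch: read an API key from a Secrets.plist that is kept out of git.
// File/key names here are hypothetical placeholders.
enum Secrets {
    static var googleAPIKey: String? {
        guard let url = Bundle.main.url(forResource: "Secrets", withExtension: "plist"),
              let data = try? Data(contentsOf: url),
              let plist = try? PropertyListSerialization.propertyList(from: data, format: nil),
              let dict = plist as? [String: Any]
        else { return nil }
        return dict["GOOGLE_API_KEY"] as? String
    }
}
```

After rotating the exposed key in the Google Cloud console, the app can call `Secrets.googleAPIKey` at runtime and the key never appears in the repo.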
![]()
This week I finally was able to access the image texture node in AR space and pass the UI image as image data into that space. I watched a YouTube tutorial to find a workaround. However, I couldn't find a better way to update the AR sphere's texture other than updating (reconstructing) the entire AR view. Another problem was that I couldn't update the second image generation or make the prompt input box as wide as the screen.
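If the sphere lives in a SceneKit-backed AR view, swapping the material contents in place may avoid rebuilding the whole view; this is a sketch under that assumption (a `sphereNode: SCNNode` already in the scene), not necessarily how the app is currently structured:

```swift
import SceneKit
import UIKit

// Sketch: update only the sphere's texture instead of reconstructing
// the AR view. SceneKit re-uploads the texture when the material's
// diffuse contents change, so no scene rebuild is needed.
func updateSphereTexture(on sphereNode: SCNNode, with image: UIImage) {
    sphereNode.geometry?.firstMaterial?.diffuse.contents = image
}
```

A RealityKit-based view would need a different path (regenerating the entity's material), which may be why reconstructing the view felt like the only option there.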

I planned to have the user generate their own 3D assets through AI and place them in the AR space, to re-envision and re-write stories that way.
I have written a website that converts a 2D image into a 3D model with a depth map. I will try to rewrite it in Swift next week.
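The core of the depth-map-to-3D conversion can be sketched in Swift as displacing a grid of vertices along z by the depth value; this is a minimal sketch of that step (building an `SCNGeometry` from the vertices would come next), not the website's actual implementation:

```swift
import simd

// Sketch: turn a W×H depth map (values 0…1, row-major) into a grid of
// 3D vertices by pushing each pixel out along z by its depth.
func vertices(fromDepth depth: [Float], width: Int, height: Int,
              zScale: Float = 1.0) -> [SIMD3<Float>] {
    var verts: [SIMD3<Float>] = []
    verts.reserveCapacity(width * height)
    for y in 0..<height {
        for x in 0..<width {
            let z = depth[y * width + x] * zScale
            // Center the grid around the origin, roughly -0.5…0.5 in x/y.
            let px = Float(x) / Float(max(width - 1, 1)) - 0.5
            let py = Float(y) / Float(max(height - 1, 1)) - 0.5
            verts.append(SIMD3<Float>(px, -py, z))
        }
    }
    return verts
}
```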

Next Steps
- depth map to 3D model
- spawning function
- Geography data to see the created assets
- export / sync the creation in AR with their VisionOS Space
I recently found this AI service called Pollinations.ai, and this week I tried writing it into my app. However, I've been having problems passing a UIImage as a regular image into the AR space.
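Fetching the generated picture into a `UIImage` first can make the AR hand-off easier to debug. This is a sketch; the URL scheme below is an assumption based on Pollinations.ai's public image endpoint, so check their docs before relying on it:

```swift
import UIKit

// Sketch: download an AI-generated image as a UIImage.
// The prompt-in-URL format is an assumption about the Pollinations.ai API.
func fetchGeneratedImage(prompt: String, completion: @escaping (UIImage?) -> Void) {
    let encoded = prompt.addingPercentEncoding(withAllowedCharacters: .urlPathAllowed) ?? prompt
    guard let url = URL(string: "https://image.pollinations.ai/prompt/\(encoded)") else {
        completion(nil)
        return
    }
    URLSession.shared.dataTask(with: url) { data, _, _ in
        let image = data.flatMap(UIImage.init(data:))
        DispatchQueue.main.async { completion(image) }
    }.resume()
}
```

Once the `UIImage` is in hand, it can be assigned as material contents (SceneKit) or converted to a texture resource (RealityKit) in a separate, isolated step.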
This is the Sample Code I used to start this project.
I followed Apple's tutorial to make some examples of LiDAR usage with ARKit: a collision test, a depth map viewer, an OBJ exporter, a scan with texture, and a point cloud code sketch.
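All of those examples share the same session setup: enabling mesh reconstruction and scene depth on LiDAR-capable devices. A minimal sketch of that configuration step (not the tutorial's exact code):

```swift
import ARKit

// Sketch: ARKit configuration behind the LiDAR demos.
// Mesh reconstruction drives the collision test / OBJ export;
// sceneDepth drives the depth map viewer.
func makeLiDARConfiguration() -> ARWorldTrackingConfiguration {
    let config = ARWorldTrackingConfiguration()
    if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
        config.sceneReconstruction = .mesh
    }
    if ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) {
        config.frameSemantics.insert(.sceneDepth)
    }
    return config
}
```

The capability checks matter because non-LiDAR devices will crash if `.mesh` reconstruction is requested unconditionally.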

I have to admit that this week has been one of the busiest so far this semester, so I didn't do as much Swift coding as I wished. I've been having trouble writing buttons that execute certain interactions in the 3D space from the ARKit documentation.
I feel like most of my problems every week are with iOS versions and simulator compatibility...
Reality Composer Missing
This week I spent a great amount of time going through Apple's documentation and Reddit threads to try to figure out why Reality Composer was not showing up in my Xcode.
It turned out that Apple removed Reality Composer / the 3D kit from Xcode 15, so I took some time to clean my Mac and download the older version. I used the following commands (for my own documentation, in case I forget them):
```shell
xcrun simctl runtime
xcrun simctl runtime add "~/Downloads/visionOS_1_beta_4_Simulator_Runtime.dmg"
```
VisionPro Missing
I realized that only Xcode 15 beta 8 has the visionOS Simulator. The latest Xcode 15 does not support development on MacBooks with Intel chips.
I made a little AR space where, if you tap on the fox, music starts to play. Text is also set to appear when tapping on the fox or the planets.
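The tap-to-play pattern can be sketched as a hit test plus a positional audio action; this assumes a SceneKit view, a node named `"fox"`, and a bundled `"music.mp3"` (all hypothetical names, not necessarily the ones in my scene):

```swift
import SceneKit
import UIKit

// Sketch: hit-test a tap and play audio from the tapped node.
// Node name and audio file name are placeholder assumptions.
@objc func handleTap(_ gesture: UITapGestureRecognizer) {
    guard let scnView = gesture.view as? SCNView else { return }
    let point = gesture.location(in: scnView)
    guard let hit = scnView.hitTest(point, options: nil).first else { return }
    if hit.node.name == "fox",
       let source = SCNAudioSource(fileNamed: "music.mp3") {
        source.loops = false
        source.load()
        // playAudio attaches the source to the node, so the sound is spatial.
        hit.node.runAction(SCNAction.playAudio(source, waitForCompletion: false))
    }
}
```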
I also made some little UI elements in visionOS.

I've been a little busy this week but was still able to catch up on the 100 Days of Swift to Day 15. I've also been exploring ARKit on my own to prepare for my final project. However, I've been having problems finding Reality Composer Pro under Xcode > Developer Tool, so I just hard-coded most things, which is (not gonna lie) a bit tedious..


Apple Forum
I found the SOLUTION!!! You can just follow the answer in this Stack Overflow post:
iOS 17.0 Simulator (21A328) Failed with HTTP status 400: bad request

These are the directions I went in for this assignment:
2D graphics > 2D graphics animation > 3D viewer > 3D in different colors > Generative 3D
I might be a little off topic for this assignment because I went down the rabbit hole of 3D. I was interested in what Swift's 3D geometry rendering is capable of, since I will be doing the AR and VR integration for the final project.
For people who are interested, I used SceneKit for this project.
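The generative-3D step boils down to procedurally adding geometry nodes to a SceneKit scene; here is a minimal sketch of that idea (randomly colored, randomly placed spheres), not the exact code from my sketch:

```swift
import SceneKit
import UIKit

// Sketch: a tiny "generative 3D" scene in SceneKit —
// scatter randomly sized, randomly colored spheres around the origin.
func makeGenerativeScene(count: Int = 50) -> SCNScene {
    let scene = SCNScene()
    for _ in 0..<count {
        let sphere = SCNSphere(radius: CGFloat.random(in: 0.05...0.2))
        sphere.firstMaterial?.diffuse.contents = UIColor(
            hue: .random(in: 0...1), saturation: 0.8, brightness: 0.9, alpha: 1)
        let node = SCNNode(geometry: sphere)
        node.position = SCNVector3(Float.random(in: -1...1),
                                   Float.random(in: -1...1),
                                   Float.random(in: -1...1))
        scene.rootNode.addChildNode(node)
    }
    return scene
}
```

Assigning the result to an `SCNView.scene` (with `allowsCameraControl = true`) is enough to orbit around the generated cluster.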

I'm planning on diving deeper into ARKit and visionOS data-transfer compatibility with iOS for future homework assignments.
I watched 100 Days of Swift up to Day 2 and started looking through the Swift look book for most of what I need to build the text art. This week has been a bit chaotic with my course registration, but I'll start spending more time on this in the following weeks.
⏰: 4hr in total
Uploaded to class wiki Forest & Rabbit

This might be weird, but my favorite app on my phone is my "Bank of America" app. It is the only app where I feel like the UI really makes sense and actually makes complicated tasks a lot easier.
- Mobile Integration for VR (Vision Pro)
- No Code for Vision OS