# Setting up MirageXR
## Configure
Developers have to configure a few things before they can run the project in Unity play mode or compile apps for specific build targets. Additionally, there are optional (!) Corporate Design settings that are good to know about and that allow configuring different apps from the same build project.
The system utilises several APIs and packages that require API secrets and similar credentials to be configured, such as IBM Watson, Sketchfab, or Sentry. If these are not configured the way MirageXR expects, some or all functionality will not work properly. For example, if IBM Watson credentials are not provided, Character Augmentations cannot use AI mode, and app start will likely trigger some non-fatal errors.
> [!CAUTION]
> Don't forget to set the desired build target (at the moment preferably Android!).
## Clone
All good things start with cloning, so grab the link from the repository start page and clone into a local directory. If developing on Windows, remember that there is a 255-character limit on file paths, so you will want to clone into a directory high up in the hierarchy, if not the top level.
```
git clone git@github.com:WEKIT-ECS/MIRAGE-XR.git
```
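If long paths do still become a problem on Windows, git's long-path support can help as a workaround (a hedged suggestion: this is a standard git-for-windows setting, not something MirageXR-specific, and it also requires long paths to be enabled at the OS level):

```bash
# Optional workaround on Windows: let git handle paths longer than the
# legacy path-length limit (requires OS-level long path support too).
git config --global core.longpaths true
```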
## Install git lfs
Some files in the repository are very large, which requires git LFS support. Install LFS support globally with this command:
```
git lfs install
```
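If the repository was cloned before LFS support was installed, the large files can still be fetched afterwards from within the working copy (standard git LFS behaviour):

```bash
# Fetch and check out all LFS-tracked files for the current branch.
git lfs pull
```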
## Initialise git submodule
Some code has moved to a separate 'core engine' library, lib-lee (short for Learning Experience Engine library). This requires initialising the git submodule:
```
git submodule init
git submodule update
```
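Alternatively, cloning and submodule initialisation can be combined into one step (standard git behaviour, not something MirageXR-specific):

```bash
# Clone the repository and fetch the lib-lee submodule in one go.
git clone --recurse-submodules git@github.com:WEKIT-ECS/MIRAGE-XR.git
```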
## Right Unity version
You can look up which Unity version the project runs on by checking `ProjectSettings/ProjectVersion.txt`. No other version is acceptable; you need to install exactly this version through Unity Hub.
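For example, from the repository root the required version (the `m_EditorVersion` entry) can be printed like this:

```bash
# Print the Unity editor version the project was last opened with;
# install exactly this version via Unity Hub.
cat ProjectSettings/ProjectVersion.txt
```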
## API keys
> [!CAUTION]
> For all APIs to work when compiling, you will require the keys to be set up accordingly. If you have been invited to contribute, you should have received these keys as a set of files, or should have been granted access to the files.
The API keys need to be set up as follows (a consolidated shell sketch for placing all of the untracked files follows this list):
- Sketchfab: The API id and secret need to be stored. To see how to generate these, see the API documentation. You have to copy over three files you can obtain from the project team into the folder `/Assets/MirageXR/Common/Resources/Credentials` (which is in `.gitignore` and will not be synchronised!). The three files are: `SketchfabClient.asset`, `SketchfabClient.json`, and `SketchfabClientWithPassword.asset`. Subsequently, the `sketchfab data object` and the `sketchfab login with password data` object need to be assigned.
- new Sketchfab: place `SketchfabSettings.asset` into `Assets/Resources/`.
- IBM Watson: you need to put a non-tracked `ibm-credentials.env` file into the `root` folder to be able to build the natural language processing capabilities for the character models. You will need to add the API secret key credentials for the following IBM Watson services: Tone Analyzer, Assistant, Speech-to-Text, Text-to-Speech, and (optionally) the Language Translator. The file is listed below; just replace the `%KEY%` values with the corresponding API secrets for each service, provided by the IBM Watson dashboard. As of release 2.6 (develop branch from Feb 15 2024 onwards), we have phased out the Assistant service and replaced it with OpenAI ChatGPT support.
  ```
  TONE_ANALYZER_APIKEY=%KEY%
  TONE_ANALYZER_IAM_APIKEY=%KEY%
  TONE_ANALYZER_URL=https://api.eu-gb.tone-analyzer.watson.cloud.ibm.com/instances/5f989edc-233e-42a0-98c5-9e1cefcf36c5
  TONE_ANALYZER_AUTH_TYPE=iam
  ASSISTANT_APIKEY=%KEY%
  ASSISTANT_IAM_APIKEY=%KEY%
  ASSISTANT_URL=https://api.eu-gb.assistant.watson.cloud.ibm.com/instances/7ca75d0a-966f-4350-91b8-06fb0e882de5
  ASSISTANT_AUTH_TYPE=iam
  SPEECH_TO_TEXT_APIKEY=%KEY%
  SPEECH_TO_TEXT_IAM_APIKEY=%KEY%
  SPEECH_TO_TEXT_URL=https://api.eu-gb.speech-to-text.watson.cloud.ibm.com/instances/00f5f658-9d9d-4b76-9505-d07885513919
  SPEECH_TO_TEXT_AUTH_TYPE=iam
  TEXT_TO_SPEECH_APIKEY=%KEY%
  TEXT_TO_SPEECH_IAM_APIKEY=%KEY%
  TEXT_TO_SPEECH_URL=https://api.eu-gb.text-to-speech.watson.cloud.ibm.com/instances/a597ad6c-b7a1-479f-a592-4fde6d504b2a
  TEXT_TO_SPEECH_AUTH_TYPE=iam
  LANGUAGE_TRANSLATOR_APIKEY=%KEY%
  LANGUAGE_TRANSLATOR_IAM_APIKEY=%KEY%
  LANGUAGE_TRANSLATOR_URL=https://api.eu-gb.language-translator.watson.cloud.ibm.com/instances/a1d20d77-8dae-49db-9aa3-5604e9d07753
  LANGUAGE_TRANSLATOR_AUTH_TYPE=iam
  ```
- OpenAI: place the file `openai.txt` into the folder `Assets/Resources/`, containing the following:

  ```
  OPENAI_KEY=
  OPENAI_API_KEY=
  OPENAI_ORGANIZATION=
  ```
- AI_server: place the file `AI_Server.txt` into the folder `Assets/Resources/`, containing the following:

  ```
  AI_API_URL=http://91.107.198.4:8001/
  ```
- Sentry: place the `SentryOptions.asset` file into the folder `Assets/Resources/Sentry/` so that the settings are loaded correctly. If you are creating the Sentry configuration by hand, make sure you set a valid DSN and turn off debug logging, so that no in-editor error messages get tracked through to Sentry. Once the file has been added, you can check that its settings were correctly read by opening `Tools > Sentry`: if this shows a DSN, then Sentry is correctly configured.
- Sentry client token: place the file `SentryCliOptions.asset` into the folder `Assets/Plugins/Sentry/`. Depending on the scopes that were checked when generating it, the auth token could potentially be used for a lot, e.g., changing settings in the workspace or reading the error logs (see also https://github.com/expo/sentry-expo/issues/321#issuecomment-1464335081). We have therefore excluded this file via `.gitignore` as well, and you will have to place it as `Assets/Plugins/Sentry/SentryCliOptions.asset` manually. Once the file has been added, you can validate that its settings were correctly read by opening `Tools > Sentry` and checking that it shows an auth token.
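As mentioned above, the following is a minimal shell sketch for placing the untracked credential files, assuming you received them in a local folder (the `~/mirage-secrets` source directory is hypothetical; the destination paths are the ones listed above):

```bash
#!/usr/bin/env bash
# Hypothetical folder containing the credential files received from the project team.
SECRETS=~/mirage-secrets

# Run from the repository root.
mkdir -p Assets/MirageXR/Common/Resources/Credentials Assets/Resources/Sentry Assets/Plugins/Sentry

# Sketchfab (old and new integration)
cp "$SECRETS"/SketchfabClient.asset \
   "$SECRETS"/SketchfabClient.json \
   "$SECRETS"/SketchfabClientWithPassword.asset \
   Assets/MirageXR/Common/Resources/Credentials/
cp "$SECRETS"/SketchfabSettings.asset Assets/Resources/

# IBM Watson credentials go into the repository root.
cp "$SECRETS"/ibm-credentials.env .

# OpenAI and AI server configuration
cp "$SECRETS"/openai.txt "$SECRETS"/AI_Server.txt Assets/Resources/

# Sentry runtime options and CLI auth token
cp "$SECRETS"/SentryOptions.asset Assets/Resources/Sentry/
cp "$SECRETS"/SentryCliOptions.asset Assets/Plugins/Sentry/
```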
## Photon
If you do not have a license key for Photon Fusion and Photon Voice, then remove the four compiler directives `READY_PLAYER_ME`, `XRSHARED_ADDON_AVAILABLE`, `FUSION_WEAVER`, and `FUSION2` from the current/desired build target in `Edit > Project Settings > Player`.
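To quickly verify which scripting define symbols are set without opening the editor UI, you can inspect the text-serialised project settings from the shell (a sketch assuming Unity's standard project layout; the exact key layout may vary between Unity versions):

```bash
# Show the scripting define symbols per build target group;
# the four Photon-related defines should be absent after removal.
grep -A 10 "scriptingDefineSymbols" ProjectSettings/ProjectSettings.asset
```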