Editing Questions to build FAQ
Go ahead and edit / paste questions below, along with the channel each was posted in:
Channel: General
Q: How do you set the Asset Processor to only run a certain number of jobs?
A: The file that sets the minimum and maximum number of jobs for the Asset Processor is located at:
`<engine-root>/Registry/AssetProcessorPlatformConfig.setreg`
You can also override it by placing the following JSON section in your own *.setreg file:
```json
{
    "Amazon": {
        "AssetProcessor": {
            "Settings": {
                "Jobs": {
                    "minJobs": 1,
                    "maxJobs": <Change This Value>
                }
            }
        }
    }
}
```
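As a minimal worked example (the file name and job count below are illustrative, not a required convention), the override could be saved as `<project-root>/Registry/AssetProcessorJobs.setreg` to cap the Asset Processor at 4 concurrent jobs:

```json
{
    "Amazon": {
        "AssetProcessor": {
            "Settings": {
                "Jobs": {
                    "minJobs": 1,
                    "maxJobs": 4
                }
            }
        }
    }
}
```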
Q: Is O3DE the open-source version of Lumberyard?
A: No. It's a complete rewrite with a focus on modularity - though it brings along many of the best features of Lumberyard.
Q: Why release a new engine when there's already Godot or another open source engine?
A: O3DE was built from the ground up to be familiar to AAA developers and to creatives in other industries looking to leverage a modular engine with cloud infrastructure plug-ins as part of their development workflows. The type of complete rewrite needed to achieve those goals would have been extremely disruptive to another open source project. In the end, the more open source there is, the better for all of us.
Q: Is XYZ feature supported?
A: You can check out the full user guide here: https://o3de.org/docs/user-guide/
If you don't see your feature listed, open an RFC on GitHub to get the discussion started!
Channel: sig-platform
Q: Why is there no Linux version if O3DE is part of the Linux Foundation?
A: There is a Linux version, but we need your help to get it running on as many systems as possible. While the focus has been primarily on the O3DE Linux Server, the O3DE roadmap includes a checklist of vital Linux functionality.
Channel: sig-release
Q: Which target platforms (PS, Xbox, Switch, Mac, Linux etc.) are supported?
A: The current release is a Developer Preview and is focused on Windows PC. Future platform information will be released at a later date.
Channel: General
Q: Is there large world support in O3DE?
A: Several of the foundational pieces needed are available, such as asset streaming and build-time processing of prefabs, but out of the box there is no functionality yet that ties these pieces together to support large worlds.
Q: Is there work going on towards large world support?
A: It's a feature that has been requested, and on the AWS side there are several groups eager to work on parts of this, such as large-scale terrain, so there are definitely plans. Whether those plans get worked on in the near future is something the community will now drive, though.
Q: So on the topic of networking. Is there a recommended player or user cap for the dedicated server?
A: We had an internal project running 75 players on a single server with very little in the way of optimizations. It was actually running a 4-way cluster where two instances were for players and the other two instances load-balanced several thousand AI, for a total of 150 players and several thousand AI (I forget the exact number). The multi-server tech is not inside O3DE at the moment, so I can confidently state that 75 should be easily achievable. We made a lot of optimizations while bringing that tech into O3DE itself; when multi-server is fully up and running again, I think we could go beyond that by some orders of magnitude and break 10K players.
Channel: General
Q: What is the process to install and open a project?
A: The process to install the engine is described [here](https://o3de.org/docs/welcome-guide/setup/), and once it's installed, the process to create a project is described [here](https://o3de.org/docs/welcome-guide/get-started/).
Channel: General
Q: Are there any plans for a custom audio engine?
A: Not at this time, but Wwise is supported, and anyone who wants to build audio tools or features for O3DE is welcome to help.
Channel: General / sig-build
Q: What are all the operating systems you can build for?
A: As of this time, the supported operating systems are Windows, Linux, macOS, iOS, and Android, but support is limited until work on some platforms is finished.
Channel: General
Q: Is this just Amazon Lumberyard re-branded?
A: No, you can think of it as Lumberyard 2.0, but in fact it is its own engine.
Channel: General / sig-build / sig-platform
Q: How do I download the engine?
A: The best way is to go [here](https://o3de.org/docs/welcome-guide/setup/) and follow the instructions in the docs.
Channel: sig-content
Q: Are there any functionality or optimization plans for Script Canvas? Its debugging ability is weak; is there a plan to improve it?
A: Script Canvas in O3DE is on average about 3 times faster than it was in the previous version of Lumberyard. It was changed so that when the Asset Processor works on a Script Canvas graph, it produces Lua code, which is what the runtime uses.
Channel: sig-core
Q: Can entities and their components be streamed across networks or is there already a generic serialization mechanism in place to do this?
A: Networking is now a fully streamed solution. Clients load only a fraction of the actual level upon joining a game; anything with net-binding is stripped. All net-bound entities stream from server to client with the latest network state. We use a specialized network serialization mechanism for now, but we'll be looking at potential unification strategies in the future.
Channel: sig-core
Q: What are the differences in the Asset Builder between O3DE and Lumberyard?
A: The AssetProcessor and associated tools have received substantial changes. Off the top of my head:
- A lot of Cry asset handling has been removed (RC and its associated builders).
- Builders that were previously part of AssetBuilder have been moved to Gems.
- The FBX SDK was replaced with Assimp.
- It's possible to script builders in Python.
Channel: general
Q: So does the Atom rendering engine support VR?
A: VR should be possible, but I don't think anyone has tried yet.
Channel: sig-core
Q: I'm coming from DOTS in Unity.
* Does the ECS implementation have something like archetypes/automatic memory alignment and packing?
* Are there specific "system" handlers and where do you control their execution order?
* Is the ECS implementation in O3DE strict with side-effects, or can you for instance modify pointers in a system?
* Is there a job system in place? I.e. to manage and run multiple jobs modifying components?
* In DOTS, entities can be managed in encapsulated "worlds" (like a separate database containing a subset of all entities). Does something like this exist in O3DE?
A: We also currently don't feature the full entity component system model you refer to, where components are simply state and systems contain behaviour that operates on a set of components. We have a basic entity component architecture instead, with goals to move towards something with a better memory layout for high performance.
To confirm and expand on what @kberg [AMZN] just said: O3DE has an Entity Component Framework (ECF), not an Entity Component System (ECS), if you'll permit being pedantic for a moment. The difference, imho, is that the former is an approach where entities directly own and manage their components and components are both data and work units, whereas the latter is a system where entities are a unique identifier, components are data units, and systems are work units. O3DE is for the most part based on object-oriented design, so an ECF fits better, but with the new Prefab system putting more emphasis on scalability and performance than was done for Slices, steps are being taken to move towards an ECS.
To answer @thefranke [Huawei] 's questions in order:
- No, as stated we have an ECF. Archetypes (a.k.a. tables) are ECS specific and not part of O3DE as of right now.
- There are no specific system handlers. The closest O3DE has is to use specific EBuses to get calls like OnTick.
- O3DE's ECF is fully open, so any object can be modified at any time unless synchronized using mutexes and the like.
- O3DE contains a fully functional job system that includes managing job dependencies. This is however not extensively used within the ECF and has known performance issues.
- Entities are currently managed within a specific Entity Context, each with their own EBuses. There are three currently available: Game, Editor, and GameUI. AzFramework::Scene has recently been upgraded, and the plan for the near future is to deprecate these three, including their EBuses, and have a unique Entity Context in each AzFramework::Scene.
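To make the ECF model above concrete, here is a minimal sketch, assuming a made-up component name and a placeholder UUID (this is illustrative, not code from the engine): the entity directly owns the component, and the component is both data and behaviour, receiving per-frame calls by connecting to the TickBus EBus.

```cpp
#include <AzCore/Component/Component.h>
#include <AzCore/Component/TickBus.h>
#include <AzCore/Serialization/SerializeContext.h>

// Hypothetical component: in the ECF, the component holds its own data (m_speed)
// and its own behaviour (OnTick), and is owned directly by an entity.
class MyTickComponent
    : public AZ::Component
    , public AZ::TickBus::Handler
{
public:
    AZ_COMPONENT(MyTickComponent, "{00000000-0000-0000-0000-000000000001}"); // placeholder UUID

    static void Reflect(AZ::ReflectContext* context)
    {
        if (auto* serialize = azrtti_cast<AZ::SerializeContext*>(context))
        {
            serialize->Class<MyTickComponent, AZ::Component>()
                ->Version(1)
                ->Field("Speed", &MyTickComponent::m_speed);
        }
    }

    void Activate() override { AZ::TickBus::Handler::BusConnect(); }
    void Deactivate() override { AZ::TickBus::Handler::BusDisconnect(); }

    // Per-frame update delivered via the TickBus EBus - the closest thing to a "system" update.
    void OnTick(float deltaTime, AZ::ScriptTimePoint /*time*/) override
    {
        m_accumulated += m_speed * deltaTime;
    }

private:
    float m_speed = 1.0f;
    float m_accumulated = 0.0f;
};
```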
Channel: sig-core
Q: Can entities and their components be streamed across networks or is there already a generic serialization mechanism in place to do this?
A: Yes they can, but it requires the new Prefab system, AzNetworking, and Multiplayer. @kberg [AMZN] can provide more details.
Channel: sig-core
Q: Is there a workflow for streaming in general? I see there are "Scenes", but how do you deal with large-scale worlds and streaming blocks? Should scenes be split and handled with multiple scenes or is there some streaming concept in place for a single scene? Are there sub-scenes?
A: O3DE has 2 of the 3 planned streaming layers: file streaming and asset streaming. File streaming focuses purely on getting files loaded as fast as possible, while asset streaming gets assets ready as fast as possible. Asset streaming supports asset dependencies, which means that if asset A has a dependency on assets B and C, then B and C will be asynchronously loaded before A, even if A is blocking. The third layer would be high-level scheduling, such as being able to divide the world into smaller blocks. With asset streaming and the Prefab system's ability to generate multiple products, the foundational tech is in place to build such scheduling, but work on the actual scheduling hasn't started yet.
Channel: sig-presentation
Q: The pipeline of O3DE seems to be Forward+ now; do you have any plans to add other pipelines, like a deferred or RTX pipeline?
A: You are correct in your assumptions. Atom is using Forward+ with clustered tile lighting and separate passes for forward direct lighting and IBL, and in doing so it carries slim deferred buffers between the direct lighting pass and the IBL that can be used (and are used by the IBL, transparent passes, and some of the post effects).
We don't intend to develop a full deferred pipeline, but this can be developed by the community; Atom easily supports this approach via our dynamic pass system and data-driven pass system buffers.
Regarding RTX, @AMZN DougM is the person who can give you the complete answer, but in general we are constantly working to improve our RTX usage and support, and in the future we aim to have a more complete solution for RTX, not only as part of the IBL.
Channel: sig-presentation
Q: What's the purpose of the Forward+ pipeline? Is it because of mobile rendering considerations, or other reasons?
A: The main considerations were support for a wide variety of platforms while maintaining high visual quality, a high frame rate, and (as a constraint resulting from those) a slim GBuffer. Carrying a full-blown GBuffer that can properly support what we targeted as enhanced surfaces (for example, BSDFs with clear coat, skin, etc.) would have been quite heavy on the GBuffer side or would have led to other restrictions. The slim GBuffer was originally generated to support the second pass of IBL and provide the minimal data required by the post-effect passes.
An example of where this comes into effect is the use of clear coat: supporting it via a GBuffer would require at least two more channels (passing the coefficients for another lobe), which we decided to avoid by supporting it via the main Forward+ lighting, lighting the least significant lobe with a single main probe. The same consideration was applied to the anisotropic surface response.
Channel: newmemberq-and-a
Q: Will you guys adopt USD for asset transport?
A: Right now the primary model/asset format is FBX, but we use Asset Importer (Assimp) in the model pipeline and builder, and it has support for other formats. We have added STL support; others like glTF might work (but we haven't tested them and don't officially support them yet). We would like to add support for USDz assets, and we'd love to gain support for USD. I would start this conversation in the #sig-content channel. I am not 100% sure of the exact process, but it will probably start as an RFC write-up for the feature and then be discussed by the community.
Channel: general
Q: I'm assuming Wwise is the only option for proper audio at the moment? It just seems odd to make an open-source game engine but then couple it with the most expensive proprietary paid audio engine.
A:
Channel: general
Q: Is anyone using O3DE with a low spec PC?
A:
Channel: general
Q: As a follow-up, has anyone tried running it on a cloud-based VM?
A:
Channel: general
Q: What graphics APIs does the engine support? DX12, Vulkan?
A: DX12, Vulkan and Metal.
Channel: general
Q: Is Apple Silicon support planned?
A:
Channel: general
Q: Is there a Linux channel? Is sig-platform it?
A:
Channel: newmemberq-and-a
Q: Should I be using o3de.exe or Editor.exe? I have been using Editor.exe but am not sure if there is a difference.
A: You can use either, really. o3de.exe launches the Project Manager for configuring your project, which can in turn launch the Editor. Editor.exe launches the Project Manager when it doesn't know which project to launch. One way to specify the project to launch when running Editor.exe is with the --project-path parameter on the command line.
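For example (the project path below is illustrative):

```
Editor.exe --project-path="C:\MyProjects\MyGame"
```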
Channel: Sig-Core
Q: Is there something like a console bus?
A: There are a few related mechanisms:
- AZ::EBus: good for broadcast publish/subscribe (many senders, many listeners). EBus is good for "I don't care or know who is sending the message, but when someone sends it, let me know."
- AZ::Interface<T>: good for talking to a singleton that there's only ever one of. Interface<T> is good for "I want a pointer to the BlahManager singleton"; it just allows you to register a singleton and then get a pointer to it from anywhere.
- AZ::Event: good if a specific thing is going to send a message and you want to subscribe. Event is good for "I already know who's sending the event, I have a pointer to them already, subscribe to this event that they send." Event is like a C# delegate/event, in that you must have the object that owns the event in order to subscribe to it (unlike EBus, where you subscribe to 'the bus').

They're not mutually exclusive; you can have, for example, a BlahManager singleton that broadcasts on an EBus AND also registers itself as Interface<IBlahManager>, so that others can either talk to it via the bus or do Interface<IBlahManager>::Get()->DirectCall().
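A minimal sketch of the three mechanisms side by side (the IBlahManager/BlahManager names are made up for illustration, and this is not code from the engine):

```cpp
#include <AzCore/EBus/EBus.h>
#include <AzCore/EBus/Event.h>
#include <AzCore/Interface/Interface.h>

// --- EBus: many senders, many listeners; subscribers connect to "the bus" ---
class BlahNotifications : public AZ::EBusTraits
{
public:
    virtual ~BlahNotifications() = default;
    virtual void OnBlahHappened(int value) = 0;
};
using BlahNotificationBus = AZ::EBus<BlahNotifications>;

// --- Interface<T>: a registered singleton reachable from anywhere ---
class IBlahManager
{
public:
    virtual ~IBlahManager() = default;
    virtual void DirectCall() = 0;
};

class BlahManager : public IBlahManager
{
public:
    // AZ::Event: subscribers need a pointer to this object to connect a handler.
    AZ::Event<int> m_valueChangedEvent;

    BlahManager() { AZ::Interface<IBlahManager>::Register(this); }
    ~BlahManager() override { AZ::Interface<IBlahManager>::Unregister(this); }

    void DirectCall() override
    {
        // Broadcast to anyone connected to the EBus...
        BlahNotificationBus::Broadcast(&BlahNotifications::OnBlahHappened, 42);
        // ...and signal anyone who connected a handler to the AZ::Event.
        m_valueChangedEvent.Signal(42);
    }
};

void Usage(BlahManager& blahManager)
{
    // Talk to the singleton through Interface<T> without knowing who owns it.
    if (IBlahManager* manager = AZ::Interface<IBlahManager>::Get())
    {
        manager->DirectCall();
    }

    // Subscribe to the AZ::Event (requires a pointer/reference to the concrete object).
    AZ::Event<int>::Handler handler([](int value) { /* react to value */ });
    handler.Connect(blahManager.m_valueChangedEvent);
    blahManager.m_valueChangedEvent.Signal(7);
}
```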
Channel: newmember-q-and-a
Q: Is there a way to force the editor/game to always run with Vulkan instead of DX12?
A: Use the command line argument `-rhi=vulkan`.
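For example, launching the Editor with the Vulkan RHI (the executable location depends on your build setup):

```
Editor.exe -rhi=vulkan
```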
Channel: newmember-q-and-a
Q: I'm absolutely new at o3de so I don't really know what a builder is. A data compiler?
A: O3DE has an Asset Processor (AP) executable which converts raw source assets into the final assets used by the O3DE runtime. Builders are modules that plug into the AP to process different asset types. So for an FBX file, we have builders that convert it into the O3DE internal asset formats for models and materials. Anyone can add a new builder (in a Gem) to convert new source asset files into O3DE assets.
Channel: general
Q: My mouse is laggy, how can I smooth it out?
A: Create a file in the root of your o3de folder named `user.cfg` and add the following to it:

```
ed_useNewCameraSystem = true
ed_useNewCameraSystemGoto = true
```

AZ_CVARs (referenced in CameraInput.cpp) have values like LookSmoothness that can be changed to taste.
Channel: newmember-q-and-a
Q: Does this engine have any double precision coordinate support something like Unigine has? I'm just talking CPU side.
A: Doubles for positions are not currently supported. Support for environments that require large coordinates has been discussed, but no specific plans are in place at the moment. Please do stick around and advocate for the need for the feature, though.
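As a rough illustration of why large coordinates push toward doubles (this is generic floating-point arithmetic, not engine code): at 100 km from the origin, adjacent 32-bit float values are already about 8 mm apart, so single-precision positions visibly degrade in very large worlds, while doubles keep far finer spacing at that range.

```cpp
#include <cmath>
#include <cstdio>

int main()
{
    // Spacing between adjacent representable float values near 100 km from the origin.
    float x = 100000.0f; // 100 km, in meters
    float floatSpacing = std::nextafterf(x, INFINITY) - x;
    std::printf("float spacing at 100 km:  %g m\n", floatSpacing);  // ~0.0078 m (~8 mm)

    // The same distance in double precision has far finer spacing.
    double xd = 100000.0;
    double doubleSpacing = std::nextafter(xd, INFINITY) - xd;
    std::printf("double spacing at 100 km: %g m\n", doubleSpacing); // ~1.5e-11 m
    return 0;
}
```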
Channel: newmember-q-and-a
Q: What's the preferred way to open a project in an IDE? Using the cmake generated solutions? Or using the built-in CMake support (say, in VS Code and VS 2019)? Also, what is the recommended IDE to use? I'm assuming VS 2019, but would like to see what works best.
A: Using the generated VS2019 solution is the primary approach.
The CMake integration in VS was slow for big projects a couple of VS versions back; it has gotten better, but it is still not on par with the generated VS2019 solution. The generated solution also has more settings and features; for example, we inject debugging parameters into each Test target so you can run them with F5.
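For reference, generating that solution typically looks something like the following, run from the engine (or project) root; the build folder name is just a common convention and your setup may differ:

```
cmake -B build/windows_vs2019 -S . -G "Visual Studio 16 2019"
```

You would then open the .sln generated under build/windows_vs2019 in Visual Studio 2019.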
Eventually we expect the community to add settings/improvements to other IDEs, there is already someone with a PR to improve VS Code.
One of the main reasons why we chose CMake is to give developers the native experience of their platform. The best developer tools are already compatible with the "expected" toolset of that platform. So, for example, on Windows, most IDEs read MSBuild and are able to handle it, and most plugins/tools understand MSBuild. So instead of us having to make every toolset/plugin happy, we generate for the common denominator. The same applies on other platforms.
The nice thing about CMake is that it can generate for other build tools: https://cmake.org/cmake/help/latest/manual/cmake-generators.7.html. However, we will always focus on a "golden path" per platform, which will likely be what the bigger percentage of developers use. It is a bit impractical to support every possible permutation; however, we could have some other paths "possible to use at your own risk".
The nice thing about CMake is that it can generate for other build tools: https://cmake.org/cmake/help/latest/manual/cmake-generators.7.html. However, we will always focus on a "golden path" per platform which likely will be what the bigger % of developers use. Is a bit impractical to support every possible permutation, however, we could have some other paths "possible to use at your own risk".