IDEs, toolchains, and build tools in general

O3DE does not depend on specific IDEs.

In order to better explain the rationale and decisions behind how O3DE is built, I have to go through some details:

Toolchains. To build C++ you need a compiler and a linker. Compilers take source files (e.g. .cpp) and generate "object files" (.obj on Windows, .o on Linux/Mac). Linkers take these object files and (optionally) other libraries and produce static/dynamic libraries or executables. I won't go into the details; this is simplified and there are actually more steps (e.g. preprocessing). In some cases the compiler and the linker are the same binary, and it compiles and/or links depending on the parameters passed (e.g. clang). A developer could invoke the compiler/linker directly. However, in a big project, doing that for every file becomes an impossible task. For example, O3DE currently has over 9,700 C++ source files and over 450 targets (libraries/executables), which means over 12,000 compiler/linker/etc. invocations.

There are also files that are more complex to get into the compilation. For example, Qt uses moc and uic to generate files, so a developer would have to run the files that require it through moc and uic to generate the necessary .h/.cpp files, which are then included. There are also resource files that bake assets into the binary (.rc files or Qt's .qrc files). We also have our own codegen that generates files.

There are quite a few compilers out there: MS Compiler, Clang, GCC, ICC, etc. (here is a longer list: https://www.stroustrup.com/compilers.html). There are so many because they used to be produced by hardware manufacturers. For example, for IBM to introduce a new CPU, they had to provide a compiler that could produce instructions for that CPU. Over time, CPUs, especially general-purpose ones, standardized instruction sets and extensions, making it possible for other compilers to be created. MS Compiler, Clang, GCC, etc. don't target one particular piece of hardware; they rely on the common instruction sets and extensions (newer extensions can usually be enabled with parameters). Each compiler defines which parameters it accepts and how they are passed, so it is not possible to be "agnostic" to every compiler; settings usually have to be defined per compiler.
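To make this concrete, here is a minimal sketch of invoking the toolchain by hand (assuming clang is installed; main.cpp and util.cpp are hypothetical files):

```sh
# Step 1: compile each translation unit into an object file.
clang++ -c main.cpp -o main.o
clang++ -c util.cpp -o util.o

# Step 2: link the object files (and any libraries) into an executable.
clang++ main.o util.o -o app
```

Multiply those three invocations by thousands of files and hundreds of targets, and it is clear why nobody does this manually.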

Compilers are also not smart about "what changed", so a developer would have to know which .cpp files are affected by a change in order to recompile them, and which libraries are affected in order to relink them. The compiler and linker, together with some other tools, are what is usually referred to as the "toolchain".

When the size of a project is significant, "build scripts" enter the picture. Build scripts are rules that define how things are built. They are usually agnostic to the compiler/linker being used, but usually have them "baked in" somehow, because the invocations differ between compilers. GNU Makefiles on POSIX systems (Linux and Mac) are a good example: if you inspect one of these files, you will see all the rules for how each object file and each library/executable is generated, as in the sketch below. You will also see that the compiler/linker/etc. are set in variables and then used throughout the file (that's the usual pattern). If you wanted to support multiple compilers, you would have to create variables that hide those differences and write the rest of the Makefile using those variables. Besides Makefiles, there are other build script tools: Ninja, MSBuild, SCons, WAF, Gradle, xcodebuild, etc. Different tools provide different ways of doing things: some track changes and do minimal rebuilds, some are more extensible, some have their own language to define things, etc.
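For illustration, a minimal hand-written Makefile for the hypothetical two-file project above could look like this (a sketch; real Makefiles in large projects are generated or far more elaborate):

```make
# Compiler and flags live in variables so the toolchain can be swapped.
# (Recipe lines must be indented with a tab.)
CXX ?= clang++
CXXFLAGS ?= -O2 -Wall

app: main.o util.o
	$(CXX) main.o util.o -o app

main.o: main.cpp util.h
	$(CXX) $(CXXFLAGS) -c main.cpp -o main.o

util.o: util.cpp util.h
	$(CXX) $(CXXFLAGS) -c util.cpp -o util.o
```

Because each rule lists its dependencies, running make after touching util.h rebuilds both object files, while touching only main.cpp rebuilds just main.o; that is the change tracking the compiler itself does not provide.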

In general, each OS sets up a preferred/recommended build script tool and toolchain. For example, on Windows the preferred combination is MSBuild with the Microsoft compiler. You can use others, but the OS vendor documents and recommends a combination. That doesn't mean you cannot use Clang on Windows; however, using Clang on Windows is up to how good the Clang maintainers are at keeping up with what happens in Windows. Understandably, Microsoft's compiler will play really nicely with Windows and its technologies (DirectX, the Windows SDK, resource files, etc.). macOS/iOS/iPadOS recommends Xcode + Clang (their branch of Clang). Android recommends Gradle + Ninja + Clang. Linux doesn't have a single "recommendation": most systems use GNU Makefiles, but Makefiles are really slow, so a lot of the industry has shifted to Ninja; GCC is the default toolchain.

Because every OS, console, and mobile platform usually recommends a certain toolchain and focuses on integrating as well as possible with it, "build script generators" are a thing. CMake is a good example; it's not the only one, SharpMake is another. These "build script generators" are known as "project generators" (I am not a fan of the name). What these "project generators" do is generate those "build scripts". In CMake's case, it can generate for the following: https://cmake.org/cmake/help/latest/manual/cmake-generators.7.html. Note that these tools usually do not build. In CMake's case, when you invoke cmake --build, it figures out which underlying "build script" system was generated and invokes it. So, for example, it would invoke MSBuild or xcodebuild (if you generated for those).
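For example (a sketch; the generator names must match what is installed on the machine):

```sh
# Generate MSBuild project files on Windows...
cmake -B build_vs -G "Visual Studio 16 2019"

# ...or Ninja files on Linux/Mac.
cmake -B build_ninja -G Ninja

# cmake --build compiles nothing itself: it detects which generator
# the build folder was created with and delegates to MSBuild, ninja,
# xcodebuild, make, etc.
cmake --build build_ninja
```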

The main reason O3DE uses CMake is that it provides the best experience on each platform. It also abstracts all of the above away from us and gives us just one set of "things" to maintain. If we were not using a "project generator", we would have to maintain build scripts for Windows, Linux, Mac, etc. Each of them would define very similar things: which files to compile and link into a specific target, which defines to set, which other libraries to link in, and so on.
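As a sketch of what that single description looks like (plain CMake with hypothetical target and file names; O3DE wraps calls like these in its own helper functions):

```cmake
# One target description; CMake turns this into MSBuild, Xcode,
# Ninja, or Makefile rules depending on the generator you pick.
add_library(MyModule STATIC
    Source/MyModule.cpp
    Source/MyOtherFile.cpp
)
target_compile_definitions(MyModule PRIVATE MYMODULE_ENABLE_FEATURE_X)
target_link_libraries(MyModule PRIVATE SomeDependency)
```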

Since we can generate the "build scripts" each platform prefers, and all the tools for developing on that platform understand those "build scripts", using a "project generator" improves the overall experience.

Note that this is not the only way to go. For example, SCons is a "build script" tool. To make an IDE like VS able to use it, it generates an MSBuild project that VS can open, which redirects all the compiler/linker calls back to SCons. This is "ok", but you are subject to how well that integration is done and how well it plays with VS (and that particular version of VS). In practice, this didn't work for Lumberyard. We constantly had issues with the generator, had to update it every time a new VS version came out, and the integration was never good (e.g. IntelliSense did not understand the state of preprocessor definitions or whether a piece of code was enabled for a given platform, and each call back into SCons took at least 20 seconds).

Regarding IDEs: IDEs sit on top of all this. Note that they are not required at all; they are productivity tools. Visual Studio is an IDE, although the name "Visual Studio" is a bit overloaded because we sometimes refer to its compiler as the "Visual Studio Compiler" or to its "build scripts" as "Visual Studio VC projects". To make things a bit more complex, Microsoft's toolchain (compiler/linker/etc.) is distributed as the "Visual Studio Build Tools", which does not include the Visual Studio IDE. To make it even more fun, Microsoft named a different IDE "Visual Studio Code".

Other examples of IDEs are Visual Studio Code, CLion, Eclipse, CodeLite, etc. Any IDE can be used as long as it understands either one of the "build scripts" CMake can generate, or CMake itself. Sometimes you can generate some intermediate "thing" to connect them. In general, an IDE should not affect how things are built; that should always be driven by the "build script". However, how good your development experience is within an IDE depends on how well that IDE understands the "build script" and integrates with it. For example, you can build individual files in VS because it understands which compilation flags need to be passed to an individual file and how to invoke the compiler on it. That is an IDE feature. Visual Studio also does a very good job of building a database for "code assistance" in the editor, for things like "autocomplete", "jump to definition", etc. To do this, it has to run a process similar to a compiler (sometimes the compiler itself, as in Clang's case) and extract that information from it. How well the IDE understands the values of compilation options/flags, and how files are connected to each other through include paths and linking, is up to each IDE and how well it integrates with that "build script". It is expected that Visual Studio will understand MSBuild very well but have more trouble understanding Ninja or CMake itself, and that has been my experience in practice. Again, that doesn't mean you cannot use them; I am just bringing up the reasons why some features work better one way or another.
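One common bridge for code assistance (a sketch, assuming a CMake-based tree and a tool that consumes compilation databases, such as clangd):

```sh
# Ask CMake to also emit compile_commands.json next to the build scripts.
# Editors/IDEs that use clangd (VS Code, etc.) read this file to learn the
# exact flags and include paths of every translation unit.
cmake -B build -G Ninja -DCMAKE_EXPORT_COMPILE_COMMANDS=ON
```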

For O3DE, considering the explanations above, Visual Studio is not needed at all. It is part of the "minimal requirements" setup just to simplify what a developer has to install and what the recommended workflow is. Since there is a huge number of permutations in how developers can choose to work, O3DE cannot maintain/support/validate all of them. By validating we mean frequently checking that they are in a good state (usually through Continuous Integration).

However, we encourage contributors/maintainers to add support for other toolchains, build scripts, and IDEs. It is important to note that such support would most likely be validated by contributors/maintainers, not continuously by a Continuous Integration system. So such integrations may break from time to time, or some may become unmaintained (e.g. a contributor could add one and no longer maintain it).

For O3DE we chose a "golden path" for each platform: we basically picked a combination of "build scripts" and toolchain (compiler/linker) per platform. The decision was based on what is the default/recommended for that platform and how it compared with the alternatives. For every platform we followed the default/recommended option except Linux, where we use Ninja for the build script (instead of GNU Makefiles) and Clang for the toolchain (instead of GCC). For the build script, the decision was made after some time of using Makefiles: Makefiles are really slow compared to Ninja, and developing with them daily wastes a lot of time. Makefiles also don't support multi-config generation, so handling the build is more complex. There is also no big gain in sticking with Makefiles, since Linux doesn't have a "recommended IDE". For the toolchain, we didn't have proper support for GCC in Lumberyard; since we already use Clang for other platforms, and lots of open source projects use Clang, we didn't see it as a problematic decision.
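Configuring the Linux "golden path" looks roughly like this (a sketch; O3DE's documented setup may pass additional options, and the multi-config generator here is my reading of the multi-config point above):

```sh
# Generate Ninja build scripts with the Clang toolchain instead of GCC.
# "Ninja Multi-Config" (CMake 3.17+) produces one build tree that can
# build several configurations, which plain Makefiles cannot do.
cmake -B build/linux -G "Ninja Multi-Config" \
      -DCMAKE_C_COMPILER=clang \
      -DCMAKE_CXX_COMPILER=clang++
cmake --build build/linux --config Release
```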

The current "golden path" per platform (at the time of writing this) is:

  • Windows: MSBuild + MS Compiler, currently VS 2019 16.9.2 or above
  • Linux: Ubuntu 18 + Ninja + Clang 9 (being updated to Ubuntu 20, Clang 12)
  • Mac: Big Sur + Xcode 11.5 (x86 only, although cross-compiling from M1 to x86 and running on Rosetta 2 works)
  • iOS: same as Mac, targets iOS 13, but that can be modified through LY_IOS_DEPLOYMENT_TARGET
  • Android: arm64-v8a, API: 21, NDK r21d, Ninja, Gradle wrapper, Clang

These decisions can be revisited, and the whole community can decide what to do. If you feel strongly about any of these, I suggest starting with an RFC and opening it up for discussion. Any discussion will involve cost, practicality, and adoption. Since this can be a very "subjective" subject, try to frame it objectively and fact-based. For example, stating "I prefer CLion over Visual Studio" is not going to drive a decision; we need to prove that it is better for all contributors/maintainers to switch to something else.