Rendering - simple-entertainment/simplicity GitHub Wiki
Simplicity does not provide any rendering implementations, it only provides interfaces. To include rendering in your game (which I'm sure you'll want to do), you'll need to use one of the rendering plugins like OpenGL or Direct3D. Alternatively you could create your own!
The RenderingEngine interface has additional functions to help set up the rendering environment.
std::unique_ptr<RenderingEngine> renderingEngine(new OpenGLRenderingEngine);
The rendering engine needs to have a camera to render the scene through.
std::unique_ptr<Entity> cameraEntity(new Entity);
std::unique_ptr<Camera> camera(new OpenGLCamera);
// Just call one of these!
camera->setOrthogonal(800.0f, 600.0f); // For 2D
camera->setPerspective(60.0f, 4.0f / 3.0f); // For 3D
cameraEntity->addUniqueComponent(move(camera)); // Attach the camera to its entity
renderingEngine->setCamera(cameraEntity.get());
If your shaders require lights you'll need to create them and give them to the rendering engine.
std::unique_ptr<Entity> lightEntity(new Entity);
std::unique_ptr<Light> light(new OpenGLLight("theSun"));
light->setAmbientComponent(Vector4(0.7f, 0.7f, 0.7f, 1.0f));
light->setDiffuseComponent(Vector4(0.7f, 0.7f, 0.7f, 1.0f));
light->setRange(1000.0f);
light->setSpecularComponent(Vector4(0.7f, 0.7f, 0.7f, 1.0f));
light->setStrength(32.0f);
lightEntity->addUniqueComponent(move(light)); // Attach the light to its entity
renderingEngine->addLight(lightEntity.get());
This step is not mandatory, but skipping it can make rendering much slower, since without a graph the rendering engine assumes that every model should be rendered.
std::unique_ptr<Graph> graph = // Some graph...
renderingEngine->setGraph(graph.get());
// Make sure you add the graph to the scene! (see Graphs)
When creating your own rendering engine you should extend RenderingEngine and implement its pure virtual functions. It is best to keep to the API documentation of RenderingEngine so that if the user swaps your engine for another one they don't get a nasty surprise because the two behave differently.
Each renderer represents a rendering pass in the rendering engine and needs to have a shader associated with it. The construction of these objects is plugin-specific but here is an example of using the OpenGL plugin:
std::unique_ptr<Renderer> renderer(new OpenGLRenderer);
std::ifstream vertexShaderFile("src/main/glsl/default.vert");
std::ifstream fragmentShaderFile("src/main/glsl/default.frag");
std::unique_ptr<OpenGLVertexShader> vertexShader(new OpenGLVertexShader(vertexShaderFile));
std::unique_ptr<OpenGLFragmentShader> fragmentShader(new OpenGLFragmentShader(fragmentShaderFile));
vertexShaderFile.close();
fragmentShaderFile.close();
std::unique_ptr<Shader> shader(new OpenGLShader(std::move(vertexShader), std::move(fragmentShader)));
renderer->setShader(std::move(shader));
Now to use the renderer as a rendering pass of the rendering engine:
renderingEngine->addRenderer(std::move(renderer));
TODO
Textures need to be constructed using the rendering factory since the implementation of a texture is dependent on the rendering plugin being used e.g. OpenGL or Direct3D.
std::unique_ptr<Texture> textureFromFile = RenderingFactory::getInstance()->createTexture("texture.png");
char* data = // Some data...
unsigned int width = 512;
unsigned int height = 512;
std::unique_ptr<Texture> textureFromData = RenderingFactory::getInstance()->createTexture(data, width, height);
And to use the texture:
std::unique_ptr<Model> model = // Some model...
model->setTexture(std::move(textureFromFile));