Libgdx Guide

  1. Libgdx general Concepts
  2. Libgdx Demo Example

Libgdx general Concepts

This text briefly explains the core concepts used by our application. For more information, refer to the official documentation.

In this overview, the following topics are elaborated on:

  1. Modules
  2. Application Life Cycle
  3. ViewPorts
  4. Loading Assets
  5. Rendering
  6. Entity Framework Ashley
  7. Input Handling
  8. User Interfaces
  9. Others (Audio, Physics, Texture Packing, Tile Maps, Saving)

Modules

https://libgdx.com/wiki/app/the-application-framework

Application: runs the application and informs an API client about application level events, such as window resizing. Provides logging facilities and querying methods, e.g., memory usage.

Files: exposes the underlying file system(s) of the platform. Provides an abstraction over different types of file locations on top of a custom file handle system (which does not inter-operate with Java’s File class).

Input: informs the API client of user input such as mouse, keyboard, touch or accelerometer events. Both polling and event driven processing are supported.

Net: provides means to access resources via HTTP/HTTPS in a cross-platform way, as well as create TCP server and client sockets.

Audio: provides means to playback sound effects and streaming music as well as directly accessing audio devices for PCM audio input/output.

Graphics: exposes OpenGL ES 2.0 (where available) and allows querying/setting video modes and similar things.

Application Life Cycle

Documentation

The base structure of every libGDX application:

public class SomeLevel extends ApplicationAdapter {

    @Override
    public void create() {
        //init & load assets
    }

    @Override
    public void render() {
        //Method called by the game loop from the application every time rendering should be performed.
    }

    @Override
    public void dispose() {
        //unload assets
    }
}

Common Mistake:
Do not load heavy resources inside render(), as it runs every frame. Instead, load assets in create() and release them in dispose().

Viewports

Documentation

Click here for interactive example

Viewports define how your rendered graphics adapt to the screen size and aspect ratio. They also handle the projection of user input (e.g. mouse clicks) to in-game coordinates (this is done by the camera; in our 2D case the OrthographicCamera).

Multiple viewports can be used at the same time; in our case we will probably use two. One handles the interaction, rendering and projection of the in-game world, the other handles the UI.

Here is an example where the ExtendViewport displays an 8 m by 8 m in-game area on the screen and the ScreenViewport holds the UI.

        // create view with world coordinates
        extendedViewport = new ExtendViewport(8, 8);
        extendedViewport.getCamera().position.set(8, 4, 0);

        // create ui
        screenViewport = new ScreenViewport();
        stage = new Stage(screenViewport, batch);
        gameUi = new GameUi(stage);
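
Not shown above: viewports must be informed about window size changes in resize(). A minimal sketch, assuming the two viewport fields from the snippet above:

@Override
public void resize(int width, int height) {
    // re-fit the world viewport to the new window size (camera position is kept)
    extendedViewport.update(width, height);
    // the UI viewport maps 1:1 to screen pixels; 'true' centers its camera
    screenViewport.update(width, height, true);
}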

Loading Assets

Creating assets

There are two important methods of packing/creating assets which we will be using:

  1. Skin Composer, used for creating UI skins (like CSS for HTML).
  2. Texture Packer, a Gradle task (also available as a UI editor) that packs all textures into one big texture. This avoids having to load them separately.

Texture atlas example

Example of a texture atlas

Loading Assets

The loading of assets is managed with the AssetManager.

It supports many different asset types: https://libgdx.com/wiki/managing-your-assets#adding-assets-to-the-queue

Using it also means you do not have to dispose of each asset individually; disposing of the manager is enough.

manager.dispose();
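
A minimal loading sketch; the asset names and types are placeholders, not our actual files:

AssetManager manager = new AssetManager();

// queue assets for loading
manager.load("textures/game.atlas", TextureAtlas.class);
manager.load("audio/click.wav", Sound.class);

// block until everything in the queue is loaded
manager.finishLoading();

// retrieve a loaded asset
TextureAtlas atlas = manager.get("textures/game.atlas", TextureAtlas.class);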

Rendering

Documentation

With a SpriteBatch, sprites/images can be drawn as one batch.

private Sprite sprite;
...
Texture texture = new Texture(Gdx.files.internal("image.png"));
sprite = new Sprite(texture, 20, 20, 50, 50);
sprite.setPosition(10, 10);
sprite.setRotation(45);
...
batch.begin();
sprite.draw(batch);
batch.end();

Entity Framework Ashley

Inheritance in object-oriented languages is not optimal for structuring game logic and performance. This is why game development commonly uses an ECS (Entity Component System). In our case this is the Ashley entity framework.

ECS

With ECS, you break down the game objects into three parts:

  • Entity: A unique ID representing an object.
  • Component: Data containers that hold specific attributes (e.g., position, velocity).
  • System: Logic that processes entities with certain components.

Example: Movement System

Components:

  • Position: Holds x and y coordinates.
  • Velocity: Holds movement speed in x and y directions.

Entity:

  • Simply a container to hold components (e.g., a player entity might have both a Position and a Velocity component).

System:

MovementSystem:

Iterates over entities that have both Position and Velocity components and updates their positions based on the velocity.

The Ashley entity framework is described here: https://github.com/libgdx/ashley/wiki/How-to-use-Ashley

If this is not enough, there are YouTube videos explaining the concept further.

Input Handling

Input can be handled by listeners (creating a listener for input events) or by polling (checking the input state every frame).

Event handling

Documentation

Gdx.input.setInputProcessor(new InputAdapter () {
   @Override
   public boolean touchDown (int x, int y, int pointer, int button) {
      // your touch down code here
      return true; // return true to indicate the event was handled
   }

   @Override
   public boolean touchUp (int x, int y, int pointer, int button) {
      // your touch up code here
      return true; // return true to indicate the event was handled
   }
});

Multiple input processors can listen to user input at the same time via an InputMultiplexer:

InputMultiplexer multiplexer = new InputMultiplexer();
multiplexer.addProcessor(new MyUiInputProcessor());
multiplexer.addProcessor(new MyGameInputProcessor());
Gdx.input.setInputProcessor(multiplexer);

Polling

Documentation

boolean isAPressed = Gdx.input.isKeyPressed(Keys.A);
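
If an action should fire only once per key press rather than on every frame while the key is held, isKeyJustPressed can be polled instead (small sketch):

if (Gdx.input.isKeyJustPressed(Keys.SPACE)) {
    // runs once per key press, not every frame while the key is held
}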

User Interfaces

Documentation

User interfaces are built similarly to JavaFX when you add them in code. UI styling can be done with skins, as explained in Loading Assets.

Others (Animation, Tile Maps, Saving)

Animation

Documentation

Animation frames can also be packed into a texture atlas and loaded with the AssetManager.
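
A minimal loading sketch; the atlas path, the region name "walk", and the stateTime field are assumptions for illustration:

TextureAtlas atlas = new TextureAtlas(Gdx.files.internal("textures/game.atlas"));

// all atlas regions named "walk" (indexed frames from the packer) become the animation frames
Animation<TextureRegion> walk =
        new Animation<>(0.1f, atlas.findRegions("walk"), Animation.PlayMode.LOOP);

// in render(): advance the animation and draw the current frame
stateTime += Gdx.graphics.getDeltaTime();
batch.draw(walk.getKeyFrame(stateTime), 100, 100);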

Tile Maps

Documentation

libGDX supports loading tile maps, which enables designing levels by hand or generating them. A tileset is needed for this.

For designing tile maps this editor is often used: https://www.mapeditor.org/
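
A small loading and rendering sketch; the map path is a placeholder and the camera is assumed to be the world camera from the viewport example:

TiledMap map = new TmxMapLoader().load("maps/level1.tmx");
OrthogonalTiledMapRenderer mapRenderer = new OrthogonalTiledMapRenderer(map);

// in render(): align the renderer with the world camera and draw the map
mapRenderer.setView((OrthographicCamera) extendedViewport.getCamera());
mapRenderer.render();

// in dispose()
mapRenderer.dispose();
map.dispose();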

Saving

This can be done by converting entities and game state to JSON and saving it to disk.
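
A minimal sketch using libGDX's Json class; GameState and gameState are hypothetical placeholders for whatever object holds our game data:

Json json = new Json();

// serialize the game state and write it to disk
String data = json.toJson(gameState);
Gdx.files.local("save.json").writeString(data, false);

// later: read the file back and deserialize it
GameState loaded = json.fromJson(GameState.class,
        Gdx.files.local("save.json").readString());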

Libgdx Demo examples

In the chapters below, some small code demos show how to use certain elements. In our project we will have to rely on wrappers for certain classes, because some libGDX elements cannot be mocked efficiently.

Some libGDX resources need to be actively disposed of to free them up. Performance issues and odd behaviour may occur when certain resources are not properly disposed of.

Textures

A Texture is used to load an image file (e.g., PNG, JPG) for rendering. These resources are stored on the GPU for fast rendering. When textures are no longer needed, they must be disposed of to free up resources.

// Load an image
Texture texture = new Texture("my_texture.png");

// Dispose of the texture when it is no longer needed
texture.dispose();

TextureRegion

A TextureRegion in LibGDX represents a part of a Texture. Instead of drawing an entire texture, TextureRegion allows you to define a specific portion of a texture to be rendered. This is useful for sprite sheets, animations, and texture atlases.

Texture texture = new Texture("spritesheet.png");
// Define a region (x:20, y:20, width:50, height:50)
TextureRegion region = new TextureRegion(texture, 20, 20, 50, 50);

Instead of loading multiple small images, TextureRegion allows extracting different parts of a larger texture.
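
Drawing a region works the same way as drawing a full texture (a small sketch, assuming a SpriteBatch named batch):

batch.begin();
batch.draw(region, 100, 100); // only the 50x50 region defined above is drawn
batch.end();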

SpriteBatch (use the SpriteBatchWrapper)

A SpriteBatch is a low-level rendering tool in LibGDX that allows you to efficiently draw multiple textures in a single batch, minimizing OpenGL calls and improving performance. The SpriteBatch needs to be disposed.

    SpriteBatch batch;
    Texture texture;

    @Override
    public void create() {
        batch = new SpriteBatch();
        texture = new Texture("my_texture.png");
    }

    @Override
    public void render() {
        Gdx.gl.glClear(GL20.GL_COLOR_BUFFER_BIT);
        
        batch.begin();  // Start rendering
        batch.draw(texture, 100, 100);  // Draw at position (100, 100)
        batch.end();    // End rendering
    }

    @Override
    public void dispose() {
        batch.dispose();
        texture.dispose();
    }

Why use a SpriteBatch?

  • Allows drawing multiple textures in a single batch, reducing performance overhead.
  • Should be disposed of after use to avoid memory leaks.

Stage

A Stage is a higher-level UI system built on top of SpriteBatch, used for handling UI elements and actors in a scene.
Stage needs to be disposed

Below are some use cases as to when to use the Stage.

  • When working with UI elements (buttons, labels, text fields).
  • When dealing with multiple objects that need to handle input and animations.
  • When you want to simplify rendering logic instead of manually drawing everything with SpriteBatch.

Stage stage;
Skin skin;
TextButton button;

@Override
public void create() {
    stage = new Stage(new ScreenViewport());
    Gdx.input.setInputProcessor(stage);  // Make the stage handle input

    skin = new Skin(Gdx.files.internal("uiskin.json"));
    button = new TextButton("Click Me", skin);
    button.setPosition(200, 150);

    stage.addActor(button);
}

@Override
public void render() {
    Gdx.gl.glClear(GL20.GL_COLOR_BUFFER_BIT);
    
    stage.act(Gdx.graphics.getDeltaTime()); // Update actors
    stage.draw();  // Render actors
}

@Override
public void dispose() {
    stage.dispose();
    skin.dispose();
}

In the above example we want to render a button. We first create and configure the button, then add it to the Stage via addActor().
The button is then rendered on the screen during the rendering process by the draw() call.

Screens

A Screen is like a separate "page" or "state" in your game. It helps you organize different sections, such as a main menu, gameplay screen, or game over screen. Instead of handling everything in a single class, Screens allow you to separate logic into different classes.

Screens also need to be disposed

Game class

The MyGame class manages which Screen is active and needs to be shown, and is also responsible for switching between screens.
The class can also hold common attributes that are shared across screens.

Every Screen you create needs to take MyGame as a constructor parameter, or else you will not be able to switch screens.

public class MyGame extends Game {
    @Override
    public void create() {
        setScreen(new MainScreen(this)); // Start with MainScreen
    }
}

When our game starts, the MyGame class loads the MainScreen with the setScreen function from the parent class.

MainScreen

public class MainScreen implements Screen {
    private final MyGame game;
    private SpriteBatch batch;
    private BitmapFont font;

    public MainScreen(MyGame game) {
        this.game = game;
    }

    @Override
    public void show() {
        batch = new SpriteBatch();
        font = new BitmapFont();
    }

    @Override
    public void render(float delta) {
        Gdx.gl.glClear(GL20.GL_COLOR_BUFFER_BIT);
        
        batch.begin();
        font.draw(batch, "Main Screen - Press SPACE to switch", 100, 150);
        batch.end();

        if (Gdx.input.isKeyJustPressed(com.badlogic.gdx.Input.Keys.SPACE)) {
            game.setScreen(new SecondScreen(game)); // Switch screen
        }
    }

    @Override
    public void dispose() {
        batch.dispose();
        font.dispose();
    }
}

Above is the shortened code of the MainScreen. As we can see, the MainScreen receives MyGame as a constructor parameter so that it can be used to switch screens. If you want to switch screens, you just call game.setScreen(new #ScreenName#(game));.

Second Screen

public class SecondScreen implements Screen {
    private final MyGame game;
    private SpriteBatch batch;
    private BitmapFont font;

    public SecondScreen(MyGame game) {
        this.game = game;
    }

    @Override
    public void show() {
        batch = new SpriteBatch();
        font = new BitmapFont();
    }

    @Override
    public void render(float delta) {
        Gdx.gl.glClear(GL20.GL_COLOR_BUFFER_BIT);
        
        batch.begin();
        font.draw(batch, "Second Screen - Press BACKSPACE to return", 100, 150);
        batch.end();

        if (Gdx.input.isKeyJustPressed(com.badlogic.gdx.Input.Keys.BACKSPACE)) {
            game.setScreen(new MainScreen(game)); // Go back
        }
    }

    @Override
    public void dispose() {
        batch.dispose();
        font.dispose();
    }
}

This is essentially a copy of the MainScreen, with different text and a different key to switch back.

Ashley examples

Components

A Component holds data for an entity. It contains no logic, only attributes.

import com.badlogic.ashley.core.Component;
import com.badlogic.gdx.math.Vector2;

// Position component for storing the entity's position
public class PositionComponent implements Component {
    public Vector2 position = new Vector2();
}

Systems

A System processes entities that have specific components. This is where we define what entities with those components actually do.

import com.badlogic.ashley.core.*;
import com.badlogic.ashley.systems.IteratingSystem;

// System to update the position of all entities with a PositionComponent
public class MovementSystem extends IteratingSystem {

    // ComponentMapper gives fast, cached access to an entity's component
    private static final ComponentMapper<PositionComponent> positionMapper =
            ComponentMapper.getFor(PositionComponent.class);

    public MovementSystem() {
        super(Family.all(PositionComponent.class).get());
    }

    @Override
    protected void processEntity(Entity entity, float deltaTime) {
        PositionComponent position = positionMapper.get(entity);

        // Example movement logic (moving to the right)
        position.position.x += 50 * deltaTime;
    }
}

The Ashley Engine

To manage all the entities and systems we need an Engine.

import com.badlogic.ashley.core.Engine;
import com.badlogic.ashley.core.Entity;

public class GameWorld {
    private Engine engine;
    
    public GameWorld() {
        engine = new Engine();
        
        // Add systems to the engine
        engine.addSystem(new MovementSystem());
    }
}
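
The engine only does work when it is updated, typically once per frame. A small sketch, assuming we add an update method to the GameWorld above:

// inside GameWorld: advance all registered systems (e.g. MovementSystem)
public void update(float deltaTime) {
    engine.update(deltaTime);
}

// called from the game loop, e.g. in render()
gameWorld.update(Gdx.graphics.getDeltaTime());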

Creating an Entity and adding it to the engine

Entities are created and components are added to them before adding them to the engine.

import com.badlogic.ashley.core.Entity;

public void createEntity() {
    Entity player = new Entity();
    
    PositionComponent position = new PositionComponent();
    position.position.set(100, 100);
    
    HealthComponent health = new HealthComponent();
    
    player.add(position);
    player.add(health);
    
    engine.addEntity(player);
}

In this example we are assigning what kind of components/attributes the entity needs. What happens with these attributes is then handled by the systems.

Summary:

  • Entities are added to the Engine.
  • The system automatically updates all PositionComponent entities.

A Family defines which entities a system processes.

We might have systems whose logic applies to entities carrying several components. We can use a Family to group the required components together.

Family movingEntities = Family.all(PositionComponent.class).get();
ImmutableArray<Entity> entities = engine.getEntitiesFor(movingEntities);

Summary of key words in Ashley

| Feature   | Purpose                                       |
|-----------|-----------------------------------------------|
| Component | Holds data (e.g., position, health)           |
| System    | Processes entities with specific components   |
| Engine    | Manages entities and systems                  |
| Entity    | Collection of components                      |
| Family    | Defines entity groups for systems to process  |

Others (Audio, Physics, Texture Packing, Tile Maps, Saving)

LibGDX provides its own audio system for short sound effects (Sound) and longer music (Music).

We created our own system, SoundSystem, which is integrated with the Ashley engine.

Directory structure

Sound files are stored in the following directory:

assets/audio/

  • click.wav
  • background.mp3

.wav is better suited for short effects (e.g. UI sounds)
.mp3 works for music, but can cause problems on mobile devices

Using the SoundSystem

The SoundSystem is automatically added to the Ashley engine (engine.addSystem(new SoundSystem())) and starts looping background music when the game starts.

Sound effects can be triggered anywhere like this:

SoundSystem soundSystem = engine.getSystem(SoundSystem.class);
if (soundSystem != null) {
  soundSystem.playClick();
}

The system can later be extended with:

  • additional named sounds (play("build"), play("attack"))
  • volume control via the UI
  • music changes when switching screens
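
A minimal sketch of what such a SoundSystem could look like; the file paths follow the directory structure above and playClick matches the usage shown earlier, but the actual class in our repository may differ:

import com.badlogic.ashley.core.Engine;
import com.badlogic.ashley.core.EntitySystem;
import com.badlogic.gdx.Gdx;
import com.badlogic.gdx.audio.Music;
import com.badlogic.gdx.audio.Sound;

public class SoundSystem extends EntitySystem {
    private Sound click;
    private Music background;

    @Override
    public void addedToEngine(Engine engine) {
        // load audio assets once the system is registered
        click = Gdx.audio.newSound(Gdx.files.internal("audio/click.wav"));
        background = Gdx.audio.newMusic(Gdx.files.internal("audio/background.mp3"));
        background.setLooping(true);
        background.play();
    }

    public void playClick() {
        click.play();
    }

    @Override
    public void removedFromEngine(Engine engine) {
        // Sound and Music hold native resources and must be disposed
        click.dispose();
        background.dispose();
    }
}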

Free sound sources

Here are some reliable sources for free sounds:

Examples we currently use:

Testability

The SoundSystem class is covered by unit tests:

  • Registration with the engine (engine.getSystem(...))
  • Triggering playClick() with a MockSound

For details, see SoundSystemTest.java in the core/test directory.
