Getting started with the Pepper robot
Pepper is a life-size humanoid robot on wheels, often used in commercial and educational settings. It has many degrees of freedom in its upper body, making it well suited for complex animations and gestures, and it can navigate and move around quite well thanks to its wheels. Although its face cannot display facial expressions, people have had some success using body poses, gaze, and eye colours to convey emotions and robot states.
The Pepper robot is programmed through a graphical editor, Choregraphe, which also includes a built-in virtual version of the robot. The main developer website by Aldebaran and SoftBank Robotics contains a wealth of information on using and programming the robot. Here, we focus on the basics to get you started with your own project.
Installing Choregraphe
Go to the main documentation page for Pepper and open the download section for your operating system: https://developer.softbankrobotics.com/pepper-naoqi-25
Scroll down and download the latest Choregraphe setup (2.5.x). Note the licence key; you will need it during setup.
Install the downloaded file (the default settings should be fine). You may see images of other robots during the installation process; this is normal.
images/pepper/installation.png
Launch Choregraphe after installation. You may get several popups from your firewall; make sure you allow all the programs access to the network (e.g. qilaunch.exe, naoqi.exe, choregraphe-bin.exe).
Working with Choregraphe
Choregraphe is a visual flow-diagram editor. You program your robot by linking together a series of action boxes to form the logic of your interaction.
When working with a physical Pepper is not possible (for example, during the corona pandemic), you can use the built-in virtual Pepper instead. There are some limitations you must keep in mind when using the virtual Pepper:
- It cannot easily be distributed as part of an online experiment; instead, you should create interactive videos and distribute those to your participants or users (for example, using H5P software).
- It does not support speech synthesis. Instead, you can use an external text-to-speech engine (for example, Google text-to-speech or Amazon Polly) to generate speech audio files and edit these into your interactive video; see the sketch after this list.
- It (obviously) does not support any sensors, such as touch, or microphones for speech recognition. In a live interaction with the virtual robot, you can fake sensor input by directly remote-controlling the robot's responses; in an interactive video, you can offer the user multiple-choice selections for providing input.
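If you go the interactive-video route, you can pre-generate the speech audio with any external text-to-speech engine and edit the resulting files into your video. Below is a minimal sketch of this idea using the third-party gTTS Python package (a client for Google's text-to-speech); the package choice and the example lines are assumptions, and any TTS engine that can write audio files will do.

```python
# Minimal sketch: pre-generate speech audio for an interactive video.
# Assumes the third-party gTTS package is installed (pip install gTTS).
from gtts import gTTS

# Example lines the virtual Pepper should "speak" in the video (hypothetical).
robot_lines = {
    "greeting": "Hello, I am Pepper. Nice to meet you!",
    "question": "Which of these options would you like to pick?",
}

for name, text in robot_lines.items():
    gTTS(text=text, lang="en").save(name + ".mp3")  # e.g. greeting.mp3
```

The resulting audio files can then be placed at the robot's speech moments in your video editor or H5P timeline.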
When Choregraphe has started, you will see an empty project.
First, we need to configure Choregraphe to use a virtual Pepper robot instead of the default NAO. Unfortunately, this step seems to be a bit buggy. First, go to Edit -> Preferences in the menu bar, click on the Virtual Robot tab, select Pepper Y20 (V10), and press OK to save the changes. Press OK again if you are asked to restart the robot.
images/pepper/select-virtual-robot.png
Then, go back to the same Virtual Robot tab, and now select the latest version of the robot, Pepper Y20 (V16). Press OK to save and OK again to restart the robot.
images/pepper/select-virtual-robot-2.png
Click on Connection -> Connect to virtual robot from the menu bar. You should now see a virtual version of the Pepper appear in the Choregraphe editor.
images/pepper/connect-virtual.png images/pepper/connected.png
Connecting to a physical Pepper
If you want to connect to a physical Pepper, turn it on by pressing the start button underneath the tablet. During start-up, Pepper makes sounds and also moves around slightly to sense its environment. When Pepper says 'OGNAK GNOUK' it has fully started up; this might take a few minutes. Note that Pepper might still move slightly from time to time, even when you have not initiated anything.
To connect to the physical Pepper, make sure your computer is on the same Wi-Fi network as Pepper. In Choregraphe, click on the green connection symbol or click on Connection -> Connect to ... from the menu bar. In the window that appears, Pepper should be listed with a green status icon on its left. Click on this row and then click on Select.
images/pepper/select-physical-pepper.jpeg
The connection window should disappear. While the connection to Pepper is being established, a small blue update icon is displayed next to the green connection icon. Once the connection is established, a Pepper should be displayed in the virtual environment on the right and the title bar should say 'Connected to Pepper'. You should also see that all movements made by the physical Pepper are mirrored by the virtual one in Choregraphe.
You can access the settings of the robot by typing its IP address into the address bar of a web browser. You can get the IP address by pressing the start button on Pepper's chest, below the tablet; Pepper will then say its IP address aloud. A window will appear asking for a username and password; if the default setting has not been changed, both are 'nao'. You now have access to Pepper's web page, where you can adjust settings like volume, brightness and language.
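The IP address is also all you need to talk to the robot from your own scripts. As a minimal sketch, assuming the NAOqi Python SDK (Python 2.7 for NAOqi 2.5) is installed on your computer, and using 192.168.1.23 as a placeholder for your Pepper's IP address:

```python
# Minimal sketch: control a physical Pepper over the network from Python.
# Assumes the NAOqi Python SDK is installed; the IP address is a placeholder.
from naoqi import ALProxy

PEPPER_IP = "192.168.1.23"  # replace with the IP address your Pepper announces
PORT = 9559                 # default NAOqi port

# Adjust the speaker volume (0-100).
audio = ALProxy("ALAudioDevice", PEPPER_IP, PORT)
audio.setOutputVolume(50)

# Make Pepper say something.
tts = ALProxy("ALTextToSpeech", PEPPER_IP, PORT)
tts.say("Hello from Python!")
```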
Start Programming Pepper
Make your first program by letting Pepper dance. First, in the Box libraries area, navigate to Entertainment -> Dances -> Pepper and drag the Disco dance to the main editor canvas. It now appears as an action box that you can use in the flow of your program.

Then you must connect this action box to the flow of the program, by connecting its onStart trigger to the main onStart. Do this by dragging a line from the top-left play icon in the main editing canvas to the play icon in the Disco action.
images/pepper/connect-dance.png
Now play your script by pressing the play button in the main toolbar or by pressing F5. You should see the virtual robot performing a dance.
images/pepper/play.png images/pepper/dancing.png images/pepper/dancing.gif
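Under the hood, every Choregraphe box is driven by a small Python script that reacts to its inputs and fires its outputs; double-clicking a script box shows it. The sketch below follows the standard box template and only illustrates the general pattern; the comments in onInput_onStart are an assumption, not the actual contents of the Disco box.

```python
# Minimal sketch of a Choregraphe box script, based on the standard box template.
class MyClass(GeneratedClass):
    def __init__(self):
        GeneratedClass.__init__(self)

    def onLoad(self):
        pass  # initialisation code goes here

    def onUnload(self):
        pass  # clean-up code goes here

    def onInput_onStart(self):
        # This runs when the box's onStart input is triggered (the line you drew).
        # A real box would start its animation or behaviour here.
        self.onStopped()  # fire the onStopped output so the flow can continue

    def onInput_onStop(self):
        self.onUnload()
        self.onStopped()
```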
Next steps
Head over to the official documentation for more details on moving the robot's joints and creating and editing animations.
While editing animations, it may help to turn autonomous life off.
images/pepper/autonomous-life.png
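Autonomous life can be toggled directly in Choregraphe (see the image above). If you prefer to switch it off from a script, here is a minimal sketch, again assuming the NAOqi Python SDK and a placeholder IP address:

```python
# Minimal sketch: disable autonomous life before editing animations.
# Assumes the NAOqi Python SDK; the IP address is a placeholder.
from naoqi import ALProxy

PEPPER_IP = "192.168.1.23"  # replace with your Pepper's IP address

life = ALProxy("ALAutonomousLife", PEPPER_IP, 9559)
life.setState("disabled")   # other states include "solitary" and "interactive"

# Wake the robot up again so its motors stay stiff while you animate it.
motion = ALProxy("ALMotion", PEPPER_IP, 9559)
motion.wakeUp()
```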
Android Studio
For some time now, it has also been possible to work with Pepper through Android Studio with the Pepper QiSDK. Through a plug-in, Pepper can be fully programmed in Android Studio. This has the advantage that all operations can be done through one application, unifying several parts that were previously done separately when using Choregraphe. Programming Pepper through Android Studio is mainly done in Java. In the future, the QiSDK will most likely become the main way to work with Pepper. Here you can find some more background information about the QiSDK.
Installing the SDK
SoftBank Robotics offers quite good documentation of the steps to follow for installing the SDK in Android Studio. A prerequisite is, of course, having Android Studio installed on your laptop or computer. Android Studio can be downloaded here.
Once you have Android Studio installed, you can simply follow the steps of the SoftBank Robotics installation guide to install the SDK plug-in. First, you need to install the Android SDK and the corresponding build tools. You might not immediately be able to see the main menu from which the SDK installation is done; in that case, you can add a shortcut for the menu so you can get to it more easily.