iCub instructions
[[File:ICubLisbonSept2015bright.png|thumb|400px|right|caption|Lisbon iCub robot.]]


This article explains how to use the full-body Lisbon iCub robot (code-name: iCubLisboa01, nickname: Chico) for '''demos and experiments''' alike. We will describe the hardware setup that accompanies our iCub, how to turn things on and off, and how to run demos. For a generic description of the robot, refer to the [[Chico]] article.
 
''An older version of this article can be found at [[iCub instructions/Archive]].''


== Setup ==
The inventory consists of:
{| class="wikitable" border="1"
|-
!  machine
!  notes
!  IP address, username
|-
|  '''Chico''' the robot (duh)
|  has a [[pc104]] CPU in its head
|  10.10.1.50, icub
|-
|  '''[[iCub laptop]]'''
|  used to control the robot; also, it exports shared volumes (directories) to other machines of the computational cluster
|  10.10.1.53, icub
|}


Below Chico's metal support, from top to bottom we have:
{| class="wikitable" border="1"
|-
!  what
!  notes
|-
|  Xantrex XFR 35-35
|  thin power supply unit, to power [[pc104]] and some motors. '''Voltage: 12.9''', current: 05.0 or more
|-
|  Xantrex XFR 60-46
|  thick power supply unit, to power most motors. '''Voltage: 40.0''' (initially it is 0 - it changes when you turn the green motor switch), current: 10.0 or more
|-
|  APC UPS
|  uninterruptible power supply
|}


In a separate room there are servers that make up the iCub computational cluster, to run the bulk of algorithms and programs (inverse kinematics, vision routines, etc.).
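Before a demo it can save time to verify from a terminal that the cluster machines are reachable. A sketch along these lines can help; the name:IP pairs are the ones quoted in the tables and older revisions of this article, so treat them as assumptions and adjust them to your network:

```shell
#!/bin/sh
# Sketch: reachability check for the iCub cluster machines.
# The name:IP pairs follow this article; edit them for your setup.
check_cluster() {
    for entry in pc104:10.10.1.50 icub-laptop:10.10.1.53 \
                 icubbrain1:10.10.1.41 icubbrain2:10.10.1.42; do
        name=${entry%%:*}
        ip=${entry##*:}
        if [ "${1:-}" = "--dry-run" ]; then
            # only print what would be executed
            echo "ping -c 1 -W 1 $ip   # $name"
        else
            ping -c 1 -W 1 "$ip" >/dev/null 2>&1 \
                && echo "$name OK" || echo "$name UNREACHABLE"
        fi
    done
}

check_cluster --dry-run
```

Run it without <code>--dry-run</code> on a machine inside the 10.10.1.x network to get an OK/UNREACHABLE line per host.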


== Switching on the robot ==
=== Hardware side ===


* Check that the UPS is on
* Check that the [[iCub laptop]] is on; if not, switch it on
* Turn on the Xantrex power supply units; make sure the voltage values are correct (see [[iCub_instructions#Setup]])
* Check that the '''red emergency button is unlocked'''
* Turn on the green switches next to Chico
** Safety hint: first turn on the [[pc104]] CPU switch, wait for the CPU to be on, and only then switch the motors on
** Another safety hint, after turning on motors: wait for the four purple lights on each board to turn off and become two blue lights – at this point you can continue to the next steps


=== Software side ===


* Check that the YARP Module Manager (yarpmanager) window is shown; if it is not, run '''yarpmanager.sh''' from the laptop desktop icon and [[#Starting YARP components|start the needed YARP components (click here for detailed instructions)]]. The summary is:
** First, launch <code>yarpserver</code> with the 'Run' button in the Cluster tab
** Then, if your application employs machines other than the laptop itself, launch the necessary <code>yarprun</code> listeners in the Entities panel: '''icub-laptop, pc104, icubbrain1, icubbrain2, icub-cuda'''
* The Entities tab is a GUI that manages all the necessary iCub applications
** In the iCubStartup application, launch all modules, including yarprobotinterface, the most important one, which calibrates the robot. Just in case, make sure that the iCub is vertical before launching this software, otherwise the robot will fall down (dangerous)
** Other applications: depending on the desired application or demo, you will need to start other necessary drivers such as [[#Cameras | cameras]] or [[#Facial expression driver | facial expressions]]
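For reference, what the yarpmanager buttons do boils down to starting one <code>yarpserver</code> and one <code>yarprun</code> listener per machine. This dry-run sketch prints the equivalent terminal commands; the node names are the ones listed above and are assumptions for your setup:

```shell
#!/bin/sh
# Sketch: print the commands that the Cluster/Entities panels issue,
# roughly - one yarpserver, then one yarprun listener per machine.
start_yarp_components() {
    echo "yarpserver &"                  # single name server instance
    for node in icub-laptop pc104 icubbrain1 icubbrain2 icub-cuda; do
        # each machine runs its own listener, named after itself
        echo "yarp run --server /$node &"
    done
}

start_yarp_components
```

Each <code>yarp run --server /node</code> line must be executed on the corresponding machine, not all on the laptop.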


== Shutting off the robot ==
=== Software side ===


* Stop your demo software and the cameras with the GUI; do not stop yarprobotinterface (in the iCubStartup panel) yet, nor <code>yarpserver</code> (in the Cluster Manager window)
* In the iCubStartup panel of the GUI, stop all modules, including yarprobotinterface. Chico will thus move its limbs and head to a "parking" position. (If things don't quit gracefully, stop or kill the process again and be ready to hold Chico's chest since the head may fall to the front.)
* In the Cluster Manager window, stop the instances of <code>yarprun</code> (lower panel). It is not necessary to stop <code>yarpserver</code> (upper panel), we usually keep it on


=== Hardware side ===


* Shut down the [[pc104]] with this command in a terminal: <code>sudo halt</code>
* Turn off the two green switches. Pay attention when turning off the 'Motors' switch: if yarprobotinterface was not stopped properly in the previous steps, then be ready to hold the robot when turning that switch
* Turn off the Xantrex power supply units
* Turn off the Xantrex power supply units
* Do not turn off the [[iCub laptop]], we usually keep it on


== Stopping the robot with the red emergency button ==
The emergency button, as the name suggests, is to be used for emergencies only. For example:
* When the robot is about to break something
* When some components make nasty noises that suggest they are going to break
Use this button with great care, as it cuts power to all motors and controllers abruptly! In particular, be ready to hold Chico, because the upper part of his body might fall upon losing power.


To start using the robot again, it is convenient to quit and restart all the software components and interfaces; refer to [[#Switching on the robot]] for that. '''Don't forget to unlock the red emergency button''' after an emergency, otherwise the program [[#yarprobotinterface]] will start but not move any joint.


== Starting YARP components ==


* Click on the '''yarpmanager.sh''' icon on the desktop and select 'Run in Terminal' (or 'Run' if you want to suppress the optional debug information terminal)
* You will see a window divided into two parts: Cluster Management and Nodes
* In the Cluster part, click on the green 'Run yarpserver' play button. The light above the 'Stop' button will become green.
* In the Nodes part, choose all the machines in the 'Select' column, then click 'Run Selected' and wait a bit so that all machines can turn on their green 'On' light.


=== Screenshots ===


[[Image:Yarpmanager_launch_yarpserver.png|1024px]]


[[Image:Yarpmanager_launch_yarprun.png|1024px]]


[[Image:Yarpmanager_launch_yarprobotinterface.png|1024px]]


== Other components ==


Many demos and programs assume that components such as cameras, the yarprobotinterface driver or the facial expression driver have been launched. To start them, first of all run the '''yarpmanager.sh''' icon. A GUI similar to this one will appear:


[[Image:gyarpmanager.png]]


When you have one or more applications running, each one will have its panel (tab) and the following toolbar will be visible. Here are the most important functions (which affect all modules of the currently selected application):


[[Image:gyarpmanager_toolbar.png]]


=== yarprobotinterface ===


This program controls the motors and reads the robot sensors (encoders, inertial sensor, skin, force/torque). It is needed by almost all demos.
 
* '''Check that the red emergency button is unlocked'''
* Open the '''iCubStartup''' panel in the yarpmanager GUI; click the Run Application button. This will start both kinematics (yarprobotinterface, cartesian solvers and gaze control) and dynamics (wholeBodyDynamics and gravityCompensator)
 
Wait for all boards to answer (which takes around 1 minute); after that, you are ready to move on.


There is a GUI application to manually command robot joints. Just invoke it from the [[iCub laptop]] with:
 yarpmotorgui
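Before starting a demo, you can also double-check from a terminal that yarprobotinterface came up by asking the name server whether the usual state ports exist. A sketch; the <code>/icub/...</code> port names are the conventional ones and may differ on your installation:

```shell
#!/bin/sh
# Sketch: print checks for the standard iCub state ports; run the
# printed commands by hand on a machine with YARP installed.
check_robot_ports() {
    for port in /icub/head/state:o /icub/torso/state:o \
                /icub/left_arm/state:o /icub/right_arm/state:o; do
        echo "yarp exists $port"
    done
}

check_robot_ports
```

<code>yarp exists</code> returns success only when the port is registered with the name server.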


=== Cameras ===


* Make sure that basic YARP components are running: [[#Switching on the robot|summarized Cluster Manager instructions are here]]; detailed instructions are in section [[#Starting YARP components]]
* Open the '''Cameras_320x240_for_ball_tracking''' panel in the yarpmanager GUI; click the Run Application button; click the Connect Links button
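To eyeball a camera stream by hand (essentially what the Connect Links button wires up), open a viewer and connect it to the left eye port. A dry-run sketch; <code>/icub/cam/left</code> is the conventional port name and may differ on your setup:

```shell
#!/bin/sh
# Sketch: commands to view the left eye camera stream manually.
view_left_camera() {
    echo "yarpview --name /view/left &"           # open a viewer window
    echo "yarp connect /icub/cam/left /view/left" # wire camera -> viewer
}

view_left_camera
```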


=== Facial expression driver ===


* Make sure that basic YARP components are running: [[#Switching on the robot|summarized Cluster Manager instructions are here]]; detailed instructions are in section [[#Starting YARP components]]
* Open the '''Face_Expressions''' panel in the yarpmanager GUI; click the Run Application button; click the Connect Links button


Note that the actual expression device driver (the first module of the two listed) runs on the [[pc104]]. Sometimes, that process cannot be properly killed and restarted from the graphical interface; if you need to do that, you can either <code>kill -9</code> its PID, or do a hard restart of the [[pc104]].
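If you do need to kill it by hand, something along these lines works over ssh. A dry-run sketch: the process name (<code>emotionInterface</code>) and the login are assumptions, so check with <code>ps</code> on the [[pc104]] first:

```shell
#!/bin/sh
# Sketch: locate and force-kill a stuck expression driver on the pc104.
# Process name and credentials are assumptions - verify before running.
kill_expression_driver() {
    echo "ssh icub@10.10.1.50 'ps aux | grep -i emotion'"     # find it
    echo "ssh icub@10.10.1.50 'pkill -9 -f emotionInterface'" # kill it
}

kill_expression_driver
```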


=== iCubGui ===


This component shows a real-time 3D model of the robot on the screen.


* Make sure that basic YARP components are running: [[#Switching on the robot|summarized Cluster Manager instructions are here]]; detailed instructions are in section [[#Starting YARP components]]
* Open the '''iCubGui''' panel in the yarpmanager GUI; click the Run Application button; click the Connect Links button


=== Skin GUI ===


* Make sure that basic YARP components are running: [[#Switching on the robot|summarized Cluster Manager instructions are here]]; detailed instructions are in section [[#Starting YARP components]]
* Open the '''Skin_Gui_All''' panel in the yarpmanager GUI; click the Run Application button; click the Connect Links button
 


== Specific demos ==


''Refer to [[iCub demos/Archive]] for older information such as starting demos from terminals''.


=== Ball tracking and grasping ===


* Make sure that basic YARP components are running: [[#Switching on the robot|summarized Cluster Manager instructions are here]]; detailed instructions are in section [[#Starting YARP components]]
* Make sure that the applications iCubStartup and Cameras_320x240_for_Ball_Tracking have been started
* Optionally, Face_Expressions and iCubGui can be started too
* Open the '''Red-Ball_Demo''' panel in the yarpmanager GUI; click the Run Application button; click the Connect Links button


Note that this demo launches the left eye camera with special parameter values:
   brightness 0
   sharpness 0.5
   white balance red 0.474      // you may need to lower this, depending on illumination
   white balance blue 0.648
   hue 0.482
   saturation 0.826
   gamma 0.400
   shutter 0.592
   gain 0.305


=== Facial expressions ===
 
* Make sure that basic YARP components are running: [[#Switching on the robot|summarized Cluster Manager instructions are here]]; detailed instructions are in section [[#Starting YARP components]]
* Start the [[#Facial expression driver | facial expression driver]]
* Select the '''EMOTIONS2''' icon (Run in terminal); stop with ctrl+c
 
=== Force Control ===


* Make sure that basic YARP components are running: [[#Switching on the robot|summarized Cluster Manager instructions are here]]; detailed instructions are in section [[#Starting YARP components]]
* Make sure that the application iCubStartup has been started
* Click on the '''Force_Control''' panel in the yarpmanager GUI, Run Application, Connect Links
* Select the desired modality (screenshot below) and manually move the robot limbs:
[[Image:Force_control_gui.png]]
* More information available here: http://eris.liralab.it/wiki/Force_Control


=== Interactive Objects Learning Behavior ===


For this demo, you also need a Windows machine with a <code>yarprun</code> listener (the speech recognition module will be launched on this machine).

In yarpmanager, select the "Interactive Objects Learning Behavior with SCSPM" application: refresh, run, connect.
 
The grammar of recognized spoken sentences is located at
https://github.com/robotology/iol/blob/master/app/lua/verbalInteraction.txt
 
Notes: run the following commands to use IOL Object Recognition side by side with the POETICON++ demo.
  yarp rpc /actionsRenderingEngine/cmd:io
  home all
 
  yarp rpc /iolStateMachineHandler/human:rpc
  attention stop


=== Yoga ===


* Make sure that basic YARP components are running: [[#Switching on the robot|summarized Cluster Manager instructions are here]]; detailed instructions are in section [[#Starting YARP components]]. For this demo we only need <code>yarpserver</code> (<code>yarprun</code>s are not necessary)
* Make sure that iCubStartup ([[#yarprobotinterface|yarprobotinterface]]) is running
* Open the '''Yoga''' panel in the yarpmanager GUI; click the Run Application button


[[Category:Vislab]]

''Latest revision as of 18:58, 14 February 2020''
