Innovation Days 2009

In this page we explain how we took care of Chico during the Innovation Days 2009 exhibition, held at FIL (Lisbon International Fairgrounds) from 18 to 20 June 2009.

EuroNews covered the event: http://www.euronews.net/2009/06/23/innovation-days-in-lisbon/

''Note: please refer to the [[iCub demos]] article for up-to-date instructions about managing Chico. This page, as well as [[Innovation Days 2009/Archive]], is obsolete!''

== Ball tracking and reaching demo ==
''Note: for the old (manual) method, see [[Innovation Days 2009/Archive]]. Here we report the new method based on the Application Manager GUI and an XML file.''

This demo uses the left eye and the right arm of Chico. Assumption: the left eye is running at a resolution of 320x240 pixels. Attention: make sure that the launcher uses '''icubbrain2''' (32-bit) or any of the 32-bit Cortexes for the kinematics computation.

These dependencies will be checked by the GUI:
* [[#Cameras | cameras]] are running
* [[#iCubInterface | iCubInterface]] is running
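For reference, these checks can also be done by hand from any machine with the yarp companion (a sketch; the exact port names below are typical for Chico but are our assumption, not taken from this page):
  # ask the name server whether the expected ports are registered
  yarp name query /icub/cam/left      # left-eye framegrabber
  yarp name query /icub/head/state:o  # one of the ports opened by iCubInterface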
You need to set the color parameters of the camera (left eye) with the framegrabberGui to the following values. Be careful: the first time you move any slider, the change typically does not take effect; move the slider to a random position first, then to the desired value.
  brightness 0
  sharpness 0.5
  white balance red 0.474
  white balance blue 0.648
  hue 0.482
  saturation 0.826
  gamma 0.400
  shutter 0.592
  gain 0.305
On chico3 type:
  cd $ICUB_ROOT/app/default/scripts
  ./manager.py $ICUB_ROOT/app/demoReach_IIT_ISR/scripts/isr/demoReach_IIT_ISR_RightHand.xml
Notes:
* the aforementioned XML file configures a system that moves the head and right arm of the robot. There are also XML files for moving the head and the left hand (<code>demoReach_IIT_ISR_LeftHand.xml</code>), for moving just the head (<code>demoReach_IIT_ISR_NoHand.xml</code>), and for starting only the tracker, which moves nothing (<code>demoReach_IIT_ISR_JustTracker.xml</code>).
* you still need to set the camera parameters with the framegrabberGui; we are working on making this automatic.
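For illustration, all four variants can be launched with the same two commands by parameterizing the file name (a minimal sketch; the <code>VARIANT</code> variable is ours and not part of the original scripts):
  # pick one of: RightHand, LeftHand, NoHand, JustTracker
  VARIANT=RightHand
  cd $ICUB_ROOT/app/default/scripts
  ./manager.py $ICUB_ROOT/app/demoReach_IIT_ISR/scripts/isr/demoReach_IIT_ISR_${VARIANT}.xml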
=== Pausing the Ball Following demo ===
You can do this by disconnecting two ports (and connecting them again when you want to resume the demo).
Note that this only stops the positions from being sent to the inverse kinematics module (which, in almost all cases, stops the robot). The rest of the processes (e.g., the tracker) keep running.
* Pause (from any machine)
  yarp disconnect /icub/LeftEyeToRoot/ballPositionOut /iKinArmCtrl/right_arm/xd:i
* Resume (from any machine)
  yarp connect /icub/LeftEyeToRoot/ballPositionOut /iKinArmCtrl/right_arm/xd:i
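The two commands can be wrapped in a small helper script (a hypothetical convenience, not part of the original demo scripts):
  #!/bin/bash
  # pause_resume.sh -- break or restore the single connection that feeds
  # ball positions to the inverse kinematics module
  SRC=/icub/LeftEyeToRoot/ballPositionOut
  DST=/iKinArmCtrl/right_arm/xd:i
  case "$1" in
    pause)  yarp disconnect $SRC $DST ;;
    resume) yarp connect    $SRC $DST ;;
    *)      echo "usage: $0 {pause|resume}"; exit 1 ;;
  esac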
=== Quitting the Ball Following demo ===
* Disconnect all ports. Careful: this will stop the robot.
  cd 13.FIL
  ./7.disconnect_all.sh
* Quit/Kill the processes of the demo, in any order.




== Facial expressions ==

* Start the facial expression driver on the pc104:
  cd $ICUB_DIR/app/faceExpressions/scripts
  ./emotions.sh $ICUB_DIR/app/iCubLisboa01/conf
* Start the facial expression demo by typing one of these two sequences on any machine:
  $ICUB_DIR/app/faceExpressions/scripts/cycle.sh

or

  yarp rpc /icub/face/emotions/in
  set all hap // full face happy
  set all sad // full face sad
  set all ang // full face angry
  set all neu // full face neutral
  set mou sur // mouth surprised
  set eli evi // eyelids evil
  set leb shy // left eyebrow shy
  set reb cun // right eyebrow cunning
  set all ta1 // mouth talking position 1 (mouth closed)
  set all ta2 // mouth talking position 2 (mouth open)

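The same expressions can also be cycled non-interactively from the shell. The following is a rough sketch of what <code>cycle.sh</code> might do (the expression order and the two-second delay are our assumptions):
  # cycle through the full-face expressions, one every two seconds
  for e in hap sad ang neu; do
    echo "set all $e" | yarp rpc /icub/face/emotions/in
    sleep 2
  done
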
== SIFT Object Detection and Tracking demo ==

Ignore this for now, as some libraries are not compiling. [controlGaze2 COMPILATION TO BE FIXED ON THE SERVERS - PROBLEMS WITH EGOSPHERELIB_LIBRARIES and PREDICTORS_LIBRARIES]

Assumptions:

On any machine, run:

  $ICUB_ROOT/app/$ICUB_ROBOTNAME/scripts/controlGazeManual.sh
  $ICUB_ROOT/app/$ICUB_ROBOTNAME/scripts/attentionObjects/noEgoSetup/CalibBothStart.sh

On any machine (preferably one of the icubbrains, as this task is computationally heavy) type:

  $ICUB_DIR/app/$ICUB_ROBOTNAME/scripts/attentionObjects/noEgoSetup/startSiftObjectRepresentation.sh

If you want to change the configuration, do:

  nano -w $ICUB_DIR/app/$ICUB_ROBOTNAME/conf/icubEyes.ini

== Attention system demo ==

Assumptions:

Please note: we will run all the modules of this demo on icubbrain1 (64-bit), unless specified otherwise.

Start the following module:

  cd $ICUB_ROOT/app/attentionDistributed/scripts
  ./camCalibRightManual.sh

Then:

  cd $ICUB_ROOT/app/attentionDistributed/scripts/
  ./salienceRightManual.sh

Then:

  cd $ICUB_ROOT/app/attentionDistributed/scripts
  ./egoSphereManual.sh

Then:

  cd $ICUB_ROOT/app/attentionDistributed/scripts
  ./attentionSelectionManual.sh

Then:

  cd $ICUB_ROOT/app/attentionDistributed/scripts/
  ./controlGazeManual.sh

And finally, but this time on chico3:

  cd $ICUB_ROOT/app/attentionDistributed/scripts
  ./appGui.sh

In each tab of the GUI, press 'check all ports and connections', then press the '>>' buttons (they should turn green).

Now, in the 'Salience Right' tab of the GUI, press 'Initialize interface' and move the thresholds a bit (e.g., the 'intensity' one).
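
For reference, the five module-launching steps above can be collapsed into a single script (a hypothetical convenience; it assumes each *Manual.sh script tolerates being started in the background, which may not hold if they are interactive):
  #!/bin/bash
  # start the attention pipeline in the order listed above
  # (appGui.sh is excluded: it must run on chico3)
  cd $ICUB_ROOT/app/attentionDistributed/scripts
  for s in camCalibRightManual.sh salienceRightManual.sh egoSphereManual.sh \
           attentionSelectionManual.sh controlGazeManual.sh; do
    ./$s &
    sleep 2   # give each module time to open its ports
  done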