ICub instructions/Archive

Note: these methods are obsolete and are kept here for historical reference only. You can most likely ignore this page and go back to iCub demos.

Useful things to archive

Desktop icons to launch the Application Manager and XMLs

Example: cameras.sh icon (to be placed in ~/Desktop/) contains:

 #!/bin/bash
 source ~/.bash_env
 cd $ICUB_ROOT/app/default/scripts
 ./manager.py cameras_320x240.xml

Note that Resource Finder searches for XMLs in $ICUB_ROOT/app/$ICUB_ROBOTNAME/scripts/ first, then in a number of other directories.
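
A similar icon can be created for other applications. For instance, a plausible sketch of what a cluster_manager.sh icon might contain, assuming it simply wraps the icub-cluster.py command shown in the next section:

 #!/bin/bash
 source ~/.bash_env
 # launch the YARP cluster manager (see "Starting YARP components" below)
 cd $ICUB_ROOT/app/default/scripts
 ./icub-cluster.py $ICUB_ROOT/app/$ICUB_ROBOTNAME/scripts/vislab-cluster.xml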

Starting YARP components

If the desktop icon (i.e., cluster_manager.sh) does not work, you can launch the program from a terminal:

  cd $ICUB_ROOT/app/default/scripts
  ./icub-cluster.py $ICUB_ROOT/app/$ICUB_ROBOTNAME/scripts/vislab-cluster.xml
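
Once the cluster is up, you can check from any machine that the YARP name server is reachable and see which ports are registered; a minimal sketch, assuming yarp is in your PATH:

  # print the address of the YARP name server used by this network
  yarp where
  # list the ports currently registered with the name server
  yarp name list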

Other components

Cameras

If the desktop icon (i.e., cameras.sh) does not work, type this in a chico3 console:

  cd $ICUB_ROOT/app/default/scripts
  ./manager.py cameras_320x240.xml

That XML file is actually located in $ICUB_ROOT/app/$ICUB_ROBOTNAME/scripts/. Resource Finder gives priority to that directory when $ICUB_ROBOTNAME is defined, as is the case for us (iCubLisboa01).

Alternatively, the second command can use the cameras_640x480.xml configuration if you need a higher resolution.
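
To quickly verify that the cameras are streaming, you can open a viewer and connect it to a camera port; a minimal sketch, assuming the standard /icub/cam/left port name used on our setup:

  # open an image viewer on any machine with a display
  yarpview --name /view/left &
  # connect the left camera stream to the viewer
  yarp connect /icub/cam/left /view/left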

Very old information (Innovation Days 2009 archive)

Facial expressions

• Start the facial expression driver on the pc104:

  cd $ICUB_DIR/app/faceExpressions/scripts
  ./emotions.sh $ICUB_DIR/app/iCubLisboa01/conf

• Start the facial expression demo by typing one of these two sequences on any machine:

  $ICUB_DIR/app/faceExpressions/scripts/cycle.sh

or

  yarp rpc /icub/face/emotions/in
  set all hap // full face happy
  set all sad // full face sad
  set all ang // full face angry
  set all neu // full face neutral
  set mou sur // mouth surprised
  set eli evi // eyelids evil
  set leb shy // left eyebrow shy
  set reb cun // right eyebrow cunning
  set all ta1 // mouth talking position 1 (mouth closed)
  set all ta2 // mouth talking position 2 (mouth open)
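
Since these are plain RPC commands, they can also be scripted; a minimal sketch that sends a couple of the expressions above from a shell, assuming yarp is in your PATH:

  # send a single expression command over RPC, then return to neutral
  echo "set all hap" | yarp rpc /icub/face/emotions/in
  sleep 2
  echo "set all neu" | yarp rpc /icub/face/emotions/in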

SIFT Object Detection and Tracking demo

Ignore this for now, as some libraries are not compiling. [controlGaze2 COMPILATION TO BE FIXED ON THE SERVERS - PROBLEMS WITH EGOSPHERELIB_LIBRARIES and PREDICTORS_LIBRARIES]

Assumptions:

On any machine, run:

  $ICUB_ROOT/app/$ICUB_ROBOTNAME/scripts/controlGazeManual.sh
  $ICUB_ROOT/app/$ICUB_ROBOTNAME/scripts/attentionObjects/noEgoSetup/CalibBothStart.sh

On any machine (preferably one of the icubbrains, as this task is computationally heavy) type:

  $ICUB_DIR/app/$ICUB_ROBOTNAME/scripts/attentionObjects/noEgoSetup/startSiftObjectRepresentation.sh

If you want to change the configuration, do:

  nano -w $ICUB_DIR/app/$ICUB_ROBOTNAME/conf/icubEyes.ini
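
For reference, this file typically holds the intrinsic camera parameters used by the calibration modules; a hypothetical sketch of one group follows (the actual group names, keys, and values must be taken from the file itself):

  [CAMERA_CALIBRATION_LEFT]
  w  320
  h  240
  fx 215.0
  fy 215.0
  cx 160.0
  cy 120.0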

Attention system demo

Assumptions:

Please note: we will run all the modules of this demo on icubbrain1 (64-bit), unless specified otherwise.

Start the following module:

  cd $ICUB_ROOT/app/attentionDistributed/scripts
  ./camCalibRightManual.sh

Then:

  cd $ICUB_ROOT/app/attentionDistributed/scripts/
  ./salienceRightManual.sh

Then:

  cd $ICUB_ROOT/app/attentionDistributed/scripts
  ./egoSphereManual.sh

Then:

  cd $ICUB_ROOT/app/attentionDistributed/scripts
  ./attentionSelectionManual.sh

Then:

  cd $ICUB_ROOT/app/attentionDistributed/scripts/
  ./controlGazeManual.sh

And finally, this time on chico3:

  cd $ICUB_ROOT/app/attentionDistributed/scripts
  ./appGui.sh

Press 'check all ports and connections', then press the '>>' buttons (which turn green) in all the tabs of the GUI.

Now, in the 'Salience Right' tab of the GUI, press 'Initialize interface' and move the thresholds a bit (e.g., the 'intensity' one).
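
For convenience, the five icubbrain1 module launches listed above can also be started from a single terminal on icubbrain1; a minimal sketch using the same script names:

  cd $ICUB_ROOT/app/attentionDistributed/scripts
  # start each module in the background, pausing briefly between launches
  for s in camCalibRightManual salienceRightManual egoSphereManual attentionSelectionManual controlGazeManual; do
      ./$s.sh &
      sleep 2
  done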