Vislab: Difference between revisions

From ISRWiki


== Projects ==

=== Current projects ===
* POETICON++ - Robots Need Language: A computational mechanism for generalisation and generation of new behaviours in robots (EC FP7, Jan. 2012 - Dec. 2015)
** http://www.poeticon.eu/
** The main objective of POETICON++ is the development of a computational mechanism for generalisation of motor programs and visual experiences for robots. To this end, it will integrate natural language and visual action/object recognition tools with motor skills and learning abilities in the iCub humanoid. Tools and skills will engage in a cognitive dialogue for novel action generalisation and creativity experiments in two scenarios of "everyday activities", comprising (a) behaviour generation through verbal instruction and (b) visual scene understanding. POETICON++ views natural language as a necessary tool for endowing artificial agents with generalisation and creativity in real-world environments.
* Dico(re)²s - Discount Coupon Recommendation and Redemption System (EC FP7, July 2011 - June 2013)
** http://www.dicore2s.com/
** Dico(re)²s develops and deploys a coupon-based discount campaign platform that provides consumers and retailers/manufacturers with a personalized environment for maximum customer satisfaction and business profitability.
* First-MM - Flexible Skill Acquisition and Intuitive Robot Tasking for Mobile Manipulation (EC FP7, Feb. 2010 - Jul. 2013)
** http://www.first-mm.eu/
** The goal of First-MM is to build the basis for a new generation of autonomous mobile manipulation robots that can flexibly be instructed to perform complex manipulation and transportation tasks. The project will develop a novel robot programming environment that allows even non-expert users to specify complex manipulation tasks in real-world environments. In addition to a task specification language, the environment includes concepts for probabilistic inference and for learning manipulation skills from demonstration and from experience.

=== Past projects ===
* RoboSoM - A Robotic Sense of Movement (EC FP7, Dec. 2009 - Dec. 2012)
** http://www.robosom.eu/
** This project aims at advancing the state of the art in motion perception and control in a humanoid robot. The fundamental principles to explore are rooted in theories of human perception: Expected Perception (EP) and the Vestibular Unified Reference Frame.
* HANDLE - Developmental Pathway Towards Autonomy and Dexterity in Robot In-Hand Manipulation (EC FP7, Feb. 2009 - Feb. 2013)
** http://www.handle-project.eu
** This project aims at providing advanced perception and control capabilities to the Shadow Robot hand, one of the most advanced robotic hands in mechanical terms. We follow some paradigms of human learning to make the system able to grasp and manipulate objects with different characteristics: learning by imitation and by self-exploration. Object characteristics and usages (object affordances) determine the way the hand performs grasping and manipulation actions.
* URUS - Ubiquitous Networking Robotics in Urban Settings (EC FP6, Dec. 2006 - Nov. 2009)
** http://urus.upc.es
* RobotCub - Robotic Open-Architecture Technology for Cognition, Understanding and Behaviour (EC FP6, Sept. 2004 - Jan. 2010)
** http://www.robotcub.org
* CAVIAR - Context-Aware Vision Using Image-Based Active Recognition (EC FP6, 2002 - 2005)
** http://homepages.inf.ed.ac.uk/rbf/CAVIAR/
* MIRROR - Mirror Neurons for Recognition (EC FP5, 2001 - 2004)
== People ==
=== Permanent staff ===
* [http://users.isr.ist.utl.pt/~jasv/ José Santos-Victor]
* [http://users.isr.ist.utl.pt/~alex Alexandre Bernardino]
* [http://users.isr.ist.utl.pt/~jpc/ João Paulo Costeira]
* [http://welcome.isr.ist.utl.pt/people/index.asp?accao=showpeople&id_people=2 João Sentieiro]
* [http://users.isr.ist.utl.pt/~jag/ José Gaspar]
=== Researchers ===
* [http://users.isr.ist.utl.pt/~plinio/ Plinio Moreno]
* [http://users.isr.ist.utl.pt/~ricardo/ Ricardo Ferreira]
=== PhD students ===
* Bruno Damas
* Dario Figueira
* [http://users.isr.ist.utl.pt/~gsaponaro/ Giovanni Saponaro]
* Jonas Hörnstein
* Jonas Ruesch
* [http://users.isr.ist.utl.pt/~mtaiana/ Matteo Taiana]
* Nuno Moutinho
* [http://welcome.isr.ist.utl.pt/people/index.asp?accao=showpeople&id_people=105 Pedro Ribeiro]
* Ravin DeSouza ([http://www.ist.utl.pt/en/about-IST/global-cooperation/IST-EPFL/ IST-EPFL])
* Sébastien Gay ([http://www.ist.utl.pt/en/about-IST/global-cooperation/IST-EPFL/ IST-EPFL])
* Urbain Prieur (IST-UPMC)
=== Master's students ===
* Ashish Jain
* Duarte Aragão
* Filipe Vieiga
* João Pimentel
* Lester García
* Marco Henriques
* Martim Brandão
* Tayebeh Razmi
=== Other staff ===
* Ana Santos (administrative)
* Nuno Conraria (engineering)
* Ricardo Nunes (engineering)
=== Miscellaneous and past members ===
* Afshin Dehghan
* Anastácia Rodrigues (administrative)
* António Bastos
* Bruno Dias
* [http://roboticslab.uc3m.es/roboticslab/persona.php?id_pers=42 Carla González], Universidad Carlos III de Madrid, Spain
* Carlo Favali, visiting PhD student, University of Genoa, Italy
* Carlos Carreira, University of Valencia, Spain
* César Silva, Ph.D. (2001)
* Christian Wressnegger
* Claudia Deccó, visiting PhD student, University of São Paulo, Brazil
* Cláudia Soares
* Daniel Matter, IAESTE intern
* Diego Ortín, visiting PhD student, University of Zaragoza, Spain
* Etienne Grossmann
* Eval Bacca Cortes, visiting MSc student, University of Cali, Colombia
* Freek Stulp, visiting researcher, University of Groningen, The Netherlands and University of Edinburgh, UK
* [http://www.speech.kth.se/~giampi/ Giampiero Salvi], Royal Institute of Technology, Sweden
* Gianluca De Leo, University of Genoa, Italy
* Ivana Cingovksa, IAESTE intern
* [http://webdiis.unizar.es/~jminguez/ Javier Minguez], visiting PhD student, University of Zaragoza, Spain
* [http://users.isr.ist.utl.pt/~maciel/ João Maciel], Ph.D. (2002)
* José Eduardo Vianna, Universidade Federal do Espírito Santo, Brazil
* Kristijan Petkov, summer 2010 IAESTE intern
* Lenildo Silva, visiting PhD student, Universidade Federal do Rio de Janeiro, Brazil
* Luís Jordão
* [http://webdiis.unizar.es/~montesan/ Luis Montesano]
* Luís Vargas
* [http://www.isr.ist.utl.pt/~macl Manuel Lopes]
* Marco Zucchelli, visiting PhD student, Royal Institute of Technology, Sweden
* Maria Nazarycheva
* Matteo Perrone, University of Genoa, Italy
* Mauricio Arias, Escuela de Ingeniería de Antioquia, Colombia
* Miguel Praça
* Mišél Batmendijn
* Mohit Khurana, summer 2008 intern
* Niall Winters, visiting PhD student, Trinity College, Dublin, Ireland
* [http://nicolagreggio.altervista.org/ Nicola Greggio], Sant'Anna School of Advanced Studies, Pisa, Italy
* Nuno Gracias
* Nuno Pinho
* [http://www.gatv.ssr.upm.es/index.php/en/component/content/article/62 Nuria Sánchez], Universidad Politécnica de Madrid, Spain
* Raquel Vassallo, visiting PhD student, Universidade Federal do Espírito Santo, Brazil
* Ricardo Beira
* [http://users.isr.ist.utl.pt/~rco/ Ricardo Oliveira]
* Roberto Iannello, University of Genoa, Italy
* Roger Castro Freitas, visiting PhD student, Universidade Federal do Espírito Santo, Brazil
* [http://users.isr.ist.utl.pt/~rmcantin Rubén Martínez-Cantín]
* Sandra Nope Rodríguez
* Sjoerd van der Zwaan, M.Sc. (2001)
* Vicente Javier Traver
* Verica Krunić
* Vítor Costa, M.Sc. (1999)


== Material ==
* [[VisLab book wishlist]]
* [[VisLab calendar]]
* [[Vislab list of journals]]
* [[VisLab slides template and logos]]


== Robots ==
* [[Chico]] (iCubLisboa01)
* [[Chica]]
* [[Chico head]]
* [[Vizzy]]
* [[Nao]]
* [[Darwin]]


== Other resources ==
=== Blackhole network storage ===
You can store your work and backups on blackhole (10.0.3.118). As of 2013, this disk replaced the old europa_hd disk (10.0.3.117).


=== Cortex cluster ===

''For information on the setup of this cluster, see [[Cortex]].''
=== Cameras ===
* [[Nickon5000D]] photo camera
* [[Flea]] firewire camera


=== Demos ===
=== iCubBrain cluster ===

''For information on the setup of this cluster, see [[iCubBrain]].''


=== Network ===

''See the [[VisLab network]] article.''

=== Software repositories ===

Git repositories at:
  https://github.com/vislab-tecnico-lisboa

GitHub repository guidelines:
* Repository names must be all lower case, with underscores separating words
* Avoid non-letter characters in repository names, including "-"
* A repository description is mandatory
* A README.md is mandatory
* Use of the repository wiki is highly encouraged if the README.md grows very large

Old SVN repository at:
   svn://svn.isr.ist.utl.pt/vislab
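The naming guidelines above are simple enough to check mechanically. A minimal sketch in Python — the regex and the `is_valid_repo_name` helper are illustrative only, not part of any official lab tooling:

```python
import re

# Guideline: repository names are all lower case, letters only, with
# underscores separating words (no "-", digits, or other non-letter
# characters). Illustrative pattern, not official VisLab tooling.
NAME_RE = re.compile(r"^[a-z]+(?:_[a-z]+)*$")

def is_valid_repo_name(name: str) -> bool:
    """Return True if `name` follows the repository naming guidelines."""
    return NAME_RE.fullmatch(name) is not None
```

For example, `ball_tracker` and `vislab` pass, while `Ball-Tracker` and `pedestrian detection` do not.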


=== Tutorials ===

* [[3D ball tracker]]
* [[OpenRAVE Tutorial]]
* [[ROS Tutorial]]
* [[DollarPedestrianDetectionCode | Caltech Pedestrian Detection database and code]]
* [[FelzenszwalbDetectionCode | Object Detection code by Felzenszwalb, Girshick, McAllester, Ramanan]]
* [[GitCentralizedWorkflow | Using Git with a centralized workflow]]


=== Useful links ===

* [[Checklist for new VisLab members]]
* YARP for iCub: [[RobotCub coding basics]]
* iCub joints: http://eris.liralab.it/wiki/ICub_joints
* iKinArmCtrl: http://eris.liralab.it/iCub/dox/html/group__iKinArmCtrl.html
* [[Temp 2010-11 iCub system update]]
* [[Software installation tricks]]


''Latest revision as of 23:10, 4 October 2017''

Institutional information: http://vislab.isr.ist.utl.pt

Our YouTube channel, containing nice videos and demonstrations: http://www.youtube.com/user/VislabLisboa

Our internal video page on this wiki: [[VisLab Videos]]

== Research Topics ==
* Machine Learning
* Computer Vision

== VisLab category ==

The page [[:Category:Vislab]] lists all pages related to VisLab.