Université de Liège

Introduction to intelligent robotics (INFO0948-2)

Academic year 2017-2018


Information

Agenda

# Date Subject Downloads Supplements
1 9/02/2018 Introduction to the course (ch. 1), N. Vecoven Slides (old slides)
Reference book
Positions and Orientations (ch. 2), N. Vecoven Slides
2 16/02/2018 Time and Motion, Mobile Robots (ch. 3 and 4), B. Boigelot Slides
Simulator presentation, N. Vecoven
3 23/02/2018 Navigation (ch. 5), B. Boigelot Slides
4 02/03/2018 Localization and Kalman Filter (ch. 6), L. Wehenkel Slides
5 16/03/2018 Q & A for the project, N. Vecoven
6 23/03/2018 Fitting and Shape Matching (not in the reference book), Ph. Latour Slides

To go further

For shape matching: An efficient way to work with many points: the k-d tree
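
As an aside (not part of the course material): in MATLAB, the Statistics and Machine Learning Toolbox exposes a k-d tree through KDTreeSearcher, which makes repeated nearest-neighbour queries over many points cheap. A minimal sketch, with made-up point sets:

    % Nearest-neighbour matching between two 2-D point sets using a k-d tree
    % (requires the Statistics and Machine Learning Toolbox).
    model = rand(1000, 2);                 % reference point cloud (one point per row)
    scene = rand(200, 2);                  % points to match against the model
    tree = KDTreeSearcher(model);          % build the k-d tree once
    [idx, dist] = knnsearch(tree, scene);  % nearest model point for each scene point
    % idx(i) is the row of 'model' closest to scene(i, :), dist(i) its distance;
    % this is the inner loop of ICP-style shape matching.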

For the project

25/03/2018

MILESTONE A1 (details below)

You are expected to produce a 5-minute video of your robot exploring the environment and eventually showing the map it has built, with an audio commentary explaining your implementation (why you chose a given path-finding algorithm, how you decide the next point to explore, rather than which functions you called). Ideally, the video should also show how your robot makes decisions (for example, the map being built, the next point to explore, the planned trajectory). Your submission must include both your source code and the video (either directly as a file, or as a link to an external website hosting your video; in that case, make sure that we can access the video at any time after your submission).
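
To make the "next point to explore" question concrete, here is a minimal sketch (one possible approach among many, not the required one; the map convention and variable names are ours) of frontier-based selection on an occupancy grid: pick the closest unexplored cell that touches known free space.

    % Frontier-based choice of the next point to explore (sketch only).
    % Assumed convention: 0 = free, 1 = occupied, -1 = unexplored.
    map = -ones(50, 50);              % placeholder occupancy grid
    map(20:30, 20:30) = 0;            % an area already explored and free
    robotCell = [25, 25];             % robot position in grid coordinates
    [rows, cols] = size(map);
    bestCell = [];
    bestDist = inf;
    for r = 2:rows-1
        for c = 2:cols-1
            if map(r, c) ~= -1
                continue;             % only unexplored cells can be frontiers
            end
            neigh = [map(r-1,c), map(r+1,c), map(r,c-1), map(r,c+1)];
            if any(neigh == 0)        % adjacent to known free space: a frontier
                d = norm([r, c] - robotCell);
                if d < bestDist
                    bestDist = d;
                    bestCell = [r, c];
                end
            end
        end
    end
    % bestCell is the next point to explore; feed it to your path-finding algorithm.
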
7 13/04/2018 at 8h30 Image Processing (ch. 12), M. Van Droogenbroeck Slides

To go further

Reducing Errors in Object-Fetching Interactions through Social Feedback: video, article

For the project

Reference book
MATLAB Image Processing Toolbox:
8 20/04/2018 Feature Extraction, Bag-of-features for Image Classification (ch. 13), R. Marée Slides (60 MB)
Reference book
Peter Corke's Machine Vision Toolbox:
MATLAB Computer Vision System Toolbox (high level):
MATLAB Computer Vision System Toolbox (low level):
MATLAB Statistics and Machine Learning Toolbox:
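
Purely as an illustration of the bag-of-features idea (not a pipeline imposed by the course; the image file name and vocabulary size are made up), with VLFeat on the path you can extract SIFT descriptors, quantise them against a k-means vocabulary, and histogram the assignments:

    % Minimal bag-of-features sketch with VLFeat (run vl_setup first).
    im = imread('some_training_image.png');   % placeholder file name
    I  = single(rgb2gray(im));                % vl_sift expects a single-precision grayscale image
    [~, d] = vl_sift(I);                      % 128 x N SIFT descriptors
    K = 50;                                   % vocabulary size (arbitrary here)
    vocab = vl_kmeans(single(d), K);          % 128 x K visual words
    % In practice the vocabulary is learnt on descriptors pooled over many
    % training images; a single image is used here to keep the sketch short.
    dists = vl_alldist2(vocab, single(d));    % pairwise distances, K x N
    [~, words] = min(dists, [], 1);           % nearest visual word per descriptor
    h = accumarray(words(:), 1, [K, 1]);      % bag-of-features histogram
    h = h / sum(h);                           % normalise before feeding a classifier
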
27/04/2018 Project follow-up: first a brief tour of what you have achieved since the first deadline (with a deeper look for groups that encountered problems), then time for questions
27/05/2018

Project submission

You are expected to submit:
  • your source code for the whole project.
  • a PDF report (between five and ten pages) explaining which milestones you have implemented, the ideas behind your algorithms, why you think they should work in general for any map that respects the hypotheses of the project, and which ideas you rejected (and why). Basically, everything that you would like to present during your defence should be in your report. As for milestone A1, also include a link to a video.
If you want to, you can submit your project earlier: such a late deadline does not mean that we expect you to work at full steam until the end of May, but rather that you are free to decide when to work on the project so that it does not interfere with your other academic commitments.
16/06/2018 and 21/06/2018

Projects presentation

The exam will mainly consist of a live demo of your solution on a house that differs from the one provided for training. Be ready to run your code on a laptop with a different V-REP file. Please also prepare videos showing the key elements of your solution, in case there is not enough time to run a full simulation sequence, as well as two or three slides describing the key elements of your work. The defence will last approximately ten to fifteen minutes per group. Since one examiner resides in the USA, a videoconferencing system will be used, namely Skype. Please install it on your computer beforehand and test screen sharing with your project running in the simulator (only one computer per group is required; make sure well in advance that it is powerful enough, and contact us if none of the group members' computers can handle Skype together with the simulator). Also, be present at least 15 minutes in advance to ensure the examinations go smoothly (testing shared screens, uploading the exam map, etc.).

Presentations on the 16/06/2018 ROOM 2.93

  • BindelsGrodent - Antoine Grodent, Charles Lentz, Quentin Bindels - 16h00
  • BenoitMehdi - Benoit Umé, Mehdi Sauvage - 16h15
  • BudoMoureauPaquet - Arnaud Paquet, Quentin Budo, Céline Moureau - 16h30
  • CastilloDolorisPiron - Sergio Castillo, Samy Doloris, Francois Piron - 16h45
  • IstazMoreau - Tom Istaz, Ghislain Moreau - 17h00
  • intelligentroboticG1 - Christophe Duchesne, Ahmed Ktob, Simone Poli - 17h15
  • DCLS - Damien Sprumont, Camille Leroux - 17h30
  • Fred - Frédéric Vecoven - 17h45

Presentations on the 21/06/2018 ROOM 2.93

  • GreffeNoirhomme - Nathan Greffe, Maxime Noirhomme - 16h00
  • IssamOlivier - Issam Amraoui, Olivier Schyns - 16h15
  • PetitPierre - Sébastien Blondiau, Soulaimane Harika, Keutgen Pierre - 16h30
  • radeletgodon - Louis Godon, Florian Radelet - 16h45
  • isaterasmus58 - Alexis Renault, Julien Venot - 17h00
  • AttaBohezLentz - Nathan Atta, Thomas Bohez - 17h15
  • Mauribin - Robin Pellois, Mauricio Garcia - 17h30
  • X - Alexandre Pierroux - 17h45

Project

Project statement, list of milestones, installation procedure. The project should be done in groups of two. If you have questions about the project, you can ask N. Vecoven. Submissions must be done on the dedicated platform (all the members of your group must register on the platform so that you can form a group). Deadlines:
  • 23 March: milestone A1 and short presentation of your robot exploring the room to produce a map
  • 27 May: final submission
  • 16 June and 21 June: final examination
You may find the following software useful during the project (for MATLAB only):
  • Peter Corke's Robotics Toolbox, also on GitHub (however, be cautious: it will not always work as expected); it is automatically installed when you perform the installation steps for the project (when running the script startup_robot.m). See the short example after this list.
  • MATLAB Robotics System Toolbox (available since R2015a; not included by default in all MATLAB editions)
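
As a quick sanity check that Corke's toolbox is on your path (the pose and point below are arbitrary), you can build an SE(2) transform and map a point between frames:

    % Minimal sketch with Peter Corke's Robotics Toolbox (as installed by startup_robot.m).
    T = se2(1.0, 2.0, pi/4);          % robot pose in the world frame (3x3 homogeneous transform)
    p_robot = [0.5; 0.0];             % a point expressed in the robot frame
    p_world = homtrans(T, p_robot);   % the same point expressed in world coordinates
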
A few links more specifically about the simulator and the code you will have to write:

FAQ

How to update VLFeat?

By default, the script startup_robot downloads an outdated version of VLFeat (0.9.9, while the current one is 0.9.20), which lacks many features (such as an SVM implementation). To update it, download the latest version, including binaries, from the official website (on VLFeat's download page, the file is currently under the link VLFeat 0.9.20 binary package). Extract this archive on your computer (for example, into the directory matlab/rvctools/contrib/vlfeat-0.9.20, next to the embedded version of VLFeat, where matlab is the directory containing the startup_robot.m file).

Before using VLFeat, you must run a script that sets up the required paths (much like startup_robot) each time you start MATLAB. If you followed the previous instructions, that script is matlab/rvctools/contrib/vlfeat-0.9.20/toolbox/vl_setup.m.
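
For instance, assuming you extracted VLFeat to the directory given above, running the following at each MATLAB start (or from your own startup script) sets up the paths and lets you check which version is active:

    % Put VLFeat on the path (adjust if you extracted it elsewhere).
    run('matlab/rvctools/contrib/vlfeat-0.9.20/toolbox/vl_setup');
    vl_version verbose   % should report 0.9.20, not the embedded 0.9.9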

If you get strange errors when trying to use some functions, you may have to recompile the MEX files of VLFeat. To this end, you must at least have a C compiler installed that is recognised by MATLAB, such as Visual C++ (included with Visual Studio Community) or MinGW. Once you have a compiler, start the compilation by launching the script matlab/rvctools/contrib/vlfeat-0.9.20/toolbox/vl_compile.m.

With Visual C++, you might still get errors when compiling. In this case, edit the file matlab/rvctools/contrib/vlfeat-0.9.20/vl/host.h and comment out lines 315 and 335 (they look like # define snprintf _snprintf). (Another solution is to use the master branch of VLFeat.) Then run matlab/rvctools/contrib/vlfeat-0.9.20/toolbox/vl_compile.m again.
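
Concretely, assuming vl_setup has already been run so that vl_compile is on the path, the recompilation boils down to:

    mex -setup C    % check that MATLAB finds a C compiler (Visual C++ or MinGW)
    vl_compile      % rebuild VLFeat's MEX files; watch the output for errors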

How to use the simulator?

For the project in this course, you will be asked to use the V-REP simulator. It emulates a complete robot (the youBot) evolving in its environment: the robot can move around, position its arm, grasp objects, and take pictures within the simulator.

For the installation, please follow the TRS tutorial. It is highly recommended to use MATLAB as a programming environment.

When following the tutorial, if you have an error executing the statement binding_test() (step 3.2), make sure you have started the simulation in V-REP.
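
For reference, the TRS code connects to V-REP through its MATLAB remote API roughly as follows (a minimal sketch, assuming the simulation is started in V-REP and the remote API bindings are on your path; 19997 is V-REP's default remote API port):

    % Minimal connection test to a running V-REP instance.
    vrep = remApi('remoteApi');                     % load the remote API bindings
    id = vrep.simxStart('127.0.0.1', 19997, true, true, 2000, 5);
    if id < 0
        error('Cannot connect to V-REP: is the simulation running?');
    end
    fprintf('Connected to V-REP with client id %d.\n', id);
    vrep.simxFinish(id);                            % close the connection
    vrep.delete();                                  % unload the library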

In order to shoot videos from V-REP, you can use the option Tools > Video recorder; however, the video will not contain any MATLAB overlay. In order to record your voice, you may have to use separate video-editing software, such as Kdenlive. Other tools allow you to capture your screen (including MATLAB windows), such as OBS Studio (free and open-source software) or Camtasia (the trial version is sufficient).