Student Project Proposals

If you are interested in one of the projects below (Semester Projects, Master Projects at EPFL, or Master Projects in Industry), please contact the first person of reference indicated in each description by telephone, by email, or by visiting us directly at the LASA offices.


Semester Projects


Human-Robot Collision Simulation and Experimental Validation

In this semester project, we propose an experimental robotics application of control and analysis for assessing the safety of mobile robots around human pedestrians. Objective: developing an experimental setting for human-robot collision analysis. You will benefit from access to multiple mobile robots, learning to control them and implementing different controllers (velocity control, impedance control) for gathering collision data. Moreover, you will learn how to use a motion capture system and a set of sensors for assessing collisions between a robot and a human mannequin in multiple scenarios. The student is expected to have good experience in C++ or Python and solid knowledge of control; a good understanding of solid mechanics and deformation analysis would be a plus.
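As a rough illustration of the kind of controller involved, here is a minimal one-dimensional impedance control sketch in Python. All gains, the contact-force profile, and the integration scheme are hypothetical choices made for this example, not the controllers actually used on the lab's robots.

    # Minimal 1-D impedance control sketch (hypothetical gains).
    # The robot behaves like a virtual mass-spring-damper around a
    # desired trajectory: M*a = K*(x_d - x) + D*(v_d - v) + f_ext.

    M = 2.0   # virtual mass [kg]        (assumed value)
    K = 50.0  # virtual stiffness [N/m]  (assumed value)
    D = 10.0  # virtual damping [N.s/m]  (assumed value)

    def impedance_step(x, v, x_d, v_d, f_ext, dt):
        """Advance one control step; returns updated (x, v)."""
        a = (K * (x_d - x) + D * (v_d - v) + f_ext) / M
        v = v + a * dt
        x = x + v * dt
        return x, v

    # Toy stand-in for a collision: the robot tracks a goal at 1 m
    # while a 5 N contact force pushes back between t = 0.4 s and 0.6 s.
    x, v, dt = 0.0, 0.0, 0.001
    for step in range(1000):
        f_ext = -5.0 if 0.4 <= step * dt < 0.6 else 0.0
        x, v = impedance_step(x, v, x_d=1.0, v_d=0.0, f_ext=f_ext, dt=dt)
    print(f"final position: {x:.3f} m")

A compliant controller of this kind yields under contact and thereby limits collision forces, which is precisely the property the project sets out to measure; a pure velocity controller, by contrast, keeps tracking its reference regardless of contact.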

Project: Semester Project
Period: 01.02.2020 - 01.08.2020
Section(s): EL IN MA ME MT PH
Type: 20% theory, 30% software, 50% implementation
Knowledge(s): C++, Python/Matlab, Robotics, Mechanics
Subject(s): Human-Robot Interaction, Mechanics
Responsible(s): Diego F. Paez Granados, David Gonon

Master Projects at EPFL

No projects are currently available.


Master Projects in Industry


Integrating tactile and visual information for faster inference of an object's geometry

This project aims to explore a new approach to integrating visual and contact sensing to perceive objects better and faster. Information about the shape, size, and pose of objects in the environment is crucial for many basic robotic scenarios, such as reaching, grasping, and manipulation tasks. In laboratory experiments, a set of several (10-20) cameras is often employed to detect objects, which would not be feasible if robots were to operate outside the lab. Computer vision algorithms, such as three-dimensional reconstruction from multiple cameras, cannot deliver the speed and accuracy required for grasping and manipulating objects. Furthermore, the object is occluded by the robotic hand once grasped, which hinders image processing. On the other hand, a tactile exploration algorithm developed in our lab [1] can detect objects even while they are grasped, but object exploration that is not guided by vision is a time-consuming process. Human beings often rely on a combination of visual and tactile information when grasping objects or performing other day-to-day activities. Taking a cue from human expertise, we would like to develop a similar strategy for robots to efficiently combine the data from the two sensors.

The robotic setup consists of a KUKA arm with an Allegro hand mounted on it. The phalanges of the hand are covered with Tekscan pressure sensors to obtain information about contact with the object. Visual information is provided by a RealSense depth camera, which yields a 3D point cloud of the object in its field of view. As a pilot study, we used the two together to reconstruct a model of a toy rooster (see image). In this project, we will try to reconstruct something simpler, such as a big bowl, which can be grasped in different ways.

The project will start with the preliminary goals of estimating the pose, shape, and size of objects. Depending on time and progress, more interesting tasks such as detecting feasible grasp points on the object will be tackled. Once implemented, the output of the project will be tested for grasping objects using a robot controlled by EMG sensors (in collaboration with the TNE lab). The project offers plenty of scope for creative ideas. Interested students should have adequate programming skills (Python, C++, MATLAB, ROS) and a background in machine learning and computer vision. The project should be carried out in the spring semester, but the dates are flexible.
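To make the fusion idea concrete, below is a minimal Python sketch that merges a synthetic camera point cloud with a handful of synthetic tactile contact points and fits a sphere to the combined cloud by linear least squares. The data, the spherical shape model, and every parameter are assumptions for illustration only; the actual pipeline with the RealSense camera and Tekscan sensors is more involved.

    import numpy as np

    # Fusion sketch (synthetic data): treat tactile contacts and the
    # camera point cloud as one combined cloud and fit a sphere.
    # |p - c|^2 = r^2 is linear in the unknowns (c, r^2 - |c|^2).

    rng = np.random.default_rng(0)
    c_true, r_true = np.array([0.3, -0.1, 0.5]), 0.12  # hidden ground truth

    def sample_sphere(n, noise):
        d = rng.normal(size=(n, 3))
        d /= np.linalg.norm(d, axis=1, keepdims=True)
        return c_true + r_true * d + rng.normal(scale=noise, size=(n, 3))

    camera_cloud = sample_sphere(200, noise=0.005)  # stand-in for depth camera
    touch_points = sample_sphere(8, noise=0.001)    # stand-in for fingertip contacts
    points = np.vstack([camera_cloud, touch_points])

    # Solve A w = b with w = [cx, cy, cz, r^2 - |c|^2].
    A = np.hstack([2 * points, np.ones((len(points), 1))])
    b = np.sum(points ** 2, axis=1)
    w, *_ = np.linalg.lstsq(A, b, rcond=None)
    center, radius = w[:3], np.sqrt(w[3] + w[:3] @ w[:3])
    print("estimated center:", np.round(center, 3), "radius:", round(radius, 3))

A closed-form fit works here only because the sphere equation is linear in the chosen unknowns; a bowl or an arbitrary object would call for a richer model such as an implicit surface. The point of the sketch is that even a few tactile contacts can constrain the side of the object the camera cannot see.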

Project: Master Project in Industry
Period: 01.02.2020 - 01.08.2020
Section(s): EL ME MT
Type: 30% theory, 30% software, 30% implementation
Knowledge(s):
Subject(s): Programming, machine learning, computer vision
Responsible(s): Saurav Aryan



Last update: 01/03/2012