Theses in Robotics
Projects in Advanced Robotics
The main objective of the following projects is to give students experience in working with advanced robotics technology. Our group is active in several R&D projects involving human-robot collaboration, intuitive teleoperation of robots, and autonomous navigation of unmanned mobile platforms. Our main software platforms are the Robot Operating System (ROS) for developing software for advanced robot systems and Gazebo for running realistic robotic simulations.
For further information, contact Karl Kruusamäe and Arun Kumar Singh.
The following is not an exhaustive list of all available thesis/research topics.
Highlighted theses topics for 2020/2021 study year
- Social robot for neuro-rehabilitation
- Modelling and prototyping smart urban mobility infrastructure
- ROS2 for robotont
- Digital twins in Gazebo
- Enhancing teleoperation control interface with augmented cues to provoke caution
- Making KUKA youBot user friendly again
List of potential thesis topics
Development of demonstrative and promotional applications for KUKA youBot
The goal of this project is to develop promotional use cases for the KUKA youBot that demonstrate the capabilities of modern robotics and inspire people to get involved with it. Possible robotic demos include:
- demonstration of motion planning algorithms for mobile manipulation,
- using 3D vision for human and/or environment detection,
- interactive navigation,
- autonomous path planning,
- different pick-and-place applications,
- and human-robot collaboration.
Development of demonstrative and promotional applications for Universal Robots UR5
Sample demonstrations include:
- autonomous pick-and-place,
- load-assistance for human-robot collaboration,
- packaging,
- physical compliance during human-robot interaction,
- tracing an object's surface during scanning,
- robotic kitting,
- grinding of non-flat surfaces.
Development of demonstrative and promotional applications for Clearpath Jackal
Sample demonstrations include:
- human-robot interaction,
- multi-robot mapping,
- autonomous driving.
ROS support, demos, and educational materials for open-source mobile robot ROBOTONT
The project involves many potential thesis topics on the open-source robot platform ROBOTONT. The nature of the thesis can be software development to improve the platform's capabilities, simulation of specific scenarios, and/or demonstration of ROBOTONT in a real-life setting. A more detailed thesis topic will be outlined during an in-person meeting.
Detecting features of urban and off-road surroundings
Accurate navigation of self-driving unmanned robotic platforms requires identification of traversable terrain. A combined analysis of point-cloud data with RGB information of the robot's environment can help autonomous systems make correct decisions. The goal of this work is to develop algorithms for terrain classification.
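A minimal offline sketch of the kind of classifier such a thesis could start from, assuming a labeled point cloud with per-point XYZ and RGB values is already available (the file name, feature choice, and label convention below are placeholders):

```python
# Minimal offline sketch: classify points as traversable / non-traversable
# from combined geometric and color features. Assumes a labeled point cloud
# exported as an N x 7 NumPy array (x, y, z, r, g, b, label); the file name
# and label convention are hypothetical placeholders.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

data = np.load("labeled_cloud.npy")            # hypothetical training data
xyz, rgb, labels = data[:, :3], data[:, 3:6], data[:, 6].astype(int)

# Simple hand-crafted per-point features: height and normalized color.
features = np.column_stack([xyz[:, 2], rgb / 255.0])

X_train, X_test, y_train, y_test = train_test_split(features, labels, test_size=0.2)
clf = RandomForestClassifier(n_estimators=100)
clf.fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))
```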
Digital twins in Gazebo
The objective of this thesis is to integrate the robots available at the Institute of Technology into a single Gazebo simulation world to enable further software development and educational activities using these robots. The specific tasks within this thesis will include testing existing Gazebo packages of off-the-shelf robots and creating/improving simulation capabilities for other robots.
Follow-the-leader robotic demo
The idea is to create a robotic demonstration where a mobile robot uses a Kinect or a similar depth camera to identify a person and then follows that person. The project will be implemented using the Robot Operating System (ROS) on the KUKA youBot or a similar mobile robot platform.
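A minimal ROS sketch of the following behaviour, assuming an upstream node (e.g., a Kinect-based person detector) already publishes the tracked person's position in the robot's base frame; the topic names and controller gains are placeholders:

```python
#!/usr/bin/env python
# Minimal person-following sketch (ROS 1, rospy). Assumes an upstream detector
# publishes the tracked person's position in the robot's base frame on
# /person_position; that topic name is a placeholder.
import math
import rospy
from geometry_msgs.msg import PointStamped, Twist

FOLLOW_DISTANCE = 1.0   # keep roughly 1 m behind the person

def on_person(msg):
    cmd = Twist()
    distance = math.hypot(msg.point.x, msg.point.y)
    angle = math.atan2(msg.point.y, msg.point.x)
    # Simple proportional controller: turn toward the person and drive
    # forward until the desired following distance is reached.
    cmd.angular.z = 1.0 * angle
    cmd.linear.x = 0.5 * (distance - FOLLOW_DISTANCE)
    cmd_pub.publish(cmd)

rospy.init_node("follow_the_leader")
cmd_pub = rospy.Publisher("/cmd_vel", Twist, queue_size=1)
rospy.Subscriber("/person_position", PointStamped, on_person)
rospy.spin()
```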
Detecting hand signals for intuitive human-robot interface
This project involves creating ROS libraries for using either a Leap Motion Controller or an RGB-D camera to detect the most common human hand signals (e.g., thumbs up, thumbs down, all clear, pointing into the distance, inviting).
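As an illustration, a toy RGB-only detector for one such signal (thumbs up) could look as follows; MediaPipe Hands is used here only as one possible library choice, and the geometric heuristic is purely illustrative:

```python
# Illustrative sketch of RGB-based hand-signal detection using MediaPipe Hands
# (one possible library choice; the topic itself does not prescribe it).
# Flags a rough "thumbs up": thumb tip above the wrist while the other
# fingertips are curled below their middle joints. The heuristic is a toy example.
import cv2
import mediapipe as mp

hands = mp.solutions.hands.Hands(max_num_hands=1)
cap = cv2.VideoCapture(0)

while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if results.multi_hand_landmarks:
        lm = results.multi_hand_landmarks[0].landmark   # 21 normalized landmarks
        thumb_up = lm[4].y < lm[0].y                    # thumb tip above wrist
        fingers_curled = all(lm[tip].y > lm[tip - 2].y for tip in (8, 12, 16, 20))
        if thumb_up and fingers_curled:
            print("thumbs up detected")
    cv2.imshow("hand signals", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
```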
Virtual reality user interface (VRUI) for intuitive teleoperation system
Enhancing the user experience of a virtual reality UI developed by Georg Astok. Potentially adding virtual reality capability to a gesture- and natural-language-based robot teleoperation system.
Health monitor for intuitive telerobot
Intelligent status and error handling for an intuitive telerobotic system.
Dynamic stitching for achieving 360° FOV
Automated stitching of images from multiple camera sources for achieving a 360° field of view during mobile telerobotic inspection of remote areas.
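A possible starting point is OpenCV's high-level Stitcher applied to simultaneously captured frames; for live 360° video the homographies would typically be estimated once and then reused per frame. The file names below are placeholders:

```python
# Starting-point sketch: stitch simultaneous frames from several cameras into
# one panorama with OpenCV's high-level Stitcher.
import cv2

frames = [cv2.imread(f) for f in ("cam_left.jpg", "cam_center.jpg", "cam_right.jpg")]
stitcher = cv2.Stitcher_create(cv2.Stitcher_PANORAMA)
status, panorama = stitcher.stitch(frames)

if status == cv2.Stitcher_OK:
    cv2.imwrite("panorama.jpg", panorama)
else:
    print("stitching failed, status code:", status)
```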
3D scanning of industrial objects
Using laser sensors and cameras to create accurate models of industrial products for quality control or further processing.
Modeling humans for human-robot interaction
True human-robot collaboration means that the robot must understand the actions, intention, and state of its human partner. This work involves using cameras and other human sensors to digitally represent and model humans. There are multiple stages of modeling: a) physical models of human kinematics and dynamics; b) higher-level models for recognizing human intent.
ROS wrapper for Estonian Speech Synthesizer
Creating a ROS package that enables robots to speak in Estonian. The basis of the work is the existing Estonian-language speech synthesizer, which needs to be integrated with the ROS sound_play package or wrapped in a stand-alone ROS package.
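A minimal wrapper sketch, assuming the synthesizer is reachable over HTTP and returns a WAV file (the endpoint URL, request format, and topic name are placeholders):

```python
#!/usr/bin/env python
# Minimal wrapper sketch (ROS 1): subscribe to Estonian text, fetch synthesized
# audio over HTTP, and play it through the sound_play package. The synthesizer
# endpoint and its request format are hypothetical placeholders.
import rospy
import requests
from std_msgs.msg import String
from sound_play.libsoundplay import SoundClient

TTS_URL = "http://localhost:5000/synthesize"   # placeholder endpoint

def on_text(msg):
    response = requests.post(TTS_URL, json={"text": msg.data})
    wav_path = "/tmp/tts_output.wav"
    with open(wav_path, "wb") as f:
        f.write(response.content)
    sound_client.playWave(wav_path)            # sound_play handles playback

rospy.init_node("estonian_tts")
sound_client = SoundClient()
rospy.Subscriber("/say_estonian", String, on_text)
rospy.spin()
```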
Robotic avatar for telepresence
Integrating hand gestures and head movements to control a robot avatar in a virtual reality user interface.
ROS driver for Artificial Muscle actuators
Designing a controller box and writing software for interfacing artificial muscle actuators [1, 2] with ROS.
TeMoto based smart home control
The project involves designing an open-source ROS+TeMoto based scalable smart home controller.
Detection of hardware and software resources for smart integration of robots
The vast majority of today's robotic applications rely on hard-coded device and algorithm usage. This project focuses on developing Resource Snooper software that can detect the addition or removal of resources for the benefit of dynamic reconfiguration of robotic systems. This project is developed as a subsystem of TeMoto.
Sonification of feedback during teleoperation of robots
Humans are used to receiving auditory feedback in their everyday lives. It helps us make decisions and be aware of potential dangers. Telerobotic interfaces can deploy the same idea to improve situational awareness and robotic task efficiency. The thesis project involves a study of different sonification solutions and their implementation in a telerobotic application using ROS.
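One illustrative sonification, not prescribed by the topic, maps the distance to the closest laser-scan obstacle to the pitch of a short beep (the mapping and the simpleaudio playback library are assumptions):

```python
#!/usr/bin/env python
# Illustrative sonification sketch (ROS 1): map the distance to the closest
# obstacle in a laser scan to the pitch of a short beep, so the operator hears
# higher tones when the robot gets closer to obstacles.
import rospy
import numpy as np
import simpleaudio
from sensor_msgs.msg import LaserScan

SAMPLE_RATE = 44100

def beep(frequency, duration=0.1):
    # Generate a short sine tone and block until it has finished playing,
    # so consecutive beeps do not overlap.
    t = np.linspace(0, duration, int(SAMPLE_RATE * duration), False)
    tone = (0.3 * 32767 * np.sin(2 * np.pi * frequency * t)).astype(np.int16)
    simpleaudio.play_buffer(tone, 1, 2, SAMPLE_RATE).wait_done()

def on_scan(msg):
    valid = [r for r in msg.ranges if msg.range_min < r < msg.range_max]
    if not valid:
        return
    closest = min(valid)
    # ~0.2 m maps to ~2000 Hz, ~4 m to ~200 Hz: closer obstacles sound higher.
    beep(float(np.clip(400.0 / closest, 200.0, 2000.0)))

rospy.init_node("teleop_sonification")
# queue_size=1 drops stale scans that arrive while a beep is still playing.
rospy.Subscriber("/scan", LaserScan, on_scan, queue_size=1)
rospy.spin()
```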
Human-Robot and Robot-Robot collaboration applications
Creating a demo or analysis of software capabilities related to human-robot or robot-robot teams:
- human-robot collaborative assembly
- distributed mapping; analysis and demo of existing ROS (e.g., segmap https://youtu.be/JJhEkIA1xSE) packages for multi-robot mapping
- Inaccessible region teamwork
- youbot+drone - a drone maps the environment (for example, a maze) and a ground vehicle uses this information to traverse the maze
- youbot+clearbot - the youbot cannot climb ledges, but it can lift a smaller robot, such as clearbot, onto a ledge.
Developing ROS driver for a robotic gripper
The goal of this project is to develop ROS drivers for the LEHF32K2-64 gripper. The work is concluded by demonstrating the functionalities of the gripper in a pick-and-place task.
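A driver skeleton could expose the gripper through a std_srvs/SetBool service; the serial port and command strings below are hypothetical, since the actual control interface must be taken from the manufacturer's documentation:

```python
#!/usr/bin/env python
# Driver skeleton (ROS 1): expose the gripper as a std_srvs/SetBool service
# (True = close, False = open). The serial port and command strings are
# placeholders, not the real LEHF32K2-64 protocol.
import rospy
import serial
from std_srvs.srv import SetBool, SetBoolResponse

port = serial.Serial("/dev/ttyUSB0", baudrate=115200, timeout=0.5)  # placeholder

def handle_command(request):
    command = b"CLOSE\n" if request.data else b"OPEN\n"   # placeholder protocol
    port.write(command)
    return SetBoolResponse(success=True, message="command sent")

rospy.init_node("lehf32k2_gripper_driver")
rospy.Service("gripper/grip", SetBool, handle_command)
rospy.spin()
```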
Mirroring human hand movements on industrial robots
The goal of this project is to integrate continuous control of an industrial robot manipulator with a gestural telerobotics interface. The recommended tools for this thesis project are a Leap Motion Controller or a standard web camera, a Universal Robots UR5 manipulator, and ROS.
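A minimal relay sketch, assuming an upstream node publishes the tracked palm pose and a Cartesian servoing controller on the UR5 side accepts PoseStamped targets (both topic names and the workspace scaling are placeholders):

```python
#!/usr/bin/env python
# Minimal relay sketch (ROS 1): forward tracked palm poses to a Cartesian
# target topic of the UR5. Assumes an upstream tracker (e.g., a Leap Motion
# driver) publishes the palm pose and a Cartesian servoing controller listens
# on /ur5/target_pose; both topic names are placeholders.
import rospy
from geometry_msgs.msg import PoseStamped

SCALE = 2.0   # illustrative hand-to-robot workspace scaling

def on_palm_pose(msg):
    target = PoseStamped()
    target.header.stamp = rospy.Time.now()
    target.header.frame_id = "base_link"
    target.pose.position.x = SCALE * msg.pose.position.x
    target.pose.position.y = SCALE * msg.pose.position.y
    target.pose.position.z = SCALE * msg.pose.position.z
    target.pose.orientation = msg.pose.orientation
    target_pub.publish(target)

rospy.init_node("hand_mirroring")
target_pub = rospy.Publisher("/ur5/target_pose", PoseStamped, queue_size=1)
rospy.Subscriber("/leap_motion/palm_pose", PoseStamped, on_palm_pose)
rospy.spin()
```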
ROS2-based robotics demo
Converting ROS demos and tutorials to ROS2.
ROS2 for robotont
Creating ROS2 support for the robotont mobile platform.
TeMoto for robotont
Swarm management for robotont using the TeMoto framework.
3D lidar for mobile robotics
Analysing the technical characteristics of a 3D lidar. Designing and constructing a mount for the Ouster OS-1 lidar and validating its applicability for indoor and outdoor scenarios.
Making KUKA youBot user friendly again
This thesis focuses on integrating the low-level software capabilities of the KUKA youBot in order to achieve commonly used high-level functionalities such as:
- teach mode - robot can replicate user demonstrated trajectories
- end-effector jogging
- gripper control
- gamepad integration - user can control the robot via gamepad
- web integration - user can control the robot via internet browser
The thesis is suitable for both master's and bachelor's level, as the associated code can be scaled up to a generic "user-friendly control" package.
Flexible peer-to-peer network infrastructure for environments with restricted signal coverage
A very common issue with robotics in real-world environments is that the network coverage is highly dependent on the environment. This makes robot-to-base-station or robot-to-robot communication unreliable, potentially compromising the whole mission. This thesis focuses on implementing a peer-to-peer network system on mobile robot platforms, where the platforms extend the network coverage between, e.g., an operator and a worker robot. The work will be demonstrated in a real-world setting where common networking strategies for teleoperation (tethered or single-router based) do not work.
Enhancing teleoperation control interface with augmented cues to provoke caution
The task is to create a telerobot control interface where the video feed from the remote site and/or a mixed-reality scene is augmented with visual cues to provoke caution in the human operator.
Mobile manipulation demo
The thesis involves making the end-effector of a mobile manipulator follow a continuous path. Such behaviour is of great interest in manufacturing domains such as industrial welding, sandblasting, coating, and depainting. The thesis work involves using a Mobile Industrial Robots MiR platform and a Universal Robots UR5 manipulator in collaboration.
Robot-to-human interaction
As robots and autonomous machines start sharing the same space as humans, their actions need to be understood by the people occupying that space. For instance, a human worker needs to understand what the robot partner is planning next, or a pedestrian needs to clearly comprehend the behaviour of a driverless vehicle. To reduce this ambiguity, the robot needs mechanisms to convey its intent (i.e., whatever it is going to do next). The aim of the thesis is to outline existing methods for machines to convey their intent and to develop a unified model interface for expressing that intent.
Gaze-based handover prediction
When a human needs to pass an object to a robot manipulator, the robot must understand where in 3D space the object handover occurs and then plan an appropriate motion. Human gaze can be used as the input for predicting which object to track. The thesis activities involve camera-based eye tracking and safe motion planning.
RoboCloud
Setting up a system that allows booking time on a physical but networked remote robot to validate and test robot software. The thesis involves system administration and front-end development.
Social robot for neuro-rehabilitation
Customizing a humanoid robot to assist doctors and psychologists during children's speech training sessions.
Modelling and prototyping smart urban mobility infrastructure
The aim is to evaluate and prototype a static sensor setup to reduce the perception overhead of an autonomously driving agent.
Completed projects
Master's theses
- Madis K Nigol, Õppematerjalid robotplatvormile Robotont [Study materials for robot platform Robotont], MS thesis, 2019
- Renno Raudmäe, Avatud robotplatvorm Robotont [Open source robotics platform Robotont], MS thesis, 2019
- Asif Sattar, Human detection and distance estimation with monocular camera using YOLOv3 neural network [Inimeste tuvastamine ning kauguse hindamine kasutades kaamerat ning YOLOv3 tehisnärvivõrku], MS thesis, 2019
- Ragnar Margus, Kergliiklusvahendite jagamisteenuseks vajaliku positsioneerimismooduli loomine ja uurimine [Development and testing of an IoT module for an electric vehicle sharing service], MS thesis, 2019
- Pavel Šumejko, Robust Solution for Extrinsic Calibration of a 2D Laser-Rangefinder and a Monocular USB Camera [Meetod 2D laserkaugusmõõdiku ja USB-kaamera väliseks kalibreerimiseks], MS thesis, 2019
- Dzvezdana Arsovska, Building an Efficient and Secure Software Supply Pipeline for Aerial Robotics Application [Efektiivne ja turvaline tarkvaraarendusahel lennurobootika rakenduses], MS thesis, 2019
- Tõnis Tiimus, Visuaalselt esilekutsutud potentsiaalidel põhinev aju-arvuti liides robootika rakendustes [A VEP-based BCI for robotics applications], MS thesis, 2018
- Martin Appo, Hardware-agnostic compliant control ROS package for collaborative industrial manipulators [Riistvarapaindlik ROSi tarkvarapakett tööstuslike robotite mööndlikuks juhtimiseks], MS thesis, 2018
- Hassan Mahmoud Shehawy Elhanash, Optical Tracking of Forearm for Classifying Fingers Poses [Küünarvarre visuaalne jälgimine ennustamaks sõrmede asendeid], MS thesis, 2018
Bachelor's theses
- Meelis Pihlap, Mitme roboti koostöö funktsionaalsuste väljatöötamine tarkvararaamistikule TeMoto [Multi-robot collaboration functionalities for robot software development framework TeMoto], BS thesis, 2019
- Kaarel Mark, Liitreaalsuse kasutamine tootmisprotsessis asukohtade määramisel [Augmented reality for location determination in manufacturing], BS thesis, 2019
- Kätriin Julle, Roboti KUKA youBot riistvara ja ROS-tarkvara uuendamine [Upgrading robot KUKA youBot’s hardware and ROS-software], BS thesis, 2019
- Georg Astok, Roboti juhtimine virtuaalreaalsuses kasutades ROS-raamistikku [Creating virtual reality user interface using only ROS framework], BS thesis, 2019
- Martin Hallist, Robotipõhine kaugkohalolu käte liigutuste ülekandmiseks [Teleoperation robot for arms motions], BS thesis, 2019
- Ahmed Hassan Helmy Mohamed, Software integration of autonomous robot system for mixing and serving drinks [Jooke valmistava ja serveeriva robotsüsteemi tarkvaralahendus], BS thesis, 2019
- Kristo Allaje, Draiveripakett käte jälgimise seadme Leap Motion™ kontroller kasutamiseks robootika arendusplatvormil ROS [Driver package for using Leap Motion™ controller on the robotics development platform ROS], BS thesis, 2018
- Martin Maidla, Avatud robotiarendusplatvormi Robotont omniliikumise ja odomeetria arendamine [Omnimotion and odometry development for open robot development platform Robotont], BS thesis, 2018
- Raid Vellerind, Avatud robotiarendusplatvormi ROS võimekuse loomine Tartu Ülikooli Robotexi robootikaplatvormile [ROS driver development for the University of Tartu’s Robotex robotics platform], BS thesis, 2017