Theses in Robotics

From Intelligent Materials and Systems Lab

 
= Projects in Advanced Robotics =
''The main objective of the following projects is to give students experience in working with advanced robotics technology. Our group is active in several R&D projects involving human-robot collaboration, intuitive teleoperation of robots, and autonomous navigation of unmanned mobile platforms. Our main software platforms are [http://www.ros.org/ Robot Operating System (ROS)] for developing software for advanced robot systems and [http://gazebosim.org/ Gazebo] for running realistic robotic simulations.''<br><br>
For further information, contact [[User:Karl|Karl Kruusamäe]].


The following is not an exhaustive list of all available thesis/research topics.


== Highlighted thesis topics for 2024/2025 study year ==
# [[#ROBOTONT: analysis of different options as on-board computers|ROBOTONT: analysis of different options as on-board computers]]
# [[#SemuBOT: multiple topics|SemuBOT: multiple topics]]
# [[#ROBOTONT: integrating a graphical programming interface|ROBOTONT: integrating a graphical programming interface]]
# [[#Robotic Study Companion: a social robot for students in higher education|Robotic Study Companion: a social robot for students in higher education]]
# [[#Quantitative Evaluation of the Situation Awareness during the Teleoperation of an Urban Vehicle|Quantitative Evaluation of the Situation Awareness during the Teleoperation of an Urban Vehicle]]
# [[#Mixed-reality scene creation for vehicle teleoperation|Mixed-reality scene creation for vehicle teleoperation]]
# [[#NAO 'flippin'|NAO 'flippin']]


== List of potential thesis topics ==
Our inventory includes but is not limited to:
<gallery>
File:Parrot-bebop-2.jpg|thumb|Parrot Bebop 2
</gallery>
<hr>
=== ROBOTONT: Docker-Driven ROS Environment Switching ===
This thesis focuses on integrating Docker containers to manage and switch between different ROS (Robot Operating System) environments on Robotont. Currently, ROS is installed natively on Robotont's Ubuntu-based onboard computer, limiting flexibility in system recovery and configuration switching. The goal is to develop a Docker-based solution that allows users to switch between ROS environments directly from Robotont’s existing low-level menu interface.
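As a purely illustrative sketch of the idea (the container image names and mount options below are assumptions, not part of any existing Robotont release), the menu-driven switching could boil down to launching one container per ROS distribution:

```python
import subprocess

# Hypothetical image tags; an actual implementation would define how
# Robotont's per-distro images are built and named.
ROS_ENVIRONMENTS = {
    "1": ("ROS Noetic", "robotont/ros:noetic"),
    "2": ("ROS 2 Humble", "robotont/ros:humble"),
}

def docker_command(image):
    """Build a `docker run` invocation for one ROS environment.

    Host networking keeps ROS node discovery working, and mounting
    /dev gives the container access to the robot's serial devices.
    """
    return [
        "docker", "run", "--rm", "-it",
        "--net=host", "--privileged",
        "-v", "/dev:/dev",
        image,
    ]

def switch_environment(choice):
    """Entry point the low-level menu could call with the user's selection."""
    name, image = ROS_ENVIRONMENTS[choice]
    print(f"Starting {name} ...")
    subprocess.run(docker_command(image), check=True)
```

Because each environment lives in its own image, recovering a broken configuration reduces to restarting a known-good container instead of reinstalling ROS on the host.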
<hr>
=== ROBOTONT: designing and implementing a communication protocol for additional devices ===
Robotont currently includes a designated area on its front for attaching additional devices, such as an ultrasonic range finder, a servo motor connected to an Arduino, or a MikroBUS device. This thesis aims to design and implement a standard communication protocol that will enable seamless integration and control of these devices through the high-level ROS framework running on Robotont’s onboard computer.
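For illustration only, a byte-level framing for such a protocol might look like the sketch below; the start byte, field layout, and checksum are hypothetical design choices to be made in the thesis, not an existing Robotont specification:

```python
import struct

START_BYTE = 0xAA  # hypothetical frame delimiter

def encode_frame(device_id, command, payload):
    """Pack one message: start byte, device id, command, payload length,
    payload, and a single-byte additive checksum."""
    header = struct.pack("BBBB", START_BYTE, device_id, command, len(payload))
    body = header + payload
    return body + bytes([sum(body) & 0xFF])

def decode_frame(frame):
    """Validate and unpack a frame produced by encode_frame."""
    if frame[0] != START_BYTE:
        raise ValueError("bad start byte")
    if sum(frame[:-1]) & 0xFF != frame[-1]:
        raise ValueError("checksum mismatch")
    device_id, command, length = frame[1], frame[2], frame[3]
    return device_id, command, frame[4:4 + length]

# A ROS node on the onboard computer would encode commands like this and
# write them to the add-on device's serial port.
frame = encode_frame(device_id=0x01, command=0x10, payload=b"\x5a")
```

A shared framing layer like this lets the same ROS-side driver talk to an Arduino servo controller, an ultrasonic range finder, or a MikroBUS board, with only the device id and command set differing per device.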
<hr>
=== ROBOTONT: analysis of different options as on-board computers ===
Currently, ROBOTONT uses an Intel NUC as its onboard computer. The goal of this thesis is to validate Robotont's software stack on alternative compute devices (e.g., Raspberry Pi 4, Intel Compute Stick, and NVIDIA Jetson Nano) and to benchmark their performance on the most popular Robotont use-case demos (e.g., webapp teleoperation, AR-tag steering, 2D mapping, and 3D mapping). The objective is to propose at least two different compute solutions: one that optimizes for cost and another that optimizes for performance.
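One way such a cross-device benchmark could be structured is a simple latency harness run identically on each candidate computer; the workload and iteration count below are placeholders standing in for the actual demo code:

```python
import statistics
import time

def benchmark_loop(step, n_iterations=200):
    """Time one processing step repeatedly and report latency statistics.

    `step` stands in for one cycle of a Robotont demo workload,
    e.g. processing a single camera frame for AR-tag steering.
    """
    latencies = []
    for _ in range(n_iterations):
        start = time.perf_counter()
        step()
        latencies.append(time.perf_counter() - start)
    return {
        "mean_s": statistics.mean(latencies),
        "p95_s": statistics.quantiles(latencies, n=20)[-1],  # 95th percentile
        "max_s": max(latencies),
    }

# Dummy workload standing in for per-frame computation.
stats = benchmark_loop(lambda: sum(i * i for i in range(10_000)))
```

Running the same script on the NUC and on each alternative gives directly comparable latency figures per demo, from which the cost-optimized and performance-optimized recommendations can be argued.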
<hr>
=== ROBOTONT Lite ===
The goal of this thesis is to optimize the ROBOTONT platform for cost by replacing the onboard computer and sensors with low-cost alternatives while ensuring ROS/ROS 2 software compatibility.
<hr>
=== ROBOTONT: integrating a graphical programming interface ===
The goal of this thesis is to integrate a graphical programming solution (e.g., Scratch or Blockly) to enable programming of Robotont by non-experts. See, e.g., https://doi.org/10.48550/arXiv.2011.13706
<hr>
=== SemuBOT: multiple topics ===
In 2023/2024, many topics are offered to support the development of the open-source humanoid robot project SemuBOT (https://www.facebook.com/semubotmtu/).
<hr>
=== Robotic Study Companion: a social robot for students in higher education ===
Potential Topics:
* Enhance the Robot's Speech/Natural Language Capabilities
* Build a Local Language Model for the RSC
* Develop and Program the Robot’s Behavior and Personality
* Build a Digital Twin Simulation for Multimodal Interaction
* Explore the Use of the RSC as an Affective Robot to Address Students’ Academic Emotions
* Explore and Implement Cybersecurity Measures for a Social Robot
[https://github.com/orgs/RobotStudyCompanion/discussions/3 More info on GitHub] | contact farnaz.baksh@ut.ee for more information
<hr>
=== ROBOTONT: Quick launch demo suite and the final complete release of its ROS1 software ===
The goal of this thesis is to refine the existing Robotont demos (e.g., AR-tag steering, follow-the-leader, dancing-with-robot, LEAP-based control) and package them in an easy-to-use way for quick deployment by anyone during public events such as science-popularization workshops and school visits.
The results of this work will be packaged as the final ROS1 release of Robotont software, as ROS1 reaches end of life (EOL) in May 2025.
<hr>


=== ROBOTONT: ROS support, demos, and educational materials for open-source mobile robot ===
The project involves many potential thesis topics on the open-source robot platform ROBOTONT. The nature of the thesis can be software development to improve the platform's capabilities, simulation of specific scenarios, and/or demonstration of ROBOTONT in a real-life setting. A more detailed thesis topic will be outlined during an in-person meeting.
<hr>
=== Virtual reality user interface (VRUI) for intuitive teleoperation system ===
Enhancing the user experience of a virtual reality UI developed by Georg Astok. Potentially adding virtual reality capability to a gesture- and natural-language-based robot teleoperation system.
<hr>
=== Health monitor for intuitive telerobot ===
Intelligent status and error handling for an intuitive telerobotic system.
<hr>
=== 3D scanning of industrial objects ===
Using laser sensors and cameras to create accurate models of industrial products for quality control or further processing.
<hr>
=== Modeling humans for human-robot interaction ===
True human-robot collaboration means that the robot must understand the actions, intention, and state of its human partner. This work involves using cameras and other human sensors for digitally representing and modelling humans. There are multiple stages of modeling: a) physical models of human kinematics and dynamics; b) higher-level models for recognizing human intent.
<hr>
=== Robotic avatar for telepresence ===
Integrating hand gestures and head movements to control a robot avatar in a virtual reality user interface.
<hr>
=== Detection of hardware and software resources for smart integration of robots ===
The vast majority of today's robotic applications rely on hard-coded device and algorithm usage. This project focuses on developing Resource Snooper software that can detect the addition or removal of resources for the benefit of dynamic reconfiguration of robotic systems. This project is developed as a subsystem of TeMoto.
<hr>
=== Sonification of feedback during teleoperation of robots ===
Humans are used to receiving auditory feedback in their everyday lives. It helps us make decisions and be aware of potential dangers. Telerobotic interfaces can deploy the same idea to improve situational awareness and robotic task efficiency. The thesis project involves a study of different sonification solutions and the implementation of one in a telerobotic application using ROS.
<hr>
=== Human-Robot and Robot-Robot collaboration applications ===
Creating a demo or analysis of software capabilities related to human-robot or robot-robot teams:
* human-robot collaborative assembly
* distributed mapping; analysis and demo of existing ROS packages (e.g., segmap https://youtu.be/JJhEkIA1xSE) for multi-robot mapping
* Inaccessible region teamwork
**youbot+drone - a drone maps the environment (for example, a maze) and the ground vehicle uses this information to traverse the maze
**youbot+robotont - youbot cannot climb ledges, but it can lift a smaller robot, such as robotont, up a ledge
<hr>
=== Mirroring human hand movements on industrial robots ===
The goal of this project is to integrate continuous control of an industrial robot manipulator with a gestural telerobotics interface. The recommended tools for this thesis project are a Leap Motion Controller (Ultraleap) or a standard web camera, a Universal Robots UR5 manipulator, and ROS.
<hr>
=== ROBOTONT: TeMoto for robotont ===
Swarm management and UMRF-based task loading for robotont using the [https://github.com/temoto-framework TeMoto] framework.
<hr>


=== Enhancing teleoperation control interface with augmented cues to provoke caution ===
The task is to create a telerobot control interface where the video feed from the remote site and/or a mixed-reality scene is augmented with visual cues to provoke caution in the human operator.
<hr>
=== Robot-to-human interaction ===
As robots and autonomous machines start sharing the same space as humans, their actions need to be understood by the people occupying that space. For instance, a human worker needs to understand what the robot partner is planning next, or a pedestrian needs to clearly comprehend the behaviour of a driverless vehicle. To reduce this ambiguity, the robot needs mechanisms to convey its intent (whatever it is going to do next). The aim of the thesis is to outline existing methods for machines to convey their intent and to develop a unified model interface for expressing that intent.
<hr>
=== Gaze-based handover prediction ===
When a human needs to pass an object to a robot manipulator, the robot must understand where in 3D space the object handover occurs and then plan an appropriate motion. Human gaze can be used as the input for predicting which object to track. The thesis activities involve camera-based eye tracking and safe motion planning.
<hr>
=== Real-world demonstrator for MIR+UR+TeMoto integration ===
Integration of a mobile manipulator (MIR100 + UR5e + Robotiq gripper) to demonstrate TeMoto in a collaborative application.
<hr>
Robotont is an ankle-high flat mobile robot. For humans to interact with Robotont, there is a need for a compact and lightweight mechanical structure that is tall enough for comfortable human-robot interaction. The objective of the thesis is to develop the mechanical designs and build prototypes that ensure stable operation and meet the aesthetic requirements for use in public places.
<hr>
=== Stratos Explore Ultraleap demonstrator for robotics ===
The aim of this thesis is to systematically analyze the strengths and limitations of the Stratos Explore Ultraleap device in the context of controlling a robot, and subsequently to implement a demonstrator application showcasing its applicability for robot control.
<hr>
=== Mixed-reality scene creation for vehicle teleoperation ===
Fusing different sensory feeds for creating a high-usability teleoperation scene.
<hr>
=== Validation study for AR-based robot user-interfaces ===
Designing and carrying out a user study to validate the functionality and usability of a human-robot interface.
<hr>
=== Replication of the MIT Hydra demo ===
The goal of the thesis is to use the Hydra software package and integrate it with a robot used at IMS Robotics (e.g., TIAGo, Jackal, Robotont).
<br>Hydra takes sensor data as input (stereo or RGB-D camera images and IMU data) and produces a hierarchical model of the environment by estimating the trajectory of the robot (including loop closures), building a metric-semantic 3D mesh, and segmenting objects, places, and rooms in an indoor environment. These representations are combined into a 3D scene graph, which enables novel approaches for hierarchical loop-closure detection and ensures the representations remain consistent after loop closure. Hydra is implemented in C++ and is ROS-compatible. It uses a neural-network-based image segmentation front-end, but otherwise relies on an efficient, multi-threaded CPU-based implementation, which is suitable for mobile robot deployment.
<br>LINKS:
<br>Video: https://youtu.be/qZg2lSeTuvM
<br>Code: https://github.com/MIT-SPARK/Hydra
<br>Paper: http://www.roboticsproceedings.org/rss18/p050.pdf
<hr>
=== Quantitative Evaluation of the Situation Awareness during the Teleoperation of an Urban Vehicle ===
The goal of this thesis is to develop a methodology for measuring the operator's situation awareness while operating an autonomous urban vehicle. Potential metrics could include driving accuracy, speed, and latency. This thesis will be conducted as part of the activities at the [https://adl.cs.ut.ee Autonomous Driving Lab].
<hr>
 
=== NAO 'flippin' ===
The University of Tartu has a set of [https://www.aldebaran.com/en/nao NAO robots], which have been lying on shelves for some years now. It is time to make these robots into functional social robots that can be used, e.g., for children's communication therapy or as buddies for the elderly. The goal of this thesis is to test and upgrade all the NAO robots to functional condition.
<hr>


= Completed projects =
== PhD theses ==
*Houman Masnavi, [https://hdl.handle.net/10062/91394 Visibility aware navigation] [Nähtavust arvestav navigatsioon], PhD thesis, 2023
== Master's theses ==
*Robert Allik, Validation of NoMaD as a Global Planner for Mobile Robots, MS thesis, 2024
*Rauno Põlluäär, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=79393 Designing and Implementing a Bird’s-eye View Interface for a Self-driving Vehicle’s Teleoperation System] [Isejuhtiva sõiduki kaugjuhtimissüsteemile linnuvaate kasutajaliidese loomine], MS thesis, 2024
*Eva Mõtshärg, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=77385 3D-prinditava kere disain ja analüüs vabavaralisele haridusrobotile Robotont] [Design and Analysis of a 3D Printable Chassis for the Open Source Educational Robot Robotont], MS thesis, 2023
*Farnaz Baksh, [https://dspace.ut.ee/items/a7a9cc15-27e9-450c-94e8-3267f0c95c56 An Open-source Robotic Study Companion for University Students] [Avatud lähtekoodiga robotõpikaaslane üliõpilastele], MS thesis, 2023
*Igor Rybalskii, [http://hdl.handle.net/10062/83028 Augmented reality (AR) for enabling human-robot collaboration with ROS robots] [Liitreaalsus inimese ja roboti koostöö võimaldamiseks ROS-i robotitega], MS thesis, 2022
*Md. Maniruzzaman, [http://hdl.handle.net/10062/83025 Object search and retrieval in indoor environment using a Mobile Manipulator] [Objektide otsimine ja teisaldamine siseruumides mobiilse manipulaatori abil], MS thesis, 2022


== Bachelor's theses ==
*Albert Unn, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=79937 Suhtlusvõimekuse arendamine sotsiaalsele humanoidrobotile SemuBot] [Developing the ability to communicate for SemuBot, a social humanoid robot], BS thesis, 2024
*Veronika-Marina Volynets, [https://hdl.handle.net/10062/99943 Development of Control Electronics and Program for Robotont's Height Adjustment Mechanism] [Juhtelektroonika ja programmi väljatöötamine Robotondi kõrguse reguleerimise mehhanismile], BS thesis, 2024
*Nikita Kurenkov, [https://hdl.handle.net/10062/99880 Flexible Screen Integration and Development of Neck Movement Mechanism for Social Humanoid Robot SemuBot] [Humanoidroboti näo lahenduse leidmine ja rakendamine; humanoidroboti jaoks kaela mehhanismi väljatöötamine], BS thesis, 2024
*Elchin Huseynov, [https://hdl.handle.net/10062/99668 Design and Control of a Social Humanoid Robot - SemuBot’s Hand] [Sotsiaalse Humanoidroboti Disain ja Juhtimine - SemuBoti Käsi], BS thesis, 2024
*Veronika Podliesnova, [https://hdl.handle.net/10062/99636 Real-Time Detection of Robot Failures by Monitoring Operator’s Brain Activity with EEG-based Brain-Computer Interface] [Reaalajas Robotirikke Tuvastamine Operaatori Ajutegevuse Jälgimise teel EEG-põhise Aju-Arvuti Liidese abil], BS thesis, 2024
*Märten Josh Peedimaa, [https://hdl.handle.net/not-yet Rhea: An Open-Source Table Tennis Ball Launcher Robot for Multiball Training] [Avatud lähtekoodiga lauatennise palliviske robot mitmikpall treenimiseks], BS thesis, 2024
*Karl Sander Vinkel, [https://hdl.handle.net/not-yet Laadimisjaama ja transportkesta väljatöötamine õpperobotile Robotont] [Development of charging dock and transport case for Robotont], BS thesis, 2024
*Raimo Köidam, [https://hdl.handle.net/not-yet Valguslahenduse tarkvara väljatöötamine õpperobotile Robotont] [Development of light solution software for the educational robot Robotont], BS thesis, 2024
*Robert Valge, [https://hdl.handle.net/not-yet Toitepinge ja tarbevoolu monitoorimine ning toitehalduse püsivara loomine õpperobotil Robotont] [Monitoring supply voltage and current consumption and creating firmware for Robotont’s power management system], BS thesis, 2024
*Leonid Tšigrinski, [https://hdl.handle.net/not-yet Õpperoboti Robotont püsivara arhitektuuri uuendamine] [Education robot "Robotont" firmware architecture updating], BS thesis, 2024
*Andres Sakk, [https://hdl.handle.net/not-yet Avatud robotplatvormi Robotont 3 kasutajaliidese väljatöötamine] [Development of a user interface for the open robotics platform Robotont 3], BS thesis, 2024
*Kaur Kullamäe, [https://hdl.handle.net/not-yet Elektroonikalahendus ja püsivara robotkäe juhtimiseks sotsiaalsel humanoidrobotil SemuBot] [Building the SemuBot arm electronics system], BS thesis, 2024
*Kristjan Madis Kask, [https://hdl.handle.net/not-yet Käe mehaanika disain sotsiaalsele humanoidrobotile SemuBot] [Arm mechanics design for social humanoid robot SemuBot], BS thesis, 2024
*Sven-Ervin Paap, [https://hdl.handle.net/not-yet ROS2 draiver õpperobotile Robotont] [ROS2 driver for the educational robot Robotont], BS thesis, 2024
*Timur Nizamov, [https://hdl.handle.net/10062/99592 Audio System for the Social Humanoid Robot SemuBot] [Helisüsteem sotsiaalsele humanoidrobotile SemuBot], BS thesis, 2024
*Georgs Narbuts, [https://hdl.handle.net/10062/99554 Holonomic Motion Drive System of a Social Humanoid Robot SemuBot] [Sotsiaalse humanoidroboti SemuBoti holonoomne liikuv ajamisüsteem], BS thesis, 2024
*Iryna Hurova, [https://hdl.handle.net/10062/90638 Kitting station of the learning factory] [Õppetehase komplekteerimisjaam], BS thesis, 2023
*Paola Avalos Conchas, [https://hdl.handle.net/10062/90641 Payload transportation system of a learning factory] [Õppetehase kasuliku koorma transpordisüsteem], BS thesis, 2023
*Pille Pärnalaas, [https://hdl.handle.net/10062/93431 Pöördpõik ajami arendus robotplatvormile Robotont] [Development of swerve drive for robotic platform Robotont], BS thesis, 2023
*Priit Rooden, [https://hdl.handle.net/10062/93432 Autonoomse laadimislahenduse väljatöötamine õpperobotile Robotont] [Development of an autonomous charging solution for the robot platform Robotont], BS thesis, 2023
*Marko Muro, [https://hdl.handle.net/10062/93429 Robotondi akulahenduse ning 12 V pingeregulaatori prototüüpimine] [Prototyping battery solution and 12 V voltage regulator for Robotont], BS thesis, 2023
*Danel Leppenen, [https://hdl.handle.net/10062/93427 Nav2 PYIF: Python-based Motion Planning for ROS 2 Navigation 2] [Nav 2 PYIF: Pythoni põhine liikumise planeerija ROS 2 Navigation 2-le], BS thesis, 2023
*Kertrud Geddily Küüt, [https://hdl.handle.net/10062/93424 Kõrgust reguleeriv mehhanism Robotondile] [Height adjusting mechanism for Robotont], BS thesis, 2023
*Ingvar Drikkit, [https://hdl.handle.net/10062/93421 Lisaseadmete võimekuse arendamine haridusrobotile Robotont] [Developing add-on device support for the educational robot Robotont], BS thesis, 2023
*Kristo Pool, [https://hdl.handle.net/10062/93430 MoveIt 2 õppematerjalid] [Learning materials for MoveIt 2], BS thesis, 2023
*Erki Veeväli, [https://hdl.handle.net/10062/93434 Development of a Continuous Teleoperation System for Urban Road Vehicle] [Linnasõiduki pideva kaugjuhtimissüsteemi arendus], BS thesis, 2023
*Aleksandra Doroshenko, [https://hdl.handle.net/10062/93420 Haiglates inimesi juhatava roboti disain] [Hospital guide robot design], BS thesis, 2023
*Hui Shi, [http://hdl.handle.net/10062/83043 Expanding the Open-source ROS Software Package opencv_apps with Dedicated Blob Detection Functionality] [Avatud lähtekoodiga ROS-i tarkvarakimbu opencv_apps laiendamine laigutuvasti funktsionaalsusega], BS thesis, 2022
*Dāvis Krūmiņš, [http://hdl.handle.net/10062/83040 Web-based learning and software development environment for remote access of ROS robots] [Veebipõhine õppe- ja tarkvaraarenduse keskkond ROS robotite juurdepääsuks kaugteel], BS thesis, 2022

Latest revision as of 15:47, 4 October 2024
