<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom" xml:lang="en">
	<id>https://ims.ut.ee/api.php?action=feedcontributions&amp;feedformat=atom&amp;user=Karl</id>
	<title>Intelligent Materials and Systems Lab - User contributions [en]</title>
	<link rel="self" type="application/atom+xml" href="https://ims.ut.ee/api.php?action=feedcontributions&amp;feedformat=atom&amp;user=Karl"/>
	<link rel="alternate" type="text/html" href="https://ims.ut.ee/Special:Contributions/Karl"/>
	<updated>2026-04-22T02:56:21Z</updated>
	<subtitle>User contributions</subtitle>
	<generator>MediaWiki 1.38.2</generator>
	<entry>
		<id>https://ims.ut.ee/index.php?title=Theses_in_Robotics&amp;diff=46104</id>
		<title>Theses in Robotics</title>
		<link rel="alternate" type="text/html" href="https://ims.ut.ee/index.php?title=Theses_in_Robotics&amp;diff=46104"/>
		<updated>2026-03-23T10:40:36Z</updated>

		<summary type="html">&lt;p&gt;Karl: /* List of potential thesis topics */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:IMS-robotics]]&lt;br /&gt;
&amp;lt;div class=&amp;quot;toclimit-3&amp;quot;&amp;gt;__TOC__&amp;lt;/div&amp;gt; &lt;br /&gt;
= Projects in Advanced Robotics =&lt;br /&gt;
''The main objective of the following projects is to give students experience in working with advanced robotics technology. Our group is active in several R&amp;amp;D projects involving human-robot collaboration, intuitive teleoperation of robots, and autonomous navigation of unmanned mobile platforms. Our main software platforms are [http://www.ros.org/ Robot Operating System (ROS)] for developing software for advanced robot systems and [http://gazebosim.org/ Gazebo] for running realistic robotic simulations.''&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
For further information, contact [[User:Karl|Karl Kruusamäe]]. &lt;br /&gt;
&lt;br /&gt;
The following is not an exhaustive list of all available thesis/research topics.&lt;br /&gt;
&lt;br /&gt;
== Highlighted thesis topics for the 2025/2026 study year ==&lt;br /&gt;
# [[#Application of VLA (Vision-Language-Action) models in robotics|Application of VLA (Vision-Language-Action) models in robotics]]&lt;br /&gt;
# [[#LLM-based task planning for robots|LLM-based task planning for robots]]&lt;br /&gt;
# [[#ROBOTONT: analysis of different options as on-board computers|ROBOTONT: analysis of different options as on-board computers]]&lt;br /&gt;
# [[#SemuBOT: multiple topics|SemuBOT: multiple topics]]&lt;br /&gt;
# [[#ROBOTONT: integrating a graphical programming interface|ROBOTONT: integrating a graphical programming interface]]&lt;br /&gt;
# [[#Sign-language-based control for robots|Sign-language-based control for robots]]&lt;br /&gt;
# [[#Robotic Study Companion: a social robot for students in higher education|Robotic Study Companion: a social robot for students in higher education]]&lt;br /&gt;
# [[#Quantitative Evaluation of the Situation Awareness during the Teleoperation of an Urban Vehicle|Quantitative Evaluation of the Situation Awareness during the Teleoperation of an Urban Vehicle]]&lt;br /&gt;
&lt;br /&gt;
== List of potential thesis topics ==&lt;br /&gt;
Our inventory includes but is not limited to:&lt;br /&gt;
&amp;lt;gallery mode=&amp;quot;packed&amp;quot;&amp;gt;&lt;br /&gt;
File:Youbot.png|120px|thumb|KUKA youBot&lt;br /&gt;
File:Ur5_left.png|120px|thumb|Universal Robot UR5&lt;br /&gt;
File:Franka_Emika_Panda.jpg|120px|thumb|Franka Emika Panda&lt;br /&gt;
File:Clearpath_Jackal.jpg|120px|thumb|Clearpath Jackal&lt;br /&gt;
File:Kinova_KG-3_Gripper.jpg|120px|thumb|Kinova 3-finger gripper&lt;br /&gt;
File:Xarm7.jpg|120px|thumb|UFACTORY xArm7&lt;br /&gt;
File:Robotont_banner.png|120px|thumb|robotont&lt;br /&gt;
File:Turtlebot3-waffle-pi.jpg|thumb|TurtleBot3&lt;br /&gt;
File:Parrot-bebop-2.jpg|thumb|Parrot Bebop 2&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: Migrating Firmware Architecture to FreeRTOS and Evaluating micro-ROS Integration ===&lt;br /&gt;
&lt;br /&gt;
This thesis continues the firmware development of the ROBOTONT mobile robot by modifying the existing firmware running on an STM microcontroller.&lt;br /&gt;
Although the architecture is [https://github.com/robotont/robotont-firmware/blob/v3.0-devel/docs/firmware_design.md modular by design], all components are executed within a single main loop, making extension and long-term maintenance difficult.&lt;br /&gt;
&lt;br /&gt;
The goal of this thesis is to restructure the firmware using FreeRTOS by dividing functionality into separate tasks with defined responsibilities. In addition, the thesis examines whether micro-ROS can be implemented directly on the STM microcontroller to enable ROS-based communication at the firmware level. The work evaluates technical feasibility and resource usage to determine whether this approach is suitable for ROBOTONT.&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT Lite ===&lt;br /&gt;
The goal of this thesis is to optimize the ROBOTONT platform for cost by replacing the onboard computer and sensors with low-cost alternatives while ensuring ROS/ROS2 software compatibility. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: integrating a graphical programming interface ===&lt;br /&gt;
The goal of this thesis is to integrate a graphical programming solution (e.g., Scratch or Blockly) to enable programming of Robotont by non-experts. See, e.g., https://doi.org/10.48550/arXiv.2011.13706&lt;br /&gt;
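As a hedged sketch of the kind of glue such an integration needs (block names and the command format are illustrative assumptions, not an existing Robotont or Blockly API), each graphical block could compile down to a base velocity command:&lt;br /&gt;

```python
# Hypothetical sketch: compiling graphical programming blocks into
# velocity commands (vx, vy, wz) for an omnidirectional base such as
# Robotont. Block names and the tuple format are illustrative
# assumptions, not part of any existing Robotont or Blockly API.

BLOCK_TO_TWIST = {
    "move_forward": (0.2, 0.0, 0.0),   # m/s along the robot x-axis
    "move_backward": (-0.2, 0.0, 0.0),
    "strafe_left": (0.0, 0.2, 0.0),    # omni base can translate sideways
    "turn_left": (0.0, 0.0, 0.5),      # rad/s around the z-axis
    "stop": (0.0, 0.0, 0.0),
}

def compile_blocks(block_names):
    """Translate a sequence of block names into velocity commands."""
    commands = []
    for name in block_names:
        if name not in BLOCK_TO_TWIST:
            raise ValueError("unknown block: " + name)
        commands.append(BLOCK_TO_TWIST[name])
    return commands
```

On the robot side each tuple would then be published as a geometry_msgs/Twist message, which is how ROS-based mobile bases such as Robotont typically receive velocity commands.&lt;br /&gt;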
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== SemuBOT: multiple topics ===&lt;br /&gt;
In the 2023/2024 study year, multiple topics are offered to support the development of the open-source humanoid robot project SemuBOT (https://www.facebook.com/semubotmtu/).&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Robotic Study Companion: a social robot for students in higher education ===&lt;br /&gt;
Potential Topics:&lt;br /&gt;
* Enhance the Robot's Speech/Natural Language Capabilities &lt;br /&gt;
* Build a Local Language Model for the RSC&lt;br /&gt;
* Develop and Program the Robot’s Behavior and Personality&lt;br /&gt;
* Build a Digital Twin Simulation for Multimodal Interaction&lt;br /&gt;
* Explore the Use of the RSC as an Affective Robot to Address Students’ Academic Emotions&lt;br /&gt;
* Explore and Implement Cybersecurity Measures for Social Robot&lt;br /&gt;
&lt;br /&gt;
[https://github.com/orgs/RobotStudyCompanion/discussions/3 More info on Github] | reach out to farnaz.baksh@ut.ee for more info&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Virtual reality user interface (VRUI) for intuitive teleoperation system ===&lt;br /&gt;
[[Image:LeapMotion.png|200px|thumb|right|Detecting 2 hands with Leap Motion Controller]]&lt;br /&gt;
Enhancing the user experience of a virtual reality UI developed by [https://github.com/ut-ims-robotics/vrui_rviz Georg Astok]. Potentially adding [http://www.osvr.org/hardware-devs.html virtual reality capability] to a [https://www.youtube.com/watch?v=L25HHFd00rc gesture- and natural-language-based robot teleoperation system].&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
[[Image:Temoto-working-principle-in-pics.png|x160px|Gesture-based teleoperation]]&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Modeling humans for human-robot interaction ===&lt;br /&gt;
True human-robot collaboration means that the robot must understand the actions, intention, and state of its human partner. This work involves using cameras and other human-oriented sensors to digitally represent and model humans. There are multiple stages of modeling: a) physical models of human kinematics and dynamics; b) higher-level models for recognizing human intent.&amp;lt;br&amp;gt;&lt;br /&gt;
[[Image:Skeletal ROS.PNG|x160px|ROS &amp;amp; Kinect &amp;amp; Skeleton-Markers Package]]&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Robotic avatar for telepresence ===&lt;br /&gt;
Integrating hand gestures and head movements to control a robot avatar in virtual reality user interface.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Detection of hardware and software resources for smart integration of robots ===&lt;br /&gt;
The vast majority of today’s robotic applications rely on hard-coded device and algorithm usage. This project focuses on developing Resource Snooper software that can detect the addition or removal of resources for the benefit of dynamic reconfiguration of robotic systems. This project is developed as a subsystem of [https://utnuclearroboticspublic.github.io/temoto2/index.html TeMoto].&lt;br /&gt;
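The core snooping step can be sketched as a diff between successive snapshots of available resources (the resource names below are made-up examples, not TeMoto identifiers):&lt;br /&gt;

```python
# Illustrative sketch of the resource-snooping idea: periodically take a
# snapshot of available resources (device names, loaded algorithms, ...)
# and diff it against the previous snapshot, so the system can reconfigure
# in response to additions and removals. Names are made-up examples.

def diff_resources(previous, current):
    """Return (added, removed) between two resource snapshots."""
    prev, curr = set(previous), set(current)
    return curr - prev, prev - curr
```

A snooper would run this on a timer and emit events only for non-empty diffs, leaving the reaction (e.g., reloading a driver) to the rest of the framework.&lt;br /&gt;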
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Human-Robot and Robot-Robot collaboration applications ===&lt;br /&gt;
Creating a demo or analysis of software capabilities related to human-robot or robot-robot teams:&lt;br /&gt;
* human-robot collaborative assembly&lt;br /&gt;
* distributed mapping; analysis and demo of existing ROS (e.g., segmap https://youtu.be/JJhEkIA1xSE) packages for multi-robot mapping&lt;br /&gt;
* Inaccessible region teamwork&lt;br /&gt;
**youbot+drone - a drone maps the environment (for example, a maze) and a ground vehicle uses this information to traverse the maze&lt;br /&gt;
**youbot+robotont - the youBot cannot go up ledges, but it can lift a smaller robot, such as Robotont, up a ledge.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Mirroring human hand movements on industrial robots ===&lt;br /&gt;
The goal of this project is to integrate continuous control of an industrial robot manipulator with a gestural telerobotics interface. The recommended tools for this thesis project are a Leap Motion Controller (Ultraleap) or a standard web camera, a Universal Robot UR5 manipulator, and ROS.&lt;br /&gt;
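One recurring design decision in such mirroring is mapping tracker coordinates into the robot's workspace; a minimal sketch, with purely illustrative range values, is:&lt;br /&gt;

```python
# Minimal sketch of workspace mapping for gestural mirroring: rescale a
# hand coordinate from the hand tracker's range into the manipulator's
# safe range, clamping so the commanded pose never leaves the workspace.
# The numeric ranges used here are illustrative assumptions.

def map_axis(hand_value, hand_range, robot_range):
    """Linearly rescale hand_value from hand_range into robot_range."""
    hand_lo, hand_hi = hand_range
    robot_lo, robot_hi = robot_range
    u = (hand_value - hand_lo) / (hand_hi - hand_lo)
    u = min(1.0, max(0.0, u))  # clamp to keep the robot inside its limits
    return robot_lo + u * (robot_hi - robot_lo)
```

Clamping rather than rejecting out-of-range input keeps the mirrored motion continuous, which matters for teleoperation comfort; the same mapping would be applied per axis before sending the pose goal to the manipulator.&lt;br /&gt;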
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: TeMoto for robotont ===&lt;br /&gt;
Swarm-management and UMRF-based task loading for robotont using [https://github.com/temoto-framework TeMoto] framework.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Enhancing teleoperation control interface with augmented cues to provoke caution ===&lt;br /&gt;
The task is to create a telerobot control interface where the video feed from the remote site and/or a mixed-reality scene is augmented with visual cues to provoke caution in the human operator.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Robot-to-human interaction ===&lt;br /&gt;
As robots and autonomous machines start sharing the same space as humans, their actions need to be understood by the people occupying that space. For instance, a human worker needs to understand what the robot partner is planning next, and a pedestrian needs to clearly comprehend the behaviour of a driverless vehicle. To reduce this ambiguity, the robot needs mechanisms to convey its intent (whatever it is going to do next). The aim of the thesis is to outline existing methods for machines to convey their intent and to develop a unified model interface for expressing that intent. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Gaze-based handover prediction ===&lt;br /&gt;
When a human needs to pass an object to a robot manipulator, the robot must understand where in 3D space the object handover occurs and then plan an appropriate motion. Human gaze can be used as the input for predicting which object to track. The thesis activities involve camera-based eye tracking and safe motion planning.&lt;br /&gt;
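To make the prediction step concrete, here is a hedged sketch (assumed inputs: a gaze origin and a unit gaze direction, both expressed in the robot frame) of estimating a candidate handover location by intersecting the gaze ray with a horizontal plane, e.g. at tabletop height; a real system would fuse this with object tracking:&lt;br /&gt;

```python
# Hedged sketch: intersect the gaze ray origin + t*direction (t non-negative)
# with the horizontal plane z = plane_z to get a candidate 3D handover point.
# The frames and the planar assumption are simplifications for illustration.

def gaze_plane_intersection(origin, direction, plane_z):
    """Return the (x, y, z) intersection point, or None if there is none."""
    ox, oy, oz = origin
    dx, dy, dz = direction
    if dz == 0.0:
        return None  # gaze parallel to the plane: no intersection
    t = (plane_z - oz) / dz
    if t >= 0.0:
        return (ox + t * dx, oy + t * dy, plane_z)
    return None  # plane lies behind the viewer
```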
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: Human-height human-robot interface for Robotont ground robot ===&lt;br /&gt;
Robotont is an ankle-high flat mobile robot. For humans to interact with Robotont, there is a need for a compact and lightweight mechanical structure that is tall enough for comfortable human-robot interaction. The objective of the thesis is to develop the mechanical designs and build prototypes that ensure stable operation and meet the aesthetic requirements for use in public places. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Stratos Explore Ultraleap demonstrator for robotics ===&lt;br /&gt;
The aim of this thesis is to systematically analyze the strengths and limitations of the Stratos Explore Ultraleap device in the context of controlling a robot, and subsequently to implement a demonstrator application showing its applicability for robot control. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Mixed-reality scene creation for vehicle teleoperation ===&lt;br /&gt;
Fusing different sensory feeds for creating high-usability teleoperation scene.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Validation study for AR-based robot user-interfaces ===&lt;br /&gt;
Designing and carrying out a user study to validate the functionality and usability of a human-robot interface.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Quantitative Evaluation of the Situation Awareness during the Teleoperation of an Urban Vehicle ===&lt;br /&gt;
The goal of this thesis is to develop the methodology for measuring the operator's situation awareness while operating an autonomous urban vehicle. Potential metrics could include driving accuracy, speed, and latency. This thesis will be conducted as part of the activities at the [https://adl.cs.ut.ee Autonomous Driving Lab].&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Sign-language-based control for robots ===&lt;br /&gt;
The goal of this thesis is to design and implement the means for interacting with a robot via conventional sign language, i.e. the human can command a robot by using sign language.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== LLM-based task planning for robots ===&lt;br /&gt;
The goal of this thesis is to enable high-level task planning for autonomous robots performing general-purpose tasks. The thesis would leverage an LLM's reasoning capabilities and the TeMoto Action Engine to achieve natural task negotiation, planning, and execution. &lt;br /&gt;
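As a hedged sketch of the execution-side plumbing (the JSON schema and action names are assumptions of this sketch, not the actual TeMoto or UMRF format), the robot can validate an LLM-emitted plan against a whitelist of known actions before executing anything:&lt;br /&gt;

```python
import json

# Illustrative sketch: the LLM is prompted to emit its plan as JSON, and
# the robot side validates every step against a whitelist of known actions
# before execution. The schema and action names are assumptions of this
# sketch, not the actual TeMoto or UMRF format.

KNOWN_ACTIONS = {"navigate_to", "pick", "place", "say"}

def parse_plan(llm_output):
    """Parse an LLM-produced JSON plan, rejecting unknown actions."""
    plan = json.loads(llm_output)
    steps = plan["steps"]
    for step in steps:
        if step["action"] not in KNOWN_ACTIONS:
            raise ValueError("unknown action: " + step["action"])
    return steps
```

Keeping the LLM on the language side and the whitelist on the robot side is one way to get natural task negotiation without letting the model invoke capabilities the robot does not actually have.&lt;br /&gt;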
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: Development of CoppeliaSim-based simulation and a digital twin ===&lt;br /&gt;
The objective of the thesis is to develop an easy-to-use simulation for Robotont gen3 using CoppeliaSim (https://www.coppeliarobotics.com/).&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Application of VLA (Vision-Language-Action) models in robotics ===&lt;br /&gt;
The goal of this thesis is to apply VLA models to real-world robotics. The thesis project will involve topics related to robots, machine learning, and transformers.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== SemuBot x Robotont crossover ===&lt;br /&gt;
The goal of this thesis is to build a mechanically integrated add-on for Robotont that enables playing out socially assistive scenarios with Robotont. The resulting solution would serve as a lite version of SemuBot. The thesis can adopt the approach proposed in this earlier thesis: https://hdl.handle.net/10062/93420&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
= Completed projects =&lt;br /&gt;
== PhD theses ==&lt;br /&gt;
*Robert Valner, [https://hdl.handle.net/10062/105994 Design of TeMoto, a software framework for dependable, adaptive, and collaborative autonomous robots] [TeMoto – töökindlate, adaptiivsete ja koostöövõimeliste autonoomsete robotite arendamise tarkvararaamistik], PhD thesis, 2024&lt;br /&gt;
*Houman Masnavi, [https://hdl.handle.net/10062/91394 Visibility aware navigation] [Nähtavust arvestav navigatsioon], PhD thesis, 2023&lt;br /&gt;
&lt;br /&gt;
== Master's theses ==&lt;br /&gt;
*Asier Mandiola Arrizabalaga, Haptic-Based Teleoperation of Robots, MS thesis, 2025&lt;br /&gt;
*Dāvis Krūminš, Web-based Robotics Lab for Effortless ROS2 Development, MS thesis, 2025&lt;br /&gt;
*Julian Rene Leclerc, Natural Language Human-Robot Interaction: A Modular Framework for Conversational Robot Control Using Large Language Models, MS thesis, 2025&lt;br /&gt;
*Miriam Calafa’, Designing Multimodal Emotional Expression for a Robotic Study Companion, MS thesis, 2025&lt;br /&gt;
*Iryna Hurova, Model-based Planning Using GPU-accelerated Simulator as a World Model, MS thesis, 2025&lt;br /&gt;
*Sander Toma Võrk, Elektrooniliste termomeetrite automaatse kalibreerimissüsteemi väljatöötamine ja rakendamine [Development and implementation of an automatic calibration system for electronic thermometers], MS thesis, 2025&lt;br /&gt;
*Carl Hjalmar Love Hult, SurfMotion: An Open Source Pipeline for Robotic Pipe Cutting and Welding, MS thesis, 2025&lt;br /&gt;
*Paola Avalos Conchas, Socially Aware Planning for Indoor Navigation, MS thesis, 2025&lt;br /&gt;
*Robert Allik, Validation of NoMaD as a Global Planner for Mobile Robots, MS thesis, 2024&lt;br /&gt;
*Rauno Põlluäär, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=79393 Designing and Implementing a Bird’s-eye View Interface for a Self-driving Vehicle’s Teleoperation System] [Isejuhtiva sõiduki kaugjuhtimissüsteemile linnuvaate kasutajaliidese loomine], MS thesis, 2024&lt;br /&gt;
*Gautier Reynes, VR-Enhanced Remote Inspection Framework for Semi-Autonomous Robot Fleet, MS thesis, 2024&lt;br /&gt;
*Eva Mõtshärg, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=77385 3D-prinditava kere disain ja analüüs vabavaralisele haridusrobotile Robotont] [Design and Analysis of a 3D Printable Chassis for the Open Source Educational Robot Robotont], MS thesis, 2023&lt;br /&gt;
*Farnaz Baksh, [https://dspace.ut.ee/items/a7a9cc15-27e9-450c-94e8-3267f0c95c56 An Open-source Robotic Study Companion for University Students] [Avatud lähtekoodiga robotõpikaaslane üliõpilastele], MS thesis, 2023&lt;br /&gt;
*Igor Rybalskii, [http://hdl.handle.net/10062/83028 Augmented reality (AR) for enabling human-robot collaboration with ROS robots] [Liitreaalsus inimese ja roboti koostöö võimaldamiseks ROS-i robotitega], MS thesis, 2022&lt;br /&gt;
*Md. Maniruzzaman, [http://hdl.handle.net/10062/83025 Object search and retrieval in indoor environment using a Mobile Manipulator] [Objektide otsimine ja teisaldamine siseruumides mobiilse manipulaatori abil], MS thesis, 2022&lt;br /&gt;
*Allan Kustavus, [http://hdl.handle.net/10062/72651 Design and Implementation of a Generalized Resource Management Architecture in the TeMoto Software Framework] [Üldise ressursihalduri disain ja teostus TeMoto tarkvara raamistikule], MS thesis, 2021&lt;br /&gt;
*Kristina Meister, [http://hdl.handle.net/10062/72350 External human-vehicle interaction - a study in the context of an autonomous ride-hailing service], MS thesis, 2021&lt;br /&gt;
*Muhammad Usman, [http://hdl.handle.net/10062/72126 Development of an Optimization-Based Motion Planner and Its ROS Interface for a Non-Holonomic Mobile Manipulator] [Optimeerimisele baseeruva liikumisplaneerija arendamine ja selle ROSi liides mitteholonoomse mobiilse manipulaatori jaoks], MS thesis, 2020&lt;br /&gt;
*Maarika Oidekivi, [http://hdl.handle.net/10062/72119 Masina kavatsuse väljendamine ja tõlgendamine] [Communicating and interpreting machine intent], MS thesis, 2020&lt;br /&gt;
*Houman Masnavi, [http://hdl.handle.net/10062/72118 Multi-Robot Motion Planning for Shared Payload Transportation] [Rajaplaneerimine multi-robot süsteemile jagatud lasti transportimisel], MS thesis, 2020&lt;br /&gt;
*Fabian Ernesto Parra Gil, [http://hdl.handle.net/10062/72112 Implementation of Robot Manager Subsystem for Temoto Software Framework] [Robotite Halduri alamsüsteemi väljatöötamine tarkvararaamistikule TEMOTO], MS thesis, 2020&lt;br /&gt;
*Zafarullah, [http://hdl.handle.net/10062/72125 Gaze Assisted Neural Network based Prediction of End-Point of Human Reaching Trajectories], MS thesis, 2020&lt;br /&gt;
*Madis K Nigol, [http://hdl.handle.net/10062/64339 Õppematerjalid robotplatvormile Robotont] [Study materials for robot platform Robotont], MS thesis, 2019&lt;br /&gt;
*Renno Raudmäe, [http://hdl.handle.net/10062/64341 Avatud robotplatvorm Robotont] [Open source robotics platform Robotont], MS thesis, 2019&lt;br /&gt;
*Asif Sattar, [http://hdl.handle.net/10062/64352 Human detection and distance estimation with monocular camera using YOLOv3 neural network] [Inimeste tuvastamine ning kauguse hindamine kasutades kaamerat ning YOLOv3 tehisnärvivõrku], MS thesis, 2019&lt;br /&gt;
*Ragnar Margus, [http://hdl.handle.net/10062/64337 Kergliiklusvahendite jagamisteenuseks vajaliku positsioneerimismooduli loomine ja uurimine] [Development and testing of an IoT module for an electric vehicle sharing service], MS thesis, 2019&lt;br /&gt;
*Pavel Šumejko, [http://hdl.handle.net/10062/64320 Robust Solution for Extrinsic Calibration of a 2D Laser-Rangefinder and a Monocular USB Camera] [Meetod 2D laserkaugusmõõdiku ja USB-kaamera väliseks kalibreerimiseks], MS thesis, 2019&lt;br /&gt;
*Dzvezdana Arsovska, [http://hdl.handle.net/10062/64321  Building an Efficient and Secure Software Supply Pipeline for Aerial Robotics Application] [Efektiivne ja turvaline tarkvaraarendusahel lennurobootika rakenduses], MS thesis, 2019&lt;br /&gt;
*Tõnis Tiimus, [http://hdl.handle.net/10062/60713 Visuaalselt esilekutsutud potentsiaalidel põhinev aju-arvuti liides robootika rakendustes] [A VEP-based BCI for robotics applications], MS thesis, 2018&lt;br /&gt;
*Martin Appo, [http://hdl.handle.net/10062/60718 Hardware-agnostic compliant control ROS package for collaborative industrial manipulators] [Riistvarapaindlik ROSi tarkvarapakett tööstuslike robotite mööndlikuks juhtimiseks], MS thesis, 2018&lt;br /&gt;
*Hassan Mahmoud Shehawy Elhanash, [http://hdl.handle.net/10062/60711 Optical Tracking of Forearm for Classifying Fingers Poses] [Küünarvarre visuaalne jälgimine ennustamaks sõrmede asendeid], MS thesis, 2018&lt;br /&gt;
&lt;br /&gt;
== Bachelor's theses ==&lt;br /&gt;
*Kaarel-Richard Kaarelson, [https://thesis.cs.ut.ee/12b3ed0c-b3c1-473e-ae13-5063fde9850a TeMoto Action Assistant: A Web-Based Human–Robot Interface for Designing UMRF Graphs] [TeMoto Action Assistant: Veebipõhine Inimese ja Roboti Interaktsiooni Tööriist UMRF Graafide Loomiseks], BS thesis, 2025&lt;br /&gt;
*Oliver Voorel, Humanoidrobot Semuboti toitesüsteemi uuendamine [Upgrading the power system of the humanoid robot SemuBot], BS thesis, 2025&lt;br /&gt;
*Karl Rahn, Millimeeterlaine radari integreerimine avatud lähtekoodiga muruniiduki platvormil Open Mower [Integrating millimeter-wave radar on the open-source lawnmower platform Open Mower], BS thesis, 2025&lt;br /&gt;
*Mattias Mäe, Sõiduki kaugjuhtimise viivituse mõõtmine [Measuring latency in vehicle teleoperation], BS thesis, 2025&lt;br /&gt;
*Jürgen Kottise, Dockeri konteineritel põhinev haldustarkvara Robotont 3 õpperobotile [Docker-container-based management software for the educational robot Robotont 3], BS thesis, 2025&lt;br /&gt;
*Martin Kaur, Servodel põhinev sotsiaalse robotkäe süsteem SemuBotile [A servo-based social robot arm system for SemuBot], BS thesis, 2025&lt;br /&gt;
*Karl-Jürgen Siilak, Mass Portal Grand Pharaoh XD 3D-printeri uuendamine [Upgrading the Mass Portal Grand Pharaoh XD 3D printer], BS thesis, 2025&lt;br /&gt;
*Usman Ali Khan, Graphical programming interface for ROBOTONT, an open-source educational robot, BS thesis, 2025&lt;br /&gt;
*Robina Zvirgzdina, Development of an Autonomous Open-Source Inventory Performance Robot for the University of Tartu Library, BS thesis, 2025&lt;br /&gt;
*Omar Huseynli, ROS integration for the Semubot robot, BS thesis, 2025&lt;br /&gt;
*Albert Unn, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=79937 Suhtlusvõimekuse arendamine sotsiaalsele humanoidrobotile SemuBot] [Developing the ability to communicate for SemuBot, a social humanoid robot], BS thesis, 2024&lt;br /&gt;
*Veronika-Marina Volynets, [https://hdl.handle.net/10062/99943 Development of Control Electronics and Program for Robotont's Height Adjustment Mechanism] [Juhtelektroonika ja programmi väljatöötamine Robotondi kõrguse reguleerimise mehhanismile], BS thesis, 2024&lt;br /&gt;
*Nikita Kurenkov, [https://hdl.handle.net/10062/99880 Flexible Screen Integration and Development of Neck Movement Mechanism for Social Humanoid Robot SemuBot] [Humanoidroboti näo lahenduse leidmine ja rakendamine; humanoidroboti jaoks kaela mehhanismi väljatöötamine], BS thesis, 2024&lt;br /&gt;
*Elchin Huseynov, [https://hdl.handle.net/10062/99668 Design and Control of a Social Humanoid Robot - SemuBot’s Hand] [Sotsiaalse Humanoidroboti Disain ja Juhtimine - SemuBoti Käsi], BS thesis, 2024&lt;br /&gt;
*Veronika Podliesnova, [https://hdl.handle.net/10062/99636 Real-Time Detection of Robot Failures by Monitoring Operator’s Brain Activity with EEG-based Brain-Computer Interface] [Reaalajas Robotirikke Tuvastamine Operaatori Ajutegevuse Jälgimise teel EEG-põhise Aju-Arvuti Liidese abil], BS thesis, 2024&lt;br /&gt;
*Märten Josh Peedimaa, [https://hdl.handle.net/not-yet Rhea: An Open-Source Table Tennis Ball Launcher Robot for Multiball Training] [Avatud lähtekoodiga lauatennise palliviske robot mitmikpall treenimiseks], BS thesis, 2024&lt;br /&gt;
*Karl Sander Vinkel, [https://hdl.handle.net/not-yet Laadimisjaama ja transportkesta väljatöötamine õpperobotile Robotont] [Development of charging dock and transport case for Robotont], BS thesis, 2024&lt;br /&gt;
*Raimo Köidam, [https://hdl.handle.net/not-yet Valguslahenduse tarkvara väljatöötamine õpperobotile Robotont] [Development of light solution software for the educational robot Robotont], BS thesis, 2024&lt;br /&gt;
*Robert Valge, [https://hdl.handle.net/not-yet Toitepinge ja tarbevoolu monitoorimine ning toitehalduse püsivara loomine õpperobotil Robotont] [Monitoring supply voltage and current consumption and creating firmware for Robotont’s power management system], BS thesis, 2024&lt;br /&gt;
*Leonid Tšigrinski, [https://hdl.handle.net/not-yet Õpperoboti Robotont püsivara arhitektuuri uuendamine] [Education robot &amp;quot;Robotont&amp;quot; firmware architecture updating], BS thesis, 2024&lt;br /&gt;
*Andres Sakk, [https://hdl.handle.net/not-yet Avatud robotplatvormi Robotont 3 kasutajaliidese väljatöötamine] [Development of a user interface for the open robotics platform Robotont 3], BS thesis, 2024&lt;br /&gt;
*Kaur Kullamäe, [https://hdl.handle.net/not-yet Elektroonikalahendus ja püsivara robotkäe juhtimiseks sotsiaalsel humanoidrobotil SemuBot] [Building the SemuBot arm electronics system], BS thesis, 2024&lt;br /&gt;
*Kristjan Madis Kask, [https://hdl.handle.net/not-yet Käe mehaanika disain sotsiaalsele humanoidrobotile SemuBot] [Arm mechanics design for social humanoid robot SemuBot], BS thesis, 2024&lt;br /&gt;
*Sven-Ervin Paap, [https://hdl.handle.net/not-yet ROS2 draiver õpperobotile Robotont] [ROS2 driver for the educational robot Robotont], BS thesis, 2024&lt;br /&gt;
*Timur Nizamov, [https://hdl.handle.net/10062/99592 Audio System for the Social Humanoid Robot SemuBot] [Helisüsteem sotsiaalsele humanoidrobotile SemuBot], BS thesis, 2024&lt;br /&gt;
*Georgs Narbuts, [https://hdl.handle.net/10062/99554 Holonomic Motion Drive System of a Social Humanoid Robot SemuBot] [Sotsiaalse humanoidroboti SemuBoti holonoomne liikuv ajamisüsteem], BS thesis, 2024&lt;br /&gt;
*Iryna Hurova, [https://hdl.handle.net/10062/90638 Kitting station of the learning factory] [Õppetehase komplekteerimisjaam], BS thesis, 2023&lt;br /&gt;
*Paola Avalos Conchas, [https://hdl.handle.net/10062/90641 Payload transportation system of a learning factory] [Õppetehase kasuliku koorma transpordisüsteem], BS thesis, 2023&lt;br /&gt;
*Pille Pärnalaas, [https://hdl.handle.net/10062/93431 Pöördpõik ajami arendus robotplatvormile Robotont] [Development of swerve drive for robotic platform Robotont], BS thesis, 2023&lt;br /&gt;
*Carl Hjalmar Love Hult, [https://hdl.handle.net/10062/90673 Using and Evaluating the Real-time Spatial Perception System Hydra in Real-world Scenarios] [Reaalajas toimiva ruumilise taju süsteemi Hydra kasutamine ja hindamine praktilistes stsenaariumides], BS thesis, 2023&lt;br /&gt;
*Priit Rooden, [https://hdl.handle.net/10062/93432 Autonoomse laadimislahenduse väljatöötamine õpperobotile Robotont] [Development of an autonomous charging solution for the robot platform Robotont], BS thesis, 2023&lt;br /&gt;
*Marko Muro, [https://hdl.handle.net/10062/93429 Robotondi akulahenduse ning 12 V pingeregulaatori prototüüpimine] [Prototyping battery solution and 12 V voltage regulator for Robotont], BS thesis, 2023&lt;br /&gt;
*Danel Leppenen, [https://hdl.handle.net/10062/93427 Nav2 PYIF: Python-based Motion Planning for ROS 2 Navigation 2] [Nav 2 PYIF: Pythoni põhine liikumise planeerija ROS 2 Navigation 2-le], BS thesis, 2023&lt;br /&gt;
*Kertrud Geddily Küüt, [https://hdl.handle.net/10062/93424 Kõrgust reguleeriv mehhanism Robotondile] [Height adjusting mechanism for Robotont], BS thesis, 2023&lt;br /&gt;
*Ingvar Drikkit, [https://hdl.handle.net/10062/93421 Lisaseadmete võimekuse arendamine haridusrobotile Robotont] [Developing add-on device support for the educational robot Robotont], BS thesis, 2023&lt;br /&gt;
*Kristo Pool, [https://hdl.handle.net/10062/93430 MoveIt 2 õppematerjalid] [Learning materials for MoveIt 2], BS thesis, 2023&lt;br /&gt;
*Erki Veeväli, [https://hdl.handle.net/10062/93434 Development of a Continuous Teleoperation System for Urban Road Vehicle] [Linnasõiduki pideva kaugjuhtimissüsteemi arendus], BS thesis, 2023&lt;br /&gt;
*Aleksandra Doroshenko, [https://hdl.handle.net/10062/93420 Haiglates inimesi juhatava roboti disain] [Hospital guide robot design], BS thesis, 2023&lt;br /&gt;
*Hui Shi, [http://hdl.handle.net/10062/83043 Expanding the Open-source ROS Software Package opencv_apps with Dedicated Blob Detection Functionality] [Avatud lähtekoodiga ROS-i tarkvarakimbu opencv_apps laiendamine laigutuvasti funktsioonaalsusega], BS thesis, 2022&lt;br /&gt;
*Dāvis Krūmiņš, [http://hdl.handle.net/10062/83040 Web-based learning and software development environment for remote access of ROS robots] [Veebipõhine õppe- ja tarkvaraarenduse keskkond ROS robotite juurdepääsuks kaugteel], BS thesis, 2022&lt;br /&gt;
*Anna Jakovleva, [http://hdl.handle.net/10062/83037 Roboquiz - an interactive human-robot game] [Roboquiz - interaktiivne inimese ja roboti mäng], BS thesis, 2022&lt;br /&gt;
*Kristjan Laht, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=74702&amp;amp;year=2022 Robot Localization with Fiducial Markers] [Roboti lokaliseerimine koordinaatmärkidega], BS thesis, 2022&lt;br /&gt;
*Hans Pärtel Pani, [http://hdl.handle.net/10062/83008 ROS draiver pehmerobootika haaratsile] [ROS Driver for Soft Robotic Gripper], BS thesis, 2022&lt;br /&gt;
*Markus Erik Sügis, [http://hdl.handle.net/10062/83015 Jagatud juhtimise põhimõttel realiseeritud robotite kaugjuhtimissüsteem] [A continuous teleoperating system based on shared control concept], BS thesis, 2022&lt;br /&gt;
*Taaniel Küla, [http://hdl.handle.net/10062/83007 ROS2 platvormile keevitusroboti tarkvara portotüübi loomine kasutades UR5 robot-manipulaatorit] [Welding robot software prototype for ROS2 using UR5 robot arm], BS thesis, 2022&lt;br /&gt;
*Rauno Põlluäär, [http://hdl.handle.net/10062/72672 Veebirakendus-põhine kasutajaliides avatud robotplatvormi Robotont juhtimiseks ja haldamiseks] [Web application-based user interface for controlling and managing open-source robotics platform Robotont], BS thesis, 2021&lt;br /&gt;
*Hendrik Olesk, [http://hdl.handle.net/10062/72664 Nägemisulatuses kaugjuhitava mobiilse robotmanipulaatori kasutajamugavuse tõstmine] [Improving the usability of a mobile manipulator robot for line-of-sight remote control], BS thesis, 2021&lt;br /&gt;
*Tarvi Tepandi, [http://hdl.handle.net/10062/72665 Segareaalsusel põhinev kasutajaliides mobiilse roboti kaugjuhtimiseks Microsoft HoloLens 2 vahendusel] [Mixed-reality user interface for teleoperating mobile robots with Microsoft HoloLens 2], BS thesis, 2021&lt;br /&gt;
*Rudolf Põldma, [http://hdl.handle.net/10062/72674 Tartu linna Narva maantee ringristmiku digikaksik] [Digital twin for Narva street roundabout in Tartu], BS thesis, 2021&lt;br /&gt;
*Kwasi Akuamoah Boateng, [http://hdl.handle.net/10062/72804 Digital Twin of a Teaching and Learning Robotics Lab] [Robotite õpetamise ja õppimise labori digitaalne kaksik], BS thesis, 2021&lt;br /&gt;
*Karina Sein, [http://hdl.handle.net/10062/72102 Eestikeelse kõnesünteesi võimaldamine robootika arendusplatvormil ROS] [Enabling Estonian speech synthesis on the Robot Operating System (ROS)], BS thesis, 2020&lt;br /&gt;
*Ranno Mäesepp, [http://hdl.handle.net/10062/72100 Takistuste vältimise lahendus õpperobotile Robotont] [Obstacle avoidance solution for educational robot platform Robotont], BS thesis, 2020&lt;br /&gt;
*Igor Rybalskii, [http://hdl.handle.net/10062/72060 Gesture Detection Software for Human-Robot Collaboration] [Žestituvastus tarkvara inimese ja roboti koostööks], BS thesis, 2020&lt;br /&gt;
*Meelis Pihlap, [http://hdl.handle.net/10062/64292 Mitme roboti koostöö funktsionaalsuste väljatöötamine tarkvararaamistikule TeMoto] [Multi-robot collaboration functionalities for robot software development framework TeMoto], BS thesis, 2019&lt;br /&gt;
*Kaarel Mark, [http://hdl.handle.net/10062/64290 Liitreaalsuse kasutamine tootmisprotsessis asukohtade määramisel] [Augmented reality for location determination in manufacturing], BS thesis, 2019&lt;br /&gt;
*Kätriin Julle, [http://hdl.handle.net/10062/64279 Roboti KUKA youBot riistvara ja ROS-tarkvara uuendamine] [Upgrading robot KUKA youBot’s hardware and ROS-software], BS thesis, 2019&lt;br /&gt;
*Georg Astok, [http://hdl.handle.net/10062/64274 Roboti juhtimine virtuaalreaalsuses kasutades ROS-raamistikku] [Creating virtual reality user interface using only ROS framework], BS thesis, 2019&lt;br /&gt;
*Martin Hallist, [http://hdl.handle.net/10062/64275 Robotipõhine kaugkohalolu käte liigutuste ülekandmiseks] [Teleoperation robot for arms motions], BS thesis, 2019&lt;br /&gt;
*Ahmed Hassan Helmy Mohamed, [http://hdl.handle.net/10062/63946 Software integration of autonomous robot system for mixing and serving drinks] [Jooke valmistava ja serveeriva robotsüsteemi tarkvaralahendus], BS thesis, 2019&lt;br /&gt;
*Kristo Allaje, [http://hdl.handle.net/10062/60294 Draiveripakett käte jälgimise seadme Leap Motion™ kontroller kasutamiseks robootika arendusplatvormil ROS] [Driver package for using Leap Motion™ controller on the robotics development platform ROS], BS thesis, 2018&lt;br /&gt;
*Martin Maidla, [http://hdl.handle.net/10062/60288 Avatud robotiarendusplatvormi Robotont omniliikumise ja odomeetria arendamine] [Omnimotion and odometry development for open robot development platform Robotont], BS thesis, 2018&lt;br /&gt;
*Raid Vellerind, [http://hdl.handle.net/10062/56559 Avatud robotiarendusplatvormi ROS võimekuse loomine Tartu Ülikooli Robotexi robootikaplatvormile] [ROS driver development for the University of Tartu’s Robotex robotics platform], BS thesis, 2017&lt;br /&gt;
*Kalle-Gustav Kruus, [http://hdl.handle.net/10062/32440 Jalgpalliroboti löögimehhanismi elektroonikalahendus] [Driver circuit for a kicking mechanism of a football robot], BS thesis, 2013&lt;br /&gt;
*Sven Kautlenbach, [http://hdl.handle.net/10062/25570 Autonoomne seade elastsusmooduli mõõtmiseks] [Electronic device for Young’s modulus measurements], BS thesis, 2011&lt;br /&gt;
&lt;br /&gt;
[[Category:Theses Topics]]&lt;/div&gt;</summary>
		<author><name>Karl</name></author>
	</entry>
	<entry>
		<id>https://ims.ut.ee/index.php?title=People&amp;diff=45512</id>
		<title>People</title>
		<link rel="alternate" type="text/html" href="https://ims.ut.ee/index.php?title=People&amp;diff=45512"/>
		<updated>2026-02-02T14:41:28Z</updated>

		<summary type="html">&lt;p&gt;Karl: /* Staff */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{{DISPLAYTITLE:The Lab's Team}} __NOTOC__&lt;br /&gt;
== Staff ==&lt;br /&gt;
&lt;br /&gt;
{{Team|&lt;br /&gt;
{{TeamMember|Alvo|Alvo Aabloo|professor, head of the lab (polymer materials)}} &lt;br /&gt;
{{TeamMember|Longfei|Longfei Chang|guest professor (polymeric transducers)}}&lt;br /&gt;
{{TeamMember|Tarmo|Tarmo Tamm|professor of applied materials science}}&lt;br /&gt;
{{TeamMember|Karl|Karl Kruusamäe|professor of human-centred robotics, [[IMS robotics]]}}&lt;br /&gt;
{{TeamMember|Indrekm|Indrek Must|associate professor of soft robotics}}&lt;br /&gt;
{{TeamMember|Heiki|Heiki Kasemägi|associate professor (ion-conducting polymer, computer simulations) &amp;amp; Computer engineering study programme manager}}&lt;br /&gt;
{{TeamMember|Urmas|Urmas Johanson|researcher (electrochemistry)}}&lt;br /&gt;
{{TeamMember|Annaliisa|Anna-Liisa Peikolainen|researcher (carbon materials, chemistry)}}&lt;br /&gt;
{{TeamMember|Janno|Janno Torop|associate professor of materials engineering (stimuli-responsive materials, structural energy storage devices, carbon materials)}}&lt;br /&gt;
{{TeamMember|Ingridre|Ingrid Rebane|research fellow (sustainable materials technologies, cellulose, silicone, elastomeric foams)}}&lt;br /&gt;
{{TeamMember|Saoni|Saoni Banerji|research fellow (CMOS MEMS sensors, mixed-signal ASIC design, microelectronics)}}&lt;br /&gt;
{{TeamMember|Ritesh.Soni|Ritesh Soni|research fellow (soft robotics, medical devices, microfabrication, AI)}}&lt;br /&gt;
{{TeamMember|Veix|Veiko Vunder|lecturer (robotics)}}&lt;br /&gt;
{{TeamMember|Jkalde|Jaanus Kalde|lecturer (radio engineering)}}&lt;br /&gt;
{{TeamMember|Artur|Artur Abels|junior lecturer (digital electronics)}}&lt;br /&gt;
{{TeamMember|Rennoraudmae|Renno Raudmäe|junior lecturer (robotics)}}&lt;br /&gt;
{{TeamMember|Eva.m6tsh2rg|Eva Mõtshärg|junior lecturer (robotics design)}}&lt;br /&gt;
{{TeamMember|Teet|Teet Tilk|engineer (electronics)}}&lt;br /&gt;
{{TeamMember|Tauri|Tauri Tätte|engineer (robotics)}}&lt;br /&gt;
{{TeamMember|Aune|Aune Tamm|head of administration}}&lt;br /&gt;
{{TeamMember|Anett|Anett Toom|project assistant}}&lt;br /&gt;
{{TeamMember|Kirsi|Kirsi Zirel|project assistant}}&lt;br /&gt;
{{TeamMember|Mariana|Mariana Kukk|Delta X coordinator and communications manager}}&lt;br /&gt;
{{TeamMember|krisliin|Krisliin Rohtla|industrial master's coordinator and communications}}&lt;br /&gt;
&lt;br /&gt;
}}&lt;br /&gt;
&lt;br /&gt;
== PhD Students ==&lt;br /&gt;
&lt;br /&gt;
{{Team|&lt;br /&gt;
{{TeamMember|Oleksandr.syzoniuk|Oleksandr Syzoniuk|PhD student (soft robotics)}}&lt;br /&gt;
{{TeamMember|mvihmar|Marie Vihmar|PhD student (bioinspired soft robotics)}}&lt;br /&gt;
{{TeamMember|Kadriannvaldur|Kadri-Ann Valdur|PhD student (bioinspired soft robotics)}}&lt;br /&gt;
{{TeamMember|Juri.volodin|Juri Volodin|PhD student (electrochemical 3D printing)}}&lt;br /&gt;
{{TeamMember|Siimkoort|Siim Koor|PhD student (biomaterials 3D printing)}}&lt;br /&gt;
{{TeamMember|Sandra|Sandra Schumann|PhD student (robotics education)}}&lt;br /&gt;
{{TeamMember|Gryogor|Igor Rybalskii|PhD student (human-robot collaboration and augmented reality)}}&lt;br /&gt;
{{TeamMember|not-yet|Agnes Luhtaru|PhD student (multimodal human-robot interaction)}}&lt;br /&gt;
{{TeamMember|Farnaz|Farnaz Baksh|PhD student (social robotics)}}&lt;br /&gt;
{{TeamMember|Liis.tiisvelt|Liis Tiisvelt|PhD student (materials)}}&lt;br /&gt;
{{TeamMember|Matevz|Matevž B. Zorec|PhD student (IoT and IoRT)}}&lt;br /&gt;
{{TeamMember|DavidU|David Uslar|PhD student (materials)}}&lt;br /&gt;
{{TeamMember|karljakob|Karl Jakob Levin|PhD student (materials)}}&lt;br /&gt;
&lt;br /&gt;
}}&lt;br /&gt;
&lt;br /&gt;
&amp;lt;!-- ==Students== --&amp;gt;&lt;br /&gt;
&lt;br /&gt;
{{Team|&lt;br /&gt;
&amp;lt;!-- {{TeamMember|markus|Markus Loide|student}} --&amp;gt;&lt;br /&gt;
&amp;lt;!-- {{TeamMember|Liivak|Martin Liivak|student (FEM simulations of 3D-MB's)}} --&amp;gt;&lt;br /&gt;
&amp;lt;!-- {{TeamMember|Priit.poldmaa|Priit Põldmaa|student}} --&amp;gt;&lt;br /&gt;
&amp;lt;!-- {{TeamMember|Kaarelsiimut|Kaarel Siimut|student (aerated concrete)}} --&amp;gt;&lt;br /&gt;
&amp;lt;!-- {{TeamMember|Ats.aasmaa|Ats Aasmaa|student (modelling of microbatteries)}} --&amp;gt;&lt;br /&gt;
&amp;lt;!-- {{TeamMember|Phuong.nguyen|Phuong Nguyen|student (FEM simulations)}} --&amp;gt;&lt;br /&gt;
&amp;lt;!-- {{TeamMember|Magnus.kaldjarv|Magnus Kaldjärv|student (soft robotics)}} --&amp;gt;&lt;br /&gt;
&lt;br /&gt;
}}&lt;br /&gt;
&lt;br /&gt;
==Colleagues==&lt;br /&gt;
&lt;br /&gt;
{{Team|&lt;br /&gt;
{{TeamMember|Rosin|Margus Rosin|lecturer (FPGA)}}&lt;br /&gt;
{{TeamMember|Ramon.rantsus|Ramon Rantsus|educational robotics}}&lt;br /&gt;
{{TeamMember|Iman.dadras|Iman Dadras|microfabrication}}&lt;br /&gt;
{{TeamMember|Fatemeh.rastgar|Fatemeh Rastgar|robotics}}&lt;br /&gt;
{{TeamMember|Pille|Pille Rinne| }}&lt;br /&gt;
{{TeamMember|Gajanee|Kaija Põhako-Esko|associate professor of materials chemistry, former Marie Skłodowska-Curie fellow (chemistry, organic synthesis, ionic liquids)}}&lt;br /&gt;
{{TeamMember|Frkaasik|Friedrich Kaasik|head of knowledge transfer}}&lt;br /&gt;
{{TeamMember|hans_priks|Hans Priks|PhD student (conducting polymers)}}&lt;br /&gt;
{{TeamMember|yauheni|Yauheni Sarokin|PhD student (variable stiffness textiles)}}&lt;br /&gt;
&lt;br /&gt;
}}&lt;/div&gt;</summary>
		<author><name>Karl</name></author>
	</entry>
	<entry>
		<id>https://ims.ut.ee/index.php?title=People&amp;diff=45511</id>
		<title>People</title>
		<link rel="alternate" type="text/html" href="https://ims.ut.ee/index.php?title=People&amp;diff=45511"/>
		<updated>2026-02-02T14:40:32Z</updated>

		<summary type="html">&lt;p&gt;Karl: /* Staff */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{{DISPLAYTITLE:The Lab's Team}} __NOTOC__&lt;br /&gt;
== Staff ==&lt;br /&gt;
&lt;br /&gt;
{{Team|&lt;br /&gt;
{{TeamMember|Alvo|Alvo Aabloo|professor, head of the lab (polymer materials)}} &lt;br /&gt;
{{TeamMember|Longfei|Longfei Chang|guest professor (polymeric transducers)}}&lt;br /&gt;
{{TeamMember|Tarmo|Tarmo Tamm|professor of applied materials science}}&lt;br /&gt;
{{TeamMember|Karl|Karl Kruusamäe|associate professor of robotics engineering, [[IMS robotics]]}}&lt;br /&gt;
{{TeamMember|Indrekm|Indrek Must|associate professor of soft robotics}}&lt;br /&gt;
{{TeamMember|Heiki|Heiki Kasemägi|associate professor (ion-conducting polymer, computer simulations) &amp;amp; Computer engineering study programme manager}}&lt;br /&gt;
{{TeamMember|Urmas|Urmas Johanson|researcher (electrochemistry)}}&lt;br /&gt;
{{TeamMember|Annaliisa|Anna-Liisa Peikolainen|researcher (carbon materials, chemistry)}}&lt;br /&gt;
{{TeamMember|Janno|Janno Torop|associate professor of materials engineering (stimuli-responsive materials, structural energy storage devices, carbon materials)}}&lt;br /&gt;
{{TeamMember|Ingridre|Ingrid Rebane|research fellow (sustainable materials technologies, cellulose, silicone, elastomeric foams)}}&lt;br /&gt;
{{TeamMember|Saoni|Saoni Banerji|research fellow (CMOS MEMS sensors, mixed-signal ASIC design, microelectronics)}}&lt;br /&gt;
{{TeamMember|Ritesh.Soni|Ritesh Soni|research fellow (soft robotics, medical devices, microfabrication, AI)}}&lt;br /&gt;
{{TeamMember|Veix|Veiko Vunder|lecturer (robotics)}}&lt;br /&gt;
{{TeamMember|Jkalde|Jaanus Kalde|lecturer (radio engineering)}}&lt;br /&gt;
{{TeamMember|Artur|Artur Abels|junior lecturer (digital electronics)}}&lt;br /&gt;
{{TeamMember|Rennoraudmae|Renno Raudmäe|junior lecturer (robotics)}}&lt;br /&gt;
{{TeamMember|Eva.m6tsh2rg|Eva Mõtshärg|junior lecturer (robotics design)}}&lt;br /&gt;
{{TeamMember|Teet|Teet Tilk|engineer (electronics)}}&lt;br /&gt;
{{TeamMember|Tauri|Tauri Tätte|engineer (robotics)}}&lt;br /&gt;
{{TeamMember|Aune|Aune Tamm|head of administration}}&lt;br /&gt;
{{TeamMember|Anett|Anett Toom|project assistant}}&lt;br /&gt;
{{TeamMember|Kirsi|Kirsi Zirel|project assistant}}&lt;br /&gt;
{{TeamMember|Mariana|Mariana Kukk|Delta X coordinator and communications manager}}&lt;br /&gt;
{{TeamMember|krisliin|Krisliin Rohtla|industrial master's coordinator and communications}}&lt;br /&gt;
&lt;br /&gt;
}}&lt;br /&gt;
&lt;br /&gt;
== PhD Students ==&lt;br /&gt;
&lt;br /&gt;
{{Team|&lt;br /&gt;
{{TeamMember|Oleksandr.syzoniuk|Oleksandr Syzoniuk|PhD student (soft robotics)}}&lt;br /&gt;
{{TeamMember|mvihmar|Marie Vihmar|PhD student (bioinspired soft robotics)}}&lt;br /&gt;
{{TeamMember|Kadriannvaldur|Kadri-Ann Valdur|PhD student (bioinspired soft robotics)}}&lt;br /&gt;
{{TeamMember|Juri.volodin|Juri Volodin|PhD student (electrochemical 3D printing)}}&lt;br /&gt;
{{TeamMember|Siimkoort|Siim Koor|PhD student (biomaterials 3D printing)}}&lt;br /&gt;
{{TeamMember|Sandra|Sandra Schumann|PhD student (robotics education)}}&lt;br /&gt;
{{TeamMember|Gryogor|Igor Rybalskii|PhD student (human-robot collaboration and augmented reality)}}&lt;br /&gt;
{{TeamMember|not-yet|Agnes Luhtaru|PhD student (multimodal human-robot interaction)}}&lt;br /&gt;
{{TeamMember|Farnaz|Farnaz Baksh|PhD student (social robotics)}}&lt;br /&gt;
{{TeamMember|Liis.tiisvelt|Liis Tiisvelt|PhD student (materials)}}&lt;br /&gt;
{{TeamMember|Matevz|Matevž B. Zorec|PhD student (IoT and IoRT)}}&lt;br /&gt;
{{TeamMember|DavidU|David Uslar|PhD student (materials)}}&lt;br /&gt;
{{TeamMember|karljakob|Karl Jakob Levin|PhD student (materials)}}&lt;br /&gt;
&lt;br /&gt;
}}&lt;br /&gt;
&lt;br /&gt;
&amp;lt;!-- ==Students== --&amp;gt;&lt;br /&gt;
&lt;br /&gt;
{{Team|&lt;br /&gt;
&amp;lt;!-- {{TeamMember|markus|Markus Loide|student}} --&amp;gt;&lt;br /&gt;
&amp;lt;!-- {{TeamMember|Liivak|Martin Liivak|student (FEM simulations of 3D-MB's)}} --&amp;gt;&lt;br /&gt;
&amp;lt;!-- {{TeamMember|Priit.poldmaa|Priit Põldmaa|student}} --&amp;gt;&lt;br /&gt;
&amp;lt;!-- {{TeamMember|Kaarelsiimut|Kaarel Siimut|student (aerated concrete)}} --&amp;gt;&lt;br /&gt;
&amp;lt;!-- {{TeamMember|Ats.aasmaa|Ats Aasmaa|student (modelling of microbatteries)}} --&amp;gt;&lt;br /&gt;
&amp;lt;!-- {{TeamMember|Phuong.nguyen|Phuong Nguyen|student (FEM simulations)}} --&amp;gt;&lt;br /&gt;
&amp;lt;!-- {{TeamMember|Magnus.kaldjarv|Magnus Kaldjärv|student (soft robotics)}} --&amp;gt;&lt;br /&gt;
&lt;br /&gt;
}}&lt;br /&gt;
&lt;br /&gt;
==Colleagues==&lt;br /&gt;
&lt;br /&gt;
{{Team|&lt;br /&gt;
{{TeamMember|Rosin|Margus Rosin|lecturer (FPGA)}}&lt;br /&gt;
{{TeamMember|Ramon.rantsus|Ramon Rantsus|educational robotics}}&lt;br /&gt;
{{TeamMember|Iman.dadras|Iman Dadras|microfabrication}}&lt;br /&gt;
{{TeamMember|Fatemeh.rastgar|Fatemeh Rastgar|robotics}}&lt;br /&gt;
{{TeamMember|Pille|Pille Rinne| }}&lt;br /&gt;
{{TeamMember|Gajanee|Kaija Põhako-Esko|associate professor of materials chemistry, former Marie Skłodowska-Curie fellow (chemistry, organic synthesis, ionic liquids)}}&lt;br /&gt;
{{TeamMember|Frkaasik|Friedrich Kaasik|head of knowledge transfer}}&lt;br /&gt;
{{TeamMember|hans_priks|Hans Priks|PhD student (conducting polymers)}}&lt;br /&gt;
{{TeamMember|yauheni|Yauheni Sarokin|PhD student (variable stiffness textiles)}}&lt;br /&gt;
&lt;br /&gt;
}}&lt;/div&gt;</summary>
		<author><name>Karl</name></author>
	</entry>
	<entry>
		<id>https://ims.ut.ee/index.php?title=People&amp;diff=45510</id>
		<title>People</title>
		<link rel="alternate" type="text/html" href="https://ims.ut.ee/index.php?title=People&amp;diff=45510"/>
		<updated>2026-02-02T14:38:17Z</updated>

		<summary type="html">&lt;p&gt;Karl: /* PhD Students */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{{DISPLAYTITLE:The Lab's Team}} __NOTOC__&lt;br /&gt;
== Staff ==&lt;br /&gt;
&lt;br /&gt;
{{Team|&lt;br /&gt;
{{TeamMember|Alvo|Alvo Aabloo|professor, head of the lab (polymer materials)}} &lt;br /&gt;
{{TeamMember|Longfei|Longfei Chang|guest professor (polymeric transducers)}}&lt;br /&gt;
{{TeamMember|Tarmo|Tarmo Tamm|professor of applied materials science}}&lt;br /&gt;
{{TeamMember|Karl|Karl Kruusamäe|associate professor of robotics engineering, [[IMS robotics]]}}&lt;br /&gt;
{{TeamMember|Indrekm|Indrek Must|associate professor of soft robotics}}&lt;br /&gt;
{{TeamMember|Heiki|Heiki Kasemägi|associate professor (ion-conducting polymer, computer simulations) &amp;amp; Computer engineering study programme manager}}&lt;br /&gt;
{{TeamMember|Urmas|Urmas Johanson|researcher (electrochemistry)}}&lt;br /&gt;
{{TeamMember|Annaliisa|Anna-Liisa Peikolainen|researcher (carbon materials, chemistry)}}&lt;br /&gt;
{{TeamMember|Janno|Janno Torop|associate professor of materials engineering (stimuli-responsive materials, structural energy storage devices, carbon materials)}}&lt;br /&gt;
{{TeamMember|Ingridre|Ingrid Rebane|research fellow (sustainable materials technologies, cellulose, silicone, elastomeric foams)}}&lt;br /&gt;
{{TeamMember|Saoni|Saoni Banerji|research fellow (CMOS MEMS sensors, mixed-signal ASIC design, microelectronics)}}&lt;br /&gt;
{{TeamMember|Ritesh.Soni|Ritesh Soni|research fellow (soft robotics, medical devices, microfabrication, AI)}}&lt;br /&gt;
{{TeamMember|Veix|Veiko Vunder|lecturer (robotics)}}&lt;br /&gt;
{{TeamMember|Jkalde|Jaanus Kalde|lecturer (radio engineering)}}&lt;br /&gt;
{{TeamMember|Artur|Artur Abels|junior lecturer (digital electronics)}}&lt;br /&gt;
{{TeamMember|Rennoraudmae|Renno Raudmäe|junior lecturer (robotics)}}&lt;br /&gt;
{{TeamMember|Eva.m6tsh2rg|Eva Mõtshärg|junior lecturer (robotics design)}}&lt;br /&gt;
{{TeamMember|Teet|Teet Tilk|engineer (electronics)}}&lt;br /&gt;
{{TeamMember|Tauri|Tauri Tätte|engineer (robotics)}}&lt;br /&gt;
{{TeamMember|Aune|Aune Tamm|head of administration}}&lt;br /&gt;
{{TeamMember|Anett|Anett Toom|project assistant}}&lt;br /&gt;
{{TeamMember|Kirsi|Kirsi Zirel|project assistant}}&lt;br /&gt;
{{TeamMember|Mariana|Mariana Kukk|Delta X coordinator and communications manager}}&lt;br /&gt;
}}&lt;br /&gt;
&lt;br /&gt;
== PhD Students ==&lt;br /&gt;
&lt;br /&gt;
{{Team|&lt;br /&gt;
{{TeamMember|Oleksandr.syzoniuk|Oleksandr Syzoniuk|PhD student (soft robotics)}}&lt;br /&gt;
{{TeamMember|mvihmar|Marie Vihmar|PhD student (bioinspired soft robotics)}}&lt;br /&gt;
{{TeamMember|Kadriannvaldur|Kadri-Ann Valdur|PhD student (bioinspired soft robotics)}}&lt;br /&gt;
{{TeamMember|Juri.volodin|Juri Volodin|PhD student (electrochemical 3D printing)}}&lt;br /&gt;
{{TeamMember|Siimkoort|Siim Koor|PhD student (biomaterials 3D printing)}}&lt;br /&gt;
{{TeamMember|Sandra|Sandra Schumann|PhD student (robotics education)}}&lt;br /&gt;
{{TeamMember|Gryogor|Igor Rybalskii|PhD student (human-robot collaboration and augmented reality)}}&lt;br /&gt;
{{TeamMember|not-yet|Agnes Luhtaru|PhD student (multimodal human-robot interaction)}}&lt;br /&gt;
{{TeamMember|Farnaz|Farnaz Baksh|PhD student (social robotics)}}&lt;br /&gt;
{{TeamMember|Liis.tiisvelt|Liis Tiisvelt|PhD student (materials)}}&lt;br /&gt;
{{TeamMember|Matevz|Matevž B. Zorec|PhD student (IoT and IoRT)}}&lt;br /&gt;
{{TeamMember|DavidU|David Uslar|PhD student (materials)}}&lt;br /&gt;
{{TeamMember|karljakob|Karl Jakob Levin|PhD student (materials)}}&lt;br /&gt;
&lt;br /&gt;
}}&lt;br /&gt;
&lt;br /&gt;
&amp;lt;!-- ==Students== --&amp;gt;&lt;br /&gt;
&lt;br /&gt;
{{Team|&lt;br /&gt;
&amp;lt;!-- {{TeamMember|markus|Markus Loide|student}} --&amp;gt;&lt;br /&gt;
&amp;lt;!-- {{TeamMember|Liivak|Martin Liivak|student (FEM simulations of 3D-MB's)}} --&amp;gt;&lt;br /&gt;
&amp;lt;!-- {{TeamMember|Priit.poldmaa|Priit Põldmaa|student}} --&amp;gt;&lt;br /&gt;
&amp;lt;!-- {{TeamMember|Kaarelsiimut|Kaarel Siimut|student (aerated concrete)}} --&amp;gt;&lt;br /&gt;
&amp;lt;!-- {{TeamMember|Ats.aasmaa|Ats Aasmaa|student (modelling of microbatteries)}} --&amp;gt;&lt;br /&gt;
&amp;lt;!-- {{TeamMember|Phuong.nguyen|Phuong Nguyen|student (FEM simulations)}} --&amp;gt;&lt;br /&gt;
&amp;lt;!-- {{TeamMember|Magnus.kaldjarv|Magnus Kaldjärv|student (soft robotics)}} --&amp;gt;&lt;br /&gt;
&lt;br /&gt;
}}&lt;br /&gt;
&lt;br /&gt;
==Colleagues==&lt;br /&gt;
&lt;br /&gt;
{{Team|&lt;br /&gt;
{{TeamMember|Rosin|Margus Rosin|lecturer (FPGA)}}&lt;br /&gt;
{{TeamMember|Ramon.rantsus|Ramon Rantsus|educational robotics}}&lt;br /&gt;
{{TeamMember|Iman.dadras|Iman Dadras|microfabrication}}&lt;br /&gt;
{{TeamMember|Fatemeh.rastgar|Fatemeh Rastgar|robotics}}&lt;br /&gt;
{{TeamMember|Pille|Pille Rinne| }}&lt;br /&gt;
{{TeamMember|Gajanee|Kaija Põhako-Esko|associate professor of materials chemistry, former Marie Skłodowska-Curie fellow (chemistry, organic synthesis, ionic liquids)}}&lt;br /&gt;
{{TeamMember|Frkaasik|Friedrich Kaasik|head of knowledge transfer}}&lt;br /&gt;
{{TeamMember|hans_priks|Hans Priks|PhD student (conducting polymers)}}&lt;br /&gt;
{{TeamMember|yauheni|Yauheni Sarokin|PhD student (variable stiffness textiles)}}&lt;br /&gt;
&lt;br /&gt;
}}&lt;/div&gt;</summary>
		<author><name>Karl</name></author>
	</entry>
	<entry>
		<id>https://ims.ut.ee/index.php?title=People&amp;diff=45509</id>
		<title>People</title>
		<link rel="alternate" type="text/html" href="https://ims.ut.ee/index.php?title=People&amp;diff=45509"/>
		<updated>2026-02-02T14:37:10Z</updated>

		<summary type="html">&lt;p&gt;Karl: /* PhD Students */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{{DISPLAYTITLE:The Lab's Team}} __NOTOC__&lt;br /&gt;
== Staff ==&lt;br /&gt;
&lt;br /&gt;
{{Team|&lt;br /&gt;
{{TeamMember|Alvo|Alvo Aabloo|professor, head of the lab (polymer materials)}} &lt;br /&gt;
{{TeamMember|Longfei|Longfei Chang|guest professor (polymeric transducers)}}&lt;br /&gt;
{{TeamMember|Tarmo|Tarmo Tamm|professor of applied materials science}}&lt;br /&gt;
{{TeamMember|Karl|Karl Kruusamäe|associate professor of robotics engineering, [[IMS robotics]]}}&lt;br /&gt;
{{TeamMember|Indrekm|Indrek Must|associate professor of soft robotics}}&lt;br /&gt;
{{TeamMember|Heiki|Heiki Kasemägi|associate professor (ion-conducting polymer, computer simulations) &amp;amp; Computer engineering study programme manager}}&lt;br /&gt;
{{TeamMember|Urmas|Urmas Johanson|researcher (electrochemistry)}}&lt;br /&gt;
{{TeamMember|Annaliisa|Anna-Liisa Peikolainen|researcher (carbon materials, chemistry)}}&lt;br /&gt;
{{TeamMember|Janno|Janno Torop|associate professor of materials engineering (stimuli-responsive materials, structural energy storage devices, carbon materials)}}&lt;br /&gt;
{{TeamMember|Ingridre|Ingrid Rebane|research fellow (sustainable materials technologies, cellulose, silicone, elastomeric foams)}}&lt;br /&gt;
{{TeamMember|Saoni|Saoni Banerji|research fellow (CMOS MEMS sensors, mixed-signal ASIC design, microelectronics)}}&lt;br /&gt;
{{TeamMember|Ritesh.Soni|Ritesh Soni|research fellow (soft robotics, medical devices, microfabrication, AI)}}&lt;br /&gt;
{{TeamMember|Veix|Veiko Vunder|lecturer (robotics)}}&lt;br /&gt;
{{TeamMember|Jkalde|Jaanus Kalde|lecturer (radio engineering)}}&lt;br /&gt;
{{TeamMember|Artur|Artur Abels|junior lecturer (digital electronics)}}&lt;br /&gt;
{{TeamMember|Rennoraudmae|Renno Raudmäe|junior lecturer (robotics)}}&lt;br /&gt;
{{TeamMember|Eva.m6tsh2rg|Eva Mõtshärg|junior lecturer (robotics design)}}&lt;br /&gt;
{{TeamMember|Teet|Teet Tilk|engineer (electronics)}}&lt;br /&gt;
{{TeamMember|Tauri|Tauri Tätte|engineer (robotics)}}&lt;br /&gt;
{{TeamMember|Aune|Aune Tamm|head of administration}}&lt;br /&gt;
{{TeamMember|Anett|Anett Toom|project assistant}}&lt;br /&gt;
{{TeamMember|Kirsi|Kirsi Zirel|project assistant}}&lt;br /&gt;
{{TeamMember|Mariana|Mariana Kukk|Delta X coordinator and communications manager}}&lt;br /&gt;
}}&lt;br /&gt;
&lt;br /&gt;
== PhD Students ==&lt;br /&gt;
&lt;br /&gt;
{{Team|&lt;br /&gt;
{{TeamMember|Oleksandr.syzoniuk|Oleksandr Syzoniuk|PhD student (soft robotics)}}&lt;br /&gt;
{{TeamMember|Kadriannvaldur|Kadri-Ann Valdur|PhD student (bioinspired soft robotics)}}&lt;br /&gt;
{{TeamMember|Juri.volodin|Juri Volodin|PhD student (electrochemical 3D printing)}}&lt;br /&gt;
{{TeamMember|Siimkoort|Siim Koor|PhD student (biomaterials 3D printing)}}&lt;br /&gt;
{{TeamMember|Sandra|Sandra Schumann|PhD student (robotics education)}}&lt;br /&gt;
{{TeamMember|Gryogor|Igor Rybalskii|PhD student (human-robot collaboration and augmented reality)}}&lt;br /&gt;
{{TeamMember|not-yet|Agnes Luhtaru|PhD student (multimodal human-robot interaction)}}&lt;br /&gt;
{{TeamMember|Farnaz|Farnaz Baksh|PhD student (social robotics)}}&lt;br /&gt;
{{TeamMember|Liis.tiisvelt|Liis Tiisvelt|PhD student (materials)}}&lt;br /&gt;
{{TeamMember|Matevz|Matevž B. Zorec|PhD student (IoT and IoRT)}}&lt;br /&gt;
{{TeamMember|DavidU|David Uslar|PhD student (materials)}}&lt;br /&gt;
{{TeamMember|karljakob|Karl Jakob Levin|PhD student (materials)}}&lt;br /&gt;
&lt;br /&gt;
}}&lt;br /&gt;
&lt;br /&gt;
&amp;lt;!-- ==Students== --&amp;gt;&lt;br /&gt;
&lt;br /&gt;
{{Team|&lt;br /&gt;
&amp;lt;!-- {{TeamMember|markus|Markus Loide|student}} --&amp;gt;&lt;br /&gt;
&amp;lt;!-- {{TeamMember|Liivak|Martin Liivak|student (FEM simulations of 3D-MB's)}} --&amp;gt;&lt;br /&gt;
&amp;lt;!-- {{TeamMember|Priit.poldmaa|Priit Põldmaa|student}} --&amp;gt;&lt;br /&gt;
&amp;lt;!-- {{TeamMember|Kaarelsiimut|Kaarel Siimut|student (aerated concrete)}} --&amp;gt;&lt;br /&gt;
&amp;lt;!-- {{TeamMember|Ats.aasmaa|Ats Aasmaa|student (modelling of microbatteries)}} --&amp;gt;&lt;br /&gt;
&amp;lt;!-- {{TeamMember|Phuong.nguyen|Phuong Nguyen|student (FEM simulations)}} --&amp;gt;&lt;br /&gt;
&amp;lt;!-- {{TeamMember|Magnus.kaldjarv|Magnus Kaldjärv|student (soft robotics)}} --&amp;gt;&lt;br /&gt;
&lt;br /&gt;
}}&lt;br /&gt;
&lt;br /&gt;
==Colleagues==&lt;br /&gt;
&lt;br /&gt;
{{Team|&lt;br /&gt;
{{TeamMember|Rosin|Margus Rosin|lecturer (FPGA)}}&lt;br /&gt;
{{TeamMember|Ramon.rantsus|Ramon Rantsus|educational robotics}}&lt;br /&gt;
{{TeamMember|Iman.dadras|Iman Dadras|microfabrication}}&lt;br /&gt;
{{TeamMember|Fatemeh.rastgar|Fatemeh Rastgar|robotics}}&lt;br /&gt;
{{TeamMember|Pille|Pille Rinne| }}&lt;br /&gt;
{{TeamMember|Gajanee|Kaija Põhako-Esko|associate professor of materials chemistry, former Marie Skłodowska-Curie fellow (chemistry, organic synthesis, ionic liquids)}}&lt;br /&gt;
{{TeamMember|Frkaasik|Friedrich Kaasik|head of knowledge transfer}}&lt;br /&gt;
{{TeamMember|hans_priks|Hans Priks|PhD student (conducting polymers)}}&lt;br /&gt;
{{TeamMember|yauheni|Yauheni Sarokin|PhD student (variable stiffness textiles)}}&lt;br /&gt;
&lt;br /&gt;
}}&lt;/div&gt;</summary>
		<author><name>Karl</name></author>
	</entry>
	<entry>
		<id>https://ims.ut.ee/index.php?title=People&amp;diff=45508</id>
		<title>People</title>
		<link rel="alternate" type="text/html" href="https://ims.ut.ee/index.php?title=People&amp;diff=45508"/>
		<updated>2026-02-02T14:35:36Z</updated>

		<summary type="html">&lt;p&gt;Karl: /* PhD Students */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{{DISPLAYTITLE:The Lab's Team}} __NOTOC__&lt;br /&gt;
== Staff ==&lt;br /&gt;
&lt;br /&gt;
{{Team|&lt;br /&gt;
{{TeamMember|Alvo|Alvo Aabloo|professor, head of the lab (polymer materials)}} &lt;br /&gt;
{{TeamMember|Longfei|Longfei Chang|guest professor (polymeric transducers)}}&lt;br /&gt;
{{TeamMember|Tarmo|Tarmo Tamm|professor of applied materials science}}&lt;br /&gt;
{{TeamMember|Karl|Karl Kruusamäe|associate professor of robotics engineering, [[IMS robotics]]}}&lt;br /&gt;
{{TeamMember|Indrekm|Indrek Must|associate professor of soft robotics}}&lt;br /&gt;
{{TeamMember|Heiki|Heiki Kasemägi|associate professor (ion-conducting polymer, computer simulations) &amp;amp; Computer engineering study programme manager}}&lt;br /&gt;
{{TeamMember|Urmas|Urmas Johanson|researcher (electrochemistry)}}&lt;br /&gt;
{{TeamMember|Annaliisa|Anna-Liisa Peikolainen|researcher (carbon materials, chemistry)}}&lt;br /&gt;
{{TeamMember|Janno|Janno Torop|associate professor of materials engineering (stimuli-responsive materials, structural energy storage devices, carbon materials)}}&lt;br /&gt;
{{TeamMember|Ingridre|Ingrid Rebane|research fellow (sustainable materials technologies, cellulose, silicone, elastomeric foams)}}&lt;br /&gt;
{{TeamMember|Saoni|Saoni Banerji|research fellow (CMOS MEMS sensors, mixed-signal ASIC design, microelectronics)}}&lt;br /&gt;
{{TeamMember|Ritesh.Soni|Ritesh Soni|research fellow (soft robotics, medical devices, microfabrication, AI)}}&lt;br /&gt;
{{TeamMember|Veix|Veiko Vunder|lecturer (robotics)}}&lt;br /&gt;
{{TeamMember|Jkalde|Jaanus Kalde|lecturer (radio engineering)}}&lt;br /&gt;
{{TeamMember|Artur|Artur Abels|junior lecturer (digital electronics)}}&lt;br /&gt;
{{TeamMember|Rennoraudmae|Renno Raudmäe|junior lecturer (robotics)}}&lt;br /&gt;
{{TeamMember|Eva.m6tsh2rg|Eva Mõtshärg|junior lecturer (robotics design)}}&lt;br /&gt;
{{TeamMember|Teet|Teet Tilk|engineer (electronics)}}&lt;br /&gt;
{{TeamMember|Tauri|Tauri Tätte|engineer (robotics)}}&lt;br /&gt;
{{TeamMember|Aune|Aune Tamm|head of administration}}&lt;br /&gt;
{{TeamMember|Anett|Anett Toom|project assistant}}&lt;br /&gt;
{{TeamMember|Kirsi|Kirsi Zirel|project assistant}}&lt;br /&gt;
{{TeamMember|Mariana|Mariana Kukk|Delta X coordinator and communications manager}}&lt;br /&gt;
}}&lt;br /&gt;
&lt;br /&gt;
== PhD Students ==&lt;br /&gt;
&lt;br /&gt;
{{Team|&lt;br /&gt;
{{TeamMember|Oleksandr.syzoniuk|Oleksandr Syzoniuk|PhD student (soft robotics)}}&lt;br /&gt;
{{TeamMember|Kadriannvaldur|Kadri-Ann Valdur|PhD student (bioinspired soft robotics)}}&lt;br /&gt;
{{TeamMember|Juri.volodin|Juri Volodin|PhD student (electrochemical 3D printing)}}&lt;br /&gt;
{{TeamMember|Siimkoort|Siim Koor|PhD student (biomaterials 3D printing)}}&lt;br /&gt;
{{TeamMember|Sandra|Sandra Schumann|PhD student (robotics education)}}&lt;br /&gt;
{{TeamMember|Gryogor|Igor Rybalskii|PhD student (human-robot collaboration and augmented reality)}}&lt;br /&gt;
{{TeamMember|not-yet|Agnes Luhtaru|PhD student (multimodal human-robot interaction)}}&lt;br /&gt;
{{TeamMember|Farnaz|Farnaz Baksh|PhD student (social robotics)}}&lt;br /&gt;
{{TeamMember|Liis.tiisvelt|Liis Tiisvelt|PhD student (materials)}}&lt;br /&gt;
{{TeamMember|Matevz|Matevž B. Zorec|PhD student (IoT and IoRT)}}&lt;br /&gt;
{{TeamMember|DavidU|David Uslar|PhD student (materials)}}&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
}}&lt;br /&gt;
&lt;br /&gt;
&amp;lt;!-- ==Students== --&amp;gt;&lt;br /&gt;
&lt;br /&gt;
{{Team|&lt;br /&gt;
&amp;lt;!-- {{TeamMember|markus|Markus Loide|student}} --&amp;gt;&lt;br /&gt;
&amp;lt;!-- {{TeamMember|Liivak|Martin Liivak|student (FEM simulations of 3D-MB's)}} --&amp;gt;&lt;br /&gt;
&amp;lt;!-- {{TeamMember|Priit.poldmaa|Priit Põldmaa|student}} --&amp;gt;&lt;br /&gt;
&amp;lt;!-- {{TeamMember|Kaarelsiimut|Kaarel Siimut|student (aerated concrete)}} --&amp;gt;&lt;br /&gt;
&amp;lt;!-- {{TeamMember|Ats.aasmaa|Ats Aasmaa|student (modelling of microbatteries)}} --&amp;gt;&lt;br /&gt;
&amp;lt;!-- {{TeamMember|Phuong.nguyen|Phuong Nguyen|student (FEM simulations)}} --&amp;gt;&lt;br /&gt;
&amp;lt;!-- {{TeamMember|Magnus.kaldjarv|Magnus Kaldjärv|student (soft robotics)}} --&amp;gt;&lt;br /&gt;
&lt;br /&gt;
}}&lt;br /&gt;
&lt;br /&gt;
==Colleagues==&lt;br /&gt;
&lt;br /&gt;
{{Team|&lt;br /&gt;
{{TeamMember|Rosin|Margus Rosin|lecturer (FPGA)}}&lt;br /&gt;
{{TeamMember|Ramon.rantsus|Ramon Rantsus|educational robotics}}&lt;br /&gt;
{{TeamMember|Iman.dadras|Iman Dadras|microfabrication}}&lt;br /&gt;
{{TeamMember|Fatemeh.rastgar|Fatemeh Rastgar|robotics}}&lt;br /&gt;
{{TeamMember|Pille|Pille Rinne| }}&lt;br /&gt;
{{TeamMember|Gajanee|Kaija Põhako-Esko|associate professor of materials chemistry, former Marie Skłodowska-Curie fellow (chemistry, organic synthesis, ionic liquids)}}&lt;br /&gt;
{{TeamMember|Frkaasik|Friedrich Kaasik|head of knowledge transfer}}&lt;br /&gt;
{{TeamMember|hans_priks|Hans Priks|PhD student (conducting polymers)}}&lt;br /&gt;
{{TeamMember|yauheni|Yauheni Sarokin|PhD student (variable stiffness textiles)}}&lt;br /&gt;
&lt;br /&gt;
}}&lt;/div&gt;</summary>
		<author><name>Karl</name></author>
	</entry>
	<entry>
		<id>https://ims.ut.ee/index.php?title=People&amp;diff=45507</id>
		<title>People</title>
		<link rel="alternate" type="text/html" href="https://ims.ut.ee/index.php?title=People&amp;diff=45507"/>
		<updated>2026-02-02T14:31:47Z</updated>

		<summary type="html">&lt;p&gt;Karl: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{{DISPLAYTITLE:The Lab's Team}} __NOTOC__&lt;br /&gt;
== Staff ==&lt;br /&gt;
&lt;br /&gt;
{{Team|&lt;br /&gt;
{{TeamMember|Alvo|Alvo Aabloo|professor, head of the lab (polymer materials)}} &lt;br /&gt;
{{TeamMember|Longfei|Longfei Chang|guest professor (polymeric transducers)}}&lt;br /&gt;
{{TeamMember|Tarmo|Tarmo Tamm|professor of applied materials science}}&lt;br /&gt;
{{TeamMember|Karl|Karl Kruusamäe|associate professor of robotics engineering, [[IMS robotics]]}}&lt;br /&gt;
{{TeamMember|Indrekm|Indrek Must|associate professor of soft robotics}}&lt;br /&gt;
{{TeamMember|Heiki|Heiki Kasemägi|associate professor (ion-conducting polymer, computer simulations) &amp;amp; Computer engineering study programme manager}}&lt;br /&gt;
{{TeamMember|Urmas|Urmas Johanson|researcher (electrochemistry)}}&lt;br /&gt;
{{TeamMember|Annaliisa|Anna-Liisa Peikolainen|researcher (carbon materials, chemistry)}}&lt;br /&gt;
{{TeamMember|Janno|Janno Torop|associate professor of materials engineering (stimuli-responsive materials, structural energy storage devices, carbon materials)}}&lt;br /&gt;
{{TeamMember|Ingridre|Ingrid Rebane|research fellow (sustainable materials technologies, cellulose, silicone, elastomeric foams)}}&lt;br /&gt;
{{TeamMember|Saoni|Saoni Banerji| research fellow (CMOS MEMS sensors, mixed-signal ASIC design, microelectronics)}}&lt;br /&gt;
{{TeamMember|Ritesh.Soni|Ritesh Soni| research fellow (soft robotics, medical devices, microfabrication, AI)}}&lt;br /&gt;
{{TeamMember|Veix|Veiko Vunder|lecturer (robotics)}}&lt;br /&gt;
{{TeamMember|Jkalde|Jaanus Kalde|Lecturer in Radio Engineering}}&lt;br /&gt;
{{TeamMember|Artur|Artur Abels|junior lecturer (digital electronics)}}&lt;br /&gt;
{{TeamMember|Rennoraudmae|Renno Raudmäe|junior lecturer (robotics)}}&lt;br /&gt;
{{TeamMember|Eva.m6tsh2rg|Eva Mõtshärg|junior lecturer (robotics design)}}&lt;br /&gt;
{{TeamMember|Teet|Teet Tilk|engineer (electronics)}}&lt;br /&gt;
{{TeamMember|Tauri|Tauri Tätte|engineer (robotics)}}&lt;br /&gt;
{{TeamMember|Aune|Aune Tamm|head of administration}}&lt;br /&gt;
{{TeamMember|Anett|Anett Toom|project assistant}}&lt;br /&gt;
{{TeamMember|Kirsi|Kirsi Zirel|project assistant}}&lt;br /&gt;
{{TeamMember|Mariana|Mariana Kukk|Delta X coordinator and communications manager}}&lt;br /&gt;
}}&lt;br /&gt;
&lt;br /&gt;
== PhD Students ==&lt;br /&gt;
&lt;br /&gt;
{{Team|&lt;br /&gt;
{{TeamMember|Oleksandr.syzoniuk|Oleksandr Syzoniuk|PhD student (soft robotics)}}&lt;br /&gt;
{{TeamMember|Kadriannvaldur|Kadri-Ann Valdur|PhD student (bioinspired soft robotics)}}&lt;br /&gt;
{{TeamMember|Juri.volodin|Juri Volodin|PhD student (electrochemical 3D printing)}}&lt;br /&gt;
{{TeamMember|Siimkoort|Siim Koor|PhD student (biomaterials 3D printing)}}&lt;br /&gt;
{{TeamMember|Sandra|Sandra Schumann|PhD student (robotics education)}}&lt;br /&gt;
{{TeamMember|Gryogor|Igor Rybalskii|PhD student (human-robot collaboration and augmented reality)}}&lt;br /&gt;
{{TeamMember|not-yet|Agnes Luhtaru|PhD student (multimodal human-robot interaction)}}&lt;br /&gt;
{{TeamMember|Farnaz|Farnaz Baksh|PhD student (social robotics)}}&lt;br /&gt;
{{TeamMember|Liis.tiisvelt|Liis Tiisvelt|PhD student (materials)}}&lt;br /&gt;
{{TeamMember|Matevz|Matevž B. Zorec|PhD student (IoT and IoRT)}}&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
}}&lt;br /&gt;
&lt;br /&gt;
&amp;lt;!-- ==Students== --&amp;gt;&lt;br /&gt;
&lt;br /&gt;
{{Team|&lt;br /&gt;
&amp;lt;!-- {{TeamMember|markus|Markus Loide|student}} --&amp;gt;&lt;br /&gt;
&amp;lt;!-- {{TeamMember|Liivak|Martin Liivak|student (FEM simulations of 3D-MB's)}} --&amp;gt;&lt;br /&gt;
&amp;lt;!-- {{TeamMember|Priit.poldmaa|Priit Põldmaa|student}} --&amp;gt;&lt;br /&gt;
&amp;lt;!-- {{TeamMember|Kaarelsiimut|Kaarel Siimut|student (aerated concrete)}} --&amp;gt;&lt;br /&gt;
&amp;lt;!-- {{TeamMember|Ats.aasmaa|Ats Aasmaa|student (modelling of microbatteries)}} --&amp;gt;&lt;br /&gt;
&amp;lt;!-- {{TeamMember|Phuong.nguyen|Phuong Nguyen|student (FEM simulations)}} --&amp;gt;&lt;br /&gt;
&amp;lt;!-- {{TeamMember|Magnus.kaldjarv|Magnus Kaldjärv|student (soft robotics)}} --&amp;gt;&lt;br /&gt;
&lt;br /&gt;
}}&lt;br /&gt;
&lt;br /&gt;
==Colleagues==&lt;br /&gt;
&lt;br /&gt;
{{Team|&lt;br /&gt;
{{TeamMember|Rosin|Margus Rosin|lecturer (FPGA)}}&lt;br /&gt;
{{TeamMember|Ramon.rantsus|Ramon Rantsus|educational robotics}}&lt;br /&gt;
{{TeamMember|Iman.dadras|Iman Dadras| (Microfabrication)}}&lt;br /&gt;
{{TeamMember|Fatemeh.rastgar|Fatemeh Rastgar|(Robotics)}}&lt;br /&gt;
{{TeamMember|Pille|Pille Rinne| }}&lt;br /&gt;
{{TeamMember|Gajanee|Kaija Põhako-Esko|associate professor of materials chemistry, former Marie Skłodowska-Curie fellow (chemistry, organic synthesis, ionic liquids)}}&lt;br /&gt;
{{TeamMember|Frkaasik|Friedrich Kaasik| Head of Knowledge Transfer }}&lt;br /&gt;
{{TeamMember|hans_priks|Hans Priks|PhD student (conducting polymers)}}&lt;br /&gt;
{{TeamMember|yauheni|Yauheni Sarokin|PhD student (Variable stiffness textiles)}}&lt;br /&gt;
&lt;br /&gt;
}}&lt;/div&gt;</summary>
		<author><name>Karl</name></author>
	</entry>
	<entry>
		<id>https://ims.ut.ee/index.php?title=People&amp;diff=45506</id>
		<title>People</title>
		<link rel="alternate" type="text/html" href="https://ims.ut.ee/index.php?title=People&amp;diff=45506"/>
		<updated>2026-02-02T14:30:27Z</updated>

		<summary type="html">&lt;p&gt;Karl: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{{DISPLAYTITLE:The Lab's Team}} __NOTOC__&lt;br /&gt;
== Staff ==&lt;br /&gt;
&lt;br /&gt;
{{Team|&lt;br /&gt;
{{TeamMember|Alvo|Alvo Aabloo|professor, head of the lab (polymer materials)}} &lt;br /&gt;
{{TeamMember|Longfei|Longfei Chang| guest professor, polymeric transducers}} &lt;br /&gt;
{{TeamMember|Tarmo|Tarmo Tamm| Professor of applied materials science}}&lt;br /&gt;
{{TeamMember|Karl|Karl Kruusamäe|associate professor of robotics engineering, [[IMS robotics]]}}&lt;br /&gt;
{{TeamMember|Indrekm|Indrek Must|associate professor of soft robotics}}&lt;br /&gt;
{{TeamMember|Heiki|Heiki Kasemägi|associate professor (ion-conducting polymer, computer simulations) &amp;amp; Computer engineering study programme manager}}&lt;br /&gt;
{{TeamMember|Urmas|Urmas Johanson|researcher (electrochemistry)}}&lt;br /&gt;
{{TeamMember|Annaliisa|Anna-Liisa Peikolainen|researcher (carbon materials, chemistry)}}&lt;br /&gt;
{{TeamMember|Janno|Janno Torop|associate professor of materials engineering (stimuli-responsive materials, structural energy storage devices, carbon materials)}}&lt;br /&gt;
{{TeamMember|Ingridre|Ingrid Rebane|research fellow (sustainable materials technologies, cellulose, silicone, elastomeric foams)}}&lt;br /&gt;
{{TeamMember|Saoni|Saoni Banerji| research fellow (CMOS MEMS sensors, mixed-signal ASIC design, microelectronics)}}&lt;br /&gt;
{{TeamMember|Ritesh.Soni|Ritesh Soni| research fellow (soft robotics, medical devices, microfabrication, AI)}}&lt;br /&gt;
{{TeamMember|Veix|Veiko Vunder|lecturer (robotics)}}&lt;br /&gt;
{{TeamMember|Artur|Artur Abels|junior lecturer (digital electronics)}}&lt;br /&gt;
{{TeamMember|Rennoraudmae|Renno Raudmäe|junior lecturer (robotics)}}&lt;br /&gt;
{{TeamMember|Eva.m6tsh2rg|Eva Mõtshärg|junior lecturer (robotics design)}}&lt;br /&gt;
{{TeamMember|Teet|Teet Tilk|engineer (electronics)}}&lt;br /&gt;
{{TeamMember|Tauri|Tauri Tätte|engineer (robotics)}}&lt;br /&gt;
{{TeamMember|Aune|Aune Tamm|head of administration}}&lt;br /&gt;
{{TeamMember|Anett|Anett Toom|project assistant}}&lt;br /&gt;
{{TeamMember|Kirsi|Kirsi Zirel|project assistant}}&lt;br /&gt;
{{TeamMember|Mariana|Mariana Kukk|Delta X coordinator and communications manager}}&lt;br /&gt;
}}&lt;br /&gt;
&lt;br /&gt;
== PhD Students ==&lt;br /&gt;
&lt;br /&gt;
{{Team|&lt;br /&gt;
{{TeamMember|Oleksandr.syzoniuk|Oleksandr Syzoniuk|PhD student (soft robotics)}}&lt;br /&gt;
{{TeamMember|Kadriannvaldur|Kadri-Ann Valdur|PhD student (bioinspired soft robotics)}}&lt;br /&gt;
{{TeamMember|Jkalde|Jaanus Kalde|PhD student (distributed antennas)}}&lt;br /&gt;
{{TeamMember|Juri.volodin|Juri Volodin|PhD student (electrochemical 3D printing)}}&lt;br /&gt;
{{TeamMember|Siimkoort|Siim Koor|PhD student (biomaterials 3D printing)}}&lt;br /&gt;
{{TeamMember|Sandra|Sandra Schumann|PhD student (robotics education)}}&lt;br /&gt;
{{TeamMember|Gryogor|Igor Rybalskii|PhD student (human-robot collaboration and augmented reality)}}&lt;br /&gt;
{{TeamMember|not-yet|Agnes Luhtaru|PhD student (multimodal human-robot interaction)}}&lt;br /&gt;
{{TeamMember|Farnaz|Farnaz Baksh|PhD student (social robotics)}}&lt;br /&gt;
{{TeamMember|Liis.tiisvelt|Liis Tiisvelt|PhD student (materials)}}&lt;br /&gt;
{{TeamMember|Matevz|Matevž B. Zorec|PhD student (IoT and IoRT)}}&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
}}&lt;br /&gt;
&lt;br /&gt;
&amp;lt;!-- ==Students== --&amp;gt;&lt;br /&gt;
&lt;br /&gt;
{{Team|&lt;br /&gt;
&amp;lt;!-- {{TeamMember|markus|Markus Loide|student}} --&amp;gt;&lt;br /&gt;
&amp;lt;!-- {{TeamMember|Liivak|Martin Liivak|student (FEM simulations of 3D-MB's)}} --&amp;gt;&lt;br /&gt;
&amp;lt;!-- {{TeamMember|Priit.poldmaa|Priit Põldmaa|student}} --&amp;gt;&lt;br /&gt;
&amp;lt;!-- {{TeamMember|Kaarelsiimut|Kaarel Siimut|student (aerated concrete)}} --&amp;gt;&lt;br /&gt;
&amp;lt;!-- {{TeamMember|Ats.aasmaa|Ats Aasmaa|student (modelling of microbatteries)}} --&amp;gt;&lt;br /&gt;
&amp;lt;!-- {{TeamMember|Phuong.nguyen|Phuong Nguyen|student (FEM simulations)}} --&amp;gt;&lt;br /&gt;
&amp;lt;!-- {{TeamMember|Magnus.kaldjarv|Magnus Kaldjärv|student (soft robotics)}} --&amp;gt;&lt;br /&gt;
&lt;br /&gt;
}}&lt;br /&gt;
&lt;br /&gt;
==Colleagues==&lt;br /&gt;
&lt;br /&gt;
{{Team|&lt;br /&gt;
{{TeamMember|Rosin|Margus Rosin|lecturer (FPGA)}}&lt;br /&gt;
{{TeamMember|Ramon.rantsus|Ramon Rantsus|educational robotics}}&lt;br /&gt;
{{TeamMember|Iman.dadras|Iman Dadras| (Microfabrication)}}&lt;br /&gt;
{{TeamMember|Fatemeh.rastgar|Fatemeh Rastgar|(Robotics)}}&lt;br /&gt;
{{TeamMember|Pille|Pille Rinne| }}&lt;br /&gt;
{{TeamMember|Gajanee|Kaija Põhako-Esko|associate professor of materials chemistry, former Marie Skłodowska-Curie fellow (chemistry, organic synthesis, ionic liquids)}}&lt;br /&gt;
{{TeamMember|Frkaasik|Friedrich Kaasik| Head of Knowledge Transfer }}&lt;br /&gt;
{{TeamMember|hans_priks|Hans Priks|PhD student (conducting polymers)}}&lt;br /&gt;
{{TeamMember|yauheni|Yauheni Sarokin|PhD student (Variable stiffness textiles)}}&lt;br /&gt;
&lt;br /&gt;
}}&lt;/div&gt;</summary>
		<author><name>Karl</name></author>
	</entry>
	<entry>
		<id>https://ims.ut.ee/index.php?title=People&amp;diff=45505</id>
		<title>People</title>
		<link rel="alternate" type="text/html" href="https://ims.ut.ee/index.php?title=People&amp;diff=45505"/>
		<updated>2026-02-02T14:29:00Z</updated>

		<summary type="html">&lt;p&gt;Karl: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{{DISPLAYTITLE:The Lab's Team}} __NOTOC__&lt;br /&gt;
== Staff ==&lt;br /&gt;
&lt;br /&gt;
{{Team|&lt;br /&gt;
{{TeamMember|Alvo|Alvo Aabloo|professor, head of the lab (polymer materials)}} &lt;br /&gt;
{{TeamMember|Longfei|Longfei Chang| guest professor, polymeric transducers}} &lt;br /&gt;
{{TeamMember|Tarmo|Tarmo Tamm| Professor of applied materials science}}&lt;br /&gt;
{{TeamMember|Karl|Karl Kruusamäe|associate professor of robotics engineering, [[IMS robotics]]}}&lt;br /&gt;
{{TeamMember|Indrekm|Indrek Must|associate professor of soft robotics}}&lt;br /&gt;
{{TeamMember|Heiki|Heiki Kasemägi|associate professor (ion-conducting polymer, computer simulations) &amp;amp; Computer engineering study programme manager}}&lt;br /&gt;
{{TeamMember|Urmas|Urmas Johanson|researcher (electrochemistry)}}&lt;br /&gt;
{{TeamMember|Annaliisa|Anna-Liisa Peikolainen|researcher (carbon materials, chemistry)}}&lt;br /&gt;
{{TeamMember|Janno|Janno Torop|associate professor of materials engineering (stimuli-responsive materials, structural energy storage devices, carbon materials)}}&lt;br /&gt;
{{TeamMember|Ingridre|Ingrid Rebane|research fellow (sustainable materials technologies, cellulose, silicone, elastomeric foams)}}&lt;br /&gt;
{{TeamMember|Saoni|Saoni Banerji| research fellow (CMOS MEMS sensors, mixed-signal ASIC design, microelectronics)}}&lt;br /&gt;
{{TeamMember|Ritesh.Soni|Ritesh Soni| research fellow (soft robotics, medical devices, microfabrication, AI)}}&lt;br /&gt;
{{TeamMember|Veix|Veiko Vunder|lecturer (robotics)}}&lt;br /&gt;
{{TeamMember|Artur|Artur Abels|junior lecturer (digital electronics)}}&lt;br /&gt;
{{TeamMember|Rennoraudmae|Renno Raudmäe|junior lecturer (robotics)}}&lt;br /&gt;
{{TeamMember|Eva.m6tsh2rg|Eva Mõtshärg|junior lecturer (robotics design)}}&lt;br /&gt;
{{TeamMember|Teet|Teet Tilk|engineer (electronics)}}&lt;br /&gt;
{{TeamMember|Tauri|Tauri Tätte|engineer (robotics)}}&lt;br /&gt;
{{TeamMember|Aune|Aune Tamm|head of administration}}&lt;br /&gt;
{{TeamMember|Anett|Anett Toom|project assistant}}&lt;br /&gt;
{{TeamMember|Kirsi|Kirsi Zirel|project assistant}}&lt;br /&gt;
{{TeamMember|Mariana|Mariana Kukk|Delta X coordinator and communications manager}}&lt;br /&gt;
}}&lt;br /&gt;
&lt;br /&gt;
== PhD Students ==&lt;br /&gt;
&lt;br /&gt;
{{Team|&lt;br /&gt;
{{TeamMember|Oleksandr.syzoniuk|Oleksandr Syzoniuk|PhD student (soft robotics)}}&lt;br /&gt;
{{TeamMember|Kadriannvaldur|Kadri-Ann Valdur|PhD student (bioinspired soft robotics)}}&lt;br /&gt;
{{TeamMember|Jkalde|Jaanus Kalde|PhD student (distributed antennas)}}&lt;br /&gt;
{{TeamMember|Juri.volodin|Juri Volodin|PhD student (electrochemical 3D printing)}}&lt;br /&gt;
{{TeamMember|Siimkoort|Siim Koor|PhD student (biomaterials 3D printing)}}&lt;br /&gt;
{{TeamMember|Sandra|Sandra Schumann|PhD student (robotics education)}}&lt;br /&gt;
{{TeamMember|Gryogor|Igor Rybalskii|PhD student (human-robot collaboration and augmented reality)}}&lt;br /&gt;
{{TeamMember|not-yet|Agnes Luhtaru|PhD student (multimodal human-robot interaction)}}&lt;br /&gt;
{{TeamMember|Farnaz|Farnaz Baksh|PhD student (social robotics)}}&lt;br /&gt;
{{TeamMember|Liis.tiisvelt|Liis Tiisvelt|PhD student (materials)}}&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
}}&lt;br /&gt;
&lt;br /&gt;
&amp;lt;!-- ==Students== --&amp;gt;&lt;br /&gt;
&lt;br /&gt;
{{Team|&lt;br /&gt;
&amp;lt;!-- {{TeamMember|markus|Markus Loide|student}} --&amp;gt;&lt;br /&gt;
&amp;lt;!-- {{TeamMember|Liivak|Martin Liivak|student (FEM simulations of 3D-MB's)}} --&amp;gt;&lt;br /&gt;
&amp;lt;!-- {{TeamMember|Priit.poldmaa|Priit Põldmaa|student}} --&amp;gt;&lt;br /&gt;
&amp;lt;!-- {{TeamMember|Kaarelsiimut|Kaarel Siimut|student (aerated concrete)}} --&amp;gt;&lt;br /&gt;
&amp;lt;!-- {{TeamMember|Ats.aasmaa|Ats Aasmaa|student (modelling of microbatteries)}} --&amp;gt;&lt;br /&gt;
&amp;lt;!-- {{TeamMember|Phuong.nguyen|Phuong Nguyen|student (FEM simulations)}} --&amp;gt;&lt;br /&gt;
&amp;lt;!-- {{TeamMember|Magnus.kaldjarv|Magnus Kaldjärv|student (soft robotics)}} --&amp;gt;&lt;br /&gt;
&lt;br /&gt;
}}&lt;br /&gt;
&lt;br /&gt;
==Colleagues==&lt;br /&gt;
&lt;br /&gt;
{{Team|&lt;br /&gt;
{{TeamMember|Rosin|Margus Rosin|lecturer (FPGA)}}&lt;br /&gt;
{{TeamMember|Ramon.rantsus|Ramon Rantsus|educational robotics}}&lt;br /&gt;
{{TeamMember|Iman.dadras|Iman Dadras| (Microfabrication)}}&lt;br /&gt;
{{TeamMember|Fatemeh.rastgar|Fatemeh Rastgar|(Robotics)}}&lt;br /&gt;
{{TeamMember|Pille|Pille Rinne| }}&lt;br /&gt;
{{TeamMember|Gajanee|Kaija Põhako-Esko|associate professor of materials chemistry, former Marie Skłodowska-Curie fellow (chemistry, organic synthesis, ionic liquids)}}&lt;br /&gt;
{{TeamMember|Frkaasik|Friedrich Kaasik| Head of Knowledge Transfer }}&lt;br /&gt;
{{TeamMember|hans_priks|Hans Priks|PhD student (conducting polymers)}}&lt;br /&gt;
{{TeamMember|yauheni|Yauheni Sarokin|PhD student (Variable stiffness textiles)}}&lt;br /&gt;
&lt;br /&gt;
}}&lt;/div&gt;</summary>
		<author><name>Karl</name></author>
	</entry>
	<entry>
		<id>https://ims.ut.ee/index.php?title=People&amp;diff=45504</id>
		<title>People</title>
		<link rel="alternate" type="text/html" href="https://ims.ut.ee/index.php?title=People&amp;diff=45504"/>
		<updated>2026-02-02T14:28:16Z</updated>

		<summary type="html">&lt;p&gt;Karl: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{{DISPLAYTITLE:The Lab's Team}} __NOTOC__&lt;br /&gt;
== Staff ==&lt;br /&gt;
&lt;br /&gt;
{{Team|&lt;br /&gt;
{{TeamMember|Alvo|Alvo Aabloo|professor, head of the lab (polymer materials)}} &lt;br /&gt;
{{TeamMember|Longfei|Longfei Chang| guest professor, polymeric transducers}} &lt;br /&gt;
{{TeamMember|Tarmo|Tarmo Tamm| Professor of applied materials science}}&lt;br /&gt;
{{TeamMember|Karl|Karl Kruusamäe|associate professor of robotics engineering, [[IMS robotics]]}}&lt;br /&gt;
{{TeamMember|Indrekm|Indrek Must|associate professor of soft robotics}}&lt;br /&gt;
{{TeamMember|Heiki|Heiki Kasemägi|associate professor (ion-conducting polymer, computer simulations) &amp;amp; Computer engineering study programme manager}}&lt;br /&gt;
{{TeamMember|Urmas|Urmas Johanson|researcher (electrochemistry)}}&lt;br /&gt;
{{TeamMember|Annaliisa|Anna-Liisa Peikolainen|researcher (carbon materials, chemistry)}}&lt;br /&gt;
{{TeamMember|Janno|Janno Torop|associate professor of materials engineering (stimuli-responsive materials, structural energy storage devices, carbon materials)}}&lt;br /&gt;
{{TeamMember|Ingridre|Ingrid Rebane|research fellow (sustainable materials technologies, cellulose, silicone, elastomeric foams)}}&lt;br /&gt;
{{TeamMember|Saoni|Saoni Banerji| research fellow (CMOS MEMS sensors, mixed-signal ASIC design, microelectronics)}}&lt;br /&gt;
{{TeamMember|Ritesh.Soni|Ritesh Soni| research fellow (soft robotics, medical devices, microfabrication, AI)}}&lt;br /&gt;
{{TeamMember|Veix|Veiko Vunder|lecturer (robotics)}}&lt;br /&gt;
{{TeamMember|Artur|Artur Abels|junior lecturer (digital electronics)}}&lt;br /&gt;
{{TeamMember|Rennoraudmae|Renno Raudmäe|junior lecturer (robotics)}}&lt;br /&gt;
{{TeamMember|Eva.m6tsh2rg|Eva Mõtshärg|junior lecturer (robotics design)}}&lt;br /&gt;
{{TeamMember|Teet|Teet Tilk|engineer (electronics)}}&lt;br /&gt;
{{TeamMember|Tauri|Tauri Tätte|engineer (robotics)}}&lt;br /&gt;
{{TeamMember|Aune|Aune Tamm|head of administration}}&lt;br /&gt;
{{TeamMember|Anett|Anett Toom|project assistant}}&lt;br /&gt;
{{TeamMember|Kirsi|Kirsi Zirel|project assistant}}&lt;br /&gt;
{{TeamMember|Mariana|Mariana Kukk|Delta X coordinator and communications manager}}&lt;br /&gt;
}}&lt;br /&gt;
&lt;br /&gt;
== PhD Students ==&lt;br /&gt;
&lt;br /&gt;
{{Team|&lt;br /&gt;
{{TeamMember|hans_priks|Hans Priks|PhD student (conducting polymers)}}&lt;br /&gt;
{{TeamMember|yauheni|Yauheni Sarokin|PhD student (Variable stiffness textiles)}}&lt;br /&gt;
{{TeamMember|Oleksandr.syzoniuk|Oleksandr Syzoniuk|PhD student (soft robotics)}}&lt;br /&gt;
{{TeamMember|Kadriannvaldur|Kadri-Ann Valdur|PhD student (bioinspired soft robotics)}}&lt;br /&gt;
{{TeamMember|Jkalde|Jaanus Kalde|PhD student (distributed antennas)}}&lt;br /&gt;
{{TeamMember|Juri.volodin|Juri Volodin|PhD student (electrochemical 3D printing)}}&lt;br /&gt;
{{TeamMember|Siimkoort|Siim Koor|PhD student (biomaterials 3D printing)}}&lt;br /&gt;
{{TeamMember|Sandra|Sandra Schumann|PhD student (robotics education)}}&lt;br /&gt;
{{TeamMember|Gryogor|Igor Rybalskii|PhD student (human-robot collaboration and augmented reality)}}&lt;br /&gt;
{{TeamMember|not-yet|Agnes Luhtaru|PhD student (multimodal human-robot interaction)}}&lt;br /&gt;
{{TeamMember|Farnaz|Farnaz Baksh|PhD student (social robotics)}}&lt;br /&gt;
{{TeamMember|Liis.tiisvelt|Liis Tiisvelt|PhD student (materials)}}&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
}}&lt;br /&gt;
&lt;br /&gt;
&amp;lt;!-- ==Students== --&amp;gt;&lt;br /&gt;
&lt;br /&gt;
{{Team|&lt;br /&gt;
&amp;lt;!-- {{TeamMember|markus|Markus Loide|student}} --&amp;gt;&lt;br /&gt;
&amp;lt;!-- {{TeamMember|Liivak|Martin Liivak|student (FEM simulations of 3D-MB's)}} --&amp;gt;&lt;br /&gt;
&amp;lt;!-- {{TeamMember|Priit.poldmaa|Priit Põldmaa|student}} --&amp;gt;&lt;br /&gt;
&amp;lt;!-- {{TeamMember|Kaarelsiimut|Kaarel Siimut|student (aerated concrete)}} --&amp;gt;&lt;br /&gt;
&amp;lt;!-- {{TeamMember|Ats.aasmaa|Ats Aasmaa|student (modelling of microbatteries)}} --&amp;gt;&lt;br /&gt;
&amp;lt;!-- {{TeamMember|Phuong.nguyen|Phuong Nguyen|student (FEM simulations)}} --&amp;gt;&lt;br /&gt;
&amp;lt;!-- {{TeamMember|Magnus.kaldjarv|Magnus Kaldjärv|student (soft robotics)}} --&amp;gt;&lt;br /&gt;
&lt;br /&gt;
}}&lt;br /&gt;
&lt;br /&gt;
==Colleagues==&lt;br /&gt;
&lt;br /&gt;
{{Team|&lt;br /&gt;
{{TeamMember|Rosin|Margus Rosin|lecturer (FPGA)}}&lt;br /&gt;
{{TeamMember|Ramon.rantsus|Ramon Rantsus|educational robotics}}&lt;br /&gt;
{{TeamMember|Iman.dadras|Iman Dadras| (Microfabrication)}}&lt;br /&gt;
{{TeamMember|Fatemeh.rastgar|Fatemeh Rastgar|(Robotics)}}&lt;br /&gt;
{{TeamMember|Pille|Pille Rinne| }}&lt;br /&gt;
{{TeamMember|Gajanee|Kaija Põhako-Esko|associate professor of materials chemistry, former Marie Skłodowska-Curie fellow (chemistry, organic synthesis, ionic liquids)}}&lt;br /&gt;
{{TeamMember|Frkaasik|Friedrich Kaasik| Head of Knowledge Transfer }}&lt;br /&gt;
&lt;br /&gt;
}}&lt;/div&gt;</summary>
		<author><name>Karl</name></author>
	</entry>
	<entry>
		<id>https://ims.ut.ee/index.php?title=People&amp;diff=45503</id>
		<title>People</title>
		<link rel="alternate" type="text/html" href="https://ims.ut.ee/index.php?title=People&amp;diff=45503"/>
		<updated>2026-02-02T14:23:28Z</updated>

		<summary type="html">&lt;p&gt;Karl: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{{DISPLAYTITLE:The Lab's Team}} __NOTOC__&lt;br /&gt;
== Staff ==&lt;br /&gt;
&lt;br /&gt;
{{Team|&lt;br /&gt;
{{TeamMember|Alvo|Alvo Aabloo|professor, head of the lab (polymer materials)}} &lt;br /&gt;
{{TeamMember|Longfei|Longfei Chang| guest professor, polymeric transducers}} &lt;br /&gt;
{{TeamMember|Tarmo|Tarmo Tamm| Professor of applied materials science}}&lt;br /&gt;
{{TeamMember|Karl|Karl Kruusamäe|associate professor of robotics engineering, [[IMS robotics]]}}&lt;br /&gt;
{{TeamMember|Indrekm|Indrek Must|associate professor of soft robotics}}&lt;br /&gt;
{{TeamMember|Heiki|Heiki Kasemägi|associate professor (ion-conducting polymer, computer simulations) &amp;amp; Computer engineering study programme manager}}&lt;br /&gt;
{{TeamMember|Urmas|Urmas Johanson|researcher (electrochemistry)}}&lt;br /&gt;
{{TeamMember|Annaliisa|Anna-Liisa Peikolainen|researcher (carbon materials, chemistry)}}&lt;br /&gt;
{{TeamMember|Janno|Janno Torop|associate professor of materials engineering (stimuli-responsive materials, structural energy storage devices, carbon materials)}}&lt;br /&gt;
{{TeamMember|Ingridre|Ingrid Rebane|research fellow (sustainable materials technologies, cellulose, silicone, elastomeric foams)}}&lt;br /&gt;
{{TeamMember|Saoni|Saoni Banerji| research fellow (CMOS MEMS sensors, mixed-signal ASIC design, microelectronics)}}&lt;br /&gt;
{{TeamMember|Ritesh.Soni|Ritesh Soni| research fellow (soft robotics, medical devices, microfabrication, AI)}}&lt;br /&gt;
{{TeamMember|Veix|Veiko Vunder|lecturer (robotics)}}&lt;br /&gt;
{{TeamMember|Artur|Artur Abels|junior lecturer (digital electronics)}}&lt;br /&gt;
{{TeamMember|Rennoraudmae|Renno Raudmäe|junior lecturer (robotics)}}&lt;br /&gt;
{{TeamMember|Eva.m6tsh2rg|Eva Mõtshärg|junior lecturer (robotics design)}}&lt;br /&gt;
{{TeamMember|Teet|Teet Tilk|engineer (electronics)}}&lt;br /&gt;
{{TeamMember|Tauri|Tauri Tätte|engineer (robotics)}}&lt;br /&gt;
{{TeamMember|Aune|Aune Tamm|head of administration}}&lt;br /&gt;
{{TeamMember|Anett|Anett Toom|project assistant}}&lt;br /&gt;
{{TeamMember|Kirsi|Kirsi Zirel|project assistant}}&lt;br /&gt;
{{TeamMember|Mariana|Mariana Kukk|Delta X coordinator and communications manager}}&lt;br /&gt;
}}&lt;br /&gt;
&lt;br /&gt;
== PhD Students ==&lt;br /&gt;
&lt;br /&gt;
{{Team|&lt;br /&gt;
{{TeamMember|hans_priks|Hans Priks|PhD student (conducting polymers)}}&lt;br /&gt;
{{TeamMember|yauheni|Yauheni Sarokin|PhD student (Variable stiffness textiles)}}&lt;br /&gt;
{{TeamMember|Oleksandr.syzoniuk|Oleksandr Syzoniuk|PhD student (soft robotics)}}&lt;br /&gt;
{{TeamMember|Kadriannvaldur|Kadri-Ann Valdur|PhD student (bioinspired soft robotics)}}&lt;br /&gt;
{{TeamMember|Jkalde|Jaanus Kalde|PhD student (distributed antennas)}}&lt;br /&gt;
{{TeamMember|Juri.volodin|Juri Volodin|PhD student (electrochemical 3D printing)}}&lt;br /&gt;
{{TeamMember|Siimkoort|Siim Koor|PhD student (biomaterials 3D printing)}}&lt;br /&gt;
{{TeamMember|Sandra|Sandra Schumann|PhD student (robotics education)}}&lt;br /&gt;
{{TeamMember|Gryogor|Igor Rybalskii|PhD student (human-robot collaboration and augmented reality)}}&lt;br /&gt;
{{TeamMember|not-yet|Agnes Luhtaru|PhD student (multimodal human-robot interaction)}}&lt;br /&gt;
{{TeamMember|Farnaz|Farnaz Baksh|PhD student (social robotics)}}&lt;br /&gt;
{{TeamMember|Liis.tiisvelt|Liis Tiisvelt|PhD student (materials)}}&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
}}&lt;br /&gt;
&lt;br /&gt;
==Students==&lt;br /&gt;
&lt;br /&gt;
{{Team|&lt;br /&gt;
{{TeamMember|markus|Markus Loide|student}}&lt;br /&gt;
{{TeamMember|Liivak|Martin Liivak|student (FEM simulations of 3D-MB's)}}&lt;br /&gt;
{{TeamMember|Priit.poldmaa|Priit Põldmaa|student}}&lt;br /&gt;
{{TeamMember|Kaarelsiimut|Kaarel Siimut|student (aerated concrete)}}&lt;br /&gt;
{{TeamMember|Ats.aasmaa|Ats Aasmaa|student (modelling of microbatteries)}}&lt;br /&gt;
{{TeamMember|Phuong.nguyen|Phuong Nguyen|student (FEM simulations)}}&lt;br /&gt;
{{TeamMember|Magnus.kaldjarv|Magnus Kaldjärv|student (soft robotics)}}&lt;br /&gt;
&lt;br /&gt;
}}&lt;br /&gt;
&lt;br /&gt;
==Colleagues==&lt;br /&gt;
&lt;br /&gt;
{{Team|&lt;br /&gt;
{{TeamMember|Rosin|Margus Rosin|lecturer (FPGA)}}&lt;br /&gt;
{{TeamMember|Ramon.rantsus|Ramon Rantsus|educational robotics}}&lt;br /&gt;
{{TeamMember|Iman.dadras|Iman Dadras| (Microfabrication)}}&lt;br /&gt;
{{TeamMember|Fatemeh.rastgar|Fatemeh Rastgar|(Robotics)}}&lt;br /&gt;
{{TeamMember|Pille|Pille Rinne| }}&lt;br /&gt;
{{TeamMember|Gajanee|Kaija Põhako-Esko|associate professor of materials chemistry, former Marie Skłodowska-Curie fellow (chemistry, organic synthesis, ionic liquids)}}&lt;br /&gt;
{{TeamMember|Frkaasik|Friedrich Kaasik| Head of Knowledge Transfer }}&lt;br /&gt;
&lt;br /&gt;
}}&lt;/div&gt;</summary>
		<author><name>Karl</name></author>
	</entry>
	<entry>
		<id>https://ims.ut.ee/index.php?title=Theses_in_Robotics&amp;diff=43838</id>
		<title>Theses in Robotics</title>
		<link rel="alternate" type="text/html" href="https://ims.ut.ee/index.php?title=Theses_in_Robotics&amp;diff=43838"/>
		<updated>2025-09-10T13:04:30Z</updated>

		<summary type="html">&lt;p&gt;Karl: /* Highlighted theses topics for 2025/2026 study year */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:IMS-robotics]]&lt;br /&gt;
&amp;lt;div class=&amp;quot;toclimit-3&amp;quot;&amp;gt;__TOC__&amp;lt;/div&amp;gt; &lt;br /&gt;
= Projects in Advanced Robotics =&lt;br /&gt;
''The main objective of the following projects is to give students experience in working with advanced robotics technology. Our group is active in several R&amp;amp;D projects involving human-robot collaboration, intuitive teleoperation of robots, and autonomous navigation of unmanned mobile platforms. Our main software platforms are [http://www.ros.org/ Robot Operating System (ROS)] for developing software for advanced robot systems and [http://gazebosim.org/ Gazebo] for running realistic robotic simulations.''&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
For further information, contact [[User:Karl|Karl Kruusamäe]]. &lt;br /&gt;
&lt;br /&gt;
The following is not an exhaustive list of all available thesis/research topics.&lt;br /&gt;
&lt;br /&gt;
== Highlighted theses topics for 2025/2026 study year ==&lt;br /&gt;
# [[#Application of VLA (Vision-Language-Action) models in robotics|Application of VLA (Vision-Language-Action) models in robotics]]&lt;br /&gt;
# [[#LLM-based task planning for robots|LLM-based task planning for robots]]&lt;br /&gt;
# [[#ROBOTONT: analysis of different options as on-board computers|ROBOTONT: analysis of different options as on-board computers]]&lt;br /&gt;
# [[#SemuBOT: multiple topics|SemuBOT: multiple topics]]&lt;br /&gt;
# [[#ROBOTONT: integrating a graphical programming interface|ROBOTONT: integrating a graphical programming interface]]&lt;br /&gt;
# [[#Sign-language-based control for robots|Sign-language-based control for robots]]&lt;br /&gt;
# [[#Robotic Study Companion: a social robot for students in higher education|Robotic Study Companion: a social robot for students in higher education]]&lt;br /&gt;
# [[#Quantitative Evaluation of the Situation Awareness during the Teleoperation of an Urban Vehicle|Quantitative Evaluation of the Situation Awareness during the Teleoperation of an Urban Vehicle]]&lt;br /&gt;
&lt;br /&gt;
== List of potential thesis topics ==&lt;br /&gt;
Our inventory includes but is not limited to:&lt;br /&gt;
&amp;lt;gallery mode=&amp;quot;packed&amp;quot;&amp;gt;&lt;br /&gt;
File:Youbot.png|120px|thumb|KUKA youBot&lt;br /&gt;
File:Ur5_left.png|120px|thumb|Universal Robot UR5&lt;br /&gt;
File:Franka_Emika_Panda.jpg|120px|thumb|Franka Emika Panda&lt;br /&gt;
File:Clearpath_Jackal.jpg|120px|thumb|Clearpath Jackal&lt;br /&gt;
File:Kinova_KG-3_Gripper.jpg|120px|thumb|Kinova 3-finger gripper&lt;br /&gt;
File:Xarm7.jpg|120px|thumb|UFACTORY xArm7&lt;br /&gt;
File:Robotont_banner.png|120px|thumb|robotont&lt;br /&gt;
File:Turtlebot3-waffle-pi.jpg|thumb|TurtleBot3&lt;br /&gt;
File:Parrot-bebop-2.jpg|thumb|Parrot Bebop 2&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== ROBOTONT: Docker-Driven ROS Environment Switching ===&lt;br /&gt;
This thesis focuses on integrating Docker containers to manage and switch between different ROS (Robot Operating System) environments on Robotont. Currently, ROS is installed natively on Robotont's Ubuntu-based onboard computer, limiting flexibility in system recovery and configuration switching. The goal is to develop a Docker-based solution that allows users to switch between ROS environments directly from Robotont’s existing low-level menu interface.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: designing and implementing a communication protocol for additional devices ===&lt;br /&gt;
Robotont currently includes a designated area on its front for attaching additional devices, such as an ultrasonic range finder, a servo motor connected to an Arduino, or a MikroBUS device. This thesis aims to design and implement a standard communication protocol that will enable seamless integration and control of these devices through the high-level ROS framework running on Robotont’s onboard computer.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: analysis of different options as on-board computers ===&lt;br /&gt;
Currently, ROBOTONT uses an Intel NUC as its onboard computer. The goal of this thesis is to validate Robotont's software stack on alternative compute devices (e.g. Raspberry Pi 4, Intel Compute Stick, and NVIDIA Jetson Nano) and benchmark their performance for the most popular Robotont use-case demos (e.g. webapp teleop, AR-tag steering, 2D mapping, and 3D mapping). The objective is to propose at least two different compute solutions: one that optimizes for cost and another that optimizes for performance. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT Lite ===&lt;br /&gt;
The goal of this thesis is to optimize the ROBOTONT platform for cost by replacing the onboard computer and sensors with low-cost alternatives while ensuring ROS/ROS2 software compatibility. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: integrating a graphical programming interface ===&lt;br /&gt;
The goal of this thesis is to integrate a graphical programming solution (e.g. Scratch or Blockly) to enable programming of Robotont by non-experts. See, e.g., https://doi.org/10.48550/arXiv.2011.13706&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== SemuBOT: multiple topics ===&lt;br /&gt;
In 2023/2024, many topics are offered in support of the open-source humanoid robot project SemuBOT (https://www.facebook.com/semubotmtu/).&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Robotic Study Companion: a social robot for students in higher education ===&lt;br /&gt;
Potential Topics:&lt;br /&gt;
* Enhance the Robot's Speech/Natural Language Capabilities &lt;br /&gt;
* Build a Local Language Model for the RSC&lt;br /&gt;
* Develop and Program the Robot’s Behavior and Personality&lt;br /&gt;
* Build a Digital Twin Simulation for Multimodal Interaction&lt;br /&gt;
* Explore the Use of the RSC as an Affective Robot to Address Students’ Academic Emotions&lt;br /&gt;
* Explore and Implement Cybersecurity Measures for Social Robot&lt;br /&gt;
&lt;br /&gt;
[https://github.com/orgs/RobotStudyCompanion/discussions/3 More info on GitHub] | contact farnaz.baksh@ut.ee with questions&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Virtual reality user interface (VRUI) for intuitive teleoperation system ===&lt;br /&gt;
[[Image:LeapMotion.png|200px|thumb|right|Detecting 2 hands with Leap Motion Controller]]&lt;br /&gt;
Enhancing the user-experience of a virtual reality UI developed by [https://github.com/ut-ims-robotics/vrui_rviz Georg Astok]. Potentially adding [http://www.osvr.org/hardware-devs.html virtual reality capability] to a [https://www.youtube.com/watch?v=L25HHFd00rc gesture- and natural-language-based robot teleoperation system].&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
[[Image:Temoto-working-principle-in-pics.png|x160px|Gesture-based teleoperation]]&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Modeling humans for human-robot interaction ===&lt;br /&gt;
True human-robot collaboration means that the robot must understand the actions, intention, and state of its human partner. This work involves using cameras and other human-sensing devices to digitally represent and model humans. There are multiple stages of modeling: a) physical models of human kinematics and dynamics; b) higher-level models for recognizing human intent.&amp;lt;br&amp;gt;&lt;br /&gt;
[[Image:Skeletal ROS.PNG|x160px|ROS &amp;amp; Kinect &amp;amp; Skeleton-Markers Package]]&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Robotic avatar for telepresence ===&lt;br /&gt;
Integrating hand gestures and head movements to control a robot avatar in a virtual reality user interface.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Detection of hardware and software resources for smart integration of robots ===&lt;br /&gt;
The vast majority of today’s robotic applications rely on hard-coded device and algorithm usage. This project focuses on developing Resource Snooper software that can detect the addition or removal of resources to support dynamic reconfiguration of robotic systems. This project is developed as a subsystem of [https://utnuclearroboticspublic.github.io/temoto2/index.html TeMoto].&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Human-Robot and Robot-Robot collaboration applications ===&lt;br /&gt;
Creating a demo or analysis of software capabilities related to human-robot or robot-robot teams:&lt;br /&gt;
* human-robot collaborative assembly&lt;br /&gt;
* distributed mapping; analysis and demo of existing ROS packages for multi-robot mapping (e.g., segmap https://youtu.be/JJhEkIA1xSE)&lt;br /&gt;
* Inaccessible region teamwork&lt;br /&gt;
**youbot+drone - a drone maps the environment (for example, a maze) and a ground vehicle uses this information to traverse the maze&lt;br /&gt;
**youbot+robotont - the youBot cannot climb ledges, but it can lift a smaller robot, such as Robotont, up a ledge.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Mirroring human hand movements on industrial robots ===&lt;br /&gt;
The goal of this project is to integrate continuous control of an industrial robot manipulator with a gestural telerobotics interface. The recommended tools for this thesis project are a Leap Motion Controller or a standard web camera, Ultraleap, a Universal Robot UR5 manipulator, and ROS.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: TeMoto for robotont ===&lt;br /&gt;
Swarm-management and UMRF-based task loading for robotont using [https://github.com/temoto-framework TeMoto] framework.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Enhancing teleoperation control interface with augmented cues to provoke caution===&lt;br /&gt;
The task is to create a telerobot control interface where the video feed from the remote site and/or a mixed-reality scene is augmented with visual cues to provoke caution in the human operator.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Robot-to-human interaction ===&lt;br /&gt;
As robots and autonomous machines start sharing the same space as humans, their actions need to be understood by the people occupying that space. For instance, a human worker needs to understand what the robot partner is planning next, or a pedestrian needs to clearly comprehend the behaviour of a driverless vehicle. To reduce this ambiguity, the robot needs mechanisms to convey its intent (whatever it is going to do next). The aim of the thesis is to outline existing methods for machines to convey their intent and to develop a unified model interface for expressing that intent.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Gaze-based handover prediction ===&lt;br /&gt;
When a human needs to pass an object to a robot manipulator, the robot must understand where in 3D space the object handover occurs and then plan an appropriate motion. Human gaze can be used as the input for predicting which object to track. The thesis activities involve camera-based eye tracking and safe motion planning.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: Human-height human-robot interface for Robotont ground robot ===&lt;br /&gt;
Robotont is an ankle-high flat mobile robot. For humans to interact with Robotont, there is a need for a compact and lightweight mechanical structure that is tall enough for comfortable human-robot interaction. The objective of the thesis is to develop the mechanical designs and build prototypes that ensure stable operation and meet the aesthetic requirements for use in public places. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Stratos Explore Ultraleap demonstrator for robotics ===&lt;br /&gt;
The aim of this thesis is to systematically analyze the strengths and limitations of the Stratos Explore Ultraleap device in the context of controlling a robot, and subsequently to implement a demonstrator application showcasing its applicability for robot control.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Mixed-reality scene creation for vehicle teleoperation ===&lt;br /&gt;
Fusing different sensory feeds to create a high-usability teleoperation scene.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Validation study for AR-based robot user-interfaces ===&lt;br /&gt;
Designing and carrying out a user study to validate the functionality and usability of a human-robot interface.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Quantitative Evaluation of the Situation Awareness during the Teleoperation of an Urban Vehicle ===&lt;br /&gt;
The goal of this thesis is to develop the methodology for measuring the operator's situation awareness while operating an autonomous urban vehicle. Potential metrics could include driving accuracy, speed, and latency. This thesis will be conducted as part of the activities at the [https://adl.cs.ut.ee Autonomous Driving Lab].&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Sign-language-based control for robots ===&lt;br /&gt;
The goal of this thesis is to design and implement the means for interacting with a robot via conventional sign language, i.e. the human can command a robot by using sign language.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== LLM-based task planning for robots ===&lt;br /&gt;
The goal of this thesis is to enable high-level task planning for autonomous robots performing general-purpose tasks. The thesis would leverage an LLM's reasoning capabilities and the TeMoto Action Engine to achieve natural task negotiation, planning, and execution.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: Development of CoppeliaSim-based simulation and a digital twin ===&lt;br /&gt;
The objective of the thesis is to develop an easy-to-use simulation for Robotont gen3 using CoppeliaSim (https://www.coppeliarobotics.com/).&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Application of VLA (Vision-Language-Action) models in robotics ===&lt;br /&gt;
The goal of this thesis is to apply VLA models to real-world robotics. The thesis project will involve topics related to robots, machine learning, and transformers.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
= Completed projects =&lt;br /&gt;
== PhD theses ==&lt;br /&gt;
*Robert Valner, [https://hdl.handle.net/10062/105994 Design of TeMoto, a software framework for dependable, adaptive, and collaborative autonomous robots] [TeMoto – töökindlate, adaptiivsete ja koostöövõimeliste autonoomsete robotite arendamise tarkvararaamistik], PhD thesis, 2024&lt;br /&gt;
*Houman Masnavi, [https://hdl.handle.net/10062/91394 Visibility aware navigation] [Nähtavust arvestav navigatsioon], PhD thesis, 2023&lt;br /&gt;
&lt;br /&gt;
== Master's theses ==&lt;br /&gt;
*Asier Mandiola Arrizabalaga, Haptic-Based Teleoperation of Robots, MS thesis, 2025&lt;br /&gt;
*Dāvis Krūminš, Web-based Robotics Lab for Effortless ROS2 Development, MS thesis, 2025&lt;br /&gt;
*Julian Rene Leclerc, Natural Language Human-Robot Interaction: A Modular Framework for Conversational Robot Control Using Large Language Models, MS thesis, 2025&lt;br /&gt;
*Miriam Calafa’, Designing Multimodal Emotional Expression for a Robotic Study Companion, MS thesis, 2025&lt;br /&gt;
*Iryna Hurova, Model-based Planning Using GPU-accelerated Simulator as a World Model, MS thesis, 2025&lt;br /&gt;
*Sander Toma Võrk, Elektrooniliste termomeetrite automaatse kalibreerimissüsteemi väljatöötamine ja rakendamine, MS thesis, 2025&lt;br /&gt;
*Carl Hjalmar Love Hult, SurfMotion: An Open Source Pipeline for Robotic Pipe Cutting and Welding, MS thesis, 2025&lt;br /&gt;
*Paola Avalos Conchas, Socially Aware Planning for Indoor Navigation, MS thesis, 2025&lt;br /&gt;
*Robert Allik, Validation of NoMaD as a Global Planner for Mobile Robots, MS thesis, 2024&lt;br /&gt;
*Rauno Põlluäär, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=79393 Designing and Implementing a Bird’s-eye View Interface for a Self-driving Vehicle’s Teleoperation System] [Isejuhtiva sõiduki kaugjuhtimissüsteemile linnuvaate kasutajaliidese loomine], MS thesis, 2024&lt;br /&gt;
*Gautier Reynes, VR-Enhanced Remote Inspection Framework for Semi-Autonomous Robot Fleet, MS thesis, 2024&lt;br /&gt;
*Eva Mõtshärg, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=77385 3D-prinditava kere disain ja analüüs vabavaralisele haridusrobotile Robotont] [Design and Analysis of a 3D Printable Chassis for the Open Source Educational Robot Robotont], MS thesis, 2023&lt;br /&gt;
*Farnaz Baksh, [https://dspace.ut.ee/items/a7a9cc15-27e9-450c-94e8-3267f0c95c56 An Open-source Robotic Study Companion for University Students] [Avatud lähtekoodiga robotõpikaaslane üliõpilastele], MS thesis, 2023&lt;br /&gt;
*Igor Rybalskii, [http://hdl.handle.net/10062/83028 Augmented reality (AR) for enabling human-robot collaboration with ROS robots] [Liitreaalsus inimese ja roboti koostöö võimaldamiseks ROS-i robotitega], MS thesis, 2022&lt;br /&gt;
*Md. Maniruzzaman, [http://hdl.handle.net/10062/83025 Object search and retrieval in indoor environment using a Mobile Manipulator] [Objektide otsimine ja teisaldamine siseruumides mobiilse manipulaatori abil], MS thesis, 2022&lt;br /&gt;
*Allan Kustavus, [http://hdl.handle.net/10062/72651 Design and Implementation of a Generalized Resource Management Architecture in the TeMoto Software Framework] [Üldise ressursihalduri disain ja teostus TeMoto tarkvara raamistikule], MS thesis, 2021&lt;br /&gt;
*Kristina Meister, [http://hdl.handle.net/10062/72350 External human-vehicle interaction - a study in the context of an autonomous ride-hailing service], MS thesis, 2021&lt;br /&gt;
*Muhammad Usman, [http://hdl.handle.net/10062/72126 Development of an Optimization-Based Motion Planner and Its ROS Interface for a Non-Holonomic Mobile Manipulator] [Optimeerimisele baseeruva liikumisplaneerija arendamine ja selle ROSi liides mitteholonoomse mobiilse manipulaatori jaoks], MS thesis, 2020&lt;br /&gt;
*Maarika Oidekivi, [http://hdl.handle.net/10062/72119 Masina kavatsuse väljendamine ja tõlgendamine] [Communicating and interpreting machine intent], MS thesis, 2020&lt;br /&gt;
*Houman Masnavi, [http://hdl.handle.net/10062/72118 Multi-Robot Motion Planning for Shared Payload Transportation] [Rajaplaneerimine multi-robot süsteemile jagatud lasti transportimisel], MS thesis, 2020&lt;br /&gt;
*Fabian Ernesto Parra Gil, [http://hdl.handle.net/10062/72112 Implementation of Robot Manager Subsystem for Temoto Software Framework] [Robotite Halduri alamsüsteemi väljatöötamine tarkvararaamistikule TEMOTO], MS thesis, 2020&lt;br /&gt;
*Zafarullah, [http://hdl.handle.net/10062/72125 Gaze Assisted Neural Network based Prediction of End-Point of Human Reaching Trajectories], MS thesis, 2020&lt;br /&gt;
*Madis K Nigol, [http://hdl.handle.net/10062/64339 Õppematerjalid robotplatvormile Robotont] [Study materials for robot platform Robotont], MS thesis, 2019&lt;br /&gt;
*Renno Raudmäe, [http://hdl.handle.net/10062/64341 Avatud robotplatvorm Robotont] [Open source robotics platform Robotont], MS thesis, 2019&lt;br /&gt;
*Asif Sattar, [http://hdl.handle.net/10062/64352 Human detection and distance estimation with monocular camera using YOLOv3 neural network] [Inimeste tuvastamine ning kauguse hindamine kasutades kaamerat ning YOLOv3 tehisnärvivõrku], MS thesis, 2019&lt;br /&gt;
*Ragnar Margus, [http://hdl.handle.net/10062/64337 Kergliiklusvahendite jagamisteenuseks vajaliku positsioneerimismooduli loomine ja uurimine] [Development and testing of an IoT module for electric vehicle sharing service], MS thesis, 2019&lt;br /&gt;
*Pavel Šumejko, [http://hdl.handle.net/10062/64320 Robust Solution for Extrinsic Calibration of a 2D Laser-Rangefinder and a Monocular USB Camera] [Meetod 2D laserkaugusmõõdiku ja USB-kaamera väliseks kalibreerimiseks], MS thesis, 2019&lt;br /&gt;
*Dzvezdana Arsovska, [http://hdl.handle.net/10062/64321  Building an Efficient and Secure Software Supply Pipeline for Aerial Robotics Application] [Efektiivne ja turvaline tarkvaraarendusahel lennurobootika rakenduses], MS thesis, 2019&lt;br /&gt;
*Tõnis Tiimus, [http://hdl.handle.net/10062/60713 Visuaalselt esilekutsutud potentsiaalidel põhinev aju-arvuti liides robootika rakendustes] [A VEP-based BCI for robotics applications], MS thesis, 2018&lt;br /&gt;
*Martin Appo, [http://hdl.handle.net/10062/60718 Hardware-agnostic compliant control ROS package for collaborative industrial manipulators] [Riistvarapaindlik ROSi tarkvarapakett tööstuslike robotite mööndlikuks juhtimiseks], MS thesis, 2018&lt;br /&gt;
*Hassan Mahmoud Shehawy Elhanash, [http://hdl.handle.net/10062/60711 Optical Tracking of Forearm for Classifying Fingers Poses] [Küünarvarre visuaalne jälgimine ennustamaks sõrmede asendeid], MS thesis, 2018&lt;br /&gt;
&lt;br /&gt;
== Bachelor's theses ==&lt;br /&gt;
*Kaarel-Richard Kaarelson, [https://thesis.cs.ut.ee/12b3ed0c-b3c1-473e-ae13-5063fde9850a TeMoto Action Assistant: A Web-Based Human–Robot Interface for Designing UMRF Graphs] [TeMoto Action Assistant: Veebipõhine Inimese ja Roboti Interaktsiooni Tööriist UMRF Graafide Loomiseks], BS thesis, 2025&lt;br /&gt;
*Oliver Voorel, Humanoidrobot Semuboti toitesüsteemi uuendamine, BS thesis, 2025&lt;br /&gt;
*Karl Rahn, Millimeeterlaine radari integreerimine avatud lähtekoodiga muruniiduki platvormil Open Mower, BS thesis, 2025&lt;br /&gt;
*Mattias Mäe, Sõiduki kaugjuhtimise viivituse mõõtmine, BS thesis, 2025&lt;br /&gt;
*Jürgen Kottise, Dockeri konteineritel põhinev haldustarkvara Robotont 3 õpperobotile, BS thesis, 2025&lt;br /&gt;
*Martin Kaur, Servodel põhinev sotsiaalse robotkäe süsteem SemuBotile, BS thesis, 2025&lt;br /&gt;
*Karl-Jürgen Siilak, Mass Portal Grand Pharaoh XD 3D-printeri uuendamine, BS thesis, 2025&lt;br /&gt;
*Usman Ali Khan, Graphical programming interface for ROBOTONT, an open-source educational robot, BS thesis, 2025&lt;br /&gt;
*Robina Zvirgzdina, Development of an Autonomous Open-Source Inventory Performance Robot for the University of Tartu Library, BS thesis, 2025&lt;br /&gt;
*Omar Huseynli, ROS integration for the Semubot robot, BS thesis, 2025&lt;br /&gt;
*Albert Unn, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=79937 Suhtlusvõimekuse arendamine sotsiaalsele humanoidrobotile SemuBot] [Developing the ability to communicate for SemuBot, a social humanoid robot], BS thesis, 2024&lt;br /&gt;
*Veronika-Marina Volynets, [https://hdl.handle.net/10062/99943 Development of Control Electronics and Program for Robotont's Height Adjustment Mechanism] [Juhtelektroonika ja programmi väljatöötamine Robotondi kõrguse reguleerimise mehhanismile], BS thesis, 2024&lt;br /&gt;
*Nikita Kurenkov, [https://hdl.handle.net/10062/99880 Flexible Screen Integration and Development of Neck Movement Mechanism for Social Humanoid Robot SemuBot] [Humanoidroboti näo lahenduse leidmine ja rakendamine; humanoidroboti jaoks kaela mehhanismi väljatöötamine], BS thesis, 2024&lt;br /&gt;
*Elchin Huseynov, [https://hdl.handle.net/10062/99668 Design and Control of a Social Humanoid Robot - SemuBot’s Hand] [Sotsiaalse Humanoidroboti Disain ja Juhtimine - SemuBoti Käsi], BS thesis, 2024&lt;br /&gt;
*Veronika Podliesnova, [https://hdl.handle.net/10062/99636 Real-Time Detection of Robot Failures by Monitoring Operator’s Brain Activity with EEG-based Brain-Computer Interface] [Reaalajas Robotirikke Tuvastamine Operaatori Ajutegevuse Jälgimise teel EEG-põhise Aju-Arvuti Liidese abil], BS thesis, 2024&lt;br /&gt;
*Märten Josh Peedimaa, [https://hdl.handle.net/not-yet Rhea: An Open-Source Table Tennis Ball Launcher Robot for Multiball Training] [Avatud lähtekoodiga lauatennise palliviske robot mitmikpall treenimiseks], BS thesis, 2024&lt;br /&gt;
*Karl Sander Vinkel, [https://hdl.handle.net/not-yet Laadimisjaama ja transportkesta väljatöötamine õpperobotile Robotont] [Development of charging dock and transport case for Robotont], BS thesis, 2024&lt;br /&gt;
*Raimo Köidam, [https://hdl.handle.net/not-yet Valguslahenduse tarkvara väljatöötamine õpperobotile Robotont] [Development of light solution software for the educational robot Robotont], BS thesis, 2024&lt;br /&gt;
*Robert Valge, [https://hdl.handle.net/not-yet Toitepinge ja tarbevoolu monitoorimine ning toitehalduse püsivara loomine õpperobotil Robotont] [Monitoring supply voltage and current consumption and creating firmware for Robotont’s power management system], BS thesis, 2024&lt;br /&gt;
*Leonid Tšigrinski, [https://hdl.handle.net/not-yet Õpperoboti Robotont püsivara arhitektuuri uuendamine] [Education robot &amp;quot;Robotont&amp;quot; firmware architecture updating], BS thesis, 2024&lt;br /&gt;
*Andres Sakk, [https://hdl.handle.net/not-yet Avatud robotplatvormi Robotont 3 kasutajaliidese väljatöötamine] [Development of a user interface for the open robotics platform Robotont 3], BS thesis, 2024&lt;br /&gt;
*Kaur Kullamäe, [https://hdl.handle.net/not-yet Elektroonikalahendus ja püsivara robotkäe juhtimiseks sotsiaalsel humanoidrobotil SemuBot] [Building the SemuBot arm electronics system], BS thesis, 2024&lt;br /&gt;
*Kristjan Madis Kask, [https://hdl.handle.net/not-yet Käe mehaanika disain sotsiaalsele humanoidrobotile SemuBot] [Arm mechanics design for social humanoid robot SemuBot], BS thesis, 2024&lt;br /&gt;
*Sven-Ervin Paap, [https://hdl.handle.net/not-yet ROS2 draiver õpperobotile Robotont] [ROS2 driver for the educational robot Robotont], BS thesis, 2024&lt;br /&gt;
*Timur Nizamov, [https://hdl.handle.net/10062/99592 Audio System for the Social Humanoid Robot SemuBot] [Helisüsteem sotsiaalsele humanoidrobotile SemuBot], BS thesis, 2024&lt;br /&gt;
*Georgs Narbuts, [https://hdl.handle.net/10062/99554 Holonomic Motion Drive System of a Social Humanoid Robot SemuBot] [Sotsiaalse humanoidroboti SemuBoti holonoomne liikuv ajamisüsteem], BS thesis, 2024&lt;br /&gt;
*Iryna Hurova, [https://hdl.handle.net/10062/90638 Kitting station of the learning factory] [Õppetehase komplekteerimisjaam], BS thesis, 2023&lt;br /&gt;
*Paola Avalos Conchas, [https://hdl.handle.net/10062/90641 Payload transportation system of a learning factory] [Õppetehase kasuliku koorma transpordisüsteem], BS thesis, 2023&lt;br /&gt;
*Pille Pärnalaas, [https://hdl.handle.net/10062/93431 Pöördpõik ajami arendus robotplatvormile Robotont] [Development of swerve drive for robotic platform Robotont], BS thesis, 2023&lt;br /&gt;
*Carl Hjalmar Love Hult, [https://hdl.handle.net/10062/90673 Using and Evaluating the Real-time Spatial Perception System Hydra in Real-world Scenarios] [Reaalajas toimiva ruumilise taju süsteemi Hydra kasutamine ja hindamine praktilistes stsenaariumides], BS thesis, 2023&lt;br /&gt;
*Priit Rooden, [https://hdl.handle.net/10062/93432 Autonoomse laadimislahenduse väljatöötamine õpperobotile Robotont] [Development of an autonomous charging solution for the robot platform Robotont], BS thesis, 2023&lt;br /&gt;
*Marko Muro, [https://hdl.handle.net/10062/93429 Robotondi akulahenduse ning 12 V pingeregulaatori prototüüpimine] [Prototyping battery solution and 12 V voltage regulator for Robotont], BS thesis, 2023&lt;br /&gt;
*Danel Leppenen, [https://hdl.handle.net/10062/93427 Nav2 PYIF: Python-based Motion Planning for ROS 2 Navigation 2] [Nav 2 PYIF: Pythoni põhine liikumise planeerija ROS 2 Navigation 2-le], BS thesis, 2023&lt;br /&gt;
*Kertrud Geddily Küüt, [https://hdl.handle.net/10062/93424 Kõrgust reguleeriv mehhanism Robotondile] [Height adjusting mechanism for Robotont], BS thesis, 2023&lt;br /&gt;
*Ingvar Drikkit, [https://hdl.handle.net/10062/93421 Lisaseadmete võimekuse arendamine haridusrobotile Robotont] [Developing add-on device support for the educational robot Robotont], BS thesis, 2023&lt;br /&gt;
*Kristo Pool, [https://hdl.handle.net/10062/93430 MoveIt 2 õppematerjalid] [Learning materials for MoveIt 2], BS thesis, 2023&lt;br /&gt;
*Erki Veeväli, [https://hdl.handle.net/10062/93434 Development of a Continuous Teleoperation System for Urban Road Vehicle] [Linnasõiduki pideva kaugjuhtimissüsteemi arendus], BS thesis, 2023&lt;br /&gt;
*Aleksandra Doroshenko, [https://hdl.handle.net/10062/93420 Haiglates inimesi juhatava roboti disain] [Hospital guide robot design], BS thesis, 2023&lt;br /&gt;
*Hui Shi, [http://hdl.handle.net/10062/83043 Expanding the Open-source ROS Software Package opencv_apps with Dedicated Blob Detection Functionality] [Avatud lähtekoodiga ROS-i tarkvarakimbu opencv_apps laiendamine laigutuvasti funktsionaalsusega], BS thesis, 2022&lt;br /&gt;
*Dāvis Krūmiņš, [http://hdl.handle.net/10062/83040 Web-based learning and software development environment for remote access of ROS robots] [Veebipõhine õppe- ja tarkvaraarenduse keskkond ROS robotite juurdepääsuks kaugteel], BS thesis, 2022&lt;br /&gt;
*Anna Jakovleva, [http://hdl.handle.net/10062/83037 Roboquiz - an interactive human-robot game] [Roboquiz - interaktiivne inimese ja roboti mäng], BS thesis, 2022&lt;br /&gt;
*Kristjan Laht, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=74702&amp;amp;year=2022 Robot Localization with Fiducial Markers] [Roboti lokaliseerimine koordinaatmärkidega], BS thesis, 2022&lt;br /&gt;
*Hans Pärtel Pani, [http://hdl.handle.net/10062/83008 ROS draiver pehmerobootika haaratsile] [ROS Driver for Soft Robotic Gripper], BS thesis, 2022&lt;br /&gt;
*Markus Erik Sügis, [http://hdl.handle.net/10062/83015 Jagatud juhtimise põhimõttel realiseeritud robotite kaugjuhtimissüsteem] [A continuous teleoperating system based on shared control concept], BS thesis, 2022&lt;br /&gt;
*Taaniel Küla, [http://hdl.handle.net/10062/83007 ROS2 platvormile keevitusroboti tarkvara portotüübi loomine kasutades UR5 robot-manipulaatorit] [Welding robot software prototype for ROS2 using UR5 robot arm], BS thesis, 2022&lt;br /&gt;
*Rauno Põlluäär, [http://hdl.handle.net/10062/72672 Veebirakendus-põhine kasutajaliides avatud robotplatvormi Robotont juhtimiseks ja haldamiseks] [Web application-based user interface for controlling and managing open-source robotics platform Robotont], BS thesis, 2021&lt;br /&gt;
*Hendrik Olesk, [http://hdl.handle.net/10062/72664 Nägemisulatuses kaugjuhitava mobiilse robotmanipulaatori kasutajamugavuse tõstmine] [Improving the usability of a mobile manipulator robot for line-of-sight remote control], BS thesis, 2021&lt;br /&gt;
*Tarvi Tepandi, [http://hdl.handle.net/10062/72665 Segareaalsusel põhinev kasutajaliides mobiilse roboti kaugjuhtimiseks Microsoft HoloLens 2 vahendusel] [Mixed-reality user interface for teleoperating mobile robots with Microsoft HoloLens 2], BS thesis, 2021&lt;br /&gt;
*Rudolf Põldma, [http://hdl.handle.net/10062/72674 Tartu linna Narva maantee ringristmiku digikaksik] [Digital twin for Narva street roundabout in Tartu], BS thesis, 2021&lt;br /&gt;
*Kwasi Akuamoah Boateng, [http://hdl.handle.net/10062/72804 Digital Twin of a Teaching and Learning Robotics Lab] [Robotite õpetamise ja õppimise labori digitaalne kaksik], BS thesis, 2021&lt;br /&gt;
*Karina Sein, [http://hdl.handle.net/10062/72102 Eestikeelse kõnesünteesi võimaldamine robootika arendusplatvormil ROS] [Enabling Estonian speech synthesis on the Robot Operating System (ROS)], BS thesis, 2020&lt;br /&gt;
*Ranno Mäesepp, [http://hdl.handle.net/10062/72100 Takistuste vältimise lahendus õpperobotile Robotont] [Obstacle avoidance solution for educational robot platform Robotont], BS thesis, 2020&lt;br /&gt;
*Igor Rybalskii, [http://hdl.handle.net/10062/72060 Gesture Detection Software for Human-Robot Collaboration] [Žestituvastus tarkvara inimese ja roboti koostööks], BS thesis, 2020&lt;br /&gt;
*Meelis Pihlap, [http://hdl.handle.net/10062/64292 Mitme roboti koostöö funktsionaalsuste väljatöötamine tarkvararaamistikule TeMoto] [Multi-robot collaboration functionalities for robot software development framework TeMoto], BS thesis, 2019&lt;br /&gt;
*Kaarel Mark, [http://hdl.handle.net/10062/64290 Liitreaalsuse kasutamine tootmisprotsessis asukohtade määramisel] [Augmented reality for location determination in manufacturing], BS thesis, 2019&lt;br /&gt;
*Kätriin Julle, [http://hdl.handle.net/10062/64279 Roboti KUKA youBot riistvara ja ROS-tarkvara uuendamine] [Upgrading robot KUKA youBot’s hardware and ROS-software], BS thesis, 2019&lt;br /&gt;
*Georg Astok, [http://hdl.handle.net/10062/64274 Roboti juhtimine virtuaalreaalsuses kasutades ROS-raamistikku] [Creating virtual reality user interface using only ROS framework], BS thesis, 2019&lt;br /&gt;
*Martin Hallist, [http://hdl.handle.net/10062/64275 Robotipõhine kaugkohalolu käte liigutuste ülekandmiseks] [Teleoperation robot for arms motions], BS thesis, 2019&lt;br /&gt;
*Ahmed Hassan Helmy Mohamed, [http://hdl.handle.net/10062/63946 Software integration of autonomous robot system for mixing and serving drinks] [Jooke valmistava ja serveeriva robotsüsteemi tarkvaralahendus], BS thesis, 2019&lt;br /&gt;
*Kristo Allaje, [http://hdl.handle.net/10062/60294 Draiveripakett käte jälgimise seadme Leap Motion™ kontroller kasutamiseks robootika arendusplatvormil ROS] [Driver package for using Leap Motion™ controller on the robotics development platform ROS], BS thesis, 2018&lt;br /&gt;
*Martin Maidla, [http://hdl.handle.net/10062/60288 Avatud robotiarendusplatvormi Robotont omniliikumise ja odomeetria arendamine] [Omnimotion and odometry development for open robot development platform Robotont], BS thesis, 2018&lt;br /&gt;
*Raid Vellerind, [http://hdl.handle.net/10062/56559 Avatud robotiarendusplatvormi ROS võimekuse loomine Tartu Ülikooli Robotexi robootikaplatvormile] [ROS driver development for the University of Tartu’s Robotex robotics platform], BS thesis, 2017&lt;br /&gt;
*Kalle-Gustav Kruus, [http://hdl.handle.net/10062/32440 Jalgpalliroboti löögimehhanismi elektroonikalahendus] [Driver circuit for a kicking mechanism of a football robot], BS thesis, 2013&lt;br /&gt;
*Sven Kautlenbach, [http://hdl.handle.net/10062/25570 Autonoomne seade elastsusmooduli mõõtmiseks] [Electronic device for Young’s modulus measurements], BS thesis, 2011&lt;br /&gt;
&lt;br /&gt;
[[Category:Theses Topics]]&lt;/div&gt;</summary>
		<author><name>Karl</name></author>
	</entry>
	<entry>
		<id>https://ims.ut.ee/index.php?title=Theses_in_Robotics&amp;diff=43837</id>
		<title>Theses in Robotics</title>
		<link rel="alternate" type="text/html" href="https://ims.ut.ee/index.php?title=Theses_in_Robotics&amp;diff=43837"/>
		<updated>2025-09-10T12:53:21Z</updated>

		<summary type="html">&lt;p&gt;Karl: /* List of potential thesis topics */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:IMS-robotics]]&lt;br /&gt;
&amp;lt;div class=&amp;quot;toclimit-3&amp;quot;&amp;gt;__TOC__&amp;lt;/div&amp;gt; &lt;br /&gt;
= Projects in Advanced Robotics =&lt;br /&gt;
''The main objective of the following projects is to give students experience in working with advanced robotics technology. Our group is active in several R&amp;amp;D projects involving human-robot collaboration, intuitive teleoperation of robots, and autonomous navigation of unmanned mobile platforms. Our main software platforms are [http://www.ros.org/ Robot Operating System (ROS)] for developing software for advanced robot systems and [http://gazebosim.org/ Gazebo] for running realistic robotic simulations.''&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
For further information, contact [[User:Karl|Karl Kruusamäe]]. &lt;br /&gt;
&lt;br /&gt;
The following is not an exhaustive list of all available thesis/research topics.&lt;br /&gt;
&lt;br /&gt;
== Highlighted thesis topics for 2025/2026 study year ==&lt;br /&gt;
# [[#LLM-based task planning for robots|LLM-based task planning for robots]]&lt;br /&gt;
# [[#ROBOTONT: analysis of different options as on-board computers|ROBOTONT: analysis of different options as on-board computers]]&lt;br /&gt;
# [[#SemuBOT: multiple topics|SemuBOT: multiple topics]]&lt;br /&gt;
# [[#ROBOTONT: integrating a graphical programming interface|ROBOTONT: integrating a graphical programming interface]]&lt;br /&gt;
# [[#Sign-language-based control for robots|Sign-language-based control for robots]]&lt;br /&gt;
# [[#Robotic Study Companion: a social robot for students in higher education|Robotic Study Companion: a social robot for students in higher education]]&lt;br /&gt;
# [[#Quantitative Evaluation of the Situation Awareness during the Teleoperation of an Urban Vehicle|Quantitative Evaluation of the Situation Awareness during the Teleoperation of an Urban Vehicle]]&lt;br /&gt;
&lt;br /&gt;
== List of potential thesis topics ==&lt;br /&gt;
Our inventory includes but is not limited to:&lt;br /&gt;
&amp;lt;gallery mode=&amp;quot;packed&amp;quot;&amp;gt;&lt;br /&gt;
File:Youbot.png|120px|thumb|KUKA youBot&lt;br /&gt;
File:Ur5_left.png|120px|thumb|Universal Robot UR5&lt;br /&gt;
File:Franka_Emika_Panda.jpg|120px|thumb|Franka Emika Panda&lt;br /&gt;
File:Clearpath_Jackal.jpg|120px|thumb|Clearpath Jackal&lt;br /&gt;
File:Kinova_KG-3_Gripper.jpg|120px|thumb|Kinova 3-finger gripper&lt;br /&gt;
File:Xarm7.jpg|120px|thumb|UFACTORY xArm7&lt;br /&gt;
File:Robotont_banner.png|120px|thumb|robotont&lt;br /&gt;
File:Turtlebot3-waffle-pi.jpg|thumb|TurtleBot3&lt;br /&gt;
File:Parrot-bebop-2.jpg|thumb|Parrot Bebop 2&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== ROBOTONT: Docker-Driven ROS Environment Switching ===&lt;br /&gt;
This thesis focuses on integrating Docker containers to manage and switch between different ROS (Robot Operating System) environments on Robotont. Currently, ROS is installed natively on Robotont's Ubuntu-based onboard computer, limiting flexibility in system recovery and configuration switching. The goal is to develop a Docker-based solution that allows users to switch between ROS environments directly from Robotont’s existing low-level menu interface.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: designing and implementing a communication protocol for additional devices ===&lt;br /&gt;
Robotont currently includes a designated area on its front for attaching additional devices, such as an ultrasonic range finder, a servo motor connected to an Arduino, or a MikroBUS device. This thesis aims to design and implement a standard communication protocol that will enable seamless integration and control of these devices through the high-level ROS framework running on Robotont’s onboard computer.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: analysis of different options as on-board computers ===&lt;br /&gt;
Currently, ROBOTONT uses an Intel NUC as its onboard computer. The goal of this thesis is to validate Robotont's software stack on alternative compute devices (e.g. Raspberry Pi 4, Intel Compute Stick, and NVIDIA Jetson Nano) and benchmark their performance on the most popular Robotont use-case demos (e.g. webapp teleop, AR-tag steering, 2D mapping, and 3D mapping). The objective is to propose at least two different compute solutions: one that optimizes for cost and another that optimizes for performance.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT Lite ===&lt;br /&gt;
The goal of this thesis is to optimize the ROBOTONT platform for cost by replacing the onboard computer and sensors with low-cost alternatives while ensuring ROS/ROS2 software compatibility.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: integrating a graphical programming interface ===&lt;br /&gt;
The goal of this thesis is to integrate a graphical programming solution (e.g. Scratch or Blockly) to enable programming of Robotont by non-experts. See, for example, https://doi.org/10.48550/arXiv.2011.13706&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== SemuBOT: multiple topics ===&lt;br /&gt;
In 2023/2024, many topics are offered in support of the development of the open-source humanoid robot project SemuBOT (https://www.facebook.com/semubotmtu/).&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Robotic Study Companion: a social robot for students in higher education ===&lt;br /&gt;
Potential Topics:&lt;br /&gt;
* Enhance the Robot's Speech/Natural Language Capabilities &lt;br /&gt;
* Build a Local Language Model for the RSC&lt;br /&gt;
* Develop and Program the Robot’s Behavior and Personality&lt;br /&gt;
* Build a Digital Twin Simulation for Multimodal Interaction&lt;br /&gt;
* Explore the Use of the RSC as an Affective Robot to Address Students’ Academic Emotions&lt;br /&gt;
* Explore and Implement Cybersecurity Measures for Social Robot&lt;br /&gt;
&lt;br /&gt;
[https://github.com/orgs/RobotStudyCompanion/discussions/3 More info on Github] | reach out to farnaz.baksh@ut.ee for more info&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Virtual reality user interface (VRUI) for intuitive teleoperation system ===&lt;br /&gt;
[[Image:LeapMotion.png|200px|thumb|right|Detecting 2 hands with Leap Motion Controller]]&lt;br /&gt;
Enhancing the user-experience of a virtual reality UI developed by [https://github.com/ut-ims-robotics/vrui_rviz Georg Astok]. Potentially adding [http://www.osvr.org/hardware-devs.html virtual reality capability] to a [https://www.youtube.com/watch?v=L25HHFd00rc gesture- and natural-language-based robot teleoperation system].&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
[[Image:Temoto-working-principle-in-pics.png|x160px|Gesture-based teleoperation]]&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Modeling humans for human-robot interaction ===&lt;br /&gt;
True human-robot collaboration means that the robot must understand the actions, intention, and state of its human partner. This work involves using cameras and other human sensors to digitally represent and model humans. There are multiple stages of modeling: a) physical models of human kinematics and dynamics; b) higher-level models for recognizing human intent.&amp;lt;br&amp;gt;&lt;br /&gt;
[[Image:Skeletal ROS.PNG|x160px|ROS &amp;amp; Kinect &amp;amp; Skeleton-Markers Package]]&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Robotic avatar for telepresence ===&lt;br /&gt;
Integrating hand gestures and head movements to control a robot avatar in virtual reality user interface.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Detection of hardware and software resources for smart integration of robots ===&lt;br /&gt;
The vast majority of today’s robotic applications rely on hard-coded device and algorithm usage. This project focuses on developing Resource Snooper software that can detect the addition or removal of resources for the benefit of dynamic reconfiguration of robotic systems. This project is developed as a subsystem of [https://utnuclearroboticspublic.github.io/temoto2/index.html TeMoto].&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Human-Robot and Robot-Robot collaboration applications ===&lt;br /&gt;
Creating a demo or analysis of software capabilities related to human-robot or robot-robot teams:&lt;br /&gt;
* human-robot collaborative assembly&lt;br /&gt;
* distributed mapping; analysis and demo of existing ROS packages for multi-robot mapping (e.g., segmap https://youtu.be/JJhEkIA1xSE)&lt;br /&gt;
* Inaccessible region teamwork&lt;br /&gt;
**youbot+drone - a drone maps the environment (for example, a maze) and a ground vehicle uses this information to traverse the maze&lt;br /&gt;
**youbot+robotont - the youBot cannot go up ledges, but it can lift a smaller robot, such as Robotont, up a ledge.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Mirroring human hand movements on industrial robots ===&lt;br /&gt;
The goal of this project is to integrate continuous control of an industrial robot manipulator with a gestural telerobotics interface. The recommended tools for this thesis project are a Leap Motion Controller or a standard web camera, Ultraleap, a Universal Robots UR5 manipulator, and ROS.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: TeMoto for robotont ===&lt;br /&gt;
Swarm-management and UMRF-based task loading for robotont using [https://github.com/temoto-framework TeMoto] framework.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Enhancing teleoperation control interface with augmented cues to provoke caution ===&lt;br /&gt;
The task is to create a telerobot control interface where video feed from the remote site and/or a mixed-reality scene is augmented with visual cues to provoke caution in human operator.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Robot-to-human interaction ===&lt;br /&gt;
As robots and autonomous machines start sharing the same space as humans, their actions need to be understood by the people occupying that space. For instance, a human worker needs to understand what the robot partner is planning next, or a pedestrian needs to clearly comprehend the behaviour of a driverless vehicle. To reduce this ambiguity, the robot needs mechanisms to convey its intent (whatever it is going to do next). The aim of the thesis is to outline existing methods for machines to convey their intent and to develop a unified model interface for expressing that intent.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Gaze-based handover prediction ===&lt;br /&gt;
When a human needs to pass an object to a robot manipulator, the robot must understand where in 3D space the object handover occurs and then plan an appropriate motion. Human gaze can be used as the input for predicting which object to track. The thesis activities involve camera-based eye tracking and safe motion planning.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: Human-height human-robot interface for Robotont ground robot ===&lt;br /&gt;
Robotont is an ankle-high flat mobile robot. For humans to interact with Robotont, there is a need for a compact and lightweight mechanical structure that is tall enough for comfortable human-robot interaction. The objective of the thesis is to develop the mechanical designs and build prototypes that ensure stable operation and meet the aesthetic requirements for use in public places. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Stratos Explore Ultraleap demonstrator for robotics ===&lt;br /&gt;
The aim of this thesis is to systematically analyze the strengths and limitations of the Stratos Explore Ultraleap device in the context of controlling a robot, and subsequently to implement a demonstrator application showing its applicability for robot control.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Mixed-reality scene creation for vehicle teleoperation ===&lt;br /&gt;
Fusing different sensory feeds to create a high-usability teleoperation scene.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Validation study for AR-based robot user-interfaces ===&lt;br /&gt;
Designing and carrying out a user study to validate the functionality and usability of a human-robot interface.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Quantitative Evaluation of the Situation Awareness during the Teleoperation of an Urban Vehicle ===&lt;br /&gt;
The goal of this thesis is to develop the methodology for measuring the operator's situation awareness while operating an autonomous urban vehicle. Potential metrics could include driving accuracy, speed, and latency. This thesis will be conducted as part of the activities at the [https://adl.cs.ut.ee Autonomous Driving Lab].&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Sign-language-based control for robots ===&lt;br /&gt;
The goal of this thesis is to design and implement the means for interacting with a robot via conventional sign language, i.e. the human can command a robot by using sign language.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== LLM-based task planning for robots ===&lt;br /&gt;
The goal of this thesis is to enable high-level task planning for autonomous robots performing general-purpose tasks. The thesis would leverage an LLM's reasoning capabilities and the TeMoto Action Engine to achieve natural task negotiation, planning, and execution.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: Development of CoppeliaSim-based simulation and a digital twin ===&lt;br /&gt;
The objective of the thesis is to develop an easy-to-use simulation for Robotont gen3 using CoppeliaSim (https://www.coppeliarobotics.com/).&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Application of VLA (Vision-Language-Action) models in robotics ===&lt;br /&gt;
The goal of this thesis is to apply VLA models to real-world robotics. The thesis project will involve topics related to robots, machine learning, and transformers.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
= Completed projects =&lt;br /&gt;
== PhD theses ==&lt;br /&gt;
*Robert Valner, [https://hdl.handle.net/10062/105994 Design of TeMoto, a software framework for dependable, adaptive, and collaborative autonomous robots] [TeMoto – töökindlate, adaptiivsete ja koostöövõimeliste autonoomsete robotite arendamise tarkvararaamistik], PhD thesis, 2024&lt;br /&gt;
*Houman Masnavi, [https://hdl.handle.net/10062/91394 Visibility aware navigation] [Nähtavust arvestav navigatsioon], PhD thesis, 2023&lt;br /&gt;
&lt;br /&gt;
== Master's theses ==&lt;br /&gt;
*Asier Mandiola Arrizabalaga, Haptic-Based Teleoperation of Robots, MS thesis, 2025&lt;br /&gt;
*Dāvis Krūminš, Web-based Robotics Lab for Effortless ROS2 Development, MS thesis, 2025&lt;br /&gt;
*Julian Rene Leclerc, Natural Language Human-Robot Interaction: A Modular Framework for Conversational Robot Control Using Large Language Models, MS thesis, 2025&lt;br /&gt;
*Miriam Calafa’, Designing Multimodal Emotional Expression for a Robotic Study Companion, MS thesis, 2025&lt;br /&gt;
*Iryna Hurova, Model-based Planning Using GPU-accelerated Simulator as a World Model, MS thesis, 2025&lt;br /&gt;
*Sander Toma Võrk, Elektrooniliste termomeetrite automaatse kalibreerimissüsteemi väljatöötamine ja rakendamine, MS thesis, 2025&lt;br /&gt;
*Carl Hjalmar Love Hult, SurfMotion: An Open Source Pipeline for Robotic Pipe Cutting and Welding, MS thesis, 2025&lt;br /&gt;
*Paola Avalos Conchas, Socially Aware Planning for Indoor Navigation, MS thesis, 2025&lt;br /&gt;
*Robert Allik, Validation of NoMaD as a Global Planner for Mobile Robots, MS thesis, 2024&lt;br /&gt;
*Rauno Põlluäär, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=79393 Designing and Implementing a Bird’s-eye View Interface for a Self-driving Vehicle’s Teleoperation System] [Isejuhtiva sõiduki kaugjuhtimissüsteemile linnuvaate kasutajaliidese loomine], MS thesis, 2024&lt;br /&gt;
*Gautier Reynes, VR-Enhanced Remote Inspection Framework for Semi-Autonomous Robot Fleet, MS thesis, 2024&lt;br /&gt;
*Eva Mõtshärg, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=77385 3D-prinditava kere disain ja analüüs vabavaralisele haridusrobotile Robotont] [Design and Analysis of a 3D Printable Chassis for the Open Source Educational Robot Robotont], MS thesis, 2023&lt;br /&gt;
*Farnaz Baksh, [https://dspace.ut.ee/items/a7a9cc15-27e9-450c-94e8-3267f0c95c56 An Open-source Robotic Study Companion for University Students] [Avatud lähtekoodiga robotõpikaaslane üliõpilastele], MS thesis, 2023&lt;br /&gt;
*Igor Rybalskii, [http://hdl.handle.net/10062/83028 Augmented reality (AR) for enabling human-robot collaboration with ROS robots] [Liitreaalsus inimese ja roboti koostöö võimaldamiseks ROS-i robotitega], MS thesis, 2022&lt;br /&gt;
*Md. Maniruzzaman, [http://hdl.handle.net/10062/83025 Object search and retrieval in indoor environment using a Mobile Manipulator] [Objektide otsimine ja teisaldamine siseruumides mobiilse manipulaatori abil], MS thesis, 2022&lt;br /&gt;
*Allan Kustavus, [http://hdl.handle.net/10062/72651 Design and Implementation of a Generalized Resource Management Architecture in the TeMoto Software Framework] [Üldise ressursihalduri disain ja teostus TeMoto tarkvara raamistikule], MS thesis, 2021&lt;br /&gt;
*Kristina Meister, [http://hdl.handle.net/10062/72350 External human-vehicle interaction - a study in the context of an autonomous ride-hailing service], MS thesis, 2021&lt;br /&gt;
*Muhammad Usman, [http://hdl.handle.net/10062/72126 Development of an Optimization-Based Motion Planner and Its ROS Interface for a Non-Holonomic Mobile Manipulator] [Optimeerimisele baseeruva liikumisplaneerija arendamine ja selle ROSi liides mitteholonoomse mobiilse manipulaatori jaoks], MS thesis, 2020&lt;br /&gt;
*Maarika Oidekivi, [http://hdl.handle.net/10062/72119 Masina kavatsuse väljendamine ja tõlgendamine] [Communicating and interpreting machine intent], MS thesis, 2020&lt;br /&gt;
*Houman Masnavi, [http://hdl.handle.net/10062/72118 Multi-Robot Motion Planning for Shared Payload Transportation] [Rajaplaneerimine multi-robot süsteemile jagatud lasti transportimisel], MS thesis, 2020&lt;br /&gt;
*Fabian Ernesto Parra Gil, [http://hdl.handle.net/10062/72112 Implementation of Robot Manager Subsystem for Temoto Software Framework] [Robotite Halduri alamsüsteemi väljatöötamine tarkvararaamistikule TEMOTO], MS thesis, 2020&lt;br /&gt;
*Zafarullah, [http://hdl.handle.net/10062/72125 Gaze Assisted Neural Network based Prediction of End-Point of Human Reaching Trajectories], MS thesis, 2020&lt;br /&gt;
*Madis K Nigol, [http://hdl.handle.net/10062/64339 Õppematerjalid robotplatvormile Robotont] [Study materials for robot platform Robotont], MS thesis, 2019&lt;br /&gt;
*Renno Raudmäe, [http://hdl.handle.net/10062/64341 Avatud robotplatvorm Robotont] [Open source robotics platform Robotont], MS thesis, 2019&lt;br /&gt;
*Asif Sattar, [http://hdl.handle.net/10062/64352 Human detection and distance estimation with monocular camera using YOLOv3 neural network] [Inimeste tuvastamine ning kauguse hindamine kasutades kaamerat ning YOLOv3 tehisnärvivõrku], MS thesis, 2019&lt;br /&gt;
*Ragnar Margus, [http://hdl.handle.net/10062/64337 Kergliiklusvahendite jagamisteenuseks vajaliku positsioneerimismooduli loomine ja uurimine] [Development and testing of an IoT module for an electric vehicle sharing service], MS thesis, 2019&lt;br /&gt;
*Pavel Šumejko, [http://hdl.handle.net/10062/64320 Robust Solution for Extrinsic Calibration of a 2D Laser-Rangefinder and a Monocular USB Camera] [Meetod 2D laserkaugusmõõdiku ja USB-kaamera väliseks kalibreerimiseks], MS thesis, 2019&lt;br /&gt;
*Dzvezdana Arsovska, [http://hdl.handle.net/10062/64321  Building an Efficient and Secure Software Supply Pipeline for Aerial Robotics Application] [Efektiivne ja turvaline tarkvaraarendusahel lennurobootika rakenduses], MS thesis, 2019&lt;br /&gt;
*Tõnis Tiimus, [http://hdl.handle.net/10062/60713 Visuaalselt esilekutsutud potentsiaalidel põhinev aju-arvuti liides robootika rakendustes] [A VEP-based BCI for robotics applications], MS thesis, 2018&lt;br /&gt;
*Martin Appo, [http://hdl.handle.net/10062/60718 Hardware-agnostic compliant control ROS package for collaborative industrial manipulators] [Riistvarapaindlik ROSi tarkvarapakett tööstuslike robotite mööndlikuks juhtimiseks], MS thesis, 2018&lt;br /&gt;
*Hassan Mahmoud Shehawy Elhanash, [http://hdl.handle.net/10062/60711 Optical Tracking of Forearm for Classifying Fingers Poses] [Küünarvarre visuaalne jälgimine ennustamaks sõrmede asendeid], MS thesis, 2018&lt;br /&gt;
&lt;br /&gt;
== Bachelor's theses ==&lt;br /&gt;
*Kaarel-Richard Kaarelson, [https://thesis.cs.ut.ee/12b3ed0c-b3c1-473e-ae13-5063fde9850a TeMoto Action Assistant: A Web-Based Human–Robot Interface for Designing UMRF Graphs] [TeMoto Action Assistant: Veebipõhine Inimese ja Roboti Interaktsiooni Tööriist UMRF Graafide Loomiseks], BS thesis, 2025&lt;br /&gt;
*Oliver Voorel, Humanoidrobot Semuboti toitesüsteemi uuendamine, BS thesis, 2025&lt;br /&gt;
*Karl Rahn, Millimeeterlaine radari integreerimine avatud lähtekoodiga muruniiduki platvormil Open Mower, BS thesis, 2025&lt;br /&gt;
*Mattias Mäe, Sõiduki kaugjuhtimise viivituse mõõtmine, BS thesis, 2025&lt;br /&gt;
*Jürgen Kottise, Dockeri konteineritel põhinev haldustarkvara Robotont 3 õpperobotile, BS thesis, 2025&lt;br /&gt;
*Martin Kaur, Servodel põhinev sotsiaalse robotkäe süsteem SemuBotile, BS thesis, 2025&lt;br /&gt;
*Karl-Jürgen Siilak, Mass Portal Grand Pharaoh XD 3D-printeri uuendamine, BS thesis, 2025&lt;br /&gt;
*Usman Ali Khan, Graphical programming interface for ROBOTONT, an open-source educational robot, BS thesis, 2025&lt;br /&gt;
*Robina Zvirgzdina, Development of an Autonomous Open-Source Inventory Performance Robot for the University of Tartu Library, BS thesis, 2025&lt;br /&gt;
*Omar Huseynli, ROS integration for the Semubot robot, BS thesis, 2025&lt;br /&gt;
*Albert Unn, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=79937 Suhtlusvõimekuse arendamine sotsiaalsele humanoidrobotile SemuBot] [Developing the ability to communicate for SemuBot, a social humanoid robot], BS thesis, 2024&lt;br /&gt;
*Veronika-Marina Volynets, [https://hdl.handle.net/10062/99943 Development of Control Electronics and Program for Robotont's Height Adjustment Mechanism] [Juhtelektroonika ja programmi väljatöötamine Robotondi kõrguse reguleerimise mehhanismile], BS thesis, 2024&lt;br /&gt;
*Nikita Kurenkov, [https://hdl.handle.net/10062/99880 Flexible Screen Integration and Development of Neck Movement Mechanism for Social Humanoid Robot SemuBot] [Humanoidroboti näo lahenduse leidmine ja rakendamine; humanoidroboti jaoks kaela mehhanismi väljatöötamine], BS thesis, 2024&lt;br /&gt;
*Elchin Huseynov, [https://hdl.handle.net/10062/99668 Design and Control of a Social Humanoid Robot - SemuBot’s Hand] [Sotsiaalse Humanoidroboti Disain ja Juhtimine - SemuBoti Käsi], BS thesis, 2024&lt;br /&gt;
*Veronika Podliesnova, [https://hdl.handle.net/10062/99636 Real-Time Detection of Robot Failures by Monitoring Operator’s Brain Activity with EEG-based Brain-Computer Interface] [Reaalajas Robotirikke Tuvastamine Operaatori Ajutegevuse Jälgimise teel EEG-põhise Aju-Arvuti Liidese abil], BS thesis, 2024&lt;br /&gt;
*Märten Josh Peedimaa, [https://hdl.handle.net/not-yet Rhea: An Open-Source Table Tennis Ball Launcher Robot for Multiball Training] [Avatud lähtekoodiga lauatennise palliviske robot mitmikpall treenimiseks], BS thesis, 2024&lt;br /&gt;
*Karl Sander Vinkel, [https://hdl.handle.net/not-yet Laadimisjaama ja transportkesta väljatöötamine õpperobotile Robotont] [Development of charging dock and transport case for Robotont], BS thesis, 2024&lt;br /&gt;
*Raimo Köidam, [https://hdl.handle.net/not-yet Valguslahenduse tarkvara väljatöötamine õpperobotile Robotont] [Development of light solution software for the educational robot Robotont], BS thesis, 2024&lt;br /&gt;
*Robert Valge, [https://hdl.handle.net/not-yet Toitepinge ja tarbevoolu monitoorimine ning toitehalduse püsivara loomine õpperobotil Robotont] [Monitoring supply voltage and current consumption and creating firmware for Robotont’s power management system], BS thesis, 2024&lt;br /&gt;
*Leonid Tšigrinski, [https://hdl.handle.net/not-yet Õpperoboti Robotont püsivara arhitektuuri uuendamine] [Education robot &amp;quot;Robotont&amp;quot; firmware architecture updating], BS thesis, 2024&lt;br /&gt;
*Andres Sakk, [https://hdl.handle.net/not-yet Avatud robotplatvormi Robotont 3 kasutajaliidese väljatöötamine] [Development of a user interface for the open robotics platform Robotont 3], BS thesis, 2024&lt;br /&gt;
*Kaur Kullamäe, [https://hdl.handle.net/not-yet Elektroonikalahendus ja püsivara robotkäe juhtimiseks sotsiaalsel humanoidrobotil SemuBot] [Building the SemuBot arm electronics system], BS thesis, 2024&lt;br /&gt;
*Kristjan Madis Kask, [https://hdl.handle.net/not-yet Käe mehaanika disain sotsiaalsele humanoidrobotile SemuBot] [Arm mechanics design for social humanoid robot SemuBot], BS thesis, 2024&lt;br /&gt;
*Sven-Ervin Paap, [https://hdl.handle.net/not-yet ROS2 draiver õpperobotile Robotont] [ROS2 driver for the educational robot Robotont], BS thesis, 2024&lt;br /&gt;
*Timur Nizamov, [https://hdl.handle.net/10062/99592 Audio System for the Social Humanoid Robot SemuBot] [Helisüsteem sotsiaalsele humanoidrobotile SemuBot], BS thesis, 2024&lt;br /&gt;
*Georgs Narbuts, [https://hdl.handle.net/10062/99554 Holonomic Motion Drive System of a Social Humanoid Robot SemuBot] [Sotsiaalse humanoidroboti SemuBoti holonoomne liikuv ajamisüsteem], BS thesis, 2024&lt;br /&gt;
*Iryna Hurova, [https://hdl.handle.net/10062/90638 Kitting station of the learning factory] [Õppetehase komplekteerimisjaam], BS thesis, 2023&lt;br /&gt;
*Paola Avalos Conchas, [https://hdl.handle.net/10062/90641 Payload transportation system of a learning factory] [Õppetehase kasuliku koorma transpordisüsteem], BS thesis, 2023&lt;br /&gt;
*Pille Pärnalaas, [https://hdl.handle.net/10062/93431 Pöördpõik ajami arendus robotplatvormile Robotont] [Development of swerve drive for robotic platform Robotont], BS thesis, 2023&lt;br /&gt;
*Carl Hjalmar Love Hult, [https://hdl.handle.net/10062/90673 Using and Evaluating the Real-time Spatial Perception System Hydra in Real-world Scenarios] [Reaalajas toimiva ruumilise taju süsteemi Hydra kasutamine ja hindamine praktilistes stsenaariumides], BS thesis, 2023&lt;br /&gt;
*Priit Rooden, [https://hdl.handle.net/10062/93432 Autonoomse laadimislahenduse väljatöötamine õpperobotile Robotont] [Development of an autonomous charging solution for the robot platform Robotont], BS thesis, 2023&lt;br /&gt;
*Marko Muro, [https://hdl.handle.net/10062/93429 Robotondi akulahenduse ning 12 V pingeregulaatori prototüüpimine] [Prototyping battery solution and 12 V voltage regulator for Robotont], BS thesis, 2023&lt;br /&gt;
*Danel Leppenen, [https://hdl.handle.net/10062/93427 Nav2 PYIF: Python-based Motion Planning for ROS 2 Navigation 2] [Nav 2 PYIF: Pythoni põhine liikumise planeerija ROS 2 Navigation 2-le], BS thesis, 2023&lt;br /&gt;
*Kertrud Geddily Küüt, [https://hdl.handle.net/10062/93424 Kõrgust reguleeriv mehhanism Robotondile] [Height adjusting mechanism for Robotont], BS thesis, 2023&lt;br /&gt;
*Ingvar Drikkit, [https://hdl.handle.net/10062/93421 Lisaseadmete võimekuse arendamine haridusrobotile Robotont] [Developing add-on device support for the educational robot Robotont], BS thesis, 2023&lt;br /&gt;
*Kristo Pool, [https://hdl.handle.net/10062/93430 MoveIt 2 õppematerjalid] [Learning materials for MoveIt 2], BS thesis, 2023&lt;br /&gt;
*Erki Veeväli, [https://hdl.handle.net/10062/93434 Development of a Continuous Teleoperation System for Urban Road Vehicle] [Linnasõiduki pideva kaugjuhtimissüsteemi arendus], BS thesis, 2023&lt;br /&gt;
*Aleksandra Doroshenko, [https://hdl.handle.net/10062/93420 Haiglates inimesi juhatava roboti disain] [Hospital guide robot design], BS thesis, 2023&lt;br /&gt;
*Hui Shi, [http://hdl.handle.net/10062/83043 Expanding the Open-source ROS Software Package opencv_apps with Dedicated Blob Detection Functionality] [Avatud lähtekoodiga ROS-i tarkvarakimbu opencv_apps laiendamine laigutuvasti funktsioonaalsusega], BS thesis, 2022&lt;br /&gt;
*Dāvis Krūmiņš, [http://hdl.handle.net/10062/83040 Web-based learning and software development environment for remote access of ROS robots] [Veebipõhine õppe- ja tarkvaraarenduse keskkond ROS robotite juurdepääsuks kaugteel], BS thesis, 2022&lt;br /&gt;
*Anna Jakovleva, [http://hdl.handle.net/10062/83037 Roboquiz - an interactive human-robot game] [Roboquiz - interaktiivne inimese ja roboti mäng], BS thesis, 2022&lt;br /&gt;
*Kristjan Laht, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=74702&amp;amp;year=2022 Robot Localization with Fiducial Markers] [Roboti lokaliseerimine koordinaatmärkidega], BS thesis, 2022&lt;br /&gt;
*Hans Pärtel Pani, [http://hdl.handle.net/10062/83008 ROS draiver pehmerobootika haaratsile] [ROS Driver for Soft Robotic Gripper], BS thesis, 2022&lt;br /&gt;
*Markus Erik Sügis, [http://hdl.handle.net/10062/83015 Jagatud juhtimise põhimõttel realiseeritud robotite kaugjuhtimissüsteem] [A continuous teleoperating system based on shared control concept], BS thesis, 2022&lt;br /&gt;
*Taaniel Küla, [http://hdl.handle.net/10062/83007 ROS2 platvormile keevitusroboti tarkvara portotüübi loomine kasutades UR5 robot-manipulaatorit] [Welding robot software prototype for ROS2 using UR5 robot arm], BS thesis, 2022&lt;br /&gt;
*Rauno Põlluäär, [http://hdl.handle.net/10062/72672 Veebirakendus-põhine kasutajaliides avatud robotplatvormi Robotont juhtimiseks ja haldamiseks] [Web application-based user interface for controlling and managing open-source robotics platform Robotont], BS thesis, 2021&lt;br /&gt;
*Hendrik Olesk, [http://hdl.handle.net/10062/72664 Nägemisulatuses kaugjuhitava mobiilse robotmanipulaatori kasutajamugavuse tõstmine] [Improving the usability of a mobile manipulator robot for line-of-sight remote control], BS thesis, 2021&lt;br /&gt;
*Tarvi Tepandi, [http://hdl.handle.net/10062/72665 Segareaalsusel põhinev kasutajaliides mobiilse roboti kaugjuhtimiseks Microsoft HoloLens 2 vahendusel] [Mixed-reality user interface for teleoperating mobile robots with Microsoft HoloLens 2], BS thesis, 2021&lt;br /&gt;
*Rudolf Põldma, [http://hdl.handle.net/10062/72674 Tartu linna Narva maantee ringristmiku digikaksik] [Digital twin for Narva street roundabout in Tartu], BS thesis, 2021&lt;br /&gt;
*Kwasi Akuamoah Boateng, [http://hdl.handle.net/10062/72804 Digital Twin of a Teaching and Learning Robotics Lab] [Robotite õpetamise ja õppimise labori digitaalne kaksik], BS thesis, 2021&lt;br /&gt;
*Karina Sein, [http://hdl.handle.net/10062/72102 Eestikeelse kõnesünteesi võimaldamine robootika arendusplatvormil ROS] [Enabling Estonian speech synthesis on the Robot Operating System (ROS)], BS thesis, 2020&lt;br /&gt;
*Ranno Mäesepp, [http://hdl.handle.net/10062/72100 Takistuste vältimise lahendus õpperobotile Robotont] [Obstacle avoidance solution for educational robot platform Robotont], BS thesis, 2020&lt;br /&gt;
*Igor Rybalskii, [http://hdl.handle.net/10062/72060 Gesture Detection Software for Human-Robot Collaboration] [Žestituvastus tarkvara inimese ja roboti koostööks], BS thesis, 2020&lt;br /&gt;
*Meelis Pihlap, [http://hdl.handle.net/10062/64292 Mitme roboti koostöö funktsionaalsuste väljatöötamine tarkvararaamistikule TeMoto] [Multi-robot collaboration functionalities for robot software development framework TeMoto], BS thesis, 2019&lt;br /&gt;
*Kaarel Mark, [http://hdl.handle.net/10062/64290 Liitreaalsuse kasutamine tootmisprotsessis asukohtade määramisel] [Augmented reality for location determination in manufacturing], BS thesis, 2019&lt;br /&gt;
*Kätriin Julle, [http://hdl.handle.net/10062/64279 Roboti KUKA youBot riistvara ja ROS-tarkvara uuendamine] [Upgrading robot KUKA youBot’s hardware and ROS-software], BS thesis, 2019&lt;br /&gt;
*Georg Astok, [http://hdl.handle.net/10062/64274 Roboti juhtimine virtuaalreaalsuses kasutades ROS-raamistikku] [Creating virtual reality user interface using only ROS framework], BS thesis, 2019&lt;br /&gt;
*Martin Hallist, [http://hdl.handle.net/10062/64275 Robotipõhine kaugkohalolu käte liigutuste ülekandmiseks] [Teleoperation robot for arms motions], BS thesis, 2019&lt;br /&gt;
*Ahmed Hassan Helmy Mohamed, [http://hdl.handle.net/10062/63946 Software integration of autonomous robot system for mixing and serving drinks] [Jooke valmistava ja serveeriva robotsüsteemi tarkvaralahendus], BS thesis, 2019&lt;br /&gt;
*Kristo Allaje, [http://hdl.handle.net/10062/60294 Draiveripakett käte jälgimise seadme Leap Motion™ kontroller kasutamiseks robootika arendusplatvormil ROS] [Driver package for using Leap Motion™ controller on the robotics development platform ROS], BS thesis, 2018&lt;br /&gt;
*Martin Maidla, [http://hdl.handle.net/10062/60288 Avatud robotiarendusplatvormi Robotont omniliikumise ja odomeetria arendamine] [Omnimotion and odometry development for open robot development platform Robotont], BS thesis, 2018&lt;br /&gt;
*Raid Vellerind, [http://hdl.handle.net/10062/56559 Avatud robotiarendusplatvormi ROS võimekuse loomine Tartu Ülikooli Robotexi robootikaplatvormile] [ROS driver development for the University of Tartu’s Robotex robotics platform], BS thesis, 2017&lt;br /&gt;
*Kalle-Gustav Kruus, [http://hdl.handle.net/10062/32440 Jalgpalliroboti löögimehhanismi elektroonikalahendus] [Driver circuit for a kicking mechanism of a football robot], BS thesis, 2013&lt;br /&gt;
*Sven Kautlenbach, [http://hdl.handle.net/10062/25570 Autonoomne seade elastsusmooduli mõõtmiseks] [Electronic device for Young’s modulus measurements], BS thesis, 2011&lt;br /&gt;
&lt;br /&gt;
[[Category:Theses Topics]]&lt;/div&gt;</summary>
		<author><name>Karl</name></author>
	</entry>
	<entry>
		<id>https://ims.ut.ee/index.php?title=Theses_in_Robotics&amp;diff=43743</id>
		<title>Theses in Robotics</title>
		<link rel="alternate" type="text/html" href="https://ims.ut.ee/index.php?title=Theses_in_Robotics&amp;diff=43743"/>
		<updated>2025-09-02T20:13:48Z</updated>

		<summary type="html">&lt;p&gt;Karl: /* Bachelor's theses */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:IMS-robotics]]&lt;br /&gt;
&amp;lt;div class=&amp;quot;toclimit-3&amp;quot;&amp;gt;__TOC__&amp;lt;/div&amp;gt; &lt;br /&gt;
= Projects in Advanced Robotics =&lt;br /&gt;
''The main objective of the following projects is to give students experience in working with advanced robotics technology. Our group is active in several R&amp;amp;D projects involving human-robot collaboration, intuitive teleoperation of robots, and autonomous navigation of unmanned mobile platforms. Our main software platforms are [http://www.ros.org/ Robot Operating System (ROS)] for developing software for advanced robot systems and [http://gazebosim.org/ Gazebo] for running realistic robotic simulations.''&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
For further information, contact [[User:Karl|Karl Kruusamäe]]. &lt;br /&gt;
&lt;br /&gt;
The following is not an exhaustive list of all available thesis/research topics.&lt;br /&gt;
&lt;br /&gt;
== Highlighted theses topics for 2025/2026 study year ==&lt;br /&gt;
# [[#LLM-based task planning for robots|LLM-based task planning for robots]]&lt;br /&gt;
# [[#ROBOTONT: analysis of different options as on-board computers|ROBOTONT: analysis of different options as on-board computers]]&lt;br /&gt;
# [[#SemuBOT: multiple topics|SemuBOT: multiple topics]]&lt;br /&gt;
# [[#ROBOTONT: integrating a graphical programming interface|ROBOTONT: integrating a graphical programming interface]]&lt;br /&gt;
# [[#Sign-language-based control for robots|Sign-language-based control for robots]]&lt;br /&gt;
# [[#Robotic Study Companion: a social robot for students in higher education|Robotic Study Companion: a social robot for students in higher education]]&lt;br /&gt;
# [[#Quantitative Evaluation of the Situation Awareness during the Teleoperation of an Urban Vehicle|Quantitative Evaluation of the Situation Awareness during the Teleoperation of an Urban Vehicle]]&lt;br /&gt;
&lt;br /&gt;
== List of potential thesis topics ==&lt;br /&gt;
Our inventory includes but is not limited to:&lt;br /&gt;
&amp;lt;gallery mode=&amp;quot;packed&amp;quot;&amp;gt;&lt;br /&gt;
File:Youbot.png|120px|thumb|KUKA youBot&lt;br /&gt;
File:Ur5_left.png|120px|thumb|Universal Robot UR5&lt;br /&gt;
File:Franka_Emika_Panda.jpg|120px|thumb|Franka Emika Panda&lt;br /&gt;
File:Clearpath_Jackal.jpg|120px|thumb|Clearpath Jackal&lt;br /&gt;
File:Kinova_KG-3_Gripper.jpg|120px|thumb|Kinova 3-finger gripper&lt;br /&gt;
File:Xarm7.jpg|120px|thumb|UFACTORY xArm7&lt;br /&gt;
File:Robotont_banner.png|120px|thumb|robotont&lt;br /&gt;
File:Turtlebot3-waffle-pi.jpg|thumb|TurtleBot3&lt;br /&gt;
File:Parrot-bebop-2.jpg|thumb|Parrot Bebop 2&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== ROBOTONT: Docker-Driven ROS Environment Switching ===&lt;br /&gt;
This thesis focuses on integrating Docker containers to manage and switch between different ROS (Robot Operating System) environments on Robotont. Currently, ROS is installed natively on Robotont's Ubuntu-based onboard computer, limiting flexibility in system recovery and configuration switching. The goal is to develop a Docker-based solution that allows users to switch between ROS environments directly from Robotont’s existing low-level menu interface.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: designing and implementing a communication protocol for additional devices ===&lt;br /&gt;
Robotont currently includes a designated area on its front for attaching additional devices, such as an ultrasonic range finder, a servo motor connected to an Arduino, or a MikroBUS device. This thesis aims to design and implement a standard communication protocol that will enable seamless integration and control of these devices through the high-level ROS framework running on Robotont’s onboard computer.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: analysis of different options as on-board computers ===&lt;br /&gt;
Currently, ROBOTONT uses an Intel NUC as its onboard computer. The goal of this thesis is to validate Robotont's software stack on alternative compute devices (e.g. Raspberry Pi 4, Intel Compute Stick, and NVIDIA Jetson Nano) and benchmark their performance on the most popular Robotont use-case demos (e.g. webapp teleop, AR-tag steering, 2D mapping, and 3D mapping). The objective is to propose at least two different compute solutions: one that optimizes for cost and another that optimizes for performance. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT Lite ===&lt;br /&gt;
The goal of this thesis is to optimize the ROBOTONT platform for cost by replacing the onboard computer and sensors with low-cost alternatives while ensuring ROS/ROS2 software compatibility. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: integrating a graphical programming interface ===&lt;br /&gt;
The goal of this thesis is to integrate a graphical programming solution (e.g. Scratch or Blockly) to enable programming of Robotont by non-experts. See, e.g., https://doi.org/10.48550/arXiv.2011.13706&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== SemuBOT: multiple topics ===&lt;br /&gt;
In the 2023/2024 study year, many topics are being developed to support the open-source humanoid robot project SemuBOT (https://www.facebook.com/semubotmtu/)&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Robotic Study Companion: a social robot for students in higher education ===&lt;br /&gt;
Potential Topics:&lt;br /&gt;
* Enhance the Robot's Speech/Natural Language Capabilities &lt;br /&gt;
* Build a Local Language Model for the RSC&lt;br /&gt;
* Develop and Program the Robot’s Behavior and Personality&lt;br /&gt;
* Build a Digital Twin Simulation for Multimodal Interaction&lt;br /&gt;
* Explore the Use of the RSC as an Affective Robot to Address Students’ Academic Emotions&lt;br /&gt;
* Explore and Implement Cybersecurity Measures for Social Robot&lt;br /&gt;
&lt;br /&gt;
[https://github.com/orgs/RobotStudyCompanion/discussions/3 More info on Github] | reach out to farnaz.baksh@ut.ee for more info&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Virtual reality user interface (VRUI) for intuitive teleoperation system ===&lt;br /&gt;
[[Image:LeapMotion.png|200px|thumb|right|Detecting 2 hands with Leap Motion Controller]]&lt;br /&gt;
Enhancing the user experience of a virtual reality UI developed by [https://github.com/ut-ims-robotics/vrui_rviz Georg Astok]. Potentially adding [http://www.osvr.org/hardware-devs.html virtual reality capability] to a [https://www.youtube.com/watch?v=L25HHFd00rc gesture- and natural-language-based robot teleoperation system].&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
[[Image:Temoto-working-principle-in-pics.png|x160px|Gesture-based teleoperation]]&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Modeling humans for human-robot interaction ===&lt;br /&gt;
True human-robot collaboration means that the robot must understand the actions, intention, and state of its human partner. This work involves using cameras and other human sensors for digitally representing and modelling humans. There are multiple stages of modeling: a) physical models of human kinematics and dynamics; b) higher-level models for recognizing human intent.&amp;lt;br&amp;gt;&lt;br /&gt;
[[Image:Skeletal ROS.PNG|x160px|ROS &amp;amp; Kinect &amp;amp; Skeleton-Markers Package]]&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Robotic avatar for telepresence ===&lt;br /&gt;
Integrating hand gestures and head movements to control a robot avatar in a virtual reality user interface.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Detection of hardware and software resources for smart integration of robots ===&lt;br /&gt;
The vast majority of today’s robotic applications rely on hard-coded device and algorithm usage. This project focuses on developing Resource Snooper software that can detect the addition or removal of resources for the benefit of dynamic reconfiguration of robotic systems. This project is developed as a subsystem of [https://utnuclearroboticspublic.github.io/temoto2/index.html TeMoto].&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Human-Robot and Robot-Robot collaboration applications ===&lt;br /&gt;
Creating a demo or analysis of software capabilities related to human-robot or robot-robot teams:&lt;br /&gt;
* human-robot collaborative assembly&lt;br /&gt;
* distributed mapping: analysis and demo of existing ROS packages (e.g., segmap https://youtu.be/JJhEkIA1xSE) for multi-robot mapping&lt;br /&gt;
* Inaccessible region teamwork&lt;br /&gt;
**youbot+drone - a drone maps the environment (for example, a maze) and a ground vehicle uses this information to traverse the maze&lt;br /&gt;
**youbot+robotont - a youbot cannot go up ledges, but it can lift a smaller robot, such as robotont, up a ledge.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Mirroring human hand movements on industrial robots ===&lt;br /&gt;
The goal of this project is to integrate continuous control of an industrial robot manipulator with a gestural telerobotics interface. The recommended tools for this thesis project are a Leap Motion Controller or a standard web camera, Ultraleap, a Universal Robot UR5 manipulator, and ROS.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: TeMoto for robotont ===&lt;br /&gt;
Swarm-management and UMRF-based task loading for robotont using [https://github.com/temoto-framework TeMoto] framework.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Enhancing teleoperation control interface with augmented cues to provoke caution ===&lt;br /&gt;
The task is to create a telerobot control interface where the video feed from the remote site and/or a mixed-reality scene is augmented with visual cues to provoke caution in the human operator.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Robot-to-human interaction ===&lt;br /&gt;
As robots and autonomous machines start sharing the same space as humans, their actions need to be understood by the people occupying that space. For instance, a human worker needs to understand what the robot partner is planning next, or a pedestrian needs to clearly comprehend the behaviour of a driverless vehicle. To reduce this ambiguity, the robot needs mechanisms to convey its intent (whatever it is going to do next). The aim of the thesis is to outline existing methods for machines to convey their intent and to develop a unified model interface for expressing that intent. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Gaze-based handover prediction ===&lt;br /&gt;
When a human needs to pass an object to a robot manipulator, the robot must understand where in 3D space the object handover occurs and then plan an appropriate motion. Human gaze can be used as the input for predicting which object to track. The thesis activities involve camera-based eye tracking and safe motion planning.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: Human-height human-robot interface for Robotont ground robot ===&lt;br /&gt;
Robotont is an ankle-high flat mobile robot. For humans to interact with Robotont, there is a need for a compact and lightweight mechanical structure that is tall enough for comfortable human-robot interaction. The objective of the thesis is to develop the mechanical designs and build prototypes that ensure stable operation and meet the aesthetic requirements for use in public places. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Stratos Explore Ultraleap demonstrator for robotics ===&lt;br /&gt;
The aim of this thesis is to systematically analyze the strengths and limitations of the Stratos Explore Ultraleap device in the context of controlling a robot, and subsequently to implement a demonstrator application showing its applicability for robot control. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Mixed-reality scene creation for vehicle teleoperation ===&lt;br /&gt;
Fusing different sensory feeds for creating a high-usability teleoperation scene.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Validation study for AR-based robot user-interfaces ===&lt;br /&gt;
Designing and carrying out a user study to validate the functionality and usability of a human-robot interface.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Quantitative Evaluation of the Situation Awareness during the Teleoperation of an Urban Vehicle ===&lt;br /&gt;
The goal of this thesis is to develop the methodology for measuring the operator's situation awareness while operating an autonomous urban vehicle. Potential metrics could include driving accuracy, speed, and latency. This thesis will be conducted as part of the activities at the [https://adl.cs.ut.ee Autonomous Driving Lab].&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Sign-language-based control for robots ===&lt;br /&gt;
The goal of this thesis is to design and implement the means for interacting with a robot via conventional sign language, i.e. the human can command a robot by using sign language.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== LLM-based task planning for robots ===&lt;br /&gt;
The goal of this thesis is to enable high-level task planning for autonomous robots performing general-purpose tasks. The thesis would leverage an LLM's reasoning capabilities and the TeMoto Action Engine to achieve natural task negotiation, planning, and execution. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: Development of CoppeliaSim-based simulation and a digital twin ===&lt;br /&gt;
The objective of the thesis is to develop an easy-to-use simulation for Robotont gen3 using CoppeliaSim (https://www.coppeliarobotics.com/).&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
= Completed projects =&lt;br /&gt;
== PhD theses ==&lt;br /&gt;
*Robert Valner, [https://hdl.handle.net/10062/105994 Design of TeMoto, a software framework for dependable, adaptive, and collaborative autonomous robots] [TeMoto – töökindlate, adaptiivsete ja koostöövõimeliste autonoomsete robotite arendamise tarkvararaamistik], PhD thesis, 2024&lt;br /&gt;
*Houman Masnavi, [https://hdl.handle.net/10062/91394 Visibility aware navigation] [Nähtavust arvestav navigatsioon], PhD thesis, 2023&lt;br /&gt;
&lt;br /&gt;
== Master's theses ==&lt;br /&gt;
*Asier Mandiola Arrizabalaga, Haptic-Based Teleoperation of Robots, MS thesis, 2025&lt;br /&gt;
*Dāvis Krūminš, Web-based Robotics Lab for Effortless ROS2 Development, MS thesis, 2025&lt;br /&gt;
*Julian Rene Leclerc, Natural Language Human-Robot Interaction: A Modular Framework for Conversational Robot Control Using Large Language Models, MS thesis, 2025&lt;br /&gt;
*Miriam Calafa’, Designing Multimodal Emotional Expression for a Robotic Study Companion, MS thesis, 2025&lt;br /&gt;
*Iryna Hurova, Model-based Planning Using GPU-accelerated Simulator as a World Model, MS thesis, 2025&lt;br /&gt;
*Sander Toma Võrk, Elektrooniliste termomeetrite automaatse kalibreerimissüsteemi väljatöötamine ja rakendamine [Development and implementation of an automatic calibration system for electronic thermometers], MS thesis, 2025&lt;br /&gt;
*Carl Hjalmar Love Hult, SurfMotion: An Open Source Pipeline for Robotic Pipe Cutting and Welding, MS thesis, 2025&lt;br /&gt;
*Paola Avalos Conchas, Socially Aware Planning for Indoor Navigation, MS thesis, 2025&lt;br /&gt;
*Robert Allik, Validation of NoMaD as a Global Planner for Mobile Robots, MS thesis, 2024&lt;br /&gt;
*Rauno Põlluäär, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=79393 Designing and Implementing a Bird’s-eye View Interface for a Self-driving Vehicle’s Teleoperation System] [Isejuhtiva sõiduki kaugjuhtimissüsteemile linnuvaate kasutajaliidese loomine], MS thesis, 2024&lt;br /&gt;
*Gautier Reynes, VR-Enhanced Remote Inspection Framework for Semi-Autonomous Robot Fleet, MS thesis, 2024&lt;br /&gt;
*Eva Mõtshärg, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=77385 3D-prinditava kere disain ja analüüs vabavaralisele haridusrobotile Robotont] [Design and Analysis of a 3D Printable Chassis for the Open Source Educational Robot Robotont], MS thesis, 2023&lt;br /&gt;
*Farnaz Baksh, [https://dspace.ut.ee/items/a7a9cc15-27e9-450c-94e8-3267f0c95c56 An Open-source Robotic Study Companion for University Students] [Avatud lähtekoodiga robotõpikaaslane üliõpilastele], MS thesis, 2023&lt;br /&gt;
*Igor Rybalskii, [http://hdl.handle.net/10062/83028 Augmented reality (AR) for enabling human-robot collaboration with ROS robots] [Liitreaalsus inimese ja roboti koostöö võimaldamiseks ROS-i robotitega], MS thesis, 2022&lt;br /&gt;
*Md. Maniruzzaman, [http://hdl.handle.net/10062/83025 Object search and retrieval in indoor environment using a Mobile Manipulator] [Objektide otsimine ja teisaldamine siseruumides mobiilse manipulaatori abil], MS thesis, 2022&lt;br /&gt;
*Allan Kustavus, [http://hdl.handle.net/10062/72651 Design and Implementation of a Generalized Resource Management Architecture in the TeMoto Software Framework] [Üldise ressursihalduri disain ja teostus TeMoto tarkvara raamistikule], MS thesis, 2021&lt;br /&gt;
*Kristina Meister, [http://hdl.handle.net/10062/72350 External human-vehicle interaction - a study in the context of an autonomous ride-hailing service], MS thesis, 2021&lt;br /&gt;
*Muhammad Usman, [http://hdl.handle.net/10062/72126 Development of an Optimization-Based Motion Planner and Its ROS Interface for a Non-Holonomic Mobile Manipulator] [Optimeerimisele baseeruva liikumisplaneerija arendamine ja selle ROSi liides mitteholonoomse mobiilse manipulaatori jaoks], MS thesis, 2020&lt;br /&gt;
*Maarika Oidekivi, [http://hdl.handle.net/10062/72119 Masina kavatsuse väljendamine ja tõlgendamine] [Communicating and interpreting machine intent], MS thesis, 2020&lt;br /&gt;
*Houman Masnavi, [http://hdl.handle.net/10062/72118 Multi-Robot Motion Planning for Shared Payload Transportation] [Rajaplaneerimine multi-robot süsteemile jagatud lasti transportimisel], MS thesis, 2020&lt;br /&gt;
*Fabian Ernesto Parra Gil, [http://hdl.handle.net/10062/72112 Implementation of Robot Manager Subsystem for Temoto Software Framework] [Robotite Halduri alamsüsteemi väljatöötamine tarkvararaamistikule TEMOTO], MS thesis, 2020&lt;br /&gt;
*Zafarullah, [http://hdl.handle.net/10062/72125 Gaze Assisted Neural Network based Prediction of End-Point of Human Reaching Trajectories], MS thesis, 2020&lt;br /&gt;
*Madis K Nigol, [http://hdl.handle.net/10062/64339 Õppematerjalid robotplatvormile Robotont] [Study materials for robot platform Robotont], MS thesis, 2019&lt;br /&gt;
*Renno Raudmäe, [http://hdl.handle.net/10062/64341 Avatud robotplatvorm Robotont] [Open source robotics platform Robotont], MS thesis, 2019&lt;br /&gt;
*Asif Sattar, [http://hdl.handle.net/10062/64352 Human detection and distance estimation with monocular camera using YOLOv3 neural network] [Inimeste tuvastamine ning kauguse hindamine kasutades kaamerat ning YOLOv3 tehisnärvivõrku], MS thesis, 2019&lt;br /&gt;
*Ragnar Margus, [http://hdl.handle.net/10062/64337 Kergliiklusvahendite jagamisteenuseks vajaliku positsioneerimismooduli loomine ja uurimine] [Development and testing of an IoT module for an electric vehicle sharing service], MS thesis, 2019&lt;br /&gt;
*Pavel Šumejko, [http://hdl.handle.net/10062/64320 Robust Solution for Extrinsic Calibration of a 2D Laser-Rangefinder and a Monocular USB Camera] [Meetod 2D laserkaugusmõõdiku ja USB-kaamera väliseks kalibreerimiseks], MS thesis, 2019&lt;br /&gt;
*Dzvezdana Arsovska, [http://hdl.handle.net/10062/64321  Building an Efficient and Secure Software Supply Pipeline for Aerial Robotics Application] [Efektiivne ja turvaline tarkvaraarendusahel lennurobootika rakenduses], MS thesis, 2019&lt;br /&gt;
*Tõnis Tiimus, [http://hdl.handle.net/10062/60713 Visuaalselt esilekutsutud potentsiaalidel põhinev aju-arvuti liides robootika rakendustes] [A VEP-based BCI for robotics applications], MS thesis, 2018&lt;br /&gt;
*Martin Appo, [http://hdl.handle.net/10062/60718 Hardware-agnostic compliant control ROS package for collaborative industrial manipulators] [Riistvarapaindlik ROSi tarkvarapakett tööstuslike robotite mööndlikuks juhtimiseks], MS thesis, 2018&lt;br /&gt;
*Hassan Mahmoud Shehawy Elhanash, [http://hdl.handle.net/10062/60711 Optical Tracking of Forearm for Classifying Fingers Poses] [Küünarvarre visuaalne jälgimine ennustamaks sõrmede asendeid], MS thesis, 2018&lt;br /&gt;
&lt;br /&gt;
== Bachelor's theses ==&lt;br /&gt;
*Kaarel-Richard Kaarelson, [https://thesis.cs.ut.ee/12b3ed0c-b3c1-473e-ae13-5063fde9850a TeMoto Action Assistant: A Web-Based Human–Robot Interface for Designing UMRF Graphs] [TeMoto Action Assistant: Veebipõhine Inimese ja Roboti Interaktsiooni Tööriist UMRF Graafide Loomiseks], BS thesis, 2025&lt;br /&gt;
*Oliver Voorel, Humanoidrobot Semuboti toitesüsteemi uuendamine [Upgrading the power system of the humanoid robot Semubot], BS thesis, 2025&lt;br /&gt;
*Karl Rahn, Millimeeterlaine radari integreerimine avatud lähtekoodiga muruniiduki platvormil Open Mower [Integrating millimeter-wave radar on the open-source lawnmower platform Open Mower], BS thesis, 2025&lt;br /&gt;
*Mattias Mäe, Sõiduki kaugjuhtimise viivituse mõõtmine [Measuring latency in vehicle teleoperation], BS thesis, 2025&lt;br /&gt;
*Jürgen Kottise, Dockeri konteineritel põhinev haldustarkvara Robotont 3 õpperobotile [Docker-container-based management software for the educational robot Robotont 3], BS thesis, 2025&lt;br /&gt;
*Martin Kaur, Servodel põhinev sotsiaalse robotkäe süsteem SemuBotile [A servo-based social robot arm system for SemuBot], BS thesis, 2025&lt;br /&gt;
*Karl-Jürgen Siilak, Mass Portal Grand Pharaoh XD 3D-printeri uuendamine [Upgrading the Mass Portal Grand Pharaoh XD 3D printer], BS thesis, 2025&lt;br /&gt;
*Usman Ali Khan, Graphical programming interface for ROBOTONT, an open-source educational robot, BS thesis, 2025&lt;br /&gt;
*Robina Zvirgzdina, Development of an Autonomous Open-Source Inventory Performance Robot for the University of Tartu Library, BS thesis, 2025&lt;br /&gt;
*Omar Huseynli, ROS integration for the Semubot robot, BS thesis, 2025&lt;br /&gt;
*Albert Unn, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=79937 Suhtlusvõimekuse arendamine sotsiaalsele humanoidrobotile SemuBot] [Developing the ability to communicate for SemuBot, a social humanoid robot], BS thesis, 2024&lt;br /&gt;
*Veronika-Marina Volynets, [https://hdl.handle.net/10062/99943 Development of Control Electronics and Program for Robotont's Height Adjustment Mechanism] [Juhtelektroonika ja programmi väljatöötamine Robotondi kõrguse reguleerimise mehhanismile], BS thesis, 2024&lt;br /&gt;
*Nikita Kurenkov, [https://hdl.handle.net/10062/99880 Flexible Screen Integration and Development of Neck Movement Mechanism for Social Humanoid Robot SemuBot] [Humanoidroboti näo lahenduse leidmine ja rakendamine; humanoidroboti jaoks kaela mehhanismi väljatöötamine], BS thesis, 2024&lt;br /&gt;
*Elchin Huseynov, [https://hdl.handle.net/10062/99668 Design and Control of a Social Humanoid Robot - SemuBot’s Hand] [Sotsiaalse Humanoidroboti Disain ja Juhtimine - SemuBoti Käsi], BS thesis, 2024&lt;br /&gt;
*Veronika Podliesnova, [https://hdl.handle.net/10062/99636 Real-Time Detection of Robot Failures by Monitoring Operator’s Brain Activity with EEG-based Brain-Computer Interface] [Reaalajas Robotirikke Tuvastamine Operaatori Ajutegevuse Jälgimise teel EEG-põhise Aju-Arvuti Liidese abil], BS thesis, 2024&lt;br /&gt;
*Märten Josh Peedimaa, [https://hdl.handle.net/not-yet Rhea: An Open-Source Table Tennis Ball Launcher Robot for Multiball Training] [Avatud lähtekoodiga lauatennise palliviske robot mitmikpall treenimiseks], BS thesis, 2024&lt;br /&gt;
*Karl Sander Vinkel, [https://hdl.handle.net/not-yet Laadimisjaama ja transportkesta väljatöötamine õpperobotile Robotont] [Development of charging dock and transport case for Robotont], BS thesis, 2024&lt;br /&gt;
*Raimo Köidam, [https://hdl.handle.net/not-yet Valguslahenduse tarkvara väljatöötamine õpperobotile Robotont] [Development of light solution software for the educational robot Robotont], BS thesis, 2024&lt;br /&gt;
*Robert Valge, [https://hdl.handle.net/not-yet Toitepinge ja tarbevoolu monitoorimine ning toitehalduse püsivara loomine õpperobotil Robotont] [Monitoring supply voltage and current consumption and creating firmware for Robotont’s power management system], BS thesis, 2024&lt;br /&gt;
*Leonid Tšigrinski, [https://hdl.handle.net/not-yet Õpperoboti Robotont püsivara arhitektuuri uuendamine] [Education robot &amp;quot;Robotont&amp;quot; firmware architecture updating], BS thesis, 2024&lt;br /&gt;
*Andres Sakk, [https://hdl.handle.net/not-yet Avatud robotplatvormi Robotont 3 kasutajaliidese väljatöötamine] [Development of a user interface for the open robotics platform Robotont 3], BS thesis, 2024&lt;br /&gt;
*Kaur Kullamäe, [https://hdl.handle.net/not-yet Elektroonikalahendus ja püsivara robotkäe juhtimiseks sotsiaalsel humanoidrobotil SemuBot] [Building the SemuBot arm electronics system], BS thesis, 2024&lt;br /&gt;
*Kristjan Madis Kask, [https://hdl.handle.net/not-yet Käe mehaanika disain sotsiaalsele humanoidrobotile SemuBot] [Arm mechanics design for social humanoid robot SemuBot], BS thesis, 2024&lt;br /&gt;
*Sven-Ervin Paap, [https://hdl.handle.net/not-yet ROS2 draiver õpperobotile Robotont] [ROS2 driver for the educational robot Robotont], BS thesis, 2024&lt;br /&gt;
*Timur Nizamov, [https://hdl.handle.net/10062/99592 Audio System for the Social Humanoid Robot SemuBot] [Helisüsteem sotsiaalsele humanoidrobotile SemuBot], BS thesis, 2024&lt;br /&gt;
*Georgs Narbuts, [https://hdl.handle.net/10062/99554 Holonomic Motion Drive System of a Social Humanoid Robot SemuBot] [Sotsiaalse humanoidroboti SemuBoti holonoomne liikuv ajamisüsteem], BS thesis, 2024&lt;br /&gt;
*Iryna Hurova, [https://hdl.handle.net/10062/90638 Kitting station of the learning factory] [Õppetehase komplekteerimisjaam], BS thesis, 2023&lt;br /&gt;
*Paola Avalos Conchas, [https://hdl.handle.net/10062/90641 Payload transportation system of a learning factory] [Õppetehase kasuliku koorma transpordisüsteem], BS thesis, 2023&lt;br /&gt;
*Pille Pärnalaas, [https://hdl.handle.net/10062/93431 Pöördpõik ajami arendus robotplatvormile Robotont] [Development of swerve drive for robotic platform Robotont], BS thesis, 2023&lt;br /&gt;
*Carl Hjalmar Love Hult, [https://hdl.handle.net/10062/90673 Using and Evaluating the Real-time Spatial Perception System Hydra in Real-world Scenarios] [Reaalajas toimiva ruumilise taju süsteemi Hydra kasutamine ja hindamine praktilistes stsenaariumides], BS thesis, 2023&lt;br /&gt;
*Priit Rooden, [https://hdl.handle.net/10062/93432 Autonoomse laadimislahenduse väljatöötamine õpperobotile Robotont] [Development of an autonomous charging solution for the robot platform Robotont], BS thesis, 2023&lt;br /&gt;
*Marko Muro, [https://hdl.handle.net/10062/93429 Robotondi akulahenduse ning 12 V pingeregulaatori prototüüpimine] [Prototyping battery solution and 12 V voltage regulator for Robotont], BS thesis, 2023&lt;br /&gt;
*Danel Leppenen, [https://hdl.handle.net/10062/93427 Nav2 PYIF: Python-based Motion Planning for ROS 2 Navigation 2] [Nav 2 PYIF: Pythoni põhine liikumise planeerija ROS 2 Navigation 2-le], BS thesis, 2023&lt;br /&gt;
*Kertrud Geddily Küüt, [https://hdl.handle.net/10062/93424 Kõrgust reguleeriv mehhanism Robotondile] [Height adjusting mechanism for Robotont], BS thesis, 2023&lt;br /&gt;
*Ingvar Drikkit, [https://hdl.handle.net/10062/93421 Lisaseadmete võimekuse arendamine haridusrobotile Robotont] [Developing add-on device support for the educational robot Robotont], BS thesis, 2023&lt;br /&gt;
*Kristo Pool, [https://hdl.handle.net/10062/93430 MoveIt 2 õppematerjalid] [Learning materials for MoveIt 2], BS thesis, 2023&lt;br /&gt;
*Erki Veeväli, [https://hdl.handle.net/10062/93434 Development of a Continuous Teleoperation System for Urban Road Vehicle] [Linnasõiduki pideva kaugjuhtimissüsteemi arendus], BS thesis, 2023&lt;br /&gt;
*Aleksandra Doroshenko, [https://hdl.handle.net/10062/93420 Haiglates inimesi juhatava roboti disain] [Hospital guide robot design], BS thesis, 2023&lt;br /&gt;
*Hui Shi, [http://hdl.handle.net/10062/83043 Expanding the Open-source ROS Software Package opencv_apps with Dedicated Blob Detection Functionality] [Avatud lähtekoodiga ROS-i tarkvarakimbu opencv_apps laiendamine laigutuvasti funktsioonaalsusega], BS thesis, 2022&lt;br /&gt;
*Dāvis Krūmiņš, [http://hdl.handle.net/10062/83040 Web-based learning and software development environment for remote access of ROS robots] [Veebipõhine õppe- ja tarkvaraarenduse keskkond ROS robotite juurdepääsuks kaugteel], BS thesis, 2022&lt;br /&gt;
*Anna Jakovleva, [http://hdl.handle.net/10062/83037 Roboquiz - an interactive human-robot game] [Roboquiz - interaktiivne inimese ja roboti mäng], BS thesis, 2022&lt;br /&gt;
*Kristjan Laht, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=74702&amp;amp;year=2022 Robot Localization with Fiducial Markers] [Roboti lokaliseerimine koordinaatmärkidega], BS thesis, 2022&lt;br /&gt;
*Hans Pärtel Pani, [http://hdl.handle.net/10062/83008 ROS draiver pehmerobootika haaratsile] [ROS Driver for Soft Robotic Gripper], BS thesis, 2022&lt;br /&gt;
*Markus Erik Sügis, [http://hdl.handle.net/10062/83015 Jagatud juhtimise põhimõttel realiseeritud robotite kaugjuhtimissüsteem] [A continuous teleoperating system based on shared control concept], BS thesis, 2022&lt;br /&gt;
*Taaniel Küla, [http://hdl.handle.net/10062/83007 ROS2 platvormile keevitusroboti tarkvara portotüübi loomine kasutades UR5 robot-manipulaatorit] [Welding robot software prototype for ROS2 using UR5 robot arm], BS thesis, 2022&lt;br /&gt;
*Rauno Põlluäär, [http://hdl.handle.net/10062/72672 Veebirakendus-põhine kasutajaliides avatud robotplatvormi Robotont juhtimiseks ja haldamiseks] [Web application-based user interface for controlling and managing open-source robotics platform Robotont], BS thesis, 2021&lt;br /&gt;
*Hendrik Olesk, [http://hdl.handle.net/10062/72664 Nägemisulatuses kaugjuhitava mobiilse robotmanipulaatori kasutajamugavuse tõstmine] [Improving the usability of a mobile manipulator robot for line-of-sight remote control], BS thesis, 2021&lt;br /&gt;
*Tarvi Tepandi, [http://hdl.handle.net/10062/72665 Segareaalsusel põhinev kasutajaliides mobiilse roboti kaugjuhtimiseks Microsoft HoloLens 2 vahendusel] [Mixed-reality user interface for teleoperating mobile robots with Microsoft HoloLens 2], BS thesis, 2021&lt;br /&gt;
*Rudolf Põldma, [http://hdl.handle.net/10062/72674 Tartu linna Narva maantee ringristmiku digikaksik] [Digital twin for Narva street roundabout in Tartu], BS thesis, 2021&lt;br /&gt;
*Kwasi Akuamoah Boateng, [http://hdl.handle.net/10062/72804 Digital Twin of a Teaching and Learning Robotics Lab] [Robotite õpetamise ja õppimise labori digitaalne kaksik], BS thesis, 2021&lt;br /&gt;
*Karina Sein, [http://hdl.handle.net/10062/72102 Eestikeelse kõnesünteesi võimaldamine robootika arendusplatvormil ROS] [Enabling Estonian speech synthesis on the Robot Operating System (ROS)], BS thesis, 2020&lt;br /&gt;
*Ranno Mäesepp, [http://hdl.handle.net/10062/72100 Takistuste vältimise lahendus õpperobotile Robotont] [Obstacle avoidance solution for educational robot platform Robotont], BS thesis, 2020&lt;br /&gt;
*Igor Rybalskii, [http://hdl.handle.net/10062/72060 Gesture Detection Software for Human-Robot Collaboration] [Žestituvastus tarkvara inimese ja roboti koostööks], BS thesis, 2020&lt;br /&gt;
*Meelis Pihlap, [http://hdl.handle.net/10062/64292 Mitme roboti koostöö funktsionaalsuste väljatöötamine tarkvararaamistikule TeMoto] [Multi-robot collaboration functionalities for robot software development framework TeMoto], BS thesis, 2019&lt;br /&gt;
*Kaarel Mark, [http://hdl.handle.net/10062/64290 Liitreaalsuse kasutamine tootmisprotsessis asukohtade määramisel] [Augmented reality for location determination in manufacturing], BS thesis, 2019&lt;br /&gt;
*Kätriin Julle, [http://hdl.handle.net/10062/64279 Roboti KUKA youBot riistvara ja ROS-tarkvara uuendamine] [Upgrading robot KUKA youBot’s hardware and ROS-software], BS thesis, 2019&lt;br /&gt;
*Georg Astok, [http://hdl.handle.net/10062/64274 Roboti juhtimine virtuaalreaalsuses kasutades ROS-raamistikku] [Creating virtual reality user interface using only ROS framework], BS thesis, 2019&lt;br /&gt;
*Martin Hallist, [http://hdl.handle.net/10062/64275 Robotipõhine kaugkohalolu käte liigutuste ülekandmiseks] [Teleoperation robot for arms motions], BS thesis, 2019&lt;br /&gt;
*Ahmed Hassan Helmy Mohamed, [http://hdl.handle.net/10062/63946 Software integration of autonomous robot system for mixing and serving drinks] [Jooke valmistava ja serveeriva robotsüsteemi tarkvaralahendus], BS thesis, 2019&lt;br /&gt;
*Kristo Allaje, [http://hdl.handle.net/10062/60294 Draiveripakett käte jälgimise seadme Leap Motion™ kontroller kasutamiseks robootika arendusplatvormil ROS] [Driver package for using Leap Motion™ controller on the robotics development platform ROS], BS thesis, 2018&lt;br /&gt;
*Martin Maidla, [http://hdl.handle.net/10062/60288 Avatud robotiarendusplatvormi Robotont omniliikumise ja odomeetria arendamine] [Omnimotion and odometry development for open robot development platform Robotont], BS thesis, 2018&lt;br /&gt;
*Raid Vellerind, [http://hdl.handle.net/10062/56559 Avatud robotiarendusplatvormi ROS võimekuse loomine Tartu Ülikooli Robotexi robootikaplatvormile] [ROS driver development for the University of Tartu’s Robotex robotics platform], BS thesis, 2017&lt;br /&gt;
*Kalle-Gustav Kruus, [http://hdl.handle.net/10062/32440 Jalgpalliroboti löögimehhanismi elektroonikalahendus] [Driver circuit for a kicking mechanism of a football robot], BS thesis, 2013&lt;br /&gt;
*Sven Kautlenbach, [http://hdl.handle.net/10062/25570 Autonoomne seade elastsusmooduli mõõtmiseks] [Electronic device for Young’s modulus measurements], BS thesis, 2011&lt;br /&gt;
&lt;br /&gt;
[[Category:Theses Topics]]&lt;/div&gt;</summary>
		<author><name>Karl</name></author>
	</entry>
	<entry>
		<id>https://ims.ut.ee/index.php?title=Theses_in_Robotics&amp;diff=43742</id>
		<title>Theses in Robotics</title>
		<link rel="alternate" type="text/html" href="https://ims.ut.ee/index.php?title=Theses_in_Robotics&amp;diff=43742"/>
		<updated>2025-09-02T12:26:58Z</updated>

		<summary type="html">&lt;p&gt;Karl: /* List of potential thesis topics */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:IMS-robotics]]&lt;br /&gt;
&amp;lt;div class=&amp;quot;toclimit-3&amp;quot;&amp;gt;__TOC__&amp;lt;/div&amp;gt; &lt;br /&gt;
= Projects in Advanced Robotics =&lt;br /&gt;
''The main objective of the following projects is to give students experience in working with advanced robotics technology. Our group is active in several R&amp;amp;D projects involving human-robot collaboration, intuitive teleoperation of robots, and autonomous navigation of unmanned mobile platforms. Our main software platforms are [http://www.ros.org/ Robot Operating System (ROS)] for developing software for advanced robot systems and [http://gazebosim.org/ Gazebo] for running realistic robotic simulations.''&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
For further information, contact [[User:Karl|Karl Kruusamäe]]. &lt;br /&gt;
&lt;br /&gt;
The following is not an exhaustive list of all available thesis/research topics.&lt;br /&gt;
&lt;br /&gt;
== Highlighted theses topics for 2025/2026 study year ==&lt;br /&gt;
# [[#LLM-based task planning for robots|LLM-based task planning for robots]]&lt;br /&gt;
# [[#ROBOTONT: analysis of different options as on-board computers|ROBOTONT: analysis of different options as on-board computers]]&lt;br /&gt;
# [[#SemuBOT: multiple topics|SemuBOT: multiple topics]]&lt;br /&gt;
# [[#ROBOTONT: integrating a graphical programming interface|ROBOTONT: integrating a graphical programming interface]]&lt;br /&gt;
# [[#Sign-language-based control for robots|Sign-language-based control for robots]]&lt;br /&gt;
# [[#Robotic Study Companion: a social robot for students in higher education|Robotic Study Companion: a social robot for students in higher education]]&lt;br /&gt;
# [[#Quantitative Evaluation of the Situation Awareness during the Teleoperation of an Urban Vehicle|Quantitative Evaluation of the Situation Awareness during the Teleoperation of an Urban Vehicle]]&lt;br /&gt;
&lt;br /&gt;
== List of potential thesis topics ==&lt;br /&gt;
Our inventory includes but is not limited to:&lt;br /&gt;
&amp;lt;gallery mode=&amp;quot;packed&amp;quot;&amp;gt;&lt;br /&gt;
File:Youbot.png|120px|thumb|KUKA youBot&lt;br /&gt;
File:Ur5_left.png|120px|thumb|Universal Robot UR5&lt;br /&gt;
File:Franka_Emika_Panda.jpg|120px|thumb|Franka Emika Panda&lt;br /&gt;
File:Clearpath_Jackal.jpg|120px|thumb|Clearpath Jackal&lt;br /&gt;
File:Kinova_KG-3_Gripper.jpg|120px|thumb|Kinova 3-finger gripper&lt;br /&gt;
File:Xarm7.jpg|120px|thumb|UFACTORY xArm7&lt;br /&gt;
File:Robotont_banner.png|120px|thumb|robotont&lt;br /&gt;
File:Turtlebot3-waffle-pi.jpg|thumb|TurtleBot3&lt;br /&gt;
File:Parrot-bebop-2.jpg|thumb|Parrot Bebop 2&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== ROBOTONT: Docker-Driven ROS Environment Switching ===&lt;br /&gt;
This thesis focuses on integrating Docker containers to manage and switch between different ROS (Robot Operating System) environments on Robotont. Currently, ROS is installed natively on Robotont's Ubuntu-based onboard computer, limiting flexibility in system recovery and configuration switching. The goal is to develop a Docker-based solution that allows users to switch between ROS environments directly from Robotont’s existing low-level menu interface.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: designing and implementing a communication protocol for additional devices ===&lt;br /&gt;
Robotont currently includes a designated area on its front for attaching additional devices, such as an ultrasonic range finder, a servo motor connected to an Arduino, or a MikroBUS device. This thesis aims to design and implement a standard communication protocol that will enable seamless integration and control of these devices through the high-level ROS framework running on Robotont’s onboard computer.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: analysis of different options as on-board computers ===&lt;br /&gt;
Currently, ROBOTONT uses an Intel NUC as its onboard computer. The goal of this thesis is to validate Robotont's software stack on alternative compute devices (e.g. Raspberry Pi 4, Intel Compute Stick, and NVIDIA Jetson Nano) and benchmark their performance on the most popular Robotont use-case demos (e.g. webapp teleop, ar-tag steering, 2D mapping, and 3D mapping). The objective is to propose at least two different compute solutions: one that optimizes for cost and another that optimizes for performance. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT Lite ===&lt;br /&gt;
The goal of this thesis is to optimize the ROBOTONT platform for cost by replacing the onboard computer and sensors with low-cost alternatives while ensuring ROS/ROS2 software compatibility. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: integrating a graphical programming interface ===&lt;br /&gt;
The goal of this thesis is to integrate a graphical programming solution (e.g. Scratch or Blockly) to enable programming of Robotont by non-experts. See, e.g., https://doi.org/10.48550/arXiv.2011.13706&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== SemuBOT: multiple topics ===&lt;br /&gt;
In 2023/2024, many topics are offered to support the development of the open-source humanoid robot project SemuBOT (https://www.facebook.com/semubotmtu/).&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Robotic Study Companion: a social robot for students in higher education ===&lt;br /&gt;
Potential Topics:&lt;br /&gt;
* Enhance the Robot's Speech/Natural Language Capabilities &lt;br /&gt;
* Build a Local Language Model for the RSC&lt;br /&gt;
* Develop and Program the Robot’s Behavior and Personality&lt;br /&gt;
* Build a Digital Twin Simulation for Multimodal Interaction&lt;br /&gt;
* Explore the Use of the RSC as an Affective Robot to Address Students’ Academic Emotions&lt;br /&gt;
* Explore and Implement Cybersecurity Measures for Social Robot&lt;br /&gt;
&lt;br /&gt;
[https://github.com/orgs/RobotStudyCompanion/discussions/3 More info on Github] | reach out to farnaz.baksh@ut.ee for more info&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Virtual reality user interface (VRUI) for intuitive teleoperation system ===&lt;br /&gt;
[[Image:LeapMotion.png|200px|thumb|right|Detecting 2 hands with Leap Motion Controller]]&lt;br /&gt;
Enhancing the user-experience of a virtual reality UI developed by [https://github.com/ut-ims-robotics/vrui_rviz Georg Astok]. Potentially adding [http://www.osvr.org/hardware-devs.html virtual reality capability] to a [https://www.youtube.com/watch?v=L25HHFd00rc gesture- and natural-language-based robot teleoperation system].&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
[[Image:Temoto-working-principle-in-pics.png|x160px|Gesture-based teleoperation]]&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Modeling humans for human-robot interaction ===&lt;br /&gt;
True human-robot collaboration means that the robot must understand the actions, intention, and state of its human partner. This work involves using cameras and other human-tracking sensors to digitally represent and model humans. There are multiple stages of modeling: a) physical models of human kinematics and dynamics; b) higher-level models for recognizing human intent.&amp;lt;br&amp;gt;&lt;br /&gt;
[[Image:Skeletal ROS.PNG|x160px|ROS &amp;amp; Kinect &amp;amp; Skeleton-Markers Package]]&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Robotic avatar for telepresence ===&lt;br /&gt;
Integrating hand gestures and head movements to control a robot avatar in a virtual reality user interface.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Detection of hardware and software resources for smart integration of robots ===&lt;br /&gt;
The vast majority of today’s robotic applications rely on hard-coded device and algorithm usage. This project focuses on developing Resource Snooper software that can detect the addition or removal of resources for the benefit of dynamic reconfiguration of robotic systems. This project is developed as a subsystem of [https://utnuclearroboticspublic.github.io/temoto2/index.html TeMoto].&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Human-Robot and Robot-Robot collaboration applications ===&lt;br /&gt;
Creating a demo or analysis of software capabilities related to human-robot or robot-robot teams:&lt;br /&gt;
* human-robot collaborative assembly&lt;br /&gt;
* distributed mapping; analysis and demo of existing ROS packages for multi-robot mapping (e.g., segmap https://youtu.be/JJhEkIA1xSE)&lt;br /&gt;
* inaccessible-region teamwork&lt;br /&gt;
**youbot+drone - a drone maps the environment (for example, a maze) and a ground vehicle uses this information to traverse the maze&lt;br /&gt;
**youbot+robotont - the youbot cannot go up ledges, but it can lift a smaller robot, such as robotont, up a ledge.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Mirroring human hand movements on industrial robots ===&lt;br /&gt;
The goal of this project is to integrate continuous control of an industrial robot manipulator with a gestural telerobotics interface. The recommended tools for this thesis project are a Leap Motion Controller or a standard web camera, Ultraleap, a Universal Robots UR5 manipulator, and ROS.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: TeMoto for robotont ===&lt;br /&gt;
Swarm-management and UMRF-based task loading for robotont using [https://github.com/temoto-framework TeMoto] framework.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Enhancing teleoperation control interface with augmented cues to provoke caution===&lt;br /&gt;
The task is to create a telerobot control interface where the video feed from the remote site and/or a mixed-reality scene is augmented with visual cues to provoke caution in the human operator.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Robot-to-human interaction ===&lt;br /&gt;
As robots and autonomous machines start sharing the same space as humans, their actions need to be understood by the people occupying that space. For instance, a human worker needs to understand what the robot partner is planning next, or a pedestrian needs to clearly comprehend the behaviour of a driverless vehicle. To reduce this ambiguity, the robot needs mechanisms to convey its intent (whatever it is going to do next). The aim of the thesis is to outline existing methods for machines to convey their intent and to develop a unified model interface for expressing that intent. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Gaze-based handover prediction ===&lt;br /&gt;
When a human needs to pass an object to a robot manipulator, the robot must understand where in 3D space the object handover occurs and then plan an appropriate motion. Human gaze can be used as the input for predicting which object to track. The thesis activities involve camera-based eye tracking and safe motion planning.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: Human-height human-robot interface for Robotont ground robot ===&lt;br /&gt;
Robotont is an ankle-high flat mobile robot. For humans to interact with Robotont, there is a need for a compact and lightweight mechanical structure that is tall enough for comfortable human-robot interaction. The objective of the thesis is to develop the mechanical designs and build prototypes that ensure stable operation and meet the aesthetic requirements for use in public places. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Stratos Explore Ultraleap demonstrator for robotics ===&lt;br /&gt;
The aim of this thesis is to systematically analyze the strengths and limitations of the Stratos Explore Ultraleap device in the context of controlling a robot, and subsequently to implement a demonstrator application showing off its applicability for robot control. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Mixed-reality scene creation for vehicle teleoperation ===&lt;br /&gt;
Fusing different sensory feeds to create a high-usability teleoperation scene.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Validation study for AR-based robot user-interfaces ===&lt;br /&gt;
Designing and carrying out a user study to validate the functionality and usability of a human-robot interface.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Quantitative Evaluation of the Situation Awareness during the Teleoperation of an Urban Vehicle ===&lt;br /&gt;
The goal of this thesis is to develop the methodology for measuring the operator's situation awareness while operating an autonomous urban vehicle. Potential metrics could include driving accuracy, speed, and latency. This thesis will be conducted as part of the activities at the [https://adl.cs.ut.ee Autonomous Driving Lab].&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Sign-language-based control for robots ===&lt;br /&gt;
The goal of this thesis is to design and implement the means for interacting with a robot via conventional sign language, i.e. the human can command a robot by using sign language.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== LLM-based task planning for robots ===&lt;br /&gt;
The goal of this thesis is to enable high-level task planning for autonomous robots performing general-purpose tasks. The thesis would leverage an LLM's reasoning and the TeMoto Action Engine to achieve natural task negotiation, planning, and execution. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: Development of CoppeliaSim-based simulation and a digital twin ===&lt;br /&gt;
The objective of the thesis is to develop an easy-to-use simulation for Robotont gen3 using CoppeliaSim (https://www.coppeliarobotics.com/).&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
= Completed projects =&lt;br /&gt;
== PhD theses ==&lt;br /&gt;
*Robert Valner, [https://hdl.handle.net/10062/105994 Design of TeMoto, a software framework for dependable, adaptive, and collaborative autonomous robots] [TeMoto – töökindlate, adaptiivsete ja koostöövõimeliste autonoomsete robotite arendamise tarkvararaamistik], PhD thesis, 2024&lt;br /&gt;
*Houman Masnavi, [https://hdl.handle.net/10062/91394 Visibility aware navigation] [Nähtavust arvestav navigatsioon], PhD thesis, 2023&lt;br /&gt;
&lt;br /&gt;
== Master's theses ==&lt;br /&gt;
*Asier Mandiola Arrizabalaga, Haptic-Based Teleoperation of Robots, MS thesis, 2025&lt;br /&gt;
*Dāvis Krūminš, Web-based Robotics Lab for Effortless ROS2 Development, MS thesis, 2025&lt;br /&gt;
*Julian Rene Leclerc, Natural Language Human-Robot Interaction: A Modular Framework for Conversational Robot Control Using Large Language Models, MS thesis, 2025&lt;br /&gt;
*Miriam Calafa’, Designing Multimodal Emotional Expression for a Robotic Study Companion, MS thesis, 2025&lt;br /&gt;
*Iryna Hurova, Model-based Planning Using GPU-accelerated Simulator as a World Model, MS thesis, 2025&lt;br /&gt;
*Sander Toma Võrk, Elektrooniliste termomeetrite automaatse kalibreerimissüsteemi väljatöötamine ja rakendamine, MS thesis, 2025&lt;br /&gt;
*Carl Hjalmar Love Hult, SurfMotion: An Open Source Pipeline for Robotic Pipe Cutting and Welding, MS thesis, 2025&lt;br /&gt;
*Paola Avalos Conchas, Socially Aware Planning for Indoor Navigation, MS thesis, 2025&lt;br /&gt;
*Robert Allik, Validation of NoMaD as a Global Planner for Mobile Robots, MS thesis, 2024&lt;br /&gt;
*Rauno Põlluäär, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=79393 Designing and Implementing a Bird’s-eye View Interface for a Self-driving Vehicle’s Teleoperation System] [Isejuhtiva sõiduki kaugjuhtimissüsteemile linnuvaate kasutajaliidese loomine], MS thesis, 2024&lt;br /&gt;
*Gautier Reynes, VR-Enhanced Remote Inspection Framework for Semi-Autonomous Robot Fleet, MS thesis, 2024&lt;br /&gt;
*Eva Mõtshärg, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=77385 3D-prinditava kere disain ja analüüs vabavaralisele haridusrobotile Robotont] [Design and Analysis of a 3D Printable Chassis for the Open Source Educational Robot Robotont], MS thesis, 2023&lt;br /&gt;
*Farnaz Baksh, [https://dspace.ut.ee/items/a7a9cc15-27e9-450c-94e8-3267f0c95c56 An Open-source Robotic Study Companion for University Students] [Avatud lähtekoodiga robotõpikaaslane üliõpilastele], MS thesis, 2023&lt;br /&gt;
*Igor Rybalskii, [http://hdl.handle.net/10062/83028 Augmented reality (AR) for enabling human-robot collaboration with ROS robots] [Liitreaalsus inimese ja roboti koostöö võimaldamiseks ROS-i robotitega], MS thesis, 2022&lt;br /&gt;
*Md. Maniruzzaman, [http://hdl.handle.net/10062/83025 Object search and retrieval in indoor environment using a Mobile Manipulator] [Objektide otsimine ja teisaldamine siseruumides mobiilse manipulaatori abil], MS thesis, 2022&lt;br /&gt;
*Allan Kustavus, [http://hdl.handle.net/10062/72651 Design and Implementation of a Generalized Resource Management Architecture in the TeMoto Software Framework] [Üldise ressursihalduri disain ja teostus TeMoto tarkvara raamistikule], MS thesis, 2021&lt;br /&gt;
*Kristina Meister, [http://hdl.handle.net/10062/72350 External human-vehicle interaction - a study in the context of an autonomous ride-hailing service], MS thesis, 2021&lt;br /&gt;
*Muhammad Usman, [http://hdl.handle.net/10062/72126 Development of an Optimization-Based Motion Planner and Its ROS Interface for a Non-Holonomic Mobile Manipulator] [Optimeerimisele baseeruva liikumisplaneerija arendamine ja selle ROSi liides mitteholonoomse mobiilse manipulaatori jaoks], MS thesis, 2020&lt;br /&gt;
*Maarika Oidekivi, [http://hdl.handle.net/10062/72119 Masina kavatsuse väljendamine ja tõlgendamine] [Communicating and interpreting machine intent], MS thesis, 2020&lt;br /&gt;
*Houman Masnavi, [http://hdl.handle.net/10062/72118 Multi-Robot Motion Planning for Shared Payload Transportation] [Rajaplaneerimine multi-robot süsteemile jagatud lasti transportimisel], MS thesis, 2020&lt;br /&gt;
*Fabian Ernesto Parra Gil, [http://hdl.handle.net/10062/72112 Implementation of Robot Manager Subsystem for Temoto Software Framework] [Robotite Halduri alamsüsteemi väljatöötamine tarkvararaamistikule TEMOTO], MS thesis, 2020&lt;br /&gt;
*Zafarullah, [http://hdl.handle.net/10062/72125 Gaze Assisted Neural Network based Prediction of End-Point of Human Reaching Trajectories], MS thesis, 2020&lt;br /&gt;
*Madis K Nigol, [http://hdl.handle.net/10062/64339 Õppematerjalid robotplatvormile Robotont] [Study materials for robot platform Robotont], MS thesis, 2019&lt;br /&gt;
*Renno Raudmäe, [http://hdl.handle.net/10062/64341 Avatud robotplatvorm Robotont] [Open source robotics platform Robotont], MS thesis, 2019&lt;br /&gt;
*Asif Sattar, [http://hdl.handle.net/10062/64352 Human detection and distance estimation with monocular camera using YOLOv3 neural network] [Inimeste tuvastamine ning kauguse hindamine kasutades kaamerat ning YOLOv3 tehisnärvivõrku], MS thesis, 2019&lt;br /&gt;
*Ragnar Margus, [http://hdl.handle.net/10062/64337 Kergliiklusvahendite jagamisteenuseks vajaliku positsioneerimismooduli loomine ja uurimine] [Development and testing of an IoT module for an electric vehicle sharing service], MS thesis, 2019&lt;br /&gt;
*Pavel Šumejko, [http://hdl.handle.net/10062/64320 Robust Solution for Extrinsic Calibration of a 2D Laser-Rangefinder and a Monocular USB Camera] [Meetod 2D laserkaugusmõõdiku ja USB-kaamera väliseks kalibreerimiseks], MS thesis, 2019&lt;br /&gt;
*Dzvezdana Arsovska, [http://hdl.handle.net/10062/64321  Building an Efficient and Secure Software Supply Pipeline for Aerial Robotics Application] [Efektiivne ja turvaline tarkvaraarendusahel lennurobootika rakenduses], MS thesis, 2019&lt;br /&gt;
*Tõnis Tiimus, [http://hdl.handle.net/10062/60713 Visuaalselt esilekutsutud potentsiaalidel põhinev aju-arvuti liides robootika rakendustes] [A VEP-based BCI for robotics applications], MS thesis, 2018&lt;br /&gt;
*Martin Appo, [http://hdl.handle.net/10062/60718 Hardware-agnostic compliant control ROS package for collaborative industrial manipulators] [Riistvarapaindlik ROSi tarkvarapakett tööstuslike robotite mööndlikuks juhtimiseks], MS thesis, 2018&lt;br /&gt;
*Hassan Mahmoud Shehawy Elhanash, [http://hdl.handle.net/10062/60711 Optical Tracking of Forearm for Classifying Fingers Poses] [Küünarvarre visuaalne jälgimine ennustamaks sõrmede asendeid], MS thesis, 2018&lt;br /&gt;
&lt;br /&gt;
== Bachelor's theses ==&lt;br /&gt;
*Oliver Voorel, Humanoidrobot Semuboti toitesüsteemi uuendamine, BS thesis, 2025&lt;br /&gt;
*Karl Rahn, Millimeeterlaine radari integreerimine avatud lähtekoodiga muruniiduki platvormil Open Mower, BS thesis, 2025&lt;br /&gt;
*Mattias Mäe, Sõiduki kaugjuhtimise viivituse mõõtmine, BS thesis, 2025&lt;br /&gt;
*Jürgen Kottise, Dockeri konteineritel põhinev haldustarkvara Robotont 3 õpperobotile, BS thesis, 2025&lt;br /&gt;
*Martin Kaur, Servodel põhinev sotsiaalse robotkäe süsteem SemuBotile, BS thesis, 2025&lt;br /&gt;
*Karl-Jürgen Siilak, Mass Portal Grand Pharaoh XD 3D-printeri uuendamine, BS thesis, 2025&lt;br /&gt;
*Usman Ali Khan, Graphical programming interface for ROBOTONT, an open-source educational robot, BS thesis, 2025&lt;br /&gt;
*Robina Zvirgzdina, Development of an Autonomous Open-Source Inventory Performance Robot for the University of Tartu Library, BS thesis, 2025&lt;br /&gt;
*Omar Huseynli, ROS integration for the Semubot robot, BS thesis, 2025&lt;br /&gt;
*Albert Unn, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=79937 Suhtlusvõimekuse arendamine sotsiaalsele humanoidrobotile SemuBot] [Developing the ability to communicate for SemuBot, a social humanoid robot], BS thesis, 2024&lt;br /&gt;
*Veronika-Marina Volynets, [https://hdl.handle.net/10062/99943 Development of Control Electronics and Program for Robotont's Height Adjustment Mechanism] [Juhtelektroonika ja programmi väljatöötamine Robotondi kõrguse reguleerimise mehhanismile], BS thesis, 2024&lt;br /&gt;
*Nikita Kurenkov, [https://hdl.handle.net/10062/99880 Flexible Screen Integration and Development of Neck Movement Mechanism for Social Humanoid Robot SemuBot] [Humanoidroboti näo lahenduse leidmine ja rakendamine; humanoidroboti jaoks kaela mehhanismi väljatöötamine], BS thesis, 2024&lt;br /&gt;
*Elchin Huseynov, [https://hdl.handle.net/10062/99668 Design and Control of a Social Humanoid Robot - SemuBot’s Hand] [Sotsiaalse Humanoidroboti Disain ja Juhtimine - SemuBoti Käsi], BS thesis, 2024&lt;br /&gt;
*Veronika Podliesnova, [https://hdl.handle.net/10062/99636 Real-Time Detection of Robot Failures by Monitoring Operator’s Brain Activity with EEG-based Brain-Computer Interface] [Reaalajas Robotirikke Tuvastamine Operaatori Ajutegevuse Jälgimise teel EEG-põhise Aju-Arvuti Liidese abil], BS thesis, 2024&lt;br /&gt;
*Märten Josh Peedimaa, [https://hdl.handle.net/not-yet Rhea: An Open-Source Table Tennis Ball Launcher Robot for Multiball Training] [Avatud lähtekoodiga lauatennise palliviske robot mitmikpall treenimiseks], BS thesis, 2024&lt;br /&gt;
*Karl Sander Vinkel, [https://hdl.handle.net/not-yet Laadimisjaama ja transportkesta väljatöötamine õpperobotile Robotont] [Development of charging dock and transport case for Robotont], BS thesis, 2024&lt;br /&gt;
*Raimo Köidam, [https://hdl.handle.net/not-yet Valguslahenduse tarkvara väljatöötamine õpperobotile Robotont] [Development of light solution software for the educational robot Robotont], BS thesis, 2024&lt;br /&gt;
*Robert Valge, [https://hdl.handle.net/not-yet Toitepinge ja tarbevoolu monitoorimine ning toitehalduse püsivara loomine õpperobotil Robotont] [Monitoring supply voltage and current consumption and creating firmware for Robotont’s power management system], BS thesis, 2024&lt;br /&gt;
*Leonid Tšigrinski, [https://hdl.handle.net/not-yet Õpperoboti Robotont püsivara arhitektuuri uuendamine] [Education robot &amp;quot;Robotont&amp;quot; firmware architecture updating], BS thesis, 2024&lt;br /&gt;
*Andres Sakk, [https://hdl.handle.net/not-yet Avatud robotplatvormi Robotont 3 kasutajaliidese väljatöötamine] [Development of a user interface for the open robotics platform Robotont 3], BS thesis, 2024&lt;br /&gt;
*Kaur Kullamäe, [https://hdl.handle.net/not-yet Elektroonikalahendus ja püsivara robotkäe juhtimiseks sotsiaalsel humanoidrobotil SemuBot] [Building the SemuBot arm electronics system], BS thesis, 2024&lt;br /&gt;
*Kristjan Madis Kask, [https://hdl.handle.net/not-yet Käe mehaanika disain sotsiaalsele humanoidrobotile SemuBot] [Arm mechanics design for social humanoid robot SemuBot], BS thesis, 2024&lt;br /&gt;
*Sven-Ervin Paap, [https://hdl.handle.net/not-yet ROS2 draiver õpperobotile Robotont] [ROS2 driver for the educational robot Robotont], BS thesis, 2024&lt;br /&gt;
*Timur Nizamov, [https://hdl.handle.net/10062/99592 Audio System for the Social Humanoid Robot SemuBot] [Helisüsteem sotsiaalsele humanoidrobotile SemuBot], BS thesis, 2024&lt;br /&gt;
*Georgs Narbuts, [https://hdl.handle.net/10062/99554 Holonomic Motion Drive System of a Social Humanoid Robot SemuBot] [Sotsiaalse humanoidroboti SemuBoti holonoomne liikuv ajamisüsteem], BS thesis, 2024&lt;br /&gt;
*Iryna Hurova, [https://hdl.handle.net/10062/90638 Kitting station of the learning factory] [Õppetehase komplekteerimisjaam], BS thesis, 2023&lt;br /&gt;
*Paola Avalos Conchas, [https://hdl.handle.net/10062/90641 Payload transportation system of a learning factory] [Õppetehase kasuliku koorma transpordisüsteem], BS thesis, 2023&lt;br /&gt;
*Pille Pärnalaas, [https://hdl.handle.net/10062/93431 Pöördpõik ajami arendus robotplatvormile Robotont] [Development of swerve drive for robotic platform Robotont], BS thesis, 2023&lt;br /&gt;
*Carl Hjalmar Love Hult, [https://hdl.handle.net/10062/90673 Using and Evaluating the Real-time Spatial Perception System Hydra in Real-world Scenarios] [Reaalajas toimiva ruumilise taju süsteemi Hydra kasutamine ja hindamine praktilistes stsenaariumides], BS thesis, 2023&lt;br /&gt;
*Priit Rooden, [https://hdl.handle.net/10062/93432 Autonoomse laadimislahenduse väljatöötamine õpperobotile Robotont] [Development of an autonomous charging solution for the robot platform Robotont], BS thesis, 2023&lt;br /&gt;
*Marko Muro, [https://hdl.handle.net/10062/93429 Robotondi akulahenduse ning 12 V pingeregulaatori prototüüpimine] [Prototyping battery solution and 12 V voltage regulator for Robotont], BS thesis, 2023&lt;br /&gt;
*Danel Leppenen, [https://hdl.handle.net/10062/93427 Nav2 PYIF: Python-based Motion Planning for ROS 2 Navigation 2] [Nav 2 PYIF: Pythoni põhine liikumise planeerija ROS 2 Navigation 2-le], BS thesis, 2023&lt;br /&gt;
*Kertrud Geddily Küüt, [https://hdl.handle.net/10062/93424 Kõrgust reguleeriv mehhanism Robotondile] [Height adjusting mechanism for Robotont], BS thesis, 2023&lt;br /&gt;
*Ingvar Drikkit, [https://hdl.handle.net/10062/93421 Lisaseadmete võimekuse arendamine haridusrobotile Robotont] [Developing add-on device support for the educational robot Robotont], BS thesis, 2023&lt;br /&gt;
*Kristo Pool, [https://hdl.handle.net/10062/93430 MoveIt 2 õppematerjalid] [Learning materials for MoveIt 2], BS thesis, 2023&lt;br /&gt;
*Erki Veeväli, [https://hdl.handle.net/10062/93434 Development of a Continuous Teleoperation System for Urban Road Vehicle] [Linnasõiduki pideva kaugjuhtimissüsteemi arendus], BS thesis, 2023&lt;br /&gt;
*Aleksandra Doroshenko, [https://hdl.handle.net/10062/93420 Haiglates inimesi juhatava roboti disain] [Hospital guide robot design], BS thesis, 2023&lt;br /&gt;
*Hui Shi, [http://hdl.handle.net/10062/83043 Expanding the Open-source ROS Software Package opencv_apps with Dedicated Blob Detection Functionality] [Avatud lähtekoodiga ROS-i tarkvarakimbu opencv_apps laiendamine laigutuvasti funktsionaalsusega], BS thesis, 2022&lt;br /&gt;
*Dāvis Krūmiņš, [http://hdl.handle.net/10062/83040 Web-based learning and software development environment for remote access of ROS robots] [Veebipõhine õppe- ja tarkvaraarenduse keskkond ROS robotite juurdepääsuks kaugteel], BS thesis, 2022&lt;br /&gt;
*Anna Jakovleva, [http://hdl.handle.net/10062/83037 Roboquiz - an interactive human-robot game] [Roboquiz - interaktiivne inimese ja roboti mäng], BS thesis, 2022&lt;br /&gt;
*Kristjan Laht, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=74702&amp;amp;year=2022 Robot Localization with Fiducial Markers] [Roboti lokaliseerimine koordinaatmärkidega], BS thesis, 2022&lt;br /&gt;
*Hans Pärtel Pani, [http://hdl.handle.net/10062/83008 ROS draiver pehmerobootika haaratsile] [ROS Driver for Soft Robotic Gripper], BS thesis, 2022&lt;br /&gt;
*Markus Erik Sügis, [http://hdl.handle.net/10062/83015 Jagatud juhtimise põhimõttel realiseeritud robotite kaugjuhtimissüsteem] [A continuous teleoperating system based on shared control concept], BS thesis, 2022&lt;br /&gt;
*Taaniel Küla, [http://hdl.handle.net/10062/83007 ROS2 platvormile keevitusroboti tarkvara portotüübi loomine kasutades UR5 robot-manipulaatorit] [Welding robot software prototype for ROS2 using UR5 robot arm], BS thesis, 2022&lt;br /&gt;
*Rauno Põlluäär, [http://hdl.handle.net/10062/72672 Veebirakendus-põhine kasutajaliides avatud robotplatvormi Robotont juhtimiseks ja haldamiseks] [Web application-based user interface for controlling and managing open-source robotics platform Robotont], BS thesis, 2021&lt;br /&gt;
*Hendrik Olesk, [http://hdl.handle.net/10062/72664 Nägemisulatuses kaugjuhitava mobiilse robotmanipulaatori kasutajamugavuse tõstmine] [Improving the usability of a mobile manipulator robot for line-of-sight remote control], BS thesis, 2021&lt;br /&gt;
*Tarvi Tepandi, [http://hdl.handle.net/10062/72665 Segareaalsusel põhinev kasutajaliides mobiilse roboti kaugjuhtimiseks Microsoft HoloLens 2 vahendusel] [Mixed-reality user interface for teleoperating mobile robots with Microsoft HoloLens 2], BS thesis, 2021&lt;br /&gt;
*Rudolf Põldma, [http://hdl.handle.net/10062/72674 Tartu linna Narva maantee ringristmiku digikaksik] [Digital twin for Narva street roundabout in Tartu], BS thesis, 2021&lt;br /&gt;
*Kwasi Akuamoah Boateng, [http://hdl.handle.net/10062/72804 Digital Twin of a Teaching and Learning Robotics Lab] [Robotite õpetamise ja õppimise labori digitaalne kaksik], BS thesis, 2021&lt;br /&gt;
*Karina Sein, [http://hdl.handle.net/10062/72102 Eestikeelse kõnesünteesi võimaldamine robootika arendusplatvormil ROS] [Enabling Estonian speech synthesis on the Robot Operating System (ROS)], BS thesis, 2020&lt;br /&gt;
*Ranno Mäesepp, [http://hdl.handle.net/10062/72100 Takistuste vältimise lahendus õpperobotile Robotont] [Obstacle avoidance solution for educational robot platform Robotont], BS thesis, 2020&lt;br /&gt;
*Igor Rybalskii, [http://hdl.handle.net/10062/72060 Gesture Detection Software for Human-Robot Collaboration] [Žestituvastus tarkvara inimese ja roboti koostööks], BS thesis, 2020&lt;br /&gt;
*Meelis Pihlap, [http://hdl.handle.net/10062/64292 Mitme roboti koostöö funktsionaalsuste väljatöötamine tarkvararaamistikule TeMoto] [Multi-robot collaboration functionalities for robot software development framework TeMoto], BS thesis, 2019&lt;br /&gt;
*Kaarel Mark, [http://hdl.handle.net/10062/64290 Liitreaalsuse kasutamine tootmisprotsessis asukohtade määramisel] [Augmented reality for location determination in manufacturing], BS thesis, 2019&lt;br /&gt;
*Kätriin Julle, [http://hdl.handle.net/10062/64279 Roboti KUKA youBot riistvara ja ROS-tarkvara uuendamine] [Upgrading robot KUKA youBot’s hardware and ROS-software], BS thesis, 2019&lt;br /&gt;
*Georg Astok, [http://hdl.handle.net/10062/64274 Roboti juhtimine virtuaalreaalsuses kasutades ROS-raamistikku] [Creating virtual reality user interface using only ROS framework], BS thesis, 2019&lt;br /&gt;
*Martin Hallist, [http://hdl.handle.net/10062/64275 Robotipõhine kaugkohalolu käte liigutuste ülekandmiseks] [Teleoperation robot for arms motions], BS thesis, 2019&lt;br /&gt;
*Ahmed Hassan Helmy Mohamed, [http://hdl.handle.net/10062/63946 Software integration of autonomous robot system for mixing and serving drinks] [Jooke valmistava ja serveeriva robotsüsteemi tarkvaralahendus], BS thesis, 2019&lt;br /&gt;
*Kristo Allaje, [http://hdl.handle.net/10062/60294 Draiveripakett käte jälgimise seadme Leap Motion™ kontroller kasutamiseks robootika arendusplatvormil ROS] [Driver package for using Leap Motion™ controller on the robotics development platform ROS], BS thesis, 2018&lt;br /&gt;
*Martin Maidla, [http://hdl.handle.net/10062/60288 Avatud robotiarendusplatvormi Robotont omniliikumise ja odomeetria arendamine] [Omnimotion and odometry development for open robot development platform Robotont], BS thesis, 2018&lt;br /&gt;
*Raid Vellerind, [http://hdl.handle.net/10062/56559 Avatud robotiarendusplatvormi ROS võimekuse loomine Tartu Ülikooli Robotexi robootikaplatvormile] [ROS driver development for the University of Tartu’s Robotex robotics platform], BS thesis, 2017&lt;br /&gt;
*Kalle-Gustav Kruus, [http://hdl.handle.net/10062/32440 Jalgpalliroboti löögimehhanismi elektroonikalahendus] [Driver circuit for a kicking mechanism of a football robot], BS thesis, 2013&lt;br /&gt;
*Sven Kautlenbach, [http://hdl.handle.net/10062/25570 Autonoomne seade elastsusmooduli mõõtmiseks] [Electronic device for Young’s modulus measurements], BS thesis, 2011&lt;br /&gt;
&lt;br /&gt;
[[Category:Theses Topics]]&lt;/div&gt;</summary>
		<author><name>Karl</name></author>
	</entry>
	<entry>
		<id>https://ims.ut.ee/index.php?title=Theses_in_Robotics&amp;diff=43113</id>
		<title>Theses in Robotics</title>
		<link rel="alternate" type="text/html" href="https://ims.ut.ee/index.php?title=Theses_in_Robotics&amp;diff=43113"/>
		<updated>2025-07-09T21:20:47Z</updated>

		<summary type="html">&lt;p&gt;Karl: /* Masters's theses */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:IMS-robotics]]&lt;br /&gt;
&amp;lt;div class=&amp;quot;toclimit-3&amp;quot;&amp;gt;__TOC__&amp;lt;/div&amp;gt; &lt;br /&gt;
= Projects in Advanced Robotics =&lt;br /&gt;
''The main objective of the following projects is to give students experience in working with advanced robotics technology. Our group is active in several R&amp;amp;D projects involving human-robot collaboration, intuitive teleoperation of robots, and autonomous navigation of unmanned mobile platforms. Our main software platforms are [http://www.ros.org/ Robot Operating System (ROS)] for developing software for advanced robot systems and [http://gazebosim.org/ Gazebo] for running realistic robotic simulations.''&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
For further information, contact [[User:Karl|Karl Kruusamäe]]. &lt;br /&gt;
&lt;br /&gt;
The following is not an exhaustive list of all available thesis/research topics.&lt;br /&gt;
&lt;br /&gt;
== Highlighted theses topics for 2025/2026 study year ==&lt;br /&gt;
# [[#LLM-based task planning for robots|LLM-based task planning for robots]]&lt;br /&gt;
# [[#ROBOTONT: analysis of different options as on-board computers|ROBOTONT: analysis of different options as on-board computers]]&lt;br /&gt;
# [[#SemuBOT: multiple topics|SemuBOT: multiple topics]]&lt;br /&gt;
# [[#ROBOTONT: integrating a graphical programming interface|ROBOTONT: integrating a graphical programming interface]]&lt;br /&gt;
# [[#Sign-language-based control for robots|Sign-language-based control for robots]]&lt;br /&gt;
# [[#Robotic Study Companion: a social robot for students in higher education|Robotic Study Companion: a social robot for students in higher education]]&lt;br /&gt;
# [[#Quantitative Evaluation of the Situation Awareness during the Teleoperation of an Urban Vehicle|Quantitative Evaluation of the Situation Awareness during the Teleoperation of an Urban Vehicle]]&lt;br /&gt;
&lt;br /&gt;
== List of potential thesis topics ==&lt;br /&gt;
Our inventory includes but is not limited to:&lt;br /&gt;
&amp;lt;gallery mode=&amp;quot;packed&amp;quot;&amp;gt;&lt;br /&gt;
File:Youbot.png|120px|thumb|KUKA youBot&lt;br /&gt;
File:Ur5_left.png|120px|thumb|Universal Robot UR5&lt;br /&gt;
File:Franka_Emika_Panda.jpg|120px|thumb|Franka Emika Panda&lt;br /&gt;
File:Clearpath_Jackal.jpg|120px|thumb|Clearpath Jackal&lt;br /&gt;
File:Kinova_KG-3_Gripper.jpg|120px|thumb|Kinova 3-finger gripper&lt;br /&gt;
File:Xarm7.jpg|120px|thumb|UFACTORY xArm7&lt;br /&gt;
File:Robotont_banner.png|120px|thumb|robotont&lt;br /&gt;
File:Turtlebot3-waffle-pi.jpg|thumb|TurtleBot3&lt;br /&gt;
File:Parrot-bebop-2.jpg|thumb|Parrot Bebop 2&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== ROBOTONT: Docker-Driven ROS Environment Switching ===&lt;br /&gt;
This thesis focuses on integrating Docker containers to manage and switch between different ROS (Robot Operating System) environments on Robotont. Currently, ROS is installed natively on Robotont's Ubuntu-based onboard computer, limiting flexibility in system recovery and configuration switching. The goal is to develop a Docker-based solution that allows users to switch between ROS environments directly from Robotont’s existing low-level menu interface.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: designing and implementing a communication protocol for additional devices ===&lt;br /&gt;
Robotont currently includes a designated area on its front for attaching additional devices, such as an ultrasonic range finder, a servo motor connected to an Arduino, or a MikroBUS device. This thesis aims to design and implement a standard communication protocol that will enable seamless integration and control of these devices through the high-level ROS framework running on Robotont’s onboard computer.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: analysis of different options as on-board computers ===&lt;br /&gt;
Currently, ROBOTONT uses an Intel NUC as its onboard computer. The goal of this thesis is to validate Robotont's software stack on alternative compute devices (e.g. Raspberry Pi 4, Intel Compute Stick, and NVIDIA Jetson Nano) and benchmark their performance on the most popular Robotont use-case demos (e.g. webapp teleop, AR-tag steering, 2D mapping, and 3D mapping). The objective is to propose at least two different compute solutions: one that optimizes for cost and another that optimizes for performance. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT Lite ===&lt;br /&gt;
The goal of this thesis is to optimize the ROBOTONT platform for cost by replacing the onboard computer and sensors with low-cost alternatives while ensuring ROS/ROS2 software compatibility. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: integrating a graphical programming interface ===&lt;br /&gt;
The goal of this thesis is to integrate a graphical programming solution (e.g. Scratch or Blockly) to enable programming of Robotont by non-experts. See, e.g., https://doi.org/10.48550/arXiv.2011.13706&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== SemuBOT: multiple topics ===&lt;br /&gt;
In 2023/2024, many topics are offered to support the development of the open-source humanoid robot project SemuBOT (https://www.facebook.com/semubotmtu/)&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Robotic Study Companion: a social robot for students in higher education ===&lt;br /&gt;
Potential Topics:&lt;br /&gt;
* Enhance the Robot's Speech/Natural Language Capabilities &lt;br /&gt;
* Build a Local Language Model for the RSC&lt;br /&gt;
* Develop and Program the Robot’s Behavior and Personality&lt;br /&gt;
* Build a Digital Twin Simulation for Multimodal Interaction&lt;br /&gt;
* Explore the Use of the RSC as an Affective Robot to Address Students’ Academic Emotions&lt;br /&gt;
* Explore and Implement Cybersecurity Measures for Social Robot&lt;br /&gt;
&lt;br /&gt;
[https://github.com/orgs/RobotStudyCompanion/discussions/3 More info on Github] | reach out to farnaz.baksh@ut.ee for more info&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Virtual reality user interface (VRUI) for intuitive teleoperation system ===&lt;br /&gt;
[[Image:LeapMotion.png|200px|thumb|right|Detecting 2 hands with Leap Motion Controller]]&lt;br /&gt;
Enhancing the user-experience of a virtual reality UI developed by [https://github.com/ut-ims-robotics/vrui_rviz Georg Astok]. Potentially adding [http://www.osvr.org/hardware-devs.html virtual reality capability] to a [https://www.youtube.com/watch?v=L25HHFd00rc gesture- and natural-language-based robot teleoperation system].&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
[[Image:Temoto-working-principle-in-pics.png|x160px|Gesture-based teleoperation]]&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Modeling humans for human-robot interaction ===&lt;br /&gt;
True human-robot collaboration means that the robot must understand the actions, intention, and state of its human partner. This work involves using cameras and other human-tracking sensors to digitally represent and model humans. There are multiple stages of modeling: a) physical models of human kinematics and dynamics; b) higher-level models for recognizing human intent.&amp;lt;br&amp;gt;&lt;br /&gt;
[[Image:Skeletal ROS.PNG|x160px|ROS &amp;amp; Kinect &amp;amp; Skeleton-Markers Package]]&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Robotic avatar for telepresence ===&lt;br /&gt;
Integrating hand gestures and head movements to control a robot avatar through a virtual reality user interface.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Detection of hardware and software resources for smart integration of robots ===&lt;br /&gt;
The vast majority of today’s robotic applications rely on hard-coded device and algorithm usage. This project focuses on developing Resource Snooper software that can detect the addition or removal of resources for the benefit of dynamic reconfiguration of robotic systems. This project is developed as a subsystem of [https://utnuclearroboticspublic.github.io/temoto2/index.html TeMoto].&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Human-Robot and Robot-Robot collaboration applications ===&lt;br /&gt;
Creating a demo or analysis of software capabilities related to human-robot or robot-robot teams:&lt;br /&gt;
* human-robot collaborative assembly&lt;br /&gt;
* distributed mapping; analysis and demo of existing ROS (e.g., segmap https://youtu.be/JJhEkIA1xSE) packages for multi-robot mapping&lt;br /&gt;
* Inaccessible region teamwork&lt;br /&gt;
**youbot+drone - a drone maps the environment (for example, a maze) and a ground vehicle uses this information to traverse the maze&lt;br /&gt;
**youbot+robotont - a youBot cannot climb ledges, but it can lift a smaller robot, such as Robotont, up a ledge.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Mirroring human hand movements on industrial robots ===&lt;br /&gt;
The goal of this project is to integrate continuous control of an industrial robot manipulator with a gestural telerobotics interface. The recommended tools for this thesis project are a Leap Motion Controller or a standard web camera, Ultraleap, a Universal Robot UR5 manipulator, and ROS.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: TeMoto for robotont ===&lt;br /&gt;
Swarm-management and UMRF-based task loading for robotont using [https://github.com/temoto-framework TeMoto] framework.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Enhancing teleoperation control interface with augmented cues to provoke caution===&lt;br /&gt;
The task is to create a telerobot control interface where the video feed from the remote site and/or a mixed-reality scene is augmented with visual cues to provoke caution in the human operator.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Robot-to-human interaction ===&lt;br /&gt;
As robots and autonomous machines start sharing the same space as humans, their actions need to be understood by the people occupying that space. For instance, a human worker needs to understand what the robot partner is planning next, or a pedestrian needs to clearly comprehend the behaviour of a driverless vehicle. To reduce this ambiguity, the robot needs mechanisms to convey its intent (whatever it is going to do next). The aim of the thesis is to outline existing methods for machines to convey their intent and to develop a unified model interface for expressing that intent. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Gaze-based handover prediction ===&lt;br /&gt;
When a human needs to pass an object to a robot manipulator, the robot must understand where in 3D space the object handover occurs and then plan an appropriate motion. Human gaze can be used as the input for predicting which object to track. The thesis activities involve camera-based eye tracking and safe motion planning.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: Human-height human-robot interface for Robotont ground robot ===&lt;br /&gt;
Robotont is an ankle-high flat mobile robot. For humans to interact with Robotont, there is a need for a compact and lightweight mechanical structure that is tall enough for comfortable human-robot interaction. The objective of the thesis is to develop the mechanical designs and build prototypes that ensure stable operation and meet the aesthetic requirements for use in public places. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Stratos Explore Ultraleap demonstrator for robotics ===&lt;br /&gt;
The aim of this thesis is to systematically analyze the strengths and limitations of the Stratos Explore Ultraleap device in the context of controlling a robot, and subsequently to implement a demonstrator application showing its applicability for robot control. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Mixed-reality scene creation for vehicle teleoperation ===&lt;br /&gt;
Fusing different sensory feeds to create a high-usability teleoperation scene.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Validation study for AR-based robot user-interfaces ===&lt;br /&gt;
Designing and carrying out a user study to validate the functionality and usability of a human-robot interface.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Quantitative Evaluation of the Situation Awareness during the Teleoperation of an Urban Vehicle ===&lt;br /&gt;
The goal of this thesis is to develop the methodology for measuring the operator's situation awareness while operating an autonomous urban vehicle. Potential metrics could include driving accuracy, speed, and latency. This thesis will be conducted as part of the activities at the [https://adl.cs.ut.ee Autonomous Driving Lab].&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Sign-language-based control for robots ===&lt;br /&gt;
The goal of this thesis is to design and implement the means for interacting with a robot via conventional sign language, i.e. the human can command a robot by using sign language.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== LLM-based task planning for robots ===&lt;br /&gt;
The goal of this thesis is to enable high-level task planning for autonomous robots performing general-purpose tasks. The thesis would leverage an LLM's reasoning capabilities and the TeMoto Action Engine to achieve natural task negotiation, planning, and execution. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
= Completed projects =&lt;br /&gt;
== PhD theses ==&lt;br /&gt;
*Robert Valner, [https://hdl.handle.net/10062/105994 Design of TeMoto, a software framework for dependable, adaptive, and collaborative autonomous robots] [TeMoto – töökindlate, adaptiivsete ja koostöövõimeliste autonoomsete robotite arendamise tarkvararaamistik], PhD thesis, 2024&lt;br /&gt;
*Houman Masnavi, [https://hdl.handle.net/10062/91394 Visibility aware navigation] [Nähtavust arvestav navigatsioon], PhD thesis, 2023&lt;br /&gt;
&lt;br /&gt;
== Master's theses ==&lt;br /&gt;
*Asier Mandiola Arrizabalaga, Haptic-Based Teleoperation of Robots, MS thesis, 2025&lt;br /&gt;
*Dāvis Krūminš, Web-based Robotics Lab for Effortless ROS2 Development, MS thesis, 2025&lt;br /&gt;
*Julian Rene Leclerc, Natural Language Human-Robot Interaction: A Modular Framework for Conversational Robot Control Using Large Language Models, MS thesis, 2025&lt;br /&gt;
*Miriam Calafa’, Designing Multimodal Emotional Expression for a Robotic Study Companion, MS thesis, 2025&lt;br /&gt;
*Iryna Hurova, Model-based Planning Using GPU-accelerated Simulator as a World Model, MS thesis, 2025&lt;br /&gt;
*Sander Toma Võrk, Elektrooniliste termomeetrite automaatse kalibreerimissüsteemi väljatöötamine ja rakendamine [Development and implementation of an automatic calibration system for electronic thermometers], MS thesis, 2025&lt;br /&gt;
*Carl Hjalmar Love Hult, SurfMotion: An Open Source Pipeline for Robotic Pipe Cutting and Welding, MS thesis, 2025&lt;br /&gt;
*Paola Avalos Conchas, Socially Aware Planning for Indoor Navigation, MS thesis, 2025&lt;br /&gt;
*Robert Allik, Validation of NoMaD as a Global Planner for Mobile Robots, MS thesis, 2024&lt;br /&gt;
*Rauno Põlluäär, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=79393 Designing and Implementing a Bird’s-eye View Interface for a Self-driving Vehicle’s Teleoperation System] [Isejuhtiva sõiduki kaugjuhtimissüsteemile linnuvaate kasutajaliidese loomine], MS thesis, 2024&lt;br /&gt;
*Gautier Reynes, VR-Enhanced Remote Inspection Framework for Semi-Autonomous Robot Fleet, MS thesis, 2024&lt;br /&gt;
*Eva Mõtshärg, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=77385 3D-prinditava kere disain ja analüüs vabavaralisele haridusrobotile Robotont] [Design and Analysis of a 3D Printable Chassis for the Open Source Educational Robot Robotont], MS thesis, 2023&lt;br /&gt;
*Farnaz Baksh, [https://dspace.ut.ee/items/a7a9cc15-27e9-450c-94e8-3267f0c95c56 An Open-source Robotic Study Companion for University Students] [Avatud lähtekoodiga robotõpikaaslane üliõpilastele], MS thesis, 2023&lt;br /&gt;
*Igor Rybalskii, [http://hdl.handle.net/10062/83028 Augmented reality (AR) for enabling human-robot collaboration with ROS robots] [Liitreaalsus inimese ja roboti koostöö võimaldamiseks ROS-i robotitega], MS thesis, 2022&lt;br /&gt;
*Md. Maniruzzaman, [http://hdl.handle.net/10062/83025 Object search and retrieval in indoor environment using a Mobile Manipulator] [Objektide otsimine ja teisaldamine siseruumides mobiilse manipulaatori abil], MS thesis, 2022&lt;br /&gt;
*Allan Kustavus, [http://hdl.handle.net/10062/72651 Design and Implementation of a Generalized Resource Management Architecture in the TeMoto Software Framework] [Üldise ressursihalduri disain ja teostus TeMoto tarkvara raamistikule], MS thesis, 2021&lt;br /&gt;
*Kristina Meister, [http://hdl.handle.net/10062/72350 External human-vehicle interaction - a study in the context of an autonomous ride-hailing service], MS thesis, 2021&lt;br /&gt;
*Muhammad Usman, [http://hdl.handle.net/10062/72126 Development of an Optimization-Based Motion Planner and Its ROS Interface for a Non-Holonomic Mobile Manipulator] [Optimeerimisele baseeruva liikumisplaneerija arendamine ja selle ROSi liides mitteholonoomse mobiilse manipulaatori jaoks], MS thesis, 2020&lt;br /&gt;
*Maarika Oidekivi, [http://hdl.handle.net/10062/72119 Masina kavatsuse väljendamine ja tõlgendamine] [Communicating and interpreting machine intent], MS thesis, 2020&lt;br /&gt;
*Houman Masnavi, [http://hdl.handle.net/10062/72118 Multi-Robot Motion Planning for Shared Payload Transportation] [Rajaplaneerimine multi-robot süsteemile jagatud lasti transportimisel], MS thesis, 2020&lt;br /&gt;
*Fabian Ernesto Parra Gil, [http://hdl.handle.net/10062/72112 Implementation of Robot Manager Subsystem for Temoto Software Framework] [Robotite Halduri alamsüsteemi väljatöötamine tarkvararaamistikule TEMOTO], MS thesis, 2020&lt;br /&gt;
*Zafarullah, [http://hdl.handle.net/10062/72125 Gaze Assisted Neural Network based Prediction of End-Point of Human Reaching Trajectories], MS thesis, 2020&lt;br /&gt;
*Madis K Nigol, [http://hdl.handle.net/10062/64339 Õppematerjalid robotplatvormile Robotont] [Study materials for robot platform Robotont], MS thesis, 2019&lt;br /&gt;
*Renno Raudmäe, [http://hdl.handle.net/10062/64341 Avatud robotplatvorm Robotont] [Open source robotics platform Robotont], MS thesis, 2019&lt;br /&gt;
*Asif Sattar, [http://hdl.handle.net/10062/64352 Human detection and distance estimation with monocular camera using YOLOv3 neural network] [Inimeste tuvastamine ning kauguse hindamine kasutades kaamerat ning YOLOv3 tehisnärvivõrku], MS thesis, 2019&lt;br /&gt;
*Ragnar Margus, [http://hdl.handle.net/10062/64337 Kergliiklusvahendite jagamisteenuseks vajaliku positsioneerimismooduli loomine ja uurimine] [Development and testing of an IoT module for electric vehicle sharing service], MS thesis, 2019&lt;br /&gt;
*Pavel Šumejko, [http://hdl.handle.net/10062/64320 Robust Solution for Extrinsic Calibration of a 2D Laser-Rangefinder and a Monocular USB Camera] [Meetod 2D laserkaugusmõõdiku ja USB-kaamera väliseks kalibreerimiseks], MS thesis, 2019&lt;br /&gt;
*Dzvezdana Arsovska, [http://hdl.handle.net/10062/64321 Building an Efficient and Secure Software Supply Pipeline for Aerial Robotics Application] [Efektiivne ja turvaline tarkvaraarendusahel lennurobootika rakenduses], MS thesis, 2019&lt;br /&gt;
*Tõnis Tiimus, [http://hdl.handle.net/10062/60713 Visuaalselt esilekutsutud potentsiaalidel põhinev aju-arvuti liides robootika rakendustes] [A VEP-based BCI for robotics applications], MS thesis, 2018&lt;br /&gt;
*Martin Appo, [http://hdl.handle.net/10062/60718 Hardware-agnostic compliant control ROS package for collaborative industrial manipulators] [Riistvarapaindlik ROSi tarkvarapakett tööstuslike robotite mööndlikuks juhtimiseks], MS thesis, 2018&lt;br /&gt;
*Hassan Mahmoud Shehawy Elhanash, [http://hdl.handle.net/10062/60711 Optical Tracking of Forearm for Classifying Fingers Poses] [Küünarvarre visuaalne jälgimine ennustamaks sõrmede asendeid], MS thesis, 2018&lt;br /&gt;
&lt;br /&gt;
== Bachelor's theses ==&lt;br /&gt;
*Oliver Voorel, Humanoidrobot Semuboti toitesüsteemi uuendamine [Upgrading the power system of the humanoid robot SemuBot], BS thesis, 2025&lt;br /&gt;
*Karl Rahn, Millimeeterlaine radari integreerimine avatud lähtekoodiga muruniiduki platvormil Open Mower [Integrating a millimeter-wave radar on the open-source lawnmower platform Open Mower], BS thesis, 2025&lt;br /&gt;
*Mattias Mäe, Sõiduki kaugjuhtimise viivituse mõõtmine [Measuring latency in vehicle teleoperation], BS thesis, 2025&lt;br /&gt;
*Jürgen Kottise, Dockeri konteineritel põhinev haldustarkvara Robotont 3 õpperobotile [Docker-container-based management software for the educational robot Robotont 3], BS thesis, 2025&lt;br /&gt;
*Martin Kaur, Servodel põhinev sotsiaalse robotkäe süsteem SemuBotile [Servo-based social robot arm system for SemuBot], BS thesis, 2025&lt;br /&gt;
*Karl-Jürgen Siilak, Mass Portal Grand Pharaoh XD 3D-printeri uuendamine [Upgrading the Mass Portal Grand Pharaoh XD 3D printer], BS thesis, 2025&lt;br /&gt;
*Usman Ali Khan, Graphical programming interface for ROBOTONT, an open-source educational robot, BS thesis, 2025&lt;br /&gt;
*Robina Zvirgzdina, Development of an Autonomous Open-Source Inventory Performance Robot for the University of Tartu Library, BS thesis, 2025&lt;br /&gt;
*Omar Huseynli, ROS integration for the Semubot robot, BS thesis, 2025&lt;br /&gt;
*Albert Unn, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=79937 Suhtlusvõimekuse arendamine sotsiaalsele humanoidrobotile SemuBot] [Developing the ability to communicate for SemuBot, a social humanoid robot], BS thesis, 2024&lt;br /&gt;
*Veronika-Marina Volynets, [https://hdl.handle.net/10062/99943 Development of Control Electronics and Program for Robotont's Height Adjustment Mechanism] [Juhtelektroonika ja programmi väljatöötamine Robotondi kõrguse reguleerimise mehhanismile], BS thesis, 2024&lt;br /&gt;
*Nikita Kurenkov, [https://hdl.handle.net/10062/99880 Flexible Screen Integration and Development of Neck Movement Mechanism for Social Humanoid Robot SemuBot] [Humanoidroboti näo lahenduse leidmine ja rakendamine; humanoidroboti jaoks kaela mehhanismi väljatöötamine], BS thesis, 2024&lt;br /&gt;
*Elchin Huseynov, [https://hdl.handle.net/10062/99668 Design and Control of a Social Humanoid Robot - SemuBot’s Hand] [Sotsiaalse Humanoidroboti Disain ja Juhtimine - SemuBoti Käsi], BS thesis, 2024&lt;br /&gt;
*Veronika Podliesnova, [https://hdl.handle.net/10062/99636 Real-Time Detection of Robot Failures by Monitoring Operator’s Brain Activity with EEG-based Brain-Computer Interface] [Reaalajas Robotirikke Tuvastamine Operaatori Ajutegevuse Jälgimise teel EEG-põhise Aju-Arvuti Liidese abil], BS thesis, 2024&lt;br /&gt;
*Märten Josh Peedimaa, [https://hdl.handle.net/not-yet Rhea: An Open-Source Table Tennis Ball Launcher Robot for Multiball Training] [Avatud lähtekoodiga lauatennise palliviske robot mitmikpall treenimiseks], BS thesis, 2024&lt;br /&gt;
*Karl Sander Vinkel, [https://hdl.handle.net/not-yet Laadimisjaama ja transportkesta väljatöötamine õpperobotile Robotont] [Development of charging dock and transport case for Robotont], BS thesis, 2024&lt;br /&gt;
*Raimo Köidam, [https://hdl.handle.net/not-yet Valguslahenduse tarkvara väljatöötamine õpperobotile Robotont] [Development of light solution software for the educational robot Robotont], BS thesis, 2024&lt;br /&gt;
*Robert Valge, [https://hdl.handle.net/not-yet Toitepinge ja tarbevoolu monitoorimine ning toitehalduse püsivara loomine õpperobotil Robotont] [Monitoring supply voltage and current consumption and creating firmware for Robotont’s power management system], BS thesis, 2024&lt;br /&gt;
*Leonid Tšigrinski, [https://hdl.handle.net/not-yet Õpperoboti Robotont püsivara arhitektuuri uuendamine] [Education robot &amp;quot;Robotont&amp;quot; firmware architecture updating], BS thesis, 2024&lt;br /&gt;
*Andres Sakk, [https://hdl.handle.net/not-yet Avatud robotplatvormi Robotont 3 kasutajaliidese väljatöötamine] [Development of a user interface for the open robotics platform Robotont 3], BS thesis, 2024&lt;br /&gt;
*Kaur Kullamäe, [https://hdl.handle.net/not-yet Elektroonikalahendus ja püsivara robotkäe juhtimiseks sotsiaalsel humanoidrobotil SemuBot] [Building the SemuBot arm electronics system], BS thesis, 2024&lt;br /&gt;
*Kristjan Madis Kask, [https://hdl.handle.net/not-yet Käe mehaanika disain sotsiaalsele humanoidrobotile SemuBot] [Arm mechanics design for social humanoid robot SemuBot], BS thesis, 2024&lt;br /&gt;
*Sven-Ervin Paap, [https://hdl.handle.net/not-yet ROS2 draiver õpperobotile Robotont] [ROS2 driver for the educational robot Robotont], BS thesis, 2024&lt;br /&gt;
*Timur Nizamov, [https://hdl.handle.net/10062/99592 Audio System for the Social Humanoid Robot SemuBot] [Helisüsteem sotsiaalsele humanoidrobotile SemuBot], BS thesis, 2024&lt;br /&gt;
*Georgs Narbuts, [https://hdl.handle.net/10062/99554 Holonomic Motion Drive System of a Social Humanoid Robot SemuBot] [Sotsiaalse humanoidroboti SemuBoti holonoomne liikuv ajamisüsteem], BS thesis, 2024&lt;br /&gt;
*Iryna Hurova, [https://hdl.handle.net/10062/90638 Kitting station of the learning factory] [Õppetehase komplekteerimisjaam], BS thesis, 2023&lt;br /&gt;
*Paola Avalos Conchas, [https://hdl.handle.net/10062/90641 Payload transportation system of a learning factory] [Õppetehase kasuliku koorma transpordisüsteem], BS thesis, 2023&lt;br /&gt;
*Pille Pärnalaas, [https://hdl.handle.net/10062/93431 Pöördpõik ajami arendus robotplatvormile Robotont] [Development of swerve drive for robotic platform Robotont], BS thesis, 2023&lt;br /&gt;
*Carl Hjalmar Love Hult, [https://hdl.handle.net/10062/90673 Using and Evaluating the Real-time Spatial Perception System Hydra in Real-world Scenarios] [Reaalajas toimiva ruumilise taju süsteemi Hydra kasutamine ja hindamine praktilistes stsenaariumides], BS thesis, 2023&lt;br /&gt;
*Priit Rooden, [https://hdl.handle.net/10062/93432 Autonoomse laadimislahenduse väljatöötamine õpperobotile Robotont] [Development of an autonomous charging solution for the robot platform Robotont], BS thesis, 2023&lt;br /&gt;
*Marko Muro, [https://hdl.handle.net/10062/93429 Robotondi akulahenduse ning 12 V pingeregulaatori prototüüpimine] [Prototyping battery solution and 12 V voltage regulator for Robotont], BS thesis, 2023&lt;br /&gt;
*Danel Leppenen, [https://hdl.handle.net/10062/93427 Nav2 PYIF: Python-based Motion Planning for ROS 2 Navigation 2] [Nav 2 PYIF: Pythoni põhine liikumise planeerija ROS 2 Navigation 2-le], BS thesis, 2023&lt;br /&gt;
*Kertrud Geddily Küüt, [https://hdl.handle.net/10062/93424 Kõrgust reguleeriv mehhanism Robotondile] [Height adjusting mechanism for Robotont], BS thesis, 2023&lt;br /&gt;
*Ingvar Drikkit, [https://hdl.handle.net/10062/93421 Lisaseadmete võimekuse arendamine haridusrobotile Robotont] [Developing add-on device support for the educational robot Robotont], BS thesis, 2023&lt;br /&gt;
*Kristo Pool, [https://hdl.handle.net/10062/93430 MoveIt 2 õppematerjalid] [Learning materials for MoveIt 2], BS thesis, 2023&lt;br /&gt;
*Erki Veeväli, [https://hdl.handle.net/10062/93434 Development of a Continuous Teleoperation System for Urban Road Vehicle] [Linnasõiduki pideva kaugjuhtimissüsteemi arendus], BS thesis, 2023&lt;br /&gt;
*Aleksandra Doroshenko, [https://hdl.handle.net/10062/93420 Haiglates inimesi juhatava roboti disain] [Hospital guide robot design], BS thesis, 2023&lt;br /&gt;
*Hui Shi, [http://hdl.handle.net/10062/83043 Expanding the Open-source ROS Software Package opencv_apps with Dedicated Blob Detection Functionality] [Avatud lähtekoodiga ROS-i tarkvarakimbu opencv_apps laiendamine laigutuvasti funktsionaalsusega], BS thesis, 2022&lt;br /&gt;
*Dāvis Krūmiņš, [http://hdl.handle.net/10062/83040 Web-based learning and software development environment for remote access of ROS robots] [Veebipõhine õppe- ja tarkvaraarenduse keskkond ROS robotite juurdepääsuks kaugteel], BS thesis, 2022&lt;br /&gt;
*Anna Jakovleva, [http://hdl.handle.net/10062/83037 Roboquiz - an interactive human-robot game] [Roboquiz - interaktiivne inimese ja roboti mäng], BS thesis, 2022&lt;br /&gt;
*Kristjan Laht, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=74702&amp;amp;year=2022 Robot Localization with Fiducial Markers] [Roboti lokaliseerimine koordinaatmärkidega], BS thesis, 2022&lt;br /&gt;
*Hans Pärtel Pani, [http://hdl.handle.net/10062/83008 ROS draiver pehmerobootika haaratsile] [ROS Driver for Soft Robotic Gripper], BS thesis, 2022&lt;br /&gt;
*Markus Erik Sügis, [http://hdl.handle.net/10062/83015 Jagatud juhtimise põhimõttel realiseeritud robotite kaugjuhtimissüsteem] [A continuous teleoperating system based on shared control concept], BS thesis, 2022&lt;br /&gt;
*Taaniel Küla, [http://hdl.handle.net/10062/83007 ROS2 platvormile keevitusroboti tarkvara portotüübi loomine kasutades UR5 robot-manipulaatorit] [Welding robot software prototype for ROS2 using UR5 robot arm], BS thesis, 2022&lt;br /&gt;
*Rauno Põlluäär, [http://hdl.handle.net/10062/72672 Veebirakendus-põhine kasutajaliides avatud robotplatvormi Robotont juhtimiseks ja haldamiseks] [Web application-based user interface for controlling and managing open-source robotics platform Robotont], BS thesis, 2021&lt;br /&gt;
*Hendrik Olesk, [http://hdl.handle.net/10062/72664 Nägemisulatuses kaugjuhitava mobiilse robotmanipulaatori kasutajamugavuse tõstmine] [Improving the usability of a mobile manipulator robot for line-of-sight remote control], BS thesis, 2021&lt;br /&gt;
*Tarvi Tepandi, [http://hdl.handle.net/10062/72665 Segareaalsusel põhinev kasutajaliides mobiilse roboti kaugjuhtimiseks Microsoft HoloLens 2 vahendusel] [Mixed-reality user interface for teleoperating mobile robots with Microsoft HoloLens 2], BS thesis, 2021&lt;br /&gt;
*Rudolf Põldma, [http://hdl.handle.net/10062/72674 Tartu linna Narva maantee ringristmiku digikaksik] [Digital twin for Narva street roundabout in Tartu], BS thesis, 2021&lt;br /&gt;
*Kwasi Akuamoah Boateng, [http://hdl.handle.net/10062/72804 Digital Twin of a Teaching and Learning Robotics Lab] [Robotite õpetamise ja õppimise labori digitaalne kaksik], BS thesis, 2021&lt;br /&gt;
*Karina Sein, [http://hdl.handle.net/10062/72102 Eestikeelse kõnesünteesi võimaldamine robootika arendusplatvormil ROS] [Enabling Estonian speech synthesis on the Robot Operating System (ROS)], BS thesis, 2020&lt;br /&gt;
*Ranno Mäesepp, [http://hdl.handle.net/10062/72100 Takistuste vältimise lahendus õpperobotile Robotont] [Obstacle avoidance solution for educational robot platform Robotont], BS thesis, 2020&lt;br /&gt;
*Igor Rybalskii, [http://hdl.handle.net/10062/72060 Gesture Detection Software for Human-Robot Collaboration] [Žestituvastus tarkvara inimese ja roboti koostööks], BS thesis, 2020&lt;br /&gt;
*Meelis Pihlap, [http://hdl.handle.net/10062/64292 Mitme roboti koostöö funktsionaalsuste väljatöötamine tarkvararaamistikule TeMoto] [Multi-robot collaboration functionalities for robot software development framework TeMoto], BS thesis, 2019&lt;br /&gt;
*Kaarel Mark, [http://hdl.handle.net/10062/64290 Liitreaalsuse kasutamine tootmisprotsessis asukohtade määramisel] [Augmented reality for location determination in manufacturing], BS thesis, 2019&lt;br /&gt;
*Kätriin Julle, [http://hdl.handle.net/10062/64279 Roboti KUKA youBot riistvara ja ROS-tarkvara uuendamine] [Upgrading robot KUKA youBot’s hardware and ROS-software], BS thesis, 2019&lt;br /&gt;
*Georg Astok, [http://hdl.handle.net/10062/64274 Roboti juhtimine virtuaalreaalsuses kasutades ROS-raamistikku] [Creating virtual reality user interface using only ROS framework], BS thesis, 2019&lt;br /&gt;
*Martin Hallist, [http://hdl.handle.net/10062/64275 Robotipõhine kaugkohalolu käte liigutuste ülekandmiseks] [Teleoperation robot for arms motions], BS thesis, 2019&lt;br /&gt;
*Ahmed Hassan Helmy Mohamed, [http://hdl.handle.net/10062/63946 Software integration of autonomous robot system for mixing and serving drinks] [Jooke valmistava ja serveeriva robotsüsteemi tarkvaralahendus], BS thesis, 2019&lt;br /&gt;
*Kristo Allaje, [http://hdl.handle.net/10062/60294 Draiveripakett käte jälgimise seadme Leap Motion™ kontroller kasutamiseks robootika arendusplatvormil ROS] [Driver package for using Leap Motion™ controller on the robotics development platform ROS], BS thesis, 2018&lt;br /&gt;
*Martin Maidla, [http://hdl.handle.net/10062/60288 Avatud robotiarendusplatvormi Robotont omniliikumise ja odomeetria arendamine] [Omnimotion and odometry development for open robot development platform Robotont], BS thesis, 2018&lt;br /&gt;
*Raid Vellerind, [http://hdl.handle.net/10062/56559 Avatud robotiarendusplatvormi ROS võimekuse loomine Tartu Ülikooli Robotexi robootikaplatvormile] [ROS driver development for the University of Tartu’s Robotex robotics platform], BS thesis, 2017&lt;br /&gt;
*Kalle-Gustav Kruus, [http://hdl.handle.net/10062/32440 Jalgpalliroboti löögimehhanismi elektroonikalahendus] [Driver circuit for a kicking mechanism of a football robot], BS thesis, 2013&lt;br /&gt;
*Sven Kautlenbach, [http://hdl.handle.net/10062/25570 Autonoomne seade elastsusmooduli mõõtmiseks] [Electronic device for Young’s modulus measurements], BS thesis, 2011&lt;br /&gt;
&lt;br /&gt;
[[Category:Theses Topics]]&lt;/div&gt;</summary>
		<author><name>Karl</name></author>
	</entry>
	<entry>
		<id>https://ims.ut.ee/index.php?title=Theses_in_Robotics&amp;diff=43112</id>
		<title>Theses in Robotics</title>
		<link rel="alternate" type="text/html" href="https://ims.ut.ee/index.php?title=Theses_in_Robotics&amp;diff=43112"/>
		<updated>2025-07-09T21:14:57Z</updated>

		<summary type="html">&lt;p&gt;Karl: /* Bachelor's theses */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:IMS-robotics]]&lt;br /&gt;
&amp;lt;div class=&amp;quot;toclimit-3&amp;quot;&amp;gt;__TOC__&amp;lt;/div&amp;gt; &lt;br /&gt;
= Projects in Advanced Robotics =&lt;br /&gt;
''The main objective of the following projects is to give students experience in working with advanced robotics technology. Our group is active in several R&amp;amp;D projects involving human-robot collaboration, intuitive teleoperation of robots, and autonomous navigation of unmanned mobile platforms. Our main software platforms are [http://www.ros.org/ Robot Operating System (ROS)] for developing software for advanced robot systems and [http://gazebosim.org/ Gazebo] for running realistic robotic simulations.''&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
For further information, contact [[User:Karl|Karl Kruusamäe]]. &lt;br /&gt;
&lt;br /&gt;
The following is not an exhaustive list of all available thesis/research topics.&lt;br /&gt;
&lt;br /&gt;
== Highlighted thesis topics for the 2025/2026 study year ==&lt;br /&gt;
# [[#LLM-based task planning for robots|LLM-based task planning for robots]]&lt;br /&gt;
# [[#ROBOTONT: analysis of different options as on-board computers|ROBOTONT: analysis of different options as on-board computers]]&lt;br /&gt;
# [[#SemuBOT: multiple topics|SemuBOT: multiple topics]]&lt;br /&gt;
# [[#ROBOTONT: integrating a graphical programming interface|ROBOTONT: integrating a graphical programming interface]]&lt;br /&gt;
# [[#Sign-language-based control for robots|Sign-language-based control for robots]]&lt;br /&gt;
# [[#Robotic Study Companion: a social robot for students in higher education|Robotic Study Companion: a social robot for students in higher education]]&lt;br /&gt;
# [[#Quantitative Evaluation of the Situation Awareness during the Teleoperation of an Urban Vehicle|Quantitative Evaluation of the Situation Awareness during the Teleoperation of an Urban Vehicle]]&lt;br /&gt;
&lt;br /&gt;
== List of potential thesis topics ==&lt;br /&gt;
Our inventory includes but is not limited to:&lt;br /&gt;
&amp;lt;gallery mode=&amp;quot;packed&amp;quot;&amp;gt;&lt;br /&gt;
File:Youbot.png|120px|thumb|KUKA youBot&lt;br /&gt;
File:Ur5_left.png|120px|thumb|Universal Robot UR5&lt;br /&gt;
File:Franka_Emika_Panda.jpg|120px|thumb|Franka Emika Panda&lt;br /&gt;
File:Clearpath_Jackal.jpg|120px|thumb|Clearpath Jackal&lt;br /&gt;
File:Kinova_KG-3_Gripper.jpg|120px|thumb|Kinova 3-finger gripper&lt;br /&gt;
File:Xarm7.jpg|120px|thumb|UFACTORY xArm7&lt;br /&gt;
File:Robotont_banner.png|120px|thumb|robotont&lt;br /&gt;
File:Turtlebot3-waffle-pi.jpg|thumb|TurtleBot3&lt;br /&gt;
File:Parrot-bebop-2.jpg|thumb|Parrot Bebop 2&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== ROBOTONT: Docker-Driven ROS Environment Switching ===&lt;br /&gt;
This thesis focuses on integrating Docker containers to manage and switch between different ROS (Robot Operating System) environments on Robotont. Currently, ROS is installed natively on Robotont's Ubuntu-based onboard computer, limiting flexibility in system recovery and configuration switching. The goal is to develop a Docker-based solution that allows users to switch between ROS environments directly from Robotont’s existing low-level menu interface.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: designing and implementing a communication protocol for additional devices ===&lt;br /&gt;
Robotont currently includes a designated area on its front for attaching additional devices, such as an ultrasonic range finder, a servo motor connected to an Arduino, or a MikroBUS device. This thesis aims to design and implement a standard communication protocol that will enable seamless integration and control of these devices through the high-level ROS framework running on Robotont’s onboard computer.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: analysis of different options as on-board computers ===&lt;br /&gt;
Currently, ROBOTONT uses an Intel NUC as its onboard computer. The goal of this thesis is to validate Robotont's software stack on alternative compute devices (e.g. Raspberry Pi 4, Intel Compute Stick, and NVIDIA Jetson Nano) and benchmark their performance on the most popular Robotont use-case demos (e.g. webapp teleop, ar-tag steering, 2D mapping, and 3D mapping). The objective is to propose at least two different compute solutions: one that optimizes for cost and another that optimizes for performance. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT Lite ===&lt;br /&gt;
The goal of this thesis is to optimize the ROBOTONT platform for cost by replacing the onboard computer and sensors with low-cost alternatives while ensuring ROS/ROS2 software compatibility. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: integrating a graphical programming interface ===&lt;br /&gt;
The goal of this thesis is to integrate a graphical programming solution (e.g. Scratch or Blockly) to enable non-experts to program Robotont. See, e.g., https://doi.org/10.48550/arXiv.2011.13706&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== SemuBOT: multiple topics ===&lt;br /&gt;
In 2023/2024, several topics are offered to support the development of the open-source humanoid robot project SemuBOT (https://www.facebook.com/semubotmtu/).&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Robotic Study Companion: a social robot for students in higher education ===&lt;br /&gt;
Potential Topics:&lt;br /&gt;
* Enhance the Robot's Speech/Natural Language Capabilities &lt;br /&gt;
* Build a Local Language Model for the RSC&lt;br /&gt;
* Develop and Program the Robot’s Behavior and Personality&lt;br /&gt;
* Build a Digital Twin Simulation for Multimodal Interaction&lt;br /&gt;
* Explore the Use of the RSC as an Affective Robot to Address Students’ Academic Emotions&lt;br /&gt;
* Explore and Implement Cybersecurity Measures for Social Robot&lt;br /&gt;
&lt;br /&gt;
[https://github.com/orgs/RobotStudyCompanion/discussions/3 More info on GitHub] | reach out to farnaz.baksh@ut.ee for further details&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Virtual reality user interface (VRUI) for intuitive teleoperation system ===&lt;br /&gt;
[[Image:LeapMotion.png|200px|thumb|right|Detecting 2 hands with Leap Motion Controller]]&lt;br /&gt;
Enhancing the user-experience of a virtual reality UI developed by [https://github.com/ut-ims-robotics/vrui_rviz Georg Astok]. Potentially adding [http://www.osvr.org/hardware-devs.html virtual reality capability] to a [https://www.youtube.com/watch?v=L25HHFd00rc gesture- and natural-language-based robot teleoperation system].&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
[[Image:Temoto-working-principle-in-pics.png|x160px|Gesture-based teleoperation]]&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Modeling humans for human-robot interaction ===&lt;br /&gt;
True human-robot collaboration means that the robot must understand the actions, intention, and state of its human partner. This work involves using cameras and other sensors to digitally represent and model humans. There are multiple stages of modeling: a) physical models of human kinematics and dynamics; b) higher-level models for recognizing human intent.&amp;lt;br&amp;gt;&lt;br /&gt;
[[Image:Skeletal ROS.PNG|x160px|ROS &amp;amp; Kinect &amp;amp; Skeleton-Markers Package]]&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Robotic avatar for telepresence ===&lt;br /&gt;
Integrating hand gestures and head movements to control a robot avatar in a virtual reality user interface.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Detection of hardware and software resources for smart integration of robots ===&lt;br /&gt;
The vast majority of today’s robotic applications rely on hard-coded device and algorithm usage. This project focuses on developing Resource Snooper software that can detect the addition or removal of resources, enabling dynamic reconfiguration of robotic systems. This project is developed as a subsystem of [https://utnuclearroboticspublic.github.io/temoto2/index.html TeMoto].&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Human-Robot and Robot-Robot collaboration applications ===&lt;br /&gt;
Creating a demo or analysis of software capabilities related to human-robot or robot-robot teams:&lt;br /&gt;
* human-robot collaborative assembly&lt;br /&gt;
* distributed mapping; analysis and demo of existing ROS (e.g., segmap https://youtu.be/JJhEkIA1xSE) packages for multi-robot mapping&lt;br /&gt;
* Inaccessible region teamwork&lt;br /&gt;
**youbot+drone - a drone maps the environment (for example, a maze) and a ground vehicle uses this information to traverse the maze&lt;br /&gt;
**youbot+robotont - youbot cannot go up ledges, but it can lift a smaller robot, such as robotont, up a ledge.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Mirroring human hand movements on industrial robots ===&lt;br /&gt;
The goal of this project is to integrate continuous control of an industrial robot manipulator with a gestural telerobotics interface. The recommended tools for this thesis project are a Leap Motion Controller or a standard web camera, Ultraleap, a Universal Robot UR5 manipulator, and ROS.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: TeMoto for robotont ===&lt;br /&gt;
Swarm-management and UMRF-based task loading for robotont using [https://github.com/temoto-framework TeMoto] framework.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Enhancing teleoperation control interface with augmented cues to provoke caution ===&lt;br /&gt;
The task is to create a telerobot control interface where the video feed from the remote site and/or a mixed-reality scene is augmented with visual cues to provoke caution in the human operator.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Robot-to-human interaction ===&lt;br /&gt;
As robots and autonomous machines start sharing the same space as humans, their actions need to be understood by the people occupying that space. For instance, a human worker needs to understand what the robot partner is planning next, or a pedestrian needs to clearly comprehend the behaviour of a driverless vehicle. To reduce this ambiguity, the robot needs mechanisms to convey its intent (whatever it is going to do next). The aim of the thesis is to outline existing methods for machines to convey their intent and to develop a unified model interface for expressing that intent. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Gaze-based handover prediction ===&lt;br /&gt;
When a human needs to pass an object to a robot manipulator, the robot must understand where in 3D space the object handover occurs and then plan an appropriate motion. Human gaze can be used as the input for predicting which object to track. The thesis activities involve camera-based eye tracking and safe motion planning.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: Human-height human-robot interface for Robotont ground robot ===&lt;br /&gt;
Robotont is an ankle-high flat mobile robot. For humans to interact with Robotont, there is a need for a compact and lightweight mechanical structure that is tall enough for comfortable human-robot interaction. The objective of the thesis is to develop the mechanical designs and build prototypes that ensure stable operation and meet the aesthetic requirements for use in public places. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Stratos Explore Ultraleap demonstrator for robotics ===&lt;br /&gt;
The aim of this thesis is to systematically analyze the strengths and limitations of the Stratos Explore Ultraleap device in the context of controlling a robot, and subsequently to implement a demonstrator application showcasing its applicability for robot control. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Mixed-reality scene creation for vehicle teleoperation ===&lt;br /&gt;
Fusing different sensory feeds to create a high-usability teleoperation scene.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Validation study for AR-based robot user-interfaces ===&lt;br /&gt;
Designing and carrying out a user study to validate the functionality and usability of a human-robot interface.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Quantitative Evaluation of the Situation Awareness during the Teleoperation of an Urban Vehicle ===&lt;br /&gt;
The goal of this thesis is to develop the methodology for measuring the operator's situation awareness while operating an autonomous urban vehicle. Potential metrics could include driving accuracy, speed, and latency. This thesis will be conducted as part of the activities at the [https://adl.cs.ut.ee Autonomous Driving Lab].&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Sign-language-based control for robots ===&lt;br /&gt;
The goal of this thesis is to design and implement the means for interacting with a robot via conventional sign language, i.e. the human can command a robot by using sign language.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== LLM-based task planning for robots ===&lt;br /&gt;
The goal of this thesis is to enable high-level task planning for autonomous robots performing general-purpose tasks. The thesis would leverage LLM reasoning and the TeMoto Action Engine to achieve natural task negotiation, planning, and execution. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
= Completed projects =&lt;br /&gt;
== PhD theses ==&lt;br /&gt;
*Robert Valner, [https://hdl.handle.net/10062/105994 Design of TeMoto, a software framework for dependable, adaptive, and collaborative autonomous robots] [TeMoto – töökindlate, adaptiivsete ja koostöövõimeliste autonoomsete robotite arendamise tarkvararaamistik], PhD thesis, 2024&lt;br /&gt;
*Houman Masnavi, [https://hdl.handle.net/10062/91394 Visibility aware navigation] [Nähtavust arvestav navigatsioon], PhD thesis, 2023&lt;br /&gt;
&lt;br /&gt;
== Master's theses ==&lt;br /&gt;
*Asier Mandiola Arrizabalaga, Haptic-Based Teleoperation of Robots, MS thesis, 2025&lt;br /&gt;
*Dāvis Krūmiņš, Web-based Robotics Lab for Effortless ROS2 Development, MS thesis, 2025&lt;br /&gt;
*Julian Rene Leclerc, Natural Language Human-Robot Interaction: A Modular Framework for Conversational Robot Control Using Large Language Models, MS thesis, 2025&lt;br /&gt;
*Miriam Calafa’, Designing Multimodal Emotional Expression for a Robotic Study Companion, MS thesis, 2025&lt;br /&gt;
*Iryna Hurova, Model-based Planning Using GPU-accelerated Simulator as a World Model, MS thesis, 2025&lt;br /&gt;
*Sander Toma Võrk, Elektrooniliste termomeetrite automaatse kalibreerimissüsteemi väljatöötamine ja rakendamine, MS thesis, 2025&lt;br /&gt;
*Carl Hjalmar Love Hult, SurfMotion: An Open Source Pipeline for Robotic Pipe Cutting and Welding, MS thesis, 2025&lt;br /&gt;
*Paola Avalos Conchas, Socially Aware Planning for Indoor Navigation, MS thesis, 2025&lt;br /&gt;
*Robert Allik, Validation of NoMaD as a Global Planner for Mobile Robots, MS thesis, 2024&lt;br /&gt;
*Rauno Põlluäär, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=79393 Designing and Implementing a Bird’s-eye View Interface for a Self-driving Vehicle’s Teleoperation System] [Isejuhtiva sõiduki kaugjuhtimissüsteemile linnuvaate kasutajaliidese loomine], MS thesis, 2024&lt;br /&gt;
*Gautier Reynes, VR-Enhanced Remote Inspection Framework for Semi-Autonomous Robot Fleet, MS thesis, 2024&lt;br /&gt;
*Eva Mõtshärg, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=77385 3D-prinditava kere disain ja analüüs vabavaralisele haridusrobotile Robotont] [Design and Analysis of a 3D Printable Chassis for the Open Source Educational Robot Robotont], MS thesis, 2023&lt;br /&gt;
*Farnaz Baksh, [https://dspace.ut.ee/items/a7a9cc15-27e9-450c-94e8-3267f0c95c56 An Open-source Robotic Study Companion for University Students] [Avatud lähtekoodiga robotõpikaaslane üliõpilastele], MS thesis, 2023&lt;br /&gt;
*Igor Rybalskii, [http://hdl.handle.net/10062/83028 Augmented reality (AR) for enabling human-robot collaboration with ROS robots] [Liitreaalsus inimese ja roboti koostöö võimaldamiseks ROS-i robotitega], MS thesis, 2022&lt;br /&gt;
*Md. Maniruzzaman, [http://hdl.handle.net/10062/83025 Object search and retrieval in indoor environment using a Mobile Manipulator] [Objektide otsimine ja teisaldamine siseruumides mobiilse manipulaatori abil], MS thesis, 2022&lt;br /&gt;
*Allan Kustavus, [http://hdl.handle.net/10062/72651 Design and Implementation of a Generalized Resource Management Architecture in the TeMoto Software Framework] [Üldise ressursihalduri disain ja teostus TeMoto tarkvara raamistikule], MS thesis, 2021&lt;br /&gt;
*Kristina Meister, [http://hdl.handle.net/10062/72350 External human-vehicle interaction - a study in the context of an autonomous ride-hailing service], MS thesis, 2021&lt;br /&gt;
*Muhammad Usman, [http://hdl.handle.net/10062/72126 Development of an Optimization-Based Motion Planner and Its ROS Interface for a Non-Holonomic Mobile Manipulator] [Optimeerimisele baseeruva liikumisplaneerija arendamine ja selle ROSi liides mitteholonoomse mobiilse manipulaatori jaoks], MS thesis, 2020&lt;br /&gt;
*Maarika Oidekivi, [http://hdl.handle.net/10062/72119 Masina kavatsuse väljendamine ja tõlgendamine] [Communicating and interpreting machine intent], MS thesis, 2020&lt;br /&gt;
*Houman Masnavi, [http://hdl.handle.net/10062/72118 Multi-Robot Motion Planning for Shared Payload Transportation] [Rajaplaneerimine multi-robot süsteemile jagatud lasti transportimisel], MS thesis, 2020&lt;br /&gt;
*Fabian Ernesto Parra Gil, [http://hdl.handle.net/10062/72112 Implementation of Robot Manager Subsystem for Temoto Software Framework] [Robotite Halduri alamsüsteemi väljatöötamine tarkvararaamistikule TEMOTO], MS thesis, 2020&lt;br /&gt;
*Zafarullah, [http://hdl.handle.net/10062/72125 Gaze Assisted Neural Network based Prediction of End-Point of Human Reaching Trajectories], MS thesis, 2020&lt;br /&gt;
*Madis K Nigol, [http://hdl.handle.net/10062/64339 Õppematerjalid robotplatvormile Robotont] [Study materials for robot platform Robotont], MS thesis, 2019&lt;br /&gt;
*Renno Raudmäe, [http://hdl.handle.net/10062/64341 Avatud robotplatvorm Robotont] [Open source robotics platform Robotont], MS thesis, 2019&lt;br /&gt;
*Asif Sattar, [http://hdl.handle.net/10062/64352 Human detection and distance estimation with monocular camera using YOLOv3 neural network] [Inimeste tuvastamine ning kauguse hindamine kasutades kaamerat ning YOLOv3 tehisnärvivõrku], MS thesis, 2019&lt;br /&gt;
*Ragnar Margus, [http://hdl.handle.net/10062/64337 Kergliiklusvahendite jagamisteenuseks vajaliku positsioneerimismooduli loomine ja uurimine] [Development and testing of an IoT module for electric vehicle sharing service], MS thesis, 2019&lt;br /&gt;
*Pavel Šumejko, [http://hdl.handle.net/10062/64320 Robust Solution for Extrinsic Calibration of a 2D Laser-Rangefinder and a Monocular USB Camera] [Meetod 2D laserkaugusmõõdiku ja USB-kaamera väliseks kalibreerimiseks], MS thesis, 2019&lt;br /&gt;
*Dzvezdana Arsovska, [http://hdl.handle.net/10062/64321 Building an Efficient and Secure Software Supply Pipeline for Aerial Robotics Application] [Efektiivne ja turvaline tarkvaraarendusahel lennurobootika rakenduses], MS thesis, 2019&lt;br /&gt;
*Tõnis Tiimus, [http://hdl.handle.net/10062/60713 Visuaalselt esilekutsutud potentsiaalidel põhinev aju-arvuti liides robootika rakendustes] [A VEP-based BCI for robotics applications], MS thesis, 2018&lt;br /&gt;
*Martin Appo, [http://hdl.handle.net/10062/60718 Hardware-agnostic compliant control ROS package for collaborative industrial manipulators] [Riistvarapaindlik ROSi tarkvarapakett tööstuslike robotite mööndlikuks juhtimiseks], MS thesis, 2018&lt;br /&gt;
*Hassan Mahmoud Shehawy Elhanash, [http://hdl.handle.net/10062/60711 Optical Tracking of Forearm for Classifying Fingers Poses] [Küünarvarre visuaalne jälgimine ennustamaks sõrmede asendeid], MS thesis, 2018&lt;br /&gt;
&lt;br /&gt;
== Bachelor's theses ==&lt;br /&gt;
*Oliver Voorel, Humanoidrobot Semuboti toitesüsteemi uuendamine, BS thesis, 2025&lt;br /&gt;
*Karl Rahn, Millimeeterlaine radari integreerimine avatud lähtekoodiga muruniiduki platvormil Open Mower, BS thesis, 2025&lt;br /&gt;
*Mattias Mäe, Sõiduki kaugjuhtimise viivituse mõõtmine, BS thesis, 2025&lt;br /&gt;
*Jürgen Kottise, Dockeri konteineritel põhinev haldustarkvara Robotont 3 õpperobotile, BS thesis, 2025&lt;br /&gt;
*Martin Kaur, Servodel põhinev sotsiaalse robotkäe süsteem SemuBotile, BS thesis, 2025&lt;br /&gt;
*Karl-Jürgen Siilak, Mass Portal Grand Pharaoh XD 3D-printeri uuendamine, BS thesis, 2025&lt;br /&gt;
*Usman Ali Khan, Graphical programming interface for ROBOTONT, an open-source educational robot, BS thesis, 2025&lt;br /&gt;
*Robina Zvirgzdina, Development of an Autonomous Open-Source Inventory Performance Robot for the University of Tartu Library, BS thesis, 2025&lt;br /&gt;
*Omar Huseynli, ROS integration for the Semubot robot, BS thesis, 2025&lt;br /&gt;
*Albert Unn, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=79937 Suhtlusvõimekuse arendamine sotsiaalsele humanoidrobotile SemuBot] [Developing the ability to communicate for SemuBot, a social humanoid robot], BS thesis, 2024&lt;br /&gt;
*Veronika-Marina Volynets, [https://hdl.handle.net/10062/99943 Development of Control Electronics and Program for Robotont's Height Adjustment Mechanism] [Juhtelektroonika ja programmi väljatöötamine Robotondi kõrguse reguleerimise mehhanismile], BS thesis, 2024&lt;br /&gt;
*Nikita Kurenkov, [https://hdl.handle.net/10062/99880 Flexible Screen Integration and Development of Neck Movement Mechanism for Social Humanoid Robot SemuBot] [Humanoidroboti näo lahenduse leidmine ja rakendamine; humanoidroboti jaoks kaela mehhanismi väljatöötamine], BS thesis, 2024&lt;br /&gt;
*Elchin Huseynov, [https://hdl.handle.net/10062/99668 Design and Control of a Social Humanoid Robot - SemuBot’s Hand] [Sotsiaalse Humanoidroboti Disain ja Juhtimine - SemuBoti Käsi], BS thesis, 2024&lt;br /&gt;
*Veronika Podliesnova, [https://hdl.handle.net/10062/99636 Real-Time Detection of Robot Failures by Monitoring Operator’s Brain Activity with EEG-based Brain-Computer Interface] [Reaalajas Robotirikke Tuvastamine Operaatori Ajutegevuse Jälgimise teel EEG-põhise Aju-Arvuti Liidese abil], BS thesis, 2024&lt;br /&gt;
*Märten Josh Peedimaa, [https://hdl.handle.net/not-yet Rhea: An Open-Source Table Tennis Ball Launcher Robot for Multiball Training] [Avatud lähtekoodiga lauatennise palliviske robot mitmikpall treenimiseks], BS thesis, 2024&lt;br /&gt;
*Karl Sander Vinkel, [https://hdl.handle.net/not-yet Laadimisjaama ja transportkesta väljatöötamine õpperobotile Robotont] [Development of charging dock and transport case for Robotont], BS thesis, 2024&lt;br /&gt;
*Raimo Köidam, [https://hdl.handle.net/not-yet Valguslahenduse tarkvara väljatöötamine õpperobotile Robotont] [Development of light solution software for the educational robot Robotont], BS thesis, 2024&lt;br /&gt;
*Robert Valge, [https://hdl.handle.net/not-yet Toitepinge ja tarbevoolu monitoorimine ning toitehalduse püsivara loomine õpperobotil Robotont] [Monitoring supply voltage and current consumption and creating firmware for Robotont’s power management system], BS thesis, 2024&lt;br /&gt;
*Leonid Tšigrinski, [https://hdl.handle.net/not-yet Õpperoboti Robotont püsivara arhitektuuri uuendamine] [Education robot &amp;quot;Robotont&amp;quot; firmware architecture updating], BS thesis, 2024&lt;br /&gt;
*Andres Sakk, [https://hdl.handle.net/not-yet Avatud robotplatvormi Robotont 3 kasutajaliidese väljatöötamine] [Development of a user interface for the open robotics platform Robotont 3], BS thesis, 2024&lt;br /&gt;
*Kaur Kullamäe, [https://hdl.handle.net/not-yet Elektroonikalahendus ja püsivara robotkäe juhtimiseks sotsiaalsel humanoidrobotil SemuBot] [Building the SemuBot arm electronics system], BS thesis, 2024&lt;br /&gt;
*Kristjan Madis Kask, [https://hdl.handle.net/not-yet Käe mehaanika disain sotsiaalsele humanoidrobotile SemuBot] [Arm mechanics design for social humanoid robot SemuBot], BS thesis, 2024&lt;br /&gt;
*Sven-Ervin Paap, [https://hdl.handle.net/not-yet ROS2 draiver õpperobotile Robotont] [ROS2 driver for the educational robot Robotont], BS thesis, 2024&lt;br /&gt;
*Timur Nizamov, [https://hdl.handle.net/10062/99592 Audio System for the Social Humanoid Robot SemuBot] [Helisüsteem sotsiaalsele humanoidrobotile SemuBot], BS thesis, 2024&lt;br /&gt;
*Georgs Narbuts, [https://hdl.handle.net/10062/99554 Holonomic Motion Drive System of a Social Humanoid Robot SemuBot] [Sotsiaalse humanoidroboti SemuBoti holonoomne liikuv ajamisüsteem], BS thesis, 2024&lt;br /&gt;
*Iryna Hurova, [https://hdl.handle.net/10062/90638 Kitting station of the learning factory] [Õppetehase komplekteerimisjaam], BS thesis, 2023&lt;br /&gt;
*Paola Avalos Conchas, [https://hdl.handle.net/10062/90641 Payload transportation system of a learning factory] [Õppetehase kasuliku koorma transpordisüsteem], BS thesis, 2023&lt;br /&gt;
*Pille Pärnalaas, [https://hdl.handle.net/10062/93431 Pöördpõik ajami arendus robotplatvormile Robotont] [Development of swerve drive for robotic platform Robotont], BS thesis, 2023&lt;br /&gt;
*Carl Hjalmar Love Hult, [https://hdl.handle.net/10062/90673 Using and Evaluating the Real-time Spatial Perception System Hydra in Real-world Scenarios] [Reaalajas toimiva ruumilise taju süsteemi Hydra kasutamine ja hindamine praktilistes stsenaariumides], BS thesis, 2023&lt;br /&gt;
*Priit Rooden, [https://hdl.handle.net/10062/93432 Autonoomse laadimislahenduse väljatöötamine õpperobotile Robotont] [Development of an autonomous charging solution for the robot platform Robotont], BS thesis, 2023&lt;br /&gt;
*Marko Muro, [https://hdl.handle.net/10062/93429 Robotondi akulahenduse ning 12 V pingeregulaatori prototüüpimine] [Prototyping battery solution and 12 V voltage regulator for Robotont], BS thesis, 2023&lt;br /&gt;
*Danel Leppenen, [https://hdl.handle.net/10062/93427 Nav2 PYIF: Python-based Motion Planning for ROS 2 Navigation 2] [Nav 2 PYIF: Pythoni põhine liikumise planeerija ROS 2 Navigation 2-le], BS thesis, 2023&lt;br /&gt;
*Kertrud Geddily Küüt, [https://hdl.handle.net/10062/93424 Kõrgust reguleeriv mehhanism Robotondile] [Height adjusting mechanism for Robotont], BS thesis, 2023&lt;br /&gt;
*Ingvar Drikkit, [https://hdl.handle.net/10062/93421 Lisaseadmete võimekuse arendamine haridusrobotile Robotont] [Developing add-on device support for the educational robot Robotont], BS thesis, 2023&lt;br /&gt;
*Kristo Pool, [https://hdl.handle.net/10062/93430 MoveIt 2 õppematerjalid] [Learning materials for MoveIt 2], BS thesis, 2023&lt;br /&gt;
*Erki Veeväli, [https://hdl.handle.net/10062/93434 Development of a Continuous Teleoperation System for Urban Road Vehicle] [Linnasõiduki pideva kaugjuhtimissüsteemi arendus], BS thesis, 2023&lt;br /&gt;
*Aleksandra Doroshenko, [https://hdl.handle.net/10062/93420 Haiglates inimesi juhatava roboti disain] [Hospital guide robot design], BS thesis, 2023&lt;br /&gt;
*Hui Shi, [http://hdl.handle.net/10062/83043 Expanding the Open-source ROS Software Package opencv_apps with Dedicated Blob Detection Functionality] [Avatud lähtekoodiga ROS-i tarkvarakimbu opencv_apps laiendamine laigutuvasti funktsioonaalsusega], BS thesis, 2022&lt;br /&gt;
*Dāvis Krūmiņš, [http://hdl.handle.net/10062/83040 Web-based learning and software development environment for remote access of ROS robots] [Veebipõhine õppe- ja tarkvaraarenduse keskkond ROS robotite juurdepääsuks kaugteel], BS thesis, 2022&lt;br /&gt;
*Anna Jakovleva, [http://hdl.handle.net/10062/83037 Roboquiz - an interactive human-robot game] [Roboquiz - interaktiivne inimese ja roboti mäng], BS thesis, 2022&lt;br /&gt;
*Kristjan Laht, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=74702&amp;amp;year=2022 Robot Localization with Fiducial Markers] [Roboti lokaliseerimine koordinaatmärkidega], BS thesis, 2022&lt;br /&gt;
*Hans Pärtel Pani, [http://hdl.handle.net/10062/83008 ROS draiver pehmerobootika haaratsile] [ROS Driver for Soft Robotic Gripper], BS thesis, 2022&lt;br /&gt;
*Markus Erik Sügis, [http://hdl.handle.net/10062/83015 Jagatud juhtimise põhimõttel realiseeritud robotite kaugjuhtimissüsteem] [A continuous teleoperating system based on shared control concept], BS thesis, 2022&lt;br /&gt;
*Taaniel Küla, [http://hdl.handle.net/10062/83007 ROS2 platvormile keevitusroboti tarkvara prototüübi loomine kasutades UR5 robot-manipulaatorit] [Welding robot software prototype for ROS2 using UR5 robot arm], BS thesis, 2022&lt;br /&gt;
*Rauno Põlluäär, [http://hdl.handle.net/10062/72672 Veebirakendus-põhine kasutajaliides avatud robotplatvormi Robotont juhtimiseks ja haldamiseks] [Web application-based user interface for controlling and managing open-source robotics platform Robotont], BS thesis, 2021&lt;br /&gt;
*Hendrik Olesk, [http://hdl.handle.net/10062/72664 Nägemisulatuses kaugjuhitava mobiilse robotmanipulaatori kasutajamugavuse tõstmine] [Improving the usability of a mobile manipulator robot for line-of-sight remote control], BS thesis, 2021&lt;br /&gt;
*Tarvi Tepandi, [http://hdl.handle.net/10062/72665 Segareaalsusel põhinev kasutajaliides mobiilse roboti kaugjuhtimiseks Microsoft HoloLens 2 vahendusel] [Mixed-reality user interface for teleoperating mobile robots with Microsoft HoloLens 2], BS thesis, 2021&lt;br /&gt;
*Rudolf Põldma, [http://hdl.handle.net/10062/72674 Tartu linna Narva maantee ringristmiku digikaksik] [Digital twin for Narva street roundabout in Tartu], BS thesis, 2021&lt;br /&gt;
*Kwasi Akuamoah Boateng, [http://hdl.handle.net/10062/72804 Digital Twin of a Teaching and Learning Robotics Lab] [Robotite õpetamise ja õppimise labori digitaalne kaksik], BS thesis, 2021&lt;br /&gt;
*Karina Sein, [http://hdl.handle.net/10062/72102 Eestikeelse kõnesünteesi võimaldamine robootika arendusplatvormil ROS] [Enabling Estonian speech synthesis on the Robot Operating System (ROS)], BS thesis, 2020&lt;br /&gt;
*Ranno Mäesepp, [http://hdl.handle.net/10062/72100 Takistuste vältimise lahendus õpperobotile Robotont] [Obstacle avoidance solution for educational robot platform Robotont], BS thesis, 2020&lt;br /&gt;
*Igor Rybalskii, [http://hdl.handle.net/10062/72060 Gesture Detection Software for Human-Robot Collaboration] [Žestituvastus tarkvara inimese ja roboti koostööks], BS thesis, 2020&lt;br /&gt;
*Meelis Pihlap, [http://hdl.handle.net/10062/64292 Mitme roboti koostöö funktsionaalsuste väljatöötamine tarkvararaamistikule TeMoto] [Multi-robot collaboration functionalities for robot software development framework TeMoto], BS thesis, 2019&lt;br /&gt;
*Kaarel Mark, [http://hdl.handle.net/10062/64290 Liitreaalsuse kasutamine tootmisprotsessis asukohtade määramisel] [Augmented reality for location determination in manufacturing], BS thesis, 2019&lt;br /&gt;
*Kätriin Julle, [http://hdl.handle.net/10062/64279 Roboti KUKA youBot riistvara ja ROS-tarkvara uuendamine] [Upgrading robot KUKA youBot’s hardware and ROS-software], BS thesis, 2019&lt;br /&gt;
*Georg Astok, [http://hdl.handle.net/10062/64274 Roboti juhtimine virtuaalreaalsuses kasutades ROS-raamistikku] [Creating virtual reality user interface using only ROS framework], BS thesis, 2019&lt;br /&gt;
*Martin Hallist, [http://hdl.handle.net/10062/64275 Robotipõhine kaugkohalolu käte liigutuste ülekandmiseks] [Teleoperation robot for arms motions], BS thesis, 2019&lt;br /&gt;
*Ahmed Hassan Helmy Mohamed, [http://hdl.handle.net/10062/63946 Software integration of autonomous robot system for mixing and serving drinks] [Jooke valmistava ja serveeriva robotsüsteemi tarkvaralahendus], BS thesis, 2019&lt;br /&gt;
*Kristo Allaje, [http://hdl.handle.net/10062/60294 Draiveripakett käte jälgimise seadme Leap Motion™ kontroller kasutamiseks robootika arendusplatvormil ROS] [Driver package for using Leap Motion™ controller on the robotics development platform ROS], BS thesis, 2018&lt;br /&gt;
*Martin Maidla, [http://hdl.handle.net/10062/60288 Avatud robotiarendusplatvormi Robotont omniliikumise ja odomeetria arendamine] [Omnimotion and odometry development for open robot development platform Robotont], BS thesis, 2018&lt;br /&gt;
*Raid Vellerind, [http://hdl.handle.net/10062/56559 Avatud robotiarendusplatvormi ROS võimekuse loomine Tartu Ülikooli Robotexi robootikaplatvormile] [ROS driver development for the University of Tartu’s Robotex robotics platform], BS thesis, 2017&lt;br /&gt;
*Kalle-Gustav Kruus, [http://hdl.handle.net/10062/32440 Jalgpalliroboti löögimehhanismi elektroonikalahendus] [Driver circuit for a kicking mechanism of a football robot], BS thesis, 2013&lt;br /&gt;
*Sven Kautlenbach, [http://hdl.handle.net/10062/25570 Autonoomne seade elastsusmooduli mõõtmiseks] [Electronic device for Young’s modulus measurements], BS thesis, 2011&lt;br /&gt;
&lt;br /&gt;
[[Category:Theses Topics]]&lt;/div&gt;</summary>
		<author><name>Karl</name></author>
	</entry>
	<entry>
		<id>https://ims.ut.ee/index.php?title=Theses_in_Robotics&amp;diff=43111</id>
		<title>Theses in Robotics</title>
		<link rel="alternate" type="text/html" href="https://ims.ut.ee/index.php?title=Theses_in_Robotics&amp;diff=43111"/>
		<updated>2025-07-09T21:10:29Z</updated>

		<summary type="html">&lt;p&gt;Karl: /* Bachelor's theses */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:IMS-robotics]]&lt;br /&gt;
&amp;lt;div class=&amp;quot;toclimit-3&amp;quot;&amp;gt;__TOC__&amp;lt;/div&amp;gt; &lt;br /&gt;
= Projects in Advanced Robotics =&lt;br /&gt;
''The main objective of the following projects is to give students experience in working with advanced robotics technology. Our group is active in several R&amp;amp;D projects involving human-robot collaboration, intuitive teleoperation of robots, and autonomous navigation of unmanned mobile platforms. Our main software platforms are [http://www.ros.org/ Robot Operating System (ROS)] for developing software for advanced robot systems and [http://gazebosim.org/ Gazebo] for running realistic robotic simulations.''&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
For further information, contact [[User:Karl|Karl Kruusamäe]]. &lt;br /&gt;
&lt;br /&gt;
The following is not an exhaustive list of all available thesis/research topics.&lt;br /&gt;
&lt;br /&gt;
== Highlighted theses topics for 2025/2026 study year ==&lt;br /&gt;
# [[#LLM-based task planning for robots|LLM-based task planning for robots]]&lt;br /&gt;
# [[#ROBOTONT: analysis of different options as on-board computers|ROBOTONT: analysis of different options as on-board computers]]&lt;br /&gt;
# [[#SemuBOT: multiple topics|SemuBOT: multiple topics]]&lt;br /&gt;
# [[#ROBOTONT: integrating a graphical programming interface|ROBOTONT: integrating a graphical programming interface]]&lt;br /&gt;
# [[#Sign-language-based control for robots|Sign-language-based control for robots]]&lt;br /&gt;
# [[#Robotic Study Companion: a social robot for students in higher education|Robotic Study Companion: a social robot for students in higher education]]&lt;br /&gt;
# [[#Quantitative Evaluation of the Situation Awareness during the Teleoperation of an Urban Vehicle|Quantitative Evaluation of the Situation Awareness during the Teleoperation of an Urban Vehicle]]&lt;br /&gt;
&lt;br /&gt;
== List of potential thesis topics ==&lt;br /&gt;
Our inventory includes but is not limited to:&lt;br /&gt;
&amp;lt;gallery mode=&amp;quot;packed&amp;quot;&amp;gt;&lt;br /&gt;
File:Youbot.png|120px|thumb|KUKA youBot&lt;br /&gt;
File:Ur5_left.png|120px|thumb|Universal Robot UR5&lt;br /&gt;
File:Franka_Emika_Panda.jpg|120px|thumb|Franka Emika Panda&lt;br /&gt;
File:Clearpath_Jackal.jpg|120px|thumb|Clearpath Jackal&lt;br /&gt;
File:Kinova_KG-3_Gripper.jpg|120px|thumb|Kinova 3-finger gripper&lt;br /&gt;
File:Xarm7.jpg|120px|thumb|UFACTORY xArm7&lt;br /&gt;
File:Robotont_banner.png|120px|thumb|robotont&lt;br /&gt;
File:Turtlebot3-waffle-pi.jpg|thumb|TurtleBot3&lt;br /&gt;
File:Parrot-bebop-2.jpg|thumb|Parrot Bebop 2&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== ROBOTONT: Docker-Driven ROS Environment Switching ===&lt;br /&gt;
This thesis focuses on integrating Docker containers to manage and switch between different ROS (Robot Operating System) environments on Robotont. Currently, ROS is installed natively on Robotont's Ubuntu-based onboard computer, limiting flexibility in system recovery and configuration switching. The goal is to develop a Docker-based solution that allows users to switch between ROS environments directly from Robotont’s existing low-level menu interface.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: designing and implementing a communication protocol for additional devices ===&lt;br /&gt;
Robotont currently includes a designated area on its front for attaching additional devices, such as an ultrasonic range finder, a servo motor connected to an Arduino, or a MikroBUS device. This thesis aims to design and implement a standard communication protocol that will enable seamless integration and control of these devices through the high-level ROS framework running on Robotont’s onboard computer.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: analysis of different options as on-board computers ===&lt;br /&gt;
Currently, ROBOTONT uses an Intel NUC as its onboard computer. The goal of this thesis is to validate Robotont's software stack on alternative compute devices (e.g. Raspberry Pi 4, Intel Compute Stick, and NVIDIA Jetson Nano) and benchmark their performance on the most popular Robotont use-case demos (e.g. webapp teleop, AR-tag steering, 2D mapping, and 3D mapping). The objective is to propose at least two different compute solutions: one optimized for cost and another optimized for performance. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT Lite ===&lt;br /&gt;
The goal of this thesis is to optimize the ROBOTONT platform for cost by replacing the onboard computer and sensors with low-cost alternatives while ensuring ROS/ROS2 software compatibility. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: integrating a graphical programming interface ===&lt;br /&gt;
The goal of this thesis is to integrate a graphical programming solution (e.g. Scratch or Blockly) to enable programming of Robotont by non-experts. See, e.g., https://doi.org/10.48550/arXiv.2011.13706&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== SemuBOT: multiple topics ===&lt;br /&gt;
In the 2023/2024 study year, many thesis topics support the development of the open-source humanoid robot project SemuBOT (https://www.facebook.com/semubotmtu/).&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Robotic Study Companion: a social robot for students in higher education ===&lt;br /&gt;
Potential Topics:&lt;br /&gt;
* Enhance the Robot's Speech/Natural Language Capabilities &lt;br /&gt;
* Build a Local Language Model for the RSC&lt;br /&gt;
* Develop and Program the Robot’s Behavior and Personality&lt;br /&gt;
* Build a Digital Twin Simulation for Multimodal Interaction&lt;br /&gt;
* Explore the Use of the RSC as an Affective Robot to Address Students’ Academic Emotions&lt;br /&gt;
* Explore and Implement Cybersecurity Measures for Social Robot&lt;br /&gt;
&lt;br /&gt;
[https://github.com/orgs/RobotStudyCompanion/discussions/3 More info on Github] | reach out to farnaz.baksh@ut.ee for more info&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Virtual reality user interface (VRUI) for intuitive teleoperation system ===&lt;br /&gt;
[[Image:LeapMotion.png|200px|thumb|right|Detecting 2 hands with Leap Motion Controller]]&lt;br /&gt;
Enhancing the user-experience of a virtual reality UI developed by [https://github.com/ut-ims-robotics/vrui_rviz Georg Astok]. Potentially adding [http://www.osvr.org/hardware-devs.html virtual reality capability] to a [https://www.youtube.com/watch?v=L25HHFd00rc gesture- and natural-language-based robot teleoperation system].&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
[[Image:Temoto-working-principle-in-pics.png|x160px|Gesture-based teleoperation]]&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Modeling humans for human-robot interaction ===&lt;br /&gt;
True human-robot collaboration means that the robot must understand the actions, intentions, and state of its human partner. This work involves using cameras and other human-monitoring sensors to digitally represent and model humans. Modeling occurs at multiple levels: a) physical models of human kinematics and dynamics; b) higher-level models for recognizing human intent.&amp;lt;br&amp;gt;&lt;br /&gt;
[[Image:Skeletal ROS.PNG|x160px|ROS &amp;amp; Kinect &amp;amp; Skeleton-Markers Package]]&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Robotic avatar for telepresence ===&lt;br /&gt;
Integrating hand gestures and head movements to control a robot avatar in virtual reality user interface.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Detection of hardware and software resources for smart integration of robots ===&lt;br /&gt;
The vast majority of today’s robotic applications rely on hard-coded device and algorithm usage. This project focuses on developing Resource Snooper software that can detect the addition or removal of resources, enabling dynamic reconfiguration of robotic systems. This project is developed as a subsystem of [https://utnuclearroboticspublic.github.io/temoto2/index.html TeMoto].&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Human-Robot and Robot-Robot collaboration applications ===&lt;br /&gt;
Creating a demo or analysis of software capabilities related to human-robot or robot-robot teams:&lt;br /&gt;
* human-robot collaborative assembly&lt;br /&gt;
* distributed mapping; analysis and demo of existing ROS packages (e.g., segmap https://youtu.be/JJhEkIA1xSE) for multi-robot mapping&lt;br /&gt;
* Inaccessible region teamwork&lt;br /&gt;
**youbot+drone - a drone maps the environment (for example, a maze) and a ground vehicle uses this information to traverse the maze&lt;br /&gt;
**youbot+robotont - the youBot cannot climb ledges, but it can lift a smaller robot, such as Robotont, up a ledge.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Mirroring human hand movements on industrial robots ===&lt;br /&gt;
The goal of this project is to integrate continuous control of an industrial robot manipulator with a gestural telerobotics interface. The recommended tools for this thesis project are a Leap Motion Controller or a standard web camera, Ultraleap, a Universal Robots UR5 manipulator, and ROS.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: TeMoto for robotont ===&lt;br /&gt;
Swarm-management and UMRF-based task loading for robotont using [https://github.com/temoto-framework TeMoto] framework.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Enhancing teleoperation control interface with augmented cues to provoke caution===&lt;br /&gt;
The task is to create a telerobot control interface in which the video feed from the remote site and/or a mixed-reality scene is augmented with visual cues that provoke caution in the human operator.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Robot-to-human interaction ===&lt;br /&gt;
As robots and autonomous machines start sharing the same space as humans, their actions need to be understood by the people occupying that space. For instance, a human worker needs to understand what the robot partner is planning next, or a pedestrian needs to clearly comprehend the behaviour of a driverless vehicle. To reduce ambiguity, the robot needs mechanisms to convey its intent (whatever it is going to do next). The aim of the thesis is to outline existing methods for machines to convey their intent and to develop a unified model interface for expressing that intent. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Gaze-based handover prediction ===&lt;br /&gt;
When a human needs to pass an object to a robot manipulator, the robot must understand where in 3D space the handover occurs and then plan an appropriate motion. Human gaze can be used as input for predicting which object to track. The thesis activities involve camera-based eye tracking and safe motion planning.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: Human-height human-robot interface for Robotont ground robot ===&lt;br /&gt;
Robotont is an ankle-high flat mobile robot. For humans to interact with Robotont, there is a need for a compact and lightweight mechanical structure that is tall enough for comfortable human-robot interaction. The objective of the thesis is to develop the mechanical designs and build prototypes that ensure stable operation and meet the aesthetic requirements for use in public places. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Stratos Explore Ultraleap demonstrator for robotics ===&lt;br /&gt;
The aim of this thesis is to systematically analyze the strengths and limitations of the Stratos Explore Ultraleap device in the context of controlling a robot, and subsequently to implement a demonstrator application showing its applicability for robot control. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Mixed-reality scene creation for vehicle teleoperation ===&lt;br /&gt;
Fusing different sensory feeds to create a high-usability teleoperation scene.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Validation study for AR-based robot user-interfaces ===&lt;br /&gt;
Designing and carrying out a user study to validate the functionality and usability of a human-robot interface.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Quantitative Evaluation of the Situation Awareness during the Teleoperation of an Urban Vehicle ===&lt;br /&gt;
The goal of this thesis is to develop the methodology for measuring the operator's situation awareness while operating an autonomous urban vehicle. Potential metrics could include driving accuracy, speed, and latency. This thesis will be conducted as part of the activities at the [https://adl.cs.ut.ee Autonomous Driving Lab].&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Sign-language-based control for robots ===&lt;br /&gt;
The goal of this thesis is to design and implement the means for interacting with a robot via conventional sign language, i.e. the human can command a robot by using sign language.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== LLM-based task planning for robots ===&lt;br /&gt;
The goal of this thesis is to enable high-level task planning for autonomous robots performing general-purpose tasks. The thesis would leverage an LLM's reasoning capabilities and the TeMoto Action Engine to achieve natural task negotiation, planning, and execution. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
= Completed projects =&lt;br /&gt;
== PhD theses ==&lt;br /&gt;
*Robert Valner, [https://hdl.handle.net/10062/105994 Design of TeMoto, a software framework for dependable, adaptive, and collaborative autonomous robots] [TeMoto – töökindlate, adaptiivsete ja koostöövõimeliste autonoomsete robotite arendamise tarkvararaamistik], PhD thesis, 2024&lt;br /&gt;
*Houman Masnavi, [https://hdl.handle.net/10062/91394 Visibility aware navigation] [Nähtavust arvestav navigatsioon], PhD thesis, 2023&lt;br /&gt;
&lt;br /&gt;
== Master's theses ==&lt;br /&gt;
*Asier Mandiola Arrizabalaga, Haptic-Based Teleoperation of Robots, MS thesis, 2025&lt;br /&gt;
*Dāvis Krūmiņš, Web-based Robotics Lab for Effortless ROS2 Development, MS thesis, 2025&lt;br /&gt;
*Julian Rene Leclerc, Natural Language Human-Robot Interaction: A Modular Framework for Conversational Robot Control Using Large Language Models, MS thesis, 2025&lt;br /&gt;
*Miriam Calafa’, Designing Multimodal Emotional Expression for a Robotic Study Companion, MS thesis, 2025&lt;br /&gt;
*Iryna Hurova, Model-based Planning Using GPU-accelerated Simulator as a World Model, MS thesis, 2025&lt;br /&gt;
*Sander Toma Võrk, Elektrooniliste termomeetrite automaatse kalibreerimissüsteemi väljatöötamine ja rakendamine [Development and implementation of an automatic calibration system for electronic thermometers], MS thesis, 2025&lt;br /&gt;
*Carl Hjalmar Love Hult, SurfMotion: An Open Source Pipeline for Robotic Pipe Cutting and Welding, MS thesis, 2025&lt;br /&gt;
*Paola Avalos Conchas, Socially Aware Planning for Indoor Navigation, MS thesis, 2025&lt;br /&gt;
*Robert Allik, Validation of NoMaD as a Global Planner for Mobile Robots, MS thesis, 2024&lt;br /&gt;
*Rauno Põlluäär, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=79393 Designing and Implementing a Bird’s-eye View Interface for a Self-driving Vehicle’s Teleoperation System] [Isejuhtiva sõiduki kaugjuhtimissüsteemile linnuvaate kasutajaliidese loomine], MS thesis, 2024&lt;br /&gt;
*Gautier Reynes, VR-Enhanced Remote Inspection Framework for Semi-Autonomous Robot Fleet, MS thesis, 2024&lt;br /&gt;
*Eva Mõtshärg, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=77385 3D-prinditava kere disain ja analüüs vabavaralisele haridusrobotile Robotont] [Design and Analysis of a 3D Printable Chassis for the Open Source Educational Robot Robotont], MS thesis, 2023&lt;br /&gt;
*Farnaz Baksh, [https://dspace.ut.ee/items/a7a9cc15-27e9-450c-94e8-3267f0c95c56 An Open-source Robotic Study Companion for University Students] [Avatud lähtekoodiga robotõpikaaslane üliõpilastele], MS thesis, 2023&lt;br /&gt;
*Igor Rybalskii, [http://hdl.handle.net/10062/83028 Augmented reality (AR) for enabling human-robot collaboration with ROS robots] [Liitreaalsus inimese ja roboti koostöö võimaldamiseks ROS-i robotitega], MS thesis, 2022&lt;br /&gt;
*Md. Maniruzzaman, [http://hdl.handle.net/10062/83025 Object search and retrieval in indoor environment using a Mobile Manipulator] [Objektide otsimine ja teisaldamine siseruumides mobiilse manipulaatori abil], MS thesis, 2022&lt;br /&gt;
*Allan Kustavus, [http://hdl.handle.net/10062/72651 Design and Implementation of a Generalized Resource Management Architecture in the TeMoto Software Framework] [Üldise ressursihalduri disain ja teostus TeMoto tarkvara raamistikule], MS thesis, 2021&lt;br /&gt;
*Kristina Meister, [http://hdl.handle.net/10062/72350 External human-vehicle interaction - a study in the context of an autonomous ride-hailing service], MS thesis, 2021&lt;br /&gt;
*Muhammad Usman, [http://hdl.handle.net/10062/72126 Development of an Optimization-Based Motion Planner and Its ROS Interface for a Non-Holonomic Mobile Manipulator] [Optimeerimisele baseeruva liikumisplaneerija arendamine ja selle ROSi liides mitteholonoomse mobiilse manipulaatori jaoks], MS thesis, 2020&lt;br /&gt;
*Maarika Oidekivi, [http://hdl.handle.net/10062/72119 Masina kavatsuse väljendamine ja tõlgendamine] [Communicating and interpreting machine intent], MS thesis, 2020&lt;br /&gt;
*Houman Masnavi, [http://hdl.handle.net/10062/72118 Multi-Robot Motion Planning for Shared Payload Transportation] [Rajaplaneerimine multi-robot süsteemile jagatud lasti transportimisel], MS thesis, 2020&lt;br /&gt;
*Fabian Ernesto Parra Gil, [http://hdl.handle.net/10062/72112 Implementation of Robot Manager Subsystem for Temoto Software Framework] [Robotite Halduri alamsüsteemi väljatöötamine tarkvararaamistikule TEMOTO], MS thesis, 2020&lt;br /&gt;
*Zafarullah, [http://hdl.handle.net/10062/72125 Gaze Assisted Neural Network based Prediction of End-Point of Human Reaching Trajectories], MS thesis, 2020&lt;br /&gt;
*Madis K Nigol, [http://hdl.handle.net/10062/64339 Õppematerjalid robotplatvormile Robotont] [Study materials for robot platform Robotont], MS thesis, 2019&lt;br /&gt;
*Renno Raudmäe, [http://hdl.handle.net/10062/64341 Avatud robotplatvorm Robotont] [Open source robotics platform Robotont], MS thesis, 2019&lt;br /&gt;
*Asif Sattar, [http://hdl.handle.net/10062/64352 Human detection and distance estimation with monocular camera using YOLOv3 neural network] [Inimeste tuvastamine ning kauguse hindamine kasutades kaamerat ning YOLOv3 tehisnärvivõrku], MS thesis, 2019&lt;br /&gt;
*Ragnar Margus, [http://hdl.handle.net/10062/64337 Kergliiklusvahendite jagamisteenuseks vajaliku positsioneerimismooduli loomine ja uurimine] [Development and testing of an IoT module for an electric vehicle sharing service], MS thesis, 2019&lt;br /&gt;
*Pavel Šumejko, [http://hdl.handle.net/10062/64320 Robust Solution for Extrinsic Calibration of a 2D Laser-Rangefinder and a Monocular USB Camera] [Meetod 2D laserkaugusmõõdiku ja USB-kaamera väliseks kalibreerimiseks], MS thesis, 2019&lt;br /&gt;
*Dzvezdana Arsovska, [http://hdl.handle.net/10062/64321  Building an Efficient and Secure Software Supply Pipeline for Aerial Robotics Application] [Efektiivne ja turvaline tarkvaraarendusahel lennurobootika rakenduses], MS thesis, 2019&lt;br /&gt;
*Tõnis Tiimus, [http://hdl.handle.net/10062/60713 Visuaalselt esilekutsutud potentsiaalidel põhinev aju-arvuti liides robootika rakendustes] [A VEP-based BCI for robotics applications], MS thesis, 2018&lt;br /&gt;
*Martin Appo, [http://hdl.handle.net/10062/60718 Hardware-agnostic compliant control ROS package for collaborative industrial manipulators] [Riistvarapaindlik ROSi tarkvarapakett tööstuslike robotite mööndlikuks juhtimiseks], MS thesis, 2018&lt;br /&gt;
*Hassan Mahmoud Shehawy Elhanash, [http://hdl.handle.net/10062/60711 Optical Tracking of Forearm for Classifying Fingers Poses] [Küünarvarre visuaalne jälgimine ennustamaks sõrmede asendeid], MS thesis, 2018&lt;br /&gt;
&lt;br /&gt;
== Bachelor's theses ==&lt;br /&gt;
*Oliver Voorel, Humanoidrobot Semuboti toitesüsteemi uuendamine [Upgrading the power system of the humanoid robot SemuBot], BS thesis, 2025&lt;br /&gt;
*Karl Rahn, Millimeeterlaine radari integreerimine avatud lähtekoodiga muruniiduki platvormil Open Mower [Integrating a millimeter-wave radar on the open-source lawnmower platform Open Mower], BS thesis, 2025&lt;br /&gt;
*Mattias Mäe, Sõiduki kaugjuhtimise viivituse mõõtmine [Measuring latency in vehicle teleoperation], BS thesis, 2025&lt;br /&gt;
*Jürgen Kottise, Dockeri konteineritel põhinev haldustarkvara Robotont 3 õpperobotile [Docker-container-based management software for the educational robot Robotont 3], BS thesis, 2025&lt;br /&gt;
*Martin Kaur, Servodel põhinev sotsiaalse robotkäe süsteem SemuBotile [Servo-based social robot arm system for SemuBot], BS thesis, 2025&lt;br /&gt;
*Karl-Jürgen Siilak, Mass Portal Grand Pharaoh XD 3D-printeri uuendamine [Upgrading the Mass Portal Grand Pharaoh XD 3D printer], BS thesis, 2025&lt;br /&gt;
*Usman, , BS thesis, 2025&lt;br /&gt;
*Robina, , BS thesis, 2025&lt;br /&gt;
*Omar, , BS thesis, 2025&lt;br /&gt;
*Albert Unn, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=79937 Suhtlusvõimekuse arendamine sotsiaalsele humanoidrobotile SemuBot] [Developing the ability to communicate for SemuBot, a social humanoid robot], BS thesis, 2024&lt;br /&gt;
*Veronika-Marina Volynets, [https://hdl.handle.net/10062/99943 Development of Control Electronics and Program for Robotont's Height Adjustment Mechanism] [Juhtelektroonika ja programmi väljatöötamine Robotondi kõrguse reguleerimise mehhanismile], BS thesis, 2024&lt;br /&gt;
*Nikita Kurenkov, [https://hdl.handle.net/10062/99880 Flexible Screen Integration and Development of Neck Movement Mechanism for Social Humanoid Robot SemuBot] [Humanoidroboti näo lahenduse leidmine ja rakendamine; humanoidroboti jaoks kaela mehhanismi väljatöötamine], BS thesis, 2024&lt;br /&gt;
*Elchin Huseynov, [https://hdl.handle.net/10062/99668 Design and Control of a Social Humanoid Robot - SemuBot’s Hand] [Sotsiaalse Humanoidroboti Disain ja Juhtimine - SemuBoti Käsi], BS thesis, 2024&lt;br /&gt;
*Veronika Podliesnova, [https://hdl.handle.net/10062/99636 Real-Time Detection of Robot Failures by Monitoring Operator’s Brain Activity with EEG-based Brain-Computer Interface] [Reaalajas Robotirikke Tuvastamine Operaatori Ajutegevuse Jälgimise teel EEG-põhise Aju-Arvuti Liidese abil], BS thesis, 2024&lt;br /&gt;
*Märten Josh Peedimaa, [https://hdl.handle.net/not-yet Rhea: An Open-Source Table Tennis Ball Launcher Robot for Multiball Training] [Avatud lähtekoodiga lauatennise palliviske robot mitmikpall treenimiseks], BS thesis, 2024&lt;br /&gt;
*Karl Sander Vinkel, [https://hdl.handle.net/not-yet Laadimisjaama ja transportkesta väljatöötamine õpperobotile Robotont] [Development of charging dock and transport case for Robotont], BS thesis, 2024&lt;br /&gt;
*Raimo Köidam, [https://hdl.handle.net/not-yet Valguslahenduse tarkvara väljatöötamine õpperobotile Robotont] [Development of light solution software for the educational robot Robotont], BS thesis, 2024&lt;br /&gt;
*Robert Valge, [https://hdl.handle.net/not-yet Toitepinge ja tarbevoolu monitoorimine ning toitehalduse püsivara loomine õpperobotil Robotont] [Monitoring supply voltage and current consumption and creating firmware for Robotont’s power management system], BS thesis, 2024&lt;br /&gt;
*Leonid Tšigrinski, [https://hdl.handle.net/not-yet Õpperoboti Robotont püsivara arhitektuuri uuendamine] [Education robot &amp;quot;Robotont&amp;quot; firmware architecture updating], BS thesis, 2024&lt;br /&gt;
*Andres Sakk, [https://hdl.handle.net/not-yet Avatud robotplatvormi Robotont 3 kasutajaliidese väljatöötamine] [Development of a user interface for the open robotics platform Robotont 3], BS thesis, 2024&lt;br /&gt;
*Kaur Kullamäe, [https://hdl.handle.net/not-yet Elektroonikalahendus ja püsivara robotkäe juhtimiseks sotsiaalsel humanoidrobotil SemuBot] [Building the SemuBot arm electronics system], BS thesis, 2024&lt;br /&gt;
*Kristjan Madis Kask, [https://hdl.handle.net/not-yet Käe mehaanika disain sotsiaalsele humanoidrobotile SemuBot] [Arm mechanics design for social humanoid robot SemuBot], BS thesis, 2024&lt;br /&gt;
*Sven-Ervin Paap, [https://hdl.handle.net/not-yet ROS2 draiver õpperobotile Robotont] [ROS2 driver for the educational robot Robotont], BS thesis, 2024&lt;br /&gt;
*Timur Nizamov, [https://hdl.handle.net/10062/99592 Audio System for the Social Humanoid Robot SemuBot] [Helisüsteem sotsiaalsele humanoidrobotile SemuBot], BS thesis, 2024&lt;br /&gt;
*Georgs Narbuts, [https://hdl.handle.net/10062/99554 Holonomic Motion Drive System of a Social Humanoid Robot SemuBot] [Sotsiaalse humanoidroboti SemuBoti holonoomne liikuv ajamisüsteem], BS thesis, 2024&lt;br /&gt;
*Iryna Hurova, [https://hdl.handle.net/10062/90638 Kitting station of the learning factory] [Õppetehase komplekteerimisjaam], BS thesis, 2023&lt;br /&gt;
*Paola Avalos Conchas, [https://hdl.handle.net/10062/90641 Payload transportation system of a learning factory] [Õppetehase kasuliku koorma transpordisüsteem], BS thesis, 2023&lt;br /&gt;
*Pille Pärnalaas, [https://hdl.handle.net/10062/93431 Pöördpõik ajami arendus robotplatvormile Robotont] [Development of swerve drive for robotic platform Robotont], BS thesis, 2023&lt;br /&gt;
*Carl Hjalmar Love Hult, [https://hdl.handle.net/10062/90673 Using and Evaluating the Real-time Spatial Perception System Hydra in Real-world Scenarios] [Reaalajas toimiva ruumilise taju süsteemi Hydra kasutamine ja hindamine praktilistes stsenaariumides], BS thesis, 2023&lt;br /&gt;
*Priit Rooden, [https://hdl.handle.net/10062/93432 Autonoomse laadimislahenduse väljatöötamine õpperobotile Robotont] [Development of an autonomous charging solution for the robot platform Robotont], BS thesis, 2023&lt;br /&gt;
*Marko Muro, [https://hdl.handle.net/10062/93429 Robotondi akulahenduse ning 12 V pingeregulaatori prototüüpimine] [Prototyping battery solution and 12 V voltage regulator for Robotont], BS thesis, 2023&lt;br /&gt;
*Danel Leppenen, [https://hdl.handle.net/10062/93427 Nav2 PYIF: Python-based Motion Planning for ROS 2 Navigation 2] [Nav 2 PYIF: Pythoni põhine liikumise planeerija ROS 2 Navigation 2-le], BS thesis, 2023&lt;br /&gt;
*Kertrud Geddily Küüt, [https://hdl.handle.net/10062/93424 Kõrgust reguleeriv mehhanism Robotondile] [Height adjusting mechanism for Robotont], BS thesis, 2023&lt;br /&gt;
*Ingvar Drikkit, [https://hdl.handle.net/10062/93421 Lisaseadmete võimekuse arendamine haridusrobotile Robotont] [Developing add-on device support for the educational robot Robotont], BS thesis, 2023&lt;br /&gt;
*Kristo Pool, [https://hdl.handle.net/10062/93430 MoveIt 2 õppematerjalid] [Learning materials for MoveIt 2], BS thesis, 2023&lt;br /&gt;
*Erki Veeväli, [https://hdl.handle.net/10062/93434 Development of a Continuous Teleoperation System for Urban Road Vehicle] [Linnasõiduki pideva kaugjuhtimissüsteemi arendus], BS thesis, 2023&lt;br /&gt;
*Aleksandra Doroshenko, [https://hdl.handle.net/10062/93420 Haiglates inimesi juhatava roboti disain] [Hospital guide robot design], BS thesis, 2023&lt;br /&gt;
*Hui Shi, [http://hdl.handle.net/10062/83043 Expanding the Open-source ROS Software Package opencv_apps with Dedicated Blob Detection Functionality] [Avatud lähtekoodiga ROS-i tarkvarakimbu opencv_apps laiendamine laigutuvasti funktsionaalsusega], BS thesis, 2022&lt;br /&gt;
*Dāvis Krūmiņš, [http://hdl.handle.net/10062/83040 Web-based learning and software development environment for remote access of ROS robots] [Veebipõhine õppe- ja tarkvaraarenduse keskkond ROS robotite juurdepääsuks kaugteel], BS thesis, 2022&lt;br /&gt;
*Anna Jakovleva, [http://hdl.handle.net/10062/83037 Roboquiz - an interactive human-robot game] [Roboquiz - interaktiivne inimese ja roboti mäng], BS thesis, 2022&lt;br /&gt;
*Kristjan Laht, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=74702&amp;amp;year=2022 Robot Localization with Fiducial Markers] [Roboti lokaliseerimine koordinaatmärkidega], BS thesis, 2022&lt;br /&gt;
*Hans Pärtel Pani, [http://hdl.handle.net/10062/83008 ROS draiver pehmerobootika haaratsile] [ROS Driver for Soft Robotic Gripper], BS thesis, 2022&lt;br /&gt;
*Markus Erik Sügis, [http://hdl.handle.net/10062/83015 Jagatud juhtimise põhimõttel realiseeritud robotite kaugjuhtimissüsteem] [A continuous teleoperating system based on shared control concept], BS thesis, 2022&lt;br /&gt;
*Taaniel Küla, [http://hdl.handle.net/10062/83007 ROS2 platvormile keevitusroboti tarkvara portotüübi loomine kasutades UR5 robot-manipulaatorit] [Welding robot software prototype for ROS2 using UR5 robot arm], BS thesis, 2022&lt;br /&gt;
*Rauno Põlluäär, [http://hdl.handle.net/10062/72672 Veebirakendus-põhine kasutajaliides avatud robotplatvormi Robotont juhtimiseks ja haldamiseks] [Web application-based user interface for controlling and managing open-source robotics platform Robotont], BS thesis, 2021&lt;br /&gt;
*Hendrik Olesk, [http://hdl.handle.net/10062/72664 Nägemisulatuses kaugjuhitava mobiilse robotmanipulaatori kasutajamugavuse tõstmine] [Improving the usability of a mobile manipulator robot for line-of-sight remote control], BS thesis, 2021&lt;br /&gt;
*Tarvi Tepandi, [http://hdl.handle.net/10062/72665 Segareaalsusel põhinev kasutajaliides mobiilse roboti kaugjuhtimiseks Microsoft HoloLens 2 vahendusel] [Mixed-reality user interface for teleoperating mobile robots with Microsoft HoloLens 2], BS thesis, 2021&lt;br /&gt;
*Rudolf Põldma, [http://hdl.handle.net/10062/72674 Tartu linna Narva maantee ringristmiku digikaksik] [Digital twin for Narva street roundabout in Tartu], BS thesis, 2021&lt;br /&gt;
*Kwasi Akuamoah Boateng, [http://hdl.handle.net/10062/72804 Digital Twin of a Teaching and Learning Robotics Lab] [Robotite õpetamise ja õppimise labori digitaalne kaksik], BS thesis, 2021&lt;br /&gt;
*Karina Sein, [http://hdl.handle.net/10062/72102 Eestikeelse kõnesünteesi võimaldamine robootika arendusplatvormil ROS] [Enabling Estonian speech synthesis on the Robot Operating System (ROS)], BS thesis, 2020&lt;br /&gt;
*Ranno Mäesepp, [http://hdl.handle.net/10062/72100 Takistuste vältimise lahendus õpperobotile Robotont] [Obstacle avoidance solution for educational robot platform Robotont], BS thesis, 2020&lt;br /&gt;
*Igor Rybalskii, [http://hdl.handle.net/10062/72060 Gesture Detection Software for Human-Robot Collaboration] [Žestituvastus tarkvara inimese ja roboti koostööks], BS thesis, 2020&lt;br /&gt;
*Meelis Pihlap, [http://hdl.handle.net/10062/64292 Mitme roboti koostöö funktsionaalsuste väljatöötamine tarkvararaamistikule TeMoto] [Multi-robot collaboration functionalities for robot software development framework TeMoto], BS thesis, 2019&lt;br /&gt;
*Kaarel Mark, [http://hdl.handle.net/10062/64290 Liitreaalsuse kasutamine tootmisprotsessis asukohtade määramisel] [Augmented reality for location determination in manufacturing], BS thesis, 2019&lt;br /&gt;
*Kätriin Julle, [http://hdl.handle.net/10062/64279 Roboti KUKA youBot riistvara ja ROS-tarkvara uuendamine] [Upgrading robot KUKA youBot’s hardware and ROS-software], BS thesis, 2019&lt;br /&gt;
*Georg Astok, [http://hdl.handle.net/10062/64274 Roboti juhtimine virtuaalreaalsuses kasutades ROS-raamistikku] [Creating virtual reality user interface using only ROS framework], BS thesis, 2019&lt;br /&gt;
*Martin Hallist, [http://hdl.handle.net/10062/64275 Robotipõhine kaugkohalolu käte liigutuste ülekandmiseks] [Teleoperation robot for arms motions], BS thesis, 2019&lt;br /&gt;
*Ahmed Hassan Helmy Mohamed, [http://hdl.handle.net/10062/63946 Software integration of autonomous robot system for mixing and serving drinks] [Jooke valmistava ja serveeriva robotsüsteemi tarkvaralahendus], BS thesis, 2019&lt;br /&gt;
*Kristo Allaje, [http://hdl.handle.net/10062/60294 Draiveripakett käte jälgimise seadme Leap Motion™ kontroller kasutamiseks robootika arendusplatvormil ROS] [Driver package for using Leap Motion™ controller on the robotics development platform ROS], BS thesis, 2018&lt;br /&gt;
*Martin Maidla, [http://hdl.handle.net/10062/60288 Avatud robotiarendusplatvormi Robotont omniliikumise ja odomeetria arendamine] [Omnimotion and odometry development for open robot development platform Robotont], BS thesis, 2018&lt;br /&gt;
*Raid Vellerind, [http://hdl.handle.net/10062/56559 Avatud robotiarendusplatvormi ROS võimekuse loomine Tartu Ülikooli Robotexi robootikaplatvormile] [ROS driver development for the University of Tartu’s Robotex robotics platform], BS thesis, 2017&lt;br /&gt;
*Kalle-Gustav Kruus, [http://hdl.handle.net/10062/32440 Jalgpalliroboti löögimehhanismi elektroonikalahendus] [Driver circuit for a kicking mechanism of a football robot], BS thesis, 2013&lt;br /&gt;
*Sven Kautlenbach, [http://hdl.handle.net/10062/25570 Autonoomne seade elastsusmooduli mõõtmiseks] [Electronic device for Young’s modulus measurements], BS thesis, 2011&lt;br /&gt;
&lt;br /&gt;
[[Category:Theses Topics]]&lt;/div&gt;</summary>
		<author><name>Karl</name></author>
	</entry>
	<entry>
		<id>https://ims.ut.ee/index.php?title=Theses_in_Robotics&amp;diff=43110</id>
		<title>Theses in Robotics</title>
		<link rel="alternate" type="text/html" href="https://ims.ut.ee/index.php?title=Theses_in_Robotics&amp;diff=43110"/>
		<updated>2025-07-09T21:07:04Z</updated>

		<summary type="html">&lt;p&gt;Karl: /* Bachelor's theses */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:IMS-robotics]]&lt;br /&gt;
&amp;lt;div class=&amp;quot;toclimit-3&amp;quot;&amp;gt;__TOC__&amp;lt;/div&amp;gt; &lt;br /&gt;
= Projects in Advanced Robotics =&lt;br /&gt;
''The main objective of the following projects is to give students experience in working with advanced robotics technology. Our group is active in several R&amp;amp;D projects involving human-robot collaboration, intuitive teleoperation of robots, and autonomous navigation of unmanned mobile platforms. Our main software platforms are [http://www.ros.org/ Robot Operating System (ROS)] for developing software for advanced robot systems and [http://gazebosim.org/ Gazebo] for running realistic robotic simulations.''&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
For further information, contact [[User:Karl|Karl Kruusamäe]]. &lt;br /&gt;
&lt;br /&gt;
The following is not an exhaustive list of all available thesis/research topics.&lt;br /&gt;
&lt;br /&gt;
== Highlighted theses topics for 2025/2026 study year ==&lt;br /&gt;
# [[#LLM-based task planning for robots|LLM-based task planning for robots]]&lt;br /&gt;
# [[#ROBOTONT: analysis of different options as on-board computers|ROBOTONT: analysis of different options as on-board computers]]&lt;br /&gt;
# [[#SemuBOT: multiple topics|SemuBOT: multiple topics]]&lt;br /&gt;
# [[#ROBOTONT: integrating a graphical programming interface|ROBOTONT: integrating a graphical programming interface]]&lt;br /&gt;
# [[#Sign-language-based control for robots|Sign-language-based control for robots]]&lt;br /&gt;
# [[#Robotic Study Companion: a social robot for students in higher education|Robotic Study Companion: a social robot for students in higher education]]&lt;br /&gt;
# [[#Quantitative Evaluation of the Situation Awareness during the Teleoperation of an Urban Vehicle|Quantitative Evaluation of the Situation Awareness during the Teleoperation of an Urban Vehicle]]&lt;br /&gt;
&lt;br /&gt;
== List of potential thesis topics ==&lt;br /&gt;
Our inventory includes but is not limited to:&lt;br /&gt;
&amp;lt;gallery mode=&amp;quot;packed&amp;quot;&amp;gt;&lt;br /&gt;
File:Youbot.png|120px|thumb|KUKA youBot&lt;br /&gt;
File:Ur5_left.png|120px|thumb|Universal Robot UR5&lt;br /&gt;
File:Franka_Emika_Panda.jpg|120px|thumb|Franka Emika Panda&lt;br /&gt;
File:Clearpath_Jackal.jpg|120px|thumb|Clearpath Jackal&lt;br /&gt;
File:Kinova_KG-3_Gripper.jpg|120px|thumb|Kinova 3-finger gripper&lt;br /&gt;
File:Xarm7.jpg|120px|thumb|UFACTORY xArm7&lt;br /&gt;
File:Robotont_banner.png|120px|thumb|robotont&lt;br /&gt;
File:Turtlebot3-waffle-pi.jpg|thumb|TurtleBot3&lt;br /&gt;
File:Parrot-bebop-2.jpg|thumb|Parrot Bebop 2&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== ROBOTONT: Docker-Driven ROS Environment Switching ===&lt;br /&gt;
This thesis focuses on integrating Docker containers to manage and switch between different ROS (Robot Operating System) environments on Robotont. Currently, ROS is installed natively on Robotont's Ubuntu-based onboard computer, limiting flexibility in system recovery and configuration switching. The goal is to develop a Docker-based solution that allows users to switch between ROS environments directly from Robotont’s existing low-level menu interface.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: designing and implementing a communication protocol for additional devices ===&lt;br /&gt;
Robotont currently includes a designated area on its front for attaching additional devices, such as an ultrasonic range finder, a servo motor connected to an Arduino, or a MikroBUS device. This thesis aims to design and implement a standard communication protocol that will enable seamless integration and control of these devices through the high-level ROS framework running on Robotont’s onboard computer.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: analysis of different options as on-board computers ===&lt;br /&gt;
Currently, ROBOTONT uses an Intel NUC as its onboard computer. The goal of this thesis is to validate Robotont's software stack on alternative compute devices (e.g. Raspberry Pi 4, Intel Compute Stick, and NVIDIA Jetson Nano) and benchmark their performance for the most popular Robotont use-case demos (e.g. webapp teleop, AR-tag steering, 2D mapping, and 3D mapping). The objective is to propose at least two different compute solutions: one that optimizes for cost and another that optimizes for performance. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT Lite ===&lt;br /&gt;
The goal of this thesis is to optimize the ROBOTONT platform for cost by replacing the onboard computer and sensors with low-cost alternatives while ensuring ROS/ROS2 software compatibility. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: integrating a graphical programming interface ===&lt;br /&gt;
The goal of this thesis is to integrate a graphical programming solution (e.g. Scratch or Blockly) to enable programming of Robotont by non-experts. See, e.g., https://doi.org/10.48550/arXiv.2011.13706&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== SemuBOT: multiple topics ===&lt;br /&gt;
In 2023/2024, many topics are offered to support the development of the open-source humanoid robot project SemuBOT (https://www.facebook.com/semubotmtu/)&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Robotic Study Companion: a social robot for students in higher education ===&lt;br /&gt;
Potential Topics:&lt;br /&gt;
* Enhance the Robot's Speech/Natural Language Capabilities &lt;br /&gt;
* Build a Local Language Model for the RSC&lt;br /&gt;
* Develop and Program the Robot’s Behavior and Personality&lt;br /&gt;
* Build a Digital Twin Simulation for Multimodal Interaction&lt;br /&gt;
* Explore the Use of the RSC as an Affective Robot to Address Students’ Academic Emotions&lt;br /&gt;
* Explore and Implement Cybersecurity Measures for a Social Robot&lt;br /&gt;
&lt;br /&gt;
[https://github.com/orgs/RobotStudyCompanion/discussions/3 More info on Github] | reach out to farnaz.baksh@ut.ee for more info&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Virtual reality user interface (VRUI) for intuitive teleoperation system ===&lt;br /&gt;
[[Image:LeapMotion.png|200px|thumb|right|Detecting 2 hands with Leap Motion Controller]]&lt;br /&gt;
Enhancing the user-experience of a virtual reality UI developed by [https://github.com/ut-ims-robotics/vrui_rviz Georg Astok]. Potentially adding [http://www.osvr.org/hardware-devs.html virtual reality capability] to a [https://www.youtube.com/watch?v=L25HHFd00rc gesture- and natural-language-based robot teleoperation system].&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
[[Image:Temoto-working-principle-in-pics.png|x160px|Gesture-based teleoperation]]&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Modeling humans for human-robot interaction ===&lt;br /&gt;
True human-robot collaboration means that the robot must understand the actions, intention, and state of its human partner. This work involves using cameras and other human-sensing devices to digitally represent and model humans. There are multiple stages of modeling: a) physical models of human kinematics and dynamics; b) higher-level models for recognizing human intent.&amp;lt;br&amp;gt;&lt;br /&gt;
[[Image:Skeletal ROS.PNG|x160px|ROS &amp;amp; Kinect &amp;amp; Skeleton-Markers Package]]&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Robotic avatar for telepresence ===&lt;br /&gt;
Integrating hand gestures and head movements to control a robot avatar in virtual reality user interface.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Detection of hardware and software resources for smart integration of robots ===&lt;br /&gt;
The vast majority of today’s robotic applications rely on hard-coded device and algorithm usage. This project focuses on developing Resource Snooper software that can detect the addition or removal of resources, enabling dynamic reconfiguration of robotic systems. This project is developed as a subsystem of [https://utnuclearroboticspublic.github.io/temoto2/index.html TeMoto].&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Human-Robot and Robot-Robot collaboration applications ===&lt;br /&gt;
Creating a demo or analysis of software capabilities related to human-robot or robot-robot teams:&lt;br /&gt;
* human-robot collaborative assembly&lt;br /&gt;
* distributed mapping; analysis and demo of existing ROS (e.g., segmap https://youtu.be/JJhEkIA1xSE) packages for multi-robot mapping&lt;br /&gt;
* Inaccessible region teamwork&lt;br /&gt;
**youbot+drone - a drone maps the environment (for example a maze) and a ground vehicle uses this information to traverse the maze&lt;br /&gt;
**youbot+robotont - youbot cannot go up ledges, but it can lift a smaller robot, such as robotont, up a ledge.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Mirroring human hand movements on industrial robots ===&lt;br /&gt;
The goal of this project is to integrate continuous control of an industrial robot manipulator with a gestural telerobotics interface. The recommended tools for this thesis project are a Leap Motion Controller or a standard web camera, Ultraleap, a Universal Robots UR5 manipulator, and ROS.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: TeMoto for robotont ===&lt;br /&gt;
Swarm-management and UMRF-based task loading for robotont using [https://github.com/temoto-framework TeMoto] framework.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Enhancing teleoperation control interface with augmented cues to provoke caution ===&lt;br /&gt;
The task is to create a telerobot control interface where the video feed from the remote site and/or a mixed-reality scene is augmented with visual cues to provoke caution in the human operator.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Robot-to-human interaction ===&lt;br /&gt;
As robots and autonomous machines start sharing the same space as humans, their actions need to be understood by the people occupying that space. For instance, a human worker needs to understand what the robot partner is planning next, or a pedestrian needs to clearly comprehend the behaviour of a driverless vehicle. To reduce this ambiguity, the robot needs mechanisms to convey its intent (i.e., what it is going to do next). The aim of the thesis is to outline existing methods for machines to convey their intent and develop a unified model interface for expressing that intent. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Gaze-based handover prediction ===&lt;br /&gt;
When a human needs to pass an object to a robot manipulator, the robot must understand where in 3D space the object handover occurs and then plan an appropriate motion. Human gaze can be used as input for predicting which object to track. The thesis activities involve camera-based eye tracking and safe motion planning.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: Human-height human-robot interface for Robotont ground robot ===&lt;br /&gt;
Robotont is an ankle-high flat mobile robot. For humans to interact with Robotont, there is a need for a compact and lightweight mechanical structure that is tall enough for comfortable human-robot interaction. The objective of the thesis is to develop the mechanical designs and build prototypes that ensure stable operation and meet the aesthetic requirements for use in public places. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Stratos Explore Ultraleap demonstrator for robotics ===&lt;br /&gt;
The aim of this thesis is to systematically analyze the strengths and limitations of the Stratos Explore Ultraleap device in the context of controlling a robot, and subsequently to implement a demonstrator application showcasing its applicability for robot control. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Mixed-reality scene creation for vehicle teleoperation ===&lt;br /&gt;
Fusing different sensory feeds for creating high-usability teleoperation scene.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Validation study for AR-based robot user-interfaces ===&lt;br /&gt;
Designing and carrying out a user study to validate the functionality and usability of a human-robot interface.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Quantitative Evaluation of the Situation Awareness during the Teleoperation of an Urban Vehicle ===&lt;br /&gt;
The goal of this thesis is to develop the methodology for measuring the operator's situation awareness while operating an autonomous urban vehicle. Potential metrics could include driving accuracy, speed, and latency. This thesis will be conducted as part of the activities at the [https://adl.cs.ut.ee Autonomous Driving Lab].&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Sign-language-based control for robots ===&lt;br /&gt;
The goal of this thesis is to design and implement the means for interacting with a robot via conventional sign language, i.e. enabling a human to command a robot using sign language.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== LLM-based task planning for robots ===&lt;br /&gt;
The goal of this thesis is to enable high-level task planning for autonomous robots performing general-purpose tasks. The thesis would leverage an LLM's reasoning capabilities and the TeMoto Action Engine to achieve natural task negotiation, planning, and execution. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
= Completed projects =&lt;br /&gt;
== PhD theses ==&lt;br /&gt;
*Robert Valner, [https://hdl.handle.net/10062/105994 Design of TeMoto, a software framework for dependable, adaptive, and collaborative autonomous robots] [TeMoto – töökindlate, adaptiivsete ja koostöövõimeliste autonoomsete robotite arendamise tarkvararaamistik], PhD thesis, 2024&lt;br /&gt;
*Houman Masnavi, [https://hdl.handle.net/10062/91394 Visibility aware navigation] [Nähtavust arvestav navigatsioon], PhD thesis, 2023&lt;br /&gt;
&lt;br /&gt;
== Master's theses ==&lt;br /&gt;
*Asier Mandiola Arrizabalaga, Haptic-Based Teleoperation of Robots, MS thesis, 2025&lt;br /&gt;
*Dāvis Krūminš, Web-based Robotics Lab for Effortless ROS2 Development, MS thesis, 2025&lt;br /&gt;
*Julian Rene Leclerc, Natural Language Human-Robot Interaction: A Modular Framework for Conversational Robot Control Using Large Language Models, MS thesis, 2025&lt;br /&gt;
*Miriam Calafa’, Designing Multimodal Emotional Expression for a Robotic Study Companion, MS thesis, 2025&lt;br /&gt;
*Iryna Hurova, Model-based Planning Using GPU-accelerated Simulator as a World Model, MS thesis, 2025&lt;br /&gt;
*Sander Toma Võrk, Elektrooniliste termomeetrite automaatse kalibreerimissüsteemi väljatöötamine ja rakendamine, MS thesis, 2025&lt;br /&gt;
*Carl Hjalmar Love Hult, SurfMotion: An Open Source Pipeline for Robotic Pipe Cutting and Welding, MS thesis, 2025&lt;br /&gt;
*Paola Avalos Conchas, Socially Aware Planning for Indoor Navigation, MS thesis, 2025&lt;br /&gt;
*Robert Allik, Validation of NoMaD as a Global Planner for Mobile Robots, MS thesis, 2024&lt;br /&gt;
*Rauno Põlluäär, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=79393 Designing and Implementing a Bird’s-eye View Interface for a Self-driving Vehicle’s Teleoperation System] [Isejuhtiva sõiduki kaugjuhtimissüsteemile linnuvaate kasutajaliidese loomine], MS thesis, 2024&lt;br /&gt;
*Gautier Reynes, VR-Enhanced Remote Inspection Framework for Semi-Autonomous Robot Fleet, MS thesis, 2024&lt;br /&gt;
*Eva Mõtshärg, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=77385 3D-prinditava kere disain ja analüüs vabavaralisele haridusrobotile Robotont] [Design and Analysis of a 3D Printable Chassis for the Open Source Educational Robot Robotont], MS thesis, 2023&lt;br /&gt;
*Farnaz Baksh, [https://dspace.ut.ee/items/a7a9cc15-27e9-450c-94e8-3267f0c95c56 An Open-source Robotic Study Companion for University Students] [Avatud lähtekoodiga robotõpikaaslane üliõpilastele], MS thesis, 2023&lt;br /&gt;
*Igor Rybalskii, [http://hdl.handle.net/10062/83028 Augmented reality (AR) for enabling human-robot collaboration with ROS robots] [Liitreaalsus inimese ja roboti koostöö võimaldamiseks ROS-i robotitega], MS thesis, 2022&lt;br /&gt;
*Md. Maniruzzaman, [http://hdl.handle.net/10062/83025 Object search and retrieval in indoor environment using a Mobile Manipulator] [Objektide otsimine ja teisaldamine siseruumides mobiilse manipulaatori abil], MS thesis, 2022&lt;br /&gt;
*Allan Kustavus, [http://hdl.handle.net/10062/72651 Design and Implementation of a Generalized Resource Management Architecture in the TeMoto Software Framework] [Üldise ressursihalduri disain ja teostus TeMoto tarkvara raamistikule], MS thesis, 2021&lt;br /&gt;
*Kristina Meister, [http://hdl.handle.net/10062/72350 External human-vehicle interaction - a study in the context of an autonomous ride-hailing service], MS thesis, 2021&lt;br /&gt;
*Muhammad Usman, [http://hdl.handle.net/10062/72126 Development of an Optimization-Based Motion Planner and Its ROS Interface for a Non-Holonomic Mobile Manipulator] [Optimeerimisele baseeruva liikumisplaneerija arendamine ja selle ROSi liides mitteholonoomse mobiilse manipulaatori jaoks], MS thesis, 2020&lt;br /&gt;
*Maarika Oidekivi, [http://hdl.handle.net/10062/72119 Masina kavatsuse väljendamine ja tõlgendamine] [Communicating and interpreting machine intent], MS thesis, 2020&lt;br /&gt;
*Houman Masnavi, [http://hdl.handle.net/10062/72118 Multi-Robot Motion Planning for Shared Payload Transportation] [Rajaplaneerimine multi-robot süsteemile jagatud lasti transportimisel], MS thesis, 2020&lt;br /&gt;
*Fabian Ernesto Parra Gil, [http://hdl.handle.net/10062/72112 Implementation of Robot Manager Subsystem for Temoto Software Framework] [Robotite Halduri alamsüsteemi väljatöötamine tarkvararaamistikule TEMOTO], MS thesis, 2020&lt;br /&gt;
*Zafarullah, [http://hdl.handle.net/10062/72125 Gaze Assisted Neural Network based Prediction of End-Point of Human Reaching Trajectories], MS thesis, 2020&lt;br /&gt;
*Madis K Nigol, [http://hdl.handle.net/10062/64339 Õppematerjalid robotplatvormile Robotont] [Study materials for robot platform Robotont], MS thesis, 2019&lt;br /&gt;
*Renno Raudmäe, [http://hdl.handle.net/10062/64341 Avatud robotplatvorm Robotont] [Open source robotics platform Robotont], MS thesis, 2019&lt;br /&gt;
*Asif Sattar, [http://hdl.handle.net/10062/64352 Human detection and distance estimation with monocular camera using YOLOv3 neural network] [Inimeste tuvastamine ning kauguse hindamine kasutades kaamerat ning YOLOv3 tehisnärvivõrku], MS thesis, 2019&lt;br /&gt;
*Ragnar Margus, [http://hdl.handle.net/10062/64337 Kergliiklusvahendite jagamisteenuseks vajaliku positsioneerimismooduli loomine ja uurimine] [Development and testing of an IoT module for an electric vehicle sharing service], MS thesis, 2019&lt;br /&gt;
*Pavel Šumejko, [http://hdl.handle.net/10062/64320 Robust Solution for Extrinsic Calibration of a 2D Laser-Rangefinder and a Monocular USB Camera] [Meetod 2D laserkaugusmõõdiku ja USB-kaamera väliseks kalibreerimiseks], MS thesis, 2019&lt;br /&gt;
*Dzvezdana Arsovska, [http://hdl.handle.net/10062/64321  Building an Efficient and Secure Software Supply Pipeline for Aerial Robotics Application] [Efektiivne ja turvaline tarkvaraarendusahel lennurobootika rakenduses], MS thesis, 2019&lt;br /&gt;
*Tõnis Tiimus, [http://hdl.handle.net/10062/60713 Visuaalselt esilekutsutud potentsiaalidel põhinev aju-arvuti liides robootika rakendustes] [A VEP-based BCI for robotics applications], MS thesis, 2018&lt;br /&gt;
*Martin Appo, [http://hdl.handle.net/10062/60718 Hardware-agnostic compliant control ROS package for collaborative industrial manipulators] [Riistvarapaindlik ROSi tarkvarapakett tööstuslike robotite mööndlikuks juhtimiseks], MS thesis, 2018&lt;br /&gt;
*Hassan Mahmoud Shehawy Elhanash, [http://hdl.handle.net/10062/60711 Optical Tracking of Forearm for Classifying Fingers Poses] [Küünarvarre visuaalne jälgimine ennustamaks sõrmede asendeid], MS thesis, 2018&lt;br /&gt;
&lt;br /&gt;
== Bachelor's theses ==&lt;br /&gt;
*Oliver Voorel, , BS thesis, 2025&lt;br /&gt;
*Karl Rahn, , BS thesis, 2025&lt;br /&gt;
*Mattias Mäe, , BS thesis, 2025&lt;br /&gt;
*Jürgen Kottise, , BS thesis, 2025&lt;br /&gt;
*Martin Kaur, , BS thesis, 2025&lt;br /&gt;
*Karl-Jürgen Siilak, , BS thesis, 2025&lt;br /&gt;
*Usman, , BS thesis, 2025&lt;br /&gt;
*Robina, , BS thesis, 2025&lt;br /&gt;
*Omar, , BS thesis, 2025&lt;br /&gt;
*Albert Unn, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=79937 Suhtlusvõimekuse arendamine sotsiaalsele humanoidrobotile SemuBot] [Developing the ability to communicate for SemuBot, a social humanoid robot], BS thesis, 2024&lt;br /&gt;
*Veronika-Marina Volynets, [https://hdl.handle.net/10062/99943 Development of Control Electronics and Program for Robotont's Height Adjustment Mechanism] [Juhtelektroonika ja programmi väljatöötamine Robotondi kõrguse reguleerimise mehhanismile], BS thesis, 2024&lt;br /&gt;
*Nikita Kurenkov, [https://hdl.handle.net/10062/99880 Flexible Screen Integration and Development of Neck Movement Mechanism for Social Humanoid Robot SemuBot] [Humanoidroboti näo lahenduse leidmine ja rakendamine; humanoidroboti jaoks kaela mehhanismi väljatöötamine], BS thesis, 2024&lt;br /&gt;
*Elchin Huseynov, [https://hdl.handle.net/10062/99668 Design and Control of a Social Humanoid Robot - SemuBot’s Hand] [Sotsiaalse Humanoidroboti Disain ja Juhtimine - SemuBoti Käsi], BS thesis, 2024&lt;br /&gt;
*Veronika Podliesnova, [https://hdl.handle.net/10062/99636 Real-Time Detection of Robot Failures by Monitoring Operator’s Brain Activity with EEG-based Brain-Computer Interface] [Reaalajas Robotirikke Tuvastamine Operaatori Ajutegevuse Jälgimise teel EEG-põhise Aju-Arvuti Liidese abil], BS thesis, 2024&lt;br /&gt;
*Märten Josh Peedimaa, [https://hdl.handle.net/not-yet Rhea: An Open-Source Table Tennis Ball Launcher Robot for Multiball Training] [Avatud lähtekoodiga lauatennise palliviske robot mitmikpall treenimiseks], BS thesis, 2024&lt;br /&gt;
*Karl Sander Vinkel, [https://hdl.handle.net/not-yet Laadimisjaama ja transportkesta väljatöötamine õpperobotile Robotont] [Development of charging dock and transport case for Robotont], BS thesis, 2024&lt;br /&gt;
*Raimo Köidam, [https://hdl.handle.net/not-yet Valguslahenduse tarkvara väljatöötamine õpperobotile Robotont] [Development of light solution software for the educational robot Robotont], BS thesis, 2024&lt;br /&gt;
*Robert Valge, [https://hdl.handle.net/not-yet Toitepinge ja tarbevoolu monitoorimine ning toitehalduse püsivara loomine õpperobotil Robotont] [Monitoring supply voltage and current consumption and creating firmware for Robotont’s power management system], BS thesis, 2024&lt;br /&gt;
*Leonid Tšigrinski, [https://hdl.handle.net/not-yet Õpperoboti Robotont püsivara arhitektuuri uuendamine] [Education robot &amp;quot;Robotont&amp;quot; firmware architecture updating], BS thesis, 2024&lt;br /&gt;
*Andres Sakk, [https://hdl.handle.net/not-yet Avatud robotplatvormi Robotont 3 kasutajaliidese väljatöötamine] [Development of a user interface for the open robotics platform Robotont 3], BS thesis, 2024&lt;br /&gt;
*Kaur Kullamäe, [https://hdl.handle.net/not-yet Elektroonikalahendus ja püsivara robotkäe juhtimiseks sotsiaalsel humanoidrobotil SemuBot] [Building the SemuBot arm electronics system], BS thesis, 2024&lt;br /&gt;
*Kristjan Madis Kask, [https://hdl.handle.net/not-yet Käe mehaanika disain sotsiaalsele humanoidrobotile SemuBot] [Arm mechanics design for social humanoid robot SemuBot], BS thesis, 2024&lt;br /&gt;
*Sven-Ervin Paap, [https://hdl.handle.net/not-yet ROS2 draiver õpperobotile Robotont] [ROS2 driver for the educational robot Robotont], BS thesis, 2024&lt;br /&gt;
*Timur Nizamov, [https://hdl.handle.net/10062/99592 Audio System for the Social Humanoid Robot SemuBot] [Helisüsteem sotsiaalsele humanoidrobotile SemuBot], BS thesis, 2024&lt;br /&gt;
*Georgs Narbuts, [https://hdl.handle.net/10062/99554 Holonomic Motion Drive System of a Social Humanoid Robot SemuBot] [Sotsiaalse humanoidroboti SemuBoti holonoomne liikuv ajamisüsteem], BS thesis, 2024&lt;br /&gt;
*Iryna Hurova, [https://hdl.handle.net/10062/90638 Kitting station of the learning factory] [Õppetehase komplekteerimisjaam], BS thesis, 2023&lt;br /&gt;
*Paola Avalos Conchas, [https://hdl.handle.net/10062/90641 Payload transportation system of a learning factory] [Õppetehase kasuliku koorma transpordisüsteem], BS thesis, 2023&lt;br /&gt;
*Pille Pärnalaas, [https://hdl.handle.net/10062/93431 Pöördpõik ajami arendus robotplatvormile Robotont] [Development of swerve drive for robotic platform Robotont], BS thesis, 2023&lt;br /&gt;
*Carl Hjalmar Love Hult, [https://hdl.handle.net/10062/90673 Using and Evaluating the Real-time Spatial Perception System Hydra in Real-world Scenarios] [Reaalajas toimiva ruumilise taju süsteemi Hydra kasutamine ja hindamine praktilistes stsenaariumides], BS thesis, 2023&lt;br /&gt;
*Priit Rooden, [https://hdl.handle.net/10062/93432 Autonoomse laadimislahenduse väljatöötamine õpperobotile Robotont] [Development of an autonomous charging solution for the robot platform Robotont], BS thesis, 2023&lt;br /&gt;
*Marko Muro, [https://hdl.handle.net/10062/93429 Robotondi akulahenduse ning 12 V pingeregulaatori prototüüpimine] [Prototyping battery solution and 12 V voltage regulator for Robotont], BS thesis, 2023&lt;br /&gt;
*Danel Leppenen, [https://hdl.handle.net/10062/93427 Nav2 PYIF: Python-based Motion Planning for ROS 2 Navigation 2] [Nav 2 PYIF: Pythoni põhine liikumise planeerija ROS 2 Navigation 2-le], BS thesis, 2023&lt;br /&gt;
*Kertrud Geddily Küüt, [https://hdl.handle.net/10062/93424 Kõrgust reguleeriv mehhanism Robotondile] [Height adjusting mechanism for Robotont], BS thesis, 2023&lt;br /&gt;
*Ingvar Drikkit, [https://hdl.handle.net/10062/93421 Lisaseadmete võimekuse arendamine haridusrobotile Robotont] [Developing add-on device support for the educational robot Robotont], BS thesis, 2023&lt;br /&gt;
*Kristo Pool, [https://hdl.handle.net/10062/93430 MoveIt 2 õppematerjalid] [Learning materials for MoveIt 2], BS thesis, 2023&lt;br /&gt;
*Erki Veeväli, [https://hdl.handle.net/10062/93434 Development of a Continuous Teleoperation System for Urban Road Vehicle] [Linnasõiduki pideva kaugjuhtimissüsteemi arendus], BS thesis, 2023&lt;br /&gt;
*Aleksandra Doroshenko, [https://hdl.handle.net/10062/93420 Haiglates inimesi juhatava roboti disain] [Hospital guide robot design], BS thesis, 2023&lt;br /&gt;
*Hui Shi, [http://hdl.handle.net/10062/83043 Expanding the Open-source ROS Software Pack age opencv_apps with Dedicated Blob Detection Functionality] [Avatud lähtekoodiga ROS-i tarkvarakimbu opencv_apps laiendamine laigutuvasti funktsioonaalsusega], BS thesis, 2022&lt;br /&gt;
*Dāvis Krūmiņš, [http://hdl.handle.net/10062/83040 Web-based learning and software development environment for remote access of ROS robots] [Veebipõhine õppe- ja tarkvaraarenduse keskkond ROS robotite juurdepääsuks kaugteel], BS thesis, 2022&lt;br /&gt;
*Anna Jakovleva, [http://hdl.handle.net/10062/83037 Roboquiz - an interactive human-robot game] [Roboquiz - interaktiivne inimese ja roboti mäng], BS thesis, 2022&lt;br /&gt;
*Kristjan Laht, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=74702&amp;amp;year=2022 Robot Localization with Fiducial Markers] [Roboti lokaliseerimine koordinaatmärkidega], BS thesis, 2022&lt;br /&gt;
*Hans Pärtel Pani, [http://hdl.handle.net/10062/83008 ROS draiver pehmerobootika haaratsile] [ROS Driver for Soft Robotic Gripper], BS thesis, 2022&lt;br /&gt;
*Markus Erik Sügis, [http://hdl.handle.net/10062/83015 Jagatud juhtimise põhimõttel realiseeritud robotite kaugjuhtimissüsteem] [A continuous teleoperating system based on shared control concept], BS thesis, 2022&lt;br /&gt;
*Taaniel Küla, [http://hdl.handle.net/10062/83007 ROS2 platvormile keevitusroboti tarkvara portotüübi loomine kasutades UR5 robot-manipulaatorit] [Welding robot software prototype for ROS2 using UR5 robot arm], BS thesis, 2022&lt;br /&gt;
*Rauno Põlluäär, [http://hdl.handle.net/10062/72672 Veebirakendus-põhine kasutajaliides avatud robotplatvormi Robotont juhtimiseks ja haldamiseks] [Web application-based user interface for controlling and managing open-source robotics platform Robotont], BS thesis, 2021&lt;br /&gt;
*Hendrik Olesk, [http://hdl.handle.net/10062/72664 Nägemisulatuses kaugjuhitava mobiilse robotmanipulaatori kasutajamugavuse tõstmine] [Improving the usability of a mobile manipulator robot for line-of-sight remote control], BS thesis, 2021&lt;br /&gt;
*Tarvi Tepandi, [http://hdl.handle.net/10062/72665 Segareaalsusel põhinev kasutajaliides mobiilse roboti kaugjuhtimiseks Microsoft HoloLens 2 vahendusel] [Mixed-reality user interface for teleoperating mobile robots with Microsoft HoloLens 2], BS thesis, 2021&lt;br /&gt;
*Rudolf Põldma, [http://hdl.handle.net/10062/72674 Tartu linna Narva maantee ringristmiku digikaksik] [Digital twin for Narva street roundabout in Tartu], BS thesis, 2021&lt;br /&gt;
*Kwasi Akuamoah Boateng, [http://hdl.handle.net/10062/72804 Digital Twin of a Teaching and Learning Robotics Lab] [Robotite õpetamise ja õppimise labori digitaalne kaksik], BS thesis, 2021&lt;br /&gt;
*Karina Sein, [http://hdl.handle.net/10062/72102 Eestikeelse kõnesünteesi võimaldamine robootika arendusplatvormil ROS] [Enabling Estonian speech synthesis on the Robot Operating System (ROS)], BS thesis, 2020&lt;br /&gt;
*Ranno Mäesepp, [http://hdl.handle.net/10062/72100 Takistuste vältimise lahendus õpperobotile Robotont] [Obstacle avoidance solution for educational robot platform Robotont], BS thesis, 2020&lt;br /&gt;
*Igor Rybalskii, [http://hdl.handle.net/10062/72060 Gesture Detection Software for Human-Robot Collaboration] [Žestituvastus tarkvara inimese ja roboti koostööks], BS thesis, 2020&lt;br /&gt;
*Meelis Pihlap, [http://hdl.handle.net/10062/64292 Mitme roboti koostöö funktsionaalsuste väljatöötamine tarkvararaamistikule TeMoto] [Multi-robot collaboration functionalities for robot software development framework TeMoto], BS thesis, 2019&lt;br /&gt;
*Kaarel Mark, [http://hdl.handle.net/10062/64290 Liitreaalsuse kasutamine tootmisprotsessis asukohtade määramisel] [Augmented reality for location determination in manufacturing], BS thesis, 2019&lt;br /&gt;
*Kätriin Julle, [http://hdl.handle.net/10062/64279 Roboti KUKA youBot riistvara ja ROS-tarkvara uuendamine] [Upgrading robot KUKA youBot’s hardware and ROS-software], BS thesis, 2019&lt;br /&gt;
*Georg Astok, [http://hdl.handle.net/10062/64274 Roboti juhtimine virtuaalreaalsuses kasutades ROS-raamistikku] [Creating virtual reality user interface using only ROS framework], BS thesis, 2019&lt;br /&gt;
*Martin Hallist, [http://hdl.handle.net/10062/64275 Robotipõhine kaugkohalolu käte liigutuste ülekandmiseks] [Teleoperation robot for arms motions], BS thesis, 2019&lt;br /&gt;
*Ahmed Hassan Helmy Mohamed, [http://hdl.handle.net/10062/63946 Software integration of autonomous robot system for mixing and serving drinks] [Jooke valmistava ja serveeriva robotsüsteemi tarkvaralahendus], BS thesis, 2019&lt;br /&gt;
*Kristo Allaje, [http://hdl.handle.net/10062/60294 Draiveripakett käte jälgimise seadme Leap Motion™ kontroller kasutamiseks robootika arendusplatvormil ROS] [Driver package for using Leap Motion™ controller on the robotics development platform ROS], BS thesis, 2018&lt;br /&gt;
*Martin Maidla, [http://hdl.handle.net/10062/60288 Avatud robotiarendusplatvormi Robotont omniliikumise ja odomeetria arendamine] [Omnimotion and odometry development for open robot development platform Robotont], BS thesis, 2018&lt;br /&gt;
*Raid Vellerind, [http://hdl.handle.net/10062/56559 Avatud robotiarendusplatvormi ROS võimekuse loomine Tartu Ülikooli Robotexi robootikaplatvormile] [ROS driver development for the University of Tartu’s Robotex robotics platform], BS thesis, 2017&lt;br /&gt;
*Kalle-Gustav Kruus, [http://hdl.handle.net/10062/32440 Jalgpalliroboti löögimehhanismi elektroonikalahendus] [Driver circuit for a kicking mechanism of a football robot], BS thesis, 2013&lt;br /&gt;
*Sven Kautlenbach, [http://hdl.handle.net/10062/25570 Autonoomne seade elastsusmooduli mõõtmiseks] [Electronic device for Young’s modulus measurements], BS thesis, 2011&lt;br /&gt;
&lt;br /&gt;
[[Category:Theses Topics]]&lt;/div&gt;</summary>
		<author><name>Karl</name></author>
	</entry>
	<entry>
		<id>https://ims.ut.ee/index.php?title=Theses_in_Robotics&amp;diff=43109</id>
		<title>Theses in Robotics</title>
		<link rel="alternate" type="text/html" href="https://ims.ut.ee/index.php?title=Theses_in_Robotics&amp;diff=43109"/>
		<updated>2025-07-09T21:06:11Z</updated>

		<summary type="html">&lt;p&gt;Karl: /* Masters's theses */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:IMS-robotics]]&lt;br /&gt;
&amp;lt;div class=&amp;quot;toclimit-3&amp;quot;&amp;gt;__TOC__&amp;lt;/div&amp;gt; &lt;br /&gt;
= Projects in Advanced Robotics =&lt;br /&gt;
''The main objective of the following projects is to give students experience in working with advanced robotics technology. Our group is active in several R&amp;amp;D projects involving human-robot collaboration, intuitive teleoperation of robots, and autonomous navigation of unmanned mobile platforms. Our main software platforms are [http://www.ros.org/ Robot Operating System (ROS)] for developing software for advanced robot systems and [http://gazebosim.org/ Gazebo] for running realistic robotic simulations.''&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
For further information, contact [[User:Karl|Karl Kruusamäe]]. &lt;br /&gt;
&lt;br /&gt;
The following is not an exhaustive list of all available thesis/research topics.&lt;br /&gt;
&lt;br /&gt;
== Highlighted thesis topics for the 2025/2026 study year ==&lt;br /&gt;
# [[#LLM-based task planning for robots|LLM-based task planning for robots]]&lt;br /&gt;
# [[#ROBOTONT: analysis of different options as on-board computers|ROBOTONT: analysis of different options as on-board computers]]&lt;br /&gt;
# [[#SemuBOT: multiple topics|SemuBOT: multiple topics]]&lt;br /&gt;
# [[#ROBOTONT: integrating a graphical programming interface|ROBOTONT: integrating a graphical programming interface]]&lt;br /&gt;
# [[#Sign-language-based control for robots|Sign-language-based control for robots]]&lt;br /&gt;
# [[#Robotic Study Companion: a social robot for students in higher education|Robotic Study Companion: a social robot for students in higher education]]&lt;br /&gt;
# [[#Quantitative Evaluation of the Situation Awareness during the Teleoperation of an Urban Vehicle|Quantitative Evaluation of the Situation Awareness during the Teleoperation of an Urban Vehicle]]&lt;br /&gt;
&lt;br /&gt;
== List of potential thesis topics ==&lt;br /&gt;
Our inventory includes but is not limited to:&lt;br /&gt;
&amp;lt;gallery mode=&amp;quot;packed&amp;quot;&amp;gt;&lt;br /&gt;
File:Youbot.png|120px|thumb|KUKA youBot&lt;br /&gt;
File:Ur5_left.png|120px|thumb|Universal Robot UR5&lt;br /&gt;
File:Franka_Emika_Panda.jpg|120px|thumb|Franka Emika Panda&lt;br /&gt;
File:Clearpath_Jackal.jpg|120px|thumb|Clearpath Jackal&lt;br /&gt;
File:Kinova_KG-3_Gripper.jpg|120px|thumb|Kinova 3-finger gripper&lt;br /&gt;
File:Xarm7.jpg|120px|thumb|UFACTORY xArm7&lt;br /&gt;
File:Robotont_banner.png|120px|thumb|robotont&lt;br /&gt;
File:Turtlebot3-waffle-pi.jpg|thumb|TurtleBot3&lt;br /&gt;
File:Parrot-bebop-2.jpg|thumb|Parrot Bebop 2&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== ROBOTONT: Docker-Driven ROS Environment Switching ===&lt;br /&gt;
This thesis focuses on integrating Docker containers to manage and switch between different ROS (Robot Operating System) environments on Robotont. Currently, ROS is installed natively on Robotont's Ubuntu-based onboard computer, limiting flexibility in system recovery and configuration switching. The goal is to develop a Docker-based solution that allows users to switch between ROS environments directly from Robotont’s existing low-level menu interface.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: designing and implementing a communication protocol for additional devices ===&lt;br /&gt;
Robotont currently includes a designated area on its front for attaching additional devices, such as an ultrasonic range finder, a servo motor connected to an Arduino, or a MikroBUS device. This thesis aims to design and implement a standard communication protocol that will enable seamless integration and control of these devices through the high-level ROS framework running on Robotont’s onboard computer.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: analysis of different options as on-board computers ===&lt;br /&gt;
Currently, ROBOTONT uses an Intel NUC as its onboard computer. The goal of this thesis is to validate Robotont's software stack on alternative compute devices (e.g. Raspberry Pi 4, Intel Compute Stick, and NVIDIA Jetson Nano) and benchmark their performance for the most popular Robotont use-case demos (e.g. webapp teleop, AR-tag steering, 2D mapping, and 3D mapping). The objective is to propose at least two different compute solutions: one that optimizes for cost and another that optimizes for performance. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT Lite ===&lt;br /&gt;
The goal of this thesis is to optimize the ROBOTONT platform for cost by replacing the onboard computer and sensors with low-cost alternatives while ensuring ROS/ROS2 software compatibility. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: integrating a graphical programming interface ===&lt;br /&gt;
The goal of this thesis is to integrate a graphical programming solution (e.g. Scratch or Blockly) to enable programming of Robotont by non-experts. See, e.g., https://doi.org/10.48550/arXiv.2011.13706&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== SemuBOT: multiple topics ===&lt;br /&gt;
In 2023/2024, many topics are available to support the development of the open-source humanoid robot project SemuBOT (https://www.facebook.com/semubotmtu/)&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Robotic Study Companion: a social robot for students in higher education ===&lt;br /&gt;
Potential Topics:&lt;br /&gt;
* Enhance the Robot's Speech/Natural Language Capabilities &lt;br /&gt;
* Build a Local Language Model for the RSC&lt;br /&gt;
* Develop and Program the Robot’s Behavior and Personality&lt;br /&gt;
* Build a Digital Twin Simulation for Multimodal Interaction&lt;br /&gt;
* Explore the Use of the RSC as an Affective Robot to Address Students’ Academic Emotions&lt;br /&gt;
* Explore and Implement Cybersecurity Measures for Social Robot&lt;br /&gt;
&lt;br /&gt;
[https://github.com/orgs/RobotStudyCompanion/discussions/3 More info on Github] | reach out to farnaz.baksh@ut.ee for more info&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Virtual reality user interface (VRUI) for intuitive teleoperation system ===&lt;br /&gt;
[[Image:LeapMotion.png|200px|thumb|right|Detecting 2 hands with Leap Motion Controller]]&lt;br /&gt;
Enhancing the user experience of a virtual reality UI developed by [https://github.com/ut-ims-robotics/vrui_rviz Georg Astok]. Potentially adding [http://www.osvr.org/hardware-devs.html virtual reality capability] to a [https://www.youtube.com/watch?v=L25HHFd00rc gesture- and natural-language-based robot teleoperation system].&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
[[Image:Temoto-working-principle-in-pics.png|x160px|Gesture-based teleoperation]]&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Modeling humans for human-robot interaction ===&lt;br /&gt;
True human-robot collaboration means that the robot must understand the actions, intention, and state of its human partner. This work involves using cameras and other sensors to digitally represent and model humans. There are multiple stages of modeling: a) physical models of human kinematics and dynamics; b) higher-level models for recognizing human intent.&amp;lt;br&amp;gt;&lt;br /&gt;
[[Image:Skeletal ROS.PNG|x160px|ROS &amp;amp; Kinect &amp;amp; Skeleton-Markers Package]]&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Robotic avatar for telepresence ===&lt;br /&gt;
Integrating hand gestures and head movements to control a robot avatar in virtual reality user interface.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Detection of hardware and software resources for smart integration of robots ===&lt;br /&gt;
The vast majority of today’s robotic applications rely on hard-coded device and algorithm usage. This project focuses on developing Resource Snooper software that can detect the addition or removal of resources, enabling dynamic reconfiguration of robotic systems. This project is developed as a subsystem of [https://utnuclearroboticspublic.github.io/temoto2/index.html TeMoto].&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Human-Robot and Robot-Robot collaboration applications ===&lt;br /&gt;
Creating a demo or analysis of software capabilities related to human-robot or robot-robot teams:&lt;br /&gt;
* human-robot collaborative assembly&lt;br /&gt;
* distributed mapping: analysis and demo of existing ROS packages for multi-robot mapping (e.g., segmap https://youtu.be/JJhEkIA1xSE)&lt;br /&gt;
* Inaccessible region teamwork&lt;br /&gt;
**youbot+drone - a drone maps the environment (for example, a maze) and a ground vehicle uses this information to traverse the maze&lt;br /&gt;
**youbot+robotont - youbot cannot go up ledges, but it can lift a smaller robot, such as robotont, up a ledge.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Mirroring human hand movements on industrial robots ===&lt;br /&gt;
The goal of this project is to integrate continuous control of an industrial robot manipulator with a gestural telerobotics interface. The recommended tools for this thesis project are a Leap Motion Controller or a standard web camera, Ultraleap, a Universal Robot UR5 manipulator, and ROS.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: TeMoto for robotont ===&lt;br /&gt;
Swarm-management and UMRF-based task loading for robotont using [https://github.com/temoto-framework TeMoto] framework.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Enhancing teleoperation control interface with augmented cues to provoke caution===&lt;br /&gt;
The task is to create a telerobot control interface where the video feed from the remote site and/or a mixed-reality scene is augmented with visual cues to provoke caution in the human operator.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Robot-to-human interaction ===&lt;br /&gt;
As robots and autonomous machines start sharing the same space as humans, their actions need to be understood by the people occupying that space. For instance, a human worker needs to understand what the robot partner is planning next, or a pedestrian needs to clearly comprehend the behaviour of a driverless vehicle. To reduce this ambiguity, the robot needs mechanisms to convey its intent (what it is going to do next). The aim of the thesis is to outline existing methods for machines to convey their intent and to develop a unified model interface for expressing that intent. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Gaze-based handover prediction ===&lt;br /&gt;
When a human needs to pass an object to a robot manipulator, the robot must understand where in 3D space the object handover occurs and then plan an appropriate motion. Human gaze can be used as the input for predicting which object to track. Thesis activities involve camera-based eye tracking and safe motion planning.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: Human-height human-robot interface for Robotont ground robot ===&lt;br /&gt;
Robotont is an ankle-high, flat mobile robot. For humans to interact with Robotont, there is a need for a compact and lightweight mechanical structure that is tall enough for comfortable human-robot interaction. The objective of the thesis is to develop mechanical designs and build prototypes that ensure stable operation and meet the aesthetic requirements for use in public places. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Stratos Explore Ultraleap demonstrator for robotics ===&lt;br /&gt;
The aim of this thesis is to systematically analyze the strengths and limitations of the Stratos Explore Ultraleap device in the context of controlling a robot, and subsequently to implement a demonstrator application showing its applicability for robot control. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Mixed-reality scene creation for vehicle teleoperation ===&lt;br /&gt;
Fusing different sensory feeds to create a high-usability teleoperation scene.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Validation study for AR-based robot user-interfaces ===&lt;br /&gt;
Designing and carrying out a user study to validate the functionality and usability of a human-robot interface.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Quantitative Evaluation of the Situation Awareness during the Teleoperation of an Urban Vehicle ===&lt;br /&gt;
The goal of this thesis is to develop the methodology for measuring the operator's situation awareness while operating an autonomous urban vehicle. Potential metrics could include driving accuracy, speed, and latency. This thesis will be conducted as part of the activities at the [https://adl.cs.ut.ee Autonomous Driving Lab].&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Sign-language-based control for robots ===&lt;br /&gt;
The goal of this thesis is to design and implement the means for interacting with a robot via conventional sign language, i.e. enabling a human to command a robot using sign language.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== LLM-based task planning for robots ===&lt;br /&gt;
The goal of this thesis is to enable high-level task planning for autonomous robots performing general-purpose tasks. The thesis would leverage an LLM's reasoning capabilities and the TeMoto Action Engine to achieve natural task negotiation, planning, and execution. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
= Completed projects =&lt;br /&gt;
== PhD theses ==&lt;br /&gt;
*Robert Valner, [https://hdl.handle.net/10062/105994 Design of TeMoto, a software framework for dependable, adaptive, and collaborative autonomous robots] [TeMoto – töökindlate, adaptiivsete ja koostöövõimeliste autonoomsete robotite arendamise tarkvararaamistik], PhD thesis, 2024&lt;br /&gt;
*Houman Masnavi, [https://hdl.handle.net/10062/91394 Visibility aware navigation] [Nähtavust arvestav navigatsioon], PhD thesis, 2023&lt;br /&gt;
&lt;br /&gt;
== Master's theses ==&lt;br /&gt;
*Asier Mandiola Arrizabalaga, Haptic-Based Teleoperation of Robots, MS thesis, 2025&lt;br /&gt;
*Dāvis Krūminš, Web-based Robotics Lab for Effortless ROS2 Development, MS thesis, 2025&lt;br /&gt;
*Julian Rene Leclerc, Natural Language Human-Robot Interaction: A Modular Framework for Conversational Robot Control Using Large Language Models, MS thesis, 2025&lt;br /&gt;
*Miriam Calafa’, Designing Multimodal Emotional Expression for a Robotic Study Companion, MS thesis, 2025&lt;br /&gt;
*Iryna Hurova, Model-based Planning Using GPU-accelerated Simulator as a World Model, MS thesis, 2025&lt;br /&gt;
*Sander Toma Võrk, Elektrooniliste termomeetrite automaatse kalibreerimissüsteemi väljatöötamine ja rakendamine [Development and implementation of an automatic calibration system for electronic thermometers], MS thesis, 2025&lt;br /&gt;
*Carl Hjalmar Love Hult, SurfMotion: An Open Source Pipeline for Robotic Pipe Cutting and Welding, MS thesis, 2025&lt;br /&gt;
*Paola Avalos Conchas, Socially Aware Planning for Indoor Navigation, MS thesis, 2025&lt;br /&gt;
*Robert Allik, Validation of NoMaD as a Global Planner for Mobile Robots, MS thesis, 2024&lt;br /&gt;
*Rauno Põlluäär, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=79393 Designing and Implementing a Bird’s-eye View Interface for a Self-driving Vehicle’s Teleoperation System] [Isejuhtiva sõiduki kaugjuhtimissüsteemile linnuvaate kasutajaliidese loomine], MS thesis, 2024&lt;br /&gt;
*Gautier Reynes, VR-Enhanced Remote Inspection Framework for Semi-Autonomous Robot Fleet, MS thesis, 2024&lt;br /&gt;
*Eva Mõtshärg, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=77385 3D-prinditava kere disain ja analüüs vabavaralisele haridusrobotile Robotont] [Design and Analysis of a 3D Printable Chassis for the Open Source Educational Robot Robotont], MS thesis, 2023&lt;br /&gt;
*Farnaz Baksh, [https://dspace.ut.ee/items/a7a9cc15-27e9-450c-94e8-3267f0c95c56 An Open-source Robotic Study Companion for University Students] [Avatud lähtekoodiga robotõpikaaslane üliõpilastele], MS thesis, 2023&lt;br /&gt;
*Igor Rybalskii, [http://hdl.handle.net/10062/83028 Augmented reality (AR) for enabling human-robot collaboration with ROS robots] [Liitreaalsus inimese ja roboti koostöö võimaldamiseks ROS-i robotitega], MS thesis, 2022&lt;br /&gt;
*Md. Maniruzzaman, [http://hdl.handle.net/10062/83025 Object search and retrieval in indoor environment using a Mobile Manipulator] [Objektide otsimine ja teisaldamine siseruumides mobiilse manipulaatori abil], MS thesis, 2022&lt;br /&gt;
*Allan Kustavus, [http://hdl.handle.net/10062/72651 Design and Implementation of a Generalized Resource Management Architecture in the TeMoto Software Framework] [Üldise ressursihalduri disain ja teostus TeMoto tarkvara raamistikule], MS thesis, 2021&lt;br /&gt;
*Kristina Meister, [http://hdl.handle.net/10062/72350 External human-vehicle interaction - a study in the context of an autonomous ride-hailing service], MS thesis, 2021&lt;br /&gt;
*Muhammad Usman, [http://hdl.handle.net/10062/72126 Development of an Optimization-Based Motion Planner and Its ROS Interface for a Non-Holonomic Mobile Manipulator] [Optimeerimisele baseeruva liikumisplaneerija arendamine ja selle ROSi liides mitteholonoomse mobiilse manipulaatori jaoks], MS thesis, 2020&lt;br /&gt;
*Maarika Oidekivi, [http://hdl.handle.net/10062/72119 Masina kavatsuse väljendamine ja tõlgendamine] [Communicating and interpreting machine intent], MS thesis, 2020&lt;br /&gt;
*Houman Masnavi, [http://hdl.handle.net/10062/72118 Multi-Robot Motion Planning for Shared Payload Transportation] [Rajaplaneerimine multi-robot süsteemile jagatud lasti transportimisel], MS thesis, 2020&lt;br /&gt;
*Fabian Ernesto Parra Gil, [http://hdl.handle.net/10062/72112 Implementation of Robot Manager Subsystem for Temoto Software Framework] [Robotite Halduri alamsüsteemi väljatöötamine tarkvararaamistikule TEMOTO], MS thesis, 2020&lt;br /&gt;
*Zafarullah, [http://hdl.handle.net/10062/72125 Gaze Assisted Neural Network based Prediction of End-Point of Human Reaching Trajectories], MS thesis, 2020&lt;br /&gt;
*Madis K Nigol, [http://hdl.handle.net/10062/64339 Õppematerjalid robotplatvormile Robotont] [Study materials for robot platform Robotont], MS thesis, 2019&lt;br /&gt;
*Renno Raudmäe, [http://hdl.handle.net/10062/64341 Avatud robotplatvorm Robotont] [Open source robotics platform Robotont], MS thesis, 2019&lt;br /&gt;
*Asif Sattar, [http://hdl.handle.net/10062/64352 Human detection and distance estimation with monocular camera using YOLOv3 neural network] [Inimeste tuvastamine ning kauguse hindamine kasutades kaamerat ning YOLOv3 tehisnärvivõrku], MS thesis, 2019&lt;br /&gt;
*Ragnar Margus, [http://hdl.handle.net/10062/64337 Kergliiklusvahendite jagamisteenuseks vajaliku positsioneerimismooduli loomine ja uurimine] [Development and testing of an IoT module for electric vehicle sharing service], MS thesis, 2019&lt;br /&gt;
*Pavel Šumejko, [http://hdl.handle.net/10062/64320 Robust Solution for Extrinsic Calibration of a 2D Laser-Rangefinder and a Monocular USB Camera] [Meetod 2D laserkaugusmõõdiku ja USB-kaamera väliseks kalibreerimiseks], MS thesis, 2019&lt;br /&gt;
*Dzvezdana Arsovska, [http://hdl.handle.net/10062/64321 Building an Efficient and Secure Software Supply Pipeline for Aerial Robotics Application] [Efektiivne ja turvaline tarkvaraarendusahel lennurobootika rakenduses], MS thesis, 2019&lt;br /&gt;
*Tõnis Tiimus, [http://hdl.handle.net/10062/60713 Visuaalselt esilekutsutud potentsiaalidel põhinev aju-arvuti liides robootika rakendustes] [A VEP-based BCI for robotics applications], MS thesis, 2018&lt;br /&gt;
*Martin Appo, [http://hdl.handle.net/10062/60718 Hardware-agnostic compliant control ROS package for collaborative industrial manipulators] [Riistvarapaindlik ROSi tarkvarapakett tööstuslike robotite mööndlikuks juhtimiseks], MS thesis, 2018&lt;br /&gt;
*Hassan Mahmoud Shehawy Elhanash, [http://hdl.handle.net/10062/60711 Optical Tracking of Forearm for Classifying Fingers Poses] [Küünarvarre visuaalne jälgimine ennustamaks sõrmede asendeid], MS thesis, 2018&lt;br /&gt;
&lt;br /&gt;
== Bachelor's theses ==&lt;br /&gt;
*Oliver Voorel&lt;br /&gt;
*Karl Rahn&lt;br /&gt;
*Mattias Mäe&lt;br /&gt;
*Jürgen Kottise&lt;br /&gt;
*Martin Kaur&lt;br /&gt;
*Karl-Jürgen Siilak&lt;br /&gt;
*Usman&lt;br /&gt;
*Robina&lt;br /&gt;
*Omar&lt;br /&gt;
*Albert Unn, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=79937 Suhtlusvõimekuse arendamine sotsiaalsele humanoidrobotile SemuBot] [Developing the ability to communicate for SemuBot, a social humanoid robot], BS thesis, 2024&lt;br /&gt;
*Veronika-Marina Volynets, [https://hdl.handle.net/10062/99943 Development of Control Electronics and Program for Robotont's Height Adjustment Mechanism] [Juhtelektroonika ja programmi väljatöötamine Robotondi kõrguse reguleerimise mehhanismile], BS thesis, 2024&lt;br /&gt;
*Nikita Kurenkov, [https://hdl.handle.net/10062/99880 Flexible Screen Integration and Development of Neck Movement Mechanism for Social Humanoid Robot SemuBot] [Humanoidroboti näo lahenduse leidmine ja rakendamine; humanoidroboti jaoks kaela mehhanismi väljatöötamine], BS thesis, 2024&lt;br /&gt;
*Elchin Huseynov, [https://hdl.handle.net/10062/99668 Design and Control of a Social Humanoid Robot - SemuBot’s Hand] [Sotsiaalse Humanoidroboti Disain ja Juhtimine - SemuBoti Käsi], BS thesis, 2024&lt;br /&gt;
*Veronika Podliesnova, [https://hdl.handle.net/10062/99636 Real-Time Detection of Robot Failures by Monitoring Operator’s Brain Activity with EEG-based Brain-Computer Interface] [Reaalajas Robotirikke Tuvastamine Operaatori Ajutegevuse Jälgimise teel EEG-põhise Aju-Arvuti Liidese abil], BS thesis, 2024&lt;br /&gt;
*Märten Josh Peedimaa, [https://hdl.handle.net/not-yet Rhea: An Open-Source Table Tennis Ball Launcher Robot for Multiball Training] [Avatud lähtekoodiga lauatennise palliviske robot mitmikpall treenimiseks], BS thesis, 2024&lt;br /&gt;
*Karl Sander Vinkel, [https://hdl.handle.net/not-yet Laadimisjaama ja transportkesta väljatöötamine õpperobotile Robotont] [Development of charging dock and transport case for Robotont], BS thesis, 2024&lt;br /&gt;
*Raimo Köidam, [https://hdl.handle.net/not-yet Valguslahenduse tarkvara väljatöötamine õpperobotile Robotont] [Development of light solution software for the educational robot Robotont], BS thesis, 2024&lt;br /&gt;
*Robert Valge, [https://hdl.handle.net/not-yet Toitepinge ja tarbevoolu monitoorimine ning toitehalduse püsivara loomine õpperobotil Robotont] [Monitoring supply voltage and current consumption and creating firmware for Robotont’s power management system], BS thesis, 2024&lt;br /&gt;
*Leonid Tšigrinski, [https://hdl.handle.net/not-yet Õpperoboti Robotont püsivara arhitektuuri uuendamine] [Education robot &amp;quot;Robotont&amp;quot; firmware architecture updating], BS thesis, 2024&lt;br /&gt;
*Andres Sakk, [https://hdl.handle.net/not-yet Avatud robotplatvormi Robotont 3 kasutajaliidese väljatöötamine] [Development of a user interface for the open robotics platform Robotont 3], BS thesis, 2024&lt;br /&gt;
*Kaur Kullamäe, [https://hdl.handle.net/not-yet Elektroonikalahendus ja püsivara robotkäe juhtimiseks sotsiaalsel humanoidrobotil SemuBot] [Building the SemuBot arm electronics system], BS thesis, 2024&lt;br /&gt;
*Kristjan Madis Kask, [https://hdl.handle.net/not-yet Käe mehaanika disain sotsiaalsele humanoidrobotile SemuBot] [Arm mechanics design for social humanoid robot SemuBot], BS thesis, 2024&lt;br /&gt;
*Sven-Ervin Paap, [https://hdl.handle.net/not-yet ROS2 draiver õpperobotile Robotont] [ROS2 driver for the educational robot Robotont], BS thesis, 2024&lt;br /&gt;
*Timur Nizamov, [https://hdl.handle.net/10062/99592 Audio System for the Social Humanoid Robot SemuBot] [Helisüsteem sotsiaalsele humanoidrobotile SemuBot], BS thesis, 2024&lt;br /&gt;
*Georgs Narbuts, [https://hdl.handle.net/10062/99554 Holonomic Motion Drive System of a Social Humanoid Robot SemuBot] [Sotsiaalse humanoidroboti SemuBoti holonoomne liikuv ajamisüsteem], BS thesis, 2024&lt;br /&gt;
*Iryna Hurova, [https://hdl.handle.net/10062/90638 Kitting station of the learning factory] [Õppetehase komplekteerimisjaam], BS thesis, 2023&lt;br /&gt;
*Paola Avalos Conchas, [https://hdl.handle.net/10062/90641 Payload transportation system of a learning factory] [Õppetehase kasuliku koorma transpordisüsteem], BS thesis, 2023&lt;br /&gt;
*Pille Pärnalaas, [https://hdl.handle.net/10062/93431 Pöördpõik ajami arendus robotplatvormile Robotont] [Development of swerve drive for robotic platform Robotont], BS thesis, 2023&lt;br /&gt;
*Carl Hjalmar Love Hult, [https://hdl.handle.net/10062/90673 Using and Evaluating the Real-time Spatial Perception System Hydra in Real-world Scenarios] [Reaalajas toimiva ruumilise taju süsteemi Hydra kasutamine ja hindamine praktilistes stsenaariumides], BS thesis, 2023&lt;br /&gt;
*Priit Rooden, [https://hdl.handle.net/10062/93432 Autonoomse laadimislahenduse väljatöötamine õpperobotile Robotont] [Development of an autonomous charging solution for the robot platform Robotont], BS thesis, 2023&lt;br /&gt;
*Marko Muro, [https://hdl.handle.net/10062/93429 Robotondi akulahenduse ning 12 V pingeregulaatori prototüüpimine] [Prototyping battery solution and 12 V voltage regulator for Robotont], BS thesis, 2023&lt;br /&gt;
*Danel Leppenen, [https://hdl.handle.net/10062/93427 Nav2 PYIF: Python-based Motion Planning for ROS 2 Navigation 2] [Nav 2 PYIF: Pythoni põhine liikumise planeerija ROS 2 Navigation 2-le], BS thesis, 2023&lt;br /&gt;
*Kertrud Geddily Küüt, [https://hdl.handle.net/10062/93424 Kõrgust reguleeriv mehhanism Robotondile] [Height adjusting mechanism for Robotont], BS thesis, 2023&lt;br /&gt;
*Ingvar Drikkit, [https://hdl.handle.net/10062/93421 Lisaseadmete võimekuse arendamine haridusrobotile Robotont] [Developing add-on device support for the educational robot Robotont], BS thesis, 2023&lt;br /&gt;
*Kristo Pool, [https://hdl.handle.net/10062/93430 MoveIt 2 õppematerjalid] [Learning materials for MoveIt 2], BS thesis, 2023&lt;br /&gt;
*Erki Veeväli, [https://hdl.handle.net/10062/93434 Development of a Continuous Teleoperation System for Urban Road Vehicle] [Linnasõiduki pideva kaugjuhtimissüsteemi arendus], BS thesis, 2023&lt;br /&gt;
*Aleksandra Doroshenko, [https://hdl.handle.net/10062/93420 Haiglates inimesi juhatava roboti disain] [Hospital guide robot design], BS thesis, 2023&lt;br /&gt;
*Hui Shi, [http://hdl.handle.net/10062/83043 Expanding the Open-source ROS Software Package opencv_apps with Dedicated Blob Detection Functionality] [Avatud lähtekoodiga ROS-i tarkvarakimbu opencv_apps laiendamine laigutuvasti funktsionaalsusega], BS thesis, 2022&lt;br /&gt;
*Dāvis Krūmiņš, [http://hdl.handle.net/10062/83040 Web-based learning and software development environment for remote access of ROS robots] [Veebipõhine õppe- ja tarkvaraarenduse keskkond ROS robotite juurdepääsuks kaugteel], BS thesis, 2022&lt;br /&gt;
*Anna Jakovleva, [http://hdl.handle.net/10062/83037 Roboquiz - an interactive human-robot game] [Roboquiz - interaktiivne inimese ja roboti mäng], BS thesis, 2022&lt;br /&gt;
*Kristjan Laht, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=74702&amp;amp;year=2022 Robot Localization with Fiducial Markers] [Roboti lokaliseerimine koordinaatmärkidega], BS thesis, 2022&lt;br /&gt;
*Hans Pärtel Pani, [http://hdl.handle.net/10062/83008 ROS draiver pehmerobootika haaratsile] [ROS Driver for Soft Robotic Gripper], BS thesis, 2022&lt;br /&gt;
*Markus Erik Sügis, [http://hdl.handle.net/10062/83015 Jagatud juhtimise põhimõttel realiseeritud robotite kaugjuhtimissüsteem] [A continuous teleoperating system based on shared control concept], BS thesis, 2022&lt;br /&gt;
*Taaniel Küla, [http://hdl.handle.net/10062/83007 ROS2 platvormile keevitusroboti tarkvara portotüübi loomine kasutades UR5 robot-manipulaatorit] [Welding robot software prototype for ROS2 using UR5 robot arm], BS thesis, 2022&lt;br /&gt;
*Rauno Põlluäär, [http://hdl.handle.net/10062/72672 Veebirakendus-põhine kasutajaliides avatud robotplatvormi Robotont juhtimiseks ja haldamiseks] [Web application-based user interface for controlling and managing open-source robotics platform Robotont], BS thesis, 2021&lt;br /&gt;
*Hendrik Olesk, [http://hdl.handle.net/10062/72664 Nägemisulatuses kaugjuhitava mobiilse robotmanipulaatori kasutajamugavuse tõstmine] [Improving the usability of a mobile manipulator robot for line-of-sight remote control], BS thesis, 2021&lt;br /&gt;
*Tarvi Tepandi, [http://hdl.handle.net/10062/72665 Segareaalsusel põhinev kasutajaliides mobiilse roboti kaugjuhtimiseks Microsoft HoloLens 2 vahendusel] [Mixed-reality user interface for teleoperating mobile robots with Microsoft HoloLens 2], BS thesis, 2021&lt;br /&gt;
*Rudolf Põldma, [http://hdl.handle.net/10062/72674 Tartu linna Narva maantee ringristmiku digikaksik] [Digital twin for Narva street roundabout in Tartu], BS thesis, 2021&lt;br /&gt;
*Kwasi Akuamoah Boateng, [http://hdl.handle.net/10062/72804 Digital Twin of a Teaching and Learning Robotics Lab] [Robotite õpetamise ja õppimise labori digitaalne kaksik], BS thesis, 2021&lt;br /&gt;
*Karina Sein, [http://hdl.handle.net/10062/72102 Eestikeelse kõnesünteesi võimaldamine robootika arendusplatvormil ROS] [Enabling Estonian speech synthesis on the Robot Operating System (ROS)], BS thesis, 2020&lt;br /&gt;
*Ranno Mäesepp, [http://hdl.handle.net/10062/72100 Takistuste vältimise lahendus õpperobotile Robotont] [Obstacle avoidance solution for educational robot platform Robotont], BS thesis, 2020&lt;br /&gt;
*Igor Rybalskii, [http://hdl.handle.net/10062/72060 Gesture Detection Software for Human-Robot Collaboration] [Žestituvastus tarkvara inimese ja roboti koostööks], BS thesis, 2020&lt;br /&gt;
*Meelis Pihlap, [http://hdl.handle.net/10062/64292 Mitme roboti koostöö funktsionaalsuste väljatöötamine tarkvararaamistikule TeMoto] [Multi-robot collaboration functionalities for robot software development framework TeMoto], BS thesis, 2019&lt;br /&gt;
*Kaarel Mark, [http://hdl.handle.net/10062/64290 Liitreaalsuse kasutamine tootmisprotsessis asukohtade määramisel] [Augmented reality for location determination in manufacturing], BS thesis, 2019&lt;br /&gt;
*Kätriin Julle, [http://hdl.handle.net/10062/64279 Roboti KUKA youBot riistvara ja ROS-tarkvara uuendamine] [Upgrading robot KUKA youBot’s hardware and ROS-software], BS thesis, 2019&lt;br /&gt;
*Georg Astok, [http://hdl.handle.net/10062/64274 Roboti juhtimine virtuaalreaalsuses kasutades ROS-raamistikku] [Creating virtual reality user interface using only ROS framework], BS thesis, 2019&lt;br /&gt;
*Martin Hallist, [http://hdl.handle.net/10062/64275 Robotipõhine kaugkohalolu käte liigutuste ülekandmiseks] [Teleoperation robot for arms motions], BS thesis, 2019&lt;br /&gt;
*Ahmed Hassan Helmy Mohamed, [http://hdl.handle.net/10062/63946 Software integration of autonomous robot system for mixing and serving drinks] [Jooke valmistava ja serveeriva robotsüsteemi tarkvaralahendus], BS thesis, 2019&lt;br /&gt;
*Kristo Allaje, [http://hdl.handle.net/10062/60294 Draiveripakett käte jälgimise seadme Leap Motion™ kontroller kasutamiseks robootika arendusplatvormil ROS] [Driver package for using Leap Motion™ controller on the robotics development platform ROS], BS thesis, 2018&lt;br /&gt;
*Martin Maidla, [http://hdl.handle.net/10062/60288 Avatud robotiarendusplatvormi Robotont omniliikumise ja odomeetria arendamine] [Omnimotion and odometry development for open robot development platform Robotont], BS thesis, 2018&lt;br /&gt;
*Raid Vellerind, [http://hdl.handle.net/10062/56559 Avatud robotiarendusplatvormi ROS võimekuse loomine Tartu Ülikooli Robotexi robootikaplatvormile] [ROS driver development for the University of Tartu’s Robotex robotics platform], BS thesis, 2017&lt;br /&gt;
*Kalle-Gustav Kruus, [http://hdl.handle.net/10062/32440 Jalgpalliroboti löögimehhanismi elektroonikalahendus] [Driver circuit for a kicking mechanism of a football robot], BS thesis, 2013&lt;br /&gt;
*Sven Kautlenbach, [http://hdl.handle.net/10062/25570 Autonoomne seade elastsusmooduli mõõtmiseks] [Electronic device for Young’s modulus measurements], BS thesis, 2011&lt;br /&gt;
&lt;br /&gt;
[[Category:Theses Topics]]&lt;/div&gt;</summary>
		<author><name>Karl</name></author>
	</entry>
	<entry>
		<id>https://ims.ut.ee/index.php?title=Theses_in_Robotics&amp;diff=43108</id>
		<title>Theses in Robotics</title>
		<link rel="alternate" type="text/html" href="https://ims.ut.ee/index.php?title=Theses_in_Robotics&amp;diff=43108"/>
		<updated>2025-07-09T21:05:24Z</updated>

		<summary type="html">&lt;p&gt;Karl: /* Masters's theses */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:IMS-robotics]]&lt;br /&gt;
&amp;lt;div class=&amp;quot;toclimit-3&amp;quot;&amp;gt;__TOC__&amp;lt;/div&amp;gt; &lt;br /&gt;
= Projects in Advanced Robotics =&lt;br /&gt;
''The main objective of the following projects is to give students experience in working with advanced robotics technology. Our group is active in several R&amp;amp;D projects involving human-robot collaboration, intuitive teleoperation of robots, and autonomous navigation of unmanned mobile platforms. Our main software platforms are [http://www.ros.org/ Robot Operating System (ROS)] for developing software for advanced robot systems and [http://gazebosim.org/ Gazebo] for running realistic robotic simulations.''&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
For further information, contact [[User:Karl|Karl Kruusamäe]]. &lt;br /&gt;
&lt;br /&gt;
The following is not an exhaustive list of all available thesis/research topics.&lt;br /&gt;
&lt;br /&gt;
== Highlighted thesis topics for the 2025/2026 study year ==&lt;br /&gt;
# [[#LLM-based task planning for robots|LLM-based task planning for robots]]&lt;br /&gt;
# [[#ROBOTONT: analysis of different options as on-board computers|ROBOTONT: analysis of different options as on-board computers]]&lt;br /&gt;
# [[#SemuBOT: multiple topics|SemuBOT: multiple topics]]&lt;br /&gt;
# [[#ROBOTONT: integrating a graphical programming interface|ROBOTONT: integrating a graphical programming interface]]&lt;br /&gt;
# [[#Sign-language-based control for robots|Sign-language-based control for robots]]&lt;br /&gt;
# [[#Robotic Study Companion: a social robot for students in higher education|Robotic Study Companion: a social robot for students in higher education]]&lt;br /&gt;
# [[#Quantitative Evaluation of the Situation Awareness during the Teleoperation of an Urban Vehicle|Quantitative Evaluation of the Situation Awareness during the Teleoperation of an Urban Vehicle]]&lt;br /&gt;
&lt;br /&gt;
== List of potential thesis topics ==&lt;br /&gt;
Our inventory includes but is not limited to:&lt;br /&gt;
&amp;lt;gallery mode=&amp;quot;packed&amp;quot;&amp;gt;&lt;br /&gt;
File:Youbot.png|120px|thumb|KUKA youBot&lt;br /&gt;
File:Ur5_left.png|120px|thumb|Universal Robot UR5&lt;br /&gt;
File:Franka_Emika_Panda.jpg|120px|thumb|Franka Emika Panda&lt;br /&gt;
File:Clearpath_Jackal.jpg|120px|thumb|Clearpath Jackal&lt;br /&gt;
File:Kinova_KG-3_Gripper.jpg|120px|thumb|Kinova 3-finger gripper&lt;br /&gt;
File:Xarm7.jpg|120px|thumb|UFACTORY xArm7&lt;br /&gt;
File:Robotont_banner.png|120px|thumb|robotont&lt;br /&gt;
File:Turtlebot3-waffle-pi.jpg|thumb|TurtleBot3&lt;br /&gt;
File:Parrot-bebop-2.jpg|thumb|Parrot Bebop 2&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== ROBOTONT: Docker-Driven ROS Environment Switching ===&lt;br /&gt;
This thesis focuses on integrating Docker containers to manage and switch between different ROS (Robot Operating System) environments on Robotont. Currently, ROS is installed natively on Robotont's Ubuntu-based onboard computer, limiting flexibility in system recovery and configuration switching. The goal is to develop a Docker-based solution that allows users to switch between ROS environments directly from Robotont’s existing low-level menu interface.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: designing and implementing a communication protocol for additional devices ===&lt;br /&gt;
Robotont currently includes a designated area on its front for attaching additional devices, such as an ultrasonic range finder, a servo motor connected to an Arduino, or a MikroBUS device. This thesis aims to design and implement a standard communication protocol that will enable seamless integration and control of these devices through the high-level ROS framework running on Robotont’s onboard computer.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: analysis of different options as on-board computers ===&lt;br /&gt;
Currently, ROBOTONT uses an Intel NUC as its onboard computer. The goal of this thesis is to validate Robotont's software stack on alternative compute devices (e.g. Raspberry Pi 4, Intel Compute Stick, and NVIDIA Jetson Nano) and benchmark their performance on the most popular Robotont use-case demos (e.g. webapp teleop, AR-tag steering, 2D mapping, and 3D mapping). The objective is to propose at least two compute solutions: one that optimizes for cost and another that optimizes for performance.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT Lite ===&lt;br /&gt;
The goal of this thesis is to optimize the ROBOTONT platform for cost by replacing the onboard computer and sensors with low-cost alternatives while ensuring ROS/ROS2 software compatibility.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: integrating a graphical programming interface ===&lt;br /&gt;
The goal of this thesis is to integrate a graphical programming solution (e.g. Scratch or Blockly) to enable programming of Robotont by non-experts. See, e.g., https://doi.org/10.48550/arXiv.2011.13706&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== SemuBOT: multiple topics ===&lt;br /&gt;
In 2023/2024, many thesis topics are available to support the development of the open-source humanoid robot project SemuBOT (https://www.facebook.com/semubotmtu/)&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Robotic Study Companion: a social robot for students in higher education ===&lt;br /&gt;
Potential Topics:&lt;br /&gt;
* Enhance the Robot's Speech/Natural Language Capabilities &lt;br /&gt;
* Build a Local Language Model for the RSC&lt;br /&gt;
* Develop and Program the Robot’s Behavior and Personality&lt;br /&gt;
* Build a Digital Twin Simulation for Multimodal Interaction&lt;br /&gt;
* Explore the Use of the RSC as an Affective Robot to Address Students’ Academic Emotions&lt;br /&gt;
* Explore and Implement Cybersecurity Measures for Social Robot&lt;br /&gt;
&lt;br /&gt;
[https://github.com/orgs/RobotStudyCompanion/discussions/3 More info on GitHub] | reach out to farnaz.baksh@ut.ee for further details&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Virtual reality user interface (VRUI) for intuitive teleoperation system ===&lt;br /&gt;
[[Image:LeapMotion.png|200px|thumb|right|Detecting 2 hands with Leap Motion Controller]]&lt;br /&gt;
Enhancing the user-experience of a virtual reality UI developed by [https://github.com/ut-ims-robotics/vrui_rviz Georg Astok]. Potentially adding [http://www.osvr.org/hardware-devs.html virtual reality capability] to a [https://www.youtube.com/watch?v=L25HHFd00rc gesture- and natural-language-based robot teleoperation system].&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
[[Image:Temoto-working-principle-in-pics.png|x160px|Gesture-based teleoperation]]&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Modeling humans for human-robot interaction ===&lt;br /&gt;
True human-robot collaboration means that the robot must understand the actions, intention, and state of its human partner. This work involves using cameras and other human-monitoring sensors to digitally represent and model humans. There are multiple stages of modeling: a) physical models of human kinematics and dynamics; b) higher-level models for recognizing human intent.&amp;lt;br&amp;gt;&lt;br /&gt;
[[Image:Skeletal ROS.PNG|x160px|ROS &amp;amp; Kinect &amp;amp; Skeleton-Markers Package]]&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Robotic avatar for telepresence ===&lt;br /&gt;
Integrating hand gestures and head movements to control a robot avatar in a virtual reality user interface.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Detection of hardware and software resources for smart integration of robots ===&lt;br /&gt;
The vast majority of today’s robotic applications rely on hard-coded device and algorithm usage. This project focuses on developing Resource Snooper software that can detect the addition or removal of resources, enabling dynamic reconfiguration of robotic systems. This project is developed as a subsystem of [https://utnuclearroboticspublic.github.io/temoto2/index.html TeMoto].&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Human-Robot and Robot-Robot collaboration applications ===&lt;br /&gt;
Creating a demo or analysis of software capabilities related to human-robot or robot-robot teams:&lt;br /&gt;
* human-robot collaborative assembly&lt;br /&gt;
* distributed mapping; analysis and demo of existing ROS (e.g., segmap https://youtu.be/JJhEkIA1xSE) packages for multi-robot mapping&lt;br /&gt;
* Inaccessible region teamwork&lt;br /&gt;
**youbot+drone - a drone maps the environment (for example, a maze) and a ground vehicle uses this information to traverse the maze&lt;br /&gt;
**youbot+robotont - youbot cannot go up ledges, but it can lift a smaller robot, such as robotont, up a ledge.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Mirroring human hand movements on industrial robots ===&lt;br /&gt;
The goal of this project is to integrate continuous control of an industrial robot manipulator with a gestural telerobotics interface. The recommended tools for this thesis project are a Leap Motion Controller or a standard web camera, Ultraleap, a Universal Robot UR5 manipulator, and ROS.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: TeMoto for robotont ===&lt;br /&gt;
Swarm-management and UMRF-based task loading for robotont using [https://github.com/temoto-framework TeMoto] framework.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Enhancing teleoperation control interface with augmented cues to provoke caution===&lt;br /&gt;
The task is to create a telerobot control interface where the video feed from the remote site and/or a mixed-reality scene is augmented with visual cues to provoke caution in the human operator.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Robot-to-human interaction ===&lt;br /&gt;
As robots and autonomous machines start sharing the same space as humans, their actions need to be understood by the people occupying that space. For instance, a human worker needs to understand what the robot partner is planning next, or a pedestrian needs to clearly comprehend the behaviour of a driverless vehicle. To reduce this ambiguity, the robot needs mechanisms to convey its intent (what it is going to do next). The aim of the thesis is to outline existing methods for machines to convey their intent and to develop a unified model interface for expressing that intent.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Gaze-based handover prediction ===&lt;br /&gt;
When a human needs to pass an object to a robot manipulator, the robot must understand where in 3D space the object handover occurs and then plan an appropriate motion. Human gaze can be used as the input for predicting which object to track. The thesis activities involve camera-based eye tracking and safe motion planning.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: Human-height human-robot interface for Robotont ground robot ===&lt;br /&gt;
Robotont is an ankle-high flat mobile robot. For humans to interact with Robotont, there is a need for a compact and lightweight mechanical structure that is tall enough for comfortable human-robot interaction. The objective of the thesis is to develop the mechanical designs and build prototypes that ensure stable operation and meet the aesthetic requirements for use in public places. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Stratos Explore Ultraleap demonstrator for robotics ===&lt;br /&gt;
The aim of this thesis is to systematically analyze the strengths and limitations of the Stratos Explore Ultraleap device in the context of controlling a robot, and subsequently to implement a demonstrator application showcasing its applicability for robot control.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Mixed-reality scene creation for vehicle teleoperation ===&lt;br /&gt;
Fusing different sensory feeds to create a high-usability teleoperation scene.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Validation study for AR-based robot user-interfaces ===&lt;br /&gt;
Designing and carrying out a user study to validate the functionality and usability of a human-robot interface.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Quantitative Evaluation of the Situation Awareness during the Teleoperation of an Urban Vehicle ===&lt;br /&gt;
The goal of this thesis is to develop the methodology for measuring the operator's situation awareness while operating an autonomous urban vehicle. Potential metrics could include driving accuracy, speed, and latency. This thesis will be conducted as part of the activities at the [https://adl.cs.ut.ee Autonomous Driving Lab].&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Sign-language-based control for robots ===&lt;br /&gt;
The goal of this thesis is to design and implement the means for interacting with a robot via conventional sign language, i.e. the human can command a robot by using sign language.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== LLM-based task planning for robots ===&lt;br /&gt;
The goal of this thesis is to enable high-level task planning for autonomous robots performing general-purpose tasks. The thesis would leverage an LLM's reasoning capabilities and the TeMoto Action Engine to achieve natural task negotiation, planning, and execution.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
= Completed projects =&lt;br /&gt;
== PhD theses ==&lt;br /&gt;
*Robert Valner, [https://hdl.handle.net/10062/105994 Design of TeMoto, a software framework for dependable, adaptive, and collaborative autonomous robots] [TeMoto – töökindlate, adaptiivsete ja koostöövõimeliste autonoomsete robotite arendamise tarkvararaamistik], PhD thesis, 2024&lt;br /&gt;
*Houman Masnavi, [https://hdl.handle.net/10062/91394 Visibility aware navigation] [Nähtavust arvestav navigatsioon], PhD thesis, 2023&lt;br /&gt;
&lt;br /&gt;
== Master's theses ==&lt;br /&gt;
*Asier Mandiola Arrizabalaga, Haptic-Based Teleoperation of Robots, MS thesis, 2024&lt;br /&gt;
*Dāvis Krūminš, Web-based Robotics Lab for Effortless ROS2 Development, MS thesis, 2024&lt;br /&gt;
*Julian Rene Leclerc, Natural Language Human-Robot Interaction: A Modular Framework for Conversational Robot Control Using Large Language Models, MS thesis, 2024&lt;br /&gt;
*Miriam Calafa’, Designing Multimodal Emotional Expression for a Robotic Study Companion, MS thesis, 2024&lt;br /&gt;
*Iryna Hurova, Model-based Planning Using GPU-accelerated Simulator as a World Model, MS thesis, 2024&lt;br /&gt;
*Sander Toma Võrk, Elektrooniliste termomeetrite automaatse kalibreerimissüsteemi väljatöötamine ja rakendamine, MS thesis, 2024&lt;br /&gt;
*Carl Hjalmar Love Hult, SurfMotion: An Open Source Pipeline for Robotic Pipe Cutting and Welding, MS thesis, 2024&lt;br /&gt;
*Paola Avalos Conchas, Socially Aware Planning for Indoor Navigation, MS thesis, 2024&lt;br /&gt;
*Robert Allik, Validation of NoMaD as a Global Planner for Mobile Robots, MS thesis, 2024&lt;br /&gt;
*Rauno Põlluäär, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=79393 Designing and Implementing a Bird’s-eye View Interface for a Self-driving Vehicle’s Teleoperation System] [Isejuhtiva sõiduki kaugjuhtimissüsteemile linnuvaate kasutajaliidese loomine], MS thesis, 2024&lt;br /&gt;
*Gautier Reynes, VR-Enhanced Remote Inspection Framework for Semi-Autonomous Robot Fleet, MS thesis, 2024&lt;br /&gt;
*Eva Mõtshärg, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=77385 3D-prinditava kere disain ja analüüs vabavaralisele haridusrobotile Robotont] [Design and Analysis of a 3D Printable Chassis for the Open Source Educational Robot Robotont], MS thesis, 2023&lt;br /&gt;
*Farnaz Baksh, [https://dspace.ut.ee/items/a7a9cc15-27e9-450c-94e8-3267f0c95c56 An Open-source Robotic Study Companion for University Students] [Avatud lähtekoodiga robotõpikaaslane üliõpilastele], MS thesis, 2023&lt;br /&gt;
*Igor Rybalskii, [http://hdl.handle.net/10062/83028 Augmented reality (AR) for enabling human-robot collaboration with ROS robots] [Liitreaalsus inimese ja roboti koostöö võimaldamiseks ROS-i robotitega], MS thesis, 2022&lt;br /&gt;
*Md. Maniruzzaman, [http://hdl.handle.net/10062/83025 Object search and retrieval in indoor environment using a Mobile Manipulator] [Objektide otsimine ja teisaldamine siseruumides mobiilse manipulaatori abil], MS thesis, 2022&lt;br /&gt;
*Allan Kustavus, [http://hdl.handle.net/10062/72651 Design and Implementation of a Generalized Resource Management Architecture in the TeMoto Software Framework] [Üldise ressursihalduri disain ja teostus TeMoto tarkvara raamistikule], MS thesis, 2021&lt;br /&gt;
*Kristina Meister, [http://hdl.handle.net/10062/72350 External human-vehicle interaction - a study in the context of an autonomous ride-hailing service], MS thesis, 2021&lt;br /&gt;
*Muhammad Usman, [http://hdl.handle.net/10062/72126 Development of an Optimization-Based Motion Planner and Its ROS Interface for a Non-Holonomic Mobile Manipulator] [Optimeerimisele baseeruva liikumisplaneerija arendamine ja selle ROSi liides mitteholonoomse mobiilse manipulaatori jaoks], MS thesis, 2020&lt;br /&gt;
*Maarika Oidekivi, [http://hdl.handle.net/10062/72119 Masina kavatsuse väljendamine ja tõlgendamine] [Communicating and interpreting machine intent], MS thesis, 2020&lt;br /&gt;
*Houman Masnavi, [http://hdl.handle.net/10062/72118 Multi-Robot Motion Planning for Shared Payload Transportation] [Rajaplaneerimine multi-robot süsteemile jagatud lasti transportimisel], MS thesis, 2020&lt;br /&gt;
*Fabian Ernesto Parra Gil, [http://hdl.handle.net/10062/72112 Implementation of Robot Manager Subsystem for Temoto Software Framework] [Robotite Halduri alamsüsteemi väljatöötamine tarkvararaamistikule TEMOTO], MS thesis, 2020&lt;br /&gt;
*Zafarullah, [http://hdl.handle.net/10062/72125 Gaze Assisted Neural Network based Prediction of End-Point of Human Reaching Trajectories], MS thesis, 2020&lt;br /&gt;
*Madis K Nigol, [http://hdl.handle.net/10062/64339 Õppematerjalid robotplatvormile Robotont] [Study materials for robot platform Robotont], MS thesis, 2019&lt;br /&gt;
*Renno Raudmäe, [http://hdl.handle.net/10062/64341 Avatud robotplatvorm Robotont] [Open source robotics platform Robotont], MS thesis, 2019&lt;br /&gt;
*Asif Sattar, [http://hdl.handle.net/10062/64352 Human detection and distance estimation with monocular camera using YOLOv3 neural network] [Inimeste tuvastamine ning kauguse hindamine kasutades kaamerat ning YOLOv3 tehisnärvivõrku], MS thesis, 2019&lt;br /&gt;
*Ragnar Margus, [http://hdl.handle.net/10062/64337 Kergliiklusvahendite jagamisteenuseks vajaliku positsioneerimismooduli loomine ja uurimine] [Development and testing of an IoT module for an electric vehicle sharing service], MS thesis, 2019&lt;br /&gt;
*Pavel Šumejko, [http://hdl.handle.net/10062/64320 Robust Solution for Extrinsic Calibration of a 2D Laser-Rangefinder and a Monocular USB Camera] [Meetod 2D laserkaugusmõõdiku ja USB-kaamera väliseks kalibreerimiseks], MS thesis, 2019&lt;br /&gt;
*Dzvezdana Arsovska, [http://hdl.handle.net/10062/64321 Building an Efficient and Secure Software Supply Pipeline for Aerial Robotics Application] [Efektiivne ja turvaline tarkvaraarendusahel lennurobootika rakenduses], MS thesis, 2019&lt;br /&gt;
*Tõnis Tiimus, [http://hdl.handle.net/10062/60713 Visuaalselt esilekutsutud potentsiaalidel põhinev aju-arvuti liides robootika rakendustes] [A VEP-based BCI for robotics applications], MS thesis, 2018&lt;br /&gt;
*Martin Appo, [http://hdl.handle.net/10062/60718 Hardware-agnostic compliant control ROS package for collaborative industrial manipulators] [Riistvarapaindlik ROSi tarkvarapakett tööstuslike robotite mööndlikuks juhtimiseks], MS thesis, 2018&lt;br /&gt;
*Hassan Mahmoud Shehawy Elhanash, [http://hdl.handle.net/10062/60711 Optical Tracking of Forearm for Classifying Fingers Poses] [Küünarvarre visuaalne jälgimine ennustamaks sõrmede asendeid], MS thesis, 2018&lt;br /&gt;
&lt;br /&gt;
== Bachelor's theses ==&lt;br /&gt;
*Oliver Voorel&lt;br /&gt;
*Karl Rahn&lt;br /&gt;
*Mattias Mäe&lt;br /&gt;
*Jürgen Kottise&lt;br /&gt;
*Martin Kaur&lt;br /&gt;
*Karl-Jürgen Siilak&lt;br /&gt;
*Usman&lt;br /&gt;
*Robina&lt;br /&gt;
*Omar&lt;br /&gt;
*Albert Unn, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=79937 Suhtlusvõimekuse arendamine sotsiaalsele humanoidrobotile SemuBot] [Developing the ability to communicate for SemuBot, a social humanoid robot], BS thesis, 2024&lt;br /&gt;
*Veronika-Marina Volynets, [https://hdl.handle.net/10062/99943 Development of Control Electronics and Program for Robotont's Height Adjustment Mechanism] [Juhtelektroonika ja programmi väljatöötamine Robotondi kõrguse reguleerimise mehhanismile], BS thesis, 2024&lt;br /&gt;
*Nikita Kurenkov, [https://hdl.handle.net/10062/99880 Flexible Screen Integration and Development of Neck Movement Mechanism for Social Humanoid Robot SemuBot] [Humanoidroboti näo lahenduse leidmine ja rakendamine; humanoidroboti jaoks kaela mehhanismi väljatöötamine], BS thesis, 2024&lt;br /&gt;
*Elchin Huseynov, [https://hdl.handle.net/10062/99668 Design and Control of a Social Humanoid Robot - SemuBot’s Hand] [Sotsiaalse Humanoidroboti Disain ja Juhtimine - SemuBoti Käsi], BS thesis, 2024&lt;br /&gt;
*Veronika Podliesnova, [https://hdl.handle.net/10062/99636 Real-Time Detection of Robot Failures by Monitoring Operator’s Brain Activity with EEG-based Brain-Computer Interface] [Reaalajas Robotirikke Tuvastamine Operaatori Ajutegevuse Jälgimise teel EEG-põhise Aju-Arvuti Liidese abil], BS thesis, 2024&lt;br /&gt;
*Märten Josh Peedimaa, [https://hdl.handle.net/not-yet Rhea: An Open-Source Table Tennis Ball Launcher Robot for Multiball Training] [Avatud lähtekoodiga lauatennise palliviske robot mitmikpall treenimiseks], BS thesis, 2024&lt;br /&gt;
*Karl Sander Vinkel, [https://hdl.handle.net/not-yet Laadimisjaama ja transportkesta väljatöötamine õpperobotile Robotont] [Development of charging dock and transport case for Robotont], BS thesis, 2024&lt;br /&gt;
*Raimo Köidam, [https://hdl.handle.net/not-yet Valguslahenduse tarkvara väljatöötamine õpperobotile Robotont] [Development of light solution software for the educational robot Robotont], BS thesis, 2024&lt;br /&gt;
*Robert Valge, [https://hdl.handle.net/not-yet Toitepinge ja tarbevoolu monitoorimine ning toitehalduse püsivara loomine õpperobotil Robotont] [Monitoring supply voltage and current consumption and creating firmware for Robotont’s power management system], BS thesis, 2024&lt;br /&gt;
*Leonid Tšigrinski, [https://hdl.handle.net/not-yet Õpperoboti Robotont püsivara arhitektuuri uuendamine] [Education robot &amp;quot;Robotont&amp;quot; firmware architecture updating], BS thesis, 2024&lt;br /&gt;
*Andres Sakk, [https://hdl.handle.net/not-yet Avatud robotplatvormi Robotont 3 kasutajaliidese väljatöötamine] [Development of a user interface for the open robotics platform Robotont 3], BS thesis, 2024&lt;br /&gt;
*Kaur Kullamäe, [https://hdl.handle.net/not-yet Elektroonikalahendus ja püsivara robotkäe juhtimiseks sotsiaalsel humanoidrobotil SemuBot] [Building the SemuBot arm electronics system], BS thesis, 2024&lt;br /&gt;
*Kristjan Madis Kask, [https://hdl.handle.net/not-yet Käe mehaanika disain sotsiaalsele humanoidrobotile SemuBot] [Arm mechanics design for social humanoid robot SemuBot], BS thesis, 2024&lt;br /&gt;
*Sven-Ervin Paap, [https://hdl.handle.net/not-yet ROS2 draiver õpperobotile Robotont] [ROS2 driver for the educational robot Robotont], BS thesis, 2024&lt;br /&gt;
*Timur Nizamov, [https://hdl.handle.net/10062/99592 Audio System for the Social Humanoid Robot SemuBot] [Helisüsteem sotsiaalsele humanoidrobotile SemuBot], BS thesis, 2024&lt;br /&gt;
*Georgs Narbuts, [https://hdl.handle.net/10062/99554 Holonomic Motion Drive System of a Social Humanoid Robot SemuBot] [Sotsiaalse humanoidroboti SemuBoti holonoomne liikuv ajamisüsteem], BS thesis, 2024&lt;br /&gt;
*Iryna Hurova, [https://hdl.handle.net/10062/90638 Kitting station of the learning factory] [Õppetehase komplekteerimisjaam], BS thesis, 2023&lt;br /&gt;
*Paola Avalos Conchas, [https://hdl.handle.net/10062/90641 Payload transportation system of a learning factory] [Õppetehase kasuliku koorma transpordisüsteem], BS thesis, 2023&lt;br /&gt;
*Pille Pärnalaas, [https://hdl.handle.net/10062/93431 Pöördpõik ajami arendus robotplatvormile Robotont] [Development of swerve drive for robotic platform Robotont], BS thesis, 2023&lt;br /&gt;
*Carl Hjalmar Love Hult, [https://hdl.handle.net/10062/90673 Using and Evaluating the Real-time Spatial Perception System Hydra in Real-world Scenarios] [Reaalajas toimiva ruumilise taju süsteemi Hydra kasutamine ja hindamine praktilistes stsenaariumides], BS thesis, 2023&lt;br /&gt;
*Priit Rooden, [https://hdl.handle.net/10062/93432 Autonoomse laadimislahenduse väljatöötamine õpperobotile Robotont] [Development of an autonomous charging solution for the robot platform Robotont], BS thesis, 2023&lt;br /&gt;
*Marko Muro, [https://hdl.handle.net/10062/93429 Robotondi akulahenduse ning 12 V pingeregulaatori prototüüpimine] [Prototyping battery solution and 12 V voltage regulator for Robotont], BS thesis, 2023&lt;br /&gt;
*Danel Leppenen, [https://hdl.handle.net/10062/93427 Nav2 PYIF: Python-based Motion Planning for ROS 2 Navigation 2] [Nav 2 PYIF: Pythoni põhine liikumise planeerija ROS 2 Navigation 2-le], BS thesis, 2023&lt;br /&gt;
*Kertrud Geddily Küüt, [https://hdl.handle.net/10062/93424 Kõrgust reguleeriv mehhanism Robotondile] [Height adjusting mechanism for Robotont], BS thesis, 2023&lt;br /&gt;
*Ingvar Drikkit, [https://hdl.handle.net/10062/93421 Lisaseadmete võimekuse arendamine haridusrobotile Robotont] [Developing add-on device support for the educational robot Robotont], BS thesis, 2023&lt;br /&gt;
*Kristo Pool, [https://hdl.handle.net/10062/93430 MoveIt 2 õppematerjalid] [Learning materials for MoveIt 2], BS thesis, 2023&lt;br /&gt;
*Erki Veeväli, [https://hdl.handle.net/10062/93434 Development of a Continuous Teleoperation System for Urban Road Vehicle] [Linnasõiduki pideva kaugjuhtimissüsteemi arendus], BS thesis, 2023&lt;br /&gt;
*Aleksandra Doroshenko, [https://hdl.handle.net/10062/93420 Haiglates inimesi juhatava roboti disain] [Hospital guide robot design], BS thesis, 2023&lt;br /&gt;
*Hui Shi, [http://hdl.handle.net/10062/83043 Expanding the Open-source ROS Software Package opencv_apps with Dedicated Blob Detection Functionality] [Avatud lähtekoodiga ROS-i tarkvarakimbu opencv_apps laiendamine laigutuvasti funktsionaalsusega], BS thesis, 2022&lt;br /&gt;
*Dāvis Krūmiņš, [http://hdl.handle.net/10062/83040 Web-based learning and software development environment for remote access of ROS robots] [Veebipõhine õppe- ja tarkvaraarenduse keskkond ROS robotite juurdepääsuks kaugteel], BS thesis, 2022&lt;br /&gt;
*Anna Jakovleva, [http://hdl.handle.net/10062/83037 Roboquiz - an interactive human-robot game] [Roboquiz - interaktiivne inimese ja roboti mäng], BS thesis, 2022&lt;br /&gt;
*Kristjan Laht, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=74702&amp;amp;year=2022 Robot Localization with Fiducial Markers] [Roboti lokaliseerimine koordinaatmärkidega], BS thesis, 2022&lt;br /&gt;
*Hans Pärtel Pani, [http://hdl.handle.net/10062/83008 ROS draiver pehmerobootika haaratsile] [ROS Driver for Soft Robotic Gripper], BS thesis, 2022&lt;br /&gt;
*Markus Erik Sügis, [http://hdl.handle.net/10062/83015 Jagatud juhtimise põhimõttel realiseeritud robotite kaugjuhtimissüsteem] [A continuous teleoperating system based on shared control concept], BS thesis, 2022&lt;br /&gt;
*Taaniel Küla, [http://hdl.handle.net/10062/83007 ROS2 platvormile keevitusroboti tarkvara portotüübi loomine kasutades UR5 robot-manipulaatorit] [Welding robot software prototype for ROS2 using UR5 robot arm], BS thesis, 2022&lt;br /&gt;
*Rauno Põlluäär, [http://hdl.handle.net/10062/72672 Veebirakendus-põhine kasutajaliides avatud robotplatvormi Robotont juhtimiseks ja haldamiseks] [Web application-based user interface for controlling and managing open-source robotics platform Robotont], BS thesis, 2021&lt;br /&gt;
*Hendrik Olesk, [http://hdl.handle.net/10062/72664 Nägemisulatuses kaugjuhitava mobiilse robotmanipulaatori kasutajamugavuse tõstmine] [Improving the usability of a mobile manipulator robot for line-of-sight remote control], BS thesis, 2021&lt;br /&gt;
*Tarvi Tepandi, [http://hdl.handle.net/10062/72665 Segareaalsusel põhinev kasutajaliides mobiilse roboti kaugjuhtimiseks Microsoft HoloLens 2 vahendusel] [Mixed-reality user interface for teleoperating mobile robots with Microsoft HoloLens 2], BS thesis, 2021&lt;br /&gt;
*Rudolf Põldma, [http://hdl.handle.net/10062/72674 Tartu linna Narva maantee ringristmiku digikaksik] [Digital twin for Narva street roundabout in Tartu], BS thesis, 2021&lt;br /&gt;
*Kwasi Akuamoah Boateng, [http://hdl.handle.net/10062/72804 Digital Twin of a Teaching and Learning Robotics Lab] [Robotite õpetamise ja õppimise labori digitaalne kaksik], BS thesis, 2021&lt;br /&gt;
*Karina Sein, [http://hdl.handle.net/10062/72102 Eestikeelse kõnesünteesi võimaldamine robootika arendusplatvormil ROS] [Enabling Estonian speech synthesis on the Robot Operating System (ROS)], BS thesis, 2020&lt;br /&gt;
*Ranno Mäesepp, [http://hdl.handle.net/10062/72100 Takistuste vältimise lahendus õpperobotile Robotont] [Obstacle avoidance solution for educational robot platform Robotont], BS thesis, 2020&lt;br /&gt;
*Igor Rybalskii, [http://hdl.handle.net/10062/72060 Gesture Detection Software for Human-Robot Collaboration] [Žestituvastus tarkvara inimese ja roboti koostööks], BS thesis, 2020&lt;br /&gt;
*Meelis Pihlap, [http://hdl.handle.net/10062/64292 Mitme roboti koostöö funktsionaalsuste väljatöötamine tarkvararaamistikule TeMoto] [Multi-robot collaboration functionalities for robot software development framework TeMoto], BS thesis, 2019&lt;br /&gt;
*Kaarel Mark, [http://hdl.handle.net/10062/64290 Liitreaalsuse kasutamine tootmisprotsessis asukohtade määramisel] [Augmented reality for location determination in manufacturing], BS thesis, 2019&lt;br /&gt;
*Kätriin Julle, [http://hdl.handle.net/10062/64279 Roboti KUKA youBot riistvara ja ROS-tarkvara uuendamine] [Upgrading robot KUKA youBot’s hardware and ROS-software], BS thesis, 2019&lt;br /&gt;
*Georg Astok, [http://hdl.handle.net/10062/64274 Roboti juhtimine virtuaalreaalsuses kasutades ROS-raamistikku] [Creating virtual reality user interface using only ROS framework], BS thesis, 2019&lt;br /&gt;
*Martin Hallist, [http://hdl.handle.net/10062/64275 Robotipõhine kaugkohalolu käte liigutuste ülekandmiseks] [Teleoperation robot for arms motions], BS thesis, 2019&lt;br /&gt;
*Ahmed Hassan Helmy Mohamed, [http://hdl.handle.net/10062/63946 Software integration of autonomous robot system for mixing and serving drinks] [Jooke valmistava ja serveeriva robotsüsteemi tarkvaralahendus], BS thesis, 2019&lt;br /&gt;
*Kristo Allaje, [http://hdl.handle.net/10062/60294 Draiveripakett käte jälgimise seadme Leap Motion™ kontroller kasutamiseks robootika arendusplatvormil ROS] [Driver package for using Leap Motion™ controller on the robotics development platform ROS], BS thesis, 2018&lt;br /&gt;
*Martin Maidla, [http://hdl.handle.net/10062/60288 Avatud robotiarendusplatvormi Robotont omniliikumise ja odomeetria arendamine] [Omnimotion and odometry development for open robot development platform Robotont], BS thesis, 2018&lt;br /&gt;
*Raid Vellerind, [http://hdl.handle.net/10062/56559 Avatud robotiarendusplatvormi ROS võimekuse loomine Tartu Ülikooli Robotexi robootikaplatvormile] [ROS driver development for the University of Tartu’s Robotex robotics platform], BS thesis, 2017&lt;br /&gt;
*Kalle-Gustav Kruus, [http://hdl.handle.net/10062/32440 Jalgpalliroboti löögimehhanismi elektroonikalahendus] [Driver circuit for a kicking mechanism of a football robot], BS thesis, 2013&lt;br /&gt;
*Sven Kautlenbach, [http://hdl.handle.net/10062/25570 Autonoomne seade elastsusmooduli mõõtmiseks] [Electronic device for Young’s modulus measurements], BS thesis, 2011&lt;br /&gt;
&lt;br /&gt;
[[Category:Theses Topics]]&lt;/div&gt;</summary>
		<author><name>Karl</name></author>
	</entry>
	<entry>
		<id>https://ims.ut.ee/index.php?title=Theses_in_Robotics&amp;diff=43107</id>
		<title>Theses in Robotics</title>
		<link rel="alternate" type="text/html" href="https://ims.ut.ee/index.php?title=Theses_in_Robotics&amp;diff=43107"/>
		<updated>2025-07-09T20:57:18Z</updated>

		<summary type="html">&lt;p&gt;Karl: /* Bachelor's theses */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:IMS-robotics]]&lt;br /&gt;
&amp;lt;div class=&amp;quot;toclimit-3&amp;quot;&amp;gt;__TOC__&amp;lt;/div&amp;gt; &lt;br /&gt;
= Projects in Advanced Robotics =&lt;br /&gt;
''The main objective of the following projects is to give students experience in working with advanced robotics technology. Our group is active in several R&amp;amp;D projects involving human-robot collaboration, intuitive teleoperation of robots, and autonomous navigation of unmanned mobile platforms. Our main software platforms are [http://www.ros.org/ Robot Operating System (ROS)] for developing software for advanced robot systems and [http://gazebosim.org/ Gazebo] for running realistic robotic simulations.''&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
For further information, contact [[User:Karl|Karl Kruusamäe]]. &lt;br /&gt;
&lt;br /&gt;
The following is not an exhaustive list of all available thesis/research topics.&lt;br /&gt;
&lt;br /&gt;
== Highlighted theses topics for 2025/2026 study year ==&lt;br /&gt;
# [[#LLM-based task planning for robots|LLM-based task planning for robots]]&lt;br /&gt;
# [[#ROBOTONT: analysis of different options as on-board computers|ROBOTONT: analysis of different options as on-board computers]]&lt;br /&gt;
# [[#SemuBOT: multiple topics|SemuBOT: multiple topics]]&lt;br /&gt;
# [[#ROBOTONT: integrating a graphical programming interface|ROBOTONT: integrating a graphical programming interface]]&lt;br /&gt;
# [[#Sign-language-based control for robots|Sign-language-based control for robots]]&lt;br /&gt;
# [[#Robotic Study Companion: a social robot for students in higher education|Robotic Study Companion: a social robot for students in higher education]]&lt;br /&gt;
# [[#Quantitative Evaluation of the Situation Awareness during the Teleoperation of an Urban Vehicle|Quantitative Evaluation of the Situation Awareness during the Teleoperation of an Urban Vehicle]]&lt;br /&gt;
&lt;br /&gt;
== List of potential thesis topics ==&lt;br /&gt;
Our inventory includes but is not limited to:&lt;br /&gt;
&amp;lt;gallery mode=&amp;quot;packed&amp;quot;&amp;gt;&lt;br /&gt;
File:Youbot.png|120px|thumb|KUKA youBot&lt;br /&gt;
File:Ur5_left.png|120px|thumb|Universal Robot UR5&lt;br /&gt;
File:Franka_Emika_Panda.jpg|120px|thumb|Franka Emika Panda&lt;br /&gt;
File:Clearpath_Jackal.jpg|120px|thumb|Clearpath Jackal&lt;br /&gt;
File:Kinova_KG-3_Gripper.jpg|120px|thumb|Kinova 3-finger gripper&lt;br /&gt;
File:Xarm7.jpg|120px|thumb|UFACTORY xArm7&lt;br /&gt;
File:Robotont_banner.png|120px|thumb|robotont&lt;br /&gt;
File:Turtlebot3-waffle-pi.jpg|thumb|TurtleBot3&lt;br /&gt;
File:Parrot-bebop-2.jpg|thumb|Parrot Bebop 2&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== ROBOTONT: Docker-Driven ROS Environment Switching ===&lt;br /&gt;
This thesis focuses on integrating Docker containers to manage and switch between different ROS (Robot Operating System) environments on Robotont. Currently, ROS is installed natively on Robotont's Ubuntu-based onboard computer, limiting flexibility in system recovery and configuration switching. The goal is to develop a Docker-based solution that allows users to switch between ROS environments directly from Robotont’s existing low-level menu interface.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: designing and implementing a communication protocol for additional devices ===&lt;br /&gt;
Robotont currently includes a designated area on its front for attaching additional devices, such as an ultrasonic range finder, a servo motor connected to an Arduino, or a MikroBUS device. This thesis aims to design and implement a standard communication protocol that will enable seamless integration and control of these devices through the high-level ROS framework running on Robotont’s onboard computer.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: analysis of different options as on-board computers ===&lt;br /&gt;
Currently, ROBOTONT uses an Intel NUC as its onboard computer. The goal of this thesis is to validate Robotont's software stack on alternative compute devices (e.g. Raspberry Pi 4, Intel Compute Stick, and NVIDIA Jetson Nano) and benchmark their performance on the most popular Robotont use-case demos (e.g. webapp teleop, AR-tag steering, 2D mapping, and 3D mapping). The objective is to propose at least two different compute solutions: one that optimizes for cost and another that optimizes for performance. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT Lite ===&lt;br /&gt;
The goal of this thesis is to optimize the ROBOTONT platform for cost by replacing the onboard computer and sensors with low-cost alternatives while ensuring ROS/ROS2 software compatibility. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: integrating a graphical programming interface ===&lt;br /&gt;
The goal of this thesis is to integrate a graphical programming solution (e.g. Scratch or Blockly) to enable programming of Robotont by non-experts. See, e.g., https://doi.org/10.48550/arXiv.2011.13706&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== SemuBOT: multiple topics ===&lt;br /&gt;
In 2023/2024, many topics are offered to support the development of the open-source humanoid robot project SemuBOT (https://www.facebook.com/semubotmtu/)&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Robotic Study Companion: a social robot for students in higher education ===&lt;br /&gt;
Potential Topics:&lt;br /&gt;
* Enhance the Robot's Speech/Natural Language Capabilities &lt;br /&gt;
* Build a Local Language Model for the RSC&lt;br /&gt;
* Develop and Program the Robot’s Behavior and Personality&lt;br /&gt;
* Build a Digital Twin Simulation for Multimodal Interaction&lt;br /&gt;
* Explore the Use of the RSC as an Affective Robot to Address Students’ Academic Emotions&lt;br /&gt;
* Explore and Implement Cybersecurity Measures for a Social Robot&lt;br /&gt;
&lt;br /&gt;
[https://github.com/orgs/RobotStudyCompanion/discussions/3 More info on GitHub] | or reach out to farnaz.baksh@ut.ee&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Virtual reality user interface (VRUI) for intuitive teleoperation system ===&lt;br /&gt;
[[Image:LeapMotion.png|200px|thumb|right|Detecting 2 hands with Leap Motion Controller]]&lt;br /&gt;
Enhancing the user-experience of a virtual reality UI developed by [https://github.com/ut-ims-robotics/vrui_rviz Georg Astok]. Potentially adding [http://www.osvr.org/hardware-devs.html virtual reality capability] to a [https://www.youtube.com/watch?v=L25HHFd00rc gesture- and natural-language-based robot teleoperation system].&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
[[Image:Temoto-working-principle-in-pics.png|x160px|Gesture-based teleoperation]]&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Modeling humans for human-robot interaction ===&lt;br /&gt;
True human-robot collaboration means that the robot must understand the actions, intention, and state of its human partner. This work involves using cameras and other human-sensing devices to digitally represent and model humans. There are multiple stages of modeling: a) physical models of human kinematics and dynamics; b) higher-level models for recognizing human intent.&amp;lt;br&amp;gt;&lt;br /&gt;
[[Image:Skeletal ROS.PNG|x160px|ROS &amp;amp; Kinect &amp;amp; Skeleton-Markers Package]]&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Robotic avatar for telepresence ===&lt;br /&gt;
Integrating hand gestures and head movements to control a robot avatar in virtual reality user interface.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Detection of hardware and software resources for smart integration of robots ===&lt;br /&gt;
The vast majority of today’s robotic applications rely on hard-coded device and algorithm usage. This project focuses on developing Resource Snooper software that can detect the addition or removal of resources, enabling dynamic reconfiguration of robotic systems. This project is developed as a subsystem of [https://utnuclearroboticspublic.github.io/temoto2/index.html TeMoto].&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Human-Robot and Robot-Robot collaboration applications ===&lt;br /&gt;
Creating a demo or analysis of software capabilities related to human-robot or robot-robot teams:&lt;br /&gt;
* human-robot collaborative assembly&lt;br /&gt;
* distributed mapping: analysis and demo of existing ROS packages for multi-robot mapping (e.g., segmap https://youtu.be/JJhEkIA1xSE)&lt;br /&gt;
* Inaccessible region teamwork&lt;br /&gt;
**youbot+drone - a drone maps the environment (for example, a maze) and a ground vehicle uses this information to traverse the maze&lt;br /&gt;
**youbot+robotont - youbot cannot go up ledges, but it can lift a smaller robot, such as robotont, up a ledge.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Mirroring human hand movements on industrial robots ===&lt;br /&gt;
The goal of this project is to integrate continuous control of an industrial robot manipulator with a gestural telerobotics interface. The recommended tools for this thesis project are a Leap Motion Controller or a standard web camera, Ultraleap, a Universal Robot UR5 manipulator, and ROS.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: TeMoto for robotont ===&lt;br /&gt;
Swarm-management and UMRF-based task loading for robotont using [https://github.com/temoto-framework TeMoto] framework.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Enhancing teleoperation control interface with augmented cues to provoke caution ===&lt;br /&gt;
The task is to create a telerobot control interface where the video feed from the remote site and/or a mixed-reality scene is augmented with visual cues to provoke caution in the human operator.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Robot-to-human interaction ===&lt;br /&gt;
As robots and autonomous machines start sharing the same space as humans, their actions need to be understood by the people occupying that space. For instance, a human worker needs to understand what the robot partner is planning next, or a pedestrian needs to clearly comprehend the behaviour of a driverless vehicle. To reduce this ambiguity, the robot needs mechanisms to convey its intent (whatever it is going to do next). The aim of the thesis is to outline existing methods for machines to convey their intent and to develop a unified model interface for expressing that intent. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Gaze-based handover prediction ===&lt;br /&gt;
When a human needs to pass an object to a robot manipulator, the robot must understand where in 3D space the object handover occurs and then plan an appropriate motion. Human gaze can be used as the input for predicting which object to track. The thesis activities involve camera-based eye tracking and safe motion planning.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: Human-height human-robot interface for Robotont ground robot ===&lt;br /&gt;
Robotont is an ankle-high flat mobile robot. For humans to interact with Robotont, there is a need for a compact and lightweight mechanical structure that is tall enough for comfortable human-robot interaction. The objective of the thesis is to develop the mechanical designs and build prototypes that ensure stable operation and meet the aesthetic requirements for use in public places. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Stratos Explore Ultraleap demonstrator for robotics ===&lt;br /&gt;
The aim of this thesis is to systematically analyze the strengths and limitations of the Stratos Explore Ultraleap device in the context of controlling a robot, and subsequently to implement a demonstrator application that shows off its applicability for robot control. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Mixed-reality scene creation for vehicle teleoperation ===&lt;br /&gt;
Fusing different sensory feeds for creating high-usability teleoperation scene.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Validation study for AR-based robot user-interfaces ===&lt;br /&gt;
Designing and carrying out a user study to validate the functionality and usability of a human-robot interface.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Quantitative Evaluation of the Situation Awareness during the Teleoperation of an Urban Vehicle ===&lt;br /&gt;
The goal of this thesis is to develop the methodology for measuring the operator's situation awareness while operating an autonomous urban vehicle. Potential metrics could include driving accuracy, speed, and latency. This thesis will be conducted as part of the activities at the [https://adl.cs.ut.ee Autonomous Driving Lab].&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Sign-language-based control for robots ===&lt;br /&gt;
The goal of this thesis is to design and implement the means for interacting with a robot via conventional sign language, i.e. the human can command a robot by using sign language.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== LLM-based task planning for robots ===&lt;br /&gt;
The goal of this thesis is to enable high-level task planning for autonomous robots performing general-purpose tasks. The thesis would leverage LLM reasoning and the TeMoto Action Engine to achieve natural task negotiation, planning, and execution. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
= Completed projects =&lt;br /&gt;
== PhD theses ==&lt;br /&gt;
*Robert Valner, [https://hdl.handle.net/10062/105994 Design of TeMoto, a software framework for dependable, adaptive, and collaborative autonomous robots] [TeMoto – töökindlate, adaptiivsete ja koostöövõimeliste autonoomsete robotite arendamise tarkvararaamistik], PhD thesis, 2024&lt;br /&gt;
*Houman Masnavi, [https://hdl.handle.net/10062/91394 Visibility aware navigation] [Nähtavust arvestav navigatsioon], PhD thesis, 2023&lt;br /&gt;
&lt;br /&gt;
== Master's theses ==&lt;br /&gt;
*Robert Allik, Validation of NoMaD as a Global Planner for Mobile Robots, MS thesis, 2024&lt;br /&gt;
*Rauno Põlluäär, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=79393 Designing and Implementing a Bird’s-eye View Interface for a Self-driving Vehicle’s Teleoperation System] [Isejuhtiva sõiduki kaugjuhtimissüsteemile linnuvaate kasutajaliidese loomine], MS thesis, 2024&lt;br /&gt;
*Gautier Reynes, VR-Enhanced Remote Inspection Framework for Semi-Autonomous Robot Fleet, MS thesis, 2024&lt;br /&gt;
*Eva Mõtshärg, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=77385 3D-prinditava kere disain ja analüüs vabavaralisele haridusrobotile Robotont] [Design and Analysis of a 3D Printable Chassis for the Open Source Educational Robot Robotont], MS thesis, 2023&lt;br /&gt;
*Farnaz Baksh, [https://dspace.ut.ee/items/a7a9cc15-27e9-450c-94e8-3267f0c95c56 An Open-source Robotic Study Companion for University Students] [Avatud lähtekoodiga robotõpikaaslane üliõpilastele], MS thesis, 2023&lt;br /&gt;
*Igor Rybalskii, [http://hdl.handle.net/10062/83028 Augmented reality (AR) for enabling human-robot collaboration with ROS robots] [Liitreaalsus inimese ja roboti koostöö võimaldamiseks ROS-i robotitega], MS thesis, 2022&lt;br /&gt;
*Md. Maniruzzaman, [http://hdl.handle.net/10062/83025 Object search and retrieval in indoor environment using a Mobile Manipulator] [Objektide otsimine ja teisaldamine siseruumides mobiilse manipulaatori abil], MS thesis, 2022&lt;br /&gt;
*Allan Kustavus, [http://hdl.handle.net/10062/72651 Design and Implementation of a Generalized Resource Management Architecture in the TeMoto Software Framework] [Üldise ressursihalduri disain ja teostus TeMoto tarkvara raamistikule], MS thesis, 2021&lt;br /&gt;
*Kristina Meister, [http://hdl.handle.net/10062/72350 External human-vehicle interaction - a study in the context of an autonomous ride-hailing service], MS thesis, 2021&lt;br /&gt;
*Muhammad Usman, [http://hdl.handle.net/10062/72126 Development of an Optimization-Based Motion Planner and Its ROS Interface for a Non-Holonomic Mobile Manipulator] [Optimeerimisele baseeruva liikumisplaneerija arendamine ja selle ROSi liides mitteholonoomse mobiilse manipulaatori jaoks], MS thesis, 2020&lt;br /&gt;
*Maarika Oidekivi, [http://hdl.handle.net/10062/72119 Masina kavatsuse väljendamine ja tõlgendamine] [Communicating and interpreting machine intent], MS thesis, 2020&lt;br /&gt;
*Houman Masnavi, [http://hdl.handle.net/10062/72118 Multi-Robot Motion Planning for Shared Payload Transportation] [Rajaplaneerimine multi-robot süsteemile jagatud lasti transportimisel], MS thesis, 2020&lt;br /&gt;
*Fabian Ernesto Parra Gil, [http://hdl.handle.net/10062/72112 Implementation of Robot Manager Subsystem for Temoto Software Framework] [Robotite Halduri alamsüsteemi väljatöötamine tarkvararaamistikule TEMOTO], MS thesis, 2020&lt;br /&gt;
*Zafarullah, [http://hdl.handle.net/10062/72125 Gaze Assisted Neural Network based Prediction of End-Point of Human Reaching Trajectories], MS thesis, 2020&lt;br /&gt;
*Madis K Nigol, [http://hdl.handle.net/10062/64339 Õppematerjalid robotplatvormile Robotont] [Study materials for robot platform Robotont], MS thesis, 2019&lt;br /&gt;
*Renno Raudmäe, [http://hdl.handle.net/10062/64341 Avatud robotplatvorm Robotont] [Open source robotics platform Robotont], MS thesis, 2019&lt;br /&gt;
*Asif Sattar, [http://hdl.handle.net/10062/64352 Human detection and distance estimation with monocular camera using YOLOv3 neural network] [Inimeste tuvastamine ning kauguse hindamine kasutades kaamerat ning YOLOv3 tehisnärvivõrku], MS thesis, 2019&lt;br /&gt;
*Ragnar Margus, [http://hdl.handle.net/10062/64337 Kergliiklusvahendite jagamisteenuseks vajaliku positsioneerimismooduli loomine ja uurimine] [Development and testing of an IoT module for electric vehicle sharing service], MS thesis, 2019&lt;br /&gt;
*Pavel Šumejko, [http://hdl.handle.net/10062/64320 Robust Solution for Extrinsic Calibration of a 2D Laser-Rangefinder and a Monocular USB Camera] [Meetod 2D laserkaugusmõõdiku ja USB-kaamera väliseks kalibreerimiseks], MS thesis, 2019&lt;br /&gt;
*Dzvezdana Arsovska, [http://hdl.handle.net/10062/64321 Building an Efficient and Secure Software Supply Pipeline for Aerial Robotics Application] [Efektiivne ja turvaline tarkvaraarendusahel lennurobootika rakenduses], MS thesis, 2019&lt;br /&gt;
*Tõnis Tiimus, [http://hdl.handle.net/10062/60713 Visuaalselt esilekutsutud potentsiaalidel põhinev aju-arvuti liides robootika rakendustes] [A VEP-based BCI for robotics applications], MS thesis, 2018&lt;br /&gt;
*Martin Appo, [http://hdl.handle.net/10062/60718 Hardware-agnostic compliant control ROS package for collaborative industrial manipulators] [Riistvarapaindlik ROSi tarkvarapakett tööstuslike robotite mööndlikuks juhtimiseks], MS thesis, 2018&lt;br /&gt;
*Hassan Mahmoud Shehawy Elhanash, [http://hdl.handle.net/10062/60711 Optical Tracking of Forearm for Classifying Fingers Poses] [Küünarvarre visuaalne jälgimine ennustamaks sõrmede asendeid], MS thesis, 2018&lt;br /&gt;
&lt;br /&gt;
== Bachelor's theses ==&lt;br /&gt;
*Oliver Voorel&lt;br /&gt;
*Karl Rahn&lt;br /&gt;
*Mattias Mäe&lt;br /&gt;
*Jürgen Kottise&lt;br /&gt;
*Martin Kaur&lt;br /&gt;
*Karl-Jürgen Siilak&lt;br /&gt;
*Usman&lt;br /&gt;
*Robina&lt;br /&gt;
*Omar&lt;br /&gt;
*Albert Unn, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=79937 Suhtlusvõimekuse arendamine sotsiaalsele humanoidrobotile SemuBot] [Developing the ability to communicate for SemuBot, a social humanoid robot], BS thesis, 2024&lt;br /&gt;
*Veronika-Marina Volynets, [https://hdl.handle.net/10062/99943 Development of Control Electronics and Program for Robotont's Height Adjustment Mechanism] [Juhtelektroonika ja programmi väljatöötamine Robotondi kõrguse reguleerimise mehhanismile], BS thesis, 2024&lt;br /&gt;
*Nikita Kurenkov, [https://hdl.handle.net/10062/99880 Flexible Screen Integration and Development of Neck Movement Mechanism for Social Humanoid Robot SemuBot] [Humanoidroboti näo lahenduse leidmine ja rakendamine; humanoidroboti jaoks kaela mehhanismi väljatöötamine], BS thesis, 2024&lt;br /&gt;
*Elchin Huseynov, [https://hdl.handle.net/10062/99668 Design and Control of a Social Humanoid Robot - SemuBot’s Hand] [Sotsiaalse Humanoidroboti Disain ja Juhtimine - SemuBoti Käsi], BS thesis, 2024&lt;br /&gt;
*Veronika Podliesnova, [https://hdl.handle.net/10062/99636 Real-Time Detection of Robot Failures by Monitoring Operator’s Brain Activity with EEG-based Brain-Computer Interface] [Reaalajas Robotirikke Tuvastamine Operaatori Ajutegevuse Jälgimise teel EEG-põhise Aju-Arvuti Liidese abil], BS thesis, 2024&lt;br /&gt;
*Märten Josh Peedimaa, [https://hdl.handle.net/not-yet Rhea: An Open-Source Table Tennis Ball Launcher Robot for Multiball Training] [Avatud lähtekoodiga lauatennise palliviske robot mitmikpall treenimiseks], BS thesis, 2024&lt;br /&gt;
*Karl Sander Vinkel, [https://hdl.handle.net/not-yet Laadimisjaama ja transportkesta väljatöötamine õpperobotile Robotont] [Development of charging dock and transport case for Robotont], BS thesis, 2024&lt;br /&gt;
*Raimo Köidam, [https://hdl.handle.net/not-yet Valguslahenduse tarkvara väljatöötamine õpperobotile Robotont] [Development of light solution software for the educational robot Robotont], BS thesis, 2024&lt;br /&gt;
*Robert Valge, [https://hdl.handle.net/not-yet Toitepinge ja tarbevoolu monitoorimine ning toitehalduse püsivara loomine õpperobotil Robotont] [Monitoring supply voltage and current consumption and creating firmware for Robotont’s power management system], BS thesis, 2024&lt;br /&gt;
*Leonid Tšigrinski, [https://hdl.handle.net/not-yet Õpperoboti Robotont püsivara arhitektuuri uuendamine] [Education robot &amp;quot;Robotont&amp;quot; firmware architecture updating], BS thesis, 2024&lt;br /&gt;
*Andres Sakk, [https://hdl.handle.net/not-yet Avatud robotplatvormi Robotont 3 kasutajaliidese väljatöötamine] [Development of a user interface for the open robotics platform Robotont 3], BS thesis, 2024&lt;br /&gt;
*Kaur Kullamäe, [https://hdl.handle.net/not-yet Elektroonikalahendus ja püsivara robotkäe juhtimiseks sotsiaalsel humanoidrobotil SemuBot] [Building the SemuBot arm electronics system], BS thesis, 2024&lt;br /&gt;
*Kristjan Madis Kask, [https://hdl.handle.net/not-yet Käe mehaanika disain sotsiaalsele humanoidrobotile SemuBot] [Arm mechanics design for social humanoid robot SemuBot], BS thesis, 2024&lt;br /&gt;
*Sven-Ervin Paap, [https://hdl.handle.net/not-yet ROS2 draiver õpperobotile Robotont] [ROS2 driver for the educational robot Robotont], BS thesis, 2024&lt;br /&gt;
*Timur Nizamov, [https://hdl.handle.net/10062/99592 Audio System for the Social Humanoid Robot SemuBot] [Helisüsteem sotsiaalsele humanoidrobotile SemuBot], BS thesis, 2024&lt;br /&gt;
*Georgs Narbuts, [https://hdl.handle.net/10062/99554 Holonomic Motion Drive System of a Social Humanoid Robot SemuBot] [Sotsiaalse humanoidroboti SemuBoti holonoomne liikuv ajamisüsteem], BS thesis, 2024&lt;br /&gt;
*Iryna Hurova, [https://hdl.handle.net/10062/90638 Kitting station of the learning factory] [Õppetehase komplekteerimisjaam], BS thesis, 2023&lt;br /&gt;
*Paola Avalos Conchas, [https://hdl.handle.net/10062/90641 Payload transportation system of a learning factory] [Õppetehase kasuliku koorma transpordisüsteem], BS thesis, 2023&lt;br /&gt;
*Pille Pärnalaas, [https://hdl.handle.net/10062/93431 Pöördpõik ajami arendus robotplatvormile Robotont] [Development of swerve drive for robotic platform Robotont], BS thesis, 2023&lt;br /&gt;
*Carl Hjalmar Love Hult, [https://hdl.handle.net/10062/90673 Using and Evaluating the Real-time Spatial Perception System Hydra in Real-world Scenarios] [Reaalajas toimiva ruumilise taju süsteemi Hydra kasutamine ja hindamine praktilistes stsenaariumides], BS thesis, 2023&lt;br /&gt;
*Priit Rooden, [https://hdl.handle.net/10062/93432 Autonoomse laadimislahenduse väljatöötamine õpperobotile Robotont] [Development of an autonomous charging solution for the robot platform Robotont], BS thesis, 2023&lt;br /&gt;
*Marko Muro, [https://hdl.handle.net/10062/93429 Robotondi akulahenduse ning 12 V pingeregulaatori prototüüpimine] [Prototyping battery solution and 12 V voltage regulator for Robotont], BS thesis, 2023&lt;br /&gt;
*Danel Leppenen, [https://hdl.handle.net/10062/93427 Nav2 PYIF: Python-based Motion Planning for ROS 2 Navigation 2] [Nav 2 PYIF: Pythoni põhine liikumise planeerija ROS 2 Navigation 2-le], BS thesis, 2023&lt;br /&gt;
*Kertrud Geddily Küüt, [https://hdl.handle.net/10062/93424 Kõrgust reguleeriv mehhanism Robotondile] [Height adjusting mechanism for Robotont], BS thesis, 2023&lt;br /&gt;
*Ingvar Drikkit, [https://hdl.handle.net/10062/93421 Lisaseadmete võimekuse arendamine haridusrobotile Robotont] [Developing add-on device support for the educational robot Robotont], BS thesis, 2023&lt;br /&gt;
*Kristo Pool, [https://hdl.handle.net/10062/93430 MoveIt 2 õppematerjalid] [Learning materials for MoveIt 2], BS thesis, 2023&lt;br /&gt;
*Erki Veeväli, [https://hdl.handle.net/10062/93434 Development of a Continuous Teleoperation System for Urban Road Vehicle] [Linnasõiduki pideva kaugjuhtimissüsteemi arendus], BS thesis, 2023&lt;br /&gt;
*Aleksandra Doroshenko, [https://hdl.handle.net/10062/93420 Haiglates inimesi juhatava roboti disain] [Hospital guide robot design], BS thesis, 2023&lt;br /&gt;
*Hui Shi, [http://hdl.handle.net/10062/83043 Expanding the Open-source ROS Software Package opencv_apps with Dedicated Blob Detection Functionality] [Avatud lähtekoodiga ROS-i tarkvarakimbu opencv_apps laiendamine laigutuvasti funktsioonaalsusega], BS thesis, 2022&lt;br /&gt;
*Dāvis Krūmiņš, [http://hdl.handle.net/10062/83040 Web-based learning and software development environment for remote access of ROS robots] [Veebipõhine õppe- ja tarkvaraarenduse keskkond ROS robotite juurdepääsuks kaugteel], BS thesis, 2022&lt;br /&gt;
*Anna Jakovleva, [http://hdl.handle.net/10062/83037 Roboquiz - an interactive human-robot game] [Roboquiz - interaktiivne inimese ja roboti mäng], BS thesis, 2022&lt;br /&gt;
*Kristjan Laht, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=74702&amp;amp;year=2022 Robot Localization with Fiducial Markers] [Roboti lokaliseerimine koordinaatmärkidega], BS thesis, 2022&lt;br /&gt;
*Hans Pärtel Pani, [http://hdl.handle.net/10062/83008 ROS draiver pehmerobootika haaratsile] [ROS Driver for Soft Robotic Gripper], BS thesis, 2022&lt;br /&gt;
*Markus Erik Sügis, [http://hdl.handle.net/10062/83015 Jagatud juhtimise põhimõttel realiseeritud robotite kaugjuhtimissüsteem] [A continuous teleoperating system based on shared control concept], BS thesis, 2022&lt;br /&gt;
*Taaniel Küla, [http://hdl.handle.net/10062/83007 ROS2 platvormile keevitusroboti tarkvara portotüübi loomine kasutades UR5 robot-manipulaatorit] [Welding robot software prototype for ROS2 using UR5 robot arm], BS thesis, 2022&lt;br /&gt;
*Rauno Põlluäär, [http://hdl.handle.net/10062/72672 Veebirakendus-põhine kasutajaliides avatud robotplatvormi Robotont juhtimiseks ja haldamiseks] [Web application-based user interface for controlling and managing open-source robotics platform Robotont], BS thesis, 2021&lt;br /&gt;
*Hendrik Olesk, [http://hdl.handle.net/10062/72664 Nägemisulatuses kaugjuhitava mobiilse robotmanipulaatori kasutajamugavuse tõstmine] [Improving the usability of a mobile manipulator robot for line-of-sight remote control], BS thesis, 2021&lt;br /&gt;
*Tarvi Tepandi, [http://hdl.handle.net/10062/72665 Segareaalsusel põhinev kasutajaliides mobiilse roboti kaugjuhtimiseks Microsoft HoloLens 2 vahendusel] [Mixed-reality user interface for teleoperating mobile robots with Microsoft HoloLens 2], BS thesis, 2021&lt;br /&gt;
*Rudolf Põldma, [http://hdl.handle.net/10062/72674 Tartu linna Narva maantee ringristmiku digikaksik] [Digital twin for Narva street roundabout in Tartu], BS thesis, 2021&lt;br /&gt;
*Kwasi Akuamoah Boateng, [http://hdl.handle.net/10062/72804 Digital Twin of a Teaching and Learning Robotics Lab] [Robotite õpetamise ja õppimise labori digitaalne kaksik], BS thesis, 2021&lt;br /&gt;
*Karina Sein, [http://hdl.handle.net/10062/72102 Eestikeelse kõnesünteesi võimaldamine robootika arendusplatvormil ROS] [Enabling Estonian speech synthesis on the Robot Operating System (ROS)], BS thesis, 2020&lt;br /&gt;
*Ranno Mäesepp, [http://hdl.handle.net/10062/72100 Takistuste vältimise lahendus õpperobotile Robotont] [Obstacle avoidance solution for educational robot platform Robotont], BS thesis, 2020&lt;br /&gt;
*Igor Rybalskii, [http://hdl.handle.net/10062/72060 Gesture Detection Software for Human-Robot Collaboration] [Žestituvastus tarkvara inimese ja roboti koostööks], BS thesis, 2020&lt;br /&gt;
*Meelis Pihlap, [http://hdl.handle.net/10062/64292 Mitme roboti koostöö funktsionaalsuste väljatöötamine tarkvararaamistikule TeMoto] [Multi-robot collaboration functionalities for robot software development framework TeMoto], BS thesis, 2019&lt;br /&gt;
*Kaarel Mark, [http://hdl.handle.net/10062/64290 Liitreaalsuse kasutamine tootmisprotsessis asukohtade määramisel] [Augmented reality for location determination in manufacturing], BS thesis, 2019&lt;br /&gt;
*Kätriin Julle, [http://hdl.handle.net/10062/64279 Roboti KUKA youBot riistvara ja ROS-tarkvara uuendamine] [Upgrading robot KUKA youBot’s hardware and ROS-software], BS thesis, 2019&lt;br /&gt;
*Georg Astok, [http://hdl.handle.net/10062/64274 Roboti juhtimine virtuaalreaalsuses kasutades ROS-raamistikku] [Creating virtual reality user interface using only ROS framework], BS thesis, 2019&lt;br /&gt;
*Martin Hallist, [http://hdl.handle.net/10062/64275 Robotipõhine kaugkohalolu käte liigutuste ülekandmiseks] [Teleoperation robot for arms motions], BS thesis, 2019&lt;br /&gt;
*Ahmed Hassan Helmy Mohamed, [http://hdl.handle.net/10062/63946 Software integration of autonomous robot system for mixing and serving drinks] [Jooke valmistava ja serveeriva robotsüsteemi tarkvaralahendus], BS thesis, 2019&lt;br /&gt;
*Kristo Allaje, [http://hdl.handle.net/10062/60294 Draiveripakett käte jälgimise seadme Leap Motion™ kontroller kasutamiseks robootika arendusplatvormil ROS] [Driver package for using Leap Motion™ controller on the robotics development platform ROS], BS thesis, 2018&lt;br /&gt;
*Martin Maidla, [http://hdl.handle.net/10062/60288 Avatud robotiarendusplatvormi Robotont omniliikumise ja odomeetria arendamine] [Omnimotion and odometry development for open robot development platform Robotont], BS thesis, 2018&lt;br /&gt;
*Raid Vellerind, [http://hdl.handle.net/10062/56559 Avatud robotiarendusplatvormi ROS võimekuse loomine Tartu Ülikooli Robotexi robootikaplatvormile] [ROS driver development for the University of Tartu’s Robotex robotics platform], BS thesis, 2017&lt;br /&gt;
*Kalle-Gustav Kruus, [http://hdl.handle.net/10062/32440 Jalgpalliroboti löögimehhanismi elektroonikalahendus] [Driver circuit for a kicking mechanism of a football robot], BS thesis, 2013&lt;br /&gt;
*Sven Kautlenbach, [http://hdl.handle.net/10062/25570 Autonoomne seade elastsusmooduli mõõtmiseks] [Electronic device for Young’s modulus measurements], BS thesis, 2011&lt;br /&gt;
&lt;br /&gt;
[[Category:Theses Topics]]&lt;/div&gt;</summary>
		<author><name>Karl</name></author>
	</entry>
	<entry>
		<id>https://ims.ut.ee/index.php?title=Theses_in_Robotics&amp;diff=43106</id>
		<title>Theses in Robotics</title>
		<link rel="alternate" type="text/html" href="https://ims.ut.ee/index.php?title=Theses_in_Robotics&amp;diff=43106"/>
		<updated>2025-07-09T20:48:25Z</updated>

		<summary type="html">&lt;p&gt;Karl: /* Highlighted theses topics for 2025/2026 study year */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:IMS-robotics]]&lt;br /&gt;
&amp;lt;div class=&amp;quot;toclimit-3&amp;quot;&amp;gt;__TOC__&amp;lt;/div&amp;gt; &lt;br /&gt;
= Projects in Advanced Robotics =&lt;br /&gt;
''The main objective of the following projects is to give students experience in working with advanced robotics technology. Our group is active in several R&amp;amp;D projects involving human-robot collaboration, intuitive teleoperation of robots, and autonomous navigation of unmanned mobile platforms. Our main software platforms are [http://www.ros.org/ Robot Operating System (ROS)] for developing software for advanced robot systems and [http://gazebosim.org/ Gazebo] for running realistic robotic simulations.''&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
For further information, contact [[User:Karl|Karl Kruusamäe]]. &lt;br /&gt;
&lt;br /&gt;
The following is not an exhaustive list of all available thesis/research topics.&lt;br /&gt;
&lt;br /&gt;
== Highlighted theses topics for 2025/2026 study year ==&lt;br /&gt;
# [[#LLM-based task planning for robots|LLM-based task planning for robots]]&lt;br /&gt;
# [[#ROBOTONT: analysis of different options as on-board computers|ROBOTONT: analysis of different options as on-board computers]]&lt;br /&gt;
# [[#SemuBOT: multiple topics|SemuBOT: multiple topics]]&lt;br /&gt;
# [[#ROBOTONT: integrating a graphical programming interface|ROBOTONT: integrating a graphical programming interface]]&lt;br /&gt;
# [[#Sign-language-based control for robots|Sign-language-based control for robots]]&lt;br /&gt;
# [[#Robotic Study Companion: a social robot for students in higher education|Robotic Study Companion: a social robot for students in higher education]]&lt;br /&gt;
# [[#Quantitative Evaluation of the Situation Awareness during the Teleoperation of an Urban Vehicle|Quantitative Evaluation of the Situation Awareness during the Teleoperation of an Urban Vehicle]]&lt;br /&gt;
&lt;br /&gt;
== List of potential thesis topics ==&lt;br /&gt;
Our inventory includes but is not limited to:&lt;br /&gt;
&amp;lt;gallery mode=&amp;quot;packed&amp;quot;&amp;gt;&lt;br /&gt;
File:Youbot.png|120px|thumb|KUKA youBot&lt;br /&gt;
File:Ur5_left.png|120px|thumb|Universal Robot UR5&lt;br /&gt;
File:Franka_Emika_Panda.jpg|120px|thumb|Franka Emika Panda&lt;br /&gt;
File:Clearpath_Jackal.jpg|120px|thumb|Clearpath Jackal&lt;br /&gt;
File:Kinova_KG-3_Gripper.jpg|120px|thumb|Kinova 3-finger gripper&lt;br /&gt;
File:Xarm7.jpg|120px|thumb|UFACTORY xArm7&lt;br /&gt;
File:Robotont_banner.png|120px|thumb|robotont&lt;br /&gt;
File:Turtlebot3-waffle-pi.jpg|thumb|TurtleBot3&lt;br /&gt;
File:Parrot-bebop-2.jpg|thumb|Parrot Bebop 2&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== ROBOTONT: Docker-Driven ROS Environment Switching ===&lt;br /&gt;
This thesis focuses on integrating Docker containers to manage and switch between different ROS (Robot Operating System) environments on Robotont. Currently, ROS is installed natively on Robotont's Ubuntu-based onboard computer, limiting flexibility in system recovery and configuration switching. The goal is to develop a Docker-based solution that allows users to switch between ROS environments directly from Robotont’s existing low-level menu interface.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: designing and implementing a communication protocol for additional devices ===&lt;br /&gt;
Robotont currently includes a designated area on its front for attaching additional devices, such as an ultrasonic range finder, a servo motor connected to an Arduino, or a MikroBUS device. This thesis aims to design and implement a standard communication protocol that will enable seamless integration and control of these devices through the high-level ROS framework running on Robotont’s onboard computer.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: analysis of different options as on-board computers ===&lt;br /&gt;
Currently, ROBOTONT uses an Intel NUC as its onboard computer. The goal of this thesis is to validate Robotont's software stack on alternative compute devices (e.g. Raspberry Pi 4, Intel Compute Stick, and NVIDIA Jetson Nano) and benchmark their performance on the most popular Robotont use-case demos (e.g. webapp teleop, ar-tag steering, 2D mapping, and 3D mapping). The objective is to propose at least two different compute solutions: one that optimizes for cost and another that optimizes for performance. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT Lite ===&lt;br /&gt;
The goal of this thesis is to optimize the ROBOTONT platform for cost by replacing the onboard computer and sensors with low-cost alternatives while ensuring ROS/ROS2 software compatibility. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: integrating a graphical programming interface ===&lt;br /&gt;
The goal of this thesis is to integrate a graphical programming solution (e.g. Scratch or Blockly) to enable programming of Robotont by non-experts. See, e.g., https://doi.org/10.48550/arXiv.2011.13706&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== SemuBOT: multiple topics ===&lt;br /&gt;
In 2023/2024, many topics were developed to support the open-source humanoid robot project SemuBOT (https://www.facebook.com/semubotmtu/).&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Robotic Study Companion: a social robot for students in higher education ===&lt;br /&gt;
Potential Topics:&lt;br /&gt;
* Enhance the Robot's Speech/Natural Language Capabilities &lt;br /&gt;
* Build a Local Language Model for the RSC&lt;br /&gt;
* Develop and Program the Robot’s Behavior and Personality&lt;br /&gt;
* Build a Digital Twin Simulation for Multimodal Interaction&lt;br /&gt;
* Explore the Use of the RSC as an Affective Robot to Address Students’ Academic Emotions&lt;br /&gt;
* Explore and Implement Cybersecurity Measures for Social Robot&lt;br /&gt;
&lt;br /&gt;
[https://github.com/orgs/RobotStudyCompanion/discussions/3 More info on Github] | reach out to farnaz.baksh@ut.ee for more info&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Virtual reality user interface (VRUI) for intuitive teleoperation system ===&lt;br /&gt;
[[Image:LeapMotion.png|200px|thumb|right|Detecting 2 hands with Leap Motion Controller]]&lt;br /&gt;
Enhancing the user-experience of a virtual reality UI developed by [https://github.com/ut-ims-robotics/vrui_rviz Georg Astok]. Potentially adding [http://www.osvr.org/hardware-devs.html virtual reality capability] to a [https://www.youtube.com/watch?v=L25HHFd00rc gesture- and natural-language-based robot teleoperation system].&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
[[Image:Temoto-working-principle-in-pics.png|x160px|Gesture-based teleoperation]]&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Modeling humans for human-robot interaction ===&lt;br /&gt;
True human-robot collaboration means that the robot must understand the actions, intentions, and state of its human partner. This work involves using cameras and other human-sensing devices to digitally represent and model humans. There are multiple stages of modeling: a) physical models of human kinematics and dynamics; b) higher-level models for recognizing human intent.&amp;lt;br&amp;gt;&lt;br /&gt;
[[Image:Skeletal ROS.PNG|x160px|ROS &amp;amp; Kinect &amp;amp; Skeleton-Markers Package]]&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Robotic avatar for telepresence ===&lt;br /&gt;
Integrating hand gestures and head movements to control a robot avatar in virtual reality user interface.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Detection of hardware and software resources for smart integration of robots ===&lt;br /&gt;
The vast majority of today’s robotic applications rely on hard-coded device and algorithm usage. This project focuses on developing Resource Snooper software that can detect the addition or removal of resources, enabling dynamic reconfiguration of robotic systems. This project is developed as a subsystem of [https://utnuclearroboticspublic.github.io/temoto2/index.html TeMoto].&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Human-Robot and Robot-Robot collaboration applications ===&lt;br /&gt;
Creating a demo or analysis of software capabilities related to human-robot or robot-robot teams:&lt;br /&gt;
* human-robot collaborative assembly&lt;br /&gt;
* distributed mapping; analysis and demo of existing ROS (e.g., segmap https://youtu.be/JJhEkIA1xSE) packages for multi-robot mapping&lt;br /&gt;
* Inaccessible region teamwork&lt;br /&gt;
**youbot+drone - a drone maps the environment (for example a maze) and ground vehicle uses this information to traverse the maze&lt;br /&gt;
**youbot+robotont - youbot cannot go up ledges but it can lift smaller robot, such as robotont, up a ledge.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Mirroring human hand movements on industrial robots ===&lt;br /&gt;
The goal of this project is to integrate continuous control of industrial robot manipulator with a gestural telerobotics interface. The recommended tools for this thesis project are Leap Motion Controller or a standard web camera, Ultraleap, Universal Robot UR5 manipulator, and ROS.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: TeMoto for robotont ===&lt;br /&gt;
Swarm-management and UMRF-based task loading for robotont using [https://github.com/temoto-framework TeMoto] framework.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Enhancing teleoperation control interface with augmented cues to provoke caution===&lt;br /&gt;
The task is to create a telerobot control interface where video feed from the remote site and/or a mixed-reality scene is augmented with visual cues to provoke caution in human operator.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Robot-to-human interaction ===&lt;br /&gt;
As robots and autonomous machines start sharing the same space as humans, their actions need to be understood by the people occupying that space. For instance, a human worker needs to understand what the robot partner is planning next, or a pedestrian needs to clearly comprehend the behaviour of a driverless vehicle. To reduce this ambiguity, the robot needs mechanisms to convey its intent (whatever it is going to do next). The aim of the thesis is to outline existing methods for machines to convey their intent and to develop a unified model interface for expressing that intent. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Gaze-based handover prediction ===&lt;br /&gt;
When a human needs to pass an object to a robot manipulator, the robot must understand where in 3D space the object handover occurs and then plan an appropriate motion. Human gaze can be used as input for predicting which object to track. The thesis activities involve camera-based eye tracking and safe motion planning.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: Human-height human-robot interface for Robotont ground robot ===&lt;br /&gt;
Robotont is an ankle-high flat mobile robot. For humans to interact with Robotont, there is a need for a compact and lightweight mechanical structure that is tall enough for comfortable human-robot interaction. The objective of the thesis is to develop the mechanical designs and build prototypes that ensure stable operation and meet the aesthetic requirements for use in public places. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Stratos Explore Ultraleap demonstrator for robotics ===&lt;br /&gt;
The aim of this thesis is to systematically analyze the strengths and limitations of the Stratos Explore Ultraleap device in the context of controlling a robot, and subsequently to implement a demonstrator application showcasing its applicability for robot control. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Mixed-reality scene creation for vehicle teleoperation ===&lt;br /&gt;
Fusing different sensory feeds for creating high-usability teleoperation scene.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Validation study for AR-based robot user-interfaces ===&lt;br /&gt;
Designing and carrying out a user study to validate the functionality and usability of a human-robot interface.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Quantitative Evaluation of the Situation Awareness during the Teleoperation of an Urban Vehicle ===&lt;br /&gt;
The goal of this thesis is to develop the methodology for measuring the operator's situation awareness while operating an autonomous urban vehicle. Potential metrics could include driving accuracy, speed, and latency. This thesis will be conducted as part of the activities at the [https://adl.cs.ut.ee Autonomous Driving Lab].&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Sign-language-based control for robots ===&lt;br /&gt;
The goal of this thesis is to design and implement the means for interacting with a robot via conventional sign language, i.e. the human can command a robot by using sign language.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== LLM-based task planning for robots ===&lt;br /&gt;
The goal of this thesis is to enable high-level task planning for autonomous robots performing general-purpose tasks. The thesis would leverage an LLM's reasoning and the TeMoto Action Engine to achieve natural task negotiation, planning, and execution. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
= Completed projects =&lt;br /&gt;
== PhD theses ==&lt;br /&gt;
*Robert Valner, [https://hdl.handle.net/10062/105994 Design of TeMoto, a software framework for dependable, adaptive, and collaborative autonomous robots] [TeMoto – töökindlate, adaptiivsete ja koostöövõimeliste autonoomsete robotite arendamise tarkvararaamistik], PhD thesis, 2024&lt;br /&gt;
*Houman Masnavi, [https://hdl.handle.net/10062/91394 Visibility aware navigation] [Nähtavust arvestav navigatsioon], PhD thesis, 2023&lt;br /&gt;
&lt;br /&gt;
== Master's theses ==&lt;br /&gt;
*Robert Allik, Validation of NoMaD as a Global Planner for Mobile Robots, MS thesis, 2024&lt;br /&gt;
*Rauno Põlluäär, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=79393 Designing and Implementing a Bird’s-eye View Interface for a Self-driving Vehicle’s Teleoperation System] [Isejuhtiva sõiduki kaugjuhtimissüsteemile linnuvaate kasutajaliidese loomine], MS thesis, 2024&lt;br /&gt;
*Gautier Reynes, VR-Enhanced Remote Inspection Framework for Semi-Autonomous Robot Fleet, MS thesis, 2024&lt;br /&gt;
*Eva Mõtshärg, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=77385 3D-prinditava kere disain ja analüüs vabavaralisele haridusrobotile Robotont] [Design and Analysis of a 3D Printable Chassis for the Open Source Educational Robot Robotont], MS thesis, 2023&lt;br /&gt;
*Farnaz Baksh, [https://dspace.ut.ee/items/a7a9cc15-27e9-450c-94e8-3267f0c95c56 An Open-source Robotic Study Companion for University Students] [Avatud lähtekoodiga robotõpikaaslane üliõpilastele], MS thesis, 2023&lt;br /&gt;
*Igor Rybalskii, [http://hdl.handle.net/10062/83028 Augmented reality (AR) for enabling human-robot collaboration with ROS robots] [Liitreaalsus inimese ja roboti koostöö võimaldamiseks ROS-i robotitega], MS thesis, 2022&lt;br /&gt;
*Md. Maniruzzaman, [http://hdl.handle.net/10062/83025 Object search and retrieval in indoor environment using a Mobile Manipulator] [Objektide otsimine ja teisaldamine siseruumides mobiilse manipulaatori abil], MS thesis, 2022&lt;br /&gt;
*Allan Kustavus, [http://hdl.handle.net/10062/72651 Design and Implementation of a Generalized Resource Management Architecture in the TeMoto Software Framework] [Üldise ressursihalduri disain ja teostus TeMoto tarkvara raamistikule], MS thesis, 2021&lt;br /&gt;
*Kristina Meister, [http://hdl.handle.net/10062/72350 External human-vehicle interaction - a study in the context of an autonomous ride-hailing service], MS thesis, 2021&lt;br /&gt;
*Muhammad Usman, [http://hdl.handle.net/10062/72126 Development of an Optimization-Based Motion Planner and Its ROS Interface for a Non-Holonomic Mobile Manipulator] [Optimeerimisele baseeruva liikumisplaneerija arendamine ja selle ROSi liides mitteholonoomse mobiilse manipulaatori jaoks], MS thesis, 2020&lt;br /&gt;
*Maarika Oidekivi, [http://hdl.handle.net/10062/72119 Masina kavatsuse väljendamine ja tõlgendamine] [Communicating and interpreting machine intent], MS thesis, 2020&lt;br /&gt;
*Houman Masnavi, [http://hdl.handle.net/10062/72118 Multi-Robot Motion Planning for Shared Payload Transportation] [Rajaplaneerimine multi-robot süsteemile jagatud lasti transportimisel], MS thesis, 2020&lt;br /&gt;
*Fabian Ernesto Parra Gil, [http://hdl.handle.net/10062/72112 Implementation of Robot Manager Subsystem for Temoto Software Framework] [Robotite Halduri alamsüsteemi väljatöötamine tarkvararaamistikule TEMOTO], MS thesis, 2020&lt;br /&gt;
*Zafarullah, [http://hdl.handle.net/10062/72125 Gaze Assisted Neural Network based Prediction of End-Point of Human Reaching Trajectories], MS thesis, 2020&lt;br /&gt;
*Madis K Nigol, [http://hdl.handle.net/10062/64339 Õppematerjalid robotplatvormile Robotont] [Study materials for robot platform Robotont], MS thesis, 2019&lt;br /&gt;
*Renno Raudmäe, [http://hdl.handle.net/10062/64341 Avatud robotplatvorm Robotont] [Open source robotics platform Robotont], MS thesis, 2019&lt;br /&gt;
*Asif Sattar, [http://hdl.handle.net/10062/64352 Human detection and distance estimation with monocular camera using YOLOv3 neural network] [Inimeste tuvastamine ning kauguse hindamine kasutades kaamerat ning YOLOv3 tehisnärvivõrku], MS thesis, 2019&lt;br /&gt;
*Ragnar Margus, [http://hdl.handle.net/10062/64337 Kergliiklusvahendite jagamisteenuseks vajaliku positsioneerimismooduli loomine ja uurimine] [Development and testing of an IoT module for electric vehicle sharing service], MS thesis, 2019&lt;br /&gt;
*Pavel Šumejko, [http://hdl.handle.net/10062/64320 Robust Solution for Extrinsic Calibration of a 2D Laser-Rangefinder and a Monocular USB Camera] [Meetod 2D laserkaugusmõõdiku ja USB-kaamera väliseks kalibreerimiseks], MS thesis, 2019&lt;br /&gt;
*Dzvezdana Arsovska, [http://hdl.handle.net/10062/64321 Building an Efficient and Secure Software Supply Pipeline for Aerial Robotics Application] [Efektiivne ja turvaline tarkvaraarendusahel lennurobootika rakenduses], MS thesis, 2019&lt;br /&gt;
*Tõnis Tiimus, [http://hdl.handle.net/10062/60713 Visuaalselt esilekutsutud potentsiaalidel põhinev aju-arvuti liides robootika rakendustes] [A VEP-based BCI for robotics applications], MS thesis, 2018&lt;br /&gt;
*Martin Appo, [http://hdl.handle.net/10062/60718 Hardware-agnostic compliant control ROS package for collaborative industrial manipulators] [Riistvarapaindlik ROSi tarkvarapakett tööstuslike robotite mööndlikuks juhtimiseks], MS thesis, 2018&lt;br /&gt;
*Hassan Mahmoud Shehawy Elhanash, [http://hdl.handle.net/10062/60711 Optical Tracking of Forearm for Classifying Fingers Poses] [Küünarvarre visuaalne jälgimine ennustamaks sõrmede asendeid], MS thesis, 2018&lt;br /&gt;
&lt;br /&gt;
== Bachelor's theses ==&lt;br /&gt;
*Albert Unn, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=79937 Suhtlusvõimekuse arendamine sotsiaalsele humanoidrobotile SemuBot] [Developing the ability to communicate for SemuBot, a social humanoid robot], BS thesis, 2024&lt;br /&gt;
*Veronika-Marina Volynets, [https://hdl.handle.net/10062/99943 Development of Control Electronics and Program for Robotont's Height Adjustment Mechanism] [Juhtelektroonika ja programmi väljatöötamine Robotondi kõrguse reguleerimise mehhanismile], BS thesis, 2024&lt;br /&gt;
*Nikita Kurenkov, [https://hdl.handle.net/10062/99880 Flexible Screen Integration and Development of Neck Movement Mechanism for Social Humanoid Robot SemuBot] [Humanoidroboti näo lahenduse leidmine ja rakendamine; humanoidroboti jaoks kaela mehhanismi väljatöötamine], BS thesis, 2024&lt;br /&gt;
*Elchin Huseynov, [https://hdl.handle.net/10062/99668 Design and Control of a Social Humanoid Robot - SemuBot’s Hand] [Sotsiaalse Humanoidroboti Disain ja Juhtimine - SemuBoti Käsi], BS thesis, 2024&lt;br /&gt;
*Veronika Podliesnova, [https://hdl.handle.net/10062/99636 Real-Time Detection of Robot Failures by Monitoring Operator’s Brain Activity with EEG-based Brain-Computer Interface] [Reaalajas Robotirikke Tuvastamine Operaatori Ajutegevuse Jälgimise teel EEG-põhise Aju-Arvuti Liidese abil], BS thesis, 2024&lt;br /&gt;
*Märten Josh Peedimaa, [https://hdl.handle.net/not-yet Rhea: An Open-Source Table Tennis Ball Launcher Robot for Multiball Training] [Avatud lähtekoodiga lauatennise palliviske robot mitmikpall treenimiseks], BS thesis, 2024&lt;br /&gt;
*Karl Sander Vinkel, [https://hdl.handle.net/not-yet Laadimisjaama ja transportkesta väljatöötamine õpperobotile Robotont] [Development of charging dock and transport case for Robotont], BS thesis, 2024&lt;br /&gt;
*Raimo Köidam, [https://hdl.handle.net/not-yet Valguslahenduse tarkvara väljatöötamine õpperobotile Robotont] [Development of light solution software for the educational robot Robotont], BS thesis, 2024&lt;br /&gt;
*Robert Valge, [https://hdl.handle.net/not-yet Toitepinge ja tarbevoolu monitoorimine ning toitehalduse püsivara loomine õpperobotil Robotont] [Monitoring supply voltage and current consumption and creating firmware for Robotont’s power management system], BS thesis, 2024&lt;br /&gt;
*Leonid Tšigrinski, [https://hdl.handle.net/not-yet Õpperoboti Robotont püsivara arhitektuuri uuendamine] [Education robot &amp;quot;Robotont&amp;quot; firmware architecture updating], BS thesis, 2024&lt;br /&gt;
*Andres Sakk, [https://hdl.handle.net/not-yet Avatud robotplatvormi Robotont 3 kasutajaliidese väljatöötamine] [Development of a user interface for the open robotics platform Robotont 3], BS thesis, 2024&lt;br /&gt;
*Kaur Kullamäe, [https://hdl.handle.net/not-yet Elektroonikalahendus ja püsivara robotkäe juhtimiseks sotsiaalsel humanoidrobotil SemuBot] [Building the SemuBot arm electronics system], BS thesis, 2024&lt;br /&gt;
*Kristjan Madis Kask, [https://hdl.handle.net/not-yet Käe mehaanika disain sotsiaalsele humanoidrobotile SemuBot] [Arm mechanics design for social humanoid robot SemuBot], BS thesis, 2024&lt;br /&gt;
*Sven-Ervin Paap, [https://hdl.handle.net/not-yet ROS2 draiver õpperobotile Robotont] [ROS2 driver for the educational robot Robotont], BS thesis, 2024&lt;br /&gt;
*Timur Nizamov, [https://hdl.handle.net/10062/99592 Audio System for the Social Humanoid Robot SemuBot] [Helisüsteem sotsiaalsele humanoidrobotile SemuBot], BS thesis, 2024&lt;br /&gt;
*Georgs Narbuts, [https://hdl.handle.net/10062/99554 Holonomic Motion Drive System of a Social Humanoid Robot SemuBot] [Sotsiaalse humanoidroboti SemuBoti holonoomne liikuv ajamisüsteem], BS thesis, 2024&lt;br /&gt;
*Iryna Hurova, [https://hdl.handle.net/10062/90638 Kitting station of the learning factory] [Õppetehase komplekteerimisjaam], BS thesis, 2023&lt;br /&gt;
*Paola Avalos Conchas, [https://hdl.handle.net/10062/90641 Payload transportation system of a learning factory] [Õppetehase kasuliku koorma transpordisüsteem], BS thesis, 2023&lt;br /&gt;
*Pille Pärnalaas, [https://hdl.handle.net/10062/93431 Pöördpõik ajami arendus robotplatvormile Robotont] [Development of swerve drive for robotic platform Robotont], BS thesis, 2023&lt;br /&gt;
*Carl Hjalmar Love Hult, [https://hdl.handle.net/10062/90673 Using and Evaluating the Real-time Spatial Perception System Hydra in Real-world Scenarios] [Reaalajas toimiva ruumilise taju süsteemi Hydra kasutamine ja hindamine praktilistes stsenaariumides], BS thesis, 2023&lt;br /&gt;
*Priit Rooden, [https://hdl.handle.net/10062/93432 Autonoomse laadimislahenduse väljatöötamine õpperobotile Robotont] [Development of an autonomous charging solution for the robot platform Robotont], BS thesis, 2023&lt;br /&gt;
*Marko Muro, [https://hdl.handle.net/10062/93429 Robotondi akulahenduse ning 12 V pingeregulaatori prototüüpimine] [Prototyping battery solution and 12 V voltage regulator for Robotont], BS thesis, 2023&lt;br /&gt;
*Danel Leppenen, [https://hdl.handle.net/10062/93427 Nav2 PYIF: Python-based Motion Planning for ROS 2 Navigation 2] [Nav 2 PYIF: Pythoni põhine liikumise planeerija ROS 2 Navigation 2-le], BS thesis, 2023&lt;br /&gt;
*Kertrud Geddily Küüt, [https://hdl.handle.net/10062/93424 Kõrgust reguleeriv mehhanism Robotondile] [Height adjusting mechanism for Robotont], BS thesis, 2023&lt;br /&gt;
*Ingvar Drikkit, [https://hdl.handle.net/10062/93421 Lisaseadmete võimekuse arendamine haridusrobotile Robotont] [Developing add-on device support for the educational robot Robotont], BS thesis, 2023&lt;br /&gt;
*Kristo Pool, [https://hdl.handle.net/10062/93430 MoveIt 2 õppematerjalid] [Learning materials for MoveIt 2], BS thesis, 2023&lt;br /&gt;
*Erki Veeväli, [https://hdl.handle.net/10062/93434 Development of a Continuous Teleoperation System for Urban Road Vehicle] [Linnasõiduki pideva kaugjuhtimissüsteemi arendus], BS thesis, 2023&lt;br /&gt;
*Aleksandra Doroshenko, [https://hdl.handle.net/10062/93420 Haiglates inimesi juhatava roboti disain] [Hospital guide robot design], BS thesis, 2023&lt;br /&gt;
*Hui Shi, [http://hdl.handle.net/10062/83043 Expanding the Open-source ROS Software Package opencv_apps with Dedicated Blob Detection Functionality] [Avatud lähtekoodiga ROS-i tarkvarakimbu opencv_apps laiendamine laigutuvasti funktsioonaalsusega], BS thesis, 2022&lt;br /&gt;
*Dāvis Krūmiņš, [http://hdl.handle.net/10062/83040 Web-based learning and software development environment for remote access of ROS robots] [Veebipõhine õppe- ja tarkvaraarenduse keskkond ROS robotite juurdepääsuks kaugteel], BS thesis, 2022&lt;br /&gt;
*Anna Jakovleva, [http://hdl.handle.net/10062/83037 Roboquiz - an interactive human-robot game] [Roboquiz - interaktiivne inimese ja roboti mäng], BS thesis, 2022&lt;br /&gt;
*Kristjan Laht, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=74702&amp;amp;year=2022 Robot Localization with Fiducial Markers] [Roboti lokaliseerimine koordinaatmärkidega], BS thesis, 2022&lt;br /&gt;
*Hans Pärtel Pani, [http://hdl.handle.net/10062/83008 ROS draiver pehmerobootika haaratsile] [ROS Driver for Soft Robotic Gripper], BS thesis, 2022&lt;br /&gt;
*Markus Erik Sügis, [http://hdl.handle.net/10062/83015 Jagatud juhtimise põhimõttel realiseeritud robotite kaugjuhtimissüsteem] [A continuous teleoperating system based on shared control concept], BS thesis, 2022&lt;br /&gt;
*Taaniel Küla, [http://hdl.handle.net/10062/83007 ROS2 platvormile keevitusroboti tarkvara portotüübi loomine kasutades UR5 robot-manipulaatorit] [Welding robot software prototype for ROS2 using UR5 robot arm], BS thesis, 2022&lt;br /&gt;
*Rauno Põlluäär, [http://hdl.handle.net/10062/72672 Veebirakendus-põhine kasutajaliides avatud robotplatvormi Robotont juhtimiseks ja haldamiseks] [Web application-based user interface for controlling and managing open-source robotics platform Robotont], BS thesis, 2021&lt;br /&gt;
*Hendrik Olesk, [http://hdl.handle.net/10062/72664 Nägemisulatuses kaugjuhitava mobiilse robotmanipulaatori kasutajamugavuse tõstmine] [Improving the usability of a mobile manipulator robot for line-of-sight remote control], BS thesis, 2021&lt;br /&gt;
*Tarvi Tepandi, [http://hdl.handle.net/10062/72665 Segareaalsusel põhinev kasutajaliides mobiilse roboti kaugjuhtimiseks Microsoft HoloLens 2 vahendusel] [Mixed-reality user interface for teleoperating mobile robots with Microsoft HoloLens 2], BS thesis, 2021&lt;br /&gt;
*Rudolf Põldma, [http://hdl.handle.net/10062/72674 Tartu linna Narva maantee ringristmiku digikaksik] [Digital twin for Narva street roundabout in Tartu], BS thesis, 2021&lt;br /&gt;
*Kwasi Akuamoah Boateng, [http://hdl.handle.net/10062/72804 Digital Twin of a Teaching and Learning Robotics Lab] [Robotite õpetamise ja õppimise labori digitaalne kaksik], BS thesis, 2021&lt;br /&gt;
*Karina Sein, [http://hdl.handle.net/10062/72102 Eestikeelse kõnesünteesi võimaldamine robootika arendusplatvormil ROS] [Enabling Estonian speech synthesis on the Robot Operating System (ROS)], BS thesis, 2020&lt;br /&gt;
*Ranno Mäesepp, [http://hdl.handle.net/10062/72100 Takistuste vältimise lahendus õpperobotile Robotont] [Obstacle avoidance solution for educational robot platform Robotont], BS thesis, 2020&lt;br /&gt;
*Igor Rybalskii, [http://hdl.handle.net/10062/72060 Gesture Detection Software for Human-Robot Collaboration] [Žestituvastus tarkvara inimese ja roboti koostööks], BS thesis, 2020&lt;br /&gt;
*Meelis Pihlap, [http://hdl.handle.net/10062/64292 Mitme roboti koostöö funktsionaalsuste väljatöötamine tarkvararaamistikule TeMoto] [Multi-robot collaboration functionalities for robot software development framework TeMoto], BS thesis, 2019&lt;br /&gt;
*Kaarel Mark, [http://hdl.handle.net/10062/64290 Liitreaalsuse kasutamine tootmisprotsessis asukohtade määramisel] [Augmented reality for location determination in manufacturing], BS thesis, 2019&lt;br /&gt;
*Kätriin Julle, [http://hdl.handle.net/10062/64279 Roboti KUKA youBot riistvara ja ROS-tarkvara uuendamine] [Upgrading robot KUKA youBot’s hardware and ROS-software], BS thesis, 2019&lt;br /&gt;
*Georg Astok, [http://hdl.handle.net/10062/64274 Roboti juhtimine virtuaalreaalsuses kasutades ROS-raamistikku] [Creating virtual reality user interface using only ROS framework], BS thesis, 2019&lt;br /&gt;
*Martin Hallist, [http://hdl.handle.net/10062/64275 Robotipõhine kaugkohalolu käte liigutuste ülekandmiseks] [Teleoperation robot for arms motions], BS thesis, 2019&lt;br /&gt;
*Ahmed Hassan Helmy Mohamed, [http://hdl.handle.net/10062/63946 Software integration of autonomous robot system for mixing and serving drinks] [Jooke valmistava ja serveeriva robotsüsteemi tarkvaralahendus], BS thesis, 2019&lt;br /&gt;
*Kristo Allaje, [http://hdl.handle.net/10062/60294 Draiveripakett käte jälgimise seadme Leap Motion™ kontroller kasutamiseks robootika arendusplatvormil ROS] [Driver package for using Leap Motion™ controller on the robotics development platform ROS], BS thesis, 2018&lt;br /&gt;
*Martin Maidla, [http://hdl.handle.net/10062/60288 Avatud robotiarendusplatvormi Robotont omniliikumise ja odomeetria arendamine] [Omnimotion and odometry development for open robot development platform Robotont], BS thesis, 2018&lt;br /&gt;
*Raid Vellerind, [http://hdl.handle.net/10062/56559 Avatud robotiarendusplatvormi ROS võimekuse loomine Tartu Ülikooli Robotexi robootikaplatvormile] [ROS driver development for the University of Tartu’s Robotex robotics platform], BS thesis, 2017&lt;br /&gt;
*Kalle-Gustav Kruus, [http://hdl.handle.net/10062/32440 Jalgpalliroboti löögimehhanismi elektroonikalahendus] [Driver circuit for a kicking mechanism of a football robot], BS thesis, 2013&lt;br /&gt;
*Sven Kautlenbach, [http://hdl.handle.net/10062/25570 Autonoomne seade elastsusmooduli mõõtmiseks] [Electronic device for Young’s modulus measurements], BS thesis, 2011&lt;br /&gt;
&lt;br /&gt;
[[Category:Theses Topics]]&lt;/div&gt;</summary>
		<author><name>Karl</name></author>
	</entry>
	<entry>
		<id>https://ims.ut.ee/index.php?title=Theses_in_Robotics&amp;diff=43105</id>
		<title>Theses in Robotics</title>
		<link rel="alternate" type="text/html" href="https://ims.ut.ee/index.php?title=Theses_in_Robotics&amp;diff=43105"/>
		<updated>2025-07-09T20:47:23Z</updated>

		<summary type="html">&lt;p&gt;Karl: /* LLM-based task planning */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:IMS-robotics]]&lt;br /&gt;
&amp;lt;div class=&amp;quot;toclimit-3&amp;quot;&amp;gt;__TOC__&amp;lt;/div&amp;gt; &lt;br /&gt;
= Projects in Advanced Robotics =&lt;br /&gt;
''The main objective of the following projects is to give students experience in working with advanced robotics technology. Our group is active in several R&amp;amp;D projects involving human-robot collaboration, intuitive teleoperation of robots, and autonomous navigation of unmanned mobile platforms. Our main software platforms are [http://www.ros.org/ Robot Operating System (ROS)] for developing software for advanced robot systems and [http://gazebosim.org/ Gazebo] for running realistic robotic simulations.''&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
For further information, contact [[User:Karl|Karl Kruusamäe]]. &lt;br /&gt;
&lt;br /&gt;
The following is not an exhaustive list of all available thesis/research topics.&lt;br /&gt;
&lt;br /&gt;
== Highlighted theses topics for 2025/2026 study year ==&lt;br /&gt;
&lt;br /&gt;
# [[#ROBOTONT: analysis of different options as on-board computers|ROBOTONT: analysis of different options as on-board computers]]&lt;br /&gt;
# [[#SemuBOT: multiple topics|SemuBOT: multiple topics]]&lt;br /&gt;
# [[#ROBOTONT: integrating a graphical programming interface|ROBOTONT: integrating a graphical programming interface]]&lt;br /&gt;
# [[#Sign-language-based control for robots|Sign-language-based control for robots]]&lt;br /&gt;
# [[#Robotic Study Companion: a social robot for students in higher education|Robotic Study Companion: a social robot for students in higher education]]&lt;br /&gt;
# [[#Quantitative Evaluation of the Situation Awareness during the Teleoperation of an Urban Vehicle|Quantitative Evaluation of the Situation Awareness during the Teleoperation of an Urban Vehicle]]&lt;br /&gt;
&lt;br /&gt;
== List of potential thesis topics ==&lt;br /&gt;
Our inventory includes but is not limited to:&lt;br /&gt;
&amp;lt;gallery mode=&amp;quot;packed&amp;quot;&amp;gt;&lt;br /&gt;
File:Youbot.png|120px|thumb|KUKA youBot&lt;br /&gt;
File:Ur5_left.png|120px|thumb|Universal Robot UR5&lt;br /&gt;
File:Franka_Emika_Panda.jpg|120px|thumb|Franka Emika Panda&lt;br /&gt;
File:Clearpath_Jackal.jpg|120px|thumb|Clearpath Jackal&lt;br /&gt;
File:Kinova_KG-3_Gripper.jpg|120px|thumb|Kinova 3-finger gripper&lt;br /&gt;
File:Xarm7.jpg|120px|thumb|UFACTORY xArm7&lt;br /&gt;
File:Robotont_banner.png|120px|thumb|robotont&lt;br /&gt;
File:Turtlebot3-waffle-pi.jpg|thumb|TurtleBot3&lt;br /&gt;
File:Parrot-bebop-2.jpg|thumb|Parrot Bebop 2&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== ROBOTONT: Docker-Driven ROS Environment Switching ===&lt;br /&gt;
This thesis focuses on integrating Docker containers to manage and switch between different ROS (Robot Operating System) environments on Robotont. Currently, ROS is installed natively on Robotont's Ubuntu-based onboard computer, limiting flexibility in system recovery and configuration switching. The goal is to develop a Docker-based solution that allows users to switch between ROS environments directly from Robotont’s existing low-level menu interface.&lt;br /&gt;
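As a rough illustration of the intended mechanism (not part of the thesis deliverables), switching environments could amount to selecting among preconfigured containers from the low-level menu. The image names, flags, and environment labels below are illustrative assumptions, not an existing Robotont configuration:&lt;br /&gt;

```python
# Sketch: map menu entries to preconfigured ROS container images and
# build the corresponding `docker run` invocation. All names are hypothetical.

ENVIRONMENTS = {
    "ros-noetic": "robotont/ros:noetic",
    "ros2-humble": "robotont/ros2:humble",
}

def docker_run_command(env_name: str) -> list[str]:
    """Build the `docker run` argument list for a named ROS environment."""
    if env_name not in ENVIRONMENTS:
        raise ValueError(f"unknown environment: {env_name}")
    return [
        "docker", "run", "--rm", "-it",
        "--net=host",      # share the host network so ROS nodes stay reachable
        "--privileged",    # allow access to robot hardware (USB, serial)
        "-v", "/dev:/dev", # expose device files to the container
        ENVIRONMENTS[env_name],
    ]

if __name__ == "__main__":
    print(" ".join(docker_run_command("ros2-humble")))
```

The menu interface would then only need to launch the selected command, leaving the native OS untouched between switches.&lt;br /&gt;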
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: designing and implementing a communication protocol for additional devices ===&lt;br /&gt;
Robotont currently includes a designated area on its front for attaching additional devices, such as an ultrasonic range finder, a servo motor connected to an Arduino, or a MikroBUS device. This thesis aims to design and implement a standard communication protocol that will enable seamless integration and control of these devices through the high-level ROS framework running on Robotont’s onboard computer.&lt;br /&gt;
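For intuition only, a minimal framed protocol for such add-on devices might look like the sketch below; the frame layout, device IDs, and XOR checksum are illustrative assumptions, not the design the thesis is expected to produce:&lt;br /&gt;

```python
# Hypothetical frame layout for Robotont add-on devices:
#   [start byte 0x7E][device id][payload length][payload...][checksum]
# where the checksum is the XOR of all preceding bytes.

START_BYTE = 0x7E

def encode_frame(device_id: int, payload: bytes) -> bytes:
    """Pack a device message into a framed byte string."""
    body = bytes([START_BYTE, device_id, len(payload)]) + payload
    checksum = 0
    for b in body:
        checksum ^= b
    return body + bytes([checksum])

def decode_frame(frame: bytes) -> tuple[int, bytes]:
    """Validate a frame and return (device_id, payload)."""
    if frame[0] != START_BYTE:
        raise ValueError("missing start byte")
    checksum = 0
    for b in frame[:-1]:
        checksum ^= b
    if checksum != frame[-1]:
        raise ValueError("checksum mismatch")
    device_id, length = frame[1], frame[2]
    payload = frame[3:3 + length]
    if len(payload) != length:
        raise ValueError("truncated payload")
    return device_id, payload
```

A ROS node on the onboard computer could then translate decoded frames into topics or services for each attached device.&lt;br /&gt;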
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: analysis of different options as on-board computers ===&lt;br /&gt;
Currently ROBOTONT uses an Intel NUC as its onboard computer. The goal of this thesis is to validate Robotont's software stack on alternative compute devices (e.g. Raspberry Pi 4, Intel Compute Stick, and NVIDIA Jetson Nano) and benchmark their performance for the most popular Robotont use-case demos (e.g. webapp teleop, AR-tag steering, 2D mapping, and 3D mapping). The objective is to propose at least two different compute solutions: one that optimizes for cost and another that optimizes for performance. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT Lite ===&lt;br /&gt;
The goal of this thesis is to optimize the ROBOTONT platform for cost by replacing the onboard computer and sensor with low-cost alternatives while ensuring ROS/ROS2 software compatibility. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: integrating a graphical programming interface ===&lt;br /&gt;
The goal of this thesis is to integrate a graphical programming solution (e.g. Scratch or Blockly) to enable programming of Robotont by non-experts. See e.g. https://doi.org/10.48550/arXiv.2011.13706&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== SemuBOT: multiple topics ===&lt;br /&gt;
In 2023/2024, multiple topics are offered to support the development of the open-source humanoid robot project SemuBOT (https://www.facebook.com/semubotmtu/).&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Robotic Study Companion: a social robot for students in higher education ===&lt;br /&gt;
Potential Topics:&lt;br /&gt;
* Enhance the Robot's Speech/Natural Language Capabilities &lt;br /&gt;
* Build a Local Language Model for the RSC&lt;br /&gt;
* Develop and Program the Robot’s Behavior and Personality&lt;br /&gt;
* Build a Digital Twin Simulation for Multimodal Interaction&lt;br /&gt;
* Explore the Use of the RSC as an Affective Robot to Address Students’ Academic Emotions&lt;br /&gt;
* Explore and Implement Cybersecurity Measures for a Social Robot&lt;br /&gt;
&lt;br /&gt;
[https://github.com/orgs/RobotStudyCompanion/discussions/3 More info on Github] | reach out to farnaz.baksh@ut.ee for more info&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Virtual reality user interface (VRUI) for intuitive teleoperation system ===&lt;br /&gt;
[[Image:LeapMotion.png|200px|thumb|right|Detecting 2 hands with Leap Motion Controller]]&lt;br /&gt;
Enhancing the user-experience of a virtual reality UI developed by [https://github.com/ut-ims-robotics/vrui_rviz Georg Astok]. Potentially adding [http://www.osvr.org/hardware-devs.html virtual reality capability] to a [https://www.youtube.com/watch?v=L25HHFd00rc gesture- and natural-language-based robot teleoperation system].&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
[[Image:Temoto-working-principle-in-pics.png|x160px|Gesture-based teleoperation]]&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Modeling humans for human-robot interaction ===&lt;br /&gt;
True human-robot collaboration means that the robot must understand the actions, intention, and state of its human partner. This work involves using cameras and other human-sensing devices for digitally representing and modelling humans. There are multiple stages of modeling: a) physical models of human kinematics and dynamics; b) higher-level models for recognizing human intent.&amp;lt;br&amp;gt;&lt;br /&gt;
[[Image:Skeletal ROS.PNG|x160px|ROS &amp;amp; Kinect &amp;amp; Skeleton-Markers Package]]&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Robotic avatar for telepresence ===&lt;br /&gt;
Integrating hand gestures and head movements to control a robot avatar in virtual reality user interface.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Detection of hardware and software resources for smart integration of robots ===&lt;br /&gt;
The vast majority of today’s robotic applications rely on hard-coded device and algorithm usage. This project focuses on developing Resource Snooper software that can detect the addition or removal of resources, enabling dynamic reconfiguration of robotic systems. This project is developed as a subsystem of [https://utnuclearroboticspublic.github.io/temoto2/index.html TeMoto].&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Human-Robot and Robot-Robot collaboration applications ===&lt;br /&gt;
Creating a demo or analysis of software capabilities related to human-robot or robot-robot teams:&lt;br /&gt;
* human-robot collaborative assembly&lt;br /&gt;
* distributed mapping; analysis and demo of existing ROS (e.g., segmap https://youtu.be/JJhEkIA1xSE) packages for multi-robot mapping&lt;br /&gt;
* Inaccessible region teamwork&lt;br /&gt;
**youbot+drone - a drone maps the environment (for example, a maze) and a ground vehicle uses this information to traverse the maze&lt;br /&gt;
**youbot+robotont - the youbot cannot climb ledges, but it can lift a smaller robot, such as robotont, up a ledge.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Mirroring human hand movements on industrial robots ===&lt;br /&gt;
The goal of this project is to integrate continuous control of an industrial robot manipulator with a gestural telerobotics interface. The recommended tools for this thesis project are a Leap Motion Controller or a standard web camera, Ultraleap, a Universal Robot UR5 manipulator, and ROS.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: TeMoto for robotont ===&lt;br /&gt;
Swarm-management and UMRF-based task loading for robotont using [https://github.com/temoto-framework TeMoto] framework.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Enhancing teleoperation control interface with augmented cues to provoke caution ===&lt;br /&gt;
The task is to create a telerobot control interface where the video feed from the remote site and/or a mixed-reality scene is augmented with visual cues to provoke caution in the human operator.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Robot-to-human interaction ===&lt;br /&gt;
As robots and autonomous machines start sharing the same space as humans, their actions need to be understood by the people occupying that space. For instance, a human worker needs to understand what the robot partner is planning next, or a pedestrian needs to clearly comprehend the behaviour of a driverless vehicle. To reduce this ambiguity, the robot needs mechanisms to convey its intent (whatever it is going to do next). The aim of the thesis is to outline existing methods for machines to convey their intent and to develop a unified model interface for expressing that intent. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Gaze-based handover prediction ===&lt;br /&gt;
When a human needs to pass an object to a robot manipulator, the robot must understand where in 3D space the object handover occurs and then plan an appropriate motion. Human gaze can be used as the input for predicting which object to track. The thesis activities involve camera-based eye tracking and safe motion planning.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: Human-height human-robot interface for Robotont ground robot ===&lt;br /&gt;
Robotont is an ankle-high flat mobile robot. For humans to interact with Robotont, there is a need for a compact and lightweight mechanical structure that is tall enough for comfortable human-robot interaction. The objective of the thesis is to develop the mechanical designs and build prototypes that ensure stable operation and meet the aesthetic requirements for use in public places. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Stratos Explore Ultraleap demonstrator for robotics ===&lt;br /&gt;
The aim of this thesis is to systematically analyze the strengths and limitations of the Stratos Explore Ultraleap device in the context of controlling a robot, and subsequently to implement a demonstrator application showcasing its applicability for robot control. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Mixed-reality scene creation for vehicle teleoperation ===&lt;br /&gt;
Fusing different sensory feeds for creating high-usability teleoperation scene.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Validation study for AR-based robot user-interfaces ===&lt;br /&gt;
Designing and carrying out a user study to validate the functionality and usability of a human-robot interface.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Quantitative Evaluation of the Situation Awareness during the Teleoperation of an Urban Vehicle ===&lt;br /&gt;
The goal of this thesis is to develop the methodology for measuring the operator's situation awareness while operating an autonomous urban vehicle. Potential metrics could include driving accuracy, speed, and latency. This thesis will be conducted as part of the activities at the [https://adl.cs.ut.ee Autonomous Driving Lab].&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Sign-language-based control for robots ===&lt;br /&gt;
The goal of this thesis is to design and implement the means for interacting with a robot via conventional sign language, i.e. the human can command a robot by using sign language.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== LLM-based task planning for robots ===&lt;br /&gt;
The goal of this thesis is to enable high-level task planning for autonomous robots performing general-purpose tasks. The thesis would leverage an LLM's reasoning capabilities and the TeMoto Action Engine to achieve natural task negotiation, planning, and execution. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
= Completed projects =&lt;br /&gt;
== PhD theses ==&lt;br /&gt;
*Robert Valner, [https://hdl.handle.net/10062/105994 Design of TeMoto, a software framework for dependable, adaptive, and collaborative autonomous robots] [TeMoto – töökindlate, adaptiivsete ja koostöövõimeliste autonoomsete robotite arendamise tarkvararaamistik], PhD thesis, 2024&lt;br /&gt;
*Houman Masnavi, [https://hdl.handle.net/10062/91394 Visibility aware navigation] [Nähtavust arvestav navigatsioon], PhD thesis, 2023&lt;br /&gt;
&lt;br /&gt;
== Master's theses ==&lt;br /&gt;
*Robert Allik, Validation of NoMaD as a Global Planner for Mobile Robots, MS thesis, 2024&lt;br /&gt;
*Rauno Põlluäär, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=79393 Designing and Implementing a Bird’s-eye View Interface for a Self-driving Vehicle’s Teleoperation System] [Isejuhtiva sõiduki kaugjuhtimissüsteemile linnuvaate kasutajaliidese loomine], MS thesis, 2024&lt;br /&gt;
*Gautier Reynes, VR-Enhanced Remote Inspection Framework for Semi-Autonomous Robot Fleet, MS thesis, 2024&lt;br /&gt;
*Eva Mõtshärg, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=77385 3D-prinditava kere disain ja analüüs vabavaralisele haridusrobotile Robotont] [Design and Analysis of a 3D Printable Chassis for the Open Source Educational Robot Robotont], MS thesis, 2023&lt;br /&gt;
*Farnaz Baksh, [https://dspace.ut.ee/items/a7a9cc15-27e9-450c-94e8-3267f0c95c56 An Open-source Robotic Study Companion for University Students] [Avatud lähtekoodiga robotõpikaaslane üliõpilastele], MS thesis, 2023&lt;br /&gt;
*Igor Rybalskii, [http://hdl.handle.net/10062/83028 Augmented reality (AR) for enabling human-robot collaboration with ROS robots] [Liitreaalsus inimese ja roboti koostöö võimaldamiseks ROS-i robotitega], MS thesis, 2022&lt;br /&gt;
*Md. Maniruzzaman, [http://hdl.handle.net/10062/83025 Object search and retrieval in indoor environment using a Mobile Manipulator] [Objektide otsimine ja teisaldamine siseruumides mobiilse manipulaatori abil], MS thesis, 2022&lt;br /&gt;
*Allan Kustavus, [http://hdl.handle.net/10062/72651 Design and Implementation of a Generalized Resource Management Architecture in the TeMoto Software Framework] [Üldise ressursihalduri disain ja teostus TeMoto tarkvara raamistikule], MS thesis, 2021&lt;br /&gt;
*Kristina Meister, [http://hdl.handle.net/10062/72350 External human-vehicle interaction - a study in the context of an autonomous ride-hailing service], MS thesis, 2021&lt;br /&gt;
*Muhammad Usman, [http://hdl.handle.net/10062/72126 Development of an Optimization-Based Motion Planner and Its ROS Interface for a Non-Holonomic Mobile Manipulator] [Optimeerimisele baseeruva liikumisplaneerija arendamine ja selle ROSi liides mitteholonoomse mobiilse manipulaatori jaoks], MS thesis, 2020&lt;br /&gt;
*Maarika Oidekivi, [http://hdl.handle.net/10062/72119 Masina kavatsuse väljendamine ja tõlgendamine] [Communicating and interpreting machine intent], MS thesis, 2020&lt;br /&gt;
*Houman Masnavi, [http://hdl.handle.net/10062/72118 Multi-Robot Motion Planning for Shared Payload Transportation] [Rajaplaneerimine multi-robot süsteemile jagatud lasti transportimisel], MS thesis, 2020&lt;br /&gt;
*Fabian Ernesto Parra Gil, [http://hdl.handle.net/10062/72112 Implementation of Robot Manager Subsystem for Temoto Software Framework] [Robotite Halduri alamsüsteemi väljatöötamine tarkvararaamistikule TEMOTO], MS thesis, 2020&lt;br /&gt;
*Zafarullah, [http://hdl.handle.net/10062/72125 Gaze Assisted Neural Network based Prediction of End-Point of Human Reaching Trajectories], MS thesis, 2020&lt;br /&gt;
*Madis K Nigol, [http://hdl.handle.net/10062/64339 Õppematerjalid robotplatvormile Robotont] [Study materials for robot platform Robotont], MS thesis, 2019&lt;br /&gt;
*Renno Raudmäe, [http://hdl.handle.net/10062/64341 Avatud robotplatvorm Robotont] [Open source robotics platform Robotont], MS thesis, 2019&lt;br /&gt;
*Asif Sattar, [http://hdl.handle.net/10062/64352 Human detection and distance estimation with monocular camera using YOLOv3 neural network] [Inimeste tuvastamine ning kauguse hindamine kasutades kaamerat ning YOLOv3 tehisnärvivõrku], MS thesis, 2019&lt;br /&gt;
*Ragnar Margus, [http://hdl.handle.net/10062/64337 Kergliiklusvahendite jagamisteenuseks vajaliku positsioneerimismooduli loomine ja uurimine] [Development and testing of an IoT module for electric vehicle sharing service], MS thesis, 2019&lt;br /&gt;
*Pavel Šumejko, [http://hdl.handle.net/10062/64320 Robust Solution for Extrinsic Calibration of a 2D Laser-Rangefinder and a Monocular USB Camera] [Meetod 2D laserkaugusmõõdiku ja USB-kaamera väliseks kalibreerimiseks], MS thesis, 2019&lt;br /&gt;
*Dzvezdana Arsovska, [http://hdl.handle.net/10062/64321  Building an Efficient and Secure Software Supply Pipeline for Aerial Robotics Application] [Efektiivne ja turvaline tarkvaraarendusahel lennurobootika rakenduses], MS thesis, 2019&lt;br /&gt;
*Tõnis Tiimus, [http://hdl.handle.net/10062/60713 Visuaalselt esilekutsutud potentsiaalidel põhinev aju-arvuti liides robootika rakendustes] [A VEP-based BCI for robotics applications], MS thesis, 2018&lt;br /&gt;
*Martin Appo, [http://hdl.handle.net/10062/60718 Hardware-agnostic compliant control ROS package for collaborative industrial manipulators] [Riistvarapaindlik ROSi tarkvarapakett tööstuslike robotite mööndlikuks juhtimiseks], MS thesis, 2018&lt;br /&gt;
*Hassan Mahmoud Shehawy Elhanash, [http://hdl.handle.net/10062/60711 Optical Tracking of Forearm for Classifying Fingers Poses] [Küünarvarre visuaalne jälgimine ennustamaks sõrmede asendeid], MS thesis, 2018&lt;br /&gt;
&lt;br /&gt;
== Bachelor's theses ==&lt;br /&gt;
*Albert Unn, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=79937 Suhtlusvõimekuse arendamine sotsiaalsele humanoidrobotile SemuBot] [Developing the ability to communicate for SemuBot, a social humanoid robot], BS thesis, 2024&lt;br /&gt;
*Veronika-Marina Volynets, [https://hdl.handle.net/10062/99943 Development of Control Electronics and Program for Robotont's Height Adjustment Mechanism] [Juhtelektroonika ja programmi väljatöötamine Robotondi kõrguse reguleerimise mehhanismile], BS thesis, 2024&lt;br /&gt;
*Nikita Kurenkov, [https://hdl.handle.net/10062/99880 Flexible Screen Integration and Development of Neck Movement Mechanism for Social Humanoid Robot SemuBot] [Humanoidroboti näo lahenduse leidmine ja rakendamine; humanoidroboti jaoks kaela mehhanismi väljatöötamine], BS thesis, 2024&lt;br /&gt;
*Elchin Huseynov, [https://hdl.handle.net/10062/99668 Design and Control of a Social Humanoid Robot - SemuBot’s Hand] [Sotsiaalse Humanoidroboti Disain ja Juhtimine - SemuBoti Käsi], BS thesis, 2024&lt;br /&gt;
*Veronika Podliesnova, [https://hdl.handle.net/10062/99636 Real-Time Detection of Robot Failures by Monitoring Operator’s Brain Activity with EEG-based Brain-Computer Interface] [Reaalajas Robotirikke Tuvastamine Operaatori Ajutegevuse Jälgimise teel EEG-põhise Aju-Arvuti Liidese abil], BS thesis, 2024&lt;br /&gt;
*Märten Josh Peedimaa, [https://hdl.handle.net/not-yet Rhea: An Open-Source Table Tennis Ball Launcher Robot for Multiball Training] [Avatud lähtekoodiga lauatennise palliviske robot mitmikpall treenimiseks], BS thesis, 2024&lt;br /&gt;
*Karl Sander Vinkel, [https://hdl.handle.net/not-yet Laadimisjaama ja transportkesta väljatöötamine õpperobotile Robotont] [Development of charging dock and transport case for Robotont], BS thesis, 2024&lt;br /&gt;
*Raimo Köidam, [https://hdl.handle.net/not-yet Valguslahenduse tarkvara väljatöötamine õpperobotile Robotont] [Development of light solution software for the educational robot Robotont], BS thesis, 2024&lt;br /&gt;
*Robert Valge, [https://hdl.handle.net/not-yet Toitepinge ja tarbevoolu monitoorimine ning toitehalduse püsivara loomine õpperobotil Robotont] [Monitoring supply voltage and current consumption and creating firmware for Robotont’s power management system], BS thesis, 2024&lt;br /&gt;
*Leonid Tšigrinski, [https://hdl.handle.net/not-yet Õpperoboti Robotont püsivara arhitektuuri uuendamine] [Education robot &amp;quot;Robotont&amp;quot; firmware architecture updating], BS thesis, 2024&lt;br /&gt;
*Andres Sakk, [https://hdl.handle.net/not-yet Avatud robotplatvormi Robotont 3 kasutajaliidese väljatöötamine] [Development of a user interface for the open robotics platform Robotont 3], BS thesis, 2024&lt;br /&gt;
*Kaur Kullamäe, [https://hdl.handle.net/not-yet Elektroonikalahendus ja püsivara robotkäe juhtimiseks sotsiaalsel humanoidrobotil SemuBot] [Building the SemuBot arm electronics system], BS thesis, 2024&lt;br /&gt;
*Kristjan Madis Kask, [https://hdl.handle.net/not-yet Käe mehaanika disain sotsiaalsele humanoidrobotile SemuBot] [Arm mechanics design for social humanoid robot SemuBot], BS thesis, 2024&lt;br /&gt;
*Sven-Ervin Paap, [https://hdl.handle.net/not-yet ROS2 draiver õpperobotile Robotont] [ROS2 driver for the educational robot Robotont], BS thesis, 2024&lt;br /&gt;
*Timur Nizamov, [https://hdl.handle.net/10062/99592 Audio System for the Social Humanoid Robot SemuBot] [Helisüsteem sotsiaalsele humanoidrobotile SemuBot], BS thesis, 2024&lt;br /&gt;
*Georgs Narbuts, [https://hdl.handle.net/10062/99554 Holonomic Motion Drive System of a Social Humanoid Robot SemuBot] [Sotsiaalse humanoidroboti SemuBoti holonoomne liikuv ajamisüsteem], BS thesis, 2024&lt;br /&gt;
*Iryna Hurova, [https://hdl.handle.net/10062/90638 Kitting station of the learning factory] [Õppetehase komplekteerimisjaam], BS thesis, 2023&lt;br /&gt;
*Paola Avalos Conchas, [https://hdl.handle.net/10062/90641 Payload transportation system of a learning factory] [Õppetehase kasuliku koorma transpordisüsteem], BS thesis, 2023&lt;br /&gt;
*Pille Pärnalaas, [https://hdl.handle.net/10062/93431 Pöördpõik ajami arendus robotplatvormile Robotont] [Development of swerve drive for robotic platform Robotont], BS thesis, 2023&lt;br /&gt;
*Carl Hjalmar Love Hult, [https://hdl.handle.net/10062/90673 Using and Evaluating the Real-time Spatial Perception System Hydra in Real-world Scenarios] [Reaalajas toimiva ruumilise taju süsteemi Hydra kasutamine ja hindamine praktilistes stsenaariumides], BS thesis, 2023&lt;br /&gt;
*Priit Rooden, [https://hdl.handle.net/10062/93432 Autonoomse laadimislahenduse väljatöötamine õpperobotile Robotont] [Development of an autonomous charging solution for the robot platform Robotont], BS thesis, 2023&lt;br /&gt;
*Marko Muro, [https://hdl.handle.net/10062/93429 Robotondi akulahenduse ning 12 V pingeregulaatori prototüüpimine] [Prototyping battery solution and 12 V voltage regulator for Robotont], BS thesis, 2023&lt;br /&gt;
*Danel Leppenen, [https://hdl.handle.net/10062/93427 Nav2 PYIF: Python-based Motion Planning for ROS 2 Navigation 2] [Nav 2 PYIF: Pythoni põhine liikumise planeerija ROS 2 Navigation 2-le], BS thesis, 2023&lt;br /&gt;
*Kertrud Geddily Küüt, [https://hdl.handle.net/10062/93424 Kõrgust reguleeriv mehhanism Robotondile] [Height adjusting mechanism for Robotont], BS thesis, 2023&lt;br /&gt;
*Ingvar Drikkit, [https://hdl.handle.net/10062/93421 Lisaseadmete võimekuse arendamine haridusrobotile Robotont] [Developing add-on device support for the educational robot Robotont], BS thesis, 2023&lt;br /&gt;
*Kristo Pool, [https://hdl.handle.net/10062/93430 MoveIt 2 õppematerjalid] [Learning materials for MoveIt 2], BS thesis, 2023&lt;br /&gt;
*Erki Veeväli, [https://hdl.handle.net/10062/93434 Development of a Continuous Teleoperation System for Urban Road Vehicle] [Linnasõiduki pideva kaugjuhtimissüsteemi arendus], BS thesis, 2023&lt;br /&gt;
*Aleksandra Doroshenko, [https://hdl.handle.net/10062/93420 Haiglates inimesi juhatava roboti disain] [Hospital guide robot design], BS thesis, 2023&lt;br /&gt;
*Hui Shi, [http://hdl.handle.net/10062/83043 Expanding the Open-source ROS Software Package opencv_apps with Dedicated Blob Detection Functionality] [Avatud lähtekoodiga ROS-i tarkvarakimbu opencv_apps laiendamine laigutuvasti funktsionaalsusega], BS thesis, 2022&lt;br /&gt;
*Dāvis Krūmiņš, [http://hdl.handle.net/10062/83040 Web-based learning and software development environment for remote access of ROS robots] [Veebipõhine õppe- ja tarkvaraarenduse keskkond ROS robotite juurdepääsuks kaugteel], BS thesis, 2022&lt;br /&gt;
*Anna Jakovleva, [http://hdl.handle.net/10062/83037 Roboquiz - an interactive human-robot game] [Roboquiz - interaktiivne inimese ja roboti mäng], BS thesis, 2022&lt;br /&gt;
*Kristjan Laht, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=74702&amp;amp;year=2022 Robot Localization with Fiducial Markers] [Roboti lokaliseerimine koordinaatmärkidega], BS thesis, 2022&lt;br /&gt;
*Hans Pärtel Pani, [http://hdl.handle.net/10062/83008 ROS draiver pehmerobootika haaratsile] [ROS Driver for Soft Robotic Gripper], BS thesis, 2022&lt;br /&gt;
*Markus Erik Sügis, [http://hdl.handle.net/10062/83015 Jagatud juhtimise põhimõttel realiseeritud robotite kaugjuhtimissüsteem] [A continuous teleoperating system based on shared control concept], BS thesis, 2022&lt;br /&gt;
*Taaniel Küla, [http://hdl.handle.net/10062/83007 ROS2 platvormile keevitusroboti tarkvara portotüübi loomine kasutades UR5 robot-manipulaatorit] [Welding robot software prototype for ROS2 using UR5 robot arm], BS thesis, 2022&lt;br /&gt;
*Rauno Põlluäär, [http://hdl.handle.net/10062/72672 Veebirakendus-põhine kasutajaliides avatud robotplatvormi Robotont juhtimiseks ja haldamiseks] [Web application-based user interface for controlling and managing open-source robotics platform Robotont], BS thesis, 2021&lt;br /&gt;
*Hendrik Olesk, [http://hdl.handle.net/10062/72664 Nägemisulatuses kaugjuhitava mobiilse robotmanipulaatori kasutajamugavuse tõstmine] [Improving the usability of a mobile manipulator robot for line-of-sight remote control], BS thesis, 2021&lt;br /&gt;
*Tarvi Tepandi, [http://hdl.handle.net/10062/72665 Segareaalsusel põhinev kasutajaliides mobiilse roboti kaugjuhtimiseks Microsoft HoloLens 2 vahendusel] [Mixed-reality user interface for teleoperating mobile robots with Microsoft HoloLens 2], BS thesis, 2021&lt;br /&gt;
*Rudolf Põldma, [http://hdl.handle.net/10062/72674 Tartu linna Narva maantee ringristmiku digikaksik] [Digital twin for Narva street roundabout in Tartu], BS thesis, 2021&lt;br /&gt;
*Kwasi Akuamoah Boateng, [http://hdl.handle.net/10062/72804 Digital Twin of a Teaching and Learning Robotics Lab] [Robotite õpetamise ja õppimise labori digitaalne kaksik], BS thesis, 2021&lt;br /&gt;
*Karina Sein, [http://hdl.handle.net/10062/72102 Eestikeelse kõnesünteesi võimaldamine robootika arendusplatvormil ROS] [Enabling Estonian speech synthesis on the Robot Operating System (ROS)], BS thesis, 2020&lt;br /&gt;
*Ranno Mäesepp, [http://hdl.handle.net/10062/72100 Takistuste vältimise lahendus õpperobotile Robotont] [Obstacle avoidance solution for educational robot platform Robotont], BS thesis, 2020&lt;br /&gt;
*Igor Rybalskii, [http://hdl.handle.net/10062/72060 Gesture Detection Software for Human-Robot Collaboration] [Žestituvastus tarkvara inimese ja roboti koostööks], BS thesis, 2020&lt;br /&gt;
*Meelis Pihlap, [http://hdl.handle.net/10062/64292 Mitme roboti koostöö funktsionaalsuste väljatöötamine tarkvararaamistikule TeMoto] [Multi-robot collaboration functionalities for robot software development framework TeMoto], BS thesis, 2019&lt;br /&gt;
*Kaarel Mark, [http://hdl.handle.net/10062/64290 Liitreaalsuse kasutamine tootmisprotsessis asukohtade määramisel] [Augmented reality for location determination in manufacturing], BS thesis, 2019&lt;br /&gt;
*Kätriin Julle, [http://hdl.handle.net/10062/64279 Roboti KUKA youBot riistvara ja ROS-tarkvara uuendamine] [Upgrading robot KUKA youBot’s hardware and ROS-software], BS thesis, 2019&lt;br /&gt;
*Georg Astok, [http://hdl.handle.net/10062/64274 Roboti juhtimine virtuaalreaalsuses kasutades ROS-raamistikku] [Creating virtual reality user interface using only ROS framework], BS thesis, 2019&lt;br /&gt;
*Martin Hallist, [http://hdl.handle.net/10062/64275 Robotipõhine kaugkohalolu käte liigutuste ülekandmiseks] [Teleoperation robot for arms motions], BS thesis, 2019&lt;br /&gt;
*Ahmed Hassan Helmy Mohamed, [http://hdl.handle.net/10062/63946 Software integration of autonomous robot system for mixing and serving drinks] [Jooke valmistava ja serveeriva robotsüsteemi tarkvaralahendus], BS thesis, 2019&lt;br /&gt;
*Kristo Allaje, [http://hdl.handle.net/10062/60294 Draiveripakett käte jälgimise seadme Leap Motion™ kontroller kasutamiseks robootika arendusplatvormil ROS] [Driver package for using Leap Motion™ controller on the robotics development platform ROS], BS thesis, 2018&lt;br /&gt;
*Martin Maidla, [http://hdl.handle.net/10062/60288 Avatud robotiarendusplatvormi Robotont omniliikumise ja odomeetria arendamine] [Omnimotion and odometry development for open robot development platform Robotont], BS thesis, 2018&lt;br /&gt;
*Raid Vellerind, [http://hdl.handle.net/10062/56559 Avatud robotiarendusplatvormi ROS võimekuse loomine Tartu Ülikooli Robotexi robootikaplatvormile] [ROS driver development for the University of Tartu’s Robotex robotics platform], BS thesis, 2017&lt;br /&gt;
*Kalle-Gustav Kruus, [http://hdl.handle.net/10062/32440 Jalgpalliroboti löögimehhanismi elektroonikalahendus] [Driver circuit for a kicking mechanism of a football robot], BS thesis, 2013&lt;br /&gt;
*Sven Kautlenbach, [http://hdl.handle.net/10062/25570 Autonoomne seade elastsusmooduli mõõtmiseks] [Electronic device for Young’s modulus measurements], BS thesis, 2011&lt;br /&gt;
&lt;br /&gt;
[[Category:Theses Topics]]&lt;/div&gt;</summary>
		<author><name>Karl</name></author>
	</entry>
	<entry>
		<id>https://ims.ut.ee/index.php?title=Theses_in_Robotics&amp;diff=43104</id>
		<title>Theses in Robotics</title>
		<link rel="alternate" type="text/html" href="https://ims.ut.ee/index.php?title=Theses_in_Robotics&amp;diff=43104"/>
		<updated>2025-07-09T20:46:40Z</updated>

		<summary type="html">&lt;p&gt;Karl: /* Highlighted theses topics for 2024/2025 study year */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:IMS-robotics]]&lt;br /&gt;
&amp;lt;div class=&amp;quot;toclimit-3&amp;quot;&amp;gt;__TOC__&amp;lt;/div&amp;gt; &lt;br /&gt;
= Projects in Advanced Robotics =&lt;br /&gt;
''The main objective of the following projects is to give students experience in working with advanced robotics technology. Our group is active in several R&amp;amp;D projects involving human-robot collaboration, intuitive teleoperation of robots, and autonomous navigation of unmanned mobile platforms. Our main software platforms are [http://www.ros.org/ Robot Operating System (ROS)] for developing software for advanced robot systems and [http://gazebosim.org/ Gazebo] for running realistic robotic simulations.''&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
For further information, contact [[User:Karl|Karl Kruusamäe]]. &lt;br /&gt;
&lt;br /&gt;
The following is not an exhaustive list of all available thesis/research topics.&lt;br /&gt;
&lt;br /&gt;
== Highlighted thesis topics for 2025/2026 study year ==&lt;br /&gt;
&lt;br /&gt;
# [[#ROBOTONT: analysis of different options as on-board computers|ROBOTONT: analysis of different options as on-board computers]]&lt;br /&gt;
# [[#SemuBOT: multiple topics|SemuBOT: multiple topics]]&lt;br /&gt;
# [[#ROBOTONT: integrating a graphical programming interface|ROBOTONT: integrating a graphical programming interface]]&lt;br /&gt;
# [[#Sign-language-based control for robots|Sign-language-based control for robots]]&lt;br /&gt;
# [[#Robotic Study Companion: a social robot for students in higher education|Robotic Study Companion: a social robot for students in higher education]]&lt;br /&gt;
# [[#Quantitative Evaluation of the Situation Awareness during the Teleoperation of an Urban Vehicle|Quantitative Evaluation of the Situation Awareness during the Teleoperation of an Urban Vehicle]]&lt;br /&gt;
&lt;br /&gt;
== List of potential thesis topics ==&lt;br /&gt;
Our inventory includes but is not limited to:&lt;br /&gt;
&amp;lt;gallery mode=&amp;quot;packed&amp;quot;&amp;gt;&lt;br /&gt;
File:Youbot.png|120px|thumb|KUKA youBot&lt;br /&gt;
File:Ur5_left.png|120px|thumb|Universal Robot UR5&lt;br /&gt;
File:Franka_Emika_Panda.jpg|120px|thumb|Franka Emika Panda&lt;br /&gt;
File:Clearpath_Jackal.jpg|120px|thumb|Clearpath Jackal&lt;br /&gt;
File:Kinova_KG-3_Gripper.jpg|120px|thumb|Kinova 3-finger gripper&lt;br /&gt;
File:Xarm7.jpg|120px|thumb|UFACTORY xArm7&lt;br /&gt;
File:Robotont_banner.png|120px|thumb|robotont&lt;br /&gt;
File:Turtlebot3-waffle-pi.jpg|thumb|TurtleBot3&lt;br /&gt;
File:Parrot-bebop-2.jpg|thumb|Parrot Bebop 2&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== ROBOTONT: Docker-Driven ROS Environment Switching ===&lt;br /&gt;
This thesis focuses on integrating Docker containers to manage and switch between different ROS (Robot Operating System) environments on Robotont. Currently, ROS is installed natively on Robotont's Ubuntu-based onboard computer, limiting flexibility in system recovery and configuration switching. The goal is to develop a Docker-based solution that allows users to switch between ROS environments directly from Robotont’s existing low-level menu interface.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: designing and implementing a communication protocol for additional devices ===&lt;br /&gt;
Robotont currently includes a designated area on its front for attaching additional devices, such as an ultrasonic range finder, a servo motor connected to an Arduino, or a MikroBUS device. This thesis aims to design and implement a standard communication protocol that will enable seamless integration and control of these devices through the high-level ROS framework running on Robotont’s onboard computer.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: analysis of different options as on-board computers ===&lt;br /&gt;
Currently, ROBOTONT uses an Intel NUC as its onboard computer. The goal of this thesis is to validate Robotont's software stack on alternative compute devices (e.g. Raspi4, Intel Compute Stick, and NVIDIA Jetson Nano) and benchmark their performance for the most popular Robotont use-case demos (e.g. webapp teleop, ar-tag steering, 2D mapping, and 3D mapping). The objective is to propose at least two different compute solutions: one that optimizes for cost and another that optimizes for performance. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT Lite ===&lt;br /&gt;
The goal of this thesis is to optimize the ROBOTONT platform for cost by replacing the onboard computer and sensors with low-cost alternatives while ensuring ROS/ROS2 software compatibility. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: integrating a graphical programming interface ===&lt;br /&gt;
The goal of this thesis is to integrate a graphical programming solution (e.g. Scratch or Blockly) to enable programming of Robotont by non-experts. See, e.g., https://doi.org/10.48550/arXiv.2011.13706&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== SemuBOT: multiple topics ===&lt;br /&gt;
In 2023/2024, many topics are available to support the development of the open-source humanoid robot project SemuBOT (https://www.facebook.com/semubotmtu/)&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Robotic Study Companion: a social robot for students in higher education ===&lt;br /&gt;
Potential Topics:&lt;br /&gt;
* Enhance the Robot's Speech/Natural Language Capabilities &lt;br /&gt;
* Build a Local Language Model for the RSC&lt;br /&gt;
* Develop and Program the Robot’s Behavior and Personality&lt;br /&gt;
* Build a Digital Twin Simulation for Multimodal Interaction&lt;br /&gt;
* Explore the Use of the RSC as an Affective Robot to Address Students’ Academic Emotions&lt;br /&gt;
* Explore and Implement Cybersecurity Measures for a Social Robot&lt;br /&gt;
&lt;br /&gt;
[https://github.com/orgs/RobotStudyCompanion/discussions/3 More info on GitHub] | reach out to farnaz.baksh@ut.ee for further details&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Virtual reality user interface (VRUI) for intuitive teleoperation system ===&lt;br /&gt;
[[Image:LeapMotion.png|200px|thumb|right|Detecting 2 hands with Leap Motion Controller]]&lt;br /&gt;
Enhancing the user-experience of a virtual reality UI developed by [https://github.com/ut-ims-robotics/vrui_rviz Georg Astok]. Potentially adding [http://www.osvr.org/hardware-devs.html virtual reality capability] to a [https://www.youtube.com/watch?v=L25HHFd00rc gesture- and natural-language-based robot teleoperation system].&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
[[Image:Temoto-working-principle-in-pics.png|x160px|Gesture-based teleoperation]]&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Modeling humans for human-robot interaction ===&lt;br /&gt;
True human-robot collaboration means that the robot must understand the actions, intention, and state of its human partner. This work involves using cameras and other human-monitoring sensors to digitally represent and model humans. There are multiple stages of modeling: a) physical models of human kinematics and dynamics; b) higher-level models for recognizing human intent.&amp;lt;br&amp;gt;&lt;br /&gt;
[[Image:Skeletal ROS.PNG|x160px|ROS &amp;amp; Kinect &amp;amp; Skeleton-Markers Package]]&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Robotic avatar for telepresence ===&lt;br /&gt;
Integrating hand gestures and head movements to control a robot avatar in a virtual reality user interface.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Detection of hardware and software resources for smart integration of robots ===&lt;br /&gt;
The vast majority of today’s robotic applications rely on hard-coded device and algorithm usage. This project focuses on developing Resource Snooper software that can detect the addition or removal of resources for the benefit of dynamic reconfiguration of robotic systems. This project is developed as a subsystem of [https://utnuclearroboticspublic.github.io/temoto2/index.html TeMoto].&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Human-Robot and Robot-Robot collaboration applications ===&lt;br /&gt;
Creating a demo or analysis of software capabilities related to human-robot or robot-robot teams:&lt;br /&gt;
* human-robot collaborative assembly&lt;br /&gt;
* distributed mapping: analysis and demo of existing ROS packages (e.g., segmap https://youtu.be/JJhEkIA1xSE) for multi-robot mapping&lt;br /&gt;
* Inaccessible region teamwork&lt;br /&gt;
**youbot+drone - a drone maps the environment (for example, a maze) and a ground vehicle uses this information to traverse the maze&lt;br /&gt;
**youbot+robotont - the youBot cannot climb ledges, but it can lift a smaller robot, such as Robotont, up a ledge.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Mirroring human hand movements on industrial robots ===&lt;br /&gt;
The goal of this project is to integrate continuous control of an industrial robot manipulator with a gestural telerobotics interface. The recommended tools for this thesis project are a Leap Motion Controller or a standard web camera, Ultraleap, a Universal Robot UR5 manipulator, and ROS.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: TeMoto for robotont ===&lt;br /&gt;
Swarm-management and UMRF-based task loading for robotont using [https://github.com/temoto-framework TeMoto] framework.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Enhancing teleoperation control interface with augmented cues to provoke caution ===&lt;br /&gt;
The task is to create a telerobot control interface where the video feed from the remote site and/or a mixed-reality scene is augmented with visual cues to provoke caution in the human operator.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Robot-to-human interaction ===&lt;br /&gt;
As robots and autonomous machines start sharing the same space as humans, their actions need to be understood by the people occupying that space. For instance, a human worker needs to understand what the robot partner is planning next, or a pedestrian needs to clearly comprehend the behaviour of a driverless vehicle. To reduce this ambiguity, the robot needs mechanisms to convey its intent (what it is going to do next). The aim of the thesis is to outline existing methods for machines to convey their intent and to develop a unified model interface for expressing that intent. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Gaze-based handover prediction ===&lt;br /&gt;
When a human needs to pass an object to a robot manipulator, the robot must understand where in 3D space the object handover occurs and then plan an appropriate motion. Human gaze can be used as the input for predicting which object to track. The thesis activities involve camera-based eye tracking and safe motion planning.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: Human-height human-robot interface for Robotont ground robot ===&lt;br /&gt;
Robotont is an ankle-high flat mobile robot. For humans to interact with Robotont, there is a need for a compact and lightweight mechanical structure that is tall enough for comfortable human-robot interaction. The objective of the thesis is to develop the mechanical designs and build prototypes that ensure stable operation and meet the aesthetic requirements for use in public places. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Stratos Explore Ultraleap demonstrator for robotics ===&lt;br /&gt;
The aim of this thesis is to systematically analyze the strengths and limitations of the Stratos Explore Ultraleap device in the context of controlling a robot, and subsequently to implement a demonstrator application showing its applicability for robot control. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Mixed-reality scene creation for vehicle teleoperation ===&lt;br /&gt;
Fusing different sensory feeds to create a high-usability teleoperation scene.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Validation study for AR-based robot user-interfaces ===&lt;br /&gt;
Designing and carrying out a user study to validate the functionality and usability of a human-robot interface.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Quantitative Evaluation of the Situation Awareness during the Teleoperation of an Urban Vehicle ===&lt;br /&gt;
The goal of this thesis is to develop the methodology for measuring the operator's situation awareness while operating an autonomous urban vehicle. Potential metrics could include driving accuracy, speed, and latency. This thesis will be conducted as part of the activities at the [https://adl.cs.ut.ee Autonomous Driving Lab].&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Sign-language-based control for robots ===&lt;br /&gt;
The goal of this thesis is to design and implement the means for interacting with a robot via conventional sign language, i.e. the human can command a robot by using sign language.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== LLM-based task planning ===&lt;br /&gt;
The goal of this thesis is to enable high-level task planning for autonomous robots performing general-purpose tasks. The thesis would leverage an LLM's reasoning capabilities and the TeMoto Action Engine to achieve natural task negotiation, planning, and execution. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
= Completed projects =&lt;br /&gt;
== PhD theses ==&lt;br /&gt;
*Robert Valner, [https://hdl.handle.net/10062/105994 Design of TeMoto, a software framework for dependable, adaptive, and collaborative autonomous robots] [TeMoto – töökindlate, adaptiivsete ja koostöövõimeliste autonoomsete robotite arendamise tarkvararaamistik], PhD thesis, 2024&lt;br /&gt;
*Houman Masnavi, [https://hdl.handle.net/10062/91394 Visibility aware navigation] [Nähtavust arvestav navigatsioon], PhD thesis, 2023&lt;br /&gt;
&lt;br /&gt;
== Master's theses ==&lt;br /&gt;
*Robert Allik, Validation of NoMaD as a Global Planner for Mobile Robots, MS thesis, 2024&lt;br /&gt;
*Rauno Põlluäär, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=79393 Designing and Implementing a Bird’s-eye View Interface for a Self-driving Vehicle’s Teleoperation System] [Isejuhtiva sõiduki kaugjuhtimissüsteemile linnuvaate kasutajaliidese loomine], MS thesis, 2024&lt;br /&gt;
*Gautier Reynes, VR-Enhanced Remote Inspection Framework for Semi-Autonomous Robot Fleet, MS thesis, 2024&lt;br /&gt;
*Eva Mõtshärg, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=77385 3D-prinditava kere disain ja analüüs vabavaralisele haridusrobotile Robotont] [Design and Analysis of a 3D Printable Chassis for the Open Source Educational Robot Robotont], MS thesis, 2023&lt;br /&gt;
*Farnaz Baksh, [https://dspace.ut.ee/items/a7a9cc15-27e9-450c-94e8-3267f0c95c56 An Open-source Robotic Study Companion for University Students] [Avatud lähtekoodiga robotõpikaaslane üliõpilastele], MS thesis, 2023&lt;br /&gt;
*Igor Rybalskii, [http://hdl.handle.net/10062/83028 Augmented reality (AR) for enabling human-robot collaboration with ROS robots] [Liitreaalsus inimese ja roboti koostöö võimaldamiseks ROS-i robotitega], MS thesis, 2022&lt;br /&gt;
*Md. Maniruzzaman, [http://hdl.handle.net/10062/83025 Object search and retrieval in indoor environment using a Mobile Manipulator] [Objektide otsimine ja teisaldamine siseruumides mobiilse manipulaatori abil], MS thesis, 2022&lt;br /&gt;
*Allan Kustavus, [http://hdl.handle.net/10062/72651 Design and Implementation of a Generalized Resource Management Architecture in the TeMoto Software Framework] [Üldise ressursihalduri disain ja teostus TeMoto tarkvara raamistikule], MS thesis, 2021&lt;br /&gt;
*Kristina Meister, [http://hdl.handle.net/10062/72350 External human-vehicle interaction - a study in the context of an autonomous ride-hailing service], MS thesis, 2021&lt;br /&gt;
*Muhammad Usman, [http://hdl.handle.net/10062/72126 Development of an Optimization-Based Motion Planner and Its ROS Interface for a Non-Holonomic Mobile Manipulator] [Optimeerimisele baseeruva liikumisplaneerija arendamine ja selle ROSi liides mitteholonoomse mobiilse manipulaatori jaoks], MS thesis, 2020&lt;br /&gt;
*Maarika Oidekivi, [http://hdl.handle.net/10062/72119 Masina kavatsuse väljendamine ja tõlgendamine] [Communicating and interpreting machine intent], MS thesis, 2020&lt;br /&gt;
*Houman Masnavi, [http://hdl.handle.net/10062/72118 Multi-Robot Motion Planning for Shared Payload Transportation] [Rajaplaneerimine multi-robot süsteemile jagatud lasti transportimisel], MS thesis, 2020&lt;br /&gt;
*Fabian Ernesto Parra Gil, [http://hdl.handle.net/10062/72112 Implementation of Robot Manager Subsystem for Temoto Software Framework] [Robotite Halduri alamsüsteemi väljatöötamine tarkvararaamistikule TEMOTO], MS thesis, 2020&lt;br /&gt;
*Zafarullah, [http://hdl.handle.net/10062/72125 Gaze Assisted Neural Network based Prediction of End-Point of Human Reaching Trajectories], MS thesis, 2020&lt;br /&gt;
*Madis K Nigol, [http://hdl.handle.net/10062/64339 Õppematerjalid robotplatvormile Robotont] [Study materials for robot platform Robotont], MS thesis, 2019&lt;br /&gt;
*Renno Raudmäe, [http://hdl.handle.net/10062/64341 Avatud robotplatvorm Robotont] [Open source robotics platform Robotont], MS thesis, 2019&lt;br /&gt;
*Asif Sattar, [http://hdl.handle.net/10062/64352 Human detection and distance estimation with monocular camera using YOLOv3 neural network] [Inimeste tuvastamine ning kauguse hindamine kasutades kaamerat ning YOLOv3 tehisnärvivõrku], MS thesis, 2019&lt;br /&gt;
*Ragnar Margus, [http://hdl.handle.net/10062/64337 Kergliiklusvahendite jagamisteenuseks vajaliku positsioneerimismooduli loomine ja uurimine] [Development and testing of an IoT module for an electric vehicle sharing service], MS thesis, 2019&lt;br /&gt;
*Pavel Šumejko, [http://hdl.handle.net/10062/64320 Robust Solution for Extrinsic Calibration of a 2D Laser-Rangefinder and a Monocular USB Camera] [Meetod 2D laserkaugusmõõdiku ja USB-kaamera väliseks kalibreerimiseks], MS thesis, 2019&lt;br /&gt;
*Dzvezdana Arsovska, [http://hdl.handle.net/10062/64321 Building an Efficient and Secure Software Supply Pipeline for Aerial Robotics Application] [Efektiivne ja turvaline tarkvaraarendusahel lennurobootika rakenduses], MS thesis, 2019&lt;br /&gt;
*Tõnis Tiimus, [http://hdl.handle.net/10062/60713 Visuaalselt esilekutsutud potentsiaalidel põhinev aju-arvuti liides robootika rakendustes] [A VEP-based BCI for robotics applications], MS thesis, 2018&lt;br /&gt;
*Martin Appo, [http://hdl.handle.net/10062/60718 Hardware-agnostic compliant control ROS package for collaborative industrial manipulators] [Riistvarapaindlik ROSi tarkvarapakett tööstuslike robotite mööndlikuks juhtimiseks], MS thesis, 2018&lt;br /&gt;
*Hassan Mahmoud Shehawy Elhanash, [http://hdl.handle.net/10062/60711 Optical Tracking of Forearm for Classifying Fingers Poses] [Küünarvarre visuaalne jälgimine ennustamaks sõrmede asendeid], MS thesis, 2018&lt;br /&gt;
&lt;br /&gt;
== Bachelor's theses ==&lt;br /&gt;
*Albert Unn, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=79937 Suhtlusvõimekuse arendamine sotsiaalsele humanoidrobotile SemuBot] [Developing the ability to communicate for SemuBot, a social humanoid robot], BS thesis, 2024&lt;br /&gt;
*Veronika-Marina Volynets, [https://hdl.handle.net/10062/99943 Development of Control Electronics and Program for Robotont's Height Adjustment Mechanism] [Juhtelektroonika ja programmi väljatöötamine Robotondi kõrguse reguleerimise mehhanismile], BS thesis, 2024&lt;br /&gt;
*Nikita Kurenkov, [https://hdl.handle.net/10062/99880 Flexible Screen Integration and Development of Neck Movement Mechanism for Social Humanoid Robot SemuBot] [Humanoidroboti näo lahenduse leidmine ja rakendamine; humanoidroboti jaoks kaela mehhanismi väljatöötamine], BS thesis, 2024&lt;br /&gt;
*Elchin Huseynov, [https://hdl.handle.net/10062/99668 Design and Control of a Social Humanoid Robot - SemuBot’s Hand] [Sotsiaalse Humanoidroboti Disain ja Juhtimine - SemuBoti Käsi], BS thesis, 2024&lt;br /&gt;
*Veronika Podliesnova, [https://hdl.handle.net/10062/99636 Real-Time Detection of Robot Failures by Monitoring Operator’s Brain Activity with EEG-based Brain-Computer Interface] [Reaalajas Robotirikke Tuvastamine Operaatori Ajutegevuse Jälgimise teel EEG-põhise Aju-Arvuti Liidese abil], BS thesis, 2024&lt;br /&gt;
*Märten Josh Peedimaa, [https://hdl.handle.net/not-yet Rhea: An Open-Source Table Tennis Ball Launcher Robot for Multiball Training] [Avatud lähtekoodiga lauatennise palliviske robot mitmikpall treenimiseks], BS thesis, 2024&lt;br /&gt;
*Karl Sander Vinkel, [https://hdl.handle.net/not-yet Laadimisjaama ja transportkesta väljatöötamine õpperobotile Robotont] [Development of charging dock and transport case for Robotont], BS thesis, 2024&lt;br /&gt;
*Raimo Köidam, [https://hdl.handle.net/not-yet Valguslahenduse tarkvara väljatöötamine õpperobotile Robotont] [Development of light solution software for the educational robot Robotont], BS thesis, 2024&lt;br /&gt;
*Robert Valge, [https://hdl.handle.net/not-yet Toitepinge ja tarbevoolu monitoorimine ning toitehalduse püsivara loomine õpperobotil Robotont] [Monitoring supply voltage and current consumption and creating firmware for Robotont’s power management system], BS thesis, 2024&lt;br /&gt;
*Leonid Tšigrinski, [https://hdl.handle.net/not-yet Õpperoboti Robotont püsivara arhitektuuri uuendamine] [Updating the firmware architecture of the educational robot Robotont], BS thesis, 2024&lt;br /&gt;
*Andres Sakk, [https://hdl.handle.net/not-yet Avatud robotplatvormi Robotont 3 kasutajaliidese väljatöötamine] [Development of a user interface for the open robotics platform Robotont 3], BS thesis, 2024&lt;br /&gt;
*Kaur Kullamäe, [https://hdl.handle.net/not-yet Elektroonikalahendus ja püsivara robotkäe juhtimiseks sotsiaalsel humanoidrobotil SemuBot] [Building the SemuBot arm electronics system], BS thesis, 2024&lt;br /&gt;
*Kristjan Madis Kask, [https://hdl.handle.net/not-yet Käe mehaanika disain sotsiaalsele humanoidrobotile SemuBot] [Arm mechanics design for social humanoid robot SemuBot], BS thesis, 2024&lt;br /&gt;
*Sven-Ervin Paap, [https://hdl.handle.net/not-yet ROS2 draiver õpperobotile Robotont] [ROS2 driver for the educational robot Robotont], BS thesis, 2024&lt;br /&gt;
*Timur Nizamov, [https://hdl.handle.net/10062/99592 Audio System for the Social Humanoid Robot SemuBot] [Helisüsteem sotsiaalsele humanoidrobotile SemuBot], BS thesis, 2024&lt;br /&gt;
*Georgs Narbuts, [https://hdl.handle.net/10062/99554 Holonomic Motion Drive System of a Social Humanoid Robot SemuBot] [Sotsiaalse humanoidroboti SemuBoti holonoomne liikuv ajamisüsteem], BS thesis, 2024&lt;br /&gt;
*Iryna Hurova, [https://hdl.handle.net/10062/90638 Kitting station of the learning factory] [Õppetehase komplekteerimisjaam], BS thesis, 2023&lt;br /&gt;
*Paola Avalos Conchas, [https://hdl.handle.net/10062/90641 Payload transportation system of a learning factory] [Õppetehase kasuliku koorma transpordisüsteem], BS thesis, 2023&lt;br /&gt;
*Pille Pärnalaas, [https://hdl.handle.net/10062/93431 Pöördpõik ajami arendus robotplatvormile Robotont] [Development of swerve drive for robotic platform Robotont], BS thesis, 2023&lt;br /&gt;
*Carl Hjalmar Love Hult, [https://hdl.handle.net/10062/90673 Using and Evaluating the Real-time Spatial Perception System Hydra in Real-world Scenarios] [Reaalajas toimiva ruumilise taju süsteemi Hydra kasutamine ja hindamine praktilistes stsenaariumides], BS thesis, 2023&lt;br /&gt;
*Priit Rooden, [https://hdl.handle.net/10062/93432 Autonoomse laadimislahenduse väljatöötamine õpperobotile Robotont] [Development of an autonomous charging solution for the robot platform Robotont], BS thesis, 2023&lt;br /&gt;
*Marko Muro, [https://hdl.handle.net/10062/93429 Robotondi akulahenduse ning 12 V pingeregulaatori prototüüpimine] [Prototyping battery solution and 12 V voltage regulator for Robotont], BS thesis, 2023&lt;br /&gt;
*Danel Leppenen, [https://hdl.handle.net/10062/93427 Nav2 PYIF: Python-based Motion Planning for ROS 2 Navigation 2] [Nav 2 PYIF: Pythoni põhine liikumise planeerija ROS 2 Navigation 2-le], BS thesis, 2023&lt;br /&gt;
*Kertrud Geddily Küüt, [https://hdl.handle.net/10062/93424 Kõrgust reguleeriv mehhanism Robotondile] [Height adjusting mechanism for Robotont], BS thesis, 2023&lt;br /&gt;
*Ingvar Drikkit, [https://hdl.handle.net/10062/93421 Lisaseadmete võimekuse arendamine haridusrobotile Robotont] [Developing add-on device support for the educational robot Robotont], BS thesis, 2023&lt;br /&gt;
*Kristo Pool, [https://hdl.handle.net/10062/93430 MoveIt 2 õppematerjalid] [Learning materials for MoveIt 2], BS thesis, 2023&lt;br /&gt;
*Erki Veeväli, [https://hdl.handle.net/10062/93434 Development of a Continuous Teleoperation System for Urban Road Vehicle] [Linnasõiduki pideva kaugjuhtimissüsteemi arendus], BS thesis, 2023&lt;br /&gt;
*Aleksandra Doroshenko, [https://hdl.handle.net/10062/93420 Haiglates inimesi juhatava roboti disain] [Hospital guide robot design], BS thesis, 2023&lt;br /&gt;
*Hui Shi, [http://hdl.handle.net/10062/83043 Expanding the Open-source ROS Software Package opencv_apps with Dedicated Blob Detection Functionality] [Avatud lähtekoodiga ROS-i tarkvarakimbu opencv_apps laiendamine laigutuvasti funktsioonaalsusega], BS thesis, 2022&lt;br /&gt;
*Dāvis Krūmiņš, [http://hdl.handle.net/10062/83040 Web-based learning and software development environment for remote access of ROS robots] [Veebipõhine õppe- ja tarkvaraarenduse keskkond ROS robotite juurdepääsuks kaugteel], BS thesis, 2022&lt;br /&gt;
*Anna Jakovleva, [http://hdl.handle.net/10062/83037 Roboquiz - an interactive human-robot game] [Roboquiz - interaktiivne inimese ja roboti mäng], BS thesis, 2022&lt;br /&gt;
*Kristjan Laht, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=74702&amp;amp;year=2022 Robot Localization with Fiducial Markers] [Roboti lokaliseerimine koordinaatmärkidega], BS thesis, 2022&lt;br /&gt;
*Hans Pärtel Pani, [http://hdl.handle.net/10062/83008 ROS draiver pehmerobootika haaratsile] [ROS Driver for Soft Robotic Gripper], BS thesis, 2022&lt;br /&gt;
*Markus Erik Sügis, [http://hdl.handle.net/10062/83015 Jagatud juhtimise põhimõttel realiseeritud robotite kaugjuhtimissüsteem] [A continuous teleoperating system based on shared control concept], BS thesis, 2022&lt;br /&gt;
*Taaniel Küla, [http://hdl.handle.net/10062/83007 ROS2 platvormile keevitusroboti tarkvara portotüübi loomine kasutades UR5 robot-manipulaatorit] [Welding robot software prototype for ROS2 using UR5 robot arm], BS thesis, 2022&lt;br /&gt;
*Rauno Põlluäär, [http://hdl.handle.net/10062/72672 Veebirakendus-põhine kasutajaliides avatud robotplatvormi Robotont juhtimiseks ja haldamiseks] [Web application-based user interface for controlling and managing open-source robotics platform Robotont], BS thesis, 2021&lt;br /&gt;
*Hendrik Olesk, [http://hdl.handle.net/10062/72664 Nägemisulatuses kaugjuhitava mobiilse robotmanipulaatori kasutajamugavuse tõstmine] [Improving the usability of a mobile manipulator robot for line-of-sight remote control], BS thesis, 2021&lt;br /&gt;
*Tarvi Tepandi, [http://hdl.handle.net/10062/72665 Segareaalsusel põhinev kasutajaliides mobiilse roboti kaugjuhtimiseks Microsoft HoloLens 2 vahendusel] [Mixed-reality user interface for teleoperating mobile robots with Microsoft HoloLens 2], BS thesis, 2021&lt;br /&gt;
*Rudolf Põldma, [http://hdl.handle.net/10062/72674 Tartu linna Narva maantee ringristmiku digikaksik] [Digital twin for Narva street roundabout in Tartu], BS thesis, 2021&lt;br /&gt;
*Kwasi Akuamoah Boateng, [http://hdl.handle.net/10062/72804 Digital Twin of a Teaching and Learning Robotics Lab] [Robotite õpetamise ja õppimise labori digitaalne kaksik], BS thesis, 2021&lt;br /&gt;
*Karina Sein, [http://hdl.handle.net/10062/72102 Eestikeelse kõnesünteesi võimaldamine robootika arendusplatvormil ROS] [Enabling Estonian speech synthesis on the Robot Operating System (ROS)], BS thesis, 2020&lt;br /&gt;
*Ranno Mäesepp, [http://hdl.handle.net/10062/72100 Takistuste vältimise lahendus õpperobotile Robotont] [Obstacle avoidance solution for educational robot platform Robotont], BS thesis, 2020&lt;br /&gt;
*Igor Rybalskii, [http://hdl.handle.net/10062/72060 Gesture Detection Software for Human-Robot Collaboration] [Žestituvastus tarkvara inimese ja roboti koostööks], BS thesis, 2020&lt;br /&gt;
*Meelis Pihlap, [http://hdl.handle.net/10062/64292 Mitme roboti koostöö funktsionaalsuste väljatöötamine tarkvararaamistikule TeMoto] [Multi-robot collaboration functionalities for robot software development framework TeMoto], BS thesis, 2019&lt;br /&gt;
*Kaarel Mark, [http://hdl.handle.net/10062/64290 Liitreaalsuse kasutamine tootmisprotsessis asukohtade määramisel] [Augmented reality for location determination in manufacturing], BS thesis, 2019&lt;br /&gt;
*Kätriin Julle, [http://hdl.handle.net/10062/64279 Roboti KUKA youBot riistvara ja ROS-tarkvara uuendamine] [Upgrading robot KUKA youBot’s hardware and ROS-software], BS thesis, 2019&lt;br /&gt;
*Georg Astok, [http://hdl.handle.net/10062/64274 Roboti juhtimine virtuaalreaalsuses kasutades ROS-raamistikku] [Creating virtual reality user interface using only ROS framework], BS thesis, 2019&lt;br /&gt;
*Martin Hallist, [http://hdl.handle.net/10062/64275 Robotipõhine kaugkohalolu käte liigutuste ülekandmiseks] [Teleoperation robot for arms motions], BS thesis, 2019&lt;br /&gt;
*Ahmed Hassan Helmy Mohamed, [http://hdl.handle.net/10062/63946 Software integration of autonomous robot system for mixing and serving drinks] [Jooke valmistava ja serveeriva robotsüsteemi tarkvaralahendus], BS thesis, 2019&lt;br /&gt;
*Kristo Allaje, [http://hdl.handle.net/10062/60294 Draiveripakett käte jälgimise seadme Leap Motion™ kontroller kasutamiseks robootika arendusplatvormil ROS] [Driver package for using Leap Motion™ controller on the robotics development platform ROS], BS thesis, 2018&lt;br /&gt;
*Martin Maidla, [http://hdl.handle.net/10062/60288 Avatud robotiarendusplatvormi Robotont omniliikumise ja odomeetria arendamine] [Omnimotion and odometry development for open robot development platform Robotont], BS thesis, 2018&lt;br /&gt;
*Raid Vellerind, [http://hdl.handle.net/10062/56559 Avatud robotiarendusplatvormi ROS võimekuse loomine Tartu Ülikooli Robotexi robootikaplatvormile] [ROS driver development for the University of Tartu’s Robotex robotics platform], BS thesis, 2017&lt;br /&gt;
*Kalle-Gustav Kruus, [http://hdl.handle.net/10062/32440 Jalgpalliroboti löögimehhanismi elektroonikalahendus] [Driver circuit for a kicking mechanism of a football robot], BS thesis, 2013&lt;br /&gt;
*Sven Kautlenbach, [http://hdl.handle.net/10062/25570 Autonoomne seade elastsusmooduli mõõtmiseks] [Electronic device for Young’s modulus measurements], BS thesis, 2011&lt;br /&gt;
&lt;br /&gt;
[[Category:Theses Topics]]&lt;/div&gt;</summary>
		<author><name>Karl</name></author>
	</entry>
	<entry>
		<id>https://ims.ut.ee/index.php?title=Theses_in_Robotics&amp;diff=43103</id>
		<title>Theses in Robotics</title>
		<link rel="alternate" type="text/html" href="https://ims.ut.ee/index.php?title=Theses_in_Robotics&amp;diff=43103"/>
		<updated>2025-07-09T20:45:19Z</updated>

		<summary type="html">&lt;p&gt;Karl: /* List of potential thesis topics */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:IMS-robotics]]&lt;br /&gt;
&amp;lt;div class=&amp;quot;toclimit-3&amp;quot;&amp;gt;__TOC__&amp;lt;/div&amp;gt; &lt;br /&gt;
= Projects in Advanced Robotics =&lt;br /&gt;
''The main objective of the following projects is to give students experience in working with advanced robotics technology. Our group is active in several R&amp;amp;D projects involving human-robot collaboration, intuitive teleoperation of robots, and autonomous navigation of unmanned mobile platforms. Our main software platforms are [http://www.ros.org/ Robot Operating System (ROS)] for developing software for advanced robot systems and [http://gazebosim.org/ Gazebo] for running realistic robotic simulations.''&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
For further information, contact [[User:Karl|Karl Kruusamäe]]. &lt;br /&gt;
&lt;br /&gt;
The following is not an exhaustive list of all available thesis/research topics.&lt;br /&gt;
&lt;br /&gt;
== Highlighted thesis topics for 2024/2025 study year ==&lt;br /&gt;
# [[#ROBOTONT: analysis of different options as on-board computers|ROBOTONT: analysis of different options as on-board computers]]&lt;br /&gt;
# [[#SemuBOT: multiple topics|SemuBOT: multiple topics]]&lt;br /&gt;
# [[#ROBOTONT: integrating a graphical programming interface|ROBOTONT: integrating a graphical programming interface]]&lt;br /&gt;
# [[#Sign-language-based control for robots|Sign-language-based control for robots]]&lt;br /&gt;
# [[#Robotic Study Companion: a social robot for students in higher education|Robotic Study Companion: a social robot for students in higher education]]&lt;br /&gt;
# [[#Quantitative Evaluation of the Situation Awareness during the Teleoperation of an Urban Vehicle|Quantitative Evaluation of the Situation Awareness during the Teleoperation of an Urban Vehicle]]&lt;br /&gt;
# [[#Mixed-reality scene creation for vehicle teleoperation|Mixed-reality scene creation for vehicle teleoperation]]&lt;br /&gt;
&lt;br /&gt;
== List of potential thesis topics ==&lt;br /&gt;
Our inventory includes but is not limited to:&lt;br /&gt;
&amp;lt;gallery mode=&amp;quot;packed&amp;quot;&amp;gt;&lt;br /&gt;
File:Youbot.png|120px|thumb|KUKA youBot&lt;br /&gt;
File:Ur5_left.png|120px|thumb|Universal Robot UR5&lt;br /&gt;
File:Franka_Emika_Panda.jpg|120px|thumb|Franka Emika Panda&lt;br /&gt;
File:Clearpath_Jackal.jpg|120px|thumb|Clearpath Jackal&lt;br /&gt;
File:Kinova_KG-3_Gripper.jpg|120px|thumb|Kinova 3-finger gripper&lt;br /&gt;
File:Xarm7.jpg|120px|thumb|UFACTORY xArm7&lt;br /&gt;
File:Robotont_banner.png|120px|thumb|robotont&lt;br /&gt;
File:Turtlebot3-waffle-pi.jpg|thumb|TurtleBot3&lt;br /&gt;
File:Parrot-bebop-2.jpg|thumb|Parrot Bebop 2&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== ROBOTONT: Docker-Driven ROS Environment Switching ===&lt;br /&gt;
This thesis focuses on integrating Docker containers to manage and switch between different ROS (Robot Operating System) environments on Robotont. Currently, ROS is installed natively on Robotont's Ubuntu-based onboard computer, limiting flexibility in system recovery and configuration switching. The goal is to develop a Docker-based solution that allows users to switch between ROS environments directly from Robotont’s existing low-level menu interface.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: designing and implementing a communication protocol for additional devices ===&lt;br /&gt;
Robotont currently includes a designated area on its front for attaching additional devices, such as an ultrasonic range finder, a servo motor connected to an Arduino, or a MikroBUS device. This thesis aims to design and implement a standard communication protocol that will enable seamless integration and control of these devices through the high-level ROS framework running on Robotont’s onboard computer.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: analysis of different options as on-board computers ===&lt;br /&gt;
Currently, ROBOTONT uses an Intel NUC as its onboard computer. The goal of this thesis is to validate Robotont's software stack on alternative compute devices (e.g. Raspi4, Intel Compute Stick, and NVIDIA Jetson Nano) and benchmark their performance on the most popular Robotont use-case demos (e.g. webapp teleop, ar-tag steering, 2D mapping, and 3D mapping). The objective is to propose at least two different compute solutions: one that optimizes for cost and another that optimizes for performance. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT Lite ===&lt;br /&gt;
The goal of this thesis is to optimize the ROBOTONT platform for cost by replacing the onboard computer and sensors with low-cost alternatives while ensuring ROS/ROS2 software compatibility. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: integrating a graphical programming interface ===&lt;br /&gt;
The goal of this thesis is to integrate a graphical programming solution (e.g. Scratch or Blockly) to enable programming of Robotont by non-experts. See, e.g., https://doi.org/10.48550/arXiv.2011.13706&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== SemuBOT: multiple topics ===&lt;br /&gt;
In the 2023/2024 study year, many topics are offered to support the development of the open-source humanoid robot project SemuBOT (https://www.facebook.com/semubotmtu/).&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Robotic Study Companion: a social robot for students in higher education ===&lt;br /&gt;
Potential Topics:&lt;br /&gt;
* Enhance the Robot's Speech/Natural Language Capabilities &lt;br /&gt;
* Build a Local Language Model for the RSC&lt;br /&gt;
* Develop and Program the Robot’s Behavior and Personality&lt;br /&gt;
* Build a Digital Twin Simulation for Multimodal Interaction&lt;br /&gt;
* Explore the Use of the RSC as an Affective Robot to Address Students’ Academic Emotions&lt;br /&gt;
* Explore and Implement Cybersecurity Measures for a Social Robot&lt;br /&gt;
&lt;br /&gt;
[https://github.com/orgs/RobotStudyCompanion/discussions/3 More info on Github] | reach out to farnaz.baksh@ut.ee for more info&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Virtual reality user interface (VRUI) for intuitive teleoperation system ===&lt;br /&gt;
[[Image:LeapMotion.png|200px|thumb|right|Detecting 2 hands with Leap Motion Controller]]&lt;br /&gt;
Enhancing the user-experience of a virtual reality UI developed by [https://github.com/ut-ims-robotics/vrui_rviz Georg Astok]. Potentially adding [http://www.osvr.org/hardware-devs.html virtual reality capability] to a [https://www.youtube.com/watch?v=L25HHFd00rc gesture- and natural-language-based robot teleoperation system].&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
[[Image:Temoto-working-principle-in-pics.png|x160px|Gesture-based teleoperation]]&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Modeling humans for human-robot interaction ===&lt;br /&gt;
True human-robot collaboration means that the robot must understand the actions, intention, and state of its human partner. This work involves using cameras and other human-tracking sensors to digitally represent and model humans. There are multiple stages of modeling: a) physical models of human kinematics and dynamics; b) higher-level models for recognizing human intent.&amp;lt;br&amp;gt;&lt;br /&gt;
[[Image:Skeletal ROS.PNG|x160px|ROS &amp;amp; Kinect &amp;amp; Skeleton-Markers Package]]&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Robotic avatar for telepresence ===&lt;br /&gt;
Integrating hand gestures and head movements to control a robot avatar in a virtual reality user interface.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Detection of hardware and software resources for smart integration of robots ===&lt;br /&gt;
The vast majority of today’s robotic applications rely on hard-coded device and algorithm usage. This project focuses on developing Resource Snooper software that can detect the addition or removal of resources, enabling dynamic reconfiguration of robotic systems. This project is developed as a subsystem of [https://utnuclearroboticspublic.github.io/temoto2/index.html TeMoto].&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Human-Robot and Robot-Robot collaboration applications ===&lt;br /&gt;
Creating a demo or analysis of software capabilities related to human-robot or robot-robot teams:&lt;br /&gt;
* human-robot collaborative assembly&lt;br /&gt;
* distributed mapping; analysis and demo of existing ROS (e.g., segmap https://youtu.be/JJhEkIA1xSE) packages for multi-robot mapping&lt;br /&gt;
* Inaccessible region teamwork&lt;br /&gt;
**youbot+drone - a drone maps the environment (for example, a maze) and a ground vehicle uses this information to traverse the maze&lt;br /&gt;
**youbot+robotont - youbot cannot climb ledges, but it can lift a smaller robot, such as robotont, up a ledge.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Mirroring human hand movements on industrial robots ===&lt;br /&gt;
The goal of this project is to integrate continuous control of an industrial robot manipulator with a gestural telerobotics interface. The recommended tools for this thesis project are a Leap Motion Controller or a standard web camera, Ultraleap, a Universal Robot UR5 manipulator, and ROS.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: TeMoto for robotont ===&lt;br /&gt;
Swarm-management and UMRF-based task loading for robotont using [https://github.com/temoto-framework TeMoto] framework.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Enhancing teleoperation control interface with augmented cues to provoke caution===&lt;br /&gt;
The task is to create a telerobot control interface where the video feed from the remote site and/or a mixed-reality scene is augmented with visual cues to provoke caution in the human operator.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Robot-to-human interaction ===&lt;br /&gt;
As robots and autonomous machines start sharing the same space as humans, their actions need to be understood by the people occupying that space. For instance, a human worker needs to understand what the robot partner is planning next, or a pedestrian needs to clearly comprehend the behaviour of a driverless vehicle. To reduce this ambiguity, the robot needs mechanisms to convey its intent (whatever it is going to do next). The aim of the thesis is to outline existing methods for machines to convey their intent and to develop a unified model interface for expressing that intent. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Gaze-based handover prediction ===&lt;br /&gt;
When a human needs to pass an object to a robot manipulator, the robot must understand where in 3D space the object handover occurs and then plan an appropriate motion. Human gaze can be used as the input for predicting which object to track. The thesis activities involve camera-based eye tracking and safe motion planning.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: Human-height human-robot interface for Robotont ground robot ===&lt;br /&gt;
Robotont is an ankle-high flat mobile robot. For humans to interact with Robotont, there is a need for a compact and lightweight mechanical structure that is tall enough for comfortable human-robot interaction. The objective of the thesis is to develop the mechanical designs and build prototypes that ensure stable operation and meet the aesthetic requirements for use in public places. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Stratos Explore Ultraleap demonstrator for robotics ===&lt;br /&gt;
The aim of this thesis is to systematically analyze the strengths and limitations of the Stratos Explore Ultraleap device in the context of controlling a robot, and subsequently to implement a demonstrator application showing its applicability for robot control. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Mixed-reality scene creation for vehicle teleoperation ===&lt;br /&gt;
Fusing different sensory feeds to create a high-usability teleoperation scene.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Validation study for AR-based robot user-interfaces ===&lt;br /&gt;
Designing and carrying out a user study to validate the functionality and usability of a human-robot interface.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Quantitative Evaluation of the Situation Awareness during the Teleoperation of an Urban Vehicle ===&lt;br /&gt;
The goal of this thesis is to develop the methodology for measuring the operator's situation awareness while operating an autonomous urban vehicle. Potential metrics could include driving accuracy, speed, and latency. This thesis will be conducted as part of the activities at the [https://adl.cs.ut.ee Autonomous Driving Lab].&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Sign-language-based control for robots ===&lt;br /&gt;
The goal of this thesis is to design and implement the means for interacting with a robot via conventional sign language, i.e. the human can command a robot by using sign language.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== LLM-based task planning ===&lt;br /&gt;
The goal of this thesis is to enable high-level task planning for autonomous robots performing general-purpose tasks. The thesis would leverage an LLM's reasoning capabilities and the TeMoto Action Engine to achieve natural task negotiation, planning, and execution. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
= Completed projects =&lt;br /&gt;
== PhD theses ==&lt;br /&gt;
*Robert Valner, [https://hdl.handle.net/10062/105994 Design of TeMoto, a software framework for dependable, adaptive, and collaborative autonomous robots] [TeMoto – töökindlate, adaptiivsete ja koostöövõimeliste autonoomsete robotite arendamise tarkvararaamistik], PhD thesis, 2024&lt;br /&gt;
*Houman Masnavi, [https://hdl.handle.net/10062/91394 Visibility aware navigation] [Nähtavust arvestav navigatsioon], PhD thesis, 2023&lt;br /&gt;
&lt;br /&gt;
== Master's theses ==&lt;br /&gt;
*Robert Allik, Validation of NoMaD as a Global Planner for Mobile Robots, MS thesis, 2024&lt;br /&gt;
*Rauno Põlluäär, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=79393 Designing and Implementing a Bird’s-eye View Interface for a Self-driving Vehicle’s Teleoperation System] [Isejuhtiva sõiduki kaugjuhtimissüsteemile linnuvaate kasutajaliidese loomine], MS thesis, 2024&lt;br /&gt;
*Gautier Reynes, VR-Enhanced Remote Inspection Framework for Semi-Autonomous Robot Fleet, MS thesis, 2024&lt;br /&gt;
*Eva Mõtshärg, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=77385 3D-prinditava kere disain ja analüüs vabavaralisele haridusrobotile Robotont] [Design and Analysis of a 3D Printable Chassis for the Open Source Educational Robot Robotont], MS thesis, 2023&lt;br /&gt;
*Farnaz Baksh, [https://dspace.ut.ee/items/a7a9cc15-27e9-450c-94e8-3267f0c95c56 An Open-source Robotic Study Companion for University Students] [Avatud lähtekoodiga robotõpikaaslane üliõpilastele], MS thesis, 2023&lt;br /&gt;
*Igor Rybalskii, [http://hdl.handle.net/10062/83028 Augmented reality (AR) for enabling human-robot collaboration with ROS robots] [Liitreaalsus inimese ja roboti koostöö võimaldamiseks ROS-i robotitega], MS thesis, 2022&lt;br /&gt;
*Md. Maniruzzaman, [http://hdl.handle.net/10062/83025 Object search and retrieval in indoor environment using a Mobile Manipulator] [Objektide otsimine ja teisaldamine siseruumides mobiilse manipulaatori abil], MS thesis, 2022&lt;br /&gt;
*Allan Kustavus, [http://hdl.handle.net/10062/72651 Design and Implementation of a Generalized Resource Management Architecture in the TeMoto Software Framework] [Üldise ressursihalduri disain ja teostus TeMoto tarkvara raamistikule], MS thesis, 2021&lt;br /&gt;
*Kristina Meister, [http://hdl.handle.net/10062/72350 External human-vehicle interaction - a study in the context of an autonomous ride-hailing service], MS thesis, 2021&lt;br /&gt;
*Muhammad Usman, [http://hdl.handle.net/10062/72126 Development of an Optimization-Based Motion Planner and Its ROS Interface for a Non-Holonomic Mobile Manipulator] [Optimeerimisele baseeruva liikumisplaneerija arendamine ja selle ROSi liides mitteholonoomse mobiilse manipulaatori jaoks], MS thesis, 2020&lt;br /&gt;
*Maarika Oidekivi, [http://hdl.handle.net/10062/72119 Masina kavatsuse väljendamine ja tõlgendamine] [Communicating and interpreting machine intent], MS thesis, 2020&lt;br /&gt;
*Houman Masnavi, [http://hdl.handle.net/10062/72118 Multi-Robot Motion Planning for Shared Payload Transportation] [Rajaplaneerimine multi-robot süsteemile jagatud lasti transportimisel], MS thesis, 2020&lt;br /&gt;
*Fabian Ernesto Parra Gil, [http://hdl.handle.net/10062/72112 Implementation of Robot Manager Subsystem for Temoto Software Framework] [Robotite Halduri alamsüsteemi väljatöötamine tarkvararaamistikule TEMOTO], MS thesis, 2020&lt;br /&gt;
*Zafarullah, [http://hdl.handle.net/10062/72125 Gaze Assisted Neural Network based Prediction of End-Point of Human Reaching Trajectories], MS thesis, 2020&lt;br /&gt;
*Madis K Nigol, [http://hdl.handle.net/10062/64339 Õppematerjalid robotplatvormile Robotont] [Study materials for robot platform Robotont], MS thesis, 2019&lt;br /&gt;
*Renno Raudmäe, [http://hdl.handle.net/10062/64341 Avatud robotplatvorm Robotont] [Open source robotics platform Robotont], MS thesis, 2019&lt;br /&gt;
*Asif Sattar, [http://hdl.handle.net/10062/64352 Human detection and distance estimation with monocular camera using YOLOv3 neural network] [Inimeste tuvastamine ning kauguse hindamine kasutades kaamerat ning YOLOv3 tehisnärvivõrku], MS thesis, 2019&lt;br /&gt;
*Ragnar Margus, [http://hdl.handle.net/10062/64337 Kergliiklusvahendite jagamisteenuseks vajaliku positsioneerimismooduli loomine ja uurimine] [Development and testing of an IoT module for an electric vehicle sharing service], MS thesis, 2019&lt;br /&gt;
*Pavel Šumejko, [http://hdl.handle.net/10062/64320 Robust Solution for Extrinsic Calibration of a 2D Laser-Rangefinder and a Monocular USB Camera] [Meetod 2D laserkaugusmõõdiku ja USB-kaamera väliseks kalibreerimiseks], MS thesis, 2019&lt;br /&gt;
*Dzvezdana Arsovska, [http://hdl.handle.net/10062/64321 Building an Efficient and Secure Software Supply Pipeline for Aerial Robotics Application] [Efektiivne ja turvaline tarkvaraarendusahel lennurobootika rakenduses], MS thesis, 2019&lt;br /&gt;
*Tõnis Tiimus, [http://hdl.handle.net/10062/60713 Visuaalselt esilekutsutud potentsiaalidel põhinev aju-arvuti liides robootika rakendustes] [A VEP-based BCI for robotics applications], MS thesis, 2018&lt;br /&gt;
*Martin Appo, [http://hdl.handle.net/10062/60718 Hardware-agnostic compliant control ROS package for collaborative industrial manipulators] [Riistvarapaindlik ROSi tarkvarapakett tööstuslike robotite mööndlikuks juhtimiseks], MS thesis, 2018&lt;br /&gt;
*Hassan Mahmoud Shehawy Elhanash, [http://hdl.handle.net/10062/60711 Optical Tracking of Forearm for Classifying Fingers Poses] [Küünarvarre visuaalne jälgimine ennustamaks sõrmede asendeid], MS thesis, 2018&lt;br /&gt;
&lt;br /&gt;
== Bachelor's theses ==&lt;br /&gt;
*Albert Unn, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=79937 Suhtlusvõimekuse arendamine sotsiaalsele humanoidrobotile SemuBot] [Developing the ability to communicate for SemuBot, a social humanoid robot], BS thesis, 2024&lt;br /&gt;
*Veronika-Marina Volynets, [https://hdl.handle.net/10062/99943 Development of Control Electronics and Program for Robotont's Height Adjustment Mechanism] [Juhtelektroonika ja programmi väljatöötamine Robotondi kõrguse reguleerimise mehhanismile], BS thesis, 2024&lt;br /&gt;
*Nikita Kurenkov, [https://hdl.handle.net/10062/99880 Flexible Screen Integration and Development of Neck Movement Mechanism for Social Humanoid Robot SemuBot] [Humanoidroboti näo lahenduse leidmine ja rakendamine; humanoidroboti jaoks kaela mehhanismi väljatöötamine], BS thesis, 2024&lt;br /&gt;
*Elchin Huseynov, [https://hdl.handle.net/10062/99668 Design and Control of a Social Humanoid Robot - SemuBot’s Hand] [Sotsiaalse Humanoidroboti Disain ja Juhtimine - SemuBoti Käsi], BS thesis, 2024&lt;br /&gt;
*Veronika Podliesnova, [https://hdl.handle.net/10062/99636 Real-Time Detection of Robot Failures by Monitoring Operator’s Brain Activity with EEG-based Brain-Computer Interface] [Reaalajas Robotirikke Tuvastamine Operaatori Ajutegevuse Jälgimise teel EEG-põhise Aju-Arvuti Liidese abil], BS thesis, 2024&lt;br /&gt;
*Märten Josh Peedimaa, [https://hdl.handle.net/not-yet Rhea: An Open-Source Table Tennis Ball Launcher Robot for Multiball Training] [Avatud lähtekoodiga lauatennise palliviske robot mitmikpall treenimiseks], BS thesis, 2024&lt;br /&gt;
*Karl Sander Vinkel, [https://hdl.handle.net/not-yet Laadimisjaama ja transportkesta väljatöötamine õpperobotile Robotont] [Development of charging dock and transport case for Robotont], BS thesis, 2024&lt;br /&gt;
*Raimo Köidam, [https://hdl.handle.net/not-yet Valguslahenduse tarkvara väljatöötamine õpperobotile Robotont] [Development of light solution software for the educational robot Robotont], BS thesis, 2024&lt;br /&gt;
*Robert Valge, [https://hdl.handle.net/not-yet Toitepinge ja tarbevoolu monitoorimine ning toitehalduse püsivara loomine õpperobotil Robotont] [Monitoring supply voltage and current consumption and creating firmware for Robotont’s power management system], BS thesis, 2024&lt;br /&gt;
*Leonid Tšigrinski, [https://hdl.handle.net/not-yet Õpperoboti Robotont püsivara arhitektuuri uuendamine] [Education robot &amp;quot;Robotont&amp;quot; firmware architecture updating], BS thesis, 2024&lt;br /&gt;
*Andres Sakk, [https://hdl.handle.net/not-yet Avatud robotplatvormi Robotont 3 kasutajaliidese väljatöötamine] [Development of a user interface for the open robotics platform Robotont 3], BS thesis, 2024&lt;br /&gt;
*Kaur Kullamäe, [https://hdl.handle.net/not-yet Elektroonikalahendus ja püsivara robotkäe juhtimiseks sotsiaalsel humanoidrobotil SemuBot] [Building the SemuBot arm electronics system], BS thesis, 2024&lt;br /&gt;
*Kristjan Madis Kask, [https://hdl.handle.net/not-yet Käe mehaanika disain sotsiaalsele humanoidrobotile SemuBot] [Arm mechanics design for social humanoid robot SemuBot], BS thesis, 2024&lt;br /&gt;
*Sven-Ervin Paap, [https://hdl.handle.net/not-yet ROS2 draiver õpperobotile Robotont] [ROS2 driver for the educational robot Robotont], BS thesis, 2024&lt;br /&gt;
*Timur Nizamov, [https://hdl.handle.net/10062/99592 Audio System for the Social Humanoid Robot SemuBot] [Helisüsteem sotsiaalsele humanoidrobotile SemuBot], BS thesis, 2024&lt;br /&gt;
*Georgs Narbuts, [https://hdl.handle.net/10062/99554 Holonomic Motion Drive System of a Social Humanoid Robot SemuBot] [Sotsiaalse humanoidroboti SemuBoti holonoomne liikuv ajamisüsteem], BS thesis, 2024&lt;br /&gt;
*Iryna Hurova, [https://hdl.handle.net/10062/90638 Kitting station of the learning factory] [Õppetehase komplekteerimisjaam], BS thesis, 2023&lt;br /&gt;
*Paola Avalos Conchas, [https://hdl.handle.net/10062/90641 Payload transportation system of a learning factory] [Õppetehase kasuliku koorma transpordisüsteem], BS thesis, 2023&lt;br /&gt;
*Pille Pärnalaas, [https://hdl.handle.net/10062/93431 Pöördpõik ajami arendus robotplatvormile Robotont] [Development of swerve drive for robotic platform Robotont], BS thesis, 2023&lt;br /&gt;
*Carl Hjalmar Love Hult, [https://hdl.handle.net/10062/90673 Using and Evaluating the Real-time Spatial Perception System Hydra in Real-world Scenarios] [Reaalajas toimiva ruumilise taju süsteemi Hydra kasutamine ja hindamine praktilistes stsenaariumides], BS thesis, 2023&lt;br /&gt;
*Priit Rooden, [https://hdl.handle.net/10062/93432 Autonoomse laadimislahenduse väljatöötamine õpperobotile Robotont] [Development of an autonomous charging solution for the robot platform Robotont], BS thesis, 2023&lt;br /&gt;
*Marko Muro, [https://hdl.handle.net/10062/93429 Robotondi akulahenduse ning 12 V pingeregulaatori prototüüpimine] [Prototyping battery solution and 12 V voltage regulator for Robotont], BS thesis, 2023&lt;br /&gt;
*Danel Leppenen, [https://hdl.handle.net/10062/93427 Nav2 PYIF: Python-based Motion Planning for ROS 2 Navigation 2] [Nav 2 PYIF: Pythoni põhine liikumise planeerija ROS 2 Navigation 2-le], BS thesis, 2023&lt;br /&gt;
*Kertrud Geddily Küüt, [https://hdl.handle.net/10062/93424 Kõrgust reguleeriv mehhanism Robotondile] [Height adjusting mechanism for Robotont], BS thesis, 2023&lt;br /&gt;
*Ingvar Drikkit, [https://hdl.handle.net/10062/93421 Lisaseadmete võimekuse arendamine haridusrobotile Robotont] [Developing add-on device support for the educational robot Robotont], BS thesis, 2023&lt;br /&gt;
*Kristo Pool, [https://hdl.handle.net/10062/93430 MoveIt 2 õppematerjalid] [Learning materials for MoveIt 2], BS thesis, 2023&lt;br /&gt;
*Erki Veeväli, [https://hdl.handle.net/10062/93434 Development of a Continuous Teleoperation System for Urban Road Vehicle] [Linnasõiduki pideva kaugjuhtimissüsteemi arendus], BS thesis, 2023&lt;br /&gt;
*Aleksandra Doroshenko, [https://hdl.handle.net/10062/93420 Haiglates inimesi juhatava roboti disain] [Hospital guide robot design], BS thesis, 2023&lt;br /&gt;
*Hui Shi, [http://hdl.handle.net/10062/83043 Expanding the Open-source ROS Software Package opencv_apps with Dedicated Blob Detection Functionality] [Avatud lähtekoodiga ROS-i tarkvarakimbu opencv_apps laiendamine laigutuvasti funktsionaalsusega], BS thesis, 2022&lt;br /&gt;
*Dāvis Krūmiņš, [http://hdl.handle.net/10062/83040 Web-based learning and software development environment for remote access of ROS robots] [Veebipõhine õppe- ja tarkvaraarenduse keskkond ROS robotite juurdepääsuks kaugteel], BS thesis, 2022&lt;br /&gt;
*Anna Jakovleva, [http://hdl.handle.net/10062/83037 Roboquiz - an interactive human-robot game] [Roboquiz - interaktiivne inimese ja roboti mäng], BS thesis, 2022&lt;br /&gt;
*Kristjan Laht, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=74702&amp;amp;year=2022 Robot Localization with Fiducial Markers] [Roboti lokaliseerimine koordinaatmärkidega], BS thesis, 2022&lt;br /&gt;
*Hans Pärtel Pani, [http://hdl.handle.net/10062/83008 ROS draiver pehmerobootika haaratsile] [ROS Driver for Soft Robotic Gripper], BS thesis, 2022&lt;br /&gt;
*Markus Erik Sügis, [http://hdl.handle.net/10062/83015 Jagatud juhtimise põhimõttel realiseeritud robotite kaugjuhtimissüsteem] [A continuous teleoperating system based on shared control concept], BS thesis, 2022&lt;br /&gt;
*Taaniel Küla, [http://hdl.handle.net/10062/83007 ROS2 platvormile keevitusroboti tarkvara prototüübi loomine kasutades UR5 robot-manipulaatorit] [Welding robot software prototype for ROS2 using UR5 robot arm], BS thesis, 2022&lt;br /&gt;
*Rauno Põlluäär, [http://hdl.handle.net/10062/72672 Veebirakendus-põhine kasutajaliides avatud robotplatvormi Robotont juhtimiseks ja haldamiseks] [Web application-based user interface for controlling and managing open-source robotics platform Robotont], BS thesis, 2021&lt;br /&gt;
*Hendrik Olesk, [http://hdl.handle.net/10062/72664 Nägemisulatuses kaugjuhitava mobiilse robotmanipulaatori kasutajamugavuse tõstmine] [Improving the usability of a mobile manipulator robot for line-of-sight remote control], BS thesis, 2021&lt;br /&gt;
*Tarvi Tepandi, [http://hdl.handle.net/10062/72665 Segareaalsusel põhinev kasutajaliides mobiilse roboti kaugjuhtimiseks Microsoft HoloLens 2 vahendusel] [Mixed-reality user interface for teleoperating mobile robots with Microsoft HoloLens 2], BS thesis, 2021&lt;br /&gt;
*Rudolf Põldma, [http://hdl.handle.net/10062/72674 Tartu linna Narva maantee ringristmiku digikaksik] [Digital twin for Narva street roundabout in Tartu], BS thesis, 2021&lt;br /&gt;
*Kwasi Akuamoah Boateng, [http://hdl.handle.net/10062/72804 Digital Twin of a Teaching and Learning Robotics Lab] [Robotite õpetamise ja õppimise labori digitaalne kaksik], BS thesis, 2021&lt;br /&gt;
*Karina Sein, [http://hdl.handle.net/10062/72102 Eestikeelse kõnesünteesi võimaldamine robootika arendusplatvormil ROS] [Enabling Estonian speech synthesis on the Robot Operating System (ROS)], BS thesis, 2020&lt;br /&gt;
*Ranno Mäesepp, [http://hdl.handle.net/10062/72100 Takistuste vältimise lahendus õpperobotile Robotont] [Obstacle avoidance solution for educational robot platform Robotont], BS thesis, 2020&lt;br /&gt;
*Igor Rybalskii, [http://hdl.handle.net/10062/72060 Gesture Detection Software for Human-Robot Collaboration] [Žestituvastus tarkvara inimese ja roboti koostööks], BS thesis, 2020&lt;br /&gt;
*Meelis Pihlap, [http://hdl.handle.net/10062/64292 Mitme roboti koostöö funktsionaalsuste väljatöötamine tarkvararaamistikule TeMoto] [Multi-robot collaboration functionalities for robot software development framework TeMoto], BS thesis, 2019&lt;br /&gt;
*Kaarel Mark, [http://hdl.handle.net/10062/64290 Liitreaalsuse kasutamine tootmisprotsessis asukohtade määramisel] [Augmented reality for location determination in manufacturing], BS thesis, 2019&lt;br /&gt;
*Kätriin Julle, [http://hdl.handle.net/10062/64279 Roboti KUKA youBot riistvara ja ROS-tarkvara uuendamine] [Upgrading robot KUKA youBot’s hardware and ROS-software], BS thesis, 2019&lt;br /&gt;
*Georg Astok, [http://hdl.handle.net/10062/64274 Roboti juhtimine virtuaalreaalsuses kasutades ROS-raamistikku] [Creating virtual reality user interface using only ROS framework], BS thesis, 2019&lt;br /&gt;
*Martin Hallist, [http://hdl.handle.net/10062/64275 Robotipõhine kaugkohalolu käte liigutuste ülekandmiseks] [Teleoperation robot for arms motions], BS thesis, 2019&lt;br /&gt;
*Ahmed Hassan Helmy Mohamed, [http://hdl.handle.net/10062/63946 Software integration of autonomous robot system for mixing and serving drinks] [Jooke valmistava ja serveeriva robotsüsteemi tarkvaralahendus], BS thesis, 2019&lt;br /&gt;
*Kristo Allaje, [http://hdl.handle.net/10062/60294 Draiveripakett käte jälgimise seadme Leap Motion™ kontroller kasutamiseks robootika arendusplatvormil ROS] [Driver package for using Leap Motion™ controller on the robotics development platform ROS], BS thesis, 2018&lt;br /&gt;
*Martin Maidla, [http://hdl.handle.net/10062/60288 Avatud robotiarendusplatvormi Robotont omniliikumise ja odomeetria arendamine] [Omnimotion and odometry development for open robot development platform Robotont], BS thesis, 2018&lt;br /&gt;
*Raid Vellerind, [http://hdl.handle.net/10062/56559 Avatud robotiarendusplatvormi ROS võimekuse loomine Tartu Ülikooli Robotexi robootikaplatvormile] [ROS driver development for the University of Tartu’s Robotex robotics platform], BS thesis, 2017&lt;br /&gt;
*Kalle-Gustav Kruus, [http://hdl.handle.net/10062/32440 Jalgpalliroboti löögimehhanismi elektroonikalahendus] [Driver circuit for a kicking mechanism of a football robot], BS thesis, 2013&lt;br /&gt;
*Sven Kautlenbach, [http://hdl.handle.net/10062/25570 Autonoomne seade elastsusmooduli mõõtmiseks] [Electronic device for Young’s modulus measurements], BS thesis, 2011&lt;br /&gt;
&lt;br /&gt;
[[Category:Theses Topics]]&lt;/div&gt;</summary>
		<author><name>Karl</name></author>
	</entry>
	<entry>
		<id>https://ims.ut.ee/index.php?title=Theses_in_Robotics&amp;diff=42613</id>
		<title>Theses in Robotics</title>
		<link rel="alternate" type="text/html" href="https://ims.ut.ee/index.php?title=Theses_in_Robotics&amp;diff=42613"/>
		<updated>2025-05-28T12:36:35Z</updated>

		<summary type="html">&lt;p&gt;Karl: /* Bachelor's theses */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:IMS-robotics]]&lt;br /&gt;
&amp;lt;div class=&amp;quot;toclimit-3&amp;quot;&amp;gt;__TOC__&amp;lt;/div&amp;gt; &lt;br /&gt;
= Projects in Advanced Robotics =&lt;br /&gt;
''The main objective of the following projects is to give students experience in working with advanced robotics technology. Our group is active in several R&amp;amp;D projects involving human-robot collaboration, intuitive teleoperation of robots, and autonomous navigation of unmanned mobile platforms. Our main software platforms are [http://www.ros.org/ Robot Operating System (ROS)] for developing software for advanced robot systems and [http://gazebosim.org/ Gazebo] for running realistic robotic simulations.''&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
For further information, contact [[User:Karl|Karl Kruusamäe]]. &lt;br /&gt;
&lt;br /&gt;
The following is not an exhaustive list of all available thesis/research topics.&lt;br /&gt;
&lt;br /&gt;
== Highlighted thesis topics for 2024/2025 study year ==&lt;br /&gt;
# [[#ROBOTONT: analysis of different options as on-board computers|ROBOTONT: analysis of different options as on-board computers]]&lt;br /&gt;
# [[#SemuBOT: multiple topics|SemuBOT: multiple topics]]&lt;br /&gt;
# [[#ROBOTONT: integrating a graphical programming interface|ROBOTONT: integrating a graphical programming interface]]&lt;br /&gt;
# [[#Sign-language-based control for robots|Sign-language-based control for robots]]&lt;br /&gt;
# [[#Robotic Study Companion: a social robot for students in higher education|Robotic Study Companion: a social robot for students in higher education]]&lt;br /&gt;
# [[#Quantitative Evaluation of the Situation Awareness during the Teleoperation of an Urban Vehicle|Quantitative Evaluation of the Situation Awareness during the Teleoperation of an Urban Vehicle]]&lt;br /&gt;
# [[#Mixed-reality scene creation for vehicle teleoperation|Mixed-reality scene creation for vehicle teleoperation]]&lt;br /&gt;
&lt;br /&gt;
== List of potential thesis topics ==&lt;br /&gt;
Our inventory includes but is not limited to:&lt;br /&gt;
&amp;lt;gallery mode=&amp;quot;packed&amp;quot;&amp;gt;&lt;br /&gt;
File:Youbot.png|120px|thumb|KUKA youBot&lt;br /&gt;
File:Ur5_left.png|120px|thumb|Universal Robot UR5&lt;br /&gt;
File:Franka_Emika_Panda.jpg|120px|thumb|Franka Emika Panda&lt;br /&gt;
File:Clearpath_Jackal.jpg|120px|thumb|Clearpath Jackal&lt;br /&gt;
File:Kinova_KG-3_Gripper.jpg|120px|thumb|Kinova 3-finger gripper&lt;br /&gt;
File:Xarm7.jpg|120px|thumb|UFACTORY xArm7&lt;br /&gt;
File:Robotont_banner.png|120px|thumb|robotont&lt;br /&gt;
File:Turtlebot3-waffle-pi.jpg|thumb|TurtleBot3&lt;br /&gt;
File:Parrot-bebop-2.jpg|thumb|Parrot Bebop 2&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== ROBOTONT: Docker-Driven ROS Environment Switching ===&lt;br /&gt;
This thesis focuses on integrating Docker containers to manage and switch between different ROS (Robot Operating System) environments on Robotont. Currently, ROS is installed natively on Robotont's Ubuntu-based onboard computer, limiting flexibility in system recovery and configuration switching. The goal is to develop a Docker-based solution that allows users to switch between ROS environments directly from Robotont’s existing low-level menu interface.&lt;br /&gt;
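As a purely illustrative sketch of the idea (the image names and the docker invocation below are assumptions, not the actual Robotont setup), the low-level menu could map each selection to a container image and build the command that starts it:

```python
# Hypothetical mapping from a low-level menu selection to a Docker image tag.
# Image names are placeholders, not real Robotont images.
ROS_ENV_IMAGES = {
    "ros1": "robotont/ros:noetic",
    "ros2": "robotont/ros:humble",
}

def docker_command(selection: str) -> list:
    # Build the command the menu handler would execute to start the chosen
    # ROS environment with host networking and hardware access.
    image = ROS_ENV_IMAGES[selection]
    return ["docker", "run", "--rm", "--net=host", "--privileged", image]
```

Keeping each ROS distribution in its own image would let a broken configuration be discarded by simply switching back to a known-good container.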
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: designing and implementing a communication protocol for additional devices ===&lt;br /&gt;
Robotont currently includes a designated area on its front for attaching additional devices, such as an ultrasonic range finder, a servo motor connected to an Arduino, or a MikroBUS device. This thesis aims to design and implement a standard communication protocol that will enable seamless integration and control of these devices through the high-level ROS framework running on Robotont’s onboard computer.&lt;br /&gt;
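For illustration only, a minimal framed message of the kind such a protocol might standardize (the start byte, field layout, and XOR checksum are assumptions, not the protocol to be designed in the thesis):

```python
import struct

def encode_frame(device_id: int, command: int, payload: bytes) -> bytes:
    # Hypothetical frame: start byte 0xAA, device id, command, payload length,
    # payload, then a single XOR checksum byte over everything before it.
    body = struct.pack("BBBB", 0xAA, device_id, command, len(payload)) + payload
    checksum = 0
    for b in body:
        checksum ^= b
    return body + bytes([checksum])

def decode_frame(frame: bytes):
    # Validate the start byte and checksum, return (device_id, command, payload).
    if frame[0] != 0xAA:
        raise ValueError("bad start byte")
    checksum = 0
    for b in frame[:-1]:
        checksum ^= b
    if checksum != frame[-1]:
        raise ValueError("checksum mismatch")
    length = frame[3]
    return frame[1], frame[2], frame[4:4 + length]
```

A ROS node on the onboard computer could then expose each framed device as topics or services without knowing the device's internals.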
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: analysis of different options as on-board computers ===&lt;br /&gt;
Currently, ROBOTONT uses an Intel NUC as its onboard computer. The goal of this thesis is to validate Robotont's software stack on alternative compute devices (e.g. Raspberry Pi 4, Intel Compute Stick, and NVIDIA Jetson Nano) and benchmark their performance on the most popular Robotont use-case demos (e.g. webapp teleop, ar-tag steering, 2D mapping, and 3D mapping). The objective is to propose at least two different compute solutions: one that optimizes for cost and another that optimizes for performance. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT Lite ===&lt;br /&gt;
The goal of this thesis is to optimize the ROBOTONT platform for cost by replacing the onboard computer and sensors with low-cost alternatives while ensuring ROS/ROS2 software compatibility. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: integrating a graphical programming interface ===&lt;br /&gt;
The goal of this thesis is to integrate a graphical programming solution (e.g. Scratch or Blockly) to enable programming of Robotont by non-experts. See, e.g., https://doi.org/10.48550/arXiv.2011.13706&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== SemuBOT: multiple topics ===&lt;br /&gt;
In 2023/2024, many topics are offered to support the development of the open-source humanoid robot project SemuBOT (https://www.facebook.com/semubotmtu/)&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Robotic Study Companion: a social robot for students in higher education ===&lt;br /&gt;
Potential Topics:&lt;br /&gt;
* Enhance the Robot's Speech/Natural Language Capabilities &lt;br /&gt;
* Build a Local Language Model for the RSC&lt;br /&gt;
* Develop and Program the Robot’s Behavior and Personality&lt;br /&gt;
* Build a Digital Twin Simulation for Multimodal Interaction&lt;br /&gt;
* Explore the Use of the RSC as an Affective Robot to Address Students’ Academic Emotions&lt;br /&gt;
* Explore and Implement Cybersecurity Measures for a Social Robot&lt;br /&gt;
&lt;br /&gt;
[https://github.com/orgs/RobotStudyCompanion/discussions/3 More info on GitHub] | or reach out to farnaz.baksh@ut.ee&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: Quick launch demo suite and the final complete release of its ROS1 software ===&lt;br /&gt;
The goal of this thesis is to refine existing Robotont demos (e.g. ar-tag steering, follow-the-leader, dancing-with-robot, LEAP-based control, etc.) and package them in an easy-to-use way for quick deployment by anyone during public events such as science popularization workshops and school visits.&lt;br /&gt;
The results of this work will be packaged as the final ROS1 release of Robotont software, since ROS1 reaches end-of-life in May 2025.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: ROS support, demos, and educational materials for open-source mobile robot ===&lt;br /&gt;
[[Image:RosLarge.png|left|100px|ROS]]&lt;br /&gt;
The project offers many potential thesis topics on the open-source robot platform ROBOTONT. The nature of the thesis can be software development to improve the platform's capabilities, simulation of specific scenarios, and/or demonstration of ROBOTONT in a real-life setting. A more detailed thesis topic will be outlined during an in-person meeting.&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
[[Image:Ros_equation.png|x100px|What is ROS?]]&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Virtual reality user interface (VRUI) for intuitive teleoperation system ===&lt;br /&gt;
[[Image:LeapMotion.png|200px|thumb|right|Detecting 2 hands with Leap Motion Controller]]&lt;br /&gt;
Enhancing the user-experience of a virtual reality UI developed by [https://github.com/ut-ims-robotics/vrui_rviz Georg Astok]. Potentially adding [http://www.osvr.org/hardware-devs.html virtual reality capability] to a [https://www.youtube.com/watch?v=L25HHFd00rc gesture- and natural-language-based robot teleoperation system].&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
[[Image:Temoto-working-principle-in-pics.png|x160px|Gesture-based teleoperation]]&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Health monitor for intuitive telerobot ===&lt;br /&gt;
Intelligent status and error handling for an intuitive telerobotic system.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== 3D scanning of industrial objects ===&lt;br /&gt;
Using laser sensors and cameras to create accurate models of industrial products for quality control or further processing.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Modeling humans for human-robot interaction ===&lt;br /&gt;
True human-robot collaboration means that the robot must understand the actions, intention, and state of its human partner. This work involves using cameras and other sensors for digitally representing and modelling humans. There are multiple stages of modeling: a) physical models of human kinematics and dynamics; b) higher-level models for recognizing human intent.&amp;lt;br&amp;gt;&lt;br /&gt;
[[Image:Skeletal ROS.PNG|x160px|ROS &amp;amp; Kinect &amp;amp; Skeleton-Markers Package]]&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Robotic avatar for telepresence ===&lt;br /&gt;
Integrating hand gestures and head movements to control a robot avatar in virtual reality user interface.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Detection of hardware and software resources for smart integration of robots ===&lt;br /&gt;
The vast majority of today’s robotic applications rely on hard-coded device and algorithm usage. This project focuses on developing Resource Snooper software that can detect the addition or removal of resources for the benefit of dynamic reconfiguration of robotic systems. This project is developed as a subsystem of [https://utnuclearroboticspublic.github.io/temoto2/index.html TeMoto].&lt;br /&gt;
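The core detection step can be sketched as a diff between two resource snapshots (how a snapshot is taken, e.g. by enumerating device files or ROS topics, is left abstract; the function and names are illustrative, not part of TeMoto):

```python
def diff_resources(previous, current):
    # Compare two resource snapshots (e.g. device paths or ROS topic names)
    # and report what appeared and what disappeared since the last poll.
    prev, curr = set(previous), set(current)
    added = sorted(curr - prev)
    removed = sorted(prev - curr)
    return added, removed
```

A snooper would poll such snapshots periodically and publish the resulting add/remove events to the reconfiguration logic.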
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Sonification of feedback during teleoperation of robots ===&lt;br /&gt;
Humans are used to receiving auditory feedback in their everyday lives. It helps us make decisions and be aware of potential dangers. Telerobotic interfaces can deploy the same idea to improve situational awareness and robotic task efficiency. The thesis project involves a study of different sonification solutions and their implementation in a telerobotic application using ROS. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Human-Robot and Robot-Robot collaboration applications ===&lt;br /&gt;
Creating a demo or analysis of software capabilities related to human-robot or robot-robot teams:&lt;br /&gt;
* human-robot collaborative assembly&lt;br /&gt;
* distributed mapping; analysis and demo of existing ROS (e.g., segmap https://youtu.be/JJhEkIA1xSE) packages for multi-robot mapping&lt;br /&gt;
* Inaccessible region teamwork&lt;br /&gt;
**youbot+drone - a drone maps the environment (for example, a maze) and a ground vehicle uses this information to traverse the maze&lt;br /&gt;
**youbot+robotont - the youBot cannot climb ledges, but it can lift a smaller robot, such as Robotont, up a ledge.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Mirroring human hand movements on industrial robots ===&lt;br /&gt;
The goal of this project is to integrate continuous control of an industrial robot manipulator with a gestural telerobotics interface. The recommended tools for this thesis project are a Leap Motion Controller or a standard web camera, Ultraleap, a Universal Robots UR5 manipulator, and ROS.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== ROBOTONT: TeMoto for robotont ===&lt;br /&gt;
Swarm-management and UMRF-based task loading for robotont using [https://github.com/temoto-framework TeMoto] framework.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Enhancing teleoperation control interface with augmented cues to provoke caution===&lt;br /&gt;
The task is to create a telerobot control interface where the video feed from the remote site and/or a mixed-reality scene is augmented with visual cues to provoke caution in the human operator.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Robot-to-human interaction ===&lt;br /&gt;
As robots and autonomous machines start sharing the same space as humans, their actions need to be understood by the people occupying that space. For instance, a human worker needs to understand what the robot partner is planning next, or a pedestrian needs to clearly comprehend the behaviour of a driverless vehicle. To reduce this ambiguity, the robot needs mechanisms to convey its intent (whatever it is going to do next). The aim of the thesis is to outline existing methods for machines to convey their intent and develop a unified model interface for expressing that intent. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Gaze-based handover prediction ===&lt;br /&gt;
When a human needs to pass an object to a robot manipulator, the robot must understand where in 3D space the object handover occurs and then plan an appropriate motion. Human gaze can be used as the input for predicting which object to track. The thesis activities involve camera-based eye tracking and safe motion planning.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Real-world demonstrator for MIR+UR+TeMoto integration  ===&lt;br /&gt;
Integration of mobile manipulator (MIR100 + UR5e + Robotiq gripper) to demonstrate TeMoto in a collaborative application.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== ROBOTONT: Human-height human-robot interface for Robotont ground robot ===&lt;br /&gt;
Robotont is an ankle-high flat mobile robot. For humans to interact with Robotont, there is a need for a compact and lightweight mechanical structure that is tall enough for comfortable human-robot interaction. The objective of the thesis is to develop the mechanical designs and build prototypes that ensure stable operation and meet the aesthetic requirements for use in public places. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Stratos Explore Ultraleap demonstrator for robotics ===&lt;br /&gt;
The aim of this thesis is to systematically analyze the strengths and limitations of the Stratos Explore Ultraleap device in the context of controlling a robot, and subsequently to implement a demonstrator application showcasing its applicability for robot control. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Mixed-reality scene creation for vehicle teleoperation ===&lt;br /&gt;
Fusing different sensory feeds for creating high-usability teleoperation scene.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Validation study for AR-based robot user-interfaces ===&lt;br /&gt;
Designing and carrying out a user study to validate the functionality and usability of a human-robot interface.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Replication of the MIT Hydra demo ===&lt;br /&gt;
The goal of the thesis is to use the Hydra software package and integrate it for a robot used at IMS Robotics (e.g., TIAGo, Jackal, Robotont).&lt;br /&gt;
&amp;lt;br&amp;gt;Hydra takes sensor data as input (stereo or RGB-D camera images and IMU data) and produces a hierarchical model of the environment, by estimating the trajectory of the robot (including loop closures), building a metric-semantic 3D mesh, and segmenting objects, places, and rooms in an indoor environment. These representations are combined into a 3D scene graph, which enables novel approaches for hierarchical loop closure detection and ensures the representations remain consistent after loop closure. Hydra is implemented in C++ and is ROS-compatible. It uses a neural-network based image segmentation front-end, but it otherwise relies on an efficient and multi-threaded CPU-based implementation, which is suitable for mobile robot deployment.&lt;br /&gt;
&amp;lt;br&amp;gt;LINKS:&lt;br /&gt;
&amp;lt;br&amp;gt;Video: https://youtu.be/qZg2lSeTuvM&lt;br /&gt;
&amp;lt;br&amp;gt;Code: https://github.com/MIT-SPARK/Hydra&lt;br /&gt;
&amp;lt;br&amp;gt;Paper: http://www.roboticsproceedings.org/rss18/p050.pdf&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Quantitative Evaluation of the Situation Awareness during the Teleoperation of an Urban Vehicle ===&lt;br /&gt;
The goal of this thesis is to develop the methodology for measuring the operator's situation awareness while operating an autonomous urban vehicle. Potential metrics could include driving accuracy, speed, and latency. This thesis will be conducted as part of the activities at the [https://adl.cs.ut.ee Autonomous Driving Lab].&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== NAO 'flippin' ===&lt;br /&gt;
The University of Tartu has a set of [https://www.aldebaran.com/en/nao NAO robots], which have been lying on shelves for some years now. It is time to make these robots into functional social robots that can be used, e.g., for children's communication therapy or as buddies for the elderly. The goal of this thesis is to test and upgrade all the NAO robots to functional condition.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Sign-language-based control for robots ===&lt;br /&gt;
The goal of this thesis is to design and implement the means for interacting with a robot via conventional sign language, i.e., a human can command a robot by using sign language.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
= Completed projects =&lt;br /&gt;
== PhD theses ==&lt;br /&gt;
*Robert Valner, [https://hdl.handle.net/10062/105994 Design of TeMoto, a software framework for dependable, adaptive, and collaborative autonomous robots] [TeMoto – töökindlate, adaptiivsete ja koostöövõimeliste autonoomsete robotite arendamise tarkvararaamistik], PhD thesis, 2024&lt;br /&gt;
*Houman Masnavi, [https://hdl.handle.net/10062/91394 Visibility aware navigation] [Nähtavust arvestav navigatsioon], PhD thesis, 2023&lt;br /&gt;
&lt;br /&gt;
== Master's theses ==&lt;br /&gt;
*Robert Allik, Validation of NoMaD as a Global Planner for Mobile Robots, MS thesis, 2024&lt;br /&gt;
*Rauno Põlluäär, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=79393 Designing and Implementing a Bird’s-eye View Interface for a Self-driving Vehicle’s Teleoperation System] [Isejuhtiva sõiduki kaugjuhtimissüsteemile linnuvaate kasutajaliidese loomine], MS thesis, 2024&lt;br /&gt;
*Gautier Reynes, VR-Enhanced Remote Inspection Framework for Semi-Autonomous Robot Fleet, MS thesis, 2024&lt;br /&gt;
*Eva Mõtshärg, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=77385 3D-prinditava kere disain ja analüüs vabavaralisele haridusrobotile Robotont] [Design and Analysis of a 3D Printable Chassis for the Open Source Educational Robot Robotont], MS thesis, 2023&lt;br /&gt;
*Farnaz Baksh, [https://dspace.ut.ee/items/a7a9cc15-27e9-450c-94e8-3267f0c95c56 An Open-source Robotic Study Companion for University Students] [Avatud lähtekoodiga robotõpikaaslane üliõpilastele], MS thesis, 2023&lt;br /&gt;
*Igor Rybalskii, [http://hdl.handle.net/10062/83028 Augmented reality (AR) for enabling human-robot collaboration with ROS robots] [Liitreaalsus inimese ja roboti koostöö võimaldamiseks ROS-i robotitega], MS thesis, 2022&lt;br /&gt;
*Md. Maniruzzaman, [http://hdl.handle.net/10062/83025 Object search and retrieval in indoor environment using a Mobile Manipulator] [Objektide otsimine ja teisaldamine siseruumides mobiilse manipulaatori abil], MS thesis, 2022&lt;br /&gt;
*Allan Kustavus, [http://hdl.handle.net/10062/72651 Design and Implementation of a Generalized Resource Management Architecture in the TeMoto Software Framework] [Üldise ressursihalduri disain ja teostus TeMoto tarkvara raamistikule], MS thesis, 2021&lt;br /&gt;
*Kristina Meister, [http://hdl.handle.net/10062/72350 External human-vehicle interaction - a study in the context of an autonomous ride-hailing service], MS thesis, 2021&lt;br /&gt;
*Muhammad Usman, [http://hdl.handle.net/10062/72126 Development of an Optimization-Based Motion Planner and Its ROS Interface for a Non-Holonomic Mobile Manipulator] [Optimeerimisele baseeruva liikumisplaneerija arendamine ja selle ROSi liides mitteholonoomse mobiilse manipulaatori jaoks], MS thesis, 2020&lt;br /&gt;
*Maarika Oidekivi, [http://hdl.handle.net/10062/72119 Masina kavatsuse väljendamine ja tõlgendamine] [Communicating and interpreting machine intent], MS thesis, 2020&lt;br /&gt;
*Houman Masnavi, [http://hdl.handle.net/10062/72118 Multi-Robot Motion Planning for Shared Payload Transportation] [Rajaplaneerimine multi-robot süsteemile jagatud lasti transportimisel], MS thesis, 2020&lt;br /&gt;
*Fabian Ernesto Parra Gil, [http://hdl.handle.net/10062/72112 Implementation of Robot Manager Subsystem for Temoto Software Framework] [Robotite Halduri alamsüsteemi väljatöötamine tarkvararaamistikule TEMOTO], MS thesis, 2020&lt;br /&gt;
*Zafarullah, [http://hdl.handle.net/10062/72125 Gaze Assisted Neural Network based Prediction of End-Point of Human Reaching Trajectories], MS thesis, 2020&lt;br /&gt;
*Madis K Nigol, [http://hdl.handle.net/10062/64339 Õppematerjalid robotplatvormile Robotont] [Study materials for robot platform Robotont], MS thesis, 2019&lt;br /&gt;
*Renno Raudmäe, [http://hdl.handle.net/10062/64341 Avatud robotplatvorm Robotont] [Open source robotics platform Robotont], MS thesis, 2019&lt;br /&gt;
*Asif Sattar, [http://hdl.handle.net/10062/64352 Human detection and distance estimation with monocular camera using YOLOv3 neural network] [Inimeste tuvastamine ning kauguse hindamine kasutades kaamerat ning YOLOv3 tehisnärvivõrku], MS thesis, 2019&lt;br /&gt;
*Ragnar Margus, [http://hdl.handle.net/10062/64337 Kergliiklusvahendite jagamisteenuseks vajaliku positsioneerimismooduli loomine ja uurimine] [Development and testing of an IoT module for electric vehicle sharing service], MS thesis, 2019&lt;br /&gt;
*Pavel Šumejko, [http://hdl.handle.net/10062/64320 Robust Solution for Extrinsic Calibration of a 2D Laser-Rangefinder and a Monocular USB Camera] [Meetod 2D laserkaugusmõõdiku ja USB-kaamera väliseks kalibreerimiseks], MS thesis, 2019&lt;br /&gt;
*Dzvezdana Arsovska, [http://hdl.handle.net/10062/64321  Building an Efficient and Secure Software Supply Pipeline for Aerial Robotics Application] [Efektiivne ja turvaline tarkvaraarendusahel lennurobootika rakenduses], MS thesis, 2019&lt;br /&gt;
*Tõnis Tiimus, [http://hdl.handle.net/10062/60713 Visuaalselt esilekutsutud potentsiaalidel põhinev aju-arvuti liides robootika rakendustes] [A VEP-based BCI for robotics applications], MS thesis, 2018&lt;br /&gt;
*Martin Appo, [http://hdl.handle.net/10062/60718 Hardware-agnostic compliant control ROS package for collaborative industrial manipulators] [Riistvarapaindlik ROSi tarkvarapakett tööstuslike robotite mööndlikuks juhtimiseks], MS thesis, 2018&lt;br /&gt;
*Hassan Mahmoud Shehawy Elhanash, [http://hdl.handle.net/10062/60711 Optical Tracking of Forearm for Classifying Fingers Poses] [Küünarvarre visuaalne jälgimine ennustamaks sõrmede asendeid], MS thesis, 2018&lt;br /&gt;
&lt;br /&gt;
== Bachelor's theses ==&lt;br /&gt;
*Albert Unn, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=79937 Suhtlusvõimekuse arendamine sotsiaalsele humanoidrobotile SemuBot] [Developing the ability to communicate for SemuBot, a social humanoid robot], BS thesis, 2024&lt;br /&gt;
*Veronika-Marina Volynets, [https://hdl.handle.net/10062/99943 Development of Control Electronics and Program for Robotont's Height Adjustment Mechanism] [Juhtelektroonika ja programmi väljatöötamine Robotondi kõrguse reguleerimise mehhanismile], BS thesis, 2024&lt;br /&gt;
*Nikita Kurenkov, [https://hdl.handle.net/10062/99880 Flexible Screen Integration and Development of Neck Movement Mechanism for Social Humanoid Robot SemuBot] [Humanoidroboti näo lahenduse leidmine ja rakendamine; humanoidroboti jaoks kaela mehhanismi väljatöötamine], BS thesis, 2024&lt;br /&gt;
*Elchin Huseynov, [https://hdl.handle.net/10062/99668 Design and Control of a Social Humanoid Robot - SemuBot’s Hand] [Sotsiaalse Humanoidroboti Disain ja Juhtimine - SemuBoti Käsi], BS thesis, 2024&lt;br /&gt;
*Veronika Podliesnova, [https://hdl.handle.net/10062/99636 Real-Time Detection of Robot Failures by Monitoring Operator’s Brain Activity with EEG-based Brain-Computer Interface] [Reaalajas Robotirikke Tuvastamine Operaatori Ajutegevuse Jälgimise teel EEG-põhise Aju-Arvuti Liidese abil], BS thesis, 2024&lt;br /&gt;
*Märten Josh Peedimaa, [https://hdl.handle.net/not-yet Rhea: An Open-Source Table Tennis Ball Launcher Robot for Multiball Training] [Avatud lähtekoodiga lauatennise palliviske robot mitmikpall treenimiseks], BS thesis, 2024&lt;br /&gt;
*Karl Sander Vinkel, [https://hdl.handle.net/not-yet Laadimisjaama ja transportkesta väljatöötamine õpperobotile Robotont] [Development of charging dock and transport case for Robotont], BS thesis, 2024&lt;br /&gt;
*Raimo Köidam, [https://hdl.handle.net/not-yet Valguslahenduse tarkvara väljatöötamine õpperobotile Robotont] [Development of light solution software for the educational robot Robotont], BS thesis, 2024&lt;br /&gt;
*Robert Valge, [https://hdl.handle.net/not-yet Toitepinge ja tarbevoolu monitoorimine ning toitehalduse püsivara loomine õpperobotil Robotont] [Monitoring supply voltage and current consumption and creating firmware for Robotont’s power management system], BS thesis, 2024&lt;br /&gt;
*Leonid Tšigrinski, [https://hdl.handle.net/not-yet Õpperoboti Robotont püsivara arhitektuuri uuendamine] [Education robot &amp;quot;Robotont&amp;quot; firmware architecture updating], BS thesis, 2024&lt;br /&gt;
*Andres Sakk, [https://hdl.handle.net/not-yet Avatud robotplatvormi Robotont 3 kasutajaliidese väljatöötamine] [Development of a user interface for the open robotics platform Robotont 3], BS thesis, 2024&lt;br /&gt;
*Kaur Kullamäe, [https://hdl.handle.net/not-yet Elektroonikalahendus ja püsivara robotkäe juhtimiseks sotsiaalsel humanoidrobotil SemuBot] [Building the SemuBot arm electronics system], BS thesis, 2024&lt;br /&gt;
*Kristjan Madis Kask, [https://hdl.handle.net/not-yet Käe mehaanika disain sotsiaalsele humanoidrobotile SemuBot] [Arm mechanics design for social humanoid robot SemuBot], BS thesis, 2024&lt;br /&gt;
*Sven-Ervin Paap, [https://hdl.handle.net/not-yet ROS2 draiver õpperobotile Robotont] [ROS2 driver for the educational robot Robotont], BS thesis, 2024&lt;br /&gt;
*Timur Nizamov, [https://hdl.handle.net/10062/99592 Audio System for the Social Humanoid Robot SemuBot] [Helisüsteem sotsiaalsele humanoidrobotile SemuBot], BS thesis, 2024&lt;br /&gt;
*Georgs Narbuts, [https://hdl.handle.net/10062/99554 Holonomic Motion Drive System of a Social Humanoid Robot SemuBot] [Sotsiaalse humanoidroboti SemuBoti holonoomne liikuv ajamisüsteem], BS thesis, 2024&lt;br /&gt;
*Iryna Hurova, [https://hdl.handle.net/10062/90638 Kitting station of the learning factory] [Õppetehase komplekteerimisjaam], BS thesis, 2023&lt;br /&gt;
*Paola Avalos Conchas, [https://hdl.handle.net/10062/90641 Payload transportation system of a learning factory] [Õppetehase kasuliku koorma transpordisüsteem], BS thesis, 2023&lt;br /&gt;
*Pille Pärnalaas, [https://hdl.handle.net/10062/93431 Pöördpõik ajami arendus robotplatvormile Robotont] [Development of swerve drive for robotic platform Robotont], BS thesis, 2023&lt;br /&gt;
*Carl Hjalmar Love Hult, [https://hdl.handle.net/10062/90673 Using and Evaluating the Real-time Spatial Perception System Hydra in Real-world Scenarios] [Reaalajas toimiva ruumilise taju süsteemi Hydra kasutamine ja hindamine praktilistes stsenaariumides], BS thesis, 2023&lt;br /&gt;
*Priit Rooden, [https://hdl.handle.net/10062/93432 Autonoomse laadimislahenduse väljatöötamine õpperobotile Robotont] [Development of an autonomous charging solution for the robot platform Robotont], BS thesis, 2023&lt;br /&gt;
*Marko Muro, [https://hdl.handle.net/10062/93429 Robotondi akulahenduse ning 12 V pingeregulaatori prototüüpimine] [Prototyping battery solution and 12 V voltage regulator for Robotont], BS thesis, 2023&lt;br /&gt;
*Danel Leppenen, [https://hdl.handle.net/10062/93427 Nav2 PYIF: Python-based Motion Planning for ROS 2 Navigation 2] [Nav 2 PYIF: Pythoni põhine liikumise planeerija ROS 2 Navigation 2-le], BS thesis, 2023&lt;br /&gt;
*Kertrud Geddily Küüt, [https://hdl.handle.net/10062/93424 Kõrgust reguleeriv mehhanism Robotondile] [Height adjusting mechanism for Robotont], BS thesis, 2023&lt;br /&gt;
*Ingvar Drikkit, [https://hdl.handle.net/10062/93421 Lisaseadmete võimekuse arendamine haridusrobotile Robotont] [Developing add-on device support for the educational robot Robotont], BS thesis, 2023&lt;br /&gt;
*Kristo Pool, [https://hdl.handle.net/10062/93430 MoveIt 2 õppematerjalid] [Learning materials for MoveIt 2], BS thesis, 2023&lt;br /&gt;
*Erki Veeväli, [https://hdl.handle.net/10062/93434 Development of a Continuous Teleoperation System for Urban Road Vehicle] [Linnasõiduki pideva kaugjuhtimissüsteemi arendus], BS thesis, 2023&lt;br /&gt;
*Aleksandra Doroshenko, [https://hdl.handle.net/10062/93420 Haiglates inimesi juhatava roboti disain] [Hospital guide robot design], BS thesis, 2023&lt;br /&gt;
*Hui Shi, [http://hdl.handle.net/10062/83043 Expanding the Open-source ROS Software Package opencv_apps with Dedicated Blob Detection Functionality] [Avatud lähtekoodiga ROS-i tarkvarakimbu opencv_apps laiendamine laigutuvasti funktsionaalsusega], BS thesis, 2022&lt;br /&gt;
*Dāvis Krūmiņš, [http://hdl.handle.net/10062/83040 Web-based learning and software development environment for remote access of ROS robots] [Veebipõhine õppe- ja tarkvaraarenduse keskkond ROS robotite juurdepääsuks kaugteel], BS thesis, 2022&lt;br /&gt;
*Anna Jakovleva, [http://hdl.handle.net/10062/83037 Roboquiz - an interactive human-robot game] [Roboquiz - interaktiivne inimese ja roboti mäng], BS thesis, 2022&lt;br /&gt;
*Kristjan Laht, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=74702&amp;amp;year=2022 Robot Localization with Fiducial Markers] [Roboti lokaliseerimine koordinaatmärkidega], BS thesis, 2022&lt;br /&gt;
*Hans Pärtel Pani, [http://hdl.handle.net/10062/83008 ROS draiver pehmerobootika haaratsile] [ROS Driver for Soft Robotic Gripper], BS thesis, 2022&lt;br /&gt;
*Markus Erik Sügis, [http://hdl.handle.net/10062/83015 Jagatud juhtimise põhimõttel realiseeritud robotite kaugjuhtimissüsteem] [A continuous teleoperating system based on shared control concept], BS thesis, 2022&lt;br /&gt;
*Taaniel Küla, [http://hdl.handle.net/10062/83007 ROS2 platvormile keevitusroboti tarkvara portotüübi loomine kasutades UR5 robot-manipulaatorit] [Welding robot software prototype for ROS2 using UR5 robot arm], BS thesis, 2022&lt;br /&gt;
*Rauno Põlluäär, [http://hdl.handle.net/10062/72672 Veebirakendus-põhine kasutajaliides avatud robotplatvormi Robotont juhtimiseks ja haldamiseks] [Web application-based user interface for controlling and managing open-source robotics platform Robotont], BS thesis, 2021&lt;br /&gt;
*Hendrik Olesk, [http://hdl.handle.net/10062/72664 Nägemisulatuses kaugjuhitava mobiilse robotmanipulaatori kasutajamugavuse tõstmine] [Improving the usability of a mobile manipulator robot for line-of-sight remote control], BS thesis, 2021&lt;br /&gt;
*Tarvi Tepandi, [http://hdl.handle.net/10062/72665 Segareaalsusel põhinev kasutajaliides mobiilse roboti kaugjuhtimiseks Microsoft HoloLens 2 vahendusel] [Mixed-reality user interface for teleoperating mobile robots with Microsoft HoloLens 2], BS thesis, 2021&lt;br /&gt;
*Rudolf Põldma, [http://hdl.handle.net/10062/72674 Tartu linna Narva maantee ringristmiku digikaksik] [Digital twin for Narva street roundabout in Tartu], BS thesis, 2021&lt;br /&gt;
*Kwasi Akuamoah Boateng, [http://hdl.handle.net/10062/72804 Digital Twin of a Teaching and Learning Robotics Lab] [Robotite õpetamise ja õppimise labori digitaalne kaksik], BS thesis, 2021&lt;br /&gt;
*Karina Sein, [http://hdl.handle.net/10062/72102 Eestikeelse kõnesünteesi võimaldamine robootika arendusplatvormil ROS] [Enabling Estonian speech synthesis on the Robot Operating System (ROS)], BS thesis, 2020&lt;br /&gt;
*Ranno Mäesepp, [http://hdl.handle.net/10062/72100 Takistuste vältimise lahendus õpperobotile Robotont] [Obstacle avoidance solution for educational robot platform Robotont], BS thesis, 2020&lt;br /&gt;
*Igor Rybalskii, [http://hdl.handle.net/10062/72060 Gesture Detection Software for Human-Robot Collaboration] [Žestituvastus tarkvara inimese ja roboti koostööks], BS thesis, 2020&lt;br /&gt;
*Meelis Pihlap, [http://hdl.handle.net/10062/64292 Mitme roboti koostöö funktsionaalsuste väljatöötamine tarkvararaamistikule TeMoto] [Multi-robot collaboration functionalities for robot software development framework TeMoto], BS thesis, 2019&lt;br /&gt;
*Kaarel Mark, [http://hdl.handle.net/10062/64290 Liitreaalsuse kasutamine tootmisprotsessis asukohtade määramisel] [Augmented reality for location determination in manufacturing], BS thesis, 2019&lt;br /&gt;
*Kätriin Julle, [http://hdl.handle.net/10062/64279 Roboti KUKA youBot riistvara ja ROS-tarkvara uuendamine] [Upgrading robot KUKA youBot’s hardware and ROS-software], BS thesis, 2019&lt;br /&gt;
*Georg Astok, [http://hdl.handle.net/10062/64274 Roboti juhtimine virtuaalreaalsuses kasutades ROS-raamistikku] [Creating virtual reality user interface using only ROS framework], BS thesis, 2019&lt;br /&gt;
*Martin Hallist, [http://hdl.handle.net/10062/64275 Robotipõhine kaugkohalolu käte liigutuste ülekandmiseks] [Teleoperation robot for arms motions], BS thesis, 2019&lt;br /&gt;
*Ahmed Hassan Helmy Mohamed, [http://hdl.handle.net/10062/63946 Software integration of autonomous robot system for mixing and serving drinks] [Jooke valmistava ja serveeriva robotsüsteemi tarkvaralahendus], BS thesis, 2019&lt;br /&gt;
*Kristo Allaje, [http://hdl.handle.net/10062/60294 Draiveripakett käte jälgimise seadme Leap Motion™ kontroller kasutamiseks robootika arendusplatvormil ROS] [Driver package for using Leap Motion™ controller on the robotics development platform ROS], BS thesis, 2018&lt;br /&gt;
*Martin Maidla, [http://hdl.handle.net/10062/60288 Avatud robotiarendusplatvormi Robotont omniliikumise ja odomeetria arendamine] [Omnimotion and odometry development for open robot development platform Robotont], BS thesis, 2018&lt;br /&gt;
*Raid Vellerind, [http://hdl.handle.net/10062/56559 Avatud robotiarendusplatvormi ROS võimekuse loomine Tartu Ülikooli Robotexi robootikaplatvormile] [ROS driver development for the University of Tartu’s Robotex robotics platform], BS thesis, 2017&lt;br /&gt;
*Kalle-Gustav Kruus, [http://hdl.handle.net/10062/32440 Jalgpalliroboti löögimehhanismi elektroonikalahendus] [Driver circuit for a kicking mechanism of a football robot], BS thesis, 2013&lt;br /&gt;
*Sven Kautlenbach, [http://hdl.handle.net/10062/25570 Autonoomne seade elastsusmooduli mõõtmiseks] [Electronic device for Young’s modulus measurements], BS thesis, 2011&lt;br /&gt;
&lt;br /&gt;
[[Category:Theses Topics]]&lt;/div&gt;</summary>
		<author><name>Karl</name></author>
	</entry>
	<entry>
		<id>https://ims.ut.ee/index.php?title=Theses_in_Robotics&amp;diff=42612</id>
		<title>Theses in Robotics</title>
		<link rel="alternate" type="text/html" href="https://ims.ut.ee/index.php?title=Theses_in_Robotics&amp;diff=42612"/>
		<updated>2025-05-28T12:31:11Z</updated>

		<summary type="html">&lt;p&gt;Karl: /* Bachelor's theses */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:IMS-robotics]]&lt;br /&gt;
&amp;lt;div class=&amp;quot;toclimit-3&amp;quot;&amp;gt;__TOC__&amp;lt;/div&amp;gt; &lt;br /&gt;
= Projects in Advanced Robotics =&lt;br /&gt;
''The main objective of the following projects is to give students experience in working with advanced robotics technology. Our group is active in several R&amp;amp;D projects involving human-robot collaboration, intuitive teleoperation of robots, and autonomous navigation of unmanned mobile platforms. Our main software platforms are [http://www.ros.org/ Robot Operating System (ROS)] for developing software for advanced robot systems and [http://gazebosim.org/ Gazebo] for running realistic robotic simulations.''&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
For further information, contact [[User:Karl|Karl Kruusamäe]]. &lt;br /&gt;
&lt;br /&gt;
The following is not an exhaustive list of all available thesis/research topics.&lt;br /&gt;
&lt;br /&gt;
== Highlighted theses topics for 2024/2025 study year ==&lt;br /&gt;
# [[#ROBOTONT: analysis of different options as on-board computers|ROBOTONT: analysis of different options as on-board computers]]&lt;br /&gt;
# [[#SemuBOT: multiple topics|SemuBOT: multiple topics]]&lt;br /&gt;
# [[#ROBOTONT: integrating a graphical programming interface|ROBOTONT: integrating a graphical programming interface]]&lt;br /&gt;
# [[#Sign-language-based control for robots|Sign-language-based control for robots]]&lt;br /&gt;
# [[#Robotic Study Companion: a social robot for students in higher education|Robotic Study Companion: a social robot for students in higher education]]&lt;br /&gt;
# [[#Quantitative Evaluation of the Situation Awareness during the Teleoperation of an Urban Vehicle|Quantitative Evaluation of the Situation Awareness during the Teleoperation of an Urban Vehicle]]&lt;br /&gt;
# [[#Mixed-reality scene creation for vehicle teleoperation|Mixed-reality scene creation for vehicle teleoperation]]&lt;br /&gt;
&lt;br /&gt;
== List of potential thesis topics ==&lt;br /&gt;
Our inventory includes but is not limited to:&lt;br /&gt;
&amp;lt;gallery mode=&amp;quot;packed&amp;quot;&amp;gt;&lt;br /&gt;
File:Youbot.png|120px|thumb|KUKA youBot&lt;br /&gt;
File:Ur5_left.png|120px|thumb|Universal Robot UR5&lt;br /&gt;
File:Franka_Emika_Panda.jpg|120px|thumb|Franka Emika Panda&lt;br /&gt;
File:Clearpath_Jackal.jpg|120px|thumb|Clearpath Jackal&lt;br /&gt;
File:Kinova_KG-3_Gripper.jpg|120px|thumb|Kinova 3-finger gripper&lt;br /&gt;
File:Xarm7.jpg|120px|thumb|UFACTORY xArm7&lt;br /&gt;
File:Robotont_banner.png|120px|thumb|robotont&lt;br /&gt;
File:Turtlebot3-waffle-pi.jpg|thumb|TurtleBot3&lt;br /&gt;
File:Parrot-bebop-2.jpg|thumb|Parrot Bebop 2&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== ROBOTONT: Docker-Driven ROS Environment Switching ===&lt;br /&gt;
This thesis focuses on integrating Docker containers to manage and switch between different ROS (Robot Operating System) environments on Robotont. Currently, ROS is installed natively on Robotont's Ubuntu-based onboard computer, limiting flexibility in system recovery and configuration switching. The goal is to develop a Docker-based solution that allows users to switch between ROS environments directly from Robotont’s existing low-level menu interface.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: designing and implementing a communication protocol for additional devices ===&lt;br /&gt;
Robotont currently includes a designated area on its front for attaching additional devices, such as an ultrasonic range finder, a servo motor connected to an Arduino, or a MikroBUS device. This thesis aims to design and implement a standard communication protocol that will enable seamless integration and control of these devices through the high-level ROS framework running on Robotont’s onboard computer.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: analysis of different options as on-board computers ===&lt;br /&gt;
Currently, ROBOTONT uses an Intel NUC as its onboard computer. The goal of this thesis is to validate Robotont's software stack on alternative compute devices (e.g. Raspberry Pi 4, Intel Compute Stick, and NVIDIA Jetson Nano) and benchmark their performance on the most popular Robotont use-case demos (e.g. webapp teleop, ar-tag steering, 2D mapping, and 3D mapping). The objective is to propose at least two different compute solutions: one that optimizes for cost and another that optimizes for performance.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT Lite ===&lt;br /&gt;
The goal of this thesis is to optimize the ROBOTONT platform for cost by replacing the onboard computer and sensors with low-cost alternatives while ensuring ROS/ROS2 software compatibility.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: integrating a graphical programming interface ===&lt;br /&gt;
The goal of this thesis is to integrate a graphical programming solution (e.g. Scratch or Blockly) to enable programming of Robotont by non-experts. See, for example, https://doi.org/10.48550/arXiv.2011.13706&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== SemuBOT: multiple topics ===&lt;br /&gt;
In 2023/2024, many topics are offered in support of the open-source humanoid robot project SemuBOT (https://www.facebook.com/semubotmtu/).&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Robotic Study Companion: a social robot for students in higher education ===&lt;br /&gt;
Potential Topics:&lt;br /&gt;
* Enhance the Robot's Speech/Natural Language Capabilities &lt;br /&gt;
* Build a Local Language Model for the RSC&lt;br /&gt;
* Develop and Program the Robot’s Behavior and Personality&lt;br /&gt;
* Build a Digital Twin Simulation for Multimodal Interaction&lt;br /&gt;
* Explore the Use of the RSC as an Affective Robot to Address Students’ Academic Emotions&lt;br /&gt;
* Explore and Implement Cybersecurity Measures for Social Robot&lt;br /&gt;
&lt;br /&gt;
[https://github.com/orgs/RobotStudyCompanion/discussions/3 More info on GitHub] | contact farnaz.baksh@ut.ee for further details&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: Quick launch demo suite and the final complete release of its ROS1 software ===&lt;br /&gt;
The goal of this thesis is to refine existing robotont demos (e.g. ar-tag steering, follow-the-leader, dancing-with-robot, LEAP-based control, etc.) and package them in an easy-to-use way for quick deployment by anyone during public events such as science-popularization workshops and school visits.&lt;br /&gt;
The results of this work will be packaged as the final ROS1 release of Robotont software, as ROS1 reaches end-of-life (EOL) in May 2025.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: ROS support, demos, and educational materials for open-source mobile robot ===&lt;br /&gt;
[[Image:RosLarge.png|left|100px|ROS]]&lt;br /&gt;
The project involves many potential thesis topics on the open-source robot platform ROBOTONT. The nature of the thesis can be software development to improve the platform's capabilities, simulation of specific scenarios, and/or demonstration of ROBOTONT in a real-life setting. A more detailed thesis topic will be outlined during an in-person meeting.&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
[[Image:Ros_equation.png|x100px|What is ROS?]]&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Virtual reality user interface (VRUI) for intuitive teleoperation system ===&lt;br /&gt;
[[Image:LeapMotion.png|200px|thumb|right|Detecting 2 hands with Leap Motion Controller]]&lt;br /&gt;
Enhancing the user-experience of a virtual reality UI developed by [https://github.com/ut-ims-robotics/vrui_rviz Georg Astok]. Potentially adding [http://www.osvr.org/hardware-devs.html virtual reality capability] to a [https://www.youtube.com/watch?v=L25HHFd00rc gesture- and natural-language-based robot teleoperation system].&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
[[Image:Temoto-working-principle-in-pics.png|x160px|Gesture-based teleoperation]]&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Health monitor for intuitive telerobot ===&lt;br /&gt;
Intelligent status and error handling for an intuitive telerobotic system.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== 3D scanning of industrial objects ===&lt;br /&gt;
Using laser sensors and cameras to create accurate models of industrial products for quality control or further processing.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Modeling humans for human-robot interaction ===&lt;br /&gt;
True human-robot collaboration means that the robot must understand the actions, intentions, and state of its human partner. This work involves using cameras and other human-sensing devices to digitally represent and model humans. Modeling takes place at multiple levels: a) physical models of human kinematics and dynamics; b) higher-level models for recognizing human intent.&amp;lt;br&amp;gt;&lt;br /&gt;
[[Image:Skeletal ROS.PNG|x160px|ROS &amp;amp; Kinect &amp;amp; Skeleton-Markers Package]]&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Robotic avatar for telepresence ===&lt;br /&gt;
Integrating hand gestures and head movements to control a robot avatar in virtual reality user interface.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Detection of hardware and software resources for smart integration of robots ===&lt;br /&gt;
The vast majority of today’s robotic applications rely on hard-coded device and algorithm usage. This project focuses on developing Resource Snooper software that can detect the addition or removal of resources, enabling dynamic reconfiguration of robotic systems. The project is developed as a subsystem of [https://utnuclearroboticspublic.github.io/temoto2/index.html TeMoto].&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Sonification of feedback during teleoperation of robots ===&lt;br /&gt;
Humans are used to receiving auditory feedback in their everyday lives. It helps us make decisions and be aware of potential dangers. Telerobotic interfaces can deploy the same idea to improve situational awareness and robotic task efficiency. The thesis project involves a study of different sonification solutions and the implementation of one in a telerobotic application using ROS.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Human-Robot and Robot-Robot collaboration applications ===&lt;br /&gt;
Creating a demo or analysis of software capabilities related to human-robot or robot-robot teams:&lt;br /&gt;
* human-robot collaborative assembly&lt;br /&gt;
* distributed mapping; analysis and demo of existing ROS packages for multi-robot mapping (e.g., segmap https://youtu.be/JJhEkIA1xSE)&lt;br /&gt;
* Inaccessible region teamwork&lt;br /&gt;
**youbot+drone - a drone maps the environment (for example, a maze) and a ground vehicle uses this information to traverse the maze&lt;br /&gt;
**youbot+robotont - the youbot cannot go up ledges, but it can lift a smaller robot, such as robotont, up a ledge.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Mirroring human hand movements on industrial robots ===&lt;br /&gt;
The goal of this project is to integrate continuous control of an industrial robot manipulator with a gestural telerobotics interface. The recommended tools for this thesis project are a Leap Motion Controller or a standard web camera, Ultraleap, a Universal Robot UR5 manipulator, and ROS.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== ROBOTONT: TeMoto for robotont ===&lt;br /&gt;
Swarm-management and UMRF-based task loading for robotont using [https://github.com/temoto-framework TeMoto] framework.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Enhancing teleoperation control interface with augmented cues to provoke caution ===&lt;br /&gt;
The task is to create a telerobot control interface where the video feed from the remote site and/or a mixed-reality scene is augmented with visual cues to provoke caution in the human operator.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Robot-to-human interaction ===&lt;br /&gt;
As robots and autonomous machines start sharing the same space as humans, their actions need to be understood by the people occupying that space. For instance, a human worker needs to understand what the robot partner is planning next, and a pedestrian needs to clearly comprehend the behaviour of a driverless vehicle. To reduce ambiguity, the robot needs mechanisms to convey its intent (whatever it is going to do next). The aim of the thesis is to outline existing methods for machines to convey their intent and to develop a unified model interface for expressing that intent.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Gaze-based handover prediction ===&lt;br /&gt;
When a human needs to pass an object to a robot manipulator, the robot must understand where in 3D space the object handover occurs and then plan an appropriate motion. Human gaze can be used as the input for predicting which object to track. The thesis activities involve camera-based eye tracking and safe motion planning.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Real-world demonstrator for MIR+UR+TeMoto integration  ===&lt;br /&gt;
Integration of mobile manipulator (MIR100 + UR5e + Robotiq gripper) to demonstrate TeMoto in a collaborative application.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== ROBOTONT: Human-height human-robot interface for Robotont ground robot ===&lt;br /&gt;
Robotont is an ankle-high flat mobile robot. For humans to interact with Robotont, there is a need for a compact and lightweight mechanical structure that is tall enough for comfortable human-robot interaction. The objective of the thesis is to develop the mechanical designs and build prototypes that ensure stable operation and meet the aesthetic requirements for use in public places. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Stratos Explore Ultraleap demonstrator for robotics ===&lt;br /&gt;
The aim of this thesis is to systematically analyze the strengths and limitations of the Stratos Explore Ultraleap device in the context of controlling a robot, and subsequently to implement a demonstrator application showcasing its applicability for robot control.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Mixed-reality scene creation for vehicle teleoperation ===&lt;br /&gt;
Fusing different sensory feeds for creating high-usability teleoperation scene.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Validation study for AR-based robot user-interfaces ===&lt;br /&gt;
Designing and carrying out a user study to validate the functionality and usability of a human-robot interface.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Replication of the MIT Hydra demo ===&lt;br /&gt;
The goal of the thesis is to use the Hydra software package and integrate it for a robot used at IMS Robotics (e.g., TIAGo, Jackal, Robotont).&lt;br /&gt;
&amp;lt;br&amp;gt;Hydra takes sensor data as input (stereo or RGB-D camera images and IMU data) and produces a hierarchical model of the environment by estimating the trajectory of the robot (including loop closures), building a metric-semantic 3D mesh, and segmenting objects, places, and rooms in an indoor environment. These representations are combined into a 3D scene graph, which enables novel approaches for hierarchical loop closure detection and ensures the representations remain consistent after loop closure. Hydra is implemented in C++ and is ROS-compatible. It uses a neural-network-based image segmentation front-end, but otherwise relies on an efficient, multi-threaded CPU-based implementation, which is suitable for mobile robot deployment.&lt;br /&gt;
&amp;lt;br&amp;gt;LINKS:&lt;br /&gt;
&amp;lt;br&amp;gt;Video: https://youtu.be/qZg2lSeTuvM&lt;br /&gt;
&amp;lt;br&amp;gt;Code: https://github.com/MIT-SPARK/Hydra&lt;br /&gt;
&amp;lt;br&amp;gt;Paper: http://www.roboticsproceedings.org/rss18/p050.pdf&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Quantitative Evaluation of the Situation Awareness during the Teleoperation of an Urban Vehicle ===&lt;br /&gt;
The goal of this thesis is to develop the methodology for measuring the operator's situation awareness while operating an autonomous urban vehicle. Potential metrics could include driving accuracy, speed, and latency. This thesis will be conducted as part of the activities at the [https://adl.cs.ut.ee Autonomous Driving Lab].&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== NAO 'flippin' ===&lt;br /&gt;
The University of Tartu has a set of [https://www.aldebaran.com/en/nao NAO robots], which have been lying on shelves for some years now. It is time to make these robots into functional social robots that can be used, e.g., for children's communication therapy or as buddies for the elderly. The goal of this thesis is to test and upgrade all the NAO robots to functional condition.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Sign-language-based control for robots ===&lt;br /&gt;
The goal of this thesis is to design and implement the means for interacting with a robot via conventional sign language, i.e. the human can command a robot by using sign language.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
= Completed projects =&lt;br /&gt;
== PhD theses ==&lt;br /&gt;
*Robert Valner, [https://hdl.handle.net/10062/105994 Design of TeMoto, a software framework for dependable, adaptive, and collaborative autonomous robots] [TeMoto – töökindlate, adaptiivsete ja koostöövõimeliste autonoomsete robotite arendamise tarkvararaamistik], PhD thesis, 2024&lt;br /&gt;
*Houman Masnavi, [https://hdl.handle.net/10062/91394 Visibility aware navigation] [Nähtavust arvestav navigatsioon], PhD thesis, 2023&lt;br /&gt;
&lt;br /&gt;
== Master's theses ==&lt;br /&gt;
*Robert Allik, Validation of NoMaD as a Global Planner for Mobile Robots, MS thesis, 2024&lt;br /&gt;
*Rauno Põlluäär, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=79393 Designing and Implementing a Bird’s-eye View Interface for a Self-driving Vehicle’s Teleoperation System] [Isejuhtiva sõiduki kaugjuhtimissüsteemile linnuvaate kasutajaliidese loomine], MS thesis, 2024&lt;br /&gt;
*Gautier Reynes, VR-Enhanced Remote Inspection Framework for Semi-Autonomous Robot Fleet, MS thesis, 2024&lt;br /&gt;
*Eva Mõtshärg, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=77385 3D-prinditava kere disain ja analüüs vabavaralisele haridusrobotile Robotont] [Design and Analysis of a 3D Printable Chassis for the Open Source Educational Robot Robotont], MS thesis, 2023&lt;br /&gt;
*Farnaz Baksh, [https://dspace.ut.ee/items/a7a9cc15-27e9-450c-94e8-3267f0c95c56 An Open-source Robotic Study Companion for University Students] [Avatud lähtekoodiga robotõpikaaslane üliõpilastele], MS thesis, 2023&lt;br /&gt;
*Igor Rybalskii, [http://hdl.handle.net/10062/83028 Augmented reality (AR) for enabling human-robot collaboration with ROS robots] [Liitreaalsus inimese ja roboti koostöö võimaldamiseks ROS-i robotitega], MS thesis, 2022&lt;br /&gt;
*Md. Maniruzzaman, [http://hdl.handle.net/10062/83025 Object search and retrieval in indoor environment using a Mobile Manipulator] [Objektide otsimine ja teisaldamine siseruumides mobiilse manipulaatori abil], MS thesis, 2022&lt;br /&gt;
*Allan Kustavus, [http://hdl.handle.net/10062/72651 Design and Implementation of a Generalized Resource Management Architecture in the TeMoto Software Framework] [Üldise ressursihalduri disain ja teostus TeMoto tarkvara raamistikule], MS thesis, 2021&lt;br /&gt;
*Kristina Meister, [http://hdl.handle.net/10062/72350 External human-vehicle interaction - a study in the context of an autonomous ride-hailing service], MS thesis, 2021&lt;br /&gt;
*Muhammad Usman, [http://hdl.handle.net/10062/72126 Development of an Optimization-Based Motion Planner and Its ROS Interface for a Non-Holonomic Mobile Manipulator] [Optimeerimisele baseeruva liikumisplaneerija arendamine ja selle ROSi liides mitteholonoomse mobiilse manipulaatori jaoks], MS thesis, 2020&lt;br /&gt;
*Maarika Oidekivi, [http://hdl.handle.net/10062/72119 Masina kavatsuse väljendamine ja tõlgendamine] [Communicating and interpreting machine intent], MS thesis, 2020&lt;br /&gt;
*Houman Masnavi, [http://hdl.handle.net/10062/72118 Multi-Robot Motion Planning for Shared Payload Transportation] [Rajaplaneerimine multi-robot süsteemile jagatud lasti transportimisel], MS thesis, 2020&lt;br /&gt;
*Fabian Ernesto Parra Gil, [http://hdl.handle.net/10062/72112 Implementation of Robot Manager Subsystem for Temoto Software Framework] [Robotite Halduri alamsüsteemi väljatöötamine tarkvararaamistikule TEMOTO], MS thesis, 2020&lt;br /&gt;
*Zafarullah, [http://hdl.handle.net/10062/72125 Gaze Assisted Neural Network based Prediction of End-Point of Human Reaching Trajectories], MS thesis, 2020&lt;br /&gt;
*Madis K Nigol, [http://hdl.handle.net/10062/64339 Õppematerjalid robotplatvormile Robotont] [Study materials for robot platform Robotont], MS thesis, 2019&lt;br /&gt;
*Renno Raudmäe, [http://hdl.handle.net/10062/64341 Avatud robotplatvorm Robotont] [Open source robotics platform Robotont], MS thesis, 2019&lt;br /&gt;
*Asif Sattar, [http://hdl.handle.net/10062/64352 Human detection and distance estimation with monocular camera using YOLOv3 neural network] [Inimeste tuvastamine ning kauguse hindamine kasutades kaamerat ning YOLOv3 tehisnärvivõrku], MS thesis, 2019&lt;br /&gt;
*Ragnar Margus, [http://hdl.handle.net/10062/64337 Kergliiklusvahendite jagamisteenuseks vajaliku positsioneerimismooduli loomine ja uurimine] [Development and testing of an IoT module for electric vehicle sharing service], MS thesis, 2019&lt;br /&gt;
*Pavel Šumejko, [http://hdl.handle.net/10062/64320 Robust Solution for Extrinsic Calibration of a 2D Laser-Rangefinder and a Monocular USB Camera] [Meetod 2D laserkaugusmõõdiku ja USB-kaamera väliseks kalibreerimiseks], MS thesis, 2019&lt;br /&gt;
*Dzvezdana Arsovska, [http://hdl.handle.net/10062/64321 Building an Efficient and Secure Software Supply Pipeline for Aerial Robotics Application] [Efektiivne ja turvaline tarkvaraarendusahel lennurobootika rakenduses], MS thesis, 2019&lt;br /&gt;
*Tõnis Tiimus, [http://hdl.handle.net/10062/60713 Visuaalselt esilekutsutud potentsiaalidel põhinev aju-arvuti liides robootika rakendustes] [A VEP-based BCI for robotics applications], MS thesis, 2018&lt;br /&gt;
*Martin Appo, [http://hdl.handle.net/10062/60718 Hardware-agnostic compliant control ROS package for collaborative industrial manipulators] [Riistvarapaindlik ROSi tarkvarapakett tööstuslike robotite mööndlikuks juhtimiseks], MS thesis, 2018&lt;br /&gt;
*Hassan Mahmoud Shehawy Elhanash, [http://hdl.handle.net/10062/60711 Optical Tracking of Forearm for Classifying Fingers Poses] [Küünarvarre visuaalne jälgimine ennustamaks sõrmede asendeid], MS thesis, 2018&lt;br /&gt;
&lt;br /&gt;
== Bachelor's theses ==&lt;br /&gt;
*Albert Unn, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=79937 Suhtlusvõimekuse arendamine sotsiaalsele humanoidrobotile SemuBot] [Developing the ability to communicate for SemuBot, a social humanoid robot], BS thesis, 2024&lt;br /&gt;
*Veronika-Marina Volynets, [https://hdl.handle.net/10062/99943 Development of Control Electronics and Program for Robotont's Height Adjustment Mechanism] [Juhtelektroonika ja programmi väljatöötamine Robotondi kõrguse reguleerimise mehhanismile], BS thesis, 2024&lt;br /&gt;
*Nikita Kurenkov, [https://hdl.handle.net/10062/99880 Flexible Screen Integration and Development of Neck Movement Mechanism for Social Humanoid Robot SemuBot] [Humanoidroboti näo lahenduse leidmine ja rakendamine; humanoidroboti jaoks kaela mehhanismi väljatöötamine], BS thesis, 2024&lt;br /&gt;
*Elchin Huseynov, [https://hdl.handle.net/10062/99668 Design and Control of a Social Humanoid Robot - SemuBot’s Hand] [Sotsiaalse Humanoidroboti Disain ja Juhtimine - SemuBoti Käsi], BS thesis, 2024&lt;br /&gt;
*Veronika Podliesnova, [https://hdl.handle.net/10062/99636 Real-Time Detection of Robot Failures by Monitoring Operator’s Brain Activity with EEG-based Brain-Computer Interface] [Reaalajas Robotirikke Tuvastamine Operaatori Ajutegevuse Jälgimise teel EEG-põhise Aju-Arvuti Liidese abil], BS thesis, 2024&lt;br /&gt;
*Märten Josh Peedimaa, [https://hdl.handle.net/not-yet Rhea: An Open-Source Table Tennis Ball Launcher Robot for Multiball Training] [Avatud lähtekoodiga lauatennise palliviske robot mitmikpall treenimiseks], BS thesis, 2024&lt;br /&gt;
*Karl Sander Vinkel, [https://hdl.handle.net/not-yet Laadimisjaama ja transportkesta väljatöötamine õpperobotile Robotont] [Development of charging dock and transport case for Robotont], BS thesis, 2024&lt;br /&gt;
*Raimo Köidam, [https://hdl.handle.net/not-yet Valguslahenduse tarkvara väljatöötamine õpperobotile Robotont] [Development of light solution software for the educational robot Robotont], BS thesis, 2024&lt;br /&gt;
*Robert Valge, [https://hdl.handle.net/not-yet Toitepinge ja tarbevoolu monitoorimine ning toitehalduse püsivara loomine õpperobotil Robotont] [Monitoring supply voltage and current consumption and creating firmware for Robotont’s power management system], BS thesis, 2024&lt;br /&gt;
*Leonid Tšigrinski, [https://hdl.handle.net/not-yet Õpperoboti Robotont püsivara arhitektuuri uuendamine] [Education robot &amp;quot;Robotont&amp;quot; firmware architecture updating], BS thesis, 2024&lt;br /&gt;
*Andres Sakk, [https://hdl.handle.net/not-yet Avatud robotplatvormi Robotont 3 kasutajaliidese väljatöötamine] [Development of a user interface for the open robotics platform Robotont 3], BS thesis, 2024&lt;br /&gt;
*Kaur Kullamäe, [https://hdl.handle.net/not-yet Elektroonikalahendus ja püsivara robotkäe juhtimiseks sotsiaalsel humanoidrobotil SemuBot] [Building the SemuBot arm electronics system], BS thesis, 2024&lt;br /&gt;
*Kristjan Madis Kask, [https://hdl.handle.net/not-yet Käe mehaanika disain sotsiaalsele humanoidrobotile SemuBot] [Arm mechanics design for social humanoid robot SemuBot], BS thesis, 2024&lt;br /&gt;
*Sven-Ervin Paap, [https://hdl.handle.net/not-yet ROS2 draiver õpperobotile Robotont] [ROS2 driver for the educational robot Robotont], BS thesis, 2024&lt;br /&gt;
*Timur Nizamov, [https://hdl.handle.net/10062/99592 Audio System for the Social Humanoid Robot SemuBot] [Helisüsteem sotsiaalsele humanoidrobotile SemuBot], BS thesis, 2024&lt;br /&gt;
*Georgs Narbuts, [https://hdl.handle.net/10062/99554 Holonomic Motion Drive System of a Social Humanoid Robot SemuBot] [Sotsiaalse humanoidroboti SemuBoti holonoomne liikuv ajamisüsteem], BS thesis, 2024&lt;br /&gt;
*Iryna Hurova, [https://hdl.handle.net/10062/90638 Kitting station of the learning factory] [Õppetehase komplekteerimisjaam], BS thesis, 2023&lt;br /&gt;
*Paola Avalos Conchas, [https://hdl.handle.net/10062/90641 Payload transportation system of a learning factory] [Õppetehase kasuliku koorma transpordisüsteem], BS thesis, 2023&lt;br /&gt;
*Pille Pärnalaas, [https://hdl.handle.net/10062/93431 Pöördpõik ajami arendus robotplatvormile Robotont] [Development of swerve drive for robotic platform Robotont], BS thesis, 2023&lt;br /&gt;
*Carl Hjalmar Love Hult, [https://hdl.handle.net/10062/90673 Using and Evaluating the Real-time Spatial Perception System Hydra in Real-world Scenarios] [Reaalajas toimiva ruumilise taju süsteemi Hydra kasutamine ja hindamine praktilistes stsenaariumides], BS thesis, 2023&lt;br /&gt;
*Priit Rooden, [https://hdl.handle.net/10062/93432 Autonoomse laadimislahenduse väljatöötamine õpperobotile Robotont] [Development of an autonomous charging solution for the robot platform Robotont], BS thesis, 2023&lt;br /&gt;
*Marko Muro, [https://hdl.handle.net/10062/93429 Robotondi akulahenduse ning 12 V pingeregulaatori prototüüpimine] [Prototyping battery solution and 12 V voltage regulator for Robotont], BS thesis, 2023&lt;br /&gt;
*Danel Leppenen, [https://hdl.handle.net/10062/93427 Nav2 PYIF: Python-based Motion Planning for ROS 2 Navigation 2] [Nav 2 PYIF: Pythoni põhine liikumise planeerija ROS 2 Navigation 2-le], BS thesis, 2023&lt;br /&gt;
*Kertrud Geddily Küüt, [https://hdl.handle.net/10062/93424 Kõrgust reguleeriv mehhanism Robotondile] [Height adjusting mechanism for Robotont], BS thesis, 2023&lt;br /&gt;
*Ingvar Drikkit, [https://hdl.handle.net/10062/93421 Lisaseadmete võimekuse arendamine haridusrobotile Robotont] [Developing add-on device support for the educational robot Robotont], BS thesis, 2023&lt;br /&gt;
*Kristo Pool, [https://hdl.handle.net/10062/93430 MoveIt 2 õppematerjalid] [Learning materials for MoveIt 2], BS thesis, 2023&lt;br /&gt;
*Erki Veeväli, [https://hdl.handle.net/10062/93434 Development of a Continuous Teleoperation System for Urban Road Vehicle] [Linnasõiduki pideva kaugjuhtimissüsteemi arendus], BS thesis, 2023&lt;br /&gt;
*Aleksandra Doroshenko, [https://hdl.handle.net/10062/93420 Haiglates inimesi juhatava roboti disain] [Hospital guide robot design], BS thesis, 2023&lt;br /&gt;
*Hui Shi, [http://hdl.handle.net/10062/83043 Expanding the Open-source ROS Software Package opencv_apps with Dedicated Blob Detection Functionality] [Avatud lähtekoodiga ROS-i tarkvarakimbu opencv_apps laiendamine laigutuvasti funktsionaalsusega], BS thesis, 2022&lt;br /&gt;
*Dāvis Krūmiņš, [http://hdl.handle.net/10062/83040 Web-based learning and software development environment for remote access of ROS robots] [Veebipõhine õppe- ja tarkvaraarenduse keskkond ROS robotite juurdepääsuks kaugteel], BS thesis, 2022&lt;br /&gt;
*Anna Jakovleva, [http://hdl.handle.net/10062/83037 Roboquiz - an interactive human-robot game] [Roboquiz - interaktiivne inimese ja roboti mäng], BS thesis, 2022&lt;br /&gt;
*Kristjan Laht, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=74702&amp;amp;year=2022 Robot Localization with Fiducial Markers] [Roboti lokaliseerimine koordinaatmärkidega], BS thesis, 2022&lt;br /&gt;
*Hans Pärtel Pani, [http://hdl.handle.net/10062/83008 ROS draiver pehmerobootika haaratsile] [ROS Driver for Soft Robotic Gripper], BS thesis, 2022&lt;br /&gt;
*Markus Erik Sügis, [http://hdl.handle.net/10062/83015 Jagatud juhtimise põhimõttel realiseeritud robotite kaugjuhtimissüsteem] [A continuous teleoperating system based on shared control concept], BS thesis, 2022&lt;br /&gt;
*Taaniel Küla, [http://hdl.handle.net/10062/83007 ROS2 platvormile keevitusroboti tarkvara prototüübi loomine kasutades UR5 robot-manipulaatorit] [Welding robot software prototype for ROS2 using UR5 robot arm], BS thesis, 2022&lt;br /&gt;
*Rauno Põlluäär, [http://hdl.handle.net/10062/72672 Veebirakendus-põhine kasutajaliides avatud robotplatvormi Robotont juhtimiseks ja haldamiseks] [Web application-based user interface for controlling and managing open-source robotics platform Robotont], BS thesis, 2021&lt;br /&gt;
*Hendrik Olesk, [http://hdl.handle.net/10062/72664 Nägemisulatuses kaugjuhitava mobiilse robotmanipulaatori kasutajamugavuse tõstmine] [Improving the usability of a mobile manipulator robot for line-of-sight remote control], BS thesis, 2021&lt;br /&gt;
*Tarvi Tepandi, [http://hdl.handle.net/10062/72665 Segareaalsusel põhinev kasutajaliides mobiilse roboti kaugjuhtimiseks Microsoft HoloLens 2 vahendusel] [Mixed-reality user interface for teleoperating mobile robots with Microsoft HoloLens 2], BS thesis, 2021&lt;br /&gt;
*Rudolf Põldma, [http://hdl.handle.net/10062/72674 Tartu linna Narva maantee ringristmiku digikaksik] [Digital twin for Narva street roundabout in Tartu], BS thesis, 2021&lt;br /&gt;
*Kwasi Akuamoah Boateng, [http://hdl.handle.net/10062/72804 Digital Twin of a Teaching and Learning Robotics Lab] [Robotite õpetamise ja õppimise labori digitaalne kaksik], BS thesis, 2021&lt;br /&gt;
*Karina Sein, [http://hdl.handle.net/10062/72102 Eestikeelse kõnesünteesi võimaldamine robootika arendusplatvormil ROS] [Enabling Estonian speech synthesis on the Robot Operating System (ROS)], BS thesis, 2020&lt;br /&gt;
*Ranno Mäesepp, [http://hdl.handle.net/10062/72100 Takistuste vältimise lahendus õpperobotile Robotont] [Obstacle avoidance solution for educational robot platform Robotont], BS thesis, 2020&lt;br /&gt;
*Igor Rybalskii, [http://hdl.handle.net/10062/72060 Gesture Detection Software for Human-Robot Collaboration] [Žestituvastus tarkvara inimese ja roboti koostööks], BS thesis, 2020&lt;br /&gt;
*Meelis Pihlap, [http://hdl.handle.net/10062/64292 Mitme roboti koostöö funktsionaalsuste väljatöötamine tarkvararaamistikule TeMoto] [Multi-robot collaboration functionalities for robot software development framework TeMoto], BS thesis, 2019&lt;br /&gt;
*Kaarel Mark, [http://hdl.handle.net/10062/64290 Liitreaalsuse kasutamine tootmisprotsessis asukohtade määramisel] [Augmented reality for location determination in manufacturing], BS thesis, 2019&lt;br /&gt;
*Kätriin Julle, [http://hdl.handle.net/10062/64279 Roboti KUKA youBot riistvara ja ROS-tarkvara uuendamine] [Upgrading robot KUKA youBot’s hardware and ROS-software], BS thesis, 2019&lt;br /&gt;
*Georg Astok, [http://hdl.handle.net/10062/64274 Roboti juhtimine virtuaalreaalsuses kasutades ROS-raamistikku] [Creating virtual reality user interface using only ROS framework], BS thesis, 2019&lt;br /&gt;
*Martin Hallist, [http://hdl.handle.net/10062/64275 Robotipõhine kaugkohalolu käte liigutuste ülekandmiseks] [Teleoperation robot for arms motions], BS thesis, 2019&lt;br /&gt;
*Ahmed Hassan Helmy Mohamed, [http://hdl.handle.net/10062/63946 Software integration of autonomous robot system for mixing and serving drinks] [Jooke valmistava ja serveeriva robotsüsteemi tarkvaralahendus], BS thesis, 2019&lt;br /&gt;
*Kristo Allaje, [http://hdl.handle.net/10062/60294 Draiveripakett käte jälgimise seadme Leap Motion™ kontroller kasutamiseks robootika arendusplatvormil ROS] [Driver package for using Leap Motion™ controller on the robotics development platform ROS], BS thesis, 2018&lt;br /&gt;
*Martin Maidla, [http://hdl.handle.net/10062/60288 Avatud robotiarendusplatvormi Robotont omniliikumise ja odomeetria arendamine] [Omnimotion and odometry development for open robot development platform Robotont], BS thesis, 2018&lt;br /&gt;
*Raid Vellerind, [http://hdl.handle.net/10062/56559 Avatud robotiarendusplatvormi ROS võimekuse loomine Tartu Ülikooli Robotexi robootikaplatvormile] [ROS driver development for the University of Tartu’s Robotex robotics platform], BS thesis, 2017&lt;br /&gt;
*Kalle-Gustav Kruus, [http://hdl.handle.net/10062/32440 Jalgpalliroboti löögimehhanismi elektroonikalahendus] [Driver circuit for a kicking mechanism of a football robot], BS thesis, 2013&lt;br /&gt;
&lt;br /&gt;
[[Category:Theses Topics]]&lt;/div&gt;</summary>
		<author><name>Karl</name></author>
	</entry>
	<entry>
		<id>https://ims.ut.ee/index.php?title=Theses_in_Robotics&amp;diff=42611</id>
		<title>Theses in Robotics</title>
		<link rel="alternate" type="text/html" href="https://ims.ut.ee/index.php?title=Theses_in_Robotics&amp;diff=42611"/>
		<updated>2025-05-28T12:26:13Z</updated>

		<summary type="html">&lt;p&gt;Karl: /* Bachelor's theses */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:IMS-robotics]]&lt;br /&gt;
&amp;lt;div class=&amp;quot;toclimit-3&amp;quot;&amp;gt;__TOC__&amp;lt;/div&amp;gt; &lt;br /&gt;
= Projects in Advanced Robotics =&lt;br /&gt;
''The main objective of the following projects is to give students experience in working with advanced robotics technology. Our group is active in several R&amp;amp;D projects involving human-robot collaboration, intuitive teleoperation of robots, and autonomous navigation of unmanned mobile platforms. Our main software platforms are [http://www.ros.org/ Robot Operating System (ROS)] for developing software for advanced robot systems and [http://gazebosim.org/ Gazebo] for running realistic robotic simulations.''&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
For further information, contact [[User:Karl|Karl Kruusamäe]]. &lt;br /&gt;
&lt;br /&gt;
The following is not an exhaustive list of all available thesis/research topics.&lt;br /&gt;
&lt;br /&gt;
== Highlighted thesis topics for 2024/2025 study year ==&lt;br /&gt;
# [[#ROBOTONT: analysis of different options as on-board computers|ROBOTONT: analysis of different options as on-board computers]]&lt;br /&gt;
# [[#SemuBOT: multiple topics|SemuBOT: multiple topics]]&lt;br /&gt;
# [[#ROBOTONT: integrating a graphical programming interface|ROBOTONT: integrating a graphical programming interface]]&lt;br /&gt;
# [[#Sign-language-based control for robots|Sign-language-based control for robots]]&lt;br /&gt;
# [[#Robotic Study Companion: a social robot for students in higher education|Robotic Study Companion: a social robot for students in higher education]]&lt;br /&gt;
# [[#Quantitative Evaluation of the Situation Awareness during the Teleoperation of an Urban Vehicle|Quantitative Evaluation of the Situation Awareness during the Teleoperation of an Urban Vehicle]]&lt;br /&gt;
# [[#Mixed-reality scene creation for vehicle teleoperation|Mixed-reality scene creation for vehicle teleoperation]]&lt;br /&gt;
&lt;br /&gt;
== List of potential thesis topics ==&lt;br /&gt;
Our inventory includes but is not limited to:&lt;br /&gt;
&amp;lt;gallery mode=&amp;quot;packed&amp;quot;&amp;gt;&lt;br /&gt;
File:Youbot.png|120px|thumb|KUKA youBot&lt;br /&gt;
File:Ur5_left.png|120px|thumb|Universal Robot UR5&lt;br /&gt;
File:Franka_Emika_Panda.jpg|120px|thumb|Franka Emika Panda&lt;br /&gt;
File:Clearpath_Jackal.jpg|120px|thumb|Clearpath Jackal&lt;br /&gt;
File:Kinova_KG-3_Gripper.jpg|120px|thumb|Kinova 3-finger gripper&lt;br /&gt;
File:Xarm7.jpg|120px|thumb|UFACTORY xArm7&lt;br /&gt;
File:Robotont_banner.png|120px|thumb|robotont&lt;br /&gt;
File:Turtlebot3-waffle-pi.jpg|thumb|TurtleBot3&lt;br /&gt;
File:Parrot-bebop-2.jpg|thumb|Parrot Bebop 2&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== ROBOTONT: Docker-Driven ROS Environment Switching ===&lt;br /&gt;
This thesis focuses on integrating Docker containers to manage and switch between different ROS (Robot Operating System) environments on Robotont. Currently, ROS is installed natively on Robotont's Ubuntu-based onboard computer, limiting flexibility in system recovery and configuration switching. The goal is to develop a Docker-based solution that allows users to switch between ROS environments directly from Robotont’s existing low-level menu interface.&lt;br /&gt;
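As a rough illustration of the direction, the low-level menu could map each entry to a Docker invocation that starts the selected environment. The image naming scheme (`robotont/ros:&lt;distro&gt;`) and the flags below are assumptions for this sketch, not part of the existing Robotont software.

```python
# Hypothetical sketch: build a `docker run` command line for switching between
# ROS environments on Robotont. Image names, paths, and flags are assumptions.

def build_ros_container_cmd(ros_distro, workspace="/home/robotont/ws"):
    """Return the docker command line for launching a given ROS environment."""
    image = f"robotont/ros:{ros_distro}"  # assumed image naming scheme
    return [
        "docker", "run", "--rm", "-it",
        "--net=host",                 # share the host network so ROS nodes stay reachable
        "-v", f"{workspace}:/ws",     # mount the user's workspace into the container
        image,
    ]

# The menu handler could pass the selected entry straight to subprocess.run(...).
cmd = build_ros_container_cmd("noetic")
```

Keeping the native Ubuntu install untouched and running every ROS environment in its own container is what would make recovery and configuration switching cheap.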
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: designing and implementing a communication protocol for additional devices ===&lt;br /&gt;
Robotont currently includes a designated area on its front for attaching additional devices, such as an ultrasonic range finder, a servo motor connected to an Arduino, or a MikroBUS device. This thesis aims to design and implement a standard communication protocol that will enable seamless integration and control of these devices through the high-level ROS framework running on Robotont’s onboard computer.&lt;br /&gt;
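One hypothetical shape such a protocol could take is a simple byte-oriented frame carrying a device id, a payload length, and a checksum, which the ROS side would encode and the microcontroller would decode. The layout below (start byte, XOR checksum) is purely illustrative, not an existing Robotont specification.

```python
# Hypothetical sketch of a minimal framed protocol for Robotont add-on devices.
# Frame layout (assumed): [START][device_id][length][payload...][xor checksum]

START = 0xAA  # assumed start-of-frame marker

def encode_frame(device_id, payload: bytes) -> bytes:
    """Wrap a payload for one add-on device into a checksummed frame."""
    body = bytes([device_id, len(payload)]) + payload
    checksum = 0
    for b in body:
        checksum ^= b
    return bytes([START]) + body + bytes([checksum])

def decode_frame(frame: bytes):
    """Return (device_id, payload), or raise ValueError on a corrupt frame."""
    if len(frame) < 4 or frame[0] != START:
        raise ValueError("bad frame start")
    body, checksum = frame[1:-1], frame[-1]
    check = 0
    for b in body:
        check ^= b
    if check != checksum:
        raise ValueError("checksum mismatch")
    device_id, length = body[0], body[1]
    payload = body[2:]
    if len(payload) != length:
        raise ValueError("length mismatch")
    return device_id, payload
```

A real design would also have to settle device discovery and how frames surface as ROS topics or services, which this sketch leaves open.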
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: analysis of different options as on-board computers ===&lt;br /&gt;
Currently, ROBOTONT uses an Intel NUC as its onboard computer. The goal of this thesis is to validate Robotont's software stack on alternative compute devices (e.g. Raspberry Pi 4, Intel Compute Stick, and NVIDIA Jetson Nano) and benchmark their performance on the most popular Robotont use-case demos (e.g. webapp teleop, ar-tag steering, 2D mapping, and 3D mapping). The objective is to propose at least two different compute solutions: one that optimizes for cost and another that optimizes for performance. &lt;br /&gt;
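A minimal sketch of how the benchmarking could be organized: run the same workload on each candidate computer and compare mean wall-clock durations. The workload below is a placeholder; in practice it would be one step of a Robotont demo (e.g. a single mapping update).

```python
# Hypothetical benchmarking harness for comparing candidate onboard computers.
# `workload` stands in for an actual demo step; timings use the monotonic
# high-resolution clock so results are comparable across runs.
import time

def benchmark(workload, repeats=5):
    """Run `workload` several times and return the mean wall-clock seconds."""
    durations = []
    for _ in range(repeats):
        start = time.perf_counter()
        workload()
        durations.append(time.perf_counter() - start)
    return sum(durations) / len(durations)

# Example: compare two placeholder workloads on the current machine.
mean_s = benchmark(lambda: sum(range(10_000)))
```

Running the identical harness on each device keeps the comparison fair; only the hardware varies between runs.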
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT Lite ===&lt;br /&gt;
The goal of this thesis is to optimize the ROBOTONT platform for cost by replacing the onboard computer and sensors with low-cost alternatives while ensuring ROS/ROS2 software compatibility. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: integrating a graphical programming interface ===&lt;br /&gt;
The goal of this thesis is to integrate a graphical programming solution (e.g. Scratch or Blockly) to enable non-experts to program Robotont. See, e.g., https://doi.org/10.48550/arXiv.2011.13706&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== SemuBOT: multiple topics ===&lt;br /&gt;
In 2023/2024, many topics are offered to support the development of the open-source humanoid robot project SemuBOT (https://www.facebook.com/semubotmtu/).&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Robotic Study Companion: a social robot for students in higher education ===&lt;br /&gt;
Potential Topics:&lt;br /&gt;
* Enhance the Robot's Speech/Natural Language Capabilities &lt;br /&gt;
* Build a Local Language Model for the RSC&lt;br /&gt;
* Develop and Program the Robot’s Behavior and Personality&lt;br /&gt;
* Build a Digital Twin Simulation for Multimodal Interaction&lt;br /&gt;
* Explore the Use of the RSC as an Affective Robot to Address Students’ Academic Emotions&lt;br /&gt;
* Explore and Implement Cybersecurity Measures for a Social Robot&lt;br /&gt;
&lt;br /&gt;
[https://github.com/orgs/RobotStudyCompanion/discussions/3 More info on Github] | reach out to farnaz.baksh@ut.ee for more info&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: Quick launch demo suite and the final complete release of its ROS1 software ===&lt;br /&gt;
The goal of this thesis is to refine existing Robotont demos (e.g. ar-tag steering, follow-the-leader, dancing-with-robot, LEAP-based control, etc.) and package them in an easy-to-use way for quick deployment by anyone during public events such as science popularization workshops and school visits.&lt;br /&gt;
The results of this work will be packaged as the final ROS1 release of Robotont software as the EOL for ROS1 is in May 2025.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: ROS support, demos, and educational materials for open-source mobile robot ===&lt;br /&gt;
[[Image:RosLarge.png|left|100px|ROS]]&lt;br /&gt;
The project includes many potential thesis topics on the open-source robot platform ROBOTONT. The thesis can involve software development to improve the platform's capabilities, simulation of specific scenarios, and/or demonstration of ROBOTONT in a real-life setting. A more detailed thesis topic will be outlined during an in-person meeting.&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
[[Image:Ros_equation.png|x100px|What is ROS?]]&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Virtual reality user interface (VRUI) for intuitive teleoperation system ===&lt;br /&gt;
[[Image:LeapMotion.png|200px|thumb|right|Detecting 2 hands with Leap Motion Controller]]&lt;br /&gt;
Enhancing the user-experience of a virtual reality UI developed by [https://github.com/ut-ims-robotics/vrui_rviz Georg Astok]. Potentially adding [http://www.osvr.org/hardware-devs.html virtual reality capability] to a [https://www.youtube.com/watch?v=L25HHFd00rc gesture- and natural-language-based robot teleoperation system].&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
[[Image:Temoto-working-principle-in-pics.png|x160px|Gesture-based teleoperation]]&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Health monitor for intuitive telerobot ===&lt;br /&gt;
Intelligent status and error handling for an intuitive telerobotic system.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== 3D scanning of industrial objects ===&lt;br /&gt;
Using laser sensors and cameras to create accurate models of industrial products for quality control or further processing.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Modeling humans for human-robot interaction ===&lt;br /&gt;
True human-robot collaboration means that the robot must understand the actions, intention, and state of its human partner. This work involves using cameras and other human sensors to digitally represent and model humans. There are multiple stages of modeling: a) physical models of human kinematics and dynamics; b) higher-level models for recognizing human intent.&amp;lt;br&amp;gt;&lt;br /&gt;
[[Image:Skeletal ROS.PNG|x160px|ROS &amp;amp; Kinect &amp;amp; Skeleton-Markers Package]]&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Robotic avatar for telepresence ===&lt;br /&gt;
Integrating hand gestures and head movements to control a robot avatar in virtual reality user interface.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Detection of hardware and software resources for smart integration of robots ===&lt;br /&gt;
The vast majority of today’s robotic applications rely on hard-coded device and algorithm usage. This project focuses on developing Resource Snooper software that can detect the addition or removal of resources, enabling dynamic reconfiguration of robotic systems. This project is developed as a subsystem of [https://utnuclearroboticspublic.github.io/temoto2/index.html TeMoto].&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Sonification of feedback during teleoperation of robots ===&lt;br /&gt;
Humans are used to receiving auditory feedback in their everyday lives. It helps us make decisions and be aware of potential dangers. Telerobotic interfaces can deploy the same idea to improve situational awareness and robotic task efficiency. The thesis project involves a study of different sonification solutions and their implementation in a telerobotic application using ROS. &lt;br /&gt;
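As a toy example of one sonification policy, obstacle distance could be mapped to the pitch of a warning tone, with closer obstacles sounding higher. The distance range and frequency band below are illustrative assumptions, not results from the literature study.

```python
# Hypothetical sketch: map an obstacle distance (metres) to a warning-tone
# frequency (Hz). Closer obstacles -> higher pitch; far obstacles -> silence.
# All thresholds here are illustrative assumptions.

def distance_to_pitch(distance_m, d_min=0.2, d_max=2.0, f_low=220.0, f_high=880.0):
    """Return a tone frequency in Hz, or 0.0 (silence) beyond d_max."""
    if distance_m >= d_max:
        return 0.0
    d = max(distance_m, d_min)          # clamp very close readings
    # Linear interpolation: d_min maps to f_high, d_max maps to f_low.
    t = (d_max - d) / (d_max - d_min)
    return f_low + t * (f_high - f_low)
```

In a ROS node, the output of such a function would drive an audio synthesis topic each time a new laser-scan minimum arrives; the user study would then compare policies like this one.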
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Human-Robot and Robot-Robot collaboration applications ===&lt;br /&gt;
Creating a demo or analysis of software capabilities related to human-robot or robot-robot teams:&lt;br /&gt;
* human-robot collaborative assembly&lt;br /&gt;
* distributed mapping; analysis and demo of existing ROS (e.g., segmap https://youtu.be/JJhEkIA1xSE) packages for multi-robot mapping&lt;br /&gt;
* Inaccessible region teamwork&lt;br /&gt;
**youbot+drone - a drone maps the environment (for example, a maze) and a ground vehicle uses this information to traverse the maze&lt;br /&gt;
**youbot+robotont - youbot cannot go up ledges, but it can lift a smaller robot, such as robotont, up a ledge.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Mirroring human hand movements on industrial robots ===&lt;br /&gt;
The goal of this project is to integrate continuous control of an industrial robot manipulator with a gestural telerobotics interface. The recommended tools for this thesis project are a Leap Motion Controller or a standard web camera, Ultraleap, a Universal Robots UR5 manipulator, and ROS.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== ROBOTONT: TeMoto for robotont ===&lt;br /&gt;
Swarm-management and UMRF-based task loading for robotont using [https://github.com/temoto-framework TeMoto] framework.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Enhancing teleoperation control interface with augmented cues to provoke caution ===&lt;br /&gt;
The task is to create a telerobot control interface where the video feed from the remote site and/or a mixed-reality scene is augmented with visual cues to provoke caution in the human operator.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Robot-to-human interaction ===&lt;br /&gt;
As robots and autonomous machines start sharing the same space as humans, their actions need to be understood by the people occupying that space. For instance, a human worker needs to understand what the robot partner is planning next, or a pedestrian needs to clearly comprehend the behaviour of a driverless vehicle. To reduce this ambiguity, the robot needs mechanisms to convey its intent (whatever it is going to do next). The aim of the thesis is to outline existing methods for machines to convey their intent and develop a unified model interface for expressing that intent. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Gaze-based handover prediction ===&lt;br /&gt;
When a human needs to pass an object to a robot manipulator, the robot must understand where in 3D space the object handover occurs and then plan an appropriate motion. Human gaze can be used as the input for predicting which object to track. The thesis activities involve camera-based eye tracking and safe motion planning.&lt;br /&gt;
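A naive baseline for such a predictor could simply pick the tracked object whose position lies closest to the gaze ray; the geometry helper below is a sketch under that assumption, standing in for the learned prediction the topic actually targets.

```python
# Hypothetical baseline: choose the tracked object closest to the gaze ray.
# Gaze origin/direction would come from an eye tracker; object positions from
# a perception pipeline. All coordinates are illustrative.

def point_to_ray_distance(point, origin, direction):
    """Distance from a 3D point to a ray; `direction` must be unit length."""
    v = [p - o for p, o in zip(point, origin)]
    # Project onto the ray, clamping so points behind the eye use the origin.
    t = max(0.0, sum(a * b for a, b in zip(v, direction)))
    closest = [o + t * d for o, d in zip(origin, direction)]
    return sum((p - c) ** 2 for p, c in zip(point, closest)) ** 0.5

def predict_target(objects, gaze_origin, gaze_direction):
    """Return the object position the gaze ray passes closest to."""
    return min(objects, key=lambda obj: point_to_ray_distance(obj, gaze_origin, gaze_direction))
```

The predicted target would then seed the motion planner with a candidate handover region well before the human's hand arrives.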
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Real-world demonstrator for MIR+UR+TeMoto integration  ===&lt;br /&gt;
Integration of mobile manipulator (MIR100 + UR5e + Robotiq gripper) to demonstrate TeMoto in a collaborative application.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== ROBOTONT: Human-height human-robot interface for Robotont ground robot ===&lt;br /&gt;
Robotont is an ankle-high flat mobile robot. For humans to interact with Robotont, there is a need for a compact and lightweight mechanical structure that is tall enough for comfortable human-robot interaction. The objective of the thesis is to develop the mechanical designs and build prototypes that ensure stable operation and meet the aesthetic requirements for use in public places. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Stratos Explore Ultraleap demonstrator for robotics ===&lt;br /&gt;
The aim of this thesis is to systematically analyze the strengths and limitations of the Stratos Explore Ultraleap device in the context of controlling a robot, and subsequently implement a demonstrator application showing its applicability to robot control. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Mixed-reality scene creation for vehicle teleoperation ===&lt;br /&gt;
Fusing different sensory feeds for creating high-usability teleoperation scene.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Validation study for AR-based robot user-interfaces ===&lt;br /&gt;
Designing and carrying out a user study to validate the functionality and usability of a human-robot interface.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Replication of the MIT Hydra demo ===&lt;br /&gt;
The goal of the thesis is to use the Hydra software package and integrate it for a robot used at IMS Robotics (e.g., TIAGo, Jackal, Robotont).&lt;br /&gt;
&amp;lt;br&amp;gt;Hydra takes sensor data as input (stereo or RGB-D camera images and IMU data) and produces a hierarchical model of the environment by estimating the trajectory of the robot (including loop closures), building a metric-semantic 3D mesh, and segmenting objects, places, and rooms in an indoor environment. These representations are combined into a 3D scene graph, which enables novel approaches for hierarchical loop closure detection and ensures the representations remain consistent after loop closure. Hydra is implemented in C++ and is ROS-compatible. It uses a neural-network-based image segmentation front-end, but otherwise relies on an efficient, multi-threaded CPU-based implementation, which is suitable for mobile robot deployment.&lt;br /&gt;
&amp;lt;br&amp;gt;LINKS:&lt;br /&gt;
&amp;lt;br&amp;gt;Video: https://youtu.be/qZg2lSeTuvM&lt;br /&gt;
&amp;lt;br&amp;gt;Code: https://github.com/MIT-SPARK/Hydra&lt;br /&gt;
&amp;lt;br&amp;gt;Paper: http://www.roboticsproceedings.org/rss18/p050.pdf&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Quantitative Evaluation of the Situation Awareness during the Teleoperation of an Urban Vehicle ===&lt;br /&gt;
The goal of this thesis is to develop the methodology for measuring the operator's situation awareness while operating an autonomous urban vehicle. Potential metrics could include driving accuracy, speed, and latency. This thesis will be conducted as part of the activities at the [https://adl.cs.ut.ee Autonomous Driving Lab].&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== NAO 'flippin' ===&lt;br /&gt;
The University of Tartu has a set of [https://www.aldebaran.com/en/nao NAO robots], which have been lying on shelves for some years now. It is time to make these robots into functional social robots that can be used, e.g., for children's communication therapy or as buddies for the elderly. The goal of this thesis is to test and upgrade all the NAO robots to functional condition.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Sign-language-based control for robots ===&lt;br /&gt;
The goal of this thesis is to design and implement the means for interacting with a robot via conventional sign language, i.e., the human can command a robot using sign language.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
= Completed projects =&lt;br /&gt;
== PhD theses ==&lt;br /&gt;
*Robert Valner, [https://hdl.handle.net/10062/105994 Design of TeMoto, a software framework for dependable, adaptive, and collaborative autonomous robots] [TeMoto – töökindlate, adaptiivsete ja koostöövõimeliste autonoomsete robotite arendamise tarkvararaamistik], PhD thesis, 2024&lt;br /&gt;
*Houman Masnavi, [https://hdl.handle.net/10062/91394 Visibility aware navigation] [Nähtavust arvestav navigatsioon], PhD thesis, 2023&lt;br /&gt;
&lt;br /&gt;
== Masters's theses ==&lt;br /&gt;
*Robert Allik, Validation of NoMaD as a Global Planner for Mobile Robots, MS thesis, 2024&lt;br /&gt;
*Rauno Põlluäär, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=79393 Designing and Implementing a Bird’s-eye View Interface for a Self-driving Vehicle’s Teleoperation System] [Isejuhtiva sõiduki kaugjuhtimissüsteemile linnuvaate kasutajaliidese loomine], MS thesis, 2024&lt;br /&gt;
*Gautier Reynes, VR-Enhanced Remote Inspection Framework for Semi-Autonomous Robot Fleet, MS thesis, 2024&lt;br /&gt;
*Eva Mõtshärg, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=77385 3D-prinditava kere disain ja analüüs vabavaralisele haridusrobotile Robotont] [Design and Analysis of a 3D Printable Chassis for the Open Source Educational Robot Robotont], MS thesis, 2023&lt;br /&gt;
*Farnaz Baksh, [https://dspace.ut.ee/items/a7a9cc15-27e9-450c-94e8-3267f0c95c56 An Open-source Robotic Study Companion for University Students] [Avatud lähtekoodiga robotõpikaaslane üliõpilastele], MS thesis, 2023&lt;br /&gt;
*Igor Rybalskii, [http://hdl.handle.net/10062/83028 Augmented reality (AR) for enabling human-robot collaboration with ROS robots] [Liitreaalsus inimese ja roboti koostöö võimaldamiseks ROS-i robotitega], MS thesis, 2022&lt;br /&gt;
*Md. Maniruzzaman, [http://hdl.handle.net/10062/83025 Object search and retrieval in indoor environment using a Mobile Manipulator] [Objektide otsimine ja teisaldamine siseruumides mobiilse manipulaatori abil], MS thesis, 2022&lt;br /&gt;
*Allan Kustavus, [http://hdl.handle.net/10062/72651 Design and Implementation of a Generalized Resource Management Architecture in the TeMoto Software Framework] [Üldise ressursihalduri disain ja teostus TeMoto tarkvara raamistikule], MS thesis, 2021&lt;br /&gt;
*Kristina Meister, [http://hdl.handle.net/10062/72350 External human-vehicle interaction - a study in the context of an autonomous ride-hailing service], MS thesis, 2021&lt;br /&gt;
*Muhammad Usman, [http://hdl.handle.net/10062/72126 Development of an Optimization-Based Motion Planner and Its ROS Interface for a Non-Holonomic Mobile Manipulator] [Optimeerimisele baseeruva liikumisplaneerija arendamine ja selle ROSi liides mitteholonoomse mobiilse manipulaatori jaoks], MS thesis, 2020&lt;br /&gt;
*Maarika Oidekivi, [http://hdl.handle.net/10062/72119 Masina kavatsuse väljendamine ja tõlgendamine] [Communicating and interpreting machine intent], MS thesis, 2020&lt;br /&gt;
*Houman Masnavi, [http://hdl.handle.net/10062/72118 Multi-Robot Motion Planning for Shared Payload Transportation] [Rajaplaneerimine multi-robot süsteemile jagatud lasti transportimisel], MS thesis, 2020&lt;br /&gt;
*Fabian Ernesto Parra Gil, [http://hdl.handle.net/10062/72112 Implementation of Robot Manager Subsystem for Temoto Software Framework] [Robotite Halduri alamsüsteemi väljatöötamine tarkvararaamistikule TEMOTO], MS thesis, 2020&lt;br /&gt;
*Zafarullah, [http://hdl.handle.net/10062/72125 Gaze Assisted Neural Network based Prediction of End-Point of Human Reaching Trajectories], MS thesis, 2020&lt;br /&gt;
*Madis K Nigol, [http://hdl.handle.net/10062/64339 Õppematerjalid robotplatvormile Robotont] [Study materials for robot platform Robotont], MS thesis, 2019&lt;br /&gt;
*Renno Raudmäe, [http://hdl.handle.net/10062/64341 Avatud robotplatvorm Robotont] [Open source robotics platform Robotont], MS thesis, 2019&lt;br /&gt;
*Asif Sattar, [http://hdl.handle.net/10062/64352 Human detection and distance estimation with monocular camera using YOLOv3 neural network] [Inimeste tuvastamine ning kauguse hindamine kasutades kaamerat ning YOLOv3 tehisnärvivõrku], MS thesis, 2019&lt;br /&gt;
*Ragnar Margus, [http://hdl.handle.net/10062/64337 Kergliiklusvahendite jagamisteenuseks vajaliku positsioneerimismooduli loomine ja uurimine] [Development and testing of an IoT module for electric vehicle sharing service], MS thesis, 2019&lt;br /&gt;
*Pavel Šumejko, [http://hdl.handle.net/10062/64320 Robust Solution for Extrinsic Calibration of a 2D Laser-Rangefinder and a Monocular USB Camera] [Meetod 2D laserkaugusmõõdiku ja USB-kaamera väliseks kalibreerimiseks], MS thesis, 2019&lt;br /&gt;
*Dzvezdana Arsovska, [http://hdl.handle.net/10062/64321  Building an Efficient and Secure Software Supply Pipeline for Aerial Robotics Application] [Efektiivne ja turvaline tarkvaraarendusahel lennurobootika rakenduses], MS thesis, 2019&lt;br /&gt;
*Tõnis Tiimus, [http://hdl.handle.net/10062/60713 Visuaalselt esilekutsutud potentsiaalidel põhinev aju-arvuti liides robootika rakendustes] [A VEP-based BCI for robotics applications], MS thesis, 2018&lt;br /&gt;
*Martin Appo, [http://hdl.handle.net/10062/60718 Hardware-agnostic compliant control ROS package for collaborative industrial manipulators] [Riistvarapaindlik ROSi tarkvarapakett tööstuslike robotite mööndlikuks juhtimiseks], MS thesis, 2018&lt;br /&gt;
*Hassan Mahmoud Shehawy Elhanash, [http://hdl.handle.net/10062/60711 Optical Tracking of Forearm for Classifying Fingers Poses] [Küünarvarre visuaalne jälgimine ennustamaks sõrmede asendeid], MS thesis, 2018&lt;br /&gt;
&lt;br /&gt;
== Bachelor's theses ==&lt;br /&gt;
*Albert Unn, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=79937 Suhtlusvõimekuse arendamine sotsiaalsele humanoidrobotile SemuBot] [Developing the ability to communicate for SemuBot, a social humanoid robot], BS thesis, 2024&lt;br /&gt;
*Veronika-Marina Volynets, [https://hdl.handle.net/10062/99943 Development of Control Electronics and Program for Robotont's Height Adjustment Mechanism] [Juhtelektroonika ja programmi väljatöötamine Robotondi kõrguse reguleerimise mehhanismile], BS thesis, 2024&lt;br /&gt;
*Nikita Kurenkov, [https://hdl.handle.net/10062/99880 Flexible Screen Integration and Development of Neck Movement Mechanism for Social Humanoid Robot SemuBot] [Humanoidroboti näo lahenduse leidmine ja rakendamine; humanoidroboti jaoks kaela mehhanismi väljatöötamine], BS thesis, 2024&lt;br /&gt;
*Elchin Huseynov, [https://hdl.handle.net/10062/99668 Design and Control of a Social Humanoid Robot - SemuBot’s Hand] [Sotsiaalse Humanoidroboti Disain ja Juhtimine - SemuBoti Käsi], BS thesis, 2024&lt;br /&gt;
*Veronika Podliesnova, [https://hdl.handle.net/10062/99636 Real-Time Detection of Robot Failures by Monitoring Operator’s Brain Activity with EEG-based Brain-Computer Interface] [Reaalajas Robotirikke Tuvastamine Operaatori Ajutegevuse Jälgimise teel EEG-põhise Aju-Arvuti Liidese abil], BS thesis, 2024&lt;br /&gt;
*Märten Josh Peedimaa, [https://hdl.handle.net/not-yet Rhea: An Open-Source Table Tennis Ball Launcher Robot for Multiball Training] [Avatud lähtekoodiga lauatennise palliviske robot mitmikpall treenimiseks], BS thesis, 2024&lt;br /&gt;
*Karl Sander Vinkel, [https://hdl.handle.net/not-yet Laadimisjaama ja transportkesta väljatöötamine õpperobotile Robotont] [Development of charging dock and transport case for Robotont], BS thesis, 2024&lt;br /&gt;
*Raimo Köidam, [https://hdl.handle.net/not-yet Valguslahenduse tarkvara väljatöötamine õpperobotile Robotont] [Development of light solution software for the educational robot Robotont], BS thesis, 2024&lt;br /&gt;
*Robert Valge, [https://hdl.handle.net/not-yet Toitepinge ja tarbevoolu monitoorimine ning toitehalduse püsivara loomine õpperobotil Robotont] [Monitoring supply voltage and current consumption and creating firmware for Robotont’s power management system], BS thesis, 2024&lt;br /&gt;
*Leonid Tšigrinski, [https://hdl.handle.net/not-yet Õpperoboti Robotont püsivara arhitektuuri uuendamine] [Education robot &amp;quot;Robotont&amp;quot; firmware architecture updating], BS thesis, 2024&lt;br /&gt;
*Andres Sakk, [https://hdl.handle.net/not-yet Avatud robotplatvormi Robotont 3 kasutajaliidese väljatöötamine] [Development of a user interface for the open robotics platform Robotont 3], BS thesis, 2024&lt;br /&gt;
*Kaur Kullamäe, [https://hdl.handle.net/not-yet Elektroonikalahendus ja püsivara robotkäe juhtimiseks sotsiaalsel humanoidrobotil SemuBot] [Building the SemuBot arm electronics system], BS thesis, 2024&lt;br /&gt;
*Kristjan Madis Kask, [https://hdl.handle.net/not-yet Käe mehaanika disain sotsiaalsele humanoidrobotile SemuBot] [Arm mechanics design for social humanoid robot SemuBot], BS thesis, 2024&lt;br /&gt;
*Sven-Ervin Paap, [https://hdl.handle.net/not-yet ROS2 draiver õpperobotile Robotont] [ROS2 driver for the educational robot Robotont], BS thesis, 2024&lt;br /&gt;
*Timur Nizamov, [https://hdl.handle.net/10062/99592 Audio System for the Social Humanoid Robot SemuBot] [Helisüsteem sotsiaalsele humanoidrobotile SemuBot], BS thesis, 2024&lt;br /&gt;
*Georgs Narbuts, [https://hdl.handle.net/10062/99554 Holonomic Motion Drive System of a Social Humanoid Robot SemuBot] [Sotsiaalse humanoidroboti SemuBoti holonoomne liikuv ajamisüsteem], BS thesis, 2024&lt;br /&gt;
*Iryna Hurova, [https://hdl.handle.net/10062/90638 Kitting station of the learning factory] [Õppetehase komplekteerimisjaam], BS thesis, 2023&lt;br /&gt;
*Paola Avalos Conchas, [https://hdl.handle.net/10062/90641 Payload transportation system of a learning factory] [Õppetehase kasuliku koorma transpordisüsteem], BS thesis, 2023&lt;br /&gt;
*Pille Pärnalaas, [https://hdl.handle.net/10062/93431 Pöördpõik ajami arendus robotplatvormile Robotont] [Development of swerve drive for robotic platform Robotont], BS thesis, 2023&lt;br /&gt;
*Carl Hjalmar Love Hult, [https://hdl.handle.net/10062/90673 Using and Evaluating the Real-time Spatial Perception System Hydra in Real-world Scenarios] [Reaalajas toimiva ruumilise taju süsteemi Hydra kasutamine ja hindamine praktilistes stsenaariumides], BS thesis, 2023&lt;br /&gt;
*Priit Rooden, [https://hdl.handle.net/10062/93432 Autonoomse laadimislahenduse väljatöötamine õpperobotile Robotont] [Development of an autonomous charging solution for the robot platform Robotont], BS thesis, 2023&lt;br /&gt;
*Marko Muro, [https://hdl.handle.net/10062/93429 Robotondi akulahenduse ning 12 V pingeregulaatori prototüüpimine] [Prototyping battery solution and 12 V voltage regulator for Robotont], BS thesis, 2023&lt;br /&gt;
*Danel Leppenen, [https://hdl.handle.net/10062/93427 Nav2 PYIF: Python-based Motion Planning for ROS 2 Navigation 2] [Nav 2 PYIF: Pythoni põhine liikumise planeerija ROS 2 Navigation 2-le], BS thesis, 2023&lt;br /&gt;
*Kertrud Geddily Küüt, [https://hdl.handle.net/10062/93424 Kõrgust reguleeriv mehhanism Robotondile] [Height adjusting mechanism for Robotont], BS thesis, 2023&lt;br /&gt;
*Ingvar Drikkit, [https://hdl.handle.net/10062/93421 Lisaseadmete võimekuse arendamine haridusrobotile Robotont] [Developing add-on device support for the educational robot Robotont], BS thesis, 2023&lt;br /&gt;
*Kristo Pool, [https://hdl.handle.net/10062/93430 MoveIt 2 õppematerjalid] [Learning materials for MoveIt 2], BS thesis, 2023&lt;br /&gt;
*Erki Veeväli, [https://hdl.handle.net/10062/93434 Development of a Continuous Teleoperation System for Urban Road Vehicle] [Linnasõiduki pideva kaugjuhtimissüsteemi arendus], BS thesis, 2023&lt;br /&gt;
*Aleksandra Doroshenko, [https://hdl.handle.net/10062/93420 Haiglates inimesi juhatava roboti disain] [Hospital guide robot design], BS thesis, 2023&lt;br /&gt;
*Hui Shi, [http://hdl.handle.net/10062/83043 Expanding the Open-source ROS Software Package opencv_apps with Dedicated Blob Detection Functionality] [Avatud lähtekoodiga ROS-i tarkvarakimbu opencv_apps laiendamine laigutuvasti funktsioonaalsusega], BS thesis, 2022&lt;br /&gt;
*Dāvis Krūmiņš, [http://hdl.handle.net/10062/83040 Web-based learning and software development environment for remote access of ROS robots] [Veebipõhine õppe- ja tarkvaraarenduse keskkond ROS robotite juurdepääsuks kaugteel], BS thesis, 2022&lt;br /&gt;
*Anna Jakovleva, [http://hdl.handle.net/10062/83037 Roboquiz - an interactive human-robot game] [Roboquiz - interaktiivne inimese ja roboti mäng], BS thesis, 2022&lt;br /&gt;
*Kristjan Laht, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=74702&amp;amp;year=2022 Robot Localization with Fiducial Markers] [Roboti lokaliseerimine koordinaatmärkidega], BS thesis, 2022&lt;br /&gt;
*Hans Pärtel Pani, [http://hdl.handle.net/10062/83008 ROS draiver pehmerobootika haaratsile] [ROS Driver for Soft Robotic Gripper], BS thesis, 2022&lt;br /&gt;
*Markus Erik Sügis, [http://hdl.handle.net/10062/83015 Jagatud juhtimise põhimõttel realiseeritud robotite kaugjuhtimissüsteem] [A continuous teleoperating system based on shared control concept], BS thesis, 2022&lt;br /&gt;
*Taaniel Küla, [http://hdl.handle.net/10062/83007 ROS2 platvormile keevitusroboti tarkvara portotüübi loomine kasutades UR5 robot-manipulaatorit] [Welding robot software prototype for ROS2 using UR5 robot arm], BS thesis, 2022&lt;br /&gt;
*Rauno Põlluäär, [http://hdl.handle.net/10062/72672 Veebirakendus-põhine kasutajaliides avatud robotplatvormi Robotont juhtimiseks ja haldamiseks] [Web application-based user interface for controlling and managing open-source robotics platform Robotont], BS thesis, 2021&lt;br /&gt;
*Hendrik Olesk, [http://hdl.handle.net/10062/72664 Nägemisulatuses kaugjuhitava mobiilse robotmanipulaatori kasutajamugavuse tõstmine] [Improving the usability of a mobile manipulator robot for line-of-sight remote control], BS thesis, 2021&lt;br /&gt;
*Tarvi Tepandi, [http://hdl.handle.net/10062/72665 Segareaalsusel põhinev kasutajaliides mobiilse roboti kaugjuhtimiseks Microsoft HoloLens 2 vahendusel] [Mixed-reality user interface for teleoperating mobile robots with Microsoft HoloLens 2], BS thesis, 2021&lt;br /&gt;
*Rudolf Põldma, [http://hdl.handle.net/10062/72674 Tartu linna Narva maantee ringristmiku digikaksik] [Digital twin for Narva street roundabout in Tartu], BS thesis, 2021&lt;br /&gt;
*Kwasi Akuamoah Boateng, [http://hdl.handle.net/10062/72804 Digital Twin of a Teaching and Learning Robotics Lab] [Robotite õpetamise ja õppimise labori digitaalne kaksik], BS thesis, 2021&lt;br /&gt;
*Karina Sein, [http://hdl.handle.net/10062/72102 Eestikeelse kõnesünteesi võimaldamine robootika arendusplatvormil ROS] [Enabling Estonian speech synthesis on the Robot Operating System (ROS)], BS thesis, 2020&lt;br /&gt;
*Ranno Mäesepp, [http://hdl.handle.net/10062/72100 Takistuste vältimise lahendus õpperobotile Robotont] [Obstacle avoidance solution for educational robot platform Robotont], BS thesis, 2020&lt;br /&gt;
*Igor Rybalskii, [http://hdl.handle.net/10062/72060 Gesture Detection Software for Human-Robot Collaboration] [Žestituvastus tarkvara inimese ja roboti koostööks], BS thesis, 2020&lt;br /&gt;
*Meelis Pihlap, [http://hdl.handle.net/10062/64292 Mitme roboti koostöö funktsionaalsuste väljatöötamine tarkvararaamistikule TeMoto] [Multi-robot collaboration functionalities for robot software development framework TeMoto], BS thesis, 2019&lt;br /&gt;
*Kaarel Mark, [http://hdl.handle.net/10062/64290 Liitreaalsuse kasutamine tootmisprotsessis asukohtade määramisel] [Augmented reality for location determination in manufacturing], BS thesis, 2019&lt;br /&gt;
*Kätriin Julle, [http://hdl.handle.net/10062/64279 Roboti KUKA youBot riistvara ja ROS-tarkvara uuendamine] [Upgrading robot KUKA youBot’s hardware and ROS-software], BS thesis, 2019&lt;br /&gt;
*Georg Astok, [http://hdl.handle.net/10062/64274 Roboti juhtimine virtuaalreaalsuses kasutades ROS-raamistikku] [Creating virtual reality user interface using only ROS framework], BS thesis, 2019&lt;br /&gt;
*Martin Hallist, [http://hdl.handle.net/10062/64275 Robotipõhine kaugkohalolu käte liigutuste ülekandmiseks] [Teleoperation robot for arms motions], BS thesis, 2019&lt;br /&gt;
*Ahmed Hassan Helmy Mohamed, [http://hdl.handle.net/10062/63946 Software integration of autonomous robot system for mixing and serving drinks] [Jooke valmistava ja serveeriva robotsüsteemi tarkvaralahendus], BS thesis, 2019&lt;br /&gt;
*Kristo Allaje, [http://hdl.handle.net/10062/60294 Draiveripakett käte jälgimise seadme Leap Motion™ kontroller kasutamiseks robootika arendusplatvormil ROS] [Driver package for using Leap Motion™ controller on the robotics development platform ROS], BS thesis, 2018&lt;br /&gt;
*Martin Maidla, [http://hdl.handle.net/10062/60288 Avatud robotiarendusplatvormi Robotont omniliikumise ja odomeetria arendamine] [Omnimotion and odometry development for open robot development platform Robotont], BS thesis, 2018&lt;br /&gt;
*Raid Vellerind, [http://hdl.handle.net/10062/56559 Avatud robotiarendusplatvormi ROS võimekuse loomine Tartu Ülikooli Robotexi robootikaplatvormile] [ROS driver development for the University of Tartu’s Robotex robotics platform], BS thesis, 2017&lt;br /&gt;
&lt;br /&gt;
[[Category:Theses Topics]]&lt;/div&gt;</summary>
		<author><name>Karl</name></author>
	</entry>
	<entry>
		<id>https://ims.ut.ee/index.php?title=Theses_in_Robotics&amp;diff=41664</id>
		<title>Theses in Robotics</title>
		<link rel="alternate" type="text/html" href="https://ims.ut.ee/index.php?title=Theses_in_Robotics&amp;diff=41664"/>
		<updated>2025-03-10T11:32:53Z</updated>

		<summary type="html">&lt;p&gt;Karl: /* Masters's theses */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:IMS-robotics]]&lt;br /&gt;
&amp;lt;div class=&amp;quot;toclimit-3&amp;quot;&amp;gt;__TOC__&amp;lt;/div&amp;gt; &lt;br /&gt;
= Projects in Advanced Robotics =&lt;br /&gt;
''The main objective of the following projects is to give students experience in working with advanced robotics technology. Our group is active in several R&amp;amp;D projects involving human-robot collaboration, intuitive teleoperation of robots, and autonomous navigation of unmanned mobile platforms. Our main software platforms are [http://www.ros.org/ Robot Operating System (ROS)] for developing software for advanced robot systems and [http://gazebosim.org/ Gazebo] for running realistic robotic simulations.''&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
For further information, contact [[User:Karl|Karl Kruusamäe]]. &lt;br /&gt;
&lt;br /&gt;
The following is not an exhaustive list of all available thesis/research topics.&lt;br /&gt;
&lt;br /&gt;
== Highlighted theses topics for 2024/2025 study year ==&lt;br /&gt;
# [[#ROBOTONT: analysis of different options as on-board computers|ROBOTONT: analysis of different options as on-board computers]]&lt;br /&gt;
# [[#SemuBOT: multiple topics|SemuBOT: multiple topics]]&lt;br /&gt;
# [[#ROBOTONT: integrating a graphical programming interface|ROBOTONT: integrating a graphical programming interface]]&lt;br /&gt;
# [[#Sign-language-based control for robots|Sign-language-based control for robots]]&lt;br /&gt;
# [[#Robotic Study Companion: a social robot for students in higher education|Robotic Study Companion: a social robot for students in higher education]]&lt;br /&gt;
# [[#Quantitative Evaluation of the Situation Awareness during the Teleoperation of an Urban Vehicle|Quantitative Evaluation of the Situation Awareness during the Teleoperation of an Urban Vehicle]]&lt;br /&gt;
# [[#Mixed-reality scene creation for vehicle teleoperation|Mixed-reality scene creation for vehicle teleoperation]]&lt;br /&gt;
&lt;br /&gt;
== List of potential thesis topics ==&lt;br /&gt;
Our inventory includes but is not limited to:&lt;br /&gt;
&amp;lt;gallery mode=&amp;quot;packed&amp;quot;&amp;gt;&lt;br /&gt;
File:Youbot.png|120px|thumb|KUKA youBot&lt;br /&gt;
File:Ur5_left.png|120px|thumb|Universal Robot UR5&lt;br /&gt;
File:Franka_Emika_Panda.jpg|120px|thumb|Franka Emika Panda&lt;br /&gt;
File:Clearpath_Jackal.jpg|120px|thumb|Clearpath Jackal&lt;br /&gt;
File:Kinova_KG-3_Gripper.jpg|120px|thumb|Kinova 3-finger gripper&lt;br /&gt;
File:Xarm7.jpg|120px|thumb|UFACTORY xArm7&lt;br /&gt;
File:Robotont_banner.png|120px|thumb|robotont&lt;br /&gt;
File:Turtlebot3-waffle-pi.jpg|thumb|TurtleBot3&lt;br /&gt;
File:Parrot-bebop-2.jpg|thumb|Parrot Bebop 2&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== ROBOTONT: Docker-Driven ROS Environment Switching ===&lt;br /&gt;
This thesis focuses on integrating Docker containers to manage and switch between different ROS (Robot Operating System) environments on Robotont. Currently, ROS is installed natively on Robotont's Ubuntu-based onboard computer, limiting flexibility in system recovery and configuration switching. The goal is to develop a Docker-based solution that allows users to switch between ROS environments directly from Robotont’s existing low-level menu interface.&lt;br /&gt;
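As a rough sketch of what the low-level menu might invoke under the hood, the snippet below assembles a `docker run` command line for a chosen ROS distro. The `osrf/ros` image tags and the exact flags are illustrative assumptions for this sketch, not the project's settled design.

```python
def ros_env_command(distro: str, image_prefix: str = "osrf/ros") -> list:
    """Build the docker invocation for switching to the given ROS environment.

    The image naming scheme (e.g. osrf/ros:noetic-desktop) and the flag
    choices below are assumptions; a real deployment would pin tested tags
    and mount the robot's workspace as a volume.
    """
    image = f"{image_prefix}:{distro}-desktop"
    return [
        "docker", "run", "--rm", "-it",
        "--net=host",     # share the host network so ROS nodes can discover each other
        "--privileged",   # allow access to USB sensors and motor drivers
        image,
    ]
```

The menu entry for each environment would then just hand this list to a process launcher, so adding a new ROS distro means adding one image tag rather than reinstalling anything natively.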
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: designing and implementing a communication protocol for additional devices ===&lt;br /&gt;
Robotont currently includes a designated area on its front for attaching additional devices, such as an ultrasonic range finder, a servo motor connected to an Arduino, or a MikroBUS device. This thesis aims to design and implement a standard communication protocol that will enable seamless integration and control of these devices through the high-level ROS framework running on Robotont’s onboard computer.&lt;br /&gt;
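For illustration only, such a protocol typically starts from a simple framing scheme on the wire. The sketch below uses a hypothetical byte layout (start byte, device ID, payload length, payload, XOR checksum); it is not the actual Robotont specification, just a minimal example of the kind of framing the thesis would design.

```python
# Hypothetical frame layout: [0xAA][device_id][length][payload...][xor checksum]
START = 0xAA

def encode_frame(device_id: int, payload: bytes) -> bytes:
    """Wrap a payload in a start byte, a 2-byte header, and an XOR checksum."""
    body = bytes([device_id, len(payload)]) + payload
    checksum = 0
    for b in body:
        checksum ^= b
    return bytes([START]) + body + bytes([checksum])

def decode_frame(frame: bytes):
    """Return (device_id, payload), or raise ValueError on a corrupt frame."""
    if len(frame) < 4 or frame[0] != START:
        raise ValueError("bad start byte or truncated frame")
    device_id, length = frame[1], frame[2]
    payload = frame[3:3 + length]
    checksum = 0
    for b in frame[1:-1]:
        checksum ^= b
    if len(payload) != length or checksum != frame[-1]:
        raise ValueError("length or checksum mismatch")
    return device_id, payload
```

On the ROS side, a driver node would read such frames from the serial port and republish the decoded payloads as topics, so each add-on device only needs to speak the framing, not ROS itself.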
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: analysis of different options as on-board computers ===&lt;br /&gt;
Currently, ROBOTONT uses an Intel NUC as its onboard computer. The goal of this thesis is to validate Robotont's software stack on alternative compute devices (e.g. Raspberry Pi 4, Intel Compute Stick, and NVIDIA Jetson Nano) and benchmark their performance on the most popular Robotont use-case demos (e.g. webapp teleop, ar-tag steering, 2D mapping, and 3D mapping). The objective is to propose at least two different compute solutions: one that optimizes for cost and another that optimizes for performance.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT Lite ===&lt;br /&gt;
The goal of this thesis is to optimize ROBOTONT platform for cost by replacing the onboard compute and sensor with low-cost alternatives but ensuring ROS/ROS2 software compatibility. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: integrating a graphical programming interface ===&lt;br /&gt;
The goal of this thesis is to integrate a graphical programming solution (e.g. Scratch or Blockly) to enable programming of Robotont by non-experts. See, e.g., https://doi.org/10.48550/arXiv.2011.13706&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== SemuBOT: multiple topics ===&lt;br /&gt;
In 2023/2024, many topics are offered to support the development of the open-source humanoid robot project SemuBOT (https://www.facebook.com/semubotmtu/).&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Robotic Study Companion: a social robot for students in higher education ===&lt;br /&gt;
Potential Topics:&lt;br /&gt;
* Enhance the Robot's Speech/Natural Language Capabilities &lt;br /&gt;
* Build a Local Language Model for the RSC&lt;br /&gt;
* Develop and Program the Robot’s Behavior and Personality&lt;br /&gt;
* Build a Digital Twin Simulation for Multimodal Interaction&lt;br /&gt;
* Explore the Use of the RSC as an Affective Robot to Address Students’ Academic Emotions&lt;br /&gt;
* Explore and Implement Cybersecurity Measures for Social Robot&lt;br /&gt;
&lt;br /&gt;
[https://github.com/orgs/RobotStudyCompanion/discussions/3 More info on Github] | reach out to farnaz.baksh@ut.ee for more info&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: Quick launch demo suite and the final complete release of its ROS1 software ===&lt;br /&gt;
The goal of this thesis is to refine the existing Robotont demos (e.g. ar-tag steering, follow-the-leader, dancing-with-robot, LEAP-based control, etc.) and package them in an easy-to-use way for quick deployment by anyone during public events such as science popularization workshops and school visits.&lt;br /&gt;
The results of this work will be packaged as the final ROS1 release of the Robotont software, as ROS1 reaches its end of life (EOL) in May 2025.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: ROS support, demos, and educational materials for open-source mobile robot ===&lt;br /&gt;
[[Image:RosLarge.png|left|100px|ROS]]&lt;br /&gt;
The project involves many potential thesis topics on the open-source robot platform ROBOTONT. The nature of the thesis can be software development to improve the platform's capabilities, simulation of specific scenarios, and/or demonstration of ROBOTONT in a real-life setting. A more detailed thesis topic will be outlined during an in-person meeting.&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
[[Image:Ros_equation.png|x100px|What is ROS?]]&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Virtual reality user interface (VRUI) for intuitive teleoperation system ===&lt;br /&gt;
[[Image:LeapMotion.png|200px|thumb|right|Detecting 2 hands with Leap Motion Controller]]&lt;br /&gt;
Enhancing the user-experience of a virtual reality UI developed by [https://github.com/ut-ims-robotics/vrui_rviz Georg Astok]. Potentially adding [http://www.osvr.org/hardware-devs.html virtual reality capability] to a [https://www.youtube.com/watch?v=L25HHFd00rc gesture- and natural-language-based robot teleoperation system].&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
[[Image:Temoto-working-principle-in-pics.png|x160px|Gesture-based teleoperation]]&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Health monitor for intuitive telerobot ===&lt;br /&gt;
Intelligent status and error handling for an intuitive telerobotic system.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== 3D scanning of industrial objects ===&lt;br /&gt;
Using laser sensors and cameras to create accurate models of industrial products for quality control or further processing.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Modeling humans for human-robot interaction ===&lt;br /&gt;
True human-robot collaboration means that the robot must understand the actions, intentions, and state of its human partner. This work involves using cameras and other human-sensing devices to digitally represent and model humans. There are multiple levels of modeling: a) physical models of human kinematics and dynamics; b) higher-level models for recognizing human intent.&amp;lt;br&amp;gt;&lt;br /&gt;
[[Image:Skeletal ROS.PNG|x160px|ROS &amp;amp; Kinect &amp;amp; Skeleton-Markers Package]]&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Robotic avatar for telepresence ===&lt;br /&gt;
Integrating hand gestures and head movements to control a robot avatar through a virtual reality user interface.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Detection of hardware and software resources for smart integration of robots ===&lt;br /&gt;
The vast majority of today’s robotic applications rely on hard-coded device and algorithm usage. This project focuses on developing Resource Snooper software that can detect the addition or removal of resources, for the benefit of dynamic reconfiguration of robotic systems. This project is developed as a subsystem of [https://utnuclearroboticspublic.github.io/temoto2/index.html TeMoto].&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Sonification of feedback during teleoperation of robots ===&lt;br /&gt;
Humans are used to receiving auditory feedback in their everyday lives. It helps us make decisions and be aware of potential dangers. Telerobotic interfaces can deploy the same idea to improve situational awareness and robotic task efficiency. The thesis project involves a study of different sonification solutions and the implementation of one in a telerobotic application using ROS.&lt;br /&gt;
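As one purely illustrative example of such a sonification, an obstacle distance reported by the robot could be mapped to beep pitch so that nearer obstacles sound more urgent; the distance and frequency ranges below are arbitrary assumptions, not values from the thesis.

```python
def distance_to_pitch(distance_m: float,
                      d_min: float = 0.2, d_max: float = 3.0,
                      f_low: float = 220.0, f_high: float = 880.0) -> float:
    """Linearly map a distance in [d_min, d_max] metres to a beep frequency,
    with the nearest obstacles mapped to the highest (most urgent) pitch."""
    d = min(max(distance_m, d_min), d_max)   # clamp to the sensed range
    t = (d - d_min) / (d_max - d_min)        # 0.0 (near) .. 1.0 (far)
    return f_high - t * (f_high - f_low)
```

A ROS node could subscribe to a range topic, feed readings through a mapping like this, and drive a tone generator; the study itself would compare such mappings (pitch, tempo, stereo panning) for their effect on situational awareness.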
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Human-Robot and Robot-Robot collaboration applications ===&lt;br /&gt;
Creating a demo or analysis of software capabilities related to human-robot or robot-robot teams:&lt;br /&gt;
* human-robot collaborative assembly&lt;br /&gt;
* distributed mapping; analysis and demo of existing ROS (e.g., segmap https://youtu.be/JJhEkIA1xSE) packages for multi-robot mapping&lt;br /&gt;
* Inaccessible region teamwork&lt;br /&gt;
**youbot+drone - a drone maps the environment (for example, a maze) and a ground vehicle uses this information to traverse the maze&lt;br /&gt;
**youbot+robotont - youbot cannot climb ledges, but it can lift a smaller robot, such as robotont, up a ledge.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Mirroring human hand movements on industrial robots ===&lt;br /&gt;
The goal of this project is to integrate continuous control of an industrial robot manipulator with a gestural telerobotics interface. The recommended tools for this thesis project are a Leap Motion Controller or a standard web camera, Ultraleap, a Universal Robot UR5 manipulator, and ROS.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== ROBOTONT: TeMoto for robotont ===&lt;br /&gt;
Swarm-management and UMRF-based task loading for robotont using [https://github.com/temoto-framework TeMoto] framework.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Enhancing teleoperation control interface with augmented cues to provoke caution ===&lt;br /&gt;
The task is to create a telerobot control interface where the video feed from the remote site and/or a mixed-reality scene is augmented with visual cues that provoke caution in the human operator.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Robot-to-human interaction ===&lt;br /&gt;
As robots and autonomous machines start sharing the same space as humans, their actions need to be understood by the people occupying that space. For instance, a human worker needs to understand what the robot partner is planning next, or a pedestrian needs to clearly comprehend the behaviour of a driverless vehicle. To reduce this ambiguity, the robot needs mechanisms to convey its intent (whatever it is going to do next). The aim of the thesis is to survey existing methods for machines to convey their intent and to develop a unified model interface for expressing that intent.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Gaze-based handover prediction ===&lt;br /&gt;
When a human needs to pass an object to a robot manipulator, the robot must understand where in 3D space the handover occurs and then plan an appropriate motion. Human gaze can be used as the input for predicting which object to track. The thesis activities involve camera-based eye tracking and safe motion planning.&lt;br /&gt;
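To make the prediction task concrete, a naive baseline (illustrative only, not the proposed method) is to extrapolate the hand's recent motion under a constant-velocity assumption; a real system would fuse gaze direction with a learned reaching model, as in the group's earlier MS work on gaze-assisted endpoint prediction.

```python
def predict_endpoint(positions, horizon: int):
    """Constant-velocity extrapolation of a reaching trajectory.

    positions: list of (x, y, z) hand positions sampled at a fixed rate,
    oldest first. Returns the predicted (x, y, z) `horizon` samples ahead.
    """
    if len(positions) < 2:
        raise ValueError("need at least two samples")
    n = len(positions) - 1
    # Average per-sample displacement over the observed window.
    vel = tuple((positions[-1][i] - positions[0][i]) / n for i in range(3))
    return tuple(positions[-1][i] + horizon * vel[i] for i in range(3))
```

The predicted point would then seed the motion planner's goal region, which gaze-based object selection could further narrow down.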
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Real-world demonstrator for MIR+UR+TeMoto integration  ===&lt;br /&gt;
Integration of mobile manipulator (MIR100 + UR5e + Robotiq gripper) to demonstrate TeMoto in a collaborative application.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== ROBOTONT: Human-height human-robot interface for Robotont ground robot ===&lt;br /&gt;
Robotont is an ankle-high, flat mobile robot. For humans to interact with Robotont, there is a need for a compact and lightweight mechanical structure that is tall enough for comfortable human-robot interaction. The objective of the thesis is to develop the mechanical design and build prototypes that ensure stable operation and meet the aesthetic requirements for use in public places.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Stratos Explore Ultraleap demonstrator for robotics ===&lt;br /&gt;
The aim of this thesis is to systematically analyze the strengths and limitations of the Stratos Explore Ultraleap device in the context of controlling a robot, and subsequently to implement a demonstrator application that shows its applicability for robot control.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Mixed-reality scene creation for vehicle teleoperation ===&lt;br /&gt;
Fusing different sensory feeds for creating high-usability teleoperation scene.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Validation study for AR-based robot user-interfaces ===&lt;br /&gt;
Designing and carrying out a user study to validate the functionality and usability of a human-robot interface.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Replication of the MIT Hydra demo ===&lt;br /&gt;
The goal of the thesis is to use the Hydra software package and integrate it for a robot used at IMS Robotics (e.g., TIAGo, Jackal, Robotont).&lt;br /&gt;
&amp;lt;br&amp;gt;Hydra takes sensor data as input (stereo or RGB-D camera images and IMU data) and produces a hierarchical model of the environment by estimating the trajectory of the robot (including loop closures), building a metric-semantic 3D mesh, and segmenting objects, places, and rooms in an indoor environment. These representations are combined into a 3D scene graph, which enables novel approaches for hierarchical loop closure detection and ensures the representations remain consistent after loop closure. Hydra is implemented in C++ and is ROS-compatible. It uses a neural-network-based image segmentation front-end, but otherwise relies on an efficient, multi-threaded CPU-based implementation that is suitable for mobile robot deployment.&lt;br /&gt;
&amp;lt;br&amp;gt;LINKS:&lt;br /&gt;
&amp;lt;br&amp;gt;Video: https://youtu.be/qZg2lSeTuvM&lt;br /&gt;
&amp;lt;br&amp;gt;Code: https://github.com/MIT-SPARK/Hydra&lt;br /&gt;
&amp;lt;br&amp;gt;Paper: http://www.roboticsproceedings.org/rss18/p050.pdf&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Quantitative Evaluation of the Situation Awareness during the Teleoperation of an Urban Vehicle ===&lt;br /&gt;
The goal of this thesis is to develop the methodology for measuring the operator's situation awareness while operating an autonomous urban vehicle. Potential metrics could include driving accuracy, speed, and latency. This thesis will be conducted as part of the activities at the [https://adl.cs.ut.ee Autonomous Driving Lab].&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== NAO 'flippin' ===&lt;br /&gt;
The University of Tartu has a set of [https://www.aldebaran.com/en/nao NAO robots], which have been lying on shelves for some years now. It is time to make these robots into functional social robots that can be used, e.g., for children's communication therapy or as buddies for the elderly. The goal of this thesis is to test and upgrade all the NAO robots to functional condition.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Sign-language-based control for robots ===&lt;br /&gt;
The goal of this thesis is to design and implement the means for interacting with a robot via conventional sign language, i.e. the human can command a robot by using sign language.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
= Completed projects =&lt;br /&gt;
== PhD theses ==&lt;br /&gt;
*Robert Valner, [https://hdl.handle.net/10062/105994 Design of TeMoto, a software framework for dependable, adaptive, and collaborative autonomous robots] [TeMoto – töökindlate, adaptiivsete ja koostöövõimeliste autonoomsete robotite arendamise tarkvararaamistik], PhD thesis, 2024&lt;br /&gt;
*Houman Masnavi, [https://hdl.handle.net/10062/91394 Visibility aware navigation] [Nähtavust arvestav navigatsioon], PhD thesis, 2023&lt;br /&gt;
&lt;br /&gt;
== Master's theses ==&lt;br /&gt;
*Robert Allik, Validation of NoMaD as a Global Planner for Mobile Robots, MS thesis, 2024&lt;br /&gt;
*Rauno Põlluäär, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=79393 Designing and Implementing a Bird’s-eye View Interface for a Self-driving Vehicle’s Teleoperation System] [Isejuhtiva sõiduki kaugjuhtimissüsteemile linnuvaate kasutajaliidese loomine], MS thesis, 2024&lt;br /&gt;
*Gautier Reynes, VR-Enhanced Remote Inspection Framework for Semi-Autonomous Robot Fleet, MS thesis, 2024&lt;br /&gt;
*Eva Mõtshärg, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=77385 3D-prinditava kere disain ja analüüs vabavaralisele haridusrobotile Robotont] [Design and Analysis of a 3D Printable Chassis for the Open Source Educational Robot Robotont], MS thesis, 2023&lt;br /&gt;
*Farnaz Baksh, [https://dspace.ut.ee/items/a7a9cc15-27e9-450c-94e8-3267f0c95c56 An Open-source Robotic Study Companion for University Students] [Avatud lähtekoodiga robotõpikaaslane üliõpilastele], MS thesis, 2023&lt;br /&gt;
*Igor Rybalskii, [http://hdl.handle.net/10062/83028 Augmented reality (AR) for enabling human-robot collaboration with ROS robots] [Liitreaalsus inimese ja roboti koostöö võimaldamiseks ROS-i robotitega], MS thesis, 2022&lt;br /&gt;
*Md. Maniruzzaman, [http://hdl.handle.net/10062/83025 Object search and retrieval in indoor environment using a Mobile Manipulator] [Objektide otsimine ja teisaldamine siseruumides mobiilse manipulaatori abil], MS thesis, 2022&lt;br /&gt;
*Allan Kustavus, [http://hdl.handle.net/10062/72651 Design and Implementation of a Generalized Resource Management Architecture in the TeMoto Software Framework] [Üldise ressursihalduri disain ja teostus TeMoto tarkvara raamistikule], MS thesis, 2021&lt;br /&gt;
*Kristina Meister, [http://hdl.handle.net/10062/72350 External human-vehicle interaction - a study in the context of an autonomous ride-hailing service], MS thesis, 2021&lt;br /&gt;
*Muhammad Usman, [http://hdl.handle.net/10062/72126 Development of an Optimization-Based Motion Planner and Its ROS Interface for a Non-Holonomic Mobile Manipulator] [Optimeerimisele baseeruva liikumisplaneerija arendamine ja selle ROSi liides mitteholonoomse mobiilse manipulaatori jaoks], MS thesis, 2020&lt;br /&gt;
*Maarika Oidekivi, [http://hdl.handle.net/10062/72119 Masina kavatsuse väljendamine ja tõlgendamine] [Communicating and interpreting machine intent], MS thesis, 2020&lt;br /&gt;
*Houman Masnavi, [http://hdl.handle.net/10062/72118 Multi-Robot Motion Planning for Shared Payload Transportation] [Rajaplaneerimine multi-robot süsteemile jagatud lasti transportimisel], MS thesis, 2020&lt;br /&gt;
*Fabian Ernesto Parra Gil, [http://hdl.handle.net/10062/72112 Implementation of Robot Manager Subsystem for Temoto Software Framework] [Robotite Halduri alamsüsteemi väljatöötamine tarkvararaamistikule TEMOTO], MS thesis, 2020&lt;br /&gt;
*Zafarullah, [http://hdl.handle.net/10062/72125 Gaze Assisted Neural Network based Prediction of End-Point of Human Reaching Trajectories], MS thesis, 2020&lt;br /&gt;
*Madis K Nigol, [http://hdl.handle.net/10062/64339 Õppematerjalid robotplatvormile Robotont] [Study materials for robot platform Robotont], MS thesis, 2019&lt;br /&gt;
*Renno Raudmäe, [http://hdl.handle.net/10062/64341 Avatud robotplatvorm Robotont] [Open source robotics platform Robotont], MS thesis, 2019&lt;br /&gt;
*Asif Sattar, [http://hdl.handle.net/10062/64352 Human detection and distance estimation with monocular camera using YOLOv3 neural network] [Inimeste tuvastamine ning kauguse hindamine kasutades kaamerat ning YOLOv3 tehisnärvivõrku], MS thesis, 2019&lt;br /&gt;
*Ragnar Margus, [http://hdl.handle.net/10062/64337 Kergliiklusvahendite jagamisteenuseks vajaliku positsioneerimismooduli loomine ja uurimine] [Development and testing of an IoT module for electric vehicle sharing service], MS thesis, 2019&lt;br /&gt;
*Pavel Šumejko, [http://hdl.handle.net/10062/64320 Robust Solution for Extrinsic Calibration of a 2D Laser-Rangefinder and a Monocular USB Camera] [Meetod 2D laserkaugusmõõdiku ja USB-kaamera väliseks kalibreerimiseks], MS thesis, 2019&lt;br /&gt;
*Dzvezdana Arsovska, [http://hdl.handle.net/10062/64321 Building an Efficient and Secure Software Supply Pipeline for Aerial Robotics Application] [Efektiivne ja turvaline tarkvaraarendusahel lennurobootika rakenduses], MS thesis, 2019&lt;br /&gt;
*Tõnis Tiimus, [http://hdl.handle.net/10062/60713 Visuaalselt esilekutsutud potentsiaalidel põhinev aju-arvuti liides robootika rakendustes] [A VEP-based BCI for robotics applications], MS thesis, 2018&lt;br /&gt;
*Martin Appo, [http://hdl.handle.net/10062/60718 Hardware-agnostic compliant control ROS package for collaborative industrial manipulators] [Riistvarapaindlik ROSi tarkvarapakett tööstuslike robotite mööndlikuks juhtimiseks], MS thesis, 2018&lt;br /&gt;
*Hassan Mahmoud Shehawy Elhanash, [http://hdl.handle.net/10062/60711 Optical Tracking of Forearm for Classifying Fingers Poses] [Küünarvarre visuaalne jälgimine ennustamaks sõrmede asendeid], MS thesis, 2018&lt;br /&gt;
&lt;br /&gt;
== Bachelor's theses ==&lt;br /&gt;
*Albert Unn, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=79937 Suhtlusvõimekuse arendamine sotsiaalsele humanoidrobotile SemuBot] [Developing the ability to communicate for SemuBot, a social humanoid robot], BS thesis, 2024&lt;br /&gt;
*Veronika-Marina Volynets, [https://hdl.handle.net/10062/99943 Development of Control Electronics and Program for Robotont's Height Adjustment Mechanism] [Juhtelektroonika ja programmi väljatöötamine Robotondi kõrguse reguleerimise mehhanismile], BS thesis, 2024&lt;br /&gt;
*Nikita Kurenkov, [https://hdl.handle.net/10062/99880 Flexible Screen Integration and Development of Neck Movement Mechanism for Social Humanoid Robot SemuBot] [Humanoidroboti näo lahenduse leidmine ja rakendamine; humanoidroboti jaoks kaela mehhanismi väljatöötamine], BS thesis, 2024&lt;br /&gt;
*Elchin Huseynov, [https://hdl.handle.net/10062/99668 Design and Control of a Social Humanoid Robot - SemuBot’s Hand] [Sotsiaalse Humanoidroboti Disain ja Juhtimine - SemuBoti Käsi], BS thesis, 2024&lt;br /&gt;
*Veronika Podliesnova, [https://hdl.handle.net/10062/99636 Real-Time Detection of Robot Failures by Monitoring Operator’s Brain Activity with EEG-based Brain-Computer Interface] [Reaalajas Robotirikke Tuvastamine Operaatori Ajutegevuse Jälgimise teel EEG-põhise Aju-Arvuti Liidese abil], BS thesis, 2024&lt;br /&gt;
*Märten Josh Peedimaa, [https://hdl.handle.net/not-yet Rhea: An Open-Source Table Tennis Ball Launcher Robot for Multiball Training] [Avatud lähtekoodiga lauatennise palliviske robot mitmikpall treenimiseks], BS thesis, 2024&lt;br /&gt;
*Karl Sander Vinkel, [https://hdl.handle.net/not-yet Laadimisjaama ja transportkesta väljatöötamine õpperobotile Robotont] [Development of charging dock and transport case for Robotont], BS thesis, 2024&lt;br /&gt;
*Raimo Köidam, [https://hdl.handle.net/not-yet Valguslahenduse tarkvara väljatöötamine õpperobotile Robotont] [Development of light solution software for the educational robot Robotont], BS thesis, 2024&lt;br /&gt;
*Robert Valge, [https://hdl.handle.net/not-yet Toitepinge ja tarbevoolu monitoorimine ning toitehalduse püsivara loomine õpperobotil Robotont] [Monitoring supply voltage and current consumption and creating firmware for Robotont’s power management system], BS thesis, 2024&lt;br /&gt;
*Leonid Tšigrinski, [https://hdl.handle.net/not-yet Õpperoboti Robotont püsivara arhitektuuri uuendamine] [Education robot &amp;quot;Robotont&amp;quot; firmware architecture updating], BS thesis, 2024&lt;br /&gt;
*Andres Sakk, [https://hdl.handle.net/not-yet Avatud robotplatvormi Robotont 3 kasutajaliidese väljatöötamine] [Development of a user interface for the open robotics platform Robotont 3], BS thesis, 2024&lt;br /&gt;
*Kaur Kullamäe, [https://hdl.handle.net/not-yet Elektroonikalahendus ja püsivara robotkäe juhtimiseks sotsiaalsel humanoidrobotil SemuBot] [Building the SemuBot arm electronics system], BS thesis, 2024&lt;br /&gt;
*Kristjan Madis Kask, [https://hdl.handle.net/not-yet Käe mehaanika disain sotsiaalsele humanoidrobotile SemuBot] [Arm mechanics design for social humanoid robot SemuBot], BS thesis, 2024&lt;br /&gt;
*Sven-Ervin Paap, [https://hdl.handle.net/not-yet ROS2 draiver õpperobotile Robotont] [ROS2 driver for the educational robot Robotont], BS thesis, 2024&lt;br /&gt;
*Timur Nizamov, [https://hdl.handle.net/10062/99592 Audio System for the Social Humanoid Robot SemuBot] [Helisüsteem sotsiaalsele humanoidrobotile SemuBot], BS thesis, 2024&lt;br /&gt;
*Georgs Narbuts, [https://hdl.handle.net/10062/99554 Holonomic Motion Drive System of a Social Humanoid Robot SemuBot] [Sotsiaalse humanoidroboti SemuBoti holonoomne liikuv ajamisüsteem], BS thesis, 2024&lt;br /&gt;
*Iryna Hurova, [https://hdl.handle.net/10062/90638 Kitting station of the learning factory] [Õppetehase komplekteerimisjaam], BS thesis, 2023&lt;br /&gt;
*Paola Avalos Conchas, [https://hdl.handle.net/10062/90641 Payload transportation system of a learning factory] [Õppetehase kasuliku koorma transpordisüsteem], BS thesis, 2023&lt;br /&gt;
*Pille Pärnalaas, [https://hdl.handle.net/10062/93431 Pöördpõik ajami arendus robotplatvormile Robotont] [Development of swerve drive for robotic platform Robotont], BS thesis, 2023&lt;br /&gt;
*Priit Rooden, [https://hdl.handle.net/10062/93432 Autonoomse laadimislahenduse väljatöötamine õpperobotile Robotont] [Development of an autonomous charging solution for the robot platform Robotont], BS thesis, 2023&lt;br /&gt;
*Marko Muro, [https://hdl.handle.net/10062/93429 Robotondi akulahenduse ning 12 V pingeregulaatori prototüüpimine] [Prototyping battery solution and 12 V voltage regulator for Robotont], BS thesis, 2023&lt;br /&gt;
*Danel Leppenen, [https://hdl.handle.net/10062/93427 Nav2 PYIF: Python-based Motion Planning for ROS 2 Navigation 2] [Nav 2 PYIF: Pythoni põhine liikumise planeerija ROS 2 Navigation 2-le], BS thesis, 2023&lt;br /&gt;
*Kertrud Geddily Küüt, [https://hdl.handle.net/10062/93424 Kõrgust reguleeriv mehhanism Robotondile] [Height adjusting mechanism for Robotont], BS thesis, 2023&lt;br /&gt;
*Ingvar Drikkit, [https://hdl.handle.net/10062/93421 Lisaseadmete võimekuse arendamine haridusrobotile Robotont] [Developing add-on device support for the educational robot Robotont], BS thesis, 2023&lt;br /&gt;
*Kristo Pool, [https://hdl.handle.net/10062/93430 MoveIt 2 õppematerjalid] [Learning materials for MoveIt 2], BS thesis, 2023&lt;br /&gt;
*Erki Veeväli, [https://hdl.handle.net/10062/93434 Development of a Continuous Teleoperation System for Urban Road Vehicle] [Linnasõiduki pideva kaugjuhtimissüsteemi arendus], BS thesis, 2023&lt;br /&gt;
*Aleksandra Doroshenko, [https://hdl.handle.net/10062/93420 Haiglates inimesi juhatava roboti disain] [Hospital guide robot design], BS thesis, 2023&lt;br /&gt;
*Hui Shi, [http://hdl.handle.net/10062/83043 Expanding the Open-source ROS Software Pack age opencv_apps with Dedicated Blob Detection Functionality] [Avatud lähtekoodiga ROS-i tarkvarakimbu opencv_apps laiendamine laigutuvasti funktsioonaalsusega], BS thesis, 2022&lt;br /&gt;
*Dāvis Krūmiņš, [http://hdl.handle.net/10062/83040 Web-based learning and software development environment for remote access of ROS robots] [Veebipõhine õppe- ja tarkvaraarenduse keskkond ROS robotite juurdepääsuks kaugteel], BS thesis, 2022&lt;br /&gt;
*Anna Jakovleva, [http://hdl.handle.net/10062/83037 Roboquiz - an interactive human-robot game] [Roboquiz - interaktiivne inimese ja roboti mäng], BS thesis, 2022&lt;br /&gt;
*Kristjan Laht, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=74702&amp;amp;year=2022 Robot Localization with Fiducial Markers] [Roboti lokaliseerimine koordinaatmärkidega], BS thesis, 2022&lt;br /&gt;
*Hans Pärtel Pani, [http://hdl.handle.net/10062/83008 ROS draiver pehmerobootika haaratsile] [ROS Driver for Soft Robotic Gripper], BS thesis, 2022&lt;br /&gt;
*Markus Erik Sügis, [http://hdl.handle.net/10062/83015 Jagatud juhtimise põhimõttel realiseeritud robotite kaugjuhtimissüsteem] [A continuous teleoperating system based on shared control concept], BS thesis, 2022&lt;br /&gt;
*Taaniel Küla, [http://hdl.handle.net/10062/83007 ROS2 platvormile keevitusroboti tarkvara portotüübi loomine kasutades UR5 robot-manipulaatorit] [Welding robot software prototype for ROS2 using UR5 robot arm], BS thesis, 2022&lt;br /&gt;
*Rauno Põlluäär, [http://hdl.handle.net/10062/72672 Veebirakendus-põhine kasutajaliides avatud robotplatvormi Robotont juhtimiseks ja haldamiseks] [Web application-based user interface for controlling and managing open-source robotics platform Robotont], BS thesis, 2021&lt;br /&gt;
*Hendrik Olesk, [http://hdl.handle.net/10062/72664 Nägemisulatuses kaugjuhitava mobiilse robotmanipulaatori kasutajamugavuse tõstmine] [Improving the usability of a mobile manipulator robot for line-of-sight remote control], BS thesis, 2021&lt;br /&gt;
*Tarvi Tepandi, [http://hdl.handle.net/10062/72665 Segareaalsusel põhinev kasutajaliides mobiilse roboti kaugjuhtimiseks Microsoft HoloLens 2 vahendusel] [Mixed-reality user interface for teleoperating mobile robots with Microsoft HoloLens 2], BS thesis, 2021&lt;br /&gt;
*Rudolf Põldma, [http://hdl.handle.net/10062/72674 Tartu linna Narva maantee ringristmiku digikaksik] [Digital twin for Narva street roundabout in Tartu], BS thesis, 2021&lt;br /&gt;
*Kwasi Akuamoah Boateng, [http://hdl.handle.net/10062/72804 Digital Twin of a Teaching and Learning Robotics Lab] [Robotite õpetamise ja õppimise labori digitaalne kaksik], BS thesis, 2021&lt;br /&gt;
*Karina Sein, [http://hdl.handle.net/10062/72102 Eestikeelse kõnesünteesi võimaldamine robootika arendusplatvormil ROS] [Enabling Estonian speech synthesis on the Robot Operating System (ROS)], BS thesis, 2020&lt;br /&gt;
*Ranno Mäesepp, [http://hdl.handle.net/10062/72100 Takistuste vältimise lahendus õpperobotile Robotont] [Obstacle avoidance solution for educational robot platform Robotont], BS thesis, 2020&lt;br /&gt;
*Igor Rybalskii, [http://hdl.handle.net/10062/72060 Gesture Detection Software for Human-Robot Collaboration] [Žestituvastus tarkvara inimese ja roboti koostööks], BS thesis, 2020&lt;br /&gt;
*Meelis Pihlap, [http://hdl.handle.net/10062/64292 Mitme roboti koostöö funktsionaalsuste väljatöötamine tarkvararaamistikule TeMoto] [Multi-robot collaboration functionalities for robot software development framework TeMoto], BS thesis, 2019&lt;br /&gt;
*Kaarel Mark, [http://hdl.handle.net/10062/64290 Liitreaalsuse kasutamine tootmisprotsessis asukohtade määramisel] [Augmented reality for location determination in manufacturing], BS thesis, 2019&lt;br /&gt;
*Kätriin Julle, [http://hdl.handle.net/10062/64279 Roboti KUKA youBot riistvara ja ROS-tarkvara uuendamine] [Upgrading robot KUKA youBot’s hardware and ROS-software], BS thesis, 2019&lt;br /&gt;
*Georg Astok, [http://hdl.handle.net/10062/64274 Roboti juhtimine virtuaalreaalsuses kasutades ROS-raamistikku] [Creating virtual reality user interface using only ROS framework], BS thesis, 2019&lt;br /&gt;
*Martin Hallist, [http://hdl.handle.net/10062/64275 Robotipõhine kaugkohalolu käte liigutuste ülekandmiseks] [Teleoperation robot for arms motions], BS thesis, 2019&lt;br /&gt;
*Ahmed Hassan Helmy Mohamed, [http://hdl.handle.net/10062/63946 Software integration of autonomous robot system for mixing and serving drinks] [Jooke valmistava ja serveeriva robotsüsteemi tarkvaralahendus], BS thesis, 2019&lt;br /&gt;
*Kristo Allaje, [http://hdl.handle.net/10062/60294 Draiveripakett käte jälgimise seadme Leap Motion™ kontroller kasutamiseks robootika arendusplatvormil ROS] [Driver package for using Leap Motion™ controller on the robotics development platform ROS], BS thesis, 2018&lt;br /&gt;
*Martin Maidla, [http://hdl.handle.net/10062/60288 Avatud robotiarendusplatvormi Robotont omniliikumise ja odomeetria arendamine] [Omnimotion and odometry development for open robot development platform Robotont], BS thesis, 2018&lt;br /&gt;
*Raid Vellerind, [http://hdl.handle.net/10062/56559 Avatud robotiarendusplatvormi ROS võimekuse loomine Tartu Ülikooli Robotexi robootikaplatvormile] [ROS driver development for the University of Tartu’s Robotex robotics platform], BS thesis, 2017&lt;br /&gt;
&lt;br /&gt;
[[Category:Theses Topics]]&lt;/div&gt;</summary>
		<author><name>Karl</name></author>
	</entry>
	<entry>
		<id>https://ims.ut.ee/index.php?title=Theses_in_Robotics&amp;diff=41500</id>
		<title>Theses in Robotics</title>
		<link rel="alternate" type="text/html" href="https://ims.ut.ee/index.php?title=Theses_in_Robotics&amp;diff=41500"/>
		<updated>2025-02-24T22:44:12Z</updated>

		<summary type="html">&lt;p&gt;Karl: /* Highlighted theses topics for 2024/2025 study year */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:IMS-robotics]]&lt;br /&gt;
&amp;lt;div class=&amp;quot;toclimit-3&amp;quot;&amp;gt;__TOC__&amp;lt;/div&amp;gt; &lt;br /&gt;
= Projects in Advanced Robotics =&lt;br /&gt;
''The main objective of the following projects is to give students experience in working with advanced robotics technology. Our group is active in several R&amp;amp;D projects involving human-robot collaboration, intuitive teleoperation of robots, and autonomous navigation of unmanned mobile platforms. Our main software platforms are [http://www.ros.org/ Robot Operating System (ROS)] for developing software for advanced robot systems and [http://gazebosim.org/ Gazebo] for running realistic robotic simulations.''&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
For further information, contact [[User:Karl|Karl Kruusamäe]]. &lt;br /&gt;
&lt;br /&gt;
The following is not an exhaustive list of all available thesis/research topics.&lt;br /&gt;
&lt;br /&gt;
== Highlighted theses topics for 2024/2025 study year ==&lt;br /&gt;
# [[#ROBOTONT: analysis of different options as on-board computers|ROBOTONT: analysis of different options as on-board computers]]&lt;br /&gt;
# [[#SemuBOT: multiple topics|SemuBOT: multiple topics]]&lt;br /&gt;
# [[#ROBOTONT: integrating a graphical programming interface|ROBOTONT: integrating a graphical programming interface]]&lt;br /&gt;
# [[#Sign-language-based control for robots|Sign-language-based control for robots]]&lt;br /&gt;
# [[#Robotic Study Companion: a social robot for students in higher education|Robotic Study Companion: a social robot for students in higher education]]&lt;br /&gt;
# [[#Quantitative Evaluation of the Situation Awareness during the Teleoperation of an Urban Vehicle|Quantitative Evaluation of the Situation Awareness during the Teleoperation of an Urban Vehicle]]&lt;br /&gt;
# [[#Mixed-reality scene creation for vehicle teleoperation|Mixed-reality scene creation for vehicle teleoperation]]&lt;br /&gt;
&lt;br /&gt;
== List of potential thesis topics ==&lt;br /&gt;
Our inventory includes but is not limited to:&lt;br /&gt;
&amp;lt;gallery mode=&amp;quot;packed&amp;quot;&amp;gt;&lt;br /&gt;
File:Youbot.png|120px|thumb|KUKA youBot&lt;br /&gt;
File:Ur5_left.png|120px|thumb|Universal Robot UR5&lt;br /&gt;
File:Franka_Emika_Panda.jpg|120px|thumb|Franka Emika Panda&lt;br /&gt;
File:Clearpath_Jackal.jpg|120px|thumb|Clearpath Jackal&lt;br /&gt;
File:Kinova_KG-3_Gripper.jpg|120px|thumb|Kinova 3-finger gripper&lt;br /&gt;
File:Xarm7.jpg|120px|thumb|UFACTORY xArm7&lt;br /&gt;
File:Robotont_banner.png|120px|thumb|robotont&lt;br /&gt;
File:Turtlebot3-waffle-pi.jpg|thumb|TurtleBot3&lt;br /&gt;
File:Parrot-bebop-2.jpg|thumb|Parrot Bebop 2&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== ROBOTONT: Docker-Driven ROS Environment Switching ===&lt;br /&gt;
This thesis focuses on integrating Docker containers to manage and switch between different ROS (Robot Operating System) environments on Robotont. Currently, ROS is installed natively on Robotont's Ubuntu-based onboard computer, limiting flexibility in system recovery and configuration switching. The goal is to develop a Docker-based solution that allows users to switch between ROS environments directly from Robotont’s existing low-level menu interface.&lt;br /&gt;
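As a rough illustration of the intended mechanism, the sketch below maps menu entries to `docker run` invocations that a low-level menu could execute. The image names, tags, and device path are hypothetical placeholders for this sketch, not an existing Robotont setup.

```python
# Hypothetical mapping from Robotont menu entries to pre-built ROS images.
# Image names and the serial device path are illustrative assumptions.
ENVIRONMENTS = {
    "ros-noetic": "robotont/ros:noetic",
    "ros2-humble": "robotont/ros2:humble",
}

def docker_command(env_name):
    """Build the docker run invocation for the chosen ROS environment."""
    image = ENVIRONMENTS[env_name]
    return [
        "docker", "run", "--rm", "-it",
        "--net=host",             # share ROS networking with the host
        "--device=/dev/ttyACM0",  # serial link to the motor driver board
        image,
    ]

print(" ".join(docker_command("ros2-humble")))
```

Because each environment lives in its own image, "switching" reduces to stopping one container and starting another, and a broken configuration can be recovered by simply re-pulling the image.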
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: designing and implementing a communication protocol for additional devices ===&lt;br /&gt;
Robotont currently includes a designated area on its front for attaching additional devices, such as an ultrasonic range finder, a servo motor connected to an Arduino, or a MikroBUS device. This thesis aims to design and implement a standard communication protocol that will enable seamless integration and control of these devices through the high-level ROS framework running on Robotont’s onboard computer.&lt;br /&gt;
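One possible direction, sketched under stated assumptions: a byte-oriented frame with a start byte, device id, payload length, and XOR checksum. The frame layout and byte values below are illustrative only; designing the actual protocol is the subject of the thesis.

```python
# Hedged sketch of one possible add-on device framing:
# [start][device id][length][payload...][xor checksum]
START = 0xAA  # arbitrary start-of-frame marker (assumption)

def encode(device_id, payload):
    """Frame a payload for one add-on device with a simple XOR checksum."""
    body = bytes([device_id, len(payload)]) + bytes(payload)
    checksum = 0
    for b in body:
        checksum ^= b
    return bytes([START]) + body + bytes([checksum])

def decode(frame):
    """Validate and unpack a frame; returns (device_id, payload)."""
    if frame[0] != START:
        raise ValueError("bad start byte")
    body, checksum = frame[1:-1], frame[-1]
    calc = 0
    for b in body:
        calc ^= b
    if calc != checksum:
        raise ValueError("checksum mismatch")
    device_id, length = body[0], body[1]
    payload = body[2:2 + length]
    if len(payload) != length:
        raise ValueError("short frame")
    return device_id, list(payload)

frame = encode(3, [10, 20])   # e.g. a servo command to device id 3
assert decode(frame) == (3, [10, 20])
```

On the ROS side, a node could wrap `encode`/`decode` around the serial port so each attached device appears as its own topic or service.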
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: analysis of different options as on-board computers ===&lt;br /&gt;
Currently, ROBOTONT uses an Intel NUC as its onboard computer. The goal of this thesis is to validate Robotont's software stack on alternative compute devices (e.g. Raspberry Pi 4, Intel Compute Stick, and NVIDIA Jetson Nano) and benchmark their performance for the most popular Robotont use-case demos (e.g. webapp teleop, ar-tag steering, 2D mapping, and 3D mapping). The objective is to propose at least two different compute solutions: one that optimizes for cost and another that optimizes for performance.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT Lite ===&lt;br /&gt;
The goal of this thesis is to optimize the ROBOTONT platform for cost by replacing the onboard computer and sensors with low-cost alternatives while ensuring ROS/ROS2 software compatibility.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: integrating a graphical programming interface ===&lt;br /&gt;
The goal of this thesis is to integrate a graphical programming solution (e.g. Scratch or Blockly) to enable programming of Robotont by non-experts. See, e.g., https://doi.org/10.48550/arXiv.2011.13706&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== SemuBOT: multiple topics ===&lt;br /&gt;
In 2023/2024, many topics are offered in support of the open-source humanoid robot project SemuBOT (https://www.facebook.com/semubotmtu/).&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Robotic Study Companion: a social robot for students in higher education ===&lt;br /&gt;
Potential Topics:&lt;br /&gt;
* Enhance the Robot's Speech/Natural Language Capabilities &lt;br /&gt;
* Build a Local Language Model for the RSC&lt;br /&gt;
* Develop and Program the Robot’s Behavior and Personality&lt;br /&gt;
* Build a Digital Twin Simulation for Multimodal Interaction&lt;br /&gt;
* Explore the Use of the RSC as an Affective Robot to Address Students’ Academic Emotions&lt;br /&gt;
* Explore and Implement Cybersecurity Measures for a Social Robot&lt;br /&gt;
&lt;br /&gt;
[https://github.com/orgs/RobotStudyCompanion/discussions/3 More info on Github] | reach out to farnaz.baksh@ut.ee for more info&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: Quick launch demo suite and the final complete release of its ROS1 software ===&lt;br /&gt;
The goal of this thesis is to refine existing Robotont demos (e.g. ar-tag steering, follow-the-leader, dancing-with-robot, LEAP-based control, etc.) and package them in an easy-to-use way for quick deployment by anyone during public events such as science popularization workshops and school visits.&lt;br /&gt;
The results of this work will be packaged as the final ROS1 release of Robotont software, as ROS1 reaches its end of life (EOL) in May 2025.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: ROS support, demos, and educational materials for open-source mobile robot ===&lt;br /&gt;
[[Image:RosLarge.png|left|100px|ROS]]&lt;br /&gt;
The project involves many potential thesis topics on the open-source robot platform ROBOTONT. The nature of the thesis can be software development to improve the platform's capabilities, simulation of specific scenarios, and/or demonstration of ROBOTONT in a real-life setting. A more detailed thesis topic will be outlined during an in-person meeting.&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
[[Image:Ros_equation.png|x100px|What is ROS?]]&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Virtual reality user interface (VRUI) for intuitive teleoperation system ===&lt;br /&gt;
[[Image:LeapMotion.png|200px|thumb|right|Detecting 2 hands with Leap Motion Controller]]&lt;br /&gt;
Enhancing the user-experience of a virtual reality UI developed by [https://github.com/ut-ims-robotics/vrui_rviz Georg Astok]. Potentially adding [http://www.osvr.org/hardware-devs.html virtual reality capability] to a [https://www.youtube.com/watch?v=L25HHFd00rc gesture- and natural-language-based robot teleoperation system].&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
[[Image:Temoto-working-principle-in-pics.png|x160px|Gesture-based teleoperation]]&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Health monitor for intuitive telerobot ===&lt;br /&gt;
Intelligent status and error handling for an intuitive telerobotic system.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== 3D scanning of industrial objects ===&lt;br /&gt;
Using laser sensors and cameras to create accurate models of industrial products for quality control or further processing.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Modeling humans for human-robot interaction ===&lt;br /&gt;
True human-robot collaboration means that the robot must understand the actions, intention, and state of its human partner. This work involves using cameras and other human-monitoring sensors to digitally represent and model humans. There are multiple stages of modeling: a) physical models of human kinematics and dynamics; b) higher-level models for recognizing human intent.&amp;lt;br&amp;gt;&lt;br /&gt;
[[Image:Skeletal ROS.PNG|x160px|ROS &amp;amp; Kinect &amp;amp; Skeleton-Markers Package]]&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Robotic avatar for telepresence ===&lt;br /&gt;
Integrating hand gestures and head movements to control a robot avatar in a virtual reality user interface.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Detection of hardware and software resources for smart integration of robots ===&lt;br /&gt;
The vast majority of today’s robotic applications rely on hard-coded device and algorithm usage. This project focuses on developing Resource Snooper software that can detect the addition or removal of resources, enabling dynamic reconfiguration of robotic systems. This project is developed as a subsystem of [https://utnuclearroboticspublic.github.io/temoto2/index.html TeMoto].&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Sonification of feedback during teleoperation of robots ===&lt;br /&gt;
Humans are used to receiving auditory feedback in their everyday lives. It helps us make decisions and be aware of potential dangers. Telerobotic interfaces can deploy the same idea to improve situational awareness and robotic task efficiency. The thesis project involves a study of different sonification solutions and their implementation in a telerobotic application using ROS.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Human-Robot and Robot-Robot collaboration applications ===&lt;br /&gt;
Creating a demo or analysis of software capabilities related to human-robot or robot-robot teams:&lt;br /&gt;
* human-robot collaborative assembly&lt;br /&gt;
* distributed mapping; analysis and demo of existing ROS (e.g., segmap https://youtu.be/JJhEkIA1xSE) packages for multi-robot mapping&lt;br /&gt;
* Inaccessible region teamwork&lt;br /&gt;
**youbot+drone - a drone maps the environment (for example, a maze) and a ground vehicle uses this information to traverse it&lt;br /&gt;
**youbot+robotont - the youbot cannot climb ledges, but it can lift a smaller robot, such as robotont, up a ledge.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Mirroring human hand movements on industrial robots ===&lt;br /&gt;
The goal of this project is to integrate continuous control of an industrial robot manipulator with a gestural telerobotics interface. The recommended tools for this thesis project are a Leap Motion Controller or a standard web camera, Ultraleap, a Universal Robots UR5 manipulator, and ROS.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== ROBOTONT: TeMoto for robotont ===&lt;br /&gt;
Swarm-management and UMRF-based task loading for robotont using [https://github.com/temoto-framework TeMoto] framework.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Enhancing teleoperation control interface with augmented cues to provoke caution ===&lt;br /&gt;
The task is to create a telerobot control interface where the video feed from the remote site and/or a mixed-reality scene is augmented with visual cues to provoke caution in the human operator.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Robot-to-human interaction ===&lt;br /&gt;
As robots and autonomous machines start sharing the same space as humans, their actions need to be understood by the people occupying that space. For instance, a human worker needs to understand what the robot partner is planning next, or a pedestrian needs to clearly comprehend the behaviour of a driverless vehicle. To reduce this ambiguity, the robot needs mechanisms to convey its intent (what it is going to do next). The aim of the thesis is to outline existing methods for machines to convey their intent and to develop a unified model interface for expressing that intent.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Gaze-based handover prediction ===&lt;br /&gt;
When a human needs to pass an object to a robot manipulator, the robot must understand where in 3D space the object handover occurs and then plan an appropriate motion. Human gaze can be used as the input for predicting which object to track. The thesis activities involve camera-based eye tracking and safe motion planning.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Real-world demonstrator for MIR+UR+TeMoto integration  ===&lt;br /&gt;
Integration of mobile manipulator (MIR100 + UR5e + Robotiq gripper) to demonstrate TeMoto in a collaborative application.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== ROBOTONT: Human-height human-robot interface for Robotont ground robot ===&lt;br /&gt;
Robotont is an ankle-high flat mobile robot. For humans to interact with Robotont, there is a need for a compact and lightweight mechanical structure that is tall enough for comfortable human-robot interaction. The objective of the thesis is to develop the mechanical designs and build prototypes that ensure stable operation and meet the aesthetic requirements for use in public places. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Stratos Explore Ultraleap demonstrator for robotics ===&lt;br /&gt;
The aim of this thesis is to systematically analyze the strengths and limitations of the Stratos Explore Ultraleap device in the context of controlling a robot, and subsequently to implement a demonstrator application showcasing its applicability for robot control. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Mixed-reality scene creation for vehicle teleoperation ===&lt;br /&gt;
Fusing different sensory feeds to create a high-usability teleoperation scene.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Validation study for AR-based robot user-interfaces ===&lt;br /&gt;
Designing and carrying out a user study to validate the functionality and usability of a human-robot interface.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Replication of the MIT Hydra demo ===&lt;br /&gt;
The goal of the thesis is to use the Hydra software package and integrate it with a robot used at IMS Robotics (e.g., TIAGo, Jackal, Robotont).&lt;br /&gt;
&amp;lt;br&amp;gt;Hydra takes sensor data as input (stereo or RGB-D camera images and IMU data) and produces a hierarchical model of the environment by estimating the trajectory of the robot (including loop closures), building a metric-semantic 3D mesh, and segmenting objects, places, and rooms in an indoor environment. These representations are combined into a 3D scene graph, which enables novel approaches for hierarchical loop closure detection and ensures the representations remain consistent after loop closure. Hydra is implemented in C++ and is ROS-compatible. It uses a neural-network-based image segmentation front-end, but otherwise relies on an efficient, multi-threaded CPU-based implementation, which is suitable for mobile robot deployment.&lt;br /&gt;
&amp;lt;br&amp;gt;LINKS:&lt;br /&gt;
&amp;lt;br&amp;gt;Video: https://youtu.be/qZg2lSeTuvM&lt;br /&gt;
&amp;lt;br&amp;gt;Code: https://github.com/MIT-SPARK/Hydra&lt;br /&gt;
&amp;lt;br&amp;gt;Paper: http://www.roboticsproceedings.org/rss18/p050.pdf&lt;br /&gt;
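The layered scene-graph idea above can be pictured with a toy sketch (plain Python; this is not Hydra's actual API, and the node and layer names are purely illustrative):

```python
# Toy illustration of a layered 3D scene graph (objects, places, rooms),
# loosely mirroring the hierarchy Hydra builds. Not Hydra's actual API.
from dataclasses import dataclass, field

@dataclass
class Node:
    name: str
    layer: str                      # "object", "place", or "room"
    children: list = field(default_factory=list)

    def add(self, child):
        """Attach a child node and return it for chaining."""
        self.children.append(child)
        return child

def nodes_in_layer(root, layer):
    """Collect node names in a given layer by depth-first traversal."""
    found, stack = [], [root]
    while stack:
        node = stack.pop()
        if node.layer == layer:
            found.append(node.name)
        stack.extend(node.children)
    return found

# A small indoor hierarchy: a room contains places, places contain objects.
kitchen = Node("kitchen", "room")
counter = kitchen.add(Node("counter_area", "place"))
counter.add(Node("mug", "object"))
counter.add(Node("kettle", "object"))

print(sorted(nodes_in_layer(kitchen, "object")))  # ['kettle', 'mug']
```

In Hydra the layers are additionally tied to metric geometry and updated online as loop closures revise the robot's trajectory; this sketch only shows the hierarchical bookkeeping.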
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Quantitative Evaluation of the Situation Awareness during the Teleoperation of an Urban Vehicle ===&lt;br /&gt;
The goal of this thesis is to develop the methodology for measuring the operator's situation awareness while operating an autonomous urban vehicle. Potential metrics could include driving accuracy, speed, and latency. This thesis will be conducted as part of the activities at the [https://adl.cs.ut.ee Autonomous Driving Lab].&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== NAO 'flippin' ===&lt;br /&gt;
The University of Tartu has a set of [https://www.aldebaran.com/en/nao NAO robots], which have been lying on shelves for some years now. It is time to make these robots into functional social robots that can be used, e.g., for children's communication therapy or as buddies for the elderly. The goal of this thesis is to test and upgrade all the NAO robots to functional condition.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Sign-language-based control for robots ===&lt;br /&gt;
The goal of this thesis is to design and implement the means for interacting with a robot via conventional sign language, i.e., the human can command a robot by using sign language.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
= Completed projects =&lt;br /&gt;
== PhD theses ==&lt;br /&gt;
*Robert Valner, [https://hdl.handle.net/10062/105994 Design of TeMoto, a software framework for dependable, adaptive, and collaborative autonomous robots] [TeMoto – töökindlate, adaptiivsete ja koostöövõimeliste autonoomsete robotite arendamise tarkvararaamistik], PhD thesis, 2024&lt;br /&gt;
*Houman Masnavi, [https://hdl.handle.net/10062/91394 Visibility aware navigation] [Nähtavust arvestav navigatsioon], PhD thesis, 2023&lt;br /&gt;
&lt;br /&gt;
== Master's theses ==&lt;br /&gt;
*Robert Allik, Validation of NoMaD as a Global Planner for Mobile Robots, MS thesis, 2024&lt;br /&gt;
*Rauno Põlluäär, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=79393 Designing and Implementing a Bird’s-eye View Interface for a Self-driving Vehicle’s Teleoperation System] [Isejuhtiva sõiduki kaugjuhtimissüsteemile linnuvaate kasutajaliidese loomine], MS thesis, 2024&lt;br /&gt;
*Eva Mõtshärg, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=77385 3D-prinditava kere disain ja analüüs vabavaralisele haridusrobotile Robotont] [Design and Analysis of a 3D Printable Chassis for the Open Source Educational Robot Robotont], MS thesis, 2023&lt;br /&gt;
*Farnaz Baksh, [https://dspace.ut.ee/items/a7a9cc15-27e9-450c-94e8-3267f0c95c56 An Open-source Robotic Study Companion for University Students] [Avatud lähtekoodiga robotõpikaaslane üliõpilastele], MS thesis, 2023&lt;br /&gt;
*Igor Rybalskii, [http://hdl.handle.net/10062/83028 Augmented reality (AR) for enabling human-robot collaboration with ROS robots] [Liitreaalsus inimese ja roboti koostöö võimaldamiseks ROS-i robotitega], MS thesis, 2022&lt;br /&gt;
*Md. Maniruzzaman, [http://hdl.handle.net/10062/83025 Object search and retrieval in indoor environment using a Mobile Manipulator] [Objektide otsimine ja teisaldamine siseruumides mobiilse manipulaatori abil], MS thesis, 2022&lt;br /&gt;
*Allan Kustavus, [http://hdl.handle.net/10062/72651 Design and Implementation of a Generalized Resource Management Architecture in the TeMoto Software Framework] [Üldise ressursihalduri disain ja teostus TeMoto tarkvara raamistikule], MS thesis, 2021&lt;br /&gt;
*Kristina Meister, [http://hdl.handle.net/10062/72350 External human-vehicle interaction - a study in the context of an autonomous ride-hailing service], MS thesis, 2021&lt;br /&gt;
*Muhammad Usman, [http://hdl.handle.net/10062/72126 Development of an Optimization-Based Motion Planner and Its ROS Interface for a Non-Holonomic Mobile Manipulator] [Optimeerimisele baseeruva liikumisplaneerija arendamine ja selle ROSi liides mitteholonoomse mobiilse manipulaatori jaoks], MS thesis, 2020&lt;br /&gt;
*Maarika Oidekivi, [http://hdl.handle.net/10062/72119 Masina kavatsuse väljendamine ja tõlgendamine] [Communicating and interpreting machine intent], MS thesis, 2020&lt;br /&gt;
*Houman Masnavi, [http://hdl.handle.net/10062/72118 Multi-Robot Motion Planning for Shared Payload Transportation] [Rajaplaneerimine multi-robot süsteemile jagatud lasti transportimisel], MS thesis, 2020&lt;br /&gt;
*Fabian Ernesto Parra Gil, [http://hdl.handle.net/10062/72112 Implementation of Robot Manager Subsystem for Temoto Software Framework] [Robotite Halduri alamsüsteemi väljatöötamine tarkvararaamistikule TEMOTO], MS thesis, 2020&lt;br /&gt;
*Zafarullah, [http://hdl.handle.net/10062/72125 Gaze Assisted Neural Network based Prediction of End-Point of Human Reaching Trajectories], MS thesis, 2020&lt;br /&gt;
*Madis K Nigol, [http://hdl.handle.net/10062/64339 Õppematerjalid robotplatvormile Robotont] [Study materials for robot platform Robotont], MS thesis, 2019&lt;br /&gt;
*Renno Raudmäe, [http://hdl.handle.net/10062/64341 Avatud robotplatvorm Robotont] [Open source robotics platform Robotont], MS thesis, 2019&lt;br /&gt;
*Asif Sattar, [http://hdl.handle.net/10062/64352 Human detection and distance estimation with monocular camera using YOLOv3 neural network] [Inimeste tuvastamine ning kauguse hindamine kasutades kaamerat ning YOLOv3 tehisnärvivõrku], MS thesis, 2019&lt;br /&gt;
*Ragnar Margus, [http://hdl.handle.net/10062/64337 Kergliiklusvahendite jagamisteenuseks vajaliku positsioneerimismooduli loomine ja uurimine] [Development and testing of an IoT module for electric vehicle sharing service], MS thesis, 2019&lt;br /&gt;
*Pavel Šumejko, [http://hdl.handle.net/10062/64320 Robust Solution for Extrinsic Calibration of a 2D Laser-Rangefinder and a Monocular USB Camera] [Meetod 2D laserkaugusmõõdiku ja USB-kaamera väliseks kalibreerimiseks], MS thesis, 2019&lt;br /&gt;
*Dzvezdana Arsovska, [http://hdl.handle.net/10062/64321  Building an Efficient and Secure Software Supply Pipeline for Aerial Robotics Application] [Efektiivne ja turvaline tarkvaraarendusahel lennurobootika rakenduses], MS thesis, 2019&lt;br /&gt;
*Tõnis Tiimus, [http://hdl.handle.net/10062/60713 Visuaalselt esilekutsutud potentsiaalidel põhinev aju-arvuti liides robootika rakendustes] [A VEP-based BCI for robotics applications], MS thesis, 2018&lt;br /&gt;
*Martin Appo, [http://hdl.handle.net/10062/60718 Hardware-agnostic compliant control ROS package for collaborative industrial manipulators] [Riistvarapaindlik ROSi tarkvarapakett tööstuslike robotite mööndlikuks juhtimiseks], MS thesis, 2018&lt;br /&gt;
*Hassan Mahmoud Shehawy Elhanash, [http://hdl.handle.net/10062/60711 Optical Tracking of Forearm for Classifying Fingers Poses] [Küünarvarre visuaalne jälgimine ennustamaks sõrmede asendeid], MS thesis, 2018&lt;br /&gt;
&lt;br /&gt;
== Bachelor's theses ==&lt;br /&gt;
*Albert Unn, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=79937 Suhtlusvõimekuse arendamine sotsiaalsele humanoidrobotile SemuBot] [Developing the ability to communicate for SemuBot, a social humanoid robot], BS thesis, 2024&lt;br /&gt;
*Veronika-Marina Volynets, [https://hdl.handle.net/10062/99943 Development of Control Electronics and Program for Robotont's Height Adjustment Mechanism] [Juhtelektroonika ja programmi väljatöötamine Robotondi kõrguse reguleerimise mehhanismile], BS thesis, 2024&lt;br /&gt;
*Nikita Kurenkov, [https://hdl.handle.net/10062/99880 Flexible Screen Integration and Development of Neck Movement Mechanism for Social Humanoid Robot SemuBot] [Humanoidroboti näo lahenduse leidmine ja rakendamine; humanoidroboti jaoks kaela mehhanismi väljatöötamine], BS thesis, 2024&lt;br /&gt;
*Elchin Huseynov, [https://hdl.handle.net/10062/99668 Design and Control of a Social Humanoid Robot - SemuBot’s Hand] [Sotsiaalse Humanoidroboti Disain ja Juhtimine - SemuBoti Käsi], BS thesis, 2024&lt;br /&gt;
*Veronika Podliesnova, [https://hdl.handle.net/10062/99636 Real-Time Detection of Robot Failures by Monitoring Operator’s Brain Activity with EEG-based Brain-Computer Interface] [Reaalajas Robotirikke Tuvastamine Operaatori Ajutegevuse Jälgimise teel EEG-põhise Aju-Arvuti Liidese abil], BS thesis, 2024&lt;br /&gt;
*Märten Josh Peedimaa, [https://hdl.handle.net/not-yet Rhea: An Open-Source Table Tennis Ball Launcher Robot for Multiball Training] [Avatud lähtekoodiga lauatennise palliviske robot mitmikpall treenimiseks], BS thesis, 2024&lt;br /&gt;
*Karl Sander Vinkel, [https://hdl.handle.net/not-yet Laadimisjaama ja transportkesta väljatöötamine õpperobotile Robotont] [Development of charging dock and transport case for Robotont], BS thesis, 2024&lt;br /&gt;
*Raimo Köidam, [https://hdl.handle.net/not-yet Valguslahenduse tarkvara väljatöötamine õpperobotile Robotont] [Development of light solution software for the educational robot Robotont], BS thesis, 2024&lt;br /&gt;
*Robert Valge, [https://hdl.handle.net/not-yet Toitepinge ja tarbevoolu monitoorimine ning toitehalduse püsivara loomine õpperobotil Robotont] [Monitoring supply voltage and current consumption and creating firmware for Robotont’s power management system], BS thesis, 2024&lt;br /&gt;
*Leonid Tšigrinski, [https://hdl.handle.net/not-yet Õpperoboti Robotont püsivara arhitektuuri uuendamine] [Education robot &amp;quot;Robotont&amp;quot; firmware architecture updating], BS thesis, 2024&lt;br /&gt;
*Andres Sakk, [https://hdl.handle.net/not-yet Avatud robotplatvormi Robotont 3 kasutajaliidese väljatöötamine] [Development of a user interface for the open robotics platform Robotont 3], BS thesis, 2024&lt;br /&gt;
*Kaur Kullamäe, [https://hdl.handle.net/not-yet Elektroonikalahendus ja püsivara robotkäe juhtimiseks sotsiaalsel humanoidrobotil SemuBot] [Building the SemuBot arm electronics system], BS thesis, 2024&lt;br /&gt;
*Kristjan Madis Kask, [https://hdl.handle.net/not-yet Käe mehaanika disain sotsiaalsele humanoidrobotile SemuBot] [Arm mechanics design for social humanoid robot SemuBot], BS thesis, 2024&lt;br /&gt;
*Sven-Ervin Paap, [https://hdl.handle.net/not-yet ROS2 draiver õpperobotile Robotont] [ROS2 driver for the educational robot Robotont], BS thesis, 2024&lt;br /&gt;
*Timur Nizamov, [https://hdl.handle.net/10062/99592 Audio System for the Social Humanoid Robot SemuBot] [Helisüsteem sotsiaalsele humanoidrobotile SemuBot], BS thesis, 2024&lt;br /&gt;
*Georgs Narbuts, [https://hdl.handle.net/10062/99554 Holonomic Motion Drive System of a Social Humanoid Robot SemuBot] [Sotsiaalse humanoidroboti SemuBoti holonoomne liikuv ajamisüsteem], BS thesis, 2024&lt;br /&gt;
*Iryna Hurova, [https://hdl.handle.net/10062/90638 Kitting station of the learning factory] [Õppetehase komplekteerimisjaam], BS thesis, 2023&lt;br /&gt;
*Paola Avalos Conchas, [https://hdl.handle.net/10062/90641 Payload transportation system of a learning factory] [Õppetehase kasuliku koorma transpordisüsteem], BS thesis, 2023&lt;br /&gt;
*Pille Pärnalaas, [https://hdl.handle.net/10062/93431 Pöördpõik ajami arendus robotplatvormile Robotont] [Development of swerve drive for robotic platform Robotont], BS thesis, 2023&lt;br /&gt;
*Priit Rooden, [https://hdl.handle.net/10062/93432 Autonoomse laadimislahenduse väljatöötamine õpperobotile Robotont] [Development of an autonomous charging solution for the robot platform Robotont], BS thesis, 2023&lt;br /&gt;
*Marko Muro, [https://hdl.handle.net/10062/93429 Robotondi akulahenduse ning 12 V pingeregulaatori prototüüpimine] [Prototyping battery solution and 12 V voltage regulator for Robotont], BS thesis, 2023&lt;br /&gt;
*Danel Leppenen, [https://hdl.handle.net/10062/93427 Nav2 PYIF: Python-based Motion Planning for ROS 2 Navigation 2] [Nav 2 PYIF: Pythoni põhine liikumise planeerija ROS 2 Navigation 2-le], BS thesis, 2023&lt;br /&gt;
*Kertrud Geddily Küüt, [https://hdl.handle.net/10062/93424 Kõrgust reguleeriv mehhanism Robotondile] [Height adjusting mechanism for Robotont], BS thesis, 2023&lt;br /&gt;
*Ingvar Drikkit, [https://hdl.handle.net/10062/93421 Lisaseadmete võimekuse arendamine haridusrobotile Robotont] [Developing add-on device support for the educational robot Robotont], BS thesis, 2023&lt;br /&gt;
*Kristo Pool, [https://hdl.handle.net/10062/93430 MoveIt 2 õppematerjalid] [Learning materials for MoveIt 2], BS thesis, 2023&lt;br /&gt;
*Erki Veeväli, [https://hdl.handle.net/10062/93434 Development of a Continuous Teleoperation System for Urban Road Vehicle] [Linnasõiduki pideva kaugjuhtimissüsteemi arendus], BS thesis, 2023&lt;br /&gt;
*Aleksandra Doroshenko, [https://hdl.handle.net/10062/93420 Haiglates inimesi juhatava roboti disain] [Hospital guide robot design], BS thesis, 2023&lt;br /&gt;
*Hui Shi, [http://hdl.handle.net/10062/83043 Expanding the Open-source ROS Software Package opencv_apps with Dedicated Blob Detection Functionality] [Avatud lähtekoodiga ROS-i tarkvarakimbu opencv_apps laiendamine laigutuvasti funktsioonaalsusega], BS thesis, 2022&lt;br /&gt;
*Dāvis Krūmiņš, [http://hdl.handle.net/10062/83040 Web-based learning and software development environment for remote access of ROS robots] [Veebipõhine õppe- ja tarkvaraarenduse keskkond ROS robotite juurdepääsuks kaugteel], BS thesis, 2022&lt;br /&gt;
*Anna Jakovleva, [http://hdl.handle.net/10062/83037 Roboquiz - an interactive human-robot game] [Roboquiz - interaktiivne inimese ja roboti mäng], BS thesis, 2022&lt;br /&gt;
*Kristjan Laht, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=74702&amp;amp;year=2022 Robot Localization with Fiducial Markers] [Roboti lokaliseerimine koordinaatmärkidega], BS thesis, 2022&lt;br /&gt;
*Hans Pärtel Pani, [http://hdl.handle.net/10062/83008 ROS draiver pehmerobootika haaratsile] [ROS Driver for Soft Robotic Gripper], BS thesis, 2022&lt;br /&gt;
*Markus Erik Sügis, [http://hdl.handle.net/10062/83015 Jagatud juhtimise põhimõttel realiseeritud robotite kaugjuhtimissüsteem] [A continuous teleoperating system based on shared control concept], BS thesis, 2022&lt;br /&gt;
*Taaniel Küla, [http://hdl.handle.net/10062/83007 ROS2 platvormile keevitusroboti tarkvara portotüübi loomine kasutades UR5 robot-manipulaatorit] [Welding robot software prototype for ROS2 using UR5 robot arm], BS thesis, 2022&lt;br /&gt;
*Rauno Põlluäär, [http://hdl.handle.net/10062/72672 Veebirakendus-põhine kasutajaliides avatud robotplatvormi Robotont juhtimiseks ja haldamiseks] [Web application-based user interface for controlling and managing open-source robotics platform Robotont], BS thesis, 2021&lt;br /&gt;
*Hendrik Olesk, [http://hdl.handle.net/10062/72664 Nägemisulatuses kaugjuhitava mobiilse robotmanipulaatori kasutajamugavuse tõstmine] [Improving the usability of a mobile manipulator robot for line-of-sight remote control], BS thesis, 2021&lt;br /&gt;
*Tarvi Tepandi, [http://hdl.handle.net/10062/72665 Segareaalsusel põhinev kasutajaliides mobiilse roboti kaugjuhtimiseks Microsoft HoloLens 2 vahendusel] [Mixed-reality user interface for teleoperating mobile robots with Microsoft HoloLens 2], BS thesis, 2021&lt;br /&gt;
*Rudolf Põldma, [http://hdl.handle.net/10062/72674 Tartu linna Narva maantee ringristmiku digikaksik] [Digital twin for Narva street roundabout in Tartu], BS thesis, 2021&lt;br /&gt;
*Kwasi Akuamoah Boateng, [http://hdl.handle.net/10062/72804 Digital Twin of a Teaching and Learning Robotics Lab] [Robotite õpetamise ja õppimise labori digitaalne kaksik], BS thesis, 2021&lt;br /&gt;
*Karina Sein, [http://hdl.handle.net/10062/72102 Eestikeelse kõnesünteesi võimaldamine robootika arendusplatvormil ROS] [Enabling Estonian speech synthesis on the Robot Operating System (ROS)], BS thesis, 2020&lt;br /&gt;
*Ranno Mäesepp, [http://hdl.handle.net/10062/72100 Takistuste vältimise lahendus õpperobotile Robotont] [Obstacle avoidance solution for educational robot platform Robotont], BS thesis, 2020&lt;br /&gt;
*Igor Rybalskii, [http://hdl.handle.net/10062/72060 Gesture Detection Software for Human-Robot Collaboration] [Žestituvastus tarkvara inimese ja roboti koostööks], BS thesis, 2020&lt;br /&gt;
*Meelis Pihlap, [http://hdl.handle.net/10062/64292 Mitme roboti koostöö funktsionaalsuste väljatöötamine tarkvararaamistikule TeMoto] [Multi-robot collaboration functionalities for robot software development framework TeMoto], BS thesis, 2019&lt;br /&gt;
*Kaarel Mark, [http://hdl.handle.net/10062/64290 Liitreaalsuse kasutamine tootmisprotsessis asukohtade määramisel] [Augmented reality for location determination in manufacturing], BS thesis, 2019&lt;br /&gt;
*Kätriin Julle, [http://hdl.handle.net/10062/64279 Roboti KUKA youBot riistvara ja ROS-tarkvara uuendamine] [Upgrading robot KUKA youBot’s hardware and ROS-software], BS thesis, 2019&lt;br /&gt;
*Georg Astok, [http://hdl.handle.net/10062/64274 Roboti juhtimine virtuaalreaalsuses kasutades ROS-raamistikku] [Creating virtual reality user interface using only ROS framework], BS thesis, 2019&lt;br /&gt;
*Martin Hallist, [http://hdl.handle.net/10062/64275 Robotipõhine kaugkohalolu käte liigutuste ülekandmiseks] [Teleoperation robot for arms motions], BS thesis, 2019&lt;br /&gt;
*Ahmed Hassan Helmy Mohamed, [http://hdl.handle.net/10062/63946 Software integration of autonomous robot system for mixing and serving drinks] [Jooke valmistava ja serveeriva robotsüsteemi tarkvaralahendus], BS thesis, 2019&lt;br /&gt;
*Kristo Allaje, [http://hdl.handle.net/10062/60294 Draiveripakett käte jälgimise seadme Leap Motion™ kontroller kasutamiseks robootika arendusplatvormil ROS] [Driver package for using Leap Motion™ controller on the robotics development platform ROS], BS thesis, 2018&lt;br /&gt;
*Martin Maidla, [http://hdl.handle.net/10062/60288 Avatud robotiarendusplatvormi Robotont omniliikumise ja odomeetria arendamine] [Omnimotion and odometry development for open robot development platform Robotont], BS thesis, 2018&lt;br /&gt;
*Raid Vellerind, [http://hdl.handle.net/10062/56559 Avatud robotiarendusplatvormi ROS võimekuse loomine Tartu Ülikooli Robotexi robootikaplatvormile] [ROS driver development for the University of Tartu’s Robotex robotics platform], BS thesis, 2017&lt;br /&gt;
&lt;br /&gt;
[[Category:Theses Topics]]&lt;/div&gt;</summary>
		<author><name>Karl</name></author>
	</entry>
	<entry>
		<id>https://ims.ut.ee/index.php?title=Theses_in_Robotics&amp;diff=41499</id>
		<title>Theses in Robotics</title>
		<link rel="alternate" type="text/html" href="https://ims.ut.ee/index.php?title=Theses_in_Robotics&amp;diff=41499"/>
		<updated>2025-02-24T22:43:25Z</updated>

		<summary type="html">&lt;p&gt;Karl: /* List of potential thesis topics */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:IMS-robotics]]&lt;br /&gt;
&amp;lt;div class=&amp;quot;toclimit-3&amp;quot;&amp;gt;__TOC__&amp;lt;/div&amp;gt; &lt;br /&gt;
= Projects in Advanced Robotics =&lt;br /&gt;
''The main objective of the following projects is to give students experience in working with advanced robotics technology. Our group is active in several R&amp;amp;D projects involving human-robot collaboration, intuitive teleoperation of robots, and autonomous navigation of unmanned mobile platforms. Our main software platforms are [http://www.ros.org/ Robot Operating System (ROS)] for developing software for advanced robot systems and [http://gazebosim.org/ Gazebo] for running realistic robotic simulations.''&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
For further information, contact [[User:Karl|Karl Kruusamäe]]. &lt;br /&gt;
&lt;br /&gt;
The following is not an exhaustive list of all available thesis/research topics.&lt;br /&gt;
&lt;br /&gt;
== Highlighted theses topics for 2024/2025 study year ==&lt;br /&gt;
# [[#ROBOTONT: analysis of different options as on-board computers|ROBOTONT: analysis of different options as on-board computers]]&lt;br /&gt;
# [[#SemuBOT: multiple topics|SemuBOT: multiple topics]]&lt;br /&gt;
# [[#ROBOTONT: integrating a graphical programming interface|ROBOTONT: integrating a graphical programming interface]]&lt;br /&gt;
# [[#Robotic Study Companion: a social robot for students in higher education|Robotic Study Companion: a social robot for students in higher education]]&lt;br /&gt;
# [[#Quantitative Evaluation of the Situation Awareness during the Teleoperation of an Urban Vehicle|Quantitative Evaluation of the Situation Awareness during the Teleoperation of an Urban Vehicle]]&lt;br /&gt;
# [[#Mixed-reality scene creation for vehicle teleoperation|Mixed-reality scene creation for vehicle teleoperation]]&lt;br /&gt;
# [[#NAO 'flippin'|NAO 'flippin']]&lt;br /&gt;
&lt;br /&gt;
== List of potential thesis topics ==&lt;br /&gt;
Our inventory includes but is not limited to:&lt;br /&gt;
&amp;lt;gallery mode=&amp;quot;packed&amp;quot;&amp;gt;&lt;br /&gt;
File:Youbot.png|120px|thumb|KUKA youBot&lt;br /&gt;
File:Ur5_left.png|120px|thumb|Universal Robot UR5&lt;br /&gt;
File:Franka_Emika_Panda.jpg|120px|thumb|Franka Emika Panda&lt;br /&gt;
File:Clearpath_Jackal.jpg|120px|thumb|Clearpath Jackal&lt;br /&gt;
File:Kinova_KG-3_Gripper.jpg|120px|thumb|Kinova 3-finger gripper&lt;br /&gt;
File:Xarm7.jpg|120px|thumb|UFACTORY xArm7&lt;br /&gt;
File:Robotont_banner.png|120px|thumb|robotont&lt;br /&gt;
File:Turtlebot3-waffle-pi.jpg|thumb|TurtleBot3&lt;br /&gt;
File:Parrot-bebop-2.jpg|thumb|Parrot Bebop 2&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== ROBOTONT: Docker-Driven ROS Environment Switching ===&lt;br /&gt;
This thesis focuses on integrating Docker containers to manage and switch between different ROS (Robot Operating System) environments on Robotont. Currently, ROS is installed natively on Robotont's Ubuntu-based onboard computer, limiting flexibility in system recovery and configuration switching. The goal is to develop a Docker-based solution that allows users to switch between ROS environments directly from Robotont’s existing low-level menu interface.&lt;br /&gt;
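As a rough illustration of the idea, the menu could map environment names to container images and build the corresponding launch command (a minimal sketch: the image tags are publicly published ROS Docker images, but the flags and the wiring to Robotont's low-level menu are assumptions):

```python
# Hypothetical sketch: building docker run commands for selectable ROS
# environments. The image names are public ROS images on Docker Hub; the
# mapping to Robotont's menu interface is an assumption for illustration.
ROS_ENVIRONMENTS = {
    "noetic": "osrf/ros:noetic-desktop-full",   # ROS 1
    "humble": "osrf/ros:humble-desktop",        # ROS 2
}

def docker_command(env_name):
    """Return the docker run command for the selected ROS environment."""
    image = ROS_ENVIRONMENTS[env_name]
    # --net=host keeps ROS networking simple on the robot's own computer.
    return ["docker", "run", "--rm", "-it", "--net=host", image, "bash"]

print(" ".join(docker_command("humble")))
```

Switching environments then amounts to stopping one container and starting another, leaving the host installation untouched for easy recovery.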
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: designing and implementing a communication protocol for additional devices ===&lt;br /&gt;
Robotont currently includes a designated area on its front for attaching additional devices, such as an ultrasonic range finder, a servo motor connected to an Arduino, or a MikroBUS device. This thesis aims to design and implement a standard communication protocol that will enable seamless integration and control of these devices through the high-level ROS framework running on Robotont’s onboard computer.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: analysis of different options as on-board computers ===&lt;br /&gt;
Currently, ROBOTONT uses an Intel NUC as an onboard computer. The goal of this thesis is to validate Robotont's software stack on alternative compute devices (e.g. Raspberry Pi 4, Intel Compute Stick, and NVIDIA Jetson Nano) and benchmark their performance for the most popular Robotont use-case demos (e.g. webapp teleop, AR-tag steering, 2D mapping, and 3D mapping). The objective is to propose at least two different compute solutions: one that optimizes for cost and another that optimizes for performance. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT Lite ===&lt;br /&gt;
The goal of this thesis is to optimize the ROBOTONT platform for cost by replacing the onboard computer and sensors with low-cost alternatives while ensuring ROS/ROS2 software compatibility. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: integrating a graphical programming interface ===&lt;br /&gt;
The goal of this thesis is to integrate a graphical programming solution (e.g. Scratch or Blockly) to enable programming of Robotont by non-experts. See, e.g., https://doi.org/10.48550/arXiv.2011.13706&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== SemuBOT: multiple topics ===&lt;br /&gt;
In 2023/2024, multiple topics are offered to support the development of the open-source humanoid robot project SemuBOT (https://www.facebook.com/semubotmtu/)&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Robotic Study Companion: a social robot for students in higher education ===&lt;br /&gt;
Potential Topics:&lt;br /&gt;
* Enhance the Robot's Speech/Natural Language Capabilities &lt;br /&gt;
* Build a Local Language Model for the RSC&lt;br /&gt;
* Develop and Program the Robot’s Behavior and Personality&lt;br /&gt;
* Build a Digital Twin Simulation for Multimodal Interaction&lt;br /&gt;
* Explore the Use of the RSC as an Affective Robot to Address Students’ Academic Emotions&lt;br /&gt;
* Explore and Implement Cybersecurity Measures for Social Robot&lt;br /&gt;
&lt;br /&gt;
[https://github.com/orgs/RobotStudyCompanion/discussions/3 More info on Github] | reach out to farnaz.baksh@ut.ee for more info&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: Quick launch demo suite and the final complete release of its ROS1 software ===&lt;br /&gt;
The goal of this thesis is to refine existing Robotont demos (e.g. AR-tag steering, follow-the-leader, dancing-with-robot, LEAP-based control, etc.) and package them in an easy-to-use way for quick deployment by anyone during public events such as science popularization workshops and school visits.&lt;br /&gt;
The results of this work will be packaged as the final ROS1 release of Robotont software as the EOL for ROS1 is in May 2025.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: ROS support, demos, and educational materials for open-source mobile robot ===&lt;br /&gt;
[[Image:RosLarge.png|left|100px|ROS]]&lt;br /&gt;
The project involves many potential thesis topics on the open-source robot platform ROBOTONT. The nature of the thesis can be software development to improve the platform's capabilities, simulation of specific scenarios, and/or demonstration of ROBOTONT in a real-life setting. A more detailed thesis topic will be outlined during an in-person meeting&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
[[Image:Ros_equation.png|x100px|What is ROS?]]&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Virtual reality user interface (VRUI) for intuitive teleoperation system ===&lt;br /&gt;
[[Image:LeapMotion.png|200px|thumb|right|Detecting 2 hands with Leap Motion Controller]]&lt;br /&gt;
Enhancing the user-experience of a virtual reality UI developed by [https://github.com/ut-ims-robotics/vrui_rviz Georg Astok]. Potentially adding [http://www.osvr.org/hardware-devs.html virtual reality capability] to a [https://www.youtube.com/watch?v=L25HHFd00rc gesture- and natural-language-based robot teleoperation system].&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
[[Image:Temoto-working-principle-in-pics.png|x160px|Gesture-based teleoperation]]&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Health monitor for intuitive telerobot ===&lt;br /&gt;
Intelligent status and error handling for an intuitive telerobotic system.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== 3D scanning of industrial objects ===&lt;br /&gt;
Using laser sensors and cameras to create accurate models of industrial products for quality control or further processing.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Modeling humans for human-robot interaction ===&lt;br /&gt;
True human-robot collaboration means that the robot must understand the actions, intention, and state of its human partner. This work involves using cameras and other human-sensing devices to digitally represent and model humans. There are multiple stages of modeling: a) physical models of human kinematics and dynamics; b) higher-level models for recognizing human intent.&amp;lt;br&amp;gt;&lt;br /&gt;
[[Image:Skeletal ROS.PNG|x160px|ROS &amp;amp; Kinect &amp;amp; Skeleton-Markers Package]]&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Robotic avatar for telepresence ===&lt;br /&gt;
Integrating hand gestures and head movements to control a robot avatar in virtual reality user interface.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Detection of hardware and software resources for smart integration of robots ===&lt;br /&gt;
The vast majority of today’s robotic applications rely on hard-coded device and algorithm usage. This project focuses on developing Resource Snooper software that can detect the addition or removal of resources for the benefit of dynamic reconfiguration of robotic systems. This project is developed as a subsystem of [https://utnuclearroboticspublic.github.io/temoto2/index.html TeMoto].&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Sonification of feedback during teleoperation of robots ===&lt;br /&gt;
Humans are used to receiving auditory feedback in their everyday lives. It helps us make decisions and be aware of potential dangers. Telerobotic interfaces can deploy the same idea to improve situational awareness and robotic task efficiency. The thesis project involves a study of different sonification solutions and their implementation in a telerobotic application using ROS. &lt;br /&gt;
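One of the simplest sonification mappings is pitch as a function of obstacle distance; a minimal sketch follows (the function name, parameter values, and the linear mapping are illustrative assumptions, not taken from any existing teleoperation system):

```python
# Toy sketch of sonifying proximity feedback: map an obstacle distance (m)
# to a beep frequency (Hz), so that closer obstacles yield higher pitch.
# All parameter defaults are illustrative assumptions.
def proximity_to_pitch(distance_m, f_min=220.0, f_max=880.0, d_max=5.0):
    """Linear mapping: at d_max or beyond return f_min; at 0 m return f_max."""
    d = max(0.0, min(distance_m, d_max))  # clamp to the sensed range
    return f_max - (f_max - f_min) * (d / d_max)

print(proximity_to_pitch(0.0))   # 880.0
print(proximity_to_pitch(5.0))   # 220.0
```

In a ROS implementation such a function would sit between a range-sensor topic and an audio output node; the study part of the thesis would compare mappings like this against discrete alerts and spatial audio.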
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Human-Robot and Robot-Robot collaboration applications ===&lt;br /&gt;
Creating a demo or analysis of software capabilities related to human-robot or robot-robot teams:&lt;br /&gt;
* human-robot collaborative assembly&lt;br /&gt;
* distributed mapping; analysis and demo of existing ROS (e.g., segmap https://youtu.be/JJhEkIA1xSE) packages for multi-robot mapping&lt;br /&gt;
* Inaccessible region teamwork&lt;br /&gt;
**youbot+drone - a drone maps the environment (for example, a maze) and a ground vehicle uses this information to traverse the maze&lt;br /&gt;
**youbot+robotont - youbot cannot go up ledges, but it can lift a smaller robot, such as Robotont, up a ledge.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Mirroring human hand movements on industrial robots ===&lt;br /&gt;
The goal of this project is to integrate continuous control of an industrial robot manipulator with a gestural telerobotics interface. The recommended tools for this thesis project are a Leap Motion Controller or a standard web camera, Ultraleap, a Universal Robots UR5 manipulator, and ROS.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== ROBOTONT: TeMoto for robotont ===&lt;br /&gt;
Swarm-management and UMRF-based task loading for robotont using [https://github.com/temoto-framework TeMoto] framework.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Enhancing teleoperation control interface with augmented cues to provoke caution===&lt;br /&gt;
The task is to create a telerobot control interface where the video feed from the remote site and/or a mixed-reality scene is augmented with visual cues to provoke caution in the human operator.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Robot-to-human interaction ===&lt;br /&gt;
As robots and autonomous machines start sharing the same space as humans, their actions need to be understood by the people occupying that space. For instance, a human worker needs to understand what the robot partner is planning next, or a pedestrian needs to clearly comprehend the behaviour of a driverless vehicle. To reduce ambiguity, the robot needs mechanisms to convey its intent (whatever it is going to do next). The aim of the thesis is to outline existing methods for machines to convey their intent and develop a unified model interface for expressing that intent. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Gaze-based handover prediction ===&lt;br /&gt;
When a human needs to pass an object to a robot manipulator, the robot must understand where in 3D space the object handover occurs and then plan an appropriate motion. Human gaze can be used as the input for predicting which object to track. The thesis activities involve camera-based eye tracking and safe motion planning.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Real-world demonstrator for MIR+UR+TeMoto integration  ===&lt;br /&gt;
Integration of mobile manipulator (MIR100 + UR5e + Robotiq gripper) to demonstrate TeMoto in a collaborative application.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== ROBOTONT: Human-height human-robot interface for Robotont ground robot ===&lt;br /&gt;
Robotont is an ankle-high flat mobile robot. For humans to interact with Robotont, there is a need for a compact and lightweight mechanical structure that is tall enough for comfortable human-robot interaction. The objective of the thesis is to develop the mechanical designs and build prototypes that ensure stable operation and meet the aesthetic requirements for use in public places. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Stratos Explore Ultraleap demonstrator for robotics ===&lt;br /&gt;
The aim of this thesis is to systematically analyze the strengths and limitations of the Stratos Explore Ultraleap device in the context of controlling a robot, and subsequently to implement a demonstrator application showcasing its applicability for robot control. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Mixed-reality scene creation for vehicle teleoperation ===&lt;br /&gt;
Fusing different sensory feeds to create a high-usability teleoperation scene.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Validation study for AR-based robot user-interfaces ===&lt;br /&gt;
Designing and carrying out a user study to validate the functionality and usability of a human-robot interface.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Replication of the MIT Hydra demo ===&lt;br /&gt;
The goal of the thesis is to use the Hydra software package and integrate it for a robot used at IMS Robotics (e.g., TIAGo, Jackal, Robotont).&lt;br /&gt;
&amp;lt;br&amp;gt;Hydra takes sensor data as input (stereo or RGB-D camera images and IMU data) and produces a hierarchical model of the environment by estimating the trajectory of the robot (including loop closures), building a metric-semantic 3D mesh, and segmenting objects, places, and rooms in an indoor environment. These representations are combined into a 3D scene graph, which enables novel approaches for hierarchical loop closure detection and ensures the representations remain consistent after loop closure. Hydra is implemented in C++ and is ROS-compatible. It uses a neural-network-based image segmentation front-end, but otherwise relies on an efficient, multi-threaded CPU-based implementation, which is suitable for mobile robot deployment.&lt;br /&gt;
&amp;lt;br&amp;gt;LINKS:&lt;br /&gt;
&amp;lt;br&amp;gt;Video: https://youtu.be/qZg2lSeTuvM&lt;br /&gt;
&amp;lt;br&amp;gt;Code: https://github.com/MIT-SPARK/Hydra&lt;br /&gt;
&amp;lt;br&amp;gt;Paper: http://www.roboticsproceedings.org/rss18/p050.pdf&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Quantitative Evaluation of the Situation Awareness during the Teleoperation of an Urban Vehicle ===&lt;br /&gt;
The goal of this thesis is to develop the methodology for measuring the operator's situation awareness while operating an autonomous urban vehicle. Potential metrics could include driving accuracy, speed, and latency. This thesis will be conducted as part of the activities at the [https://adl.cs.ut.ee Autonomous Driving Lab].&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== NAO 'flippin' ===&lt;br /&gt;
The University of Tartu has a set of [https://www.aldebaran.com/en/nao NAO robots], which have been lying on shelves for some years now. It is time to make these robots into functional social robots that can be used, e.g., for children's communication therapy or as buddies for the elderly. The goal of this thesis is to test and upgrade all the NAO robots to functional condition.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Sign-language-based control for robots ===&lt;br /&gt;
The goal of this thesis is to design and implement the means for interacting with a robot via conventional sign language, i.e., a human can command a robot using sign language.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
= Completed projects =&lt;br /&gt;
== PhD theses ==&lt;br /&gt;
*Robert Valner, [https://hdl.handle.net/10062/105994 Design of TeMoto, a software framework for dependable, adaptive, and collaborative autonomous robots] [TeMoto – töökindlate, adaptiivsete ja koostöövõimeliste autonoomsete robotite arendamise tarkvararaamistik], PhD thesis, 2024&lt;br /&gt;
*Houman Masnavi, [https://hdl.handle.net/10062/91394 Visibility aware navigation] [Nähtavust arvestav navigatsioon], PhD thesis, 2023&lt;br /&gt;
&lt;br /&gt;
== Master's theses ==&lt;br /&gt;
*Robert Allik, Validation of NoMaD as a Global Planner for Mobile Robots, MS thesis, 2024&lt;br /&gt;
*Rauno Põlluäär, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=79393 Designing and Implementing a Bird’s-eye View Interface for a Self-driving Vehicle’s Teleoperation System] [Isejuhtiva sõiduki kaugjuhtimissüsteemile linnuvaate kasutajaliidese loomine], MS thesis, 2024&lt;br /&gt;
*Eva Mõtshärg, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=77385 3D-prinditava kere disain ja analüüs vabavaralisele haridusrobotile Robotont] [Design and Analysis of a 3D Printable Chassis for the Open Source Educational Robot Robotont], MS thesis, 2023&lt;br /&gt;
*Farnaz Baksh, [https://dspace.ut.ee/items/a7a9cc15-27e9-450c-94e8-3267f0c95c56 An Open-source Robotic Study Companion for University Students] [Avatud lähtekoodiga robotõpikaaslane üliõpilastele], MS thesis, 2023&lt;br /&gt;
*Igor Rybalskii, [http://hdl.handle.net/10062/83028 Augmented reality (AR) for enabling human-robot collaboration with ROS robots] [Liitreaalsus inimese ja roboti koostöö võimaldamiseks ROS-i robotitega], MS thesis, 2022&lt;br /&gt;
*Md. Maniruzzaman, [http://hdl.handle.net/10062/83025 Object search and retrieval in indoor environment using a Mobile Manipulator] [Objektide otsimine ja teisaldamine siseruumides mobiilse manipulaatori abil], MS thesis, 2022&lt;br /&gt;
*Allan Kustavus, [http://hdl.handle.net/10062/72651 Design and Implementation of a Generalized Resource Management Architecture in the TeMoto Software Framework] [Üldise ressursihalduri disain ja teostus TeMoto tarkvara raamistikule], MS thesis, 2021&lt;br /&gt;
*Kristina Meister, [http://hdl.handle.net/10062/72350 External human-vehicle interaction - a study in the context of an autonomous ride-hailing service], MS thesis, 2021&lt;br /&gt;
*Muhammad Usman, [http://hdl.handle.net/10062/72126 Development of an Optimization-Based Motion Planner and Its ROS Interface for a Non-Holonomic Mobile Manipulator] [Optimeerimisele baseeruva liikumisplaneerija arendamine ja selle ROSi liides mitteholonoomse mobiilse manipulaatori jaoks], MS thesis, 2020&lt;br /&gt;
*Maarika Oidekivi, [http://hdl.handle.net/10062/72119 Masina kavatsuse väljendamine ja tõlgendamine] [Communicating and interpreting machine intent], MS thesis, 2020&lt;br /&gt;
*Houman Masnavi, [http://hdl.handle.net/10062/72118 Multi-Robot Motion Planning for Shared Payload Transportation] [Rajaplaneerimine multi-robot süsteemile jagatud lasti transportimisel], MS thesis, 2020&lt;br /&gt;
*Fabian Ernesto Parra Gil, [http://hdl.handle.net/10062/72112 Implementation of Robot Manager Subsystem for Temoto Software Framework] [Robotite Halduri alamsüsteemi väljatöötamine tarkvararaamistikule TEMOTO], MS thesis, 2020&lt;br /&gt;
*Zafarullah, [http://hdl.handle.net/10062/72125 Gaze Assisted Neural Network based Prediction of End-Point of Human Reaching Trajectories], MS thesis, 2020&lt;br /&gt;
*Madis K Nigol, [http://hdl.handle.net/10062/64339 Õppematerjalid robotplatvormile Robotont] [Study materials for robot platform Robotont], MS thesis, 2019&lt;br /&gt;
*Renno Raudmäe, [http://hdl.handle.net/10062/64341 Avatud robotplatvorm Robotont] [Open source robotics platform Robotont], MS thesis, 2019&lt;br /&gt;
*Asif Sattar, [http://hdl.handle.net/10062/64352 Human detection and distance estimation with monocular camera using YOLOv3 neural network] [Inimeste tuvastamine ning kauguse hindamine kasutades kaamerat ning YOLOv3 tehisnärvivõrku], MS thesis, 2019&lt;br /&gt;
*Ragnar Margus, [http://hdl.handle.net/10062/64337 Kergliiklusvahendite jagamisteenuseks vajaliku positsioneerimismooduli loomine ja uurimine] [Development and testing of an IoT module for electric vehicle sharing service], MS thesis, 2019&lt;br /&gt;
*Pavel Šumejko, [http://hdl.handle.net/10062/64320 Robust Solution for Extrinsic Calibration of a 2D Laser-Rangefinder and a Monocular USB Camera] [Meetod 2D laserkaugusmõõdiku ja USB-kaamera väliseks kalibreerimiseks], MS thesis, 2019&lt;br /&gt;
*Dzvezdana Arsovska, [http://hdl.handle.net/10062/64321  Building an Efficient and Secure Software Supply Pipeline for Aerial Robotics Application] [Efektiivne ja turvaline tarkvaraarendusahel lennurobootika rakenduses], MS thesis, 2019&lt;br /&gt;
*Tõnis Tiimus, [http://hdl.handle.net/10062/60713 Visuaalselt esilekutsutud potentsiaalidel põhinev aju-arvuti liides robootika rakendustes] [A VEP-based BCI for robotics applications], MS thesis, 2018&lt;br /&gt;
*Martin Appo, [http://hdl.handle.net/10062/60718 Hardware-agnostic compliant control ROS package for collaborative industrial manipulators] [Riistvarapaindlik ROSi tarkvarapakett tööstuslike robotite mööndlikuks juhtimiseks], MS thesis, 2018&lt;br /&gt;
*Hassan Mahmoud Shehawy Elhanash, [http://hdl.handle.net/10062/60711 Optical Tracking of Forearm for Classifying Fingers Poses] [Küünarvarre visuaalne jälgimine ennustamaks sõrmede asendeid], MS thesis, 2018&lt;br /&gt;
&lt;br /&gt;
== Bachelor's theses ==&lt;br /&gt;
*Albert Unn, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=79937 Suhtlusvõimekuse arendamine sotsiaalsele humanoidrobotile SemuBot] [Developing the ability to communicate for SemuBot, a social humanoid robot], BS thesis, 2024&lt;br /&gt;
*Veronika-Marina Volynets, [https://hdl.handle.net/10062/99943 Development of Control Electronics and Program for Robotont's Height Adjustment Mechanism] [Juhtelektroonika ja programmi väljatöötamine Robotondi kõrguse reguleerimise mehhanismile], BS thesis, 2024&lt;br /&gt;
*Nikita Kurenkov, [https://hdl.handle.net/10062/99880 Flexible Screen Integration and Development of Neck Movement Mechanism for Social Humanoid Robot SemuBot] [Humanoidroboti näo lahenduse leidmine ja rakendamine; humanoidroboti jaoks kaela mehhanismi väljatöötamine], BS thesis, 2024&lt;br /&gt;
*Elchin Huseynov, [https://hdl.handle.net/10062/99668 Design and Control of a Social Humanoid Robot - SemuBot’s Hand] [Sotsiaalse Humanoidroboti Disain ja Juhtimine - SemuBoti Käsi], BS thesis, 2024&lt;br /&gt;
*Veronika Podliesnova, [https://hdl.handle.net/10062/99636 Real-Time Detection of Robot Failures by Monitoring Operator’s Brain Activity with EEG-based Brain-Computer Interface] [Reaalajas Robotirikke Tuvastamine Operaatori Ajutegevuse Jälgimise teel EEG-põhise Aju-Arvuti Liidese abil], BS thesis, 2024&lt;br /&gt;
*Märten Josh Peedimaa, [https://hdl.handle.net/not-yet Rhea: An Open-Source Table Tennis Ball Launcher Robot for Multiball Training] [Avatud lähtekoodiga lauatennise palliviske robot mitmikpall treenimiseks], BS thesis, 2024&lt;br /&gt;
*Karl Sander Vinkel, [https://hdl.handle.net/not-yet Laadimisjaama ja transportkesta väljatöötamine õpperobotile Robotont] [Development of charging dock and transport case for Robotont], BS thesis, 2024&lt;br /&gt;
*Raimo Köidam, [https://hdl.handle.net/not-yet Valguslahenduse tarkvara väljatöötamine õpperobotile Robotont] [Development of light solution software for the educational robot Robotont], BS thesis, 2024&lt;br /&gt;
*Robert Valge, [https://hdl.handle.net/not-yet Toitepinge ja tarbevoolu monitoorimine ning toitehalduse püsivara loomine õpperobotil Robotont] [Monitoring supply voltage and current consumption and creating firmware for Robotont’s power management system], BS thesis, 2024&lt;br /&gt;
*Leonid Tšigrinski, [https://hdl.handle.net/not-yet Õpperoboti Robotont püsivara arhitektuuri uuendamine] [Education robot &amp;quot;Robotont&amp;quot; firmware architecture updating], BS thesis, 2024&lt;br /&gt;
*Andres Sakk, [https://hdl.handle.net/not-yet Avatud robotplatvormi Robotont 3 kasutajaliidese väljatöötamine] [Development of a user interface for the open robotics platform Robotont 3], BS thesis, 2024&lt;br /&gt;
*Kaur Kullamäe, [https://hdl.handle.net/not-yet Elektroonikalahendus ja püsivara robotkäe juhtimiseks sotsiaalsel humanoidrobotil SemuBot] [Building the SemuBot arm electronics system], BS thesis, 2024&lt;br /&gt;
*Kristjan Madis Kask, [https://hdl.handle.net/not-yet Käe mehaanika disain sotsiaalsele humanoidrobotile SemuBot] [Arm mechanics design for social humanoid robot SemuBot], BS thesis, 2024&lt;br /&gt;
*Sven-Ervin Paap, [https://hdl.handle.net/not-yet ROS2 draiver õpperobotile Robotont] [ROS2 driver for the educational robot Robotont], BS thesis, 2024&lt;br /&gt;
*Timur Nizamov, [https://hdl.handle.net/10062/99592 Audio System for the Social Humanoid Robot SemuBot] [Helisüsteem sotsiaalsele humanoidrobotile SemuBot], BS thesis, 2024&lt;br /&gt;
*Georgs Narbuts, [https://hdl.handle.net/10062/99554 Holonomic Motion Drive System of a Social Humanoid Robot SemuBot] [Sotsiaalse humanoidroboti SemuBoti holonoomne liikuv ajamisüsteem], BS thesis, 2024&lt;br /&gt;
*Iryna Hurova, [https://hdl.handle.net/10062/90638 Kitting station of the learning factory] [Õppetehase komplekteerimisjaam], BS thesis, 2023&lt;br /&gt;
*Paola Avalos Conchas, [https://hdl.handle.net/10062/90641 Payload transportation system of a learning factory] [Õppetehase kasuliku koorma transpordisüsteem], BS thesis, 2023&lt;br /&gt;
*Pille Pärnalaas, [https://hdl.handle.net/10062/93431 Pöördpõik ajami arendus robotplatvormile Robotont] [Development of swerve drive for robotic platform Robotont], BS thesis, 2023&lt;br /&gt;
*Priit Rooden, [https://hdl.handle.net/10062/93432 Autonoomse laadimislahenduse väljatöötamine õpperobotile Robotont] [Development of an autonomous charging solution for the robot platform Robotont], BS thesis, 2023&lt;br /&gt;
*Marko Muro, [https://hdl.handle.net/10062/93429 Robotondi akulahenduse ning 12 V pingeregulaatori prototüüpimine] [Prototyping battery solution and 12 V voltage regulator for Robotont], BS thesis, 2023&lt;br /&gt;
*Danel Leppenen, [https://hdl.handle.net/10062/93427 Nav2 PYIF: Python-based Motion Planning for ROS 2 Navigation 2] [Nav 2 PYIF: Pythoni põhine liikumise planeerija ROS 2 Navigation 2-le], BS thesis, 2023&lt;br /&gt;
*Kertrud Geddily Küüt, [https://hdl.handle.net/10062/93424 Kõrgust reguleeriv mehhanism Robotondile] [Height adjusting mechanism for Robotont], BS thesis, 2023&lt;br /&gt;
*Ingvar Drikkit, [https://hdl.handle.net/10062/93421 Lisaseadmete võimekuse arendamine haridusrobotile Robotont] [Developing add-on device support for the educational robot Robotont], BS thesis, 2023&lt;br /&gt;
*Kristo Pool, [https://hdl.handle.net/10062/93430 MoveIt 2 õppematerjalid] [Learning materials for MoveIt 2], BS thesis, 2023&lt;br /&gt;
*Erki Veeväli, [https://hdl.handle.net/10062/93434 Development of a Continuous Teleoperation System for Urban Road Vehicle] [Linnasõiduki pideva kaugjuhtimissüsteemi arendus], BS thesis, 2023&lt;br /&gt;
*Aleksandra Doroshenko, [https://hdl.handle.net/10062/93420 Haiglates inimesi juhatava roboti disain] [Hospital guide robot design], BS thesis, 2023&lt;br /&gt;
*Hui Shi, [http://hdl.handle.net/10062/83043 Expanding the Open-source ROS Software Package opencv_apps with Dedicated Blob Detection Functionality] [Avatud lähtekoodiga ROS-i tarkvarakimbu opencv_apps laiendamine laigutuvasti funktsioonaalsusega], BS thesis, 2022&lt;br /&gt;
*Dāvis Krūmiņš, [http://hdl.handle.net/10062/83040 Web-based learning and software development environment for remote access of ROS robots] [Veebipõhine õppe- ja tarkvaraarenduse keskkond ROS robotite juurdepääsuks kaugteel], BS thesis, 2022&lt;br /&gt;
*Anna Jakovleva, [http://hdl.handle.net/10062/83037 Roboquiz - an interactive human-robot game] [Roboquiz - interaktiivne inimese ja roboti mäng], BS thesis, 2022&lt;br /&gt;
*Kristjan Laht, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=74702&amp;amp;year=2022 Robot Localization with Fiducial Markers] [Roboti lokaliseerimine koordinaatmärkidega], BS thesis, 2022&lt;br /&gt;
*Hans Pärtel Pani, [http://hdl.handle.net/10062/83008 ROS draiver pehmerobootika haaratsile] [ROS Driver for Soft Robotic Gripper], BS thesis, 2022&lt;br /&gt;
*Markus Erik Sügis, [http://hdl.handle.net/10062/83015 Jagatud juhtimise põhimõttel realiseeritud robotite kaugjuhtimissüsteem] [A continuous teleoperating system based on shared control concept], BS thesis, 2022&lt;br /&gt;
*Taaniel Küla, [http://hdl.handle.net/10062/83007 ROS2 platvormile keevitusroboti tarkvara portotüübi loomine kasutades UR5 robot-manipulaatorit] [Welding robot software prototype for ROS2 using UR5 robot arm], BS thesis, 2022&lt;br /&gt;
*Rauno Põlluäär, [http://hdl.handle.net/10062/72672 Veebirakendus-põhine kasutajaliides avatud robotplatvormi Robotont juhtimiseks ja haldamiseks] [Web application-based user interface for controlling and managing open-source robotics platform Robotont], BS thesis, 2021&lt;br /&gt;
*Hendrik Olesk, [http://hdl.handle.net/10062/72664 Nägemisulatuses kaugjuhitava mobiilse robotmanipulaatori kasutajamugavuse tõstmine] [Improving the usability of a mobile manipulator robot for line-of-sight remote control], BS thesis, 2021&lt;br /&gt;
*Tarvi Tepandi, [http://hdl.handle.net/10062/72665 Segareaalsusel põhinev kasutajaliides mobiilse roboti kaugjuhtimiseks Microsoft HoloLens 2 vahendusel] [Mixed-reality user interface for teleoperating mobile robots with Microsoft HoloLens 2], BS thesis, 2021&lt;br /&gt;
*Rudolf Põldma, [http://hdl.handle.net/10062/72674 Tartu linna Narva maantee ringristmiku digikaksik] [Digital twin for Narva street roundabout in Tartu], BS thesis, 2021&lt;br /&gt;
*Kwasi Akuamoah Boateng, [http://hdl.handle.net/10062/72804 Digital Twin of a Teaching and Learning Robotics Lab] [Robotite õpetamise ja õppimise labori digitaalne kaksik], BS thesis, 2021&lt;br /&gt;
*Karina Sein, [http://hdl.handle.net/10062/72102 Eestikeelse kõnesünteesi võimaldamine robootika arendusplatvormil ROS] [Enabling Estonian speech synthesis on the Robot Operating System (ROS)], BS thesis, 2020&lt;br /&gt;
*Ranno Mäesepp, [http://hdl.handle.net/10062/72100 Takistuste vältimise lahendus õpperobotile Robotont] [Obstacle avoidance solution for educational robot platform Robotont], BS thesis, 2020&lt;br /&gt;
*Igor Rybalskii, [http://hdl.handle.net/10062/72060 Gesture Detection Software for Human-Robot Collaboration] [Žestituvastus tarkvara inimese ja roboti koostööks], BS thesis, 2020&lt;br /&gt;
*Meelis Pihlap, [http://hdl.handle.net/10062/64292 Mitme roboti koostöö funktsionaalsuste väljatöötamine tarkvararaamistikule TeMoto] [Multi-robot collaboration functionalities for robot software development framework TeMoto], BS thesis, 2019&lt;br /&gt;
*Kaarel Mark, [http://hdl.handle.net/10062/64290 Liitreaalsuse kasutamine tootmisprotsessis asukohtade määramisel] [Augmented reality for location determination in manufacturing], BS thesis, 2019&lt;br /&gt;
*Kätriin Julle, [http://hdl.handle.net/10062/64279 Roboti KUKA youBot riistvara ja ROS-tarkvara uuendamine] [Upgrading robot KUKA youBot’s hardware and ROS-software], BS thesis, 2019&lt;br /&gt;
*Georg Astok, [http://hdl.handle.net/10062/64274 Roboti juhtimine virtuaalreaalsuses kasutades ROS-raamistikku] [Creating virtual reality user interface using only ROS framework], BS thesis, 2019&lt;br /&gt;
*Martin Hallist, [http://hdl.handle.net/10062/64275 Robotipõhine kaugkohalolu käte liigutuste ülekandmiseks] [Teleoperation robot for arms motions], BS thesis, 2019&lt;br /&gt;
*Ahmed Hassan Helmy Mohamed, [http://hdl.handle.net/10062/63946 Software integration of autonomous robot system for mixing and serving drinks] [Jooke valmistava ja serveeriva robotsüsteemi tarkvaralahendus], BS thesis, 2019&lt;br /&gt;
*Kristo Allaje, [http://hdl.handle.net/10062/60294 Draiveripakett käte jälgimise seadme Leap Motion™ kontroller kasutamiseks robootika arendusplatvormil ROS] [Driver package for using Leap Motion™ controller on the robotics development platform ROS], BS thesis, 2018&lt;br /&gt;
*Martin Maidla, [http://hdl.handle.net/10062/60288 Avatud robotiarendusplatvormi Robotont omniliikumise ja odomeetria arendamine] [Omnimotion and odometry development for open robot development platform Robotont], BS thesis, 2018&lt;br /&gt;
*Raid Vellerind, [http://hdl.handle.net/10062/56559 Avatud robotiarendusplatvormi ROS võimekuse loomine Tartu Ülikooli Robotexi robootikaplatvormile] [ROS driver development for the University of Tartu’s Robotex robotics platform], BS thesis, 2017&lt;br /&gt;
&lt;br /&gt;
[[Category:Theses Topics]]&lt;/div&gt;</summary>
		<author><name>Karl</name></author>
	</entry>
	<entry>
		<id>https://ims.ut.ee/index.php?title=Theses_in_Robotics&amp;diff=40603</id>
		<title>Theses in Robotics</title>
		<link rel="alternate" type="text/html" href="https://ims.ut.ee/index.php?title=Theses_in_Robotics&amp;diff=40603"/>
		<updated>2024-12-05T12:49:30Z</updated>

		<summary type="html">&lt;p&gt;Karl: /* PhD theses */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:IMS-robotics]]&lt;br /&gt;
&amp;lt;div class=&amp;quot;toclimit-3&amp;quot;&amp;gt;__TOC__&amp;lt;/div&amp;gt; &lt;br /&gt;
= Projects in Advanced Robotics =&lt;br /&gt;
''The main objective of the following projects is to give students experience in working with advanced robotics technology. Our group is active in several R&amp;amp;D projects involving human-robot collaboration, intuitive teleoperation of robots, and autonomous navigation of unmanned mobile platforms. Our main software platforms are [http://www.ros.org/ Robot Operating System (ROS)] for developing software for advanced robot systems and [http://gazebosim.org/ Gazebo] for running realistic robotic simulations.''&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
For further information, contact [[User:Karl|Karl Kruusamäe]]. &lt;br /&gt;
&lt;br /&gt;
The following is not an exhaustive list of all available thesis/research topics.&lt;br /&gt;
&lt;br /&gt;
== Highlighted thesis topics for 2024/2025 study year ==&lt;br /&gt;
# [[#ROBOTONT: analysis of different options as on-board computers|ROBOTONT: analysis of different options as on-board computers]]&lt;br /&gt;
# [[#SemuBOT: multiple topics|SemuBOT: multiple topics]]&lt;br /&gt;
# [[#ROBOTONT: integrating a graphical programming interface|ROBOTONT: integrating a graphical programming interface]]&lt;br /&gt;
# [[#Robotic Study Companion: a social robot for students in higher education|Robotic Study Companion: a social robot for students in higher education]]&lt;br /&gt;
# [[#Quantitative Evaluation of the Situation Awareness during the Teleoperation of an Urban Vehicle|Quantitative Evaluation of the Situation Awareness during the Teleoperation of an Urban Vehicle]]&lt;br /&gt;
# [[#Mixed-reality scene creation for vehicle teleoperation|Mixed-reality scene creation for vehicle teleoperation]]&lt;br /&gt;
# [[#NAO 'flippin'|NAO 'flippin']]&lt;br /&gt;
&lt;br /&gt;
== List of potential thesis topics ==&lt;br /&gt;
Our inventory includes but is not limited to:&lt;br /&gt;
&amp;lt;gallery mode=&amp;quot;packed&amp;quot;&amp;gt;&lt;br /&gt;
File:Youbot.png|120px|thumb|KUKA youBot&lt;br /&gt;
File:Ur5_left.png|120px|thumb|Universal Robot UR5&lt;br /&gt;
File:Franka_Emika_Panda.jpg|120px|thumb|Franka Emika Panda&lt;br /&gt;
File:Clearpath_Jackal.jpg|120px|thumb|Clearpath Jackal&lt;br /&gt;
File:Kinova_KG-3_Gripper.jpg|120px|thumb|Kinova 3-finger gripper&lt;br /&gt;
File:Xarm7.jpg|120px|thumb|UFACTORY xArm7&lt;br /&gt;
File:Robotont_banner.png|120px|thumb|robotont&lt;br /&gt;
File:Turtlebot3-waffle-pi.jpg|thumb|TurtleBot3&lt;br /&gt;
File:Parrot-bebop-2.jpg|thumb|Parrot Bebop 2&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== ROBOTONT: Docker-Driven ROS Environment Switching ===&lt;br /&gt;
This thesis focuses on integrating Docker containers to manage and switch between different ROS (Robot Operating System) environments on Robotont. Currently, ROS is installed natively on Robotont's Ubuntu-based onboard computer, limiting flexibility in system recovery and configuration switching. The goal is to develop a Docker-based solution that allows users to switch between ROS environments directly from Robotont’s existing low-level menu interface.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: designing and implementing a communication protocol for additional devices ===&lt;br /&gt;
Robotont currently includes a designated area on its front for attaching additional devices, such as an ultrasonic range finder, a servo motor connected to an Arduino, or a MikroBUS device. This thesis aims to design and implement a standard communication protocol that will enable seamless integration and control of these devices through the high-level ROS framework running on Robotont’s onboard computer.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: analysis of different options as on-board computers ===&lt;br /&gt;
Currently, ROBOTONT uses an Intel NUC as its onboard computer. The goal of this thesis is to validate Robotont's software stack on alternative compute devices (e.g., Raspberry Pi 4, Intel Compute Stick, and NVIDIA Jetson Nano) and benchmark their performance for the most popular Robotont use-case demos (e.g., webapp teleop, AR-tag steering, 2D mapping, and 3D mapping). The objective is to propose at least two different compute solutions: one that optimizes for cost and another that optimizes for performance. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT Lite ===&lt;br /&gt;
The goal of this thesis is to optimize the ROBOTONT platform for cost by replacing the onboard computer and sensors with low-cost alternatives while ensuring ROS/ROS2 software compatibility. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: integrating a graphical programming interface ===&lt;br /&gt;
The goal of this thesis is to integrate a graphical programming solution (e.g., Scratch or Blockly) to enable programming of Robotont by non-experts. E.g., https://doi.org/10.48550/arXiv.2011.13706&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== SemuBOT: multiple topics ===&lt;br /&gt;
In 2023/2024, multiple topics are available to support the development of the open-source humanoid robot project SemuBOT (https://www.facebook.com/semubotmtu/).&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Robotic Study Companion: a social robot for students in higher education ===&lt;br /&gt;
Potential Topics:&lt;br /&gt;
* Enhance the Robot's Speech/Natural Language Capabilities &lt;br /&gt;
* Build a Local Language Model for the RSC&lt;br /&gt;
* Develop and Program the Robot’s Behavior and Personality&lt;br /&gt;
* Build a Digital Twin Simulation for Multimodal Interaction&lt;br /&gt;
* Explore the Use of the RSC as an Affective Robot to Address Students’ Academic Emotions&lt;br /&gt;
* Explore and Implement Cybersecurity Measures for Social Robot&lt;br /&gt;
&lt;br /&gt;
[https://github.com/orgs/RobotStudyCompanion/discussions/3 More info on Github] | reach out to farnaz.baksh@ut.ee for more info&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: Quick launch demo suite and the final complete release of its ROS1 software ===&lt;br /&gt;
The goal of this thesis is to refine existing Robotont demos (e.g., AR-tag steering, follow-the-leader, dancing-with-robot, LEAP-based control) and package them in an easy-to-use way for quick deployment by anyone during public events such as science popularization workshops and school visits.&lt;br /&gt;
The results of this work will be packaged as the final ROS1 release of Robotont software, as the EOL for ROS1 is in May 2025.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: ROS support, demos, and educational materials for open-source mobile robot ===&lt;br /&gt;
[[Image:RosLarge.png|left|100px|ROS]]&lt;br /&gt;
The project involves many potential thesis topics on the open-source robot platform ROBOTONT. The nature of the thesis can be software development to improve the platform's capabilities, simulation of specific scenarios, and/or demonstration of ROBOTONT in a real-life setting. A more detailed thesis topic will be outlined during an in-person meeting.&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
[[Image:Ros_equation.png|x100px|What is ROS?]]&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Virtual reality user interface (VRUI) for intuitive teleoperation system ===&lt;br /&gt;
[[Image:LeapMotion.png|200px|thumb|right|Detecting 2 hands with Leap Motion Controller]]&lt;br /&gt;
Enhancing the user-experience of a virtual reality UI developed by [https://github.com/ut-ims-robotics/vrui_rviz Georg Astok]. Potentially adding [http://www.osvr.org/hardware-devs.html virtual reality capability] to a [https://www.youtube.com/watch?v=L25HHFd00rc gesture- and natural-language-based robot teleoperation system].&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
[[Image:Temoto-working-principle-in-pics.png|x160px|Gesture-based teleoperation]]&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Health monitor for intuitive telerobot ===&lt;br /&gt;
Intelligent status and error handling for an intuitive telerobotic system.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== 3D scanning of industrial objects ===&lt;br /&gt;
Using laser sensors and cameras to create accurate models of industrial products for quality control or further processing.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Modeling humans for human-robot interaction ===&lt;br /&gt;
True human-robot collaboration means that the robot must understand the actions, intentions, and state of its human partner. This work involves using cameras and other human sensors for digitally representing and modeling humans. There are multiple stages of modeling: a) physical models of human kinematics and dynamics; b) higher-level models for recognizing human intent.&amp;lt;br&amp;gt;&lt;br /&gt;
[[Image:Skeletal ROS.PNG|x160px|ROS &amp;amp; Kinect &amp;amp; Skeleton-Markers Package]]&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Robotic avatar for telepresence ===&lt;br /&gt;
Integrating hand gestures and head movements to control a robot avatar in a virtual reality user interface.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Detection of hardware and software resources for smart integration of robots ===&lt;br /&gt;
The vast majority of today’s robotic applications rely on hard-coded device and algorithm usage. This project focuses on developing Resource Snooper software that can detect the addition or removal of resources for the benefit of dynamic reconfiguration of robotic systems. This project is developed as a subsystem of [https://utnuclearroboticspublic.github.io/temoto2/index.html TeMoto].&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Sonification of feedback during teleoperation of robots ===&lt;br /&gt;
Humans are used to receiving auditory feedback in their everyday lives. It helps us make decisions and be aware of potential dangers. Telerobotic interfaces can deploy the same idea to improve situational awareness and robotic task efficiency. The thesis project involves a study of different sonification solutions and their implementation in a telerobotic application using ROS. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Human-Robot and Robot-Robot collaboration applications ===&lt;br /&gt;
Creating a demo or analysis of software capabilities related to human-robot or robot-robot teams:&lt;br /&gt;
* human-robot collaborative assembly&lt;br /&gt;
* distributed mapping: analysis and demo of existing ROS packages for multi-robot mapping (e.g., segmap https://youtu.be/JJhEkIA1xSE)&lt;br /&gt;
* Inaccessible region teamwork&lt;br /&gt;
**youbot+drone - a drone maps the environment (for example, a maze) and a ground vehicle uses this information to traverse the maze&lt;br /&gt;
**youbot+robotont - the youBot cannot go up ledges, but it can lift a smaller robot, such as Robotont, up a ledge.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Mirroring human hand movements on industrial robots ===&lt;br /&gt;
The goal of this project is to integrate continuous control of an industrial robot manipulator with a gestural telerobotics interface. The recommended tools for this thesis project are a Leap Motion Controller or a standard web camera, Ultraleap, a Universal Robots UR5 manipulator, and ROS.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== ROBOTONT: TeMoto for robotont ===&lt;br /&gt;
Swarm-management and UMRF-based task loading for robotont using [https://github.com/temoto-framework TeMoto] framework.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Enhancing teleoperation control interface with augmented cues to provoke caution===&lt;br /&gt;
The task is to create a telerobot control interface where the video feed from the remote site and/or a mixed-reality scene is augmented with visual cues to provoke caution in the human operator.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Robot-to-human interaction ===&lt;br /&gt;
As robots and autonomous machines start sharing the same space as humans, their actions need to be understood by the people occupying that space. For instance, a human worker needs to understand what the robot partner is planning next, or a pedestrian needs to clearly comprehend the behaviour of a driverless vehicle. To reduce this ambiguity, the robot needs mechanisms to convey its intent (whatever it is going to do next). The aim of the thesis is to outline existing methods for machines to convey their intent and to develop a unified model interface for expressing that intent. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Gaze-based handover prediction ===&lt;br /&gt;
When a human needs to pass an object to a robot manipulator, the robot must understand where in 3D space the object handover occurs and then plan an appropriate motion. Human gaze can be used as the input for predicting which object to track. The thesis activities involve camera-based eye tracking and safe motion planning.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Real-world demonstrator for MIR+UR+TeMoto integration  ===&lt;br /&gt;
Integration of mobile manipulator (MIR100 + UR5e + Robotiq gripper) to demonstrate TeMoto in a collaborative application.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== ROBOTONT: Human-height human-robot interface for Robotont ground robot ===&lt;br /&gt;
Robotont is an ankle-high flat mobile robot. For humans to interact with Robotont, there is a need for a compact and lightweight mechanical structure that is tall enough for comfortable human-robot interaction. The objective of the thesis is to develop the mechanical designs and build prototypes that ensure stable operation and meet the aesthetic requirements for use in public places. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Stratos Explore Ultraleap demonstrator for robotics ===&lt;br /&gt;
The aim of this thesis is to systematically analyze the strengths and limitations of the Stratos Explore Ultraleap device in the context of controlling a robot, and subsequently to implement a demonstrator application showing its applicability for robot control. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Mixed-reality scene creation for vehicle teleoperation ===&lt;br /&gt;
Fusing different sensory feeds to create a high-usability teleoperation scene.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Validation study for AR-based robot user-interfaces ===&lt;br /&gt;
Designing and carrying out a user study to validate the functionality and usability of a human-robot interface.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Replication of the MIT Hydra demo ===&lt;br /&gt;
The goal of the thesis is to use the Hydra software package and integrate it for a robot used at IMS Robotics (e.g., TIAGo, Jackal, Robotont).&lt;br /&gt;
&amp;lt;br&amp;gt;Hydra takes sensor data as input (stereo or RGB-D camera images and IMU data) and produces a hierarchical model of the environment by estimating the trajectory of the robot (including loop closures), building a metric-semantic 3D mesh, and segmenting objects, places, and rooms in an indoor environment. These representations are combined into a 3D scene graph, which enables novel approaches for hierarchical loop closure detection and ensures the representations remain consistent after loop closure. Hydra is implemented in C++ and is ROS-compatible. It uses a neural-network-based image segmentation front-end, but otherwise relies on an efficient, multi-threaded CPU-based implementation, which is suitable for mobile robot deployment.&lt;br /&gt;
&amp;lt;br&amp;gt;LINKS:&lt;br /&gt;
&amp;lt;br&amp;gt;Video: https://youtu.be/qZg2lSeTuvM&lt;br /&gt;
&amp;lt;br&amp;gt;Code: https://github.com/MIT-SPARK/Hydra&lt;br /&gt;
&amp;lt;br&amp;gt;Paper: http://www.roboticsproceedings.org/rss18/p050.pdf&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Quantitative Evaluation of the Situation Awareness during the Teleoperation of an Urban Vehicle ===&lt;br /&gt;
The goal of this thesis is to develop the methodology for measuring the operator's situation awareness while operating an autonomous urban vehicle. Potential metrics could include driving accuracy, speed, and latency. This thesis will be conducted as part of the activities at the [https://adl.cs.ut.ee Autonomous Driving Lab].&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== NAO 'flippin' ===&lt;br /&gt;
The University of Tartu has a set of [https://www.aldebaran.com/en/nao NAO robots], which have been lying on shelves for some years now. It is time to make these robots into functional social robots that can be used, e.g., for children's communication therapy or as buddies for the elderly. The goal of this thesis is to test and upgrade all the NAO robots to functional condition.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
= Completed projects =&lt;br /&gt;
== PhD theses ==&lt;br /&gt;
*Robert Valner, [https://hdl.handle.net/10062/105994 Design of TeMoto, a software framework for dependable, adaptive, and collaborative autonomous robots] [TeMoto – töökindlate, adaptiivsete ja koostöövõimeliste autonoomsete robotite arendamise tarkvararaamistik], PhD thesis, 2024&lt;br /&gt;
*Houman Masnavi, [https://hdl.handle.net/10062/91394 Visibility aware navigation] [Nähtavust arvestav navigatsioon], PhD thesis, 2023&lt;br /&gt;
&lt;br /&gt;
== Master's theses ==&lt;br /&gt;
*Robert Allik, Validation of NoMaD as a Global Planner for Mobile Robots, MS thesis, 2024&lt;br /&gt;
*Rauno Põlluäär, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=79393 Designing and Implementing a Bird’s-eye View Interface for a Self-driving Vehicle’s Teleoperation System] [Isejuhtiva sõiduki kaugjuhtimissüsteemile linnuvaate kasutajaliidese loomine], MS thesis, 2024&lt;br /&gt;
*Eva Mõtshärg, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=77385 3D-prinditava kere disain ja analüüs vabavaralisele haridusrobotile Robotont] [Design and Analysis of a 3D Printable Chassis for the Open Source Educational Robot Robotont], MS thesis, 2023&lt;br /&gt;
*Farnaz Baksh, [https://dspace.ut.ee/items/a7a9cc15-27e9-450c-94e8-3267f0c95c56 An Open-source Robotic Study Companion for University Students] [Avatud lähtekoodiga robotõpikaaslane üliõpilastele], MS thesis, 2023&lt;br /&gt;
*Igor Rybalskii, [http://hdl.handle.net/10062/83028 Augmented reality (AR) for enabling human-robot collaboration with ROS robots] [Liitreaalsus inimese ja roboti koostöö võimaldamiseks ROS-i robotitega], MS thesis, 2022&lt;br /&gt;
*Md. Maniruzzaman, [http://hdl.handle.net/10062/83025 Object search and retrieval in indoor environment using a Mobile Manipulator] [Objektide otsimine ja teisaldamine siseruumides mobiilse manipulaatori abil], MS thesis, 2022&lt;br /&gt;
*Allan Kustavus, [http://hdl.handle.net/10062/72651 Design and Implementation of a Generalized Resource Management Architecture in the TeMoto Software Framework] [Üldise ressursihalduri disain ja teostus TeMoto tarkvara raamistikule], MS thesis, 2021&lt;br /&gt;
*Kristina Meister, [http://hdl.handle.net/10062/72350 External human-vehicle interaction - a study in the context of an autonomous ride-hailing service], MS thesis, 2021&lt;br /&gt;
*Muhammad Usman, [http://hdl.handle.net/10062/72126 Development of an Optimization-Based Motion Planner and Its ROS Interface for a Non-Holonomic Mobile Manipulator] [Optimeerimisele baseeruva liikumisplaneerija arendamine ja selle ROSi liides mitteholonoomse mobiilse manipulaatori jaoks], MS thesis, 2020&lt;br /&gt;
*Maarika Oidekivi, [http://hdl.handle.net/10062/72119 Masina kavatsuse väljendamine ja tõlgendamine] [Communicating and interpreting machine intent], MS thesis, 2020&lt;br /&gt;
*Houman Masnavi, [http://hdl.handle.net/10062/72118 Multi-Robot Motion Planning for Shared Payload Transportation] [Rajaplaneerimine multi-robot süsteemile jagatud lasti transportimisel], MS thesis, 2020&lt;br /&gt;
*Fabian Ernesto Parra Gil, [http://hdl.handle.net/10062/72112 Implementation of Robot Manager Subsystem for Temoto Software Framework] [Robotite Halduri alamsüsteemi väljatöötamine tarkvararaamistikule TEMOTO], MS thesis, 2020&lt;br /&gt;
*Zafarullah, [http://hdl.handle.net/10062/72125 Gaze Assisted Neural Network based Prediction of End-Point of Human Reaching Trajectories], MS thesis, 2020&lt;br /&gt;
*Madis K Nigol, [http://hdl.handle.net/10062/64339 Õppematerjalid robotplatvormile Robotont] [Study materials for robot platform Robotont], MS thesis, 2019&lt;br /&gt;
*Renno Raudmäe, [http://hdl.handle.net/10062/64341 Avatud robotplatvorm Robotont] [Open source robotics platform Robotont], MS thesis, 2019&lt;br /&gt;
*Asif Sattar, [http://hdl.handle.net/10062/64352 Human detection and distance estimation with monocular camera using YOLOv3 neural network] [Inimeste tuvastamine ning kauguse hindamine kasutades kaamerat ning YOLOv3 tehisnärvivõrku], MS thesis, 2019&lt;br /&gt;
*Ragnar Margus, [http://hdl.handle.net/10062/64337 Kergliiklusvahendite jagamisteenuseks vajaliku positsioneerimismooduli loomine ja uurimine] [Development and testing of an IoT module for electric vehicle sharing service], MS thesis, 2019&lt;br /&gt;
*Pavel Šumejko, [http://hdl.handle.net/10062/64320 Robust Solution for Extrinsic Calibration of a 2D Laser-Rangefinder and a Monocular USB Camera] [Meetod 2D laserkaugusmõõdiku ja USB-kaamera väliseks kalibreerimiseks], MS thesis, 2019&lt;br /&gt;
*Dzvezdana Arsovska, [http://hdl.handle.net/10062/64321  Building an Efficient and Secure Software Supply Pipeline for Aerial Robotics Application] [Efektiivne ja turvaline tarkvaraarendusahel lennurobootika rakenduses], MS thesis, 2019&lt;br /&gt;
*Tõnis Tiimus, [http://hdl.handle.net/10062/60713 Visuaalselt esilekutsutud potentsiaalidel põhinev aju-arvuti liides robootika rakendustes] [A VEP-based BCI for robotics applications], MS thesis, 2018&lt;br /&gt;
*Martin Appo, [http://hdl.handle.net/10062/60718 Hardware-agnostic compliant control ROS package for collaborative industrial manipulators] [Riistvarapaindlik ROSi tarkvarapakett tööstuslike robotite mööndlikuks juhtimiseks], MS thesis, 2018&lt;br /&gt;
*Hassan Mahmoud Shehawy Elhanash, [http://hdl.handle.net/10062/60711 Optical Tracking of Forearm for Classifying Fingers Poses] [Küünarvarre visuaalne jälgimine ennustamaks sõrmede asendeid], MS thesis, 2018&lt;br /&gt;
&lt;br /&gt;
== Bachelor's theses ==&lt;br /&gt;
*Albert Unn, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=79937 Suhtlusvõimekuse arendamine sotsiaalsele humanoidrobotile SemuBot] [Developing the ability to communicate for SemuBot, a social humanoid robot], BS thesis, 2024&lt;br /&gt;
*Veronika-Marina Volynets, [https://hdl.handle.net/10062/99943 Development of Control Electronics and Program for Robotont's Height Adjustment Mechanism] [Juhtelektroonika ja programmi väljatöötamine Robotondi kõrguse reguleerimise mehhanismile], BS thesis, 2024&lt;br /&gt;
*Nikita Kurenkov, [https://hdl.handle.net/10062/99880 Flexible Screen Integration and Development of Neck Movement Mechanism for Social Humanoid Robot SemuBot] [Humanoidroboti näo lahenduse leidmine ja rakendamine; humanoidroboti jaoks kaela mehhanismi väljatöötamine], BS thesis, 2024&lt;br /&gt;
*Elchin Huseynov, [https://hdl.handle.net/10062/99668 Design and Control of a Social Humanoid Robot - SemuBot’s Hand] [Sotsiaalse Humanoidroboti Disain ja Juhtimine - SemuBoti Käsi], BS thesis, 2024&lt;br /&gt;
*Veronika Podliesnova, [https://hdl.handle.net/10062/99636 Real-Time Detection of Robot Failures by Monitoring Operator’s Brain Activity with EEG-based Brain-Computer Interface] [Reaalajas Robotirikke Tuvastamine Operaatori Ajutegevuse Jälgimise teel EEG-põhise Aju-Arvuti Liidese abil], BS thesis, 2024&lt;br /&gt;
*Märten Josh Peedimaa, [https://hdl.handle.net/not-yet Rhea: An Open-Source Table Tennis Ball Launcher Robot for Multiball Training] [Avatud lähtekoodiga lauatennise palliviske robot mitmikpall treenimiseks], BS thesis, 2024&lt;br /&gt;
*Karl Sander Vinkel, [https://hdl.handle.net/not-yet Laadimisjaama ja transportkesta väljatöötamine õpperobotile Robotont] [Development of charging dock and transport case for Robotont], BS thesis, 2024&lt;br /&gt;
*Raimo Köidam, [https://hdl.handle.net/not-yet Valguslahenduse tarkvara väljatöötamine õpperobotile Robotont] [Development of light solution software for the educational robot Robotont], BS thesis, 2024&lt;br /&gt;
*Robert Valge, [https://hdl.handle.net/not-yet Toitepinge ja tarbevoolu monitoorimine ning toitehalduse püsivara loomine õpperobotil Robotont] [Monitoring supply voltage and current consumption and creating firmware for Robotont’s power management system], BS thesis, 2024&lt;br /&gt;
*Leonid Tšigrinski, [https://hdl.handle.net/not-yet Õpperoboti Robotont püsivara arhitektuuri uuendamine] [Education robot &amp;quot;Robotont&amp;quot; firmware architecture updating], BS thesis, 2024&lt;br /&gt;
*Andres Sakk, [https://hdl.handle.net/not-yet Avatud robotplatvormi Robotont 3 kasutajaliidese väljatöötamine] [Development of a user interface for the open robotics platform Robotont 3], BS thesis, 2024&lt;br /&gt;
*Kaur Kullamäe, [https://hdl.handle.net/not-yet Elektroonikalahendus ja püsivara robotkäe juhtimiseks sotsiaalsel humanoidrobotil SemuBot] [Building the SemuBot arm electronics system], BS thesis, 2024&lt;br /&gt;
*Kristjan Madis Kask, [https://hdl.handle.net/not-yet Käe mehaanika disain sotsiaalsele humanoidrobotile SemuBot] [Arm mechanics design for social humanoid robot SemuBot], BS thesis, 2024&lt;br /&gt;
*Sven-Ervin Paap, [https://hdl.handle.net/not-yet ROS2 draiver õpperobotile Robotont] [ROS2 driver for the educational robot Robotont], BS thesis, 2024&lt;br /&gt;
*Timur Nizamov, [https://hdl.handle.net/10062/99592 Audio System for the Social Humanoid Robot SemuBot] [Helisüsteem sotsiaalsele humanoidrobotile SemuBot], BS thesis, 2024&lt;br /&gt;
*Georgs Narbuts, [https://hdl.handle.net/10062/99554 Holonomic Motion Drive System of a Social Humanoid Robot SemuBot] [Sotsiaalse humanoidroboti SemuBoti holonoomne liikuv ajamisüsteem], BS thesis, 2024&lt;br /&gt;
*Iryna Hurova, [https://hdl.handle.net/10062/90638 Kitting station of the learning factory] [Õppetehase komplekteerimisjaam], BS thesis, 2023&lt;br /&gt;
*Paola Avalos Conchas, [https://hdl.handle.net/10062/90641 Payload transportation system of a learning factory] [Õppetehase kasuliku koorma transpordisüsteem], BS thesis, 2023&lt;br /&gt;
*Pille Pärnalaas, [https://hdl.handle.net/10062/93431 Pöördpõik ajami arendus robotplatvormile Robotont] [Development of swerve drive for robotic platform Robotont], BS thesis, 2023&lt;br /&gt;
*Priit Rooden, [https://hdl.handle.net/10062/93432 Autonoomse laadimislahenduse väljatöötamine õpperobotile Robotont] [Development of an autonomous charging solution for the robot platform Robotont], BS thesis, 2023&lt;br /&gt;
*Marko Muro, [https://hdl.handle.net/10062/93429 Robotondi akulahenduse ning 12 V pingeregulaatori prototüüpimine] [Prototyping battery solution and 12 V voltage regulator for Robotont], BS thesis, 2023&lt;br /&gt;
*Danel Leppenen, [https://hdl.handle.net/10062/93427 Nav2 PYIF: Python-based Motion Planning for ROS 2 Navigation 2] [Nav 2 PYIF: Pythoni põhine liikumise planeerija ROS 2 Navigation 2-le], BS thesis, 2023&lt;br /&gt;
*Kertrud Geddily Küüt, [https://hdl.handle.net/10062/93424 Kõrgust reguleeriv mehhanism Robotondile] [Height adjusting mechanism for Robotont], BS thesis, 2023&lt;br /&gt;
*Ingvar Drikkit, [https://hdl.handle.net/10062/93421 Lisaseadmete võimekuse arendamine haridusrobotile Robotont] [Developing add-on device support for the educational robot Robotont], BS thesis, 2023&lt;br /&gt;
*Kristo Pool, [https://hdl.handle.net/10062/93430 MoveIt 2 õppematerjalid] [Learning materials for MoveIt 2], BS thesis, 2023&lt;br /&gt;
*Erki Veeväli, [https://hdl.handle.net/10062/93434 Development of a Continuous Teleoperation System for Urban Road Vehicle] [Linnasõiduki pideva kaugjuhtimissüsteemi arendus], BS thesis, 2023&lt;br /&gt;
*Aleksandra Doroshenko, [https://hdl.handle.net/10062/93420 Haiglates inimesi juhatava roboti disain] [Hospital guide robot design], BS thesis, 2023&lt;br /&gt;
*Hui Shi, [http://hdl.handle.net/10062/83043 Expanding the Open-source ROS Software Package opencv_apps with Dedicated Blob Detection Functionality] [Avatud lähtekoodiga ROS-i tarkvarakimbu opencv_apps laiendamine laigutuvasti funktsioonaalsusega], BS thesis, 2022&lt;br /&gt;
*Dāvis Krūmiņš, [http://hdl.handle.net/10062/83040 Web-based learning and software development environment for remote access of ROS robots] [Veebipõhine õppe- ja tarkvaraarenduse keskkond ROS robotite juurdepääsuks kaugteel], BS thesis, 2022&lt;br /&gt;
*Anna Jakovleva, [http://hdl.handle.net/10062/83037 Roboquiz - an interactive human-robot game] [Roboquiz - interaktiivne inimese ja roboti mäng], BS thesis, 2022&lt;br /&gt;
*Kristjan Laht, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=74702&amp;amp;year=2022 Robot Localization with Fiducial Markers] [Roboti lokaliseerimine koordinaatmärkidega], BS thesis, 2022&lt;br /&gt;
*Hans Pärtel Pani, [http://hdl.handle.net/10062/83008 ROS draiver pehmerobootika haaratsile] [ROS Driver for Soft Robotic Gripper], BS thesis, 2022&lt;br /&gt;
*Markus Erik Sügis, [http://hdl.handle.net/10062/83015 Jagatud juhtimise põhimõttel realiseeritud robotite kaugjuhtimissüsteem] [A continuous teleoperating system based on shared control concept], BS thesis, 2022&lt;br /&gt;
*Taaniel Küla, [http://hdl.handle.net/10062/83007 ROS2 platvormile keevitusroboti tarkvara portotüübi loomine kasutades UR5 robot-manipulaatorit] [Welding robot software prototype for ROS2 using UR5 robot arm], BS thesis, 2022&lt;br /&gt;
*Rauno Põlluäär, [http://hdl.handle.net/10062/72672 Veebirakendus-põhine kasutajaliides avatud robotplatvormi Robotont juhtimiseks ja haldamiseks] [Web application-based user interface for controlling and managing open-source robotics platform Robotont], BS thesis, 2021&lt;br /&gt;
*Hendrik Olesk, [http://hdl.handle.net/10062/72664 Nägemisulatuses kaugjuhitava mobiilse robotmanipulaatori kasutajamugavuse tõstmine] [Improving the usability of a mobile manipulator robot for line-of-sight remote control], BS thesis, 2021&lt;br /&gt;
*Tarvi Tepandi, [http://hdl.handle.net/10062/72665 Segareaalsusel põhinev kasutajaliides mobiilse roboti kaugjuhtimiseks Microsoft HoloLens 2 vahendusel] [Mixed-reality user interface for teleoperating mobile robots with Microsoft HoloLens 2], BS thesis, 2021&lt;br /&gt;
*Rudolf Põldma, [http://hdl.handle.net/10062/72674 Tartu linna Narva maantee ringristmiku digikaksik] [Digital twin for Narva street roundabout in Tartu], BS thesis, 2021&lt;br /&gt;
*Kwasi Akuamoah Boateng, [http://hdl.handle.net/10062/72804 Digital Twin of a Teaching and Learning Robotics Lab] [Robotite õpetamise ja õppimise labori digitaalne kaksik], BS thesis, 2021&lt;br /&gt;
*Karina Sein, [http://hdl.handle.net/10062/72102 Eestikeelse kõnesünteesi võimaldamine robootika arendusplatvormil ROS] [Enabling Estonian speech synthesis on the Robot Operating System (ROS)], BS thesis, 2020&lt;br /&gt;
*Ranno Mäesepp, [http://hdl.handle.net/10062/72100 Takistuste vältimise lahendus õpperobotile Robotont] [Obstacle avoidance solution for educational robot platform Robotont], BS thesis, 2020&lt;br /&gt;
*Igor Rybalskii, [http://hdl.handle.net/10062/72060 Gesture Detection Software for Human-Robot Collaboration] [Žestituvastus tarkvara inimese ja roboti koostööks], BS thesis, 2020&lt;br /&gt;
*Meelis Pihlap, [http://hdl.handle.net/10062/64292 Mitme roboti koostöö funktsionaalsuste väljatöötamine tarkvararaamistikule TeMoto] [Multi-robot collaboration functionalities for robot software development framework TeMoto], BS thesis, 2019&lt;br /&gt;
*Kaarel Mark, [http://hdl.handle.net/10062/64290 Liitreaalsuse kasutamine tootmisprotsessis asukohtade määramisel] [Augmented reality for location determination in manufacturing], BS thesis, 2019&lt;br /&gt;
*Kätriin Julle, [http://hdl.handle.net/10062/64279 Roboti KUKA youBot riistvara ja ROS-tarkvara uuendamine] [Upgrading robot KUKA youBot’s hardware and ROS-software], BS thesis, 2019&lt;br /&gt;
*Georg Astok, [http://hdl.handle.net/10062/64274 Roboti juhtimine virtuaalreaalsuses kasutades ROS-raamistikku] [Creating virtual reality user interface using only ROS framework], BS thesis, 2019&lt;br /&gt;
*Martin Hallist, [http://hdl.handle.net/10062/64275 Robotipõhine kaugkohalolu käte liigutuste ülekandmiseks] [Teleoperation robot for arms motions], BS thesis, 2019&lt;br /&gt;
*Ahmed Hassan Helmy Mohamed, [http://hdl.handle.net/10062/63946 Software integration of autonomous robot system for mixing and serving drinks] [Jooke valmistava ja serveeriva robotsüsteemi tarkvaralahendus], BS thesis, 2019&lt;br /&gt;
*Kristo Allaje, [http://hdl.handle.net/10062/60294 Draiveripakett käte jälgimise seadme Leap Motion™ kontroller kasutamiseks robootika arendusplatvormil ROS] [Driver package for using Leap Motion™ controller on the robotics development platform ROS], BS thesis, 2018&lt;br /&gt;
*Martin Maidla, [http://hdl.handle.net/10062/60288 Avatud robotiarendusplatvormi Robotont omniliikumise ja odomeetria arendamine] [Omnimotion and odometry development for open robot development platform Robotont], BS thesis, 2018&lt;br /&gt;
*Raid Vellerind, [http://hdl.handle.net/10062/56559 Avatud robotiarendusplatvormi ROS võimekuse loomine Tartu Ülikooli Robotexi robootikaplatvormile] [ROS driver development for the University of Tartu’s Robotex robotics platform], BS thesis, 2017&lt;br /&gt;
&lt;br /&gt;
[[Category:Theses Topics]]&lt;/div&gt;</summary>
		<author><name>Karl</name></author>
	</entry>
	<entry>
		<id>https://ims.ut.ee/index.php?title=Theses_in_Robotics&amp;diff=40602</id>
		<title>Theses in Robotics</title>
		<link rel="alternate" type="text/html" href="https://ims.ut.ee/index.php?title=Theses_in_Robotics&amp;diff=40602"/>
		<updated>2024-12-05T12:36:31Z</updated>

		<summary type="html">&lt;p&gt;Karl: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:IMS-robotics]]&lt;br /&gt;
&amp;lt;div class=&amp;quot;toclimit-3&amp;quot;&amp;gt;__TOC__&amp;lt;/div&amp;gt; &lt;br /&gt;
= Projects in Advanced Robotics =&lt;br /&gt;
''The main objective of the following projects is to give students experience in working with advanced robotics technology. Our group is active in several R&amp;amp;D projects involving human-robot collaboration, intuitive teleoperation of robots, and autonomous navigation of unmanned mobile platforms. Our main software platforms are [http://www.ros.org/ Robot Operating System (ROS)] for developing software for advanced robot systems and [http://gazebosim.org/ Gazebo] for running realistic robotic simulations.''&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
For further information, contact [[User:Karl|Karl Kruusamäe]]. &lt;br /&gt;
&lt;br /&gt;
The following is not an exhaustive list of all available thesis/research topics.&lt;br /&gt;
&lt;br /&gt;
== Highlighted theses topics for 2024/2025 study year ==&lt;br /&gt;
# [[#ROBOTONT: analysis of different options as on-board computers|ROBOTONT: analysis of different options as on-board computers]]&lt;br /&gt;
# [[#SemuBOT: multiple topics|SemuBOT: multiple topics]]&lt;br /&gt;
# [[#ROBOTONT: integrating a graphical programming interface|ROBOTONT: integrating a graphical programming interface]]&lt;br /&gt;
# [[#Robotic Study Companion: a social robot for students in higher education|Robotic Study Companion: a social robot for students in higher education]]&lt;br /&gt;
# [[#Quantitative Evaluation of the Situation Awareness during the Teleoperation of an Urban Vehicle|Quantitative Evaluation of the Situation Awareness during the Teleoperation of an Urban Vehicle]]&lt;br /&gt;
# [[#Mixed-reality scene creation for vehicle teleoperation|Mixed-reality scene creation for vehicle teleoperation]]&lt;br /&gt;
# [[#NAO 'flippin'|NAO 'flippin']]&lt;br /&gt;
&lt;br /&gt;
== List of potential thesis topics ==&lt;br /&gt;
Our inventory includes but is not limited to:&lt;br /&gt;
&amp;lt;gallery mode=&amp;quot;packed&amp;quot;&amp;gt;&lt;br /&gt;
File:Youbot.png|120px|thumb|KUKA youBot&lt;br /&gt;
File:Ur5_left.png|120px|thumb|Universal Robot UR5&lt;br /&gt;
File:Franka_Emika_Panda.jpg|120px|thumb|Franka Emika Panda&lt;br /&gt;
File:Clearpath_Jackal.jpg|120px|thumb|Clearpath Jackal&lt;br /&gt;
File:Kinova_KG-3_Gripper.jpg|120px|thumb|Kinova 3-finger gripper&lt;br /&gt;
File:Xarm7.jpg|120px|thumb|UFACTORY xArm7&lt;br /&gt;
File:Robotont_banner.png|120px|thumb|robotont&lt;br /&gt;
File:Turtlebot3-waffle-pi.jpg|thumb|TurtleBot3&lt;br /&gt;
File:Parrot-bebop-2.jpg|thumb|Parrot Bebop 2&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== ROBOTONT: Docker-Driven ROS Environment Switching ===&lt;br /&gt;
This thesis focuses on integrating Docker containers to manage and switch between different ROS (Robot Operating System) environments on Robotont. Currently, ROS is installed natively on Robotont's Ubuntu-based onboard computer, limiting flexibility in system recovery and configuration switching. The goal is to develop a Docker-based solution that allows users to switch between ROS environments directly from Robotont’s existing low-level menu interface.&lt;br /&gt;
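As a rough, hypothetical sketch (not part of the thesis specification), a menu entry could boil down to launching the requested ROS distribution in a container. The image tags follow the publicly available osrf/ros images on Docker Hub; the helper function and its DRY_RUN flag are invented here for illustration:

```shell
# Hypothetical helper: launch a ROS environment in Docker by distro name.
# Assumes images such as osrf/ros:noetic-desktop are available locally.
ros_env() {
  distro="${1:-noetic}"
  # --network host shares the host network so ROS nodes on the robot
  # can talk to nodes inside the container.
  cmd="docker run -it --rm --network host osrf/ros:${distro}-desktop"
  if [ "${DRY_RUN:-0}" = "1" ]; then
    # Dry run: only print the command a menu entry would execute.
    echo "$cmd"
  else
    $cmd
  fi
}

# A low-level menu entry could then call, e.g.:
DRY_RUN=1 ros_env humble
```

A real solution would also need to handle device access (e.g. mounting the robot's serial ports into the container) and persisting workspaces across container restarts, which is where the actual thesis work lies.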
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: designing and implementing a communication protocol for additional devices ===&lt;br /&gt;
Robotont currently includes a designated area on its front for attaching additional devices, such as an ultrasonic range finder, a servo motor connected to an Arduino, or a MikroBUS device. This thesis aims to design and implement a standard communication protocol that will enable seamless integration and control of these devices through the high-level ROS framework running on Robotont’s onboard computer.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: analysis of different options as on-board computers ===&lt;br /&gt;
Currently, ROBOTONT uses an Intel NUC as its onboard computer. The goal of this thesis is to validate Robotont's software stack on alternative compute devices (e.g. Raspberry Pi 4, Intel Compute Stick, and NVIDIA Jetson Nano) and benchmark their performance on the most popular Robotont use-case demos (e.g. webapp teleop, AR-tag steering, 2D mapping, and 3D mapping). The objective is to propose at least two different compute solutions: one that optimizes for cost and another that optimizes for performance. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT Lite ===&lt;br /&gt;
The goal of this thesis is to optimize the ROBOTONT platform for cost by replacing the onboard computer and sensors with low-cost alternatives while ensuring ROS/ROS2 software compatibility. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: integrating a graphical programming interface ===&lt;br /&gt;
The goal of this thesis is to integrate a graphical programming solution (e.g. Scratch or Blockly) to enable programming of Robotont by non-experts. See, e.g., https://doi.org/10.48550/arXiv.2011.13706&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== SemuBOT: multiple topics ===&lt;br /&gt;
In 2023/2024, many topics are offered in support of the open-source humanoid robot project SemuBOT (https://www.facebook.com/semubotmtu/)&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Robotic Study Companion: a social robot for students in higher education ===&lt;br /&gt;
Potential Topics:&lt;br /&gt;
* Enhance the Robot's Speech/Natural Language Capabilities &lt;br /&gt;
* Build a Local Language Model for the RSC&lt;br /&gt;
* Develop and Program the Robot’s Behavior and Personality&lt;br /&gt;
* Build a Digital Twin Simulation for Multimodal Interaction&lt;br /&gt;
* Explore the Use of the RSC as an Affective Robot to Address Students’ Academic Emotions&lt;br /&gt;
* Explore and Implement Cybersecurity Measures for Social Robot&lt;br /&gt;
&lt;br /&gt;
[https://github.com/orgs/RobotStudyCompanion/discussions/3 More info on Github] | reach out to farnaz.baksh@ut.ee for more info&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: Quick launch demo suite and the final complete release of its ROS1 software ===&lt;br /&gt;
The goal of this thesis is to refine the existing Robotont demos (e.g. AR-tag steering, follow-the-leader, dancing-with-robot, LEAP-based control, etc.) and package them in an easy-to-use way for quick deployment by anyone during public events such as science popularization workshops and school visits.&lt;br /&gt;
The results of this work will be packaged as the final ROS1 release of Robotont software as the EOL for ROS1 is in May 2025.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: ROS support, demos, and educational materials for open-source mobile robot ===&lt;br /&gt;
[[Image:RosLarge.png|left|100px|ROS]]&lt;br /&gt;
The project involves many potential thesis topics on the open-source robot platform ROBOTONT. The nature of the thesis can be software development to improve the platform's capabilities, simulation of specific scenarios, and/or demonstration of ROBOTONT in a real-life setting. A more detailed thesis topic will be outlined during an in-person meeting.&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
[[Image:Ros_equation.png|x100px|What is ROS?]]&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Virtual reality user interface (VRUI) for intuitive teleoperation system ===&lt;br /&gt;
[[Image:LeapMotion.png|200px|thumb|right|Detecting 2 hands with Leap Motion Controller]]&lt;br /&gt;
Enhancing the user experience of a virtual reality UI developed by [https://github.com/ut-ims-robotics/vrui_rviz Georg Astok]. Potentially adding [http://www.osvr.org/hardware-devs.html virtual reality capability] to a [https://www.youtube.com/watch?v=L25HHFd00rc gesture- and natural-language-based robot teleoperation system].&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
[[Image:Temoto-working-principle-in-pics.png|x160px|Gesture-based teleoperation]]&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Health monitor for intuitive telerobot ===&lt;br /&gt;
Intelligent status and error handling for an intuitive telerobotic system.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== 3D scanning of industrial objects ===&lt;br /&gt;
Using laser sensors and cameras to create accurate models of industrial products for quality control or further processing.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Modeling humans for human-robot interaction ===&lt;br /&gt;
True human-robot collaboration means that the robot must understand the actions, intentions, and state of its human partner. This work involves using cameras and other human-tracking sensors to digitally represent and model humans. There are multiple stages of modeling: a) physical models of human kinematics and dynamics; b) higher-level models for recognizing human intent.&amp;lt;br&amp;gt;&lt;br /&gt;
[[Image:Skeletal ROS.PNG|x160px|ROS &amp;amp; Kinect &amp;amp; Skeleton-Markers Package]]&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Robotic avatar for telepresence ===&lt;br /&gt;
Integrating hand gestures and head movements to control a robot avatar in a virtual reality user interface.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Detection of hardware and software resources for smart integration of robots ===&lt;br /&gt;
The vast majority of today’s robotic applications rely on hard-coded device and algorithm usage. This project focuses on developing Resource Snooper software that can detect the addition or removal of resources for the benefit of dynamic reconfiguration of robotic systems. This project is developed as a subsystem of [https://utnuclearroboticspublic.github.io/temoto2/index.html TeMoto].&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Sonification of feedback during teleoperation of robots ===&lt;br /&gt;
Humans are used to receiving auditory feedback in their everyday lives. It helps us make decisions and be aware of potential dangers. Telerobotic interfaces can deploy the same idea to improve situational awareness and robotic task efficiency. The thesis project involves a study of different sonification solutions and the implementation of one in a telerobotic application using ROS. &lt;br /&gt;
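As a minimal illustration of the idea, one possible mapping from obstacle distance to warning-tone pitch is sketched below. All names and parameter values are illustrative assumptions for this topic description, not part of any existing ROS package; a real system would feed the resulting frequency to an audio output node.&lt;br /&gt;

```python
# Hedged sketch: a possible distance-to-pitch mapping for sonified
# teleoperation feedback. Function name, distance range, and frequency
# range are illustrative assumptions.

def distance_to_pitch(distance_m, d_min=0.2, d_max=3.0,
                      f_low=220.0, f_high=880.0):
    """Map an obstacle distance (metres) to a warning-tone frequency (Hz).

    Closer obstacles produce higher pitches; distances outside
    [d_min, d_max] are clamped so the tone stays in a fixed range.
    """
    d = min(max(distance_m, d_min), d_max)
    # Normalise: 0.0 = obstacle far away, 1.0 = obstacle very close
    urgency = (d_max - d) / (d_max - d_min)
    return f_low + urgency * (f_high - f_low)
```

The linear mapping is only one option; a thesis study would compare it against, e.g., logarithmic pitch scaling or discrete beeping rates.&lt;br /&gt;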
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Human-Robot and Robot-Robot collaboration applications ===&lt;br /&gt;
Creating a demo or analysis of software capabilities related to human-robot or robot-robot teams:&lt;br /&gt;
* human-robot collaborative assembly&lt;br /&gt;
* distributed mapping; analysis and demo of existing ROS packages for multi-robot mapping (e.g., segmap https://youtu.be/JJhEkIA1xSE)&lt;br /&gt;
* Inaccessible region teamwork&lt;br /&gt;
**youbot+drone - a drone maps the environment (for example, a maze) and a ground vehicle uses this information to traverse the maze&lt;br /&gt;
**youbot+robotont - the youbot cannot climb ledges, but it can lift a smaller robot, such as robotont, up a ledge.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Mirroring human hand movements on industrial robots ===&lt;br /&gt;
The goal of this project is to integrate continuous control of an industrial robot manipulator with a gestural telerobotics interface. The recommended tools for this thesis project are a Leap Motion Controller (Ultraleap) or a standard web camera, a Universal Robots UR5 manipulator, and ROS.&lt;br /&gt;
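To make the intended pipeline concrete, a hedged sketch of the core coordinate mapping is given below. The scale factor and workspace bounds are illustrative assumptions; a real implementation would publish the resulting target as a ROS pose for a Cartesian controller on the UR5.&lt;br /&gt;

```python
# Hedged sketch: scaling a tracked hand position into a manipulator
# workspace target. Function name, scale, and workspace limits are
# illustrative assumptions, not part of any existing driver.

def hand_to_target(hand_mm, scale=0.001,
                   workspace=((-0.4, 0.4), (-0.4, 0.4), (0.1, 0.6))):
    """Convert a hand position in millimetres (tracker frame) to a
    clamped end-effector target in metres (robot base frame)."""
    target = []
    for value, (lo, hi) in zip(hand_mm, workspace):
        v = value * scale                    # mm -> m
        target.append(min(max(v, lo), hi))   # keep inside safe workspace
    return tuple(target)
```

Clamping to a safe workspace is essential here, since raw hand-tracking data is noisy and can momentarily jump outside the robot's reachable or collision-free region.&lt;br /&gt;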
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== ROBOTONT: TeMoto for robotont ===&lt;br /&gt;
Swarm-management and UMRF-based task loading for robotont using [https://github.com/temoto-framework TeMoto] framework.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Enhancing teleoperation control interface with augmented cues to provoke caution ===&lt;br /&gt;
The task is to create a telerobot control interface where video feed from the remote site and/or a mixed-reality scene is augmented with visual cues to provoke caution in human operator.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Robot-to-human interaction ===&lt;br /&gt;
As robots and autonomous machines start sharing the same space as humans, their actions need to be understood by the people occupying that space. For instance, a human worker needs to understand what the robot partner is planning next, or a pedestrian needs to clearly comprehend the behaviour of a driverless vehicle. To reduce this ambiguity, the robot needs mechanisms to convey its intent (whatever it is going to do next). The aim of the thesis is to outline existing methods for machines to convey their intent and to develop a unified model interface for expressing that intent. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Gaze-based handover prediction ===&lt;br /&gt;
When a human needs to pass an object to a robot manipulator, the robot must understand where in 3D space the object handover occurs and then plan an appropriate motion. Human gaze can be used as the input for predicting which object to track. The thesis activities involve camera-based eye tracking and safe motion planning.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Real-world demonstrator for MIR+UR+TeMoto integration  ===&lt;br /&gt;
Integration of mobile manipulator (MIR100 + UR5e + Robotiq gripper) to demonstrate TeMoto in a collaborative application.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== ROBOTONT: Human-height human-robot interface for Robotont ground robot ===&lt;br /&gt;
Robotont is an ankle-high flat mobile robot. For humans to interact with Robotont, there is a need for a compact and lightweight mechanical structure that is tall enough for comfortable human-robot interaction. The objective of the thesis is to develop the mechanical designs and build prototypes that ensure stable operation and meet the aesthetic requirements for use in public places. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Stratos Explore Ultraleap demonstrator for robotics ===&lt;br /&gt;
The aim of this thesis is to systematically analyze the strengths and limitations of the Stratos Explore Ultraleap device in the context of controlling a robot, and subsequently to implement a demonstrator application showing its applicability for robot control. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Mixed-reality scene creation for vehicle teleoperation ===&lt;br /&gt;
Fusing different sensory feeds for creating high-usability teleoperation scene.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Validation study for AR-based robot user-interfaces ===&lt;br /&gt;
Designing and carrying out a user study to validate the functionality and usability of a human-robot interface.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Replication of the MIT Hydra demo ===&lt;br /&gt;
The goal of the thesis is to use the Hydra software package and integrate it for a robot used at IMS Robotics (e.g., TIAGo, Jackal, Robotont).&lt;br /&gt;
&amp;lt;br&amp;gt;Hydra takes sensor data as input (stereo or RGB-D camera images and IMU data) and produces a hierarchical model of the environment by estimating the trajectory of the robot (including loop closures), building a metric-semantic 3D mesh, and segmenting objects, places, and rooms in an indoor environment. These representations are combined into a 3D scene graph, which enables novel approaches for hierarchical loop closure detection and ensures the representations remain consistent after loop closure. Hydra is implemented in C++ and is ROS-compatible. It uses a neural-network-based image segmentation front-end, but otherwise relies on an efficient, multi-threaded CPU-based implementation, which is suitable for mobile robot deployment.&lt;br /&gt;
&amp;lt;br&amp;gt;LINKS:&lt;br /&gt;
&amp;lt;br&amp;gt;Video: https://youtu.be/qZg2lSeTuvM&lt;br /&gt;
&amp;lt;br&amp;gt;Code: https://github.com/MIT-SPARK/Hydra&lt;br /&gt;
&amp;lt;br&amp;gt;Paper: http://www.roboticsproceedings.org/rss18/p050.pdf&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Quantitative Evaluation of the Situation Awareness during the Teleoperation of an Urban Vehicle ===&lt;br /&gt;
The goal of this thesis is to develop the methodology for measuring the operator's situation awareness while operating an autonomous urban vehicle. Potential metrics could include driving accuracy, speed, and latency. This thesis will be conducted as part of the activities at the [https://adl.cs.ut.ee Autonomous Driving Lab].&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== NAO 'flippin' ===&lt;br /&gt;
The University of Tartu has a set of [https://www.aldebaran.com/en/nao NAO robots], which have been lying on shelves for some years now. It is time to make these robots into functional social robots that can be used, e.g., for children's communication therapy or as buddies for the elderly. The goal of this thesis is to test and upgrade all the NAO robots to functional condition.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
= Completed projects =&lt;br /&gt;
== PhD theses ==&lt;br /&gt;
*Houman Masnavi, [https://hdl.handle.net/10062/91394 Visibility aware navigation] [Nähtavust arvestav navigatsioon], PhD thesis, 2023&lt;br /&gt;
*Robert Valner, [https://hdl.handle.net/10062/105994 Design of TeMoto, a software framework for dependable, adaptive, and collaborative autonomous robots] [TeMoto – töökindlate, adaptiivsete ja koostöövõimeliste autonoomsete robotite arendamise tarkvararaamistik], PhD thesis, 2024&lt;br /&gt;
&lt;br /&gt;
== Master's theses ==&lt;br /&gt;
*Robert Allik, Validation of NoMaD as a Global Planner for Mobile Robots, MS thesis, 2024&lt;br /&gt;
*Rauno Põlluäär, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=79393 Designing and Implementing a Bird’s-eye View Interface for a Self-driving Vehicle’s Teleoperation System] [Isejuhtiva sõiduki kaugjuhtimissüsteemile linnuvaate kasutajaliidese loomine], MS thesis, 2024&lt;br /&gt;
*Eva Mõtshärg, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=77385 3D-prinditava kere disain ja analüüs vabavaralisele haridusrobotile Robotont] [Design and Analysis of a 3D Printable Chassis for the Open Source Educational Robot Robotont], MS thesis, 2023&lt;br /&gt;
*Farnaz Baksh, [https://dspace.ut.ee/items/a7a9cc15-27e9-450c-94e8-3267f0c95c56 An Open-source Robotic Study Companion for University Students] [Avatud lähtekoodiga robotõpikaaslane üliõpilastele], MS thesis, 2023&lt;br /&gt;
*Igor Rybalskii, [http://hdl.handle.net/10062/83028 Augmented reality (AR) for enabling human-robot collaboration with ROS robots] [Liitreaalsus inimese ja roboti koostöö võimaldamiseks ROS-i robotitega], MS thesis, 2022&lt;br /&gt;
*Md. Maniruzzaman, [http://hdl.handle.net/10062/83025 Object search and retrieval in indoor environment using a Mobile Manipulator] [Objektide otsimine ja teisaldamine siseruumides mobiilse manipulaatori abil], MS thesis, 2022&lt;br /&gt;
*Allan Kustavus, [http://hdl.handle.net/10062/72651 Design and Implementation of a Generalized Resource Management Architecture in the TeMoto Software Framework] [Üldise ressursihalduri disain ja teostus TeMoto tarkvara raamistikule], MS thesis, 2021&lt;br /&gt;
*Kristina Meister, [http://hdl.handle.net/10062/72350 External human-vehicle interaction - a study in the context of an autonomous ride-hailing service], MS thesis, 2021&lt;br /&gt;
*Muhammad Usman, [http://hdl.handle.net/10062/72126 Development of an Optimization-Based Motion Planner and Its ROS Interface for a Non-Holonomic Mobile Manipulator] [Optimeerimisele baseeruva liikumisplaneerija arendamine ja selle ROSi liides mitteholonoomse mobiilse manipulaatori jaoks], MS thesis, 2020&lt;br /&gt;
*Maarika Oidekivi, [http://hdl.handle.net/10062/72119 Masina kavatsuse väljendamine ja tõlgendamine] [Communicating and interpreting machine intent], MS thesis, 2020&lt;br /&gt;
*Houman Masnavi, [http://hdl.handle.net/10062/72118 Multi-Robot Motion Planning for Shared Payload Transportation] [Rajaplaneerimine multi-robot süsteemile jagatud lasti transportimisel], MS thesis, 2020&lt;br /&gt;
*Fabian Ernesto Parra Gil, [http://hdl.handle.net/10062/72112 Implementation of Robot Manager Subsystem for Temoto Software Framework] [Robotite Halduri alamsüsteemi väljatöötamine tarkvararaamistikule TEMOTO], MS thesis, 2020&lt;br /&gt;
*Zafarullah, [http://hdl.handle.net/10062/72125 Gaze Assisted Neural Network based Prediction of End-Point of Human Reaching Trajectories], MS thesis, 2020&lt;br /&gt;
*Madis K Nigol, [http://hdl.handle.net/10062/64339 Õppematerjalid robotplatvormile Robotont] [Study materials for robot platform Robotont], MS thesis, 2019&lt;br /&gt;
*Renno Raudmäe, [http://hdl.handle.net/10062/64341 Avatud robotplatvorm Robotont] [Open source robotics platform Robotont], MS thesis, 2019&lt;br /&gt;
*Asif Sattar, [http://hdl.handle.net/10062/64352 Human detection and distance estimation with monocular camera using YOLOv3 neural network] [Inimeste tuvastamine ning kauguse hindamine kasutades kaamerat ning YOLOv3 tehisnärvivõrku], MS thesis, 2019&lt;br /&gt;
*Ragnar Margus, [http://hdl.handle.net/10062/64337 Kergliiklusvahendite jagamisteenuseks vajaliku positsioneerimismooduli loomine ja uurimine] [Development and testing of an IoT module for an electric vehicle sharing service], MS thesis, 2019&lt;br /&gt;
*Pavel Šumejko, [http://hdl.handle.net/10062/64320 Robust Solution for Extrinsic Calibration of a 2D Laser-Rangefinder and a Monocular USB Camera] [Meetod 2D laserkaugusmõõdiku ja USB-kaamera väliseks kalibreerimiseks], MS thesis, 2019&lt;br /&gt;
*Dzvezdana Arsovska, [http://hdl.handle.net/10062/64321  Building an Efficient and Secure Software Supply Pipeline for Aerial Robotics Application] [Efektiivne ja turvaline tarkvaraarendusahel lennurobootika rakenduses], MS thesis, 2019&lt;br /&gt;
*Tõnis Tiimus, [http://hdl.handle.net/10062/60713 Visuaalselt esilekutsutud potentsiaalidel põhinev aju-arvuti liides robootika rakendustes] [A VEP-based BCI for robotics applications], MS thesis, 2018&lt;br /&gt;
*Martin Appo, [http://hdl.handle.net/10062/60718 Hardware-agnostic compliant control ROS package for collaborative industrial manipulators] [Riistvarapaindlik ROSi tarkvarapakett tööstuslike robotite mööndlikuks juhtimiseks], MS thesis, 2018&lt;br /&gt;
*Hassan Mahmoud Shehawy Elhanash, [http://hdl.handle.net/10062/60711 Optical Tracking of Forearm for Classifying Fingers Poses] [Küünarvarre visuaalne jälgimine ennustamaks sõrmede asendeid], MS thesis, 2018&lt;br /&gt;
&lt;br /&gt;
== Bachelor's theses ==&lt;br /&gt;
*Albert Unn, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=79937 Suhtlusvõimekuse arendamine sotsiaalsele humanoidrobotile SemuBot] [Developing the ability to communicate for SemuBot, a social humanoid robot], BS thesis, 2024&lt;br /&gt;
*Veronika-Marina Volynets, [https://hdl.handle.net/10062/99943 Development of Control Electronics and Program for Robotont's Height Adjustment Mechanism] [Juhtelektroonika ja programmi väljatöötamine Robotondi kõrguse reguleerimise mehhanismile], BS thesis, 2024&lt;br /&gt;
*Nikita Kurenkov, [https://hdl.handle.net/10062/99880 Flexible Screen Integration and Development of Neck Movement Mechanism for Social Humanoid Robot SemuBot] [Humanoidroboti näo lahenduse leidmine ja rakendamine; humanoidroboti jaoks kaela mehhanismi väljatöötamine], BS thesis, 2024&lt;br /&gt;
*Elchin Huseynov, [https://hdl.handle.net/10062/99668 Design and Control of a Social Humanoid Robot - SemuBot’s Hand] [Sotsiaalse Humanoidroboti Disain ja Juhtimine - SemuBoti Käsi], BS thesis, 2024&lt;br /&gt;
*Veronika Podliesnova, [https://hdl.handle.net/10062/99636 Real-Time Detection of Robot Failures by Monitoring Operator’s Brain Activity with EEG-based Brain-Computer Interface] [Reaalajas Robotirikke Tuvastamine Operaatori Ajutegevuse Jälgimise teel EEG-põhise Aju-Arvuti Liidese abil], BS thesis, 2024&lt;br /&gt;
*Märten Josh Peedimaa, [https://hdl.handle.net/not-yet Rhea: An Open-Source Table Tennis Ball Launcher Robot for Multiball Training] [Avatud lähtekoodiga lauatennise palliviske robot mitmikpall treenimiseks], BS thesis, 2024&lt;br /&gt;
*Karl Sander Vinkel, [https://hdl.handle.net/not-yet Laadimisjaama ja transportkesta väljatöötamine õpperobotile Robotont] [Development of charging dock and transport case for Robotont], BS thesis, 2024&lt;br /&gt;
*Raimo Köidam, [https://hdl.handle.net/not-yet Valguslahenduse tarkvara väljatöötamine õpperobotile Robotont] [Development of light solution software for the educational robot Robotont], BS thesis, 2024&lt;br /&gt;
*Robert Valge, [https://hdl.handle.net/not-yet Toitepinge ja tarbevoolu monitoorimine ning toitehalduse püsivara loomine õpperobotil Robotont] [Monitoring supply voltage and current consumption and creating firmware for Robotont’s power management system], BS thesis, 2024&lt;br /&gt;
*Leonid Tšigrinski, [https://hdl.handle.net/not-yet Õpperoboti Robotont püsivara arhitektuuri uuendamine] [Education robot &amp;quot;Robotont&amp;quot; firmware architecture updating], BS thesis, 2024&lt;br /&gt;
*Andres Sakk, [https://hdl.handle.net/not-yet Avatud robotplatvormi Robotont 3 kasutajaliidese väljatöötamine] [Development of a user interface for the open robotics platform Robotont 3], BS thesis, 2024&lt;br /&gt;
*Kaur Kullamäe, [https://hdl.handle.net/not-yet Elektroonikalahendus ja püsivara robotkäe juhtimiseks sotsiaalsel humanoidrobotil SemuBot] [Building the SemuBot arm electronics system], BS thesis, 2024&lt;br /&gt;
*Kristjan Madis Kask, [https://hdl.handle.net/not-yet Käe mehaanika disain sotsiaalsele humanoidrobotile SemuBot] [Arm mechanics design for social humanoid robot SemuBot], BS thesis, 2024&lt;br /&gt;
*Sven-Ervin Paap, [https://hdl.handle.net/not-yet ROS2 draiver õpperobotile Robotont] [ROS2 driver for the educational robot Robotont], BS thesis, 2024&lt;br /&gt;
*Timur Nizamov, [https://hdl.handle.net/10062/99592 Audio System for the Social Humanoid Robot SemuBot] [Helisüsteem sotsiaalsele humanoidrobotile SemuBot], BS thesis, 2024&lt;br /&gt;
*Georgs Narbuts, [https://hdl.handle.net/10062/99554 Holonomic Motion Drive System of a Social Humanoid Robot SemuBot] [Sotsiaalse humanoidroboti SemuBoti holonoomne liikuv ajamisüsteem], BS thesis, 2024&lt;br /&gt;
*Iryna Hurova, [https://hdl.handle.net/10062/90638 Kitting station of the learning factory] [Õppetehase komplekteerimisjaam], BS thesis, 2023&lt;br /&gt;
*Paola Avalos Conchas, [https://hdl.handle.net/10062/90641 Payload transportation system of a learning factory] [Õppetehase kasuliku koorma transpordisüsteem], BS thesis, 2023&lt;br /&gt;
*Pille Pärnalaas, [https://hdl.handle.net/10062/93431 Pöördpõik ajami arendus robotplatvormile Robotont] [Development of swerve drive for robotic platform Robotont], BS thesis, 2023&lt;br /&gt;
*Priit Rooden, [https://hdl.handle.net/10062/93432 Autonoomse laadimislahenduse väljatöötamine õpperobotile Robotont] [Development of an autonomous charging solution for the robot platform Robotont], BS thesis, 2023&lt;br /&gt;
*Marko Muro, [https://hdl.handle.net/10062/93429 Robotondi akulahenduse ning 12 V pingeregulaatori prototüüpimine] [Prototyping battery solution and 12 V voltage regulator for Robotont], BS thesis, 2023&lt;br /&gt;
*Danel Leppenen, [https://hdl.handle.net/10062/93427 Nav2 PYIF: Python-based Motion Planning for ROS 2 Navigation 2] [Nav 2 PYIF: Pythoni põhine liikumise planeerija ROS 2 Navigation 2-le], BS thesis, 2023&lt;br /&gt;
*Kertrud Geddily Küüt, [https://hdl.handle.net/10062/93424 Kõrgust reguleeriv mehhanism Robotondile] [Height adjusting mechanism for Robotont], BS thesis, 2023&lt;br /&gt;
*Ingvar Drikkit, [https://hdl.handle.net/10062/93421 Lisaseadmete võimekuse arendamine haridusrobotile Robotont] [Developing add-on device support for the educational robot Robotont], BS thesis, 2023&lt;br /&gt;
*Kristo Pool, [https://hdl.handle.net/10062/93430 MoveIt 2 õppematerjalid] [Learning materials for MoveIt 2], BS thesis, 2023&lt;br /&gt;
*Erki Veeväli, [https://hdl.handle.net/10062/93434 Development of a Continuous Teleoperation System for Urban Road Vehicle] [Linnasõiduki pideva kaugjuhtimissüsteemi arendus], BS thesis, 2023&lt;br /&gt;
*Aleksandra Doroshenko, [https://hdl.handle.net/10062/93420 Haiglates inimesi juhatava roboti disain] [Hospital guide robot design], BS thesis, 2023&lt;br /&gt;
*Hui Shi, [http://hdl.handle.net/10062/83043 Expanding the Open-source ROS Software Package opencv_apps with Dedicated Blob Detection Functionality] [Avatud lähtekoodiga ROS-i tarkvarakimbu opencv_apps laiendamine laigutuvasti funktsionaalsusega], BS thesis, 2022&lt;br /&gt;
*Dāvis Krūmiņš, [http://hdl.handle.net/10062/83040 Web-based learning and software development environment for remote access of ROS robots] [Veebipõhine õppe- ja tarkvaraarenduse keskkond ROS robotite juurdepääsuks kaugteel], BS thesis, 2022&lt;br /&gt;
*Anna Jakovleva, [http://hdl.handle.net/10062/83037 Roboquiz - an interactive human-robot game] [Roboquiz - interaktiivne inimese ja roboti mäng], BS thesis, 2022&lt;br /&gt;
*Kristjan Laht, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=74702&amp;amp;year=2022 Robot Localization with Fiducial Markers] [Roboti lokaliseerimine koordinaatmärkidega], BS thesis, 2022&lt;br /&gt;
*Hans Pärtel Pani, [http://hdl.handle.net/10062/83008 ROS draiver pehmerobootika haaratsile] [ROS Driver for Soft Robotic Gripper], BS thesis, 2022&lt;br /&gt;
*Markus Erik Sügis, [http://hdl.handle.net/10062/83015 Jagatud juhtimise põhimõttel realiseeritud robotite kaugjuhtimissüsteem] [A continuous teleoperating system based on shared control concept], BS thesis, 2022&lt;br /&gt;
*Taaniel Küla, [http://hdl.handle.net/10062/83007 ROS2 platvormile keevitusroboti tarkvara portotüübi loomine kasutades UR5 robot-manipulaatorit] [Welding robot software prototype for ROS2 using UR5 robot arm], BS thesis, 2022&lt;br /&gt;
*Rauno Põlluäär, [http://hdl.handle.net/10062/72672 Veebirakendus-põhine kasutajaliides avatud robotplatvormi Robotont juhtimiseks ja haldamiseks] [Web application-based user interface for controlling and managing open-source robotics platform Robotont], BS thesis, 2021&lt;br /&gt;
*Hendrik Olesk, [http://hdl.handle.net/10062/72664 Nägemisulatuses kaugjuhitava mobiilse robotmanipulaatori kasutajamugavuse tõstmine] [Improving the usability of a mobile manipulator robot for line-of-sight remote control], BS thesis, 2021&lt;br /&gt;
*Tarvi Tepandi, [http://hdl.handle.net/10062/72665 Segareaalsusel põhinev kasutajaliides mobiilse roboti kaugjuhtimiseks Microsoft HoloLens 2 vahendusel] [Mixed-reality user interface for teleoperating mobile robots with Microsoft HoloLens 2], BS thesis, 2021&lt;br /&gt;
*Rudolf Põldma, [http://hdl.handle.net/10062/72674 Tartu linna Narva maantee ringristmiku digikaksik] [Digital twin for Narva street roundabout in Tartu], BS thesis, 2021&lt;br /&gt;
*Kwasi Akuamoah Boateng, [http://hdl.handle.net/10062/72804 Digital Twin of a Teaching and Learning Robotics Lab] [Robotite õpetamise ja õppimise labori digitaalne kaksik], BS thesis, 2021&lt;br /&gt;
*Karina Sein, [http://hdl.handle.net/10062/72102 Eestikeelse kõnesünteesi võimaldamine robootika arendusplatvormil ROS] [Enabling Estonian speech synthesis on the Robot Operating System (ROS)], BS thesis, 2020&lt;br /&gt;
*Ranno Mäesepp, [http://hdl.handle.net/10062/72100 Takistuste vältimise lahendus õpperobotile Robotont] [Obstacle avoidance solution for educational robot platform Robotont], BS thesis, 2020&lt;br /&gt;
*Igor Rybalskii, [http://hdl.handle.net/10062/72060 Gesture Detection Software for Human-Robot Collaboration] [Žestituvastus tarkvara inimese ja roboti koostööks], BS thesis, 2020&lt;br /&gt;
*Meelis Pihlap, [http://hdl.handle.net/10062/64292 Mitme roboti koostöö funktsionaalsuste väljatöötamine tarkvararaamistikule TeMoto] [Multi-robot collaboration functionalities for robot software development framework TeMoto], BS thesis, 2019&lt;br /&gt;
*Kaarel Mark, [http://hdl.handle.net/10062/64290 Liitreaalsuse kasutamine tootmisprotsessis asukohtade määramisel] [Augmented reality for location determination in manufacturing], BS thesis, 2019&lt;br /&gt;
*Kätriin Julle, [http://hdl.handle.net/10062/64279 Roboti KUKA youBot riistvara ja ROS-tarkvara uuendamine] [Upgrading robot KUKA youBot’s hardware and ROS-software], BS thesis, 2019&lt;br /&gt;
*Georg Astok, [http://hdl.handle.net/10062/64274 Roboti juhtimine virtuaalreaalsuses kasutades ROS-raamistikku] [Creating virtual reality user interface using only ROS framework], BS thesis, 2019&lt;br /&gt;
*Martin Hallist, [http://hdl.handle.net/10062/64275 Robotipõhine kaugkohalolu käte liigutuste ülekandmiseks] [Teleoperation robot for arms motions], BS thesis, 2019&lt;br /&gt;
*Ahmed Hassan Helmy Mohamed, [http://hdl.handle.net/10062/63946 Software integration of autonomous robot system for mixing and serving drinks] [Jooke valmistava ja serveeriva robotsüsteemi tarkvaralahendus], BS thesis, 2019&lt;br /&gt;
*Kristo Allaje, [http://hdl.handle.net/10062/60294 Draiveripakett käte jälgimise seadme Leap Motion™ kontroller kasutamiseks robootika arendusplatvormil ROS] [Driver package for using Leap Motion™ controller on the robotics development platform ROS], BS thesis, 2018&lt;br /&gt;
*Martin Maidla, [http://hdl.handle.net/10062/60288 Avatud robotiarendusplatvormi Robotont omniliikumise ja odomeetria arendamine] [Omnimotion and odometry development for open robot development platform Robotont], BS thesis, 2018&lt;br /&gt;
*Raid Vellerind, [http://hdl.handle.net/10062/56559 Avatud robotiarendusplatvormi ROS võimekuse loomine Tartu Ülikooli Robotexi robootikaplatvormile] [ROS driver development for the University of Tartu’s Robotex robotics platform], BS thesis, 2017&lt;br /&gt;
&lt;br /&gt;
[[Category:Theses Topics]]&lt;/div&gt;</summary>
		<author><name>Karl</name></author>
	</entry>
	<entry>
		<id>https://ims.ut.ee/index.php?title=Robotics&amp;diff=40424</id>
		<title>Robotics</title>
		<link rel="alternate" type="text/html" href="https://ims.ut.ee/index.php?title=Robotics&amp;diff=40424"/>
		<updated>2024-11-21T13:29:59Z</updated>

		<summary type="html">&lt;p&gt;Karl: /* Selected publications */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==  Goal and motivation for robotics at IMS Lab ==&lt;br /&gt;
[[Image:Ims-robotics logo on white.png|thumb|300px]]&lt;br /&gt;
Robots are developed to improve the quality of our lives. Human-Robot Collaboration encompasses developing usable robots that make our everyday lives easier. Collaborative robots share their autonomy with human partners, thus improving the efficiency and quality of work in areas such as flexible manufacturing, logistics, domestic assistance, healthcare, and teleoperation in hazardous environments. We have extensive experience in every aspect of robotics research and technology: electronics design, mechanical engineering, software development, system integration, device building, education, and training.&lt;br /&gt;
&lt;br /&gt;
== Highlights ==&lt;br /&gt;
* [https://github.com/temoto-framework/temoto TeMoto], framework for dependable robotics,&lt;br /&gt;
* omnidirectional [http://robotont.ut.ee robotont] platform for education and research in ROS,&lt;br /&gt;
* [https://semubot.ee/ SemuBot] - Estonia's first socially-assistive humanoid robot&lt;br /&gt;
* Self-deployable Habitat for Extreme Environments ([http://www.shee.eu SHEE]),&lt;br /&gt;
* Massive open online courses (MOOC) about robotics in Estonia (https://sisu.ut.ee/robot &amp;amp; https://sisu.ut.ee/rosak/).&lt;br /&gt;
[[#Portfolio|Jump to portfolio]]&lt;br /&gt;
&lt;br /&gt;
== Capabilities ==&lt;br /&gt;
=== Equipment ===&lt;br /&gt;
Our inventory includes but is not limited to:&lt;br /&gt;
&amp;lt;gallery mode=&amp;quot;packed&amp;quot;&amp;gt;&lt;br /&gt;
File:Youbot.png|120px|thumb|KUKA youBot&lt;br /&gt;
File:Ur5_left.png|120px|thumb|Universal Robot UR5&lt;br /&gt;
File:Franka_Emika_Panda.jpg|120px|thumb|Franka Emika Panda&lt;br /&gt;
File:Clearpath_Jackal.jpg|120px|thumb|Clearpath Jackal&lt;br /&gt;
File:Kinova_KG-3_Gripper.jpg|120px|thumb|Kinova 3-finger gripper&lt;br /&gt;
File:Xarm7.jpg|120px|thumb|UFACTORY xArm7&lt;br /&gt;
File:Robotont_banner.png|120px|thumb|robotont&lt;br /&gt;
File:Turtlebot3-waffle-pi.jpg|thumb|TurtleBot3&lt;br /&gt;
File:Parrot-bebop-2.jpg|thumb|Parrot Bebop 2&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
&lt;br /&gt;
* PAL Robotics TIAGo&lt;br /&gt;
* MiR100&lt;br /&gt;
* Robotiq 2F&lt;br /&gt;
* ClearBot&lt;br /&gt;
* OSVR&lt;br /&gt;
* Oculus Rift&lt;br /&gt;
* Intel RealSense, Kinect, Leap Motion Controller, Ouster OS-1&lt;br /&gt;
&lt;br /&gt;
=== Skills ===&lt;br /&gt;
* ROS (Robot Operating System)&lt;br /&gt;
* Full robotics system development&lt;br /&gt;
* Hardware integration&lt;br /&gt;
* Process automation&lt;br /&gt;
* Motion planning and control theory&lt;br /&gt;
* System identification&lt;br /&gt;
* Data fusion&lt;br /&gt;
* Electronics design&lt;br /&gt;
* Simulations and digital twins&lt;br /&gt;
* Algorithm development&lt;br /&gt;
* Scientific publication&lt;br /&gt;
* ROS and engineering training&lt;br /&gt;
&lt;br /&gt;
== Primary contact ==&lt;br /&gt;
&lt;br /&gt;
{{Team|&lt;br /&gt;
{{TeamMember|Karl|Karl Kruusamäe|associate professor of robotics engineering}}&lt;br /&gt;
{{TeamMember|Alvo|Alvo Aabloo|professor, head of the IMS lab}}&lt;br /&gt;
}}&lt;br /&gt;
&lt;br /&gt;
== Student projects ==&lt;br /&gt;
We always welcome new motivated students who are interested in robotics to join our team. We offer student projects on the following general topics:&lt;br /&gt;
* intuitive teleoperation interfaces,&lt;br /&gt;
* collaborative robotics for flexible manufacturing,&lt;br /&gt;
* social robotics,&lt;br /&gt;
* autonomous ground vehicles,&lt;br /&gt;
* autonomous drones,&lt;br /&gt;
* robotics education.&lt;br /&gt;
A list of potential student projects in '''[[student projects in robotics|robotics]]''' or '''[[Soft_robotics_student_projects|soft robotics]]'''.&lt;br /&gt;
&lt;br /&gt;
== Portfolio ==&lt;br /&gt;
===Selected projects===&lt;br /&gt;
* [https://github.com/temoto-framework/temoto TeMoto] - framework for building dependable robotic applications for facilitating human-robot collaboration and autonomy (in collaboration with the [http://robotics.me.utexas.edu/ Nuclear and Applied Robotics Group at UT-Austin]).&lt;br /&gt;
* [https://semubot.ee/ SemuBot] - Estonia's first socially-assistive humanoid robot&lt;br /&gt;
* [https://www.yanu.ai/ Yanu] - fully autonomous, robot-empowered bartending unit&lt;br /&gt;
* [https://robotont.ut.ee robotont] - open source omnidirectional mobile robot platform&lt;br /&gt;
* [https://spacearchitect.org/portfolio-item/self-deployable-habitat-for-extreme-environments/ SHEE] - '''S'''elf-deployable '''H'''abitat for '''E'''xtreme '''E'''nvironments, aka &amp;quot;the Mars house&amp;quot;&lt;br /&gt;
&lt;br /&gt;
=== Selected publications ===&lt;br /&gt;
* Dāvis Krūmiņš, Sandra Schumann, Veiko Vunder, Rauno Põlluäär, Kristjan Laht, Renno Raudmäe, Alvo Aabloo, Karl Kruusamäe (2024) {{doi-inline|10.1109/TLT.2024.3381858|Open remote web lab for learning robotics and ROS with physical and simulated robots in an authentic developer environment}}, ''IEEE Transactions on Learning Technologies'' '''17''', 1325 - 1338. &lt;br /&gt;
*Selma Wanna, Fabian Parra, Robert Valner, Karl Kruusamäe, Mitch Pryor (2024) {{doi-inline|10.1080/01691864.2024.2366974|Unlocking underrepresented use-cases for large language model-driven human-robot task planning}}, ''Advanced Robotics'' '''38'''(18), 1335-1348. &lt;br /&gt;
* Houman Masnavi, Jatan Shrestha, Karl Kruusamäe, Arun Kumar Singh (2023) {{doi-inline|10.1109/LRA.2023.3312969|VACNA: Visibility-Aware Cooperative Navigation with Application in Inventory Management}}, ''IEEE Robotics and Automation Letters'' '''8'''(11), 7114 - 7121.&lt;br /&gt;
* Renno Raudmäe, Sandra Schumann, Veiko Vunder, Maarika Oidekivi, Madis Kaspar Nigol, Robert Valner, Houman Masnavi, Arun Kumar Singh, Alvo Aabloo, Karl Kruusamäe (2023) {{doi-inline|10.1016/j.ohx.2023.e00436|ROBOTONT–Open-source and ROS-supported omnidirectional mobile robot for education and research}}, ''HardwareX'' '''14''', e00436.&lt;br /&gt;
* Robert Valner, Houman Masnavi, Igor Rybalskii, Rauno Põlluäär, Erik Kõiv, Alvo Aabloo, Karl Kruusamäe, Arun Kumar Singh (2022) {{doi-inline|10.3389/frobt.2022.922835|Scalable and heterogenous mobile robot fleet-based task automation in crowded hospital environments—a field test}}, ''Frontiers in Robotics and AI'' '''9''', 922835.&lt;br /&gt;
* Robert Valner, Veiko Vunder, Alvo Aabloo, Mitch Pryor, Karl Kruusamäe (2022) {{doi-inline|10.1109/access.2022.3173647|TeMoto: A Software Framework for Adaptive and Dependable Robotic Autonomy With Dynamic Resource Management}}, ''IEEE Access'' '''10''', 51889 - 51907.&lt;br /&gt;
* Robert Valner, Jason Mario Dydynski, Sookyung Cho, Karl Kruusamäe (2021) {{doi-inline|10.1177/0018720820902293|Communication of Hazards in Mixed-Reality Telerobotic Systems: The Usage of Naturalistic Avoidance Cues in Driving Tasks}}, ''Human Factors: The Journal of the Human Factors and Ergonomics Society'' '''63'''(4), 619-634.&lt;br /&gt;
* Veiko Vunder, Robert Valner, Conor McMahon, Karl Kruusamäe, Mitch Pryor (2018) {{doi-inline|10.1109/HSI.2018.8431062|Improved Situational Awareness in ROS using Panospheric Vision and Virtual Reality}}, ''2018 11th International Conference on Human System Interaction (HSI)'', 471 - 477.&lt;br /&gt;
[[Our_publications|Full list of publications]]&lt;br /&gt;
&lt;br /&gt;
=== Outreach ===&lt;br /&gt;
* Massive open online course (MOOC) about ROS (Robot Operating System) in Estonian (https://sisu.ut.ee/rosak).&lt;br /&gt;
* Massive open online course (MOOC) about robotics in Estonian (https://sisu.ut.ee/robot).&lt;br /&gt;
* Professional [https://sisu.ut.ee/ros ROS training courses] in Estonia&lt;br /&gt;
* [http://www.robootika.ee/ School Robotics]&lt;br /&gt;
&lt;br /&gt;
[[Category:IMS-robotics]]&lt;/div&gt;</summary>
		<author><name>Karl</name></author>
	</entry>
	<entry>
		<id>https://ims.ut.ee/index.php?title=Robotics&amp;diff=40423</id>
		<title>Robotics</title>
		<link rel="alternate" type="text/html" href="https://ims.ut.ee/index.php?title=Robotics&amp;diff=40423"/>
		<updated>2024-11-21T13:23:31Z</updated>

		<summary type="html">&lt;p&gt;Karl: /* Selected publications */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==  Goal and motivation for robotics at IMS Lab ==&lt;br /&gt;
[[Image:Ims-robotics logo on white.png|thumb|300px]]&lt;br /&gt;
Robots are developed to improve the quality of our lives. Human-Robot Collaboration encompasses developing usable robots that make our everyday lives easier. Collaborative robots share their autonomy with human partners, thus improving the efficiency and quality of work in areas such as flexible manufacturing, logistics, domestic assistance, healthcare, and teleoperation in hazardous environments. We have extensive experience in research and technology across every aspect of robotics: electronics design, mechanical engineering, software development, system integration, device building, education, and training.&lt;br /&gt;
&lt;br /&gt;
== Highlights ==&lt;br /&gt;
* [https://github.com/temoto-framework/temoto TeMoto], framework for dependable robotics,&lt;br /&gt;
* omnidirectional [http://robotont.ut.ee robotont] platform for education and research in ROS,&lt;br /&gt;
* [https://semubot.ee/ SemuBot] - Estonia's first socially-assistive humanoid robot&lt;br /&gt;
* Self-deployable Habitat for Extreme Environments ([http://www.shee.eu SHEE]),&lt;br /&gt;
* Massive open online courses (MOOCs) about robotics in Estonian (https://sisu.ut.ee/robot &amp;amp; https://sisu.ut.ee/rosak/).&lt;br /&gt;
[[#Portfolio|Jump to portfolio]]&lt;br /&gt;
&lt;br /&gt;
== Capabilities ==&lt;br /&gt;
=== Equipment ===&lt;br /&gt;
Our inventory includes but is not limited to:&lt;br /&gt;
&amp;lt;gallery mode=&amp;quot;packed&amp;quot;&amp;gt;&lt;br /&gt;
File:Youbot.png|120px|thumb|KUKA youBot&lt;br /&gt;
File:Ur5_left.png|120px|thumb|Universal Robot UR5&lt;br /&gt;
File:Franka_Emika_Panda.jpg|120px|thumb|Franka Emika Panda&lt;br /&gt;
File:Clearpath_Jackal.jpg|120px|thumb|Clearpath Jackal&lt;br /&gt;
File:Kinova_KG-3_Gripper.jpg|120px|thumb|Kinova 3-finger gripper&lt;br /&gt;
File:Xarm7.jpg|120px|thumb|UFACTORY xArm7&lt;br /&gt;
File:Robotont_banner.png|120px|thumb|robotont&lt;br /&gt;
File:Turtlebot3-waffle-pi.jpg|thumb|TurtleBot3&lt;br /&gt;
File:Parrot-bebop-2.jpg|thumb|Parrot Bebop 2&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
&lt;br /&gt;
* PAL Robotics TIAGo&lt;br /&gt;
* MiR100&lt;br /&gt;
* Robotiq 2F&lt;br /&gt;
* ClearBot&lt;br /&gt;
* OSVR&lt;br /&gt;
* Oculus Rift&lt;br /&gt;
* Intel RealSense, Kinect, Leap Motion Controller, Ouster OS-1&lt;br /&gt;
&lt;br /&gt;
=== Skills ===&lt;br /&gt;
* ROS (Robot Operating System)&lt;br /&gt;
* Full robotics system development&lt;br /&gt;
* Hardware integration&lt;br /&gt;
* Process automation&lt;br /&gt;
* Motion planning and control theory&lt;br /&gt;
* System identification&lt;br /&gt;
* Data fusion&lt;br /&gt;
* Electronics design&lt;br /&gt;
* Simulations and digital twins&lt;br /&gt;
* Algorithm development&lt;br /&gt;
* Scientific publication&lt;br /&gt;
* ROS and engineering training&lt;br /&gt;
&lt;br /&gt;
== Primary contact ==&lt;br /&gt;
&lt;br /&gt;
{{Team|&lt;br /&gt;
{{TeamMember|Karl|Karl Kruusamäe|associate professor of robotics engineering}}&lt;br /&gt;
{{TeamMember|Alvo|Alvo Aabloo|professor, head of the IMS lab}}&lt;br /&gt;
}}&lt;br /&gt;
&lt;br /&gt;
== Student projects ==&lt;br /&gt;
We always welcome motivated students who are interested in robotics to join our team. We offer student projects on the following general topics:&lt;br /&gt;
* intuitive teleoperation interfaces,&lt;br /&gt;
* collaborative robotics for flexible manufacturing,&lt;br /&gt;
* social robotics,&lt;br /&gt;
* autonomous ground vehicles,&lt;br /&gt;
* autonomous drones,&lt;br /&gt;
* robotics education.&lt;br /&gt;
See the list of potential student projects in '''[[student projects in robotics|robotics]]''' or '''[[Soft_robotics_student_projects|soft robotics]]'''.&lt;br /&gt;
&lt;br /&gt;
== Portfolio ==&lt;br /&gt;
===Selected projects===&lt;br /&gt;
* [https://github.com/temoto-framework/temoto TeMoto] - framework for building dependable robotic applications for facilitating human-robot collaboration and autonomy (in collaboration with the [http://robotics.me.utexas.edu/ Nuclear and Applied Robotics Group at UT-Austin]).&lt;br /&gt;
* [https://semubot.ee/ SemuBot] - Estonia's first socially-assistive humanoid robot&lt;br /&gt;
* [https://www.yanu.ai/ Yanu] - fully autonomous robotic bartending unit&lt;br /&gt;
* [https://robotont.ut.ee robotont] - open source omnidirectional mobile robot platform&lt;br /&gt;
* [https://spacearchitect.org/portfolio-item/self-deployable-habitat-for-extreme-environments/ SHEE] - '''S'''elf-deployable '''H'''abitat for '''E'''xtreme '''E'''nvironments, aka &amp;quot;the Mars house&amp;quot;&lt;br /&gt;
&lt;br /&gt;
=== Selected publications ===&lt;br /&gt;
* Dāvis Krūmiņš, Sandra Schumann, Veiko Vunder, Rauno Põlluäär, Kristjan Laht, Renno Raudmäe, Alvo Aabloo, Karl Kruusamäe (2024) {{doi-inline|10.1109/TLT.2024.3381858|Open remote web lab for learning robotics and ROS with physical and simulated robots in an authentic developer environment}}, ''IEEE Transactions on Learning Technologies'' '''17''', 1325 - 1338. &lt;br /&gt;
* Houman Masnavi, Jatan Shrestha, Karl Kruusamäe, Arun Kumar Singh (2023) {{doi-inline|10.1109/LRA.2023.3312969|VACNA: Visibility-Aware Cooperative Navigation with Application in Inventory Management}}, ''IEEE Robotics and Automation Letters'' '''8'''(11), 7114 - 7121. &lt;br /&gt;
* Robert Valner, Houman Masnavi, Igor Rybalskii, Rauno Põlluäär, Erik Kõiv, Alvo Aabloo, Karl Kruusamäe, Arun Kumar Singh (2022) {{doi-inline|10.3389/frobt.2022.922835|Scalable and heterogenous mobile robot fleet-based task automation in crowded hospital environments—a field test}}, ''Frontiers in Robotics and AI'' '''9''', 922835.&lt;br /&gt;
* Robert Valner, Veiko Vunder, Alvo Aabloo, Mitch Pryor, Karl Kruusamäe (2022) {{doi-inline|10.1109/access.2022.3173647|TeMoto: A Software Framework for Adaptive and Dependable Robotic Autonomy With Dynamic Resource Management}}, ''IEEE Access'' '''10''', 51889 - 51907.&lt;br /&gt;
* Robert Valner, Jason Mario Dydynski, Sookyung Cho, Karl Kruusamäe (2021) {{doi-inline|10.1177/0018720820902293|Communication of Hazards in Mixed-Reality Telerobotic Systems: The Usage of Naturalistic Avoidance Cues in Driving Tasks}}, ''Human Factors: The Journal of the Human Factors and Ergonomics Society'' '''63'''(4), 619-634.&lt;br /&gt;
* Veiko Vunder, Robert Valner, Conor McMahon, Karl Kruusamäe, Mitch Pryor (2018) {{doi-inline|10.1109/HSI.2018.8431062|Improved Situational Awareness in ROS using Panospheric Vision and Virtual Reality}}, ''2018 11th International Conference on Human System Interaction (HSI)'', 471 - 477.&lt;br /&gt;
* Karl Kruusamäe, Mitch Pryor (2016) {{doi-inline|10.1109/HSI.2016.7529630|High-precision telerobot with human-centered variable perspective and scalable gestural interface}}, ''2016 9th International Conference on Human System Interactions (HSI)'', 190-196.&lt;br /&gt;
[[Our_publications|Full list of publications]]&lt;br /&gt;
&lt;br /&gt;
=== Outreach ===&lt;br /&gt;
* Massive open online course (MOOC) about ROS (Robot Operating System) in Estonian (https://sisu.ut.ee/rosak).&lt;br /&gt;
* Massive open online course (MOOC) about robotics in Estonian (https://sisu.ut.ee/robot).&lt;br /&gt;
* Professional [https://sisu.ut.ee/ros ROS training courses] in Estonia&lt;br /&gt;
* [http://www.robootika.ee/ School Robotics]&lt;br /&gt;
&lt;br /&gt;
[[Category:IMS-robotics]]&lt;/div&gt;</summary>
		<author><name>Karl</name></author>
	</entry>
	<entry>
		<id>https://ims.ut.ee/index.php?title=Robotics&amp;diff=40422</id>
		<title>Robotics</title>
		<link rel="alternate" type="text/html" href="https://ims.ut.ee/index.php?title=Robotics&amp;diff=40422"/>
		<updated>2024-11-21T13:20:36Z</updated>

		<summary type="html">&lt;p&gt;Karl: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==  Goal and motivation for robotics at IMS Lab ==&lt;br /&gt;
[[Image:Ims-robotics logo on white.png|thumb|300px]]&lt;br /&gt;
Robots are developed to improve the quality of our lives. Human-Robot Collaboration encompasses developing usable robots that make our everyday lives easier. Collaborative robots share their autonomy with human partners, thus improving the efficiency and quality of work in areas such as flexible manufacturing, logistics, domestic assistance, healthcare, and teleoperation in hazardous environments. We have extensive experience in research and technology across every aspect of robotics: electronics design, mechanical engineering, software development, system integration, device building, education, and training.&lt;br /&gt;
&lt;br /&gt;
== Highlights ==&lt;br /&gt;
* [https://github.com/temoto-framework/temoto TeMoto], framework for dependable robotics,&lt;br /&gt;
* omnidirectional [http://robotont.ut.ee robotont] platform for education and research in ROS,&lt;br /&gt;
* [https://semubot.ee/ SemuBot] - Estonia's first socially-assistive humanoid robot&lt;br /&gt;
* Self-deployable Habitat for Extreme Environments ([http://www.shee.eu SHEE]),&lt;br /&gt;
* Massive open online courses (MOOCs) about robotics in Estonian (https://sisu.ut.ee/robot &amp;amp; https://sisu.ut.ee/rosak/).&lt;br /&gt;
[[#Portfolio|Jump to portfolio]]&lt;br /&gt;
&lt;br /&gt;
== Capabilities ==&lt;br /&gt;
=== Equipment ===&lt;br /&gt;
Our inventory includes but is not limited to:&lt;br /&gt;
&amp;lt;gallery mode=&amp;quot;packed&amp;quot;&amp;gt;&lt;br /&gt;
File:Youbot.png|120px|thumb|KUKA youBot&lt;br /&gt;
File:Ur5_left.png|120px|thumb|Universal Robot UR5&lt;br /&gt;
File:Franka_Emika_Panda.jpg|120px|thumb|Franka Emika Panda&lt;br /&gt;
File:Clearpath_Jackal.jpg|120px|thumb|Clearpath Jackal&lt;br /&gt;
File:Kinova_KG-3_Gripper.jpg|120px|thumb|Kinova 3-finger gripper&lt;br /&gt;
File:Xarm7.jpg|120px|thumb|UFACTORY xArm7&lt;br /&gt;
File:Robotont_banner.png|120px|thumb|robotont&lt;br /&gt;
File:Turtlebot3-waffle-pi.jpg|thumb|TurtleBot3&lt;br /&gt;
File:Parrot-bebop-2.jpg|thumb|Parrot Bebop 2&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
&lt;br /&gt;
* PAL Robotics TIAGo&lt;br /&gt;
* MiR100&lt;br /&gt;
* Robotiq 2F&lt;br /&gt;
* ClearBot&lt;br /&gt;
* OSVR&lt;br /&gt;
* Oculus Rift&lt;br /&gt;
* Intel RealSense, Kinect, Leap Motion Controller, Ouster OS-1&lt;br /&gt;
&lt;br /&gt;
=== Skills ===&lt;br /&gt;
* ROS (Robot Operating System)&lt;br /&gt;
* Full robotics system development&lt;br /&gt;
* Hardware integration&lt;br /&gt;
* Process automation&lt;br /&gt;
* Motion planning and control theory&lt;br /&gt;
* System identification&lt;br /&gt;
* Data fusion&lt;br /&gt;
* Electronics design&lt;br /&gt;
* Simulations and digital twins&lt;br /&gt;
* Algorithm development&lt;br /&gt;
* Scientific publication&lt;br /&gt;
* ROS and engineering training&lt;br /&gt;
&lt;br /&gt;
== Primary contact ==&lt;br /&gt;
&lt;br /&gt;
{{Team|&lt;br /&gt;
{{TeamMember|Karl|Karl Kruusamäe|associate professor of robotics engineering}}&lt;br /&gt;
{{TeamMember|Alvo|Alvo Aabloo|professor, head of the IMS lab}}&lt;br /&gt;
}}&lt;br /&gt;
&lt;br /&gt;
== Student projects ==&lt;br /&gt;
We always welcome motivated students who are interested in robotics to join our team. We offer student projects on the following general topics:&lt;br /&gt;
* intuitive teleoperation interfaces,&lt;br /&gt;
* collaborative robotics for flexible manufacturing,&lt;br /&gt;
* social robotics,&lt;br /&gt;
* autonomous ground vehicles,&lt;br /&gt;
* autonomous drones,&lt;br /&gt;
* robotics education.&lt;br /&gt;
See the list of potential student projects in '''[[student projects in robotics|robotics]]''' or '''[[Soft_robotics_student_projects|soft robotics]]'''.&lt;br /&gt;
&lt;br /&gt;
== Portfolio ==&lt;br /&gt;
===Selected projects===&lt;br /&gt;
* [https://github.com/temoto-framework/temoto TeMoto] - framework for building dependable robotic applications for facilitating human-robot collaboration and autonomy (in collaboration with the [http://robotics.me.utexas.edu/ Nuclear and Applied Robotics Group at UT-Austin]).&lt;br /&gt;
* [https://semubot.ee/ SemuBot] - Estonia's first socially-assistive humanoid robot&lt;br /&gt;
* [https://www.yanu.ai/ Yanu] - fully autonomous robotic bartending unit&lt;br /&gt;
* [https://robotont.ut.ee robotont] - open source omnidirectional mobile robot platform&lt;br /&gt;
* [https://spacearchitect.org/portfolio-item/self-deployable-habitat-for-extreme-environments/ SHEE] - '''S'''elf-deployable '''H'''abitat for '''E'''xtreme '''E'''nvironments, aka &amp;quot;the Mars house&amp;quot;&lt;br /&gt;
&lt;br /&gt;
=== Selected publications ===&lt;br /&gt;
* Houman Masnavi, Jatan Shrestha, Karl Kruusamäe, Arun Kumar Singh (2023) {{doi-inline|10.1109/LRA.2023.3312969|VACNA: Visibility-Aware Cooperative Navigation with Application in Inventory Management}}, ''IEEE Robotics and Automation Letters'' '''8'''(11), 7114 - 7121. &lt;br /&gt;
* Robert Valner, Houman Masnavi, Igor Rybalskii, Rauno Põlluäär, Erik Kõiv, Alvo Aabloo, Karl Kruusamäe, Arun Kumar Singh (2022) {{doi-inline|10.3389/frobt.2022.922835|Scalable and heterogenous mobile robot fleet-based task automation in crowded hospital environments—a field test}}, ''Frontiers in Robotics and AI'' '''9''', 922835.&lt;br /&gt;
* Robert Valner, Veiko Vunder, Alvo Aabloo, Mitch Pryor, Karl Kruusamäe (2022) {{doi-inline|10.1109/access.2022.3173647|TeMoto: A Software Framework for Adaptive and Dependable Robotic Autonomy With Dynamic Resource Management}}, ''IEEE Access'' '''10''', 51889 - 51907.&lt;br /&gt;
* Robert Valner, Jason Mario Dydynski, Sookyung Cho, Karl Kruusamäe (2021) {{doi-inline|10.1177/0018720820902293|Communication of Hazards in Mixed-Reality Telerobotic Systems: The Usage of Naturalistic Avoidance Cues in Driving Tasks}}, ''Human Factors: The Journal of the Human Factors and Ergonomics Society'' '''63'''(4), 619-634.&lt;br /&gt;
* Veiko Vunder, Robert Valner, Conor McMahon, Karl Kruusamäe, Mitch Pryor (2018) {{doi-inline|10.1109/HSI.2018.8431062|Improved Situational Awareness in ROS using Panospheric Vision and Virtual Reality}}, ''2018 11th International Conference on Human System Interaction (HSI)'', 471 - 477.&lt;br /&gt;
* Karl Kruusamäe, Mitch Pryor (2016) {{doi-inline|10.1109/HSI.2016.7529630|High-precision telerobot with human-centered variable perspective and scalable gestural interface}}, ''2016 9th International Conference on Human System Interactions (HSI)'', 190-196.&lt;br /&gt;
[[Our_publications|Full list of publications]]&lt;br /&gt;
&lt;br /&gt;
=== Outreach ===&lt;br /&gt;
* Massive open online course (MOOC) about ROS (Robot Operating System) in Estonian (https://sisu.ut.ee/rosak).&lt;br /&gt;
* Massive open online course (MOOC) about robotics in Estonian (https://sisu.ut.ee/robot).&lt;br /&gt;
* Professional [https://sisu.ut.ee/ros ROS training courses] in Estonia&lt;br /&gt;
* [http://www.robootika.ee/ School Robotics]&lt;br /&gt;
&lt;br /&gt;
[[Category:IMS-robotics]]&lt;/div&gt;</summary>
		<author><name>Karl</name></author>
	</entry>
	<entry>
		<id>https://ims.ut.ee/index.php?title=Theses_in_Robotics&amp;diff=39631</id>
		<title>Theses in Robotics</title>
		<link rel="alternate" type="text/html" href="https://ims.ut.ee/index.php?title=Theses_in_Robotics&amp;diff=39631"/>
		<updated>2024-09-13T11:07:50Z</updated>

		<summary type="html">&lt;p&gt;Karl: /* ROBOTONT: integrating a graphical programming interface */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:IMS-robotics]]&lt;br /&gt;
&amp;lt;div class=&amp;quot;toclimit-3&amp;quot;&amp;gt;__TOC__&amp;lt;/div&amp;gt; &lt;br /&gt;
= Projects in Advanced Robotics =&lt;br /&gt;
''The main objective of the following projects is to give students experience in working with advanced robotics technology. Our group is active in several R&amp;amp;D projects involving human-robot collaboration, intuitive teleoperation of robots, and autonomous navigation of unmanned mobile platforms. Our main software platforms are [http://www.ros.org/ Robot Operating System (ROS)] for developing software for advanced robot systems and [http://gazebosim.org/ Gazebo] for running realistic robotic simulations.''&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
For further information, contact [[User:Karl|Karl Kruusamäe]]. &lt;br /&gt;
&lt;br /&gt;
The following is not an exhaustive list of all available thesis/research topics.&lt;br /&gt;
&lt;br /&gt;
== Highlighted thesis topics for 2024/2025 study year ==&lt;br /&gt;
# [[#ROBOTONT: analysis of different options as on-board computers|ROBOTONT: analysis of different options as on-board computers]]&lt;br /&gt;
# [[#SemuBOT: multiple topics|SemuBOT: multiple topics]]&lt;br /&gt;
# [[#ROBOTONT: integrating a graphical programming interface|ROBOTONT: integrating a graphical programming interface]]&lt;br /&gt;
# [[#Robotic Study Companion: a social robot for students in higher education|Robotic Study Companion: a social robot for students in higher education]]&lt;br /&gt;
# [[#Quantitative Evaluation of the Situation Awareness during the Teleoperation of an Urban Vehicle|Quantitative Evaluation of the Situation Awareness during the Teleoperation of an Urban Vehicle]]&lt;br /&gt;
# [[#Mixed-reality scene creation for vehicle teleoperation|Mixed-reality scene creation for vehicle teleoperation]]&lt;br /&gt;
# [[#NAO 'flippin'|NAO 'flippin']]&lt;br /&gt;
&lt;br /&gt;
== List of potential thesis topics ==&lt;br /&gt;
Our inventory includes but is not limited to:&lt;br /&gt;
&amp;lt;gallery mode=&amp;quot;packed&amp;quot;&amp;gt;&lt;br /&gt;
File:Youbot.png|120px|thumb|KUKA youBot&lt;br /&gt;
File:Ur5_left.png|120px|thumb|Universal Robot UR5&lt;br /&gt;
File:Franka_Emika_Panda.jpg|120px|thumb|Franka Emika Panda&lt;br /&gt;
File:Clearpath_Jackal.jpg|120px|thumb|Clearpath Jackal&lt;br /&gt;
File:Kinova_KG-3_Gripper.jpg|120px|thumb|Kinova 3-finger gripper&lt;br /&gt;
File:Xarm7.jpg|120px|thumb|UFACTORY xArm7&lt;br /&gt;
File:Robotont_banner.png|120px|thumb|robotont&lt;br /&gt;
File:Turtlebot3-waffle-pi.jpg|thumb|TurtleBot3&lt;br /&gt;
File:Parrot-bebop-2.jpg|thumb|Parrot Bebop 2&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== ROBOTONT: Docker-Driven ROS Environment Switching ===&lt;br /&gt;
This thesis focuses on integrating Docker containers to manage and switch between different ROS (Robot Operating System) environments on Robotont. Currently, ROS is installed natively on Robotont's Ubuntu-based onboard computer, limiting flexibility in system recovery and configuration switching. The goal is to develop a Docker-based solution that allows users to switch between ROS environments directly from Robotont’s existing low-level menu interface.&lt;br /&gt;
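As a minimal sketch of how the low-level menu could hand off to Docker, the selection could simply map a menu entry to a container invocation. All image names and flags below are illustrative assumptions, not the actual Robotont configuration:&lt;br /&gt;

```python
# Hypothetical sketch: map a menu selection to a `docker run` invocation that
# starts the chosen ROS environment. Image names and flags are assumptions.
ENVIRONMENTS = {
    "ros1-noetic": "robotont/ros:noetic",
    "ros2-humble": "robotont/ros:humble",
}

def docker_command(env_name: str) -> list:
    """Build the docker run command for the selected ROS environment."""
    image = ENVIRONMENTS[env_name]
    return [
        "docker", "run", "--rm",
        "--net=host",       # share the host network so ROS nodes can communicate
        "--privileged",     # allow access to the robot's hardware interfaces
        "-v", "/dev:/dev",  # expose device files to the container
        image,
    ]

# The menu handler would pass this list to subprocess.run() to launch the stack.
```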
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: designing and implementing a communication protocol for additional devices ===&lt;br /&gt;
Robotont currently includes a designated area on its front for attaching additional devices, such as an ultrasonic range finder, a servo motor connected to an Arduino, or a MikroBUS device. This thesis aims to design and implement a standard communication protocol that will enable seamless integration and control of these devices through the high-level ROS framework running on Robotont’s onboard computer.&lt;br /&gt;
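One possible starting point for such a protocol is a simple framed message with a checksum; the field layout below (start marker, device id, command, payload, XOR checksum) is purely a hypothetical sketch, not an existing Robotont specification:&lt;br /&gt;

```python
# Hypothetical framed message format for add-on devices: start-of-frame marker,
# 1-byte device id, 1-byte command, 1-byte payload length, payload, XOR checksum.
START = 0xAA  # assumed start-of-frame marker

def encode_frame(device_id: int, command: int, payload: bytes) -> bytes:
    """Serialize one message into a byte frame with an XOR checksum."""
    body = bytes([device_id, command, len(payload)]) + payload
    checksum = 0
    for b in body:
        checksum ^= b
    return bytes([START]) + body + bytes([checksum])

def decode_frame(frame: bytes):
    """Validate a frame and return (device_id, command, payload)."""
    if frame[0] != START:
        raise ValueError("missing start-of-frame marker")
    body, checksum = frame[1:-1], frame[-1]
    x = 0
    for b in body:
        x ^= b
    if x != checksum:
        raise ValueError("checksum mismatch")
    device_id, command, length = body[0], body[1], body[2]
    return device_id, command, body[3:3 + length]
```

A ROS node on the onboard computer could then publish decoded payloads on per-device topics, keeping the microcontroller side trivially simple.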
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: analysis of different options as on-board computers ===&lt;br /&gt;
Currently, ROBOTONT uses an Intel NUC as its onboard computer. The goal of this thesis is to validate Robotont's software stack on alternative compute devices (e.g. Raspberry Pi 4, Intel Compute Stick, and NVIDIA Jetson Nano) and benchmark their performance on the most popular Robotont use-case demos (e.g. webapp teleop, ar-tag steering, 2D mapping, and 3D mapping). The objective is to propose at least two different compute solutions: one that optimizes for cost and another that optimizes for performance. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT Lite ===&lt;br /&gt;
The goal of this thesis is to optimize the ROBOTONT platform for cost by replacing the onboard computer and sensor with low-cost alternatives while ensuring ROS/ROS2 software compatibility. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: integrating a graphical programming interface ===&lt;br /&gt;
The goal of this thesis is to integrate a graphical programming solution (e.g. Scratch or Blockly) to enable programming of Robotont by non-experts. See, e.g., https://doi.org/10.48550/arXiv.2011.13706&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== SemuBOT: multiple topics ===&lt;br /&gt;
In 2023/2024, many topics are offered to support the development of the open-source humanoid robot project SemuBOT (https://www.facebook.com/semubotmtu/)&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Robotic Study Companion: a social robot for students in higher education ===&lt;br /&gt;
Potential Topics:&lt;br /&gt;
* Enhancing the RSC Speech/NLP Capabilities&lt;br /&gt;
* Build the RSC Human-Robot Multimodal Interaction&lt;br /&gt;
* Developing the RSC Personalities/Behaviour&lt;br /&gt;
* Create a WebApp for Monitoring Student Performance via the RSC&lt;br /&gt;
* Exploring the RSC use as an Affective Robot with a focus on Students’ Academic Emotions&lt;br /&gt;
[https://github.com/Farnaz03/RoboticStudyCompanion Github] | reach out to farnaz.baksh@ut.ee for more info&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: Quick launch demo suite and the final complete release of its ROS1 software ===&lt;br /&gt;
The goal of this thesis is to refine the existing Robotont demos (e.g. ar-tag steering, follow-the-leader, dancing-with-robot, LEAP-based control, etc.) and package them in an easy-to-use way for quick deployment by anyone during public events such as science popularization workshops and school visits.&lt;br /&gt;
The results of this work will be packaged as the final ROS1 release of the Robotont software, as the EOL for ROS1 is in May 2025.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: ROS support, demos, and educational materials for open-source mobile robot ===&lt;br /&gt;
[[Image:RosLarge.png|left|100px|ROS]]&lt;br /&gt;
The project involves many potential thesis topics on the open-source robot platform ROBOTONT. The nature of the thesis can be software development to improve the platform's capabilities, simulation of specific scenarios, and/or demonstration of ROBOTONT in a real-life setting. A more detailed thesis topic will be outlined during an in-person meeting.&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
[[Image:Ros_equation.png|x100px|What is ROS?]]&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Virtual reality user interface (VRUI) for intuitive teleoperation system ===&lt;br /&gt;
[[Image:LeapMotion.png|200px|thumb|right|Detecting 2 hands with Leap Motion Controller]]&lt;br /&gt;
Enhancing the user experience of a virtual reality UI developed by [https://github.com/ut-ims-robotics/vrui_rviz Georg Astok]. Potentially adding [http://www.osvr.org/hardware-devs.html virtual reality capability] to a [https://www.youtube.com/watch?v=L25HHFd00rc gesture- and natural-language-based robot teleoperation system].&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
[[Image:Temoto-working-principle-in-pics.png|x160px|Gesture-based teleoperation]]&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Health monitor for intuitive telerobot ===&lt;br /&gt;
Intelligent status and error handling for an intuitive telerobotic system.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== 3D scanning of industrial objects ===&lt;br /&gt;
Using laser sensors and cameras to create accurate models of industrial products for quality control or further processing.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Modeling humans for human-robot interaction ===&lt;br /&gt;
True human-robot collaboration means that the robot must understand the actions, intentions, and state of its human partner. This work involves using cameras and other human-sensing devices to digitally represent and model humans. There are multiple stages of modeling: a) physical models of human kinematics and dynamics; b) higher-level models for recognizing human intent.&amp;lt;br&amp;gt;&lt;br /&gt;
[[Image:Skeletal ROS.PNG|x160px|ROS &amp;amp; Kinect &amp;amp; Skeleton-Markers Package]]&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Robotic avatar for telepresence ===&lt;br /&gt;
Integrating hand gestures and head movements to control a robot avatar in virtual reality user interface.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Detection of hardware and software resources for smart integration of robots ===&lt;br /&gt;
The vast majority of today’s robotic applications rely on hard-coded device and algorithm usage. This project focuses on developing Resource Snooper software that can detect the addition or removal of resources for the benefit of dynamic reconfiguration of robotic systems. This project is developed as a subsystem of [https://utnuclearroboticspublic.github.io/temoto2/index.html TeMoto].&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Sonification of feedback during teleoperation of robots ===&lt;br /&gt;
Humans are used to receiving auditory feedback in their everyday lives. It helps us make decisions and be aware of potential dangers. Telerobotic interfaces can deploy the same idea to improve Situational Awareness and robotic task efficiency. The thesis project involves a study of different sonification solutions and their implementation in a telerobotic application using ROS. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Human-Robot and Robot-Robot collaboration applications ===&lt;br /&gt;
Creating a demo or analysis of software capabilities related to human-robot or robot-robot teams:&lt;br /&gt;
* human-robot collaborative assembly&lt;br /&gt;
* distributed mapping; analysis and demo of existing ROS (e.g., segmap https://youtu.be/JJhEkIA1xSE) packages for multi-robot mapping&lt;br /&gt;
* Inaccessible region teamwork&lt;br /&gt;
**youbot+drone - a drone maps the environment (for example, a maze) and a ground vehicle uses this information to traverse the maze&lt;br /&gt;
**youbot+robotont - the youbot cannot go up ledges, but it can lift a smaller robot, such as robotont, up a ledge.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Mirroring human hand movements on industrial robots ===&lt;br /&gt;
The goal of this project is to integrate continuous control of an industrial robot manipulator with a gestural telerobotics interface. The recommended tools for this thesis project are a Leap Motion Controller or a standard web camera, Ultraleap, a Universal Robot UR5 manipulator, and ROS.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== ROBOTONT: TeMoto for robotont ===&lt;br /&gt;
Swarm-management and UMRF-based task loading for robotont using [https://github.com/temoto-framework TeMoto] framework.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Enhancing teleoperation control interface with augmented cues to provoke caution===&lt;br /&gt;
The task is to create a telerobot control interface where the video feed from the remote site and/or a mixed-reality scene is augmented with visual cues to provoke caution in the human operator.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Robot-to-human interaction ===&lt;br /&gt;
As robots and autonomous machines start sharing the same space as humans, their actions need to be understood by the people occupying that space. For instance, a human worker needs to understand what the robot partner is planning next, or a pedestrian needs to clearly comprehend the behaviour of a driverless vehicle. To reduce this ambiguity, the robot needs mechanisms to convey its intent (whatever it is going to do next). The aim of the thesis is to outline existing methods for machines to convey their intent and to develop a unified model interface for expressing that intent. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Gaze-based handover prediction ===&lt;br /&gt;
When a human needs to pass an object to a robot manipulator, the robot must understand where in 3D space the object handover occurs and then plan an appropriate motion. Human gaze can be used as the input for predicting which object to track. The thesis activities involve camera-based eye tracking and safe motion planning.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Real-world demonstrator for MIR+UR+TeMoto integration  ===&lt;br /&gt;
Integration of mobile manipulator (MIR100 + UR5e + Robotiq gripper) to demonstrate TeMoto in a collaborative application.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== ROBOTONT: Human-height human-robot interface for Robotont ground robot ===&lt;br /&gt;
Robotont is an ankle-high flat mobile robot. For humans to interact with Robotont, there is a need for a compact and lightweight mechanical structure that is tall enough for comfortable human-robot interaction. The objective of the thesis is to develop the mechanical designs and build prototypes that ensure stable operation and meet the aesthetic requirements for use in public places. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Stratos Explore Ultraleap demonstrator for robotics ===&lt;br /&gt;
The aim of this thesis is to systematically analyze the strengths and limitations of the Stratos Explore Ultraleap device in the context of controlling a robot, and subsequently to implement a demonstrator application showing its applicability for robot control. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Mixed-reality scene creation for vehicle teleoperation ===&lt;br /&gt;
Fusing different sensory feeds to create a high-usability teleoperation scene.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Validation study for AR-based robot user-interfaces ===&lt;br /&gt;
Designing and carrying out a user study to validate the functionality and usability of a human-robot interface.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Replication of the MIT Hydra demo ===&lt;br /&gt;
The goal of the thesis is to use the Hydra software package and integrate it for a robot used at IMS Robotics (e.g., TIAGo, Jackal, Robotont).&lt;br /&gt;
&amp;lt;br&amp;gt;Hydra takes sensor data as input (stereo or RGB-D camera images and IMU data) and produces a hierarchical model of the environment by estimating the trajectory of the robot (including loop closures), building a metric-semantic 3D mesh, and segmenting objects, places, and rooms in an indoor environment. These representations are combined into a 3D scene graph, which enables novel approaches for hierarchical loop closure detection and ensures the representations remain consistent after loop closure. Hydra is implemented in C++ and is ROS-compatible. It uses a neural-network-based image segmentation front-end, but otherwise relies on an efficient, multi-threaded CPU-based implementation, which is suitable for mobile robot deployment.&lt;br /&gt;
&amp;lt;br&amp;gt;LINKS:&lt;br /&gt;
&amp;lt;br&amp;gt;Video: https://youtu.be/qZg2lSeTuvM&lt;br /&gt;
&amp;lt;br&amp;gt;Code: https://github.com/MIT-SPARK/Hydra&lt;br /&gt;
&amp;lt;br&amp;gt;Paper: http://www.roboticsproceedings.org/rss18/p050.pdf&lt;br /&gt;
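The layered representation Hydra builds (objects within places within rooms) can be pictured with a toy scene-graph structure; the sketch below is only an illustration of the hierarchy, not Hydra's actual C++ API.

```python
class SceneGraph:
    """Toy hierarchical scene graph: rooms contain places, places contain objects.

    Illustrates the layered representation only; real Hydra nodes also carry
    metric-semantic meshes and robot-trajectory information.
    """
    def __init__(self):
        self.rooms = {}    # room name -> set of place names
        self.places = {}   # place name -> set of object names

    def add_room(self, room):
        self.rooms.setdefault(room, set())

    def add_place(self, room, place):
        self.add_room(room)
        self.rooms[room].add(place)
        self.places.setdefault(place, set())

    def add_object(self, place, obj):
        self.places.setdefault(place, set()).add(obj)

    def objects_in_room(self, room):
        # Walk one level down the hierarchy and collect all objects.
        return {o for p in self.rooms.get(room, ()) for o in self.places.get(p, ())}
```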
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Quantitative Evaluation of the Situation Awareness during the Teleoperation of an Urban Vehicle ===&lt;br /&gt;
The goal of this thesis is to develop a methodology for measuring the operator's situation awareness while teleoperating an autonomous urban vehicle. Potential metrics could include driving accuracy, speed, and latency. This thesis will be conducted as part of the activities at the [https://adl.cs.ut.ee Autonomous Driving Lab].&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== NAO 'flippin' ===&lt;br /&gt;
The University of Tartu has a set of [https://www.aldebaran.com/en/nao NAO robots], which have been lying on shelves for some years now. It is time to make these robots into functional social robots that can be used, e.g., for children's communication therapy or as buddies for the elderly. The goal of this thesis is to test and upgrade all the NAO robots to functional condition.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
= Completed projects =&lt;br /&gt;
== PhD theses ==&lt;br /&gt;
*Houman Masnavi, [https://hdl.handle.net/10062/91394 Visibility aware navigation] [Nähtavust arvestav navigatsioon], PhD thesis, 2023&lt;br /&gt;
&lt;br /&gt;
== Master's theses ==&lt;br /&gt;
*Robert Allik, Validation of NoMaD as a Global Planner for Mobile Robots, MS thesis, 2024&lt;br /&gt;
*Rauno Põlluäär, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=79393 Designing and Implementing a Bird’s-eye View Interface for a Self-driving Vehicle’s Teleoperation System] [Isejuhtiva sõiduki kaugjuhtimissüsteemile linnuvaate kasutajaliidese loomine], MS thesis, 2024&lt;br /&gt;
*Eva Mõtshärg, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=77385 3D-prinditava kere disain ja analüüs vabavaralisele haridusrobotile Robotont] [Design and Analysis of a 3D Printable Chassis for the Open Source Educational Robot Robotont], MS thesis, 2023&lt;br /&gt;
*Farnaz Baksh, [https://dspace.ut.ee/items/a7a9cc15-27e9-450c-94e8-3267f0c95c56 An Open-source Robotic Study Companion for University Students] [Avatud lähtekoodiga robotõpikaaslane üliõpilastele], MS thesis, 2023&lt;br /&gt;
*Igor Rybalskii, [http://hdl.handle.net/10062/83028 Augmented reality (AR) for enabling human-robot collaboration with ROS robots] [Liitreaalsus inimese ja roboti koostöö võimaldamiseks ROS-i robotitega], MS thesis, 2022&lt;br /&gt;
*Md. Maniruzzaman, [http://hdl.handle.net/10062/83025 Object search and retrieval in indoor environment using a Mobile Manipulator] [Objektide otsimine ja teisaldamine siseruumides mobiilse manipulaatori abil], MS thesis, 2022&lt;br /&gt;
*Allan Kustavus, [http://hdl.handle.net/10062/72651 Design and Implementation of a Generalized Resource Management Architecture in the TeMoto Software Framework] [Üldise ressursihalduri disain ja teostus TeMoto tarkvara raamistikule], MS thesis, 2021&lt;br /&gt;
*Kristina Meister, [http://hdl.handle.net/10062/72350 External human-vehicle interaction - a study in the context of an autonomous ride-hailing service], MS thesis, 2021&lt;br /&gt;
*Muhammad Usman, [http://hdl.handle.net/10062/72126 Development of an Optimization-Based Motion Planner and Its ROS Interface for a Non-Holonomic Mobile Manipulator] [Optimeerimisele baseeruva liikumisplaneerija arendamine ja selle ROSi liides mitteholonoomse mobiilse manipulaatori jaoks], MS thesis, 2020&lt;br /&gt;
*Maarika Oidekivi, [http://hdl.handle.net/10062/72119 Masina kavatsuse väljendamine ja tõlgendamine] [Communicating and interpreting machine intent], MS thesis, 2020&lt;br /&gt;
*Houman Masnavi, [http://hdl.handle.net/10062/72118 Multi-Robot Motion Planning for Shared Payload Transportation] [Rajaplaneerimine multi-robot süsteemile jagatud lasti transportimisel], MS thesis, 2020&lt;br /&gt;
*Fabian Ernesto Parra Gil, [http://hdl.handle.net/10062/72112 Implementation of Robot Manager Subsystem for Temoto Software Framework] [Robotite Halduri alamsüsteemi väljatöötamine tarkvararaamistikule TEMOTO], MS thesis, 2020&lt;br /&gt;
*Zafarullah, [http://hdl.handle.net/10062/72125 Gaze Assisted Neural Network based Prediction of End-Point of Human Reaching Trajectories], MS thesis, 2020&lt;br /&gt;
*Madis K Nigol, [http://hdl.handle.net/10062/64339 Õppematerjalid robotplatvormile Robotont] [Study materials for robot platform Robotont], MS thesis, 2019&lt;br /&gt;
*Renno Raudmäe, [http://hdl.handle.net/10062/64341 Avatud robotplatvorm Robotont] [Open source robotics platform Robotont], MS thesis, 2019&lt;br /&gt;
*Asif Sattar, [http://hdl.handle.net/10062/64352 Human detection and distance estimation with monocular camera using YOLOv3 neural network] [Inimeste tuvastamine ning kauguse hindamine kasutades kaamerat ning YOLOv3 tehisnärvivõrku], MS thesis, 2019&lt;br /&gt;
*Ragnar Margus, [http://hdl.handle.net/10062/64337 Kergliiklusvahendite jagamisteenuseks vajaliku positsioneerimismooduli loomine ja uurimine] [Development and testing of an IoT module for an electric vehicle sharing service], MS thesis, 2019&lt;br /&gt;
*Pavel Šumejko, [http://hdl.handle.net/10062/64320 Robust Solution for Extrinsic Calibration of a 2D Laser-Rangefinder and a Monocular USB Camera] [Meetod 2D laserkaugusmõõdiku ja USB-kaamera väliseks kalibreerimiseks], MS thesis, 2019&lt;br /&gt;
*Dzvezdana Arsovska, [http://hdl.handle.net/10062/64321  Building an Efficient and Secure Software Supply Pipeline for Aerial Robotics Application] [Efektiivne ja turvaline tarkvaraarendusahel lennurobootika rakenduses], MS thesis, 2019&lt;br /&gt;
*Tõnis Tiimus, [http://hdl.handle.net/10062/60713 Visuaalselt esilekutsutud potentsiaalidel põhinev aju-arvuti liides robootika rakendustes] [A VEP-based BCI for robotics applications], MS thesis, 2018&lt;br /&gt;
*Martin Appo, [http://hdl.handle.net/10062/60718 Hardware-agnostic compliant control ROS package for collaborative industrial manipulators] [Riistvarapaindlik ROSi tarkvarapakett tööstuslike robotite mööndlikuks juhtimiseks], MS thesis, 2018&lt;br /&gt;
*Hassan Mahmoud Shehawy Elhanash, [http://hdl.handle.net/10062/60711 Optical Tracking of Forearm for Classifying Fingers Poses] [Küünarvarre visuaalne jälgimine ennustamaks sõrmede asendeid], MS thesis, 2018&lt;br /&gt;
&lt;br /&gt;
== Bachelor's theses ==&lt;br /&gt;
*Albert Unn, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=79937 Suhtlusvõimekuse arendamine sotsiaalsele humanoidrobotile SemuBot] [Developing the ability to communicate for SemuBot, a social humanoid robot], BS thesis, 2024&lt;br /&gt;
*Veronika-Marina Volynets, [https://hdl.handle.net/10062/99943 Development of Control Electronics and Program for Robotont's Height Adjustment Mechanism] [Juhtelektroonika ja programmi väljatöötamine Robotondi kõrguse reguleerimise mehhanismile], BS thesis, 2024&lt;br /&gt;
*Nikita Kurenkov, [https://hdl.handle.net/10062/99880 Flexible Screen Integration and Development of Neck Movement Mechanism for Social Humanoid Robot SemuBot] [Humanoidroboti näo lahenduse leidmine ja rakendamine; humanoidroboti jaoks kaela mehhanismi väljatöötamine], BS thesis, 2024&lt;br /&gt;
*Elchin Huseynov, [https://hdl.handle.net/10062/99668 Design and Control of a Social Humanoid Robot - SemuBot’s Hand] [Sotsiaalse Humanoidroboti Disain ja Juhtimine - SemuBoti Käsi], BS thesis, 2024&lt;br /&gt;
*Veronika Podliesnova, [https://hdl.handle.net/10062/99636 Real-Time Detection of Robot Failures by Monitoring Operator’s Brain Activity with EEG-based Brain-Computer Interface] [Reaalajas Robotirikke Tuvastamine Operaatori Ajutegevuse Jälgimise teel EEG-põhise Aju-Arvuti Liidese abil], BS thesis, 2024&lt;br /&gt;
*Märten Josh Peedimaa, [https://hdl.handle.net/not-yet Rhea: An Open-Source Table Tennis Ball Launcher Robot for Multiball Training] [Avatud lähtekoodiga lauatennise palliviske robot mitmikpall treenimiseks], BS thesis, 2024&lt;br /&gt;
*Karl Sander Vinkel, [https://hdl.handle.net/not-yet Laadimisjaama ja transportkesta väljatöötamine õpperobotile Robotont] [Development of charging dock and transport case for Robotont], BS thesis, 2024&lt;br /&gt;
*Raimo Köidam, [https://hdl.handle.net/not-yet Valguslahenduse tarkvara väljatöötamine õpperobotile Robotont] [Development of light solution software for the educational robot Robotont], BS thesis, 2024&lt;br /&gt;
*Robert Valge, [https://hdl.handle.net/not-yet Toitepinge ja tarbevoolu monitoorimine ning toitehalduse püsivara loomine õpperobotil Robotont] [Monitoring supply voltage and current consumption and creating firmware for Robotont’s power management system], BS thesis, 2024&lt;br /&gt;
*Leonid Tšigrinski, [https://hdl.handle.net/not-yet Õpperoboti Robotont püsivara arhitektuuri uuendamine] [Education robot &amp;quot;Robotont&amp;quot; firmware architecture updating], BS thesis, 2024&lt;br /&gt;
*Andres Sakk, [https://hdl.handle.net/not-yet Avatud robotplatvormi Robotont 3 kasutajaliidese väljatöötamine] [Development of a user interface for the open robotics platform Robotont 3], BS thesis, 2024&lt;br /&gt;
*Kaur Kullamäe, [https://hdl.handle.net/not-yet Elektroonikalahendus ja püsivara robotkäe juhtimiseks sotsiaalsel humanoidrobotil SemuBot] [Building the SemuBot arm electronics system], BS thesis, 2024&lt;br /&gt;
*Kristjan Madis Kask, [https://hdl.handle.net/not-yet Käe mehaanika disain sotsiaalsele humanoidrobotile SemuBot] [Arm mechanics design for social humanoid robot SemuBot], BS thesis, 2024&lt;br /&gt;
*Sven-Ervin Paap, [https://hdl.handle.net/not-yet ROS2 draiver õpperobotile Robotont] [ROS2 driver for the educational robot Robotont], BS thesis, 2024&lt;br /&gt;
*Timur Nizamov, [https://hdl.handle.net/10062/99592 Audio System for the Social Humanoid Robot SemuBot] [Helisüsteem sotsiaalsele humanoidrobotile SemuBot], BS thesis, 2024&lt;br /&gt;
*Georgs Narbuts, [https://hdl.handle.net/10062/99554 Holonomic Motion Drive System of a Social Humanoid Robot SemuBot] [Sotsiaalse humanoidroboti SemuBoti holonoomne liikuv ajamisüsteem], BS thesis, 2024&lt;br /&gt;
*Iryna Hurova, [https://hdl.handle.net/10062/90638 Kitting station of the learning factory] [Õppetehase komplekteerimisjaam], BS thesis, 2023&lt;br /&gt;
*Paola Avalos Conchas, [https://hdl.handle.net/10062/90641 Payload transportation system of a learning factory] [Õppetehase kasuliku koorma transpordisüsteem], BS thesis, 2023&lt;br /&gt;
*Pille Pärnalaas, [https://hdl.handle.net/10062/93431 Pöördpõik ajami arendus robotplatvormile Robotont] [Development of swerve drive for robotic platform Robotont], BS thesis, 2023&lt;br /&gt;
*Priit Rooden, [https://hdl.handle.net/10062/93432 Autonoomse laadimislahenduse väljatöötamine õpperobotile Robotont] [Development of an autonomous charging solution for the robot platform Robotont], BS thesis, 2023&lt;br /&gt;
*Marko Muro, [https://hdl.handle.net/10062/93429 Robotondi akulahenduse ning 12 V pingeregulaatori prototüüpimine] [Prototyping battery solution and 12 V voltage regulator for Robotont], BS thesis, 2023&lt;br /&gt;
*Danel Leppenen, [https://hdl.handle.net/10062/93427 Nav2 PYIF: Python-based Motion Planning for ROS 2 Navigation 2] [Nav 2 PYIF: Pythoni põhine liikumise planeerija ROS 2 Navigation 2-le], BS thesis, 2023&lt;br /&gt;
*Kertrud Geddily Küüt, [https://hdl.handle.net/10062/93424 Kõrgust reguleeriv mehhanism Robotondile] [Height adjusting mechanism for Robotont], BS thesis, 2023&lt;br /&gt;
*Ingvar Drikkit, [https://hdl.handle.net/10062/93421 Lisaseadmete võimekuse arendamine haridusrobotile Robotont] [Developing add-on device support for the educational robot Robotont], BS thesis, 2023&lt;br /&gt;
*Kristo Pool, [https://hdl.handle.net/10062/93430 MoveIt 2 õppematerjalid] [Learning materials for MoveIt 2], BS thesis, 2023&lt;br /&gt;
*Erki Veeväli, [https://hdl.handle.net/10062/93434 Development of a Continuous Teleoperation System for Urban Road Vehicle] [Linnasõiduki pideva kaugjuhtimissüsteemi arendus], BS thesis, 2023&lt;br /&gt;
*Aleksandra Doroshenko, [https://hdl.handle.net/10062/93420 Haiglates inimesi juhatava roboti disain] [Hospital guide robot design], BS thesis, 2023&lt;br /&gt;
*Hui Shi, [http://hdl.handle.net/10062/83043 Expanding the Open-source ROS Software Package opencv_apps with Dedicated Blob Detection Functionality] [Avatud lähtekoodiga ROS-i tarkvarakimbu opencv_apps laiendamine laigutuvasti funktsionaalsusega], BS thesis, 2022&lt;br /&gt;
*Dāvis Krūmiņš, [http://hdl.handle.net/10062/83040 Web-based learning and software development environment for remote access of ROS robots] [Veebipõhine õppe- ja tarkvaraarenduse keskkond ROS robotite juurdepääsuks kaugteel], BS thesis, 2022&lt;br /&gt;
*Anna Jakovleva, [http://hdl.handle.net/10062/83037 Roboquiz - an interactive human-robot game] [Roboquiz - interaktiivne inimese ja roboti mäng], BS thesis, 2022&lt;br /&gt;
*Kristjan Laht, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=74702&amp;amp;year=2022 Robot Localization with Fiducial Markers] [Roboti lokaliseerimine koordinaatmärkidega], BS thesis, 2022&lt;br /&gt;
*Hans Pärtel Pani, [http://hdl.handle.net/10062/83008 ROS draiver pehmerobootika haaratsile] [ROS Driver for Soft Robotic Gripper], BS thesis, 2022&lt;br /&gt;
*Markus Erik Sügis, [http://hdl.handle.net/10062/83015 Jagatud juhtimise põhimõttel realiseeritud robotite kaugjuhtimissüsteem] [A continuous teleoperating system based on shared control concept], BS thesis, 2022&lt;br /&gt;
*Taaniel Küla, [http://hdl.handle.net/10062/83007 ROS2 platvormile keevitusroboti tarkvara portotüübi loomine kasutades UR5 robot-manipulaatorit] [Welding robot software prototype for ROS2 using UR5 robot arm], BS thesis, 2022&lt;br /&gt;
*Rauno Põlluäär, [http://hdl.handle.net/10062/72672 Veebirakendus-põhine kasutajaliides avatud robotplatvormi Robotont juhtimiseks ja haldamiseks] [Web application-based user interface for controlling and managing open-source robotics platform Robotont], BS thesis, 2021&lt;br /&gt;
*Hendrik Olesk, [http://hdl.handle.net/10062/72664 Nägemisulatuses kaugjuhitava mobiilse robotmanipulaatori kasutajamugavuse tõstmine] [Improving the usability of a mobile manipulator robot for line-of-sight remote control], BS thesis, 2021&lt;br /&gt;
*Tarvi Tepandi, [http://hdl.handle.net/10062/72665 Segareaalsusel põhinev kasutajaliides mobiilse roboti kaugjuhtimiseks Microsoft HoloLens 2 vahendusel] [Mixed-reality user interface for teleoperating mobile robots with Microsoft HoloLens 2], BS thesis, 2021&lt;br /&gt;
*Rudolf Põldma, [http://hdl.handle.net/10062/72674 Tartu linna Narva maantee ringristmiku digikaksik] [Digital twin for Narva street roundabout in Tartu], BS thesis, 2021&lt;br /&gt;
*Kwasi Akuamoah Boateng, [http://hdl.handle.net/10062/72804 Digital Twin of a Teaching and Learning Robotics Lab] [Robotite õpetamise ja õppimise labori digitaalne kaksik], BS thesis, 2021&lt;br /&gt;
*Karina Sein, [http://hdl.handle.net/10062/72102 Eestikeelse kõnesünteesi võimaldamine robootika arendusplatvormil ROS] [Enabling Estonian speech synthesis on the Robot Operating System (ROS)], BS thesis, 2020&lt;br /&gt;
*Ranno Mäesepp, [http://hdl.handle.net/10062/72100 Takistuste vältimise lahendus õpperobotile Robotont] [Obstacle avoidance solution for educational robot platform Robotont], BS thesis, 2020&lt;br /&gt;
*Igor Rybalskii, [http://hdl.handle.net/10062/72060 Gesture Detection Software for Human-Robot Collaboration] [Žestituvastus tarkvara inimese ja roboti koostööks], BS thesis, 2020&lt;br /&gt;
*Meelis Pihlap, [http://hdl.handle.net/10062/64292 Mitme roboti koostöö funktsionaalsuste väljatöötamine tarkvararaamistikule TeMoto] [Multi-robot collaboration functionalities for robot software development framework TeMoto], BS thesis, 2019&lt;br /&gt;
*Kaarel Mark, [http://hdl.handle.net/10062/64290 Liitreaalsuse kasutamine tootmisprotsessis asukohtade määramisel] [Augmented reality for location determination in manufacturing], BS thesis, 2019&lt;br /&gt;
*Kätriin Julle, [http://hdl.handle.net/10062/64279 Roboti KUKA youBot riistvara ja ROS-tarkvara uuendamine] [Upgrading robot KUKA youBot’s hardware and ROS-software], BS thesis, 2019&lt;br /&gt;
*Georg Astok, [http://hdl.handle.net/10062/64274 Roboti juhtimine virtuaalreaalsuses kasutades ROS-raamistikku] [Creating virtual reality user interface using only ROS framework], BS thesis, 2019&lt;br /&gt;
*Martin Hallist, [http://hdl.handle.net/10062/64275 Robotipõhine kaugkohalolu käte liigutuste ülekandmiseks] [Teleoperation robot for arms motions], BS thesis, 2019&lt;br /&gt;
*Ahmed Hassan Helmy Mohamed, [http://hdl.handle.net/10062/63946 Software integration of autonomous robot system for mixing and serving drinks] [Jooke valmistava ja serveeriva robotsüsteemi tarkvaralahendus], BS thesis, 2019&lt;br /&gt;
*Kristo Allaje, [http://hdl.handle.net/10062/60294 Draiveripakett käte jälgimise seadme Leap Motion™ kontroller kasutamiseks robootika arendusplatvormil ROS] [Driver package for using Leap Motion™ controller on the robotics development platform ROS], BS thesis, 2018&lt;br /&gt;
*Martin Maidla, [http://hdl.handle.net/10062/60288 Avatud robotiarendusplatvormi Robotont omniliikumise ja odomeetria arendamine] [Omnimotion and odometry development for open robot development platform Robotont], BS thesis, 2018&lt;br /&gt;
*Raid Vellerind, [http://hdl.handle.net/10062/56559 Avatud robotiarendusplatvormi ROS võimekuse loomine Tartu Ülikooli Robotexi robootikaplatvormile] [ROS driver development for the University of Tartu’s Robotex robotics platform], BS thesis, 2017&lt;br /&gt;
&lt;br /&gt;
[[Category:Theses Topics]]&lt;/div&gt;</summary>
		<author><name>Karl</name></author>
	</entry>
	<entry>
		<id>https://ims.ut.ee/index.php?title=Theses_in_Robotics&amp;diff=39523</id>
		<title>Theses in Robotics</title>
		<link rel="alternate" type="text/html" href="https://ims.ut.ee/index.php?title=Theses_in_Robotics&amp;diff=39523"/>
		<updated>2024-09-04T06:48:20Z</updated>

		<summary type="html">&lt;p&gt;Karl: /* Highlighted theses topics for 2024/2025 study year */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:IMS-robotics]]&lt;br /&gt;
&amp;lt;div class=&amp;quot;toclimit-3&amp;quot;&amp;gt;__TOC__&amp;lt;/div&amp;gt; &lt;br /&gt;
= Projects in Advanced Robotics =&lt;br /&gt;
''The main objective of the following projects is to give students experience in working with advanced robotics technology. Our group is active in several R&amp;amp;D projects involving human-robot collaboration, intuitive teleoperation of robots, and autonomous navigation of unmanned mobile platforms. Our main software platforms are [http://www.ros.org/ Robot Operating System (ROS)] for developing software for advanced robot systems and [http://gazebosim.org/ Gazebo] for running realistic robotic simulations.''&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
For further information, contact [[User:Karl|Karl Kruusamäe]]. &lt;br /&gt;
&lt;br /&gt;
The following is not an exhaustive list of all available thesis/research topics.&lt;br /&gt;
&lt;br /&gt;
== Highlighted thesis topics for 2024/2025 study year ==&lt;br /&gt;
# [[#ROBOTONT: analysis of different options as on-board computers|ROBOTONT: analysis of different options as on-board computers]]&lt;br /&gt;
# [[#SemuBOT: multiple topics|SemuBOT: multiple topics]]&lt;br /&gt;
# [[#ROBOTONT: integrating a graphical programming interface|ROBOTONT: integrating a graphical programming interface]]&lt;br /&gt;
# [[#Robotic Study Companion: a social robot for students in higher education|Robotic Study Companion: a social robot for students in higher education]]&lt;br /&gt;
# [[#Quantitative Evaluation of the Situation Awareness during the Teleoperation of an Urban Vehicle|Quantitative Evaluation of the Situation Awareness during the Teleoperation of an Urban Vehicle]]&lt;br /&gt;
# [[#Mixed-reality scene creation for vehicle teleoperation|Mixed-reality scene creation for vehicle teleoperation]]&lt;br /&gt;
# [[#NAO 'flippin'|NAO 'flippin']]&lt;br /&gt;
&lt;br /&gt;
== List of potential thesis topics ==&lt;br /&gt;
Our inventory includes but is not limited to:&lt;br /&gt;
&amp;lt;gallery mode=&amp;quot;packed&amp;quot;&amp;gt;&lt;br /&gt;
File:Youbot.png|120px|thumb|KUKA youBot&lt;br /&gt;
File:Ur5_left.png|120px|thumb|Universal Robot UR5&lt;br /&gt;
File:Franka_Emika_Panda.jpg|120px|thumb|Franka Emika Panda&lt;br /&gt;
File:Clearpath_Jackal.jpg|120px|thumb|Clearpath Jackal&lt;br /&gt;
File:Kinova_KG-3_Gripper.jpg|120px|thumb|Kinova 3-finger gripper&lt;br /&gt;
File:Xarm7.jpg|120px|thumb|UFACTORY xArm7&lt;br /&gt;
File:Robotont_banner.png|120px|thumb|robotont&lt;br /&gt;
File:Turtlebot3-waffle-pi.jpg|thumb|TurtleBot3&lt;br /&gt;
File:Parrot-bebop-2.jpg|thumb|Parrot Bebop 2&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== ROBOTONT: Docker-Driven ROS Environment Switching ===&lt;br /&gt;
This thesis focuses on integrating Docker containers to manage and switch between different ROS (Robot Operating System) environments on Robotont. Currently, ROS is installed natively on Robotont's Ubuntu-based onboard computer, limiting flexibility in system recovery and configuration switching. The goal is to develop a Docker-based solution that allows users to switch between ROS environments directly from Robotont’s existing low-level menu interface.&lt;br /&gt;
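A minimal sketch of the switching step, assuming each ROS environment is packaged as a Docker image (the image names, mount path, and flags below are hypothetical choices, not an existing Robotont configuration):

```python
def docker_run_command(env_name, environments, workspace="/home/robotont/catkin_ws"):
    """Build the `docker run` argument list for a named ROS environment.

    `environments` maps menu entries to image tags; the actual images and
    mount points would be defined by the thesis project.
    """
    image = environments[env_name]
    return [
        "docker", "run", "--rm", "-it",
        "--net=host",                      # share the host network so ROS nodes can talk
        "--privileged",                    # access to motor-driver serial devices
        "-v", f"{workspace}:{workspace}",  # persist the workspace across switches
        image,
    ]
```

The low-level menu interface would map each entry to such a command and hand it to the Docker daemon.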
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: designing and implementing a communication protocol for additional devices ===&lt;br /&gt;
Robotont currently includes a designated area on its front for attaching additional devices, such as an ultrasonic range finder, a servo motor connected to an Arduino, or a MikroBUS device. This thesis aims to design and implement a standard communication protocol that will enable seamless integration and control of these devices through the high-level ROS framework running on Robotont’s onboard computer.&lt;br /&gt;
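As an illustration of what such a protocol could look like, below is a simple byte-framed format with a device ID, command, length field, and XOR checksum; the frame layout is a hypothetical proposal, not an existing Robotont standard.

```python
START = 0xAA  # start-of-frame marker (illustrative choice)

def encode_frame(device_id, command, payload=b""):
    """Encode one frame: [START, device_id, command, length, payload..., checksum].

    The checksum is the XOR of every byte after START.
    """
    if len(payload) > 255:
        raise ValueError("payload too long")
    body = bytes([device_id, command, len(payload)]) + bytes(payload)
    checksum = 0
    for b in body:
        checksum ^= b
    return bytes([START]) + body + bytes([checksum])

def decode_frame(frame):
    """Validate a frame and return (device_id, command, payload)."""
    if len(frame) < 5 or frame[0] != START:
        raise ValueError("bad frame")
    body, checksum = frame[1:-1], frame[-1]
    calc = 0
    for b in body:
        calc ^= b
    if calc != checksum or body[2] != len(body) - 3:
        raise ValueError("corrupted frame")
    return body[0], body[1], bytes(body[3:])
```

On the robot, frames like these would travel over the serial or MikroBUS link, with a thin ROS node translating them to topics and services.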
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: analysis of different options as on-board computers ===&lt;br /&gt;
Currently, ROBOTONT uses an Intel NUC as its onboard computer. The goal of this thesis is to validate Robotont's software stack on alternative compute devices (e.g., Raspberry Pi 4, Intel Compute Stick, and NVIDIA Jetson Nano) and benchmark their performance on the most popular Robotont use-case demos (e.g., webapp teleop, AR-tag steering, 2D mapping, and 3D mapping). The objective is to propose at least two different compute solutions: one that optimizes for cost and another that optimizes for performance. &lt;br /&gt;
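The benchmarking part can start from a common timing harness run on each candidate computer; the sketch below assumes each demo workload can be wrapped as a Python callable, which is an assumption of this illustration rather than a property of the existing demos.

```python
import time

def benchmark(task, repeats=5):
    """Run `task` several times and return mean wall-clock seconds per run.

    A generic harness: the actual workloads (teleop, AR-tag steering,
    2D/3D mapping) would be wrapped as callables on each candidate device.
    """
    times = []
    for _ in range(repeats):
        start = time.perf_counter()
        task()
        times.append(time.perf_counter() - start)
    return sum(times) / len(times)
```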
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT Lite ===&lt;br /&gt;
The goal of this thesis is to optimize the ROBOTONT platform for cost by replacing the onboard computer and sensors with low-cost alternatives while ensuring ROS/ROS2 software compatibility. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: integrating a graphical programming interface ===&lt;br /&gt;
The goal of this thesis is to integrate a graphical programming solution (e.g., Scratch or Blockly) to enable programming of Robotont by non-experts.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== SemuBOT: multiple topics ===&lt;br /&gt;
In 2023/2024, many topics are offered to support the development of the open-source humanoid robot project SemuBOT (https://www.facebook.com/semubotmtu/)&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Robotic Study Companion: a social robot for students in higher education ===&lt;br /&gt;
Potential Topics:&lt;br /&gt;
* Enhancing the RSC Speech/NLP Capabilities&lt;br /&gt;
* Building the RSC Human-Robot Multimodal Interaction&lt;br /&gt;
* Developing the RSC Personalities/Behaviour&lt;br /&gt;
* Creating a WebApp for Monitoring Student Performance via the RSC&lt;br /&gt;
* Exploring the RSC use as an Affective Robot with a focus on Students’ Academic Emotions&lt;br /&gt;
[https://github.com/Farnaz03/RoboticStudyCompanion Github] | reach out to farnaz.baksh@ut.ee for more info&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: Quick launch demo suite and the final complete release of its ROS1 software ===&lt;br /&gt;
The goal of this thesis is to refine existing Robotont demos (e.g., AR-tag steering, follow-the-leader, dancing-with-robot, and LEAP-based control) and package them in an easy-to-use way for quick deployment by anyone during public events such as science popularization workshops and school visits.&lt;br /&gt;
The results of this work will be packaged as the final ROS1 release of Robotont software, as the EOL for ROS1 is in May 2025.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: ROS support, demos, and educational materials for open-source mobile robot ===&lt;br /&gt;
[[Image:RosLarge.png|left|100px|ROS]]&lt;br /&gt;
The project involves many potential thesis topics on the open-source robot platform ROBOTONT. The nature of the thesis can be software development to improve the platform's capabilities, simulation of specific scenarios, and/or demonstration of ROBOTONT in a real-life setting. A more detailed thesis topic will be outlined during an in-person meeting.&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
[[Image:Ros_equation.png|x100px|What is ROS?]]&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Virtual reality user interface (VRUI) for intuitive teleoperation system ===&lt;br /&gt;
[[Image:LeapMotion.png|200px|thumb|right|Detecting 2 hands with Leap Motion Controller]]&lt;br /&gt;
Enhancing the user-experience of a virtual reality UI developed by [https://github.com/ut-ims-robotics/vrui_rviz Georg Astok]. Potentially adding [http://www.osvr.org/hardware-devs.html virtual reality capability] to a [https://www.youtube.com/watch?v=L25HHFd00rc gesture- and natural-language-based robot teleoperation system].&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
[[Image:Temoto-working-principle-in-pics.png|x160px|Gesture-based teleoperation]]&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Health monitor for intuitive telerobot ===&lt;br /&gt;
Intelligent status and error handling for an intuitive telerobotic system.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== 3D scanning of industrial objects ===&lt;br /&gt;
Using laser sensors and cameras to create accurate models of industrial products for quality control or further processing.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Modeling humans for human-robot interaction ===&lt;br /&gt;
True human-robot collaboration means that the robot must understand the actions, intention, and state of its human partner. This work involves using cameras and other human-sensing devices to digitally represent and model humans. There are multiple levels of modeling: a) physical models of human kinematics and dynamics; b) higher-level models for recognizing human intent.&amp;lt;br&amp;gt;&lt;br /&gt;
[[Image:Skeletal ROS.PNG|x160px|ROS &amp;amp; Kinect &amp;amp; Skeleton-Markers Package]]&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Robotic avatar for telepresence ===&lt;br /&gt;
Integrating hand gestures and head movements to control a robot avatar in virtual reality user interface.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Detection of hardware and software resources for smart integration of robots ===&lt;br /&gt;
The vast majority of today’s robotic applications rely on hard-coded device and algorithm usage. This project focuses on developing Resource Snooper software that can detect the addition or removal of resources for the benefit of dynamic reconfiguration of robotic systems. This project is developed as a subsystem of [https://utnuclearroboticspublic.github.io/temoto2/index.html TeMoto].&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Sonification of feedback during teleoperation of robots ===&lt;br /&gt;
Humans are used to receiving auditory feedback in their everyday lives. It helps us make decisions and be aware of potential dangers. Telerobotic interfaces can deploy the same idea to improve situational awareness and robotic task efficiency. The thesis project involves a study of different sonification solutions and the implementation of one in a telerobotic application using ROS. &lt;br /&gt;
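One common sonification choice is to map obstacle distance to tone pitch. The mapping below is a hedged sketch with illustrative ranges and frequencies, not values taken from any study.

```python
def distance_to_pitch(distance_m, min_d=0.2, max_d=3.0, f_low=220.0, f_high=880.0):
    """Map obstacle distance (metres) to a warning-tone frequency (Hz).

    Closer obstacles give a higher pitch; the distance band and the
    two-octave frequency range are illustrative choices.
    """
    d = min(max(distance_m, min_d), max_d)  # clamp to the mapped band
    ratio = (max_d - d) / (max_d - min_d)   # 0 when far, 1 when near
    return f_low + ratio * (f_high - f_low)
```

A ROS node would feed this function from a range sensor topic and hand the frequency to an audio output.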
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Human-Robot and Robot-Robot collaboration applications ===&lt;br /&gt;
Creating a demo or analysis of software capabilities related to human-robot or robot-robot teams:&lt;br /&gt;
* human-robot collaborative assembly&lt;br /&gt;
* distributed mapping; analysis and demo of existing ROS (e.g., segmap https://youtu.be/JJhEkIA1xSE) packages for multi-robot mapping&lt;br /&gt;
* Inaccessible region teamwork&lt;br /&gt;
**youbot+drone - a drone maps the environment (for example, a maze) and a ground vehicle uses this information to traverse the maze&lt;br /&gt;
**youbot+robotont - youbot cannot go up ledges, but it can lift a smaller robot, such as robotont, up a ledge.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Mirroring human hand movements on industrial robots ===&lt;br /&gt;
The goal of this project is to integrate continuous control of an industrial robot manipulator with a gestural telerobotics interface. The recommended tools for this thesis project are a Leap Motion Controller or a standard web camera, Ultraleap, a Universal Robot UR5 manipulator, and ROS.&lt;br /&gt;
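A minimal sketch of the mapping from the hand tracker's frame to robot tool coordinates, with a workspace clamp for safety. The scale, offset, and limits are illustrative; real values would come from calibrating the tracker against the UR5 base frame.

```python
def palm_to_robot_target(palm_mm,
                         scale=0.001,
                         offset=(0.4, 0.0, 0.3),
                         limits=((0.2, 0.6), (-0.3, 0.3), (0.1, 0.5))):
    """Map a palm position in millimetres (tracker frame) to a robot tool
    target in metres, clamped to a safe workspace box.
    """
    target = []
    for p, off, (lo, hi) in zip(palm_mm, offset, limits):
        x = off + p * scale          # scale mm -> m and shift into robot frame
        target.append(min(max(x, lo), hi))  # keep the target inside the box
    return tuple(target)
```

The resulting target pose would then be streamed to the manipulator's motion controller at the tracker's update rate.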
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== ROBOTONT: TeMoto for robotont ===&lt;br /&gt;
Swarm-management and UMRF-based task loading for robotont using [https://github.com/temoto-framework TeMoto] framework.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Enhancing teleoperation control interface with augmented cues to provoke caution ===&lt;br /&gt;
The task is to create a telerobot control interface where the video feed from the remote site and/or a mixed-reality scene is augmented with visual cues to provoke caution in the human operator.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Robot-to-human interaction ===&lt;br /&gt;
As robots and autonomous machines start sharing the same space as humans, their actions need to be understood by the people occupying that space. For instance, a human worker needs to understand what a robot partner is planning next, and a pedestrian needs to clearly comprehend the behaviour of a driverless vehicle. To reduce this ambiguity, the robot needs mechanisms to convey its intent (i.e., what it is going to do next). The aim of the thesis is to outline existing methods for machines to convey their intent and to develop a unified model and interface for expressing that intent. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Gaze-based handover prediction ===&lt;br /&gt;
When a human needs to pass an object to a robot manipulator, the robot must understand where in 3D space the handover occurs and then plan an appropriate motion. Human gaze can be used as the input for predicting which object to track. The thesis activities involve camera-based eye tracking and safe motion planning.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Real-world demonstrator for MIR+UR+TeMoto integration  ===&lt;br /&gt;
Integration of mobile manipulator (MIR100 + UR5e + Robotiq gripper) to demonstrate TeMoto in a collaborative application.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== ROBOTONT: Human-height human-robot interface for Robotont ground robot ===&lt;br /&gt;
Robotont is an ankle-high flat mobile robot. For humans to interact with Robotont, there is a need for a compact and lightweight mechanical structure that is tall enough for comfortable human-robot interaction. The objective of the thesis is to develop the mechanical designs and build prototypes that ensure stable operation and meet the aesthetic requirements for use in public places. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Stratos Explore Ultraleap demonstrator for robotics ===&lt;br /&gt;
The aim of this thesis is to systematically analyze the strengths and limitations of the Stratos Explore Ultraleap device in the context of controlling a robot, and subsequently to implement a demonstrator application that shows its applicability for robot control.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Mixed-reality scene creation for vehicle teleoperation ===&lt;br /&gt;
Fusing different sensory feeds to create a high-usability teleoperation scene.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Validation study for AR-based robot user-interfaces ===&lt;br /&gt;
Designing and carrying out a user study to validate the functionality and usability of a human-robot interface.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Replication of the MIT Hydra demo ===&lt;br /&gt;
The goal of the thesis is to use the Hydra software package and integrate it for a robot used at IMS Robotics (e.g., TIAGo, Jackal, Robotont).&lt;br /&gt;
&amp;lt;br&amp;gt;Hydra takes sensor data as input (stereo or RGB-D camera images and IMU data) and produces a hierarchical model of the environment by estimating the trajectory of the robot (including loop closures), building a metric-semantic 3D mesh, and segmenting objects, places, and rooms in an indoor environment. These representations are combined into a 3D scene graph, which enables novel approaches for hierarchical loop closure detection and ensures the representations remain consistent after loop closure. Hydra is implemented in C++ and is ROS-compatible. It uses a neural-network-based image segmentation front-end, but otherwise relies on an efficient, multi-threaded CPU-based implementation, which is suitable for mobile robot deployment.&lt;br /&gt;
&amp;lt;br&amp;gt;LINKS:&lt;br /&gt;
&amp;lt;br&amp;gt;Video: https://youtu.be/qZg2lSeTuvM&lt;br /&gt;
&amp;lt;br&amp;gt;Code: https://github.com/MIT-SPARK/Hydra&lt;br /&gt;
&amp;lt;br&amp;gt;Paper: http://www.roboticsproceedings.org/rss18/p050.pdf&lt;br /&gt;
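&amp;lt;br&amp;gt;The layered representation described above can be illustrated with a minimal scene-graph sketch in Python. The class and layer names below are illustrative only and do not reflect Hydra's actual C++ API:&lt;br /&gt;

```python
from dataclasses import dataclass, field

@dataclass
class SceneNode:
    """One node in a layered 3D scene graph (building, room, place, or object)."""
    node_id: str
    layer: str
    children: list = field(default_factory=list)

    def add(self, child):
        """Attach a child node and return it, for fluent graph building."""
        self.children.append(child)
        return child

    def descendants(self, layer):
        """Yield every node of the given layer anywhere beneath this node."""
        for c in self.children:
            if c.layer == layer:
                yield c
            yield from c.descendants(layer)

# A toy indoor scene: one building, one room, two places, one object.
building = SceneNode("building", "building")
kitchen = building.add(SceneNode("kitchen", "room"))
counter = kitchen.add(SceneNode("counter", "place"))
sink = kitchen.add(SceneNode("sink", "place"))
counter.add(SceneNode("mug", "object"))
```

In Hydra, the analogous structure additionally stores metric information (mesh geometry, positions) at every layer; this sketch captures only the parent-child hierarchy that makes queries like "all objects in this room" cheap.&lt;br /&gt;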
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Quantitative Evaluation of the Situation Awareness during the Teleoperation of an Urban Vehicle ===&lt;br /&gt;
The goal of this thesis is to develop the methodology for measuring the operator's situation awareness while operating an autonomous urban vehicle. Potential metrics could include driving accuracy, speed, and latency. This thesis will be conducted as part of the activities at the [https://adl.cs.ut.ee Autonomous Driving Lab].&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== NAO 'flippin' ===&lt;br /&gt;
The University of Tartu has a set of [https://www.aldebaran.com/en/nao NAO robots], which have been lying on shelves for some years now. It is time to make these robots into functional social robots that can be used, e.g., for children's communication therapy or as buddies for the elderly. The goal of this thesis is to test and upgrade all the NAO robots to functional condition.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
= Completed projects =&lt;br /&gt;
== PhD theses ==&lt;br /&gt;
*Houman Masnavi, [https://hdl.handle.net/10062/91394 Visibility aware navigation] [Nähtavust arvestav navigatsioon], PhD thesis, 2023&lt;br /&gt;
&lt;br /&gt;
== Master's theses ==&lt;br /&gt;
*Robert Allik, Validation of NoMaD as a Global Planner for Mobile Robots, MS thesis, 2024&lt;br /&gt;
*Rauno Põlluäär, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=79393 Designing and Implementing a Bird’s-eye View Interface for a Self-driving Vehicle’s Teleoperation System] [Isejuhtiva sõiduki kaugjuhtimissüsteemile linnuvaate kasutajaliidese loomine], MS thesis, 2024&lt;br /&gt;
*Eva Mõtshärg, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=77385 3D-prinditava kere disain ja analüüs vabavaralisele haridusrobotile Robotont] [Design and Analysis of a 3D Printable Chassis for the Open Source Educational Robot Robotont], MS thesis, 2023&lt;br /&gt;
*Farnaz Baksh, [https://dspace.ut.ee/items/a7a9cc15-27e9-450c-94e8-3267f0c95c56 An Open-source Robotic Study Companion for University Students] [Avatud lähtekoodiga robotõpikaaslane üliõpilastele], MS thesis, 2023&lt;br /&gt;
*Igor Rybalskii, [http://hdl.handle.net/10062/83028 Augmented reality (AR) for enabling human-robot collaboration with ROS robots] [Liitreaalsus inimese ja roboti koostöö võimaldamiseks ROS-i robotitega], MS thesis, 2022&lt;br /&gt;
*Md. Maniruzzaman, [http://hdl.handle.net/10062/83025 Object search and retrieval in indoor environment using a Mobile Manipulator] [Objektide otsimine ja teisaldamine siseruumides mobiilse manipulaatori abil], MS thesis, 2022&lt;br /&gt;
*Allan Kustavus, [http://hdl.handle.net/10062/72651 Design and Implementation of a Generalized Resource Management Architecture in the TeMoto Software Framework] [Üldise ressursihalduri disain ja teostus TeMoto tarkvara raamistikule], MS thesis, 2021&lt;br /&gt;
*Kristina Meister, [http://hdl.handle.net/10062/72350 External human-vehicle interaction - a study in the context of an autonomous ride-hailing service], MS thesis, 2021&lt;br /&gt;
*Muhammad Usman, [http://hdl.handle.net/10062/72126 Development of an Optimization-Based Motion Planner and Its ROS Interface for a Non-Holonomic Mobile Manipulator] [Optimeerimisele baseeruva liikumisplaneerija arendamine ja selle ROSi liides mitteholonoomse mobiilse manipulaatori jaoks], MS thesis, 2020&lt;br /&gt;
*Maarika Oidekivi, [http://hdl.handle.net/10062/72119 Masina kavatsuse väljendamine ja tõlgendamine] [Communicating and interpreting machine intent], MS thesis, 2020&lt;br /&gt;
*Houman Masnavi, [http://hdl.handle.net/10062/72118 Multi-Robot Motion Planning for Shared Payload Transportation] [Rajaplaneerimine multi-robot süsteemile jagatud lasti transportimisel], MS thesis, 2020&lt;br /&gt;
*Fabian Ernesto Parra Gil, [http://hdl.handle.net/10062/72112 Implementation of Robot Manager Subsystem for Temoto Software Framework] [Robotite Halduri alamsüsteemi väljatöötamine tarkvararaamistikule TEMOTO], MS thesis, 2020&lt;br /&gt;
*Zafarullah, [http://hdl.handle.net/10062/72125 Gaze Assisted Neural Network based Prediction of End-Point of Human Reaching Trajectories], MS thesis, 2020&lt;br /&gt;
*Madis K Nigol, [http://hdl.handle.net/10062/64339 Õppematerjalid robotplatvormile Robotont] [Study materials for robot platform Robotont], MS thesis, 2019&lt;br /&gt;
*Renno Raudmäe, [http://hdl.handle.net/10062/64341 Avatud robotplatvorm Robotont] [Open source robotics platform Robotont], MS thesis, 2019&lt;br /&gt;
*Asif Sattar, [http://hdl.handle.net/10062/64352 Human detection and distance estimation with monocular camera using YOLOv3 neural network] [Inimeste tuvastamine ning kauguse hindamine kasutades kaamerat ning YOLOv3 tehisnärvivõrku], MS thesis, 2019&lt;br /&gt;
*Ragnar Margus, [http://hdl.handle.net/10062/64337 Kergliiklusvahendite jagamisteenuseks vajaliku positsioneerimismooduli loomine ja uurimine] [Development and testing of an IoT module for an electric vehicle sharing service], MS thesis, 2019&lt;br /&gt;
*Pavel Šumejko, [http://hdl.handle.net/10062/64320 Robust Solution for Extrinsic Calibration of a 2D Laser-Rangefinder and a Monocular USB Camera] [Meetod 2D laserkaugusmõõdiku ja USB-kaamera väliseks kalibreerimiseks], MS thesis, 2019&lt;br /&gt;
*Dzvezdana Arsovska, [http://hdl.handle.net/10062/64321  Building an Efficient and Secure Software Supply Pipeline for Aerial Robotics Application] [Efektiivne ja turvaline tarkvaraarendusahel lennurobootika rakenduses], MS thesis, 2019&lt;br /&gt;
*Tõnis Tiimus, [http://hdl.handle.net/10062/60713 Visuaalselt esilekutsutud potentsiaalidel põhinev aju-arvuti liides robootika rakendustes] [A VEP-based BCI for robotics applications], MS thesis, 2018&lt;br /&gt;
*Martin Appo, [http://hdl.handle.net/10062/60718 Hardware-agnostic compliant control ROS package for collaborative industrial manipulators] [Riistvarapaindlik ROSi tarkvarapakett tööstuslike robotite mööndlikuks juhtimiseks], MS thesis, 2018&lt;br /&gt;
*Hassan Mahmoud Shehawy Elhanash, [http://hdl.handle.net/10062/60711 Optical Tracking of Forearm for Classifying Fingers Poses] [Küünarvarre visuaalne jälgimine ennustamaks sõrmede asendeid], MS thesis, 2018&lt;br /&gt;
&lt;br /&gt;
== Bachelor's theses ==&lt;br /&gt;
*Albert Unn, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=79937 Suhtlusvõimekuse arendamine sotsiaalsele humanoidrobotile SemuBot] [Developing the ability to communicate for SemuBot, a social humanoid robot], BS thesis, 2024&lt;br /&gt;
*Veronika-Marina Volynets, [https://hdl.handle.net/10062/99943 Development of Control Electronics and Program for Robotont's Height Adjustment Mechanism] [Juhtelektroonika ja programmi väljatöötamine Robotondi kõrguse reguleerimise mehhanismile], BS thesis, 2024&lt;br /&gt;
*Nikita Kurenkov, [https://hdl.handle.net/10062/99880 Flexible Screen Integration and Development of Neck Movement Mechanism for Social Humanoid Robot SemuBot] [Humanoidroboti näo lahenduse leidmine ja rakendamine; humanoidroboti jaoks kaela mehhanismi väljatöötamine], BS thesis, 2024&lt;br /&gt;
*Elchin Huseynov, [https://hdl.handle.net/10062/99668 Design and Control of a Social Humanoid Robot - SemuBot’s Hand] [Sotsiaalse Humanoidroboti Disain ja Juhtimine - SemuBoti Käsi], BS thesis, 2024&lt;br /&gt;
*Veronika Podliesnova, [https://hdl.handle.net/10062/99636 Real-Time Detection of Robot Failures by Monitoring Operator’s Brain Activity with EEG-based Brain-Computer Interface] [Reaalajas Robotirikke Tuvastamine Operaatori Ajutegevuse Jälgimise teel EEG-põhise Aju-Arvuti Liidese abil], BS thesis, 2024&lt;br /&gt;
*Märten Josh Peedimaa, [https://hdl.handle.net/not-yet Rhea: An Open-Source Table Tennis Ball Launcher Robot for Multiball Training] [Avatud lähtekoodiga lauatennise palliviske robot mitmikpall treenimiseks], BS thesis, 2024&lt;br /&gt;
*Karl Sander Vinkel, [https://hdl.handle.net/not-yet Laadimisjaama ja transportkesta väljatöötamine õpperobotile Robotont] [Development of charging dock and transport case for Robotont], BS thesis, 2024&lt;br /&gt;
*Raimo Köidam, [https://hdl.handle.net/not-yet Valguslahenduse tarkvara väljatöötamine õpperobotile Robotont] [Development of light solution software for the educational robot Robotont], BS thesis, 2024&lt;br /&gt;
*Robert Valge, [https://hdl.handle.net/not-yet Toitepinge ja tarbevoolu monitoorimine ning toitehalduse püsivara loomine õpperobotil Robotont] [Monitoring supply voltage and current consumption and creating firmware for Robotont’s power management system], BS thesis, 2024&lt;br /&gt;
*Leonid Tšigrinski, [https://hdl.handle.net/not-yet Õpperoboti Robotont püsivara arhitektuuri uuendamine] [Education robot &amp;quot;Robotont&amp;quot; firmware architecture updating], BS thesis, 2024&lt;br /&gt;
*Andres Sakk, [https://hdl.handle.net/not-yet Avatud robotplatvormi Robotont 3 kasutajaliidese väljatöötamine] [Development of a user interface for the open robotics platform Robotont 3], BS thesis, 2024&lt;br /&gt;
*Kaur Kullamäe, [https://hdl.handle.net/not-yet Elektroonikalahendus ja püsivara robotkäe juhtimiseks sotsiaalsel humanoidrobotil SemuBot] [Building the SemuBot arm electronics system], BS thesis, 2024&lt;br /&gt;
*Kristjan Madis Kask, [https://hdl.handle.net/not-yet Käe mehaanika disain sotsiaalsele humanoidrobotile SemuBot] [Arm mechanics design for social humanoid robot SemuBot], BS thesis, 2024&lt;br /&gt;
*Sven-Ervin Paap, [https://hdl.handle.net/not-yet ROS2 draiver õpperobotile Robotont] [ROS2 driver for the educational robot Robotont], BS thesis, 2024&lt;br /&gt;
*Timur Nizamov, [https://hdl.handle.net/10062/99592 Audio System for the Social Humanoid Robot SemuBot] [Helisüsteem sotsiaalsele humanoidrobotile SemuBot], BS thesis, 2024&lt;br /&gt;
*Georgs Narbuts, [https://hdl.handle.net/10062/99554 Holonomic Motion Drive System of a Social Humanoid Robot SemuBot] [Sotsiaalse humanoidroboti SemuBoti holonoomne liikuv ajamisüsteem], BS thesis, 2024&lt;br /&gt;
*Iryna Hurova, [https://hdl.handle.net/10062/90638 Kitting station of the learning factory] [Õppetehase komplekteerimisjaam], BS thesis, 2023&lt;br /&gt;
*Paola Avalos Conchas, [https://hdl.handle.net/10062/90641 Payload transportation system of a learning factory] [Õppetehase kasuliku koorma transpordisüsteem], BS thesis, 2023&lt;br /&gt;
*Pille Pärnalaas, [https://hdl.handle.net/10062/93431 Pöördpõik ajami arendus robotplatvormile Robotont] [Development of swerve drive for robotic platform Robotont], BS thesis, 2023&lt;br /&gt;
*Priit Rooden, [https://hdl.handle.net/10062/93432 Autonoomse laadimislahenduse väljatöötamine õpperobotile Robotont] [Development of an autonomous charging solution for the robot platform Robotont], BS thesis, 2023&lt;br /&gt;
*Marko Muro, [https://hdl.handle.net/10062/93429 Robotondi akulahenduse ning 12 V pingeregulaatori prototüüpimine] [Prototyping battery solution and 12 V voltage regulator for Robotont], BS thesis, 2023&lt;br /&gt;
*Danel Leppenen, [https://hdl.handle.net/10062/93427 Nav2 PYIF: Python-based Motion Planning for ROS 2 Navigation 2] [Nav 2 PYIF: Pythoni põhine liikumise planeerija ROS 2 Navigation 2-le], BS thesis, 2023&lt;br /&gt;
*Kertrud Geddily Küüt, [https://hdl.handle.net/10062/93424 Kõrgust reguleeriv mehhanism Robotondile] [Height adjusting mechanism for Robotont], BS thesis, 2023&lt;br /&gt;
*Ingvar Drikkit, [https://hdl.handle.net/10062/93421 Lisaseadmete võimekuse arendamine haridusrobotile Robotont] [Developing add-on device support for the educational robot Robotont], BS thesis, 2023&lt;br /&gt;
*Kristo Pool, [https://hdl.handle.net/10062/93430 MoveIt 2 õppematerjalid] [Learning materials for MoveIt 2], BS thesis, 2023&lt;br /&gt;
*Erki Veeväli, [https://hdl.handle.net/10062/93434 Development of a Continuous Teleoperation System for Urban Road Vehicle] [Linnasõiduki pideva kaugjuhtimissüsteemi arendus], BS thesis, 2023&lt;br /&gt;
*Aleksandra Doroshenko, [https://hdl.handle.net/10062/93420 Haiglates inimesi juhatava roboti disain] [Hospital guide robot design], BS thesis, 2023&lt;br /&gt;
*Hui Shi, [http://hdl.handle.net/10062/83043 Expanding the Open-source ROS Software Package opencv_apps with Dedicated Blob Detection Functionality] [Avatud lähtekoodiga ROS-i tarkvarakimbu opencv_apps laiendamine laigutuvasti funktsionaalsusega], BS thesis, 2022&lt;br /&gt;
*Dāvis Krūmiņš, [http://hdl.handle.net/10062/83040 Web-based learning and software development environment for remote access of ROS robots] [Veebipõhine õppe- ja tarkvaraarenduse keskkond ROS robotite juurdepääsuks kaugteel], BS thesis, 2022&lt;br /&gt;
*Anna Jakovleva, [http://hdl.handle.net/10062/83037 Roboquiz - an interactive human-robot game] [Roboquiz - interaktiivne inimese ja roboti mäng], BS thesis, 2022&lt;br /&gt;
*Kristjan Laht, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=74702&amp;amp;year=2022 Robot Localization with Fiducial Markers] [Roboti lokaliseerimine koordinaatmärkidega], BS thesis, 2022&lt;br /&gt;
*Hans Pärtel Pani, [http://hdl.handle.net/10062/83008 ROS draiver pehmerobootika haaratsile] [ROS Driver for Soft Robotic Gripper], BS thesis, 2022&lt;br /&gt;
*Markus Erik Sügis, [http://hdl.handle.net/10062/83015 Jagatud juhtimise põhimõttel realiseeritud robotite kaugjuhtimissüsteem] [A continuous teleoperating system based on shared control concept], BS thesis, 2022&lt;br /&gt;
*Taaniel Küla, [http://hdl.handle.net/10062/83007 ROS2 platvormile keevitusroboti tarkvara portotüübi loomine kasutades UR5 robot-manipulaatorit] [Welding robot software prototype for ROS2 using UR5 robot arm], BS thesis, 2022&lt;br /&gt;
*Rauno Põlluäär, [http://hdl.handle.net/10062/72672 Veebirakendus-põhine kasutajaliides avatud robotplatvormi Robotont juhtimiseks ja haldamiseks] [Web application-based user interface for controlling and managing open-source robotics platform Robotont], BS thesis, 2021&lt;br /&gt;
*Hendrik Olesk, [http://hdl.handle.net/10062/72664 Nägemisulatuses kaugjuhitava mobiilse robotmanipulaatori kasutajamugavuse tõstmine] [Improving the usability of a mobile manipulator robot for line-of-sight remote control], BS thesis, 2021&lt;br /&gt;
*Tarvi Tepandi, [http://hdl.handle.net/10062/72665 Segareaalsusel põhinev kasutajaliides mobiilse roboti kaugjuhtimiseks Microsoft HoloLens 2 vahendusel] [Mixed-reality user interface for teleoperating mobile robots with Microsoft HoloLens 2], BS thesis, 2021&lt;br /&gt;
*Rudolf Põldma, [http://hdl.handle.net/10062/72674 Tartu linna Narva maantee ringristmiku digikaksik] [Digital twin for Narva street roundabout in Tartu], BS thesis, 2021&lt;br /&gt;
*Kwasi Akuamoah Boateng, [http://hdl.handle.net/10062/72804 Digital Twin of a Teaching and Learning Robotics Lab] [Robotite õpetamise ja õppimise labori digitaalne kaksik], BS thesis, 2021&lt;br /&gt;
*Karina Sein, [http://hdl.handle.net/10062/72102 Eestikeelse kõnesünteesi võimaldamine robootika arendusplatvormil ROS] [Enabling Estonian speech synthesis on the Robot Operating System (ROS)], BS thesis, 2020&lt;br /&gt;
*Ranno Mäesepp, [http://hdl.handle.net/10062/72100 Takistuste vältimise lahendus õpperobotile Robotont] [Obstacle avoidance solution for educational robot platform Robotont], BS thesis, 2020&lt;br /&gt;
*Igor Rybalskii, [http://hdl.handle.net/10062/72060 Gesture Detection Software for Human-Robot Collaboration] [Žestituvastus tarkvara inimese ja roboti koostööks], BS thesis, 2020&lt;br /&gt;
*Meelis Pihlap, [http://hdl.handle.net/10062/64292 Mitme roboti koostöö funktsionaalsuste väljatöötamine tarkvararaamistikule TeMoto] [Multi-robot collaboration functionalities for robot software development framework TeMoto], BS thesis, 2019&lt;br /&gt;
*Kaarel Mark, [http://hdl.handle.net/10062/64290 Liitreaalsuse kasutamine tootmisprotsessis asukohtade määramisel] [Augmented reality for location determination in manufacturing], BS thesis, 2019&lt;br /&gt;
*Kätriin Julle, [http://hdl.handle.net/10062/64279 Roboti KUKA youBot riistvara ja ROS-tarkvara uuendamine] [Upgrading robot KUKA youBot’s hardware and ROS-software], BS thesis, 2019&lt;br /&gt;
*Georg Astok, [http://hdl.handle.net/10062/64274 Roboti juhtimine virtuaalreaalsuses kasutades ROS-raamistikku] [Creating virtual reality user interface using only ROS framework], BS thesis, 2019&lt;br /&gt;
*Martin Hallist, [http://hdl.handle.net/10062/64275 Robotipõhine kaugkohalolu käte liigutuste ülekandmiseks] [Teleoperation robot for arms motions], BS thesis, 2019&lt;br /&gt;
*Ahmed Hassan Helmy Mohamed, [http://hdl.handle.net/10062/63946 Software integration of autonomous robot system for mixing and serving drinks] [Jooke valmistava ja serveeriva robotsüsteemi tarkvaralahendus], BS thesis, 2019&lt;br /&gt;
*Kristo Allaje, [http://hdl.handle.net/10062/60294 Draiveripakett käte jälgimise seadme Leap Motion™ kontroller kasutamiseks robootika arendusplatvormil ROS] [Driver package for using Leap Motion™ controller on the robotics development platform ROS], BS thesis, 2018&lt;br /&gt;
*Martin Maidla, [http://hdl.handle.net/10062/60288 Avatud robotiarendusplatvormi Robotont omniliikumise ja odomeetria arendamine] [Omnimotion and odometry development for open robot development platform Robotont], BS thesis, 2018&lt;br /&gt;
*Raid Vellerind, [http://hdl.handle.net/10062/56559 Avatud robotiarendusplatvormi ROS võimekuse loomine Tartu Ülikooli Robotexi robootikaplatvormile] [ROS driver development for the University of Tartu’s Robotex robotics platform], BS thesis, 2017&lt;br /&gt;
&lt;br /&gt;
[[Category:Theses Topics]]&lt;/div&gt;</summary>
		<author><name>Karl</name></author>
	</entry>
	<entry>
		<id>https://ims.ut.ee/index.php?title=Theses_in_Robotics&amp;diff=39522</id>
		<title>Theses in Robotics</title>
		<link rel="alternate" type="text/html" href="https://ims.ut.ee/index.php?title=Theses_in_Robotics&amp;diff=39522"/>
		<updated>2024-09-04T06:47:33Z</updated>

		<summary type="html">&lt;p&gt;Karl: /* List of potential thesis topics */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:IMS-robotics]]&lt;br /&gt;
&amp;lt;div class=&amp;quot;toclimit-3&amp;quot;&amp;gt;__TOC__&amp;lt;/div&amp;gt; &lt;br /&gt;
= Projects in Advanced Robotics =&lt;br /&gt;
''The main objective of the following projects is to give students experience in working with advanced robotics technology. Our group is active in several R&amp;amp;D projects involving human-robot collaboration, intuitive teleoperation of robots, and autonomous navigation of unmanned mobile platforms. Our main software platforms are [http://www.ros.org/ Robot Operating System (ROS)] for developing software for advanced robot systems and [http://gazebosim.org/ Gazebo] for running realistic robotic simulations.''&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
For further information, contact [[User:Karl|Karl Kruusamäe]]. &lt;br /&gt;
&lt;br /&gt;
The following is not an exhaustive list of all available thesis/research topics.&lt;br /&gt;
&lt;br /&gt;
== Highlighted thesis topics for 2024/2025 study year ==&lt;br /&gt;
# [[#ROBOTONT: analysis of different options as on-board computers|ROBOTONT: analysis of different options as on-board computers]]&lt;br /&gt;
# [[#SemuBOT: multiple topics|SemuBOT: multiple topics]]&lt;br /&gt;
# [[#ROBOTONT: integrating a graphical programming interface|ROBOTONT: integrating a graphical programming interface]]&lt;br /&gt;
# [[#Robotic Study Companion: a social robot for students in higher education|Robotic Study Companion: a social robot for students in higher education]]&lt;br /&gt;
# [[#Quantitative Evaluation of the Situation Awareness during the Teleoperation of an Urban Vehicle|Quantitative Evaluation of the Situation Awareness during the Teleoperation of an Urban Vehicle]]&lt;br /&gt;
# [[#Mixed-reality scene creation for vehicle teleoperation|Mixed-reality scene creation for vehicle teleoperation]]&lt;br /&gt;
&lt;br /&gt;
== List of potential thesis topics ==&lt;br /&gt;
Our inventory includes but is not limited to:&lt;br /&gt;
&amp;lt;gallery mode=&amp;quot;packed&amp;quot;&amp;gt;&lt;br /&gt;
File:Youbot.png|120px|thumb|KUKA youBot&lt;br /&gt;
File:Ur5_left.png|120px|thumb|Universal Robot UR5&lt;br /&gt;
File:Franka_Emika_Panda.jpg|120px|thumb|Franka Emika Panda&lt;br /&gt;
File:Clearpath_Jackal.jpg|120px|thumb|Clearpath Jackal&lt;br /&gt;
File:Kinova_KG-3_Gripper.jpg|120px|thumb|Kinova 3-finger gripper&lt;br /&gt;
File:Xarm7.jpg|120px|thumb|UFACTORY xArm7&lt;br /&gt;
File:Robotont_banner.png|120px|thumb|robotont&lt;br /&gt;
File:Turtlebot3-waffle-pi.jpg|thumb|TurtleBot3&lt;br /&gt;
File:Parrot-bebop-2.jpg|thumb|Parrot Bebop 2&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== ROBOTONT: Docker-Driven ROS Environment Switching ===&lt;br /&gt;
This thesis focuses on integrating Docker containers to manage and switch between different ROS (Robot Operating System) environments on Robotont. Currently, ROS is installed natively on Robotont's Ubuntu-based onboard computer, limiting flexibility in system recovery and configuration switching. The goal is to develop a Docker-based solution that allows users to switch between ROS environments directly from Robotont’s existing low-level menu interface.&lt;br /&gt;
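A minimal sketch of what menu-triggered switching could look like, assuming the alternative ROS environments are packaged as Docker images (the image tags and the Python wrapper below are illustrative assumptions, not an existing Robotont implementation):&lt;br /&gt;

```python
import subprocess

# Hypothetical mapping from menu entry to Docker image tag; adjust to the
# images actually built for the robot.
ROS_IMAGES = {
    "noetic": "ros:noetic-robot",
    "humble": "ros:humble-ros-base",
}

def docker_run_command(distro, container_cmd="bash"):
    """Build the `docker run` invocation for the chosen ROS environment.

    --rm         : discard the container on exit, so switching leaves no state behind
    -it          : interactive terminal, as the low-level menu hands control to the user
    --net=host   : share the host network so ROS node discovery keeps working
    --privileged : expose the robot's USB/serial devices to the container
    """
    image = ROS_IMAGES[distro]
    return ["docker", "run", "--rm", "-it", "--net=host", "--privileged",
            image, container_cmd]

def switch_environment(distro):
    """Start the selected environment; returns when the container exits."""
    subprocess.run(docker_run_command(distro), check=True)
```

Here `--net=host` keeps ROS communication working across the container boundary, and `--rm` ensures that switching environments never accumulates stale container state on the onboard computer.&lt;br /&gt;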
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: designing and implementing a communication protocol for additional devices ===&lt;br /&gt;
Robotont currently includes a designated area on its front for attaching additional devices, such as an ultrasonic range finder, a servo motor connected to an Arduino, or a MikroBUS device. This thesis aims to design and implement a standard communication protocol that will enable seamless integration and control of these devices through the high-level ROS framework running on Robotont’s onboard computer.&lt;br /&gt;
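To make the design space concrete, a byte-level framing scheme of the kind such a protocol might use can be sketched as follows (the frame layout, sync byte, and XOR checksum are illustrative choices, not an agreed Robotont standard):&lt;br /&gt;

```python
SYNC = 0xAA  # frame start marker (illustrative choice)

def xor_checksum(data):
    """XOR of all bytes; a simple integrity check for short frames."""
    c = 0
    for b in data:
        c ^= b
    return c

def encode_frame(device_id, command, payload=b""):
    """Frame layout: SYNC | device_id | command | payload_len | payload | checksum."""
    body = bytes([SYNC, device_id, command, len(payload)]) + payload
    return body + bytes([xor_checksum(body[1:])])  # checksum covers all bytes after SYNC

def decode_frame(frame):
    """Return (device_id, command, payload); raise ValueError on a malformed frame."""
    if not (len(frame) >= 5 and frame[0] == SYNC):
        raise ValueError("bad frame header")
    if xor_checksum(frame[1:-1]) != frame[-1]:
        raise ValueError("checksum mismatch")
    device_id, command, length = frame[1], frame[2], frame[3]
    payload = frame[4:4 + length]
    if len(payload) != length:
        raise ValueError("truncated payload")
    return device_id, command, payload
```

A device-ID field like this lets one serial link multiplex several add-ons (range finder, servo, MikroBUS board), while the length and checksum fields let the ROS-side driver resynchronize after a corrupted frame.&lt;br /&gt;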
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: analysis of different options as on-board computers ===&lt;br /&gt;
Currently, ROBOTONT uses an Intel NUC as its onboard computer. The goal of this thesis is to validate Robotont's software stack on alternative compute devices (e.g., Raspberry Pi 4, Intel Compute Stick, and NVIDIA Jetson Nano) and benchmark their performance on the most popular Robotont use-case demos (e.g., webapp teleop, AR-tag steering, 2D mapping, and 3D mapping). The objective is to propose at least two different compute solutions: one that optimizes for cost and another that optimizes for performance. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT Lite ===&lt;br /&gt;
The goal of this thesis is to optimize the ROBOTONT platform for cost by replacing the onboard computer and sensor with low-cost alternatives while ensuring ROS/ROS2 software compatibility. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: integrating a graphical programming interface ===&lt;br /&gt;
The goal of this thesis is to integrate a graphical programming solution (e.g., Scratch or Blockly) to enable non-experts to program Robotont.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== SemuBOT: multiple topics ===&lt;br /&gt;
In 2023/2024, many topics are offered to support the development of the open-source humanoid robot project SemuBOT (https://www.facebook.com/semubotmtu/).&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Robotic Study Companion: a social robot for students in higher education ===&lt;br /&gt;
Potential Topics:&lt;br /&gt;
* Enhancing the RSC Speech/NLP Capabilities&lt;br /&gt;
* Build the RSC Human-Robot Multimodal Interaction&lt;br /&gt;
* Developing the RSC Personalities/Behaviour&lt;br /&gt;
* Create a WebApp for Monitoring Student Performance via the RSC&lt;br /&gt;
* Exploring the RSC use as an Affective Robot with a focus on Students’ Academic Emotions&lt;br /&gt;
[https://github.com/Farnaz03/RoboticStudyCompanion Github] | reach out to farnaz.baksh@ut.ee for more info&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: Quick launch demo suite and the final complete release of its ROS1 software ===&lt;br /&gt;
The goal of this thesis is to refine the existing Robotont demos (e.g., AR-tag steering, follow-the-leader, dancing-with-robot, LEAP-based control) and package them in an easy-to-use way for quick deployment by anyone during public events such as science popularization workshops and school visits.&lt;br /&gt;
The results of this work will be packaged as the final ROS1 release of the Robotont software, since the EOL for ROS1 is in May 2025.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: ROS support, demos, and educational materials for open-source mobile robot ===&lt;br /&gt;
[[Image:RosLarge.png|left|100px|ROS]]&lt;br /&gt;
The project involves many potential thesis topics on the open-source robot platform ROBOTONT. The nature of the thesis can be software development to improve the platform's capabilities, simulation of specific scenarios, and/or demonstration of ROBOTONT in a real-life setting. A more detailed thesis topic will be outlined during an in-person meeting.&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
[[Image:Ros_equation.png|x100px|What is ROS?]]&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Virtual reality user interface (VRUI) for intuitive teleoperation system ===&lt;br /&gt;
[[Image:LeapMotion.png|200px|thumb|right|Detecting 2 hands with Leap Motion Controller]]&lt;br /&gt;
Enhancing the user-experience of a virtual reality UI developed by [https://github.com/ut-ims-robotics/vrui_rviz Georg Astok]. Potentially adding [http://www.osvr.org/hardware-devs.html virtual reality capability] to a [https://www.youtube.com/watch?v=L25HHFd00rc gesture- and natural-language-based robot teleoperation system].&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
[[Image:Temoto-working-principle-in-pics.png|x160px|Gesture-based teleoperation]]&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Health monitor for intuitive telerobot ===&lt;br /&gt;
Intelligent status and error handling for an intuitive telerobotic system.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== 3D scanning of industrial objects ===&lt;br /&gt;
Using laser sensors and cameras to create accurate models of industrial products for quality control or further processing.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Modeling humans for human-robot interaction ===&lt;br /&gt;
True human-robot collaboration means that the robot must understand the actions, intentions, and state of its human partner. This work involves using cameras and other human-monitoring sensors to digitally represent and model humans. There are multiple stages of modeling: a) physical models of human kinematics and dynamics; b) higher-level models for recognizing human intent.&amp;lt;br&amp;gt;&lt;br /&gt;
[[Image:Skeletal ROS.PNG|x160px|ROS &amp;amp; Kinect &amp;amp; Skeleton-Markers Package]]&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Robotic avatar for telepresence ===&lt;br /&gt;
Integrating hand gestures and head movements to control a robot avatar through a virtual reality user interface.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Detection of hardware and software resources for smart integration of robots ===&lt;br /&gt;
The vast majority of today’s robotic applications rely on hard-coded device and algorithm usage. This project focuses on developing Resource Snooper software that can detect the addition or removal of resources, enabling dynamic reconfiguration of robotic systems. This project is developed as a subsystem of [https://utnuclearroboticspublic.github.io/temoto2/index.html TeMoto].&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Sonification of feedback during teleoperation of robots ===&lt;br /&gt;
Humans are used to receiving auditory feedback in their everyday lives. It helps us make decisions and be aware of potential dangers. Telerobotic interfaces can deploy the same idea to improve situational awareness and robotic task efficiency. The thesis project involves a study of different sonification solutions and their implementation in a telerobotic application using ROS. &lt;br /&gt;
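As a concrete illustration of one possible sonification mapping, obstacle distance could be translated into the pitch of a warning tone (the distance and frequency ranges below are arbitrary example values, not part of any existing implementation):&lt;br /&gt;

```python
def distance_to_pitch(distance_m, d_min=0.2, d_max=3.0, f_low=220.0, f_high=880.0):
    """Map obstacle distance to a warning-tone frequency: closer means higher pitch.

    The interpolation is done on the logarithmic (octave) scale, which matches
    how humans perceive pitch; distances are clamped to [d_min, d_max].
    """
    d = min(max(distance_m, d_min), d_max)
    nearness = (d_max - d) / (d_max - d_min)   # 0.0 at d_max, 1.0 at d_min
    return f_low * (f_high / f_low) ** nearness
```

In a ROS setting, such a function could run in a node that subscribes to range or costmap data and feeds the resulting frequency to an audio output; the study part of the thesis would compare mappings like this against alternatives (volume, tempo, spatialized audio).&lt;br /&gt;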
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Human-Robot and Robot-Robot collaboration applications ===&lt;br /&gt;
Creating a demo or analysis of software capabilities related to human-robot or robot-robot teams:&lt;br /&gt;
* human-robot collaborative assembly&lt;br /&gt;
* distributed mapping: analysis and demo of existing ROS packages for multi-robot mapping (e.g., SegMap, https://youtu.be/JJhEkIA1xSE)&lt;br /&gt;
* Inaccessible region teamwork&lt;br /&gt;
**youbot+drone - a drone maps the environment (for example, a maze) and a ground vehicle uses this information to traverse it&lt;br /&gt;
**youbot+robotont - the youBot cannot climb ledges, but it can lift a smaller robot, such as Robotont, up a ledge.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Mirroring human hand movements on industrial robots ===&lt;br /&gt;
The goal of this project is to integrate continuous control of an industrial robot manipulator with a gestural telerobotics interface. The recommended tools for this thesis project are an Ultraleap Leap Motion Controller or a standard web camera, a Universal Robots UR5 manipulator, and ROS.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== ROBOTONT: TeMoto for robotont ===&lt;br /&gt;
Swarm-management and UMRF-based task loading for robotont using [https://github.com/temoto-framework TeMoto] framework.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Enhancing teleoperation control interface with augmented cues to provoke caution ===&lt;br /&gt;
The task is to create a telerobot control interface where the video feed from the remote site and/or a mixed-reality scene is augmented with visual cues to provoke caution in the human operator.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Robot-to-human interaction ===&lt;br /&gt;
As robots and autonomous machines start sharing the same space as humans, their actions need to be understood by the people occupying that space. For instance, a human worker needs to understand what the robot partner is planning next, or a pedestrian needs to clearly comprehend the behaviour of a driverless vehicle. To reduce ambiguity, the robot needs mechanisms to convey its intent (whatever it is going to do next). The aim of the thesis is to outline existing methods for machines to convey their intent and to develop a unified model interface for expressing that intent.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Gaze-based handover prediction ===&lt;br /&gt;
When a human needs to pass an object to a robot manipulator, the robot must understand where in 3D space the object handover occurs and then plan an appropriate motion. Human gaze can be used as the input for predicting which object to track. The thesis activities involve camera-based eye tracking and safe motion planning.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Real-world demonstrator for MIR+UR+TeMoto integration  ===&lt;br /&gt;
Integration of mobile manipulator (MIR100 + UR5e + Robotiq gripper) to demonstrate TeMoto in a collaborative application.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== ROBOTONT: Human-height human-robot interface for Robotont ground robot ===&lt;br /&gt;
Robotont is an ankle-high flat mobile robot. For humans to interact with Robotont, there is a need for a compact and lightweight mechanical structure that is tall enough for comfortable human-robot interaction. The objective of the thesis is to develop the mechanical designs and build prototypes that ensure stable operation and meet the aesthetic requirements for use in public places. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Stratos Explore Ultraleap demonstrator for robotics ===&lt;br /&gt;
The aim of this thesis is to systematically analyze the strengths and limitations of the Stratos Explore device by Ultraleap in the context of controlling a robot, and subsequently to implement a demonstrator application showing its applicability for robot control.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Mixed-reality scene creation for vehicle teleoperation ===&lt;br /&gt;
Fusing different sensory feeds to create a high-usability teleoperation scene.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Validation study for AR-based robot user-interfaces ===&lt;br /&gt;
Designing and carrying out a user study to validate the functionality and usability of a human-robot interface.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Replication of the MIT Hydra demo ===&lt;br /&gt;
The goal of the thesis is to use the Hydra software package and integrate it for a robot used at IMS Robotics (e.g., TIAGo, Jackal, Robotont).&lt;br /&gt;
&amp;lt;br&amp;gt;Hydra takes sensor data as input (stereo or RGB-D camera images and IMU data) and produces a hierarchical model of the environment by estimating the trajectory of the robot (including loop closures), building a metric-semantic 3D mesh, and segmenting objects, places, and rooms in an indoor environment. These representations are combined into a 3D scene graph, which enables novel approaches for hierarchical loop closure detection and ensures the representations remain consistent after loop closure. Hydra is implemented in C++ and is ROS-compatible. It uses a neural-network-based image segmentation front-end, but otherwise relies on an efficient, multi-threaded CPU-based implementation suitable for mobile robot deployment.&lt;br /&gt;
&amp;lt;br&amp;gt;LINKS:&lt;br /&gt;
&amp;lt;br&amp;gt;Video: https://youtu.be/qZg2lSeTuvM&lt;br /&gt;
&amp;lt;br&amp;gt;Code: https://github.com/MIT-SPARK/Hydra&lt;br /&gt;
&amp;lt;br&amp;gt;Paper: http://www.roboticsproceedings.org/rss18/p050.pdf&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Quantitative Evaluation of the Situation Awareness during the Teleoperation of an Urban Vehicle ===&lt;br /&gt;
The goal of this thesis is to develop the methodology for measuring the operator's situation awareness while operating an autonomous urban vehicle. Potential metrics could include driving accuracy, speed, and latency. This thesis will be conducted as part of the activities at the [https://adl.cs.ut.ee Autonomous Driving Lab].&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== NAO 'flippin' ===&lt;br /&gt;
The University of Tartu has a set of [https://www.aldebaran.com/en/nao NAO robots], which have been lying on shelves for some years now. It is time to make these robots into functional social robots that can be used, e.g., for children's communication therapy or as buddies for the elderly. The goal of this thesis is to test and upgrade all the NAO robots to functional condition.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
= Completed projects =&lt;br /&gt;
== PhD theses ==&lt;br /&gt;
*Houman Masnavi, [https://hdl.handle.net/10062/91394 Visibility aware navigation] [Nähtavust arvestav navigatsioon], PhD thesis, 2023&lt;br /&gt;
&lt;br /&gt;
== Master's theses ==&lt;br /&gt;
*Robert Allik, Validation of NoMaD as a Global Planner for Mobile Robots, MS thesis, 2024&lt;br /&gt;
*Rauno Põlluäär, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=79393 Designing and Implementing a Bird’s-eye View Interface for a Self-driving Vehicle’s Teleoperation System] [Isejuhtiva sõiduki kaugjuhtimissüsteemile linnuvaate kasutajaliidese loomine], MS thesis, 2024&lt;br /&gt;
*Eva Mõtshärg, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=77385 3D-prinditava kere disain ja analüüs vabavaralisele haridusrobotile Robotont] [Design and Analysis of a 3D Printable Chassis for the Open Source Educational Robot Robotont], MS thesis, 2023&lt;br /&gt;
*Farnaz Baksh, [https://dspace.ut.ee/items/a7a9cc15-27e9-450c-94e8-3267f0c95c56 An Open-source Robotic Study Companion for University Students] [Avatud lähtekoodiga robotõpikaaslane üliõpilastele], MS thesis, 2023&lt;br /&gt;
*Igor Rybalskii, [http://hdl.handle.net/10062/83028 Augmented reality (AR) for enabling human-robot collaboration with ROS robots] [Liitreaalsus inimese ja roboti koostöö võimaldamiseks ROS-i robotitega], MS thesis, 2022&lt;br /&gt;
*Md. Maniruzzaman, [http://hdl.handle.net/10062/83025 Object search and retrieval in indoor environment using a Mobile Manipulator] [Objektide otsimine ja teisaldamine siseruumides mobiilse manipulaatori abil], MS thesis, 2022&lt;br /&gt;
*Allan Kustavus, [http://hdl.handle.net/10062/72651 Design and Implementation of a Generalized Resource Management Architecture in the TeMoto Software Framework] [Üldise ressursihalduri disain ja teostus TeMoto tarkvara raamistikule], MS thesis, 2021&lt;br /&gt;
*Kristina Meister, [http://hdl.handle.net/10062/72350 External human-vehicle interaction - a study in the context of an autonomous ride-hailing service], MS thesis, 2021&lt;br /&gt;
*Muhammad Usman, [http://hdl.handle.net/10062/72126 Development of an Optimization-Based Motion Planner and Its ROS Interface for a Non-Holonomic Mobile Manipulator] [Optimeerimisele baseeruva liikumisplaneerija arendamine ja selle ROSi liides mitteholonoomse mobiilse manipulaatori jaoks], MS thesis, 2020&lt;br /&gt;
*Maarika Oidekivi, [http://hdl.handle.net/10062/72119 Masina kavatsuse väljendamine ja tõlgendamine] [Communicating and interpreting machine intent], MS thesis, 2020&lt;br /&gt;
*Houman Masnavi, [http://hdl.handle.net/10062/72118 Multi-Robot Motion Planning for Shared Payload Transportation] [Rajaplaneerimine multi-robot süsteemile jagatud lasti transportimisel], MS thesis, 2020&lt;br /&gt;
*Fabian Ernesto Parra Gil, [http://hdl.handle.net/10062/72112 Implementation of Robot Manager Subsystem for Temoto Software Framework] [Robotite Halduri alamsüsteemi väljatöötamine tarkvararaamistikule TEMOTO], MS thesis, 2020&lt;br /&gt;
*Zafarullah, [http://hdl.handle.net/10062/72125 Gaze Assisted Neural Network based Prediction of End-Point of Human Reaching Trajectories], MS thesis, 2020&lt;br /&gt;
*Madis K Nigol, [http://hdl.handle.net/10062/64339 Õppematerjalid robotplatvormile Robotont] [Study materials for robot platform Robotont], MS thesis, 2019&lt;br /&gt;
*Renno Raudmäe, [http://hdl.handle.net/10062/64341 Avatud robotplatvorm Robotont] [Open source robotics platform Robotont], MS thesis, 2019&lt;br /&gt;
*Asif Sattar, [http://hdl.handle.net/10062/64352 Human detection and distance estimation with monocular camera using YOLOv3 neural network] [Inimeste tuvastamine ning kauguse hindamine kasutades kaamerat ning YOLOv3 tehisnärvivõrku], MS thesis, 2019&lt;br /&gt;
*Ragnar Margus, [http://hdl.handle.net/10062/64337 Kergliiklusvahendite jagamisteenuseks vajaliku positsioneerimismooduli loomine ja uurimine] [Development and testing of an IoT module for electric vehicle sharing service], MS thesis, 2019&lt;br /&gt;
*Pavel Šumejko, [http://hdl.handle.net/10062/64320 Robust Solution for Extrinsic Calibration of a 2D Laser-Rangefinder and a Monocular USB Camera] [Meetod 2D laserkaugusmõõdiku ja USB-kaamera väliseks kalibreerimiseks], MS thesis, 2019&lt;br /&gt;
*Dzvezdana Arsovska, [http://hdl.handle.net/10062/64321  Building an Efficient and Secure Software Supply Pipeline for Aerial Robotics Application] [Efektiivne ja turvaline tarkvaraarendusahel lennurobootika rakenduses], MS thesis, 2019&lt;br /&gt;
*Tõnis Tiimus, [http://hdl.handle.net/10062/60713 Visuaalselt esilekutsutud potentsiaalidel põhinev aju-arvuti liides robootika rakendustes] [A VEP-based BCI for robotics applications], MS thesis, 2018&lt;br /&gt;
*Martin Appo, [http://hdl.handle.net/10062/60718 Hardware-agnostic compliant control ROS package for collaborative industrial manipulators] [Riistvarapaindlik ROSi tarkvarapakett tööstuslike robotite mööndlikuks juhtimiseks], MS thesis, 2018&lt;br /&gt;
*Hassan Mahmoud Shehawy Elhanash, [http://hdl.handle.net/10062/60711 Optical Tracking of Forearm for Classifying Fingers Poses] [Küünarvarre visuaalne jälgimine ennustamaks sõrmede asendeid], MS thesis, 2018&lt;br /&gt;
&lt;br /&gt;
== Bachelor's theses ==&lt;br /&gt;
*Albert Unn, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=79937 Suhtlusvõimekuse arendamine sotsiaalsele humanoidrobotile SemuBot] [Developing the ability to communicate for SemuBot, a social humanoid robot], BS thesis, 2024&lt;br /&gt;
*Veronika-Marina Volynets, [https://hdl.handle.net/10062/99943 Development of Control Electronics and Program for Robotont's Height Adjustment Mechanism] [Juhtelektroonika ja programmi väljatöötamine Robotondi kõrguse reguleerimise mehhanismile], BS thesis, 2024&lt;br /&gt;
*Nikita Kurenkov, [https://hdl.handle.net/10062/99880 Flexible Screen Integration and Development of Neck Movement Mechanism for Social Humanoid Robot SemuBot] [Humanoidroboti näo lahenduse leidmine ja rakendamine; humanoidroboti jaoks kaela mehhanismi väljatöötamine], BS thesis, 2024&lt;br /&gt;
*Elchin Huseynov, [https://hdl.handle.net/10062/99668 Design and Control of a Social Humanoid Robot - SemuBot’s Hand] [Sotsiaalse Humanoidroboti Disain ja Juhtimine - SemuBoti Käsi], BS thesis, 2024&lt;br /&gt;
*Veronika Podliesnova, [https://hdl.handle.net/10062/99636 Real-Time Detection of Robot Failures by Monitoring Operator’s Brain Activity with EEG-based Brain-Computer Interface] [Reaalajas Robotirikke Tuvastamine Operaatori Ajutegevuse Jälgimise teel EEG-põhise Aju-Arvuti Liidese abil], BS thesis, 2024&lt;br /&gt;
*Märten Josh Peedimaa, [https://hdl.handle.net/not-yet Rhea: An Open-Source Table Tennis Ball Launcher Robot for Multiball Training] [Avatud lähtekoodiga lauatennise palliviske robot mitmikpall treenimiseks], BS thesis, 2024&lt;br /&gt;
*Karl Sander Vinkel, [https://hdl.handle.net/not-yet Laadimisjaama ja transportkesta väljatöötamine õpperobotile Robotont] [Development of charging dock and transport case for Robotont], BS thesis, 2024&lt;br /&gt;
*Raimo Köidam, [https://hdl.handle.net/not-yet Valguslahenduse tarkvara väljatöötamine õpperobotile Robotont] [Development of light solution software for the educational robot Robotont], BS thesis, 2024&lt;br /&gt;
*Robert Valge, [https://hdl.handle.net/not-yet Toitepinge ja tarbevoolu monitoorimine ning toitehalduse püsivara loomine õpperobotil Robotont] [Monitoring supply voltage and current consumption and creating firmware for Robotont’s power management system], BS thesis, 2024&lt;br /&gt;
*Leonid Tšigrinski, [https://hdl.handle.net/not-yet Õpperoboti Robotont püsivara arhitektuuri uuendamine] [Education robot &amp;quot;Robotont&amp;quot; firmware architecture updating], BS thesis, 2024&lt;br /&gt;
*Andres Sakk, [https://hdl.handle.net/not-yet Avatud robotplatvormi Robotont 3 kasutajaliidese väljatöötamine] [Development of a user interface for the open robotics platform Robotont 3], BS thesis, 2024&lt;br /&gt;
*Kaur Kullamäe, [https://hdl.handle.net/not-yet Elektroonikalahendus ja püsivara robotkäe juhtimiseks sotsiaalsel humanoidrobotil SemuBot] [Building the SemuBot arm electronics system], BS thesis, 2024&lt;br /&gt;
*Kristjan Madis Kask, [https://hdl.handle.net/not-yet Käe mehaanika disain sotsiaalsele humanoidrobotile SemuBot] [Arm mechanics design for social humanoid robot SemuBot], BS thesis, 2024&lt;br /&gt;
*Sven-Ervin Paap, [https://hdl.handle.net/not-yet ROS2 draiver õpperobotile Robotont] [ROS2 driver for the educational robot Robotont], BS thesis, 2024&lt;br /&gt;
*Timur Nizamov, [https://hdl.handle.net/10062/99592 Audio System for the Social Humanoid Robot SemuBot] [Helisüsteem sotsiaalsele humanoidrobotile SemuBot], BS thesis, 2024&lt;br /&gt;
*Georgs Narbuts, [https://hdl.handle.net/10062/99554 Holonomic Motion Drive System of a Social Humanoid Robot SemuBot] [Sotsiaalse humanoidroboti SemuBoti holonoomne liikuv ajamisüsteem], BS thesis, 2024&lt;br /&gt;
*Iryna Hurova, [https://hdl.handle.net/10062/90638 Kitting station of the learning factory] [Õppetehase komplekteerimisjaam], BS thesis, 2023&lt;br /&gt;
*Paola Avalos Conchas, [https://hdl.handle.net/10062/90641 Payload transportation system of a learning factory] [Õppetehase kasuliku koorma transpordisüsteem], BS thesis, 2023&lt;br /&gt;
*Pille Pärnalaas, [https://hdl.handle.net/10062/93431 Pöördpõik ajami arendus robotplatvormile Robotont] [Development of swerve drive for robotic platform Robotont], BS thesis, 2023&lt;br /&gt;
*Priit Rooden, [https://hdl.handle.net/10062/93432 Autonoomse laadimislahenduse väljatöötamine õpperobotile Robotont] [Development of an autonomous charging solution for the robot platform Robotont], BS thesis, 2023&lt;br /&gt;
*Marko Muro, [https://hdl.handle.net/10062/93429 Robotondi akulahenduse ning 12 V pingeregulaatori prototüüpimine] [Prototyping battery solution and 12 V voltage regulator for Robotont], BS thesis, 2023&lt;br /&gt;
*Danel Leppenen, [https://hdl.handle.net/10062/93427 Nav2 PYIF: Python-based Motion Planning for ROS 2 Navigation 2] [Nav 2 PYIF: Pythoni põhine liikumise planeerija ROS 2 Navigation 2-le], BS thesis, 2023&lt;br /&gt;
*Kertrud Geddily Küüt, [https://hdl.handle.net/10062/93424 Kõrgust reguleeriv mehhanism Robotondile] [Height adjusting mechanism for Robotont], BS thesis, 2023&lt;br /&gt;
*Ingvar Drikkit, [https://hdl.handle.net/10062/93421 Lisaseadmete võimekuse arendamine haridusrobotile Robotont] [Developing add-on device support for the educational robot Robotont], BS thesis, 2023&lt;br /&gt;
*Kristo Pool, [https://hdl.handle.net/10062/93430 MoveIt 2 õppematerjalid] [Learning materials for MoveIt 2], BS thesis, 2023&lt;br /&gt;
*Erki Veeväli, [https://hdl.handle.net/10062/93434 Development of a Continuous Teleoperation System for Urban Road Vehicle] [Linnasõiduki pideva kaugjuhtimissüsteemi arendus], BS thesis, 2023&lt;br /&gt;
*Aleksandra Doroshenko, [https://hdl.handle.net/10062/93420 Haiglates inimesi juhatava roboti disain] [Hospital guide robot design], BS thesis, 2023&lt;br /&gt;
*Hui Shi, [http://hdl.handle.net/10062/83043 Expanding the Open-source ROS Software Pack age opencv_apps with Dedicated Blob Detection Functionality] [Avatud lähtekoodiga ROS-i tarkvarakimbu opencv_apps laiendamine laigutuvasti funktsioonaalsusega], BS thesis, 2022&lt;br /&gt;
*Dāvis Krūmiņš, [http://hdl.handle.net/10062/83040 Web-based learning and software development environment for remote access of ROS robots] [Veebipõhine õppe- ja tarkvaraarenduse keskkond ROS robotite juurdepääsuks kaugteel], BS thesis, 2022&lt;br /&gt;
*Anna Jakovleva, [http://hdl.handle.net/10062/83037 Roboquiz - an interactive human-robot game] [Roboquiz - interaktiivne inimese ja roboti mäng], BS thesis, 2022&lt;br /&gt;
*Kristjan Laht, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=74702&amp;amp;year=2022 Robot Localization with Fiducial Markers] [Roboti lokaliseerimine koordinaatmärkidega], BS thesis, 2022&lt;br /&gt;
*Hans Pärtel Pani, [http://hdl.handle.net/10062/83008 ROS draiver pehmerobootika haaratsile] [ROS Driver for Soft Robotic Gripper], BS thesis, 2022&lt;br /&gt;
*Markus Erik Sügis, [http://hdl.handle.net/10062/83015 Jagatud juhtimise põhimõttel realiseeritud robotite kaugjuhtimissüsteem] [A continuous teleoperating system based on shared control concept], BS thesis, 2022&lt;br /&gt;
*Taaniel Küla, [http://hdl.handle.net/10062/83007 ROS2 platvormile keevitusroboti tarkvara portotüübi loomine kasutades UR5 robot-manipulaatorit] [Welding robot software prototype for ROS2 using UR5 robot arm], BS thesis, 2022&lt;br /&gt;
*Rauno Põlluäär, [http://hdl.handle.net/10062/72672 Veebirakendus-põhine kasutajaliides avatud robotplatvormi Robotont juhtimiseks ja haldamiseks] [Web application-based user interface for controlling and managing open-source robotics platform Robotont], BS thesis, 2021&lt;br /&gt;
*Hendrik Olesk, [http://hdl.handle.net/10062/72664 Nägemisulatuses kaugjuhitava mobiilse robotmanipulaatori kasutajamugavuse tõstmine] [Improving the usability of a mobile manipulator robot for line-of-sight remote control], BS thesis, 2021&lt;br /&gt;
*Tarvi Tepandi, [http://hdl.handle.net/10062/72665 Segareaalsusel põhinev kasutajaliides mobiilse roboti kaugjuhtimiseks Microsoft HoloLens 2 vahendusel] [Mixed-reality user interface for teleoperating mobile robots with Microsoft HoloLens 2], BS thesis, 2021&lt;br /&gt;
*Rudolf Põldma, [http://hdl.handle.net/10062/72674 Tartu linna Narva maantee ringristmiku digikaksik] [Digital twin for Narva street roundabout in Tartu], BS thesis, 2021&lt;br /&gt;
*Kwasi Akuamoah Boateng, [http://hdl.handle.net/10062/72804 Digital Twin of a Teaching and Learning Robotics Lab] [Robotite õpetamise ja õppimise labori digitaalne kaksik], BS thesis, 2021&lt;br /&gt;
*Karina Sein, [http://hdl.handle.net/10062/72102 Eestikeelse kõnesünteesi võimaldamine robootika arendusplatvormil ROS] [Enabling Estonian speech synthesis on the Robot Operating System (ROS)], BS thesis, 2020&lt;br /&gt;
*Ranno Mäesepp, [http://hdl.handle.net/10062/72100 Takistuste vältimise lahendus õpperobotile Robotont] [Obstacle avoidance solution for educational robot platform Robotont], BS thesis, 2020&lt;br /&gt;
*Igor Rybalskii, [http://hdl.handle.net/10062/72060 Gesture Detection Software for Human-Robot Collaboration] [Žestituvastus tarkvara inimese ja roboti koostööks], BS thesis, 2020&lt;br /&gt;
*Meelis Pihlap, [http://hdl.handle.net/10062/64292 Mitme roboti koostöö funktsionaalsuste väljatöötamine tarkvararaamistikule TeMoto] [Multi-robot collaboration functionalities for robot software development framework TeMoto], BS thesis, 2019&lt;br /&gt;
*Kaarel Mark, [http://hdl.handle.net/10062/64290 Liitreaalsuse kasutamine tootmisprotsessis asukohtade määramisel] [Augmented reality for location determination in manufacturing], BS thesis, 2019&lt;br /&gt;
*Kätriin Julle, [http://hdl.handle.net/10062/64279 Roboti KUKA youBot riistvara ja ROS-tarkvara uuendamine] [Upgrading robot KUKA youBot’s hardware and ROS-software], BS thesis, 2019&lt;br /&gt;
*Georg Astok, [http://hdl.handle.net/10062/64274 Roboti juhtimine virtuaalreaalsuses kasutades ROS-raamistikku] [Creating virtual reality user interface using only ROS framework], BS thesis, 2019&lt;br /&gt;
*Martin Hallist, [http://hdl.handle.net/10062/64275 Robotipõhine kaugkohalolu käte liigutuste ülekandmiseks] [Teleoperation robot for arms motions], BS thesis, 2019&lt;br /&gt;
*Ahmed Hassan Helmy Mohamed, [http://hdl.handle.net/10062/63946 Software integration of autonomous robot system for mixing and serving drinks] [Jooke valmistava ja serveeriva robotsüsteemi tarkvaralahendus], BS thesis, 2019&lt;br /&gt;
*Kristo Allaje, [http://hdl.handle.net/10062/60294 Draiveripakett käte jälgimise seadme Leap Motion™ kontroller kasutamiseks robootika arendusplatvormil ROS] [Driver package for using Leap Motion™ controller on the robotics development platform ROS], BS thesis, 2018&lt;br /&gt;
*Martin Maidla, [http://hdl.handle.net/10062/60288 Avatud robotiarendusplatvormi Robotont omniliikumise ja odomeetria arendamine] [Omnimotion and odometry development for open robot development platform Robotont], BS thesis, 2018&lt;br /&gt;
*Raid Vellerind, [http://hdl.handle.net/10062/56559 Avatud robotiarendusplatvormi ROS võimekuse loomine Tartu Ülikooli Robotexi robootikaplatvormile] [ROS driver development for the University of Tartu’s Robotex robotics platform], BS thesis, 2017&lt;br /&gt;
&lt;br /&gt;
[[Category:Theses Topics]]&lt;/div&gt;</summary>
		<author><name>Karl</name></author>
	</entry>
	<entry>
		<id>https://ims.ut.ee/index.php?title=Theses_in_Robotics&amp;diff=39445</id>
		<title>Theses in Robotics</title>
		<link rel="alternate" type="text/html" href="https://ims.ut.ee/index.php?title=Theses_in_Robotics&amp;diff=39445"/>
		<updated>2024-08-28T06:59:20Z</updated>

		<summary type="html">&lt;p&gt;Karl: /* Projects in Advanced Robotics */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:IMS-robotics]]&lt;br /&gt;
&amp;lt;div class=&amp;quot;toclimit-3&amp;quot;&amp;gt;__TOC__&amp;lt;/div&amp;gt; &lt;br /&gt;
= Projects in Advanced Robotics =&lt;br /&gt;
''The main objective of the following projects is to give students experience in working with advanced robotics technology. Our group is active in several R&amp;amp;D projects involving human-robot collaboration, intuitive teleoperation of robots, and autonomous navigation of unmanned mobile platforms. Our main software platforms are [http://www.ros.org/ Robot Operating System (ROS)] for developing software for advanced robot systems and [http://gazebosim.org/ Gazebo] for running realistic robotic simulations.''&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
For further information, contact [[User:Karl|Karl Kruusamäe]]. &lt;br /&gt;
&lt;br /&gt;
The following is not an exhaustive list of all available thesis/research topics.&lt;br /&gt;
&lt;br /&gt;
== Highlighted theses topics for 2024/2025 study year ==&lt;br /&gt;
# [[#ROBOTONT: analysis of different options as on-board computers|ROBOTONT: analysis of different options as on-board computers]]&lt;br /&gt;
# [[#SemuBOT: multiple topics|SemuBOT: multiple topics]]&lt;br /&gt;
# [[#ROBOTONT: integrating a graphical programming interface|ROBOTONT: integrating a graphical programming interface]]&lt;br /&gt;
# [[#Robotic Study Companion: a social robot for students in higher education|Robotic Study Companion: a social robot for students in higher education]]&lt;br /&gt;
# [[#Quantitative Evaluation of the Situation Awareness during the Teleoperation of an Urban Vehicle|Quantitative Evaluation of the Situation Awareness during the Teleoperation of an Urban Vehicle]]&lt;br /&gt;
# [[#Mixed-reality scene creation for vehicle teleoperation|Mixed-reality scene creation for vehicle teleoperation]]&lt;br /&gt;
&lt;br /&gt;
== List of potential thesis topics ==&lt;br /&gt;
Our inventory includes but is not limited to:&lt;br /&gt;
&amp;lt;gallery mode=&amp;quot;packed&amp;quot;&amp;gt;&lt;br /&gt;
File:Youbot.png|120px|thumb|KUKA youBot&lt;br /&gt;
File:Ur5_left.png|120px|thumb|Universal Robot UR5&lt;br /&gt;
File:Franka_Emika_Panda.jpg|120px|thumb|Franka Emika Panda&lt;br /&gt;
File:Clearpath_Jackal.jpg|120px|thumb|Clearpath Jackal&lt;br /&gt;
File:Kinova_KG-3_Gripper.jpg|120px|thumb|Kinova 3-finger gripper&lt;br /&gt;
File:Xarm7.jpg|120px|thumb|UFACTORY xArm7&lt;br /&gt;
File:Robotont_banner.png|120px|thumb|robotont&lt;br /&gt;
File:Turtlebot3-waffle-pi.jpg|thumb|TurtleBot3&lt;br /&gt;
File:Parrot-bebop-2.jpg|thumb|Parrot Bebop 2&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== ROBOTONT: Docker-Driven ROS Environment Switching ===&lt;br /&gt;
This thesis focuses on integrating Docker containers to manage and switch between different ROS (Robot Operating System) environments on Robotont. Currently, ROS is installed natively on Robotont's Ubuntu-based onboard computer, limiting flexibility in system recovery and configuration switching. The goal is to develop a Docker-based solution that allows users to switch between ROS environments directly from Robotont’s existing low-level menu interface.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: designing and implementing a communication protocol for additional devices ===&lt;br /&gt;
Robotont currently includes a designated area on its front for attaching additional devices, such as an ultrasonic range finder, a servo motor connected to an Arduino, or a MikroBUS device. This thesis aims to design and implement a standard communication protocol that will enable seamless integration and control of these devices through the high-level ROS framework running on Robotont’s onboard computer.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: analysis of different options as on-board computers ===&lt;br /&gt;
Currently, ROBOTONT uses an Intel NUC as its onboard computer. The goal of this thesis is to validate Robotont's software stack on alternative compute devices (e.g. Raspberry Pi 4, Intel Compute Stick, and NVIDIA Jetson Nano) and benchmark their performance on the most popular Robotont use-case demos (e.g. webapp teleop, ar-tag steering, 2D mapping, and 3D mapping). The objective is to propose at least two different compute solutions: one that optimizes for cost and another that optimizes for performance.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT Lite ===&lt;br /&gt;
The goal of this thesis is to optimize the ROBOTONT platform for cost by replacing the onboard computer and sensors with low-cost alternatives while ensuring ROS/ROS2 software compatibility.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: integrating a graphical programming interface ===&lt;br /&gt;
The goal of this thesis is to integrate a graphical programming solution (e.g. Scratch or Blockly) to enable programming of Robotont by non-experts.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== SemuBOT: multiple topics ===&lt;br /&gt;
In the 2023/2024 study year, many topics are offered to support the development of the open-source humanoid robot project SemuBOT (https://www.facebook.com/semubotmtu/).&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Robotic Study Companion: a social robot for students in higher education ===&lt;br /&gt;
Potential Topics:&lt;br /&gt;
* Enhancing the RSC Speech/NLP Capabilities&lt;br /&gt;
* Build the RSC Human-Robot Multimodal Interaction&lt;br /&gt;
* Developing the RSC Personalities/Behaviour&lt;br /&gt;
* Create a WebApp for Monitoring Student Performance via the RSC&lt;br /&gt;
* Exploring the RSC use as an Affective Robot with a focus on Students’ Academic Emotions&lt;br /&gt;
[https://github.com/Farnaz03/RoboticStudyCompanion Github] | reach out to farnaz.baksh@ut.ee for more info&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: Quick launch demo suite and the final complete release of its ROS1 software ===&lt;br /&gt;
The goal of this thesis is to refine existing Robotont demos (e.g. ar-tag steering, follow-the-leader, dancing-with-robot, LEAP-based control) and package them in an easy-to-use way for quick deployment by anyone during public events such as science popularization workshops and school visits.&lt;br /&gt;
The results of this work will be packaged as the final ROS1 release of Robotont software, as the EOL for ROS1 is in May 2025.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: ROS support, demos, and educational materials for open-source mobile robot ===&lt;br /&gt;
[[Image:RosLarge.png|left|100px|ROS]]&lt;br /&gt;
The project involves many potential thesis topics on the open-source robot platform ROBOTONT. The nature of the thesis can be software development to improve the platform's capabilities, simulation of specific scenarios, and/or demonstration of ROBOTONT in a real-life setting. A more detailed thesis topic will be outlined during an in-person meeting.&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
[[Image:Ros_equation.png|x100px|What is ROS?]]&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Virtual reality user interface (VRUI) for intuitive teleoperation system ===&lt;br /&gt;
[[Image:LeapMotion.png|200px|thumb|right|Detecting 2 hands with Leap Motion Controller]]&lt;br /&gt;
Enhancing the user-experience of a virtual reality UI developed by [https://github.com/ut-ims-robotics/vrui_rviz Georg Astok]. Potentially adding [http://www.osvr.org/hardware-devs.html virtual reality capability] to a [https://www.youtube.com/watch?v=L25HHFd00rc gesture- and natural-language-based robot teleoperation system].&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
[[Image:Temoto-working-principle-in-pics.png|x160px|Gesture-based teleoperation]]&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Health monitor for intuitive telerobot ===&lt;br /&gt;
Intelligent status and error handling for an intuitive telerobotic system.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== 3D scanning of industrial objects ===&lt;br /&gt;
Using laser sensors and cameras to create accurate models of industrial products for quality control or further processing.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Modeling humans for human-robot interaction ===&lt;br /&gt;
True human-robot collaboration means that the robot must understand the actions, intentions, and state of its human partner. This work involves using cameras and other human-sensing devices to digitally represent and model humans. There are multiple stages of modeling: a) physical models of human kinematics and dynamics; b) higher-level models for recognizing human intent.&amp;lt;br&amp;gt;&lt;br /&gt;
[[Image:Skeletal ROS.PNG|x160px|ROS &amp;amp; Kinect &amp;amp; Skeleton-Markers Package]]&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Robotic avatar for telepresence ===&lt;br /&gt;
Integrating hand gestures and head movements to control a robot avatar in a virtual reality user interface.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Detection of hardware and software resources for smart integration of robots ===&lt;br /&gt;
The vast majority of today’s robotic applications rely on hard-coded devices and algorithms. This project focuses on developing Resource Snooper software that can detect the addition or removal of resources, enabling dynamic reconfiguration of robotic systems. This project is developed as a subsystem of [https://utnuclearroboticspublic.github.io/temoto2/index.html TeMoto].&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Sonification of feedback during teleoperation of robots ===&lt;br /&gt;
Humans are used to receiving auditory feedback in their everyday lives. It helps us make decisions and be aware of potential dangers. Telerobotic interfaces can deploy the same idea to improve situational awareness and robotic task efficiency. The thesis project involves a study of different sonification solutions and the implementation of one in a telerobotic application using ROS. &lt;br /&gt;
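As a starting point, sonification can be as simple as mapping a sensor value to a tone. The sketch below is illustrative only (the distance range and frequency band are assumptions, not part of any existing Robotont or TeMoto interface): it maps an obstacle distance reported by a telerobot to a warning-tone frequency, so that closer obstacles sound higher-pitched.

```python
# Illustrative sketch (mapping choice and ranges are assumptions):
# convert an obstacle distance from a telerobot into a beep frequency,
# so that closer obstacles produce higher-pitched warning tones.
def distance_to_frequency(distance_m: float,
                          min_d: float = 0.2, max_d: float = 3.0,
                          f_low: float = 220.0, f_high: float = 1760.0) -> float:
    """Linearly map distance in [min_d, max_d] to frequency [f_high, f_low]."""
    d = min(max(distance_m, min_d), max_d)   # clamp to the sensed range
    t = (d - min_d) / (max_d - min_d)        # 0.0 = very close, 1.0 = far away
    return f_high + t * (f_low - f_high)
```

In a ROS application, the output of such a function would drive an audio node; in a user study, alternative mappings (logarithmic pitch, pulse rate, stereo panning) could be compared against this linear baseline.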
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Human-Robot and Robot-Robot collaboration applications ===&lt;br /&gt;
Creating a demo or analysis of software capabilities related to human-robot or robot-robot teams:&lt;br /&gt;
* human-robot collaborative assembly&lt;br /&gt;
* distributed mapping; analysis and demo of existing ROS (e.g., segmap https://youtu.be/JJhEkIA1xSE) packages for multi-robot mapping&lt;br /&gt;
* Inaccessible region teamwork&lt;br /&gt;
**youbot+drone - a drone maps the environment (for example, a maze) and a ground vehicle uses this information to traverse the maze&lt;br /&gt;
**youbot+robotont - youbot cannot go up ledges, but it can lift a smaller robot, such as robotont, up a ledge.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Mirroring human hand movements on industrial robots ===&lt;br /&gt;
The goal of this project is to integrate continuous control of an industrial robot manipulator with a gestural telerobotics interface. The recommended tools for this thesis project are an Ultraleap Leap Motion Controller or a standard web camera, a Universal Robots UR5 manipulator, and ROS.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== ROBOTONT: TeMoto for robotont ===&lt;br /&gt;
Swarm-management and UMRF-based task loading for robotont using [https://github.com/temoto-framework TeMoto] framework.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Enhancing teleoperation control interface with augmented cues to provoke caution ===&lt;br /&gt;
The task is to create a telerobot control interface where the video feed from the remote site and/or a mixed-reality scene is augmented with visual cues to provoke caution in the human operator.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Robot-to-human interaction ===&lt;br /&gt;
As robots and autonomous machines start sharing the same space as humans, their actions need to be understood by the people occupying that space. For instance, a human worker needs to understand what the robot partner is planning next, or a pedestrian needs to clearly comprehend the behaviour of a driverless vehicle. To reduce ambiguity, the robot needs mechanisms to convey its intent (whatever it is going to do next). The aim of the thesis is to outline existing methods for machines to convey their intent and to develop a unified model interface for expressing that intent. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Gaze-based handover prediction ===&lt;br /&gt;
When a human needs to pass an object to a robot manipulator, the robot must understand where in 3D space the object handover occurs and then plan an appropriate motion. Human gaze can be used as the input for predicting which object to track. The thesis activities involve camera-based eye tracking and safe motion planning.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Real-world demonstrator for MIR+UR+TeMoto integration  ===&lt;br /&gt;
Integration of mobile manipulator (MIR100 + UR5e + Robotiq gripper) to demonstrate TeMoto in a collaborative application.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== ROBOTONT: Human-height human-robot interface for Robotont ground robot ===&lt;br /&gt;
Robotont is an ankle-high flat mobile robot. For humans to interact with Robotont, there is a need for a compact and lightweight mechanical structure that is tall enough for comfortable human-robot interaction. The objective of the thesis is to develop the mechanical designs and build prototypes that ensure stable operation and meet the aesthetic requirements for use in public places. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Stratos Explore Ultraleap demonstrator for robotics ===&lt;br /&gt;
The aim of this thesis is to systematically analyze the strengths and limitations of the Stratos Explore Ultraleap device in the context of controlling a robot, and subsequently to implement a demonstrator application showing its applicability for robot control. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Mixed-reality scene creation for vehicle teleoperation ===&lt;br /&gt;
Fusing different sensory feeds for creating high-usability teleoperation scene.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Validation study for AR-based robot user-interfaces ===&lt;br /&gt;
Designing and carrying out a user study to validate the functionality and usability of a human-robot interface.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Replication of the MIT Hydra demo ===&lt;br /&gt;
The goal of the thesis is to use the Hydra software package and integrate it for a robot used at IMS Robotics (e.g., TIAGo, Jackal, Robotont).&lt;br /&gt;
&amp;lt;br&amp;gt;Hydra takes sensor data as input (stereo or RGB-D camera images and IMU data) and produces a hierarchical model of the environment by estimating the trajectory of the robot (including loop closures), building a metric-semantic 3D mesh, and segmenting objects, places, and rooms in an indoor environment. These representations are combined into a 3D scene graph, which enables novel approaches for hierarchical loop closure detection and ensures the representations remain consistent after loop closure. Hydra is implemented in C++ and is ROS-compatible. It uses a neural-network-based image segmentation front-end, but otherwise relies on an efficient, multi-threaded CPU-based implementation suitable for mobile robot deployment.&lt;br /&gt;
&amp;lt;br&amp;gt;LINKS:&lt;br /&gt;
&amp;lt;br&amp;gt;Video: https://youtu.be/qZg2lSeTuvM&lt;br /&gt;
&amp;lt;br&amp;gt;Code: https://github.com/MIT-SPARK/Hydra&lt;br /&gt;
&amp;lt;br&amp;gt;Paper: http://www.roboticsproceedings.org/rss18/p050.pdf&lt;br /&gt;
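To make the layered representation above concrete, the sketch below shows a minimal scene-graph data structure in the spirit of the one Hydra builds. All names and fields here are illustrative assumptions for discussion, not Hydra's actual C++ API: nodes live on layers (objects, places, rooms) and parent-child edges link them across layers, so high-level queries like "what objects are in this room" become graph traversals.

```python
# Illustrative sketch (not Hydra's actual API): a layered 3D scene graph
# with objects, places, and rooms as layers, linked by parent-child edges.
from dataclasses import dataclass, field

@dataclass
class Node:
    layer: str       # "object", "place", or "room"
    name: str
    position: tuple  # (x, y, z) in the map frame
    children: list = field(default_factory=list)

def add_child(parent: Node, child: Node) -> None:
    parent.children.append(child)

def objects_in(room: Node) -> list:
    """Collect all object-layer descendants of a room node."""
    found = []
    stack = list(room.children)
    while stack:
        node = stack.pop()
        if node.layer == "object":
            found.append(node.name)
        stack.extend(node.children)
    return found

# Build a tiny graph: a room containing one place, which anchors two objects.
kitchen = Node("room", "kitchen", (0.0, 0.0, 0.0))
counter = Node("place", "counter", (1.2, 0.5, 0.0))
add_child(kitchen, counter)
add_child(counter, Node("object", "mug", (1.3, 0.5, 0.9)))
add_child(counter, Node("object", "kettle", (1.1, 0.6, 0.9)))

print(sorted(objects_in(kitchen)))  # ['kettle', 'mug']
```

In the real system such a graph is built and updated online from the metric-semantic mesh, and loop closures can rewire the edges, which is what makes the hierarchical consistency machinery in Hydra interesting.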
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Quantitative Evaluation of the Situation Awareness during the Teleoperation of an Urban Vehicle ===&lt;br /&gt;
The goal of this thesis is to develop the methodology for measuring the operator's situation awareness while operating an autonomous urban vehicle. Potential metrics could include driving accuracy, speed, and latency. This thesis will be conducted as part of the activities at the [https://adl.cs.ut.ee Autonomous Driving Lab].&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
= Completed projects =&lt;br /&gt;
== PhD theses ==&lt;br /&gt;
*Houman Masnavi, [https://hdl.handle.net/10062/91394 Visibility aware navigation] [Nähtavust arvestav navigatsioon], PhD thesis, 2023&lt;br /&gt;
&lt;br /&gt;
== Master's theses ==&lt;br /&gt;
*Robert Allik, Validation of NoMaD as a Global Planner for Mobile Robots, MS thesis, 2024&lt;br /&gt;
*Rauno Põlluäär, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=79393 Designing and Implementing a Bird’s-eye View Interface for a Self-driving Vehicle’s Teleoperation System] [Isejuhtiva sõiduki kaugjuhtimissüsteemile linnuvaate kasutajaliidese loomine], MS thesis, 2024&lt;br /&gt;
*Eva Mõtshärg, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=77385 3D-prinditava kere disain ja analüüs vabavaralisele haridusrobotile Robotont] [Design and Analysis of a 3D Printable Chassis for the Open Source Educational Robot Robotont], MS thesis, 2023&lt;br /&gt;
*Farnaz Baksh, [https://dspace.ut.ee/items/a7a9cc15-27e9-450c-94e8-3267f0c95c56 An Open-source Robotic Study Companion for University Students] [Avatud lähtekoodiga robotõpikaaslane üliõpilastele], MS thesis, 2023&lt;br /&gt;
*Igor Rybalskii, [http://hdl.handle.net/10062/83028 Augmented reality (AR) for enabling human-robot collaboration with ROS robots] [Liitreaalsus inimese ja roboti koostöö võimaldamiseks ROS-i robotitega], MS thesis, 2022&lt;br /&gt;
*Md. Maniruzzaman, [http://hdl.handle.net/10062/83025 Object search and retrieval in indoor environment using a Mobile Manipulator] [Objektide otsimine ja teisaldamine siseruumides mobiilse manipulaatori abil], MS thesis, 2022&lt;br /&gt;
*Allan Kustavus, [http://hdl.handle.net/10062/72651 Design and Implementation of a Generalized Resource Management Architecture in the TeMoto Software Framework] [Üldise ressursihalduri disain ja teostus TeMoto tarkvara raamistikule], MS thesis, 2021&lt;br /&gt;
*Kristina Meister, [http://hdl.handle.net/10062/72350 External human-vehicle interaction - a study in the context of an autonomous ride-hailing service], MS thesis, 2021&lt;br /&gt;
*Muhammad Usman, [http://hdl.handle.net/10062/72126 Development of an Optimization-Based Motion Planner and Its ROS Interface for a Non-Holonomic Mobile Manipulator] [Optimeerimisele baseeruva liikumisplaneerija arendamine ja selle ROSi liides mitteholonoomse mobiilse manipulaatori jaoks], MS thesis, 2020&lt;br /&gt;
*Maarika Oidekivi, [http://hdl.handle.net/10062/72119 Masina kavatsuse väljendamine ja tõlgendamine] [Communicating and interpreting machine intent], MS thesis, 2020&lt;br /&gt;
*Houman Masnavi, [http://hdl.handle.net/10062/72118 Multi-Robot Motion Planning for Shared Payload Transportation] [Rajaplaneerimine multi-robot süsteemile jagatud lasti transportimisel], MS thesis, 2020&lt;br /&gt;
*Fabian Ernesto Parra Gil, [http://hdl.handle.net/10062/72112 Implementation of Robot Manager Subsystem for Temoto Software Framework] [Robotite Halduri alamsüsteemi väljatöötamine tarkvararaamistikule TEMOTO], MS thesis, 2020&lt;br /&gt;
*Zafarullah, [http://hdl.handle.net/10062/72125 Gaze Assisted Neural Network based Prediction of End-Point of Human Reaching Trajectories], MS thesis, 2020&lt;br /&gt;
*Madis K Nigol, [http://hdl.handle.net/10062/64339 Õppematerjalid robotplatvormile Robotont] [Study materials for robot platform Robotont], MS thesis, 2019&lt;br /&gt;
*Renno Raudmäe, [http://hdl.handle.net/10062/64341 Avatud robotplatvorm Robotont] [Open source robotics platform Robotont], MS thesis, 2019&lt;br /&gt;
*Asif Sattar, [http://hdl.handle.net/10062/64352 Human detection and distance estimation with monocular camera using YOLOv3 neural network] [Inimeste tuvastamine ning kauguse hindamine kasutades kaamerat ning YOLOv3 tehisnärvivõrku], MS thesis, 2019&lt;br /&gt;
*Ragnar Margus, [http://hdl.handle.net/10062/64337 Kergliiklusvahendite jagamisteenuseks vajaliku positsioneerimismooduli loomine ja uurimine] [Development and testing of an IoT module for electric vehicle sharing service], MS thesis, 2019&lt;br /&gt;
*Pavel Šumejko, [http://hdl.handle.net/10062/64320 Robust Solution for Extrinsic Calibration of a 2D Laser-Rangefinder and a Monocular USB Camera] [Meetod 2D laserkaugusmõõdiku ja USB-kaamera väliseks kalibreerimiseks], MS thesis, 2019&lt;br /&gt;
*Dzvezdana Arsovska, [http://hdl.handle.net/10062/64321  Building an Efficient and Secure Software Supply Pipeline for Aerial Robotics Application] [Efektiivne ja turvaline tarkvaraarendusahel lennurobootika rakenduses], MS thesis, 2019&lt;br /&gt;
*Tõnis Tiimus, [http://hdl.handle.net/10062/60713 Visuaalselt esilekutsutud potentsiaalidel põhinev aju-arvuti liides robootika rakendustes] [A VEP-based BCI for robotics applications], MS thesis, 2018&lt;br /&gt;
*Martin Appo, [http://hdl.handle.net/10062/60718 Hardware-agnostic compliant control ROS package for collaborative industrial manipulators] [Riistvarapaindlik ROSi tarkvarapakett tööstuslike robotite mööndlikuks juhtimiseks], MS thesis, 2018&lt;br /&gt;
*Hassan Mahmoud Shehawy Elhanash, [http://hdl.handle.net/10062/60711 Optical Tracking of Forearm for Classifying Fingers Poses] [Küünarvarre visuaalne jälgimine ennustamaks sõrmede asendeid], MS thesis, 2018&lt;br /&gt;
&lt;br /&gt;
== Bachelor's theses ==&lt;br /&gt;
*Albert Unn, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=79937 Suhtlusvõimekuse arendamine sotsiaalsele humanoidrobotile SemuBot] [Developing the ability to communicate for SemuBot, a social humanoid robot], BS thesis, 2024&lt;br /&gt;
*Veronika-Marina Volynets, [https://hdl.handle.net/10062/99943 Development of Control Electronics and Program for Robotont's Height Adjustment Mechanism] [Juhtelektroonika ja programmi väljatöötamine Robotondi kõrguse reguleerimise mehhanismile], BS thesis, 2024&lt;br /&gt;
*Nikita Kurenkov, [https://hdl.handle.net/10062/99880 Flexible Screen Integration and Development of Neck Movement Mechanism for Social Humanoid Robot SemuBot] [Humanoidroboti näo lahenduse leidmine ja rakendamine; humanoidroboti jaoks kaela mehhanismi väljatöötamine], BS thesis, 2024&lt;br /&gt;
*Elchin Huseynov, [https://hdl.handle.net/10062/99668 Design and Control of a Social Humanoid Robot - SemuBot’s Hand] [Sotsiaalse Humanoidroboti Disain ja Juhtimine - SemuBoti Käsi], BS thesis, 2024&lt;br /&gt;
*Veronika Podliesnova, [https://hdl.handle.net/10062/99636 Real-Time Detection of Robot Failures by Monitoring Operator’s Brain Activity with EEG-based Brain-Computer Interface] [Reaalajas Robotirikke Tuvastamine Operaatori Ajutegevuse Jälgimise teel EEG-põhise Aju-Arvuti Liidese abil], BS thesis, 2024&lt;br /&gt;
*Märten Josh Peedimaa, [https://hdl.handle.net/not-yet Rhea: An Open-Source Table Tennis Ball Launcher Robot for Multiball Training] [Avatud lähtekoodiga lauatennise palliviske robot mitmikpall treenimiseks], BS thesis, 2024&lt;br /&gt;
*Karl Sander Vinkel, [https://hdl.handle.net/not-yet Laadimisjaama ja transportkesta väljatöötamine õpperobotile Robotont] [Development of charging dock and transport case for Robotont], BS thesis, 2024&lt;br /&gt;
*Raimo Köidam, [https://hdl.handle.net/not-yet Valguslahenduse tarkvara väljatöötamine õpperobotile Robotont] [Development of light solution software for the educational robot Robotont], BS thesis, 2024&lt;br /&gt;
*Robert Valge, [https://hdl.handle.net/not-yet Toitepinge ja tarbevoolu monitoorimine ning toitehalduse püsivara loomine õpperobotil Robotont] [Monitoring supply voltage and current consumption and creating firmware for Robotont’s power management system], BS thesis, 2024&lt;br /&gt;
*Leonid Tšigrinski, [https://hdl.handle.net/not-yet Õpperoboti Robotont püsivara arhitektuuri uuendamine] [Education robot &amp;quot;Robotont&amp;quot; firmware architecture updating], BS thesis, 2024&lt;br /&gt;
*Andres Sakk, [https://hdl.handle.net/not-yet Avatud robotplatvormi Robotont 3 kasutajaliidese väljatöötamine] [Development of a user interface for the open robotics platform Robotont 3], BS thesis, 2024&lt;br /&gt;
*Kaur Kullamäe, [https://hdl.handle.net/not-yet Elektroonikalahendus ja püsivara robotkäe juhtimiseks sotsiaalsel humanoidrobotil SemuBot] [Building the SemuBot arm electronics system], BS thesis, 2024&lt;br /&gt;
*Kristjan Madis Kask, [https://hdl.handle.net/not-yet Käe mehaanika disain sotsiaalsele humanoidrobotile SemuBot] [Arm mechanics design for social humanoid robot SemuBot], BS thesis, 2024&lt;br /&gt;
*Sven-Ervin Paap, [https://hdl.handle.net/not-yet ROS2 draiver õpperobotile Robotont] [ROS2 driver for the educational robot Robotont], BS thesis, 2024&lt;br /&gt;
*Timur Nizamov, [https://hdl.handle.net/10062/99592 Audio System for the Social Humanoid Robot SemuBot] [Helisüsteem sotsiaalsele humanoidrobotile SemuBot], BS thesis, 2024&lt;br /&gt;
*Georgs Narbuts, [https://hdl.handle.net/10062/99554 Holonomic Motion Drive System of a Social Humanoid Robot SemuBot] [Sotsiaalse humanoidroboti SemuBoti holonoomne liikuv ajamisüsteem], BS thesis, 2024&lt;br /&gt;
*Iryna Hurova, [https://hdl.handle.net/10062/90638 Kitting station of the learning factory] [Õppetehase komplekteerimisjaam], BS thesis, 2023&lt;br /&gt;
*Paola Avalos Conchas, [https://hdl.handle.net/10062/90641 Payload transportation system of a learning factory] [Õppetehase kasuliku koorma transpordisüsteem], BS thesis, 2023&lt;br /&gt;
*Pille Pärnalaas, [https://hdl.handle.net/10062/93431 Pöördpõik ajami arendus robotplatvormile Robotont] [Development of swerve drive for robotic platform Robotont], BS thesis, 2023&lt;br /&gt;
*Priit Rooden, [https://hdl.handle.net/10062/93432 Autonoomse laadimislahenduse väljatöötamine õpperobotile Robotont] [Development of an autonomous charging solution for the robot platform Robotont], BS thesis, 2023&lt;br /&gt;
*Marko Muro, [https://hdl.handle.net/10062/93429 Robotondi akulahenduse ning 12 V pingeregulaatori prototüüpimine] [Prototyping battery solution and 12 V voltage regulator for Robotont], BS thesis, 2023&lt;br /&gt;
*Danel Leppenen, [https://hdl.handle.net/10062/93427 Nav2 PYIF: Python-based Motion Planning for ROS 2 Navigation 2] [Nav 2 PYIF: Pythoni põhine liikumise planeerija ROS 2 Navigation 2-le], BS thesis, 2023&lt;br /&gt;
*Kertrud Geddily Küüt, [https://hdl.handle.net/10062/93424 Kõrgust reguleeriv mehhanism Robotondile] [Height adjusting mechanism for Robotont], BS thesis, 2023&lt;br /&gt;
*Ingvar Drikkit, [https://hdl.handle.net/10062/93421 Lisaseadmete võimekuse arendamine haridusrobotile Robotont] [Developing add-on device support for the educational robot Robotont], BS thesis, 2023&lt;br /&gt;
*Kristo Pool, [https://hdl.handle.net/10062/93430 MoveIt 2 õppematerjalid] [Learning materials for MoveIt 2], BS thesis, 2023&lt;br /&gt;
*Erki Veeväli, [https://hdl.handle.net/10062/93434 Development of a Continuous Teleoperation System for Urban Road Vehicle] [Linnasõiduki pideva kaugjuhtimissüsteemi arendus], BS thesis, 2023&lt;br /&gt;
*Aleksandra Doroshenko, [https://hdl.handle.net/10062/93420 Haiglates inimesi juhatava roboti disain] [Hospital guide robot design], BS thesis, 2023&lt;br /&gt;
*Hui Shi, [http://hdl.handle.net/10062/83043 Expanding the Open-source ROS Software Package opencv_apps with Dedicated Blob Detection Functionality] [Avatud lähtekoodiga ROS-i tarkvarakimbu opencv_apps laiendamine laigutuvasti funktsioonaalsusega], BS thesis, 2022&lt;br /&gt;
*Dāvis Krūmiņš, [http://hdl.handle.net/10062/83040 Web-based learning and software development environment for remote access of ROS robots] [Veebipõhine õppe- ja tarkvaraarenduse keskkond ROS robotite juurdepääsuks kaugteel], BS thesis, 2022&lt;br /&gt;
*Anna Jakovleva, [http://hdl.handle.net/10062/83037 Roboquiz - an interactive human-robot game] [Roboquiz - interaktiivne inimese ja roboti mäng], BS thesis, 2022&lt;br /&gt;
*Kristjan Laht, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=74702&amp;amp;year=2022 Robot Localization with Fiducial Markers] [Roboti lokaliseerimine koordinaatmärkidega], BS thesis, 2022&lt;br /&gt;
*Hans Pärtel Pani, [http://hdl.handle.net/10062/83008 ROS draiver pehmerobootika haaratsile] [ROS Driver for Soft Robotic Gripper], BS thesis, 2022&lt;br /&gt;
*Markus Erik Sügis, [http://hdl.handle.net/10062/83015 Jagatud juhtimise põhimõttel realiseeritud robotite kaugjuhtimissüsteem] [A continuous teleoperating system based on shared control concept], BS thesis, 2022&lt;br /&gt;
*Taaniel Küla, [http://hdl.handle.net/10062/83007 ROS2 platvormile keevitusroboti tarkvara portotüübi loomine kasutades UR5 robot-manipulaatorit] [Welding robot software prototype for ROS2 using UR5 robot arm], BS thesis, 2022&lt;br /&gt;
*Rauno Põlluäär, [http://hdl.handle.net/10062/72672 Veebirakendus-põhine kasutajaliides avatud robotplatvormi Robotont juhtimiseks ja haldamiseks] [Web application-based user interface for controlling and managing open-source robotics platform Robotont], BS thesis, 2021&lt;br /&gt;
*Hendrik Olesk, [http://hdl.handle.net/10062/72664 Nägemisulatuses kaugjuhitava mobiilse robotmanipulaatori kasutajamugavuse tõstmine] [Improving the usability of a mobile manipulator robot for line-of-sight remote control], BS thesis, 2021&lt;br /&gt;
*Tarvi Tepandi, [http://hdl.handle.net/10062/72665 Segareaalsusel põhinev kasutajaliides mobiilse roboti kaugjuhtimiseks Microsoft HoloLens 2 vahendusel] [Mixed-reality user interface for teleoperating mobile robots with Microsoft HoloLens 2], BS thesis, 2021&lt;br /&gt;
*Rudolf Põldma, [http://hdl.handle.net/10062/72674 Tartu linna Narva maantee ringristmiku digikaksik] [Digital twin for Narva street roundabout in Tartu], BS thesis, 2021&lt;br /&gt;
*Kwasi Akuamoah Boateng, [http://hdl.handle.net/10062/72804 Digital Twin of a Teaching and Learning Robotics Lab] [Robotite õpetamise ja õppimise labori digitaalne kaksik], BS thesis, 2021&lt;br /&gt;
*Karina Sein, [http://hdl.handle.net/10062/72102 Eestikeelse kõnesünteesi võimaldamine robootika arendusplatvormil ROS] [Enabling Estonian speech synthesis on the Robot Operating System (ROS)], BS thesis, 2020&lt;br /&gt;
*Ranno Mäesepp, [http://hdl.handle.net/10062/72100 Takistuste vältimise lahendus õpperobotile Robotont] [Obstacle avoidance solution for educational robot platform Robotont], BS thesis, 2020&lt;br /&gt;
*Igor Rybalskii, [http://hdl.handle.net/10062/72060 Gesture Detection Software for Human-Robot Collaboration] [Žestituvastus tarkvara inimese ja roboti koostööks], BS thesis, 2020&lt;br /&gt;
*Meelis Pihlap, [http://hdl.handle.net/10062/64292 Mitme roboti koostöö funktsionaalsuste väljatöötamine tarkvararaamistikule TeMoto] [Multi-robot collaboration functionalities for robot software development framework TeMoto], BS thesis, 2019&lt;br /&gt;
*Kaarel Mark, [http://hdl.handle.net/10062/64290 Liitreaalsuse kasutamine tootmisprotsessis asukohtade määramisel] [Augmented reality for location determination in manufacturing], BS thesis, 2019&lt;br /&gt;
*Kätriin Julle, [http://hdl.handle.net/10062/64279 Roboti KUKA youBot riistvara ja ROS-tarkvara uuendamine] [Upgrading robot KUKA youBot’s hardware and ROS-software], BS thesis, 2019&lt;br /&gt;
*Georg Astok, [http://hdl.handle.net/10062/64274 Roboti juhtimine virtuaalreaalsuses kasutades ROS-raamistikku] [Creating virtual reality user interface using only ROS framework], BS thesis, 2019&lt;br /&gt;
*Martin Hallist, [http://hdl.handle.net/10062/64275 Robotipõhine kaugkohalolu käte liigutuste ülekandmiseks] [Teleoperation robot for arms motions], BS thesis, 2019&lt;br /&gt;
*Ahmed Hassan Helmy Mohamed, [http://hdl.handle.net/10062/63946 Software integration of autonomous robot system for mixing and serving drinks] [Jooke valmistava ja serveeriva robotsüsteemi tarkvaralahendus], BS thesis, 2019&lt;br /&gt;
*Kristo Allaje, [http://hdl.handle.net/10062/60294 Draiveripakett käte jälgimise seadme Leap Motion™ kontroller kasutamiseks robootika arendusplatvormil ROS] [Driver package for using Leap Motion™ controller on the robotics development platform ROS], BS thesis, 2018&lt;br /&gt;
*Martin Maidla, [http://hdl.handle.net/10062/60288 Avatud robotiarendusplatvormi Robotont omniliikumise ja odomeetria arendamine] [Omnimotion and odometry development for open robot development platform Robotont], BS thesis, 2018&lt;br /&gt;
*Raid Vellerind, [http://hdl.handle.net/10062/56559 Avatud robotiarendusplatvormi ROS võimekuse loomine Tartu Ülikooli Robotexi robootikaplatvormile] [ROS driver development for the University of Tartu’s Robotex robotics platform], BS thesis, 2017&lt;br /&gt;
&lt;br /&gt;
[[Category:Theses Topics]]&lt;/div&gt;</summary>
		<author><name>Karl</name></author>
	</entry>
	<entry>
		<id>https://ims.ut.ee/index.php?title=Theses_in_Robotics&amp;diff=39444</id>
		<title>Theses in Robotics</title>
		<link rel="alternate" type="text/html" href="https://ims.ut.ee/index.php?title=Theses_in_Robotics&amp;diff=39444"/>
		<updated>2024-08-28T06:56:17Z</updated>

		<summary type="html">&lt;p&gt;Karl: /* ROBOTONT: TeMoto for robotont */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:IMS-robotics]]&lt;br /&gt;
&amp;lt;div class=&amp;quot;toclimit-3&amp;quot;&amp;gt;__TOC__&amp;lt;/div&amp;gt; &lt;br /&gt;
= Projects in Advanced Robotics =&lt;br /&gt;
''The main objective of the following projects is to give students experience in working with advanced robotics technology. Our group is active in several R&amp;amp;D projects involving human-robot collaboration, intuitive teleoperation of robots, and autonomous navigation of unmanned mobile platforms. Our main software platforms are [http://www.ros.org/ Robot Operating System (ROS)] for developing software for advanced robot systems and [http://gazebosim.org/ Gazebo] for running realistic robotic simulations.''&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
For further information, contact [[User:Karl|Karl Kruusamäe]]. &lt;br /&gt;
&lt;br /&gt;
The following is not an exhaustive list of all available thesis/research topics.&lt;br /&gt;
&lt;br /&gt;
== Highlighted theses topics for 2023/2024 study year ==&lt;br /&gt;
# [[#ROBOTONT: analysis of different options as on-board computers|ROBOTONT: analysis of different options as on-board computers]]&lt;br /&gt;
# [[#SemuBOT: multiple topics|SemuBOT: multiple topics]]&lt;br /&gt;
# [[#Robotic Study Companion: a social robot for students in higher education|Robotic Study Companion: a social robot for students in higher education]]&lt;br /&gt;
# [[#Quantitative Evaluation of the Situation Awareness during the Teleoperation of an Urban Vehicle|Quantitative Evaluation of the Situation Awareness during the Teleoperation of an Urban Vehicle]]&lt;br /&gt;
# [[#ROBOTONT: ROS2 support for robotont|ROBOTONT: ROS2 support for robotont]]&lt;br /&gt;
# [[#Continuous teleoperation setup for controlling mobile robot on streets|Continuous teleoperation setup for controlling mobile robot on streets]]&lt;br /&gt;
# [[#Stratos Explore Ultraleap demonstrator for robotics|Stratos Explore Ultraleap demonstrator for robotics]]&lt;br /&gt;
# [[#Mixed-reality scene creation for vehicle teleoperation|Mixed-reality scene creation for vehicle teleoperation]]&lt;br /&gt;
&lt;br /&gt;
== List of potential thesis topics ==&lt;br /&gt;
Our inventory includes but is not limited to:&lt;br /&gt;
&amp;lt;gallery mode=&amp;quot;packed&amp;quot;&amp;gt;&lt;br /&gt;
File:Youbot.png|120px|thumb|KUKA youBot&lt;br /&gt;
File:Ur5_left.png|120px|thumb|Universal Robot UR5&lt;br /&gt;
File:Franka_Emika_Panda.jpg|120px|thumb|Franka Emika Panda&lt;br /&gt;
File:Clearpath_Jackal.jpg|120px|thumb|Clearpath Jackal&lt;br /&gt;
File:Kinova_KG-3_Gripper.jpg|120px|thumb|Kinova 3-finger gripper&lt;br /&gt;
File:Xarm7.jpg|120px|thumb|UFACTORY xArm7&lt;br /&gt;
File:Robotont_banner.png|120px|thumb|robotont&lt;br /&gt;
File:Turtlebot3-waffle-pi.jpg|thumb|TurtleBot3&lt;br /&gt;
File:Parrot-bebop-2.jpg|thumb|Parrot Bebop 2&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== ROBOTONT: Docker-Driven ROS Environment Switching ===&lt;br /&gt;
This thesis focuses on integrating Docker containers to manage and switch between different ROS (Robot Operating System) environments on Robotont. Currently, ROS is installed natively on Robotont's Ubuntu-based onboard computer, limiting flexibility in system recovery and configuration switching. The goal is to develop a Docker-based solution that allows users to switch between ROS environments directly from Robotont’s existing low-level menu interface.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: designing and implementing a communication protocol for additional devices ===&lt;br /&gt;
Robotont currently includes a designated area on its front for attaching additional devices, such as an ultrasonic range finder, a servo motor connected to an Arduino, or a MikroBUS device. This thesis aims to design and implement a standard communication protocol that will enable seamless integration and control of these devices through the high-level ROS framework running on Robotont’s onboard computer.&lt;br /&gt;
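A natural starting point for such a protocol is a simple framed message with a checksum. The frame layout below is an assumption made for illustration, not an existing Robotont specification: each message carries a start byte, a device identifier, a payload length, the payload, and a one-byte checksum.

```python
# Illustrative sketch of a simple framed protocol for add-on devices
# (the frame layout is an assumption for discussion, not an existing
# Robotont specification): [0xAA][device id][len][payload][checksum]
def encode_frame(device_id: int, payload: bytes) -> bytes:
    body = bytes([device_id, len(payload)]) + payload
    checksum = sum(body) % 256          # simple additive checksum
    return b"\xaa" + body + bytes([checksum])

def decode_frame(frame: bytes) -> tuple:
    """Return (device_id, payload); raise ValueError on a corrupt frame."""
    if len(frame) < 4 or frame[0] != 0xAA:
        raise ValueError("bad header")
    body, checksum = frame[1:-1], frame[-1]
    if sum(body) % 256 != checksum:
        raise ValueError("checksum mismatch")
    device_id, length = body[0], body[1]
    payload = body[2:]
    if len(payload) != length:
        raise ValueError("length mismatch")
    return device_id, payload

# e.g. a hypothetical ultrasonic range finder registered as device 0x01
frame = encode_frame(0x01, b"\x10\x20")
device_id, payload = decode_frame(frame)
```

On the robot, a ROS node on the onboard computer would encode such frames onto the serial link and republish decoded payloads as topics; the actual thesis would need to settle device discovery, payload schemas, and error recovery on top of this basic framing.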
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: analysis of different options as on-board computers ===&lt;br /&gt;
Currently ROBOTONT uses an Intel NUC as its onboard computer. The goal of this thesis is to validate Robotont's software stack on alternative compute devices (e.g. Raspberry Pi 4, Intel Compute Stick, and NVIDIA Jetson Nano) and to benchmark their performance on the most popular Robotont use-case demos (e.g. webapp teleop, AR-tag steering, 2D mapping, and 3D mapping). The objective is to propose at least two different compute solutions: one that optimizes for cost and another that optimizes for performance. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT Lite ===&lt;br /&gt;
The goal of this thesis is to optimize the ROBOTONT platform for cost by replacing the onboard computer and sensors with low-cost alternatives while ensuring ROS/ROS2 software compatibility. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: integrating a graphical programming interface ===&lt;br /&gt;
The goal of this thesis is to integrate a graphical programming solution (e.g. Scratch or Blockly) to enable programming of Robotont by non-experts.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== SemuBOT: multiple topics ===&lt;br /&gt;
In 2023/2024, many topics are offered to support the development of the open-source humanoid robot project SemuBOT (https://www.facebook.com/semubotmtu/)&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Robotic Study Companion: a social robot for students in higher education ===&lt;br /&gt;
Potential Topics:&lt;br /&gt;
* Enhancing the RSC Speech/NLP Capabilities&lt;br /&gt;
* Building the RSC Human-Robot Multimodal Interaction&lt;br /&gt;
* Developing the RSC Personalities/Behaviour&lt;br /&gt;
* Creating a WebApp for Monitoring Student Performance via the RSC&lt;br /&gt;
* Exploring the RSC's use as an Affective Robot with a focus on Students’ Academic Emotions&lt;br /&gt;
[https://github.com/Farnaz03/RoboticStudyCompanion Github] | reach out to farnaz.baksh@ut.ee for more info&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: Quick launch demo suite and the final complete release of its ROS1 software ===&lt;br /&gt;
The goal of this thesis is to refine existing Robotont demos (e.g. AR-tag steering, follow-the-leader, dancing-with-robot, and LEAP-based control) and package them in an easy-to-use way for quick deployment by anyone during public events such as science popularization workshops and school visits.&lt;br /&gt;
The results of this work will be packaged as the final ROS1 release of Robotont software, as ROS1 reaches end-of-life in May 2025.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: ROS support, demos, and educational materials for open-source mobile robot ===&lt;br /&gt;
[[Image:RosLarge.png|left|100px|ROS]]&lt;br /&gt;
The project involves many potential thesis topics on the open-source robot platform ROBOTONT. The nature of the thesis can be software development to improve the platform's capabilities, simulation of specific scenarios, and/or demonstration of ROBOTONT in a real-life setting. A more detailed thesis topic will be outlined during an in-person meeting.&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
[[Image:Ros_equation.png|x100px|What is ROS?]]&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Virtual reality user interface (VRUI) for intuitive teleoperation system ===&lt;br /&gt;
[[Image:LeapMotion.png|200px|thumb|right|Detecting 2 hands with Leap Motion Controller]]&lt;br /&gt;
Enhancing the user experience of a virtual reality UI developed by [https://github.com/ut-ims-robotics/vrui_rviz Georg Astok]. Potentially adding [http://www.osvr.org/hardware-devs.html virtual reality capability] to a [https://www.youtube.com/watch?v=L25HHFd00rc gesture- and natural-language-based robot teleoperation system].&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
[[Image:Temoto-working-principle-in-pics.png|x160px|Gesture-based teleoperation]]&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Health monitor for intuitive telerobot ===&lt;br /&gt;
Intelligent status and error handling for an intuitive telerobotic system.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== 3D scanning of industrial objects ===&lt;br /&gt;
Using laser sensors and cameras to create accurate models of industrial products for quality control or further processing.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Modeling humans for human-robot interaction ===&lt;br /&gt;
True human-robot collaboration means that the robot must understand the actions, intention, and state of its human partner. This work involves using cameras and other human sensors to digitally represent and model humans. There are multiple stages of modeling: a) physical models of human kinematics and dynamics; b) higher-level models for recognizing human intent.&amp;lt;br&amp;gt;&lt;br /&gt;
[[Image:Skeletal ROS.PNG|x160px|ROS &amp;amp; Kinect &amp;amp; Skeleton-Markers Package]]&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Robotic avatar for telepresence ===&lt;br /&gt;
Integrating hand gestures and head movements to control a robot avatar in a virtual reality user interface.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Detection of hardware and software resources for smart integration of robots ===&lt;br /&gt;
The vast majority of today’s robotic applications rely on hard-coded device and algorithm usage. This project focuses on developing Resource Snooper software that can detect the addition or removal of resources for the benefit of dynamic reconfiguration of robotic systems. This project is developed as a subsystem of [https://utnuclearroboticspublic.github.io/temoto2/index.html TeMoto].&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Sonification of feedback during teleoperation of robots ===&lt;br /&gt;
Humans are used to receiving auditory feedback in their everyday lives. It helps us make decisions and be aware of potential dangers. Telerobotic interfaces can deploy the same idea to improve situational awareness and robotic task efficiency. The thesis project involves a study of different sonification solutions and their implementation in a telerobotic application using ROS. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Human-Robot and Robot-Robot collaboration applications ===&lt;br /&gt;
Creating a demo or analysis of software capabilities related to human-robot or robot-robot teams:&lt;br /&gt;
* human-robot collaborative assembly&lt;br /&gt;
* distributed mapping; analysis and demo of existing ROS packages for multi-robot mapping (e.g., segmap https://youtu.be/JJhEkIA1xSE)&lt;br /&gt;
* Inaccessible region teamwork&lt;br /&gt;
**youbot+drone - a drone maps the environment (for example, a maze) and a ground vehicle uses this information to traverse the maze&lt;br /&gt;
**youbot+robotont - a youbot cannot go up ledges, but it can lift a smaller robot, such as a robotont, up a ledge.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Mirroring human hand movements on industrial robots ===&lt;br /&gt;
The goal of this project is to integrate continuous control of an industrial robot manipulator with a gestural telerobotics interface. The recommended tools for this thesis project are a Leap Motion Controller or a standard web camera, Ultraleap, a Universal Robots UR5 manipulator, and ROS.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== ROBOTONT: TeMoto for robotont ===&lt;br /&gt;
Swarm-management and UMRF-based task loading for robotont using [https://github.com/temoto-framework TeMoto] framework.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Enhancing teleoperation control interface with augmented cues to provoke caution ===&lt;br /&gt;
The task is to create a telerobot control interface where the video feed from the remote site and/or a mixed-reality scene is augmented with visual cues to provoke caution in the human operator.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Robot-to-human interaction ===&lt;br /&gt;
As robots and autonomous machines start sharing the same space as humans, their actions need to be understood by the people occupying that space. For instance, a human worker needs to understand what the robot partner is planning next, or a pedestrian needs to clearly comprehend the behaviour of a driverless vehicle. To reduce ambiguity, the robot needs mechanisms to convey its intent (whatever it is going to do next). The aim of the thesis is to outline existing methods for machines to convey their intent and to develop a unified model interface for expressing that intent. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Gaze-based handover prediction ===&lt;br /&gt;
When a human needs to pass an object to a robot manipulator, the robot must understand where in 3D space the object handover occurs and then plan an appropriate motion. Human gaze can be used as the input for predicting which object to track. The thesis activities involve camera-based eye tracking and safe motion planning.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Real-world demonstrator for MIR+UR+TeMoto integration  ===&lt;br /&gt;
Integration of mobile manipulator (MIR100 + UR5e + Robotiq gripper) to demonstrate TeMoto in a collaborative application.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== ROBOTONT: Human-height human-robot interface for Robotont ground robot ===&lt;br /&gt;
Robotont is an ankle-high flat mobile robot. For humans to interact with Robotont, there is a need for a compact and lightweight mechanical structure that is tall enough for comfortable human-robot interaction. The objective of the thesis is to develop the mechanical designs and build prototypes that ensure stable operation and meet the aesthetic requirements for use in public places. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Stratos Explore Ultraleap demonstrator for robotics ===&lt;br /&gt;
The aim of this thesis is to systematically analyze the strengths and limitations of the Stratos Explore Ultraleap device in the context of controlling a robot, and subsequently to implement a demonstrator application showcasing its applicability for robot control. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Continuous teleoperation setup for controlling mobile robot on streets ===&lt;br /&gt;
The task in this thesis is to analyse the available options for building a teleoperation cockpit for continuously controlling a mobile robot moving on the streets. The contribution of the thesis is to set up the system, validate its usability, and benchmark its capabilities/limitations on the [https://adl.cs.ut.ee/lab/vehicle ADL vehicle].&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: ROS2 support for robotont ===&lt;br /&gt;
Creating ROS2 support for the robotont mobile platform.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Mixed-reality scene creation for vehicle teleoperation ===&lt;br /&gt;
Fusing different sensory feeds to create a high-usability teleoperation scene.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Validation study for AR-based robot user-interfaces ===&lt;br /&gt;
Designing and carrying out a user study to validate the functionality and usability of a human-robot interface.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Replication of the MIT Hydra demo ===&lt;br /&gt;
The goal of the thesis is to use the Hydra software package and integrate it with a robot used at IMS Robotics (e.g., TIAGo, Jackal, Robotont).&lt;br /&gt;
&amp;lt;br&amp;gt;Hydra takes sensor data as input (stereo or RGB-D camera images and IMU data) and produces a hierarchical model of the environment by estimating the trajectory of the robot (including loop closures), building a metric-semantic 3D mesh, and segmenting objects, places, and rooms in an indoor environment. These representations are combined into a 3D scene graph, which enables novel approaches for hierarchical loop closure detection and ensures the representations remain consistent after loop closure. Hydra is implemented in C++ and is ROS-compatible. It uses a neural-network-based image segmentation front-end, but otherwise relies on an efficient, multi-threaded CPU-based implementation, which is suitable for mobile robot deployment.&lt;br /&gt;
&amp;lt;br&amp;gt;LINKS:&lt;br /&gt;
&amp;lt;br&amp;gt;Video: https://youtu.be/qZg2lSeTuvM&lt;br /&gt;
&amp;lt;br&amp;gt;Code: https://github.com/MIT-SPARK/Hydra&lt;br /&gt;
&amp;lt;br&amp;gt;Paper: http://www.roboticsproceedings.org/rss18/p050.pdf&lt;br /&gt;
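&amp;lt;br&amp;gt;For orientation, the layered scene graph described above (objects, places, and rooms linked into a hierarchy) can be caricatured as a tree of typed nodes. The sketch below is purely illustrative; the class and layer names are hypothetical and are not Hydra's actual C++ API.&lt;br /&gt;

```python
# Illustrative sketch of a layered 3D scene graph (objects -> places -> rooms),
# in the spirit of Hydra's hierarchical environment model. Hypothetical names.
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Node:
    node_id: str
    layer: str                          # e.g. "object", "place", "room"
    parent: Optional[str] = None        # id of the node one layer up
    children: list = field(default_factory=list)

class SceneGraph:
    LAYERS = ["object", "place", "room"]  # bottom-up hierarchy

    def __init__(self):
        self.nodes = {}

    def add(self, node_id, layer, parent=None):
        node = Node(node_id, layer, parent)
        self.nodes[node_id] = node
        if parent is not None:
            self.nodes[parent].children.append(node_id)
        return node

    def room_of(self, node_id):
        """Walk parent links upward until a room-layer node is reached."""
        node = self.nodes[node_id]
        while node.layer != "room" and node.parent is not None:
            node = self.nodes[node.parent]
        return node.node_id if node.layer == "room" else None

# Example: a mug near a table in the kitchen
g = SceneGraph()
g.add("kitchen", "room")
g.add("near_table", "place", parent="kitchen")
g.add("mug_1", "object", parent="near_table")
print(g.room_of("mug_1"))  # kitchen
```

Queries such as "which room is this object in" reduce to walking parent links upward, which is one reason the hierarchical representation is convenient for loop closure detection and task-level reasoning.&lt;br /&gt;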
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Quantitative Evaluation of the Situation Awareness during the Teleoperation of an Urban Vehicle ===&lt;br /&gt;
The goal of this thesis is to develop the methodology for measuring the operator's situation awareness while operating an autonomous urban vehicle. Potential metrics could include driving accuracy, speed, and latency. This thesis will be conducted as part of the activities at the [https://adl.cs.ut.ee Autonomous Driving Lab].&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
= Completed projects =&lt;br /&gt;
== PhD theses ==&lt;br /&gt;
*Houman Masnavi, [https://hdl.handle.net/10062/91394 Visibility aware navigation] [Nähtavust arvestav navigatsioon], PhD thesis, 2023&lt;br /&gt;
&lt;br /&gt;
== Master's theses ==&lt;br /&gt;
*Robert Allik, Validation of NoMaD as a Global Planner for Mobile Robots, MS thesis, 2024&lt;br /&gt;
*Rauno Põlluäär, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=79393 Designing and Implementing a Bird’s-eye View Interface for a Self-driving Vehicle’s Teleoperation System] [Isejuhtiva sõiduki kaugjuhtimissüsteemile linnuvaate kasutajaliidese loomine], MS thesis, 2024&lt;br /&gt;
*Eva Mõtshärg, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=77385 3D-prinditava kere disain ja analüüs vabavaralisele haridusrobotile Robotont] [Design and Analysis of a 3D Printable Chassis for the Open Source Educational Robot Robotont], MS thesis, 2023&lt;br /&gt;
*Farnaz Baksh, [https://dspace.ut.ee/items/a7a9cc15-27e9-450c-94e8-3267f0c95c56 An Open-source Robotic Study Companion for University Students] [Avatud lähtekoodiga robotõpikaaslane üliõpilastele], MS thesis, 2023&lt;br /&gt;
*Igor Rybalskii, [http://hdl.handle.net/10062/83028 Augmented reality (AR) for enabling human-robot collaboration with ROS robots] [Liitreaalsus inimese ja roboti koostöö võimaldamiseks ROS-i robotitega], MS thesis, 2022&lt;br /&gt;
*Md. Maniruzzaman, [http://hdl.handle.net/10062/83025 Object search and retrieval in indoor environment using a Mobile Manipulator] [Objektide otsimine ja teisaldamine siseruumides mobiilse manipulaatori abil], MS thesis, 2022&lt;br /&gt;
*Allan Kustavus, [http://hdl.handle.net/10062/72651 Design and Implementation of a Generalized Resource Management Architecture in the TeMoto Software Framework] [Üldise ressursihalduri disain ja teostus TeMoto tarkvara raamistikule], MS thesis, 2021&lt;br /&gt;
*Kristina Meister, [http://hdl.handle.net/10062/72350 External human-vehicle interaction - a study in the context of an autonomous ride-hailing service], MS thesis, 2021&lt;br /&gt;
*Muhammad Usman, [http://hdl.handle.net/10062/72126 Development of an Optimization-Based Motion Planner and Its ROS Interface for a Non-Holonomic Mobile Manipulator] [Optimeerimisele baseeruva liikumisplaneerija arendamine ja selle ROSi liides mitteholonoomse mobiilse manipulaatori jaoks], MS thesis, 2020&lt;br /&gt;
*Maarika Oidekivi, [http://hdl.handle.net/10062/72119 Masina kavatsuse väljendamine ja tõlgendamine] [Communicating and interpreting machine intent], MS thesis, 2020&lt;br /&gt;
*Houman Masnavi, [http://hdl.handle.net/10062/72118 Multi-Robot Motion Planning for Shared Payload Transportation] [Rajaplaneerimine multi-robot süsteemile jagatud lasti transportimisel], MS thesis, 2020&lt;br /&gt;
*Fabian Ernesto Parra Gil, [http://hdl.handle.net/10062/72112 Implementation of Robot Manager Subsystem for Temoto Software Framework] [Robotite Halduri alamsüsteemi väljatöötamine tarkvararaamistikule TEMOTO], MS thesis, 2020&lt;br /&gt;
*Zafarullah, [http://hdl.handle.net/10062/72125 Gaze Assisted Neural Network based Prediction of End-Point of Human Reaching Trajectories], MS thesis, 2020&lt;br /&gt;
*Madis K Nigol, [http://hdl.handle.net/10062/64339 Õppematerjalid robotplatvormile Robotont] [Study materials for robot platform Robotont], MS thesis, 2019&lt;br /&gt;
*Renno Raudmäe, [http://hdl.handle.net/10062/64341 Avatud robotplatvorm Robotont] [Open source robotics platform Robotont], MS thesis, 2019&lt;br /&gt;
*Asif Sattar, [http://hdl.handle.net/10062/64352 Human detection and distance estimation with monocular camera using YOLOv3 neural network] [Inimeste tuvastamine ning kauguse hindamine kasutades kaamerat ning YOLOv3 tehisnärvivõrku], MS thesis, 2019&lt;br /&gt;
*Ragnar Margus, [http://hdl.handle.net/10062/64337 Kergliiklusvahendite jagamisteenuseks vajaliku positsioneerimismooduli loomine ja uurimine] [Development and testing of an IoT module for electric vehicle sharing service], MS thesis, 2019&lt;br /&gt;
*Pavel Šumejko, [http://hdl.handle.net/10062/64320 Robust Solution for Extrinsic Calibration of a 2D Laser-Rangefinder and a Monocular USB Camera] [Meetod 2D laserkaugusmõõdiku ja USB-kaamera väliseks kalibreerimiseks], MS thesis, 2019&lt;br /&gt;
*Dzvezdana Arsovska, [http://hdl.handle.net/10062/64321  Building an Efficient and Secure Software Supply Pipeline for Aerial Robotics Application] [Efektiivne ja turvaline tarkvaraarendusahel lennurobootika rakenduses], MS thesis, 2019&lt;br /&gt;
*Tõnis Tiimus, [http://hdl.handle.net/10062/60713 Visuaalselt esilekutsutud potentsiaalidel põhinev aju-arvuti liides robootika rakendustes] [A VEP-based BCI for robotics applications], MS thesis, 2018&lt;br /&gt;
*Martin Appo, [http://hdl.handle.net/10062/60718 Hardware-agnostic compliant control ROS package for collaborative industrial manipulators] [Riistvarapaindlik ROSi tarkvarapakett tööstuslike robotite mööndlikuks juhtimiseks], MS thesis, 2018&lt;br /&gt;
*Hassan Mahmoud Shehawy Elhanash, [http://hdl.handle.net/10062/60711 Optical Tracking of Forearm for Classifying Fingers Poses] [Küünarvarre visuaalne jälgimine ennustamaks sõrmede asendeid], MS thesis, 2018&lt;br /&gt;
&lt;br /&gt;
== Bachelor's theses ==&lt;br /&gt;
*Albert Unn, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=79937 Suhtlusvõimekuse arendamine sotsiaalsele humanoidrobotile SemuBot] [Developing the ability to communicate for SemuBot, a social humanoid robot], BS thesis, 2024&lt;br /&gt;
*Veronika-Marina Volynets, [https://hdl.handle.net/10062/99943 Development of Control Electronics and Program for Robotont's Height Adjustment Mechanism] [Juhtelektroonika ja programmi väljatöötamine Robotondi kõrguse reguleerimise mehhanismile], BS thesis, 2024&lt;br /&gt;
*Nikita Kurenkov, [https://hdl.handle.net/10062/99880 Flexible Screen Integration and Development of Neck Movement Mechanism for Social Humanoid Robot SemuBot] [Humanoidroboti näo lahenduse leidmine ja rakendamine; humanoidroboti jaoks kaela mehhanismi väljatöötamine], BS thesis, 2024&lt;br /&gt;
*Elchin Huseynov, [https://hdl.handle.net/10062/99668 Design and Control of a Social Humanoid Robot - SemuBot’s Hand] [Sotsiaalse Humanoidroboti Disain ja Juhtimine - SemuBoti Käsi], BS thesis, 2024&lt;br /&gt;
*Veronika Podliesnova, [https://hdl.handle.net/10062/99636 Real-Time Detection of Robot Failures by Monitoring Operator’s Brain Activity with EEG-based Brain-Computer Interface] [Reaalajas Robotirikke Tuvastamine Operaatori Ajutegevuse Jälgimise teel EEG-põhise Aju-Arvuti Liidese abil], BS thesis, 2024&lt;br /&gt;
*Märten Josh Peedimaa, [https://hdl.handle.net/not-yet Rhea: An Open-Source Table Tennis Ball Launcher Robot for Multiball Training] [Avatud lähtekoodiga lauatennise palliviske robot mitmikpall treenimiseks], BS thesis, 2024&lt;br /&gt;
*Karl Sander Vinkel, [https://hdl.handle.net/not-yet Laadimisjaama ja transportkesta väljatöötamine õpperobotile Robotont] [Development of charging dock and transport case for Robotont], BS thesis, 2024&lt;br /&gt;
*Raimo Köidam, [https://hdl.handle.net/not-yet Valguslahenduse tarkvara väljatöötamine õpperobotile Robotont] [Development of light solution software for the educational robot Robotont], BS thesis, 2024&lt;br /&gt;
*Robert Valge, [https://hdl.handle.net/not-yet Toitepinge ja tarbevoolu monitoorimine ning toitehalduse püsivara loomine õpperobotil Robotont] [Monitoring supply voltage and current consumption and creating firmware for Robotont’s power management system], BS thesis, 2024&lt;br /&gt;
*Leonid Tšigrinski, [https://hdl.handle.net/not-yet Õpperoboti Robotont püsivara arhitektuuri uuendamine] [Education robot &amp;quot;Robotont&amp;quot; firmware architecture updating], BS thesis, 2024&lt;br /&gt;
*Andres Sakk, [https://hdl.handle.net/not-yet Avatud robotplatvormi Robotont 3 kasutajaliidese väljatöötamine] [Development of a user interface for the open robotics platform Robotont 3], BS thesis, 2024&lt;br /&gt;
*Kaur Kullamäe, [https://hdl.handle.net/not-yet Elektroonikalahendus ja püsivara robotkäe juhtimiseks sotsiaalsel humanoidrobotil SemuBot] [Building the SemuBot arm electronics system], BS thesis, 2024&lt;br /&gt;
*Kristjan Madis Kask, [https://hdl.handle.net/not-yet Käe mehaanika disain sotsiaalsele humanoidrobotile SemuBot] [Arm mechanics design for social humanoid robot SemuBot], BS thesis, 2024&lt;br /&gt;
*Sven-Ervin Paap, [https://hdl.handle.net/not-yet ROS2 draiver õpperobotile Robotont] [ROS2 driver for the educational robot Robotont], BS thesis, 2024&lt;br /&gt;
*Timur Nizamov, [https://hdl.handle.net/10062/99592 Audio System for the Social Humanoid Robot SemuBot] [Helisüsteem sotsiaalsele humanoidrobotile SemuBot], BS thesis, 2024&lt;br /&gt;
*Georgs Narbuts, [https://hdl.handle.net/10062/99554 Holonomic Motion Drive System of a Social Humanoid Robot SemuBot] [Sotsiaalse humanoidroboti SemuBoti holonoomne liikuv ajamisüsteem], BS thesis, 2024&lt;br /&gt;
*Iryna Hurova, [https://hdl.handle.net/10062/90638 Kitting station of the learning factory] [Õppetehase komplekteerimisjaam], BS thesis, 2023&lt;br /&gt;
*Paola Avalos Conchas, [https://hdl.handle.net/10062/90641 Payload transportation system of a learning factory] [Õppetehase kasuliku koorma transpordisüsteem], BS thesis, 2023&lt;br /&gt;
*Pille Pärnalaas, [https://hdl.handle.net/10062/93431 Pöördpõik ajami arendus robotplatvormile Robotont] [Development of swerve drive for robotic platform Robotont], BS thesis, 2023&lt;br /&gt;
*Priit Rooden, [https://hdl.handle.net/10062/93432 Autonoomse laadimislahenduse väljatöötamine õpperobotile Robotont] [Development of an autonomous charging solution for the robot platform Robotont], BS thesis, 2023&lt;br /&gt;
*Marko Muro, [https://hdl.handle.net/10062/93429 Robotondi akulahenduse ning 12 V pingeregulaatori prototüüpimine] [Prototyping battery solution and 12 V voltage regulator for Robotont], BS thesis, 2023&lt;br /&gt;
*Danel Leppenen, [https://hdl.handle.net/10062/93427 Nav2 PYIF: Python-based Motion Planning for ROS 2 Navigation 2] [Nav 2 PYIF: Pythoni põhine liikumise planeerija ROS 2 Navigation 2-le], BS thesis, 2023&lt;br /&gt;
*Kertrud Geddily Küüt, [https://hdl.handle.net/10062/93424 Kõrgust reguleeriv mehhanism Robotondile] [Height adjusting mechanism for Robotont], BS thesis, 2023&lt;br /&gt;
*Ingvar Drikkit, [https://hdl.handle.net/10062/93421 Lisaseadmete võimekuse arendamine haridusrobotile Robotont] [Developing add-on device support for the educational robot Robotont], BS thesis, 2023&lt;br /&gt;
*Kristo Pool, [https://hdl.handle.net/10062/93430 MoveIt 2 õppematerjalid] [Learning materials for MoveIt 2], BS thesis, 2023&lt;br /&gt;
*Erki Veeväli, [https://hdl.handle.net/10062/93434 Development of a Continuous Teleoperation System for Urban Road Vehicle] [Linnasõiduki pideva kaugjuhtimissüsteemi arendus], BS thesis, 2023&lt;br /&gt;
*Aleksandra Doroshenko, [https://hdl.handle.net/10062/93420 Haiglates inimesi juhatava roboti disain] [Hospital guide robot design], BS thesis, 2023&lt;br /&gt;
*Hui Shi, [http://hdl.handle.net/10062/83043 Expanding the Open-source ROS Software Package opencv_apps with Dedicated Blob Detection Functionality] [Avatud lähtekoodiga ROS-i tarkvarakimbu opencv_apps laiendamine laigutuvasti funktsionaalsusega], BS thesis, 2022&lt;br /&gt;
*Dāvis Krūmiņš, [http://hdl.handle.net/10062/83040 Web-based learning and software development environment for remote access of ROS robots] [Veebipõhine õppe- ja tarkvaraarenduse keskkond ROS robotite juurdepääsuks kaugteel], BS thesis, 2022&lt;br /&gt;
*Anna Jakovleva, [http://hdl.handle.net/10062/83037 Roboquiz - an interactive human-robot game] [Roboquiz - interaktiivne inimese ja roboti mäng], BS thesis, 2022&lt;br /&gt;
*Kristjan Laht, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=74702&amp;amp;year=2022 Robot Localization with Fiducial Markers] [Roboti lokaliseerimine koordinaatmärkidega], BS thesis, 2022&lt;br /&gt;
*Hans Pärtel Pani, [http://hdl.handle.net/10062/83008 ROS draiver pehmerobootika haaratsile] [ROS Driver for Soft Robotic Gripper], BS thesis, 2022&lt;br /&gt;
*Markus Erik Sügis, [http://hdl.handle.net/10062/83015 Jagatud juhtimise põhimõttel realiseeritud robotite kaugjuhtimissüsteem] [A continuous teleoperating system based on shared control concept], BS thesis, 2022&lt;br /&gt;
*Taaniel Küla, [http://hdl.handle.net/10062/83007 ROS2 platvormile keevitusroboti tarkvara prototüübi loomine kasutades UR5 robot-manipulaatorit] [Welding robot software prototype for ROS2 using UR5 robot arm], BS thesis, 2022&lt;br /&gt;
*Rauno Põlluäär, [http://hdl.handle.net/10062/72672 Veebirakendus-põhine kasutajaliides avatud robotplatvormi Robotont juhtimiseks ja haldamiseks] [Web application-based user interface for controlling and managing open-source robotics platform Robotont], BS thesis, 2021&lt;br /&gt;
*Hendrik Olesk, [http://hdl.handle.net/10062/72664 Nägemisulatuses kaugjuhitava mobiilse robotmanipulaatori kasutajamugavuse tõstmine] [Improving the usability of a mobile manipulator robot for line-of-sight remote control], BS thesis, 2021&lt;br /&gt;
*Tarvi Tepandi, [http://hdl.handle.net/10062/72665 Segareaalsusel põhinev kasutajaliides mobiilse roboti kaugjuhtimiseks Microsoft HoloLens 2 vahendusel] [Mixed-reality user interface for teleoperating mobile robots with Microsoft HoloLens 2], BS thesis, 2021&lt;br /&gt;
*Rudolf Põldma, [http://hdl.handle.net/10062/72674 Tartu linna Narva maantee ringristmiku digikaksik] [Digital twin for Narva street roundabout in Tartu], BS thesis, 2021&lt;br /&gt;
*Kwasi Akuamoah Boateng, [http://hdl.handle.net/10062/72804 Digital Twin of a Teaching and Learning Robotics Lab] [Robotite õpetamise ja õppimise labori digitaalne kaksik], BS thesis, 2021&lt;br /&gt;
*Karina Sein, [http://hdl.handle.net/10062/72102 Eestikeelse kõnesünteesi võimaldamine robootika arendusplatvormil ROS] [Enabling Estonian speech synthesis on the Robot Operating System (ROS)], BS thesis, 2020&lt;br /&gt;
*Ranno Mäesepp, [http://hdl.handle.net/10062/72100 Takistuste vältimise lahendus õpperobotile Robotont] [Obstacle avoidance solution for educational robot platform Robotont], BS thesis, 2020&lt;br /&gt;
*Igor Rybalskii, [http://hdl.handle.net/10062/72060 Gesture Detection Software for Human-Robot Collaboration] [Žestituvastus tarkvara inimese ja roboti koostööks], BS thesis, 2020&lt;br /&gt;
*Meelis Pihlap, [http://hdl.handle.net/10062/64292 Mitme roboti koostöö funktsionaalsuste väljatöötamine tarkvararaamistikule TeMoto] [Multi-robot collaboration functionalities for robot software development framework TeMoto], BS thesis, 2019&lt;br /&gt;
*Kaarel Mark, [http://hdl.handle.net/10062/64290 Liitreaalsuse kasutamine tootmisprotsessis asukohtade määramisel] [Augmented reality for location determination in manufacturing], BS thesis, 2019&lt;br /&gt;
*Kätriin Julle, [http://hdl.handle.net/10062/64279 Roboti KUKA youBot riistvara ja ROS-tarkvara uuendamine] [Upgrading robot KUKA youBot’s hardware and ROS-software], BS thesis, 2019&lt;br /&gt;
*Georg Astok, [http://hdl.handle.net/10062/64274 Roboti juhtimine virtuaalreaalsuses kasutades ROS-raamistikku] [Creating virtual reality user interface using only ROS framework], BS thesis, 2019&lt;br /&gt;
*Martin Hallist, [http://hdl.handle.net/10062/64275 Robotipõhine kaugkohalolu käte liigutuste ülekandmiseks] [Teleoperation robot for arms motions], BS thesis, 2019&lt;br /&gt;
*Ahmed Hassan Helmy Mohamed, [http://hdl.handle.net/10062/63946 Software integration of autonomous robot system for mixing and serving drinks] [Jooke valmistava ja serveeriva robotsüsteemi tarkvaralahendus], BS thesis, 2019&lt;br /&gt;
*Kristo Allaje, [http://hdl.handle.net/10062/60294 Draiveripakett käte jälgimise seadme Leap Motion™ kontroller kasutamiseks robootika arendusplatvormil ROS] [Driver package for using Leap Motion™ controller on the robotics development platform ROS], BS thesis, 2018&lt;br /&gt;
*Martin Maidla, [http://hdl.handle.net/10062/60288 Avatud robotiarendusplatvormi Robotont omniliikumise ja odomeetria arendamine] [Omnimotion and odometry development for open robot development platform Robotont], BS thesis, 2018&lt;br /&gt;
*Raid Vellerind, [http://hdl.handle.net/10062/56559 Avatud robotiarendusplatvormi ROS võimekuse loomine Tartu Ülikooli Robotexi robootikaplatvormile] [ROS driver development for the University of Tartu’s Robotex robotics platform], BS thesis, 2017&lt;br /&gt;
&lt;br /&gt;
[[Category:Theses Topics]]&lt;/div&gt;</summary>
		<author><name>Karl</name></author>
	</entry>
	<entry>
		<id>https://ims.ut.ee/index.php?title=User:Karl&amp;diff=38669</id>
		<title>User:Karl</title>
		<link rel="alternate" type="text/html" href="https://ims.ut.ee/index.php?title=User:Karl&amp;diff=38669"/>
		<updated>2024-06-21T13:18:04Z</updated>

		<summary type="html">&lt;p&gt;Karl: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{{UserProfile | &lt;br /&gt;
fullname=Karl Kruusamäe |&lt;br /&gt;
picture=Karl_k_2019.jpg |&lt;br /&gt;
email=karl.kruusamae@ut.ee |&lt;br /&gt;
mobile=+372 5886 3299 |&lt;br /&gt;
orcid=0000-0002-1720-1509 |&lt;br /&gt;
gscholar=https://scholar.google.com/citations?user=aZlNwwoAAAAJ |&lt;br /&gt;
etis=https://www.etis.ee/cv/Karl_Kruusamae |&lt;br /&gt;
other=&lt;br /&gt;
{{UserProfileItem | LinkedIn | [https://www.linkedin.com/in/karl-kruusamae Karl Kruusamäe] }}&lt;br /&gt;
&lt;br /&gt;
{{UserProfileItem | Github | &lt;br /&gt;
* [https://github.com/ut-ims-robotics ut-ims-robotics] &lt;br /&gt;
* [https://github.com/robotont robotont]&lt;br /&gt;
* [https://github.com/temoto-framework TeMoto Framework] &lt;br /&gt;
* [https://github.com/kruusamae kruusamae] &lt;br /&gt;
}}&lt;br /&gt;
&lt;br /&gt;
== Theses topics for students ==&lt;br /&gt;
* [[Theses in Robotics|Theses and project topics for BSc and MSc level students]]&lt;br /&gt;
}}&lt;br /&gt;
&lt;br /&gt;
== Publications ==&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Journal Articles ===&lt;br /&gt;
&lt;br /&gt;
* Krumins, D., Schumann, S., Vunder, V., Polluaar, R., Laht, K., Raudmae, R., Aabloo, A., Kruusamae, K. (2024) [http://doi.org/10.1109/TLT.2024.3381858 &amp;lt;nowiki&amp;gt;Open Remote Web Lab for Learning Robotics and ROS with Physical and Simulated Robots in an Authentic Developer Environment&amp;lt;/nowiki&amp;gt;], ''IEEE Transactions on Learning Technologies''. [http://doi.org/10.1109/TLT.2024.3381858 http://doi.org/10.1109/TLT.2024.3381858]&lt;br /&gt;
* Masnavi, H., Shrestha, J., Kruusamae, K., Singh, A.K. (2023) [http://doi.org/10.1109/LRA.2023.3312969 &amp;lt;nowiki&amp;gt;VACNA: Visibility-Aware Cooperative Navigation With Application in Inventory Management&amp;lt;/nowiki&amp;gt;], ''IEEE Robotics and Automation Letters''. [http://doi.org/10.1109/LRA.2023.3312969 http://doi.org/10.1109/LRA.2023.3312969]&lt;br /&gt;
* Raudmäe, R., Schumann, S., Vunder, V., Oidekivi, M., Nigol, M.K., Valner, R., Masnavi, H., Singh, A.K., Aabloo, A., Kruusamäe, K. (2023) [http://doi.org/10.1016/j.ohx.2023.e00436 &amp;lt;nowiki&amp;gt;ROBOTONT – Open-source and ROS-supported omnidirectional mobile robot for education and research&amp;lt;/nowiki&amp;gt;], ''HardwareX''. [http://doi.org/10.1016/j.ohx.2023.e00436 http://doi.org/10.1016/j.ohx.2023.e00436]&lt;br /&gt;
* Valner, R., Vunder, V., Aabloo, A., Pryor, M., Kruusamae, K. (2022) [http://doi.org/10.1109/ACCESS.2022.3173647 &amp;lt;nowiki&amp;gt;TeMoto: A Software Framework for Adaptive and Dependable Robotic Autonomy With Dynamic Resource Management&amp;lt;/nowiki&amp;gt;], ''IEEE Access''. [http://doi.org/10.1109/ACCESS.2022.3173647 http://doi.org/10.1109/ACCESS.2022.3173647]&lt;br /&gt;
* Masnavi, H., Shrestha, J., Mishra, M., Sujit, P.B., Kruusamae, K., Singh, A.K. (2022) [http://doi.org/10.1109/LRA.2022.3190087 &amp;lt;nowiki&amp;gt;Visibility-Aware Navigation With Batch Projection Augmented Cross-Entropy Method Over a Learned Occlusion Cost&amp;lt;/nowiki&amp;gt;], ''IEEE Robotics and Automation Letters''. [http://doi.org/10.1109/LRA.2022.3190087 http://doi.org/10.1109/LRA.2022.3190087]&lt;br /&gt;
* Masnavi, H., Adajania, V.K., Kruusamae, K., Singh, A.K. (2022) [http://doi.org/10.1109/ACCESS.2022.3157977 &amp;lt;nowiki&amp;gt;Real-Time Multi-Convex Model Predictive Control for Occlusion-Free Target Tracking with Quadrotors&amp;lt;/nowiki&amp;gt;], ''IEEE Access''. [http://doi.org/10.1109/ACCESS.2022.3157977 http://doi.org/10.1109/ACCESS.2022.3157977]&lt;br /&gt;
* Shashank Srikanth, Mithun Babu, Houman Masnavi, Arun Kumar Singh, Karl Kruusamäe, Krishnan Madhava Krishna (2022) [https://doi.org/10.3390/s22082995 &amp;lt;nowiki&amp;gt;Fast Adaptation of Manipulator Trajectories to Task Perturbation by Differentiating through the Optimal Solution&amp;lt;/nowiki&amp;gt;], ''Sensors''. [https://doi.org/10.3390/s22082995 https://doi.org/10.3390/s22082995]&lt;br /&gt;
* Valner, R., Wanna, S., Kruusamäe, K., Pryor, M. (2022) [http://doi.org/10.1145/3522580 &amp;lt;nowiki&amp;gt;Unified Meaning Representation Format (UMRF)-A Task Description and Execution Formalism for HRI&amp;lt;/nowiki&amp;gt;], ''ACM Transactions on Human-Robot Interaction''. [http://doi.org/10.1145/3522580 http://doi.org/10.1145/3522580]&lt;br /&gt;
* Robert Valner, Houman Masnavi, Igor Rybalskii, Rauno Põlluäär, Erik Kõiv, Alvo Aabloo, Karl Kruusamäe, Arun Kumar Singh (2022) [http://doi.org/10.3389/frobt.2022.922835 &amp;lt;nowiki&amp;gt;Scalable and heterogenous mobile robot fleet-based task automation in crowded hospital environments—a field test&amp;lt;/nowiki&amp;gt;], ''Frontiers in Robotics and AI''. [http://doi.org/10.3389/frobt.2022.922835 http://doi.org/10.3389/frobt.2022.922835]&lt;br /&gt;
* Rastgar, F., Masnavi, H., Shrestha, J., Kruusamae, K., Aabloo, A., Singh, A.K. (2021) [http://doi.org/10.1109/LRA.2021.3061398 &amp;lt;nowiki&amp;gt;GPU Accelerated Convex Approximations for Fast Multi-Agent Trajectory Optimization&amp;lt;/nowiki&amp;gt;], ''IEEE Robotics and Automation Letters''. [http://doi.org/10.1109/LRA.2021.3061398 http://doi.org/10.1109/LRA.2021.3061398]&lt;br /&gt;
* Karl Kruusamäe (2021) [https://www.vikerkaar.ee/archives/27646 &amp;lt;nowiki&amp;gt;En attendant Robot&amp;lt;/nowiki&amp;gt;], ''Vikerkaar''.&lt;br /&gt;
* Valner, R., Dydynski, J.M., Cho, S., Kruusamäe, K. (2021) [http://doi.org/10.1177/0018720820902293 &amp;lt;nowiki&amp;gt;Communication of Hazards in Mixed-Reality Telerobotic Systems: The Usage of Naturalistic Avoidance Cues in Driving Tasks&amp;lt;/nowiki&amp;gt;], ''Human Factors''. [http://doi.org/10.1177/0018720820902293 http://doi.org/10.1177/0018720820902293]&lt;br /&gt;
* Rinne, P., Põldsalu, I., Ratas, H.K., Kruusamäe, K., Johanson, U., Tamm, T., Põhako-Esko, K., Punning, A., Peikolainen, A.-L., Kaasik, F., Must, I., van den Ende, D., Aabloo, A. (2020) [http://doi.org/10.3791/61216 &amp;lt;nowiki&amp;gt;Fabrication of carbon-based ionic electromechanically active soft actuators&amp;lt;/nowiki&amp;gt;], ''Journal of Visualized Experiments''. [http://doi.org/10.3791/61216 http://doi.org/10.3791/61216]&lt;br /&gt;
* Valner, R., Kruusamäe, K., Pryor, M. (2018) [http://doi.org/10.3390/robotics7010009 &amp;lt;nowiki&amp;gt;TeMoto: Intuitive multi-range telerobotic system with natural gestural and verbal instruction interface&amp;lt;/nowiki&amp;gt;], ''Robotics''. [http://doi.org/10.3390/robotics7010009 http://doi.org/10.3390/robotics7010009]&lt;br /&gt;
* Zhu, Z., Horiuchi, T., Kruusamäe, K., Chang, L., Asaka, K. (2016) [http://doi.org/10.1021/acs.jpcb.5b12634 &amp;lt;nowiki&amp;gt;Influence of Ambient Humidity on the Voltage Response of Ionic Polymer-Metal Composite Sensor&amp;lt;/nowiki&amp;gt;], ''Journal of Physical Chemistry B''. [http://doi.org/10.1021/acs.jpcb.5b12634 http://doi.org/10.1021/acs.jpcb.5b12634]&lt;br /&gt;
* Zhu, Z., Horiuchi, T., Kruusamae, K., Chang, L., Asaka, K. (2016) [http://doi.org/10.1088/0964-1726/25/5/055024 &amp;lt;nowiki&amp;gt;The effect of ambient humidity on the electrical response of ion-migration-based polymer sensor with various cations&amp;lt;/nowiki&amp;gt;], ''Smart Materials and Structures''. [http://doi.org/10.1088/0964-1726/25/5/055024 http://doi.org/10.1088/0964-1726/25/5/055024]&lt;br /&gt;
* Horiuchi, T., Kruusamäe, K., Zhu, Z., Asaka, K. (2015) [http://doi.org/10.1016/j.sna.2015.09.034 &amp;lt;nowiki&amp;gt;Evaluating curvature and making picture-overlaid trajectory of motion of largely bent carbon nanotube composite bucky gel actuator using camera measurement system&amp;lt;/nowiki&amp;gt;], ''Sensors and Actuators, A: Physical''. [http://doi.org/10.1016/j.sna.2015.09.034 http://doi.org/10.1016/j.sna.2015.09.034]&lt;br /&gt;
* Karl Kruusamae, Ken Mukai, Takushi Sugino, Kinji Asaka (2015) [http://doi.org/10.1109/tmech.2014.2362917 &amp;lt;nowiki&amp;gt;Electroactive Shape-Fixing of Bucky-Gel Actuators&amp;lt;/nowiki&amp;gt;], ''IEEE/ASME Transactions on Mechatronics''. [http://doi.org/10.1109/tmech.2014.2362917 http://doi.org/10.1109/tmech.2014.2362917]&lt;br /&gt;
* Karl Kruusamäe, Andres Punning, Alvo Aabloo, Kinji Asaka (2015) [http://doi.org/10.3390/act4010017 &amp;lt;nowiki&amp;gt;Self-Sensing Ionic Polymer Actuators: A Review&amp;lt;/nowiki&amp;gt;], ''Actuators''. [http://doi.org/10.3390/act4010017 http://doi.org/10.3390/act4010017]&lt;br /&gt;
* Kruusamäe, K., Sugino, T., Asaka, K. (2015) [http://doi.org/10.1063/1.4923351 &amp;lt;nowiki&amp;gt;Ionic and viscoelastic mechanisms of a bucky-gel actuator&amp;lt;/nowiki&amp;gt;], ''Journal of Applied Physics''. [http://doi.org/10.1063/1.4923351 http://doi.org/10.1063/1.4923351]&lt;br /&gt;
* Karl Kruusamäe, Ken Mukai, Takushi Sugino, Kinji Asaka (2014) [http://doi.org/10.1177/1045389x14538538 &amp;lt;nowiki&amp;gt;Impact of viscoelastic properties on bucky-gel actuator performance&amp;lt;/nowiki&amp;gt;], ''Journal of Intelligent Material Systems and Structures''. [http://doi.org/10.1177/1045389x14538538 http://doi.org/10.1177/1045389x14538538]&lt;br /&gt;
* Punning, A., Kim, K.J., Palmre, V., Vidal, F., Plesse, C., Festin, N., Maziz, A., Asaka, K., Sugino, T., Alici, G., Spinks, G., Wallace, G., Must, I., Põldsalu, I., Vunder, V., Temmer, R., Kruusamäe, K., Torop, J., Kaasik, F., Rinne, P., Johanson, U., Peikolainen, A.-L., Tamm, T., Aabloo, A. (2014) [http://doi.org/10.1038/srep06913 &amp;lt;nowiki&amp;gt;Ionic electroactive polymer artificial muscles in space applications&amp;lt;/nowiki&amp;gt;], ''Scientific Reports''. [http://doi.org/10.1038/srep06913 http://doi.org/10.1038/srep06913]&lt;br /&gt;
* Kruusamäe, K., Mukai, K., Sugino, T., Asaka, K. (2014) [http://doi.org/10.1088/0964-1726/23/2/025031 &amp;lt;nowiki&amp;gt;Mechanical behaviour of bending bucky-gel actuators and its representation&amp;lt;/nowiki&amp;gt;], ''Smart Materials and Structures''. [http://doi.org/10.1088/0964-1726/23/2/025031 http://doi.org/10.1088/0964-1726/23/2/025031]&lt;br /&gt;
* Andres Punning, Indrek Must, Inga Põldsalu, Veiko Vunder, Rauno Temmer, Karl Kruusamäe, Friedrich Kaasik, Janno Torop, Pille Rinne, Tõnis Lulla, Urmas Johanson, Tarmo Tamm, Alvo Aabloo (2014) [http://doi.org/10.1177/1045389x14546656 &amp;lt;nowiki&amp;gt;Lifetime measurements of ionic electroactive polymer actuators&amp;lt;/nowiki&amp;gt;], ''Journal of Intelligent Material Systems and Structures''. [http://doi.org/10.1177/1045389x14546656 http://doi.org/10.1177/1045389x14546656]&lt;br /&gt;
* Einike Pilli, Marek Sammul, Piia Post, Ülle Aasjõe, Karl Kruusamäe (2013) [http://doi.org/10.12697/eha.2013.1.08 &amp;lt;nowiki&amp;gt;Eesti kõrgkoolide esmakursuslaste õpi- ja teadmuskäsitus&amp;lt;/nowiki&amp;gt;], ''Eesti Haridusteaduste Ajakiri = Estonian Journal of Education''. [http://doi.org/10.12697/eha.2013.1.08 http://doi.org/10.12697/eha.2013.1.08]&lt;br /&gt;
* Kruusamäe, K., Punning, A., Aabloo, A. (2012) [http://doi.org/10.3390/s120201950 &amp;lt;nowiki&amp;gt;Electrical model of a carbon-polymer composite (CPC) collision detector&amp;lt;/nowiki&amp;gt;], ''Sensors''. [http://doi.org/10.3390/s120201950 http://doi.org/10.3390/s120201950]&lt;br /&gt;
* Kruusamäe, K., Brunetto, P., Punning, A., Kodu, M., Jaaniso, R., Graziani, S., Fortuna, L., Aabloo, A. (2011) [http://doi.org/10.1088/0964-1726/20/12/124001 &amp;lt;nowiki&amp;gt;Electromechanical model for a self-sensing ionic polymer-metal composite actuating device with patterned surface electrodes&amp;lt;/nowiki&amp;gt;], ''Smart Materials and Structures''. [http://doi.org/10.1088/0964-1726/20/12/124001 http://doi.org/10.1088/0964-1726/20/12/124001]&lt;br /&gt;
* Kruusamäe, K., Brunetto, P., Graziani, S., Punning, A., Di Pasquale, G., Aabloo, A. (2010) [http://doi.org/10.1002/pi.2752 &amp;lt;nowiki&amp;gt;Self-sensing ionic polymer-metal composite actuating device with patterned surface electrodes&amp;lt;/nowiki&amp;gt;], ''Polymer International''. [http://doi.org/10.1002/pi.2752 http://doi.org/10.1002/pi.2752]&lt;br /&gt;
&lt;br /&gt;
=== Conference Papers ===&lt;br /&gt;
&lt;br /&gt;
* Schumann, S., Krūmiņš, D., Vunder, V., Aabloo, A., Siiman, L.A., Kruusamäe, K. (2023) [http://doi.org/10.1007/978-3-031-38454-7_24 &amp;lt;nowiki&amp;gt;A Beginner-Level MOOC on ROS Robotics Leveraging a Remote Web Lab for Programming Physical Robots&amp;lt;/nowiki&amp;gt;], ''Lecture Notes in Networks and Systems''. [http://doi.org/10.1007/978-3-031-38454-7_24 http://doi.org/10.1007/978-3-031-38454-7_24]&lt;br /&gt;
* Rastgar, F., Masnavi, H., Kruusamäe, K., Aabloo, A., Singh, A.K. (2023) [http://doi.org/10.23919/ACC55779.2023.10156088 &amp;lt;nowiki&amp;gt;GPU Accelerated Batch Trajectory Optimization for Autonomous Navigation&amp;lt;/nowiki&amp;gt;], ''Proceedings of the American Control Conference''. [http://doi.org/10.23919/ACC55779.2023.10156088 http://doi.org/10.23919/ACC55779.2023.10156088]&lt;br /&gt;
* Adajania, V.K., Masnavi, H., Rastgar, F., Kruusamae, K., Singh, A.K. (2021) [http://doi.org/10.1109/IROS51168.2021.9636337 &amp;lt;nowiki&amp;gt;Embedded Hardware Appropriate Fast 3D Trajectory Optimization for Fixed Wing Aerial Vehicles by Leveraging Hidden Convex Structures&amp;lt;/nowiki&amp;gt;], ''IEEE International Conference on Intelligent Robots and Systems''. [http://doi.org/10.1109/IROS51168.2021.9636337 http://doi.org/10.1109/IROS51168.2021.9636337]&lt;br /&gt;
* Oidekivi, M., Nolte, A., Aabloo, A., Kruusamae, K. (2021) [http://doi.org/10.1109/HSI52170.2021.9538693 &amp;lt;nowiki&amp;gt;Interpreting externally expressed intentions of an autonomous vehicle&amp;lt;/nowiki&amp;gt;], ''International Conference on Human System Interaction, HSI''. [http://doi.org/10.1109/HSI52170.2021.9538693 http://doi.org/10.1109/HSI52170.2021.9538693]&lt;br /&gt;
* Oidekivi, M., Nolte, A., Aabloo, A., Kruusamae, K. (2021) [http://doi.org/10.1109/HSI52170.2021.9538774 &amp;lt;nowiki&amp;gt;Identifying emotions from facial expression displays of robots - Results from a survey study&amp;lt;/nowiki&amp;gt;], ''International Conference on Human System Interaction, HSI''. [http://doi.org/10.1109/HSI52170.2021.9538774 http://doi.org/10.1109/HSI52170.2021.9538774]&lt;br /&gt;
* Aschenbrenner, D., Rieder, J.S.I., Van Tol, D., Van Dam, J., Rusak, Z., Blech, J.O., Azangoo, M., Panu, S., Kruusamäe, K., Masnavi, H., Rybalskii, I., Aabloo, A., Petry, M., Teixeira, G., Thiede, B., Pedrazzoli, P., Ferrario, A., Foletti, M., Confalonieri, M., Bertaggia, D., Togias, T., Makris, S. (2020) [http://doi.org/10.1109/AIVR50618.2020.00017 &amp;lt;nowiki&amp;gt;Mirrorlabs - Creating accessible Digital Twins of robotic production environment with Mixed Reality&amp;lt;/nowiki&amp;gt;], ''Proceedings - 2020 IEEE International Conference on Artificial Intelligence and Virtual Reality, AIVR 2020''. [http://doi.org/10.1109/AIVR50618.2020.00017 http://doi.org/10.1109/AIVR50618.2020.00017]&lt;br /&gt;
* Veiko Vunder, Robert Valner, Conor McMahon, Karl Kruusamae, Mitch Pryor (2018) [http://doi.org/10.1109/hsi.2018.8431062 &amp;lt;nowiki&amp;gt;Improved Situational Awareness in ROS Using Panospheric Vision and Virtual Reality&amp;lt;/nowiki&amp;gt;], ''2018 11th International Conference on Human System Interaction (HSI)''. [http://doi.org/10.1109/hsi.2018.8431062 http://doi.org/10.1109/hsi.2018.8431062]&lt;br /&gt;
* Robert Valner, Veiko Vunder, Andy Zelenak, Mitch Pryor, Alvo Aabloo, Karl Kruusamäe (2018) [https://robotics.estec.esa.int/i-SAIRAS/isairas2018/Papers/Session%205b/1_valner_isairas2018_final_inline-50-32-Kruusam%C3%A4e-Karl.pdf &amp;lt;nowiki&amp;gt;Intuitive “Human-on-the-Loop” Interface for Tele-Operating Remote Mobile Manipulator Robots&amp;lt;/nowiki&amp;gt;], ''Proceedings of i-SAIRAS 2018''.&lt;br /&gt;
* Robert Valner, Veiko Vunder, Karl Kruusamäe, Mitchell W. Pryor, Andy Zelenak (2018) [https://www.xcdsystem.com/wmsym/2018/pdfs/FinalPaper_18438_0119112319.pdf &amp;lt;nowiki&amp;gt;TeMoto 2.0: Source Agnostic Command-to-Task Architecture Enabling Increased Autonomy in Remote Systems&amp;lt;/nowiki&amp;gt;], ''Proceedings of Waste Management Symposia 2018 (WM2018)''.&lt;br /&gt;
* Andrew Sharp, Karl Kruusamae, Ben Ebersole, Mitch Pryor (2017) [http://doi.org/10.1109/arso.2017.8025195 &amp;lt;nowiki&amp;gt;Semiautonomous dual-arm mobile manipulator system with intuitive supervisory user interfaces&amp;lt;/nowiki&amp;gt;], ''2017 IEEE Workshop on Advanced Robotics and its Social Impacts (ARSO)''. [http://doi.org/10.1109/arso.2017.8025195 http://doi.org/10.1109/arso.2017.8025195]&lt;br /&gt;
* Karl Kruusamae, Mitch Pryor (2016) [http://doi.org/10.1109/hsi.2016.7529630 &amp;lt;nowiki&amp;gt;High-precision telerobot with human-centered variable perspective and scalable gestural interface&amp;lt;/nowiki&amp;gt;], ''2016 9th International Conference on Human System Interactions (HSI)''. [http://doi.org/10.1109/hsi.2016.7529630 http://doi.org/10.1109/hsi.2016.7529630]&lt;br /&gt;
* Kruusamäe, K., Thompson, J., Pryor, M. (2016) [http://www.scopus.com/inward/record.url?eid=2-s2.0-85018422425&amp;amp;amp;partnerID=MN8TOARS &amp;lt;nowiki&amp;gt;Spatially-Mapped Human-Robot interface for teleoperation of high-precision tasks&amp;lt;/nowiki&amp;gt;], ''D and RS 2016 - Decommissioning and Remote Systems''.&lt;br /&gt;
* Kruusamäe, K., Sugino, T., Asaka, K. (2015) [http://doi.org/10.1117/12.2083812 &amp;lt;nowiki&amp;gt;Measuring blocking force to interpret ionic mechanisms within bucky-gel actuators&amp;lt;/nowiki&amp;gt;], ''Proceedings of SPIE - The International Society for Optical Engineering''. [http://doi.org/10.1117/12.2083812 http://doi.org/10.1117/12.2083812]&lt;br /&gt;
* Kruusamäe, K., Mukai, K., Sugino, T., Asaka, K. (2014) [http://doi.org/10.1117/12.2044745 &amp;lt;nowiki&amp;gt;The viscoelastic effect in bending bucky-gel actuators&amp;lt;/nowiki&amp;gt;], ''Proceedings of SPIE - The International Society for Optical Engineering''. [http://doi.org/10.1117/12.2044745 http://doi.org/10.1117/12.2044745]&lt;br /&gt;
* Karl Kruusamäe, Kinji Asaka (2013) Thermo- and electromechanical properties of bucky-gel bending actuators under external loading, ''第31回日本ロボット学会学術講演会 (The 31st Annual Conference of the Robotics Society of Japan)''.&lt;br /&gt;
* Kruusamäe, K., Kaasik, F., Punning, A., Aabloo, A. (2013) [http://doi.org/10.1117/12.2012021 &amp;lt;nowiki&amp;gt;Self-sensing ionic electromechanically active actuator with patterned carbon electrodes&amp;lt;/nowiki&amp;gt;], ''Proceedings of SPIE - The International Society for Optical Engineering''. [http://doi.org/10.1117/12.2012021 http://doi.org/10.1117/12.2012021]&lt;br /&gt;
* Juha Kontio, Patric Granholm, Heikki Valmu, Janne Mäntykoski, Karl Kruusamäe, Marija Aukstuoliene, Loreta Savulioniene, Peter Munkebo Hussmann, Kristina Edström (2012) Supporting Programme Development with Self‐ and Cross‐evaluations – Results from an International Quality Assurance Project, ''Proceedings of the International Conference on Engineering Education 2012''.&lt;br /&gt;
* Kruusamäe, K., Punning, A., Aabloo, A. (2011) [http://doi.org/10.1117/12.880386 &amp;lt;nowiki&amp;gt;Self-sensing properties of carbon-polymer composite (CPC) actuators&amp;lt;/nowiki&amp;gt;], ''Proceedings of SPIE - The International Society for Optical Engineering''. [http://doi.org/10.1117/12.880386 http://doi.org/10.1117/12.880386]&lt;br /&gt;
* Kruusamäe, K., Brunetto, P., Graziani, S., Fortuna, L., Kodu, M., Jaaniso, R., Punning, A., Aabloo, A. (2010) [http://doi.org/10.1117/12.847503 &amp;lt;nowiki&amp;gt;Experiments with self-sensing IPMC actuating device&amp;lt;/nowiki&amp;gt;], ''Proceedings of SPIE - The International Society for Optical Engineering''. [http://doi.org/10.1117/12.847503 http://doi.org/10.1117/12.847503]&lt;br /&gt;
* Kruusamäe, K., Punning, A., Kruusmaa, M., Aabloo, A. (2009) [http://doi.org/10.1117/12.815642 &amp;lt;nowiki&amp;gt;Dynamical variation of the impedances of IPMC&amp;lt;/nowiki&amp;gt;], ''Proceedings of SPIE - The International Society for Optical Engineering''. [http://doi.org/10.1117/12.815642 http://doi.org/10.1117/12.815642]&lt;br /&gt;
&lt;br /&gt;
=== Other ===&lt;br /&gt;
&lt;br /&gt;
* Justin W. Hart, Nick DePalma, Mitchell W. Pryor, Bradley Hayes, Karl Kruusamäe, Reuth Mirsky, Xuesu Xiao (2021) [http://doi.org/10.1145/3434074.3444882 &amp;lt;nowiki&amp;gt;Exploring Applications for Autonomous Nonverbal Human-Robot Interaction&amp;lt;/nowiki&amp;gt;], ''Companion of the 2021 ACM/IEEE International Conference on Human-Robot Interaction''. [http://doi.org/10.1145/3434074.3444882 http://doi.org/10.1145/3434074.3444882]&lt;br /&gt;
* Fatemeh Rastgar, Arun Kumar Singh, Houman Masnavi, Karl Kruusamae, Alvo Aabloo (2020) [http://doi.org/10.1109/iros45743.2020.9341566 &amp;lt;nowiki&amp;gt;A Novel Trajectory Optimization for Affine Systems: Beyond Convex-Concave Procedure&amp;lt;/nowiki&amp;gt;], ''2020 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)''. [http://doi.org/10.1109/iros45743.2020.9341566 http://doi.org/10.1109/iros45743.2020.9341566]&lt;br /&gt;
* Kinji Asaka, Karl Kruusamäe, Kwang Kim, Viljar Palmre, Kam K. Leang (2016) [http://doi.org/10.1007/978-3-319-31530-0_10 &amp;lt;nowiki&amp;gt;IPMCs as EAPs: How to Start Experimenting with Them&amp;lt;/nowiki&amp;gt;], ''Electromechanically Active Polymers''. [http://doi.org/10.1007/978-3-319-31530-0_10 http://doi.org/10.1007/978-3-319-31530-0_10]&lt;br /&gt;
* Karl Kruusamäe (2011) [http://www.digar.ee/id/nlib-digar:240087 &amp;lt;nowiki&amp;gt;Tehislihased: ajamid mikrorobotitele, kuid mitte ainult&amp;lt;/nowiki&amp;gt;], ''Eesti Füüsika Seltsi aastaraamat''.&lt;br /&gt;
&lt;br /&gt;
== Theses ==&lt;br /&gt;
&amp;lt;!--&lt;br /&gt;
=== Proceedings and book chapters ===&lt;br /&gt;
&amp;lt;ol&amp;gt;&lt;br /&gt;
&amp;lt;li value=&amp;quot;17&amp;quot;&amp;gt;Veiko Vunder, Robert Valner, Conor McMahon, &amp;lt;u&amp;gt;Karl Kruusamäe&amp;lt;/u&amp;gt;, Mitch Pryor (2018) {{doi-inline|10.1109/HSI.2018.8431062|Improved Situational Awareness in ROS using Panospheric Vision and Virtual Reality}}, ''2018 11th International Conference on Human System Interaction (HSI)'', 471 - 477.&lt;br /&gt;
&amp;lt;li value=&amp;quot;16&amp;quot;&amp;gt;Robert Valner, Veiko Vunder, Andy Zelenak, Mitch Pryor, Alvo Aabloo, &amp;lt;u&amp;gt;Karl Kruusamäe&amp;lt;/u&amp;gt; (2018) Intuitive “human-on-the-loop” interface for tele-operating remote mobile manipulator robots, ''International Symposium on Artificial Intelligence, Robotics and Automation in Space (i-SAIRAS 2018)'', accepted in April 2018.&lt;br /&gt;
&amp;lt;li value=&amp;quot;15&amp;quot;&amp;gt;Robert Valner, Veiko Vunder, &amp;lt;u&amp;gt;Karl Kruusamäe&amp;lt;/u&amp;gt;, Mitchell W. Pryor, Andy Zelenak (2018) [http://www.xcdsystem.com/wmsym/2018/pdfs/FinalPaper_18438_0119112319.pdf TeMoto 2.0: Source Agnostic Command-to-Task Architecture Enabling Increased Autonomy in Remote Systems], ''[https://www.xcdsystem.com/wmsym/2018/ Proceedings of Waste Management Symposia 2018]'', 18438. &lt;br /&gt;
&amp;lt;li value=&amp;quot;14&amp;quot;&amp;gt;Andrew Sharp, &amp;lt;u&amp;gt;Karl Kruusamäe&amp;lt;/u&amp;gt;, Ben Ebersole, Mitch Pryor (2017) {{doi-inline|10.1109/ARSO.2017.8025195|Semiautonomous dual-arm mobile manipulator system with intuitive supervisory user interfaces}}, ''2017 IEEE Workshop on Advanced Robotics and its Social Impacts (ARSO)''.&lt;br /&gt;
&amp;lt;li value=&amp;quot;13&amp;quot;&amp;gt;Kinji Asaka, &amp;lt;u&amp;gt;Karl Kruusamäe&amp;lt;/u&amp;gt;, Kwang Kim, Viljar Palmre, Kam K. Leang (2016) {{doi-inline|10.1007/978-3-319-31530-0_10|IPMCs as EAPs: How to Start Experimenting with Them}}, ''Electromechanically Active Polymers: A Concise Reference'', 215-233.&lt;br /&gt;
&amp;lt;li value=&amp;quot;12&amp;quot;&amp;gt;&amp;lt;u&amp;gt;Karl Kruusamäe&amp;lt;/u&amp;gt;, Jack Thompson, Mitch Pryor (2016) Spatially-Mapped Human-Robot Interface for Teleoperation of High-Precision Tasks, ''Decommissioning and Remote System (D&amp;amp;RS 2016)'', 239-243.&lt;br /&gt;
&amp;lt;li value=&amp;quot;11&amp;quot;&amp;gt;&amp;lt;u&amp;gt;Karl Kruusamäe&amp;lt;/u&amp;gt;, Mitch Pryor (2016) {{doi-inline|10.1109/HSI.2016.7529630|High-precision telerobot with human-centered variable perspective and scalable gestural interface}}, ''2016 9th International Conference on Human System Interactions (HSI)'', 190-196.&lt;br /&gt;
&amp;lt;li value=&amp;quot;10&amp;quot;&amp;gt;&amp;lt;u&amp;gt;Karl Kruusamäe&amp;lt;/u&amp;gt;, Takushi Sugino, Kinji Asaka (2015) {{doi-inline|10.1117/12.2083812|Measuring blocking force to interpret ionic mechanisms within bucky-gel actuators}}, ''Proc. SPIE'' '''9430''', 94300P.&lt;br /&gt;
&amp;lt;li value=&amp;quot;9&amp;quot;&amp;gt;&amp;lt;u&amp;gt;Karl Kruusamäe&amp;lt;/u&amp;gt;, Ken Mukai, Takushi Sugino, Kinji Asaka (2014) {{doi-inline|10.1117/12.2044745|The viscoelastic effect in bending bucky-gel actuators}}, ''Proc. SPIE'' '''9056''', 90560G.&lt;br /&gt;
&amp;lt;li value=&amp;quot;8&amp;quot;&amp;gt;&amp;lt;u&amp;gt;Karl Kruusamäe&amp;lt;/u&amp;gt;, Kinji Asaka (2013) Thermo- and electromechanical properties of bucky-gel bending actuators under external loading, ''第31回日本ロボット学会学術講演会 (The 31st Annual Conference of the Robotic Society of Japan)'', RSJ2013AC2L2-01.&lt;br /&gt;
&amp;lt;li value=&amp;quot;7&amp;quot;&amp;gt;&amp;lt;u&amp;gt;Karl Kruusamäe&amp;lt;/u&amp;gt;, Friedrich Kaasik, Andres Punning and Alvo Aabloo (2013) {{doi-inline|10.1117/12.2012021|Self-sensing ionic electromechanically active actuator with patterned carbon electrodes}}, ''Proc. SPIE'' '''8687''', 868706.&lt;br /&gt;
&amp;lt;li value=&amp;quot;6&amp;quot;&amp;gt;Juha Kontio, Patric Granholm, Heikki Valmu, Janne Mäntykoski, &amp;lt;u&amp;gt;Karl Kruusamäe&amp;lt;/u&amp;gt;, Marija Aukstuoliene, Loreta Savulioniene, Peter Munkebo Hussmann, Kristina Edström (2012) Supporting programme development with self‐ and cross‐evaluations – results from an international quality assurance project, ''[http://www.ineer.org/Events/ICEE2012/isbn9789522163110.pdf Proceedings of International Conference on Engineering Education 2012 (ICEE 2012)]'', 816 - 823.&lt;br /&gt;
&amp;lt;li value=&amp;quot;5&amp;quot;&amp;gt;&amp;lt;u&amp;gt;Karl Kruusamäe&amp;lt;/u&amp;gt; (2012) [[Media:Kruusamae_efsa2011_lk60-72.pdf|Tehislihased: ajamid mikrorobotitele, kuid mitte ainult]], ''[http://www.fyysika.ee/doc/aastaraamat2011.pdf Eesti Füüsika Seltsi aastaraamat 2011, XXII aastakäik]'', 60-72.&lt;br /&gt;
&amp;lt;li value=&amp;quot;4&amp;quot;&amp;gt;Indrek Must, &amp;lt;u&amp;gt;Karl Kruusamäe&amp;lt;/u&amp;gt;, Urmas Johanson, Tarmo Tamm, Andres Punning, Alvo Aabloo (2011-2012) [https://www.taylorfrancis.com/books/9781482266894/chapters/10.1201%2F9781482266894-16 Characterisation of electromechanically active polymers using electrochemical impedance spectroscopy], ''{{doi-inline|10.1201%2F9781482266894 Lecture Notes on Impedance Spectroscopy: Measurement, Modeling and Applications}}'' '''2''', Olfa Kanoun (Ed.), CRC Press / Taylor &amp;amp; Francis Group, pp 113-121&lt;br /&gt;
&amp;lt;li value=&amp;quot;3&amp;quot;&amp;gt;&amp;lt;u&amp;gt;Karl Kruusamäe&amp;lt;/u&amp;gt;, Andres Punning, Alvo Aabloo (2011) {{doi-inline|10.1117/12.880386|Self-sensing properties of carbon-polymer composite (CPC) actuators}}, ''Proc. SPIE'' '''7976''', 79760Q.&lt;br /&gt;
&amp;lt;li value=&amp;quot;2&amp;quot;&amp;gt;&amp;lt;u&amp;gt;Karl Kruusamäe&amp;lt;/u&amp;gt;, Paola Brunetto, Salvatore Graziani, Luigi Fortuna, Margus Kodu, Raivo Jaaniso, Andres Punning, Alvo Aabloo (2010) {{doi-inline|10.1117/12.847503|Experiments with self-sensing IPMC actuating device}}, ''Proc. SPIE'' '''7642''', 76420V.&lt;br /&gt;
&amp;lt;li value=&amp;quot;1&amp;quot;&amp;gt;&amp;lt;u&amp;gt;Karl Kruusamäe&amp;lt;/u&amp;gt;, Andres Punning, Maarja Kruusmaa, Alvo Aabloo (2009) {{doi-inline|10.1117/12.815642|Dynamical variation of the impedances of IPMC}}, ''Proc. SPIE'' '''7287''', 72870V.&lt;br /&gt;
&amp;lt;/ol&amp;gt;&lt;br /&gt;
--&amp;gt;&lt;br /&gt;
=== PhD thesis ===&lt;br /&gt;
*&amp;lt;u&amp;gt;Karl Kruusamäe&amp;lt;/u&amp;gt; (2012), [http://hdl.handle.net/10062/28066 Deformation-dependent electrode impedance of ionic electromechanically active polymers], Dissertationes physicae Universitatis Tartuensis 83 (128 p).&lt;br /&gt;
&lt;br /&gt;
=== MSc thesis ===&lt;br /&gt;
*&amp;lt;u&amp;gt;Karl Kruusamäe&amp;lt;/u&amp;gt; (2008), [[Media:Kruusamae_masterthesis.pdf|Isetundlik IPMC täitur]] (Self-sensing IPMC actuator), Master's thesis (30 ECTS, 47 pp), Faculty of Science and Technology, University of Tartu.&lt;br /&gt;
&lt;br /&gt;
== Videos ==&lt;br /&gt;
*[https://www.youtube.com/@unitartu-ims-robotics UniTartu IMS Robotics @ YouTube] &lt;br /&gt;
*Temoto [手元] shared autonomy telerobot agent on YouTube: [https://www.youtube.com/watch?v=L25HHFd00rc (1)]  [https://www.youtube.com/watch?v=3Tm95RDPDhc (2)]  [https://www.youtube.com/watch?v=B90hZZdKxwM (3)]&lt;br /&gt;
*Animation of the University of Tartu robot swarm [[Media:Ut bots 2010 divx.avi| (DivX, 40.9 MB)]] [http://www.youtube.com/watch?v=fgiGQjmEpt8 (YouTube)]&lt;br /&gt;
*Team Helina ja Püha Vaim 4000 [[Media:Pv4k_xvid.avi|(XviD, 21.6 MB)]] [http://www.youtube.com/watch?v=y3hvFm8js7w (YouTube)]&lt;br /&gt;
*Berit Talpsep's robotic sculpture &amp;quot;Superheterodüünvastuvõtt&amp;quot; (Superheterodyne Reception) [[Media:Superheterodyne_xvid.avi|(XviD, 21.7 MB)]] [http://www.youtube.com/watch?v=iha5UzRMeww (YouTube)]&lt;br /&gt;
*Team Helina ja Püha Vaim 3000 [[Media:Pyhavaim3k_xvid.avi|(XviD, 28.6 MB)]] [http://www.youtube.com/watch?v=AZLa-TfDKn0 (YouTube)]&lt;br /&gt;
&lt;br /&gt;
[[Image:Rising_longhorn_120dpi.png|center|400px]]&lt;br /&gt;
&lt;br /&gt;
[[Category:IMS-robotics]]&lt;br /&gt;
[[Category:PI]]&lt;/div&gt;</summary>
		<author><name>Karl</name></author>
	</entry>
	<entry>
		<id>https://ims.ut.ee/index.php?title=User:Karl&amp;diff=38668</id>
		<title>User:Karl</title>
		<link rel="alternate" type="text/html" href="https://ims.ut.ee/index.php?title=User:Karl&amp;diff=38668"/>
		<updated>2024-06-21T13:16:57Z</updated>

		<summary type="html">&lt;p&gt;Karl: /* Videos */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{{UserProfile | &lt;br /&gt;
fullname=Karl Kruusamäe |&lt;br /&gt;
picture=Karl_k_2019.jpg |&lt;br /&gt;
email=karl.kruusamae@ut.ee |&lt;br /&gt;
mobile=+372 5886 3299 |&lt;br /&gt;
skype=karl_kruusamae |&lt;br /&gt;
orcid=0000-0002-1720-1509 |&lt;br /&gt;
gscholar=https://scholar.google.com/citations?user=aZlNwwoAAAAJ |&lt;br /&gt;
etis=https://www.etis.ee/cv/Karl_Kruusamae |&lt;br /&gt;
other=&lt;br /&gt;
{{UserProfileItem | LinkedIn | [https://www.linkedin.com/in/karl-kruusamae Karl Kruusamäe] }}&lt;br /&gt;
&lt;br /&gt;
{{UserProfileItem | Github | &lt;br /&gt;
* [https://github.com/ut-ims-robotics ut-ims-robotics] &lt;br /&gt;
* [https://github.com/robotont robotont]&lt;br /&gt;
* [https://github.com/temoto-framework TeMoto Framework] &lt;br /&gt;
* [https://github.com/kruusamae kruusamae] &lt;br /&gt;
}}&lt;br /&gt;
&lt;br /&gt;
== Theses topics for students ==&lt;br /&gt;
* [[Theses in Robotics|Theses and project topics for BSc and MSc level students]]&lt;br /&gt;
}}&lt;br /&gt;
&lt;br /&gt;
== Publications ==&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Journal Articles ===&lt;br /&gt;
&lt;br /&gt;
* Krumins, D., Schumann, S., Vunder, V., Polluaar, R., Laht, K., Raudmae, R., Aabloo, A., Kruusamae, K. (2024) [http://doi.org/10.1109/TLT.2024.3381858 &amp;lt;nowiki&amp;gt;Open Remote Web Lab for Learning Robotics and ROS with Physical and Simulated Robots in an Authentic Developer Environment&amp;lt;/nowiki&amp;gt;], ''IEEE Transactions on Learning Technologies''. [http://doi.org/10.1109/TLT.2024.3381858 http://doi.org/10.1109/TLT.2024.3381858]&lt;br /&gt;
* Masnavi, H., Shrestha, J., Kruusamae, K., Singh, A.K. (2023) [http://doi.org/10.1109/LRA.2023.3312969 &amp;lt;nowiki&amp;gt;VACNA: Visibility-Aware Cooperative Navigation With Application in Inventory Management&amp;lt;/nowiki&amp;gt;], ''IEEE Robotics and Automation Letters''. [http://doi.org/10.1109/LRA.2023.3312969 http://doi.org/10.1109/LRA.2023.3312969]&lt;br /&gt;
* Raudmäe, R., Schumann, S., Vunder, V., Oidekivi, M., Nigol, M.K., Valner, R., Masnavi, H., Singh, A.K., Aabloo, A., Kruusamäe, K. (2023) [http://doi.org/10.1016/j.ohx.2023.e00436 &amp;lt;nowiki&amp;gt;ROBOTONT – Open-source and ROS-supported omnidirectional mobile robot for education and research&amp;lt;/nowiki&amp;gt;], ''HardwareX''. [http://doi.org/10.1016/j.ohx.2023.e00436 http://doi.org/10.1016/j.ohx.2023.e00436]&lt;br /&gt;
* Valner, R., Vunder, V., Aabloo, A., Pryor, M., Kruusamae, K. (2022) [http://doi.org/10.1109/ACCESS.2022.3173647 &amp;lt;nowiki&amp;gt;TeMoto: A Software Framework for Adaptive and Dependable Robotic Autonomy With Dynamic Resource Management&amp;lt;/nowiki&amp;gt;], ''IEEE Access''. [http://doi.org/10.1109/ACCESS.2022.3173647 http://doi.org/10.1109/ACCESS.2022.3173647]&lt;br /&gt;
* Masnavi, H., Shrestha, J., Mishra, M., Sujit, P.B., Kruusamae, K., Singh, A.K. (2022) [http://doi.org/10.1109/LRA.2022.3190087 &amp;lt;nowiki&amp;gt;Visibility-Aware Navigation With Batch Projection Augmented Cross-Entropy Method Over a Learned Occlusion Cost&amp;lt;/nowiki&amp;gt;], ''IEEE Robotics and Automation Letters''. [http://doi.org/10.1109/LRA.2022.3190087 http://doi.org/10.1109/LRA.2022.3190087]&lt;br /&gt;
* Masnavi, H., Adajania, V.K., Kruusamae, K., Singh, A.K. (2022) [http://doi.org/10.1109/ACCESS.2022.3157977 &amp;lt;nowiki&amp;gt;Real-Time Multi-Convex Model Predictive Control for Occlusion-Free Target Tracking with Quadrotors&amp;lt;/nowiki&amp;gt;], ''IEEE Access''. [http://doi.org/10.1109/ACCESS.2022.3157977 http://doi.org/10.1109/ACCESS.2022.3157977]&lt;br /&gt;
* Shashank Srikanth, Mithun Babu, Houman Masnavi, Arun Kumar Singh, Karl Kruusamäe, Krishnan Madhava Krishna (2022) [https://doi.org/10.3390/s22082995 &amp;lt;nowiki&amp;gt;Fast Adaptation of Manipulator Trajectories to Task Perturbation by Differentiating through the Optimal Solution&amp;lt;/nowiki&amp;gt;], ''Sensors''. [https://doi.org/10.3390/s22082995 https://doi.org/10.3390/s22082995]&lt;br /&gt;
* Valner, R., Wanna, S., Kruusamäe, K., Pryor, M. (2022) [http://doi.org/10.1145/3522580 &amp;lt;nowiki&amp;gt;Unified Meaning Representation Format (UMRF)-A Task Description and Execution Formalism for HRI&amp;lt;/nowiki&amp;gt;], ''ACM Transactions on Human-Robot Interaction''. [http://doi.org/10.1145/3522580 http://doi.org/10.1145/3522580]&lt;br /&gt;
* Robert Valner, Houman Masnavi, Igor Rybalskii, Rauno Põlluäär, Erik Kõiv, Alvo Aabloo, Karl Kruusamäe, Arun Kumar Singh (2022) [http://doi.org/10.3389/frobt.2022.922835 &amp;lt;nowiki&amp;gt;Scalable and heterogenous mobile robot fleet-based task automation in crowded hospital environments—a field test&amp;lt;/nowiki&amp;gt;], ''Frontiers in Robotics and AI''. [http://doi.org/10.3389/frobt.2022.922835 http://doi.org/10.3389/frobt.2022.922835]&lt;br /&gt;
* Rastgar, F., Masnavi, H., Shrestha, J., Kruusamae, K., Aabloo, A., Singh, A.K. (2021) [http://doi.org/10.1109/LRA.2021.3061398 &amp;lt;nowiki&amp;gt;GPU Accelerated Convex Approximations for Fast Multi-Agent Trajectory Optimization&amp;lt;/nowiki&amp;gt;], ''IEEE Robotics and Automation Letters''. [http://doi.org/10.1109/LRA.2021.3061398 http://doi.org/10.1109/LRA.2021.3061398]&lt;br /&gt;
* Karl Kruusamäe (2021) [https://www.vikerkaar.ee/archives/27646 &amp;lt;nowiki&amp;gt;En attendant Robot&amp;lt;/nowiki&amp;gt;], ''Vikerkaar''.&lt;br /&gt;
* Valner, R., Dydynski, J.M., Cho, S., Kruusamäe, K. (2021) [http://doi.org/10.1177/0018720820902293 &amp;lt;nowiki&amp;gt;Communication of Hazards in Mixed-Reality Telerobotic Systems: The Usage of Naturalistic Avoidance Cues in Driving Tasks&amp;lt;/nowiki&amp;gt;], ''Human Factors''. [http://doi.org/10.1177/0018720820902293 http://doi.org/10.1177/0018720820902293]&lt;br /&gt;
* Rinne, P., Põldsalu, I., Ratas, H.K., Kruusamäe, K., Johanson, U., Tamm, T., Põhako-Esko, K., Punning, A., Peikolainen, A.-L., Kaasik, F., Must, I., van den Ende, D., Aabloo, A. (2020) [http://doi.org/10.3791/61216 &amp;lt;nowiki&amp;gt;Fabrication of carbon-based ionic electromechanically active soft actuators&amp;lt;/nowiki&amp;gt;], ''Journal of Visualized Experiments''. [http://doi.org/10.3791/61216 http://doi.org/10.3791/61216]&lt;br /&gt;
* Valner, R., Kruusamäe, K., Pryor, M. (2018) [http://doi.org/10.3390/robotics7010009 &amp;lt;nowiki&amp;gt;TeMoto: Intuitive multi-range telerobotic system with natural gestural and verbal instruction interface&amp;lt;/nowiki&amp;gt;], ''Robotics''. [http://doi.org/10.3390/robotics7010009 http://doi.org/10.3390/robotics7010009]&lt;br /&gt;
* Zhu, Z., Horiuchi, T., Kruusamäe, K., Chang, L., Asaka, K. (2016) [http://doi.org/10.1021/acs.jpcb.5b12634 &amp;lt;nowiki&amp;gt;Influence of Ambient Humidity on the Voltage Response of Ionic Polymer-Metal Composite Sensor&amp;lt;/nowiki&amp;gt;], ''Journal of Physical Chemistry B''. [http://doi.org/10.1021/acs.jpcb.5b12634 http://doi.org/10.1021/acs.jpcb.5b12634]&lt;br /&gt;
* Zhu, Z., Horiuchi, T., Kruusamae, K., Chang, L., Asaka, K. (2016) [http://doi.org/10.1088/0964-1726/25/5/055024 &amp;lt;nowiki&amp;gt;The effect of ambient humidity on the electrical response of ion-migration-based polymer sensor with various cations&amp;lt;/nowiki&amp;gt;], ''Smart Materials and Structures''. [http://doi.org/10.1088/0964-1726/25/5/055024 http://doi.org/10.1088/0964-1726/25/5/055024]&lt;br /&gt;
* Horiuchi, T., Kruusamäe, K., Zhu, Z., Asaka, K. (2015) [http://doi.org/10.1016/j.sna.2015.09.034 &amp;lt;nowiki&amp;gt;Evaluating curvature and making picture-overlaid trajectory of motion of largely bent carbon nanotube composite bucky gel actuator using camera measurement system&amp;lt;/nowiki&amp;gt;], ''Sensors and Actuators, A: Physical''. [http://doi.org/10.1016/j.sna.2015.09.034 http://doi.org/10.1016/j.sna.2015.09.034]&lt;br /&gt;
* Karl Kruusamae, Ken Mukai, Takushi Sugino, Kinji Asaka (2015) [http://doi.org/10.1109/tmech.2014.2362917 &amp;lt;nowiki&amp;gt;Electroactive Shape-Fixing of Bucky-Gel Actuators&amp;lt;/nowiki&amp;gt;], ''IEEE/ASME Transactions on Mechatronics''. [http://doi.org/10.1109/tmech.2014.2362917 http://doi.org/10.1109/tmech.2014.2362917]&lt;br /&gt;
* Karl Kruusamäe, Andres Punning, Alvo Aabloo, Kinji Asaka (2015) [http://doi.org/10.3390/act4010017 &amp;lt;nowiki&amp;gt;Self-Sensing Ionic Polymer Actuators: A Review&amp;lt;/nowiki&amp;gt;], ''Actuators''. [http://doi.org/10.3390/act4010017 http://doi.org/10.3390/act4010017]&lt;br /&gt;
* Kruusamäe, K., Sugino, T., Asaka, K. (2015) [http://doi.org/10.1063/1.4923351 &amp;lt;nowiki&amp;gt;Ionic and viscoelastic mechanisms of a bucky-gel actuator&amp;lt;/nowiki&amp;gt;], ''Journal of Applied Physics''. [http://doi.org/10.1063/1.4923351 http://doi.org/10.1063/1.4923351]&lt;br /&gt;
* Karl Kruusamäe, Ken Mukai, Takushi Sugino, Kinji Asaka (2014) [http://doi.org/10.1177/1045389x14538538 &amp;lt;nowiki&amp;gt;Impact of viscoelastic properties on bucky-gel actuator performance&amp;lt;/nowiki&amp;gt;], ''Journal of Intelligent Material Systems and Structures''. [http://doi.org/10.1177/1045389x14538538 http://doi.org/10.1177/1045389x14538538]&lt;br /&gt;
* Punning, A., Kim, K.J., Palmre, V., Vidal, F., Plesse, C., Festin, N., Maziz, A., Asaka, K., Sugino, T., Alici, G., Spinks, G., Wallace, G., Must, I., Põldsalu, I., Vunder, V., Temmer, R., Kruusamäe, K., Torop, J., Kaasik, F., Rinne, P., Johanson, U., Peikolainen, A.-L., Tamm, T., Aabloo, A. (2014) [http://doi.org/10.1038/srep06913 &amp;lt;nowiki&amp;gt;Ionic electroactive polymer artificial muscles in space applications&amp;lt;/nowiki&amp;gt;], ''Scientific Reports''. [http://doi.org/10.1038/srep06913 http://doi.org/10.1038/srep06913]&lt;br /&gt;
* Kruusamäe, K., Mukai, K., Sugino, T., Asaka, K. (2014) [http://doi.org/10.1088/0964-1726/23/2/025031 &amp;lt;nowiki&amp;gt;Mechanical behaviour of bending bucky-gel actuators and its representation&amp;lt;/nowiki&amp;gt;], ''Smart Materials and Structures''. [http://doi.org/10.1088/0964-1726/23/2/025031 http://doi.org/10.1088/0964-1726/23/2/025031]&lt;br /&gt;
* Andres Punning, Indrek Must, Inga Põldsalu, Veiko Vunder, Rauno Temmer, Karl Kruusamäe, Friedrich Kaasik, Janno Torop, Pille Rinne, Tõnis Lulla, Urmas Johanson, Tarmo Tamm, Alvo Aabloo (2014) [http://doi.org/10.1177/1045389x14546656 &amp;lt;nowiki&amp;gt;Lifetime measurements of ionic electroactive polymer actuators&amp;lt;/nowiki&amp;gt;], ''Journal of Intelligent Material Systems and Structures''. [http://doi.org/10.1177/1045389x14546656 http://doi.org/10.1177/1045389x14546656]&lt;br /&gt;
* Einike Pilli, Marek Sammul, Piia Post, Ülle Aasjõe, Karl Kruusamäe (2013) [http://doi.org/10.12697/eha.2013.1.08 &amp;lt;nowiki&amp;gt;Eesti kõrgkoolide esmakursuslaste õpi- ja teadmuskäsitus&amp;lt;/nowiki&amp;gt;], ''Eesti Haridusteaduste Ajakiri = Estonian Journal of Education''. [http://doi.org/10.12697/eha.2013.1.08 http://doi.org/10.12697/eha.2013.1.08]&lt;br /&gt;
* Kruusamäe, K., Punning, A., Aabloo, A. (2012) [http://doi.org/10.3390/s120201950 &amp;lt;nowiki&amp;gt;Electrical model of a carbon-polymer composite (CPC) collision detector&amp;lt;/nowiki&amp;gt;], ''Sensors''. [http://doi.org/10.3390/s120201950 http://doi.org/10.3390/s120201950]&lt;br /&gt;
* Kruusamäe, K., Brunetto, P., Punning, A., Kodu, M., Jaaniso, R., Graziani, S., Fortuna, L., Aabloo, A. (2011) [http://doi.org/10.1088/0964-1726/20/12/124001 &amp;lt;nowiki&amp;gt;Electromechanical model for a self-sensing ionic polymer-metal composite actuating device with patterned surface electrodes&amp;lt;/nowiki&amp;gt;], ''Smart Materials and Structures''. [http://doi.org/10.1088/0964-1726/20/12/124001 http://doi.org/10.1088/0964-1726/20/12/124001]&lt;br /&gt;
* Kruusamäe, K., Brunetto, P., Graziani, S., Punning, A., Di Pasquale, G., Aabloo, A. (2010) [http://doi.org/10.1002/pi.2752 &amp;lt;nowiki&amp;gt;Self-sensing ionic polymer-metal composite actuating device with patterned surface electrodes&amp;lt;/nowiki&amp;gt;], ''Polymer International''. [http://doi.org/10.1002/pi.2752 http://doi.org/10.1002/pi.2752]&lt;br /&gt;
&lt;br /&gt;
=== Conference Papers ===&lt;br /&gt;
&lt;br /&gt;
* Schumann, S., Krūmiņš, D., Vunder, V., Aabloo, A., Siiman, L.A., Kruusamäe, K. (2023) [http://doi.org/10.1007/978-3-031-38454-7_24 &amp;lt;nowiki&amp;gt;A Beginner-Level MOOC on ROS Robotics Leveraging a Remote Web Lab for Programming Physical Robots&amp;lt;/nowiki&amp;gt;], ''Lecture Notes in Networks and Systems''. [http://doi.org/10.1007/978-3-031-38454-7_24 http://doi.org/10.1007/978-3-031-38454-7_24]&lt;br /&gt;
* Rastgar, F., Masnavi, H., Kruusamäe, K., Aabloo, A., Singh, A.K. (2023) [http://doi.org/10.23919/ACC55779.2023.10156088 &amp;lt;nowiki&amp;gt;GPU Accelerated Batch Trajectory Optimization for Autonomous Navigation&amp;lt;/nowiki&amp;gt;], ''Proceedings of the American Control Conference''. [http://doi.org/10.23919/ACC55779.2023.10156088 http://doi.org/10.23919/ACC55779.2023.10156088]&lt;br /&gt;
* Adajania, V.K., Masnavi, H., Rastgar, F., Kruusamae, K., Singh, A.K. (2021) [http://doi.org/10.1109/IROS51168.2021.9636337 &amp;lt;nowiki&amp;gt;Embedded Hardware Appropriate Fast 3D Trajectory Optimization for Fixed Wing Aerial Vehicles by Leveraging Hidden Convex Structures&amp;lt;/nowiki&amp;gt;], ''IEEE International Conference on Intelligent Robots and Systems''. [http://doi.org/10.1109/IROS51168.2021.9636337 http://doi.org/10.1109/IROS51168.2021.9636337]&lt;br /&gt;
* Oidekivi, M., Nolte, A., Aabloo, A., Kruusamae, K. (2021) [http://doi.org/10.1109/HSI52170.2021.9538693 &amp;lt;nowiki&amp;gt;Interpreting externally expressed intentions of an autonomous vehicle&amp;lt;/nowiki&amp;gt;], ''International Conference on Human System Interaction, HSI''. [http://doi.org/10.1109/HSI52170.2021.9538693 http://doi.org/10.1109/HSI52170.2021.9538693]&lt;br /&gt;
* Oidekivi, M., Nolte, A., Aabloo, A., Kruusamae, K. (2021) [http://doi.org/10.1109/HSI52170.2021.9538774 &amp;lt;nowiki&amp;gt;Identifying emotions from facial expression displays of robots - Results from a survey study&amp;lt;/nowiki&amp;gt;], ''International Conference on Human System Interaction, HSI''. [http://doi.org/10.1109/HSI52170.2021.9538774 http://doi.org/10.1109/HSI52170.2021.9538774]&lt;br /&gt;
* Aschenbrenner, D., Rieder, J.S.I., Van Tol, D., Van Dam, J., Rusak, Z., Blech, J.O., Azangoo, M., Panu, S., Kruusamäe, K., Masnavi, H., Rybalskii, I., Aabloo, A., Petry, M., Teixeira, G., Thiede, B., Pedrazzoli, P., Ferrario, A., Foletti, M., Confalonieri, M., Bertaggia, D., Togias, T., Makris, S. (2020) [http://doi.org/10.1109/AIVR50618.2020.00017 &amp;lt;nowiki&amp;gt;Mirrorlabs - Creating accessible Digital Twins of robotic production environment with Mixed Reality&amp;lt;/nowiki&amp;gt;], ''Proceedings - 2020 IEEE International Conference on Artificial Intelligence and Virtual Reality, AIVR 2020''. [http://doi.org/10.1109/AIVR50618.2020.00017 http://doi.org/10.1109/AIVR50618.2020.00017]&lt;br /&gt;
* Veiko Vunder, Robert Valner, Conor McMahon, Karl Kruusamae, Mitch Pryor (2018) [http://doi.org/10.1109/hsi.2018.8431062 &amp;lt;nowiki&amp;gt;Improved Situational Awareness in ROS Using Panospheric Vision and Virtual Reality&amp;lt;/nowiki&amp;gt;], ''2018 11th International Conference on Human System Interaction (HSI)''. [http://doi.org/10.1109/hsi.2018.8431062 http://doi.org/10.1109/hsi.2018.8431062]&lt;br /&gt;
* Valner, Robert and Vunder, Veiko and Zelenak, Andy and Pryor, Mitch and Aabloo, Alvo and Kruusamäe, Karl (2018) [https://robotics.estec.esa.int/i-SAIRAS/isairas2018/Papers/Session%205b/1_valner_isairas2018_final_inline-50-32-Kruusam%C3%A4e-Karl.pdf &amp;lt;nowiki&amp;gt;Intuitive “Human-on-the-Loop” Interface for Tele-Operating Remote Mobile Manipulator Robots&amp;lt;/nowiki&amp;gt;], ''Proceedings of i-SAIRAS 2018''.&lt;br /&gt;
* Valner, Robert and Vunder, Veiko and Kruusamäe, Karl and Pryor, Mitchell W and Zelenak, Andy (2018) [https://www.xcdsystem.com/wmsym/2018/pdfs/FinalPaper_18438_0119112319.pdf &amp;lt;nowiki&amp;gt;TeMoto 2.0: Source Agnostic Command-to-Task Architecture Enabling Increased Autonomy in Remote Systems&amp;lt;/nowiki&amp;gt;], ''Proceedings of Waste Management Symposia 2018 (WM2018)''.&lt;br /&gt;
* Andrew Sharp, Karl Kruusamae, Ben Ebersole, Mitch Pryor (2017) [http://doi.org/10.1109/arso.2017.8025195 &amp;lt;nowiki&amp;gt;Semiautonomous dual-arm mobile manipulator system with intuitive supervisory user interfaces&amp;lt;/nowiki&amp;gt;], ''2017 IEEE Workshop on Advanced Robotics and its Social Impacts (ARSO)''. [http://doi.org/10.1109/arso.2017.8025195 http://doi.org/10.1109/arso.2017.8025195]&lt;br /&gt;
* Karl Kruusamae, Mitch Pryor (2016) [http://doi.org/10.1109/hsi.2016.7529630 &amp;lt;nowiki&amp;gt;High-precision telerobot with human-centered variable perspective and scalable gestural interface&amp;lt;/nowiki&amp;gt;], ''2016 9th International Conference on Human System Interactions (HSI)''. [http://doi.org/10.1109/hsi.2016.7529630 http://doi.org/10.1109/hsi.2016.7529630]&lt;br /&gt;
* Kruusamäe, K., Thompson, J., Pryor, M. (2016) [http://www.scopus.com/inward/record.url?eid=2-s2.0-85018422425&amp;amp;amp;partnerID=MN8TOARS &amp;lt;nowiki&amp;gt;Spatially-Mapped Human-Robot interface for teleoperation of high-precision tasks&amp;lt;/nowiki&amp;gt;], ''D and RS 2016 - Decommissioning and Remote Systems''.&lt;br /&gt;
* Kruusamäe, K., Sugino, T., Asaka, K. (2015) [http://doi.org/10.1117/12.2083812 &amp;lt;nowiki&amp;gt;Measuring blocking force to interpret ionic mechanisms within bucky-gel actuators&amp;lt;/nowiki&amp;gt;], ''Proceedings of SPIE - The International Society for Optical Engineering''. [http://doi.org/10.1117/12.2083812 http://doi.org/10.1117/12.2083812]&lt;br /&gt;
* Kruusamäe, K., Mukai, K., Sugino, T., Asaka, K. (2014) [http://doi.org/10.1117/12.2044745 &amp;lt;nowiki&amp;gt;The viscoelastic effect in bending bucky-gel actuators&amp;lt;/nowiki&amp;gt;], ''Proceedings of SPIE - The International Society for Optical Engineering''. [http://doi.org/10.1117/12.2044745 http://doi.org/10.1117/12.2044745]&lt;br /&gt;
* Kruusamäe, Karl and Asaka, Kinji (2013) Thermo- and electromechanical properties of bucky-gel bending actuators under external loading, ''第31回日本ロボット学会学術講演会 (The 31st Annual Conference of the Robotics Society of Japan)''.&lt;br /&gt;
* Kruusamäe, K., Kaasik, F., Punning, A., Aabloo, A. (2013) [http://doi.org/10.1117/12.2012021 &amp;lt;nowiki&amp;gt;Self-sensing ionic electromechanically active actuator with patterned carbon electrodes&amp;lt;/nowiki&amp;gt;], ''Proceedings of SPIE - The International Society for Optical Engineering''. [http://doi.org/10.1117/12.2012021 http://doi.org/10.1117/12.2012021]&lt;br /&gt;
* Juha Kontio and Patric Granholm and Heikki Valmu and Janne Mäntykoski and Karl Kruusamäe and Marija Aukstuoliene and Loreta Savulioniene and Peter Munkebo Hussmann and Kristina Edström (2012) Supporting Programme Development with Self‐ and Cross‐evaluations – Results from an International Quality Assurance Project, ''Proceedings of the International Conference on Engineering Education 2012''.&lt;br /&gt;
* Kruusamäe, K., Punning, A., Aabloo, A. (2011) [http://doi.org/10.1117/12.880386 &amp;lt;nowiki&amp;gt;Self-sensing properties of carbon-polymer composite (CPC) actuators&amp;lt;/nowiki&amp;gt;], ''Proceedings of SPIE - The International Society for Optical Engineering''. [http://doi.org/10.1117/12.880386 http://doi.org/10.1117/12.880386]&lt;br /&gt;
* Kruusamäe, K., Brunetto, P., Graziani, S., Fortuna, L., Kodu, M., Jaaniso, R., Punning, A., Aabloo, A. (2010) [http://doi.org/10.1117/12.847503 &amp;lt;nowiki&amp;gt;Experiments with self-sensing IPMC actuating device&amp;lt;/nowiki&amp;gt;], ''Proceedings of SPIE - The International Society for Optical Engineering''. [http://doi.org/10.1117/12.847503 http://doi.org/10.1117/12.847503]&lt;br /&gt;
* Kruusamäe, K., Punning, A., Kruusmaa, M., Aabloo, A. (2009) [http://doi.org/10.1117/12.815642 &amp;lt;nowiki&amp;gt;Dynamical variation of the impedances of IPMC&amp;lt;/nowiki&amp;gt;], ''Proceedings of SPIE - The International Society for Optical Engineering''. [http://doi.org/10.1117/12.815642 http://doi.org/10.1117/12.815642]&lt;br /&gt;
&lt;br /&gt;
=== Other ===&lt;br /&gt;
&lt;br /&gt;
* Justin W. Hart, Nick DePalma, Mitchell W. Pryor, Bradley Hayes, Karl Kruusamäe, Reuth Mirsky, Xuesu Xiao (2021) [http://doi.org/10.1145/3434074.3444882 &amp;lt;nowiki&amp;gt;Exploring Applications for Autonomous Nonverbal Human-Robot Interaction&amp;lt;/nowiki&amp;gt;], ''Companion of the 2021 ACM/IEEE International Conference on Human-Robot Interaction''. [http://doi.org/10.1145/3434074.3444882 http://doi.org/10.1145/3434074.3444882]&lt;br /&gt;
* Fatemeh Rastgar, Arun Kumar Singh, Houman Masnavi, Karl Kruusamae, Alvo Aabloo (2020) [http://doi.org/10.1109/iros45743.2020.9341566 &amp;lt;nowiki&amp;gt;A Novel Trajectory Optimization for Affine Systems: Beyond Convex-Concave Procedure&amp;lt;/nowiki&amp;gt;], ''2020 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)''. [http://doi.org/10.1109/iros45743.2020.9341566 http://doi.org/10.1109/iros45743.2020.9341566]&lt;br /&gt;
* Kinji Asaka, Karl Kruusamäe, Kwang Kim, Viljar Palmre, Kam K. Leang (2016) [http://doi.org/10.1007/978-3-319-31530-0_10 &amp;lt;nowiki&amp;gt;IPMCs as EAPs: How to Start Experimenting with Them&amp;lt;/nowiki&amp;gt;], ''Electromechanically Active Polymers''. [http://doi.org/10.1007/978-3-319-31530-0_10 http://doi.org/10.1007/978-3-319-31530-0_10]&lt;br /&gt;
* Karl Kruusamäe (2011) [http://www.digar.ee/id/nlib-digar:240087 &amp;lt;nowiki&amp;gt;Tehislihased: ajamid mikrorobotitele, kuid mitte ainult&amp;lt;/nowiki&amp;gt;], ''Eesti Füüsika Seltsi aastaraamat''.&lt;br /&gt;
&lt;br /&gt;
== Theses ==&lt;br /&gt;
&amp;lt;!--&lt;br /&gt;
=== Proceedings and book chapters ===&lt;br /&gt;
&amp;lt;ol&amp;gt;&lt;br /&gt;
&amp;lt;li value=&amp;quot;17&amp;quot;&amp;gt;Veiko Vunder, Robert Valner, Conor McMahon, &amp;lt;u&amp;gt;Karl Kruusamäe&amp;lt;/u&amp;gt;, Mitch Pryor (2018) {{doi-inline|10.1109/HSI.2018.8431062|Improved Situational Awareness in ROS using Panospheric Vision and Virtual Reality}}, ''2018 11th International Conference on Human System Interaction (HSI)'', 471 - 477.&lt;br /&gt;
&amp;lt;li value=&amp;quot;16&amp;quot;&amp;gt;Robert Valner, Veiko Vunder, Andy Zelenak, Mitch Pryor, Alvo Aabloo, &amp;lt;u&amp;gt;Karl Kruusamäe&amp;lt;/u&amp;gt; (2018) Intuitive “human-on-the-loop” interface for tele-operating remote mobile manipulator robots, ''International Symposium on Artificial Intelligence, Robotics and Automation in Space (i-SAIRAS 2018)'', accepted in April 2018.&lt;br /&gt;
&amp;lt;li value=&amp;quot;15&amp;quot;&amp;gt;Robert Valner, Veiko Vunder, &amp;lt;u&amp;gt;Karl Kruusamäe&amp;lt;/u&amp;gt;, Mitchell W. Pryor, Andy Zelenak (2018) [http://www.xcdsystem.com/wmsym/2018/pdfs/FinalPaper_18438_0119112319.pdf TeMoto 2.0: Source Agnostic Command-to-Task Architecture Enabling Increased Autonomy in Remote Systems], ''[https://www.xcdsystem.com/wmsym/2018/ Proceedings of Waste Management Symposia 2018]'', 18438. &lt;br /&gt;
&amp;lt;li value=&amp;quot;14&amp;quot;&amp;gt;Andrew Sharp, &amp;lt;u&amp;gt;Karl Kruusamäe&amp;lt;/u&amp;gt;, Ben Ebersole, Mitch Pryor (2017) {{doi-inline|10.1109/ARSO.2017.8025195|Semiautonomous dual-arm mobile manipulator system with intuitive supervisory user interfaces}}, ''2017 IEEE Workshop on Advanced Robotics and its Social Impacts (ARSO)''.&lt;br /&gt;
&amp;lt;li value=&amp;quot;13&amp;quot;&amp;gt;Kinji Asaka, &amp;lt;u&amp;gt;Karl Kruusamäe&amp;lt;/u&amp;gt;, Kwang Kim, Viljar Palmre, Kam K. Leang (2016) {{doi-inline|10.1007/978-3-319-31530-0_10|IPMCs as EAPs: How to Start Experimenting with Them}}, ''Electromechanically Active Polymers: A Concise Reference'', 215-233.&lt;br /&gt;
&amp;lt;li value=&amp;quot;12&amp;quot;&amp;gt;&amp;lt;u&amp;gt;Karl Kruusamäe&amp;lt;/u&amp;gt;, Jack Thompson, Mitch Pryor (2016) Spatially-Mapped Human-Robot Interface for Teleoperation of High-Precision Tasks, ''Decommissioning and Remote System (D&amp;amp;RS 2016)'', 239-243.&lt;br /&gt;
&amp;lt;li value=&amp;quot;11&amp;quot;&amp;gt;&amp;lt;u&amp;gt;Karl Kruusamäe&amp;lt;/u&amp;gt;, Mitch Pryor (2016) {{doi-inline|10.1109/HSI.2016.7529630|High-precision telerobot with human-centered variable perspective and scalable gestural interface}}, ''2016 9th International Conference on Human System Interactions (HSI)'', 190-196.&lt;br /&gt;
&amp;lt;li value=&amp;quot;10&amp;quot;&amp;gt;&amp;lt;u&amp;gt;Karl Kruusamäe&amp;lt;/u&amp;gt;, Takushi Sugino, Kinji Asaka (2015) {{doi-inline|10.1117/12.2083812|Measuring blocking force to interpret ionic mechanisms within bucky-gel actuators}}, ''Proc. SPIE'' '''9430''', 94300P.&lt;br /&gt;
&amp;lt;li value=&amp;quot;9&amp;quot;&amp;gt;&amp;lt;u&amp;gt;Karl Kruusamäe&amp;lt;/u&amp;gt;, Ken Mukai, Takushi Sugino, Kinji Asaka (2014) {{doi-inline|10.1117/12.2044745|The viscoelastic effect in bending bucky-gel actuators}}, ''Proc. SPIE'' '''9056''', 90560G.&lt;br /&gt;
&amp;lt;li value=&amp;quot;8&amp;quot;&amp;gt;&amp;lt;u&amp;gt;Karl Kruusamäe&amp;lt;/u&amp;gt;, Kinji Asaka (2013) Thermo- and electromechanical properties of bucky-gel bending actuators under external loading, ''第31回日本ロボット学会学術講演会 (The 31st Annual Conference of the Robotic Society of Japan)'', RSJ2013AC2L2-01.&lt;br /&gt;
&amp;lt;li value=&amp;quot;7&amp;quot;&amp;gt;&amp;lt;u&amp;gt;Karl Kruusamäe&amp;lt;/u&amp;gt;, Friedrich Kaasik, Andres Punning and Alvo Aabloo (2013) {{doi-inline|10.1117/12.2012021|Self-sensing ionic electromechanically active actuator with patterned carbon electrodes}}, ''Proc. SPIE'' '''8687''', 868706.&lt;br /&gt;
&amp;lt;li value=&amp;quot;6&amp;quot;&amp;gt;Juha Kontio, Patric Granholm, Heikki Valmu, Janne Mäntykoski, &amp;lt;u&amp;gt;Karl Kruusamäe&amp;lt;/u&amp;gt;, Marija Aukstuoliene, Loreta Savulioniene, Peter Munkebo Hussmann, Kristina Edström (2012) Supporting programme development with self‐ and cross‐evaluations – results from an international quality assurance project, ''[http://www.ineer.org/Events/ICEE2012/isbn9789522163110.pdf Proceedings of International Conference on Engineering Education 2012 (ICEE 2012)]'', 816 - 823.&lt;br /&gt;
&amp;lt;li value=&amp;quot;5&amp;quot;&amp;gt;&amp;lt;u&amp;gt;Karl Kruusamäe&amp;lt;/u&amp;gt; (2012) [[Media:Kruusamae_efsa2011_lk60-72.pdf|Tehislihased: ajamid mikrorobotitele, kuid mitte ainult]], ''[http://www.fyysika.ee/doc/aastaraamat2011.pdf Eesti Füüsika Seltsi aastaraamat 2011, XXII aastakäik]'', 60-72.&lt;br /&gt;
&amp;lt;li value=&amp;quot;4&amp;quot;&amp;gt;Indrek Must, &amp;lt;u&amp;gt;Karl Kruusamäe&amp;lt;/u&amp;gt;, Urmas Johanson, Tarmo Tamm, Andres Punning, Alvo Aabloo (2011-2012) [https://www.taylorfrancis.com/books/9781482266894/chapters/10.1201%2F9781482266894-16 Characterisation of electromechanically active polymers using electrochemical impedance spectroscopy], ''{{doi-inline|10.1201%2F9781482266894 Lecture Notes on Impedance Spectroscopy: Measurement, Modeling and Applications}}'' '''2''', Olfa Kanoun (Ed.), CRC Press / Taylor &amp;amp; Francis Group, pp 113-121&lt;br /&gt;
&amp;lt;li value=&amp;quot;3&amp;quot;&amp;gt;&amp;lt;u&amp;gt;Karl Kruusamäe&amp;lt;/u&amp;gt;, Andres Punning, Alvo Aabloo (2011) {{doi-inline|10.1117/12.880386|Self-sensing properties of carbon-polymer composite (CPC) actuators}}, ''Proc. SPIE'' '''7976''', 79760Q.&lt;br /&gt;
&amp;lt;li value=&amp;quot;2&amp;quot;&amp;gt;&amp;lt;u&amp;gt;Karl Kruusamäe&amp;lt;/u&amp;gt;, Paola Brunetto, Salvatore Graziani, Luigi Fortuna, Margus Kodu, Raivo Jaaniso, Andres Punning, Alvo Aabloo (2010) {{doi-inline|10.1117/12.847503|Experiments with self-sensing IPMC actuating device}}, ''Proc. SPIE'' '''7642''', 76420V.&lt;br /&gt;
&amp;lt;li value=&amp;quot;1&amp;quot;&amp;gt;&amp;lt;u&amp;gt;Karl Kruusamäe&amp;lt;/u&amp;gt;, Andres Punning, Maarja Kruusmaa, Alvo Aabloo (2009) {{doi-inline|10.1117/12.815642|Dynamical variation of the impedances of IPMC}}, ''Proc. SPIE'' '''7287''', 72870V.&lt;br /&gt;
&amp;lt;/ol&amp;gt;&lt;br /&gt;
--&amp;gt;&lt;br /&gt;
=== PhD thesis ===&lt;br /&gt;
*&amp;lt;u&amp;gt;Karl Kruusamäe&amp;lt;/u&amp;gt; (2012), [http://hdl.handle.net/10062/28066 Deformation-dependent electrode impedance of ionic electromechanically active polymers], Dissertationes physicae Universitatis Tartuensis 83 (128 p).&lt;br /&gt;
&lt;br /&gt;
=== MSc thesis ===&lt;br /&gt;
*&amp;lt;u&amp;gt;Karl Kruusamäe&amp;lt;/u&amp;gt; (2008), [[Media:Kruusamae_masterthesis.pdf|Isetundlik IPMC täitur]] (Self-sensing IPMC actuator), Master's thesis (30 ECTS, 47 pp), Faculty of Science and Technology, University of Tartu.&lt;br /&gt;
&lt;br /&gt;
== Videos ==&lt;br /&gt;
*[https://www.youtube.com/@unitartu-ims-robotics UniTartu IMS Robotics @ YouTube] &lt;br /&gt;
*Temoto [手元] shared autonomy telerobot agent on YouTube: [https://www.youtube.com/watch?v=L25HHFd00rc (1)]  [https://www.youtube.com/watch?v=3Tm95RDPDhc (2)]  [https://www.youtube.com/watch?v=B90hZZdKxwM (3)]&lt;br /&gt;
*Animation of the University of Tartu robot swarm [[Media:Ut bots 2010 divx.avi| (DivX, 40.9 MB)]] [http://www.youtube.com/watch?v=fgiGQjmEpt8 (YouTube)]&lt;br /&gt;
*Team Helina ja Püha Vaim 4000 [[Media:Pv4k_xvid.avi|(XviD, 21.6 MB)]] [http://www.youtube.com/watch?v=y3hvFm8js7w (YouTube)]&lt;br /&gt;
*Berit Talpsep's robotic sculpture &amp;quot;Superheterodüünvastuvõtt&amp;quot; (Superheterodyne Reception) [[Media:Superheterodyne_xvid.avi|(XviD, 21.7 MB)]] [http://www.youtube.com/watch?v=iha5UzRMeww (YouTube)]&lt;br /&gt;
*Team Helina ja Püha Vaim 3000 [[Media:Pyhavaim3k_xvid.avi|(XviD, 28.6 MB)]] [http://www.youtube.com/watch?v=AZLa-TfDKn0 (YouTube)]&lt;br /&gt;
&lt;br /&gt;
[[Image:Rising_longhorn_120dpi.png|center|400px]]&lt;br /&gt;
&lt;br /&gt;
[[Category:IMS-robotics]]&lt;br /&gt;
[[Category:PI]]&lt;/div&gt;</summary>
		<author><name>Karl</name></author>
	</entry>
	<entry>
		<id>https://ims.ut.ee/index.php?title=People&amp;diff=38667</id>
		<title>People</title>
		<link rel="alternate" type="text/html" href="https://ims.ut.ee/index.php?title=People&amp;diff=38667"/>
		<updated>2024-06-21T12:37:09Z</updated>

		<summary type="html">&lt;p&gt;Karl: /* Staff */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{{DISPLAYTITLE:The Lab's Team}} __NOTOC__&lt;br /&gt;
== Staff ==&lt;br /&gt;
&lt;br /&gt;
{{Team|&lt;br /&gt;
{{TeamMember|Alvo|Alvo Aabloo|professor, head of the lab (polymer materials)}} &lt;br /&gt;
{{TeamMember|Longfei|Longfei Chang| guest professor, polymeric transducers}} &lt;br /&gt;
{{TeamMember|Tarmo|Tarmo Tamm| Professor of applied materials science}}&lt;br /&gt;
{{TeamMember|Karl|Karl Kruusamäe|associate professor of robotics engineering, [[IMS robotics]]}}&lt;br /&gt;
{{TeamMember|Indrekm|Indrek Must|associate professor of soft robotics}}&lt;br /&gt;
{{TeamMember|Heiki|Heiki Kasemägi|associate professor (ion-conducting polymer, computer simulations) &amp;amp; Computer engineering study programme manager}}&lt;br /&gt;
{{TeamMember|Gajanee|Kaija Põhako-Esko|associate professor of materials chemistry, former Marie Skłodowska-Curie fellow (chemistry, organic synthesis, ionic liquids)}}&lt;br /&gt;
{{TeamMember|Urmas|Urmas Johanson|researcher (electrochemistry)}}&lt;br /&gt;
{{TeamMember|Annaliisa|Anna-Liisa Peikolainen|researcher (carbon materials, chemistry)}}&lt;br /&gt;
{{TeamMember|Janno|Janno Torop|associate professor of materials engineering (stimuli-responsive materials, structural energy storage devices, carbon materials)}}&lt;br /&gt;
{{TeamMember|Saoni|Saoni Banerji| research fellow (CMOS MEMS sensors, mixed-signal ASIC design, microelectronics)}}&lt;br /&gt;
{{TeamMember|Ritesh.Soni|Ritesh Soni| research fellow (soft robotics, microfabrication, CMOS MEMS sensors)}}&lt;br /&gt;
{{TeamMember|Veix|Veiko Vunder|lecturer (robotics)}}&lt;br /&gt;
{{TeamMember|Artur|Artur Abels|junior lecturer (digital electronics)}}&lt;br /&gt;
{{TeamMember|Rennoraudmae|Renno Raudmäe|junior lecturer (robotics)}}&lt;br /&gt;
{{TeamMember|Eva.m6tsh2rg|Eva Mõtshärg|junior lecturer (robotics design)}}&lt;br /&gt;
{{TeamMember|Roman|Roman Leinus|engineer (robotics)}}&lt;br /&gt;
{{TeamMember|Teet|Teet Tilk|engineer (electronics)}}&lt;br /&gt;
{{TeamMember|Tauri|Tauri Tätte|engineer (robotics)}}&lt;br /&gt;
{{TeamMember|Aune|Aune Tamm|head of administration}}&lt;br /&gt;
{{TeamMember|Anett|Anett Toom|project assistant}}&lt;br /&gt;
{{TeamMember|Kirsi|Kirsi Zirel|project assistant}}&lt;br /&gt;
{{TeamMember|Mariana|Mariana Kukk|Delta X coordinator and communications manager}}&lt;br /&gt;
{{TeamMember|Frkaasik|Friedrich Kaasik| Head of Knowledge Transfer }}&lt;br /&gt;
}}&lt;br /&gt;
&lt;br /&gt;
== PhD Students ==&lt;br /&gt;
&lt;br /&gt;
{{Team|&lt;br /&gt;
{{TeamMember|Robertvalner|Robert Valner|PhD student (robotics, [[IMS robotics]])}}&lt;br /&gt;
{{TeamMember|Pille|Pille Rinne|PhD student}}&lt;br /&gt;
{{TeamMember|hans_priks|Hans Priks|PhD student (conducting polymers)}}&lt;br /&gt;
{{TeamMember|karlkaru|Karl Karu|PhD student}}&lt;br /&gt;
{{TeamMember|Helena.nulk|Helena Nulk|PhD student (collaborative robotics, [[IMS robotics]])}}&lt;br /&gt;
{{TeamMember|Ingridre|Ingrid Rebane|PhD student (Elastomeric foams, PDMS)}}&lt;br /&gt;
{{TeamMember|ye.wang|Ye Wang|PhD student (Design and development of tailored nanostructures)}}&lt;br /&gt;
{{TeamMember|Iman.dadras|Iman Dadras|PhD student (Microfabrication)}}&lt;br /&gt;
{{TeamMember|Fatemeh.rastgar|Fatemeh Rastgar|PhD student (Robotics)}}&lt;br /&gt;
{{TeamMember|yauheni|Yauheni Sarokin|PhD student (Variable stiffness textiles)}}&lt;br /&gt;
{{TeamMember|Houman.masnavi|Houman Masnavi|PhD student (deep-learning-based motion-planning in robotics)}}&lt;br /&gt;
{{TeamMember|Oleksandr.syzoniuk|Oleksandr Syzoniuk|PhD student (soft robotics)}}&lt;br /&gt;
{{TeamMember|Kadriannvaldur|Kadri-Ann Valdur|PhD student (bioinspired soft robotics)}}&lt;br /&gt;
{{TeamMember|Jkalde|Jaanus Kalde|PhD student (distributed antennas)}}&lt;br /&gt;
{{TeamMember|Pranjal.sharma|Pranjal Sharma|PhD student (additive manufacturing)}}&lt;br /&gt;
{{TeamMember|Sandra|Sandra Schumann|PhD student (robotics education)}}&lt;br /&gt;
{{TeamMember|Gryogor|Igor Rybalskii|PhD student (human-robot collaboration and augmented reality)}}&lt;br /&gt;
{{TeamMember|not-yet|Agnes Luhtaru|PhD student (multimodal human-robot interaction)}}&lt;br /&gt;
{{TeamMember|Farnaz|Farnaz Baksh|PhD student (social robotics)}}&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
}}&lt;br /&gt;
&lt;br /&gt;
==Students==&lt;br /&gt;
&lt;br /&gt;
{{Team|&lt;br /&gt;
{{TeamMember|markus|Markus Loide|student}}&lt;br /&gt;
{{TeamMember|Liivak|Martin Liivak|student (FEM simulations of 3D-MBs)}}&lt;br /&gt;
{{TeamMember|Priit.poldmaa|Priit Põldmaa|student}}&lt;br /&gt;
{{TeamMember|raunoumborg|Rauno Umborg|student (computer engineering)}}&lt;br /&gt;
{{TeamMember|Kaarelsiimut|Kaarel Siimut|student (aerated concrete)}}&lt;br /&gt;
{{TeamMember|Ingmar.laan|Ingmar Laan|student (ionic polymer actuators)}}&lt;br /&gt;
{{TeamMember|Johannes.muru|Johannes Muru|student}}&lt;br /&gt;
{{TeamMember|Hermanratas|Herman Klas Ratas|student}}&lt;br /&gt;
{{TeamMember|Ats.aasmaa|Ats Aasmaa|student (modelling of microbatteries)}}&lt;br /&gt;
{{TeamMember|Phuong.nguyen|Phuong Nguyen|student (FEM simulations)}}&lt;br /&gt;
{{TeamMember|Magnus.kaldjarv|Magnus Kaldjärv|student (soft robotics)}}&lt;br /&gt;
{{TeamMember|Mkuuts|Mona Küüts|student (soft robotics)}}&lt;br /&gt;
&lt;br /&gt;
}}&lt;br /&gt;
&lt;br /&gt;
==Colleagues==&lt;br /&gt;
&lt;br /&gt;
{{Team|&lt;br /&gt;
{{TeamMember|Rosin|Margus Rosin|lecturer (FPGA)}}&lt;br /&gt;
{{TeamMember|Ramon.rantsus|Ramon Rantsus|educational robotics}}&lt;br /&gt;
&lt;br /&gt;
}}&lt;/div&gt;</summary>
		<author><name>Karl</name></author>
	</entry>
	<entry>
		<id>https://ims.ut.ee/index.php?title=People&amp;diff=38666</id>
		<title>People</title>
		<link rel="alternate" type="text/html" href="https://ims.ut.ee/index.php?title=People&amp;diff=38666"/>
		<updated>2024-06-21T12:36:34Z</updated>

		<summary type="html">&lt;p&gt;Karl: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{{DISPLAYTITLE:The Lab's Team}} __NOTOC__&lt;br /&gt;
== Staff ==&lt;br /&gt;
&lt;br /&gt;
{{Team|&lt;br /&gt;
{{TeamMember|Alvo|Alvo Aabloo|professor, head of the lab (polymer materials)}} &lt;br /&gt;
{{TeamMember|Longfei|Longfei Chang| guest professor, polymeric transducers}} &lt;br /&gt;
{{TeamMember|Tarmo|Tarmo Tamm| Professor of applied materials science}}&lt;br /&gt;
{{TeamMember|Karl|Karl Kruusamäe|associate professor of robotics engineering, [[IMS robotics]]}}&lt;br /&gt;
{{TeamMember|Indrekm|Indrek Must|associate professor of soft robotics}}&lt;br /&gt;
{{TeamMember|Heiki|Heiki Kasemägi|associate professor (ion-conducting polymer, computer simulations) &amp;amp; Computer engineering study programme manager}}&lt;br /&gt;
{{TeamMember|Gajanee|Kaija Põhako-Esko|associate professor of materials chemistry, former Marie Skłodowska-Curie fellow (chemistry, organic synthesis, ionic liquids)}}&lt;br /&gt;
{{TeamMember|Urmas|Urmas Johanson|researcher (electrochemistry)}}&lt;br /&gt;
{{TeamMember|Annaliisa|Anna-Liisa Peikolainen|researcher (carbon materials, chemistry)}}&lt;br /&gt;
{{TeamMember|Janno|Janno Torop|associate professor of materials engineering (stimuli-responsive materials, structural energy storage devices, carbon materials)}}&lt;br /&gt;
{{TeamMember|Saoni|Saoni Banerji| research fellow (CMOS MEMS sensors, mixed-signal ASIC design, microelectronics)}}&lt;br /&gt;
{{TeamMember|Ritesh.Soni|Ritesh Soni| research fellow (soft robotics, microfabrication, CMOS MEMS sensors)}}&lt;br /&gt;
{{TeamMember|Veix|Veiko Vunder|lecturer (robotics)}}&lt;br /&gt;
{{TeamMember|not-yet|Stanley Mugisha|post-doc (robotics)}}&lt;br /&gt;
{{TeamMember|Artur|Artur Abels|junior lecturer (digital electronics)}}&lt;br /&gt;
{{TeamMember|Rennoraudmae|Renno Raudmäe|junior lecturer (robotics)}}&lt;br /&gt;
{{TeamMember|Eva.m6tsh2rg|Eva Mõtshärg|junior lecturer (robotics design)}}&lt;br /&gt;
{{TeamMember|Roman|Roman Leinus|engineer (robotics)}}&lt;br /&gt;
{{TeamMember|Teet|Teet Tilk|engineer (electronics)}}&lt;br /&gt;
{{TeamMember|Tauri|Tauri Tätte|engineer (robotics)}}&lt;br /&gt;
{{TeamMember|Aune|Aune Tamm|head of administration}}&lt;br /&gt;
{{TeamMember|Anett|Anett Toom|project assistant}}&lt;br /&gt;
{{TeamMember|Kirsi|Kirsi Zirel|project assistant}}&lt;br /&gt;
{{TeamMember|Mariana|Mariana Kukk|Delta X coordinator and communications manager}}&lt;br /&gt;
{{TeamMember|Frkaasik|Friedrich Kaasik| Head of Knowledge Transfer }}&lt;br /&gt;
}}&lt;br /&gt;
&lt;br /&gt;
== PhD Students ==&lt;br /&gt;
&lt;br /&gt;
{{Team|&lt;br /&gt;
{{TeamMember|Robertvalner|Robert Valner|PhD student (robotics, [[IMS robotics]])}}&lt;br /&gt;
{{TeamMember|Pille|Pille Rinne|PhD student}}&lt;br /&gt;
{{TeamMember|hans_priks|Hans Priks|PhD student (conducting polymers)}}&lt;br /&gt;
{{TeamMember|karlkaru|Karl Karu|PhD student}}&lt;br /&gt;
{{TeamMember|Helena.nulk|Helena Nulk|PhD student (collaborative robotics, [[IMS robotics]])}}&lt;br /&gt;
{{TeamMember|Ingridre|Ingrid Rebane|PhD student (Elastomeric foams, PDMS)}}&lt;br /&gt;
{{TeamMember|ye.wang|Ye Wang|PhD student (Design and development of tailored nanostructures)}}&lt;br /&gt;
{{TeamMember|Iman.dadras|Iman Dadras|PhD student (Microfabrication)}}&lt;br /&gt;
{{TeamMember|Fatemeh.rastgar|Fatemeh Rastgar|PhD student (Robotics)}}&lt;br /&gt;
{{TeamMember|yauheni|Yauheni Sarokin|PhD student (Variable stiffness textiles)}}&lt;br /&gt;
{{TeamMember|Houman.masnavi|Houman Masnavi|PhD student (deep-learning-based motion-planning in robotics)}}&lt;br /&gt;
{{TeamMember|Oleksandr.syzoniuk|Oleksandr Syzoniuk|PhD student (soft robotics)}}&lt;br /&gt;
{{TeamMember|Kadriannvaldur|Kadri-Ann Valdur|PhD student (bioinspired soft robotics)}}&lt;br /&gt;
{{TeamMember|Jkalde|Jaanus Kalde|PhD student (distributed antennas)}}&lt;br /&gt;
{{TeamMember|Pranjal.sharma|Pranjal Sharma|PhD student (additive manufacturing)}}&lt;br /&gt;
{{TeamMember|Sandra|Sandra Schumann|PhD student (robotics education)}}&lt;br /&gt;
{{TeamMember|Gryogor|Igor Rybalskii|PhD student (human-robot collaboration and augmented reality)}}&lt;br /&gt;
{{TeamMember|not-yet|Agnes Luhtaru|PhD student (multimodal human-robot interaction)}}&lt;br /&gt;
{{TeamMember|Farnaz|Farnaz Baksh|PhD student (social robotics)}}&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
}}&lt;br /&gt;
&lt;br /&gt;
==Students==&lt;br /&gt;
&lt;br /&gt;
{{Team|&lt;br /&gt;
{{TeamMember|markus|Markus Loide|student}}&lt;br /&gt;
{{TeamMember|Liivak|Martin Liivak|student (FEM simulations of 3D-MB's)}}&lt;br /&gt;
{{TeamMember|Priit.poldmaa|Priit Põldmaa|student}}&lt;br /&gt;
{{TeamMember|raunoumborg|Rauno Umborg|student (computer engineering)}}&lt;br /&gt;
{{TeamMember|Kaarelsiimut|Kaarel Siimut|student (aerated concrete)}}&lt;br /&gt;
{{TeamMember|Ingmar.laan|Ingmar Laan|student (ionic polymer actuators)}}&lt;br /&gt;
{{TeamMember|Johannes.muru|Johannes Muru|student}}&lt;br /&gt;
{{TeamMember|Hermanratas|Herman Klas Ratas|student}}&lt;br /&gt;
{{TeamMember|Ats.aasmaa|Ats Aasmaa|student (modelling of microbatteries)}}&lt;br /&gt;
{{TeamMember|Phuong.nguyen|Phuong Nguyen|student (FEM simulations)}}&lt;br /&gt;
{{TeamMember|Magnus.kaldjarv|Magnus Kaldjärv|student (soft robotics)}}&lt;br /&gt;
{{TeamMember|Mkuuts|Mona Küüts|student (soft robotics)}}&lt;br /&gt;
&lt;br /&gt;
}}&lt;br /&gt;
&lt;br /&gt;
==Colleagues==&lt;br /&gt;
&lt;br /&gt;
{{Team|&lt;br /&gt;
{{TeamMember|Rosin|Margus Rosin|lecturer (FPGA)}}&lt;br /&gt;
{{TeamMember|Ramon.rantsus|Ramon Rantsus|educational robotics}}&lt;br /&gt;
&lt;br /&gt;
}}&lt;/div&gt;</summary>
		<author><name>Karl</name></author>
	</entry>
	<entry>
		<id>https://ims.ut.ee/index.php?title=People&amp;diff=38665</id>
		<title>People</title>
		<link rel="alternate" type="text/html" href="https://ims.ut.ee/index.php?title=People&amp;diff=38665"/>
		<updated>2024-06-21T12:34:22Z</updated>

		<summary type="html">&lt;p&gt;Karl: /* PhD Students */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{{DISPLAYTITLE:The Lab's Team}} __NOTOC__&lt;br /&gt;
== Staff ==&lt;br /&gt;
&lt;br /&gt;
{{Team|&lt;br /&gt;
{{TeamMember|Alvo|Alvo Aabloo|professor, head of the lab (polymer materials)}} &lt;br /&gt;
{{TeamMember|Longfei|Longfei Chang| guest professor, polymeric transducers}} &lt;br /&gt;
{{TeamMember|Tarmo|Tarmo Tamm| Professor of applied materials science}}&lt;br /&gt;
{{TeamMember|Karl|Karl Kruusamäe|associate professor of robotics engineering, [[IMS robotics]]}}&lt;br /&gt;
{{TeamMember|Indrekm|Indrek Must|associate professor of soft robotics}}&lt;br /&gt;
{{TeamMember|Heiki|Heiki Kasemägi|associate professor (ion-conducting polymer, computer simulations) &amp;amp; Computer engineering study programme manager}}&lt;br /&gt;
{{TeamMember|Gajanee|Kaija Põhako-Esko|associate professor of materials chemistry, former Marie Skłodowska-Curie fellow (chemistry, organic synthesis, ionic liquids)}}&lt;br /&gt;
{{TeamMember|Urmas|Urmas Johanson|researcher (electrochemistry)}}&lt;br /&gt;
{{TeamMember|Annaliisa|Anna-Liisa Peikolainen|researcher (carbon materials, chemistry)}}&lt;br /&gt;
{{TeamMember|Janno|Janno Torop|associate professor of materials engineering (stimuli-responsive materials, structural energy storage devices, carbon materials)}}&lt;br /&gt;
{{TeamMember|Saoni|Saoni Banerji| research fellow (CMOS MEMS sensors, mixed-signal ASIC design, microelectronics)}}&lt;br /&gt;
{{TeamMember|Ritesh.Soni|Ritesh Soni| research fellow (soft robotics, microfabrication, CMOS MEMS sensors)}}&lt;br /&gt;
{{TeamMember|Veix|Veiko Vunder|lecturer (robotics)}}&lt;br /&gt;
{{TeamMember|not-yet|Stanley Mugisha|post-doc (robotics)}}&lt;br /&gt;
{{TeamMember|Artur|Artur Abels|junior lecturer (digital electronics)}}&lt;br /&gt;
{{TeamMember|Rennoraudmae|Renno Raudmäe|junior lecturer (robotics)}}&lt;br /&gt;
{{TeamMember|Roman|Roman Leinus|engineer (robotics)}}&lt;br /&gt;
{{TeamMember|Teet|Teet Tilk|engineer (electronics)}}&lt;br /&gt;
{{TeamMember|Tauri|Tauri Tätte|engineer (robotics)}}&lt;br /&gt;
{{TeamMember|Aune|Aune Tamm|head of administration}}&lt;br /&gt;
{{TeamMember|Anett|Anett Toom|project assistant}}&lt;br /&gt;
{{TeamMember|Kirsi|Kirsi Zirel|project assistant}}&lt;br /&gt;
{{TeamMember|Mariana|Mariana Kukk|Delta X coordinator and communications manager}}&lt;br /&gt;
{{TeamMember|Frkaasik|Friedrich Kaasik| Head of Knowledge Transfer }}&lt;br /&gt;
}}&lt;br /&gt;
&lt;br /&gt;
== PhD Students ==&lt;br /&gt;
&lt;br /&gt;
{{Team|&lt;br /&gt;
{{TeamMember|Robertvalner|Robert Valner|PhD student (robotics, [[IMS robotics]])}}&lt;br /&gt;
{{TeamMember|Pille|Pille Rinne|PhD student}}&lt;br /&gt;
{{TeamMember|hans_priks|Hans Priks|PhD student (conducting polymers)}}&lt;br /&gt;
{{TeamMember|karlkaru|Karl Karu|PhD student}}&lt;br /&gt;
{{TeamMember|Helena.nulk|Helena Nulk|PhD student (collaborative robotics, [[IMS robotics]])}}&lt;br /&gt;
{{TeamMember|Ingridre|Ingrid Rebane|PhD student (Elastomeric foams, PDMS)}}&lt;br /&gt;
{{TeamMember|ye.wang|Ye Wang|PhD student (Design and development of tailored nanostructures)}}&lt;br /&gt;
{{TeamMember|Iman.dadras|Iman Dadras|PhD student (Microfabrication)}}&lt;br /&gt;
{{TeamMember|Fatemeh.rastgar|Fatemeh Rastgar|PhD student (Robotics)}}&lt;br /&gt;
{{TeamMember|yauheni|Yauheni Sarokin|PhD student (Variable stiffness textiles)}}&lt;br /&gt;
{{TeamMember|Houman.masnavi|Houman Masnavi|PhD student (deep-learning-based motion-planning in robotics)}}&lt;br /&gt;
{{TeamMember|Oleksandr.syzoniuk|Oleksandr Syzoniuk|PhD student (soft robotics)}}&lt;br /&gt;
{{TeamMember|Kadriannvaldur|Kadri-Ann Valdur|PhD student (bioinspired soft robotics)}}&lt;br /&gt;
{{TeamMember|Jkalde|Jaanus Kalde|PhD student (distributed antennas)}}&lt;br /&gt;
{{TeamMember|Pranjal.sharma|Pranjal Sharma|PhD student (additive manufacturing)}}&lt;br /&gt;
{{TeamMember|Sandra|Sandra Schumann|PhD student (robotics education)}}&lt;br /&gt;
{{TeamMember|Gryogor|Igor Rybalskii|PhD student (human-robot collaboration and augmented reality)}}&lt;br /&gt;
{{TeamMember|not-yet|Agnes Luhtaru|PhD student (multimodal human-robot interaction)}}&lt;br /&gt;
{{TeamMember|Farnaz|Farnaz Baksh|PhD student (social robotics)}}&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
}}&lt;br /&gt;
&lt;br /&gt;
==Students==&lt;br /&gt;
&lt;br /&gt;
{{Team|&lt;br /&gt;
{{TeamMember|markus|Markus Loide|student}}&lt;br /&gt;
{{TeamMember|Liivak|Martin Liivak|student (FEM simulations of 3D-MB's)}}&lt;br /&gt;
{{TeamMember|Priit.poldmaa|Priit Põldmaa|student}}&lt;br /&gt;
{{TeamMember|raunoumborg|Rauno Umborg|student (computer engineering)}}&lt;br /&gt;
{{TeamMember|Kaarelsiimut|Kaarel Siimut|student (aerated concrete)}}&lt;br /&gt;
{{TeamMember|Ingmar.laan|Ingmar Laan|student (ionic polymer actuators)}}&lt;br /&gt;
{{TeamMember|Johannes.muru|Johannes Muru|student}}&lt;br /&gt;
{{TeamMember|Hermanratas|Herman Klas Ratas|student}}&lt;br /&gt;
{{TeamMember|Ats.aasmaa|Ats Aasmaa|student (modelling of microbatteries)}}&lt;br /&gt;
{{TeamMember|Phuong.nguyen|Phuong Nguyen|student (FEM simulations)}}&lt;br /&gt;
{{TeamMember|Magnus.kaldjarv|Magnus Kaldjärv|student (soft robotics)}}&lt;br /&gt;
{{TeamMember|Mkuuts|Mona Küüts|student (soft robotics)}}&lt;br /&gt;
{{TeamMember|Eva.m6tsh2rg|Eva Mõtshärg|student (robotics education)}}&lt;br /&gt;
&lt;br /&gt;
}}&lt;br /&gt;
&lt;br /&gt;
==Colleagues==&lt;br /&gt;
&lt;br /&gt;
{{Team|&lt;br /&gt;
{{TeamMember|Rosin|Margus Rosin|lecturer (FPGA)}}&lt;br /&gt;
{{TeamMember|Ramon.rantsus|Ramon Rantsus|educational robotics}}&lt;br /&gt;
&lt;br /&gt;
}}&lt;/div&gt;</summary>
		<author><name>Karl</name></author>
	</entry>
	<entry>
		<id>https://ims.ut.ee/index.php?title=Theses_in_Robotics&amp;diff=38642</id>
		<title>Theses in Robotics</title>
		<link rel="alternate" type="text/html" href="https://ims.ut.ee/index.php?title=Theses_in_Robotics&amp;diff=38642"/>
		<updated>2024-06-19T14:21:10Z</updated>

		<summary type="html">&lt;p&gt;Karl: /* Bachelor's theses */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:IMS-robotics]]&lt;br /&gt;
&amp;lt;div class=&amp;quot;toclimit-3&amp;quot;&amp;gt;__TOC__&amp;lt;/div&amp;gt; &lt;br /&gt;
= Projects in Advanced Robotics =&lt;br /&gt;
''The main objective of the following projects is to give students experience in working with advanced robotics technology. Our group is active in several R&amp;amp;D projects involving human-robot collaboration, intuitive teleoperation of robots, and autonomous navigation of unmanned mobile platforms. Our main software platforms are [http://www.ros.org/ Robot Operating System (ROS)] for developing software for advanced robot systems and [http://gazebosim.org/ Gazebo] for running realistic robotic simulations.''&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
For further information, contact [[User:Karl|Karl Kruusamäe]]. &lt;br /&gt;
&lt;br /&gt;
The following is not an exhaustive list of all available thesis/research topics.&lt;br /&gt;
&lt;br /&gt;
== Highlighted theses topics for 2023/2024 study year ==&lt;br /&gt;
# [[#ROBOTONT: analysis of different options as on-board computers|ROBOTONT: analysis of different options as on-board computers]]&lt;br /&gt;
# [[#SemuBOT: multiple topics|SemuBOT: multiple topics]]&lt;br /&gt;
# [[#Robotic Study Companion: a social robot for students in higher education|Robotic Study Companion: a social robot for students in higher education]]&lt;br /&gt;
# [[#Quantitative Evaluation of the Situation Awareness during the Teleoperation of an Urban Vehicle|Quantitative Evaluation of the Situation Awareness during the Teleoperation of an Urban Vehicle]]&lt;br /&gt;
# [[#ROBOTONT: ROS2 support for robotont|ROBOTONT: ROS2 support for robotont]]&lt;br /&gt;
# [[#Continuous teleoperation setup for controlling mobile robot on streets|Continuous teleoperation setup for controlling mobile robot on streets]]&lt;br /&gt;
# [[#Stratos Explore Ultraleap demonstrator for robotics|Stratos Explore Ultraleap demonstrator for robotics]]&lt;br /&gt;
# [[#Mixed-reality scene creation for vehicle teleoperation|Mixed-reality scene creation for vehicle teleoperation]]&lt;br /&gt;
&lt;br /&gt;
== List of potential thesis topics ==&lt;br /&gt;
Our inventory includes but is not limited to:&lt;br /&gt;
&amp;lt;gallery mode=&amp;quot;packed&amp;quot;&amp;gt;&lt;br /&gt;
File:Youbot.png|120px|thumb|KUKA youBot&lt;br /&gt;
File:Ur5_left.png|120px|thumb|Universal Robot UR5&lt;br /&gt;
File:Franka_Emika_Panda.jpg|120px|thumb|Franka Emika Panda&lt;br /&gt;
File:Clearpath_Jackal.jpg|120px|thumb|Clearpath Jackal&lt;br /&gt;
File:Kinova_KG-3_Gripper.jpg|120px|thumb|Kinova 3-finger gripper&lt;br /&gt;
File:Xarm7.jpg|120px|thumb|UFACTORY xArm7&lt;br /&gt;
File:Robotont_banner.png|120px|thumb|robotont&lt;br /&gt;
File:Turtlebot3-waffle-pi.jpg|thumb|TurtleBot3&lt;br /&gt;
File:Parrot-bebop-2.jpg|thumb|Parrot Bebop 2&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: analysis of different options as on-board computers ===&lt;br /&gt;
Currently, ROBOTONT uses an Intel NUC as its onboard computer. The goal of this thesis is to validate Robotont's software stack on alternative compute devices (e.g. Raspberry Pi 4, Intel Compute Stick, and NVIDIA Jetson Nano) and benchmark their performance on the most popular Robotont use-case demos (e.g. webapp teleop, ar-tag steering, 2D mapping, and 3D mapping). The objective is to propose at least two different compute solutions: one that optimizes for cost and another that optimizes for performance. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: integrating a graphical programming interface ===&lt;br /&gt;
The goal of this thesis is to integrate a graphical programming solution (e.g. Scratch or Blockly) to enable programming of Robotont by non-experts.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT Lite ===&lt;br /&gt;
The goal of this thesis is to optimize the ROBOTONT platform for cost by replacing the onboard computer and sensors with low-cost alternatives while ensuring ROS/ROS2 software compatibility. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== SemuBOT: multiple topics ===&lt;br /&gt;
In 2023/2024, many topics are offered to support the development of the open-source humanoid robot project SemuBOT (https://www.facebook.com/semubotmtu/).&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Robotic Study Companion: a social robot for students in higher education ===&lt;br /&gt;
Potential Topics:&lt;br /&gt;
* Enhancing the RSC Speech/NLP Capabilities&lt;br /&gt;
* Build the RSC Human-Robot Multimodal Interaction&lt;br /&gt;
* Developing the RSC Personalities/Behaviour&lt;br /&gt;
* Create a WebApp for Monitoring Student Performance via the RSC&lt;br /&gt;
* Exploring the RSC use as an Affective Robot with a focus on Students’ Academic Emotions&lt;br /&gt;
[https://github.com/Farnaz03/RoboticStudyCompanion Github] | reach out to farnaz.baksh@ut.ee for more info&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: Quick launch demo suite and the final complete release of its ROS1 software ===&lt;br /&gt;
The goal of this thesis is to refine the existing robotont demos (e.g. ar-tag steering, follow-the-leader, dancing-with-robot, LEAP-based control, etc.) and package them in an easy-to-use way for quick deployment by anyone during public events such as science popularization workshops and school visits.&lt;br /&gt;
The results of this work will be packaged as the final ROS1 release of Robotont software, as the EOL for ROS1 is in May 2025.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: ROS support, demos, and educational materials for open-source mobile robot ===&lt;br /&gt;
[[Image:RosLarge.png|left|100px|ROS]]&lt;br /&gt;
The project involves many potential thesis topics on the open-source robot platform ROBOTONT. The nature of the thesis can be software development to improve the platform's capabilities, simulation of specific scenarios, and/or demonstration of ROBOTONT in a real-life setting. A more detailed thesis topic will be outlined during an in-person meeting.&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
[[Image:Ros_equation.png|x100px|What is ROS?]]&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Virtual reality user interface (VRUI) for intuitive teleoperation system ===&lt;br /&gt;
[[Image:LeapMotion.png|200px|thumb|right|Detecting 2 hands with Leap Motion Controller]]&lt;br /&gt;
Enhancing the user-experience of a virtual reality UI developed by [https://github.com/ut-ims-robotics/vrui_rviz Georg Astok]. Potentially adding [http://www.osvr.org/hardware-devs.html virtual reality capability] to a [https://www.youtube.com/watch?v=L25HHFd00rc gesture- and natural-language-based robot teleoperation system].&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
[[Image:Temoto-working-principle-in-pics.png|x160px|Gesture-based teleoperation]]&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Health monitor for intuitive telerobot ===&lt;br /&gt;
Intelligent status and error handling for an intuitive telerobotic system.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== 3D scanning of industrial objects ===&lt;br /&gt;
Using laser sensors and cameras to create accurate models of industrial products for quality control or further processing.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Modeling humans for human-robot interaction ===&lt;br /&gt;
True human-robot collaboration means that the robot must understand the actions, intention, and state of its human partner. This work involves using cameras and other human-tracking sensors to digitally represent and model humans. There are multiple stages of modeling: a) physical models of human kinematics and dynamics; b) higher-level models for recognizing human intent.&amp;lt;br&amp;gt;&lt;br /&gt;
[[Image:Skeletal ROS.PNG|x160px|ROS &amp;amp; Kinect &amp;amp; Skeleton-Markers Package]]&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Robotic avatar for telepresence ===&lt;br /&gt;
Integrating hand gestures and head movements to control a robot avatar in virtual reality user interface.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Detection of hardware and software resources for smart integration of robots ===&lt;br /&gt;
The vast majority of today’s robotic applications rely on hard-coded device and algorithm usage. This project focuses on developing Resource Snooper software that can detect the addition or removal of resources for the benefit of dynamic reconfiguration of robotic systems. This project is developed as a subsystem of [https://utnuclearroboticspublic.github.io/temoto2/index.html TeMoto].&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Sonification of feedback during teleoperation of robots ===&lt;br /&gt;
Humans are used to receiving auditory feedback in their everyday lives. It helps us make decisions and be aware of potential dangers. Telerobotic interfaces can deploy the same idea to improve situational awareness and robotic task efficiency. The thesis project involves a study of different sonification solutions and their implementation in a telerobotic application using ROS. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Human-Robot and Robot-Robot collaboration applications ===&lt;br /&gt;
Creating a demo or analysis of software capabilities related to human-robot or robot-robot teams:&lt;br /&gt;
* human-robot collaborative assembly&lt;br /&gt;
* distributed mapping; analysis and demo of existing ROS (e.g., segmap https://youtu.be/JJhEkIA1xSE) packages for multi-robot mapping&lt;br /&gt;
* Inaccessible region teamwork&lt;br /&gt;
**youbot+drone - a drone maps the environment (for example, a maze) and a ground vehicle uses this information to traverse the maze&lt;br /&gt;
**youbot+robotont - the youbot cannot go up ledges, but it can lift a smaller robot, such as robotont, up a ledge.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Mirroring human hand movements on industrial robots ===&lt;br /&gt;
The goal of this project is to integrate continuous control of industrial robot manipulator with a gestural telerobotics interface. The recommended tools for this thesis project are Leap Motion Controller or a standard web camera, Ultraleap, Universal Robot UR5 manipulator, and ROS.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== ROBOTONT: TeMoto for robotont ===&lt;br /&gt;
Swarm-management for robotont using [https://temoto-telerobotics.github.io TeMoto] framework.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Enhancing teleoperation control interface with augmented cues to provoke caution===&lt;br /&gt;
The task is to create a telerobot control interface where the video feed from the remote site and/or a mixed-reality scene is augmented with visual cues to provoke caution in the human operator.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Robot-to-human interaction ===&lt;br /&gt;
As robots and autonomous machines start sharing the same space as humans, their actions need to be understood by the people occupying that space. For instance, a human worker needs to understand what the robot partner is planning next, or a pedestrian needs to clearly comprehend the behaviour of a driverless vehicle. To reduce this ambiguity, the robot needs mechanisms to convey its intent (whatever it is going to do next). The aim of the thesis is to outline existing methods for machines to convey their intent and to develop a unified model interface for expressing that intent. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Gaze-based handover prediction ===&lt;br /&gt;
When a human needs to pass an object to a robot manipulator, the robot must understand where in 3D space the object handover occurs and then plan an appropriate motion. Human gaze can be used as the input for predicting which object to track. The thesis activities involve camera-based eye tracking and safe motion planning.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Real-world demonstrator for MIR+UR+TeMoto integration  ===&lt;br /&gt;
Integration of mobile manipulator (MIR100 + UR5e + Robotiq gripper) to demonstrate TeMoto in a collaborative application.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== ROBOTONT: Human-height human-robot interface for Robotont ground robot ===&lt;br /&gt;
Robotont is an ankle-high flat mobile robot. For humans to interact with Robotont, there is a need for a compact and lightweight mechanical structure that is tall enough for comfortable human-robot interaction. The objective of the thesis is to develop the mechanical designs and build prototypes that ensure stable operation and meet the aesthetic requirements for use in public places. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Stratos Explore Ultraleap demonstrator for robotics ===&lt;br /&gt;
The aim of this thesis is to systematically analyze the strengths and limitations of the Stratos Explore Ultraleap device in the context of controlling a robot, and subsequently to implement a demonstrator application showing its applicability for robot control. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Continuous teleoperation setup for controlling mobile robot on streets ===&lt;br /&gt;
The task in this thesis is to analyse the available options for building a teleoperation cockpit for continuously controlling a mobile robot moving on the streets. The contribution of the thesis is to set up the system, validate its usability, and benchmark its capabilities/limitations on the [https://adl.cs.ut.ee/lab/vehicle ADL vehicle].&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: ROS2 support for robotont ===&lt;br /&gt;
Creating ROS2 support for the robotont mobile platform.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Mixed-reality scene creation for vehicle teleoperation ===&lt;br /&gt;
Fusing different sensory feeds to create a high-usability teleoperation scene.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Validation study for AR-based robot user-interfaces ===&lt;br /&gt;
Designing and carrying out a user study to validate the functionality and usability of a human-robot interface.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Replication of the MIT Hydra demo ===&lt;br /&gt;
The goal of the thesis is to use the Hydra software package and integrate it for a robot used at IMS Robotics (e.g., TIAGo, Jackal, Robotont).&lt;br /&gt;
&amp;lt;br&amp;gt;Hydra takes sensor data as input (stereo or RGB-D camera images and IMU data) and produces a hierarchical model of the environment by estimating the trajectory of the robot (including loop closures), building a metric-semantic 3D mesh, and segmenting objects, places, and rooms in an indoor environment. These representations are combined into a 3D scene graph, which enables novel approaches for hierarchical loop closure detection and ensures the representations remain consistent after loop closure. Hydra is implemented in C++ and is ROS-compatible. It uses a neural-network-based image segmentation front-end, but otherwise relies on an efficient, multi-threaded CPU-based implementation, which is suitable for mobile robot deployment.&lt;br /&gt;
&amp;lt;br&amp;gt;LINKS:&lt;br /&gt;
&amp;lt;br&amp;gt;Video: https://youtu.be/qZg2lSeTuvM&lt;br /&gt;
&amp;lt;br&amp;gt;Code: https://github.com/MIT-SPARK/Hydra&lt;br /&gt;
&amp;lt;br&amp;gt;Paper: http://www.roboticsproceedings.org/rss18/p050.pdf&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Quantitative Evaluation of the Situation Awareness during the Teleoperation of an Urban Vehicle ===&lt;br /&gt;
The goal of this thesis is to develop the methodology for measuring the operator's situation awareness while operating an autonomous urban vehicle. Potential metrics could include driving accuracy, speed, and latency. This thesis will be conducted as part of the activities at the [https://adl.cs.ut.ee Autonomous Driving Lab].&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
= Completed projects =&lt;br /&gt;
== PhD theses ==&lt;br /&gt;
*Houman Masnavi, [https://hdl.handle.net/10062/91394 Visibility aware navigation] [Nähtavust arvestav navigatsioon], PhD thesis, 2023&lt;br /&gt;
&lt;br /&gt;
== Master's theses ==&lt;br /&gt;
*Robert Allik, Validation of NoMaD as a Global Planner for Mobile Robots, MS thesis, 2024&lt;br /&gt;
*Rauno Põlluäär, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=79393 Designing and Implementing a Bird’s-eye View Interface for a Self-driving Vehicle’s Teleoperation System] [Isejuhtiva sõiduki kaugjuhtimissüsteemile linnuvaate kasutajaliidese loomine], MS thesis, 2024&lt;br /&gt;
*Eva Mõtshärg, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=77385 3D-prinditava kere disain ja analüüs vabavaralisele haridusrobotile Robotont] [Design and Analysis of a 3D Printable Chassis for the Open Source Educational Robot Robotont], MS thesis, 2023&lt;br /&gt;
*Farnaz Baksh, [https://dspace.ut.ee/items/a7a9cc15-27e9-450c-94e8-3267f0c95c56 An Open-source Robotic Study Companion for University Students] [Avatud lähtekoodiga robotõpikaaslane üliõpilastele], MS thesis, 2023&lt;br /&gt;
*Igor Rybalskii, [http://hdl.handle.net/10062/83028 Augmented reality (AR) for enabling human-robot collaboration with ROS robots] [Liitreaalsus inimese ja roboti koostöö võimaldamiseks ROS-i robotitega], MS thesis, 2022&lt;br /&gt;
*Md. Maniruzzaman, [http://hdl.handle.net/10062/83025 Object search and retrieval in indoor environment using a Mobile Manipulator] [Objektide otsimine ja teisaldamine siseruumides mobiilse manipulaatori abil], MS thesis, 2022&lt;br /&gt;
*Allan Kustavus, [http://hdl.handle.net/10062/72651 Design and Implementation of a Generalized Resource Management Architecture in the TeMoto Software Framework] [Üldise ressursihalduri disain ja teostus TeMoto tarkvara raamistikule], MS thesis, 2021&lt;br /&gt;
*Kristina Meister, [http://hdl.handle.net/10062/72350 External human-vehicle interaction - a study in the context of an autonomous ride-hailing service], MS thesis, 2021&lt;br /&gt;
*Muhammad Usman, [http://hdl.handle.net/10062/72126 Development of an Optimization-Based Motion Planner and Its ROS Interface for a Non-Holonomic Mobile Manipulator] [Optimeerimisele baseeruva liikumisplaneerija arendamine ja selle ROSi liides mitteholonoomse mobiilse manipulaatori jaoks], MS thesis, 2020&lt;br /&gt;
*Maarika Oidekivi, [http://hdl.handle.net/10062/72119 Masina kavatsuse väljendamine ja tõlgendamine] [Communicating and interpreting machine intent], MS thesis, 2020&lt;br /&gt;
*Houman Masnavi, [http://hdl.handle.net/10062/72118 Multi-Robot Motion Planning for Shared Payload Transportation] [Rajaplaneerimine multi-robot süsteemile jagatud lasti transportimisel], MS thesis, 2020&lt;br /&gt;
*Fabian Ernesto Parra Gil, [http://hdl.handle.net/10062/72112 Implementation of Robot Manager Subsystem for Temoto Software Framework] [Robotite Halduri alamsüsteemi väljatöötamine tarkvararaamistikule TEMOTO], MS thesis, 2020&lt;br /&gt;
*Zafarullah, [http://hdl.handle.net/10062/72125 Gaze Assisted Neural Network based Prediction of End-Point of Human Reaching Trajectories], MS thesis, 2020&lt;br /&gt;
*Madis K Nigol, [http://hdl.handle.net/10062/64339 Õppematerjalid robotplatvormile Robotont] [Study materials for robot platform Robotont], MS thesis, 2019&lt;br /&gt;
*Renno Raudmäe, [http://hdl.handle.net/10062/64341 Avatud robotplatvorm Robotont] [Open source robotics platform Robotont], MS thesis, 2019&lt;br /&gt;
*Asif Sattar, [http://hdl.handle.net/10062/64352 Human detection and distance estimation with monocular camera using YOLOv3 neural network] [Inimeste tuvastamine ning kauguse hindamine kasutades kaamerat ning YOLOv3 tehisnärvivõrku], MS thesis, 2019&lt;br /&gt;
*Ragnar Margus, [http://hdl.handle.net/10062/64337 Kergliiklusvahendite jagamisteenuseks vajaliku positsioneerimismooduli loomine ja uurimine] [Development and testing of an IoT module for electric vehicle sharing service], MS thesis, 2019&lt;br /&gt;
*Pavel Šumejko, [http://hdl.handle.net/10062/64320 Robust Solution for Extrinsic Calibration of a 2D Laser-Rangefinder and a Monocular USB Camera] [Meetod 2D laserkaugusmõõdiku ja USB-kaamera väliseks kalibreerimiseks], MS thesis, 2019&lt;br /&gt;
*Dzvezdana Arsovska, [http://hdl.handle.net/10062/64321  Building an Efficient and Secure Software Supply Pipeline for Aerial Robotics Application] [Efektiivne ja turvaline tarkvaraarendusahel lennurobootika rakenduses], MS thesis, 2019&lt;br /&gt;
*Tõnis Tiimus, [http://hdl.handle.net/10062/60713 Visuaalselt esilekutsutud potentsiaalidel põhinev aju-arvuti liides robootika rakendustes] [A VEP-based BCI for robotics applications], MS thesis, 2018&lt;br /&gt;
*Martin Appo, [http://hdl.handle.net/10062/60718 Hardware-agnostic compliant control ROS package for collaborative industrial manipulators] [Riistvarapaindlik ROSi tarkvarapakett tööstuslike robotite mööndlikuks juhtimiseks], MS thesis, 2018&lt;br /&gt;
*Hassan Mahmoud Shehawy Elhanash, [http://hdl.handle.net/10062/60711 Optical Tracking of Forearm for Classifying Fingers Poses] [Küünarvarre visuaalne jälgimine ennustamaks sõrmede asendeid], MS thesis, 2018&lt;br /&gt;
&lt;br /&gt;
== Bachelor's theses ==&lt;br /&gt;
*Albert Unn, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=79937 Suhtlusvõimekuse arendamine sotsiaalsele humanoidrobotile SemuBot] [Developing the ability to communicate for SemuBot, a social humanoid robot], BS thesis, 2024&lt;br /&gt;
*Veronika-Marina Volynets, [https://hdl.handle.net/10062/99943 Development of Control Electronics and Program for Robotont's Height Adjustment Mechanism] [Juhtelektroonika ja programmi väljatöötamine Robotondi kõrguse reguleerimise mehhanismile], BS thesis, 2024&lt;br /&gt;
*Nikita Kurenkov, [https://hdl.handle.net/10062/99880 Flexible Screen Integration and Development of Neck Movement Mechanism for Social Humanoid Robot SemuBot] [Humanoidroboti näo lahenduse leidmine ja rakendamine; humanoidroboti jaoks kaela mehhanismi väljatöötamine], BS thesis, 2024&lt;br /&gt;
*Elchin Huseynov, [https://hdl.handle.net/10062/99668 Design and Control of a Social Humanoid Robot - SemuBot’s Hand] [Sotsiaalse Humanoidroboti Disain ja Juhtimine - SemuBoti Käsi], BS thesis, 2024&lt;br /&gt;
*Veronika Podliesnova, [https://hdl.handle.net/10062/99636 Real-Time Detection of Robot Failures by Monitoring Operator’s Brain Activity with EEG-based Brain-Computer Interface] [Reaalajas Robotirikke Tuvastamine Operaatori Ajutegevuse Jälgimise teel EEG-põhise Aju-Arvuti Liidese abil], BS thesis, 2024&lt;br /&gt;
*Märten Josh Peedimaa, [https://hdl.handle.net/not-yet Rhea: An Open-Source Table Tennis Ball Launcher Robot for Multiball Training] [Avatud lähtekoodiga lauatennise palliviske robot mitmikpall treenimiseks], BS thesis, 2024&lt;br /&gt;
*Karl Sander Vinkel, [https://hdl.handle.net/not-yet Laadimisjaama ja transportkesta väljatöötamine õpperobotile Robotont] [Development of charging dock and transport case for Robotont], BS thesis, 2024&lt;br /&gt;
*Raimo Köidam, [https://hdl.handle.net/not-yet Valguslahenduse tarkvara väljatöötamine õpperobotile Robotont] [Development of light solution software for the educational robot Robotont], BS thesis, 2024&lt;br /&gt;
*Robert Valge, [https://hdl.handle.net/not-yet Toitepinge ja tarbevoolu monitoorimine ning toitehalduse püsivara loomine õpperobotil Robotont] [Monitoring supply voltage and current consumption and creating firmware for Robotont’s power management system], BS thesis, 2024&lt;br /&gt;
*Leonid Tšigrinski, [https://hdl.handle.net/not-yet Õpperoboti Robotont püsivara arhitektuuri uuendamine] [Education robot &amp;quot;Robotont&amp;quot; firmware architecture updating], BS thesis, 2024&lt;br /&gt;
*Andres Sakk, [https://hdl.handle.net/not-yet Avatud robotplatvormi Robotont 3 kasutajaliidese väljatöötamine] [Development of a user interface for the open robotics platform Robotont 3], BS thesis, 2024&lt;br /&gt;
*Kaur Kullamäe, [https://hdl.handle.net/not-yet Elektroonikalahendus ja püsivara robotkäe juhtimiseks sotsiaalsel humanoidrobotil SemuBot] [Building the SemuBot arm electronics system], BS thesis, 2024&lt;br /&gt;
*Kristjan Madis Kask, [https://hdl.handle.net/not-yet Käe mehaanika disain sotsiaalsele humanoidrobotile SemuBot] [Arm mechanics design for social humanoid robot SemuBot], BS thesis, 2024&lt;br /&gt;
*Sven-Ervin Paap, [https://hdl.handle.net/not-yet ROS2 draiver õpperobotile Robotont] [ROS2 driver for the educational robot Robotont], BS thesis, 2024&lt;br /&gt;
*Timur Nizamov, [https://hdl.handle.net/10062/99592 Audio System for the Social Humanoid Robot SemuBot] [Helisüsteem sotsiaalsele humanoidrobotile SemuBot], BS thesis, 2024&lt;br /&gt;
*Georgs Narbuts, [https://hdl.handle.net/10062/99554 Holonomic Motion Drive System of a Social Humanoid Robot SemuBot] [Sotsiaalse humanoidroboti SemuBoti holonoomne liikuv ajamisüsteem], BS thesis, 2024&lt;br /&gt;
*Iryna Hurova, [https://hdl.handle.net/10062/90638 Kitting station of the learning factory] [Õppetehase komplekteerimisjaam], BS thesis, 2023&lt;br /&gt;
*Paola Avalos Conchas, [https://hdl.handle.net/10062/90641 Payload transportation system of a learning factory] [Õppetehase kasuliku koorma transpordisüsteem], BS thesis, 2023&lt;br /&gt;
*Pille Pärnalaas, [https://hdl.handle.net/10062/93431 Pöördpõik ajami arendus robotplatvormile Robotont] [Development of swerve drive for robotic platform Robotont], BS thesis, 2023&lt;br /&gt;
*Priit Rooden, [https://hdl.handle.net/10062/93432 Autonoomse laadimislahenduse väljatöötamine õpperobotile Robotont] [Development of an autonomous charging solution for the robot platform Robotont], BS thesis, 2023&lt;br /&gt;
*Marko Muro, [https://hdl.handle.net/10062/93429 Robotondi akulahenduse ning 12 V pingeregulaatori prototüüpimine] [Prototyping battery solution and 12 V voltage regulator for Robotont], BS thesis, 2023&lt;br /&gt;
*Danel Leppenen, [https://hdl.handle.net/10062/93427 Nav2 PYIF: Python-based Motion Planning for ROS 2 Navigation 2] [Nav 2 PYIF: Pythoni põhine liikumise planeerija ROS 2 Navigation 2-le], BS thesis, 2023&lt;br /&gt;
*Kertrud Geddily Küüt, [https://hdl.handle.net/10062/93424 Kõrgust reguleeriv mehhanism Robotondile] [Height adjusting mechanism for Robotont], BS thesis, 2023&lt;br /&gt;
*Ingvar Drikkit, [https://hdl.handle.net/10062/93421 Lisaseadmete võimekuse arendamine haridusrobotile Robotont] [Developing add-on device support for the educational robot Robotont], BS thesis, 2023&lt;br /&gt;
*Kristo Pool, [https://hdl.handle.net/10062/93430 MoveIt 2 õppematerjalid] [Learning materials for MoveIt 2], BS thesis, 2023&lt;br /&gt;
*Erki Veeväli, [https://hdl.handle.net/10062/93434 Development of a Continuous Teleoperation System for Urban Road Vehicle] [Linnasõiduki pideva kaugjuhtimissüsteemi arendus], BS thesis, 2023&lt;br /&gt;
*Aleksandra Doroshenko, [https://hdl.handle.net/10062/93420 Haiglates inimesi juhatava roboti disain] [Hospital guide robot design], BS thesis, 2023&lt;br /&gt;
*Hui Shi, [http://hdl.handle.net/10062/83043 Expanding the Open-source ROS Software Package opencv_apps with Dedicated Blob Detection Functionality] [Avatud lähtekoodiga ROS-i tarkvarakimbu opencv_apps laiendamine laigutuvasti funktsionaalsusega], BS thesis, 2022&lt;br /&gt;
*Dāvis Krūmiņš, [http://hdl.handle.net/10062/83040 Web-based learning and software development environment for remote access of ROS robots] [Veebipõhine õppe- ja tarkvaraarenduse keskkond ROS robotite juurdepääsuks kaugteel], BS thesis, 2022&lt;br /&gt;
*Anna Jakovleva, [http://hdl.handle.net/10062/83037 Roboquiz - an interactive human-robot game] [Roboquiz - interaktiivne inimese ja roboti mäng], BS thesis, 2022&lt;br /&gt;
*Kristjan Laht, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=74702&amp;amp;year=2022 Robot Localization with Fiducial Markers] [Roboti lokaliseerimine koordinaatmärkidega], BS thesis, 2022&lt;br /&gt;
*Hans Pärtel Pani, [http://hdl.handle.net/10062/83008 ROS draiver pehmerobootika haaratsile] [ROS Driver for Soft Robotic Gripper], BS thesis, 2022&lt;br /&gt;
*Markus Erik Sügis, [http://hdl.handle.net/10062/83015 Jagatud juhtimise põhimõttel realiseeritud robotite kaugjuhtimissüsteem] [A continuous teleoperating system based on shared control concept], BS thesis, 2022&lt;br /&gt;
*Taaniel Küla, [http://hdl.handle.net/10062/83007 ROS2 platvormile keevitusroboti tarkvara portotüübi loomine kasutades UR5 robot-manipulaatorit] [Welding robot software prototype for ROS2 using UR5 robot arm], BS thesis, 2022&lt;br /&gt;
*Rauno Põlluäär, [http://hdl.handle.net/10062/72672 Veebirakendus-põhine kasutajaliides avatud robotplatvormi Robotont juhtimiseks ja haldamiseks] [Web application-based user interface for controlling and managing open-source robotics platform Robotont], BS thesis, 2021&lt;br /&gt;
*Hendrik Olesk, [http://hdl.handle.net/10062/72664 Nägemisulatuses kaugjuhitava mobiilse robotmanipulaatori kasutajamugavuse tõstmine] [Improving the usability of a mobile manipulator robot for line-of-sight remote control], BS thesis, 2021&lt;br /&gt;
*Tarvi Tepandi, [http://hdl.handle.net/10062/72665 Segareaalsusel põhinev kasutajaliides mobiilse roboti kaugjuhtimiseks Microsoft HoloLens 2 vahendusel] [Mixed-reality user interface for teleoperating mobile robots with Microsoft HoloLens 2], BS thesis, 2021&lt;br /&gt;
*Rudolf Põldma, [http://hdl.handle.net/10062/72674 Tartu linna Narva maantee ringristmiku digikaksik] [Digital twin for Narva street roundabout in Tartu], BS thesis, 2021&lt;br /&gt;
*Kwasi Akuamoah Boateng, [http://hdl.handle.net/10062/72804 Digital Twin of a Teaching and Learning Robotics Lab] [Robotite õpetamise ja õppimise labori digitaalne kaksik], BS thesis, 2021&lt;br /&gt;
*Karina Sein, [http://hdl.handle.net/10062/72102 Eestikeelse kõnesünteesi võimaldamine robootika arendusplatvormil ROS] [Enabling Estonian speech synthesis on the Robot Operating System (ROS)], BS thesis, 2020&lt;br /&gt;
*Ranno Mäesepp, [http://hdl.handle.net/10062/72100 Takistuste vältimise lahendus õpperobotile Robotont] [Obstacle avoidance solution for educational robot platform Robotont], BS thesis, 2020&lt;br /&gt;
*Igor Rybalskii, [http://hdl.handle.net/10062/72060 Gesture Detection Software for Human-Robot Collaboration] [Žestituvastus tarkvara inimese ja roboti koostööks], BS thesis, 2020&lt;br /&gt;
*Meelis Pihlap, [http://hdl.handle.net/10062/64292 Mitme roboti koostöö funktsionaalsuste väljatöötamine tarkvararaamistikule TeMoto] [Multi-robot collaboration functionalities for robot software development framework TeMoto], BS thesis, 2019&lt;br /&gt;
*Kaarel Mark, [http://hdl.handle.net/10062/64290 Liitreaalsuse kasutamine tootmisprotsessis asukohtade määramisel] [Augmented reality for location determination in manufacturing], BS thesis, 2019&lt;br /&gt;
*Kätriin Julle, [http://hdl.handle.net/10062/64279 Roboti KUKA youBot riistvara ja ROS-tarkvara uuendamine] [Upgrading robot KUKA youBot’s hardware and ROS-software], BS thesis, 2019&lt;br /&gt;
*Georg Astok, [http://hdl.handle.net/10062/64274 Roboti juhtimine virtuaalreaalsuses kasutades ROS-raamistikku] [Creating virtual reality user interface using only ROS framework], BS thesis, 2019&lt;br /&gt;
*Martin Hallist, [http://hdl.handle.net/10062/64275 Robotipõhine kaugkohalolu käte liigutuste ülekandmiseks] [Teleoperation robot for arms motions], BS thesis, 2019&lt;br /&gt;
*Ahmed Hassan Helmy Mohamed, [http://hdl.handle.net/10062/63946 Software integration of autonomous robot system for mixing and serving drinks] [Jooke valmistava ja serveeriva robotsüsteemi tarkvaralahendus], BS thesis, 2019&lt;br /&gt;
*Kristo Allaje, [http://hdl.handle.net/10062/60294 Draiveripakett käte jälgimise seadme Leap Motion™ kontroller kasutamiseks robootika arendusplatvormil ROS] [Driver package for using Leap Motion™ controller on the robotics development platform ROS], BS thesis, 2018&lt;br /&gt;
*Martin Maidla, [http://hdl.handle.net/10062/60288 Avatud robotiarendusplatvormi Robotont omniliikumise ja odomeetria arendamine] [Omnimotion and odometry development for open robot development platform Robotont], BS thesis, 2018&lt;br /&gt;
*Raid Vellerind, [http://hdl.handle.net/10062/56559 Avatud robotiarendusplatvormi ROS võimekuse loomine Tartu Ülikooli Robotexi robootikaplatvormile] [ROS driver development for the University of Tartu’s Robotex robotics platform], BS thesis, 2017&lt;br /&gt;
&lt;br /&gt;
[[Category:Theses Topics]]&lt;/div&gt;</summary>
		<author><name>Karl</name></author>
	</entry>
	<entry>
		<id>https://ims.ut.ee/index.php?title=Theses_in_Robotics&amp;diff=38641</id>
		<title>Theses in Robotics</title>
		<link rel="alternate" type="text/html" href="https://ims.ut.ee/index.php?title=Theses_in_Robotics&amp;diff=38641"/>
		<updated>2024-06-19T14:15:01Z</updated>

		<summary type="html">&lt;p&gt;Karl: /* Bachelor's theses */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:IMS-robotics]]&lt;br /&gt;
&amp;lt;div class=&amp;quot;toclimit-3&amp;quot;&amp;gt;__TOC__&amp;lt;/div&amp;gt; &lt;br /&gt;
= Projects in Advanced Robotics =&lt;br /&gt;
''The main objective of the following projects is to give students experience in working with advanced robotics technology. Our group is active in several R&amp;amp;D projects involving human-robot collaboration, intuitive teleoperation of robots, and autonomous navigation of unmanned mobile platforms. Our main software platforms are [http://www.ros.org/ Robot Operating System (ROS)] for developing software for advanced robot systems and [http://gazebosim.org/ Gazebo] for running realistic robotic simulations.''&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
For further information, contact [[User:Karl|Karl Kruusamäe]]. &lt;br /&gt;
&lt;br /&gt;
The following is not an exhaustive list of all available thesis/research topics.&lt;br /&gt;
&lt;br /&gt;
== Highlighted thesis topics for 2023/2024 study year ==&lt;br /&gt;
# [[#ROBOTONT: analysis of different options as on-board computers|ROBOTONT: analysis of different options as on-board computers]]&lt;br /&gt;
# [[#SemuBOT: multiple topics|SemuBOT: multiple topics]]&lt;br /&gt;
# [[#Robotic Study Companion: a social robot for students in higher education|Robotic Study Companion: a social robot for students in higher education]]&lt;br /&gt;
# [[#Quantitative Evaluation of the Situation Awareness during the Teleoperation of an Urban Vehicle|Quantitative Evaluation of the Situation Awareness during the Teleoperation of an Urban Vehicle]]&lt;br /&gt;
# [[#ROBOTONT: ROS2 support for robotont|ROBOTONT: ROS2 support for robotont]]&lt;br /&gt;
# [[#Continuous teleoperation setup for controlling mobile robot on streets|Continuous teleoperation setup for controlling mobile robot on streets]]&lt;br /&gt;
# [[#Stratos Explore Ultraleap demonstrator for robotics|Stratos Explore Ultraleap demonstrator for robotics]]&lt;br /&gt;
# [[#Mixed-reality scene creation for vehicle teleoperation|Mixed-reality scene creation for vehicle teleoperation]]&lt;br /&gt;
&lt;br /&gt;
== List of potential thesis topics ==&lt;br /&gt;
Our inventory includes but is not limited to:&lt;br /&gt;
&amp;lt;gallery mode=&amp;quot;packed&amp;quot;&amp;gt;&lt;br /&gt;
File:Youbot.png|120px|thumb|KUKA youBot&lt;br /&gt;
File:Ur5_left.png|120px|thumb|Universal Robot UR5&lt;br /&gt;
File:Franka_Emika_Panda.jpg|120px|thumb|Franka Emika Panda&lt;br /&gt;
File:Clearpath_Jackal.jpg|120px|thumb|Clearpath Jackal&lt;br /&gt;
File:Kinova_KG-3_Gripper.jpg|120px|thumb|Kinova 3-finger gripper&lt;br /&gt;
File:Xarm7.jpg|120px|thumb|UFACTORY xArm7&lt;br /&gt;
File:Robotont_banner.png|120px|thumb|robotont&lt;br /&gt;
File:Turtlebot3-waffle-pi.jpg|thumb|TurtleBot3&lt;br /&gt;
File:Parrot-bebop-2.jpg|thumb|Parrot Bebop 2&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: analysis of different options as on-board computers ===&lt;br /&gt;
Currently, ROBOTONT uses an Intel NUC as its onboard computer. The goal of this thesis is to validate Robotont's software stack on alternative compute devices (e.g. Raspberry Pi 4, Intel Compute Stick, and NVIDIA Jetson Nano) and benchmark their performance for the most popular Robotont use-case demos (e.g. webapp teleop, ar-tag steering, 2D mapping, and 3D mapping). The objective is to propose at least two different compute solutions: one optimized for cost and another optimized for performance. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: integrating a graphical programming interface ===&lt;br /&gt;
The goal of this thesis is to integrate a graphical programming solution (e.g. Scratch or Blockly) to enable programming of Robotont by non-experts.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT Lite ===&lt;br /&gt;
The goal of this thesis is to optimize the ROBOTONT platform for cost by replacing the onboard computer and sensors with low-cost alternatives while ensuring ROS/ROS2 software compatibility. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== SemuBOT: multiple topics ===&lt;br /&gt;
In 2023/2024, many topics are offered to support the development of the open-source humanoid robot project SemuBOT (https://www.facebook.com/semubotmtu/).&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Robotic Study Companion: a social robot for students in higher education ===&lt;br /&gt;
Potential Topics:&lt;br /&gt;
* Enhancing the RSC Speech/NLP Capabilities&lt;br /&gt;
* Building the RSC Human-Robot Multimodal Interaction&lt;br /&gt;
* Developing the RSC Personalities/Behaviour&lt;br /&gt;
* Creating a WebApp for Monitoring Student Performance via the RSC&lt;br /&gt;
* Exploring the RSC use as an Affective Robot with a focus on Students’ Academic Emotions&lt;br /&gt;
[https://github.com/Farnaz03/RoboticStudyCompanion Github] | reach out to farnaz.baksh@ut.ee for more info&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: Quick launch demo suite and the final complete release of its ROS1 software ===&lt;br /&gt;
The goal of this thesis is to refine the existing robotont demos (e.g. ar-tag steering, follow-the-leader, dancing-with-robot, LEAP-based control, etc.) and package them in an easy-to-use way for quick deployment by anyone during public events such as science popularization workshops and school visits.&lt;br /&gt;
The results of this work will be packaged as the final ROS1 release of the Robotont software, as the EOL for ROS1 is in May 2025.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: ROS support, demos, and educational materials for open-source mobile robot ===&lt;br /&gt;
[[Image:RosLarge.png|left|100px|ROS]]&lt;br /&gt;
The project involves many potential thesis topics on the open-source robot platform ROBOTONT. The nature of the thesis can be software development to improve the platform's capabilities, simulation of specific scenarios, and/or demonstration of ROBOTONT in a real-life setting. A more detailed thesis topic will be outlined during an in-person meeting.&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
[[Image:Ros_equation.png|x100px|What is ROS?]]&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Virtual reality user interface (VRUI) for intuitive teleoperation system ===&lt;br /&gt;
[[Image:LeapMotion.png|200px|thumb|right|Detecting 2 hands with Leap Motion Controller]]&lt;br /&gt;
Enhancing the user-experience of a virtual reality UI developed by [https://github.com/ut-ims-robotics/vrui_rviz Georg Astok]. Potentially adding [http://www.osvr.org/hardware-devs.html virtual reality capability] to a [https://www.youtube.com/watch?v=L25HHFd00rc gesture- and natural-language-based robot teleoperation system].&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
[[Image:Temoto-working-principle-in-pics.png|x160px|Gesture-based teleoperation]]&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Health monitor for intuitive telerobot ===&lt;br /&gt;
Intelligent status and error handling for an intuitive telerobotic system.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== 3D scanning of industrial objects ===&lt;br /&gt;
Using laser sensors and cameras to create accurate models of industrial products for quality control or further processing.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Modeling humans for human-robot interaction ===&lt;br /&gt;
True human-robot collaboration means that the robot must understand the actions, intentions, and state of its human partner. This work involves using cameras and other human-sensing devices to digitally represent and model humans. There are multiple stages of modeling: a) physical models of human kinematics and dynamics; b) higher-level models for recognizing human intent.&amp;lt;br&amp;gt;&lt;br /&gt;
[[Image:Skeletal ROS.PNG|x160px|ROS &amp;amp; Kinect &amp;amp; Skeleton-Markers Package]]&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Robotic avatar for telepresence ===&lt;br /&gt;
Integrating hand gestures and head movements to control a robot avatar in virtual reality user interface.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Detection of hardware and software resources for smart integration of robots ===&lt;br /&gt;
The vast majority of today’s robotic applications rely on hard-coded device and algorithm usage. This project focuses on developing Resource Snooper software that can detect the addition or removal of resources for the benefit of dynamic reconfiguration of robotic systems. This project is developed as a subsystem of [https://utnuclearroboticspublic.github.io/temoto2/index.html TeMoto].&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Sonification of feedback during teleoperation of robots ===&lt;br /&gt;
Humans are used to receiving auditory feedback in their everyday lives. It helps us make decisions and be aware of potential dangers. Telerobotic interfaces can deploy the same idea to improve Situational Awareness and robotic task efficiency. The thesis project involves a study of different sonification solutions and the implementation of one in a telerobotic application using ROS. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Human-Robot and Robot-Robot collaboration applications ===&lt;br /&gt;
Creating a demo or analysis of software capabilities related to human-robot or robot-robot teams:&lt;br /&gt;
* human-robot collaborative assembly&lt;br /&gt;
* distributed mapping; analysis and demo of existing ROS (e.g., segmap https://youtu.be/JJhEkIA1xSE) packages for multi-robot mapping&lt;br /&gt;
* Inaccessible region teamwork&lt;br /&gt;
**youbot+drone - a drone maps the environment (for example, a maze) and the ground vehicle uses this information to traverse the maze&lt;br /&gt;
**youbot+robotont - youbot cannot climb ledges, but it can lift a smaller robot, such as robotont, up a ledge.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Mirroring human hand movements on industrial robots ===&lt;br /&gt;
The goal of this project is to integrate continuous control of an industrial robot manipulator with a gestural telerobotics interface. The recommended tools for this thesis project are a Leap Motion Controller or a standard web camera, Ultraleap, a Universal Robot UR5 manipulator, and ROS.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== ROBOTONT: TeMoto for robotont ===&lt;br /&gt;
Swarm-management for robotont using [https://temoto-telerobotics.github.io TeMoto] framework.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Enhancing teleoperation control interface with augmented cues to provoke caution===&lt;br /&gt;
The task is to create a telerobot control interface where the video feed from the remote site and/or a mixed-reality scene is augmented with visual cues to provoke caution in the human operator.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Robot-to-human interaction ===&lt;br /&gt;
As robots and autonomous machines start sharing the same space as humans, their actions need to be understood by the people occupying that space. For instance, a human worker needs to understand what the robot partner is planning next, or a pedestrian needs to clearly comprehend the behaviour of a driverless vehicle. To reduce this ambiguity, the robot needs mechanisms to convey its intent (whatever it is going to do next). The aim of the thesis is to outline existing methods for machines to convey their intent and to develop a unified model interface for expressing that intent. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Gaze-based handover prediction ===&lt;br /&gt;
When a human needs to pass an object to a robot manipulator, the robot must understand where in 3D space the object handover occurs and then plan an appropriate motion. Human gaze can be used as the input for predicting which object to track. The thesis activities involve camera-based eye tracking and safe motion planning.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Real-world demonstrator for MIR+UR+TeMoto integration  ===&lt;br /&gt;
Integration of mobile manipulator (MIR100 + UR5e + Robotiq gripper) to demonstrate TeMoto in a collaborative application.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== ROBOTONT: Human-height human-robot interface for Robotont ground robot ===&lt;br /&gt;
Robotont is an ankle-high flat mobile robot. For humans to interact with Robotont, there is a need for a compact and lightweight mechanical structure that is tall enough for comfortable human-robot interaction. The objective of the thesis is to develop the mechanical designs and build prototypes that ensure stable operation and meet the aesthetic requirements for use in public places. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Stratos Explore Ultraleap demonstrator for robotics ===&lt;br /&gt;
The aim of this thesis is to systematically analyze the strengths and limitations of the Stratos Explore Ultraleap device in the context of controlling a robot, and subsequently to implement a demonstrator application showing its applicability for robot control. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Continuous teleoperation setup for controlling mobile robot on streets ===&lt;br /&gt;
The task of this thesis is to analyse the available options for building a teleoperation cockpit for continuously controlling a mobile robot moving on the streets. The contribution of the thesis is to set up the system, validate its usability, and benchmark its capabilities/limitations on the [https://adl.cs.ut.ee/lab/vehicle ADL vehicle].&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: ROS2 support for robotont ===&lt;br /&gt;
Creating ROS2 support for the robotont mobile platform.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Mixed-reality scene creation for vehicle teleoperation ===&lt;br /&gt;
Fusing different sensory feeds to create a high-usability teleoperation scene.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Validation study for AR-based robot user-interfaces ===&lt;br /&gt;
Designing and carrying out a user study to validate the functionality and usability of a human-robot interface.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Replication of the MIT Hydra demo ===&lt;br /&gt;
The goal of the thesis is to use the Hydra software package and integrate it for a robot used at IMS Robotics (e.g., TIAGo, Jackal, Robotont).&lt;br /&gt;
&amp;lt;br&amp;gt;Hydra takes sensor data as input (stereo or RGB-D camera images and IMU data) and produces a hierarchical model of the environment by estimating the trajectory of the robot (including loop closures), building a metric-semantic 3D mesh, and segmenting objects, places, and rooms in an indoor environment. These representations are combined into a 3D scene graph, which enables novel approaches for hierarchical loop closure detection and ensures the representations remain consistent after loop closure. Hydra is implemented in C++ and is ROS-compatible. It uses a neural-network-based image segmentation front-end but otherwise relies on an efficient, multi-threaded CPU-based implementation, which is suitable for mobile robot deployment.&lt;br /&gt;
&amp;lt;br&amp;gt;LINKS:&lt;br /&gt;
&amp;lt;br&amp;gt;Video: https://youtu.be/qZg2lSeTuvM&lt;br /&gt;
&amp;lt;br&amp;gt;Code: https://github.com/MIT-SPARK/Hydra&lt;br /&gt;
&amp;lt;br&amp;gt;Paper: http://www.roboticsproceedings.org/rss18/p050.pdf&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Quantitative Evaluation of the Situation Awareness during the Teleoperation of an Urban Vehicle ===&lt;br /&gt;
The goal of this thesis is to develop the methodology for measuring the operator's situation awareness while operating an autonomous urban vehicle. Potential metrics could include driving accuracy, speed, and latency. This thesis will be conducted as part of the activities at the [https://adl.cs.ut.ee Autonomous Driving Lab].&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
= Completed projects =&lt;br /&gt;
== PhD theses ==&lt;br /&gt;
*Houman Masnavi, [https://hdl.handle.net/10062/91394 Visibility aware navigation] [Nähtavust arvestav navigatsioon], PhD thesis, 2023&lt;br /&gt;
&lt;br /&gt;
== Master's theses ==&lt;br /&gt;
*Robert Allik, Validation of NoMaD as a Global Planner for Mobile Robots, MS thesis, 2024&lt;br /&gt;
*Rauno Põlluäär, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=79393 Designing and Implementing a Bird’s-eye View Interface for a Self-driving Vehicle’s Teleoperation System] [Isejuhtiva sõiduki kaugjuhtimissüsteemile linnuvaate kasutajaliidese loomine], MS thesis, 2024&lt;br /&gt;
*Eva Mõtshärg, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=77385 3D-prinditava kere disain ja analüüs vabavaralisele haridusrobotile Robotont] [Design and Analysis of a 3D Printable Chassis for the Open Source Educational Robot Robotont], MS thesis, 2023&lt;br /&gt;
*Farnaz Baksh, [https://dspace.ut.ee/items/a7a9cc15-27e9-450c-94e8-3267f0c95c56 An Open-source Robotic Study Companion for University Students] [Avatud lähtekoodiga robotõpikaaslane üliõpilastele], MS thesis, 2023&lt;br /&gt;
*Igor Rybalskii, [http://hdl.handle.net/10062/83028 Augmented reality (AR) for enabling human-robot collaboration with ROS robots] [Liitreaalsus inimese ja roboti koostöö võimaldamiseks ROS-i robotitega], MS thesis, 2022&lt;br /&gt;
*Md. Maniruzzaman, [http://hdl.handle.net/10062/83025 Object search and retrieval in indoor environment using a Mobile Manipulator] [Objektide otsimine ja teisaldamine siseruumides mobiilse manipulaatori abil], MS thesis, 2022&lt;br /&gt;
*Allan Kustavus, [http://hdl.handle.net/10062/72651 Design and Implementation of a Generalized Resource Management Architecture in the TeMoto Software Framework] [Üldise ressursihalduri disain ja teostus TeMoto tarkvara raamistikule], MS thesis, 2021&lt;br /&gt;
*Kristina Meister, [http://hdl.handle.net/10062/72350 External human-vehicle interaction - a study in the context of an autonomous ride-hailing service], MS thesis, 2021&lt;br /&gt;
*Muhammad Usman, [http://hdl.handle.net/10062/72126 Development of an Optimization-Based Motion Planner and Its ROS Interface for a Non-Holonomic Mobile Manipulator] [Optimeerimisele baseeruva liikumisplaneerija arendamine ja selle ROSi liides mitteholonoomse mobiilse manipulaatori jaoks], MS thesis, 2020&lt;br /&gt;
*Maarika Oidekivi, [http://hdl.handle.net/10062/72119 Masina kavatsuse väljendamine ja tõlgendamine] [Communicating and interpreting machine intent], MS thesis, 2020&lt;br /&gt;
*Houman Masnavi, [http://hdl.handle.net/10062/72118 Multi-Robot Motion Planning for Shared Payload Transportation] [Rajaplaneerimine multi-robot süsteemile jagatud lasti transportimisel], MS thesis, 2020&lt;br /&gt;
*Fabian Ernesto Parra Gil, [http://hdl.handle.net/10062/72112 Implementation of Robot Manager Subsystem for Temoto Software Framework] [Robotite Halduri alamsüsteemi väljatöötamine tarkvararaamistikule TEMOTO], MS thesis, 2020&lt;br /&gt;
*Zafarullah, [http://hdl.handle.net/10062/72125 Gaze Assisted Neural Network based Prediction of End-Point of Human Reaching Trajectories], MS thesis, 2020&lt;br /&gt;
*Madis K Nigol, [http://hdl.handle.net/10062/64339 Õppematerjalid robotplatvormile Robotont] [Study materials for robot platform Robotont], MS thesis, 2019&lt;br /&gt;
*Renno Raudmäe, [http://hdl.handle.net/10062/64341 Avatud robotplatvorm Robotont] [Open source robotics platform Robotont], MS thesis, 2019&lt;br /&gt;
*Asif Sattar, [http://hdl.handle.net/10062/64352 Human detection and distance estimation with monocular camera using YOLOv3 neural network] [Inimeste tuvastamine ning kauguse hindamine kasutades kaamerat ning YOLOv3 tehisnärvivõrku], MS thesis, 2019&lt;br /&gt;
*Ragnar Margus, [http://hdl.handle.net/10062/64337 Kergliiklusvahendite jagamisteenuseks vajaliku positsioneerimismooduli loomine ja uurimine] [Development and testing of an IoT module for an electric vehicle sharing service], MS thesis, 2019&lt;br /&gt;
*Pavel Šumejko, [http://hdl.handle.net/10062/64320 Robust Solution for Extrinsic Calibration of a 2D Laser-Rangefinder and a Monocular USB Camera] [Meetod 2D laserkaugusmõõdiku ja USB-kaamera väliseks kalibreerimiseks], MS thesis, 2019&lt;br /&gt;
*Dzvezdana Arsovska, [http://hdl.handle.net/10062/64321 Building an Efficient and Secure Software Supply Pipeline for Aerial Robotics Application] [Efektiivne ja turvaline tarkvaraarendusahel lennurobootika rakenduses], MS thesis, 2019&lt;br /&gt;
*Tõnis Tiimus, [http://hdl.handle.net/10062/60713 Visuaalselt esilekutsutud potentsiaalidel põhinev aju-arvuti liides robootika rakendustes] [A VEP-based BCI for robotics applications], MS thesis, 2018&lt;br /&gt;
*Martin Appo, [http://hdl.handle.net/10062/60718 Hardware-agnostic compliant control ROS package for collaborative industrial manipulators] [Riistvarapaindlik ROSi tarkvarapakett tööstuslike robotite mööndlikuks juhtimiseks], MS thesis, 2018&lt;br /&gt;
*Hassan Mahmoud Shehawy Elhanash, [http://hdl.handle.net/10062/60711 Optical Tracking of Forearm for Classifying Fingers Poses] [Küünarvarre visuaalne jälgimine ennustamaks sõrmede asendeid], MS thesis, 2018&lt;br /&gt;
&lt;br /&gt;
== Bachelor's theses ==&lt;br /&gt;
*Albert Unn, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=79937 Suhtlusvõimekuse arendamine sotsiaalsele humanoidrobotile SemuBot] [Developing the ability to communicate for SemuBot, a social humanoid robot], BS thesis, 2024&lt;br /&gt;
*Veronika-Marina Volynets, [https://hdl.handle.net/10062/99943 Development of Control Electronics and Program for Robotont's Height Adjustment Mechanism] [Juhtelektroonika ja programmi väljatöötamine Robotondi kõrguse reguleerimise mehhanismile], BS thesis, 2024&lt;br /&gt;
*Nikita Kurenkov, [https://hdl.handle.net/10062/99880 Flexible Screen Integration and Development of Neck Movement Mechanism for Social Humanoid Robot SemuBot] [Humanoidroboti näo lahenduse leidmine ja rakendamine; humanoidroboti jaoks kaela mehhanismi väljatöötamine], BS thesis, 2024&lt;br /&gt;
*Elchin Huseynov, [https://hdl.handle.net/10062/99668 Design and Control of a Social Humanoid Robot - SemuBot’s Hand] [Sotsiaalse Humanoidroboti Disain ja Juhtimine - SemuBoti Käsi], BS thesis, 2024&lt;br /&gt;
*Veronika Podliesnova, [https://hdl.handle.net/10062/99636 Real-Time Detection of Robot Failures by Monitoring Operator’s Brain Activity with EEG-based Brain-Computer Interface] [Reaalajas Robotirikke Tuvastamine Operaatori Ajutegevuse Jälgimise teel EEG-põhise Aju-Arvuti Liidese abil], BS thesis, 2024&lt;br /&gt;
*Märten Josh Peedimaa, [https://hdl.handle.net/not-yet Rhea: An Open-Source Table Tennis Ball Launcher Robot for Multiball Training] [Avatud lähtekoodiga lauatennise palliviske robot mitmikpall treenimiseks], BS thesis, 2024&lt;br /&gt;
*Karl Sander Vinkel, [https://hdl.handle.net/not-yet Laadimisjaama ja transportkesta väljatöötamine õpperobotile Robotont] [Development of charging dock and transport case for Robotont], BS thesis, 2024&lt;br /&gt;
*Raimo Köidam, [https://hdl.handle.net/not-yet Valguslahenduse tarkvara väljatöötamine õpperobotile Robotont] [Development of light solution software for the educational robot Robotont], BS thesis, 2024&lt;br /&gt;
*Robert Valge, [https://hdl.handle.net/not-yet Toitepinge ja tarbevoolu monitoorimine ning toitehalduse püsivara loomine õpperobotil Robotont] [Monitoring supply voltage and current consumption and creating firmware for Robotont’s power management system], BS thesis, 2024&lt;br /&gt;
*Leonid Tšigrinski, [https://hdl.handle.net/not-yet Õpperoboti Robotont püsivara arhitektuuri uuendamine] [Education robot &amp;quot;Robotont&amp;quot; firmware architecture updating], BS thesis, 2024&lt;br /&gt;
*Andres Sakk, [https://hdl.handle.net/not-yet Avatud robotplatvormi Robotont 3 kasutajaliidese väljatöötamine] [Development of a user interface for the open robotics platform Robotont 3], BS thesis, 2024&lt;br /&gt;
*Kaur Kullamäe, [https://hdl.handle.net/not-yet Elektroonikalahendus ja püsivara robotkäe juhtimiseks sotsiaalsel humanoidrobotil SemuBot] [Building the SemuBot arm electronics system], BS thesis, 2024&lt;br /&gt;
*Kristjan Madis Kask, [https://hdl.handle.net/not-yet Käe mehaanika disain sotsiaalsele humanoidrobotile SemuBot] [Arm mechanics design for social humanoid robot SemuBot], BS thesis, 2024&lt;br /&gt;
*Sven-Ervin Paap, [https://hdl.handle.net/not-yet ROS2 draiver õpperobotile Robotont] [ROS2 driver for the educational robot Robotont], BS thesis, 2024&lt;br /&gt;
*Timur Nizamov, [https://hdl.handle.net/10062/99592 Audio System for the Social Humanoid Robot SemuBot] [Helisüsteem sotsiaalsele humanoidrobotile SemuBot], BS thesis, 2024&lt;br /&gt;
*Georgs Narbuts, [https://hdl.handle.net/10062/99554 Holonomic Motion Drive System of a Social Humanoid Robot SemuBot] [Sotsiaalse humanoidroboti SemuBoti holonoomne liikuv ajamisüsteem], BS thesis, 2024&lt;br /&gt;
*Iryna Hurova, [https://hdl.handle.net/10062/90638 Kitting station of the learning factory] [Õppetehase komplekteerimisjaam], BS thesis, 2023&lt;br /&gt;
*Paola Avalos Conchas, [https://hdl.handle.net/10062/90641 Payload transportation system of a learning factory] [Õppetehase kasuliku koorma transpordisüsteem], BS thesis, 2023&lt;br /&gt;
*Pille Pärnalaas, [https://not-yet Pöördpõik ajami arendus robotplatvormile Robotont] [Development of swerve drive for robotic platform Robotont], BS thesis, 2023&lt;br /&gt;
*Priit Rooden, [https://not-yet Autonoomse laadimislahenduse väljatöötamine õpperobotile Robotont] [Development of an autonomous charging solution for the robot platform Robotont], BS thesis, 2023&lt;br /&gt;
*Marko Muro, [https://not-yet Robotondi akulahenduse ning 12 V pingeregulaatori prototüüpimine] [Prototyping battery solution and 12 V voltage regulator for Robotont], BS thesis, 2023&lt;br /&gt;
*Danel Leppenen, [https://not-yet Nav2 PYIF: Python-based Motion Planning for ROS 2 Navigation 2] [Nav 2 PYIF: Pythoni põhine liikumise planeerija ROS 2 Navigation 2-le], BS thesis, 2023&lt;br /&gt;
*Kertrud Geddily Küüt, [https://not-yet Kõrgust reguleeriv mehhanism Robotondile] [Height adjusting mechanism for Robotont], BS thesis, 2023&lt;br /&gt;
*Ingvar Drikkit, [https://not-yet Lisaseadmete võimekuse arendamine haridusrobotile Robotont] [Developing add-on device support for the educational robot Robotont], BS thesis, 2023&lt;br /&gt;
*Kristo Pool, [https://not-yet MoveIt 2 õppematerjalid] [Learning materials for MoveIt 2], BS thesis, 2023&lt;br /&gt;
*Erki Veeväli, [https://not-yet Development of a Continuous Teleoperation System for Urban Road Vehicle] [Linnasõiduki pideva kaugjuhtimissüsteemi arendus], BS thesis, 2023&lt;br /&gt;
*Aleksandra Doroshenko, [https://not-yet Haiglates inimesi juhatava roboti disain] [Hospital guide robot design], BS thesis, 2023&lt;br /&gt;
*Hui Shi, [http://hdl.handle.net/10062/83043 Expanding the Open-source ROS Software Package opencv_apps with Dedicated Blob Detection Functionality] [Avatud lähtekoodiga ROS-i tarkvarakimbu opencv_apps laiendamine laigutuvasti funktsionaalsusega], BS thesis, 2022&lt;br /&gt;
*Dāvis Krūmiņš, [http://hdl.handle.net/10062/83040 Web-based learning and software development environment for remote access of ROS robots] [Veebipõhine õppe- ja tarkvaraarenduse keskkond ROS robotite juurdepääsuks kaugteel], BS thesis, 2022&lt;br /&gt;
*Anna Jakovleva, [http://hdl.handle.net/10062/83037 Roboquiz - an interactive human-robot game] [Roboquiz - interaktiivne inimese ja roboti mäng], BS thesis, 2022&lt;br /&gt;
*Kristjan Laht, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=74702&amp;amp;year=2022 Robot Localization with Fiducial Markers] [Roboti lokaliseerimine koordinaatmärkidega], BS thesis, 2022&lt;br /&gt;
*Hans Pärtel Pani, [http://hdl.handle.net/10062/83008 ROS draiver pehmerobootika haaratsile] [ROS Driver for Soft Robotic Gripper], BS thesis, 2022&lt;br /&gt;
*Markus Erik Sügis, [http://hdl.handle.net/10062/83015 Jagatud juhtimise põhimõttel realiseeritud robotite kaugjuhtimissüsteem] [A continuous teleoperating system based on shared control concept], BS thesis, 2022&lt;br /&gt;
*Taaniel Küla, [http://hdl.handle.net/10062/83007 ROS2 platvormile keevitusroboti tarkvara portotüübi loomine kasutades UR5 robot-manipulaatorit] [Welding robot software prototype for ROS2 using UR5 robot arm], BS thesis, 2022&lt;br /&gt;
*Rauno Põlluäär, [http://hdl.handle.net/10062/72672 Veebirakendus-põhine kasutajaliides avatud robotplatvormi Robotont juhtimiseks ja haldamiseks] [Web application-based user interface for controlling and managing open-source robotics platform Robotont], BS thesis, 2021&lt;br /&gt;
*Hendrik Olesk, [http://hdl.handle.net/10062/72664 Nägemisulatuses kaugjuhitava mobiilse robotmanipulaatori kasutajamugavuse tõstmine] [Improving the usability of a mobile manipulator robot for line-of-sight remote control], BS thesis, 2021&lt;br /&gt;
*Tarvi Tepandi, [http://hdl.handle.net/10062/72665 Segareaalsusel põhinev kasutajaliides mobiilse roboti kaugjuhtimiseks Microsoft HoloLens 2 vahendusel] [Mixed-reality user interface for teleoperating mobile robots with Microsoft HoloLens 2], BS thesis, 2021&lt;br /&gt;
*Rudolf Põldma, [http://hdl.handle.net/10062/72674 Tartu linna Narva maantee ringristmiku digikaksik] [Digital twin for Narva street roundabout in Tartu], BS thesis, 2021&lt;br /&gt;
*Kwasi Akuamoah Boateng, [http://hdl.handle.net/10062/72804 Digital Twin of a Teaching and Learning Robotics Lab] [Robotite õpetamise ja õppimise labori digitaalne kaksik], BS thesis, 2021&lt;br /&gt;
*Karina Sein, [http://hdl.handle.net/10062/72102 Eestikeelse kõnesünteesi võimaldamine robootika arendusplatvormil ROS] [Enabling Estonian speech synthesis on the Robot Operating System (ROS)], BS thesis, 2020&lt;br /&gt;
*Ranno Mäesepp, [http://hdl.handle.net/10062/72100 Takistuste vältimise lahendus õpperobotile Robotont] [Obstacle avoidance solution for educational robot platform Robotont], BS thesis, 2020&lt;br /&gt;
*Igor Rybalskii, [http://hdl.handle.net/10062/72060 Gesture Detection Software for Human-Robot Collaboration] [Žestituvastus tarkvara inimese ja roboti koostööks], BS thesis, 2020&lt;br /&gt;
*Meelis Pihlap, [http://hdl.handle.net/10062/64292 Mitme roboti koostöö funktsionaalsuste väljatöötamine tarkvararaamistikule TeMoto] [Multi-robot collaboration functionalities for robot software development framework TeMoto], BS thesis, 2019&lt;br /&gt;
*Kaarel Mark, [http://hdl.handle.net/10062/64290 Liitreaalsuse kasutamine tootmisprotsessis asukohtade määramisel] [Augmented reality for location determination in manufacturing], BS thesis, 2019&lt;br /&gt;
*Kätriin Julle, [http://hdl.handle.net/10062/64279 Roboti KUKA youBot riistvara ja ROS-tarkvara uuendamine] [Upgrading robot KUKA youBot’s hardware and ROS-software], BS thesis, 2019&lt;br /&gt;
*Georg Astok, [http://hdl.handle.net/10062/64274 Roboti juhtimine virtuaalreaalsuses kasutades ROS-raamistikku] [Creating virtual reality user interface using only ROS framework], BS thesis, 2019&lt;br /&gt;
*Martin Hallist, [http://hdl.handle.net/10062/64275 Robotipõhine kaugkohalolu käte liigutuste ülekandmiseks] [Teleoperation robot for arms motions], BS thesis, 2019&lt;br /&gt;
*Ahmed Hassan Helmy Mohamed, [http://hdl.handle.net/10062/63946 Software integration of autonomous robot system for mixing and serving drinks] [Jooke valmistava ja serveeriva robotsüsteemi tarkvaralahendus], BS thesis, 2019&lt;br /&gt;
*Kristo Allaje, [http://hdl.handle.net/10062/60294 Draiveripakett käte jälgimise seadme Leap Motion™ kontroller kasutamiseks robootika arendusplatvormil ROS] [Driver package for using Leap Motion™ controller on the robotics development platform ROS], BS thesis, 2018&lt;br /&gt;
*Martin Maidla, [http://hdl.handle.net/10062/60288 Avatud robotiarendusplatvormi Robotont omniliikumise ja odomeetria arendamine] [Omnimotion and odometry development for open robot development platform Robotont], BS thesis, 2018&lt;br /&gt;
*Raid Vellerind, [http://hdl.handle.net/10062/56559 Avatud robotiarendusplatvormi ROS võimekuse loomine Tartu Ülikooli Robotexi robootikaplatvormile] [ROS driver development for the University of Tartu’s Robotex robotics platform], BS thesis, 2017&lt;br /&gt;
&lt;br /&gt;
[[Category:Theses Topics]]&lt;/div&gt;</summary>
		<author><name>Karl</name></author>
	</entry>
	<entry>
		<id>https://ims.ut.ee/index.php?title=Theses_in_Robotics&amp;diff=38640</id>
		<title>Theses in Robotics</title>
		<link rel="alternate" type="text/html" href="https://ims.ut.ee/index.php?title=Theses_in_Robotics&amp;diff=38640"/>
		<updated>2024-06-19T14:08:58Z</updated>

		<summary type="html">&lt;p&gt;Karl: /* Bachelor's theses */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:IMS-robotics]]&lt;br /&gt;
&amp;lt;div class=&amp;quot;toclimit-3&amp;quot;&amp;gt;__TOC__&amp;lt;/div&amp;gt; &lt;br /&gt;
= Projects in Advanced Robotics =&lt;br /&gt;
''The main objective of the following projects is to give students experience in working with advanced robotics technology. Our group is active in several R&amp;amp;D projects involving human-robot collaboration, intuitive teleoperation of robots, and autonomous navigation of unmanned mobile platforms. Our main software platforms are [http://www.ros.org/ Robot Operating System (ROS)] for developing software for advanced robot systems and [http://gazebosim.org/ Gazebo] for running realistic robotic simulations.''&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
For further information, contact [[User:Karl|Karl Kruusamäe]]. &lt;br /&gt;
&lt;br /&gt;
The following is not an exhaustive list of all available thesis/research topics.&lt;br /&gt;
&lt;br /&gt;
== Highlighted thesis topics for 2023/2024 study year ==&lt;br /&gt;
# [[#ROBOTONT: analysis of different options as on-board computers|ROBOTONT: analysis of different options as on-board computers]]&lt;br /&gt;
# [[#SemuBOT: multiple topics|SemuBOT: multiple topics]]&lt;br /&gt;
# [[#Robotic Study Companion: a social robot for students in higher education|Robotic Study Companion: a social robot for students in higher education]]&lt;br /&gt;
# [[#Quantitative Evaluation of the Situation Awareness during the Teleoperation of an Urban Vehicle|Quantitative Evaluation of the Situation Awareness during the Teleoperation of an Urban Vehicle]]&lt;br /&gt;
# [[#ROBOTONT: ROS2 support for robotont|ROBOTONT: ROS2 support for robotont]]&lt;br /&gt;
# [[#Continuous teleoperation setup for controlling mobile robot on streets|Continuous teleoperation setup for controlling mobile robot on streets]]&lt;br /&gt;
# [[#Stratos Explore Ultraleap demonstrator for robotics|Stratos Explore Ultraleap demonstrator for robotics]]&lt;br /&gt;
# [[#Mixed-reality scene creation for vehicle teleoperation|Mixed-reality scene creation for vehicle teleoperation]]&lt;br /&gt;
&lt;br /&gt;
== List of potential thesis topics ==&lt;br /&gt;
Our inventory includes but is not limited to:&lt;br /&gt;
&amp;lt;gallery mode=&amp;quot;packed&amp;quot;&amp;gt;&lt;br /&gt;
File:Youbot.png|120px|thumb|KUKA youBot&lt;br /&gt;
File:Ur5_left.png|120px|thumb|Universal Robot UR5&lt;br /&gt;
File:Franka_Emika_Panda.jpg|120px|thumb|Franka Emika Panda&lt;br /&gt;
File:Clearpath_Jackal.jpg|120px|thumb|Clearpath Jackal&lt;br /&gt;
File:Kinova_KG-3_Gripper.jpg|120px|thumb|Kinova 3-finger gripper&lt;br /&gt;
File:Xarm7.jpg|120px|thumb|UFACTORY xArm7&lt;br /&gt;
File:Robotont_banner.png|120px|thumb|robotont&lt;br /&gt;
File:Turtlebot3-waffle-pi.jpg|thumb|TurtleBot3&lt;br /&gt;
File:Parrot-bebop-2.jpg|thumb|Parrot Bebop 2&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: analysis of different options as on-board computers ===&lt;br /&gt;
Currently, ROBOTONT uses an Intel NUC as its onboard computer. The goal of this thesis is to validate Robotont's software stack on alternative compute devices (e.g. Raspberry Pi 4, Intel Compute Stick, and NVIDIA Jetson Nano) and benchmark their performance on the most popular Robotont use-case demos (e.g. webapp teleop, ar-tag steering, 2D mapping, and 3D mapping). The objective is to propose at least two different compute solutions: one that optimizes for cost and another that optimizes for performance. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: integrating a graphical programming interface ===&lt;br /&gt;
The goal of this thesis is to integrate a graphical programming solution (e.g. Scratch or Blockly) to enable programming of Robotont by non-experts.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT Lite ===&lt;br /&gt;
The goal of this thesis is to optimize the ROBOTONT platform for cost by replacing the onboard computer and sensors with low-cost alternatives while ensuring ROS/ROS2 software compatibility. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== SemuBOT: multiple topics ===&lt;br /&gt;
In 2023/2024, many topics are offered in support of the open-source humanoid robot project SemuBOT (https://www.facebook.com/semubotmtu/).&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Robotic Study Companion: a social robot for students in higher education ===&lt;br /&gt;
Potential Topics:&lt;br /&gt;
* Enhancing the RSC Speech/NLP Capabilities&lt;br /&gt;
* Building the RSC Human-Robot Multimodal Interaction&lt;br /&gt;
* Developing the RSC Personalities/Behaviour&lt;br /&gt;
* Creating a WebApp for Monitoring Student Performance via the RSC&lt;br /&gt;
* Exploring the RSC use as an Affective Robot with a focus on Students’ Academic Emotions&lt;br /&gt;
[https://github.com/Farnaz03/RoboticStudyCompanion Github] | reach out to farnaz.baksh@ut.ee for more info&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: Quick launch demo suite and the final complete release of its ROS1 software ===&lt;br /&gt;
The goal of this thesis is to refine existing Robotont demos (e.g. ar-tag steering, follow-the-leader, dancing-with-robot, LEAP-based control, etc.) and package them in an easy-to-use way for quick deployment by anyone during public events such as science popularization workshops and school visits.&lt;br /&gt;
The results of this work will be packaged as the final ROS1 release of Robotont software, as ROS1 reaches end-of-life (EOL) in May 2025.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: ROS support, demos, and educational materials for open-source mobile robot ===&lt;br /&gt;
[[Image:RosLarge.png|left|100px|ROS]]&lt;br /&gt;
The project involves many potential thesis topics on the open-source robot platform ROBOTONT. The nature of the thesis can be software development to improve the platform's capabilities, simulation of specific scenarios, and/or demonstration of ROBOTONT in a real-life setting. A more detailed thesis topic will be outlined during an in-person meeting.&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
[[Image:Ros_equation.png|x100px|What is ROS?]]&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Virtual reality user interface (VRUI) for intuitive teleoperation system ===&lt;br /&gt;
[[Image:LeapMotion.png|200px|thumb|right|Detecting 2 hands with Leap Motion Controller]]&lt;br /&gt;
Enhancing the user-experience of a virtual reality UI developed by [https://github.com/ut-ims-robotics/vrui_rviz Georg Astok]. Potentially adding [http://www.osvr.org/hardware-devs.html virtual reality capability] to a [https://www.youtube.com/watch?v=L25HHFd00rc gesture- and natural-language-based robot teleoperation system].&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
[[Image:Temoto-working-principle-in-pics.png|x160px|Gesture-based teleoperation]]&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Health monitor for intuitive telerobot ===&lt;br /&gt;
Intelligent status and error handling for an intuitive telerobotic system.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== 3D scanning of industrial objects ===&lt;br /&gt;
Using laser sensors and cameras to create accurate models of industrial products for quality control or further processing.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Modeling humans for human-robot interaction ===&lt;br /&gt;
True human-robot collaboration means that the robot must understand the actions, intention, and state of its human partner. This work involves using cameras and other human-sensing devices to digitally represent and model humans. There are multiple stages of modeling: a) physical models of human kinematics and dynamics; b) higher-level models for recognizing human intent.&amp;lt;br&amp;gt;&lt;br /&gt;
[[Image:Skeletal ROS.PNG|x160px|ROS &amp;amp; Kinect &amp;amp; Skeleton-Markers Package]]&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Robotic avatar for telepresence ===&lt;br /&gt;
Integrating hand gestures and head movements to control a robot avatar in virtual reality user interface.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Detection of hardware and software resources for smart integration of robots ===&lt;br /&gt;
The vast majority of today’s robotic applications rely on hard-coded device and algorithm usage. This project focuses on developing Resource Snooper software that can detect the addition or removal of resources for the benefit of dynamic reconfiguration of robotic systems. This project is developed as a subsystem of [https://utnuclearroboticspublic.github.io/temoto2/index.html TeMoto].&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Sonification of feedback during teleoperation of robots ===&lt;br /&gt;
Humans are used to receiving auditory feedback in their everyday lives. It helps us make decisions and be aware of potential dangers. Telerobotic interfaces can deploy the same idea to improve situational awareness and robotic task efficiency. The thesis project involves a study of different sonification solutions and the implementation of one in a telerobotic application using ROS. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Human-Robot and Robot-Robot collaboration applications ===&lt;br /&gt;
Creating a demo or analysis of software capabilities related to human-robot or robot-robot teams:&lt;br /&gt;
* human-robot collaborative assembly&lt;br /&gt;
* distributed mapping: analysis and demo of existing ROS packages (e.g., segmap https://youtu.be/JJhEkIA1xSE) for multi-robot mapping&lt;br /&gt;
* Inaccessible region teamwork&lt;br /&gt;
**youbot+drone - a drone maps the environment (for example, a maze) and a ground vehicle uses this information to traverse the maze&lt;br /&gt;
**youbot+robotont - the youbot cannot go up ledges, but it can lift a smaller robot, such as robotont, up a ledge.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Mirroring human hand movements on industrial robots ===&lt;br /&gt;
The goal of this project is to integrate continuous control of an industrial robot manipulator with a gestural telerobotics interface. The recommended tools for this thesis project are a Leap Motion Controller or a standard web camera, Ultraleap, a Universal Robot UR5 manipulator, and ROS.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== ROBOTONT: TeMoto for robotont ===&lt;br /&gt;
Swarm-management for robotont using [https://temoto-telerobotics.github.io TeMoto] framework.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Enhancing teleoperation control interface with augmented cues to provoke caution===&lt;br /&gt;
The task is to create a telerobot control interface where the video feed from the remote site and/or a mixed-reality scene is augmented with visual cues to provoke caution in the human operator.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Robot-to-human interaction ===&lt;br /&gt;
As robots and autonomous machines start sharing the same space as humans, their actions need to be understood by the people occupying that space. For instance, a human worker needs to understand what the robot partner is planning next, or a pedestrian needs to clearly comprehend the behaviour of a driverless vehicle. To reduce this ambiguity, the robot needs mechanisms to convey its intent (whatever it is going to do next). The aim of the thesis is to outline existing methods for machines to convey their intent and to develop a unified model interface for expressing that intent. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Gaze-based handover prediction ===&lt;br /&gt;
When a human needs to pass an object to a robot manipulator, the robot must understand where in 3D space the object handover occurs and then plan an appropriate motion. Human gaze can be used as the input for predicting which object to track. The thesis activities involve camera-based eye tracking and safe motion planning.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Real-world demonstrator for MIR+UR+TeMoto integration  ===&lt;br /&gt;
Integration of mobile manipulator (MIR100 + UR5e + Robotiq gripper) to demonstrate TeMoto in a collaborative application.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== ROBOTONT: Human-height human-robot interface for Robotont ground robot ===&lt;br /&gt;
Robotont is an ankle-high flat mobile robot. For humans to interact with Robotont, there is a need for a compact and lightweight mechanical structure that is tall enough for comfortable human-robot interaction. The objective of the thesis is to develop the mechanical designs and build prototypes that ensure stable operation and meet the aesthetic requirements for use in public places. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Stratos Explore Ultraleap demonstrator for robotics ===&lt;br /&gt;
The aim of this thesis is to systematically analyze the strengths and limitations of the Stratos Explore Ultraleap device in the context of controlling a robot, and subsequently to implement a demonstrator application showing its applicability for robot control. &lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Continuous teleoperation setup for controlling mobile robot on streets ===&lt;br /&gt;
The task in this thesis is to analyse the available options for building a teleoperation cockpit for continuously controlling a mobile robot moving on the streets. The contribution of the thesis is to set up the system, validate its usability, and benchmark its capabilities/limitations on the [https://adl.cs.ut.ee/lab/vehicle ADL vehicle].&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== ROBOTONT: ROS2 support for robotont ===&lt;br /&gt;
Creating ROS2 support for the robotont mobile platform.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Mixed-reality scene creation for vehicle teleoperation ===&lt;br /&gt;
Fusing different sensory feeds to create a high-usability teleoperation scene.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Validation study for AR-based robot user-interfaces ===&lt;br /&gt;
Designing and carrying out a user study to validate the functionality and usability of a human-robot interface.&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
=== Replication of the MIT Hydra demo ===&lt;br /&gt;
The goal of the thesis is to use the Hydra software package and integrate it with a robot used at IMS Robotics (e.g., TIAGo, Jackal, Robotont).&lt;br /&gt;
&amp;lt;br&amp;gt;Hydra takes sensor data as input (stereo or RGB-D camera images and IMU data) and produces a hierarchical model of the environment by estimating the trajectory of the robot (including loop closures), building a metric-semantic 3D mesh, and segmenting objects, places, and rooms in an indoor environment. These representations are combined into a 3D scene graph, which enables novel approaches for hierarchical loop closure detection and ensures the representations remain consistent after loop closure. Hydra is implemented in C++ and is ROS-compatible. It uses a neural-network-based image segmentation front-end, but otherwise relies on an efficient, multi-threaded CPU-based implementation, which is suitable for mobile robot deployment.&lt;br /&gt;
&amp;lt;br&amp;gt;LINKS:&lt;br /&gt;
&amp;lt;br&amp;gt;Video: https://youtu.be/qZg2lSeTuvM&lt;br /&gt;
&amp;lt;br&amp;gt;Code: https://github.com/MIT-SPARK/Hydra&lt;br /&gt;
&amp;lt;br&amp;gt;Paper: http://www.roboticsproceedings.org/rss18/p050.pdf&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Quantitative Evaluation of the Situation Awareness during the Teleoperation of an Urban Vehicle ===&lt;br /&gt;
The goal of this thesis is to develop the methodology for measuring the operator's situation awareness while operating an autonomous urban vehicle. Potential metrics could include driving accuracy, speed, and latency. This thesis will be conducted as part of the activities at the [https://adl.cs.ut.ee Autonomous Driving Lab].&lt;br /&gt;
&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
= Completed projects =&lt;br /&gt;
== PhD theses ==&lt;br /&gt;
*Houman Masnavi, [https://hdl.handle.net/10062/91394 Visibility aware navigation] [Nähtavust arvestav navigatsioon], PhD thesis, 2023&lt;br /&gt;
&lt;br /&gt;
== Master's theses ==&lt;br /&gt;
*Robert Allik, Validation of NoMaD as a Global Planner for Mobile Robots, MS thesis, 2024&lt;br /&gt;
*Rauno Põlluäär, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=79393 Designing and Implementing a Bird’s-eye View Interface for a Self-driving Vehicle’s Teleoperation System] [Isejuhtiva sõiduki kaugjuhtimissüsteemile linnuvaate kasutajaliidese loomine], MS thesis, 2024&lt;br /&gt;
*Eva Mõtshärg, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=77385 3D-prinditava kere disain ja analüüs vabavaralisele haridusrobotile Robotont] [Design and Analysis of a 3D Printable Chassis for the Open Source Educational Robot Robotont], MS thesis, 2023&lt;br /&gt;
*Farnaz Baksh, [https://dspace.ut.ee/items/a7a9cc15-27e9-450c-94e8-3267f0c95c56 An Open-source Robotic Study Companion for University Students] [Avatud lähtekoodiga robotõpikaaslane üliõpilastele], MS thesis, 2023&lt;br /&gt;
*Igor Rybalskii, [http://hdl.handle.net/10062/83028 Augmented reality (AR) for enabling human-robot collaboration with ROS robots] [Liitreaalsus inimese ja roboti koostöö võimaldamiseks ROS-i robotitega], MS thesis, 2022&lt;br /&gt;
*Md. Maniruzzaman, [http://hdl.handle.net/10062/83025 Object search and retrieval in indoor environment using a Mobile Manipulator] [Objektide otsimine ja teisaldamine siseruumides mobiilse manipulaatori abil], MS thesis, 2022&lt;br /&gt;
*Allan Kustavus, [http://hdl.handle.net/10062/72651 Design and Implementation of a Generalized Resource Management Architecture in the TeMoto Software Framework] [Üldise ressursihalduri disain ja teostus TeMoto tarkvara raamistikule], MS thesis, 2021&lt;br /&gt;
*Kristina Meister, [http://hdl.handle.net/10062/72350 External human-vehicle interaction - a study in the context of an autonomous ride-hailing service], MS thesis, 2021&lt;br /&gt;
*Muhammad Usman, [http://hdl.handle.net/10062/72126 Development of an Optimization-Based Motion Planner and Its ROS Interface for a Non-Holonomic Mobile Manipulator] [Optimeerimisele baseeruva liikumisplaneerija arendamine ja selle ROSi liides mitteholonoomse mobiilse manipulaatori jaoks], MS thesis, 2020&lt;br /&gt;
*Maarika Oidekivi, [http://hdl.handle.net/10062/72119 Masina kavatsuse väljendamine ja tõlgendamine] [Communicating and interpreting machine intent], MS thesis, 2020&lt;br /&gt;
*Houman Masnavi, [http://hdl.handle.net/10062/72118 Multi-Robot Motion Planning for Shared Payload Transportation] [Rajaplaneerimine multi-robot süsteemile jagatud lasti transportimisel], MS thesis, 2020&lt;br /&gt;
*Fabian Ernesto Parra Gil, [http://hdl.handle.net/10062/72112 Implementation of Robot Manager Subsystem for Temoto Software Framework] [Robotite Halduri alamsüsteemi väljatöötamine tarkvararaamistikule TEMOTO], MS thesis, 2020&lt;br /&gt;
*Zafarullah, [http://hdl.handle.net/10062/72125 Gaze Assisted Neural Network based Prediction of End-Point of Human Reaching Trajectories], MS thesis, 2020&lt;br /&gt;
*Madis K Nigol, [http://hdl.handle.net/10062/64339 Õppematerjalid robotplatvormile Robotont] [Study materials for robot platform Robotont], MS thesis, 2019&lt;br /&gt;
*Renno Raudmäe, [http://hdl.handle.net/10062/64341 Avatud robotplatvorm Robotont] [Open source robotics platform Robotont], MS thesis, 2019&lt;br /&gt;
*Asif Sattar, [http://hdl.handle.net/10062/64352 Human detection and distance estimation with monocular camera using YOLOv3 neural network] [Inimeste tuvastamine ning kauguse hindamine kasutades kaamerat ning YOLOv3 tehisnärvivõrku], MS thesis, 2019&lt;br /&gt;
*Ragnar Margus, [http://hdl.handle.net/10062/64337 Kergliiklusvahendite jagamisteenuseks vajaliku positsioneerimismooduli loomine ja uurimine] [Development and testing of an IoT module for electric vehicle sharing service], MS thesis, 2019&lt;br /&gt;
*Pavel Šumejko, [http://hdl.handle.net/10062/64320 Robust Solution for Extrinsic Calibration of a 2D Laser-Rangefinder and a Monocular USB Camera] [Meetod 2D laserkaugusmõõdiku ja USB-kaamera väliseks kalibreerimiseks], MS thesis, 2019&lt;br /&gt;
*Dzvezdana Arsovska, [http://hdl.handle.net/10062/64321  Building an Efficient and Secure Software Supply Pipeline for Aerial Robotics Application] [Efektiivne ja turvaline tarkvaraarendusahel lennurobootika rakenduses], MS thesis, 2019&lt;br /&gt;
*Tõnis Tiimus, [http://hdl.handle.net/10062/60713 Visuaalselt esilekutsutud potentsiaalidel põhinev aju-arvuti liides robootika rakendustes] [A VEP-based BCI for robotics applications], MS thesis, 2018&lt;br /&gt;
*Martin Appo, [http://hdl.handle.net/10062/60718 Hardware-agnostic compliant control ROS package for collaborative industrial manipulators] [Riistvarapaindlik ROSi tarkvarapakett tööstuslike robotite mööndlikuks juhtimiseks], MS thesis, 2018&lt;br /&gt;
*Hassan Mahmoud Shehawy Elhanash, [http://hdl.handle.net/10062/60711 Optical Tracking of Forearm for Classifying Fingers Poses] [Küünarvarre visuaalne jälgimine ennustamaks sõrmede asendeid], MS thesis, 2018&lt;br /&gt;
&lt;br /&gt;
== Bachelor's theses ==&lt;br /&gt;
*Albert Unn, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=79937 Suhtlusvõimekuse arendamine sotsiaalsele humanoidrobotile SemuBot] [Developing the ability to communicate for SemuBot, a social humanoid robot], BS thesis, 2024&lt;br /&gt;
*Veronika-Marina Volynets, [https://hdl.handle.net/10062/99943 Development of Control Electronics and Program for Robotont's Height Adjustment Mechanism] [Juhtelektroonika ja programmi väljatöötamine Robotondi kõrguse reguleerimise mehhanismile], BS thesis, 2024&lt;br /&gt;
*Nikita Kurenkov, [https://hdl.handle.net/10062/99880 Flexible Screen Integration and Development of Neck Movement Mechanism for Social Humanoid Robot SemuBot] [Humanoidroboti näo lahenduse leidmine ja rakendamine; humanoidroboti jaoks kaela mehhanismi väljatöötamine], BS thesis, 2024&lt;br /&gt;
*Elchin Huseynov, [https://hdl.handle.net/10062/99668 Design and Control of a Social Humanoid Robot - SemuBot’s Hand] [Sotsiaalse Humanoidroboti Disain ja Juhtimine - SemuBoti Käsi], BS thesis, 2024&lt;br /&gt;
*Veronika Podliesnova, [https://hdl.handle.net/10062/99636 Real-Time Detection of Robot Failures by Monitoring Operator’s Brain Activity with EEG-based Brain-Computer Interface] [Reaalajas Robotirikke Tuvastamine Operaatori Ajutegevuse Jälgimise teel EEG-põhise Aju-Arvuti Liidese abil], BS thesis, 2024&lt;br /&gt;
*Märten Josh Peedimaa, [https://hdl.handle.net/not-yet TODO] [TODO], BS thesis, 2024&lt;br /&gt;
*Karl Sander Vinkel, [https://hdl.handle.net/not-yet TODO] [TODO], BS thesis, 2024&lt;br /&gt;
*Raimo Köidam, [https://hdl.handle.net/not-yet TODO] [TODO], BS thesis, 2024&lt;br /&gt;
*Robert Valge, [https://hdl.handle.net/not-yet TODO] [TODO], BS thesis, 2024&lt;br /&gt;
*Leonid Tšigrinski, [https://hdl.handle.net/not-yet TODO] [TODO], BS thesis, 2024&lt;br /&gt;
*Andres Sakk, [https://hdl.handle.net/not-yet TODO] [TODO], BS thesis, 2024&lt;br /&gt;
*Kaur Kullamäe, [https://hdl.handle.net/not-yet TODO] [TODO], BS thesis, 2024&lt;br /&gt;
*Kristjan Madis Kask, [https://hdl.handle.net/not-yet TODO] [TODO], BS thesis, 2024&lt;br /&gt;
*Sven-Ervin Paap, [https://hdl.handle.net/not-yet ROS2 draiver õpperobotile Robotont] [ROS2 driver for the educational robot Robotont], BS thesis, 2024&lt;br /&gt;
*Timur Nizamov, [https://hdl.handle.net/10062/99592 Audio System for the Social Humanoid Robot SemuBot] [Helisüsteem sotsiaalsele humanoidrobotile SemuBot], BS thesis, 2024&lt;br /&gt;
*Georgs Narbuts, [https://hdl.handle.net/10062/99554 Holonomic Motion Drive System of a Social Humanoid Robot SemuBot] [Sotsiaalse humanoidroboti SemuBoti holonoomne liikuv ajamisüsteem], BS thesis, 2024&lt;br /&gt;
*Iryna Hurova, [https://hdl.handle.net/10062/90638 Kitting station of the learning factory] [Õppetehase komplekteerimisjaam], BS thesis, 2023&lt;br /&gt;
*Paola Avalos Conchas, [https://hdl.handle.net/10062/90641 Payload transportation system of a learning factory] [Õppetehase kasuliku koorma transpordisüsteem], BS thesis, 2023&lt;br /&gt;
*Pille Pärnalaas, [https://not-yet Pöördpõik ajami arendus robotplatvormile Robotont] [Development of swerve drive for robotic platform Robotont], BS thesis, 2023&lt;br /&gt;
*Priit Rooden, [https://not-yet Autonoomse laadimislahenduse väljatöötamine õpperobotile Robotont] [Development of an autonomous charging solution for the robot platform Robotont], BS thesis, 2023&lt;br /&gt;
*Marko Muro, [https://not-yet Robotondi akulahenduse ning 12 V pingeregulaatori prototüüpimine] [Prototyping battery solution and 12 V voltage regulator for Robotont], BS thesis, 2023&lt;br /&gt;
*Danel Leppenen, [https://not-yet Nav2 PYIF: Python-based Motion Planning for ROS 2 Navigation 2] [Nav 2 PYIF: Pythoni põhine liikumise planeerija ROS 2 Navigation 2-le], BS thesis, 2023&lt;br /&gt;
*Kertrud Geddily Küüt, [https://not-yet Kõrgust reguleeriv mehhanism Robotondile] [Height adjusting mechanism for Robotont], BS thesis, 2023&lt;br /&gt;
*Ingvar Drikkit, [https://not-yet Lisaseadmete võimekuse arendamine haridusrobotile Robotont] [Developing add-on device support for the educational robot Robotont], BS thesis, 2023&lt;br /&gt;
*Kristo Pool, [https://not-yet MoveIt 2 õppematerjalid] [Learning materials for MoveIt 2], BS thesis, 2023&lt;br /&gt;
*Erki Veeväli, [https://not-yet Development of a Continuous Teleoperation System for Urban Road Vehicle] [Linnasõiduki pideva kaugjuhtimissüsteemi arendus], BS thesis, 2023&lt;br /&gt;
*Aleksandra Doroshenko, [https://not-yet Haiglates inimesi juhatava roboti disain] [Hospital guide robot design], BS thesis, 2023&lt;br /&gt;
*Hui Shi, [http://hdl.handle.net/10062/83043 Expanding the Open-source ROS Software Package opencv_apps with Dedicated Blob Detection Functionality] [Avatud lähtekoodiga ROS-i tarkvarakimbu opencv_apps laiendamine laigutuvasti funktsionaalsusega], BS thesis, 2022&lt;br /&gt;
*Dāvis Krūmiņš, [http://hdl.handle.net/10062/83040 Web-based learning and software development environment for remote access of ROS robots] [Veebipõhine õppe- ja tarkvaraarenduse keskkond ROS robotite juurdepääsuks kaugteel], BS thesis, 2022&lt;br /&gt;
*Anna Jakovleva, [http://hdl.handle.net/10062/83037 Roboquiz - an interactive human-robot game] [Roboquiz - interaktiivne inimese ja roboti mäng], BS thesis, 2022&lt;br /&gt;
*Kristjan Laht, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=74702&amp;amp;year=2022 Robot Localization with Fiducial Markers] [Roboti lokaliseerimine koordinaatmärkidega], BS thesis, 2022&lt;br /&gt;
*Hans Pärtel Pani, [http://hdl.handle.net/10062/83008 ROS draiver pehmerobootika haaratsile] [ROS Driver for Soft Robotic Gripper], BS thesis, 2022&lt;br /&gt;
*Markus Erik Sügis, [http://hdl.handle.net/10062/83015 Jagatud juhtimise põhimõttel realiseeritud robotite kaugjuhtimissüsteem] [A continuous teleoperating system based on shared control concept], BS thesis, 2022&lt;br /&gt;
*Taaniel Küla, [http://hdl.handle.net/10062/83007 ROS2 platvormile keevitusroboti tarkvara prototüübi loomine kasutades UR5 robot-manipulaatorit] [Welding robot software prototype for ROS2 using UR5 robot arm], BS thesis, 2022&lt;br /&gt;
*Rauno Põlluäär, [http://hdl.handle.net/10062/72672 Veebirakendus-põhine kasutajaliides avatud robotplatvormi Robotont juhtimiseks ja haldamiseks] [Web application-based user interface for controlling and managing open-source robotics platform Robotont], BS thesis, 2021&lt;br /&gt;
*Hendrik Olesk, [http://hdl.handle.net/10062/72664 Nägemisulatuses kaugjuhitava mobiilse robotmanipulaatori kasutajamugavuse tõstmine] [Improving the usability of a mobile manipulator robot for line-of-sight remote control], BS thesis, 2021&lt;br /&gt;
*Tarvi Tepandi, [http://hdl.handle.net/10062/72665 Segareaalsusel põhinev kasutajaliides mobiilse roboti kaugjuhtimiseks Microsoft HoloLens 2 vahendusel] [Mixed-reality user interface for teleoperating mobile robots with Microsoft HoloLens 2], BS thesis, 2021&lt;br /&gt;
*Rudolf Põldma, [http://hdl.handle.net/10062/72674 Tartu linna Narva maantee ringristmiku digikaksik] [Digital twin for Narva street roundabout in Tartu], BS thesis, 2021&lt;br /&gt;
*Kwasi Akuamoah Boateng, [http://hdl.handle.net/10062/72804 Digital Twin of a Teaching and Learning Robotics Lab] [Robotite õpetamise ja õppimise labori digitaalne kaksik], BS thesis, 2021&lt;br /&gt;
*Karina Sein, [http://hdl.handle.net/10062/72102 Eestikeelse kõnesünteesi võimaldamine robootika arendusplatvormil ROS] [Enabling Estonian speech synthesis on the Robot Operating System (ROS)], BS thesis, 2020&lt;br /&gt;
*Ranno Mäesepp, [http://hdl.handle.net/10062/72100 Takistuste vältimise lahendus õpperobotile Robotont] [Obstacle avoidance solution for educational robot platform Robotont], BS thesis, 2020&lt;br /&gt;
*Igor Rybalskii, [http://hdl.handle.net/10062/72060 Gesture Detection Software for Human-Robot Collaboration] [Žestituvastus tarkvara inimese ja roboti koostööks], BS thesis, 2020&lt;br /&gt;
*Meelis Pihlap, [http://hdl.handle.net/10062/64292 Mitme roboti koostöö funktsionaalsuste väljatöötamine tarkvararaamistikule TeMoto] [Multi-robot collaboration functionalities for robot software development framework TeMoto], BS thesis, 2019&lt;br /&gt;
*Kaarel Mark, [http://hdl.handle.net/10062/64290 Liitreaalsuse kasutamine tootmisprotsessis asukohtade määramisel] [Augmented reality for location determination in manufacturing], BS thesis, 2019&lt;br /&gt;
*Kätriin Julle, [http://hdl.handle.net/10062/64279 Roboti KUKA youBot riistvara ja ROS-tarkvara uuendamine] [Upgrading robot KUKA youBot’s hardware and ROS-software], BS thesis, 2019&lt;br /&gt;
*Georg Astok, [http://hdl.handle.net/10062/64274 Roboti juhtimine virtuaalreaalsuses kasutades ROS-raamistikku] [Creating virtual reality user interface using only ROS framework], BS thesis, 2019&lt;br /&gt;
*Martin Hallist, [http://hdl.handle.net/10062/64275 Robotipõhine kaugkohalolu käte liigutuste ülekandmiseks] [Teleoperation robot for arms motions], BS thesis, 2019&lt;br /&gt;
*Ahmed Hassan Helmy Mohamed, [http://hdl.handle.net/10062/63946 Software integration of autonomous robot system for mixing and serving drinks] [Jooke valmistava ja serveeriva robotsüsteemi tarkvaralahendus], BS thesis, 2019&lt;br /&gt;
*Kristo Allaje, [http://hdl.handle.net/10062/60294 Draiveripakett käte jälgimise seadme Leap Motion™ kontroller kasutamiseks robootika arendusplatvormil ROS] [Driver package for using Leap Motion™ controller on the robotics development platform ROS], BS thesis, 2018&lt;br /&gt;
*Martin Maidla, [http://hdl.handle.net/10062/60288 Avatud robotiarendusplatvormi Robotont omniliikumise ja odomeetria arendamine] [Omnimotion and odometry development for open robot development platform Robotont], BS thesis, 2018&lt;br /&gt;
*Raid Vellerind, [http://hdl.handle.net/10062/56559 Avatud robotiarendusplatvormi ROS võimekuse loomine Tartu Ülikooli Robotexi robootikaplatvormile] [ROS driver development for the University of Tartu’s Robotex robotics platform], BS thesis, 2017&lt;br /&gt;
&lt;br /&gt;
[[Category:Theses Topics]]&lt;/div&gt;</summary>
		<author><name>Karl</name></author>
	</entry>
</feed>