Theses in Robotics

From Intelligent Materials and Systems Lab

== Projects in Advanced Robotics ==
The main objective of the following projects is to give students experience in working with advanced robotics technology. Our group is active in several R&D projects involving human-robot collaboration, intuitive teleoperation of robots, and autonomous navigation of unmanned mobile platforms. Our main software platforms are Robot Operating System (ROS) for developing software for advanced robot systems and Gazebo for running realistic robotic simulations.

For further information, contact Karl Kruusamäe and Arun Kumar Singh.
The following is not an exhaustive list of all available thesis/research topics.


== Highlighted theses topics for 2022/2023 study year ==
# [[#Continuous teleoperation setup for controlling mobile robot on streets|Continuous teleoperation setup for controlling mobile robot on streets]]
# [[#ROBOTONT: COTS battery support for robotont|ROBOTONT: COTS battery support for robotont]]
# [[#ROS2 learning materials for MoveIt|ROS2 learning materials for MoveIt]]
# [[#Navigator PYRX: Python-based motion planning for ROS2 Navigation software stack|Navigator PYRX: Python-based motion planning for ROS2 Navigation software stack]]
# [[#Stratos Explore Ultraleap demonstrator for robotics|Stratos Explore Ultraleap demonstrator for robotics]]
# [[#ROBOTONT: ROS2 support for robotont|ROBOTONT: ROS2 support for robotont]]
# [[#Mixed-reality scene creation for vehicle teleoperation|Mixed-reality scene creation for vehicle teleoperation]]
# [[#Vision-based pick-and-place demo with xArm7|Vision-based pick-and-place demo with xArm7]]
# [[#Replication of the MIT Hydra demo|Replication of the MIT Hydra demo]]


== List of potential thesis topics ==
Our inventory includes but is not limited to:


=== ROBOTONT: ROS support, demos, and educational materials for open-source mobile robot ===
[[Image:RosLarge.png|left|100px|ROS]]
The project involves many potential thesis topics on the open-source robot platform ROBOTONT. The nature of the thesis can be software development to improve the platform's capabilities, simulation of specific scenarios, and/or demonstration of ROBOTONT in a real-life setting. A more detailed thesis topic will be outlined during an in-person meeting.<br><br>
[[Image:Ros_equation.png|x100px|What is ROS?]]
<hr>
=== Virtual reality user interface (VRUI) for intuitive teleoperation system ===
[[Image:LeapMotion.png|200px|thumb|right|Detecting 2 hands with Leap Motion Controller]]
Enhancing the user experience of a virtual reality UI developed by Georg Astok. Potentially adding virtual reality capability to a gesture- and natural-language-based robot teleoperation system.
<hr>
=== Health monitor for intuitive telerobot ===
Intelligent status and error handling for an intuitive telerobotic system.
<hr>
=== 3D scanning of industrial objects ===
Using laser sensors and cameras to create accurate models of industrial products for quality control or further processing.
<hr>
=== Modeling humans for human-robot interaction ===
True human-robot collaboration means that the robot must understand the actions, intention, and state of its human partner. This work involves using cameras and other human sensors for digitally representing and modelling humans. There are multiple stages of modeling: a) physical models of human kinematics and dynamics; b) higher-level models for recognizing human intent.
<hr>
=== Robotic avatar for telepresence ===
Integrating hand gestures and head movements to control a robot avatar in a virtual reality user interface.
<hr>
=== Detection of hardware and software resources for smart integration of robots ===
The vast majority of today's robotic applications rely on hard-coded device and algorithm usage. This project focuses on developing Resource Snooper software that can detect the addition or removal of resources for the benefit of dynamic reconfiguration of robotic systems. This project is developed as a subsystem of TeMoto.
<hr>
=== Sonification of feedback during teleoperation of robots ===
Humans are used to receiving auditory feedback in their everyday lives. It helps us make decisions and be aware of potential dangers. Telerobotic interfaces can deploy the same idea to improve situational awareness and robotic task efficiency. The thesis project involves a study of different sonification solutions and the implementation of one in a telerobotic application using ROS.
<hr>
=== Human-Robot and Robot-Robot collaboration applications ===
Creating a demo or analysis of software capabilities related to human-robot or robot-robot teams:
* human-robot collaborative assembly
* distributed mapping; analysis and demo of existing ROS packages for multi-robot mapping (e.g., segmap: https://youtu.be/JJhEkIA1xSE)
* inaccessible-region teamwork
** youbot+drone: a drone maps the environment (for example a maze) and a ground vehicle uses this information to traverse the maze
** youbot+clearbot: the youbot cannot go up ledges, but it can lift a smaller robot, such as clearbot, up a ledge
<hr>
=== Developing ROS driver for a robotic gripper ===
The goal of this project is to develop ROS drivers for the SMC LEHF32K2-64 gripper. The work is concluded by demonstrating the functionalities of the gripper via a pick-and-place task.
<hr>
=== Mirroring human hand movements on industrial robots ===
The goal of this project is to integrate continuous control of industrial robot manipulator with a gestural telerobotics interface. The recommended tools for this thesis project are Leap Motion Controller or a standard web camera, Ultraleap, Universal Robot UR5 manipulator, and ROS.
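A first step in such an interface is mapping a tracked hand position into the robot's reachable workspace. The sketch below is a minimal illustration of that idea; the scale factor and workspace limits are made-up values, not parameters of the UR5 setup.

```python
# Hypothetical sketch: map a tracked hand position (metres, camera frame)
# onto a manipulator workspace by scaling and clamping each axis.
# Scale and limits are illustrative tuning values only.

def hand_to_robot(hand_xyz, scale=0.5,
                  limits=((-0.4, 0.4), (-0.4, 0.4), (0.1, 0.6))):
    """Scale a hand position into the robot workspace and clamp to limits."""
    target = []
    for value, (lo, hi) in zip(hand_xyz, limits):
        scaled = value * scale
        target.append(min(max(scaled, lo), hi))  # clamp to the safe workspace
    return tuple(target)

# A hand 0.2 m to the right and 1.0 m up maps into the clamped workspace:
print(hand_to_robot((0.2, 1.0, 0.3)))  # (0.1, 0.4, 0.15)
```

Clamping keeps the mirrored motion inside a safe envelope even when the tracker reports a hand pose outside the robot's reach.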
<hr>
=== ROBOTONT: TeMoto for robotont ===
Swarm-management for robotont using [https://temoto-telerobotics.github.io TeMoto] framework.
<hr>
=== Enhancing teleoperation control interface with augmented cues to provoke caution ===
The task is to create a telerobot control interface where the video feed from the remote site and/or a mixed-reality scene is augmented with visual cues to provoke caution in the human operator.
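One simple form such a cue policy could take is thresholding the distance to the nearest obstacle reported from the remote site. The sketch below is illustrative only; the thresholds and cue names are made-up tuning parameters, not part of the thesis specification.

```python
# Illustrative sketch: choose a visual-cue level from the distance to the
# nearest obstacle reported by the remote robot. Thresholds are made up.

def caution_cue(distance_m):
    """Return a cue level for overlaying on the operator's video feed."""
    if distance_m < 0.5:
        return "red"     # imminent collision: e.g., flashing border
    if distance_m < 1.5:
        return "yellow"  # caution: e.g., tinted overlay near the obstacle
    return "none"        # clear: no augmentation

print(caution_cue(0.3), caution_cue(1.0), caution_cue(3.0))  # red yellow none
```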
<hr>
=== Robot-to-human interaction ===
As robots and autonomous machines start sharing the same space as humans, their actions need to be understood by the people occupying that space. For instance, a human worker needs to understand what the robot partner is planning next, or a pedestrian needs to clearly comprehend the behaviour of a driverless vehicle. To reduce this ambiguity, the robot needs mechanisms to convey its intent (whatever it is going to do next). The aim of the thesis is to outline existing methods for machines to convey their intent and to develop a unified model interface for expressing that intent.
<hr>
=== Gaze-based handover prediction ===
When a human needs to pass an object to a robot manipulator, the robot must understand where in 3D space the object handover occurs and then plan an appropriate motion. Human gaze can be used as the input for predicting which object to track. The thesis activities involve camera-based eye tracking and safe motion planning.
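A common baseline for gaze-based target selection is to pick the candidate object closest to the gaze ray. The sketch below illustrates that geometry under assumed inputs (a ray origin and direction from an eye tracker, and named object positions); it is not code from any existing system.

```python
# Hypothetical sketch: pick the candidate object whose 3D position lies
# closest to the gaze ray (origin + direction, e.g., from an eye tracker).
import math

def closest_to_gaze(origin, direction, objects):
    """Return (name, distance) of the object nearest the gaze ray."""
    norm = math.sqrt(sum(d * d for d in direction))
    d = [c / norm for c in direction]           # unit gaze direction
    best = None
    for name, p in objects.items():
        v = [p[i] - origin[i] for i in range(3)]
        t = sum(v[i] * d[i] for i in range(3))  # projection onto the ray
        foot = [origin[i] + t * d[i] for i in range(3)]
        dist = math.dist(p, foot)               # perpendicular distance
        if best is None or dist < best[1]:
            best = (name, dist)
    return best

objects = {"cup": (1.0, 0.1, 0.0), "box": (1.0, 0.6, 0.0)}
print(closest_to_gaze((0, 0, 0), (1, 0, 0), objects))  # ('cup', 0.1)
```

In practice the selected target would then seed the safe motion planner with the predicted handover location.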
<hr>
=== Real-world demonstrator for MIR+UR+TeMoto integration ===
Integration of a mobile manipulator (MIR100 + UR5e + Robotiq gripper) to demonstrate TeMoto in a collaborative application.
<hr>
=== ROBOTONT: Human-height human-robot interface for Robotont ground robot ===
Robotont is an ankle-high flat mobile robot. For humans to interact with Robotont, there is a need for a compact and lightweight mechanical structure that is tall enough for comfortable human-robot interaction. The objective of the thesis is to develop the mechanical designs and build prototypes that ensure stable operation and meet the aesthetic requirements for use in public places.
<hr>
=== Stratos Explore Ultraleap demonstrator for robotics ===
The aim of this thesis is to systematically analyze the strengths and limitations of the Stratos Explore Ultraleap device in the context of controlling a robot, and subsequently to implement a demonstrator application that shows its applicability for robot control.
<hr>
=== Continuous teleoperation setup for controlling mobile robot on streets ===
The task in this thesis is to analyse the available options for building a teleoperation cockpit for continuously controlling a mobile robot moving on the streets. The contribution of the thesis is to set up the system, validate its usability, and benchmark its capabilities/limitations on the [https://adl.cs.ut.ee/lab/vehicle ADL vehicle].
<hr>
=== ROBOTONT: ROS2 support for robotont ===
Creating ROS2 support for the robotont mobile platform.
<hr>
=== ROBOTONT: COTS battery support for robotont ===
Redesigning the body and power supply electronics of Robotont to support commercially available battery solutions.
<hr>
=== Mixed-reality scene creation for vehicle teleoperation ===
Fusing different sensory feeds to create a high-usability teleoperation scene.
<hr>
=== Validation study for AR-based robot user-interfaces ===
Designing and carrying out a user study to validate the functionality and usability of a human-robot interface.
<hr>
=== ROS2 learning materials for MoveIt ===
Porting the learning content and code examples of how to control manipulator robots from ROS1 to ROS2.
<hr>
=== Navigator PYRX: Python-based motion planning for ROS2 Navigation software stack ===
ROS Navigation provides off-the-shelf tools for making mobile robots autonomous (including the use of established motion planning algorithms). However, the software architecture requires motion planning algorithms to be implemented as C++ plugins. In reality, many motion planning algorithms are first implemented using Python libraries; thus, there is great demand for plugging Python-based motion planning algorithms into ROS Navigation. The thesis seeks to design and develop a software solution for that particular goal.
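As a toy illustration of what such a Python-side planner would compute, the sketch below emits evenly spaced waypoints between a start and goal pose, the kind of path a minimal "compute path" callback returns. It is a standalone illustration, not part of the Navigation stack's actual plugin API.

```python
# Illustrative sketch: a straight-line planner that interpolates evenly
# spaced (x, y) waypoints from start to goal. A real planner plugged into
# ROS Navigation would also consult the costmap for obstacles.
import math

def plan_straight_line(start, goal, resolution=0.25):
    """Interpolate (x, y) waypoints from start to goal at ~resolution spacing."""
    dx, dy = goal[0] - start[0], goal[1] - start[1]
    steps = max(1, math.ceil(math.hypot(dx, dy) / resolution))
    return [(start[0] + dx * i / steps, start[1] + dy * i / steps)
            for i in range(steps + 1)]

path = plan_straight_line((0.0, 0.0), (1.0, 0.0))
print(len(path), path[0], path[-1])  # 5 (0.0, 0.0) (1.0, 0.0)
```

The thesis would then wrap a function of this shape so the C++ plugin architecture can invoke it.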
<hr>
=== Vision-based pick-and-place demo with xArm7 ===
ROS and MoveIt are used to demonstrate how an unorganized pile of objects is picked by an xArm7 robot and ordered into a box or tray. The system integrates a depth camera to detect the 3D environment of the robot and track objects.
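The core geometric step in such a system is turning a detected depth-image pixel into a 3D grasp target. The sketch below shows standard pinhole-camera deprojection; the intrinsic values (fx, fy, cx, cy) are illustrative, not those of any specific camera.

```python
# Hedged sketch of the geometry involved: deproject a depth-image pixel
# into a 3D point in the camera frame using a pinhole model.
# The intrinsics below are illustrative placeholders.

def deproject(u, v, depth_m, fx=600.0, fy=600.0, cx=320.0, cy=240.0):
    """Convert pixel (u, v) with depth to (x, y, z) in the camera frame."""
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    return (x, y, depth_m)

# An object detected 60 px right of the image centre, 1.2 m away:
print(deproject(380, 240, 1.2))  # (0.12, 0.0, 1.2)
```

The resulting point, transformed into the robot's base frame, would be handed to MoveIt as the pick pose.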
<hr>
=== Replication of the MIT Hydra demo ===
The goal of the thesis is to use the Hydra software package and integrate it on a robot used at IMS Robotics (e.g., TIAGo, Jackal, Robotont).
<br>Hydra takes sensor data as input (stereo or RGB-D camera images and IMU data) and produces a hierarchical model of the environment by estimating the trajectory of the robot (including loop closures), building a metric-semantic 3D mesh, and segmenting objects, places, and rooms in an indoor environment. These representations are combined into a 3D scene graph, which enables novel approaches for hierarchical loop closure detection and ensures the representations remain consistent after loop closure. Hydra is implemented in C++ and is ROS-compatible. It uses a neural-network-based image segmentation front-end, but otherwise relies on an efficient, multi-threaded CPU-based implementation that is suitable for mobile robot deployment.
<br>LINKS:
<br>Video: https://youtu.be/qZg2lSeTuvM
<br>Code: https://github.com/MIT-SPARK/Hydra
<br>Paper: http://www.roboticsproceedings.org/rss18/p050.pdf
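To make the layered scene-graph idea concrete, here is a deliberately tiny illustration of the rooms-places-objects hierarchy and a top-down query over it. This is not Hydra's data structure or API, just a sketch of the concept.

```python
# Minimal illustration (not Hydra's API) of a layered 3D scene graph:
# rooms contain places, places contain objects, and queries walk top-down.

scene_graph = {
    "kitchen": {"place_1": ["cup", "plate"], "place_2": []},
    "hallway": {"place_3": ["robot_dock"]},
}

def find_room(scene, obj):
    """Return the room whose places contain the queried object."""
    for room, places in scene.items():
        if any(obj in contents for contents in places.values()):
            return room
    return None

print(find_room(scene_graph, "cup"))  # kitchen
```

Hydra additionally anchors every layer to metric geometry (the mesh and trajectory), which is what lets it keep the hierarchy consistent after loop closures.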
<hr>


= Completed projects =
== Master's theses ==
*Igor Rybalskii, [http://hdl.handle.net/10062/83028 Augmented reality (AR) for enabling human-robot collaboration with ROS robots] [Liitreaalsus inimese ja roboti koostöö võimaldamiseks ROS-i robotitega], MS thesis, 2022
*Md. Maniruzzaman, [http://hdl.handle.net/10062/83025 Object search and retrieval in indoor environment using a Mobile Manipulator] [Objektide otsimine ja teisaldamine siseruumides mobiilse manipulaatori abil], MS thesis, 2022
*Allan Kustavus, [http://hdl.handle.net/10062/72651 Design and Implementation of a Generalized Resource Management Architecture in the TeMoto Software Framework] [Üldise ressursihalduri disain ja teostus TeMoto tarkvara raamistikule], MS thesis, 2021
*Kristina Meister, [http://hdl.handle.net/10062/72350 External human-vehicle interaction - a study in the context of an autonomous ride-hailing service], MS thesis, 2021


== Bachelor's theses ==
*Hui Shi, [http://hdl.handle.net/10062/83043 Expanding the Open-source ROS Software Package opencv_apps with Dedicated Blob Detection Functionality] [Avatud lähtekoodiga ROS-i tarkvarakimbu opencv_apps laiendamine laigutuvasti funktsionaalsusega], BS thesis, 2022
*Dāvis Krūmiņš, [http://hdl.handle.net/10062/83040 Web-based learning and software development environment for remote access of ROS robots] [Veebipõhine õppe- ja tarkvaraarenduse keskkond ROS robotite juurdepääsuks kaugteel], BS thesis, 2022
*Anna Jakovleva, [http://hdl.handle.net/10062/83037 Roboquiz - an interactive human-robot game] [Roboquiz - interaktiivne inimese ja roboti mäng], BS thesis, 2022
*Kristjan Laht, [https://comserv.cs.ut.ee/ati_thesis/datasheet.php?id=74702&year=2022 Robot Localization with Fiducial Markers] [Roboti lokaliseerimine koordinaatmärkidega], BS thesis, 2022
*Hans Pärtel Pani, [http://hdl.handle.net/10062/83008 ROS draiver pehmerobootika haaratsile] [ROS Driver for Soft Robotic Gripper], BS thesis, 2022
*Markus Erik Sügis, [http://hdl.handle.net/10062/83015 Jagatud juhtimise põhimõttel realiseeritud robotite kaugjuhtimissüsteem] [A continuous teleoperating system based on shared control concept], BS thesis, 2022
*Taaniel Küla, [http://hdl.handle.net/10062/83007 ROS2 platvormile keevitusroboti tarkvara portotüübi loomine kasutades UR5 robot-manipulaatorit] [Welding robot software prototype for ROS2 using UR5 robot arm], BS thesis, 2022
*Rauno Põlluäär, [http://hdl.handle.net/10062/72672 Veebirakendus-põhine kasutajaliides avatud robotplatvormi Robotont juhtimiseks ja haldamiseks] [Web application-based user interface for controlling and managing open-source robotics platform Robotont], BS thesis, 2021
*Hendrik Olesk, [http://hdl.handle.net/10062/72664 Nägemisulatuses kaugjuhitava mobiilse robotmanipulaatori kasutajamugavuse tõstmine] [Improving the usability of a mobile manipulator robot for line-of-sight remote control], BS thesis, 2021

Revision as of 14:22, 2 September 2022
