Virtual Reality Tracking and Robotic Arms
McGaha, Aaron
Metropolia Ammattikorkeakoulu
2018
Creative Commons Attribution-ShareAlike 3.0 Unported
The permanent address of the publication is
https://urn.fi/URN:NBN:fi:amk-2018111016927
Abstract
The goal of this thesis project was to create a proof of concept for the integration of virtual reality tracking systems and robotic control systems. Commercially available virtual reality systems can be used to build human-machine interfaces that require less training time than traditional interfaces in many fields.
Accurate position and orientation data can be acquired from virtual reality systems at sufficient frequency to keep up with human motion. This data can then be used to control the movement of a robotic arm without the need for traditional controls such as buttons, joysticks, or pedals.
Position data from the HTC Vive's tracked controllers is collected on a PC. A Unity program parses this data, converts it into G-code readable by the uArm Swift Pro robotic arm, and sends it to the arm over a serial connection.
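The following C# sketch illustrates the kind of bridge described above: each frame, a Unity script reads a tracked controller's position and writes a G0 move command to the arm over serial. The class name, serial port name, baud rate, axis mapping, scale, and workspace offset are illustrative assumptions rather than values taken from this project, and sending a command every frame ignores the rate limiting the arm's firmware would require in practice. It also assumes a Unity API compatibility level that exposes System.IO.Ports.

using System.Globalization;
using System.IO.Ports;
using UnityEngine;

public class ViveToUArmBridge : MonoBehaviour
{
    public Transform controller;   // tracked Vive controller object in the scene
    private SerialPort port;

    void Start()
    {
        // The uArm Swift Pro appears as a USB serial device; the port name here is an assumption.
        port = new SerialPort("COM3", 115200);
        port.Open();
    }

    void Update()
    {
        // Unity positions are in metres with Y up; the arm expects millimetres with Z up.
        // The axis mapping, scale, and offset below are placeholders, not calibrated values.
        Vector3 p = controller.position * 1000f;
        string cmd = string.Format(CultureInfo.InvariantCulture,
            "G0 X{0:F1} Y{1:F1} Z{2:F1} F5000\n", p.z + 150f, -p.x, p.y);
        port.Write(cmd);
    }

    void OnDestroy()
    {
        // Release the serial port when the script is destroyed.
        if (port != null && port.IsOpen) port.Close();
    }
}

In the actual system, a mapping and calibration step between the Vive's tracking space and the arm's workspace would replace the fixed scale and offset used here.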
The result is a system in which a user holds a tracked controller and the endpoint of the robotic arm follows the user's movement. Further work could improve the firmware used by the robotic arm, since the movement speed is currently limited by how the firmware executes movement commands.