Interaction with images using hand gestures
Basnet, Suman (2016)
Metropolia Ammattikorkeakoulu
2016
All rights reserved
The permanent address of the publication is
https://urn.fi/URN:NBN:fi:amk-2016060311781
Abstract
The main objective of this Final Year Project (FYP) is to build a prototype of an embedded system in which a user can control the flow of images in a Graphical User Interface (GUI) using hand gestures.
This report starts with the working mechanism of the sensors, where their gesture recognition pattern is discussed. Then, the hardware and software requirements are listed, with descriptions of their features, under the system specifications heading. Finally, the two phases of the system are elaborated step by step under the methodology heading.
The main constituents of this FYP are an Arduino Nano, ultrasonic sensors and the Python programming language. The system activates when the ultrasonic sensors connected to the Arduino Nano follow the hand gestures of the user. The Arduino Nano then forwards information about the hand movements to the serial port of the computer, and Python executes the commands indicated by the hand movement, changing the images in the GUI accordingly.
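The PC-side logic described above can be sketched in Python. This is a minimal illustration, not the thesis's actual implementation: it assumes the Arduino Nano writes one short text token per recognized gesture (the token names "NEXT" and "PREV", the serial port name and the baud rate are all assumptions), and it maps each token to a new image index that a GUI would then display.

```python
def apply_gesture(command: str, index: int, total: int) -> int:
    """Map a gesture token received from the Arduino to a new image index.

    Unknown tokens leave the index unchanged; the index wraps around at
    both ends so the image carousel is circular.
    """
    if command == "NEXT":
        return (index + 1) % total
    if command == "PREV":
        return (index - 1) % total
    return index


def run(port_name: str = "/dev/ttyUSB0", baud: int = 9600, total: int = 10) -> None:
    """Read gesture tokens from the serial port and step through the images.

    Requires the third-party pyserial package (``pip install pyserial``);
    the port name and baud rate here are placeholder assumptions.
    """
    import serial  # pyserial

    index = 0
    with serial.Serial(port_name, baud, timeout=1) as port:
        while True:
            token = port.readline().decode("ascii", errors="ignore").strip()
            if token:
                index = apply_gesture(token, index, total)
                # A real GUI would redraw here; printing stands in for that.
                print(f"Showing image {index}")
```

The gesture-to-command mapping is kept in a pure function so it can be tested without any hardware attached; only `run` touches the serial port.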
Finally, the challenges, limitations and possible upgrades to the system design are discussed in the conclusion of this report.