**Assignee:** open

## [Internship / Bachelorthesis] Gesture-based user interaction

**Selected Topics**: StudyATHome:AI, AT

## [Internship / Bachelorthesis] Hand gesture control for the Smart Living Lab

**Selected Topics**: AT, StudyATHome:AI, Machine Vision

Write to: <deinhofe@technikum-wien.at>

The [OAK-D camera](https://shop.luxonis.com/collections/all/products/1098obcenclosure) is a depth camera with embedded computer vision and neural inference support. This means that both the processing of a camera frame and the inference of a machine learning model run on the camera itself, freeing the host CPU (e.g. a Raspberry Pi) for other tasks.

The OAK-D camera can be programmed through a simple Python API, and a lot of useful examples are available to get started.

### Project Task

Use hand tracking, as shown in the image, to control items of the Smart Living Lab via the openHAB REST API.

![Hand tracking example](uploads/afe36b46451e0045ac660d5c9d75b0ff/image.png)

[OAK-D hand tracking example](https://github.com/geaxgx/depthai_hand_tracker)

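
A recognized gesture then has to be turned into an openHAB command. openHAB accepts item commands via its REST API (an HTTP POST of the command as plain text to `/rest/items/<item>`). The sketch below uses only the Python standard library; the server address, item name and helper names are assumptions for illustration:

```python
# Sketch: sending a command to an openHAB item via its REST API.
# Server address and item name below are assumptions, not real lab items.
import urllib.request


def item_command_request(base_url, item, command):
    """Build the HTTP request that sends a command to an openHAB item.

    openHAB expects a POST to /rest/items/<item> with the command
    ("ON", "OFF", "50", ...) as a plain-text body.
    """
    return urllib.request.Request(
        url=f"{base_url}/rest/items/{item}",
        data=command.encode("utf-8"),
        headers={"Content-Type": "text/plain"},
        method="POST",
    )


def send_item_command(base_url, item, command):
    """Send the command; requires a reachable openHAB server."""
    req = item_command_request(base_url, item, command)
    with urllib.request.urlopen(req) as resp:  # network call
        return resp.status
```

A detected "open hand" gesture could, for example, call `send_item_command("http://localhost:8080", "LivingRoom_Light", "ON")`.
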
### Links

* [OAK-D camera](https://shop.luxonis.com/collections/all/products/1098obcenclosure)
* [DepthAI Docs](https://docs.luxonis.com/en/latest/)
* [OAK-D hand tracking example](https://github.com/geaxgx/depthai_hand_tracker)

## [Internship / Bachelorthesis] Gesture-based user interaction with the MediaPipe library

**Selected Topics**: StudyATHome:AI, Machine Vision, AT

Write to: <deinhofe@technikum-wien.at>

In computer vision, facial landmark algorithms can locate the nose, mouth and eye contours of a face in an image or video. More recently, machine learning has also been used to detect faces and facial landmarks. As soon as facial features (e.g. eyebrow movement, eye blinks, head movement, ...) are found, they can be used for user interaction.

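
A common way to turn such landmarks into an interaction signal is the eye aspect ratio (EAR) for blink detection. The sketch below computes it from six (x, y) points around one eye; the point layout and the sample coordinates are assumptions of this sketch, and with MediaPipe you would select the corresponding face-mesh landmark indices yourself:

```python
import math


def eye_aspect_ratio(eye):
    """Eye aspect ratio (EAR) from six (x, y) points around one eye.

    Point order (an assumption of this sketch):
    p1 = left corner, p2/p3 = upper lid, p4 = right corner, p5/p6 = lower lid.
    EAR = (|p2-p6| + |p3-p5|) / (2 * |p1-p4|); it drops toward 0 as the eye closes.
    """
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])

    p1, p2, p3, p4, p5, p6 = eye
    return (dist(p2, p6) + dist(p3, p5)) / (2.0 * dist(p1, p4))


# Synthetic landmark positions for an open and a nearly closed eye:
open_eye = [(0, 0), (1, 2), (2, 2), (3, 0), (2, -2), (1, -2)]
closed_eye = [(0, 0), (1, 0.2), (2, 0.2), (3, 0), (2, -0.2), (1, -0.2)]
```

A blink "click" could then be defined as the EAR staying below a threshold (e.g. 0.2) for a few consecutive frames.
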
Google provides an easy-to-use library, MediaPipe, that tracks facial landmarks in real time. It can be used to control various applications or devices, and many beginner-friendly YouTube tutorials show how to use it.

![Facial landmark tracking](https://google.github.io/mediapipe/images/mobile/face_mesh_android_gpu.gif)

Tasks of the topic could be:

* A facial-gesture-based music program
* Facial-gesture-based Smart Home control

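
As a minimal sketch of how a tracked landmark could drive such a program, the function below maps a normalized nose position (MediaPipe reports landmarks in [0, 1] image coordinates) to music-player commands. All thresholds and command names are illustrative assumptions:

```python
# Sketch: mapping a normalized head/nose position to player commands.
# Image coordinates: x grows to the right, y grows downward,
# so a small nose_y means the head is tilted up.
# Thresholds and command names are illustrative assumptions.
def head_command(nose_x, nose_y):
    if nose_x < 0.35:
        return "previous-track"
    if nose_x > 0.65:
        return "next-track"
    if nose_y < 0.30:
        return "volume-up"
    if nose_y > 0.70:
        return "volume-down"
    return None  # head centered: no command
```

In a real application this would be combined with smoothing over several frames so that small head movements do not trigger spurious commands.
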
**Technologies / Requirements**

Python/JavaScript + MediaPipe library, OpenCV (alternatively C/C++ or Java with OpenCV/JavaCV)

### Links

* [Mediapipe library](https://google.github.io/mediapipe/)
* [Murtaza's workshop facial landmark video tutorial](https://www.youtube.com/watch?v=V9bzew8A1tc)
* [Tutorial about Computer Vision](https://www.learnopencv.com/facemark-facial-landmark-detection-using-opencv/)
* [Selected Java Examples](https://es.technikum-wien.at/embedded_systems_public/Java-Exercises-9-JavaCV)
* [JavaCV Computer Vision Library](https://github.com/bytedeco/javacv)
* [Tutorial to create a custom classifier for hand gesture control](https://www.learnopencv.com/training-a-custom-object-detector-with-dlib-making-gesture-controlled-applications/)

**Assignee:** Variant: OpenFace+Asterics: B.J.M.G. (BSA5)

## [Internship / Bachelorthesis] PupilLabs DIY-Eyetracker

**Selected Topics**: StudyATHome:AI, AT