As a continuation of my senior project, I moved from muscular activity to brain activity (EEG) to see whether a real-time gaming framework can be driven by brain signals. A dedicated experimental paradigm was created to probe the limits of EEG-based visual perception: the subject had to count the occurrences of a specific stimulus presented at different frequencies. The study is ongoing, and a paper is planned for the CIFMA2019 conference.
My graduation project, in which I explore possible applications of non-invasive electromyography (EMG) sensors in different gaming scenarios. Linear Discriminant Analysis and Logistic Regression are used for real-time recognition of individual gestures. The project is written purely in Python using scikit-learn, pygame, and pyautogui.
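A minimal sketch of the classification step, assuming windowed EMG features have already been extracted; the channel count, gesture names, and feature values below are illustrative, not from the project:

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)

# Synthetic stand-in for EMG feature windows: two gestures,
# each with a distinct mean amplitude across 4 channels.
rest = rng.normal(0.1, 0.02, size=(100, 4))  # gesture 0: "rest"
fist = rng.normal(0.6, 0.05, size=(100, 4))  # gesture 1: "fist"
X = np.vstack([rest, fist])
y = np.array([0] * 100 + [1] * 100)

clf = LinearDiscriminantAnalysis().fit(X, y)

# In a real-time loop, each incoming feature window would be classified
# and the predicted gesture mapped to a key press (e.g. via pyautogui).
window = rng.normal(0.6, 0.05, size=(1, 4))
predicted = clf.predict(window)[0]
print(predicted)
```

LDA is a natural fit here because it trains quickly on small calibration sets and its prediction step is cheap enough for a per-window real-time loop.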
An online marketplace platform where Nazarbayev University students can post their products or services. NUKupi was developed by me and my roommate as part of the Software Engineering course. The backend was first built on the Java Servlet API, but I later migrated it to the more convenient JAX-RS.
I have been using the LICEcap program for desktop GIF recording for a while now. Unfortunately, in LICEcap you have to adjust the recording frame for every window separately, and sometimes that does not work well. I decided to make use of Apple's Quartz Window Services API and improve on the concept: you select the specific window you want to record and it is turned into a GIF. The image scale, compression quality, and FPS are also adjustable.
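A sketch of the GIF-assembly step using Pillow, assuming frames of the chosen window have already been captured (e.g. via the Quartz Window Services API on macOS); the function name and the synthetic frames are illustrative:

```python
from PIL import Image

def frames_to_gif(frames, path, fps=15, scale=1.0):
    """Resize captured frames and write them as an animated GIF."""
    if scale != 1.0:
        frames = [f.resize((int(f.width * scale), int(f.height * scale)))
                  for f in frames]
    duration_ms = int(1000 / fps)  # per-frame delay in milliseconds
    frames[0].save(path, save_all=True, append_images=frames[1:],
                   duration=duration_ms, loop=0)

# Synthetic frames standing in for window captures.
frames = [Image.new("RGB", (64, 64), (i * 40, 0, 0)) for i in range(5)]
frames_to_gif(frames, "demo.gif", fps=10, scale=0.5)
```

Scale and FPS map directly onto the two knobs mentioned above; compression quality would be handled similarly through the save parameters.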
Currently working on a pet-tracking iOS application at Tractive in Linz, Austria.