The digital enabler of self-determination.
Who HEADCUBE® is made for
The control system
A control system developed specifically for head movements
Users can control the tablet with their head alone, thanks to AI-based analysis of head movements captured by the mobile device's front camera.
HEADCUBE® recognizes how the head is positioned in relation to the screen and reacts to changes in this position by triggering a control action: moving the head changes its position, and this change initiates an input.
Touch control remains available as a supplementary input option.
How it works
Intuitive navigation through simple input options
The input menu always consists of four fields. The desired fields are selected by head movements, and the controls correspond directly to those movements:
- turning the head to the side steers right or left, respectively
- moving the head vertically controls up or down
- optionally, eye closure can be added as an input
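As an illustration of this mapping, a minimal sketch could translate an estimated head-pose angle into one of the four directions. The function name, angle conventions, and threshold are illustrative assumptions, not HEADCUBE®'s actual implementation:

```python
from typing import Optional

def direction_from_pose(yaw_deg: float, pitch_deg: float,
                        threshold_deg: float = 15.0) -> Optional[str]:
    """Map a head pose to 'left', 'right', 'up', 'down', or None.

    yaw_deg: rotation to the side (positive = right)
    pitch_deg: vertical tilt (positive = up)
    A neutral zone around zero prevents accidental inputs.
    """
    if yaw_deg <= -threshold_deg:
        return "left"
    if yaw_deg >= threshold_deg:
        return "right"
    if pitch_deg >= threshold_deg:
        return "up"
    if pitch_deg <= -threshold_deg:
        return "down"
    return None  # head is near neutral: no input triggered

print(direction_from_pose(20.0, 0.0))   # turning the head to the right
print(direction_from_pose(0.0, -18.0))  # tilting the head down
```

The neutral zone (here 15 degrees) is one way to express the idea that small, involuntary head movements should not trigger an input.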
HEADCUBE®'s scanning control allows users to operate the application with only one direction of movement.
In the scanning process, the four input fields are highlighted one after the other at an individually set pace, and users select the desired field with their chosen movement.
What HEADCUBE® can do
Typical everyday situations, where HEADCUBE® can help
That's why HEADCUBE® is perfect for you
What users say about HEADCUBE®
With HEADCUBE® we can save a lot of time in the care team and enable those affected to lead a much more self-determined daily life.
Now I don't have to call my kids or my assistant to tell them, "Hey, I need this, I need that," but I can do it independently....
I use HEADCUBE® all the time... I use it for everything.
My son started using HEADCUBE® this week. We are so excited that he can finally watch videos and play games. This is very important for him to increase his independence.
It is magical and a miracle. HEADCUBE® is an essential part of my journey to communication! I am so grateful for everything HEADCUBE is doing in the field of accessibility.
Try HEADCUBE® free for 30 days! Sign up now for the test version.
A business economist and business mathematician, responsible for management.
A software developer, responsible for the technical development of HEADCUBE®.
What HEADCUBE® stands for
With HEADCUBE® we create a digital assistant for all those who have been unable to use a mobile device due to barriers. Our goal is to open up possibilities for users with a simple and intuitive control system. HEADCUBE® stands for digital participation and the playful use of a variety of functions.
Questions from users and caregivers.
Unfortunately, HEADCUBE® is not yet available in the App Store or Google Play Store; the app is currently in its beta test phase. If you are already interested in the app, you can get early access to the beta version. Simply contact us at email@example.com
Once you have installed the app on your device, you can create your own profile and start using HEADCUBE®.
No, that is not yet possible.
The use of the app is free of charge.
With a user profile, you can use HEADCUBE® on several devices simultaneously. Several users can also share the app on one device, not at the same time, but by logging out and logging back in with the corresponding user profile.
No, only a front camera is required. All common devices have the necessary technical requirements to use HEADCUBE®.
Eye tracking in the actual sense, i.e., control by "looking" at desired fields, is not possible for the time being. However, control by closing the eyes is available with the help of the scanning process.
With the scanning control, you first select which direction of movement is most comfortable for the user (up, down, right, left, turn right, turn left, or close eyes). Then the speed of the scanner is set. The scanner is a colored border that highlights the four fields one after the other in a clockwise direction. To select a field, the user performs their predefined movement while that field is highlighted by the scanner. The procedure is comparable to a slide show where a user clicks "stop" to view a picture.
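The scanning procedure described above can be sketched as a simple loop. This is a minimal illustration under assumed names and timings, not HEADCUBE®'s actual code: the scanner highlights each field in turn, and a selection fires when the user's predefined movement is detected while a field is highlighted.

```python
import time

# Clockwise order of the four input fields (illustrative labels).
FIELDS = ["top", "right", "bottom", "left"]

def run_scanner(movement_detected, dwell_seconds=1.5, max_cycles=3):
    """Cycle through the fields at a fixed pace.

    movement_detected: callable returning True when the user performs
    their predefined movement (e.g. eye closure).
    Returns the field highlighted at the moment of the movement,
    or None if no selection was made within max_cycles passes.
    """
    for _ in range(max_cycles):
        for field in FIELDS:
            # This field now carries the colored border.
            deadline = time.monotonic() + dwell_seconds
            while time.monotonic() < deadline:
                if movement_detected():
                    return field  # "stop" on the currently marked field
                time.sleep(0.01)
    return None
```

The `dwell_seconds` parameter corresponds to the individually set scanner speed: a slower pace gives the user more time to react to each field.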
The user interface, i.e., which functions are displayed, cannot currently be customized individually. The control itself, however, can be adapted to each person. This is done by taking into account which movements users can perform and how well each movement is possible. We have developed several control systems with different functionalities, depending on the number of movements available to the user. Furthermore, the sensitivity of each movement can be adjusted, i.e., how strongly and how long users have to perform a movement to trigger a control.
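The per-movement sensitivity described above, a movement must be strong enough for long enough before it triggers a control, could be modeled roughly as follows. The class, parameter names, and values are illustrative assumptions, not HEADCUBE®'s API:

```python
from dataclasses import dataclass

@dataclass
class MovementSensitivity:
    threshold: float     # minimum movement strength, e.g. degrees of rotation
    hold_seconds: float  # how long the movement must be sustained

def triggers(samples, sensitivity):
    """samples: list of (timestamp, strength) pairs in time order.
    Return True once the strength stays at or above the threshold
    for at least hold_seconds without interruption."""
    start = None
    for t, strength in samples:
        if strength >= sensitivity.threshold:
            if start is None:
                start = t  # sustained movement begins here
            if t - start >= sensitivity.hold_seconds:
                return True
        else:
            start = None  # movement dropped below threshold: reset
    return False

s = MovementSensitivity(threshold=10.0, hold_seconds=0.5)
print(triggers([(0.0, 12), (0.3, 14), (0.6, 13)], s))  # True: held long enough
print(triggers([(0.0, 12), (0.2, 4), (0.4, 12)], s))   # False: interrupted
```

Raising `threshold` or `hold_seconds` makes the control less sensitive, which helps users whose involuntary movements would otherwise trigger unwanted inputs.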