Our research since 2014
State-of-the-art Deep Learning techniques were designed with rich cloud resources and huge datasets in mind.
They are well suited to Narrow AI applications with established boundaries, such as speech processing or fake video generation.
But they are far less suited to autonomous machines operating in a constantly changing physical world.
It does not make sense to pay for cloud processing of the data from every camera and sensor on each device.
Just as we do not memorize everything we see or hear, machines should be able to decide what is valuable for their tasks.
Instead, machines can process data locally, or mark valuable segments of video or audio for later processing on an intranet or cloud server, as in the sketch below.
Battery-powered machines benefit greatly from low-latency local processing, as long as their neural networks store only the data that is useful to them, not the data relevant to every other unit.
Even 5G and newer networks introduce significant additional latency: the farther data travels from the processing unit, the more latency accumulates, so faster networks alone do not solve the problem.
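To make the local-filtering idea concrete, here is a minimal sketch of a device that runs a small on-device scorer and uploads only the frames it marks as valuable. The relevance model, function names, and threshold are illustrative assumptions, not a description of our actual pipeline.

```python
# Minimal sketch of on-device data filtering; the scorer and threshold
# below are illustrative placeholders, not our production models.
from dataclasses import dataclass
from typing import Iterable, List

@dataclass
class Frame:
    timestamp: float
    pixels: bytes  # raw sensor payload

def relevance_score(frame: Frame) -> float:
    """Placeholder for a lightweight on-device model (e.g. a motion or
    object-presence detector) that rates how useful a frame is."""
    return 0.0  # assumption: a real model would run here

def filter_for_upload(frames: Iterable[Frame],
                      threshold: float = 0.8) -> List[Frame]:
    """Keep only frames the local model marks as valuable, so the device
    uploads a small fraction of its sensor stream instead of everything."""
    return [f for f in frames if relevance_score(f) >= threshold]
```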
Neuroscience
Biological brain structures can teach us how to achieve autonomy, energy efficiency, and fast zero- to few-shot learning.
We hire people who are able to reverse-engineer the most important brain features.
After more than seven years of extensive research, we identified the missing link between biological and technical General Intelligence.
Current R&D projects
Object recognition with continuous Machine Learning
Object recognition is one of the most important tasks in computer vision. In this project we teach machines to learn new attributes of already known objects, as well as completely new object classes, even when no label is available.
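As a rough illustration of learning new object classes without labels, the sketch below keeps a prototype (mean embedding) per known class and opens a new, unlabeled class whenever an embedding is far from every existing prototype. The embedding space, distance metric, and novelty threshold are assumptions for illustration only, not a description of our actual models.

```python
import numpy as np

class PrototypeRecognizer:
    """Nearest-prototype recognizer that can add unseen classes on the fly."""

    def __init__(self, novelty_threshold: float = 1.0):
        self.prototypes = {}   # class id -> mean embedding
        self.counts = {}       # class id -> number of samples seen
        self.novelty_threshold = novelty_threshold

    def recognize(self, embedding: np.ndarray) -> str:
        """Return the closest known class, or register a new unlabeled class
        when the embedding is far from every existing prototype."""
        if self.prototypes:
            distances = {c: np.linalg.norm(embedding - p)
                         for c, p in self.prototypes.items()}
            best = min(distances, key=distances.get)
            if distances[best] < self.novelty_threshold:
                self._update(best, embedding)
                return best
        # Unknown object: create a new class without any label.
        new_class = f"class_{len(self.prototypes)}"
        self._update(new_class, embedding)
        return new_class

    def _update(self, cls: str, embedding: np.ndarray) -> None:
        """Incrementally move the class prototype toward the new sample."""
        n = self.counts.get(cls, 0)
        prev = self.prototypes.get(cls, np.zeros_like(embedding))
        self.prototypes[cls] = (prev * n + embedding) / (n + 1)
        self.counts[cls] = n + 1
```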
Autonomous movement & decision making
Non-stationary machines require a way to decide how to move and what to do in a specific context. We research decision models for perception, reasoning, and action, combined with a state generalization module that provides an intuitive understanding of the machine's internal response to the external environment context.
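A minimal sketch of such a perception-reason-act loop follows. The state generalization step is reduced to a hypothetical discretization function and the policy to a lookup table, purely to illustrate how the pieces fit together; none of this reflects our actual decision models.

```python
# Minimal perception-reason-act loop; generalize() and the policy table
# are illustrative placeholders, not our actual decision models.
from typing import Dict

def perceive(sensors: Dict[str, float]) -> Dict[str, float]:
    """Read raw sensor values (camera, lidar, battery, ...)."""
    return sensors

def generalize(observation: Dict[str, float]) -> str:
    """Hypothetical state generalization: map raw readings to a coarse
    internal state so similar situations trigger similar responses."""
    return "obstacle_near" if observation.get("distance", 1.0) < 0.5 else "clear"

def decide(state: str, policy: Dict[str, str]) -> str:
    """Pick an action for the generalized state from a simple policy table."""
    return policy.get(state, "stop")

# Example context-dependent policy (assumed for illustration).
policy = {"clear": "move_forward", "obstacle_near": "turn_left"}
action = decide(generalize(perceive({"distance": 0.3})), policy)
print(action)  # -> "turn_left"
```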