The standard answer to this question is: no, we live in a 3D world, and to perceive its third dimension, depth, we need at least two receivers. Mother Nature seems to confirm this: she chose not to follow the example of Greek mythology and skipped the creation of Cyclopes.


One-eyed robots can be found only in Hollywood. Even though we like them a lot, they too have problems recognizing the objects in front of them.

 

Stuart

 

The minion Stuart finds himself talking to a fire hydrant, and the monster Mike uses a contact lens to find a solution.

 

Mike

 

So can one-eyed robots get to know the 3D world?

 

Experiments to create devices with stereoscopic "vision" led either to solutions imitating human vision, based on two cameras, or to solutions replacing vision with "touch": lasers, sonars, radars. Robots were at first merely sci-fi. Then they entered the real world: first those imitating human vision, and later those replacing it with "touch". However, sci-fi gave birth to one-eyed robots, too. Have you ever thought that the "celebrity robot" R2-D2 is one-eyed?

 

R2-D2

 

Almost 40 years later, an advanced version came to the world: BB-8. Although its moves were better, it was still a Cyclops.

 

BB-8

 

Both did great interacting in the 3D world of Star Wars. Unfortunately, science has so far failed to introduce one-eyed robots to the real world. It is considered impossible because we live in a 3D world, and to get to know it, at least two receivers are needed…

 

Well, Cadbest believes it is possible. 

 

The engineers noticed that some animals – hens, pigeons, horses – do great even though their eyes are placed on the sides of the head. This means that each eye sees a different picture. The brain may combine the picture seen by the left eye with the picture seen by the right eye, but these are not two projections of the same scene, which would add depth to the image. Instead, they are two projections of two different scenes, which together form one 2D scene. So, even though these animals have two eyes, they are effectively Cyclopes. But then, how do these animals orient themselves in a 3D world? The answer is ingeniously simple. Have you noticed that they all – hens, horses, pigeons – move their heads back and forth while walking? At first sight, this movement is needed to keep balance. But it can be seen even when the animal is not walking. On top of that, it is absent in animals whose eyes are placed at the front of the head.

 

 

 

 

Then what is going on? The engineers at Cadbest are of the opinion that during this head movement the animal's brain takes "two pictures" of the surrounding world from slightly different angles. These two projections help the brain calculate the depth of the image.
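That "two pictures from a moving head" idea can be sketched directly in code. The snippet below uses off-the-shelf OpenCV routines to match features between two frames taken by one moving camera, recover the relative motion, and triangulate depth. It is a minimal textbook illustration, not Cadbest's patented algorithm; the intrinsic matrix `K` and the two image files are assumed placeholders.

```python
# Minimal sketch: depth from two views taken by ONE moving camera (OpenCV).
# Textbook two-view pipeline for illustration only, not Cadbest's method.
import cv2
import numpy as np

K = np.array([[800.0, 0, 320], [0, 800.0, 240], [0, 0, 1]])  # assumed intrinsics

img1 = cv2.imread("view1.png", cv2.IMREAD_GRAYSCALE)  # "head forward"
img2 = cv2.imread("view2.png", cv2.IMREAD_GRAYSCALE)  # "head back"

# 1. Match features between the two "pictures".
orb = cv2.ORB_create(2000)
kp1, des1 = orb.detectAndCompute(img1, None)
kp2, des2 = orb.detectAndCompute(img2, None)
matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(des1, des2)
pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])

# 2. Recover the relative camera motion from the epipolar geometry.
E, mask = cv2.findEssentialMat(pts1, pts2, K, cv2.RANSAC, 0.999, 1.0)
_, R, t, mask = cv2.recoverPose(E, pts1, pts2, K, mask=mask)

# 3. Triangulate: two projections of the same points yield their depth.
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([R, t])
pts4d = cv2.triangulatePoints(P1, P2, pts1.T, pts2.T)
depth = (pts4d[:3] / pts4d[3])[2]  # Z coordinate, up to scale
print("median depth (arbitrary scale):", np.median(depth[depth > 0]))
```

The recovered depth is only defined up to an overall scale, since a single camera cannot measure the length of its own "head movement" without extra information.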

 

The above observations are not the basis of the solution invented by the company, but they may explain the behavior of the animals. In fact, Cadbest relied on epipolar geometry (the core relation is recalled below) to find an analytic solution to the problem, developing algorithms for:

  • Passive recognition of moving objects in moving surroundings;
  • Alignment of visual parameters in added and augmented reality.
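For orientation, the core relation of epipolar geometry ties together the two images of the same 3D point. The equations below are the standard two-view relations from the literature, recalled only as background for the algorithms listed above; they are not the specific construction that Cadbest has filed for patenting.

```latex
% Standard two-view epipolar relations (background only).
% x_1, x_2 : homogeneous image coordinates of the same 3D point in the two views
% K        : camera intrinsic matrix
% R, t     : rotation and translation of the camera between the two views
\[
  x_2^\top F \, x_1 = 0,
  \qquad F = K^{-\top} E \, K^{-1},
  \qquad E = [t]_\times R .
\]
```

Every pair of matched points contributes one such equation, which is what turns depth recovery with a single moving camera into a solvable problem.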

The patents are already pending. The solution is simple to demonstrate: an ordinary webcam recognizes the three dimensions of a passive object moved in front of it and transmits its movement, in each of the six degrees of freedom, to an image on the screen, thus driving the movement of that image.
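To make the six degrees of freedom concrete, here is a minimal sketch of how one webcam frame can yield an object's full pose (three rotations and three translations) when the object's geometry is known in advance. It uses the standard OpenCV solvePnP routine on an assumed square marker; it illustrates the principle only and is not the patented solution.

```python
# Minimal sketch: 6-DOF pose of a known object from ONE webcam frame.
# Illustration only (standard PnP), not Cadbest's patented solution.
import cv2
import numpy as np

K = np.array([[800.0, 0, 320], [0, 800.0, 240], [0, 0, 1]])  # assumed webcam intrinsics
dist = np.zeros(5)                                            # assume no lens distortion

# Known 3D points on the object (corners of a 10 cm square marker), in metres.
object_pts = np.array([[-0.05, -0.05, 0], [0.05, -0.05, 0],
                       [0.05, 0.05, 0], [-0.05, 0.05, 0]], dtype=np.float32)

# Their detected pixel positions in the current frame (placeholder values).
image_pts = np.array([[300, 220], [380, 225], [378, 300], [298, 296]], dtype=np.float32)

ok, rvec, tvec = cv2.solvePnP(object_pts, image_pts, K, dist)
if ok:
    # rvec (3 rotations) + tvec (3 translations) are the six degrees of freedom
    # that can be forwarded to the on-screen image of the object.
    R, _ = cv2.Rodrigues(rvec)
    print("rotation matrix:\n", R)
    print("translation (m):", tvec.ravel())
```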

 

 

The algorithms developed from this invention can be applied in several areas with high potential. These areas include, but are not limited to:

  • Special purpose systems - using a single camera for passive recording of moving objects by a moving observer, or for navigation and orientation, solves a big problem: a military object that relies on a passive camera emits no probing waves or beams and therefore cannot be detected.
  • Existing augmented reality - the augmented-reality object changes its angle and position according to the movement of the observer, thus looking real all the time. The invention can be applied in the gaming industry to develop a completely new class of computer games, played in the real world rather than in a fantasy world;
  • Eventually, the "celebrity robots" R2-D2 and BB-8 will enter the real world, because they will have the vision that allows them to be fully functional.

 

Already deep in the geometry jungle, the team decided to go further and made another discovery. It turned out that the very complex mathematical interaction in the case of more than two eyes/cameras has an elegant solution, which Cadbest calls "multiple view geometry". The engineers found a very important point, which they have called the "Zeta point". The "Zeta point" allows the complex mathematical solution to be decomposed into a simple system of equations.
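To give a sense of what "decomposed into a simple system of equations" can look like, the sketch below triangulates one 3D point seen by several cameras by stacking the standard linear (DLT) equations from each view and solving them with a single SVD. This is the textbook multi-view formulation, shown for orientation only; it is not the "Zeta point" construction, which Cadbest has not published.

```python
# Minimal sketch: one 3D point seen by N > 2 cameras reduced to a linear system.
# Textbook multi-view (DLT) triangulation -- NOT the "Zeta point" construction.
import numpy as np

def triangulate(proj_mats, pixels):
    """proj_mats: list of 3x4 camera projection matrices P_i = K_i [R_i | t_i].
    pixels: list of (u, v) observations of the same point, one per camera."""
    rows = []
    for P, (u, v) in zip(proj_mats, pixels):
        # Each view contributes two linear equations in the homogeneous point X:
        #   u * (P[2] . X) - (P[0] . X) = 0
        #   v * (P[2] . X) - (P[1] . X) = 0
        rows.append(u * P[2] - P[0])
        rows.append(v * P[2] - P[1])
    A = np.vstack(rows)
    # The least-squares solution of A X = 0 is the right singular vector
    # associated with the smallest singular value.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]  # back to Euclidean coordinates
```

Adding a camera simply adds two more rows to the system, which is why the many-camera case stays tractable.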

 

Of course, Mother Nature found this solution before them: spiders have more than two eyes.

 

 

They process visual data in a similar way. In fact, in an even simpler way, because the structure of their retinas provides information about another important parameter: the focal distance of each eye. Using "Zeta points" and the calculated focal distances, the algorithms developed by Cadbest increase the precision and the speed of finding a solution by more than 10,000 times. This makes it possible to determine trajectories, locations and changes in the surrounding world in real time.

The algorithms have many practical applications in the fields of bionics and complex systems control. Bringing them into practice requires time, strategic partners, a dedicated team, and sufficient resources.