Artificial intelligence began hunting buyers by tracking their pupils

A neural-network algorithm has been developed in Israel

A computer will be able to read from a shopper's eyes what they want to buy. The algorithm behind it was developed by a group of scientists from the USA and Israel.

Technology that lets an immobilized person perform certain actions by tracking the movement of their pupils has been known for quite some time. Such technologies drive systems that combine the human eye, a computer and, for example, an exoskeleton. They work like this: when a person shifts their gaze to the left or right, a camera reads the eye movement and the computer can trigger the operation of an artificial limb.
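The gaze-to-command mapping described above can be sketched in a few lines. This is a hypothetical illustration, not the actual assistive system: the function name, the threshold value, and the command strings are all assumptions made for the example.

```python
# Hypothetical sketch: map a tracked gaze direction to a device command,
# as in the gaze-controlled assistive systems the article describes.

def gaze_to_command(pupil_x: float, threshold: float = 0.3) -> str:
    """Map a normalized horizontal pupil offset (-1..1) to a command.

    pupil_x < -threshold  -> looking left   -> "move_left"
    pupil_x >  threshold  -> looking right  -> "move_right"
    otherwise             -> gaze centered  -> "hold"
    """
    if pupil_x < -threshold:
        return "move_left"
    if pupil_x > threshold:
        return "move_right"
    return "hold"

# Example: the camera reports the pupil shifted well to the right.
print(gaze_to_command(0.6))  # -> move_right
```

A real system would of course smooth the camera signal over time before acting on it, so that a single glance does not fire a command.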

Researchers from the United States and Israel have developed an algorithm that predicts a user's actions from their eye movements while they work with a screen. The neural network was trained using a special program that tracked where a potential buyer in a virtual store looked as they browsed the catalog on the website.

As a result, from certain eye movements the program learned to predict the most likely purchases even before the user had made a conscious decision.
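The idea of inferring purchase intent from gaze data can be illustrated with a toy heuristic. The article's system is a trained neural network; the dwell-time scoring below is only a stand-in for it, and the item names and gaze log are made up.

```python
# Hypothetical sketch of purchase prediction from gaze fixations: total
# dwell time per catalog item serves as a crude proxy for intent. The
# real predictor in the article is a neural network, not this heuristic.

from collections import defaultdict

def predict_purchase(fixations):
    """fixations: list of (item_id, duration_ms) gaze fixations.

    Returns the item with the greatest total dwell time.
    """
    dwell = defaultdict(float)
    for item, duration in fixations:
        dwell[item] += duration
    return max(dwell, key=dwell.get)

# Example gaze log from one virtual-store session (made-up data).
log = [("shoes", 420), ("hat", 130), ("shoes", 610), ("bag", 250)]
print(predict_purchase(log))  # -> shoes
```

Feeding such aggregated gaze features into a classifier, rather than taking the maximum directly, is presumably where the neural network in the study comes in.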

According to the developers from the University of Maryland, achieving this result required summarizing hundreds of thousands of eye positions across millions of different parameters.

In their view, the development could be useful to operators of virtual supermarkets, allowing them to respond quickly to customers' wishes, as well as to psychologists, psychiatrists, designers and financiers.

Psychologists have long known that the eyes can tell a lot about a person, at the very least whether they are hiding something from you. When someone is being evasive, their gaze will meet yours for less than a third of the conversation. Dilated pupils (under normal lighting) and frequent blinking are also telling, as they usually reveal the anxiety of the person you are speaking with.

However, while the smallest details of eye movements and gaze direction used to escape human observers, trained neural networks may soon overcome this limitation of our natural perception.
