Is Satisfaction a Future Ranking Signal in Google Search Results?
Do you search through Google on your phone? How would you know whether Google is watching you as you do, keeping an eye on whether you like the results you receive? Could satisfaction with search results be a ranking signal that Google uses now, or might use in the future?
A newly published Google patent application describes technology that would modify the scoring and ranking of query results using biometric indicators of user satisfaction or negative engagement with a search result. In other words, Google would track how satisfied or unsatisfied someone might be with search results and, using machine learning, build a model based upon that satisfaction, raising or lowering search results for a query. This kind of reaction might be captured using the camera on a searcher's phone to see their reaction to a search result, as depicted in the following screenshot from the patent:
This satisfaction would be based upon Google tracking and measuring biometric parameters of a user after a search result is presented, to determine whether those parameters indicate negative engagement by the user with that result.
For example, someone searches for “Seafood Restaurants,” and the top result is a restaurant they have visited before and didn’t like, causing them to frown, which may be captured on their phone’s camera. That reaction may be seen as a negative signal by the search engine, and could potentially count against that restaurant ranking highly for that query term. The patent tells us that such a reaction may influence search results for multiple searchers:
The actions include providing a search result to a user; receiving one or more biometric parameters of the user and a satisfaction value; and training a ranking model using the biometric parameters and the satisfaction value. Determining that one or more biometric parameters indicate likely negative engagement by the user with the first search result comprises detecting:
- Increased body temperature
- Pupil dilation
- Eye twitching
- Facial flushing
- Decreased blink rate
- Increased heart rate
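The training step the patent describes (receive biometric parameters plus a satisfaction value, then train a ranking model on them) can be sketched in miniature. This is only an illustration under assumptions I'm making: the patent does not specify the model type, so I use a simple logistic regression trained by gradient descent, with the six listed cues encoded as binary features. All names and data here are hypothetical.

```python
# Hypothetical sketch: learning a satisfaction model from biometric cues,
# loosely following the patent's described steps. The feature encoding,
# model type, and toy data are assumptions, not taken from the patent.
import math

# The six negative-engagement cues listed in the patent, as binary features
# (1.0 = the cue was observed after the result was shown).
FEATURES = ["body_temp_increase", "pupil_dilation", "eye_twitching",
            "facial_flushing", "blink_rate_decrease", "heart_rate_increase"]

def predict(weights, bias, x):
    """Estimated probability that the user was satisfied (logistic model)."""
    z = bias + sum(w * xi for w, xi in zip(weights, x))
    return 1.0 / (1.0 + math.exp(-z))

def train(samples, epochs=200, lr=0.5):
    """samples: list of (biometric_vector, satisfaction) with satisfaction 0 or 1."""
    weights = [0.0] * len(FEATURES)
    bias = 0.0
    for _ in range(epochs):
        for x, y in samples:
            p = predict(weights, bias, x)
            err = y - p  # gradient of the log-loss for logistic regression
            bias += lr * err
            weights = [w + lr * err * xi for w, xi in zip(weights, x)]
    return weights, bias

# Toy training data: many observed negative cues -> unsatisfied (0).
samples = [
    ([1, 1, 1, 1, 1, 1], 0),
    ([0, 0, 0, 0, 0, 0], 1),
    ([1, 0, 1, 0, 1, 0], 0),
    ([0, 1, 0, 0, 0, 0], 1),
]
weights, bias = train(samples)
calm_score = predict(weights, bias, [0, 0, 0, 0, 0, 0])      # high = satisfied
agitated_score = predict(weights, bias, [1, 1, 1, 1, 1, 1])  # low = unsatisfied
```

A production system would presumably aggregate such scores across many searchers before letting them raise or lower a result's ranking, rather than reacting to any single user's frown.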
The patent is:
Methods, systems, and apparatus, including computer program products, for providing query results using biometric parameters. One of the methods includes providing a search result in response to receiving a search query. If one or more of biometric parameters of a user indicate likely negative engagement by the user with the first search result, an additional search result is obtained and provided in response to the search query.
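The fallback behavior in that abstract (if biometric parameters indicate likely negative engagement with the first result, obtain and provide an additional result) is simple enough to sketch. The threshold, signal names, and function names below are my own illustrative assumptions; the patent does not say how "likely negative engagement" is actually decided.

```python
# Hypothetical sketch of the patent's fallback flow: if biometric signals
# suggest negative engagement with the top result, serve an additional one.
# The two-cue threshold and signal names are assumptions for illustration.

NEGATIVE_SIGNALS = {"increased_body_temp", "pupil_dilation", "eye_twitching",
                    "facial_flushing", "decreased_blink_rate",
                    "increased_heart_rate"}

def likely_negative_engagement(observed_signals, threshold=2):
    """Treat two or more detected negative cues as likely dissatisfaction."""
    return len(NEGATIVE_SIGNALS & set(observed_signals)) >= threshold

def serve_results(ranked_results, observed_signals):
    """Return the top result, plus the next one if the user seems unhappy."""
    shown = [ranked_results[0]]
    if likely_negative_engagement(observed_signals) and len(ranked_results) > 1:
        # "an additional search result is obtained and provided"
        shown.append(ranked_results[1])
    return shown
```

For example, `serve_results(["Restaurant A", "Restaurant B"], ["eye_twitching", "facial_flushing"])` would return both results, while an empty signal list would return only the first.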
When I think of how often I get my face right up to my phone’s screen while searching for something, the idea that Google might use the phone’s camera to capture my facial expressions as I’m looking at results doesn’t surprise me. Would Google use such signals to rank search results, or build a model of biometric reactions to search results? It’s an interesting question. Instead of social media likes or dislikes, these rankings would be based upon what would be perceived as actual likes or dislikes.
Could you envision Google using an approach like this one in ranking search results?