Is a Linkless Search Engine Possible?
When people discuss how search engine results might be determined in the future, many conversations center on "going linkless." Today, one of the key metrics that search algorithms use to assign value, rank, and usefulness to a piece of content is how often other pages link to it. There is some sentiment among the technologically savvy, and among those focused on search engine optimization (SEO), that using links as a criterion for judging the value and usefulness of content is inefficient.
Some believe that better results might be achieved if metrics other than the presence and popularity of links were used. There are, after all, other metrics that search engines already use, such as search queries and even the frequency of brand-name mentions. So why do we need links at all? Is it possible to do without them?
In reality, it’s quite possible to form rankings without assessing links. However, the results tend to be unsatisfactory.
The Russian Experiment
One Russian search engine company, Yandex, carried out a bold experiment: it removed links as a metric for judging search relevance. Not entirely surprisingly, it proved that, technically, this was straightforward to do. The company eliminated links as a ranking criterion and concentrated on other factors, such as the nature of search queries and, of course, the frequency of brand-name mentions.
The most interesting thing about the Yandex experiment is that, at first, it seemed a resounding success. In the initial stages, the ranking system appeared both fast and accurate, and users enjoyed its efficiency. Then other people realized the system was easier to exploit, and they began to do so.
Yandex had run into the same problem Google faced in the early days of internet search. One way people quickly exploited keywords, for example, was to repeat the same word hundreds or even thousands of times. Search engines of the day used much more simplistic, indiscriminate algorithms, which decided that because an important keyword appeared more frequently on a website or web page, the page must be more useful and relevant. Unfortunately, in some cases, keywords merely "lined" the bottom of a piece of web content that had nothing to do with the search topic, but because the word appeared hundreds of times, searchers were directed there.
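The keyword-stuffing exploit described above can be sketched as a toy scorer. This is purely illustrative, not any real engine's code: a naive ranker that counts raw keyword frequency will always prefer the stuffed page.

```python
import re

def naive_score(page_text: str, query: str) -> int:
    """Score a page by raw keyword frequency, roughly as early engines did."""
    words = re.findall(r"[a-z]+", page_text.lower())
    return words.count(query.lower())

# A short, honest page versus a page "lined" with the keyword 500 times.
relevant = "Our guide to garden tools covers the best tools for every garden."
stuffed = "garden " * 500

print(naive_score(relevant, "garden"))  # 2
print(naive_score(stuffed, "garden"))   # 500 -- the stuffed page wins the ranking
```

Because the scorer has no notion of content quality, padding a page with a keyword is enough to dominate it, which is exactly the exploit early engines had to design around.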
Yandex eventually reverted to including links in its ranking system. From a security perspective, it is considerably more difficult to "game the system" through links, since doing so relies on other sites choosing to link to your content, and simple tricks such as pasting the same text hundreds of times will not fool today's more sophisticated algorithms.
Links Are Not the End-All
Links do not wholly determine rankings on the Google search engine results page (SERP), and they may not always remain one of the most critical factors. Search algorithms face a battle similar to the one cybersecurity experts fight to keep data safe, and they use similar preventative strategies. Hackers examine the latest defenses and then attempt to bypass or break them; sometimes they succeed. Likewise, people looking to boost rankings constantly observe ranking movement, try to understand how the latest algorithm changes affect results, and then attempt to exploit them. When they succeed, the search engines readjust their algorithms to resist the exploitation.
In this sense, search algorithms must strike a balance between ease of use and resistance to exploitation. It is truly a case of the whole being greater than the sum of its parts: a search algorithm must weigh many different factors to rank a piece of content, and it cannot rely on just one or two variables. At the same time, the set of criteria used, and how those criteria are interpreted, is in a state of constant evolution. Links may one day play a less critical role.
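The many-factor balance described above can be sketched as a weighted sum of signals, so that no single signal decides the outcome. The signal names and weights below are illustrative assumptions, not any real engine's values.

```python
# Illustrative only: hypothetical signal names and weights, not a real algorithm.
WEIGHTS = {
    "link_score": 0.4,      # inbound-link signal
    "query_match": 0.3,     # relevance to the search query
    "brand_mentions": 0.2,  # unlinked brand-name mentions
    "freshness": 0.1,       # recency of the content
}

def combined_score(signals: dict) -> float:
    """Weighted sum of ranking signals; a missing signal contributes zero."""
    return sum(weight * signals.get(name, 0.0) for name, weight in WEIGHTS.items())

page = {"link_score": 0.8, "query_match": 0.9, "brand_mentions": 0.2, "freshness": 0.5}
print(round(combined_score(page), 2))  # 0.68
```

Note that a page strong in only one signal cannot dominate: even a perfect link_score alone tops out at 0.4 under these weights, which is the "no one or two variables" property the text describes.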
Ranking Factors of the Future
There is still a lot of talk in SEO about "intended votes," the idea that if people genuinely mention a brand or a name in comments or an article, that mention should carry some weight. To a certain degree, it does. However, there is still something to be said for a high-value link, such as when a trusted website links to another site it considers worthy of mention or inclusion. Links are still given a lot of weight, and site owners tend to be much choosier about which links they include than about merely mentioning names or brands.
A direct link to another site is easy to see and follow, and algorithms recognize it, which is why the number of links relates to the prominence or value of a content provider. A reputable site will not risk its ranking by associating with a subpar content provider, so these links are seen as legitimate and treated as a reliable ranking factor. With the inclusion of other elements, such as social media signals and new technologies that make spoken queries possible, algorithms will continue to change.
Links themselves may change as well, even if the link as a concept retains its importance in ranking systems. Links are already ranked by "quality": numerous mentions on an anonymous blog or in a press release can be considered lower in value than links from popular, trusted sources. The more difficult it is to "counterfeit" the value of a result, the more useful that result is in providing genuine answers for people using search engines. For these reasons, links might be the best tool we have, at least for now.
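The quality-weighting idea above can be sketched as follows. The source names and quality scores are hypothetical, chosen only to illustrate that many links from low-quality sources can be worth less than a single link from a trusted one.

```python
# Hypothetical source-quality scores, for illustration only.
SOURCE_QUALITY = {
    "trusted-news.example": 1.0,
    "anon-blog.example": 0.1,
    "press-release.example": 0.1,
}

def link_value(inbound_sources: list) -> float:
    """Sum inbound links, each weighted by the linking source's quality."""
    return sum(SOURCE_QUALITY.get(src, 0.05) for src in inbound_sources)

many_cheap = ["anon-blog.example"] * 8   # eight low-value links
one_trusted = ["trusted-news.example"]   # one high-value link

print(link_value(one_trusted) > link_value(many_cheap))  # True
```

Under this weighting, mass-producing cheap links stops being a winning strategy, which is the "counterfeit resistance" the text attributes to link-based ranking.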