The Sandbox Effect

When a layman refers to a sandbox, he is talking about the child’s play area filled with sand – a place where maybe he also once made sand castles. When a technical person mentions the sandbox, he is referring to a restricted environment where certain operations, such as deleting and modifying files, are prohibited. The sandbox is used when executable files come from an external source that is not entirely trusted.

In this era, where people depend on search engines for almost all their information, the biggest name in the field is Google. Unlike other search engines, Google ranks a page not just on its content but on how many other pages link to it and how popular those linking pages are. This PageRank feature catapulted Google to the peak. But then came webmasters who stuffed their pages with false metadata, causing unwanted results to turn up among the search results.
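The link-based ranking idea described above can be illustrated with a toy version of the PageRank power iteration. This is a minimal sketch for intuition only – the graph, the damping factor of 0.85, and the function name are illustrative assumptions, not Google's actual implementation:

```python
def pagerank(links, d=0.85, iters=50):
    """Toy PageRank: `links` maps each page to the pages it links to.
    d is the damping factor; returns a dict of page -> rank score."""
    nodes = list(links)
    n = len(nodes)
    rank = {p: 1 / n for p in nodes}  # start with a uniform distribution
    for _ in range(iters):
        new = {p: (1 - d) / n for p in nodes}  # "random surfer" base share
        for page, outs in links.items():
            if outs:
                # a page passes its rank evenly to the pages it links to
                share = d * rank[page] / len(outs)
                for target in outs:
                    new[target] += share
            else:
                # dangling page: spread its rank evenly over all pages
                for target in nodes:
                    new[target] += d * rank[page] / n
        rank = new
    return rank

# Hypothetical three-page web: "a" is linked to by both "b" and "c",
# so it ends up with the highest rank despite having the least content.
graph = {"a": ["b"], "b": ["a", "c"], "c": ["a"]}
scores = pagerank(graph)
```

The point of the sketch is that rank flows along links: a page's importance comes from the importance of the pages pointing at it, which is why on-page metadata alone cannot buy a top position.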

But now, webmasters are shouting in distress when their pages do not appear on Google though they appear on Yahoo! and MSN.

The Sandbox effect is a theory used to describe the hold-up of websites while being queued for ranking by the Google search engine. Websites which are newly registered or see frequent ownership changes are placed in a sandbox before they can be deemed appropriate for ranking.

Opinions differ on whether it exists at all. People who do not believe in it explain the behaviour as a by-product of Google’s search algorithm. For those who do believe in it, usually webmasters whose websites do not rank on Google, it is a cause for distress, since it apparently takes a year or even longer for their websites to be ranked.

Tests have been conducted to find out if the sandbox actually does exist. While the testers have stated that such a hold-up does exist, they have not been able to identify the reason behind it – whether it is a result of poor optimisation techniques or of Google’s search algorithm.

It is thought that the sandbox is a measure taken by Google to prevent spam websites from abusing search engine optimisation techniques to reach the top of the index. When Google’s web crawler, GoogleBot, crawls the web, downloading pages onto Google’s index servers, it looks for content which people would find worth a read. This is when Google blacklists the unworthy ones – those whose metadata does not match their content, those which abuse the power of search which Google has endowed upon modern civilisation.
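The "metadata does not match the content" check can be pictured as a simple heuristic: compare the declared meta keywords against the words actually on the page. This is purely an illustrative sketch – the function name, the comma-separated keyword format, and the 50% threshold are assumptions of mine, not a description of how Google actually filters spam:

```python
def meta_matches_content(meta_keywords, body_text, threshold=0.5):
    """Return True if at least `threshold` of the declared meta keywords
    actually appear as words in the page body (a toy spam heuristic)."""
    body_words = set(body_text.lower().split())
    keywords = [k.strip().lower() for k in meta_keywords.split(",") if k.strip()]
    if not keywords:
        return True  # nothing declared, nothing to contradict
    hits = sum(1 for k in keywords if k in body_words)
    return hits / len(keywords) >= threshold

# An honest page: its keywords show up in its own text.
honest = meta_matches_content("sandbox, google", "the google sandbox effect explained")
# A spammy page: keywords have nothing to do with the body.
spammy = meta_matches_content("casino, pills", "an essay about search engines")
```

A real crawler would of course weigh far more signals, but the sketch captures the basic idea of penalising pages whose metadata promises content the page does not deliver.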

Google has an undisclosed number of servers constantly downloading the Internet – yes, every page of it, byte by byte. It is constructing the “Database of Intentions”, from which everyone will benefit. You can simply google your research for next week’s debate, unlike a few years ago when you had to gobble book after book. Would you want spam websites popping up in between your searches?

That is why I cannot understand why the sandbox effect is creating a controversy on the World Wide Web. Losers always cry foul when they themselves are at fault. Google has done a fair job by creating the “sandbox”, be it a physical lock-up or an end result of its algorithm. Let the kids play in the sandbox. Papa shall make sure that unless you learn the ways of the world outside your crib, you fool around in the sandbox.

Sahil Khan
