Google’s indexing initiative has raised eyebrows among many people while bringing smiles to others. Reports point to Google’s intention to index Facebook comments (along with other content that is accessible only through an HTTP POST request).
Indexing by Google
With the indexing of comments, Google will present user comments as standard search results. The aim is to capture content that lies hidden behind a form, as on Facebook, Disqus and other JavaScript-based commenting platforms. These powerful, easy-to-use web applications have done much to connect people across the web effortlessly, which is one of the main reasons for their huge popularity.
There are two key HTTP request methods used on the web, GET and POST. A GET request is meant to read data, while a POST request can ‘alter’ data on the server. For this reason, search engine robots (like Google’s) have traditionally stuck to GET requests. Since reading data does not change the content being read, Googlebot (the program that determines which websites to crawl and collects documents from the web) has been a passive observer. Now, however, it can interact with (and possibly alter) the content it crawls, although in practice it is unlikely that Googlebot will actually ‘alter’ data.
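To illustrate the distinction, here is a minimal Python sketch using only the standard library; the URL and form field are placeholders, not anything Google or Facebook actually exposes. It builds the two kinds of request without sending them, and shows that supplying a request body is what turns a read-only GET into a state-changing POST.

from urllib import parse, request

# GET: everything the server needs travels in the URL; the request only reads data.
get_req = request.Request("https://example.com/comments?post_id=42")

# POST: the form data goes in the request body and may change state on the server,
# which is why crawlers have traditionally avoided it.
form = parse.urlencode({"post_id": "42", "comment": "Hello"}).encode()
post_req = request.Request("https://example.com/comments", data=form)

print(get_req.get_method())   # GET
print(post_req.get_method())  # POST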
Google could arguably have begun indexing this kind of content much earlier. Ajax, the technology that lets pages fetch data without a full reload (and so reduces the lag between a click and a result), has been around for a long time, so the technique could have been implemented sooner. However, this is only an opinion.
The favorable element
Expanding what the search engine can crawl extends the content it covers and can improve relevance. Users can expect more appropriate results and can click on exactly what they have been looking for.
This change can be good news for SEOs, who have otherwise found commenting platforms of limited use. Until now, blog commenting did not provide any search boost to their websites; with this change, the text from comment boxes will appear in Google search.
More importantly, it will encourage users to think about what is and is not worth posting as a comment online.
The unfavorable element
Looking at the other side of the story, there is concern among developers about Googlebot making POST requests. Involving robots increases the chance of errors, and an untoward incident cannot be ruled out. However, the robots.txt file can be used to disallow Googlebot from crawling a website’s forms (the POST URLs).
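As a rough sketch, the directives below use the standard robots.txt mechanism; the blocked path is hypothetical, and Python’s built-in robotparser is used here only to show how such a rule would be evaluated against crawl requests.

from urllib import robotparser

# Hypothetical robots.txt that keeps Googlebot away from a form-handling path.
rules = """\
User-agent: Googlebot
Disallow: /comment-form/
"""

parser = robotparser.RobotFileParser()
parser.parse(rules.splitlines())

# An ordinary page remains crawlable (True); the form URL is off limits (False).
print(parser.can_fetch("Googlebot", "https://example.com/article"))
print(parser.can_fetch("Googlebot", "https://example.com/comment-form/submit"))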
Private users may not be happy about this intrusion into their comments. Facebook users rely on privacy settings when they do not want to socialize openly, but with indexing their names and comments will be exposed in search results. Individuals set their own boundaries, and intruding into a ‘restricted area’ may be unwelcome. Facebook may step in to help its users in some way and protect their privacy.
Wait and watch
Google has shown responsibility and says it does not intend to perform any task that could lead to an ‘unintended user action’. With the growing popularity of user-friendly community portals like Facebook, traffic is a crucial component of the industry, and many users may find it uncomfortable to have their words made public.
It is time to weigh the ups and downs of this technological change. Words are said to be more harmful than the deadliest weapons in the world, and that old caution looks set to carry new weight.
Written by Guest Author