Tuesday, April 12, 2011

The New Parameters in Google Algorithm

Online PR News – 12-April-2011 – Webmasters across the world, particularly those engaged in Google SEO and optimization, should know that Google has made significant changes to its algorithm. The change matters because it has prompted a major shift in Google's overall ranking system. As more and more people practice SEO to obtain higher rankings in Google's search engine result pages, it has become increasingly difficult for Google to identify quality sites. It can no longer rely solely on backlinks to distinguish the good from the bad.
Google has therefore had to introduce a stricter filtering process, taking many new signals into account alongside the ones it relied on earlier. The new factors reportedly include:
Domain-Related Factors:
1. The domain's past record, such as how often it changed IP addresses
2. The domain's past owners, i.e., how often ownership changed
3. External mentions of the domain (non-linked citations)
4. The geo-targeting setting in Google Webmaster Tools
5. Use of the keyword in the domain name
Site Architecture Factors:
1. Website URL structure
2. Site navigation structure
3. Use of external CSS/JS files
4. Website accessibility (avoidance of inaccessible navigation, JavaScript-only links, etc.)
5. Use of canonical URLs
6. "Correct" (valid) HTML code
7. Cookie usage
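A canonical URL is declared with a `<link rel="canonical">` tag in the page head. As a minimal sketch of how such a tag can be extracted, here is an example using Python's standard html.parser (the sample page and URL are hypothetical):

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Collect the href of the first <link rel="canonical"> tag seen."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)  # attrs arrives as a list of (name, value) pairs
        if tag == "link" and a.get("rel") == "canonical" and self.canonical is None:
            self.canonical = a.get("href")

# Hypothetical page declaring its canonical URL
html = ('<html><head>'
        '<link rel="canonical" href="https://www.example.com/page">'
        '</head><body>...</body></html>')

finder = CanonicalFinder()
finder.feed(html)
print(finder.canonical)  # https://www.example.com/page
```

Declaring a single canonical URL like this tells search engines which of several duplicate URLs (e.g., with and without tracking parameters) should receive the ranking credit.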
Content Factors:
1. Frequently updated content (fresh content gets preference)
2. Content uniqueness (duplicate content invites a penalty)
3. Ratio of pure text content (excluding links, images, code, etc.)
4. Keyword density (an ideal density is often cited as 2-5%)
5. Penalties for rampant misspellings, bad grammar, and 10,000-word screeds without punctuation
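The keyword-density figure above is simply keyword occurrences as a share of total words. A minimal sketch of that calculation (the 2-5% range is the article's claim, not an official Google threshold, and the sample text is hypothetical):

```python
import re

def keyword_density(text, keyword):
    """Return occurrences of `keyword` as a percentage of total words."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for w in words if w == keyword.lower())
    return 100.0 * hits / len(words)

page = "SEO tips: good SEO means readable content, not keyword stuffing."
print(f"{keyword_density(page, 'seo'):.1f}%")  # 2 hits in 10 words -> 20.0%
```

A page scoring 20% like this sample would be far above the cited 2-5% range, which the article suggests risks being treated as keyword stuffing.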
Internal Linking Factors:
1. Number of internal links to a page
2. Number of internal links to a page with identical/targeted anchor text
3. Number of internal links to a page from body content (rather than navigation bars, breadcrumbs, etc.)
4. Number of links using the "nofollow" attribute
5. Internal link density
Website Factors:
1. Use of robots.txt
2. Overall site update frequency
3. Overall site size
4. Amount of time passed since being indexed by Google
5. Use of an XML sitemap
6. On-page trust flags (contact information, which matters even more for local search; a privacy policy; terms of service; and similar)
7. Website type (e.g., blogs appearing in the top 10 instead of informational sites)
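Two of the factors above, robots.txt and the XML sitemap, can be inspected programmatically. A minimal sketch using Python's standard urllib.robotparser to test what a robots.txt file allows (the rules and URLs are hypothetical):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt that blocks a private directory
# and advertises the site's XML sitemap.
robots_txt = """\
User-agent: *
Disallow: /private/
Sitemap: https://www.example.com/sitemap.xml
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

print(rp.can_fetch("Googlebot", "https://www.example.com/private/page.html"))  # False
print(rp.can_fetch("Googlebot", "https://www.example.com/index.html"))         # True
```

In practice the parser is usually pointed at a live `https://example.com/robots.txt` with `set_url()` and `read()`; `parse()` on raw lines is used here so the example is self-contained.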
These are some of the factors that Google is reportedly taking into account in its latest algorithm update.

Monday, April 11, 2011

Future of Google digital library is hard to read

It was a glittering dream: A vast worldwide digital library, tens of millions of books all in one easily accessible place . . . named Google.

Now that dream has been denied, and soon dreamers will meet to see whether they can fashion a more workable vision - one that will pass legal muster.

In a Manhattan court March 22, U.S. Circuit Judge Denny Chin struck down an agreement among search engine Google, the Authors Guild, and the Association of American Publishers. The pact would have let Google sell access to its ever-growing database of more than 15 million digitized books. But no. The decision, a pivotal moment in the history of electronic books and libraries, stands firm on traditional notions of copyright, monopolies, and privacy. With the agreement rejected, all sides will huddle April 25 to see whether there's a next step.

"I'd love to be a fly on the wall at that meeting," says Corynne McSherry, intellectual-property director at the Electronic Frontier Foundation, which filed an objection in the case along with the American Civil Liberties Union.

"I don't know how they're going to work it out," says Ken Auletta, author of Googled: The End of the World as We Know It.

It's been a twisty-turny journey. In 2002, Google began scanning books into its database. In 2004, it launched Google Print (later renamed Google Books), through which users could view snippets and download, for a fee, public-domain books (those to which no one holds a copyright). Google partnered with institutions such as Harvard, Michigan, Stanford, and Oxford Universities and began to digitize their holdings.

But many of those books were under copyright, prompting the Authors Guild and the Publishers Association to sue in 2005.

A settlement was reached in 2008. Tellingly, Google agreed to pay $125 million to search for copyright holders and pay authors and publishers fees and royalties. Auletta says, "They were, in effect, acknowledging there's such a thing as copyright. That's a huge admission for a digital company to make."

But in 2009, the U.S. Department of Justice, worried about giving big Google a monopoly, balked. The agreement was amended, and last year it reached Chin's desk. The digital world had been waiting for the outcome.