Sites hit by Google’s Florida update (November 2003) got hit hard again on or around January 23. Google’s latest update is called Austin, and with names like these, the updates are beginning to ‘sound’ like elections!
Depending on the industry you happen to be in, you could have been hit lightly or hard; it depends on a whole number of factors, and no two situations are quite the same. One of our clients, whose site until recently had been optimized by another SEO firm, was devastated to find that his site was gone from the face of the earth.
Things such as FFA (Free For All) link farms, invisible text, and stuffed meta tags that run twenty screens long and are filled with useless spam will get your site penalized, or even banned, faster than you can blink an eye.
Google’s new search algorithm is getting smarter and better. Part of this is due to the Hilltop algorithm, which I wrote about at the beginning of January. In combination with the PageRank™ algorithm that Google has been using since 1998, these two algorithms work in tandem and are supposed to produce better, more relevant search results.
That last claim remains to be tested, as in some cases I have found that the results produced are not always relevant. But Google is continuously tweaking and fine-tuning its technology to make it better, and the next few weeks should see more improvements.
To read the complete article on the Rank for $ales website, click here.
Posted by Search Engine News Blog and by Tech Blog.
Google Update Austin: Google Update Florida Again
"Wow, Google just destroyed my rankings." It's a common line I have been told over and over again. Ever since the middle of November, Google has been up to some real trickery...
While this update appears similar to update Florida, it has received nowhere near the same amount of coverage. I have read the posts at the Highrankings Forums, Cre8asite Forums, SearchGuild, and IHelpYou Forums. I have not read the posts at WebmasterWorld since I am not a subscriber. As with the Florida update, it is clear that some spam has risen to the top (Google search: Michigan Data Recovery, courtesy of MakeMeTop) and some results have improved.
Most SEOs would recommend not making large changes to your site while the search engine is still in a state of flux.
Search Results
Many sites that were not beaten down by the Florida update have been dropped during this update. Some of the common sites which have dropped are:
- those who relied heavily on "on the page" SEO
- those who exchanged links with off topic sites to rank well
- location specific sites
- large shopping sites
- deep inner pages within sites which may have a strong site rank
What is Google Trying to do?
Contrary to popular belief, the CIA does not control the Google Dance.
People like me can make a living manipulating Google's search results. Google does not like this. Essentially Google wants the most relevant (not the most optimized) sites at the top of the search results.
Google may actually periodically re-weight factors purely to make it harder to figure out what their algorithm is. Most of the core powerful websites will remain unaffected by the changing algorithms. Those sites most likely to rise or fall are those gaming the search engine. Some innocent sites may get lost in the shuffle, but that is not what Google is concerned with. Google wants to provide the most relevant search results to its users.
If Google can make it hard to optimize your site for Google then it will require you to waste resources figuring out the latest algorithm change or conform to what Google desires. Google wants information to appear in the free listings. Google would like commercial sites to use their ads.
What is with the big shakeups?
Any time you introduce a new algorithm which is based on a relational database, the results will be a bit shaky off the start. With each update the results will fall more in line with the goals of the new algorithm.
Off the start many of the results may not make sense, but the end goal is that most will. You and I both notice the shakeups more than a normal searcher since we track our specific websites.
Google coping with problems
Many search engine algorithms become less efficient as the size of the database grows. To fight off spamming and to improve the functionality of their search product, Google is trying to aid PageRank with other algorithmic features. Instead of being powered primarily by PageRank alone, they are most likely trying to introduce techniques such as LocalRank, stemming, and latent semantic indexing.
What is PageRank?
PageRank is an overall view of the popularity of a specific page against the entire internet. In general it is an approximation of the odds a random web surfer would come across a page.
The problem with PageRank is that since its base is against the entire internet it is very easy for a small sector of the internet to bond together to manipulate results.
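The random-surfer idea above can be sketched as a simple power iteration. This is a toy illustration of the published PageRank concept (damping factor 0.85, as in Brin and Page's original paper), not Google's actual implementation, and the example graph is made up:

```python
# Minimal PageRank power iteration. A page's score approximates the odds
# that a random surfer (following links, occasionally jumping to a random
# page) lands on it. Toy sketch, not Google's implementation.

def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links out to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:  # dangling page: spread its rank evenly
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
            else:
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
        rank = new_rank
    return rank

# Hypothetical three-page site: every page links back to the homepage.
web = {
    "home": ["about", "products"],
    "about": ["home"],
    "products": ["home", "about"],
}
scores = pagerank(web)
# "home" ends up with the highest score, since every page links to it.
```

Note that the scores always sum to 1 across the whole graph, which is exactly why a small, tightly linked clique can concentrate rank, as described next.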
What is LocalRank?
LocalRank is a concept similar to PageRank, but based on a local segment of the web. Teoma reranks the relevancy of search results on the fly (after the search) based upon their local interconnectivity. When Google bought Kaltix it purchased a similar technology.
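A crude sketch of the reranking idea: take the initial result set for a query and re-score each result by how many of the *other* results link to it. This is my simplification of the concept, with invented data, not any engine's actual code:

```python
# Sketch of LocalRank-style reranking: score each result by inbound
# links from other pages in the same result set, then reorder.

def local_rank(results, links):
    """results: ordered list of URLs from the initial query.
    links: maps each URL to the set of URLs it links out to."""
    result_set = set(results)
    scores = {}
    for page in results:
        # Count inbound links from the other results only.
        scores[page] = sum(
            1 for other in result_set
            if other != page and page in links.get(other, set())
        )
    # Sort by local connectivity; Python's stable sort keeps the
    # original (global) order as the tiebreaker.
    return sorted(results, key=lambda p: -scores[p])

results = ["a.com", "b.com", "c.com"]
links = {
    "a.com": {"c.com"},
    "b.com": {"c.com"},
    "c.com": set(),
}
reranked = local_rank(results, links)
# "c.com" moves to the top: both other results link to it.
```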
What is Stemming?
Stemming is the conversion of a word to its base form in an attempt to better understand it. Typically the exact match of the searched version should have a slightly higher weighting, but the other versions of the word also factor into the search results.
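A toy suffix-stripping stemmer shows the basic idea. Real systems use far more careful algorithms (the Porter stemmer is the classic), and this suffix list is made up for illustration:

```python
# Toy stemmer: strip a known suffix so that "searching", "searches",
# and "searched" all reduce to the same base form, "search".

SUFFIXES = ["ing", "edly", "ed", "ies", "es", "s", "ly"]

def stem(word):
    word = word.lower()
    # Try longer suffixes first so "edly" wins over "ly".
    for suffix in sorted(SUFFIXES, key=len, reverse=True):
        if word.endswith(suffix) and len(word) - len(suffix) >= 3:
            return word[: -len(suffix)]
    return word

print(stem("searching"))  # -> "search"
print(stem("searches"))   # -> "search"
print(stem("searched"))   # -> "search"
```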
What is Latent Semantic Indexing?
Latent semantic indexing is the organization of search results by conceptually understanding each page. Based upon the natural patterns in language, latent semantic indexing can mathematically approximate a good understanding of the purpose of a document.
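The standard mathematical machinery behind LSI is a singular value decomposition of a term-document matrix: keep only the strongest "concepts", then compare documents in that reduced space. Here is a minimal sketch with an invented toy corpus; it is illustrative only, not how any search engine actually stores its index:

```python
# Sketch of latent semantic indexing: factor a term-document count
# matrix with SVD, keep the top-k singular values ("concepts"), and
# compare documents by cosine similarity in the concept space.
import numpy as np

terms = ["holiday", "travel", "insurance", "policy", "python", "code"]
#                  doc0 doc1 doc2
counts = np.array([
    [1,   1,   0],   # holiday
    [1,   0,   0],   # travel
    [0,   1,   0],   # insurance
    [0,   1,   0],   # policy
    [0,   0,   1],   # python
    [0,   0,   1],   # code
], dtype=float)

U, s, Vt = np.linalg.svd(counts, full_matrices=False)
k = 2
docs = (np.diag(s[:k]) @ Vt[:k]).T   # each row: one document in concept space

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# doc0 ("holiday travel") and doc1 ("holiday insurance policy") end up
# sharing a concept; doc2 ("python code") is off-topic.
print(cosine(docs[0], docs[1]))  # high
print(cosine(docs[0], docs[2]))  # near zero
```

The point for SEO is that doc0 and doc1 score as related even though they share only one word, which is why "theming" a page with related vocabulary matters under such a model.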
Why does Google test new technologies on its largest database first?
The reason the change is occurring on the main site first is that it has the most content, meaning that when you look for patterns in a relational database, the larger your initial data set, the better the results you will be able to extract.
After the results are extracted, it is then easy to apply the results of the research to the other search locations. Some of the short-term sloppy results may actually be necessary for Google to compare data and fine-tune its algorithm. It may even compare data retrieved from search results.
What Did Google Do?
I think off the start Google implemented a keyword filter on some phrases. The setpoint level is based upon many dynamic factors and could even be determined by the search results.
During the Florida Update I noticed that I was able to get some sites listed (which were filtered out) just by splitting up the words and lowering the keyword density. I have noticed that this update seems to be even less lenient.
I do not believe the change is just a filter; rather, I believe they use a strong filter in combination with other technologies. After they start to collect data, the filter becomes more and more lenient. To see what the search results would be like without the "filter", use allintext: before your search. This will show results similar to the pre-Austin dance. I would not grieve over those results though, as they will again be changing as the database settles out.
I believe as Google collects data from the search results, whatever type of filter they are using will slowly be lifted and replaced with data from the search results.
What Can I do to Help My Site Out?
If you are a purely commercial website, Google is trying to push you to use their ad system. If you still do not want to use the ad system, you will need to accumulate inbound links from quality websites.
A good place to start is by registering your site with some of the top web directories. In addition you will want to look for directories specific to your field.
Where else can I get links?
You can get links from anywhere, but it appears that topic-related links are becoming more important. The best way to get links is to have content that other people like enough that they just want to link to it.
I hate to be the person who says focus on the content, but as search engines get more advanced it will be harder to manipulate them. Pick a topic that is tight enough that you can be the best at it, and pick a topic you are interested in, as it will be far easier and more enjoyable to create.
If you can get people to want to link to you because you have valuable information, especially if they link without you even asking...then you have a great competitive advantage against most of your competitors. If you need any help with promoting your website feel free to give me a call or shoot me an email.
- by Aaron Wall
Change In the DNA 10:
If you’re a forum regular, you know by now that Google’s results are changing again. I hope your rankings improve.
While this update has not yet fully rolled out to all the datacenters, you can search the new Google here: http://188.8.131.52/.
What difference do you notice?
So what’s different now? Andy Beal said that it “looks like they have tweaked things to include a lot more ‘authority’ sites that were previously not included.” This speaks to the Stepforth announcement that Google had increased the number of links that they recognize.
Daniel Brandt from over at Google-Watch.org said that he’s seeing “lots of Austin type results in many categories.” About the change in results he said “It’s too early [to tell what the difference is], or not a very impressive update.”
“In this case the change is not as significant as Florida or Austin. I don’t think it will be as sticky,” he concluded.
GoogleGuy, a Google spokesperson who posts at WebMasterWorld, verified the update and gave a rough timeline for its roll out, “I just talked to somebody else at Google. Sounds like 64.x.x.x is indeed the wave of the future. They did say that it may roll out over several days instead of being done over the weekend though.”
In answer to a poster’s question on whether the Brandy update was a move back to pre-Florida results GoogleGuy said, “we’ve definitely been working to incorporate new signals of quality and improve the way that we rank pages, so the results at the 64.x.x.x data center are not a rollback or pre-Florida results–it’s several steps forward based on new ways of gauging quality and relevance.”
Visit this Google datacenter: http://184.108.40.206/ and post any differences you notice.
Garrett French is the editor of iEntry’s eBusiness channel. You can talk to him directly at WebProWorld, the eBusiness Community Forum.
The Google update of the 17th-20th February 2004 (nicknamed ‘Brandy’ by WebmasterWorld) resulted in major changes in the results the search engine returns.
The ‘Brandy’ update seems to have incorporated some pre-’Florida’ results (another major update that occurred at the end of 2003), mixed with numerous new factors. Google stores its index on a number of data centers around the world. Since ‘Florida’, some of the old data centers were taken offline, and pundits believe that Google has kept the old SERPs (Search Engine Results Pages) in a preserved state for the last few months.
Indeed, Google brought these data centres back at the same time that Yahoo! broke from Google, in favour of its new Inktomi-based results. Consequently, I don’t think this is the last of the major changes we’ll see in Google, but it does seem that Google is getting closer to what it aims to achieve.
Five Changes
Sergey Brin, one of the founders of Google, recently said:
Google has made five significant changes to its algorithmic formulas in the last two weeks.
(Associated Press (AP), Feb 17th 2004)
While we can only guess at what those changes were, the following are probably a good bet.
- Increase in Index Size
Google’s spider, Googlebot, has had a busy few weeks — at the time of the update, Google announced that it had massively increased the size of its index.
This move was probably made to ensure Google made headlines at the same time as Yahoo! (for example, in this report in the BBC News, Feb 18th 2004). However, in order to increase the index size, Google may have had to re-include some of the pre-Florida results that had previously been dropped.
- Latent Semantic Indexing (LSI)
This is a very significant new technology that Google has always been interested in, and the incorporation of LSI has been on the cards for some time. If you are an insomniac, then Yu et al.’s paper is quite helpful in explaining the concept, but, in short, LSI is about using close semantic matches to put your page into the correct topical category.
It’s all about synonyms. LSI may see Google effectively remove all instances of the search keyword when analysing your page, in favour of a close analysis of other words. For example, consider the search term ‘travel insurance’. LSI-based algorithms will look for words and links that pertain to related topics, such as skiing, holidays, medical, backpacking, and airports.
- Links and Anchor Text
Links have always been the essence of Google, but the engine is steadily altering its focus. The importance of PageRank (PR), Google’s unique ranking system, is being steadily downgraded in favour of the nature, quality, and quantity of inbound and outbound link anchor text. If PR is downgraded, and the wording of inbound links is boosted, this may explain, to a large degree, the positions in which many sites currently find themselves.
For example, most people will link to a site’s homepage. In the past, due to internal linking structures, PR was spread and other pages benefited. Now, it is more important for Webmasters to attract links that point directly to the relevant pages of their sites, using anchor text that’s relevant to the specific pages.
Furthermore, Google seems to be using outbound links to determine how useful and authoritative a site is. For example, the directories that are doing well are those that link directly to sites, rather than using dynamic URLs.
Now, more than ever, the question of who’s linking to your site has become critical. Links must be from related-topic sites (the higher the PR the better); those links are seen to define your ‘neighbourhood’.
If we again consider the example of travel insurance, big insurance companies might buy links on holiday-related sites in order to boost their ranking. These businesses will actively invest in gaining targeted inbound links from a broad mix of sites. Consequently, their neighbourhoods appear tightly focused to Google.
- Downgrading of Traditional Tag-Based Optimisation
Clever use of the title, h1, h2, bold, and italics tags, and CSS, is no longer as important to a site’s ranking as it once was. It is very interesting to listen to Sergey (co-founder of Google) talk about this, because he’s the one usually quoted about the ways in which people manipulate his index. Google has taken big steps to downgrade standard SEO techniques in favour of LSI and linking, which are far less manipulable by the masses.
The Impact of Brandy
These changes make for sober reading if you’re a Webmaster: optimizing your site successfully for Google has become a lot more difficult. Nevertheless, there are a number of practical steps that can be taken to promote your ranking in the short and long term.
- Synonyms
As LSI appears to be so significant, it is important to start looking carefully at the information architecture of each major section of your site, and to increase the use of related words. It is also important to re-examine the title tags to include this concept; good title tags have synonyms and avoid repetition of the key phrase.
- Outbound Links
Link to authority sites on your subject. In the travel insurance example, these authority sites could include places like the State Department, major skiing directories, etc. Not only will this help with LSI, it also allows Google to define the neighbourhood more easily. Furthermore, you could engage in link swaps with other companies so that you gain the benefit of an on-topic, LSI-friendly link.
- Inbound Links and Link to Us Pages
Based on what we have just said, sites need to formulate a link development strategy. A budget needs to be set aside to buy links and develop mini-sites. Look to set up links with university sites (.edu or .ac.uk), as these seem to be valuable given Google’s informational bias.
Each section of a site should have its own link-to-us page. For example, HotScripts, the major computer script directory, has a great link-to-us page.
By providing people with creatives and cut-and-paste HTML, you can vastly improve your chances of attracting reciprocal links to your site. You’ll need to have a separate page for each section, to maximise on-topic inbound links.
It is important to develop separate mini-sites (also known as satellite sites) for each key subject of your Website. This is a useful tactic that improves your chances of appearing in the SERPs for your keywords. Furthermore, as the last three Google updates have shaken things up so much, having more than one site reduces the likelihood that your business will be disrupted by the engine’s updates. However, Google is likely to view satellite sites as spam, so you must take some steps to reduce the chances of your being blacklisted on this basis.
First, make it as hard as possible for Google to detect host affiliation between your main site and its mini-sites. Google may consider sites to be owned by the same person if the first three octets of the sites’ IP addresses are the same (e.g. 123.123.123.xxx). Therefore, if you’re going to run mini-sites, put them on different Web hosts.
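The "first three octets" test described above is the same thing as asking whether two hosts share a /24 (Class C) block. A quick sketch of the check, using made-up addresses:

```python
# Two IPv4 addresses share their first three octets (the same /24
# block) exactly when their top 24 bits match, i.e. the integer
# values agree after dropping the last octet.
import ipaddress

def same_class_c(ip_a, ip_b):
    a = ipaddress.ip_address(ip_a)
    b = ipaddress.ip_address(ip_b)
    return int(a) >> 8 == int(b) >> 8

print(same_class_c("123.123.123.5", "123.123.123.200"))  # True
print(same_class_c("123.123.123.5", "123.123.124.5"))    # False
```

Under the heuristic described above, the first pair would look affiliated and the second would not.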
Secondly, use different domain names for your mini-sites, rather than sub-domains of your main site. In the past, Google has not penalised sub-domains, but the early results from the Brandy update show a considerable reduction in the presence of sub-domains in the SERPs.
Finally, be very careful with the linking strategy you use between mini-sites — Google will look at the linking structure very critically. Don’t plaster each of your sites with links to the others, and don’t reciprocate links between the sites.
Mini-sites make it easier to create on-topic neighbourhoods and experiment with LSI techniques. Creating a large network can be a means to boost your main site’s rank, but make sure you’re well aware of the risks involved with creating these mini-sites before you embark.
Use Brandy to your Advantage!
Google optimisation is now a lot harder than it used to be. However, the index is still manipulable. Success involves hard work, and potentially the expenditure of funds to develop a good mini-site network and buy links on relevant pages.