In a Web world dominated by change, dynamism, interactive communication channels, social networks with features not even imagined two years ago, and blogs that lead the way in online publishing, do static websites still stand a chance?
Hardly. For the search engines – even back in the days when SEO was easy and meant little more than adding keywords to the keywords meta tag – it has always been about the content.
Websites didn’t necessarily need quality content to rank high in the SERPs, but they needed some kind of content – preferably in large amounts, often redundant, and in many cases irrelevant.
This lack of quality, together with outright spam (black hat SEO), led to the rise of splogs and scraper sites, which lowered the quality of the results the search engines returned for a query.
To keep their users satisfied, the search engines had to change their algorithms frequently, and SEO had to change with them, becoming a real “science” of the Web. The quantity aspect of optimizing a static website hasn’t changed that much, but the search engines got smarter, and the focus is now on quality and freshness.
So what can static websites do to keep up with the new realities? The answer is simple, and many SEOs have given it over the past four years: focus on structure and develop a flexible website that can be updated easily and regularly with fresh, unique, high-quality content. Today, content no longer means text alone; it also includes visual media (video, pictures, graphics) and sound, and every form of it needs optimization.
A correctly configured, Google-compatible sitemap – one that tells the spiders how often the site is updated, which pages are the most important, and so on – lets the bots know when it is the right time to crawl the site for updates.
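As a minimal sketch of what such a sitemap looks like, the snippet below uses Python’s standard library to generate an XML sitemap in the sitemaps.org format that Google accepts, with `changefreq` and `priority` hints per page. The URLs, frequencies, and priority values here are hypothetical examples, not recommendations.

```python
# Sketch: generate a minimal sitemap.xml with changefreq/priority hints.
# All URLs and values below are hypothetical placeholders.
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(pages):
    """Build a sitemap XML string from (loc, changefreq, priority) tuples."""
    urlset = ET.Element("urlset", xmlns=SITEMAP_NS)
    for loc, changefreq, priority in pages:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
        # Hint at how often this page tends to change (e.g. daily, monthly).
        ET.SubElement(url, "changefreq").text = changefreq
        # Hint at this page's importance relative to the rest of the site
        # (0.0 to 1.0); search engines treat both fields as hints only.
        ET.SubElement(url, "priority").text = priority
    return ET.tostring(urlset, encoding="unicode")

pages = [
    ("https://www.example.com/", "daily", "1.0"),
    ("https://www.example.com/about", "monthly", "0.5"),
]
print(build_sitemap(pages))
```

The resulting string can be saved as `sitemap.xml` at the site root and submitted to the search engines; note that `changefreq` and `priority` are advisory hints, not commands the crawlers are obliged to follow.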