- Published 3rd September 2013
SEO Industry Updates for August 2013
Mobile Page Speed Now a Ranking Factor
As predicted, we saw Google announce a new page speed signal for smartphones. With a strong hint that mobile pages should load in under one second, Google published a new set of guidelines and an updated PageSpeed Insights tool to help webmasters optimize their mobile pages for rendering performance.
At first glance this seems like a pretty tough target; however, Google’s guidelines largely focus on rendering the above-the-fold content within one second whilst allowing the rest of the page to load and render in the background. The above-the-fold HTML, CSS, and JavaScript are collectively known as the critical rendering path, and this is where webmasters should focus their efforts. Google specifically mention the following areas as ways of achieving this sub-second target:
- Server must render the response
- Number of redirects should be minimized
- Number of roundtrips to first render should be minimized
- Reserve time for browser layout and rendering (200 ms)
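The guidelines above can be illustrated with a minimal sketch (the file names and styles are our own placeholders, not Google’s example): a page that inlines the small amount of CSS needed for above-the-fold content and loads everything else without blocking, so the first render does not wait on extra roundtrips.

```html
<!DOCTYPE html>
<html>
<head>
  <title>Example page</title>
  <!-- Inline only the CSS needed for above-the-fold content,
       avoiding an extra blocking roundtrip before first render -->
  <style>
    .hero { font: 16px/1.4 sans-serif; }
  </style>
</head>
<body>
  <div class="hero">Above-the-fold content renders first</div>

  <!-- Fetch the remaining CSS after the initial paint (hypothetical file) -->
  <script>
    var link = document.createElement('link');
    link.rel = 'stylesheet';
    link.href = '/css/full-styles.css';
    document.head.appendChild(link);
  </script>
  <!-- async keeps this script from blocking parsing and rendering -->
  <script src="/js/app.js" async></script>
</body>
</html>
```

The key idea is that nothing above the fold depends on an external blocking stylesheet or script, leaving the browser free to lay out and paint the visible content within its roughly 200 ms rendering budget.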
You can read more over at the Webmaster Central blog below.
In-Depth Articles: Google take another step in rewarding high-quality content
Pandu Nayak, the engineer behind Google’s higher-quality sites algorithm (AKA Panda), announced a new type of enhanced result this month for what Google refer to as ‘in-depth articles’. He goes into more detail about some of the technical aspects that can help publishers achieve this type of result. However, we suspect that the real trick will be how to ‘create compelling in-depth content’. This will inevitably be a significant factor in whether these results are actually displayed or not.
Publishers are encouraged to:
- use schema.org “article” markup,
- provide authorship markup,
- use rel=next and rel=prev for paginated articles (also watch out for common rel=canonical mistakes),
- provide information about your organization’s logo,
- and of course, create compelling in-depth content.
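As a rough illustration of the technical recommendations above (the URLs, names and profile ID are placeholders, not a definitive template), a paginated article page might combine this markup:

```html
<!-- Paginated article, page 2 of 3 -->
<link rel="prev" href="http://example.com/article?page=1">
<link rel="next" href="http://example.com/article?page=3">

<div itemscope itemtype="http://schema.org/Article">
  <h1 itemprop="headline">An In-Depth Look at Something</h1>
  <!-- Authorship: rel=author pointing at the writer's Google+ profile,
       as Google recommend at the time of writing -->
  <a rel="author" href="https://plus.google.com/123456789">Jane Author</a>
  <div itemprop="articleBody">The article text goes here…</div>
  <!-- Organization and logo information for the publisher -->
  <div itemprop="publisher" itemscope itemtype="http://schema.org/Organization">
    <meta itemprop="name" content="Example Publisher">
    <link itemprop="logo" href="http://example.com/logo.png">
  </div>
</div>
```

The rel=prev/next links belong in the page’s head and tell Google the pages form one logical article, while the schema.org microdata identifies the headline, body, author and publisher for the enhanced result.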
What this does illustrate, however, is the importance of taking advantage of the technical markup that supports Google’s enhanced results. This is a trend that is going to continue, with other notable search engines also announcing visual enhancements this month. When you also consider that there are over 200 different types of schema.org markup, the opportunities in Google alone will be significant.
For the moment, in-depth article results are only rolling out on google.com in English, but we would anticipate a wider release for English-speaking countries in the coming months.
Google’s Manual Actions viewer: What may lie behind the obvious?
For a long time, many webmasters have wanted a clear indication of whether their sites have been hit by a manual webspam action from Google, and it was a common request for Google to communicate more actively and specifically about what could be hurting a website’s performance in search rankings. Google has now satisfied this desire by introducing a new feature called Manual Actions in Google Webmaster Tools, and many webmasters are relieved that they finally have a clearer roadmap for cleaning up such issues on their websites.
Google describes it as follows: “The Manual Action viewer in Webmaster Tools shows information about actions taken by the manual webspam team that directly affect that site’s ranking in Google’s web search results.”
To try out this new feature, go to Webmaster Tools > Search Traffic > Manual Actions. It’s likely that you won’t see any notice there, as only around 2% of domains are affected by such an action.
Two types of actions are displayed on the Manual Actions page.
- Site-wide matches: For actions that impact an entire site.
- Partial matches: For actions that impact individual URLs or sections of a site.
Moreover, the new feature makes it simpler to submit a reconsideration request by clicking “Request a review”, although this option is only available if a manual action has been applied to the site.
However, there is another implication following the introduction of this feature. Recently we have been monitoring some sources in the industry which revealed a similar, very interesting pattern in two cases of a manual action by Google. In each case, Google provided two examples of links violating their guidelines: one was an advertorial/sponsored post, and the other was a spammy third-party domain or poor-quality directory. Many webmasters might assume that links like the latter would simply be discounted by Google, but it seems that this is not happening.
Along with Google’s recent updates to its Link Schemes page concerning guest posts, advertorials and press releases, it’s certain that many site owners will reconsider what has to be flagged as potentially harmful, and what does not, in their future link audits.