The Melbourne SEO Meetup for March 2015 kicked off at a new venue and on a new day.

Now located at Honey Bar in South Melbourne (right across the street from the previous venue, the Limerick Arms Hotel), the Meetup will now run on the first Tuesday of every month, kicking off in style on the 3rd of March, 2015.

I arrived a little late, so I missed the usual intro to SEO from Chris Burgess and the beginning of Saijo's news, but here is a recap of the night from the moment I got there.

SEO News Roundup for February 2015

Saijo George (Twitter) from Envato gave his usual monthly round-up of the latest news in Search Engine Optimisation.

His full presentation is available below:

Because I was late, I missed the beginning and didn't take notes until Slide 11.

Fortunately, I wasn't lost, because two very pertinent news items I was acutely aware of were the introduction of the 'slow' label on mobile search results and the official announcement that mobile-friendliness becomes a ranking factor on April 21st.

Both are worth noting as part of Google's pursuit of rewarding best practices in mobile-friendly web design.

One part of the presentation that I missed, but which did pique my interest, was the use of wildcards in Google Suggest (Slide 7). The underscore symbol seems to be the ticket, though why it isn't an * is beyond me.

Saijo was as good as always, engaging with the audience by asking whether anyone had used or was aware of each news item and could provide insights.

Peter Mead chimed in with his personal experience of using Bing Ads - the overall volume is lower than Google's, but it is still a very viable way of generating traffic.

I was also happy to learn that Bing offers a $40 coupon for Australians to get started with their platform.

Page Speed & SEO

Abbas Heidari was up first to give his presentation on improving Page Speed for SEO.

Why should we care?

Both Google and Bing use page speed as a ranking factor for their respective search engine results pages.

Visitors spend less time on slow websites and if search engines are to do their job properly, they must serve the search intent to the best of their capabilities.

Abbas shared some stats that help put things into perspective:

  • 79% of all web shoppers say they will never return to a slow website.
  • 44% of these users will tell their friends not to use a slow website.

One experiment that Abbas shared was Shopzilla improving site load time from ~6 seconds to 1.2 seconds, which helped increase revenue by 5-12%.

Another case study Abbas showcased was Walmart's focus on improving page speed. For the slowest 5% of users, Walmart pages took ~24 seconds to load. Walmart set achievable SLAs and revisited their optimisation efforts monthly, with a strong focus on page processing time first (i.e. the 80% rule, or in Walmart's case 90%). Walmart were able to wipe off 8 seconds at the 95th percentile, achieving a 2.7s SLA.

These case studies help frame the business case for improving page speed on large scale websites.

Page Load Speed

GTmetrix graphic: why pages are slow

Abbas provided insights on the above GTmetrix diagram, showing how pages load on both the front end and the back end for the user.

Performance Golden Rule

The golden rule of website performance is that 80-90% of end-user response time is spent on the front end - namely large images, excessive HTTP requests and uncompressed files. This was well illustrated in the Walmart case study.

47% of users expect pages to load in two seconds or less

Measuring Page Speed

The most popular tools used to measure page speed are mainly web-based:

  • Pagespeed Insights: The official tool from Google. This would be most pertinent to those in attendance, as Google uses this tool to measure its page-speed ranking factor for SERP placement.
  • Web Page Test: A very good diagnostic tool that captures the waterfall for a website's page load - it has a wide variety of test locations available.
  • GTmetrix: Probably one of the better tools, providing actionable insights so you can quickly identify fixes to improve website performance.

One great browser-based tool not included in the slide is YSlow. Originally developed by Yahoo, YSlow is an open-source browser extension that grades your website against common best practices for improving page speed.

Improving website performance and speed

Abbas got to the meat of his presentation and discussed the main ways to improve the performance and speed of your website.

Use GZIP Compression

Gzip compression is essential for reducing the size of your web page's assets, decreasing the sheer payload the end user needs to download.

In most website installations, a simple edit to your .htaccess file could reduce response size by 70% - and with Gzip compression supported in most modern browsers, this is a no-brainer decision. The CSS-Tricks article on enabling Gzip compression is a great guide - be sure to read the comments for even more tips.
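For reference, here is a minimal .htaccess sketch using Apache's mod_deflate, the usual mechanism behind these guides (the exact MIME types you compress will vary by site):

    <IfModule mod_deflate.c>
      # Compress text-based responses before sending them to the browser
      AddOutputFilterByType DEFLATE text/html text/plain text/css
      AddOutputFilterByType DEFLATE application/javascript application/json
      AddOutputFilterByType DEFLATE image/svg+xml
    </IfModule>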

The presentation itself discussed WordPress plugins for Gzip compression, so be sure to check your development environment and publishing platform for any turnkey solutions.

Clean Up your Data

Another WordPress-related one: this is about cleaning up excessive or inefficient database content to improve page speed.

I raised a slight objection to this - best practice would involve caching your webpages so that anonymous users would, ideally, never touch the database on the initial page load.

Cleaning up the database is a simple process in most CMS installations; however, the performance freaks out there will know that the main bottleneck in database performance is inefficient queries.

Sometimes a stray module or plugin sends a SQL query that unnecessarily requests the whole table, when it could be optimised to reduce the computation required to produce the same results.
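As an illustrative sketch (the table and column names are hypothetical), the difference often looks like this:

    -- Inefficient: fetches every column of every row, leaving filtering to application code
    SELECT * FROM posts;

    -- Better: let the database filter and return only what the page actually needs
    SELECT id, title FROM posts WHERE status = 'published' LIMIT 10;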

Identify Plugins that Decrease Speed

If you're running a CMS, there may be a lot of plugins available to help optimise your website's speed. This part of the talk was WordPress-specific, which is not my area, so I'll share some good ones for Drupal instead:

  • Boost - The Boost module helps cache static web pages for Drupal. It can be configured to re-cache a page on certain actions (i.e. on edit/comment/update).
  • Varnish HTTP Accelerator Integration - Varnish is an advanced and very fast reverse-proxy system. Great for high volume web traffic acceleration.
  • Magic - I don't use Magic much because its functionality is built into the Omega 4 base theme, but Magic can exclude files and libraries and do all sorts of crazy things to take away the pain that some external modules and themes bring to your website.

Tracking Codes, video embeds and share buttons

Lots of third-party tools have JavaScript tracking libraries that really logjam the HTTP requests for your page.

This is particularly dire when these elements render-block critical parts of the page, hurting your perceived load time.

Abbas's advice was to simply pick and choose your external embeds carefully.

Sound advice, but not the advice I would give.

External JavaScript libraries should be deferred, using a library such as Lab.js or Head.js, so that they load after the page has fully rendered.

Key scripts could additionally be inlined to help improve perceived load time - with the increased HTML size negligible after gzip compression.
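A minimal sketch of the idea using the native defer attribute (the libraries above offer finer control, but the principle is the same; the tracker URL is a placeholder):

    <!-- Deferred: downloaded in parallel, executed only after the document is parsed -->
    <script src="https://analytics.example.com/tracker.js" defer></script>

    <!-- Inlined critical script: no extra HTTP request; gzip keeps the added HTML weight negligible -->
    <script>
      // Hypothetical above-the-fold initialisation
      document.documentElement.className += ' js-enabled';
    </script>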

CDN

Using a content delivery network can help with serving static assets to users.

Not only do you get the benefit of parallel HTTP requests by using an external domain, but many CDNs are also designed to serve content from the location closest to the end user, speeding up the process.
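As a sketch, serving static assets from a CDN hostname (cdn.example.com is a placeholder) looks like this in the markup:

    <!-- Assets on a separate CDN domain: parallel requests, served from the nearest edge location -->
    <link rel="stylesheet" href="https://cdn.example.com/css/styles.min.css">
    <script src="https://cdn.example.com/js/bundle.min.js" defer></script>
    <img src="https://cdn.example.com/images/hero.jpg" alt="Hero image">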

Spam/Comments

Cleaning up spam and comments was another piece of advice from the presentation.

User-contributed content is always a hairy prospect, as you by definition have limited control over what users publish (without a moderation process).

Limiting the capabilities of user contributions, removing common footprints from your website and other anti-spam techniques such as Honeypot are widely advised to reduce spam.
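As an illustration of the honeypot technique (field names are hypothetical): an extra field is hidden from humans with CSS, so any submission that fills it in is almost certainly a bot and can be rejected server-side.

    <style>.hp-field { position: absolute; left: -9999px; }</style>
    <form method="post" action="/comment">
      <!-- Real field, visible to humans -->
      <input type="text" name="comment" placeholder="Your comment">
      <!-- Honeypot: hidden from humans; reject the submission if it comes back non-empty -->
      <input type="text" name="website_url" class="hp-field" tabindex="-1" autocomplete="off">
      <button type="submit">Post</button>
    </form>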

Clean Up your Plugins

Again, another WordPress-related one. Coming from the Drupal world, I'd say modules that stagnate website performance are best removed if they are unused.

A good practice is to optimise modules/plugins and remove the bottlenecks - such as an unneeded JS library or uncompressed images - without removing the entire module.

A lot of websites out there have duplicate requests to the same library, so decoupling libraries from your modules is the best practice and a real strength in Drupal.

Use cache, minify CSS, JavaScript and HTML

Caching is vital for serving anonymous traffic statically rendered webpages, so that your website's database and infrastructure are never needed to serve the content.

Cached pages can also be compressed, which is one of the biggest performance benefits on the front end.

Minification of CSS, JavaScript and HTML strips whitespace, comments and the like to make the files as small as possible whilst still being production-ready.

Aggregation is also important, as it combines files into a single file, significantly reducing HTTP requests.
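As a sketch of what this looks like outside a CMS, assuming the common UglifyJS and clean-css command-line tools (the file names are hypothetical):

    # Aggregate and minify JavaScript into a single bundle
    uglifyjs header.js app.js tracker.js -c -m -o bundle.min.js

    # Aggregate and minify CSS the same way
    cleancss -o styles.min.css reset.css layout.css theme.css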

Optimise Images

Images are the biggest speed deterrent for a majority of webpages. Using a library like PNGquant can help compress all your PNGs in one place.

Abbas shared some resources, however I disagreed on his assertion to avoid PNG images unless you need transparency.

8-bit compressed PNGs are wonderfully small and, for flat-colour graphics, of much higher quality than equivalent JPEG images - JPEG's lossy, block-based compression creates visible 'artefacts' around sharp edges.

TinyPNG.com (which works for JPEGs as well) is another great tool, albeit essentially a web-based version of pngquant. Kraken.io is another web-based tool that helps automate the process, which is great for CMS websites.

Whilst libraries like PNGquant are great, the computation they require on your server may actually hurt your website's speed.
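For reference, a minimal pngquant invocation (a sketch; the quality range and paths are illustrative):

    # Convert PNGs to compact 8-bit palettes, overwriting the originals in place
    pngquant --quality=65-80 --ext .png --force images/*.png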

Set Expire Headers

Setting long expiry headers on static assets tells the browser whether to request a file again or simply grab it from its cache.

Not only does this prevent unnecessary downloads, it also reduces HTTP requests, so the browser focuses on loading the content rather than re-fetching other assets.
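A minimal .htaccess sketch using Apache's mod_expires (the lifetimes are illustrative):

    <IfModule mod_expires.c>
      ExpiresActive On
      # Static assets that rarely change can be cached for a long time
      ExpiresByType image/png "access plus 1 month"
      ExpiresByType image/jpeg "access plus 1 month"
      ExpiresByType text/css "access plus 1 week"
      ExpiresByType application/javascript "access plus 1 week"
    </IfModule>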

Discussion

This presentation had great feedback from the audience.

I was disappointed that the presentation had nothing specifically about reducing perceived load time; however, later on the Meetup group page, Matthew Heyes asked how to improve page speed (specifically regarding render-blocking).

I was able to answer his question via Stack Overflow by introducing concepts such as the Critical CSS Path and deferred JS libraries.
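A minimal sketch of the Critical CSS Path idea (the selectors and file names are hypothetical): inline the styles needed for above-the-fold content, then load the full stylesheet without blocking the initial render - one common pattern is shown below.

    <head>
      <!-- Critical CSS inlined: above-the-fold content renders without waiting on a stylesheet request -->
      <style>body { margin: 0; font: 16px/1.5 sans-serif; } .hero { min-height: 60vh; }</style>

      <!-- Full stylesheet applied only once loaded, so it does not block rendering -->
      <link rel="stylesheet" href="/css/styles.min.css" media="print" onload="this.media='all'">
    </head>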

I also shared a link to Perf Planet, which was requested during the talk.

Performance optimisation often involves identifying bottlenecks and chipping away at them; however, for a standard SEO consultant, just understanding the concepts and the plugins would suffice in practice.

Smart Blog Commenting

The next presentation was a good practical guide to smart blog commenting by Chris Finnegan (Twitter) from WP Copilot.

UPDATE: Chris has uploaded his slides, which are now published on SlideShare.

Chris also wrote a blog post expanding this topic:

Smart Blog Commenting Generates Instant Traffic (And Builds Relationships)

Build Referral Traffic

One of the main benefits of strategic blog commenting was gaining referral traffic. Posting insightful, amusing and engaging comments on a high-traffic authority source can help funnel that audience to your website.

Though not as valuable as traffic with search intent, these users are still a great source of traffic that contributes to your website's goals.

Build Relationships

Posting comments is also a great way to build relationships. Every content publisher loves being read and a real blog comment helps validate this feeling.

I like to adhere to the 90:9:1 rule which basically breaks down to:

  • 90% of people consume
  • 9% of people participate
  • 1% create

By being part of the 9% as a blog commenter, you build yourself up as a trusted contact for an influential content publisher (the 1%), which can lead to further opportunities with the publisher and their audience.

Chris cited an example where he worked on an analytical post, contributed to a similar post from an authority source (which I, unfortunately, took no notes of) and was able to get their work cited by the authority publication.

This is a great example of leveraging outreach to build relationships and earn links with relevant, related content.

Another benefit of blog commenting is that it contributes to a natural link profile.

Analysing link distribution in a natural, non-optimised space, you will find that most backlinks to a source are brand/URL links, with a mix of follow and nofollow.

Because you are contributing to the ether of the internet, your backlink profile builds naturally with a consistent blog commenting routine.

Scale Up with the Blog Commenting Toolkit

Manual blog commenting is great, but identifying blog commenting targets at scale requires tool-assistance to reduce research and qualifying time.

Using Scrapebox with a handful of advanced search operators allows you to create a 'hitlist' of blogs to comment on.
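A few illustrative footprint queries of the kind used for this (keyword is a placeholder, and these are common patterns rather than Chris's exact list):

    keyword "leave a comment"
    keyword "powered by WordPress" inurl:blog
    keyword "add a comment" -"comments are closed"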

Scrapebox can also be used for automatic commenting; however, this technique is specifically about smart, insightful blog commenting, with manual, human insight into your target posts.

Establishing a Gravatar profile image helps humanise your comments by putting a face to the content.

Using the Moz Toolbar can help qualify SERPs as blog comment destinations, and increasing the results per page in Google - after disabling Google Instant - is remarkably useful for on-the-spot blog commenting.

Having an RSS reader like Feedly to aggregate high-authority blogs can help you maintain the commenting habit, with the additional benefit of becoming a 'regular' in the community.

Chris also likes commenting with Disqus, but other blog commenting platforms like IntenseDebate and Livefyre can also be used to build a reputable profile.

Discussion about smart blog commenting

Chris' presentation was great and sparked high audience participation, with all sorts of ideas contributed.

Michael Jones from Holidaypoint.com.au also discussed Google+ comments as a method to build authority, but questioned if the comments were indexed.

I was able to chime in that yes, Google+ comments are indeed indexed by Google and therefore pass PageRank. However, there doesn't seem to be a way to comment as a Google+ page, which limits its effectiveness to personal brand authority building.

P.S. The footprint to find Google+ comments is a little unusual, but it works great:

"Add a comment Opening" + keyword

This is due to the pre-rendered state of the comments being indexed. To validate that a URL does indeed have its comments indexed, simply search site:exacturl "Verbatim Comment".

Overall participation for the entire meetup was great, and it definitely makes the Melbourne SEO Meetup my only "Can't Miss" professional networking event each and every month.

I wish that Chris' presentation had more detail on identifying footprint opportunities because, to me, that is an essential SEO skill.

But rather than lament what a topic should have covered, maybe I should put my money where my mouth is and give a presentation myself?

Next Melbourne SEO Meetup

The next meetup is scheduled for the 7th of April, 2015. There are two feature presentations:

  • Google Plus 101: Melinda Sampson is the owner of Click Winning Content and will discuss tips and tricks for Google+ success.
  • LinkedIn SEO Techniques for Personal and Company Profiles: Sue Ellson is an independent LinkedIn Specialist and will be providing insights into how to maximise LinkedIn to help improve pageviews, traffic and return on investment.