Are you excluding bots from your web traffic?

You’re getting plenty of traffic to your website, but very few conversions. You’ve upped your bandwidth, included a clear call-to-action and even invested in a content PR zone to provide industry insights to your customers. But still, nothing.

It’s easy to get caught up analysing web traffic. What made traffic spike last Wednesday? Who’s visiting us at 4am from the Philippines? And what’s so interesting about the contact us page? But try not to get too engrossed.

A common problem with analysing web stats is that a big chunk of your traffic could come from search engine spiders and other bots that have no real interest in engaging with your business.

Bots are the programs that harvest information for search engines, trawling your web pages to determine how relevant your website is and where it should rank in search results.

They’re certainly not a bad thing. In fact, bots are the reason your site appears on Google at all: they work out which search terms your website is relevant to and position it in the results accordingly.

However, to build a more accurate picture of your website traffic, you should exclude bots and spiders from your web statistics report. In Google Analytics, it’s as simple as opening your reporting view settings under Admin and ticking “Exclude all hits from known bots and spiders”.
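If you look after lots of views, you can tick the same box programmatically. Here’s a minimal sketch using the Google Analytics Management API (v3) with a service account; the key file path and the account, property and view IDs are placeholders you’d swap for your own.

```python
from google.oauth2 import service_account
from googleapiclient.discovery import build

# Placeholders: substitute your own service account key and IDs.
KEY_FILE = "service-account.json"
ACCOUNT_ID = "12345678"
PROPERTY_ID = "UA-12345678-1"
VIEW_ID = "98765432"

# The service account needs edit access to the Analytics account.
credentials = service_account.Credentials.from_service_account_file(
    KEY_FILE, scopes=["https://www.googleapis.com/auth/analytics.edit"]
)
analytics = build("analytics", "v3", credentials=credentials)

# This is the same setting as the Admin checkbox: "Exclude all hits
# from known bots and spiders" maps to botFilteringEnabled.
analytics.management().profiles().patch(
    accountId=ACCOUNT_ID,
    webPropertyId=PROPERTY_ID,
    profileId=VIEW_ID,
    body={"botFilteringEnabled": True},
).execute()

print("Bot filtering enabled on view", VIEW_ID)
```

Bear in mind the setting only affects data collected from the moment it’s switched on; historical traffic in the view keeps its bot visits.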

Stone Junction advises all its clients to use Google Analytics to review the performance of their websites. If they don’t know how, we help them set it up. If you’re unsure where to start, get in touch with the team on 01785 225416 or e-mail me directly at laura@stonejunction.co.uk.

Laura England

Stone Junction is a cool technical PR agency based in Stafford. We work for all sorts of businesses, with a particular focus on technology, technical and engineering companies. We like being sent cake and biscuits by clients, journalists and prospects.
