Even though the numbers in Google Analytics shouldn’t be trusted 100%, they certainly give a big picture of what’s happening with your website. On traffic specifically, they offer great detail about how much traffic is coming to your site, where it comes from, and its quality. Lately, bot traffic has become an increasing problem, because it leads to obvious skews in the data.
Why does bot traffic cause problems? Bot traffic is essentially a wave of automated visits to your website. Because the visits don’t come from real people, the traffic has no real value, and at worst it’s malicious. Your traffic stats may look high, but bot visits negatively affect many of your other stats.
Bots alter website bounce and conversion rates
Your website’s bounce rate is an important performance indicator and can explain a few things. A low bounce rate can be interpreted to mean your audience is engaged and your site is attracting relevant, good-quality traffic. A high bounce rate, on the other hand, can mean there is a problem with the quality of the traffic your website is getting, or that a serious aesthetic or usability issue is causing visitors to doubt your site and leave immediately.
Incapsula reported that in 2014, an estimated 56% of all website traffic was bot traffic. Think about that number for a moment: it could mean more than half of your website’s visitors are bots rather than real humans, and that could be why your site’s conversion rate is low. Bot visits hit your website with a 100% bounce rate and a 0-second visit duration, and they never complete a goal. This drags down your overall bounce and conversion rates, so your website’s performance looks worse than it really is.
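To see how big the skew can get, here is a quick back-of-the-envelope sketch in Python. Everything except the 56% bot share is a made-up assumption for illustration.

```python
# Back-of-the-envelope illustration of how bot traffic inflates bounce rate.
# The 56% bot share comes from the Incapsula figure above; the session
# count and the 40% human bounce rate are made-up assumptions.

total_sessions = 10000
bot_sessions = int(total_sessions * 0.56)  # bots bounce 100% of the time
human_sessions = total_sessions - bot_sessions

human_bounce_rate = 0.40                   # assumed bounce rate of real visitors
human_bounces = human_sessions * human_bounce_rate

# The reported rate blends real bounces with the bots' guaranteed bounces.
reported_bounce_rate = (human_bounces + bot_sessions) / total_sessions

print(f"Reported bounce rate: {reported_bounce_rate:.0%}")   # ~74%
print(f"Actual human bounce rate: {human_bounce_rate:.0%}")  # 40%
```

In other words, a perfectly healthy 40% human bounce rate can show up in your reports as roughly 74%.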
How to exclude bot traffic from your Google Analytics reports
New bots emerge all the time, and it’s really difficult to reliably defend against all of them. You can block some or all bots from visiting your website in the first place using your .htaccess file, but an effective way to deal with the problem is to filter bot visits out of Google Analytics using its filters. The Google Analytics update that rolled out last July lets you exclude all known bots from your stats with a single click.
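As a sketch of the .htaccess approach, the Apache mod_rewrite rules below would refuse requests from matching user agents. The bot names here are placeholders, not a real blocklist; substitute the user agents you actually see in your server logs.

```apache
# Minimal sketch: return 403 Forbidden to user agents matching known bots.
# "BadBot" and "EvilScraper" are placeholder names, not real bots.
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} (BadBot|EvilScraper) [NC]
RewriteRule .* - [F,L]
```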
One thing worth pointing out is that the ‘exclude all’ option does not consistently filter out every bot, so you can use both filtering methods: individual filters and exclude all. Using both means you are less likely to see skewed numbers, but be aware that your website’s traffic may appear lower once these methods are in place.
How to use the ‘exclude all bots’ method
- Sign in to your Google Analytics account
- Go to the ‘All Web Site Data’ menu
- Click ‘View settings’
- Scroll down to the bottom until you see the ‘Bot Filtering’ option
- Tick the ‘Exclude all known bots and spiders’ box
- Click ‘Save’
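If you manage many views, the same checkbox can also be flipped programmatically. The sketch below uses the Google Analytics Management API (v3), whose view (profile) resource exposes a botFilteringEnabled field; the key file path and the account, property, and view IDs are placeholders, and it assumes the google-api-python-client and oauth2client packages are installed.

```python
# Sketch: enable 'Exclude all known bots and spiders' on a view through
# the Google Analytics Management API v3. All IDs below are placeholders.
from googleapiclient.discovery import build
from oauth2client.service_account import ServiceAccountCredentials

credentials = ServiceAccountCredentials.from_json_keyfile_name(
    'service-account.json',  # placeholder path to your API key file
    scopes=['https://www.googleapis.com/auth/analytics.edit'])

analytics = build('analytics', 'v3', credentials=credentials)

# Patch the view: botFilteringEnabled is the API-side name of the
# 'Bot Filtering' checkbox in the view settings.
analytics.management().profiles().patch(
    accountId='12345',           # placeholder account ID
    webPropertyId='UA-12345-1',  # placeholder property ID
    profileId='67890',           # placeholder view (profile) ID
    body={'botFilteringEnabled': True}
).execute()
```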
How to filter individual bots
- Go to ‘Admin’
- Under ‘All Web Site Data’, click ‘Filters’
- Click ‘+ New Filter’
- Check ‘Create new filter’
- Give the new filter a name
- Under filter type, choose ‘Custom’
- Select ‘Exclude’ and choose ‘Referral’ from the ‘Filter Field’ dropdown
- In ‘Filter Pattern’, enter the name of the bot you want to exclude/block (see the example pattern after this list)
- Click ‘Save’
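The Filter Pattern field takes a regular expression matched against the referral source. As an example, the pattern below would exclude two referrer spam domains that were widely reported around this time; treat it as a starting point to adapt, not a complete blocklist:

```
semalt\.com|buttons-for-website\.com
```

Keep in mind that filters only apply to data collected after the filter is created, so past bot visits will remain in your historical reports.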