Managing Googlebot/Bingbot Exclusions in Security JavaScript without Impacting SEO
I need to add an important security-related JavaScript snippet to my HTML pages that detects a few signals, such as the presence of Selenium variables in the window/document objects. Once something is detected, a request is sent to my backend to capture this data.
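For concreteness, here is a minimal sketch of the kind of check I mean. The properties probed below are a few commonly known Selenium/PhantomJS markers, not my full 20+ signal list, and the `/bot-signal` endpoint is just a placeholder:

```js
// Minimal sketch of the detection script. The property names checked and
// the /bot-signal endpoint are illustrative, not the complete signal set.
(function () {
  const signals = [];

  // Common automation markers exposed by Selenium/WebDriver.
  if (navigator.webdriver) signals.push("navigator.webdriver");
  if (document.__selenium_unwrapped) signals.push("__selenium_unwrapped");
  if (document.__webdriver_evaluate) signals.push("__webdriver_evaluate");

  // PhantomJS leaves these on the window object.
  if (window.callPhantom || window._phantom) signals.push("phantom");

  if (signals.length > 0) {
    // Report the detected signals to the backend without blocking the page.
    navigator.sendBeacon(
      "/bot-signal",
      JSON.stringify({ signals, url: location.href })
    );
  }
})();
```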
Googlebot/Bingbot may also emit some of these signals (I am tracking 20+ signals), and these bots make thousands of visits to my various web pages, so I want to avoid executing the script for them.
1. If I use the user agent, either on the backend to exclude this script entirely for Googlebot or on the frontend to skip executing it (see the sketch after this list), is that safe for my SEO? Could Google penalize me on the assumption that the script is being used for cloaking?
2. How do bot detection companies like Human Security (PerimeterX) manage this? Do they track Googlebot activity as well?
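For reference, this is the front-end gating option from question 1 that I am considering, as a minimal sketch. The script path `/js/security-signals.js` is a placeholder, and I am aware that user-agent strings are trivially spoofable, so this would only skip well-behaved crawlers:

```js
// Skip loading the detection script when the user agent claims to be a
// known crawler. The token list here is illustrative, not exhaustive.
const CRAWLER_UA = /Googlebot|bingbot/i;

if (!CRAWLER_UA.test(navigator.userAgent)) {
  // Only inject the detection script for visitors that do not
  // self-identify as Googlebot/Bingbot. Hypothetical script path.
  const s = document.createElement("script");
  s.src = "/js/security-signals.js";
  s.async = true;
  document.head.appendChild(s);
}
```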