Spider
To import your products and keep them up to date on our website, we use a spider that periodically visits your website to gather your pricing, stock, and product specifications. Usually this happens once every 24 hours, but it may happen more or less often depending on how frequently your website changes. The pause between each HTTP request defaults to 1 second, but can be adjusted using Crawl-delay as explained below. Naturally, we don't want to spider you against your will: we use fixed IP addresses and identifying user agents, and we respect your robots.txt. If you have any remaining questions or concerns, please don't hesitate to contact us.
IP addresses and user agents
All our spiders use the user agent Mozilla/5.0 (compatible; ServerHunterSpider/1.1; +https://www.serverhunter.com/spider/). Please note that the version 1.1 in the user agent can change when we make updates.
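Because the version number in the user agent can change, server-side checks should not pin it exactly. A minimal sketch in Python (the regex and function name are our own illustration, not part of any Server Hunter tooling):

```python
import re

# Matches the spider's user agent regardless of version,
# e.g. "ServerHunterSpider/1.1" or a future "ServerHunterSpider/2.0".
SPIDER_UA = re.compile(r"ServerHunterSpider/\d+(\.\d+)*")

def is_server_hunter(user_agent: str) -> bool:
    """Return True if the User-Agent header belongs to the Server Hunter spider."""
    return bool(SPIDER_UA.search(user_agent))
```

For example, `is_server_hunter("Mozilla/5.0 (compatible; ServerHunterSpider/1.1; +https://www.serverhunter.com/spider/)")` returns `True`.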
Requests are always made from one of these IP addresses:
212.47.237.88
2001:bc8:628:1708::1
159.69.95.139
2a01:4f8:c2c:1c15::1
188.34.195.233
2a01:4f8:1c1c:19e7::1
You can also retrieve this list from https://www.serverhunter.com/spider/ips/, resolve the round-robin DNS record spiders.serverhunter.com, or whitelist based on reverse DNS: the reverse DNS hostname of our IP addresses will match spider*.serverhunter.com.
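Whitelisting by reverse DNS is typically done as a forward-confirmed reverse DNS (FCrDNS) check: look up the hostname for the connecting IP, verify it matches spider*.serverhunter.com, then resolve that hostname forward and confirm it maps back to the same IP. A sketch in Python; the function names are our own:

```python
import fnmatch
import socket

def hostname_is_spider(hostname: str) -> bool:
    """True if a reverse-DNS hostname matches spider*.serverhunter.com."""
    return fnmatch.fnmatch(hostname.lower().rstrip("."), "spider*.serverhunter.com")

def verify_spider_ip(ip: str) -> bool:
    """Forward-confirmed reverse DNS check (requires live DNS)."""
    try:
        hostname, _, _ = socket.gethostbyaddr(ip)  # reverse lookup
    except socket.herror:
        return False
    if not hostname_is_spider(hostname):
        return False
    try:
        forward = socket.getaddrinfo(hostname, None)  # forward lookup
    except socket.gaierror:
        return False
    # Confirm the hostname resolves back to the connecting IP.
    return any(info[4][0] == ip for info in forward)
```

The forward confirmation step matters: reverse DNS alone can be set to any value by whoever controls the IP block, but only the domain owner controls what spider*.serverhunter.com resolves to.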
How to allow our spider to access your website
By default, we should have no issues accessing your website. However, please make sure that:
1. If you use any firewalls or services like CloudFlare, you whitelist the IP addresses listed above.
2. If you use a robots.txt file, our user agent is allowed access by appending this to your robots.txt file:
User-agent: ServerHunter
Allow: /
How to slow down our spider
By default, our spider pauses for 1 second between each HTTP request. This should be more than enough for most websites. If you want to increase or decrease this delay, append the following to your robots.txt file, where 10 is the number of seconds:
User-agent: ServerHunter
Crawl-delay: 10
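You can check how a standards-following parser reads this directive with Python's built-in urllib.robotparser (a sketch for illustration; the spider itself may parse robots.txt differently):

```python
from urllib import robotparser

# The robots.txt snippet from above, as a string.
robots_txt = """\
User-agent: ServerHunter
Crawl-delay: 10
"""

rp = robotparser.RobotFileParser()
rp.parse(robots_txt.splitlines())

# crawl_delay() returns the delay in seconds for the given user agent.
print(rp.crawl_delay("ServerHunter"))  # prints 10
```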
How to block our spider from your website
We would appreciate it if you could contact us to investigate and resolve your issue rather than blocking our spider entirely. However, if our spider is misbehaving and you need to block it urgently, put this in your robots.txt file:
User-agent: ServerHunter
Disallow: /
Please note that the spider caches the robots.txt file for 1 hour, so your changes will not take effect immediately.