SentiBot is a web crawler that indexes internet users’ posts for the SentiOne platform. The application analyzes over 300,000 domains daily across 7 European countries, and we make every effort to minimize the traffic we generate. If you are here because you’ve noticed bothersome traffic on your site generated by "SentiBot www.sentibot.eu (compatible with Googlebot)", please contact us by filling out the form below.
How can you block SentiBot or reduce its load? If the issue is urgent, don’t hesitate to use the form below and describe the concern.
If the issue is not urgent, we propose the following solution. SentiBot re-checks each domain’s robots.txt file every few days. To exclude the crawler or reduce its traffic via robots.txt, add rules for the user agent “sentibot” (supported directives: Crawl-delay and Disallow). Robots.txt handling is currently in a beta phase.
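As an illustration, a minimal robots.txt sketch using the two supported directives might look like this. The delay value and paths are placeholders, not recommendations; adjust them to your site:

```
# Ask the "sentibot" user agent to wait between requests
# and to skip a section of the site (example path).
User-agent: sentibot
Crawl-delay: 10
Disallow: /private/

# To exclude SentiBot from the entire site instead:
# User-agent: sentibot
# Disallow: /
```

Because robots.txt is re-read every few days, expect a short delay before a new rule takes effect.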
Blocking individual IP addresses is not advised, as the SentiBot crawler runs on many servers and the pool of addresses changes constantly.