Last updated: 4 months ago
Let's face it: the majority of Internet traffic is bots. Post a link on Twitter and at least 20 bots will follow it before any real user even discovers your post. That is not necessarily a bad thing, especially when the software you use is smart enough to distinguish bots from real users.
We use an ever-growing library to detect a long list of standard bots (crawlers, search engines, and so on). On top of that, we added a custom layer to detect bots that are not yet widely known, as well as those that misbehave.
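The two-layer approach could be sketched roughly like this (a minimal, hypothetical illustration; the pattern lists and function names here are assumptions, not the actual detection library):

```python
import re

# Assumed sample of signatures for widely known crawlers (first layer).
KNOWN_BOT_PATTERNS = [
    r"Googlebot", r"bingbot", r"Twitterbot", r"facebookexternalhit",
]

def is_known_bot(user_agent: str) -> bool:
    # First layer: match against the library of standard, known bots.
    return any(re.search(p, user_agent, re.IGNORECASE)
               for p in KNOWN_BOT_PATTERNS)

def is_suspected_bot(user_agent: str) -> bool:
    # Second, custom layer: generic automation hints that catch
    # bots not (yet) on the known list.
    return bool(re.search(r"bot|crawler|spider|scraper|curl|python-requests",
                          user_agent, re.IGNORECASE))

def classify(user_agent: str) -> str:
    if is_known_bot(user_agent):
        return "known bot"
    if is_suspected_bot(user_agent):
        return "suspected bot"
    return "likely human"
```

Real-world detection also weighs signals beyond the User-Agent header (request rate, IP reputation, behavior), since user agents are trivially spoofed.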
This makes it easy to see what your traffic actually consists of and where it comes from: automated scripts, crawlers, real users, and more.