BotDetector


BotDetector is a Go library that detects bots, spiders, and crawlers from user agents.

Installation

go get -u github.com/logocomune/botdetector/v2

Usage

Simple usage

userAgent := req.Header.Get("User-Agent")

detector, _ := botdetector.New()
isBot := detector.IsBot(userAgent)

if isBot {
	log.Println("Bot, Spider or Crawler detected")
}

Adding Custom Rules

You can add custom detection rules with the WithRules option. For example:

userAgent := req.Header.Get("User-Agent")

detector, _ := botdetector.New(botdetector.WithRules([]string{"my rule", "^test"}))
isBot := detector.IsBot(userAgent)

if isBot {
	log.Println("Bot, Spider or Crawler detected")
}

Custom rule patterns:

Pattern    Description
"..."      The string contains the pattern.
"^..."     The string starts with the pattern.
"...$"     The string ends with the pattern.
"^...$"    The string exactly matches the entire pattern.

In this example, the custom rules "my rule" and "^test" are added to the existing detection rules.

Adding Cache

You can cache detection results in an LRU cache with the WithCache option, which takes the cache size. For example:

userAgent := req.Header.Get("User-Agent")

detector, _ := botdetector.New(botdetector.WithCache(1000))
isBot := detector.IsBot(userAgent)

if isBot {
	log.Println("Bot, Spider or Crawler detected")
}
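The idea behind the cache is to memoize results per user-agent string so repeated lookups skip the rule scan, evicting the least recently used entry when full. A minimal LRU sketch in stdlib Go (an illustration of the concept, not the library's implementation):

```go
package main

import (
	"container/list"
	"fmt"
)

// entry pairs a user-agent string with its cached detection result.
type entry struct {
	key   string
	isBot bool
}

// lru is a fixed-capacity least-recently-used cache.
type lru struct {
	capacity int
	order    *list.List               // front = most recently used
	items    map[string]*list.Element // key -> list element
}

func newLRU(capacity int) *lru {
	return &lru{capacity: capacity, order: list.New(), items: map[string]*list.Element{}}
}

// get returns the cached result and whether it was present.
func (c *lru) get(key string) (bool, bool) {
	el, ok := c.items[key]
	if !ok {
		return false, false
	}
	c.order.MoveToFront(el) // mark as recently used
	return el.Value.(*entry).isBot, true
}

// put stores a result, evicting the least recently used entry if full.
func (c *lru) put(key string, isBot bool) {
	if el, ok := c.items[key]; ok {
		el.Value.(*entry).isBot = isBot
		c.order.MoveToFront(el)
		return
	}
	if c.order.Len() >= c.capacity {
		oldest := c.order.Back()
		c.order.Remove(oldest)
		delete(c.items, oldest.Value.(*entry).key)
	}
	c.items[key] = c.order.PushFront(&entry{key: key, isBot: isBot})
}

func main() {
	cache := newLRU(2)
	cache.put("Googlebot/2.1", true)
	cache.put("Mozilla/5.0", false)
	if hit, ok := cache.get("Googlebot/2.1"); ok {
		fmt.Println("cached:", hit) // cached: true
	}
	cache.put("curl/8.0", true) // evicts Mozilla/5.0, the least recently used
	_, ok := cache.get("Mozilla/5.0")
	fmt.Println("still cached:", ok) // still cached: false
}
```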

Example

Simple example

Inspiration

BotDetector is inspired by CrawlerDetect, an awesome PHP project.