Source: ai.robots.txt
Section: web
Priority: optional
Maintainer: Daniel Baumann <daniel@debian.org>
Build-Depends:
 debhelper-compat (= 13),
 jq,
Standards-Version: 4.7.3
Homepage: https://github.com/ai-robots-txt/ai.robots.txt
Vcs-Browser: https://forgejo.debian.net/httpd/ai.robots.txt
Vcs-Git: https://forgejo.debian.net/httpd/ai.robots.txt

Package: apache2-block-ai-bots
Section: web
Architecture: all
Depends:
 apache2,
 ${misc:Depends},
Conflicts:
 apache2-ai-bots (<< 1.45+20260307+dfsg-2~),
Replaces:
 apache2-ai-bots (<< 1.45+20260307+dfsg-2~),
Description: list of AI agents and robots to block (apache2)
 ai.robots.txt is a list containing AI-related crawlers of all types, regardless
 of purpose.
 .
 Blocking access based on the user agent does not stop every crawler, but it
 is a simple and low-overhead way of blocking most of them.
 .
 This package contains the apache2 integration. Please see
 /usr/share/doc/apache2-block-ai-bots/README.Debian for how to enable it.
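# The description above refers to blocking by user agent. As a rough,
# illustrative sketch only (the bot names and the use of mod_rewrite here are
# assumptions, not the configuration actually shipped by this package;
# README.Debian is authoritative), such a rule could look like:
#
#   <IfModule mod_rewrite.c>
#       RewriteEngine On
#       # Reject requests whose User-Agent matches known AI crawlers
#       RewriteCond %{HTTP_USER_AGENT} (GPTBot|CCBot|ClaudeBot) [NC]
#       RewriteRule ^ - [F,L]
#   </IfModule>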

Package: apache2-ai-bots
Section: oldlibs
Architecture: all
Depends:
 apache2-block-ai-bots,
 ${misc:Depends},
Description: list of AI agents and robots to block (transitional package)
 Package to ease upgrading from the older apache2-ai-bots package to the new
 apache2-block-ai-bots package.
 .
 This package can be purged at any time once the apache2-block-ai-bots package
 has been installed.

Package: haproxy-block-ai-bots
Section: web
Architecture: all
Depends:
 haproxy,
 ${misc:Depends},
Description: list of AI agents and robots to block (haproxy)
 ai.robots.txt is a list containing AI-related crawlers of all types, regardless
 of purpose.
 .
 Blocking access based on the user agent does not stop every crawler, but it
 is a simple and low-overhead way of blocking most of them.
 .
 This package contains the haproxy integration. Please see
 /usr/share/doc/haproxy-block-ai-bots/README.Debian for how to enable it.
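# As above, a rough illustrative sketch of user-agent blocking in haproxy
# (the ACL name and bot names are assumptions, not the configuration shipped
# by this package; README.Debian is authoritative):
#
#   # In a frontend section: deny requests whose User-Agent contains a
#   # blocked bot name (case-insensitive substring match)
#   acl ai_bot req.hdr(User-Agent) -m sub -i GPTBot CCBot ClaudeBot
#   http-request deny if ai_bot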

Package: nginx-block-ai-bots
Section: web
Architecture: all
Depends:
 nginx,
 ${misc:Depends},
Description: list of AI agents and robots to block (nginx)
 ai.robots.txt is a list containing AI-related crawlers of all types, regardless
 of purpose.
 .
 Blocking access based on the user agent does not stop every crawler, but it
 is a simple and low-overhead way of blocking most of them.
 .
 This package contains the nginx integration. Please see
 /usr/share/doc/nginx-block-ai-bots/README.Debian for how to enable it.
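# As above, a rough illustrative sketch of user-agent blocking in nginx
# (the bot names are assumptions, not the configuration shipped by this
# package; README.Debian is authoritative):
#
#   # Inside a server block: reject requests whose User-Agent matches
#   # known AI crawlers
#   if ($http_user_agent ~* "(GPTBot|CCBot|ClaudeBot)") {
#       return 403;
#   }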
