Merge branch '0.10' into issue1644

pull/1645/head
Serg G. Brester 2017-04-21 10:32:29 +02:00 committed by GitHub
commit 311f8fea83
128 changed files with 3474 additions and 1090 deletions

View File

@ -11,6 +11,7 @@ python:
- 3.3
- 3.4
- 3.5
- 3.6
- pypy3
before_install:
- if [[ $TRAVIS_PYTHON_VERSION == 2* || $TRAVIS_PYTHON_VERSION == 'pypy' ]]; then export F2B_PY_2=true && echo "Set F2B_PY_2"; fi

ChangeLog
View File

@ -10,6 +10,93 @@ ver. 0.10.0 (2016/XX/XXX) - gonna-be-released-some-time-shining
-----------
TODO: implementing of options resp. other tasks from PR #1346
documentation should be extended (new options, etc)
### Fixes
* `filter.d/pam-generic.conf`:
- [grave] fixed possible injection from the user name into the host
* `filter.d/sshd.conf`:
- rewritten using `prefregex` and MLFID-related multi-line parsing
(using tag `<F-MLFID>` instead of buffering with `maxlines`);
- optional parameter `mode` rewritten: normal (default), ddos, extra or aggressive (combines all);
see the sshd filter for regex details
* `filter.d/sendmail-reject.conf`:
- rewritten using `prefregex` and MLFID-related multi-line parsing;
- optional parameter `mode` introduced: normal (default), extra or aggressive
* `filter.d/haproxy-http-auth`: do not mistake client port for part of an IPv6 address (gh-1745)
* `action.d/complain.conf`
- fixed using new tag `<ip-rev>` (sh/dash compliant now)
* `action.d/sendmail-geoip-lines.conf`
- fixed using new tag `<ip-host>` (without external command execution)
* fail2ban-regex: fixed matched output by multi-line (buffered) parsing
* fail2ban-regex: support for multi-line debuggex URL implemented (gh-422)
* fixed ipv6-action errors on systems not supporting ipv6 and vice versa (gh-1741)
### New Features
* New Actions:
* New Filters:
### Enhancements
* Introduced new filter option `prefregex` for pre-filtering using a single regular expression (gh-1698);
* Many times faster and less CPU-hungry because of parsing with `maxlines=1`, i.e. without
line buffering (scrolling of the buffer window).
The combination of tags `<F-MLFID>` and `<F-NOFAIL>` can now be used to process multi-line logs
using single-line expressions:
- tag `<F-MLFID>`: used to identify resp. store failure info for groups of log-lines with the same
identifier (e. g. combined failure-info for the same conn-id by `<F-MLFID>(?:conn-id)</F-MLFID>`,
see sshd.conf for example);
- tag `<F-MLFFORGET>`: can be used as a mark to forget the current multi-line MLFID (e. g. on connection
closed, reset or disconnect);
- tag `<F-NOFAIL>`: used as a no-failure mark (helper to accumulate common failure info,
e. g. from lines that contain the IP address);
In contrast to the obsolete multi-line parsing (using buffering with `maxlines`), this is more precise and
can recognize multiple failure attempts within the same connection (MLFID); a minimal sketch follows
at the end of this enhancements list.
* Several filters optimized with pre-filtering using the new option `prefregex`, and with multi-line parsing
using the `<F-MLFID>` + `<F-NOFAIL>` combination;
* Exposes filter group captures in actions (non-recursive interpolation of tags `<F-...>`,
see gh-1698, gh-1110)
* Some filters extended with the user name (can be used in gh-1243 to distinguish IP and user,
resp. to remove only the user-related failures after a successful login);
* Safer, more stable and faster replaceTag interpolation (switched from a loop over all tags
to re.sub with a callable)
* substituteRecursiveTags optimized and moved into the helpers facilities (because it is now
used by both server and client)
* New tags (usable in actions):
- `<fid>` - failure identifier (if raw resp. failures without IP address)
- `<ip-rev>` - PTR reversed representation of IP address
- `<ip-host>` - host name of the IP address
- `<F-...>` - interpolates to the corresponding filter group capture `...`
* Allow the use of filter options with `fail2ban-regex`, example:
fail2ban-regex text.log "sshd[mode=aggressive]"
* Samples test case factory extended with filter options - dict in JSON to control
filter options (e. g. mode, etc.):
# filterOptions: {"mode": "aggressive"}
* Introduced new jail option "ignoreself", which specifies whether the local resp. own IP addresses
should be ignored (default is true). Fail2ban will not ban a host which matches such addresses.
Option "ignoreip" applies in addition to "ignoreself" and does not need to include the DNS names
resp. IP addresses of the host itself.
* Regex will be compiled as MULTILINE only if needed (buffering with `maxlines` > 1), which enables:
- improved performance for single-line parsing (see gh-1733);
- more precise regex (distinguishing between the anchors `^`/`$` for begin/end of string
and the new-line character '\n', e. g. for sources (like the systemd journal) that deliver
log entries containing new-line chars as a single entry);
- if a multiline regex is nevertheless expected (with single-line parsing, i.e. without buffering),
the prefix `(?m)` can be used in the regex to enable it;
* implemented execution of `actionstart` on demand (conditional), if the action depends on `family` (gh-1742):
- new action parameter `actionstart_on_demand` (bool) can be set to prevent/allow starting the action
on demand (default retrieved automatically, if some conditional parameter `param?family=...`
is present in the action properties), see `action.d/pf.conf` for an example;
- additionally `actionstop` will be executed only for families for which `actionstart` was previously
executed (when starting on demand only)
* introduced new command `actionflush`: executed in order to flush all bans at once,
e. g. on unban-all, reload with removal of the action, stop, or shutdown of the system (gh-1743);
actions providing `actionflush` do not execute `actionunban` for each single ticket
* added new command `actionflush` as default for several iptables/iptables-ipset actions (and the common include);
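
A minimal filter sketch illustrating the `prefregex` + `<F-MLFID>`/`<F-NOFAIL>` combination
described above (hypothetical daemon name and log messages, not a stock filter): the failure
rule carries no `<HOST>`, the address is merged in from the `<F-NOFAIL>` line that shares the
same MLFID.

    [Definition]
    # pre-filter applied to every line: capture the connection id as MLFID and hand
    # the remaining message over to failregex/ignoreregex
    prefregex = ^mydaemon\[\d+\]: <F-MLFID>conn-\d+</F-MLFID>: <F-CONTENT>.+</F-CONTENT>$
    # 1st rule: no-failure helper that only contributes <HOST> to the MLFID group;
    # 2nd rule: the actual failure, whose address is taken from that group;
    # 3rd rule: no-failure mark that also forgets the collected MLFID info on disconnect
    failregex = ^<F-NOFAIL>connect from</F-NOFAIL> <HOST>$
                ^login failed for user <F-USER>\S+</F-USER>$
                ^client <F-NOFAIL><F-MLFFORGET>disconnected</F-MLFFORGET></F-NOFAIL>$
    ignoreregex =
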
ver. 0.10.0-alpha-1 (2016/07/14) - ipv6-support-etc
-----------
### Fixes
* [Grave] memory leaks fixed (gh-1277, gh-1234)
@ -83,6 +170,10 @@ TODO: implementing of options resp. other tasks from PR #1346
if configuration is clean (fails by wrong configured jails if option `-t` specified)
* New command action parameter `actionrepair` - command executed in order to restore
sane environment in error case of `actioncheck`.
* Reporting via abuseipdb.com:
- Bans can now be reported to abuseipdb
- Categories must be set in the config
- Relevant log lines included in report
### Enhancements
* Huge increasing of fail2ban performance and especially test-cases performance (see gh-1109)
@ -165,6 +256,31 @@ fail2ban-client set loglevel INFO
- faster match and fewer searching of appropriate templates
(DateDetector.matchTime calls rarer DateTemplate.matchDate now);
- several standard filters extended with exact prefixed or anchored date templates;
* Added possibility to recognize the restored state of tickets (see gh-1669).
New option `norestored` introduced, to ignore restored tickets (after restart).
To avoid execution of ban/unban for restored tickets, `norestored = true`
can be added in the definition section of an action.
For conditional usage in shell-based actions, the interpolation `<restored>`
can also be used. E. g. it is enough to add the following script piece at the beginning
of `actionban` (or `actionunban`) to prevent execution:
`if [ '<restored>' = '1' ]; then exit 0; fi;`
Several actions extended now using `norestored` option:
- complain.conf
- dshield.conf
- mail-buffered.conf
- mail-whois-lines.conf
- mail-whois.conf
- mail.conf
- sendmail-buffered.conf
- sendmail-geoip-lines.conf
- sendmail-whois-ipjailmatches.conf
- sendmail-whois-ipmatches.conf
- sendmail-whois-lines.conf
- sendmail-whois-matches.conf
- sendmail-whois.conf
- sendmail.conf
- smtp.py
- xarf-login-attack.conf
* fail2ban-testcases:
- `assertLogged` extended with parameter wait (to wait up to specified timeout,
before we throw assert exception) + test cases rewritten using that
@ -172,29 +288,72 @@ fail2ban-client set loglevel INFO
- new `with_foreground_server_thread` decorator to test several client/server commands
ver. 0.9.6 (2016/XX/XX) - wanna-be-released
ver. 0.9.x (2016/??/??) - wanna-be-released
-----------
0.9.x line is no longer heavily developed. If you are interested in
new features (e.g. IPv6 support), please consider 0.10 branch and its
releases.
### Fixes
* Fixed a systemd-journal handling in fail2ban-regex (gh-1657)
* filter.d/sshd.conf
- Fixed non-anchored part of failregex (misleading match of colon inside
IPv6 address instead of `: ` in the reason-part by missing space, gh-1658)
(0.10th resp. IPv6 relevant only, amend for gh-1479)
* config/paths-freebsd.conf
- Fixed filenames for apache and nginx log files (gh-1667)
* filter.d/sshd.conf
- new aggressive rules (gh-864):
- Connection reset by peer (multi-line rule during authorization process)
- No supported authentication methods available
- single line and multi-line expression optimized, added optional prefixes
and suffix (logged from several ssh versions), according to gh-1206;
- fixed expression received disconnect auth fail (optional space after port
part, gh-1652)
* filter.d/suhosin.conf
- greedy catch-all before `<HOST>` fixed (potential vulnerability)
* filter.d/cyrus-imap.conf
- accept entries without login-info resp. hostname before IP address (gh-1707)
* Filter tests extended with a check of all config regexps that contain a greedy catch-all
before `<HOST>`, ensuring each is hard-anchored at the end or has a precise sub-expression after `<HOST>`
### New Features
* New Actions:
- action.d/netscaler: Block IPs on a Citrix Netscaler ADC (gh-1663)
* New Filters:
- filter.d/domino-smtp: IBM Domino SMTP task (gh-1603)
### Enhancements
* Introduced new log-level `MSG` (as INFO-2, equivalent to 18)
ver. 0.9.6 (2016/12/10) - stretch-is-coming
-----------
### Fixes
* Misleading add resp. enable of an (already available) jail in the database, which
induced a subsequent error: the last position of the log file would never be retrieved (gh-795)
* Fixed a distribution related bug within testReadStockJailConfForceEnabled
(e.g. test-cases faults on Fedora, see gh-1353)
* Fixed pythonic filters and test scripts (running via wrong python version,
uses "fail2ban-python" now);
* Fixed test case "testSetupInstallRoot" for not default python version (also
using direct call, out of virtualenv);
* Fixed ambiguous wrong recognized date pattern resp. its optional parts (see gh-1512);
* FIPS compliant, use sha1 instead of md5 if md5 is not allowed (see gh-1540)
* Monit config: scripting is not supported in path (gh-1556)
* `filter.d/apache-modsecurity.conf`
- Fixed for newer version (one space, gh-1626), optimized: non-greedy catch-all
replaced for safer match, unneeded catch-all anchoring removed, non-capturing
* `filter.d/asterisk.conf`
- Fixed to match different asterisk log prefix (source file: method:)
* `filter.d/dovecot.conf`
- Fixed failregex ignores failures through some not relevant info (gh-1623)
* `filter.d/ignorecommands/apache-fakegooglebot`
- Fixed error within apache-fakegooglebot, that will be called
with wrong python version (gh-1506)
* `filter.d/assp.conf`
- Extended failregex and test cases to handle ASSP V1 and V2 (gh-1494)
@ -208,18 +367,21 @@ releases.
- recognized "Failed publickey for" (gh-1477);
- optimized failregex to match all of "Failed any-method for ... from <HOST>" (gh-1479)
- eliminated possible complex injections (on user-name resp. auth-info, see gh-1479)
- optional port part after host (see gh-1533, gh-1581)
### New Features
* New Actions:
- `action.d/npf.conf` for NPF, the latest packet filter for NetBSD
* New Filters:
- `filter.d/mongodb-auth.conf` for MongoDB (document-oriented NoSQL database engine)
(gh-1586, gh-1606 and gh-1607)
### Enhancements
* DateTemplate regexp extended with the word-end boundary, additionally to
word-start boundary
* Introduces new command "fail2ban-python", as automatically created symlink to
python executable, where fail2ban currently installed (resp. its modules are located):
- allows to use the same version, fail2ban currently running, e.g. in
external scripts just via replace python with fail2ban-python:
```diff
-#!/usr/bin/env python

View File

@ -41,6 +41,7 @@ config/action.d/mynetwatchman.conf
config/action.d/nftables-allports.conf
config/action.d/nftables-common.conf
config/action.d/nftables-multiport.conf
config/action.d/npf.conf
config/action.d/nsupdate.conf
config/action.d/osx-afctl.conf
config/action.d/osx-ipfw.conf
@ -100,6 +101,7 @@ config/filter.d/horde.conf
config/filter.d/ignorecommands/apache-fakegooglebot
config/filter.d/kerio.conf
config/filter.d/lighttpd-auth.conf
config/filter.d/mongodb-auth.conf
config/filter.d/monit.conf
config/filter.d/murmur.conf
config/filter.d/mysqld-auth.conf
@ -136,7 +138,6 @@ config/filter.d/solid-pop3d.conf
config/filter.d/squid.conf
config/filter.d/squirrelmail.conf
config/filter.d/sshd.conf
config/filter.d/sshd-ddos.conf
config/filter.d/stunnel.conf
config/filter.d/suhosin.conf
config/filter.d/tine20.conf
@ -154,6 +155,7 @@ config/paths-opensuse.conf
config/paths-osx.conf
CONTRIBUTING.md
COPYING
.coveragerc
DEVELOP
fail2ban-2to3
fail2ban/client/actionreader.py
@ -213,6 +215,7 @@ fail2ban/tests/clientbeautifiertestcase.py
fail2ban/tests/clientreadertestcase.py
fail2ban/tests/config/action.d/brokenaction.conf
fail2ban/tests/config/fail2ban.conf
fail2ban/tests/config/filter.d/common.conf
fail2ban/tests/config/filter.d/simple.conf
fail2ban/tests/config/filter.d/test.conf
fail2ban/tests/config/filter.d/test.local
@ -287,6 +290,7 @@ fail2ban/tests/files/logs/haproxy-http-auth
fail2ban/tests/files/logs/horde
fail2ban/tests/files/logs/kerio
fail2ban/tests/files/logs/lighttpd-auth
fail2ban/tests/files/logs/mongodb-auth
fail2ban/tests/files/logs/monit
fail2ban/tests/files/logs/murmur
fail2ban/tests/files/logs/mysqld-auth
@ -322,7 +326,6 @@ fail2ban/tests/files/logs/solid-pop3d
fail2ban/tests/files/logs/squid
fail2ban/tests/files/logs/squirrelmail
fail2ban/tests/files/logs/sshd
fail2ban/tests/files/logs/sshd-ddos
fail2ban/tests/files/logs/stunnel
fail2ban/tests/files/logs/suhosin
fail2ban/tests/files/logs/tine20
@ -386,6 +389,7 @@ man/fail2ban-testcases.1
man/fail2ban-testcases.h2m
man/generate-man
man/jail.conf.5
.pylintrc
README.md
README.Solaris
RELEASE

View File

@ -17,9 +17,13 @@ Though Fail2Ban is able to reduce the rate of incorrect authentications
attempts, it cannot eliminate the risk that weak authentication presents.
Configure services to use only two factor or public/private authentication
mechanisms if you really want to protect services.
<img src="http://www.worldipv6launch.org/wp-content/themes/ipv6/downloads/World_IPv6_launch_logo.svg" height="52pt"/> | Since v0.10 fail2ban supports the matching of the IPv6 addresses.
------|------
This README is a quick introduction to Fail2ban. More documentation, FAQ, HOWTOs
are available in fail2ban(1) manpage and on the website http://www.fail2ban.org
are available in fail2ban(1) manpage, [Wiki](https://github.com/fail2ban/fail2ban/wiki)
and on the website http://www.fail2ban.org
Installation:
-------------
@ -86,7 +90,7 @@ Contact:
See [CONTRIBUTING.md](https://github.com/fail2ban/fail2ban/blob/master/CONTRIBUTING.md)
### You just appreciate this program:
send kudos to the original author ([Cyril Jaquier](mailto: Cyril Jaquier <cyril.jaquier@fail2ban.org>))
send kudos to the original author ([Cyril Jaquier](mailto:cyril.jaquier@fail2ban.org))
or *better* to the [mailing list](https://lists.sourceforge.net/lists/listinfo/fail2ban-users)
since Fail2Ban is "community-driven" for years now.

View File

@ -53,7 +53,7 @@ Preparation
or an alternative for comparison with previous release
git diff 0.9.5 | grep -B2 'index 0000000..' | grep -B1 'new file mode' | sed -n -e '/^diff /s,.* b/,,gp' >> MANIFEST
git diff 0.10.0 | grep -B2 'index 0000000..' | grep -B1 'new file mode' | sed -n -e '/^diff /s,.* b/,,gp' >> MANIFEST
sort MANIFEST | uniq | sponge MANIFEST
* Run::
@ -70,7 +70,7 @@ Preparation
* clean up current directory::
diff -rul --exclude \*.pyc . /tmp/fail2ban-0.9.5/
diff -rul --exclude \*.pyc . /tmp/fail2ban-0.10.0/
* Only differences should be files that you don't want distributed.
@ -83,7 +83,7 @@ Preparation
* To generate a list of committers use e.g.::
git shortlog -sn 0.9.5.. | sed -e 's,^[ 0-9\t]*,,g' | tr '\n' '\|' | sed -e 's:|:, :g'
git shortlog -sn 0.10.0.. | sed -e 's,^[ 0-9\t]*,,g' | tr '\n' '\|' | sed -e 's:|:, :g'
* Ensure the top of the ChangeLog has the right version and current date.
* Ensure the top entry of the ChangeLog has the right version and current date.
@ -106,7 +106,7 @@ Preparation
* Tag the release by using a signed (and annotated) tag. Cut/paste
release ChangeLog entry as tag annotation::
git tag -s 0.9.5
git tag -s 0.10.0
Pre Release
===========

THANKS
View File

@ -16,6 +16,7 @@ Alexander Koeppe (IPv6 support)
Alexandre Perrin (kAworu)
Amir Caspi
Amy
Andrew James Collett (ajcollett)
Andrew St. Jean
Andrey G. Grozin
Andy Fragen
@ -111,6 +112,7 @@ Sean DuBois
Sebastian Arcus
Serg G. Brester
Sergey Safarov
Shaun C.
Sireyessire
silviogarbes
Stefan Tatschner
@ -121,6 +123,7 @@ Thomas Mayer
Tom Pike
Tom Hendrikx
Tomas Pihl
Thomas Skierlo (phaleas)
Tony Lawrence
Tomasz Ciolek
Tyler

View File

@ -0,0 +1,105 @@
# Fail2ban configuration file
#
# Action to report IP address to abuseipdb.com
# You must sign up to obtain an API key from abuseipdb.com.
#
# NOTE: These reports may include sensitive info.
# If you want cleaner reports that ensure no user data, see the helper script at the website below.
#
# IMPORTANT:
#
# Reporting an IP for abuse is a serious complaint. Make sure that it is
# warranted. Fail2ban developers and network owners recommend you only use this
# action for:
# * The recidive jail, where the IP has been banned multiple times
# * Where maxretry has been set quite high, beyond a normal user typing
#   a password incorrectly.
# * For filters that have a low likelihood of receiving human errors
#
# This action relies on an api_key being added to the above action conf,
# and the appropriate categories set.
#
# Example, for ssh bruteforce (in section [sshd] of `jail.local`):
# action = %(known/action)s
# %(action_abuseipdb)s[abuseipdb_apikey="my-api-key", abuseipdb_category="18,22"]
#
# See below for categories.
#
# Original Ref: https://wiki.shaunc.com/wikka.php?wakka=ReportingToAbuseIPDBWithFail2Ban
# Added to fail2ban by Andrew James Collett (ajcollett)
## AbuseIPDB categories; the `abuseipdb_category` MUST be set in the jail.conf action call.
# Example, for ssh bruteforce: action = %(action_abuseipdb)s[abuseipdb_category="18,22"]
# ID Title Description
# 3 Fraud Orders
# 4 DDoS Attack
# 9 Open Proxy
# 10 Web Spam
# 11 Email Spam
# 14 Port Scan
# 18 Brute-Force
# 19 Bad Web Bot
# 20 Exploited Host
# 21 Web App Attack
# 22 SSH Secure Shell (SSH) abuse. Use this category in combination with more specific categories.
# 23 IoT Targeted
# See https://abuseipdb.com/categories for more descriptions
[Definition]
# Option: actionstart
# Notes.: command executed once at the start of Fail2Ban.
# Values: CMD
#
actionstart =
# Option: actionstop
# Notes.: command executed once at the end of Fail2Ban
# Values: CMD
#
actionstop =
# Option: actioncheck
# Notes.: command executed once before each actionban command
# Values: CMD
#
actioncheck =
# Option: actionban
# Notes.: command executed when banning an IP. Take care that the
# command is executed with Fail2Ban user rights.
#
# ** IMPORTANT! **
#
# By default, this posts directly to AbuseIPDB's API; unfortunately
# this results in a lot of backslashes/escapes appearing in the
# reports. This also may include info like your hostname.
# If you have your own web server with PHP available, you can
# use my (Shaun's) helper PHP script by commenting out the first #actionban
# line below, uncommenting the second one, and pointing the URL at
# wherever you install the helper script. For the PHP helper script, see
# <https://wiki.shaunc.com/wikka.php?wakka=ReportingToAbuseIPDBWithFail2Ban>
#
# --ciphers ecdhe_ecdsa_aes_256_sha is used to workaround a
# "NSS error -12286" from curl as it attempts to connect using
# SSLv3. See https://www.centos.org/forums/viewtopic.php?t=52732
# Tags: See jail.conf(5) man page
# Values: CMD
#
actionban = curl --fail --ciphers ecdhe_ecdsa_aes_256_sha --data 'key=<abuseipdb_apikey>' --data-urlencode 'comment=<matches>' --data 'ip=<ip>' --data 'category=<abuseipdb_category>' "https://www.abuseipdb.com/report/json"
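
# For illustration only (hypothetical key, category and address): with
# abuseipdb_apikey="my-api-key", abuseipdb_category="18,22" and a ban of 192.0.2.123,
# the actionban above expands roughly to (wrapped here for readability):
#   curl --fail --ciphers ecdhe_ecdsa_aes_256_sha --data 'key=my-api-key'
#        --data-urlencode 'comment=<the matched log lines>' --data 'ip=192.0.2.123'
#        --data 'category=18,22' "https://www.abuseipdb.com/report/json"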
# Option: actionunban
# Notes.: command executed when unbanning an IP. Take care that the
# command is executed with Fail2Ban user rights.
# Tags: See jail.conf(5) man page
# Values: CMD
#
actionunban =
[Init]
# Option: abuseipdb_apikey
# Notes.: Your API key from abuseipdb.com
# Values: STRING Default: None
# Register for abuseipdb [https://www.abuseipdb.com], get an API key and set it below.
# You will need to set the category in the action call.
abuseipdb_apikey =

View File

@ -34,6 +34,12 @@ before = helpers-common.conf
[Definition]
# Used in test cases for coverage internal transformations
debug = 0
# bypass ban/unban for restored tickets
norestored = 1
# Option: actionstart
# Notes.: command executed once at the start of Fail2Ban.
# Values: CMD
@ -58,9 +64,11 @@ actioncheck =
# Tags: See jail.conf(5) man page
# Values: CMD
#
actionban = oifs=${IFS};
IFS=.; SEP_IP=( <ip> ); set -- ${SEP_IP}; ADDRESSES=$(dig +short -t txt -q $4.$3.$2.$1.abuse-contacts.abusix.org);
IFS=,; ADDRESSES=$(echo $ADDRESSES)
actionban = oifs=${IFS};
RESOLVER_ADDR="%(addr_resolver)s"
if [ "<debug>" -gt 0 ]; then echo "try to resolve $RESOLVER_ADDR"; fi
ADDRESSES=$(dig +short -t txt -q $RESOLVER_ADDR | tr -d '"')
IFS=,; ADDRESSES=$(echo $ADDRESSES)
IFS=${oifs}
IP=<ip>
if [ ! -z "$ADDRESSES" ]; then
@ -78,7 +86,12 @@ actionban = oifs=${IFS};
#
actionunban =
[Init]
# Address (DNS name) queried via dig to obtain the abuse contact(s)
#
addr_resolver = <ip-rev>abuse-contacts.abusix.org
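
# For illustration (hypothetical address): for <ip> = 192.0.2.123 the tag <ip-rev>
# presumably expands to "123.2.0.192." so the dig call in actionban becomes roughly:
#   dig +short -t txt -q 123.2.0.192.abuse-contacts.abusix.org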
# Default message used for abuse content
#
message = Dear Sir/Madam,\n\nWe have detected abuse from the IP address $IP, which according to a abusix.com is on your network. We would appreciate if you would investigate and take action as appropriate.\n\nLog lines are given below, but please ask if you require any further information.\n\n(If you are not the correct person to contact about this please accept our apologies - your e-mail address was extracted from the whois record by an automated process.)\n\n This mail was generated by Fail2Ban.\nThe recipient address of this report was provided by the Abuse Contact DB by abusix.com. abusix.com does not maintain the content of the database. All information which we pass out, derives from the RIR databases and is processed for ease of use. If you want to change or report non working abuse contacts please contact the appropriate RIR. If you have any further question, contact abusix.com directly via email (info@abusix.com). Information about the Abuse Contact Database can be found here: https://abusix.com/global-reporting/abuse-contact-db\nabusix.com is neither responsible nor liable for the content or accuracy of this message.\n
# Path to the log files which contain relevant lines for the abuser IP

View File

@ -28,6 +28,9 @@
[Definition]
# bypass ban/unban for restored tickets
norestored = 1
# Option: actionstart
# Notes.: command executed once at the start of Fail2Ban.
# Values: CMD

View File

@ -10,14 +10,23 @@
# Notes.: command executed once at the start of Fail2Ban.
# Values: CMD
#
actionstart = touch /var/run/fail2ban/fail2ban.dummy
printf %%b "<init>\n" >> /var/run/fail2ban/fail2ban.dummy
actionstart = if [ ! -z '<target>' ]; then touch <target>; fi;
printf %%b "<init>\n" <to_target>
echo "%(debug)s started"
# Option: actionflush
# Notes.: command executed once to flush (clear) all IPs, on shutdown (resp. on stop of the jail or this action)
# Values: CMD
#
actionflush = printf %%b "-*\n" <to_target>
echo "%(debug)s clear all"
# Option: actionstop
# Notes.: command executed once at the end of Fail2Ban
# Values: CMD
#
actionstop = rm -f /var/run/fail2ban/fail2ban.dummy
actionstop = if [ ! -z '<target>' ]; then rm -f <target>; fi;
echo "%(debug)s stopped"
# Option: actioncheck
# Notes.: command executed once before each actionban command
@ -31,7 +40,8 @@ actioncheck =
# Tags: See jail.conf(5) man page
# Values: CMD
#
actionban = printf %%b "+<ip>\n" >> /var/run/fail2ban/fail2ban.dummy
actionban = printf %%b "+<ip>\n" <to_target>
echo "%(debug)s banned <ip> (family: <family>)"
# Option: actionunban
# Notes.: command executed when unbanning an IP. Take care that the
@ -39,9 +49,15 @@ actionban = printf %%b "+<ip>\n" >> /var/run/fail2ban/fail2ban.dummy
# Tags: See jail.conf(5) man page
# Values: CMD
#
actionunban = printf %%b "-<ip>\n" >> /var/run/fail2ban/fail2ban.dummy
actionunban = printf %%b "-<ip>\n" <to_target>
echo "%(debug)s unbanned <ip> (family: <family>)"
debug = [<name>] <actname> <target> --
[Init]
init = 123
target = /var/run/fail2ban/fail2ban.dummy
to_target = >> <target>

View File

@ -35,7 +35,7 @@ actioncheck =
# service name example:
# firewall-cmd --zone=<zone> --add-rich-rule="rule family='<family>' source address='<ip>' service name='<service>' log prefix='f2b-<name>' level='<level>' limit value='<rate>/m' <rich-blocktype>"
#
# Because rich rules can only handle single or a range of ports we must split ports and execute the command for each port. Ports can be single and ranges seperated by a comma or space for an example: http, https, 22-60, 18 smtp
# Because rich rules can only handle single or a range of ports we must split ports and execute the command for each port. Ports can be single and ranges separated by a comma or space for an example: http, https, 22-60, 18 smtp
actionban = ports="<port>"; for p in $(echo $ports | tr ", " " "); do firewall-cmd --add-rich-rule="rule family='<family>' source address='<ip>' port port='$p' protocol='<protocol>' log prefix='f2b-<name>' level='<level>' limit value='<rate>/m' <rich-blocktype>"; done

View File

@ -33,7 +33,7 @@ actioncheck =
# service name example:
# firewall-cmd --zone=<zone> --add-rich-rule="rule family='ipv4' source address='<ip>' service name='<service>' <rich-blocktype>"
#
# Because rich rules can only handle single or a range of ports we must split ports and execute the command for each port. Ports can be single and ranges seperated by a comma or space for an example: http, https, 22-60, 18 smtp
# Because rich rules can only handle single or a range of ports we must split ports and execute the command for each port. Ports can be single and ranges separated by a comma or space for an example: http, https, 22-60, 18 smtp
actionban = ports="<port>"; for p in $(echo $ports | tr ", " " "); do firewall-cmd --add-rich-rule="rule family='<family>' source address='<ip>' port port='$p' protocol='<protocol>' <rich-blocktype>"; done

View File

@ -7,6 +7,9 @@
_grep_logs = logpath="<logpath>"; grep <grepopts> -E %(_grep_logs_args)s $logpath | <greplimit>
_grep_logs_args = '(^|[^0-9])<ip>([^0-9]|$)'
# Used for actions that should not be executed if the ticket was restored:
_bypass_if_restored = if [ '<restored>' = '1' ]; then exit 0; fi;
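
# Usage sketch (hypothetical command): in an action config that includes this file
# (e.g. via `before = helpers-common.conf`), prepend the helper to a command that
# should be skipped for restored tickets:
#   actionban = %(_bypass_if_restored)s /usr/local/bin/report-abuse <ip>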
[Init]
greplimit = tail -n <grepmax>
grepmax = 1000

View File

@ -26,7 +26,7 @@ actionstart = <iptables> -N f2b-<name>
# Values: CMD
#
actionstop = <iptables> -D <chain> -p <protocol> -j f2b-<name>
<iptables> -F f2b-<name>
<actionflush>
<iptables> -X f2b-<name>
# Option: actioncheck

View File

@ -16,6 +16,14 @@ after = iptables-blocktype.local
iptables-common.local
# iptables-blocktype.local is obsolete
[Definition]
# Option: actionflush
# Notes.: command executed once to flush IPs, on shutdown (resp. on stop of the jail or this action)
# Values: CMD
#
actionflush = <iptables> -F f2b-<name>
[Init]

View File

@ -30,12 +30,19 @@ before = iptables-common.conf
actionstart = ipset --create f2b-<name> iphash
<iptables> -I <chain> -p <protocol> -m multiport --dports <port> -m set --match-set f2b-<name> src -j <blocktype>
# Option: actionflush
# Notes.: command executed once to flush IPs, on shutdown (resp. on stop of the jail or this action)
# Values: CMD
#
actionflush = ipset --flush f2b-<name>
# Option: actionstop
# Notes.: command executed once at the end of Fail2Ban
# Values: CMD
#
actionstop = <iptables> -D <chain> -p <protocol> -m multiport --dports <port> -m set --match-set f2b-<name> src -j <blocktype>
ipset --flush f2b-<name>
<actionflush>
ipset --destroy f2b-<name>
# Option: actionban

View File

@ -29,12 +29,18 @@ before = iptables-common.conf
actionstart = ipset create <ipmset> hash:ip timeout <bantime><familyopt>
<iptables> -I <chain> -m set --match-set <ipmset> src -j <blocktype>
# Option: actionflush
# Notes.: command executed once to flush IPs, on shutdown (resp. on stop of the jail or this action)
# Values: CMD
#
actionflush = ipset flush <ipmset>
# Option: actionstop
# Notes.: command executed once at the end of Fail2Ban
# Values: CMD
#
actionstop = <iptables> -D <chain> -m set --match-set <ipmset> src -j <blocktype>
ipset flush <ipmset>
<actionflush>
ipset destroy <ipmset>
# Option: actionban

View File

@ -29,12 +29,18 @@ before = iptables-common.conf
actionstart = ipset create <ipmset> hash:ip timeout <bantime><familyopt>
<iptables> -I <chain> -p <protocol> -m multiport --dports <port> -m set --match-set <ipmset> src -j <blocktype>
# Option: actionflush
# Notes.: command executed once to flush IPs, on shutdown (resp. on stop of the jail or this action)
# Values: CMD
#
actionflush = ipset flush <ipmset>
# Option: actionstop
# Notes.: command executed once at the end of Fail2Ban
# Values: CMD
#
actionstop = <iptables> -D <chain> -p <protocol> -m multiport --dports <port> -m set --match-set <ipmset> src -j <blocktype>
ipset flush <ipmset>
<actionflush>
ipset destroy <ipmset>
# Option: actionban

View File

@ -26,13 +26,19 @@ actionstart = <iptables> -N f2b-<name>
<iptables> -I f2b-<name>-log -j LOG --log-prefix "$(expr f2b-<name> : '\(.\{1,23\}\)'):DROP " --log-level warning -m limit --limit 6/m --limit-burst 2
<iptables> -A f2b-<name>-log -j <blocktype>
# Option: actionflush
# Notes.: command executed once to flush IPs, on shutdown (resp. on stop of the jail or this action)
# Values: CMD
#
actionflush = <iptables> -F f2b-<name>
<iptables> -F f2b-<name>-log
# Option: actionstop
# Notes.: command executed once at the end of Fail2Ban
# Values: CMD
#
actionstop = <iptables> -D <chain> -p <protocol> -m multiport --dports <port> -j f2b-<name>
<iptables> -F f2b-<name>
<iptables> -F f2b-<name>-log
<actionflush>
<iptables> -X f2b-<name>
<iptables> -X f2b-<name>-log

View File

@ -23,7 +23,7 @@ actionstart = <iptables> -N f2b-<name>
# Values: CMD
#
actionstop = <iptables> -D <chain> -p <protocol> -m multiport --dports <port> -j f2b-<name>
<iptables> -F f2b-<name>
<actionflush>
<iptables> -X f2b-<name>
# Option: actioncheck

View File

@ -25,7 +25,7 @@ actionstart = <iptables> -N f2b-<name>
# Values: CMD
#
actionstop = <iptables> -D <chain> -m state --state NEW -p <protocol> --dport <port> -j f2b-<name>
<iptables> -F f2b-<name>
<actionflush>
<iptables> -X f2b-<name>
# Option: actioncheck

View File

@ -35,6 +35,12 @@ before = iptables-common.conf
# shorter of the two timeouts actually matters.
actionstart = if [ `id -u` -eq 0 ];then <iptables> -I <chain> -m recent --update --seconds 3600 --name <iptname> -j <blocktype>;fi
# Option: actionflush
#
# [TODO] Flushing is currently not implemented for xt_recent
#
actionflush =
# Option: actionstop
# Notes.: command executed once at the end of Fail2Ban
# Values: CMD

View File

@ -23,7 +23,7 @@ actionstart = <iptables> -N f2b-<name>
# Values: CMD
#
actionstop = <iptables> -D <chain> -p <protocol> --dport <port> -j f2b-<name>
<iptables> -F f2b-<name>
<actionflush>
<iptables> -X f2b-<name>
# Option: actioncheck

View File

@ -6,6 +6,9 @@
[Definition]
# bypass ban/unban for restored tickets
norestored = 1
# Option: actionstart
# Notes.: command executed once at the start of Fail2Ban.
# Values: CMD

View File

@ -11,6 +11,9 @@ before = mail-whois-common.conf
[Definition]
# bypass ban/unban for restored tickets
norestored = 1
# Option: actionstart
# Notes.: command executed once at the start of Fail2Ban.
# Values: CMD
@ -18,7 +21,7 @@ before = mail-whois-common.conf
actionstart = printf %%b "Hi,\n
The jail <name> has been started successfully.\n
Regards,\n
Fail2Ban" | <mailcmd> -s "[Fail2Ban] <name>: started on `uname -n`" <dest>
Fail2Ban" | <mailcmd> "[Fail2Ban] <name>: started on `uname -n`" <dest>
# Option: actionstop
# Notes.: command executed once at the end of Fail2Ban
@ -27,7 +30,7 @@ actionstart = printf %%b "Hi,\n
actionstop = printf %%b "Hi,\n
The jail <name> has been stopped.\n
Regards,\n
Fail2Ban" | <mailcmd> -s "[Fail2Ban] <name>: stopped on `uname -n`" <dest>
Fail2Ban" | <mailcmd> "[Fail2Ban] <name>: stopped on `uname -n`" <dest>
# Option: actioncheck
# Notes.: command executed once before each actionban command
@ -52,6 +55,7 @@ _ban_mail_content = ( printf %%b "Hi,\n
printf %%b "\n
Regards,\n
Fail2Ban" )
actionban = %(_ban_mail_content)s | <mailcmd> "[Fail2Ban] <name>: banned <ip> from `uname -n`" <dest>
# Option: actionunban

View File

@ -10,6 +10,9 @@ before = mail-whois-common.conf
[Definition]
# bypass ban/unban for restored tickets
norestored = 1
# Option: actionstart
# Notes.: command executed once at the start of Fail2Ban.
# Values: CMD

View File

@ -6,6 +6,9 @@
[Definition]
# bypass ban/unban for restored tickets
norestored = 1
# Option: actionstart
# Notes.: command executed once at the start of Fail2Ban.
# Values: CMD

View File

@ -0,0 +1,33 @@
# Fail2ban Citrix Netscaler Action
# by Juliano Jeziorny
# juliano@jeziorny.eu
#
# The script will add offender IPs to a dataset on the netscaler; the dataset can then be used to block the IPs at a cs/vserver or global level.
# This dataset is then used to block IPs using responder policies on the netscaler.
#
# The script assumes HTTPS with an insecure (e.g. self-signed) certificate to access the netscaler;
# if you have a valid certificate installed, remove the -k from the curl lines, or if you want http, change it accordingly (and remove the -k).
#
# This action depends on curl
#
# You need to populate the 3 options inside Init
#
# ns_host: IP or hostname of netscaler appliance
# ns_auth: username:password, suggest base64 encoded for a little added security (echo -n "username:password" | base64)
# ns_dataset: Name of the netscaler dataset holding the IPs to be blocked.
#
# For further details on how to use it please check http://blog.ckzone.eu/2017/01/fail2ban-action-for-citrix-netscaler.html
[Init]
ns_host =
ns_auth =
ns_dataset =
[Definition]
actionstart = curl -kH 'Authorization: Basic <ns_auth>' https://<ns_host>/nitro/v1/config
actioncheck =
actionban = curl -k -H 'Authorization: Basic <ns_auth>' -X PUT -d '{"policydataset_value_binding":{"name":"<ns_dataset>","value":"<ip>"}}' https://<ns_host>/nitro/v1/config/
actionunban = curl -H 'Authorization: Basic <ns_auth>' -X DELETE -k "https://<ns_host>/nitro/v1/config/policydataset_value_binding/<ns_dataset>?args=value:<ip>"
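
# Usage sketch (hypothetical jail and values; option names as defined in [Init] above):
#
#   [sshd]
#   enabled = true
#   action  = netscaler[ns_host="10.0.0.10", ns_auth="dXNlcjpwYXNzd29yZA==", ns_dataset="f2b_blocklist"]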

View File

@ -18,6 +18,9 @@
actionstart = echo "table <<tablename>-<name>> persist counters" | pfctl -f-
echo "block proto <protocol> from <<tablename>-<name>> to <actiontype>" | pfctl -f-
# Option: actionstart_on_demand - to start the action on demand
# Example: `action=pf[actionstart_on_demand=true]`
actionstart_on_demand = false
# Option: actionstop
# Notes.: command executed once at the end of Fail2Ban
@ -71,8 +74,6 @@ tablename = f2b
#
protocol = tcp
# Option: actiontype
# Notes.: defines additions to the blocking rule
# Values: leave empty to block all attempts from the host

View File

@ -10,6 +10,9 @@ before = sendmail-common.conf
[Definition]
# bypass ban/unban for restored tickets
norestored = 1
# Option: actionstart
# Notes.: command executed once at the start of Fail2Ban.
# Values: CMD

View File

@ -11,6 +11,9 @@ before = sendmail-common.conf
[Definition]
# bypass ban/unban for restored tickets
norestored = 1
# Option: actionban
# Notes.: Command executed when banning an IP. Take care that the
# command is executed with Fail2Ban user rights.
@ -33,7 +36,7 @@ actionban = ( printf %%b "Subject: [Fail2Ban] <name>: banned <ip> from `uname -n
http://whois.domaintools.com/<ip>\n\n
Country:`geoiplookup -f /usr/share/GeoIP/GeoIP.dat "<ip>" | cut -d':' -f2-`
AS:`geoiplookup -f /usr/share/GeoIP/GeoIPASNum.dat "<ip>" | cut -d':' -f2-`
hostname: `host -t A <ip> 2>&1`\n\n
hostname: <ip-host>\n\n
Lines containing failures of <ip>\n";
%(_grep_logs)s;
printf %%b "\n

View File

@ -10,6 +10,9 @@ before = sendmail-common.conf
[Definition]
# bypass ban/unban for restored tickets
norestored = 1
# Option: actionban
# Notes.: command executed when banning an IP. Take care that the
# command is executed with Fail2Ban user rights.

View File

@ -10,6 +10,9 @@ before = sendmail-common.conf
[Definition]
# bypass ban/unban for restored tickets
norestored = 1
# Option: actionban
# Notes.: command executed when banning an IP. Take care that the
# command is executed with Fail2Ban user rights.

View File

@ -11,6 +11,9 @@ before = sendmail-common.conf
[Definition]
# bypass ban/unban for restored tickets
norestored = 1
# Option: actionban
# Notes.: command executed when banning an IP. Take care that the
# command is executed with Fail2Ban user rights.

View File

@ -10,6 +10,9 @@ before = sendmail-common.conf
[Definition]
# bypass ban/unban for restored tickets
norestored = 1
# Option: actionban
# Notes.: command executed when banning an IP. Take care that the
# command is executed with Fail2Ban user rights.

View File

@ -10,6 +10,9 @@ before = sendmail-common.conf
[Definition]
# bypass ban/unban for restored tickets
norestored = 1
# Option: actionban
# Notes.: command executed when banning an IP. Take care that the
# command is executed with Fail2Ban user rights.

View File

@ -10,6 +10,9 @@ before = sendmail-common.conf
[Definition]
# bypass ban/unban for restored tickets
norestored = 1
# Option: actionban
# Notes.: command executed when banning an IP. Take care that the
# command is executed with Fail2Ban user rights.

View File

@ -123,9 +123,12 @@ class SMTPAction(ActionBase):
self.message_values = CallingMap(
jailname = self._jail.name,
hostname = socket.gethostname,
bantime = self._jail.actions.getBanTime,
bantime = lambda: self._jail.actions.getBanTime(),
)
# bypass ban/unban for restored tickets
self.norestored = 1
def _sendMessage(self, subject, text):
"""Sends message based on arguments and instance's properties.
@ -211,6 +214,8 @@ class SMTPAction(ActionBase):
Dictionary which includes information in relation to
the ban.
"""
if aInfo.get('restored'):
return
aInfo.update(self.message_values)
message = "".join([
messages['ban']['head'],

View File

@ -32,6 +32,9 @@
[Definition]
# bypass ban/unban for restored tickets
norestored = 1
actionstart =
actionstop =

View File

@ -9,20 +9,23 @@ before = apache-common.conf
[Definition]
prefregex = ^%(_apache_error_client)s (?:AH\d+: )?<F-CONTENT>.+</F-CONTENT>$
failregex = ^%(_apache_error_client)s (AH(01797|01630): )?client denied by server configuration
^%(_apache_error_client)s (AH01617: )?user \S* authentication failure
^%(_apache_error_client)s (AH01618: )?user \S* not found
^%(_apache_error_client)s (AH01614: )?client used wrong authentication scheme
^%(_apache_error_client)s (AH\d+: )?Authorization of user \S* to access .* failed
^%(_apache_error_client)s (AH0179[24]: )?(Digest: )?user \S*: password mismatch
^%(_apache_error_client)s (AH0179[01]: |Digest: )user `\S*' in realm `.+' (not found|denied by provider)
^%(_apache_error_client)s (AH01631: )?user \S*: authorization failure
^%(_apache_error_client)s (AH01775: )?(Digest: )?invalid nonce .* received - length is not \S+(, referer: \S+)?\s*$
^%(_apache_error_client)s (AH01788: )?(Digest: )?realm mismatch - got `.*?' but expected `.+'(, referer: \S+)?\s*$
^%(_apache_error_client)s (AH01789: )?(Digest: )?unknown algorithm `
^%(_apache_error_client)s (AH01793: )?invalid qop `
^%(_apache_error_client)s (AH01777: )?(Digest: )?invalid nonce .*? received - user attempted time travel(, referer: \S+)?\s*$
# auth_type = ((?:Digest|Basic): )?
auth_type = ([A-Z]\w+: )?
failregex = ^client denied by server configuration\b
^user <F-USER>(?:\S*|.*?)</F-USER> auth(?:oriz|entic)ation failure\b
^user <F-USER>(?:\S*|.*?)</F-USER> not found\b
^client used wrong authentication scheme\b
^Authorization of user <F-USER>(?:\S*|.*?)</F-USER> to access .*? failed\b
^%(auth_type)suser <F-USER>(?:\S*|.*?)</F-USER>: password mismatch\b
^%(auth_type)suser `<F-USER>(?:[^']*|.*?)</F-USER>' in realm `.+' (not found|denied by provider)\b
^%(auth_type)sinvalid nonce .* received - length is not \S+(, referer: \S+)?\s*$
^%(auth_type)srealm mismatch - got `(?:[^']*|.*?)' but expected `.+'(, referer: \S+)?\s*$
^%(auth_type)sunknown algorithm `(?:[^']*|.*?)' received\b
^invalid qop `(?:[^']*|.*?)' received\b
^%(auth_type)sinvalid nonce .*? received - user attempted time travel\b
ignoreregex =
@ -48,11 +51,12 @@ ignoreregex =
# See also: http://wiki.apache.org/httpd/ListOfErrors
# Expressions that don't have tests and aren't common.
# more may be added with https://issues.apache.org/bugzilla/show_bug.cgi?id=55284
# ^%(_apache_error_client)s (AH01778: )?user .*: nonce expired \([\d.]+ seconds old - max lifetime [\d.]+\) - sending new nonce\s*$
# ^%(_apache_error_client)s (AH01779: )?user .*: one-time-nonce mismatch - sending new nonce\s*$
# ^%(_apache_error_client)s (AH02486: )?realm mismatch - got `.*' but no realm specified\s*$
# ^user .*: nonce expired \([\d.]+ seconds old - max lifetime [\d.]+\) - sending new nonce\s*$
# ^user .*: one-time-nonce mismatch - sending new nonce\s*$
# ^realm mismatch - got `(?:[^']*|.*?)' but no realm specified\s*$
#
# referer is always in error log messages if it exists added as per the log_error_core function in server/log.c
# Because url/referer are foreign input, a short form of the regex is used if it is long enough to identify the failure.
#
# Author: Cyril Jaquier
# Major edits by Daniel Black
# Major edits by Daniel Black and Ben Rubson.
# Rewritten for v.0.10 by Sergey Brester (sebres).

View File

@ -23,14 +23,13 @@ before = apache-common.conf
[Definition]
failregex = ^%(_apache_error_client)s ((AH001(28|30): )?File does not exist|(AH01264: )?script not found or unable to stat): <webroot><block>(, referer: \S+)?\s*$
^%(_apache_error_client)s script '<webroot><block>' not found or unable to stat(, referer: \S+)?\s*$
prefregex = ^%(_apache_error_client)s (?:AH\d+: )?<F-CONTENT>.+</F-CONTENT>$
failregex = ^(?:File does not exist|script not found or unable to stat): <webroot><block>(, referer: \S+)?\s*$
^script '<webroot><block>' not found or unable to stat(, referer: \S+)?\s*$
ignoreregex =
[Init]
# Webroot represents the webroot on which all other files are based
webroot = /var/www/

View File

@ -10,9 +10,10 @@ before = apache-common.conf
[Definition]
failregex = ^%(_apache_error_client)s ModSecurity: (\[.*?\] )*Access denied with code [45]\d\d.*$
failregex = ^%(_apache_error_client)s ModSecurity:\s+(?:\[(?:\w+ \"[^\"]*\"|[^\]]*)\]\s*)*Access denied with code [45]\d\d
ignoreregex =
# https://github.com/SpiderLabs/ModSecurity/wiki/ModSecurity-2-Data-Formats
# Author: Daniel Black
# Sergey G. Brester aka sebres (review, optimization)

View File

@ -9,8 +9,10 @@ before = apache-common.conf
[Definition]
failregex = ^%(_apache_error_client)s (AH01215: )?/bin/(ba)?sh: warning: HTTP_.*?: ignoring function definition attempt(, referer: \S+)?\s*$
^%(_apache_error_client)s (AH01215: )?/bin/(ba)?sh: error importing function definition for `HTTP_.*?'(, referer: \S+)?\s*$
prefregex = ^%(_apache_error_client)s (AH01215: )?/bin/([bd]a)?sh: <F-CONTENT>.+</F-CONTENT>$
failregex = ^warning: HTTP_[^:]+: ignoring function definition attempt(, referer: \S+)?\s*$
^error importing function definition for `HTTP_[^']+'(, referer: \S+)?\s*$
ignoreregex =

View File

@ -8,7 +8,7 @@
#
[Definition]
# Note: First three failregex matches below are for ASSP V1 with the remaining being designed for V2. Deleting the V1 regex is recommended but I left it in for compatibilty reasons.
# Note: First three failregex matches below are for ASSP V1 with the remaining being designed for V2. Deleting the V1 regex is recommended but I left it in for compatibility reasons.
__assp_actions = (?:dropping|refusing)

View File

@ -18,16 +18,18 @@ iso8601 = \d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}\.\d+[+-]\d{4}
# All Asterisk log messages begin like this:
log_prefix= (?:NOTICE|SECURITY|WARNING)%(__pid_re)s:?(?:\[C-[\da-f]*\])? [^:]+:\d*(?:(?: in)? \w+:)?
failregex = ^%(__prefix_line)s%(log_prefix)s Registration from '[^']*' failed for '<HOST>(:\d+)?' - (Wrong password|Username/auth name mismatch|No matching peer found|Not a local domain|Device does not match ACL|Peer is not supposed to register|ACL error \(permit/deny\)|Not a local domain)$
^%(__prefix_line)s%(log_prefix)s Call from '[^']*' \(<HOST>:\d+\) to extension '[^']*' rejected because extension not found in context
^%(__prefix_line)s%(log_prefix)s Host <HOST> failed to authenticate as '[^']*'$
^%(__prefix_line)s%(log_prefix)s No registration for peer '[^']*' \(from <HOST>\)$
^%(__prefix_line)s%(log_prefix)s Host <HOST> failed MD5 authentication for '[^']*' \([^)]+\)$
^%(__prefix_line)s%(log_prefix)s Failed to authenticate (user|device) [^@]+@<HOST>\S*$
^%(__prefix_line)s%(log_prefix)s hacking attempt detected '<HOST>'$
^%(__prefix_line)s%(log_prefix)s SecurityEvent="(FailedACL|InvalidAccountID|ChallengeResponseFailed|InvalidPassword)",EventTV="([\d-]+|%(iso8601)s)",Severity="[\w]+",Service="[\w]+",EventVersion="\d+",AccountID="(\d*|<unknown>)",SessionID=".+",LocalAddress="IPV[46]/(UDP|TCP|WS)/[\da-fA-F:.]+/\d+",RemoteAddress="IPV[46]/(UDP|TCP|WS)/<HOST>/\d+"(,Challenge="[\w/]+")?(,ReceivedChallenge="\w+")?(,Response="\w+",ExpectedResponse="\w*")?(,ReceivedHash="[\da-f]+")?(,ACLName="\w+")?$
^%(__prefix_line)s%(log_prefix)s "Rejecting unknown SIP connection from <HOST>"$
^%(__prefix_line)s%(log_prefix)s Request (?:'[^']*' )?from '[^']*' failed for '<HOST>(?::\d+)?'\s\(callid: [^\)]*\) - (?:No matching endpoint found|Not match Endpoint(?: Contact)? ACL|(?:Failed|Error) to authenticate)\s*$
prefregex = ^%(__prefix_line)s%(log_prefix)s <F-CONTENT>.+</F-CONTENT>$
failregex = ^Registration from '[^']*' failed for '<HOST>(:\d+)?' - (Wrong password|Username/auth name mismatch|No matching peer found|Not a local domain|Device does not match ACL|Peer is not supposed to register|ACL error \(permit/deny\)|Not a local domain)$
^Call from '[^']*' \(<HOST>:\d+\) to extension '[^']*' rejected because extension not found in context
^Host <HOST> failed to authenticate as '[^']*'$
^No registration for peer '[^']*' \(from <HOST>\)$
^Host <HOST> failed MD5 authentication for '[^']*' \([^)]+\)$
^Failed to authenticate (user|device) [^@]+@<HOST>\S*$
^hacking attempt detected '<HOST>'$
^SecurityEvent="(FailedACL|InvalidAccountID|ChallengeResponseFailed|InvalidPassword)",EventTV="([\d-]+|%(iso8601)s)",Severity="[\w]+",Service="[\w]+",EventVersion="\d+",AccountID="(\d*|<unknown>)",SessionID=".+",LocalAddress="IPV[46]/(UDP|TCP|WS)/[\da-fA-F:.]+/\d+",RemoteAddress="IPV[46]/(UDP|TCP|WS)/<HOST>/\d+"(,Challenge="[\w/]+")?(,ReceivedChallenge="\w+")?(,Response="\w+",ExpectedResponse="\w*")?(,ReceivedHash="[\da-f]+")?(,ACLName="\w+")?$
^"Rejecting unknown SIP connection from <HOST>"$
^Request (?:'[^']*' )?from '[^']*' failed for '<HOST>(?::\d+)?'\s\(callid: [^\)]*\) - (?:No matching endpoint found|Not match Endpoint(?: Contact)? ACL|(?:Failed|Error) to authenticate)\s*$
ignoreregex =

View File

@ -12,8 +12,10 @@ before = common.conf
_daemon = courieresmtpd
failregex = ^%(__prefix_line)serror,relay=<HOST>,.*: 550 User (<.*> )?unknown\.?$
^%(__prefix_line)serror,relay=<HOST>,msg="535 Authentication failed\.",cmd:( AUTH \S+)?( [0-9a-zA-Z\+/=]+)?(?: \S+)$
prefregex = ^%(__prefix_line)serror,relay=<HOST>,<F-CONTENT>.+</F-CONTENT>$
failregex = ^[^:]*: 550 User (<.*> )?unknown\.?$
^msg="535 Authentication failed\.",cmd:( AUTH \S+)?( [0-9a-zA-Z\+/=]+)?(?: \S+)$
ignoreregex =

View File

@ -13,7 +13,7 @@ before = common.conf
_daemon = (?:cyrus/)?(?:imap(d|s)?|pop3(d|s)?)
failregex = ^%(__prefix_line)sbadlogin: \S+ ?\[<HOST>\] \S+ .*?\[?SASL\(-13\): (authentication failure|user not found): .*\]?$
failregex = ^%(__prefix_line)sbadlogin: [^\[]*\[<HOST>\] \S+ .*?\[?SASL\(-13\): (authentication failure|user not found): .*\]?$
ignoreregex =

View File

@ -0,0 +1,47 @@
# Fail2Ban configuration file for IBM Domino SMTP Server TASK to detect failed login attempts
#
# Author: Christian Brandlehner
#
# $Revision: 003 $
#
# Configuration:
# Set the following Domino Server parameters in notes.ini:
# console_log_enabled=1
# log_sessions=2
# You also have to use a date and time format supported by fail2ban. Recommended notes.ini configuration is:
# DateOrder=DMY
# DateSeparator=-
# ClockType=24_Hour
# TimeSeparator=:
#
# Depending on your locale you might have to tweak the date and time format so fail2ban can read the log
#[INCLUDES]
# Read common prefixes. If any customizations available -- read them from
# common.local
#before = common.conf
[Definition]
# Option: failregex
# Notes.: regex to match the password failure messages in the logfile. The
# host must be matched by a group named "host". The tag "<HOST>" can
# be used for standard IP/hostname matching and is only an alias for
# (?:::f{4,6}:)?(?P<host>\S+)
# Values: TEXT
#
# Sample log entries (used different time formats and an extra sample with process info in front of date)
# 01-23-2009 19:54:51 SMTP Server: Authentication failed for user postmaster ; connecting host 1.2.3.4
# [28325:00010-3735542592] 22-06-2014 09:56:12 smtp: postmaster [1.2.3.4] authentication failure using internet password
# 08-09-2014 06:14:27 smtp: postmaster [1.2.3.4] authentication failure using internet password
# 08-09-2014 06:14:27 SMTP Server: Authentication failed for user postmaster ; connecting host 1.2.3.4
__prefix = (?:\[[^\]]+\])?\s+
failregex = ^%(__prefix)sSMTP Server: Authentication failed for user .*? \; connecting host <HOST>$
^%(__prefix)ssmtp: (?:[^\[]+ )*\[<HOST>\] authentication failure using internet password\s*$
# Option: ignoreregex
# Notes.: regex to ignore. If this regex matches, the line is ignored.
# Values: TEXT
#
ignoreregex =

View File

@ -7,13 +7,16 @@ before = common.conf
[Definition]
_daemon = (auth|dovecot(-auth)?|auth-worker)
_auth_worker = (?:dovecot: )?auth(?:-worker)?
_daemon = (dovecot(-auth)?|auth)
failregex = ^%(__prefix_line)s(%(__pam_auth)s(\(dovecot:auth\))?:)?\s+authentication failure; logname=\S* uid=\S* euid=\S* tty=dovecot ruser=\S* rhost=<HOST>(\s+user=\S*)?\s*$
^%(__prefix_line)s(pop3|imap)-login: (Info: )?(Aborted login|Disconnected)(: Inactivity)? \(((auth failed, \d+ attempts)( in \d+ secs)?|tried to use (disabled|disallowed) \S+ auth)\):( user=<\S*>,)?( method=\S+,)? rip=<HOST>(, lip=(\d{1,3}\.){3}\d{1,3})?(, TLS( handshaking(: SSL_accept\(\) failed: error:[\dA-F]+:SSL routines:[TLS\d]+_GET_CLIENT_HELLO:unknown protocol)?)?(: Disconnected)?)?(, session=<\S+>)?\s*$
^%(__prefix_line)s(Info|dovecot: auth\(default\)|auth-worker\(\d+\)): pam\(\S+,<HOST>\): pam_authenticate\(\) failed: (User not known to the underlying authentication module: \d+ Time\(s\)|Authentication failure \(password mismatch\?\))\s*$
^%(__prefix_line)s(auth|auth-worker\(\d+\)): (pam|passwd-file)\(\S+,<HOST>\): unknown user\s*$
^%(__prefix_line)s(auth|auth-worker\(\d+\)): Info: ldap\(\S*,<HOST>,\S*\): invalid credentials\s*$
prefregex = ^%(__prefix_line)s(%(_auth_worker)s(?:\([^\)]+\))?: )?(?:%(__pam_auth)s(?:\(dovecot:auth\))?: |(?:pop3|imap)-login: )?(?:Info: )?<F-CONTENT>.+</F-CONTENT>$
failregex = ^authentication failure; logname=\S* uid=\S* euid=\S* tty=dovecot ruser=\S* rhost=<HOST>(?:\s+user=\S*)?\s*$
^(?:Aborted login|Disconnected)(?::(?: [^ \(]+)+)? \((?:auth failed, \d+ attempts( in \d+ secs)?|tried to use (disabled|disallowed) \S+ auth)\):( user=<[^>]+>,)?( method=\S+,)? rip=<HOST>(?:, lip=\S+)?(?:, TLS(?: handshaking(?:: SSL_accept\(\) failed: error:[\dA-F]+:SSL routines:[TLS\d]+_GET_CLIENT_HELLO:unknown protocol)?)?(: Disconnected)?)?(, session=<\S+>)?\s*$
^pam\(\S+,<HOST>\): pam_authenticate\(\) failed: (User not known to the underlying authentication module: \d+ Time\(s\)|Authentication failure \(password mismatch\?\))\s*$
^(?:pam|passwd-file)\(\S+,<HOST>\): unknown user\s*$
^ldap\(\S*,<HOST>,\S*\): invalid credentials\s*$
ignoreregex =
@ -31,3 +34,4 @@ datepattern = {^LN-BEG}TAI64N
# Author: Martin Waschbuesch
# Daniel Black (rewrote with begin and end anchors)
# Martin O'Neal (added LDAP authentication failure regex)
# Sergey G. Brester aka sebres (reviewed, optimized, IPv6-compatibility)

View File

@ -23,9 +23,11 @@ before = common.conf
_daemon = dropbear
failregex = ^%(__prefix_line)s[Ll]ogin attempt for nonexistent user ('.*' )?from <HOST>:\d+$
^%(__prefix_line)s[Bb]ad (PAM )?password attempt for .+ from <HOST>(:\d+)?$
^%(__prefix_line)s[Ee]xit before auth \(user '.+', \d+ fails\): Max auth tries reached - user '.+' from <HOST>:\d+\s*$
prefregex = ^%(__prefix_line)s<F-CONTENT>(?:[Ll]ogin|[Bb]ad|[Ee]xit).+</F-CONTENT>$
failregex = ^[Ll]ogin attempt for nonexistent user ('.*' )?from <HOST>:\d+$
^[Bb]ad (PAM )?password attempt for .+ from <HOST>(:\d+)?$
^[Ee]xit before auth \(user '.+', \d+ fails\): Max auth tries reached - user '.+' from <HOST>:\d+\s*$
ignoreregex =

View File

@ -9,7 +9,9 @@ after = exim-common.local
[Definition]
host_info = (?:H=([\w.-]+ )?(?:\(\S+\) )?)?\[<HOST>\](?::\d+)? (?:I=\[\S+\](:\d+)? )?(?:U=\S+ )?(?:P=e?smtp )?
host_info_pre = (?:H=([\w.-]+ )?(?:\(\S+\) )?)?
host_info_suf = (?::\d+)?(?: I=\[\S+\](:\d+)?)?(?: U=\S+)?(?: P=e?smtp)?(?: F=(?:<>|[^@]+@\S+))?\s
host_info = %(host_info_pre)s\[<HOST>\]%(host_info_suf)s
pid = (?: \[\d+\])?
# DEV Notes:

View File

@ -13,14 +13,17 @@ before = exim-common.conf
[Definition]
# Pre-filter via "prefregex" is currently inactive because the failure syntax in the exim log varies too much (testing needed):
#prefregex = ^%(pid)s <F-CONTENT>\b(?:\w+ authenticator failed|([\w\-]+ )?SMTP (?:(?:call|connection) from|protocol(?: synchronization)? error)|no MAIL in|(?:%(host_info_pre)s\[[^\]]+\]%(host_info_suf)s(?:sender verify fail|rejected RCPT|dropped|AUTH command))).+</F-CONTENT>$
failregex = ^%(pid)s %(host_info)ssender verify fail for <\S+>: (?:Unknown user|Unrouteable address|all relevant MX records point to non-existent hosts)\s*$
^%(pid)s \w+ authenticator failed for (\S+ )?\(\S+\) \[<HOST>\](?::\d+)?(?: I=\[\S+\](:\d+)?)?: 535 Incorrect authentication data( \(set_id=.*\)|: \d+ Time\(s\))?\s*$
^%(pid)s %(host_info)sF=(?:<>|[^@]+@\S+) rejected RCPT [^@]+@\S+: (?:relay not permitted|Sender verify failed|Unknown user)\s*$
^%(pid)s %(host_info)srejected RCPT [^@]+@\S+: (?:relay not permitted|Sender verify failed|Unknown user)\s*$
^%(pid)s SMTP protocol synchronization error \([^)]*\): rejected (?:connection from|"\S+") %(host_info)s(?:next )?input=".*"\s*$
^%(pid)s SMTP call from \S+ %(host_info)sdropped: too many nonmail commands \(last was "\S+"\)\s*$
^%(pid)s SMTP protocol error in "AUTH \S*(?: \S*)?" %(host_info)sAUTH command used when not advertised\s*$
^%(pid)s no MAIL in SMTP connection from (?:\S* )?(?:\(\S*\) )?%(host_info)sD=\d+s(?: C=\S*)?\s*$
^%(pid)s \S+ SMTP connection from (?:\S* )?(?:\(\S*\) )?%(host_info)sclosed by DROP in ACL\s*$
^%(pid)s ([\w\-]+ )?SMTP connection from (?:\S* )?(?:\(\S*\) )?%(host_info)sclosed by DROP in ACL\s*$
ignoreregex =

View File

@ -25,8 +25,11 @@ _daemon = Froxlor
# (?:::f{4,6}:)?(?P<host>[\w\-.^_]+)
# Values: TEXT
#
failregex = ^%(__prefix_line)s\[Login Action <HOST>\] Unknown user \S* tried to login.$
^%(__prefix_line)s\[Login Action <HOST>\] User \S* tried to login with wrong password.$
prefregex = ^%(__prefix_line)s\[Login Action <HOST>\] <F-CONTENT>.+</F-CONTENT>$
failregex = ^Unknown user \S* tried to login.$
^User \S* tried to login with wrong password.$
# Option: ignoreregex

View File

@ -28,7 +28,7 @@ _daemon = haproxy
# (?:::f{4,6}:)?(?P<host>[\w\-.^_]+)
# Values: TEXT
#
failregex = ^%(__prefix_line)s<HOST>.*<NOSRV> -1/-1/-1/-1/\+*\d* 401
failregex = ^%(__prefix_line)s<HOST>(?::\d+)?\s+.*<NOSRV> -1/-1/-1/-1/\+*\d* 401
# Option: ignoreregex
# Notes.: regex to ignore. If this regex matches, the line is ignored.

View File

@ -0,0 +1,49 @@
# Fail2Ban filter for unsuccessful MongoDB authentication attempts
#
# Logfile /var/log/mongodb/mongodb.log
#
# add setting in /etc/mongodb.conf
# logpath=/var/log/mongodb/mongodb.log
#
# and enable authentication:
# auth = true
#
[Definition]
#failregex = ^\s+\[initandlisten\] connection accepted from <HOST>:\d+ \#(?P<__connid>\d+) \(1 connection now open\)<SKIPLINES>\s+\[conn(?P=__connid)\] Failed to authenticate\s+
failregex = ^\s+\[conn(?P<__connid>\d+)\] Failed to authenticate [^\n]+<SKIPLINES>\s+\[conn(?P=__connid)\] end connection <HOST>
ignoreregex =
[Init]
maxlines = 10
# DEV Notes:
#
# Regarding the multiline regex:
#
# There can be a number of unrelated lines between the first and second part
# of this regex; maxlines of 10 is quite generous.
#
# Note the capture __connid includes the connection ID, which is used in the second part of the regex.
#
# The first regex is commented out (although it would also match), because it is better to take
# the host from the "end connection" line (the uncommented regex above):
# - it has the same prefix and the search starts directly with the failure message
#   (so it is faster, because successful connections are ignored entirely)
# - it is less vulnerable to a possible race condition
#
# Log example:
# 2016-10-20T09:54:27.108+0200 [initandlisten] connection accepted from 127.0.0.1:53276 #1 (1 connection now open)
# 2016-10-20T09:54:27.109+0200 [conn1] authenticate db: test { authenticate: 1, nonce: "xxx", user: "root", key: "xxx" }
# 2016-10-20T09:54:27.110+0200 [conn1] Failed to authenticate root@test with mechanism MONGODB-CR: AuthenticationFailed UserNotFound Could not find user root@test
# 2016-11-09T09:54:27.894+0100 [conn1] end connection 127.0.0.1:53276 (0 connections now open)
# 2016-11-09T11:55:58.890+0100 [initandlisten] connection accepted from 127.0.0.1:54266 #1510 (1 connection now open)
# 2016-11-09T11:55:58.892+0100 [conn1510] authenticate db: admin { authenticate: 1, nonce: "xxx", user: "root", key: "xxx" }
# 2016-11-09T11:55:58.892+0100 [conn1510] Failed to authenticate root@admin with mechanism MONGODB-CR: AuthenticationFailed key mismatch
# 2016-11-09T11:55:58.894+0100 [conn1510] end connection 127.0.0.1:54266 (0 connections now open)
#
# Authors: Alexander Finkhäuser
# Sergey G. Brester (sebres)
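The named-group back-reference described in the notes above can be sketched with plain Python re (a minimal sketch only; the real filter uses fail2ban's <HOST> and <SKIPLINES> tags rather than raw re), reusing the log lines from the example:

import re

# The conn ID captured on the "Failed to authenticate" line must re-match via the
# back-reference (?P=connid) on the "end connection" line that carries the IP.
pattern = re.compile(
    r"\[conn(?P<connid>\d+)\] Failed to authenticate .*\n"
    r"(?:.*\n)*?"                                      # unrelated lines in between
    r".*\[conn(?P=connid)\] end connection (?P<host>[\d.]+)")

log = ("2016-11-09T11:55:58.892+0100 [conn1510] Failed to authenticate root@admin with mechanism MONGODB-CR: AuthenticationFailed key mismatch\n"
       "2016-11-09T11:55:58.894+0100 [conn1510] end connection 127.0.0.1:54266 (0 connections now open)")
m = pattern.search(log)
print(m.group("connid"), m.group("host"))              # -> 1510 127.0.0.1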

View File

@ -17,8 +17,10 @@ _usernameregex = [^>]+
_prefix = \s+\d+ => <\d+:%(_usernameregex)s\(-1\)> Rejected connection from <HOST>:\d+:
failregex = ^%(_prefix)s Invalid server password$
^%(_prefix)s Wrong certificate or password for existing user$
prefregex = ^%(_prefix)s <F-CONTENT>.+</F-CONTENT>$
failregex = ^Invalid server password$
^Wrong certificate or password for existing user$
ignoreregex =

View File

@ -1,4 +1,4 @@
# Fail2Ban filter for unsuccesfull MySQL authentication attempts
# Fail2Ban filter for unsuccessful MySQL authentication attempts
#
#
# To log wrong MySQL access attempts add to /etc/my.cnf in [mysqld]:

View File

@ -34,9 +34,11 @@ __daemon_combs_re=(?:%(__pid_re)s?:\s+%(__daemon_re)s|%(__daemon_re)s%(__pid_re)
# this can be optional (for instance if we match named native log files)
__line_prefix=(?:\s\S+ %(__daemon_combs_re)s\s+)?
failregex = ^%(__line_prefix)s( error:)?\s*client <HOST>#\S+( \([\S.]+\))?: (view (internal|external): )?query(?: \(cache\))? '.*' denied\s*$
^%(__line_prefix)s( error:)?\s*client <HOST>#\S+( \([\S.]+\))?: zone transfer '\S+/AXFR/\w+' denied\s*$
^%(__line_prefix)s( error:)?\s*client <HOST>#\S+( \([\S.]+\))?: bad zone transfer request: '\S+/IN': non-authoritative zone \(NOTAUTH\)\s*$
prefregex = ^%(__line_prefix)s( error:)?\s*client <HOST>#\S+( \([\S.]+\))?: <F-CONTENT>.+</F-CONTENT>$
failregex = ^(view (internal|external): )?query(?: \(cache\))? '.*' denied\s*$
^zone transfer '\S+/AXFR/\w+' denied\s*$
^bad zone transfer request: '\S+/IN': non-authoritative zone \(NOTAUTH\)\s*$
ignoreregex =

View File

@ -16,7 +16,12 @@ _ttys_re=\S*
__pam_re=\(?%(__pam_auth)s(?:\(\S+\))?\)?:?
_daemon = \S+
failregex = ^%(__prefix_line)s%(__pam_re)s\s+authentication failure; logname=\S* uid=\S* euid=\S* tty=%(_ttys_re)s ruser=\S* rhost=<HOST>(?:\s+user=.*)?\s*$
prefregex = ^%(__prefix_line)s%(__pam_re)s\s+authentication failure; logname=\S* uid=\S* euid=\S* tty=%(_ttys_re)s <F-CONTENT>.+</F-CONTENT>$
failregex = ^ruser=<F-USER>\S*</F-USER> rhost=<HOST>\s*$
^ruser= rhost=<HOST>\s+user=<F-USER>\S*</F-USER>\s*$
^ruser= rhost=<HOST>\s+user=<F-USER>.*?</F-USER>\s*$
^ruser=<F-USER>.*?</F-USER> rhost=<HOST>\s*$
ignoreregex =

View File

@ -12,7 +12,7 @@ before = common.conf
_daemon = postfix(-\w+)?/smtpd
failregex = ^%(__prefix_line)sNOQUEUE: reject: RCPT from \S+\[<HOST>\]: 454 4\.7\.1 Service unavailable; Client host \[\S+\] blocked using .* from=<\S*> to=<\S+> proto=ESMTP helo=<\S*>$
failregex = ^%(__prefix_line)sNOQUEUE: reject: RCPT from \S+\[<HOST>\]: [45]54 [45]\.7\.1 Service unavailable; Client host \[\S+\] blocked\b
ignoreregex =

View File

@ -12,13 +12,15 @@ before = common.conf
_daemon = postfix(-\w+)?/(?:submission/|smtps/)?smtp[ds]
failregex = ^%(__prefix_line)sNOQUEUE: reject: RCPT from \S+\[<HOST>\]: 554 5\.7\.1 .*$
^%(__prefix_line)sNOQUEUE: reject: RCPT from \S+\[<HOST>\]: 450 4\.7\.1 Client host rejected: cannot find your hostname, (\[\S*\]); from=<\S*> to=<\S+> proto=ESMTP helo=<\S*>$
^%(__prefix_line)sNOQUEUE: reject: RCPT from \S+\[<HOST>\]: 450 4\.7\.1 : Helo command rejected: Host not found; from=<> to=<> proto=ESMTP helo= *$
^%(__prefix_line)sNOQUEUE: reject: EHLO from \S+\[<HOST>\]: 504 5\.5\.2 <\S+>: Helo command rejected: need fully-qualified hostname;
^%(__prefix_line)sNOQUEUE: reject: VRFY from \S+\[<HOST>\]: 550 5\.1\.1 .*$
^%(__prefix_line)sNOQUEUE: reject: RCPT from \S+\[<HOST>\]: 450 4\.1\.8 <\S*>: Sender address rejected: Domain not found; from=<\S*> to=<\S+> proto=ESMTP helo=<\S*>$
^%(__prefix_line)simproper command pipelining after \S+ from [^[]*\[<HOST>\]:?$
prefregex = ^%(__prefix_line)s(?:NOQUEUE: reject:|improper command pipelining) <F-CONTENT>.+</F-CONTENT>$
failregex = ^RCPT from \S+\[<HOST>\]: 554 5\.7\.1
^RCPT from \S+\[<HOST>\]: 450 4\.7\.1 Client host rejected: cannot find your hostname, (\[\S*\]); from=<\S*> to=<\S+> proto=ESMTP helo=<\S*>$
^RCPT from \S+\[<HOST>\]: 450 4\.7\.1 : Helo command rejected: Host not found; from=<> to=<> proto=ESMTP helo= *$
^EHLO from \S+\[<HOST>\]: 504 5\.5\.2 <\S+>: Helo command rejected: need fully-qualified hostname;
^VRFY from \S+\[<HOST>\]: 550 5\.1\.1
^RCPT from \S+\[<HOST>\]: 450 4\.1\.8 <\S*>: Sender address rejected: Domain not found; from=<\S*> to=<\S+> proto=ESMTP helo=<\S*>$
^after \S+ from [^[]*\[<HOST>\]:?$
ignoreregex =

View File

@ -16,10 +16,14 @@ _daemon = proftpd
__suffix_failed_login = (User not authorized for login|No such user found|Incorrect password|Password expired|Account disabled|Invalid shell: '\S+'|User in \S+|Limit (access|configuration) denies login|Not a UserAlias|maximum login length exceeded).?
failregex = ^%(__prefix_line)s%(__hostname)s \(\S+\[<HOST>\]\)[: -]+ USER .*: no such user found from \S+ \[\S+\] to \S+:\S+ *$
^%(__prefix_line)s%(__hostname)s \(\S+\[<HOST>\]\)[: -]+ USER .* \(Login failed\): %(__suffix_failed_login)s\s*$
^%(__prefix_line)s%(__hostname)s \(\S+\[<HOST>\]\)[: -]+ SECURITY VIOLATION: .* login attempted\. *$
^%(__prefix_line)s%(__hostname)s \(\S+\[<HOST>\]\)[: -]+ Maximum login attempts \(\d+\) exceeded *$
prefregex = ^%(__prefix_line)s%(__hostname)s \(\S+\[<HOST>\]\)[: -]+ <F-CONTENT>(?:USER|SECURITY|Maximum).+</F-CONTENT>$
failregex = ^USER .*: no such user found from \S+ \[\S+\] to \S+:\S+ *$
^USER .* \(Login failed\): %(__suffix_failed_login)s\s*$
^SECURITY VIOLATION: .* login attempted\. *$
^Maximum login attempts \(\d+\) exceeded *$
ignoreregex =

View File

@ -21,30 +21,45 @@ before = common.conf
_daemon = (?:(sm-(mta|acceptingconnections)|sendmail))
failregex = ^%(__prefix_line)s\w{14}: ruleset=check_rcpt, arg1=(?P<email><\S+@\S+>), relay=(\S+ )?\[<HOST>\]( \(may be forged\))?, reject=(550 5\.7\.1 (?P=email)\.\.\. Relaying denied\. (IP name possibly forged \[(\d+\.){3}\d+\]|Proper authentication required\.|IP name lookup failed \[(\d+\.){3}\d+\])|553 5\.1\.8 (?P=email)\.\.\. Domain of sender address \S+ does not exist|550 5\.[71]\.1 (?P=email)\.\.\. (Rejected: .*|User unknown))$
^%(__prefix_line)sruleset=check_relay, arg1=(?P<dom>\S+), arg2=<HOST>, relay=((?P=dom) )?\[(\d+\.){3}\d+\]( \(may be forged\))?, reject=421 4\.3\.2 (Connection rate limit exceeded\.|Too many open connections\.)$
^%(__prefix_line)s\w{14}: rejecting commands from (\S* )?\[<HOST>\] due to pre-greeting traffic after \d+ seconds$
^%(__prefix_line)s\w{14}: (\S+ )?\[<HOST>\]: ((?i)expn|vrfy) \S+ \[rejected\]$
^(?P<__prefix>%(__prefix_line)s\w+: )<[^@]+@[^>]+>\.\.\. No such user here<SKIPLINES>(?P=__prefix)from=<[^@]+@[^>]+>, size=\d+, class=\d+, nrcpts=\d+, bodytype=\w+, proto=E?SMTP, daemon=MTA, relay=\S+ \[<HOST>\]$
prefregex = ^<F-MLFID>%(__prefix_line)s(?:\w{14}: )?</F-MLFID><F-CONTENT>.+</F-CONTENT>$
cmnfailre = ^ruleset=check_rcpt, arg1=(?P<email><\S+@\S+>), relay=(\S+ )?\[<HOST>\](?: \(may be forged\))?, reject=(550 5\.7\.1 (?P=email)\.\.\. Relaying denied\. (IP name possibly forged \[(\d+\.){3}\d+\]|Proper authentication required\.|IP name lookup failed \[(\d+\.){3}\d+\])|553 5\.1\.8 (?P=email)\.\.\. Domain of sender address \S+ does not exist|550 5\.[71]\.1 (?P=email)\.\.\. (Rejected: .*|User unknown))$
^ruleset=check_relay, arg1=(?P<dom>\S+), arg2=<HOST>, relay=((?P=dom) )?\[(\d+\.){3}\d+\](?: \(may be forged\))?, reject=421 4\.3\.2 (Connection rate limit exceeded\.|Too many open connections\.)$
^rejecting commands from (\S* )?\[<HOST>\] due to pre-greeting traffic after \d+ seconds$
^(?:\S+ )?\[<HOST>\]: (?:(?i)expn|vrfy) \S+ \[rejected\]$
^<[^@]+@[^>]+>\.\.\. No such user here$
^<F-NOFAIL>from=<[^@]+@[^>]+></F-NOFAIL>, size=\d+, class=\d+, nrcpts=\d+, bodytype=\w+, proto=E?SMTP, daemon=MTA, relay=\S+ \[<HOST>\]$
ignoreregex =
mdre-normal =
mdre-extra = ^(?:\S+ )?\[<HOST>\](?: \(may be forged\))? did not issue (?:[A-Z]{4}[/ ]?)+during connection to M(?:TA|SP)(?:-\w+)?$
[Init]
mdre-aggressive = %(mdre-extra)s
failregex = %(cmnfailre)s
<mdre-<mode>>
# Parameter "mode": normal (default), extra or aggressive
# Usage example (for jail.local):
# [sendmail-reject]
# filter = sendmail-reject[mode=extra]
#
mode = normal
ignoreregex =
# "maxlines" is number of log lines to buffer for multi-line regex searches
maxlines = 10
# DEV NOTES:
#
# Regarding the last multiline regex:
# Regarding the multiline regex:
#
# There can be a nunber of non-related lines between the first and second part
# of this regex maxlines of 10 is quite generious. Only one of the
# "No such user" lines needs to be matched before the line with the HOST.
# "No such user" lines generate a failure and needs to be matched together with
# another line with the HOST, therefore no-failure line was added as regex, that
# contains HOST (see line with tag <F-NOFAIL>).
#
# Note the capture __prefix, includes both the __prefix_lines (which includes
# the sendmail PID), but also the \w+ which the the sendmail assigned mail ID.
# Note the capture <F-MLFID> includes both the __prefix_line (which includes
# the sendmail PID) and the `\w{14}` which is the sendmail-assigned
# mail ID (todo: check whether this is still necessary; possibly obsolete).
#
# Author: Daniel Black and Fabian Wenk
# Author: Daniel Black, Fabian Wenk and Sergey Brester aka sebres.
# Rewritten using prefregex by Serg G. Brester.
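The grouping itself happens inside fail2ban via <F-MLFID>; as a rough idea only (a toy sketch of the behaviour described above, not fail2ban internals), lines sharing the same queue ID are merged until both the failure and the host are known:

# Hypothetical toy helper (illustration only): accumulate per-MLFID info until a
# "No such user here" failure can be attributed to the relay line carrying the host.
pending = {}

def feed(mlfid, failure=None, host=None):
    info = pending.setdefault(mlfid, {})
    if failure is not None:
        info["failure"] = failure
    if host is not None:
        info["host"] = host
    if "failure" in info and "host" in info:
        return pending.pop(mlfid)   # complete -> counted as a failure for this host
    return None

feed("v42AbCdE012345", failure="No such user here")   # queue ID here is hypothetical
print(feed("v42AbCdE012345", host="192.0.2.1"))
# -> {'failure': 'No such user here', 'host': '192.0.2.1'}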

View File

@ -1,29 +0,0 @@
# Fail2Ban ssh filter for at attempted exploit
#
# The regex here also relates to a exploit:
#
# http://www.securityfocus.com/bid/17958/exploit
# The example code here shows the pushing of the exploit straight after
# reading the server version. This is where the client version string normally
# pushed. As such the server will read this unparsible information as
# "Did not receive identification string".
[INCLUDES]
# Read common prefixes. If any customizations available -- read them from
# common.local
before = common.conf
[Definition]
_daemon = sshd
failregex = ^%(__prefix_line)sDid not receive identification string from <HOST>\s*$
ignoreregex =
[Init]
journalmatch = _SYSTEMD_UNIT=sshd.service + _COMM=sshd
# Author: Yaroslav Halchenko

View File

@ -14,32 +14,74 @@
# common.local
before = common.conf
[Definition]
[DEFAULT]
_daemon = sshd
failregex = ^%(__prefix_line)s(?:error: PAM: )?[aA]uthentication (?:failure|error|failed) for .* from <HOST>( via \S+)?\s*$
^%(__prefix_line)s(?:error: PAM: )?User not known to the underlying authentication module for .* from <HOST>\s*$
^%(__prefix_line)sFailed \S+ for (?P<cond_inv>invalid user )?(?P<user>(?P<cond_user>\S+)|(?(cond_inv)(?:(?! from ).)*?|[^:]+)) from <HOST>(?: port \d+)?(?: ssh\d*)?(?(cond_user):|(?:(?:(?! from ).)*)$)
^%(__prefix_line)sROOT LOGIN REFUSED.* FROM <HOST>\s*$
^%(__prefix_line)s[iI](?:llegal|nvalid) user .* from <HOST>\s*$
^%(__prefix_line)sUser .+ from <HOST> not allowed because not listed in AllowUsers\s*$
^%(__prefix_line)sUser .+ from <HOST> not allowed because listed in DenyUsers\s*$
^%(__prefix_line)sUser .+ from <HOST> not allowed because not in any group\s*$
^%(__prefix_line)srefused connect from \S+ \(<HOST>\)\s*$
^%(__prefix_line)s(?:error: )?Received disconnect from <HOST>: 3: .*: Auth fail(?: \[preauth\])?$
^%(__prefix_line)sUser .+ from <HOST> not allowed because a group is listed in DenyGroups\s*$
^%(__prefix_line)sUser .+ from <HOST> not allowed because none of user's groups are listed in AllowGroups\s*$
^(?P<__prefix>%(__prefix_line)s)User .+ not allowed because account is locked<SKIPLINES>(?P=__prefix)(?:error: )?Received disconnect from <HOST>: 11: .+ \[preauth\]$
^(?P<__prefix>%(__prefix_line)s)Disconnecting: Too many authentication failures for .+? \[preauth\]<SKIPLINES>(?P=__prefix)(?:error: )?Connection closed by <HOST> \[preauth\]$
^(?P<__prefix>%(__prefix_line)s)Connection from <HOST> port \d+(?: on \S+ port \d+)?<SKIPLINES>(?P=__prefix)Disconnecting: Too many authentication failures for .+? \[preauth\]$
^%(__prefix_line)s(error: )?maximum authentication attempts exceeded for .* from <HOST>(?: port \d*)?(?: ssh\d*)? \[preauth\]$
^%(__prefix_line)spam_unix\(sshd:auth\):\s+authentication failure;\s*logname=\S*\s*uid=\d*\s*euid=\d*\s*tty=\S*\s*ruser=\S*\s*rhost=<HOST>\s.*$
# optional prefix (logged from several ssh versions) like "error: ", "error: PAM: " or "fatal: "
__pref = (?:(?:error|fatal): (?:PAM: )?)?
# optional suffix (logged from several ssh versions) like " [preauth]"
__suff = (?: \[preauth\])?\s*
__on_port_opt = (?: port \d+)?(?: on \S+(?: port \d+)?)?
[Definition]
prefregex = ^<F-MLFID>%(__prefix_line)s</F-MLFID>%(__pref)s<F-CONTENT>.+</F-CONTENT>$
cmnfailre = ^[aA]uthentication (?:failure|error|failed) for <F-USER>.*</F-USER> from <HOST>( via \S+)?\s*%(__suff)s$
^User not known to the underlying authentication module for <F-USER>.*</F-USER> from <HOST>\s*%(__suff)s$
^Failed \S+ for (?P<cond_inv>invalid user )?<F-USER>(?P<cond_user>\S+)|(?(cond_inv)(?:(?! from ).)*?|[^:]+)</F-USER> from <HOST>%(__on_port_opt)s(?: ssh\d*)?(?(cond_user): |(?:(?:(?! from ).)*)$)
^<F-USER>ROOT</F-USER> LOGIN REFUSED.* FROM <HOST>\s*%(__suff)s$
^[iI](?:llegal|nvalid) user <F-USER>.*?</F-USER> from <HOST>%(__on_port_opt)s\s*$
^User <F-USER>.+</F-USER> from <HOST> not allowed because not listed in AllowUsers\s*%(__suff)s$
^User <F-USER>.+</F-USER> from <HOST> not allowed because listed in DenyUsers\s*%(__suff)s$
^User <F-USER>.+</F-USER> from <HOST> not allowed because not in any group\s*%(__suff)s$
^refused connect from \S+ \(<HOST>\)\s*%(__suff)s$
^Received <F-MLFFORGET>disconnect</F-MLFFORGET> from <HOST>%(__on_port_opt)s:\s*3: .*: Auth fail%(__suff)s$
^User <F-USER>.+</F-USER> from <HOST> not allowed because a group is listed in DenyGroups\s*%(__suff)s$
^User <F-USER>.+</F-USER> from <HOST> not allowed because none of user's groups are listed in AllowGroups\s*%(__suff)s$
^pam_unix\(sshd:auth\):\s+authentication failure;\s*logname=\S*\s*uid=\d*\s*euid=\d*\s*tty=\S*\s*ruser=<F-USER>\S*</F-USER>\s*rhost=<HOST>\s.*%(__suff)s$
^(error: )?maximum authentication attempts exceeded for <F-USER>.*</F-USER> from <HOST>%(__on_port_opt)s(?: ssh\d*)?%(__suff)s$
^User <F-USER>.+</F-USER> not allowed because account is locked%(__suff)s
^<F-MLFFORGET>Disconnecting</F-MLFFORGET>: Too many authentication failures(?: for <F-USER>.+?</F-USER>)?%(__suff)s
^<F-NOFAIL>Received <F-MLFFORGET>disconnect</F-MLFFORGET></F-NOFAIL> from <HOST>: 11:
^<F-NOFAIL>Connection <F-MLFFORGET>closed</F-MLFFORGET></F-NOFAIL> by <HOST>%(__suff)s$
mdre-normal =
mdre-ddos = ^Did not receive identification string from <HOST>%(__suff)s$
^Connection <F-MLFFORGET>reset</F-MLFFORGET> by <HOST>%(__on_port_opt)s%(__suff)s
^<F-NOFAIL>SSH: Server;Ltype:</F-NOFAIL> (?:Authname|Version|Kex);Remote: <HOST>-\d+;[A-Z]\w+:
^Read from socket failed: Connection <F-MLFFORGET>reset</F-MLFFORGET> by peer%(__suff)s
mdre-extra = ^Received <F-MLFFORGET>disconnect</F-MLFFORGET> from <HOST>%(__on_port_opt)s:\s*14: No supported authentication methods available%(__suff)s$
^Unable to negotiate with <HOST>%(__on_port_opt)s: no matching (?:cipher|key exchange method) found.
^Unable to negotiate a (?:cipher|key exchange method)%(__suff)s$
mdre-aggressive = %(mdre-ddos)s
%(mdre-extra)s
cfooterre = ^<F-NOFAIL>Connection from</F-NOFAIL> <HOST>
failregex = %(cmnfailre)s
<mdre-<mode>>
%(cfooterre)s
# Parameter "mode": normal (default), ddos, extra or aggressive (combines all)
# Usage example (for jail.local):
# [sshd]
# mode = extra
# # or another jail (rewrite filter parameters of jail):
# [sshd-aggressive]
# filter = sshd[mode=aggressive]
#
mode = normal
#filter = sshd[mode=aggressive]
ignoreregex =
# "maxlines" is number of log lines to buffer for multi-line regex searches
maxlines = 10
maxlines = 1
journalmatch = _SYSTEMD_UNIT=sshd.service + _COMM=sshd
@ -52,5 +94,5 @@ datepattern = {^LN-BEG}
# and later catch-all's could contain user-provided input, which need to be greedily
# matched away first.
#
# Author: Cyril Jaquier, Yaroslav Halchenko, Petr Voralek, Daniel Black
# Author: Cyril Jaquier, Yaroslav Halchenko, Petr Voralek, Daniel Black and Sergey Brester aka sebres
# Rewritten using prefregex (and introduced "mode" parameter) by Serg G. Brester.

View File

@ -17,7 +17,7 @@ _daemon = (?:lighttpd|suhosin)
_lighttpd_prefix = (?:\(mod_fastcgi\.c\.\d+\) FastCGI-stderr:\s)
failregex = ^%(__prefix_line)s%(_lighttpd_prefix)s?ALERT - .* \(attacker '<HOST>', file '.*'(?:, line \d+)?\)$
failregex = ^%(__prefix_line)s%(_lighttpd_prefix)s?ALERT - .*? \(attacker '<HOST>', file '[^']*'(?:, line \d+)?\)$
ignoreregex =

View File

@ -14,8 +14,10 @@ before = common.conf
_daemon = xinetd
failregex = ^%(__prefix_line)sFAIL: \S+ address from=<HOST>$
^%(__prefix_line)sFAIL: \S+ libwrap from=<HOST>$
prefregex = ^%(__prefix_line)sFAIL: <F-CONTENT>.+</F-CONTENT>$
failregex = ^\S+ address from=<HOST>$
^\S+ libwrap from=<HOST>$
ignoreregex =

View File

@ -44,10 +44,14 @@ before = paths-debian.conf
# MISCELLANEOUS OPTIONS
#
# "ignoreip" can be an IP address, a CIDR mask or a DNS host. Fail2ban will not
# ban a host which matches an address in this list. Several addresses can be
# defined using space (and/or comma) separator.
ignoreip = 127.0.0.1/8 ::1
# "ignorself" specifies whether the local resp. own IP addresses should be ignored
# (default is true). Fail2ban will not ban a host which matches such addresses.
#ignorself = true
# "ignoreip" can be a list of IP addresses, CIDR masks or DNS hosts. Fail2ban
# will not ban a host which matches an address in this list. Several addresses
# can be defined using space (and/or comma) separator.
#ignoreip = 127.0.0.1/8 ::1
# External command that will take tagged arguments to ignore, e.g. <ip>,
# and return true if the IP is to be ignored, false otherwise.
@ -207,6 +211,12 @@ action_badips = badips.py[category="%(__name__)s", banaction="%(banaction)s", ag
#
action_badips_report = badips[category="%(__name__)s", agent="%(fail2ban_agent)s"]
# Report ban via abuseipdb.com.
#
# See action.d/abuseipdb.conf for usage example and details.
#
action_abuseipdb = abuseipdb
# Choose default action. To change, just override value of 'action' with the
# interpolation to the chosen action shortcut (e.g. action_mw, action_mwl, etc) in jail.local
# globally (section [DEFAULT]) or per specific section
@ -223,15 +233,11 @@ action = %(action_)s
[sshd]
port = ssh
logpath = %(sshd_log)s
backend = %(sshd_backend)s
[sshd-ddos]
# This jail corresponds to the standard configuration in Fail2ban.
# The mail-whois action sends a notification e-mail with a whois request
# in the body.
# To use more aggressive sshd modes set filter parameter "mode" in jail.local:
# normal (default), ddos, extra or aggressive (combines all).
# See "tests/files/logs/sshd" or "filter.d/sshd.conf" for usage example and details.
mode = normal
filter = sshd[mode=%(mode)s]
port = ssh
logpath = %(sshd_log)s
backend = %(sshd_backend)s
@ -547,7 +553,11 @@ backend = %(syslog_backend)s
[sendmail-reject]
# To use more aggressive modes set filter parameter "mode" in jail.local:
# normal (default), extra or aggressive
# See "tests/files/logs/sendmail-reject" or "filter.d/sendmail-reject.conf" for usage example and details.
mode = normal
filter = sendmail-reject[mode=%(mode)s]
port = smtp,465,submission
logpath = %(syslog_mail)s
backend = %(syslog_backend)s
@ -731,6 +741,13 @@ logpath = %(mysql_log)s
backend = %(mysql_backend)s
# Log wrong MongoDB auth (for details see filter 'filter.d/mongodb-auth.conf')
[mongodb-auth]
# change port when running with "--shardsvr" or "--configsvr" runtime operation
port = 27017
logpath = /var/log/mongodb/mongodb.log
# Jail for more extended banning of persistent abusers
# !!! WARNINGS !!!
# 1. Make sure that your loglevel specified in fail2ban.conf/.local
@ -810,8 +827,9 @@ maxretry = 1
[pass2allow-ftp]
# this pass2allow example allows FTP traffic after successful HTTP authentication
port = ftp,ftp-data,ftps,ftps-data
# knocking_url variable must be overridden to some secret value in filter.d/apache-pass.local
filter = apache-pass
# knocking_url variable must be overridden to some secret value in jail.local
knocking_url = /knocking/
filter = apache-pass[knocking_url="%(knocking_url)s"]
# access log of the website with HTTP auth
logpath = %(apache_access_log)s
blocktype = RETURN
@ -845,3 +863,8 @@ logpath = /var/log/haproxy.log
port = ldap,ldaps
filter = slapd
logpath = /var/log/slapd.log
[domino-smtp]
port = smtp,ssmtp
filter = domino-smtp
logpath = /home/domino01/data/IBM_TECHNICAL_SUPPORT/console.log

View File

@ -34,13 +34,13 @@ auditd_log = /dev/null
# http://svnweb.freebsd.org/ports/head/www/apache24/files/patch-config.layout
# http://svnweb.freebsd.org/ports/head/www/apache22/files/patch-config.layout
apache_error_log = /usr/local/www/logs/*error[_.]log
apache_error_log = /var/log/httpd-error.log
apache_access_log = /usr/local/www/logs/*access[_.]log
apache_access_log = /var/log/httpd-access.log
# http://svnweb.freebsd.org/ports/head/www/nginx/Makefile?view=markup
nginx_error_log = /var/log/nginx-error.log
nginx_error_log = /var/log/nginx/error.log
nginx_access_log = /var/log/nginx-access.log
nginx_access_log = /var/log/nginx/access.log

View File

@ -36,3 +36,15 @@ mysql_log = /var/log/mysql/mysqld.log
roundcube_errors_log = /srv/www/roundcubemail/logs/errors
solidpop3d_log = %(syslog_mail)s
# These services will log to the journal via syslog, so use the journal by
# default.
syslog_backend = systemd
sshd_backend = systemd
dropbear_backend = systemd
proftpd_backend = systemd
pureftpd_backend = systemd
wuftpd_backend = systemd
postfix_backend = systemd
dovecot_backend = systemd
mysql_backend = systemd

View File

@ -27,8 +27,10 @@ __license__ = "GPL"
import logging.handlers
# Custom debug levels
logging.MSG = logging.INFO - 2
logging.TRACEDEBUG = 7
logging.HEAVYDEBUG = 5
logging.addLevelName(logging.MSG, 'MSG')
logging.addLevelName(logging.TRACEDEBUG, 'TRACE')
logging.addLevelName(logging.HEAVYDEBUG, 'HEAVY')
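A minimal usage sketch for the custom levels above (the level attributes are repeated so the snippet is self-contained; within fail2ban they are defined by the module shown here):

import logging

logging.TRACEDEBUG = 7
logging.HEAVYDEBUG = 5
logging.addLevelName(logging.TRACEDEBUG, 'TRACE')
logging.addLevelName(logging.HEAVYDEBUG, 'HEAVY')

logging.basicConfig(level=logging.HEAVYDEBUG, format="%(levelname)s %(message)s")
log = logging.getLogger("fail2ban")
log.log(logging.TRACEDEBUG, "trace-level details")     # printed as: TRACE trace-level details
log.log(logging.HEAVYDEBUG, "very verbose internals")  # printed as: HEAVY very verbose internals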

View File

@ -28,6 +28,7 @@ import os
from .configreader import DefinitionInitConfigReader
from ..helpers import getLogger
from ..server.action import CommandAction
# Gets the instance of the logger.
logSys = getLogger(__name__)
@ -37,16 +38,23 @@ class ActionReader(DefinitionInitConfigReader):
_configOpts = {
"actionstart": ["string", None],
"actionstart_on_demand": ["string", None],
"actionstop": ["string", None],
"actionflush": ["string", None],
"actionreload": ["string", None],
"actioncheck": ["string", None],
"actionrepair": ["string", None],
"actionban": ["string", None],
"actionunban": ["string", None],
"norestored": ["string", None],
}
def __init__(self, file_, jailName, initOpts, **kwargs):
self._name = initOpts.get("actname", file_)
actname = initOpts.get("actname")
if actname is None:
actname = file_
initOpts["actname"] = actname
self._name = actname
DefinitionInitConfigReader.__init__(
self, file_, jailName, initOpts, **kwargs)
@ -64,16 +72,25 @@ class ActionReader(DefinitionInitConfigReader):
return self._name
def convert(self):
opts = self.getCombined(
ignore=CommandAction._escapedTags | set(('timeout', 'bantime')))
# type-convert only after combined (otherwise boolean converting prevents substitution):
for o in ('norestored', 'actionstart_on_demand'):
if opts.get(o):
opts[o] = self._convert_to_boolean(opts[o])
# stream-convert:
head = ["set", self._jailName]
stream = list()
stream.append(head + ["addaction", self._name])
multi = []
for opt, optval in self._opts.iteritems():
for opt, optval in opts.iteritems():
if opt in self._configOpts:
multi.append([opt, optval])
if self._initOpts:
for opt, optval in self._initOpts.iteritems():
multi.append([opt, optval])
if opt not in self._configOpts:
multi.append([opt, optval])
if len(multi) > 1:
stream.append(["multi-set", self._jailName, "action", self._name, multi])
elif len(multi):

View File

@ -89,6 +89,8 @@ class Beautifier:
val = " ".join(map(str, res1[1])) if isinstance(res1[1], list) else res1[1]
msg.append("%s %s:\t%s" % (prefix1, res1[0], val))
msg = "\n".join(msg)
elif len(inC) < 2:
pass # too few cmd args for below
elif inC[1] == "syslogsocket":
msg = "Current syslog socket is:\n"
msg += "`- " + response
@ -110,6 +112,8 @@ class Beautifier:
else:
msg = "Current database purge age is:\n"
msg += "`- %iseconds" % response
elif len(inC) < 3:
pass # too few cmd args for below
elif inC[2] in ("logpath", "addlogpath", "dellogpath"):
if len(response) == 0:
msg = "No file is currently monitored"
@ -178,7 +182,8 @@ class Beautifier:
msg += ", ".join(response)
except Exception:
logSys.warning("Beautifier error. Please report the error")
logSys.error("Beautify %r with %r failed", response, self.__inputCmd)
logSys.error("Beautify %r with %r failed", response, self.__inputCmd,
exc_info=logSys.getEffectiveLevel()<=logging.DEBUG)
msg = repr(msg) + repr(response)
return msg

View File

@ -32,7 +32,7 @@ from ..helpers import getLogger
if sys.version_info >= (3,2):
# SafeConfigParser deprecated from Python 3.2 (renamed to ConfigParser)
from configparser import ConfigParser as SafeConfigParser, \
from configparser import ConfigParser as SafeConfigParser, NoSectionError, \
BasicInterpolation
# And interpolation of __name__ was simply removed, thus we need to
@ -60,7 +60,7 @@ if sys.version_info >= (3,2):
parser, option, accum, rest, section, map, depth)
else: # pragma: no cover
from ConfigParser import SafeConfigParser
from ConfigParser import SafeConfigParser, NoSectionError
# Gets the instance of the logger.
logSys = getLogger(__name__)
@ -200,6 +200,21 @@ after = 1.conf
def get_sections(self):
return self._sections
def options(self, section, withDefault=True):
"""Return a list of option names for the given section name.
Parameter `withDefault` controls the inclusion of names from section `[DEFAULT]`.
"""
try:
opts = self._sections[section]
except KeyError:
raise NoSectionError(section)
if withDefault:
# mix it with defaults:
return set(opts.keys()) | set(self._defaults)
# only own option names:
return opts.keys()
def read(self, filenames, get_includes=True):
if not isinstance(filenames, list):
filenames = [ filenames ]

View File

@ -29,24 +29,12 @@ import os
from ConfigParser import NoOptionError, NoSectionError
from .configparserinc import sys, SafeConfigParserWithIncludes, logLevel
from ..helpers import getLogger
from ..helpers import getLogger, _merge_dicts, substituteRecursiveTags
# Gets the instance of the logger.
logSys = getLogger(__name__)
# if sys.version_info >= (3,5):
# def _merge_dicts(x, y):
# return {**x, **y}
# else:
def _merge_dicts(x, y):
r = x
if y:
r = x.copy()
r.update(y)
return r
class ConfigReader():
"""Generic config reader class.
@ -121,33 +109,44 @@ class ConfigReader():
self._cfg = ConfigReaderUnshared(**self._cfg_share_kwargs)
def sections(self):
if self._cfg is not None:
try:
return self._cfg.sections()
return []
except AttributeError:
return []
def has_section(self, sec):
if self._cfg is not None:
try:
return self._cfg.has_section(sec)
return False
except AttributeError:
return False
def merge_section(self, *args, **kwargs):
if self._cfg is not None:
return self._cfg.merge_section(*args, **kwargs)
def merge_section(self, section, *args, **kwargs):
try:
return self._cfg.merge_section(section, *args, **kwargs)
except AttributeError:
raise NoSectionError(section)
def options(self, section, withDefault=False):
"""Return a list of option names for the given section name.
def options(self, *args):
if self._cfg is not None:
return self._cfg.options(*args)
return {}
Parameter `withDefault` controls the inclusion of names from section `[DEFAULT]`.
"""
try:
return self._cfg.options(section, withDefault)
except AttributeError:
raise NoSectionError(section)
def get(self, sec, opt, raw=False, vars={}):
if self._cfg is not None:
try:
return self._cfg.get(sec, opt, raw=raw, vars=vars)
return None
except AttributeError:
raise NoSectionError(sec)
def getOptions(self, *args, **kwargs):
if self._cfg is not None:
return self._cfg.getOptions(*args, **kwargs)
return {}
def getOptions(self, section, *args, **kwargs):
try:
return self._cfg.getOptions(section, *args, **kwargs)
except AttributeError:
raise NoSectionError(section)
class ConfigReaderUnshared(SafeConfigParserWithIncludes):
@ -176,6 +175,8 @@ class ConfigReaderUnshared(SafeConfigParserWithIncludes):
if not os.path.exists(self._basedir):
raise ValueError("Base configuration directory %s does not exist "
% self._basedir)
if filename.startswith("./"): # pragma: no cover
filename = os.path.abspath(filename)
basename = os.path.join(self._basedir, filename)
logSys.debug("Reading configs for %s under %s " , filename, self._basedir)
config_files = [ basename + ".conf" ]
@ -224,6 +225,7 @@ class ConfigReaderUnshared(SafeConfigParserWithIncludes):
values = dict()
if pOptions is None:
pOptions = {}
# Get only specified options:
for optname in options:
if isinstance(options, (list,tuple)):
if len(optname) > 2:
@ -276,9 +278,13 @@ class DefinitionInitConfigReader(ConfigReader):
def __init__(self, file_, jailName, initOpts, **kwargs):
ConfigReader.__init__(self, **kwargs)
if file_.startswith("./"): # pragma: no cover
file_ = os.path.abspath(file_)
self.setFile(file_)
self.setJailName(jailName)
self._initOpts = initOpts
self._pOpts = dict()
self._defCache = dict()
def setFile(self, fileName):
self._file = fileName
@ -302,23 +308,74 @@ class DefinitionInitConfigReader(ConfigReader):
self._create_unshared(self._file)
return SafeConfigParserWithIncludes.read(self._cfg, self._file)
def getOptions(self, pOpts):
def getOptions(self, pOpts, all=False):
# overwrite static definition options with init values, supplied as
# direct parameters from jail-config via action[xtra1="...", xtra2=...]:
if not pOpts:
pOpts = dict()
if self._initOpts:
if not pOpts:
pOpts = dict()
pOpts = _merge_dicts(pOpts, self._initOpts)
self._opts = ConfigReader.getOptions(
self, "Definition", self._configOpts, pOpts)
self._pOpts = pOpts
if self.has_section("Init"):
for opt in self.options("Init"):
v = self.get("Init", opt)
if not opt.startswith('known/') and opt != '__name__':
# get only own options (without options from default):
getopt = lambda opt: self.get("Init", opt)
for opt in self.options("Init", withDefault=False):
if opt == '__name__': continue
v = None
if not opt.startswith('known/'):
if v is None: v = getopt(opt)
self._initOpts['known/'+opt] = v
if not opt in self._initOpts:
if opt not in self._initOpts:
if v is None: v = getopt(opt)
self._initOpts[opt] = v
if all and self.has_section("Definition"):
# merge with all definition options (and options from default),
# bypass already converted option (so merge only new options):
for opt in self.options("Definition"):
if opt == '__name__' or opt in self._opts: continue
self._opts[opt] = self.get("Definition", opt)
def _convert_to_boolean(self, value):
return value.lower() in ("1", "yes", "true", "on")
def getCombOption(self, optname):
"""Get combined definition option (as string) using pre-set and init
options as preselection (values with higher precedence as specified in section).
Can be used only after calling of getOptions.
"""
try:
return self._defCache[optname]
except KeyError:
try:
v = self.get("Definition", optname, vars=self._pOpts)
except (NoSectionError, NoOptionError, ValueError):
v = None
self._defCache[optname] = v
return v
def getCombined(self, ignore=()):
combinedopts = self._opts
if self._initOpts:
combinedopts = _merge_dicts(combinedopts, self._initOpts)
if not len(combinedopts):
return {}
# ignore conditional options:
ignore = set(ignore).copy()
for n in combinedopts:
cond = SafeConfigParserWithIncludes.CONDITIONAL_RE.match(n)
if cond:
n, cond = cond.groups()
ignore.add(n)
# substitute options already specified directly:
opts = substituteRecursiveTags(combinedopts,
ignore=ignore, addrepl=self.getCombOption)
if not opts:
raise ValueError('recursive tag definitions unable to be resolved')
return opts
def convert(self):
raise NotImplementedError

View File

@ -41,27 +41,30 @@ from optparse import OptionParser, Option
from ConfigParser import NoOptionError, NoSectionError, MissingSectionHeaderError
try: # pragma: no cover
from systemd import journal
from ..server.filtersystemd import FilterSystemd
except ImportError:
journal = None
FilterSystemd = None
from ..version import version
from .jailreader import JailReader
from .filterreader import FilterReader
from ..server.filter import Filter, FileContainer
from ..server.failregex import RegexException
from ..server.failregex import Regex, RegexException
from ..helpers import str2LogLevel, getVerbosityFormat, FormatterWithTraceBack, getLogger, PREFER_ENC
# Gets the instance of the logger.
logSys = getLogger("fail2ban")
def debuggexURL(sample, regex):
q = urllib.urlencode({ 're': regex.replace('<HOST>', '(?&.ipv4)'),
'str': sample,
'flavor': 'python' })
return 'https://www.debuggex.com/?' + q
def debuggexURL(sample, regex, multiline=False, useDns="yes"):
args = {
're': Regex._resolveHostTag(regex, useDns=useDns),
'str': sample,
'flavor': 'python'
}
if multiline: args['flags'] = 'm'
return 'https://www.debuggex.com/?' + urllib.urlencode(args)
def output(args):
def output(args): # pragma: no cover (overridden in test-cases)
print(args)
def shortstr(s, l=53):
@ -80,7 +83,7 @@ def pprint_list(l, header=None):
s = ''
output( s + "| " + "\n| ".join(l) + '\n`-' )
def journal_lines_gen(myjournal): # pragma: no cover
def journal_lines_gen(flt, myjournal): # pragma: no cover
while True:
try:
entry = myjournal.get_next()
@ -88,7 +91,7 @@ def journal_lines_gen(myjournal): # pragma: no cover
continue
if not entry:
break
yield FilterSystemd.formatJournalEntry(entry)
yield flt.formatJournalEntry(entry)
def get_opt_parser():
# use module docstring for help output
@ -120,6 +123,8 @@ Report bugs to https://github.com/fail2ban/fail2ban/issues
version="%prog " + version)
p.add_options([
Option("-c", "--config", default='/etc/fail2ban',
help="set alternate config directory"),
Option("-d", "--datepattern",
help="set custom pattern used to match date/times"),
Option("-e", "--encoding", default=PREFER_ENC,
@ -196,15 +201,17 @@ class RegexStat(object):
class LineStats(object):
"""Just a convenience container for stats
"""
def __init__(self):
def __init__(self, opts):
self.tested = self.matched = 0
self.matched_lines = []
self.missed = 0
self.missed_lines = []
self.missed_lines_timeextracted = []
self.ignored = 0
self.ignored_lines = []
self.ignored_lines_timeextracted = []
if opts.debuggex:
self.matched_lines_timeextracted = []
self.missed_lines_timeextracted = []
self.ignored_lines_timeextracted = []
def __str__(self):
return "%(tested)d lines, %(ignored)d ignored, %(matched)d matched, %(missed)d missed" % self
@ -228,14 +235,14 @@ class Fail2banRegex(object):
self._ignoreregex = list()
self._failregex = list()
self._time_elapsed = None
self._line_stats = LineStats()
self._line_stats = LineStats(opts)
if opts.maxlines:
self.setMaxLines(opts.maxlines)
else:
self._maxlines = 20
if opts.journalmatch is not None:
self.setJournalMatch(opts.journalmatch.split())
self.setJournalMatch(shlex.split(opts.journalmatch))
if opts.datepattern:
self.setDatePattern(opts.datepattern)
if opts.usedns:
@ -243,6 +250,7 @@ class Fail2banRegex(object):
self._filter.returnRawHost = opts.raw
self._filter.checkFindTime = False
self._filter.checkAllRegex = True
self._opts = opts
def decode_line(self, line):
return FileContainer.decode_line('<LOG>', self._encoding, line)
@ -265,69 +273,117 @@ class Fail2banRegex(object):
output( "Use maxlines : %d" % self._filter.getMaxLines() )
def setJournalMatch(self, v):
if self._journalmatch is None:
self._journalmatch = v
self._journalmatch = v
def readRegex(self, value, regextype):
assert(regextype in ('fail', 'ignore'))
regex = regextype + 'regex'
if os.path.isfile(value) or os.path.isfile(value + '.conf'):
if os.path.basename(os.path.dirname(value)) == 'filter.d':
# try to check whether we have the case filter[options...]:
basedir = self._opts.config
fltFile = None
fltOpt = {}
if regextype == 'fail':
fltName, fltOpt = JailReader.extractOptions(value)
if fltName is not None:
if "." in fltName[~5:]:
tryNames = (fltName,)
else:
tryNames = (fltName, fltName + '.conf', fltName + '.local')
for fltFile in tryNames:
if not "/" in fltFile:
if os.path.basename(basedir) == 'filter.d':
fltFile = os.path.join(basedir, fltFile)
else:
fltFile = os.path.join(basedir, 'filter.d', fltFile)
else:
basedir = os.path.dirname(fltFile)
if os.path.isfile(fltFile):
break
fltFile = None
# if it is filter file:
if fltFile is not None:
if (basedir == self._opts.config
or os.path.basename(basedir) == 'filter.d'
or ("." not in fltName[~5:] and "/" not in fltName)
):
## within filter.d folder - use standard loading algorithm to load filter completely (with .local etc.):
basedir = os.path.dirname(os.path.dirname(value))
value = os.path.splitext(os.path.basename(value))[0]
output( "Use %11s filter file : %s, basedir: %s" % (regex, value, basedir) )
reader = FilterReader(value, 'fail2ban-regex-jail', {}, share_config=self.share_config, basedir=basedir)
if not reader.read():
output( "ERROR: failed to load filter %s" % value )
return False
if os.path.basename(basedir) == 'filter.d':
basedir = os.path.dirname(basedir)
fltName = os.path.splitext(os.path.basename(fltName))[0]
output( "Use %11s filter file : %s, basedir: %s" % (regex, fltName, basedir) )
else:
## foreign file - readexplicit this file and includes if possible:
output( "Use %11s file : %s" % (regex, value) )
reader = FilterReader(value, 'fail2ban-regex-jail', {}, share_config=self.share_config)
reader.setBaseDir(None)
if not reader.readexplicit():
output( "ERROR: failed to read %s" % value )
return False
output( "Use %11s file : %s" % (regex, fltName) )
basedir = None
if fltOpt:
output( "Use filter options : %r" % fltOpt )
reader = FilterReader(fltName, 'fail2ban-regex-jail', fltOpt, share_config=self.share_config, basedir=basedir)
ret = None
try:
if basedir is not None:
ret = reader.read()
else:
## foreign file - readexplicit this file and includes if possible:
reader.setBaseDir(None)
ret = reader.readexplicit()
except Exception as e:
output("Wrong config file: %s" % (str(e),))
if self._verbose: raise(e)
if not ret:
output( "ERROR: failed to load filter %s" % value )
return False
reader.getOptions(None)
readercommands = reader.convert()
regex_values = [
RegexStat(m[3])
for m in filter(
lambda x: x[0] == 'set' and x[2] == "add%sregex" % regextype,
readercommands)
] + [
RegexStat(m)
for mm in filter(
lambda x: x[0] == 'multi-set' and x[2] == "add%sregex" % regextype,
readercommands)
for m in mm[3]
]
# Read out and set possible value of maxlines
for command in readercommands:
if command[2] == "maxlines":
maxlines = int(command[3])
try:
self.setMaxLines(maxlines)
except ValueError:
output( "ERROR: Invalid value for maxlines (%(maxlines)r) " \
"read from %(value)s" % locals() )
return False
elif command[2] == 'addjournalmatch':
journalmatch = command[3:]
self.setJournalMatch(journalmatch)
elif command[2] == 'datepattern':
datepattern = command[3]
self.setDatePattern(datepattern)
regex_values = {}
for opt in readercommands:
if opt[0] == 'multi-set':
optval = opt[3]
elif opt[0] == 'set':
optval = opt[3:]
else: # pragma: no cover
continue
try:
if opt[2] == "prefregex":
for optval in optval:
self._filter.prefRegex = optval
elif opt[2] == "addfailregex":
stor = regex_values.get('fail')
if not stor: stor = regex_values['fail'] = list()
for optval in optval:
stor.append(RegexStat(optval))
#self._filter.addFailRegex(optval)
elif opt[2] == "addignoreregex":
stor = regex_values.get('ignore')
if not stor: stor = regex_values['ignore'] = list()
for optval in optval:
stor.append(RegexStat(optval))
#self._filter.addIgnoreRegex(optval)
elif opt[2] == "maxlines":
for optval in optval:
self.setMaxLines(optval)
elif opt[2] == "datepattern":
for optval in optval:
self.setDatePattern(optval)
elif opt[2] == "addjournalmatch": # pragma: no cover
if self._opts.journalmatch is None:
self.setJournalMatch(optval)
except ValueError as e: # pragma: no cover
output( "ERROR: Invalid value for %s (%r) " \
"read from %s: %s" % (opt[2], optval, value, e) )
return False
else:
output( "Use %11s line : %s" % (regex, shortstr(value)) )
regex_values = [RegexStat(value)]
regex_values = {regextype: [RegexStat(value)]}
setattr(self, "_" + regex, regex_values)
for regex in regex_values:
getattr(
self._filter,
'add%sRegex' % regextype.title())(regex.getFailRegex())
for regextype, regex_values in regex_values.iteritems():
regex = regextype + 'regex'
setattr(self, "_" + regex, regex_values)
for regex in regex_values:
getattr(
self._filter,
'add%sRegex' % regextype.title())(regex.getFailRegex())
return True
def testIgnoreRegex(self, line):
@ -337,7 +393,7 @@ class Fail2banRegex(object):
if ret is not None:
found = True
regex = self._ignoreregex[ret].inc()
except RegexException as e:
except RegexException as e: # pragma: no cover
output( 'ERROR: %s' % e )
return False
return found
@ -347,6 +403,7 @@ class Fail2banRegex(object):
fullBuffer = len(orgLineBuffer) >= self._filter.getMaxLines()
try:
ret = self._filter.processLine(line, date)
lines = []
line = self._filter.processedLine()
for match in ret:
# Append True/False flag depending if line was matched by
@ -355,7 +412,7 @@ class Fail2banRegex(object):
regex = self._failregex[match[0]]
regex.inc()
regex.appendIP(match)
except RegexException as e:
except RegexException as e: # pragma: no cover
output( 'ERROR: %s' % e )
return False
for bufLine in orgLineBuffer[int(fullBuffer):]:
@ -363,14 +420,23 @@ class Fail2banRegex(object):
try:
self._line_stats.missed_lines.pop(
self._line_stats.missed_lines.index("".join(bufLine)))
self._line_stats.missed_lines_timeextracted.pop(
self._line_stats.missed_lines_timeextracted.index(
"".join(bufLine[::2])))
if self._debuggex:
self._line_stats.missed_lines_timeextracted.pop(
self._line_stats.missed_lines_timeextracted.index(
"".join(bufLine[::2])))
except ValueError:
pass
else:
self._line_stats.matched += 1
self._line_stats.missed -= 1
# if buffering - also add the other lines from the match:
if self._print_all_matched:
if not self._debuggex:
self._line_stats.matched_lines.append("".join(bufLine))
else:
lines.append(bufLine[0] + bufLine[2])
self._line_stats.matched += 1
self._line_stats.missed -= 1
if lines: # pre-lines parsed in multiline mode (buffering)
lines.append(line)
line = "\n".join(lines)
return line, ret
def process(self, test_lines):
@ -392,19 +458,23 @@ class Fail2banRegex(object):
self._line_stats.ignored += 1
if not self._print_no_ignored and (self._print_all_ignored or self._line_stats.ignored <= self._maxlines + 1):
self._line_stats.ignored_lines.append(line)
self._line_stats.ignored_lines_timeextracted.append(line_datetimestripped)
if self._debuggex:
self._line_stats.ignored_lines_timeextracted.append(line_datetimestripped)
if len(ret) > 0:
assert(not is_ignored)
self._line_stats.matched += 1
if self._print_all_matched:
self._line_stats.matched_lines.append(line)
if self._debuggex:
self._line_stats.matched_lines_timeextracted.append(line_datetimestripped)
else:
if not is_ignored:
self._line_stats.missed += 1
if not self._print_no_missed and (self._print_all_missed or self._line_stats.missed <= self._maxlines + 1):
self._line_stats.missed_lines.append(line)
self._line_stats.missed_lines_timeextracted.append(line_datetimestripped)
if self._debuggex:
self._line_stats.missed_lines_timeextracted.append(line_datetimestripped)
self._line_stats.tested += 1
self._time_elapsed = time.time() - t0
@ -414,6 +484,7 @@ class Fail2banRegex(object):
assert(self._line_stats.missed == lstats.tested - (lstats.matched + lstats.ignored))
lines = lstats[ltype]
l = lstats[ltype + '_lines']
multiline = self._filter.getMaxLines() > 1
if lines:
header = "%s line(s):" % (ltype.capitalize(),)
if self._debuggex:
@ -427,7 +498,8 @@ class Fail2banRegex(object):
for arg in [l, regexlist]:
ans = [ x + [y] for x in ans for y in arg ]
b = map(lambda a: a[0] + ' | ' + a[1].getFailRegex() + ' | ' +
debuggexURL(self.encode_line(a[0]), a[1].getFailRegex()), ans)
debuggexURL(self.encode_line(a[0]), a[1].getFailRegex(),
multiline, self._opts.usedns), ans)
pprint_list([x.rstrip() for x in b], header)
else:
output( "%s too many to print. Use --print-all-%s " \
@ -502,14 +574,14 @@ class Fail2banRegex(object):
for line in hdlr:
yield self.decode_line(line)
def start(self, opts, args):
def start(self, args):
cmd_log, cmd_regex = args[:2]
try:
if not self.readRegex(cmd_regex, 'fail'):
if not self.readRegex(cmd_regex, 'fail'): # pragma: no cover
return False
if len(args) == 3 and not self.readRegex(args[2], 'ignore'):
if len(args) == 3 and not self.readRegex(args[2], 'ignore'): # pragma: no cover
return False
except RegexException as e:
output( 'ERROR: %s' % e )
@ -521,31 +593,39 @@ class Fail2banRegex(object):
output( "Use log file : %s" % cmd_log )
output( "Use encoding : %s" % self._encoding )
test_lines = self.file_lines_gen(hdlr)
except IOError as e:
except IOError as e: # pragma: no cover
output( e )
return False
elif cmd_log == "systemd-journal": # pragma: no cover
if not journal:
elif cmd_log.startswith("systemd-journal"): # pragma: no cover
if not FilterSystemd:
output( "Error: systemd library not found. Exiting..." )
return False
myjournal = journal.Reader(converters={'__CURSOR': lambda x: x})
output( "Use systemd journal" )
output( "Use encoding : %s" % self._encoding )
backend, beArgs = JailReader.extractOptions(cmd_log)
flt = FilterSystemd(None, **beArgs)
flt.setLogEncoding(self._encoding)
myjournal = flt.getJournalReader()
journalmatch = self._journalmatch
self.setDatePattern(None)
if journalmatch:
try:
for element in journalmatch:
if element == "+":
myjournal.add_disjunction()
else:
myjournal.add_match(element)
except ValueError:
output( "Error: Invalid journalmatch: %s" % shortstr(" ".join(journalmatch)) )
return False
flt.addJournalMatch(journalmatch)
output( "Use journal match : %s" % " ".join(journalmatch) )
test_lines = journal_lines_gen(myjournal)
test_lines = journal_lines_gen(flt, myjournal)
else:
output( "Use single line : %s" % shortstr(cmd_log) )
test_lines = [ cmd_log ]
# if single line parsing (without buffering)
if self._filter.getMaxLines() <= 1:
output( "Use single line : %s" % shortstr(cmd_log.replace("\n", r"\n")) )
test_lines = [ cmd_log ]
else: # multi line parsing (with buffering)
test_lines = cmd_log.split("\n")
output( "Use multi line : %s line(s)" % len(test_lines) )
for i, l in enumerate(test_lines):
if i >= 5:
output( "| ..." ); break
output( "| %2.2s: %s" % (i+1, shortstr(l)) )
output( "`-" )
output( "" )
self.process(test_lines)
@ -598,5 +678,5 @@ def exec_command_line(*args):
logSys.addHandler(stdout)
fail2banRegex = Fail2banRegex(opts)
if not fail2banRegex.start(opts, args):
if not fail2banRegex.start(args):
sys.exit(-1)

View File

@ -27,8 +27,7 @@ __license__ = "GPL"
import os
import shlex
from .configreader import DefinitionInitConfigReader, _merge_dicts
from ..server.action import CommandAction
from .configreader import DefinitionInitConfigReader
from ..helpers import getLogger
# Gets the instance of the logger.
@ -38,6 +37,7 @@ logSys = getLogger(__name__)
class FilterReader(DefinitionInitConfigReader):
_configOpts = {
"prefregex": ["string", None],
"ignoreregex": ["string", None],
"failregex": ["string", ""],
"maxlines": ["int", None],
@ -52,17 +52,6 @@ class FilterReader(DefinitionInitConfigReader):
def getFile(self):
return self.__file
def getCombined(self):
combinedopts = self._opts
if self._initOpts:
combinedopts = _merge_dicts(self._opts, self._initOpts)
if not len(combinedopts):
return {}
opts = CommandAction.substituteRecursiveTags(combinedopts)
if not opts:
raise ValueError('recursive tag definitions unable to be resolved')
return opts
def convert(self):
stream = list()
opts = self.getCombined()
@ -70,6 +59,7 @@ class FilterReader(DefinitionInitConfigReader):
return stream
for opt, value in opts.iteritems():
if opt in ("failregex", "ignoreregex"):
if value is None: continue
multi = []
for regex in value.split('\n'):
# Do not send a command if the rule is empty.
@ -79,14 +69,14 @@ class FilterReader(DefinitionInitConfigReader):
stream.append(["multi-set", self._jailName, "add" + opt, multi])
elif len(multi):
stream.append(["set", self._jailName, "add" + opt, multi[0]])
elif opt == 'maxlines':
# We warn when multiline regex is used without maxlines > 1
# therefore keep sure we set this option first.
stream.insert(0, ["set", self._jailName, "maxlines", value])
elif opt == 'datepattern':
stream.append(["set", self._jailName, "datepattern", value])
elif opt in ('maxlines', 'prefregex'):
# Be sure we set these options first.
stream.insert(0, ["set", self._jailName, opt, value])
elif opt == 'datepattern':
stream.append(["set", self._jailName, opt, value])
# Do not send a command if the match is empty.
elif opt == 'journalmatch':
if value is None: continue
for match in value.split("\n"):
if match == '': continue
stream.append(

View File

@ -43,7 +43,7 @@ logSys = getLogger(__name__)
class JailReader(ConfigReader):
# regex, to extract list of options:
optionCRE = re.compile(r"^([\w\-_\.]+)(?:\[(.*)\])?\s*$", re.DOTALL)
optionCRE = re.compile(r"^([^\[]+)(?:\[(.*)\])?\s*$", re.DOTALL)
# regex, to iterate over a single option in the option list, syntax:
# `action = act[p1="...", p2='...', p3=...]`, where p3=... does not contain `,` or ']'
# since v0.10 the separator was extended with `]\s*[` to support multiple option groups, syntax
@ -110,6 +110,7 @@ class JailReader(ConfigReader):
["string", "failregex", None],
["string", "ignoreregex", None],
["string", "ignorecommand", None],
["bool", "ignoreself", None],
["string", "ignoreip", None],
["string", "filter", ""],
["string", "datepattern", None],
@ -136,13 +137,14 @@ class JailReader(ConfigReader):
if not filterName:
raise JailDefError("Invalid filter definition %r" % flt)
self.__filter = FilterReader(
filterName, self.__name, filterOpt, share_config=self.share_config, basedir=self.getBaseDir())
filterName, self.__name, filterOpt,
share_config=self.share_config, basedir=self.getBaseDir())
ret = self.__filter.read()
# merge options from filter as 'known/...':
self.__filter.getOptions(self.__opts)
ConfigReader.merge_section(self, self.__name, self.__filter.getCombined(), 'known/')
if not ret:
raise JailDefError("Unable to read the filter %r" % filterName)
# merge options from filter as 'known/...' (all options unfiltered):
self.__filter.getOptions(self.__opts, all=True)
ConfigReader.merge_section(self, self.__name, self.__filter.getCombined(), 'known/')
else:
self.__filter = None
logSys.warning("No filter set for jail %s" % self.__name)
@ -219,8 +221,8 @@ class JailReader(ConfigReader):
if self.__filter:
stream.extend(self.__filter.convert())
for opt, value in self.__opts.iteritems():
if opt == "logpath" and \
not self.__opts.get('backend', None).startswith("systemd"):
if opt == "logpath":
if self.__opts.get('backend', None).startswith("systemd"): continue
found_files = 0
for path in value.split("\n"):
path = path.rsplit(" ", 1)

View File

@ -169,6 +169,36 @@ def splitwords(s):
return []
return filter(bool, map(str.strip, re.split('[ ,\n]+', s)))
if sys.version_info >= (3,5):
eval(compile(r'''if 1:
def _merge_dicts(x, y):
"""Helper to merge dicts.
"""
if y:
return {**x, **y}
return x
def _merge_copy_dicts(x, y):
"""Helper to merge dicts to guarantee a copy result (r is never x).
"""
return {**x, **y}
''', __file__, 'exec'))
else:
def _merge_dicts(x, y):
"""Helper to merge dicts.
"""
r = x
if y:
r = x.copy()
r.update(y)
return r
def _merge_copy_dicts(x, y):
"""Helper to merge dicts to guarantee a copy result (r is never x).
"""
r = x.copy()
if y:
r.update(y)
return r
#
# Following "uni_decode" function unified python independent any to string converting
@ -200,6 +230,113 @@ else:
raise
return uni_decode(x, enc, 'replace')
#
# Following facilities used for safe recursive interpolation of
# tags (<tag>) in tagged options.
#
# max tag replacement count:
MAX_TAG_REPLACE_COUNT = 10
# compiled RE for tag name (replacement name)
TAG_CRE = re.compile(r'<([^ <>]+)>')
def substituteRecursiveTags(inptags, conditional='',
ignore=(), addrepl=None
):
"""Sort out tag definitions within other tags.
Since v0.9.2 it supports embedded interpolation (see test cases for examples).
so:            becomes:
a = 3          a = 3
b = <a>_3      b = 3_3
Parameters
----------
inptags : dict
Dictionary of tags(keys) and their values.
Returns
-------
dict
Dictionary of tags(keys) and their values, with tags
within the values recursively replaced.
"""
#logSys = getLogger("fail2ban")
tre_search = TAG_CRE.search
# copy return tags dict to prevent modifying of inptags:
tags = inptags.copy()
# init:
ignore = set(ignore)
done = set()
noRecRepl = hasattr(tags, "getRawItem")
# repeat substitution while embedded-recursive (repFlag is True)
while True:
repFlag = False
# substitute each value:
for tag in tags.iterkeys():
# ignore escaped or already done (or in ignore list):
if tag in ignore or tag in done: continue
# ignore replacing callable items from calling map - should be converted on demand only (by get):
if noRecRepl and callable(tags.getRawItem(tag)): continue
value = orgval = str(tags[tag])
# search and replace all tags within value, that can be interpolated using other tags:
m = tre_search(value)
refCounts = {}
#logSys.log(5, 'TAG: %s, value: %s' % (tag, value))
while m:
# found replacement tag:
rtag = m.group(1)
# don't replace tags that should be currently ignored (pre-replacement):
if rtag in ignore:
m = tre_search(value, m.end())
continue
#logSys.log(5, 'found: %s' % rtag)
if rtag == tag or refCounts.get(rtag, 1) > MAX_TAG_REPLACE_COUNT:
# recursive definitions are bad
#logSys.log(5, 'recursion fail tag: %s value: %s' % (tag, value) )
raise ValueError(
"properties contain self referencing definitions "
"and cannot be resolved, fail tag: %s, found: %s in %s, value: %s" %
(tag, rtag, refCounts, value))
repl = None
if conditional:
repl = tags.get(rtag + '?' + conditional)
if repl is None:
repl = tags.get(rtag)
# try to find tag using additional replacement (callable):
if repl is None and addrepl is not None:
repl = addrepl(rtag)
if repl is None:
# Missing tags - just continue on searching after end of match
# Missing tags are ok - cInfo can contain aInfo elements like <HOST> and valid shell
# constructs like <STDIN>.
m = tre_search(value, m.end())
continue
# if calling map - be sure we have a string:
if noRecRepl: repl = str(repl)
value = value.replace('<%s>' % rtag, repl)
#logSys.log(5, 'value now: %s' % value)
# increment reference count:
refCounts[rtag] = refCounts.get(rtag, 0) + 1
# the next match for replace:
m = tre_search(value, m.start())
#logSys.log(5, 'TAG: %s, newvalue: %s' % (tag, value))
# was substituted?
if orgval != value:
# check still contains any tag - should be repeated (possible embedded-recursive substitution):
if tre_search(value):
repFlag = True
tags[tag] = value
# no more sub tags (and no possible composite), add this tag to done set (just to be faster):
if '<' not in value: done.add(tag)
# stop interpolation, if no replacements anymore:
if not repFlag:
break
return tags
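For illustration, a minimal usage sketch of the relocated helper (assumes the fail2ban package is importable; the tag names are invented):

# illustrative only - embedded tags are resolved against the other tags in the same dict:
from fail2ban.helpers import substituteRecursiveTags

tags = substituteRecursiveTags({'a': '3', 'b': '<a>_3', 'c': '<b>-<a>'})
print("%s %s" % (tags['b'], tags['c']))   # -> 3_3 3_3-3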
class BgService(object):
"""Background servicing

View File

@ -81,6 +81,7 @@ protocol = [
["status <JAIL> [FLAVOR]", "gets the current status of <JAIL>, with optional flavor or extended info"],
['', "JAIL CONFIGURATION", ""],
["set <JAIL> idle on|off", "sets the idle state of <JAIL>"],
["set <JAIL> ignoreself true|false", "allows the ignoring of own IP addresses"],
["set <JAIL> addignoreip <IP>", "adds <IP> to the ignore list of <JAIL>"],
["set <JAIL> delignoreip <IP>", "removes <IP> from the ignore list of <JAIL>"],
["set <JAIL> addlogpath <FILE> ['tail']", "adds <FILE> to the monitoring list of <JAIL>, optionally starting at the 'tail' of the file (default 'head')."],
@ -117,6 +118,7 @@ protocol = [
["get <JAIL> logpath", "gets the list of the monitored files for <JAIL>"],
["get <JAIL> logencoding", "gets the encoding of the log files for <JAIL>"],
["get <JAIL> journalmatch", "gets the journal filter match for <JAIL>"],
["get <JAIL> ignoreself", "gets the current value of the ignoring the own IP addresses"],
["get <JAIL> ignoreip", "gets the list of ignored IP addresses for <JAIL>"],
["get <JAIL> ignorecommand", "gets ignorecommand of <JAIL>"],
["get <JAIL> failregex", "gets the list of regular expressions which matches the failures for <JAIL>"],

View File

@ -32,10 +32,11 @@ import time
from abc import ABCMeta
from collections import MutableMapping
from .failregex import mapTag2Opt
from .ipdns import asip
from .mytime import MyTime
from .utils import Utils
from ..helpers import getLogger
from ..helpers import getLogger, _merge_copy_dicts, substituteRecursiveTags, TAG_CRE, MAX_TAG_REPLACE_COUNT
# Gets the instance of the logger.
logSys = getLogger(__name__)
@ -46,14 +47,19 @@ _cmd_lock = threading.Lock()
# Todo: make it configurable resp. automatically set, ex.: `[ -f /proc/net/if_inet6 ] && echo 'yes' || echo 'no'`:
allowed_ipv6 = True
# max tag replacement count:
MAX_TAG_REPLACE_COUNT = 10
# capture groups from filter for map to ticket data:
FCUSTAG_CRE = re.compile(r'<F-([A-Z0-9_\-]+)>'); # currently uppercase only
# compiled RE for tag name (replacement name)
TAG_CRE = re.compile(r'<([^ <>]+)>')
CONDITIONAL_FAM_RE = re.compile(r"^(\w+)\?(family)=")
# New line, space
ADD_REPL_TAGS = {
"br": "\n",
"sp": " "
}
class CallingMap(MutableMapping):
class CallingMap(MutableMapping, object):
"""A Mapping type which returns the result of callable values.
`CallingMap` behaves similar to a standard python dictionary,
@ -70,23 +76,71 @@ class CallingMap(MutableMapping):
The dictionary data which can be accessed to obtain items uncalled
"""
# immutable=True saves content between actions, without interim copying (save original on demand, recoverable via reset)
__slots__ = ('data', 'storage', 'immutable', '__org_data')
def __init__(self, *args, **kwargs):
self.storage = dict()
self.immutable = True
self.data = dict(*args, **kwargs)
def reset(self, immutable=True):
self.storage = dict()
try:
self.data = self.__org_data
except AttributeError:
pass
self.immutable = immutable
def __repr__(self):
return "%s(%r)" % (self.__class__.__name__, self.data)
return "%s(%r)" % (self.__class__.__name__, self._asdict())
def _asdict(self):
try:
return dict(self)
except:
return dict(self.data, **self.storage)
def getRawItem(self, key):
try:
value = self.storage[key]
except KeyError:
value = self.data[key]
return value
def __getitem__(self, key):
value = self.data[key]
try:
value = self.storage[key]
except KeyError:
value = self.data[key]
if callable(value):
return value()
else:
return value
# check arguments can be supplied to callable (for backwards compatibility):
value = value(self) if hasattr(value, '__code__') and value.__code__.co_argcount else value()
self.storage[key] = value
return value
def __setitem__(self, key, value):
self.data[key] = value
# mutate to copy:
if self.immutable:
self.storage = self.storage.copy()
self.__org_data = self.data
self.data = self.data.copy()
self.immutable = False
self.storage[key] = value
def __unavailable(self, key):
raise KeyError("Key %r was deleted" % key)
def __delitem__(self, key):
# mutate to copy:
if self.immutable:
self.storage = self.storage.copy()
self.__org_data = self.data
self.data = self.data.copy()
self.immutable = False
try:
del self.storage[key]
except KeyError:
pass
del self.data[key]
def __iter__(self):
@ -95,8 +149,8 @@ class CallingMap(MutableMapping):
def __len__(self):
return len(self.data)
def copy(self):
return self.__class__(self.data.copy())
def copy(self): # pragma: no cover
return self.__class__(_merge_copy_dicts(self.data, self.storage))
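A rough sketch of the lazy evaluation and result caching this mapping now provides (key names and values are invented):

# illustrative only - callable values are invoked on first access and cached in `storage`:
from fail2ban.server.action import CallingMap

cm = CallingMap(ip='192.0.2.1', failures=lambda: 3 + 2)
print(cm['ip'])         # plain values are returned as-is
print(cm['failures'])   # the lambda is called once, its result (5) gets cached
print(cm.storage)       # -> {'failures': 5}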
class ActionBase(object):
@ -149,17 +203,17 @@ class ActionBase(object):
self._name = name
self._logSys = getLogger("fail2ban.%s" % self.__class__.__name__)
def start(self):
def start(self): # pragma: no cover - abstract
"""Executed when the jail/action is started.
"""
pass
def stop(self):
def stop(self): # pragma: no cover - abstract
"""Executed when the jail/action is stopped.
"""
pass
def ban(self, aInfo):
def ban(self, aInfo): # pragma: no cover - abstract
"""Executed when a ban occurs.
Parameters
@ -170,7 +224,7 @@ class ActionBase(object):
"""
pass
def unban(self, aInfo):
def unban(self, aInfo): # pragma: no cover - abstract
"""Executed when a ban expires.
Parameters
@ -219,14 +273,16 @@ class CommandAction(ActionBase):
self.timeout = 60
## Command executed in order to initialize the system.
self.actionstart = ''
## Command executed when an IP address gets banned.
## Command executed when a ticket gets banned.
self.actionban = ''
## Command executed when an IP address gets removed.
## Command executed when a ticket gets removed.
self.actionunban = ''
## Command executed in order to check requirements.
self.actioncheck = ''
## Command executed in order to restore sane environment in error case.
self.actionrepair = ''
## Command executed in order to flush all bans at once (e. g. at stop/shutdown of the system).
self.actionflush = ''
## Command executed in order to stop the system.
self.actionstop = ''
## Command executed in case of reloading action.
@ -238,6 +294,7 @@ class CommandAction(ActionBase):
super(CommandAction, self).__init__(jail, name)
self.__init = 1
self.__properties = None
self.__started = {}
self.__substCache = {}
self.clearAllParams()
self._logSys.debug("Created %s" % self.__class__)
@ -259,6 +316,16 @@ class CommandAction(ActionBase):
# set:
self.__dict__[name] = value
def __delattr__(self, name):
if not name.startswith('_'):
# parameters changed - clear properties and substitution cache:
self.__properties = None
self.__substCache.clear()
#self._logSys.debug("Unset action %r %s", self._name, name)
self._logSys.debug(" Unset %s", name)
# del:
del self.__dict__[name]
@property
def _properties(self):
"""A dictionary of the actions properties.
@ -280,7 +347,11 @@ class CommandAction(ActionBase):
def _substCache(self):
return self.__substCache
def _executeOperation(self, tag, operation):
def _getOperation(self, tag, family):
return self.replaceTag(tag, self._properties,
conditional=('family=' + family), cache=self.__substCache)
def _executeOperation(self, tag, operation, family=[]):
"""Executes the operation commands (like "actionstart", "actionstop", etc).
Replace the tags in the action command with actions properties
@ -290,14 +361,14 @@ class CommandAction(ActionBase):
res = True
try:
# common (resp. ipv4):
startCmd = self.replaceTag(tag, self._properties,
conditional='family=inet4', cache=self.__substCache)
if startCmd:
res &= self.executeCmd(startCmd, self.timeout)
startCmd = None
if not family or 'inet4' in family:
startCmd = self._getOperation(tag, 'inet4')
if startCmd:
res &= self.executeCmd(startCmd, self.timeout)
# start ipv6 actions if available:
if allowed_ipv6:
startCmd6 = self.replaceTag(tag, self._properties,
conditional='family=inet6', cache=self.__substCache)
if allowed_ipv6 and (not family or 'inet6' in family):
startCmd6 = self._getOperation(tag, 'inet6')
if startCmd6 and startCmd6 != startCmd:
res &= self.executeCmd(startCmd6, self.timeout)
if not res:
@ -305,13 +376,34 @@ class CommandAction(ActionBase):
except ValueError as e:
raise RuntimeError("Error %s action %s/%s: %r" % (operation, self._jail, self._name, e))
def start(self):
COND_FAMILIES = {'inet4':1, 'inet6':1}
@property
def _startOnDemand(self):
"""Checks the action depends on family (conditional)"""
v = self._properties.get('actionstart_on_demand')
if v is None:
v = False
for n in self._properties:
if CONDITIONAL_FAM_RE.match(n):
v = True
break
self._properties['actionstart_on_demand'] = v
return v
def start(self, family=[]):
"""Executes the "actionstart" command.
Replace the tags in the action command with actions properties
and executes the resulting command.
"""
return self._executeOperation('<actionstart>', 'starting')
if not family:
# check the action depends on family (conditional):
if self._startOnDemand:
return True
elif self.__started.get(family): # pragma: no cover - normally unreachable
return True
return self._executeOperation('<actionstart>', 'starting', family=family)
def ban(self, aInfo):
"""Executes the "actionban" command.
@ -325,6 +417,20 @@ class CommandAction(ActionBase):
Dictionary which includes information in relation to
the ban.
"""
# if we should start the action on demand (conditional by family):
if self._startOnDemand:
family = aInfo.get('family')
if not self.__started.get(family):
self.start(family)
self.__started[family] = 1
# also mark other families as "started" (-1), if they are equal
# (on demand, but the same for ipv4 and ipv6):
cmd = self._getOperation('<actionstart>', family)
for f in CommandAction.COND_FAMILIES:
if f != family and not self.__started.get(f):
if cmd == self._getOperation('<actionstart>', f):
self.__started[f] = -1
# ban:
if not self._processCmd('<actionban>', aInfo):
raise RuntimeError("Error banning %(ip)s" % aInfo)
@ -343,13 +449,41 @@ class CommandAction(ActionBase):
if not self._processCmd('<actionunban>', aInfo):
raise RuntimeError("Error unbanning %(ip)s" % aInfo)
def flush(self):
"""Executes the "actionflush" command.
Command executed in order to flush all bans at once (e. g. at stop/shutdown
of the system), instead of unbanning each single ticket.
Replaces the tags in the action command with actions properties
and executes the resulting command.
"""
family = []
# cumulate started families, if started on demand (conditional):
if self._startOnDemand:
for f in CommandAction.COND_FAMILIES:
if self.__started.get(f) == 1: # only real started:
family.append(f)
# if no started (on demand) actions:
if not family: return True
return self._executeOperation('<actionflush>', 'flushing', family=family)
def stop(self):
"""Executes the "actionstop" command.
Replaces the tags in the action command with actions properties
and executes the resulting command.
"""
return self._executeOperation('<actionstop>', 'stopping')
family = []
# cumulate started families, if started on demand (conditional):
if self._startOnDemand:
for f in CommandAction.COND_FAMILIES:
if self.__started.get(f) == 1: # only real started:
family.append(f)
self.__started[f] = 0
# if no started (on demand) actions:
if not family: return True
return self._executeOperation('<actionstop>', 'stopping', family=family)
def reload(self, **kwargs):
"""Executes the "actionreload" command.
@ -364,83 +498,6 @@ class CommandAction(ActionBase):
"""
return self._executeOperation('<actionreload>', 'reloading')
@classmethod
def substituteRecursiveTags(cls, inptags, conditional=''):
"""Sort out tag definitions within other tags.
Since v.0.9.2 supports embedded interpolation (see test cases for examples).
so: becomes:
a = 3 a = 3
b = <a>_3 b = 3_3
Parameters
----------
inptags : dict
Dictionary of tags(keys) and their values.
Returns
-------
dict
Dictionary of tags(keys) and their values, with tags
within the values recursively replaced.
"""
# copy return tags dict to prevent modifying of inptags:
tags = inptags.copy()
t = TAG_CRE
# repeat substitution while embedded-recursive (repFlag is True)
done = cls._escapedTags.copy()
while True:
repFlag = False
# substitute each value:
for tag in tags.iterkeys():
# ignore escaped or already done:
if tag in done: continue
value = str(tags[tag])
# search and replace all tags within value, that can be interpolated using other tags:
m = t.search(value)
refCounts = {}
#logSys.log(5, 'TAG: %s, value: %s' % (tag, value))
while m:
found_tag = m.group(1)
#logSys.log(5, 'found: %s' % found_tag)
if found_tag == tag or refCounts.get(found_tag, 1) > MAX_TAG_REPLACE_COUNT:
# recursive definitions are bad
#logSys.log(5, 'recursion fail tag: %s value: %s' % (tag, value) )
raise ValueError(
"properties contain self referencing definitions "
"and cannot be resolved, fail tag: %s, found: %s in %s, value: %s" %
(tag, found_tag, refCounts, value))
repl = None
if found_tag not in cls._escapedTags:
repl = tags.get(found_tag + '?' + conditional)
if repl is None:
repl = tags.get(found_tag)
if repl is None:
# Escaped or missing tags - just continue on searching after end of match
# Missing tags are ok - cInfo can contain aInfo elements like <HOST> and valid shell
# constructs like <STDIN>.
m = t.search(value, m.end())
continue
value = value.replace('<%s>' % found_tag, repl)
#logSys.log(5, 'value now: %s' % value)
# increment reference count:
refCounts[found_tag] = refCounts.get(found_tag, 0) + 1
# the next match for replace:
m = t.search(value, m.start())
#logSys.log(5, 'TAG: %s, newvalue: %s' % (tag, value))
# was substituted?
if tags[tag] != value:
# check still contains any tag - should be repeated (possible embedded-recursive substitution):
if t.search(value):
repFlag = True
tags[tag] = value
# no more sub tags (and no possible composite), add this tag to done set (just to be faster):
if '<' not in value: done.add(tag)
# stop interpolation, if no replacements anymore:
if not repFlag:
break
return tags
@staticmethod
def escapeTag(value):
"""Escape characters which may be used for command injection.
@ -483,33 +540,150 @@ class CommandAction(ActionBase):
str
`query` string with tags replaced.
"""
if '<' not in query: return query
# use cache if allowed:
if cache is not None:
ckey = (query, conditional)
string = cache.get(ckey)
if string is not None:
return string
# replace:
string = query
aInfo = cls.substituteRecursiveTags(aInfo, conditional)
for tag in aInfo:
if "<%s>" % tag in query:
value = aInfo.get(tag + '?' + conditional)
try:
return cache[ckey]
except KeyError:
pass
# **Important**: don't replace if calling map - it contains dynamic values only,
# no recursive tags, otherwise it may be vulnerable to foreign user input:
noRecRepl = isinstance(aInfo, CallingMap)
subInfo = aInfo
if not noRecRepl:
# substitute tags recursive (and cache if possible),
# first try get cached tags dictionary:
subInfo = csubkey = None
if cache is not None:
csubkey = ('subst-tags', id(aInfo), conditional)
try:
subInfo = cache[csubkey]
except KeyError:
pass
# interpolation of dictionary:
if subInfo is None:
subInfo = substituteRecursiveTags(aInfo, conditional, ignore=cls._escapedTags)
# cache if possible:
if csubkey is not None:
cache[csubkey] = subInfo
# substitution callable, used by interpolation of each tag
def substVal(m):
tag = m.group(1) # tagname from match
value = None
if conditional:
value = subInfo.get(tag + '?' + conditional)
if value is None:
value = subInfo.get(tag)
if value is None:
value = aInfo.get(tag)
value = str(value) # assure string
if tag in cls._escapedTags:
# That one needs to be escaped since its content is
# out of our control
value = cls.escapeTag(value)
string = string.replace('<' + tag + '>', value)
# New line, space
string = reduce(lambda s, kv: s.replace(*kv), (("<br>", '\n'), ("<sp>", " ")), string)
# cache if properties:
# fallback (no or default replacement)
return ADD_REPL_TAGS.get(tag, m.group())
value = str(value) # assure string
if tag in cls._escapedTags:
# That one needs to be escaped since its content is
# out of our control
value = cls.escapeTag(value)
# replacement for tag:
return value
# interpolation of query:
count = MAX_TAG_REPLACE_COUNT + 1
while True:
value = TAG_CRE.sub(substVal, query)
# **Important**: no recursive replacement for tags from calling map (properties only):
if noRecRepl: break
# possible recursion ?
if value == query or '<' not in value: break
query = value
count -= 1
if count <= 0:
raise ValueError(
"unexpected too long replacement interpolation, "
"possible self referencing definitions in query: %s" % (query,))
# cache if possible:
if cache is not None:
cache[ckey] = string
cache[ckey] = value
#
return string
return value
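A rough sketch of the resulting interpolation over static action properties (the property names and command are invented; unknown tags are intentionally left untouched for the later dynamic replacement):

# illustrative only (assumes the fail2ban package is importable):
from fail2ban.server.action import CommandAction

props = {'blocktype': 'REJECT',
         'banaction': 'iptables -I f2b-<name> 1 -s <ip> -j <blocktype>'}
print(CommandAction.replaceTag('<banaction>', props))
# -> iptables -I f2b-<name> 1 -s <ip> -j REJECT   (<name> and <ip> remain for dynamic tags)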
ESCAPE_CRE = re.compile(r"""[\\#&;`|*?~<>\^\(\)\[\]{}$'"\n\r]""")
ESCAPE_VN_CRE = re.compile(r"\W")
@classmethod
def replaceDynamicTags(cls, realCmd, aInfo):
"""Replaces dynamical tags in `query` with property values.
**Important**
-------------
Because this tags are dynamic resp. foreign (user) input:
- values should be escaped (using "escape" as shell variable)
- no recursive substitution (no interpolation for <a<b>>)
- don't use cache
Parameters
----------
query : str
String with tags.
aInfo : dict
Tags(keys) and associated values for substitution in query.
Returns
-------
str
shell script as string or array with tags replaced (direct or as variables).
"""
# dictionary for escaped variables:
varsDict = dict()
def escapeVal(tag, value):
# if the value should be escaped:
if cls.ESCAPE_CRE.search(value):
# That one needs to be escaped since its content is
# out of our control
tag = 'f2bV_%s' % cls.ESCAPE_VN_CRE.sub('_', tag)
varsDict[tag] = value # add variable
value = '$'+tag # replacement as variable
# replacement for tag:
return value
# substitution callable, used by interpolation of each tag
def substVal(m):
tag = m.group(1) # tagname from match
try:
value = aInfo[tag]
except KeyError:
# fallback (no or default replacement)
return ADD_REPL_TAGS.get(tag, m.group())
value = str(value) # assure string
# replacement for tag:
return escapeVal(tag, value)
# Replace normal properties of aInfo non-recursively:
realCmd = TAG_CRE.sub(substVal, realCmd)
# Replace ticket options (filter capture groups) non-recursively:
if '<' in realCmd:
tickData = aInfo.get("F-*")
if not tickData: tickData = {}
def substTag(m):
tag = mapTag2Opt(m.groups()[0])
try:
value = str(tickData[tag])
except KeyError:
return ""
return escapeVal("F_"+tag, value)
realCmd = FCUSTAG_CRE.sub(substTag, realCmd)
# build command corresponding "escaped" variables:
if varsDict:
realCmd = Utils.buildShellCmd(realCmd, varsDict)
return realCmd
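A rough sketch of the auto-escaping of dynamic (ticket) tags; command and values are invented, and the final command form is left to Utils.buildShellCmd:

# illustrative only - values containing shell meta-characters become escaped variables:
from fail2ban.server.action import CommandAction

cmd = CommandAction.replaceDynamicTags(
    "logger banned <ip>: <matches>",
    {'ip': '192.0.2.1', 'matches': 'root`reboot`'})
# <ip> is inlined as-is, <matches> is passed via the shell variable $f2bV_matches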
def _processCmd(self, cmd, aInfo=None, conditional=''):
"""Executes a command with preliminary checks and substitutions.
@ -575,9 +749,9 @@ class CommandAction(ActionBase):
realCmd = self.replaceTag(cmd, self._properties,
conditional=conditional, cache=self.__substCache)
# Replace tags
# Replace dynamic tags; important - don't cache, no recursion and auto-escape here
if aInfo is not None:
realCmd = self.replaceTag(realCmd, aInfo, conditional=conditional)
realCmd = self.replaceDynamicTags(realCmd, aInfo)
else:
realCmd = cmd
@ -611,8 +785,5 @@ class CommandAction(ActionBase):
logSys.debug("Nothing to do")
return True
_cmd_lock.acquire()
try:
with _cmd_lock:
return Utils.executeCmd(realCmd, timeout, shell=True, output=False, **kwargs)
finally:
_cmd_lock.release()

View File

@ -150,7 +150,7 @@ class Actions(JailThread, Mapping):
# reload actions after all parameters set via stream:
for name, initOpts in self._reload_actions.iteritems():
if name in self._actions:
self._actions[name].reload(**initOpts if initOpts else {})
self._actions[name].reload(**(initOpts if initOpts else {}))
# remove obsolete actions (untouched by reload process):
delacts = OrderedDict((name, action) for name, action in self._actions.iteritems()
if name not in self._reload_actions)
@ -286,44 +286,88 @@ class Actions(JailThread, Mapping):
self.stopActions()
return True
def __getBansMerged(self, mi, overalljails=False):
"""Gets bans merged once, a helper for lambda(s), prevents stop of executing action by any exception inside.
class ActionInfo(CallingMap):
This function never returns None for ainfo lambdas - always a ticket (merged or single one)
and prevents any errors through merging (to guarantee ban actions will be executed).
[TODO] move merging to observer - here we could wait for merge and read already merged info from a database
AI_DICT = {
"ip": lambda self: self.__ticket.getIP(),
"family": lambda self: self['ip'].familyStr,
"ip-rev": lambda self: self['ip'].getPTR(''),
"ip-host": lambda self: self['ip'].getHost(),
"fid": lambda self: self.__ticket.getID(),
"failures": lambda self: self.__ticket.getAttempt(),
"time": lambda self: self.__ticket.getTime(),
"matches": lambda self: "\n".join(self.__ticket.getMatches()),
# to bypass actions, that should not be executed for restored tickets
"restored": lambda self: (1 if self.__ticket.restored else 0),
# extra-interpolation - all match-tags (captured from the filter):
"F-*": lambda self, tag=None: self.__ticket.getData(tag),
# merged info:
"ipmatches": lambda self: "\n".join(self._mi4ip(True).getMatches()),
"ipjailmatches": lambda self: "\n".join(self._mi4ip().getMatches()),
"ipfailures": lambda self: self._mi4ip(True).getAttempt(),
"ipjailfailures": lambda self: self._mi4ip().getAttempt(),
}
Parameters
----------
mi : dict
merge info, initial for lambda should contains {ip, ticket}
overalljails : bool
switch to get a merged bans :
False - (default) bans merged for current jail only
True - bans merged for all jails of current ip address
__slots__ = CallingMap.__slots__ + ('__ticket', '__jail', '__mi4ip')
def __init__(self, ticket, jail=None, immutable=True, data=AI_DICT):
self.__ticket = ticket
self.__jail = jail
self.storage = dict()
self.immutable = immutable
self.data = data
def copy(self): # pragma: no cover
return self.__class__(self.__ticket, self.__jail, self.immutable, self.data.copy())
def _mi4ip(self, overalljails=False):
"""Gets bans merged once, a helper for lambda(s), prevents stop of executing action by any exception inside.
This function never returns None for ainfo lambdas - always a ticket (merged or single one)
and prevents any errors through merging (to guarantee ban actions will be executed).
[TODO] move merging to observer - here we could wait for merge and read already merged info from a database
Parameters
----------
overalljails : bool
switch to get a merged bans :
False - (default) bans merged for current jail only
True - bans merged for all jails of current ip address
Returns
-------
BanTicket
merged or self ticket only
"""
if not hasattr(self, '__mi4ip'):
self.__mi4ip = {}
mi = self.__mi4ip
idx = 'all' if overalljails else 'jail'
if idx in mi:
return mi[idx] if mi[idx] is not None else self.__ticket
try:
jail = self.__jail
ip = self['ip']
mi[idx] = None
if not jail.database: # pragma: no cover
return self.__ticket
if overalljails:
mi[idx] = jail.database.getBansMerged(ip=ip)
else:
mi[idx] = jail.database.getBansMerged(ip=ip, jail=jail)
except Exception as e:
logSys.error(
"Failed to get %s bans merged, jail '%s': %s",
idx, jail.name, e,
exc_info=logSys.getEffectiveLevel()<=logging.DEBUG)
return mi[idx] if mi[idx] is not None else self.__ticket
def __getActionInfo(self, ticket):
ip = ticket.getIP()
aInfo = Actions.ActionInfo(ticket, self._jail)
return aInfo
Returns
-------
BanTicket
merged or self ticket only
"""
idx = 'all' if overalljails else 'jail'
if idx in mi:
return mi[idx] if mi[idx] is not None else mi['ticket']
try:
jail=self._jail
ip=mi['ip']
mi[idx] = None
if overalljails:
mi[idx] = jail.database.getBansMerged(ip=ip)
else:
mi[idx] = jail.database.getBansMerged(ip=ip, jail=jail)
except Exception as e:
logSys.error(
"Failed to get %s bans merged, jail '%s': %s",
idx, jail.name, e,
exc_info=logSys.getEffectiveLevel()<=logging.DEBUG)
return mi[idx] if mi[idx] is not None else mi['ticket']
def __checkBan(self):
"""Check for IP address to ban.
@ -341,27 +385,19 @@ class Actions(JailThread, Mapping):
ticket = self._jail.getFailTicket()
if not ticket:
break
aInfo = CallingMap()
bTicket = BanManager.createBanTicket(ticket)
ip = bTicket.getIP()
aInfo["ip"] = ip
aInfo["failures"] = bTicket.getAttempt()
aInfo["time"] = bTicket.getTime()
aInfo["matches"] = "\n".join(bTicket.getMatches())
if self._jail.database is not None:
mi4ip = lambda overalljails=False, self=self, \
mi={'ip':ip, 'ticket':bTicket}: self.__getBansMerged(mi, overalljails)
aInfo["ipmatches"] = lambda: "\n".join(mi4ip(True).getMatches())
aInfo["ipjailmatches"] = lambda: "\n".join(mi4ip().getMatches())
aInfo["ipfailures"] = lambda: mi4ip(True).getAttempt()
aInfo["ipjailfailures"] = lambda: mi4ip().getAttempt()
aInfo = self.__getActionInfo(bTicket)
reason = {}
if self.__banManager.addBanTicket(bTicket, reason=reason):
cnt += 1
logSys.notice("[%s] %sBan %s", self._jail.name, ('' if not bTicket.restored else 'Restore '), ip)
for name, action in self._actions.iteritems():
try:
action.ban(aInfo.copy())
if ticket.restored and getattr(action, 'norestored', False):
continue
if not aInfo.immutable: aInfo.reset()
action.ban(aInfo)
except Exception as e:
logSys.error(
"Failed to execute ban jail '%s' action '%s' "
@ -411,25 +447,37 @@ class Actions(JailThread, Mapping):
If actions specified, don't flush list - just execute unban for
given actions (reload, obsolete resp. removed actions).
"""
log = True
if actions is None:
logSys.debug("Flush ban list")
lst = self.__banManager.flushBanList()
else:
log = False # don't log "[jail] Unban ..." if removing actions only.
lst = iter(self.__banManager)
cnt = 0
# first we'll execute flush for actions supporting this operation:
unbactions = {}
for name, action in (actions if actions is not None else self._actions).iteritems():
if hasattr(action, 'flush') and action.actionflush:
logSys.notice("[%s] Flush ticket(s) with %s", self._jail.name, name)
action.flush()
else:
unbactions[name] = action
actions = unbactions
# unban each ticket with non-flushable actions:
for ticket in lst:
# delete ip from database also:
if db and self._jail.database is not None:
ip = str(ticket.getIP())
self._jail.database.delBan(self._jail, ip)
# unban ip:
self.__unBan(ticket, actions=actions)
self.__unBan(ticket, actions=actions, log=log)
cnt += 1
logSys.debug("Unbanned %s, %s ticket(s) in %r",
cnt, self.__banManager.size(), self._jail.name)
return cnt
def __unBan(self, ticket, actions=None):
def __unBan(self, ticket, actions=None, log=True):
"""Unbans host corresponding to the ticket.
Executes the actions in order to unban the host given in the
@ -444,17 +492,17 @@ class Actions(JailThread, Mapping):
unbactions = self._actions
else:
unbactions = actions
aInfo = dict()
aInfo["ip"] = ticket.getIP()
aInfo["failures"] = ticket.getAttempt()
aInfo["time"] = ticket.getTime()
aInfo["matches"] = "".join(ticket.getMatches())
if actions is None:
ip = ticket.getIP()
aInfo = self.__getActionInfo(ticket)
if log:
logSys.notice("[%s] Unban %s", self._jail.name, aInfo["ip"])
for name, action in unbactions.iteritems():
try:
logSys.debug("[%s] action %r: unban %s", self._jail.name, name, aInfo["ip"])
action.unban(aInfo.copy())
if ticket.restored and getattr(action, 'norestored', False):
continue
logSys.debug("[%s] action %r: unban %s", self._jail.name, name, ip)
if not aInfo.immutable: aInfo.reset()
action.unban(aInfo)
except Exception as e:
logSys.error(
"Failed to execute unban jail '%s' action '%s' "

View File

@ -284,7 +284,7 @@ class DateDetector(object):
if preMatch is not None:
# get cached or create a copy with modified name/pattern, using preMatch replacement for {DATE}:
template = _getAnchoredTemplate(template,
wrap=lambda s: RE_DATE_PREMATCH.sub(s, preMatch))
wrap=lambda s: RE_DATE_PREMATCH.sub(lambda m: s, preMatch))
# append date detector template (ignore duplicate if some was added before default):
self._appendTemplate(template, ignoreDup=ignoreDup)

View File

@ -27,6 +27,68 @@ import sys
from .ipdns import IPAddr
FTAG_CRE = re.compile(r'</?[\w\-]+/?>')
FCUSTNAME_CRE = re.compile(r'^(/?)F-([A-Z0-9_\-]+)$'); # currently uppercase only
R_HOST = [
# separated ipv4:
r"""(?:::f{4,6}:)?(?P<ip4>%s)""" % (IPAddr.IP_4_RE,),
# separated ipv6:
r"""(?P<ip6>%s)""" % (IPAddr.IP_6_RE,),
# place-holder for ipv6 enclosed in optional [] (used in addr-, host-regex)
"",
# separated dns:
r"""(?P<dns>[\w\-.^_]*\w)""",
# place-holder for ADDR tag-replacement (joined):
"",
# place-holder for HOST tag replacement (joined):
""
]
RI_IPV4 = 0
RI_IPV6 = 1
RI_IPV6BR = 2
RI_DNS = 3
RI_ADDR = 4
RI_HOST = 5
R_HOST[RI_IPV6BR] = r"""\[?%s\]?""" % (R_HOST[RI_IPV6],)
R_HOST[RI_ADDR] = "(?:%s)" % ("|".join((R_HOST[RI_IPV4], R_HOST[RI_IPV6BR])),)
R_HOST[RI_HOST] = "(?:%s)" % ("|".join((R_HOST[RI_IPV4], R_HOST[RI_IPV6BR], R_HOST[RI_DNS])),)
RH4TAG = {
# separated ipv4 (self closed, closed):
"IP4": R_HOST[RI_IPV4],
"F-IP4/": R_HOST[RI_IPV4],
# separated ipv6 (self closed, closed):
"IP6": R_HOST[RI_IPV6],
"F-IP6/": R_HOST[RI_IPV6],
# 2 address groups instead of <ADDR> - in opposition to `<HOST>`,
# for separate usage of 2 address groups only (regardless of `usedns`), `ip4` and `ip6` together
"ADDR": R_HOST[RI_ADDR],
"F-ADDR/": R_HOST[RI_ADDR],
# separated dns (self closed, closed):
"DNS": R_HOST[RI_DNS],
"F-DNS/": R_HOST[RI_DNS],
# default failure-id as no space tag:
"F-ID/": r"""(?P<fid>\S+)""",
# default failure port, like 80 or http :
"F-PORT/": r"""(?P<fport>\w+)""",
}
# default failure groups map for customizable expressions (with different group-id):
R_MAP = {
"ID": "fid",
"PORT": "fport",
}
def mapTag2Opt(tag):
try: # if should be mapped:
return R_MAP[tag]
except KeyError:
return tag.lower()
##
# Regular expression class.
#
@ -41,20 +103,16 @@ class Regex:
# avoid construction of invalid object.
# @param value the regular expression
def __init__(self, regex, **kwargs):
def __init__(self, regex, multiline=False, **kwargs):
self._matchCache = None
# Perform shortcuts expansions.
# Resolve "<HOST>" tag using default regular expression for host:
# Replace standard f2b-tags (like "<HOST>", etc) using default regular expressions:
regex = Regex._resolveHostTag(regex, **kwargs)
# Replace "<SKIPLINES>" with regular expression for multiple lines.
regexSplit = regex.split("<SKIPLINES>")
regex = regexSplit[0]
for n, regexLine in enumerate(regexSplit[1:]):
regex += "\n(?P<skiplines%i>(?:(.*\n)*?))" % n + regexLine
#
if regex.lstrip() == '':
raise RegexException("Cannot add empty regex")
try:
self._regexObj = re.compile(regex, re.MULTILINE)
self._regexObj = re.compile(regex, re.MULTILINE if multiline else 0)
self._regex = regex
except sre_constants.error:
raise RegexException("Unable to compile regular expression '%s'" %
@ -71,38 +129,52 @@ class Regex:
@staticmethod
def _resolveHostTag(regex, useDns="yes"):
# separated ipv4:
r_host = []
r = r"""(?:::f{4,6}:)?(?P<ip4>%s)""" % (IPAddr.IP_4_RE,)
regex = regex.replace("<IP4>", r); # self closed
regex = regex.replace("<F-IP4/>", r); # closed
r_host.append(r)
# separated ipv6:
r = r"""(?P<ip6>%s)""" % (IPAddr.IP_6_RE,)
regex = regex.replace("<IP6>", r); # self closed
regex = regex.replace("<F-IP6/>", r); # closed
r_host.append(r"""\[?%s\]?""" % (r,)); # enclose ipv6 in optional [] in host-regex
# 2 address groups instead of <ADDR> - in opposition to `<HOST>`,
# for separate usage of 2 address groups only (regardless of `usedns`), `ip4` and `ip6` together
regex = regex.replace("<ADDR>", "(?:%s)" % ("|".join(r_host),))
# separated dns:
r = r"""(?P<dns>[\w\-.^_]*\w)"""
regex = regex.replace("<DNS>", r); # self closed
regex = regex.replace("<F-DNS/>", r); # closed
if useDns not in ("no",):
r_host.append(r)
# 3 groups instead of <HOST> - separated ipv4, ipv6 and host (dns)
regex = regex.replace("<HOST>", "(?:%s)" % ("|".join(r_host),))
# default failure-id as no space tag:
regex = regex.replace("<F-ID/>", r"""(?P<fid>\S+)"""); # closed
# default failure port, like 80 or http :
regex = regex.replace("<F-PORT/>", r"""(?P<port>\w+)"""); # closed
# default failure groups (begin / end tag) for customizable expressions:
for o,r in (('IP4', 'ip4'), ('IP6', 'ip6'), ('DNS', 'dns'), ('ID', 'fid'), ('PORT', 'fport')):
regex = regex.replace("<F-%s>" % o, "(?P<%s>" % r); # open tag
regex = regex.replace("</F-%s>" % o, ")"); # close tag
return regex
openTags = dict()
props = {
'nl': 0, # new lines counter by <SKIPLINES> tag;
}
# tag interpolation callable:
def substTag(m):
tag = m.group()
tn = tag[1:-1]
# 3 groups instead of <HOST> - separated ipv4, ipv6 and host (dns)
if tn == "HOST":
return R_HOST[RI_HOST if useDns not in ("no",) else RI_ADDR]
# replace "<SKIPLINES>" with regular expression for multiple lines (by buffering with maxlines)
if tn == "SKIPLINES":
nl = props['nl']
props['nl'] = nl + 1
return r"\n(?P<skiplines%i>(?:(?:.*\n)*?))" % (nl,)
# static replacement from RH4TAG:
try:
return RH4TAG[tn]
except KeyError:
pass
# (begin / end tag) for customizable expressions, additionally used as
# user custom tags (match will be stored in ticket data, can be used in actions):
m = FCUSTNAME_CRE.match(tn)
if m: # match F-...
m = m.groups()
tn = m[1]
# close tag:
if m[0]:
# check it was already open:
if openTags.get(tn):
return ")"
return tag; # tag not opened, use original
# open tag:
openTags[tn] = 1
# if should be mapped:
tn = mapTag2Opt(tn)
return "(?P<%s>" % (tn,)
# original, no replacement:
return tag
# substitute tags:
return FTAG_CRE.sub(substTag, regex)
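A rough sketch of what the new tag substitution produces (the expression is invented; group names follow this module):

# illustrative only - standard tags become named groups, <F-...>...</F-...> pairs become custom groups:
from fail2ban.server.failregex import FailRegex

fr = FailRegex(r"login failed from <HOST> as <F-USER>\S+</F-USER>")
print(fr.getRegex())
# <HOST> expands to the ip4/ip6/dns alternation (depending on usedns),
# <F-USER>...</F-USER> becomes (?P<user>...)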
##
# Gets the regular expression.
@ -121,40 +193,45 @@ class Regex:
# method of this object.
# @param a list of tuples. The tuples are ( prematch, datematch, postdatematch )
def search(self, tupleLines):
def search(self, tupleLines, orgLines=None):
self._matchCache = self._regexObj.search(
"\n".join("".join(value[::2]) for value in tupleLines) + "\n")
if self.hasMatched():
# Find start of the first line where the match was found
try:
self._matchLineStart = self._matchCache.string.rindex(
"\n", 0, self._matchCache.start() +1 ) + 1
except ValueError:
self._matchLineStart = 0
# Find end of the last line where the match was found
try:
self._matchLineEnd = self._matchCache.string.index(
"\n", self._matchCache.end() - 1) + 1
except ValueError:
self._matchLineEnd = len(self._matchCache.string)
if self._matchCache:
if orgLines is None: orgLines = tupleLines
# if single-line:
if len(orgLines) <= 1:
self._matchedTupleLines = orgLines
self._unmatchedTupleLines = []
else:
# Find start of the first line where the match was found
try:
matchLineStart = self._matchCache.string.rindex(
"\n", 0, self._matchCache.start() +1 ) + 1
except ValueError:
matchLineStart = 0
# Find end of the last line where the match was found
try:
matchLineEnd = self._matchCache.string.index(
"\n", self._matchCache.end() - 1) + 1
except ValueError:
matchLineEnd = len(self._matchCache.string)
lineCount1 = self._matchCache.string.count(
"\n", 0, self._matchLineStart)
lineCount2 = self._matchCache.string.count(
"\n", 0, self._matchLineEnd)
self._matchedTupleLines = tupleLines[lineCount1:lineCount2]
self._unmatchedTupleLines = tupleLines[:lineCount1]
n = 0
for skippedLine in self.getSkippedLines():
for m, matchedTupleLine in enumerate(
self._matchedTupleLines[n:]):
if "".join(matchedTupleLine[::2]) == skippedLine:
self._unmatchedTupleLines.append(
self._matchedTupleLines.pop(n+m))
n += m
break
self._unmatchedTupleLines.extend(tupleLines[lineCount2:])
lineCount1 = self._matchCache.string.count(
"\n", 0, matchLineStart)
lineCount2 = self._matchCache.string.count(
"\n", 0, matchLineEnd)
self._matchedTupleLines = orgLines[lineCount1:lineCount2]
self._unmatchedTupleLines = orgLines[:lineCount1]
n = 0
for skippedLine in self.getSkippedLines():
for m, matchedTupleLine in enumerate(
self._matchedTupleLines[n:]):
if "".join(matchedTupleLine[::2]) == skippedLine:
self._unmatchedTupleLines.append(
self._matchedTupleLines.pop(n+m))
n += m
break
self._unmatchedTupleLines.extend(orgLines[lineCount2:])
# Checks if the previous call to search() matched.
#
@ -166,6 +243,13 @@ class Regex:
else:
return False
##
# Returns all matched groups.
#
def getGroups(self):
return self._matchCache.groupdict()
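For illustration, a small sketch of matching an invented single log line and reading the captured groups that now feed the ticket data:

# illustrative only - tuple format is (prematch, datematch, postdatematch):
from fail2ban.server.failregex import FailRegex

fr = FailRegex(r"authentication failure from <HOST> user <F-USER>\S+</F-USER>")
fr.search([("", "", "authentication failure from 192.0.2.7 user root")])
if fr.hasMatched():
    groups = fr.getGroups()
    print("%s %s" % (groups['ip4'], groups['user']))   # -> 192.0.2.7 root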
##
# Returns skipped lines.
#
@ -243,6 +327,10 @@ class RegexException(Exception):
#
FAILURE_ID_GROPS = ("fid", "ip4", "ip6", "dns")
# Additionally allows multi-line failure-id (used for wrapping e. g. conn-id to host)
#
FAILURE_ID_PRESENTS = FAILURE_ID_GROPS + ("mlfid",)
##
# Regular expression class.
#
@ -257,20 +345,16 @@ class FailRegex(Regex):
# avoid construction of invalid object.
# @param value the regular expression
def __init__(self, regex, **kwargs):
def __init__(self, regex, prefRegex=None, **kwargs):
# Initializes the parent.
Regex.__init__(self, regex, **kwargs)
# Check for group "dns", "ip4", "ip6", "fid"
if not [grp for grp in FAILURE_ID_GROPS if grp in self._regexObj.groupindex]:
if (not [grp for grp in FAILURE_ID_PRESENTS if grp in self._regexObj.groupindex]
and (prefRegex is None or
not [grp for grp in FAILURE_ID_PRESENTS if grp in prefRegex._regexObj.groupindex])
):
raise RegexException("No failure-id group in '%s'" % self._regex)
##
# Returns all matched groups.
#
def getGroups(self):
return self._matchCache.groupdict()
##
# Returns the matched failure id.
#

View File

@ -38,6 +38,7 @@ from .datedetector import DateDetector
from .mytime import MyTime
from .failregex import FailRegex, Regex, RegexException
from .action import CommandAction
from .utils import Utils
from ..helpers import getLogger, PREFER_ENC
# Gets the instance of the logger.
@ -65,6 +66,8 @@ class Filter(JailThread):
self.jail = jail
## The failures manager.
self.failManager = FailManager()
## Regular expression pre-filtering matching the failures.
self.__prefRegex = None
## The regular expression list matching the failures.
self.__failRegex = list()
## The regular expression list with expressions to ignore.
@ -73,6 +76,8 @@ class Filter(JailThread):
self.setUseDns(useDns)
## The amount of time to look back.
self.__findTime = 600
## Ignore own IPs flag:
self.__ignoreSelf = True
## The ignore IP list.
self.__ignoreIpList = []
## Size of line buffer
@ -86,6 +91,8 @@ class Filter(JailThread):
self.__ignoreCommand = False
## Default or preferred encoding (to decode bytes from file or journal):
self.__encoding = PREFER_ENC
## Cache temporary holds failures info (used by multi-line for wrapping e. g. conn-id to host):
self.__mlfidCache = None
## Error counter (protected, so can be used in filter implementations)
## if it reached 100 (at once), run-cycle will go idle
self._errors = 0
@ -99,7 +106,7 @@ class Filter(JailThread):
self.ticks = 0
self.dateDetector = DateDetector()
logSys.debug("Created %s" % self)
logSys.debug("Created %s", self)
def __repr__(self):
return "%s(%r)" % (self.__class__.__name__, self.jail)
@ -129,6 +136,23 @@ class Filter(JailThread):
self.delLogPath(path)
delattr(self, '_reload_logs')
@property
def mlfidCache(self):
if self.__mlfidCache:
return self.__mlfidCache
self.__mlfidCache = Utils.Cache(maxCount=100, maxTime=5*60)
return self.__mlfidCache
@property
def prefRegex(self):
return self.__prefRegex
@prefRegex.setter
def prefRegex(self, value):
if value:
self.__prefRegex = Regex(value, useDns=self.__useDns)
else:
self.__prefRegex = None
##
# Add a regular expression which matches the failure.
#
@ -137,13 +161,11 @@ class Filter(JailThread):
# @param value the regular expression
def addFailRegex(self, value):
multiLine = self.getMaxLines() > 1
try:
regex = FailRegex(value, useDns=self.__useDns)
regex = FailRegex(value, prefRegex=self.__prefRegex, multiline=multiLine,
useDns=self.__useDns)
self.__failRegex.append(regex)
if "\n" in regex.getRegex() and not self.getMaxLines() > 1:
logSys.warning(
"Mutliline regex set for jail %r "
"but maxlines not greater than 1", self.jailName)
except RegexException as e:
logSys.error(e)
raise e
@ -158,18 +180,15 @@ class Filter(JailThread):
del self.__failRegex[index]
except IndexError:
logSys.error("Cannot remove regular expression. Index %d is not "
"valid" % index)
"valid", index)
##
# Get the regular expression which matches the failure.
# Get the regular expressions as list.
#
# @return the regular expression
# @return the regular expression list
def getFailRegex(self):
failRegex = list()
for regex in self.__failRegex:
failRegex.append(regex.getRegex())
return failRegex
return [regex.getRegex() for regex in self.__failRegex]
##
# Add the regular expression which matches the failure.
@ -196,7 +215,7 @@ class Filter(JailThread):
del self.__ignoreRegex[index]
except IndexError:
logSys.error("Cannot remove regular expression. Index %d is not "
"valid" % index)
"valid", index)
##
# Get the regular expression which matches the failure.
@ -219,9 +238,9 @@ class Filter(JailThread):
value = value.lower() # must be a string by now
if not (value in ('yes', 'warn', 'no', 'raw')):
logSys.error("Incorrect value %r specified for usedns. "
"Using safe 'no'" % (value,))
"Using safe 'no'", value)
value = 'no'
logSys.debug("Setting usedns = %s for %s" % (value, self))
logSys.debug("Setting usedns = %s for %s", value, self)
self.__useDns = value
##
@ -334,7 +353,7 @@ class Filter(JailThread):
encoding = PREFER_ENC
codecs.lookup(encoding) # Raise LookupError if invalid codec
self.__encoding = encoding
logSys.info(" encoding: %s" % encoding)
logSys.info(" encoding: %s", encoding)
return encoding
##
@ -379,7 +398,7 @@ class Filter(JailThread):
if not isinstance(ip, IPAddr):
ip = IPAddr(ip)
if self.inIgnoreIPList(ip):
logSys.warning('Requested to manually ban an ignored IP %s. User knows best. Proceeding to ban it.' % ip)
logSys.warning('Requested to manually ban an ignored IP %s. User knows best. Proceeding to ban it.', ip)
unixTime = MyTime.time()
self.failManager.addFailure(FailTicket(ip, unixTime), self.failManager.getMaxRetry())
@ -394,6 +413,17 @@ class Filter(JailThread):
return ip
##
# Ignore own IP/DNS.
#
@property
def ignoreSelf(self):
return self.__ignoreSelf
@ignoreSelf.setter
def ignoreSelf(self, value):
self.__ignoreSelf = value
##
# Add an IP/DNS to the ignore list.
#
@ -423,7 +453,7 @@ class Filter(JailThread):
def logIgnoreIp(self, ip, log_ignore, ignore_source="unknown source"):
if log_ignore:
logSys.info("[%s] Ignore %s by %s" % (self.jailName, ip, ignore_source))
logSys.info("[%s] Ignore %s by %s", self.jailName, ip, ignore_source)
def getIgnoreIP(self):
return self.__ignoreIpList
@ -439,6 +469,11 @@ class Filter(JailThread):
def inIgnoreIPList(self, ip, log_ignore=False):
if not isinstance(ip, IPAddr):
ip = IPAddr(ip)
# check own IPs should be ignored and 'ip' is self IP:
if self.__ignoreSelf and ip in DNSUtils.getSelfIPs():
return True
for net in self.__ignoreIpList:
# check if the IP is covered by ignore IP
if ip.isInNet(net):
@ -447,7 +482,7 @@ class Filter(JailThread):
if self.__ignoreCommand:
command = CommandAction.replaceTag(self.__ignoreCommand, { 'ip': ip } )
logSys.debug('ignore command: ' + command)
logSys.debug('ignore command: %s', command)
ret, ret_ignore = CommandAction.executeCmd(command, success_codes=(0, 1))
ret_ignore = ret and ret_ignore == 0
self.logIgnoreIp(ip, log_ignore and ret_ignore, ignore_source="command")
@ -486,10 +521,7 @@ class Filter(JailThread):
for element in self.processLine(line, date):
ip = element[1]
unixTime = element[2]
lines = element[3]
fail = {}
if len(element) > 4:
fail = element[4]
fail = element[3]
logSys.debug("Processing line with time:%s and ip:%s",
unixTime, ip)
if self.inIgnoreIPList(ip, log_ignore=True):
@ -497,7 +529,7 @@ class Filter(JailThread):
logSys.info(
"[%s] Found %s - %s", self.jailName, ip, datetime.datetime.fromtimestamp(unixTime).strftime("%Y-%m-%d %H:%M:%S")
)
tick = FailTicket(ip, unixTime, lines, data=fail)
tick = FailTicket(ip, unixTime, data=fail)
self.failManager.addFailure(tick)
# reset (halve) error counter (successfully processed line):
if self._errors:
@ -532,6 +564,34 @@ class Filter(JailThread):
return ignoreRegexIndex
return None
def _mergeFailure(self, mlfid, fail, failRegex):
mlfidFail = self.mlfidCache.get(mlfid) if self.__mlfidCache else None
# if multi-line failure id (connection id) known:
if mlfidFail:
mlfidGroups = mlfidFail[1]
# update - if not forget (disconnect/reset):
if not fail.get('mlfforget'):
mlfidGroups.update(fail)
else:
self.mlfidCache.unset(mlfid) # remove cached entry
# merge with previous info:
fail2 = mlfidGroups.copy()
fail2.update(fail)
if not fail.get('nofail'): # be sure we've correct current state
try:
del fail2['nofail']
except KeyError:
pass
fail2["matches"] = fail.get("matches", []) + failRegex.getMatchedTupleLines()
fail = fail2
elif not fail.get('mlfforget'):
mlfidFail = [self.__lastDate, fail]
self.mlfidCache.set(mlfid, mlfidFail)
if fail.get('nofail'):
fail["matches"] = failRegex.getMatchedTupleLines()
return fail
##
# Finds the failure in a line given split into time and log parts.
#
@ -564,7 +624,7 @@ class Filter(JailThread):
dateTimeMatch = self.dateDetector.getTime(timeText, tupleLine[3])
if dateTimeMatch is None:
logSys.error("findFailure failed to parse timeText: " + timeText)
logSys.error("findFailure failed to parse timeText: %s", timeText)
date = self.__lastDate
else:
@ -582,77 +642,121 @@ class Filter(JailThread):
date, MyTime.time(), self.getFindTime())
return failList
self.__lineBuffer = (
self.__lineBuffer + [tupleLine[:3]])[-self.__lineBufferSize:]
logSys.log(5, "Looking for failregex match of %r" % self.__lineBuffer)
if self.__lineBufferSize > 1:
orgBuffer = self.__lineBuffer = (
self.__lineBuffer + [tupleLine[:3]])[-self.__lineBufferSize:]
else:
orgBuffer = self.__lineBuffer = [tupleLine[:3]]
logSys.log(5, "Looking for failregex match of %r", self.__lineBuffer)
# Pre-filter fail regex (if available):
preGroups = {}
if self.__prefRegex:
self.__prefRegex.search(self.__lineBuffer)
if not self.__prefRegex.hasMatched():
return failList
preGroups = self.__prefRegex.getGroups()
logSys.log(7, "Pre-filter matched %s", preGroups)
repl = preGroups.get('content')
# Content replacement:
if repl:
del preGroups['content']
self.__lineBuffer = [('', '', repl)]
# Iterates over all the regular expressions.
for failRegexIndex, failRegex in enumerate(self.__failRegex):
failRegex.search(self.__lineBuffer)
if failRegex.hasMatched():
# The failregex matched.
logSys.log(7, "Matched %s", failRegex)
# Checks if we must ignore this match.
if self.ignoreLine(failRegex.getMatchedTupleLines()) \
is not None:
# The ignoreregex matched. Remove ignored match.
self.__lineBuffer = failRegex.getUnmatchedTupleLines()
logSys.log(7, "Matched ignoreregex and was ignored")
if not self.checkAllRegex:
break
else:
continue
if date is None:
logSys.warning(
"Found a match for %r but no valid date/time "
"found for %r. Please try setting a custom "
"date pattern (see man page jail.conf(5)). "
"If format is complex, please "
"file a detailed issue on"
" https://github.com/fail2ban/fail2ban/issues "
"in order to get support for this format."
% ("\n".join(failRegex.getMatchedLines()), timeText))
failRegex.search(self.__lineBuffer, orgBuffer)
if not failRegex.hasMatched():
continue
# The failregex matched.
logSys.log(7, "Matched %s", failRegex)
# Checks if we must ignore this match.
if self.ignoreLine(failRegex.getMatchedTupleLines()) \
is not None:
# The ignoreregex matched. Remove ignored match.
self.__lineBuffer = failRegex.getUnmatchedTupleLines()
logSys.log(7, "Matched ignoreregex and was ignored")
if not self.checkAllRegex:
break
else:
self.__lineBuffer = failRegex.getUnmatchedTupleLines()
# retrieve failure-id, host, etc from failure match:
raw = returnRawHost
try:
fail = failRegex.getGroups()
# failure-id:
fid = fail.get('fid')
# ip-address or host:
host = fail.get('ip4') or fail.get('ip6')
if host is not None:
raw = True
else:
host = fail.get('dns')
if host is None:
continue
if date is None:
logSys.warning(
"Found a match for %r but no valid date/time "
"found for %r. Please try setting a custom "
"date pattern (see man page jail.conf(5)). "
"If format is complex, please "
"file a detailed issue on"
" https://github.com/fail2ban/fail2ban/issues "
"in order to get support for this format.",
"\n".join(failRegex.getMatchedLines()), timeText)
continue
self.__lineBuffer = failRegex.getUnmatchedTupleLines()
# retrieve failure-id, host, etc from failure match:
try:
raw = returnRawHost
if preGroups:
fail = preGroups.copy()
fail.update(failRegex.getGroups())
else:
fail = failRegex.getGroups()
# first try to check we have mlfid case (caching of connection id by multi-line):
mlfid = fail.get('mlfid')
if mlfid is not None:
fail = self._mergeFailure(mlfid, fail, failRegex)
# bypass if no-failure case:
if fail.get('nofail'):
logSys.log(7, "Nofail by mlfid %r in regex %s: %s",
mlfid, failRegexIndex, fail.get('mlfforget', "waiting for failure"))
if not self.checkAllRegex: return failList
else:
# matched lines:
fail["matches"] = fail.get("matches", []) + failRegex.getMatchedTupleLines()
# failure-id:
fid = fail.get('fid')
# ip-address or host:
host = fail.get('ip4')
if host is not None:
cidr = IPAddr.FAM_IPv4
raw = True
else:
host = fail.get('ip6')
if host is not None:
cidr = IPAddr.FAM_IPv6
raw = True
if host is None:
host = fail.get('dns')
if host is None:
# first try to check we have mlfid case (cache connection id):
if fid is None and mlfid is None:
# if no failure-id also (obscure case, wrong regex), throw error inside getFailID:
if fid is None:
fid = failRegex.getFailID()
host = fid
cidr = IPAddr.CIDR_RAW
# if raw - add single ip or failure-id,
# otherwise expand host to multiple ips using dns (or ignore it if not valid):
if raw:
ip = IPAddr(host, cidr)
# check host equal failure-id, if not - failure with complex id:
if fid is not None and fid != host:
ip = IPAddr(fid, IPAddr.CIDR_RAW)
failList.append([failRegexIndex, ip, date,
failRegex.getMatchedLines(), fail])
if not self.checkAllRegex:
break
else:
ips = DNSUtils.textToIp(host, self.__useDns)
if ips:
for ip in ips:
failList.append([failRegexIndex, ip, date,
failRegex.getMatchedLines(), fail])
if not self.checkAllRegex:
break
except RegexException as e: # pragma: no cover - unsure if reachable
logSys.error(e)
fid = failRegex.getFailID()
host = fid
cidr = IPAddr.CIDR_RAW
# if mlfid case (not failure):
if host is None:
logSys.log(7, "No failure-id by mlfid %r in regex %s: %s",
mlfid, failRegexIndex, fail.get('mlfforget', "waiting for identifier"))
if not self.checkAllRegex: return failList
ips = [None]
# if raw - add single ip or failure-id,
# otherwise expand host to multiple ips using dns (or ignore it if not valid):
elif raw:
ip = IPAddr(host, cidr)
# check host equal failure-id, if not - failure with complex id:
if fid is not None and fid != host:
ip = IPAddr(fid, IPAddr.CIDR_RAW)
ips = [ip]
# otherwise, try to use dns conversion:
else:
ips = DNSUtils.textToIp(host, self.__useDns)
# append failure with match to the list:
for ip in ips:
failList.append([failRegexIndex, ip, date, fail])
if not self.checkAllRegex:
break
except RegexException as e: # pragma: no cover - unsure if reachable
logSys.error(e)
return failList
def status(self, flavor="basic"):
@ -716,7 +820,7 @@ class FileFilter(Filter):
db = self.jail.database
if db is not None:
db.updateLog(self.jail, log)
logSys.info("Removed logfile: %r" % path)
logSys.info("Removed logfile: %r", path)
self._delLogPath(path)
return
@ -781,7 +885,7 @@ class FileFilter(Filter):
def getFailures(self, filename):
log = self.getLog(filename)
if log is None:
logSys.error("Unable to get failures in " + filename)
logSys.error("Unable to get failures in %s", filename)
return False
# We should always close log (file), otherwise may be locked (log-rotate, etc.)
try:
@ -790,11 +894,11 @@ class FileFilter(Filter):
has_content = log.open()
# see http://python.org/dev/peps/pep-3151/
except IOError as e:
logSys.error("Unable to open %s" % filename)
logSys.error("Unable to open %s", filename)
logSys.exception(e)
return False
except OSError as e: # pragma: no cover - requires race condition to tigger this
logSys.error("Error opening %s" % filename)
logSys.error("Error opening %s", filename)
logSys.exception(e)
return False
except Exception as e: # pragma: no cover - Requires implemention error in FileContainer to generate
@ -1015,7 +1119,7 @@ class FileContainer:
## sys.stdout.flush()
# Compare hash and inode
if self.__hash != myHash or self.__ino != stats.st_ino:
logSys.info("Log rotation detected for %s" % self.__filename)
logSys.log(logging.MSG, "Log rotation detected for %s", self.__filename)
self.__hash = myHash
self.__ino = stats.st_ino
self.__pos = 0

View File

@ -178,6 +178,14 @@ class FilterSystemd(JournalFilter): # pragma: systemd no cover
def getJournalMatch(self):
return self.__matches
##
# Get journal reader
#
# @return journal reader
def getJournalReader(self):
return self.__journal
##
# Format journal log entry into syslog style
#

View File

@ -64,16 +64,19 @@ class DNSUtils:
if ips is not None:
return ips
# retrieve ips
try:
ips = list()
for result in socket.getaddrinfo(dns, None, 0, 0, socket.IPPROTO_TCP):
ip = IPAddr(result[4][0])
if ip.isValid:
ips.append(ip)
except socket.error as e:
# todo: make configurable the expired time of cache entry:
logSys.warning("Unable to find a corresponding IP address for %s: %s", dns, e)
ips = list()
ips = list()
saveerr = None
for fam, ipfam in ((socket.AF_INET, IPAddr.FAM_IPv4), (socket.AF_INET6, IPAddr.FAM_IPv6)):
try:
for result in socket.getaddrinfo(dns, None, fam, 0, socket.IPPROTO_TCP):
ip = IPAddr(result[4][0], ipfam)
if ip.isValid:
ips.append(ip)
except socket.error as e:
saveerr = e
if not ips and saveerr:
logSys.warning("Unable to find a corresponding IP address for %s: %s", dns, saveerr)
DNSUtils.CACHE_nameToIp.set(dns, ips)
return ips
@ -115,6 +118,42 @@ class DNSUtils:
return ipList
@staticmethod
def getSelfNames():
"""Get own host names of self"""
# try find cached own hostnames (this tuple-key cannot be used elsewhere):
key = ('self','dns')
names = DNSUtils.CACHE_ipToName.get(key)
# get it in several ways (a set with names of localhost, hostname, fully qualified):
if names is None:
names = set(['localhost'])
for hostname in (socket.gethostname, socket.getfqdn):
try:
names |= set([hostname()])
except Exception as e: # pragma: no cover
logSys.warning("Retrieving own hostnames failed: %s", e)
# cache and return :
DNSUtils.CACHE_ipToName.set(key, names)
return names
@staticmethod
def getSelfIPs():
"""Get own IP addresses of self"""
# try find cached own IPs (this tuple-key cannot be used elsewhere):
key = ('self','ips')
ips = DNSUtils.CACHE_nameToIp.get(key)
# get it in several ways (a set with IPs of localhost, hostname, fully qualified):
if ips is None:
ips = set()
for hostname in DNSUtils.getSelfNames():
try:
ips |= set(DNSUtils.textToIp(hostname, 'yes'))
except Exception as e: # pragma: no cover
logSys.warning("Retrieving own IPs of %s failed: %s", hostname, e)
# cache and return :
DNSUtils.CACHE_nameToIp.set(key, ips)
return ips
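A small sketch of the self-address detection behind the new per-jail "ignoreself" handling (results depend on the host; shown values are examples only):

# illustrative only:
from fail2ban.server.ipdns import DNSUtils

print(DNSUtils.getSelfNames())   # e.g. set(['localhost', 'myhost', 'myhost.example.org'])
print(DNSUtils.getSelfIPs())     # set of IPAddr objects resolved from those names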
##
# Class for IP address handling.
@ -140,6 +179,8 @@ class IPAddr(object):
CIDR_RAW = -2
CIDR_UNSPEC = -1
FAM_IPv4 = CIDR_RAW - socket.AF_INET
FAM_IPv6 = CIDR_RAW - socket.AF_INET6
def __new__(cls, ipstr, cidr=CIDR_UNSPEC):
# check already cached as IPAddr
@ -191,7 +232,11 @@ class IPAddr(object):
self._raw = ipstr
# if not raw - recognize family, set addr, etc.:
if cidr != IPAddr.CIDR_RAW:
for family in [socket.AF_INET, socket.AF_INET6]:
if cidr is not None and cidr < IPAddr.CIDR_RAW:
family = [IPAddr.CIDR_RAW - cidr]
else:
family = [socket.AF_INET, socket.AF_INET6]
for family in family:
try:
binary = socket.inet_pton(family, ipstr)
self._family = family
@ -252,6 +297,11 @@ class IPAddr(object):
def family(self):
return self._family
FAM2STR = {socket.AF_INET: 'inet4', socket.AF_INET6: 'inet6'}
@property
def familyStr(self):
return IPAddr.FAM2STR.get(self._family)
@property
def plen(self):
return self._plen
@ -346,7 +396,7 @@ class IPAddr(object):
return socket.inet_ntop(self._family, binary) + add
def getPTR(self, suffix=""):
def getPTR(self, suffix=None):
""" return the DNS PTR string of the provided IP address object
If "suffix" is provided it will be appended as the second and top
@ -356,17 +406,22 @@ class IPAddr(object):
"""
if self.isIPv4:
exploded_ip = self.ntoa.split(".")
if not suffix:
if suffix is None:
suffix = "in-addr.arpa."
elif self.isIPv6:
exploded_ip = self.hexdump
if not suffix:
if suffix is None:
suffix = "ip6.arpa."
else:
return ""
return "%s.%s" % (".".join(reversed(exploded_ip)), suffix)
def getHost(self):
"""Return the host name (DNS) of the provided IP address object
"""
return DNSUtils.ipToName(self.ntoa)
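A short sketch of the helpers behind the new <ip-rev> and <ip-host> action tags (the address is an example; getHost performs a real reverse DNS lookup):

# illustrative only:
from fail2ban.server.ipdns import IPAddr

ip = IPAddr('192.0.2.1')
print(ip.getPTR())     # -> 1.2.0.192.in-addr.arpa.
print(ip.getPTR(''))   # -> 1.2.0.192.  (reversed octets, as used by the <ip-rev> tag)
# ip.getHost() resolves the host name via DNS (as used by the <ip-host> tag)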
@property
def isIPv4(self):
"""Either the IP object is of address family AF_INET

View File

@ -308,6 +308,12 @@ class Server:
return self.__jails[name].idle
# Filter
def setIgnoreSelf(self, name, value):
self.__jails[name].filter.ignoreSelf = value
def getIgnoreSelf(self, name):
return self.__jails[name].filter.ignoreSelf
def addIgnoreIP(self, name, ip):
self.__jails[name].filter.addIgnoreIP(ip)
@ -379,6 +385,14 @@ class Server:
def getIgnoreCommand(self, name):
return self.__jails[name].filter.getIgnoreCommand()
def setPrefRegex(self, name, value):
flt = self.__jails[name].filter
logSys.debug(" prefregex: %r", value)
flt.prefRegex = value
def getPrefRegex(self, name):
return self.__jails[name].filter.prefRegex
def addFailRegex(self, name, value, multiple=False):
flt = self.__jails[name].filter
if not multiple: value = (value,)


@ -95,18 +95,10 @@ def reGroupDictStrptime(found_dict, msec=False):
Unix time stamp.
"""
now = MyTime.now()
year = month = day = hour = minute = None
hour = minute = None
now = \
year = month = day = hour = minute = tzoffset = \
weekday = julian = week_of_year = None
second = fraction = 0
tzoffset = None
# Default to -1 to signify that values not known; not critical to have,
# though
week_of_year = -1
week_of_year_start = -1
# weekday and julian defaulted to -1 so as to signal need to calculate
# values
weekday = julian = -1
for key, val in found_dict.iteritems():
if val is None: continue
# Directives not explicitly handled below:
@ -116,13 +108,9 @@ def reGroupDictStrptime(found_dict, msec=False):
# worthless without day of the week
if key == 'y':
year = int(val)
# Open Group specification for strptime() states that a %y
#value in the range of [00, 68] is in the century 2000, while
#[69,99] is in the century 1900
if year <= 68:
# Fail2ban year should always be in the current century (>= 2000)
if year <= 2000:
year += 2000
else:
year += 1900
elif key == 'Y':
year = int(val)
elif key == 'm':
@ -156,7 +144,7 @@ def reGroupDictStrptime(found_dict, msec=False):
elif key == 'S':
second = int(val)
elif key == 'f':
if msec:
if msec: # pragma: no cover - currently unused
s = val
# Pad to always return microseconds.
s += "0" * (6 - len(s))
@ -166,21 +154,14 @@ def reGroupDictStrptime(found_dict, msec=False):
elif key == 'a':
weekday = locale_time.a_weekday.index(val.lower())
elif key == 'w':
weekday = int(val)
if weekday == 0:
weekday = 6
else:
weekday -= 1
weekday = int(val) - 1
if weekday < 0: weekday = 6
elif key == 'j':
julian = int(val)
elif key in ('U', 'W'):
week_of_year = int(val)
if key == 'U':
# U starts week on Sunday.
week_of_year_start = 6
else:
# W starts week on Monday.
week_of_year_start = 0
# U starts week on Sunday, W - on Monday
week_of_year_start = 6 if key == 'U' else 0
elif key == 'z':
z = val
if z in ("Z", "UTC", "GMT"):
@ -199,31 +180,28 @@ def reGroupDictStrptime(found_dict, msec=False):
# Fail2Ban will assume it's this year
assume_year = False
if year is None:
if not now: now = MyTime.now()
year = now.year
assume_year = True
# If we know the week of the year and what day of that week, we can figure
# out the Julian day of the year.
if julian == -1 and week_of_year != -1 and weekday != -1:
week_starts_Mon = True if week_of_year_start == 0 else False
julian = _calc_julian_from_U_or_W(year, week_of_year, weekday,
week_starts_Mon)
# Cannot pre-calculate datetime.datetime() since can change in Julian
# calculation and thus could have different value for the day of the week
# calculation.
if julian != -1 and (month is None or day is None):
datetime_result = datetime.datetime.fromordinal((julian - 1) + datetime.datetime(year, 1, 1).toordinal())
year = datetime_result.year
month = datetime_result.month
day = datetime_result.day
# Add timezone info
if tzoffset is not None:
gmtoff = tzoffset * 60
else:
gmtoff = None
if month is None or day is None:
# If we know the week of the year and what day of that week, we can figure
# out the Julian day of the year.
if julian is None and week_of_year is not None and weekday is not None:
julian = _calc_julian_from_U_or_W(year, week_of_year, weekday,
(week_of_year_start == 0))
# Cannot pre-calculate datetime.datetime() since can change in Julian
# calculation and thus could have different value for the day of the week
# calculation.
if julian is not None:
datetime_result = datetime.datetime.fromordinal((julian - 1) + datetime.datetime(year, 1, 1).toordinal())
year = datetime_result.year
month = datetime_result.month
day = datetime_result.day
# Fail2Ban assumes today
assume_today = False
if month is None and day is None:
if not now: now = MyTime.now()
month = now.month
day = now.day
assume_today = True
@ -231,22 +209,28 @@ def reGroupDictStrptime(found_dict, msec=False):
# Actually create date
date_result = datetime.datetime(
year, month, day, hour, minute, second, fraction)
if gmtoff is not None:
date_result = date_result - datetime.timedelta(seconds=gmtoff)
# Add timezone info
if tzoffset is not None:
date_result -= datetime.timedelta(seconds=tzoffset * 60)
if date_result > now and assume_today:
# Rollover at midnight, could mean it's yesterday...
date_result = date_result - datetime.timedelta(days=1)
if date_result > now and assume_year:
# Could be last year?
# also reset month and day as it's not yesterday...
date_result = date_result.replace(
year=year-1, month=month, day=day)
if assume_today:
if not now: now = MyTime.now()
if date_result > now:
# Rollover at midnight, could mean it's yesterday...
date_result -= datetime.timedelta(days=1)
if assume_year:
if not now: now = MyTime.now()
if date_result > now:
# Could be last year?
# also reset month and day as it's not yesterday...
date_result = date_result.replace(
year=year-1, month=month, day=day)
if gmtoff is not None:
# make time:
if tzoffset is not None:
tm = calendar.timegm(date_result.utctimetuple())
else:
tm = time.mktime(date_result.timetuple())
if msec:
if msec: # pragma: no cover - currently unused
tm += fraction/1000000.0
return tm
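A minimal sketch of the "assume today" rollover applied above, using plain datetime only (the timestamps are invented for illustration):

    import datetime

    def assume_today_rollover(parsed, now):
        # if only month/day/time were present and the reconstructed time lies
        # in the future, the entry most likely belongs to yesterday
        if parsed > now:
            return parsed - datetime.timedelta(days=1)
        return parsed

    now = datetime.datetime(2017, 1, 1, 0, 5)
    parsed = datetime.datetime(2017, 1, 1, 23, 59)   # "23:59" read just after midnight
    print(assume_today_rollover(parsed, now))        # -> 2016-12-31 23:59:00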


@ -56,7 +56,9 @@ class Ticket(object):
self._time = time if time is not None else MyTime.time()
self._data = {'matches': matches or [], 'failures': 0}
if data is not None:
self._data.update(data)
for k,v in data.iteritems():
if v is not None:
self._data[k] = v
if ticket:
# ticket available - copy whole information from ticket:
self.__dict__.update(i for i in ticket.__dict__.iteritems() if i[0] in self.__dict__)
@ -136,7 +138,8 @@ class Ticket(object):
self._data['matches'] = matches or []
def getMatches(self):
return self._data.get('matches', [])
return [(line if isinstance(line, basestring) else "".join(line)) \
for line in self._data.get('matches', ())]
@property
def restored(self):
@ -233,7 +236,11 @@ class FailTicket(Ticket):
self.__retry += count
self._data['failures'] += attempt
if matches:
self._data['matches'] += matches
# we should duplicate "matches", because it is possibly referenced by multiple tickets:
if self._data['matches']:
self._data['matches'] = self._data['matches'] + matches
else:
self._data['matches'] = matches
def setLastTime(self, value):
if value > self._time:
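A tiny illustration of the aliasing problem the "matches" copy above avoids (plain dicts and lists rather than real tickets):

    shared = ['match 1']
    a = {'matches': shared}
    b = {'matches': shared}
    b['matches'] += ['match 2']                  # extends in place: a sees 'match 2' too
    b['matches'] = b['matches'] + ['match 3']    # rebinding (as in the diff) keeps a untouched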


@ -108,11 +108,11 @@ class Transmitter:
value = command[1:]
# if all ips:
if len(value) == 1 and value[0] == "--all":
self.__server.setUnbanIP()
return
return self.__server.setUnbanIP()
cnt = 0
for value in value:
self.__server.setUnbanIP(None, value)
return None
cnt += self.__server.setUnbanIP(None, value)
return cnt
elif command[0] == "echo":
return command[1:]
elif command[0] == "sleep":
@ -181,6 +181,10 @@ class Transmitter:
raise Exception("Invalid idle option, must be 'on' or 'off'")
return self.__server.getIdleJail(name)
# Filter
elif command[1] == "ignoreself":
value = command[2]
self.__server.setIgnoreSelf(name, value)
return self.__server.getIgnoreSelf(name)
elif command[1] == "addignoreip":
value = command[2]
self.__server.addIgnoreIP(name, value)
@ -221,6 +225,10 @@ class Transmitter:
value = command[2:]
self.__server.delJournalMatch(name, value)
return self.__server.getJournalMatch(name)
elif command[1] == "prefregex":
value = command[2]
self.__server.setPrefRegex(name, value)
return self.__server.getPrefRegex(name)
elif command[1] == "addfailregex":
value = command[2]
self.__server.addFailRegex(name, value, multiple=multiple)
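For orientation, the new ignoreself and prefregex handlers above accept commands in the usual list form. A hypothetical exchange could look like the sketch below (jail name and regex are invented, and transm stands for a Transmitter instance as used in the test suite):

    transm.proceed(["set", "sshd", "prefregex",
        r"^\s*sshd\[\d+\]: <F-CONTENT>.+</F-CONTENT>$"])
    transm.proceed(["get", "sshd", "prefregex"])      # returns the stored prefregex
    transm.proceed(["set", "sshd", "ignoreself", True])
    transm.proceed(["get", "sshd", "ignoreself"])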
@ -337,10 +345,14 @@ class Transmitter:
return self.__server.getLogEncoding(name)
elif command[1] == "journalmatch": # pragma: systemd no cover
return self.__server.getJournalMatch(name)
elif command[1] == "ignoreself":
return self.__server.getIgnoreSelf(name)
elif command[1] == "ignoreip":
return self.__server.getIgnoreIP(name)
elif command[1] == "ignorecommand":
return self.__server.getIgnoreCommand(name)
elif command[1] == "prefregex":
return self.__server.getPrefRegex(name)
elif command[1] == "failregex":
return self.__server.getFailRegex(name)
elif command[1] == "ignoreregex":


@ -28,7 +28,7 @@ import signal
import subprocess
import sys
import time
from ..helpers import getLogger, uni_decode
from ..helpers import getLogger, _merge_dicts, uni_decode
if sys.version_info >= (3, 3):
import importlib.machinery
@ -60,6 +60,7 @@ class Utils():
DEFAULT_SLEEP_TIME = 2
DEFAULT_SLEEP_INTERVAL = 0.2
DEFAULT_SHORT_INTERVAL = 0.001
DEFAULT_SHORTEST_INTERVAL = DEFAULT_SHORT_INTERVAL / 100
class Cache(object):
@ -98,6 +99,12 @@ class Utils():
cache.popitem()
cache[k] = (v, t + self.maxTime)
def unset(self, k):
try:
del self._cache[k]
except KeyError: # pragma: no cover
pass
@staticmethod
def setFBlockMode(fhandle, value):
@ -110,7 +117,31 @@ class Utils():
return flags
@staticmethod
def executeCmd(realCmd, timeout=60, shell=True, output=False, tout_kill_tree=True, success_codes=(0,)):
def buildShellCmd(realCmd, varsDict):
"""Generates new shell command as array, contains map as variables to
arguments statement (varsStat), the command (realCmd) used this variables and
the list of the arguments, mapped from varsDict
Example:
buildShellCmd('echo "V2: $v2, V1: $v1"', {"v1": "val 1", "v2": "val 2", "vUnused": "unused var"})
returns:
['v1=$0 v2=$1 vUnused=$2 \necho "V2: $v2, V1: $v1"', 'val 1', 'val 2', 'unused var']
"""
# build map as array of vars and command line array:
varsStat = ""
if not isinstance(realCmd, list):
realCmd = [realCmd]
i = len(realCmd)-1
for k, v in varsDict.iteritems():
varsStat += "%s=$%s " % (k, i)
realCmd.append(v)
i += 1
realCmd[0] = varsStat + "\n" + realCmd[0]
return realCmd
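The point of this construction is that untrusted values are never interpolated into the command string itself; on POSIX systems, passing a list to Popen with shell=True makes the extra items positional parameters of the shell. A small example of what the helper produces (variable name and value are invented):

    from fail2ban.server.utils import Utils

    cmd = Utils.buildShellCmd('echo "banned: $f2bV_ip"',
        {'f2bV_ip': '192.0.2.1; rm -rf /'})
    # cmd == ['f2bV_ip=$0 \necho "banned: $f2bV_ip"', '192.0.2.1; rm -rf /']
    # handing this list to Popen(..., shell=True) keeps the malicious value an
    # ordinary argument ($0) instead of re-parsing it as shell code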
@staticmethod
def executeCmd(realCmd, timeout=60, shell=True, output=False, tout_kill_tree=True,
success_codes=(0,), varsDict=None):
"""Executes a command.
Parameters
@ -125,6 +156,8 @@ class Utils():
output : bool
If output is True, the function returns tuple (success, stdoutdata, stderrdata, returncode).
If False, just indication of success is returned
varsDict: dict
variables supplied to the command (or to the shell script)
Returns
-------
@ -140,10 +173,18 @@ class Utils():
"""
stdout = stderr = None
retcode = None
popen = None
popen = env = None
if varsDict:
if shell:
# build map as array of vars and command line array:
realCmd = Utils.buildShellCmd(realCmd, varsDict)
else: # pragma: no cover - currently unused
env = _merge_dicts(os.environ, varsDict)
realCmdId = id(realCmd)
logCmd = lambda level: logSys.log(level, "%x -- exec: %s", realCmdId, realCmd)
try:
popen = subprocess.Popen(
realCmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=shell,
realCmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=shell, env=env,
preexec_fn=os.setsid # so that killpg does not kill our process
)
# wait with timeout for process has terminated:
@ -152,13 +193,15 @@ class Utils():
def _popen_wait_end():
retcode = popen.poll()
return (True, retcode) if retcode is not None else None
retcode = Utils.wait_for(_popen_wait_end, timeout, Utils.DEFAULT_SHORT_INTERVAL)
# popen.poll is a fast operation so we can use the shortest sleep interval:
retcode = Utils.wait_for(_popen_wait_end, timeout, Utils.DEFAULT_SHORTEST_INTERVAL)
if retcode:
retcode = retcode[1]
# if timeout:
if retcode is None:
logSys.error("%s -- timed out after %s seconds." %
(realCmd, timeout))
if logCmd: logCmd(logging.ERROR); logCmd = None
logSys.error("%x -- timed out after %s seconds." %
(realCmdId, timeout))
pgid = os.getpgid(popen.pid)
# if not tree - first try to terminate and then kill, otherwise - kill (-9) only:
os.killpg(pgid, signal.SIGTERM) # Terminate the process
@ -168,59 +211,62 @@ class Utils():
if retcode is None or tout_kill_tree: # Still going...
os.killpg(pgid, signal.SIGKILL) # Kill the process
time.sleep(Utils.DEFAULT_SLEEP_INTERVAL)
retcode = popen.poll()
if retcode is None: # pragma: no cover - too sporadic
retcode = popen.poll()
#logSys.debug("%s -- killed %s ", realCmd, retcode)
if retcode is None and not Utils.pid_exists(pgid): # pragma: no cover
retcode = signal.SIGKILL
except OSError as e:
if logCmd: logCmd(logging.ERROR); logCmd = None
stderr = "%s -- failed with %s" % (realCmd, e)
logSys.error(stderr)
if not popen:
return False if not output else (False, stdout, stderr, retcode)
std_level = logging.DEBUG if retcode in success_codes else logging.ERROR
if std_level > logSys.getEffectiveLevel():
if logCmd: logCmd(std_level-1); logCmd = None
# if we need output (to return or to log it):
if output or std_level >= logSys.getEffectiveLevel():
# if it timed out (was killed/terminated) - to prevent waiting, set std handles to non-blocking mode.
if popen.stdout:
try:
if retcode is None or retcode < 0:
Utils.setFBlockMode(popen.stdout, False)
stdout = popen.stdout.read()
except IOError as e:
except IOError as e: # pragma: no cover
logSys.error(" ... -- failed to read stdout %s", e)
if stdout is not None and stdout != '' and std_level >= logSys.getEffectiveLevel():
logSys.log(std_level, "%s -- stdout:", realCmd)
for l in stdout.splitlines():
logSys.log(std_level, " -- stdout: %r", uni_decode(l))
logSys.log(std_level, "%x -- stdout: %r", realCmdId, uni_decode(l))
popen.stdout.close()
if popen.stderr:
try:
if retcode is None or retcode < 0:
Utils.setFBlockMode(popen.stderr, False)
stderr = popen.stderr.read()
except IOError as e:
except IOError as e: # pragma: no cover
logSys.error(" ... -- failed to read stderr %s", e)
if stderr is not None and stderr != '' and std_level >= logSys.getEffectiveLevel():
logSys.log(std_level, "%s -- stderr:", realCmd)
for l in stderr.splitlines():
logSys.log(std_level, " -- stderr: %r", uni_decode(l))
logSys.log(std_level, "%x -- stderr: %r", realCmdId, uni_decode(l))
popen.stderr.close()
success = False
if retcode in success_codes:
logSys.debug("%-.40s -- returned successfully %i", realCmd, retcode)
logSys.debug("%x -- returned successfully %i", realCmdId, retcode)
success = True
elif retcode is None:
logSys.error("%-.40s -- unable to kill PID %i", realCmd, popen.pid)
logSys.error("%x -- unable to kill PID %i", realCmdId, popen.pid)
elif retcode < 0 or retcode > 128:
# dash would return negative while bash 128 + n
sigcode = -retcode if retcode < 0 else retcode - 128
logSys.error("%-.40s -- killed with %s (return code: %s)",
realCmd, signame.get(sigcode, "signal %i" % sigcode), retcode)
logSys.error("%x -- killed with %s (return code: %s)",
realCmdId, signame.get(sigcode, "signal %i" % sigcode), retcode)
else:
msg = _RETCODE_HINTS.get(retcode, None)
logSys.error("%-.40s -- returned %i", realCmd, retcode)
logSys.error("%x -- returned %i", realCmdId, retcode)
if msg:
logSys.info("HINT on %i: %s", retcode, msg % locals())
if output:
@ -284,7 +330,7 @@ class Utils():
return e.errno == errno.EPERM
else:
return True
else:
else: # pragma : no cover (no windows currently supported)
@staticmethod
def pid_exists(pid):
import ctypes


@ -61,6 +61,7 @@ if sys.version_info >= (2,7): # pragma: no cover - may be unavailable
# Must cancel timer!
if self.action._timer:
self.action._timer.cancel()
super(BadIPsActionTest, self).tearDown()
def testCategory(self):
categories = self.action.getCategories()


@ -30,18 +30,22 @@ else:
from ..dummyjail import DummyJail
from ..utils import CONFIG_DIR, asyncserver
from ..utils import CONFIG_DIR, asyncserver, Utils, uni_decode
class TestSMTPServer(smtpd.SMTPServer):
def process_message(self, peer, mailfrom, rcpttos, data):
def __init__(self, *args):
smtpd.SMTPServer.__init__(self, *args)
self.ready = False
def process_message(self, peer, mailfrom, rcpttos, data, **kwargs):
self.peer = peer
self.mailfrom = mailfrom
self.rcpttos = rcpttos
self.org_data = data
# replace new line (with tab or space) for possible mime translations (word wrap):
self.data = re.sub(r"\n[\t ]", " ", data)
# replace new line (with tab or space) for possible mime translations (word wrap),
self.data = re.sub(r"\n[\t ]", " ", uni_decode(data))
self.ready = True
class SMTPActionTest(unittest.TestCase):
@ -63,7 +67,7 @@ class SMTPActionTest(unittest.TestCase):
port = self.smtpd.socket.getsockname()[1]
self.action = customActionModule.Action(
self.jail, "test", host="127.0.0.1:%i" % port)
self.jail, "test", host="localhost:%i" % port)
## because of a bug in the loop (see loop in asyncserver.py) use its loop instead of asyncore.loop:
self._active = True
@ -77,9 +81,16 @@ class SMTPActionTest(unittest.TestCase):
self.smtpd.close()
self._active = False
self._loop_thread.join()
super(SMTPActionTest, self).tearDown()
def _exec_and_wait(self, doaction, timeout=3, short=False):
if short: timeout /= 25
self.smtpd.ready = False
doaction()
Utils.wait_for(lambda: self.smtpd.ready, timeout)
def testStart(self):
self.action.start()
self._exec_and_wait(self.action.start)
self.assertEqual(self.smtpd.mailfrom, "fail2ban")
self.assertEqual(self.smtpd.rcpttos, ["root"])
self.assertTrue(
@ -87,23 +98,28 @@ class SMTPActionTest(unittest.TestCase):
in self.smtpd.data)
def testStop(self):
self.action.stop()
self._exec_and_wait(self.action.stop)
self.assertEqual(self.smtpd.mailfrom, "fail2ban")
self.assertEqual(self.smtpd.rcpttos, ["root"])
self.assertTrue(
"Subject: [Fail2Ban] %s: stopped" %
self.jail.name in self.smtpd.data)
def testBan(self):
def _testBan(self, restored=False):
aInfo = {
'ip': "127.0.0.2",
'failures': 3,
'matches': "Test fail 1\n",
'ipjailmatches': "Test fail 1\nTest Fail2\n",
'ipmatches': "Test fail 1\nTest Fail2\nTest Fail3\n",
}
}
if restored:
aInfo['restored'] = 1
self.action.ban(aInfo)
self._exec_and_wait(lambda: self.action.ban(aInfo), short=restored)
if restored: # no mail, should raise an attribute error:
self.assertRaises(AttributeError, lambda: self.smtpd.mailfrom)
return
self.assertEqual(self.smtpd.mailfrom, "fail2ban")
self.assertEqual(self.smtpd.rcpttos, ["root"])
subject = "Subject: [Fail2Ban] %s: banned %s" % (
@ -113,26 +129,32 @@ class SMTPActionTest(unittest.TestCase):
"%i attempts" % aInfo['failures'], self.smtpd.data)
self.action.matches = "matches"
self.action.ban(aInfo)
self._exec_and_wait(lambda: self.action.ban(aInfo))
self.assertIn(aInfo['matches'], self.smtpd.data)
self.action.matches = "ipjailmatches"
self.action.ban(aInfo)
self._exec_and_wait(lambda: self.action.ban(aInfo))
self.assertIn(aInfo['ipjailmatches'], self.smtpd.data)
self.action.matches = "ipmatches"
self.action.ban(aInfo)
self._exec_and_wait(lambda: self.action.ban(aInfo))
self.assertIn(aInfo['ipmatches'], self.smtpd.data)
def testBan(self):
self._testBan()
def testNOPByRestored(self):
self._testBan(restored=True)
def testOptions(self):
self.action.start()
self._exec_and_wait(self.action.start)
self.assertEqual(self.smtpd.mailfrom, "fail2ban")
self.assertEqual(self.smtpd.rcpttos, ["root"])
self.action.fromname = "Test"
self.action.fromaddr = "test@example.com"
self.action.toaddr = "test@example.com, test2@example.com"
self.action.start()
self._exec_and_wait(self.action.start)
self.assertEqual(self.smtpd.mailfrom, "test@example.com")
self.assertTrue("From: %s <%s>" %
(self.action.fromname, self.action.fromaddr) in self.smtpd.data)


@ -29,7 +29,7 @@ import tempfile
import time
import unittest
from ..server.action import CommandAction, CallingMap
from ..server.action import CommandAction, CallingMap, substituteRecursiveTags
from ..server.actions import OrderedDict
from ..server.utils import Utils
@ -40,12 +40,20 @@ class CommandActionTest(LogCaptureTestCase):
def setUp(self):
"""Call before every test case."""
self.__action = CommandAction(None, "Test")
LogCaptureTestCase.setUp(self)
self.__action = CommandAction(None, "Test")
# prevent executing stop if start fails (or was even not started at all):
self.__action_started = False
orgstart = self.__action.start
def _action_start():
self.__action_started = True
return orgstart()
self.__action.start = _action_start
def tearDown(self):
"""Call after every test case."""
self.__action.stop()
if self.__action_started:
self.__action.stop()
LogCaptureTestCase.tearDown(self)
def testSubstituteRecursiveTags(self):
@ -56,30 +64,30 @@ class CommandActionTest(LogCaptureTestCase):
}
# Recursion is bad
self.assertRaises(ValueError,
lambda: CommandAction.substituteRecursiveTags({'A': '<A>'}))
lambda: substituteRecursiveTags({'A': '<A>'}))
self.assertRaises(ValueError,
lambda: CommandAction.substituteRecursiveTags({'A': '<B>', 'B': '<A>'}))
lambda: substituteRecursiveTags({'A': '<B>', 'B': '<A>'}))
self.assertRaises(ValueError,
lambda: CommandAction.substituteRecursiveTags({'A': '<B>', 'B': '<C>', 'C': '<A>'}))
lambda: substituteRecursiveTags({'A': '<B>', 'B': '<C>', 'C': '<A>'}))
# Unresolvable substitution
self.assertRaises(ValueError,
lambda: CommandAction.substituteRecursiveTags({'A': 'to=<B> fromip=<IP>', 'C': '<B>', 'B': '<C>', 'D': ''}))
lambda: substituteRecursiveTags({'A': 'to=<B> fromip=<IP>', 'C': '<B>', 'B': '<C>', 'D': ''}))
self.assertRaises(ValueError,
lambda: CommandAction.substituteRecursiveTags({'failregex': 'to=<honeypot> fromip=<IP>', 'sweet': '<honeypot>', 'honeypot': '<sweet>', 'ignoreregex': ''}))
lambda: substituteRecursiveTags({'failregex': 'to=<honeypot> fromip=<IP>', 'sweet': '<honeypot>', 'honeypot': '<sweet>', 'ignoreregex': ''}))
# We need an ordered dict here, because the sequence of iteration is very important for this test
if OrderedDict:
# No cyclic recursion, just multiple replacement of tag <T>, should be successful:
self.assertEqual(CommandAction.substituteRecursiveTags( OrderedDict(
self.assertEqual(substituteRecursiveTags( OrderedDict(
(('X', 'x=x<T>'), ('T', '1'), ('Z', '<X> <T> <Y>'), ('Y', 'y=y<T>')))
), {'X': 'x=x1', 'T': '1', 'Y': 'y=y1', 'Z': 'x=x1 1 y=y1'}
)
# No cyclic recursion, just multiple replacement of tag <T> in composite tags, should be successful:
self.assertEqual(CommandAction.substituteRecursiveTags( OrderedDict(
self.assertEqual(substituteRecursiveTags( OrderedDict(
(('X', 'x=x<T> <Z> <<R1>> <<R2>>'), ('R1', 'Z'), ('R2', 'Y'), ('T', '1'), ('Z', '<T> <Y>'), ('Y', 'y=y<T>')))
), {'X': 'x=x1 1 y=y1 1 y=y1 y=y1', 'R1': 'Z', 'R2': 'Y', 'T': '1', 'Z': '1 y=y1', 'Y': 'y=y1'}
)
# No cyclic recursion, just multiple replacement of same tags, should be successful:
self.assertEqual(CommandAction.substituteRecursiveTags( OrderedDict((
self.assertEqual(substituteRecursiveTags( OrderedDict((
('actionstart', 'ipset create <ipmset> hash:ip timeout <bantime> family <ipsetfamily>\n<iptables> -I <chain> <actiontype>'),
('ipmset', 'f2b-<name>'),
('name', 'any'),
@ -111,44 +119,75 @@ class CommandActionTest(LogCaptureTestCase):
))
)
# Cyclic recursion by composite tag creation, tags "create" another tag, that closes cycle:
self.assertRaises(ValueError, lambda: CommandAction.substituteRecursiveTags( OrderedDict((
self.assertRaises(ValueError, lambda: substituteRecursiveTags( OrderedDict((
('A', '<<B><C>>'),
('B', 'D'), ('C', 'E'),
('DE', 'cycle <A>'),
)) ))
self.assertRaises(ValueError, lambda: CommandAction.substituteRecursiveTags( OrderedDict((
self.assertRaises(ValueError, lambda: substituteRecursiveTags( OrderedDict((
('DE', 'cycle <A>'),
('A', '<<B><C>>'),
('B', 'D'), ('C', 'E'),
)) ))
# missing tags are ok
self.assertEqual(CommandAction.substituteRecursiveTags({'A': '<C>'}), {'A': '<C>'})
self.assertEqual(CommandAction.substituteRecursiveTags({'A': '<C> <D> <X>','X':'fun'}), {'A': '<C> <D> fun', 'X':'fun'})
self.assertEqual(CommandAction.substituteRecursiveTags({'A': '<C> <B>', 'B': 'cool'}), {'A': '<C> cool', 'B': 'cool'})
self.assertEqual(substituteRecursiveTags({'A': '<C>'}), {'A': '<C>'})
self.assertEqual(substituteRecursiveTags({'A': '<C> <D> <X>','X':'fun'}), {'A': '<C> <D> fun', 'X':'fun'})
self.assertEqual(substituteRecursiveTags({'A': '<C> <B>', 'B': 'cool'}), {'A': '<C> cool', 'B': 'cool'})
# Escaped tags should be ignored
self.assertEqual(CommandAction.substituteRecursiveTags({'A': '<matches> <B>', 'B': 'cool'}), {'A': '<matches> cool', 'B': 'cool'})
self.assertEqual(substituteRecursiveTags({'A': '<matches> <B>', 'B': 'cool'}), {'A': '<matches> cool', 'B': 'cool'})
# Multiple stuff on same line is ok
self.assertEqual(CommandAction.substituteRecursiveTags({'failregex': 'to=<honeypot> fromip=<IP> evilperson=<honeypot>', 'honeypot': 'pokie', 'ignoreregex': ''}),
self.assertEqual(substituteRecursiveTags({'failregex': 'to=<honeypot> fromip=<IP> evilperson=<honeypot>', 'honeypot': 'pokie', 'ignoreregex': ''}),
{ 'failregex': "to=pokie fromip=<IP> evilperson=pokie",
'honeypot': 'pokie',
'ignoreregex': '',
})
# rest is just cool
self.assertEqual(CommandAction.substituteRecursiveTags(aInfo),
self.assertEqual(substituteRecursiveTags(aInfo),
{ 'HOST': "192.0.2.0",
'ABC': '123 192.0.2.0',
'xyz': '890 123 192.0.2.0',
})
# obscure embedded case
self.assertEqual(CommandAction.substituteRecursiveTags({'A': '<<PREF>HOST>', 'PREF': 'IPV4'}),
self.assertEqual(substituteRecursiveTags({'A': '<<PREF>HOST>', 'PREF': 'IPV4'}),
{'A': '<IPV4HOST>', 'PREF': 'IPV4'})
self.assertEqual(CommandAction.substituteRecursiveTags({'A': '<<PREF>HOST>', 'PREF': 'IPV4', 'IPV4HOST': '1.2.3.4'}),
self.assertEqual(substituteRecursiveTags({'A': '<<PREF>HOST>', 'PREF': 'IPV4', 'IPV4HOST': '1.2.3.4'}),
{'A': '1.2.3.4', 'PREF': 'IPV4', 'IPV4HOST': '1.2.3.4'})
# more embedded within a string and two interpolations
self.assertEqual(CommandAction.substituteRecursiveTags({'A': 'A <IP<PREF>HOST> B IP<PREF> C', 'PREF': 'V4', 'IPV4HOST': '1.2.3.4'}),
self.assertEqual(substituteRecursiveTags({'A': 'A <IP<PREF>HOST> B IP<PREF> C', 'PREF': 'V4', 'IPV4HOST': '1.2.3.4'}),
{'A': 'A 1.2.3.4 B IPV4 C', 'PREF': 'V4', 'IPV4HOST': '1.2.3.4'})
def testSubstRec_DontTouchUnusedCallable(self):
cm = CallingMap({
'A':0,
'B':lambda self: '<A><A>',
'C':'',
'D':''
})
#
# should raise no exceptions:
substituteRecursiveTags(cm)
# add exception tag:
cm['C'] = lambda self,i=0: 5 // int(self['A']) # raise error by access
# test direct get of callable (should raise an error):
self.assertRaises(ZeroDivisionError, lambda: cm['C'])
# should raise no exceptions (tag "C" still unused):
substituteRecursiveTags(cm)
# add reference to "broken" tag:
cm['D'] = 'test=<C>'
# should raise an exception (BOOM by replacement of tag "D" recursive):
self.assertRaises(ZeroDivisionError, lambda: substituteRecursiveTags(cm))
#
# should raise no exceptions:
self.assertEqual(self.__action.replaceTag('test=<A>', cm), "test=0")
# **Important**: recursive replacement of dynamic data from calling map should be prohibited,
# otherwise it may be vulnerable to foreign user input:
self.assertEqual(self.__action.replaceTag('test=<A>--<B>--<A>', cm), "test=0--<A><A>--0")
# should raise an exception (BOOM by replacement of tag "C"):
self.assertRaises(ZeroDivisionError, lambda: self.__action.replaceTag('test=<C>', cm))
# should raise no exceptions (replaces tag "D" only):
self.assertEqual(self.__action.replaceTag('<D>', cm), "test=<C>")
def testReplaceTag(self):
aInfo = {
'HOST': "192.0.2.0",
@ -186,7 +225,7 @@ class CommandActionTest(LogCaptureTestCase):
# Callable
self.assertEqual(
self.__action.replaceTag("09 <matches> 11",
CallingMap(matches=lambda: str(10))),
CallingMap(matches=lambda self: str(10))),
"09 10 11")
def testReplaceNoTag(self):
@ -194,7 +233,27 @@ class CommandActionTest(LogCaptureTestCase):
# Will raise ValueError if it is
self.assertEqual(
self.__action.replaceTag("abc",
CallingMap(matches=lambda: int("a"))), "abc")
CallingMap(matches=lambda self: int("a"))), "abc")
def testReplaceTagSelfRecursion(self):
setattr(self.__action, 'a', "<a")
setattr(self.__action, 'b', "c>")
setattr(self.__action, 'b?family=inet6', "b>")
setattr(self.__action, 'ac', "<a><b>")
setattr(self.__action, 'ab', "<ac>")
setattr(self.__action, 'x?family=inet6', "")
# produce a self-referencing-properties exception:
self.assertRaisesRegexp(ValueError, r"properties contain self referencing definitions",
lambda: self.__action.replaceTag("<a><b>",
self.__action._properties, conditional="family=inet4")
)
# remove self-referencing in props:
delattr(self.__action, 'ac')
# produce a self-referencing-query exception:
self.assertRaisesRegexp(ValueError, r"possible self referencing definitions in query",
lambda: self.__action.replaceTag("<x<x<x<x<x<x<x<x<x<x<x<x<x<x<x<x<x<x<x<x<x>>>>>>>>>>>>>>>>>>>>>",
self.__action._properties, conditional="family=inet6")
)
def testReplaceTagConditionalCached(self):
setattr(self.__action, 'abc', "123")
@ -217,10 +276,10 @@ class CommandActionTest(LogCaptureTestCase):
self.__action.replaceTag("<banaction> '<abc>'", self.__action._properties,
conditional="family=inet6", cache=cache),
"Text 890-567 text 567 '567'")
self.assertEqual(len(cache) if cache is not None else -1, 3)
self.assertTrue(len(cache) >= 3)
# set one parameter - internal properties and cache should be reseted:
setattr(self.__action, 'xyz', "000-<abc>")
self.assertEqual(len(cache) if cache is not None else -1, 0)
self.assertEqual(len(cache), 0)
# test again, should have 000 instead of 890:
for i in range(2):
self.assertEqual(
@ -235,7 +294,7 @@ class CommandActionTest(LogCaptureTestCase):
self.__action.replaceTag("<banaction> '<abc>'", self.__action._properties,
conditional="family=inet6", cache=cache),
"Text 000-567 text 567 '567'")
self.assertEqual(len(cache), 3)
self.assertTrue(len(cache) >= 3)
def testExecuteActionBan(self):
@ -301,13 +360,24 @@ class CommandActionTest(LogCaptureTestCase):
self.assertEqual(self.__action.ROST,"192.0.2.0")
def testExecuteActionUnbanAinfo(self):
aInfo = {
aInfo = CallingMap({
'ABC': "123",
}
self.__action.actionban = "touch /tmp/fail2ban.test.123"
self.__action.actionunban = "rm /tmp/fail2ban.test.<ABC>"
'ip': '192.0.2.1',
'F-*': lambda self: {
'fid': 111,
'fport': 222,
'user': "tester"
}
})
self.__action.actionban = "touch /tmp/fail2ban.test.123; echo 'failure <F-ID> of <F-USER> -<F-TEST>- from <ip>:<F-PORT>'"
self.__action.actionunban = "rm /tmp/fail2ban.test.<ABC>; echo 'user <F-USER> unbanned'"
self.__action.ban(aInfo)
self.__action.unban(aInfo)
self.assertLogged(
" -- stdout: 'failure 111 of tester -- from 192.0.2.1:222'",
" -- stdout: 'user tester unbanned'",
all=True
)
def testExecuteActionStartEmpty(self):
self.__action.actionstart = ""
@ -319,6 +389,51 @@ class CommandActionTest(LogCaptureTestCase):
self.assertLogged('Nothing to do')
self.pruneLog()
def testExecuteWithVars(self):
self.assertTrue(self.__action.executeCmd(
r'''printf %b "foreign input:\n'''
r''' -- $f2bV_A --\n'''
r''' -- $f2bV_B --\n'''
r''' -- $(echo -n $f2bV_C) --''' # echo just replaces \n to test it as single line
r'''"''',
varsDict={
'f2bV_A': 'I\'m a hacker; && $(echo $f2bV_B)',
'f2bV_B': 'I"m very bad hacker',
'f2bV_C': '`Very | very\n$(bad & worst hacker)`'
}))
self.assertLogged(r"""foreign input:""",
' -- I\'m a hacker; && $(echo $f2bV_B) --',
' -- I"m very bad hacker --',
' -- `Very | very $(bad & worst hacker)` --', all=True)
def testExecuteReplaceEscapeWithVars(self):
self.__action.actionban = 'echo "** ban <ip>, reason: <reason> ...\\n<matches>"'
self.__action.actionunban = 'echo "** unban <ip>"'
self.__action.actionstop = 'echo "** stop monitoring"'
matches = [
'<actionunban>',
'" Hooray! #',
'`I\'m cool script kiddy',
'`I`m very cool > /here-is-the-path/to/bin/.x-attempt.sh',
'<actionstop>',
]
aInfo = {
'ip': '192.0.2.1',
'reason': 'hacking attempt ( he thought he knows how f2b internally works ;)',
'matches': '\n'.join(matches)
}
self.pruneLog()
self.__action.ban(aInfo)
self.assertLogged(
'** ban %s' % aInfo['ip'], aInfo['reason'], *matches, all=True)
self.assertNotLogged(
'** unban %s' % aInfo['ip'], '** stop monitoring', all=True)
self.pruneLog()
self.__action.unban(aInfo)
self.__action.stop()
self.assertLogged(
'** unban %s' % aInfo['ip'], '** stop monitoring', all=True)
def testExecuteIncorrectCmd(self):
CommandAction.executeCmd('/bin/ls >/dev/null\nbogusXXX now 2>/dev/null')
self.assertLogged('HINT on 127: "Command not found"')
@ -330,8 +445,9 @@ class CommandActionTest(LogCaptureTestCase):
self.assertFalse(CommandAction.executeCmd('sleep 30', timeout=timeout))
# give a test still 1 second, because system could be too busy
self.assertTrue(time.time() >= stime + timeout and time.time() <= stime + timeout + 1)
self.assertLogged('sleep 30 -- timed out after')
self.assertLogged('sleep 30 -- killed with SIGTERM')
self.assertLogged('sleep 30', ' -- timed out after', all=True)
self.assertLogged(' -- killed with SIGTERM',
' -- killed with SIGKILL')
def testExecuteTimeoutWithNastyChildren(self):
# temporary file for a nasty kid shell script
@ -387,9 +503,9 @@ class CommandActionTest(LogCaptureTestCase):
# Verify that the process itself got killed
self.assertTrue(Utils.wait_for(lambda: not pid_exists(cpid), 3))
self.assertLogged('my pid ', 'Resource temporarily unavailable')
self.assertLogged('timed out')
self.assertLogged('killed with SIGTERM',
'killed with SIGKILL')
self.assertLogged(' -- timed out')
self.assertLogged(' -- killed with SIGTERM',
' -- killed with SIGKILL')
os.unlink(tmpFilename)
os.unlink(tmpFilename + '.pid')
@ -403,7 +519,7 @@ class CommandActionTest(LogCaptureTestCase):
"stderr: 'The rain in Spain stays mainly in the plain'\n")
def testCallingMap(self):
mymap = CallingMap(callme=lambda: str(10), error=lambda: int('a'),
mymap = CallingMap(callme=lambda self: str(10), error=lambda self: int('a'),
dontcallme= "string", number=17)
# Should work fine
@ -412,3 +528,43 @@ class CommandActionTest(LogCaptureTestCase):
"10 okay string 17")
# Error will now trip, demonstrating delayed call
self.assertRaises(ValueError, lambda x: "%(error)i" % x, mymap)
def testCallingMapModify(self):
m = CallingMap({
'a': lambda self: 2 + 3,
'b': lambda self: self['a'] + 6,
'c': 'test',
})
# test reset (without modifications):
m.reset()
# do modifications:
m['a'] = 4
del m['c']
# test set and delete:
self.assertEqual(len(m), 2)
self.assertNotIn('c', m)
self.assertEqual((m['a'], m['b']), (4, 10))
# reset to original and test again:
m.reset()
s = repr(m)
self.assertEqual(len(m), 3)
self.assertIn('c', m)
self.assertEqual((m['a'], m['b'], m['c']), (5, 11, 'test'))
def testCallingMapRep(self):
m = CallingMap({
'a': lambda self: 2 + 3,
'b': lambda self: self['a'] + 6,
'c': ''
})
s = repr(m)
self.assertIn("'a': 5", s)
self.assertIn("'b': 11", s)
self.assertIn("'c': ''", s)
m['c'] = lambda self: self['xxx'] + 7; # unresolvable
s = repr(m)
self.assertIn("'a': 5", s)
self.assertIn("'b': 11", s)
self.assertIn("'c': ", s) # presents as callable
self.assertNotIn("'c': ''", s) # but not empty


@ -38,7 +38,7 @@ class AddFailure(unittest.TestCase):
def tearDown(self):
"""Call after every test case."""
pass
super(AddFailure, self).tearDown()
def testAdd(self):
self.assertTrue(self.__banManager.addBanTicket(self.__ticket))
@ -147,7 +147,7 @@ class StatusExtendedCymruInfo(unittest.TestCase):
def tearDown(self):
"""Call after every test case."""
pass
super(StatusExtendedCymruInfo, self).tearDown()
available = True, None


@ -37,6 +37,7 @@ class BeautifierTest(unittest.TestCase):
def tearDown(self):
""" Call after every test case """
super(BeautifierTest, self).tearDown()
def testGetInputCmd(self):
cmd = ["test"]


@ -28,7 +28,7 @@ import re
import shutil
import tempfile
import unittest
from ..client.configreader import ConfigReader, ConfigReaderUnshared
from ..client.configreader import ConfigReader, ConfigReaderUnshared, NoSectionError
from ..client import configparserinc
from ..client.jailreader import JailReader
from ..client.filterreader import FilterReader
@ -317,7 +317,17 @@ class JailReaderTest(LogCaptureTestCase):
self.assertLogged('File %s is a dangling link, thus cannot be monitored' % f2)
self.assertEqual(JailReader._glob(os.path.join(d, 'nonexisting')), [])
def testCommonFunction(self):
c = ConfigReader(share_config={})
# test common functionality (not shared, without reading the config):
self.assertEqual(c.sections(), [])
self.assertFalse(c.has_section('test'))
self.assertRaises(NoSectionError, c.merge_section, 'test', {})
self.assertRaises(NoSectionError, c.options, 'test')
self.assertRaises(NoSectionError, c.get, 'test', 'any')
self.assertRaises(NoSectionError, c.getOptions, 'test', {})
class FilterReaderTest(unittest.TestCase):
def __init__(self, *args, **kwargs):
@ -347,7 +357,7 @@ class FilterReaderTest(unittest.TestCase):
['set', 'testcase01', 'addjournalmatch',
"FIELD= with spaces ", "+", "AFIELD= with + char and spaces"],
['set', 'testcase01', 'datepattern', "%Y %m %d %H:%M:%S"],
['set', 'testcase01', 'maxlines', "1"], # Last for overide test
['set', 'testcase01', 'maxlines', 1], # Last for overide test
]
filterReader = FilterReader("testcase01", "testcase01", {})
filterReader.setBaseDir(TEST_FILES_DIR)
@ -517,12 +527,10 @@ class JailsReaderTest(LogCaptureTestCase):
['add', 'brokenaction', 'auto'],
['set', 'brokenaction', 'addfailregex', '<IP>'],
['set', 'brokenaction', 'addaction', 'brokenaction'],
['set',
'brokenaction',
'action',
'brokenaction',
'actionban',
'hit with big stick <ip>'],
['multi-set', 'brokenaction', 'action', 'brokenaction', [
['actionban', 'hit with big stick <ip>'],
['actname', 'brokenaction']
]],
['add', 'parse_to_end_of_jail.conf', 'auto'],
['set', 'parse_to_end_of_jail.conf', 'addfailregex', '<IP>'],
['start', 'emptyaction'],
@ -548,7 +556,10 @@ class JailsReaderTest(LogCaptureTestCase):
actionName = os.path.basename(actionConfig).replace('.conf', '')
actionReader = ActionReader(actionName, "TEST", {}, basedir=CONFIG_DIR)
self.assertTrue(actionReader.read())
actionReader.getOptions({}) # populate _opts
try:
actionReader.getOptions({}) # populate _opts
except Exception as e: # pragma: no cover
self.fail("action %r\n%s: %s" % (actionName, type(e).__name__, e))
if not actionName.endswith('-common'):
self.assertIn('Definition', actionReader.sections(),
msg="Action file %r is lacking [Definition] section" % actionConfig)
@ -627,7 +638,7 @@ class JailsReaderTest(LogCaptureTestCase):
# grab all filter names
filters = set(os.path.splitext(os.path.split(a)[1])[0]
for a in glob.glob(os.path.join('config', 'filter.d', '*.conf'))
if not a.endswith('common.conf'))
if not (a.endswith('common.conf') or a.endswith('-aggressive.conf')))
# get filters of all jails (filter names without options inside filter[...])
filters_jail = set(
JailReader.extractOptions(jail.options['filter'])[0] for jail in jails.jails
@ -711,6 +722,7 @@ class JailsReaderTest(LogCaptureTestCase):
self.assertEqual(opts['socket'], '/var/run/fail2ban/fail2ban.sock')
self.assertEqual(opts['pidfile'], '/var/run/fail2ban/fail2ban.pid')
configurator.readAll()
configurator.getOptions()
configurator.convertToProtocol()
commands = configurator.getConfigStream()


@ -1 +0,0 @@
../../../../config/filter.d/common.conf


@ -1,6 +1,13 @@
#[INCLUDES]
#before = common.conf
[Definition]
failregex = failure test 1 (filter.d/test.conf) <HOST>
[DEFAULT]
_daemon = default
[Definition]
where = conf
failregex = failure <_daemon> <one> (filter.d/test.%(where)s) <HOST>
[Init]
# test parameter, should be overridden in jail by "filter=test[one=1,...]"
one = *1*

Some files were not shown because too many files have changed in this diff.